I am developing an Android app for voice chat, using WebRTC on the client side and Node.js as the server. I have successfully streamed voice between two peers, using the Node.js server for signalling. But this method has a huge problem: because WebRTC connects the peers directly, a peer connected to 200 others will use a lot of the device's CPU and bandwidth, and I want 500 or more peers to be able to voice chat without consuming much of either. To reduce the load on CPU and bandwidth, I thought of creating a streaming link directly with the Node.js server and streaming from there to the other peers, so that each peer has a single link through which it communicates with everyone else. I want to know if there is a Node.js module capable of linking with Android's libjingle_peerconnection. I have tried node-webrtc, and it does not work with recent versions of libjingle_peerconnection.
An Android device will not be able to directly connect to hundreds of WebRTC peers; this simply requires too many resources.
You want to look at a media server, like Kurento. Kurento runs on a server and can send WebRTC media streams from one client to many other clients in the manner you describe. You still have to write the signalling layer specific to your application, which you can do in node.js much as in the two-client case.
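As a rough illustration, the signalling layer in node.js could be a thin relay like the sketch below. It assumes socket.io on both ends; the event names ('join', 'sdp', 'ice') are made up for this sketch, not a fixed API, and the media-server side (e.g. creating a pipeline per room in Kurento) is omitted.

```js
// Minimal signalling relay, assuming socket.io on both ends.
const io = require('socket.io')(3000);

io.on('connection', (socket) => {
  // A client joins a named room; with a media server you would also
  // create/attach the server-side session for that room here.
  socket.on('join', (room) => socket.join(room));

  // Forward SDP offers/answers and ICE candidates to the other members
  // of the room; the payloads are opaque to the signalling server.
  socket.on('sdp', ({ room, description }) =>
    socket.to(room).emit('sdp', { from: socket.id, description }));
  socket.on('ice', ({ room, candidate }) =>
    socket.to(room).emit('ice', { from: socket.id, candidate }));
});
```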
I am building an Android application which will stream video calls from one device to another Android device. For that I am using the Wowza video streaming API (media engine). With that I am able to stream video from the Android app to the web, but is device-to-device video streaming possible?
If you are planning to develop all the infrastructure yourself, then these are the points to evaluate and decide on.
What Technology is used
WebRTC is the technology used to support video calling. WebRTC is a free, open project that provides browsers and mobile applications with Real-Time Communications (RTC) capabilities via APIs. It was open-sourced by Google in 2011 and allows real-time communication between two browsers or mobile devices.
Concepts involved
1. Data streams and Hardware
WebRTC helps with setting up and identifying the hardware (microphone, camera and speakers) and with identifying the network via a STUN server. On mobiles this hardware comes built in.
2. Audio Video CODECS
Google has open-sourced the audio/video codecs required for these features. Audio is generally G.711 for phones (though this still varies in specific cases), and video is VP8 or VP9.
3. Peer Discovery
To make a call, each peer generally needs the other's address. On the internet most IPs are dynamic, so a server needs to keep track of who is online; this can be done using XMPP, SIP or some custom protocol. So for anyone to receive a call, the caller has to check with the server (or the other way around).
4. STUN Server
Once signalling (peer discovery) is done, a STUN server is required. This server helps a device determine its external IP address, as well as whether two or more devices can talk to each other directly.
5. TURN Server
If a peer-to-peer session is not possible, then a TURN server is required. The TURN server basically shifts the bits for you through open holes in the firewall between the two clients; this is needed because of asymmetric firewalls and the possibility of punching holes only on certain ports (see the configuration sketch after this list).
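To make the STUN and TURN points concrete: in code they are simply entries in the iceServers list passed when creating a peer connection. A minimal JavaScript sketch, where the server URLs and credentials are placeholders:

```js
// STUN lets the device discover its external IP address; TURN relays
// media when a direct peer-to-peer path is impossible.
const pc = new RTCPeerConnection({
  iceServers: [
    { urls: 'stun:stun.example.com:3478' },  // placeholder STUN server
    {
      urls: 'turn:turn.example.com:3478',    // placeholder TURN server
      username: 'user',                      // placeholder credentials
      credential: 'secret',
    },
  ],
});
```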
Alternatively, you can use a provider like SINCH, which already handles and configures these basic requirements, so you only need to concentrate on the mobile front end.
Check out the SINCH ANDROID SAMPLE as well.
I would like to implement voice chat (even just primitive support for it) without the hassle of implementing the whole VoIP stack. I would also like to make it available on both iOS and Android.
I have been searching for a way to do this, and it just seems to me that using Socket.IO would be something to start with. Is it possible to use Socket.IO as a signalling server (to discover who is available, and who to start a voice chat with), and then establish a peer-to-peer connection between two devices in order to transfer audio data?
NOTE: I would prefer peer-to-peer, because I would like to avoid overloading the server with transferring such an enormous amount of data. But since I have heard that iOS and Android cannot establish a P2P connection with one another, is it possible to have socket.io receive and transmit the audio data between devices, i.e. use a socket.io server as an intermediary for sending audio data?
I found the following helpful links (which are probably based on socket.io), but they make no reference to mobile devices.
http://peerjs.com/
https://simplewebrtc.com/
Any help is greatly appreciated. Alternatives to socket.io are also welcome, if there are any.
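As a sketch of the fallback described in the NOTE above, a socket.io server can act as a plain relay for audio chunks. This assumes the clients capture and play back the audio themselves, and 'audio-chunk' is an app-defined event name, not part of socket.io:

```js
const io = require('socket.io')(3000);

io.on('connection', (socket) => {
  socket.on('join', (room) => socket.join(room));
  // Re-broadcast each incoming audio buffer to everyone else in the
  // room. Relaying raw audio this way multiplies server bandwidth by
  // the number of listeners, which is why P2P is usually preferred.
  socket.on('audio-chunk', ({ room, chunk }) =>
    socket.to(room).emit('audio-chunk', chunk));
});
```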
I'm looking for options for communication between an Android device - running a native app - and a website.
Basically, the Android device is just a sensor for movement, while the website is the receiving end that will process the sensor data. The website will then have to visualize this movement.
The goal is that this happens instantly and constantly, as the sensor data can easily reach up to 50 updates a second.
I'm looking for some proper options, and possibly shared experiences, for streaming this data as efficiently as possible.
So far it has crossed my mind to:
Use techniques like Bluetooth, Wi-Fi Direct or USB. Probably not reachable from a website.
Use a Node.js server for a simple socket connection.
Use Google App Engine. The Channel (Java) client would be nice for this, but it seems that App Engine can only be the transmitting end.
I would do this:
Webserver: node express + socket.io
Android device: use https://github.com/Gottox/socket.io-java-client to stream events to webserver.
Browser: uses socket.io client to get a live stream of events.
The node socket.io server just takes the sensor data and broadcasts it, as in the sketch below.
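A minimal version of that server, assuming an app-defined 'sensor' event and a static public/ folder holding the browser visualization:

```js
const express = require('express');
const http = require('http');

const app = express();
const server = http.createServer(app);
const io = require('socket.io')(server);

// Serve the page that visualizes the movement.
app.use(express.static('public'));

io.on('connection', (socket) => {
  // The Android client emits ~50 readings per second; the server just
  // fans each reading out to every other connected client (browsers).
  socket.on('sensor', (reading) => {
    socket.broadcast.emit('sensor', reading);
  });
});

server.listen(3000);
```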
About socket.io:
Socket.io uses WebSockets. However, if the client doesn't support WebSockets, it falls back to long polling and other transports to emulate them. On top of that, it gives you a pub/sub framework which WebSockets don't provide out of the box.
The new version of socket.io (available on GitHub) uses engine.io to provide the WebSocket abstraction and then puts a pub/sub framework on top of that.
On App Engine, use a frontend to post data to it; there is no need for a two-way channel. If you really want two-way communication, use sockets.
I'm trying to develop a project like PTTDroid, i.e. a push-to-talk or walkie-talkie application.
The issue is that in that app you can't use 3G to access the web, so I've decided to use a Node.js server and implement an Android client to communicate with it. I tried to do a multi-platform project using PhoneGap, but the problem is that for audio recording you can't access the buffer: you can only start, stop or pause the recording process, not send data while capturing. So my question is: is it possible to stream audio captured in real time with native Android functions (the AudioRecord class) to a Node.js server via Socket.IO or similar?
I discovered this project, Asimi JS, but I don't know if someone else knows a better way to do what I want.
Thank you very much for your help!
It is certainly possible, but a standard Node.js HTTP server would not be advisable, as it uses TCP. You want UDP as the transport layer for audio, since it is faster, and the small amount of packet loss that can occur will most likely not be a problem.
To be completely honest with you, it sounds like you need to write a few demo applications on the native platforms, so do not use PhoneGap. You need the native platforms in order to access things such as the microphone and to stream over UDP.
When you have a demo working, you can go on and try with another platform afterwards, but start with a simple setup instead of trying to do it all at once - if it was that easy, someone else would have done it before you.
Let me recommend a simple UDP server in whatever language you are most comfortable with, such as Node.js, Java, C, C++ or C#. Let the UDP server receive the content and save it into a file that you can then play back on a desktop computer to verify the result. As a simple client, build one on either Android or iOS, and stream a file that you have already recorded and included in the app. When you have this setup working, you can try to capture the microphone, then do a user interface, then support multiple phones, then build a server which records the conversations, then build a user database, and so on and so forth. But start with a prototype of your main feature.
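As a rough sketch, such a prototype server in Node.js (using the built-in dgram module) could look like this. The port, file name and packet format are arbitrary assumptions; it also remembers each sender's address and port, which you would need in order to send audio back:

```js
const dgram = require('dgram');
const fs = require('fs');

const socket = dgram.createSocket('udp4');
const out = fs.createWriteStream('capture.raw'); // raw audio dump
const clients = new Map(); // "address:port" -> rinfo, kept for replies

socket.on('message', (chunk, rinfo) => {
  // Remember who sent this packet so the server can send data back
  // later (mobile clients behind NAT have changing addresses).
  clients.set(`${rinfo.address}:${rinfo.port}`, rinfo);
  out.write(chunk); // append payload; play back on a desktop to verify
});

socket.bind(41234); // arbitrary port
```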
I've finally discovered and solved my problem (at least that's what I think). First of all, I created a server to send and receive UDP packets via DatagramSocket. After that, to achieve communication between server and client while connected over 3G, I would have needed a static port and IP, and that's why my server couldn't connect to the client: with a mobile data connection the user's IP and port are not always the same, and you have to keep the same socket open at all times if you want to send and receive. The server, on the other hand, has to store the client's address and port at the moment of connection.
Thank you very much for your help ExxKA
The majority of SIM accounts are public dynamic. Most if not all cellular providers do not allow incoming connections to public dynamic IP addresses (on 3G anyway; maybe not 4G/LTE).
The issue of connecting is not one of dynamic IPs, but rather blocked incoming ports.
So, if I wanted to stream video from an Android phone on demand (based on information gleaned from this conversation: Streaming video from Android camera to server), what would be the chain of events to properly initiate a connection?
My idea of this (roughly):
The app on the Android phone initiates, and keeps open, some sort of connection to a media server (Wowza or something similar).
At some point, when the server wants video from the phone, it uses the open connection to request a video stream.
The Android phone pushes an RTSP stream to the server.
Is this correct, and if so, what type of connection should I use as the permanent control connection? Also, is it possible to push RTSP, or would I have to do something else?
Thanks!
I know this is an old question but if anybody else is searching for something similar the following is now available:
http://developer.android.com/guide/google/gcm/index.html
This essentially allows a message to be sent from a server to an app on an Android device (it replaces C2DM which did a similar thing).
Update
Google GCM has now in turn been replaced by Google Firebase Cloud Messaging:
https://firebase.google.com/docs/cloud-messaging/
Using a cloud-based app messaging service like this, the steps would be:
Add a message subscription service to your app (e.g. Firebase)
The app registers with the cloud messaging service when it starts up
When the server wants video from the phone (as noted in the question above), the server sends a message to the app (see the sketch after these steps)
The app opens a connection to the streaming server and starts to stream video to the server.
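As a hypothetical sketch of the "server sends a message to the app" step in Node.js, using the firebase-admin SDK; the 'start-stream' command and the token handling are app-defined assumptions:

```js
const admin = require('firebase-admin');

// Uses the service-account credentials pointed to by the
// GOOGLE_APPLICATION_CREDENTIALS environment variable.
admin.initializeApp();

// deviceToken is the registration token the app uploaded at startup.
async function requestStream(deviceToken) {
  await admin.messaging().send({
    token: deviceToken,
    data: { command: 'start-stream' }, // app-defined command
  });
}
```

On receiving the message, the app would open its outgoing connection to the streaming server as described above.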
Note: there is a comment below about how this approach does not allow an incoming connection from the server to the Android phone.
This, in fact, is not how streaming from a phone typically works. The phone actually makes an 'outgoing' connection to a streaming server, which it then streams the video to. Other devices wanting to see the video then stream it from here.
There are several reasons why this is the preferred approach. One of the key ones is that supporting a quality streaming service that will play back on most common devices, browsers, OSs, etc. requires transcoding the video into multiple bit rates (and even multiple encodings in some cases) and packaging and serving it in the appropriate streaming format. Doing all this on the mobile device would be very compute and storage intensive.