WebRTC In Android

I downloaded the open-source WebRTC code. Inside it I found the WebRTCDemo test project for Android. I am able to generate the APK, but when I install it, the two devices are not able to communicate with each other.
Steps I am taking:
1. In application settings -> HostId, I entered the IP address of the other Android device and pressed the start-call button, but nothing happens on the other side.
My questions are:
1. Do I have to set up a server for the devices to communicate with each other?
2. Can anyone explain how this works in the case of Android devices?
Please help me.
Thank You
Krishna.

Yes, you most definitely do need a 'messaging server'. Your task here is to relay the SDP from one client to the other. The SDP includes the ICE candidates, which basically tell a client how to 'reach' the other directly (an IP address + port combination). Once both clients have exchanged these 'handshake' signals, they can start transferring their streams peer to peer.
Now, the implementation of this server is completely left to you. Since it is decoupled from the rest of the WebRTC API, you can safely resort to any technology to make sure that these signalling messages are exchanged between the two clients. And once you have successfully established a PeerConnection, you can from there on even use the DataChannels to re-negotiate. A minimal relay sketch follows the summary below.
To sum things up:
Yes, you do need a server to relay the messages between the two clients.
Since this is independent of the WebRTC implementation, you can use any technology of your choice.
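As a concrete illustration, here is a minimal sketch of such a relay: a plain TCP server in Java that accepts two clients and forwards newline-delimited messages (e.g. JSON-encoded SDP offers/answers and ICE candidates) from one to the other. The port number and the line-based framing are assumptions for illustration only; a WebSocket, XMPP or GCM channel would serve equally well.

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.OutputStreamWriter;
import java.io.PrintWriter;
import java.net.ServerSocket;
import java.net.Socket;

public class SignalingRelay {

    public static void main(String[] args) throws Exception {
        try (ServerSocket server = new ServerSocket(8765)) {
            Socket caller = server.accept();  // first client to connect
            Socket callee = server.accept();  // second client to connect
            Thread a = pipe(caller, callee);
            Thread b = pipe(callee, caller);
            a.join();
            b.join();
        }
    }

    // Forward each line (one signaling message) from 'from' to 'to'.
    private static Thread pipe(Socket from, Socket to) throws IOException {
        BufferedReader in = new BufferedReader(
                new InputStreamReader(from.getInputStream(), "UTF-8"));
        PrintWriter out = new PrintWriter(
                new OutputStreamWriter(to.getOutputStream(), "UTF-8"), true);
        Thread t = new Thread(() -> {
            try {
                String line;
                while ((line = in.readLine()) != null) {
                    out.println(line); // relay the SDP / ICE message verbatim
                }
            } catch (IOException ignored) {
                // peer disconnected; let the thread end
            }
        });
        t.start();
        return t;
    }
}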

Related

Add dial/receive signalling mechanism for AppRTCDemo at client side

Hi, I am using AppRTCDemo and it works against their server. However, the current mechanism is to exchange a chat-room name, and entering the same room connects the peers.
But I want to dial a call from one device and receive the call on the other device, and only then should the peers enter a room for the video session.
I have searched a lot, and what I have found is that I need a signalling server, which I don't have and don't want to set up.
In this situation, how can the other device know that device one is dialing, and how can a particular room name be shared so it can accept and enter the same room for the video call, on the client side in Android?
https://github.com/njovy/AppRTCDemo
There are two Android APKs available for WebRTC: AppRTCDemo and WebRTCDemo. The AppRTCDemo APK can be used for Android-device-to-browser connectivity. You need to provide a room ID to connect to a room; if you are the room initiator, enter -1.
If you want to connect two Android devices, you have to compile and install the WebRTCDemo APK. Its interface provides a place to enter the IP address of the other device (and vice versa), after which both devices will be connected.
For more information, please go through http://www.webrtc.org/reference/getting-started
I have compiled, installed and tested both APKs to check how they work.
I was able to make calls successfully between two Android devices using WebRTCDemo, though I only tested it on my office WLAN. I did not use it further because I was using AppRTCDemo as my app reference. My suggestion: when you enter the remote IP in WebRTCDemo, check that loop-back is unchecked. I suspect loop-back is enabled for you, so you are receiving your own video packets even though you have entered the remote IP. Make sure loop-back is disabled while making the call.
A very good explanation can be found in this book: http://chimera.labs.oreilly.com/books/1230000000545/ch03.html#STUN_TURN_ICE
which provides the fundamentals on how WebRTC uses ICE technology.
In particular, assuming the IP address of the STUN server is known, the WebRTC application first sends a binding request to the STUN server. The STUN server replies with a response that contains the public IP address and port of the client as seen from the public network.
With this, the application discovers its public IP and port tuple, which it can send to the other peer through SDP. (Note that the SDP is sent over an external signalling channel, e.g. a WebSocket established through a web service.)
With this mechanism in place, whenever two peers want to talk to each other over UDP, they can then use the established public IP and port tuples to exchange data.
Unfortunately, in some cases UDP may be blocked by a firewall. To address this issue, whenever STUN fails, we can use the Traversal Using Relays around NAT (TURN) protocol as a fallback, which can run over UDP and switch to TCP if all else fails.
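To make the binding step described above concrete, here is a minimal sketch of a raw STUN binding request in plain Java (message layout per RFC 5389). Using Google's public STUN server is just an example choice here; a real WebRTC application relies on the ICE agent built into the library rather than hand-rolling this.

import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.nio.ByteBuffer;
import java.security.SecureRandom;

public class StunBinding {
    public static void main(String[] args) throws Exception {
        // 20-byte STUN header: Binding Request, no attributes.
        ByteBuffer req = ByteBuffer.allocate(20);
        req.putShort((short) 0x0001);   // message type: Binding Request
        req.putShort((short) 0);        // message length: no attributes
        req.putInt(0x2112A442);         // fixed magic cookie
        byte[] txId = new byte[12];
        new SecureRandom().nextBytes(txId);
        req.put(txId);                  // random transaction ID

        try (DatagramSocket sock = new DatagramSocket()) {
            sock.setSoTimeout(3000);
            InetAddress server = InetAddress.getByName("stun.l.google.com");
            sock.send(new DatagramPacket(req.array(), 20, server, 19302));

            byte[] buf = new byte[512];
            DatagramPacket resp = new DatagramPacket(buf, buf.length);
            sock.receive(resp);

            // Walk the attributes looking for XOR-MAPPED-ADDRESS (0x0020).
            // (A robust client would also verify the response type and
            // transaction ID; omitted here for brevity.)
            ByteBuffer b = ByteBuffer.wrap(buf, 20, resp.getLength() - 20);
            while (b.remaining() >= 4) {
                int type = b.getShort() & 0xFFFF;
                int len = b.getShort() & 0xFFFF;
                if (type == 0x0020) {
                    b.get();            // reserved
                    b.get();            // address family (0x01 = IPv4)
                    int port = (b.getShort() & 0xFFFF) ^ 0x2112;
                    int ip = b.getInt() ^ 0x2112A442;
                    System.out.printf("Public address: %d.%d.%d.%d:%d%n",
                            (ip >>> 24) & 0xFF, (ip >>> 16) & 0xFF,
                            (ip >>> 8) & 0xFF, ip & 0xFF, port);
                    return;
                }
                b.position(b.position() + len + ((4 - len % 4) % 4)); // pad
            }
        }
    }
}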
WebRTC gives SDP Offer to the client JS app to send (however the JS app wants) to the other device, which uses that to generate an SDP Answer.
The trick is that the SDP includes ICE candidates (effectively "try to talk to me at this IP address and this port"). ICE works to punch open ports in the firewalls; if both sides are behind symmetric NATs this generally won't be possible, and an alternative candidate (on a TURN server) can be used instead.
Once they're talking directly (or via TURN, which is effectively a packet-mirror), they can open a DTLS connection and use it to key the SRTP-DTLS media streams, and to send DataChannels over DTLS.
Edit:
Acronyms are explained here: http://blog.1click.io/10-jargons-abbreviations-for-webrtc-fans/ and for the rest, there is Google. Most of these are defined by the IETF (http://ietf.org/).
Edit 2:
Firefox and Chrome (and the spec) have moved to using "trickle" ICE, so the ICE candidates are generally added to the PeerConnection after the fact and exchanged independently of the initial SDP (though you can wait until the initial candidates are ready before sending an offer, and bundle them together).
See https://webrtcglossary.com/trickle-ice/ and https://datatracker.ietf.org/doc/draft-ietf-ice-trickle/
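For the receiving side of trickle ICE, a minimal sketch using the org.webrtc Android library is shown below. The JSON field names and the single onSignalingMessage entry point are assumptions about your own signaling protocol, not part of the WebRTC API.

import org.json.JSONException;
import org.json.JSONObject;
import org.webrtc.IceCandidate;
import org.webrtc.PeerConnection;

public final class TrickleIceReceiver {
    private final PeerConnection pc;

    public TrickleIceReceiver(PeerConnection pc) {
        this.pc = pc;
    }

    // Call this whenever the signaling channel delivers a candidate,
    // which may happen long after the initial offer/answer exchange.
    public void onSignalingMessage(String json) throws JSONException {
        JSONObject msg = new JSONObject(json);
        IceCandidate candidate = new IceCandidate(
                msg.getString("sdpMid"),        // assumed field name
                msg.getInt("sdpMLineIndex"),    // assumed field name
                msg.getString("candidate"));    // assumed field name
        pc.addIceCandidate(candidate); // added after the fact, per trickle ICE
    }
}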

WebRTC local signaling server

I am trying to figure out a solution for a signaling server for an Android WebRTC-based project. Both clients will be Android devices located close to each other, i.e. within 100 yards or less. I would like the solution to work without the use of a public signaling server; I would rather have one of the clients also act as the server.
So, my questions are:
1. How can I achieve it so that one client is the server? E.g. can I set one up as a hotspot or use Wi-Fi Direct?
2. If I can achieve #1, what is a good solution for a signaling server running on Android? Can I run a Node.js server on Android?
A signaling server is simply a way to exchange messages between two parties. In the WebRTC case these messages are the offer/answer and ICE candidates.
You can use whatever type of server you want to do this, you can even do it manually :).
You can use one of the clients as the server too, but then you will have to communicate its IP to the other somehow. Maybe use Wi-Fi Direct and obtain it programmatically, as in the sketch below.
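A minimal sketch of the "obtain it programmatically" step, using only standard Java APIs. (On Wi-Fi Direct the group owner conventionally gets 192.168.49.1, so the joining device often already knows where to connect.)

import java.net.InetAddress;
import java.net.NetworkInterface;
import java.util.Collections;

public class LocalIp {
    public static void main(String[] args) throws Exception {
        // Print the first non-loopback IPv4 address; this is what the
        // "server" client would share with the other device.
        for (NetworkInterface nif :
                Collections.list(NetworkInterface.getNetworkInterfaces())) {
            for (InetAddress addr : Collections.list(nif.getInetAddresses())) {
                if (!addr.isLoopbackAddress()
                        && addr.getAddress().length == 4) { // IPv4 only
                    System.out.println(nif.getName() + ": "
                            + addr.getHostAddress());
                    return;
                }
            }
        }
    }
}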
With WebRTC, the signaling server is just there to help you transfer messages and exchange information (the SDP packages from createOffer/createAnswer, ICE candidates, etc.).
For example, you can use GCM (free) as a signaling server, or Node.js with socket.io, WebSocket, XMPP, etc. The only thing you need is a way to transfer messages between the two peers.
You can refer to this tutorial : http://www.html5rocks.com/en/tutorials/webrtc/basics/

Peer to Peer communication with Cordova on a local network

I am working on a PhoneGap application, and I want a means by which two people with this app installed can share some information offline, but on the same network, such as a Wi-Fi network or a tethered hotspot.
I have searched the web for possible libraries or ways of doing this, but all I see are libraries that pass the data through their own servers before sending it to the peer, meaning the devices must be online.
So please, can someone point me in the right direction on how to do this, or to valid resources?
Thanks
I've used sockets-for-cordova for communicating between a Cordova app and an Arduino. However, seeing as it's TCP, it wouldn't support broadcasts; you'll need a UDP socket for that.
There is also cordova-plugin-chrome-apps-sockets-udp, which does appear to support broadcasts. I have no experience with it, though.
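To show what broadcast-based discovery looks like at the socket level, here is a minimal sketch in plain Java (the Cordova plugins above expose the same primitives to JavaScript). The port number and the message are arbitrary choices for illustration.

import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;

public class Discovery {
    public static void main(String[] args) throws Exception {
        if (args.length > 0 && args[0].equals("announce")) {
            // One peer announces itself to the whole LAN.
            try (DatagramSocket sock = new DatagramSocket()) {
                sock.setBroadcast(true);
                byte[] msg = "HELLO_PEER".getBytes("UTF-8");
                sock.send(new DatagramPacket(msg, msg.length,
                        InetAddress.getByName("255.255.255.255"), 40123));
            }
        } else {
            // The other peer listens and learns the announcer's address.
            try (DatagramSocket sock = new DatagramSocket(40123)) {
                byte[] buf = new byte[64];
                DatagramPacket p = new DatagramPacket(buf, buf.length);
                sock.receive(p); // blocks until a peer announces itself
                System.out.println("Peer found at " + p.getAddress());
            }
        }
    }
}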

Android Client and Node.js Server

I'm trying to develop a project like PTTDroid, I mean a push-to-talk or walkie-talkie application.
The issue is that in that app you can't use 3G to access the web, so I've decided to use a Node.js server and implement an Android client to communicate with it. I tried to do a multi-platform project using PhoneGap, but the problem is that for audio recording you can't access the buffer: you can only start, stop or pause the recording process, not send data while capturing. So my question is: is it possible to stream audio captured in real time through native Android functions (the AudioRecord class) to a Node.js server via Socket.IO or similar?
I discovered this project, Asimi JS, but I don't know if someone else knows a better way to do what I want.
Thank you very much for your help!
It is certainly possible, but a standard Node.js HTTP server would not be advisable, as it uses TCP. You want UDP as the transport layer for audio, since it is faster, and the small packet loss that can occur will most likely not be a problem.
To be completely honest with you, it sounds like you need to write a few demo applications on the native platforms, so do not use PhoneGap. You need the native platforms in order to access things such as the microphone and to stream over UDP.
When you have a demo working, you can go on and try another platform, but start with a simple setup instead of trying to do it all at once; if it were that easy, someone else would have done it before you.
Let me recommend a simple UDP server in whatever language you are most comfortable with (Node.js, Java, C, C++, C#). Let the UDP server receive the content and save it into a file that you can then play back on a desktop computer to verify the result. As a simple client, build one on either Android or iOS, and stream a file that you have already recorded and included in the app. When this setup works, you can try to capture the microphone, then do a user interface, then support multiple phones, then build a server which records the conversations, then build a user database, and so on and so forth. But start with a prototype of your main feature.
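A minimal sketch of that recommended demo server in plain Java. The port and file name are arbitrary; the file simply accumulates whatever raw bytes the client sends (e.g. 16-bit PCM captured with AudioRecord), which you can then import into an audio editor to verify.

import java.io.FileOutputStream;
import java.net.DatagramPacket;
import java.net.DatagramSocket;

public class UdpAudioSink {
    public static void main(String[] args) throws Exception {
        try (DatagramSocket sock = new DatagramSocket(5004);
             FileOutputStream out = new FileOutputStream("capture.raw")) {
            byte[] buf = new byte[2048];
            DatagramPacket packet = new DatagramPacket(buf, buf.length);
            while (true) {          // stop with Ctrl+C for this demo
                sock.receive(packet);
                out.write(packet.getData(), 0, packet.getLength());
            }
        }
    }
}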
I've finally discovered and solved my problem (at least that's what I think). First of all, I created a server that sends and receives UDP packets via DatagramSocket. After that, to achieve communication between server and client while connected over 3G, I needed a static port and IP; that's why my server couldn't connect with the client. On a data connection, the user's IP and port are not always the same, and you have to keep the same socket open at all times if you want to send and receive. The server, on the other hand, has to store the client's address and port at the moment of connection, as in the sketch below.
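A minimal sketch of that registration pattern, assuming the client sends a first packet to "register" and keeps the same socket open afterwards (the port number is an arbitrary example):

import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetSocketAddress;

public class RegisterAndReply {
    public static void main(String[] args) throws Exception {
        try (DatagramSocket sock = new DatagramSocket(5005)) {
            byte[] buf = new byte[2048];
            DatagramPacket p = new DatagramPacket(buf, buf.length);
            sock.receive(p); // the first packet "registers" the client
            // Store the source address; this is the only address the
            // NAT will route replies back through.
            InetSocketAddress client =
                    new InetSocketAddress(p.getAddress(), p.getPort());
            System.out.println("Client registered at " + client);
            byte[] ack = "REGISTERED".getBytes("UTF-8");
            sock.send(new DatagramPacket(ack, ack.length, client));
        }
    }
}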
Thank you very much for your help, ExxKA

Peer-to-peer SIP call with Android SIP Stack?

I have been looking for a way to set up the Android SIP stack to establish a SIP call between two devices on the same network in an ad-hoc manner, i.e. without REGISTERing to a SIP server.
I have not been able to get this to work, as the SIP demo includes server registration, and I cannot get it to make or receive a call without this step.
I am not even sure whether this is supposed to be possible. The little mention of it I have been able to find is conflicting (some say it can be done with a specific setup, which they do not describe, and some say the Android SIP API is not meant for this).
I was wondering if anyone has got this to work, or has any clues as to how I could go about configuring the API for this, as I would like to use the built-in SIP API before looking at third-party ones.
The application I am developing is an internal one which will always be running on the same devices, so the fact that the SIP API is not present on all devices will not be an issue for me.
I have been stuck on the same problem.
If you can make it without the Android SIP API, you can look at the RTP API, which gives you somewhat lower-level tools to build a P2P VoIP application without the need for a server. From the documentation:
To support audio conferencing and similar usages, you need to instantiate two classes as endpoints for the stream: AudioStream specifies a remote endpoint and consists of network mapping and a configured AudioCodec; AudioGroup represents the local endpoint for one or more AudioStreams. The AudioGroup mixes all the AudioStreams and optionally interacts with the device speaker and the microphone at the same time.
The counterpart is that you have to write your own device-discovery protocol in order to learn the port used by the peer's AudioStream, as explained in this answer.
The problem is not so hard if you only intend to support one-to-one conversations, but it is a little trickier if you want one-to-n conversations.
For a one-to-n conversation, the conference host has to instantiate n AudioStreams, one for each remote device he wants to call. Each remote peer has only one AudioStream, linked to one of the host's AudioStreams. A setup sketch follows below.
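A minimal sketch of that setup with the android.net.rtp classes, for the one-to-one case. It assumes the peer's IP address and RTP port were already learned through your own discovery protocol, and that the app holds the RECORD_AUDIO permission.

import java.net.InetAddress;

import android.content.Context;
import android.media.AudioManager;
import android.net.rtp.AudioCodec;
import android.net.rtp.AudioGroup;
import android.net.rtp.AudioStream;
import android.net.rtp.RtpStream;

public class RtpCall {
    public static AudioGroup start(Context context, InetAddress localAddr,
            InetAddress peerAddr, int peerPort) throws Exception {
        AudioManager am =
                (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);
        am.setMode(AudioManager.MODE_IN_COMMUNICATION);

        AudioStream stream = new AudioStream(localAddr); // binds a local port
        stream.setCodec(AudioCodec.PCMU);
        stream.setMode(RtpStream.MODE_NORMAL);
        // stream.getLocalPort() is what the peer must learn via discovery.
        stream.associate(peerAddr, peerPort);

        AudioGroup group = new AudioGroup(); // local endpoint: mic + speaker
        group.setMode(AudioGroup.MODE_NORMAL);
        stream.join(group); // start sending and receiving audio
        return group;
    }
}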
You can do this with CSipSimple, which is open source: http://code.google.com/p/csipsimple/
You set up local accounts, register to yourself instead of a server, then make a phone call using TXT mode and dial remote_account_name#remote_ip_address.
A SIP peer is like an extension number configured on a SIP phone. Below are the details for creating a SIP peer; I am using CentOS 6.9 (64-bit) with Asterisk 11 installed.
You can create a SIP peer using an Asterisk server.
Open /etc/asterisk/sip.conf (e.g. vi /etc/asterisk/sip.conf) and add:
[1001]                  ; peer name, also used as the extension number
username=1001
secret=123              ; password the device authenticates with
qualify=yes             ; periodically check that the peer is reachable
type=friend             ; match on name and IP; can place and receive calls
disallow=all            ; clear the codec list first...
allow=ulaw,alaw,gsm     ; ...then allow only these codecs
host=dynamic            ; the peer may register from any address
For more detail and easier understanding, please refer to the links below:
https://youtu.be/27wm-fu25SM
or
http://rulariteducation.blogspot.in/2017/07/how-to-add-sip-peer-in-asterisk.html
