I am working with WebRTC to build a basic video call application between two Android phones. I have been searching for more than 10 days and I understand everything on the Android side, but I really can't get my head around the web side: signaling, TURN, and STUN. Unfortunately I am not a web guy (at least not yet), and I am very, very confused about the server setup. I don't even understand exactly when to use what, and why. To make the story short, what I need is:
A roadmap for continuing with the server setup.
Thank you in advance.
UPDATE:
The backend has been implemented and it seems to be working, because I receive voice without any problem. I also receive the MediaStream, which contains both the video and the audio, but no video is being displayed.
private void gotRemoteStream(MediaStream stream) {
    // We have the remote video stream; add it to the renderer.
    Log.d("KingArmstring", "gotRemoteStream: 1 stream == null" + String.valueOf(stream == null));
    Log.d("KingArmstring", "the value of the received stream: " + String.valueOf(stream));
    final VideoTrack videoTrack = stream.videoTracks.get(0);
    Log.d("TAG", "gotRemoteStream: we get here");
    runOnUiThread(() -> {
        try {
            Log.d("TAG", "we get here");
            remoteRenderer = new VideoRenderer(new VideoRenderer.Callbacks() {
                @Override
                public void renderFrame(VideoRenderer.I420Frame i420Frame) {
                    Log.d("TAG", "renderFrame: we get here");
                }
            });
            remoteVideoView.setVisibility(View.VISIBLE);
            videoTrack.addRenderer(remoteRenderer);
        } catch (Exception e) {
            e.printStackTrace();
        }
    });
}
I played around with WebRTC on Android and the web. I was able to build my own project with the help of these projects:
https://github.com/pchab/ProjectRTC
https://github.com/pchab/AndroidRTC
What I suggest is to run these projects. Once you succeed in doing that, you can start changing the code to meet your needs. Now I will explain some details about TURN and STUN.
STUN - this is a way to find out your real IP. If you use your phone on Wi-Fi, you will get an IP like 192.168.1.14. This is an internal IP; your real IP is something else. You need some server, like Google's, to tell you what your real IP is. Try typing "what is my ip" into a Google search and you will see it is different from what you see in ifconfig.
TURN - this is a relay for the voice/video data stream. Sometimes a cellular carrier blocks the voice/video data for some reason. To overcome this you can use TURN: you send the data to the TURN server, and it forwards it to the other side.
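As a concrete illustration (my own sketch, not from the linked projects), here is roughly how STUN and TURN servers are handed to the Android WebRTC API. The TURN URL and credentials are placeholders for your own deployment, "factory" and "observer" are an existing PeerConnectionFactory and PeerConnection.Observer, and the builder API assumes a reasonably recent WebRTC library (older libjingle builds use the PeerConnection.IceServer constructor instead):
List<PeerConnection.IceServer> iceServers = new ArrayList<>();
// STUN: only used to discover your public IP and port.
iceServers.add(PeerConnection.IceServer.builder("stun:stun.l.google.com:19302")
        .createIceServer());
// TURN: relays the media when a direct connection cannot be established.
// URL, username and password are placeholders for your own TURN server.
iceServers.add(PeerConnection.IceServer.builder("turn:turn.example.org:3478")
        .setUsername("username")
        .setPassword("password")
        .createIceServer());
PeerConnection.RTCConfiguration config = new PeerConnection.RTCConfiguration(iceServers);
PeerConnection peerConnection = factory.createPeerConnection(config, observer);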
Signaling - this is how one side calls the other. Say you have two guys who want to communicate: they need a way to exchange the connection data before the call starts. WebRTC doesn't give you a mechanism for this; it gives you a JSON blob that one guy needs to send to the second guy. The projects I linked use socket.io, but there are other implementations, such as FCM. The data that travels includes the first guy's IP, the codecs he wants to use, and things like that. The second guy sends back the accept response and the call begins.
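To make the signaling part concrete, here is a rough sketch (my own, not taken from the linked projects) of sending the offer JSON with the socket.io Java client. The server URL and the "message" event name are assumptions that depend on your signaling server, and "offer" is the SessionDescription WebRTC handed you:
// Sketch using io.socket:socket.io-client; names and URL are placeholders.
try {
    Socket socket = IO.socket("http://your-signaling-server:3000");
    socket.connect();

    // Caller: wrap the SDP offer produced by WebRTC into JSON and send it.
    JSONObject payload = new JSONObject();
    payload.put("type", "offer");
    payload.put("sdp", offer.description); // from SdpObserver#onCreateSuccess
    socket.emit("message", payload);

    // Callee: receive the offer, hand it to the PeerConnection, then
    // create and send back the answer the same way.
    socket.on("message", args -> {
        JSONObject msg = (JSONObject) args[0];
        // peerConnection.setRemoteDescription(...), peerConnection.createAnswer(...)
    });
} catch (URISyntaxException | JSONException e) {
    e.printStackTrace();
}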
I finally figured out the problem, thanks to Uriel, because his answer helped me a lot. My answer can't stand alone; it can only be added to his. You can see that the remoteRenderer was initialized this way:
remoteRenderer = new VideoRenderer(new VideoRenderer.Callbacks() {
    @Override
    public void renderFrame(VideoRenderer.I420Frame i420Frame) {
        Log.d("TAG", "renderFrame: we get here");
    }
});
(I added that in the UPDATE to my question.) That callback only logs each frame and never actually draws it, which is why no video showed up. Instead, we should initialize the renderer this way:
remoteRenderer = new VideoRenderer(remoteVideoView);
When I finish this part of the app, I will try to publish a Git repo for this WebRTC part so that anyone can reuse any piece of it.
I am trying to make an application that will allow a registered client to make an audio call to another registered client over Wi-Fi (it doesn't require internet).
I was able to successfully register and make a call using SIP.
After the call is picked up, I don't know how to handle the RTP stream and connect it to the microphone and speaker of the phone (Android and iOS) to get normal calling functionality.
I am using Xamarin and the SIP Sorcery library. I am new to Xamarin and to mobile application development.
Below is part of the code, to explain myself a little better:
async Task Call()
{
    Console.WriteLine("Start of Calling section");
    rtpSession = new RTPMediaSession((int)SDPMediaFormatsEnum.PCMU, AddressFamily.InterNetwork);

    // Maybe something like this to connect audio devices to the RTP session:
    //get microphone
    //get speaker
    //ConnectAudioDevicesToRtp(rtpSession, microphone, speaker);

    // Place the call and wait for the result.
    bool callResult = await userAgent.Call(DESTINATION, ssid, userName, registerPassword, domainHost, rtpSession);
    if (callResult)
    {
        Console.WriteLine("Call attempt successful. Start talking");
        // I am reaching this point and need help with how to move forward from here
        // to support audio calling on both Android and iOS.
    }
    else
    {
        Console.WriteLine("Call attempt failed.");
    }
}
Any help or direction would be appreciated. Thank you.
I looked at the documentation for SIP Sorcery, but I only found an example for Windows (https://sipsorcery.github.io/sipsorcery/articles/sipuseragent.html), not for iOS or Android.
Here is the description from SIP Sorcery for cross-platform use (https://sipsorcery.github.io/sipsorcery/). I think you need the SIPSorceryMedia.FFmpeg library.
I am currently using an app that uses the method shown in libstreaming-example-1 (libstreaming) to stream the camera from an Android device to an Ubuntu server (using OpenCV and libVLC). This way, my Android device acts as a server and waits for the client (the Ubuntu machine) to send the play signal over RTSP, then starts streaming over UDP.
The problem I am facing is that I am getting a delay of approximately 1.1 s during transmission, and I want to get it down to 150 ms maximum.
I tried to implement libstreaming-example-2 from libstreaming-examples, but I couldn't: I don't have access to detailed documentation and couldn't figure out how to send the right signal to display the stream on my server. Beyond that, I tried to see what I could do with example 1 to reduce the delay, but nothing new so far.
PS: I am using a LAN, so network/bandwidth is not the problem.
Here are the questions:
1. Which is the best way to get the lowest possible latency while streaming video from the camera?
2. How can I implement example 2?
3. Is example 2's method of streaming better for getting the latency down to 150 ms?
4. Is this latency related to the decompression of the video on the server side? (No frames are dropped; FPS: 30.)
Thank you!
I had the same issue as you, with a huge stream delay (around 1.5-1.6 seconds).
My setup is an Android device that streams its camera over RTSP using libStreaming; the receiving side is an Android device using libVlc as the media player. I found a solution that decreases the delay to 250-300 ms. It was achieved by setting up libVlc with the following parameters:
mLibvlc = new LibVLC();
mLibvlc.setVout(LibVLC.VOUT_ANDROID_WINDOW);
mLibvlc.setDevHardwareDecoder(LibVLC.DEV_HW_DECODER_AUTOMATIC);
mLibvlc.setHardwareAcceleration(LibVLC.HW_ACCELERATION_DISABLED);
mLibvlc.setNetworkCaching(150);
mLibvlc.setFrameSkip(true);
mLibvlc.setChroma("YV12");
restartPlayer();
private void restartPlayer() {
    if (mLibvlc != null) {
        try {
            mLibvlc.destroy();
            mLibvlc.init(this);
        } catch (LibVlcException lve) {
            throw new IllegalStateException("LibVLC initialisation failed: " + LibVlcUtil.getErrorMsg());
        }
    }
}
You can play with setNetworkCaching(int networkCaching) to tune the delay.
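As a side note, newer libVLC-for-Android releases (3.x) replace those setters with option strings passed to the constructor; a rough equivalent of the caching setting above, assuming the 3.x API, would be:
// Sketch for libVLC 3.x; "context" is your Android Context.
ArrayList<String> options = new ArrayList<>();
options.add("--network-caching=150"); // same effect as setNetworkCaching(150)
LibVLC libVlc = new LibVLC(context, options);
MediaPlayer mediaPlayer = new MediaPlayer(libVlc);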
Please let me know if this was helpful for you, or if you found a better solution in this or another environment.
I'm trying to control a Bluetooth bracelet with a vibration function via HFP (Hands-Free Profile) on Android. I've been able to connect to the bracelet and access the input and output streams.
My goal is to simulate an incoming call so that the Bluetooth bracelet starts vibrating (which seems to be the only way to trigger it). To do this, I'm using AT commands. In the Bluetooth specs at https://www.bluetooth.org/docman/handlers/downloaddoc.ashx?doc_id=238193, page 22 shows the handshake to establish a service level connection.
I need to establish this connection to use the "+CIEV" command (see the handshake on page 48).
But when my bracelet sends the command "AT+CIND=?", I don't know how to respond. I can't find any hints on how to answer with the "+CIND:" command. I also don't know how to send the acknowledgement (is it just "OK"?).
This might even be completely the wrong approach. Every suggestion is appreciated. I only found one post on Stack Overflow that helped me in some way; the rest of the posts I found were unanswered.
By the way, I'm using a smartphone with Android 4.1.2. The bracelet supports HFP and HSP. Thanks in advance.
UPDATE 10/29/2014
// ===== Connection through RFCOMM socket established at this point =====

// read AT+BRSF=0 from device
byte[] buffer = new byte[200];
mBluetoothSocket.getInputStream().read(buffer);
Log.d(TAG, new String(buffer).trim());

// write answer BRSF: ...
mBluetoothSocket.getOutputStream().write("+BRSF=20\r".getBytes());
mBluetoothSocket.getOutputStream().write("OK\r".getBytes());

// read AT+CIND=? command
buffer = new byte[200];
mBluetoothSocket.getInputStream().read(buffer);
Log.d(TAG, new String(buffer).trim());

// write answer CIND: ...
mBluetoothSocket.getOutputStream().write(("+CIND: (\"battchg\",(0-5)),(\"signal\",(0-5)),"
        + "(\"service\",(0,1)),(\"call\",(0,1)),(\"callsetup\",(0-3)),"
        + "(\"callheld\",(0-2)),(\"roam\",(0,1))").getBytes());
mBluetoothSocket.getOutputStream().write("OK".getBytes());

// read AT+CIND?
buffer = new byte[200];
mBluetoothSocket.getInputStream().read(buffer);
Log.d(TAG, new String(buffer).trim());
Following the procedure of the protocol, I should now receive the "AT+CIND?" command, after which I could send "+CIND: 5,5,1,0,0,0,0", but... I don't receive the "AT+CIND?" command. Actually, I'm not receiving anything. Am I missing something? Sending an "OK" doesn't change anything, by the way.
I was fiddling with exactly the same problem. After days of trial and error, I finally got it to work.
I think it depends on the speed at which you answer the HF's commands, as well as on the correct line endings (each response framed in CRLF, i.e. "\r\nCOMMAND\r\n").
Here is my DroidScript which works. It's not cleaned up, but it works.
https://gist.github.com/t-oster/68a568ac4c4e133f67ac
Also, in the one example I found that seemed to almost work, the responses are expected to be topped and tailed with CRLF:
"\r\n+BRSF=20\r\n"
"\r\nOK\r\n"
Still struggling with the rest of it myself.
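Building on that, a small helper (my own sketch, reusing mBluetoothSocket from the question's code) that frames every response in CRLF keeps this consistent:
// Sketch: every HFP response goes out as "\r\n<response>\r\n", e.g. "\r\nOK\r\n".
private void sendHfpResponse(String response) throws IOException {
    OutputStream out = mBluetoothSocket.getOutputStream();
    out.write(("\r\n" + response + "\r\n").getBytes("US-ASCII"));
    out.flush();
}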
Refer to the Bluetooth HFP 1.5 spec, where the +CIEV response is explained.
Normally, when not in any call setup, the response can be +CIND: 1,0,0,0,5,0,5.
Note these values are based on the HFP spec. On an incoming call, return +CIEV: <ind>,<value>, where <ind> is the indicator for callsetup and <value> is 1, and then send RING commands to the bracelet.
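Putting the pieces together, the incoming call simulation could look roughly like the sketch below. It reuses the sendHfpResponse helper sketched in the previous answer, and the indicator index is an assumption: it must match the position of "callsetup" in whatever +CIND list you sent earlier.
// Sketch: announce an incoming call, then repeat RING until answered or cancelled.
private void simulateIncomingCall() throws IOException, InterruptedException {
    // callsetup changes to 1 (incoming call); the index (here 3) must match
    // the position of "callsetup" in your own +CIND=? response.
    sendHfpResponse("+CIEV: 3,1");
    for (int i = 0; i < 3; i++) {
        sendHfpResponse("RING");
        Thread.sleep(2000); // RING is typically repeated every couple of seconds
    }
}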
I am new to Android and am trying to develop an application. I have a local server with an address like http://abc:9070/
i.e., the server is running only on port 9070 on my laptop.
Now I want to debug my program using an Android device, and I have to make sure that the Android device listens on port 9070, so that I can make the POST HTTP request to the URL and fetch some information.
Can someone tell me how I can make my device listen on port 9070?
Also, can someone tell me whether changing the default port number of adb would solve this?
I have tried hard to find a solution, but I have not been able to come up with any good answers.
Thanks in advance.
Nobody has expressed an opinion yet. Maybe the question is not clear; at least, I found it very difficult to understand what you are trying to do.
You say you have a server (laptop) listening on port 9070 and you want a device to connect to this server through this port. Is that right?
Have you tried launching the browser on your device and connecting to that address: http://abc:9070 ?
Anyway, the Java code to make a socket connection is something similar to this:
try
{
    Socket clientSocket = new Socket("YOUR_LAPTOP_IP", 9070);
    // 1024 is an arbitrary buffer size; it could be 512, 65535, etc.
    byte[] buffer = new byte[1024];
    int ret = 0;
    while ((ret = clientSocket.getInputStream().read(buffer)) > 0)
    {
        // from now on it's up to you what to do with the data you read
    }
    clientSocket.close();
}
catch (Exception e)
{
    e.printStackTrace();
}
This is my first post, so my apologies in advance if this is the wrong site to post this particular question to.
Question
I have integrated Nokia's MMS implementation for Android (http://androidbridge.blogspot.com/2011/03/how-to-send-mms-programmatically-in.html) into an Android application I am writing, and I am able to send MMS messages from my personal Metro PCS device to Metro PCS's MMSC; messages are delivered to any recipient without issue.
This is how I am sending the MMS:
public Boolean sendMMSMessage(final String senderNumber, final String smsText, final File imageFile, final Integer requestId){
    byte[] out;
    Enumeration keys;
    //set image File
    setImageFile(imageFile);
    //create MMMessage
    setMMMessage(new MMMessage());
    //add text
    addText(getMMMessage(), smsText, "<0>", IMMConstants.CT_TEXT_PLAIN);
    //add image file
    addFromFile(getMMMessage(), getImageFile(), "<1>", IMMConstants.CT_IMAGE_JPEG);
    //set MMEncoder
    setMMEncoder(new MMEncoder());
    getMMEncoder().setMessage(getMMMessage());
    //transaction ID (second parameter) is arbitrary
    setMessage(getMMMessage(), "T135d743a6b7", senderNumber);
    try {
        getMMEncoder().encodeMessage();
        out = getMMEncoder().getMessage();
        setMMSender(new MMSender());
        getMMSender().setMMSCURL("http://mms.metropcs.net:3128/mmsc");
        //'min' of sending device. Required by Metro PCS MMSC.
        getMMSender().addHeader("X-DEVICE-MIN", min);
        setMMResponse(getMMSender().send(out));
    } catch (Exception e) {
        System.out.println(e.getMessage());
        return false;
    }
    return (getMMResponse().getResponseCode() == IMMConstants.HTTP_RESPONSE_OK);
}
I am wondering if it is possible to tweak Nokia's code (if necessary) so that any device can send a properly constructed MMS request to Metro PCS's MMSC using my 'min' credentials. I have studied the packet flow (via Wireshark) of what happens when an MMS is successfully sent from my device to other recipients. However, when I run this same Android app on another device (a non-Metro PCS device), MMS messages fail to send, and Wireshark is not helpful in explaining why. Can anyone help point me in the direction of how I might make this work?
Update: It may help to add that logcat reports:
java.net.SocketTimeoutException: Connection timed out
Second Update: I took a look at another post regarding this issue, titled "Android sending image via MMS programatically(Operation timed out)", but unfortunately there is currently no definite answer, and that question has been live for two months. I will try increasing the read timeout as someone suggested (I doubt this is the cause), but if anyone DOES know what the problem might be and simply prefers not to give a direct answer, that is fine; I just need a hint of where to look.
Third Update: Now that I think about it, I wonder if the sender's IP address (the actual IP address used by the device) is a factor here. Can anyone confirm?
Fourth Update: I just took a closer look at the code for 'MMSender.java' (specifically at what's going on with the 'HttpURLConnection' object), and according to its setReadTimeout(ms) method, the default value ('0') establishes an infinite wait time anyway, and this method is not called anywhere in the code. Just for kicks, however, I manually set this value to 1 minute for both setConnectTimeout(ms) and setReadTimeout(ms) and, as I suspected, no dice. Same connect timeout issue.
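For anyone following along, on a plain HttpURLConnection those calls look like this (a generic sketch of what I set, not the exact Nokia MMSender code; the URL is the Metro PCS MMSC from the code above):
try {
    HttpURLConnection connection =
            (HttpURLConnection) new URL("http://mms.metropcs.net:3128/mmsc").openConnection();
    connection.setConnectTimeout(60 * 1000); // give up on the TCP connect after one minute
    connection.setReadTimeout(60 * 1000);    // give up on a stalled read after one minute
} catch (IOException e) {
    e.printStackTrace();
}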
Final Update: Sorry, I just realized that I copied this line of code from another posting some time ago:
((ConnectivityManager)getSystemService(Context.CONNECTIVITY_SERVICE)).startUsingNetworkFeature(ConnectivityManager.TYPE_MOBILE,"enableSUPL");
and now, after looking at this call more closely, I wonder if it is possible that I am supplying incorrect parameter values to startUsingNetworkFeature() (at least for the non-Metro PCS device I am trying the application on). I don't want to overdo the updates on this question, but I want you guys (or gals) to be well informed, so if these parameter values do turn out to be the problem, I will definitely post that fact, but this will be my final update. In the meantime, any advice is greatly appreciated.