I want to develop a program for video calls on Android. I thought of using the built-in SIP stack that was introduced in Android 2.3.3, but how can I initiate video calls? I see that it is not supported.
I believe the generic Android SIP stack supports video.
Taken from:
https://developer.android.com/reference/android/net/sip/package-summary.html
If you want to create generic SIP connections (such as for video calls or other), you can create a SIP connection from the SipManager, using open(). If you only want to create audio SIP calls, though, you should use the SipAudioCall class, as described above.
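A minimal sketch of registering a profile that way might look like the following (the account details are placeholders and error handling is omitted); note that open() only gives you the signalling side, so the media for a video call still has to be handled by your own code or a separate library:

```java
import android.content.Context;
import android.net.sip.SipManager;
import android.net.sip.SipProfile;

public class SipSetup {

    // Registers a SIP profile for making/receiving generic SIP calls.
    public SipProfile open(Context context) throws Exception {
        if (!SipManager.isApiSupported(context)) {
            throw new IllegalStateException("SIP API not supported on this device");
        }
        SipManager manager = SipManager.newInstance(context);

        // Placeholder credentials; use your own SIP account here.
        SipProfile profile = new SipProfile.Builder("alice", "example.com")
                .setPassword("secret")
                .build();

        manager.open(profile);
        return profile;
    }
}
```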
If you don't mind using external SIP stacks, check out this:
http://www.youtube.com/watch?v=g1NHEsXFEns
which uses Jain-SIP.
EDIT: As of late, this project seems to be the leader in the native Android SIP space:
https://code.google.com/p/csipsimple/ - open source, and they offer everything you need to make voice and video calls.
Related
I am working on an Android VoIP application that does not need to work over the PSTN. I am a complete novice in this field, and any little help will be appreciated.
I started by researching how WhatsApp voice calls work and found out that it uses PJSIP, which is an open source SIP stack library (source: What's up with WhatsApp and WebRTC? - webrtcHacks). I also found that codecs are used in VoIP to compress and then decompress the voice packets.
Knowing that, I am extremely confused between those SIP libraries and codecs. Does an Android VoIP app have to implement a SIP library? Every SIP library supports a few codecs.
Is there any general format by which I can integrate any codec into my Android app, whether it is Opus or Speex or anything like that, independent of the SIP implementation?
Maybe I sound confused, but that is the truth. Even extensive googling on this specific topic did not help me, and my last stop is this community. Any little guidance will be appreciated.
Yes, usually every app implements the codecs on its own. Some codecs are available in the Android SDK, but even in those cases a proper implementation of your own is usually better.
G.711 (PCMU and PCMA) is very simple and can be implemented within a single Java class (or even in a single function if you wish); see the sketch below. The others are more complicated, but you can find open source implementations for almost all of them.
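To give an idea of how small that is, here is a sketch of a single-function G.711 mu-law (PCMU) encoder for 16-bit PCM samples (the class and method names are just placeholders, and the matching decoder is left out):

```java
public final class G711 {

    // Encode one 16-bit linear PCM sample into an 8-bit G.711 mu-law (PCMU) byte.
    public static byte linearToMuLaw(short pcmSample) {
        final int BIAS = 0x84;   // standard mu-law bias (132)
        final int CLIP = 32635;  // clip so the biased value still fits in 15 bits

        int pcm = pcmSample;
        int sign = (pcm >> 8) & 0x80;        // remember the sign bit
        if (sign != 0) pcm = -pcm;           // work with the magnitude
        if (pcm > CLIP) pcm = CLIP;
        pcm += BIAS;

        // Find the segment (exponent): position of the highest set bit below bit 15.
        int exponent = 7;
        for (int mask = 0x4000; (pcm & mask) == 0 && exponent > 0; mask >>= 1) {
            exponent--;
        }
        int mantissa = (pcm >> (exponent + 3)) & 0x0F;

        // mu-law bytes are transmitted inverted.
        return (byte) ~(sign | (exponent << 4) | mantissa);
    }
}
```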
Also note that codecs are also implemented within PJSIP, so if you are using this library then you already have the most popular codecs available.
I am trying to write a native Android application in which I want to play a couple of audio streams. In order to have proper synchronization with the other audio streams in the rest of the Android system, I need to manage playback of these streams properly. While programming in Java, the Android framework provides APIs like AudioManager.requestAudioFocus and AudioManager.abandonAudioFocus, and also provides appropriate callbacks according to the behaviour of other audio streams.
So, is there any possible way by which I can call these methods from native code?
It seems there is one more option: using the OpenSL APIs. Does OpenSL provide methods similar to requestAudioFocus / abandonAudioFocus?
Thanks in advance
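In case it helps, the framework calls being referred to look roughly like this on the Java side; as far as I know there is no NDK equivalent, so native code usually calls up into a Java helper like this through JNI (a rough sketch, all names are placeholders):

```java
import android.content.Context;
import android.media.AudioManager;

// Hypothetical Java-side helper that native code could invoke via JNI.
public class AudioFocusHelper {

    private final AudioManager audioManager;

    private final AudioManager.OnAudioFocusChangeListener listener =
            new AudioManager.OnAudioFocusChangeListener() {
                @Override
                public void onAudioFocusChange(int focusChange) {
                    // React to AUDIOFOCUS_LOSS, AUDIOFOCUS_LOSS_TRANSIENT, etc.,
                    // e.g. by pausing or ducking your native playback.
                }
            };

    public AudioFocusHelper(Context context) {
        audioManager = (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);
    }

    public boolean requestFocus() {
        int result = audioManager.requestAudioFocus(
                listener, AudioManager.STREAM_VOICE_CALL, AudioManager.AUDIOFOCUS_GAIN);
        return result == AudioManager.AUDIOFOCUS_REQUEST_GRANTED;
    }

    public void abandonFocus() {
        audioManager.abandonAudioFocus(listener);
    }
}
```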
I am developing an Android application in which I want to make calls over the internet using SIP. So I need to maintain my own SIP server for my app users. How can I create my own SIP server?
I would not advise creating your own SIP server, as it would take a large number of man-years of development and there are a lot of pitfalls.
There are some open source implementations that you could install and set up yourself, like FreeSWITCH or Asterisk. Both are large and complex to set up, as a lot of domain knowledge is required to configure them correctly.
There are also free servers that you could try out, like Sip2Sip.
Then there is the job of creating a SIP client on Android. Again, it's not that simple either. I would look at using an open source library here as well, like pjsip. This gives you the advantage of being able to look at examples of full SIP clients already developed for Android, like CSipSimple. pjsip also has the advantage of being cross-platform, so you could reuse it on iOS, for example.
Good luck.
The Server
As a communication server, choose, for example, sip:providerCE v2.6. The easiest way to get started with it is to download the VMware or VirtualBox image and fire it up on a suitable machine. If you get more serious, you will want to install the system from scratch on a dedicated server with a public static IP. If you're new to VoIP and SIP, do NOT try to install it on an Amazon EC2 instance, as EC2 uses destination NAT, which is a big pain for SIP and needs some experience with the SPCE to tweak it properly for that scenario.
Note that the SPCE is a 64-bit system, so in order to run the VM images you need to turn on 64-bit CPU virtualization in your BIOS if VMware or VirtualBox warns you about it.
There is a very good tutorial HERE on how to set things up.
Also, don't forget there is technical advice concerning SIP; check the accepted answer.
Last but not least, check out THIS VoIP wiki; it covers everything related to VoIP.
SIP (RFC 3261) is a very easy protocol to implement: just create a socket listener and implement the RFC. Start with a basic codec like GSM, then move up to A-law (G.711) as needed.
The tricky parts with SIP are (a) ensuring your call flows are correct (RFC 3665) and (b) media encoding/compression. Use Asterisk (or FreeSWITCH) and Wireshark to test your call flows. If you need DTMF support, you'll need RFC 2833. If you need advanced codecs, consider using an open source library like FFmpeg.
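As a starting point for the socket listener, a bare-bones sketch that only receives SIP messages on UDP port 5060 and prints the request line might look like this (it does no parsing and sends no responses, so it is nowhere near a compliant stack):

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.nio.charset.StandardCharsets;

public class SipListener {
    public static void main(String[] args) throws Exception {
        // SIP signalling normally travels over UDP port 5060 (RFC 3261).
        try (DatagramSocket socket = new DatagramSocket(5060)) {
            byte[] buffer = new byte[4096];
            while (true) {
                DatagramPacket packet = new DatagramPacket(buffer, buffer.length);
                socket.receive(packet);
                String message = new String(packet.getData(), 0, packet.getLength(),
                        StandardCharsets.UTF_8);

                // The first line is the request line, e.g. "INVITE sip:bob@example.com SIP/2.0".
                String requestLine = message.split("\r\n", 2)[0];
                System.out.println("From " + packet.getAddress() + ": " + requestLine);

                // A real stack would parse the headers, track transactions and dialogs,
                // and send the appropriate responses (100 Trying, 180 Ringing, 200 OK, ...).
            }
        }
    }
}
```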
I used uSIPserver on Android. It works well and supports video calls. If you use a client app which supports video, then you can make video calls to each other over Wi-Fi.
It is so simple to use.
Good luck :-)
I want to develop an app which uses 3G for video calling, where the secondary camera will be used for the video call.
Is it possible to make such a video call? If it is, please give me some reference or a tutorial.
You should look for a SIP protocol stack for Android. There are various open source projects out there: look at SipDroid; IMSDroid is also a good example. A SIP API is also available in Android from API level 9 onwards.
Another one worth looking at is the CSipSimple project. It uses pjsua as its SIP library. Video calling is being developed in a branch of the project.
I have not tested it yet, but it seems to work according to an issue in the project's tracker. The issue starts getting interesting from comment 27 onwards, since after that point video calls seem to be working.
Check it out and look into it. It will probably take some time to find your way around the code at first.
Since Android API 12, RTP has been supported in the SDK, which includes RtpStream as the base class, plus AudioStream, AudioCodec, and AudioGroup. However, there is no documentation, and no examples or tutorials, to help me use these specific APIs to take input from the device's microphone and output it to an RTP stream.
Where do I specify using the mic as the source, and not to use a speaker? Does it perform any RTCP? Can I extend the RtpStream base class to create my own VideoStream class (ideally I would like to use these for video streaming too)?
Any help out there on these new(ish) APIs please?
Unfortunately these APIs are the thinnest necessary wrapper around native code that performs the actual work. This means that they cannot be extended in Java, and to extend them in C++ you would, I believe, need a custom Android build.
As far as I can see, the AudioGroup cannot actually be set to not output sound.
I don't believe it does any RTCP, but my use of it doesn't involve RTCP, so I would not know.
My advice is that if you want to be able to extend functionality or have greater flexibility, you should find a C or C++ native library that someone has written or ported to Android and use that instead. This should allow you to control what audio it uses and to add video streaming and other such extensions.
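For reference, basic use of the android.net.rtp classes mentioned in the question looks roughly like this (a minimal sketch; the remote address and port are placeholders that would normally come from SIP/SDP negotiation, and the INTERNET and RECORD_AUDIO permissions are required):

```java
import android.media.AudioManager;
import android.net.rtp.AudioCodec;
import android.net.rtp.AudioGroup;
import android.net.rtp.AudioStream;
import android.net.rtp.RtpStream;

import java.net.InetAddress;

public class RtpAudioSession {

    public void start(AudioManager audioManager,
                      InetAddress localAddress,
                      InetAddress remoteAddress,
                      int remotePort) throws Exception {
        // Put the device into VoIP mode so mic/speaker routing behaves as expected.
        audioManager.setMode(AudioManager.MODE_IN_COMMUNICATION);

        // An AudioStream binds a local UDP port for RTP on the given interface.
        AudioStream stream = new AudioStream(localAddress);
        stream.setCodec(AudioCodec.PCMU);          // G.711 mu-law
        stream.setMode(RtpStream.MODE_NORMAL);     // send and receive
        stream.associate(remoteAddress, remotePort);

        // The AudioGroup ties joined streams to the device microphone and speaker;
        // the mic/speaker handling is internal rather than exposed per stream.
        AudioGroup group = new AudioGroup();
        group.setMode(AudioGroup.MODE_NORMAL);
        stream.join(group);
    }
}
```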