I am working on an Android VoIP application that does not need to work over the PSTN. I am a complete novice in this field and any help will be appreciated.
I started by researching how WhatsApp voice calling works and found out that it uses PJSIP, which is an open source SIP stack library (source: What's up with WhatsApp and WebRTC? - webrtcHacks). I also found that codecs are used in VoIP to compress and then decompress the audio packets.
Knowing that, I am extremely confused about the relationship between SIP libraries and codecs. Does an Android VoIP app have to implement a SIP library? Every SIP library seems to support only a few codecs.
Is there a general way to integrate any codec into my Android app, whether it is Opus or Speex or anything like that, independently of the SIP implementation?
Maybe I sound confused, but that is where I am. Even extensive googling on this specific topic did not help me, and my last stop is this community. Any guidance will be appreciated.
Yes, usually every app implements the codecs on its own. Some codecs are available in the Android SDK, but even in those cases a proper implementation of your own is better.
G.711 (PCMU and PCMA) is very simple and can be implemented within a single Java class (or even in a single function if you wish). The others are more complicated, but you can find open source implementations for almost all of them.
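For example, the mu-law (PCMU) half of G.711 really does fit in one function. Here is a sketch of the standard reference algorithm in plain Java (not taken from any particular library); the decoder is similarly small:

    /**
     * Encodes one 16-bit linear PCM sample to an 8-bit G.711 mu-law (PCMU) byte,
     * following the usual CCITT reference algorithm.
     */
    public static byte linearToMuLaw(short sample) {
        final int BIAS = 0x84;   // 132, added so the logarithmic segments line up
        final int CLIP = 32635;  // clip to avoid overflow once the bias is added
        int pcm = sample;
        int sign = (pcm >> 8) & 0x80;
        if (sign != 0) pcm = -pcm;
        if (pcm > CLIP) pcm = CLIP;
        pcm += BIAS;

        // Segment (exponent): position of the highest set bit between bit 7 and bit 14.
        int exponent = 7;
        for (int mask = 0x4000; (pcm & mask) == 0 && exponent > 0; mask >>= 1) {
            exponent--;
        }
        int mantissa = (pcm >> (exponent + 3)) & 0x0F;

        // Mu-law bytes are transmitted bit-inverted.
        return (byte) ~(sign | (exponent << 4) | mantissa);
    }

You would run every 16-bit sample of a captured audio frame through this before handing the frame to your RTP layer.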
Also note that codecs are implemented within PJSIP itself, so if you are using this library then you already have the most popular codecs available.
Related
I am writing a custom Android application that allows the user to make VOIP calls using SIP. When the user presses a button, a voice call is initiated with another SIP user. That's it. The only other requirements are that it has to work on 2.3+ Android devices and must not be limited to wifi only.
Android already includes a SIP stack (as of 2.3) and I was able to modify the Walkie-Talkie sample project to work exactly how I wanted it to. Unfortunately I was not able to use this app with certain devices (Casio Commando being one of them), and worse, the application only works over wifi (as I mentioned, unacceptable for my project).
I started searching for another SIP stack with an easy to use API and came across CSipSimple. I compiled it and ran their demo project (SipHome) and it worked perfectly on my unsupported phones (including the Commando), and it also worked over mobile data rather than just wifi, satisfying my requirements. I was so excited... until I looked at the source code for CSipSimple. I have no idea how to begin extracting out the actual calls to the underlying pjsip API, nor was I able to get the pjsip demo application working after 10+ hours.
Has anyone deconstructed CSipSimple and separated out the SIP parts from its incredibly complicated UI, or does anyone know of a simpler-to-use SIP library? With the native SIP API I was able to write a 200-line Android activity that made the call perfectly... how can I accomplish this with a third-party SIP stack that supports non-wifi?
Thanks for any input, I know quite a few people have gotten stuck at this same stage.
Instead of going for the more complicated CSipSimple, you could start with the much more basic apjsua, which runs the same pjsip stack, so it should hopefully fit your requirements too: http://trac.pjsip.org/repos/wiki/Getting-Started/Android.
I totally agree with Balint: apjsua is the app that gives you a better understanding. It may not be obvious how to start with it when you're not familiar with C (like I was), but it's much more efficient this way.
You can take a look at www.pjsip.org, but just be careful about which package you download, because the tutorial isn't very clear: for instance, they talk a lot about apjsua (the Android implementation of pjsua), yet this app is not included in the download link they provide. You can see my question about that here:
where's apjsua?
And of course you'll have to follow the Android tutorial on the pjsip website.
Hope this helps.
Use CSipSimple as a library project. There is an API in the CSipSimple project for using it as a library. You can bind to the CSipSimple service and make calls.
Register BroadcastReceivers and intent filters to get callbacks from CSipSimple. Analyse the InCall activity in CSipSimple for more details; a rough sketch of that wiring is below.
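As a very rough sketch of what binding and listening for callbacks looks like (the action strings here are placeholders, not CSipSimple's real identifiers; check its SipManager/SipService sources and the AIDL files in its API package for the exact names):

    import android.content.BroadcastReceiver;
    import android.content.ComponentName;
    import android.content.Context;
    import android.content.Intent;
    import android.content.IntentFilter;
    import android.content.ServiceConnection;
    import android.os.IBinder;

    public class CsipSimpleBinding {
        // Placeholder action strings; look up the real ones in the CSipSimple source.
        private static final String ACTION_SIP_SERVICE  = "com.csipsimple.service.SipService";
        private static final String ACTION_CALL_CHANGED = "com.csipsimple.service.CALL_CHANGED";

        private final ServiceConnection connection = new ServiceConnection() {
            @Override public void onServiceConnected(ComponentName name, IBinder binder) {
                // Wrap the binder with the AIDL stub shipped in CSipSimple's API package,
                // then use it to register an account and place calls.
            }
            @Override public void onServiceDisconnected(ComponentName name) { }
        };

        private final BroadcastReceiver callStateReceiver = new BroadcastReceiver() {
            @Override public void onReceive(Context context, Intent intent) {
                // Inspect the intent extras to follow call state (ringing, answered, ended).
            }
        };

        public void attach(Context context) {
            context.bindService(new Intent(ACTION_SIP_SERVICE), connection, Context.BIND_AUTO_CREATE);
            context.registerReceiver(callStateReceiver, new IntentFilter(ACTION_CALL_CHANGED));
        }

        public void detach(Context context) {
            context.unbindService(connection);
            context.unregisterReceiver(callStateReceiver);
        }
    }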
I'm fairly new to Android development, but from what I've read I am pretty sure there is no direct API access to the in-call audio stream. I need to make something very close to an answering machine, so I was wondering if there is any API support for passing an audio stream to the microphone internally and using that as a workaround for not having access to the call stream. I.e. someone calls, we answer, mute the microphone so that the caller can't hear the environment, and pass an alternative audio stream (some kind of recording) to the microphone while it is muted. That is the only workaround I can think of, and I have no idea if something like that is possible, but I will appreciate any feedback. Thanks!
No, Android does not support injection of audio into the voice call uplink. There are some mobile platforms that support this functionality, but it's not something that's available to app developers, since there's no API in place for it, regardless of whether you're using the Java APIs or the native APIs.
The accepted answer isn't correct any more. I don't know, maybe it was true 9 years ago, but I know that it's now possible to inject audio into the uplink using the tinyalsa library.
I am looking for general advice (technologies, best practice etc.) regarding the development of a VoIP application for Android. Similar questions have been posted, but I included specific questions.
I did quite some research and noticed that there are several possible solutions. Of course I know about the SIP stack in the SDK (and that it even includes voice transmission), but since it's not available on most devices, I don't intend to use it. I also read about Adobe Flex implementations, but I would like to stick to something native.
What stuck in my mind is the following:
initiate the session using SIP (the server exists); use SDP in the messages to describe the session.
establish a P2P connection (firewalls/NAT may be a problem; STUN could be used, and DNS may be involved then).
do the actual transmission; I believe the protocol layering involved is (payload) in RTP, in UDP, in IP (a rough sketch of that layering is below).
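To make that layering concrete, here is a minimal sketch of wrapping an already-encoded audio frame in a 12-byte RTP header and sending it over UDP (the SSRC and payload type values are just examples):

    import java.net.DatagramPacket;
    import java.net.DatagramSocket;
    import java.net.InetAddress;

    public class RtpSender {
        private final DatagramSocket socket;
        private final InetAddress peer;
        private final int port;
        private short seq = 0;
        private int timestamp = 0;
        private final int ssrc = 0x12345678;          // arbitrary stream identifier

        public RtpSender(InetAddress peer, int port) throws Exception {
            this.socket = new DatagramSocket();
            this.peer = peer;
            this.port = port;
        }

        /** Prepends a 12-byte RTP header to one encoded frame and sends it over UDP. */
        public void send(byte[] payload, int payloadType, int samplesPerFrame) throws Exception {
            byte[] packet = new byte[12 + payload.length];
            packet[0] = (byte) 0x80;                  // V=2, no padding, no extension, no CSRC
            packet[1] = (byte) (payloadType & 0x7F);  // e.g. 0 for PCMU, 8 for PCMA
            packet[2] = (byte) (seq >> 8);
            packet[3] = (byte) seq;
            packet[4] = (byte) (timestamp >> 24);
            packet[5] = (byte) (timestamp >> 16);
            packet[6] = (byte) (timestamp >> 8);
            packet[7] = (byte) timestamp;
            packet[8] = (byte) (ssrc >> 24);
            packet[9] = (byte) (ssrc >> 16);
            packet[10] = (byte) (ssrc >> 8);
            packet[11] = (byte) ssrc;
            System.arraycopy(payload, 0, packet, 12, payload.length);
            socket.send(new DatagramPacket(packet, packet.length, peer, port));
            seq++;
            timestamp += samplesPerFrame;             // e.g. 160 for 20 ms of 8 kHz audio
        }
    }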
This seems quite complicated at the moment so my first question is:
1) Is this a standard approach? Is it best practice? I got some hints that MSRP could be used instead to transmit content, but I have read that it is only for IM, file transfer, etc.
2) Which SIP stack should I use for the best results/performance? I can use Java/Eclipse for development. I was thinking of choosing JSIP (or tinysip, which is based on jsip), but am not sure.
3) Please give me a few hints about how to implement the data transmission (RTP) in Java on Android.
As a last note, I am not excluding pjsip at all. I am thinking that it may actually be faster since it uses the NDK (I could switch to C++, no problem). I also read that it already includes audio/video transmission.
I just don't know how easy it is to use and extend it and how good it really is. If you have used it, please let me know!
Thank you.
PS: Although not urgent or certain, portability may be an important factor in the future.
I will need video transmission as well in the near future.
You should check out the IMSDROID project. It uses the Doubango framework, which is written in C and is highly portable. What's more, it is open source too, so you can play around with their code and possibly contribute back to the community.
http://code.google.com/p/imsdroid/
cheers :)
Take a look at TeamSpeak. They provide native SDKs, but they aren't free.
It's not P2P; it requires a server (at least as far as I know).
It may be worth a minute.
http://www.teamspeak.com/?page=teamspeak3sdk
Writing your own SIP stack takes a few months even for only basic features.
I would recommend using an existing SIP stack. There are a few open source ones discussed here.
I want to develop a program for video calls on Android. I thought of using the built-in SIP stack that was introduced in Android 2.3.3, but how can I initiate video calls? I see that it is not supported.
I believe the generic Android SIP stack supports video.
Taken from:
https://developer.android.com/reference/android/net/sip/package-summary.html
If you want to create generic SIP connections (such as for video calls or other), you can create a SIP connection from the SipManager, using open(). If you only want to create audio SIP calls, though, you should use the SipAudioCall class, as described above.
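For the audio case, a minimal sketch with the built-in stack looks roughly like this (the account details and timeout are placeholders):

    import android.content.Context;
    import android.net.sip.SipAudioCall;
    import android.net.sip.SipManager;
    import android.net.sip.SipProfile;

    public class SipCallHelper {
        /** Opens a local SIP profile and places an audio call to peerUri (e.g. "sip:bob@example.com"). */
        public static SipAudioCall callPeer(Context context, String peerUri) throws Exception {
            SipManager manager = SipManager.newInstance(context);

            SipProfile.Builder builder = new SipProfile.Builder("alice", "sip.example.com");
            builder.setPassword("secret");              // placeholder credentials
            SipProfile localProfile = builder.build();
            manager.open(localProfile);                 // register the profile with the SIP server

            SipAudioCall.Listener listener = new SipAudioCall.Listener() {
                @Override public void onCallEstablished(SipAudioCall call) {
                    call.startAudio();                  // start streaming once the peer answers
                    call.setSpeakerMode(true);
                }
                @Override public void onCallEnded(SipAudioCall call) {
                    call.close();
                }
            };

            // 30-second timeout for the call attempt
            return manager.makeAudioCall(localProfile.getUriString(), peerUri, listener, 30);
        }
    }

For video you would have to build the media handling yourself on top of open(), since only audio calls get a convenience class.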
If you don't mind using external SIP stacks, check out this:
http://www.youtube.com/watch?v=g1NHEsXFEns
which uses Jain-SIP.
EDIT: As of late, this project seems to be the leader in the native Android SIP space:
https://code.google.com/p/csipsimple/ - open source, and they offer everything you need to make voice and video calls.
Can somebody give me some direction on how to synthesize the sounds of instruments (piano, drums, guitar, etc.)?
I am not even sure what to look for.
Thanks
Not sure if this is still the case, but Android seems to have latency issues that inhibit it from being able to do true sound synthesis. NanoStudio, in my opinion, is the best audio app on iOS, and the author so far refuses to make an Android version because the framework isn't there yet.
See these links:
http://www.google.com/search?sourceid=chrome&ie=UTF-8&q=nanostudio+android#hl=en&q=+site:forums.blipinteractive.co.uk+nanostudio+android&bav=on.2,or.r_gc.r_pw.&fp=ee1cd411508a9e34&biw=1194&bih=939
It all depends on what kind of application you're making; if it's going to be an Akai APC-style pad firing off sounds, you could be all right. If you're after true synthesis (crafting waveforms so they replicate pianos, guitars, and drums), which is what JASS (mentioned in another answer) does, then Android might not be able to handle it.
If you're looking for a guide on emulating organic instruments via synthesis, check out the books by Fred Welsh: http://www.synthesizer-cookbook.com/
Synthesizing a guitar, piano, or natural drums would be difficult. Triggering samples that you pass through a synthesis engine is less so. If you want to synthesize analog synth sounds, that's easier.
Here is a project out there you might be able to grab code from:
https://sites.google.com/site/androidsynthesizer/
In the end, if you want to create a full synthesizer or multi-track application, you'll have to render your oscillators, filters, etc. into an audio stream that can be pushed to the audio output (a sketch of this is below). You don't necessarily need MIDI to do that.
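As a minimal sketch of that rendering loop (this uses AudioTrack, which accepts raw PCM buffers directly, as one way to get the stream out; a real synth would sum several oscillators and filters into the same buffer):

    import android.media.AudioFormat;
    import android.media.AudioManager;
    import android.media.AudioTrack;

    public class SineVoice {
        /** Renders a sine oscillator into 16-bit PCM and streams it to the audio output. */
        public static void play(double frequencyHz, int durationMs) {
            final int sampleRate = 44100;
            int minBuf = AudioTrack.getMinBufferSize(sampleRate,
                    AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);
            AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate,
                    AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
                    minBuf, AudioTrack.MODE_STREAM);
            track.play();

            int totalSamples = sampleRate * durationMs / 1000;
            short[] buffer = new short[1024];
            double phase = 0.0;
            double step = 2.0 * Math.PI * frequencyHz / sampleRate;
            int written = 0;
            while (written < totalSamples) {
                int n = Math.min(buffer.length, totalSamples - written);
                for (int i = 0; i < n; i++) {
                    buffer[i] = (short) (Math.sin(phase) * Short.MAX_VALUE * 0.8); // leave headroom
                    phase += step;
                }
                track.write(buffer, 0, n);     // blocks until the hardware has room
                written += n;
            }
            track.stop();
            track.release();
        }
    }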
Here is one persons experience:
http://jazarimusic.com/2011/06/audio-on-android-a-developers-perspective/
Interesting read.
Two projects that might be worth looking at are JASS (Java Audio Synthesis System) and Pure Data. Pure Data is quite interesting, though probably the harder path.
MIDI support on Android sucks. (So does audio support in general, but that's another story.) There's an interesting blog post here that discusses the (lack of) MIDI capabilities on Android. Here's what he did to work around some of the limitations:
Personally I solved the dynamic midi generation issue as follows: programmatically generate a midi file, write it to the device storage, initiate a mediaplayer with the file and let it play. This is fast enough if you just need to play a dynamic midi sound. I doubt it’s useful for creating user controlled midi stuff like sequencers, but for other cases it’s great.
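A minimal sketch of that workaround might look like the following; the one-note MIDI file content and the cache-directory location are just illustrative choices:

    import java.io.File;
    import java.io.FileOutputStream;

    import android.content.Context;
    import android.media.MediaPlayer;

    public class MidiNotePlayer {
        /** Writes a one-note, format-0 Standard MIDI File to the cache dir and plays it. */
        public static void playMiddleC(Context context) throws Exception {
            byte[] smf = new byte[] {
                // Header chunk: "MThd", length 6, format 0, one track, 96 ticks per quarter note
                'M', 'T', 'h', 'd', 0, 0, 0, 6, 0, 0, 0, 1, 0, 96,
                // Track chunk: "MTrk", 15 bytes of events
                'M', 'T', 'r', 'k', 0, 0, 0, 15,
                0x00, (byte) 0xC0, 0x00,        // program change: acoustic grand piano
                0x00, (byte) 0x90, 0x3C, 0x60,  // note on: middle C, velocity 96
                0x60, (byte) 0x80, 0x3C, 0x40,  // after 96 ticks (one beat): note off
                0x00, (byte) 0xFF, 0x2F, 0x00   // end of track
            };

            File file = new File(context.getCacheDir(), "note.mid");
            FileOutputStream out = new FileOutputStream(file);
            out.write(smf);
            out.close();

            MediaPlayer player = new MediaPlayer();
            player.setDataSource(file.getAbsolutePath());
            player.prepare();
            player.start();                     // remember to release() when playback finishes
        }
    }

For anything interactive (a sequencer, live instruments) the file round trip adds latency, which is the limitation the quoted post acknowledges.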
Android unfortunately took out MIDI support in the official Java SDK.
That is, you cannot play audio streams directly. You must use the provided MediaStream classes.
You will have to use some DSP (digital signal processing) knowledge and the NDK in order to do this.
I would not be surprised if there was a general package (not necessarily for Android) to allow you to do this.
I hope this pointed you in the right direction!