I was going through a sample VoIP SDK for Android. The SDK providers say that they are using the G.729 voice codec in the SDK, but the codec implementation is hidden. Is there any way to clearly identify the voice codec used in the application?
There is no way that I know of if you only have access to the raw audio data. However, if you can somehow access the signaling data (e.g. SIP), you can look at the negotiated SDP payload type. If it is 18, then the call is encoded with G.729.
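For illustration, here is a minimal sketch in plain Java of checking the m=audio line for static payload type 18; the SDP string is a made-up example of what you might capture from a SIP message body (e.g. a 200 OK):

```java
// Minimal sketch: inspect the m=audio line of an SDP answer for payload type 18 (G.729).
// The SDP string is a hypothetical example; in practice you would capture it from
// the signaling traffic.
public class SdpCodecCheck {
    public static void main(String[] args) {
        String sdp = "v=0\r\n"
                + "o=- 0 0 IN IP4 192.0.2.1\r\n"
                + "s=-\r\n"
                + "m=audio 49170 RTP/AVP 18 0 101\r\n"
                + "a=rtpmap:18 G729/8000\r\n";

        for (String line : sdp.split("\r\n")) {
            if (line.startsWith("m=audio")) {
                // Format: m=audio <port> <proto> <payload types...>
                String[] fields = line.split(" ");
                for (int i = 3; i < fields.length; i++) {
                    if (fields[i].equals("18")) {
                        System.out.println("Static payload type 18 negotiated: G.729");
                    }
                }
            }
        }
    }
}
```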
Hope that helps...
In an Android application, we're considering implementing the Deezer SDK with some post-processing of audio sources to boost audio quality for end users. To make this possible, we need to get raw PCM samples directly from the SDK, but sadly there seems to be no such API. (I just looked through the API documentation; it only provides their native Android player with playback controls.)
Is there a way to get this to work? Or does Deezer have plans to provide such a feature?
Thanks :)
Unfortunately, no, there's no way to get the raw PCM of Deezer's tracks using the SDK.
As stated in the comment, Deezer needs to make sure that the music is not pirated in any way.
Giving away the raw PCM data would be an easy way to allow piracy.
As for a feature allowing developers to apply post-processing to the PCM data: IMHO it would be tricky to make sure the PCM still isn't leaked, but it's a good idea for a new feature.
I would like to know whether the following is possible with Android or not. I keep searching online, but with no luck. I feel it may be possible with newer versions of Android, so I want to check with the experts on Stack Overflow.
Can I auto-answer a call and play a pre-defined audio file? While doing this, the microphone and speaker should not be used.
Once we auto-answer the call, can we play pre-defined audio files based on the DTMF tones received from the other end? Are there native APIs to read the DTMF tones? Put simply, can I build an "IVR system" as an Android app? (Asking too much? Send me your suggestions.)
Can we record the telephony streams as audio files? We can write a transcoder if we have access to the streams through the native APIs.
I may be asking too much here, because I am new to Android and did not find any definitive answer online.
Thanks in advance,
- PC Varma
I am a beginner in Android development and feel like the Google documentation is not able to help me out. Does anyone know whether it is possible to send an audio file directly to the uplink during a call? If it is possible, please also share how.
There are no Android APIs that allow you to access, read from, or write to the audio stream in a call. If any manufacturer provides such APIs, I am not aware of them.
The reason for this is that, at least in AOSP Android, the in-call audio path is handled at the hardware level, and not much control over it is available to Android itself.
It is not possible via the NDK either. The only way you'd have a chance of achieving this is if you were to modify and build Android directly from source.
Is it possible to intercept audio data using the Google+ Hangouts API? I am writing an app using Google+ Hangouts for Android and I would like to process the audio. To be precise, I want to denoise speech and use speech-to-text (e.g. Google search, Sphinx) to enable basic voice commands.
Because I have full control of the Android app, it doesn't matter to me whether I get a callback with audio data from the hangout, or whether I record audio using Android's AudioRecord and then somehow forward that data to the hangout (though the latter solution would be better, because we could denoise on the Android device). Actually, I would be happy with any feasible workaround that may work at this stage of the API.
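For reference, capturing raw PCM on Android with AudioRecord looks roughly like this (a minimal sketch; forwarding the samples into the hangout is left out, since that is exactly the entry point the API does not provide):

```java
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;

// Minimal sketch: capture raw PCM from the microphone with AudioRecord.
// Requires the RECORD_AUDIO permission in the manifest.
public class MicCapture {
    private static final int SAMPLE_RATE = 16000; // 16 kHz mono is enough for speech

    public void capture() {
        int bufferSize = AudioRecord.getMinBufferSize(
                SAMPLE_RATE,
                AudioFormat.CHANNEL_IN_MONO,
                AudioFormat.ENCODING_PCM_16BIT);

        AudioRecord recorder = new AudioRecord(
                MediaRecorder.AudioSource.MIC,
                SAMPLE_RATE,
                AudioFormat.CHANNEL_IN_MONO,
                AudioFormat.ENCODING_PCM_16BIT,
                bufferSize);

        short[] buffer = new short[bufferSize / 2];
        recorder.startRecording();
        int read = recorder.read(buffer, 0, buffer.length);
        // ... denoise buffer[0..read) here, then hand it to the recognizer ...
        recorder.stop();
        recorder.release();
    }
}
```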
The Hangouts API is not going to help you develop this feature.
What you need is a platform-agnostic API for accessing hangouts data. The API is instead intended to solve a different problem: it allows you to write HTML/JavaScript applications that run inside the canvas of hangouts running in desktop web browsers.
One possible "workaround" that I'm currently investigating, myself—
publish the hangout "on air"
get the YouTube live ID (available since roughly 2012-08-22, i.e. Hangout API 1.2) via gapi.hangout.onair.getYouTubeLiveId(): https://developers.google.com/+/hangouts/api/gapi.hangout.onair#gapi.hangout.onair.getYouTubeLiveId (note that this can apparently only be retrieved by the host)
grab http://www.youtube.com/watch?v=${LIVEID} (suggestion: look at youtube-dl: http://rg3.github.com/youtube-dl/documentation.html)
and then use ffmpeg to process the FLV, as sketched below
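As a rough sketch of the last two steps (assuming you already have the live ID from the Hangout app, and that youtube-dl and ffmpeg are installed on the machine doing the processing; the file names here are made up):

```java
import java.io.IOException;

// Rough sketch: download the "on air" stream with youtube-dl, then extract the
// audio with ffmpeg. Assumes both tools are on the PATH; "hangout.flv" and
// "hangout.wav" are hypothetical file names.
public class HangoutAudioGrab {
    public static void main(String[] args) throws IOException, InterruptedException {
        String liveId = args[0]; // obtained via gapi.hangout.onair.getYouTubeLiveId()

        new ProcessBuilder("youtube-dl",
                "-o", "hangout.flv",
                "http://www.youtube.com/watch?v=" + liveId)
                .inheritIO().start().waitFor();

        new ProcessBuilder("ffmpeg",
                "-i", "hangout.flv",
                "-vn",          // drop the video track
                "-ar", "16000", // 16 kHz is typical speech-recognizer input
                "-ac", "1",     // mono
                "hangout.wav")
                .inheritIO().start().waitFor();
    }
}
```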
Information for this answer was primarily taken from "Downloading videos in flv format from youtube" and http://grokbase.com/t/gg/google-plus-developers/128fbteedb/google-hangout-api-url-to-youtube-stream-screenshot-and-hangout-topic
After carrying out a lot of research, I have come to the conclusion that Java and the Java Media Framework (JMF) are not suitable for developing a streaming server that supports the RTSP protocol on the server side for streaming video and audio. I have read very good things about the Live555 media server and about using the testOnDemandRTSPServer source code as a basis for the design. My only worry is that it is written in C++ and I am predominantly a Java programmer. This server is a large portion of my final year project at university, so my degree kind of hangs on its successful implementation, and I am running out of time. If anyone has experience with implementing an RTSP server that can stream to an Android handset, or believes they can point me in the right direction to learn how to do it, please let me know. Thanks in advance.
My project also has an RTSP server module to be run on an Android phone. I think we can build the RTSP library as a native .so file and interface with Java using JNI.
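A minimal sketch of what the Java side of that JNI bridge could look like ("rtspserver", i.e. librtspserver.so, and the native method names are hypothetical; they must match the functions exported by your C++ code):

```java
// Minimal sketch of a JNI wrapper around a hypothetical native RTSP server library.
public class RtspServer {
    static {
        System.loadLibrary("rtspserver"); // loads librtspserver.so
    }

    // Implemented in C++ and registered via JNI.
    private native long nativeStart(int port);
    private native void nativeStop(long handle);

    private long handle;

    public void start(int port) {
        handle = nativeStart(port);
    }

    public void stop() {
        nativeStop(handle);
    }
}
```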
This also works for Android!
http://net7mma.codeplex.com/
You can see the article on CodeProject at http://www.codeproject.com/Articles/507218/Managed-Media-Aggregation-using-Rtsp-and-Rtp
The live555 RTSP server is a fully fledged RTSP server that implements most payloads (H.263, H.264, MPEG-2, PCM, AMR, AAC, etc.). You can read up on the website whether it already supports the media types you want to stream. It also features an RTSP client.
With respect to streaming to an Android handset: that is the whole point of RTSP. It doesn't matter what type of client you're streaming to, and as for the server-side development, there isn't really much dev to do unless you need to implement an unsupported media type. The code can be quite complex if you're not well versed in C++, but it sounds like your goal is more related to setting up streaming to Android than to implementing the RTSP server and client yourself? So check whether live555 supports your media types, and if it does, I wouldn't bother writing one in Java; that can be quite involved. If you do choose to go that route, your best friend is of course the RFC (http://tools.ietf.org/html/rfc2326).
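To give a feel for the protocol, here is a minimal sketch of a hand-rolled RTSP exchange in Java (the host and stream URL are placeholders; 554 is the default RTSP port, and a server like live555 would answer an OPTIONS request with the methods it supports):

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.net.Socket;
import java.nio.charset.StandardCharsets;

// Minimal sketch: raw RTSP OPTIONS request over TCP (RFC 2326).
// "example.com" and the stream URL are placeholders.
public class RtspProbe {
    public static void main(String[] args) throws Exception {
        try (Socket socket = new Socket("example.com", 554)) {
            String request = "OPTIONS rtsp://example.com/stream RTSP/1.0\r\n"
                    + "CSeq: 1\r\n"
                    + "\r\n";
            OutputStream out = socket.getOutputStream();
            out.write(request.getBytes(StandardCharsets.US_ASCII));
            out.flush();

            BufferedReader in = new BufferedReader(
                    new InputStreamReader(socket.getInputStream(), StandardCharsets.US_ASCII));
            String line;
            // Expect something like "RTSP/1.0 200 OK" followed by a Public: header
            // listing the supported methods (DESCRIBE, SETUP, PLAY, ...).
            while ((line = in.readLine()) != null && !line.isEmpty()) {
                System.out.println(line);
            }
        }
    }
}
```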
As for the client, I'm not sure whether Android already has an RTSP library/client. The other thing you have to consider is which media types are supported by Android.
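For what it's worth, Android's built-in MediaPlayer can be pointed at rtsp:// URLs directly, for the media types the platform supports. A minimal sketch, with a placeholder URL:

```java
import android.media.MediaPlayer;

// Minimal sketch: play an RTSP stream with Android's built-in MediaPlayer.
// The URL is a placeholder; whether playback works depends on the codecs
// the device supports.
public class RtspClientExample {
    public void play() throws java.io.IOException {
        MediaPlayer player = new MediaPlayer();
        player.setDataSource("rtsp://example.com/stream"); // placeholder URL
        player.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
            @Override
            public void onPrepared(MediaPlayer mp) {
                mp.start(); // begin playback once buffering is ready
            }
        });
        player.prepareAsync(); // prepare in the background
    }
}
```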