Could anyone tell me whether it is possible to use the ALSA library directly from native C code on Android?
Because I receive raw data on a socket fd in native C code, I think it is better to play it out in native C as well. I have searched a lot and found that I could play PCM data directly through the msm_pcm_out device, but this device does not exist on my platform. So my last resort is the ALSA library. My questions are:
Could I add the msm_pcm_out device myself and use it just like the sample playwav.c?
If adding the msm_pcm_out device is not possible, could I use the ALSA library from native C?
If neither of the above is possible, is there any other suggestion?
Thank you.
Search results:
MSM_PCM_* devices are specific to the Qualcomm MSM7K chip series, so other platforms will not have them!
It seems that using the ALSA library directly from native C is not possible, because the ALSA device is already claimed by the system.
Update: Using the ALSA API for playback does not work, but recording works! (Strange...)
My last option might be OpenSL ES...
Any suggestion would be appreciated.
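For reference, in case OpenSL ES turns out to be the way to go, playing raw PCM from native code looks roughly like this. This is a condensed sketch with error checking omitted; the 8 kHz mono 16-bit format and pcm_buf/pcm_len are assumptions standing in for the data read from the socket (link with -lOpenSLES):

    #include <SLES/OpenSLES.h>
    #include <SLES/OpenSLES_Android.h>

    /* Play one buffer of raw PCM (8 kHz, mono, 16-bit LE assumed). */
    void play_pcm(const void *pcm_buf, SLuint32 pcm_len) {
        SLObjectItf engineObj, mixObj, playerObj;
        SLEngineItf engine;
        SLPlayItf play;
        SLAndroidSimpleBufferQueueItf queue;

        /* Engine and output mix. */
        slCreateEngine(&engineObj, 0, NULL, 0, NULL, NULL);
        (*engineObj)->Realize(engineObj, SL_BOOLEAN_FALSE);
        (*engineObj)->GetInterface(engineObj, SL_IID_ENGINE, &engine);
        (*engine)->CreateOutputMix(engine, &mixObj, 0, NULL, NULL);
        (*mixObj)->Realize(mixObj, SL_BOOLEAN_FALSE);

        /* Source: a buffer queue fed with raw PCM. */
        SLDataLocator_AndroidSimpleBufferQueue locBq =
            { SL_DATALOCATOR_ANDROIDSIMPLEBUFFERQUEUE, 2 };
        SLDataFormat_PCM fmt = {
            SL_DATAFORMAT_PCM, 1, SL_SAMPLINGRATE_8,
            SL_PCMSAMPLEFORMAT_FIXED_16, SL_PCMSAMPLEFORMAT_FIXED_16,
            SL_SPEAKER_FRONT_CENTER, SL_BYTEORDER_LITTLEENDIAN
        };
        SLDataSource src = { &locBq, &fmt };

        /* Sink: the output mix, i.e. the device's speaker path. */
        SLDataLocator_OutputMix locMix = { SL_DATALOCATOR_OUTPUTMIX, mixObj };
        SLDataSink sink = { &locMix, NULL };

        const SLInterfaceID ids[] = { SL_IID_ANDROIDSIMPLEBUFFERQUEUE };
        const SLboolean req[] = { SL_BOOLEAN_TRUE };
        (*engine)->CreateAudioPlayer(engine, &playerObj, &src, &sink, 1, ids, req);
        (*playerObj)->Realize(playerObj, SL_BOOLEAN_FALSE);
        (*playerObj)->GetInterface(playerObj, SL_IID_PLAY, &play);
        (*playerObj)->GetInterface(playerObj, SL_IID_ANDROIDSIMPLEBUFFERQUEUE, &queue);

        /* Enqueue the data read from the socket and start playback; a real
           player would register a queue callback to keep refilling buffers. */
        (*queue)->Enqueue(queue, pcm_buf, pcm_len);
        (*play)->SetPlayState(play, SL_PLAYSTATE_PLAYING);
    }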
I am working on an Android VoIP application that need not work over the PSTN. I am a complete novice in this field, and any help will be appreciated.
I started by researching how WhatsApp voice calls work and found out that it uses PJSIP, an open-source SIP stack library (source: What's up with WhatsApp and WebRTC? - webrtcHacks). I also found that codecs are used in VoIP to compress and then decompress the voice data.
Even so, I am extremely confused about SIP libraries and codecs. Does an Android VoIP app have to implement a SIP library? Every SIP library supports only a few codecs.
Is there any general way to integrate an arbitrary codec into my Android app, whether it is Opus or Speex or anything like that, independent of the SIP implementation?
Maybe I sound confused, but that is the truth. Even extensive googling on this specific topic did not help me, and my last stop is this community. Any guidance will be appreciated.
Yes, usually every app implements the codecs on its own. Some codecs are available in the Android SDK, but even in those cases a proper in-app implementation is better.
G.711 (PCMU and PCMA) is very simple and can be implemented within a single Java class (or even in a single function if you wish). The others are more complicated, but you can find open-source implementations for almost all of them.
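As a rough illustration of how small G.711 is, here is a sketch of a mu-law (PCMU) encoder for one sample in C; the constant names are my own, and the same logic ports directly to a single Java method:

    /* Sketch of a G.711 mu-law (PCMU) encoder for one 16-bit sample,
       following the standard algorithm. */
    #define ULAW_BIAS 0x84
    #define ULAW_CLIP 32635

    unsigned char linear_to_ulaw(short sample) {
        int sign = (sample >> 8) & 0x80;      /* keep the sign bit */
        if (sign) sample = (short)-sample;    /* work on the magnitude */
        if (sample > ULAW_CLIP) sample = ULAW_CLIP;
        sample += ULAW_BIAS;

        int exponent = 7;                     /* find the segment */
        for (int mask = 0x4000; (sample & mask) == 0 && exponent > 0; mask >>= 1)
            exponent--;

        int mantissa = (sample >> (exponent + 3)) & 0x0F;
        return (unsigned char)~(sign | (exponent << 4) | mantissa);
    }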
Also note that codecs are also implemented within PJSIP, so if you are using this library then you already have the most popular codecs available.
I'm writing a small call recording library for my rooted phone.
I saw in some applications that recording is done through ALSA or CAF on rooted phones.
I couldn't find any example / tutorial on how to use ALSA or CAF for call recording (or even for audio recording for that matter).
I saw the tinyAlsa lib project, but I couldn't figure out how to use it in an Android app.
Can someone please show me some tutorial or code example on how to integrate ALSA or CAF in an Android application?
Update
I managed to wrap tinyAlsa with JNI calls. However, calls like mixer_open(0) return null pointers, and calls like pcm_open(...) return a pointer, but a subsequent call to pcm_is_ready(pcm) always returns false.
Am I doing something wrong? Am I missing something?
Here's how to build the ALSA lib using Android's toolchain.
And here you can find another repo mentioning ALSA for Android.
I suggest you read this post in order to understand what your choices are and the current platform situation.
EDIT after comments:
I think you need to implement your solution with tinyalsa, assuming you were previously using the base ALSA implementation. If the tiny version is missing something then you may need to ask the author (though that sounds strange to me, because you are doing basic operations).
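For reference, a minimal tinyalsa capture path looks roughly like this. The card/device numbers and the config are assumptions; check /proc/asound/pcm on the phone for the real values, and look at pcm_get_error() whenever pcm_is_ready() returns false:

    #include <tinyalsa/asoundlib.h>
    #include <stdio.h>
    #include <stdlib.h>

    /* Minimal tinyalsa capture sketch (card 0, device 0 assumed). */
    int capture_once(void) {
        struct pcm_config config = {
            .channels = 2,
            .rate = 44100,
            .period_size = 1024,
            .period_count = 4,
            .format = PCM_FORMAT_S16_LE,
        };

        struct pcm *pcm = pcm_open(0, 0, PCM_IN, &config);
        if (!pcm || !pcm_is_ready(pcm)) {
            /* pcm_get_error() is the first place to look when
               pcm_is_ready() keeps returning false. */
            fprintf(stderr, "pcm_open failed: %s\n", pcm_get_error(pcm));
            return -1;
        }

        unsigned int size = pcm_frames_to_bytes(pcm, pcm_get_buffer_size(pcm));
        char *buf = malloc(size);
        if (buf && pcm_read(pcm, buf, size) == 0) {
            /* ... write buf to a file, hand it to the JNI layer, etc. ... */
        }

        free(buf);
        pcm_close(pcm);
        return 0;
    }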
After reading this post, we can get some clues about why root is needed (accessing protected mount points).
Keep us updated with your progress, it's an interesting topic!
I have an Android STB and I'd like to know whether there is any library with which I can use the built-in DVB-T tuner.
Thanks in advance!
No, there is no DVB-T library for Android.
However, you can develop your own:
Firstly, is your DVB-T device supported by the kernel? To check, see whether DVB-related debug output appears in dmesg when you boot the box. Also, run ls /dev/dvb* to see what is already there.
If there is no DVB support in the kernel, you will need to add it. First you need access to the kernel source. With that, use the kernel menuconfig to enable the DVB-related modules, plus the specific ones for your tuner; sometimes some remote-control modules are required as well. Then build these modules and insmod them on the box. Sometimes firmware is required too. Then repeat the initial check.
Then you can cross-compile dvb-apps for Android (specifically tzap), or the newer v4l-utils. This gives you C code to tune to DVB-T transponders (see the sketch after this answer). Then write some JNI to access that API from Java, and create an app to perform the tuning.
Finally, you can hand the URI of /dev/dvb0.dvr0 to a third-party video player like VLC. This is a TS stream containing MPEG-2 for SD and H.264 for HD.
As you can see, it is a lot of work, but entirely possible.
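As a rough illustration of the tuning step mentioned above, here is a sketch using the Linux DVB v5 frontend API; the adapter path, the 8 MHz bandwidth, and the function name are assumptions for illustration:

    #include <fcntl.h>
    #include <unistd.h>
    #include <sys/ioctl.h>
    #include <linux/dvb/frontend.h>

    /* Tune the frontend to a DVB-T transponder; returns the fd or -1. */
    int tune_dvbt(unsigned int freq_hz) {
        int fe = open("/dev/dvb/adapter0/frontend0", O_RDWR);
        if (fe < 0)
            return -1;

        struct dtv_property props[] = {
            { .cmd = DTV_DELIVERY_SYSTEM, .u.data = SYS_DVBT },
            { .cmd = DTV_FREQUENCY,       .u.data = freq_hz },
            { .cmd = DTV_BANDWIDTH_HZ,    .u.data = 8000000 },
            { .cmd = DTV_TUNE,            .u.data = 0 },
        };
        struct dtv_properties cmdseq = { .num = 4, .props = props };

        if (ioctl(fe, FE_SET_PROPERTY, &cmdseq) < 0) {
            close(fe);
            return -1;
        }

        /* Poll FE_READ_STATUS until FE_HAS_LOCK before starting the demux;
           the TS then appears on the dvr device for the player. */
        return fe;
    }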
Similar to this question. Since I don't want to send the voice data to a server, that approach may cost me more time. I wonder whether I can use HTK to recognize the voice data locally in the Android application, so that I won't need to send the audio to a server.
This may be the solution, but can anyone give me a more detailed tutorial on how to build HTK with the Android NDK? Thank you!
Maybe PocketSphinx is more suitable for you; see this tutorial: http://cmusphinx.sourceforge.net/wiki/tutorialandroid . To build the JNI API for PocketSphinx yourself, see Building PocketSphinxAndroidDemo (from the CMUSphinx project) for more details.
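For orientation, the native side of that JNI bridge boils down to the PocketSphinx C API. A rough sketch follows; the model paths are placeholders, and the exact signatures vary slightly between PocketSphinx versions (this follows 5prealpha):

    #include <pocketsphinx.h>

    /* Decode a buffer of 16-bit PCM samples and report whether a
       hypothesis was produced. */
    int recognize(const int16 *samples, size_t n_samples) {
        cmd_ln_t *config = cmd_ln_init(NULL, ps_args(), TRUE,
            "-hmm",  "/sdcard/models/en-us",             /* acoustic model */
            "-lm",   "/sdcard/models/en-us.lm.bin",      /* language model */
            "-dict", "/sdcard/models/cmudict-en-us.dict",
            NULL);
        ps_decoder_t *ps = ps_init(config);
        if (ps == NULL)
            return -1;

        int32 score;
        ps_start_utt(ps);
        ps_process_raw(ps, samples, n_samples, FALSE, FALSE);
        ps_end_utt(ps);

        const char *hyp = ps_get_hyp(ps, &score);  /* recognized text, or NULL */

        ps_free(ps);
        return hyp != NULL ? 0 : -1;
    }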
I want to use Android's built-in codecs from my application. For now I just want to use the H.264 codec for testing, unless the MP3 or AAC codecs provide functions for sending audio to the device's speaker, in which case I would prefer one of those.
I have the NDK installed along with Cygwin, GNU Make, and GNU Awk. I can't figure out what I need to do from here, though. I'm downloading the entire OpenCORE tree right now, but I don't even know how to build it or how to make the Eclipse plugin aware that it needs to include the files.
An example or a tutorial would be much appreciated.
EDIT:
It looks like I can use JNI like P/Invoke, which would mean I don't have to build the OpenCORE libraries myself. However, I can't find any documentation on the names of the libraries I need to load.
I'm also confused as to how to do it. I'm looking at http://www.koushikdutta.com/2009/01/jni-in-android-and-foreword-of-why-jni.html and I don't understand what the purpose of writing a library to access a library is. Couldn't you just use something like System.loadLibrary("opencore.so")?
You cannot build OpenCORE separately; it has to be built with the whole source tree. What are you trying to achieve? If you just want to play video/audio, use a VideoView or MediaPlayer object.
Build the Android source and use the headers and the static library from it. This will propel you straight to the zone of unsupported APIs.
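On the "library to access a library" confusion above: System.loadLibrary only loads a .so, and JNI can then only call native functions that follow its Java_<package>_<Class>_<method> naming and signature conventions, which the entry points of an ordinary library do not. Hence the thin bridge library. A hypothetical sketch, where the class, method, and some_codec_decode() are made-up names:

    #include <jni.h>

    /* Hypothetical entry point in the wrapped codec library. */
    extern int some_codec_decode(unsigned char *buf, int len);

    /* JNI bridge: the only kind of function Java can call directly. */
    JNIEXPORT jint JNICALL
    Java_com_example_codec_NativeBridge_decodeFrame(JNIEnv *env, jobject thiz,
                                                    jbyteArray input) {
        jbyte *buf = (*env)->GetByteArrayElements(env, input, NULL);
        jsize len = (*env)->GetArrayLength(env, input);

        int rc = some_codec_decode((unsigned char *)buf, (int)len);

        /* JNI_ABORT: release without copying changes back to the array. */
        (*env)->ReleaseByteArrayElements(env, input, buf, JNI_ABORT);
        return rc;
    }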