Audio capturing/rendering in native code on Android

I am looking to clear up my confusion about how to capture and render audio using native code on the Android platform. What I've heard is that there's an API for audio called OpenSL. Are there any recommended guides or tutorials on how to use it?
Also, are there any good audio wrappers for OpenSL, such as an OpenAL wrapper or something? I've developed the audio part with OpenAL on other platforms, so it would be nice to re-use that code.
Are there limitations to OpenSL, like something that has to be done in Java code?
How much does OpenSL differ from OpenAL?
Thanks!

There's a native audio example included in the samples/ directory of recent NDK releases. It claims to use OpenSL ES.

OpenSL and OpenAL differ quite a bit in terms of interfaces. However, they follow a very similar pattern and the use cases are similar too. One thing to be aware of is that, in the current implementation, OpenSL suffers from the same latency issues as the Java audio APIs.
When using OpenSL you don't have to call any Java code. The latest NDK has support for a native asset manager, so no more going through JNI to pass byte arrays around :)
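For orientation, the basic OpenSL ES pattern (objects you Realize, then query for interfaces) looks roughly like this; a minimal sketch along the lines of the NDK's native-audio sample, with all SLresult checks omitted:

```c
// Minimal OpenSL ES bring-up: create the engine and an output mix.
// Sketch only -- every call returns an SLresult that real code must check.
#include <SLES/OpenSLES.h>

SLObjectItf engineObject = NULL;
SLEngineItf engineEngine = NULL;
SLObjectItf outputMixObject = NULL;

void createEngine(void) {
    // Create and realize the engine object, then get its engine interface.
    slCreateEngine(&engineObject, 0, NULL, 0, NULL, NULL);
    (*engineObject)->Realize(engineObject, SL_BOOLEAN_FALSE);
    (*engineObject)->GetInterface(engineObject, SL_IID_ENGINE, &engineEngine);

    // Create and realize the output mix (the sink that audio players attach to).
    (*engineEngine)->CreateOutputMix(engineEngine, &outputMixObject, 0, NULL, NULL);
    (*outputMixObject)->Realize(outputMixObject, SL_BOOLEAN_FALSE);
}
```

Players and recorders are then created from engineEngine in the same object/interface style, which is the main way OpenSL's API differs from OpenAL's source/buffer model.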

Related

Any good ways to do MediaMuxer on Android?

I am trying to build a video system on Android. I am using the sample provided by Qualcomm, which allows me to use OpenMAX and do hardware acceleration on Qualcomm devices.
However, this sample only generates a .h264 file, so I am looking for a good way to do the muxing. I've used MediaMuxer before, but it requires Android 4.3 or later, so it doesn't work with this sample (the Qualcomm sample only supports Android 4.2 and earlier).
Does anyone have any ideas? Thank you!
You can use FFmpeg: build FFmpeg for Android, create a JNI wrapper, and expose the muxing functionality to the Java level.
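To give a rough idea of that route, the muxing itself can be done with libavformat along these lines. This is a hedged sketch with placeholder file names; a real implementation must check every return value and generate packet timestamps, since a raw .h264 elementary stream carries none:

```c
// Rough sketch: remux an H.264 elementary stream into MP4 with libavformat.
#include <libavformat/avformat.h>

int mux_h264_to_mp4(const char *in_h264, const char *out_mp4) {
    AVFormatContext *in_ctx = NULL, *out_ctx = NULL;
    AVPacket pkt;

    avformat_open_input(&in_ctx, in_h264, NULL, NULL);
    avformat_find_stream_info(in_ctx, NULL);

    avformat_alloc_output_context2(&out_ctx, NULL, NULL, out_mp4);
    AVStream *out_st = avformat_new_stream(out_ctx, NULL);
    avcodec_parameters_copy(out_st->codecpar, in_ctx->streams[0]->codecpar);

    avio_open(&out_ctx->pb, out_mp4, AVIO_FLAG_WRITE);
    avformat_write_header(out_ctx, NULL);

    while (av_read_frame(in_ctx, &pkt) >= 0) {
        // For a raw .h264 input pkt.pts/pkt.dts are unset; real code must
        // synthesize them (e.g. from the frame rate) before writing.
        pkt.stream_index = 0;               // single video stream
        av_interleaved_write_frame(out_ctx, &pkt);
        av_packet_unref(&pkt);
    }

    av_write_trailer(out_ctx);
    avio_closep(&out_ctx->pb);
    avformat_close_input(&in_ctx);
    avformat_free_context(out_ctx);
    return 0;
}
```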

Decode H.264(AVC) bitstream on Android?

I am writing an app which needs to decode an H.264 (AVC) bitstream. I found that AVC codec sources exist in /frameworks/base/media/libstagefright/codecs/avc; does anyone know how one can get access to those codecs from an Android app? I guess it's through JNI, but I'm not clear on how this can be done.
After some investigation, I think one approach is to create my own classes and JNI interfaces in the Android source to enable using the codecs in an Android app.
Another way, which does not require any changes to the Android source, is to include the codecs as a shared library in my application using the NDK. Any thoughts on these? Which way is better (if feasible)?
I didn't find much information about Stagefright; it would be great if anyone could point some out. I am developing on Android 2.3.3.
Any comments are highly appreciated. Thanks!
Stagefright does not support elementary H.264 decoding. However, it does have an H.264 decoder component. In theory, this library could be used, but in reality it will be tough to use as a standalone library because of its dependencies.
The best approach would be to use a JNI-wrapped independent H.264 decoder (like the one available with FFmpeg).
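As a rough illustration of that approach (assuming a JNI-wrapped FFmpeg build and a recent libavcodec; this is only a sketch of the decoding side, with error handling and the JNI plumbing left out):

```c
// Rough sketch: decode H.264 packets to raw frames with libavcodec.
// Packet data would come from your bitstream reader or the JNI layer;
// the AVFrame must be allocated with av_frame_alloc() by the caller.
#include <libavcodec/avcodec.h>

AVCodecContext *open_h264_decoder(void) {
    const AVCodec *codec = avcodec_find_decoder(AV_CODEC_ID_H264);
    AVCodecContext *ctx = avcodec_alloc_context3(codec);
    avcodec_open2(ctx, codec, NULL);
    return ctx;
}

// Feed one compressed packet and collect any decoded frames.
void decode_packet(AVCodecContext *ctx, AVPacket *pkt, AVFrame *frame) {
    avcodec_send_packet(ctx, pkt);
    while (avcodec_receive_frame(ctx, frame) == 0) {
        // frame->data[0..2] now hold the YUV planes (frame->width x frame->height);
        // hand them to your renderer or copy them back through JNI.
    }
}
```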

Is the native AAC decoder available when using Android NDK?

I'm looking for a way to decode AAC natively to PCM on Android. The decoder source code is at https://android.googlesource.com/platform/external/opencore/+/master/codecs_v2/audio/aac/dec, but I'm not familiar with NDK at all.
1) There's no way of doing this directly using the Android SDK, but can this be done via the NDK?
2) I would especially be interested in a simple way of accessing the decoder from the SDK, with a short "bridge" through the NDK. Is this feasible?
3) Would such a solution work on all Android versions (1.5-2.2)?
4) I guess I could use http://code.google.com/p/aacplayer-android/ instead, but it looks like this implementation is fairly CPU intensive. Does anyone have experience with this?
Not sure what the policy is here for answering really old questions, but what is working well for me is using OpenSL with the NDK; it comes built in, and in fact the NDK comes with an example, "native-audio", that demonstrates what you need.
One thing you might look into is the FFmpeg stuff; it is GPL, and TuneIn Radio posted their mods here: http://radiotime.com/mobile/android#/support/open-source
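If playback (rather than raw PCM) is what you ultimately need, the native-audio sample's URI player lets the platform do the AAC decoding for you. A compressed sketch, assuming the engine and output mix from that sample already exist, with a made-up file path and all SLresult checks omitted:

```c
// Rough sketch: let OpenSL ES decode and play an AAC file by URI (Android 2.3+).
#include <SLES/OpenSLES.h>
#include <SLES/OpenSLES_Android.h>

static SLObjectItf uriPlayerObject;
static SLPlayItf uriPlayerPlay;

void play_aac_by_uri(SLEngineItf engineEngine, SLObjectItf outputMixObject) {
    SLDataLocator_URI loc_uri = { SL_DATALOCATOR_URI,
                                  (SLchar *)"file:///sdcard/clip.aac" };  // placeholder path
    SLDataFormat_MIME format_mime = { SL_DATAFORMAT_MIME, NULL,
                                      SL_CONTAINERTYPE_UNSPECIFIED };
    SLDataSource audioSrc = { &loc_uri, &format_mime };

    SLDataLocator_OutputMix loc_outmix = { SL_DATALOCATOR_OUTPUTMIX, outputMixObject };
    SLDataSink audioSnk = { &loc_outmix, NULL };

    // The platform picks the decoder from the content; no extra interfaces requested.
    (*engineEngine)->CreateAudioPlayer(engineEngine, &uriPlayerObject,
                                       &audioSrc, &audioSnk, 0, NULL, NULL);
    (*uriPlayerObject)->Realize(uriPlayerObject, SL_BOOLEAN_FALSE);
    (*uriPlayerObject)->GetInterface(uriPlayerObject, SL_IID_PLAY, &uriPlayerPlay);
    (*uriPlayerPlay)->SetPlayState(uriPlayerPlay, SL_PLAYSTATE_PLAYING);
}
```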

Audio Record using Android NDK

I am trying to record audio using the Android NDK. People say I can use "frameworks/base/media/libmedia/AudioRecord.cpp", but that lives in the platform source, not the NDK. How can I access and use it?
The C++ libmedia library is not part of the public API. Some people use it, but this is highly discouraged because it might break on some devices and/or in a future Android release.
I developed an audio recording app, and trust me, audio support is very inconsistent across devices; it's very tricky, so IMO using libmedia directly is a bad idea.
The only way to capture raw audio with the public API is to use the Java AudioRecord class. It will give you PCM data, which you can then pass to your optimized C routines.
Alternatively, although that's a bit harder, you could write a C/C++ wrapper around the Java AudioRecord class, as it is possible to instantiate Java objects and call methods through JNI.
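As a rough sketch of that wrapper idea: the numeric constants below mirror MediaRecorder.AudioSource.MIC, CHANNEL_IN_MONO and ENCODING_PCM_16BIT, the parameters are hard-coded, error checks are omitted, and the JNIEnv is assumed to be attached to the current thread. Real code should also use AudioRecord.getMinBufferSize() and check getState().

```c
// Rough sketch: driving android.media.AudioRecord from native code via JNI.
#include <jni.h>

void record_some_pcm(JNIEnv *env) {
    jclass cls = (*env)->FindClass(env, "android/media/AudioRecord");
    jmethodID ctor  = (*env)->GetMethodID(env, cls, "<init>", "(IIIII)V");
    jmethodID start = (*env)->GetMethodID(env, cls, "startRecording", "()V");
    jmethodID read  = (*env)->GetMethodID(env, cls, "read", "([BII)I");

    // 1 = MediaRecorder.AudioSource.MIC, 16 = CHANNEL_IN_MONO, 2 = ENCODING_PCM_16BIT
    jint bufSize = 8192;
    jobject recorder = (*env)->NewObject(env, cls, ctor,
                                         1, 44100, 16, 2, bufSize);

    (*env)->CallVoidMethod(env, recorder, start);

    jbyteArray buf = (*env)->NewByteArray(env, bufSize);
    jint n = (*env)->CallIntMethod(env, recorder, read, buf, 0, bufSize);

    // Copy the PCM bytes into native memory for your DSP routines.
    jbyte *pcm = (*env)->GetByteArrayElements(env, buf, NULL);
    /* ... process n bytes of 16-bit mono PCM ... */
    (*env)->ReleaseByteArrayElements(env, buf, pcm, JNI_ABORT);
}
```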
Maybe a little bit outdated, but:
The safest way of playing/recording audio in native code is by using the OpenSL ES interfaces.
Nevertheless, OpenSL ES is available only on Android 2.3+ and, for now, works on top of the generic AudioFlinger API.
The more robust and simple way is to use the platform sources to get the AudioFlinger headers, plus a generic libmedia.so to link against at build time.
The device-specific libmedia.so should be preloaded at application initialization for AudioFlinger to work normally (generally this happens automatically). Note that some vendors change AudioFlinger internals (for unclear reasons), so you may encounter memory or behavior issues.
In my experience AudioFlinger worked on all (2.0+) devices, but it sometimes required allocating more memory for the object than the default implementation assumed.
Finally, OpenSL ES is a wrapper with a dynamically loadable C interface, which lets it work with whatever AudioFlinger implementation sits underneath. It is fairly complicated for simple use cases and may have even more overhead than the Java AudioTrack/AudioRecord classes because of its internal threading, buffering, etc.
So consider using Java or the not-so-safe native AudioFlinger until Google implements some high-performance audio interface (which is doubtful for now).
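For reference, the OpenSL ES recording path mentioned above looks roughly like this on 2.3+. This is a compressed sketch following the NDK native-audio sample; the RECORD_AUDIO permission, the buffer-queue callback, and all SLresult checks are left out:

```c
// Rough sketch: OpenSL ES audio recorder into a buffer queue (Android 2.3+).
// Assumes an engine interface (engineEngine) has already been created.
#include <SLES/OpenSLES.h>
#include <SLES/OpenSLES_Android.h>

static SLObjectItf recorderObject;
static SLRecordItf recorderRecord;
static SLAndroidSimpleBufferQueueItf recorderQueue;

void create_recorder(SLEngineItf engineEngine) {
    SLDataLocator_IODevice loc_dev =
        { SL_DATALOCATOR_IODEVICE, SL_IODEVICE_AUDIOINPUT,
          SL_DEFAULTDEVICEID_AUDIOINPUT, NULL };
    SLDataSource audioSrc = { &loc_dev, NULL };

    SLDataLocator_AndroidSimpleBufferQueue loc_bq =
        { SL_DATALOCATOR_ANDROIDSIMPLEBUFFERQUEUE, 2 };
    SLDataFormat_PCM format_pcm =
        { SL_DATAFORMAT_PCM, 1, SL_SAMPLINGRATE_16,
          SL_PCMSAMPLEFORMAT_FIXED_16, SL_PCMSAMPLEFORMAT_FIXED_16,
          SL_SPEAKER_FRONT_CENTER, SL_BYTEORDER_LITTLEENDIAN };
    SLDataSink audioSnk = { &loc_bq, &format_pcm };

    const SLInterfaceID ids[] = { SL_IID_ANDROIDSIMPLEBUFFERQUEUE };
    const SLboolean req[] = { SL_BOOLEAN_TRUE };

    (*engineEngine)->CreateAudioRecorder(engineEngine, &recorderObject,
                                         &audioSrc, &audioSnk, 1, ids, req);
    (*recorderObject)->Realize(recorderObject, SL_BOOLEAN_FALSE);
    (*recorderObject)->GetInterface(recorderObject, SL_IID_RECORD, &recorderRecord);
    (*recorderObject)->GetInterface(recorderObject, SL_IID_ANDROIDSIMPLEBUFFERQUEUE,
                                    &recorderQueue);

    // Enqueue one or more PCM buffers before starting, then:
    (*recorderRecord)->SetRecordState(recorderRecord, SL_RECORDSTATE_RECORDING);
}
```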
The OpenSL ES API is available from Android 2.3 onward (API level 9).

How do I use Android OpenCORE codecs using JNI?

I want to use the codecs in Android from my application. For now I just want to use the H.264 codec for testing, unless the MP3 or AAC codecs provide functions for sending the audio to the device's speaker, in which case I would prefer one of those.
I have the NDK installed along with Cygwin, GNU Make, and GNU Awk. I can't figure out what I need to do from here, though. I'm downloading the entire OpenCORE tree right now, but I don't even know how to build it or how to make the Eclipse plugin know it needs to include the files.
An example or a tutorial would be much appreciated.
EDIT:
It looks like I can use JNI like P/Invoke, which would mean I don't have to build the OpenCORE libraries myself. However, I can't find any documentation on the names of the libraries I need to load.
I'm also confused about how to do it. I'm looking at http://www.koushikdutta.com/2009/01/jni-in-android-and-foreword-of-why-jni.html and I don't understand what the purpose of writing a library to access a library is. Couldn't you just use something like System.loadLibrary("opencore.so")?
You cannot build OpenCORE separately; it has to be built with the whole source tree. What are you trying to achieve? If you just want to play video/audio, use a VideoView or MediaPlayer object.
Build the Android source and use the headers and the static library from it. This will propel you straight to the zone of unsupported APIs.
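On the System.loadLibrary question from the edit: loading a .so only maps it into your process; Java can only call functions that follow the JNI naming/registration convention, which OpenCORE's internal entry points do not. That is why you write a small wrapper library that links against the library you care about and exposes JNI entry points, roughly like this (hypothetical names throughout):

```c
// Rough sketch of a JNI wrapper library, built with the NDK into libmywrapper.so.
// It links against the native library you want to use and exposes a Java-callable
// entry point; all names here are made up for illustration.
#include <jni.h>

// Imagine this symbol comes from the library you are wrapping.
extern int some_decoder_init(void);

// Matches a hypothetical Java method:  package com.example;  class Codec { native int init(); }
JNIEXPORT jint JNICALL
Java_com_example_Codec_init(JNIEnv *env, jobject thiz) {
    return some_decoder_init();
}
```

The Java side would then call System.loadLibrary("mywrapper") and declare the matching native method.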
