I am building a custom audio player with MediaCodec/MediaExtractor/AudioTrack etc. that mixes and plays multiple audio files.
I therefore need a resampling algorithm for the case where one of the files has a different sample rate.
I can see that there is a native AudioResampler class available:
https://android.googlesource.com/platform/frameworks/av/+/jb-mr1.1-release/services/audioflinger/AudioResampler.h
But so far I have not found any examples of how it can be used.
My question:
Is it possible to use the native resampler on Android? (in Java or with JNI)
If so, does anyone know of an example out there, or any docs on how to use this AudioResampler class?
Thanks for any hints!
This is not a public API, so you can't officially rely on using it (and even unofficially, using it would be very hard). You need to find a library (ideally in C, for the NDK) to bundle with your app.
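If pulling in a full library is overkill, a basic linear-interpolation resampler is also easy to write yourself; the quality is below a proper polyphase resampler (there is no anti-aliasing filter, so downsampling will alias), but it is often good enough for casual mixing. A minimal sketch in Java, assuming 16-bit mono PCM in a short[]:

```java
/**
 * Resamples 16-bit mono PCM from srcRate to dstRate using linear
 * interpolation between the two nearest source samples.
 */
public static short[] resampleLinear(short[] src, int srcRate, int dstRate) {
    if (srcRate == dstRate) return src;
    int dstLength = (int) ((long) src.length * dstRate / srcRate);
    short[] dst = new short[dstLength];
    double step = (double) srcRate / dstRate; // source samples per output sample
    for (int i = 0; i < dstLength; i++) {
        double pos = i * step;
        int idx = (int) pos;                  // sample just before the target position
        double frac = pos - idx;              // fractional distance into the gap [0, 1)
        int next = Math.min(idx + 1, src.length - 1);
        dst[i] = (short) (src[idx] * (1.0 - frac) + src[next] * frac);
    }
    return dst;
}
```

For stereo you would run this once per channel, or adapt the indexing to step over interleaved frames.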
I'm trying to implement an audio-jack data interface using AFSK and a microcontroller.
Through searches I've seen a couple of implementations that use iPhones, such as this:
http://www.creativedistraction.com/demos/sensor-data-to-iphone-through-the-headphone-jack-using-arduino/comment-page-1/#comment-243826
There they used Perceptive Development's SerialModem for iPhone, although that seems to contain just a hex file and a circuit schematic.
I haven't been able to find anything by searching for "AFSK Android library", "FSK android library" or various other combinations of that. Does anyone know of a good source for these kinds of tools for Android?
Alternatively, is there a library that implements the simplified FFT you could use to demodulate the data? Naturally you don't want to do a full FFT, because you're just trying to distinguish between the two AFSK tones.
(Ideas drawn from here: http://labs.perceptdev.com/how-to-talk-to-tin-can/, but I'm sure there's something similar out there for Android.)
I looked into spandsp, http://www.soft-switch.org/, while searching for more general DSP libraries. I'm not sure whether those can be used on Android, though.
Thanks for your help
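For reference, the "simplified FFT" usually meant in this context is the Goertzel algorithm, which measures the signal energy at a single target frequency; for binary AFSK you run it once per tone and compare the two energies. A minimal Java sketch (the 1200/2200 Hz tone pair and 44100 Hz sample rate in the usage line are just illustrative Bell 202 values):

```java
/**
 * Goertzel algorithm: relative energy of targetFreq (Hz) within one block
 * of PCM samples. To demodulate binary AFSK, evaluate it for the mark and
 * space tones and pick whichever energy is larger.
 */
public static double goertzelPower(short[] samples, double targetFreq, double sampleRate) {
    int n = samples.length;
    int k = (int) Math.round(n * targetFreq / sampleRate); // nearest DFT bin
    double coeff = 2.0 * Math.cos(2.0 * Math.PI * k / n);
    double s1 = 0, s2 = 0;
    for (int i = 0; i < n; i++) {
        double s0 = samples[i] + coeff * s1 - s2;
        s2 = s1;
        s1 = s0;
    }
    return s1 * s1 + s2 * s2 - coeff * s1 * s2; // squared magnitude at the bin
}

// e.g. boolean bit = goertzelPower(block, 1200, 44100) > goertzelPower(block, 2200, 44100);
```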
Since Android API 12, RTP has been supported in the SDK, which includes RtpStream as the base class, plus AudioStream, AudioCodec, and AudioGroup. However, there is no documentation, example, or tutorial to help me use these specific APIs to take input from the device's microphone and output it to an RTP stream.
Where do I specify using the mic as the source, and not to use a speaker? Does it perform any RTCP? Can I extend the RtpStream base class to create my own VideoStream class (ideally I would like to use these for video streaming too)?
Any help out there on these new(ish) APIs please?
Unfortunately these APIs are the thinnest necessary wrapper around the native code that performs the actual work. This means they cannot be extended in Java, and to extend them in C++ you would, I believe, need a custom Android build.
As far as I can see, AudioGroup cannot actually be set to not output sound.
I don't believe it performs RTCP, but my use of it doesn't involve RTCP, so I can't say for sure.
My advice is that if you want to extend the functionality or have greater flexibility, you should find a C or C++ native library that someone has written or ported to Android and use that instead. That should let you control which audio source is used and add video streaming and similar extensions.
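For what it's worth, the basic (unextended) setup pattern with these classes looks like the sketch below, assuming API 12+, an Activity context, and the INTERNET, RECORD_AUDIO and MODIFY_AUDIO_SETTINGS permissions; the IP addresses and port are placeholders. Note that the microphone is never specified explicitly: AudioGroup captures from it whenever its mode isn't MODE_MUTED, and playback goes to the voice-call stream configured through AudioManager.

```java
// A minimal sketch; call from an Activity. Addresses/port are placeholders.
void startVoiceStream() throws Exception {
    AudioManager audio = (AudioManager) getSystemService(Context.AUDIO_SERVICE);
    audio.setMode(AudioManager.MODE_IN_COMMUNICATION); // route to the voice-call stream

    AudioStream stream = new AudioStream(InetAddress.getByName("192.168.1.10")); // local IP
    stream.setCodec(AudioCodec.PCMU);
    stream.setMode(RtpStream.MODE_NORMAL);                         // send and receive
    stream.associate(InetAddress.getByName("192.168.1.20"), 5004); // remote peer

    AudioGroup group = new AudioGroup();
    group.setMode(AudioGroup.MODE_NORMAL); // MODE_MUTED would stop mic capture
    stream.join(group);
    // Teardown later: stream.join(null); group.clear(); stream.release();
}
```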
As the standard Android architecture diagram shows, the Android platform is built from several layers:
Applications are developed in Java.
The Application Framework is written in Java (as far as I understand).
The Libraries are written in C/C++.
For some insane reason I have to deal with devices like the accelerometer, compass, and camera using C/C++, which means accessing them directly in the third layer, i.e. the Libraries. As I understand it, the Application Framework itself consumes the Libraries to access these devices and then provides APIs to Applications.
I am looking for any documentation/tutorials/demos that can help me with this, i.e. how to access and use devices like the camera, accelerometer, and compass from C/C++ code, or in other words how to work with these devices directly from the Libraries layer.
My last resort would be to get the Android source code and dig deep into it to find what I am looking for, but I would prefer an easier route in the form of documentation/a demo/a tutorial/anything that makes this a bit easier.
I am looking for any documentation/tutorials/demos that can help me with this, i.e. how to access and use devices like the camera, accelerometer, and compass from C/C++ code, or in other words how to work with these devices directly from the Libraries layer.
You don't. You access them from Java code. Reorganize your C/C++ code to support your Java code.
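To illustrate the Java-side route this answer recommends, a minimal accelerometer listener looks roughly like the sketch below; the readings can then be handed to C/C++ through a JNI call if the processing must stay native. (Newer NDK releases also expose a native sensor API in android/sensor.h from API 9 on, if Java really is not an option.)

```java
import android.app.Activity;
import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.os.Bundle;

public class AccelActivity extends Activity implements SensorEventListener {
    private SensorManager sensorManager;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        sensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
    }

    @Override
    protected void onResume() {
        super.onResume();
        Sensor accel = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
        sensorManager.registerListener(this, accel, SensorManager.SENSOR_DELAY_GAME);
    }

    @Override
    protected void onPause() {
        super.onPause();
        sensorManager.unregisterListener(this); // stop listening to save battery
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        // x/y/z in m/s^2; hand these off to C/C++ via a native method if needed
        float x = event.values[0], y = event.values[1], z = event.values[2];
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}
```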
For the camera, you can use OpenCV to access the frames from a C++ library. For the accelerometer, I'm still looking for a way to access it from C++.
I want to create a simple equalizer for Android. How can I do it?
I tried to find suitable methods in the MediaPlayer class, but all my attempts failed.
Android has a built-in equalizer engine, though it isn't located in the MediaPlayer class, because it's a class of its own, located in the android.media.audiofx package:
http://developer.android.com/reference/android/media/audiofx/Equalizer.html
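A minimal sketch of using it, assuming API 9+ and attaching to a MediaPlayer's audio session (R.raw.song is a placeholder resource; the band boost is just an example):

```java
MediaPlayer player = MediaPlayer.create(this, R.raw.song); // placeholder audio resource
Equalizer eq = new Equalizer(0, player.getAudioSessionId()); // priority 0
eq.setEnabled(true);

short bands = eq.getNumberOfBands();
short minLevel = eq.getBandLevelRange()[0]; // in millibels
short maxLevel = eq.getBandLevelRange()[1];

// Example: boost the lowest frequency band to its maximum level
eq.setBandLevel((short) 0, maxLevel);

player.start();
// When done: eq.release(); player.release();
```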
Simple answer... you can't do it with the framework or with Java (because there is no JMF support in Android). You have to use the NDK and JNI to compile a native library with equalizer support. If you know C/C++ there are plenty of libraries around that provide this functionality, but if you don't know C/C++ or have the means to pay someone who does, I would recommend you move on to something else within your means...
There are even some working examples for Android, if you look around, that use libmpg123... but libmpg123 only provides an equalizer interface for MP3s. I found it to be pretty buggy in general, and it compromised the stability of the app to the point that it would lock up Android and I would have to pull the battery to reboot the phone. In addition, there was a lot of audio clipping even with the equalizer flatlined. That is my experience...
I hope the link below is useful for you:
https://developer.android.com/resources/samples/ApiDemos/src/com/example/android/apis/media/AudioFxDemo.html
I want to use the codecs in Android from my application. For now I just want to use the H.264 codec for testing, unless the MP3 or AAC codecs provide functions for sending audio to the device's speaker, in which case I would prefer one of those.
I have the NDK installed along with Cygwin, GNU Make, and GNU Awk, but I can't figure out what to do from there. I'm downloading the entire OpenCORE tree right now, but I don't even know how to build it or how to make the Eclipse plugin aware that it needs to include the files.
An example or a tutorial would be much appreciated.
EDIT:
It looks like I can use JNI like P/Invoke, which would mean I don't have to build the OpenCORE libraries myself. However, I can't find any documentation on the names of the libraries I need to load.
I'm also confused about how to do it. I'm looking at http://www.koushikdutta.com/2009/01/jni-in-android-and-foreword-of-why-jni.html and I don't understand the purpose of writing a library to access a library. Couldn't you just use something like System.loadLibrary("opencore.so")?
You cannot build OpenCORE separately; it has to be built with the whole source tree. What are you trying to achieve? If you just want to play video/audio, use a VideoView or a MediaPlayer object.
Build the Android source and use the headers and the static libraries from it. This will propel you straight into the zone of unsupported APIs.
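On the System.loadLibrary point from the question: you pass the undecorated name ("opencore", not "opencore.so"), and loading a library is not enough by itself, because Java can only call native functions exported with JNI-style names and signatures, which an ordinary C library such as OpenCORE does not have. That is why people write a thin wrapper library: it exposes JNI entry points and internally calls into the real library. A hypothetical Java-side sketch (all names are made up):

```java
public class DecoderBridge {
    static {
        // Loads libdecoder_wrapper.so from the APK's native lib directory.
        // Note: no "lib" prefix and no ".so" suffix in the name passed here.
        System.loadLibrary("decoder_wrapper");
    }

    // Implemented in the wrapper .so as a JNI export (here that would be
    // Java_DecoderBridge_decodeFrame), which in turn calls the underlying
    // codec library. Hypothetical signature for illustration only.
    public static native byte[] decodeFrame(byte[] encoded);
}
```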