Android call recording using ALSA / CAF

I'm writing a small call recording library for my rooted phone.
I saw in some applications that recording is done through ALSA or CAF on rooted phones.
I couldn't find any example or tutorial on how to use ALSA or CAF for call recording (or even for plain audio recording, for that matter).
I saw the tinyalsa lib project, but I couldn't figure out how to use it in an Android app.
Can someone please show me a tutorial or code example on how to integrate ALSA or CAF into an Android application?
Update
I managed to wrap tinyalsa with JNI calls. However, calls like mixer_open(0) return null pointers, and calls like pcm_open(...) return a pointer, but a subsequent call to pcm_is_ready(pcm) always returns false.
Am I doing something wrong? Am I missing something?

Here's how to build the ALSA lib using the Android toolchain.
And here you can find another repo mentioning ALSA for Android.
I suggest you read this post in order to understand your choices and the current platform situation.
EDIT after comments:
I think you need to implement your solution with tinyalsa, assuming your device uses the base ALSA implementation. If the tiny version is missing something, then you may need to ask the author (but that sounds strange to me, because you are doing basic operations).
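On the app side, the JNI bridge can stay very thin. A minimal sketch (the class name, library name, and native signatures are hypothetical; each native would just forward to tinyalsa's pcm_open()/pcm_read()/pcm_close()):

// Hypothetical Java-side bridge; libtinyalsa_jni is assumed to be your own
// NDK library that forwards these calls to tinyalsa.
public final class TinyAlsaBridge {
    static {
        System.loadLibrary("tinyalsa_jni"); // assumed wrapper .so
    }

    // Returns an opaque struct pcm* as a long, 0 on failure.
    public static native long pcmOpen(int card, int device, int flags);
    public static native boolean pcmIsReady(long pcm);
    public static native int pcmRead(long pcm, byte[] buffer, int count);
    public static native void pcmClose(long pcm);
}

About your update: if pcm_is_ready() keeps returning false, it usually means the card/device pair or the pcm_config (channels, rate, format, period size/count) doesn't match what the hardware accepts; cat /proc/asound/pcm shows the valid card/device numbers. A NULL from mixer_open(0) typically means /dev/snd/controlC0 couldn't be opened, which is usually a permissions problem even on rooted devices.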
After reading this post, we can get some clues about why root is needed (accessing protected mount points).
Keep us updated with your progress, it's an interesting topic!

Related

Is there any DVB-T library for Android?

I have an Android STB and I'd like to know if there is any library with which I can use the built-in DVB-T tuner.
Thanks in advance!
No, there is no DVB-T library for Android.
However, you can develop your own:
Firstly, is your DVB-T device supported by the kernel? To check, see if DVB-related debug output appears in dmesg when you boot up the box. Also, run ls /dev/dvb* to see what is already there.
If there is no DVB support in the kernel, you will need to add it. First you need access to the kernel source. Using this, modify the kernel menuconfig to add the DVB-related modules, plus the specific ones for your tuner (sometimes some remote-control modules are required as well). Then build these modules and insmod them on the box. Sometimes firmware is required too. Then repeat the initial check.
Then you can cross-compile dvb-apps for Android (specifically tzap), or the newer v4l-utils for Android. This gives you C code to tune to DVB-T transponders. Then write some JNI to access the API from Java, and create an app to perform tuning.
Finally, you can send a URI for /dev/dvb0.dvr0 to a third-party video player like VLC (see the sketch below). This is a TS stream containing MPEG-2 for SD and H.264 for HD.
As you can see, it is a lot of work, but entirely possible.
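To give an idea of that last hand-off step, here is a sketch (the device path, MIME type, and the presence of a player like VLC are assumptions that depend on your kernel build and setup):

import android.content.Intent;
import android.net.Uri;

// Inside an Activity, once the tuner is locked: hand the demuxed TS
// device node to an external player via an ACTION_VIEW intent.
Intent intent = new Intent(Intent.ACTION_VIEW);
intent.setDataAndType(Uri.parse("file:///dev/dvb0.dvr0"), "video/mp2t");
startActivity(intent);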

Android, msm_pcm_out device and ALSA lib

Could anyone tell me if it is possible to use the ALSA lib directly in native C code on Android?
Because I receive raw data over a socket fd in native C code, I think it is better to play it out in native C, too. I have searched a lot and found that I could play PCM data directly by using the msm_pcm_out device, but I could not find this device on my platform. So my last choice is using the ALSA lib. My questions are:
Could I add the msm_pcm_out device by myself, and use it just like the sample playwav.c?
If adding the msm_pcm_out device is not possible, could I use the ALSA lib in native C?
If everything I mentioned above is impossible, any suggestions?
Thank you.
Search result:
MSM_PCM_* devices are specific to Qualcomm's MSM7K-series chips, so other platforms will not have them!
It seems that using the ALSA lib directly in native C is not possible, because the ALSA resource is already engaged by the system.
Update: using the ALSA API for playback does not work, but recording works! (Strange...)
My last choice might be using OpenSL ES...
Any suggestion would be appreciated.
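If OpenSL ES turns out to be too heavy, another option worth trying is to push the raw PCM up through JNI and play it from Java with AudioTrack in streaming mode. A minimal sketch, assuming 16-bit mono at 44.1 kHz and a hypothetical readFromNativeSocket() JNI call into your C code:

import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;

// Streams raw 16-bit mono PCM arriving from native code.
void playPcmStream() {
    int sampleRate = 44100;
    int minBuf = AudioTrack.getMinBufferSize(sampleRate,
            AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);
    AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate,
            AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
            minBuf * 4, AudioTrack.MODE_STREAM);
    track.play();

    byte[] chunk = new byte[minBuf];
    int n;
    while ((n = readFromNativeSocket(chunk)) > 0) {
        track.write(chunk, 0, n); // blocks until the buffer is queued
    }
    track.stop();
    track.release();
}

native int readFromNativeSocket(byte[] dest); // your hypothetical JNI bridge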

cvCreateVideoWriter (OpenCV 2.2 + FFMPEG)

I'm working with OpenCV 2.2 for Android under Windows, and I am facing a problem when using cvCreateVideoWriter: it always returns NULL. I'm guessing it has something to do with the FFMPEG library not being properly built. The thing is that I followed the instructions in http://opencv.willowgarage.com/wiki/Android2.2, and since FFMPEG is included as a 3rd-party library (at least I can see the source within the whole OpenCV package) I thought I didn't have to do anything extra to get this library installed. I might be wrong. How do I check whether the library was correctly built (or built at all)? Do I need to make any changes to the default make files?
Any help is much appreciated.
Thanks!
There are two important things to consider when using cvCreateVideoWriter():
Your application needs rights to create files and be able to write on them. Make sure you have setup the necessary directory permissions for it to do so.
The 2nd argument of the function is the FOURCC code of the codec used to compress the frames. For instance, CV_FOURCC('P','I','M','1') is the MPEG-1 codec, and CV_FOURCC('M','J','P','G') defines Motion-JPEG.
A typical call may look like this:
CvVideoWriter *writer = cvCreateVideoWriter("video.avi", CV_FOURCC('M','J','P','G'), fps, size, 0);
if (!writer)
{
// handle error
}
// ... write frames with cvWriteFrame(writer, frame), then release:
cvReleaseVideoWriter(&writer);
I suggest calling cvCreateVideoWriter with different codecs. It may be that your platform doesn't support the one you are using right now.
I don't know if the default build for Android enables the flag HAVE_FFMPEG, but you need to have FFMPEG installed, and it's best to make sure this flag is enabled when compiling OpenCV.

How to synthesize sounds of instruments on Android (Piano, Drums, Guitar, etc...)

Can somebody give me some direction on how to synthesize the sounds of instruments (piano, drums, guitar, etc.)?
I am not even sure what to look for.
Thanks
Not sure if this is still the case, but Android seems to have latency issues that prevent it from doing true sound synthesis. NanoStudio, in my opinion, is the best audio app on iOS, and the author so far refuses to make an Android version because the framework isn't there yet.
See these links:
http://www.google.com/search?sourceid=chrome&ie=UTF-8&q=nanostudio+android#hl=en&q=+site:forums.blipinteractive.co.uk+nanostudio+android&bav=on.2,or.r_gc.r_pw.&fp=ee1cd411508a9e34&biw=1194&bih=939
It all depends on what kind of application you're making; if it's going to be an Akai APC firing off sounds, you could be alright. If you're after true synthesis (crafting waveforms so they replicate pianos, guitars, and drums), which is what the JASS project mentioned below does, then Android might not be able to handle it.
If you're looking for a guide on emulating organic instruments via synthesis check out the books by Fred Welsh http://www.synthesizer-cookbook.com/
Synthesizing a guitar, piano, or natural drums would be difficult. Triggering samples that you pass through a synthesis engine less so. If you want to synthesize analog synth sounds that's easier.
Here is a project out there you might be able to grab code from:
https://sites.google.com/site/androidsynthesizer/
In the end, if you want to create a full synthesizer or a multi-track application, you'll have to render your oscillators, filters, etc. into an audio stream that can be piped into an audio output (in practice AudioTrack, since MediaPlayer won't take raw PCM). You don't necessarily need MIDI to do that.
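To make that concrete, here is a toy Karplus-Strong pluck (the classic cheap plucked-string algorithm) rendered into an AudioTrack; treat it as a sketch of the render-a-buffer-and-play pattern, not a production synth:

import java.util.Random;
import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;

// Karplus-Strong: a noise burst circulating through a short, averaged
// delay line decays into a plucked-string tone.
void playPluck() {
    int sampleRate = 44100;
    int period = sampleRate / 110;            // ~110 Hz, roughly an A string
    float[] delay = new float[period];
    Random rng = new Random();
    for (int i = 0; i < period; i++) {
        delay[i] = rng.nextFloat() * 2f - 1f; // excitation: white noise
    }

    short[] pcm = new short[sampleRate * 2];  // two seconds of mono audio
    int idx = 0;
    for (int i = 0; i < pcm.length; i++) {
        float cur = delay[idx];
        int next = (idx + 1) % period;
        delay[idx] = 0.996f * 0.5f * (cur + delay[next]); // damped average
        pcm[i] = (short) (cur * 30000f);
        idx = next;
    }

    AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate,
            AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
            pcm.length * 2, AudioTrack.MODE_STATIC);
    track.write(pcm, 0, pcm.length);          // static mode: write, then play
    track.play();
}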
Here is one person's experience:
http://jazarimusic.com/2011/06/audio-on-android-a-developers-perspective/
Interesting read.
Two projects that might be worth looking at: JASS (Java Audio Synthesis System) and PureData. PureData is quite interesting, though probably the harder path.
MIDI support on Android sucks. (So does audio support in general, but that's another story.) There's an interesting blog post here that discusses the (lack of) MIDI capabilities on Android. Here's what he did to work around some of the limitations:
Personally I solved the dynamic MIDI generation issue as follows: programmatically generate a MIDI file, write it to the device storage, initiate a MediaPlayer with the file and let it play. This is fast enough if you just need to play a dynamic MIDI sound. I doubt it's useful for creating user-controlled MIDI stuff like sequencers, but for other cases it's great.
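A sketch of that trick: hand-write a minimal format-0 MIDI file (one middle-C quarter note) and hand the path to MediaPlayer. The byte values follow the Standard MIDI File layout; error handling is kept minimal:

import java.io.File;
import java.io.FileOutputStream;
import android.content.Context;
import android.media.MediaPlayer;

// Writes a minimal format-0 MIDI file and plays it back with MediaPlayer.
void playGeneratedMidi(Context context) throws Exception {
    byte[] midi = {
        'M', 'T', 'h', 'd', 0, 0, 0, 6,    // header chunk, 6 data bytes
        0, 0, 0, 1, 0, 96,                 // format 0, one track, 96 ticks/quarter
        'M', 'T', 'r', 'k', 0, 0, 0, 12,   // track chunk, 12 data bytes
        0, (byte) 0x90, 60, 96,            // delta 0: note-on, middle C, vel 96
        96, (byte) 0x80, 60, 0,            // delta 96 ticks: note-off
        0, (byte) 0xFF, 0x2F, 0            // delta 0: end-of-track meta event
    };

    File f = new File(context.getFilesDir(), "note.mid");
    FileOutputStream out = new FileOutputStream(f);
    out.write(midi);
    out.close();

    MediaPlayer mp = new MediaPlayer();
    mp.setDataSource(f.getAbsolutePath());
    mp.prepare();
    mp.start();
}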
Android unfortunately took out MIDI support in the official Java SDK.
That is, you cannot play audio streams directly; you have to go through the provided media classes.
You will have to use some DSP (digital signal processing) knowledge and the NDK in order to do this.
I would not be surprised if there was a general package (not necessarily for Android) to allow you to do this.
I hope this pointed you in the right direction!

How to create an equalizer for Android

I want to create a simple equalizer for Android. How can I do it?
I tried to find suitable methods in the MediaPlayer class, but all my attempts failed.
Android has a built-in equalizer engine, though it isn't located in the MediaPlayer class, because it's a class of its own, located in the android.media.audiofx package.
http://developer.android.com/reference/android/media/audiofx/Equalizer.html
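Basic usage looks like this (R.raw.song is a placeholder for your own audio resource):

import android.content.Context;
import android.media.MediaPlayer;
import android.media.audiofx.Equalizer;

// Attaches the built-in Equalizer to a MediaPlayer's audio session and
// boosts the lowest band.
void playWithBassBoost(Context context) {
    MediaPlayer mp = MediaPlayer.create(context, R.raw.song);
    Equalizer eq = new Equalizer(0, mp.getAudioSessionId());
    eq.setEnabled(true);

    short bands = eq.getNumberOfBands();
    short maxLevel = eq.getBandLevelRange()[1]; // levels are in millibels
    eq.setBandLevel((short) 0, maxLevel);       // band 0 is the lowest band
    for (short b = 1; b < bands; b++) {
        eq.setBandLevel(b, (short) 0);          // leave the other bands flat
    }
    mp.start();
}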
Simple answer... you can't do it with the framework or with Java alone (because there is no JMF support in Android). You have to use the NDK and JNI to compile a native library with equalizer support. If you know C/C++, there are plenty of libraries around that will provide this functionality, but if you don't know C/C++ or have the means to pay someone who does, I would recommend you move on to something else within your means... There are even some working examples for Android, if you look around, that use libmpg123, but libmpg123 only provides an equalizer interface for MP3s. I found that it's pretty buggy in general and compromised the stability of the app in such a way that it would lock up Android and I would have to pull the battery to reboot the phone. In addition, there was a lot of audio clipping even with the equalizer flatlined. That is my experience...
I hope the link below is useful for you.
https://developer.android.com/resources/samples/ApiDemos/src/com/example/android/apis/media/AudioFxDemo.html
