AudioSource value for USB microphone on Android

I'm writing an audio capture application on a rooted MK809 running Android 4.1.1. There is no internal mic, so I am trying to use a USB one, which is correctly detected as "USB Audio Device" under Settings/Sound/Sound Devices Manager/Sound Input Devices when connected.
What AudioSource value should I pass as the first argument of the AudioRecord constructor for this device? I tried every constant in MediaRecorder.AudioSource, and none worked. I am only interested in reading the capture buffer, not saving into a file.

Answering my own question. The following values did work: DEFAULT, MIC and CAMCORDER, and probably others too, as it is the only input device.
I was trying to use a sample rate of 48000 Hz (which works on Windows) and AudioRecord creation failed with:
ERROR/AudioRecord(1615): Could not get audio input for record source 1
ERROR/AudioRecord-JNI(1615): Error creating AudioRecord instance: initialization check failed.
ERROR/AudioRecord-Java(1615): [ android.media.AudioRecord ] Error code -20 when initializing native AudioRecord object.
Somewhat misleading info, considering that a call to getMinBufferSize() with the same set of arguments does not return an error, as it is supposed to when the parameters are unsupported, so I assumed 48000 was a valid sample rate for the device. Setting it to 44100 (guaranteed to be supported) fixed the problem.
USB audio input devices do work on Android, Jelly Bean at least. Hope this helps someone.
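A defensive way to handle this is to probe candidate sample rates and fall back until one initializes. The sketch below is a minimal illustration of that selection logic; the probe is abstracted as a predicate so the logic can be shown on its own. On an actual Android device the probe would construct an AudioRecord and check getState() == STATE_INITIALIZED, as indicated in the comment (the candidate list and helper names are my own, not from the original post):

```java
import java.util.function.IntPredicate;

public class RatePicker {
    // Candidate rates, most preferred first. 44100 Hz is the only rate
    // the Android docs guarantee on all devices, so it serves as the
    // safe fallback.
    static final int[] CANDIDATES = {48000, 44100, 22050, 16000, 8000};

    // Returns the first rate the probe accepts, or -1 if none work.
    // On Android the probe would look roughly like:
    //   rate -> {
    //       AudioRecord r = new AudioRecord(MediaRecorder.AudioSource.MIC,
    //           rate, AudioFormat.CHANNEL_IN_MONO,
    //           AudioFormat.ENCODING_PCM_16BIT,
    //           AudioRecord.getMinBufferSize(rate,
    //               AudioFormat.CHANNEL_IN_MONO,
    //               AudioFormat.ENCODING_PCM_16BIT));
    //       boolean ok = r.getState() == AudioRecord.STATE_INITIALIZED;
    //       r.release();
    //       return ok;
    //   }
    static int pickRate(IntPredicate probe) {
        for (int rate : CANDIDATES) {
            if (probe.test(rate)) return rate;
        }
        return -1;
    }

    public static void main(String[] args) {
        // Simulate the device from the question: 48 kHz fails, 44.1 kHz works.
        int chosen = pickRate(rate -> rate == 44100);
        System.out.println("chosen rate: " + chosen); // prints "chosen rate: 44100"
    }
}
```

Note that, as the answer above observes, getMinBufferSize() succeeding is not proof the rate works; only the getState() check after construction is.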

FWIW, this is implementation specific (it can differ between different platform vendors and OEMs).
On the devices I've worked on, the USB accessory's mic would be chosen if the AudioSource is DEFAULT, MIC or VOICE_RECOGNITION, and the only sample rates supported in the audio HAL for USB audio recording were 8, 16 and 48 kHz (although the AudioFlinger is able to resample to other rates within a certain range).

Related

Android audio record external jack

I would like to record the sound coming in through the jack input of my Android phone. I've been searching about audio capture in Android, and I've found this:
https://developer.android.com/guide/topics/media/audio-capture.html
In that guide, there are several options for choosing the recording source, such as:
Set the audio source using MediaRecorder.setAudioSource(). You will probably want to use MediaRecorder.AudioSource.MIC.
What should I use to get the sound of the jack entry? Is there any example?
Thank you!
The API you provided is the correct one.
Calling mediaRecorder.setAudioSource(MediaRecorder.AudioSource.MIC) (assuming correct initialization of MediaRecorder, as described under "Capturing Videos" here: https://developer.android.com/guide/topics/media/camera.html) will behave like this:
When you launch your app and start recording with no jack connected, the phone's default microphones will be used. As soon as the jack's microphone pin detects a connected microphone, the system will use the jack mic pin as the audio input. However, you need to know that even though the audio recording will have two channels, they will be identical, as a jack microphone can only record a mono stream.

Android AudioRecord Configuration does not match recorded Audio

I am intending to record stereo audio on an Android 4.4.2 device. However, the audio recorded via a simple recording app (using AudioRecord) does not match the configuration supplied. I would expect error messages in logcat if the device were falling back to default configuration values, but I can see that the supplied values appear to be accepted by AudioHardware and AudioPolicyManagerBase.
The current configuration is:
recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
    sampleRate,
    AudioFormat.CHANNEL_IN_STEREO,
    AudioFormat.ENCODING_PCM_16BIT,
    audioBufferSizeInBytes);
Changing the MediaRecorder.AudioSource has been raised as an option for trying to resolve this issue, but it has not changed how the Android stack behaves, except to (understandably) fail to create the recorder when the configuration is invalid.
Changing the sample rate has also produced no visible change in the output: both 44.1 kHz and 16 kHz are valid options, yet both produce 16 kHz audio when examined. The output audio also appears to be one channel of audio upmixed to stereo.
TinyALSA/Tinycap is available to capture the audio, and this appears to behave as expected.
Could this be an issue within the Android Stack? Or is this more likely to be an issue with the code supplied by the OEM?
The reason for the downmixed audio in this case was that the Speex Codec was being used in the HAL to downmix and de-noise the stereo input.
This was configured under:
<android source tree>/hardware/<OEM>/audio/audiohardware.h
An alternative workaround would be to route the audio out of ALSA, around the Android stack, via a Unix domain stream socket. This would be accessible in the application layer with Android's LocalSocket.

AudioTrack playing with inconsistent speed on different devices

I am currently working with AudioTrack. I am loading an mp3 file and playing it. However, depending on the device, the music plays either at half the rate or at the normal rate.
My code:
AudioTrack track= new AudioTrack( AudioManager.STREAM_MUSIC,
sampleRate,
AudioFormat.CHANNEL_CONFIGURATION_MONO,
AudioFormat.ENCODING_PCM_16BIT,
initBuffer,
AudioTrack.MODE_STREAM);
sampleRate is the sample rate returned by the MediaFormat, and initBuffer is the buffer size from AudioTrack.getMinBufferSize().
I have tried to change the sample rate but no difference. Buffer size also has no impact.
In fact, switching to CHANNEL_CONFIGURATION_STEREO does make the music play at normal rate on the devices that were slow, but it also makes the ones working fine playing at twice the normal speed. My problem is that I want all devices to play at the normal speed.
Any suggestions?
I have read this thread: Android AudioTrack slow playback, but it doesn't tell me how to find out which devices should play in mono or stereo.
Devices at normal speed: Urbano 4.2.2, Galaxy S4 4.3
Devices at half speed: Galaxy S4 4.2.2, Xperia Z 4.2.2
BTW, I cannot use MediaPlayer for playback. The AudioTrack is included in a custom player and I need to write audio data as I extract it. MediaPlayer won't do the trick.
Thanks.
I experienced this when trying to decode a mono file using MediaCodec. On some devices the codec would output a stereo stream, while on others it was mono. When setting up the AudioTrack I used the MediaFormat returned by MediaExtractor, which was mono. On devices where the codec produced a stereo stream, the AudioTrack would be fed twice as many samples. The solution is to listen for the output-format-changed event from MediaCodec and adjust the configuration of the AudioTrack accordingly.
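The core of that fix is comparing the parameters the AudioTrack was built with against the ones the codec actually reports once decoding starts. A minimal sketch of that check follows; on Android the real values would come from MediaFormat.getInteger(MediaFormat.KEY_CHANNEL_COUNT) and KEY_SAMPLE_RATE after MediaCodec.dequeueOutputBuffer() returns INFO_OUTPUT_FORMAT_CHANGED, but the rule itself is plain logic (the helper name is mine, not from the answer):

```java
public class TrackReconfig {
    // Returns true when the AudioTrack must be torn down and rebuilt
    // because the decoder's real output format disagrees with the
    // format the track was originally created from.
    static boolean needsRebuild(int trackChannels, int trackRate,
                                int codecChannels, int codecRate) {
        return trackChannels != codecChannels || trackRate != codecRate;
    }

    public static void main(String[] args) {
        // Track was set up as mono/44100 from MediaExtractor's format,
        // but the codec reports stereo output, as in the answer above.
        System.out.println(needsRebuild(1, 44100, 2, 44100)); // prints "true"
    }
}
```

When a rebuild is needed, the existing track is released and a new one created with the codec's reported channel mask and rate, which removes the half-speed/double-speed symptom described in the question.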
AudioTrack only accepts PCM audio data; you cannot play mp3 with AudioTrack directly. You have to decode your mp3 to PCM or use something else (e.g. MediaPlayer) instead of AudioTrack.
I finally solved my problem. Not the best fix but it works at least.
For those who are interested: I am reading the BufferInfo size and, based on it, deciding at which playback rate I should play. Basically, when playback is slow, the size is twice as big as for normal-speed playback. Just a guess, but the MediaCodec might duplicate data for the stereo configuration.
This one is a little tricky but here is what you can do using adb and ALSA.
Android internally uses ALSA.
Your device should have ALSA, try :
root#user:/$ adb shell cat /proc/asound/version
Advanced Linux Sound Architecture Driver Version 1.0.25.
Just take a look at Reading ALSA Device names.
Every device (subdivision of a card) has a capture and a playback component.
Like /proc/asound/card0/pcm1p and /proc/asound/card0/pcm1c, where the card is 0 and the device is 1.
pcm1p is your playback and pcm1c is capture (for recording).
Access your device using adb:
root#user:/$ adb shell
shell#android:/$:
Identifying your device:
/proc/asound/pcm will give you a long list:
shell#android:/$: cat /proc/asound/pcm
00-00: MultiMedia1 (*) : : playback 1 : capture 1
00-01: MultiMedia2 (*) : : playback 1 : capture 1
00-02: CS-Voice (*) : : playback 1 : capture 1
From the above I find that 00-00: MultiMedia1 (*), i.e. card 0, device 0, is for multimedia playback.
Getting Playback Parameter:
Play your loaded mp3 file using your standard music player Application.
While the song is playing.
Issue the following command for card 0, device 0 (p for playback), subdevice 0:
shell#android:/$: cat /proc/asound/card0/pcm0p/sub0/hw_params
access: RW_INTERLEAVED
format: S16_LE
subformat: STD
channels: 2
rate: 44100 (44100/1)
period_size: 1920
buffer_size: 3840
So try using the same values when you call AudioTrack track = new AudioTrack(...);
The above values are only visible while the device is open (i.e. playing some audio).
If you issue this against the wrong device (say, pcm1p), you will see the following:
shell#android:/$: cat /proc/asound/card0/pcm1p/sub0/hw_params
closed
Note :
Step 1 doesn't require the phone to be rooted; see How to Setup ADB.
Alternatives:
Have you tried the AudioTrack APIs like getChannelConfiguration() or getChannelCount()? (Just asking.)
Have you tried looking at the MP3 file's properties? The metadata has information on the audio parameters.
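Mapping the hw_params output above onto AudioTrack parameters is mostly arithmetic: S16_LE means 16-bit little-endian PCM (2 bytes per sample), and period_size/buffer_size are counted in frames, so the buffer size in bytes is frames × channels × bytes-per-sample. A small sketch of that conversion, using the values from the dump above (the AudioTrack construction in the comment is Android-only and illustrative):

```java
public class HwParams {
    // Converts an ALSA buffer size in frames to bytes for 16-bit PCM
    // (S16_LE): each frame holds one 2-byte sample per channel.
    static int framesToBytes(int frames, int channels) {
        return frames * channels * 2;
    }

    public static void main(String[] args) {
        // Values from the hw_params dump: 2 channels, S16_LE,
        // rate 44100, buffer_size 3840 frames.
        int bytes = framesToBytes(3840, 2);
        System.out.println(bytes); // prints "15360"
        // On Android this would feed into something like:
        //   new AudioTrack(AudioManager.STREAM_MUSIC, 44100,
        //       AudioFormat.CHANNEL_OUT_STEREO,
        //       AudioFormat.ENCODING_PCM_16BIT,
        //       Math.max(bytes, AudioTrack.getMinBufferSize(44100,
        //           AudioFormat.CHANNEL_OUT_STEREO,
        //           AudioFormat.ENCODING_PCM_16BIT)),
        //       AudioTrack.MODE_STREAM);
    }
}
```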

How can I record 2 microphone in Android simultaneously?

I'm trying to record audio signals from the 2 built-in microphones (bottom, top) at the same time. I can pick up the bottom microphone's signal using
MediaRecorder.AudioSource.MIC
and top microphone signal using
MediaRecorder.AudioSource.CAMCORDER
I can record from each separately, but I want to record from both microphones at the same time.
Does anyone know how to record simultaneously?
I tried the & and | operators, but I get only a 1-channel signal.
I use Galaxy S2 device.
I will appreciate any response :)
Thanks in advance.
There is a misconception that in devices with 2 microphones, both the microphones will be used when recording in the stereo mode.
In my 3 years' experience of testing on tens of devices, I have found that this was never the case.
The primary mic alone is used both in mono and stereo recording in the wide range of Android devices that I have worked with - from low-cost mass models to flagships.
One reason for this is that the primary mic is of a better quality (more sensitive, less noisy, etc.) and costlier than the secondary mic.
You can achieve this by doing a stereo recording using the AudioRecord (http://developer.android.com/reference/android/media/AudioRecord.html) class. Have a look at How to access the second mic android such as Galaxy 3.
Specifying the audio format as stereo and the audio source as the camcorder automatically selects two microphones, one for each channel, on a (compatible) two microphone device.
For example:
audioRecorder = new AudioRecord(MediaRecorder.AudioSource.CAMCORDER,
    sampleRate,
    AudioFormat.CHANNEL_IN_STEREO,
    AudioFormat.ENCODING_PCM_16BIT,
    bufferSize);
will initialise a new AudioRecord instance that can record from the two device microphones in stereo, in 16-bit PCM format.
For more help on recording using AudioRecord (to record .wav), have a look at: http://i-liger.com/article/android-wav-audio-recording.
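Once such a stereo recording works, the two microphones arrive interleaved (left, right, left, right, ...) in the PCM buffer read from AudioRecord. A small helper for splitting that buffer into one mono stream per mic (a sketch; on Android the short[] would come from AudioRecord.read(), and which mic maps to which channel is device-specific):

```java
public class Deinterleave {
    // Splits an interleaved 16-bit stereo PCM buffer (L R L R ...)
    // into two mono buffers, one per microphone/channel.
    static short[][] split(short[] interleaved) {
        int frames = interleaved.length / 2;
        short[] left = new short[frames];
        short[] right = new short[frames];
        for (int i = 0; i < frames; i++) {
            left[i] = interleaved[2 * i];      // channel 0 (e.g. primary mic)
            right[i] = interleaved[2 * i + 1]; // channel 1 (e.g. secondary mic)
        }
        return new short[][] { left, right };
    }

    public static void main(String[] args) {
        short[][] chans = split(new short[] {1, -1, 2, -2, 3, -3});
        System.out.println(java.util.Arrays.toString(chans[0])); // prints "[1, 2, 3]"
        System.out.println(java.util.Arrays.toString(chans[1])); // prints "[-1, -2, -3]"
    }
}
```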

What is AudioFlinger and why does it fail TONE_PROP_ACK?

In my application I issue the following statement:
toneGenerator.startTone(ToneGenerator.TONE_PROP_ACK, 600);
Which works very well on a cheap LG LS670 running Android 2.3.3, but doesn't sound at all on any of the other phones I have, ranging from Android 2.2.1 to Android 2.3.4.
So I know the OS version doesn't play a role here (I also verified in the documentation that it has been supported since API level 1).
Also, both Ringer volume and Media volume are set to maximum and toneGenerator is initialized with:
toneGenerator = new ToneGenerator(ToneGenerator.TONE_DTMF_1, 100);
And I verified that Settings.System.DTMF_TONE_WHEN_DIALING is set to 1.
Baffled by this inconsistent behavior across different phones, I examined the system logs, and the only suspicious difference I have been able to find is that the phones that fail to sound TONE_PROP_ACK have this line in their log:
AudioFlinger setParameters(): io 25, keyvalue routing=0, tid 155, calling tid 121
What is the purpose of AudioFlinger and what could be its connection to muting TONE_PROP_ACK?
Any idea how to fix my code so that that TONE_PROP_ACK always sounds, regardless of phone model?
One workaround is to generate the tone in something like Audacity and play it through SoundPool or the API of your choice.
According to the Android docs, ToneGenerator.TONE_PROP_ACK is:
1200Hz, 100ms ON, 100ms OFF 2 bursts
If you choose SoundPool, I suggest saving in Ogg format and looping the tone until complete. This will provide seamless audio with a very small clip while not using a lot of resources.
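If you would rather synthesize the clip than export it from Audacity, the TONE_PROP_ACK pattern quoted above (1200 Hz, 100 ms on, 100 ms off, 2 bursts) is easy to generate as raw 16-bit PCM. A sketch of that, assuming a 44.1 kHz output rate; wrapping the samples in a WAV/Ogg container or feeding them to a static-mode AudioTrack is left out here:

```java
public class AckTone {
    // Generates the TONE_PROP_ACK pattern as 16-bit PCM samples:
    // two 100 ms bursts of a 1200 Hz sine, separated by 100 ms of silence.
    static short[] generate(int sampleRate) {
        int burst = sampleRate / 10;          // 100 ms worth of samples
        short[] pcm = new short[burst * 3];   // burst, silence, burst
        for (int i = 0; i < burst; i++) {
            short s = (short) (Math.sin(2 * Math.PI * 1200 * i / sampleRate)
                               * Short.MAX_VALUE * 0.8); // 80% amplitude
            pcm[i] = s;             // first burst
            pcm[2 * burst + i] = s; // second burst (gap in between stays 0)
        }
        return pcm;
    }

    public static void main(String[] args) {
        short[] tone = generate(44100);
        System.out.println(tone.length); // prints "13230"
    }
}
```

An abrupt start/stop at full amplitude can click audibly; a short fade-in/out ramp on each burst edge would smooth that, at the cost of a few more lines.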
The parsing/decoding is handled by Stagefright, which is used by the
media player service. The decoded data is written to an Audio Track
through an Audio Sink, and the tracks are then mixed by the
Audio Flinger's mixer thread(s) and written to an output stream
(Audio Hardware). The output stream object fills up its own buffer(s)
and then writes the data to the PCM output device file (which may or
may not be an ALSA driver).
