I'm working on an Android app centered around audio playback, and I'm experiencing some erratic behavior (audio stuttering and hiccups) that I suspect might be inherent to certain devices, OS versions, or perhaps the native buffer size of the device.
About my implementation: I need low latency, so I'm processing my audio in OpenSL ES callbacks and enqueueing fairly small buffers of 128 samples. I do MP3 decoding during the callback, but thanks to my ring buffer size I don't need to decode on every callback cycle.
I'm using a remote testing service to gauge audio playback quality on a variety of devices and OS versions, and here are a few examples of the inconsistencies I'm finding:
Samsung Galaxy S4 w/ Android 4.4 - no audio playback issues
Samsung Galaxy S4 w/ Android 4.3 - user experiences audio drop-outs/stuttering when locking/unlocking the device
Samsung Galaxy Note 2 w/ Android 4.1.2 - no issues
Samsung Galaxy Note 2 w/ Android 4.3 - audio drop-outs during playback and stuttering when locking/unlocking screen.
Personally, I have a Galaxy S3 w/ 4.1.2 and a Nexus 5 with 4.4 and don't ever experience these issues. I also have a few older 2.3.7 devices where these issues do not occur (2010 Droid Incredible, LG Optimus Elite).
I am fairly confident that I'm not over-working the processor since I can get this running on older, Gingerbread devices just fine.
My questions:
If I raise my minimum SDK to 4.2 (API 17), I can query the native buffer size from the hardware and use some multiple of it during my buffer queue callbacks. Would this make much of a difference in the cases where stuttering and drop-outs are a problem, especially during screen lock?
Is there a known bug in Android 4.3 where audio playback suffers, especially during screen-lock actions? Or is this possibly just a Samsung issue?
Are there other ways of improving performance to avoid this problem? I absolutely need OpenSL ES for my app.
Thanks.
Increasing the buffer size solves some problems with distortion and noise. Yes, on API 17 (Android 4.2) and above you can query the native buffer size from the hardware with this:

AudioManager audioManager = (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);
String size = audioManager.getProperty(AudioManager.PROPERTY_OUTPUT_FRAMES_PER_BUFFER);
int framesPerBuffer = Integer.parseInt(size); // the property is returned as a String
Another approach is to get the minimum buffer size (in bytes) for AudioRecord:

int minForAudioRecord = AudioRecord.getMinBufferSize(8000,
        AudioFormat.CHANNEL_IN_MONO,
        AudioFormat.ENCODING_PCM_16BIT);

And for audio playback with AudioTrack:

int minForAudioTrack = AudioTrack.getMinBufferSize(8000,
        AudioFormat.CHANNEL_OUT_MONO,
        AudioFormat.ENCODING_PCM_16BIT);
If your SDK version is 4.2 or higher, you can also query the device's preferred output sample rate:

String rate = audioManager.getProperty(AudioManager.PROPERTY_OUTPUT_SAMPLE_RATE); // e.g. "44100" or "48000"
I've found Samsung devices are the worst for dealing with these things, because each model takes a different approach to its audio drivers.
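On the original question of enqueueing a multiple of the native buffer size: a tiny plain-Java helper (hypothetical, not an Android API) can round a preferred 128-sample buffer up to the nearest multiple of the device's frames-per-buffer value, so each enqueue lines up with whole HAL buffers:

```java
public final class BufferSizing {
    // Round desiredFrames up to the nearest multiple of the device's
    // native frames-per-buffer (from PROPERTY_OUTPUT_FRAMES_PER_BUFFER).
    // Enqueueing whole multiples of the native size helps OpenSL ES
    // callbacks line up with the audio HAL's scheduling.
    public static int alignToNative(int desiredFrames, int nativeFramesPerBuffer) {
        if (nativeFramesPerBuffer <= 0) {
            return desiredFrames; // property unavailable: keep the app's choice
        }
        int multiples = (desiredFrames + nativeFramesPerBuffer - 1) / nativeFramesPerBuffer;
        return multiples * nativeFramesPerBuffer;
    }
}
```

For example, on a device reporting 240 native frames per buffer, alignToNative(128, 240) yields 240, and alignToNative(512, 240) yields 720.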
Related
I have found that setVideoEncodingBitRate(800000) works for most of the devices I am using, but the Samsung Galaxy S6 seems to record at 1.3 Mbps rather than the 800 kbps I set.
I am assuming this is because the device doesn't support that bitrate (I could be wrong).
Is there a way of getting an Android device's supported video encoding bitrates? Or at least of seeing what the MediaRecorder has actually been set to after calling prepare()? I can't seem to find any kind of mediaRecorder.getVideoEncodingBitRate() call.
Code below.
mediaRecorder.setVideoEncodingBitRate(500000); //500000 works with galaxy s6
mediaRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
mediaRecorder.setVideoFrameRate(mCaptureProfile.framesPerSecond);
// Audio Settings
mediaRecorder.setAudioChannels(mCaptureProfile.audioNumChannels);
mediaRecorder.setAudioSamplingRate(mCaptureProfile.audioSampleRate);
mediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
mediaRecorder.setAudioEncodingBitRate(mCaptureProfile.audioBitRate);
mediaRecorder.prepare();
In general, it's recommended you use CamcorderProfile to select the encoding settings, including bitrate, when recording video with a MediaRecorder. Select the profile with the resolution you want (for the camera you want), and then set it on the media recorder with MediaRecorder.setProfile.
This allows the device vendor to set the recommended bitrate, which may vary greatly between devices, due to different hardware encoders, supported parts of the video recording specs, and so on.
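For example, a sketch of that approach (Android API, not runnable off-device; `outputPath` is a placeholder and QUALITY_720P is an arbitrary choice):

```java
// Sketch: let the vendor-tuned CamcorderProfile pick the bitrate
// instead of hard-coding one. Camera id 0 is typically the back camera.
CamcorderProfile profile = CamcorderProfile.get(0, CamcorderProfile.QUALITY_720P);

MediaRecorder recorder = new MediaRecorder();
recorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
recorder.setVideoSource(MediaRecorder.VideoSource.SURFACE);
recorder.setProfile(profile);       // applies format, encoders, bitrates, frame rate
recorder.setOutputFile(outputPath); // outputPath: your file path (placeholder)
recorder.prepare();
```

Note that setProfile must be called after the audio and video sources are set and before setOutputFile.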
To look at the actual supported bitrates, check MediaCodecInfo; the encoder's capabilities expose them on newer API levels.
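One way to do that on API 21+ (a sketch; which codecs appear varies by device) is to walk the codec list and read each H.264 encoder's bitrate range:

```java
// Sketch (API 21+): query the supported bitrate range of H.264 encoders.
for (MediaCodecInfo info :
        new MediaCodecList(MediaCodecList.REGULAR_CODECS).getCodecInfos()) {
    if (!info.isEncoder()) continue;
    for (String type : info.getSupportedTypes()) {
        if (type.equals("video/avc")) {
            Range<Integer> bitrates = info.getCapabilitiesForType(type)
                    .getVideoCapabilities()
                    .getBitrateRange();
            Log.d("Codec", info.getName() + " supports " + bitrates);
        }
    }
}
```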
I am attempting to record an MP4 using an OpenGL SurfaceView, with a MediaCodec encoder encoding the video frames in one thread while, in another thread, AudioRecord captures audio samples that a separate encoder encodes, and finally a MediaMuxer does the actual muxing.
On some devices (mostly higher-end ones) this method works completely fine and the output video is as expected, with no errors. However, on some other devices that isn't the case.
What happens is that on some devices (e.g. the Moto Droid Turbo 2) the audio encoder (which is running in its own thread) will process a few audio samples and then return MediaCodec.INFO_TRY_AGAIN_LATER (no buffer available) when attempting to dequeue an input buffer; this starts happening after about 5 samples are successfully encoded, while the video encoder runs completely fine. On another device (e.g. the Samsung Galaxy Alpha) the opposite happens: the video encoder begins returning INFO_TRY_AGAIN_LATER when dequeuing an input buffer, and the audio encoder runs fine.
I guess my question is (and I have looked all over for a completely clear explanation of this): what is the cause of a buffer being unavailable? Other than not releasing the buffer before its next use, what can cause this? I am 98% certain that I am releasing the buffers when they need to be released, because the exact same code works on most devices that I have tested on (Nexus 5X, Nexus 6P, Samsung S6, Samsung Edge/Edge+, Moto X (2014), Moto X (2015), HTC One M9, HTC One M8, etc.).
Does anyone have any ideas? I didn't think posting code examples was necessary because I know it works on most devices I have tested, but my question is: what, if anything, can cause a dequeue call to report no buffer available aside from not releasing the buffer?
EDIT:
Investigating the issue further, it became even stranger. I was starting to think this had something to do with the processing power of the device, which would explain why the higher-end devices worked fine but the lower-end ones did not. HOWEVER, the Samsung Note Edge (which works fine) has the same CPU, GPU, chipset, AND amount of RAM as the Moto Droid Turbo, which encounters the error when dequeuing audio buffers from the audio encoder.
Edit 2:
Yup, this was entirely something that I was doing. The issue happened because I missed a release call on the buffer in the case where the muxer hadn't started yet. Instead of releasing the buffer in that case, I just ignored it and moved on, which left the buffer hung up. Problem solved.
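For anyone hitting the same issue, the shape of the fix looks roughly like this (a sketch with assumed fields `encoder`, `muxer`, `trackIndex`, and `muxerStarted`; real code must also handle INFO_OUTPUT_FORMAT_CHANGED and end-of-stream):

```java
// Sketch: drain an encoder's output, releasing EVERY dequeued buffer --
// including buffers dropped because the muxer hasn't started yet.
MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
int outIndex = encoder.dequeueOutputBuffer(info, 10_000 /* timeout, µs */);
while (outIndex >= 0) {
    if (muxerStarted) {
        ByteBuffer data = encoder.getOutputBuffer(outIndex);
        muxer.writeSampleData(trackIndex, data, info);
    }
    // Release unconditionally; skipping this when the muxer isn't
    // running yet is exactly the leak described in Edit 2.
    encoder.releaseOutputBuffer(outIndex, false);
    outIndex = encoder.dequeueOutputBuffer(info, 10_000);
}
```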
So far I had thought it was a firmware error, but now I have found a second device that has two internal microphone capsules but yields only a monophonic signal (the first was the S2 Plus GT-I9105P with Android 4.1.2, and now the HTC One M7; the Nexus 10 with Android 4.4.2 has only one mic).
I tested with:
bufSize = AudioRecord.getMinBufferSize(44100, AudioFormat.CHANNEL_IN_STEREO, AudioFormat.ENCODING_PCM_16BIT);
ar = new AudioRecord(source, 44100, AudioFormat.CHANNEL_IN_STEREO, AudioFormat.ENCODING_PCM_16BIT, bufSize);
and set source according to How avoid automatic gain control with AudioRecord?
I also tested some apps to see whether they could produce real stereo recordings but found none (I tested the standard Camera/Camcorder, Audiorecorder, and RecForge Lite).
So my question is: how can stereo recording be enabled on tablets/smartphones that yield only mono despite having two internal microphones?
Add-on question, if there is no way to achieve real stereo recordings: could you name other devices that also have two internal microphones but only mono recording capability?
Is there any API or trick to find this out programmatically?
See what worked for me on the Motorola Moto G. I understand this is very vendor/model dependent, but I tried many combinations until I found that only at 48000 Hz did I get stereo recording on that particular phone:
Recording stereo audio with a Motorola Moto G (or Moto X)
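Along the same lines, one can probe candidate rates at startup and keep the first one that initializes in stereo (a sketch; device behavior varies, and a successful init does not guarantee the two capsules deliver distinct channels — verify by comparing L/R data):

```java
// Sketch: find a sample rate at which stereo capture initializes.
int[] candidates = {48000, 44100, 22050, 16000};
for (int rate : candidates) {
    int bufSize = AudioRecord.getMinBufferSize(rate,
            AudioFormat.CHANNEL_IN_STEREO, AudioFormat.ENCODING_PCM_16BIT);
    if (bufSize == AudioRecord.ERROR_BAD_VALUE) continue; // rate unsupported
    AudioRecord ar = new AudioRecord(MediaRecorder.AudioSource.MIC, rate,
            AudioFormat.CHANNEL_IN_STEREO, AudioFormat.ENCODING_PCM_16BIT, bufSize);
    if (ar.getState() == AudioRecord.STATE_INITIALIZED) {
        // Stereo init succeeded at this rate; use `ar` from here.
        break;
    }
    ar.release(); // free the failed instance before trying the next rate
}
```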
People on the net give latency figures of around 88 ms for the Galaxy Nexus running ICS and 72 ms for the Nexus 7 running JB 4.1.1. I have tried both AudioTrack and OpenSL ES and found that I cannot get below 140 ms of latency on either device. Am I missing something? I have set my output threads to URGENT_AUDIO priority, pass the audio in small chunks (e.g. 160 shorts), and use the minimum buffer size (in the AudioTrack case).
Are the quoted numbers only valid for short sounds played through SoundPool and not applicable to streaming PCM? Just to be clear, I am talking about playback only, not recording.
This is Android's dirty little secret. It is not fixed; all you need is an app, an ear, and a finger to find the truth.
I'm new to the Android platform, and I want to develop an app that runs in the background, reads the microphone input, applies a transformation to it, and outputs the resulting audio to the speaker.
I'm wondering if there is any lag perceived by the user in this process, or if it's possible to do it in near-realtime so that the user can hear the transformed audio in sync with the ambient audio. Thanks!
Yes, users will hear a severe latency lag or echo with attempts at real-time audio on current unmodified Android devices using the provided APIs.
The summary is that Android devices are configured with fairly long audio buffers, reported to be somewhere in the range of 100 to 400 milliseconds depending on the particular device and the Android OS version it is running. (Shorter buffers might be possible on devices where one can build and install a modified custom build of the OS with your own audio drivers.)
(Humans hear echoes at delays somewhere around or above 25 ms. Audio buffers on iOS can be as short as 5.8 ms, so you may have better luck developing your near-real-time audio processing on a different device platform.)
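For perspective, the relationship between buffer length and latency is simple arithmetic (plain Java; the frame counts below are illustrative, not measured):

```java
public final class Latency {
    // Latency in milliseconds contributed by a buffer of `frames`
    // samples played back at `sampleRate` Hz.
    public static double bufferLatencyMs(int frames, int sampleRate) {
        return frames * 1000.0 / sampleRate;
    }
}
```

At 44.1 kHz, a 256-frame buffer comes to about 5.8 ms (the iOS figure above), while on the order of 6000+ buffered frames would account for latency in the 140 ms range reported for Android devices.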
Audio processing on Android isn't all that great; in fact, to be honest, it sucks. The out-of-the-box latency on Android devices for such things is pretty awful. You can, however, tinker with the NDK and try to put together something based on OpenSL ES, which will have significantly lower latency.
There is a similar StackOverflow question: Playing back sound coming from microphone in real-time
Some other helpful links:
http://arunraghavan.net/2012/01/pulseaudio-vs-audioflinger-fight/
http://www.musiquetactile.fr/android-is-far-behind-ios/
http://www.geardiary.com/2012/02/21/the-dismal-state-of-android-as-a-music-production-solution/
On the other side of the coin, Android mic quality is way better than iOS quality. I have a Galaxy S4 and a very low-end Huawei phone, and both have wonderful mic quality when recording.