I am attempting to record an MP4 using an OpenGL SurfaceView, with a MediaCodec encoder encoding the video frames in one thread, while in another thread I use AudioRecord to capture audio samples and a separate encoder to encode the audio data, and finally a MediaMuxer to do the actual muxing.
On some devices (mostly higher-end devices) this approach works completely fine and the output video is as expected, with no errors. However, on some other devices that isn't the case.
What happens is that on some devices (e.g. the Moto Droid Turbo 2), the audio encoder (which is running in its own thread) will process a few audio samples and then return MediaCodec.INFO_BUFFER_UNAVAILABLE when attempting to dequeue an input buffer. This starts happening after about 5 samples have been successfully encoded, while the video encoder runs completely fine. On another device (e.g. the Samsung Galaxy Alpha) the exact opposite happens: the video encoder begins returning the MediaCodec.INFO_BUFFER_UNAVAILABLE status when dequeuing an input buffer, while the audio encoder runs fine.
I guess my question is (and I have looked all over for a completely clear explanation of this): what causes a buffer to be unavailable? Other than not releasing the buffer before its next use, what can cause this? I am 98% certain that I am releasing the buffers when they need to be released, because the exact same code works on most devices I have tested on (Nexus 5X, Nexus 6P, Samsung S6, Samsung Edge/Edge+, Moto X (2014), Moto X (2015), HTC One M9, HTC One M8, etc.).
Does anyone have any ideas? I didn't think posting code examples was necessary because I know the code works on most devices I have tested, but my question is: what, if anything, can cause MediaCodec.INFO_BUFFER_UNAVAILABLE to be returned aside from not releasing a buffer?
EDIT:
Investigating further, the issue becomes even a little strange. I was starting to think this had something to do with the processing power of the device, which would explain why the higher-end devices worked fine but the lower-end ones did not. HOWEVER, the Samsung Note Edge (which works fine) has the same CPU, GPU, chipset AND amount of RAM as the Moto Droid Turbo, which hits the error when dequeuing audio buffers from the audio encoder.
EDIT 2:
Yup, this was entirely something I was doing. The issue happened because I missed a release call on the output buffer in the case where the muxer hadn't started yet. Instead of releasing the buffer in that case, I simply ignored it and moved on, which left the buffer hung up. Problem solved.
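In hindsight, the broken path looked roughly like this (simplified; mMuxerStarted, TIMEOUT_US and the field names are stand-ins, not my exact code). The key point is that every successfully dequeued output buffer must be released, even when its contents are thrown away because the muxer hasn't started:

int outIndex = mEncoder.dequeueOutputBuffer(mBufferInfo, TIMEOUT_US);
if (outIndex >= 0) {
    ByteBuffer encodedData = mEncoder.getOutputBuffers()[outIndex];
    if (mMuxerStarted) {
        mMuxer.writeSampleData(mTrackIndex, encodedData, mBufferInfo);
    }
    // The bug: this call was skipped in the !mMuxerStarted branch, so the
    // codec eventually ran out of free buffers and dequeueing failed.
    mEncoder.releaseOutputBuffer(outIndex, false);
}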
Related
I have observed that, for devices in the same price range, Snapdragon-based decoders can have a much higher decoding latency than Exynos-based decoders. This is most noticeable when decoding h264 streams with the value "pic_order_cnt_type" inside the SPS set to 0 instead of 2.
I am wondering if you have also observed this behaviour and if there is any fix for it (I have already opened an issue here, but no response so far).
Some technical details:
I have built a simple example app that uses AMediaCodec to decode h264 streams. It uploads the "decoding latency" as a test result to a Firestore database (code).
Here is a comparison of the decoding latency for different h264 streams on a Pixel 4 (using the Snapdragon decoder) and a Samsung Galaxy Note 9 (using the Exynos decoder):
[Pixel 4 latency results]
[Galaxy Note 9 latency results]
As you can see, for the video called jetson/h264/2/test.h264, the decoding time on the Snapdragon device is ~21 times higher than on the Samsung device; pic_order_cnt_type==0 for this stream. On the other streams (which all use pic_order_cnt_type==2) the difference in decoding time is insignificant.
The main parameter determining whether the Snapdragon decoder enters a "low-latency decoding path" seems to be the pic_order_cnt_type value mentioned above.
If I understand the h264 specs correctly, if this value is set to 2, picture re-ordering is impossible (no buffered frames). If it is set to 0, picture re-ordering is possible, but not necessarily used by the encoder. However, the Snapdragon decoder doesn't differentiate between "possible" and "actually used by the encoder", leading to the big difference in decoding latency.
I was able to reduce the decoding latency on Snapdragon by manipulating the bitstream before sending it to the decoder (adding VUI with num_reorder_frames=0 and max_dec_frame_buffering=0), but this never results in 0 buffered frames, only fewer buffered frames.
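One more thing that might be worth trying (I have not verified it on these particular streams): Android 11 (API 30) added a standard low-latency decoding hint to MediaFormat, which some decoders use to disable output buffering. A minimal Java sketch, assuming an API 30+ device and an existing outputSurface:

MediaFormat format = MediaFormat.createVideoFormat(
        MediaFormat.MIMETYPE_VIDEO_AVC, 1280, 720); // resolution is a placeholder
format.setInteger(MediaFormat.KEY_LOW_LATENCY, 1);  // "low-latency" hint, API 30+
MediaCodec decoder = MediaCodec.createDecoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
decoder.configure(format, outputSurface, null, 0);
decoder.start();

Whether a given Snapdragon decoder actually honors the hint for pic_order_cnt_type==0 streams is an open question.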
I'm working on an Android app centered around audio playback, and I'm experiencing some erratic behavior (audio stuttering and hiccups) that I suspect might be inherent to certain devices, OS versions, or perhaps the native buffer size of the device.
About my implementation: I need low latency, so I'm processing my audio in OpenSL ES callbacks and using a fairly small buffer size of 128 samples to enqueue the buffers. I do MP3 decoding during the callback, but thanks to my ring buffer size I do not need to decode on every callback cycle.
I'm using a remote testing service to gauge audio playback quality on a variety of devices and OS versions, and here are a few examples of the inconsistencies I'm finding:
Samsung Galaxy S4 w/ Android 4.4 - no audio playback issues
Samsung Galaxy S4 w/ Android 4.3 - user experiences audio drop-outs/stuttering when locking/unlocking the device
Samsung Galaxy Note 2 w/ Android 4.1.2 - no issues
Samsung Galaxy Note 2 w/ Android 4.3 - audio drop-outs during playback and stuttering when locking/unlocking the screen
Personally, I have a Galaxy S3 w/ 4.1.2 and a Nexus 5 with 4.4 and don't ever experience these issues. I also have a few older 2.3.7 devices where these issues do not occur (2010 Droid Incredible, LG Optimus Elite).
I am fairly confident that I'm not over-working the processor, since I can get this running just fine on older Gingerbread devices.
My questions:
If I raise my base SDK to 4.2, I can detect the native buffer size from the hardware and use some multiple of it in my buffer queue callbacks. Would this make much of a difference in the cases where stuttering and drop-outs are problematic, especially during screen lock?
Is there a known bug with Android 4.3 where audio playback suffers, especially during screen-lock actions? Is this possibly just a Samsung issue?
Are there other ways of improving performance to avoid this problem? I absolutely need OpenSL ES for my app.
Thanks.
Increasing the buffer size solves some problems with distortion and noise. Yes, you can detect the native buffer size from the hardware when your SDK level is 4.2 or above, like this:
String size = audioManager.getProperty(AudioManager.PROPERTY_OUTPUT_FRAMES_PER_BUFFER);
Yet another approach is to query the minimum buffer size for AudioRecord:
int minForAudioRecord = AudioRecord.getMinBufferSize(8000,
        AudioFormat.CHANNEL_IN_MONO,
        AudioFormat.ENCODING_PCM_16BIT);
And for Audio Playback:
int minForAudioTrack = AudioTrack.getMinBufferSize(8000,
        AudioFormat.CHANNEL_OUT_MONO,
        AudioFormat.ENCODING_PCM_16BIT);
If your SDK version is 4.2 or higher, you can also query the device's preferred sample rate for your audio:
String rate = audioManager.getProperty(AudioManager.PROPERTY_OUTPUT_SAMPLE_RATE);
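A minimal sketch of putting the two properties together (my own example, not from the platform docs): parse them with fallbacks for devices where getProperty() returns null, and size the playback buffer as a multiple of the native buffer, never below the reported minimum.

String rateStr = audioManager.getProperty(AudioManager.PROPERTY_OUTPUT_SAMPLE_RATE);
String framesStr = audioManager.getProperty(AudioManager.PROPERTY_OUTPUT_FRAMES_PER_BUFFER);
int sampleRate = (rateStr != null) ? Integer.parseInt(rateStr) : 44100;
int framesPerBuffer = (framesStr != null) ? Integer.parseInt(framesStr) : 256;

int minBytes = AudioTrack.getMinBufferSize(sampleRate,
        AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);
// Four native-sized buffers of mono 16-bit (2-byte) frames, but never less
// than the minimum the platform reports.
int bufferBytes = Math.max(minBytes, framesPerBuffer * 2 * 4);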
I've found Samsung devices to be the worst for dealing with these things, because each model takes a different approach to its audio drivers.
I am working on an implementation of one of the Android test cases for previewTexture recording with the new MediaCodec and MediaMuxer APIs of Android 4.3.
I've managed to record the preview stream at a framerate of about 30fps by setting the recordingHint in the camera parameters.
However, I ran into a delay/lag problem and don't really know how to fix it. When recording the camera preview with fairly standard quality settings (1280x720, bitrate of ~8,000,000), the preview and the encoded material suffer from occasional lags. To be more specific: the lag occurs about every 2-3 seconds and lasts about 300-600 ms.
By tracing the delay I was able to figure out that it comes from the following line of code in the "drainEncoder" method:
mMuxer.writeSampleData(mTrackIndex, encodedData, mBufferInfo);
This line is called in a loop whenever the encoder has data available for muxing. Currently I don't record audio, so only the h264 stream is converted to an mp4 by the MediaMuxer.
I don't know if this has something to do with the delay, but it always occurs when the loop needs two iterations to dequeue all of the encoder's available data (to be even more specific, it always occurs in the first of those two iterations). In most cases one iteration is enough.
Since there is not much information online about these new APIs, any help is very much appreciated!
I suspect you're getting bitten by the MediaMuxer disk write. The best way to be sure is to run systrace during recording and see what's actually happening during the pause. (systrace docs, explanation, bigflake example -- as of right now only the latter is updated for Android 4.3)
If that's the case, you may be able to mitigate the problem by running the MediaMuxer instance on a separate thread, feeding the H.264 data to it through a synchronized queue.
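A rough sketch of that approach (class and field names are illustrative, not from any official sample): deep-copy each encoded sample so the codec's buffer can be released immediately, and let a dedicated thread absorb the blocking writes.

import java.nio.ByteBuffer;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
import android.media.MediaCodec;
import android.media.MediaMuxer;

class AsyncMuxerWriter {
    private static class Sample {
        final MediaCodec.BufferInfo info;
        final ByteBuffer data;
        Sample(MediaCodec.BufferInfo info, ByteBuffer data) { this.info = info; this.data = data; }
    }

    private final BlockingQueue<Sample> queue = new LinkedBlockingQueue<>();
    private final MediaMuxer muxer;
    private final int trackIndex;
    private final Thread writer;

    AsyncMuxerWriter(MediaMuxer muxer, int trackIndex) {
        this.muxer = muxer;
        this.trackIndex = trackIndex;
        writer = new Thread(this::drainQueue, "MuxerWriter");
        writer.start();
    }

    // Called from the encoder thread: copy the codec's output so the buffer
    // can be handed back to the encoder right away.
    void write(ByteBuffer encodedData, MediaCodec.BufferInfo info) throws InterruptedException {
        ByteBuffer copy = ByteBuffer.allocate(info.size);
        encodedData.position(info.offset);
        encodedData.limit(info.offset + info.size);
        copy.put(encodedData);
        copy.flip();
        MediaCodec.BufferInfo infoCopy = new MediaCodec.BufferInfo();
        infoCopy.set(0, info.size, info.presentationTimeUs, info.flags);
        queue.put(new Sample(infoCopy, copy));
    }

    private void drainQueue() {
        try {
            while (!Thread.currentThread().isInterrupted()) {
                Sample s = queue.take();
                // The potentially slow disk write now happens off the encoder thread.
                muxer.writeSampleData(trackIndex, s.data, s.info);
            }
        } catch (InterruptedException ignored) { }
    }

    void stop() { writer.interrupt(); }
}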
Do these pauses happen regularly, every 5 seconds? The CameraToMpegTest example configures the encoder to output an I-frame every 5 seconds (with an expected frame rate of 30fps), which results in a full-sized frame being output rather than tiny deltas.
As @fadden points out, this is a disk-write issue that occurs mostly on devices with slower flash write speeds, or when you write to the SD card.
I have written a solution on how to buffer MediaMuxer's write in a similar question here.
I am trying to create a real-time communication application (SIP-like, Skype-like, etc.) using Android's OpenSL ES implementation.
The thing is, I need to play a WAV audio file during the communication (I am doing this with MediaPlayer in Java).
Of course, MediaPlayer works fine when OpenSL ES is not running. But when it is, all hell breaks loose: the results are wildly inconsistent across devices.
On Nexus 7 (4.2.2): the WAV plays as it should.
On Nexus 4 (4.2.2) & Galaxy S3 (4.1.2): the WAV plays very quietly (even with the volume set to its maximum).
On Galaxy S4 (4.2.2): the WAV plays very loud and saturated (even with the volume set to its minimum).
I have created an example project to demonstrate the issue: if you play the sound (pweeet button) before starting the engine, it works. If you play it after, it depends on the device.
Here are my observations:
In OpenSL ES, if ONLY the player runs, OR only the recorder, everything works as expected. It is the combination of player & recorder that triggers the bug (in MainActivity.java, just comment out StartPlayer() [l.47] or StartRecorder() [l.48] to see that).
If I disable the player enqueuing (in Audio.cpp, comment out (*playerBufferQueue)->Enqueue [l.78-80]), everything works as expected.
If I don't set OpenSL ES to play on the voice stream (in Audio.cpp, comment out (*playerConfig)->SetConfiguration [l.146-187]), everything works as expected.
Of course, nothing above is a solution as I need to...
record from OpenSL ES as a voice communication
play from OpenSL ES in the voice stream
play from MediaPlayer in the media Stream
...all at the same time.
Finally, I should point out that, on the Galaxy S4, when I enqueue frames received from the network, OpenSL ES plays them so loud and saturated that the application is unusable. So I don't think the issue is on the MediaPlayer/Java side.
Thanks to Michael's comment, I have solved my problem by setting the audio mode to MODE_IN_COMMUNICATION.
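For reference, the fix boils down to something like this (context being your Activity or Service):

AudioManager audioManager =
        (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);
audioManager.setMode(AudioManager.MODE_IN_COMMUNICATION);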
People on the net quote latency numbers of around 88 ms for a Galaxy Nexus running ICS and 72 ms for a Nexus 7 running JB 4.1.1. I have tried with both AudioTrack and OpenSL ES and found that I cannot get less than 140 ms latency on either device. Am I missing something? I have set my output threads to URGENT_AUDIO priority, pass the audio in small chunks (e.g. 160 shorts), and use the minimum buffer size (in the AudioTrack case).
Are the quoted numbers only valid for short sounds played through SoundPool and not applicable to streaming PCM? To be clear, I am talking about playback only, not recording.
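For reference, this is roughly the AudioTrack setup I'm describing (fillWithAudio() is a placeholder for my actual sample source):

final int sampleRate = 44100;
final int minBuf = AudioTrack.getMinBufferSize(sampleRate,
        AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);
final AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate,
        AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
        minBuf, AudioTrack.MODE_STREAM);
track.play();

new Thread(new Runnable() {
    @Override public void run() {
        android.os.Process.setThreadPriority(
                android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);
        short[] chunk = new short[160];
        while (true) {
            fillWithAudio(chunk);                 // produce the next 160 samples
            track.write(chunk, 0, chunk.length);  // blocks until buffer space frees up
        }
    }
}, "AudioOut").start();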
This is Android's dirty little secret. It is not fixed; all you need to find the truth is an app, an ear, and a finger.