I am using the MediaCodec API to decode an H.264 video stream, with a SurfaceView as the output surface. The decoder is configured successfully without any errors. However, when I render the decoded video frame onto the SurfaceView using releaseOutputBuffer(bufferIndex, true), it throws a MediaCodec.CodecException, even though the video is rendered correctly.
Calling getDiagnosticInfo() and getErrorCode() on the exception object returns an error code of -34, but I can't find what this error code means in the docs. The documentation is also very unclear about when this exception is thrown.
Has anyone faced this exception/error code before? How can I fix this?
PS: Although the video plays fine, this exception is thrown on every releaseOutputBuffer(bufferIndex, true) call.
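For reference, this is roughly how I render and catch the exception (a trimmed sketch; TAG, decoder and bufferIndex come from my normal decode loop):
try {
    // Render the decoded frame to the SurfaceView's surface
    decoder.releaseOutputBuffer(bufferIndex, true);
} catch (MediaCodec.CodecException e) {
    // getDiagnosticInfo() is available from API 21, getErrorCode() from API 23
    Log.w(TAG, "releaseOutputBuffer failed: " + e.getDiagnosticInfo()
            + ", errorCode=" + e.getErrorCode()
            + ", recoverable=" + e.isRecoverable()
            + ", transient=" + e.isTransient());
}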
Android MediaCodec behaviour is very dependent on the device vendor. Samsung devices are incredibly problematic; other devices running the same code will run fine. This has been my life for the last 6 months.
The best approach, although it can feel wrong, is to try + catch + retry. There are 4 distinct places where MediaCodec will throw exceptions:
Configuration - NativeDecoder.Configure(...);
Start - NativeDecoder.Start();
Render output - NativeDecoder.ReleaseOutputBuffer(...);
Input - codec.QueueInputBuffer(...);
NOTE: my code is in Xamarin, but the calls map very closely to raw Java.
The way you configure your format description also matters. MediaCodec can crash on Nexus devices if you don't specify:
formatDescription.SetInteger(MediaFormat.KeyMaxInputSize, currentPalette.Width * currentPalette.Height);
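For reference, the plain Java equivalent would look roughly like this (width and height are whatever your stream reports; using width * height as the max input size mirrors the line above and is an assumption, not a documented requirement):
// "video/avc" = H.264; width/height come from your stream configuration
MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
// Reserve enough room for a full frame's worth of compressed input data
format.setInteger(MediaFormat.KEY_MAX_INPUT_SIZE, width * height);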
When you catch any exception you will need to ensure the MediaCodec is reset. Unfortunately reset() isn't available on older API levels, but you can simulate the same effect with:
#region Close + Release Native Decoder
void StopAndReleaseNativeDecoder() {
FlushNativeDecoder();
StopNativeDecoder();
ReleaseNativeDecoder();
}
void FlushNativeDecoder() {
if (NativeDecoder != null) {
try {
NativeDecoder.Flush();
} catch {
// ignore
}
}
}
void StopNativeDecoder() {
if (NativeDecoder != null) {
try {
NativeDecoder.Stop();
} catch {
// ignore
}
}
}
void ReleaseNativeDecoder() {
while (NativeDecoder != null) {
try {
NativeDecoder.Release();
} catch {
// ignore
} finally {
NativeDecoder = null;
}
}
}
#endregion
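As an aside, on API 21+ you can call MediaCodec.reset() directly instead of simulating it. A rough plain-Java sketch of that recovery path (the helper name is mine, not from the code above):
void recoverCodec(MediaCodec codec) {
    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.LOLLIPOP) {
        // reset() returns the codec to the Uninitialized state; reconfigure before reuse
        codec.reset();
    } else {
        // Older API levels: tear the codec down and recreate it later
        try { codec.flush(); } catch (Exception ignored) { }
        try { codec.stop(); } catch (Exception ignored) { }
        try { codec.release(); } catch (Exception ignored) { }
    }
}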
Once you have caught the error, the next time you pass new input you can check:
if (!DroidDecoder.IsRunning && streamView != null && streamView.VideoLayer.IsAvailable) {
DroidDecoder.StartDecoder(streamView.VideoLayer.SurfaceTexture);
}
DroidDecoder.DecodeH264FrameBuffer(payload, payloadSize, frameDuration, presentationTime, isKeyFrame);
Rendering to a TextureView seems to be the most stable option currently, but device fragmentation has really hurt Android in this area. We have found cheaper devices such as the Tesco Hudl to be among the most stable for video; we even had up to 21 concurrent videos on screen at one time. A Samsung S4 can manage around 4-6 depending on the resolution/fps, while something like an HTC can work as well as the Hudl. It has been a wake-up call, and made me realise that Samsung devices are essentially copying Apple's design while tinkering with the Android SDK and breaking a lot of functionality along the way.
It is most probably an issue with the codec you are using. Try using something like this to select a codec that supports your MIME type:
private static MediaCodecInfo selectCodec(String mime){
int numCodecs = MediaCodecList.getCodecCount();
for(int i = 0; i < numCodecs; i++){
MediaCodecInfo codecInfo = MediaCodecList.getCodecInfoAt(i);
if(!codecInfo.isEncoder()){
continue;
}
String[] types = codecInfo.getSupportedTypes();
for(int j = 0; j < types.length; j++){
if(types[j].equalsIgnoreCase(mime)){
return codecInfo;
}
}
}
return null;
}
And then set up your encoder with:
MediaCodecInfo codecInfo = selectCodec(MIME_TYPE);
mEncoder = MediaCodec.createByCodecName(codecInfo.getName());
That may resolve your error by ensuring that the Codec you've chosen is fully supported.
I'm developing an Android native app where I record the screen video stream (encoding it to video/avc with the native AMediaCodec library) and mux it with an AAC (audio/mp4a-latm) audio track. My code works just fine on several devices, but I have problems on a few (a Huawei P8 and a P10 lite, running Android 6.0.0 and 7.0 respectively, and a Nexus 5 running Android 6.0.1). The issue is that whenever I try to add the second track to the muxer (no matter the order I add them in), it fails with a -10000 error code.
I've simplified the problem to just muxing an audio and a video file together; the results are the same. In this simplified version I use two AMediaExtractors to get the audio and video formats to configure the AMediaMuxer, but when I add the second track I still get an error. Here's the code:
const auto videoSample = "videosample.mp4";
const auto audioSample = "audiosample.aac";
const std::string filePath = "muxed_file.mp4";
auto* extractorV = AMediaExtractor_new();
AMediaExtractor_setDataSource(extractorV, videoSample);
AMediaExtractor_selectTrack(extractorV, 0U); // here I take care to select the right "video/avc" track
auto* videoFormat = AMediaExtractor_getTrackFormat(extractorV, 0U);
auto* extractorA = AMediaExtractor_new();
AMediaExtractor_setDataSource(extractorA, audioSample);
AMediaExtractor_selectTrack(extractorA, 0U); // here I take care to select the right "mp4a-latm" track
auto* audioFormat = AMediaExtractor_getTrackFormat(extractorA, 0U);
auto fd = open(filePath.c_str(), O_WRONLY | O_CREAT | O_TRUNC, 0666);
auto* muxer = AMediaMuxer_new(fd, AMEDIAMUXER_OUTPUT_FORMAT_MPEG_4);
auto videoTrack = AMediaMuxer_addTrack(muxer, videoFormat); // the operation succeeds: videoTrack is 0
auto audioTrack = AMediaMuxer_addTrack(muxer, audioFormat); // error: audioTrack is -10000
AMediaExtractor_seekTo(extractorV, 0, AMEDIAEXTRACTOR_SEEK_CLOSEST_SYNC);
AMediaExtractor_seekTo(extractorA, 0, AMEDIAEXTRACTOR_SEEK_CLOSEST_SYNC);
AMediaMuxer_start(muxer);
Is there something wrong with my code? Is it something that is not supposed to work on Android prior to 8, or is it pure coincidence? I've read a lot of posts (especially by #fadden) here on SO, but I'm not able to figure it out.
Let me give you some context:
the failure is independent of the order in which I add the two tracks: it is always the second AMediaMuxer_addTrack() call that fails
the audio and video tracks themselves should be fine: when I mux only one of the tracks, everything works well even on the Huaweis and the Nexus 5, and I obtain correct output files with either the audio or the video track alone
I tried moving the AMediaExtractor_seekTo() calls to other positions, without success
the same code works just fine on other devices (a OnePlus 5 and a Nokia 7 plus, both running Android >= 8.0)
Just for completeness, this is the code I later use to obtain the output mp4 file:
AMediaMuxer_start(muxer);
// mux the VIDEO track
std::array<uint8_t, 256U * 1024U> videoBuf;
AMediaCodecBufferInfo videoBufInfo{};
videoBufInfo.flags = AMediaExtractor_getSampleFlags(extractorV);
bool videoEos{};
while (!videoEos) {
auto ts = AMediaExtractor_getSampleTime(extractorV);
videoBufInfo.presentationTimeUs = std::max(videoBufInfo.presentationTimeUs, ts);
videoBufInfo.size = AMediaExtractor_readSampleData(extractorV, videoBuf.data(), videoBuf.size());
if(videoBufInfo.presentationTimeUs == -1 || videoBufInfo.size < 0) {
videoEos = true;
} else {
AMediaMuxer_writeSampleData(muxer, videoTrack, videoBuf.data(), &videoBufInfo);
AMediaExtractor_advance(extractorV);
}
}
// mux the audio track
std::array<uint8_t, 256U * 1024U> audioBuf;
AMediaCodecBufferInfo audioBufInfo{};
audioBufInfo.flags = AMediaExtractor_getSampleFlags(extractorA);
bool audioEos{};
while (!audioEos) {
audioBufInfo.size = AMediaExtractor_readSampleData(extractorA, audioBuf.data(), audioBuf.size());
if(audioBufInfo.size < 0) {
audioEos = true;
} else {
audioBufInfo.presentationTimeUs = AMediaExtractor_getSampleTime(extractorA);
AMediaMuxer_writeSampleData(muxer, audioTrack, audioBuf.data(), &audioBufInfo);
AMediaExtractor_advance(extractorA);
}
}
AMediaMuxer_stop(muxer);
AMediaMuxer_delete(muxer);
close(fd);
AMediaFormat_delete(audioFormat);
AMediaExtractor_delete(extractorA);
AMediaFormat_delete(videoFormat);
AMediaExtractor_delete(extractorV);
I have a screen recording app that uses a MediaCodec encoder to encode the video frames. Here's one way I retrieve the video-encoder:
videoCodec = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
I then try to determine the best bitrate-mode that this encoder supports, with my order of preference being "Constant Quality" mode, Variable Bitrate mode, Constant Bitrate mode. This is how I try to do it:
MediaCodecInfo.CodecCapabilities capabilities = videoCodec.getCodecInfo().getCapabilitiesForType(MediaFormat.MIMETYPE_VIDEO_AVC);
MediaCodecInfo.EncoderCapabilities encoderCapabilities = capabilities.getEncoderCapabilities();
if (encoderCapabilities.isBitrateModeSupported(MediaCodecInfo.EncoderCapabilities.BITRATE_MODE_CQ)) {
Timber.i("Setting bitrate mode to constant quality");
videoFormat.setInteger(MediaFormat.KEY_BITRATE_MODE, MediaCodecInfo.EncoderCapabilities.BITRATE_MODE_CQ);
} else if (encoderCapabilities.isBitrateModeSupported(MediaCodecInfo.EncoderCapabilities.BITRATE_MODE_VBR)) {
Timber.w("Setting bitrate mode to variable bitrate");
videoFormat.setInteger(MediaFormat.KEY_BITRATE_MODE, MediaCodecInfo.EncoderCapabilities.BITRATE_MODE_VBR);
} else if (encoderCapabilities.isBitrateModeSupported(MediaCodecInfo.EncoderCapabilities.BITRATE_MODE_CBR)) {
Timber.w("Setting bitrate mode to constant bitrate");
videoFormat.setInteger(MediaFormat.KEY_BITRATE_MODE, MediaCodecInfo.EncoderCapabilities.BITRATE_MODE_CBR);
}
Running this on my Samsung Galaxy S7 ends up selecting VBR mode, i.e. Constant Quality mode is supposedly not supported. However, if I just set the BITRATE_MODE to Constant Quality, it not only works but in fact produces a better quality video than VBR mode.
So, if Constant Quality mode is apparently supported by this encoder, why do I get a false negative from isBitrateModeSupported()? Am I missing something here?
It's super late, but maybe I can help people who arrive here in the future.
As of Android API level 26, the EncoderCapabilities class only reports BITRATE_MODE_CQ as supported for MIMETYPE_AUDIO_FLAC codecs.
I don't know why, but this is hard-coded into the class:
/**
* Query whether a bitrate mode is supported.
*/
public boolean isBitrateModeSupported(int mode) {
for (Feature feat: bitrates) {
if (mode == feat.mValue) {
return (mBitControl & (1 << mode)) != 0;
}
}
return false;
}
private void applyLevelLimits() {
String mime = mParent.getMimeType();
if (mime.equalsIgnoreCase(MediaFormat.MIMETYPE_AUDIO_FLAC)) {
mComplexityRange = Range.create(0, 8);
mBitControl = (1 << BITRATE_MODE_CQ);
} else if (mime.equalsIgnoreCase(MediaFormat.MIMETYPE_AUDIO_AMR_NB)
|| mime.equalsIgnoreCase(MediaFormat.MIMETYPE_AUDIO_AMR_WB)
|| mime.equalsIgnoreCase(MediaFormat.MIMETYPE_AUDIO_G711_ALAW)
|| mime.equalsIgnoreCase(MediaFormat.MIMETYPE_AUDIO_G711_MLAW)
|| mime.equalsIgnoreCase(MediaFormat.MIMETYPE_AUDIO_MSGSM)) {
mBitControl = (1 << BITRATE_MODE_CBR);
}
}
Considering that BITRATE_MODE_CQ is 0, isBitrateModeSupported() will only return true for it when MIMETYPE_AUDIO_FLAC is the selected MIME type.
That is why it returns false, at least up to API level 26.
Why it is coded like this, I don't know.
I guess a simple way to check is to try to configure the encoder with the format you want and catch any exception, as sketched below.
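A minimal sketch of that idea (the MIME type, resolution, bitrate and other format values are placeholders of mine, not anything from the question):
MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280, 720);
format.setInteger(MediaFormat.KEY_BIT_RATE, 4000000);
format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);
format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
        MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
format.setInteger(MediaFormat.KEY_BITRATE_MODE,
        MediaCodecInfo.EncoderCapabilities.BITRATE_MODE_CQ);
MediaCodec codec = null;
try {
    codec = MediaCodec.createEncoderByType("video/avc");
    // configure() will typically throw if the codec rejects the requested bitrate mode
    codec.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
    // If we get here, BITRATE_MODE_CQ was at least accepted on this device
} catch (Exception e) {
    // Fall back to VBR/CBR here
} finally {
    if (codec != null) codec.release();
}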
I've compiled an FFMPEG library for use on Android with libx264 and using the NDK.
I want to encode an MPEG video file; however, the application fails when opening the encoder codec, in avcodec_open2.
The FFMPEG logs I receive from avcodec_open2 are below, with the function returning -22.
Picture size %ux%u is invalid.
ignoring invalid width/height values
Specified pix_fmt is not supported
On Windows this code works fine; it's only on Android that there is a failure. Any ideas why this would fail on Android?
if (!(codec = avcodec_find_encoder(AV_CODEC_ID_MPEG1VIDEO)))
{
return -1;
}
//Allocate context based on codec
if (!(context = avcodec_alloc_context3(codec)))
{
return -2;
}
//Setup Context
// put sample parameters
context->bit_rate = 4000000;
// resolution must be a multiple of two
context->width = 1280;
context->height = 720;
// frames per second
context->time_base = (AVRational){1,25};
context->inter_quant_bias = 96;
context->gop_size = 10;
context->max_b_frames = 1;
//IDs
context->pix_fmt = AV_PIX_FMT_YUV420P;
context->codec_id = AV_CODEC_ID_MPEG1VIDEO;
context->codec_type = AVMEDIA_TYPE_VIDEO;
if (AV_CODEC_ID_MPEG1VIDEO == AV_CODEC_ID_H264)
{
av_opt_set(context->priv_data, "preset", "slow", 0);
}
if ((result = avcodec_open2(context, codec, NULL)) < 0)
{
//Failed opening Codec!
}
This problem was caused by building FFMPEG with outdated source code.
I got the most recent source from https://www.ffmpeg.org/ and compiled it in the same way and the new library works fine.
Note: I hadn't considered the full implications regarding licenses of using libx264. I've since dropped it.
Is there a way to ask an Android device what audio and video Codecs it supports for encoding?
I found devices that do not support some of the codecs listed as mandatory in
http://developer.android.com/guide/appendix/media-formats.html
and there seem to be devices supporting additional codecs not listed there.
That could be interesting for you:
private static MediaCodecInfo selectCodec(String mimeType) {
int numCodecs = MediaCodecList.getCodecCount();
for (int i = 0; i < numCodecs; i++) {
MediaCodecInfo codecInfo = MediaCodecList.getCodecInfoAt(i);
if (!codecInfo.isEncoder()) {
continue;
}
String[] types = codecInfo.getSupportedTypes();
for (int j = 0; j < types.length; j++) {
if (types[j].equalsIgnoreCase(mimeType)) {
return codecInfo;
}
}
}
return null;
}
Found it here. As you can see, you get the number of installed codecs with MediaCodecList.getCodecCount(). With MediaCodecList.getCodecInfoAt(i) you get information about a specific codec from the list; codecInfo.getName(), for example, tells you the name of the codec.
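As a quick illustration of those calls, something along these lines (a rough sketch using the same pre-API-21 methods as the snippet above) would log the name and supported types of every encoder on the device:
int numCodecs = MediaCodecList.getCodecCount();
for (int i = 0; i < numCodecs; i++) {
    MediaCodecInfo info = MediaCodecList.getCodecInfoAt(i);
    if (!info.isEncoder()) {
        continue;
    }
    // e.g. "OMX.qcom.video.encoder.avc -> [video/avc]"
    Log.d("CodecList", info.getName() + " -> " + Arrays.toString(info.getSupportedTypes()));
}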
Is there a way to ask an Android device what audio and video Codecs it supports for encoding?
I really wish there were, but there is not, at least through ICS.
Jelly Bean offers a MediaCodec class. While it does not have a "give me a list of supported codecs", it does have createEncoderByType(), where you pass in a MIME type. Presumably, that will throw a RuntimeException or return null if your desired MIME type is not supported. And I cannot promise that just because MediaCodec reports that an encoder is available that it is guaranteed to work from, say, MediaRecorder.
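If you want to probe it that way, a minimal sketch might look like this (purely illustrative; as noted above, the exact failure mode is not well documented):
boolean canEncode(String mimeType) {
    MediaCodec codec = null;
    try {
        codec = MediaCodec.createEncoderByType(mimeType);
        return codec != null;
    } catch (Exception e) {
        // Codec creation failed, so assume no encoder for this MIME type
        return false;
    } finally {
        if (codec != null) {
            codec.release();
        }
    }
}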
The simplest way is using
MediaCodecList(MediaCodecList.ALL_CODECS).codecInfos
It returns an array of all the encoders and decoders available on your device.
And then, you can use filter to query the specific encoders and decoders you are looking for. For example:
MediaCodecList(MediaCodecList.ALL_CODECS).codecInfos.filter {
it.isEncoder && it.supportedTypes[0].startsWith("video")
}
This returns all available video encoders.
Here is updated code based on Jonson's answer, written in Kotlin and not using deprecated methods:
fun getCodecForMimeType(mimeType: String): MediaCodecInfo? {
val mediaCodecList = MediaCodecList(MediaCodecList.REGULAR_CODECS)
val codecInfos = mediaCodecList.codecInfos
for (i in codecInfos.indices) {
val codecInfo = codecInfos[i]
if (!codecInfo.isEncoder) {
continue
}
val types = codecInfo.supportedTypes
for (j in types.indices) {
if (types[j].equals(mimeType, ignoreCase = true)) {
return codecInfo
}
}
}
return null
}
I tried to get the SLDeviceVolumeItf interface of the RecorderObject on Android but I got the error: SL_RESULT_FEATURE_UNSUPPORTED.
I read that the Android implementation of OpenSL ES does not support volume setting for the AudioRecorder. Is that true?
If so, is there a workaround? I have a VoIP application that does not work well on the Galaxy Nexus because of the very high mic gain.
I also tried to get the SL_IID_ANDROIDCONFIGURATION interface to set the stream type to the new VOICE_COMMUNICATION audio source, but again I get error 12 (SL_RESULT_FEATURE_UNSUPPORTED).
// create audio recorder
const SLInterfaceID id[2] = { SL_IID_ANDROIDSIMPLEBUFFERQUEUE, SL_IID_ANDROIDCONFIGURATION };
const SLboolean req[2] = { SL_BOOLEAN_TRUE, SL_BOOLEAN_TRUE };
result = (*engine)->CreateAudioRecorder(engine, &recorderObject, &audioSrc, &audioSnk, 2, id, req);
if (SL_RESULT_SUCCESS != result) {
return false;
}
SLAndroidConfigurationItf recorderConfig;
result = (*recorderObject)->GetInterface(recorderObject, SL_IID_ANDROIDCONFIGURATION, &recorderConfig);
if(result != SL_RESULT_SUCCESS) {
error("failed to get SL_IID_ANDROIDCONFIGURATION interface. e == %d", result);
}
The recorderObject is created but I can't get the SL_IID_ANDROIDCONFIGURATION interface.
I tried it on Galaxy Nexus (ICS), HTC sense (ICS) and Motorola Blur (Gingerbread).
I'm using NDK version 6.
Now I can get the interface. I had to use NDK 8 and target platform android-14.
When I tried to use 10 as the target, I got an error compiling the native code (dirent.h was not found).
I ran into a similar problem. My results were returning the error code for not implemented. However, my problem was that I wasn't creating the recorder with the SL_IID_ANDROIDCONFIGURATION interface flag.
apiLvl = (*env)->GetStaticIntField(env, versionClass, sdkIntFieldID);
SLint32 streamType = SL_ANDROID_RECORDING_PRESET_GENERIC;
if(apiLvl > 10){
streamType = SL_ANDROID_RECORDING_PRESET_VOICE_COMMUNICATION;
I("set SL_ANDROID_RECORDING_PRESET_VOICE_COMMUNICATION");
}
result = (*recorderConfig)->SetConfiguration(recorderConfig, SL_ANDROID_KEY_RECORDING_PRESET, &streamType, sizeof(SLint32));
if (SL_RESULT_SUCCESS != result) {
return 0;
}
I also tried to find a way to change the gain in OpenSL, but it looks like there is no API/interface for that. I implemented a workaround with a simple shift-based gain multiplier:
void multiply_gain(void *buffer, int bytes, int gain_val)
{
int i = 0, j = 0;
short *buffer_samples = (short*)buffer;
for(i = 0, j = 0; i < bytes; i+=2,j++)
{
buffer_samples[j] = (buffer_samples[j] >> gain_val);
}
}
But here the gain is multiplied/divided (depending on whether you use << or >>) by a factor of 2 per shift. If you need a smoother gain curve, you need to write a more complex digital gain function.