A23 MediaCodec API - android

On an Android 4.2 device with an A23 processor, we are trying to implement hardware encoding of camera frames using the standard MediaCodec API, but we couldn't figure out which color format is the correct one to use. For example, when we use MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar, which is the value returned by the Android API and is supposed to work, the encoder initializes fine, but logcat shows the error
E/omx_venc( 108): do not support this format: 21
We have tried many different formats, but none of them works.
/**
 * Video encoding is done by a MediaCodec.
 */
protected void prepareHwEncoder() throws RuntimeException, IOException
{
    Log.d(LOG_TAG, "Video encoded using the MediaCodec API with a buffer");

    int encoderColorFormat = MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar;
    mEncoder = MediaCodec.createEncoderByType(MIME_TYPE);
    MediaFormat mediaFormat = MediaFormat.createVideoFormat(MIME_TYPE, imageWidth, imageHeight);
    mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, bitRate);
    mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, frameRate);
    mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, encoderColorFormat);
    mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, IFRAME_INTERVAL);
    mEncoder.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
}
Any example code that works with the camera would be highly appreciated.

You need to query the codec to determine what color formats it supports. Typical color formats are listed here. Note that all supported MediaCodec color formats have the U/V planes reversed relative to the Camera color formats.
You can find an example of this in the CTS EncodeDecodeTest -- see selectColorFormat().
Unfortunately, in Android 4.2, there's no guarantee that a reasonable color format is supported.
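As a rough illustration of that query, here is a minimal sketch (not the CTS code itself; it assumes the pre-API-21 MediaCodecList calls appropriate for Android 4.2 and an encoder MIME type such as "video/avc"):

// Sketch of the kind of query selectColorFormat() performs; returns 0 if no
// recognized format is found, so the caller must handle that case.
private static int selectEncoderColorFormat(String mimeType) {
    for (int i = 0; i < MediaCodecList.getCodecCount(); i++) {
        MediaCodecInfo info = MediaCodecList.getCodecInfoAt(i);
        if (!info.isEncoder()) continue;
        for (String type : info.getSupportedTypes()) {
            if (!type.equalsIgnoreCase(mimeType)) continue;
            MediaCodecInfo.CodecCapabilities caps = info.getCapabilitiesForType(mimeType);
            for (int colorFormat : caps.colorFormats) {
                // Prefer the planar / semi-planar YUV420 formats used for buffer input.
                if (colorFormat == MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar
                        || colorFormat == MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar) {
                    return colorFormat;
                }
            }
        }
    }
    return 0;
}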

Related

Which AVC profile/level will be set in Android MediaCodec if we don't set this value manually?

In Android's ACodec.cpp, setupAVCEncoderParameters reads the profile and level from msg (msg->findInt32("profile", &profile) / msg->findInt32("level", &level)), and the msg appears to come from the format passed to MediaCodec.configure(). So I think we can set the profile/level manually before MediaCodec.configure(), like below:
format.setInteger(MediaFormat.KEY_PROFILE, MediaCodecInfo.CodecProfileLevel.AVCProfileHigh);
format.setInteger(MediaFormat.KEY_LEVEL, MediaCodecInfo.CodecProfileLevel.AVCLevel5);
Of course, I agree it's not a good idea, because we don't know whether our device supports the profile/level we set. Also, I find that most example code doesn't set these values at all.
For example:
MediaFormat format = MediaFormat.createVideoFormat(MIME_TYPE, mWidth, mHeight);
format.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface); // API >= 18
format.setInteger(MediaFormat.KEY_BIT_RATE, calcBitRate());
format.setInteger(MediaFormat.KEY_FRAME_RATE, FRAME_RATE);
format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 10);
mMediaCodec = MediaCodec.createEncoderByType(MIME_TYPE);
mMediaCodec.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
Maybe they will be set automatically? Will MediaCodec query which profile/level the current device supports and choose one automatically? If the device supports more than one profile/level, which one will be selected, the lower one (e.g. baseline) or the higher one (e.g. high)?
In most cases (as far as I know) it will be baseline, since higher profiles (with B-frames enabled) require you as a caller to be prepared to handle them correctly (a trivial/naive application might make assumptions that don't hold up with B-frames), and an encoder with B-frames has higher latency.
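For completeness, a minimal sketch (not from the original answer) of how to check which AVC profile/level pairs a device's encoders actually report before trying to set them manually:

// Enumerate the profile/level pairs the "video/avc" encoders advertise.
for (int i = 0; i < MediaCodecList.getCodecCount(); i++) {
    MediaCodecInfo info = MediaCodecList.getCodecInfoAt(i);
    if (!info.isEncoder()) continue;
    for (String type : info.getSupportedTypes()) {
        if (!type.equals("video/avc")) continue;
        MediaCodecInfo.CodecCapabilities caps = info.getCapabilitiesForType(type);
        for (MediaCodecInfo.CodecProfileLevel pl : caps.profileLevels) {
            // e.g. compare pl.profile against MediaCodecInfo.CodecProfileLevel.AVCProfileHigh
            Log.d("ProfileCheck", info.getName() + " profile=" + pl.profile + " level=" + pl.level);
        }
    }
}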

MediaCodec is giving a storeMetaDataInBuffers trace error

I'm getting the following error in logcat while encoding via MediaCodec on Android.
The actual encoding works fine and the output is produced correctly, so I can't really understand why I get this trace. Is it a harmless error trace, or is there something I'm missing?
E/ACodec(6438): [OMX.qcom.video.encoder.h263] storeMetaDataInBuffers (output) failed w/ err -1010
Here is the code that produces the trace:
final int BIT_RATE = 4000000;
final int FRAME_RATE = 30;
final int IFRAME_INTERVAL = 5;
final String MIME_TYPE = "video/avc";
final MediaFormat format = MediaFormat.createVideoFormat(MIME_TYPE, width, height);
format.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
format.setInteger(MediaFormat.KEY_BIT_RATE, BIT_RATE);
format.setInteger(MediaFormat.KEY_FRAME_RATE, FRAME_RATE);
format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, IFRAME_INTERVAL);
MediaCodec encoder = MediaCodec.createEncoderByType(MIME_TYPE);
//---------------------------------
// NEXT LINE PRODUCES THE TRACE
encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
//---------------------------------
It's harmless, most devices show this. See Q12 at http://bigflake.com/mediacodec/.
This only indicates that the first way of signaling surface encoding wasn't supported by the encoder, so another way of setting it up was used. (There are multiple ways for the MediaCodec/ACodec layer to tell the individual encoder about it.)
The previous answer has indicated that the warning is quite harmless. Here is some additional information on the log message and the reasons behind it.
This trace in the log indicates that the encoder does not support storeMetadataInBuffers on the output port. For an encoder, this mode could be supported on both the input and output ports.
On the input port, this mode is employed to pass raw image data in metadata format, i.e. to pass only a reference to the gralloc handle, which the encoder can then use to access the data. This is employed by the camera and/or other screen-recording applications to pass a reference to YUV data to the encoder.
Metadata mode was also supported on the output port for potential encapsulation of the output bitstream data. For example, when a Miracast or Wi-Fi Display session is active and the data being encoded is secure, such as premium content, it becomes necessary to protect the data between the encoder and the HDCP encryption module, and the metadata format becomes handy there. Not many encoders support this mode, hence you observe this warning.

Problems of using MediaCodec.getOutputFormat() for an encoder in Android 4.1/4.2 devices

I'm trying to use MediaCodec to encode frames (either from the camera or from a decoder) into a video.
When processing the encoder output with dequeueOutputBuffer(), I expect to receive the return index MediaCodec.INFO_OUTPUT_FORMAT_CHANGED, so I can call getOutputFormat() to get the encoder output format and pass it to the ffmpeg muxer I'm currently using.
I have tested some tablet/phone devices with Android versions 4.1~4.3. All of them have at least one hardware AVC video encoder, which is used in the test. On the devices with version 4.3, the encoder gives MediaCodec.INFO_OUTPUT_FORMAT_CHANGED before writing the encoded data, as expected, and the output format returned from getOutputFormat() can be used by the muxer correctly. On devices with 4.2.2 or lower, the encoder never gives MediaCodec.INFO_OUTPUT_FORMAT_CHANGED, while it can still output the encoded elementary stream, so the muxer cannot know the exact output format.
I want to ask the following questions:
Does the behavior of the encoder (giving MediaCodec.INFO_OUTPUT_FORMAT_CHANGED or not before outputting encoded data) depend on the Android API level or on the chips of individual devices?
If the encoder writes data before MediaCodec.INFO_OUTPUT_FORMAT_CHANGED appears, is there any way to get the output format of the encoded data?
On these devices the encoder still outputs the codec config data (with the flag MediaCodec.BUFFER_FLAG_CODEC_CONFIG) before the encoded data. It is mostly used to configure a decoder, but can I derive the output format from the codec config data?
I have tried these solutions to get the output format, but both failed:
Calling getOutputFormat() repeatedly during the whole encode process. However, every call throws IllegalStateException without the appearance of MediaCodec.INFO_OUTPUT_FORMAT_CHANGED.
Using the initial MediaFormat that was used to configure the encoder at the beginning, as in the example:
m_init_encode_format = MediaFormat.createVideoFormat(m_encode_video_mime, m_frame_width, m_frame_height);
int encode_bit_rate = 3000000;
int encode_frame_rate = 15;
int encode_iframe_interval = 2;
m_init_encode_format.setInteger(MediaFormat.KEY_COLOR_FORMAT, m_encode_color_format);
m_init_encode_format.setInteger(MediaFormat.KEY_BIT_RATE, encode_bit_rate);
m_init_encode_format.setInteger(MediaFormat.KEY_FRAME_RATE, encode_frame_rate);
m_init_encode_format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, encode_iframe_interval);
m_encoder = MediaCodec.createByCodecName(m_video_encoder_codec_info.getName());
m_encoder.configure(m_init_encode_format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
// Assume m_init_encode_format is the output format of the encoder
However, this fails, since the output format of the encoder is still "changed" from the initial one.
Please help me understand the behavior of an encoder, and whether there is any way to query the output format when the required MediaCodec.INFO_OUTPUT_FORMAT_CHANGED is missing.
By comparing the output format and the codec config data, the missing fields are csd-0, csd-1, and a "what" field with value = 1869968451.
(I do not understand the "what" field. It seems to be a constant and is not required. Can anyone tell me about its meaning?)
If I parse the codec config data as the csd-1 field (last 8 bytes) and the csd-0 field (remaining bytes), it seems that the muxer works correctly and outputs a video playable on all of the test devices.
(But I want to ask: is this 8-byte assumption correct, or there is more reliable way to parse the data?)
However, I have another problem: if I decode the video with Android MediaCodec again, the BufferInfo.presentationTimeUs value obtained from dequeueOutputBuffer() is 0 for most of the decoded frames. Only the last few frames have correct times. The sample time obtained from MediaExtractor.getSampleTime() is correct and exactly the value I set for the encoder/muxer, but the decoded frame time is not. This issue only happens on 4.2.2 or lower devices.
It is strange that the frame time is incorrect, yet the video plays back at the correct speed on the device. (Most of the 4.2.2-or-lower devices I've tested have only one AVC video decoder.) Do I need to set other fields that may affect the presentation time?
The behavior of MediaCodec encoders was changed in Android 4.3 to accommodate the introduction of the MediaMuxer class. In Android 4.3, you will always receive INFO_OUTPUT_FORMAT_CHANGED from the encoder. In previous releases, you will not. (I've updated the relevant FAQ entry.)
There is no way to query the encoder for the MediaFormat.
I haven't used an ffmpeg-based muxer, so I'm not sure what information it needs. If it's looking for the csd-0 / csd-1 keys, you can extract those from the CODEC_CONFIG packet (I think you have to parse the SPS / PPS values out and place them in the separate keys). Examining the contents of the MediaFormat on a 4.3 device will show you which fields you're lacking.
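As a hedged illustration of that last point (variable names such as videoCodec and bufferInfo follow the next answer's snippet and are assumptions, not part of the original post), this is roughly what reading csd-0/csd-1 looks like on a 4.3+ device once INFO_OUTPUT_FORMAT_CHANGED arrives:

// Sketch for API 18+ devices that deliver INFO_OUTPUT_FORMAT_CHANGED.
int index = videoCodec.dequeueOutputBuffer(bufferInfo, 10000 /* us */);
if (index == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
    MediaFormat outFormat = videoCodec.getOutputFormat();
    ByteBuffer csd0 = outFormat.getByteBuffer("csd-0");   // SPS for video/avc
    ByteBuffer csd1 = outFormat.getByteBuffer("csd-1");   // PPS for video/avc
    Log.d("EncoderFormat", "encoder output format: " + outFormat);
    // hand csd0/csd1 (or the whole MediaFormat) to the muxer here
}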
To initialize the ffmpeg muxer for video correctly, I use the following:
ByteBuffer[] outputBuffers = videoCodec.getOutputBuffers();
int outputBufferIndex = videoCodec.dequeueOutputBuffer(bufferInfo, -1);
if ((bufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
    ByteBuffer outputBuffer = outputBuffers[outputBufferIndex];
    headerData = new byte[bufferInfo.size];
    outputBuffer.get(headerData);
    // jni call
    WriteVideoHeader(headerData, headerData.length);
    videoCodec.releaseOutputBuffer(outputBufferIndex, false);
}
In jni I use something like this:
jint Java_com_an_FileWriterEx_WriteVideoHeader(JNIEnv* env, jobject thiz, jbyteArray data, jint datasize)
{
    jboolean isCopy;
    jbyte* rawjBytes = (*env)->GetByteArrayElements(env, data, &isCopy);
    WriteVideoHeaderInternal(env, m_pFormatCtx, m_pVideoStream, rawjBytes, datasize);
    (*env)->ReleaseByteArrayElements(env, data, rawjBytes, 0);
    return 0;
}
jint WriteVideoHeaderInternal(JNIEnv* env, AVFormatContext* pFormatCtx, AVStream* pVideoStream, jbyte* data, jint datasize)
{
    jboolean bNoError = JNI_TRUE;
    jbyte* pExtDataBuffer = av_malloc(datasize);
    if (!pExtDataBuffer)
    {
        LOGI("av alloc error\n");
        bNoError = JNI_FALSE;
    }
    if (bNoError)
    {
        memcpy(pExtDataBuffer, data, datasize * sizeof(jbyte));
        pVideoStream->codec->extradata = pExtDataBuffer;
        pVideoStream->codec->extradata_size = datasize;
    }
    return bNoError ? 0 : -1;   /* report success or failure to the caller */
}
For the parsing of the codec config data, it is wrong to assume that the last 8 bytes are the PPS data. The data must be parsed according to the start codes and the nal_unit_type.
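To make that concrete, here is a hedged sketch (the helper name is made up, and it assumes the 4-byte Annex-B start code 00 00 00 01; some encoders may emit 3-byte start codes) that splits an H.264 codec-config buffer into SPS and PPS by scanning start codes and checking nal_unit_type:

import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// nal_unit_type is the low 5 bits of the byte after the start code: 7 = SPS, 8 = PPS.
static void splitSpsPps(byte[] config, List<byte[]> sps, List<byte[]> pps) {
    List<Integer> starts = new ArrayList<Integer>();
    for (int i = 0; i + 3 < config.length; i++) {
        if (config[i] == 0 && config[i + 1] == 0 && config[i + 2] == 0 && config[i + 3] == 1) {
            starts.add(i);                             // offset of each 00 00 00 01 start code
        }
    }
    starts.add(config.length);                         // sentinel: end of the buffer
    for (int n = 0; n + 1 < starts.size(); n++) {
        byte[] nal = Arrays.copyOfRange(config, starts.get(n), starts.get(n + 1));
        if (nal.length <= 4) continue;                 // skip empty NAL units
        int nalType = nal[4] & 0x1f;
        if (nalType == 7) sps.add(nal);                // SPS -> csd-0
        else if (nalType == 8) pps.add(nal);           // PPS -> csd-1
    }
}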

Not able to use H264 (video/avc) Encoder on Intel x86 device, Android 4.2.2

I intend to encode raw YUV data to H.264 data, for which I'm using Android's MediaCodec interface. Below is the snippet I have in place for this:
MediaCodec mEncoder = MediaCodec.createEncoderByType("video/avc");
MediaFormat mVideoFormat = MediaFormat.createVideoFormat("video/avc", 640 , 480);
mVideoFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, CodecCapabilities.COLOR_FormatYUV420SemiPlanar);
mVideoFormat.setInteger(MediaFormat.KEY_BIT_RATE, 64000);
mVideoFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 24);
mVideoFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);
mVideoFormat.setInteger(MediaFormat.KEY_AAC_PROFILE, MediaCodecInfo.CodecProfileLevel.AVCProfileBaseline);
mEncoder.configure(mVideoFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
mEncoder.start();
ByteBuffer[] mInputVideoBuffers = mEncoder.getInputBuffers();
ByteBuffer[] mOutputVideoBuffers = mEncoder.getOutputBuffers();
Although it works well on ARM devices, it fails on the Intel x86 device I have (Samsung Tab 3) with the message below:
E/ACodec(21756): [OMX.Intel.VideoEncoder.AVC] ERROR(0x80001001)
E/MediaCodec(21756): Codec reported an error. (omx error 0x80001001,
internalError -2147483648)
Any help on this would be useful.
I found the fix for the issue: I did not release the codec before creating another one. Multiple instances of the encoder are not permissible on the Samsung Tab 3 running on an Intel x86 chip. This behaviour is pretty inconsistent across Android devices, judging by the other devices on which I've tested my code.
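For illustration, a minimal sketch of that fix, assuming mEncoder holds the previous MediaCodec instance:

// Release the previous codec instance before creating a new one.
if (mEncoder != null) {
    mEncoder.stop();      // only after encoding has finished / EOS has been signaled
    mEncoder.release();   // frees the underlying OMX component
    mEncoder = null;
}
mEncoder = MediaCodec.createEncoderByType("video/avc");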
The code shown won't work on some ARM devices either; COLOR_FormatYUV420SemiPlanar isn't supported everywhere.
You need to detect the set of available color formats at runtime. See the isRecognizedFormat() method in the CTS EncodeDecodeTest. To pass CTS, the device must allow one of those formats. There are five listed, but really there are only two (planar and semi-planar), and they're not radically different.
For Intel devices, Encoder.getOutputFormat() was crashing, so I created a MediaFormat manually and supplied it directly to the muxer:
MediaFormat mVideoFormat = MediaFormat.createVideoFormat("video/avc", 640 , 480);
mVideoFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
mVideoFormat.setInteger(MediaFormat.KEY_BIT_RATE, 64000);
mVideoFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 24);
mVideoFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);
mTrackIndex = mMuxer.addTrack(mVideoFormat );

How to set encoder buffer size created by MediaCodec

I am trying to use a Nexus device to test encoding with the MediaCodec APIs. I can see that the input buffer provided by the encoder is 119040 bytes (by logging inputBuffers.capacity), but the size of the frame, i.e. my input, is 460800 bytes. I get a buffer-overflow error at inputBuffer.put(). So I wanted to set the input buffer size to 460800. The only API I could find is BufferInfo.set(), but I cannot find a way to attach this setting to the encoder. Could someone help? Thanks!
encoder = MediaCodec.createByCodecName(codecInfo.getName());
ByteBuffer[] inputBuffers = encoder.getInputBuffers();
int inputBufferIndex = encoder.dequeueInputBuffer(-1);   // blocks until an input buffer is available
if (inputBufferIndex >= 0) {
    ByteBuffer inputBuffer = inputBuffers[inputBufferIndex];
    inputBuffer.clear();
    inputBuffer.put(input);
    encoder.queueInputBuffer(inputBufferIndex, 0, input.length, 0, 0);
}
I'm late to the party, but based on the Android MediaCodec documentation, I think the correct way to change the buffer size is to adjust KEY_MAX_INPUT_SIZE, something like:
int width=800;
int height=480;
encoder = MediaCodec.createByCodecName(codecInfo.getName());
format = MediaFormat.createVideoFormat ("video/avc", width, height);
format.setInteger(MediaFormat.KEY_MAX_INPUT_SIZE, 655360); // 640 KB, but adjust it as you need
You don't set the size of the input buffer. The size is determined by the MediaFormat, specifically the width, height, and color format. If your input data has a different size, you will need to convert it to the format that the codec is expecting.
This isn't entirely trivial but is doable. For examples, see the buffer-to-buffer tests in the CTS EncodeDecodeTest. The test queries the codec to see what color format is supported, generates frames in that format, submits them to the encoder, then decodes the video and confirms that what comes out is the same as what went in.
The test generally requires API 18 (Android 4.3), but the buffer-to-buffer code will work in API 16. Whether or not it works on any given device is a different question -- since the CTS test didn't exist until API 18, it's possible for pre-4.3 devices to do this wrong.
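As a rough sanity check of the numbers in the question, assuming one of the YUV420 buffer color formats:

// A YUV420 frame (planar or semi-planar) occupies width * height * 3 / 2 bytes.
int width = 640;
int height = 480;
int frameSize = width * height * 3 / 2;   // 460800 bytes, matching the question's input size
// An input buffer of only 119040 bytes suggests the MediaFormat was created with a
// smaller width/height than the frames actually being submitted.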
