I'm getting the following error in logcat while encoding via MediaCodec on Android.
The actual encoding works fine and the output is produced correctly, so I can't really understand why I get this trace. Is it a harmless error trace, or is there something I'm missing?
E/ACodec(6438): [OMX.qcom.video.encoder.h263] storeMetaDataInBuffers (output) failed w/ err -1010
Here is the code that produces the trace:
final int BIT_RATE = 4000000;
final int FRAME_RATE = 30;
final int IFRAME_INTERVAL = 5;
final String MIME_TYPE = "video/avc";
final MediaFormat format = MediaFormat.createVideoFormat(MIME_TYPE, width, height);
format.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
format.setInteger(MediaFormat.KEY_BIT_RATE, BIT_RATE);
format.setInteger(MediaFormat.KEY_FRAME_RATE, FRAME_RATE);
format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, IFRAME_INTERVAL);
MediaCodec encoder = MediaCodec.createEncoderByType(MIME_TYPE);
//---------------------------------
// NEXT LINE PRODUCES THE TRACE
encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
//---------------------------------
It's harmless, most devices show this. See Q12 at http://bigflake.com/mediacodec/.
This only indicates that the first method of signaling surface encoding wasn't supported by the encoder, so it fell back to some other way of setting it up. (There are multiple ways for the MediaCodec/ACodec layer to tell the individual encoder about it.)
The previous answer indicated that the warning is quite harmless. Here is some additional information on the log message and the reasons behind it.
This trace in the log indicates that the encoder does not support storeMetaDataInBuffers on the output port. For an encoder, this mode can be supported on both the input and output ports.
This mode is employed on the input port to pass raw image data in metadata format, i.e. to pass only a reference to the gralloc handle, which the encoder can use to access the data. This is employed by the camera and/or screen-recording applications to pass a reference to the YUV data to the encoder.
Metadata mode was also supported on the output port for potential encapsulation of the output bitstream data. For example, when a Miracast / Wi-Fi Display session is active and the data being encoded is secure, such as premium content, the data has to be protected between the encoder and the HDCP encryption module, and the metadata format comes in handy there. Not many encoders support this mode, hence the warning you observe.
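For context, at the application level the input-port metadata path is what backs Surface input to the encoder. A minimal sketch (API 18+; it mirrors the format set up in the question above, and width/height are placeholders):
// Surface input to the encoder: frames are passed as gralloc buffer references
// rather than copied YUV data (this is the input-port metadata path).
MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
        MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
format.setInteger(MediaFormat.KEY_BIT_RATE, 4000000);
format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);
MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
// The input Surface must be requested after configure() and before start().
Surface inputSurface = encoder.createInputSurface();
encoder.start();
// Anything rendered into inputSurface (camera preview, GLES) is fed to the encoder.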
I'm trying to decode video with non-default colorimetry using the MediaCodec NDK API. I provide the SPS and PPS in the csd-0 and csd-1 buffers respectively, but that information does not seem to affect how the decoded video looks.
First, I initialize the AMediaFormat
AMediaFormat * format = AMediaFormat_new ();
AMediaFormat_setString (format, AMEDIAFORMAT_KEY_MIME, "video/avc");
AMediaFormat_setInt32 (format, AMEDIAFORMAT_KEY_WIDTH, this->width);
AMediaFormat_setInt32 (format, AMEDIAFORMAT_KEY_HEIGHT, this->height);
AMediaFormat_setInt32 (format, AMEDIAFORMAT_KEY_FRAME_RATE, this->fps_n);
Then I provide the SPS and PPS buffers for my video stream
uint8_t sps[] = { 0,0,0,1,103,100,0,52,172,43,64,8,0,24,54,2,220,4,32,6,148,0,0,15,160,0,7,83,2,61,42,128 };
uint8_t pps[] = { 0,0,0,1,104,238,60,176 };
const size_t sps_len = 32;
const size_t pps_len = 8;
AMediaFormat_setBuffer (format, "csd-0", sps, sps_len);
AMediaFormat_setBuffer (format, "csd-1", pps, pps_len);
And finally, I configure and start the codec
AMediaCodec_configure (codec, format, window, NULL, 0);
AMediaCodec_start (codec);
AMediaFormat_delete (format);
I would now begin queueing input buffers for decompression as usual. This runs without any errors in the logs, but the decoded video looks exactly the same regardless of what I set for the transfer characteristics (above it is set to 8, i.e. linear gamma).
Does anyone have any suggestions on why the media codec doesn't seem to be actually using the colorimetry data that I have provided?
The color-space information in the H.264 stream is informational metadata only. So your observation is correct and the decompressor works as it should.
You will get the decompressed bitmap in the same color-space as it was encoded.
Usually the decompressor doesn't handle or care about color spaces. You have to do the color-space conversion yourself after decompression.
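To make that last point concrete, here is a minimal, hypothetical sketch (in Java for brevity, not the NDK API) of the kind of per-pixel conversion you have to apply yourself after decoding. It assumes BT.601 limited range; swap in the matrix and transfer function that your stream's VUI actually signals.
// BT.601 limited-range YUV -> ARGB for a single pixel, applied after decoding.
static int yuvToArgb(int y, int u, int v) {
    float yf = 1.164f * (y - 16);
    int r = clamp((int) (yf + 1.596f * (v - 128)));
    int g = clamp((int) (yf - 0.813f * (v - 128) - 0.391f * (u - 128)));
    int b = clamp((int) (yf + 2.018f * (u - 128)));
    return 0xFF000000 | (r << 16) | (g << 8) | b;
}

static int clamp(int c) {
    return c < 0 ? 0 : (c > 255 ? 255 : c);
}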
I am developing an H.264 decoder using the MediaCodec API. I am trying to call the MediaCodec Java API from the JNI layer inside a function like:
void Decompress(const unsigned char *encodedInputdata, unsigned int inputLength, unsigned char **outputDecodedData, int &width, int &height) {
// encodedInputdata is encoded H.264 remote stream
// .....
// outputDecodedData = call JNI function of MediaCodec Java API to decode
// .....
}
Later I will send the outputDecodedData to my existing video rendering pipeline and render on Surface.
I hope I will be able to write a Java function to decode the input stream, but these would be the challenges:
This resource states that:
...you can't do anything with the decoded video frame but render them to surface
Here a Surface has been passed to decoder.configure(format, surface, null, 0) in order to render the output ByteBuffer on the surface, and it is claimed that we can't use this buffer, only render it, due to the API limit.
So, will I be able to send the output ByteBuffer to the native layer, cast it to unsigned char*, and pass it to my rendering pipeline, instead of passing a Surface to configure()?
I see two fundamental problems with your proposed function definition.
First, MediaCodec operates on access units (NAL units for H.264), not arbitrary chunks of data from a stream, so you need to pass in one NAL unit at a time. Once the chunk is received, the codec may want to wait for additional frames to arrive before producing any output. You cannot in general pass in one frame of input and wait to receive one frame of output.
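To make that first point concrete, a typical decode loop drains input and output independently, and INFO_TRY_AGAIN_LATER on the output side is normal while the codec buffers frames. A rough sketch (API 16 style; decoder, nalUnit and presentationTimeUs are placeholders):
ByteBuffer[] inputBuffers = decoder.getInputBuffers();
MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();

int inIndex = decoder.dequeueInputBuffer(10000);          // feed one access unit per buffer
if (inIndex >= 0) {
    ByteBuffer inBuf = inputBuffers[inIndex];
    inBuf.clear();
    inBuf.put(nalUnit);                                   // nalUnit: one H.264 NAL unit
    decoder.queueInputBuffer(inIndex, 0, nalUnit.length, presentationTimeUs, 0);
}

int outIndex = decoder.dequeueOutputBuffer(info, 10000);
if (outIndex >= 0) {
    // A decoded frame is available; it may belong to input queued several calls ago.
    decoder.releaseOutputBuffer(outIndex, true);          // true = render to the configured Surface
} else if (outIndex == MediaCodec.INFO_TRY_AGAIN_LATER) {
    // No output yet: the codec is still buffering, so keep feeding input.
}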
Second, as you noted, the ByteBuffer output is YUV-encoded in one of several color formats. The format varies from device to device; Qualcomm devices notably use their own proprietary format. (It has been reverse-engineered, though, so if you search around you can find some code to unravel it.)
The common workaround is to send the video frames to a SurfaceTexture, which converts them to GLES "external" textures. These can be manipulated in various ways, or rendered to a pbuffer and extracted with glReadPixels().
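A rough sketch of that route (assuming an EGL context with a pbuffer surface is already current on this thread; the external-texture shader and the quad drawing are omitted, and decoder, format, width and height come from your own setup):
// Route decoder output through a SurfaceTexture, then read the pixels back with GLES.
int[] tex = new int[1];
GLES20.glGenTextures(1, tex, 0);
GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, tex[0]);

SurfaceTexture surfaceTexture = new SurfaceTexture(tex[0]);
Surface decoderSurface = new Surface(surfaceTexture);
decoder.configure(format, decoderSurface, null, 0);       // decoder renders into the texture

// After releaseOutputBuffer(index, true) and onFrameAvailable() for each frame:
surfaceTexture.updateTexImage();                          // latch the newest frame
// ... draw a full-screen quad sampling the external texture into the pbuffer ...
ByteBuffer pixels = ByteBuffer.allocateDirect(width * height * 4)
        .order(ByteOrder.nativeOrder());
GLES20.glReadPixels(0, 0, width, height,
        GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, pixels); // RGBA frame, ready to hand to native code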
I'm trying to use MediaCodec to encode frames (either by camera or decoder) into a video.
When processing the encoder output with dequeueOutputBuffer(), I expect to receive the return index MediaCodec.INFO_OUTPUT_FORMAT_CHANGED, so that I can call getOutputFormat() and feed the encoder output format to the ffmpeg muxer I am currently using.
I have tested several tablet/phone devices with Android versions 4.1~4.3. All of them have at least one hardware AVC video encoder, which is used in the test. On the devices running 4.3, the encoder gives MediaCodec.INFO_OUTPUT_FORMAT_CHANGED before writing the encoded data, as expected, and the output format returned by getOutputFormat() can be used by the muxer correctly. On devices running 4.2.2 or lower, the encoder never gives MediaCodec.INFO_OUTPUT_FORMAT_CHANGED, although it can still output the encoded elementary stream, so the muxer cannot know the exact output format.
I want to ask the following questions:
Does the behavior of the encoder (whether or not it gives MediaCodec.INFO_OUTPUT_FORMAT_CHANGED before outputting encoded data) depend on the Android API level or on the chipset of the individual device?
If the encoder writes data before MediaCodec.INFO_OUTPUT_FORMAT_CHANGED appears, is there any way to get the output format of the encoded data?
The encoder still outputs the codec config data (with the flag MediaCodec.BUFFER_FLAG_CODEC_CONFIG) on these devices before the encoded data. It is mostly used to configure a decoder, but can I derive the output format from the codec config data?
I have tried these solutions to get the output format but failed:
Call getOutputFormat() repeatedly throughout the encoding process. However, every call throws IllegalStateException as long as MediaCodec.INFO_OUTPUT_FORMAT_CHANGED has not appeared.
Use the initial MediaFormat used to configure the encoder at the beginning, as in this example:
m_init_encode_format = MediaFormat.createVideoFormat(m_encode_video_mime, m_frame_width, m_frame_height);
int encode_bit_rate = 3000000;
int encode_frame_rate = 15;
int encode_iframe_interval = 2;
m_init_encode_format.setInteger(MediaFormat.KEY_COLOR_FORMAT, m_encode_color_format);
m_init_encode_format.setInteger(MediaFormat.KEY_BIT_RATE, encode_bit_rate);
m_init_encode_format.setInteger(MediaFormat.KEY_FRAME_RATE, encode_frame_rate);
m_init_encode_format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, encode_iframe_interval);
m_encoder = MediaCodec.createByCodecName(m_video_encoder_codec_info.getName());
m_encoder.configure(m_init_encode_format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
// Assume m_init_encode_format is the output format of the encoder
However, this fails, since the output format of the encoder is still "changed" from the initial one.
Please help me understand the behavior of the encoder, and tell me whether there is any way to query the output format when the required MediaCodec.INFO_OUTPUT_FORMAT_CHANGED is missing.
By comparing the output format and the codec config data, the missing fields are csd-0, csd-1, and a "what" field with value = 1869968451.
(I do not understand the "what" field. It seems to be a constant and is not required. Can anyone tell me about its meaning?)
If I parse the codec config data so that the last 8 bytes are the csd-1 field and the remaining bytes are the csd-0 field, it seems that the muxer works correctly and outputs a video playable on all of the test devices.
(But I want to ask: is this 8-byte assumption correct, or is there a more reliable way to parse the data?)
However, I ran into another problem: if I decode the video with Android MediaCodec again, the BufferInfo.presentationTimeUs value returned by dequeueOutputBuffer() is 0 for most of the decoded frames. Only the last few frames have the correct time. The sample time returned by MediaExtractor.getSampleTime() is correct and exactly the value I set in the encoder/muxer, but the decoded frame time is not. This issue only happens on devices running 4.2.2 or lower.
It is strange that the frame time is incorrect yet the video plays back at the correct speed on the device. (Most of the 4.2.2-or-lower devices I've tested have only one AVC video decoder.) Do I need to set other fields that may affect the presentation time?
The behavior of MediaCodec encoders was changed in Android 4.3 to accommodate the introduction of the MediaMuxer class. In Android 4.3, you will always receive INFO_OUTPUT_FORMAT_CHANGED from the encoder. In previous releases, you will not. (I've updated the relevant FAQ entry.)
There is no way to query the encoder for the MediaFormat.
I haven't used an ffmpeg-based muxer, so I'm not sure what information it needs. If it's looking for the csd-0 / csd-1 keys, you can extract those from the CODEC_CONFIG packet (I think you have to parse the SPS / PPS values out and place them in the separate keys). Examining the contents of the MediaFormat on a 4.3 device will show you which fields you're lacking.
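For reference, on 4.3 and later the usual drain-loop handling looks roughly like the sketch below. It uses MediaMuxer purely as an illustration (with an ffmpeg muxer you would hand the same format data across JNI instead); TIMEOUT_US, bufferInfo, muxer and videoTrackIndex are placeholders from the surrounding code.
int outIndex = encoder.dequeueOutputBuffer(bufferInfo, TIMEOUT_US);
if (outIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
    // Arrives once, before any encoded frames; the format now contains csd-0 / csd-1.
    MediaFormat newFormat = encoder.getOutputFormat();
    videoTrackIndex = muxer.addTrack(newFormat);
    muxer.start();
} else if (outIndex >= 0) {
    ByteBuffer encodedData = encoder.getOutputBuffers()[outIndex];
    muxer.writeSampleData(videoTrackIndex, encodedData, bufferInfo);
    encoder.releaseOutputBuffer(outIndex, false);
}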
To initialize the ffmpeg muxer for video correctly, I use the following:
int outputBufferIndex = videoCodec.dequeueOutputBuffer(bufferInfo, -1);
if (outputBufferIndex >= 0 && (bufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
    ByteBuffer outputBuffer = outputBuffers[outputBufferIndex];
    headerData = new byte[bufferInfo.size];
    outputBuffer.get(headerData);
    // jni call
    WriteVideoHeader(headerData, headerData.length);
    videoCodec.releaseOutputBuffer(outputBufferIndex, false);
}
In JNI I use something like this:
jint Java_com_an_FileWriterEx_WriteVideoHeader(JNIEnv * env, jobject this, jbyteArray data, jint datasize)
{
    jboolean isCopy;
    jbyte* rawjBytes = (*env)->GetByteArrayElements(env, data, &isCopy);
    WriteVideoHeaderInternal(env, m_pFormatCtx, m_pVideoStream, rawjBytes, datasize);
    (*env)->ReleaseByteArrayElements(env, data, rawjBytes, 0);
    return 0;
}

jint WriteVideoHeaderInternal(JNIEnv * env, AVFormatContext* pFormatCtx, AVStream* pVideoStream, jbyte* data, jint datasize)
{
    jboolean bNoError = JNI_TRUE;
    jbyte* pExtDataBuffer = av_malloc(datasize);
    if (!pExtDataBuffer)
    {
        LOGI("av alloc error\n");
        bNoError = JNI_FALSE;
    }
    if (bNoError)
    {
        memcpy(pExtDataBuffer, data, datasize * sizeof(jbyte));
        pVideoStream->codec->extradata = pExtDataBuffer;
        pVideoStream->codec->extradata_size = datasize;
    }
    return bNoError ? 0 : -1;   // report allocation failure to the caller
}
For parsing the codec config data, it is wrong to assume that the last 8 bytes are the PPS data. The data must be parsed according to the start codes and nal_unit_type.
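A minimal sketch of that kind of parsing (a hypothetical Java helper; it assumes 4-byte Annex-B start codes as in the buffers above, checks nal_unit_type, where 7 = SPS and 8 = PPS, and stores the results in the csd-0 / csd-1 keys mentioned in the earlier answer; for an ffmpeg muxer you would copy the same byte ranges into extradata instead):
// Hypothetical helper: split an Annex-B codec-config buffer into SPS and PPS by start code.
static void splitCodecConfig(byte[] config, MediaFormat format) {
    java.util.List<Integer> starts = new java.util.ArrayList<Integer>();
    for (int i = 0; i + 3 < config.length; i++) {
        if (config[i] == 0 && config[i + 1] == 0 && config[i + 2] == 0 && config[i + 3] == 1) {
            starts.add(i);
        }
    }
    starts.add(config.length);                            // sentinel marking the end of the last NAL unit
    for (int n = 0; n < starts.size() - 1; n++) {
        int begin = starts.get(n);
        int end = starts.get(n + 1);
        if (begin + 4 >= end) continue;                   // malformed or empty NAL unit
        int nalType = config[begin + 4] & 0x1F;           // NAL header byte follows the start code
        ByteBuffer nal = ByteBuffer.wrap(java.util.Arrays.copyOfRange(config, begin, end));
        if (nalType == 7) {
            format.setByteBuffer("csd-0", nal);           // SPS (start code included, as usual)
        } else if (nalType == 8) {
            format.setByteBuffer("csd-1", nal);           // PPS
        }
    }
}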
I'm trying to implement HW-accelerated H.264 video encoding on Android ICS 4.0.4. Since the MediaCodec class is not available, I have to use the stagefright API. But when I pass the kHardwareCodecsOnly flag, OMXCodec::Create always returns NULL.
If I call OMXCodec::findMatchingCodecs() with the kHardwareCodecsOnly flag, I get the following list:
- OMX.TI.DUCATI1.VIDEO.H264E
- OMX.qcom.7x30.video.encoder.avc
- OMX.qcom.video.encoder.avc
- OMX.TI.Video.encoder
- OMX.Nvidia.h264.encoder
- OMX.SEC.AVC.Encoder
so I guess this means that HW encoding is supported by the hardware.
When I pass no flags to OMXCodec::Create, the codec is created fine, but I guess it is a software codec.
(By the way, how can I check which codec exactly was created?)
Browsing the OMXCodec sources, I found these interesting lines:
if (createEncoder) {
    sp<MediaSource> softwareCodec =
        InstantiateSoftwareEncoder(componentName, source, meta);
    if (softwareCodec != NULL) {
        LOGV("Successfully allocated software codec '%s'", componentName);
        return softwareCodec;
    }
}
It looks like, for an encoder, it always tries to instantiate a software codec first.
What am I doing wrong? Any help will be greatly appreciated. Thanks.
Here's the OMXCodec creation code:
mClient = new OMXClient();
mClient->connect();
logger->log("mClient.connect();");
enc_meta = new MetaData;
// frame size of target video file
int width = 640; //720;
int height = 480;
int kFramerate = 15;
int kVideoBitRate = 500000;
int kIFramesIntervalSec = 5;
int32_t colorFormat = OMX_COLOR_FormatYUV420SemiPlanar;
enc_meta->setCString(kKeyMIMEType, MEDIA_MIMETYPE_VIDEO_AVC); //MEDIA_MIMETYPE_VIDEO_MPEG4); //MEDIA_MIMETYPE_VIDEO_H263);//MEDIA_MIMETYPE_VIDEO_AVC);
enc_meta->setInt32(kKeyWidth, width);
enc_meta->setInt32(kKeyHeight, height);
enc_meta->setInt32(kKeyFrameRate, kFramerate);
enc_meta->setInt32(kKeySampleRate, 44100);
enc_meta->setInt32(kKeyBitRate, kVideoBitRate);
enc_meta->setInt32(kKeyStride, width);
enc_meta->setInt32(kKeySliceHeight, height);
enc_meta->setInt32(kKeyIFramesInterval, kIFramesIntervalSec);
enc_meta->setInt32(kKeyColorFormat, colorFormat);
mVideoSource = OMXCodec::Create(
mClient->interface(),
enc_meta,
true,
mSrc,
NULL,
OMXCodec::kHardwareCodecsOnly );
logger->log("OMXCodec_CREATED result: %d", (mVideoSource!=NULL) ? 1 : 0);
In Android ICS 4.0.4, codec registration was static, i.e. all codecs were registered as part of the kEncoderInfo array, as can be found here.
The methodology for differentiating between hardware and software codecs is pretty simple: if the component name doesn't start with "OMX", it is construed to be a software codec, as shown in the IsSoftwareCodec method.
Since you are trying an AVC encoder, the software codec, if created, would be AVCEncoder, as can be seen from its factory reference.
To check which codec was created, you can enable logging in OMXCodec.cpp by uncommenting #define LOG_NDEBUG 0 on this line, then save and recompile to build libstagefright.so, which can then be used to generate the logs in logcat.
EDIT:
In the case of RTSP streaming, one needs to enable the logs in ACodec.cpp.
One also needs to ascertain that libstagefrighthw.so is present in /system/lib, which registers the OMX core with the Stagefright framework.
I am trying to use a Nexus device to test encoding with the MediaCodec APIs. I can see that the input buffer provided by the encoder has a capacity of 119040 (by logging inputBuffer.capacity()), but the size of the frame, i.e. input, is 460800. I get a buffer overflow error at inputBuffer.put(), so I wanted to set the input buffer size to 460800. The only API I could find is BufferInfo.set(), but I cannot find a way to attach this setting to the encoder. Could someone help? Thanks!
encoder = MediaCodec.createByCodecName(codecInfo.getName());
ByteBuffer[] inputBuffers = encoder.getInputBuffers();
if (inputBufferIndex >= 0) {
    ByteBuffer inputBuffer = inputBuffers[inputBufferIndex];
    inputBuffer.clear();
    inputBuffer.put(input);
    encoder.queueInputBuffer(inputBufferIndex, 0, input.length, 0, 0);
}
I'm late to the party, but based on the Android MediaCodec documentation, I think the correct way to change the buffer size is to adjust KEY_MAX_INPUT_SIZE, something like:
int width = 800;
int height = 480;
encoder = MediaCodec.createByCodecName(codecInfo.getName());
format = MediaFormat.createVideoFormat("video/avc", width, height);
format.setInteger(MediaFormat.KEY_MAX_INPUT_SIZE, 655360); // 0.5 MB, but adjust it as you need
encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
You don't set the size of the input buffer. The size is determined by the MediaFormat, specifically the width, height, and color format. If your input data has a different size, you will need to convert it to the format that the codec is expecting.
This isn't entirely trivial but is doable. For examples, see the buffer-to-buffer tests in the CTS EncodeDecodeTest. The test queries the codec to see what color format is supported, generates frames in that format, submits them to the encoder, then decodes the video and confirms that what comes out is the same as what went in.
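That query step might look something like this sketch (codecInfo is the MediaCodecInfo selected for the encoder and format is the MediaFormat it will be configured with; both are assumed from the surrounding code):
// Ask the codec which raw color formats it accepts before generating or converting frames.
MediaCodecInfo.CodecCapabilities caps = codecInfo.getCapabilitiesForType("video/avc");
int selectedColorFormat = -1;
for (int colorFormat : caps.colorFormats) {
    if (colorFormat == MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar
            || colorFormat == MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar) {
        selectedColorFormat = colorFormat;                // convert the input frames to this layout
        break;
    }
}
format.setInteger(MediaFormat.KEY_COLOR_FORMAT, selectedColorFormat);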
The test generally requires API 18 (Android 4.3), but the buffer-to-buffer code will work in API 16. Whether or not it works on any given device is a different question -- since the CTS test didn't exist until API 18, it's possible for pre-4.3 devices to do this wrong.
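As a quick sanity check on the numbers in the question above, a YUV 4:2:0 frame takes width * height * 3 / 2 bytes:
// One full-resolution Y plane plus quarter-resolution U and V planes.
int yuv420FrameBytes = width * height * 3 / 2;            // 640 * 480 * 3 / 2 = 460800
// A 119040-byte input buffer therefore suggests the encoder was configured
// with much smaller dimensions than the frames actually being submitted.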