I've used the Android MediaCodec library to transcode video files (mainly to change the resolution; sample code here).
Another thing I want to achieve is to truncate the video, to take only the first 15 seconds. The logic is to check videoExtractor.getSampleTime(); if it's greater than 15 seconds, I'll just write an EOS to the decoder buffer.
But I get an exception: Caused by: android.media.MediaCodec$CodecException: Error 0xfffffff3
Here is my code:
while ((!videoEncoderDone) || (!audioEncoderDone)) {
    while (!videoExtractorDone
            && (encoderOutputVideoFormat == null || muxing)) {
        int decoderInputBufferIndex = videoDecoder.dequeueInputBuffer(TIMEOUT_USEC);
        if (decoderInputBufferIndex == MediaCodec.INFO_TRY_AGAIN_LATER)
            break;
        ByteBuffer decoderInputBuffer = videoDecoderInputBuffers[decoderInputBufferIndex];
        int size = videoExtractor.readSampleData(decoderInputBuffer, 0);
        long presentationTime = videoExtractor.getSampleTime();
        if (size >= 0) {
            videoDecoder.queueInputBuffer(
                    decoderInputBufferIndex,
                    0,
                    size,
                    presentationTime,
                    videoExtractor.getSampleFlags());
        }
        videoExtractorDone = !videoExtractor.advance();
        // Truncation logic: treat the extractor as done once the next sample
        // passes the duration limit (seconds -> microseconds).
        if (!videoExtractorDone && videoExtractor.getSampleTime() > mVideoDurationLimit * 1000000) {
            videoExtractorDone = true;
        }
        if (videoExtractorDone)
            videoDecoder.queueInputBuffer(decoderInputBufferIndex,
                    0, 0, 0, MediaCodec.BUFFER_FLAG_END_OF_STREAM);
        break;
    }
The full source code can be found here.
I am not sure if this is the source of the error or not, but I think it is not safe to write EOS to the decoder buffer at an arbitrary point.
The reason is that when the input video uses H.264 Main Profile or above,
the PTS may not be in increasing order (because of the existence of B-frames), so you may miss several frames at the end of the video.
Also, when the last frame you send to the decoder is a B-frame, the decoder might be expecting the next packet, but you send the EOS flag instead and produce the error (not quite sure).
What you can do instead is send the EOS flag to the encoder using videoEncoder.signalEndOfInputStream() after you reach your desired frame (the PTS of the decoder output is guaranteed to be in increasing order, at least on Android >= 4.3?).
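For illustration, here is a minimal sketch of that approach (videoEncoderInputDone is an assumed bookkeeping flag, info is the decoder's BufferInfo, and mVideoDurationLimit is the seconds limit from the question; this assumes the decoder renders into the encoder's input surface):
int outputIndex = videoDecoder.dequeueOutputBuffer(info, TIMEOUT_USEC);
if (outputIndex >= 0) {
    // Decoder output PTS is in presentation order, so the cut point is reliable here.
    boolean withinLimit = info.presentationTimeUs <= mVideoDurationLimit * 1000000L;
    // Render only the frames we keep into the encoder's input surface.
    videoDecoder.releaseOutputBuffer(outputIndex, withinLimit /* render */);
    if (!withinLimit && !videoEncoderInputDone) {
        // Signal EOS on the encoder instead of queueing EOS into the decoder.
        videoEncoder.signalEndOfInputStream();
        videoEncoderInputDone = true;
    }
}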
I am working on an app that records the screen of my Android device and streams it over RTSP to another client. I am using VirtualDisplay and MediaCodec for this.
I have an issue that I don't know how to solve. When I start streaming, the client doesn't receive anything until the screen changes. I guess it makes sense: the buffer contains nothing, so nothing is sent to the client. The code for that is this:
MediaCodec buildMediaCodec() throws IOException {
MediaFormat format = MediaFormat.createVideoFormat(VIDEO_MIME_TYPE, VIDEO_WIDTH, VIDEO_HEIGHT);
// Set some required properties. The media codec may fail if these aren't defined.
format.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
format.setInteger(MediaFormat.KEY_BIT_RATE, BIT_RATE);
format.setInteger(MediaFormat.KEY_FRAME_RATE, FRAME_RATE);
format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1); // 1 second between I-frames
// Create a MediaCodec encoder and configure it. Get a Surface we can use for recording into.
MediaCodec mediaCodec = MediaCodec.createEncoderByType(VIDEO_MIME_TYPE);
mediaCodec.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
return mediaCodec;
}
// This is passed to buildVirtualDisplay(), and I get it from calling buildMediaCodec()
Surface mediaCodecSurface = mMediaCodec.createInputSurface();
VirtualDisplay buildVirtualDisplay(MediaProjection mediaProjection, Surface mediaCodecSurface, DisplayMetrics displayMetrics) {
if (mediaProjection == null || mediaCodecSurface == null || displayMetrics == null) {
throw new InvalidParameterException("MediaProjection, Surface and DisplayMetrics are mandatory");
}
return mediaProjection.createVirtualDisplay("Recording Display", VIDEO_WIDTH, VIDEO_HEIGHT, SCREEN_DPI, 0 /* flags */, mediaCodecSurface, null /* callback */, null /* handler */);
}
...
mIndex = mMediaCodec.dequeueOutputBuffer(mBufferInfo, 500000);
if (mIndex >= 0) {
mBuffer = mMediaCodec.getOutputBuffer(mIndex);
if (mBuffer == null) {
throw new RuntimeException("couldn't fetch buffer at index " + mIndex);
}
mBuffer.position(0);
break;
} else if (mIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
mMediaFormat = mMediaCodec.getOutputFormat();
Log.i(TAG, mMediaFormat.toString());
} else if (mIndex == MediaCodec.INFO_TRY_AGAIN_LATER) {
Log.v(TAG, "No buffer available...");
} else {
Log.e(TAG, "Message: " + mIndex);
}
In the logs I can see No buffer available... over and over. The moment the screen changes, it stops.
The problem comes when I stop interacting with the phone. The screen is not refreshed since nothing is changing, so I keep getting MediaCodec.INFO_TRY_AGAIN_LATER. After 10 seconds or so, the client disconnects. I guess it doesn't receive anything, so it just shuts down the connection.
I also observed that the longer I wait at the beginning, the bigger the delay between the server and client devices.
If I put up a progress bar, everything is OK; it seems the screen is re-rendered, so the buffer contains data to be sent.
I have looked for info about this problem. Any suggestion on what I might do to prevent this from happening? Should I use another Surface between MediaCodec and VirtualDisplay and try to "force" the rendering?
Thanks.
I found out that the client disconnects after not receiving data for at least 10 seconds. I tried KEY_REPEAT_PREVIOUS_FRAME_AFTER from MediaFormat to prevent this, but so far no luck.
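For reference, this is how I set it, a sketch only (the 100 ms value is an arbitrary guess; the key has to be set on the format before configure() and only applies to encoders with surface input):
// Ask the surface-input encoder to resubmit the previous frame if no new
// frame arrives within 100 ms (the value is in microseconds).
format.setLong(MediaFormat.KEY_REPEAT_PREVIOUS_FRAME_AFTER, 100000L);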
I am using the MediaCodec APIs to play the video stream (H.264) I am receiving through an ethernet port.
From what I understood from the official documentation and various examples, I need to perform the following operations:
Create a MediaCodec instance based on the MIME type (video/avc for H.264).
Feed the "csd-0" buffer with the SPS frame data. The SPS frame should start with 0x00000001.
Feed the "csd-1" buffer with the PPS frame data. The PPS frame should start with 0x00000001.
Call decoder.configure() and decoder.start(). If all goes well, the decoder is correctly configured and there is no exception.
Once the MediaCodec is configured, we can supply all the remaining frames (depacketized NAL units) to the decoder.
Upon a successful dequeue operation, the decoded buffer can be rendered onto the screen as follows:
outputBufferId = decoder.dequeueOutputBuffer(info, 1000);
decoder.releaseOutputBuffer(outputBufferId, true);
Problem
decoder.dequeueOutputBuffer() is returning -3 on Android 6.0.
The official documentation says that in the case of any errors during dequeueOutputBuffer(), only -1 (INFO_TRY_AGAIN_LATER) and -2 (INFO_OUTPUT_FORMAT_CHANGED) are returned, and that -3 (INFO_OUTPUT_BUFFERS_CHANGED) is deprecated. So why am I receiving -3?
How am I supposed to correct this?
Code Snippet :
String mimeType = "video/avc";
decoder = MediaCodec.createDecoderByType(mimeType);
mformat = MediaFormat.createVideoFormat(mimeType, 1920, 1080);
while (!Thread.interrupted()) {
byte[] data = new byte[size];
bin.read(data, 0, size);
if (data is SPS frame) {
mformat.setInteger(MediaFormat.KEY_MAX_INPUT_SIZE, 0);
mformat.setByteBuffer("csd-0", ByteBuffer.wrap(data));
continue;
}
if (data is PPS frame) {
mformat.setInteger(MediaFormat.KEY_MAX_INPUT_SIZE, 0);
mformat.setByteBuffer("csd-1", ByteBuffer.wrap(data));
decoder.configure(mformat, surface, null, 0);
decoder.start();
is_decoder_configured = true;
continue;
}
if (!is_decoder_configured)
continue;
index = decoder.dequeueInputBuffer(1000);
if (index < 0) {
Log.e(TAG, "Dequeue input buffer failed..\n");
continue;
}
buf = decoder.getInputBuffer(index);
if (buf == null)
continue;
buf.put(data);
decoder.queueInputBuffer(index, 0, data.length, 0, 0);
int outputBufferId = decoder.dequeueOutputBuffer(info, 1000);
switch (outputBufferId) {
case MediaCodec.INFO_OUTPUT_FORMAT_CHANGED:
Log.i("DecodeActivity", "New format " + decoder.getOutputFormat());
break;
case MediaCodec.INFO_TRY_AGAIN_LATER:
Log.i("DecodeActivity", "dequeueOutputBuffer timed out!");
break;
default:
Log.i(TAG, "Successfully decoded the output : " + outputBufferId);
decoder.releaseOutputBuffer(outputBufferId, true);
break;
}
}
if (is_decoder_configured) {
decoder.stop();
decoder.release();
}
If you see any other errors, please let me know. I will be grateful!
I don't see anything saying that it will never be returned. The documentation says this:
This constant was deprecated in API level 21.
This return value can be ignored as getOutputBuffers() has been deprecated. Client should request a current buffer using one of the get-buffer or get-image methods each time one has been dequeued.
That is, if you're using getOutputBuffers(), you need to listen for this return value and act on it - but doing this is deprecated. If you're not using getOutputBuffers(), just ignore this return value, i.e. call dequeueOutputBuffer() again with the same parameters and see what it returns afterwards.
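Put differently, a minimal handling loop could look like this sketch (assuming the decoder and info objects from the question and API 21+, where getOutputBuffer(int) is available):
int outputBufferId = decoder.dequeueOutputBuffer(info, 1000);
if (outputBufferId >= 0) {
    // Fetch the current buffer by index each time; no cached array needed.
    ByteBuffer outputBuffer = decoder.getOutputBuffer(outputBufferId);
    // ... consume outputBuffer ...
    decoder.releaseOutputBuffer(outputBufferId, true);
} else if (outputBufferId == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
    MediaFormat newFormat = decoder.getOutputFormat();
} else if (outputBufferId == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
    // Deprecated signal; nothing to do when not using getOutputBuffers().
} else if (outputBufferId == MediaCodec.INFO_TRY_AGAIN_LATER) {
    // No output available yet; retry on the next iteration.
}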
I have a problem with this library:
https://github.com/fyhertz/libstreaming
It allows you to stream the camera over a wireless connection, and it uses three methods: two with MediaCodec and one with MediaRecorder.
I would like to modify it so that it uses only MediaCodec; however, first of all I tried the code of example 2 of the library, and I always get the same error:
the log tells me that the device can use MediaCodec, it sets up the encoder, but when it tests the decoder it fails and the buffer index it gets back is always -1.
This is the method in the EncoderDebugger class where the exception occurs. Can some kind soul help me, please?
private long decode(boolean withPrefix) {
int n = 3, i = 0, j = 0;
long elapsed = 0, now = timestamp();
int decInputIndex = 0, decOutputIndex = 0;
ByteBuffer[] decInputBuffers = mDecoder.getInputBuffers();
ByteBuffer[] decOutputBuffers = mDecoder.getOutputBuffers();
BufferInfo info = new BufferInfo();
while (elapsed<3000000) {
// Feeds the decoder with a NAL unit
if (i<NB_ENCODED) {
decInputIndex = mDecoder.dequeueInputBuffer(1000000/FRAMERATE);
if (decInputIndex>=0) {
int l1 = decInputBuffers[decInputIndex].capacity();
int l2 = mVideo[i].length;
decInputBuffers[decInputIndex].clear();
if ((withPrefix && hasPrefix(mVideo[i])) || (!withPrefix && !hasPrefix(mVideo[i]))) {
check(l1>=l2, "The decoder input buffer is not big enough (nal="+l2+", capacity="+l1+").");
decInputBuffers[decInputIndex].put(mVideo[i],0,mVideo[i].length);
} else if (withPrefix && !hasPrefix(mVideo[i])) {
check(l1>=l2+4, "The decoder input buffer is not big enough (nal="+(l2+4)+", capacity="+l1+").");
decInputBuffers[decInputIndex].put(new byte[] {0,0,0,1});
decInputBuffers[decInputIndex].put(mVideo[i],0,mVideo[i].length);
} else if (!withPrefix && hasPrefix(mVideo[i])) {
check(l1>=l2-4, "The decoder input buffer is not big enough (nal="+(l2-4)+", capacity="+l1+").");
decInputBuffers[decInputIndex].put(mVideo[i],4,mVideo[i].length-4);
}
mDecoder.queueInputBuffer(decInputIndex, 0, l2, timestamp(), 0);
i++;
} else {
if (VERBOSE) Log.d(TAG,"No buffer available !7");
}
}
// Tries to get a decoded image
decOutputIndex = mDecoder.dequeueOutputBuffer(info, 1000000/FRAMERATE);
if (decOutputIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
decOutputBuffers = mDecoder.getOutputBuffers();
} else if (decOutputIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
mDecOutputFormat = mDecoder.getOutputFormat();
} else if (decOutputIndex>=0) {
if (n>2) {
// We have successfully encoded and decoded an image !
int length = info.size;
mDecodedVideo[j] = new byte[length];
decOutputBuffers[decOutputIndex].clear();
decOutputBuffers[decOutputIndex].get(mDecodedVideo[j], 0, length);
// Converts the decoded frame to NV21
convertToNV21(j);
if (j>=NB_DECODED-1) {
flushMediaCodec(mDecoder);
if (VERBOSE) Log.v(TAG, "Decoding "+n+" frames took "+elapsed/1000+" ms");
return elapsed;
}
j++;
}
mDecoder.releaseOutputBuffer(decOutputIndex, false);
n++;
}
elapsed = timestamp() - now;
}
throw new RuntimeException("The decoder did not decode anything.");
}
Here are my suggestions:
(1) Check the settings of the encoder and decoder, and make sure that they match. For example, that the resolution and color format are the same.
(2) Make sure the very first packet generated by the encoder has been sent and pushed into the decoder. This packet defines the basic settings of the video stream.
(3) The decoder usually buffers 5-10 frames, so no valid output appears for the first few hundred ms.
(4) When initializing the decoder, set the surface to null. Otherwise the output buffer will be read by the surface and probably released automatically. A sketch of points (2) and (4) follows this list.
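Here is a rough sketch of points (2) and (4) together (width, height, TIMEOUT_US and configData are assumptions for illustration; configData would hold the first SPS/PPS packet produced by the encoder):
// (4) Configure the decoder with a null Surface so decoded frames stay
// readable in the output ByteBuffers instead of being consumed by a Surface.
MediaCodec decoder = MediaCodec.createDecoderByType("video/avc");
MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
decoder.configure(format, /* surface */ null, null, 0);
decoder.start();

// (2) Push the encoder's first packet (SPS/PPS) before any video data,
// flagged as codec config data.
int inIndex = decoder.dequeueInputBuffer(TIMEOUT_US);
if (inIndex >= 0) {
    ByteBuffer in = decoder.getInputBuffers()[inIndex];
    in.clear();
    in.put(configData);
    decoder.queueInputBuffer(inIndex, 0, configData.length, 0,
            MediaCodec.BUFFER_FLAG_CODEC_CONFIG);
}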
I'm working on video transcoding on Android, using the standard method shown in these samples to extract/decode a video. I tested the same process on different devices with different video decoders, and I found a problem in the frame count of the decoder input/output.
Because of some timecode issues, as in this question, I use a queue to record the extracted video samples and check the queue when I get a decoder frame output, like the following code:
(I omit the encoding-related code to make it clearer.)
Queue<Long> sample_time_queue = new LinkedList<Long>();
....
// in transcoding loop
if (is_decode_input_done == false)
{
int decode_input_index = decoder.dequeueInputBuffer(TIMEOUT_USEC);
if (decode_input_index >= 0)
{
ByteBuffer decoder_input_buffer = decode_input_buffers[decode_input_index];
int sample_size = extractor.readSampleData(decoder_input_buffer, 0);
if (sample_size < 0)
{
decoder.queueInputBuffer(decode_input_index, 0, 0, 0, MediaCodec.BUFFER_FLAG_END_OF_STREAM);
is_decode_input_done = true;
}
else
{
long sample_time = extractor.getSampleTime();
decoder.queueInputBuffer(decode_input_index, 0, sample_size, sample_time, 0);
sample_time_queue.offer(sample_time);
extractor.advance();
}
}
else
{
DumpLog(TAG, "Decoder dequeueInputBuffer timed out! Try again later");
}
}
....
if (is_decode_output_done == false)
{
int decode_output_index = decoder.dequeueOutputBuffer(decode_buffer_info, TIMEOUT_USEC);
switch (decode_output_index)
{
case MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED:
{
....
break;
}
case MediaCodec.INFO_OUTPUT_FORMAT_CHANGED:
{
....
break;
}
case MediaCodec.INFO_TRY_AGAIN_LATER:
{
DumpLog(TAG, "Decoder dequeueOutputBuffer timed out! Try again later");
break;
}
default:
{
ByteBuffer decode_output_buffer = decode_output_buffers[decode_output_index];
long ptime_us = decode_buffer_info.presentationTimeUs;
boolean is_decode_EOS = ((decode_buffer_info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0);
if (is_decode_EOS)
{
// Decoder gives an EOS output.
is_decode_output_done = true;
....
}
else
{
// The frame time may not be consistent for some videos.
// As a workaround, we use a frame time queue to guard this.
long sample_time = sample_time_queue.poll();
if (sample_time == ptime_us)
{
// Very good, the decoder input/output time is consistent.
}
else
{
// If the decoder input/output frame count is consistent, we can trust the sample time.
ptime_us = sample_time;
}
// process this frame
....
}
decoder.releaseOutputBuffer(decode_output_index, false);
}
}
}
In some cases, the queue can "correct" the PTS when the decoder gives erroneous values (e.g. a lot of 0s). However, there are still some issues with the frame count of the decoder input/output.
On an HTC One 801e device, I use the codec OMX.qcom.video.decoder.avc to decode the video (with MIME type video/avc). The sample time and PTS match well for all the frames except the last one.
For example, if the extractor feeds 100 frames and then EOS to the decoder, the first 99 decoded frames have exactly the same time values, but the last frame is missing and I get the output EOS from the decoder. I tested different videos encoded by the built-in camera, by the ffmpeg muxer, and by a video processing application on Windows. All of them have the last frame disappear.
On some pads with the OMX.MTK.VIDEO.DECODER.AVC codec, things become more confusing. Some videos have good PTS from the decoder and a correct input/output frame count (i.e. the queue is empty when the decoding is done). Some videos have a consistent input/output frame count but bad PTS in the decoder output (and I can still correct them with the queue). For some videos, a lot of frames go missing during decoding. For example, the extractor gets 210 frames from a 7-second video, but the decoder only outputs the last 180 frames. It is impossible to recover the PTS using the same workaround.
Is there any way to predict the input/output frame count for a MediaCodec decoder? Or, more accurately, to know which frame(s) are dropped by the decoder while the extractor gives it video samples with correct sample times?
Same basic story as in the other question. Pre-4.3, there were no tests confirming that every frame fed to an encoder or decoder came out the other side. I recall that some devices would reliably drop the last frame in certain tests until the codecs were fixed in 4.3.
I didn't search for a workaround at the time, so I don't know if one exists. Delaying before sending EOS might help if it's causing something to shut down early.
I don't believe I ever saw a device drop large numbers of frames. This seems like an unusual case, as it would have been noticeable in any apps that exercised MediaCodec in similar ways even without careful testing.
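If you want to experiment with the delayed-EOS idea, a hypothetical variant of the feeding loop (reusing the question's sample_time_queue; all_samples_sent and last_sample_ms are assumed bookkeeping variables) might look like this:
// Hypothetical workaround: hold the EOS back until the PTS queue has drained
// or a short grace period has passed, instead of queueing it immediately.
boolean queueDrained = sample_time_queue.isEmpty();
boolean gracePassed = SystemClock.elapsedRealtime() - last_sample_ms > 500;
if (all_samples_sent && (queueDrained || gracePassed) && !is_decode_input_done) {
    int index = decoder.dequeueInputBuffer(TIMEOUT_USEC);
    if (index >= 0) {
        decoder.queueInputBuffer(index, 0, 0, 0, MediaCodec.BUFFER_FLAG_END_OF_STREAM);
        is_decode_input_done = true;
    }
}
Whether this helps on the affected pre-4.3 codecs is untested.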
I'm using the following code to prepare the hardware decoder. I expect outputBufferIndex to be -1, followed by MediaCodec.INFO_OUTPUT_FORMAT_CHANGED. It shouldn't be >= 0 before the format change has been signalled.
I tested the code on 25 different devices, and 7 of them never return INFO_OUTPUT_FORMAT_CHANGED. mediaCodec.getOutputFormat() threw an IllegalStateException when I got outputBufferIndex >= 0. I have no idea if it was a coincidence, but all the devices that didn't work were Android 4.2.2 with the OMX.qcom.video.decoder.avc decoder.
for (int i = 0; i < videoExtractor.getTrackCount(); i++) {
MediaFormat mediaFormat = videoExtractor.getTrackFormat(i);
String mime = mediaFormat.getString(MediaFormat.KEY_MIME);
if (mime.startsWith("video/")) {
videoExtractor.selectTrack(i);
videoCodec = MediaCodec.createDecoderByType(mediaFormat.getString(MediaFormat.KEY_MIME));
videoCodec.configure(mediaFormat, null, null, 0);
videoCodec.start();
}
}
ByteBuffer[] videoInputBuffers = videoCodec.getInputBuffers();
while (true) {
int sampleTrackIndex = videoExtractor.getSampleTrackIndex();
if (sampleTrackIndex == -1) {
break;
} else { // decode video
int inputBufferIndex = videoCodec.dequeueInputBuffer(0);
if (inputBufferIndex >= 0) {
int bytesRead = videoExtractor.readSampleData(videoInputBuffers[inputBufferIndex], 0);
if (bytesRead >= 0) {
videoCodec.queueInputBuffer(inputBufferIndex, 0, bytesRead,
videoExtractor.getSampleTime(), 0);
videoExtractor.advance();
}
}
MediaCodec.BufferInfo videoBufferInfo = new MediaCodec.BufferInfo();
int outputBufferIndex = videoCodec.dequeueOutputBuffer(videoBufferInfo, 0);
if (outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
MediaFormat format = videoCodec.getOutputFormat();
Log.w(TAG, "video format changed: " + videoCodec.getOutputFormat());
//do something...
break;
} else if (outputBufferIndex >= 0) {
//not supposed to happen!
}
}
}
Thank you very much for the clues and help!
In Android 4.3, a collection of MediaCodec tests were added to CTS. If you look at the way doEncodeDecodeVideoFromBuffer() works in EncodeDecodeTest, you can see that it expects the INFO_OUTPUT_FORMAT_CHANGED result before any data. If it doesn't get it, the call to checkFrame() will fail when it tries to get the color format. Prior to Android 4.3, there were no tests, and any behavior is possible.
Having said that, I don't recall seeing this behavior on the (Qualcomm-based) Nexus 4.
At any rate, I'm not sure how much this will actually hold you back, unless you're able to decode the proprietary buffer layout Qualcomm uses. You can see in that same checkFrame() function that it punts when it sees OMX_QCOM_COLOR_FormatYUV420PackedSemiPlanar64x32Tile2m8ka. Sending the output to a Surface may be a viable alternative depending on what you're up to.
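For example, a minimal sketch of the Surface route (assuming a SurfaceView named surfaceView in your layout; the rest matches the question's code):
// Render decoder output straight to a Surface so the app never has to parse
// vendor-specific buffer layouts such as Qualcomm's tiled YUV format.
Surface surface = surfaceView.getHolder().getSurface();
videoCodec.configure(mediaFormat, surface, null, 0);
videoCodec.start();
// ...
// In the output loop, render the frame directly instead of reading the buffer:
videoCodec.releaseOutputBuffer(outputBufferIndex, true /* render */);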
Most of the MediaCodec code on bigflake and in Grafika targets API 18 (Android 4.3), because that's when the behavior became more predictable. (The availability of surface input and MediaMuxer is also of tremendous value.)