I'm having a very difficult time with MediaCodec. I've used it previously to decode a raw H.264 stream and learned a significant amount. At least I thought I had.
My stream is H.264 in Annex B format. Looking at the raw data, the structure of my NAL unit types is as follows:
[0x09][0x09][0x06] and then 8 packets of [0x21].
[0x09][0x09][0x27][0x28][0x06] and then 8 packets of [0x21].
This is not how I am receiving them, though; I am attempting to build a complete Access Unit from these raw NAL unit types.
The first thing that is strange to me is the double [0x09], which is the Access Unit Delimiter (AUD) packet; I am pretty sure the H.264 spec allows only one AUD per Access Unit. BTW, I am able to record the raw data and play it using ffmpeg with, and without, the extra AUD. For now, I am detecting this case and stripping the first AUD off before sending the entire Access Unit to MediaCodec.
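For reference, here's a minimal sketch of that stripping step, assuming 4-byte Annex-B start codes and that the assembled buffer begins with the first AUD:
// Hedged sketch: drop the leading AUD when it is immediately followed by
// another AUD. An AUD is start code (4) + NAL header (1) + payload (1) = 6 bytes.
private static byte[] stripDuplicateAud(byte[] accessUnit) {
    if (accessUnit.length > 10
            && (accessUnit[4] & 0x1F) == 9    // first NAL is an AUD (type 9)
            && (accessUnit[10] & 0x1F) == 9) { // and so is the next one
        return java.util.Arrays.copyOfRange(accessUnit, 6, accessUnit.length);
    }
    return accessUnit;
}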
The second thing is that I have hardcoded the SPS/PPS byte arrays [0x27/0x28], and I am setting these in the MediaFormat used to initialize the MediaCodec, similar to:
format.setByteBuffer("csd-0", ByteBuffer.wrap(mySPS)); // SPS, including its Annex-B start code
format.setByteBuffer("csd-1", ByteBuffer.wrap(myPPS)); // PPS, including its Annex-B start code
My video stream provider vendor tells me the video is 1280 x 720; however, when I convert it to an mp4 file, the metadata says it's 960 x 720. Another oddity.
However I change these parameters around, I am still unable to get a valid buffer index in the thread that processes the decoder output (dequeueOutputBuffer returns -1, INFO_TRY_AGAIN_LATER). I have also varied the timeout, to no avail. If I manually submit the SPS/PPS as the first packet instead of using the example above, I do get the -3 "output buffers have changed" result, which is meaningless since I am using API 20. But everything else I get back is -1.
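For context, here is a minimal output-drain sketch using the documented status constants; decoder, outputBuffers, and TAG are placeholders:
MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
int index = decoder.dequeueOutputBuffer(info, 10000 /* microseconds */);
if (index >= 0) {
    decoder.releaseOutputBuffer(index, true); // render the buffer to the Surface
} else if (index == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) { // -2
    Log.d(TAG, "new format: " + decoder.getOutputFormat());
} else if (index == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) { // -3
    outputBuffers = decoder.getOutputBuffers(); // refresh the array (needed before API 21)
} // else INFO_TRY_AGAIN_LATER (-1): nothing ready yet; queue more input and retry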
I've read about the emulation prevention byte (EPB) encoding in H.264. I am able to strip these bytes out and send the result to MediaCodec, but it doesn't seem to make a difference. Also, the MediaCodec documentation doesn't explicitly say whether it expects the EPBs to be stripped out or left in.
Other than the video frame resolution, the only other difference from my previous success is the presence of the SEI packet type [0x06]. I'm not sure if I should be doing something special with this or not.
I know a number of folks who have used MediaCodec have had issues with it, mostly because the documentation is not very good. Can anyone offer any advice as to what I could be doing wrong?
Related
I have some automated tests that try to decode a few m4a files to PCM data using Android's MediaCodec and MediaExtractor. The files are generated with various encoders: fdk-aac, ffmpeg (with fdk or the default aac encoder), iOS.
On Android 9 the test fails for the clips created with ffmpeg, producing empty PCM files. The same clips are decoded fine on older versions of Android.
I double-checked my code, and the decoding process goes as expected (a minimal sketch of the loop follows the list):
Extract compressed data using MediaExtractor.
Enqueue it to the codec.
Dequeue the output buffer from the codec.
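Here is a minimal sketch of that loop, assuming extractor and codec are already configured for the audio track (error handling omitted):
MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
boolean inputDone = false, outputDone = false;
while (!outputDone) {
    if (!inputDone) {
        int inIndex = codec.dequeueInputBuffer(10_000);
        if (inIndex >= 0) {
            ByteBuffer inBuf = codec.getInputBuffer(inIndex);
            int size = extractor.readSampleData(inBuf, 0);
            if (size < 0) { // no more samples: signal end of stream
                codec.queueInputBuffer(inIndex, 0, 0, 0,
                        MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                inputDone = true;
            } else {
                codec.queueInputBuffer(inIndex, 0, size,
                        extractor.getSampleTime(), 0);
                extractor.advance();
            }
        }
    }
    int outIndex = codec.dequeueOutputBuffer(info, 10_000);
    if (outIndex >= 0) {
        ByteBuffer pcm = codec.getOutputBuffer(outIndex);
        // ... append the decoded PCM chunk to the output file ...
        codec.releaseOutputBuffer(outIndex, false);
        if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
            outputDone = true;
        }
    }
}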
The issue is that by the time the last available input buffer is enqueued and the output buffer with MediaCodec.BUFFER_FLAG_END_OF_STREAM is dequeued, all output buffers are empty!
Then I noticed that the MediaFormat info extracted from the audio file with MediaExtractor.getTrackFormat(int track) contains an undocumented "encoder-delay" key.
For Android 8 and lower, that key is only present for m4a clips encoded with the iTunSMPB tag info. Here's a summary of the values I get for my test files:
iOS-encoded file: 2112 frames
fdkaac with iTunSMPB tag: 2048 frames
fdkaac with ISO delay info: key not present
ffmpeg: key not present
ffmpeg (fdk): key not present
On Android 9, instead, I get the following results:
iOS-encoded file: 2112 frames
fdkaac with iTunSMPB tag: 2048 frames
fdkaac with ISO delay info: 2048 frames
ffmpeg: 45158 frames
ffmpeg (fdk): 90317 frames
It looks like something has changed and MediaExtractor is now able to retrieve the encoder delay for all the files under test. This is good in theory, since the files with no "encoder-delay" info do show a delay in the decoded PCM data (this was a known issue).
But... while the value for the "fdkaac with ISO delay info" case is correct and leads to a valid PCM file with no initial padding (finally!), the values for the ffmpeg-generated files look huge and likely wrong!
I know the real encoder delay values are 1024 frames for the ffmpeg case and 2048 for the ffmpeg (fdk) case, and I think the huge value for the key in the extracted format is the reason why the file is empty.
In fact, if I try setting the "encoder-delay" key to 0 in the format just before passing it to MediaCodec.configure(...) I get the correct uncompressed data with the expected delay.
My guess at this point is that the MediaExtractor encoder delay value retrieval has some bug, but maybe there's something I am overlooking.
Since ffmpeg is quite popular, it's quite likely that many of my app users will try importing files generated with it, and at this point I can't see a foolproof solution to the issue.
Does anyone have a suggestion / workaround?
I opened an issue on the android issue tracker:
https://issuetracker.google.com/issues/118398811
And for now I just implemented a workaround: when the "encoder-delay" value is present in the MediaFormat object and it's an impossibly high value, I just set it to zero. Something like:
if (format.containsKey("encoder-delay")
        && format.getInteger("encoder-delay") > THRESHOLD) { // THRESHOLD: app-chosen sanity limit
    format.setInteger("encoder-delay", 0);
}
NB: this means the initial gap will not be trimmed away, but for m4a files that don't have such info this is already the case on pre-Android-9 devices.
I am using MediaCodec to decode an H.264 stream on a Samsung S6, Android 5.1.1, and found that the input buffers given to MediaCodec must start with "0001" (and there is no need to set the PPS/SPS), or ACodec reports an error.
I also tried using MediaExtractor to play an mp4 file, and it works fine, but the buffers it feeds to MediaCodec do not start with "0001".
I don't know why decoding an H.264 stream has such a limitation. Currently I need to analyze the stream from the socket and cut the data into small packets (each starting with 0001) before giving them to MediaCodec, which is inefficient.
MediaFormat format = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, 1024, 1024);
Some specific decoders may also be able to decode H.264 NAL units in the "mp4" format (with length prefixes instead of startcodes), but that's not guaranteed across all devices.
It may be that Samsung's version of MediaExtractor returns it in this format, if they know that their own decoder can handle it. There is at least earlier precedent that Samsung did the same, nonstandard thing with timestamps with their version of MediaExtractor, see e.g. https://code.google.com/p/android/issues/detail?id=74356.
(Having MediaExtractor return data that only the current device's decoder can handle is wrong IMO, though, since one may want to use MediaExtractor to read a file but send the compressed data over the network to another device for decoding, and in these cases, returning data in a nonstandard format is wrong.)
As fadden wrote, MediaCodec operates on full NAL units though, so you need to provide data in this format (even if you think it feels inefficient). If you receive data over a socket in a format where this information (about frame boundaries) isn't easily available, then that's an issue with your protocol format (e.g., implementing RTP reception is not easy!), not with MediaCodec itself - it's a quite common limitation to need to have full frames before decoding, instead of being able to feed random chunks until you have a full frame. This shouldn't be inefficient unless your own implementation of it is inefficient.
In general, Android will expect NAL units for each input. On some devices I have found that setting csd-0/csd-1 on the MediaFormat for H.264 does not work consistently, but if you feed each of the parameter sets as input buffers, MediaCodec will pick it up as a format change:
int outputBufferIndex = NativeDecoder.DequeueOutputBuffer(info, 1000);
if (outputBufferIndex == (int) MediaCodec.InfoOutputFormatChanged) {
    Console.WriteLine("Format changed: {0}", NativeDecoder.OutputFormat);
} else if (outputBufferIndex >= 0) {
    CodecOutputBufferAvailable(NativeDecoder, outputBufferIndex, info);
}
Also note that on Nexus and some Samsung devices it is mandatory to set:
formatDescription.SetInteger(MediaFormat.KeyWidth, SelectedPalette.Value.Width);
formatDescription.SetInteger(MediaFormat.KeyHeight, SelectedPalette.Value.Height);
formatDescription.SetInteger(MediaFormat.KeyMaxInputSize, SelectedPalette.Value.Width * SelectedPalette.Value.Height);
I am lucky that in my situation I can query these resolutions, but you can also parse the resolution manually from the SPS NAL unit (see the sketch below).
NOTE: I am using Xamarin here, but the calls and concepts are pretty much the same. I am fairly certain there are bugs in the iOS VideoToolbox Xamarin wrapper, so keep that in mind if you're ever considering Xamarin for video decoding. It's great for everything except anything that's slightly more custom or low-level.
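Here's a rough sketch (in plain Java) of pulling width/height out of an SPS. It assumes the RBSP has already been isolated (start code and NAL header byte removed, emulation prevention bytes stripped), assumes 4:2:0 crop units, and bails out on the rare streams that carry a scaling matrix:
final class SpsSize {
    private final byte[] d;
    private int pos; // current bit position

    SpsSize(byte[] rbsp) { d = rbsp; }

    private int u(int n) { // read n bits, MSB first
        int v = 0;
        while (n-- > 0) { v = (v << 1) | ((d[pos >> 3] >> (7 - (pos & 7))) & 1); pos++; }
        return v;
    }

    private int ue() { // unsigned Exp-Golomb
        int zeros = 0;
        while (u(1) == 0) zeros++;
        return (1 << zeros) - 1 + u(zeros);
    }

    private int se() { // signed Exp-Golomb
        int k = ue();
        return ((k & 1) == 1) ? (k + 1) / 2 : -(k / 2);
    }

    int[] parse() { // returns {width, height}
        int profileIdc = u(8);
        u(16); // constraint flags + reserved bits + level_idc
        ue();  // seq_parameter_set_id
        if (profileIdc == 100 || profileIdc == 110 || profileIdc == 122
                || profileIdc == 244 || profileIdc == 44 || profileIdc == 83
                || profileIdc == 86 || profileIdc == 118 || profileIdc == 128) {
            int chromaFormatIdc = ue();
            if (chromaFormatIdc == 3) u(1); // separate_colour_plane_flag
            ue(); ue(); u(1);               // bit depths, transform bypass flag
            if (u(1) == 1) throw new IllegalArgumentException("scaling matrix not handled");
        }
        ue(); // log2_max_frame_num_minus4
        int pocType = ue();
        if (pocType == 0) {
            ue(); // log2_max_pic_order_cnt_lsb_minus4
        } else if (pocType == 1) {
            u(1); se(); se();
            for (int i = ue(); i > 0; i--) se();
        }
        ue(); u(1); // max_num_ref_frames, gaps_in_frame_num_value_allowed_flag
        int widthMbs = ue() + 1;          // pic_width_in_mbs_minus1 + 1
        int heightMapUnits = ue() + 1;    // pic_height_in_map_units_minus1 + 1
        int frameMbsOnly = u(1);
        if (frameMbsOnly == 0) u(1);      // mb_adaptive_frame_field_flag
        u(1);                             // direct_8x8_inference_flag
        int width = widthMbs * 16;
        int height = (2 - frameMbsOnly) * heightMapUnits * 16;
        if (u(1) == 1) { // frame_cropping_flag (crop units assume 4:2:0)
            int l = ue(), r = ue(), t = ue(), b = ue();
            width -= (l + r) * 2;
            height -= (t + b) * 2 * (2 - frameMbsOnly);
        }
        return new int[] { width, height };
    }
}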
I am successfully using MediaCodec to decode audio; however, when I load a file with 24-bit samples, I have no way of knowing this has occurred. Since the application was assuming 16-bit samples, it fails.
When I print the MediaFormat, I see
{mime=audio/raw, durationUs=239000000, bits-format=6, channel-count=2, channel-mask=0, sample-rate=96000}
I assumed that "bits-format" would be the hint; however, this key is not declared in the API and is not actually emitted when the output format changes. Instead I get
{mime=audio/raw, what=1869968451, channel-count=2, channel-mask=0, sample-rate=96000}
(By the way, what is the "what" key? I notice that if I interpret it as a fourcc, it is "outC"... just a flag that it is an output format?)
So what is the best recourse here? If I feed the ByteBuffer straight to the AudioTrack, it plays static, of course (since it assumes 16-bit PCM).
If I know the value, then I can convert it myself!
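If the output does turn out to be packed little-endian 24-bit PCM (an assumption worth verifying against your decoder's actual output), a down-conversion for a 16-bit AudioTrack could look like:
// Hedged sketch: keep the top two bytes of each 3-byte sample,
// producing little-endian 16-bit PCM.
private static byte[] pcm24to16(ByteBuffer in) {
    byte[] src = new byte[in.remaining()];
    in.get(src);
    byte[] out = new byte[src.length / 3 * 2];
    for (int i = 0, o = 0; i + 2 < src.length; i += 3, o += 2) {
        out[o] = src[i + 1];     // middle byte -> low byte of the 16-bit sample
        out[o + 1] = src[i + 2]; // top byte    -> high byte (carries the sign)
    }
    return out;
}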
I understand from other questions that you cannot dictate the output format either.
I am trying to use the MediaCodec API for decoding without using the MediaExtractor API. Instead, I use mp4parser to get the samples from the mp4 files. For now, I am only using h.264 / AVC coded video content.
The official documentation of the MediaCodec API states:
buffers do not start and end on arbitrary byte boundaries, this is not a stream of bytes, it's a stream of access units.
Meaning, I have to feed access units to the decoder. However, I am missing some details:
For h.264, an mp4 sample can contain multiple NAL units, each preceded by a (by default 4-byte) field specifying the NAL unit length.
Now my questions:
There can be mp4 samples where codec config NAL units (SPS, PPS) are mixed with NAL units containing coded (parts of) frames. In that case, should I pass the BUFFER_FLAG_CODEC_CONFIG flag when calling queueInputBuffer()?
There can also be other (additional) NAL units in mp4 samples, like SEI or access unit delimiter NAL units. What about those? Are they a problem?
I have tried different possibilities, but all the feedback I get from Android is that the calls to dequeueOutputBuffer() time out (or never return, if I pass -1 as the timeout parameter). As a result, I don't seem to have a way to troubleshoot this issue.
Any advice what to do or where to look is of course very welcome as well.
The NAL unit length prefixes need to be converted to Annex-B startcodes (bytes 0x00, 0x00, 0x00, 0x01) before passing the data to MediaCodec for decoding. (Some decoders might actually accept the MP4 format straight away, but it's not too common.)
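A minimal in-place rewrite, assuming the default 4-byte big-endian length prefixes:
// Since prefix and startcode are both 4 bytes, the sample can be patched in place.
private static void lengthPrefixesToStartCodes(byte[] sample) {
    int offset = 0;
    while (offset + 4 <= sample.length) {
        int nalLength = ((sample[offset] & 0xFF) << 24)
                | ((sample[offset + 1] & 0xFF) << 16)
                | ((sample[offset + 2] & 0xFF) << 8)
                | (sample[offset + 3] & 0xFF);
        sample[offset] = 0; sample[offset + 1] = 0;
        sample[offset + 2] = 0; sample[offset + 3] = 1;
        offset += 4 + nalLength;
    }
}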
The SPS/PPS that is stored in the avcC atom in the file also needs to be converted to use Annex-B startcodes. Note that the avcC atom contains a few other fields that you don't need to pass on to the decoder. You can either pass the SPS and PPS packed in one buffer (with startcodes before each of them) with the BUFFER_FLAG_CODEC_CONFIG flag set before sending any actual frames, or pass them (with Annex-B startcodes) in the MediaFormat you use to configure the decoder (either in one ByteBuffer with the key "csd-0", or in two separate keys as "csd-0" and "csd-1").
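A sketch of the first option, where sps and pps are the parameter sets taken from the avcC atom, submitted before any frame data:
int inIndex = decoder.dequeueInputBuffer(10000);
if (inIndex >= 0) {
    ByteBuffer buf = decoder.getInputBuffer(inIndex);
    byte[] startCode = {0, 0, 0, 1};
    buf.put(startCode).put(sps).put(startCode).put(pps); // both in one buffer
    decoder.queueInputBuffer(inIndex, 0, buf.position(), 0,
            MediaCodec.BUFFER_FLAG_CODEC_CONFIG);
}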
If your file has got more SPS/PPS inside each frame, you should just be able to pass them as part of the frame, and most decoders should be able to cope with it (especially if it's the same SPS/PPS as before and not a configuration change).
Thus: Pass all NAL units belonging to one sample in one single buffer, but with all NAL unit length headers rewritten to startcodes. And to work with MP4 files that don't happen to have SPS/PPS inside the stream itself, parse the avcC atom (I don't know in which format mp4parser returns this) and pass the SPS and PPS with startcodes to the decoder (either via MediaFormat as "csd-0" or as the first buffer, with BUFFER_FLAG_CODEC_CONFIG set).
Getting -1 is normal; just carry on decoding and you should see something on the screen. As long as it doesn't throw an IllegalStateException, just carry on decoding.
I have a project where I have been asked to display a video stream in Android. The stream is raw H.264, and I am connecting to a server and will receive a byte stream from it.
Basically I'm wondering is there a way to send raw bytes to a decoder in android and display it on a surface?
I have been successful in decoding H.264 wrapped in an mp4 container using the new MediaCodec and MediaExtractor APIs in Android 4.1; unfortunately, I have not found a way to decode a raw H.264 file or stream using these APIs.
I understand that one way is to compile and use FFmpeg, but I'd rather use a built-in method that can use HW acceleration. I also understand that RTSP streaming is supported in Android, but that is not an option. The Android version is not an issue.
I can't provide any code for this unfortunately, but I'll do my best to explain it based on how I got it to work.
So here is my overview of how I got raw H.264 encoded video to work using the MediaCodec class.
Using the link above, there is an example of getting the decoder set up and how to use it; you will need to configure it for decoding H.264 AVC.
The format of H.264 is that it is made up of NAL units, each starting with a start prefix of three bytes with the values 0x00, 0x00, 0x01, and each unit has a different type depending on the value of the 4th byte, right after these 3 starting bytes. One NAL unit IS NOT one frame in the video; each frame is made up of a number of NAL units.
Basically I wrote a method that finds each individual unit and passes it to the decoder (one NAL unit being the starting prefix and any bytes thereafter up until the next starting prefix), roughly as sketched below.
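A rough sketch of that splitting, assuming the whole buffer is available and three-byte start prefixes as described above (feedDecoder is a hypothetical helper that queues one input buffer):
// Return the offset of the next 0x00 0x00 0x01 prefix, or the end of the data.
private static int findNextStartCode(byte[] data, int from) {
    for (int i = from; i + 3 <= data.length; i++) {
        if (data[i] == 0 && data[i + 1] == 0 && data[i + 2] == 1) return i;
    }
    return data.length;
}

// Walk the buffer and hand each NAL unit [start, next) to the decoder.
int start = findNextStartCode(stream, 0);
while (start < stream.length) {
    int next = findNextStartCode(stream, start + 3);
    feedDecoder(stream, start, next - start);
    start = next;
}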
Now if you have the decoder set up for decoding H.264 AVC and have an InputBuffer from the decoder, then you are ready to go. You need to fill this InputBuffer with a NAL unit, pass it back to the decoder, and continue doing this for the length of the stream.
But, to make this work, I had to pass the decoder an SPS (Sequence Parameter Set) NAL unit first. This unit has a byte value of 0x67 after the starting prefix (the 4th byte), and on some devices the decoder would crash unless it received this unit first.
Basically, until you find this unit, ignore all other NAL units and keep parsing the stream until you get it; then you can pass all other units to the decoder.
Some devices didn't need the SPS first and some did, but you are better off passing it in first.
Now if you had a surface that you passed to the decoder when you configured it, then once it gets enough NAL units for a frame it should display it on the surface.
You can download the raw H.264 from the server, then offer it via a local HTTP server running on the phone, and then let VLC for Android play it back from that HTTP server. You should use VLC's http/h264:// scheme to force the demuxer to raw H.264 (if you don't force the demuxer, VLC may not be able to recognize the stream, even when the MIME type returned by the HTTP server is set correctly). See
https://github.com/rauljim/tgs-android/blob/integrate_record/src/com/tudelft/triblerdroid/first/VideoPlayerActivity.java#L211
for an example on how to create an Intent that will launch VLC.
Note: raw H.264 apparently has no timing info, so VLC will play it as fast as possible.
Embedding it in MPEG-TS first would be better; I haven't found an Android lib that will do that yet.
Here are the resources I've found helpful in a similar project:
This video has been super insightful for understanding, at a high level, how MediaCodec handles raw h.264 streams.
This thread goes into a bit more detail about handling the SPS/PPS NAL units specifically. As was mentioned above, you need to separate individual NAL units using the start prefix, and then hand the remaining data to the MediaCodec.
This repo (libstreaming) is a great example of decoding an H264 stream in Android using RTSP/RTP for transmission.