Decoding raw H.264 stream in Android?

I have a project where I have been asked to display a video stream in android, the stream is raw H.264 and I am connecting to a server and will receive a byte stream from the server.
Basically I'm wondering is there a way to send raw bytes to a decoder in android and display it on a surface?
I have been successful in decoding H.264 wrapped in an mp4 container using the new MediaCodec and MediaExtractor APIs in Android 4.1; unfortunately I have not found a way to decode a raw H.264 file or stream using these APIs.
I understand that one way is to compile and use FFmpeg, but I'd rather use a built-in method that can use HW acceleration. I also understand RTSP streaming is supported in Android, but this is not an option. Android version is not an issue.

I can't provide any code for this unfortunately, but I'll do my best to explain it based on how I got it to work.
So here is my overview of how I got raw H.264 encoded video to work using the MediaCodec class.
The MediaCodec documentation includes an example of getting the decoder set up and how to use it; you will need to configure it for decoding H.264 AVC.
The format of H.264 is that it's made up of NAL Units, each starting with a start prefix of three bytes with the values 0x00, 0x00, 0x01, and each unit has a different type depending on the value of the 4th byte right after these 3 starting bytes. One NAL Unit IS NOT one frame in the video; each frame is made up of a number of NAL Units.
Basically I wrote a method that finds each individual unit and passes it to the decoder (one NAL Unit being the starting prefix and any bytes thereafter, up until the next starting prefix).
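Here is a minimal sketch of such a splitter, assuming the whole stream is available in a byte array and the three-byte startcode form (some streams also use a four-byte 0x00 0x00 0x00 0x01 prefix); queueToDecoder() is a hypothetical helper standing in for the input-buffer handoff described below:

// Scan for the next 0x00 0x00 0x01 start prefix at or after 'from'.
int findStartPrefix(byte[] data, int from) {
    for (int i = from; i + 2 < data.length; i++) {
        if (data[i] == 0 && data[i + 1] == 0 && data[i + 2] == 1) return i;
    }
    return -1;
}

// Split the stream into NAL units (prefix included) and hand each to the decoder.
void feedNalUnits(byte[] stream) {
    int start = findStartPrefix(stream, 0);
    while (start >= 0) {
        int next = findStartPrefix(stream, start + 3);
        int end = (next >= 0) ? next : stream.length;
        int nalType = stream[start + 3] & 0x1F; // type sits in the byte after the prefix (0x67 -> SPS)
        queueToDecoder(stream, start, end - start); // hypothetical helper, see below
        start = next;
    }
}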
Now if you have the decoder set up for decoding H.264 AVC and have an InputBuffer from the decoder, then you are ready to go. You fill this InputBuffer with a NAL Unit, pass it back to the decoder, and continue doing this for the length of the stream.
But, to make this work I had to pass the decoder an SPS (Sequence Parameter Set) NAL Unit first. This unit has a byte value of 0x67 after the starting prefix (the 4th byte); on some devices the decoder would crash unless it received this unit first.
Basically until you find this unit, ignore all other NAL Units and keep parsing the stream until you get this unit, then you can pass all other units to the decoder.
Some devices didn't need the SPS first and some did, but you are better off passing it in first.
Now if you had a surface that you passed to the decoder when you configured it, then once it gets enough NAL units for a frame it should display it on the surface.
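To tie it together, here is a hedged sketch of the setup and feed loop using the pre-API-21 synchronous pattern (mySurface, width, height, streaming and nextNalUnit() are hypothetical placeholders):

MediaCodec decoder = MediaCodec.createDecoderByType("video/avc");
MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
decoder.configure(format, mySurface, null, 0); // pass a Surface to render decoded frames
decoder.start();

ByteBuffer[] inputBuffers = decoder.getInputBuffers();
MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
while (streaming) {
    int inIndex = decoder.dequeueInputBuffer(10000);
    if (inIndex >= 0) {
        byte[] nal = nextNalUnit(); // one start prefix + payload, SPS first
        inputBuffers[inIndex].clear();
        inputBuffers[inIndex].put(nal);
        // raw H.264 carries no timing info, so a pts of 0 is used here
        decoder.queueInputBuffer(inIndex, 0, nal.length, 0, 0);
    }
    int outIndex = decoder.dequeueOutputBuffer(info, 10000);
    if (outIndex >= 0) {
        decoder.releaseOutputBuffer(outIndex, true); // true = render to the Surface
    }
}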

You can download the raw H.264 from the server, then serve it via a local HTTP server running on the phone, and let VLC for Android play it back from that HTTP server. You should use VLC's http/h264:// scheme to force the demuxer to raw H.264 (if you don't force the demuxer, VLC may not be able to recognize the stream, even when the MIME type returned by the HTTP server is set correctly). See
https://github.com/rauljim/tgs-android/blob/integrate_record/src/com/tudelft/triblerdroid/first/VideoPlayerActivity.java#L211
for an example on how to create an Intent that will launch VLC.
Note: raw H.264 apparently has no timing info, so VLC will play it back as fast as possible.
Embedding it in MPEG-TS first would be better; I haven't found an Android lib that will do that yet.
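For completeness, a minimal sketch of such a launch intent; the local address, port and path here are made up, and the linked file shows the exact form VLC expects:

// Hand the phone-local HTTP stream to VLC, using the http/h264:// prefix
// so the demuxer is forced to raw H.264 (hypothetical address and port).
Intent intent = new Intent(Intent.ACTION_VIEW);
intent.setDataAndType(Uri.parse("http/h264://127.0.0.1:8080/stream"), "video/h264");
startActivity(intent);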

Here are the resources I've found helpful in a similar project:
This video has been super insightful in understanding how MediaCodec handles raw h.264 streams at a high level.
This thread goes into a bit more detail as to handling the SPS/PPS NALUs specifically. As was mentioned above, you need to separate individual NAL Units using the start prefix, and then hand the remaining data to the MediaCodec.
This repo (libstreaming) is a great example of decoding an H264 stream in Android using RTSP/RTP for transmission.

Related

how to retrieve NV21 data from DJI camera Phantom 3 Professional drone

As I described in a previous post, I'm working on an Android mobile app for real-time augmented visualization of a drone's camera view (specifically I'm working on a DJI Phantom 3 Professional with its SDK), using the Wikitude framework for the AR part. Thanks to Alex's response, I implemented my own Wikitude Input Plugin in combination with DJI's Video Stream Decoding.
I have some issues now. First of all, the "DJI's Video Stream Decoding" demo uses FFmpeg for video frame parsing and MediaCodec for hardware decoding. So it parses the video frames, decodes the raw video stream data from the DJI camera, and outputs the YUV data. You advised me to "get the raw video data from the dji sdk and pass it to the Wikitude SDK": since the Wikitude Input Plugin needs YUV 420 format, arranged to be compliant with the NV21 standard, in order to provide the custom camera, I should pass it the YUV data output of the MediaCodec, right?
About this point, I tried to retrieve ByteBuffers from the MediaCodec output (this is possible by setting the Surface parameter to null in the configure() method, which has the effect of invoking a callback and passing the buffers out to an external listener), but I'm having some issues with colours in the visualization: the decoded video colour is not right (blue and red seem to be reversed, and there is too much noise when the camera moves). (Please note that when I pass a non-null Surface, after the instruction codec.releaseOutputBuffer(outIndex, true), MediaCodec renders the frames on it and shows the video stream properly; but I need to pass the video stream to the Wikitude plugin, so I must set the Surface to null.)
I tried setting different MediaFormat.KEY_COLOR_FORMAT values but none of them works properly. How can I solve this?
When decoding into bytebuffers with MediaCodec, you can't decide what color format the buffer uses; the decoder decides, and you have to deal with it. Each decoder can use a different format; some of them can be a standard format like COLOR_FormatYUV420Planar (corresponding to I420) or COLOR_FormatYUV420SemiPlanar (corresponding to NV12 - not NV21), while others can use completely proprietary formats.
See e.g. https://android.googlesource.com/platform/cts/+/jb-mr2-release/tests/tests/media/src/android/media/cts/EncodeDecodeTest.java#401 for an example on what formats the decoder can return that are supported, and https://android.googlesource.com/platform/cts/+/jb-mr2-release/tests/tests/media/src/android/media/cts/EncodeDecodeTest.java#963 for a reference showing that it is ok for decoders to return private formats.
You can have a look at e.g. http://git.videolan.org/?p=vlc.git;a=blob;f=modules/codec/omxil/qcom.c;h=301e9150ae66075ca264e83566504802ed57578c;hb=bdc690e9c0e2516c00a6d3733a77a87a25d9b6e3 for an example on how to interpret one common proprietary color format.
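As a starting point, here is a hedged sketch of inspecting which format the decoder actually picked (the conversion to NV21 then depends on what is reported):

// After dequeueOutputBuffer() returns INFO_OUTPUT_FORMAT_CHANGED,
// read the color format the decoder chose before converting anything.
MediaFormat outFormat = codec.getOutputFormat();
int colorFormat = outFormat.getInteger(MediaFormat.KEY_COLOR_FORMAT);
if (colorFormat == MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar) {
    // NV12: same layout as NV21 but with U and V swapped -> swap each chroma byte pair
} else if (colorFormat == MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar) {
    // I420: interleave the V and U planes to build NV21's VU plane
} else {
    // likely a proprietary format; see the qcom.c link above for one example
}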

Media codec decoder and playback

I'm trying to show a live preview from one Android device to another.
Here is what I did,
Sender: 1. Camera frame (YUV) -> 2. MediaCodec (encode to H.264 byte[]) -> 3. MediaMuxer -> 4. mp4
I'm sending the output of the media encoder via a socket connection.
Receiver: 5. byte[] via socket -> 6. MediaCodec (decoder) -> 7. Play.
Up to step 5 everything works fine.
However, I'm not able to decode the byte[]. What is missing here? I guess I'm not able to send (I don't know how to send) SPS and PPS properly! Also, how can I test that what I'm sending is correct data?
Thanks.
You are muxing the encoded H.264 video into mp4 (a container format). On the decoding side, however, you are not using a demuxer (or parser/splitter). The MediaCodec decoder takes an elementary video stream, not a container format.
Use MediaExtractor to demux on Android (https://developer.android.com/reference/android/media/MediaExtractor.html).
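A minimal sketch of that path on the receiver, assuming the received mp4 has first been written to a local file (localMp4Path and surface are hypothetical), since MediaExtractor needs seekable input for mp4:

MediaExtractor extractor = new MediaExtractor();
extractor.setDataSource(localMp4Path);
MediaCodec decoder = null;
for (int i = 0; i < extractor.getTrackCount(); i++) {
    MediaFormat trackFormat = extractor.getTrackFormat(i);
    String mime = trackFormat.getString(MediaFormat.KEY_MIME);
    if (mime.startsWith("video/")) {
        extractor.selectTrack(i);
        decoder = MediaCodec.createDecoderByType(mime); // csd-0/csd-1 come with the track format
        decoder.configure(trackFormat, surface, null, 0);
        decoder.start();
        break;
    }
}
// Then loop: readSampleData() into a dequeued input buffer, queueInputBuffer()
// with extractor.getSampleTime() as the pts, and extractor.advance().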

mediacodec decode h264 stream limitation

I am using MediaCodec to decode an H.264 stream on a Samsung S6, Android 5.1.1, and found that the input buffer to MediaCodec must start with "0001" (and there is no need to set PPS/SPS), or ACodec will report an error.
I also tried using MediaExtractor to play an mp4 file; that works fine, but the buffers it hands to MediaCodec do not start with "0001".
I don't know why decoding an H.264 stream has such a limitation. Currently I need to analyze the stream from the socket and cut the data into small packages (each starting with 0001) and then give them to MediaCodec, but that is inefficient.
MediaFormat format = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, 1024, 1024);
Some specific decoders may also be able to decode H264 NAL units in the "mp4" format (with a different kind of startcode), but that's not guaranteed across all devices.
It may be that Samsung's version of MediaExtractor returns it in this format, if they know that their own decoder can handle it. There is at least earlier precedent that Samsung did the same, nonstandard thing with timestamps with their version of MediaExtractor, see e.g. https://code.google.com/p/android/issues/detail?id=74356.
(Having MediaExtractor return data that only the current device's decoder can handle is wrong IMO, though, since one may want to use MediaExtractor to read a file but send the compressed data over the network to another device for decoding, and in these cases, returning data in a nonstandard format is wrong.)
As fadden wrote, MediaCodec operates on full NAL units though, so you need to provide data in this format (even if you think it feels inefficient). If you receive data over a socket in a format where this information (about frame boundaries) isn't easily available, then that's an issue with your protocol format (e.g., implementing RTP reception is not easy!), not with MediaCodec itself - it's a quite common limitation to need to have full frames before decoding, instead of being able to feed random chunks until you have a full frame. This shouldn't be inefficient unless your own implementation of it is inefficient.
In general, Android will expect NAL units for each input. For some devices I have found that setting csd-0/csd-1 on the MediaFormat for H.264 does not work consistently. But if you feed each of the parameter sets as input buffers, the MediaCodec will pick it up as a format change.
int outputBufferIndex = NativeDecoder.DequeueOutputBuffer(info, 1000);
if (outputBufferIndex == (int)MediaCodec.InfoOutputFormatChanged) {
    Console.WriteLine("Format changed: {0}", NativeDecoder.OutputFormat);
} else if (outputBufferIndex >= 0) {
    CodecOutputBufferAvailable(NativeDecoder, outputBufferIndex, info);
}
Also note it is mandatory on Nexus and some Samsung devices to set:
formatDescription.SetInteger(MediaFormat.KeyWidth, SelectedPalette.Value.Width);
formatDescription.SetInteger(MediaFormat.KeyHeight, SelectedPalette.Value.Height);
formatDescription.SetInteger(MediaFormat.KeyMaxInputSize, SelectedPalette.Value.Width * SelectedPalette.Value.Height);
I am lucky that in my situation I can query these resolutions, but you can parse the resolution manually from the SPS NAL unit.
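For reference, here is a hedged sketch of pulling width/height out of an SPS. It is deliberately simplified: it ignores emulation-prevention bytes, bails out on scaling lists, and skips frame cropping (so e.g. 1080p may come back as 1920x1088); treat it as a starting point rather than a full parser:

class SpsParser {
    private final byte[] data;
    private int bitPos;
    SpsParser(byte[] sps) { this.data = sps; } // sps starts at the NAL header byte (0x67)
    private int bit() { int b = (data[bitPos >> 3] >> (7 - (bitPos & 7))) & 1; bitPos++; return b; }
    private int bits(int n) { int v = 0; while (n-- > 0) v = (v << 1) | bit(); return v; }
    private int ue() { // unsigned Exp-Golomb
        int zeros = 0;
        while (bit() == 0) zeros++;
        return (1 << zeros) - 1 + bits(zeros);
    }
    private int se() { int k = ue(); return (k & 1) == 1 ? (k + 1) / 2 : -(k / 2); }
    int[] parseWidthHeight() {
        bitPos = 8; // skip the NAL header byte
        int profileIdc = bits(8);
        bits(16); // constraint flags + level_idc
        ue(); // seq_parameter_set_id
        if (profileIdc == 100 || profileIdc == 110 || profileIdc == 122 || profileIdc == 244
                || profileIdc == 44 || profileIdc == 83 || profileIdc == 86 || profileIdc == 118) {
            if (ue() == 3) bit(); // chroma_format_idc, separate_colour_plane_flag
            ue(); ue(); // bit_depth_luma_minus8, bit_depth_chroma_minus8
            bit(); // qpprime_y_zero_transform_bypass_flag
            if (bit() == 1) throw new IllegalStateException("scaling lists not handled");
        }
        ue(); // log2_max_frame_num_minus4
        int pocType = ue();
        if (pocType == 0) {
            ue(); // log2_max_pic_order_cnt_lsb_minus4
        } else if (pocType == 1) {
            bit(); se(); se();
            int n = ue(); // num_ref_frames_in_pic_order_cnt_cycle
            for (int i = 0; i < n; i++) se();
        }
        ue(); // max_num_ref_frames
        bit(); // gaps_in_frame_num_value_allowed_flag
        int widthMbs = ue() + 1; // pic_width_in_mbs_minus1
        int heightMapUnits = ue() + 1; // pic_height_in_map_units_minus1
        int frameMbsOnly = bit(); // 1 for progressive content
        return new int[] { widthMbs * 16, (2 - frameMbsOnly) * heightMapUnits * 16 };
    }
}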
// NOTE: I am using Xamarin here, but the calls are pretty much the same. I am fairly certain there are bugs in the iOS VideoToolbox Xamarin wrapper, so keep that in mind if you're ever considering Xamarin for video decoding. It's great for everything except anything that's slightly more custom or low-level.

Android MediaCodec decoding of raw h.264

I'm having a very difficult time with MediaCodec. I've used it previously to decode a raw h.264 stream and learned a significant amount. At least I thought I had.
My stream is h.264 in Annex B format. Looking at the raw data, the structure of my NAL packet code types are as follows:
[0x09][0x09][0x06] and then 8 packets of [0x21].
[0x09][0x09][0x27][0x28][0x06] and then 8 packets of [0x21].
This is not how I am receiving them. I am attempting to build a complete Access Unit from these raw NAL unit types.
The first thing that is strange to me is the double [0x09], which is the Access Unit Delimiter packet. I am pretty sure the H.264 spec specifies only one AUD per access unit. BTW, I am able to record the raw data and play it using ffmpeg with, and without, the extra AUD. For now, I am detecting this message and stripping the first one off before sending the entire access unit to the MediaCodec.
The second thing is that I have hardcoded the SPS/PPS byte array messages [0x27/0x28], and I am setting these in the MediaFormat used to initialize the MediaCodec, similar to:
format.setByteBuffer("csd-0", ByteBuffer.wrap( mySPS ));
format.setByteBuffer("csd-1", ByteBuffer.wrap( myPPS ));
My video stream provider vendor tells me the video is 1280 x 720; however, when I convert it to an mp4 file, the metadata says it's 960 x 720. Another oddity.
Changing these different parameters around, I am still unable to get a valid buffer index in my Thread that processes the decoder output (dequeueOutputBuffer returns -1). I have also varied the timeout for this to no avail. If I manually set the SPS/PPS as the first packet NOT using the example above, I do get the -3 "output buffers have changed" which is meaningless since I am using API 20. But everything else I get returned is -1.
I've read about the Emulation Prevention Byte encoding of h.264. I am able to strip this byte out and send to the MediaCodec. Doesn't seem to make a difference. Also, the documentation for MediaCodec doesn't explicitly say if the code is expecting the EPB to be stripped out or not ... ?
Other than the video frame resolution, the only other thing that is different than my previous success is the existence of the SEI packet type [0x06]. I'm not sure if I should be doing something special with this or not?
I know there are a number of folks who have used MediaCodec have had issues with it, mostly because the documentation is not very good. Can anyone offer any other advice as to what I could be doing wrong?

Decode mp4/h.264 using MediaCodec without MediaExtractor, expected access unit format

I am trying to use the MediaCodec API for decoding without using the MediaExtractor API. Instead, I use mp4parser to get the samples from the mp4 files. For now, I am only using h.264 / AVC coded video content.
The official documentation of the MediaCodec API states:
buffers do not start and end on arbitrary byte boundaries, this is not a stream of bytes, it's a stream of access units.
Meaning, I have to feed access units to the decoder. However, I am missing some details in this information:
For h.264, in an mp4 sample, there can be multiple NAL units, that are each preceded by 4 (default) bytes specifying the NAL unit length.
Now my questions:
There can be mp4 samples where codec config NAL units (SPS, PPS) are mixed with NAL units containing coded (parts of) frames. In that case, should I pass the flag BUFFER_FLAG_CODEC_CONFIG in the call to queueInputBuffer()?
There can also be other (additional) NAL units in mp4 samples, like SEI or access unit delimiter NAL units. What about those? No problem?
I tried different kinds of possibilities, but all the feedback I get from Android is that the calls to dequeueOutputBuffer() time out (or don't return, if I pass -1 as the timeout parameter). As a result, I don't seem to have a way to troubleshoot this issue.
Any advice on what to do or where to look is of course very welcome as well.
The NAL length prefixes that specify the NAL unit length need to be converted to Annex-B startcodes (bytes 0x00, 0x00, 0x00, 0x01) before passing to MediaCodec for decoding. (Some decoders might actually accept the MP4 format straight away, but it's not too common.)
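A hedged sketch of that rewrite, assuming the default 4-byte length field (in which case the startcode is the same size and the sample can be patched in place):

// Rewrite every 4-byte NAL length prefix in an mp4 sample to an
// Annex-B startcode (0x00 0x00 0x00 0x01), in place.
static void lengthPrefixedToAnnexB(byte[] sample) {
    int pos = 0;
    while (pos + 4 <= sample.length) {
        int nalLen = ((sample[pos] & 0xFF) << 24) | ((sample[pos + 1] & 0xFF) << 16)
                   | ((sample[pos + 2] & 0xFF) << 8) | (sample[pos + 3] & 0xFF);
        sample[pos] = 0; sample[pos + 1] = 0; sample[pos + 2] = 0; sample[pos + 3] = 1;
        pos += 4 + nalLen;
    }
}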
The SPS/PPS that is stored in the avcC atom in the file also needs to be converted to use Annex-B startcodes. Note that the avcC atom contains a few other fields that you don't need to pass on to the decoder. You can either pass the SPS and PPS packed in one buffer (with startcodes before each of them) with the BUFFER_FLAG_CODEC_CONFIG flag set before sending any actual frames, or pass them (with Annex-B startcodes) in the MediaFormat you use to configure the decoder (either in one ByteBuffer with the key "csd-0", or in two separate keys as "csd-0" and "csd-1").
If your file has got more SPS/PPS inside each frame, you should just be able to pass them as part of the frame, and most decoders should be able to cope with it (especially if it's the same SPS/PPS as before and not a configuration change).
Thus: Pass all NAL units belonging to one sample in one single buffer, but with all NAL unit length headers rewritten to startcodes. And to work with MP4 files that don't happen to have SPS/PPS inside the stream itself, parse the avcC atom (I don't know in which format mp4parser returns this) and pass the SPS and PPS with startcodes to the decoder (either via MediaFormat as "csd-0" or as the first buffer, with BUFFER_FLAG_CODEC_CONFIG set).
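And a hedged sketch of the MediaFormat route, where sps and pps are the raw parameter-set byte arrays pulled out of the avcC atom (hypothetical variables, as are width, height and surface):

MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
ByteBuffer csd0 = ByteBuffer.allocate(4 + sps.length);
csd0.put(new byte[] { 0, 0, 0, 1 }).put(sps);
csd0.flip();
ByteBuffer csd1 = ByteBuffer.allocate(4 + pps.length);
csd1.put(new byte[] { 0, 0, 0, 1 }).put(pps);
csd1.flip();
format.setByteBuffer("csd-0", csd0);
format.setByteBuffer("csd-1", csd1);
decoder.configure(format, surface, null, 0);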
Getting -1 is normal; just carry on decoding and you should see something on the screen. As long as it doesn't throw an IllegalStateException, just carry on decoding.
