MediaCodec decoder and playback - Android

I'm trying to show a live camera preview from one Android device to another.
Here is what I did:
Sender: 1. Camera frame (YUV) -> 2. MediaCodec (encode to H.264 byte[]) -> 3. MediaMuxer -> 4. mp4
I'm sending the output of the media encoder over a socket connection.
Receiver: 5. byte[] via socket -> 6. MediaCodec (decoder) -> 7. Play.
Everything works fine up to step 5.
However, I'm not able to decode the byte[]. What is missing here? I suspect I'm not sending the SPS and PPS properly (I don't know how to send them). Also, how can I test that what I'm sending is correct data?
Thanks.

You are muxing the encoded H.264 video into mp4 (a container format). On the decoding side, however, you are not using a demuxer (or parser/splitter). A MediaCodec decoder takes an elementary video stream, not a container format.
Use MediaExtractor to demux on Android (https://developer.android.com/reference/android/media/MediaExtractor.html).
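A minimal sketch of the receiving side under that approach, assuming you first write the incoming bytes to a file (or any seekable source) so MediaExtractor can read them; the class name and path are placeholders:

    import java.nio.ByteBuffer;
    import android.media.MediaCodec;
    import android.media.MediaExtractor;
    import android.media.MediaFormat;
    import android.view.Surface;

    public class Mp4Player {
        // Demux the received mp4 with MediaExtractor and feed the video track
        // to a MediaCodec decoder. The path is a placeholder: write the bytes
        // you receive over the socket to a file first.
        public void play(String path, Surface surface) throws Exception {
            MediaExtractor extractor = new MediaExtractor();
            extractor.setDataSource(path);

            MediaCodec decoder = null;
            for (int i = 0; i < extractor.getTrackCount(); i++) {
                MediaFormat format = extractor.getTrackFormat(i);
                String mime = format.getString(MediaFormat.KEY_MIME);
                if (mime.startsWith("video/")) {
                    extractor.selectTrack(i);
                    decoder = MediaCodec.createDecoderByType(mime);
                    // The format from the extractor already carries csd-0/csd-1
                    // (SPS/PPS), so there is nothing extra to send.
                    decoder.configure(format, surface, null, 0);
                    break;
                }
            }
            if (decoder == null) return; // no video track found
            decoder.start();

            MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
            ByteBuffer[] inputBuffers = decoder.getInputBuffers();
            boolean inputDone = false;
            while (true) {
                if (!inputDone) {
                    int inIndex = decoder.dequeueInputBuffer(10000);
                    if (inIndex >= 0) {
                        int size = extractor.readSampleData(inputBuffers[inIndex], 0);
                        if (size < 0) {
                            decoder.queueInputBuffer(inIndex, 0, 0, 0,
                                    MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                            inputDone = true;
                        } else {
                            decoder.queueInputBuffer(inIndex, 0, size,
                                    extractor.getSampleTime(), 0);
                            extractor.advance();
                        }
                    }
                }
                int outIndex = decoder.dequeueOutputBuffer(info, 10000);
                if (outIndex >= 0) {
                    decoder.releaseOutputBuffer(outIndex, true); // render to the Surface
                    if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) break;
                }
            }
            decoder.stop();
            decoder.release();
            extractor.release();
        }
    }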

Related

How to mux a multi-slice H.264 stream via Android MediaMuxer

I'm trying to mux an mp4 file using MediaMuxer. I've already set the SPS and PPS and replaced the 4-byte header. When each H.264 frame is formed of a single slice, everything is OK, but when I change it to multi-slice, the result is wrong.
It sounds like multi-slice is not fully supported in Android's MediaMuxer; it causes compatibility issues when the file is played on an iPhone/QuickTime player.
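For illustration, this is roughly the conversion the question describes: in an mp4 (AVCC) sample, every NAL unit carries a 4-byte length prefix instead of a start code, so with multi-slice frames every slice NALU in the access unit must be rewritten, not just the first. A hedged sketch, assuming all start codes are the 4-byte 0x00000001 form:

    import java.io.ByteArrayOutputStream;

    public class AnnexB {
        // Convert one Annex-B access unit (possibly several slice NALUs) to the
        // AVCC layout: replace every 4-byte start code with a 4-byte big-endian
        // NALU length. Assumes all start codes in the buffer are 4 bytes long.
        public static byte[] toAvcc(byte[] au) {
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            int i = 0;
            while (i < au.length) {
                int next = au.length; // end of this NALU = next start code (or end of buffer)
                for (int j = i + 4; j + 3 < au.length; j++) {
                    if (au[j] == 0 && au[j + 1] == 0 && au[j + 2] == 0 && au[j + 3] == 1) {
                        next = j;
                        break;
                    }
                }
                int nalLen = next - (i + 4);
                out.write((nalLen >>> 24) & 0xFF);
                out.write((nalLen >>> 16) & 0xFF);
                out.write((nalLen >>> 8) & 0xFF);
                out.write(nalLen & 0xFF);
                out.write(au, i + 4, nalLen);
                i = next;
            }
            return out.toByteArray();
        }
    }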

Consume RTSP H.264 encoded stream with Android MediaCodec

I'm trying to display a video offered via RTSP (RTP/H.264 encoded) from a GStreamer server. I can open the stream with VLC and ffmpeg.
I've implemented RTSP and RTP in Java/Android already.
Every time I put an H.264 frame (= the payload of an RTP packet) into the MediaCodec, I get MediaCodec.INFO_TRY_AGAIN_LATER when dequeuing the output buffers of the MediaCodec.
I tried prepending 0x00, 0x00, 0x00, 0x01 to the H.264 packets, but this did not solve the issue. How can I find out what's going wrong? Are there any examples of how to get the H.264 frame from RTP on Android/Java and pass it over to the MediaCodec?
Edit:
I dumped the H.264 frames to file with FFmpeg. When I push these files to the MediaCodec on Android (with SPS and PPS set), it does work for the sample RTSP stream rtsp://184.72.239.149/vod/mp4:BigBuckBunny_115k.mov, but not for my GStreamer-created RTSP stream. All packets (files) produced with GStreamer have an Access Unit Delimiter (0x00 0x00 0x00 0x01 0x09...). The sample stream does not have these AUDs. I tried removing the AUDs before passing the frames to the MediaCodec, but still no luck.
Thank you
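For reference, "setting SPS and PPS" here means handing them to the decoder as codec-specific data before configure(). A minimal sketch, where the sps/pps arrays (each including its 0x00000001 start code) and the dimensions are placeholders you must fill from your own stream:

    import java.nio.ByteBuffer;
    import android.media.MediaCodec;
    import android.media.MediaFormat;
    import android.view.Surface;

    public class DecoderSetup {
        // Build and start an H.264 decoder with explicit SPS/PPS. csd-0 carries
        // the SPS NAL unit and csd-1 the PPS NAL unit, each with its start code.
        public static MediaCodec createAvcDecoder(byte[] sps, byte[] pps,
                Surface surface, int width, int height) throws Exception {
            MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
            format.setByteBuffer("csd-0", ByteBuffer.wrap(sps));
            format.setByteBuffer("csd-1", ByteBuffer.wrap(pps));
            MediaCodec decoder = MediaCodec.createDecoderByType("video/avc");
            decoder.configure(format, surface, null, 0);
            decoder.start();
            return decoder;
        }
    }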

Android: RTP issue with H264, MediaCodec as an encoder

I am wrapping H.264 video in RTP for network streaming. To do this, I am using Android's MediaCodec configured as an encoder to generate the H.264; the RTP code is my own.
When I read the stream (over my local network to the streaming server), I can tell when a keyframe is received because the video sharpens up to what I would expect. However, as I move my hand across the video view, I see a significant amount of pixelation until the next keyframe comes in. My video is 960x720 at 30 fps, and I am sending a keyframe every 2 seconds.
I can pump the raw H.264 (not wrapped in RTP) from the MediaCodec encoder over a datagram socket and play it back with ffplay, and there are no such artifacts when I do this, so it has to be something with the RTP encapsulation. I verified that the start/stop bits of my FU-A fragmentation packets are correct, as is the RTP header marker bit. I'm at a loss as to what else could be the problem.
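For readers debugging the same thing, here is roughly what those FU-A fields look like per RFC 6184; a hedged sketch of fragmenting one NALU (the RTP header itself and the send path are omitted, and maxPayload is a placeholder):

    import java.util.ArrayList;
    import java.util.List;

    public class FuA {
        // Split one H.264 NALU (start code already stripped) into FU-A payloads
        // per RFC 6184. Only meant for NALUs that are too big for a single RTP
        // packet; maxPayload is the space left in the packet for the payload.
        public static List<byte[]> fragment(byte[] nalu, int maxPayload) {
            List<byte[]> out = new ArrayList<byte[]>();
            byte nalHeader = nalu[0];
            byte fuIndicator = (byte) ((nalHeader & 0xE0) | 28); // keep F+NRI, type 28 = FU-A
            int offset = 1; // the original NAL header is rebuilt by the receiver
            while (offset < nalu.length) {
                int chunk = Math.min(maxPayload - 2, nalu.length - offset);
                byte fuHeader = (byte) (nalHeader & 0x1F); // original NAL type
                if (offset == 1) fuHeader |= 0x80;                   // S (start) bit
                if (offset + chunk == nalu.length) fuHeader |= 0x40; // E (end) bit
                byte[] pkt = new byte[chunk + 2];
                pkt[0] = fuIndicator;
                pkt[1] = fuHeader;
                System.arraycopy(nalu, offset, pkt, 2, chunk);
                out.add(pkt);
                offset += chunk;
            }
            return out;
        }
    }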

How do you encode a video with audio using MediaCodec

So, I have encoded an H.264 elementary stream with MediaCodec by collecting the frames in the Camera's onPreviewFrame method (following Encoding H.264 from camera with Android MediaCodec). Then I generated an mp4 video from the H.264 stream. Unfortunately, it doesn't have any audio in it.
I notice that MediaCodec should also allow encoding audio, because it has settings for audio codecs.
Now, is there any way to add audio to the H.264 stream?
Thanks for reading; I'd appreciate any comments or suggestions.
A given instance of MediaCodec will encode either video or audio. You would need to create a second instance of MediaCodec to do the audio encoding, and then combine the streams with the MediaMuxer class (introduced in Android 4.3, API 18).
There are examples of using MediaMuxer on bigflake, but at the time I'm writing this there isn't one that demonstrates combining audio and video (they're just "muxing" the video into a .mp4 file). It should be enough to demonstrate how to use the class though.
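As a rough sketch of that combination step, assuming both encoders are already running and you took each output MediaFormat from its INFO_OUTPUT_FORMAT_CHANGED result (the output path and the commented drain loop are placeholders):

    import android.media.MediaFormat;
    import android.media.MediaMuxer;

    public class AvMuxer {
        // One MediaMuxer, two tracks: one for the video encoder's output and
        // one for the audio encoder's output (API 18+).
        public static void mux(MediaFormat videoFormat, MediaFormat audioFormat)
                throws Exception {
            MediaMuxer muxer = new MediaMuxer("/sdcard/out.mp4",
                    MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
            int videoTrack = muxer.addTrack(videoFormat);
            int audioTrack = muxer.addTrack(audioFormat);
            muxer.start();
            // In your codec loop, write each encoded buffer you drain to its track:
            //   muxer.writeSampleData(videoTrack, encodedBuffer, bufferInfo);
            //   muxer.writeSampleData(audioTrack, encodedBuffer, bufferInfo);
            // Once both encoders signal BUFFER_FLAG_END_OF_STREAM:
            muxer.stop();
            muxer.release();
        }
    }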

Decoding a raw H.264 stream in Android?

I have a project where I have been asked to display a video stream in Android. The stream is raw H.264; I connect to a server and receive a byte stream from it.
Basically, I'm wondering: is there a way to send raw bytes to a decoder in Android and display them on a Surface?
I have been successful in decoding H.264 wrapped in an mp4 container using the new MediaCodec and MediaExtractor APIs in Android 4.1; unfortunately, I have not found a way to decode a raw H.264 file or stream using these APIs.
I understand that one way is to compile and use FFmpeg, but I'd rather use a built-in method that can use hardware acceleration. I also understand that RTSP streaming is supported in Android, but this is not an option. The Android version is not an issue.
Unfortunately I can't provide any code for this, but I'll do my best to explain it based on how I got it to work.
So here is my overview of how I got raw H.264 encoded video to work using the MediaCodec class.
Using the link above, there is an example of getting the decoder set up and how to use it; you will need to configure it for decoding H.264 AVC.
The format of H.264 is that it's made up of NAL units, each starting with a three-byte start prefix with the values 0x00, 0x00, 0x01, and each unit has a different type depending on the value of the 4th byte, right after these 3 starting bytes. One NAL unit is NOT one frame of the video; each frame is made up of a number of NAL units.
Basically, I wrote a method that finds each individual unit and passes it to the decoder (one NAL unit being the start prefix and any bytes thereafter, up until the next start prefix).
Now, if you have the decoder set up for decoding H.264 AVC and have an input buffer from the decoder, you are ready to go. You need to fill this input buffer with a NAL unit, pass it back to the decoder, and continue doing this for the length of the stream.
But to make this work, I had to pass the decoder an SPS (Sequence Parameter Set) NAL unit first. This unit has the byte value 0x67 after the start prefix (as the 4th byte); on some devices the decoder would crash unless it received this unit first.
Basically, until you find this unit, ignore all other NAL units; keep parsing the stream until you get this unit, then you can pass all other units to the decoder.
Some devices didn't need the SPS first and some did, but you are better off passing it in first.
Now, if you passed a Surface to the decoder when you configured it, then once it gets enough NAL units for a frame it should display it on the Surface.
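A hedged sketch of the parsing and feeding steps described above, assuming the 3-byte 0x00 0x00 0x01 prefix and a decoder that is already configured and started:

    import java.nio.ByteBuffer;
    import android.media.MediaCodec;

    public class NalFeeder {
        // Return the index of the next 0x00 0x00 0x01 start prefix at or after
        // `from`, or data.length if there is none.
        public static int nextStartPrefix(byte[] data, int from) {
            for (int i = from; i + 2 < data.length; i++) {
                if (data[i] == 0 && data[i + 1] == 0 && data[i + 2] == 1) return i;
            }
            return data.length;
        }

        // Queue one NAL unit (start prefix included) into the decoder.
        public static void queueNalUnit(MediaCodec decoder, byte[] nal, long ptsUs) {
            int inIndex = decoder.dequeueInputBuffer(10000);
            if (inIndex >= 0) {
                ByteBuffer buf = decoder.getInputBuffers()[inIndex];
                buf.clear();
                buf.put(nal);
                decoder.queueInputBuffer(inIndex, 0, nal.length, ptsUs, 0);
            }
        }
    }
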
You can download the raw H.264 from the server, offer it via a local HTTP server running on the phone, and then let VLC for Android play it back from that HTTP server. You should use VLC's http/h264:// scheme to force the demuxer to raw H.264 (if you don't force the demuxer, VLC may not be able to recognize the stream, even when the MIME type returned by the HTTP server is set correctly). See
https://github.com/rauljim/tgs-android/blob/integrate_record/src/com/tudelft/triblerdroid/first/VideoPlayerActivity.java#L211
for an example of how to create an Intent that will launch VLC.
Note: raw H.264 apparently has no timing info, so VLC will play it back as fast as possible.
Embedding it in MPEG-TS first would be better; I haven't found an Android library that will do that yet.
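In the spirit of the linked example, the launch boils down to an ACTION_VIEW Intent. A sketch, where the URI (using the http/h264:// prefix mentioned above) and the VLC package name are assumptions to verify against your VLC version:

    import android.content.Intent;
    import android.net.Uri;

    // Inside an Activity: hypothetical launcher that asks VLC for Android to
    // play the local HTTP stream, forcing the raw H.264 demuxer via the
    // http/h264:// scheme described above. URI and package name are assumed.
    Intent intent = new Intent(Intent.ACTION_VIEW);
    intent.setDataAndType(Uri.parse("http/h264://127.0.0.1:8080/stream"), "video/h264");
    intent.setPackage("org.videolan.vlc");
    startActivity(intent);
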
Here are the resources I've found helpful in a similar project:
This video was super insightful for understanding, at a high level, how MediaCodec handles raw H.264 streams.
This thread goes into a bit more detail about handling the SPS/PPS NAL units specifically. As mentioned above, you need to separate individual NAL units using the start prefix and then hand the remaining data to the MediaCodec.
This repo (libstreaming) is a great example of decoding an H.264 stream in Android, using RTSP/RTP for transmission.
