Android: RTP issue with H264, MediaCodec as an encoder

I am wrapping H264 video with RTP for network streaming. In order to do this, I am using Android's MediaCodec configured as an encoder to generate the H264. The RTP code is my own.
When I read the stream (using my local network to the streaming server), I can tell when the keyframe is received because the video sharpens up to what I would expect. However, as I move my hand across the video view, I see a significant amount of pixelation until the next keyframe comes in. My video is 960x720, 30fps and I am sending a keyframe every 2 seconds.
I can pump the raw H264 (not wrapped in RTP) from the MediaCodec encoder through a datagram socket and play it back using ffplay. There are no such artifacts when I do this, so it has to be something in the RTP encapsulation. I verified that the start/end (S and E) bits of my FU-A fragmentation packets are correct, as is the RTP header marker bit. I'm at a loss as to what else could be the problem.
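For anyone comparing notes, here is a minimal sketch of the FU-A packetization being described (RFC 6184). The rtpHeader() and send() helpers, the MAX_PAYLOAD value and the lastNalOfFrame flag are illustrative assumptions standing in for the poster's own RTP code, not part of it:

```java
// Sketch of FU-A fragmentation per RFC 6184. Assumes `nal` is a single H.264 NAL
// unit WITHOUT its start code; rtpHeader(marker, timestamp) and send(header, payload)
// are hypothetical helpers.
static final int MAX_PAYLOAD = 1400;   // assumed payload budget per packet

void sendNal(byte[] nal, long rtpTimestamp, boolean lastNalOfFrame) {
    if (nal.length <= MAX_PAYLOAD) {
        // Single NAL unit packet; marker bit only on the last NAL of the access unit.
        send(rtpHeader(lastNalOfFrame, rtpTimestamp), nal);
        return;
    }
    int nri = nal[0] & 0x60;                 // NRI bits from the original NAL header
    int type = nal[0] & 0x1F;                // original NAL unit type
    byte fuIndicator = (byte) (nri | 28);    // 28 = FU-A
    int offset = 1;                          // the original NAL header byte is NOT repeated in the fragments
    while (offset < nal.length) {
        int chunk = Math.min(MAX_PAYLOAD - 2, nal.length - offset);
        boolean first = (offset == 1);
        boolean last = (offset + chunk == nal.length);
        // FU header: S bit on the first fragment, E bit on the last, original type in the low 5 bits.
        byte fuHeader = (byte) ((first ? 0x80 : 0x00) | (last ? 0x40 : 0x00) | type);
        byte[] payload = new byte[2 + chunk];
        payload[0] = fuIndicator;
        payload[1] = fuHeader;
        System.arraycopy(nal, offset, payload, 2, chunk);
        // All fragments of one NAL unit share the same RTP timestamp; the marker bit
        // is set only on the final packet of the access unit.
        send(rtpHeader(last && lastNalOfFrame, rtpTimestamp), payload);
        offset += chunk;
    }
}
```

If the S/E bits and marker bit really are correct, the symptom of pixelation that only clears on a keyframe is also consistent with silent UDP packet loss or with a fragment boundary that duplicates or drops the original NAL header byte, so comparing the reassembled bitstream byte-for-byte against the encoder output is worth doing.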

Related

MediaCodec decoder latency for live-streaming

I'm using MediaCodec to decode an H.264 video at 30 FPS that I receive from an RTSP live stream; the decoder runs on an Android device.
However, I see a latency in the output of the MediaCodec's decoder.
It looks like the decoder waits until it receives about 15 frames before providing the decoded frames, resulting in ~500ms latency in the rendered video.
This latency is not acceptable for my project, as the user expects to see the live video immediately when it arrives on their device.
Is there a way to configure the MediaCodec, so it doesn't buffer the incoming frames and outputs the decoded frames as soon as they are ready to be displayed?
Thanks for the help.
If possible, try to change the encoding of the videos: if the stream contains B-frames, the decoder has to buffer frames so it can reorder them, so an encoding without B-frames (for example, baseline profile) cuts down the decoder-side delay.
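On the decoder side there is no fully portable way to disable this buffering, but newer devices expose a low-latency hint. A sketch of a decoder configuration, assuming API 30+ for KEY_LOW_LATENCY and that the width, height and a Surface are already known:

```java
// Sketch only: KEY_LOW_LATENCY (API 30+) asks the codec to minimize internal
// buffering and is ignored on codecs that don't support the low-latency feature.
MediaCodec createLowLatencyAvcDecoder(int width, int height, Surface surface) throws IOException {
    MediaFormat format = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, width, height);
    if (Build.VERSION.SDK_INT >= 30) {
        format.setInteger(MediaFormat.KEY_LOW_LATENCY, 1);
    }
    format.setInteger(MediaFormat.KEY_PRIORITY, 0);   // 0 = realtime priority (API 23+)
    MediaCodec decoder = MediaCodec.createDecoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
    decoder.configure(format, surface, null, 0);
    decoder.start();
    return decoder;
}
```

Even with these hints, if the encoded stream contains B-frames the decoder must hold frames for reordering, which is why changing the encoding, as suggested above, is usually the bigger win.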

Media codec decoder and playback

I’m trying to show live preview from one android device to another.
Here is what I did,
Sender: 1. Camera frame (YUV) -> 2. MediaCodec (encode to H.264 byte[]) -> 3. MediaMuxer -> 4. mp4
I'm sending the output of the media encoder over a socket connection.
Receiver: 5. byte[] via socket -> 6. MediaCodec (decoder) -> 7. Play.
Up to step 5 everything works fine.
However, I'm not able to decode the byte[]. What is missing here? I guess I'm not sending the SPS and PPS properly (I don't know how to send them). Also, how can I test that what I'm sending is correct data?
Thanks.
You are muxing the encoded H.264 video into mp4 (a container format). However, on the decoding side you are not using a demuxer (or parser/splitter). A MediaCodec decoder can only take an elementary video stream, not a container format.
Use MediaExtractor to demux on Android (https://developer.android.com/reference/android/media/MediaExtractor.html).
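A minimal sketch of that MediaExtractor + MediaCodec pairing, assuming the mp4 has been written to a local file (mp4Path) and you already have a Surface to render to; error handling and end-of-stream signalling are trimmed:

```java
// Sketch: demux an mp4 with MediaExtractor and feed the video track to a MediaCodec
// decoder. The track format returned by MediaExtractor carries csd-0/csd-1, i.e. the
// SPS/PPS the question is missing, so configure() gets them for free.
void playMp4(String mp4Path, Surface surface) throws IOException {
    MediaExtractor extractor = new MediaExtractor();
    extractor.setDataSource(mp4Path);

    MediaFormat videoFormat = null;
    for (int i = 0; i < extractor.getTrackCount(); i++) {
        MediaFormat f = extractor.getTrackFormat(i);
        if (f.getString(MediaFormat.KEY_MIME).startsWith("video/")) {
            extractor.selectTrack(i);
            videoFormat = f;
            break;
        }
    }

    MediaCodec decoder = MediaCodec.createDecoderByType(videoFormat.getString(MediaFormat.KEY_MIME));
    decoder.configure(videoFormat, surface, null, 0);
    decoder.start();

    MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
    while (true) {
        int inIndex = decoder.dequeueInputBuffer(10_000);
        if (inIndex >= 0) {
            ByteBuffer inBuf = decoder.getInputBuffer(inIndex);
            int size = extractor.readSampleData(inBuf, 0);
            if (size < 0) break;                                 // end of stream
            decoder.queueInputBuffer(inIndex, 0, size, extractor.getSampleTime(), 0);
            extractor.advance();
        }
        int outIndex = decoder.dequeueOutputBuffer(info, 10_000);
        if (outIndex >= 0) decoder.releaseOutputBuffer(outIndex, true);   // render to the Surface
    }
    decoder.stop();
    decoder.release();
    extractor.release();
}
```

For a live socket stream it is often simpler to drop MediaMuxer on the sender entirely and ship the encoder's raw output instead, forwarding the buffer flagged BUFFER_FLAG_CODEC_CONFIG (which contains the SPS/PPS) before any frame data.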

How to implement frame skipping in MediaCodec android

I'm making an app which is using MediaCodec APIs.
The app runs on two phones. The first phone reads the video from the sdcard, uses the MediaCodec encoder to encode the frames in AVC format, and then streams the frames to another device. The second device has a MediaCodec decoder running; the decoder decodes the frames and renders them on a Surface.
The code runs fine, but after some time, as the frames get larger, the first device is sometimes unable to stream the video and the encoder stalls, reporting the following log:
E/OMX-VENC-720p( 212): Poll timedout, pipeline stalled due to client/firmware ETB: 496, EBD: 491, FTB: 492, FBD: 492
So I want to implement frame skipping on the encoder side.
What's the best way to skip frames and not stream them to the other device?
PS: On a separate note, if anyone can suggest any other way of streaming video to the other device, that would be really nice.
Please try the Intel INDE Media Pack; tutorials are at https://software.intel.com/en-us/articles/intel-inde-media-pack-for-android-tutorials. It has Camera, File and Game streaming components, which enable streaming with the help of Wowza, plus a set of samples demonstrating how to use it as a server and as a client.
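On the frame-skipping question itself, one common approach (a sketch, not taken from the answer above) is to drop raw frames before they reach the encoder: poll for an input buffer without waiting, and if none is free, skip the frame instead of letting the pipeline back up.

```java
// Sketch: drop source frames when the encoder is falling behind. `encoder` is a
// started MediaCodec AVC encoder; frameData and presentationTimeUs come from whatever
// produces the raw frames (camera callback, file decoder, etc.).
void feedOrDrop(MediaCodec encoder, byte[] frameData, long presentationTimeUs) {
    int inIndex = encoder.dequeueInputBuffer(0);   // timeout 0 = don't block
    if (inIndex < 0) {
        // No free input buffer: the encoder is busy, so drop this frame rather than
        // letting frames pile up and stall the pipeline.
        return;
    }
    ByteBuffer inBuf = encoder.getInputBuffer(inIndex);
    inBuf.clear();
    inBuf.put(frameData);
    encoder.queueInputBuffer(inIndex, 0, frameData.length, presentationTimeUs, 0);
}
```

Dropping raw frames before encoding is safe; dropping encoded frames after the encoder generally is not, because later P-frames reference the ones you removed (at best you can drop everything up to the next keyframe).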

Decoding Raw H264 stream in android?

I have a project where I have been asked to display a video stream on Android. The stream is raw H.264; I am connecting to a server and will receive a byte stream from it.
Basically I'm wondering is there a way to send raw bytes to a decoder in android and display it on a surface?
I have been successful in decoding H264 wrapped in an mp4 container using the new MediaCodec and MediaExtractor APIs in Android 4.1; unfortunately, I have not found a way to decode a raw H264 file or stream using these APIs.
I understand that one way is to compile and use FFmpeg, but I'd rather use a built-in method that can use HW acceleration. I also understand RTSP streaming is supported in Android, but this is not an option. Android version is not an issue.
I can't provide any code for this unfortunately, but I'll do my best to explain it based on how I got it to work.
So here is my overview of how I got raw H.264 encoded video to work using the MediaCodec class.
Using the link above, there is an example of getting the decoder set up and how to use it; you will need to set it up for decoding H.264 AVC.
H.264 is made up of NAL units, each starting with a start prefix of three bytes with the values 0x00, 0x00, 0x01, and each unit has a different type depending on the value of the 4th byte right after these 3 starting bytes. One NAL unit is NOT one frame of the video; each frame is made up of a number of NAL units.
Basically I wrote a method that finds each individual unit and passes it to the decoder (one NAL Unit being the starting prefix and any bytes there after up until the next starting prefix).
Now if you have the decoder setup for decoding H.264 AVC and have an InputBuffer from the decoder then you are ready to go. You need to fill this InputBuffer with a NAL Unit and pass it back to the decoder and continue doing this for the length of the stream.
But, to make this work, I had to pass the decoder an SPS (Sequence Parameter Set) NAL unit first. This unit has a byte value of 0x67 after the starting prefix (the 4th byte); on some devices the decoder would crash unless it received this unit first.
Basically until you find this unit, ignore all other NAL Units and keep parsing the stream until you get this unit, then you can pass all other units to the decoder.
Some devices didn't need the SPS first and some did, but you are better off passing it in first.
Now if you had a surface that you passed to the decoder when you configured it then once it gets enough NAL units for a frame it should display it on the surface.
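A rough sketch of the parsing described above, assuming the decoder has already been configured for video/avc with a Surface and started, and that `stream` holds complete NAL units received so far; real code also has to cope with partial NAL units, 4-byte start codes and end-of-stream:

```java
// Sketch: split a raw H.264 byte stream on 0x00 0x00 0x01 start codes and feed each
// NAL unit to the decoder. Raw H.264 carries no timing, so the presentation time is
// faked from an assumed 30 fps.
int findStartCode(byte[] stream, int from) {
    for (int i = from; i + 2 < stream.length; i++) {
        if (stream[i] == 0 && stream[i + 1] == 0 && stream[i + 2] == 1) return i;
    }
    return -1;
}

void feedNalUnits(MediaCodec decoder, byte[] stream) {
    long ptsUs = 0;
    int start = findStartCode(stream, 0);
    while (start >= 0) {
        int next = findStartCode(stream, start + 3);
        int end = (next >= 0) ? next : stream.length;
        int inIndex = decoder.dequeueInputBuffer(10_000);
        if (inIndex >= 0) {
            ByteBuffer inBuf = decoder.getInputBuffer(inIndex);
            inBuf.clear();
            inBuf.put(stream, start, end - start);   // one NAL unit, start code included
            decoder.queueInputBuffer(inIndex, 0, end - start, ptsUs, 0);
            ptsUs += 1_000_000L / 30;                // assumed frame rate
        }
        start = next;
    }
}
```

You still have to drain the output side (dequeueOutputBuffer / releaseOutputBuffer with render set to true) for the decoded frames to actually reach the Surface.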
You can download the raw H.264 from the server, then offer it via a local HTTP server running on the phone and then let VLC for Android do playback from that HTTP server. You should use VLC's http/h264:// scheme to force the demuxer to raw H.264 (if you don't force the demuxer VLC may not be able to recognize the stream, even when the MIME type returned by the HTTP server is set correctly). See
https://github.com/rauljim/tgs-android/blob/integrate_record/src/com/tudelft/triblerdroid/first/VideoPlayerActivity.java#L211
for an example on how to create an Intent that will launch VLC.
Note: raw H.264 apparently has no timing info, so VLC will play as fast as possible.
Embedding it in MPEG-TS first would be better. I haven't found an Android lib that will do that yet.
Here are the resources I've found helpful in a similar project:
This video has been super insightful for understanding, at a high level, how MediaCodec handles raw H.264 streams.
This thread goes into a bit more detail as to handling the SPS/PPS NALUs specifically. As was mentioned above, you need to separate individual NAL Units using the start prefix, and then hand the remaining data to the MediaCodec.
This repo (libstreaming) is a great example of decoding an H264 stream in Android using RTSP/RTP for transmission.

H.264 Real-time Streaming, Timestamp in NAL Units?

I'm trying to build a system that live-streams video and audio captured by Android phones. Video and audio are captured on the Android side using MediaRecorder and then pushed directly to a server written in Python. Clients should access this live feed using their browser, so I implemented the streaming part of the system in Flash. Right now both video and audio content appear on the client side, but the problem is that they are out of sync. I'm sure this is caused by wrong timestamp values in Flash (currently I increment the timestamp by 60 ms per frame of video, but clearly this value should be variable).
The audio is encoded to AMR on the Android phone, so I know that each frame of AMR is exactly 20 ms. However, this is not the case with the video, which is encoded to H.264. To synchronize them, I would have to know exactly how many milliseconds each H.264 frame lasts, so that I can timestamp it later when delivering the content with Flash. My question is: is this kind of information available in the NAL units of H.264? I tried to find the answer in the H.264 standard, but the information there is just overwhelming.
Can someone please point me in the right direction? Thanks.
Timestamps are not in NAL units, but are typically part of RTP. RTP/RTCP also takes care of media synchronisation.
The RTP payload format for H.264 might also be of interest to you.
If you are not using RTP, are you just sending raw data units over the network?
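To make the timing point concrete: RFC 6184 puts H.264 on a 90 kHz RTP clock, so the RTP timestamp for each access unit is derived from its capture/presentation time rather than from anything inside the NAL units. A sketch (the random initial offset is the usual RTP convention):

```java
// Sketch: derive the 32-bit RTP timestamp for an H.264 access unit from its
// presentation time in microseconds. RFC 6184 specifies a 90 kHz clock for H.264.
static final long H264_RTP_CLOCK_HZ = 90_000L;

long rtpTimestamp(long presentationTimeUs, long initialOffset) {
    long ticks = presentationTimeUs * H264_RTP_CLOCK_HZ / 1_000_000L;
    return (ticks + initialOffset) & 0xFFFFFFFFL;   // wrap to 32 bits
}
```

The same idea works outside RTP: carry each frame's capture timestamp alongside the encoded data and let the player schedule frames from those timestamps, instead of assuming a fixed 60 ms per frame.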
