Consume RTSP H.264 encoded stream with Android MediaCodec

I'm trying to display a video offered via RTSP (RTP, H.264-encoded) by a GStreamer server. I can open the stream with VLC and ffmpeg.
I've implemented RTSP and RTP in Java/Android already.
Every time I put an H.264 frame (= the RTP payload) into the MediaCodec, I get MediaCodec.INFO_TRY_AGAIN_LATER when dequeuing the codec's output buffers.
I tried prepending 0x00 0x00 0x00 0x01 to the H.264 packets, but this did not solve the issue. How can I find out what's going wrong? Are there any examples of how to extract the H.264 frames from RTP on Android/Java and pass them to the MediaCodec?
Edit:
I dumped the H.264 frames to files with FFmpeg. When I push these files to the MediaCodec on Android (with SPS and PPS set), it works for the sample RTSP stream rtsp://184.72.239.149/vod/mp4:BigBuckBunny_115k.mov but not for my GStreamer-created RTSP stream. All packets (files) produced by GStreamer contain an Access Unit Delimiter (0x00 0x00 0x00 0x01 0x09 ...); the sample stream does not have these AUDs. I tried removing the AUDs before passing the frames to the MediaCodec, but still no luck.
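In case it helps to show the shape of what I'm doing, here is a stripped-down sketch of the decoder setup and feeding loop (the class and variable names are placeholders; the SPS/PPS buffers come from the SDP's sprop-parameter-sets, each prefixed with an Annex-B start code):

    import java.io.IOException;
    import java.nio.ByteBuffer;

    import android.media.MediaCodec;
    import android.media.MediaFormat;
    import android.view.Surface;

    public class H264FrameDecoder {
        private static final long TIMEOUT_US = 10_000;
        private final MediaCodec decoder;

        // sps/pps: raw parameter sets from the SDP (sprop-parameter-sets), each
        // prefixed with the Annex-B start code 0x00 0x00 0x00 0x01.
        public H264FrameDecoder(ByteBuffer sps, ByteBuffer pps, Surface surface,
                                int width, int height) throws IOException {
            MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
            format.setByteBuffer("csd-0", sps);
            format.setByteBuffer("csd-1", pps);
            decoder = MediaCodec.createDecoderByType("video/avc");
            decoder.configure(format, surface, null, 0);
            decoder.start();
        }

        // accessUnit: one complete frame in Annex-B form (start code + NAL unit(s)).
        public void decode(byte[] accessUnit, long presentationTimeUs) {
            int inIndex = decoder.dequeueInputBuffer(TIMEOUT_US);
            if (inIndex >= 0) {
                ByteBuffer inBuf = decoder.getInputBuffer(inIndex);
                inBuf.clear();
                inBuf.put(accessUnit);
                decoder.queueInputBuffer(inIndex, 0, accessUnit.length, presentationTimeUs, 0);
            }

            MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
            int outIndex = decoder.dequeueOutputBuffer(info, TIMEOUT_US);
            // INFO_TRY_AGAIN_LATER is normal for the first few frames while the decoder
            // buffers input; it only points to a problem if it never stops.
            if (outIndex >= 0) {
                decoder.releaseOutputBuffer(outIndex, true);  // render to the Surface
            }
        }
    }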
Thank you

Related

How to mux a multi-slice H.264 stream via Android MediaMuxer

I'm trying to mux an mp4 file using MediaMuxer. I've already set SPS and PPS and replaced the 4-byte header. When the H.264 frame is formed as a single slice, everything is OK, but when I change it to multi-slice, the result is wrong.
It sounds like multi-slice is not fully supported in Android MediaMuxer; it can cause compatibility issues when the file is played on an iPhone or in QuickTime Player.
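For context, the usual MediaMuxer flow looks roughly like the sketch below, assuming the H.264 comes straight out of a MediaCodec encoder. Adding the track from the format delivered by INFO_OUTPUT_FORMAT_CHANGED lets the muxer pick up SPS/PPS and handle the start-code/length-prefix conversion itself; whether that changes the multi-slice behaviour on iPhone/QuickTime is a separate question.

    import java.nio.ByteBuffer;

    import android.media.MediaCodec;
    import android.media.MediaFormat;
    import android.media.MediaMuxer;

    public final class EncoderToMuxer {
        // Drains a MediaCodec video encoder into a MediaMuxer until end of stream.
        public static void drain(MediaCodec encoder, MediaMuxer muxer) {
            MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
            int trackIndex = -1;
            boolean muxerStarted = false;

            while (true) {
                int outIndex = encoder.dequeueOutputBuffer(info, 10_000);
                if (outIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                    // This format already carries csd-0/csd-1 (SPS/PPS); MediaMuxer
                    // stores them in the track header and length-prefixes the NAL units.
                    MediaFormat format = encoder.getOutputFormat();
                    trackIndex = muxer.addTrack(format);
                    muxer.start();
                    muxerStarted = true;
                } else if (outIndex >= 0) {
                    ByteBuffer data = encoder.getOutputBuffer(outIndex);
                    boolean isConfig = (info.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0;
                    if (!isConfig && info.size > 0 && muxerStarted) {
                        muxer.writeSampleData(trackIndex, data, info);
                    }
                    encoder.releaseOutputBuffer(outIndex, false);
                    if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                        break;  // encoder signaled end of stream
                    }
                }
                // INFO_TRY_AGAIN_LATER: no output yet; real code would feed more input here.
            }
        }
    }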

Media codec decoder and playback

I'm trying to show a live preview from one Android device on another.
Here is what I did:
Sender: 1. Camera frame (YUV) -> 2. MediaCodec (encode to H.264 byte[]) -> 3. MediaMuxer -> 4. mp4
I'm sending the output of the media encoder over a socket connection.
Receiver: 5. byte[] via socket -> 6. MediaCodec (decoder) -> 7. Play.
Up to step 5, everything works fine.
However, I'm not able to decode the byte[]. What is missing here? I suspect I'm not sending the SPS and PPS properly (I don't know how to send them). Also, how can I verify that what I'm sending is correct data?
Thanks.
You are muxing the encoded H.264 video into mp4 (a container format). On the decoding side, however, you are not using a demuxer (or parser/splitter). A MediaCodec decoder only takes an elementary video stream, not a container format.
Use MediaExtractor to demux on Android (https://developer.android.com/reference/android/media/MediaExtractor.html).
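A rough sketch of that receiver side, assuming the mp4 has been written to a local file first (the file path and Surface are placeholders):

    import java.nio.ByteBuffer;

    import android.media.MediaCodec;
    import android.media.MediaExtractor;
    import android.media.MediaFormat;
    import android.view.Surface;

    // Demuxes the video track of an mp4 with MediaExtractor and decodes it with MediaCodec.
    public final class Mp4VideoPlayer {
        public static void play(String mp4Path, Surface surface) throws Exception {
            MediaExtractor extractor = new MediaExtractor();
            extractor.setDataSource(mp4Path);

            // Select the first video track; its format already carries SPS/PPS (csd-0/csd-1).
            MediaFormat format = null;
            for (int i = 0; i < extractor.getTrackCount(); i++) {
                MediaFormat f = extractor.getTrackFormat(i);
                if (f.getString(MediaFormat.KEY_MIME).startsWith("video/")) {
                    extractor.selectTrack(i);
                    format = f;
                    break;
                }
            }

            MediaCodec decoder = MediaCodec.createDecoderByType(format.getString(MediaFormat.KEY_MIME));
            decoder.configure(format, surface, null, 0);
            decoder.start();

            MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
            boolean inputDone = false;
            while (true) {
                if (!inputDone) {
                    int inIndex = decoder.dequeueInputBuffer(10_000);
                    if (inIndex >= 0) {
                        ByteBuffer inBuf = decoder.getInputBuffer(inIndex);
                        int size = extractor.readSampleData(inBuf, 0);
                        if (size < 0) {
                            decoder.queueInputBuffer(inIndex, 0, 0, 0,
                                    MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                            inputDone = true;
                        } else {
                            decoder.queueInputBuffer(inIndex, 0, size, extractor.getSampleTime(), 0);
                            extractor.advance();
                        }
                    }
                }
                int outIndex = decoder.dequeueOutputBuffer(info, 10_000);
                if (outIndex >= 0) {
                    decoder.releaseOutputBuffer(outIndex, true);  // render to the Surface
                    if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) break;
                }
            }
            decoder.stop();
            decoder.release();
            extractor.release();
        }
    }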

Android: RTP issue with H264, MediaCodec as an encoder

I am wrapping H264 video with RTP for network streaming. In order to do this, I am using Android's MediaCodec configured as an encoder to generate the H264. The RTP code is my own.
When I read the stream (using my local network to the streaming server), I can tell when the keyframe is received because the video sharpens up to what I would expect. However, as I move my hand across the video view, I see a significant amount of pixelation until the next keyframe comes in. My video is 960x720, 30fps and I am sending a keyframe every 2 seconds.
I can pump the raw H.264 (not wrapped in RTP) from the MediaCodec encoder through a datagram socket and play it back with ffplay, and there are no such effects when I do that. It has to be something with the RTP encapsulation. I verified that the start/stop bits of my FU-A fragmentation packets are correct, as is the RTP header marker bit. I'm at a loss for what else could be the problem.
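For anyone comparing notes, this is roughly what FU-A fragmentation looks like per RFC 6184 (the RTP header itself is elided; how each payload is wrapped and sent, and where the marker bit is set, is up to the surrounding RTP code):

    import java.util.ArrayList;
    import java.util.List;

    // Fragments a single H.264 NAL unit (without its Annex-B start code) into
    // FU-A payloads per RFC 6184. Each payload still needs an RTP header; the
    // marker bit belongs only on the packet carrying the last fragment of the
    // last NAL unit of the access unit.
    public final class FuAFragmenter {
        public static List<byte[]> fragment(byte[] nal, int maxPayloadSize) {
            List<byte[]> payloads = new ArrayList<>();
            if (nal.length <= maxPayloadSize) {
                payloads.add(nal);  // small enough for a single NAL unit packet
                return payloads;
            }

            byte nalHeader = nal[0];
            byte fuIndicator = (byte) ((nalHeader & 0xE0) | 28);  // F+NRI from NAL, type 28 = FU-A
            int offset = 1;                                       // skip the original NAL header
            boolean first = true;

            while (offset < nal.length) {
                int chunk = Math.min(maxPayloadSize - 2, nal.length - offset);
                boolean last = (offset + chunk) == nal.length;

                byte fuHeader = (byte) (nalHeader & 0x1F);        // original NAL type
                if (first) fuHeader |= 0x80;                      // S (start) bit
                if (last)  fuHeader |= 0x40;                      // E (end) bit

                byte[] payload = new byte[2 + chunk];
                payload[0] = fuIndicator;
                payload[1] = fuHeader;
                System.arraycopy(nal, offset, payload, 2, chunk);
                payloads.add(payload);

                offset += chunk;
                first = false;
            }
            return payloads;
        }
    }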

How do you encode a video with audio using MediaCodec

So, I have encoded an H.264 elementary stream with MediaCodec by collecting the frames from the Camera's onPreviewFrame method (using Encoding H.264 from camera with Android MediaCodec). Then I generated an mp4 video from the H.264 stream. Unfortunately, it doesn't have any audio in it.
I notice that MediaCodec should also allow encoding audio because it has settings for audio codecs.
Now, is there any way to add audio to the H.264 stream?
Thanks for reading; I would appreciate any comments or suggestions.
A given instance of MediaCodec will encode either video or audio. You would need to create a second instance of MediaCodec to do the audio encoding, and then combine the streams with the MediaMuxer class (introduced in Android 4.3, API 18).
There are examples of using MediaMuxer on bigflake, but at the time I'm writing this there isn't one that demonstrates combining audio and video (they're just "muxing" the video into a .mp4 file). It should be enough to demonstrate how to use the class though.
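A minimal sketch of that setup, assuming PCM samples come from AudioRecord and a separate video encoder already exists (the drain loops are omitted; they follow the usual dequeueOutputBuffer/writeSampleData pattern, and the muxer must only be started once both tracks have been added):

    import java.io.IOException;

    import android.media.MediaCodec;
    import android.media.MediaCodecInfo;
    import android.media.MediaFormat;
    import android.media.MediaMuxer;

    public final class AvEncodingSetup {
        // Second MediaCodec instance, this one for AAC audio; the video encoder
        // is a separate instance created elsewhere.
        public static MediaCodec createAacEncoder() throws IOException {
            MediaFormat audioFormat =
                    MediaFormat.createAudioFormat("audio/mp4a-latm", 44100, 1);
            audioFormat.setInteger(MediaFormat.KEY_AAC_PROFILE,
                    MediaCodecInfo.CodecProfileLevel.AACObjectLC);
            audioFormat.setInteger(MediaFormat.KEY_BIT_RATE, 64_000);

            MediaCodec audioEncoder = MediaCodec.createEncoderByType("audio/mp4a-latm");
            audioEncoder.configure(audioFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
            audioEncoder.start();
            return audioEncoder;
        }

        // One muxer, two tracks. Add each track from the format reported by that
        // encoder's INFO_OUTPUT_FORMAT_CHANGED, call muxer.start() once both track
        // indices are known, then route writeSampleData() by track index.
        public static MediaMuxer createMuxer(String outputPath) throws IOException {
            return new MediaMuxer(outputPath, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
        }
    }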

Streaming audio/video from Android

I am writing an Android app to stream audio/video to a Wowza server in RTSP interleaved mode, using AAC and H.264 encoders. I created packetizers for both audio and video. The problem I am facing is that when I send both streams simultaneously, I lose the video stream: I only get audio on Wowza and in VLC. When I do not stream the audio, the video works just fine, which shows that my packetizers and RTP streaming code perform as expected. It looks as if I cannot send video fast enough to sustain the stream.
Similarly architected code on iOS provides a stable video and audio feed.
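For reference, in interleaved mode every RTP packet (audio and video alike) is written onto the single RTSP TCP connection using the framing from RFC 2326 §10.12, so both packetizers share one socket. A sketch of writing one framed packet (the output stream stands in for whatever socket the RTSP session holds):

    import java.io.IOException;
    import java.io.OutputStream;

    // Writes one RTP packet onto the RTSP TCP connection in interleaved framing
    // (RFC 2326, section 10.12): '$', channel id, 16-bit big-endian length, payload.
    // Audio and video packets share this connection, so writes should be
    // serialized, e.g. through a single sender thread or queue.
    public final class RtspInterleavedWriter {
        public static synchronized void writeInterleaved(
                OutputStream rtspSocketOut, int channel, byte[] rtpPacket) throws IOException {
            rtspSocketOut.write('$');
            rtspSocketOut.write(channel & 0xFF);
            rtspSocketOut.write((rtpPacket.length >> 8) & 0xFF);
            rtspSocketOut.write(rtpPacket.length & 0xFF);
            rtspSocketOut.write(rtpPacket);
            rtspSocketOut.flush();
        }
    }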
Thank you
