How to mux a multi-slice H.264 stream via Android MediaMuxer

I'm trying to mux an mp4 file using MediaMuxer. I've already set the SPS and PPS and replaced the 4-byte headers. When each H.264 frame is formed from a single slice, everything is OK, but when I change it to multi-slice, the result is wrong.

It sounds like multi-slice is not fully supported by Android's MediaMuxer; it causes compatibility issues when the file is played on an iPhone or in the QuickTime player.
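For reference, the "header replacement" the question describes is the usual conversion from Annex-B start codes (0x00 0x00 0x00 0x01) to the 4-byte big-endian length prefixes that MP4 samples use. A minimal in-memory sketch in plain Java (assuming every NAL uses a 4-byte start code); note that with multi-slice frames there are several NALs per frame, and every one of them needs its own length prefix:

```java
import java.io.ByteArrayOutputStream;

public class AnnexB {
    // Replace each 4-byte Annex-B start code with a 4-byte big-endian
    // length prefix, as expected inside an MP4 sample. Multi-slice frames
    // contain several NALs, and each one must get its own prefix -- a
    // common source of broken multi-slice output.
    public static byte[] toLengthPrefixed(byte[] annexB) {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        int i = 0;
        while (i < annexB.length) {
            if (!isStartCode(annexB, i)) {
                throw new IllegalArgumentException("no start code at offset " + i);
            }
            int nalStart = i + 4;
            int next = nalStart;
            while (next < annexB.length && !isStartCode(annexB, next)) next++;
            int len = next - nalStart;
            out.write((len >>> 24) & 0xFF);
            out.write((len >>> 16) & 0xFF);
            out.write((len >>> 8) & 0xFF);
            out.write(len & 0xFF);
            out.write(annexB, nalStart, len);
            i = next;
        }
        return out.toByteArray();
    }

    private static boolean isStartCode(byte[] b, int i) {
        return i + 3 < b.length && b[i] == 0 && b[i + 1] == 0 && b[i + 2] == 0 && b[i + 3] == 1;
    }
}
```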

Related

Android mediacodec: Is it possible to encode audio and video at the same time using mediacodec and muxer?

There is good documentation on the site called Big Flake about how to use MediaMuxer and MediaCodec to encode and then decode video as mp4, or to extract video and re-encode it, and more.
But there doesn't seem to be a way to encode audio together with video at the same time; there is no documentation or code about this. It doesn't seem impossible.
Question
Do you know of any stable way of doing it that will work on all devices running Android API 18 and above?
Why has no one implemented it? Is it hard to implement?
You have to create two MediaCodec instances, one for video and one for audio, and then use MediaMuxer to mux the video with the audio after encoding. You can take a look at ExtractDecodeEditEncodeMuxTest.java, and at this project, which captures camera/mic and saves to an mp4 file using MediaMuxer and MediaCodec.
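One detail worth knowing when combining the two codecs: MediaMuxer.start() may only be called after both tracks have been added with addTrack(), and each encoder delivers its output MediaFormat asynchronously, so the muxer usually has to wait for both INFO_OUTPUT_FORMAT_CHANGED events before starting. A small plain-Java sketch of that gating logic (the actual MediaMuxer calls are left as comments, since this runs outside Android):

```java
public class MuxerGate {
    private int videoTrack = -1;
    private int audioTrack = -1;
    private int nextTrack = 0;
    private boolean started = false;

    // Called from each encoder's INFO_OUTPUT_FORMAT_CHANGED handler.
    // Returns the track index the caller should remember; the muxer is
    // started only once both formats have arrived.
    public synchronized int addTrack(boolean isVideo) {
        if (started) throw new IllegalStateException("muxer already started");
        // On Android this would be: track = muxer.addTrack(format);
        int track = nextTrack++;
        if (isVideo) videoTrack = track; else audioTrack = track;
        if (videoTrack != -1 && audioTrack != -1) {
            // On Android this would be: muxer.start();
            started = true;
        }
        return track;
    }

    // Encoded buffers must be queued (or dropped) until this returns true,
    // because writeSampleData() before start() throws.
    public synchronized boolean isStarted() { return started; }
}
```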

MediaCodec decoder and playback

I'm trying to show a live preview from one Android device on another.
Here is what I did:
Sender: 1. Camera frame (YUV) -> 2. MediaCodec (encode to h264 byte[]) -> 3. MediaMuxer -> 4. mp4
I'm sending the output of the media encoder via a socket connection.
Receiver: 5. byte[] via socket -> 6. MediaCodec (decoder) -> 7. Play.
Up to step 5, everything works fine.
However, I'm not able to decode the byte[]. What is missing here? I suspect I'm not sending (and don't know how to send) the SPS and PPS properly. Also, how can I verify that what I'm sending is correct data?
Thanks.
You are muxing the encoded h264 video into mp4 (a container format). However, on the decoding side, you are not using a demuxer (or parser/splitter). The MediaCodec decoder only accepts an elementary video stream, not a container format.
Use MediaExtractor to demux on Android (https://developer.android.com/reference/android/media/MediaExtractor.html).
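Alternatively, if you keep the raw socket approach instead of muxing to mp4 on the sender, you have to ship the SPS (NAL type 7) and PPS (NAL type 8) before any frame data, and the receiver feeds them to the decoder first (on Android, typically via the csd-0/csd-1 keys of the MediaFormat used in configure()). A plain-Java sketch of splitting an Annex-B buffer into NAL units and reading their types, which makes it easy to check what you are actually sending (the byte layout is standard H.264; the helper names are mine):

```java
import java.util.ArrayList;
import java.util.List;

public class NalSplitter {
    // Split an Annex-B H.264 buffer (NALs separated by 00 00 00 01
    // start codes) into individual NAL units, start codes excluded.
    public static List<byte[]> split(byte[] stream) {
        List<byte[]> nals = new ArrayList<>();
        int i = indexOfStartCode(stream, 0);
        while (i != -1) {
            int begin = i + 4;
            int next = indexOfStartCode(stream, begin);
            int end = (next == -1) ? stream.length : next;
            byte[] nal = new byte[end - begin];
            System.arraycopy(stream, begin, nal, 0, nal.length);
            nals.add(nal);
            i = next;
        }
        return nals;
    }

    // nal_unit_type is the low 5 bits of the first NAL byte:
    // 5 = IDR slice, 7 = SPS, 8 = PPS, 9 = access unit delimiter.
    public static int type(byte[] nal) {
        return nal[0] & 0x1F;
    }

    private static int indexOfStartCode(byte[] b, int from) {
        for (int i = from; i + 3 < b.length; i++) {
            if (b[i] == 0 && b[i + 1] == 0 && b[i + 2] == 0 && b[i + 3] == 1) return i;
        }
        return -1;
    }
}
```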

Consume RTSP H.264 encoded stream with Android MediaCodec

I'm trying to display a video offered by an RTSP (RTP/H.264-encoded) GStreamer server. I can open the stream with VLC and ffmpeg.
I've already implemented RTSP and RTP in Java/Android.
Every time I put an H.264 frame (= the payload of RTP) into the MediaCodec, I get MediaCodec.INFO_TRY_AGAIN_LATER when dequeuing the output buffers of the MediaCodec.
I tried prepending 0x00 0x00 0x00 0x01 to the H.264 packets, but this did not solve the issue. How can I find out what's going wrong? Are there any examples of how to take the H.264 frame from RTP on Android/Java and pass it to the MediaCodec?
Edit:
I dumped the H.264 frames with FFmpeg to file. When I push these files to the MediaCodec on Android (with SPS and PPS set), it does work for the sample RTSP stream rtsp://184.72.239.149/vod/mp4:BigBuckBunny_115k.mov, but not for my GStreamer-created RTSP stream. All packets (files) produced with GStreamer have an Access Unit Delimiter (0x00 0x00 0x00 0x01 0x09 ...). The sample stream does not have these AUDs. I tried removing these AUDs before passing the frames to the MediaCodec, but still no luck.
Thank you
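Stripping the access unit delimiters is straightforward to do at the byte level; a separate thing to double-check is that the SPS/PPS are actually handed to the decoder (e.g. through the csd-0/csd-1 entries of the MediaFormat used in configure()). A self-contained plain-Java sketch of AUD removal, assuming 4-byte start codes throughout, as in the GStreamer dump described above:

```java
import java.io.ByteArrayOutputStream;

public class AudFilter {
    // Remove access-unit-delimiter NALs (nal_unit_type 9) from an
    // Annex-B buffer, keeping every other NAL with its start code.
    public static byte[] stripAud(byte[] stream) {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        int i = 0;
        while (i + 3 < stream.length) {
            if (!isStartCode(stream, i)) { i++; continue; }
            int begin = i;
            int next = i + 4;
            while (next + 3 < stream.length && !isStartCode(stream, next)) next++;
            // Either the next start code or the end of the buffer.
            int end = (next + 3 < stream.length) ? next : stream.length;
            int nalType = stream[begin + 4] & 0x1F; // low 5 bits of first NAL byte
            if (nalType != 9) out.write(stream, begin, end - begin);
            i = end;
        }
        return out.toByteArray();
    }

    private static boolean isStartCode(byte[] b, int i) {
        return b[i] == 0 && b[i + 1] == 0 && b[i + 2] == 0 && b[i + 3] == 1;
    }
}
```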

How do you encode a video with audio using MediaCodec

So, I have encoded an h264 elementary stream with MediaCodec by collecting the frames in the Camera's onPreviewFrame method (using Encoding H.264 from camera with Android MediaCodec). Then I generated an mp4 video from the h264 stream. Unfortunately, it doesn't have any audio in it.
I noticed that MediaCodec should also allow encoding audio, because it has settings for audio codecs.
Now, is there any way to add audio to the h264 stream?
Thanks for reading; I would appreciate any comments or suggestions.
A given instance of MediaCodec will encode either video or audio. You would need to create a second instance of MediaCodec to do the audio encoding, and then combine the streams with the MediaMuxer class (introduced in Android 4.3, API 18).
There are examples of using MediaMuxer on bigflake, but at the time I'm writing this there isn't one that demonstrates combining audio and video (they're just "muxing" the video into a .mp4 file). It should be enough to demonstrate how to use the class though.
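When the audio track is added, its samples also need presentation timestamps consistent with the video's, or the muxed file will drift out of sync. For AAC, each encoded frame covers 1024 PCM samples, so the timestamp in microseconds can be derived from the frame index and sample rate. A small sketch (plain Java; the 1024-sample frame size is the AAC-LC default):

```java
public class AudioPts {
    private static final long AAC_SAMPLES_PER_FRAME = 1024;

    // Presentation timestamp in microseconds for the n-th AAC frame;
    // on Android this would go into the presentationTimeUs of the
    // BufferInfo passed to MediaMuxer.writeSampleData() for the audio track.
    public static long ptsUs(long frameIndex, int sampleRateHz) {
        return frameIndex * AAC_SAMPLES_PER_FRAME * 1_000_000L / sampleRateHz;
    }
}
```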

Assembling a container for Android MediaCodec

Is there anything that does the opposite of a MediaExtractor on Android?
That is, take one or multiple streams from MediaCodec instances (e.g. 1 video and 1 audio)
and package them into a container format for streaming or writing to files?
Looks like the answer is no,
mostly because the underlying API is designed for video streaming and not for video compression.
Writing the encoder's output to a file, you'll get a raw h264 file, which can be played using mplayer or ffplay, for example.
ffmpeg can also be used to mux this raw file into some container. But first of all, you need to build ffmpeg for Android.
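Getting to that raw h264 file is just a matter of appending the encoder's output buffers to disk, with the SPS/PPS (on Android, the output buffer flagged BUFFER_FLAG_CODEC_CONFIG) written once at the front. A plain-Java sketch of the file side of it (the encoder hand-off is simulated with byte arrays):

```java
import java.io.FileOutputStream;
import java.io.IOException;

public class RawH264Writer implements AutoCloseable {
    private final FileOutputStream out;

    // The config buffer (SPS+PPS with start codes) goes first so that
    // players like mplayer/ffplay, or ffmpeg when muxing to a container,
    // can pick up the stream parameters from the head of the file.
    public RawH264Writer(String path, byte[] codecConfig) throws IOException {
        out = new FileOutputStream(path);
        out.write(codecConfig);
    }

    // Append one encoded access unit exactly as the encoder emitted it.
    public void writeFrame(byte[] frame) throws IOException {
        out.write(frame);
    }

    @Override
    public void close() throws IOException {
        out.close();
    }
}
```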
