I have been asked to display a video stream (the stream does not come over HTTP) in Android. The stream is raw H.264, recorded and encoded on a PC, and I receive it over WiFi.
When I get the stream, can I use the MediaCodec decoder to decode it and display it?
Yes. Configure the MediaCodec as a "video/avc" decoder, and pass an output Surface to the configure() call.
The MediaCodec API is pretty low-level, and there's not a lot of sample code available. It might be easier to use MediaPlayer instead.
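For reference, here's a minimal sketch of that setup. It's not a drop-in implementation: the 1280x720 size is an assumption, and readAccessUnit() is a hypothetical stand-in for however you pull one H.264 access unit off the WiFi socket.

    import android.media.MediaCodec;
    import android.media.MediaFormat;
    import android.view.Surface;
    import java.nio.ByteBuffer;

    public class StreamDecoder {
        // Decode raw H.264 access units and render each frame to the Surface.
        public void decodeToSurface(Surface surface) throws Exception {
            MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280, 720);
            MediaCodec decoder = MediaCodec.createDecoderByType("video/avc");
            decoder.configure(format, surface, null, 0); // flags=0 => decoder
            decoder.start();

            ByteBuffer[] inputs = decoder.getInputBuffers(); // API 16 style
            MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
            long ptsUs = 0;
            byte[] au;
            while ((au = readAccessUnit()) != null) {
                int in = decoder.dequeueInputBuffer(10000);
                if (in >= 0) {
                    inputs[in].clear();
                    inputs[in].put(au);
                    decoder.queueInputBuffer(in, 0, au.length, ptsUs, 0);
                    ptsUs += 33333; // ~30 fps; real timestamps should come from the sender
                }
                int out = decoder.dequeueOutputBuffer(info, 10000);
                if (out >= 0) {
                    decoder.releaseOutputBuffer(out, true); // true = render to the Surface
                }
            }
            decoder.stop();
            decoder.release();
        }

        // Hypothetical: returns the next H.264 access unit from the network, or null.
        private byte[] readAccessUnit() { return null; }
    }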
Update:
There's now a bunch of sample code here. Most of it makes use of Android 4.3 (API 18) features, but if you don't need MediaMuxer or Surface input to MediaCodec it'll work on API 16.
See the Video Encoding Recommendations here
Related
I'm trying to run some deep learning experiments on Android on video samples, and I've gotten stuck on remuxing videos. I have a couple of questions to help arrange the information in my head :) I have read some pages: https://vec.io/posts/android-hardware-decoding-with-mediacodec and https://bigflake.com/mediacodec/#ExtractMpegFramesTest but I'm still confused.
My questions:
1) Can I read video with MediaExtractor and then pass the data to MediaMuxer to save it in another file, without using MediaCodec?
2) If I want to modify frames before saving, can I do that without using a Surface, just by modifying the ByteBuffer? I assume that I need to decode data from MediaExtractor, then modify the content, then encode it to MediaMuxer.
3) Is a sample the same as a frame in the context of the method MediaExtractor::readSampleData?
4) Do I need to decode the samples?
This is a brief description of what each class does:
MediaExtractor: Extracts encoded video/audio data.
MediaCodec: Depending on how it's configured, it can be a decoder or an encoder.
MediaMuxer: Muxes streams of data into an output file.
This is how your pipeline should generally look:
MediaExtractor -> MediaCodec(As Decoder) -> Your editing -> MediaCodec(As Encoder) -> MediaMuxer
To answer your questions:
1) MediaExtractor will give you encoded data; if you want to do anything with it, you will have to decode it using a MediaCodec (see the sketch after this list).
2) It might be possible to do so without a Surface, but it would be pretty limited. Surfaces are the way to go. You can find more info here: Editing frames and encoding with MediaCodec
3) A sample can be a video frame or an audio sample.
4) Yes, you do need to decode samples to edit them.
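To make the first stage of that pipeline concrete, here is a minimal sketch of MediaExtractor feeding a MediaCodec decoder. The file path and the assumption that track 0 is the video track are illustrative; the output-side drain, where you would grab decoded frames for editing, is left as a comment.

    import android.media.MediaCodec;
    import android.media.MediaExtractor;
    import android.media.MediaFormat;
    import java.nio.ByteBuffer;

    public class ExtractAndDecode {
        public void run() throws Exception {
            MediaExtractor extractor = new MediaExtractor();
            extractor.setDataSource("/sdcard/input.mp4"); // path is an assumption
            MediaFormat format = extractor.getTrackFormat(0); // assume track 0 = video
            extractor.selectTrack(0);

            MediaCodec decoder =
                    MediaCodec.createDecoderByType(format.getString(MediaFormat.KEY_MIME));
            decoder.configure(format, null, null, 0); // null Surface => ByteBuffer output
            decoder.start();

            ByteBuffer[] inputs = decoder.getInputBuffers();
            boolean inputDone = false;
            while (!inputDone) {
                int in = decoder.dequeueInputBuffer(10000);
                if (in >= 0) {
                    // One "sample" is one encoded video frame (or one audio access unit).
                    int size = extractor.readSampleData(inputs[in], 0);
                    if (size < 0) {
                        decoder.queueInputBuffer(in, 0, 0, 0,
                                MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                        inputDone = true;
                    } else {
                        decoder.queueInputBuffer(in, 0, size, extractor.getSampleTime(), 0);
                        extractor.advance();
                    }
                }
                // ...call decoder.dequeueOutputBuffer() here to pull decoded frames,
                // edit them, and feed them to the encoder stage...
            }
            extractor.release();
        }
    }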
There is some good documentation on a site called bigflake about how to use MediaMuxer and MediaCodec to encode and then decode video as MP4, or to extract video and encode it again, and more.
But there doesn't seem to be a way to encode audio and video at the same time; there is no documentation or code about this. It doesn't seem impossible, though.
Question
Do you know any stable way of doing this that will work on all devices running Android API 18 and above?
Why has no one implemented it? Is it hard to implement?
You have to create two MediaCodec instances, one for video and one for audio, and then use MediaMuxer to mux the video with the audio after encoding. You can take a look at ExtractDecodeEditEncodeMuxTest.java, and at this project, which captures camera/mic and saves to an MP4 file using MediaMuxer and MediaCodec.
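A rough skeleton of that two-encoder setup, assuming API 18+. The resolution, bitrates, and output path are all assumptions, and the loops feeding raw camera and microphone data are omitted.

    import android.media.MediaCodec;
    import android.media.MediaCodecInfo;
    import android.media.MediaFormat;
    import android.media.MediaMuxer;

    public class AvEncoderSetup {
        public void setUp() throws Exception {
            MediaFormat vf = MediaFormat.createVideoFormat("video/avc", 1280, 720);
            vf.setInteger(MediaFormat.KEY_BIT_RATE, 2000000);
            vf.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
            vf.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);
            vf.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                    MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar);

            MediaFormat af = MediaFormat.createAudioFormat("audio/mp4a-latm", 44100, 1);
            af.setInteger(MediaFormat.KEY_BIT_RATE, 128000);
            af.setInteger(MediaFormat.KEY_AAC_PROFILE,
                    MediaCodecInfo.CodecProfileLevel.AACObjectLC);

            MediaCodec videoEnc = MediaCodec.createEncoderByType("video/avc");
            MediaCodec audioEnc = MediaCodec.createEncoderByType("audio/mp4a-latm");
            videoEnc.configure(vf, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
            audioEnc.configure(af, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
            videoEnc.start();
            audioEnc.start();

            MediaMuxer muxer = new MediaMuxer("/sdcard/out.mp4",
                    MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
            // Drain each encoder separately. When an encoder returns
            // INFO_OUTPUT_FORMAT_CHANGED, call muxer.addTrack(enc.getOutputFormat()).
            // Only call muxer.start() after BOTH tracks have been added, then route
            // every encoded buffer to muxer.writeSampleData() with its track index.
        }
    }

The gating matters: writeSampleData() throws if the muxer hasn't been started, and the muxer can't start until every track has been added.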
So, I have encoded an H.264 elementary stream with MediaCodec by collecting the frames in the Camera's onPreviewFrame method (following Encoding H.264 from camera with Android MediaCodec). Then I generated an MP4 video from the H.264 stream. Unfortunately, it doesn't have any audio in it.
I notice that MediaCodec should also allow encoding audio, because it has settings for audio codecs.
Now, is there any way to add audio to the H.264 stream?
Thanks for reading; I'd appreciate any comments or suggestions.
A given instance of MediaCodec will encode either video or audio. You would need to create a second instance of MediaCodec to do the audio encoding, and then combine the streams with the MediaMuxer class (introduced in Android 4.3, API 18).
There are examples of using MediaMuxer on bigflake, but at the time I'm writing this there isn't one that demonstrates combining audio and video (they're just "muxing" the video into a .mp4 file). It should be enough to demonstrate how to use the class though.
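For the audio side specifically, here is a rough sketch of feeding microphone PCM into that second MediaCodec instance. The sample rate, mono channel, and buffer size are assumptions, and the encoder is assumed to already be configured for "audio/mp4a-latm" and started.

    import android.media.AudioFormat;
    import android.media.AudioRecord;
    import android.media.MediaCodec;
    import android.media.MediaRecorder;
    import java.nio.ByteBuffer;

    public class AudioFeeder {
        public void feedAudio(MediaCodec audioEnc) {
            int minBuf = AudioRecord.getMinBufferSize(44100,
                    AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
            AudioRecord recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
                    44100, AudioFormat.CHANNEL_IN_MONO,
                    AudioFormat.ENCODING_PCM_16BIT, minBuf);
            recorder.startRecording();

            byte[] pcm = new byte[minBuf];
            long ptsUs = 0;
            for (;;) {
                int read = recorder.read(pcm, 0, pcm.length);
                if (read <= 0) break;
                int in = audioEnc.dequeueInputBuffer(10000);
                if (in >= 0) {
                    ByteBuffer buf = audioEnc.getInputBuffers()[in];
                    buf.clear();
                    buf.put(pcm, 0, read);
                    audioEnc.queueInputBuffer(in, 0, read, ptsUs, 0);
                    // 16-bit mono: read/2 samples -> microseconds at 44.1 kHz
                    ptsUs += 1000000L * (read / 2) / 44100;
                }
                // ...drain audioEnc's output into the MediaMuxer alongside the video...
            }
            recorder.stop();
            recorder.release();
        }
    }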
Starting from Android 4.1 (API level 16), the MediaCodec APIs were introduced. These APIs support elementary stream decoding and encoding. Also, the MediaExtractor API will give elementary track details by analyzing media streams.
My question is: I set up a video encoder using the MediaCodec API, which gives me an encoded file in .h264 format. I want to write the .h264 file into a .mp4 file for playing/storing/sharing purposes. I can't find any .mp4 file-writer API for Android. Is there any way to achieve this?
Thanks,
Satish.
As of Android 4.3 (API 18) you can use the MediaMuxer class to convert the raw H.264 stream to a .mp4 file (and even merge an audio stream in).
See the EncodeAndMuxTest and CameraToMpegTest sources on this page for sample code.
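The usual pattern in those samples is to route the encoder's output buffers straight into the muxer rather than going through an intermediate .h264 file (a bare .h264 file has no timestamps, so you'd have to reconstruct them). A sketch of that drain loop, with the output path as an assumption:

    import android.media.MediaCodec;
    import android.media.MediaMuxer;
    import java.nio.ByteBuffer;

    public class Mp4Writer {
        public void drainToMp4(MediaCodec encoder) throws Exception {
            MediaMuxer muxer = new MediaMuxer("/sdcard/out.mp4",
                    MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
            MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
            int track = -1;
            boolean started = false;
            while (true) {
                int index = encoder.dequeueOutputBuffer(info, 10000);
                if (index == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                    // The format now carries the SPS/PPS the .mp4 header needs.
                    track = muxer.addTrack(encoder.getOutputFormat());
                    muxer.start();
                    started = true;
                } else if (index >= 0) {
                    boolean isConfig =
                            (info.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0;
                    if (started && info.size > 0 && !isConfig) {
                        ByteBuffer data = encoder.getOutputBuffers()[index];
                        muxer.writeSampleData(track, data, info);
                    }
                    boolean eos =
                            (info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0;
                    encoder.releaseOutputBuffer(index, false);
                    if (eos) break;
                }
            }
            muxer.stop();
            muxer.release();
        }
    }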
This response may be of use. It suggests using the isoparser library. It works pretty well if you have elementary streams saved to disk, but it doesn't work if you want to live stream from the MediaCodec output.
Another way is to use FFmpeg (http://sourceforge.net/projects/ffmpeg4android/) to mux the h264 file into an mp4 container.
As I understand it, there are two ways of capturing video in Android:
1) using SurfaceView API
2) using MediaRecorder API
I want to capture the H.264 encoded frames, using Android's (3.0+) default encoder, to send them over the network using RTP.
While using preview callbacks with the SurfaceView and SurfaceHolder classes, we are able to get the raw frames shown as the preview to the user. We were getting the frames in the onPreviewFrame method of the PreviewCallback class.
But those frames are not H.264 encoded.
So, I tried the MediaRecorder API to set H.264 encoding, with SurfaceView to get the preview frames.
In this case, the preview callbacks are not getting called.
Can you please let me know how to achieve this? Our main aim is to get the H.264 encoded frames (encoded using Android's default codec).
Ref: 1) https://stackoverflow.com/a/8655244/698316
2) Similar issue: http://permalink.gmane.org/gmane.comp.handhelds.android.devel/214422
Can you suggest a way to capture the H.264 encoded frames using Android's default H.264 codec support?
See Spydroid http://code.google.com/p/spydroid-ipcamera/
Basically you let the video encoder write a .mp4 with H.264 to a special file descriptor that calls your code on write. Then strip off the MP4 header and turn the H.264 NALUs into RTP packets.
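A very rough sketch of that last step, assuming single-NAL-unit RTP packets per RFC 6184. Real code also needs FU-A fragmentation for NALUs larger than the MTU, and the payload type (96) and SSRC here are arbitrary assumptions.

    import java.io.IOException;
    import java.net.DatagramPacket;
    import java.net.DatagramSocket;
    import java.net.InetAddress;
    import java.net.SocketException;

    public class RtpSender {
        private final DatagramSocket socket;
        private int seq = 0;
        private final int ssrc = 0x12345678; // arbitrary, but fixed per stream

        public RtpSender() throws SocketException {
            socket = new DatagramSocket();
        }

        // Wrap one NALU (without its start code) in a 12-byte RTP header and send it.
        public void sendNalu(byte[] nalu, long ts90kHz, boolean lastOfFrame,
                             InetAddress dest, int port) throws IOException {
            byte[] packet = new byte[12 + nalu.length];
            packet[0] = (byte) 0x80;                               // V=2, no padding/ext/CSRC
            packet[1] = (byte) ((lastOfFrame ? 0x80 : 0x00) | 96); // marker + payload type
            packet[2] = (byte) (seq >> 8);
            packet[3] = (byte) seq;
            packet[4] = (byte) (ts90kHz >> 24);                    // 90 kHz clock for H.264
            packet[5] = (byte) (ts90kHz >> 16);
            packet[6] = (byte) (ts90kHz >> 8);
            packet[7] = (byte) ts90kHz;
            packet[8] = (byte) (ssrc >> 24);
            packet[9] = (byte) (ssrc >> 16);
            packet[10] = (byte) (ssrc >> 8);
            packet[11] = (byte) ssrc;
            System.arraycopy(nalu, 0, packet, 12, nalu.length);
            socket.send(new DatagramPacket(packet, packet.length, dest, port));
            seq = (seq + 1) & 0xFFFF;
        }
    }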