I have successfully ported the ffmpeg library to Android using Bambuser's ffmpeg port. I'm currently studying ffmpeg's source code, especially the ffplay.c and api-examples.c files.
I want to extract elementary streams from videos recorded on Android 2.2. For example, I can record an H.263-encoded video in an MPEG-4 container; let's say a test.mp4 file.
What I want to achieve is to extract the H.263 elementary video stream from the test.mp4 file into something like test.h263.
It can be extracted using the ffmpeg CLI, like so:
ffmpeg -i test.mp4 -vcodec copy test.h263
But unfortunately I don't know how to extract an elementary stream from a container using the ffmpeg API.
Thanks.
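For reference, a minimal sketch of what this might look like with the libavformat API (untested; it assumes a reasonably recent ffmpeg with avformat_open_input and codecpar, and omits most error handling):

#include <stdio.h>
#include <libavformat/avformat.h>

int main(int argc, char **argv)
{
    AVFormatContext *fmt = NULL;
    AVPacket *pkt = av_packet_alloc();
    FILE *out;
    int video_idx = -1;
    unsigned i;

    if (argc < 3) {
        fprintf(stderr, "usage: %s test.mp4 test.h263\n", argv[0]);
        return 1;
    }
    /* on older ffmpeg you'd call av_register_all() first */
    if (avformat_open_input(&fmt, argv[1], NULL, NULL) < 0)
        return 1;
    if (avformat_find_stream_info(fmt, NULL) < 0)
        return 1;

    /* locate the first video stream in the container */
    for (i = 0; i < fmt->nb_streams; i++) {
        if (fmt->streams[i]->codecpar->codec_type == AVMEDIA_TYPE_VIDEO) {
            video_idx = i;
            break;
        }
    }
    if (video_idx < 0)
        return 1;

    out = fopen(argv[2], "wb");

    /* dump every video packet's payload straight to the output file */
    while (av_read_frame(fmt, pkt) >= 0) {
        if (pkt->stream_index == video_idx)
            fwrite(pkt->data, 1, pkt->size, out);
        av_packet_unref(pkt);
    }

    fclose(out);
    av_packet_free(&pkt);
    avformat_close_input(&fmt);
    return 0;
}

A straight packet dump like this works for codecs such as H.263 whose bitstream is self-contained; for H.264 in MP4 you would additionally need the h264_mp4toannexb bitstream filter.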
Related
I am using the ffmpeg library in my project with a custom VideoPlayer. I can convert video from .avi to .mp4, but the conversion takes a long time. Can I convert video on the fly using ffmpeg?
EDIT
So I want to play videos in formats that the standard MediaPlayer doesn't support. I've built ffmpeg for my project, so now I have an ffmpeg.so file and I can convert video through ffmpeg, for example:
ffmpeg -i input.avi -vcodec mpeg4 -acodec aac -strict -2 output.mp4
But this can take a long time if the video is large. How can I support these formats using ffmpeg?
Yes, you can convert video on the fly using ffmpeg (strictly speaking, this is called transcoding).
Here's an older guide written for Linux. All of it should still work on Android.
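If you drive the ffmpeg binary from native code, one possible pattern (a sketch; the flags and file names are illustrative) is to read its output through a pipe while it is still transcoding, instead of waiting for a finished file:

#include <stdio.h>

int main(void)
{
    unsigned char buf[4096];
    size_t n;

    /* -f mpegts keeps the output streamable; pipe:1 is ffmpeg's stdout */
    FILE *p = popen("ffmpeg -i input.avi -vcodec mpeg4 -acodec aac "
                    "-strict -2 -f mpegts pipe:1", "r");
    if (!p)
        return 1;

    /* each chunk arrives while ffmpeg is still encoding; forward it to
       your player or HTTP response as it comes in */
    while ((n = fread(buf, 1, sizeof buf, p)) > 0)
        fwrite(buf, 1, n, stdout);

    return pclose(p);
}

MPEG-TS is used here because it can be consumed progressively; a plain MP4 cannot be played back this way unless it is fragmented.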
I'm trying to encode a local/static input file (an MP4, for example) into a smaller video file (by resizing, lowering the video quality, etc.) and stream it in parallel (i.e. I can't wait for the encoding process to finish before streaming it back), so that it can be played by an Android client (the standard Android video player).
So I've tried using ffmpeg as follows:
ffmpeg -re -i input.mp4 -g 52 -acodec libvo_aacenc -ab 64k -vcodec libx264 -vb 448k -f mp4 -movflags frag_keyframe+empty_moov -
Notice I'm using stdout as the output, so I can run ffmpeg and stream its output on the fly.
However, such methods (and other similar ones) don't seem to work on Android - it simply can't play "non-standard" files (such as fragmented MP4s) once it receives them - it seems like the empty moov atom messes it up.
I also tried other container formats, such as 3GPP and WebM.
I'd love to hear any kind of input on this issue...
Thanks
You can specify multiple outputs in ffmpeg; see http://trac.ffmpeg.org/wiki/Creating%20multiple%20outputs
For Android newer than 3.0, try HLS as an output.
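For example, something along these lines with a reasonably recent ffmpeg (segment length and file names are illustrative):
ffmpeg -i input.mp4 -vcodec libx264 -acodec aac -strict -2 -f hls -hls_time 10 playlist.m3u8
This writes the playlist plus a series of .ts segments that the Android client can fetch over HTTP as they are produced.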
Is there anything that does the opposite of a MediaExtractor on Android?
Something that takes one or more streams from MediaCodecs (e.g. 1 video and 1 audio)
and packages them in a container format for streaming or writing to files?
Looks like the answer is no.
Mostly because the underlying API is designed for video streaming and not for video compression.
If you write the encoder's output to a file, you'll get a raw H.264 file, which can be played using mplayer or ffplay, for example.
ffmpeg can also be used to mux this raw file into a container, but first you need to build ffmpeg for Android.
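Once built, a remux along these lines should work without re-encoding (file names are placeholders; -f h264 forces the raw H.264 demuxer):
ffmpeg -f h264 -i out.h264 -vcodec copy out.mp4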
I have the following stream: mms://77.238.11.5:8080. You can access it using Windows Media Player.
I can't find any way to view it on Android devices using MediaPlayer or VideoView, so my idea is to convert it using VLC or FFMPEG to a different format like MP4 or something else.
You can use ffmpeg to convert the stream to a file:
ffmpeg -i "mmsh://77.238.11.5:8080/" output.mp4
The -t duration option limits the length of the recording, if needed.
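For example, to capture one minute of the stream:
ffmpeg -i "mmsh://77.238.11.5:8080/" -t 60 output.mp4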
They say that VPlayer can play such streams.
I have saved the MJPEG stream to the SD card as xxx.mjpeg. However, MJPEG video files are not supported on Android. So how can I encode the MJPEG video into 3GP or MP4 format, store it on the SD card, and finally play the 3GP or MP4 video back on my Android phone? Thanks in advance.
I am not aware of the current state of MoboPlayer's ffmpeg source out there; I built it long ago.
I tried RockPlayer's ffmpeg port instead. It builds hassle-free.
I was able to build it successfully today on NDK r4b.
You can download the source from here: http://www.rockplayer.com/tech_en.html
Modify config.mk to change your tool paths, then run build_andriod.sh (the spelling is wrong, but it works :) ).
Let me know how it goes.
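Once you have an ffmpeg binary, a conversion along these lines should work for the question above (flags are illustrative; -f mjpeg forces the raw MJPEG demuxer):
ffmpeg -f mjpeg -i xxx.mjpeg -vcodec mpeg4 output.mp4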
There is no way you can achieve this with the current Android API.
You need to encode the frames using an encoder in C++ and pass your bitmaps to the encoder via JNI.
You can start with MoboPlayer's ffmpeg port. You may find the download link to their ffmpeg port at the bottom of this page.
If you have the image sequence as Bitmaps, you can access each Bitmap's buffer from JNI using the AndroidBitmap_* methods and pass it on to ffmpeg for encoding, as sketched below.
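A sketch of what the JNI side could look like (the Java class/method and encode_frame are hypothetical placeholders; the AndroidBitmap_* functions are the real NDK API from android/bitmap.h):

#include <jni.h>
#include <android/bitmap.h>

/* hypothetical helper, implemented elsewhere on top of libavcodec */
extern int encode_frame(const void *pixels, int width, int height, int stride);

JNIEXPORT jint JNICALL
Java_com_example_Encoder_encodeBitmap(JNIEnv *env, jobject thiz, jobject bitmap)
{
    AndroidBitmapInfo info;
    void *pixels;

    /* query the Bitmap's width, height, stride and pixel format */
    if (AndroidBitmap_getInfo(env, bitmap, &info) < 0)
        return -1;
    /* pin the pixel buffer so native code can read it */
    if (AndroidBitmap_lockPixels(env, bitmap, &pixels) < 0)
        return -1;

    /* pixels points at the Bitmap's data (e.g. RGBA_8888); convert it to
       the encoder's pixel format and encode one frame */
    int ret = encode_frame(pixels, info.width, info.height, info.stride);

    AndroidBitmap_unlockPixels(env, bitmap);
    return ret;
}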