I'm trying to implement a video compressor on Android using the MediaCodec API.
I've successfully decoded a file into raw format and re-encoded it, and I now have separate encoded tracks (video and audio). How do I store these tracks in a container format? Are there any built-in functions available to do this?
How do I synchronize the audio, video and subtitle tracks?
Android 4.3 (API 18) added the MediaMuxer class. This allows you to convert the raw H.264 video stream to a .mp4 file, optionally merging an audio stream in.
Each frame of data is submitted with a presentation time stamp.
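A rough sketch of the MediaMuxer drain loop for a single video track, assuming `encoder` is an already-started H.264 `MediaCodec` and `outputPath` is a placeholder for your .mp4 destination (class and method names here are illustrative, not from any particular sample):

```java
import java.io.IOException;
import android.media.MediaCodec;
import android.media.MediaMuxer;

// Sketch only (API 18+): pull encoded buffers out of a MediaCodec
// and hand them to a MediaMuxer writing an .mp4 file.
public class MuxerSketch {
    public static void mux(MediaCodec encoder, String outputPath) throws IOException {
        MediaMuxer muxer = new MediaMuxer(outputPath,
                MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        int track = -1;
        boolean started = false, eos = false;
        while (!eos) {
            int index = encoder.dequeueOutputBuffer(info, 10_000);
            if (index == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                // Start the muxer with the encoder's actual output format,
                // which carries the SPS/PPS (csd-0/csd-1) data.
                track = muxer.addTrack(encoder.getOutputFormat());
                muxer.start();
                started = true;
            } else if (index >= 0) {
                boolean config = (info.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0;
                if (started && info.size > 0 && !config) {
                    // info.presentationTimeUs is what keeps tracks in sync.
                    muxer.writeSampleData(track, encoder.getOutputBuffers()[index], info);
                }
                encoder.releaseOutputBuffer(index, false);
                eos = (info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0;
            }
        }
        muxer.stop();
        muxer.release();
    }
}
```

For audio plus video, call `addTrack()` once per format and delay `muxer.start()` until both tracks are added; synchronization then comes down to giving each sample a correct `presentationTimeUs`.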
Related
Since MediaRecorder does not support encoding to the mp3 format:
can ffmpeg record audio directly from an Android audio source (and how do I specify the source), or can it only transcode one file format into another?
You mean using Android to record audio and save it as mp3?
If so, you can encode mp3 with libmp3lame, which is very well suited to this.
Android's AudioRecord outputs PCM, and libmp3lame can take PCM as input and output mp3 data.
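A rough sketch of the capture side, assuming you have your own JNI binding around libmp3lame; `Mp3Encoder` and its `encode()` method are hypothetical names for that binding (wrapping `lame_encode_buffer()` underneath), not a real Android API:

```java
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;

// Sketch: read PCM from AudioRecord and feed it to a libmp3lame wrapper.
public class Mp3Recorder {
    // Hypothetical JNI binding around lame_encode_buffer(); implement yourself.
    public interface Mp3Encoder {
        int encode(short[] pcm, int samples, byte[] mp3Out);
    }

    static final int SAMPLE_RATE = 44100;
    volatile boolean recording = true;

    public void record(Mp3Encoder lame) {
        int minBuf = AudioRecord.getMinBufferSize(SAMPLE_RATE,
                AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
        AudioRecord recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
                SAMPLE_RATE, AudioFormat.CHANNEL_IN_MONO,
                AudioFormat.ENCODING_PCM_16BIT, minBuf * 2);
        short[] pcm = new short[minBuf];
        byte[] mp3 = new byte[minBuf * 2];
        recorder.startRecording();
        while (recording) {
            int read = recorder.read(pcm, 0, pcm.length);
            if (read > 0) {
                int bytes = lame.encode(pcm, read, mp3);
                // write the first `bytes` bytes of `mp3` to your output file
            }
        }
        recorder.stop();
        recorder.release();
    }
}
```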
I want to play some animations (actually, GIFs) that were previously encoded with the H.264 codec and downloaded from the network as mp4 files (because mp4 files are much smaller than the corresponding GIFs). So, on the device I want to decode the mp4 and extract all the frames to build the animation. The question is: what is the best way to decode the mp4 to accomplish this?
I have been trying to list all supported codecs on an Android device using IOMX, as shown here, by binding to the media player service. I got my code working and obtained the list of components, but I noticed there were no encoder components; only decoder components were listed. Then I opened the built-in camera application provided by Android and recorded a video, which was stored in the mp4 file format. When I checked the mp4 file's codec information in VLC, it showed "H264 mpeg4 part10 avc". So here is my doubt: if no component is listed for H.264/AVC encoding, how can Android encode frames in H.264 format?
Any suggestions?
Thanks.
In the Android distribution, there is a suite of codecs bundled by Google, commonly referred to as the "plain vanilla codecs". In the case of H.264, an encoder is bundled, whose sources are at frameworks/av/media/libstagefright/codecs/avc/enc/. Hence, when you encode from the camera, this default codec is used to encode the frames. To verify this, you can enable the logs in OMXCodec or ACodec and observe that the name of the encoder component is OMX.google.h264.encoder.
The complete list of codecs is available in SoftOMXPlugin.cpp at frameworks/av/media/libstagefright/omx/.
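If you just need the list at runtime, you don't have to bind to IOMX at all: the public MediaCodecList API (API 16+) enumerates the same components, encoders included. A minimal sketch (class and method names here are my own, not from the platform sources):

```java
import android.media.MediaCodecInfo;
import android.media.MediaCodecList;

// Sketch: enumerate all codecs through the public MediaCodecList API.
// Bundled encoders such as OMX.google.h264.encoder show up in this list.
public class CodecLister {
    public static void dumpCodecs() {
        for (int i = 0; i < MediaCodecList.getCodecCount(); i++) {
            MediaCodecInfo info = MediaCodecList.getCodecInfoAt(i);
            StringBuilder line = new StringBuilder(info.getName())
                    .append(info.isEncoder() ? " (encoder):" : " (decoder):");
            for (String type : info.getSupportedTypes()) {
                line.append(' ').append(type);
            }
            System.out.println(line);
        }
    }
}
```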
Is there anything that does the opposite of a MediaExtractor on Android?
Something that takes one or more streams from MediaCodecs (e.g. one video and one audio)
and packages them in a container format for streaming or writing to a file?
Looks like the answer is no.
Mostly because the underlying API is designed for video streaming, not video compression.
If you write the encoder's output to a file, you'll get a raw H.264 file, which can be played with mplayer or ffplay, for example.
ffmpeg can also be used to mux this raw file into a container, but first of all you need to build ffmpeg for Android.
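Dumping the encoder's output is just a matter of writing every output buffer to disk. A minimal sketch, assuming `encoder` is an already-started H.264 `MediaCodec` (the class and method names are illustrative):

```java
import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.ByteBuffer;
import android.media.MediaCodec;

// Sketch: write a MediaCodec H.264 encoder's output buffers straight to a
// file. The result is a raw Annex-B .h264 stream with no container, which
// ffplay or mplayer can play directly.
public class RawH264Writer {
    public static void drainTo(MediaCodec encoder, FileOutputStream out)
            throws IOException {
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        boolean eos = false;
        while (!eos) {
            int index = encoder.dequeueOutputBuffer(info, 10_000);
            if (index >= 0) {
                ByteBuffer buf = encoder.getOutputBuffers()[index];
                byte[] chunk = new byte[info.size];
                buf.position(info.offset);
                buf.get(chunk);
                out.write(chunk); // SPS/PPS and frames, all Annex-B NAL units
                encoder.releaseOutputBuffer(index, false);
                eos = (info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0;
            }
        }
    }
}
```

On the desktop, a file produced this way can then be muxed into mp4 without re-encoding, e.g. `ffmpeg -framerate 30 -i out.h264 -c copy out.mp4` (the frame rate is an assumption you must supply, since a raw stream carries no timing).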
I need to encode the audio/video files that are stored on the SD card.
Later on I have a user interface with a media player, in which I play these encoded files back after decoding them.
The reason I need to encode these files is that I don't want them to be playable by Android's default media player; they should only be playable by my own app's media player.
Could you please suggest some audio/video encoders and decoders supported by Android?
Thank you.