Starting with Android 4.1 (API level 16), the MediaCodec APIs were introduced. These APIs support decoding and encoding of elementary streams, and the MediaExtractor API provides elementary track details by analyzing media streams.
My question: I set up a video encoder using the MediaCodec API, which gives me an encoded file in raw .h264 format. I want to write that .h264 stream into an .mp4 file for playing/storing/sharing purposes. I can't find any .mp4 file-writer API for Android. Is there any way to achieve this?
Thanks,
Satish.
As of Android 4.3 (API 18) you can use the MediaMuxer class to convert the raw H.264 stream to a .mp4 file (and even merge an audio stream in).
See the EncodeAndMuxTest and CameraToMpegTest sources on this page for sample code.
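In outline, feeding the encoder's output into MediaMuxer looks something like the sketch below (assuming an already-configured and started H.264 encoder; the buffer handling follows the pre-API-21 getOutputBuffers() style, and error handling is omitted):

```java
import android.media.MediaCodec;
import android.media.MediaFormat;
import android.media.MediaMuxer;

import java.nio.ByteBuffer;

// Sketch (API 18+): drain an already-running H.264 MediaCodec encoder into an
// .mp4 file via MediaMuxer. Assumes the caller feeds input elsewhere and
// eventually signals end-of-stream.
public class MuxerSketch {
    public static void drainEncoderToMp4(MediaCodec encoder, String outputPath) throws Exception {
        MediaMuxer muxer = new MediaMuxer(outputPath,
                MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        int trackIndex = -1;
        boolean muxerStarted = false;

        while (true) {
            int outIndex = encoder.dequeueOutputBuffer(info, 10000);
            if (outIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                // Reported once the encoder has seen input: the format now carries
                // the SPS/PPS, so the track can be added and the muxer started.
                MediaFormat newFormat = encoder.getOutputFormat();
                trackIndex = muxer.addTrack(newFormat);
                muxer.start();
                muxerStarted = true;
            } else if (outIndex >= 0) {
                ByteBuffer encodedData = encoder.getOutputBuffers()[outIndex];
                if ((info.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
                    // Codec config data is already in the track format; don't mux it.
                    info.size = 0;
                }
                if (info.size > 0 && muxerStarted) {
                    encodedData.position(info.offset);
                    encodedData.limit(info.offset + info.size);
                    muxer.writeSampleData(trackIndex, encodedData, info);
                }
                encoder.releaseOutputBuffer(outIndex, false);
                if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                    break;
                }
            }
        }
        muxer.stop();
        muxer.release();
    }
}
```

The key point is that the muxer cannot be started until the encoder reports INFO_OUTPUT_FORMAT_CHANGED, because that output format carries the SPS/PPS needed to write a valid MP4.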
This response may be of use. It suggests using the isoparser library. It works pretty well if you have elementary streams saved to disk, but it doesn't work if you want to live stream from the MediaCodec output.
Another way is to use FFmpeg (http://sourceforge.net/projects/ffmpeg4android/) to mux the .h264 file into an .mp4 container.
Related
There is some good documentation on the site called Big Flake about how to use MediaMuxer and MediaCodec to encode and then decode video as MP4, or to extract video and re-encode it, among other things.
But there doesn't seem to be a way to encode audio together with the video; there is no documentation or code about this, even though it doesn't seem impossible.
Question
Do you know of a stable way of doing this that will work on all devices running API 18 or higher?
Why has no one implemented it? Is it hard to implement?
You have to create two MediaCodec instances, one for video and one for audio, and then use MediaMuxer to mux the video with the audio after encoding. You can take a look at ExtractDecodeEditEncodeMuxTest.java, and at this project, which captures the camera/mic and saves to an MP4 file using MediaMuxer and MediaCodec. A minimal setup sketch is shown below.
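A minimal setup sketch, assuming API 18+ (the MIME types are the real ones; the resolution, bit rates, and sample rate are just illustrative values):

```java
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.media.MediaMuxer;

// Sketch: one MediaCodec for video, one for audio, both feeding a single
// MediaMuxer. Draining each encoder works as in the usual MediaCodec loop.
public class AvEncoderSetup {
    MediaCodec videoEncoder;
    MediaCodec audioEncoder;
    MediaMuxer muxer;
    int videoTrack = -1, audioTrack = -1;

    void prepare(String outputPath) throws Exception {
        // H.264 video encoder taking Surface input (e.g. from the camera preview).
        MediaFormat videoFormat = MediaFormat.createVideoFormat("video/avc", 1280, 720);
        videoFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
        videoFormat.setInteger(MediaFormat.KEY_BIT_RATE, 4000000);
        videoFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
        videoFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);
        videoEncoder = MediaCodec.createEncoderByType("video/avc");
        videoEncoder.configure(videoFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        // With COLOR_FormatSurface, call videoEncoder.createInputSurface() before start().

        // AAC audio encoder taking PCM from AudioRecord as input buffers.
        MediaFormat audioFormat = MediaFormat.createAudioFormat("audio/mp4a-latm", 44100, 1);
        audioFormat.setInteger(MediaFormat.KEY_AAC_PROFILE,
                MediaCodecInfo.CodecProfileLevel.AACObjectLC);
        audioFormat.setInteger(MediaFormat.KEY_BIT_RATE, 128000);
        audioEncoder = MediaCodec.createEncoderByType("audio/mp4a-latm");
        audioEncoder.configure(audioFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);

        muxer = new MediaMuxer(outputPath, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
        // Add each track when its encoder reports INFO_OUTPUT_FORMAT_CHANGED,
        // and call muxer.start() only after BOTH tracks have been added.
    }
}
```

Each encoder is drained separately, and muxer.writeSampleData() is called with the matching track index; the muxer must not be started until both encoders have delivered their output formats.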
How can I convert a .wav audio file to .m4a programmatically using API 14?
MediaCodec and MediaMuxer are only supported on newer API versions (16 and 18 respectively), not on API 14.
FFmpeg is a viable option that seems to be reliable. I'm currently using it in a production app and it serves its purpose.
I have been asked to display a video stream (not delivered over HTTP) in Android. The stream is raw H.264, recorded and encoded on a PC, and I receive it over Wi-Fi.
Once I receive the stream, can I use the MediaCodec decoder to decode it and display it?
Yes. Configure the MediaCodec as a "video/avc" decoder, and pass an output Surface to the configure() call.
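A bare-bones configuration sketch (the width and height are placeholders and must match the incoming stream):

```java
import android.media.MediaCodec;
import android.media.MediaFormat;
import android.view.Surface;

// Sketch: configure a MediaCodec as an H.264 decoder rendering directly to a
// Surface (e.g. from a SurfaceView or TextureView).
public class RawH264Decoder {
    MediaCodec decoder;

    void start(Surface outputSurface) throws Exception {
        MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280, 720);
        // The SPS/PPS must reach the decoder, either as "csd-0"/"csd-1" buffers
        // set on this format or as the first NAL units queued in-band.
        decoder = MediaCodec.createDecoderByType("video/avc");
        // Passing the Surface makes the codec render decoded frames itself;
        // releaseOutputBuffer(index, true) then displays each frame.
        decoder.configure(format, outputSurface, null, 0);
        decoder.start();
        // Feed raw H.264 access units with queueInputBuffer(), and drain/render
        // with dequeueOutputBuffer() + releaseOutputBuffer(index, true).
    }
}
```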
The MediaCodec API is pretty low-level, and there's not a lot of sample code available. It might be easier to use MediaPlayer instead.
Update:
There's now a bunch of sample code here. Most of it makes use of Android 4.3 (API 18) features, but if you don't need MediaMuxer or Surface input to MediaCodec it'll work on API 16.
See the Video Encoding Recommendations here.
I am working on a music player for Android, and I want to know which would be better for me:
FFmpeg or OpenSL ES? Which one is easier to deal with?
Thanks
(Old question, but since I'm passing by...)
You can't compare the two; they don't do the same job:
FFmpeg will decode your stream (an MP3 file, for example) and output PCM, for instance.
OpenSL will transmit the PCM samples to your audio hardware to output the sound (and can apply filters and effects).
Actually, OpenSL with API 14 (Android 4.0) is also able to decode some audio codecs such as MP3.
I have simplified my question and offered a bounty:
What options are there for compressing raw PCM audio data to an MP3 on an Android device?
My original post:
I'm creating a synthesiser on my Android phone, and I've been generating PCM data to send to the speakers. Now I'm wondering whether I can encode this PCM data as an MP3 to save to the SD card. The MediaRecorder object can encode audio coming from the microphone into various formats, but it doesn't allow encoding programmatically generated audio data.
So my question is, is there a standard Android API for encoding audio? If not, what pure Java or NDK based solutions are there? And can you recommend any of them?
Failing this, I'll just have to save my generated audio as a WAV file, which I can easily do.
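For reference, the WAV route is just the raw PCM with a standard 44-byte RIFF/WAVE header in front; a minimal sketch (16-bit PCM assumed, class name made up):

```java
import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

// Sketch: write 16-bit PCM bytes to a .wav file by prepending a 44-byte
// little-endian RIFF/WAVE header.
public class WavWriter {
    public static void writeWav(String path, byte[] pcm,
                                int sampleRate, int channels) throws IOException {
        int byteRate = sampleRate * channels * 2;          // 16-bit samples
        ByteBuffer header = ByteBuffer.allocate(44).order(ByteOrder.LITTLE_ENDIAN);
        header.put("RIFF".getBytes()).putInt(36 + pcm.length).put("WAVE".getBytes());
        header.put("fmt ".getBytes()).putInt(16)           // PCM fmt chunk size
              .putShort((short) 1)                         // audio format = PCM
              .putShort((short) channels)
              .putInt(sampleRate)
              .putInt(byteRate)
              .putShort((short) (channels * 2))            // block align
              .putShort((short) 16);                       // bits per sample
        header.put("data".getBytes()).putInt(pcm.length);

        FileOutputStream out = new FileOutputStream(path);
        out.write(header.array());
        out.write(pcm);
        out.close();
    }
}
```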
Pure Java
Look into Tritonus's clean-room implementation of JavaSound, which offers an MP3 encoder plugin here: http://www.tritonus.org/plugins.html
Secondly, I would suggest looking into JavaZoom's libraries JLayer or JLayerME: http://www.javazoom.net/javalayer/javalayer.html (these may only decode, not sure).
If those don't suit your needs, you can look at this article from 2000 about adding MP3 capabilities to J2SE (with source): http://www.javaworld.com/javaworld/jw-11-2000/jw-1103-mp3.html
Native route
If you want "native" performance I would look at an FFmpeg or Lame port for Android.
Lame: http://lame.sourceforge.net/
As far as I know, you can't do this using only the tools in the SDK. According to the official developer guide (Android Supported Media Formats), there isn't an MP3 encoder in the platform, so you have to port an encoder on your own using the NDK, then write some wrapper code to pass the audio samples through JNI. A hypothetical wrapper is sketched below.
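The Java side of such a JNI wrapper could look roughly like this (the class name, method signatures, and library name are hypothetical placeholders for whatever encoder, e.g. LAME, you port and compile with the NDK):

```java
// Hypothetical JNI wrapper around an MP3 encoder (e.g. LAME) built with the NDK.
// The native side ("libmp3lame.so" here) has to be written and compiled separately.
public class LameEncoder {
    static {
        System.loadLibrary("mp3lame"); // illustrative library name
    }

    // Initialize the encoder with the PCM parameters of the generated audio.
    public native void init(int sampleRate, int channels, int bitRate);

    // Encode a block of 16-bit PCM samples; returns the number of MP3 bytes
    // written into mp3Buffer.
    public native int encode(short[] pcmLeft, short[] pcmRight,
                             int sampleCount, byte[] mp3Buffer);

    // Flush remaining frames and release native resources.
    public native int flush(byte[] mp3Buffer);
    public native void close();
}
```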
I'm currently working on porting some audio decoders from the Rockbox project for my own music player. Rockbox can record audio to MP3, so maybe you should look into its source and find the encoder library. Most of the decoders have ARM optimizations, which speed things up noticeably, so I guess some of the encoders have this as well.
An MP3 encoder is not available in Android. You have to compile libav with the LAME MP3 library; you can find code at
http://libavandroid.wordpress.com