I'm looking for audio and video encoders & decoders, either open-source or commercially available, for an application I want to write for Android. For audio I want to be able to both encode and decode the AMR/ADPCM/AAC formats, and for video the H.263/H.264 & MPEG-4 formats.
I can see from the Android documentation that encoding & decoding of the AMR-NB audio format is provided by the platform, and that H.263 is provided for video. But for the rest of the codecs (both audio & video) that I've listed, a decoder is there but not an encoder (if I got that right).
Can anyone please give me pointers/suggestions on how/where I can find these codecs, optimized for or suitable for Android?
Thanks & Regards,
Harsha
Can anyone please give me pointers/suggestions on how/where I can find these codecs, optimized for or suitable for Android?
Contact PacketVideo (authors of the OpenCORE multimedia engine), and be prepared to write a check for a very large sum of money.
Or, use the Native Development Kit (NDK) and transcode the video from a supported format to the one you want.
Or, use a server to transcode the video from a supported format to the one you want.
In my project I need to implement HLS (HTTP Live Streaming) between an Android device and an iOS device: the Android device records the video and sends it to a server, and the iOS device plays the stream from the server using an m3u8 file. In the link below
Click Here
they mention: "Currently, the supported delivery format is MPEG-2 Transport Streams for audio-video".
Now the problem is that on Android you can only record to MP4 by default (correct me if I am wrong). So I need some third-party API or library, like FFmpeg, GStreamer, Xuggler or JCodec, to transcode the recorded MP4 into TS files.
FFmpeg, jffmpeg and GStreamer all have a learning curve and setup time, and they need the NDK. I don't have enough time to try each of them, so please point me to a library that is easy to use and doesn't involve a complex learning curve and setup, something like JCodec, which is a pure-Java, plug-and-play type library. But I don't think JCodec can do this for me: their documentation mentions only H.262 codec support so far, whereas I need H.264 and AAC for the audio.
FYI:
JJMPEG
It is a Java binding to FFmpeg and it has an Android version too. Maybe you can give it a try.
https://code.google.com/p/jjmpeg/
Or:
Maybe you can just record the video in a supported encoding and transcode it on the server side?
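If you do go the server-side route, one option is simply to shell out to ffmpeg there. Below is a minimal sketch, assuming ffmpeg is installed on the server and that the recorded MP4 already contains H.264/AAC so the streams can be copied rather than re-encoded; the file names are placeholders:

```java
import java.io.IOException;
import java.util.Arrays;

// Hypothetical server-side helper: remux a recorded MP4 into an MPEG-TS file
// by shelling out to ffmpeg. Assumes ffmpeg is on the server's PATH; the
// h264_mp4toannexb bitstream filter converts the H.264 stream to the Annex B
// form that the TS muxer expects.
public class Mp4ToTs {
    public static void remux(String inputMp4, String outputTs)
            throws IOException, InterruptedException {
        ProcessBuilder pb = new ProcessBuilder(Arrays.asList(
                "ffmpeg", "-i", inputMp4,
                "-c:v", "copy",                 // keep the H.264 video as-is
                "-c:a", "copy",                 // keep the AAC audio as-is
                "-bsf:v", "h264_mp4toannexb",   // MP4 -> Annex B for TS muxing
                "-f", "mpegts", outputTs));
        pb.inheritIO();
        int exit = pb.start().waitFor();
        if (exit != 0) {
            throw new IOException("ffmpeg failed with exit code " + exit);
        }
    }

    public static void main(String[] args) throws Exception {
        remux("recorded.mp4", "stream.ts");
    }
}
```

For actual HLS delivery you would additionally segment the TS output and generate the m3u8 playlist; recent ffmpeg builds can do that step as well.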
I am developing a media player application on Android which uses FFmpeg for decoding, which I think is software decoding. It doesn't play high-resolution videos smoothly, so I would like to switch to hardware decoding. I came to know that libstagefright will do this, but how do I implement it using libstagefright? Are there any samples or documentation? Please help me with using libstagefright.
If you are using ICS you can use MediaCodec to encode or decode using hardware.
See http://developer.android.com/reference/android/media/MediaCodec.html for more details and examples.
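To give an idea of what that looks like, here is a rough sketch of the usual MediaExtractor + MediaCodec decode loop (MediaCodec is available from API level 16). The file path and output Surface are assumed to come from the caller; error handling, the audio track and A/V sync are left out:

```java
import android.media.MediaCodec;
import android.media.MediaExtractor;
import android.media.MediaFormat;
import android.view.Surface;

import java.io.IOException;
import java.nio.ByteBuffer;

// Rough sketch of hardware video decoding with MediaExtractor + MediaCodec.
public class HardwareVideoDecoder {
    public static void decode(String path, Surface surface) throws IOException {
        MediaExtractor extractor = new MediaExtractor();
        extractor.setDataSource(path);

        // Find and select the first video track.
        MediaFormat format = null;
        String mime = null;
        for (int i = 0; i < extractor.getTrackCount(); i++) {
            MediaFormat f = extractor.getTrackFormat(i);
            String m = f.getString(MediaFormat.KEY_MIME);
            if (m != null && m.startsWith("video/")) {
                extractor.selectTrack(i);
                format = f;
                mime = m;
                break;
            }
        }
        if (format == null) throw new IOException("no video track found");

        MediaCodec codec = MediaCodec.createDecoderByType(mime);
        codec.configure(format, surface, null, 0);
        codec.start();

        ByteBuffer[] inputBuffers = codec.getInputBuffers();
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        boolean inputDone = false;
        boolean outputDone = false;

        while (!outputDone) {
            if (!inputDone) {
                int inIndex = codec.dequeueInputBuffer(10000);
                if (inIndex >= 0) {
                    int size = extractor.readSampleData(inputBuffers[inIndex], 0);
                    if (size < 0) {
                        codec.queueInputBuffer(inIndex, 0, 0, 0,
                                MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                        inputDone = true;
                    } else {
                        codec.queueInputBuffer(inIndex, 0, size,
                                extractor.getSampleTime(), 0);
                        extractor.advance();
                    }
                }
            }
            int outIndex = codec.dequeueOutputBuffer(info, 10000);
            if (outIndex >= 0) {
                // Render to the Surface; a real player would pace this
                // against the presentation timestamps in "info".
                codec.releaseOutputBuffer(outIndex, true);
                if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                    outputDone = true;
                }
            }
        }

        codec.stop();
        codec.release();
        extractor.release();
    }
}
```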
Thanks,
NinjAndroid,
MoMinis R&D team
I am receiving MPEG-TS (MPEG transport stream) packets with multiplexed H.264 video and AAC audio streams. I need to be able to play the audio and video on an Android phone. My assumption is that I need:
MPEG-TS de-multiplexer
AAC decoder
H.264 decoder
Synchronize the audio and video playback
Assuming I am right, then (in Android 2.x) the MPEG-TS de-multiplexer is not part of the OS and must be ported. Both the AAC and H.264 decoders are part of the Android OS, but I am not sure whether they expose an interface that allows passing the data in buffers, or whether they allow mutual timing synchronization. In the worst case those components must be ported as well.
Can you give me some advice on where to start? I was thinking about porting FFmpeg. Are there any other ways?
Regards,
STeN
Android 4.x has OpenMAX AL, which can play TS with H.264 and AAC. You don't even need to worry about synchronisation of audio and video.
Look at the nativemedia sample in the NDK.
If you want to support previous versions of Android, then FFmpeg might be a good choice, but the most it can give you is decoded video frames (in RGB or another format) and decoded audio in PCM. You will then have to implement the video renderer and audio playback yourself. I would recommend reading this tutorial: http://dranger.com/ffmpeg/. It is not Android-specific, but it will give you an idea of how video playback works.
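For the audio half, the decoded PCM coming back from native code can be fed into an AudioTrack on the Java side. A minimal sketch, assuming 16-bit stereo PCM whose sample rate matches the decoder output:

```java
import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;

// Minimal sketch of the audio-playback half: feed 16-bit PCM buffers
// (e.g. handed up from native FFmpeg code through JNI) into an AudioTrack.
// The sample rate and channel layout are assumptions and must match the decoder.
public class PcmAudioOutput {
    private final AudioTrack track;

    public PcmAudioOutput(int sampleRate) {
        int minBuf = AudioTrack.getMinBufferSize(sampleRate,
                AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_16BIT);
        track = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate,
                AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_16BIT,
                minBuf * 2, AudioTrack.MODE_STREAM);
        track.play();
    }

    // Called with each decoded PCM buffer; blocks until the data is queued.
    public void write(byte[] pcm, int length) {
        track.write(pcm, 0, length);
    }

    public void release() {
        track.stop();
        track.release();
    }
}
```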
You may refer to the android-ffmpeg project on github.
https://github.com/guardianproject/android-ffmpeg
In Gingerbread (2.3) there is actually an MPEG-TS parser in the Stagefright framework that you could use, and I believe it is well integrated with the H.264 and AAC decoders. The MPEG-TS parser is not advertised anywhere, but the support is silently sitting there. I believe they brought it in to support Apple HTTP Live Streaming in Honeycomb or a later version, but the code is sitting in the Gingerbread (2.3) codebase as well. With a minor modification to the framework, you can play back HTTP Live Streaming (which actually sends TS packets). I hope the above information is helpful for you.
Vibgyor
(DISCLAIMER: I'm personally involved in developing the free and open source program linked below)
A static build of FFmpeg (both library and command line) is provided by ZShaolin (http://dyne.org/software/zshaolin), which also contains other media conversion tools.
Using it can facilitate scripting experiments without having to compile FFmpeg from scratch.
How can I get an audio file recorded on an iPhone to play back on Android?
I don't see the iLBC codec listed in the decoder section of the Android supported media formats.
Looking at the iPhone's list of supported audio formats (iPhone audio formats), it looks like if you're just going iPhone => Android, then AAC or PCM are your best bets for encoding on the iPhone and decoding on Android.
You should record your audio on the iPhone with AVAudioRecorder in .wav format.
See this thread for how to configure AVAudioRecorder to get WAV output on iOS.
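On the Android side, a plain .wav file can then be played back with the stock MediaPlayer; a minimal sketch, with the file path as a placeholder:

```java
import android.media.MediaPlayer;
import java.io.IOException;

// Play a PCM .wav file (e.g. one recorded on the iPhone) with MediaPlayer.
public class WavPlayback {
    public static MediaPlayer play(String path) throws IOException {
        MediaPlayer player = new MediaPlayer();
        player.setDataSource(path); // placeholder path to the received .wav
        player.prepare();
        player.start();
        return player; // caller should release() when done
    }
}
```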
Good luck
In terms of a built-in capability you can check the documentation as easily as I can.
Assuming there is no built-in capability: if you have working Java, C, or (with caveats) C++ code capable of decoding the file to linear PCM samples, and no legal obstacles to using it, then you can write an application to do so.
I have simplified my question and offered a bounty:
What options are there for compressing raw PCM audio data to an MP3 on an Android device?
My original post:
I'm creating a synthesiser on my Android phone, and I've been generating PCM data to send to the speakers. Now I'm wondering if I can encode this PCM data as an MP3 to save to the SD card. The MediaRecorder object can encode audio coming from the microphone into various formats, but doesn't allow encoding of programmatically generated audio data.
So my question is, is there a standard Android API for encoding audio? If not, what pure Java or NDK-based solutions are there? And can you recommend any of them?
Failing that, I'll just have to save my generated audio as a WAV file, which I can easily do.
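For reference, the WAV fallback is just a RIFF header in front of the raw samples; something along these lines is what I have in mind (a sketch for 16-bit mono PCM, with the sample rate as a parameter):

```java
import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.charset.StandardCharsets;

// Wrap raw 16-bit mono PCM samples in a minimal RIFF/WAVE header and write
// them to a file. Mono is an assumption; adjust to match the synth output.
public class WavWriter {
    public static void write(String path, short[] samples, int sampleRate)
            throws IOException {
        int channels = 1;
        int bitsPerSample = 16;
        int dataSize = samples.length * 2;
        ByteBuffer buf = ByteBuffer.allocate(44 + dataSize)
                .order(ByteOrder.LITTLE_ENDIAN);

        buf.put("RIFF".getBytes(StandardCharsets.US_ASCII));
        buf.putInt(36 + dataSize);                 // remaining chunk size
        buf.put("WAVE".getBytes(StandardCharsets.US_ASCII));
        buf.put("fmt ".getBytes(StandardCharsets.US_ASCII));
        buf.putInt(16);                            // fmt chunk size
        buf.putShort((short) 1);                   // audio format: PCM
        buf.putShort((short) channels);
        buf.putInt(sampleRate);
        buf.putInt(sampleRate * channels * bitsPerSample / 8); // byte rate
        buf.putShort((short) (channels * bitsPerSample / 8));  // block align
        buf.putShort((short) bitsPerSample);
        buf.put("data".getBytes(StandardCharsets.US_ASCII));
        buf.putInt(dataSize);
        for (short s : samples) {
            buf.putShort(s);
        }

        try (FileOutputStream out = new FileOutputStream(path)) {
            out.write(buf.array());
        }
    }
}
```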
Pure Java
Look into Tritonus's clean-room implementation of Java Sound, which offers an MP3 encoder plugin here: http://www.tritonus.org/plugins.html
Secondly, I would suggest looking into JavaZoom's libraries JLayer or JLayerME: http://www.javazoom.net/javalayer/javalayer.html (these may be decode-only, I'm not sure).
If those don't suit your needs, you can look at this article from 2000 about adding MP3 capabilities to J2SE (with source): http://www.javaworld.com/javaworld/jw-11-2000/jw-1103-mp3.html
Native route
If you want "native" performance I would look at an FFmpeg or Lame port for Android.
Lame: http://lame.sourceforge.net/
As far as I know, you can't do this using only the tools in the SDK. According to the official developer guide there isn't an MP3 encoder in the platform (Android Supported Media Formats), so you have to port an encoder yourself using the NDK, then write some wrapper code to receive the audio samples through JNI.
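The Java side of such a wrapper would look roughly like the sketch below. Note that these native methods are purely illustrative; they don't exist until you implement them in C against a LAME (or other encoder) build you've compiled with the NDK:

```java
// Hypothetical Java side of a JNI wrapper around a native MP3 encoder.
// The library name and method signatures are illustrative only.
public class Mp3Encoder {
    static {
        System.loadLibrary("mp3lame_jni"); // your NDK-built wrapper library
    }

    // Set up an encoder instance on the native side.
    public native void init(int sampleRate, int channels, int bitrateKbps);

    // Encode one buffer of 16-bit PCM samples; returns the number of MP3
    // bytes written into "mp3Out".
    public native int encode(short[] pcm, int sampleCount, byte[] mp3Out);

    // Flush remaining frames and free the native encoder.
    public native int close(byte[] mp3Out);
}
```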
I'm currently porting some audio decoders from the Rockbox project for my own music player. Rockbox can record audio to MP3, so maybe you should look into its source and find the encoder library. Most of the decoders have ARM optimizations, which speed things up noticeably, so I guess some of the encoders have this addition as well.
An MP3 encoder is not available in Android. You have to compile libav with the LAME MP3 library; you can find code at
http://libavandroid.wordpress.com