I am trying to use x264 on Android to build a video system that does the encoding and decoding work. I've finished the video capture and encoding part, but I am not sure how to deal with the audio. Does x264 support audio recording, or should I use some other API provided by Android? If I do the video capture and audio capture separately, how can I make sure the two stay synchronized?
x264 is only a video encoder. For audio you need a separate audio codec (AAC could be used here, for example). Having both streams, you can mux them into an output file (.mkv or whatever container you choose).
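On the synchronization question: containers keep separately captured audio and video in sync through per-sample presentation timestamps, not arrival order. A minimal sketch of deriving those timestamps (plain Java; the class and method names are my own, not part of any Android API):

```java
// Sketch: deriving presentation timestamps (PTS) so that separately
// captured audio and video line up when muxed into a container.
// Audio PTS follows from the running sample count; video PTS from the
// frame index at a fixed frame rate.
public class PtsClock {
    public static long audioPtsUs(long totalSamplesWritten, int sampleRateHz) {
        return totalSamplesWritten * 1_000_000L / sampleRateHz;
    }

    public static long videoPtsUs(long frameIndex, int framesPerSecond) {
        return frameIndex * 1_000_000L / framesPerSecond;
    }
}
```

As long as both streams are stamped against the same starting instant, the muxer (or player) aligns them by these timestamps, regardless of which thread produced each buffer first.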
https://sites.google.com/site/linuxencoding/x264-encoding-guide
Related
There is some good documentation on the site called Big Flake about how to use MediaMuxer and MediaCodec to encode and then decode video as mp4, or extract video and then encode it again, and more.
But there doesn't seem to be a way to encode audio together with video at the same time; there is no documentation or code about this. It doesn't seem impossible, though.
Question
Do you know any stable way of doing it that will work on all devices running API level 18 (Android 4.3) or higher?
Why has no one implemented it? Is it hard to implement?
You have to create two MediaCodec instances, one for video and one for audio, and then use MediaMuxer to mux the video with the audio after encoding. You can take a look at ExtractDecodeEditEncodeMuxTest.java, and at this project, which captures camera/mic input and saves it to an mp4 file using MediaMuxer and MediaCodec.
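One detail worth knowing when draining two encoders into one MediaMuxer: each track's samples must be written with non-decreasing presentation timestamps, so a typical loop writes whichever pending sample (audio or video) has the smaller timestamp first. A sketch of that interleaving logic, using stand-in types rather than the real Android classes:

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.Deque;
import java.util.List;

// Sketch of the interleaving step a mux loop performs: always emit the
// pending sample (audio or video) with the smallest presentation
// timestamp. The Sample type is a stand-in, not an Android API class.
public class MuxOrder {
    record Sample(String track, long ptsUs) {}

    public static List<Sample> interleave(Deque<Sample> video, Deque<Sample> audio) {
        List<Sample> out = new ArrayList<>();
        while (!video.isEmpty() || !audio.isEmpty()) {
            boolean takeVideo =
                !video.isEmpty()
                && (audio.isEmpty() || video.peek().ptsUs() <= audio.peek().ptsUs());
            out.add(takeVideo ? video.poll() : audio.poll());
        }
        return out;
    }
}
```

In a real app the two queues are fed by the encoders' output buffers and each chosen sample goes to MediaMuxer.writeSampleData with its track index.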
I am working on an audio application and I am using pjsip to develop it. I compiled pjsip with Opus, but now I want to encode audio with a different bitrate. Is that possible? Please, can anyone help me?
I'm not familiar with pjsip or opus, but I can say from experience that it is possible to change audio bitrate using Android's MediaCodec. It should just be a matter of decoding the audio stream with one MediaCodec and re-encoding it with another MediaCodec with the newly specified bitrate. If, however, you want to resample the audio, you would have to do a lot of extra work by hand between the decode and encode steps provided by MediaCodec. Again, I can tell you from experience that it is possible though.
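For illustration, the "extra work by hand" for resampling between the decode and encode steps could be as simple as linear interpolation over the decoded PCM. This is a sketch only (a real resampler would apply a proper low-pass filter to avoid aliasing); the class name is my own:

```java
// Sketch: naive linear-interpolation resampler for mono 16-bit PCM.
// Production code would use a windowed-sinc or polyphase filter; this
// only illustrates the by-hand step between decode and re-encode.
public class Resampler {
    public static short[] resample(short[] in, int srcRateHz, int dstRateHz) {
        int outLen = (int) ((long) in.length * dstRateHz / srcRateHz);
        short[] out = new short[outLen];
        for (int i = 0; i < outLen; i++) {
            double srcPos = (double) i * srcRateHz / dstRateHz;
            int i0 = (int) srcPos;
            int i1 = Math.min(i0 + 1, in.length - 1);
            double frac = srcPos - i0;
            out[i] = (short) Math.round(in[i0] * (1 - frac) + in[i1] * frac);
        }
        return out;
    }
}
```

Changing only the bitrate (not the sample rate) needs none of this: the decoded PCM can be fed straight into the second MediaCodec configured with the new bitrate.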
Is there anything that does the opposite of a MediaExtractor on Android?
Take one or multiple streams from MediaCodecs (e.g. 1 video and 1 audio)
and package them in a container format for streaming or writing to files?
Looks like the answer is no.
Mostly because the underlying API is designed for video streaming, not for video compression.
Writing the encoder's output to a file gives you a raw H.264 file, which can be played using mplayer or ffplay, for example.
ffmpeg can also be used to mux this raw file into a container, but first you need to build ffmpeg for Android.
I am working on a music player for Android, and I want to know which would be better for me:
FFmpeg or OpenSL ES? Which one is easier to deal with?
Thanks
(Old question, but since I'm passing by...)
You can't compare the two; they don't do the same job:
FFmpeg will decode your stream (an MP3 file, for example) and output PCM
OpenSL ES will transmit the PCM samples to your audio hardware to output the sound (and can apply filters and effects)
Actually, OpenSL ES on API 14 (Android 4.0) and later is also able to decode some audio codecs, such as MP3.
I have simplified my question and offered a bounty:
What options are there for compressing raw PCM audio data to MP3 on an Android device?
My original post:
I'm creating a synthesiser on my Android phone, and I've been generating PCM data to send to the speakers. Now I'm wondering if I can encode this PCM data as MP3 to save to the sdcard. The MediaRecorder object can encode audio coming from the microphone into various formats, but it doesn't allow encoding programmatically generated audio data.
So my question is, is there a standard Android API for encoding audio? If not, what pure Java or NDK based solutions are there? And can you recommend any of them?
Failing this, I'll just have to save my generated audio as a WAV file, which I can easily do.
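For reference, the WAV fallback really is easy: a basic WAV file is just a 44-byte RIFF header followed by the raw PCM. A sketch for 16-bit mono PCM using only java.io (the class name is my own):

```java
import java.io.DataOutputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStream;

// Sketch: write 16-bit mono PCM samples as a WAV file.
// WAV is little-endian, so multi-byte fields are written byte by byte.
public class WavWriter {
    public static void write(String path, short[] pcm, int sampleRateHz) throws IOException {
        int dataBytes = pcm.length * 2;
        try (DataOutputStream out = new DataOutputStream(new FileOutputStream(path))) {
            out.writeBytes("RIFF");
            writeIntLE(out, 36 + dataBytes);   // total chunk size minus 8
            out.writeBytes("WAVE");
            out.writeBytes("fmt ");
            writeIntLE(out, 16);               // fmt chunk size
            writeShortLE(out, 1);              // audio format: PCM
            writeShortLE(out, 1);              // channels: mono
            writeIntLE(out, sampleRateHz);
            writeIntLE(out, sampleRateHz * 2); // byte rate
            writeShortLE(out, 2);              // block align
            writeShortLE(out, 16);             // bits per sample
            out.writeBytes("data");
            writeIntLE(out, dataBytes);
            for (short s : pcm) writeShortLE(out, s);
        }
    }

    private static void writeIntLE(OutputStream o, int v) throws IOException {
        o.write(v); o.write(v >> 8); o.write(v >> 16); o.write(v >> 24);
    }

    private static void writeShortLE(OutputStream o, int v) throws IOException {
        o.write(v); o.write(v >> 8);
    }
}
```

The same PCM buffer you feed to AudioTrack can be passed straight to this writer, so WAV output costs almost nothing while you look for an MP3 encoder.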
Pure Java
Look into Tritonus's clean room implementation of javasound which offers an MP3 encoder plugin here: http://www.tritonus.org/plugins.html
Secondly, I would suggest looking into JavaZoom's libraries JLayer or JLayerME: http://www.javazoom.net/javalayer/javalayer.html (these may be decode-only, I'm not sure)
If those don't suit your needs, you can look at this article from 2000 about adding MP3 capabilities to J2SE (with source): http://www.javaworld.com/javaworld/jw-11-2000/jw-1103-mp3.html
Native route
If you want "native" performance, I would look at an FFmpeg or LAME port for Android.
LAME: http://lame.sourceforge.net/
As far as I know, you can't do this using only the tools in the SDK. According to the official developer guide there isn't an MP3 encoder in the platform (Android Supported Media Formats), so you have to port an encoder yourself using the NDK, then write some wrapper code to receive the audio samples through JNI.
I'm currently porting some audio decoders from the Rockbox project for my own music player. Rockbox can record audio to MP3, so maybe you should look into its source and find the encoder library. Most of the decoders have ARM optimizations, which speed things up noticeably, so I guess some of the encoders have this as well.
An MP3 encoder is not available in Android. You have to compile libav with the LAME MP3 library; you can find code at
http://libavandroid.wordpress.com