I want to play static HLS content (not live video) in my Android app. What I currently do is download all the segments from the .m3u8 file and merge them into one file. When I play this file, I can see the video being played, but it is not seekable. As per this link, .ts files are not seekable on Android.
I cannot risk running ffmpeg on the phone to convert the file to MP4 format. I have studied the MP4 format and its atom structure. What I want to know is whether there is an easy way to create an MP4 container (atom hierarchy) that would simply refer to the .ts segment (the merged segment that was created from the sub-segments) in its data atom (mdat).
I would really appreciate any help/suggestions.
Not possible without making a copy. TS uses 188-byte packets with headers, and these headers create breaks in the middle of frames. In MP4, frames must be contiguous.
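To make the mismatch concrete, here is a minimal sketch (illustrative only; "merged.ts" is a hypothetical path) that walks a merged file in 188-byte TS packets. Each packet starts with a sync byte and a 4-byte header, so one frame's payload is sliced across many packets instead of lying contiguously the way an MP4 mdat atom requires:

FileInputStream in = new FileInputStream("merged.ts"); // hypothetical path
byte[] packet = new byte[188];
while (in.read(packet) == 188) {
    if ((packet[0] & 0xFF) != 0x47) break; // every TS packet starts with the 0x47 sync byte
    int pid = ((packet[1] & 0x1F) << 8) | (packet[2] & 0xFF); // which stream this slice belongs to
    boolean payloadStart = (packet[1] & 0x40) != 0; // set only on the first slice of a PES packet
    // At most 184 payload bytes follow the header; the rest of the frame
    // lives in later packets with the same PID.
}
in.close();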
Android provides APIs such as MediaCodec and MediaExtractor that give access to low-level media encoding/decoding. They are fast and efficient because they use hardware acceleration.
Here's how I believe one is supposed to do it on Android, unless you are OK with using ffmpeg, which of course is a resource-intensive operation.
1) Use MediaExtractor to extract data from the file.
2) Pass the extracted data to MediaCodec.
3) Use MediaCodec to render output to a Surface (for video) or an AudioTrack (for audio).
4) This is the most difficult step: synchronizing audio and video. I haven't implemented this yet, but it requires keeping track of the time offset between the two streams. Audio is played normally, and you may have to drop some video frames to keep the video in sync with the audio playback; a rough sketch follows.
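Here's a rough sketch of the idea, not tested code: treat the audio clock as master (derived from AudioTrack.getPlaybackHeadPosition(), so the video render loop is assumed to have access to the AudioTrack and its sample rate), and pace or drop video frames against it. The 10 ms / 50 ms thresholds are arbitrary:

long audioUs = audioTrack.getPlaybackHeadPosition() * 1000000L / sampleRate; // audio master clock in µs
long videoUs = info.presentationTimeUs;
if (videoUs > audioUs + 10000) { // video ahead by >10 ms: wait before rendering
    try { Thread.sleep((videoUs - audioUs) / 1000); } catch (InterruptedException ignored) {}
    codec.releaseOutputBuffer(outputBufIndex, true);
} else if (videoUs < audioUs - 50000) { // video late by >50 ms: drop this frame
    codec.releaseOutputBuffer(outputBufIndex, false);
} else {
    codec.releaseOutputBuffer(outputBufIndex, true); // close enough: render now
}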
Here's code for decoding video and audio and playing them using a Surface and an AudioTrack respectively.
In the video case, there's a sleep to slow down the frame rendering.
public void decodeVideo(Surface surface) throws IOException {
MediaExtractor extractor = new MediaExtractor();
MediaCodec codec;
ByteBuffer[] codecInputBuffers;
ByteBuffer[] codecOutputBuffers;
extractor.setDataSource(file);
Log.d(TAG, "No of tracks = " + extractor.getTrackCount());
MediaFormat format = extractor.getTrackFormat(0); // assumes track 0 is the video track
String mime = format.getString(MediaFormat.KEY_MIME);
Log.d(TAG, "mime = " + mime);
Log.d(TAG, "format = " + format);
codec = MediaCodec.createDecoderByType(mime);
codec.configure(format, surface, null, 0);
codec.start();
codecInputBuffers = codec.getInputBuffers();
codecOutputBuffers = codec.getOutputBuffers();
extractor.selectTrack(0);
final long timeout_in_Us = 5000;
MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
boolean sawInputEOS = false;
boolean sawOutputEOS = false;
int noOutputCounter = 0;
long startMs = System.currentTimeMillis();
while(!sawOutputEOS && noOutputCounter < 50) {
noOutputCounter++;
if(!sawInputEOS) {
int inputBufIndex = codec.dequeueInputBuffer(timeout_in_Us);
if(inputBufIndex >= 0) {
ByteBuffer dstBuf = codecInputBuffers[inputBufIndex];
int sampleSize = extractor.readSampleData(dstBuf, 0);
long presentationTimeUs = 0;
if(sampleSize < 0) {
Log.d(TAG, "saw input EOS.");
sawInputEOS = true;
sampleSize = 0;
} else {
presentationTimeUs = extractor.getSampleTime();
}
codec.queueInputBuffer(inputBufIndex, 0, sampleSize, presentationTimeUs, sawInputEOS ? MediaCodec.BUFFER_FLAG_END_OF_STREAM : 0);
if(!sawInputEOS) {
extractor.advance();
}
}
}
int res = codec.dequeueOutputBuffer(info, timeout_in_Us);
if(res >= 0) {
if(info.size > 0) {
noOutputCounter = 0;
}
int outputBufIndex = res;
while(info.presentationTimeUs/1000 > System.currentTimeMillis() - startMs) {
try {
Thread.sleep(5);
} catch (Exception e) {
break;
}
}
codec.releaseOutputBuffer(outputBufIndex, true);
if((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) == MediaCodec.BUFFER_FLAG_END_OF_STREAM) {
Log.d(TAG, "saw output EOS.");
sawOutputEOS = true;
}
} else if(res == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
codecOutputBuffers = codec.getOutputBuffers();
Log.d(TAG, "output buffers have changed.");
} else if(res == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
MediaFormat format1 = codec.getOutputFormat();
Log.d(TAG, "output format has changed to " + format1);
} else if(res == MediaCodec.INFO_TRY_AGAIN_LATER) {
Log.d(TAG, "Codec try again returned" + res);
}
}
codec.stop();
codec.release();
}
private int audioSessionId = -1;
private AudioTrack createAudioTrack(MediaFormat format) {
int channelConfiguration = format.getInteger(MediaFormat.KEY_CHANNEL_COUNT) == 1 ? AudioFormat.CHANNEL_OUT_MONO : AudioFormat.CHANNEL_OUT_STEREO;
int bufferSize = AudioTrack.getMinBufferSize(format.getInteger(MediaFormat.KEY_SAMPLE_RATE), channelConfiguration, AudioFormat.ENCODING_PCM_16BIT) * 8;
AudioTrack audioTrack;
if(audioSessionId == -1) {
audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC, format.getInteger(MediaFormat.KEY_SAMPLE_RATE), channelConfiguration,
AudioFormat.ENCODING_PCM_16BIT, bufferSize, AudioTrack.MODE_STREAM);
} else {
audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC, format.getInteger(MediaFormat.KEY_SAMPLE_RATE), channelConfiguration,
AudioFormat.ENCODING_PCM_16BIT, bufferSize, AudioTrack.MODE_STREAM, audioSessionId);
}
audioTrack.play();
audioSessionId = audioTrack.getAudioSessionId();
return audioTrack;
}
public void decodeAudio() throws IOException {
MediaExtractor extractor = new MediaExtractor();
MediaCodec codec;
ByteBuffer[] codecInputBuffers;
ByteBuffer[] codecOutputBuffers;
extractor.setDataSource(file);
Log.d(TAG, "No of tracks = " + extractor.getTrackCount());
MediaFormat format = extractor.getTrackFormat(1); // assumes track 1 is the audio track
String mime = format.getString(MediaFormat.KEY_MIME);
Log.d(TAG, "mime = " + mime);
Log.d(TAG, "format = " + format);
codec = MediaCodec.createDecoderByType(mime);
codec.configure(format, null, null, 0);
codec.start();
codecInputBuffers = codec.getInputBuffers();
codecOutputBuffers = codec.getOutputBuffers();
extractor.selectTrack(1);
AudioTrack audioTrack = createAudioTrack(format);
final long timeout_in_Us = 5000;
MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
boolean sawInputEOS = false;
boolean sawOutputEOS = false;
int noOutputCounter = 0;
while(!sawOutputEOS && noOutputCounter < 50) {
noOutputCounter++;
if(!sawInputEOS) {
int inputBufIndex = codec.dequeueInputBuffer(timeout_in_Us);
if(inputBufIndex >= 0) {
ByteBuffer dstBuf = codecInputBuffers[inputBufIndex];
int sampleSize = extractor.readSampleData(dstBuf, 0);
long presentationTimeUs = 0;
if(sampleSize < 0) {
Log.d(TAG, "saw input EOS.");
sawInputEOS = true;
sampleSize = 0;
} else {
presentationTimeUs = extractor.getSampleTime();
}
codec.queueInputBuffer(inputBufIndex, 0, sampleSize, presentationTimeUs, sawInputEOS ? MediaCodec.BUFFER_FLAG_END_OF_STREAM : 0);
if(!sawInputEOS) {
extractor.advance();
}
}
}
int res = codec.dequeueOutputBuffer(info, timeout_in_Us);
if(res >= 0) {
if(info.size > 0) {
noOutputCounter = 0;
}
int outputBufIndex = res;
//Possibly store the decoded buffer
ByteBuffer buf = codecOutputBuffers[outputBufIndex];
final byte[] chunk = new byte[info.size];
buf.get(chunk);
buf.clear();
if(chunk.length > 0) {
audioTrack.write(chunk, 0 ,chunk.length);
}
codec.releaseOutputBuffer(outputBufIndex, false);
if((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) == MediaCodec.BUFFER_FLAG_END_OF_STREAM) {
Log.d(TAG, "saw output EOS.");
sawOutputEOS = true;
}
} else if(res == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
codecOutputBuffers = codec.getOutputBuffers();
Log.d(TAG, "output buffers have changed.");
} else if(res == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
MediaFormat format1 = codec.getOutputFormat();
Log.d(TAG, "output format has changed to " + format1);
audioTrack.stop();
audioTrack = createAudioTrack(codec.getOutputFormat());
} else if(res == MediaCodec.INFO_TRY_AGAIN_LATER) {
Log.d(TAG, "Codec try again returned" + res);
}
}
codec.stop();
codec.release();
}
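Hypothetical usage (surfaceView is assumed to be a SurfaceView whose surface is already valid): each decode loop blocks until EOS, so run the two on their own threads:

new Thread(new Runnable() {
    @Override
    public void run() {
        try {
            decodeVideo(surfaceView.getHolder().getSurface());
        } catch (IOException e) {
            Log.e(TAG, "video decode failed", e);
        }
    }
}).start();
new Thread(new Runnable() {
    @Override
    public void run() {
        try {
            decodeAudio();
        } catch (IOException e) {
            Log.e(TAG, "audio decode failed", e);
        }
    }
}).start();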
Related
I need to convert a PCM file to an AAC or MP4 file. Until now I did it with MediaCodec and MediaMuxer, but MediaMuxer is only supported from Android 4.3. Is there a method to do the conversion without the use of MediaMuxer?
My code is this:
MediaMuxer mux = null;
try {
File inputFile = new File(filePath + ".pcm");
FileInputStream fis = new FileInputStream(inputFile);
mux = new MediaMuxer(filePath + ".mp4", MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
MediaFormat outputFormat = MediaFormat.createAudioFormat(COMPRESSED_AUDIO_FILE_MIME_TYPE,
SAMPLING_RATE, 1);
outputFormat.setInteger(MediaFormat.KEY_AAC_PROFILE, MediaCodecInfo.CodecProfileLevel.AACObjectLC);
outputFormat.setInteger(MediaFormat.KEY_BIT_RATE, COMPRESSED_AUDIO_FILE_BIT_RATE);
MediaCodec codec = MediaCodec.createEncoderByType(COMPRESSED_AUDIO_FILE_MIME_TYPE);
codec.configure(outputFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
codec.start();
ByteBuffer[] codecInputBuffers = codec.getInputBuffers();
ByteBuffer[] codecOutputBuffers = codec.getOutputBuffers();
MediaCodec.BufferInfo outBuffInfo = new MediaCodec.BufferInfo();
byte[] tempBuffer = new byte[BUFFER_SIZE];
boolean hasMoreData = true;
double presentationTimeUs = 0;
int audioTrackIdx = 0;
int totalBytesRead = 0;
int percentComplete;
do {
int inputBufIndex = 0;
while (inputBufIndex != -1 && hasMoreData) {
inputBufIndex = codec.dequeueInputBuffer(CODEC_TIMEOUT_IN_MS);
if (inputBufIndex >= 0) {
ByteBuffer dstBuf = codecInputBuffers[inputBufIndex];
dstBuf.clear();
int bytesRead = fis.read(tempBuffer, 0, dstBuf.limit());
if (bytesRead == -1) { // -1 implies EOS
hasMoreData = false;
codec.queueInputBuffer(inputBufIndex, 0, 0, (long) presentationTimeUs, MediaCodec.BUFFER_FLAG_END_OF_STREAM);
} else {
totalBytesRead += bytesRead;
dstBuf.put(tempBuffer, 0, bytesRead);
codec.queueInputBuffer(inputBufIndex, 0, bytesRead, (long) presentationTimeUs, 0);
presentationTimeUs = 1000000l * (totalBytesRead / 2) / SAMPLING_RATE;
}
}
}
// Drain audio
int outputBufIndex = 0;
while (outputBufIndex != MediaCodec.INFO_TRY_AGAIN_LATER) {
outputBufIndex = codec.dequeueOutputBuffer(outBuffInfo, CODEC_TIMEOUT_IN_MS);
if (outputBufIndex >= 0) {
ByteBuffer encodedData = codecOutputBuffers[outputBufIndex];
encodedData.position(outBuffInfo.offset);
encodedData.limit(outBuffInfo.offset + outBuffInfo.size);
if ((outBuffInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0 && outBuffInfo.size != 0) {
codec.releaseOutputBuffer(outputBufIndex, false);
} else {
mux.writeSampleData(audioTrackIdx, codecOutputBuffers[outputBufIndex], outBuffInfo);
codec.releaseOutputBuffer(outputBufIndex, false);
}
} else if (outputBufIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
outputFormat = codec.getOutputFormat();
Log.v("AUDIO", "Output format changed - " + outputFormat);
audioTrackIdx = mux.addTrack(outputFormat);
mux.start();
} else if (outputBufIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
Log.e("AUDIO", "Output buffers changed during encode!");
} else if (outputBufIndex != MediaCodec.INFO_TRY_AGAIN_LATER){
Log.e("AUDIO", "Unknown return code from dequeueOutputBuffer - " + outputBufIndex);
}
}
percentComplete = (int) Math.round(((float) totalBytesRead / (float) inputFile.length()) * 100.0);
Log.v("AUDIO", "Conversion % - " + percentComplete);
} while (outBuffInfo.flags != MediaCodec.BUFFER_FLAG_END_OF_STREAM);
fis.close();
mux.stop();
mux.release();
} catch (IOException e) { // closes the try block opened above
Log.e("AUDIO", "Conversion failed", e);
}
As you already pointed out, there's no public system API compatible with older versions of Android for this kind of job.
Anyway, you can pursue a custom solution using a native encoder (like FFmpeg). I can suggest you the following: timsu/android-aac-enc
Android AAC Encoder project
Extraction of Android Stagefright VO AAC encoder with a nice Java API.
This project offers an easy Java API for the underlying JNI encoder, and it should be ready to use since the native library is already compiled for generic ARM architectures (please note that I haven't tested it). The whole library is just 500kb so it won't fatten your APK that much.
For a quick test, import in your project the following parts:
Java bindings for the native AAC encoder
Pre-compiled AAC encoder .so library
Example of usage (speech encoding)
You should be able to easily adapt the example to your code.
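As I recall from the project's own speech-encoding example, the binding's usage is roughly like the following; treat the class and method names here as assumptions and check the repo before relying on them:

// Rough sketch; AACEncoder, init(), encode() and uninit() are assumptions
// taken from memory of timsu/android-aac-enc's example code.
AACEncoder encoder = new AACEncoder();
encoder.init(64000, 1, 16000, 16, "/sdcard/out.aac"); // bitrate, channels, sample rate, bits/sample, output path
encoder.encode(pcmBytes); // raw PCM byte[] in; AAC is written to the output file
encoder.uninit();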
The question is simple, but I don't have any clue how to solve it:
I wrote a single line of code to create an AAC encoder on my Nexus 4 (Android 4.4.2):
MediaCodec codec = MediaCodec.createEncoderByType("audio/mp4a-latm");
The return value saved in "codec" is not null, but I get a red error message in Logcat:
03-20 15:25:08.985: E/OMXMaster(24517): A component of name 'OMX.qcom.audio.decoder.aac' already exists, ignoring this one.
I have also tried another line:
MediaCodec codec = MediaCodec.createByCodecName("OMX.google.aac.encoder");
And get the same error result.
Did I miss any initialization steps before using MediaCodec? I did not find any information about this in the official documentation.
Did anyone run into this problem?
Actually, I am trying to encode PCM to an AAC file, and I have read this post #hubeir. It seems he made it work. I did the same thing: (1) set up the MediaCodec and feed in PCM data to get encoded frames. To do that, I read the code from the CTS. Every encoded frame is about 371-379 bytes long. (2) Add an ADTS header to each frame, then save to file. I have checked the header bit by bit and it is correct, but the file is still not playable. So I think maybe the error log is the problem.
The following is my whole code, for reference:
MediaCodec codec = MediaCodec.createByCodecName("OMX.google.aac.encoder");
MediaFormat format = new MediaFormat();
format.setString(MediaFormat.KEY_MIME, "audio/mp4a-latm");
format.setInteger(MediaFormat.KEY_AAC_PROFILE,
MediaCodecInfo.CodecProfileLevel.AACObjectELD);
format.setInteger(MediaFormat.KEY_SAMPLE_RATE, nSamplerate);
format.setInteger(MediaFormat.KEY_BIT_RATE, 128000);
format.setInteger(MediaFormat.KEY_CHANNEL_COUNT, nChannels);
codec.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
codec.start();
ByteBuffer[] inputBuffers = codec.getInputBuffers();
ByteBuffer[] outputBuffers = codec.getOutputBuffers();
boolean bEndInput = false;
boolean bEndOutput = false;
while(true)
{
if (!bEndInput)
{
int inputBufferIndex = codec.dequeueInputBuffer(0);
if (inputBufferIndex >= 0)
{
int nLen = app.readPCM(nHandle,inputBuffers[inputBufferIndex]);//This line read PCM, return 0 if end of data.
int nBufLen = inputBuffers[inputBufferIndex].capacity();
if (nLen == nBufLen)
codec.queueInputBuffer(inputBufferIndex, 0, nLen, 0, MediaCodec.BUFFER_FLAG_SYNC_FRAME);
else if (nLen < nBufLen)
{
codec.queueInputBuffer(inputBufferIndex, 0, nLen, 0, MediaCodec.BUFFER_FLAG_END_OF_STREAM);
bEndInput = true;
break;
}
}
}
MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
if (!bEndOutput)
{
int outputBufferIndex = codec.dequeueOutputBuffer(info, 0);
if (outputBufferIndex >= 0)
{
int outBitsSize = info.size;
Log.d("test", "Offset:"+info.offset);
Log.d("test", "Size:"+info.size);
Log.d("test", "Time:"+info.presentationTimeUs);
Log.d("test", "Flags:"+info.flags);
if (outBitsSize <= 10)
{
codec.releaseOutputBuffer(outputBufferIndex, false /* render */);
continue;
}
int outPacketSize = outBitsSize + 7; // 7 is ADTS size
ByteBuffer outBuf = outputBuffers[outputBufferIndex];
outBuf.position(info.offset);
outBuf.limit(info.offset + outBitsSize);
try {
byte[] data = new byte[outPacketSize]; //space for ADTS header included
addADTStoPacket(data, outPacketSize);
outBuf.get(data, 7, outBitsSize);
outBuf.position(info.offset);
outputStream.write(data, 0, outPacketSize); //open FileOutputStream beforehand
} catch (IOException e) {
Log.e("test", "failed writing bitstream data to file");
e.printStackTrace();
}
outBuf.clear();
codec.releaseOutputBuffer(outputBufferIndex, false /* render */);
Log.d("test", " dequeued " + outBitsSize + " bytes of output data.");
Log.d("test", " wrote " + outPacketSize + " bytes into output file.");
if (info.flags == MediaCodec.BUFFER_FLAG_END_OF_STREAM)
{
bEndOutput = true;
//break;
}
}
else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED)
{
outputBuffers = codec.getOutputBuffers();
}
else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED)
{
}
}
if (bEndInput && bEndOutput)
break;
}
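(For reference, the addADTStoPacket helper called above isn't shown in the post. A commonly used implementation looks like the following; the profile, sampling-frequency index, and channel configuration below are assumptions for 44.1 kHz stereo AAC-LC and must be adjusted to match the encoder's format:)

private void addADTStoPacket(byte[] packet, int packetLen) {
    int profile = 2; // AAC LC; must match KEY_AAC_PROFILE
    int freqIdx = 4; // 44100 Hz; use the index matching nSamplerate
    int chanCfg = 2; // 2 channels; use nChannels
    // Fill in the 7-byte ADTS header (no CRC); packetLen includes these 7 bytes.
    packet[0] = (byte) 0xFF;
    packet[1] = (byte) 0xF9;
    packet[2] = (byte) (((profile - 1) << 6) + (freqIdx << 2) + (chanCfg >> 2));
    packet[3] = (byte) (((chanCfg & 3) << 6) + (packetLen >> 11));
    packet[4] = (byte) ((packetLen & 0x7FF) >> 3);
    packet[5] = (byte) (((packetLen & 7) << 5) + 0x1F);
    packet[6] = (byte) 0xFC;
}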
I figured it out.
(1) The erroneous setting in the media format:
format.setInteger(MediaFormat.KEY_AAC_PROFILE,
MediaCodecInfo.CodecProfileLevel.AACObjectELD);
should be
format.setInteger(MediaFormat.KEY_AAC_PROFILE,
MediaCodecInfo.CodecProfileLevel.AACObjectLC);
(2) The encoded frame should be written to the file with an ADTS header only when (info.flags == 0).
(3) The output file name suffix should be "aac"; "mp4" or "m4a" may not work for some applications.
MediaCodec codec = MediaCodec.createByCodecName("OMX.google.aac.encoder");
MediaFormat format = new MediaFormat();
format.setString(MediaFormat.KEY_MIME, "audio/mp4a-latm");
format.setInteger(MediaFormat.KEY_AAC_PROFILE,
MediaCodecInfo.CodecProfileLevel.AACObjectLC); //fixed version
format.setInteger(MediaFormat.KEY_SAMPLE_RATE, nSamplerate);
format.setInteger(MediaFormat.KEY_BIT_RATE, 128000);
format.setInteger(MediaFormat.KEY_CHANNEL_COUNT, nChannels);
codec.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
codec.start();
ByteBuffer[] inputBuffers = codec.getInputBuffers();
ByteBuffer[] outputBuffers = codec.getOutputBuffers();
boolean bEndInput = false;
boolean bEndOutput = false;
while(true)
{
if (!bEndInput)
{
int inputBufferIndex = codec.dequeueInputBuffer(0);
if (inputBufferIndex >= 0)
{
int nLen = app.readPCM(nHandle,inputBuffers[inputBufferIndex]);//This line read PCM, return 0 if end of data.
int nBufLen = inputBuffers[inputBufferIndex].capacity();
if (nLen == nBufLen)
codec.queueInputBuffer(inputBufferIndex, 0, nLen, 0, MediaCodec.BUFFER_FLAG_SYNC_FRAME);
else if (nLen < nBufLen)
{
codec.queueInputBuffer(inputBufferIndex, 0, nLen, 0, MediaCodec.BUFFER_FLAG_END_OF_STREAM);
bEndInput = true;
break;
}
}
}
MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
if (!bEndOutput)
{
int outputBufferIndex = codec.dequeueOutputBuffer(info, 0);
if (outputBufferIndex >= 0)
{
int outBitsSize = info.size;
Log.d("test", "Offset:"+info.offset);
Log.d("test", "Size:"+info.size);
Log.d("test", "Time:"+info.presentationTimeUs);
Log.d("test", "Flags:"+info.flags);
if (info.flags != 0) //fixed version: write to file only when flags == 0 (skips codec-config and EOS-flagged buffers)
{
codec.releaseOutputBuffer(outputBufferIndex, false /* render */);
continue;
}
int outPacketSize = outBitsSize + 7; // 7 is ADTS size
ByteBuffer outBuf = outputBuffers[outputBufferIndex];
outBuf.position(info.offset);
outBuf.limit(info.offset + outBitsSize);
try {
byte[] data = new byte[outPacketSize]; //space for ADTS header included
addADTStoPacket(data, outPacketSize);
outBuf.get(data, 7, outBitsSize);
outBuf.position(info.offset);
outputStream.write(data, 0, outPacketSize); //open FileOutputStream beforehand
} catch (IOException e) {
Log.e("test", "failed writing bitstream data to file");
e.printStackTrace();
}
outBuf.clear();
codec.releaseOutputBuffer(outputBufferIndex, false /* render */);
Log.d("test", " dequeued " + outBitsSize + " bytes of output data.");
Log.d("test", " wrote " + outPacketSize + " bytes into output file.");
if (info.flags == MediaCodec.BUFFER_FLAG_END_OF_STREAM)
{
bEndOutput = true;
//break;
}
}
else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED)
{
outputBuffers = codec.getOutputBuffers();
}
else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED)
{
}
}
if (bEndInput && bEndOutput)
break;
}
Can you please check media_codecs_xxxx.xml, found under the device folder?
(Note: xxxx normally depends on your HW platform.)
Here you may have both OMX.qcom.audio.decoder.aac and
OMX.google.aac.encoder. Please use one of them and comment out the other.
Please refer to https://source.android.com/devices/media.html (Exposing Codecs to the Framework);
this will help you.
Interesting. Some time ago I met a similar problem on some device: an AAC decoder was created instead of an encoder when MediaCodec.createEncoderByType() was used. To work around this I used:
String codecName = selectEncoder(mime);
mediaCodec = MediaCodec.createByCodecName(codecName);
and
private String selectEncoder(String mime) {
for (int index = 0; index < MediaCodecList.getCodecCount(); index++) {
MediaCodecInfo codecInfo = MediaCodecList.getCodecInfoAt(index);
if (!codecInfo.isEncoder()) {
continue;
}
for (String type : codecInfo.getSupportedTypes()) {
if (type.equalsIgnoreCase(mime)) {
return codecInfo.getName();
}
}
}
return null;
}
something like that
The code given below works fine on the emulator, but not on the device. I found the following lines that looked suspicious to me:
V/MediaExtractor(5030): Autodetected media content as 'audio/mpeg' with confidence 0.20
V/ChromiumHTTPDataSource(5030): mContentSize is undefined or network might be disconnected
V/ChromiumHTTPDataSource(5030): mContentSize is undefined or network might be disconnected
D/com.example.mediacodectest(5030): MIME TYPE: audio/mpeg
I am looking for hints/suggestions. Thanks in advance...
private class PlayerThread extends Thread {
@Override
public void run() {
MediaExtractor extractor;
MediaCodec codec;
ByteBuffer[] codecInputBuffers;
ByteBuffer[] codecOutputBuffers;
AudioTrack mAudioTrack;
mAudioTrack = new AudioTrack(
AudioManager.STREAM_MUSIC,
44100,
AudioFormat.CHANNEL_OUT_STEREO,
AudioFormat.ENCODING_PCM_16BIT,
8192 * 2,
AudioTrack.MODE_STREAM);
extractor = new MediaExtractor();
try
{
extractor.setDataSource("http://anmp3streamingsource.com/stream");
MediaFormat format = extractor.getTrackFormat(0);
String mime = format.getString(MediaFormat.KEY_MIME);
Log.d(TAG, String.format("MIME TYPE: %s", mime));
codec = MediaCodec.createDecoderByType(mime);
codec.configure(
format,
null /* surface */,
null /* crypto */,
0 /* flags */ );
codec.start();
codecInputBuffers = codec.getInputBuffers();
codecOutputBuffers = codec.getOutputBuffers();
extractor.selectTrack(0); // <= You must select a track. You will read samples from the media from this track!
boolean sawInputEOS = false;
boolean sawOutputEOS = false;
for (;;) {
int inputBufIndex = codec.dequeueInputBuffer(-1);
if (inputBufIndex >= 0) {
ByteBuffer dstBuf = codecInputBuffers[inputBufIndex];
int sampleSize = extractor.readSampleData(dstBuf, 0);
long presentationTimeUs = 0;
if (sampleSize < 0) {
sawInputEOS = true;
sampleSize = 0;
} else {
presentationTimeUs = extractor.getSampleTime();
}
codec.queueInputBuffer(inputBufIndex,
0, //offset
sampleSize,
presentationTimeUs,
sawInputEOS ? MediaCodec.BUFFER_FLAG_END_OF_STREAM : 0);
if (!sawInputEOS) {
extractor.advance();
}
MediaCodec.BufferInfo info = new BufferInfo();
final int res = codec.dequeueOutputBuffer(info, -1);
if (res >= 0) {
int outputBufIndex = res;
ByteBuffer buf = codecOutputBuffers[outputBufIndex];
final byte[] chunk = new byte[info.size];
buf.get(chunk); // Read the buffer all at once
buf.clear(); // ** MUST DO!!! OTHERWISE THE NEXT TIME YOU GET THIS SAME BUFFER BAD THINGS WILL HAPPEN
mAudioTrack.play();
if (chunk.length > 0) {
mAudioTrack.write(chunk, 0, chunk.length);
}
codec.releaseOutputBuffer(outputBufIndex, false /* render */);
if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
sawOutputEOS = true;
}
}
else if (res == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED)
{
codecOutputBuffers = codec.getOutputBuffers();
}
else if (res == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED)
{
final MediaFormat oformat = codec.getOutputFormat();
Log.d(TAG, "Output format has changed to " + oformat);
mAudioTrack.setPlaybackRate(oformat.getInteger(MediaFormat.KEY_SAMPLE_RATE));
}
}
}
}
catch (IOException e)
{
Log.e(TAG, e.getMessage());
}
}
}
I haven't worked with audio, but I think I may see the problem. You're hanging in dequeueOutputBuffer() because the codec is waiting for more input.
Some of the video codecs want ~4 buffers of input before they'll even finish initialization (for example). I expect some audio codecs may behave the same way. Codec implementations vary from device to device, so it's not surprising that what runs on the emulator behaves much differently.
Change the timeouts from -1 (wait forever) to something modest (say, 1000 microseconds).
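For example (1000 µs is just a modest value to try; any small positive timeout avoids the indefinite block):

int inputBufIndex = codec.dequeueInputBuffer(1000); // was -1 (wait forever); now returns after at most ~1 ms
// ... queue input as before ...
int res = codec.dequeueOutputBuffer(info, 1000); // was -1
if (res == MediaCodec.INFO_TRY_AGAIN_LATER) {
    // No output yet; loop around and feed more input instead of blocking forever.
}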
In my app I'm trying to upload some videos that the user picked from the gallery.
The problem is that Android video files are usually too big to upload, so we want to compress them first by lowering the bitrate/resolution.
I've just heard about the new MediaCodec API introduced with API 16 (I previously tried to do this with ffmpeg).
What I'm doing right now is the following:
First decode the input video using a video decoder, and configure it with the format that was read from the input file.
Next, I create a standard video encoder with some predefined parameters, and use it for encoding the decoder output buffer. Then I save the encoder output buffer to a file.
Everything looks good - the same number of packets are written and read from each input and output buffer, but the final file doesn't look like a video file and can't be opened by any video player.
The decoding looks OK, because I tested it by displaying the output on a Surface: I configure the decoder to work with a Surface, pass the render flag when calling releaseOutputBuffer, and I'm able to see the video on the screen.
Here is the code I'm using:
//init decoder (extractor, format and mime are assumed to be set up beforehand)
MediaCodec decoder = MediaCodec.createDecoderByType(mime);
decoder.configure(format, null , null , 0);
decoder.start();
ByteBuffer[] codecInputBuffers = decoder.getInputBuffers();
ByteBuffer[] codecOutputBuffers = decoder.getOutputBuffers();
//init encoder
MediaCodec encoder = MediaCodec.createEncoderByType(mime);
int width = format.getInteger(MediaFormat.KEY_WIDTH);
int height = format.getInteger(MediaFormat.KEY_HEIGHT);
MediaFormat mediaFormat = MediaFormat.createVideoFormat(mime, width, height);
mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 400000);
mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 25);
mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar);
mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);
encoder.configure(mediaFormat, null , null , MediaCodec.CONFIGURE_FLAG_ENCODE);
encoder.start();
ByteBuffer[] encoderInputBuffers = encoder.getInputBuffers();
ByteBuffer[] encoderOutputBuffers = encoder.getOutputBuffers();
extractor.selectTrack(0);
boolean sawInputEOS = false;
boolean sawOutputEOS = false;
boolean sawOutputEOS2 = false;
MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
BufferInfo encoderInfo = new MediaCodec.BufferInfo();
while (!sawInputEOS || !sawOutputEOS || !sawOutputEOS2) {
if (!sawInputEOS) {
sawInputEOS = decodeInput(extractor, decoder, codecInputBuffers);
}
if (!sawOutputEOS) {
int outputBufIndex = decoder.dequeueOutputBuffer(info, 0);
if (outputBufIndex >= 0) {
sawOutputEOS = decodeEncode(extractor, decoder, encoder, codecOutputBuffers, encoderInputBuffers, info, outputBufIndex);
} else if (outputBufIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
Log.d(LOG_TAG, "decoding INFO_OUTPUT_BUFFERS_CHANGED");
codecOutputBuffers = decoder.getOutputBuffers();
} else if (outputBufIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
final MediaFormat oformat = decoder.getOutputFormat();
Log.d(LOG_TAG, "decoding Output format has changed to " + oformat);
} else if (outputBufIndex == MediaCodec.INFO_TRY_AGAIN_LATER) {
Log.d(LOG_TAG, "decoding dequeueOutputBuffer timed out!");
}
}
if (!sawOutputEOS2) {
int encodingOutputBufferIndex = encoder.dequeueOutputBuffer(encoderInfo, 0);
if (encodingOutputBufferIndex >= 0) {
sawOutputEOS2 = encodeOuput(outputStream, encoder, encoderOutputBuffers, encoderInfo, encodingOutputBufferIndex);
} else if (encodingOutputBufferIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
Log.d(LOG_TAG, "encoding INFO_OUTPUT_BUFFERS_CHANGED");
encoderOutputBuffers = encoder.getOutputBuffers();
} else if (encodingOutputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
final MediaFormat oformat = encoder.getOutputFormat();
Log.d(LOG_TAG, "encoding Output format has changed to " + oformat);
} else if (encodingOutputBufferIndex == MediaCodec.INFO_TRY_AGAIN_LATER) {
Log.d(LOG_TAG, "encoding dequeueOutputBuffer timed out!");
}
}
}
//clear some stuff here...
and those are the method I use for decode/ encode:
private boolean decodeInput(MediaExtractor extractor, MediaCodec decoder, ByteBuffer[] codecInputBuffers) {
boolean sawInputEOS = false;
int inputBufIndex = decoder.dequeueInputBuffer(0);
if (inputBufIndex >= 0) {
ByteBuffer dstBuf = codecInputBuffers[inputBufIndex];
input1count++;
int sampleSize = extractor.readSampleData(dstBuf, 0);
long presentationTimeUs = 0;
if (sampleSize < 0) {
sawInputEOS = true;
sampleSize = 0;
Log.d(LOG_TAG, "done decoding input: #" + input1count);
} else {
presentationTimeUs = extractor.getSampleTime();
}
decoder.queueInputBuffer(inputBufIndex, 0, sampleSize, presentationTimeUs, sawInputEOS ? MediaCodec.BUFFER_FLAG_END_OF_STREAM : 0);
if (!sawInputEOS) {
extractor.advance();
}
}
return sawInputEOS;
}
private boolean decodeOutputToFile(MediaExtractor extractor, MediaCodec decoder, ByteBuffer[] codecOutputBuffers,
MediaCodec.BufferInfo info, int outputBufIndex, OutputStream output) throws IOException {
boolean sawOutputEOS = false;
ByteBuffer buf = codecOutputBuffers[outputBufIndex];
if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
sawOutputEOS = true;
Log.d(LOG_TAG, "done decoding output: #" + output1count);
}
if (info.size > 0) {
output1count++;
byte[] outData = new byte[info.size];
buf.get(outData);
output.write(outData, 0, outData.length);
} else {
Log.d(LOG_TAG, "no data available " + info.size);
}
buf.clear();
decoder.releaseOutputBuffer(outputBufIndex, false);
return sawOutputEOS;
}
private boolean encodeInputFromFile(MediaCodec encoder, ByteBuffer[] encoderInputBuffers, MediaCodec.BufferInfo info, FileChannel channel) throws IOException {
boolean sawInputEOS = false;
int inputBufIndex = encoder.dequeueInputBuffer(0);
if (inputBufIndex >= 0) {
ByteBuffer dstBuf = encoderInputBuffers[inputBufIndex];
input1count++;
int sampleSize = channel.read(dstBuf);
if (sampleSize < 0) {
sawInputEOS = true;
sampleSize = 0;
Log.d(LOG_TAG, "done encoding input: #" + input1count);
}
encoder.queueInputBuffer(inputBufIndex, 0, sampleSize, channel.position(), sawInputEOS ? MediaCodec.BUFFER_FLAG_END_OF_STREAM : 0); // note: channel.position() is a byte offset, not a presentation time in µs
}
return sawInputEOS;
}
Any suggestions on what I'm doing wrong?
I didn't find many examples of encoding with MediaCodec, just a few code samples for decoding...
Thanks a lot for the help.
The output of MediaCodec is a raw elementary stream. You need to package it up into a video file format (possibly muxing the audio back in) before many players will recognize it. FWIW, I've found that the GStreamer-based Totem Movie Player for Linux will play "raw" video/avc files.
Update:
The way to convert H.264 to .mp4 on Android is with the MediaMuxer class, introduced in Android 4.3 (API 18). There are a couple of examples (EncodeAndMuxTest, CameraToMpegTest) that demonstrate its use.
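A minimal sketch of that pattern (API 18+), assuming an encoder drain loop like the one in the question; the INFO_OUTPUT_FORMAT_CHANGED branch must add the track and start the muxer before the first writeSampleData call:

MediaMuxer muxer = new MediaMuxer(outputPath, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
int videoTrack = -1;
// Inside the encoder drain loop:
int status = encoder.dequeueOutputBuffer(info, 10000);
if (status == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
    videoTrack = muxer.addTrack(encoder.getOutputFormat()); // the format carries csd-0/csd-1
    muxer.start();
} else if (status >= 0) {
    if ((info.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) == 0 && info.size > 0) {
        muxer.writeSampleData(videoTrack, encoderOutputBuffers[status], info);
    }
    encoder.releaseOutputBuffer(status, false);
}
// Once EOS is reached:
muxer.stop();
muxer.release();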
You can use MediaRecorder to record a stream directly to AAC but there doesn't seem to be a way to encode an existing PCM/WAV file to AAC. The ability to encode to AAC exists natively in Android and I'd like to use that. Is there no way to do it with a pre-existing audio file?
Look at this beautiful (and perfectly working) example:
Mp4ParserSample
Look at the final part of the class (rows 335-442), the convert Runnable object just does the job! You have to shape that code to your needs, adjust the input and the output file paths and the conversion parameters (sampling rate, bit rate, etc).
public static final String AUDIO_RECORDING_FILE_NAME = "audio_Capturing-190814-034638.422.wav"; // Input PCM file
public static final String COMPRESSED_AUDIO_FILE_NAME = "convertedmp4.m4a"; // Output MP4/M4A file
public static final String COMPRESSED_AUDIO_FILE_MIME_TYPE = "audio/mp4a-latm";
public static final int COMPRESSED_AUDIO_FILE_BIT_RATE = 64000; // 64kbps
public static final int SAMPLING_RATE = 48000;
public static final int BUFFER_SIZE = 48000;
public static final int CODEC_TIMEOUT_IN_MS = 5000;
String LOGTAG = "CONVERT AUDIO";
Runnable convert = new Runnable() {
@TargetApi(Build.VERSION_CODES.JELLY_BEAN_MR2)
@Override
public void run() {
android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_BACKGROUND);
try {
String filePath = Environment.getExternalStorageDirectory().getPath() + "/" + AUDIO_RECORDING_FILE_NAME;
File inputFile = new File(filePath);
FileInputStream fis = new FileInputStream(inputFile);
File outputFile = new File(Environment.getExternalStorageDirectory().getAbsolutePath() + "/" + COMPRESSED_AUDIO_FILE_NAME);
if (outputFile.exists()) outputFile.delete();
MediaMuxer mux = new MediaMuxer(outputFile.getAbsolutePath(), MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
MediaFormat outputFormat = MediaFormat.createAudioFormat(COMPRESSED_AUDIO_FILE_MIME_TYPE,SAMPLING_RATE, 1);
outputFormat.setInteger(MediaFormat.KEY_AAC_PROFILE, MediaCodecInfo.CodecProfileLevel.AACObjectLC);
outputFormat.setInteger(MediaFormat.KEY_BIT_RATE, COMPRESSED_AUDIO_FILE_BIT_RATE);
outputFormat.setInteger(MediaFormat.KEY_MAX_INPUT_SIZE, 16384);
MediaCodec codec = MediaCodec.createEncoderByType(COMPRESSED_AUDIO_FILE_MIME_TYPE);
codec.configure(outputFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
codec.start();
ByteBuffer[] codecInputBuffers = codec.getInputBuffers(); // Note: Array of buffers
ByteBuffer[] codecOutputBuffers = codec.getOutputBuffers();
MediaCodec.BufferInfo outBuffInfo = new MediaCodec.BufferInfo();
byte[] tempBuffer = new byte[BUFFER_SIZE];
boolean hasMoreData = true;
double presentationTimeUs = 0;
int audioTrackIdx = 0;
int totalBytesRead = 0;
int percentComplete = 0;
do {
int inputBufIndex = 0;
while (inputBufIndex != -1 && hasMoreData) {
inputBufIndex = codec.dequeueInputBuffer(CODEC_TIMEOUT_IN_MS);
if (inputBufIndex >= 0) {
ByteBuffer dstBuf = codecInputBuffers[inputBufIndex];
dstBuf.clear();
int bytesRead = fis.read(tempBuffer, 0, dstBuf.limit());
Log.e("bytesRead","Readed "+bytesRead);
if (bytesRead == -1) { // -1 implies EOS
hasMoreData = false;
codec.queueInputBuffer(inputBufIndex, 0, 0, (long) presentationTimeUs, MediaCodec.BUFFER_FLAG_END_OF_STREAM);
} else {
totalBytesRead += bytesRead;
dstBuf.put(tempBuffer, 0, bytesRead);
codec.queueInputBuffer(inputBufIndex, 0, bytesRead, (long) presentationTimeUs, 0);
presentationTimeUs = 1000000l * (totalBytesRead / 2) / SAMPLING_RATE;
}
}
}
// Drain audio
int outputBufIndex = 0;
while (outputBufIndex != MediaCodec.INFO_TRY_AGAIN_LATER) {
outputBufIndex = codec.dequeueOutputBuffer(outBuffInfo, CODEC_TIMEOUT_IN_MS);
if (outputBufIndex >= 0) {
ByteBuffer encodedData = codecOutputBuffers[outputBufIndex];
encodedData.position(outBuffInfo.offset);
encodedData.limit(outBuffInfo.offset + outBuffInfo.size);
if ((outBuffInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0 && outBuffInfo.size != 0) {
codec.releaseOutputBuffer(outputBufIndex, false);
}else{
mux.writeSampleData(audioTrackIdx, codecOutputBuffers[outputBufIndex], outBuffInfo);
codec.releaseOutputBuffer(outputBufIndex, false);
}
} else if (outputBufIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
outputFormat = codec.getOutputFormat();
Log.v(LOGTAG, "Output format changed - " + outputFormat);
audioTrackIdx = mux.addTrack(outputFormat);
mux.start();
} else if (outputBufIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
Log.e(LOGTAG, "Output buffers changed during encode!");
} else if (outputBufIndex == MediaCodec.INFO_TRY_AGAIN_LATER) {
// NO OP
} else {
Log.e(LOGTAG, "Unknown return code from dequeueOutputBuffer - " + outputBufIndex);
}
}
percentComplete = (int) Math.round(((float) totalBytesRead / (float) inputFile.length()) * 100.0);
Log.v(LOGTAG, "Conversion % - " + percentComplete);
} while (outBuffInfo.flags != MediaCodec.BUFFER_FLAG_END_OF_STREAM);
fis.close();
mux.stop();
mux.release();
Log.v(LOGTAG, "Compression done ...");
} catch (FileNotFoundException e) {
Log.e(LOGTAG, "File not found!", e);
} catch (IOException e) {
Log.e(LOGTAG, "IO exception!", e);
}
//mStop = false;
// Notify UI thread...
}
};
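Since the conversion does blocking file and codec I/O, start it off the main thread, for example:

new Thread(convert, "pcm-to-aac").start();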
You can get your hands dirty with native code and use the IOMX C++ interface to the decoders in the framework, but this is build-sensitive and will not work on other phones and Android flavours.
Another option is to port an open-source AAC encoder like ffmpeg and write an app over it via JNI. It will at least work on phones with the same architecture (ARM9, Cortex-A8, ...).
Jelly Bean has MediaCodec just to fulfill your wishes, but the problem is that the install base of devices with JB will be lean for some more time.
http://developer.android.com/about/versions/android-4.1.html#Multimedia
I think you can use this library.
https://github.com/timsu/android-aac-enc
http://betaful.com/post/82668810035/encoding-aac-audio-in-android