Thank you for your attention!
I want to use the Android MediaCodec APIs to encode video frames acquired from the Camera, but unfortunately I have not managed to do that. I am still not familiar with the MediaCodec API.
The following is my code; I need your help to figure out what I should do.
1. The camera settings:
Parameters parameters = mCamera.getParameters();
parameters.setPreviewFormat(ImageFormat.NV21);
parameters.setPreviewSize(320, 240);
mCamera.setParameters(parameters);
2. Setting up the encoder:
private void initCodec() {
try {
fos = new FileOutputStream(mVideoFile, false);
} catch (FileNotFoundException e) {
e.printStackTrace();
}
mMediaCodec = MediaCodec.createEncoderByType("video/avc");
MediaFormat mediaFormat = MediaFormat.createVideoFormat("video/avc",
320,
240);
mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 125000);
mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 15);
mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT,
MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar);
mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);
mMediaCodec.configure(mediaFormat,
null,
null,
MediaCodec.CONFIGURE_FLAG_ENCODE);
mMediaCodec.start();
inputBuffers = mMediaCodec.getInputBuffers();
outputBuffers = mMediaCodec.getOutputBuffers();
}
private void encode(byte[] data) {
int inputBufferIndex = mMediaCodec.dequeueInputBuffer(0);
if (inputBufferIndex >= 0) {
ByteBuffer inputBuffer = inputBuffers[inputBufferIndex];
inputBuffer.clear();
inputBuffer.put(data);
mMediaCodec.queueInputBuffer(inputBufferIndex, 0, data.length, 0, 0);
} else {
return;
}
MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
int outputBufferIndex = mMediaCodec.dequeueOutputBuffer(bufferInfo, 0);
Log.i(TAG, "outputBufferIndex-->" + outputBufferIndex);
do {
if (outputBufferIndex >= 0) {
ByteBuffer outBuffer = outputBuffers[outputBufferIndex];
System.out.println("buffer info-->" + bufferInfo.offset + "--"
+ bufferInfo.size + "--" + bufferInfo.flags + "--"
+ bufferInfo.presentationTimeUs);
byte[] outData = new byte[bufferInfo.size];
outBuffer.get(outData);
try {
if (bufferInfo.offset != 0) {
fos.write(outData, bufferInfo.offset, outData.length
- bufferInfo.offset);
} else {
fos.write(outData, 0, outData.length);
}
fos.flush();
Log.i(TAG, "out data -- > " + outData.length);
mMediaCodec.releaseOutputBuffer(outputBufferIndex, false);
outputBufferIndex = mMediaCodec.dequeueOutputBuffer(bufferInfo,
0);
} catch (IOException e) {
e.printStackTrace();
}
} else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
outputBuffers = mMediaCodec.getOutputBuffers();
} else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
MediaFormat format = mMediaCodec.getOutputFormat();
}
} while (outputBufferIndex >= 0);
}
I guess the problem occurs in the encode() method; the method is used in the Camera preview callback, like this:
initCodec();
//mCamera.setPreviewCallback(new MyPreviewCallback());
mCamera.setPreviewCallback(new PreviewCallback() {
@Override
public void onPreviewFrame(byte[] data, Camera camera) {
encode(data);
}
});
I just have no idea how to do it correctly with the MediaCodec API. Can you give me some advice or links about it?
Thank you!
I have solved the problem, as follows:
private synchronized void encode(byte[] data)
{
inputBuffers = mMediaCodec.getInputBuffers(); // the change: fetch the buffer arrays on every call
outputBuffers = mMediaCodec.getOutputBuffers();
int inputBufferIndex = mMediaCodec.dequeueInputBuffer(-1);
Log.i(TAG, "inputBufferIndex-->" + inputBufferIndex);
//......
Next, you will find that the color of the encoded video is not right; for more information, see MediaCodec and Camera: colorspaces don't match.
The YUV420 formats output by the camera are incompatible with the formats accepted by the MediaCodec AVC encoder. In the best case, it's essentially NV12 vs. NV21 (U and V planes are reversed), requiring a manual reordering. In the worst case, as of Android 4.2, the encoder input format may be device-specific.
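For illustration, a minimal sketch of that manual reordering, assuming the camera delivers NV21 and the encoder wants NV12 (which, as noted, is only the best case and may not hold on a given device):
// Swap NV21 (Y plane + interleaved VU) to NV12 (Y plane + interleaved UV)
// by exchanging each chroma byte pair. Assumes width*height luma bytes
// followed by width*height/2 interleaved chroma bytes, with no padding.
public static byte[] nv21ToNv12(byte[] nv21, int width, int height) {
    byte[] nv12 = new byte[nv21.length];
    int lumaSize = width * height;
    System.arraycopy(nv21, 0, nv12, 0, lumaSize); // Y plane is identical
    for (int i = lumaSize; i < nv21.length - 1; i += 2) {
        nv12[i] = nv21[i + 1];     // U comes first in NV12
        nv12[i + 1] = nv21[i];     // V comes second
    }
    return nv12;
}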
You're better off using MediaRecorder to connect the camera hardware to the encoder.
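A minimal sketch of that MediaRecorder route, assuming an already-opened Camera and an existing preview Surface (the variable names and output path are placeholders, not from the question):
Camera camera = Camera.open();
camera.unlock(); // hand the camera over to MediaRecorder

MediaRecorder recorder = new MediaRecorder();
recorder.setCamera(camera);
recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
recorder.setVideoSize(320, 240);
recorder.setVideoFrameRate(15);
recorder.setPreviewDisplay(previewSurface); // previewSurface is assumed to exist
recorder.setOutputFile("/sdcard/test.mp4"); // example path
try {
    recorder.prepare();
    recorder.start();
} catch (IOException e) {
    e.printStackTrace();
}
// later: recorder.stop(); recorder.release(); camera.lock();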
Update:
It's now possible to pass the camera's Surface preview to MediaCodec, instead of using the YUV data in the ByteBuffer. This is faster and more portable. See the CameraToMpegTest sample here.
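For reference, a sketch of the encoder-side setup for that Surface path (API 18+); rendering the camera frames into the returned Surface is done with OpenGL ES via a SurfaceTexture, as CameraToMpegTest shows, and is omitted here. Assume this runs in a method declared throws IOException:
MediaFormat format = MediaFormat.createVideoFormat("video/avc", 320, 240);
format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
        MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
format.setInteger(MediaFormat.KEY_BIT_RATE, 125000);
format.setInteger(MediaFormat.KEY_FRAME_RATE, 15);
format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);

MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
Surface inputSurface = encoder.createInputSurface(); // must come between configure() and start()
encoder.start();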
Related
I am saving video frames in H264 format and audio in AAC format. Then I concatenate these to create an MP4 file using ffmpeg on Android, but when I concatenate the audio and video they do not play in sync; the audio lags behind the video. How can I play video and audio synchronously? The H264 video is 6 seconds long and the audio is 8 seconds; after concatenating them I get an 8-second file in which the audio runs longer and drifts out of sync.
Recording Audio to AAC format
recorder = new MediaRecorder();
recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
recorder.setAudioEncodingBitRate(48000);//48000
recorder.setAudioSamplingRate(720);//16000
recorder.setOutputFile(path2);
try {
recorder.prepare();
} catch (IOException e) {
e.printStackTrace();
}
recorder.start();
Saving Video to H264 format
MediaFormat mediaFormat = MediaFormat.createVideoFormat("video/avc",
1280,
720);
mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT,
MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar);
mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 6000000);
mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 720); //video second
mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);
try {
mMediaCodec = MediaCodec.createEncoderByType("video/avc");
} catch (IOException e) {
e.printStackTrace();
}
mMediaCodec.configure(mediaFormat,
null,
null,
MediaCodec.CONFIGURE_FLAG_ENCODE);
mMediaCodec.start();
//Video format H264
private synchronized void encode(byte[] data) {
ByteBuffer[] inputBuffers = mMediaCodec.getInputBuffers();
ByteBuffer[] outputBuffers = mMediaCodec.getOutputBuffers();
int inputBufferIndex = mMediaCodec.dequeueInputBuffer(-1);
if (inputBufferIndex >= 0) {
ByteBuffer inputBuffer = inputBuffers[inputBufferIndex];
inputBuffer.capacity();
inputBuffer.clear();
inputBuffer.put(data);
mMediaCodec.queueInputBuffer(inputBufferIndex, 0, data.length, 0, 0);
} else {
return;
}
MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
int outputBufferIndex = mMediaCodec.dequeueOutputBuffer(bufferInfo, 0);
Log.i(TAG, "outputBufferIndex-->" + outputBufferIndex);
do {
if (outputBufferIndex >= 0) {
ByteBuffer outBuffer = outputBuffers[outputBufferIndex];
System.out.println("buffer info-->" + bufferInfo.offset + "--"
+ bufferInfo.size + "--" + bufferInfo.flags + "--"
+ bufferInfo.presentationTimeUs);
byte[] outData = new byte[bufferInfo.size];
outBuffer.get(outData);
try {
if (bufferInfo.offset != 0) {
fos.write(outData, bufferInfo.offset, outData.length
- bufferInfo.offset);
} else {
fos.write(outData, 0, outData.length);
}
fos.flush();
Log.i(TAG, "out data -- > " + outData.length);
mMediaCodec.releaseOutputBuffer(outputBufferIndex, false);
outputBufferIndex = mMediaCodec.dequeueOutputBuffer(bufferInfo,
0);
} catch (IOException e) {
e.printStackTrace();
}
} else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
outputBuffers = mMediaCodec.getOutputBuffers();
} else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
MediaFormat format = mMediaCodec.getOutputFormat();
}
} while (outputBufferIndex >= 0);
}
Concatenating Video and Audio using ffmpeg
String[] cmd = {"-i", h264_video_path, "-i", aac_audio_path, "-c", "copy", "-map","0:v:0","-map","1:a:0", outpath_mp4};
try {
//FFMPEG execute command
executeCommand(cmd);
} catch (FFmpegCommandAlreadyRunningException e) {
e.printStackTrace();
}
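Two ffmpeg options may help here; they are suggestions on my part, not something from the original post. Raw H.264 carries no timing information, so "-r" placed before the video input declares its real frame rate, and "-shortest" ends the output when the shorter stream runs out:
// Hypothetical variant of the command above (the "15" is an example value):
String[] cmd = {"-r", "15", "-i", h264_video_path, "-i", aac_audio_path,
        "-c", "copy", "-map", "0:v:0", "-map", "1:a:0",
        "-shortest", outpath_mp4};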
I would like to pick a video from the device and decode it in order to change its frame rate, then encode it and save it back to the device. How is this possible using MediaCodec? I went through a lot of documentation but couldn't find a method. I have the following code for decoding. Will it be of any use for my purpose? If yes, how can I use the decoded data to save the video with a changed fps?
MediaFormat mediaFormat = MediaFormat.createVideoFormat("video/avc", 1080, 720);
mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 2500000);
mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 20);
try {
decoder = MediaCodec.createDecoderByType("video/avc");
} catch (IOException e) {
Log.d("Error", "Fail to create MediaCodec: " + e.toString());
}
///Commenting for testing...
/*
// Pass the decoded data to the surface to display
decoder.configure(mediaFormat, null, null, 0);
//decoder.configure(mediaFormat, null, null, 0);
decoder.start();
*/
///Commenting for testing...
// new BufferInfo();
ByteBuffer[] inputBuffers = decoder.getInputBuffers();
ByteBuffer[] outputBuffers = decoder.getOutputBuffers();
if (null == inputBuffers) {
Log.d("Error", "null == inputBuffers");
}
if (null == outputBuffers) {
Log.d("Error", "null == outbputBuffers 111");
}
FileInputStream file = null;
try {
file = new FileInputStream(data.getData().getPath().toString());
} catch (FileNotFoundException e) {
Log.d("Error", "open file error: " + e.toString());
return;
}
int read_size = -1;
int mCount = 0;
for (; ; ) {
byte[] h264 = null;
try {
byte[] length_bytes = new byte[4];
read_size = file.read(length_bytes);
if (read_size < 0) {
Log.d("Error", "read_size<0 pos1");
break;
}
int byteCount = bytesToInt(length_bytes, 0);
//Changed to .length
//int byteCount=length_bytes.length;
Log.d("Error", "byteCount: " + byteCount);
h264 = new byte[byteCount];
read_size = file.read(h264, 0, byteCount);
// Log.d("Error", "read_size: " + read_size);
if (read_size < 0) {
Log.d("Error", "read_size<0 pos2");
break;
}
// Log.d("Error", "pos: " + file.)
} catch (IOException e) {
Log.d("Error", "read_size 2: " + read_size);
Log.d("Error", "e.toStrinig(): " + e.toString());
break;
}
int inputBufferIndex = decoder.dequeueInputBuffer(-1);
if (inputBufferIndex >= 0) {
ByteBuffer inputBuffer = inputBuffers[inputBufferIndex];
inputBuffer.clear();
inputBuffer.put(h264);
// long sample_time = ;
decoder.queueInputBuffer(inputBufferIndex, 0, h264.length, mCount * 1000000 / 20, 0);
++mCount;
} else {
Log.d("Error", "dequeueInputBuffer error");
}
ByteBuffer outputBuffer = null;
MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
int outputBufferIndex = decoder.dequeueOutputBuffer(bufferInfo, 0);
while (outputBufferIndex >= 0) {
outputBuffer = outputBuffers[outputBufferIndex];
decoder.releaseOutputBuffer(outputBufferIndex, true);
outputBufferIndex = decoder.dequeueOutputBuffer(bufferInfo, 0);
}
// Pass the decoded data to the surface to display
decoder.configure(mediaFormat,mPreview.getHolder().getSurface() , null, 0);
//decoder.configure(mediaFormat, null, null, 0);
decoder.start();
if (outputBufferIndex >= 0) {
decoder.releaseOutputBuffer(outputBufferIndex, false);
} else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
outputBuffers = decoder.getOutputBuffers();
Log.d("Error", "outputBufferIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED");
} else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
// Subsequent data will conform to new format.
Log.d("Error", "outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED");
}
try {
Thread.sleep(1000/20);
} catch (InterruptedException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
}
}
public int bytesToInt(byte[] src, int offset) {
int value;
value = (int) ((src[offset] & 0xFF)
| ((src[offset+1] & 0xFF)<<8)
| ((src[offset+2] & 0xFF)<<16)
| ((src[offset+3] & 0xFF)<<24));
return value;
}
You can take a look at DecodeEditEncode, a great starting point for decoding and re-encoding using surfaces (output surface for decoder -> input surface for encoder).
Take a look especially at this method
private void editVideoData(VideoChunks inputData, MediaCodec decoder,
OutputSurface outputSurface, InputSurface inputSurface, MediaCodec encoder,
VideoChunks outputData)
The workflow you have to follow is similar to the one below (a compact sketch of the extractor and muxer ends of this pipeline follows the list):
Extract the video track (MediaExtractor)
Feed the decoder input buffers
Render the decoded frame to the surface
When rendered, the encoder will get the frame (you have to set the timestamp too)
Use MediaMuxer to mux the encoded frames with the audio track.
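A compact sketch of steps 1 and 5, assuming API 18+, a video-only pass, and a method declared throws IOException; inputPath and outputPath are placeholders:
// Step 1: pick the video track with MediaExtractor.
MediaExtractor extractor = new MediaExtractor();
extractor.setDataSource(inputPath);
int videoTrack = -1;
for (int i = 0; i < extractor.getTrackCount(); i++) {
    MediaFormat format = extractor.getTrackFormat(i);
    if (format.getString(MediaFormat.KEY_MIME).startsWith("video/")) {
        videoTrack = i;
        break;
    }
}
extractor.selectTrack(videoTrack);

// Step 5: the muxer track must be added with the *encoder's* output format,
// i.e. after the encoder first reports INFO_OUTPUT_FORMAT_CHANGED; only then
// may muxer.start() be called.
MediaMuxer muxer = new MediaMuxer(outputPath, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
// Steps 2-4 (decode, render to the encoder's input surface, drain the encoder
// and call muxer.writeSampleData(...)) follow the DecodeEditEncode pattern
// linked above.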
Extra links: some examples
ExtractDecodeEditEncodeMuxTest
VideoResample.java (very interesting)
I am trying to capture video frames, encode them with MediaCodec, and save them to a file. The code that I am using is:
public class AvcEncoder {
private static String TAG = AvcEncoder.class.getSimpleName();
private MediaCodec mediaCodec;
private BufferedOutputStream outputStream;
public AvcEncoder(String fileDir) {
Log.d(TAG, "Thread Id: " + Thread.currentThread().getId());
File f = new File(Environment.getExternalStorageDirectory(), "Download/LiveCamera/video_encoded.h264");
try {
outputStream = new BufferedOutputStream(new FileOutputStream(f));
Log.i("AvcEncoder", "outputStream initialized");
} catch (Exception e){
e.printStackTrace();
}
mediaCodec = MediaCodec.createEncoderByType("video/avc");
MediaFormat mediaFormat = MediaFormat.createVideoFormat("video/avc", 960, 720);
mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 2000000);
mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 15);
//mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar);
mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar);
mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);
mediaCodec.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
mediaCodec.start();
}
public void close() throws IOException {
mediaCodec.stop();
mediaCodec.release();
mediaCodec = null;
//outputStream.flush();
outputStream.close();
}
public void byteWriteTest(byte[] input) {
try {
outputStream.write(input, 0, input.length);
} catch(Exception e) {
Log.d("AvcEncoder", "Outputstream write failed");
e.printStackTrace();
}
Log.i("AvcEncoder", input.length + " bytes written");
}
// called from Camera.setPreviewCallbackWithBuffer(...) in other class
public void offerEncoder(byte[] input) {
try {
ByteBuffer[] inputBuffers = mediaCodec.getInputBuffers();
ByteBuffer[] outputBuffers = mediaCodec.getOutputBuffers();
int inputBufferIndex = mediaCodec.dequeueInputBuffer(-1);
if (inputBufferIndex >= 0) {
ByteBuffer inputBuffer = inputBuffers[inputBufferIndex];
inputBuffer.clear();
inputBuffer.put(input);
mediaCodec.queueInputBuffer(inputBufferIndex, 0, input.length, 0, 0);
}
MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
int outputBufferIndex = mediaCodec.dequeueOutputBuffer(bufferInfo,0);
while (outputBufferIndex >= 0) {
ByteBuffer outputBuffer = outputBuffers[outputBufferIndex];
byte[] outData = new byte[bufferInfo.size];
outputBuffer.get(outData);
try {
outputStream.write(outData, 0, outData.length);
} catch(Exception e) {
Log.d("AvcEncoder", "Outputstream write failed");
e.printStackTrace();
}
//Log.i("AvcEncoder", outData.length + " bytes written");
mediaCodec.releaseOutputBuffer(outputBufferIndex, false);
outputBufferIndex = mediaCodec.dequeueOutputBuffer(bufferInfo, 0);
}
} catch (Throwable t) {
t.printStackTrace();
}
}
}
For each frame that arrives in the SurfaceView's onPreviewFrame, AvcEncoder's offerEncoder() method is called, as follows:
public class CameraView extends SurfaceView implements Camera.PreviewCallback,
SurfaceHolder.Callback {
...
@Override
public void onPreviewFrame(byte[] pData, Camera pCamera) {
if (VIDEO_ENCODE) {
avcEncoder.offerEncoder(pData);
}
}
}
The Problem
Now, the problem I am having is with writing the encoded frames to the file. It seems that after every N frames (roughly; not exactly the same every time), the statement outputStream.write(outData, 0, outData.length) in AvcEncoder's offerEncoder() method takes much longer (a few hundred times longer than in other iterations). I assume this probably happens when the stream flushes its buffer, that is, actually writes to the file. (Please correct me if this assumption is not correct.)
This results in dropping the frames that arrive during that time (again, I assume, based on the resulting video), which in turn results in a pause in the recorded video after every N frames.
When I comment this statement out, the iterations of the offerEncoder() method take roughly equal time.
The Question
How can I solve this issue so that writing to the file is smooth? Has anyone else encountered this problem? I see that many people use this code, but no one has complained about this issue so far (or at least I did not find such a complaint).
Thanks.
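For what it's worth, a minimal sketch of one possible mitigation, assuming the stalls really are synchronous flushes to disk: hand the encoded chunks to a dedicated writer thread so offerEncoder() never blocks on I/O. This is a hypothetical addition to AvcEncoder, not code from the thread (it needs java.util.concurrent imports, and writerThread.start() would go in the constructor):
private final BlockingQueue<byte[]> writeQueue = new LinkedBlockingQueue<byte[]>();

private final Thread writerThread = new Thread(new Runnable() {
    @Override
    public void run() {
        try {
            while (true) {
                byte[] chunk = writeQueue.take(); // blocks until data arrives
                if (chunk.length == 0) break;     // empty array used as a poison pill
                outputStream.write(chunk);
            }
            outputStream.flush();
        } catch (IOException e) {
            e.printStackTrace();
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
    }
});

// In offerEncoder(), instead of outputStream.write(outData, 0, outData.length):
//     writeQueue.offer(outData);
// In close(), before outputStream.close():
//     writeQueue.offer(new byte[0]);
//     writerThread.join();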
I am creating a raw H.264 stream output with MediaCodec. The problem is that the output file is not playable in the default Android player (API 16). How can it be that Android can export a file that is not playable in its own player, only in VLC on the PC? Maybe something is wrong with my code? My video is 384x288.
public class AvcEncoder {
private MediaCodec mediaCodec;
private BufferedOutputStream outputStream;
private File f;
public AvcEncoder(int w, int h, String file_name)
{
f = new File(file_name + ".mp4");
try {
outputStream = new BufferedOutputStream(new FileOutputStream(f));
} catch (Exception e){
e.printStackTrace();
}
String key_mime = "video/avc"; //video/mp4v-es, video/3gpp, video/avc
mediaCodec = MediaCodec.createEncoderByType(key_mime);
MediaFormat mediaFormat = MediaFormat.createVideoFormat(key_mime, w, h);
mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, (w * h) << 3);
mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 25);
mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar);
mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);
mediaFormat.setInteger(MediaFormat.KEY_AAC_PROFILE,MediaCodecInfo.CodecProfileLevel.MPEG4ProfileMain);
mediaCodec.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
mediaCodec.start();
}
public void close() {
try {
mediaCodec.stop();
mediaCodec.release();
outputStream.flush();
outputStream.close();
} catch (Exception e){
e.printStackTrace();
}
}
public void offerEncoder(byte[] input) {
try {
ByteBuffer[] inputBuffers = mediaCodec.getInputBuffers();
ByteBuffer[] outputBuffers = mediaCodec.getOutputBuffers();
int inputBufferIndex = mediaCodec.dequeueInputBuffer(0);
if (inputBufferIndex >= 0) {
ByteBuffer inputBuffer = inputBuffers[inputBufferIndex];
inputBuffer.clear();
inputBuffer.put(input);
mediaCodec.queueInputBuffer(inputBufferIndex, 0, input.length, 0, 0);
}
MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
int outputBufferIndex = mediaCodec.dequeueOutputBuffer(bufferInfo, 0);
while (outputBufferIndex >= 0) {
ByteBuffer outputBuffer = outputBuffers[outputBufferIndex];
byte[] outData = new byte[bufferInfo.size];
outputBuffer.get(outData);
outputStream.write(outData, 0, outData.length);
mediaCodec.releaseOutputBuffer(outputBufferIndex, false);
outputBufferIndex = mediaCodec.dequeueOutputBuffer(bufferInfo, 0);
}
} catch (Throwable t) {
}
}
}
The Android MediaPlayer doesn't handle raw H.264 streams.
One difficulty with such streams is that the H.264 NAL units don't have timestamp information, so unless the video frames are at a known fixed frame rate the player wouldn't know when to present them.
You can either create your own player with MediaCodec (see e.g. "Play video (TextureView)" in Grafika), or convert the raw stream to a .mp4 file. The latter requires MediaMuxer, available in API 18, or the use of a 3rd-party library like ffmpeg.
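To make the MediaMuxer option concrete, a rough sketch of how the drain loop from the question could feed a muxer instead of the raw FileOutputStream (API 18+; this assumes real presentation timestamps are passed to queueInputBuffer, which the posted code does not do):
MediaMuxer muxer = new MediaMuxer(f.getAbsolutePath(), MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
int track = -1;

MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
int outIndex = mediaCodec.dequeueOutputBuffer(bufferInfo, 0);
while (outIndex >= 0 || outIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
    if (outIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
        track = muxer.addTrack(mediaCodec.getOutputFormat()); // carries the SPS/PPS
        muxer.start();
    } else {
        if ((bufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) == 0) {
            // bufferInfo supplies size, flags and presentationTimeUs:
            // exactly the timing that a raw .h264 file lacks.
            muxer.writeSampleData(track, outputBuffers[outIndex], bufferInfo);
        }
        mediaCodec.releaseOutputBuffer(outIndex, false);
    }
    outIndex = mediaCodec.dequeueOutputBuffer(bufferInfo, 0);
}
// when finished: muxer.stop(); muxer.release();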
I'm trying to use MediaCodec to retrieve all the frames from a video for image-processing purposes. I'm trying to render the video and to capture the frames from the outBuffers,
but I can't instantiate a Bitmap from the received bytes.
I've tried rendering to a surface and to nothing (null), because I've noticed that when you render to null, the outBuffers receive the bytes of the rendered frames.
This is the code:
private static final String SAMPLE = Environment.getExternalStorageDirectory() + "/test_videos/sample2.mp4";
private PlayerThread mPlayer = null;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
SurfaceView sv = new SurfaceView(this);
sv.getHolder().addCallback(this);
setContentView(sv);
}
protected void onDestroy() {
super.onDestroy();
}
@Override
public void surfaceCreated(SurfaceHolder holder) {
}
@Override
public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
if (mPlayer == null) {
mPlayer = new PlayerThread(holder.getSurface());
mPlayer.start();
}
}
@Override
public void surfaceDestroyed(SurfaceHolder holder) {
if (mPlayer != null) {
mPlayer.interrupt();
}
}
private void writeFrameToSDCard(byte[] bytes, int i, int sampleSize) {
try {
Bitmap bmp = BitmapFactory.decodeByteArray(bytes, 0, sampleSize);
File file = new File(Environment.getExternalStorageDirectory() + "/test_videos/sample" + i + ".png");
if (file.exists())
file.delete();
file.createNewFile();
FileOutputStream out = new FileOutputStream(file.getAbsoluteFile());
bmp.compress(Bitmap.CompressFormat.PNG, 90, out);
out.close();
} catch (Exception e) {
e.printStackTrace();
}
}
private class PlayerThread extends Thread {
private MediaExtractor extractor;
private MediaCodec decoder;
private Surface surface;
public PlayerThread(Surface surface) {
this.surface = surface;
}
@Override
public void run() {
extractor = new MediaExtractor();
extractor.setDataSource(SAMPLE);
int index = extractor.getTrackCount();
Log.d("MediaCodecTag", "Track count: " + index);
for (int i = 0; i < extractor.getTrackCount(); i++) {
MediaFormat format = extractor.getTrackFormat(i);
String mime = format.getString(MediaFormat.KEY_MIME);
if (mime.startsWith("video/")) {
extractor.selectTrack(i);
decoder = MediaCodec.createDecoderByType(mime);
decoder.configure(format, surface, null, 0);
break;
}
}
if (decoder == null) {
Log.e("DecodeActivity", "Can't find video info!");
return;
}
decoder.start();
ByteBuffer[] inputBuffers = decoder.getInputBuffers();
ByteBuffer[] outputBuffers = decoder.getOutputBuffers();
BufferInfo info = new BufferInfo();
boolean isEOS = false;
long startMs = System.currentTimeMillis();
int i = 0;
while (!Thread.interrupted()) {
if (!isEOS) {
int inIndex = decoder.dequeueInputBuffer(10000);
if (inIndex >= 0) {
ByteBuffer buffer = inputBuffers[inIndex];
int sampleSize = extractor.readSampleData(buffer, 0);
if (sampleSize < 0) {
decoder.queueInputBuffer(inIndex, 0, 0, 0, MediaCodec.BUFFER_FLAG_END_OF_STREAM);
isEOS = true;
} else {
decoder.queueInputBuffer(inIndex, 0, sampleSize, extractor.getSampleTime(), 0);
extractor.advance();
}
}
}
/* saves frame to sdcard */
int outIndex = decoder.dequeueOutputBuffer(info, 10000); // outIndex is negative most of the time (no output ready yet)
switch (outIndex) {
case MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED:
Log.d("DecodeActivity", "INFO_OUTPUT_BUFFERS_CHANGED");
outputBuffers = decoder.getOutputBuffers();
break;
case MediaCodec.INFO_OUTPUT_FORMAT_CHANGED:
Log.d("DecodeActivity", "New format " + decoder.getOutputFormat());
break;
case MediaCodec.INFO_TRY_AGAIN_LATER:
Log.d("DecodeActivity", "dequeueOutputBuffer timed out!");
break;
default:
ByteBuffer buffer = outputBuffers[outIndex];
Log.v("DecodeActivity", "We can't use this buffer but render it due to the API limit, " + buffer);
// We use a very simple clock to keep the video FPS, or the video
// playback will be too fast
while (info.presentationTimeUs / 1000 > System.currentTimeMillis() - startMs) {
try {
sleep(10);
} catch (InterruptedException e) {
e.printStackTrace();
break;
}
}
decoder.releaseOutputBuffer(outIndex, true);
try {
byte[] dst = new byte[outputBuffers[outIndex].capacity()];
outputBuffers[outIndex].get(dst);
writeFrameToSDCard(dst, i, dst.length);
i++;
} catch (Exception e) {
Log.d("iDecodeActivity", "Error while creating bitmap with: " + e.getMessage());
}
break;
}
// All decoded frames have been rendered, we can stop playing now
if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
Log.d("DecodeActivity", "OutputBuffer BUFFER_FLAG_END_OF_STREAM");
break;
}
}
decoder.stop();
decoder.release();
extractor.release();
}
}
Any help would be much appreciated.
You can decode to a Surface or to a ByteBuffer, but not both. Because you are configuring a Surface, there will always be zero bytes of data in the output buffer.
If you configure for ByteBuffer decoding, the data format will vary, but to my knowledge will never be an ARGB format that Bitmap understands. You can see examples of two YUV formats being examined in the buffer-to-buffer tests in the CTS EncodeDecodeTest in method checkFrame(). Note, however, that the first thing it does is check the format and return immediately if it's not recognized.
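If the device does happen to report a plain planar layout, a manual conversion is possible. A rough sketch, assuming COLOR_FormatYUV420Planar (I420) with no stride padding, which is precisely the device-dependent part:
// Convert tightly-packed I420 (Y plane, then U, then V) to an ARGB Bitmap.
public static Bitmap i420ToBitmap(byte[] yuv, int width, int height) {
    int[] argb = new int[width * height];
    int uOffset = width * height;
    int vOffset = uOffset + uOffset / 4;
    for (int y = 0; y < height; y++) {
        for (int x = 0; x < width; x++) {
            int yy = yuv[y * width + x] & 0xFF;
            int uu = (yuv[uOffset + (y / 2) * (width / 2) + (x / 2)] & 0xFF) - 128;
            int vv = (yuv[vOffset + (y / 2) * (width / 2) + (x / 2)] & 0xFF) - 128;
            int r = clamp(yy + (int) (1.402f * vv));
            int g = clamp(yy - (int) (0.344f * uu + 0.714f * vv));
            int b = clamp(yy + (int) (1.772f * uu));
            argb[y * width + x] = 0xFF000000 | (r << 16) | (g << 8) | b;
        }
    }
    return Bitmap.createBitmap(argb, width, height, Bitmap.Config.ARGB_8888);
}

private static int clamp(int v) {
    return v < 0 ? 0 : (v > 255 ? 255 : v);
}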
At present (Android 4.4), the only reliable way to do this is to decode to a SurfaceTexture, render that with GLES, and extract RGB data with glReadPixels(). Sample code is available on bigflake -- see ExtractMpegFramesTest (requires API 16+).