Android: Writing video frames in real-time results in pauses

I am trying to capture video frames, encode them with MediaCodec, and save them to a file. The code I am using is:
public class AvcEncoder {
private static String TAG = AvcEncoder.class.getSimpleName();
private MediaCodec mediaCodec;
private BufferedOutputStream outputStream;
public AvcEncoder(String fileDir) {
Log.d(TAG, "Thread Id: " + Thread.currentThread().getId());
File f = new File(Environment.getExternalStorageDirectory(), "Download/LiveCamera/video_encoded.h264");
try {
outputStream = new BufferedOutputStream(new FileOutputStream(f));
Log.i("AvcEncoder", "outputStream initialized");
} catch (Exception e){
e.printStackTrace();
}
mediaCodec = MediaCodec.createEncoderByType("video/avc");
MediaFormat mediaFormat = MediaFormat.createVideoFormat("video/avc", 960, 720);
mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 2000000);
mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 15);
//mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar);
mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar);
mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);
mediaCodec.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
mediaCodec.start();
}
public void close() throws IOException {
mediaCodec.stop();
mediaCodec.release();
mediaCodec = null;
//outputStream.flush();
outputStream.close();
}
public void byteWriteTest(byte[] input) {
try {
outputStream.write(input, 0, input.length);
} catch(Exception e) {
Log.d("AvcEncoder", "Outputstream write failed");
e.printStackTrace();
}
Log.i("AvcEncoder", input.length + " bytes written");
}
// called from Camera.setPreviewCallbackWithBuffer(...) in other class
public void offerEncoder(byte[] input) {
try {
ByteBuffer[] inputBuffers = mediaCodec.getInputBuffers();
ByteBuffer[] outputBuffers = mediaCodec.getOutputBuffers();
int inputBufferIndex = mediaCodec.dequeueInputBuffer(-1);
if (inputBufferIndex >= 0) {
ByteBuffer inputBuffer = inputBuffers[inputBufferIndex];
inputBuffer.clear();
inputBuffer.put(input);
mediaCodec.queueInputBuffer(inputBufferIndex, 0, input.length, 0, 0);
}
MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
int outputBufferIndex = mediaCodec.dequeueOutputBuffer(bufferInfo,0);
while (outputBufferIndex >= 0) {
ByteBuffer outputBuffer = outputBuffers[outputBufferIndex];
byte[] outData = new byte[bufferInfo.size];
outputBuffer.get(outData);
try {
outputStream.write(outData, 0, outData.length);
} catch(Exception e) {
Log.d("AvcEncoder", "Outputstream write failed");
e.printStackTrace();
}
//Log.i("AvcEncoder", outData.length + " bytes written");
mediaCodec.releaseOutputBuffer(outputBufferIndex, false);
outputBufferIndex = mediaCodec.dequeueOutputBuffer(bufferInfo, 0);
}
} catch (Throwable t) {
t.printStackTrace();
}
}
}
For each frame that arrives in the SurfaceView's onPreviewFrame, AvcEncoder's offerEncoder() method is called, as follows:
public class CameraView extends SurfaceView implements Camera.PreviewCallback,
SurfaceHolder.Callback {
...
@Override
public void onPreviewFrame(byte[] pData, Camera pCamera) {
if (VIDEO_ENCODE) {
avcEncoder.offerEncoder(pData);
}
}
}
The Problem
Now, the problem I am having is with writing the encoded frames to the file. It seems that after every N frames (roughly; not exactly the same every time), the statement outputStream.write(outData, 0, outData.length) in AvcEncoder's offerEncoder method takes much longer (a few hundred times longer than in other iterations). I have assumed this probably happens when the stream flushes its buffer, i.e., actually writes to the file. (Please correct me if this assumption is wrong.)
This results in dropping the frames that arrive during that time (again, I assume, based on the resulting video), which in turn produces a pause in the recorded video after every N frames.
When I comment this statement out, the iterations of the offerEncoder method take roughly equal time.
The Question
How can I solve this issue so that writing to the file is smooth? Has anyone else encountered this problem? I see that many people use this code, but no one has complained about this issue so far (or at least I could not find such a complaint).
Thanks.
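A possible direction (not from the original post, just a sketch): since the camera callback, the encoder drain, and the file write all run on the same thread here, one common approach is to hand the encoded chunks to a dedicated writer thread so the camera callback never waits on disk I/O, optionally with a larger BufferedOutputStream buffer. The FrameWriter class and its names below are hypothetical:
import java.io.BufferedOutputStream;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;
// Hypothetical sketch: decouple disk I/O from the camera/encoder thread.
// offerEncoder() would call writer.enqueue(outData) instead of outputStream.write(...).
public class FrameWriter extends Thread {
    private final BlockingQueue<byte[]> queue = new LinkedBlockingQueue<>();
    private final BufferedOutputStream out;
    private volatile boolean running = true;
    public FrameWriter(File file) throws IOException {
        // A larger buffer means fewer (but bigger) flushes to disk.
        out = new BufferedOutputStream(new FileOutputStream(file), 1 << 20);
    }
    public void enqueue(byte[] encodedChunk) {
        queue.offer(encodedChunk); // returns immediately; the caller is never blocked
    }
    @Override
    public void run() {
        try {
            while (running || !queue.isEmpty()) {
                byte[] chunk = queue.poll(100, TimeUnit.MILLISECONDS);
                if (chunk != null) out.write(chunk); // slow writes happen only on this thread
            }
            out.flush();
            out.close();
        } catch (IOException | InterruptedException e) {
            e.printStackTrace();
        }
    }
    public void finish() {
        running = false; // lets the loop drain the queue and close the file
    }
}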

Related

Picking a video, decoding it, changing its fps, encoding and saving using mediacodec

I would like to pick a video from the device and decode it in order to change its frame rate, then encode it and save it back to the device. How is this possible using MediaCodec? I went through a lot of documentation but couldn't find a method. I have the following code for decoding. Will it be of any use for my purpose? If yes, how do I use the decoded data to save it with a changed fps?
MediaFormat mediaFormat = MediaFormat.createVideoFormat("video/avc", 1080, 720);
mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 2500000);
mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 20);
try {
decoder = MediaCodec.createDecoderByType("video/avc");
} catch (IOException e) {
Log.d("Error", "Fail to create MediaCodec: " + e.toString());
}
///Commenting for testing...
/*
// Pass the decoded data to the surface to display
decoder.configure(mediaFormat, null, null, 0);
//decoder.configure(mediaFormat, null, null, 0);
decoder.start();
*/
///Commenting for testing...
// new BufferInfo();
ByteBuffer[] inputBuffers = decoder.getInputBuffers();
ByteBuffer[] outputBuffers = decoder.getOutputBuffers();
if (null == inputBuffers) {
Log.d("Error", "null == inputBuffers");
}
if (null == outputBuffers) {
Log.d("Error", "null == outbputBuffers 111");
}
FileInputStream file = null;
try {
file = new FileInputStream(data.getData().getPath().toString());
} catch (FileNotFoundException e) {
Log.d("Error", "open file error: " + e.toString());
return;
}
int read_size = -1;
int mCount = 0;
for (; ; ) {
byte[] h264 = null;
try {
byte[] length_bytes = new byte[4];
read_size = file.read(length_bytes);
if (read_size < 0) {
Log.d("Error", "read_size<0 pos1");
break;
}
int byteCount = bytesToInt(length_bytes, 0);
//Changed to .length
//int byteCount=length_bytes.length;
Log.d("Error", "byteCount: " + byteCount);
h264 = new byte[byteCount];
read_size = file.read(h264, 0, byteCount);
// Log.d("Error", "read_size: " + read_size);
if (read_size < 0) {
Log.d("Error", "read_size<0 pos2");
break;
}
// Log.d("Error", "pos: " + file.)
} catch (IOException e) {
Log.d("Error", "read_size 2: " + read_size);
Log.d("Error", "e.toStrinig(): " + e.toString());
break;
}
int inputBufferIndex = decoder.dequeueInputBuffer(-1);
if (inputBufferIndex >= 0) {
ByteBuffer inputBuffer = inputBuffers[inputBufferIndex];
inputBuffer.clear();
inputBuffer.put(h264);
// long sample_time = ;
decoder.queueInputBuffer(inputBufferIndex, 0, h264.length, mCount * 1000000 / 20, 0);
++mCount;
} else {
Log.d("Error", "dequeueInputBuffer error");
}
ByteBuffer outputBuffer = null;
MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
int outputBufferIndex = decoder.dequeueOutputBuffer(bufferInfo, 0);
while (outputBufferIndex >= 0) {
outputBuffer = outputBuffers[outputBufferIndex];
decoder.releaseOutputBuffer(outputBufferIndex, true);
outputBufferIndex = decoder.dequeueOutputBuffer(bufferInfo, 0);
}
// Pass the decoded data to the surface to display
decoder.configure(mediaFormat,mPreview.getHolder().getSurface() , null, 0);
//decoder.configure(mediaFormat, null, null, 0);
decoder.start();
if (outputBufferIndex >= 0) {
decoder.releaseOutputBuffer(outputBufferIndex, false);
} else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
outputBuffers = decoder.getOutputBuffers();
Log.d("Error", "outputBufferIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED");
} else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
// Subsequent data will conform to new format.
Log.d("Error", "outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED");
}
try {
Thread.sleep(1000/20);
} catch (InterruptedException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
}
}
public int bytesToInt(byte[] src, int offset) {
int value;
value = (int) ((src[offset] & 0xFF)
| ((src[offset+1] & 0xFF)<<8)
| ((src[offset+2] & 0xFF)<<16)
| ((src[offset+3] & 0xFF)<<24));
return value;
}
You can take a look at DecodeEditEncode, a great starting point for decoding and re-encoding using surfaces (output surface for decoder -> input surface for encoder).
Take a look especially at this method
private void editVideoData(VideoChunks inputData, MediaCodec decoder,
OutputSurface outputSurface, InputSurface inputSurface, MediaCodec encoder,
VideoChunks outputData)
The workflow you have to follow is similar to the one below (a minimal sketch of the first and last steps follows the list):
Extract the video track (MediaExtractor)
Feed the decoder input buffers
Render the decoded frame to the surface
Once rendered, the encoder will get the frame (you have to set the timestamp too)
Use MediaMuxer to mux the encoder output with the audio track.
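As a rough illustration of the first and last steps only (a sketch; inputPath, outputPath, and the surrounding encoder variables are assumptions, and the decode/render/encode loop in between is exactly what DecodeEditEncode demonstrates):
// Sketch: select the video track with MediaExtractor and prepare a MediaMuxer (API 18+).
MediaExtractor extractor = new MediaExtractor();
extractor.setDataSource(inputPath); // inputPath: assumed path of the source video
int videoTrackIndex = -1;
for (int i = 0; i < extractor.getTrackCount(); i++) {
    MediaFormat trackFormat = extractor.getTrackFormat(i);
    if (trackFormat.getString(MediaFormat.KEY_MIME).startsWith("video/")) {
        videoTrackIndex = i;
        break;
    }
}
extractor.selectTrack(videoTrackIndex);
MediaMuxer muxer = new MediaMuxer(outputPath, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
// ... decode samples from the extractor, render them to the encoder's input Surface with the
// new timestamps, then call muxer.addTrack(encoder.getOutputFormat()) once the encoder reports
// INFO_OUTPUT_FORMAT_CHANGED, muxer.start(), and muxer.writeSampleData(track, buffer, info)
// for every encoded frame; finally muxer.stop() and muxer.release().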
Extra links: some examples
ExtractDecodeEditEncodeMuxTest
VideoResample.java (very interesting)

Android background video recording

I am developing a recording service for a custom Android platform. When the application starts it will start recording a video in the background. Unfortunately this application runs on hardware that prevents me from using video recording.
My solution to this problem is to take images and hold them in a circular buffer, when an event happens it will stop feeding images to the buffer and place them together in a video.
The problem I am encountering is that when I save the images to video I just get a noisy green screen.
I based my code on this example: Using MediaCodec to save series of images as Video
Note: I cannot use MediaMuxer either; I am developing for API level < 18.
I will guide you through the steps I take. On creation of the service I simply open the camera, set the preview on a SurfaceTexture, and add images to my buffer when the PreviewCallback is called.
private Camera mCamera;
private String mTimeStamp;
SurfaceTexture mSurfaceTexture;
private CircularBuffer<ByteArrayOutputStream> mCircularBuffer;
private static final int MAX_BUFFER_SIZE = 200;
private int mWidth = 720;
private int mHeight = 480;
@Override
public void onCreate() {
try {
mCircularBuffer = new CircularBuffer(MAX_BUFFER_SIZE);
mTimeStamp = new SimpleDateFormat("yyyyMMdd_HHmmss").format(new Date());
mSurfaceTexture = new SurfaceTexture(10);
mCamera = getCameraInstance();
Parameters parameters = mCamera.getParameters();
parameters.setJpegQuality(20);
parameters.setPictureSize(mWidth, mHeight);
mCamera.setParameters(parameters);
mCamera.setPreviewTexture(mSurfaceTexture);
mCamera.startPreview();
mCamera.setPreviewCallback(mPreviewCallback);
} catch (IOException e) {
Log.d(TAG, "IOException: " + e.getMessage());
} catch (Exception e) {
Log.d(TAG, "Exception: " + e.getMessage());
}
}
private PreviewCallback mPreviewCallback = new PreviewCallback() {
@Override
public void onPreviewFrame(byte[] data, Camera camera) {
ByteArrayOutputStream out = new ByteArrayOutputStream();
YuvImage yuvImage = new YuvImage(data, ImageFormat.NV21, mWidth, mHeight, null);
Rect rectangle = new Rect(0, 0, mWidth, mHeight);
yuvImage.compressToJpeg(rectangle, 20, out);
mCircularBuffer.add(out);
}
};
All of this works; when I convert the byte arrays to JPEG at this point, they are all valid image files.
Now, when an event happens, the service is destroyed and the last 200 images need to be placed one after another and converted to MP4. I do this by first saving them to H.264, based on the code provided in the link above, and then converting that file to MP4 using mp4parser.
@Override
public void onDestroy() {
super.onDestroy();
mCamera.stopPreview();
saveFileToH264("video/avc");
convertH264ToMP4();
}
private void saveFileToH264(String MIMETYPE) {
MediaCodec codec = MediaCodec.createEncoderByType(MIMETYPE);
MediaFormat mediaFormat = null;
int height = mCamera.getParameters().getPictureSize().height;
int width = mCamera.getParameters().getPictureSize().width;
Log.d(TAG, height + ", " + width);
mediaFormat = MediaFormat.createVideoFormat(MIMETYPE, width, height);
mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 1000000);
mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 15);
mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT,
MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar);
mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 10);
codec.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
codec.start();
ByteBuffer[] inputBuffers = codec.getInputBuffers();
ByteBuffer[] outputBuffers = codec.getOutputBuffers();
boolean sawInputEOS = false;
int inputBufferIndex = -1, outputBufferIndex = -1;
BufferInfo info = null;
try {
File file = new File("/sdcard/output.h264");
FileOutputStream fstream2 = new FileOutputStream(file);
DataOutputStream dos = new DataOutputStream(fstream2);
// loop through buffer and get image output streams
for (int i = 0; i < MAX_BUFFER_SIZE; i++) {
ByteArrayOutputStream out = mCircularBuffer.getData(i);
byte[] dat = out.toByteArray();
long WAITTIME = 50;
inputBufferIndex = codec.dequeueInputBuffer(WAITTIME);
int bytesread = MAX_BUFFER_SIZE - 1 - i;
int presentationTime = 0;
if (bytesread <= 0)
sawInputEOS = true;
if (inputBufferIndex >= 0) {
if (!sawInputEOS) {
int samplesiz = dat.length;
inputBuffers[inputBufferIndex].put(dat);
codec.queueInputBuffer(inputBufferIndex, 0, samplesiz, presentationTime, 0);
presentationTime += 100;
info = new BufferInfo();
outputBufferIndex = codec.dequeueOutputBuffer(info, WAITTIME);
Log.i("BATA", "outputBufferIndex=" + outputBufferIndex);
if (outputBufferIndex >= 0) {
byte[] array = new byte[info.size];
outputBuffers[outputBufferIndex].get(array);
if (array != null) {
try {
dos.write(array);
} catch (IOException e) {
e.printStackTrace();
}
}
codec.releaseOutputBuffer(outputBufferIndex, false);
inputBuffers[inputBufferIndex].clear();
outputBuffers[outputBufferIndex].clear();
if (sawInputEOS)
break;
}
} else {
codec.queueInputBuffer(inputBufferIndex, 0, 0, presentationTime,
MediaCodec.BUFFER_FLAG_END_OF_STREAM);
info = new BufferInfo();
outputBufferIndex = codec.dequeueOutputBuffer(info, WAITTIME);
if (outputBufferIndex >= 0) {
byte[] array = new byte[info.size];
outputBuffers[outputBufferIndex].get(array);
if (array != null) {
try {
dos.write(array);
} catch (IOException e) {
e.printStackTrace();
}
}
codec.releaseOutputBuffer(outputBufferIndex, false);
inputBuffers[inputBufferIndex].clear();
outputBuffers[outputBufferIndex].clear();
break;
}
}
}
}
codec.flush();
try {
fstream2.close();
dos.flush();
dos.close();
} catch (IOException e) {
e.printStackTrace();
}
codec.stop();
codec.release();
codec = null;
} catch (FileNotFoundException e) {
Log.d(TAG, "File not found: " + e.getMessage());
} catch (Exception e) {
Log.d(TAG, "Exception: " + e.getMessage());
}
}
private void convertH264ToMP4() {
try {
DataSource videoFile = new FileDataSourceImpl("/sdcard/output.h264");
H264TrackImpl h264Track = new H264TrackImpl(videoFile, "eng", 5, 1);
// 5fps. you can play with timescale and timetick to get non integer fps, 23.967 is
// 24000/1001
Movie movie = new Movie();
movie.addTrack(h264Track);
Container out = new DefaultMp4Builder().build(movie);
FileOutputStream fos = new FileOutputStream(new File("/sdcard/output.mp4"));
out.writeContainer(fos.getChannel());
fos.flush();
fos.close();
Log.d(TAG, "Video saved to sdcard");
} catch (Exception e) {
Log.d(TAG, "No file was saved");
}
}
I'm pretty sure the problem is in the saveFileToH264 code. I've read in a post, at the link provided above, that this is probably a stride and/or alignment issue(?). However, I have no experience with encoding/decoding, so I'm not sure how to solve this. If anyone could help, that would be greatly appreciated!
Note: I know the code is not optimal and I still need to add more checks and whatnot, but I first want to get a working video out of this.
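One thing worth double-checking (an observation on the code above, not a confirmed fix): the circular buffer holds JPEG data produced by compressToJpeg(), but an encoder configured with COLOR_FormatYUV420SemiPlanar expects raw YUV frames, so feeding it JPEG bytes would produce garbage regardless of any stride or alignment issue. A sketch of buffering the raw preview bytes instead (mRawCircularBuffer is hypothetical):
// Sketch: keep raw NV21 preview frames in the buffer so the encoder receives
// the uncompressed YUV data it was configured for (a U/V reorder may still be needed;
// see the colorspace answer further down this page).
private PreviewCallback mRawPreviewCallback = new PreviewCallback() {
    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        byte[] copy = new byte[data.length]; // the camera reuses 'data', so copy it
        System.arraycopy(data, 0, copy, 0, data.length);
        mRawCircularBuffer.add(copy); // hypothetical CircularBuffer<byte[]>
    }
};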

Raw H.264 stream output by MediaCodec not playable

I am creating a raw H.264 stream output with MediaCodec. The problem is that the output file is not playable in the default Android player (API 16). How can it be that Android exports a file that is not playable in its own player, only in VLC on a PC? Is something perhaps wrong with my code? My video is 384x288.
public class AvcEncoder {
private MediaCodec mediaCodec;
private BufferedOutputStream outputStream;
private File f;
public AvcEncoder(int w, int h, String file_name)
{
f = new File(file_name + ".mp4");
try {
outputStream = new BufferedOutputStream(new FileOutputStream(f));
} catch (Exception e){
e.printStackTrace();
}
String key_mime = "video/avc"; //video/mp4v-es, video/3gpp, video/avc
mediaCodec = MediaCodec.createEncoderByType(key_mime);
MediaFormat mediaFormat = MediaFormat.createVideoFormat(key_mime, w, h);
mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, (w * h) << 3);
mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 25);
mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar);
mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);
mediaFormat.setInteger(MediaFormat.KEY_AAC_PROFILE,MediaCodecInfo.CodecProfileLevel.MPEG4ProfileMain);
mediaCodec.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
mediaCodec.start();
}
public void close() {
try {
mediaCodec.stop();
mediaCodec.release();
outputStream.flush();
outputStream.close();
} catch (Exception e){
e.printStackTrace();
}
}
public void offerEncoder(byte[] input) {
try {
ByteBuffer[] inputBuffers = mediaCodec.getInputBuffers();
ByteBuffer[] outputBuffers = mediaCodec.getOutputBuffers();
int inputBufferIndex = mediaCodec.dequeueInputBuffer(0);
if (inputBufferIndex >= 0) {
ByteBuffer inputBuffer = inputBuffers[inputBufferIndex];
inputBuffer.clear();
inputBuffer.put(input);
mediaCodec.queueInputBuffer(inputBufferIndex, 0, input.length, 0, 0);
}
MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
int outputBufferIndex = mediaCodec.dequeueOutputBuffer(bufferInfo, 0);
while (outputBufferIndex >= 0) {
ByteBuffer outputBuffer = outputBuffers[outputBufferIndex];
byte[] outData = new byte[bufferInfo.size];
outputBuffer.get(outData);
outputStream.write(outData, 0, outData.length);
mediaCodec.releaseOutputBuffer(outputBufferIndex, false);
outputBufferIndex = mediaCodec.dequeueOutputBuffer(bufferInfo, 0);
}
} catch (Throwable t) {
}
}
}
The Android MediaPlayer doesn't handle raw H.264 streams.
One difficulty with such streams is that the H.264 NAL units don't have timestamp information, so unless the video frames are at a known fixed frame rate the player wouldn't know when to present them.
You can either create your own player with MediaCodec (see e.g. "Play video (TextureView)" in Grafika), or convert the raw stream to a .mp4 file. The latter requires MediaMuxer, available in API 18, or the use of a 3rd-party library like ffmpeg.
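For the MediaMuxer route (API 18+), a minimal sketch of the drain loop is below; variable names mirror the question's code but are assumptions, and the BufferInfo timestamps (set when queueing input) are what give the player the timing information a bare .h264 file lacks:
// Sketch: wrap encoder output in an .mp4 container instead of writing raw NAL units.
MediaMuxer muxer = new MediaMuxer(outputPath, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
int trackIndex = -1;
boolean muxerStarted = false;
MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
while (encoding) { // assumed loop condition
    int outIndex = mediaCodec.dequeueOutputBuffer(bufferInfo, 10000);
    if (outIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
        trackIndex = muxer.addTrack(mediaCodec.getOutputFormat()); // carries SPS/PPS (csd-0/csd-1)
        muxer.start();
        muxerStarted = true;
    } else if (outIndex >= 0) {
        boolean isConfig = (bufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0;
        if (muxerStarted && bufferInfo.size > 0 && !isConfig) {
            ByteBuffer encoded = mediaCodec.getOutputBuffers()[outIndex];
            muxer.writeSampleData(trackIndex, encoded, bufferInfo); // uses presentationTimeUs
        }
        mediaCodec.releaseOutputBuffer(outIndex, false);
    }
}
muxer.stop();
muxer.release();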

MediaCodec get all frames from video

I'm trying to use MediaCodec to retrieve all the frames from a video for image-processing work. I'm trying to render the video and capture each frame from the outBuffers,
but I can't instantiate a Bitmap from the received bytes.
I've tried rendering to a surface and to nothing (null), because I noticed that when rendering to null, the outBuffers receive the bytes of the rendered frames.
This is the code:
private static final String SAMPLE = Environment.getExternalStorageDirectory() + "/test_videos/sample2.mp4";
private PlayerThread mPlayer = null;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
SurfaceView sv = new SurfaceView(this);
sv.getHolder().addCallback(this);
setContentView(sv);
}
protected void onDestroy() {
super.onDestroy();
}
@Override
public void surfaceCreated(SurfaceHolder holder) {
}
@Override
public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
if (mPlayer == null) {
mPlayer = new PlayerThread(holder.getSurface());
mPlayer.start();
}
}
@Override
public void surfaceDestroyed(SurfaceHolder holder) {
if (mPlayer != null) {
mPlayer.interrupt();
}
}
private void writeFrameToSDCard(byte[] bytes, int i, int sampleSize) {
try {
Bitmap bmp = BitmapFactory.decodeByteArray(bytes, 0, sampleSize);
File file = new File(Environment.getExternalStorageDirectory() + "/test_videos/sample" + i + ".png");
if (file.exists())
file.delete();
file.createNewFile();
FileOutputStream out = new FileOutputStream(file.getAbsoluteFile());
bmp.compress(Bitmap.CompressFormat.PNG, 90, out);
out.close();
} catch (Exception e) {
e.printStackTrace();
}
}
private class PlayerThread extends Thread {
private MediaExtractor extractor;
private MediaCodec decoder;
private Surface surface;
public PlayerThread(Surface surface) {
this.surface = surface;
}
@Override
public void run() {
extractor = new MediaExtractor();
extractor.setDataSource(SAMPLE);
int index = extractor.getTrackCount();
Log.d("MediaCodecTag", "Track count: " + index);
for (int i = 0; i < extractor.getTrackCount(); i++) {
MediaFormat format = extractor.getTrackFormat(i);
String mime = format.getString(MediaFormat.KEY_MIME);
if (mime.startsWith("video/")) {
extractor.selectTrack(i);
decoder = MediaCodec.createDecoderByType(mime);
decoder.configure(format, surface, null, 0);
break;
}
}
if (decoder == null) {
Log.e("DecodeActivity", "Can't find video info!");
return;
}
decoder.start();
ByteBuffer[] inputBuffers = decoder.getInputBuffers();
ByteBuffer[] outputBuffers = decoder.getOutputBuffers();
BufferInfo info = new BufferInfo();
boolean isEOS = false;
long startMs = System.currentTimeMillis();
int i = 0;
while (!Thread.interrupted()) {
if (!isEOS) {
int inIndex = decoder.dequeueInputBuffer(10000);
if (inIndex >= 0) {
ByteBuffer buffer = inputBuffers[inIndex];
int sampleSize = extractor.readSampleData(buffer, 0);
if (sampleSize < 0) {
decoder.queueInputBuffer(inIndex, 0, 0, 0, MediaCodec.BUFFER_FLAG_END_OF_STREAM);
isEOS = true;
} else {
decoder.queueInputBuffer(inIndex, 0, sampleSize, extractor.getSampleTime(), 0);
extractor.advance();
}
}
}
/* saves frame to sdcard */
int outIndex = decoder.dequeueOutputBuffer(info, 10000); // outIndex most of the times null
switch (outIndex) {
case MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED:
Log.d("DecodeActivity", "INFO_OUTPUT_BUFFERS_CHANGED");
outputBuffers = decoder.getOutputBuffers();
break;
case MediaCodec.INFO_OUTPUT_FORMAT_CHANGED:
Log.d("DecodeActivity", "New format " + decoder.getOutputFormat());
break;
case MediaCodec.INFO_TRY_AGAIN_LATER:
Log.d("DecodeActivity", "dequeueOutputBuffer timed out!");
break;
default:
ByteBuffer buffer = outputBuffers[outIndex];
Log.v("DecodeActivity", "We can't use this buffer but render it due to the API limit, " + buffer);
// We use a very simple clock to keep the video FPS, or the video
// playback will be too fast
while (info.presentationTimeUs / 1000 > System.currentTimeMillis() - startMs) {
try {
sleep(10);
} catch (InterruptedException e) {
e.printStackTrace();
break;
}
}
decoder.releaseOutputBuffer(outIndex, true);
try {
byte[] dst = new byte[outputBuffers[outIndex].capacity()];
outputBuffers[outIndex].get(dst);
writeFrameToSDCard(dst, i, dst.length);
i++;
} catch (Exception e) {
Log.d("iDecodeActivity", "Error while creating bitmap with: " + e.getMessage());
}
break;
}
// All decoded frames have been rendered, we can stop playing now
if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
Log.d("DecodeActivity", "OutputBuffer BUFFER_FLAG_END_OF_STREAM");
break;
}
}
decoder.stop();
decoder.release();
extractor.release();
}
}
Any help would be much appreciated.
You can decode to a Surface or to a ByteBuffer, but not both. Because you are configuring a Surface, there will always be zero bytes of data in the output buffer.
If you configure for ByteBuffer decoding, the data format will vary, but to my knowledge will never be an ARGB format that Bitmap understands. You can see examples of two YUV formats being examined in the buffer-to-buffer tests in the CTS EncodeDecodeTest in method checkFrame(). Note, however, that the first thing it does is check the format and return immediately if it's not recognized.
At present (Android 4.4), the only reliable way to do this is to decode to a SurfaceTexture, render that with GLES, and extract RGB data with glReadPixels(). Sample code is available on bigflake -- see ExtractMpegFramesTest (requires API 16+).
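If you do try the ByteBuffer route anyway, the key differences from the code above are a null Surface in configure() and inspecting the decoder's reported color format before interpreting the bytes. A sketch (whether the resulting format is something you can convert is device-dependent, as noted above):
// Sketch: decode to ByteBuffers instead of a Surface, then check the YUV layout.
decoder.configure(format, null, null, 0); // null Surface: decoded YUV lands in the output buffers
decoder.start();
// ... inside the drain loop:
int outIndex = decoder.dequeueOutputBuffer(info, 10000);
if (outIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
    MediaFormat outFormat = decoder.getOutputFormat();
    int colorFormat = outFormat.getInteger(MediaFormat.KEY_COLOR_FORMAT);
    // Typically COLOR_FormatYUV420Planar or COLOR_FormatYUV420SemiPlanar; you must convert
    // YUV -> RGB yourself, since BitmapFactory.decodeByteArray() only understands compressed
    // formats such as PNG/JPEG, not raw YUV.
} else if (outIndex >= 0) {
    ByteBuffer yuv = outputBuffers[outIndex]; // one raw YUV frame: info.size bytes at info.offset
    // ... copy the frame out and convert it before building a Bitmap
    decoder.releaseOutputBuffer(outIndex, false); // false: nothing to render
}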

How to use Android MediaCodec to encode Camera data (YUV420SP)

Thank you for your attention!
I want to use the Android MediaCodec APIs to encode video frames acquired from the Camera;
unfortunately, I have not succeeded in doing that! I am still not familiar with the MediaCodec API.
The following is my code; I need your help to figure out what I should do.
1. The camera settings:
Parameters parameters = mCamera.getParameters();
parameters.setPreviewFormat(ImageFormat.NV21);
parameters.setPreviewSize(320, 240);
mCamera.setParameters(parameters);
2. Set the encoder:
private void initCodec() {
try {
fos = new FileOutputStream(mVideoFile, false);
} catch (FileNotFoundException e) {
e.printStackTrace();
}
mMediaCodec = MediaCodec.createEncoderByType("video/avc");
MediaFormat mediaFormat = MediaFormat.createVideoFormat("video/avc",
320,
240);
mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 125000);
mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 15);
mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT,
MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar);
mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);
mMediaCodec.configure(mediaFormat,
null,
null,
MediaCodec.CONFIGURE_FLAG_ENCODE);
mMediaCodec.start();
inputBuffers = mMediaCodec.getInputBuffers();
outputBuffers = mMediaCodec.getOutputBuffers();
}
private void encode(byte[] data) {
int inputBufferIndex = mMediaCodec.dequeueInputBuffer(0);
if (inputBufferIndex >= 0) {
ByteBuffer inputBuffer = inputBuffers[inputBufferIndex];
inputBuffer.clear();
inputBuffer.put(data);
mMediaCodec.queueInputBuffer(inputBufferIndex, 0, data.length, 0, 0);
} else {
return;
}
MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
int outputBufferIndex = mMediaCodec.dequeueOutputBuffer(bufferInfo, 0);
Log.i(TAG, "outputBufferIndex-->" + outputBufferIndex);
do {
if (outputBufferIndex >= 0) {
ByteBuffer outBuffer = outputBuffers[outputBufferIndex];
System.out.println("buffer info-->" + bufferInfo.offset + "--"
+ bufferInfo.size + "--" + bufferInfo.flags + "--"
+ bufferInfo.presentationTimeUs);
byte[] outData = new byte[bufferInfo.size];
outBuffer.get(outData);
try {
if (bufferInfo.offset != 0) {
fos.write(outData, bufferInfo.offset, outData.length
- bufferInfo.offset);
} else {
fos.write(outData, 0, outData.length);
}
fos.flush();
Log.i(TAG, "out data -- > " + outData.length);
mMediaCodec.releaseOutputBuffer(outputBufferIndex, false);
outputBufferIndex = mMediaCodec.dequeueOutputBuffer(bufferInfo,
0);
} catch (IOException e) {
e.printStackTrace();
}
} else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
outputBuffers = mMediaCodec.getOutputBuffers();
} else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
MediaFormat format = mMediaCodec.getOutputFormat();
}
} while (outputBufferIndex >= 0);
}
I guess the problem is in the encode method; this method is used in the camera preview callback, like this:
initCodec();
//mCamera.setPreviewCallback(new MyPreviewCallback());
mCamera.setPreviewCallback(new PreviewCallback() {
@Override
public void onPreviewFrame(byte[] data, Camera camera) {
encode(data);
}
});
I just have no idea how to do it correctly with the MediaCodec API. Can you give me some advice or links about it?
Thank you!
I have solved the problem, as follows:
private synchronized void encode(byte[] data)
{
inputBuffers = mMediaCodec.getInputBuffers();// here changes
outputBuffers = mMediaCodec.getOutputBuffers();
int inputBufferIndex = mMediaCodec.dequeueInputBuffer(-1);
Log.i(TAG, "inputBufferIndex-->" + inputBufferIndex);
//......
Next, you will find that your encoded video's colors are not right. For more information, please see MediaCodec and Camera: colorspaces don't match
The YUV420 formats output by the camera are incompatible with the formats accepted by the MediaCodec AVC encoder. In the best case, it's essentially NV12 vs. NV21 (U and V planes are reversed), requiring a manual reordering. In the worst case, as of Android 4.2, the encoder input format may be device-specific.
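In the NV21-vs-NV12 case specifically, the Y plane is identical and only the interleaved chroma bytes need swapping, so the manual reordering can be as simple as the sketch below (it only helps on devices whose encoder genuinely accepts COLOR_FormatYUV420SemiPlanar input):
// Sketch: NV21 (camera) stores interleaved V,U after the Y plane; NV12 stores U,V.
// Swapping each chroma pair converts between them. Write into a new array so the
// camera's reused preview buffer is left untouched.
public static byte[] nv21ToNv12(byte[] nv21, int width, int height) {
    byte[] nv12 = new byte[nv21.length];
    int ySize = width * height;
    System.arraycopy(nv21, 0, nv12, 0, ySize); // Y plane is unchanged
    for (int i = ySize; i < nv21.length; i += 2) {
        nv12[i] = nv21[i + 1];     // U (stored second in NV21)
        nv12[i + 1] = nv21[i];     // V (stored first in NV21)
    }
    return nv12;
}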
You're better off using MediaRecorder to connect the camera hardware to the encoder.
Update:
It's now possible to pass the camera's Surface preview to MediaCodec, instead of using the YUV data in the ByteBuffer. This is faster and more portable. See the CameraToMpegTest sample here.
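On the encoder side, the Surface input path (API 18+) looks roughly like the sketch below; the camera-to-SurfaceTexture-to-GLES plumbing that actually feeds this Surface is the part demonstrated in CameraToMpegTest, and the format values here are illustrative:
// Sketch: Surface input to the encoder (API 18+); no YUV ByteBuffers involved.
MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
MediaFormat format = MediaFormat.createVideoFormat("video/avc", 640, 480);
format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
        MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface); // frames come from a Surface
format.setInteger(MediaFormat.KEY_BIT_RATE, 2000000);
format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);
encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
Surface inputSurface = encoder.createInputSurface(); // must be called between configure() and start()
encoder.start();
// Render camera frames into inputSurface with EGL/GLES, then drain the encoder as usual.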
