I'm pulling video from a drone that encodes the video stream as H.264. It sends each NAL unit in 1-5 UDP packets. I have code that reassembles a NAL unit from those packets and removes the header, then queues it to MediaCodec, which renders to a Surface object.
The output should be a video feed but for some reason, all I get is a black screen. I know that the surface is working as intended because if I futz with the NAL unit I get this green garbage that I think happens when MediaCodec doesn't know what it's looking at.
Anyway, here's the section of code that handles the actual decoding. Is there anything actually wrong with it, or am I looking for the issue in the wrong place?
//This part initializes the decoder and generally sets up everything needed for the while loop down below
encodedVideo = new ServerSocket(11111, 1460, false, 0);
MediaFormat format = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, 1920, 1080);
format.setInteger(MediaFormat.KEY_MAX_INPUT_SIZE, 100000);
try {
    m_codec = MediaCodec.createDecoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
    m_codec.configure(format, new Surface(droneSight.getSurfaceTexture()), null, 0);
} catch (Exception e) {
    e.printStackTrace();
}
m_codec.start();
running = true;
initialFrame = Arrays.copyOf(encodedVideo.getPacketData(true,true),encodedVideo.getPacketData(true,false).length);

//This section handles the actual grabbing and decoding of each NAL unit. Or it would, if it worked.
while (running) {
    byte[] frame = this.getNALUnit();
    int inputIndex = m_codec.dequeueInputBuffer(-1);
    if (inputIndex >= 0) {
        ByteBuffer buf = m_codec.getInputBuffer(inputIndex);
        buf.put(frame);
        m_codec.queueInputBuffer(inputIndex, 0, frame.length, 0, 0);
    }
    MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
    int outputIndex = m_codec.dequeueOutputBuffer(info, 0);
    if (outputIndex >= 0) {
        m_codec.releaseOutputBuffer(outputIndex, true);
    }
}
Related
I initially tried "How to play raw NAL units in Andoid exoplayer?" but I realized I'm going to have to work at a lower level.
I've found this simple MediaCodec example. As you can see, it's a thread that plays a file on a surface passed to it.
Notice the lines
mExtractor = new MediaExtractor();
mExtractor.setDataSource(filePath);
It looks like I should create my own MediaExtractor which, instead of extracting the video units from a file, will use the H.264 NAL units from a buffer I provide.
I can then call mExtractor.setDataSource(MediaDataSource dataSource), see MediaDataSource
It has readAt(long position, byte[] buffer, int offset, int size)
This is where it reads the NAL units. However, how should I pass them? I have no information on the structure of the buffer that needs to be read.
Should I pass a byte[] buffer with the NAL units in it, and if so, in which format? What is the offset for? If it's a buffer, shouldn't I just erase the lines that were read and thus have no offset or size?
By the way, the H.264 NAL units are streamed; they come from RTP packets, not from files. I'm going to receive them through C++, store them in a buffer, and try to pass them to the MediaExtractor.
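For what it's worth, readAt() is agnostic about NAL units: position is the absolute offset into whatever byte source you wrap, offset/size say where in the caller's buffer to copy, and you return the number of bytes copied, or -1 at end of stream. A minimal sketch backed by a plain byte array (the class name is mine, purely for illustration):
import android.media.MediaDataSource;

class ByteArrayDataSource extends MediaDataSource {
    private final byte[] data;

    ByteArrayDataSource(byte[] data) { this.data = data; }

    @Override
    public int readAt(long position, byte[] buffer, int offset, int size) {
        if (position >= data.length) return -1;                    // end of stream
        int count = Math.min(size, data.length - (int) position);
        System.arraycopy(data, (int) position, buffer, offset, count);
        return count;                                              // bytes actually copied
    }

    @Override
    public long getSize() { return data.length; }                  // must be known up front

    @Override
    public void close() { }
}
The catch for a live RTP feed is that MediaExtractor expects a seekable source with a known size, which is a big part of why people skip MediaExtractor entirely and queue the NAL units straight into MediaCodec.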
UPDATE:
I've been reading a lot about MediaCodec and I think I understand it better. According to https://developer.android.com/reference/android/media/MediaCodec, everything relies on something of this type:
MediaCodec codec = MediaCodec.createByCodecName(name);
MediaFormat mOutputFormat; // member variable
codec.setCallback(new MediaCodec.Callback() {
@Override
void onInputBufferAvailable(MediaCodec mc, int inputBufferId) {
ByteBuffer inputBuffer = codec.getInputBuffer(inputBufferId);
// fill inputBuffer with valid data
…
codec.queueInputBuffer(inputBufferId, …);
}
@Override
void onOutputBufferAvailable(MediaCodec mc, int outputBufferId, …) {
ByteBuffer outputBuffer = codec.getOutputBuffer(outputBufferId);
MediaFormat bufferFormat = codec.getOutputFormat(outputBufferId); // option A
// bufferFormat is equivalent to mOutputFormat
// outputBuffer is ready to be processed or rendered.
…
codec.releaseOutputBuffer(outputBufferId, …);
}
@Override
void onOutputFormatChanged(MediaCodec mc, MediaFormat format) {
// Subsequent data will conform to new format.
// Can ignore if using getOutputFormat(outputBufferId)
mOutputFormat = format; // option B
}
@Override
void onError(…) {
…
}
});
codec.configure(format, …);
mOutputFormat = codec.getOutputFormat(); // option B
codec.start();
// wait for processing to complete
codec.stop();
codec.release();
As you can see, I can pass input buffers and get decoded output buffers back. The exact byte formats are still a mystery, but I think that's how it works. Also, according to the same article, using ByteBuffers is slow and Surfaces are preferred; they consume the output buffers automatically. Although there's no tutorial on how to do it, there's a section in the article that says it's almost identical, so I guess I just need to add these additional lines:
codec.setInputSurface(Surface inputSurface)
codec.setOutputSurface(Surface outputSurface)
where inputSurface and outputSurface are Surfaces which I pass to a MediaPlayer that I use (how?) to display the video in an activity. The output buffers would then simply not arrive in onOutputBufferAvailable (because the surface consumes them first), nor in onInputBufferAvailable.
So the questions now are: how exactly do I construct a Surface which contains the video buffer, and how do I display a MediaPlayer in an activity?
For output I can simply create a Surface and pass it to a MediaPlayer and a MediaCodec, but what about input? Do I need ByteBuffers for the input anyway, with a Surface input just being for using other outputs as inputs?
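If I understand the docs right, a decoder that renders to a Surface still takes its compressed input through ByteBuffers; only the output side is consumed by the Surface, and no MediaPlayer is involved. I imagine the asynchronous setup would look roughly like this (surface would come from a SurfaceView or TextureView in my activity, nextNalUnit() is a placeholder for reading one Annex-B NAL unit from my buffer, and exception handling is omitted):
MediaFormat format = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, width, height);
MediaCodec codec = MediaCodec.createDecoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
codec.setCallback(new MediaCodec.Callback() {
    @Override
    public void onInputBufferAvailable(MediaCodec mc, int index) {
        byte[] nal = nextNalUnit();                   // placeholder: one Annex-B NAL unit from my buffer
        ByteBuffer in = mc.getInputBuffer(index);
        in.put(nal);
        mc.queueInputBuffer(index, 0, nal.length, 0, 0);
    }
    @Override
    public void onOutputBufferAvailable(MediaCodec mc, int index, MediaCodec.BufferInfo info) {
        mc.releaseOutputBuffer(index, true);          // true = render this frame to the Surface
    }
    @Override
    public void onOutputFormatChanged(MediaCodec mc, MediaFormat newFormat) { }
    @Override
    public void onError(MediaCodec mc, MediaCodec.CodecException e) { e.printStackTrace(); }
});
codec.configure(format, surface, null, 0);            // the output Surface goes here
codec.start();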
You first need to remove the NAL units and feed the raw H264 bytes into this method; however, in your case you're reading from a file, so there is no need to remove anything since you're not using packets. Just feed the data bytes to this method:
private void initDecoder(){
try {
writeHeader = true;
if(mDecodeMediaCodec != null){
try{
mDecodeMediaCodec.stop();
}catch (Exception e){}
try{
mDecodeMediaCodec.release();
}catch (Exception e){}
}
mDecodeMediaCodec = MediaCodec.createDecoderByType(MIME_TYPE);
//MIME_TYPE = video/avc in your case
mDecodeMediaCodec.configure(format,mSurfaceView.getHolder().getSurface(),
null,
0);
mDecodeMediaCodec.start();
mDecodeInputBuffers = mDecodeMediaCodec.getInputBuffers();
} catch (IOException e) {
e.printStackTrace();
mLatch.trigger();
}
}
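Note that the snippet configures the codec with a MediaFormat named format that isn't shown above; presumably it is built once from the stream parameters, something along these lines (width, height, sps and pps are placeholders for values taken from your own stream):
MediaFormat format = MediaFormat.createVideoFormat(MIME_TYPE, width, height);
// Optional but helpful: hand the decoder the SPS/PPS (with their 00 00 00 01 start codes)
// so it does not have to wait for them to appear in the stream.
format.setByteBuffer("csd-0", ByteBuffer.wrap(sps));
format.setByteBuffer("csd-1", ByteBuffer.wrap(pps));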
private void decode(byte[] data){
try {
MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
int inputBufferIndex = mDecodeMediaCodec.dequeueInputBuffer(1000);//
if (inputBufferIndex >= 0) {
ByteBuffer buffer = mDecodeInputBuffers[inputBufferIndex];
buffer.clear();
buffer.put(data);
mDecodeMediaCodec.queueInputBuffer(inputBufferIndex,
0,
data.length,
packet.sequence / 1000,
0);
data = null;
//decodeDataBuffer.clear();
//decodeDataBuffer = null;
}
int outputBufferIndex = mDecodeMediaCodec.dequeueOutputBuffer(info,
1000);
do {
if (outputBufferIndex == MediaCodec.INFO_TRY_AGAIN_LATER) {
//no output available yet
} else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
//encodeOutputBuffers = mDecodeMediaCodec.getOutputBuffers();
} else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
format = mDecodeMediaCodec.getOutputFormat();
//mediaformat changed
} else if (outputBufferIndex < 0) {
//unexpected result from encoder.dequeueOutputBuffer
} else {
mDecodeMediaCodec.releaseOutputBuffer(outputBufferIndex,
true);
outputBufferIndex = mDecodeMediaCodec.dequeueOutputBuffer(info,
0);
}
} while (outputBufferIndex > 0);
    } catch (Exception e) {
        e.printStackTrace();
    }
}
Please don't forget that the I-frame (the first frame's bytes) contains critical data and MUST be fed to the decoder first.
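In practice "feed the I-frame first" means the decoder must see the SPS and PPS NAL units before (or together with) the first IDR frame. If they are not supplied via csd-0/csd-1 in the MediaFormat, one common approach is to queue them once with BUFFER_FLAG_CODEC_CONFIG; a sketch in terms of the variables above, where spsPps stands for the Annex-B SPS + PPS bytes captured from your stream:
int configIndex = mDecodeMediaCodec.dequeueInputBuffer(1000);
if (configIndex >= 0) {
    ByteBuffer configBuffer = mDecodeInputBuffers[configIndex];
    configBuffer.clear();
    configBuffer.put(spsPps);                        // SPS + PPS, including start codes
    mDecodeMediaCodec.queueInputBuffer(configIndex, 0, spsPps.length, 0,
            MediaCodec.BUFFER_FLAG_CODEC_CONFIG);    // marks this buffer as codec config, not a frame
}
// After this, call decode() with the IDR frame first, then the remaining frames in order.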
I have set up a series of classes to decode H264 streaming video from a server and render it on a SurfaceView. This code works perfectly on every device I've tried, including the emulator, but I recently bought myself an S7 and on this device it no longer works properly.
The weird thing is that sometimes it will work perfectly, and other times it will throw this error:
06-15 16:41:40.249 13300-24605/cm.myapp E/ACodec: [OMX.Exynos.avc.dec] ERROR(0x90000012)
06-15 16:41:40.249 13300-24605/cm.myapp E/ACodec: signalError(omxError 0x90000012, internalError -2147483648)
06-15 16:41:40.249 13300-24604/cm.myapp E/MediaCodec: Codec reported err 0x90000012, actionCode 0, while in state 6
06-15 16:41:40.249 13300-24578/cm.myapp W/MediaStreamerThread: Failed to draw media.
Sometimes it will crash on the dequeueOutputBuffer call:
java.lang.IllegalStateException
at android.media.MediaCodec.native_dequeueOutputBuffer(Native Method)
at android.media.MediaCodec.dequeueOutputBuffer(MediaCodec.java:2379)
And then again, some other times it will throw this very different error:
06-15 16:34:57.239 13300-16625/cm.myapp W/System.err: java.lang.IllegalArgumentException: The surface has been released
06-15 16:34:57.239 13300-16625/cm.myapp W/System.err: at android.media.MediaCodec.native_configure(Native Method)
06-15 16:34:57.239 13300-16625/cm.myapp W/System.err: at android.media.MediaCodec.configure(MediaCodec.java:1778)
java.lang.RuntimeException: Could not create h264 decoder
These errors by themselves are not very verbose and thus I cannot figure out where the problem might be.
Again my code works perfectly on most devices but it's failing on this one. How is this possible? Any ideas?
This is my decoding code:
public class H264Decoder
{
static private final long TIMEOUT_US = 10000L;
private MediaCodec mDecoder;
private Surface mSurface;
private static final List<byte[]> EMPTY_ENCODE_RESULT = new ArrayList<>();
public void init()
{
try
{
mDecoder = MediaCodec.createDecoderByType( "video/avc" );
MediaFormat mediaFormat = MediaFormat.createVideoFormat("video/avc", 640, 480);
mDecoder.configure(mediaFormat, mSurface, null, 0);
mDecoder.start();
}
catch (IOException | NoClassDefFoundError ex) {
ex.printStackTrace();
throw new RuntimeException("Could not create h264 decoder", ex);
}
}
public List<byte[]> offer(byte[] data)
{
List<byte[]> returnValue = new ArrayList<>();
returnValue.add(decode(data, true));
return returnValue;
}
public void release()
{
assert mSurface != null;
assert mDecoder != null;
mDecoder.stop();
mDecoder.release();
mDecoder = null;
}
public H264Decoder(Surface surface)
{
mSurface = surface;
}
public byte[] decode(byte[] data, boolean updateRender)
{
if (mSurface == null)
{
return null;
}
// INPUT -----------------------------------------------------------------------------------
int inputBufferIndex = mDecoder.dequeueInputBuffer(TIMEOUT_US);
if (inputBufferIndex >= 0)
{
// Get an input buffer from the codec, fill it with data, give it back
ByteBuffer inputBuffer = mDecoder.getInputBuffers()[inputBufferIndex];
inputBuffer.put(data);
mDecoder.queueInputBuffer(inputBufferIndex, 0, data.length, 0, 0 );
}
// OUTPUT ----------------------------------------------------------------------------------
MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
int outputBufferIndex = mDecoder.dequeueOutputBuffer(info, TIMEOUT_US);
if ( outputBufferIndex >= 0 )
{
final ByteBuffer[] outputBuffer = mDecoder.getOutputBuffers();
mDecoder.releaseOutputBuffer(outputBufferIndex, updateRender);
}
else if ( outputBufferIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED )
{
//outputBuffers = codec.getOutputBuffers();
}
else if ( outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED )
{
MediaFormat format = mDecoder.getOutputFormat();
}
return null;
}
}
It is hard to say what could be going wrong. One thing stands out, though:
mDecoder.releaseOutputBuffer(outputBufferIndex, updateRender);
if(!updateRender) {
return outputBuffer[outputBufferIndex].array();
}
I would not recommend returning the array from the output buffer. The documentation states:
Once an output buffer is released to the codec, it MUST NOT be used
If you truly need the decoded sample, it would be better to create a copy of the output buffer, call release, and then return the copy.
If the returned byte array is then processed somewhere else, I would recommend pulling this step out of the decoder and into the processing loop: work on the ByteBuffer (the output buffer) there, and only then call releaseOutputBuffer. For more reference, look into MediaCodec Synchronous Processing using Buffers.
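Something along these lines for the copy, sketched against the decode() method in the question (only worthwhile if you configure without a Surface; with a Surface attached the output buffer is typically empty):
if (outputBufferIndex >= 0) {
    ByteBuffer out = mDecoder.getOutputBuffers()[outputBufferIndex];
    byte[] copy = new byte[info.size];
    out.position(info.offset);
    out.limit(info.offset + info.size);
    out.get(copy);                                           // copy while we still own the buffer
    mDecoder.releaseOutputBuffer(outputBufferIndex, updateRender);
    return copy;                                             // safe to hand out after release
}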
For the Samsung S7, you have to make sure that the first H.264 frame fed into the decoder is an I-frame.
On Samsung Galaxy S8, I would get ERROR(0x90000012) asynchronously after calling MediaCodec.configure(), before feeding any stream input.
One sure way of getting ERROR(0x90000012) is to use streams in AVCC format (like Apple Quicktime movies).
In fact, across many Android devices we seem to have more success using H.264 in Annex-B format.
Annex-B also means not needing to pass out-of-band extra data using MediaCodec.BUFFER_FLAG_CODEC_CONFIG or SPS/PPS configured via MediaFormat (csd-0, csd-1 keys).
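If a source hands you AVCC data (each NAL prefixed with a 4-byte big-endian length, as in MP4/QuickTime), the usual fix is to rewrite it to Annex-B before queueing. A sketch, assuming 4-byte length prefixes:
import java.io.ByteArrayOutputStream;

static byte[] avccToAnnexB(byte[] avcc) {
    ByteArrayOutputStream out = new ByteArrayOutputStream(avcc.length + 16);
    int pos = 0;
    while (pos + 4 <= avcc.length) {
        int nalLen = ((avcc[pos] & 0xFF) << 24) | ((avcc[pos + 1] & 0xFF) << 16)
                   | ((avcc[pos + 2] & 0xFF) << 8) | (avcc[pos + 3] & 0xFF);
        pos += 4;
        out.write(0); out.write(0); out.write(0); out.write(1);   // Annex-B start code
        out.write(avcc, pos, nalLen);                             // the NAL unit itself
        pos += nalLen;
    }
    return out.toByteArray();
}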
I have a problem with this library:
https://github.com/fyhertz/libstreaming
It allows streaming the camera feed wirelessly, and it offers three methods: two using MediaCodec and one using MediaRecorder.
I would like to modify it so that it uses only MediaCodec. However, first I tried the code of example 2 of the library, and I always run into the same error:
the log tells me that the device can use MediaCodec; it sets up the encoder, but when it tests the decoder it fails and the buffer index stays at -1.
This is the method in the EncoderDebugger class where the exception occurs. Could some kind soul please help me?
private long decode(boolean withPrefix) {
int n =3, i = 0, j = 0;
long elapsed = 0, now = timestamp();
int decInputIndex = 0, decOutputIndex = 0;
ByteBuffer[] decInputBuffers = mDecoder.getInputBuffers();
ByteBuffer[] decOutputBuffers = mDecoder.getOutputBuffers();
BufferInfo info = new BufferInfo();
while (elapsed<3000000) {
// Feeds the decoder with a NAL unit
if (i<NB_ENCODED) {
decInputIndex = mDecoder.dequeueInputBuffer(1000000/FRAMERATE);
if (decInputIndex>=0) {
int l1 = decInputBuffers[decInputIndex].capacity();
int l2 = mVideo[i].length;
decInputBuffers[decInputIndex].clear();
if ((withPrefix && hasPrefix(mVideo[i])) || (!withPrefix && !hasPrefix(mVideo[i]))) {
check(l1>=l2, "The decoder input buffer is not big enough (nal="+l2+", capacity="+l1+").");
decInputBuffers[decInputIndex].put(mVideo[i],0,mVideo[i].length);
} else if (withPrefix && !hasPrefix(mVideo[i])) {
check(l1>=l2+4, "The decoder input buffer is not big enough (nal="+(l2+4)+", capacity="+l1+").");
decInputBuffers[decInputIndex].put(new byte[] {0,0,0,1});
decInputBuffers[decInputIndex].put(mVideo[i],0,mVideo[i].length);
} else if (!withPrefix && hasPrefix(mVideo[i])) {
check(l1>=l2-4, "The decoder input buffer is not big enough (nal="+(l2-4)+", capacity="+l1+").");
decInputBuffers[decInputIndex].put(mVideo[i],4,mVideo[i].length-4);
}
mDecoder.queueInputBuffer(decInputIndex, 0, l2, timestamp(), 0);
i++;
} else {
if (VERBOSE) Log.d(TAG,"No buffer available !7");
}
}
// Tries to get a decoded image
decOutputIndex = mDecoder.dequeueOutputBuffer(info, 1000000/FRAMERATE);
if (decOutputIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
decOutputBuffers = mDecoder.getOutputBuffers();
} else if (decOutputIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
mDecOutputFormat = mDecoder.getOutputFormat();
} else if (decOutputIndex>=0) {
if (n>2) {
// We have successfully encoded and decoded an image !
int length = info.size;
mDecodedVideo[j] = new byte[length];
decOutputBuffers[decOutputIndex].clear();
decOutputBuffers[decOutputIndex].get(mDecodedVideo[j], 0, length);
// Converts the decoded frame to NV21
convertToNV21(j);
if (j>=NB_DECODED-1) {
flushMediaCodec(mDecoder);
if (VERBOSE) Log.v(TAG, "Decoding "+n+" frames took "+elapsed/1000+" ms");
return elapsed;
}
j++;
}
mDecoder.releaseOutputBuffer(decOutputIndex, false);
n++;
}
elapsed = timestamp() - now;
}
throw new RuntimeException("The decoder did not decode anything.");
}
Here are my suggestions:
(1) Check the settings of the encoder and the decoder, and make sure that they match; for example, the resolution and color format must be the same (a sketch follows this list).
(2) Make sure the very first packet generated by the encoder has been sent and pushed into the decoder. This packet defines the basic settings of the video stream.
(3) The decoder usually buffers 5-10 frames, so the data in the buffer is invalid for a few hundred ms.
(4) While initializing the decoder, set the surface to null. Otherwise the output buffer will be read by the surface and probably released automatically.
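For point (1), the most reliable way I know of to make the two sides agree is to configure the decoder with the format the encoder itself reports once it has signalled INFO_OUTPUT_FORMAT_CHANGED, since that format already carries the csd-0/csd-1 buffers. A sketch, assuming the encoder instance is called mEncoder:
MediaFormat encoderOutputFormat = mEncoder.getOutputFormat();   // valid after INFO_OUTPUT_FORMAT_CHANGED
mDecoder = MediaCodec.createDecoderByType("video/avc");
mDecoder.configure(encoderOutputFormat, null, null, 0);         // null surface, per point (4), so buffers can be read back
mDecoder.start();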
I'm using the following code to prepare the hardware decoder. I expect outputBufferIndex to be -1 at first, followed by MediaCodec.INFO_OUTPUT_FORMAT_CHANGED; it shouldn't be >= 0 before the format change is reported.
I tested the code on 25 different devices, and 7 of them never return INFO_OUTPUT_FORMAT_CHANGED. mediaCodec.getOutputFormat() threw an IllegalStateException when I got outputBufferIndex >= 0. I have no idea whether it is a coincidence that all of the devices that didn't work were Android 4.2.2 with the OMX.qcom.video.decoder.avc decoder.
for (int i = 0; i < videoExtractor.getTrackCount(); i++) {
MediaFormat mediaFormat = videoExtractor.getTrackFormat(i);
String mime = mediaFormat.getString(MediaFormat.KEY_MIME);
if (mime.startsWith("video/")) {
videoExtractor.selectTrack(i);
videoCodec = MediaCodec.createDecoderByType(mediaFormat.getString(MediaFormat.KEY_MIME));
videoCodec.configure(mediaFormat, null, null, 0);
videoCodec.start();
}
}
ByteBuffer[] videoInputBuffers = videoCodec.getInputBuffers();
while (true) {
int sampleTrackIndex = videoExtractor.getSampleTrackIndex();
if (sampleTrackIndex == -1) {
break;
} else { // decode video
int inputBufferIndex = videoCodec.dequeueInputBuffer(0);
if (inputBufferIndex >= 0) {
int bytesRead = videoExtractor.readSampleData(videoInputBuffers[inputBufferIndex], 0);
if (bytesRead >= 0) {
videoCodec.queueInputBuffer(inputBufferIndex, 0, bytesRead,
videoExtractor.getSampleTime(), 0);
videoExtractor.advance();
}
}
MediaCodec.BufferInfo videoBufferInfo = new MediaCodec.BufferInfo();
int outputBufferIndex = videoCodec.dequeueOutputBuffer(videoBufferInfo, 0);
if (outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
MediaFormat format = videoCodec.getOutputFormat();
Log.w("video format changed: " + videoCodec.getOutputFormat());
//do something...
break;
} else if (outputBufferIndex >= 0) {
//not supposed to happen!
}
}
}
Thank you very much for the clues and the help!
In Android 4.3, a collection of MediaCodec tests were added to CTS. If you look at the way doEncodeDecodeVideoFromBuffer() works in EncodeDecodeTest, you can see that it expects the INFO_OUTPUT_FORMAT_CHANGED result before any data. If it doesn't get it, the call to checkFrame() will fail when it tries to get the color format. Prior to Android 4.3, there were no tests, and any behavior is possible.
Having said that, I don't recall seeing this behavior on the (Qualcomm-based) Nexus 4.
At any rate, I'm not sure how much this will actually hold you back, unless you're able to decode the proprietary buffer layout Qualcomm uses. You can see in that same checkFrame() function that it punts when it sees OMX_QCOM_COLOR_FormatYUV420PackedSemiPlanar64x32Tile2m8ka. Sending the output to a Surface may be a viable alternative depending on what you're up to.
Most of the MediaCodec code on bigflake and in Grafika targets API 18 (Android 4.3), because that's when the behavior became more predictable. (The availability of surface input and MediaMuxer is also of tremendous value.)
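In the meantime, a drain loop that does not depend on the ordering at all might look like this (a sketch reusing videoCodec from the question's code; it treats the format event as purely informational and takes whatever buffers show up):
MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
while (true) {
    int index = videoCodec.dequeueOutputBuffer(info, 10000);
    if (index == MediaCodec.INFO_TRY_AGAIN_LATER) {
        break;                                                   // nothing ready; go feed more input
    } else if (index == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
        MediaFormat newFormat = videoCodec.getOutputFormat();    // note it, but don't rely on when it arrives
    } else if (index == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
        // refresh getOutputBuffers() here if you access output buffers by index (pre-API-21 style)
    } else if (index >= 0) {
        // on the misbehaving 4.2.2 devices a frame can arrive before any format event
        videoCodec.releaseOutputBuffer(index, false);
    }
}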
I am trying to use MediaCodec to save a series of Images, saved as Byte Arrays in a file, to a video file. I have tested these images on a SurfaceView (playing them in series) and I can see them fine. I have looked at many examples using MediaCodec, and here is what I understand (please correct me if I am wrong):
Get InputBuffers from MediaCodec object -> fill it with your frame's image data -> queue the input buffer -> get coded output buffer -> write it to a file -> increase presentation time and repeat
However, I have tested this a lot and I end up with one of two cases:
All the sample projects I tried to imitate caused the media server to die when calling queueInputBuffer for the second time.
I tried calling codec.flush() at the end (after saving the output buffer to the file, although none of the examples I saw did this) and the media server did not die; however, I am not able to open the output video file with any media player, so something is wrong.
Here is my code:
MediaCodec codec = MediaCodec.createEncoderByType(MIMETYPE);
MediaFormat mediaFormat = null;
if(CamcorderProfile.hasProfile(CamcorderProfile.QUALITY_720P)){
mediaFormat = MediaFormat.createVideoFormat(MIMETYPE, 1280 , 720);
} else {
mediaFormat = MediaFormat.createVideoFormat(MIMETYPE, 720, 480);
}
mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 700000);
mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 10);
mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar);
mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);
codec.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
codec.start();
ByteBuffer[] inputBuffers = codec.getInputBuffers();
ByteBuffer[] outputBuffers = codec.getOutputBuffers();
boolean sawInputEOS = false;
int inputBufferIndex=-1,outputBufferIndex=-1;
BufferInfo info=null;
//loop to read YUV byte array from file
inputBufferIndex = codec.dequeueInputBuffer(WAITTIME);
if(bytesread<=0)sawInputEOS=true;
if(inputBufferIndex >= 0){
if(!sawInputEOS){
int samplesiz=dat.length;
inputBuffers[inputBufferIndex].put(dat);
codec.queueInputBuffer(inputBufferIndex, 0, samplesiz, presentationTime, 0);
presentationTime += 100;
info = new BufferInfo();
outputBufferIndex = codec.dequeueOutputBuffer(info, WAITTIME);
Log.i("BATA", "outputBufferIndex="+outputBufferIndex);
if(outputBufferIndex >= 0){
byte[] array = new byte[info.size];
outputBuffers[outputBufferIndex].get(array);
if(array != null){
try {
dos.write(array);
} catch (IOException e) {
e.printStackTrace();
}
}
codec.releaseOutputBuffer(outputBufferIndex, false);
inputBuffers[inputBufferIndex].clear();
outputBuffers[outputBufferIndex].clear();
if(sawInputEOS) break;
}
}else{
codec.queueInputBuffer(inputBufferIndex, 0, 0, presentationTime, MediaCodec.BUFFER_FLAG_END_OF_STREAM);
info = new BufferInfo();
outputBufferIndex = codec.dequeueOutputBuffer(info, WAITTIME);
if(outputBufferIndex >= 0){
byte[] array = new byte[info.size];
outputBuffers[outputBufferIndex].get(array);
if(array != null){
try {
dos.write(array);
} catch (IOException e) {
e.printStackTrace();
}
}
codec.releaseOutputBuffer(outputBufferIndex, false);
inputBuffers[inputBufferIndex].clear();
outputBuffers[outputBufferIndex].clear();
break;
}
}
}
}
codec.flush();
try {
fstream2.close();
dos.flush();
dos.close();
} catch (IOException e) {
e.printStackTrace();
}
codec.stop();
codec.release();
codec = null;
return true;
}
My question is: how can I get a working video from a stream of images using MediaCodec? What am I doing wrong?
Another question (if I am not being too greedy): I would like to add an audio track to this video. Can that be done with MediaCodec as well, or must I use FFmpeg?
Note: I know about MediaMuxer in Android 4.3; however, it is not an option for me, as my app must work on Android 4.1+.
Update
Thanks to fadden's answer, I was able to reach EOS without the media server dying (the code above is after modification). However, the file I am getting produces gibberish. Here is a snapshot of the video I get (it only plays as a .h264 file).
My input image format is YUV (NV21 from the camera preview). I can't get it into any playable format. I tried all the COLOR_FormatYUV420 formats and got the same gibberish output. And I still can't find a way (using MediaCodec) to add audio.
I think you have the right general idea. Some things to be aware of:
Not all devices support COLOR_FormatYUV420SemiPlanar. Some only accept planar. (Android 4.3 introduced CTS tests to ensure that the AVC codec supports one or the other.)
It's not the case that queueing an input buffer will immediately result in the generation of one output buffer. Some codecs may accumulate several frames of input before producing output, and may produce output after your input has finished. Make sure your loops take that into account (e.g. your inputBuffers[].clear() will blow up if it's still -1).
Don't try to submit data and send EOS with the same queueInputBuffer call. The data in that frame may be discarded. Always send EOS with a zero-length buffer (see the sketch at the end of this answer).
The output of the codecs is generally pretty "raw", e.g. the AVC codec emits an H.264 elementary stream rather than a "cooked" .mp4 file. Many players won't accept this format. If you can't rely on the presence of MediaMuxer you will need to find another way to cook the data (search around on stackoverflow for ideas).
It's certainly not expected that the mediaserver process would crash.
You can find some examples and links to the 4.3 CTS tests here.
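To make the EOS point concrete, here is a sketch in terms of the question's own variables (codec, WAITTIME, presentationTime): queue an empty buffer carrying BUFFER_FLAG_END_OF_STREAM, then keep draining until the flag comes back on the output side:
// Signal end-of-stream with a zero-length buffer, separate from any real data.
int eosIndex = codec.dequeueInputBuffer(WAITTIME);
if (eosIndex >= 0) {
    codec.queueInputBuffer(eosIndex, 0, 0, presentationTime, MediaCodec.BUFFER_FLAG_END_OF_STREAM);
}
// Drain until the codec echoes the EOS flag; it may still emit several buffered frames first.
// A robust version would also bail out after repeated INFO_TRY_AGAIN_LATER results.
MediaCodec.BufferInfo eosInfo = new MediaCodec.BufferInfo();
while (true) {
    int outIndex = codec.dequeueOutputBuffer(eosInfo, WAITTIME);
    if (outIndex >= 0) {
        // write eosInfo.size bytes from the output buffer here, then release it
        codec.releaseOutputBuffer(outIndex, false);
        if ((eosInfo.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
            break;                                    // everything has been flushed
        }
    }
}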
Update: As of Android 4.3, MediaCodec and Camera have no ByteBuffer formats in common, so at the very least you will need to fiddle with the chroma planes. However, that sort of problem manifests very differently (as shown in the images for this question).
The image you added looks like video, but with stride and/or alignment issues. Make sure your pixels are laid out correctly. In the CTS EncodeDecodeTest, the generateFrame() method (line 906) shows how to encode both planar and semi-planar YUV420 for MediaCodec.
The easiest way to avoid the format issues is to move the frames through a Surface (like the CameraToMpegTest sample), but unfortunately that's not possible in Android 4.1.
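And since the input in that question is NV21 from the camera preview: COLOR_FormatYUV420SemiPlanar on most encoders is NV12 (U before V), so at minimum the interleaved chroma bytes need swapping. A minimal sketch, ignoring the per-device stride and alignment issues mentioned above:
// NV21 (VU interleaved) from the camera preview to NV12 (UV interleaved),
// which is what COLOR_FormatYUV420SemiPlanar encoders generally expect.
static byte[] nv21ToNv12(byte[] nv21, int width, int height) {
    byte[] nv12 = new byte[nv21.length];
    int ySize = width * height;
    System.arraycopy(nv21, 0, nv12, 0, ySize);        // luma plane is identical
    for (int i = ySize; i < nv21.length; i += 2) {
        nv12[i] = nv21[i + 1];                        // U
        nv12[i + 1] = nv21[i];                        // V
    }
    return nv12;
}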