MediaCodec encodes video with input Surface but drops frames - Android

I'm looking for the fastest way to decode-edit-encode video on Android devices, so I chose MediaCodec with Surface input and output.
This is my idea:
1. decode the MP4 file with MediaCodec; the output goes to a SurfaceTexture;
2. edit the frames with OpenGL ES; the output is a texture;
3. encode with MediaCodec, using a Surface as input.
The problem is:
Decoding and editing are much faster than encoding, so by the time I have decoded and edited 50 frames, the encoder may have consumed only 10 of them. But because I use a Surface as the encoder's input, I have no way to tell whether the encoder has consumed all the previous frames, so the other 40 frames are lost.
Is there any way to know the Surface's consumption state so that I can throttle the decoding speed, or is there a better approach?
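
For context, the usual way to keep the stages in lock-step is to drain every pending encoder output buffer after each frame is rendered into the input Surface, before releasing the next decoded frame; this is roughly the pattern the bigflake decode-edit-encode examples use. A minimal sketch, assuming an encoder and MediaMuxer set up elsewhere (encoder, muxer and trackIndex are placeholders, not code from the question):

import java.nio.ByteBuffer;
import android.media.MediaCodec;
import android.media.MediaMuxer;

// Call after every frame rendered into the encoder's input Surface,
// so decode/edit can never run ahead of the encoder.
static void drainEncoder(MediaCodec encoder, MediaMuxer muxer, int trackIndex) {
    MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
    while (true) {
        int index = encoder.dequeueOutputBuffer(info, 10_000 /* us */);
        if (index == MediaCodec.INFO_TRY_AGAIN_LATER) {
            break; // the encoder has consumed everything it was given so far
        } else if (index >= 0) {
            ByteBuffer encoded = encoder.getOutputBuffer(index);
            // Skip the codec-config buffer; a real pipeline handles
            // INFO_OUTPUT_FORMAT_CHANGED with addTrack()/start() instead.
            if (info.size > 0 && (info.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) == 0) {
                muxer.writeSampleData(trackIndex, encoded, info);
            }
            encoder.releaseOutputBuffer(index, false);
            if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) break;
        }
    }
}

Calling this between releaseOutputBuffer(decoderIndex, true) and the next decode keeps the producer from outrunning the BufferQueue behind the encoder's input Surface.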

Related

How to get Android MediaCodec to return a decoded frame immediately after feeding each frame

I am following Google's basic media decoder example to design my decoder. The decoder does not return a decoded frame until a few input buffers have been filled. I guess this is fine for most applications; however, my application requires zero latency, meaning each frame must be decoded before the next frame is fed. Is there any way to force the decoder to decode a frame? I tried setting EOS on every frame, but the flush() that has to follow clears the internal decoder state, leaving P-frames undecodable.
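
As an aside, not from the original post: on API 30+ you can at least request low-latency operation and then poll for output after each input. A hedged sketch; whether a frame really comes back per input is still decoder-dependent:

import java.nio.ByteBuffer;
import android.media.MediaCodec;
import android.media.MediaFormat;
import android.os.Build;

// Request low-latency decoding where supported (best effort only).
static void configureLowLatency(MediaFormat format) {
    if (Build.VERSION.SDK_INT >= 30) {
        format.setInteger(MediaFormat.KEY_LOW_LATENCY, 1);
    }
}

// Feed one access unit, then poll for output before feeding the next.
static void feedAndDrainOne(MediaCodec decoder, byte[] accessUnit, long ptsUs) {
    int in = decoder.dequeueInputBuffer(10_000);
    if (in >= 0) {
        ByteBuffer buf = decoder.getInputBuffer(in);
        buf.clear();
        buf.put(accessUnit);
        decoder.queueInputBuffer(in, 0, accessUnit.length, ptsUs, 0);
    }
    MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
    int out = decoder.dequeueOutputBuffer(info, 10_000);
    if (out >= 0) {
        decoder.releaseOutputBuffer(out, /* render= */ true);
    }
}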

MediaCodec decoder output is empty

My simple app must show a Camera preview, which I created through a SurfaceView. As frames become available (through the
onPreviewFrame(byte[] data, Camera camera)
callback), a secondary thread sub-samples the frames and uses a MediaCodec encoder instance to encode them to MP4. As soon as encoded frames are available, a second MediaCodec decoder instance decodes them back into raw frames. When I call
decoder.dequeueOutputBuffer(info2, TIMEOUT_USEC)
the BufferInfo's offset and size are always equal to 0! How is that possible? I have already checked the timestamps, and they are never duplicated, so no frames are being discarded.
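
One way to narrow this down is to log the BufferInfo flags on every output buffer: zero-size buffers often carry only metadata such as the end-of-stream flag, and a common culprit for empty output is a decoder that never received the encoder's codec-config (SPS/PPS) buffer. A diagnostic sketch, with decoder and TIMEOUT_USEC as in the question:

import android.media.MediaCodec;
import android.util.Log;

// Inspect what the decoder is actually returning.
MediaCodec.BufferInfo info2 = new MediaCodec.BufferInfo();
int outIndex = decoder.dequeueOutputBuffer(info2, TIMEOUT_USEC);
if (outIndex >= 0) {
    boolean config = (info2.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0;
    boolean eos = (info2.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0;
    Log.d("Decoder", "size=" + info2.size + " pts=" + info2.presentationTimeUs
            + " config=" + config + " eos=" + eos);
    decoder.releaseOutputBuffer(outIndex, false);
} else if (outIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
    Log.d("Decoder", "new format: " + decoder.getOutputFormat());
}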

Decoding H.264 NALU stream

I am trying to get a frame out of a video stream in H.264 format on Android. I have a callback function that receives (byte[] video, int size), where video is composed of NALU packets and the size int seems to be the size of the payload (I've been logging them, and both video.length and size are always 1024). I'm using jcodec to try to decode a frame:
// Wrap the received packet and hand it to jcodec's H.264 decoder
ByteBuffer videoBuffer = ByteBuffer.wrap(video);
H264Decoder decoder = new H264Decoder();
// Target picture; 1088 is 1080 rounded up to a multiple of 16 (macroblocks)
Picture out = Picture.create(1920, 1088, ColorSpace.YUV420);
Picture real = decoder.decodeFrame(videoBuffer, out.getData());
Bitmap outputImage = AndroidUtil.toBitmap(real);
However, video is not the entire video; it's individual packets of the video. How should I collect these packets in order to decode them, and how do I need to modify the code to actually get the full current frame?
You need to parse out each NALU segment from the payload you are receiving. Please see my previous answer:
MediaCodec crash on high quality stream
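
For illustration, parsing out the NAL units from an Annex-B stream amounts to scanning for 00 00 01 / 00 00 00 01 start codes; the incoming 1024-byte chunks have to be accumulated first, since a single NAL unit can span several of them. A sketch:

import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Split an accumulated Annex-B byte stream into NAL units.
static List<byte[]> splitNalUnits(byte[] stream) {
    List<byte[]> units = new ArrayList<>();
    int start = -1;
    for (int i = 0; i + 3 <= stream.length; i++) {
        boolean threeByte = stream[i] == 0 && stream[i + 1] == 0 && stream[i + 2] == 1;
        boolean fourByte = i + 4 <= stream.length && stream[i] == 0
                && stream[i + 1] == 0 && stream[i + 2] == 0 && stream[i + 3] == 1;
        if (threeByte || fourByte) {
            if (start >= 0) {
                units.add(Arrays.copyOfRange(stream, start, i)); // previous unit
            }
            start = i;             // keep the start code with the unit
            i += fourByte ? 3 : 2; // jump past the start code
        }
    }
    if (start >= 0) {
        units.add(Arrays.copyOfRange(stream, start, stream.length)); // last unit
    }
    return units;
}

Each complete unit (SPS and PPS first, then the slices of a frame) can then be handed to the decoder once a whole frame has been collected.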

Is it possible to feed MediaCodec byte arrays received from a server and show them on a SurfaceView?

How can I do this if I receive the stream video frame by frame from a server? I have the PPS and SPS and configure MediaCodec with these parameters; I also have the width and height. I'd be glad of any help. This example shows how I did it:
MediaCodec: decode byte packets from the server and render them on a Surface
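
For reference, configuring an AVC decoder from SPS/PPS and rendering straight to a SurfaceView typically looks something like the sketch below; sps and pps are assumed to include their start codes, and the Surface comes from the SurfaceView's holder:

import java.nio.ByteBuffer;
import android.media.MediaCodec;
import android.media.MediaFormat;
import android.view.Surface;

// Build a decoder from out-of-band SPS/PPS and render to a Surface.
static MediaCodec createDecoder(byte[] sps, byte[] pps, int width, int height,
                                Surface surface) throws Exception {
    MediaFormat format = MediaFormat.createVideoFormat(
            MediaFormat.MIMETYPE_VIDEO_AVC, width, height);
    format.setByteBuffer("csd-0", ByteBuffer.wrap(sps)); // codec-specific data: SPS
    format.setByteBuffer("csd-1", ByteBuffer.wrap(pps)); // codec-specific data: PPS
    MediaCodec decoder = MediaCodec.createDecoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
    decoder.configure(format, surface, null, 0);
    decoder.start();
    return decoder;
}

Each received packet then goes in via queueInputBuffer(), and releaseOutputBuffer(index, true) renders the decoded frame onto the Surface.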

Very difficult to grab YUV buffer after picture taken

I am trying to implement a camera app, but it looks like only onJpegTaken in PictureCallback receives a buffer, while onRaw and onPostView receive null. And getSupportedPictureFormats always returns 256 (ImageFormat.JPEG), so there is no hope of getting YUV directly. If that is the case, I guess that if I want to process the large image after it is taken, I can only run it through a JPEG decoder first.
Update: it seems NV16 is available in the takePicture callback, and even if a YUV buffer is not available, libjpeg is still there for the codec work.
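
For completeness, checking what the device actually offers before relying on NV16 looks something like this with the old android.hardware.Camera API used above (a sketch; camera is an open Camera instance):

import java.util.List;
import android.graphics.ImageFormat;
import android.hardware.Camera;
import android.util.Log;

// List the supported picture formats and request NV16 only if reported.
static void requestNv16IfSupported(Camera camera) {
    Camera.Parameters params = camera.getParameters();
    List<Integer> formats = params.getSupportedPictureFormats();
    for (int fmt : formats) {
        Log.d("Cam", "supported picture format: " + fmt); // 256 == ImageFormat.JPEG
    }
    if (formats.contains(ImageFormat.NV16)) {
        params.setPictureFormat(ImageFormat.NV16);
        camera.setParameters(params);
    }
}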
