Decoding H.264 NALU stream - android

I am trying to get a frame out of an H.264 video stream on Android. I have a callback that receives (byte[] video, int size), where video is composed of NALU packets and size appears to be the payload length (logging shows both video.length and size are 1024). I'm using jcodec to try to decode a frame:
// Wrap the received payload and hand it to jcodec's H.264 decoder
ByteBuffer videoBuffer = ByteBuffer.wrap(video);
H264Decoder decoder = new H264Decoder();
// Target picture: 1920x1088 is 1080p padded up to a multiple of 16
Picture out = Picture.create(1920, 1088, ColorSpace.YUV420);
Picture real = decoder.decodeFrame(videoBuffer, out.getData());
Bitmap outputImage = AndroidUtil.toBitmap(real);
However, video is not the whole video; it's individual packets of it. How should I collect these packets so they can be decoded? How do I need to modify the code to actually get the full current frame?

You need to parse out each NALU segment from the payload you are receiving. Please see my previous answer:
MediaCodec crash on high quality stream
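A minimal sketch of that idea (the NaluAssembler class and its names are illustrative, not taken from the linked answer), assuming the stream uses Annex-B start codes (00 00 00 01): buffer the incoming 1024-byte packets and split them on start codes, so a NAL unit is only handed on once its end is known.

import java.io.ByteArrayOutputStream;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Hypothetical helper: accumulates raw packets and emits complete
// Annex-B NAL units (assumes 00 00 00 01 start codes).
public class NaluAssembler {
    private final ByteArrayOutputStream pending = new ByteArrayOutputStream();

    // Feed one callback payload; returns any NAL units completed so far.
    public List<byte[]> feed(byte[] packet, int size) {
        pending.write(packet, 0, size);
        byte[] data = pending.toByteArray();
        List<byte[]> units = new ArrayList<>();

        int start = indexOfStartCode(data, 0);
        while (start >= 0) {
            int next = indexOfStartCode(data, start + 4);
            if (next < 0) break;                       // last unit may still be incomplete
            units.add(Arrays.copyOfRange(data, start, next));
            start = next;
        }

        // Keep the (possibly partial) tail for the next packet.
        pending.reset();
        if (start >= 0) {
            pending.write(data, start, data.length - start);
        } else {
            // No start code yet: keep a few bytes in case one straddles packets.
            int keep = Math.min(3, data.length);
            pending.write(data, data.length - keep, keep);
        }
        return units;
    }

    private static int indexOfStartCode(byte[] d, int from) {
        for (int i = from; i + 3 < d.length; i++) {
            if (d[i] == 0 && d[i + 1] == 0 && d[i + 2] == 0 && d[i + 3] == 1) return i;
        }
        return -1;
    }
}

Each complete unit collected this way (once the SPS and PPS have been seen) can then be wrapped in a ByteBuffer and passed to the decoder; trying to decode a single 1024-byte packet on its own will not work.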

Related

MediaCodec encode video with inputSurface but drop frames

I'm looking for the fastest way to "decode, edit, encode" video on Android devices, so I chose MediaCodec with a Surface for input and output.
This is my idea:
1. decode the mp4 file with MediaCodec; the output is a SurfaceTexture;
2. use OpenGL to edit the surface; the output is a texture;
3. use MediaCodec to encode; the input is a Surface.
The problem is:
Decoding and editing are much faster than encoding, so by the time I have decoded and edited 50 frames, the encoder may have consumed only 10 of them. Because I use a Surface as the encoder's input, I don't know whether the encoder has consumed all previous frames, so the other 40 frames are lost.
Is there any way to know the Surface's consumption state so I can throttle the decode speed, or any other idea?
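One way to get that back-pressure (a sketch only; EncoderPacer and its names are made up here, not from the question): run the encoder in asynchronous mode and count frames in flight, so the decode/edit loop blocks once it gets too far ahead of the encoder.

import android.media.MediaCodec;
import android.media.MediaFormat;
import java.util.concurrent.Semaphore;

// Hypothetical back-pressure helper: cap how many frames have been rendered
// into the encoder's input Surface but not yet returned as encoded output.
public class EncoderPacer {
    private static final int MAX_IN_FLIGHT = 4;    // assumed small queue depth
    private final Semaphore inFlight = new Semaphore(MAX_IN_FLIGHT);

    public void install(MediaCodec encoder) {
        // Must be called before encoder.configure(...).
        encoder.setCallback(new MediaCodec.Callback() {
            @Override
            public void onOutputBufferAvailable(MediaCodec codec, int index,
                                                MediaCodec.BufferInfo info) {
                // One encoded frame is out (write codec.getOutputBuffer(index)
                // to your muxer here), then free a slot for the decode side.
                codec.releaseOutputBuffer(index, false);
                inFlight.release();
            }
            @Override
            public void onInputBufferAvailable(MediaCodec codec, int index) {
                // Not invoked when the encoder uses an input Surface.
            }
            @Override
            public void onError(MediaCodec codec, MediaCodec.CodecException e) { }
            @Override
            public void onOutputFormatChanged(MediaCodec codec, MediaFormat format) { }
        });
    }

    // Call before rendering each edited frame into the encoder's input Surface;
    // blocks while the encoder is more than MAX_IN_FLIGHT frames behind.
    public void beforeRender() throws InterruptedException {
        inFlight.acquire();
    }
}

With an input Surface there is no input-buffer callback, so the output callback is the only place the encoder reports progress; the semaphore turns that into a pacing signal for the decode/edit loop.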

How to get Android MediaCodec to return a decoded frame immediately after feeding every frame

I am following Google's basic media decoder example to design my decoder. The decoder will not return a decoded frame until a few input buffers have been filled. I guess this is fine for most applications; however, mine requires zero latency, meaning each frame must be decoded before the next frame is fed. Is there any way to force the decoder to decode a frame? I tried setting EOS on every frame, but the flush() that follows clears the internal decoder state, leaving P-frames undecodable.
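One knob worth trying on newer devices (not part of the original question; a sketch assuming API 30+): configure the decoder with the low-latency key so it emits each frame without queuing several inputs first. Support is codec-dependent, and older devices simply ignore or reject the key.

import android.media.MediaCodec;
import android.media.MediaFormat;
import android.os.Build;
import android.view.Surface;
import java.io.IOException;

// Sketch: request low-latency H.264 decoding where the codec supports it.
public final class LowLatencyDecoder {
    public static MediaCodec create(int width, int height, Surface output)
            throws IOException {
        MediaFormat format = MediaFormat.createVideoFormat(
                MediaFormat.MIMETYPE_VIDEO_AVC, width, height);
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.R) {
            format.setInteger(MediaFormat.KEY_LOW_LATENCY, 1);   // API 30+ hint
        }
        MediaCodec decoder = MediaCodec.createDecoderByType(
                MediaFormat.MIMETYPE_VIDEO_AVC);
        decoder.configure(format, output, null, 0);
        decoder.start();
        return decoder;
    }
}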

How to extract 24 different frames per second from video?

I want to create a GIF from an mp4 video, so I need to extract the frames from the video first. Here is the code I use to extract frames:
MediaMetadataRetriever retriever = new MediaMetadataRetriever();
retriever.setDataSource(mFilePath);
Bitmap bitmap = retriever.getFrameAtTime(i, MediaMetadataRetriever.OPTION_CLOSEST);
Note that the variable i is a time in microseconds. Since I want 24 frames/second, I call retriever.getFrameAtTime() with i = 42000, 84000, ... (microseconds).
The problem is: when I assemble the extracted frames back into a video, I see only 4-5 distinct frames; in other words, I don't get a smooth video. It seems MediaMetadataRetriever often returns the same frame for different requested times. Please help me!
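One workaround on API 28+ (a sketch; FrameGrabber and its names are illustrative): ask MediaMetadataRetriever for frames by index instead of by timestamp, so each call maps to a distinct decoded frame rather than whatever frame the device resolves the timestamp to.

import android.graphics.Bitmap;
import android.media.MediaMetadataRetriever;
import java.util.ArrayList;
import java.util.List;

// Sketch, API 28+ only: extract every Nth decoded frame by index.
public final class FrameGrabber {
    public static List<Bitmap> extract(String filePath, int step) {
        MediaMetadataRetriever retriever = new MediaMetadataRetriever();
        retriever.setDataSource(filePath);
        int frameCount = Integer.parseInt(retriever.extractMetadata(
                MediaMetadataRetriever.METADATA_KEY_VIDEO_FRAME_COUNT));

        List<Bitmap> frames = new ArrayList<>();
        for (int i = 0; i < frameCount; i += step) {    // step = 1 keeps every frame
            frames.add(retriever.getFrameAtIndex(i));
        }
        try {
            retriever.release();    // declares IOException on newer SDKs
        } catch (Exception ignored) {
        }
        return frames;
    }
}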

Is it possible to feed MediaCodec byte arrays received from a server and show them on a SurfaceView

Hi everyone. How can I do this when I receive the video stream frame by frame from a server? I have the PPS and SPS and configure MediaCodec with these parameters, and I also have the width and height. I'd be glad of any help. This example shows how I did it:
Mediacodec, decode byte packet from server and render it on surface
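For reference, the usual shape of that setup looks roughly like the sketch below (StreamDecoder and its parameter names are illustrative, not from the question): pass the SPS and PPS you already have as csd-0/csd-1, decode onto the SurfaceView's Surface, and queue each byte[] from the server as one input buffer.

import android.media.MediaCodec;
import android.media.MediaFormat;
import android.view.Surface;
import java.io.IOException;
import java.nio.ByteBuffer;

// Sketch: configure an H.264 decoder from SPS/PPS and render to a Surface.
public final class StreamDecoder {
    private final MediaCodec decoder;

    public StreamDecoder(byte[] sps, byte[] pps, int width, int height,
                         Surface surface) throws IOException {
        MediaFormat format = MediaFormat.createVideoFormat(
                MediaFormat.MIMETYPE_VIDEO_AVC, width, height);
        format.setByteBuffer("csd-0", ByteBuffer.wrap(sps));    // SPS (with start code)
        format.setByteBuffer("csd-1", ByteBuffer.wrap(pps));    // PPS (with start code)
        decoder = MediaCodec.createDecoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
        decoder.configure(format, surface, null, 0);            // surface = SurfaceView's Surface
        decoder.start();
    }

    // Queue one NAL unit / access unit received from the server.
    public void feed(byte[] frame, long presentationTimeUs) {
        int inIndex = decoder.dequeueInputBuffer(10_000);
        if (inIndex < 0) return;                                 // no input buffer free right now
        ByteBuffer in = decoder.getInputBuffer(inIndex);
        in.clear();
        in.put(frame);
        decoder.queueInputBuffer(inIndex, 0, frame.length, presentationTimeUs, 0);

        // Release any decoded frames to the Surface for display.
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        int outIndex = decoder.dequeueOutputBuffer(info, 0);
        while (outIndex >= 0) {
            decoder.releaseOutputBuffer(outIndex, true);         // true = render to Surface
            outIndex = decoder.dequeueOutputBuffer(info, 0);
        }
    }
}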

How to display a decoded video stream (from a VP8 software decoder), stored in a buffer, on a SurfaceView

I am working on a project like Skype for Android.
I have implemented decoding of the received data and display using SurfaceView and MediaCodec.
But many devices support only H.264 decoding, so I have a software decoder for VP8 video streams that decodes the data and puts it into a global buffer. Now I want to display that decoded stream using a SurfaceView (it was done with OpenGL, but that is very slow).
However, I can't work out how to do it with SurfaceView.
Does anyone have an idea of how to render a decoded video stream onto the Surface of a SurfaceView?
Thanks a lot.
I found the answer:
// Copy one decoded ARGB frame into a Bitmap
bitmapSurface = Bitmap.createBitmap(320, 240, Bitmap.Config.ARGB_8888);
bitmapSurface.copyPixelsFromBuffer(ByteBuffer.wrap(argb_frame));
// Lock the SurfaceView's canvas, scale the frame to the view size, draw and post it
c = myVideoSurfaceView.getHolder().lockCanvas();
bitmapSurface = Bitmap.createScaledBitmap(bitmapSurface, surflp.width, surflp.height, true);
c.drawBitmap(bitmapSurface, 0, 0, null);
myVideoSurfaceView.getHolder().unlockCanvasAndPost(c);
