I followed the steps in the link below to get the gralloc buffer, but how do I get the size of the buffer?
How to dump YUV from OMXCodec decoding output
For testing I took the length as width x height x 1.5, since the OMXCodec decoder output format was OMX_QCOM_420Planer32m.
But when I write the YUV frame to a file, my YUV viewer is not able to render it. I then tried the length from range_length(), with the same result.
I also converted the file to JPEG; it is not correct either, since the YUV itself is wrong.
Please help me: how do I dump the YUV buffer to a file? I'm testing on KitKat (Moto G) and on ICS (Samsung tablet).
Thank you,
Raghu
As noted in links from the article you linked to, the buffer is not in a simple planar YUV format, but rather in a Qualcomm-proprietary format.
You need code that knows how to decode it. The accepted answer to this question seems to have it, though I don't know how stable the format definitions are, and there are some comments suggesting the code as posted isn't quite right.
I'm trying to capture Android's views as bitmaps and save them as .mp4 file.
I'm using MediaCodec to encode bitmaps and MediaMuxer to mux them into .mp4.
Using YUV420p color format I expect input buffers from MediaCodec to be of size resWidth * resHeight * 1.5 but Qualcomm's OMX.qcom.video.encoder.avc gives me more than that (no matter what resolution I choose). I believe that it wants me to do some alignment in my input byte stream but I have no idea how to find out what exactly it expects me to do.
This is what I get when I pack my data tightly in input buffers on Nexus 7 (2013) using Qualcomm's codec: https://www.youtube.com/watch?v=JqJD5R8DiC8
And this video is made by the very same app run on a Nexus 10 (codec OMX.Exynos.AVC.Encoder): https://www.youtube.com/watch?v=90RDXAibAZI
So it looks like the luma plane is alright in the faulty video, but what happened to the chroma plane is a mystery to me.
I prepared a minimal (2 classes) working code example exposing this issue: https://github.com/eeprojects/MediaCodecExample
You can reproduce the videos shown above just by running this app (there will be the same artefacts if your device uses Qualcomm's codec).
There are multiple ways of storing YUV 4:2:0 in buffers; you need to check the individual pixel format you chose. MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar and MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420PackedPlanar are in practice the same, called planar or I420 for short. The others, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420PackedSemiPlanar and MediaCodecInfo.CodecCapabilities.COLOR_TI_FormatYUV420PackedSemiPlanar, are called semiplanar or NV12.
In semiplanar, you don't have two separate planes for U and V, but one single plane with interleaved U,V pairs.
See https://android.googlesource.com/platform/cts/+/jb-mr2-release/tests/tests/media/src/android/media/cts/EncodeDecodeTest.java (lines 925-949) for an example of how to fill in the buffer for the semiplanar formats.
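To make the layout difference concrete, here is a minimal sketch (my own illustration, not code from the linked test) of repacking tightly packed I420 source planes into the semiplanar layout; the helper name is hypothetical and it assumes neither side has any row padding:

// Hypothetical helper: repack tightly packed I420 (planar) data into the
// NV12-style layout expected by the semiplanar formats.
// Assumes no stride/row padding on either side.
static void fillSemiPlanar(byte[] i420, int width, int height, java.nio.ByteBuffer out) {
    int ySize = width * height;
    int cSize = ySize / 4;                  // each chroma plane is (w/2) x (h/2)
    out.put(i420, 0, ySize);                // luma plane is copied as-is
    for (int i = 0; i < cSize; i++) {
        out.put(i420[ySize + i]);           // U sample
        out.put(i420[ySize + cSize + i]);   // V sample
    }
}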
Hello, I have a really difficult question. I have a Nexus 6 and I want to get the preview stream in RAW format (ImageFormat.RAW_SENSOR) with the camera2 API. Is this even possible?
I am using the android-Camera2Raw sample (https://github.com/googlesamples/android-Camera2Raw).
Currently, this is not possible. Please see this table for a list of formats possible as a stream: https://developer.android.com/reference/android/hardware/camera2/CameraCharacteristics.html#SCALER_STREAM_CONFIGURATION_MAP
Your most "raw" choice would be turning off noise reduction (if supported by the hardware) in your CaptureRequest.Builder, like so:
builder.set(CaptureRequest.NOISE_REDUCTION_MODE, CaptureRequest.NOISE_REDUCTION_MODE_OFF);
If frame rate isn't an issue, you could send repeated CaptureRequests for RAW images, and on your ImageReader.OnImageAvailableListener() process the RAW, convert it to a Bitmap, then place that into an ImageView, but that approach is exactly as impractical as it sounds :)
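A rough sketch of that (admittedly impractical) approach, assuming you already have an open CameraDevice (cameraDevice), a CameraCaptureSession (captureSession) configured with the ImageReader's surface as a target, and a background Handler; whether a repeating RAW request works at all depends on the device:

// Request RAW_SENSOR frames repeatedly and receive them in an ImageReader.
ImageReader reader = ImageReader.newInstance(rawWidth, rawHeight,
        ImageFormat.RAW_SENSOR, /*maxImages=*/ 2);
reader.setOnImageAvailableListener(r -> {
    Image image = r.acquireLatestImage();
    if (image == null) return;
    // ... convert the RAW data to a Bitmap and hand it to an ImageView ...
    image.close();
}, backgroundHandler);

try {
    CaptureRequest.Builder builder =
            cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
    builder.addTarget(reader.getSurface());
    builder.set(CaptureRequest.NOISE_REDUCTION_MODE,
            CaptureRequest.NOISE_REDUCTION_MODE_OFF);
    captureSession.setRepeatingRequest(builder.build(), null, backgroundHandler);
} catch (CameraAccessException e) {
    e.printStackTrace();
}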
I am using libstagefright on Android to decode a 1080p video with High profile, level 3.1.
On the emulator the video decoding fails, and I assume the reason is that SoftAVC does not support the High profile, according to the code here.
But I tested this code on a real device which uses the OMX.MTK.VIDEO.DECODER.AVC decoder, and according to this link, this decoder supports High profile decoding at level 3.1.
But the video result is garbled.
Does anyone have any insight, why this is so? And what could be the possible solution?
There could be three reasons for garbled output:
1. The decoder employs a stride which you haven't factored into your calculations. Recommended solution: check the OMX component's output port parameters, look for the stride, and make suitable modifications (see the plane-copy sketch below).
2. I presume you are decoding 1080p; please use 1088 instead of 1080 in your calculations. This can be confirmed if your output has clear luma but jumbled chroma. Does 720p play fine for you?
3. If neither of the above two conditions is true, the decoder may be outputting a tiled format. Please check the vendor's specifications. If this is the case, you will need to convert from the tiled format to a more common format like NV12.
EDIT: I think tiling is your problem. Please check this GitHub commit, which is related to your problem and has a solution for the color conversion.
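For case 1 (stride), here is a minimal sketch of copying one strided decoder output plane into a tightly packed buffer; the method name and parameters are mine, and the stride value is whatever the OMX output port reports:

// Copy a strided plane into a tightly packed destination. 'stride' is the
// row pitch reported by the decoder's output port; only the first 'width'
// bytes of each row carry pixel data.
static void copyPlane(byte[] src, int srcOffset, int stride,
                      int width, int height, byte[] dst, int dstOffset) {
    for (int row = 0; row < height; row++) {
        System.arraycopy(src, srcOffset + row * stride,
                         dst, dstOffset + row * width, width);
    }
}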
I presume you are talking about videos with a resolution of 1920 x 1080. It is recommended to align the decoded buffer width and height to the nearest multiple of 128 and 32 respectively to avoid garbled output.
ALIGN(decoded_buffer_width, 128)
ALIGN(decoded_buffer_height, 32)
So you should use 1920 x 1088 in your calculations.
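A small sketch of that alignment arithmetic (the helper name is mine; the 128/32 values are the ones recommended above):

// Align 'value' up to the next multiple of 'alignment' (a power of two).
static int align(int value, int alignment) {
    return (value + alignment - 1) & ~(alignment - 1);
}

int bufferWidth  = align(1920, 128);   // 1920 (already a multiple of 128)
int bufferHeight = align(1080, 32);    // 1088
int lumaSize     = bufferWidth * bufferHeight;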
I am passing the output of a MediaExtractor into a MediaCodec decoder, and then passing the decoder's output buffer to an encoder's input buffer. The problem I have is that I need to reduce the resolution from the 1920x1080 output from the decoder to 1280x720 by the time it comes out of the encoder. I can do this using a Surface, but I am trying to target Android 4.1 so will need to achieve this another way. Does anyone know how to change the resolution of a video file using MediaCodec but in a way that is compatible with 4.1?
You can use libswscale from libav/ffmpeg, or libyuv, or any other YUV handling library, or write your own downscaling routine - it's not very hard actually.
Basically, when you feed the output from the decoder output buffer into the encoder input buffer, you already can't assume you can do a plain copy, because the two may use different color formats. So to be flexible, your code for copying data already needs to be able to convert any supported decoder output color format into any supported encoder input color format. In this copy step, you can just scale down the data. A trivial nearest neighbor downscale is very simple to implement (see the sketch below); better-looking scaling requires a bit more work.
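A sketch of such a nearest neighbor downscale for one plane, assuming both source and destination are tightly packed (real code would also have to account for the decoder's stride and color format):

// Nearest-neighbor downscale of a single tightly packed plane. Call it once
// for Y at full size and once each for U and V at half width/height (I420).
static void scalePlaneNearest(byte[] src, int srcW, int srcH,
                              byte[] dst, int dstW, int dstH) {
    for (int y = 0; y < dstH; y++) {
        int srcY = y * srcH / dstH;
        for (int x = 0; x < dstW; x++) {
            dst[y * dstW + x] = src[srcY * srcW + (x * srcW / dstW)];
        }
    }
}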
You don't need to do a full SW decode/encode, you can just use SW to adjust the data in the intermediate copy step. But as fadden pointed out, MediaCodec isn't completely stable prior to 4.3 anyway, so it may still not work on all devices.
I'm trying to stream video in Android through ffmpeg. The output I am getting after decoding is in YUV format. Is it possible to render a YUV image directly to the Android screen?
Yes and no.
The output of the camera and hardware video decoders is generally YUV. Frames from these sources are generally sent directly to the display. They may be converted by the driver, typically with a hardware scaler and format converter. This is necessary for efficiency.
There isn't an API to allow an app to pass YUV frames around the same way. The basic problem is that "YUV" covers a lot of ground. The buffer format used by the video decoder may be a proprietary internal format that the various hardware modules can process efficiently; for your app to create a surface in this format, it would have to perform a conversion, and you're right back where you were performance-wise.
You should be able to use GLES2 shaders to do the conversion for you on the way to the display, but I don't have a pointer to code that demonstrates this.
Update: an answer to this question has a link to a WebRTC source file that demonstrates doing the YUV conversion in a GLES2 shader.
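For reference, a rough sketch of the kind of GLES2 fragment shader involved, assuming the Y, U and V planes are uploaded as three separate single-channel textures and using one common set of BT.601-style coefficients (the names and details here are illustrative, not taken from the WebRTC code):

// GLES2 fragment shader (as a Java string) that samples three single-channel
// textures (Y, U, V) and converts the result to RGB.
static final String YUV_FRAGMENT_SHADER =
        "precision mediump float;\n" +
        "varying vec2 vTexCoord;\n" +
        "uniform sampler2D uTexY;\n" +
        "uniform sampler2D uTexU;\n" +
        "uniform sampler2D uTexV;\n" +
        "void main() {\n" +
        "    float y = texture2D(uTexY, vTexCoord).r;\n" +
        "    float u = texture2D(uTexU, vTexCoord).r - 0.5;\n" +
        "    float v = texture2D(uTexV, vTexCoord).r - 0.5;\n" +
        "    gl_FragColor = vec4(y + 1.402 * v,\n" +
        "                        y - 0.344 * u - 0.714 * v,\n" +
        "                        y + 1.772 * u,\n" +
        "                        1.0);\n" +
        "}\n";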