I'm trying to develop an Android application that encodes an OpenCV array of Mats (1200x1200, 4-channel images) to an MP4 video using Android MediaCodec. The problem I'm facing is that on the emulator (which uses the YUV420P color format; I'm using OpenCV's COLOR_BGRA2YUV_I420 conversion) the video output colors match the array of images, but on real Android devices the colors are completely different. I debugged this and found that my devices use the YUV420SP color format. Since there is no built-in OpenCV conversion from RGBA/BGRA to YUV420SP, I convert the image to YUV_YV12 and then to YUV420SP/NV21 using the code below:
public byte[] YV12toNV21(final byte[] input, final int width, final int height) {
    // Allocate a separate output buffer; writing into the input array in place
    // corrupts the V samples before they have been read.
    byte[] output = new byte[input.length];
    final int size = width * height;
    final int quarter = size / 4;
    final int vPosition = size;            // This is where V starts
    final int uPosition = size + quarter;  // This is where U starts

    System.arraycopy(input, 0, output, 0, size); // Y is the same
    for (int i = 0; i < quarter; i++) {
        output[size + i * 2]     = input[vPosition + i]; // For NV21, V first
        output[size + i * 2 + 1] = input[uPosition + i]; // For NV21, U second
    }
    return output;
}
But I'm still facing the same issue.
This is the original RGBA picture:
This is the output of the video:
UPDATE:
After swapping U and V:
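One thing worth checking before converting at all is which raw input formats the device's encoder actually advertises, instead of hard-coding YUV420P vs YUV420SP per device. A minimal sketch (the helper name and the "video/avc" MIME type are assumptions for this example, adjust for your codec):
import android.media.MediaCodecInfo;
import android.media.MediaCodecList;
import android.util.Log;

// Logs which YUV 4:2:0 input layouts the device's H.264 encoders accept.
public static void logAvcEncoderColorFormats() {
    MediaCodecList list = new MediaCodecList(MediaCodecList.REGULAR_CODECS);
    for (MediaCodecInfo info : list.getCodecInfos()) {
        if (!info.isEncoder()) continue;
        for (String type : info.getSupportedTypes()) {
            if (!type.equalsIgnoreCase("video/avc")) continue;
            MediaCodecInfo.CodecCapabilities caps = info.getCapabilitiesForType(type);
            for (int format : caps.colorFormats) {
                if (format == MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar) {
                    Log.d("CodecFormats", info.getName() + " accepts YUV420Planar (I420)");
                } else if (format == MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar) {
                    Log.d("CodecFormats", info.getName() + " accepts YUV420SemiPlanar (NV12-style)");
                }
            }
        }
    }
}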
Related
I am developing a custom Camera2 API app, and I notice that the capture format conversion differs between devices when I use the ImageReader callback.
For example, on the Nexus 4 it doesn't work correctly, while on the Nexus 5X it looks OK; here is the output.
I initialize the ImageReader like this:
mImageReader = ImageReader.newInstance(320, 240, ImageFormat.YUV_420_888, 2);
And my callback is a simple ImageReader callback:
mOnImageAvailableListener = new ImageReader.OnImageAvailableListener() {
    @Override
    public void onImageAvailable(ImageReader reader) {
        try {
            mBackgroundHandler.post(
                    new ImageController(reader.acquireNextImage())
            );
        } catch (Exception e) {
            // exception
        }
    }
};
And in the case of the Nexus 4, I get this error:
D/qdgralloc: gralloc_lock_ycbcr: Invalid format passed: 0x32315659
When I write the raw file on both devices, I get different images. From this I understand that the Nexus 5X image uses NV21 encoding and the Nexus 4 uses YV12 encoding.
I found the image format specification and tried to query the format from the ImageReader.
There are YV12 and NV21 options, but of course I just get YUV_420_888 when I query it:
int test=mImageReader.getImageFormat();
So is there any way to get the underlying camera format (NV21 or YV12) so I can discriminate between these encodings in the camera class? CameraCharacteristics maybe?
Thanks in advance.
Unai.
PS: I use OpenGL for displaying RGB images, and I use OpenCV to make the conversions to YUV_420_888.
YUV_420_888 is a wrapper that can host (among others) both NV21 and YV12 images. You must use the planes and their strides to access the individual color components:
Image.Plane Y = image.getPlanes()[0];
Image.Plane U = image.getPlanes()[1];
Image.Plane V = image.getPlanes()[2];
If the underlying pixels are in NV21 format (as on the Nexus 4), the pixelStride of the chroma planes will be 2, and
int getU(Image image, int col, int row) {
    return getPixel(image.getPlanes()[1], col / 2, row / 2);
}

int getPixel(Image.Plane plane, int col, int row) {
    return plane.getBuffer().get(col * plane.getPixelStride() + row * plane.getRowStride()) & 0xff;
}
We take half the column and half the row because this is how the U and V (chroma) planes are subsampled in a 4:2:0 image.
This code is for illustration only and is very inefficient; you probably want to access pixels in bulk using get(byte[], int, int), via a fragment shader, or via the JNI function GetDirectBufferAddress in native code. What you cannot use is plane.getBuffer().array(), because these planes are guaranteed to be direct byte buffers.
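As an illustration of the bulk approach, here is a minimal sketch (a helper written for this answer, not part of the Android API) that walks the three planes using their strides and packs the result into a plain NV21 byte array, whatever the underlying layout is:
import android.media.Image;
import java.nio.ByteBuffer;

public static byte[] yuv420888ToNv21(Image image) {
    int w = image.getWidth();
    int h = image.getHeight();
    byte[] nv21 = new byte[w * h * 3 / 2];

    // Luma: copy row by row to honour the row stride.
    Image.Plane yPlane = image.getPlanes()[0];
    ByteBuffer yBuf = yPlane.getBuffer();
    for (int row = 0; row < h; row++) {
        yBuf.position(row * yPlane.getRowStride());
        yBuf.get(nv21, row * w, w);
    }

    // Chroma: read U and V through their pixel/row strides and interleave them as V,U (NV21).
    Image.Plane uPlane = image.getPlanes()[1];
    Image.Plane vPlane = image.getPlanes()[2];
    ByteBuffer uBuf = uPlane.getBuffer();
    ByteBuffer vBuf = vPlane.getBuffer();
    int out = w * h;
    for (int row = 0; row < h / 2; row++) {
        for (int col = 0; col < w / 2; col++) {
            nv21[out++] = vBuf.get(row * vPlane.getRowStride() + col * vPlane.getPixelStride());
            nv21[out++] = uBuf.get(row * uPlane.getRowStride() + col * uPlane.getPixelStride());
        }
    }
    return nv21;
}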
Here is a useful method which converts from YV12 to NV21:
public static byte[] fromYV12toNV21(@NonNull final byte[] yv12,
                                    final int width,
                                    final int height) {
    byte[] nv21 = new byte[yv12.length];
    final int size = width * height;
    final int quarter = size / 4;
    final int vPosition = size;            // This is where V starts
    final int uPosition = size + quarter;  // This is where U starts

    System.arraycopy(yv12, 0, nv21, 0, size); // Y is the same
    for (int i = 0; i < quarter; i++) {
        nv21[size + i * 2]     = yv12[vPosition + i]; // For NV21, V first
        nv21[size + i * 2 + 1] = yv12[uPosition + i]; // For NV21, U second
    }
    return nv21;
}
I'm trying to get images from an HD live stream. I'm taking the OMX decoder's YUV output and converting it to JPEG, but the JPEG is completely garbled. I tried some suggestions from the group but they don't work.
My resolution is 320x240.
The buffer length I get is (386 * 256 * 1.5) for the configured 320 * 240 resolution. I don't understand how to obtain this new width and height information.
The JPEG conversion code I have is in Java, and the OMXCodec usage is in native code. Please help me.
// Method signature reconstructed from the fragment: repacks the decoder's planar
// chroma into an interleaved (semi-planar) layout.
void convertFrame(byte[] input, byte[] output, int width, int height) {
    final int frameSize = width * height;
    final int qFrameSize = frameSize / 4;
    int padding = 0; /* (width*height + 2047) & ~2047; */
    if ((width % 32) != 0) {
        padding = (width * height) % 1024;
    } else {
        padding = (width * height) % 2048;
    }

    System.arraycopy(input, 0, output, 0, frameSize); // Y
    for (int i = 0; i < qFrameSize; i++) {
        output[frameSize + i * 2 + padding]     = input[frameSize + i + qFrameSize]; // Cb (U)
        output[frameSize + i * 2 + 1 + padding] = input[frameSize + i];              // Cr (V)
    }
}
thank you,
Raghu
The output of the QCOM video decoder is usually a vendor-specific color format, typically known as a tiled format. Please refer to these questions, which have more input on how to convert the data to a cleaner frame:
QOMX_COLOR_FormatYUV420PackedSemiPlanar64x32Tile2m8ka converter
QOMX_COLOR_FormatYUV420PackedSemiPlanar64x32Tile2m8ka color format
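Separately, if the stream is decoded through the Java MediaCodec API rather than through OMXCodec in native code, the padded buffer geometry (the source of dimensions like 386x256 instead of 320x240) can usually be read from the codec's output format; on newer API levels the stride and slice-height keys are exposed directly. A rough sketch, where decoder is assumed to be an already-configured MediaCodec:
import android.media.MediaCodec;
import android.media.MediaFormat;
import android.util.Log;

// After MediaCodec signals INFO_OUTPUT_FORMAT_CHANGED, the output format carries the
// real buffer geometry, which may be padded relative to the configured resolution.
static void logDecoderGeometry(MediaCodec decoder) {
    MediaFormat fmt = decoder.getOutputFormat();
    int width = fmt.getInteger(MediaFormat.KEY_WIDTH);
    int height = fmt.getInteger(MediaFormat.KEY_HEIGHT);
    int stride = fmt.containsKey(MediaFormat.KEY_STRIDE)
            ? fmt.getInteger(MediaFormat.KEY_STRIDE) : width;           // padded row length
    int sliceHeight = fmt.containsKey(MediaFormat.KEY_SLICE_HEIGHT)
            ? fmt.getInteger(MediaFormat.KEY_SLICE_HEIGHT) : height;    // padded plane height
    int colorFormat = fmt.getInteger(MediaFormat.KEY_COLOR_FORMAT);
    Log.d("Decoder", "display " + width + "x" + height
            + ", padded " + stride + "x" + sliceHeight
            + ", colorFormat 0x" + Integer.toHexString(colorFormat));
}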
I am trying to show a YUV video file in Android; I have a few YUV video files that I am using.
One of them, video1 (160*120 resolution), I captured from my server as raw H.264 data and converted to a YUV file using OpenH264.
I used YUV Player Deluxe to play the above YUV video files and they play perfectly well.
When I try to play the same file in Android, the color components are not reproduced properly. The image appears almost black and white, with a few traces of color scattered across the frame.
To display the video in Android, I read the YUV video file frame by frame, where each frame is (w*h*1.5) bytes, and obtain an RGB array from it using the code below:
public static int[] convertYUV420_NV21toARGB8888(byte[] data, int width, int height) {
    int size = width * height;
    int offset = size;
    int[] pixels = new int[size];
    int u, v, y1, y2, y3, y4;

    // i along Y and the final pixels
    // k along pixels U and V
    for (int i = 0, k = 0; i < size; i += 2, k += 2) {
        y1 = data[i] & 0xff;
        y2 = data[i + 1] & 0xff;
        y3 = data[width + i] & 0xff;
        y4 = data[width + i + 1] & 0xff;

        v = data[offset + k] & 0xff;
        u = data[offset + k + 1] & 0xff;
        v = v - 128;
        u = u - 128;

        pixels[i] = convertYUVtoARGB(y1, u, v);
        pixels[i + 1] = convertYUVtoARGB(y2, u, v);
        pixels[width + i] = convertYUVtoARGB(y3, u, v);
        pixels[width + i + 1] = convertYUVtoARGB(y4, u, v);

        if (i != 0 && (i + 2) % width == 0)
            i += width;
    }
    return pixels;
}

private static int convertYUVtoARGB(int y, int u, int v) {
    int r = y + (int) (1.772f * v);
    int g = y - (int) (0.344f * v + 0.714f * u);
    int b = y + (int) (1.402f * u);
    r = r > 255 ? 255 : r < 0 ? 0 : r;
    g = g > 255 ? 255 : g < 0 ? 0 : g;
    b = b > 255 ? 255 : b < 0 ? 0 : b;
    return 0xff000000 | (r << 16) | (g << 8) | b;
}
Using the rgb[] obtained from the method convertYUV420_NV21toARGB8888 above, I construct a bitmap to display in an ImageView.
I tried various other code to convert yuv[] to rgb[] but the result is the same. I also tried using Android's YuvImage API and the result is still the same.
I know that the code above converts Y'UV420sp (NV21) to ARGB8888, but I don't understand the difference between plain YUV to RGB conversion and Y'UV420sp (NV21) to ARGB8888 conversion (just below the previous link).
Could anyone please help me out?
Your main problem, as you suspected, is that you treat the input data as NV21 and not as planar YUV. The difference between the two formats is that in NV21 the chroma (U/V) samples are interleaved (i.e. VUVUVUVUVU...), while in planar YUV they sit in separate planes (i.e. UUUUU...VVVVV...) and in the other order, so this part of your code should look like:
u = data[offset + k] & 0xff;
v = data[offset + k + size / 4] & 0xff;
and k in the loop should increase by 1 (not by 2).
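For reference, here is a sketch of the question's loop with both of those changes applied; the method name is made up for this answer and it reuses the convertYUVtoARGB helper from the question, everything else is unchanged:
public static int[] convertYUV420PlanartoARGB8888(byte[] data, int width, int height) {
    int size = width * height;
    int offset = size;
    int[] pixels = new int[size];

    // Chroma comes from two separate planes (planar I420 layout), so k advances
    // by 1 per 2x2 block instead of 2.
    for (int i = 0, k = 0; i < size; i += 2, k++) {
        int y1 = data[i] & 0xff;
        int y2 = data[i + 1] & 0xff;
        int y3 = data[width + i] & 0xff;
        int y4 = data[width + i + 1] & 0xff;

        int u = (data[offset + k] & 0xff) - 128;            // U plane comes right after Y
        int v = (data[offset + size / 4 + k] & 0xff) - 128;  // V plane follows the U plane

        pixels[i] = convertYUVtoARGB(y1, u, v);
        pixels[i + 1] = convertYUVtoARGB(y2, u, v);
        pixels[width + i] = convertYUVtoARGB(y3, u, v);
        pixels[width + i + 1] = convertYUVtoARGB(y4, u, v);

        if (i != 0 && (i + 2) % width == 0)
            i += width;
    }
    return pixels;
}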
I have to make a simple app that measures the saturation, brightness, etc. of the camera preview on Android. My code currently receives the image data in:
public void onPreviewFrame(byte[] data, Camera camera){
}
...and if I'm not wrong it is in YUV420SP format. I have tried to find some information about this but without success. Can anyone tell me how to handle this format?
See the Android documentation: http://developer.android.com/reference/android/hardware/Camera.Parameters.html#setPreviewFormat(int); the image format is described here: http://www.fourcc.org/yuv.php#NV21.
In a nutshell, this byte[] contains two parts: luma and chroma. You can use the camera object to find the current parameters (don't do this in production code on every call to onPreviewFrame(), because these calls are a performance burden; reuse the values instead):
int w = camera.getParameters().getPreviewSize().width;
int h = camera.getParameters().getPreviewSize().height;

byte[] luma = new byte[w * h];
byte[] chroma = new byte[w * h / 2];
System.arraycopy(data, 0, luma, 0, w * h);
System.arraycopy(data, w * h, chroma, 0, w * h / 2);

// Chroma is subsampled 2x2 and stored as interleaved V,U pairs, so the pair for
// pixel (x, y) starts at (y/2)*w + (x/2)*2 within the chroma part.
int Y_at_x_y = luma[x + y * w] & 0xff;                       // or data[x + y*w] & 0xff
int U_at_x_y = chroma[(y / 2) * w + (x / 2) * 2 + 1] & 0xff; // or data[w*h + (y/2)*w + (x/2)*2 + 1] & 0xff
int V_at_x_y = chroma[(y / 2) * w + (x / 2) * 2] & 0xff;     // or data[w*h + (y/2)*w + (x/2)*2] & 0xff
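For example, since brightness is carried entirely by the luma part, a minimal sketch of an average-brightness measure (a helper written for this answer, not an Android API) only needs the first w*h bytes:
public static double averageBrightness(byte[] data, int w, int h) {
    long sum = 0;
    for (int i = 0; i < w * h; i++) {
        sum += data[i] & 0xff;   // luma bytes are unsigned 0..255
    }
    return sum / (double) (w * h);
}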
In the camera preview frame I get data in YV12 format on the Android side. I need to convert it to YUV420P on the JNI side. How can I do it? From what I have read in many sources, in the YUV420P format the Y samples appear first, followed by the U samples, and then the V samples. The YV12 format is the same as YUV420P except that the U and V planes appear in reverse order, i.e. the Y samples are followed by V and then U. Keeping that in mind, I have used the following swapping code to produce YUV420P data from the YV12 data before encoding:
avpicture_fill((AVPicture*)outframe, (uint8_t*)camData, codecCtx->pix_fmt, codecCtx->width, codecCtx->height);
uint8_t * buf_store = outframe->data[1];
outframe->data[1]=outframe->data[2];
outframe->data[2]=buf_store;
But it does not seem to be working. How should I adjust my code?
Don't use avpicture_fill. I have implemented it in my application like this and it's working fine:
picture->linesize[0] = frameWidth;       // Y plane stride
picture->linesize[1] = frameWidth/2;     // U plane stride
picture->linesize[2] = frameWidth/2;     // V plane stride
picture->data[0] = camData;                                                              // Y plane
picture->data[1] = camData + picture->linesize[0]*frameHeight+picture->linesize[1]*frameHeight/2; // U plane (after V in YV12)
picture->data[2] = camData + picture->linesize[0]*frameHeight;                           // V plane (right after Y in YV12)
Maybe you need this method:
// YV12 to YUV420P (I420)
public static void swapYV12toI420(final byte[] yv12bytes, final byte[] i420bytes, int width, int height) {
    int size = width * height;
    int part = size / 4;
    System.arraycopy(yv12bytes, 0, i420bytes, 0, size);               // Y plane
    System.arraycopy(yv12bytes, size + part, i420bytes, size, part);  // U plane (comes second in YV12)
    System.arraycopy(yv12bytes, size, i420bytes, size + part, part);  // V plane (comes first in YV12)
}
For every YV12 data package you receive, swap it to I420.
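A minimal usage sketch (the mI420Buffer field is hypothetical): allocate the target buffer once per preview size and swap each frame before handing it to the JNI encoder:
private byte[] mI420Buffer;

@Override
public void onPreviewFrame(byte[] data, Camera camera) {
    Camera.Size size = camera.getParameters().getPreviewSize();
    int needed = size.width * size.height * 3 / 2;
    if (mI420Buffer == null || mI420Buffer.length != needed) {
        mI420Buffer = new byte[needed];   // YV12 and I420 frames are both w*h*1.5 bytes
    }
    swapYV12toI420(data, mI420Buffer, size.width, size.height);
    // mI420Buffer now holds YUV420P (I420) data ready to pass to the native encoder.
}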