Decoding YUV to RGB in C/C++ with the Android NDK

I'm trying to convert the Android camera feed to a bitmap for image processing.
I have some code that converts YUV to RGB in plain Java, which works; however, this process isn't quick enough for real-time video, so I think I need to convert it in either C or C++ before I apply the filters. I already have the NDK set up and working, so the only part I don't know how to do is port the following code to C or C++:
// decode Y, U, and V values on the YUV 420 buffer described as YCbCr_422_SP by Android
// David Manpearl 081201
public void decodeYUV(int[] out, byte[] fg, int width, int height)
        throws NullPointerException, IllegalArgumentException {
    int sz = width * height;
    if (out == null)
        throw new NullPointerException("buffer out is null");
    if (out.length < sz)
        throw new IllegalArgumentException("buffer out size " + out.length
                + " < minimum " + sz);
    if (fg == null)
        throw new NullPointerException("buffer 'fg' is null");
    if (fg.length < sz)
        throw new IllegalArgumentException("buffer fg size " + fg.length
                + " < minimum " + sz * 3 / 2);
    int i, j;
    int Y, Cr = 0, Cb = 0;
    for (j = 0; j < height; j++) {
        int pixPtr = j * width;
        final int jDiv2 = j >> 1;
        for (i = 0; i < width; i++) {
            Y = fg[pixPtr];
            if (Y < 0)
                Y += 255;
            if ((i & 0x1) != 1) {
                final int cOff = sz + jDiv2 * width + (i >> 1) * 2;
                Cb = fg[cOff];
                if (Cb < 0)
                    Cb += 127;
                else
                    Cb -= 128;
                Cr = fg[cOff + 1];
                if (Cr < 0)
                    Cr += 127;
                else
                    Cr -= 128;
            }
            int R = Y + Cr + (Cr >> 2) + (Cr >> 3) + (Cr >> 5);
            if (R < 0)
                R = 0;
            else if (R > 255)
                R = 255;
            int G = Y - (Cb >> 2) + (Cb >> 4) + (Cb >> 5) - (Cr >> 1)
                    + (Cr >> 3) + (Cr >> 4) + (Cr >> 5);
            if (G < 0)
                G = 0;
            else if (G > 255)
                G = 255;
            int B = Y + Cb + (Cb >> 1) + (Cb >> 2) + (Cb >> 6);
            if (B < 0)
                B = 0;
            else if (B > 255)
                B = 255;
            out[pixPtr++] = 0xff000000 + (B << 16) + (G << 8) + R;
        }
    }
}
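(For reference, the shift chains are fixed-point approximations of the usual YCbCr-to-RGB coefficients. For example, Cr + (Cr >> 2) + (Cr >> 3) + (Cr >> 5) = (1 + 1/4 + 1/8 + 1/32) * Cr = 1.40625 * Cr, close to the standard 1.402, and Cb + (Cb >> 1) + (Cb >> 2) + (Cb >> 6) = 1.765625 * Cb, close to the standard 1.772.)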
...
decodeYUV(argb8888, data, camSize.width, camSize.height);
Bitmap bitmap = Bitmap.createBitmap(argb8888, camSize.width,
        camSize.height, Config.ARGB_8888);
Does anyone know how to do this?
Many thanks!
Update
This is how far I've got:
JNIEXPORT void JNICALL Java_com_twothreetwo_zoomplus_ZoomPlus_YUVtoRGB(JNIEnv * env, jobject obj, jintArray rgb, jbyteArray yuv420sp, jint width, jint height)
{
    int sz;
    int i;
    int j;
    int Y;
    int Cr = 0;
    int Cb = 0;
    int pixPtr = 0;
    int jDiv2 = 0;
    int R = 0;
    int G = 0;
    int B = 0;
    int cOff;
    sz = width * height;
    //if(out == null) throw new NullPointerException("buffer 'out' is null");
    //if(out.length < sz) throw new IllegalArgumentException("buffer 'out' size " + out.length + " < minimum " + sz);
    //if(fg == null) throw new NullPointerException("buffer 'fg' is null");
    //if(fg.length < sz) throw new IllegalArgumentException("buffer 'fg' size " + fg.length + " < minimum " + sz * 3/ 2);
    for(j = 0; j < height; j++) {
        pixPtr = j * width;
        jDiv2 = j >> 1;
        for(i = 0; i < width; i++) {
            Y = yuv420sp[pixPtr]; if(Y < 0) Y += 255;
            if((i & 0x1) != 1) {
                cOff = sz + jDiv2 * width + (i >> 1) * 2;
                Cb = yuv420sp[cOff];
                if(Cb < 0) Cb += 127; else Cb -= 128;
                Cr = yuv420sp[cOff + 1];
                if(Cr < 0) Cr += 127; else Cr -= 128;
            }
            R = Y + Cr + (Cr >> 2) + (Cr >> 3) + (Cr >> 5);
            if(R < 0) R = 0; else if(R > 255) R = 255;
            G = Y - (Cb >> 2) + (Cb >> 4) + (Cb >> 5) - (Cr >> 1) + (Cr >> 3) + (Cr >> 4) + (Cr >> 5);
            if(G < 0) G = 0; else if(G > 255) G = 255;
            B = Y + Cb + (Cb >> 1) + (Cb >> 2) + (Cb >> 6);
            if(B < 0) B = 0; else if(B > 255) B = 255;
            rgb[pixPtr++] = 0xff000000 + (B << 16) + (G << 8) + R;
        }
    }
}
But I'm getting the following C errors:
apps/zoomplusndk/jni/zoomplusndk.c:53: warning: dereferencing 'void *' pointer
apps/zoomplusndk/jni/zoomplusndk.c:53: error: void value not ignored as it ought to be
apps/zoomplusndk/jni/zoomplusndk.c:56: warning: dereferencing 'void *' pointer
apps/zoomplusndk/jni/zoomplusndk.c:56: error: void value not ignored as it ought to be
apps/zoomplusndk/jni/zoomplusndk.c:58: warning: dereferencing 'void *' pointer
apps/zoomplusndk/jni/zoomplusndk.c:58: error: void value not ignored as it ought to be
apps/zoomplusndk/jni/zoomplusndk.c:67: warning: dereferencing 'void *' pointer
apps/zoomplusndk/jni/zoomplusndk.c:67: error: invalid use of void expression
Line 53 is Y = yuv420sp[pixPtr]; if(Y < 0) Y += 255;
(The errors come from indexing the jbyteArray and jintArray parameters directly: they are opaque JNI references, not C arrays, so their elements have to be pinned first with functions like GetByteArrayElements.)

Here is your code corrected:
#include <jni.h>
#include <android/log.h>
#include <stdlib.h>

int* rgbData = NULL;
int rgbDataSize = 0;

JNIEXPORT void JNICALL Java_mk_g6_transparency_CameraPreview_YUVtoRBG(JNIEnv * env, jobject obj, jintArray rgb, jbyteArray yuv420sp, jint width, jint height)
{
    int sz;
    int i;
    int j;
    int Y;
    int Cr = 0;
    int Cb = 0;
    int pixPtr = 0;
    int jDiv2 = 0;
    int R = 0;
    int G = 0;
    int B = 0;
    int cOff;
    int w = width;
    int h = height;
    sz = w * h;
    jbyte* yuv = yuv420sp;
    if(rgbDataSize < sz) {
        free(rgbData); /* free(NULL) is a no-op */
        rgbData = (int*) malloc(sz * sizeof(int)); /* heap allocation; a stack array here would dangle once this block ends */
        rgbDataSize = sz;
        __android_log_write(ANDROID_LOG_INFO, "JNI", "alloc");
    }
    for(j = 0; j < h; j++) {
        pixPtr = j * w;
        jDiv2 = j >> 1;
        for(i = 0; i < w; i++) {
            Y = yuv[pixPtr];
            if(Y < 0) Y += 255;
            if((i & 0x1) != 1) {
                cOff = sz + jDiv2 * w + (i >> 1) * 2;
                Cb = yuv[cOff];
                if(Cb < 0) Cb += 127; else Cb -= 128;
                Cr = yuv[cOff + 1];
                if(Cr < 0) Cr += 127; else Cr -= 128;
            }
            R = Y + Cr + (Cr >> 2) + (Cr >> 3) + (Cr >> 5);
            if(R < 0) R = 0; else if(R > 255) R = 255;
            G = Y - (Cb >> 2) + (Cb >> 4) + (Cb >> 5) - (Cr >> 1) + (Cr >> 3) + (Cr >> 4) + (Cr >> 5);
            if(G < 0) G = 0; else if(G > 255) G = 255;
            B = Y + Cb + (Cb >> 1) + (Cb >> 2) + (Cb >> 6);
            if(B < 0) B = 0; else if(B > 255) B = 255;
            rgbData[pixPtr++] = 0xff000000 + (B << 16) + (G << 8) + R;
        }
    }
    (*env)->SetIntArrayRegion(env, rgb, 0, sz, (jint *) &rgbData[0]);
}
It's also online at http://dl.dropbox.com/u/49855874/yuv-decoder.c
My test on an HTC Desire:
Camera preview at 640x480: processing takes ~20 ms
Camera preview at 320x240: processing takes ~6 ms

I was getting Fatal signal 11 (SIGSEGV) until I changed this line:
jbyte* yuv = yuv420sp;
to this:
jboolean isCopy;
jbyte* yuv = (*env)->GetByteArrayElements(env, yuv420sp, &isCopy);
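For completeness, a minimal sketch of the full pin/release pattern (the function and variable names here are illustrative, not from the code above); pinned arrays should be released when you're done so that the copy, if the VM made one, is freed:

#include <jni.h>

JNIEXPORT void JNICALL Java_com_example_Decoder_process(JNIEnv *env, jobject obj,
        jbyteArray yuv420sp, jintArray rgb, jint width, jint height)
{
    /* Pin the Java arrays to get usable C pointers (the VM may hand back a copy). */
    jbyte *yuv = (*env)->GetByteArrayElements(env, yuv420sp, NULL);
    if (yuv == NULL) return;
    jint *out = (*env)->GetIntArrayElements(env, rgb, NULL);
    if (out == NULL) {
        (*env)->ReleaseByteArrayElements(env, yuv420sp, yuv, JNI_ABORT);
        return;
    }

    /* ... decode yuv into out here, exactly as in the loop above ... */

    /* The input was only read, so release it without copying back. */
    (*env)->ReleaseByteArrayElements(env, yuv420sp, yuv, JNI_ABORT);
    /* Mode 0 copies the results back into the Java array and frees the buffer. */
    (*env)->ReleaseIntArrayElements(env, rgb, out, 0);
}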

What I suggest is that you don't do this yourself, since somebody has already done it well: use the GPUImage library, or just lift this part of its code.
There is an Android version of GPUImage:
https://github.com/CyberAgent/android-gpuimage
You can include this library if you use Gradle, and call the method:
GPUImageNativeLibrary.YUVtoRBGA(inputArray, WIDTH, HEIGHT, outputArray);
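A sketch of how the call fits together, assuming the argument order shown above (yuv is the NV21 byte[] delivered to onPreviewFrame; previewWidth and previewHeight are illustrative names):

int[] argb = new int[previewWidth * previewHeight];
GPUImageNativeLibrary.YUVtoRBGA(yuv, previewWidth, previewHeight, argb);
Bitmap bitmap = Bitmap.createBitmap(argb, previewWidth, previewHeight, Bitmap.Config.ARGB_8888);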

Related

How do I write rgb2yuv based on the existing yuv2rgb method?

I took over an Android project which already has a yuv2rgb method; I have to write the inverse method, rgb2yuv. Please help me, thank you!
public static int[] yuv2rgb(byte[] pYUV, int width, int height) {
    int[] pRGB = new int[width * height];
    int i, j, yp;
    int hfWidth = width >> 1;
    int size = width * height;
    int qtrSize = size >> 2;
    for (i = 0, yp = 0; i < height; i++) {
        int uvp = size + (i >> 1) * hfWidth, u = 0, v = 0;
        for (j = 0; j < width; j++, yp++) {
            int y = (0xff & pYUV[yp]) - 16;
            if ((j & 1) == 0) {
                u = (0xff & pYUV[uvp + (j >> 1)]) - 128;
                v = (0xff & pYUV[uvp + qtrSize + (j >> 1)]) - 128;
            }
            int y1192 = 1192 * y;
            int r = (y1192 + 1634 * v);
            int g = (y1192 - 833 * v - 400 * u);
            int b = (y1192 + 2066 * u);
            if (r < 0) r = 0;
            else if (r > 262143) r = 262143;
            if (g < 0) g = 0;
            else if (g > 262143) g = 262143;
            if (b < 0) b = 0;
            else if (b > 262143) b = 262143;
            pRGB[i * width + j] = 0xff000000 | ((r << 6) & 0xff0000) | ((g >> 2) & 0xff00) | ((b >> 10) & 0xff);
        }
    }
    return pRGB;
}
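(The integer constants are the standard video-range BT.601 coefficients scaled by 1024: 1192 ≈ 1.164 × 1024, 1634 ≈ 1.596 × 1024, 833 ≈ 0.813 × 1024, 400 ≈ 0.391 × 1024, 2066 ≈ 2.018 × 1024. Each product is clamped to 262143 = 2^18 − 1, and the final shifts move the top 8 bits of each 18-bit value into its channel slot.)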
My colleague helped me find a way, and it works!
public static byte[] colorconvertRGB_IYUV_I420(int[] aRGB, int width, int height) {
    final int frameSize = width * height;
    final int chromasize = frameSize / 4;
    int yIndex = 0;
    int uIndex = frameSize;
    int vIndex = frameSize + chromasize;
    byte[] yuv = new byte[width * height * 3 / 2];
    int a, R, G, B, Y, U, V;
    int index = 0;
    for (int j = 0; j < height; j++) {
        for (int i = 0; i < width; i++) {
            //a = (aRGB[index] & 0xff000000) >> 24; //not using it right now
            R = (aRGB[index] & 0xff0000) >> 16;
            G = (aRGB[index] & 0xff00) >> 8;
            B = (aRGB[index] & 0xff) >> 0;
            Y = ((66 * R + 129 * G + 25 * B + 128) >> 8) + 16;
            U = ((-38 * R - 74 * G + 112 * B + 128) >> 8) + 128;
            V = ((112 * R - 94 * G - 18 * B + 128) >> 8) + 128;
            yuv[yIndex++] = (byte) ((Y < 0) ? 0 : ((Y > 255) ? 255 : Y));
            if (j % 2 == 0 && index % 2 == 0) {
                yuv[uIndex++] = (byte) ((U < 0) ? 0 : ((U > 255) ? 255 : U));
                yuv[vIndex++] = (byte) ((V < 0) ? 0 : ((V > 255) ? 255 : V));
            }
            index++;
        }
    }
    return yuv;
}
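A usage sketch, assuming sourceBitmap is an ARGB_8888 bitmap (the names are illustrative):

int width = sourceBitmap.getWidth();
int height = sourceBitmap.getHeight();
int[] argb = new int[width * height];
// Copy the bitmap's pixels into an int[] in ARGB order.
sourceBitmap.getPixels(argb, 0, width, 0, 0, width, height);
byte[] i420 = colorconvertRGB_IYUV_I420(argb, width, height);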

Is converting float to matrix and matrix to bitmap possible in android?

I already converted the data from onPreviewFrame to RGB, which is an int array. Gaussian blur uses float/double values, and I wasn't able to find methods for converting float/double values to a bitmap.
The only way I have in mind is to convert these float values into a matrix and then into a bitmap, but I am not really sure whether this is possible.
public Bitmap getProcessedImage(byte[] image) {
    Bitmap bmp = null;
    int rgb[] = new int[image.length];
    decodeYUV420SP(rgb, image);
    convertToGrayscale(rgb);
    applyGaussianBlur(rgb);
    bmp = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
    bmp.copyPixelsFromBuffer(IntBuffer.wrap(rgb));
    return bmp;
}

public void convertToGrayscale(int rgb[]) {
    for (int i = 0; i < rgb.length; i++) {
        int R = (rgb[i] >> 16) & 0xff;
        int G = (rgb[i] >> 8) & 0xff;
        int B = rgb[i] & 0xff;
        int gray = (R + G + B) / 3;
        rgb[i] = 0xFF000000 | (gray << 16) | (gray << 8) | gray;
    }
}

//Method from the Ketai project!
void decodeYUV420SP(int[] rgb, byte[] yuv420sp) {
    final int frameSize = width * height;
    for (int j = 0, yp = 0; j < height; j++) {
        int uvp = frameSize + (j >> 1) * width, u = 0, v = 0;
        for (int i = 0; i < width; i++, yp++) {
            int y = (0xff & ((int) yuv420sp[yp])) - 16;
            if (y < 0)
                y = 0;
            if ((i & 1) == 0) {
                v = (0xff & yuv420sp[uvp++]) - 128;
                u = (0xff & yuv420sp[uvp++]) - 128;
            }
            int y1192 = 1192 * y;
            int r = (y1192 + 1634 * v);
            int g = (y1192 - 833 * v - 400 * u);
            int b = (y1192 + 2066 * u);
            if (r < 0) r = 0; else if (r > 262143) r = 262143;
            if (g < 0) g = 0; else if (g > 262143) g = 262143;
            if (b < 0) b = 0; else if (b > 262143) b = 262143;
            rgb[yp] = 0xff000000 | ((r << 6) & 0xff0000) | ((g >> 2) & 0xff00) | ((b >> 10) & 0xff);
        }
    }
}
public void applyGaussianBlur(int rgb[]) {
    int kernelWidth = 3, kernelHeight = 3;
    float darray[][] = new float[width][height];
    float matrix[][] = new float[kernelWidth][kernelHeight];
    float sigma = 3.0f;
    float sum = 0;
    for (int j = 0, ct = 0; j < width; j++) {
        for (int m = 0; m < height; m++, ct++) {
            darray[j][m] = rgb[ct];
        }
    }
    for (int x = -1; x < kernelWidth - 1; x++) {
        for (int y = -1; y < kernelHeight - 1; y++) {
            matrix[x + 1][y + 1] = (float) ((1 / (2 * Math.PI * Math.pow(sigma, 2))) * Math.exp(-(Math.pow(x, 2) + Math.pow(y, 2)) / (2 * Math.pow(sigma, 2))));
            sum += matrix[x + 1][y + 1];
        }
    }
    for (int x = -1; x < kernelWidth - 1; x++) {
        for (int y = -1; y < kernelHeight - 1; y++) {
            matrix[x + 1][y + 1] = matrix[x + 1][y + 1] / sum;
        }
    }
    for (int j = 1; j < width - 1; j++) {
        for (int m = 1; m < height - 1; m++) {
            darray[j - 1][m - 1] = (darray[j - 1][m - 1] * matrix[0][0]);
            darray[j - 1][m] = (darray[j - 1][m] * matrix[0][1]);
            darray[j - 1][m + 1] = (darray[j - 1][m + 1] * matrix[0][2]);
            darray[j][m - 1] = (darray[j][m - 1] * matrix[1][0]);
            darray[j][m] = (darray[j][m] * matrix[1][1]);
            darray[j][m + 1] = (darray[j][m + 1] * matrix[1][2]);
            darray[j + 1][m - 1] = (darray[j + 1][m - 1] * matrix[2][0]);
            darray[j + 1][m] = (darray[j + 1][m] * matrix[2][1]);
            darray[j + 1][m + 1] = (darray[j + 1][m + 1] * matrix[2][2]);
        }
    }
    for (int j = 0, ct = 0; j < width; j++) {
        for (int m = 0; m < height; m++, ct++) {
            rgb[ct] = Math.round(darray[j][m]);
        }
    }
}
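As for whether a matrix step is needed: it isn't. A minimal sketch, assuming the processed float values are grayscale intensities in the 0..255 range (floatPixels is an illustrative name, one value per pixel):

int[] argb = new int[width * height];
for (int i = 0; i < argb.length; i++) {
    int gray = Math.round(floatPixels[i]);
    gray = Math.max(0, Math.min(255, gray));  // clamp to the valid 8-bit range
    argb[i] = 0xFF000000 | (gray << 16) | (gray << 8) | gray;
}
Bitmap bmp = Bitmap.createBitmap(argb, width, height, Bitmap.Config.ARGB_8888);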

Calling JNI with Android NDK issue

I have a C function that I'm calling from Java using the Android NDK. Essentially it takes the camera data and converts it from YUV to RGB format. The problem is I'm not sure what object type imageOut should be, since in C it's simply of type jobject. This is the snippet of code I have (unfortunately I have nothing else to go by):
JNIEXPORT void JNICALL Java_com_twothreetwo_zoomplus_ZoomPlus_yuvrgb(JNIEnv *env, jobject obj, jbyteArray imageIn, jint widthIn, jint heightIn, jobject imageOut, jint widthOut, jint heightOut)
{
    LOGI("width is %d; height is %d;", widthIn, heightIn);
    jbyte *cImageIn = (*env)->GetByteArrayElements(env, imageIn, NULL);
    jbyte *cImageOut = (jbyte*)(*env)->GetDirectBufferAddress(env, imageOut);
    unsigned int *rgbs = (unsigned int*)cImageOut;
    int half_widthIn = widthIn >> 1;
    //the end of the luminance data, measured in 16-bit units
    int lumEnd = (widthIn * heightIn) >> 1;
    //points to the next luminance value pair
    int lumPtr = 0;
    //points to the next chrominance value pair
    int chrPtr = lumEnd;
    //the end of the current luminance scanline
    int lineEnd = half_widthIn;
    //NOTE: yuvs was uninitialized in the original snippet; presumably it should read the input two bytes at a time
    unsigned short *yuvs = (unsigned short *)cImageIn;
    int x, y;
    for (y = 0; y < heightIn; y++) {
        int yPosOut = (y * widthOut) >> 1;
        for (x = 0; x < half_widthIn; x++) {
            //read the luminance and chrominance values
            int Y1 = yuvs[lumPtr++];
            int Y2 = (Y1 >> 8) & 0xff;
            Y1 = Y1 & 0xff;
            int Cr = yuvs[chrPtr++];
            int Cb = ((Cr >> 8) & 0xff) - 128;
            Cr = (Cr & 0xff) - 128;
            int R, G, B;
            //generate first RGB components
            B = Y1 + ((454 * Cb) >> 8);
            if (B < 0) B = 0; if (B > 255) B = 255;
            G = Y1 - ((88 * Cb + 183 * Cr) >> 8);
            if (G < 0) G = 0; if (G > 255) G = 255;
            R = Y1 + ((359 * Cr) >> 8);
            if (R < 0) R = 0; if (R > 255) R = 255;
            int val = ((R & 0xf8) << 8) | ((G & 0xfc) << 3) | (B >> 3);
            //generate second RGB components (the original repeated Y1 here; the second pixel's luma is Y2)
            B = Y2 + ((454 * Cb) >> 8);
            if (B < 0) B = 0; if (B > 255) B = 255;
            G = Y2 - ((88 * Cb + 183 * Cr) >> 8);
            if (G < 0) G = 0; if (G > 255) G = 255;
            R = Y2 + ((359 * Cr) >> 8);
            if (R < 0) R = 0; if (R > 255) R = 255;
            rgbs[yPosOut + x] = val | ((((R & 0xf8) << 8) | ((G & 0xfc) << 3) | (B >> 3)) << 16);
        }
        //skip back to the start of the chrominance values when necessary
        chrPtr = lumEnd + ((lumPtr >> 1) / half_widthIn) * half_widthIn;
        lineEnd += half_widthIn;
    }
    (*env)->ReleaseByteArrayElements(env, imageIn, cImageIn, JNI_ABORT);
}
I'm calling the function in the onPreviewFrame function:
public native void yuvrgb(byte[] yuvImageIn, int widthIn, int heightIn, Bitmap imageOut, int widthOut, int heightOut);

public void onPreviewFrame(byte[] data, Camera camera)
{
    yuvrgb(data, 480, 640, bitmapWip, 480, 640);
    cameraImageView.setImageBitmap(bitmapWip);
}
As you can see, I'm currently declaring imageOut as a Bitmap, which is where I think I'm going wrong, as I just guessed the type.
I don't get any errors; the app simply crashes instantly. Does anyone know what I'm doing wrong?
cImageOut is an array of bytes. It's declared at the top of your C function:
jbyte *cImageOut = (jbyte*)(*env)->GetDirectBufferAddress(env, imageOut);
Note that GetDirectBufferAddress is only defined for direct java.nio buffers, so calling it on a Bitmap returns NULL, which would explain the crash. You should declare imageOut as a byte array in your Java code and then convert it to a Bitmap using BitmapFactory.decodeByteArray:
http://developer.android.com/reference/android/graphics/BitmapFactory.html#decodeByteArray%28byte[],%20int,%20int%29
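An alternative that matches the C function's use of GetDirectBufferAddress is to pass a direct ByteBuffer instead of a Bitmap. A sketch of the Java side, with the native declaration changed so that imageOut is a ByteBuffer (names as in the question):

// The C code writes RGB565, which is 2 bytes per pixel.
ByteBuffer out = ByteBuffer.allocateDirect(480 * 640 * 2);
yuvrgb(data, 480, 640, out, 480, 640);
out.rewind();
// bitmapWip is assumed to be a 480x640 Bitmap with Config.RGB_565.
bitmapWip.copyPixelsFromBuffer(out);
cameraImageView.setImageBitmap(bitmapWip);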

Region of interest

I am interested in capturing only part of the camera preview.
I have a rectangle with an onTouchListener; I can use this to drag my region of interest anywhere on the camera preview.
My ultimate goal is to capture only that part of the preview, but I am not able to find a way.
Is there an API I can make use of?
Can anyone guide me here?
Thanks
There's no support for that in the API that I know of, but I have successfully done something similar.
Basically you'll need to:
1. implement Camera.PreviewCallback
2. decode the YUV preview frame to RGB
3. make a temp bitmap from the RGB bytes
4. scale the rect from your on-screen rect to the appropriate size for the preview frame's dimensions
5. finally, use createBitmap(Bitmap source, int x, int y, int width, int height) to crop a new bitmap from the temp one, as sketched below
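A rough sketch of steps 3 through 5 (decodeYUV as below; previewRect is assumed to be the selection already scaled into preview coordinates):

int[] argb = new int[previewWidth * previewHeight];
decodeYUV(argb, data, previewWidth, previewHeight);
Bitmap temp = Bitmap.createBitmap(argb, previewWidth, previewHeight, Bitmap.Config.ARGB_8888);
// Crop the region of interest out of the temporary bitmap.
Bitmap roi = Bitmap.createBitmap(temp, previewRect.left, previewRect.top,
        previewRect.width(), previewRect.height());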
Note that there was no API-supported YUV decoder until FROYO (the YuvImage class), which you can use like this:
YuvImage yuvimage = new YuvImage(yuv, ImageFormat.NV21, width, height, null);
ByteArrayOutputStream baos = new ByteArrayOutputStream();
yuvimage.compressToJpeg(new Rect(0, 0, width, height), 100, baos);
rgb = BitmapFactory.decodeByteArray(baos.toByteArray(), 0, baos.size());
If you're only going to support 2.2 and later, you could skip the temp bitmap and use YuvImage.compressToJpeg to crop to your scaled rect.
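For example, only the selected region is compressed if you pass your scaled rect instead of the full frame (crop is assumed to be that rect):

yuvimage.compressToJpeg(crop, 100, baos);
Bitmap roi = BitmapFactory.decodeByteArray(baos.toByteArray(), 0, baos.size());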
For OS versions prior to 2.2 you'll need your own implementation of the YUV decode, which can be done as follows:
/**
 * Decodes a YUV frame to a buffer which can be used to create a bitmap.
 * Decodes the Y, U, and V values of the YUV 420 buffer described as YCbCr_422_SP by Android.
 * @param out the outgoing array of RGB bytes
 * @param fg the incoming frame bytes
 * @param width of source frame
 * @param height of source frame
 * @throws NullPointerException
 * @throws IllegalArgumentException
 */
public static void decodeYUV(int[] out, byte[] fg, int width, int height)
        throws NullPointerException, IllegalArgumentException {
    int sz = width * height;
    if (out == null)
        throw new NullPointerException("buffer out is null");
    if (out.length < sz)
        throw new IllegalArgumentException("buffer out size " + out.length
                + " < minimum " + sz);
    if (fg == null)
        throw new NullPointerException("buffer 'fg' is null");
    if (fg.length < sz)
        throw new IllegalArgumentException("buffer fg size " + fg.length
                + " < minimum " + sz * 3 / 2);
    int i, j;
    int Y, Cr = 0, Cb = 0;
    for (j = 0; j < height; j++) {
        int pixPtr = j * width;
        final int jDiv2 = j >> 1;
        for (i = 0; i < width; i++) {
            Y = fg[pixPtr];
            if (Y < 0)
                Y += 255;
            if ((i & 0x1) != 1) {
                final int cOff = sz + jDiv2 * width + (i >> 1) * 2;
                Cb = fg[cOff];
                if (Cb < 0)
                    Cb += 127;
                else
                    Cb -= 128;
                Cr = fg[cOff + 1];
                if (Cr < 0)
                    Cr += 127;
                else
                    Cr -= 128;
            }
            int R = Y + Cr + (Cr >> 2) + (Cr >> 3) + (Cr >> 5);
            if (R < 0)
                R = 0;
            else if (R > 255)
                R = 255;
            int G = Y - (Cb >> 2) + (Cb >> 4) + (Cb >> 5) - (Cr >> 1)
                    + (Cr >> 3) + (Cr >> 4) + (Cr >> 5);
            if (G < 0)
                G = 0;
            else if (G > 255)
                G = 255;
            int B = Y + Cb + (Cb >> 1) + (Cb >> 2) + (Cb >> 6);
            if (B < 0)
                B = 0;
            else if (B > 255)
                B = 255;
            out[pixPtr++] = (0xff000000 + (B << 16) + (G << 8) + R);
        }
    }
}
Credit: http://groups.google.com/group/android-developers/browse_thread/thread/c85e829ab209ceea/3f180a16a4872b58?lnk=gst&q=onpreviewframe&pli=1

YUV decode function error or hardware problem?

I got the YUV-to-RGB functions 1 and 2 below (from Stack Overflow),
but the result is wrong, like this: http://163.18.62.32/device.jpg
I don't understand which step is wrong.
My device is a Moto Milestone with 2.1 update 1.
function 1
public int[] decodeYUV420SP(byte[] yuv420sp, int width, int height) {
    final int frameSize = width * height;
    int rgb[] = new int[width * height];
    for (int j = 0, yp = 0; j < height; j++) {
        int uvp = frameSize + (j >> 1) * width, u = 0, v = 0;
        for (int i = 0; i < width; i++, yp++) {
            int y = (0xff & ((int) yuv420sp[yp])) - 16;
            if (y < 0) y = 0;
            if ((i & 1) == 0) {
                v = (0xff & yuv420sp[uvp++]) - 128;
                u = (0xff & yuv420sp[uvp++]) - 128;
            }
            int y1192 = 1192 * y;
            int r = (y1192 + 1634 * v);
            int g = (y1192 - 833 * v - 400 * u);
            int b = (y1192 + 2066 * u);
            if (r < 0) r = 0; else if (r > 262143) r = 262143;
            if (g < 0) g = 0; else if (g > 262143) g = 262143;
            if (b < 0) b = 0; else if (b > 262143) b = 262143;
            rgb[yp] = 0xff000000 | ((r << 6) & 0xff0000) | ((g >> 2) & 0xff00) | ((b >> 10) & 0xff);
        }
    }
    return rgb;
}
function 2
public void decodeYUV(int[] out, byte[] fg, int width, int height)
        throws NullPointerException, IllegalArgumentException {
    int sz = width * height;
    if (out == null)
        throw new NullPointerException("buffer out is null");
    if (out.length < sz)
        throw new IllegalArgumentException("buffer out size " + out.length
                + " < minimum " + sz);
    if (fg == null)
        throw new NullPointerException("buffer 'fg' is null");
    if (fg.length < sz)
        throw new IllegalArgumentException("buffer fg size " + fg.length
                + " < minimum " + sz * 3 / 2);
    int i, j;
    int Y, Cr = 0, Cb = 0;
    for (j = 0; j < height; j++) {
        int pixPtr = j * width;
        final int jDiv2 = j >> 1;
        for (i = 0; i < width; i++) {
            Y = fg[pixPtr];
            if (Y < 0)
                Y += 255;
            if ((i & 0x1) != 1) {
                final int cOff = sz + jDiv2 * width + (i >> 1) * 2;
                Cb = fg[cOff];
                if (Cb < 0)
                    Cb += 127;
                else
                    Cb -= 128;
                Cr = fg[cOff + 1];
                if (Cr < 0)
                    Cr += 127;
                else
                    Cr -= 128;
            }
            int R = Y + Cr + (Cr >> 2) + (Cr >> 3) + (Cr >> 5);
            if (R < 0)
                R = 0;
            else if (R > 255)
                R = 255;
            int G = Y - (Cb >> 2) + (Cb >> 4) + (Cb >> 5) - (Cr >> 1)
                    + (Cr >> 3) + (Cr >> 4) + (Cr >> 5);
            if (G < 0)
                G = 0;
            else if (G > 255)
                G = 255;
            int B = Y + Cb + (Cb >> 1) + (Cb >> 2) + (Cb >> 6);
            if (B < 0)
                B = 0;
            else if (B > 255)
                B = 255;
            out[pixPtr++] = 0xff000000 + (B << 16) + (G << 8) + R;
        }
    }
}
and I use them like this:
function 1
int[] rgbBuf =decodeYUV420SP(_data,height,width);
function 2
int[] rgbBuf = new int[height*width];
decodeYUV(rgbBuf,_height,.width);
then convert to a Bitmap and show it:
Bitmap bm = Bitmap.createBitmap(rgbBuf,width,height);
View01.setImageBitmap(bm);
As far as I can see, there is no Bitmap.createBitmap(...) with that set of arguments (int[] inVal, width, height).
The signatures on both decode methods have width then height.
Your calls seem to have the params flipped.
public int[] decodeYUV420SP( byte[] yuv420sp, int width, int height)
but you call:
decodeYUV420SP(_data,height,width);
and
public void decodeYUV(int[] out, byte[] fg, int width, int height)
but you call:
decodeYUV(rgbBuf,_height,.width);
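With the parameters in their declared order, the calls would be:

int[] rgbBuf = decodeYUV420SP(_data, width, height);

and

int[] rgbBuf = new int[width * height];
decodeYUV(rgbBuf, _data, width, height);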
