How to convert an Image to Mat in Android

I need to process images in my application. I get the images from an ImageReader.
reader.setOnImageAvailableListener(new ImageReader.OnImageAvailableListener() {
    @Override
    public void onImageAvailable(ImageReader reader) {
        Image mImage = reader.acquireNextImage();
        // mImage to Mat here
        mImage.close();
    }
}, null);
But now I need to convert those images to Mat.
I know that I could go through the Bitmap class, but I don't know how to convert an Image into a Bitmap either.

I think I found a possible answer.
I give my ImageReader a single-plane format like JPEG.
reader = ImageReader.newInstance(previewSize.getWidth(),previewSize.getHeight(), ImageFormat.JPEG, 2);
Then I do this:
ByteBuffer bb = image.getPlanes()[0].getBuffer();
byte[] buf = new byte[bb.remaining()];
bb.get(buf);
// The Mat has to be allocated before put(); it still holds encoded JPEG
// bytes afterwards, so it must be decoded (see the answer below) before use.
imageGrab = new Mat(1, buf.length, CvType.CV_8UC1);
imageGrab.put(0, 0, buf);

Try this
// image to byte array
ByteBuffer bb = image.getPlanes()[0].getBuffer();
byte[] data = new byte[bb.remaining()];
bb.get(data);
// byte array to mat
Mat mat = Imgcodecs.imdecode(new MatOfByte(data), Imgcodecs.CV_LOAD_IMAGE_UNCHANGED);
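Putting the two snippets together, a minimal sketch of the listener might look like the following (assuming the ImageReader was created with ImageFormat.JPEG as above and the OpenCV Java bindings are already initialized; the field name mGrabbedMat is only illustrative):
reader.setOnImageAvailableListener(new ImageReader.OnImageAvailableListener() {
    @Override
    public void onImageAvailable(ImageReader reader) {
        Image image = reader.acquireNextImage();
        if (image == null) {
            return;
        }
        // A JPEG Image has a single plane holding the encoded bytes
        ByteBuffer bb = image.getPlanes()[0].getBuffer();
        byte[] data = new byte[bb.remaining()];
        bb.get(data);
        image.close();
        // Decode the JPEG bytes into a Mat
        mGrabbedMat = Imgcodecs.imdecode(new MatOfByte(data), Imgcodecs.IMREAD_UNCHANGED);
    }
}, null);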

Related

Converting YUV_420_888 To Base64 Android corrupted image

I am currently working on a project which uses OpenCV in background mode to detect faces while the app is playing videos.
I've managed to run OpenCV as a service, and I am using an ImageReader instance to capture the images:
private ImageReader mImageReader = ImageReader.newInstance(mWidth, mHeight, ImageFormat.YUV_420_888, 1);
What I am trying to do is get the detected face image and send it to the backend. The image from the ImageReader is converted to Mat, so I have access to both the Mat type and the Image type.
I've managed to convert the acquired image to a YUV image and then to JPEG (a byte array) by using the toYuvImage and toJpegImage methods from this link: https://blog.minhazav.dev/how-to-convert-yuv-420-sp-android.media.Image-to-Bitmap-or-jpeg/#how-to-convert-yuv_420_888-image-to-jpeg-format
After converting the image to an array of bytes, I'm also trying to convert it to Base64 to send it over HTTP. The problem is that when I set imageQuality to 100 in toJpegImage, the resulting Base64 image looks corrupted, but when I use a lower value like 15 or 10, the output image (resolution) is better while the quality is bad. I am not sure whether this problem is related to the resolution.
byte[] jpegDataTest = ImageUtil.toJpegImage(detectionImage,15);
String base64New = Base64.encodeToString(jpegDataTest, Base64.DEFAULT);
PS: I am converting the image each time a face is detected, inside a for loop:
for(Rect rect : faceDetections.toArray()){}
Compress quality is set to 100: https://i.postimg.cc/YqSmFxrT/quality100.jpg
Compress quality is set to 15:
public static byte[] toJpegImage(Image image, int imageQuality) {
    if (image.getFormat() != ImageFormat.YUV_420_888) {
        throw new IllegalArgumentException("Invalid image format");
    }
    YuvImage yuvImage = toYuvImage(image);
    int width = image.getWidth();
    int height = image.getHeight();
    // Convert to jpeg
    byte[] jpegImage = null;
    try (ByteArrayOutputStream out = new ByteArrayOutputStream()) {
        yuvImage.compressToJpeg(new Rect(0, 0, width, height), imageQuality, out);
        jpegImage = out.toByteArray();
    } catch (IOException e) {
        e.printStackTrace();
    }
    return jpegImage;
}
private static byte[] YUV_420_888toNV21(Image image) {
    byte[] nv21;
    ByteBuffer yBuffer = image.getPlanes()[0].getBuffer();
    ByteBuffer vuBuffer = image.getPlanes()[2].getBuffer();
    int ySize = yBuffer.remaining();
    int vuSize = vuBuffer.remaining();
    nv21 = new byte[ySize + vuSize];
    yBuffer.get(nv21, 0, ySize);
    vuBuffer.get(nv21, ySize, vuSize);
    return nv21;
}
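The toYuvImage helper is not shown in the question; based on the linked blog post, it presumably wraps the NV21 bytes produced above in a YuvImage, roughly like this (a sketch, not necessarily the exact code from the link):
private static YuvImage toYuvImage(Image image) {
    if (image.getFormat() != ImageFormat.YUV_420_888) {
        throw new IllegalArgumentException("Invalid image format");
    }
    // Pack the planes into an NV21 buffer and wrap it so it can be
    // compressed with YuvImage.compressToJpeg.
    byte[] nv21 = YUV_420_888toNV21(image);
    return new YuvImage(nv21, ImageFormat.NV21, image.getWidth(), image.getHeight(), null);
}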

How to scale an image saved as byte array?

When you have an image (a PNG in this case) saved as a byte array on Android, how can you scale it to get a new, scaled image in byte array format?
Keep in mind that the scaling is to a smaller size, to avoid loss of data.
To convert the image from a URI into a byte array, the step is:
InputStream iStream = getContentResolver().openInputStream(uri);
byte[] inputData = getBytes(iStream);
One quick way of doing this is to let Android resolve the algorithm for us.
So we convert the byte array to a Bitmap, let the Bitmap create a new Bitmap with the new size, and finally convert back to a byte array.
private byte[] getScaledImage(byte[] originalImage, int newWidth, int newHeight) {
    // PNG is lossless; the quality field is ignored when compressing
    final int COMPRESS_QUALITY = 0;
    // Get a Bitmap from the byte array, since Bitmap has the resize function
    Bitmap bitmapImage = BitmapFactory.decodeByteArray(originalImage, 0, originalImage.length);
    // New bitmap with the target size (createScaledBitmap should not return null here)
    Bitmap mutableBitmapImage = Bitmap.createScaledBitmap(bitmapImage, newWidth, newHeight, false);
    // Get the byte array back from the scaled bitmap
    ByteArrayOutputStream outputStream = new ByteArrayOutputStream();
    mutableBitmapImage.compress(Bitmap.CompressFormat.PNG, COMPRESS_QUALITY, outputStream);
    if (mutableBitmapImage != bitmapImage) {
        mutableBitmapImage.recycle();
    } // else they are the same, just recycle once
    bitmapImage.recycle();
    return outputStream.toByteArray();
}
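A call site could then look like this (the target size of 320x240 is only an example):
byte[] scaledPng = getScaledImage(inputData, 320, 240); // inputData is the PNG byte array read above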

How to display an `Image` in the `ImageView`?

I have an ImageView object acquired from the .xml file:
mCameraView = (ImageView) findViewById(R.id.camera_view);
I record an image taken from the camera many times a second. For each new image frame the following method gets called.
@Override
public void onImageAvailable(ImageReader reader) {
    Image image = reader.acquireLatestImage();
    mCameraView.somehowDisplay(image); // HOW?
}
I want to place the image into the ImageView. How do I do that?
ByteBuffer buffer = image.getPlanes()[0].getBuffer();
byte[] bytes = new byte[buffer.remaining()];
buffer.get(bytes);
Bitmap myBitmap = BitmapFactory.decodeByteArray(bytes,0,bytes.length,null);
mCameraView.setImageBitmap(myBitmap);
This should work.
Reference Answer
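One detail worth keeping in mind (an assumption, since the question does not show where the listener runs): onImageAvailable is delivered on the handler passed to setOnImageAvailableListener, so if that is not the main thread, the ImageView must be updated on the UI thread, and the Image should be closed once its bytes have been copied. A variant along those lines, assuming the code lives in an Activity and the reader uses ImageFormat.JPEG:
@Override
public void onImageAvailable(ImageReader reader) {
    Image image = reader.acquireLatestImage();
    if (image == null) {
        return;
    }
    ByteBuffer buffer = image.getPlanes()[0].getBuffer();
    byte[] bytes = new byte[buffer.remaining()];
    buffer.get(bytes);
    image.close(); // release the buffer back to the ImageReader
    final Bitmap bitmap = BitmapFactory.decodeByteArray(bytes, 0, bytes.length);
    runOnUiThread(new Runnable() {
        @Override
        public void run() {
            mCameraView.setImageBitmap(bitmap);
        }
    });
}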

Convert YUV Image into greyscale Image - Same Result as RGB to Grayscale?

I want to do some image processing on a YUV_420_888 Image and need a greyscale version of it. From what I've read about the YUV format, it should be enough to extract the Y plane of the image. In Android I try that with this workflow to convert the Y plane into a byte array.
Image.Plane Y = img.getPlanes()[0];
ByteBuffer byteBuffer = Y.getBuffer();
byte[] data = new byte[byteBuffer.remaining()];
byteBuffer.get(data);
Since I want to compare the image I get this way with another grayscale image (or at least with the result of the image processing), I have a question: is the grayscale image I get by extracting the Y plane nearly the same as an RGB image that was converted to grayscale? Or do I have to do some additional processing steps for that?
Yes, the data you get from the Y plane should be the same as if you went through an RGB image.
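If the goal is to look at that Y data as an actual grayscale image (for example as an OpenCV Mat, in keeping with the rest of this page), one thing to account for is the plane's row stride, which may be larger than the image width. A sketch that copies the Y plane row by row (variable names are illustrative):
Image.Plane yPlane = img.getPlanes()[0];
ByteBuffer yBuffer = yPlane.getBuffer();
int width = img.getWidth();
int height = img.getHeight();
int rowStride = yPlane.getRowStride();
byte[] packed = new byte[width * height];
byte[] row = new byte[rowStride];
for (int r = 0; r < height; r++) {
    // The last row may be shorter than rowStride
    int length = Math.min(rowStride, yBuffer.remaining());
    yBuffer.get(row, 0, length);
    System.arraycopy(row, 0, packed, r * width, width);
}
Mat gray = new Mat(height, width, CvType.CV_8UC1);
gray.put(0, 0, packed);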
No, I am using an IR sensor from which I am getting a YUV_420_888 image that is already greyscale. But to convert it into bytes I used the following code, which gave me an error. As per your answer, I took only the Y plane, and the result was a green screen.
ByteBuffer[] buffer = new ByteBuffer[1];
Image image = reader.acquireNextImage();
buffer[0] = image.getPlanes()[0].getBuffer().duplicate();
//buffer[1] = image.getPlanes()[1].getBuffer().duplicate();
int buffer0_size = buffer[0].remaining();
//int buffer1_size = buffer[1].remaining();
buffer[0].clear();
//buffer[1].clear();
byte[] buffer0_byte = new byte[buffer0_size];
//byte[] buffer1_byte = new byte[buffer1_size];
buffer[0].get(buffer0_byte, 0, buffer0_size);
//buffer[1].get(buffer1_byte, 0, buffer1_size);
byte[] byte2 = buffer0_byte;
//byte2=buffer0_byte;
//byte2[1]=buffer1_byte;
image.close();
mArrayImageBuffer.add(byte2);
After dequeuing, the bytes go to this function:
public static byte[] convertYUV420ToNV12(byte[] byteBuffers) {
    ByteArrayOutputStream outputStream = new ByteArrayOutputStream();
    try {
        outputStream.write(byteBuffers);
        //outputStream.write(byteBuffers[1]);
    } catch (IOException e) {
        e.printStackTrace();
    }
    // outputStream.write(buffer2_byte);
    byte[] rez = outputStream.toByteArray();
    return rez;
}

camera2 API captured image preview

The function onImageAvailable handles saving the image; how can it be used to set a preview for the user to review, typically in an ImageView?
@Override
public void onImageAvailable(ImageReader reader) {
    //mBackgroundHandler.post(new ImageSaver(reader.acquireNextImage(), mFile));
    Image image = reader.acquireNextImage();
    ByteBuffer buffer = image.getPlanes()[0].getBuffer();
    byte[] bytes = new byte[buffer.remaining()];
    buffer.get(bytes);
    Bitmap bmp = BitmapFactory.decodeByteArray(bytes, 0, bytes.length);
    setPreviewImage(bmp);
}
This does not work for me.
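One thing worth checking (an assumption, since the question does not show how the ImageReader was created): BitmapFactory.decodeByteArray only understands compressed data such as JPEG, so if the reader was opened with a raw format like YUV_420_888 it will return null. The Image should also be closed, or the reader runs out of buffers after a few frames. A defensive sketch:
Image image = reader.acquireNextImage();
try {
    if (image.getFormat() != ImageFormat.JPEG) {
        return; // raw formats cannot be decoded by BitmapFactory directly
    }
    ByteBuffer buffer = image.getPlanes()[0].getBuffer();
    byte[] bytes = new byte[buffer.remaining()];
    buffer.get(bytes);
    Bitmap bmp = BitmapFactory.decodeByteArray(bytes, 0, bytes.length);
    if (bmp != null) {
        setPreviewImage(bmp);
    }
} finally {
    image.close(); // always release the Image back to the reader
}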
