camera2 API captured image preview - android

The onImageAvailable callback handles saving the image. How can it also be used to set a preview for the user to review, typically in an ImageView?
@Override
public void onImageAvailable(ImageReader reader) {
    // mBackgroundHandler.post(new ImageSaver(reader.acquireNextImage(), mFile));
    Image image = reader.acquireNextImage();
    ByteBuffer buffer = image.getPlanes()[0].getBuffer();
    byte[] bytes = new byte[buffer.remaining()];
    buffer.get(bytes);
    Bitmap bmp = BitmapFactory.decodeByteArray(bytes, 0, bytes.length);
    setPreviewImage(bmp);
}
This does not work for me.
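One likely cause is that the ImageReader is not delivering JPEG data, or that the Bitmap is set from a background thread. A minimal sketch of the same callback with both points addressed, assuming the ImageReader was created with ImageFormat.JPEG and the listener lives in an Activity (so runOnUiThread is available):
@Override
public void onImageAvailable(ImageReader reader) {
    // Assumes the ImageReader was created with ImageFormat.JPEG,
    // so plane 0 contains the complete compressed image.
    Image image = reader.acquireNextImage();
    if (image == null) {
        return;
    }
    ByteBuffer buffer = image.getPlanes()[0].getBuffer();
    byte[] bytes = new byte[buffer.remaining()];
    buffer.get(bytes);
    image.close(); // release the buffer so the reader can deliver further images

    final Bitmap bmp = BitmapFactory.decodeByteArray(bytes, 0, bytes.length);
    // onImageAvailable usually runs on a background Handler, so hop to the UI thread
    // before touching any view.
    runOnUiThread(new Runnable() {
        @Override
        public void run() {
            setPreviewImage(bmp);
        }
    });
}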

Related

Not able to get actual uri of image clicked from camera

After implementing the Camera2 API for capturing an image, I receive the image in this listener:
private ImageReader.OnImageAvailableListener imageAvailableListener = new ImageReader.OnImageAvailableListener() {
    @Override
    public void onImageAvailable(ImageReader reader) {
        Image img = reader.acquireLatestImage();
        final Image.Plane[] planes = img.getPlanes();
        final ByteBuffer buffer = planes[0].getBuffer();
        byte[] byteArray = new byte[buffer.remaining()];
        buffer.get(byteArray);
        bmp = BitmapFactory.decodeByteArray(byteArray, 0, byteArray.length);
        uri = getImageUri(getContext(), bmp);
    }
};
I tried to convert the Image to a Bitmap in order to get the image URI using this method:
private Uri getImageUri(Context context, Bitmap inImage) {
    this.saveToInternalStorage(inImage);
    ByteArrayOutputStream bytes = new ByteArrayOutputStream();
    inImage.compress(Bitmap.CompressFormat.JPEG, 100, bytes);
    String path = MediaStore.Images.Media.insertImage(context.getContentResolver(), inImage, "CAPTURE" + captureCount, null);
    return Uri.parse(path);
}
But when I try to use the URI returned by this method to display the image in an ImageView, nothing shows up, even though I can see the captured image in the gallery.
The URI I get starts like external/**/<some number>.
What I am trying to do is get the URI of the captured image and use that URI to edit it.
If there is any other way to edit the image, for example using a Bitmap or some other form, please mention it.
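One common fix, assuming the string returned by insertImage is a valid content:// URI, is to use that URI directly rather than treating it as a file path. A minimal sketch, where imageView is a hypothetical target ImageView:
String path = MediaStore.Images.Media.insertImage(
        context.getContentResolver(), inImage, "CAPTURE" + captureCount, null);
if (path != null) {
    // insertImage returns a content:// URI string; ImageView can load it directly.
    Uri contentUri = Uri.parse(path);
    imageView.setImageURI(contentUri); // imageView is a hypothetical ImageView
}
For editing, the same contentUri can be reopened through context.getContentResolver().openInputStream(contentUri) to get the pixel data back as a Bitmap.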

How to display an `Image` in the `ImageView`?

I have an ImageView object acquired from the .xml file:
mCameraView = (ImageView) findViewById(R.id.camera_view);
I capture images from the camera many times per second. For each new image frame, the following method gets called:
@Override
public void onImageAvailable(ImageReader reader) {
    Image image = reader.acquireLatestImage();
    mCameraView.somehowDisplay(image); // HOW?
}
I want to place the image into the ImageView. How do I do that?
ByteBuffer buffer = image.getPlanes()[0].getBuffer();
byte[] bytes = new byte[buffer.remaining()];
buffer.get(bytes);
Bitmap myBitmap = BitmapFactory.decodeByteArray(bytes, 0, bytes.length, null);
image.close(); // release the Image so the ImageReader can deliver new frames
mCameraView.setImageBitmap(myBitmap);
This should work, provided the ImageReader was created with ImageFormat.JPEG so that plane 0 contains a complete compressed image.

How to convert an Image to Mat Android

I need to process images in my application. I get the images from an ImageReader:
reader.setOnImageAvailableListener(new ImageReader.OnImageAvailableListener() {
    @Override
    public void onImageAvailable(ImageReader reader) {
        Image mImage = reader.acquireNextImage();
        // mImage to Mat here
        mImage.close();
    }
}, null);
But now I need to convert those images to Mat.
I know that I could go through the Bitmap class, but I don't know how to convert an Image into a Bitmap either.
I think I found a possible answer.
I give my ImageReader a single-plane format like JPEG:
reader = ImageReader.newInstance(previewSize.getWidth(),previewSize.getHeight(), ImageFormat.JPEG, 2);
Then I do this:
ByteBuffer bb = image.getPlanes()[0].getBuffer();
byte[] buf = new byte[bb.remaining()];
bb.get(buf);
imageGrab = new Mat();
imageGrab.put(0,0,buf);
Try this
// image to byte array
ByteBuffer bb = image.getPlanes()[0].getBuffer();
byte[] data = new byte[bb.remaining()];
bb.get(data);
// byte array to mat
Mat mat = Imgcodecs.imdecode(new MatOfByte(data), Imgcodecs.CV_LOAD_IMAGE_UNCHANGED);
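Putting it together, a minimal sketch of the full listener, assuming the ImageReader was created with ImageFormat.JPEG and the OpenCV Java bindings (Imgcodecs, MatOfByte) are available; processFrame is a hypothetical method for further processing:
reader.setOnImageAvailableListener(new ImageReader.OnImageAvailableListener() {
    @Override
    public void onImageAvailable(ImageReader reader) {
        Image image = reader.acquireNextImage();
        if (image == null) {
            return;
        }
        // Plane 0 holds the full JPEG byte stream for ImageFormat.JPEG.
        ByteBuffer bb = image.getPlanes()[0].getBuffer();
        byte[] data = new byte[bb.remaining()];
        bb.get(data);
        image.close(); // always close, or the reader stops delivering images

        // Decode the JPEG bytes into an OpenCV Mat.
        // IMREAD_UNCHANGED is the newer name for CV_LOAD_IMAGE_UNCHANGED.
        Mat mat = Imgcodecs.imdecode(new MatOfByte(data), Imgcodecs.IMREAD_UNCHANGED);
        processFrame(mat); // hypothetical follow-up processing
    }
}, null);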

Camera.PreviewCallback equivalent in Camera2 API

Is there any equivalent of Camera.PreviewCallback in Camera2 (API 21 and up) that is better than mapping to a SurfaceTexture and pulling a Bitmap? I need to be able to pull preview data off the camera as YUV.
You can start from the Camera2Basic sample code from Google.
You need to add the surface of the ImageReader as a target to the preview capture request:
//set up a CaptureRequest.Builder with the output Surface
mPreviewRequestBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
mPreviewRequestBuilder.addTarget(surface);
mPreviewRequestBuilder.addTarget(mImageReader.getSurface());
After that, you can retrieve the image in the ImageReader.OnImageAvailableListener:
private final ImageReader.OnImageAvailableListener mOnImageAvailableListener = new ImageReader.OnImageAvailableListener() {
    @Override
    public void onImageAvailable(ImageReader reader) {
        Image image = null;
        try {
            image = reader.acquireLatestImage();
            if (image != null) {
                ByteBuffer buffer = image.getPlanes()[0].getBuffer();
                Bitmap bitmap = fromByteBuffer(buffer);
                image.close();
            }
        } catch (Exception e) {
            Log.w(LOG_TAG, e.getMessage());
        }
    }
};
To get a Bitmap from the ByteBuffer:
Bitmap fromByteBuffer(ByteBuffer buffer) {
    byte[] bytes = new byte[buffer.remaining()]; // use remaining(), not capacity(), to read only the valid data
    buffer.get(bytes, 0, bytes.length);
    return BitmapFactory.decodeByteArray(bytes, 0, bytes.length);
}
Yes, use the ImageReader class.
Create an ImageReader using the format ImageFormat.YUV_420_888 and your desired size (make sure you select a size that's supported by the camera device you're using).
Then use ImageReader.getSurface() for a Surface to provide to CameraDevice.createCaptureSession(), along with your other preview outputs, if any.
Finally, in your repeating capture request, add the ImageReader provided surface as a target before setting it as the repeating request in your capture session.
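A minimal sketch of those steps, assuming surface is the preview Surface (for example from a SurfaceTexture), mBackgroundHandler is a background Handler, and 1280x720 is a YUV_420_888 size the device reports as supported:
// Create an ImageReader for YUV preview frames (pick a supported size).
mImageReader = ImageReader.newInstance(1280, 720, ImageFormat.YUV_420_888, 2);
mImageReader.setOnImageAvailableListener(mOnImageAvailableListener, mBackgroundHandler);

try {
    // Include the reader's Surface among the session outputs.
    mCameraDevice.createCaptureSession(
            Arrays.asList(surface, mImageReader.getSurface()),
            new CameraCaptureSession.StateCallback() {
                @Override
                public void onConfigured(CameraCaptureSession session) {
                    try {
                        CaptureRequest.Builder builder =
                                mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
                        builder.addTarget(surface);
                        builder.addTarget(mImageReader.getSurface());
                        // Each preview frame is now also delivered to the ImageReader as YUV_420_888.
                        session.setRepeatingRequest(builder.build(), null, mBackgroundHandler);
                    } catch (CameraAccessException e) {
                        Log.e(LOG_TAG, "Failed to start preview", e);
                    }
                }

                @Override
                public void onConfigureFailed(CameraCaptureSession session) {
                    Log.e(LOG_TAG, "Capture session configuration failed");
                }
            },
            mBackgroundHandler);
} catch (CameraAccessException e) {
    Log.e(LOG_TAG, "Could not create capture session", e);
}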

Android byte[] to image in Camera.onPreviewFrame

When trying to convert the byte[] from Camera.onPreviewFrame to a Bitmap using BitmapFactory.decodeByteArray, I get the error SkImageDecoder::Factory returned null.
Following is my code:
public void onPreviewFrame(byte[] data, Camera camera) {
    Bitmap bmp = BitmapFactory.decodeByteArray(data, 0, data.length);
}
This has been hard to find! But since API 8 there is a YuvImage class in android.graphics. It's not an Image descendant, so all you can do with it is save it to JPEG, but you could write it to a memory stream and then load that into a Bitmap if that's what you need.
import android.graphics.YuvImage;
@Override
public void onPreviewFrame(byte[] data, Camera camera) {
    try {
        Camera.Parameters parameters = camera.getParameters();
        Size size = parameters.getPreviewSize();
        YuvImage image = new YuvImage(data, parameters.getPreviewFormat(),
                size.width, size.height, null);
        File file = new File(Environment.getExternalStorageDirectory()
                .getPath() + "/out.jpg");
        FileOutputStream filecon = new FileOutputStream(file);
        image.compressToJpeg(
                new Rect(0, 0, image.getWidth(), image.getHeight()), 90,
                filecon);
        filecon.close(); // flush and release the file
    } catch (IOException e) {
        Toast toast = Toast
                .makeText(getBaseContext(), e.getMessage(), Toast.LENGTH_LONG);
        toast.show();
    }
}
Since Android 3.0 you can use a TextureView and its SurfaceTexture to display the camera, and then use mTextureView.getBitmap() to retrieve a friendly RGB preview frame.
A very skeletal example of how to do this is given in the TextureView docs. Note that you'll have to set your application or activity to be hardware accelerated by putting android:hardwareAccelerated="true" in the manifest.
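For reference, a skeletal sketch along those lines might look like this (mTextureView is the TextureView from the layout; processFrame is a hypothetical per-frame handler):
mTextureView.setSurfaceTextureListener(new TextureView.SurfaceTextureListener() {
    @Override
    public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
        // Open the camera and start the preview on this SurfaceTexture here.
    }

    @Override
    public void onSurfaceTextureSizeChanged(SurfaceTexture surface, int width, int height) { }

    @Override
    public boolean onSurfaceTextureDestroyed(SurfaceTexture surface) {
        return true;
    }

    @Override
    public void onSurfaceTextureUpdated(SurfaceTexture surface) {
        // Called once per preview frame; getBitmap() returns an RGB copy of the frame.
        Bitmap frame = mTextureView.getBitmap();
        processFrame(frame); // hypothetical per-frame handler
    }
});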
I found the answer after a long time. Here it is...
Instead of using BitmapFactory, I used my own method to decode this byte[] data into a valid image format. To do that, you need to know which picture format the camera is using, by calling camera.getParameters().getPictureFormat(); this returns a constant defined in ImageFormat. Once you know the format, use the appropriate converter to decode the image.
In my case, the byte[] data was in the YUV format, so I looked for YUV to BMP conversion and that solved my problem.
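A minimal sketch of such a YUV-to-Bitmap conversion, assuming the preview format is NV21 (the common default) and going through the YuvImage class mentioned above:
// Convert an NV21 preview frame to a Bitmap by compressing it to JPEG in memory.
private Bitmap nv21ToBitmap(byte[] data, int width, int height) {
    YuvImage yuv = new YuvImage(data, ImageFormat.NV21, width, height, null);
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    yuv.compressToJpeg(new Rect(0, 0, width, height), 90, out);
    byte[] jpegBytes = out.toByteArray();
    return BitmapFactory.decodeByteArray(jpegBytes, 0, jpegBytes.length);
}
Called from onPreviewFrame with the width and height from camera.getParameters().getPreviewSize(), this produces a Bitmap that BitmapFactory.decodeByteArray alone cannot produce from raw YUV bytes.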
You can try this. This example sends camera frames to a server:
@Override
public void onPreviewFrame(byte[] data, Camera camera) {
    try {
        byte[] baos = convertYuvToJpeg(data, camera);
        StringBuilder dataBuilder = new StringBuilder();
        dataBuilder.append("data:image/jpeg;base64,").append(Base64.encodeToString(baos, Base64.DEFAULT));
        mSocket.emit("newFrame", dataBuilder.toString());
    } catch (Exception e) {
        Log.d("########", "ERROR");
    }
}
public byte[] convertYuvToJpeg(byte[] data, Camera camera) {
    YuvImage image = new YuvImage(data, ImageFormat.NV21,
            camera.getParameters().getPreviewSize().width, camera.getParameters().getPreviewSize().height, null);
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    int quality = 20; // set quality
    image.compressToJpeg(new Rect(0, 0, camera.getParameters().getPreviewSize().width, camera.getParameters().getPreviewSize().height), quality, baos); // this line reduces the image quality
    return baos.toByteArray();
}
