I am trying to build a camera app that takes in the camera preview, manipulates the pixels, and then displays the new image. I need the manipulation to happen in real time.
From what I have read online, and from questions here, you need to make a custom SurfaceView and manipulate the pixel array from the onPreviewFrame method. I have built a custom SurfaceView, have this method running, and have converted the YUV to RGB.
Now, my question is, how do I display this new pixel array on the screen in real time? Do I somehow return it in the onPreviewFrame method? Do I have to change the byte[] array? Do I take my new pixel array and display it using a Bitmap? Is there a way to get the byte[] array from the camera preview without even displaying the preview?
If someone could answer these questions, with code examples, that would be great! I am kind of new to Android, so I need the answers explained well enough for me to understand. Here is part of the code I have that runs the camera preview:
public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
// If your preview can change or rotate, take care of those events here.
// Make sure to stop the preview before resizing or reformatting it.
if (mHolder.getSurface() == null){
// preview surface does not exist
return;
}
// stop preview before making changes
try {
mCamera.stopPreview();
} catch (Exception e){
// ignore: tried to stop a non-existent preview
}
// start preview with new settings
try {
//parameters.setPreviewSize(w, h);
mCamera.setParameters(parameters);
mCamera.setPreviewDisplay(mHolder);
mCamera.setPreviewCallback(new PreviewCallback() {
public void onPreviewFrame(byte[] data, Camera camera)
{
System.out.println("onPreviewFrame");
//transforms NV21 pixel data into RGB pixels
decodeYUV420SP(pixels, data, previewSize.width, previewSize.height);
//Output the value of the top-left pixel in the preview to LogCat
Log.i("Pixels", "The top-left pixel has the following RGB (hexadecimal) value:"
+Integer.toHexString(pixels[0]));
}
});
mCamera.startPreview();
} catch (Exception e){
Log.d("CameraPreview", "Error starting camera preview: " + e.getMessage());
}
}
This gives me the RGB pixel array I want to display instead of the preview. How do I do this?
You cannot manipulate the surface that you connected to camera preview. The byte array you receive in onPreviewFrame() is just a copy of what the framework displays on the screen. Moreover, you will find that the two streams are asynchronous: you can slow down the callbacks (e.g. by adding some sleep() into your callback), but the preview surface will be updated nevertheless.
You can hide the preview SurfaceView by placing other views on top of it, or you can get rid of this view altogether by using setPreviewTexture() instead of setPreviewDisplay() (note: added in API level 11). Hiding the surface is not as easy as it may seem: the framework may pop it up to the top, and it requires careful synchronization of camera start/restart with layout changes.
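For example, a minimal sketch of the setPreviewTexture() route, assuming API 11+ and that the raw preview never needs to be rendered anywhere (previewCallback stands in for your existing callback; the texture name 10 is arbitrary because nothing ever samples it):

try {
    // Off-screen sink: the camera renders here instead of to a visible surface
    SurfaceTexture dummyTexture = new SurfaceTexture(10);
    mCamera.setPreviewTexture(dummyTexture);
    mCamera.setPreviewCallback(previewCallback); // onPreviewFrame still fires
    mCamera.startPreview();
} catch (IOException e) {
    Log.e("CameraPreview", "Could not attach dummy texture", e);
}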
Anyway, after you have the surface hidden, you can use the byte array received in onPreviewFrame() to generate an image and display it. You are free to manipulate the pixels to your liking. I believe that the optimal technique is to send the pixel data to OpenGL: you can use a shader to offload the YCrCb (NV21) to RGB conversion to the GPU.
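If you start with the simpler Bitmap route, here is a minimal sketch, assuming pixels is the int[] filled by decodeYUV420SP and mImageView is an ImageView you have placed on top of (or instead of) the preview surface (both names are illustrative):

// Sketch: push the converted RGB pixels into a Bitmap and show it.
// In production, allocate the Bitmap once and reuse it every frame.
Bitmap frame = Bitmap.createBitmap(previewSize.width, previewSize.height,
        Bitmap.Config.ARGB_8888);
frame.setPixels(pixels, 0, previewSize.width, 0, 0,
        previewSize.width, previewSize.height);
mImageView.setImageBitmap(frame); // must happen on the UI thread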
I am using the Camera2 API to create a camera component that can scan barcodes and can take pictures during scanning. It mostly works, but the preview flickers: previous frames, and sometimes green frames, interrupt the real-time preview.
My code is based on Google's Camera2Basic. I'm just adding one more ImageReader and its surface as a new output and target for CaptureRequest.Builder. One of the readers uses JPEG and the other YUV. Flickering disappears when I remove the JPEG reader's surface from outputs (not passing this into createCaptureSession).
There's quite a lot of code, so I created a gist: click. I tried to strip out all the irrelevant code.
Is the device you're testing on a LEGACY-level device?
If so, any captures targeting a JPEG output may be much slower, since they can run a precapture sequence and may briefly pause the preview as well.
But it should not cause green frames, unless there's a device-level bug.
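To check, you can query the camera's hardware level; a quick sketch, assuming you have a Context and the cameraId you opened:

try {
    CameraManager manager = (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
    CameraCharacteristics chars = manager.getCameraCharacteristics(cameraId);
    Integer level = chars.get(CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL);
    if (level != null && level == CameraMetadata.INFO_SUPPORTED_HARDWARE_LEVEL_LEGACY) {
        Log.i("CameraInfo", "LEGACY device: JPEG captures may briefly stall the preview");
    }
} catch (CameraAccessException e) {
    Log.e("CameraInfo", "Failed to query camera characteristics", e);
}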
If anyone ever struggles with this: there is a table in the docs showing that if 3 targets are specified, the YUV ImageReader can only use images with a maximum size equal to the preview size (at most 1920x1080). Reducing the size helped!
Yes, you can. Assuming that you configure your preview to feed the ImageReader with YUV frames (you could also put JPEG there, check it out), like so:
mImageReaderPreview = ImageReader.newInstance(mPreviewSize.getWidth(), mPreviewSize.getHeight(), ImageFormat.YUV_420_888, 1);
You can process those frames inside your OnImageAvailable listener:
@Override
public void onImageAvailable(ImageReader reader) {
    Image mImage = reader.acquireNextImage();
    if (mImage == null) {
        return;
    }
    try {
        // Do some custom processing like YUV to RGB conversion, cropping, etc.
        mFrameProcessor.setNextFrame(mImage);
        mImage.close();
    } catch (IllegalStateException e) {
        Log.e("TAG", e.getMessage());
    }
}
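For completeness, the reader still has to be wired into the session; roughly like this (a sketch; previewRequestBuilder and mBackgroundHandler are assumed to exist in your setup):

// Sketch: deliver frames to the listener on a background thread and make the
// repeating preview request target the reader's surface.
mImageReaderPreview.setOnImageAvailableListener(listener, mBackgroundHandler);
previewRequestBuilder.addTarget(mImageReaderPreview.getSurface());
// ...and remember to include mImageReaderPreview.getSurface() in the output
// list passed to createCaptureSession().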
I've run into an issue with slow focusing on the Nexus 6.
I'm developing a camera application, and I'm now using the camera2 API.
For the application's needs, we create a preview request with 2 surfaces:
- SurfaceView (viewfinder)
- YUV ImageReader surface (to use the data in histogram calculation)
And here is the critical point: if I add only the viewfinder surface, focusing works normally. But with both surfaces, focusing becomes very slow, with visible steps of lens movement!
The code is quite standard, written according to the Google documentation:
mImageReaderPreviewYUV = ImageReader.newInstance(previewWidth, previewHeight, ImageFormat.YUV_420_888, 2);
previewRequestBuilder = camDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
previewRequestBuilder.addTarget(getCameraSurface()); //Add surface of SurfaceView
previewRequestBuilder.addTarget(mImageReaderPreviewYUV.getSurface()); //Add the ImageReader's surface
mCaptureSession.setRepeatingRequest(previewRequestBuilder.build(), captureCallback, null);
Does the system logcat show any warnings about buffers not being available?
Is the preview frame rate slow, or is it smooth (~30fps) with focusing just working oddly?
If the former, you may not be returning Image objects to the ImageReader (by closing them once done with them) at 30 fps, so the camera device is starved for buffers to fill, and cannot maintain 30fps preview.
To test this, implement a minimal ImageReader.OnImageAvailableListener whose onImageAvailable(ImageReader reader) method just returns the image immediately:
public class TestImageListener implements ImageReader.OnImageAvailableListener {
    @Override
    public void onImageAvailable(ImageReader reader) {
        Image img = reader.acquireNextImage();
        img.close();
    }
}
...
mImageReaderPreviewYUV.setOnImageAvailableListener(new TestImageListener());
If this lets you get fluid preview, then your image processing is too slow.
As a solution, you should increase the number of buffers in your ImageReader, and then use reader.acquireLatestImage() to drop older buffers and only process the newest Image each time you calculate your histogram.
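A sketch of that pattern (computeHistogram() and mBackgroundHandler are placeholders for your own code):

// Sketch: keep more buffers and always process only the freshest frame.
mImageReaderPreviewYUV = ImageReader.newInstance(previewWidth, previewHeight,
        ImageFormat.YUV_420_888, 4); // more buffers than the original 2
mImageReaderPreviewYUV.setOnImageAvailableListener(new ImageReader.OnImageAvailableListener() {
    @Override
    public void onImageAvailable(ImageReader reader) {
        Image img = reader.acquireLatestImage(); // silently drops older queued frames
        if (img == null) return;
        try {
            computeHistogram(img); // placeholder for your processing
        } finally {
            img.close(); // always return the buffer to the camera
        }
    }
}, mBackgroundHandler);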
I had the same issues on the N6, and I think it works more smoothly now: add the ImageReader surface before the camera surface:
previewRequestBuilder = camDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
previewRequestBuilder.addTarget(mImageReaderPreviewYUV.getSurface()); //Add the ImageReader's surface
previewRequestBuilder.addTarget(getCameraSurface()); //Add surface of SurfaceView
I also tested my camera app on an N4 running 5.0.1, and both orderings work perfectly there.
I am developing an Android application in which the camera preview is shown at all times. When the user wants, he can take a picture of that preview with takePicture() and apply different filters to that picture.
The problem is that when I take a picture with the takePicture() method, the preview freezes for a while. I want this freeze-frame to remain for as long as the user needs to apply the filters, and then, when he chooses, return to the camera's preview mode immediately.
I am using the OpenCV library and specifically the JavaCameraView class to get the parameters of the camera and then take the pictures I want.
mCamera.takePicture(null, null, new Camera.PictureCallback() {
    @Override
    public void onPictureTaken(byte[] data, Camera myCamera) {
        if (data != null) {
            bm = BitmapFactory.decodeByteArray(data, 0, data.length);
            Log.i(TAG, "Photo characteristics: " + bm.getWidth() + "x" + bm.getHeight());
            mCamera.startPreview();
        }
    }
});
In the onPictureTaken() method I store the image into a Bitmap, and then draw it on a canvas at the camera's resolution. This is done in the CameraBridgeViewBase class.
Thank you in advance!
You can restart the preview immediately from the onPictureTaken() callback. The trick is to put an ImageView that displays the captured image on top of the SurfaceView where you display the live preview. Then it's enough to call imageView.setVisibility(View.GONE) and your preview is immediately live.
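A minimal sketch of the trick, assuming mImageView is an ImageView stacked above the preview SurfaceView in your layout (the view and its name are illustrative):

// Sketch: freeze-frame via an overlay ImageView while preview restarts underneath.
@Override
public void onPictureTaken(byte[] data, Camera camera) {
    Bitmap bm = BitmapFactory.decodeByteArray(data, 0, data.length);
    mImageView.setImageBitmap(bm);          // show the frozen frame...
    mImageView.setVisibility(View.VISIBLE); // ...on top of the SurfaceView
    camera.startPreview();                  // live preview resumes underneath
}

// Later, when the user has finished applying filters:
mImageView.setVisibility(View.GONE); // the live preview is revealed instantly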
I have a class that extends SurfaceView and implements Camera.PreviewCallback. In this class I set up the camera to preview into supplied buffers (setPreviewCallbackWithBuffer) and add a couple of buffers (addCallbackBuffer). After I call startPreview, the onPreviewFrame callback is successfully called.
In onPreviewFrame I hand the work over to a different thread that does some processing and eventually visualizes the data.
However, I noticed that occasionally onPreviewFrame stops being called. While exploring this issue, I observed that it is most likely to occur when processing 1280x720 frames (which the camera supports), but it also happens at lower resolutions, just less frequently.
I eventually stripped the code down to an almost empty onPreviewFrame (it only logs receiving the call and calls addCallbackBuffer again; nothing more). The processing thread is not started. The same behavior can be observed.
In this case, with 3 buffers for the preview at 1280x720, this runs for about 20 minutes and then onPreviewFrame is no longer called. Logcat doesn't show any other issues.
The 20 minutes varies; sometimes it's 5 minutes, sometimes under a minute.
Using logging I verified that the buffer sequence is rather clean (buffer1, buffer2, buffer3, buffer1, buffer2, ...) on each onPreviewFrame invocation.
The device I'm working on is a Galaxy Tab 2 (running Android 4.2.2 CyanogenMod), but I've also seen this on other devices using stock ROMs, so I doubt it is due to the ROM.
So I guess it comes to these questions:
how many buffers do I need to supply (given a certain resolution and a certain processing time)?
why does onPreviewFrame stop getting called (there are free buffers at that time)?
The relevant code boils down to this:
private void startPreview()
{
Camera.Parameters parameters = mCamera.getParameters();
int width = 1280;
int height = 720;
Resolution bestCameraResolution = getBestCameraResolutionMatch(width, height);
width = bestCameraResolution.width();
height = bestCameraResolution.height();
parameters.setPreviewSize(width, height);
mCamera.setParameters(parameters);
try {
mCamera.setPreviewDisplay(mSurfaceHolder);
} catch (IOException e) {
Log.d(TAG, "Error setting camera preview: " + e.getMessage());
}
// calculate imageBufferSize here...
mCamera.addCallbackBuffer(new byte[imageBufferSize]);
mCamera.addCallbackBuffer(new byte[imageBufferSize]);
mCamera.addCallbackBuffer(new byte[imageBufferSize]);
mCamera.setPreviewCallbackWithBuffer(this);
mCamera.startPreview();
Log.d(TAG, "Start preview with " + width + "x" + height);
}
@Override
public void onPreviewFrame(byte[] frameData, Camera camera)
{
// Log to a separate tag to allow better filtering
Log.d(TAG + "Frame", "preview frame on buffer " + frameData.toString());
// Give used buffer back for future grabbing
if (mCamera != null) {
mCamera.addCallbackBuffer(frameData);
}
}
If the code is updated to use a real SurfaceView (not a dummy one) and we render our processing results to a second SurfaceView that lies on top, the code works.
We have to be a bit more careful about synchronization, I guess, but it works on all devices we have tested it on so far.
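For reference, the rendering side can be as simple as locking the top SurfaceView's canvas from the worker thread; a sketch, where mOverlayHolder and processedBitmap are assumptions:

// Sketch: draw processed frames onto the top SurfaceView from the worker thread.
Canvas canvas = mOverlayHolder.lockCanvas();
if (canvas != null) {
    try {
        canvas.drawBitmap(processedBitmap, 0, 0, null);
    } finally {
        mOverlayHolder.unlockCanvasAndPost(canvas); // always post, even on error
    }
}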
I am creating an Android app to do some image processing techniques with the camera and it needs to be fast. This is the pseudo-code of how the entire system works:
1. loop while not finished
1.1 get image frame
1.2 process image for object detection
2. end loop
I actually have questions on the basics of the Camera class:
Is previewing the image perceived by the camera faster than having no preview at all? By the former I mean using a SurfaceView to preview the image.
Can the image data array be obtained from the takePicture() method without showing the preview?
My real question is, what is the best way to obtain the image data (say, byte[] array) quickly and iteratively after processing the image (as stated on top)?
I planned to use takePicture() method to get the image data, but I need your opinion if this is the only way or if there other better ways.
You can set up a SurfaceView as the Camera's preview display and get the data of every preview frame using a PreviewCallback. This is better than using takePicture() if you don't need the high resolution that takePicture() captures. In other words, if you want to capture images of lower quality at a faster rate, use PreviewCallback; if you want to capture images of higher quality at a very slow rate, use takePicture().
As for your other question, I don't think you can take pictures without using a preview display, but I could be wrong.
class MainActivity extends Activity implements Camera.PreviewCallback, SurfaceHolder.Callback {
...
public void surfaceCreated(SurfaceHolder holder) {
camera = Camera.open();
camera.setPreviewCallback(this);
...
}
public void onPreviewFrame(byte[] data, Camera camera) {
// image data contained in data... do as you wish
}
}
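Since speed is the goal, it may also be worth using setPreviewCallbackWithBuffer(), which avoids allocating a fresh byte[] for every frame. A sketch of the change (the buffer-size formula is the standard one based on the preview format's bits per pixel):

// Sketch: buffered variant to reduce GC pressure at high frame rates.
Camera.Parameters p = camera.getParameters();
Camera.Size s = p.getPreviewSize();
int bufSize = s.width * s.height *
        ImageFormat.getBitsPerPixel(p.getPreviewFormat()) / 8;
camera.addCallbackBuffer(new byte[bufSize]);
camera.setPreviewCallbackWithBuffer(this);

// ...and in onPreviewFrame, hand the buffer back when you are done with it:
// camera.addCallbackBuffer(data);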