I am currently working on my Final Year Project which is building an Android Camera App.
I am confused about the Camera interfaces: PictureCallback, ShutterCallback and PreviewCallback.
I know that when I use Camera.takePicture(), PictureCallback will be called. But what about ShutterCallback and PreviewCallback? When do I use them?
I have searched through the internet, but I still don't have a clear idea of how to use them. Can anyone provide a clear explanation of these three callbacks?
ShutterCallback is used to provide feedback to the user that the camera is working, for example to play a sound, animate your shutter button, etc.
PreviewCallback gives you a byte array containing the current camera preview frame as it is captured by the camera sensor. Use this if you want to process or overlay the frames in some way, for example to display rectangles over detected faces.
PictureCallback gives you a byte array that represents the captured picture. The format of the picture (JPEG, raw or postview) depends on which argument position you passed the callback to in takePicture().
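To make that concrete, here is a minimal sketch (not from the original answer) of how the three callbacks are typically wired up; surfaceHolder stands for the holder of your own SurfaceView, and error handling is trimmed:

private Camera camera;

private void startCameraPreview(SurfaceHolder surfaceHolder) throws IOException {
    camera = Camera.open();
    camera.setPreviewDisplay(surfaceHolder);

    // PreviewCallback: invoked once for every preview frame (NV21 byte[] by default).
    camera.setPreviewCallback(new Camera.PreviewCallback() {
        public void onPreviewFrame(byte[] data, Camera camera) {
            // analyze or overlay the live frame here, e.g. draw rectangles over detected faces
        }
    });
    camera.startPreview();
}

// Call this e.g. from your shutter button's click listener.
private void takePicture() {
    camera.takePicture(
            new Camera.ShutterCallback() {
                public void onShutter() {
                    // play a shutter sound or animate the shutter button
                }
            },
            null, // raw PictureCallback (second argument) not used here
            new Camera.PictureCallback() {
                public void onPictureTaken(byte[] data, Camera camera) {
                    // data holds the compressed JPEG of the captured picture
                }
            });
}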
For further reference, you should read the documentation for Camera and its related interfaces and classes.
I think Google's documentation gives enough explanation for your question:
Camera.ShutterCallback
public abstract void onShutter ()
Called as near as possible to the moment when a photo is captured from
the sensor. This is a good opportunity to play a shutter sound or give
other feedback of camera operation. This may be some time after the
photo was triggered, but some time before the actual data is
available.
Camera.PreviewCallback
public abstract void onPreviewFrame (byte[] data, Camera camera)
Called as preview frames are displayed. This callback is invoked on
the event thread open(int) was called from.
If using the YV12 format, refer to the equations in
setPreviewFormat(int) for the arrangement of the pixel data in the
preview callback buffers.
You can use ShutterCallback when a picture is taken and PreviewCallback to capture frames from the camera. Keep in mind that PreviewCallback has some problems; see the link.
References:
http://developer.android.com/reference/android/hardware/Camera.ShutterCallback.html
http://developer.android.com/reference/android/hardware/Camera.PreviewCallback.html
Related
I need to show the camera preview in a SurfaceView with a delay of about 5 seconds.
So, I think I need to somehow capture frames from the camera before they go to the SurfaceView and put them into a buffer, and then, when the buffer is full, take the stored frames from the buffer and show them on the SurfaceView.
But I don't know how to get the frames before they are drawn on the SurfaceView.
I only know how to get frames from the PreviewCallback's onPreviewFrame(byte[] data, Camera camera) method:
PreviewCallback previewCb = new PreviewCallback() {
    public void onPreviewFrame(byte[] data, Camera camera) {
        // byte[] data is the frame
    }
};
But I don't know how to get the frames from the camera directly, store them in a buffer, and then push the buffered frames to the SurfaceView.
Any help is very appreciated.
Finally, I ended up working with OpenCV. There is a nice example in the samples folder named "tutorial-1-camerapreview".
There I have the method:
public Mat onCameraFrame(CvCameraViewFrame inputFrame) {
    return inputFrame.rgba();
}
So I can do whatever I want with the frames inside this method and then just return them.
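The delay itself could be built on top of onCameraFrame() by buffering cloned frames in a FIFO and returning the oldest one. This is only a sketch: the frame count assumes roughly 30 fps, and full-resolution RGBA frames consume memory very quickly, so in practice you would downscale the frames or shorten the delay:

private final java.util.ArrayDeque<Mat> frameBuffer = new java.util.ArrayDeque<Mat>();
private static final int DELAY_FRAMES = 150; // ~5 seconds at ~30 fps (an assumption)

public Mat onCameraFrame(CvCameraViewFrame inputFrame) {
    Mat rgba = inputFrame.rgba();
    frameBuffer.addLast(rgba.clone()); // clone, because OpenCV reuses the frame's native buffer
    if (frameBuffer.size() <= DELAY_FRAMES) {
        return rgba; // show the live image until enough history has accumulated
    }
    // The oldest buffered frame is roughly 5 seconds old; release() frames you
    // no longer need, or native memory will leak.
    return frameBuffer.pollFirst();
}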
Also, this may be useful to someone:
I found a nice example of how to get frames from onPreviewFrame, convert them from YUV to RGB using JNI (because it is faster, I think), and then draw the frames onto a custom SurfaceView.
Example 1.
Example 2.
Hope it will be useful to someone.
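If you would rather avoid JNI, a Java-only alternative (a sketch; width and height must match the preview size you configured) is to let YuvImage convert the NV21 frame to a JPEG and decode that:

// Convert one NV21 preview frame (the default preview format) to a Bitmap without native code.
public static Bitmap previewFrameToBitmap(byte[] data, int width, int height) {
    YuvImage yuvImage = new YuvImage(data, ImageFormat.NV21, width, height, null);
    ByteArrayOutputStream jpegStream = new ByteArrayOutputStream();
    yuvImage.compressToJpeg(new Rect(0, 0, width, height), 90, jpegStream);
    byte[] jpegBytes = jpegStream.toByteArray();
    return BitmapFactory.decodeByteArray(jpegBytes, 0, jpegBytes.length);
}

This is slower than the JNI versions in the linked examples, but it keeps everything in Java.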
I can't understand how to correctly display filtered video from the camera on Android...
I wrote it for SDK 8, so I used the scheme below:
Camera.setPreviewDisplay(null); // a null surface holder, to indicate that I don't want to see the raw camera preview
Camera.setPreviewCallbackWithBuffer() + Camera.addCallbackBuffer() // to get the camera data, modify it and draw it on my GLSurfaceView
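Spelled out, that setup looks roughly like this (a sketch only; the buffer size assumes the default NV21 preview format):

private void startFilteredPreview(Camera camera) {
    try {
        camera.setPreviewDisplay(null); // no visible raw preview (this is the part that misbehaves, see below)
    } catch (IOException ignored) {
    }

    Camera.Parameters params = camera.getParameters();
    Camera.Size previewSize = params.getPreviewSize();
    int bufferSize = previewSize.width * previewSize.height
            * ImageFormat.getBitsPerPixel(ImageFormat.NV21) / 8;

    camera.addCallbackBuffer(new byte[bufferSize]);
    camera.setPreviewCallbackWithBuffer(new Camera.PreviewCallback() {
        public void onPreviewFrame(byte[] data, Camera camera) {
            // filter 'data' here and render the result on the GLSurfaceView
            camera.addCallbackBuffer(data); // give the buffer back so the next frame can be delivered
        }
    });
    camera.startPreview();
}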
This scheme works wonderfully on Android 2.2.*... and I was happy, until I tried the application on 4.* =) My callback for receiving frame data is not called at all!
According to the documentation, I shouldn't pass null to setPreviewDisplay()... without a surface instance the video stream will not run... but if I give it a surface, it starts drawing the raw camera preview on that surface...
The question is: how can I correctly draw the filtered camera video myself?!
I have an app in which the client uses the camera to take a picture. The preview of the image is being shown in the tablet using a SurfaceView, before the person hits my "click" button. When the person hits the click button, the method onPictureTaken is called, and, in that method, I save the image and also call the camera.stopPreview() method (so the user can see the picture that was taken).
There is an issue, however... If the user is moving the tablet around at the moment the picture is taken, the still picture shown after the stopPreview method is called DOES NOT correspond to the one that I get in the byte array of the onPictureTaken method. There is a delay of some milliseconds there, which makes the difference stand out when the user is moving the tablet around just before the picture is taken (I know that 99% of people will not move the tablet around while taking a picture, but my client actually noticed this issue and wants it fixed...). I have tried to move the save operation to a separate thread, as shown below, so the onPictureTaken method can execute as fast as possible. Still, it had no effect at all...
private PictureCallback pictureCallback = new PictureCallback() {
    public void onPictureTaken(byte[] data, Camera camera) {
        camera.stopPreview();
        reference = data;
        new PictureCallbackHeavy().execute();
    }
};
I have also tried to call camera.stopPreview() just BEFORE I call the takePicture method (and not inside the onPictureTaken() method). But the result is the same.
What can I do to sync the stopPreview method so I can show EXACTLY the image that was taken and that is in the byte array of the onPictureTaken() callback?
Thank you in advance!! =)
Unfortunately, you can't acquire a reasonably good still image just by calling stopPreview(), because quite some time can pass between the moment the picture is taken and the moment onPictureTaken() is called. It works like this:
The camera actually takes the picture (that's what you want to preview)
onShutter() is called
onPictureTaken() for the raw image data is called (on some devices)
onPictureTaken() for a scaled preview image is called (on some devices)
onPictureTaken() for the final compressed image data is called (the one we are talking about here)
So you have to convert the byte[] data in your onPictureTaken() callback into a Bitmap and map that Bitmap onto an ImageView that you should position above your SurfaceView to show the still preview image.
The code will probably look something like this:
public void onPictureTaken(byte[] data, Camera camera) {
    camera.stopPreview();
    final Bitmap image = BitmapFactory.decodeByteArray(data, 0, data.length);
    surfaceView.setVisibility(SurfaceView.GONE);  // hide the live preview
    imageView.setVisibility(ImageView.VISIBLE);   // show the still image instead
    imageView.setImageBitmap(image);
    reference = data;
    new PictureCallbackHeavy().execute();
}
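If decoding the full-resolution JPEG on the UI thread turns out to be too slow, or the Bitmap too large to hold in memory, a possible refinement (my addition, not part of the original answer) is to downsample while decoding:

BitmapFactory.Options options = new BitmapFactory.Options();
options.inSampleSize = 4; // decode at 1/4 of the width and height; tune this to your screen size
final Bitmap image = BitmapFactory.decodeByteArray(data, 0, data.length, options);
imageView.setImageBitmap(image);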
I am creating an Android app to do some image processing techniques with the camera and it needs to be fast. This is the pseudo-code of how the entire system works:
1. loop while not finished
1.1 get image frame
1.2 process image for object detection
2. end loop
I actually have questions on the basics of the Camera class:
Is obtaining frames while a preview is shown faster than with no preview at all? By preview I mean using a SurfaceView to display the camera image.
Can the image data array be obtained from the takePicture() method without showing a preview?
My real question is: what is the best way to obtain the image data (say, a byte[] array) quickly and repeatedly for processing, as outlined above?
I planned to use the takePicture() method to get the image data, but I need your opinion on whether this is the only way or whether there are better ways.
You can set up a SurfaceView as the Camera's preview display and get the data of every preview frame using the PreviewCallback. This is better than using takePicture if you don't need the high resolution that takePicture captures. In other words, if you want to capture images of lower quality at a faster rate, use PreviewCallback; if you want to capture images of higher quality at a very slow rate, use takePicture.
As for your questions, I don't think you can take pictures without using a preview display, but I could be wrong.
class MainActivity extends Activity implements Camera.PreviewCallback, SurfaceHolder.Callback {
    ...
    public void surfaceCreated(SurfaceHolder holder) {
        camera = Camera.open();
        camera.setPreviewCallback(this);
        try {
            camera.setPreviewDisplay(holder); // attach the preview to the SurfaceView
        } catch (IOException e) {
            // handle the error
        }
        camera.startPreview();                // frames only start arriving after this call
        ...
    }

    public void onPreviewFrame(byte[] data, Camera camera) {
        // image data (NV21 by default) contained in data... do as you wish
    }
}
I am doing some image processing on preview frames captured by the camera. It is a CPU-consuming task, and I have to stop the preview to make it faster. Before a new frame is processed I call Camera.stopPreview(), and afterwards Camera.startPreview().
However, I would like the last captured frame to stay displayed on the SurfaceView after stopping the preview. This works 'out of the box' on 2.3 devices; however, the SurfaceView turns black after calling Camera.stopPreview() on older SDK versions. Does anyone know what has changed and what to do?
Yes, this was an improvement introduced in 2.3.
I had this problem on 2.2 as well; there was no way to work on a preview image, despite the fact that this was theoretically possible according to the API. To solve it I had to actually take a picture using Camera.takePicture(null, null, Camera.PictureCallback myCallback) (see info here) and implement a callback to handle the taken picture. The instance of the class that implements this callback is the parameter to pass to Camera.takePicture(), and the callback method itself looks like this:
public void onPictureTaken(byte[] JPEGData, Camera camera) {
    final Bitmap bitmap = createBitmapFromView(JPEGData);
    // do something with the Bitmap
}
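createBitmapFromView() is the answerer's own helper rather than an Android API; presumably it just decodes the JPEG bytes, roughly like this:

private Bitmap createBitmapFromView(byte[] jpegData) {
    // assumption: simply decode the compressed JPEG delivered to onPictureTaken()
    return BitmapFactory.decodeByteArray(jpegData, 0, jpegData.length);
}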
Doing it this way prevents the picture from being saved to external storage alongside the regular pictures taken with the camera application. Should you need to persist the Bitmap, you'll have to do it explicitly. But it doesn't prevent the camera's shutter sound from being played.
Camera.takePicture() has to be called while the preview is running. stopPreview() can be called right after.
One thing to be careful with /!\:
Camera.takePicture() is not reentrant (at all). The callback must have returned before any subsequent call to Camera.takePicture(). This was freezing my phone; I had to shut it down and restart it before it was usable again. As the action was triggered by a button on my side, I had to guard it with a boolean:
if (!mPictureTaken) {
    mPictureTaken = true; // absolutely NOT reentrant; any double click freezes the phone otherwise
    mCameraView.takePicture(callback);
}
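Putting it together, here is a sketch of the guarded call; mPictureTaken and mCameraView mirror the snippet above, and the key point is that the flag is cleared only once the callback has done its work:

private volatile boolean mPictureTaken = false;

private final Camera.PictureCallback callback = new Camera.PictureCallback() {
    public void onPictureTaken(byte[] jpegData, Camera camera) {
        Bitmap bitmap = BitmapFactory.decodeByteArray(jpegData, 0, jpegData.length);
        // ... do something with the bitmap ...
        camera.startPreview();  // the preview stops after takePicture(); restart it if you still need it
        mPictureTaken = false;  // only now is it safe to allow the next takePicture()
    }
};

public void onShutterButtonClicked() {
    if (!mPictureTaken) {
        mPictureTaken = true; // block double clicks; takePicture() is not reentrant
        mCameraView.takePicture(callback);
    }
}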