How to capture two consecutive frames from the camera on Android

I'm trying to implement optical flow on an Android device.
My problem is getting two consecutive frames from the camera.
This is the code to get ONE frame:
mCamera.setPreviewCallback(new PreviewCallback() {
    public void onPreviewFrame(byte[] data, Camera camera) {
        synchronized (SampleViewBase.this) {
            mFrame2 = data;
            SampleViewBase.this.notify();
        }
    }
});

Can't you then do something like:
private byte[] currFrame;
private byte[] prevFrame;

private void copyFrame(byte[] a) {
    if (a != null) prevFrame = a;
}

mCamera.setPreviewCallback(new PreviewCallback() {
    public void onPreviewFrame(byte[] data, Camera camera) {
        synchronized (SampleViewBase.this) {
            copyFrame(currFrame);
            currFrame = data;
            SampleViewBase.this.notify();
        }
    }
});
I'm not sure that's exactly the right syntax, but the idea is simply to copy currFrame into prevFrame before assigning the new data to it. Alternatively, you could use OpenCV's VideoCapture class to obtain the frames already in Mat format. I'm not sure whether that class is still available in the latest release, but in my experience with OpenCV 2.3 it was much faster for grabbing camera frames than the Android camera API.
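A minimal sketch of that two-frame buffer, assuming the same SampleViewBase wrapper as in the question. The clone() is a defensive copy in case the preview buffer gets reused (for example if you later switch to setPreviewCallbackWithBuffer()):
private byte[] prevFrame;
private byte[] currFrame;

mCamera.setPreviewCallback(new PreviewCallback() {
    public void onPreviewFrame(byte[] data, Camera camera) {
        synchronized (SampleViewBase.this) {
            // Shift the last frame into prevFrame, then store a copy of the new one.
            prevFrame = currFrame;
            currFrame = data.clone();
            SampleViewBase.this.notify();
        }
    }
});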

Related

How to take two consecutive pictures from camera?

When I tried to take a single picture from the camera, it worked fine. Now I have added a small modification to my code and want to take two consecutive pictures from the camera.
public void takePicture(final boolean isWithFlash) {
    Camera.PictureCallback mCall = new Camera.PictureCallback() {
        @Override
        public void onPictureTaken(byte[] data, Camera camera) {
            Bitmap bmp = BitmapFactory.decodeByteArray(data, 0, data.length);
            if (isWithFlash) {
                savePhotoToDirectory(data, captureImageFileName);
                mCamera.stopPreview();
                mCamera.release();
                mCamera = null;
            } else {
                savePhotoToDirectory(data, captureImageFileName);
                takePicture(true);
            }
        }
    };
    if (mCamera != null) {
        if (!isWithFlash) {
            Parameters param = mCamera.getParameters();
            mCamera.takePicture(null, null, mCall);
        } else {
            Parameters param = mCamera.getParameters();
            param.setFlashMode(Parameters.FLASH_MODE_TORCH);
            mCamera.setParameters(param);
            mCamera.takePicture(null, null, mCall);
        }
    } else {
        Log.d("MYLOG", "Camera is null");
    }
}
What I'm trying to do is take a picture without flash and then take another picture with flash right after. However, when I try my code, it only takes the first photo; onPictureTaken() is never called for the second one.
What am I doing wrong here? Or is there a better way to take two consecutive pictures?
Any comments would be really appreciated!
You don't need to call mCamera.stopPreview() after the second call, but you do need to call mCamera.startPreview() after the first one. I would also introduce some delay between the two calls to takePicture(), e.g. by using View.post() to take the second picture. Maybe that post is not necessary, or the delay it causes is too long for your purposes - that's for you to decide.
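A sketch of that ordering inside the callback from the question (not the poster's exact code; previewView stands in for whatever View hosts your preview, and post() vs. postDelayed() is the delay mentioned above):
@Override
public void onPictureTaken(byte[] data, Camera camera) {
    savePhotoToDirectory(data, captureImageFileName);
    if (!isWithFlash) {
        mCamera.startPreview();               // the preview stops after takePicture()
        previewView.post(new Runnable() {     // or postDelayed(..., 300) for a longer pause
            @Override
            public void run() {
                takePicture(true);            // second shot, with flash
            }
        });
    } else {
        mCamera.stopPreview();
        mCamera.release();
        mCamera = null;
    }
}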

Zbar scanner and autofocus

I have used zbar scanner for android and it captures the barcodes quite easily.
But the problem is that on phones which have autofocus, it captures the barcode before the camera has focused, so the code is not detected correctly.
If only it waited a few milliseconds more, it could capture a clearer image and thereby avoid showing the "not found" page.
How can I solve this problem?
Is there a provision to delay the focus on the barcode?
Maybe a delay in capturing the image?
Are you talking about the example code, CameraTestActivity.java?
Implement a counter that counts consecutive identical scanning results. If the result stays the same (e.g. 10 times in a row), you can assume it is reliable.
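One way to implement that counter (the names and the threshold of 10 are illustrative, not from the ZBar sample):
private static final int REQUIRED_MATCHES = 10;
private String lastResult = null;
private int matchCount = 0;

// Returns true once the same barcode text has been scanned REQUIRED_MATCHES times in a row.
private boolean isReliable(String scannedText) {
    if (scannedText.equals(lastResult)) {
        matchCount++;
    } else {
        lastResult = scannedText;
        matchCount = 1;
    }
    return matchCount >= REQUIRED_MATCHES;
}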
I really like @Juuso_Ohtonen's reply and actually just used it in my own reader. However, if you want an autofocus delay, you can create a Camera.AutoFocusCallback object and implement its onAutoFocus method with a postDelayed() call. This object is then passed to your camera's autoFocus() method.
// Mimic continuous auto-focusing
Camera.AutoFocusCallback autoFocusCB = new Camera.AutoFocusCallback() {
    public void onAutoFocus(boolean success, Camera camera) {
        autoFocusHandler.postDelayed(doAutoFocus, 1000);
    }
};
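The autoFocusHandler and doAutoFocus referenced above come from the ZBar sample; they look roughly like this (previewing is a boolean flag you set when the preview starts and stops):
private Handler autoFocusHandler = new Handler();

private Runnable doAutoFocus = new Runnable() {
    public void run() {
        if (previewing)
            mCamera.autoFocus(autoFocusCB);  // kick off the next focus cycle
    }
};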
This callback is used in the class that extends SurfaceView, which also implements surfaceChanged():
public CameraPreview(Context context, Camera camera,
                     PreviewCallback previewCb,
                     AutoFocusCallback autoFocusCb) {
    super(context);
    mCamera = camera;
    previewCallback = previewCb;
    autoFocusCallback = autoFocusCb;

    // Install a SurfaceHolder.Callback so we get notified when the
    // underlying surface is created and destroyed.
    mHolder = getHolder();
    mHolder.addCallback(this);

    // deprecated setting, but required on Android versions prior to 3.0
    mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
}
public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
    /*
     * If your preview can change or rotate, take care of those events here.
     * Make sure to stop the preview before resizing or reformatting it.
     */
    if (mHolder.getSurface() == null) {
        // preview surface does not exist
        return;
    }

    // stop preview before making changes
    try {
        mCamera.stopPreview();
    } catch (Exception e) {
        // ignore: tried to stop a non-existent preview
    }

    try {
        mCamera.setPreviewDisplay(mHolder);
        mCamera.setPreviewCallback(previewCallback);
        mCamera.startPreview();
        mCamera.autoFocus(autoFocusCallback);
    } catch (Exception e) {
        Log.d("DBG", "Error starting camera preview: " + e.getMessage());
    }
}

Recreating the camera object after error 100 (camera server died)

I have a classic Android app with a camera preview (a common implementation found in many tutorials [Marakana etc.]) that is supposed to take a picture at a given time interval. Threading and killing threads is done, and errors such as "method called after release" are handled. But sometimes the well-known error 100 occurs. I accepted that it happens and tried to handle it too: I implemented ErrorCallback and its onError method, where the current camera object is released and a new one instantiated, as described in the official documentation.
But (to no surprise) it is not enough. The new camera is perhaps wrongly allocated, because the message "CameraDemo has exited unexpectedly" now appears.
I've read many docs and examples hoping that the proper procedure would be explained somewhere, but apparently nobody else has this problem. So I would like to ask: what else should I do besides releasing and creating a new camera? Here is the code:
ErrorCallback CEC = new ErrorCallback() {
    public void onError(int error, Camera camera) {
        Log.d("CameraDemo", "camera error detected");
        if (error == Camera.CAMERA_ERROR_SERVER_DIED) {
            Log.d("CameraDemo", "attempting to reinstantiate new camera");
            camera.stopPreview();
            camera.setPreviewCallback(null);
            camera.release(); // written in documentation...
            camera = null;
            camera = Camera.open();
        }
    }
};
In short: if I release and recreate the camera in the onError callback, a RuntimeException "Method called after release (takePicture)" is raised. So should I somehow assign the surface holder to the camera again, or recreate the surface holder too?
It would be enough to point me, e.g., to some forum where this is described or solved. Thanks for any help.
In my app I handle the camera like this:
public void onResume() {
    super.onResume();
    if (mCamera == null)
        mCamera = getCameraInstance();
}

public static Camera getCameraInstance() {
    mCamera = null;
    try {
        mCamera = Camera.open();
        Parameters parameters = mCamera.getParameters();
        mCamera.cancelAutoFocus();
        mCamera.setPreviewCallback(yourPreviewCb);
        mCamera.startPreview();
        mCamera.setParameters(parameters);
        mCamera.autoFocus(yourAutoFocusCB);
    } catch (Exception e) {
        //TODO
    }
    return mCamera;
}
The mCamera = null in getCameraInstance() is just to make sure no camera is still running. I think you need to recreate the complete camera setup, not just open it with camera.open(). Do this in onResume() or in the error callback, depending on your needs.
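To answer the surface-holder part of the question: after reopening the camera you do have to hand it the existing SurfaceHolder again and restart the preview before takePicture() will work. A minimal sketch, assuming the holder and preview callback from your preview class are still valid:
private void reinitCamera(SurfaceHolder holder) {
    mCamera = Camera.open();
    try {
        mCamera.setPreviewDisplay(holder);        // re-attach the existing surface
    } catch (IOException e) {
        Log.d("CameraDemo", "setPreviewDisplay failed: " + e.getMessage());
    }
    mCamera.setPreviewCallback(previewCallback);  // your existing callback, if any
    mCamera.startPreview();                       // preview must be running again
    // only now is it safe to call takePicture() again
}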
This is how I fixed it; here is a code sample, I think you get the idea:
private Camera camera;

// code...

public Camera getCameraInstance() {
    Camera camera = Camera.open();
    // code...
    camera.setErrorCallback(new ErrorCallback() {
        @Override
        public void onError(int error, Camera camera) {
            if (error == Camera.CAMERA_ERROR_SERVER_DIED) {
                releaseCamera();
                startCamera();
            }
        }
    });
    return camera;
}

protected void startCamera() {
    if (getCamera() == null)
        setCamera(getCameraInstance());
    refreshCamera();
}

protected void releaseCamera() {
    if (getCamera() != null) {
        getCamera().release();
        setCamera(null);
    }
}

public Camera getCamera() {
    return camera;
}

public void setCamera(Camera camera) {
    this.camera = camera;
}

Processing camera frames on Android

I want to use my Android phone to process images, for example apply some operation to each frame and show it with the change (show the image in black/white, grayscale, sepia, etc.).
This is my code:
public class CameraPreview extends SurfaceView implements SurfaceHolder.Callback, PreviewCallback {
    SurfaceHolder mHolder;
    Camera mCamera;
    private Parameters parameters;
    private Size previewSize;
    private int[] pixels;

    public CameraPreview(Context context) {
        super(context);
        mHolder = getHolder();   // assign the field; don't shadow it with a local variable
        mHolder.addCallback(this);
        mHolder.setType(SurfaceHolder.SURFACE_TYPE_NORMAL);
        this.setFocusable(true);
        this.requestFocus();
    }

    public void surfaceCreated(SurfaceHolder holder) {
        mCamera = Camera.open();
    }

    public void surfaceDestroyed(SurfaceHolder holder) {
        mCamera.stopPreview();
        mCamera.release();
        mCamera = null;
    }

    public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
        setImageSize();
        mCamera.startPreview();
        mCamera.setPreviewCallback(this);
    }

    public void onPreviewFrame(byte[] data, Camera camera) {
        // transforms NV21 pixel data into RGB pixels
        decodeYUV420SP(pixels, data, previewSize.width, previewSize.height);
        // here process the image
    }
}
The problem is that I don't know how to show the processed image.
In onPreviewFrame I convert YUV to RGB and then process the image, i.e. convert it to grayscale, but what do I do to show the new image?
I need help, thanks!
You may want to consider using OpenCV or looking at one of their samples for image manipulations:
http://opencv.itseez.com/doc/tutorials/introduction/android_binary_package/android_binary_package.html
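If you go the OpenCV route, the usual pattern in the newer OpenCV-for-Android samples (2.4+) is to implement CvCameraViewListener2 and return a Mat from onCameraFrame; whatever Mat you return is what gets drawn on screen. A rough sketch of just the listener part (the CameraBridgeViewBase setup in onCreate and the BaseLoaderCallback are omitted here):
public class OpenCvPreviewActivity extends Activity
        implements CameraBridgeViewBase.CvCameraViewListener2 {

    @Override
    public void onCameraViewStarted(int width, int height) {}

    @Override
    public void onCameraViewStopped() {}

    @Override
    public Mat onCameraFrame(CameraBridgeViewBase.CvCameraViewFrame inputFrame) {
        // e.g. display the grayscale version of each frame;
        // use inputFrame.rgba() and your own processing for other effects
        return inputFrame.gray();
    }
}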
I read an article at: http://nhenze.net/?p=107
From what I read in the post, it's not very easy to do, because onPreviewFrame is not synchronized with displaying the frames.
Instead, they use a trick: they put another view on top of the SurfaceView and render the processed images onto an OpenGL texture with OpenGL ES.
I haven't played with it myself, but hope this will put you in the right direction.
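If you do not need full frame rate, a simpler (and slower) alternative to the OpenGL approach is to build a Bitmap from the processed pixels and push it into an ImageView laid over the preview. A rough sketch, assuming pixels already holds processed ARGB values and overlay is an ImageView field you added to your layout:
// Inside onPreviewFrame, after decodeYUV420SP(...) and your own processing:
final Bitmap frame = Bitmap.createBitmap(
        pixels, previewSize.width, previewSize.height, Bitmap.Config.ARGB_8888);
overlay.post(new Runnable() {        // views must be updated on the UI thread
    @Override
    public void run() {
        overlay.setImageBitmap(frame);
    }
});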

Taking picture from camera without preview

I am writing an Android 1.5 application which starts just after boot-up. It is a Service and should take a picture without a preview. The app will log the light density in certain areas. I was able to take a picture, but the picture was black.
After researching for a long time, I came across a bug thread about it: if you don't generate a preview, the image will be black, since the Android camera needs a preview to set up exposure and focus. I've created a SurfaceView and its listener, but the onSurfaceCreated() event never gets fired.
I guess the reason is that the surface is not actually being created visually. I've also seen some examples of calling the camera statically with MediaStore.CAPTURE_OR_SOMETHING, which takes a picture and saves it in the desired folder with two lines of code, but that doesn't take a picture either.
Do I need to use IPC and bindService() to call this function? Or is there an alternative method to achieve this?
It is really weird that the camera on the Android platform can't stream video until it is given a valid preview surface. It seems the architects of the platform were not thinking about third-party video streaming applications at all. Even for the augmented reality case, the picture can be presented as some kind of visual substitution rather than a real-time camera stream.
Anyway, you can simply resize the preview surface to 1x1 pixels and put it somewhere in the corner of the widget (visual element). Please pay attention: resize the preview surface, not the camera frame size.
Of course, such a trick does not eliminate the unwanted data streaming (for the preview), which consumes some system resources and battery.
I found the answer to this in the Android Camera Docs.
Note: It is possible to use MediaRecorder without creating a camera
preview first and skip the first few steps of this process. However,
since users typically prefer to see a preview before starting a
recording, that process is not discussed here.
You can find the step-by-step instructions in that documentation; the note quoted above appears right after them.
Actually it is possible, but you have to fake the preview with a dummy SurfaceView
SurfaceView view = new SurfaceView(this);
c.setPreviewDisplay(view.getHolder());
c.startPreview();
c.takePicture(shutterCallback, rawPictureCallback, jpegPictureCallback);
Update 9/21/11: Apparently this does not work for every Android device.
Taking the Photo
Get this working first before trying to hide the preview.
Correctly set up the preview
Use a SurfaceView (pre-Android-4.0 compatibility) or SurfaceTexture (Android 4+, can be made transparent)
Set and initialise it before taking the photo
Wait for the SurfaceView's SurfaceHolder (via getHolder()) to report surfaceCreated() or the TextureView to report onSurfaceTextureAvailable to its SurfaceTextureListener before setting and initialising the preview.
Ensure the preview is visible:
Add it to the WindowManager
Ensure its layout size is at least 1x1 pixels (you might want to start by making it MATCH_PARENT x MATCH_PARENT for testing)
Ensure its visibility is View.VISIBLE (which seems to be the default if you don't specify it)
Ensure you use the FLAG_HARDWARE_ACCELERATED in the LayoutParams if it's a TextureView.
Use takePicture's JPEG callback since the documentation says the other callbacks aren't supported on all devices
Troubleshooting
If surfaceCreated/onSurfaceTextureAvailable doesn't get called, the SurfaceView/TextureView probably isn't being displayed.
If takePicture fails, first ensure the preview is working correctly. You can remove your takePicture call and let the preview run to see if it displays on the screen.
If the picture is darker than it should be, you might need to delay for about a second before calling takePicture so that the camera has time to adjust its exposure once the preview has started.
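A minimal sketch of that delay before capturing (camera is your open Camera instance, jpegCallback stands in for your own JPEG PictureCallback, and the one-second value is just a starting point):
new Handler(Looper.getMainLooper()).postDelayed(new Runnable() {
    @Override
    public void run() {
        camera.takePicture(null, null, jpegCallback);  // capture after the exposure settles
    }
}, 1000);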
Hiding the Preview
Make the preview View 1x1 size to minimise its visibility (or try 8x16 for possibly more reliability)
new WindowManager.LayoutParams(1, 1, /*...*/)
Move the preview out of the centre to reduce its noticeability:
new WindowManager.LayoutParams(width, height,
Integer.MIN_VALUE, Integer.MIN_VALUE, /*...*/)
Make the preview transparent (only works for TextureView)
WindowManager.LayoutParams params = new WindowManager.LayoutParams(
width, height, /*...*/
PixelFormat.TRANSPARENT);
params.alpha = 0;
Working Example (tested on Sony Xperia M, Android 4.3)
/** Takes a single photo on service start. */
public class PhotoTakingService extends Service {

    @Override
    public void onCreate() {
        super.onCreate();
        takePhoto(this);
    }

    @SuppressWarnings("deprecation")
    private static void takePhoto(final Context context) {
        final SurfaceView preview = new SurfaceView(context);
        SurfaceHolder holder = preview.getHolder();
        // deprecated setting, but required on Android versions prior to 3.0
        holder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);

        holder.addCallback(new Callback() {
            @Override
            //The preview must happen at or after this point or takePicture fails
            public void surfaceCreated(SurfaceHolder holder) {
                showMessage("Surface created");
                Camera camera = null;
                try {
                    camera = Camera.open();
                    showMessage("Opened camera");
                    try {
                        camera.setPreviewDisplay(holder);
                    } catch (IOException e) {
                        throw new RuntimeException(e);
                    }
                    camera.startPreview();
                    showMessage("Started preview");
                    camera.takePicture(null, null, new PictureCallback() {
                        @Override
                        public void onPictureTaken(byte[] data, Camera camera) {
                            showMessage("Took picture");
                            camera.release();
                        }
                    });
                } catch (Exception e) {
                    if (camera != null)
                        camera.release();
                    throw new RuntimeException(e);
                }
            }

            @Override public void surfaceDestroyed(SurfaceHolder holder) {}

            @Override public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {}
        });

        WindowManager wm = (WindowManager) context
                .getSystemService(Context.WINDOW_SERVICE);
        WindowManager.LayoutParams params = new WindowManager.LayoutParams(
                1, 1, //Must be at least 1x1
                WindowManager.LayoutParams.TYPE_SYSTEM_OVERLAY,
                0,
                //Don't know if this is a safe default
                PixelFormat.UNKNOWN);

        //Don't set the preview visibility to GONE or INVISIBLE
        wm.addView(preview, params);
    }

    private static void showMessage(String message) {
        Log.i("Camera", message);
    }

    @Override public IBinder onBind(Intent intent) { return null; }
}
On Android 4.0 and above (API level >= 14), you can use TextureView to preview the camera stream and make it invisible so as to not show it to the user. Here's how:
First create a class to implement a SurfaceTextureListener that will get the create/update callbacks for the preview surface. This class also takes a camera object as input, so that it can call the camera's startPreview function as soon as the surface is created:
public class CamPreview extends TextureView implements SurfaceTextureListener {
    private Camera mCamera;

    public CamPreview(Context context, Camera camera) {
        super(context);
        mCamera = camera;
    }

    @Override
    public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
        Camera.Size previewSize = mCamera.getParameters().getPreviewSize();
        setLayoutParams(new FrameLayout.LayoutParams(
                previewSize.width, previewSize.height, Gravity.CENTER));
        try {
            mCamera.setPreviewTexture(surface);
        } catch (IOException t) {
        }
        mCamera.startPreview();
        this.setVisibility(INVISIBLE); // Make the surface invisible as soon as it is created
    }

    @Override
    public void onSurfaceTextureSizeChanged(SurfaceTexture surface, int width, int height) {
        // Put code here to handle texture size change if you want to
    }

    @Override
    public boolean onSurfaceTextureDestroyed(SurfaceTexture surface) {
        return true;
    }

    @Override
    public void onSurfaceTextureUpdated(SurfaceTexture surface) {
        // Update your view here!
    }
}
You'll also need to implement a callback class to process the preview data:
public class CamCallback implements Camera.PreviewCallback {
    public void onPreviewFrame(byte[] data, Camera camera) {
        // Process the camera data here
    }
}
Use the above CamPreview and CamCallback classes to set up the camera in your activity's onCreate() or a similar startup function:
// Setup the camera and the preview object
Camera mCamera = Camera.open(0);
CamPreview camPreview = new CamPreview(this, mCamera);
camPreview.setSurfaceTextureListener(camPreview);

// Connect the preview object to a FrameLayout in your UI
// You'll have to create a FrameLayout object in your UI to place this preview in
FrameLayout preview = (FrameLayout) findViewById(R.id.cameraView);
preview.addView(camPreview);

// Attach a callback for preview
CamCallback camCallback = new CamCallback();
mCamera.setPreviewCallback(camCallback);
There is a way of doing this, but it's somewhat tricky.
What you should do is attach a SurfaceView to the window manager from the service:
WindowManager wm = (WindowManager) mCtx.getSystemService(Context.WINDOW_SERVICE);
params = new WindowManager.LayoutParams(WindowManager.LayoutParams.WRAP_CONTENT,
        WindowManager.LayoutParams.WRAP_CONTENT,
        WindowManager.LayoutParams.TYPE_SYSTEM_OVERLAY,
        WindowManager.LayoutParams.FLAG_WATCH_OUTSIDE_TOUCH,
        PixelFormat.TRANSLUCENT);
wm.addView(surfaceview, params);
and then set:
surfaceview.setZOrderOnTop(true);
mHolder.setFormat(PixelFormat.TRANSPARENT);
where mHolder is the holder you get from the surface view.
This way you can play with the SurfaceView's alpha and make it completely transparent, but the camera will still get frames.
That's how I do it. Hope it helps :)
We solved this problem by using a dummy SurfaceView (not added to the actual GUI) on versions below 3.0 (or let's say 4.0, since a camera service on a tablet does not really make sense).
On versions >= 4.0 this worked in the emulator only ;( Using a SurfaceTexture (via setPreviewTexture()) instead of a SurfaceView (via setPreviewDisplay()) worked there - at least on a Nexus S.
I think this really is a shortcoming of the Android framework.
In the "Working Example by Sam" (Thank you Sam... )
if at istruction "wm.addView(preview, params);"
obtain exception "Unable to add window android.view.ViewRoot -- permission denied for this window type"
resolve by using this permission in AndroidManifest:
<uses-permission android:name="android.permission.SYSTEM_ALERT_WINDOW"/>
You can try this working code. The service takes a picture with the front camera; if you want to capture a picture with the back camera, uncomment the back camera in the code and comment out the front camera.
Note: grant the Camera and Storage permissions to the app, and start the service from an Activity or anywhere else.
public class MyService extends Service {

    @Nullable
    @Override
    public IBinder onBind(Intent intent) {
        return null;
    }

    @Override
    public void onCreate() {
        super.onCreate();
        CapturePhoto();
    }

    private void CapturePhoto() {
        Log.d("kkkk", "Preparing to take photo");
        Camera camera = null;
        Camera.CameraInfo cameraInfo = new Camera.CameraInfo();
        int frontCamera = 1;
        //int backCamera = 0;
        Camera.getCameraInfo(frontCamera, cameraInfo);
        try {
            camera = Camera.open(frontCamera);
        } catch (RuntimeException e) {
            Log.d("kkkk", "Camera not available: " + frontCamera);
            camera = null;
            //e.printStackTrace();
        }
        try {
            if (null == camera) {
                Log.d("kkkk", "Could not get camera instance");
            } else {
                Log.d("kkkk", "Got the camera, creating the dummy surface texture");
                try {
                    camera.setPreviewTexture(new SurfaceTexture(0));
                    camera.startPreview();
                } catch (Exception e) {
                    Log.d("kkkk", "Could not set the surface preview texture");
                    e.printStackTrace();
                }
                camera.takePicture(null, null, new Camera.PictureCallback() {
                    @Override
                    public void onPictureTaken(byte[] data, Camera camera) {
                        File pictureFileDir = new File("/sdcard/CaptureByService");
                        if (!pictureFileDir.exists()) {
                            pictureFileDir.mkdirs();
                        }
                        // use MM (month) and HH (24-hour), not mm/hh, in the file name
                        SimpleDateFormat dateFormat = new SimpleDateFormat("yyyyMMddHHmmss");
                        String date = dateFormat.format(new Date());
                        String photoFile = "ServiceClickedPic_" + "_" + date + ".jpg";
                        String filename = pictureFileDir.getPath() + File.separator + photoFile;
                        File mainPicture = new File(filename);
                        try {
                            FileOutputStream fos = new FileOutputStream(mainPicture);
                            fos.write(data);
                            fos.close();
                            Log.d("kkkk", "image saved");
                        } catch (Exception error) {
                            Log.d("kkkk", "Image could not be saved");
                        }
                        camera.release();
                    }
                });
            }
        } catch (Exception e) {
            camera.release();
        }
    }
}
