We are developing an app for the Vuzix M100 that should continuously read a barcode and display the result in a TextView. To do this, the camera takes a picture every 5 seconds and sends the bitmap to the ZXing barcode scanner. We are almost done, but the camera only focuses on the first picture. Any suggestions?
This is the important part of our code:
@Override
public void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_main);

    tv = (TextView) findViewById(R.id.textView1);
    mPreview = (SurfaceView) findViewById(R.id.sv1);
    mPreview.getHolder().addCallback(this);

    mCamera = Camera.open();
    final Parameters param = mCamera.getParameters();
    param.setJpegQuality(100);
    param.setPictureSize(1600, 1200);
    param.setFocusMode(Parameters.FOCUS_MODE_CONTINUOUS_PICTURE);
    param.setSceneMode(Parameters.SCENE_MODE_BARCODE);
    mCamera.setParameters(param);

    final Handler h = new Handler();
    final int delay = 5000;
    h.postDelayed(new Runnable() {
        int count = 1;

        public void run() {
            tryAutoFocus();
            param.setFocusMode(Parameters.FOCUS_MODE_FIXED);
            param.setFocusMode(Parameters.FOCUS_MODE_AUTO);
            h.postDelayed(this, delay);
        }
    }, delay);
}
private void tryAutoFocus() {
    final PictureCallback myPictureCallback = new PictureCallback() {
        @Override
        public void onPictureTaken(byte[] data, Camera camera) {
            try {
                Options options = new BitmapFactory.Options();
                options.inScaled = false;
                Bitmap bmp = BitmapFactory.decodeByteArray(data, 0, data.length, options);
                createExternalStoragePublicPicture(bmp);
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
    };

    AutoFocusCallback myAutoFocusCallback = new AutoFocusCallback() {
        @Override
        public void onAutoFocus(boolean success, Camera arg1) {
            // TODO: Problem: only focuses for the first photo
            mCamera.takePicture(null, null, null, myPictureCallback);
            mCamera.cancelAutoFocus();
            mCamera.startPreview();
        }
    };

    mCamera.autoFocus(myAutoFocusCallback);
}
This is an old question, but still:
Note that setting a scene via Camera.Parameters.setSceneMode() may actually override any previous setting. Quoting the API documentation:
Changing scene mode may override other parameters (such as flash mode, focus mode, white balance). For example, suppose originally flash mode is on and supported flash modes are on/off. In night scene mode, both flash mode and supported flash mode may be changed to off. After setting scene mode, applications should call getParameters to know if some parameters are changed.
Also, the docs imply that not only the current settings may be overridden, but also the supported parameter values.
So I suggest the following:
Set the scene mode before setting the focus mode
After setting the scene mode, check if the desired focus mode is still supported with getSupportedFocusModes()
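For example, something along these lines (a minimal sketch, assuming a Camera field called mCamera as in the question):

// Set the scene mode first, then re-read the parameters and only request
// continuous focus if it is still listed as supported.
Camera.Parameters params = mCamera.getParameters();
params.setSceneMode(Camera.Parameters.SCENE_MODE_BARCODE);
mCamera.setParameters(params);

params = mCamera.getParameters(); // re-read: the scene mode may have changed other values
List<String> focusModes = params.getSupportedFocusModes();
if (focusModes != null && focusModes.contains(Camera.Parameters.FOCUS_MODE_CONTINUOUS_PICTURE)) {
    params.setFocusMode(Camera.Parameters.FOCUS_MODE_CONTINUOUS_PICTURE);
    mCamera.setParameters(params);
}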
I'm using a SurfaceTexture to run the camera of a device in the background and get frames from it. I see that the onFrameAvailable callback doesn't deliver a frame, but a SurfaceTexture instead. I want to get the frame, or some image representation of it, but I'm not sure how to do that. When I searched I found that I would need to use GraphicBuffer, but that seems too complicated and it's not clear to me how to use it.
I've also looked at solutions here:
Texture Image processing on the GPU?
Android SDK: Get raw preview camera image without displaying it
But it's not clear how to do it in the code. Here is my code:
public class BackgroundService extends Service {

    private Camera camera = null;
    private int NOTIFICATION_ID = 1;
    private static final String TAG = "OCVSample::Activity";

    // Binder given to clients
    private final IBinder mBinder = new LocalBinder();
    private WindowManager windowManager;
    private SurfaceTexture mSurfaceTexture = new SurfaceTexture(10);
    Intent intializerIntent;

    public class LocalBinder extends Binder {
        BackgroundService getService() {
            // Return this instance of this service so clients can call public methods
            return BackgroundService.this;
        }
    } //end inner class that returns an instance of the service.

    @Override
    public IBinder onBind(Intent intent) {
        intializerIntent = intent;
        return mBinder;
    } //end onBind.

    @Override
    public void onCreate() {
        Log.i(TAG, "onCreate is called");

        // Start foreground service to avoid unexpected kill
        startForeground(NOTIFICATION_ID, buildNotification());

        Thread thread = new Thread() {
            public void run() {
                camera = Camera.open(1);

                mSurfaceTexture.setOnFrameAvailableListener(new SurfaceTexture.OnFrameAvailableListener() {
                    @Override
                    public void onFrameAvailable(SurfaceTexture surfaceTexture) {
                        Log.i(TAG, "frame captured from texture");
                        if (camera != null) {
                            //camera.setPreviewCallbackWithBuffer(null);
                            //camera.setPreviewCallback(null);
                            //camera.setOneShotPreviewCallback(null);
                            //if the following two lines are not called, many frames will be dropped.
                            camera.stopPreview();
                            camera.startPreview();
                        }
                    }
                });

                //now try to set the preview texture of the camera, which is actually the surfaceTexture that has just been created.
                try {
                    camera.setPreviewTexture(mSurfaceTexture);
                } catch (IOException e) {
                    Log.e(TAG, "Error in setting the camera surface texture");
                }
                camera.startPreview();
            }
        };
        thread.start();
    }

    private Notification buildNotification() {
        NotificationCompat.Builder notificationBuilder = new NotificationCompat.Builder(this);
        notificationBuilder.setOngoing(true); //this notification should be ongoing
        notificationBuilder.setContentTitle(getString(R.string.notification_title))
                .setContentText(getString(R.string.notification_text_and_ticker))
                .setSmallIcon(R.drawable.vecsat_logo)
                .setTicker(getString(R.string.notification_text_and_ticker));
        return notificationBuilder.build();
    }

    @Override
    public void onDestroy() {
        Log.i(TAG, "surfaceDestroyed method");
        camera.stopPreview();
        //camera.lock();
        camera.release();
        mSurfaceTexture.detachFromGLContext();
        mSurfaceTexture.release();
        stopService(intializerIntent);
        //windowManager.removeView(surfaceView);
    }
}
How can I get the frames and process them whether that's on GPU or CPU? If there is a way to do that on GPU, then get the results from there, that would be great as it seems more efficient.
Thank you.
SurfaceTexture takes whatever is sent to the Surface and wrangles it into an OpenGL ES "external" texture. If you want access to those pixels from software, you will need to render the texture to a framebuffer, then read the pixels out with glReadPixels(). One example of this is the bigflake ExtractMpegFramesTest, which converts frames of decoded video to PNG.
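For illustration, the readback step might look roughly like this (a sketch only; it assumes an EGL context is current and the external texture has already been rendered into a framebuffer of size width x height):

// Read the framebuffer contents back into a Bitmap.
ByteBuffer buf = ByteBuffer.allocateDirect(width * height * 4);
buf.order(ByteOrder.LITTLE_ENDIAN);
GLES20.glReadPixels(0, 0, width, height, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, buf);
buf.rewind();

Bitmap bmp = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
bmp.copyPixelsFromBuffer(buf); // note: rows come out bottom-up, so the image is vertically flipped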
Better performance can be achieved by doing all processing on the GPU (see e.g. this demo), but that's not always feasible.
I have used the ZBar scanner for Android and it captures barcodes quite easily.
But the problem is that on phones which have autofocus, it captures the barcode too quickly to detect it correctly.
If only it could wait a few milliseconds more, it could capture a clearer image and thereby not show the "not found" page.
How can I solve this problem?
Is there a provision to delay the focus on the barcode?
Maybe a delay in capturing the image?
Are you talking about the example code, CameraTestActivity.java?
Implement a counter for consecutive identical scan results. If the scanning result stays the same (e.g. 10 times in a row), you can assume the result is reliable.
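A rough sketch of the idea (the names and the threshold of 10 are just assumptions):

private String lastResult;
private int stableCount;
private static final int REQUIRED_MATCHES = 10;

// Call this with every decoded result; act on it only when it returns true.
private boolean isReliable(String result) {
    if (result.equals(lastResult)) {
        stableCount++;
    } else {
        lastResult = result;
        stableCount = 1;
    }
    return stableCount >= REQUIRED_MATCHES;
}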
I really like @Juuso_Ohtonen's reply, and I actually just used it in my own reader. However, if you want an autofocus delay, you can create a Camera.AutoFocusCallback object and implement its onAutoFocus method with a postDelayed() call. This object is then passed to camera.autoFocus().
// Mimic continuous auto-focusing
Camera.AutoFocusCallback autoFocusCB = new Camera.AutoFocusCallback() {
    public void onAutoFocus(boolean success, Camera camera) {
        autoFocusHandler.postDelayed(doAutoFocus, 1000);
    }
};
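The snippet above assumes a Handler and a Runnable that re-arm autofocus; in the ZBar sample they look roughly like this (treat the exact names and the previewing flag as assumptions):

private Handler autoFocusHandler = new Handler();
private boolean previewing = true;

// Re-triggers autofocus one second after the previous focus pass finished.
private Runnable doAutoFocus = new Runnable() {
    public void run() {
        if (previewing) {
            mCamera.autoFocus(autoFocusCB);
        }
    }
};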
The following goes in the class that extends SurfaceView and implements surfaceChanged():
public CameraPreview(Context context, Camera camera,
                     PreviewCallback previewCb,
                     AutoFocusCallback autoFocusCb) {
    super(context);
    mCamera = camera;
    previewCallback = previewCb;
    autoFocusCallback = autoFocusCb;

    // Install a SurfaceHolder.Callback so we get notified when the
    // underlying surface is created and destroyed.
    mHolder = getHolder();
    mHolder.addCallback(this);

    // deprecated setting, but required on Android versions prior to 3.0
    mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
}

public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
    /*
     * If your preview can change or rotate, take care of those events here.
     * Make sure to stop the preview before resizing or reformatting it.
     */
    if (mHolder.getSurface() == null) {
        // preview surface does not exist
        return;
    }

    // stop preview before making changes
    try {
        mCamera.stopPreview();
    } catch (Exception e) {
        // ignore: tried to stop a non-existent preview
    }

    try {
        mCamera.setPreviewDisplay(mHolder);
        mCamera.setPreviewCallback(previewCallback);
        mCamera.startPreview();
        mCamera.autoFocus(autoFocusCallback);
    } catch (Exception e) {
        Log.d("DBG", "Error starting camera preview: " + e.getMessage());
    }
}
I am trying to create a SurfaceView for the camera so that whatever is in the camera's view is rendered on the surface. At the moment all I can see in my camera view is a black screen. I have looked on Google and here, but so far I haven't found what I am looking for. Can anyone suggest an idea?
I have written a class that can help you.
public class Preview_can_work extends Activity {

    private SurfaceView surface_view;
    private Camera mCamera;
    SurfaceHolder.Callback sh_ob = null;
    SurfaceHolder surface_holder = null;
    SurfaceHolder.Callback sh_callback = null;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        getWindow().setFormat(PixelFormat.TRANSLUCENT);

        surface_view = new SurfaceView(getApplicationContext());
        addContentView(surface_view, new LayoutParams(LayoutParams.FILL_PARENT, LayoutParams.FILL_PARENT));

        if (surface_holder == null) {
            surface_holder = surface_view.getHolder();
        }

        sh_callback = my_callback();
        surface_holder.addCallback(sh_callback);
    }

    SurfaceHolder.Callback my_callback() {
        SurfaceHolder.Callback ob1 = new SurfaceHolder.Callback() {

            @Override
            public void surfaceDestroyed(SurfaceHolder holder) {
                mCamera.stopPreview();
                mCamera.release();
                mCamera = null;
            }

            @Override
            public void surfaceCreated(SurfaceHolder holder) {
                mCamera = Camera.open();
                try {
                    mCamera.setPreviewDisplay(holder);
                } catch (IOException exception) {
                    mCamera.release();
                    mCamera = null;
                }
            }

            @Override
            public void surfaceChanged(SurfaceHolder holder, int format, int width,
                                       int height) {
                mCamera.startPreview();
            }
        };
        return ob1;
    }
}
In your manifest file, add this permission for the camera:
<uses-permission android:name="android.permission.CAMERA"/>
Explanation:
SurfaceView is a type of View that contains a SurfaceHolder. The SurfaceHolder holds the surface on which we can display our media (generally frames).
mCamera is a Camera object that will hold the camera instance.
When you want the default Camera instance, you can simply call Camera.open():
Camera mCamera = Camera.open();
Now you have an open camera, i.e. the default camera instance. Next you need to capture frames from the camera and display them on a surface, but you cannot display them without one. Here the SurfaceView provides the SurfaceHolder, and the SurfaceHolder provides the surface on which the camera frames are displayed. When the surface is created, three callback functions will be called:
1. public void surfaceCreated(SurfaceHolder holder)
2. public void surfaceChanged(SurfaceHolder holder, int format, int width, int height)
3. public void surfaceDestroyed(SurfaceHolder holder)
Note: the surface is destroyed when your application goes into the background (onPause).
surfaceCreated:
surfaceCreated is a callback that is invoked when your surface has been created. Here you can open your camera and set other attributes.
surfaceChanged:
This will be called at least once when your surface is created. After that it is called whenever the surface changes (e.g. on device rotation). Here you can start your preview, because the surface has already been created.
surfaceDestroyed:
This is called every time your surface is destroyed. If you no longer have a surface, there is nowhere to display your camera frames, so I release the camera with mCamera.release(). This is very important: if your activity is paused and another activity tries to open the camera, it will fail as long as you still hold it. The camera is a shared resource, and only one application can use it at a time. So remember: whenever you open a camera, always release it.
stopPreview:
When you start the preview, the camera starts capturing frames and displaying them on the surface. If the surface has been destroyed, you need to stop capturing frames from the camera, so you call mCamera.stopPreview().
Make sure you added the permission:
<uses-permission android:name="android.permission.CAMERA"/>
Also these window properties:
getWindow().setFormat(PixelFormat.TRANSLUCENT);
requestWindowFeature(Window.FEATURE_NO_TITLE);
getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN | WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON,
WindowManager.LayoutParams.FLAG_FULLSCREEN | WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);
setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_PORTRAIT);
If that doesn't work, post some code so we can help you.
I am writing an Android 1.5 application which starts just after boot-up. It is a Service and should take a picture without a preview. The app will log the light density in some areas.
After researching for a long time, I came across a bug thread about it: if you don't generate a preview, the image will be black, since the Android camera needs a preview to set up exposure and focus. I've created a SurfaceView and the listener, but the onSurfaceCreated() event never gets fired.
I guess the reason is that the surface is not actually being created visually. I've also seen some examples of calling the camera statically with MediaStore.CAPTURE_OR_SOMETHING, which takes a picture and saves it in the desired folder with two lines of code, but that doesn't take a picture either.
Do I need to use IPC and bindService() to call this function? Or is there an alternative method to achieve this?
It is really weird that the camera on the Android platform can't stream video until it is given a valid preview surface. It seems the architects of the platform were not thinking about third-party video-streaming applications at all; even for the augmented-reality case the picture could be presented as some kind of visual substitute, not a real-time camera stream.
Anyway, you can simply resize the preview surface to 1x1 pixels and put it somewhere in a corner of the widget (visual element). Please pay attention: resize the preview surface, not the camera frame size.
Of course such a trick does not eliminate the unwanted data streaming (for the preview), which consumes some system resources and battery.
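For example, shrinking just the on-screen view could look like this (a sketch; the view id is made up):

// Shrink only the SurfaceView on screen; the camera preview frame size stays unchanged.
SurfaceView preview = (SurfaceView) findViewById(R.id.camera_preview); // hypothetical id
ViewGroup.LayoutParams lp = preview.getLayoutParams();
lp.width = 1;
lp.height = 1;
preview.setLayoutParams(lp);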
I found the answer to this in the Android Camera Docs.
Note: It is possible to use MediaRecorder without creating a camera preview first and skip the first few steps of this process. However, since users typically prefer to see a preview before starting a recording, that process is not discussed here.
You can find the step-by-step instructions at the link above; the note I quoted appears right after those instructions.
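For reference, the recording path the docs describe looks roughly like this; a sketch only, skipping the preview step on purpose (whether a particular device accepts this is not guaranteed, and the output path is just an example):

MediaRecorder recorder = new MediaRecorder();
recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
recorder.setOutputFile("/sdcard/no_preview_test.mp4"); // example path
// Note: setPreviewDisplay() is deliberately not called here.
try {
    recorder.prepare();
    recorder.start();
} catch (IOException e) {
    recorder.release();
}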
Actually it is possible, but you have to fake the preview with a dummy SurfaceView
SurfaceView view = new SurfaceView(this);
c.setPreviewDisplay(view.getHolder());
c.startPreview();
c.takePicture(shutterCallback, rawPictureCallback, jpegPictureCallback);
Update 9/21/11: Apparently this does not work for every Android device.
Taking the Photo
Get this working first before trying to hide the preview.
Correctly set up the preview
Use a SurfaceView (pre-Android-4.0 compatibility) or a TextureView (Android 4+, can be made transparent)
Set and initialise it before taking the photo
Wait for the SurfaceView's SurfaceHolder (via getHolder()) to report surfaceCreated() or the TextureView to report onSurfaceTextureAvailable to its SurfaceTextureListener before setting and initialising the preview.
Ensure the preview is visible:
Add it to the WindowManager
Ensure its layout size is at least 1x1 pixels (you might want to start by making it MATCH_PARENT x MATCH_PARENT for testing)
Ensure its visibility is View.VISIBLE (which seems to be the default if you don't specify it)
Ensure you use the FLAG_HARDWARE_ACCELERATED in the LayoutParams if it's a TextureView.
Use takePicture's JPEG callback since the documentation says the other callbacks aren't supported on all devices
Troubleshooting
If surfaceCreated/onSurfaceTextureAvailable doesn't get called, the SurfaceView/TextureView probably isn't being displayed.
If takePicture fails, first ensure the preview is working correctly. You can remove your takePicture call and let the preview run to see if it displays on the screen.
If the picture is darker than it should be, you might need to delay for about a second before calling takePicture so that the camera has time to adjust its exposure once the preview has started.
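For that last point, a simple delayed capture can be enough (a sketch; the one-second delay and the jpegCallback name are assumptions):

// Give auto-exposure a moment after startPreview() before capturing.
new Handler().postDelayed(new Runnable() {
    @Override
    public void run() {
        camera.takePicture(null, null, jpegCallback);
    }
}, 1000);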
Hiding the Preview
Make the preview View 1x1 size to minimise its visibility (or try 8x16 for possibly more reliability)
new WindowManager.LayoutParams(1, 1, /*...*/)
Move the preview out of the centre to reduce its noticeability:
new WindowManager.LayoutParams(width, height,
Integer.MIN_VALUE, Integer.MIN_VALUE, /*...*/)
Make the preview transparent (only works for TextureView)
WindowManager.LayoutParams params = new WindowManager.LayoutParams(
width, height, /*...*/
PixelFormat.TRANSPARENT);
params.alpha = 0;
Working Example (tested on Sony Xperia M, Android 4.3)
/** Takes a single photo on service start. */
public class PhotoTakingService extends Service {

    @Override
    public void onCreate() {
        super.onCreate();
        takePhoto(this);
    }

    @SuppressWarnings("deprecation")
    private static void takePhoto(final Context context) {
        final SurfaceView preview = new SurfaceView(context);
        SurfaceHolder holder = preview.getHolder();
        // deprecated setting, but required on Android versions prior to 3.0
        holder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);

        holder.addCallback(new Callback() {
            @Override
            //The preview must happen at or after this point or takePicture fails
            public void surfaceCreated(SurfaceHolder holder) {
                showMessage("Surface created");

                Camera camera = null;
                try {
                    camera = Camera.open();
                    showMessage("Opened camera");

                    try {
                        camera.setPreviewDisplay(holder);
                    } catch (IOException e) {
                        throw new RuntimeException(e);
                    }

                    camera.startPreview();
                    showMessage("Started preview");

                    camera.takePicture(null, null, new PictureCallback() {
                        @Override
                        public void onPictureTaken(byte[] data, Camera camera) {
                            showMessage("Took picture");
                            camera.release();
                        }
                    });
                } catch (Exception e) {
                    if (camera != null)
                        camera.release();
                    throw new RuntimeException(e);
                }
            }

            @Override public void surfaceDestroyed(SurfaceHolder holder) {}
            @Override public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {}
        });

        WindowManager wm = (WindowManager) context
                .getSystemService(Context.WINDOW_SERVICE);
        WindowManager.LayoutParams params = new WindowManager.LayoutParams(
                1, 1, //Must be at least 1x1
                WindowManager.LayoutParams.TYPE_SYSTEM_OVERLAY,
                0,
                //Don't know if this is a safe default
                PixelFormat.UNKNOWN);

        //Don't set the preview visibility to GONE or INVISIBLE
        wm.addView(preview, params);
    }

    private static void showMessage(String message) {
        Log.i("Camera", message);
    }

    @Override public IBinder onBind(Intent intent) { return null; }
}
On Android 4.0 and above (API level >= 14), you can use TextureView to preview the camera stream and make it invisible so as to not show it to the user. Here's how:
First create a class to implement a SurfaceTextureListener that will get the create/update callbacks for the preview surface. This class also takes a camera object as input, so that it can call the camera's startPreview function as soon as the surface is created:
public class CamPreview extends TextureView implements SurfaceTextureListener {
    private Camera mCamera;

    public CamPreview(Context context, Camera camera) {
        super(context);
        mCamera = camera;
    }

    @Override
    public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
        Camera.Size previewSize = mCamera.getParameters().getPreviewSize();
        setLayoutParams(new FrameLayout.LayoutParams(
                previewSize.width, previewSize.height, Gravity.CENTER));

        try {
            mCamera.setPreviewTexture(surface);
        } catch (IOException t) {
        }

        mCamera.startPreview();
        this.setVisibility(INVISIBLE); // Make the surface invisible as soon as it is created
    }

    @Override
    public void onSurfaceTextureSizeChanged(SurfaceTexture surface, int width, int height) {
        // Put code here to handle texture size change if you want to
    }

    @Override
    public boolean onSurfaceTextureDestroyed(SurfaceTexture surface) {
        return true;
    }

    @Override
    public void onSurfaceTextureUpdated(SurfaceTexture surface) {
        // Update your view here!
    }
}
You'll also need to implement a callback class to process the preview data:
public class CamCallback implements Camera.PreviewCallback {
    public void onPreviewFrame(byte[] data, Camera camera) {
        // Process the camera data here
    }
}
Use the above CamPreview and CamCallback classes to setup the camera in your activity's onCreate() or similar startup function:
// Setup the camera and the preview object
Camera mCamera = Camera.open(0);
CamPreview camPreview = new CamPreview(this, mCamera); // 'this' is your Activity context
camPreview.setSurfaceTextureListener(camPreview);

// Connect the preview object to a FrameLayout in your UI
// You'll have to create a FrameLayout object in your UI to place this preview in
FrameLayout preview = (FrameLayout) findViewById(R.id.cameraView);
preview.addView(camPreview);

// Attach a callback for preview
CamCallback camCallback = new CamCallback();
mCamera.setPreviewCallback(camCallback);
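When you are done with the preview (for example in onPause()), it is worth detaching the callback and releasing the camera; roughly:

// Stop receiving frames and hand the camera back to the system.
mCamera.setPreviewCallback(null);
mCamera.stopPreview();
mCamera.release();
mCamera = null;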
There is a way of doing this, but it's somewhat tricky.
What should be done is attach the SurfaceView (whose SurfaceHolder you keep a reference to) to the WindowManager from the service:
WindowManager wm = (WindowManager) mCtx.getSystemService(Context.WINDOW_SERVICE);
params = new WindowManager.LayoutParams(
        WindowManager.LayoutParams.WRAP_CONTENT,
        WindowManager.LayoutParams.WRAP_CONTENT,
        WindowManager.LayoutParams.TYPE_SYSTEM_OVERLAY,
        WindowManager.LayoutParams.FLAG_WATCH_OUTSIDE_TOUCH,
        PixelFormat.TRANSLUCENT);
wm.addView(surfaceview, params);
and then set
surfaceview.setZOrderOnTop(true);
mHolder.setFormat(PixelFormat.TRANSPARENT);
where mHolder is the holder you get from the SurfaceView.
This way you can play with the SurfaceView's alpha and make it completely transparent, but the camera will still get frames.
That's how I do it. Hope it helps :)
We solved this problem by using a dummy SurfaceView (not added to the actual GUI) on versions below 3.0 (or let's say 4.0, as a camera service on a tablet does not really make sense).
On versions >= 4.0 this worked in the emulator only ;(
Using a SurfaceTexture (via setPreviewTexture()) instead of a SurfaceView (via setPreviewDisplay()) worked here. At least it works on a Nexus S.
I think this really is a shortcoming of the Android framework.
In the "Working Example by Sam" (Thank you Sam... )
if at istruction "wm.addView(preview, params);"
obtain exception "Unable to add window android.view.ViewRoot -- permission denied for this window type"
resolve by using this permission in AndroidManifest:
<uses-permission android:name="android.permission.SYSTEM_ALERT_WINDOW"/>
You can try this working code. This service takes a picture with the front camera; if you want to capture with the back camera instead, uncomment the back-camera line in the code and comment out the front-camera one.
Note: grant the Camera and Storage permissions to the app, and start the service from an Activity (or from anywhere else).
public class MyService extends Service {

    @Nullable
    @Override
    public IBinder onBind(Intent intent) {
        return null;
    }

    @Override
    public void onCreate() {
        super.onCreate();
        CapturePhoto();
    }

    private void CapturePhoto() {
        Log.d("kkkk", "Preparing to take photo");
        Camera camera = null;

        Camera.CameraInfo cameraInfo = new Camera.CameraInfo();
        int frontCamera = 1;
        //int backCamera = 0;
        Camera.getCameraInfo(frontCamera, cameraInfo);

        try {
            camera = Camera.open(frontCamera);
        } catch (RuntimeException e) {
            Log.d("kkkk", "Camera not available: " + frontCamera);
            camera = null;
            //e.printStackTrace();
        }

        try {
            if (null == camera) {
                Log.d("kkkk", "Could not get camera instance");
            } else {
                Log.d("kkkk", "Got the camera, creating the dummy surface texture");
                try {
                    camera.setPreviewTexture(new SurfaceTexture(0));
                    camera.startPreview();
                } catch (Exception e) {
                    Log.d("kkkk", "Could not set the surface preview texture");
                    e.printStackTrace();
                }

                camera.takePicture(null, null, new Camera.PictureCallback() {
                    @Override
                    public void onPictureTaken(byte[] data, Camera camera) {
                        File pictureFileDir = new File("/sdcard/CaptureByService");
                        if (!pictureFileDir.exists()) {
                            pictureFileDir.mkdirs();
                        }

                        // Note: use MM for months and HH for hours; mm/hh would give minutes and 12-hour time.
                        SimpleDateFormat dateFormat = new SimpleDateFormat("yyyyMMddHHmmss");
                        String date = dateFormat.format(new Date());
                        String photoFile = "ServiceClickedPic_" + date + ".jpg";
                        String filename = pictureFileDir.getPath() + File.separator + photoFile;
                        File mainPicture = new File(filename);

                        try {
                            FileOutputStream fos = new FileOutputStream(mainPicture);
                            fos.write(data);
                            fos.close();
                            Log.d("kkkk", "image saved");
                        } catch (Exception error) {
                            Log.d("kkkk", "Image could not be saved");
                        }

                        camera.release();
                    }
                });
            }
        } catch (Exception e) {
            camera.release();
        }
    }
}
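To use it, declare the service in the manifest with <service android:name=".MyService" /> and start it from an Activity; for example:

startService(new Intent(this, MyService.class));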