I'm creating a barcode-reading application using Google Vision, and it also has a flash on/off function. Since the `Camera` class is now deprecated, I implemented the flash using `CameraManager`, as in the code below. The problem: while the camera preview is running, the flashlight does not work. When the camera is frozen (I stop the CameraSource once a barcode is detected), the flashlight does work. I want to toggle the flashlight on/off regardless of whether the camera view is running or stopped, and I need to do this without using the deprecated APIs. Can someone tell me how to solve this problem? Thanks in advance.
private void setupBarcode() {
    cameraPreview = (SurfaceView) findViewById(R.id.cameraPreview);
    txtResult = findViewById(R.id.txtResult);

    barcodeDetector = new BarcodeDetector.Builder(this)
            .setBarcodeFormats(Barcode.ALL_FORMATS)
            .build();

    cameraSource = new CameraSource
            .Builder(this, barcodeDetector)
            .setAutoFocusEnabled(true)
            .build();

    // Add event
    cameraPreview.getHolder().addCallback(new SurfaceHolder.Callback() {
        @Override
        public void surfaceCreated(SurfaceHolder surfaceHolder) {
            if (ActivityCompat.checkSelfPermission(getApplicationContext(), android.Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED) {
                // Request permission
                ActivityCompat.requestPermissions(ScanActivity.this,
                        new String[]{Manifest.permission.CAMERA}, RequestCameraPermissionID);
                return;
            }
            try {
                cameraSource.start(cameraPreview.getHolder());
                txtResult.setText("");
            } catch (IOException e) {
                e.printStackTrace();
            }
        }

        @Override
        public void surfaceChanged(SurfaceHolder surfaceHolder, int i, int i1, int i2) {
        }

        @Override
        public void surfaceDestroyed(SurfaceHolder surfaceHolder) {
            cameraSource.stop();
        }
    });

    barcodeDetector.setProcessor(new Detector.Processor<Barcode>() {
        @Override
        public void release() {
        }

        @Override
        public void receiveDetections(Detector.Detections<Barcode> detections) {
            final SparseArray<Barcode> qrcodes = detections.getDetectedItems();
            if (qrcodes.size() != 0) {
                txtResult.post(new Runnable() {
                    @Override
                    public void run() {
                        okButton.setEnabled(true);
                        txtResult.setText(qrcodes.valueAt(0).displayValue);
                        cameraSource.stop();
                    }
                });
            }
        }
    });
}
private void flashLightOn() {
    CameraManager cameraManager = (CameraManager) getSystemService(Context.CAMERA_SERVICE);
    try {
        String cameraId = cameraManager.getCameraIdList()[0];
        cameraManager.setTorchMode(cameraId, true);
    } catch (CameraAccessException e) {
        e.printStackTrace(); // don't swallow the exception silently
    }
}
Use the open-source 'CameraSource.java' file from the Google Vision library in your project.
Get the file from here
And then use the following code:
public void switchFlashLight(boolean status) {
    cameraSource.setFlashMode(status ? Camera.Parameters.FLASH_MODE_TORCH : Camera.Parameters.FLASH_MODE_OFF);
}
I've integrated camera2 with a TextureView. It works on all devices, but on a tablet it crashes the first time we capture an image and displays the following logs.
Fatal Exception: java.lang.IllegalArgumentException: Surface had no valid native Surface.
at android.hardware.camera2.legacy.LegacyCameraDevice.nativeGetSurfaceId(LegacyCameraDevice.java)
at android.hardware.camera2.legacy.LegacyCameraDevice.getSurfaceId(LegacyCameraDevice.java:658)
at android.hardware.camera2.legacy.LegacyCameraDevice.containsSurfaceId(LegacyCameraDevice.java:678)
at android.hardware.camera2.legacy.RequestThreadManager$2.onPictureTaken(RequestThreadManager.java:220)
at android.hardware.Camera$EventHandler.handleMessage(Camera.java:1248)
at android.os.Handler.dispatchMessage(Handler.java:111)
at android.os.Looper.loop(Looper.java:207)
at android.hardware.camera2.legacy.CameraDeviceUserShim$CameraLooper.run(CameraDeviceUserShim.java:136)
at java.lang.Thread.run(Thread.java:818)
The following code is used to capture the image.
protected void takePicture() {
    if (getContext() == null || cameraDevice == null) return;
    lockFocus();
    CameraManager manager = (CameraManager) getContext().getSystemService(Context.CAMERA_SERVICE);
    try {
        CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraDevice.getId());
        if (characteristics != null) {
            sizes = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP).getOutputSizes(ImageFormat.JPEG);
        }
        ImageReader reader = getImageReader();
        if (reader == null) return;
        List<Surface> outputSurfaces = getSurfaces(reader);
        final CaptureRequest.Builder captureBuilder = getCaptureBuilder(reader);
        final CameraCaptureSession.CaptureCallback captureListener = new CameraCaptureSession.CaptureCallback() {
            @Override
            public void onCaptureCompleted(CameraCaptureSession session, CaptureRequest request, TotalCaptureResult result) {
                super.onCaptureCompleted(session, request, result);
            }
        };
        cameraDevice.createCaptureSession(outputSurfaces, new CameraCaptureSession.StateCallback() {
            @Override
            public void onConfigured(CameraCaptureSession session) {
                try {
                    session.capture(captureBuilder.build(), captureListener, null);
                } catch (Exception e) {
                    e.printStackTrace();
                }
            }

            @Override
            public void onConfigureFailed(CameraCaptureSession session) {
            }
        }, null);
    } catch (Exception e) {
        e.printStackTrace();
    }
}
Any help would be appreciated.
That can happen if the ImageReader gets garbage collected before the camera picture capture completes.
Does the getImageReader method store the image reader somewhere permanent (like a class member)? If not, the Surface from the ImageReader acts like a weak reference, and will not keep the reader from being collected.
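For illustration, a minimal sketch of the difference (class and field names are my own, not from the question's code): holding the ImageReader in a field keeps it strongly reachable until the capture completes, whereas a local variable can be collected as soon as the method returns.

```java
import android.graphics.ImageFormat;
import android.media.ImageReader;

public class CaptureHolder {
    // Strong reference: the reader (and its Surface) stays alive
    // until this object is released, so onPictureTaken can't race
    // against garbage collection of the Surface.
    private ImageReader mImageReader;

    public ImageReader getImageReader(int width, int height) {
        if (mImageReader == null) {
            mImageReader = ImageReader.newInstance(width, height, ImageFormat.JPEG, 2);
        }
        return mImageReader;
    }

    // A version that returned a fresh local ImageReader each call,
    // without storing it, would be eligible for GC mid-capture and
    // could produce "Surface had no valid native Surface".
    public void release() {
        if (mImageReader != null) {
            mImageReader.close();
            mImageReader = null;
        }
    }
}
```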
I'm trying to create an app using the camera2 API. What I need is a 30 fps burst, which I was able to create.
The problem is that both the preview images and the saved images are interlaced (I'm photographing some blinking LEDs, so it's easy to see).
I tried to disable the auto exposure and set the sensitivity myself, but that didn't work.
private void captureStillPicture() {
    try {
        final Activity activity = getActivity();
        mPictureCounter = 0;
        if (null == activity || null == mCameraDevice) {
            return;
        }
        List<CaptureRequest> captureList = new ArrayList<CaptureRequest>();
        // This is the CaptureRequest.Builder that we use to take a picture.
        final CaptureRequest.Builder captureBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
        // Auto focus - should keep that
        captureBuilder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_AUTO);
        // Set the sensitivity manually
        captureBuilder.set(CaptureRequest.SENSOR_SENSITIVITY, 2000);
        captureBuilder.set(CaptureRequest.CONTROL_AE_MODE, Consts.aeMode);
        captureBuilder.addTarget(mImageReader.getSurface());
        for (int i = 0; i < Consts.frameAmount; i++) {
            captureList.add(captureBuilder.build());
        }
        CameraCaptureSession.CaptureCallback CaptureCallback = new CameraCaptureSession.CaptureCallback() {
            @Override
            public void onCaptureCompleted(@NonNull CameraCaptureSession session,
                                           @NonNull CaptureRequest request,
                                           @NonNull TotalCaptureResult result) {
                mPictureCounter++;
                unlockFocus();
            }
        };
        mCaptureSession.stopRepeating();
        mCaptureSession.captureBurst(captureList, CaptureCallback, mBackgroundHandler);
    } catch (CameraAccessException e) {
        e.printStackTrace();
    }
}
Any thoughts?
I'm working on Android and I'm trying to capture a picture without displaying any preview. I tried to simplify the process by wrapping it in a class. It works, but all the pictures come out really dark.
Here is my class:
public class Cam {
    private Context context;
    private CameraManager manager;
    private CameraDevice camera;
    private CameraCaptureSession session;
    private ImageReader reader;

    public static String FRONT = "-1";
    public static String BACK = "-1";

    private boolean available = true;
    private String filepath;

    private static final String NO_CAM = "No camera found on device!";
    private static final String ERR_CONFIGURE = "Failed configuring session";
    private static final String ERR_OPEN = "Can't open the camera";
    private static final String CAM_DISCONNECT = "Camera disconnected";
    private static final String FILE_EXIST = "File already exist";

    private static final SparseIntArray ORIENTATIONS = new SparseIntArray();
    static {
        ORIENTATIONS.append(Surface.ROTATION_0, 90);
        ORIENTATIONS.append(Surface.ROTATION_90, 0);
        ORIENTATIONS.append(Surface.ROTATION_180, 270);
        ORIENTATIONS.append(Surface.ROTATION_270, 180);
    }

    public Cam(Context context) throws CameraAccessException {
        this.context = context;
        this.manager = (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
        String[] ids = manager.getCameraIdList();
        if (ids.length == 2) {
            BACK = ids[0];
            FRONT = ids[1];
        } else if (ids.length == 1) {
            BACK = ids[0];
        } else {
            available = false;
            throw new CameraAccessException(-1, NO_CAM);
        }
    }

    public void takePicture(String camId, String filepath) throws CameraAccessException {
        if (available) {
            this.filepath = filepath;
            StreamConfigurationMap map = manager.getCameraCharacteristics(camId).get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
            Size largest = Collections.max(Arrays.asList(map.getOutputSizes(ImageFormat.JPEG)), new CompareSizesByArea());
            reader = ImageReader.newInstance(largest.getWidth(), largest.getHeight(), ImageFormat.JPEG, 1);
            reader.setOnImageAvailableListener(imageListener, null);
            manager.openCamera(camId, cameraStateCallback, null);
        } else {
            throwError(NO_CAM);
        }
    }

    private CameraDevice.StateCallback cameraStateCallback = new CameraDevice.StateCallback() {
        @Override
        public void onOpened(CameraDevice camera) {
            Cam.this.camera = camera;
            try {
                camera.createCaptureSession(Collections.singletonList(reader.getSurface()), sessionStateCallback, null);
            } catch (CameraAccessException e) {
                throwError(e.getMessage());
            }
        }

        @Override
        public void onDisconnected(CameraDevice camera) {
            throwError(CAM_DISCONNECT);
        }

        @Override
        public void onError(CameraDevice camera, int error) {
            throwError(ERR_OPEN);
        }
    };

    private CameraCaptureSession.StateCallback sessionStateCallback = new CameraCaptureSession.StateCallback() {
        @Override
        public void onConfigured(CameraCaptureSession session) {
            Cam.this.session = session;
            try {
                CaptureRequest.Builder request = camera.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
                request.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
                request.addTarget(reader.getSurface());
                int rotation = ((WindowManager) context.getSystemService(Context.WINDOW_SERVICE)).getDefaultDisplay().getRotation();
                request.set(CaptureRequest.JPEG_ORIENTATION, ORIENTATIONS.get(rotation));
                session.capture(request.build(), captureCallback, null);
            } catch (CameraAccessException e) {
                throwError(e.getMessage());
            }
        }

        @Override
        public void onConfigureFailed(CameraCaptureSession session) {
            throwError(ERR_CONFIGURE);
        }
    };

    private CameraCaptureSession.CaptureCallback captureCallback = new CameraCaptureSession.CaptureCallback() {
        @Override
        public void onCaptureFailed(CameraCaptureSession session, CaptureRequest request, CaptureFailure failure) {
            super.onCaptureFailed(session, request, failure);
            throwError(failure.toString());
        }
    };

    private ImageReader.OnImageAvailableListener imageListener = new ImageReader.OnImageAvailableListener() {
        @Override
        public void onImageAvailable(ImageReader reader) {
            Image image = reader.acquireLatestImage();
            try {
                File file = saveImage(image);
                // Send file via a listener
                closeCamera();
            } catch (IOException e) {
                throwError(e.getMessage());
            }
            reader.close();
        }
    };

    private File saveImage(Image image) throws IOException {
        File file = new File(filepath);
        if (file.exists()) {
            throwError(FILE_EXIST);
            return null;
        } else {
            ByteBuffer buffer = image.getPlanes()[0].getBuffer();
            byte[] bytes = new byte[buffer.remaining()];
            buffer.get(bytes);
            FileOutputStream output = new FileOutputStream(file);
            output.write(bytes);
            image.close();
            output.close();
            return file;
        }
    }

    static class CompareSizesByArea implements Comparator<Size> {
        @Override
        public int compare(Size lhs, Size rhs) {
            return Long.signum((long) lhs.getWidth() * lhs.getHeight() - (long) rhs.getWidth() * rhs.getHeight());
        }
    }

    private void closeCamera() {
        if (session != null) { session.close(); }
        if (reader != null) { reader.close(); }
        if (camera != null) { camera.close(); }
    }
}
Then I use the Cam object in my Activity:
Cam cam = new Cam(MainActivity.this);
cam.takePicture(Cam.BACK, "/sdcard/pic.jpg");
A listener notifies the MainActivity when the picture is available, but I removed that code to keep things clear.
I don't know what I'm doing wrong; the pictures are really dark. Maybe a flag or something... Any help will be appreciated.
EDIT:
Working class : https://github.com/omaflak/Android-Camera2-Library/blob/master/ezcam/src/main/java/me/aflak/ezcam/EZCam.java
Example: https://github.com/omaflak/Android-Camera2-Library/blob/master/app/src/main/java/me/aflak/libraries/MainActivity.java
If the only capture request you send to the camera is the one for the final picture, this is not surprising.
The camera automatic exposure, focus, and white balance routines generally need a second or two of streaming buffers before they converge to good results.
While you don't need to draw preview on screen, the simplest method here is to first run a repeating request targeting a dummy SurfaceTexture for a second or two, and then fire off the JPEG capture.
You could just stream the JPEG capture, but JPEG capture is slow, so you'll need a longer time for convergence (plus it's more likely a camera implementation has a bug with repeated JPEG capture and getting good exposure, than with a typical preview).
So, create a dummy SurfaceTexture with a random texture ID argument:
private SurfaceTexture mDummyPreview = new SurfaceTexture(1);
private Surface mDummySurface = new Surface(mDummyPreview);
and include the Surface in your session configuration. Once the session is configured, create a preview request that targets the dummy preview, and after N capture results have come in, submit the capture request for the JPEG you want. You'll want to experiment with N, but probably ~30 frames is enough.
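To make the "wait N frames, then capture" idea concrete, here is a minimal sketch of such a capture callback. The field names (`mJpegRequest`, `mFrameCount`) are my own assumptions, not from the question's code; the JPEG request is assumed to have been built elsewhere and to target your ImageReader.

```java
// Sketch: attach this callback to the repeating request that targets
// the dummy Surface. After N completed frames (AE/AWB should have
// converged by then), stop the preview and fire the JPEG capture.
private static final int CONVERGENCE_FRAMES = 30; // experiment with this
private int mFrameCount = 0;

private final CameraCaptureSession.CaptureCallback mDummyPreviewCallback =
        new CameraCaptureSession.CaptureCallback() {
    @Override
    public void onCaptureCompleted(CameraCaptureSession session,
                                   CaptureRequest request,
                                   TotalCaptureResult result) {
        mFrameCount++;
        if (mFrameCount == CONVERGENCE_FRAMES) {
            try {
                session.stopRepeating();
                // mJpegRequest: a TEMPLATE_STILL_CAPTURE request built
                // earlier, targeting the ImageReader's Surface.
                session.capture(mJpegRequest, null, null);
            } catch (CameraAccessException e) {
                e.printStackTrace();
            }
        }
    }
};
```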
Note that you're still not dealing with:
Locking AF to ensure a sharp image for your JPEG
Running AE precapture to allow for flash metering, if you want to allow for flash use
Having some way for the user to know what they'll be capturing; since there's no preview, they can't aim the device at anything very well.
The AF lock and precapture trigger sequences are included in Camera2Basic sample here: https://github.com/googlesamples/android-Camera2Basic, so you can take a look at what those do.
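As a rough outline of what those trigger sequences involve (builder and handler names here are placeholders, and the full state-machine handling is in Camera2Basic):

```java
// Sketch of the AF-lock / AE-precapture sequence, assuming
// previewBuilder targets the (dummy) preview Surface and
// mSession / mHandler already exist.

// 1) Lock focus: one capture with the AF trigger set.
previewBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER,
        CameraMetadata.CONTROL_AF_TRIGGER_START);
mSession.capture(previewBuilder.build(), mCaptureCallback, mHandler);

// 2) In mCaptureCallback, wait until CONTROL_AF_STATE reports
//    FOCUSED_LOCKED (or NOT_FOCUSED_LOCKED), then run AE precapture,
//    which also handles flash metering if flash is enabled:
previewBuilder.set(CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER,
        CameraMetadata.CONTROL_AE_PRECAPTURE_TRIGGER_START);
mSession.capture(previewBuilder.build(), mCaptureCallback, mHandler);

// 3) Once CONTROL_AE_STATE leaves PRECAPTURE, submit the actual
//    TEMPLATE_STILL_CAPTURE request for the JPEG.
```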
Maybe you can try to turn on the automatic exposure mode and automatic white balance:
request.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON);
request.set(CaptureRequest.CONTROL_AWB_MODE, CaptureRequest.CONTROL_AWB_MODE_AUTO);
I hope it will help :)
In my case, just configuring the FPS solved it. Don't forget to set it on the CaptureRequest.Builder for the preview and ALSO on the capture builder. Usually 10 or 15 FPS is quite enough for both photo and preview.
Capture builder
// This is the CaptureRequest.Builder that we use to take a picture.
final CaptureRequest.Builder captureBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
...
setupFPS(captureBuilder);
Preview builder:
// We set up a CaptureRequest.Builder with the output Surface.
mPreviewRequestBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
mPreviewRequestBuilder.addTarget(surface);
...
// set FPS rate
setupFPS(mPreviewRequestBuilder);
Where setupFPS:
private void setupFPS(CaptureRequest.Builder builder) {
    if (fpsRange != null) {
        builder.set(CaptureRequest.CONTROL_AE_TARGET_FPS_RANGE, fpsRange);
    }
}
And initialize the FPS range with:
CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraId);
try {
    Range<Integer>[] ranges = characteristics.get(CameraCharacteristics.CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES);
    if (ranges != null) {
        for (Range<Integer> range : ranges) {
            int upper = range.getUpper();
            Log.i(TAG, "[FPS Range Available]:" + range);
            if (upper >= 10) {
                if (fpsRange == null || upper < fpsRange.getUpper()) {
                    fpsRange = range;
                }
            }
        }
    }
} catch (Exception e) {
    e.printStackTrace();
}
Log.i(TAG, "[FPS Range] is:" + fpsRange);
I was facing the dark image issue for the last two days and now I have a solution for it.
You need to set up the CaptureRequest as below; the trick is to raise the brightness via exposure compensation in the camera2 API.
final CaptureRequest.Builder captureBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
captureBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON);
captureBuilder.set(CaptureRequest.CONTROL_AE_LOCK, true);
captureBuilder.set(CaptureRequest.CONTROL_AE_EXPOSURE_COMPENSATION, 10);
captureBuilder.addTarget(imageReader.getSurface());
I know this is old, but I ran into a similar issue. Sharing what worked for me for anyone else who stumbles upon this. Tried all sorts of answers here with no success.
Setting CONTROL_AE_MODE to CONTROL_AE_MODE_ON (as some suggested) also did not fix it (you'd think it would).
What fixed it for me was setting the CONTROL_CAPTURE_INTENT to CONTROL_CAPTURE_INTENT_VIDEO_RECORD.
request.set(CaptureRequest.CONTROL_CAPTURE_INTENT, CaptureRequest.CONTROL_CAPTURE_INTENT_VIDEO_RECORD);
After adding this line and rebuilding, auto exposure was enabled on the camera as expected and the camera adjusted automatically.
Refer to https://developer.android.com/reference/android/hardware/camera2/CameraMetadata for more information. I used this as a guide to discover what options were available.
Previously, the flashlight could be controlled using the Camera class. But now that Camera and the related classes in the android.hardware package are deprecated, I should instead use the classes in the android.hardware.camera2 package.
Traditionally, I coded the flashlight part like this:
// getting camera parameters
private void getCamera() {
    if (camera == null) {
        try {
            camera = Camera.open();
            params = camera.getParameters();
        } catch (RuntimeException e) {
            Log.e("Camera Error. Failed to Open. Error: ", e.getMessage());
        }
    }
}

/*
 * Turning on flash
 */
private void turnOnFlash() {
    if (!isFlashOn) {
        if (camera == null || params == null) {
            return;
        }
        // play sound
        playSound();
        params = camera.getParameters();
        params.setFlashMode(Parameters.FLASH_MODE_TORCH);
        camera.setParameters(params);
        camera.startPreview();
        isFlashOn = true;
        // changing button/switch image
        toggleButtonImage();
    }
}
But with the new API I'm confused about how to achieve the same thing. Can anybody explain?
For the flashlight, I advise using the camera2 API only from Android 6 (API 23) onward. My function for toggling the flashlight looks like this:
@TargetApi(Build.VERSION_CODES.M)
public void toggleMarshmallowFlashlight(boolean enable) {
    try {
        final CameraManager manager = (CameraManager) mContext.getSystemService(Context.CAMERA_SERVICE);
        final String[] list = manager.getCameraIdList();
        manager.setTorchMode(list[0], enable);
    } catch (CameraAccessException e) {
        e.printStackTrace(); // don't swallow the exception silently
    }
}