Managing the camera - Fail to access camera - android

I'm writing a program that accesses the camera. It worked for a while, but now the Camera API is throwing exceptions because the camera is still in use when I try to open it. I think this is because I'm not handling it correctly in onPause, onResume, and where I use it. Here is my code:
@Override
protected void onResume() {
    if (!getPackageManager()
            .hasSystemFeature(PackageManager.FEATURE_CAMERA)) {
        Toast.makeText(this, "No camera on this device", Toast.LENGTH_LONG)
                .show();
    } else {
        cameraId = findFrontFacingCamera();
        if (cameraId < 0) {
            Toast.makeText(this, "No front facing camera found.",
                    Toast.LENGTH_LONG).show();
        } else {
            try {
                if (camera != null) {
                    camera.release();
                    camera = null;
                }
                camera = Camera.open(cameraId);
                try {
                    camera.setPreviewTexture(cameraTexture);
                } catch (IOException e) {
                    Log.i("CameraHome", "could not set camera preview");
                }
            } catch (Exception e) {
            }
        }
    }
    super.onResume();
}
Pause:
@Override
public void onPause() {
    stopLocationUpdates();
    SharedPreferences.Editor editor = sharedPreferences.edit();
    editor.putLong("timeLeft", timeLeft);
    editor.putBoolean("lockedOut", lockedOut);
    editor.apply();
    if (camera != null) {
        camera.release();
        camera = null;
    }
    super.onPause();
}
In use:
// Take picture
camera.startPreview();
camera.takePicture(null, null,
        new PhotoHandler(getApplicationContext()));
camera.stopPreview();
camera.release();
Got any ideas what I'm doing wrong? Every time I relaunch the application, it throws an exception saying that the camera is in use, even if I launch the default camera app and close it.
EDIT: The camera is being accessed, but my callback (onPictureTaken in PhotoHandler) is never being called, as if the picture isn't being captured properly.

Turns out this code works. I had to uninstall, restart, and reinstall. Then my onPictureTaken callback wasn't being called; I believe this was because I was calling stopPreview too soon. I moved stopPreview into my onPause and removed the release call after the takePicture call. It works fine now.
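For reference, a minimal sketch of the corrected flow described above. The takePicture() helper name is my own; the camera field and PhotoHandler are assumed from the question's code.
// Inside the Activity; 'camera' and PhotoHandler come from the question's code.

private void takePicture() {
    camera.startPreview();
    // onPictureTaken in PhotoHandler fires asynchronously; don't stop the preview
    // or release the camera here, or the callback may never be delivered.
    camera.takePicture(null, null, new PhotoHandler(getApplicationContext()));
}

@Override
protected void onPause() {
    if (camera != null) {
        camera.stopPreview();   // stop the preview only when leaving the screen
        camera.release();       // release so the camera can be reopened later
        camera = null;
    }
    super.onPause();
}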

Related

How to automatically turn on flashlight for video recording when the scene is dark?

My app shows a preview, and video recording starts with a button press.
What I'm trying to achieve is to automatically turn on the flashlight (torch mode) as soon as video recording starts.
However, I couldn't find a way to do so. With the Camera2 API we can use FLASH_MODE_AUTO, which fires the flash when capturing a photo in a dark scene, but that doesn't work for video recording.
There is FLASH_MODE_TORCH, which I could use to turn on the flashlight just like I want, but there isn't a FLASH_MODE_TORCH_AUTO to do so automatically when the scene is dark.
Some answers use the device's ambient light sensor (Sensor.TYPE_LIGHT) to decide whether the scene is dark, but I think that uses the front ambient light sensor rather than the camera itself. This is not ideal: the ambient light can be low while the rear camera can still adjust its exposure to get good enough image quality without flash. Ideally, the app should activate FLASH_MODE_TORCH only if the camera itself says flash is required.
Since the app shows a preview, the device already knows whether flash is needed before the button press. Is there a way to determine whether flash is required during the preview?
Please try the methods below; you can use them where you need to.
Below is for the Camera API:
public void switchFlashOnMode() {
    Camera.Parameters p = getCamera().getParameters();
    try {
        //p.setFlashMode(Camera.Parameters.FLASH_MODE_TORCH);
        p.setFlashMode(Camera.Parameters.FLASH_MODE_AUTO);
        getCamera().setParameters(p);
        getCamera().startPreview();
        isFlashTorch = true;
    } catch (Exception e) {
        e.printStackTrace();
    }
}

public void switchFlashOffMode() {
    Camera.Parameters p = getCamera().getParameters();
    try {
        p.setFlashMode(Camera.Parameters.FLASH_MODE_OFF);
        getCamera().setParameters(p);
        Thread.sleep(200);
        getCamera().stopPreview();
        isFlashTorch = false;
    } catch (Exception e) {
        e.printStackTrace();
    }
}
Below is for the Camera2 API:
void switchFlashMode() {
    if (!flashSupport) return;
    try {
        if (isFlashTorch) {
            isFlashTorch = false;
            requestBuilder.set(CaptureRequest.FLASH_MODE, CameraMetadata.FLASH_MODE_OFF);
        } else {
            isFlashTorch = true;
            //requestBuilder.set(CaptureRequest.FLASH_MODE, CameraMetadata.FLASH_MODE_TORCH);
            requestBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON_AUTO_FLASH);
        }
        cameraCaptureSession.setRepeatingRequest(requestBuilder.build(), null, null);
    } catch (Exception e) {
        e.printStackTrace();
    }
}
Hope it will help you.
Finally figured this out. gpuser's answer used the right flag, but it is not complete: you still need to implement the callback and turn on the torch when needed.
I also found that for video recording we still use the same Camera2 API init and configuration steps; it's just that some of the callbacks fire multiple times, so I added a flag to perform the flash detection only once.
1) After the camera has started capturing, run this code:
performAutoTorchDetectionOnce = true; // set this flag first, will be used later
captureRequestBuilder = camera.createCaptureRequest(CameraDevice.TEMPLATE_RECORD);
captureRequestBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON_AUTO_FLASH); // CONTROL_AE_MODE_ON_AUTO_FLASH is important here, to enable flash detection
captureSession.setRepeatingRequest(captureRequestBuilder.build(), captureCallback, null);
2) This is my captureCallback implementation; change it depending on your needs. The gist of it is that the capture will eventually fall into one of two states, CONTROL_AE_STATE_CONVERGED or CONTROL_AE_STATE_FLASH_REQUIRED. Both mean the auto-exposure algorithm has finished running: CONVERGED means no flash is needed, whereas FLASH_REQUIRED means we have to turn on the flash. In the latter case we then need to turn the torch on manually in the next step.
private CameraCaptureSession.CaptureCallback captureCallback =
        new CameraCaptureSession.CaptureCallback() {
    @Override
    public void onCaptureStarted(CameraCaptureSession session, CaptureRequest request,
                                 long timestamp, long frameNumber) {
        super.onCaptureStarted(session, request, timestamp, frameNumber);
    }

    @Override
    public void onCaptureCompleted(CameraCaptureSession session, CaptureRequest request, TotalCaptureResult result) {
        super.onCaptureCompleted(session, request, result);
        Integer aeState = result.get(CaptureResult.CONTROL_AE_STATE);
        if (aeState != null) {
            if (performAutoTorchDetectionOnce) {
                if (aeState == CameraMetadata.CONTROL_AE_STATE_CONVERGED // auto-exposure has finished
                        || aeState == CameraMetadata.CONTROL_AE_STATE_FLASH_REQUIRED) { // auto-exposure has finished, but flash is required
                    performAutoTorchDetectionOnce = false;
                    enableTorch(aeState == CameraMetadata.CONTROL_AE_STATE_FLASH_REQUIRED);
                }
            }
        }
    }
};
3) Here's the enableTorch implementation. I tried leaving CONTROL_AE_MODE as CONTROL_AE_MODE_ON_AUTO_FLASH, but it didn't work: the torch does not turn on, so I had to change it to CONTROL_AE_MODE_ON.
public synchronized void enableTorch(boolean enable) {
    Timber.d("enableTorch(" + enable + ") called");
    try {
        if (isCaptureStarted()) {
            if (enable) {
                captureRequestBuilder.set(CaptureRequest.FLASH_MODE, CaptureRequest.FLASH_MODE_TORCH);
            } else {
                captureRequestBuilder.set(CaptureRequest.FLASH_MODE, CaptureRequest.FLASH_MODE_OFF);
            }
            captureRequestBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON);
            captureSession.setRepeatingRequest(captureRequestBuilder.build(), null, null);
        }
    } catch (CameraAccessException e) {
        Timber.e(e, "enableTorch(" + enable + ") failed: ");
    }
}

The camera is in use by another app

I am developing a WebRTC app using OpenTok. The app was working fine. I converted the app to a library and launch the library activity by adding it to another project. The app connects to the server, but the camera does not open. I am getting the following camera error:
E/opentok-videocapturer: The camera is in use by another app
java.lang.RuntimeException: Fail to connect to camera service
at android.hardware.Camera.<init>(Camera.java:518)
at android.hardware.Camera.open(Camera.java:360)
at com.opentok.android.DefaultVideoCapturer.init(DefaultVideoCapturer.java:110)
at com.opentok.android.BaseVideoCapturer.initTrap(BaseVideoCapturer.java:223)
public boolean isCameraUsebyApp() {
    Camera camera = null;
    try {
        camera = Camera.open();
    } catch (RuntimeException e) {
        return true;
    } finally {
        if (camera != null) {
            camera.release();
        }
    }
    return false;
}
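One possible way to use this check before handing the camera to the video capturer (a sketch only; how you surface the error to the user is up to you):
if (isCameraUsebyApp()) {
    // Another app (or a leaked Camera instance) still holds the camera;
    // tell the user instead of letting the capturer crash with a RuntimeException.
    Toast.makeText(this, "Camera is in use by another app", Toast.LENGTH_LONG).show();
    return;
}
// Safe to start the video capturer / publisher here.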

How to save camera instance?

How do I save the state of the camera so that when I close the screen and turn it on again the camera returns to its previous state? Should I bundle it?
public void getCamera() {
    if (camera == null) {
        try {
            camera = Camera.open();
            params = camera.getParameters();
        } catch (RuntimeException e) {
            Log.e(e.getMessage(), "Camera Error. Failed to Open. Error: Application will close!");
            finish();
        }
    }
}
Generally you don't want to do this; you want to release the camera in onPause and reacquire it in onResume so other apps can use it as well.
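A minimal sketch of that pattern, assuming the camera field and getCamera() helper from the question. The savedParams field is my own addition for illustration; it keeps the flattened parameter string (which survives the Camera object being released) so the previous state can be reapplied.
private Camera camera;
private String savedParams;  // flattened parameters, survives the camera being released

@Override
protected void onResume() {
    super.onResume();
    getCamera();  // reacquire the camera when the screen comes back
    if (camera != null && savedParams != null) {
        Camera.Parameters p = camera.getParameters();
        p.unflatten(savedParams);   // restore the previously saved state
        camera.setParameters(p);
    }
}

@Override
protected void onPause() {
    if (camera != null) {
        savedParams = camera.getParameters().flatten();  // remember the state before letting go
        camera.release();   // release so other apps can use the camera
        camera = null;
    }
    super.onPause();
}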

Surfaceview using Mobile Vision Displays the Camera in Landscape

I am using SurfaceView and Google's Mobile Vision library. On many devices it looks fine, but on a few devices like the Nexus 7 the camera view comes up in landscape mode, which makes it difficult to scan barcodes etc., since it is hard to focus and position correctly.
As far as I have explored, the Vision library has no method that returns the hardware camera, so we cannot manage the orientation ourselves (e.g. if the camera view comes up in landscape, dynamically rotate the view to make it look like portrait).
So I wanted to ask if there is any way, on devices like the Nexus 7, to change the camera or the view to portrait.
Any help will be welcomed! Thanks
Many tablets have their camera mounted rotated, so that when held horizontally the picture is taken as "portrait", even though the image is actually wider than it is high.
I learned this the hard way on an app I built some time ago. The only way was to check the screen aspect against the image aspect and the image rotation.
By comparing these, you can infer whether a camera image is rotated correctly, or whether it needs a 90 degree post-rotation.
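A rough sketch of that comparison (the helper name and parameters are my own, not from the original answer; imageRotation would come from wherever your pipeline reports it, e.g. EXIF):
// Decide whether a captured image needs a 90-degree post-rotation
// by comparing the screen orientation with the image's effective orientation.
static boolean needsPostRotation(Display display, int imageWidth, int imageHeight, int imageRotation) {
    Point screen = new Point();
    display.getSize(screen);

    boolean screenIsPortrait = screen.y > screen.x;
    boolean imageIsPortrait;
    if (imageRotation == 90 || imageRotation == 270) {
        // The reported rotation already swaps width and height.
        imageIsPortrait = imageWidth > imageHeight;
    } else {
        imageIsPortrait = imageHeight > imageWidth;
    }

    // If the screen and the (effective) image disagree, rotate the image by 90 degrees.
    return screenIsPortrait != imageIsPortrait;
}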
I found a solution for myself, getting an idea from this person's answer:
https://stackoverflow.com/a/41634379/5028531
So what I did:
cameraPreview.getHolder().addCallback(new SurfaceHolder.Callback() {
    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        if (ActivityCompat.checkSelfPermission(getActivity(), Manifest.permission.CAMERA) == PackageManager.PERMISSION_GRANTED) {
            try {
                cameraSource.start(cameraPreview.getHolder());
                // CameraSource does not expose its Camera, so grab it via reflection.
                Field[] declaredFields = CameraSource.class.getDeclaredFields();
                for (Field field : declaredFields) {
                    if (field.getType() == Camera.class) {
                        field.setAccessible(true);
                        try {
                            Camera camera = (Camera) field.get(cameraSource);
                            if (camera != null) {
                                Camera.Parameters params = camera.getParameters();
                                camera.setDisplayOrientation(0);
                            }
                        } catch (IllegalAccessException | RuntimeException e) {
                            e.getMessage();
                        }
                        break;
                    }
                }
            } catch (IOException e) {
                Log.e("CAMERA SOURCE", e.getMessage());
                e.printStackTrace();
            }
        } else {
            Log.w("CAMERA SOURCE", "Permission not granted");
            Toast.makeText(getActivity(), "Camera permission denied", Toast.LENGTH_SHORT).show();
        }
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
    }
});

Runtime Error: failed to connect to camera service in android

I wanted to make use of the ZXing library to detect QR codes in my app. But for the app's viewing purposes I had to change the custom display orientation to portrait, so I had to integrate the whole ZXing library into my app and add camera.setDisplayOrientation(90) to the openDriver() method.
After doing this the program works, but I randomly get "RuntimeException: Fail to connect to camera service".
public void openDriver(SurfaceHolder holder) throws IOException {
    if (camera == null) {
        camera = Camera.open();
        camera.setDisplayOrientation(90);
        if (camera == null) {
            throw new IOException();
        }
    }
    camera.setPreviewDisplay(holder);
    if (!initialized) {
        initialized = true;
        configManager.initFromCameraParameters(camera);
    }
    configManager.setDesiredCameraParameters(camera);
    SharedPreferences prefs = PreferenceManager.getDefaultSharedPreferences(context);
    reverseImage = prefs.getBoolean(PreferencesActivity.KEY_REVERSE_IMAGE, false);
    if (prefs.getBoolean(PreferencesActivity.KEY_FRONT_LIGHT, false)) {
        FlashlightManager.enableFlashlight();
    }
}

public void closeDriver() {
    if (camera != null) {
        FlashlightManager.disableFlashlight();
        camera.release();
        camera = null;
        framingRect = null;
        framingRectInPreview = null;
    }
}

/**
 * Asks the camera hardware to begin drawing preview frames to the screen.
 */
public void startPreview() {
    if (camera != null && !previewing) {
        camera.startPreview();
        previewing = true;
    }
}

/**
 * Tells the camera to stop drawing preview frames.
 */
public void stopPreview() {
    if (camera != null && previewing) {
        if (!useOneShotPreviewCallback) {
            camera.setPreviewCallback(null);
        }
        camera.stopPreview();
        previewCallback.setHandler(null, 0);
        autoFocusCallback.setHandler(null, 0);
        previewing = false;
    }
}
I doubt that the orientation change is causing that. I have found you will get that error whenever an activity stops but fails to call Camera.release in its onPause. The result is that the next time you try to call Camera.open you get that runtime error, since the driver still considers the camera open even though the app/activity that opened it is gone.
You can easily make this happen while debugging/testing when something throws an exception and brings the activity down. You need to be very diligent about catching all exceptions and releasing the camera before finishing the activity.
BTW, are you finding you need to power cycle the device in order to be able to open the camera again?
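For illustration, one defensive pattern along those lines (a sketch under my own naming, not the asker's code): always release in onPause, and release before rethrowing any exception that is about to take the activity down.
@Override
protected void onPause() {
    releaseCamera();   // always let go of the camera when the activity stops
    super.onPause();
}

private void releaseCamera() {
    if (camera != null) {
        try {
            camera.stopPreview();
        } catch (RuntimeException ignored) {
            // Preview may already be stopped; releasing is what matters.
        }
        camera.release();
        camera = null;
    }
}

private void doCameraWork() {
    try {
        // ... use the camera ...
    } catch (RuntimeException e) {
        releaseCamera();   // release before the exception finishes the activity
        throw e;
    }
}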
