I'm working on an Android camera app that needs to use a fixed (manual) focus, and always uses the flash. I'm having some issues that seem to be with the flash timing. The flash always fires and an image is always acquired, but sometimes the flash doesn't actually illuminate the captured frame. Some frames have flash, some are overexposed, and some are dark; basically it's inconsistent and unpredictable.
I based my code off the Camera2Basic example. I think I've shown all the relevant parts here:
My preview request builder is set up as follows:
mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_OFF);
float minimumLens = mCameraCharacteristics.get(CameraCharacteristics.LENS_INFO_MINIMUM_FOCUS_DISTANCE);
mPreviewRequestBuilder.set(CaptureRequest.LENS_FOCUS_DISTANCE, minimumLens);
mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AWB_MODE, CaptureRequest.CONTROL_AWB_MODE_OFF);
mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON_ALWAYS_FLASH);
mPreviewRequestBuilder.set(CaptureRequest.FLASH_MODE, CameraMetadata.FLASH_MODE_OFF);
Then the actual sequence that acquires the pictures (almost straight from Camera2Basic) is:
private void takePicture() {
    runPrecaptureSequence();
}
private CameraCaptureSession.CaptureCallback mCaptureCallback = new CameraCaptureSession.CaptureCallback() {

    private void process(CaptureResult result) {
        switch (mState) {
            case STATE_PREVIEW: {
                break;
            }
            case STATE_WAITING_PRECAPTURE: {
                Integer aeState = result.get(CaptureResult.CONTROL_AE_STATE);
                if (aeState == null ||
                        aeState == CaptureResult.CONTROL_AE_STATE_PRECAPTURE ||
                        aeState == CaptureResult.CONTROL_AE_STATE_FLASH_REQUIRED) {
                    mState = STATE_CAPTURE;
                }
                break;
            }
            case STATE_CAPTURE: {
                // CONTROL_AE_STATE can be null on some devices
                Integer aeState = result.get(CaptureResult.CONTROL_AE_STATE);
                if (aeState == null || aeState != CaptureResult.CONTROL_AE_STATE_PRECAPTURE) {
                    mState = STATE_PICTURE_TAKEN;
                    captureStillPicture();
                }
                break;
            }
        }
    }

    @Override
    public void onCaptureProgressed(@NonNull CameraCaptureSession session,
                                    @NonNull CaptureRequest request,
                                    @NonNull CaptureResult partialResult) {
        process(partialResult);
    }

    @Override
    public void onCaptureCompleted(@NonNull CameraCaptureSession session,
                                   @NonNull CaptureRequest request,
                                   @NonNull TotalCaptureResult result) {
        process(result);
    }
};
private void runPrecaptureSequence() {
    try {
        mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER,
                CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER_START);
        mState = STATE_WAITING_PRECAPTURE;
        mCaptureSession.capture(mPreviewRequestBuilder.build(), mCaptureCallback, mBackgroundHandler);
    } catch (CameraAccessException e) {
        e.printStackTrace();
    }
}
private void captureStillPicture() {
    try {
        final Activity activity = getActivity();
        if (null == activity || null == mCameraDevice) {
            return;
        }
        final CaptureRequest.Builder captureBuilder =
                mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
        captureBuilder.addTarget(mImageReader.getSurface());
        captureBuilder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_OFF);
        captureBuilder.set(CaptureRequest.CONTROL_AWB_MODE, CaptureRequest.CONTROL_AWB_MODE_OFF);
        float minimumLens = mCameraCharacteristics.get(CameraCharacteristics.LENS_INFO_MINIMUM_FOCUS_DISTANCE);
        captureBuilder.set(CaptureRequest.LENS_FOCUS_DISTANCE, minimumLens);
        captureBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON_ALWAYS_FLASH);
        captureBuilder.set(CaptureRequest.FLASH_MODE, CameraMetadata.FLASH_MODE_OFF);
        int rotation = activity.getWindowManager().getDefaultDisplay().getRotation();
        captureBuilder.set(CaptureRequest.JPEG_ORIENTATION, getOrientation(rotation));
        mFileName = getFileNameFromTime() + ".jpg";
        CameraCaptureSession.CaptureCallback CaptureCallback
                = new CameraCaptureSession.CaptureCallback() {
            @Override
            public void onCaptureCompleted(@NonNull CameraCaptureSession session,
                                           @NonNull CaptureRequest request,
                                           @NonNull TotalCaptureResult result) {
                resumePreview();
            }
        };
        mCaptureSession.stopRepeating();
        mCaptureSession.capture(captureBuilder.build(), CaptureCallback, null);
    } catch (CameraAccessException e) {
        e.printStackTrace();
    }
}
Saving the image takes place in the ImageReader onImageAvailableListener call and works just fine.
It does seem like the flash is firing before the image is acquired, so I tried the suggestion in this answer, but the suggested FLASH_FIRED conditional never triggered.
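For reference, the check I tried looked roughly like this (a sketch, reading CaptureResult.FLASH_STATE from the still capture's result):

@Override
public void onCaptureCompleted(@NonNull CameraCaptureSession session,
                               @NonNull CaptureRequest request,
                               @NonNull TotalCaptureResult result) {
    // FLASH_STATE reports whether the flash actually fired for this frame.
    Integer flashState = result.get(CaptureResult.FLASH_STATE);
    if (flashState != null && flashState == CaptureResult.FLASH_STATE_FIRED) {
        Log.d(TAG, "Flash fired for this frame"); // never logged in my tests
    }
    resumePreview();
}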
Can anybody with better familiarity with Camera2 than me see where I'm screwing up?
I have seen the code of Camera2Basic https://github.com/googlearchive/android-Camera2Basic and its newer Java version https://github.com/android/camera-samples/tree/master/Camera2BasicJava.
So basically there are three places in the code where flash is set (I am talking about the newer Java version):
1) private void createCameraPreviewSession()
2) private void captureStillPicture()
3) private void unlockFocus()
1) createCameraPreviewSession() gets called when the preview is about to be displayed (so if you want the flash on from the moment the app starts, set it here).
2) captureStillPicture() gets called when an HD picture is about to be captured.
3) unlockFocus() gets called after you have captured an image and want to unlock focus.
So if the flash blinks or the image gets washed out, you may be setting torch mode in only one of the last two functions.
Set the flash mode consistently in the last two functions, and try to discard the first frame you capture just after starting the flash.
I was setting the flash like this:
int flashFlag = 1;

private void setAutoFlash(CaptureRequest.Builder requestBuilder) {
    if (mFlashSupported) {
        if (flashFlag == 1) {
            requestBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON);
            requestBuilder.set(CaptureRequest.FLASH_MODE, CaptureRequest.FLASH_MODE_TORCH);
        } else {
            requestBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON);
        }
    }
}
I made flashFlag a global variable, set it in both places together, and rejected the first flash HD frame because it usually gets washed out.
Also, do not turn on the flash without capturing an HD picture, because it will leave you locked in focus.
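To reject that first washed-out frame, here is a rough sketch of the idea (mFlashFrameCount is a field I added for this illustration; it is not in the sample):

private int mFlashFrameCount = 0;

private final ImageReader.OnImageAvailableListener mOnImageAvailableListener =
        new ImageReader.OnImageAvailableListener() {
    @Override
    public void onImageAvailable(ImageReader reader) {
        Image image = reader.acquireNextImage();
        if (image == null) {
            return;
        }
        if (flashFlag == 1 && mFlashFrameCount++ == 0) {
            image.close(); // drop the first flash frame; it is usually washed out
            return;
        }
        // ... save or process the image as usual ...
        image.close();
    }
};

Remember to reset mFlashFrameCount to 0 whenever you turn the torch off and on again.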
Add the following three lines to the Camera2Basic sample:
mPreviewRequestBuilder.set(CaptureRequest.CONTROL_MODE, CaptureRequest.CONTROL_MODE_OFF);
mPreviewRequestBuilder.set(CaptureRequest.SENSOR_SENSITIVITY, 4000);
mPreviewRequestBuilder.set(CaptureRequest.FLASH_MODE, CaptureRequest.FLASH_MODE_TORCH);
When the flash is activated and a repeating request is started, the manual settings are overwritten by some kind of default settings (sensor sensitivity: 100), even though the request clearly states the sensor sensitivity should be 4000.
I verified this with these two lines in the onCaptureCompleted method:
Log.d(TAG, "request: "+request.get(CaptureRequest.SENSOR_SENSITIVITY));
Log.d(TAG, "result: "+result.get(CaptureResult.SENSOR_SENSITIVITY));
I have a camera app developed against Android SDK 26. I have been using it happily on a Motorola G5 and G6, but when I move to a Motorola G7, the app crashes when I press the button to take a picture.
The G7 is running Android 9. I have another Android 9 phone, a Samsung S10 Plus, and the S10 Plus does not crash when I press the button to take a picture.
While debugging I noticed that the G7 doesn't call ImageReader.OnImageAvailableListener, while the S10 does. Looking at the code, this is where the image is saved for later use in CameraCaptureSession.CaptureCallback. The callback expects bytes to be populated and crashes when it isn't (I haven't included the stack trace because it's not very helpful, but I can if you would like to see it).
I can get the G7 to save the image if I step through slowly in the debugger, on 'some' occasions.
So I have a button that calls onImageCaptureClick(). Inside, it does a bunch of things, one of which is creating an ImageReader.OnImageAvailableListener that saves the image and populates a class variable, bytes, from the image buffer. The listener is attached to the reader with reader.setOnImageAvailableListener(readerListener, null), but on the G7 it is never invoked. When I get into the CaptureCallback, bytes is not populated and the app crashes.
Do you have any idea where I would look to solve this?
protected void onImageCaptureClick() {
    if (null == mCameraDevice) {
        logger.debug("null == mCameraDevice");
        Log.e(TAG, "cameraDevice is null");
        return;
    }
    CameraManager manager = (CameraManager) getSystemService(Context.CAMERA_SERVICE);
    try {
        CameraCharacteristics characteristics = manager.getCameraCharacteristics(mCameraDevice.getId());
        Size[] jpegSizes = null;
        if (characteristics != null) {
            jpegSizes = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP)
                    .getOutputSizes(ImageFormat.JPEG);
        }
        int width = 640;
        int height = 480;
        if (jpegSizes != null && 0 < jpegSizes.length) {
            width = jpegSizes[0].getWidth();
            height = jpegSizes[0].getHeight();
        }
        ImageReader reader = ImageReader.newInstance(width, height, ImageFormat.JPEG, 1);
        List<Surface> outputSurfaces = new ArrayList<>(2);
        outputSurfaces.add(reader.getSurface());
        outputSurfaces.add(new Surface(mTextureView.getSurfaceTexture()));
        final CaptureRequest.Builder captureBuilder =
                mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
        captureBuilder.addTarget(reader.getSurface());
        if (mFlashMode == FLASH_MODE_OFF) {
            captureBuilder.set(CaptureRequest.FLASH_MODE, CaptureRequest.FLASH_MODE_OFF);
            logger.debug("FLASH OFF");
        }
        if (mFlashMode == CONTROL_AE_MODE_ON) {
            captureBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON);
            captureBuilder.set(CaptureRequest.FLASH_MODE, CaptureRequest.FLASH_MODE_TORCH);
            logger.debug("FLASH ON");
        }
        if (mFlashMode == CONTROL_AE_MODE_ON_AUTO_FLASH) {
            captureBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON_AUTO_FLASH);
            captureBuilder.set(CaptureRequest.FLASH_MODE, CaptureRequest.FLASH_MODE_OFF);
            logger.debug("FLASH AUTO");
        }
        captureBuilder.set(CaptureRequest.SCALER_CROP_REGION, zoom);
        int rotation = getWindowManager().getDefaultDisplay().getRotation();
        captureBuilder.set(CaptureRequest.JPEG_ORIENTATION, ORIENTATIONS.get(rotation));
        final File file = new File(_pictureUri.getPath());
        logger.debug("OnImageCaptureClick: _pictureUri is: " + _pictureUri.getPath());
        // ************************************
        // this listener is not used on the G7,
        // and so the image isn't saved.
        // ************************************
        ImageReader.OnImageAvailableListener readerListener = new ImageReader.OnImageAvailableListener() {
            @Override
            public void onImageAvailable(ImageReader reader) {
                Image image = null;
                try {
                    image = reader.acquireLatestImage();
                    ByteBuffer buffer = image.getPlanes()[0].getBuffer();
                    bytes = new byte[buffer.capacity()];
                    buffer.get(bytes);
                    logger.debug("onImageCaptureClick, the file size to save is: " + bytes.length);
                    save();
                } catch (FileNotFoundException e) {
                    e.printStackTrace();
                } catch (IOException e) {
                    e.printStackTrace();
                } finally {
                    if (image != null) {
                        image.close();
                    }
                }
            }

            private void save() throws IOException {
                OutputStream output = null;
                try {
                    output = new FileOutputStream(file);
                    output.write(bytes);
                } finally {
                    if (null != output) {
                        output.close();
                    }
                }
            }
        };
        // ********************************************************
        // the reader sets the listener here but it is never called
        // and when I get in to the CaptureCallback the BitmapUtils
        // expects bytes to be populated and crashes the app
        // ********************************************************
        reader.setOnImageAvailableListener(readerListener, null);
        final CameraCaptureSession.CaptureCallback captureListener = new CameraCaptureSession.CaptureCallback() {
            @Override
            public void onCaptureCompleted(@NonNull CameraCaptureSession session,
                                           @NonNull CaptureRequest request,
                                           @NonNull TotalCaptureResult result) {
                super.onCaptureCompleted(session, request, result);
                try {
                    BitmapUtils.addTimeStampAndRotate(_pictureUri, bytes);
                    Intent intent = new Intent(CameraActivity.this, CameraReviewPhotoActivity.class);
                    intent.putExtra(MediaStore.EXTRA_OUTPUT, _pictureUri);
                    startActivityForResult(intent, CameraActivity.kRequest_Code_Approve_Image);
                } catch (IOException e) {
                    e.printStackTrace();
                } catch (ImageReadException e) {
                    e.printStackTrace();
                } catch (ImageWriteException e) {
                    e.printStackTrace();
                }
            }
        };
        mCameraDevice.createCaptureSession(outputSurfaces, new CameraCaptureSession.StateCallback() {
            @Override
            public void onConfigured(CameraCaptureSession session) {
                try {
                    session.capture(captureBuilder.build(), captureListener, null);
                } catch (CameraAccessException e) {
                    e.printStackTrace();
                }
            }

            @Override
            public void onConfigureFailed(CameraCaptureSession session) {
                Log.w(TAG, "Failed to configure camera");
            }
        }, null);
    } catch (CameraAccessException e) {
        e.printStackTrace();
    } finally {
        takePictureButton.setEnabled(false);
        mTextureView.setEnabled(false);
    }
}
The API makes no guarantee about the order of onCaptureCompleted and OnImageAvailableListener. They may arrive in arbitrary order, depending on the device, capture settings, the load on the device, or even the particular OS build you have.
Please don't make any assumptions about it.
Instead, if you need both callbacks to fire before you process something, then wait for both to happen before you move forward. For example, check if the other callback has fired in each callback, and if so, call the method to do the processing.
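For example, a minimal sketch of that handshake (the field and method names here are only illustrative):

// Proceed only once both the capture result and the image bytes have arrived,
// in whatever order the device delivers them.
private final AtomicInteger mPendingParts = new AtomicInteger(2); // reset to 2 before each capture

private void onPartReady() {
    if (mPendingParts.decrementAndGet() == 0) {
        // Both onCaptureCompleted and onImageAvailable have fired; it is now
        // safe to read the saved bytes and the TotalCaptureResult together.
        processCapture(); // your processing method
    }
}

// Call onPartReady() at the end of onCaptureCompleted(...), and again at the
// end of onImageAvailable(...) after the bytes have been written.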
I think I have found the solution to this.
I have the following phones:
Samsung S10 Plus
Motorola G7
Motorola G6
My app works on the S10 and the G6.
The S10 and G6 both call the OnImageAvailableListener function before the onCaptureCompleted callback. The G7, however, calls them the other way around: onCaptureCompleted, then OnImageAvailableListener.
According to https://proandroiddev.com/understanding-camera2-api-from-callbacks-part-1-5d348de65950 the correct way is onCaptureCompleted then OnImageAvailableListener.
In my code I assume that OnImageAvailableListener has already saved the image by the time onCaptureCompleted tries to manipulate it, which is what causes the crash.
Looking at the INFO_SUPPORTED_HARDWARE_LEVEL of each device, I see the following reported levels of support:
Samsung S10 Plus reports device level support level 1
Motorola G7 reports device level support level 3
Motorola G6 reports device level support level 2
My assumption at this point is that the events fire in a different order when a device supports the Camera2 API at level 3 than at the other levels.
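For anyone who wants to check their own devices, a quick sketch for logging the reported level (assumes a valid Context):

try {
    CameraManager manager = (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
    for (String id : manager.getCameraIdList()) {
        CameraCharacteristics characteristics = manager.getCameraCharacteristics(id);
        Integer level = characteristics.get(CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL);
        // Numeric values: LIMITED = 0, FULL = 1, LEGACY = 2, LEVEL_3 = 3
        Log.d(TAG, "Camera " + id + " INFO_SUPPORTED_HARDWARE_LEVEL = " + level);
    }
} catch (CameraAccessException e) {
    e.printStackTrace();
}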
Hope this helps
I am writing an Android app that saves RAW/JPEG and records video at the same time. I tried passing 4 surfaces when creating the CameraCaptureSession: the preview, two ImageSaver surfaces, and one PersistentInputSurface created by MediaCodec#createPersistentInputSurface. By using a persistent input surface, I intend to avoid a stoppage between two captures.
When creating the session it fails with:
W/CameraDevice-JV-0: Stream configuration failed due to: endConfigure:380: Camera 0: Unsupported set of inputs/outputs provided
Session 0: Failed to create capture session; configuration failed
I have tried taking out all the other surfaces, leaving only the PersistentInputSurface; it still fails.
@Override
public void onResume() {
    super.onResume();
    // Some other setups...
    if (persistentRecorderSurface == null) {
        persistentRecorderSurface = MediaCodec.createPersistentInputSurface();
    }
    startBackgroundThread();
    startCamera();
    if (mPreviewView.isAvailable()) {
        configureTransform(mPreviewView.getWidth(), mPreviewView.getHeight());
    } else {
        mPreviewView.setSurfaceTextureListener(mSurfaceTextureListener);
    }
    if (mOrientationListener != null && mOrientationListener.canDetectOrientation()) {
        mOrientationListener.enable();
    }
}
private void createCameraPreviewSessionLocked() {
    try {
        SurfaceTexture texture = mPreviewView.getSurfaceTexture();
        texture.setDefaultBufferSize(mPreviewSize.getWidth(), mPreviewSize.getHeight());
        Surface surface = new Surface(texture);
        mPreviewRequestBuilder = mBackCameraDevice.createCaptureRequest(
                CameraDevice.TEMPLATE_PREVIEW);
        mPreviewRequestBuilder.addTarget(surface);
        mBackCameraDevice.createCaptureSession(Arrays.asList(
                surface,
                mJpegImageReader.get().getSurface(),
                mRAWImageReader.get().getSurface(),
                persistentRecorderSurface
        ), new CameraCaptureSession.StateCallback() {
            @Override
            public void onConfigured(CameraCaptureSession session) {
                synchronized (mCameraStateLock) {
                    if (mBackCameraDevice == null) {
                        return;
                    }
                    try {
                        setup3AControlsLocked(mPreviewRequestBuilder);
                        session.setRepeatingRequest(mPreviewRequestBuilder.build(),
                                mPreCaptureCallback, mBackgroundHandler);
                        mState = CameraStates.PREVIEW;
                    } catch (CameraAccessException | IllegalStateException e) {
                        e.printStackTrace();
                        return;
                    }
                    mSession = session;
                }
            }

            @Override
            public void onConfigureFailed(CameraCaptureSession session) {
                showToast("Failed to configure camera.");
            }
        }, mBackgroundHandler);
    } catch (CameraAccessException e) {
        e.printStackTrace();
    }
}
It'd be helpful to see the system log lines right before that error line to confirm, but most likely:
You need to actually tie the persistentRecorderSurface to a MediaRecorder or MediaCodec, and call prepare() on those, before you create the camera capture session.
Otherwise, there's nothing actually at the other end of the persistent surface, and the camera can't tell what resolution or other settings are required.
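Roughly, a sketch of that setup (the output path, size, and rates below are placeholders, not required values):

// Give the persistent surface a consumer and configure it before creating the session.
private void prepareRecorder(Surface persistentRecorderSurface) throws IOException {
    MediaRecorder recorder = new MediaRecorder();
    recorder.setVideoSource(MediaRecorder.VideoSource.SURFACE);
    recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
    recorder.setOutputFile("/path/to/output.mp4");   // placeholder path
    recorder.setVideoEncodingBitRate(10_000_000);    // placeholder bit rate
    recorder.setVideoFrameRate(30);                  // placeholder frame rate
    recorder.setVideoSize(1920, 1080);               // must be a size the camera supports
    recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
    recorder.setInputSurface(persistentRecorderSurface); // ties the persistent surface to the encoder
    recorder.prepare(); // after this, the surface has a configured consumer
}
// Only then create the capture session that includes persistentRecorderSurface.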
Also keep in mind that there are limits on how many concurrent outputs you can have from the camera, depending on its supported hardware level and capabilities. There is currently no requirement that a device must support your combination of outputs (preview, record, JPEG, RAW), unfortunately, so it's very likely many or all devices will still give you an error.
I need to build an application with an activity that captures images in burst mode at 10 FPS.
My problem is that I can capture pictures very quickly in burst mode, but my image buffer fills up before I have time to process the images (I just want to put them into an ArrayList).
So my buffer fills up and my app crashes.
The buffer size I've chosen is 50 (I cannot use too much memory).
It seems that my burst mode can capture at 30 FPS, but I can only process my images at 10 FPS.
So I want to slow down the capturing, but when I do that (by waiting before repeating another capture), it slows down my preview too, so I get a lag in the preview.
I don't want to save my images, I just want to store them so I can process them later (in another activity).
Do you have any ideas, either to slow down my capturing or to speed up my processing?
Code:
My code is based on the sample Camera2Basic.
private void captureStillPicture() {
    try {
        final Activity activity = getActivity();
        if (null == activity || null == mCameraDevice) {
            return;
        }
        mPreviewRequestBuilder.set(CaptureRequest.EDGE_MODE, CaptureRequest.EDGE_MODE_OFF);
        mPreviewRequestBuilder.set(CaptureRequest.LENS_OPTICAL_STABILIZATION_MODE,
                CaptureRequest.LENS_OPTICAL_STABILIZATION_MODE_ON);
        mPreviewRequestBuilder.set(CaptureRequest.COLOR_CORRECTION_ABERRATION_MODE,
                CaptureRequest.COLOR_CORRECTION_ABERRATION_MODE_OFF);
        mPreviewRequestBuilder.set(CaptureRequest.NOISE_REDUCTION_MODE, CaptureRequest.NOISE_REDUCTION_MODE_OFF);
        mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER, CaptureRequest.CONTROL_AF_TRIGGER_CANCEL);
        mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_LOCK, true);
        mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AWB_LOCK, true);
        mCaptureSession.stopRepeating();
        mCaptureSession.abortCaptures();
        mPreviewRequestBuilder.addTarget(mImageReader.getSurface());
        mCaptureSession.capture(mPreviewRequestBuilder.build(), captureCallback, null);
        mPreviewRequestBuilder.removeTarget(mImageReader.getSurface());
    } catch (CameraAccessException e) {
        e.printStackTrace();
    }
}
blockCapture() is supposed to wait 100 ms between captures:
private void blockCapture() {
    mPreviewRequestBuilder.removeTarget(mImageReader.getSurface());
    while (Math.abs(timerBurst - System.currentTimeMillis()) < 1000) {}
    System.out.println("ELAPSED :::::::: " + Math.abs(timerBurst - System.currentTimeMillis()));
    timerBurst = System.currentTimeMillis();
    mPreviewRequestBuilder.addTarget(mImageReader.getSurface());
    for (int i = 0; i < NB_BURST; i++) {
        captureList.add(mPreviewRequestBuilder.build());
    }
}
private final CameraCaptureSession.CaptureCallback captureCallback
        = new CameraCaptureSession.CaptureCallback() {
    @Override
    public void onCaptureCompleted(@NonNull CameraCaptureSession session,
                                   @NonNull CaptureRequest request,
                                   @NonNull TotalCaptureResult result) {
        showToast("Saved: " + mFile);
        Log.d(TAG, "Saved: " + mFile);
        mCounterImageForSave++;
        mCounterBurst++;
        if (mCounterImageForSave <= 1) {
            timerBurst = System.currentTimeMillis();
        }
        if (mCounterBurst >= NB_BURST && isPlayed) {
            mCounterBurst = 0;
            try {
                blockCapture();
                mCaptureSession.capture(mPreviewRequestBuilder.build(), captureCallback, null);
            } catch (CameraAccessException e) {
                e.printStackTrace();
            }
        }
        if (mCounterBurst >= NB_BURST || !isPlayed)
            unlockFocus();
    }
};
I have tried capturing YUV_420_888 images instead of JPEG, but my images were all black, and I could not convert them.
Sorry if I've made mistakes with my English.
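For reference, a stride-aware YUV_420_888-to-JPEG conversion sketch (I have not verified this on my device; it assumes a full-frame crop and builds an NV21 buffer by hand):

private byte[] yuv420ToJpeg(Image image, int quality) {
    int w = image.getWidth();
    int h = image.getHeight();
    byte[] nv21 = new byte[w * h * 3 / 2];
    // Copy the luma plane row by row (rowStride may be larger than the width).
    Image.Plane yPlane = image.getPlanes()[0];
    ByteBuffer yBuf = yPlane.getBuffer();
    int pos = 0;
    for (int row = 0; row < h; row++) {
        yBuf.position(row * yPlane.getRowStride());
        yBuf.get(nv21, pos, w);
        pos += w;
    }
    // Interleave the chroma planes as VU (NV21), honoring pixel and row strides.
    Image.Plane uPlane = image.getPlanes()[1];
    Image.Plane vPlane = image.getPlanes()[2];
    ByteBuffer uBuf = uPlane.getBuffer();
    ByteBuffer vBuf = vPlane.getBuffer();
    for (int row = 0; row < h / 2; row++) {
        for (int col = 0; col < w / 2; col++) {
            nv21[pos++] = vBuf.get(row * vPlane.getRowStride() + col * vPlane.getPixelStride());
            nv21[pos++] = uBuf.get(row * uPlane.getRowStride() + col * uPlane.getPixelStride());
        }
    }
    YuvImage yuv = new YuvImage(nv21, ImageFormat.NV21, w, h, null);
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    yuv.compressToJpeg(new Rect(0, 0, w, h), quality, out);
    return out.toByteArray();
}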
I'm working with the new Camera2 API on a Samsung S5. The supported hardware level this device reports is LEGACY, which is fine.
However, I cannot seem to be able to auto-focus on this device. The request to trigger auto-focus looks like this:
previewRequestBuilder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_AUTO);
previewRequestBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER, CaptureRequest.CONTROL_AF_TRIGGER_START);
state = STATE_PREVIEW;
try {
    captureSession.setRepeatingRequest(previewRequestBuilder.build(), captureCallback, backgroundHandler);
} catch (CameraAccessException e) {
    e.printStackTrace();
}
After the request is sent, the result of the request is always CONTROL_AF_STATE_ACTIVE_SCAN and occasionally CONTROL_AF_STATE_NOT_FOCUSED_LOCKED.
The strange thing is that, when the state is CONTROL_AF_STATE_NOT_FOCUSED_LOCKED, the auto-focus goes back into the CONTROL_AF_STATE_ACTIVE_SCAN state for a while and then back to CONTROL_AF_STATE_NOT_FOCUSED_LOCKED, resulting in an infinite focus loop. According to the docs, when the state is CONTROL_AF_STATE_NOT_FOCUSED_LOCKED...
The lens will remain stationary until the AF mode (android.control.afMode) is changed or a new AF trigger is sent to the camera device (android.control.afTrigger).
I'm wondering if this discrepancy is because the hardware level is LEGACY, and whether I should go back to using the deprecated Camera API, but that seems crazy for such a prevalent feature as auto-focus.
Are there any recommendations on how to treat devices that report LEGACY?
I branched from Google's Camera2Basic example and changed it to use CaptureRequest.CONTROL_AF_MODE_AUTO instead of CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE.
You can take the project from git and test it - https://github.com/pinhassi/android-Camera2Basic
Or just add this to Camera2BasicFragment:
private static final long LOCK_FOCUS_DELAY_ON_FOCUSED = 5000;
private static final long LOCK_FOCUS_DELAY_ON_UNFOCUSED = 1000;

private Integer mLastAfState = null;
private Handler mUiHandler = new Handler(); // UI handler
private Runnable mLockAutoFocusRunnable = new Runnable() {
    @Override
    public void run() {
        lockAutoFocus();
    }
};

public void lockAutoFocus() {
    try {
        // This is how to tell the camera to lock focus.
        mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER, CameraMetadata.CONTROL_AF_TRIGGER_START);
        CaptureRequest captureRequest = mPreviewRequestBuilder.build();
        mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER, null); // prevent CONTROL_AF_TRIGGER_START from firing over and over again
        mCaptureSession.capture(captureRequest, mCaptureCallback, mBackgroundHandler);
    } catch (CameraAccessException e) {
        e.printStackTrace();
    }
}
/**
 * @return the camera's minimum focus distance, or 0 if it is unavailable
 */
private float getMinimumFocusDistance() {
    if (mCameraId == null)
        return 0;
    Float minimumLens = null;
    try {
        CameraManager manager = (CameraManager) getActivity().getSystemService(Context.CAMERA_SERVICE);
        CameraCharacteristics c = manager.getCameraCharacteristics(mCameraId);
        minimumLens = c.get(CameraCharacteristics.LENS_INFO_MINIMUM_FOCUS_DISTANCE);
    } catch (Exception e) {
        Log.e(TAG, "isHardwareLevelSupported Error", e);
    }
    if (minimumLens != null)
        return minimumLens;
    return 0;
}
/**
 * @return true if auto-focus is supported on this device
 */
private boolean isAutoFocusSupported() {
    return isHardwareLevelSupported(CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL_LEGACY)
            || getMinimumFocusDistance() > 0;
}
// Returns true if the device supports the required hardware level, or better.
@TargetApi(Build.VERSION_CODES.LOLLIPOP)
private boolean isHardwareLevelSupported(int requiredLevel) {
    boolean res = false;
    if (mCameraId == null)
        return res;
    try {
        CameraManager manager = (CameraManager) getActivity().getSystemService(Context.CAMERA_SERVICE);
        CameraCharacteristics cameraCharacteristics = manager.getCameraCharacteristics(mCameraId);
        int deviceLevel = cameraCharacteristics.get(CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL);
        switch (deviceLevel) {
            case CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL_3:
                Log.d(TAG, "Camera support level: INFO_SUPPORTED_HARDWARE_LEVEL_3");
                break;
            case CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL_FULL:
                Log.d(TAG, "Camera support level: INFO_SUPPORTED_HARDWARE_LEVEL_FULL");
                break;
            case CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL_LEGACY:
                Log.d(TAG, "Camera support level: INFO_SUPPORTED_HARDWARE_LEVEL_LEGACY");
                break;
            case CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL_LIMITED:
                Log.d(TAG, "Camera support level: INFO_SUPPORTED_HARDWARE_LEVEL_LIMITED");
                break;
            default:
                Log.d(TAG, "Unknown INFO_SUPPORTED_HARDWARE_LEVEL: " + deviceLevel);
                break;
        }
        if (deviceLevel == CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL_LEGACY) {
            res = requiredLevel == deviceLevel;
        } else {
            // deviceLevel is not LEGACY, so we can compare levels numerically
            res = requiredLevel <= deviceLevel;
        }
    } catch (Exception e) {
        Log.e(TAG, "isHardwareLevelSupported Error", e);
    }
    return res;
}
Then, add this to the STATE_PREVIEW block:
case STATE_PREVIEW: {
    // We have nothing to do when the camera preview is working normally.
    // TODO: handle auto focus
    Integer afState = result.get(CaptureResult.CONTROL_AF_STATE);
    if (afState != null && !afState.equals(mLastAfState)) {
        switch (afState) {
            case CaptureResult.CONTROL_AF_STATE_INACTIVE:
                Log.d(TAG, "CaptureResult.CONTROL_AF_STATE_INACTIVE");
                lockAutoFocus();
                break;
            case CaptureResult.CONTROL_AF_STATE_ACTIVE_SCAN:
                Log.d(TAG, "CaptureResult.CONTROL_AF_STATE_ACTIVE_SCAN");
                break;
            case CaptureResult.CONTROL_AF_STATE_FOCUSED_LOCKED:
                Log.d(TAG, "CaptureResult.CONTROL_AF_STATE_FOCUSED_LOCKED");
                mUiHandler.removeCallbacks(mLockAutoFocusRunnable);
                mUiHandler.postDelayed(mLockAutoFocusRunnable, LOCK_FOCUS_DELAY_ON_FOCUSED);
                break;
            case CaptureResult.CONTROL_AF_STATE_NOT_FOCUSED_LOCKED:
                mUiHandler.removeCallbacks(mLockAutoFocusRunnable);
                mUiHandler.postDelayed(mLockAutoFocusRunnable, LOCK_FOCUS_DELAY_ON_UNFOCUSED);
                Log.d(TAG, "CaptureResult.CONTROL_AF_STATE_NOT_FOCUSED_LOCKED");
                break;
            case CaptureResult.CONTROL_AF_STATE_PASSIVE_UNFOCUSED:
                mUiHandler.removeCallbacks(mLockAutoFocusRunnable);
                //mUiHandler.postDelayed(mLockAutoFocusRunnable, LOCK_FOCUS_DELAY_ON_UNFOCUSED);
                Log.d(TAG, "CaptureResult.CONTROL_AF_STATE_PASSIVE_UNFOCUSED");
                break;
            case CaptureResult.CONTROL_AF_STATE_PASSIVE_SCAN:
                Log.d(TAG, "CaptureResult.CONTROL_AF_STATE_PASSIVE_SCAN");
                break;
            case CaptureResult.CONTROL_AF_STATE_PASSIVE_FOCUSED:
                mUiHandler.removeCallbacks(mLockAutoFocusRunnable);
                //mUiHandler.postDelayed(mLockAutoFocusRunnable, LOCK_FOCUS_DELAY_ON_FOCUSED);
                Log.d(TAG, "CaptureResult.CONTROL_AF_STATE_PASSIVE_FOCUSED");
                break;
        }
    }
    mLastAfState = afState;
    break;
}
And replace all occurrences of:
mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_MODE,
        CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
With:
if (isAutoFocusSupported())
    mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_MODE,
            CaptureRequest.CONTROL_AF_MODE_AUTO);
else
    mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_MODE,
            CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
I think the issue is with your setRepeatingRequest: CONTROL_AF_TRIGGER_START should be sent only once, but setRepeatingRequest resends it with every frame, restarting the autofocus scan each time. Try using capture for the trigger instead:
previewRequestBuilder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_AUTO);
previewRequestBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER, CaptureRequest.CONTROL_AF_TRIGGER_START);
state = STATE_PREVIEW;
try {
    mCaptureSession.capture(mPreviewRequestBuilder.build(), mPreCaptureCallback, mBackgroundHandler);
} catch (Exception e) {
    e.printStackTrace();
}
I experience the same issue with a Galaxy Note 4 running Android 5.1.1 - while the same code works fine on a variety of other Android devices. There have been reports of similar issues with Galaxy-S4/S5/S6.
http://developer.samsung.com/forum/board/thread/view.do?boardName=SDK&messageId=289824&startId=zzzzz~
https://www.youtube.com/watch?v=lnMoYZwVaFM
So to answer your question: this is most likely a bug in Samsung's Camera2 implementation, which unfortunately seems to be of very low quality.
The Samsung S5 with autofocus returned INFO_SUPPORTED_HARDWARE_LEVEL_LEGACY, which means it only supports the Camera2 API in legacy compatibility mode.
I use the filter below for choosing the camera API in my application.
if (Build.VERSION.SDK_INT >= 21 && isDeviceCompatibleOfCamera2()) {
    // Use camera2
} else {
    // Use old camera
}

@TargetApi(Build.VERSION_CODES.LOLLIPOP)
public boolean isDeviceCompatibleOfCamera2() {
    try {
        CameraManager manager = (CameraManager) getActivity().getSystemService(Context.CAMERA_SERVICE);
        String backCameraId = manager.getCameraIdList()[0];
        CameraCharacteristics backCameraInfo = manager.getCameraCharacteristics(backCameraId);
        int level = backCameraInfo.get(CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL);
        return level == CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL_FULL;
    } catch (CameraAccessException e) {
        ETLog.d(TAG, "Device not compatible of camera2 api" + e);
    }
    return false;
}
I am using the Android Camera2 API to capture and process an image. The image processing happens in response to a successful capture. The problem I am having is that the camera captures 2 images, and I have been unable to change the code in the Camera2Basic sample to ensure only a single image is captured. The issue can be demonstrated by adding logging code to the ImageSaver.run() method in Camera2BasicFragment.java:
public void run() {
    Log.d("Camera2", "Saving image");
    ByteBuffer buffer = mImage.getPlanes()[0].getBuffer();
    ...
}
Edit - after further investigation, the issue appears to be in the implementation of the sample rather than anything fundamental to the API. In the sample, the following code tracks the state changes of the camera:
private void process(CaptureResult result) {
    switch (mState) {
        case STATE_PREVIEW: {
            // We have nothing to do when the camera preview is working normally.
            break;
        }
        case STATE_WAITING_LOCK: {
            int afState = result.get(CaptureResult.CONTROL_AF_STATE);
            if (CaptureResult.CONTROL_AF_STATE_FOCUSED_LOCKED == afState ||
                    CaptureResult.CONTROL_AF_STATE_NOT_FOCUSED_LOCKED == afState) {
                // CONTROL_AE_STATE can be null on some devices
                Integer aeState = result.get(CaptureResult.CONTROL_AE_STATE);
                if (aeState == null ||
                        aeState == CaptureResult.CONTROL_AE_STATE_CONVERGED) {
                    mState = STATE_WAITING_NON_PRECAPTURE;
                    /** CAPTURE 1 */
                    captureStillPicture();
                } else {
                    runPrecaptureSequence();
                }
            }
            break;
        }
        case STATE_WAITING_PRECAPTURE: {
            ...
            break;
        }
        case STATE_WAITING_NON_PRECAPTURE: {
            // CONTROL_AE_STATE can be null on some devices
            Integer aeState = result.get(CaptureResult.CONTROL_AE_STATE);
            if (aeState == null || aeState != CaptureResult.CONTROL_AE_STATE_PRECAPTURE) {
                mState = STATE_PICTURE_TAKEN;
                /** CAPTURE 2 **/
                captureStillPicture();
            }
            break;
        }
    }
}
I have verified that both invocations of captureStillPicture result in an image being generated, and hence processed. I'm not entirely sure what the correct state transitions should be.
I am using a Motorola Nexus 6
There does indeed seem to be a redundancy/error in the sample code. The simple solution is to change the line just before you've written /** CAPTURE 1 */ to
mState = STATE_PICTURE_TAKEN;
This case is only entered when all the settings are already right/converged, and so might as well initiate the still capture, as it does. It just wasn't recording that it did so.
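That is, the STATE_WAITING_LOCK case becomes (a sketch of the one-line fix in context):

case STATE_WAITING_LOCK: {
    int afState = result.get(CaptureResult.CONTROL_AF_STATE);
    if (CaptureResult.CONTROL_AF_STATE_FOCUSED_LOCKED == afState ||
            CaptureResult.CONTROL_AF_STATE_NOT_FOCUSED_LOCKED == afState) {
        // CONTROL_AE_STATE can be null on some devices
        Integer aeState = result.get(CaptureResult.CONTROL_AE_STATE);
        if (aeState == null ||
                aeState == CaptureResult.CONTROL_AE_STATE_CONVERGED) {
            mState = STATE_PICTURE_TAKEN; // was STATE_WAITING_NON_PRECAPTURE
            captureStillPicture();
        } else {
            runPrecaptureSequence();
        }
    }
    break;
}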
Since your question I started reading and digging into it, and I believe that's the way it works. As the doc (link) says:
Each request will produce one CaptureResult and produce new frames for
one or more target Surfaces
Note that it says "frames", in the plural.
That said, I believe you should only consider the last frame from inside the onCaptureCompleted callback.
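A sketch of that idea on the ImageReader side (acquireLatestImage() returns the newest frame and discards older queued ones):

@Override
public void onImageAvailable(ImageReader reader) {
    Image image = reader.acquireLatestImage(); // newest frame only; older ones are dropped
    if (image == null) {
        return;
    }
    try {
        // process only this, the most recent, frame
    } finally {
        image.close();
    }
}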
It is because of the implementation of the sample. To solve the problem, just call the unlockFocus() method at the end of the captureStillPicture() method:
mPreviewSession.stopRepeating();
mPreviewSession.capture(captureBuilder.build(), mCaptureCallback, null);
unlockFocus();
And just in case, here is the unlock method:
private void unlockFocus() {
    try {
        // Reset the autofocus trigger
        mPreviewBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER, CameraMetadata.CONTROL_AF_TRIGGER_CANCEL);
        mPreviewBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON_AUTO_FLASH);
        // After this, the camera will go back to the normal state of preview.
        mState = STATE_PREVIEW;
        mPreviewSession.setRepeatingRequest(mPreviewBuilder.build(), mCaptureCallback, backgroundHandler);
    } catch (CameraAccessException e) {
        e.printStackTrace();
    }
}