Android L - Take flash image with autofocus using Camera2 api - android

Following the camera2basic guide on the Android L preview page, I am able to capture normal images, i.e. without flash, relying on passive auto-focus.
However, I would like to take a flash image. The documentation states that before taking a flash image, I should trigger android.control.aePrecaptureTrigger so the camera can determine the correct exposure.
My question:
How can I call AE Precapture trigger, wait for it to complete and then take the image using capture(CaptureRequest, CameraCaptureSession.CaptureListener, Handler)?
What I've already tried:
After the user clicks the capture button, I start a preview.
I set CONTROL_AE_PRECAPTURE_TRIGGER to CONTROL_AE_PRECAPTURE_TRIGGER_START.
I monitor the AE_STATE result in the CaptureListener's onCaptureCompleted method.
When AE_STATE converges, I set the AE lock and take the image using the capture() method.
However, the flash image is still over-exposed, and sometimes I get a completely garbled image.
Has anyone been able to get this working?
Once this is working, auto-focus mechanism can be used in similar fashion.
Thanks

Thanks for trying out the new camera2 API!
You shouldn't need to lock AE; once you see AE_STATE as CONVERGED (or FLASH_REQUIRED), submit the still capture request.
Things to verify:
Is your AE_MODE either ON_AUTO_FLASH or ON_ALWAYS_FLASH for both the preview and the still capture requests? If not, the metering routines won't be controlling flash power or firing correctly. The still capture and preview templates may just have AE mode set to ON, which means the flash won't be fired under AE control.
Are you using CAPTURE_INTENT_STILL_PICTURE for the still capture? If not, the flash won't be fired by the automatics. This is automatically set for TEMPLATE_STILL_CAPTURE.
If you're seeing garbage images, please feel free to file a bug on our Android AOSP tracker:
http://b.android.com
Detailing the set of outputs you have for your session would be especially helpful, since we know there are some current bugs for certain output Surface sets.
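Putting this answer together, a minimal sketch of the sequence (trigger precapture, wait for AE_STATE to reach CONVERGED or FLASH_REQUIRED, then take the still) might look like the following. Names such as mSession, mPreviewRequestBuilder, mBackgroundHandler, and captureStillPicture() are illustrative placeholders, not from the original code:

```java
// Sketch, assuming the usual camera2basic-style fields. Both the preview
// and still-capture requests should use CONTROL_AE_MODE_ON_AUTO_FLASH so
// the AE routine controls the flash.
void runPrecaptureThenCapture() throws CameraAccessException {
    mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_MODE,
            CaptureRequest.CONTROL_AE_MODE_ON_AUTO_FLASH);
    mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER,
            CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER_START);
    mSession.capture(mPreviewRequestBuilder.build(),
            new CameraCaptureSession.CaptureCallback() {
                @Override
                public void onCaptureCompleted(CameraCaptureSession session,
                        CaptureRequest request, TotalCaptureResult result) {
                    Integer aeState = result.get(CaptureResult.CONTROL_AE_STATE);
                    // No AE lock needed: capture once AE has settled.
                    if (aeState == null
                            || aeState == CaptureResult.CONTROL_AE_STATE_CONVERGED
                            || aeState == CaptureResult.CONTROL_AE_STATE_FLASH_REQUIRED) {
                        captureStillPicture(); // built from TEMPLATE_STILL_CAPTURE
                    }
                }
            }, mBackgroundHandler);
}
```

In real code the AE state usually passes through PRECAPTURE first, so you would keep watching the repeating preview request's callback until the state settles rather than relying on the trigger request's single result; this is the state machine camera2basic implements.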

I am not sure whether you already got an answer; I figured it out as follows.
First, for the capture builder I set:
captureBuilder.set(CaptureRequest.CONTROL_MODE,
        CameraMetadata.CONTROL_MODE_AUTO);
captureBuilder.set(CaptureRequest.FLASH_MODE,
        CameraMetadata.FLASH_MODE_TORCH);
I set both because I thought the flash could fire while in auto mode, but the result was that I could not get a flash image on capture.
Now I do get a flash image after switching on a boolean flash on/off flag:
if (isFlashOn) {
    captureBuilder.set(CaptureRequest.FLASH_MODE,
            CameraMetadata.FLASH_MODE_SINGLE);
} else {
    captureBuilder.set(CaptureRequest.CONTROL_MODE,
            CameraMetadata.CONTROL_MODE_AUTO);
}

Related

Android 10 (API 29): how to enable HDR and Night Mode

I'm building a camera app designed to work exclusively on the Pixel 3 XL. I'm using the camera2 API and would like to take a picture with the front-facing camera with HDR and/or Night Mode enabled. This is a code snippet where I set up my capture request:
final CaptureRequest.Builder captureBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
captureBuilder.addTarget(mStillImageReader.getSurface());
captureBuilder.set(CaptureRequest.CONTROL_SCENE_MODE, CameraMetadata.CONTROL_SCENE_MODE_HDR);
captureBuilder.set(CaptureRequest.CONTROL_AWB_MODE, CameraMetadata.CONTROL_AWB_MODE_AUTO);
captureBuilder.set(CaptureRequest.CONTROL_AE_MODE, CameraMetadata.CONTROL_AE_MODE_ON);
captureBuilder.set(CaptureRequest.CONTROL_AF_MODE, CameraMetadata.CONTROL_AF_MODE_AUTO);
captureBuilder.set(CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_SCENE_MODE_NIGHT);
captureBuilder.set(CaptureRequest.JPEG_ORIENTATION, 0);
...
mCaptureSession.capture(captureBuilder.build(), CaptureCallback, mBackgroundHandler);
I was hoping to get something close to what Android's native camera app is doing when it's set to shoot in Night Mode or HDR+. Does anybody know if I need to do more than just setting flags in capture request to get the desired behavior?
In Android, image-processing algorithms can be implemented at the HAL level as well as at the application level. The Android framework sits between the HAL and the application and provides an interface for interacting with the camera. When you call the set method:
captureBuilder.set(
        CaptureRequest.CONTROL_SCENE_MODE,
        CameraMetadata.CONTROL_SCENE_MODE_HDR);
you request the HAL to perform HDR before returning the image to the application. Note that the HAL is implemented by OEMs or vendors (Pixel can be considered an OEM), and it's up to the implementer to support the different CONTROL_SCENE_MODE values. You can and should query the available scene modes using:
// Get the camera characteristics for the current camera
CameraCharacteristics cameraCharacteristics = getCameraCharacteristics(cameraFacing);
int[] modes = cameraCharacteristics.get(
        CameraCharacteristics.CONTROL_AVAILABLE_SCENE_MODES);
// Check whether the modes array contains the scene mode you wish to apply
boolean hdrSupported = false;
for (int mode : modes) {
    if (mode == CameraMetadata.CONTROL_SCENE_MODE_HDR) {
        hdrSupported = true;
        break;
    }
}
If you run this on Pixel 3 XL, you may not get HDR supported. Correct me if I am wrong.
If you do not wish to write your own HDR and Night Mode algorithms, you will have to rely on the available scene modes, so it is best to query and validate first. Alternatively, you can request YUV or RAW images from camera2 and run them through HDR or Night Mode algorithms at the application level.
TL;DR:
The Pixel's native camera app is most likely doing additional image processing on top of the image returned by camera2, so its output cannot be replicated by setting camera2 configuration flags alone.
HDR+ / HDR does not seem to be supported on Google's own Pixel phones, even with CameraX (and of course not with Camera2).
Maybe worth trying CameraX with a third-party extension, like this.
The problem is, CameraX is in a (very) alpha state as of now, and its fate is not clear either.

Changing android camera params after startPreview()

Similar to this question, I'm trying to change the Android camera's parameters after calling startPreview(). I'm processing video input frame by frame from the preview and need to adjust exposure settings in real time. lock() doesn't seem to help, and stopping the preview is too slow to work at frame rate. However, just as the linked question points out, once startPreview() is called, none of the parameter changes seem to reach the camera.
In case it's relevant, I'm doing this on Google Glass... but that shouldn't make a difference.
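For reference, the pattern the question is attempting with the legacy Camera API looks like the following; mCamera and desiredEv are placeholders. Whether a mid-preview change actually takes effect is driver-dependent, which matches what the question observes:

```java
// Legacy android.hardware.Camera pattern: read the current parameters,
// adjust exposure compensation, and write them back while the preview
// keeps running (no stopPreview() should be required for this call).
Camera.Parameters params = mCamera.getParameters();
int min = params.getMinExposureCompensation();
int max = params.getMaxExposureCompensation();
if (min != 0 || max != 0) { // both zero means exposure compensation is unsupported
    int clamped = Math.min(max, Math.max(min, desiredEv));
    params.setExposureCompensation(clamped);
    mCamera.setParameters(params);
}
```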

Switching camera to front face causes to failure taking a picture in camera2

I am trying to change the camera2 basic example so the camera starts showing a preview with FACING_FRONT, but in that case taking a snapshot stops working.
Here is a link to the example:
https://github.com/pinguo-yuyidong/Camera2/blob/master/camera2/src/main/java/us/yydcdut/camera2
I am sure that besides the cameraId change I need a lot more changes, but I couldn't find what else.
Many front-facing cameras are fixed-focus, so autofocus (AF) state remains INACTIVE, and the AF trigger does nothing.
You need to check if the camera actually supports focusing, and if not, don't use the AF trigger or wait for AF state to change.
To check if the camera supports focusing, look at http://developer.android.com/reference/android/hardware/camera2/CameraCharacteristics.html#LENS_INFO_MINIMUM_FOCUS_DISTANCE
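That check can be wrapped in a small helper; a minimum focus distance of 0.0 (or a missing value) indicates a fixed-focus lens. FocusCheck is a hypothetical name for illustration:

```java
class FocusCheck {
    // LENS_INFO_MINIMUM_FOCUS_DISTANCE of 0.0 diopters (or a null value
    // when the key is absent) indicates a fixed-focus lens, so the AF
    // trigger sequence should be skipped.
    static boolean isFixedFocus(Float minFocusDistance) {
        return minFocusDistance == null || minFocusDistance == 0f;
    }
}

// Usage on the Android side (shown for context):
// Float minFocus = characteristics.get(
//         CameraCharacteristics.LENS_INFO_MINIMUM_FOCUS_DISTANCE);
// if (FocusCheck.isFixedFocus(minFocus)) { /* capture directly, no AF trigger */ }
```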

Why is there a preview of the taken image shown after I take a picture with the camera on Android?

When I make a call to mCamera.takePicture(null, null, null, null); (for simplicity I have omitted the callbacks), the preview freezes and shows the scene that was just captured. Why is that the case? Can I somehow control this behaviour? And what actually happens: is there a new view that gets attached, or does my camera preview simply stop?
What does actually happen? Is there a new view that gets attached or does my camera preview simply stop?
No new view gets attached by default. The preview just stops. The documentation for Camera states this clearly:
Preview will be stopped after the image is taken; callers must call startPreview() again if they want to re-start preview or take more pictures.
It also goes on to say:
After calling this method, you must not call startPreview() or take another picture until the JPEG callback has returned.
So, the best place to call startPreview() again would be the JPEG callback. Any time before that, the camera hardware is still processing the previous image, and wouldn't be able to give you a preview. That's the main reason that it "freezes"; the camera hardware is just busy.
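The restart-in-the-JPEG-callback pattern described here can be sketched as follows, with savePicture() standing in for the app's own image handling:

```java
// Restart the preview from the JPEG callback, as the documentation
// recommends; by the time this callback runs, the camera hardware is
// guaranteed to be done with the previous shot.
Camera.PictureCallback jpegCallback = new Camera.PictureCallback() {
    @Override
    public void onPictureTaken(byte[] data, Camera camera) {
        savePicture(data);     // placeholder for saving/processing the JPEG
        camera.startPreview(); // un-freezes the preview for the next shot
    }
};
mCamera.takePicture(null, null, jpegCallback);
```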
It's also a visual cue to the user that:
a picture was taken
the picture looks like "this"
That's icing on the cake, but even if you didn't care about that, it would still do it.
Can I somehow control this behaviour?
Through the publicly exposed API? Definitely not. You can restart the preview once the camera is done processing (as above), but you can't prevent it from freeze-framing when you call takePicture().
Whether it's possible by going deeper into the camera firmware, I can't really say. However, since there are roughly a bazillion different cameras used in Android devices, that would likely be an exercise in futility unless you were working on one specific device.
Even with one specific device, I can't see how you'd overcome it altogether. At a bare minimum, the camera will be busy processing the image for some amount of time. Even high-end DSLR cameras that I've seen freeze the preview at least for the duration of the exposure.
After calling takePicture() you can hide the preview surface under another view (e.g. an ImageView). If you use OpenGL to render the preview texture instead of a SurfaceView, you have even more tricks up your sleeve.

android camera preview screen intensity

I am using a custom camera application with preview, based on the SDK example, on a Nexus S (2.3).
Everything works fine, including taking sharp pictures, but the preview does not seem to adjust its level (intensity) the way the built-in camera does:
when I preview dark objects, the built-in camera compensates and increases the intensity, while the custom preview stays at the default intensity, making the preview pretty dark. The captured images turn out at the correct intensity.
It is not related to white balance, nor to camera exposure.
I do not want a full preview processing chain - just to enable automatic luminance level control. Is that possible using the standard API?
Thanks
If you use autofocus on the preview, I believe you will get the results you expect.
See Camera.autoFocus and Camera.AutoFocusCallback.
I believe you get one autofocus pass per call; that is, it's not continuous. You can either call it repeatedly using handlers, or simply put a touch listener on your SurfaceView and autofocus when the user taps the preview.
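The tap-to-focus variant suggested here might look like the sketch below; mSurfaceView and mCamera are placeholder fields:

```java
// Trigger a single autofocus pass whenever the user taps the preview
// surface, using the legacy android.hardware.Camera API.
mSurfaceView.setOnTouchListener(new View.OnTouchListener() {
    @Override
    public boolean onTouch(View v, MotionEvent event) {
        if (event.getAction() == MotionEvent.ACTION_DOWN) {
            mCamera.autoFocus(new Camera.AutoFocusCallback() {
                @Override
                public void onAutoFocus(boolean success, Camera camera) {
                    // One-shot focus pass finished; nothing else to do here.
                }
            });
        }
        return true;
    }
});
```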

Categories

Resources