My Android application captures preview frames, and the frames must not be blurred. That requires limiting the sensor's exposure time: for example, I want the exposure time to stay below 10 ms, with brightness compensated via ISO only. The only solution I have found is to fix SENSOR_EXPOSURE_TIME and SENSOR_SENSITIVITY manually:
@Override
public void onCaptureCompleted(CameraCaptureSession session, CaptureRequest request, TotalCaptureResult result)
{
    // record the exposure time and ISO chosen by auto-exposure
    mExposure = result.get(CaptureResult.SENSOR_EXPOSURE_TIME);
    mSensitivity = result.get(CaptureResult.SENSOR_SENSITIVITY);
    ...
}
void prepareCapturing()
{
    // cap exposure at 10 ms; raise ISO proportionally to keep brightness constant
    final double maxExposureNs = 10.0 * 1e+6; // 10 ms in nanoseconds
    if (mExposure > maxExposureNs)
    {
        double sens = mExposure * mSensitivity / maxExposureNs;
        mPreviewRequestBuilder.set(CaptureRequest.SENSOR_EXPOSURE_TIME, (long) maxExposureNs);
        mPreviewRequestBuilder.set(CaptureRequest.SENSOR_SENSITIVITY, (int) sens);
    }
    mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_OFF);
    setRepeatingRequest();
}
I call prepareCapturing before running my preview-frame analysis algorithm.
This approach works, but it requires disabling android.control.aeMode, and on my device the white balance then stops working as well.
I also tried the standard scene modes CONTROL_SCENE_MODE_ACTION and CONTROL_SCENE_MODE_SPORTS, but the exposure time still ends up around 40 ms.
The question: is it possible, using the camera2 interface, to limit the sensor's exposure time while keeping auto-white-balance active?
The primary API for asking auto-exposure to stay below some maximum exposure time is the CaptureRequest.CONTROL_AE_TARGET_FPS_RANGE control. For example, if you set it to (30, 30), the camera device may not use exposure times longer than 1/30 of a second.
The list of ranges available on a given device is provided by CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES.
There's no direct control for setting max/min exposure values by themselves, and if no available target FPS range limits exposure times enough, your only other option is to use manual exposure control (if supported).
Whether that disables AWB is device-dependent, unfortunately - on some devices, the output of the auto-exposure routine is essential for the white balance algorithm to work.
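Since AE's maximum exposure time is roughly 1 / (lower FPS bound), you can pick, from the available ranges, the one whose lower bound caps exposure most tightly. A minimal plain-Java sketch (the FpsRangePicker class and the int-pair representation of ranges are illustrative, not part of the camera2 API):

```java
// Sketch: choose the AE target FPS range that most tightly caps exposure time.
// Ranges are {min, max} pairs, conceptually like the Range<Integer>[] returned
// by CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES; the int[][] form is illustrative.
public class FpsRangePicker {
    /** Returns the range with the highest lower bound (shortest max exposure). */
    public static int[] pickTightestRange(int[][] availableRanges) {
        int[] best = availableRanges[0];
        for (int[] r : availableRanges) {
            if (r[0] > best[0]) {
                best = r;
            }
        }
        return best;
    }

    /** Maximum exposure time in nanoseconds implied by a range's lower FPS bound. */
    public static long maxExposureNs(int[] fpsRange) {
        return 1_000_000_000L / fpsRange[0];
    }

    public static void main(String[] args) {
        int[][] ranges = {{15, 30}, {24, 24}, {30, 30}};
        int[] tightest = pickTightestRange(ranges);
        System.out.println(tightest[0] + "-" + tightest[1]
                + " caps exposure at " + maxExposureNs(tightest) + " ns"); // 30-30 caps exposure at 33333333 ns
    }
}
```

On a real device you would read the ranges from CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES and set the chosen one via CONTROL_AE_TARGET_FPS_RANGE.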
You can query the lower and upper exposure times supported by your device using the key SENSOR_INFO_EXPOSURE_TIME_RANGE.
Check out the following method:
/**
 * Get the exposure time range.
 *
 * @param cameraCharacteristics The properties of the camera.
 * @return The supported exposure time range, in nanoseconds.
 * @see <a href="https://developer.android.com/reference/android/hardware/camera2/CameraCharacteristics.html#SENSOR_INFO_EXPOSURE_TIME_RANGE">
 *      CameraCharacteristics.SENSOR_INFO_EXPOSURE_TIME_RANGE</a>
 */
private static Range<Long> getExposureTimeRange(CameraCharacteristics cameraCharacteristics) {
    return cameraCharacteristics.get(CameraCharacteristics.SENSOR_INFO_EXPOSURE_TIME_RANGE);
}
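With that range in hand, a requested exposure time can be clamped into the supported bounds before being written to SENSOR_EXPOSURE_TIME. A minimal plain-Java sketch (clampExposureNs and the example bounds are illustrative, not part of the API):

```java
// Sketch: clamp a requested exposure time into the device-supported range,
// as reported by SENSOR_INFO_EXPOSURE_TIME_RANGE (values in nanoseconds).
public class ExposureClamp {
    public static long clampExposureNs(long requestedNs, long minNs, long maxNs) {
        return Math.max(minNs, Math.min(maxNs, requestedNs));
    }

    public static void main(String[] args) {
        long minNs = 13_231L;     // example lower bound
        long maxNs = 66_666_666L; // example upper bound (~1/15 s)
        // request 10 ms; it lies within the range, so it is kept as-is
        System.out.println(clampExposureNs(10_000_000L, minNs, maxNs)); // 10000000
    }
}
```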
Check out the Camera.Parameters class's setExposureCompensation() method (note that this belongs to the deprecated android.hardware.Camera API, not camera2):
https://developer.android.com/reference/android/hardware/Camera.Parameters.html#setExposureCompensation(int)
I used the latest Camera2Basic sample program as the starting point for my trials:
https://github.com/android/camera-samples.git
Basically I configure the CaptureRequest before calling the capture() function in takePhoto(), like this:
private fun prepareCaptureRequest(captureRequest: CaptureRequest.Builder) {
    // set all needed camera settings here
    captureRequest.set(CaptureRequest.CONTROL_MODE, CaptureRequest.CONTROL_MODE_OFF)
    captureRequest.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_OFF)
    //captureRequest.set(CaptureRequest.CONTROL_AF_TRIGGER, CaptureRequest.CONTROL_AF_TRIGGER_CANCEL)
    //captureRequest.set(CaptureRequest.CONTROL_AWB_LOCK, true)
    captureRequest.set(CaptureRequest.CONTROL_AWB_MODE, CaptureRequest.CONTROL_AWB_MODE_OFF)
    captureRequest.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_OFF)
    //captureRequest.set(CaptureRequest.CONTROL_AE_LOCK, true)
    //captureRequest.set(CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER, CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER_CANCEL)
    //captureRequest.set(CaptureRequest.NOISE_REDUCTION_MODE, CaptureRequest.NOISE_REDUCTION_MODE_FAST)

    // flash
    if (mState == CaptureState.PRECAPTURE) {
        //captureRequest.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_OFF)
        captureRequest.set(CaptureRequest.FLASH_MODE, CaptureRequest.FLASH_MODE_OFF)
    }
    if (mState == CaptureState.TAKEPICTURE) {
        //captureRequest.set(CaptureRequest.FLASH_MODE, CaptureRequest.FLASH_MODE_SINGLE)
        //captureRequest.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON_ALWAYS_FLASH)
        captureRequest.set(CaptureRequest.FLASH_MODE, CaptureRequest.FLASH_MODE_SINGLE)
    }

    val iso = 100
    captureRequest.set(CaptureRequest.SENSOR_SENSITIVITY, iso)

    val fractionOfASecond = 750L
    captureRequest.set(CaptureRequest.SENSOR_EXPOSURE_TIME, 1_000_000_000L / fractionOfASecond)
    //val exposureTime = 133333.toLong()
    //captureRequest.set(CaptureRequest.SENSOR_EXPOSURE_TIME, exposureTime)

    //val characteristics = cameraManager.getCameraCharacteristics(cameraId)
    //val configs: StreamConfigurationMap? = characteristics[CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP]
    //val frameDuration = 33333333.toLong()
    //captureRequest.set(CaptureRequest.SENSOR_FRAME_DURATION, frameDuration)

    val focusDistanceCm = 20.0f // 20 cm
    captureRequest.set(CaptureRequest.LENS_FOCUS_DISTANCE, 100.0f / focusDistanceCm)

    //captureRequest.set(CaptureRequest.COLOR_CORRECTION_MODE, CameraMetadata.COLOR_CORRECTION_MODE_FAST)
    captureRequest.set(CaptureRequest.COLOR_CORRECTION_MODE, CaptureRequest.COLOR_CORRECTION_MODE_TRANSFORM_MATRIX)

    val colorTemp = 8000.0f
    val rggb = colorTemperature(colorTemp)
    //captureRequest.set(CaptureRequest.COLOR_CORRECTION_TRANSFORM, colorTransform)
    captureRequest.set(CaptureRequest.COLOR_CORRECTION_GAINS, rggb)
}
but the picture that is returned is never the one where the flash is at its brightest. This is on a Google Pixel 2 device.
Since I take only one picture, I am also not sure how to check CaptureResult states to find the correct frame, as there is only one.
I have already looked at other solutions to similar problems here, but they were either never really solved or took the picture during the capture preview, which I don't want.
Another strange observation: on other devices the images are taken (also not always at the right moment), but the manual values I set are not reflected in the JPEG's EXIF metadata.
If needed I can put my fork on GitHub.
Long exposure time in combination with flash seems to be the basic issue: when the results are not good, the timing of your presets isn't good. You'd have to tune the exposure time's duration in relation to the flash's timing (check the EXIF of some photos for example values). You could measure the luminosity with an ImageAnalysis.Analyzer (this was removed from the sample application, but older revisions still have an example).
I've tried with the default Motorola camera app; there the photo also seems to be taken shortly after the flash, when the brightness is already decaying (in order to avoid the dazzling brightness). That corresponds to the CaptureState.PRECAPTURE stage, where you switch the flash off. Flashing in two stages is the default behavior and may yield better results.
If you want it dazzlingly bright (even if this is generally not desired), you could instead switch on the torch first, take the image, then switch the torch off again (I use something like this, but only for barcode scanning). This would at least avoid any exposure/flash timing issues.
When changed values are not represented in the EXIF, you'd need to use ExifInterface to update them (there is an example that updates the orientation, but one can update any attribute).
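The luminosity measurement mentioned above essentially averages the Y (luma) plane of an analyzed frame. A minimal plain-Java sketch of that averaging, with the Analyzer plumbing omitted (LumaMeter and averageLuma are illustrative names, not CameraX API):

```java
// Sketch: average luminance over a Y (luma) plane, as an ImageAnalysis.Analyzer
// might compute it to decide whether the flash has reached peak brightness.
public class LumaMeter {
    /** Average of unsigned byte luma values, in the range [0, 255]. */
    public static double averageLuma(byte[] yPlane) {
        long sum = 0;
        for (byte b : yPlane) {
            sum += b & 0xFF; // bytes are signed in Java; mask to unsigned
        }
        return (double) sum / yPlane.length;
    }

    public static void main(String[] args) {
        byte[] dark = {10, 12, 8, 10};
        byte[] bright = {(byte) 240, (byte) 250, (byte) 245, (byte) 241};
        System.out.println(averageLuma(dark));   // 10.0
        System.out.println(averageLuma(bright)); // 244.0
    }
}
```

In a real Analyzer you would read the bytes from the first plane's buffer of the ImageProxy and track how this average rises and decays around the flash pulse.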
I do the following:
CaptureRequest captureRequest;
captureRequestBuilder.set(CaptureRequest.FLASH_MODE, CaptureRequest.FLASH_MODE_TORCH);
captureRequest = captureRequestBuilder.build();
cameraCaptureSessions.setRepeatingRequest(captureRequest, captureCallBackListener, backgroundHandler);
...but the flash turns off before the picture is taken.
Maybe this helps:
CONTROL_AE_MODE
added in API level 21
public static final Key<Integer> CONTROL_AE_MODE
The desired mode for the camera device's auto-exposure routine.
This control is only effective if android.control.mode is AUTO.
When set to any of the ON modes, the camera device's auto-exposure routine is enabled, overriding the application's selected exposure time, sensor sensitivity, and frame duration (android.sensor.exposureTime, android.sensor.sensitivity, and android.sensor.frameDuration). If one of the FLASH modes is selected, the camera device's flash unit controls are also overridden.
The FLASH modes are only available if the camera device has a flash unit (android.flash.info.available is true).
If flash TORCH mode is desired, this field must be set to ON or OFF, and android.flash.mode set to TORCH.
When set to any of the ON modes, the values chosen by the camera device auto-exposure routine for the overridden fields for a given capture will be available in its CaptureResult.
This is from https://developer.android.com/reference/android/hardware/camera2/CaptureRequest
I am preparing a custom Android camera app and wish to adjust the exposure/brightness of the camera on a touch event. The default values look a bit darker than in the default camera app. I tried using the auto white-balance function, but it doesn't help. I was trying exposure compensation like this:
params.setExposureCompensation(params.getExposureCompensation());
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.ICE_CREAM_SANDWICH) {
    if (params.isAutoExposureLockSupported()) {
        params.setAutoExposureLock(false);
    }
}
but I don't understand the difference between exposure functions like getExposureCompensation(), getMaxExposureCompensation(), and getExposureCompensationStep().
Firstly, you are not actually changing the exposure.
params.setExposureCompensation(params.getExposureCompensation());
sets the exposure compensation to its previous value, i.e. it never changes. What you need to do is set a value between params.getMinExposureCompensation() and params.getMaxExposureCompensation().
Secondly, the difference between the exposure functions is clearly explained in the docs:
getExposureCompensation
Gets the current exposure compensation index. The range is getMinExposureCompensation() to getMaxExposureCompensation(); 0 means exposure is not adjusted.
getMaxExposureCompensation
Gets the maximum exposure compensation index (>= 0).
getExposureCompensationStep
Gets the exposure compensation step. Applications can get the EV by multiplying the exposure compensation index by the step. For example, if the exposure compensation index is -6 and the step is 0.333333333, the EV is -2.
Here EV stands for exposure value.
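The index-times-step relationship described above can be sketched in plain Java (EvCalc and computeEv are illustrative names, not Camera.Parameters methods):

```java
// Sketch: exposure value (EV) = compensation index * compensation step,
// as described in the Camera.Parameters documentation.
public class EvCalc {
    public static double computeEv(int compensationIndex, double step) {
        return compensationIndex * step;
    }

    public static void main(String[] args) {
        // the documented example: index -6 with step 0.333333333 gives EV ~ -2
        System.out.println(computeEv(-6, 0.333333333)); // ~ -2.0 (floating point)
    }
}
```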
I am new to Android and trying to figure out the new camera2 features. I have no idea how to control ISO manually in the camera preview.
Any help will be appreciated.
Thanks.
One way to determine if your device supports manual ISO control is to check if it supports the MANUAL_SENSOR capability.
If so, you can turn off auto-exposure by either disabling all automatics:
previewBuilder.set(CaptureRequest.CONTROL_MODE, CaptureRequest.CONTROL_MODE_OFF);
or by just disabling auto-exposure, leaving auto-focus and auto-white-balance running:
previewBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_OFF);
Once you've disabled AE, you can manually control exposure time, sensitivity (ISO), and frame duration:
previewBuilder.set(CaptureRequest.SENSOR_EXPOSURE_TIME, exposureTime);
previewBuilder.set(CaptureRequest.SENSOR_SENSITIVITY, sensitivity);
previewBuilder.set(CaptureRequest.SENSOR_FRAME_DURATION, frameDuration);
The valid ranges for exposure time and sensitivity can be found in SENSOR_INFO_EXPOSURE_TIME_RANGE and SENSOR_INFO_SENSITIVITY_RANGE. For frame duration, the maximum is given by SENSOR_INFO_MAX_FRAME_DURATION, and the minimum frame duration (i.e. the maximum frame rate) depends on your session's output configuration; see StreamConfigurationMap.getOutputMinFrameDuration for details.
Note that once you disable AE, you have to control all three parameters (there are defaults if you never set one, but they won't vary automatically). To start, you can copy the last-good values from the last CaptureResult received before you turned AE off.
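When seeding manual values from the last CaptureResult, you may also want to shorten the exposure while keeping brightness roughly constant, by scaling the sensitivity so that exposureTime × ISO stays the same (the same trade-off the first question's code applies). A plain-Java sketch (ExposureTradeoff and rescaleSensitivity are illustrative names):

```java
// Sketch: keep exposureTime * sensitivity roughly constant when capping
// the exposure time, so overall image brightness is preserved.
public class ExposureTradeoff {
    public static int rescaleSensitivity(long oldExposureNs, int oldIso, long newExposureNs) {
        return (int) (oldIso * oldExposureNs / newExposureNs);
    }

    public static void main(String[] args) {
        // AE chose 40 ms at ISO 100; capping exposure at 10 ms calls for ISO 400
        System.out.println(rescaleSensitivity(40_000_000L, 100, 10_000_000L)); // 400
    }
}
```

The resulting ISO should still be clamped into SENSOR_INFO_SENSITIVITY_RANGE before being set on the request.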
You have to set up the preview builder first, disabling auto-exposure:
mPreviewBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_OFF);
and then:
Range<Integer> range2 = characteristics.get(CameraCharacteristics.SENSOR_INFO_SENSITIVITY_RANGE);
int max1 = range2.getUpper(); // e.g. 10000
int min1 = range2.getLower(); // e.g. 100
int iso = ((progress * (max1 - min1)) / 100) + min1;
mPreviewBuilder.set(CaptureRequest.SENSOR_SENSITIVITY, iso);
where progress is the value from the SeekBar's onProgressChanged(SeekBar seekBar, int progress, boolean fromUser) callback.
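That progress-to-ISO mapping is just a linear interpolation, which can be sketched as a standalone function (IsoMapper and progressToIso are illustrative names):

```java
// Sketch: map a SeekBar progress value (0..100) linearly onto the sensor's
// supported sensitivity range from SENSOR_INFO_SENSITIVITY_RANGE.
public class IsoMapper {
    public static int progressToIso(int progress, int minIso, int maxIso) {
        return (progress * (maxIso - minIso)) / 100 + minIso;
    }

    public static void main(String[] args) {
        System.out.println(progressToIso(0, 100, 10000));   // 100
        System.out.println(progressToIso(50, 100, 10000));  // 5050
        System.out.println(progressToIso(100, 100, 10000)); // 10000
    }
}
```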
My goal is to synchronize a movie's frames with device-orientation data from the phone's gyroscope/accelerometer. Android provides orientation data with proper timestamps, but for a video the frame rate is only known in general. Experiments show that orientation changes measured by the accelerometer don't match changes in the scene: sometimes the video runs faster, sometimes slower, within the same recording.
Is there any way to find out how much time passed between two consecutive frames?
I'm going to answer this question myself.
First, yes, it is possible that the frame rate is not constant. Here is a quote from the Android developer documentation: "NOTE: On some devices that have auto-frame rate, this sets the maximum frame rate, not a constant frame rate."
http://developer.android.com/reference/android/media/MediaRecorder.html#setVideoFrameRate%28int%29
Second, the OpenCV VideoCapture class has a get(CV_CAP_PROP_POS_MSEC) method that returns the current frame's time position:
VideoCapture capture("test.mp4");
double curFrameTime;
double prevFrameTime = 0;
double dt;

while (capture.grab())
{
    curFrameTime = capture.get(CV_CAP_PROP_POS_MSEC); // current frame position, ms
    dt = curFrameTime - prevFrameTime;                // time difference
    prevFrameTime = curFrameTime;                     // previous frame position
}
capture.release();
If there is a better suggestion, I'll be glad to see it.
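Once per-frame times are available from CV_CAP_PROP_POS_MSEC, synchronizing them with the orientation data reduces to a nearest-timestamp lookup. A plain-Java sketch (FrameSync and nearestSampleIndex are illustrative names), assuming the sensor timestamps are sorted ascending:

```java
import java.util.Arrays;

// Sketch: for each frame timestamp, find the orientation sample whose
// timestamp is closest; assumes the sample times are sorted ascending (ms).
public class FrameSync {
    public static int nearestSampleIndex(double[] sampleTimesMs, double frameTimeMs) {
        int i = Arrays.binarySearch(sampleTimesMs, frameTimeMs);
        if (i >= 0) return i;              // exact match
        int ins = -i - 1;                  // insertion point
        if (ins == 0) return 0;
        if (ins == sampleTimesMs.length) return sampleTimesMs.length - 1;
        // pick whichever neighbor is closer
        double before = frameTimeMs - sampleTimesMs[ins - 1];
        double after = sampleTimesMs[ins] - frameTimeMs;
        return (before <= after) ? ins - 1 : ins;
    }

    public static void main(String[] args) {
        double[] gyroTimesMs = {0, 5, 10, 15, 20}; // example 200 Hz sensor samples
        System.out.println(nearestSampleIndex(gyroTimesMs, 33.3)); // 4
        System.out.println(nearestSampleIndex(gyroTimesMs, 7.0));  // 1
    }
}
```

In practice both clocks would also need a common zero point (e.g. aligning the first frame with the sample recorded at recording start), since the video and sensor timelines start independently.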