Is there any API instruction or calculation to obtain the android camera Aperture F value, and the Exposure Time?
The Camera API exposes methods to get the white balance, the focal length, and so on, but I couldn't find anything related to the aperture or the exposure time.
Thanks
In the Android developer documentation for ExifInterface, a class in the android.media package (not the now-deprecated Camera API), there are tags for both the aperture (TAG_F_NUMBER) and the exposure time (TAG_EXPOSURE_TIME).
Sample code showing how it could be used is available at ExifHelper
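As a rough sketch (plain Java; the ExifInterface calls themselves are Android-only and shown here only as comments), the tag values can be read back from a saved photo and formatted for display like this:

```java
// The Android side would look roughly like:
//   ExifInterface exif = new ExifInterface(photoPath);
//   double fNumber  = exif.getAttributeDouble(ExifInterface.TAG_F_NUMBER, 0);
//   double exposure = exif.getAttributeDouble(ExifInterface.TAG_EXPOSURE_TIME, 0);
// The helpers below only format those returned values.
public class ExifFormat {
    // Render an F-number such as 1.8 as "f/1.8".
    static String formatAperture(double fNumber) {
        return String.format(java.util.Locale.US, "f/%.1f", fNumber);
    }

    // Render an exposure time in seconds as the usual photographic fraction,
    // e.g. 0.008 -> "1/125 s"; times of 1 s or longer are shown directly.
    static String formatExposure(double seconds) {
        if (seconds >= 1.0) {
            return String.format(java.util.Locale.US, "%.0f s", seconds);
        }
        long denominator = Math.round(1.0 / seconds);
        return "1/" + denominator + " s";
    }

    public static void main(String[] args) {
        System.out.println(formatAperture(1.8));   // f/1.8
        System.out.println(formatExposure(0.008)); // 1/125 s
    }
}
```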
Related
I am trying to get the cameraId of the widest-angle lens available on the rear of the device. I can get the "logical" rear camera just fine, as documented here.
But it defaults to a physical rear camera that is not the widest one. I tried to follow the Multi-camera API documentation, but on every device I've tried, asking whether any of the cameras has CameraMetadata.REQUEST_AVAILABLE_CAPABILITIES_LOGICAL_MULTI_CAMERA always returns false.
For cameraManager.cameraIdList, I seem to always just get one front and rear camera.
cameraManager.getCameraCharacteristics(cameraId).physicalCameraIds also always returns empty.
cameraManager.getCameraCharacteristics(cameraId).availablePhysicalCameraRequestKeys also always returns empty.
What could I be doing wrong? I am targeting API 29 and running on API 30 devices. I don't want to use the deprecated camera API, but it seems like this used to be doable there. What other info would you need?
You can calculate which camera has the largest field of view (FOV) from its sensor size and focal length:
FOV angle = 2 * atan(l / 2d), where l is the sensor size and d is the focal length.
You can find the math behind this formula here:
https://stackoverflow.com/a/3261794/11861734
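As a sketch of that selection logic (plain Java with hypothetical sensor and focal-length numbers; on Android the real values would come from SENSOR_INFO_PHYSICAL_SIZE and LENS_INFO_AVAILABLE_FOCAL_LENGTHS in each camera's CameraCharacteristics):

```java
// Picks the camera with the largest horizontal FOV using
//   FOV = 2 * atan(l / (2 * d))
// where l is the sensor width and d is the focal length (same units).
public class FovMath {
    // Horizontal FOV in degrees for sensor width and focal length in mm.
    static double fovDegrees(double sensorWidthMm, double focalLengthMm) {
        return Math.toDegrees(2.0 * Math.atan(sensorWidthMm / (2.0 * focalLengthMm)));
    }

    public static void main(String[] args) {
        // Hypothetical rear cameras: {sensor width, focal length} in mm.
        double[][] cameras = { {6.4, 4.3}, {6.4, 2.2}, {6.4, 7.0} };
        int widest = 0;
        for (int i = 1; i < cameras.length; i++) {
            if (fovDegrees(cameras[i][0], cameras[i][1])
                    > fovDegrees(cameras[widest][0], cameras[widest][1])) {
                widest = i;
            }
        }
        // The shortest focal length wins, since atan is increasing.
        System.out.println("Widest camera index: " + widest);
    }
}
```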
I'm trying to get the current preview's shutter speed and ISO settings.
I cannot find a way to do this using CameraX or Camera2. Is this not something that is available?
Failing that, is there a way to get the settings that were used to take a photo?
For Camera2, this information is available in the CaptureResult objects you get for every captured image, via onCaptureCompleted. However, not all devices support listing this information; only devices that list the READ_SENSOR_SETTINGS capability will do this. That includes all devices that list hardware level FULL or better, and may include some devices at the LIMITED level.
Specifically, you want to look at SENSOR_SENSITIVITY for ISO and SENSOR_EXPOSURE_TIME for shutter speed.
If you want the values used for a JPEG capture, look at the CaptureResult that comes from the CaptureRequest you used to request the JPEG.
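As a small sketch (plain Java; the CaptureResult lookup itself is Android-only and shown only as a comment): SENSOR_EXPOSURE_TIME is reported in nanoseconds and SENSOR_SENSITIVITY as a plain ISO number, so turning the raw exposure time into the familiar "1/N s" notation is just arithmetic:

```java
// On Android, inside onCaptureCompleted, you would read:
//   long exposureNs = result.get(CaptureResult.SENSOR_EXPOSURE_TIME);
//   int  iso        = result.get(CaptureResult.SENSOR_SENSITIVITY);
public class ShutterSpeed {
    // Convert an exposure time in nanoseconds to a display string.
    static String fromNanos(long exposureNs) {
        double seconds = exposureNs / 1_000_000_000.0;
        if (seconds >= 1.0) {
            return String.format(java.util.Locale.US, "%.1f s", seconds);
        }
        return "1/" + Math.round(1.0 / seconds) + " s";
    }

    public static void main(String[] args) {
        // e.g. a CaptureResult reporting 16,666,667 ns is roughly 1/60 s
        System.out.println(fromNanos(16_666_667L));
    }
}
```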
I was also able to read (but only read) the ISO settings using the Jetpack/CameraX API, via Camera2CameraInfo.from and then getCameraCharacteristic. But there seems to be no way to set them using CameraX alone.
In the latest releases of Jetpack/CameraX it is also possible to set the ISO/shutter speed.
The example below sets the ISO for the preview (written in C#, but the API calls map directly to Java):
var builder = new Preview.Builder();
var ext1 = new Camera2Interop.Extender(builder)
.SetCaptureRequestOption(CaptureRequest.ControlAeMode, (int)ControlAEMode.Off)
.SetCaptureRequestOption(CaptureRequest.SensorSensitivity, 3200);
var preview = builder.SetTargetName("PREVIEW").Build();
preview.SetSurfaceProvider(sfcPreview.SurfaceProvider);
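For completeness, a rough Java equivalent of that C# snippet might look like the following. This is an untested sketch: Camera2Interop lives in CameraX's camera-camera2 interop artifact and is marked @ExperimentalCamera2Interop, and previewView here is an assumed PreviewView from the layout.

```java
import android.hardware.camera2.CaptureRequest;
import androidx.camera.camera2.interop.Camera2Interop;
import androidx.camera.core.Preview;
import androidx.camera.view.PreviewView;

Preview buildManualIsoPreview(PreviewView previewView) {
    Preview.Builder builder = new Preview.Builder();
    // Disable auto-exposure so the manual sensitivity takes effect.
    new Camera2Interop.Extender<>(builder)
            .setCaptureRequestOption(CaptureRequest.CONTROL_AE_MODE,
                    CaptureRequest.CONTROL_AE_MODE_OFF)
            .setCaptureRequestOption(CaptureRequest.SENSOR_SENSITIVITY, 3200);
    Preview preview = builder.build();
    preview.setSurfaceProvider(previewView.getSurfaceProvider());
    return preview;
}
```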
I am using SENSOR_INFO_EXPOSURE_TIME_RANGE to calculate the range of supported exposure time.
I tested this on a Huawei P30 Pro and the API seems to give wrong values (10000–1000000000; the range is in nanoseconds, so that is 10 µs to 1 s).
But in the built-in camera app's Pro mode I can set the exposure time to 30 s, which is far longer than the maximum the API reports.
Can anyone help me with this? How to get correct values for the range of supported exposure durations?
Camera algorithms are tuned differently across phone models and manufacturers. For the Huawei Mate 30 Pro, please use the HMS Camera Kit API to get the calculated exposure range instead of the Android Camera2 native API. An example follows; you will then see the exposure range returned as 1/4000–30 s:
mMode.getModeCharacteristics()
.getParameterRange(RequestKey.HW_PRO_SENSOR_EXPOSURE_TIME_VALUE)
For more detailed information on how to integrate the HMS Camera Kit Pro mode, please refer to the HMS Camera Engine Developer's Guide.
Please note: the HMS Camera Engine SDK is currently only available on HMS phones.
I'm working on a project whose app ultimately recognizes spoken words via face recognition and gives feedback on how good your pronunciation was.
I would like to know if there is a way to read only part of the camera sensor's data (a region of interest, ROI), so that not every pixel has to be transferred and processed, which might improve frame rates and lower the data stream.
I'm fairly new to Android app development, so I don't know whether the sensor can be addressed at this level, or whether such hardware-specific behaviour can't be controlled from software at all.
I would appreciate it if anyone could tell me whether this is possible. Researching the Android documentation hasn't given me any results so far.
Thanks in advance and regards from Germany
Generally speaking, this isn't supported.
While the camera API allows for setting a level of digital zoom (for the deprecated android.hardware.Camera API) or an explicit crop region (on the current android.hardware.camera2 API), there's no guarantee that frame rate will increase when you select a smaller region.
This is because digital zoom is expected to be variable per-frame, and image sensors can generally not reconfigure their readout region that quickly. So digital zoom / crop is implemented in the camera image processing pipeline instead, and the sensor always reads out the full frame.
Some lower-resolution output configurations may set the sensor to skipping/binning modes, which does increase the maximum frame rate by reducing the number of pixels read off the sensor, but you indicate you need full resolution on a small ROI, so this doesn't help you.
I know how to get a lux value from the light sensor using android.hardware.sensor.
I saw light meter tools on the market. The application description said it can get the lux value from the camera. How can it do that?
Also how can I set the shutter speed and the aperture?
The Android camera API doesn't provide any absolute units for the image data it captures. In addition, it does not allow manual control of exposure or aperture (although essentially all cell-phone cameras have no adjustable aperture anyway).
You can find out what exposure time used for a still capture from the JPEG EXIF, but that's about it.
Because of those limitations, you'll have a hard time getting an absolute light measurement from a captured camera image. You may be able to calibrate a given device to convert from image pixel value to true light level, but it'll be complicated since all devices run auto-exposure and auto-white-balance. Using auto-exposure and auto-white-balance locks introduced in Android 4.0 will help a bit, but there's still an unknown conversion curve between lux and a captured pixel value (not just a scale factor, it's a gamma curve).
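To illustrate the gamma-curve point: assuming the captured frame is sRGB-encoded (a common but not guaranteed assumption), undoing the transfer curve is the first step of any such calibration, and even then the result is only a relative light level, not lux:

```java
// Undoing the sRGB transfer curve: converts an 8-bit pixel value to a linear
// light value in [0, 1]. Even after this step the result is only *relative*
// scene luminance; mapping it to absolute lux would still need a per-device
// calibration against exposure time, ISO, and aperture.
public class SrgbLinear {
    static double toLinear(int pixel8bit) {
        double c = pixel8bit / 255.0;
        // Piecewise sRGB EOTF: linear segment near black, power curve above.
        return (c <= 0.04045) ? c / 12.92
                              : Math.pow((c + 0.055) / 1.055, 2.4);
    }

    public static void main(String[] args) {
        System.out.println(toLinear(255)); // 1.0
        System.out.println(toLinear(128)); // ~0.216 (mid grey is far below 0.5)
    }
}
```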
Take a look at the Camera.Parameters class.
It lists everything the camera supports; setExposureCompensation is probably the closest match.
I don't know much about photography, but my guess is that exposure compensation adjusts either the aperture or the shutter speed.
It could be that the tool you mentioned uses a bundled native library. Also have a look at how the functions in Camera.Parameters work (check the Android source code).
You can also use the ambient light sensor to get the light level in lux (if this is within the scope of your project). From the Android documentation:
Sensor.TYPE_LIGHT:
values[0]: Ambient light level in SI lux units
Android Developer: Light Sensor-Sensor Event
You can find more information about the light sensor in the documentation.
Android Developer: Sensor
Agreed, the trick is taking the picture and reading its EXIF data.
You can find more in the ExifInterface docs:
http://developer.android.com/reference/android/media/ExifInterface.html
Given the ISO and aperture values, you can then calculate the shutter speed.