I'm writing an application in which I'd like to retrieve the real-time values of various camera parameters (i.e. ISO, shutter speed, and aperture, but most importantly ISO). I am using the Android camera2 (and eventually camera1) APIs.
I've found some links on Stack Overflow about retrieving those values with camera1 (though that functionality has limited support there) and have looked through the camera2 documentation, but haven't found a way to get those values (most importantly ISO) in real time. Any instructions on how to retrieve these values in real time, or examples of doing so, would be much appreciated!
On devices that can report their sensor settings, those values are all available via CaptureResult, which is produced for each captured frame and delivered via onCaptureCompleted.
All devices that support the READ_SENSOR_SETTINGS capability will have the necessary values present, such as SENSOR_SENSITIVITY.
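For example, a minimal sketch of reading them per frame (this assumes you already have a configured capture session; session, previewRequest, and backgroundHandler are placeholders for your own objects):

CameraCaptureSession.CaptureCallback callback = new CameraCaptureSession.CaptureCallback() {
    @Override
    public void onCaptureCompleted(CameraCaptureSession session,
                                   CaptureRequest request,
                                   TotalCaptureResult result) {
        // Per-frame values the device reports for this capture.
        Integer iso = result.get(CaptureResult.SENSOR_SENSITIVITY);
        Long exposureNs = result.get(CaptureResult.SENSOR_EXPOSURE_TIME);
        Float aperture = result.get(CaptureResult.LENS_APERTURE);
        Log.d("CameraValues", "ISO=" + iso + " exposure(ns)=" + exposureNs + " aperture=f/" + aperture);
    }
};

session.setRepeatingRequest(previewRequest, callback, backgroundHandler);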
If I use the camera2 API to capture an image, I get the "final" image after image processing: after noise reduction, color correction, vendor-specific algorithms, and so on.
I should also be able to get the raw camera image, following this.
The question is: can I get the intermediate stages of the image as well? For example, say the raw image is stage 0, noise reduction is stage 1, color correction is stage 2, and so on. I would like to get all of those stages and present them to the user in an app.
In general, no. The actual hardware processing pipelines vary a great deal between different chip manufacturers and chip versions even from the same manufacturer. Plus each Android device maker then adds their own software on top of that.
And often, it's not possible to dump outputs from every step of the process, only some of them.
So making a consistent API for fetching this isn't very feasible, and the camera2 API doesn't have support for it.
You can somewhat simulate it by turning things like noise reduction entirely off (if supported by the device) and capturing multiple images, but that of course isn't as good as multiple versions of a single capture.
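A rough sketch of what that looks like, assuming characteristics and builder are placeholders for your CameraCharacteristics and CaptureRequest.Builder, and only applying the OFF modes when the device actually lists them:

// Request "raw-er" captures by turning optional processing blocks off,
// but only if the device advertises the OFF modes.
int[] nrModes = characteristics.get(CameraCharacteristics.NOISE_REDUCTION_AVAILABLE_NOISE_REDUCTION_MODES);
for (int mode : nrModes) {
    if (mode == CameraMetadata.NOISE_REDUCTION_MODE_OFF) {
        builder.set(CaptureRequest.NOISE_REDUCTION_MODE, CameraMetadata.NOISE_REDUCTION_MODE_OFF);
    }
}
int[] edgeModes = characteristics.get(CameraCharacteristics.EDGE_AVAILABLE_EDGE_MODES);
for (int mode : edgeModes) {
    if (mode == CameraMetadata.EDGE_MODE_OFF) {
        builder.set(CaptureRequest.EDGE_MODE, CameraMetadata.EDGE_MODE_OFF);
    }
}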
I'm trying to get the current preview's shutter speed and ISO settings.
I cannot find a way to do this using CameraX or Camera2. Is this not something that is available?
Failing that, is there a way to get the settings that were used to take a photo?
For Camera2, this information is available in the CaptureResult objects you get for every captured image, via onCaptureCompleted. However, not all devices support listing this information; only devices that list the READ_SENSOR_SETTINGS capability will do this. That includes all devices that list hardware level FULL or better, and may include some devices at the LIMITED level.
Specifically, you want to look at SENSOR_SENSITIVITY for ISO and SENSOR_EXPOSURE_TIME for shutter speed.
If you want the values used for a JPEG capture, look at the CaptureResult that comes from the CaptureRequest you used to request the JPEG.
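A minimal sketch of the capability check (manager and cameraId are placeholders, and getCameraCharacteristics can throw CameraAccessException):

CameraCharacteristics chars = manager.getCameraCharacteristics(cameraId);
int[] caps = chars.get(CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES);
boolean readsSensorSettings = false;
for (int c : caps) {
    if (c == CameraMetadata.REQUEST_AVAILABLE_CAPABILITIES_READ_SENSOR_SETTINGS) {
        readsSensorSettings = true;
    }
}
// If true, CaptureResult.SENSOR_SENSITIVITY and CaptureResult.SENSOR_EXPOSURE_TIME
// will be populated in onCaptureCompleted.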
I was also able to get (only read, not set) the ISO settings using the Jetpack/CameraX API, through Camera2CameraInfo.from and then getCameraCharacteristic. But there seems to be no way to set them using CameraX.
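For reference, that lookup looks roughly like this (a sketch; camera is assumed to be the androidx.camera.core.Camera returned by bindToLifecycle, Camera2CameraInfo is marked experimental, and what comes back is the supported sensitivity range rather than the live per-frame value):

Range<Integer> isoRange = Camera2CameraInfo.from(camera.getCameraInfo())
        .getCameraCharacteristic(CameraCharacteristics.SENSOR_INFO_SENSITIVITY_RANGE);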
In the latest releases of Jetpack/CameraX it is also possible to set the ISO/shutter speed.
The example below sets the ISO for the preview (written in C#, but it is similar to Java):
var builder = new Preview.Builder();

// Attach Camera2 interop options to the CameraX preview builder.
// Auto-exposure must be turned off before a manual sensitivity takes effect.
var ext1 = new Camera2Interop.Extender(builder)
    .SetCaptureRequestOption(CaptureRequest.ControlAeMode, (int)ControlAEMode.Off)
    .SetCaptureRequestOption(CaptureRequest.SensorSensitivity, 3200); // ISO 3200

var preview = builder.SetTargetName("PREVIEW").Build();
preview.SetSurfaceProvider(sfcPreview.SurfaceProvider);
I'm working with a Galaxy Note 10+ and the Android Camera2 API.
What I want to do is get depth (distance) data while I'm taking pictures (on the preview / viewfinder screen), like the Quick Measure app does; I think it's a Samsung default app.
When I tested my device against the Google Camera2Basic sample, I found out that the depth (ToF) camera isn't included in the availableCameras list, because it doesn't have CameraMetadata.REQUEST_AVAILABLE_CAPABILITIES_BACKWARD_COMPATIBLE among its characteristics.
And the documentation for that capability says:
Devices with the DEPTH_OUTPUT capability might not list this capability, indicating that they support only depth measurement, not standard color output.
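For context, here is roughly how I'm checking which capabilities each camera reports (a sketch; manager is a CameraManager and these calls can throw CameraAccessException):

for (String id : manager.getCameraIdList()) {
    CameraCharacteristics chars = manager.getCameraCharacteristics(id);
    int[] caps = chars.get(CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES);
    boolean depth = false, backwardCompatible = false;
    for (int c : caps) {
        if (c == CameraMetadata.REQUEST_AVAILABLE_CAPABILITIES_DEPTH_OUTPUT) depth = true;
        if (c == CameraMetadata.REQUEST_AVAILABLE_CAPABILITIES_BACKWARD_COMPATIBLE) backwardCompatible = true;
    }
    // The ToF camera shows depth=true but backwardCompatible=false.
    Log.d("DepthCameras", "id=" + id + " depth=" + depth + " backwardCompatible=" + backwardCompatible);
}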
I have no clue how to use this camera to get depth information.
Should I open more than one camera simultaneously?
Any help or comments would be appreciated!
I wanted to know if there is any API that provides the supported video encoding parameters of an Android device. I have gone through http://developer.android.com/guide/appendix/media-formats.html. That page lists the supported formats, but different devices may support different parameters. I want to get those parameters in order to let the user know beforehand whether their device will be able to play back the videos within the app.
For example, HD playback may not be available on every device, so if there is an API that provides that information, the user can be informed that their device does not support it. The same goes for other encoding parameters such as bitrate, frame rate, and video resolution.
You may take a look at the answer I provided here:
Detect max video resolution support for a mobile device on responsive website
It explains the solution in detail and is, unfortunately, currently the only one.
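If a purely device-local check inside the app is acceptable, one possible sketch on API 21+ is to query the framework's MediaCodecList for decoder capabilities; note that "video/avc" and 1920x1080 below are just example values, not something required by the question:

MediaCodecList list = new MediaCodecList(MediaCodecList.REGULAR_CODECS);
for (MediaCodecInfo info : list.getCodecInfos()) {
    if (info.isEncoder()) continue; // only playback (decoding) matters here
    for (String type : info.getSupportedTypes()) {
        if (!type.equalsIgnoreCase("video/avc")) continue;
        MediaCodecInfo.VideoCapabilities video =
                info.getCapabilitiesForType(type).getVideoCapabilities();
        if (video != null && video.isSizeSupported(1920, 1080)) {
            Log.d("CodecCheck", info.getName()
                    + " decodes 1080p AVC, bitrates " + video.getBitrateRange()
                    + ", frame rates " + video.getSupportedFrameRatesFor(1920, 1080));
        }
    }
}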
In API level 9, Android added the CameraInfo class, which includes information on each physical camera in the device. In particular, it includes an orientation attribute, which is "the angle that the camera image needs to be rotated clockwise so it shows correctly on the display in its natural orientation." This is distinct from the actual rotation of the device, which is found from getContext().getWindowManager().getDefaultDisplay().getRotation().
Android's sample code subtracts the rotation of the device from the orientation of the camera for rear-facing cameras (it's slightly more complicated for front-facing ones), and rotates the camera preview by this amount. This allows the preview to display properly in both portrait and landscape orientations of the screen.
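For reference, that computation looks roughly like this (paraphrased from the setCameraDisplayOrientation sample in the Camera documentation; activity, cameraId, and camera are placeholders):

Camera.CameraInfo info = new Camera.CameraInfo();
Camera.getCameraInfo(cameraId, info);

int rotation = activity.getWindowManager().getDefaultDisplay().getRotation();
int degrees = 0;
switch (rotation) {
    case Surface.ROTATION_0:   degrees = 0;   break;
    case Surface.ROTATION_90:  degrees = 90;  break;
    case Surface.ROTATION_180: degrees = 180; break;
    case Surface.ROTATION_270: degrees = 270; break;
}

int result;
if (info.facing == Camera.CameraInfo.CAMERA_FACING_FRONT) {
    result = (info.orientation + degrees) % 360;
    result = (360 - result) % 360; // compensate for the front camera's mirroring
} else { // back-facing
    result = (info.orientation - degrees + 360) % 360;
}
camera.setDisplayOrientation(result);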
How can I get the intrinsic orientation of the camera in API levels less than 9, where there is no CameraInfo class?
There are some platform-specific solutions, but no easy general one.
Android has a Hardware Abstraction Layer (HAL), and different vendors implement the HAL differently. For example, different camera devices may have different drivers, so they have different ways of getting their data out, including your camera info. When Android adds an API to the HAL, it requires vendors to implement that API for their hardware. The Android framework and Android applications can then use that feature in a uniform way.
However, as you said, getCameraInfo is not in the HAL before Gingerbread (API 9). So a straightforward approach would be to get that info from the driver or a platform-specific library yourself.
For Qualcomm MSM cameras, there is a mm_camera_get_camera_info function in liboemcamera.so. You can use it to get a list of camera_info_t structs:
typedef struct {
    int modes_supported;         /* supported camera modes */
    int8_t camera_id;            /* camera index */
    cam_position_t position;     /* front- or rear-facing */
    uint32_t sensor_mount_angle; /* the mount orientation you are after */
} camera_info_t;
The function wraps the actual system call to the target camera device, ioctl(controlfd, MSM_CAM_IOCTL_GET_CAMERA_INFO, &cameraInfo). You may call that directly if you like.
So unfortunately, you need to get that info based on the device you are working on. But maybe you are expecting a general approach; then I think the only way to achieve it is to implement that abstraction layer yourself, with many if-else branches to decide which device or which ioctl command you need to use. Good luck!