I am trying to get the cameraId of the widest lens available on the rear side of the device. I am getting the "logical" rear camera just fine, as documented here.
But it defaults to a random rear physical camera that is not the widest one. I tried to follow the Multi-camera API documentation, but on every device I ask whether any of the cameras has CameraMetadata.REQUEST_AVAILABLE_CAPABILITIES_LOGICAL_MULTI_CAMERA, it always returns false.
For cameraManager.cameraIdList, I always seem to get just one front and one rear camera.
cameraManager.getCameraCharacteristics(cameraId).physicalCameraIds also always returns empty.
cameraManager.getCameraCharacteristics(cameraId).availablePhysicalCameraRequestKeys also always returns empty.
What could I be doing wrong? I am targeting API 29 and running on API 30 devices. I don't want to use the deprecated camera API, but it seems like this used to be doable there. What other info would you need?
You can calculate which camera has the biggest FOV based on sensor size and focal length:
FOV angle = 2 * atan(l / (2 * d)), where l is the sensor size and d is the focal length.
You can find the math behind this formula here.
https://stackoverflow.com/a/3261794/11861734
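A minimal Kotlin sketch of that calculation against the Camera2 characteristics, assuming a CameraManager instance named cameraManager (the sensor size and focal lengths are both reported in millimetres, so the units cancel out):

import android.hardware.camera2.CameraCharacteristics
import android.hardware.camera2.CameraManager
import android.hardware.camera2.CameraMetadata
import kotlin.math.atan

// Sketch: pick the rear-facing camera id with the widest FOV, using
// FOV = 2 * atan(l / (2 * d)). It can only see cameras the device
// actually exposes in cameraIdList.
fun widestRearCameraId(cameraManager: CameraManager): String? =
    cameraManager.cameraIdList
        .filter { id ->
            cameraManager.getCameraCharacteristics(id)
                .get(CameraCharacteristics.LENS_FACING) == CameraMetadata.LENS_FACING_BACK
        }
        .maxByOrNull { id ->
            val chars = cameraManager.getCameraCharacteristics(id)
            val sensorWidth = chars.get(CameraCharacteristics.SENSOR_INFO_PHYSICAL_SIZE)?.width
            val shortestFocal = chars.get(CameraCharacteristics.LENS_INFO_AVAILABLE_FOCAL_LENGTHS)?.minOrNull()
            if (sensorWidth == null || shortestFocal == null) 0f
            else 2f * atan(sensorWidth / (2f * shortestFocal))
        }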
So I have different Samsung devices (up to the S22 Ultra), and on those it is very easy to access the ultra wide camera, because CameraManager.cameraIdList returns 4 cameras, including the general back camera and the ultra wide camera.
But many other devices (Xiaomi, Vivo and many others) return only two general cameras: back and front.
Some users of my apps said that they are able to use the ultra wide camera with apps like mcpro24fps and gcam; one user confirmed this on a Xiaomi POCO X3 (Android 11).
How can such apps access all cameras?
Also, the Camera2 API usually reports that video stabilization is not supported (the manufacturer doesn't expose it within this API):
val characteristics = getCameraCharacteristics(context, cameraIdx)
val modes = characteristics.get(CameraCharacteristics.CONTROL_AVAILABLE_VIDEO_STABILIZATION_MODES)
    ?: intArrayOf()
val supported = CameraMetadata.CONTROL_VIDEO_STABILIZATION_MODE_ON in modes
supported is usually false, but even when it is true on some devices, it seems it still doesn't really do any video stabilization: there is no crop in the frames and basically no effect, based on responses from my app's users.
So the following code doesn't change anything even when the Camera2 API reports that video stabilization is supported:
if (isVideoStabilizationSupported()) {
    captureRequestBuilder.set(
        CaptureRequest.CONTROL_VIDEO_STABILIZATION_MODE,
        CameraMetadata.CONTROL_VIDEO_STABILIZATION_MODE_ON
    )
}
But again, mcpro24fps and gcam support this as well. Maybe it's a custom solution, but I don't really understand how you could implement something custom on top of the Camera2 API without affecting performance, because it would have to be implemented at a low level.
Update: maybe such apps can access the ultra wide camera by using the new zoom ratio capture parameter:
captureRequestBuilder.set(CaptureRequest.CONTROL_ZOOM_RATIO, 0.6f)
It works on my Samsung devices (zoom ratio 0.6...10), so you can switch between lenses without changing the camera ID.
The long-term goal for Android's camera APIs is that multi-camera clusters (such as a combination of ultrawide/wide/tele cameras) can be used by applications without having to specially code for it.
That's done via the logical multi-camera APIs. When implemented, that means there'll be one logical camera, composed of two or more physical cameras. What you see in the camera ID list is the logical camera, and you can also get the list of the physical cameras from CameraCharacteristics#getPhysicalCameraIds(). With this arrangement, you can see extended zoom ranges such as what you mention ( 0.6 ... 10), and the camera implementation will automatically switch to the ultrawide or telephoto cameras when you zoom out or zoom in. That's subject to various other conditions; for example, most telephoto lenses can't focus very close, so if you zoom in while focused on a nearby object, the camera will likely stay with the default wide camera; similarly tele cameras are often worse in low light, so digital zoom may result in better quality than optical zoom plus more amplification.
If you have a particular reason to use the underlying uw/wide/tele camera, you can include them in the stream configuration via setPhysicalCameraId() and use them directly; that lets you force which camera is streamed from if you want to provide a 'use telephoto' button in your UI, instead of letting the logical camera try to use its best judgement to select the active camera.
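Here's a minimal sketch of both pieces (API 28+); context, cameraDevice, previewSurface, executor and stateCallback are assumed to already exist in your code, and the physical id picked here is hypothetical:

import android.content.Context
import android.hardware.camera2.CameraManager
import android.hardware.camera2.params.OutputConfiguration
import android.hardware.camera2.params.SessionConfiguration

// List the physical cameras behind each camera id; the set is empty on
// devices that haven't implemented logical multi-camera.
val manager = context.getSystemService(Context.CAMERA_SERVICE) as CameraManager
for (id in manager.cameraIdList) {
    println("camera $id -> physical ids: ${manager.getCameraCharacteristics(id).physicalCameraIds}")
}

// Force one stream to come from a specific physical camera.
val physicalId = "2" // hypothetical id taken from physicalCameraIds
val output = OutputConfiguration(previewSurface).apply { setPhysicalCameraId(physicalId) }
cameraDevice.createCaptureSession(
    SessionConfiguration(SessionConfiguration.SESSION_REGULAR, listOf(output), executor, stateCallback)
)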
Unfortunately, not all devices have migrated to the logical camera API; those devices might be listing the multi-camera clusters as individual camera IDs as you've seen, or may just hide some cameras from apps entirely. The main problem with this is that it requires an app to do extra work to just zoom out/in with best quality, and the variety of implementations makes it hard to code up in an app so that it works on all devices.
I need to get the camera's horizontal and vertical viewing angles for an app I am writing. I used the approach in the second (not the accepted) answer on this question, which was working fine. I do:
Camera.Parameters p = Camera.open().getParameters();
and can then call
Math.toRadians(p.getVerticalViewAngle());
or the equivalent horizontal method to get the viewing angles.
This worked on my Nexus 4 and on a Samsung tablet, but I decided to try the app on my Nexus 7 and both the horizontal and vertical angles are being returned as pi. Obviously this is a ridiculous value for these attributes. Any idea why I am getting these values for this device?
Also, on a perhaps related note, android.hardware.Camera has been deprecated and replaced by android.hardware.camera2. I have been unable to find a way of achieving the same goal with camera2, though, and would welcome any suggestions on how to do this.
p.getVerticalViewAngle() is probably returning the maximum value for any possible camera.
An answer of pi radians implies 180° of vision, which is improbable but possible, and would be a theoretical maximum for a camera.
Therefore I would recommend trying to open the camera using an explicit ID:
Camera.Parameters p = Camera.open(/**cameraNumber**/).getParameters();
and checking that the cameraDevice isn't null.
As to the other question: there isn't any way to get the vertical and horizontal viewing angles using the camera2 API, but it does work with the original Camera API (at least it has worked for me).
Is there any API call or calculation to obtain the Android camera's aperture (f-number) and exposure time?
The Camera API exposes methods to get the White Balance, the Focal Length, etc, but I couldn't find anything related to the Aperture and the Exposure Time.
Thanks
In the Android Developer's Guide for ExifInterface, which is a class in the android.media API (not the now-deprecated Camera API), there is information for Aperture and Exposure Time.
Sample code showing how it could be used is available at ExifHelper
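For a quick illustration, a sketch reading those two tags from a saved JPEG (photoPath is assumed to point at a photo your app has captured; TAG_APERTURE was later deprecated in favour of TAG_F_NUMBER):

import android.media.ExifInterface

val exif = ExifInterface(photoPath)
val aperture = exif.getAttribute(ExifInterface.TAG_APERTURE)      // e.g. "2.4"
val exposure = exif.getAttribute(ExifInterface.TAG_EXPOSURE_TIME) // e.g. "0.01" (seconds)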
In API level 9, Android added the CameraInfo class, which includes information on each physical camera in the device. In particular, it includes an orientation attribute, which is "the angle that the camera image needs to be rotated clockwise so it shows correctly on the display in its natural orientation." This is distinct from the actual rotation of the device, which is found from getContext().getWindowManager().getDefaultDisplay().getRotation().
Android's sample code subtracts the rotation of the device from the orientation of the camera for rear-facing cameras (it's slightly more complicated for front-facing ones), and rotates the camera preview by this amount. This allows the preview to display properly in both portrait and landscape orientations of the screen.
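For reference, that computation looks roughly like this (adapted from the Camera.setDisplayOrientation sample; activity, cameraId and camera are assumed to exist):

import android.hardware.Camera
import android.view.Surface

val info = Camera.CameraInfo()
Camera.getCameraInfo(cameraId, info)
val degrees = when (activity.windowManager.defaultDisplay.rotation) {
    Surface.ROTATION_90 -> 90
    Surface.ROTATION_180 -> 180
    Surface.ROTATION_270 -> 270
    else -> 0
}
val result = if (info.facing == Camera.CameraInfo.CAMERA_FACING_FRONT) {
    (360 - (info.orientation + degrees) % 360) % 360 // also compensates for mirroring
} else {
    (info.orientation - degrees + 360) % 360
}
camera.setDisplayOrientation(result)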
How can I get the intrinsic orientation of the camera in API levels less than 9, where there is no CameraInfo class?
There are some platform-specific solutions, but there is no easy general solution.
Android has a Hardware Abstraction Layer (HAL), and different vendors implement the HAL differently. For example, different camera devices may have different drivers, so they have different ways to get their data out, including your camera info. When Android adds an API to the HAL, it requires vendors to implement that API based on their hardware. Then the Android framework and Android applications can use that feature in a uniform way.
However, as you said, getCameraInfo is not in the HAL before Froyo. So a straightforward approach would be to get that info from the driver or a platform-specific library yourself.
For MSM Camera, there is a mm_camera_get_camera_info function in liboemcamera.so. You can use it to get a list of camera_info_t structs.
typedef struct {
    int modes_supported;
    int8_t camera_id;
    cam_position_t position;
    uint32_t sensor_mount_angle;
} camera_info_t;
The function encapsulates the actual system call to the target camera device: ioctl(controlfd, MSM_CAM_IOCTL_GET_CAMERA_INFO, &cameraInfo). You may call it directly if you like.
So unfortunately, you need to get that info based on the device you are working on. But maybe you are expecting a general approach; then I think the only way to achieve this is to implement the HAL yourself, with many if-else branches to decide which device or which ioctl command you need to use. Good luck.
I know how to get a lux value from the light sensor using android.hardware.sensor.
I saw light meter tools on the market. The application description said it can get the lux value from the camera. How can it do that?
Also how can I set the shutter speed and the aperture?
The Android camera API doesn't provide any absolute units for the image data it captures. In addition, it does not allow manual control of exposure or aperture (although essentially all cell phone cameras have no adjustable aperture anyway).
You can find out what exposure time was used for a still capture from the JPEG EXIF, but that's about it.
Because of those limitations, you'll have a hard time getting an absolute light measurement from a captured camera image. You may be able to calibrate a given device to convert from image pixel value to true light level, but it'll be complicated since all devices run auto-exposure and auto-white-balance. Using auto-exposure and auto-white-balance locks introduced in Android 4.0 will help a bit, but there's still an unknown conversion curve between lux and a captured pixel value (not just a scale factor, it's a gamma curve).
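If you do attempt such a per-device calibration, here is a small sketch of the AE/AWB locks mentioned above (old Camera API, API 14+; camera is assumed to be an open android.hardware.Camera):

// Lock auto-exposure and auto-white-balance so the pixel-to-light
// conversion stays stable while you sample frames.
val params = camera.parameters
if (params.isAutoExposureLockSupported) params.autoExposureLock = true
if (params.isAutoWhiteBalanceLockSupported) params.autoWhiteBalanceLock = true
camera.parameters = params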
Take a look at the Camera.Parameters class.
It has all the functions supported by the camera; setExposureCompensation is probably the closest.
I don't know much about photography, but my guess would be that exposure compensation changes the aperture or the shutter speed.
It could be that the tool you mentioned is using a bundled native library. Also have a look at how the functions in the Camera.Parameters class work (check the Android source code).
You can also use the ambient light sensor to get the light level in lux units (if this is within the scope of your project). From the Android documentation:
Sensor.TYPE_LIGHT:
values[0]: Ambient light level in SI lux units
Android Developer: Light Sensor-Sensor Event
You can find more information about the light sensor in the documentation.
Android Developer: Sensor
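A minimal sketch of reading it (a context is assumed):

import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager

val sensorManager = context.getSystemService(Context.SENSOR_SERVICE) as SensorManager
val lightSensor = sensorManager.getDefaultSensor(Sensor.TYPE_LIGHT)
val listener = object : SensorEventListener {
    override fun onSensorChanged(event: SensorEvent) {
        val lux = event.values[0] // ambient light level in SI lux units
    }
    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) {}
}
sensorManager.registerListener(listener, lightSensor, SensorManager.SENSOR_DELAY_NORMAL)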
Agreed, the trick is taking the picture and reading its EXIF data.
You can find more in the ExifInterface docs:
http://developer.android.com/reference/android/media/ExifInterface.html
Given the ISO and aperture values, you can then calculate the shutter speed.
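And to close the loop on the original lux question: the same EXIF values give you an exposure value, which can be turned into an approximate light level. A rough sketch; the ~2.5 lx at EV 0 calibration constant is a common approximation, not a calibrated value:

import kotlin.math.log2
import kotlin.math.pow

// EV at ISO 100 from the EXIF f-number, exposure time (s) and ISO,
// then an approximate conversion to lux.
fun approxLux(fNumber: Double, exposureSeconds: Double, iso: Double): Double {
    val ev100 = log2(fNumber * fNumber / exposureSeconds) - log2(iso / 100.0)
    return 2.5 * 2.0.pow(ev100)
}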