Can I use BULB mode from the Camera Remote API? - Android

I am working on a simple mobile app that controls the camera from a phone, and I am interested in taking pictures with custom shutter speeds. Exposures up to 30 seconds are easily controlled by setting the shutter speed via the API, but longer exposures require BULB mode.
Is there a way to take a picture in BULB mode from the Camera Remote API?
This seems to be a blocker for some use cases, e.g. extended bracketing and some unusual forms of time-lapse that I want to shoot.
PS: I am struggling with a few more topics - setting the metering mode (spot, center, multi) and setting the white balance tint (the green-magenta axis, as opposed to temperature on the yellow-blue axis). Is there a way to control these parameters?

Sorry, none of those are supported in the API currently.
White balance is supported in the current API for the QX series and Alpha cameras like the A7, NEX-5000, and RX100 MIII; however, I don't believe the tint axis is controllable.

Version 4 of the Camera Remote Application now supports a lot of new functions. Some of those include:
Bulb
Continuous
RAW, movie file download and deleting contents from the SD card
It seems metering mode and the actual metering result are still unavailable. White balance still only exposes temperature (yellow/blue) and no tint (green/magenta).
These are described in the Camera Remote API SDK v2.20: https://developer.sony.com/downloads/all/sony-camera-remote-api-beta-sdk/
Thank you Sony folks
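For anyone scripting this themselves: calls to the Camera Remote API are plain JSON-RPC requests over HTTP. Below is a minimal Kotlin sketch of issuing such a call; the camera address and the bulb method names (startBulbShooting / stopBulbShooting) are assumptions based on the v2.20 SDK feature list above, so verify them against the SDK documentation and the camera's getAvailableApiList result before relying on them.

import java.net.HttpURLConnection
import java.net.URL

// Minimal sketch of a Camera Remote API call (JSON-RPC over HTTP).
// The service URL and the bulb method names are assumptions; check the SDK docs
// and the camera's getAvailableApiList response before relying on them.
fun callCameraApi(serviceUrl: String, method: String, params: String = "[]"): String {
    val body = """{"method":"$method","params":$params,"id":1,"version":"1.0"}"""
    val conn = (URL("$serviceUrl/camera").openConnection() as HttpURLConnection).apply {
        requestMethod = "POST"
        doOutput = true
        setRequestProperty("Content-Type", "application/json")
    }
    conn.outputStream.use { it.write(body.toByteArray()) }
    return conn.inputStream.use { it.readBytes().decodeToString() }
}

// Hypothetical usage (the camera address varies per model):
// callCameraApi("http://192.168.122.1:8080/sony", "setShutterSpeed", """["BULB"]""")
// callCameraApi("http://192.168.122.1:8080/sony", "startBulbShooting")
// ...wait for the desired exposure time...
// callCameraApi("http://192.168.122.1:8080/sony", "stopBulbShooting")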

Related

How to achieve Google Duo app's low light mode using Camera2 API?

I'm working on an Android app that streams video with another party, and I'm currently looking at brightening the image in low-light scenes.
I noticed that the Google Duo app's low light mode does this really well.
Is it achievable using just the Camera2 API? The reason I am not using the CameraX API is that my app also uses the Vonage (formerly TokBox) SDK for two-way video calls, and its sample code currently uses the Camera2 API and hasn't migrated to CameraX yet.
What I tried is setting CONTROL_AE_MODE (auto-exposure) to CONTROL_AE_MODE_ON. This helps a bit, but the image quality is nowhere near as good as the Google Duo app's. I'm looking at decreasing the camera FPS next, but what else can I do to brighten the image?
If you set CaptureRequest.CONTROL_AE_MODE to CaptureRequest.CONTROL_AE_MODE_OFF, then you can control ISO (CaptureRequest.SENSOR_SENSITIVITY) and exposure time (CaptureRequest.SENSOR_EXPOSURE_TIME) manually.
You can read the available range of sensor sensitivity (ISO) like this, since it varies from device to device and camera to camera: val sensorSensitivityRange = cameraCharacteristics?.get(CameraCharacteristics.SENSOR_INFO_SENSITIVITY_RANGE) as Range<Int>?
So in low-light mode you could control the brightness yourself, and in normal mode, you let the camera do it automatically.
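As a concrete illustration, here is a minimal Kotlin sketch of such a low-light path that disables auto-exposure and sets ISO and exposure time manually. The helper name, the ISO heuristic, and the ~1/30 s exposure time are just illustrative choices, not anything prescribed by the API.

import android.hardware.camera2.CameraCharacteristics
import android.hardware.camera2.CaptureRequest
import android.util.Range

// Minimal sketch: switch a CaptureRequest.Builder to manual exposure for a low-light mode.
// Assumes `characteristics` and `builder` come from the camera you have already opened.
fun applyManualLowLight(
    characteristics: CameraCharacteristics,
    builder: CaptureRequest.Builder
) {
    // Turn off auto-exposure so the manual ISO / exposure time below take effect.
    builder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_OFF)

    // Read the supported ISO range; it differs per device and per camera.
    val isoRange: Range<Int>? =
        characteristics.get(CameraCharacteristics.SENSOR_INFO_SENSITIVITY_RANGE)

    // Pick a high-ish ISO within the supported range (illustrative heuristic only).
    val iso = isoRange?.let { it.lower + (it.upper - it.lower) * 8 / 10 } ?: 800
    builder.set(CaptureRequest.SENSOR_SENSITIVITY, iso)

    // A longer exposure brightens the frame; ~33 ms keeps the preview near 30 fps.
    builder.set(CaptureRequest.SENSOR_EXPOSURE_TIME, 33_000_000L)
}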
More information here:
https://developer.android.com/reference/android/hardware/camera2/CaptureRequest

Camera2: setting optical stabilization does nothing (OIS)

I have a Samsung S10, which has a video stabilization feature.
Using the default system Camera app I can see the difference between it being enabled and disabled: first, when it's enabled the preview is slightly zoomed in; second, the stabilization is noticeable during device movement.
I tried to enable stabilization in my own app using the Camera2 API, also at Full HD and with the rear camera (the same as with the default system app).
I checked that characteristics.get(CameraCharacteristics.CONTROL_AVAILABLE_VIDEO_STABILIZATION_MODES) returns only CONTROL_VIDEO_STABILIZATION_MODE_OFF, so this is not supported.
But characteristics.get(CameraCharacteristics.LENS_INFO_AVAILABLE_OPTICAL_STABILIZATION) contains CameraMetadata.LENS_OPTICAL_STABILIZATION_MODE_ON.
So as I understand it, this is exactly the option to enable (optical) video stabilization, and it should behave the same as in the default system app.
But when I add the following to the capture request for the camera capture session, it doesn't change anything: there is no zoomed preview (as there was with the default system camera app) and no difference during movement, so the video from my app looks the same as it would from the default camera app with video stabilization disabled:
captureRequestBuilder.set(
    CaptureRequest.LENS_OPTICAL_STABILIZATION_MODE,
    CameraMetadata.LENS_OPTICAL_STABILIZATION_MODE_ON
)
So setting this parameter doesn't change anything.
Why does video stabilization work in the default system camera app but not in my own app using the Camera2 API?
There are two types of stabilization that camera devices can support on Android:
Video stabilization (Electronic Image Stabilization / EIS): This is done as a post-processing step after image capture, by analyzing camera motion between frames and compensating for it by shifting the image a bit. That's why there's a zoom-in effect, to give some room for that shift/warp. This stabilizes video over time, making consecutive image frames stable. The controls for this are accessed via the CONTROL_VIDEO_STABILIZATION_MODE setting, as you've discovered.
Optical image stabilization (OIS): This is a set of high-speed gyros and magnets around the camera lens, which rapidly shift the lens (or sometimes the sensor) as the camera moves, to stabilize the image. The amount of motion is limited, but it's very rapid, so it stabilizes the image during a single exposure rather than across multiple frames. As a result it's generally only useful for snapshots, not video. This is accessed via LENS_OPTICAL_STABILIZATION_MODE.
Unfortunately, many Android manufacturers do not make their EIS implementations available to applications outside of their default camera app. That's because making a good EIS implementation is complicated, and the manufacturers want to limit it to working only with their own app's recording mode (a single, known target). For example, EIS for recorded video often applies a one-second delay so that it can adjust image transforms based on future frames as well as past ones, which is hard to do for real-time apps. Some manufacturers expose simpler algorithms, or otherwise manage to make EIS work for everyone, but on other devices EIS support isn't listed even though the built-in app uses it.
Turning on OIS probably works fine - you'd only see an effect on long-exposure images, which will be blurry from hand shake when OIS is off but sharp when OIS is on. Since it's a self-contained hardware unit, it's easy for manufacturers to make the on-switch available to everyone.
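To make the distinction concrete, here is a minimal Kotlin sketch that only enables a stabilization mode the device actually advertises, preferring EIS for video and falling back to OIS. The helper name is mine, not part of the Camera2 API.

import android.hardware.camera2.CameraCharacteristics
import android.hardware.camera2.CameraMetadata
import android.hardware.camera2.CaptureRequest

// Minimal sketch: enable EIS if the device lists it, otherwise fall back to OIS.
fun enableAdvertisedStabilization(
    characteristics: CameraCharacteristics,
    builder: CaptureRequest.Builder
) {
    val eisModes = characteristics.get(
        CameraCharacteristics.CONTROL_AVAILABLE_VIDEO_STABILIZATION_MODES)
    val oisModes = characteristics.get(
        CameraCharacteristics.LENS_INFO_AVAILABLE_OPTICAL_STABILIZATION)

    if (eisModes?.contains(CameraMetadata.CONTROL_VIDEO_STABILIZATION_MODE_ON) == true) {
        // Electronic stabilization: smooths motion across frames (video).
        builder.set(CaptureRequest.CONTROL_VIDEO_STABILIZATION_MODE,
            CameraMetadata.CONTROL_VIDEO_STABILIZATION_MODE_ON)
    } else if (oisModes?.contains(CameraMetadata.LENS_OPTICAL_STABILIZATION_MODE_ON) == true) {
        // Optical stabilization: reduces blur within a single exposure (stills).
        builder.set(CaptureRequest.LENS_OPTICAL_STABILIZATION_MODE,
            CameraMetadata.LENS_OPTICAL_STABILIZATION_MODE_ON)
    }
}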

How to change color of Camera LED / Flashlight in Android

I'm trying to change the color of the camera LED / flashlight in Android. Is there any way to achieve this?
I know we can change colors on the Nexus One trackball. I'm trying to change the camera LED / flashlight color in Android like that.
I'm trying to change the color of the camera LED / flashlight in Android. Is there any way to achieve this?
Tape a piece of colored transparent plastic over the LED.
I'm trying to change the camera LED / flashlight color in Android like that.
There is nothing in the Android SDK for this.
You are welcome to try to find some Android device that has a camera flash that uses different colors, then contact the manufacturer of that device to see if they exposed some SDK add-on to give apps control over that color. Out of several hundred million Android devices, there are approximately zero that offer this capability.
You can buy small colored transparent stickers; they are made specifically for phone LED lights. You just stick one on your phone's LED light. You can also make your own using clear tape and a marker, or colored clear tape.

Custom Camera Application for Android: Adjusting Colors in Real Time

I am considering an application which would adjust the hue/saturation (overall color spectrum) of the camera input in real time. Are current phones (such as the Galaxy S3) powerful enough to allow real-time color filtering/adjustment of the live camera image, or am I forced to apply the color shift / image manipulation as post-processing, like most of the apps I see?
Essentially I need to convert the real-time image feed to a different color spectrum and display the updated image live, before "snapping a picture". Any feedback on how to do this in real time prior to the "snap" would be greatly appreciated.

Filter infrared sources from an android camera image

I'm trying to find out if there is any infrared source in view of the camera on an Android device (namely an infrared LED).
Since the camera captures infrared light (I can see the LEDs light up in the preview/pictures), I thought it should somehow be possible to find out whether the camera is currently capturing infrared signals. But as the IR 'color' is translated to a visible color (purple-like), it's apparently not as easy as just checking whether there is any purple in the picture, since that could be real purple rather than infrared.
The Android reference tells me I can get the picture in different image formats (YCbCr, YUV, ...), but none of these formats seem to be of much help.
Now my idea is to somehow get the "original" data from the camera that still includes the information on what is infrared and what is not, or to basically revert the infrared-to-visible conversion that apparently happens automatically in the background. Any idea how I might achieve that?
Good question. If I take the remote control for a hi-fi or TV and press volume up/down, I can see the IR light source with the Nexus One camera: it shows up as a light purple flashing. So the Nexus One camera does pick up IR :)
Digital cameras use CCD and other similar sensors to capture infrared images. Although all digital cameras available on the market are sensitive to infrared light, they are equipped with infrared-blocking filters. The main reason for this is that consumer cameras are designed to capture visible light. But sometimes these filters are used together, giving very interesting in-camera effects like false color, wood effects etc. To start with infrared photography, all you need is:
A digital camera that is sensitive to infrared light
A visible-light blocking filter (e.g. a Wratten 89B filter)
Image-editing software, such as Photoshop
http://www.smashingmagazine.com/2009/01/11/40-incredible-near-infrared-photos/
I wrote a very, very simple online radio player with an asynchronous call to MediaPlayer. Some devices play it, some don't, and in different ways. From my sad experience with the simple MediaPlayer, you will have to write something like five versions to get it working on even a few devices. Each manufacturer - Samsung, Sony Ericsson, Motorola - adds and removes audio/video features, and you just wonder why your code doesn't work on THAT device if it's platform-independent Java...
I think it would be possible to get the original data from the camera via the NDK, but I haven't tested it.
It provides headers and libraries that allow you to build activities, handle user input, use hardware sensors, access application resources, and more, when programming in C or C++.
http://developer.android.com/sdk/ndk/index.html
If somebody finds a solution using the SDK that works on devices from two different phone manufacturers, please let me know!
My friend, the only way to find an infrared source in Android is not to rely on color codes, because color spaces were created only for visible light, so there is no use for invisible light that happens to be registered by the sensor as an RGB value. Logically, color space values are only meaningful for visible light, and they were never designed to represent wavelengths outside our visible range.
There is no correct way to interpret visible color space values such as RGB or YUV as infrared unless you know the exact camera sensor white papers and have tested the sensor with multiple highly accurate filters for each specific wavelength, which is overkill for an app.
To keep things simple, use an external infrared filter for the specific wavelengths, and remove the filter inside the phone or tablet camera if you can physically modify it.
The other way around: use a known model of webcam, remove the filter on top of its sensor, and use external filters to protect the sensor from UV light and so on.
