Android - getting supported Color Effects

There is an API for that: Camera.Parameters#getSupportedColorEffects()
But it doesn't work properly on my Samsung Galaxy S Plus. It returns 9 color effects, but only three of them are actually supported.
I came to that conclusion after launching the 'native' camera app - only three effects are available there (sepia, negative and black'n'white). And those three work in my application. When I try to apply the others from the list returned by getSupportedColorEffects() - nothing happens.
Does anybody know how to find the color effects that are actually supported?
Here is how I am getting those effects:
public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
    Camera.Parameters parameters = mCamera.getParameters();
    List<Size> sizes = parameters.getSupportedPreviewSizes();
    List<String> effects = parameters.getSupportedColorEffects();
    //...
}

I think you may have found a bug in Android, or at least the build of it on that device. It sounds like someone else had a similar issue on a different device: android camera samsung galaxy i9003 setParameters faild.
One thought as a possible workaround: Are you able to successfully set the color effects that aren't working? That is, do you check that getColorEffect() is not null after calling setColorEffect()? If you get null for the ones that don't work, you can just follow up your getSupportedColorEffects() with a loop to verify each.
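A minimal sketch of that verification loop, assuming mCamera is already open (workingEffects is just an illustrative name):
List<String> reportedEffects = mCamera.getParameters().getSupportedColorEffects();
List<String> workingEffects = new ArrayList<String>();
for (String effect : reportedEffects) {
    Camera.Parameters p = mCamera.getParameters();
    p.setColorEffect(effect);
    try {
        mCamera.setParameters(p);
    } catch (RuntimeException e) {
        continue; // the driver rejected this effect outright
    }
    // Re-read the parameters; if the effect didn't stick, treat it as unsupported.
    String applied = mCamera.getParameters().getColorEffect();
    if (applied != null && applied.equals(effect)) {
        workingEffects.add(effect);
    }
}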
If that doesn't work, but rather the device is claiming to support effects that it silently ignores, then I'm not sure there's anything you can do about it.

Related

Using Vulkan to sample from Android Camera2 hardware buffer - Issue with image formats

I'm currently working on an app in C++ using the Android NDK, and I need to create a sampler to access the camera output image.
I have done this using the AIMAGE_FORMAT_YUV_420_888, and using the VkSamplerYcbcrConversion for accessing the image in the hardware buffer. I do the yuv -> rgb conversion in a shader, and it all looks good on my phone.
I have since discovered that this doesn't work on Samsung phones, in my case specifically the Samsung Galaxy S10/S10+.
The reason is that when I set up an image reader with AIMAGE_FORMAT_YUV_420_888 I get a camera error on Samsung. On my OnePlus and on another phone I tried, the pipeline worked entirely as expected. I created a very simple test setup just to open the camera with that image format in the ImageReader on the Samsung S10 and got the error, but when I changed the ImageReader format to AIMAGE_FORMAT_JPEG the error went away and the camera started as expected.
AImageReader* SimpleCamera::CreateJpegReader()
{
    AImageReader* reader = nullptr;
    // media_status_t status = AImageReader_new(640, 480, AIMAGE_FORMAT_JPEG,
    // AIMAGE_FORMAT_RGBA_8888
    // media_status_t status = AImageReader_new(640, 480, AIMAGE_FORMAT_RGB_565, 4, &reader);
    media_status_t status = AImageReader_newWithUsage(640, 480,
        // AIMAGE_FORMAT_RGBA_8888,
        // AIMAGE_FORMAT_RGB_565,
        // AIMAGE_FORMAT_RGB_888,
        AIMAGE_FORMAT_JPEG,
        AHARDWAREBUFFER_USAGE_GPU_SAMPLED_IMAGE | AHARDWAREBUFFER_USAGE_CPU_READ_RARELY,
        4, &reader);
    if (status != AMEDIA_OK) {
        LOGE("Couldn't create new image reader");
        return nullptr;
    }
    AImageReader_ImageListener listener{
        .context = nullptr,
        .onImageAvailable = imageCallback1,
    };
    AImageReader_setImageListener(reader, &listener);
    return reader;
}
None of the other formats are guaranteed to be supported except AIMAGE_FORMAT_JPEG, but this format doesn't seem to work with the VkSamplerYcbcrConversion because the image layout is different.
Has anyone come up against this issue before? And if so, how did you resolve it?
At a high level the goal is: in C++, get the image out of the Camera2 API and onto a VkImage. If anyone knows an alternative way of doing that, I'm all ears.
Try to use ImageFormat.PRIVATE with USAGE_GPU_SAMPLED_IMAGE flag. This used to work fine on the mentioned Samsung devices in particular.
Please make sure to read the Vulkan specification, as there are quite a few Android-specific and VkSamplerYcbcrConversion requirements.
I can also recommend taking a look at this great project, which uses the Android Camera2 API and Vulkan.
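For reference, a minimal Java-side sketch of the suggested combination (ImageFormat.PRIVATE plus a GPU-sampled usage flag); the NDK analogues are AIMAGE_FORMAT_PRIVATE and AHARDWAREBUFFER_USAGE_GPU_SAMPLED_IMAGE passed to AImageReader_newWithUsage, and backgroundHandler is just an illustrative name:
// An ImageReader whose buffers stay in a private, GPU-sampleable format.
// Requires API 26+ for the usage-flag overload and API 28+ for Image.getHardwareBuffer().
ImageReader reader = ImageReader.newInstance(
        640, 480,
        ImageFormat.PRIVATE,
        4,                                        // maxImages
        HardwareBuffer.USAGE_GPU_SAMPLED_IMAGE);  // buffers will be sampled by the GPU

reader.setOnImageAvailableListener(r -> {
    Image image = r.acquireLatestImage();
    if (image != null) {
        HardwareBuffer buffer = image.getHardwareBuffer(); // import this into Vulkan
        // ... hand the AHardwareBuffer over to the Vulkan side, then release:
        buffer.close();
        image.close();
    }
}, backgroundHandler);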

Android Camera2 API - Set AE-regions not working

In my Camera2 API project for Android, I want to set a region for my exposure calculation. Unfortunately it doesn't work. On the other hand, the focus region works without any problems.
Device: Samsung S7 / Nexus 5
1.) Initial values for CONTROL_AF_MODE & CONTROL_AE_MODE
mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_AUTO);
mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON);
2.) Create the MeteringRectangle List
meteringFocusRectangleList = new MeteringRectangle[]{new MeteringRectangle(0,0,500,500,1000)};
3.) Check if it is supported by the device and set the CONTROL_AE_REGIONS (same for CONTROL_AF_REGIONS)
if (camera2SupportHandler.cameraCharacteristics.get(CameraCharacteristics.CONTROL_MAX_REGIONS_AE) > 0) {
    camera2SupportHandler.mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_REGIONS, meteringFocusRectangleList);
}
4.) Tell the camera to start Exposure control
camera2SupportHandler.mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER, CameraMetadata.CONTROL_AE_PRECAPTURE_TRIGGER_START);
The CONTROL_AE_STATE is always CONTROL_AE_STATE_SEARCHING, but the configured regions are not used...
After long testing & development I've found an answer.
The coordinate system - Camera 1 API VS Camera 2 API
RED = Camera1 API; GREEN = Camera2 API. In the (now missing) image, the blue rect showed the coordinates of a possible focus/exposure area for Camera1. With the Camera2 API, you first have to query the maximum width and height, since metering regions are expressed in the sensor's active-array coordinate system. Please find more info here.
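A rough sketch of that query and of building an AE region in those coordinates, reusing the camera2SupportHandler names from the question ((0,0) is the top-left of the active pixel array):
// Read the active array size; it defines the coordinate range for metering regions.
Rect activeArray = camera2SupportHandler.cameraCharacteristics
        .get(CameraCharacteristics.SENSOR_INFO_ACTIVE_ARRAY_SIZE);
int maxWidth = activeArray.width();
int maxHeight = activeArray.height();

// As an example, meter on the centered quarter of the frame.
MeteringRectangle[] aeRegions = new MeteringRectangle[]{
        new MeteringRectangle(maxWidth / 4, maxHeight / 4,
                maxWidth / 2, maxHeight / 2,
                MeteringRectangle.METERING_WEIGHT_MAX)
};
camera2SupportHandler.mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_REGIONS, aeRegions);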
Initial values for CONTROL_AF_MODE & CONTROL_AE_MODE: see the question above.
Set the CONTROL_AE_REGIONS: see the question above.
Set the CONTROL_AE_PRECAPTURE_TRIGGER.
// This is how to tell the camera to start AE control
CaptureRequest captureRequest = camera2SupportHandler.mPreviewRequestBuilder.build();
camera2SupportHandler.mCaptureSession.setRepeatingRequest(captureRequest, captureCallbackListener, camera2SupportHandler.mBackgroundHandler);
camera2SupportHandler.mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER, CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER_START);
// Rebuild the request so the single capture actually carries the precapture trigger
camera2SupportHandler.mCaptureSession.capture(camera2SupportHandler.mPreviewRequestBuilder.build(), captureCallbackListener, camera2SupportHandler.mBackgroundHandler);
The 'captureCallbackListener' gives feedback on the AE control (and of course also on AF control).
So this configuration works for most Android phones. Unfortunately it doesn't work for the Samsung S6/S7. For this reason I've tested their Camera SDK, which can be found here.
After deep investigation I've found the config field 'SCaptureRequest.METERING_MODE'. By setting this to the value of 'SCaptureRequest.METERING_MODE_MANUAL', the AE area also works on the Samsung phones.
I'll add an example to GitHub asap.
Recently I had the same problem and finally found a solution that helped me.
All I needed to do was step 1 pixel in from the edges of the active sensor rectangle. In your example, instead of this rectangle:
meteringRectangleList = new MeteringRectangle[]{new MeteringRectangle(0,0,500,500,1000)};
I would use this:
meteringRectangleList = new MeteringRectangle[]{new MeteringRectangle(1,1,500,500,1000)};
and it started working like magic on both the Samsung and the Nexus 5!
(Note that you should also step 1 pixel in from the right/bottom edges if you use maximum values there.)
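A small sketch of that inset, assuming the active array rectangle has been read from SENSOR_INFO_ACTIVE_ARRAY_SIZE (the Math.min clamps only matter on very small sensors):
// Keep the example rectangle at least 1 px inside the active array on every side.
Rect active = camera2SupportHandler.cameraCharacteristics
        .get(CameraCharacteristics.SENSOR_INFO_ACTIVE_ARRAY_SIZE);
int left = 1;                                    // 1 px in from the left edge
int top = 1;                                     // 1 px in from the top edge
int width = Math.min(500, active.width() - 2);   // stay 1 px away from the right edge
int height = Math.min(500, active.height() - 2); // stay 1 px away from the bottom edge

MeteringRectangle[] meteringRectangleList = new MeteringRectangle[]{
        new MeteringRectangle(left, top, width, height, 1000)
};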
It seems that many vendors have poorly implemented this part of the documentation:
If the metering region is outside the used android.scaler.cropRegion returned in capture result metadata, the camera device will ignore the sections outside the crop region and output only the intersection rectangle as the metering region in the result metadata. If the region is entirely outside the crop region, it will be ignored and not reported in the result metadata.

Android camera API blurry image on Samsung devices

After implementing the Camera2 API for the in-app camera, I noticed that on Samsung devices the images appear blurry. After searching around I found the Samsung Camera SDK (http://developer.samsung.com/galaxy#camera). After implementing the SDK, the images are fine on the Samsung Galaxy S7, but on the Galaxy S6 they are still blurry. Has anyone experienced this kind of issue with Samsung devices?
EDIT:
To complement @rcsumners' comment, I am setting autofocus by using:
mPreviewBuilder.set(SCaptureRequest.CONTROL_AF_TRIGGER, SCaptureRequest.CONTROL_AF_TRIGGER_START);
mSCameraSession.capture(mPreviewBuilder.build(), new SCameraCaptureSession.CaptureCallback() {
    @Override
    public void onCaptureCompleted(SCameraCaptureSession session, SCaptureRequest request, STotalCaptureResult result) {
        isAFTriggered = true;
    }
}, mBackgroundHandler);
It is a long-exposure image where the user has to take a picture of a static, non-moving object. For this I am using CONTROL_AF_MODE_MACRO:
mCaptureBuilder.set(SCaptureRequest.CONTROL_AF_MODE, SCaptureRequest.CONTROL_AF_MODE_MACRO);
and I am also enabling auto flash if it is available:
requestBuilder.set(SCaptureRequest.CONTROL_AE_MODE,
SCaptureRequest.CONTROL_AE_MODE_ON_AUTO_FLASH);
I am not really an expert in this API; I mostly followed the SDK example app.
There could be a number of issues causing this problem. A prominent one is the dimensions of your output image.
I ran the Camera2 API and the preview was clear, but the output was quite blurry:
val characteristics: CameraCharacteristics? = cameraManager.getCameraCharacteristics(cameraId)
val size = characteristics?.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP)?.getOutputSizes(ImageFormat.JPEG) // The issue
var width = imageDimension.width
var height = imageDimension.height
if (size != null) {
    width = size[0].width; height = size[0].height
}
val imageReader = ImageReader.newInstance(width, height, ImageFormat.JPEG, 5)
The line below was returning a dimension of about 245*144, which was way too small to be sent to the image reader. Somehow the output was stretching the image, making it end up blurry. Therefore I removed this line:
val size = characteristics?.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP)?.getOutputSizes(ImageFormat.JPEG) // this was returning a small size
Setting the width and height manually resolved the issue.
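An alternative to hard-coding the dimensions is to pick the largest JPEG size the stream configuration map reports; a rough Java sketch of that idea, assuming a non-null characteristics object as above:
// Pick the largest JPEG size instead of blindly using element [0].
StreamConfigurationMap map =
        characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
Size largest = null;
for (Size s : map.getOutputSizes(ImageFormat.JPEG)) {
    if (largest == null
            || (long) s.getWidth() * s.getHeight() > (long) largest.getWidth() * largest.getHeight()) {
        largest = s;
    }
}
ImageReader imageReader =
        ImageReader.newInstance(largest.getWidth(), largest.getHeight(), ImageFormat.JPEG, 5);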
You're setting the AF trigger for one frame, but then are you waiting for AF to complete? For AF_MODE_MACRO (are you verifying the device lists support for this AF mode?) you need to wait for AF_STATE_FOCUSED_LOCKED before the image is guaranteed to be stable and sharp. (You may also receive NOT_FOCUSED_LOCKED if the AF algorithm can't reach sharp focus, which could be because the object is just too close for the lens, or the scene is too confusing)
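A rough sketch of that wait using the plain Camera2 classes (the Samsung SCamera* wrappers mirror them); captureStillPicture() is an assumed helper that fires the actual long-exposure capture:
// Trigger AF once, then wait for FOCUSED_LOCKED / NOT_FOCUSED_LOCKED in the
// capture callback before firing the real still capture.
CameraCaptureSession.CaptureCallback afCallback = new CameraCaptureSession.CaptureCallback() {
    @Override
    public void onCaptureCompleted(CameraCaptureSession session,
                                   CaptureRequest request,
                                   TotalCaptureResult result) {
        Integer afState = result.get(CaptureResult.CONTROL_AF_STATE);
        if (afState != null
                && (afState == CaptureResult.CONTROL_AF_STATE_FOCUSED_LOCKED
                 || afState == CaptureResult.CONTROL_AF_STATE_NOT_FOCUSED_LOCKED)) {
            // Focus has settled (sharp, or as good as it will get) - capture now.
            captureStillPicture();
        }
    }
};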
On most modern devices, it's recommended to use CONTINUOUS_PICTURE and not worry about AF triggering unless you really want to lock focus for some time period. In that mode, the device will continuously try to focus to the best of its ability. I'm not sure all that many devices support MACRO, to begin with.

Android - implementation discrepancies

I spent a lot of time debugging different problems that were reproducible only on specific devices.
For instance, I gave up on my attempts to take a picture from the camera using an Intent, because only a limited set of devices behave as expected.
Another example is when I use a byte array from the onPictureTakenCallback:
public void onPictureTaken(byte[] data, Camera camera) {
    byte[] tempData = new byte[data.length];
    System.arraycopy(data, 0, tempData, 0, data.length);
    //...
}
So if I don't make a copy but use the original "data" array some time later, I run into trouble because some devices clean this array up after a while. But other devices don't do such cleaning, so it works perfectly without making a copy.
One more example:
Some devices return null when:
Camera.Parameters params = camera.getParameters();
List<Camera.Size> sizes = params.getSupportedPreviewSizes();
// sizes is null
But most devices (I think) return a list of supported sizes.
So I wonder: is there any kind of knowledge base / FAQ assembled for such problems? If not, let's post the issues we have faced here.
I'm unaware of one. But the byte array you are receiving is mmapped and under the control of another (native) application (and thus the data may go away at the camera application's discretion, if it reuses this buffer).
The best way is to copy it away to a safe location ASAP.
As for preview sizes - they are a mess. Even if you get the list, not all resolutions are actually supported (I got segfaults at bigger resolutions - somehow the preview buffer did not fit). The only way is to probe whether a preview size is actually supported by activating each in turn and watching for an exception.
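A rough sketch of that probing, assuming the Camera is open and a preview surface has already been attached with setPreviewDisplay() (usableSizes is just an illustrative name):
// Probe each reported preview size by actually applying it and starting the
// preview, keeping only the sizes that don't throw.
List<Camera.Size> usableSizes = new ArrayList<Camera.Size>();
List<Camera.Size> reported = camera.getParameters().getSupportedPreviewSizes();
if (reported != null) {
    for (Camera.Size candidate : reported) {
        try {
            Camera.Parameters p = camera.getParameters();
            p.setPreviewSize(candidate.width, candidate.height);
            camera.setParameters(p);
            camera.startPreview();      // some sizes only fail once the preview starts
            usableSizes.add(candidate);
        } catch (RuntimeException e) {
            // setParameters()/startPreview() rejected this size - skip it
        } finally {
            camera.stopPreview();
        }
    }
}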

Camera Preview on Motorola Droid

Our application displays a camera preview and it seems to work fine on all phones except for the Motorola Droid where we get a runtime exception when we set the camera parameters:
java.lang.RuntimeException: setParameters failed
    at android.hardware.Camera.native_setParameters(Native Method)
    at android.hardware.Camera.setParameters(Camera.java:611)
    at com.highwaynorth.andrometer.CameraPreviewSurfaceView.surfaceChanged(CameraPreviewSurfaceView.java:57)
    at android.view.SurfaceView.updateWindow(SurfaceView.java:460)
    at android.view.SurfaceView.dispatchDraw(SurfaceView.java:287)
    at android.view.ViewGroup.drawChild(ViewGroup.java:1525)
Here is the code for surfaceChanged(), which is mostly taken from APIDemos:
public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
    // Now that the size is known, set up the camera parameters and begin
    // the preview.
    Camera.Parameters parameters = mCamera.getParameters();
    parameters.setPreviewSize(w, h);
    parameters.setPictureFormat(PixelFormat.JPEG);
    parameters.setPreviewFormat(PixelFormat.YCbCr_422_SP);
    parameters.setPreviewFrameRate(1);
    mCamera.setParameters(parameters);
    mCamera.startPreview();
}
Does anyone know what is wrong with how we are setting the parameters that would be causing the exception on the Motorola Droid?
I can tell you your problem is with one of the following two lines:
parameters.setPreviewFormat(PixelFormat.YCbCr_422_SP);
parameters.setPreviewFrameRate(1);
I know this, because the rest of that code is just what I do in some camera samples in my book, and they've been tested on a DROID.
You may wish to use getSupportedPreviewFormats() and getSupportedPreviewFrameRates() on your Camera.Parameters object, to see if the device in question supports the format and frame rate you seek. Note that those methods are new to Android 2.0, so they'll work on the DROID/Milestone (and, presumably, the Nexus One), but nothing else at the time of this writing. If you are targeting older Android API versions, you'll need to use reflection or some classloading tricks to get these methods to work on Android 2.0 and be skipped on older versions.
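A guarded version of the snippet from the question might look roughly like this (the fallback format and the 15 fps frame rate are just illustrative choices):
Camera.Parameters parameters = mCamera.getParameters();
parameters.setPreviewSize(w, h);
parameters.setPictureFormat(PixelFormat.JPEG);

// Only request a preview format / frame rate the device actually reports.
List<Integer> formats = parameters.getSupportedPreviewFormats();
if (formats != null && formats.contains(PixelFormat.YCbCr_420_SP)) {
    parameters.setPreviewFormat(PixelFormat.YCbCr_420_SP);
}
List<Integer> rates = parameters.getSupportedPreviewFrameRates();
if (rates != null && rates.contains(15)) {
    parameters.setPreviewFrameRate(15);
}
mCamera.setParameters(parameters);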
You should check what preview formats are available to make sure you can run on as many devices as possible.
It looks like the DROID supports:
PixelFormat.YCbCr_422_I
PixelFormat.YCbCr_420_SP
You can use getSupportedPreviewFormats() to get a list of the available formats.
Pixel Formats
Another thing you may want to investigate:
I am experiencing this issue on devices running Motoblur and updated to 2.3 (especially the Droid2, DroidX and Atrix with Verizon).
The Camera parameters were fine, but in layout/capture.xml the background of the ViewfinderView is set to transparent:
<com.google.zxing.client.android.ViewfinderView
    android:id="@+id/viewfinder_view"
    android:layout_width="fill_parent"
    android:layout_height="fill_parent"
    android:background="@color/transparent"
    />
Well, it looks like transparent on Motoblur with Android 2.3 is not that transparent...
removing
android:background="@color/transparent"
from the ViewFinder solved my problem.
