I'm developing an Android camera application. I've tested it on some devices and it works, but on the Samsung Galaxy i9003 setParameters doesn't work.
Here is how I do it:
cameraParams.setColorEffect([some supported effect]);
camera.setParameters(cameraParams);
[some supported effect] is a value I get from cameraParams.getSupportedColorEffects().
My code is not as simple as what I wrote above, but in the end it does exactly this.
It works on many devices, including the Samsung Galaxy S i9000 (though I have to work around a known bug on the i9000: when you get the parameters with params.flatten(), there are stray spaces in the parameter string).
I'd appreciate some help; it's the first time I'm writing here.
Different devices have different camera capabilities, so you should check that the parameter you wish to set is actually supported by the device. For example, call getSupportedColorEffects(): it returns either a list of supported effects or null, so you can check the value before you try to set anything.
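A minimal sketch of that check with the old android.hardware.Camera API (uses android.hardware.Camera and java.util.List; the helper name and the EFFECT_MONO choice are just examples):

    // Hypothetical helper: apply a color effect only if the device reports it
    private void applyMonoEffectIfSupported(Camera camera) {
        Camera.Parameters cameraParams = camera.getParameters();
        // getSupportedColorEffects() returns null on devices with no effects
        List<String> effects = cameraParams.getSupportedColorEffects();
        if (effects != null && effects.contains(Camera.Parameters.EFFECT_MONO)) {
            cameraParams.setColorEffect(Camera.Parameters.EFFECT_MONO);
            camera.setParameters(cameraParams);
        }
    }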
Related
Following the Camera2 sample, I've created a simple camera class to capture images. Capturing both flash and non-flash images works fine on any device with Android < 7.0, but on my Nexus 5X with Android 7.1 the same configuration fires the flash only once, on the preview. The setup is as follows:
for the preview I use CameraDevice.TEMPLATE_PREVIEW with the AE mode set to CameraMetadata.CONTROL_AE_MODE_ON_ALWAYS_FLASH
for the still capture I use the same settings, but with CameraDevice.TEMPLATE_STILL_CAPTURE (a sketch of both requests follows)
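A minimal sketch of how I build the two requests (exception handling omitted; mCameraDevice, previewSurface and mImageReader are assumed to be set up already):

    // Preview request: AE should always fire the flash
    CaptureRequest.Builder previewBuilder =
            mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
    previewBuilder.addTarget(previewSurface);
    previewBuilder.set(CaptureRequest.CONTROL_AE_MODE,
            CameraMetadata.CONTROL_AE_MODE_ON_ALWAYS_FLASH);

    // Still-capture request: same AE mode, still-capture template
    CaptureRequest.Builder captureBuilder =
            mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
    captureBuilder.addTarget(mImageReader.getSurface());
    captureBuilder.set(CaptureRequest.CONTROL_AE_MODE,
            CameraMetadata.CONTROL_AE_MODE_ON_ALWAYS_FLASH);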
If someone can help me with this case, I'd really appreciate it.
This is just additional information on the above issue. I wish to draw some attention to this problem!
My application takes a photo every 5 seconds. I (1) select the camera, (2) acquire a session, and then with each loop I (3) create a capture request in which I set the flash mode and call the capture method on the session.
I have no issues with my Samsung SM-G550T (Android 6.0.1), but I was having some issues with the flash mode on my Moto G4 (Android 7.0). I got both phones to flash, but only with these settings:
CaptureRequest.Builder requestBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
requestBuilder.set(CaptureRequest.FLASH_MODE, CaptureRequest.FLASH_MODE_SINGLE);
I am presently having an issue with an LG device (M210N, Android 7.0). Using the settings I stated above, I get the device to flash just once. If I completely re-initialize the camera (as described above), the device will again flash only once.
If I add the CONTROL_AE_MODE_ON_ALWAYS_FLASH setting to the requestBuilder above, the LG does not flash at all, so I had to remove that flag.
I have tried many additional settings and combinations of settings, and none of them have eliminated this issue. I wonder how many devices are affected by it.
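For reference, a minimal sketch of the per-loop capture described above (mCameraDevice, mSession, mImageReader, mCaptureCallback and mBackgroundHandler are assumed to exist):

    private void captureWithFlash() throws CameraAccessException {
        CaptureRequest.Builder requestBuilder =
                mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
        requestBuilder.addTarget(mImageReader.getSurface());
        // The only flash setting that worked on both test phones
        requestBuilder.set(CaptureRequest.FLASH_MODE, CaptureRequest.FLASH_MODE_SINGLE);
        mSession.capture(requestBuilder.build(), mCaptureCallback, mBackgroundHandler);
    }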
Pictured above on the left is how it should look (captured with a Galaxy S6); on the right (captured with a Galaxy S7) is what I get when I use the Camera2 API on the S7. I'm doing computer vision work with OpenCV, so this glossy effect breaks it.
It seems the Camera2 API (the stock Samsung camera app is fine) produces some sort of undesired glossy effect on the Galaxy S7. I've tried both the plain Android Camera2 API and the Samsung Galaxy Camera SDK 1.1 (found at http://developer.samsung.com/galaxy#camera).
This doesn't happen when I use the deprecated Camera1 API, so the issue seems to be with the S7's HALv3. It also never happens on the Galaxy S6 or other devices (both Samsung and non-Samsung).
If you try any third-party camera app from the Play Store that uses Camera2, you should be able to reproduce this effect. I'm not sure SO is the best place to post this, but Samsung doesn't seem to be active on their own developer forums.
This is controlled by various CaptureRequest settings. The S7 perhaps ships different default settings for the Camera2 API, which is why it differs from the S6, but you should be able to get similar results by experimenting with the settings.
You should get rid of the "glow" by disabling EDGE_MODE, like this:

requestBuilder.set(CaptureRequest.EDGE_MODE, CaptureRequest.EDGE_MODE_OFF);
See https://developer.android.com/reference/android/hardware/camera2/CaptureRequest.html#EDGE_MODE for more settings and their descriptions.
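Since not every device supports every edge mode, a minimal sketch that checks first (characteristics and requestBuilder are assumed to exist):

    // Only disable edge enhancement if the device reports EDGE_MODE_OFF
    int[] edgeModes = characteristics.get(
            CameraCharacteristics.EDGE_AVAILABLE_EDGE_MODES);
    if (edgeModes != null) {
        for (int mode : edgeModes) {
            if (mode == CameraMetadata.EDGE_MODE_OFF) {
                requestBuilder.set(CaptureRequest.EDGE_MODE,
                        CaptureRequest.EDGE_MODE_OFF);
                break;
            }
        }
    }

Note that the setting only applies to the requests you set it on, so apply it to the repeating preview request as well if you want the preview to match.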
When I check the device camera parameter IsVideoSnapshotSupported, it is set to false despite my device seemingly supporting this functionality. This setting is used to detect if a photo can be taken while the device is recording a video. My device can natively do this, and I can successfully take a photo while recording within my app if I just ignore the IsVideoSnapshotSupported value.
I don't like the idea of blindly assuming the feature is supported by all devices using my app, but I am not able to determine why the value of this parameter is false. The only device I have tested on is a Samsung Galaxy S4 running Android 5.0.1. The application is built using Xamarin, targeting API 15.
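For reference, this maps to Camera.Parameters.isVideoSnapshotSupported() (API 14+) in the underlying Android API, which Xamarin exposes as the IsVideoSnapshotSupported property the question refers to. A minimal Java sketch of the guarded snapshot (camera is assumed to be the recording camera and jpegCallback a placeholder callback):

    Camera.Parameters params = camera.getParameters();
    // Reports whether takePicture() may be called while MediaRecorder is recording
    if (params.isVideoSnapshotSupported()) {
        camera.takePicture(null, null, jpegCallback);
    }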
I use the Android face detection API with Camera.FaceDetectionListener.
I have several tablets:
A Galaxy Tab 2 GT-P5110, Android 4.0.3 (kernel: 3.0.8-365113-user dpi@DELL155 #1)
A Galaxy Tab 2 GT-P5110, Android 4.1.1 (kernel: 3.0.31-523998 se.infra@SEP-98 #1)
A Galaxy Tab 3 GT-P5210, Android 4.2.2 (kernel: 3.4.34-1135839 se.infra@S0210-10 #1)
Detection works perfectly on the first, but on the second Tab 2, as soon as I turn on face detection, the preview flickers and detection is slow.
On the Tab 3, the listener is never invoked, as if it did not detect any faces.
I also tried this sample: http://developer.samsung.com/android/samples/Mad-Hatter-Face-Recogition
but I have the same issues.
Is there a known bug on Samsung devices?
Thanks
It is known that Samsung devices behave differently, so you have to treat each Samsung device carefully. You can take two steps to investigate this issue: (1) pass a bitmap directly to the detection API to check whether detection itself works; (2) check the preview FPS; you may have to set different parameters to improve camera performance.
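A minimal sketch of step 1 with the bitmap-based android.media.FaceDetector (the source bitmap is assumed to come from a saved preview frame or a test image; FaceDetector requires RGB_565 and an even width):

    // Convert to the format FaceDetector requires
    Bitmap rgb565 = source.copy(Bitmap.Config.RGB_565, false);
    FaceDetector detector =
            new FaceDetector(rgb565.getWidth(), rgb565.getHeight(), 5);
    FaceDetector.Face[] faces = new FaceDetector.Face[5];
    int found = detector.findFaces(rgb565, faces);
    Log.d("FaceTest", "faces found: " + found);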
I have developed an application that captures an image with the front camera, using a SurfaceView. It works fine on other phones, but not on Sony phones.
The logcat for the Sony phone says:
Permission failure:com.sonyericsson.permission.CAMERA_EXTENDED
I have included this permission as well, but it's still not working.
Thanks in advance.
This permission failure is only a warning; it may prevent using higher resolutions, though. Try 320x240 to begin with, and post more information if this attempt fails.
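A minimal sketch of forcing that resolution with the old Camera API, after confirming the device actually lists it (camera is assumed to be the opened front camera):

    Camera.Parameters params = camera.getParameters();
    // Only request 320x240 if the device reports it as supported
    for (Camera.Size size : params.getSupportedPreviewSizes()) {
        if (size.width == 320 && size.height == 240) {
            params.setPreviewSize(320, 240);
            camera.setParameters(params);
            break;
        }
    }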
For those who are working with the new camera HAL introduced in API 21:
I had the same issue on my Sony Xperia Z4 Tablet. The problem for me was that I was configuring the flash (to light the scene) with
CaptureRequest.Builder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON_AUTO_FLASH);
This device has no built-in flash, so after removing this setting it worked fine.
I got the same logcat output as you had.
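A safer approach is to check for a flash unit before requesting any flash mode; a minimal sketch, assuming cameraManager, cameraId and the request builder already exist (exception handling omitted):

    CameraCharacteristics characteristics =
            cameraManager.getCameraCharacteristics(cameraId);
    // FLASH_INFO_AVAILABLE reports whether the device has a flash unit at all
    Boolean hasFlash = characteristics.get(CameraCharacteristics.FLASH_INFO_AVAILABLE);
    if (hasFlash != null && hasFlash) {
        builder.set(CaptureRequest.CONTROL_AE_MODE,
                CaptureRequest.CONTROL_AE_MODE_ON_AUTO_FLASH);
    }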