I am using the Android camera2 API in my application to take continuous images. When I use camera2, the image preview is very dark compared to the stock camera app. I have seen this, but there is no similar requirement in that answer.
I tried to set the brightness in camera2 as suggested here:
Note that this control will only be effective if android.control.aeMode != OFF. This control will take effect even when android.control.aeLock == true.
captureRequestBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
captureRequestBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON);
captureRequestBuilder.set(CaptureRequest.CONTROL_AE_LOCK, true);
captureRequestBuilder.set(CaptureRequest.CONTROL_AE_EXPOSURE_COMPENSATION, 6);
But it is still showing a dark preview, as shown below.
See the difference here:
Original Camera:
Using Camera2:
Also, what value do I need to pass as the second parameter in:
captureRequestBuilder.set(CaptureRequest.CONTROL_AE_EXPOSURE_COMPENSATION, 6);
I kept 6 because of what is suggested in the docs:
For example, if the exposure value (EV) step is 0.333, '6' will mean an exposure compensation of +2 EV; -3 will mean an exposure compensation of -1 EV.
But there is still no effect on the brightness.
Here it is:
Add the code below in onConfigured() and unlockFocus():
captureRequestBuilder.set(CaptureRequest.CONTROL_AE_TARGET_FPS_RANGE, getRange());
Using the above code you will get a better preview, but your captured picture will remain as it is. To get a better picture as well, use the same code below in captureStillPicture():
captureBuilder.set(CaptureRequest.CONTROL_AE_TARGET_FPS_RANGE, getRange());
where getRange() is:
private Range<Integer> getRange() {
    CameraCharacteristics chars = null;
    try {
        chars = mCameraManager.getCameraCharacteristics(mCameraId);
        Range<Integer>[] ranges = chars.get(CameraCharacteristics.CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES);
        Range<Integer> result = null;
        for (Range<Integer> range : ranges) {
            int upper = range.getUpper();
            // 10 - min range upper for my needs
            if (upper >= 10) {
                if (result == null || upper < result.getUpper().intValue()) {
                    result = range;
                }
            }
        }
        if (result == null) {
            result = ranges[0];
        }
        return result;
    } catch (CameraAccessException e) {
        e.printStackTrace();
        return null;
    }
}
CONTROL_AE_LOCK should be off. You have misinterpreted the doc; admittedly, the document itself is a bit confusing.
Note that this control will only be effective if android.control.aeMode != OFF. This control will take effect even when android.control.aeLock == true.
What it means is that when AE lock is ON, the exposure compensation will be applied to the locked exposure and not to the instantaneous exposure at the time the picture is taken.
Even in your repeating request the exposure is locked, so the compensation doesn't help.
Remove AE lock and it should work.
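For reference, a minimal sketch of the corrected request settings, reusing the builder from the question (the compensation index 6 is still only the doc's example value and should be derived from your device's compensation step):
// Keep auto-exposure ON but NOT locked, so the compensation below can take effect.
captureRequestBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON);
captureRequestBuilder.set(CaptureRequest.CONTROL_AE_LOCK, false);
// Device-dependent: 6 means +2 EV only when the compensation step is 1/3 EV.
captureRequestBuilder.set(CaptureRequest.CONTROL_AE_EXPOSURE_COMPENSATION, 6);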
When setting CONTROL_AE_EXPOSURE_COMPENSATION, the second parameter is, as the docs define it, relative to CameraCharacteristics.CONTROL_AE_COMPENSATION_STEP:
The adjustment is measured as a count of steps, with the step size defined by android.control.aeCompensationStep and the allowed range by android.control.aeCompensationRange.
The value of 6 for +2 EV is correct only when the step is 0.333, which is just an example.
The following code will give you the exposure compensation value to use for +2 EV:
CameraManager manager = (CameraManager)this.getSystemService(Context.CAMERA_SERVICE);
CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraId);
double exposureCompensationSteps = characteristics.get(CameraCharacteristics.CONTROL_AE_COMPENSATION_STEP).doubleValue();
int exposureCompensation = (int)( 2.0 / exposureCompensationSteps );
I would also suggest checking that the value is within the range specified by CameraCharacteristics.CONTROL_AE_COMPENSATION_RANGE, as in the sketch below.
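A small sketch of that check, continuing from the characteristics object above and clamping the computed index into the supported range (the request builder name is assumed from the question):
Range<Integer> compensationRange = characteristics.get(CameraCharacteristics.CONTROL_AE_COMPENSATION_RANGE);
if (compensationRange != null) {
    // Clamp the computed index so the device does not silently ignore it.
    exposureCompensation = Math.max(compensationRange.getLower(),
            Math.min(exposureCompensation, compensationRange.getUpper()));
}
captureRequestBuilder.set(CaptureRequest.CONTROL_AE_EXPOSURE_COMPENSATION, exposureCompensation);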
You can try this:
public void setBrightness(int value) {
    int brightness = (int) (minCompensationRange + (maxCompensationRange - minCompensationRange) * (value / 100f));
    previewRequestBuilder.set(CaptureRequest.CONTROL_AE_EXPOSURE_COMPENSATION, brightness);
    applySettings();
}

private void applySettings() {
    try {
        captureSession.setRepeatingRequest(previewRequestBuilder.build(), null, null);
    } catch (CameraAccessException e) {
        e.printStackTrace();
    }
}
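The minCompensationRange and maxCompensationRange fields are not shown in that snippet; one plausible way to populate them, assuming a cameraManager and cameraId are already available, is from CONTROL_AE_COMPENSATION_RANGE:
// Read the supported compensation index range once, e.g. when opening the camera.
// (Handle CameraAccessException as appropriate.)
CameraCharacteristics characteristics = cameraManager.getCameraCharacteristics(cameraId);
Range<Integer> range = characteristics.get(CameraCharacteristics.CONTROL_AE_COMPENSATION_RANGE);
if (range != null) {
    minCompensationRange = range.getLower();
    maxCompensationRange = range.getUpper();
}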
I messed around with CaptureRequest.SENSOR_SENSITIVITY and it worked great on my Samsung s3, s7 and s8 phones.
You can get the CameraCharacteristics.SENSOR_INFO_SENSITIVITY_RANGE
sensitivity_range = chars.get(CameraCharacteristics.SENSOR_INFO_SENSITIVITY_RANGE);
On my S7, the range is from the mid 50s to more than 3000. I then set it to 1500 as follows:
mCaptureRequest.set(CaptureRequest.SENSOR_SENSITIVITY, 1500);
It brightened the preview by a few factors.
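For completeness: per the camera2 docs, SENSOR_SENSITIVITY is only guaranteed to take effect when auto-exposure is off, although some devices appear to honor it anyway. A minimal manual-ISO sketch, assuming chars and the request builder from above (called mCaptureRequest as in the answer):
// Disable auto-exposure so the manual ISO is not overridden by the AE algorithm.
// Note: with AE off you may also need to set SENSOR_EXPOSURE_TIME yourself.
mCaptureRequest.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_OFF);

// Clamp the desired ISO to the range the sensor reports.
Range<Integer> sensitivityRange = chars.get(CameraCharacteristics.SENSOR_INFO_SENSITIVITY_RANGE);
int iso = 1500;
if (sensitivityRange != null) {
    iso = Math.max(sensitivityRange.getLower(), Math.min(iso, sensitivityRange.getUpper()));
}
mCaptureRequest.set(CaptureRequest.SENSOR_SENSITIVITY, iso);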
First, don't lock autoexposure - that's not needed when adjusting exposure compensation.
Second, did you call CameraCaptureSession.setRepeatingRequest with your new capture request?
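For the second point, a minimal sketch of pushing the updated builder to the session (the session, callback and handler names are assumptions):
try {
    // The changed AE settings only apply once the session repeats the updated request.
    mCaptureSession.setRepeatingRequest(captureRequestBuilder.build(), mCaptureCallback, mBackgroundHandler);
} catch (CameraAccessException e) {
    e.printStackTrace();
}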
In my case, I don't need to show a preview to the user and would like to capture the image from a service. To achieve this I have used ImageFormat.JPEG to capture the images, but the output images are very dark. I have tried this link on StackOverflow, but it is not working.
val streamConfigurationMap =
    mCameraCharacteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP) // Available stream configurations.
mPreviewSize = streamConfigurationMap!!.getOutputSizes(ImageFormat.JPEG)[0]
mCameraID = cameraId
mImageReader =
    ImageReader.newInstance(mPreviewSize!!.width, mPreviewSize!!.height, ImageFormat.JPEG, 1)
mImageReader!!.setOnImageAvailableListener(onImageAvailable, mBackgroundHandler)
If I use a dummy SurfaceTexture, I get the error below a few seconds after the app launches:
E/BufferQueueProducer: [SurfaceTexture-1-20857-1] cancelBuffer: BufferQueue has been abandoned
First of all, you don't have to use a TextureView. The reason your image is really dark is probably your CaptureRequest.Builder: you want to control the auto-exposure, which I explain below.
First, when you set your surface, you should set it like this:
builder.addTarget(mImageReader.getSurface());
Now on to the brightness issue, you can control your AE like this:
builder.set(CaptureRequest.CONTROL_AE_TARGET_FPS_RANGE, getRange());
where getRange() is:
private Range<Integer> getRange() {
    CameraCharacteristics chars = null;
    try {
        CameraManager manager = (CameraManager) ((Activity) getContext()).getSystemService(Context.CAMERA_SERVICE);
        chars = manager.getCameraCharacteristics(mCameraId);
        Range<Integer>[] ranges = chars.get(CameraCharacteristics.CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES);
        Range<Integer> result = null;
        for (Range<Integer> range : ranges) {
            int upper = range.getUpper();
            // 10 - min range upper for my needs
            if (upper >= 10) {
                if (result == null || upper < result.getUpper().intValue()) {
                    result = range;
                }
            }
        }
        if (result == null) {
            result = ranges[0];
        }
        return result;
    } catch (CameraAccessException e) {
        e.printStackTrace();
        return null;
    }
}
My ImageReader is configured like this:
mImageReader = ImageReader.newInstance(hardcoded_width, hardcoded_height, ImageFormat.YUV_420_888, 2);
mImageReader.setOnImageAvailableListener(mVideoCapture, mBackgroundHandler);
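Since there is no TextureView in this setup, one way to wire it together (a sketch under the assumption that mCameraDevice, mImageReader and mBackgroundHandler already exist) is to create the capture session with only the ImageReader surface and repeat a request that targets it, which also avoids the abandoned-SurfaceTexture error from the question:
List<Surface> outputs = Collections.singletonList(mImageReader.getSurface());
mCameraDevice.createCaptureSession(outputs, new CameraCaptureSession.StateCallback() {
    @Override
    public void onConfigured(CameraCaptureSession session) {
        try {
            CaptureRequest.Builder builder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
            builder.addTarget(mImageReader.getSurface());
            // Let auto-exposure converge using a sensible FPS range (see getRange() above).
            builder.set(CaptureRequest.CONTROL_AE_TARGET_FPS_RANGE, getRange());
            session.setRepeatingRequest(builder.build(), null, mBackgroundHandler);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }

    @Override
    public void onConfigureFailed(CameraCaptureSession session) {
        Log.e("Camera", "Capture session configuration failed");
    }
}, mBackgroundHandler);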
If you want to know more about custom brightness etc., check this out.
I am trying to use Camera2 to allow an app to take a simple picture. I managed to get a working sample using the android-Camera2Basic sample code; the problem is that the camera preview is very dark (the same problem as this other question). Following some answers, I did find a proper FPS range of [15, 15], and setting it in the lockFocus() method allows the app to get a clear picture from the camera with correct brightness:
private void lockFocus() {
    try {
        // This is how to tell the camera to lock focus.
        mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER, CameraMetadata.CONTROL_AF_TRIGGER_START);
        mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_TARGET_FPS_RANGE, Range.create(15, 15));
        // Tell #mCaptureCallback to wait for the lock.
        mState = STATE_WAITING_LOCK;
        mCaptureSession.capture(mPreviewRequestBuilder.build(), mCaptureCallback, mBackgroundHandler);
    } catch (CameraAccessException e) {
        e.printStackTrace();
    }
}
But the preview before taking the picture is still very dark. I tried to set the same line of code in other parts of the sample, but it is not working. How can I fix it in order to get the same results in the preview? I am working with a Samsung SM-P355M tablet.
Using an FPS range with equal lower and upper bounds, such as [15,15], [30,30], etc., puts a constraint on the AE algorithm regarding how much it can adjust to light changes, which may therefore produce dark results. Such ranges are meant for video recording, to maintain a constant FPS. For photos you need to find a range with a wide spread between the lower and upper bound, such as [7,30], [15,25], etc.
The following method can help you find the optimal FPS range. Take into account that it is meant for photos and not video recording, as it discards FPS ranges with equal lower and upper bounds.
(Adjust MIN_FPS_RANGE and MAX_FPS_RANGE to your requirements)
@Nullable
public static Range<Integer> getOptimalFpsRange(@NonNull final CameraCharacteristics characteristics) {
    final int MIN_FPS_RANGE = 0;
    final int MAX_FPS_RANGE = 30;

    final Range<Integer>[] rangeList = characteristics.get(CameraCharacteristics.CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES);
    if ((rangeList == null) || (rangeList.length == 0)) {
        Log.e(TAG, "Failed to get FPS ranges.");
        return null;
    }

    Range<Integer> result = null;
    for (final Range<Integer> entry : rangeList) {
        int candidateLower = entry.getLower();
        int candidateUpper = entry.getUpper();
        if (candidateUpper > 1000) {
            Log.w(TAG, "Device reports FPS ranges in a 1000 scale. Normalizing.");
            candidateLower /= 1000;
            candidateUpper /= 1000;
        }

        // Discard candidates with equal or out-of-range bounds
        final boolean discard = (candidateLower == candidateUpper)
                || (candidateLower < MIN_FPS_RANGE)
                || (candidateUpper > MAX_FPS_RANGE);

        if (discard == false) {
            // Update if none resolved yet, or if the candidate's upper bound
            // and spread are both >= those of the current result
            final boolean update = (result == null)
                    || ((candidateUpper >= result.getUpper()) && ((candidateUpper - candidateLower) >= (result.getUpper() - result.getLower())));

            if (update == true) {
                result = Range.create(candidateLower, candidateUpper);
            }
        }
    }
    return result;
}
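A short usage sketch, applying the result to the Camera2Basic-style preview builder (the field names here are assumptions):
Range<Integer> fpsRange = getOptimalFpsRange(mCameraCharacteristics);
if (fpsRange != null) {
    // Apply the wide-spread range so AE can adapt to the scene brightness.
    mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_TARGET_FPS_RANGE, fpsRange);
}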
After lots of research it seems there is no easy fix for this (at least not with our hardware), so we implemented a new version of the camera activities, this time using the deprecated Camera API, and everything works as expected. Not really a clean solution, but so far it works for me.
I am trying to set the best possible output picture size on my Camera object, so that I can get a perfectly downscaled sample image and display it.
During debugging I observed that I am setting the output picture size exactly to my screen dimensions, but when I decode the bounds of the image returned by the camera, I get some larger numbers!
Also, I am not setting my display dimensions as the expected output picture size. The code used to calculate and set the output picture size is given below.
I am using this code for devices with API level < 21, so using the old Camera API shouldn't be a problem.
I don't have any idea why I am getting this behavior. Thanks in advance for your help!
Defining the Camera parameters:
Camera.Parameters parameters = mCamera.getParameters();
setOutputPictureSize(parameters.getSupportedPictureSizes(), parameters); //update parameters in this function.
//set the modified parameters back to mCamera
mCamera.setParameters(parameters);
Optimal picture size calculation
private void setOutputPictureSize(List<Camera.Size> availablePicSize, Camera.Parameters parameters)
{
    if (availablePicSize != null) {
        int bestScore = (1 << 30); // set an impossible value.
        Camera.Size bestPictureSize = null;
        for (Camera.Size pictureSize : availablePicSize) {
            int curScore = calcOutputScore(pictureSize); // calculate score of the current picture size
            if (curScore < bestScore) { // update best picture size
                bestScore = curScore;
                bestPictureSize = pictureSize;
            }
        }
        if (bestPictureSize != null) {
            parameters.setPictureSize(bestPictureSize.width, bestPictureSize.height);
        }
    }
}

//calculates the score of a target picture size compared to the screen dimensions.
//scores are non-negative, where 0 is the best score.
private int calcOutputScore(Camera.Size pictureSize)
{
    Point displaySize = AppData.getDiaplaySize();
    int score = (1 << 30); // set an impossible value.
    if (pictureSize.height < displaySize.x || pictureSize.width < displaySize.y) {
        return score; // return the worst possible score.
    }
    for (int i = 1; ; ++i) {
        if (displaySize.x * i > pictureSize.height || displaySize.y * i > pictureSize.width) {
            break;
        }
        score = Math.min(score, Math.max(pictureSize.height - displaySize.x * i, pictureSize.width - displaySize.y * i));
    }
    return score;
}
Finally, I resolved the issue after many attempts! Below are my findings:
Step 1. If we are already previewing, call mCamera.stopPreview()
Step 2. Set modified parameters by calling mCamera.setParameters(...)
Step 3. Start previewing again, call mCamera.startPreview()
If I call mCamera.setParameters() without stopping the preview (assuming the camera is previewing), the camera seems to ignore the updated parameters.
I came up with this solution after several trials and errors. If anyone knows a better way to update parameters during preview, please share. A minimal sketch of the sequence is shown below.
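A minimal sketch of that sequence, assuming mCamera is already previewing and parameters has been modified as above:
// Some devices ignore setParameters() while the preview is running,
// so restart the preview around the update.
mCamera.stopPreview();
mCamera.setParameters(parameters);
mCamera.startPreview();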
I am now working on a business card reader application, and I want to change the camera image brightness while the camera is open/on. Please tell me which camera parameter I need to set.
Thanks in advance.
If you're targeting API Level 8 and higher, you could look at the camera parameters, specifically at white balance and exposure. You should play a bit with them to find the correct settings for your needs.
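A small sketch of playing with those two parameters through the old Camera API (the mode and values here are only examples; check what the device reports as supported first):
Camera.Parameters params = mCamera.getParameters();

// White balance: only set modes the device actually reports as supported.
List<String> wbModes = params.getSupportedWhiteBalance();
if (wbModes != null && wbModes.contains(Camera.Parameters.WHITE_BALANCE_DAYLIGHT)) {
    params.setWhiteBalance(Camera.Parameters.WHITE_BALANCE_DAYLIGHT);
}

// Exposure compensation: positive values brighten, within the reported range.
if (params.getMaxExposureCompensation() > 0) {
    params.setExposureCompensation(params.getMaxExposureCompensation());
}

mCamera.setParameters(params);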
White balance is not supported on all devices; on those devices the white balance value stays on auto, so we can't increase or decrease it.
All devices support exposure compensation. The default value of the exposure is 0.
We are able to get the maximum and minimum values from the Camera API like this:
public void setExposureCompensation(int value) {
    Camera.Parameters camParams = mCamera.getParameters();
    int minExpCom = camParams.getMinExposureCompensation();
    int maxExpCom = camParams.getMaxExposureCompensation();
    //Log.i(TAG, "minExpCom : " + minExpCom);
    //Log.i(TAG, "maxExpCom : " + maxExpCom);
    if (maxExpCom > 0 && value <= maxExpCom && value >= minExpCom) {
        camParams.setExposureCompensation(value);
        mCamera.setParameters(camParams);
    }
}
We manipulate the exposure value between the minimum and maximum exposure compensation.
This is the only option supported on all devices for controlling the brightness of the camera.
You can see my code to change the camera exposure value:
// set Camera Exposure value from input progress (0.0f - 1.0f)
void setEV(float progress) {
    if (progress < 0.0f || progress > 1.0f) return; // reject values outside [0, 1]
    params = mCamera.getParameters();
    int min = params.getMinExposureCompensation(); // -3 on my phone
    int max = params.getMaxExposureCompensation(); // 3 on my phone
    float realProgress = progress - 0.5f;
    int value;
    if (realProgress < 0) {
        value = -(int) (realProgress * 2 * min);
    } else {
        value = (int) (realProgress * 2 * max);
    }
    // if changed
    if (value != params.getExposureCompensation()) {
        params.setExposureCompensation(value);
        mCamera.setParameters(params);
    }
}
I have a problem with the Samsung Galaxy SIII. In the app we are creating, we use a MediaRecorder to record a video of the user with the front camera. I have looked thoroughly in the documentation and all over the forums, and I have seen a few similar posts for the SII or about crashes in general, but those fixes unfortunately did not work for us.
The recording process is as follows: a function (code provided below) checks each device's compatible camera resolutions, then we check whether any of them meet our specification (currently 480p or less). If one meets this criterion, we use that size for setVideoSize() (the function provided by Android to set the recording size of a video). This seems pretty trivial and looks like it should work with almost any device, and the code does work on a couple of different devices we've tested (e.g. Galaxy S4 and Galaxy Stellar). But for some reason the SIII is being very difficult. When you record on the SIII at any resolution lower than 720p, the video becomes corrupt and plays back as a multicolored screen (screenshots in the link below). Why not just record the video at 720p or higher, then? Unfortunately, we need lower video sizes so that the transfer is not such a heavy data load over a cell phone provider's network.
So my question is: why does recording corrupt the video at any resolution lower than 720p, when the resolution used is pulled from a list of device-supported resolutions?
This is the function to pull supported resolutions from the device.
public Camera.Size getSupportedRecordingSizes() {
    Camera.Size result = null;
    Camera.Parameters params = camera.getParameters();
    List<Size> sizes = params.getSupportedPictureSizes();
    for (Size s : sizes) {
        if (s.height < 481 && s.width < 721) {
            if (result == null) {
                result = s;
            } else {
                int resultVideoSize = result.width * result.height;
                int newVideoSize = s.width * s.height;
                if (newVideoSize > resultVideoSize) {
                    result = s;
                }
            }
        }
    }
    if (!sizes.isEmpty() && result == null) {
        Context context = getApplicationContext();
        CharSequence text = "Used default first value";
        int duration = Toast.LENGTH_SHORT;
        Toast toast = Toast.makeText(context, text, duration);
        toast.show();
        for (Size size : sizes) {
            if (result == null) {
                result = size;
            }
            int previousSize = result.width * result.height;
            int newSize = size.width * size.height;
            if (newSize < previousSize) {
                result = size;
            }
        }
    }
    return (result);
}
This is the code for our mediaRecorder (shortened for simplicity)
mediaRecorder = new MediaRecorder();
setCameraDisplayOrientaion();
//sets devices supported video size less than or equal to 480p (720x480 resolution).
Camera.Size vidSize = getSupportedRecordingSizes();
camera.unlock();
mediaRecorder.setCamera(camera);
mediaRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
mediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
mediaRecorder.setOutputFormat(videoFormat);
mediaRecorder.setAudioEncoder(audioEncoder);
mediaRecorder.setVideoEncoder(videoEncoder);
mediaRecorder.setOutputFile(fullQuestionPath);
if (vidSize == null) {
    mediaRecorder.setVideoSize(480, 360);
} else {
    mediaRecorder.setVideoSize(vidSize.width, vidSize.height);
}
mediaRecorder.setVideoFrameRate(videoFrameRate);
mediaRecorder.setPreviewDisplay(surfaceHolder.getSurface());
//set the bitrate manually if possible.
if (android.os.Build.VERSION.SDK_INT > 7) {
    mediaRecorder.setVideoEncodingBitRate(videoBitrate);
}
try {
    mediaRecorder.prepare();
    mediaRecorder.start();
} catch (Exception e) {
    Log.e(ResponseActivity.class.toString(), e.getMessage());
    releaseMediaRecorder();
}
The images that correlate to this problem are here http://imgur.com/a/8F7Tb (Sorry about not posting earlier.)
The order of these images is as follows: 1) Before recording, the preview is running. 2) Recording, with the preview showing the current recording. 3) Stop recording; both the recording and the preview have stopped. 4) Playback, which is where the issue is: it shows the multicolored corrupt image, which is the same if you pull the file from the device directly.
EDIT: Note that I have also tried using a CamcorderProfile and setting the quality to low or high. Setting QUALITY_HIGH forces 720p, at which recording works, but QUALITY_LOW, despite everyone else having quite the opposite problem, does not work for me.
EDIT: Does anyone have an idea to point me in the right direction?