Issue:
The camera preview is different from the image captured using the Camera2 API, and the problem occurs only in landscape mode.
Requirement:
I need to capture an image in landscape mode using the Camera2 API. The camera preview should be full screen.
I have followed this GitHub sample:
https://github.com/googlesamples/android-Camera2Basic
This sample works fine in portrait mode as well as in landscape mode when the TextureView is wrap_content, since the aspect ratio is maintained.
But to display the camera preview full screen, I changed the TextureView to match_parent. By doing that, the output changed: the camera preview is now different from the image captured.
Please check the images attached here.
1. Camera preview: screenshot of the camera preview
2. Image captured: on tapping the Picture button
Following is my code snippet:
fragment_camera2_basic.xml
<RelativeLayout
    xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent">

    <com.example.android.camera2basic.AutoFitTextureView
        android:id="@+id/texture"
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        android:layout_alignParentStart="true"
        android:layout_alignParentTop="true" />

    <FrameLayout
        android:id="@+id/control"
        android:layout_width="match_parent"
        android:layout_height="112dp"
        android:layout_alignParentBottom="true"
        android:layout_alignParentStart="true"
        android:background="@color/control_background">

        <Button
            android:id="@+id/picture"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:layout_gravity="center"
            android:text="@string/picture" />

        <ImageButton
            android:id="@+id/info"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:layout_gravity="center_vertical|right"
            android:padding="20dp"
            android:src="@drawable/ic_action_info" />

    </FrameLayout>

</RelativeLayout>
Following is the code to capture the image:
/**
 * Initiate a still image capture.
 */
private void takePicture() {
    lockFocus();
}

/**
 * Lock the focus as the first step for a still image capture.
 */
private void lockFocus() {
    try {
        // This is how to tell the camera to lock focus.
        mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER,
                CameraMetadata.CONTROL_AF_TRIGGER_START);
        // Tell #mCaptureCallback to wait for the lock.
        mState = STATE_WAITING_LOCK;
        mCaptureSession.capture(mPreviewRequestBuilder.build(), mCaptureCallback,
                mBackgroundHandler);
    } catch (CameraAccessException e) {
        e.printStackTrace();
    }
}

/**
 * Run the precapture sequence for capturing a still image. This method should be called when
 * we get a response in {@link #mCaptureCallback} from {@link #lockFocus()}.
 */
private void runPrecaptureSequence() {
    try {
        // This is how to tell the camera to trigger.
        mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER,
                CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER_START);
        // Tell #mCaptureCallback to wait for the precapture sequence to be set.
        mState = STATE_WAITING_PRECAPTURE;
        mCaptureSession.capture(mPreviewRequestBuilder.build(), mCaptureCallback,
                mBackgroundHandler);
    } catch (CameraAccessException e) {
        e.printStackTrace();
    }
}

/**
 * Capture a still picture. This method should be called when we get a response in
 * {@link #mCaptureCallback} from both {@link #lockFocus()}.
 */
private void captureStillPicture() {
    try {
        final Activity activity = getActivity();
        if (null == activity || null == mCameraDevice) {
            return;
        }
        // This is the CaptureRequest.Builder that we use to take a picture.
        final CaptureRequest.Builder captureBuilder =
                mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
        captureBuilder.addTarget(mImageReader.getSurface());

        // Use the same AE and AF modes as the preview.
        captureBuilder.set(CaptureRequest.CONTROL_AF_MODE,
                CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
        setAutoFlash(captureBuilder);

        // Orientation
        int rotation = activity.getWindowManager().getDefaultDisplay().getRotation();
        captureBuilder.set(CaptureRequest.JPEG_ORIENTATION, getOrientation(rotation));

        CameraCaptureSession.CaptureCallback CaptureCallback
                = new CameraCaptureSession.CaptureCallback() {

            @Override
            public void onCaptureCompleted(@NonNull CameraCaptureSession session,
                                           @NonNull CaptureRequest request,
                                           @NonNull TotalCaptureResult result) {
                showToast("Saved: " + mFile);
                Log.d(TAG, mFile.toString());
                unlockFocus();
            }
        };

        mCaptureSession.stopRepeating();
        mCaptureSession.abortCaptures();
        mCaptureSession.capture(captureBuilder.build(), CaptureCallback, null);
    } catch (CameraAccessException e) {
        e.printStackTrace();
    }
}

/**
 * Retrieves the JPEG orientation from the specified screen rotation.
 *
 * @param rotation The screen rotation.
 * @return The JPEG orientation (one of 0, 90, 180, and 270)
 */
private int getOrientation(int rotation) {
    // Sensor orientation is 90 for most devices, or 270 for some devices (eg. Nexus 5X)
    // We have to take that into account and rotate JPEG properly.
    // For devices with orientation of 90, we simply return our mapping from ORIENTATIONS.
    // For devices with orientation of 270, we need to rotate the JPEG 180 degrees.
    return (ORIENTATIONS.get(rotation) + mSensorOrientation + 270) % 360;
}

/**
 * Unlock the focus. This method should be called when still image capture sequence is
 * finished.
 */
private void unlockFocus() {
    try {
        // Reset the auto-focus trigger
        mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER,
                CameraMetadata.CONTROL_AF_TRIGGER_CANCEL);
        setAutoFlash(mPreviewRequestBuilder);
        mCaptureSession.capture(mPreviewRequestBuilder.build(), mCaptureCallback,
                mBackgroundHandler);
        // After this, the camera will go back to the normal state of preview.
        mState = STATE_PREVIEW;
        mCaptureSession.setRepeatingRequest(mPreviewRequest, mCaptureCallback,
                mBackgroundHandler);
    } catch (CameraAccessException e) {
        e.printStackTrace();
    }
}
I tried setting CaptureRequest.SCALER_CROP_REGION on both the preview request builder and the still-capture (ImageReader) request as well, but it did not work as expected.
The TextureView is probably partially outside the display, in terms of the layout, so it's being cut off. You may want to confirm that with Android Studio's layout tools.
In general, you can't get identical images unless you match aspect ratios, so if you want a full-screen preview, you'll have to select a JPEG aspect ratio that matches the screen.
That may not be available directly from the camera, so you may need to crop the JPEG yourself. But most likely you can at least get a 16:9 preview and still capture relatively easily, which will have relatively small black bars compared to the 4:3 maximum still-capture size.
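As a rough sketch of that suggestion (reusing the Camera2Basic names such as characteristics, mImageReader and CompareSizesByArea from the sample's setUpCameraOutputs, which are assumptions about your setup), you could pick the largest 16:9 JPEG size instead of the overall largest size:

// Sketch: pick the largest JPEG output size with a 16:9 aspect ratio so the
// still capture matches a 16:9 full-screen preview. Falls back to the overall
// largest size if no 16:9 size is offered.
StreamConfigurationMap map = characteristics.get(
        CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
Size[] jpegSizes = map.getOutputSizes(ImageFormat.JPEG);
Size chosen = null;
for (Size size : jpegSizes) {
    if (size.getWidth() * 9 == size.getHeight() * 16) {  // 16:9 check
        if (chosen == null || (long) size.getWidth() * size.getHeight()
                > (long) chosen.getWidth() * chosen.getHeight()) {
            chosen = size;
        }
    }
}
if (chosen == null) {
    chosen = Collections.max(Arrays.asList(jpegSizes), new CompareSizesByArea());
}
mImageReader = ImageReader.newInstance(chosen.getWidth(), chosen.getHeight(),
        ImageFormat.JPEG, /*maxImages*/ 2);

Even with a matching aspect ratio, the preview only lines up with the JPEG if the TextureView itself isn't stretched, so the transform set in configureTransform still matters.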
Related
I have an android application that takes a photo and then displays the image. On my device, which I originally developed the app on, the image capture behaves as expected. However, when I have tried running it on other devices, on some devices it seems that the image is rotated 90 degrees. I have been able to determine that this is not an issue with the image preview, and that the image itself is rotated. The code for the image capture is here:
public void takePicture() {
    if (null == cameraDevice) {
        return;
    }
    try {
        System.out.println("Taking Picture");
        getCameraCharacteristics();
        ImageReader reader = ImageReader.newInstance(1920, 1440, ImageFormat.JPEG, 1);
        //ImageReader reader = ImageReader.newInstance(camera_width, camera_height, ImageFormat.RAW_SENSOR, 1);
        List<Surface> outputSurfaces = buildOutputSurfaces(reader);

        final CaptureRequest.Builder captureBuilder =
                cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
        captureBuilder.addTarget(reader.getSurface());
        captureBuilder.set(CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_MODE_AUTO);

        // Orientation
        int rotation = parent.getWindowManager().getDefaultDisplay().getRotation();
        captureBuilder.set(CaptureRequest.JPEG_ORIENTATION, ORIENTATIONS.get(rotation));

        ImageReader.OnImageAvailableListener readerListener = reader1 -> getImageFromBuffer(reader1);
        reader.setOnImageAvailableListener(readerListener, mBackgroundHandler);

        final CameraCaptureSession.CaptureCallback captureListener = new CameraCaptureSession.CaptureCallback() {
            @Override
            public void onCaptureCompleted(CameraCaptureSession session, CaptureRequest request, TotalCaptureResult result) {
                super.onCaptureCompleted(session, request, result);
                createCameraPreview();
            }
        };

        cameraDevice.createCaptureSession(outputSurfaces, new CameraCaptureSession.StateCallback() {
            @Override
            public void onConfigured(CameraCaptureSession session) {
                try {
                    session.capture(captureBuilder.build(), captureListener, mBackgroundHandler);
                } catch (CameraAccessException e) {
                    e.printStackTrace();
                }
            }

            @Override
            public void onConfigureFailed(CameraCaptureSession session) {
            }
        }, mBackgroundHandler);
    } catch (CameraAccessException e) {
        e.printStackTrace();
    }
}
Regardless of the device, the value for rotation is always 0. I have tried manually setting JPEG_ORIENTATION to different values, but it does not seem to make a difference.
I have seen other StackOverflow questions with similar issues, but the fixes in those questions did not seem to make a difference here.
Can anyone suggest what might be causing this?
EDIT: to add some more details about the requirements for the app. The issue isn't just with displaying the image but with handling it afterwards. The user has to select a point in the image, and then the pair of point and image is sent to a server for processing. As a result, I need the orientation of the underlying image to be consistent between devices; it's not enough to simply compensate when displaying the image.
Unfortunately, I can't switch my application over to using a camera intent for image capture, as the application needs to be able to observe behaviour during photo capture and provide continuous feedback.
Use Glide to load and display your taken picture:
Glide.with(context).load(imageUri).into(imageView)
Demo:
https://youtu.be/tPwr2yYxlA4
Helpful reading:
Captured image will be displayed horizontally:
https://stackoverflow.com/a/47630783/3466808
Okay, I found a solution to this issue from a blog post here. Essentially, rather than relying on setting the JPEG rotation in the capture builder, you compute it yourself, incorporating the sensor orientation to determine how many degrees the image has to be rotated by.
// Orientation
int deviceRotation = parent.getWindowManager().getDefaultDisplay().getRotation();
int surfaceRotation = ORIENTATIONS.get(deviceRotation);
jpegOrientation = (surfaceRotation + sensorOrientation + 270) % 360;
I then decode the image into a bitmap, rotate it by the computed value, and then encode it back into a ByteArray.
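For reference, a minimal sketch of that decode-rotate-re-encode step (the jpegBytes name and the JPEG quality value are assumptions, not part of the original answer):

// Sketch: rotate the captured JPEG bytes by the computed jpegOrientation.
// Assumes jpegBytes holds the data read from the ImageReader's Image buffer;
// a quality of 100 is arbitrary and costs extra file size.
Bitmap source = BitmapFactory.decodeByteArray(jpegBytes, 0, jpegBytes.length);
Matrix matrix = new Matrix();
matrix.postRotate(jpegOrientation);
Bitmap rotated = Bitmap.createBitmap(source, 0, 0,
        source.getWidth(), source.getHeight(), matrix, true);
ByteArrayOutputStream out = new ByteArrayOutputStream();
rotated.compress(Bitmap.CompressFormat.JPEG, 100, out);
byte[] rotatedJpegBytes = out.toByteArray();

Note that re-encoding discards the original EXIF metadata, which may or may not matter for the server-side processing.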
I'm unable to get touch-to-focus to work properly with the Camera2 API. On touching, the camera seems to focus for a second and then the preview becomes extremely blurred. The phone is a Nexus 5X. Here is my code for touch-to-focus.
private void refocus(MotionEvent event, View view) {
    // Handler for autofocus callback
    CameraCaptureSession.CaptureCallback captureCallbackHandler = new CameraCaptureSession.CaptureCallback() {
        @Override
        public void onCaptureCompleted(CameraCaptureSession session, CaptureRequest request, TotalCaptureResult result) {
            super.onCaptureCompleted(session, request, result);
            if (request.getTag() == "FOCUS_TAG") {
                // The focus trigger is complete -
                // resume repeating (preview surface will get frames), clear AF trigger
                previewRequest.set(CaptureRequest.CONTROL_AF_TRIGGER, null);
                try {
                    mSession.setRepeatingRequest(previewRequest.build(), null, null);
                } catch (Exception e) {
                }
            }
        }

        @Override
        public void onCaptureFailed(CameraCaptureSession session, CaptureRequest request, CaptureFailure failure) {
            super.onCaptureFailed(session, request, failure);
            Log.e(TAG, "Manual AF failure: " + failure);
        }
    };

    try {
        final Rect sensorArraySize = manager.getCameraCharacteristics(mCameraDevice.getId())
                .get(CameraCharacteristics.SENSOR_INFO_ACTIVE_ARRAY_SIZE);

        // Find area size
        int x = (int) (event.getX() / (float) view.getWidth() * (float) sensorArraySize.width());
        int y = (int) (event.getY() / (float) view.getHeight() * (float) sensorArraySize.height());
        final int halfTouchWidth = 150;  // (int) motionEvent.getTouchMajor(); //TODO: this doesn't represent actual touch size in pixel. Values range in [3, 10]...
        final int halfTouchHeight = 150; // (int) motionEvent.getTouchMinor();
        MeteringRectangle rect = new MeteringRectangle(Math.max(x - halfTouchWidth, 0),
                Math.max(y - halfTouchHeight, 0),
                halfTouchWidth * 2,
                halfTouchHeight * 2,
                MeteringRectangle.METERING_WEIGHT_MAX - 1);

        mSession.stopRepeating();
        transparentLayer.drawFeedback(rect);

        // Cancel requests
        previewRequest.set(CaptureRequest.CONTROL_AF_TRIGGER, CameraMetadata.CONTROL_AF_TRIGGER_CANCEL);
        previewRequest.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_OFF);
        mSession.capture(previewRequest.build(), captureCallbackHandler, null);

        // Now add a new AF trigger with focus region
        if (isMeteringAreaAFSupported()) {
            previewRequest.set(CaptureRequest.CONTROL_AF_REGIONS, new MeteringRectangle[]{rect});
        }
        previewRequest.set(CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_MODE_AUTO);
        previewRequest.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_AUTO);
        previewRequest.set(CaptureRequest.CONTROL_AF_TRIGGER, CameraMetadata.CONTROL_AF_TRIGGER_START);
        previewRequest.setTag("FOCUS_TAG"); // we'll capture this later for resuming the preview

        // Then we ask for a single request (not repeating!)
        mSession.capture(previewRequest.build(), captureCallbackHandler, null);
    } catch (Exception e) {
        e.printStackTrace();
    }
}
I also have another helper function:
private boolean isMeteringAreaAFSupported() {
    try {
        return manager.getCameraCharacteristics(mCameraDevice.getId())
                .get(CameraCharacteristics.CONTROL_MAX_REGIONS_AF) >= 1;
    } catch (Exception e) {
        return false;
    }
}
What could be the reason for the focus working for a brief second and then restarting, or the preview becoming completely blurry? I haven't been able to find a solution that helps.
Thanks all!
I would try setting AF_TRIGGER to IDLE in onCaptureCompleted; removing it entirely (setting it to null) isn't totally well-specified.
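As a small sketch of that first suggestion, reusing the question's own previewRequest and mSession names:

// Sketch: instead of previewRequest.set(CaptureRequest.CONTROL_AF_TRIGGER, null),
// explicitly return the trigger to IDLE before resuming the repeating request.
previewRequest.set(CaptureRequest.CONTROL_AF_TRIGGER,
        CameraMetadata.CONTROL_AF_TRIGGER_IDLE);
try {
    mSession.setRepeatingRequest(previewRequest.build(), null, null);
} catch (CameraAccessException e) {
    e.printStackTrace();
}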
Beyond that, it's not clear to me how you're converting from the screen touch coordinates to the camera active array coordinates for the metering regions. It looks like you're assuming the coordinate systems are identical, which isn't true. That shouldn't cause blurriness, but it will cause you to focus on a different area than you think.
You need to scale the x and y correctly, based on the current crop region (which defines the visible field of view when using digital zoom) and the active array rectangle.
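As an illustration only, the mapping could look roughly like this, ignoring display rotation and any letterboxing of the preview; lastCaptureResult is an assumed field holding the most recent CaptureResult, and sensorArraySize is the Rect from the question's refocus method:

// Sketch: map a view touch into active-array coordinates, accounting for the
// current crop region (digital zoom).
Rect cropRegion = lastCaptureResult.get(CaptureResult.SCALER_CROP_REGION);
if (cropRegion == null) {
    cropRegion = sensorArraySize; // no digital zoom: crop covers the full active array
}

// Fraction of the view that was touched (0..1), assuming the preview fills the view.
float xFrac = event.getX() / (float) view.getWidth();
float yFrac = event.getY() / (float) view.getHeight();

// Scale into the crop region, then offset by the crop region's position
// inside the active array.
int x = cropRegion.left + (int) (xFrac * cropRegion.width());
int y = cropRegion.top + (int) (yFrac * cropRegion.height());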
I am using a SurfaceView to show the preview I capture. I want to use width=1080, height=1920 for the preview. Where can I set the size of the preview?
I googled for an answer, but the results are all for the old camera API (version one). I am using android.hardware.camera2.
private void takePreview() {
    try {
        final CaptureRequest.Builder previewRequestBuilder =
                mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
        previewRequestBuilder.addTarget(mSurfaceHolder.getSurface());
        mCameraDevice.createCaptureSession(Arrays.asList(mSurfaceHolder.getSurface(), mImageReader.getSurface()),
                new CameraCaptureSession.StateCallback() // ③
                {
                    @Override
                    public void onConfigured(CameraCaptureSession cameraCaptureSession) {
                        if (null == mCameraDevice) return;
                        mCameraCaptureSession = cameraCaptureSession;
                        try {
                            previewRequestBuilder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
                            previewRequestBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON_ALWAYS_FLASH);
                            previewRequestBuilder.set(CaptureRequest.JPEG_THUMBNAIL_SIZE, new Size(1080, 1920));
                            CaptureRequest previewRequest = previewRequestBuilder.build();
                            mCameraCaptureSession.setRepeatingRequest(previewRequest, null, childHandler);
                        } catch (CameraAccessException e) {
                            Log.e("takePreview", "onConfigured(CameraCaptureSession cameraCaptureSession)", e);
                        }
                    }

                    @Override
                    public void onConfigureFailed(CameraCaptureSession cameraCaptureSession) {
                        Log.e("takePreview", "onConfigureFailed");
                    }
                }, childHandler);
    } catch (CameraAccessException e) {
        Log.e("takePreview", "CameraAccessException");
    }
}
As documented in the reference to createCaptureSession:
For drawing to a SurfaceView: Once the SurfaceView's Surface is created, set the size of the Surface with setFixedSize(int, int) to be one of the sizes returned by getOutputSizes(SurfaceHolder.class) and then obtain the Surface by calling getSurface(). If the size is not set by the application, it will be rounded to the nearest supported size less than 1080p, by the camera device.
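Applied to the code above, a rough sketch might look like this (mCameraManager and mCameraId are assumed fields; the size you pass to setFixedSize must be one of the sizes returned by getOutputSizes(SurfaceHolder.class), so 1080x1920 only works if that list contains it):

// Sketch: constrain the SurfaceView's buffer to a supported camera output size
// before creating the capture session.
try {
    StreamConfigurationMap map = mCameraManager.getCameraCharacteristics(mCameraId)
            .get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
    Size[] previewSizes = map.getOutputSizes(SurfaceHolder.class);
    // Pick a size from previewSizes (for example the one matching your target
    // aspect ratio), then fix the Surface to it.
    mSurfaceHolder.setFixedSize(1080, 1920);
} catch (CameraAccessException e) {
    e.printStackTrace();
}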
Take a look at the Camera2Basic example that Google provides on GitHub: https://github.com/googlesamples/android-Camera2Basic
There is a method in the main fragment which chooses the optimal preview size for a given device. This may be a better approach if you want to make your app more flexible, rather than hardcoding the size, but even if you would rather stick with a fixed size you can see how they use the results.
The summary is that you simply set the size of the output view (a TextureView in their case) to whatever preview size you want.
The method name is 'chooseOptimalSize' and it includes this comment/explanation:
/**
 * Given {@code choices} of {@code Size}s supported by a camera, choose the smallest one that
 * is at least as large as the respective texture view size, and that is at most as large as the
 * respective max size, and whose aspect ratio matches with the specified value. If such size
 * doesn't exist, choose the largest one that is at most as large as the respective max size,
 * and whose aspect ratio matches with the specified value.
 *
 * @param choices           The list of sizes that the camera supports for the intended output
 *                          class
 * @param textureViewWidth  The width of the texture view relative to sensor coordinate
 * @param textureViewHeight The height of the texture view relative to sensor coordinate
 * @param maxWidth          The maximum width that can be chosen
 * @param maxHeight         The maximum height that can be chosen
 * @param aspectRatio       The aspect ratio
 * @return The optimal {@code Size}, or an arbitrary one if none were big enough
 */
To set the preview size: call mSurfaceTexture.setDefaultBufferSize(width, height) in the onSurfaceTextureAvailable callback.
To set the capture (picture) size: ImageReader.newInstance(width, height, format, maxImages).
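A minimal sketch of those two calls in context (the 1920x1080 and 4032x3024 sizes are placeholders and must both come from the camera's StreamConfigurationMap; openCamera is an assumed helper):

// Sketch: where the two size choices live in a TextureView-based setup.
@Override
public void onSurfaceTextureAvailable(SurfaceTexture surfaceTexture, int width, int height) {
    // Preview size: the buffers produced for the TextureView.
    surfaceTexture.setDefaultBufferSize(1920, 1080);
    openCamera(); // assumed existing helper that opens the camera and starts the preview
}

private ImageReader createStillReader() {
    // Capture (picture) size: the JPEGs delivered to the ImageReader.
    return ImageReader.newInstance(4032, 3024, ImageFormat.JPEG, /*maxImages*/ 2);
}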
How do I enable the front camera with the Camera2 API? Can anyone help? I have this Camera2 API code; it only uses the main (rear) camera of the device, and I want to enable both the front and rear cameras on a button click. What is LENS_FACING_FRONT? I am new to Android programming.
private void setUpCameraOutputs(int width, int height) {
    Activity activity = getActivity();
    CameraManager manager = (CameraManager) activity.getSystemService(Context.CAMERA_SERVICE);
    try {
        for (String cameraId : manager.getCameraIdList()) {
            CameraCharacteristics characteristics
                    = manager.getCameraCharacteristics(cameraId);

            // We don't use a front facing camera in this sample.
            Integer facing = characteristics.get(CameraCharacteristics.LENS_FACING);
            if (facing != null && facing == CameraCharacteristics.LENS_FACING_FRONT) {
                continue;
            }

            StreamConfigurationMap map = characteristics.get(
                    CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
            if (map == null) {
                continue;
            }

            // For still image captures, we use the largest available size.
            Size largest = Collections.max(
                    Arrays.asList(map.getOutputSizes(ImageFormat.JPEG)),
                    new CompareSizesByArea());
            mImageReader = ImageReader.newInstance(largest.getWidth(), largest.getHeight(),
                    ImageFormat.JPEG, /*maxImages*/2);
            mImageReader.setOnImageAvailableListener(
                    mOnImageAvailableListener, mBackgroundHandler);

            // Danger, W.R.! Attempting to use too large a preview size could exceed the camera
            // bus' bandwidth limitation, resulting in gorgeous previews but the storage of
            // garbage capture data.
            mPreviewSize = chooseOptimalSize(map.getOutputSizes(SurfaceTexture.class),
                    width, height, largest);

            // We fit the aspect ratio of TextureView to the size of preview we picked.
            int orientation = getResources().getConfiguration().orientation;
            if (orientation == Configuration.ORIENTATION_LANDSCAPE) {
                mTextureView.setAspectRatio(
                        mPreviewSize.getWidth(), mPreviewSize.getHeight());
            } else {
                mTextureView.setAspectRatio(
                        mPreviewSize.getHeight(), mPreviewSize.getWidth());
            }
            mCameraId = cameraId;
            return;
        }
    } catch (CameraAccessException e) {
        e.printStackTrace();
    } catch (NullPointerException e) {
        // Currently an NPE is thrown when the Camera2API is used but not supported on the
        // device this code runs.
        ErrorDialog.newInstance(getString(R.string.camera_error))
                .show(getChildFragmentManager(), FRAGMENT_DIALOG);
    }
}
We can use CameraManager to iterate over all the cameras that are available in the system, each with a designated cameraId. Using the cameraId, we can get the properties of that camera device. Those properties are represented by the class CameraCharacteristics. Things like "is it a front or back camera" and "which output resolutions are supported" can be queried there.
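Applied to the setUpCameraOutputs code above, a minimal sketch of selecting a camera by lens facing could look like this (wantFrontCamera is an assumed boolean you would toggle from your button, then close and reopen the camera):

// Sketch: choose a camera ID by lens facing.
private String chooseCameraId(CameraManager manager, boolean wantFrontCamera)
        throws CameraAccessException {
    int wantedFacing = wantFrontCamera
            ? CameraCharacteristics.LENS_FACING_FRONT
            : CameraCharacteristics.LENS_FACING_BACK;
    for (String cameraId : manager.getCameraIdList()) {
        CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraId);
        Integer facing = characteristics.get(CameraCharacteristics.LENS_FACING);
        if (facing != null && facing == wantedFacing) {
            return cameraId;
        }
    }
    return null; // no camera with the requested facing
}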
You can get the official sample application here.
This example, found in the Google Git repo, demonstrates checking the camera permission before launching the camera on Marshmallow using the Camera2 API.
Take a look at this article for more about it.
I am using this code:
https://github.com/commonsguy/cw-advandroid/blob/master/Camera/Picture/src/com/commonsware/android/picture/PictureDemo.java
where, in the Manifest, the Activity orientation is set to landscape.
So it's like allowing the user to take pictures only in landscape mode, and if a picture is taken while holding the device in portrait mode, the saved image comes out as
a 90-degree rotated image.
After searching for a solution, I found this:
Android - Camera preview is sideways
where the solution is:
in surfaceChanged() check for
Display display = ((WindowManager)getSystemService(WINDOW_SERVICE)).getDefaultDisplay();
display.getRotation();
and change the Camera's displayOrientation accordingly.
camera.setDisplayOrientation(90);
But no matter how many times I rotate the device, surfaceChanged() never gets called.
I even tried removing orientation="landscape" in the Manifest.xml, but then the preview itself is shown sideways (maybe because the default android.view.SurfaceView is supposed to be in landscape mode?).
Try this.
public void surfaceCreated(SurfaceHolder holder) {
    try {
        camera = Camera.open();
        camParam = camera.getParameters();
        Camera.Parameters params = camera.getParameters();
        String currentversion = android.os.Build.VERSION.SDK;
        Log.d("System out", "currentVersion " + currentversion);
        int currentInt = android.os.Build.VERSION.SDK_INT;
        Log.d("System out", "currentVersion " + currentInt);

        if (getResources().getConfiguration().orientation == Configuration.ORIENTATION_PORTRAIT) {
            if (currentInt != 7) {
                camera.setDisplayOrientation(90);
            } else {
                Log.d("System out", "Portrait " + currentInt);
                params.setRotation(90);
                /*
                 * params.set("orientation", "portrait");
                 * params.set("rotation", 90);
                 */
                camera.setParameters(params);
            }
        }

        if (getResources().getConfiguration().orientation == Configuration.ORIENTATION_LANDSCAPE) {
            // camera.setDisplayOrientation(0);
            if (currentInt != 7) {
                camera.setDisplayOrientation(0);
            } else {
                Log.d("System out", "Landscape " + currentInt);
                params.set("orientation", "landscape");
                params.set("rotation", 90);
                camera.setParameters(params);
            }
        }

        camera.setPreviewDisplay(holder);
        camera.startPreview();
    } catch (IOException e) {
        Log.d("CAMERA", e.getMessage());
    }
}
Since you've forced your application to be landscape, your application's configuration won't change when you rotate the device, and as a result your UI won't get redrawn. So you'll never see a surfaceCreated/surfaceChanged callback because of it.
In any case, your issue isn't with the preview, it's with the captured pictures.
The camera API doesn't automatically know which way is down; it needs you to tell it how you want your images rotated by using the Camera.Parameters setRotation method. There are several coordinate systems in play here (the orientation of the camera sensor relative to your device, the orientation of your UI relative to the device, and the orientation of the device relative to the world) which all have to be accounted for correctly.
So I highly recommend you use the code provided in the setRotation documentation and inherit from OrientationEventListener, implementing the listener as follows:
public void onOrientationChanged(int orientation) {
    if (orientation == ORIENTATION_UNKNOWN) return;
    android.hardware.Camera.CameraInfo info =
            new android.hardware.Camera.CameraInfo();
    android.hardware.Camera.getCameraInfo(cameraId, info);
    orientation = (orientation + 45) / 90 * 90;
    int rotation = 0;
    if (info.facing == CameraInfo.CAMERA_FACING_FRONT) {
        rotation = (info.orientation - orientation + 360) % 360;
    } else {  // back-facing camera
        rotation = (info.orientation + orientation) % 360;
    }
    mParameters.setRotation(rotation);
}
This will update your camera's still picture orientation correctly so that 'up' is always up, whether your app is landscape or portrait, or your device is a tablet or a phone.
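For completeness, a minimal sketch of wiring the listener up (the mOrientationListener field and the lifecycle placement are assumptions, following the OrientationEventListener documentation):

// Sketch: create the listener in onCreate and enable/disable it with the
// activity lifecycle so onOrientationChanged keeps the rotation up to date.
private OrientationEventListener mOrientationListener;

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    mOrientationListener = new OrientationEventListener(this) {
        @Override
        public void onOrientationChanged(int orientation) {
            // apply the rotation logic shown above
        }
    };
}

@Override
protected void onResume() {
    super.onResume();
    if (mOrientationListener.canDetectOrientation()) {
        mOrientationListener.enable();
    }
}

@Override
protected void onPause() {
    super.onPause();
    mOrientationListener.disable();
}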