I'm building a camera app based on Camera2, but the picture I save does not match the latest frame I saw on my surface view. The preview session works, but when I ask for a capture, the new request stops the preview and captures the image. The surface view freezes on the last frame, and that creates a gap between the moment I press the shutter button (preview running, capture requested) and the onCaptureCompleted of the capture request.
Here is the preview session:
private void createCameraPreviewSession() {
try {
SurfaceTexture texture = mTextureView.getSurfaceTexture();
assert texture != null;
texture.setDefaultBufferSize(mPreviewSize.getWidth(), mPreviewSize.getHeight());
Log.d(TAG, "here is the width of texture" + mPreviewSize.getWidth());
Log.d(TAG, "here is the height of texture" +mPreviewSize.getHeight());
Surface surface = new Surface(texture);
mPreviewRequestBuilder
= mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
mPreviewRequestBuilder.addTarget(surface);
mCameraDevice.createCaptureSession(Arrays.asList(surface, mImageReader.getSurface()),
new CameraCaptureSession.StateCallback() {
@Override
public void onConfigured(@NonNull CameraCaptureSession cameraCaptureSession) {
if (null == mCameraDevice) {
return;
}
mCaptureSession = cameraCaptureSession;
try {
mPreviewRequest = mPreviewRequestBuilder.build();
mCaptureSession.setRepeatingRequest(mPreviewRequest,
mCaptureCallback, mBackgroundHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
@Override
public void onConfigureFailed(
@NonNull CameraCaptureSession cameraCaptureSession) {
showToast("Failed");
}
}, null
);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
The mCaptureCallback is defined as below:
private CameraCaptureSession.CaptureCallback mCaptureCallback
= new CameraCaptureSession.CaptureCallback() {
private void process(CaptureResult result) {
switch (mState) {
case STATE_PREVIEW:
break;
case STATE_CAPTURE:
mState = STATE_PREVIEW;
capturePicture();
break;
}
}
@Override
public void onCaptureProgressed(@NonNull CameraCaptureSession session,
@NonNull CaptureRequest request,
@NonNull CaptureResult partialResult) {
process(partialResult);
}
@Override
public void onCaptureCompleted(@NonNull CameraCaptureSession session,
@NonNull CaptureRequest request,
@NonNull TotalCaptureResult result) {
TotalCaptureResult iResult = result;
Log.d(TAG, "Frame on Completed: "+result.getFrameNumber());
process(result);
}
};
What's happening is that the repeating preview request works; process() just keeps it running, and nothing happens until mState is set to STATE_CAPTURE.
It is set to capture when the shutter button is clicked. When I click the button, I call:
private void takePicture(){
try {
mFile = ImageSaver.generateNewFileImage();
mState = STATE_CAPTURE;
mCaptureSession.capture(mPreviewRequestBuilder.build(), mCaptureCallback,
mBackgroundHandler);
} catch (CameraAccessException e) {
Log.e(TAG,"Camera exception",e);
}
}
capturePicture() is then called, since mState is STATE_CAPTURE, as handled in mCaptureCallback:
private void capturePicture() {
mTakePictureRunnable = new Runnable() {
@Override
public void run() {
takePictureNow();
}
};
mBackgroundHandler.post(mTakePictureRunnable);
}
takePictureNow() is defined as:
private void takePictureNow() {
Log.d(TAG, "Running captureStillPicture");
try {
if (null == mCameraDevice) {
return;
}
// This is the CaptureRequest.Builder that we use to take a picture.
final CaptureRequest.Builder captureBuilder =
mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
captureBuilder.addTarget(mImageReader.getSurface());
SurfaceTexture texture = mTextureView.getSurfaceTexture();
assert texture != null;
texture.setDefaultBufferSize(mPreviewSize.getWidth(), mPreviewSize.getHeight());
Log.d(TAG, "here is the width of texture" + mPreviewSize.getWidth());
Log.d(TAG, "here is the height of texture" + mPreviewSize.getHeight());
Surface surface = new Surface(texture);
captureBuilder.addTarget(surface);
// Orientation
int rotation = getWindowManager().getDefaultDisplay().getRotation();
captureBuilder.set(CaptureRequest.JPEG_ORIENTATION, ORIENTATIONS.get(rotation));
//Location if needed
boolean Location_Saved = CameraSettings.Instance().getBoolean(CameraSettings.SAVE_LOCATION,
getResources().getBoolean(R.bool.action_camera_settings_dflt_location));
if(Location_Saved == true) {
captureBuilder.set(CaptureRequest.JPEG_GPS_LOCATION, mLocationManager.getCurrentLocation());
} else {
captureBuilder.set(CaptureRequest.JPEG_GPS_LOCATION, null);
}
CameraCaptureSession.CaptureCallback CaptureCallback
= new CameraCaptureSession.CaptureCallback() {
@Override
public void onCaptureStarted(@NonNull CameraCaptureSession session,
@NonNull CaptureRequest request,
long timestamp,
long framenumber) {
playShutterSound();
showShutterAnimation();
}
@Override
public void onCaptureCompleted(@NonNull CameraCaptureSession session,
@NonNull CaptureRequest request,
@NonNull TotalCaptureResult result) {
Log.d(TAG, mFile.toString());
mState = STATE_PREVIEW;
}
};
mCaptureSession.capture(captureBuilder.build(), CaptureCallback, null);
mCaptureSession.stopRepeating();
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
This process works, but there is a lag between the button press and the image save, and the last frame I saw on screen from the preview is not exactly the one that gets saved if I move quickly.
It seems that the capture does not update the surface; the surface only updates while we are in preview.
Any idea how to see what I actually save?
Thanks
The surface will freeze when you take an image, and the frozen frame might not be the image that gets saved. If you still want to show the camera preview on the screen, you just have to restart the camera preview by calling the function you used to create the preview the first time.
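For example, a minimal sketch, assuming the preview surface is still part of the session and reusing the question's mCaptureSession, mPreviewRequestBuilder, mCaptureCallback and mBackgroundHandler fields: in the still capture's CaptureCallback (the one defined in takePictureNow), restart the repeating request once the capture completes.
@Override
public void onCaptureCompleted(@NonNull CameraCaptureSession session,
                               @NonNull CaptureRequest request,
                               @NonNull TotalCaptureResult result) {
    mState = STATE_PREVIEW;
    try {
        // Restart the repeating preview request so the TextureView
        // does not stay frozen on the last frame.
        mCaptureSession.setRepeatingRequest(mPreviewRequestBuilder.build(),
                mCaptureCallback, mBackgroundHandler);
    } catch (CameraAccessException e) {
        e.printStackTrace();
    }
}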
I am using the Camera2 API (https://github.com/googlearchive/android-Camera2Basic) for my project.
I want to use both the front and rear cameras. It works properly on my device (Realme 5, Android 10).
But when I try to use the front camera on a OnePlus 6 (Android 10), a Samsung Galaxy J7 Nxt (Android 7.0), or a Redmi, I end up in the onConfigureFailed method and get redirected to the first activity.
Manifest screenshot: (image omitted)
Log screenshot: (image omitted)
The whole code is in the googlearchive GitHub link given above.
Variable Initialisation
private static final SparseIntArray ORIENTATIONS = new SparseIntArray();
private static final int REQUEST_CAMERA_PERMISSION = 1;
private static final String FRAGMENT_DIALOG = "dialog";
static {
ORIENTATIONS.append(Surface.ROTATION_0, 90);
ORIENTATIONS.append(Surface.ROTATION_90, 0);
ORIENTATIONS.append(Surface.ROTATION_180, 270);
ORIENTATIONS.append(Surface.ROTATION_270, 180);
}
public static int mSelectedFacing = 1;
flip camera button click
if (mTextureView.isAvailable()) {
if(mSelectedFacing ==0){
mSelectedFacing = 1;
}else {
mSelectedFacing = 0;
}
closeCamera();
openCamera(mTextureView.getWidth(), mTextureView.getHeight(), mSelectedFacing);
} else {
mTextureView.setSurfaceTextureListener(mSurfaceTextureListener);
}
Open camera
private void openCamera(int width, int height,int mSelectedFacing) {
if (ContextCompat.checkSelfPermission(getActivity(), Manifest.permission.CAMERA)
!= PackageManager.PERMISSION_GRANTED) {
requestCameraPermission();
return;
}
setUpCameraOutputs(width, height,mSelectedFacing);
configureTransform(width, height);
Activity activity = getActivity();
CameraManager manager = (CameraManager) activity.getSystemService(Context.CAMERA_SERVICE);
try {
if (!mCameraOpenCloseLock.tryAcquire(2500, TimeUnit.MILLISECONDS)) {
throw new RuntimeException("Time out waiting to lock camera opening.");
}
manager.openCamera(mCameraId, mStateCallback, mBackgroundHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
} catch (InterruptedException e) {
throw new RuntimeException("Interrupted while trying to lock camera opening.", e);
}
}
Close camera
private void closeCamera() {
try {
mCameraOpenCloseLock.acquire();
if (null != mCaptureSession) {
mCaptureSession.close();
mCaptureSession = null;
}
if (null != mCameraDevice) {
mCameraDevice.close();
mCameraDevice = null;
}
if (null != mImageReader) {
mImageReader.close();
mImageReader = null;
}
} catch (InterruptedException e) {
throw new RuntimeException("Interrupted while trying to lock camera closing.", e);
} finally {
mCameraOpenCloseLock.release();
}
}
CameraPreviewSession
private void createCameraPreviewSession() {
try {
SurfaceTexture texture = mTextureView.getSurfaceTexture();
assert texture != null;
// We configure the size of default buffer to be the size of camera preview we want.
texture.setDefaultBufferSize(mPreviewSize.getWidth(), mPreviewSize.getHeight());
// This is the output Surface we need to start preview.
Surface surface = new Surface(texture);
// We set up a CaptureRequest.Builder with the output Surface.
mPreviewRequestBuilder
= mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
mPreviewRequestBuilder.addTarget(surface);
// Here, we create a CameraCaptureSession for camera preview.
mCameraDevice.createCaptureSession(Arrays.asList(surface, mImageReader.getSurface()),
new CameraCaptureSession.StateCallback() {
@Override
public void onConfigured(@NonNull CameraCaptureSession cameraCaptureSession) {
// The camera is already closed
if (null == mCameraDevice) {
return;
}
// When the session is ready, we start displaying the preview.
mCaptureSession = cameraCaptureSession;
try {
// Auto focus should be continuous for camera preview.
mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_MODE,
CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
// Flash is automatically enabled when necessary.
setAutoFlash(mPreviewRequestBuilder);
// Finally, we start displaying the camera preview.
mPreviewRequest = mPreviewRequestBuilder.build();
mCaptureSession.setRepeatingRequest(mPreviewRequest,
mCaptureCallback, mBackgroundHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
@Override
public void onConfigureFailed(
@NonNull CameraCaptureSession cameraCaptureSession) {
showToast("Failed");
}
}, null
);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
Please help me …
Thanks in advance
I have done something similar: it captures an image in the background, without any preview screen, using the Camera2 API in a background service.
Visit https://codepalyers.blogspot.com/2020/12/capture-image-in-background-without.html
If you wish to use the rear camera, use
private final String frontCamera = "1";
private final String backCamera = "0";
as already defined in the code.
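Note that hard-coded IDs like "0" and "1" are not guaranteed to be the same on every device; below is a small sketch of a hypothetical helper (not part of the linked sample) that resolves the camera ID by lens facing instead:
// Hypothetical helper: pick a camera ID by its LENS_FACING characteristic
// rather than relying on hard-coded "0"/"1" IDs.
private String findCameraId(CameraManager manager, int facing) throws CameraAccessException {
    for (String id : manager.getCameraIdList()) {
        CameraCharacteristics chars = manager.getCameraCharacteristics(id);
        Integer lensFacing = chars.get(CameraCharacteristics.LENS_FACING);
        if (lensFacing != null && lensFacing == facing) {
            return id;
        }
    }
    return null; // no camera with the requested facing
}
Call it with CameraCharacteristics.LENS_FACING_FRONT or LENS_FACING_BACK before openCamera().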
I want camera preview and camera capture using the Camera2 API.
This code works, but there is one problem.
The problem: when I click the capture button, the capture succeeds but the camera preview stops.
If I press the capture button again, the camera preview is still paused.
If I do not press the capture button after launching the app, the camera preview works fine.
Error log when the capture button is pressed:
java.lang.IllegalArgumentException: submitRequestList:208: Request targets Surface that is not part of current capture session
at android.hardware.camera2.CameraManager.throwAsPublicException(CameraManager.java:650)
at android.hardware.camera2.impl.ICameraDeviceUserWrapper.submitRequestList(ICameraDeviceUserWrapper.java:86)
at android.hardware.camera2.impl.CameraDeviceImpl.submitCaptureRequest(CameraDeviceImpl.java:935)
at android.hardware.camera2.impl.CameraDeviceImpl.setRepeatingRequest(CameraDeviceImpl.java:974)
at android.hardware.camera2.impl.CameraCaptureSessionImpl.setRepeatingRequest(CameraCaptureSessionImpl.java:243)
at com.bilal.androidthingscameralib.CameraHelper$4.onCaptureCompleted(CameraHelper.java:273)
at java.lang.reflect.Method.invoke(Native Method)
at android.hardware.camera2.dispatch.InvokeDispatcher.dispatch(InvokeDispatcher.java:39)
at android.hardware.camera2.dispatch.HandlerDispatcher$1.run(HandlerDispatcher.java:65)
at android.os.Handler.handleCallback(Handler.java:790)
at android.os.Handler.dispatchMessage(Handler.java:99)
at android.os.Looper.loop(Looper.java:164)
at android.app.ActivityThread.main(ActivityThread.java:6494)
at java.lang.reflect.Method.invoke(Native Method)
at com.android.internal.os.RuntimeInit$MethodAndArgsCaller.run(RuntimeInit.java:438)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:807)
Caused by: android.os.ServiceSpecificException: submitRequestList:208: Request targets Surface that is not part of current capture session (code 3)
at android.os.Parcel.readException(Parcel.java:2018)
at android.os.Parcel.readException(Parcel.java:1950)
at android.hardware.camera2.ICameraDeviceUser$Stub$Proxy.submitRequestList(ICameraDeviceUser.java:334)
at android.hardware.camera2.impl.ICameraDeviceUserWrapper.submitRequestList(ICameraDeviceUserWrapper.java:84)
at android.hardware.camera2.impl.CameraDeviceImpl.submitCaptureRequest(CameraDeviceImpl.java:935)
at android.hardware.camera2.impl.CameraDeviceImpl.setRepeatingRequest(CameraDeviceImpl.java:974)
at android.hardware.camera2.impl.CameraCaptureSessionImpl.setRepeatingRequest(CameraCaptureSessionImpl.java:243)
at com.bilal.androidthingscameralib.CameraHelper$4.onCaptureCompleted(CameraHelper.java:273)
at java.lang.reflect.Method.invoke(Native Method)
at android.hardware.camera2.dispatch.InvokeDispatcher.dispatch(InvokeDispatcher.java:39)
at android.hardware.camera2.dispatch.HandlerDispatcher$1.run(HandlerDispatcher.java:65)
at android.os.Handler.handleCallback(Handler.java:790)
at android.os.Handler.dispatchMessage(Handler.java:99)
at android.os.Looper.loop(Looper.java:164)
at android.app.ActivityThread.main(ActivityThread.java:6494)
at java.lang.reflect.Method.invoke(Native Method)
at com.android.internal.os.RuntimeInit$MethodAndArgsCaller.run(RuntimeInit.java:438)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:807)
And my source (MainActivity.class):
private InitializeCamera mInitializeCamera;
TextureView mTextureView;
ImageView imgCaptureImage;
@Override
public void onCreate(Bundle savedInstanceState) {
    // super.onCreate(...) and setContentView(...) omitted in this excerpt
    Button capBtn = (Button) findViewById(R.id.capBtn); //capture button
    capBtn.setOnClickListener(new View.OnClickListener() {
        @Override
        public void onClick(View v) {
            mInitializeCamera.captureImage();
        }
    });
}
//camera open texture
private final TextureView.SurfaceTextureListener mSurfaceTextureListener
= new TextureView.SurfaceTextureListener() {
@Override
public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
mInitializeCamera = new InitializeCamera(getApplicationContext(), mOnPictureAvailableListener, mTextureView, 640, 480, 1);
}
...
};
//picture taken
private OnPictureAvailableListener mOnPictureAvailableListener =
new OnPictureAvailableListener() {
@Override
public void onPictureAvailable(byte[] imageBytes) {
onPictureTaken(imageBytes);
}
};
//get picture
private void onPictureTaken(final byte[] imageBytes) {
if (imageBytes != null) {
Bitmap bitmap = BitmapFactory.decodeByteArray(imageBytes, 0, imageBytes.length);
imgCaptureImage.setImageBitmap(bitmap);
}
}
(InitializeCamera.class) :)
OnPictureAvailableListener mOnPictureAvailableListener;
private HandlerThread mCameraHandlerThread;
//Initialize camera library class
private CameraHelper mCameraHelper;
//Handler for running Camera Task in the background
private Handler mCameraHandler = new Handler();
public InitializeCamera(Context mContext, OnPictureAvailableListener mOnPictureAvailableListener, TextureView textureView, int imageHeight, int imageWidth, int maxSize) {
this.mOnPictureAvailableListener = mOnPictureAvailableListener;
//create new handler thread for camera operations.
mCameraHandlerThread = new HandlerThread("CameraBackground");
mCameraHandlerThread.start();
//Initialize Camera class.
mCameraHelper = CameraHelper.getInstance();
mCameraHelper.initializeCameraHelper(mContext, mCameraHandler, textureView, this, imageHeight, imageWidth, maxSize);
}
//capture image
public void captureImage() {
mCameraHelper.takePicture();
}
//get imagereader available
@Override
public void onImageAvailable(ImageReader reader) {
Image image = reader.acquireNextImage();
ByteBuffer imageBuf = image.getPlanes()[0].getBuffer();
final byte[] imageBytes = new byte[imageBuf.remaining()];
imageBuf.get(imageBytes); // copy the JPEG bytes out before closing the Image
image.close();
//post image bytes data to main UI Thread for displaying it in image view
mCameraHandler.post(new Runnable() {
@Override
public void run() {
mOnPictureAvailableListener.onPictureAvailable(imageBytes);
}
});
}
(CameraHelper.class) (thanks)
TextureView textureView;
private CameraDevice mCameraDevice;
private CameraCaptureSession mCaptureSession;
private ImageReader mImageReader;
Size imageDimension;
private CaptureRequest.Builder captureRequestBuilder;
//Lazy-loaded singleton, so only one instance of the camera is created.
private CameraHelper() {
}
private static class InstanceHolder {
private static CameraHelper mCamera = new CameraHelper();
}
public static CameraHelper getInstance() {
return InstanceHolder.mCamera;
}
//Initialize the camera device.
public void initializeCameraHelper(Context context, Handler backgroundHandler, TextureView textureView, ImageReader.OnImageAvailableListener imageAvailableListener, int imageWidth, int imageHeight, int maxImages) {
this.textureView = textureView;
//discover the camera instance
CameraManager manager = (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
String[] camIds = {};
try {
camIds = manager.getCameraIdList();
if (camIds.length < 1) {
return;
}
String id = camIds[0];
CameraCharacteristics characteristics = manager.getCameraCharacteristics(id);
StreamConfigurationMap map = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
imageDimension = map.getOutputSizes(SurfaceTexture.class)[2];
//Initialize the image processor
mImageReader = ImageReader.newInstance(imageWidth, imageHeight, ImageFormat.JPEG, maxImages);
mImageReader.setOnImageAvailableListener(
imageAvailableListener, backgroundHandler);
//Open camera resource
manager.openCamera(id, mStateCallback, backgroundHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
/**
* Callback handling device state changes
*/
private final CameraDevice.StateCallback mStateCallback = new CameraDevice.StateCallback() {
@Override
public void onOpened(@NonNull CameraDevice cameraDevice) {
Log.d(TAG, "Opened camera.");
mCameraDevice = cameraDevice;
createCameraPreview();
}
@Override
public void onDisconnected(@NonNull CameraDevice cameraDevice) {
Log.d(TAG, "Camera disconnected, closing.");
closeCaptureSession();
cameraDevice.close();
}
@Override
public void onError(@NonNull CameraDevice cameraDevice, int i) {
Log.d(TAG, "Camera device error, closing.");
closeCaptureSession();
cameraDevice.close();
}
@Override
public void onClosed(@NonNull CameraDevice cameraDevice) {
Log.d(TAG, "Closed camera, releasing");
mCameraDevice = null;
}
};
private void createCameraPreview() {
Log.d(TAG, "createCameraPreview --" + textureView);
try {
SurfaceTexture texture = textureView.getSurfaceTexture();
texture.setDefaultBufferSize(imageDimension.getWidth(), imageDimension.getHeight());
Surface surface = new Surface(texture);
captureRequestBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
captureRequestBuilder.addTarget(surface);
mCameraDevice.createCaptureSession(Arrays.asList(surface), new CameraCaptureSession.StateCallback() {
@Override
public void onConfigured(@NonNull CameraCaptureSession session) {
if (mCameraDevice == null) {
return;
}
mCaptureSession = session;
updatePreview();
}
@Override
public void onConfigureFailed(@NonNull CameraCaptureSession session) {
}
}, null);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
//ready preview screen
private void updatePreview() {
if (mCameraDevice == null) {
}
captureRequestBuilder.set(CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_MODE_AUTO);
try {
mCaptureSession.setRepeatingRequest(captureRequestBuilder.build(), null, null);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
/**
* Begin a still image capture
*/
public void takePicture() {
if (mCameraDevice == null) {
Log.w(TAG, "Cannot capture image. Camera not initialized.");
return;
}
// Here, we create a CameraCaptureSession for capturing still images.
try {
mCameraDevice.createCaptureSession(
Collections.singletonList(mImageReader.getSurface()),
mSessionCallback,
null);
} catch (CameraAccessException cae) {
Log.d(TAG, "access exception while preparing pic", cae);
}
}
/**
* Callback handling session state changes
*/
private CameraCaptureSession.StateCallback mSessionCallback =
new CameraCaptureSession.StateCallback() {
@Override
public void onConfigured(@NonNull CameraCaptureSession cameraCaptureSession) {
// The camera is already closed
if (mCameraDevice == null) {
return;
}
// When the session is ready, we start capture.
mCaptureSession = cameraCaptureSession;
triggerImageCapture();
}
@Override
public void onConfigureFailed(@NonNull CameraCaptureSession cameraCaptureSession) {
Log.w(TAG, "Failed to configure camera");
}
};
/**
* Execute a new capture request within the active session
*/
private void triggerImageCapture() {
try {
final CaptureRequest.Builder captureBuilder =
mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
captureBuilder.addTarget(mImageReader.getSurface());
captureBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON);
Log.d(TAG, "Capture request created.");
mCaptureSession.capture(captureBuilder.build(), mCaptureCallback, null);
} catch (CameraAccessException cae) {
Log.d(TAG, "camera capture exception");
}
}
/**
* Callback handling capture session events
*/
private final CameraCaptureSession.CaptureCallback mCaptureCallback =
new CameraCaptureSession.CaptureCallback() {
@Override
public void onCaptureProgressed(@NonNull CameraCaptureSession session,
@NonNull CaptureRequest request,
@NonNull CaptureResult partialResult) {
Log.d(TAG, "Partial result");
}
@Override
public void onCaptureCompleted(@NonNull CameraCaptureSession session,
@NonNull CaptureRequest request,
@NonNull TotalCaptureResult result) {
//session.close();
//mCaptureSession = null;
captureRequestBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER, CameraMetadata.CONTROL_AF_TRIGGER_CANCEL);
try {
mCaptureSession.setRepeatingRequest(captureRequestBuilder.build(), mCaptureCallback, null); //OCCUR ERROR LOG
} catch (CameraAccessException e) {
e.printStackTrace();
}
Log.d(TAG, "CaptureSession closed");
}
};
The error occurs in CameraHelper.class:
Request targets Surface that is not part of current capture session
private final CameraCaptureSession.CaptureCallback mCaptureCallback =
new CameraCaptureSession.CaptureCallback() {
@Override
public void onCaptureCompleted(@NonNull CameraCaptureSession session,
@NonNull CaptureRequest request,
@NonNull TotalCaptureResult result) {
//session.close();
//mCaptureSession = null;
captureRequestBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER, CameraMetadata.CONTROL_AF_TRIGGER_CANCEL);
try {
mCaptureSession.setRepeatingRequest(captureRequestBuilder.build(), mCaptureCallback, null);
} catch (CameraAccessException e) {
e.printStackTrace();
}
Log.d(TAG, "CaptureSession closed");
}
};
In this part above, I think there is a problem with mCaptureCallback after the capture.
How can I fix this problem?
If you know, please advise me. Thanks.
You're creating a new capture session in takePicture with just the one ImageReader Surface; after that, you can no longer use the TextureView Surface as a capture request target. You have to create yet another session to resume preview.
It's better to just add the JPEG ImageReader Surface to the initial capture session, and not have to be continually tearing down sessions to take pictures (which is quite slow). For regular preview, don't target the ImageReader in the repeating request, and then for still capture, just issue a capture request that targets the ImageReader.
See the camera2basic sample for examples of this.
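A minimal sketch of that approach, reusing the question's fields (mCameraDevice, mImageReader, mCaptureSession, mCaptureCallback, updatePreview) with a hypothetical createCombinedSession name and previewSurface parameter; the key point is that both outputs are registered once, and a still capture simply targets the ImageReader within the same session:
// Sketch: one session containing both the preview Surface and the JPEG ImageReader.
private void createCombinedSession(Surface previewSurface) throws CameraAccessException {
    mCameraDevice.createCaptureSession(
            Arrays.asList(previewSurface, mImageReader.getSurface()),
            new CameraCaptureSession.StateCallback() {
                @Override
                public void onConfigured(@NonNull CameraCaptureSession session) {
                    mCaptureSession = session;
                    updatePreview(); // repeating request targets only previewSurface
                }
                @Override
                public void onConfigureFailed(@NonNull CameraCaptureSession session) { }
            }, null);
}

// Sketch: the still capture reuses the SAME session and targets only the ImageReader,
// so the repeating preview request keeps running and no session teardown is needed.
public void takePicture() throws CameraAccessException {
    CaptureRequest.Builder still =
            mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
    still.addTarget(mImageReader.getSurface());
    mCaptureSession.capture(still.build(), mCaptureCallback, null);
}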
I've integrated camera2 with TextureView. It works on all devices, but on a tablet, when we capture an image for the first time, it crashes and displays the following logs.
Fatal Exception: java.lang.IllegalArgumentException: Surface had no valid native Surface.
at android.hardware.camera2.legacy.LegacyCameraDevice.nativeGetSurfaceId(LegacyCameraDevice.java)
at android.hardware.camera2.legacy.LegacyCameraDevice.getSurfaceId(LegacyCameraDevice.java:658)
at android.hardware.camera2.legacy.LegacyCameraDevice.containsSurfaceId(LegacyCameraDevice.java:678)
at android.hardware.camera2.legacy.RequestThreadManager$2.onPictureTaken(RequestThreadManager.java:220)
at android.hardware.Camera$EventHandler.handleMessage(Camera.java:1248)
at android.os.Handler.dispatchMessage(Handler.java:111)
at android.os.Looper.loop(Looper.java:207)
at android.hardware.camera2.legacy.CameraDeviceUserShim$CameraLooper.run(CameraDeviceUserShim.java:136)
at java.lang.Thread.run(Thread.java:818)
The following code is used to capture the image:
protected void takePicture() {
if (getContext() == null || cameraDevice == null) return;
lockFocus();
CameraManager manager = (CameraManager) getContext().getSystemService(Context.CAMERA_SERVICE);
try {
CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraDevice.getId());
if (characteristics != null) {
sizes = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP).getOutputSizes(ImageFormat.JPEG);
}
ImageReader reader = getImageReader();
if (reader == null) return;
List<Surface> outputSurfaces = getSurfaces(reader);
final CaptureRequest.Builder captureBuilder = getCaptureBuilder(reader);
final CameraCaptureSession.CaptureCallback captureListener = new CameraCaptureSession.CaptureCallback() {
@Override
public void onCaptureCompleted(CameraCaptureSession session, CaptureRequest request, TotalCaptureResult result) {
super.onCaptureCompleted(session, request, result);
}
};
cameraDevice.createCaptureSession(outputSurfaces, new CameraCaptureSession.StateCallback() {
@Override
public void onConfigured(CameraCaptureSession session) {
try {
session.capture(captureBuilder.build(), captureListener, null);
} catch (Exception e) {
e.printStackTrace();
}
}
@Override
public void onConfigureFailed(CameraCaptureSession session) {
}
}, null);
} catch (Exception e) {
e.printStackTrace();
}
}
Any help would be appreciated.
That can happen if the ImageReader gets garbage collected before the camera picture capture completes.
Does the getImageReader method store the image reader somewhere permanent (like as a class member)? If not, the Surface from the ImageReader is like a weak reference, and will not keep it from being removed.
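For example, a minimal sketch, assuming a field named mImageReader plus a readerListener and mBackgroundHandler that already exist elsewhere in the class (the question's getImageReader body isn't shown); the point is only that the reader is held in a member so it survives until onCaptureCompleted fires:
// Hold a strong reference so the ImageReader (and its Surface) cannot be
// garbage collected while the capture is still in flight.
private ImageReader mImageReader;

private ImageReader getImageReader() {
    if (mImageReader == null) {
        mImageReader = ImageReader.newInstance(sizes[0].getWidth(), sizes[0].getHeight(),
                ImageFormat.JPEG, 1);
        mImageReader.setOnImageAvailableListener(readerListener, mBackgroundHandler);
    }
    return mImageReader;
}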
I am using the Camera2 API to show a preview from the camera. I also want to implement an ImageReader to process the images. I have a startPreview function; when I call it, the preview is just black. If I remove "mimageReader.getSurface()" from Arrays.asList(), I am able to see the camera preview. How can I get the camera preview to show up and also use the ImageReader?
private void startPreview()
{
List<Surface> outputSurfaces = new ArrayList<>();
List surfaces = new ArrayList<>();
SurfaceTexture surfaceTexture = textureView.getSurfaceTexture();
surfaceTexture.setDefaultBufferSize(mPreviewSize.getWidth(), mPreviewSize.getHeight());
Surface previewSurface = new Surface(surfaceTexture);
try {
mCaptureRequestBuilder = _cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
mCaptureRequestBuilder.addTarget(previewSurface);
_cameraDevice.createCaptureSession(Arrays.asList(previewSurface,mimageReader.getSurface()),
new CameraCaptureSession.StateCallback() {
@Override
public void onConfigured(CameraCaptureSession session) {
Log.d("", "onConfigured: startPreview");
try {
session.setRepeatingRequest( mCaptureRequestBuilder.build(),null,mthreadhandler);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
@Override
public void onConfigureFailed(CameraCaptureSession session) {
Log.d("", "onConfigureFailed: startPreview");
}
}, null);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
You have to create a new instance of ImageReader, set its OnImageAvailableListener (you can process the preview frames there), and add it as a target to the preview request builder. For example:
mImageReader = ImageReader.newInstance(
mPreviewSize.getWidth(), mPreviewSize.getHeight(), ImageFormat.YUV_420_888, 2);
mImageReader.setOnImageAvailableListener(mOnGetPreviewListener, mBackgroundHandler);
mPreviewRequestBuilder.addTarget(mImageReader.getSurface());
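A minimal sketch of such a listener, assuming a hypothetical processFrame helper; the important part is to acquire and close every Image, otherwise the preview will stall once the reader's buffers are exhausted:
private final ImageReader.OnImageAvailableListener mOnGetPreviewListener =
        new ImageReader.OnImageAvailableListener() {
    @Override
    public void onImageAvailable(ImageReader reader) {
        Image image = reader.acquireLatestImage();
        if (image == null) {
            return; // no frame ready yet
        }
        try {
            processFrame(image); // hypothetical: your per-frame processing
        } finally {
            image.close(); // always release the buffer back to the reader
        }
    }
};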
It may be because your startPreview() function is not called; that can happen when we don't get the SurfaceTexture values in time.
This is my startPreview method:
protected void createCameraPreview() {
try {
SurfaceTexture texture = textureView.getSurfaceTexture();
assert texture != null;
texture.setDefaultBufferSize(imageDimension.getWidth(), imageDimension.getHeight());
Surface surface = new Surface(texture);
captureRequestBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
captureRequestBuilder.addTarget(surface);
cameraDevice.createCaptureSession(Arrays.asList(surface), new CameraCaptureSession.StateCallback(){
@Override
public void onConfigured(@NonNull CameraCaptureSession cameraCaptureSession) {
//The camera is already closed
if (null == cameraDevice) {
return;
}
// When the session is ready, we start displaying the preview.
cameraCaptureSessions = cameraCaptureSession;
updatePreview();
}
@Override
public void onConfigureFailed(@NonNull CameraCaptureSession cameraCaptureSession) {
Toast.makeText(MainActivity.this, "Configuration change", Toast.LENGTH_SHORT).show();
}
}, null);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
Then I call the updatePreview function to start the preview:
protected void updatePreview() {
if(null == cameraDevice) {
Log.e(TAG, "updatePreview error, return");
}
captureRequestBuilder.set(CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_MODE_AUTO);
try {
cameraCaptureSessions.setRepeatingRequest(captureRequestBuilder.build(), null, mBackgroundHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
I'm trying to create an app using the camera2 API. What I need is to capture a burst at 30 fps, which I was able to do.
The problem is that both the preview images and the saved images are interlaced (I'm photographing some blinking LEDs, so it's easy to see).
I tried to disable the auto exposure and set the sensitivity myself, but that didn't work.
private void captureStillPicture() {
try {
final Activity activity = getActivity();
mPictureCounter = 0;
if (null == activity || null == mCameraDevice) {
return;
}
List<CaptureRequest> captureList = new ArrayList<CaptureRequest>();
// This is the CaptureRequest.Builder that we use to take a picture.
final CaptureRequest.Builder captureBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
// Use the same AE and AF modes as the preview.
captureBuilder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_AUTO);
captureBuilder.set(CaptureRequest.SENSOR_SENSITIVITY, 2000);
//Auto focus - should keep that
captureBuilder.set(CaptureRequest.CONTROL_AE_MODE, Consts.aeMode);
captureBuilder.addTarget(mImageReader.getSurface());
for(int i = 0; i < Consts.frameAmount; i++) {
captureList.add(captureBuilder.build());
}
CameraCaptureSession.CaptureCallback CaptureCallback = new CameraCaptureSession.CaptureCallback() {
@Override
public void onCaptureCompleted(@NonNull CameraCaptureSession session,
@NonNull CaptureRequest request,
@NonNull TotalCaptureResult result) {
mPictureCounter++;
unlockFocus();
}
};
mCaptureSession.stopRepeating();
mCaptureSession.captureBurst(captureList, CaptureCallback, mBackgroundHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
Any thoughts?