I am creating an app on Android 5.0.2 using the new camera API (camera2). The app takes a picture every 2.5 seconds for 3 hours (4,320 pictures in total). As you can see in the code below, I implemented the repetition with a Timer and no preview, following "Capture picture without preview using camera2 API". I am testing on a Nexus 7 2013 (16 GB, Android 5.0.2). It works fine for the first 200-300 pictures and then fails with the error message below. The failure always starts with "E/RequestThread-1: Hit timeout for jpeg callback!", so that message must trigger the crash. Can anyone help me get rid of this trigger? Or will this be fixed in 5.1.0, if it is an Android bug?
03-30 15:46:04.472 11432-11432/com.example.android.camera2basic V/yo click﹕ ---- 174 ---- click
03-30 15:46:05.026 11432-11537/com.example.android.camera2basic E/RequestThread-1﹕ Hit timeout for jpeg callback!
03-30 15:46:05.027 11432-11537/com.example.android.camera2basic W/CaptureCollector﹕ Jpeg buffers dropped for request: 173
03-30 15:46:05.076 11432-11480/com.example.android.camera2basic E/CameraDevice-JV-1﹕ Lost output buffer reported for frame 173
03-30 15:46:05.090 11432-11537/com.example.android.camera2basic W/LegacyRequestMapper﹕ convertRequestMetadata - control.awbRegions setting is not supported, ignoring value
03-30 15:46:05.090 11432-11537/com.example.android.camera2basic W/LegacyRequestMapper﹕ Only received metering rectangles with weight 0.
03-30 15:46:05.091 11432-11537/com.example.android.camera2basic W/LegacyMetadataMapper﹕ convertAfModeToLegacy - ignoring unsupported mode 4, defaulting to fixed
03-30 15:46:05.091 11432-11537/com.example.android.camera2basic W/LegacyRequestMapper﹕ convertRequestToMetadata - Ignoring android.lens.focusDistance false, only 0.0f is supported
03-30 15:46:05.098 11432-11537/com.example.android.camera2basic E/AndroidRuntime﹕ FATAL EXCEPTION: RequestThread-1
Process: com.example.android.camera2basic, PID: 11432
java.lang.RuntimeException: startPreview failed
at android.hardware.Camera.startPreview(Native Method)
at android.hardware.camera2.legacy.RequestThreadManager.startPreview(RequestThreadManager.java:275)
at android.hardware.camera2.legacy.RequestThreadManager.doJpegCapturePrepare(RequestThreadManager.java:288)
at android.hardware.camera2.legacy.RequestThreadManager.access$1700(RequestThreadManager.java:61)
at android.hardware.camera2.legacy.RequestThreadManager$5.handleMessage(RequestThreadManager.java:767)
at android.os.Handler.dispatchMessage(Handler.java:98)
at android.os.Looper.loop(Looper.java:135)
at android.os.HandlerThread.run(HandlerThread.java:61)
Here is my code:
public class CameraActivity extends Activity {
Timer mTimer = null;
Handler mHandler = new Handler();
private ImageReader imageReader;
private Handler backgroundHandler;
private HandlerThread backgroundThread;
private String cameraId;
private CameraDevice cameraDevice;
private CameraCaptureSession cameraCaptureSession;
static int count = 0;
static int count2 = 0;
/**
* Conversion from screen rotation to JPEG orientation.
*/
private static final SparseIntArray ORIENTATIONS = new SparseIntArray();
static {
ORIENTATIONS.append(Surface.ROTATION_0, 90);
ORIENTATIONS.append(Surface.ROTATION_90, 0);
ORIENTATIONS.append(Surface.ROTATION_180, 270);
ORIENTATIONS.append(Surface.ROTATION_270, 180);
}
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
//setContentView(R.layout.activity_camera);
setContentView(R.layout.activity_main);
Button takePicture = (Button)findViewById(R.id.takepic);
takePicture.setOnClickListener(onClickPicture);
//(1) setting up camera but stop before camera createCaptureRequest
setupCamera2();
}
private void setupCamera2() {
CameraManager manager = (CameraManager) getSystemService(Context.CAMERA_SERVICE);
try {
for (String cameraId : manager.getCameraIdList()) {
CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraId);
//if (characteristics.get(CameraCharacteristics.LENS_FACING) != CameraCharacteristics.LENS_FACING_BACK) {
if (characteristics.get(CameraCharacteristics.LENS_FACING) != CameraCharacteristics.LENS_FACING_FRONT) {
continue;
}
StreamConfigurationMap configs = characteristics.get(
CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
this.cameraId = cameraId;
manager.openCamera(this.cameraId, cameraStateCallback, backgroundHandler);
Size[] sizes = configs.getOutputSizes(ImageFormat.JPEG);
int picWidth = 640;//1920;
int picHeight = 480;//1080;
imageReader = ImageReader.newInstance(picWidth, picHeight, ImageFormat.JPEG, 2);
imageReader.setOnImageAvailableListener(onImageAvailableListener, backgroundHandler);
}
} catch (CameraAccessException | NullPointerException e) {
e.printStackTrace();
}
}
private final CameraDevice.StateCallback cameraStateCallback = new CameraDevice.StateCallback() {
@Override
public void onOpened(CameraDevice device) {
cameraDevice = device;
//(2) Camera capture session
createCameraCaptureSession();
}
@Override
public void onDisconnected(CameraDevice cameraDevice) {}
@Override
public void onError(CameraDevice cameraDevice, int error) {}
};
//private void createCaptureSession() {
private void createCameraCaptureSession() {
List<Surface> outputSurfaces = new LinkedList<>();
outputSurfaces.add(imageReader.getSurface());
Log.v("-yo(2)-", "in createcameraCaptureSession now");
try {
cameraDevice.createCaptureSession(outputSurfaces, new CameraCaptureSession.StateCallback() {
@Override
public void onConfigured(CameraCaptureSession session) {
//cameraCaptureSession = session;
cameraCaptureSession = session;
//commented out to invoked from button
//createCaptureRequest();
}
@Override
public void onConfigureFailed(CameraCaptureSession session) {}
}, null);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
private final ImageReader.OnImageAvailableListener onImageAvailableListener = new ImageReader.OnImageAvailableListener() {
@Override
public void onImageAvailable(ImageReader reader) {
//createCaptureRequest();
Log.v("yo ireader ","---- "+(count2++)+" ---- ireader");
//Image mImage = imageReader.acquireLatestImage();
Image mImage = reader.acquireLatestImage();
File mFile = new File(Environment.getExternalStorageDirectory() + "/yP2PTEST/0P2Pimage.jpg");
Log.v("--yo--", "In ImageReader now writing to "+mFile);
/////////////////////////////////////
ByteBuffer buffer = mImage.getPlanes()[0].getBuffer();
byte[] bytes = new byte[buffer.remaining()];
buffer.get(bytes);
FileOutputStream output = null;
try {
output = new FileOutputStream(mFile);
output.write(bytes);
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
} finally {
mImage.close();
if (null != output) {
try {
output.close();
} catch (IOException e) {
e.printStackTrace();
}
}
}
ImageView curPic = (ImageView)findViewById(R.id.imageView1);
Bitmap mCurrentBitmap = BitmapFactory.decodeFile(mFile.getPath());
curPic.setImageBitmap(mCurrentBitmap);
}
///////////////////////////////////
};
private void createCaptureRequest() {
Log.v("-yo(3)-", "in createCaptureRequest now");
try {
CaptureRequest.Builder requestBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
requestBuilder.addTarget(imageReader.getSurface());
// Focus
requestBuilder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
// Orientation
//yo int rotation = windowManager.getDefaultDisplay().getRotation();
int rotation = this.getWindowManager().getDefaultDisplay().getRotation();
requestBuilder.set(CaptureRequest.JPEG_ORIENTATION, ORIENTATIONS.get(rotation));
// cameraCaptureSession.capture(requestBuilder.build(), camera2Callback, null);
cameraCaptureSession.capture(requestBuilder.build(), mCaptureCallback, null);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
CameraCaptureSession.CaptureCallback mCaptureCallback
= new CameraCaptureSession.CaptureCallback() {
@Override
public void onCaptureCompleted(CameraCaptureSession session, CaptureRequest request,
TotalCaptureResult result) {
//showToast("JPEG Saved : ");
//Log.v("yo save","- saved JPEG -");
//unlockFocus();
}
};
private Handler mMessageHandler = new Handler() {
@Override
public void handleMessage(Message msg) {
if (this != null) {
Toast.makeText(CameraActivity.this, (String) msg.obj, Toast.LENGTH_SHORT).show();
}
}
};
private void showToast(String text) {
// We show a Toast by sending request message to mMessageHandler. This makes sure that the
// Toast is shown on the UI thread.
Message message = Message.obtain();
message.obj = text;
mMessageHandler.sendMessage(message);
}
//------------------------------------------------------------//
public View.OnClickListener onClickPicture = new View.OnClickListener() {
public void onClick(View v) {
/*------- camera2 --------------*/
mTimer = null;
mTimer = new Timer(true);
mTimer.schedule( new TimerTask(){
@Override
public void run() {
/*------------------------*/
mHandler.post( new Runnable() {
public void run() {
createCaptureRequest();
Log.v("yo click ","---- "+(count++)+" ---- click");
}
});
}
}, 1000, 2500);//1500,1600, 1800 etc
};
};
}
Thanks in advance.
EDIT
I looked into the source of the camera2 legacy layer and found where the error message comes from.
JPEG_FRAME_TIMEOUT is currently 300 ms; I suspect that is too small and would like to increase it. If anyone knows how to do that, please let me know.
if (holder.hasJpegTargets()) {
doJpegCapture(holder);
if (!mReceivedJpeg.block(JPEG_FRAME_TIMEOUT)) {
Log.e(TAG, "Hit timeout for jpeg callback!");
mCaptureCollector.failNextJpeg();
}
}
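One way to stay inside that 300 ms window is to never have more than one JPEG outstanding: skip a timer tick if the previous JPEG callback has not fired yet. A plain-Java sketch of that pacing (a model, not the Android API; `issueRequest` stands in for the `createCaptureRequest()` call):

```java
import java.util.concurrent.Semaphore;

// Plain-Java sketch (a model, not the Android API): only issue a new still
// capture once the previous JPEG has been delivered, so at most one JPEG
// request is ever outstanding inside the legacy camera layer.
class PacedCapture {
    private final Semaphore jpegDone = new Semaphore(1); // 1 permit = previous JPEG delivered

    // Call from the Timer tick; returns true if a capture was actually issued.
    boolean tryCapture(Runnable issueRequest) {
        if (!jpegDone.tryAcquire()) {
            return false; // previous JPEG still pending: skip this tick
        }
        issueRequest.run(); // stands in for createCaptureRequest()/capture()
        return true;
    }

    // Call from onImageAvailable() / onCaptureCompleted().
    void onJpegDelivered() {
        jpegDone.release();
    }
}
```

With a 2.5 s period this normally never skips a tick; it only drops a frame instead of crashing when the device falls behind.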
I've had the same issue with a modified Google sample.
After taking two pictures I got this error, so you just need to increase the last parameter (maxImages) as much as you need, and it will work.
imageReader = ImageReader.newInstance(largest.getWidth(), largest.getHeight(), ImageFormat.JPEG, 2);
But keep in mind this note from the ImageReader.newInstance() documentation:
@param maxImages The maximum number of images the user will want to access simultaneously. This should be as small as possible to limit memory use. Once maxImages Images are obtained by the user, one of them has to be released before a new Image will become available for access through {@link #acquireLatestImage()} or {@link #acquireNextImage()}. Must be greater than 0.
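That contract can be modeled in plain Java to see why a too-small maxImages stalls delivery when images are not closed promptly (this is an illustration, not the Android ImageReader itself):

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Toy model of the maxImages contract quoted above: once maxImages images
// are held without being closed, no new image can be delivered until one
// of them is released.
class ImageBufferModel {
    private final int maxImages;
    private final Deque<byte[]> held = new ArrayDeque<>();

    ImageBufferModel(int maxImages) {
        this.maxImages = maxImages;
    }

    // Returns null when maxImages un-closed images are already held,
    // mirroring the reader refusing to deliver a new Image.
    byte[] acquire(byte[] incoming) {
        if (held.size() >= maxImages) {
            return null;
        }
        held.add(incoming);
        return incoming;
    }

    // Mirrors Image.close(): releases one slot for the next delivery.
    void close(byte[] img) {
        held.remove(img);
    }
}
```

This is why calling Image.close() in a finally block, as the question's code does, matters as much as the maxImages value itself.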
Related
I want camera preview and still capture using the camera2 API.
This code works, but there is one problem: when I click the capture button, the capture succeeds but the camera preview stops.
If I press the capture button again, the camera preview is still paused.
If I do not press the capture button after launching the app, the preview works fine.
Error log when the capture button is pressed:
java.lang.IllegalArgumentException: submitRequestList:208: Request targets Surface that is not part of current capture session
at android.hardware.camera2.CameraManager.throwAsPublicException(CameraManager.java:650)
at android.hardware.camera2.impl.ICameraDeviceUserWrapper.submitRequestList(ICameraDeviceUserWrapper.java:86)
at android.hardware.camera2.impl.CameraDeviceImpl.submitCaptureRequest(CameraDeviceImpl.java:935)
at android.hardware.camera2.impl.CameraDeviceImpl.setRepeatingRequest(CameraDeviceImpl.java:974)
at android.hardware.camera2.impl.CameraCaptureSessionImpl.setRepeatingRequest(CameraCaptureSessionImpl.java:243)
at com.bilal.androidthingscameralib.CameraHelper$4.onCaptureCompleted(CameraHelper.java:273)
at java.lang.reflect.Method.invoke(Native Method)
at android.hardware.camera2.dispatch.InvokeDispatcher.dispatch(InvokeDispatcher.java:39)
at android.hardware.camera2.dispatch.HandlerDispatcher$1.run(HandlerDispatcher.java:65)
at android.os.Handler.handleCallback(Handler.java:790)
at android.os.Handler.dispatchMessage(Handler.java:99)
at android.os.Looper.loop(Looper.java:164)
at android.app.ActivityThread.main(ActivityThread.java:6494)
at java.lang.reflect.Method.invoke(Native Method)
at com.android.internal.os.RuntimeInit$MethodAndArgsCaller.run(RuntimeInit.java:438)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:807)
Caused by: android.os.ServiceSpecificException: submitRequestList:208: Request targets Surface that is not part of current capture session (code 3)
at android.os.Parcel.readException(Parcel.java:2018)
at android.os.Parcel.readException(Parcel.java:1950)
at android.hardware.camera2.ICameraDeviceUser$Stub$Proxy.submitRequestList(ICameraDeviceUser.java:334)
at android.hardware.camera2.impl.ICameraDeviceUserWrapper.submitRequestList(ICameraDeviceUserWrapper.java:84)
at android.hardware.camera2.impl.CameraDeviceImpl.submitCaptureRequest(CameraDeviceImpl.java:935)
at android.hardware.camera2.impl.CameraDeviceImpl.setRepeatingRequest(CameraDeviceImpl.java:974)
at android.hardware.camera2.impl.CameraCaptureSessionImpl.setRepeatingRequest(CameraCaptureSessionImpl.java:243)
at com.bilal.androidthingscameralib.CameraHelper$4.onCaptureCompleted(CameraHelper.java:273)
at java.lang.reflect.Method.invoke(Native Method)
at android.hardware.camera2.dispatch.InvokeDispatcher.dispatch(InvokeDispatcher.java:39)
at android.hardware.camera2.dispatch.HandlerDispatcher$1.run(HandlerDispatcher.java:65)
at android.os.Handler.handleCallback(Handler.java:790)
at android.os.Handler.dispatchMessage(Handler.java:99)
at android.os.Looper.loop(Looper.java:164)
at android.app.ActivityThread.main(ActivityThread.java:6494)
at java.lang.reflect.Method.invoke(Native Method)
at com.android.internal.os.RuntimeInit$MethodAndArgsCaller.run(RuntimeInit.java:438)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:807)
and my source (MainActivity.class):
private InitializeCamera mInitializeCamera;
TextureView mTextureView;
ImageView imgCaptureImage;
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
Button capBtn = (Button) findViewById(R.id.capBtn); //capture button
capBtn.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
mInitializeCamera.captureImage();
}
});
}
//camera open texture
private final TextureView.SurfaceTextureListener mSurfaceTextureListener
= new TextureView.SurfaceTextureListener() {
@Override
public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
mInitializeCamera = new InitializeCamera(getApplicationContext(), mOnPictureAvailableListener, mTextureView, 640, 480, 1);
}
...
};
//picture taken
private OnPictureAvailableListener mOnPictureAvailableListener =
new OnPictureAvailableListener() {
@Override
public void onPictureAvailable(byte[] imageBytes) {
onPictureTaken(imageBytes);
}
};
//get picture
private void onPictureTaken(final byte[] imageBytes) {
if (imageBytes != null) {
Bitmap bitmap = BitmapFactory.decodeByteArray(imageBytes, 0, imageBytes.length);
imgCaptureImage.setImageBitmap(bitmap);
}
}
(InitializeCamera.class):
OnPictureAvailableListener mOnPictureAvailableListener;
private HandlerThread mCameraHandlerThread;
//Initialize camera library class
private CameraHelper mCameraHelper;
//Handler for running Camera Task in the background
private Handler mCameraHandler = new Handler();
public InitializeCamera(Context mContext, OnPictureAvailableListener mOnPictureAvailableListener, TextureView textureView, int imageHeight, int imageWidth, int maxSize) {
this.mOnPictureAvailableListener = mOnPictureAvailableListener;
//create new handler thread for camera operations.
mCameraHandlerThread = new HandlerThread("CameraBackground");
mCameraHandlerThread.start();
//Initialize Camera class.
mCameraHelper = CameraHelper.getInstance();
mCameraHelper.initializeCameraHelper(mContext, mCameraHandler, textureView, this, imageHeight, imageWidth, maxSize);
}
//capture image
public void captureImage() {
mCameraHelper.takePicture();
}
//get imagereader available
@Override
public void onImageAvailable(ImageReader reader) {
Image image = reader.acquireNextImage();
ByteBuffer imageBuf = image.getPlanes()[0].getBuffer();
final byte[] imageBytes = new byte[imageBuf.remaining()];
imageBuf.get(imageBytes); //copy the JPEG bytes out before closing the Image
image.close();
//post image bytes data to main UI Thread for displaying it in image view
mCameraHandler.post(new Runnable() {
@Override
public void run() {
mOnPictureAvailableListener.onPictureAvailable(imageBytes);
}
});
}
(CameraHelper.class):
TextureView textureView;
private CameraDevice mCameraDevice;
private CameraCaptureSession mCaptureSession;
private ImageReader mImageReader;
Size imageDimension;
private CaptureRequest.Builder captureRequestBuilder;
//Lazy-loaded singleton, so only one instance of the camera is created.
private CameraHelper() {
}
private static class InstanceHolder {
private static CameraHelper mCamera = new CameraHelper();
}
public static CameraHelper getInstance() {
return InstanceHolder.mCamera;
}
//Initialize the camera device.
public void initializeCameraHelper(Context context, Handler backgroundHandler, TextureView textureView, ImageReader.OnImageAvailableListener imageAvailableListener, int imageWidth, int imageHeight, int maxImages) {
this.textureView = textureView;
//discover the camera instance
CameraManager manager = (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
String[] camIds = {};
try {
camIds = manager.getCameraIdList();
if (camIds.length < 1) {
return;
}
String id = camIds[0];
CameraCharacteristics characteristics = manager.getCameraCharacteristics(id);
StreamConfigurationMap map = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
imageDimension = map.getOutputSizes(SurfaceTexture.class)[2];
//Initialize the image processor
mImageReader = ImageReader.newInstance(imageWidth, imageHeight, ImageFormat.JPEG, maxImages);
mImageReader.setOnImageAvailableListener(
imageAvailableListener, backgroundHandler);
//Open camera resource
manager.openCamera(id, mStateCallback, backgroundHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
/**
* Callback handling device state changes
*/
private final CameraDevice.StateCallback mStateCallback = new CameraDevice.StateCallback() {
@Override
public void onOpened(@NonNull CameraDevice cameraDevice) {
Log.d(TAG, "Opened camera.");
mCameraDevice = cameraDevice;
createCameraPreview();
}
@Override
public void onDisconnected(@NonNull CameraDevice cameraDevice) {
Log.d(TAG, "Camera disconnected, closing.");
closeCaptureSession();
cameraDevice.close();
}
@Override
public void onError(@NonNull CameraDevice cameraDevice, int i) {
Log.d(TAG, "Camera device error, closing.");
closeCaptureSession();
cameraDevice.close();
}
@Override
public void onClosed(@NonNull CameraDevice cameraDevice) {
Log.d(TAG, "Closed camera, releasing");
mCameraDevice = null;
}
};
private void createCameraPreview() {
Log.d(TAG, "createCameraPreview --" + textureView);
try {
SurfaceTexture texture = textureView.getSurfaceTexture();
texture.setDefaultBufferSize(imageDimension.getWidth(), imageDimension.getHeight());
Surface surface = new Surface(texture);
captureRequestBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
captureRequestBuilder.addTarget(surface);
mCameraDevice.createCaptureSession(Arrays.asList(surface), new CameraCaptureSession.StateCallback() {
@Override
public void onConfigured(@NonNull CameraCaptureSession session) {
if (mCameraDevice == null) {
return;
}
mCaptureSession = session;
updatePreview();
}
@Override
public void onConfigureFailed(@NonNull CameraCaptureSession session) {
}
}, null);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
//ready preview screen
private void updatePreview() {
if (mCameraDevice == null) {
return;
}
captureRequestBuilder.set(CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_MODE_AUTO);
try {
mCaptureSession.setRepeatingRequest(captureRequestBuilder.build(), null, null);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
/**
* Begin a still image capture
*/
public void takePicture() {
if (mCameraDevice == null) {
Log.w(TAG, "Cannot capture image. Camera not initialized.");
return;
}
// Here, we create a CameraCaptureSession for capturing still images.
try {
mCameraDevice.createCaptureSession(
Collections.singletonList(mImageReader.getSurface()),
mSessionCallback,
null);
} catch (CameraAccessException cae) {
Log.d(TAG, "access exception while preparing pic", cae);
}
}
/**
* Callback handling session state changes
*/
private CameraCaptureSession.StateCallback mSessionCallback =
new CameraCaptureSession.StateCallback() {
@Override
public void onConfigured(@NonNull CameraCaptureSession cameraCaptureSession) {
// The camera is already closed
if (mCameraDevice == null) {
return;
}
// When the session is ready, we start capture.
mCaptureSession = cameraCaptureSession;
triggerImageCapture();
}
@Override
public void onConfigureFailed(@NonNull CameraCaptureSession cameraCaptureSession) {
Log.w(TAG, "Failed to configure camera");
}
};
/**
* Execute a new capture request within the active session
*/
private void triggerImageCapture() {
try {
final CaptureRequest.Builder captureBuilder =
mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
captureBuilder.addTarget(mImageReader.getSurface());
captureBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON);
Log.d(TAG, "Capture request created.");
mCaptureSession.capture(captureBuilder.build(), mCaptureCallback, null);
} catch (CameraAccessException cae) {
Log.d(TAG, "camera capture exception");
}
}
/**
* Callback handling capture session events
*/
private final CameraCaptureSession.CaptureCallback mCaptureCallback =
new CameraCaptureSession.CaptureCallback() {
@Override
public void onCaptureProgressed(@NonNull CameraCaptureSession session,
@NonNull CaptureRequest request,
@NonNull CaptureResult partialResult) {
Log.d(TAG, "Partial result");
}
@Override
public void onCaptureCompleted(@NonNull CameraCaptureSession session,
@NonNull CaptureRequest request,
@NonNull TotalCaptureResult result) {
//session.close();
//mCaptureSession = null;
captureRequestBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER, CameraMetadata.CONTROL_AF_TRIGGER_CANCEL);
try {
mCaptureSession.setRepeatingRequest(captureRequestBuilder.build(), mCaptureCallback, null); //OCCUR ERROR LOG
} catch (CameraAccessException e) {
e.printStackTrace();
}
Log.d(TAG, "CaptureSession closed");
}
};
The error ("Request targets Surface that is not part of current capture session") is thrown from this part of CameraHelper.class:
private final CameraCaptureSession.CaptureCallback mCaptureCallback =
new CameraCaptureSession.CaptureCallback() {
@Override
public void onCaptureCompleted(@NonNull CameraCaptureSession session,
@NonNull CaptureRequest request,
@NonNull TotalCaptureResult result) {
//session.close();
//mCaptureSession = null;
captureRequestBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER, CameraMetadata.CONTROL_AF_TRIGGER_CANCEL);
try {
mCaptureSession.setRepeatingRequest(captureRequestBuilder.build(), mCaptureCallback, null);
} catch (CameraAccessException e) {
e.printStackTrace();
}
Log.d(TAG, "CaptureSession closed");
}
};
I think there is a problem with mCaptureCallback after the capture completes.
How can I fix this? Any advice would be appreciated.
Thanks.
You're creating a new capture session in takePicture with just the one ImageReader Surface; after that, you can no longer use the TextureView Surface as a capture request target. You have to create yet another session to resume preview.
It's better to just add the JPEG ImageReader Surface to the initial capture session, and not have to be continually tearing down sessions to take pictures (which is quite slow). For regular preview, don't target the ImageReader in the repeating request, and then for still capture, just issue a capture request that targets the ImageReader.
See the camera2basic sample for examples of this.
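The rule behind that error can be sketched in plain Java: every Surface a capture request targets must belong to the set the session was configured with. Surfaces are just strings in this toy model; the real API uses android.view.Surface:

```java
import java.util.Collection;
import java.util.HashSet;
import java.util.Set;

// Toy model of the session/request contract behind the error above
// (an illustration, not the Android classes).
class SessionModel {
    private final Set<String> configured;

    SessionModel(Collection<String> surfaces) {
        configured = new HashSet<>(surfaces);
    }

    // Mirrors submitting a CaptureRequest: every target must belong to the
    // session's configured surfaces, or the framework rejects the request.
    void submit(Collection<String> requestTargets) {
        if (!configured.containsAll(requestTargets)) {
            throw new IllegalArgumentException(
                "Request targets Surface that is not part of current capture session");
        }
    }
}
```

With one session configured for both "preview" and "jpeg", the repeating preview request and the still-capture request both pass; a JPEG-only session rejects any preview request, which is exactly the failure above.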
My application was working well on a large number of phones. However, when I installed it on my old Android phone, the following error was thrown and the application crashed while taking a photo:
Android java.lang.IllegalArgumentException: previewSize must not be taller than activeArray
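The error means the chosen preview size exceeds the sensor's active array on that device. A defensive selection can be sketched in plain Java (sizes are {width, height} int pairs here; on Android they would come from StreamConfigurationMap.getOutputSizes() and CameraCharacteristics.SENSOR_INFO_ACTIVE_ARRAY_SIZE):

```java
import java.util.List;

// Hedged sketch, not the app's actual fix: choose the largest candidate
// preview size that fits inside the sensor's active array, so the chosen
// previewSize can never be wider or taller than activeArray.
class PreviewSizePicker {
    static int[] pick(List<int[]> candidates, int activeW, int activeH) {
        int[] best = null;
        for (int[] s : candidates) {
            if (s[0] <= activeW && s[1] <= activeH
                    && (best == null || (long) s[0] * s[1] > (long) best[0] * best[1])) {
                best = s; // keep the largest size that still fits
            }
        }
        return best; // null if nothing fits; the caller should fall back safely
    }
}
```

This avoids hard-coding 640x480 or blindly taking the first entry of getOutputSizes(), either of which can be invalid on older hardware.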
Photo Capturing code:
public class Camera1 extends AppCompatActivity {
private static final String TAG = "AndroidCameraApi";
private TextureView textureView;
private static final SparseIntArray ORIENTATIONS = new SparseIntArray();
static {
ORIENTATIONS.append(Surface.ROTATION_0, 90);
ORIENTATIONS.append(Surface.ROTATION_90, 0);
ORIENTATIONS.append(Surface.ROTATION_180, 270);
ORIENTATIONS.append(Surface.ROTATION_270, 180);
}
private Bitmap scaled;
private Bitmap mBitmapToSave;
private Bitmap mBitmapToSave1;
private String cameraId;
protected CameraDevice cameraDevice;
protected CameraCaptureSession cameraCaptureSessions;
protected CaptureRequest captureRequest;
protected CaptureRequest.Builder captureRequestBuilder;
private Size imageDimension;
private ImageReader imageReader;
private File file;
private com.google.android.gms.vision.face.FaceDetector detector;
private static final int REQUEST_CAMERA_PERMISSION = 200;
private boolean mFlashSupported;
private Handler mBackgroundHandler;
private HandlerThread mBackgroundThread;
private int width = 640;
private int height = 480;
private int index;
//Image request code
private int PICK_IMAGE_REQUEST = 1;
private int a=0;
//storage permission code
private static final int STORAGE_PERMISSION_CODE = 123;
//Bitmap to get image from gallery
private Bitmap bitmap;
//Uri to store the image uri
private Uri filePath;
private String name,dl_no,truck_id, tstatus;
private float l_value;
private String dl;
private int c=0;
File fileToUpload;
int f=0;
private String uuid;
String latitude;
String longitude;
String time1;
String date1;
private int mSensorOrientation;
CameraCharacteristics characteristics;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_android_camera2_api);
setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_PORTRAIT);
textureView = (TextureView) findViewById(R.id.texture);
assert textureView != null;
textureView.setSurfaceTextureListener(textureListener);
detector = new FaceDetector.Builder(getApplicationContext())
.setMode(FaceDetector.ACCURATE_MODE)
.build();
uuid = UUID.randomUUID().toString();
Bundle extras = getIntent().getExtras();
if (extras != null) {
l_value=extras.getFloat("v");
tstatus=extras.getString("status");
name=extras.getString("name");
dl_no=extras.getString("dl");
truck_id=extras.getString("tid");
latitude=extras.getString("lat");
longitude=extras.getString("lon");
time1 =extras.getString("t");
date1=extras.getString("d");
}
fileToUpload = new File(Environment.getExternalStorageDirectory() + "/" + "/Faceapp/"+name+"_"+dl_no+"_"+truck_id+"_"+latitude+"_"+longitude+"_"+time1+"_"+date1+"_"+a+".jpg");
//Intent uplaod_intent = new Intent(Camera1.this, uploadRest.class);
// Add extras to the bundle
// Start the service
// Camera1.this.startService(uplaod_intent);
}
private int getOrientation(int rotation) {
// Sensor orientation is 90 for most devices, or 270 for some devices (e.g. Nexus 5X).
// We have to take that into account and rotate JPEG properly.
// For devices with orientation of 90, we simply return our mapping from ORIENTATIONS.
// For devices with orientation of 270, we need to rotate the JPEG 180 degrees.
return (ORIENTATIONS.get(rotation) + mSensorOrientation + 270) % 360;
}
TextureView.SurfaceTextureListener textureListener = new TextureView.SurfaceTextureListener() {
@Override
public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
//open your camera here
openCamera();
}
@Override
public void onSurfaceTextureSizeChanged(SurfaceTexture surface, int width, int height) {
// Transform you image captured size according to the surface width and height
}
@Override
public boolean onSurfaceTextureDestroyed(SurfaceTexture surface) {
return false;
}
@Override
public void onSurfaceTextureUpdated(SurfaceTexture surface) {
}
};
private final CameraDevice.StateCallback stateCallback = new CameraDevice.StateCallback() {
@Override
public void onOpened(CameraDevice camera) {
//This is called when the camera is open
Log.e(TAG, "onOpened");
cameraDevice = camera;
createCameraPreview();
}
@Override
public void onDisconnected(CameraDevice camera) {
cameraDevice.close();
}
@Override
public void onError(CameraDevice camera, int error) {
cameraDevice.close();
cameraDevice = null;
}
};
final CameraCaptureSession.CaptureCallback captureCallbackListener = new CameraCaptureSession.CaptureCallback() {
@Override
public void onCaptureCompleted(CameraCaptureSession session, CaptureRequest request, TotalCaptureResult result) {
super.onCaptureCompleted(session, request, result);
Toast.makeText(Camera1.this, "Saved:" + file, Toast.LENGTH_SHORT).show();
createCameraPreview();
}
};
protected void startBackgroundThread() {
mBackgroundThread = new HandlerThread("Camera Background");
mBackgroundThread.start();
mBackgroundHandler = new Handler(mBackgroundThread.getLooper());
}
protected void stopBackgroundThread() {
mBackgroundThread.quitSafely();
try {
mBackgroundThread.join();
mBackgroundThread = null;
mBackgroundHandler = null;
} catch (InterruptedException e) {
e.printStackTrace();
}
}
public static void goToCompletedActivity(Context mContext) {
Intent login = new Intent(mContext, Completed.class);
mContext.startActivity(login);
}
private void userLogin() {
//first getting the values
final String Driver_id = dl_no;
class UserLogin extends AsyncTask<Void, Void, String> {
ProgressBar progressBar;
@Override
protected void onPreExecute() {
super.onPreExecute();
progressBar = (ProgressBar) findViewById(R.id.progressBar);
//progressBar.setVisibility(View.VISIBLE);
}
@Override
protected void onPostExecute(String s) {
super.onPostExecute(s);
// progressBar.setVisibility(View.GONE);
try {
//converting response to json object
JSONObject obj = new JSONObject(s);
//if no error in response
if (!obj.getBoolean("error")) {
Toast.makeText(getApplicationContext(), obj.getString("message"), Toast.LENGTH_SHORT).show();
//getting the user from the response
JSONObject userJson = obj.getJSONObject("user");
//creating a new user object
User user = new User(
userJson.getString("Driver_id"),
userJson.getString("Driver_name"),
userJson.getString("Truck_id"),
userJson.getString("Trainingstatus")
);
//storing the user in shared preferences
SharedPrefManager.getInstance(getApplicationContext()).userLogin(user);
//starting the profile activity
finish();
startActivity(new Intent(getApplicationContext(), ProfileActivity.class));
} else {
Toast.makeText(getApplicationContext(), "Invalid Driver ID", Toast.LENGTH_SHORT).show();
}
} catch (JSONException e) {
e.printStackTrace();
}
}
@Override
protected String doInBackground(Void... voids) {
//creating request handler object
RequestHandler requestHandler = new RequestHandler();
//creating request parameters
HashMap<String, String> params = new HashMap<>();
params.put("Driver_id", Driver_id);
//returing the response
return requestHandler.sendPostRequest(URLs.URL_LOGIN, params);
}
}
UserLogin ul = new UserLogin();
ul.execute();
}
public void launchuploadservice() {
// Construct our Intent specifying the Service
Intent i = new Intent(this, Upload.class);
// Add extras to the bundle
i.putExtra("name", name);
i.putExtra("dl", dl_no);
i.putExtra("tid", truck_id);
i.putExtra("lat", latitude);
i.putExtra("lon", longitude);
i.putExtra("status", tstatus);
i.putExtra("t", time1);
i.putExtra("d", date1);
// Start the service
startService(i);
}
protected void takePicture() {
if (null == cameraDevice) {
Log.e(TAG, "cameraDevice is null");
return;
}
final CameraManager manager = (CameraManager) getSystemService(Context.CAMERA_SERVICE);
try {
CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraDevice.getId());
Size[] jpegSizes = null;
if (characteristics != null) {
jpegSizes = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP).getOutputSizes(ImageFormat.JPEG);
}
if (jpegSizes != null && 0 < jpegSizes.length) {
width = jpegSizes[0].getWidth();
height = jpegSizes[0].getHeight();
}
ImageReader reader = ImageReader.newInstance(width, height, ImageFormat.JPEG, 1);
List<Surface> outputSurfaces = new ArrayList<Surface>(2);
outputSurfaces.add(reader.getSurface());
outputSurfaces.add(new Surface(textureView.getSurfaceTexture()));
final CaptureRequest.Builder captureBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
captureBuilder.addTarget(reader.getSurface());
captureBuilder.set(CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_MODE_AUTO);
// Orientation
int displayRotation = this.getWindowManager().getDefaultDisplay().getRotation();
int sensorOrientation = characteristics.get(CameraCharacteristics.SENSOR_ORIENTATION);
mSensorOrientation = characteristics.get(CameraCharacteristics.SENSOR_ORIENTATION);
boolean swappedDimensions = false;
switch (displayRotation) {
case Surface.ROTATION_0:
case Surface.ROTATION_180:
if (sensorOrientation == 90 || sensorOrientation == 270) {
if (mSensorOrientation == 90 || mSensorOrientation == 270) {
swappedDimensions = true;
}
}
break;
case Surface.ROTATION_90:
case Surface.ROTATION_270:
if (sensorOrientation == 0 || sensorOrientation == 180) {
if (mSensorOrientation == 0 || mSensorOrientation == 180) {
swappedDimensions = true;
}
}
break;
}
int rotation = this.getWindowManager().getDefaultDisplay().getRotation();
captureBuilder.set(CaptureRequest.JPEG_ORIENTATION, getOrientation(rotation));
ImageReader.OnImageAvailableListener readerListener = new ImageReader.OnImageAvailableListener() {
@Override
public void onImageAvailable(ImageReader reader) {
Image image = null;
try {
image = reader.acquireLatestImage();
ByteBuffer buffer = image.getPlanes()[0].getBuffer();
byte[] bytes = new byte[buffer.capacity()];
buffer.get(bytes);
mBitmapToSave = BitmapFactory.decodeByteArray(bytes, 0, bytes.length);
if (detector.isOperational() && mBitmapToSave != null) {
Frame frame = new Frame.Builder()
.setBitmap(mBitmapToSave)
//.setImageData(buffer, width, height, YUV_420_888)
//.setRotation(getWindowManager().getDefaultDisplay().getRotation())
.build();
SparseArray<Face> faces = detector.detect(frame);
for (index = 0; index < faces.size(); ++index) {
Face face = faces.valueAt(index);
}
if (faces.size() == 0) {
Toast.makeText(Camera1.this, "No Face" + "\n", Toast.LENGTH_SHORT).show();
saveImageToDisk(bytes);
MediaPlayer mediaPlayer = MediaPlayer.create(getApplicationContext(), R.raw.not);
mediaPlayer.start();
// mBitmapToSave.recycle();
} else {
saveImageToDisk(bytes);
Toast.makeText(Camera1.this, "Face Found " + "\n", Toast.LENGTH_SHORT).show();
launchuploadservice();
}
}
} catch (Exception ee) {
Log.e(TAG, "Error processing captured image", ee);
}
finally {
if(image!=null)
image.close();
}
}
};
reader.setOnImageAvailableListener(readerListener, mBackgroundHandler);
final CameraCaptureSession.CaptureCallback captureListener = new CameraCaptureSession.CaptureCallback() {
@Override
public void onCaptureCompleted(CameraCaptureSession session, CaptureRequest request, TotalCaptureResult result) {
super.onCaptureCompleted(session, request, result);
createCameraPreview();
}
};
cameraDevice.createCaptureSession(outputSurfaces, new CameraCaptureSession.StateCallback() {
@Override
public void onConfigured(CameraCaptureSession session) {
try {
session.capture(captureBuilder.build(), captureListener, mBackgroundHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
@Override
public void onConfigureFailed(CameraCaptureSession session) {
}
}, mBackgroundHandler);
mBitmapToSave = null;
} catch(CameraAccessException e){
e.printStackTrace();
}
}
private void saveImageToDisk(final byte[] bytes) {
final File file = new File(Environment.getExternalStorageDirectory() + "/Faceapp/" + name + "_" + dl_no + "_" + truck_id + "_" + longitude + "_" + latitude + "_" + time1 + "_" + date1 + "_.jpg");
try (final OutputStream output = new FileOutputStream(file)) {
output.write(bytes);
//this.picturesTaken.put(file.getPath(), bytes);
} catch (IOException e) {
Log.e(TAG, "Exception occurred while saving picture to external storage ", e);
}
}
protected void createCameraPreview() {
try {
SurfaceTexture texture = textureView.getSurfaceTexture();
assert texture != null;
texture.setDefaultBufferSize(imageDimension.getWidth(), imageDimension.getHeight());
Surface surface = new Surface(texture);
captureRequestBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
captureRequestBuilder.addTarget(surface);
cameraDevice.createCaptureSession(Arrays.asList(surface), new CameraCaptureSession.StateCallback(){
@Override
public void onConfigured(@NonNull CameraCaptureSession cameraCaptureSession) {
//The camera is already closed
if (null == cameraDevice) {
return;
}
// When the session is ready, we start displaying the preview.
cameraCaptureSessions = cameraCaptureSession;
updatePreview();
}
@Override
public void onConfigureFailed(@NonNull CameraCaptureSession cameraCaptureSession) {
Toast.makeText(Camera1.this, "Configuration change", Toast.LENGTH_SHORT).show();
}
}, null);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
private void openCamera() {
CameraManager manager = (CameraManager) getSystemService(Context.CAMERA_SERVICE);
Log.e(TAG, "is camera open");
try {
cameraId = manager.getCameraIdList()[1];
CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraId);
StreamConfigurationMap map = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
assert map != null;
imageDimension = map.getOutputSizes(SurfaceTexture.class)[0];
// Add permission for camera and let user grant the permission
if (ActivityCompat.checkSelfPermission(this, Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED && ActivityCompat.checkSelfPermission(this, Manifest.permission.WRITE_EXTERNAL_STORAGE) != PackageManager.PERMISSION_GRANTED) {
ActivityCompat.requestPermissions(Camera1.this, new String[]{Manifest.permission.CAMERA, Manifest.permission.WRITE_EXTERNAL_STORAGE}, REQUEST_CAMERA_PERMISSION);
return;
}
manager.openCamera(cameraId, stateCallback, null);
} catch (CameraAccessException e) {
e.printStackTrace();
}
Log.e(TAG, "openCamera X");
}
protected void updatePreview() {
if (null == cameraDevice) {
Log.e(TAG, "updatePreview error, return");
return;
}
captureRequestBuilder.set(CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_MODE_AUTO);
try {
cameraCaptureSessions.setRepeatingRequest(captureRequestBuilder.build(), null, mBackgroundHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
private void closeCamera() {
if (null != cameraDevice) {
cameraDevice.close();
cameraDevice = null;
}
if (null != imageReader) {
//image.close();
imageReader.close();
imageReader = null;
}
}
@Override
public void onRequestPermissionsResult(int requestCode, String permissions[], int[] grantResults) {
if (requestCode == REQUEST_CAMERA_PERMISSION) {
if (grantResults[0] == PackageManager.PERMISSION_DENIED) {
// close the app
Toast.makeText(Camera1.this, "Sorry!!!, you can't use this app without granting permission", Toast.LENGTH_LONG).show();
finish();
}
}
}
@Override
protected void onResume() {
final Intent intent = new Intent(Camera1.this, Completed.class);
super.onResume();
Log.e(TAG, "onResume");
startBackgroundThread();
if (textureView.isAvailable()) {
openCamera();
} else {
textureView.setSurfaceTextureListener(textureListener);
}
final int PICTURES_LIMIT = 1;
final Timer timer = new Timer();
timer.schedule(new TimerTask() {
int pictureNo=0;
public void run() {
if (pictureNo>PICTURES_LIMIT){
timer.cancel();
finish();
} else {
takePicture();
pictureNo++;
}
}
},10, 5500);
}
@Override
protected void onPause() {
super.onPause();
Log.e(TAG, "onPause");
closeCamera();
stopBackgroundThread();
}
}
Based on the code found here, it's pretty clear what is happening: on the legacy device, the preview size exceeds the region the wrapper is trying to crop from.
Check the getPreviewCropRectangleUnzoomed function in the code mentioned above. Its documentation states the cause of the error specifically. From the documentation:
/**
* Calculate the effective crop rectangle for this preview viewport;
* assumes the preview is centered to the sensor and scaled to fit across one of the dimensions
* without skewing.
*
* <p>The preview size must be a subset of the active array size; the resulting
* rectangle will also be a subset of the active array rectangle.</p>
*
* <p>The unzoomed crop rectangle is calculated only.</p>
*
* @param activeArray active array dimensions, in sensor space
* @param previewSize size of the preview buffer render target, in pixels (not in sensor space)
* @return a rectangle which serves as the preview stream's effective crop region (unzoomed),
* in sensor space
*
* @throws NullPointerException
* if any of the args were {@code null}
* @throws IllegalArgumentException
* if {@code previewSize} is wider or taller than {@code activeArray}
*/
Check the part: "The preview size must be a subset of the active array size; the resulting rectangle will also be a subset of the active array rectangle." — which states that the preview size must not be larger than the actual active array size.
In this case, you might consider wrapping the picture-taking call in a try/catch block.
try {
takePicture();
} catch (IllegalArgumentException e) {
Toast.makeText(this, "Your phone is too old", Toast.LENGTH_SHORT).show();
}
Hope that helps.
I believe this may happen on very old devices with small image sensors, although it seems to me (only guessing) more like a mistake by the manufacturer when implementing the camera2 legacy wrapper.
A solution could be, when resolving the optimal preview size, to also make sure that the chosen preview size does not exceed the size specified by SENSOR_INFO_ACTIVE_ARRAY_SIZE, which can be queried from the CameraCharacteristics:
CameraCharacteristics.SENSOR_INFO_ACTIVE_ARRAY_SIZE
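A sketch of that size filter, with plain int pairs standing in for android.util.Size and for the Rect returned by SENSOR_INFO_ACTIVE_ARRAY_SIZE (the class and method names here are my own, not part of any Android API):

```java
class PreviewSizeFilter {
    /**
     * Picks the largest candidate (by area) whose width and height both fit
     * inside the active array bounds; returns null if none fits.
     * Each candidate is an int[]{width, height}.
     */
    static int[] chooseLargestFitting(int[][] candidates, int activeWidth, int activeHeight) {
        int[] best = null;
        long bestArea = -1;
        for (int[] c : candidates) {
            if (c[0] <= activeWidth && c[1] <= activeHeight) {
                long area = (long) c[0] * c[1];
                if (area > bestArea) {
                    bestArea = area;
                    best = c;
                }
            }
        }
        return best;
    }
}
```

In the real app you would feed it the sizes from getOutputSizes(SurfaceTexture.class) and the width/height of the active array rect, and fall back to your current selection if it returns null.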
I'm developing an Android camera app, using camera2 to implement the camera module. The problem is that when I take a picture with it, the image file is huge, around 9MB, so my gallery is very slow! What is the reason for this?
I tested it on different phones; the sizes differ, but they are all too large. I tried photo.compress(Bitmap.CompressFormat.JPEG, 50, out); to reduce the file size, but image quality is very important to me, so I don't want to reduce the resolution. Here is my camera code:
public class MainActivity extends AppCompatActivity {
private Size previewsize;
private Size jpegSizes[] = null;
private TextureView textureView;
private CameraDevice cameraDevice;
private CaptureRequest.Builder previewBuilder;
private CameraCaptureSession previewSession;
private static VirtualFileSystem vfs;
ImageButton getpicture;
ImageButton btnShow;
Button btnSetting;
private static final SparseIntArray ORIENTATIONS = new SparseIntArray();
static {
ORIENTATIONS.append(Surface.ROTATION_0, 90);
ORIENTATIONS.append(Surface.ROTATION_90, 0);
ORIENTATIONS.append(Surface.ROTATION_180, 270);
ORIENTATIONS.append(Surface.ROTATION_270, 180);
}
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
textureView = (TextureView) findViewById(R.id.textureview);
textureView.setSurfaceTextureListener(surfaceTextureListener);
getpicture = (ImageButton) findViewById(R.id.getpicture);
btnShow = (ImageButton) findViewById(R.id.button2);
btnSetting = (Button) findViewById(R.id.btn_setting);
btnSetting.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
startActivity(new Intent(MainActivity.this, Setting.class));
}
});
btnShow.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
startActivity(new Intent(MainActivity.this, Gallery2.class));
}
});
getpicture.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
getPicture();
}
});
}
void getPicture() {
if (cameraDevice == null) {
return;
}
CameraManager manager = (CameraManager) getSystemService(Context.CAMERA_SERVICE);
try {
CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraDevice.getId());
if (characteristics != null) {
jpegSizes = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP).getOutputSizes(ImageFormat.JPEG);
}
int width = 640, height = 480;
if (jpegSizes != null && jpegSizes.length > 0) {
width = jpegSizes[0].getWidth();
height = jpegSizes[0].getHeight();
}
ImageReader reader = ImageReader.newInstance(width, height, ImageFormat.JPEG, 1);
List<Surface> outputSurfaces = new ArrayList<Surface>(2);
outputSurfaces.add(reader.getSurface());
outputSurfaces.add(new Surface(textureView.getSurfaceTexture()));
final CaptureRequest.Builder capturebuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
capturebuilder.addTarget(reader.getSurface());
capturebuilder.set(CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_MODE_AUTO);
int rotation = getWindowManager().getDefaultDisplay().getRotation();
capturebuilder.set(CaptureRequest.JPEG_ORIENTATION, ORIENTATIONS.get(rotation));
ImageReader.OnImageAvailableListener imageAvailableListener = new ImageReader.OnImageAvailableListener() {
@Override
public void onImageAvailable(ImageReader reader) {
Image image = null;
try {
image = reader.acquireLatestImage();
ByteBuffer buffer = image.getPlanes()[0].getBuffer();
byte[] bytes = new byte[buffer.capacity()];
buffer.get(bytes);
Bitmap photo = BitmapFactory.decodeByteArray(bytes, 0, bytes.length, null);
ByteArrayOutputStream out = new ByteArrayOutputStream();
photo.compress(Bitmap.CompressFormat.JPEG, 50, out);
save(out);
photo = Bitmap.createScaledBitmap(photo, 120, 120, false);
btnShow.setImageBitmap(photo);
} catch (Exception ee) {
} finally {
if (image != null)
image.close();
}
}
void save(ByteArrayOutputStream bytes) {
File file12 = getOutputMediaFile();
OutputStream outputStream = null;
try {
outputStream = new FileOutputStream(file12);
outputStream.write(bytes.toByteArray());
} catch (Exception e) {
e.printStackTrace();
} finally {
try {
if (outputStream != null)
outputStream.close();
} catch (Exception e) {
}
}
}
};
HandlerThread handlerThread = new HandlerThread("takepicture");
handlerThread.start();
final Handler handler = new Handler(handlerThread.getLooper());
reader.setOnImageAvailableListener(imageAvailableListener, handler);
final CameraCaptureSession.CaptureCallback previewSSession = new CameraCaptureSession.CaptureCallback() {
@Override
public void onCaptureStarted(CameraCaptureSession session, CaptureRequest request, long timestamp, long frameNumber) {
super.onCaptureStarted(session, request, timestamp, frameNumber);
}
@Override
public void onCaptureCompleted(CameraCaptureSession session, CaptureRequest request, TotalCaptureResult result) {
super.onCaptureCompleted(session, request, result);
startCamera();
}
};
cameraDevice.createCaptureSession(outputSurfaces, new CameraCaptureSession.StateCallback() {
@Override
public void onConfigured(CameraCaptureSession session) {
try {
session.capture(capturebuilder.build(), previewSSession, handler);
} catch (Exception e) {
}
}
@Override
public void onConfigureFailed(CameraCaptureSession session) {
}
}, handler);
} catch (Exception e) {
}
}
public void openCamera() {
CameraManager manager = (CameraManager) getSystemService(Context.CAMERA_SERVICE);
try {
String cameraId = manager.getCameraIdList()[0];
CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraId);
StreamConfigurationMap map = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
previewsize = map.getOutputSizes(SurfaceTexture.class)[0];
if (ActivityCompat.checkSelfPermission(this, Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED) {
return;
}
manager.openCamera(cameraId, stateCallback, null);
}catch (Exception e)
{
}
}
private TextureView.SurfaceTextureListener surfaceTextureListener=new TextureView.SurfaceTextureListener() {
@Override
public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
openCamera();
}
@Override
public void onSurfaceTextureSizeChanged(SurfaceTexture surface, int width, int height) {
}
@Override
public boolean onSurfaceTextureDestroyed(SurfaceTexture surface) {
return false;
}
@Override
public void onSurfaceTextureUpdated(SurfaceTexture surface) {
}
};
private CameraDevice.StateCallback stateCallback=new CameraDevice.StateCallback() {
@Override
public void onOpened(CameraDevice camera) {
cameraDevice=camera;
startCamera();
}
@Override
public void onDisconnected(CameraDevice camera) {
}
@Override
public void onError(CameraDevice camera, int error) {
}
};
@Override
protected void onPause() {
super.onPause();
if(cameraDevice!=null)
{
cameraDevice.close();
}
}
void startCamera()
{
if(cameraDevice==null||!textureView.isAvailable()|| previewsize==null)
{
return;
}
SurfaceTexture texture=textureView.getSurfaceTexture();
if(texture==null)
{
return;
}
texture.setDefaultBufferSize(previewsize.getWidth(),previewsize.getHeight());
Surface surface=new Surface(texture);
try
{
previewBuilder=cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
}catch (Exception e)
{
}
previewBuilder.addTarget(surface);
try
{
cameraDevice.createCaptureSession(Arrays.asList(surface), new CameraCaptureSession.StateCallback() {
@Override
public void onConfigured(CameraCaptureSession session) {
previewSession=session;
getChangedPreview();
}
@Override
public void onConfigureFailed(CameraCaptureSession session) {
}
},null);
}catch (Exception e)
{
}
}
void getChangedPreview()
{
if(cameraDevice==null)
{
return;
}
previewBuilder.set(CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_MODE_AUTO);
HandlerThread thread=new HandlerThread("changed Preview");
thread.start();
Handler handler=new Handler(thread.getLooper());
try
{
previewSession.setRepeatingRequest(previewBuilder.build(), null, handler);
}catch (Exception e){}
}
@Override
public boolean onCreateOptionsMenu(Menu menu) {
// Inflate the menu; this adds items to the action bar if it is present.
getMenuInflater().inflate(R.menu.menu, menu);
return true;
}
@Override
public boolean onOptionsItemSelected(MenuItem item) {
// Handle action bar item clicks here. The action bar will
// automatically handle clicks on the Home/Up button, so long
// as you specify a parent activity in AndroidManifest.xml.
int id = item.getItemId();
//noinspection SimplifiableIfStatement
if (id == R.id.action_settings) {
return true;
}
return super.onOptionsItemSelected(item);
}
private File getOutputMediaFile() {
String timeStamp = new SimpleDateFormat("yyyyMMdd_HHmmss")
.format(new Date());
File mediaStorageDir = new File(Environment.getExternalStorageDirectory() + "/MyCamAppCipher1");
if (!mediaStorageDir.exists()) {
mediaStorageDir.mkdirs();
}
return new File(mediaStorageDir, timeStamp + ".jpg");
}
@Override
protected void onDestroy() {
super.onDestroy();
}
}
In this piece of code, you're selecting the first available resolution, which usually is the highest one.
int width = 640, height = 480;
if (jpegSizes != null && jpegSizes.length > 0) {
width = jpegSizes[0].getWidth();
height = jpegSizes[0].getHeight();
}
Aside from what you're already doing, which is reducing image quality by calling photo.compress(Bitmap.CompressFormat.JPEG, 50, out);, your alternative is to select a smaller camera resolution by iterating the jpegSizes array and picking a lower one.
Take into account that to do so you should use relative comparisons (i.e. width >= minWidth), or find a middle resolution, but always select one of the sizes available in this array. Notice that this array will vary from phone to phone (i.e. it depends on the camera characteristics).
For example, let's say that you need a minimum of 3M pixels (2048x1536). You could have the following code:
int width = Integer.MAX_VALUE, height = Integer.MAX_VALUE;
for (int i = 0; i < jpegSizes.length; i++) {
int currWidth = jpegSizes[i].getWidth();
int currHeight = jpegSizes[i].getHeight();
if ((currWidth < width && currHeight < height) && // smallest resolution so far
(currWidth >= 2048 && currHeight >= 1536)) { // at least 3M pixels
width = currWidth;
height = currHeight;
}
}
Notice that there are two conditions:
in the first line, we're looking for the smallest resolution found so far (always smaller than the previous match; notice also the initial values for width and height, Integer.MAX_VALUE, which guarantee that the first size meeting the second condition will be accepted)
in the second line, we're making sure it is at least 3M pixels
So, all combined this code will select the smallest resolution possible, at least 3M pixels.
Finally, you may add a fallback condition: if no matching resolution is found (i.e. width and height are Integer.MAX_VALUE), select the first one as you're currently doing.
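Putting the loop and the fallback together, a minimal sketch using plain int pairs instead of android.util.Size (the class and method names are illustrative only):

```java
class JpegSizePicker {
    /**
     * Returns the smallest size (each entry is int[]{width, height}) that is
     * at least minW x minH pixels; falls back to the first entry when no
     * size satisfies the minimum.
     */
    static int[] smallestAtLeast(int[][] jpegSizes, int minW, int minH) {
        int width = Integer.MAX_VALUE, height = Integer.MAX_VALUE;
        for (int[] s : jpegSizes) {
            if (s[0] < width && s[1] < height && s[0] >= minW && s[1] >= minH) {
                width = s[0];
                height = s[1];
            }
        }
        if (width == Integer.MAX_VALUE) {
            // nothing matched the minimum: fall back to the first entry
            return jpegSizes[0];
        }
        return new int[]{width, height};
    }
}
```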
I have done a camera app with camera2 too, and saving images to disk is nearly immediate, even with 12 or 15 MB picture sizes (no need to compress to 50%). When you say your gallery is slow: are you loading full-size images into the thumbnails? Are you displaying full-size images? Try loading small, downsampled bitmaps for your thumbnails and display.
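If the slowness comes from decoding full-resolution JPEGs into small thumbnail views, the usual fix is BitmapFactory.Options.inSampleSize. The power-of-two calculation is pure arithmetic and can be sketched on its own (this follows the pattern from the Android "Loading Large Bitmaps Efficiently" guide; the class name is my own):

```java
class ThumbnailMath {
    /**
     * Largest power-of-two sample size that keeps the decoded bitmap at or
     * above the requested dimensions.
     */
    static int calculateInSampleSize(int rawWidth, int rawHeight, int reqWidth, int reqHeight) {
        int inSampleSize = 1;
        if (rawHeight > reqHeight || rawWidth > reqWidth) {
            final int halfHeight = rawHeight / 2;
            final int halfWidth = rawWidth / 2;
            // Keep doubling while both halved dimensions still cover the request
            while ((halfHeight / inSampleSize) >= reqHeight
                    && (halfWidth / inSampleSize) >= reqWidth) {
                inSampleSize *= 2;
            }
        }
        return inSampleSize;
    }
}
```

You would then set the result on BitmapFactory.Options.inSampleSize before calling BitmapFactory.decodeByteArray for the thumbnail, instead of decoding and scaling the full bitmap.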
I'm trying to make an app which broadcasts video over the internet. Currently I'm using the deprecated Camera API, adding a Camera.PreviewCallback to the Camera object and then sending the byte array that arrives in the onPreviewFrame() method of Camera.PreviewCallback.
But now I want to try the new camera2 API. I'm looking at the Camera2Basic tutorial, and I think I need to make a CameraCaptureSession.CaptureCallback object to get the image byte array, something like the tutorial does:
CameraCaptureSession.CaptureCallback CaptureCallback
= new CameraCaptureSession.CaptureCallback() {
@Override
public void onCaptureCompleted(@NonNull CameraCaptureSession session,
@NonNull CaptureRequest request,
@NonNull TotalCaptureResult result) {
showToast("Saved: " + mFile);
Log.d(TAG, mFile.toString());
unlockFocus();
}
};
And then add it to the CameraCaptureSession:
mCaptureSession.capture(captureBuilder.build(), CaptureCallback, null);
The problem is that I don't know how to retrieve each image byte array from any of the parameters in onCaptureCompleted() from the CaptureCallback.
Any help?
You're kind of right: you can't get the image data from the onCaptureCompleted() method. That callback only returns metadata about the exposure, for your own bookkeeping. The actual image data gets sent to whatever Surface you indicated in the exposure's CaptureRequest.
At least I realized how to do what I wanted, from the Camera2BasicTutorial, I did the following changes to the Camera2BasicFragment class:
Modify the captureStillPicture() method to delete the stuff I determined was unnecessary for my broadcast needs; also, don't let this method stop the repeating mode:
private void captureStillPicture() {
try {
final Activity activity = getActivity();
if (null == activity || null == mCameraDevice) {
return;
}
final CaptureRequest.Builder captureBuilder =
mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
captureBuilder.addTarget(mImageReader.getSurface());
CameraCaptureSession.CaptureCallback CaptureCallback
= new CameraCaptureSession.CaptureCallback() {
@Override
public void onCaptureCompleted(@NonNull CameraCaptureSession session,
@NonNull CaptureRequest request,
@NonNull TotalCaptureResult result) {
}
};
mCaptureSession.capture(captureBuilder.build(), CaptureCallback, null);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
In the createCameraPreviewSession() method, disable the automatic flash:
// When the session is ready, we start displaying the preview.
mCaptureSession = cameraCaptureSession;
try {
// Auto focus should be continuous for camera preview.
mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_MODE,
CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
// Flash is automatically enabled when necessary.
// mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_MODE,
// CaptureRequest.CONTROL_AE_MODE_ON_AUTO_FLASH);
// Finally, we start displaying the camera preview.
mPreviewRequest = mPreviewRequestBuilder.build();
mCaptureSession.setRepeatingRequest(mPreviewRequest,
mCaptureCallback, mBackgroundHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
}
I created a boolean var to track whether an image is currently being processed, so as not to queue every frame the camera captures; and another boolean to track whether a frame is being sent over the network:
private boolean mWorking = false;
private boolean mNetworkWorking = false;
Modify the CaptureCallback object to run the captureStillPicture() method in each frame (only if there is no frame processing at the moment).
case STATE_PREVIEW: {
if (!mWorking){
Log.d(TAG, "capturing..");
mWorking = true;
mBackgroundHandler.post(new Runnable() {
@Override
public void run() {
captureStillPicture();
}
});
} else {
Log.d(TAG, "thread working, doing nothing");
}
break;
Finally, read the frame and send it; I achieved this by modifying the OnImageAvailableListener object:
private final ImageReader.OnImageAvailableListener mOnImageAvailableListener
= new ImageReader.OnImageAvailableListener() {
@Override
public void onImageAvailable(final ImageReader reader) {
// Process the image.
Image image = reader.acquireNextImage();
ByteBuffer buffer = image.getPlanes()[0].getBuffer();
final byte[] bytes = new byte[buffer.remaining()];
buffer.get(bytes);
image.close();
if (!mNetworkWorking){
Thread thread = new Thread(){
@Override
public void run(){
mNetworkWorking = true;
HttpResponse response = null;
HttpClient client = new DefaultHttpClient();
HttpPost post = new HttpPost(mBroadcastUrl);
post.setEntity(new ByteArrayEntity(bytes));
try {
response = client.execute(post);
} catch (ClientProtocolException e) {
if (BuildConfig.LOCAL_LOG)
Log.w(TAG, "ClientProtocolException: "+e.getMessage());
} catch (IOException e) {
if (BuildConfig.LOCAL_LOG)
Log.w(TAG, "IOException: "+e.getMessage());
}
mNetworkWorking = false;
}
};
thread.setName("networkThread");
thread.setPriority(Thread.MAX_PRIORITY);
thread.start();
}
mWorking = false;
}
};
That's all.
I'm working on Android and I'm trying to capture a picture without displaying any preview. I tried to simplify the process by making a class. It's working but all the pictures are really really dark.
Here is my class :
public class Cam {
private Context context;
private CameraManager manager;
private CameraDevice camera;
private CameraCaptureSession session;
private ImageReader reader;
public static String FRONT="-1";
public static String BACK="-1";
private boolean available=true;
private String filepath;
private static final String NO_CAM = "No camera found on device!";
private static final String ERR_CONFIGURE = "Failed configuring session";
private static final String ERR_OPEN = "Can't open the camera";
private static final String CAM_DISCONNECT = "Camera disconnected";
private static final String FILE_EXIST = "File already exist";
private static final SparseIntArray ORIENTATIONS = new SparseIntArray();
static {
ORIENTATIONS.append(Surface.ROTATION_0, 90);
ORIENTATIONS.append(Surface.ROTATION_90, 0);
ORIENTATIONS.append(Surface.ROTATION_180, 270);
ORIENTATIONS.append(Surface.ROTATION_270, 180);
}
public Cam(Context context) throws CameraAccessException {
this.context = context;
this.manager = (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
String ids[] = manager.getCameraIdList();
if(ids.length==2){
BACK=ids[0];
FRONT=ids[1];
}
else if(ids.length==1){
BACK=ids[0];
}
else{
available=false;
throw new CameraAccessException(-1, NO_CAM);
}
}
public void takePicture(String camId, String filepath) throws CameraAccessException {
if(available){
this.filepath=filepath;
StreamConfigurationMap map = manager.getCameraCharacteristics(camId).get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
Size largest = Collections.max(Arrays.asList(map.getOutputSizes(ImageFormat.JPEG)), new CompareSizesByArea());
reader=ImageReader.newInstance(largest.getWidth(), largest.getHeight(), ImageFormat.JPEG, 1);
reader.setOnImageAvailableListener(imageListener, null);
manager.openCamera(camId, cameraStateCallback, null);
}
else
throwError(NO_CAM);
}
private CameraDevice.StateCallback cameraStateCallback = new CameraDevice.StateCallback() {
@Override
public void onOpened(CameraDevice camera) {
Cam.this.camera=camera;
try {
camera.createCaptureSession(Collections.singletonList(reader.getSurface()), sessionStateCallback, null);
} catch (CameraAccessException e) {
throwError(e.getMessage());
}
}
@Override
public void onDisconnected(CameraDevice camera) {
throwError(CAM_DISCONNECT);
}
@Override
public void onError(CameraDevice camera, int error) {
throwError(ERR_OPEN);
}
};
private CameraCaptureSession.StateCallback sessionStateCallback = new CameraCaptureSession.StateCallback() {
@Override
public void onConfigured(CameraCaptureSession session) {
Cam.this.session=session;
try {
CaptureRequest.Builder request = camera.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
request.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
request.addTarget(reader.getSurface());
int rotation = ((WindowManager)context.getSystemService(Context.WINDOW_SERVICE)).getDefaultDisplay().getRotation();
request.set(CaptureRequest.JPEG_ORIENTATION, ORIENTATIONS.get(rotation));
session.capture(request.build(), captureCallback, null);
} catch (CameraAccessException e) {
throwError(e.getMessage());
}
}
@Override
public void onConfigureFailed(CameraCaptureSession session) {
throwError(ERR_CONFIGURE);
}
};
private CameraCaptureSession.CaptureCallback captureCallback = new CameraCaptureSession.CaptureCallback() {
@Override
public void onCaptureFailed(CameraCaptureSession session, CaptureRequest request, CaptureFailure failure) {
super.onCaptureFailed(session, request, failure);
throwError(failure.toString());
}
};
private ImageReader.OnImageAvailableListener imageListener = new ImageReader.OnImageAvailableListener() {
@Override
public void onImageAvailable(ImageReader reader) {
Image image = reader.acquireLatestImage();
try {
File file = saveImage(image);
// Send file via a listener
closeCamera();
} catch (IOException e) {
throwError(e.getMessage());
}
reader.close();
}
};
private File saveImage(Image image) throws IOException {
File file = new File(filepath);
if (file.exists()) {
throwError(FILE_EXIST);
return null;
}
else {
ByteBuffer buffer = image.getPlanes()[0].getBuffer();
byte[] bytes = new byte[buffer.remaining()];
buffer.get(bytes);
FileOutputStream output = new FileOutputStream(file);
output.write(bytes);
image.close();
output.close();
return file;
}
}
static class CompareSizesByArea implements Comparator<Size> {
@Override
public int compare(Size lhs, Size rhs) {
return Long.signum((long) lhs.getWidth() * lhs.getHeight() - (long) rhs.getWidth() * rhs.getHeight());
}
}
private void closeCamera(){
if(session!=null) {session.close();}
if(reader!=null) {reader.close();}
if(camera!=null) {camera.close();}
}
}
Then I call the Cam object in my Activity :
Cam cam = new Cam(MainActivity.this);
cam.takePicture(Cam.BACK, "/sdcard/pic.jpg");
A listener notifies the MainActivity when the picture is available, but I removed that code to keep the example short.
I don't know what I am doing wrong, the pictures are really dark. Maybe a flag or something... Any help will be appreciated.
EDIT:
Working class : https://github.com/omaflak/Android-Camera2-Library/blob/master/ezcam/src/main/java/me/aflak/ezcam/EZCam.java
Example: https://github.com/omaflak/Android-Camera2-Library/blob/master/app/src/main/java/me/aflak/libraries/MainActivity.java
If the only capture request you send to the camera is the one for the final picture, this is not surprising.
The camera automatic exposure, focus, and white balance routines generally need a second or two of streaming buffers before they converge to good results.
While you don't need to draw preview on screen, the simplest method here is to first run a repeating request targeting a dummy SurfaceTexture for a second or two, and then fire off the JPEG capture.
You could just stream the JPEG capture, but JPEG capture is slow, so you'll need a longer time for convergence (plus it's more likely a camera implementation has a bug with repeated JPEG capture and getting good exposure, than with a typical preview).
So, create a dummy SurfaceTexture with a random texture ID argument:
private SurfaceTexture mDummyPreview = new SurfaceTexture(1);
private Surface mDummySurface = new Surface(mDummyPreview);
and include the Surface in your session configuration. Once the session is configured, create a preview request that targets the dummy preview, and after N capture results have come in, submit the capture request for the JPEG you want. You'll want to experiment with N, but probably ~30 frames is enough.
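The "stream N warm-up frames, then fire the JPEG" step can be sketched as plain counting logic. The actual session calls (setRepeatingRequest for the warm-up, then stopRepeating plus capture for the JPEG) are abstracted behind a Runnable so the sketch stays framework-free; all names here are hypothetical, not from the answer:

```java
// Minimal sketch of the warm-up gate: count completed warm-up results and
// fire the JPEG capture exactly once, after the Nth frame.
class WarmupGate {
    private final int warmupFrames;
    private final Runnable fireJpegCapture; // e.g. stopRepeating() + capture(jpegRequest, ...)
    private int frameCount = 0;

    WarmupGate(int warmupFrames, Runnable fireJpegCapture) {
        this.warmupFrames = warmupFrames;
        this.fireJpegCapture = fireJpegCapture;
    }

    // Call this from CaptureCallback.onCaptureCompleted for each warm-up result.
    void onPreviewFrameCompleted() {
        if (++frameCount == warmupFrames) {
            fireJpegCapture.run(); // fires exactly once, at frame N
        }
    }
}
```

In the real callback you would construct the gate with `N = 30` (or whatever you settle on experimentally) and invoke `onPreviewFrameCompleted()` from `onCaptureCompleted` of the repeating request that targets the dummy surface.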
Note that you're still not dealing with:
- Locking AF to ensure a sharp image for your JPEG
- Running AE precapture to allow for flash metering, if you want to allow for flash use
- Having some way for the user to know what they'll be capturing; since there's no preview, they can't aim the device at anything very well.
The AF lock and precapture trigger sequences are included in Camera2Basic sample here: https://github.com/googlesamples/android-Camera2Basic, so you can take a look at what those do.
Maybe you can try turning on automatic exposure and automatic white balance:
request.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON);
request.set(CaptureRequest.CONTROL_AWB_MODE, CaptureRequest.CONTROL_AWB_MODE_AUTO);
I hope it will help :)
In my case, just configuring the FPS helped. Don't forget to set it on the CaptureRequest.Builder for the preview and ALSO on the capture CaptureRequest.Builder. Usually 10 or 15 FPS is quite enough for both photo and preview.
Capture builder
// This is the CaptureRequest.Builder that we use to take a picture.
final CaptureRequest.Builder captureBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
...
setupFPS(captureBuilder);
Preview builder:
// We set up a CaptureRequest.Builder with the output Surface.
mPreviewRequestBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
mPreviewRequestBuilder.addTarget(surface);
...
// set FPS rate
setupFPS(mPreviewRequestBuilder);
Where setupFPS:
private void setupFPS(CaptureRequest.Builder builder){
if(fpsRange != null) {
builder.set(CaptureRequest.CONTROL_AE_TARGET_FPS_RANGE, fpsRange);
}
}
And the fpsRange field is initialized with:
CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraId);
try {
Range<Integer>[] ranges = characteristics.get(CameraCharacteristics.CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES);
if(ranges != null) {
for (Range<Integer> range : ranges) {
int upper = range.getUpper();
Log.i(TAG, "[FPS Range Available]:" + range);
if (upper >= 10) {
if (fpsRange == null || upper < fpsRange.getUpper()) {
fpsRange = range;
}
}
}
}
} catch (Exception e) {
e.printStackTrace();
}
Log.i(TAG, "[FPS Range] is:" + fpsRange);
I was facing a dark-image issue for the last two days and now I have a solution for it.
You need to set up the CaptureRequest as below.
I adjusted the brightness via exposure compensation in the camera2 API.
final CaptureRequest.Builder captureBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
captureBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON);
captureBuilder.set(CaptureRequest.CONTROL_AE_LOCK, true);
captureBuilder.set(CaptureRequest.CONTROL_AE_EXPOSURE_COMPENSATION, 10);
captureBuilder.addTarget(imageReader.getSurface());
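One caveat with the snippet above: the valid range for CONTROL_AE_EXPOSURE_COMPENSATION is device dependent (it is reported by CameraCharacteristics.CONTROL_AE_COMPENSATION_RANGE), so a hardcoded 10 may fall outside the supported range on some devices. A small hypothetical helper (not part of the answer above) that clamps the requested value:

```java
// Hypothetical helper: clamp a requested EV-compensation step count to the
// device's supported [lower, upper] range before setting it on the builder.
class ExposureClamp {
    static int clamp(int requested, int lower, int upper) {
        return Math.max(lower, Math.min(upper, requested));
    }
}
```

Usage would look roughly like: query `Range<Integer> r = characteristics.get(CameraCharacteristics.CONTROL_AE_COMPENSATION_RANGE);` and then `captureBuilder.set(CaptureRequest.CONTROL_AE_EXPOSURE_COMPENSATION, ExposureClamp.clamp(10, r.getLower(), r.getUpper()));`.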
I know this is old, but I ran into a similar issue. Sharing what worked for me for anyone else who stumbles upon this. Tried all sorts of answers here with no success.
Setting CONTROL_AE_MODE to CONTROL_AE_MODE_ON (as some suggested) also did not fix it (you'd think it would).
What fixed it for me was setting the CONTROL_CAPTURE_INTENT to CONTROL_CAPTURE_INTENT_VIDEO_RECORD.
request.set(CaptureRequest.CONTROL_CAPTURE_INTENT, CaptureRequest.CONTROL_CAPTURE_INTENT_VIDEO_RECORD);
After adding this line and rebuilding, auto exposure was enabled on the camera as expected and the camera adjusted automatically.
Refer to https://developer.android.com/reference/android/hardware/camera2/CameraMetadata for more information. I used this as a guide to discover what options were available.