I am trying to get a JPEG image from both cameras in parallel on the Snapdragon 820 platform. I am not getting the first camera's image callback; I only get the second camera's JPEG callback.
Here is my code:
protected void takePictureBack() {
Log.d(TAG, "takePictureBack() called");
if (null == cameraDeviceBack) {
Log.e(TAG, "cameraDeviceBack is null");
return;
}
try {
final File file_back = new File(Environment.getExternalStorageDirectory() + "/pic_back.jpg");
final CaptureRequest.Builder captureBuilderBack = cameraDeviceBack.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
List<Surface> outputSurfaces = new ArrayList<Surface>(3);
outputSurfaces.add(new Surface(mTextureViewBack.getSurfaceTexture()));
ImageReader reader = ImageReader.newInstance(640, 480, ImageFormat.JPEG, 1);
outputSurfaces.add(reader.getSurface());
captureBuilderBack.addTarget(reader.getSurface());
ImageReader.OnImageAvailableListener readerListenerBack = new ImageReader.OnImageAvailableListener() {
@Override
public void onImageAvailable(ImageReader reader) {
Log.d(TAG, "onImageAvailable() called with: reader = [" + reader + "]");
if (reader.getImageFormat() == ImageFormat.JPEG) {
Log.d(TAG, "onImageAvailable() called with back: reader = JPEG");
Image image = null;
try {
image = reader.acquireLatestImage();
ByteBuffer buffer = image.getPlanes()[0].getBuffer();
byte[] bytes = new byte[buffer.capacity()];
buffer.get(bytes);
save(bytes);
} catch (IOException e) {
e.printStackTrace();
} finally {
if (image != null) {
image.close();
}
}
}
}
private void save(byte[] bytes) throws IOException {
OutputStream output = null;
try {
output = new FileOutputStream(file_back);
output.write(bytes);
} finally {
if (null != output) {
output.close();
}
}
}
};
reader.setOnImageAvailableListener(readerListenerBack, mBackgroundHandlerBack);
captureBuilderBack.set(CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_MODE_AUTO);
final CameraCaptureSession.CaptureCallback captureListenerBack = new CameraCaptureSession.CaptureCallback() {
@Override
public void onCaptureCompleted(CameraCaptureSession session, CaptureRequest request, TotalCaptureResult result) {
super.onCaptureCompleted(session, request, result);
if (DEBUG) Log.d(TAG, "onCaptureCompleted: take picture back successfully");
//Toast.makeText(getActivity(), "Take picture successfully", Toast.LENGTH_SHORT).show();
createCameraPreviewBack();
mCaptureResultBack = result;
}
};
cameraDeviceBack.createCaptureSession(outputSurfaces, new CameraCaptureSession.StateCallback() {
@Override
public void onConfigured(CameraCaptureSession session) {
try {
session.capture(captureBuilderBack.build(), captureListenerBack, mBackgroundHandlerBack);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
@Override
public void onConfigureFailed(CameraCaptureSession session) {
}
}, mBackgroundHandlerBack);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
The front camera capture code is the same.
Individual single-camera JPEG capture works fine.
Any idea why I am not getting both JPEG image callbacks?
I found the solution: the ImageReader reader needs to be kept as a member (global) variable.
With this change I am able to get the two JPEGs from both cameras.
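For reference, a minimal sketch of that change (the field names here are my own):
// Promote the readers from locals in takePictureBack()/takePictureFront() to fields.
// A Surface holds only a weak reference to its ImageReader, so a local reader can be
// garbage collected before the JPEG callback ever fires.
private ImageReader mReaderBack;
private ImageReader mReaderFront;

// In takePictureBack(), use the field instead of the local variable:
// mReaderBack = ImageReader.newInstance(640, 480, ImageFormat.JPEG, 1);
// outputSurfaces.add(mReaderBack.getSurface());
// captureBuilderBack.addTarget(mReaderBack.getSurface());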
Related
I was trying to use the Camera2 API of Android. The front camera works fine, but when I use the back/rear camera, this error occurs:
LegacyCameraDevice_nativeGetSurfaceId: Could not retrieve native Surface from surface.
This problem occurs after I click the button to take a picture. The capture callback reports success, but I get no image in onImageAvailable().
I followed the tutorial at https://web.archive.org/web/20161011160303/https://inducesmile.com/android/android-camera2-api-example-tutorial/ and have no idea how to proceed with this error.
Here is the code used in capturing the image:
private void takePicture() {
if (mCameraDevice == null) {
return;
}
if (android.os.Build.VERSION.SDK_INT >= android.os.Build.VERSION_CODES.LOLLIPOP) {
CameraManager mManager = (CameraManager) getSystemService(Context.CAMERA_SERVICE);
try {
CameraCharacteristics mCharacteristics = mManager.getCameraCharacteristics(mCameraDevice.getId());
Size[] jpegSizes = null;
if (mCharacteristics != null) {
jpegSizes = mCharacteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP).getOutputSizes(ImageFormat.JPEG);
}
int width = 640;
int height = 480;
// Guard against a null array before iterating, and make sure index 4 exists.
if (jpegSizes != null && jpegSizes.length > 4) {
for (int x = 0; x < jpegSizes.length; x++) {
Log.wtf("jpegSizes", String.valueOf(jpegSizes[x]));
}
width = jpegSizes[4].getWidth();
height = jpegSizes[4].getHeight();
}
final ImageReader mReader = ImageReader.newInstance(width, height, ImageFormat.JPEG, 1);
List<Surface> mOutputSurface = new ArrayList<>(2);
mOutputSurface.add(mReader.getSurface());
mOutputSurface.add(new Surface(mTextureView.getSurfaceTexture()));
final CaptureRequest.Builder mCaptureBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
mCaptureBuilder.addTarget(mReader.getSurface());
mCaptureBuilder.set(CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_MODE_AUTO);
int mRotation = getWindowManager().getDefaultDisplay().getRotation();
int jpegOrientation = (ORIENTATIONS.get(mRotation) + mSensorOrientation + 270) % 360;
if(cameraId.equals("0")) {
mCaptureBuilder.set(CaptureRequest.JPEG_ORIENTATION, ORIENTATIONS.get(mRotation));
} else {
if(extras.getString("orient").equals("landscape")) {
mCaptureBuilder.set(CaptureRequest.JPEG_ORIENTATION, ORIENTATIONS.get(mRotation));
} else {
mCaptureBuilder.set(CaptureRequest.JPEG_ORIENTATION, jpegOrientation);
}
}
final CameraCaptureSession.CaptureCallback captureListener = new CameraCaptureSession.CaptureCallback() {
@Override
public void onCaptureCompleted(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, @NonNull TotalCaptureResult result) {
super.onCaptureCompleted(session, request, result);
if(mImage == null) {
Toast.makeText(StartCameraActivity.this, "Capturing Image Failed, Please Try Again", Toast.LENGTH_SHORT).show();
Log.wtf("onCaptureComplete", "Image not Available");
} else {
Log.wtf("onCaptureComplete", "Image Available");
}
//createCameraPreview();
}
@Override
public void onCaptureFailed(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, @NonNull CaptureFailure failure) {
super.onCaptureFailed(session, request, failure);
Log.wtf("FAILED", failure.toString());
}
};
mCameraDevice.createCaptureSession(mOutputSurface, new CameraCaptureSession.StateCallback() {
@Override
public void onConfigured(@NonNull CameraCaptureSession session) {
Log.wtf("onConfigured", "succes");
try {
session.capture(mCaptureBuilder.build(), captureListener, mBackgroundHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
@Override
public void onConfigureFailed(@NonNull CameraCaptureSession session) {
Log.wtf("onConfigureFailed", "failed");
}
}, mBackgroundHandler);
mReader.setOnImageAvailableListener(new ImageReader.OnImageAvailableListener() {
@Override
public void onImageAvailable(final ImageReader reader) {
mImage = reader.acquireLatestImage();
Log.wtf("imageAvail", "OnImageAvailable");
StartCameraActivity.this.runOnUiThread(new Runnable() {
@Override
public void run() {
if (mImage == null) {
return;
}
final Image.Plane[] planes = mImage.getPlanes();
final ByteBuffer buffer = planes[0].getBuffer();
byte[] bytes = new byte[buffer.capacity()];
buffer.get(bytes);
Bitmap bitmap = BitmapFactory.decodeByteArray(bytes, 0, bytes.length);
mTextureView.setVisibility(View.INVISIBLE);
if(cameraId.equals("0")) {
screenshotHolder.setImageBitmap(bitmap);
} else {
screenshotHolder.setImageBitmap(flip(bitmap, mImage.getWidth(), mImage.getHeight()));
}
new RenderPicture(StartCameraActivity.this).execute();
if (mImage != null) {
mImage.close();
}
if(mReader != null) {
mReader.close();
}
}
});
}
}, mBackgroundHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
}
Your ImageReader is a local variable in takePicture, and doesn't look like it's stored anywhere in the parent class. It's likely being garbage collected immediately or soon after takePicture exits, so when the camera tries to set itself up, the Surface reports as being abandoned.
A Surface is like a weak reference and won't keep the ImageReader alive by itself. Store it in the parent class like you do with the camera device.
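A sketch of that change against the code above (assuming StartCameraActivity is the enclosing class):
public class StartCameraActivity extends Activity {
    // Field instead of the local "final ImageReader mReader = ..." in takePicture(),
    // so the reader stays reachable for the whole capture.
    private ImageReader mReader;

    private void takePicture() {
        mReader = ImageReader.newInstance(width, height, ImageFormat.JPEG, 1);
        // ...then use mReader.getSurface() for mOutputSurface and mCaptureBuilder
        // exactly as before, and close mReader only after the image has been handled.
    }
}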
I want to save the image only if a face is detected; currently it saves even when no face is present. I want to capture only when a face is detected, and I am using the Android face API for the detection.
My code is below. I want to save the image only when there are faces, i.e. in the else block, but according to my code, if I start the capture session inside the else block I can't access readerListener. How can I do this? (See the sketch after the code.)
final ImageReader.OnImageAvailableListener readerListener = new ImageReader.OnImageAvailableListener() {
@Override
public void onImageAvailable(ImageReader reader) {
Image image = null;
try {
image = reader.acquireLatestImage();
ByteBuffer buffer = image.getPlanes()[0].getBuffer();
byte[] bytes = new byte[buffer.capacity()];
buffer.get(bytes);
save(bytes);
mBitmapToSave1 = BitmapFactory.decodeByteArray(bytes, 0, bytes.length);
mBitmapToSave = BitmapFactory.decodeByteArray(bytes, 0, bytes.length);
Bitmap scaled = Bitmap.createScaledBitmap(mBitmapToSave, width, height, true);
int w = scaled.getWidth();
int h = scaled.getHeight();
// Setting post rotate to 90
Matrix mtx = new Matrix();
mtx.postRotate(-180);
// Rotating Bitmap
mBitmapToSave = Bitmap.createBitmap(scaled, 0, 0, w, h, mtx, true);
// mBitmapToSave = Bitmap.createBitmap(width+rowPadding/pixelStride,height, Bitmap.Config.RGB_565);
// mBitmapToSave.copyPixelsToBuffer(buffer);
if (detector.isOperational() && mBitmapToSave != null) {
Frame frame = new Frame.Builder().setBitmap(mBitmapToSave).build();
SparseArray<Face> faces = detector.detect(frame);
for (index = 0; index < faces.size(); ++index) {
Face face = faces.valueAt(index);
}if (faces.size() == 0) {
MediaPlayer mediaPlayer = MediaPlayer.create(getApplicationContext(), R.raw.not);
mediaPlayer.start();
//Toast.makeText(AndroidCamera2API.this, "Face Not detected Adjust Camera Properly", Toast.LENGTH_SHORT).show();
} else {
Toast.makeText(AndroidCamera2API.this, "Face Found " + "\n", Toast.LENGTH_SHORT).show();
}
}
}catch(FileNotFoundException e){
e.printStackTrace();
} catch(IOException e){
e.printStackTrace();
} finally{
if (image != null) {
image.close();
}
}
}
private void save(byte[] bytes) throws IOException {
OutputStream output = null;
try {
output = new FileOutputStream(file);
output.write(bytes);
} finally {
if (null != output) {
output.close();
}
}
}
};
reader.setOnImageAvailableListener(readerListener, mBackgroundHandler);
final CameraCaptureSession.CaptureCallback captureListener = new CameraCaptureSession.CaptureCallback() {
@Override
public void onCaptureCompleted(CameraCaptureSession session, CaptureRequest request, TotalCaptureResult result) {
super.onCaptureCompleted(session, request, result);
Toast.makeText(AndroidCamera2API.this, "Saved:" + file, Toast.LENGTH_SHORT).show();
uploadMultipart();
createCameraPreview();
}
};
cameraDevice.createCaptureSession(outputSurfaces, new CameraCaptureSession.StateCallback() {
@Override
public void onConfigured(CameraCaptureSession session) {
try {
session.capture(captureBuilder.build(), captureListener, mBackgroundHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
@Override
public void onConfigureFailed(CameraCaptureSession session) {
}
}, mBackgroundHandler);
mBitmapToSave = null;
} catch(CameraAccessException e){
e.printStackTrace();
}
}
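One way to restructure this (a sketch only; detector, file, and save() are as in the code above): run detection first, and call save(bytes) only in the branch where faces were found, instead of saving unconditionally at the top of onImageAvailable:
image = reader.acquireLatestImage();
ByteBuffer buffer = image.getPlanes()[0].getBuffer();
byte[] bytes = new byte[buffer.capacity()];
buffer.get(bytes);
Bitmap bmp = BitmapFactory.decodeByteArray(bytes, 0, bytes.length);
if (detector.isOperational() && bmp != null) {
    Frame frame = new Frame.Builder().setBitmap(bmp).build();
    SparseArray<Face> faces = detector.detect(frame);
    if (faces.size() > 0) {
        save(bytes); // save only when at least one face was detected
    } else {
        // no face: play the warning sound / show a hint, but do not save
    }
}
This keeps readerListener in the same scope as before; only the save(bytes) call moves.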
I am using the camera2 API. I need to take a photo in a service, without a preview. It works, but the photos have bad exposure: the pictures are very dark or sometimes very light. How can I fix my code so that the photos are well exposed? I'm using the front camera.
public class Camera2Service extends Service
{
protected static final String TAG = "myLog";
protected static final int CAMERACHOICE = CameraCharacteristics.LENS_FACING_BACK;
protected CameraDevice cameraDevice;
protected CameraCaptureSession session;
protected ImageReader imageReader;
protected CameraDevice.StateCallback cameraStateCallback = new CameraDevice.StateCallback() {
@Override
public void onOpened(@NonNull CameraDevice camera) {
Log.d(TAG, "CameraDevice.StateCallback onOpened");
cameraDevice = camera;
actOnReadyCameraDevice();
}
@Override
public void onDisconnected(@NonNull CameraDevice camera) {
Log.w(TAG, "CameraDevice.StateCallback onDisconnected");
}
@Override
public void onError(@NonNull CameraDevice camera, int error) {
Log.e(TAG, "CameraDevice.StateCallback onError " + error);
}
};
protected CameraCaptureSession.StateCallback sessionStateCallback = new CameraCaptureSession.StateCallback() {
@Override
public void onReady(CameraCaptureSession session) {
Camera2Service.this.session = session;
try {
session.setRepeatingRequest(createCaptureRequest(), null, null);
} catch (CameraAccessException e) {
Log.e(TAG, e.getMessage());
}
}
@Override
public void onConfigured(CameraCaptureSession session) {
}
@Override
public void onConfigureFailed(@NonNull CameraCaptureSession session) {
}
};
protected ImageReader.OnImageAvailableListener onImageAvailableListener = new ImageReader.OnImageAvailableListener() {
@Override
public void onImageAvailable(ImageReader reader) {
Log.d(TAG, "onImageAvailable");
Image img = reader.acquireLatestImage();
if (img != null) {
processImage(img);
img.close();
}
}
};
public void readyCamera() {
CameraManager manager = (CameraManager) getSystemService(CAMERA_SERVICE);
try {
String pickedCamera = getCamera(manager);
if (ActivityCompat.checkSelfPermission(this, Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED) {
return;
}
manager.openCamera(pickedCamera, cameraStateCallback, null);
imageReader = ImageReader.newInstance(1920, 1088, ImageFormat.JPEG, 2 /* images buffered */);
imageReader.setOnImageAvailableListener(onImageAvailableListener, null);
Log.d(TAG, "imageReader created");
} catch (CameraAccessException e){
Log.e(TAG, e.getMessage());
}
}
public String getCamera(CameraManager manager){
try {
for (String cameraId : manager.getCameraIdList()) {
CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraId);
int cOrientation = characteristics.get(CameraCharacteristics.LENS_FACING);
if (cOrientation != CAMERACHOICE) {
return cameraId;
}
}
} catch (CameraAccessException e){
e.printStackTrace();
}
return null;
}
@Override
public int onStartCommand(Intent intent, int flags, int startId) {
Log.d(TAG, "onStartCommand flags " + flags + " startId " + startId);
readyCamera();
return super.onStartCommand(intent, flags, startId);
}
@Override
public void onCreate() {
Log.d(TAG,"onCreate service");
super.onCreate();
}
public void actOnReadyCameraDevice()
{
try {
cameraDevice.createCaptureSession(Arrays.asList(imageReader.getSurface()), sessionStateCallback, null);
} catch (CameraAccessException e){
Log.e(TAG, e.getMessage());
}
}
@Override
public void onDestroy() {
try {
session.abortCaptures();
} catch (CameraAccessException e){
Log.e(TAG, e.getMessage());
}
session.close();
}
private void processImage(Image image){
//Process image data
ByteBuffer buffer;
byte[] bytes;
boolean success = false;
File file = new File(Environment.getExternalStorageDirectory() + "/Pictures/image.jpg");
FileOutputStream output = null;
if(image.getFormat() == ImageFormat.JPEG) {
buffer = image.getPlanes()[0].getBuffer();
bytes = new byte[buffer.remaining()]; // makes byte array large enough to hold image
buffer.get(bytes); // copies image from buffer to byte array
try {
output = new FileOutputStream(file);
output.write(bytes); // write the byte array to file
success = true;
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
} finally {
image.close(); // close this to free up buffer for other images
if (null != output) {
try {
output.close();
} catch (IOException e) {
e.printStackTrace();
}
}
}
}
}
protected CaptureRequest createCaptureRequest() {
try {
CaptureRequest.Builder builder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_RECORD);
builder.addTarget(imageReader.getSurface());
return builder.build();
} catch (CameraAccessException e) {
Log.e(TAG, e.getMessage());
return null;
}
}
@Override
public IBinder onBind(Intent intent) {
return null;
}
}
Sergey, I copied your code and indeed I was able to reproduce the issue. I got totally black pictures out of a Google Pixel 2 (Android 8.1).
However, I have successfully resolved the black-picture issue as follows:
First, in case anyone is wondering: you actually do NOT need an Activity or any preview UI element, as many other threads about the Camera API claim! That used to be true for the deprecated Camera v1 API. With the new Camera v2 API, all I needed was a foreground service.
To start the capturing process, I used this code:
CaptureRequest.Builder builder = cameraDevice.createCaptureRequest (CameraDevice.TEMPLATE_VIDEO_SNAPSHOT);
builder.set (CaptureRequest.CONTROL_MODE, CaptureRequest.CONTROL_MODE_AUTO);
builder.set (CaptureRequest.FLASH_MODE, CaptureRequest.FLASH_MODE_OFF);
builder.addTarget (imageReader.getSurface ());
captureRequest = builder.build ();
Then, in ImageReader.onImageAvailable, I skipped the first N pictures (meaning I did not save them). I let the session run and capture more pictures without saving them.
That gave the camera enough time to gradually adjust the exposure parameters automatically. After the N ignored frames, I saved a photo, which was normally exposed, not black at all.
The value of the N constant depends on the characteristics of your hardware, so you will need to determine the ideal value of N experimentally. You can also automate this with a histogram-based heuristic. At the beginning of your experiments, don't be afraid to start saving only after hundreds of milliseconds of calibration have passed.
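As an illustration of such a histogram-style check, here is a hypothetical helper (not part of the original code) that samples the mean luma of a decoded frame to decide whether it is neither black nor blown out:
private boolean isReasonablyExposed(Bitmap bmp) {
    long sum = 0;
    int samples = 0;
    int step = 16; // sample a sparse pixel grid to keep the check cheap
    for (int y = 0; y < bmp.getHeight(); y += step) {
        for (int x = 0; x < bmp.getWidth(); x += step) {
            int c = bmp.getPixel(x, y);
            // integer approximation of luma from RGB
            sum += (299 * Color.red(c) + 587 * Color.green(c) + 114 * Color.blue(c)) / 1000;
            samples++;
        }
    }
    long mean = sum / Math.max(1, samples);
    return mean > 30 && mean < 225; // tune the thresholds for your hardware
}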
Finally, in a lot of similar threads people suggest just waiting e.g. 500 ms after creating the session and only then taking a single picture. That does not help. One really has to let the camera run and take many pictures rapidly (at the fastest rate possible). For that, simply use the setRepeatingRequest method (as in your original code).
Hope this helps. :)
EDITED TO ADD: When skipping the initial N pictures, you need to call the acquireLatestImage method of ImageReader for each of those skipped pictures too. Otherwise, it won't work.
Full original code with my changes incorporated that resolved the issue, tested and confirmed as working on Google Pixel 2, Android 8.1:
public class Camera2Service extends Service
{
protected static final int CAMERA_CALIBRATION_DELAY = 500;
protected static final String TAG = "myLog";
protected static final int CAMERACHOICE = CameraCharacteristics.LENS_FACING_BACK;
protected static long cameraCaptureStartTime;
protected CameraDevice cameraDevice;
protected CameraCaptureSession session;
protected ImageReader imageReader;
protected CameraDevice.StateCallback cameraStateCallback = new CameraDevice.StateCallback() {
@Override
public void onOpened(@NonNull CameraDevice camera) {
Log.d(TAG, "CameraDevice.StateCallback onOpened");
cameraDevice = camera;
actOnReadyCameraDevice();
}
@Override
public void onDisconnected(@NonNull CameraDevice camera) {
Log.w(TAG, "CameraDevice.StateCallback onDisconnected");
}
@Override
public void onError(@NonNull CameraDevice camera, int error) {
Log.e(TAG, "CameraDevice.StateCallback onError " + error);
}
};
protected CameraCaptureSession.StateCallback sessionStateCallback = new CameraCaptureSession.StateCallback() {
@Override
public void onReady(CameraCaptureSession session) {
Camera2Service.this.session = session;
try {
session.setRepeatingRequest(createCaptureRequest(), null, null);
cameraCaptureStartTime = System.currentTimeMillis ();
} catch (CameraAccessException e) {
Log.e(TAG, e.getMessage());
}
}
@Override
public void onConfigured(CameraCaptureSession session) {
}
@Override
public void onConfigureFailed(@NonNull CameraCaptureSession session) {
}
};
protected ImageReader.OnImageAvailableListener onImageAvailableListener = new ImageReader.OnImageAvailableListener() {
@Override
public void onImageAvailable(ImageReader reader) {
Log.d(TAG, "onImageAvailable");
Image img = reader.acquireLatestImage();
if (img != null) {
if (System.currentTimeMillis () > cameraCaptureStartTime + CAMERA_CALIBRATION_DELAY) {
processImage(img);
}
img.close();
}
}
};
public void readyCamera() {
CameraManager manager = (CameraManager) getSystemService(CAMERA_SERVICE);
try {
String pickedCamera = getCamera(manager);
if (ActivityCompat.checkSelfPermission(this, Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED) {
return;
}
manager.openCamera(pickedCamera, cameraStateCallback, null);
imageReader = ImageReader.newInstance(1920, 1088, ImageFormat.JPEG, 2 /* images buffered */);
imageReader.setOnImageAvailableListener(onImageAvailableListener, null);
Log.d(TAG, "imageReader created");
} catch (CameraAccessException e){
Log.e(TAG, e.getMessage());
}
}
public String getCamera(CameraManager manager){
try {
for (String cameraId : manager.getCameraIdList()) {
CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraId);
int cOrientation = characteristics.get(CameraCharacteristics.LENS_FACING);
if (cOrientation == CAMERACHOICE) {
return cameraId;
}
}
} catch (CameraAccessException e){
e.printStackTrace();
}
return null;
}
@Override
public int onStartCommand(Intent intent, int flags, int startId) {
Log.d(TAG, "onStartCommand flags " + flags + " startId " + startId);
readyCamera();
return super.onStartCommand(intent, flags, startId);
}
@Override
public void onCreate() {
Log.d(TAG,"onCreate service");
super.onCreate();
}
public void actOnReadyCameraDevice()
{
try {
cameraDevice.createCaptureSession(Arrays.asList(imageReader.getSurface()), sessionStateCallback, null);
} catch (CameraAccessException e){
Log.e(TAG, e.getMessage());
}
}
@Override
public void onDestroy() {
try {
session.abortCaptures();
} catch (CameraAccessException e){
Log.e(TAG, e.getMessage());
}
session.close();
}
private void processImage(Image image){
//Process image data
ByteBuffer buffer;
byte[] bytes;
boolean success = false;
File file = new File(Environment.getExternalStorageDirectory() + "/Pictures/image.jpg");
FileOutputStream output = null;
if(image.getFormat() == ImageFormat.JPEG) {
buffer = image.getPlanes()[0].getBuffer();
bytes = new byte[buffer.remaining()]; // makes byte array large enough to hold image
buffer.get(bytes); // copies image from buffer to byte array
try {
output = new FileOutputStream(file);
output.write(bytes); // write the byte array to file
success = true;
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
} finally {
image.close(); // close this to free up buffer for other images
if (null != output) {
try {
output.close();
} catch (IOException e) {
e.printStackTrace();
}
}
}
}
}
protected CaptureRequest createCaptureRequest() {
try {
CaptureRequest.Builder builder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_RECORD);
builder.addTarget(imageReader.getSurface());
return builder.build();
} catch (CameraAccessException e) {
Log.e(TAG, e.getMessage());
return null;
}
}
@Override
public IBinder onBind(Intent intent) {
return null;
}
}
I am new to the Camera2 API. I want to build an image-processing framework on my Android phone.
Step 1: use the Camera2 API to open a camera preview stream.
Step 2: feed the preview frame data to OpenCV for processing.
Step 3: display the processed result live on the screen.
Currently I have finished Step 1 using ImageReader and C++ OpenCV code, but I don't know how to do Step 3.
How do I display the processed image on the screen? (I want to display the normal image, and overlay an icon when a predefined object is detected.)
Here are some key codes:
protected void createCameraPreview() {
try {
SurfaceTexture texture = textureView.getSurfaceTexture();
assert texture != null;
texture.setDefaultBufferSize(imageDimension.getWidth(), imageDimension.getHeight());
// Surface surface = new Surface(texture);
Surface mImageSurface = mImageReader.getSurface();
captureRequestBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
// captureRequestBuilder.addTarget(surface);
captureRequestBuilder.addTarget(mImageSurface);
cameraDevice.createCaptureSession(Arrays.asList(mImageSurface), new CameraCaptureSession.StateCallback(){
@Override
public void onConfigured(@NonNull CameraCaptureSession cameraCaptureSession) {
//The camera is already closed
if (null == cameraDevice) {
return;
}
cameraCaptureSessions = cameraCaptureSession;
updatePreview();
}
@Override
public void onConfigureFailed(@NonNull CameraCaptureSession cameraCaptureSession) {
Toast.makeText(MainActivity.this, "Configuration change", Toast.LENGTH_SHORT).show();
}
}, null);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
protected void updatePreview() {
if(null == cameraDevice) {
Log.e(TAG, "updatePreview error, return");
}
try {
cameraCaptureSessions.setRepeatingRequest(captureRequestBuilder.build(), null, mBackgroundHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
private final ImageReader.OnImageAvailableListener mOnImageAvailableListener = new ImageReader.OnImageAvailableListener() {
@Override
public void onImageAvailable(ImageReader reader) {
Log.e(TAG, "onImageAvailable: " + count++);
Image img = null;
img = reader.acquireNextImage();
try {
if (img == null) throw new NullPointerException("cannot be null");
ByteBuffer buffer = img.getPlanes()[0].getBuffer();
byte[] data = new byte[buffer.remaining()];
buffer.get(data);
int width = img.getWidth();
int height = img.getHeight();
// ****try to get captured img for display here (synchronized)
// ****try to process image for detecting the object here (asynchronized)
} catch (NullPointerException ex) {
ex.printStackTrace();
}finally {
Log.e(TAG, "in the finally! ------------");
if (img != null)
img.close();
}
}
};
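One common way to do Step 3 (a sketch, not from the question's code): because the capture session above targets only the ImageReader, the TextureView is not bound to the camera, so you can draw the processed Bitmap onto it with lockCanvas()/unlockCanvasAndPost():
// 'textureView' is the same view used in createCameraPreview(); 'bitmap' is the
// frame after your OpenCV processing, converted back to an Android Bitmap.
private void drawProcessedFrame(Bitmap bitmap) {
    Canvas canvas = textureView.lockCanvas();
    if (canvas == null) {
        return; // surface not ready yet
    }
    try {
        Rect dst = new Rect(0, 0, canvas.getWidth(), canvas.getHeight());
        canvas.drawBitmap(bitmap, null, dst, null);
        // when the predefined object is detected, overlay the icon here, e.g.:
        // canvas.drawBitmap(iconBitmap, left, top, null);
    } finally {
        textureView.unlockCanvasAndPost(canvas);
    }
}
Note that lockCanvas() only works while the TextureView's SurfaceTexture is not also being used as a camera output target, which is the case in the code above.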
I am developing a custom camera, but when I try to capture an image using the Camera2 API I get a black image. I am using the code below for the capture:
CameraManager manager = (CameraManager) getSystemService(Context.CAMERA_SERVICE);
try {
String cameraId = "";
if (cameraFront) {
cameraId = "" + findFrontFacingCamera();
} else {
cameraId = "" + findBackFacingCamera();
}
CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraId);
// CameraCharacteristics characteristics = manager.getCameraCharacteristics(mCameraDevice.getId());
Size[] jpegSizes = null;
if (characteristics != null) {
jpegSizes =
characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP).getOutputSizes(
ImageFormat.JPEG);
}
int width = 720;
int height = 640;
if (jpegSizes != null && 0 < jpegSizes.length) {
width = jpegSizes[0].getWidth();
height = jpegSizes[0].getHeight();
}
StreamConfigurationMap map = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
Size largest =
Collections.max(Arrays.asList(map.getOutputSizes(ImageFormat.JPEG)), new CompareSizesByArea());
ImageReader reader =
ImageReader.newInstance(largest.getWidth(), largest.getHeight(), ImageFormat.JPEG, /* maxImages */1);
// ImageReader reader = ImageReader.newInstance(width, height, ImageFormat.JPEG, 1);
List<Surface> outputSurfaces = new ArrayList<Surface>(2);
outputSurfaces.add(reader.getSurface());
outputSurfaces.add(new Surface(mTextureView.getSurfaceTexture()));
final CaptureRequest.Builder captureBuilder =
mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
captureBuilder.addTarget(reader.getSurface());
captureBuilder.set(CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_MODE_AUTO);
captureBuilder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
captureBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON_ALWAYS_FLASH);
// Orientation
int rotation = getWindowManager().getDefaultDisplay().getRotation();
if (cameraFront) {
captureBuilder.set(CaptureRequest.JPEG_ORIENTATION, ORIENTATIONS.get(rotation) + 180);
} else {
captureBuilder.set(CaptureRequest.JPEG_ORIENTATION, ORIENTATIONS.get(rotation));
}
final File file = getOutputMediaFile();
ImageReader.OnImageAvailableListener readerListener = new ImageReader.OnImageAvailableListener()
{
@Override
public void onImageAvailable(ImageReader reader)
{
Image image = null;
try {
image = reader.acquireLatestImage();
// ByteBuffer buffer = image.getPlanes()[0].getBuffer();
// final byte[] bytes = new byte[buffer.capacity()];
ByteBuffer buffer = image.getPlanes()[0].getBuffer();
final byte[] bytes = new byte[buffer.remaining()];
buffer.get(bytes);
save(bytes);
buffer.clear();
runOnUiThread(new Runnable()
{
@Override
public void run()
{
mThumbnail.setVisibility(View.VISIBLE);
filePathLabel.setVisibility(View.VISIBLE);
filePathValue.setText(file.getAbsolutePath());
Bitmap bmp =
UtilityMethods.getScaledBitmap(CameraImageTestActivityLoliipop.this, bytes);
mThumbnail.setImageBitmap(bmp);
}
});
} catch (FileNotFoundException e) {
AppLogger.exception(myContext, getClass().getSimpleName(), e);
// e.printStackTrace();
} catch (IOException e) {
AppLogger.exception(myContext, getClass().getSimpleName(), e);
// e.printStackTrace();
} finally {
if (image != null) {
image.close();
}
}
}
private void save(byte[] bytes) throws IOException
{
OutputStream output = null;
try {
output = new FileOutputStream(file);
output.write(bytes);
} finally {
if (null != output) {
output.close();
}
}
}
};
HandlerThread thread = new HandlerThread("CameraPicture");
thread.start();
final Handler backgroudHandler = new Handler(thread.getLooper());
reader.setOnImageAvailableListener(readerListener, backgroudHandler);
final CameraCaptureSession.CaptureCallback captureListener = new CameraCaptureSession.CaptureCallback()
{
@Override
public void onCaptureCompleted(CameraCaptureSession session, CaptureRequest request,
TotalCaptureResult result)
{
super.onCaptureCompleted(session, request, result);
startPreview();
}
};
mCameraDevice.createCaptureSession(outputSurfaces, new CameraCaptureSession.StateCallback()
{
@Override
public void onConfigured(CameraCaptureSession session)
{
try {
session.capture(captureBuilder.build(), captureListener, backgroudHandler);
} catch (CameraAccessException e) {
AppLogger.exception(myContext, getClass().getSimpleName(), e);
// e.printStackTrace();
}
}
@Override
public void onConfigureFailed(CameraCaptureSession session)
{
}
}, backgroudHandler);
} catch (CameraAccessException e) {
AppLogger.exception(myContext, getClass().getSimpleName(), e);
// e.printStackTrace();
}
and below are the methods for the camera preview:
protected void startPreview()
{
try {
if (null == mCameraDevice || !mTextureView.isAvailable() || null == mPreviewSize) {
Log.e(TAG, "startPreview fail, return");
return;
}
SurfaceTexture texture = mTextureView.getSurfaceTexture();
if (null == texture) {
Log.e(TAG, "texture is null, return");
return;
}
Log.e(TAG, "Width: " + mPreviewSize.getWidth() + " Hieght : " + mPreviewSize.getHeight());
texture.setDefaultBufferSize(mPreviewSize.getWidth(), mPreviewSize.getHeight());
Surface surface = new Surface(texture);
try {
mPreviewBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
mPreviewBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON_ALWAYS_FLASH);
} catch (CameraAccessException e) {
AppLogger.exception(myContext, getClass().getSimpleName(), e);
// e.printStackTrace();
}
mPreviewBuilder.addTarget(surface);
mCameraDevice.createCaptureSession(Arrays.asList(surface), new CameraCaptureSession.StateCallback()
{
@Override
public void onConfigured(CameraCaptureSession session)
{
mPreviewSession = session;
updatePreview();
}
@Override
public void onConfigureFailed(CameraCaptureSession session)
{
}
}, null);
} catch (CameraAccessException e) {
AppLogger.exception(myContext, getClass().getSimpleName(), e);
// e.printStackTrace();
}
}
protected void updatePreview()
{
try {
if (null == mCameraDevice) {
Log.e(TAG, "updatePreview error, return");
}
mPreviewBuilder.set(CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_MODE_AUTO);
// mPreviewBuilder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
// Flash is automatically enabled when necessary.
mPreviewBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON_ALWAYS_FLASH);
HandlerThread thread = new HandlerThread("CameraPreview");
thread.start();
Handler backgroundHandler = new Handler(thread.getLooper());
mPreviewSession.setRepeatingRequest(mPreviewBuilder.build(), null, backgroundHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
It works on all devices, but when I run it on a Micromax Q382 I get a black image, with the warning below in logcat:
I/Choreographer: Skipped 37 frames! The application may be doing too much work on its main thread.
W/ImageReader_JNI: Unable to acquire a lockedBuffer, very likely client tries to lock more than maxImages buffers
I have no idea what is happening. Please help me.
You're passing a null argument to createCaptureSession inside startPreview. Make the backgroudHandler used earlier visible to that method (pass it as a parameter or make it a class field) and use it there as well.
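A minimal sketch of that fix, using the class-field option (names as in the question's code):
// Create the handler once, e.g. when the activity starts, instead of locally:
private HandlerThread mBackgroundThread;
private Handler backgroudHandler; // the same handler the capture code already uses

private void startBackgroundThread() {
    mBackgroundThread = new HandlerThread("CameraBackground");
    mBackgroundThread.start();
    backgroudHandler = new Handler(mBackgroundThread.getLooper());
}

// ...and in startPreview(), pass it instead of null:
// mCameraDevice.createCaptureSession(Arrays.asList(surface), stateCallback, backgroudHandler);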
You could try setting a delay before starting the capture (and after opening the camera), something like:
new Handler().postDelayed(() -> {
//takePicture();
}, 500);
If you want, I've created a service that massively facilitates photos capturing with Android Camera 2 API: https://github.com/hzitoun/android-camera2-secret-picture-taker . Usage is described in the readme file.
Hope that helped!
You're setting the AE mode to 'always flash', but then you don't run the precapture sequence to allow the camera device to meter for that flash; this will likely not work very well on any device, and on some devices you may end up with some default exposure value (bad).
If you want to fire the flash, you need to use a precapture sequence first (send a single request with AE_PRECAPTURE_TRIGGER set, wait for the PRECAPTURE AE_STATE to end, then issue the capture request), on non-LEGACY devices. If the device is LEGACY-level, then your current code should be OK for those.
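For non-LEGACY devices, a sketch of that precapture sequence (mPreviewBuilder and mPreviewSession are taken from the question's code; the state flag is my own, and the structure follows Google's Camera2Basic sample):
private boolean mWaitingForPrecapture = false;

private void runPrecaptureSequence() {
    try {
        // Track AE state on the repeating preview request...
        mPreviewSession.setRepeatingRequest(mPreviewBuilder.build(), mPrecaptureCallback, null);
        // ...then send a single request with the AE precapture trigger set.
        mPreviewBuilder.set(CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER,
                CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER_START);
        mWaitingForPrecapture = true;
        mPreviewSession.capture(mPreviewBuilder.build(), mPrecaptureCallback, null);
    } catch (CameraAccessException e) {
        e.printStackTrace();
    }
}

private final CameraCaptureSession.CaptureCallback mPrecaptureCallback =
        new CameraCaptureSession.CaptureCallback() {
    @Override
    public void onCaptureCompleted(CameraCaptureSession session, CaptureRequest request,
            TotalCaptureResult result) {
        Integer aeState = result.get(CaptureResult.CONTROL_AE_STATE);
        // Once AE leaves the PRECAPTURE state, metering for the flash is done
        // and the actual still capture can be issued.
        if (mWaitingForPrecapture && aeState != null
                && aeState != CaptureResult.CONTROL_AE_STATE_PRECAPTURE) {
            mWaitingForPrecapture = false;
            // session.capture(captureBuilder.build(), captureListener, backgroudHandler);
        }
    }
};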