Burst capture at 10 FPS with preview without lag - Android

I need to build an application with an activity which captures images in burst mode at 10 FPS.
My problem is that I can capture pictures very quickly in burst mode, but my image buffer fills up before I have time to process them (I just want to put the images into an ArrayList).
So my buffer fills up and my app crashes.
The buffer size I've chosen is 50 (I cannot take too much memory).
It seems that my burst mode can capture at 30 FPS, but I can only process my images at 10 FPS.
So I want to slow down the capture, but when I do that (by waiting before repeating another capture), it slows down my preview too, and the preview lags.
I don't want to save my images; I just want to store them so I can process them later (in another activity).
Do you have any ideas, either to slow down the capture or to speed up the processing?
Code:
My code is based on the Camera2Basic sample.
private void captureStillPicture() {
    try {
        final Activity activity = getActivity();
        if (null == activity || null == mCameraDevice) {
            return;
        }
        // Disable post-processing and lock exposure/white balance to speed up capture.
        mPreviewRequestBuilder.set(CaptureRequest.EDGE_MODE, CaptureRequest.EDGE_MODE_OFF);
        mPreviewRequestBuilder.set(CaptureRequest.LENS_OPTICAL_STABILIZATION_MODE, CaptureRequest.LENS_OPTICAL_STABILIZATION_MODE_ON);
        mPreviewRequestBuilder.set(CaptureRequest.COLOR_CORRECTION_ABERRATION_MODE, CaptureRequest.COLOR_CORRECTION_ABERRATION_MODE_OFF);
        mPreviewRequestBuilder.set(CaptureRequest.NOISE_REDUCTION_MODE, CaptureRequest.NOISE_REDUCTION_MODE_OFF);
        mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER, CaptureRequest.CONTROL_AF_TRIGGER_CANCEL);
        mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_LOCK, true);
        mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AWB_LOCK, true);
        mCaptureSession.stopRepeating();
        mCaptureSession.abortCaptures();
        // Temporarily add the ImageReader as a target for the single capture.
        mPreviewRequestBuilder.addTarget(mImageReader.getSurface());
        mCaptureSession.capture(mPreviewRequestBuilder.build(), captureCallback, null);
        mPreviewRequestBuilder.removeTarget(mImageReader.getSurface());
    } catch (CameraAccessException e) {
        e.printStackTrace();
    }
}
blockCapture() is meant to wait 100 ms between captures.
private void blockCapture() {
    mPreviewRequestBuilder.removeTarget(mImageReader.getSurface());
    // Busy-wait until enough time has elapsed since the last burst.
    while (Math.abs(timerBurst - System.currentTimeMillis()) < 1000) {}
    System.out.println("ELAPSED :::::::: " + Math.abs(timerBurst - System.currentTimeMillis()));
    timerBurst = System.currentTimeMillis();
    mPreviewRequestBuilder.addTarget(mImageReader.getSurface());
    for (int i = 0; i < NB_BURST; i++) {
        captureList.add(mPreviewRequestBuilder.build());
    }
}
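Note that this busy-wait spins on whatever thread runs the capture callback, which stalls everything else scheduled on that looper; that alone can explain the preview lag. One alternative (a sketch, not from the original post; scheduleNextCapture() and the 100 ms constant are placeholders, and mBackgroundHandler is assumed to be a Handler on a background HandlerThread) is to reschedule the next capture with a Handler instead of blocking:

// Sketch: throttle burst captures to ~10 FPS without blocking the callback thread.
private static final long CAPTURE_INTERVAL_MS = 100; // ~10 FPS, placeholder value

private void scheduleNextCapture() {
    mBackgroundHandler.postDelayed(new Runnable() {
        @Override
        public void run() {
            try {
                mPreviewRequestBuilder.addTarget(mImageReader.getSurface());
                mCaptureSession.capture(mPreviewRequestBuilder.build(), captureCallback, mBackgroundHandler);
                mPreviewRequestBuilder.removeTarget(mImageReader.getSurface());
            } catch (CameraAccessException e) {
                e.printStackTrace();
            }
        }
    }, CAPTURE_INTERVAL_MS);
}

Because postDelayed returns immediately, the preview's repeating request keeps running while the next still capture is pending.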
private final CameraCaptureSession.CaptureCallback captureCallback
        = new CameraCaptureSession.CaptureCallback() {
    @Override
    public void onCaptureCompleted(@NonNull CameraCaptureSession session,
                                   @NonNull CaptureRequest request,
                                   @NonNull TotalCaptureResult result) {
        showToast("Saved: " + mFile);
        Log.d(TAG, "Saved: " + mFile);
        mCounterImageForSave++;
        mCounterBurst++;
        if (mCounterImageForSave <= 1) {
            timerBurst = System.currentTimeMillis();
        }
        if (mCounterBurst >= NB_BURST && isPlayed) {
            mCounterBurst = 0;
            try {
                blockCapture();
                mCaptureSession.capture(mPreviewRequestBuilder.build(), captureCallback, null);
            } catch (CameraAccessException e) {
                e.printStackTrace();
            }
        }
        if (mCounterBurst >= NB_BURST || !isPlayed)
            unlockFocus();
    }
};
I have tried capturing YUV_420_888 images instead of JPEG, but my images were all black, and I could not convert them.
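For reference, YUV_420_888 frames can usually be converted to JPEG on-device with android.graphics.YuvImage. A minimal sketch (not from the original post), assuming the common case where the V plane's buffer interleaves V and U with pixelStride 2 so it can be copied straight into an NV21 layout; this holds on many devices but should be verified via Plane.getPixelStride():

// Sketch: convert a YUV_420_888 Image to a JPEG byte array via NV21.
private static byte[] yuv420ToJpeg(Image image, int quality) {
    ByteBuffer yBuf = image.getPlanes()[0].getBuffer();   // Y plane
    ByteBuffer vuBuf = image.getPlanes()[2].getBuffer();  // V plane: V,U interleaved on many devices
    int width = image.getWidth();
    int height = image.getHeight();
    byte[] nv21 = new byte[width * height * 3 / 2];       // full NV21 size
    int ySize = yBuf.remaining();
    yBuf.get(nv21, 0, ySize);
    vuBuf.get(nv21, ySize, vuBuf.remaining());
    YuvImage yuv = new YuvImage(nv21, ImageFormat.NV21, width, height, null);
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    yuv.compressToJpeg(new Rect(0, 0, width, height), quality, out);
    return out.toByteArray();
}

All-black frames often mean the buffer was read as if it were JPEG (a single plane), so checking the format actually configured on the ImageReader is worth doing first.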
Sorry for any mistakes in my English.

Related

ImageReader.OnImageAvailableListener() not being called when taking a picture

I have a camera app developed against SDK 26. I have been using it happily with a Motorola G5 and G6, but when I move to a Motorola G7 the app crashes when I press the button to take a picture.
The G7 is running Android 9. I have another Android 9 phone, a Samsung S10 Plus, and the S10 Plus does not crash when I press the button to take a picture.
While debugging I noticed that the G7 doesn't call ImageReader.OnImageAvailableListener while the S10 does. Looking at the code, this is where the image is saved for use later on in CameraCaptureSession.CaptureCallback. The callback expects bytes to be populated and crashes when it isn't (I haven't included the stack trace because it's not very helpful, but I can if you'd like to see it).
I can get the G7 to save the image if I step through slowly in the debugger, on some occasions.
So I have a button that calls the function onImageCaptureClick(). Inside, it does a bunch of things, one of which is creating an ImageReader.OnImageAvailableListener. The OnImageAvailableListener saves the image and populates a variable, bytes, from the image buffer. This listener is attached to my reader using reader.setOnImageAvailableListener(readerListener, null), but it is never called. When I get into the CaptureCallback, the class variable bytes is not populated and the app crashes.
Do you have any idea where I would look to solve this?
protected void onImageCaptureClick() {
    if (null == mCameraDevice) {
        logger.debug("null == mCameraDevice");
        Log.e(TAG, "cameraDevice is null");
        return;
    }
    CameraManager manager = (CameraManager) getSystemService(Context.CAMERA_SERVICE);
    try {
        CameraCharacteristics characteristics = manager.getCameraCharacteristics(mCameraDevice.getId());
        Size[] jpegSizes = null;
        if (characteristics != null) {
            jpegSizes = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP).getOutputSizes(ImageFormat.JPEG);
        }
        int width = 640;
        int height = 480;
        if (jpegSizes != null && 0 < jpegSizes.length) {
            width = jpegSizes[0].getWidth();
            height = jpegSizes[0].getHeight();
        }
        ImageReader reader = ImageReader.newInstance(width, height, ImageFormat.JPEG, 1);
        List<Surface> outputSurfaces = new ArrayList<>(2);
        outputSurfaces.add(reader.getSurface());
        outputSurfaces.add(new Surface(mTextureView.getSurfaceTexture()));
        final CaptureRequest.Builder captureBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
        captureBuilder.addTarget(reader.getSurface());
        if (mFlashMode == FLASH_MODE_OFF) {
            captureBuilder.set(CaptureRequest.FLASH_MODE, CaptureRequest.FLASH_MODE_OFF);
            logger.debug("FLASH OFF");
        }
        if (mFlashMode == CONTROL_AE_MODE_ON) {
            captureBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON);
            captureBuilder.set(CaptureRequest.FLASH_MODE, CaptureRequest.FLASH_MODE_TORCH);
            logger.debug("FLASH ON");
        }
        if (mFlashMode == CONTROL_AE_MODE_ON_AUTO_FLASH) {
            captureBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON_AUTO_FLASH);
            captureBuilder.set(CaptureRequest.FLASH_MODE, CaptureRequest.FLASH_MODE_OFF);
            logger.debug("FLASH AUTO");
        }
        captureBuilder.set(CaptureRequest.SCALER_CROP_REGION, zoom);
        int rotation = getWindowManager().getDefaultDisplay().getRotation();
        captureBuilder.set(CaptureRequest.JPEG_ORIENTATION, ORIENTATIONS.get(rotation));
        final File file = new File(_pictureUri.getPath());
        logger.debug("OnImageCaptureClick: _pictureUri is: " + _pictureUri.getPath());
        // ************************************
        // this listener is not used on the G7,
        // and so the image isn't saved.
        // ************************************
        ImageReader.OnImageAvailableListener readerListener = new ImageReader.OnImageAvailableListener() {
            @Override
            public void onImageAvailable(ImageReader reader) {
                Image image = null;
                try {
                    image = reader.acquireLatestImage();
                    ByteBuffer buffer = image.getPlanes()[0].getBuffer();
                    bytes = new byte[buffer.capacity()];
                    buffer.get(bytes);
                    logger.debug("onImageCaptureClick, the filesize to save is: " + bytes.length);
                    save();
                } catch (FileNotFoundException e) {
                    e.printStackTrace();
                } catch (IOException e) {
                    e.printStackTrace();
                } finally {
                    if (image != null) {
                        image.close();
                    }
                }
            }

            private void save() throws IOException {
                OutputStream output = null;
                try {
                    output = new FileOutputStream(file);
                    output.write(bytes);
                } finally {
                    if (null != output) {
                        output.close();
                    }
                }
            }
        };
        // ********************************************************
        // the reader sets the listener here but it is never called
        // and when I get in to the CaptureCallback the BitmapUtils
        // expects bytes to be populated and crashes the app
        // ********************************************************
        reader.setOnImageAvailableListener(readerListener, null);
        final CameraCaptureSession.CaptureCallback captureListener = new CameraCaptureSession.CaptureCallback() {
            @Override
            public void onCaptureCompleted(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, @NonNull TotalCaptureResult result) {
                super.onCaptureCompleted(session, request, result);
                try {
                    BitmapUtils.addTimeStampAndRotate(_pictureUri, bytes);
                    Intent intent = new Intent(CameraActivity.this, CameraReviewPhotoActivity.class);
                    intent.putExtra(MediaStore.EXTRA_OUTPUT, _pictureUri);
                    startActivityForResult(intent, CameraActivity.kRequest_Code_Approve_Image);
                } catch (IOException e) {
                    e.printStackTrace();
                } catch (ImageReadException e) {
                    e.printStackTrace();
                } catch (ImageWriteException e) {
                    e.printStackTrace();
                }
            }
        };
        mCameraDevice.createCaptureSession(outputSurfaces, new CameraCaptureSession.StateCallback() {
            @Override
            public void onConfigured(CameraCaptureSession session) {
                try {
                    session.capture(captureBuilder.build(), captureListener, null);
                } catch (CameraAccessException e) {
                    e.printStackTrace();
                }
            }

            @Override
            public void onConfigureFailed(CameraCaptureSession session) {
                Log.w(TAG, "Failed to configure camera");
            }
        }, null);
    } catch (CameraAccessException e) {
        e.printStackTrace();
    } finally {
        takePictureButton.setEnabled(false);
        mTextureView.setEnabled(false);
    }
}
The API makes no guarantee about the order of onCaptureCompleted and OnImageAvailableListener. They may arrive in arbitrary order, depending on the device, capture settings, the load on the device, or even the particular OS build you have.
Please don't make any assumptions about it.
Instead, if you need both callbacks to fire before you process something, then wait for both to happen before you move forward. For example, check if the other callback has fired in each callback, and if so, call the method to do the processing.
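For example, a minimal sketch of that idea (the flag names, the processImage() helper, and the use of java.util.concurrent.atomic.AtomicBoolean are hypothetical, not from this app):

// Sketch: proceed only after BOTH onCaptureCompleted and onImageAvailable have
// fired, regardless of which one the device delivers first.
private final AtomicBoolean mCaptureCompleted = new AtomicBoolean(false);
private final AtomicBoolean mImageSaved = new AtomicBoolean(false);

private void maybeFinish() {
    if (mCaptureCompleted.get() && mImageSaved.get()) {
        processImage(); // hypothetical: the work that needs both the bytes and the result
    }
}

// In onImageAvailable(): save the bytes, then
//     mImageSaved.set(true); maybeFinish();
// In onCaptureCompleted(): record the result, then
//     mCaptureCompleted.set(true); maybeFinish();

With this structure the processing runs exactly once, whichever callback arrives last, so the G7's ordering no longer matters.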
I think I have found the solution to this.
I have the following phones:
Samsung S10 Plus
Motorola G7
Motorola G6
My app works on the S10 and the G6.
The S10 and G6 both call the OnImageAvailableListener function before the onCaptureCompleted callback. The G7, however, calls them the other way around: onCaptureCompleted, then OnImageAvailableListener.
According to https://proandroiddev.com/understanding-camera2-api-from-callbacks-part-1-5d348de65950 the correct way is onCaptureCompleted then OnImageAvailableListener.
In my code I am assuming that OnImageAvailableListener has saved the image by the time onCaptureCompleted tries to manipulate it, which causes the crash.
Looking at the INFO_SUPPORTED_HARDWARE_LEVEL of each device, I have the following levels of support, from none (level 0) to uber (level 3):
Samsung S10 Plus reports device support level 1
Motorola G7 reports device support level 3
Motorola G6 reports device support level 2
My assumption at this point is that the events fire in a different order when a device supports the camera2 API at level 3 compared to other levels.
Hope this helps
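For anyone wanting to check this on their own devices, a quick way to read the hardware level (a sketch; `manager` is assumed to be a CameraManager). Note that the raw constants are not ordered by capability: LIMITED = 0, FULL = 1, LEGACY = 2, LEVEL_3 = 3.

// Sketch: log INFO_SUPPORTED_HARDWARE_LEVEL for every camera on the device.
try {
    for (String id : manager.getCameraIdList()) {
        CameraCharacteristics c = manager.getCameraCharacteristics(id);
        Integer level = c.get(CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL);
        Log.d(TAG, "Camera " + id + " INFO_SUPPORTED_HARDWARE_LEVEL = " + level);
    }
} catch (CameraAccessException e) {
    e.printStackTrace();
}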

Size not in valid set when recording without preview

I am trying to record video on a Vivo X20 (Android 7.1.1) using the camera2 API, without a preview and without recording sound (strictly recording HD video only).
I'm currently stuck because I cannot figure out how to successfully call MediaRecorder.setVideoSize() and record a video in HD. Currently, when I run the app, the log shows the error: Surface with size (w=1920, h=1080) and format 0x1 is not valid, size not in valid set: [1440x1080, 1280x960, 1280x720, 960x540, 800x480, 720x480, 768x432, 640x480, 384x288, 352x288, 320x240, 176x144]
The phone's stock camera app can record video up to 4K, so I'm definitely missing something here. There are a total of two camera devices identified by CameraManager. When I use getOutputFormats() from CameraCharacteristics, it shows the same valid set of resolutions for both cameras, and it is the same range as in the above error message.
The below is the code I am using to initialize MediaRecorder and initiate a capture session:
public void StartRecordingVideo() {
    Initialize();
    recordingVideo = true;
    cameraManager = (CameraManager) this.getSystemService(Context.CAMERA_SERVICE);
    try {
        if (ActivityCompat.checkSelfPermission(this, Manifest.permission.CAMERA) == PackageManager.PERMISSION_GRANTED) {
            String[] cameraIDs = cameraManager.getCameraIdList();
            //LogAllCameraInfo();
            if (cameraIDs != null) {
                for (int x = 0; x < cameraIDs.length; x++) {
                    Log.d(LOG_ID, "ID: " + cameraIDs[x]);
                }
            }
            cameraManager.openCamera(deviceCameraID, cameraStateCallback, handler);
            Log.d(LOG_ID, "Successfully opened camera");
        } else {
            throw new IllegalAccessException();
        }
    } catch (Exception e) {
        recordingVideo = false;
        Log.e(LOG_ID, "Error during record video start: " + e.getMessage());
    }
}
private void Initialize() {
    videoRecordThread = new HandlerThread("video_capture");
    videoRecordThread.start();
    handler = new Handler(videoRecordThread.getLooper());
    try {
        vidRecorder = new MediaRecorder();
        vidRecorder.setVideoSource(MediaRecorder.VideoSource.SURFACE);
        vidRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
        vidRecorder.setVideoFrameRate(30);
        vidRecorder.setCaptureRate(30);
        vidRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.DEFAULT);
        vidRecorder.setVideoEncodingBitRate(10000000);
        vidRecorder.setVideoSize(1920, 1080);
        String videoFilename = Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_DOWNLOADS) + File.separator + System.currentTimeMillis() + ".mp4";
        vidRecorder.setOutputFile(videoFilename);
        Log.d(LOG_ID, "Starting video: " + videoFilename);
        vidRecorder.prepare();
    } catch (Exception e) {
        Log.e(LOG_ID, "Error during Initialize: " + e.getMessage());
    }
}
And the onReady/onSurfacePrepared/Camera onOpened callbacks:
@Override
public void onReady(CameraCaptureSession session) {
    Log.d(LOG_ID, "onReady: ");
    super.onReady(session);
    try {
        CaptureRequest.Builder builder = deviceCamera.createCaptureRequest(CameraDevice.TEMPLATE_RECORD);
        builder.addTarget(vidRecorder.getSurface());
        CaptureRequest request = builder.build();
        session.setRepeatingRequest(request, null, handler);
        vidRecorder.start();
    } catch (CameraAccessException e) {
        Log.d(LOG_ID, "Error on Ready: " + e.getMessage());
    }
}

@Override
public void onSurfacePrepared(CameraCaptureSession session, Surface surface) {
    Log.d(LOG_ID, "onSurfacePrepared: ");
    super.onSurfacePrepared(session, surface);
}

@Override
public void onOpened(CameraDevice camera) {
    Log.d(LOG_ID, "onOpened: ");
    deviceCamera = camera;
    try {
        camera.createCaptureSession(Arrays.asList(vidRecorder.getSurface()), recordSessionStateCallback, handler);
    } catch (CameraAccessException e) {
        Log.d(LOG_ID, "onOpened: " + e.getMessage());
    }
}
I've tried messing with the order of calls and the output format/encoder with no luck. I am sure that I have all the required permissions. Thanks in advance for your time!
This device most likely supports camera2 at the LEGACY level; check what the output of INFO_SUPPORTED_HARDWARE_LEVEL is.
LEGACY devices are effectively running camera2 on top of the legacy android.hardware.Camera API (more complex than that, but roughly true); as a result, their capabilities via camera2 are restricted.
The maximum recording resolution is one significant problem; android.hardware.Camera records videos via a magic path that the LEGACY mapping layer cannot directly use (there's no Surface involved). As a result, camera2 LEGACY can only record at the maximum preview resolution supported by android.hardware.Camera, not at the maximum recording resolution.
Sounds like this device has no support for 1:1 1080p preview, which is pretty unusual for a device launched so recently.
You can verify whether the set of supported preview sizes in the deprecated Camera API matches the list in your error; if it doesn't, then there may be an OS bug in generating the list, so it'd be good to know.
But in general, you can't request sizes that aren't enumerated in the camera's CameraCharacteristics StreamConfigurationMap, no matter what the feature list on the side of the box says. Sometimes the OEM camera app has magic hooks to enable features that normal apps can't get to, often because the feature only works in some very specific set of circumstances which normal apps wouldn't know how to replicate.
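A quick way to check both points (a sketch, using the cameraManager and deviceCameraID fields from the question):

// Sketch: check whether this device is LEGACY, and which sizes are actually
// valid for a MediaRecorder surface.
try {
    CameraCharacteristics chars = cameraManager.getCameraCharacteristics(deviceCameraID);
    Integer level = chars.get(CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL);
    boolean isLegacy = level != null
            && level == CameraMetadata.INFO_SUPPORTED_HARDWARE_LEVEL_LEGACY;
    Log.d(LOG_ID, "Hardware level: " + level + " (LEGACY: " + isLegacy + ")");

    StreamConfigurationMap map = chars.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
    for (Size s : map.getOutputSizes(MediaRecorder.class)) {
        Log.d(LOG_ID, "Valid MediaRecorder size: " + s);
    }
} catch (CameraAccessException e) {
    e.printStackTrace();
}

getOutputSizes(MediaRecorder.class) returns the sizes the camera can actually deliver to a recorder surface, which is the list setVideoSize() must be chosen from.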

Android Camera2 Flash Timing Issues

I'm working on an Android camera app that needs to use a fixed (manual) focus, and always uses the flash. I'm having some issues that seem to be with the flash timing. The flash always fires and an image is always acquired, but sometimes the flash doesn't actually illuminate the captured frame. Some frames have flash, some are overexposed, and some are dark; basically it's inconsistent and unpredictable.
I based my code off the Camera2Basic example. I think I've shown all the relevant parts here:
My preview request builder has the following setup:
mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_OFF);
float minimumLens = mCameraCharacteristics.get(CameraCharacteristics.LENS_INFO_MINIMUM_FOCUS_DISTANCE);
mPreviewRequestBuilder.set(CaptureRequest.LENS_FOCUS_DISTANCE, minimumLens);
mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AWB_MODE, CaptureRequest.CONTROL_AWB_MODE_OFF);
mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON_ALWAYS_FLASH);
mPreviewRequestBuilder.set(CaptureRequest.FLASH_MODE, CameraMetadata.FLASH_MODE_OFF);
Then the actual sequence that acquires the pictures (almost straight from Camera2Basic) is:
private void takePicture() {
    runPrecaptureSequence();
}

private CameraCaptureSession.CaptureCallback mCaptureCallback = new CameraCaptureSession.CaptureCallback() {
    private void process(CaptureResult result) {
        switch (mState) {
            case STATE_PREVIEW: {
                break;
            }
            case STATE_WAITING_PRECAPTURE: {
                Integer aeState = result.get(CaptureResult.CONTROL_AE_STATE);
                if (aeState == null ||
                        aeState == CaptureResult.CONTROL_AE_STATE_PRECAPTURE ||
                        aeState == CaptureResult.CONTROL_AE_STATE_FLASH_REQUIRED) {
                    mState = STATE_CAPTURE;
                }
                break;
            }
            case STATE_CAPTURE: {
                // CONTROL_AE_STATE can be null on some devices
                Integer aeState = result.get(CaptureResult.CONTROL_AE_STATE);
                if (aeState == null || aeState != CaptureResult.CONTROL_AE_STATE_PRECAPTURE) {
                    mState = STATE_PICTURE_TAKEN;
                    captureStillPicture();
                }
                break;
            }
        }
    }

    @Override
    public void onCaptureProgressed(@NonNull CameraCaptureSession session,
                                    @NonNull CaptureRequest request,
                                    @NonNull CaptureResult partialResult) {
        process(partialResult);
    }

    @Override
    public void onCaptureCompleted(@NonNull CameraCaptureSession session,
                                   @NonNull CaptureRequest request,
                                   @NonNull TotalCaptureResult result) {
        process(result);
    }
};
private void runPrecaptureSequence() {
    try {
        mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER, CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER_START);
        mState = STATE_WAITING_PRECAPTURE;
        mCaptureSession.capture(mPreviewRequestBuilder.build(), mCaptureCallback, mBackgroundHandler);
    } catch (CameraAccessException e) {
        e.printStackTrace();
    }
}
private void captureStillPicture() {
    try {
        final Activity activity = getActivity();
        if (null == activity || null == mCameraDevice) {
            return;
        }
        final CaptureRequest.Builder captureBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
        captureBuilder.addTarget(mImageReader.getSurface());
        captureBuilder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_OFF);
        captureBuilder.set(CaptureRequest.CONTROL_AWB_MODE, CaptureRequest.CONTROL_AWB_MODE_OFF);
        float minimumLens = mCameraCharacteristics.get(CameraCharacteristics.LENS_INFO_MINIMUM_FOCUS_DISTANCE);
        captureBuilder.set(CaptureRequest.LENS_FOCUS_DISTANCE, minimumLens);
        captureBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON_ALWAYS_FLASH);
        captureBuilder.set(CaptureRequest.FLASH_MODE, CameraMetadata.FLASH_MODE_OFF);
        int rotation = activity.getWindowManager().getDefaultDisplay().getRotation();
        captureBuilder.set(CaptureRequest.JPEG_ORIENTATION, getOrientation(rotation));
        mFileName = getFileNameFromTime() + ".jpg";
        CameraCaptureSession.CaptureCallback captureCallback
                = new CameraCaptureSession.CaptureCallback() {
            @Override
            public void onCaptureCompleted(@NonNull CameraCaptureSession session,
                                           @NonNull CaptureRequest request,
                                           @NonNull TotalCaptureResult result) {
                resumePreview();
            }
        };
        mCaptureSession.stopRepeating();
        mCaptureSession.capture(captureBuilder.build(), captureCallback, null);
    } catch (CameraAccessException e) {
        e.printStackTrace();
    }
}
Saving the image takes place in the ImageReader onImageAvailableListener call and works just fine.
It does seem like the flash is firing before the image is acquired, so I tried the suggestion in this answer, but the suggested FLASH_FIRED conditional never triggered.
Can anybody with better familiarity with Camera2 than me see where I'm screwing up?
I have seen the code of Camera2Basic (https://github.com/googlearchive/android-Camera2Basic) and its newer Java version (https://github.com/android/camera-samples/tree/master/Camera2BasicJava).
Basically there are three places in the code where flash is set (I am talking about the newer Java version):
1) private void createCameraPreviewSession()
2) private void captureStillPicture()
3) private void unlockFocus()
1) createCameraPreviewSession() gets called when the preview is about to be displayed (so if you want flash from the moment the app starts, call this function accordingly).
2) captureStillPicture() gets called when it is going to capture an HD picture.
3) unlockFocus() gets called when you have captured the image and want to unlock focus after clicking an HD picture.
So if the flash blinks or the image gets washed out, you may be setting flash mode TORCH in only one of the last two functions.
Set the flash mode consistently in the last two functions, and try to discard the first frame captured just after starting the flash.
I was setting flash like this:
int flashFlag = 1;

private void setAutoFlash(CaptureRequest.Builder requestBuilder) {
    if (mFlashSupported) {
        if (flashFlag == 1) {
            requestBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON);
            requestBuilder.set(CaptureRequest.FLASH_MODE, CaptureRequest.FLASH_MODE_TORCH);
        } else {
            requestBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON);
        }
    }
}
I made flashFlag a global variable, set it in both places together, and rejected the first flash HD frame because it usually gets washed out.
Also, do not try to turn on the flash without capturing an HD picture, because it will lock you in focus.
Add the following three lines to the Camera2Basic Sample:
mPreviewRequestBuilder.set(CaptureRequest.CONTROL_MODE, CaptureRequest.CONTROL_MODE_OFF);
mPreviewRequestBuilder.set(CaptureRequest.SENSOR_SENSITIVITY, 4000);
mPreviewRequestBuilder.set(CaptureRequest.FLASH_MODE, FLASH_MODE_TORCH);
When flash is activated and a repeating request is started, the manual settings are overwritten by some kind of default settings (sensor sensitivity: 100), even though the request clearly states the sensor sensitivity should be 4000.
I tested it with these two lines in the onCaptureCompleted method:
Log.d(TAG, "request: "+request.get(CaptureRequest.SENSOR_SENSITIVITY));
Log.d(TAG, "result: "+result.get(CaptureResult.SENSOR_SENSITIVITY));

Android Camera2, take picture continuously

I need to take pictures continuously with Camera2 API. It works fine on high end devices (for instance a Nexus 5X), but on slower ones (for instance a Samsung Galaxy A3), the preview freezes.
The code is a bit long, so I post only the most relevant parts:
Method called to start my preview:
private void startPreview() {
    SurfaceTexture texture = mTextureView.getSurfaceTexture();
    if (texture != null) {
        try {
            // We configure the size of default buffer to be the size of camera preview we want.
            texture.setDefaultBufferSize(mPreviewSize.getWidth(), mPreviewSize.getHeight());
            // This is the output Surface we need to start preview.
            Surface surface = new Surface(texture);
            // We set up a CaptureRequest.Builder with the output Surface.
            mPreviewRequestBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
            mPreviewRequestBuilder.addTarget(surface);
            // Here, we create a CameraCaptureSession for camera preview.
            mCameraDevice.createCaptureSession(Arrays.asList(surface, mImageReader.getSurface()), new CameraCaptureSession.StateCallback() {
                @Override
                public void onConfigured(@NonNull CameraCaptureSession cameraCaptureSession) {
                    // If the camera is already closed, return:
                    if (mCameraDevice == null) { return; }
                    // When the session is ready, we start displaying the preview.
                    mCaptureSession = cameraCaptureSession;
                    // Auto focus should be continuous for camera preview.
                    mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
                    mPreviewRequest = mPreviewRequestBuilder.build();
                    // Start the preview
                    try { mCaptureSession.setRepeatingRequest(mPreviewRequest, null, mPreviewBackgroundHandler); }
                    catch (CameraAccessException e) { e.printStackTrace(); }
                }

                @Override
                public void onConfigureFailed(@NonNull CameraCaptureSession cameraCaptureSession) {
                    Log.e(TAG, "Configure failed");
                }
            }, null);
        } catch (CameraAccessException e) { e.printStackTrace(); }
    }
}
Method called to take a picture:
private void takePicture() {
    try {
        CaptureRequest.Builder captureBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
        captureBuilder.addTarget(mImageReader.getSurface());
        mCaptureSession.capture(captureBuilder.build(), null, mCaptureBackgroundHandler);
    } catch (CameraAccessException e) { e.printStackTrace(); }
}
And here is my ImageReader:
private final ImageReader.OnImageAvailableListener mOnImageAvailableListener = new ImageReader.OnImageAvailableListener() {
    @Override
    public void onImageAvailable(final ImageReader reader) {
        mSaveBackgroundHandler.post(new Runnable() {
            @Override
            public void run() {
                // Set the destination file:
                File destination = new File(getExternalFilesDir(null), "image_" + mNumberOfImages + ".jpg");
                mNumberOfImages++;
                // Acquire the latest image:
                Image image = reader.acquireLatestImage();
                // Save the image:
                ByteBuffer buffer = image.getPlanes()[0].getBuffer();
                byte[] bytes = new byte[buffer.remaining()];
                buffer.get(bytes);
                FileOutputStream output = null;
                try {
                    output = new FileOutputStream(destination);
                    output.write(bytes);
                } catch (IOException e) {
                    e.printStackTrace();
                } finally {
                    image.close();
                    if (null != output) {
                        try { output.close(); }
                        catch (IOException e) { e.printStackTrace(); }
                    }
                }
                // Take a new picture if needed:
                if (mIsTakingPictures) {
                    takePicture();
                }
            }
        });
    }
};
I have a button that toggles the mIsTakingPictures boolean and makes the first takePicture call.
To recap, I'm using 3 threads:
one for the preview
one for the capture
one for the image saving
What can be the cause of this freeze?
It's impossible to avoid losing frames in your preview when you are taking images all the time on weak devices. The only way to avoid this is on devices which support TEMPLATE_ZERO_SHUTTER_LAG, using a ReprocessableCaptureSession. The documentation about this is pretty horrible, and finding a way to implement it can be an odyssey. I had this problem a few months ago, and finally I found the way to implement it:
How to use a reprocessCaptureRequest with camera2 API
In that answer you can also find some Google CTS tests which implement ReprocessableCaptureSession and shoot burst captures with the ZSL template.
Finally, you can also use a CaptureRequest.Builder with both your preview surface and the ImageReader surface attached; in that case your preview will keep working the whole time and you will also save each frame as a new picture (see the sketch below), but you will still have the freeze problem.
I also tried implementing a burst capture using a handler which dispatches a new capture call every 100 milliseconds. This second option was pretty good in performance and avoided frame-rate loss, but you will not get as many captures per second as with the two-ImageReader option.
Hope my answer helps you a bit; API 2 is still a bit complex, and there are not many examples or much information about it.
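The "both surfaces on the repeating request" approach mentioned above might look like this (a sketch against the question's own fields; `surface` is the preview Surface from startPreview()):

// Sketch: attach BOTH the preview surface and the ImageReader surface to the
// repeating request, so every preview frame is also delivered to the reader.
try {
    mPreviewRequestBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
    mPreviewRequestBuilder.addTarget(surface);                   // TextureView preview
    mPreviewRequestBuilder.addTarget(mImageReader.getSurface()); // every frame also goes to the reader
    mCaptureSession.setRepeatingRequest(mPreviewRequestBuilder.build(), null, mPreviewBackgroundHandler);
} catch (CameraAccessException e) {
    e.printStackTrace();
}

This removes the per-picture capture() calls entirely; the trade-off is that the reader must keep up with the full preview frame rate or frames will be dropped.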
One thing I noticed on low-end devices: the preview stops after a capture, even when using the Camera 1 API, so it has to be restarted manually, producing a small preview freeze when capturing a high-resolution picture.
But the Camera 2 API provides the possibility of getting an uncompressed (YUV) image when taking a still capture (that wasn't possible with Camera 1 on the devices I have: Huawei P7, Sony Xperia E5, Wiko U Feel). Using this feature is much faster than capturing a JPEG (maybe due to JPEG compression), so the preview can be restarted earlier and the preview freeze is shorter. Of course, with this solution you'll have to convert the picture from YUV to JPEG in a background task.

How to get each frame's data using the camera2 API in Android 5.0 in real time

I am working with Camera2Basic at the moment, trying to get each frame's data to do some image processing. I am using the camera2 API on Android 5.0; everything is fine when only doing the camera preview, and it is fluid. But the preview gets stuck when I use the ImageReader.OnImageAvailableListener callback to get each frame's data, and this causes a bad user experience.
The following are the relevant parts of my code:
This is the setup for the camera and ImageReader; I set the image format to YUV_420_888:
public <T> Size setUpCameraOutputs(CameraManager cameraManager, Class<T> kClass, int width, int height) {
    boolean flagSuccess = true;
    try {
        for (String cameraId : cameraManager.getCameraIdList()) {
            CameraCharacteristics characteristics = cameraManager.getCameraCharacteristics(cameraId);
            // choose the front or back camera
            if (FLAG_CAMERA.BACK_CAMERA == mChosenCamera &&
                    CameraCharacteristics.LENS_FACING_BACK != characteristics.get(CameraCharacteristics.LENS_FACING)) {
                continue;
            }
            if (FLAG_CAMERA.FRONT_CAMERA == mChosenCamera &&
                    CameraCharacteristics.LENS_FACING_FRONT != characteristics.get(CameraCharacteristics.LENS_FACING)) {
                continue;
            }
            StreamConfigurationMap map = characteristics.get(
                    CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
            Size largestSize = Collections.max(
                    Arrays.asList(map.getOutputSizes(ImageFormat.YUV_420_888)),
                    new CompareSizesByArea());
            mImageReader = ImageReader.newInstance(largestSize.getWidth(), largestSize.getHeight(),
                    ImageFormat.YUV_420_888, 3);
            mImageReader.setOnImageAvailableListener(mOnImageAvailableListener, mBackgroundHandler);
            ...
            mCameraId = cameraId;
        }
    } catch (CameraAccessException e) {
        e.printStackTrace();
    } catch (NullPointerException e) {
    }
    ......
}
When the camera opens successfully, I create a CameraCaptureSession for the camera preview:
private void createCameraPreviewSession() {
    if (null == mTexture) {
        return;
    }
    // We configure the size of default buffer to be the size of camera preview we want.
    mTexture.setDefaultBufferSize(mPreviewSize.getWidth(), mPreviewSize.getHeight());
    // This is the output Surface we need to start preview
    Surface surface = new Surface(mTexture);
    // We set up a CaptureRequest.Builder with the output Surface.
    try {
        mPreviewRequestBuilder =
                mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
        mPreviewRequestBuilder.addTarget(mImageReader.getSurface());
        mPreviewRequestBuilder.addTarget(surface);
        // We create a CameraCaptureSession for camera preview
        mCameraDevice.createCaptureSession(Arrays.asList(surface, mImageReader.getSurface()),
                new CameraCaptureSession.StateCallback() {
                    @Override
                    public void onConfigured(CameraCaptureSession session) {
                        if (null == mCameraDevice) {
                            return;
                        }
                        // when the session is ready, we start displaying the preview
                        mCaptureSession = session;
                        // Finally, we start displaying the camera preview
                        mPreviewRequest = mPreviewRequestBuilder.build();
                        try {
                            mCaptureSession.setRepeatingRequest(mPreviewRequest,
                                    mCaptureCallback, mBackgroundHandler);
                        } catch (CameraAccessException e) {
                            e.printStackTrace();
                        }
                    }

                    @Override
                    public void onConfigureFailed(CameraCaptureSession session) {
                    }
                }, null);
    } catch (CameraAccessException e) {
        e.printStackTrace();
    }
}
The last piece is the ImageReader.OnImageAvailableListener callback:
private final ImageReader.OnImageAvailableListener mOnImageAvailableListener = new ImageReader.OnImageAvailableListener() {
    @Override
    public void onImageAvailable(ImageReader reader) {
        Log.d(TAG, "The onImageAvailable thread id: " + Thread.currentThread().getId());
        Image readImage = reader.acquireLatestImage();
        readImage.close();
    }
};
Maybe I have the wrong setup, but I have tried several times and it doesn't work. Maybe there is another way to get frame data rather than ImageReader, but I don't know it.
Does anybody know how to get each frame's data in real time?
I do not believe that Chen is correct. The image format has almost 0 effect on the speed on the devices I have tested. Instead, the problem seems to be with the image size. On an Xperia Z3 Compact with the image format YUV_420_888, I am offered a bunch of different options in the StreamConfigurationMap's getOutputSizes method:
[1600x1200, 1280x720, 960x720, 720x480, 640x480, 480x320, 320x240, 176x144]
For these respective sizes, the maximum fps I get when setting mImageReader.getSurface() as a target for the mPreviewRequestBuilder are:
[13, 18, 25, 28, 30, 30, 30, 30 ]
So one solution is to use a lower resolution to achieve the rate you want. For the curious... note that these timings do not seem to be affected by the line
mPreviewRequestBuilder.addTarget(surface);
...
mCameraDevice.createCaptureSession(Arrays.asList(surface, mImageReader.getSurface()),
I was worried that adding the surface on the screen might be adding overhead, but if I remove that first line and change the second to
mCameraDevice.createCaptureSession(Arrays.asList(mImageReader.getSurface()),
then I see the timings change by less than 1 fps. So it doesn't seem to matter whether you are also displaying the image on the screen.
I think there is simply some overhead in the camera2 API or ImageReader's framework that makes it impossible to get the full rate that the TextureView is clearly getting.
One of the most disappointing things of all is that, if you switch back to the deprecated Camera API, you can easily get 30 fps by setting up a PreviewCallback via the Camera.setPreviewCallbackWithBuffer method. With that method, I am able to get 30fps regardless of the resolution. Specifically, although it does not offer me 1600x1200 directly, it does offer 1920x1080, and even that is 30fps.
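For reference, the deprecated-API path mentioned above looks roughly like this (a sketch using the old android.hardware.Camera API; the buffer sizing assumes NV21, the default preview format, and a preview surface must also be set via setPreviewTexture()/setPreviewDisplay() before startPreview() on most devices):

// Sketch: ~30 fps frame access with the deprecated Camera API and preallocated buffers.
android.hardware.Camera camera = android.hardware.Camera.open();
android.hardware.Camera.Parameters params = camera.getParameters();
params.setPreviewSize(1920, 1080); // must be one of getSupportedPreviewSizes()
camera.setParameters(params);

// NV21 uses 3/2 bytes per pixel; preallocate a few buffers to avoid GC churn.
int bufferSize = 1920 * 1080 * 3 / 2;
for (int i = 0; i < 3; i++) {
    camera.addCallbackBuffer(new byte[bufferSize]);
}
camera.setPreviewCallbackWithBuffer(new android.hardware.Camera.PreviewCallback() {
    @Override
    public void onPreviewFrame(byte[] data, android.hardware.Camera cam) {
        // ... process the NV21 frame here ...
        cam.addCallbackBuffer(data); // return the buffer to the queue for reuse
    }
});
camera.startPreview();

The preallocated-buffer variant avoids a new allocation per frame, which is what makes the sustained 30 fps possible.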
I'm trying the same thing. I think you could change the format, like:
mImageReader = ImageReader.newInstance(largestSize.getWidth(),
        largestSize.getHeight(),
        ImageFormat.FLEX_RGB_888, 3);
Because using YUV may cause the CPU to compress the data, and that may cost some time. RGB can be displayed directly on the device. And face detection on the image should be done on another thread, as you surely know.
