I'm working on Android, trying to capture a picture without displaying any preview. I tried to simplify the process by writing a class. It works, but all the pictures come out really dark.
Here is my class:
public class Cam {
private Context context;
private CameraManager manager;
private CameraDevice camera;
private CameraCaptureSession session;
private ImageReader reader;
public static String FRONT="-1";
public static String BACK="-1";
private boolean available=true;
private String filepath;
private static final String NO_CAM = "No camera found on device!";
private static final String ERR_CONFIGURE = "Failed configuring session";
private static final String ERR_OPEN = "Can't open the camera";
private static final String CAM_DISCONNECT = "Camera disconnected";
private static final String FILE_EXIST = "File already exist";
private static final SparseIntArray ORIENTATIONS = new SparseIntArray();
static {
ORIENTATIONS.append(Surface.ROTATION_0, 90);
ORIENTATIONS.append(Surface.ROTATION_90, 0);
ORIENTATIONS.append(Surface.ROTATION_180, 270);
ORIENTATIONS.append(Surface.ROTATION_270, 180);
}
public Cam(Context context) throws CameraAccessException {
this.context = context;
this.manager = (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
String ids[] = manager.getCameraIdList();
if(ids.length==2){
BACK=ids[0];
FRONT=ids[1];
}
else if(ids.length==1){
BACK=ids[0];
}
else{
available=false;
throw new CameraAccessException(-1, NO_CAM);
}
}
public void takePicture(String camId, String filepath) throws CameraAccessException {
if(available){
this.filepath=filepath;
StreamConfigurationMap map = manager.getCameraCharacteristics(camId).get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
Size largest = Collections.max(Arrays.asList(map.getOutputSizes(ImageFormat.JPEG)), new CompareSizesByArea());
reader=ImageReader.newInstance(largest.getWidth(), largest.getHeight(), ImageFormat.JPEG, 1);
reader.setOnImageAvailableListener(imageListener, null);
manager.openCamera(camId, cameraStateCallback, null);
}
else
throwError(NO_CAM);
}
private CameraDevice.StateCallback cameraStateCallback = new CameraDevice.StateCallback() {
@Override
public void onOpened(CameraDevice camera) {
Cam.this.camera=camera;
try {
camera.createCaptureSession(Collections.singletonList(reader.getSurface()), sessionStateCallback, null);
} catch (CameraAccessException e) {
throwError(e.getMessage());
}
}
@Override
public void onDisconnected(CameraDevice camera) {
throwError(CAM_DISCONNECT);
}
@Override
public void onError(CameraDevice camera, int error) {
throwError(ERR_OPEN);
}
};
private CameraCaptureSession.StateCallback sessionStateCallback = new CameraCaptureSession.StateCallback() {
@Override
public void onConfigured(CameraCaptureSession session) {
Cam.this.session=session;
try {
CaptureRequest.Builder request = camera.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
request.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
request.addTarget(reader.getSurface());
int rotation = ((WindowManager)context.getSystemService(Context.WINDOW_SERVICE)).getDefaultDisplay().getRotation();
request.set(CaptureRequest.JPEG_ORIENTATION, ORIENTATIONS.get(rotation));
session.capture(request.build(), captureCallback, null);
} catch (CameraAccessException e) {
throwError(e.getMessage());
}
}
@Override
public void onConfigureFailed(CameraCaptureSession session) {
throwError(ERR_CONFIGURE);
}
};
private CameraCaptureSession.CaptureCallback captureCallback = new CameraCaptureSession.CaptureCallback() {
@Override
public void onCaptureFailed(CameraCaptureSession session, CaptureRequest request, CaptureFailure failure) {
super.onCaptureFailed(session, request, failure);
throwError(failure.toString());
}
};
private ImageReader.OnImageAvailableListener imageListener = new ImageReader.OnImageAvailableListener() {
@Override
public void onImageAvailable(ImageReader reader) {
Image image = reader.acquireLatestImage();
try {
File file = saveImage(image);
// Send file via a listener
closeCamera();
} catch (IOException e) {
throwError(e.getMessage());
}
reader.close();
}
};
private File saveImage(Image image) throws IOException {
File file = new File(filepath);
if (file.exists()) {
throwError(FILE_EXIST);
return null;
}
else {
ByteBuffer buffer = image.getPlanes()[0].getBuffer();
byte[] bytes = new byte[buffer.remaining()];
buffer.get(bytes);
FileOutputStream output = new FileOutputStream(file);
output.write(bytes);
image.close();
output.close();
return file;
}
}
static class CompareSizesByArea implements Comparator<Size> {
@Override
public int compare(Size lhs, Size rhs) {
return Long.signum((long) lhs.getWidth() * lhs.getHeight() - (long) rhs.getWidth() * rhs.getHeight());
}
}
private void closeCamera(){
if(session!=null) {session.close();}
if(reader!=null) {reader.close();}
if(camera!=null) {camera.close();}
}
}
Then I use the Cam object in my Activity:
Cam cam = new Cam(MainActivity.this);
cam.takePicture(Cam.BACK, "/sdcard/pic.jpg");
A listener notifies the MainActivity when the picture is available, but I removed that code to keep the example short.
I don't know what I am doing wrong; the pictures are really dark. Maybe a flag or something... Any help will be appreciated.
EDIT:
Working class : https://github.com/omaflak/Android-Camera2-Library/blob/master/ezcam/src/main/java/me/aflak/ezcam/EZCam.java
Example: https://github.com/omaflak/Android-Camera2-Library/blob/master/app/src/main/java/me/aflak/libraries/MainActivity.java
If the only capture request you send to the camera is the one for the final picture, this is not surprising.
The camera automatic exposure, focus, and white balance routines generally need a second or two of streaming buffers before they converge to good results.
While you don't need to draw the preview on screen, the simplest method here is to first run a repeating request targeting a dummy SurfaceTexture for a second or two, and then fire off the JPEG capture.
You could just stream JPEG captures instead, but JPEG capture is slow, so convergence takes longer (and a camera implementation is more likely to have exposure bugs with repeated JPEG capture than with a typical preview stream).
So, create a dummy SurfaceTexture with a random texture ID argument:
private SurfaceTexture mDummyPreview = new SurfaceTexture(1);
private Surface mDummySurface = new Surface(mDummyPreview);
and include the Surface in your session configuration. Once the session is configured, create a preview request that targets the dummy preview, and after N capture results have come in, submit the capture request for the JPEG you want. You'll want to experiment with N, but probably ~30 frames is enough.
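The "count N preview results, then capture" bookkeeping can be sketched framework-free. Below is a minimal pure-Java gate (`ConvergenceGate` is a hypothetical helper, not part of the camera2 API); you would call `onPreviewResult()` from the repeating request's `onCaptureCompleted()` and submit the JPEG request the one time it returns true:

```java
// Hypothetical helper: counts preview capture results and signals, exactly
// once, when enough frames have passed for 3A (AE/AF/AWB) to converge.
class ConvergenceGate {
    private final int requiredFrames;
    private int seen = 0;
    private boolean fired = false;

    ConvergenceGate(int requiredFrames) {
        this.requiredFrames = requiredFrames;
    }

    // Call once per onCaptureCompleted() of the dummy-preview request.
    // Returns true exactly once, when it is time to submit the JPEG capture.
    boolean onPreviewResult() {
        if (fired) {
            return false;
        }
        seen++;
        if (seen >= requiredFrames) {
            fired = true;
            return true;
        }
        return false;
    }
}
```

With N = 30 as suggested above, you would construct `new ConvergenceGate(30)` once the session is configured.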
Note that you're still not dealing with:
Locking AF to ensure a sharp image for your JPEG
Running AE precapture to allow for flash metering, if you want to allow for flash use
Having some way for the user to know what they'll be capturing; since there's no preview, they can't aim the device at anything very well.
The AF lock and precapture trigger sequences are included in Camera2Basic sample here: https://github.com/googlesamples/android-Camera2Basic, so you can take a look at what those do.
Maybe you can try turning on automatic exposure and automatic white balance:
request.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON);
request.set(CaptureRequest.CONTROL_AWB_MODE, CaptureRequest.CONTROL_AWB_MODE_AUTO);
I hope it will help :)
In my case, just configuring the FPS range helped. Don't forget to set it on the CaptureRequest.Builder for the preview and ALSO on the capture builder. Usually 10 or 15 FPS is quite enough for both photo and preview.
Capture builder:
// This is the CaptureRequest.Builder that we use to take a picture.
final CaptureRequest.Builder captureBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
...
setupFPS(captureBuilder);
Preview builder:
// We set up a CaptureRequest.Builder with the output Surface.
mPreviewRequestBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
mPreviewRequestBuilder.addTarget(surface);
...
// set FPS rate
setupFPS(mPreviewRequestBuilder);
Where setupFPS:
private void setupFPS(CaptureRequest.Builder builder){
if(fpsRange != null) {
builder.set(CaptureRequest.CONTROL_AE_TARGET_FPS_RANGE, fpsRange);
}
}
And initialize the FPS range with:
CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraId);
try {
Range<Integer>[] ranges = characteristics.get(CameraCharacteristics.CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES);
if(ranges != null) {
for (Range<Integer> range : ranges) {
int upper = range.getUpper();
Log.i(TAG, "[FPS Range Available]:" + range);
if (upper >= 10) {
if (fpsRange == null || upper < fpsRange.getUpper()) {
fpsRange = range;
}
}
}
}
} catch (Exception e) {
e.printStackTrace();
}
Log.i(TAG, "[FPS Range] is:" + fpsRange);
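The selection loop above boils down to "the smallest upper bound that is still at least 10 fps". As a framework-free sketch (`FpsRangePicker` is a hypothetical helper; it uses plain `{lower, upper}` int pairs instead of Android's `Range<Integer>` so it needs no framework classes):

```java
// Hypothetical helper mirroring the FPS-range selection loop above.
class FpsRangePicker {
    // Each range is a {lower, upper} pair. Returns the range with the
    // smallest upper bound that is still >= minUpper, or null if none fits.
    static int[] pick(int[][] available, int minUpper) {
        int[] best = null;
        for (int[] range : available) {
            if (range[1] >= minUpper && (best == null || range[1] < best[1])) {
                best = range;
            }
        }
        return best;
    }
}
```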
I was facing a dark image issue for the last two days, and now I have a solution for it.
You need to set up the CaptureRequest as below; this is how I set the brightness with the camera2 API.
final CaptureRequest.Builder captureBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
captureBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON);
captureBuilder.set(CaptureRequest.CONTROL_AE_LOCK, true);
captureBuilder.set(CaptureRequest.CONTROL_AE_EXPOSURE_COMPENSATION, 10);
captureBuilder.addTarget(imageReader.getSurface());
I know this is old, but I ran into a similar issue. Sharing what worked for me for anyone else who stumbles upon this. I tried all sorts of answers here with no success.
Setting CONTROL_AE_MODE to CONTROL_AE_MODE_ON (as some suggested) also did not fix it (you'd think it would).
What fixed it for me was setting the CONTROL_CAPTURE_INTENT to CONTROL_CAPTURE_INTENT_VIDEO_RECORD.
request.set(CaptureRequest.CONTROL_CAPTURE_INTENT, CaptureRequest.CONTROL_CAPTURE_INTENT_VIDEO_RECORD);
Once I added this line and rebuilt, auto exposure was enabled on the camera as expected and the camera adjusted automatically.
Refer to https://developer.android.com/reference/android/hardware/camera2/CameraMetadata for more information. I used this as a guide to discover what options were available.
Related
My current device is a Raspberry Pi Model 3 running Android Things (Oreo 8.1).
First I open the camera device, then click the capture button to get a JPEG file from the camera preview.
My source:
private CameraDevice mCameraDevice;
private CameraCaptureSession mCaptureSession;
private ImageReader mImageReader;
private CaptureRequest.Builder captureBuilder;
private final TextureView.SurfaceTextureListener mSurfaceTextureListener = new TextureView.SurfaceTextureListener() {
@Override
public void onSurfaceTextureAvailable(SurfaceTexture texture, int width, int height) {
openCamera(width, height);
}
...
}
private void openCamera(int width, int height) {
//camera open ...
CameraManager manager = (CameraManager) activity.getSystemService(Context.CAMERA_SERVICE);
mImageReader = ImageReader.newInstance(320, 240, ImageFormat.JPEG, 1);
manager.openCamera(cameraId, mStateCallback, mBackgroundHandler);
}
//Camera preview check
private final CameraDevice.StateCallback mStateCallback = new CameraDevice.StateCallback() {
@Override
public void onOpened(CameraDevice cameraDevice) {
mCameraDevice = cameraDevice;
createCameraPreviewSession();
}
...
};
//create a new CameraCaptureSession for camera preview
private void createCameraPreviewSession() {
try {
SurfaceTexture texture = mTextureView.getSurfaceTexture();
assert texture != null;
// We configure the size of default buffer to be the size of camera preview we want.
texture.setDefaultBufferSize(mPreviewSize.getWidth(), mPreviewSize.getHeight());
// This is the output Surface we need to start preview.
Surface surface = new Surface(texture);
// We set up a CaptureRequest.Builder with the output Surface.
mPreviewRequestBuilder
= mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
mPreviewRequestBuilder.addTarget(surface);
mCameraDevice.createCaptureSession(Arrays.asList(surface, mImageReader.getSurface()), new CameraCaptureSession.StateCallback() {
@Override
public void onConfigured(CameraCaptureSession cameraCaptureSession) {
mCaptureSession = cameraCaptureSession;
updatePreview();
}
@Override
public void onConfigureFailed(CameraCaptureSession cameraCaptureSession) {
}
}, null);
} catch (CameraAccessException e) {
}
}
This code successfully shows the camera preview and captures on a Samsung Galaxy S8 device,
but on the Raspberry Pi 3 the camera preview does not open and capture fails.
The Raspberry Pi always calls onConfigureFailed in createCameraPreviewSession.
log
W/CameraDevice-JV-0: Stream configuration failed due to: endConfigure:372: Camera 0: Unsupported set of inputs/outputs provided
E/CameraCaptureSession: Session 0: Failed to create capture session; configuration failed
So I changed this part:
mCameraDevice.createCaptureSession(Arrays.asList(surface, mImageReader.getSurface()), new CameraCaptureSession.StateCallback() {
to:
mCameraDevice.createCaptureSession(Arrays.asList(surface), new CameraCaptureSession.StateCallback() {
With this change the camera opens on the Raspberry Pi, but capture still does not work.
How can I capture the camera preview on a Raspberry Pi 3 with Android Things?
If you know, please advise me.
Thanks.
I'm trying to create an app using the camera2 API. I need a burst at 30 fps, which I was able to create.
The problem is that both the preview images and the saved images are interlaced (I'm photographing some blinking LEDs, so it's easy to see).
I tried disabling auto exposure and setting the sensitivity myself, but that didn't work.
private void captureStillPicture() {
try {
final Activity activity = getActivity();
mPictureCounter = 0;
if (null == activity || null == mCameraDevice) {
return;
}
List<CaptureRequest> captureList = new ArrayList<CaptureRequest>();
// This is the CaptureRequest.Builder that we use to take a picture.
final CaptureRequest.Builder captureBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
// Use the same AE and AF modes as the preview.
captureBuilder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_AUTO);
captureBuilder.set(CaptureRequest.SENSOR_SENSITIVITY, 2000);
//Auto focus - should keep that
captureBuilder.set(CaptureRequest.CONTROL_AE_MODE, Consts.aeMode);
captureBuilder.addTarget(mImageReader.getSurface());
for(int i = 0; i < Consts.frameAmount; i++) {
captureList.add(captureBuilder.build());
}
CameraCaptureSession.CaptureCallback CaptureCallback = new CameraCaptureSession.CaptureCallback() {
@Override
public void onCaptureCompleted(@NonNull CameraCaptureSession session,
@NonNull CaptureRequest request,
@NonNull TotalCaptureResult result) {
mPictureCounter++;
unlockFocus();
}
};
mCaptureSession.stopRepeating();
mCaptureSession.captureBurst(captureList, CaptureCallback, mBackgroundHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
Any thoughts?
I'm trying to make an app which broadcasts video over the internet. Currently I am using the deprecated Camera API: I add a Camera.PreviewCallback to the Camera object and send the byte array that arrives in the onPreviewFrame() method of Camera.PreviewCallback.
But now I want to try the new Camera2 API. I am looking at the Camera2Basic tutorial, and I think I need to make a CameraCaptureSession.CaptureCallback object to get the image byte array, something like the tutorial says:
CameraCaptureSession.CaptureCallback CaptureCallback
= new CameraCaptureSession.CaptureCallback() {
@Override
public void onCaptureCompleted(@NonNull CameraCaptureSession session,
@NonNull CaptureRequest request,
@NonNull TotalCaptureResult result) {
showToast("Saved: " + mFile);
Log.d(TAG, mFile.toString());
unlockFocus();
}
};
And then add it to the CameraCaptureSession:
mCaptureSession.capture(captureBuilder.build(), CaptureCallback, null);
The problem is that I don't know how to retrieve each image's byte array from any of the parameters of onCaptureCompleted() in the CaptureCallback.
Any help?
You're kind of right: you can't get the image data from the onCaptureCompleted() method. That callback only returns metadata about the exposure, for your own bookkeeping. The actual image data gets sent to whatever Surface you indicated in the exposure's CaptureRequest.
I eventually figured out how to do what I wanted. Starting from the Camera2Basic tutorial, I made the following changes to the Camera2BasicFragment class:
Modified the captureStillPicture() method to remove what I determined was unnecessary for my broadcast needs, and no longer let this method stop the repeating mode:
private void captureStillPicture() {
try {
final Activity activity = getActivity();
if (null == activity || null == mCameraDevice) {
return;
}
final CaptureRequest.Builder captureBuilder =
mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
captureBuilder.addTarget(mImageReader.getSurface());
CameraCaptureSession.CaptureCallback CaptureCallback
= new CameraCaptureSession.CaptureCallback() {
@Override
public void onCaptureCompleted(@NonNull CameraCaptureSession session,
@NonNull CaptureRequest request,
@NonNull TotalCaptureResult result) {
}
};
mCaptureSession.capture(captureBuilder.build(), CaptureCallback, null);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
In the createCameraPreviewSession() method, disable the automatic flash:
// When the session is ready, we start displaying the preview.
mCaptureSession = cameraCaptureSession;
try {
// Auto focus should be continuous for camera preview.
mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_MODE,
CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
// Flash is automatically enabled when necessary.
// mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_MODE,
// CaptureRequest.CONTROL_AE_MODE_ON_AUTO_FLASH);
// Finally, we start displaying the camera preview.
mPreviewRequest = mPreviewRequestBuilder.build();
mCaptureSession.setRepeatingRequest(mPreviewRequest,
mCaptureCallback, mBackgroundHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
}
I created a boolean to track whether an image is currently being processed (so I don't queue every frame the camera captures), and another to track whether a frame is being sent over the network:
private boolean mWorking = false;
private boolean mNetworkWorking = false;
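One caveat (my note, not part of the original answer): plain boolean fields written on a background handler thread and read on another thread have no visibility guarantee in Java. A safer sketch uses AtomicBoolean, whose compareAndSet also closes the check-then-set race between two frames arriving back to back:

```java
import java.util.concurrent.atomic.AtomicBoolean;

// Hypothetical thread-safe replacement for the mWorking / mNetworkWorking flags.
class FrameGate {
    private final AtomicBoolean busy = new AtomicBoolean(false);

    // Returns true only for the caller that wins the frame; others skip it.
    boolean tryAcquire() {
        return busy.compareAndSet(false, true);
    }

    // Call when processing (or sending) of the frame is finished.
    void release() {
        busy.set(false);
    }
}
```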
Modified the CaptureCallback object to run the captureStillPicture() method on each frame (only if no frame is being processed at the moment):
case STATE_PREVIEW: {
if (!mWorking){
Log.d(TAG, "capturing..");
mWorking = true;
mBackgroundHandler.post(new Runnable() {
@Override
public void run() {
captureStillPicture();
}
});
} else {
Log.d(TAG, "thread working, doing nothing");
}
break;
Finally, read the frame and send it; I achieved this by modifying the OnImageAvailableListener object:
private final ImageReader.OnImageAvailableListener mOnImageAvailableListener
= new ImageReader.OnImageAvailableListener() {
@Override
public void onImageAvailable(final ImageReader reader) {
// Process the image.
Image image = reader.acquireNextImage();
ByteBuffer buffer = image.getPlanes()[0].getBuffer();
final byte[] bytes = new byte[buffer.remaining()];
buffer.get(bytes);
image.close();
if (!mNetworkWorking){
Thread thread = new Thread(){
@Override
public void run(){
mNetworkWorking = true;
HttpResponse response = null;
HttpClient client = new DefaultHttpClient();
HttpPost post = new HttpPost(mBroadcastUrl);
post.setEntity(new ByteArrayEntity(bytes));
try {
response = client.execute(post);
} catch (ClientProtocolException e) {
if (BuildConfig.LOCAL_LOG)
Log.w(TAG, "ClientProtocolException: "+e.getMessage());
} catch (IOException e) {
if (BuildConfig.LOCAL_LOG)
Log.w(TAG, "IOException: "+e.getMessage());
}
mNetworkWorking = false;
}
};
thread.setName("networkThread");
thread.setPriority(Thread.MAX_PRIORITY);
thread.start();
}
mWorking = false;
}
};
That's all.
I am creating an app on Android 5.0.2 using the new camera API (camera2). The app takes a picture every 2.5 seconds for 3 hours (4320 pictures in total). As you can see in the code below, I implemented the repetition with a Timer and no preview, referring to "Capture picture without preview using camera2 API". I am using a Nexus 7 (2013, 16 GB, Android 5.0.2) for the test. It works fine for the first 200-300 pictures and then fails with the following error message. The failure always starts with "E/RequestThread-1: Hit timeout for jpeg callback!", so that must trigger it. Would anyone help me get rid of this trigger? Or is this an Android bug that will be fixed in 5.1.0?
03-30 15:46:04.472 11432-11432/com.example.android.camera2basic V/yo click﹕ ---- 174 ---- click
03-30 15:46:05.026 11432-11537/com.example.android.camera2basic E/RequestThread-1﹕ Hit timeout for jpeg callback!
03-30 15:46:05.027 11432-11537/com.example.android.camera2basic W/CaptureCollector﹕ Jpeg buffers dropped for request: 173
03-30 15:46:05.076 11432-11480/com.example.android.camera2basic E/CameraDevice-JV-1﹕ Lost output buffer reported for frame 173
03-30 15:46:05.090 11432-11537/com.example.android.camera2basic W/LegacyRequestMapper﹕ convertRequestMetadata - control.awbRegions setting is not supported, ignoring value
03-30 15:46:05.090 11432-11537/com.example.android.camera2basic W/LegacyRequestMapper﹕ Only received metering rectangles with weight 0.
03-30 15:46:05.091 11432-11537/com.example.android.camera2basic W/LegacyMetadataMapper﹕ convertAfModeToLegacy - ignoring unsupported mode 4, defaulting to fixed
03-30 15:46:05.091 11432-11537/com.example.android.camera2basic W/LegacyRequestMapper﹕ convertRequestToMetadata - Ignoring android.lens.focusDistance false, only 0.0f is supported
03-30 15:46:05.098 11432-11537/com.example.android.camera2basic E/AndroidRuntime﹕ FATAL EXCEPTION: RequestThread-1
Process: com.example.android.camera2basic, PID: 11432
java.lang.RuntimeException: startPreview failed
at android.hardware.Camera.startPreview(Native Method)
at android.hardware.camera2.legacy.RequestThreadManager.startPreview(RequestThreadManager.java:275)
at android.hardware.camera2.legacy.RequestThreadManager.doJpegCapturePrepare(RequestThreadManager.java:288)
at android.hardware.camera2.legacy.RequestThreadManager.access$1700(RequestThreadManager.java:61)
at android.hardware.camera2.legacy.RequestThreadManager$5.handleMessage(RequestThreadManager.java:767)
at android.os.Handler.dispatchMessage(Handler.java:98)
at android.os.Looper.loop(Looper.java:135)
at android.os.HandlerThread.run(HandlerThread.java:61)
Here is my code::
public class CameraActivity extends Activity {
Timer mTimer = null;
Handler mHandler = new Handler();
private ImageReader imageReader;
private Handler backgroundHandler;
private HandlerThread backgroundThread;
private String cameraId;
private CameraDevice cameraDevice;
private CameraCaptureSession cameraCaptureSession;
static int count = 0;
static int count2 = 0;
/**
* Conversion from screen rotation to JPEG orientation.
*/
private static final SparseIntArray ORIENTATIONS = new SparseIntArray();
static {
ORIENTATIONS.append(Surface.ROTATION_0, 90);
ORIENTATIONS.append(Surface.ROTATION_90, 0);
ORIENTATIONS.append(Surface.ROTATION_180, 270);
ORIENTATIONS.append(Surface.ROTATION_270, 180);
}
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
//setContentView(R.layout.activity_camera);
setContentView(R.layout.activity_main);
Button takePicture = (Button)findViewById(R.id.takepic);
takePicture.setOnClickListener(onClickPicture);
//(1) setting up camera but stop before camera createCaptureRequest
setupCamera2();
}
private void setupCamera2() {
CameraManager manager = (CameraManager) getSystemService(Context.CAMERA_SERVICE);
try {
for (String cameraId : manager.getCameraIdList()) {
CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraId);
//if (characteristics.get(CameraCharacteristics.LENS_FACING) != CameraCharacteristics.LENS_FACING_BACK) {
if (characteristics.get(CameraCharacteristics.LENS_FACING) != CameraCharacteristics.LENS_FACING_FRONT) {
continue;
}
StreamConfigurationMap configs = characteristics.get(
CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
this.cameraId = cameraId;
manager.openCamera(this.cameraId, cameraStateCallback, backgroundHandler);
Size[] sizes = configs.getOutputSizes(ImageFormat.JPEG);
int picWidth = 640;//1920;
int picHeight = 480;//1080;
imageReader = ImageReader.newInstance(picWidth, picHeight, ImageFormat.JPEG, 2);
imageReader.setOnImageAvailableListener(onImageAvailableListener, backgroundHandler);
}
} catch (CameraAccessException | NullPointerException e) {
e.printStackTrace();
}
}
private final CameraDevice.StateCallback cameraStateCallback = new CameraDevice.StateCallback() {
@Override
public void onOpened(CameraDevice device) {
cameraDevice = device;
//(2) Camera capture session
createCameraCaptureSession();
}
@Override
public void onDisconnected(CameraDevice cameraDevice) {}
@Override
public void onError(CameraDevice cameraDevice, int error) {}
};
//private void createCaptureSession() {
private void createCameraCaptureSession() {
List<Surface> outputSurfaces = new LinkedList<>();
outputSurfaces.add(imageReader.getSurface());
Log.v("-yo(2)-", "in createcameraCaptureSession now");
try {
cameraDevice.createCaptureSession(outputSurfaces, new CameraCaptureSession.StateCallback() {
@Override
public void onConfigured(CameraCaptureSession session) {
//cameraCaptureSession = session;
cameraCaptureSession = session;
//commented out to invoked from button
//createCaptureRequest();
}
@Override
public void onConfigureFailed(CameraCaptureSession session) {}
}, null);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
private final ImageReader.OnImageAvailableListener onImageAvailableListener = new ImageReader.OnImageAvailableListener() {
@Override
public void onImageAvailable(ImageReader reader) {
//createCaptureRequest();
Log.v("yo ireader ","---- "+(count2++)+" ---- ireader");
//Image mImage = imageReader.acquireLatestImage();
Image mImage = reader.acquireLatestImage();
File mFile = new File(Environment.getExternalStorageDirectory() + "/yP2PTEST/0P2Pimage.jpg");
Log.v("--yo--", "In ImageReader now writing to "+mFile);
/////////////////////////////////////
ByteBuffer buffer = mImage.getPlanes()[0].getBuffer();
byte[] bytes = new byte[buffer.remaining()];
buffer.get(bytes);
FileOutputStream output = null;
try {
output = new FileOutputStream(mFile);
output.write(bytes);
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
} finally {
mImage.close();
if (null != output) {
try {
output.close();
} catch (IOException e) {
e.printStackTrace();
}
}
}
ImageView curPic = (ImageView)findViewById(R.id.imageView1);
Bitmap mCurrentBitmap = BitmapFactory.decodeFile(mFile.getPath());
curPic.setImageBitmap(mCurrentBitmap);
}
///////////////////////////////////
};
private void createCaptureRequest() {
Log.v("-yo(3)-", "in createCaptureRequest now");
try {
CaptureRequest.Builder requestBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
requestBuilder.addTarget(imageReader.getSurface());
// Focus
requestBuilder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
// Orientation
//yo int rotation = windowManager.getDefaultDisplay().getRotation();
int rotation = this.getWindowManager().getDefaultDisplay().getRotation();
requestBuilder.set(CaptureRequest.JPEG_ORIENTATION, ORIENTATIONS.get(rotation));
// cameraCaptureSession.capture(requestBuilder.build(), camera2Callback, null);
cameraCaptureSession.capture(requestBuilder.build(), mCaptureCallback, null);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
CameraCaptureSession.CaptureCallback mCaptureCallback
= new CameraCaptureSession.CaptureCallback() {
@Override
public void onCaptureCompleted(CameraCaptureSession session, CaptureRequest request,
TotalCaptureResult result) {
//showToast("JPEG Saved : ");
//Log.v("yo save","- saved JPEG -");
//unlockFocus();
}
};
private Handler mMessageHandler = new Handler() {
@Override
public void handleMessage(Message msg) {
if (this != null) {
Toast.makeText(CameraActivity.this, (String) msg.obj, Toast.LENGTH_SHORT).show();
}
}
};
private void showToast(String text) {
// We show a Toast by sending request message to mMessageHandler. This makes sure that the
// Toast is shown on the UI thread.
Message message = Message.obtain();
message.obj = text;
mMessageHandler.sendMessage(message);
}
//------------------------------------------------------------//
public View.OnClickListener onClickPicture = new View.OnClickListener() {
public void onClick(View v) {
/*------- camera2 --------------*/
mTimer = null;
mTimer = new Timer(true);
mTimer.schedule( new TimerTask(){
@Override
public void run() {
/*------------------------*/
mHandler.post( new Runnable() {
public void run() {
createCaptureRequest();
Log.v("yo click ","---- "+(count++)+" ---- click");
}
});
}
}, 1000, 2500);//1500,1600, 1800 etc
};
};
};
Thanks in advance.
EDIT
I looked into the camera2 API source code and found where the error message comes from.
JPEG_FRAME_TIMEOUT is currently 300 ms; I suspect it is too small and want to increase it. If anyone knows how to do that, please let me know.
if (holder.hasJpegTargets()) {
doJpegCapture(holder);
if (!mReceivedJpeg.block(JPEG_FRAME_TIMEOUT)) {
Log.e(TAG, "Hit timeout for jpeg callback!");
mCaptureCollector.failNextJpeg();
}
}
I had the same issue with a modified Google sample.
After taking two pictures I got this error; you just need to increase the last parameter (maxImages) as much as you need, and it will work.
imageReader = ImageReader.newInstance(largest.getWidth(), largest.getHeight(),ImageFormat.JPEG, 2);
But keep in mind this:
@param maxImages The maximum number of images the user will want to access simultaneously. This should be as small as possible to limit memory use. Once maxImages Images are obtained by the user, one of them has to be released before a new Image will become available for access through {@link #acquireLatestImage()} or {@link #acquireNextImage()}. Must be greater than 0.
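That contract is why a stalled pipeline looks like dropped JPEG buffers: once maxImages images are outstanding and un-closed, no new buffer can be delivered until one is released. A toy pure-Java model (hypothetical, only to illustrate the budget; the real ImageReader manages native buffers):

```java
// Toy model of ImageReader's maxImages buffer budget.
class ImageBudget {
    private final int maxImages;
    private int outstanding = 0;

    ImageBudget(int maxImages) {
        this.maxImages = maxImages;
    }

    // Mirrors acquireNextImage(): fails while maxImages are still un-closed.
    boolean acquire() {
        if (outstanding >= maxImages) {
            return false; // the real reader would return null or throw
        }
        outstanding++;
        return true;
    }

    // Mirrors Image.close(): frees one slot for the next frame.
    void close() {
        if (outstanding > 0) {
            outstanding--;
        }
    }
}
```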
I'm new to Android development but have Java/C# experience. I want to use camera api to take a picture, show on the screen and save to disk.
I'd like to use the newest API; the documentation says android.hardware.Camera is deprecated and we should use android.hardware.camera2. I searched the Internet but found little information on android.hardware.camera2.
Here is what I have now.
CameraManager manager = (CameraManager) getSystemService(Context.CAMERA_SERVICE);
String[] ids = manager.getCameraIdList();
for (String id : ids) {
CameraCharacteristics ch = manager.getCameraCharacteristics(id);
if (ch.get(CameraCharacteristics.LENS_FACING) == CameraCharacteristics.LENS_FACING_BACK) {
manager.openCamera(id, new CameraCallback(), null);
}
}
class CameraCallback extends CameraDevice.StateCallback {
@Override
public void onOpened(CameraDevice camera) {
try {
CaptureRequest.Builder captureBuilder = camera.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
@Override
public void onDisconnected(CameraDevice camera) { }
@Override
public void onError(CameraDevice camera, int error) { }
}
I have no idea what to do after CaptureRequest.Builder captureBuilder. I'd like to see some samples.
Also, which control should I use to display the image?