How to enable front camera in Camera2 API? - android

How do I enable the front camera with the Camera2 API? Can anyone help? I have this Camera2 API code. It only opens the main (rear) camera of the device; I want to be able to use both the front and rear cameras, toggled by a button click. What is LENS_FACING_FRONT? I am new to Android programming.
private void setUpCameraOutputs(int width, int height) {
    Activity activity = getActivity();
    CameraManager manager = (CameraManager) activity.getSystemService(Context.CAMERA_SERVICE);
    try {
        for (String cameraId : manager.getCameraIdList()) {
            CameraCharacteristics characteristics
                    = manager.getCameraCharacteristics(cameraId);

            // We don't use a front facing camera in this sample.
            Integer facing = characteristics.get(CameraCharacteristics.LENS_FACING);
            if (facing != null && facing == CameraCharacteristics.LENS_FACING_FRONT) {
                continue;
            }

            StreamConfigurationMap map = characteristics.get(
                    CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
            if (map == null) {
                continue;
            }

            // For still image captures, we use the largest available size.
            Size largest = Collections.max(
                    Arrays.asList(map.getOutputSizes(ImageFormat.JPEG)),
                    new CompareSizesByArea());
            mImageReader = ImageReader.newInstance(largest.getWidth(), largest.getHeight(),
                    ImageFormat.JPEG, /*maxImages*/2);
            mImageReader.setOnImageAvailableListener(
                    mOnImageAvailableListener, mBackgroundHandler);

            // Danger, W.R.! Attempting to use too large a preview size could exceed the camera
            // bus' bandwidth limitation, resulting in gorgeous previews but the storage of
            // garbage capture data.
            mPreviewSize = chooseOptimalSize(map.getOutputSizes(SurfaceTexture.class),
                    width, height, largest);

            // We fit the aspect ratio of TextureView to the size of preview we picked.
            int orientation = getResources().getConfiguration().orientation;
            if (orientation == Configuration.ORIENTATION_LANDSCAPE) {
                mTextureView.setAspectRatio(
                        mPreviewSize.getWidth(), mPreviewSize.getHeight());
            } else {
                mTextureView.setAspectRatio(
                        mPreviewSize.getHeight(), mPreviewSize.getWidth());
            }

            mCameraId = cameraId;
            return;
        }
    } catch (CameraAccessException e) {
        e.printStackTrace();
    } catch (NullPointerException e) {
        // Currently an NPE is thrown when the Camera2API is used but not supported on the
        // device this code runs.
        ErrorDialog.newInstance(getString(R.string.camera_error))
                .show(getChildFragmentManager(), FRAGMENT_DIALOG);
    }
}

We can use CameraManager to iterate over all the cameras available on the device, each identified by a cameraId. Using that cameraId we can query the properties of the camera, which are represented by the CameraCharacteristics class. Things like whether it is the front or back camera, or which output resolutions it supports, can be queried there.
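LENS_FACING_FRONT is simply the value the LENS_FACING characteristic reports for a camera that faces the same direction as the screen; the sample above deliberately skips such cameras with the continue statement. As a rough sketch of picking a camera by facing on a button click (the mUseFrontCamera flag and findCameraId helper are made-up names here, and you would still need to close the currently open camera and rerun setUpCameraOutputs with the new id), the check can be turned around like this:

// Hypothetical flag, toggled in the button's onClick listener.
private boolean mUseFrontCamera = false;

private String findCameraId(CameraManager manager) throws CameraAccessException {
    int wantedFacing = mUseFrontCamera
            ? CameraCharacteristics.LENS_FACING_FRONT
            : CameraCharacteristics.LENS_FACING_BACK;
    for (String cameraId : manager.getCameraIdList()) {
        CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraId);
        Integer facing = characteristics.get(CameraCharacteristics.LENS_FACING);
        if (facing != null && facing == wantedFacing) {
            return cameraId;  // first camera facing the requested direction
        }
    }
    return null;  // this device has no camera with the requested facing
}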
You can get the official sample application here.
This example from the Google Git repo demonstrates how to check the camera permission before launching the camera on Marshmallow with the Camera2 API.
Have a look at this article for more about it.

Related

Is it possible to capture images without a TextureView using the Camera2 API?

In my case I don't need to show a preview to the user and would like to capture the image from a service. To achieve this I used ImageFormat.JPEG for the capture, but the output images are really dark. I have tried this link on Stack Overflow but it is not working.
val streamConfigurationMap =
    mCameraCharacteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP) // Available stream configurations.
mPreviewSize = streamConfigurationMap!!.getOutputSizes(ImageFormat.JPEG)[0]
mCameraID = cameraId
mImageReader =
    ImageReader.newInstance(mPreviewSize!!.width, mPreviewSize!!.height, ImageFormat.JPEG, 1)
mImageReader!!.setOnImageAvailableListener(onImageAvailable, mBackgroundHandler)
If I use a dummy SurfaceTexture instead, I get the error below a few seconds after the app launches:
E/BufferQueueProducer: [SurfaceTexture-1-20857-1] cancelBuffer: BufferQueue has been abandoned
First of all, you don't have to use a TextureView. The reason your output is really dark is probably your CaptureRequest.Builder: you want to control auto-exposure, as I explain below.
First, when you set your surface, you should set it as such:
builder.addTarget(mImageReader.getSurface());
Now on to the brightness issue, you can control your AE like this:
builder.set(CaptureRequest.CONTROL_AE_TARGET_FPS_RANGE,getRange());
where getRange() is:
private Range<Integer> getRange() {
    CameraCharacteristics chars = null;
    try {
        CameraManager manager = (CameraManager) ((Activity) getContext()).getSystemService(Context.CAMERA_SERVICE);
        chars = manager.getCameraCharacteristics(mCameraId);
        Range<Integer>[] ranges = chars.get(CameraCharacteristics.CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES);

        Range<Integer> result = null;
        for (Range<Integer> range : ranges) {
            int upper = range.getUpper();
            // 10 - min range upper for my needs
            if (upper >= 10) {
                if (result == null || upper < result.getUpper().intValue()) {
                    result = range;
                }
            }
        }
        if (result == null) {
            result = ranges[0];
        }
        return result;
    } catch (CameraAccessException e) {
        e.printStackTrace();
        return null;
    }
}
mImageReader = ImageReader.newInstance(hardcoded_width, hardcoded_height, ImageFormat.YUV_420_888, 2);
mImageReader.setOnImageAvailableListener(mVideoCapture, mBackgroundHandler);
If you want to know more about custom brightness control, check this out.
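To tie the pieces together, here is a minimal sketch of capturing straight into the ImageReader with no TextureView at all. It assumes fields like mCameraDevice, mImageReader and mBackgroundHandler exist as in the question and reuses the getRange() helper above; it is an illustration of the approach, not code from the original project:

Surface readerSurface = mImageReader.getSurface();

mCameraDevice.createCaptureSession(Arrays.asList(readerSurface),
        new CameraCaptureSession.StateCallback() {
            @Override
            public void onConfigured(CameraCaptureSession session) {
                try {
                    CaptureRequest.Builder builder = mCameraDevice
                            .createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
                    // The ImageReader is the only output target; no preview surface is needed.
                    builder.addTarget(readerSurface);
                    // Let auto-exposure run; a lower target FPS range allows longer exposure times.
                    builder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON);
                    builder.set(CaptureRequest.CONTROL_AE_TARGET_FPS_RANGE, getRange());
                    session.capture(builder.build(), null, mBackgroundHandler);
                } catch (CameraAccessException e) {
                    e.printStackTrace();
                }
            }

            @Override
            public void onConfigureFailed(CameraCaptureSession session) {
                // Handle configuration failure.
            }
        }, mBackgroundHandler);

Keep in mind that a single still capture fired immediately after the session is configured gives auto-exposure little time to converge, which can also produce dark frames; running a short repeating request to the same ImageReader before the final capture usually helps.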

Android Camera2: I acquire a YUV image from ImageReader, but the U and V buffers contain only one row (stride) of data, the rest is zero

This is how I instantiate the ImageReader.
Size[] sizes = configs.getOutputSizes(ImageFormat.YUV_420_888);
mImageReader = ImageReader.newInstance(width, height, ImageFormat.YUV_420_888, 2);
mImageReader.setOnImageAvailableListener(mOnImageAvailableListener, null);
Surface rgbCaptureSurface = mImageReader.getSurface();

List<Surface> surfaces = new ArrayList<Surface>();
surfaces.add(rgbCaptureSurface);
//surfaces.add(surface);

mPreviewRequestBuilder
        = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
//mPreviewRequestBuilder.addTarget(surface);
mPreviewRequestBuilder.addTarget(rgbCaptureSurface);

mCameraDevice.createCaptureSession(surfaces, new CameraCaptureSession.StateCallback() {
    @Override
    public void onConfigured(CameraCaptureSession cameraCaptureSession) {
        // The camera is already closed
        if (null == mCameraDevice) {
            return;
        }
        // When the session is ready, we start displaying the preview.
        mCaptureSession = cameraCaptureSession;
        try {
            // Auto focus should be continuous for camera preview.
            mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_MODE,
                    CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_VIDEO);
            // Flash is automatically enabled when necessary.
            //setAutoFlash(mPreviewRequestBuilder);

            // Finally, we start displaying the camera preview.
            mPreviewRequest = mPreviewRequestBuilder.build();
            mCaptureSession.setRepeatingRequest(mPreviewRequest,
                    mCaptureCallback, null);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }

    @Override
    public void onConfigureFailed(CameraCaptureSession cameraCaptureSession) {
        // Session configuration failed.
    }
}, null);
Reading is done like this:
public void onImageAvailable(ImageReader reader) {
    Image image;
    while (true) {
        image = reader.acquireLatestImage();
        if (image == null) return;

        Image.Plane Y = image.getPlanes()[0];
        Image.Plane U = image.getPlanes()[1];
        Image.Plane V = image.getPlanes()[2];

        int Yb = Y.getBuffer().remaining();
        int Ub = U.getBuffer().remaining();
        int Vb = V.getBuffer().remaining();

        byte[] data = new byte[Yb + Ub + Vb];

        Y.getBuffer().get(data, 0, Yb);
        U.getBuffer().get(data, Yb, Ub);
        V.getBuffer().get(data, Yb + Ub, Vb);

        // ... process data ...

        image.close();  // release the buffer back to the ImageReader
    }
}
I tried several different ImageFormats. I'm testing on an LG G3 running API 21 and the problem occurs. On a Nexus 4 running API 22 I do not have the problem.
I upgraded to API 23 and the same code worked fine. I also tested on API 22 and it worked as well.
Same as: Using Camera2 API with ImageReader
Your observation is correct: API 21 does not properly support Camera2. Several people have found this independently here on SO; see e.g. Camera2 API21 not working.
So it is reasonable not to use Camera2 before API 22. It is hard to understand why the documentation has not been amended in the meantime.
Personally I am continuing my Camera2 experiments, but I am still reluctant to use Camera2 in my app now. I first want to test it on many devices, and for the near future I don't expect new devices to stop supporting the old Camera API.
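Independent of that platform issue, note that reading the plane buffers with a plain get() ignores the row and pixel strides that Image.Plane reports, so even on a working device the copied data may contain padding bytes. A stride-aware copy (a sketch, not the asker's code; for the U and V planes of a YUV_420_888 image pass width/2 and height/2) looks roughly like this:

private static byte[] copyPlane(Image.Plane plane, int width, int height) {
    ByteBuffer buffer = plane.getBuffer();
    int rowStride = plane.getRowStride();
    int pixelStride = plane.getPixelStride();
    byte[] out = new byte[width * height];
    byte[] row = new byte[rowStride];
    for (int y = 0; y < height; y++) {
        buffer.position(y * rowStride);
        // The last row may be shorter than rowStride, so only read what is left.
        buffer.get(row, 0, Math.min(rowStride, buffer.remaining()));
        for (int x = 0; x < width; x++) {
            out[y * width + x] = row[x * pixelStride];
        }
    }
    return out;
}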

Auto-exposure doesn't work with the Android Camera API v1

I'm trying to use camera features in my application and I'm stuck on the camera preview step.
I want to understand why the preview image remains dark when there is no bright light.
Here are the parameters I set before starting the preview:
mParameters = mCamera.getParameters();

List<Camera.Size> mSupportedPreviewSizes = mParameters.getSupportedPreviewSizes();
Camera.Size optimalSize = CameraHelper.getOptimalPreviewSize(mSupportedPreviewSizes, DEFAULT_PREVIEW_WIDTH, DEFAULT_PREVIEW_HEIGHT);

// Use the same size for recording profile.
mProfile = CamcorderProfile.get(CamcorderProfile.QUALITY_HIGH);
mProfile.videoFrameWidth = optimalSize.width;
mProfile.videoFrameHeight = optimalSize.height;

// likewise for the camera object itself.
mParameters.setPreviewSize(mProfile.videoFrameWidth, mProfile.videoFrameHeight);

// Set correct video width and height according to screen rotation
transformMatrixHelper.setVideoDimensions(optimalSize.width, optimalSize.height);
if (getResources().getConfiguration().orientation == Configuration.ORIENTATION_PORTRAIT) {
    transformMatrixHelper.setVideoDimensions(optimalSize.height, optimalSize.width);
}
transformMatrixHelper.clearInitTextureDimension();

mParameters.setPreviewFpsRange(MAX_FPS, MAX_FPS);
mParameters.setWhiteBalance(Camera.Parameters.WHITE_BALANCE_AUTO);

// Auto-focus
if (mParameters.getSupportedFocusModes().contains(Camera.Parameters.FOCUS_MODE_CONTINUOUS_VIDEO)) {
    mParameters.setFocusMode(Camera.Parameters.FOCUS_MODE_CONTINUOUS_VIDEO);
}

// Auto-exposure
if (mParameters.isAutoExposureLockSupported()) {
    mParameters.setAutoExposureLock(false);
}

mCamera.setParameters(mParameters);
I'm not calling any camera.autoFocus(callback) method after the preview has started.
I would be very grateful if someone could help me, thanks.
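One thing worth checking (an assumption, not a confirmed diagnosis): setPreviewFpsRange(MAX_FPS, MAX_FPS) pins the preview to a single fixed frame rate, which caps how long the driver can expose each frame; in dim light that alone can keep the preview dark. Choosing one of the ranges reported by getSupportedPreviewFpsRange(), preferably one with a low minimum, leaves auto-exposure room to work. A sketch:

List<int[]> fpsRanges = mParameters.getSupportedPreviewFpsRange();
int[] chosen = fpsRanges.get(0);
for (int[] range : fpsRanges) {
    // Prefer a range with a low minimum so auto-exposure can lengthen exposure in dim scenes.
    if (range[Camera.Parameters.PREVIEW_FPS_MIN_INDEX]
            < chosen[Camera.Parameters.PREVIEW_FPS_MIN_INDEX]) {
        chosen = range;
    }
}
mParameters.setPreviewFpsRange(chosen[Camera.Parameters.PREVIEW_FPS_MIN_INDEX],
        chosen[Camera.Parameters.PREVIEW_FPS_MAX_INDEX]);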

Horizontal "bars" instead of video when recording with MediaRecorder

Subject A: Horizontal Bars
This is a weird one. The video records and the sound is fine, but those bars represent my hand. It is as if I am looking at a ribbed Pyrex bowl.
Subject B:
I am testing this on a Samsung Galaxy Note 10.1 using the front facing camera (API 16). Some notes:
When I record using the back facing camera, it records fine.
When I record using the front facing camera in the standard camera app, it works fine.
When I run the same code on a Samsung Galaxy S4 (API 21), both back and front cameras work fine.
I use CamcorderProfile to select the appropriate MediaRecorder settings. Here is my code:
@TargetApi(Build.VERSION_CODES.HONEYCOMB)
private boolean prepareVideoRecorder() {

    // Get screen width and height for supported video sizes
    Display display = getWindowManager().getDefaultDisplay();
    int width, height;
    if (Build.VERSION.SDK_INT < 13) {
        width = display.getWidth();
        height = display.getHeight();
    } else {
        Point size = new Point();
        display.getSize(size);
        width = size.x;
        height = size.y;
    }

    // Check and set the highest quality available recording profile of the camera.
    int frontCamera = Camera.CameraInfo.CAMERA_FACING_FRONT;
    CamcorderProfile profile = null;
    if (CamcorderProfile.hasProfile(frontCamera, CamcorderProfile.QUALITY_1080P))
        profile = CamcorderProfile.get(frontCamera, CamcorderProfile.QUALITY_1080P);
    else if (CamcorderProfile.hasProfile(frontCamera, CamcorderProfile.QUALITY_720P))
        profile = CamcorderProfile.get(frontCamera, CamcorderProfile.QUALITY_720P);
    else if (CamcorderProfile.hasProfile(frontCamera, CamcorderProfile.QUALITY_480P))
        profile = CamcorderProfile.get(frontCamera, CamcorderProfile.QUALITY_480P);
    else if (CamcorderProfile.hasProfile(frontCamera, CamcorderProfile.QUALITY_HIGH))
        profile = CamcorderProfile.get(frontCamera, CamcorderProfile.QUALITY_HIGH);

    mCamera = CameraHelper.getDefaultFrontFacingCameraInstance();

    // Need to make sure that our recording video size is supported by the camera.
    // Query camera to find all sizes and choose the optimal size given the dimensions of screen.
    Camera.Parameters parameters = mCamera.getParameters();
    List<Camera.Size> mSupportedVideoSizes = parameters.getSupportedVideoSizes();
    Camera.Size optimalSize = CameraHelper.getOptimalRecordingSize(mSupportedVideoSizes,
            width, height);

    // link camera to preview surface
    try {
        mCamera.setPreviewDisplay(mPreview.getHolder());
    } catch (IOException e) {
        Log.e(TAG, "Surface text is unavailable or unsuitable for preview" + e.getMessage());
        return false;
    }

    // Unlock the camera to allow the Media Recorder to own it
    mCamera.unlock();

    // Create and assign the Camera to the Media Recorder
    mMediaRecorder = new MediaRecorder();
    mMediaRecorder.setCamera(mCamera);

    // Set Sources
    mMediaRecorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
    mMediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);

    if (profile != null) {
        profile.videoFrameWidth = optimalSize.width;
        profile.videoFrameHeight = optimalSize.height;
        mMediaRecorder.setProfile(profile);
    }

    // Specify the output file
    mMediaRecorder.setOutputFile(CameraHelper.getOutputMediaFile(
            CameraHelper.MEDIA_TYPE_VIDEO).toString());

    // Prepare configured MediaRecorder
    try {
        mMediaRecorder.prepare();
    } catch (IllegalStateException e) {
        Log.d(TAG, "IllegalStateException preparing MediaRecorder: " + e.getMessage());
        releaseMediaRecorder();
        return false;
    } catch (IOException e) {
        Log.d(TAG, "IOException preparing MediaRecorder: " + e.getMessage());
        releaseMediaRecorder();
        return false;
    }

    return true;
}
I've tried bypassing the CamcorderProfile and trying every MediaRecorder.VideoEncoder encoder available. The file format is .mp4, which I don't believe is the issue because the standard Camera app on the device outputs a .mp4. Any ideas?

Android camera setDisplayOrientation: strange behavior for galaxy tab

I am facing a problem trying to get a camera preview in portrait mode. I have read various articles about it and had solved it with the following code:
Display display = ((CaptureActivity) context).getWindowManager().getDefaultDisplay();
int width = display.getWidth();
int height = display.getHeight();

if (Integer.parseInt(Build.VERSION.SDK) >= 8) {
    setDisplayOrientation(camera, 90);
} else {
    Camera.Parameters parameters = camera.getParameters();
    parameters.set("orientation", "portrait");
    camera.setParameters(parameters);
}
where setDisplayOrientation() is defined as:
protected void setDisplayOrientation(Camera camera, int angle) {
    Method downPolymorphic;
    try {
        downPolymorphic = camera.getClass().getMethod(
                "setDisplayOrientation", new Class[] { int.class });
        if (downPolymorphic != null)
            downPolymorphic.invoke(camera, new Object[] { angle });
    } catch (Exception e1) {
    }
}
Then I tried this code on a Galaxy Tab and it failed. I solved it (by trial and error) with the following code:
if (height == 1024 && width == 600) {
    Camera.Parameters parameters = camera.getParameters();
    parameters.set("orientation", "portrait");
    parameters.setRotation(90);
    camera.setParameters(parameters);
}
Now my two questions are:
1) Why is there such a problem when the Galaxy Tab runs version 2.2, and
2) Is there any better solution to this problem?
Thanks a lot for your time!
For setting the display orientation, check out the official docs; don't just hardcode 90 degrees there.
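For reference, the approach in the official Camera.setDisplayOrientation documentation derives the angle from the current display rotation and the camera's sensor orientation instead of hardcoding 90 degrees (it needs API 8+, which is why the question falls back to reflection on older versions):

public static void setCameraDisplayOrientation(Activity activity, int cameraId, Camera camera) {
    Camera.CameraInfo info = new Camera.CameraInfo();
    Camera.getCameraInfo(cameraId, info);
    int rotation = activity.getWindowManager().getDefaultDisplay().getRotation();
    int degrees = 0;
    switch (rotation) {
        case Surface.ROTATION_0:   degrees = 0;   break;
        case Surface.ROTATION_90:  degrees = 90;  break;
        case Surface.ROTATION_180: degrees = 180; break;
        case Surface.ROTATION_270: degrees = 270; break;
    }
    int result;
    if (info.facing == Camera.CameraInfo.CAMERA_FACING_FRONT) {
        result = (info.orientation + degrees) % 360;
        result = (360 - result) % 360;  // compensate for the front camera's mirroring
    } else {  // back-facing
        result = (info.orientation - degrees + 360) % 360;
    }
    camera.setDisplayOrientation(result);
}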
