Unable to take a picture using the camera - Android

I am trying to develop an Android application that will take pictures in a service. Before implementing the service, I am trying to implement the functionality on a button click. I've used the code available at the following location:
how to capture an image in background without using the camera application
However, I am getting an exception while executing the Camera.takePicture() call. LogCat shows the following:
12-10 14:46:56.135 589-18809/? E/QCameraStateMachine: int32_t qcamera::QCameraStateMachine::procEvtPreviewReadyState(qcamera_sm_evt_enum_t, void *): Error!! cannot handle evt(24) in state(1)
12-10 14:46:56.135 589-4574/? I/QCamera2HWI: [KPI Perf] static int qcamera::QCamera2HardwareInterface::pre_take_picture(struct camera_device *): X
12-10 14:46:56.135 589-4574/? E/QCamera2HWI: static int qcamera::QCamera2HardwareInterface::take_picture(struct camera_device *): pre_take_picture failed with ret = -38
12-10 14:46:56.135 589-4574/? I/QCameraHalWatchdog: Stopped Watchdog Thread (0ms)[take_picture]
I could not find any information on "cannot handle evt(24) in state(1)", nor on "qcamera::QCamera2HardwareInterface::pre_take_picture(struct camera_device *): X".
I have not copied the code here since it is already present in the link I've provided earlier.
My minimum compile version is KitKat and I am testing on a Motorola (Lenovo) device.
Your help is highly appreciated.
Edit:
Here is the code:
Camera.PictureCallback mCall = new Camera.PictureCallback() {
    public void onPictureTaken(byte[] data, Camera camera) {
        Log.d(TAG, "------ In onPictureTaken");
        // decode the data obtained by the camera into a Bitmap
        // display.setImageBitmap(photo);
        Bitmap bitmapPicture = BitmapFactory.decodeByteArray(data, 0, data.length);
    }
};
...
Camera c = null;
try {
    c = Camera.open();
    SurfaceView sv = new SurfaceView(this);
    SurfaceHolder holder = sv.getHolder();
    holder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
    c.setPreviewDisplay(holder);
    Camera.Parameters params = c.getParameters();
    params.setJpegQuality(100);
    c.setParameters(params);
    c.startPreview();
    holder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
    c.setPreviewDisplay(holder);
    c.takePicture(null, null, mCall);
} catch (Exception e) {
    Log.e(TAG, "---- problems with camera operation " + e.getMessage(), e);
    e.printStackTrace();
    Log.d(TAG, "------ 333333 ");
} finally {
    if (c != null) {
        c.stopPreview();
        c.release();
    }
}

Related

Android Camera2 pipeline for preview and MediaCodec input Surface

I'm trying to send Android Camera2 output to both a preview Surface and a surface obtained from MediaCodec.createInputSurface(). However, when I pass those surfaces to a call to CameraDevice.createCaptureSession and then try to build a CaptureRequest, I get:
android.hardware.camera2.CameraAccessException: CAMERA_ERROR (3): submitRequestList - cannot use a surface that wasn't configured.
The CaptureRequest building logic (see below) is from an official Flutter camera plugin and works fine when you use MediaRecorder.getSurface() instead of MediaCodec.createInputSurface(), which suggests that the MediaCodec surface hasn't been configured. I'm using a VideoEncoder class from well-tried open-source RTMP code (https://github.com/pedroSG94/rtmp-rtsp-stream-client-java) that works with the old Camera API (i.e. not Camera2). That class initialises the codec thus:
String type = CodecUtil.H264_MIME;
MediaCodecInfo encoder = chooseEncoder(type);
try {
    if (encoder != null) {
        codec = MediaCodec.createByCodecName(encoder.getName());
    } else {
        Log.e(TAG, "Valid encoder not found");
        return false;
    }
    MediaFormat videoFormat;
    // if you don't use MediaCodec rotation you need to swap width and height
    // in rotation 90 or 270 for correct encoding resolution
    String resolution;
    if (!hardwareRotation && (rotation == 90 || rotation == 270)) {
        resolution = height + "x" + width;
        videoFormat = MediaFormat.createVideoFormat(type, height, width);
    } else {
        resolution = width + "x" + height;
        videoFormat = MediaFormat.createVideoFormat(type, width, height);
    }
    Log.i(TAG, "Prepare video info: " + this.formatVideoEncoder.name() + ", " + resolution);
    videoFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT,
            this.formatVideoEncoder.getFormatCodec());
    videoFormat.setInteger(MediaFormat.KEY_MAX_INPUT_SIZE, 0);
    videoFormat.setInteger(MediaFormat.KEY_BIT_RATE, bitRate);
    videoFormat.setInteger(MediaFormat.KEY_FRAME_RATE, fps);
    videoFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, iFrameInterval);
    if (hardwareRotation) {
        videoFormat.setInteger("rotation-degrees", rotation);
    }
    if (this.avcProfile > 0 && this.avcProfileLevel > 0) {
        // MediaFormat.KEY_PROFILE, API > 21
        videoFormat.setInteger("profile", this.avcProfile);
        // MediaFormat.KEY_LEVEL, API > 23
        videoFormat.setInteger("level", this.avcProfileLevel);
    }
    codec.configure(videoFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
    inputSurface = codec.createInputSurface();
The code that fails when I try to build a capture request is in CameraCaptureSession.StateCallback.onConfigured, where the call to build() raises the exception:
createCaptureSession(CameraDevice.TEMPLATE_RECORD, successCallback, surfaceFromMediaCodec);

private void createCaptureSession(
        int templateType, Runnable onSuccessCallback, Surface... surfaces)
        throws CameraAccessException {
    // Close any existing capture session.
    closeCaptureSession();
    // Create a new capture builder.
    captureRequestBuilder = cameraDevice.createCaptureRequest(templateType);
    // Build Flutter surface to render to
    SurfaceTexture surfaceTexture = flutterTexture.surfaceTexture();
    surfaceTexture.setDefaultBufferSize(previewSize.getWidth(), previewSize.getHeight());
    Surface flutterSurface = new Surface(surfaceTexture);
    captureRequestBuilder.addTarget(flutterSurface);
    List<Surface> remainingSurfaces = Arrays.asList(surfaces);
    if (templateType != CameraDevice.TEMPLATE_PREVIEW) {
        // If it is not preview mode, add all surfaces as targets.
        for (Surface surface : remainingSurfaces) {
            captureRequestBuilder.addTarget(surface);
        }
    }
    // Prepare the callback
    CameraCaptureSession.StateCallback callback =
            new CameraCaptureSession.StateCallback() {
                @Override
                public void onConfigured(@NonNull CameraCaptureSession session) {
                    try {
                        if (cameraDevice == null) {
                            dartMessenger.send(
                                    DartMessenger.EventType.ERROR, "The camera was closed during configuration.");
                            return;
                        }
                        cameraCaptureSession = session;
                        captureRequestBuilder.set(
                                CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_MODE_AUTO);
                        cameraCaptureSession.setRepeatingRequest(captureRequestBuilder.build(), null, null);
                        if (onSuccessCallback != null) {
                            onSuccessCallback.run();
                        }
                    } catch (CameraAccessException | IllegalStateException | IllegalArgumentException e) {
                        Log.i(TAG, "exception building capture session " + e);
                        dartMessenger.send(DartMessenger.EventType.ERROR, e.getMessage());
                    }
                }

                @Override
                public void onConfigureFailed(@NonNull CameraCaptureSession cameraCaptureSession) {
                    dartMessenger.send(
                            DartMessenger.EventType.ERROR, "Failed to configure camera session.");
                }
            };
    // Collect all surfaces we want to render to.
    List<Surface> surfaceList = new ArrayList<>();
    surfaceList.add(flutterSurface);
    surfaceList.addAll(remainingSurfaces);
    // Start the session
    cameraDevice.createCaptureSession(surfaceList, callback, null);
}
If I remove the MediaCodec inputSurface as a build target, it works (but I don't capture anything into the MediaCodec). What am I missing? BTW, there are bits of Flutter code in the second code extract but there's no evidence that the embedding in Flutter is relevant.
Answering my own question. I was thrown off the scent by the misleading Exception message "cannot use a surface that wasn't configured". The surface was configured. And I thought I'd checked the sizes, but one was 720x480, the other was 480x720. It worked after I swapped.
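The fix came down to making the encoder size and the camera output size agree, including the width/height swap that rotation introduces. As a minimal sketch (plain Java, class and method names are mine, not from any Android API), a helper that flags exactly this kind of mismatch might look like:

```java
// Hypothetical helper: detects whether two sizes are equal, or differ only
// by a width/height swap (e.g. 720x480 vs 480x720), which was the bug here.
public class SizeMismatchCheck {

    /** True when both sizes are identical. */
    public static boolean sameSize(int w1, int h1, int w2, int h2) {
        return w1 == w2 && h1 == h2;
    }

    /** True when the sizes differ only by swapped width/height,
     *  the rotation-related mismatch described above. */
    public static boolean swappedSize(int w1, int h1, int w2, int h2) {
        return w1 == h2 && h1 == w2 && !sameSize(w1, h1, w2, h2);
    }
}
```

Logging such a check on the preview size and the MediaCodec input size before calling createCaptureSession would have pointed straight at the 720x480 vs 480x720 disagreement.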

Capture image in service without preview android

Problem statement:
When someone tries to open the device with the wrong pattern / PIN, my application should trigger an alarm, send an alert SMS to the registered mobile number. AND it should capture the image of the person trying to unlock the device, and send this image to registered Email ID.
What I have achieved:
I get a notification of a wrong pattern / PIN in my DeviceAdmin class.
I start a service for the background tasks. This service plays the alarm successfully.
I send an alert SMS to the registered mobile number.
I send an alert email to the registered email ID successfully (but without the image).
I am confused about how I can capture an image in a background IntentService when the device is locked, and without a preview.
I cannot use the camera intent, obviously, because startActivityForResult can't be called from a Service. Plus, I don't want the user to capture the image after opening the Camera app.
My research led me to these posts already.
Can I use Android Camera in service without preview?
How to Capture Image When Device is Locked
Issue is:
The Camera API is deprecated. The Camera2 API requires minimum SDK version 21,
but my client's requirement is minSdkVersion 15, which I can't change. I am unable to figure out what I should do now. Any reference or help, please?
I solved my issue with the help of this blog.
So I capture the image within the background service using this code:
@Override
public void onStart(Intent intent, int startId) {
    mCamera = getAvailableFrontCamera(); // globally declared instance of camera
    if (mCamera == null) {
        mCamera = Camera.open(); // Take rear facing camera only if no front camera available
    }
    SurfaceView sv = new SurfaceView(getApplicationContext());
    SurfaceTexture surfaceTexture = new SurfaceTexture(10);
    try {
        mCamera.setPreviewTexture(surfaceTexture);
        //mCamera.setPreviewDisplay(sv.getHolder());
        parameters = mCamera.getParameters();
        // set camera parameters
        mCamera.setParameters(parameters);
        // This boolean is used as the app crashes while writing images to file
        // if simultaneous calls are made to takePicture
        if (safeToCapture) {
            mCamera.startPreview();
            mCamera.takePicture(null, null, mCall);
        }
    } catch (IOException e) {
        e.printStackTrace();
    }
    // Get a surface
    sHolder = sv.getHolder();
    // tells Android that this surface will have its data constantly replaced
    sHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
}
Camera.PictureCallback mCall = new Camera.PictureCallback() {
    public void onPictureTaken(byte[] data, Camera camera) {
        safeToCapture = false;
        // decode the data obtained by the camera into a Bitmap
        FileOutputStream outStream = null;
        try {
            // create a File object for the parent directory
            File myDirectory = new File(Environment.getExternalStorageDirectory() + "/Test");
            // have the object build the directory structure, if needed.
            myDirectory.mkdirs();
            // SDF for getting current time for unique image name
            SimpleDateFormat curTimeFormat = new SimpleDateFormat("ddMMyyyyhhmmss");
            String curTime = curTimeFormat.format(new java.util.Date());
            // create a File object for the output file
            outStream = new FileOutputStream(myDirectory + "/user" + curTime + ".jpg");
            outStream.write(data);
            outStream.close();
            mCamera.release();
            mCamera = null;
            String strImagePath = Environment.getExternalStorageDirectory() + "/" + myDirectory.getName() + "/user" + curTime + ".jpg";
            sendEmailWithImage(strImagePath);
            Log.d("CAMERA", "picture clicked - " + strImagePath);
        } catch (FileNotFoundException e) {
            Log.d("CAMERA", e.getMessage());
        } catch (IOException e) {
            Log.d("CAMERA", e.getMessage());
        }
        safeToCapture = true; // Set the boolean to true again after saving the file.
    }
};
private Camera getAvailableFrontCamera() {
    int cameraCount = 0;
    Camera cam = null;
    Camera.CameraInfo cameraInfo = new Camera.CameraInfo();
    cameraCount = Camera.getNumberOfCameras();
    for (int camIdx = 0; camIdx < cameraCount; camIdx++) {
        Camera.getCameraInfo(camIdx, cameraInfo);
        if (cameraInfo.facing == Camera.CameraInfo.CAMERA_FACING_FRONT) {
            try {
                cam = Camera.open(camIdx);
            } catch (RuntimeException e) {
                Log.e("CAMERA", "Camera failed to open: " + e.getLocalizedMessage());
            }
        }
    }
    return cam;
}
// Send Email using javamail API as user will not be allowed to choose an available
// application using a Chooser dialog for the intent.
public void sendEmailWithImage(String imageFile) {
    .
    .
    .
}
Following uses-features and permissions will be needed for this in manifest file:
<uses-feature
    android:name="android.hardware.camera"
    android:required="false" />
<uses-feature
    android:name="android.hardware.camera.front"
    android:required="false" />

<uses-permission android:name="android.permission.SEND_MAIL" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.CAMERA" />
I have set the required property to false, so that users will be able to install my app even without any camera available. Maybe this can help someone in the future.

Size not in valid set when recording without preview

I am trying to record video using a Vivo X20 (7.1.1) and the camera2 api without using a preview and without recording sound (Strictly recording HD Video only).
I'm currently stuck because I cannot figure out how to successfully call MediaRecorder.setVideoSize() and record a video in HD. Currently when I run the app the log shows the error: Surface with size (w=1920, h=1080) and format 0x1 is not valid, size not in valid set: [1440x1080, 1280x960, 1280x720, 960x540, 800x480, 720x480, 768x432, 640x480, 384x288, 352x288, 320x240, 176x144]
The phone's stock camera app can record video up to 4K so I'm definitely missing something here. There are a total of two camera devices identified by CameraManager. When I use getOutPutFormats() from CameraCharacteristics it shows the same valid set of resolutions for both cameras and it is the same range as the above error message.
The below is the code I am using to initialize MediaRecorder and initiate a capture session:
public void StartRecordingVideo() {
    Initialize();
    recordingVideo = true;
    cameraManager = (CameraManager) this.getSystemService(Context.CAMERA_SERVICE);
    try {
        if (ActivityCompat.checkSelfPermission(this, Manifest.permission.CAMERA) == PackageManager.PERMISSION_GRANTED) {
            String[] cameraIDs = cameraManager.getCameraIdList();
            //LogAllCameraInfo();
            if (cameraIDs != null) {
                for (int x = 0; x < cameraIDs.length; x++) {
                    Log.d(LOG_ID, "ID: " + cameraIDs[x]);
                }
            }
            cameraManager.openCamera(deviceCameraID, cameraStateCallback, handler);
            Log.d(LOG_ID, "Successfully opened camera");
        } else {
            throw new IllegalAccessException();
        }
    } catch (Exception e) {
        recordingVideo = false;
        Log.e(LOG_ID, "Error during record video start: " + e.getMessage());
    }
}

private void Initialize() {
    videoRecordThread = new HandlerThread("video_capture");
    videoRecordThread.start();
    handler = new Handler(videoRecordThread.getLooper());
    try {
        vidRecorder = new MediaRecorder();
        vidRecorder.setVideoSource(MediaRecorder.VideoSource.SURFACE);
        vidRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
        vidRecorder.setVideoFrameRate(30);
        vidRecorder.setCaptureRate(30);
        vidRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.DEFAULT);
        vidRecorder.setVideoEncodingBitRate(10000000);
        vidRecorder.setVideoSize(1920, 1080);
        String videoFilename = Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_DOWNLOADS) + File.separator + System.currentTimeMillis() + ".mp4";
        vidRecorder.setOutputFile(videoFilename);
        Log.d(LOG_ID, "Starting video: " + videoFilename);
        vidRecorder.prepare();
    } catch (Exception e) {
        Log.e(LOG_ID, "Error during Initialize: " + e.getMessage());
    }
}
And the onReady/onSurfacePrepared/Camera onOpened callbacks:
@Override
public void onReady(CameraCaptureSession session) {
    Log.d(LOG_ID, "onReady: ");
    super.onReady(session);
    try {
        CaptureRequest.Builder builder = deviceCamera.createCaptureRequest(CameraDevice.TEMPLATE_RECORD);
        builder.addTarget(vidRecorder.getSurface());
        CaptureRequest request = builder.build();
        session.setRepeatingRequest(request, null, handler);
        vidRecorder.start();
    } catch (CameraAccessException e) {
        Log.d(LOG_ID, "Error on Ready: " + e.getMessage());
    }
}

@Override
public void onSurfacePrepared(CameraCaptureSession session, Surface surface) {
    Log.d(LOG_ID, "onSurfacePrepared: ");
    super.onSurfacePrepared(session, surface);
}

@Override
public void onOpened(CameraDevice camera) {
    Log.d(LOG_ID, "onOpened: ");
    deviceCamera = camera;
    try {
        camera.createCaptureSession(Arrays.asList(vidRecorder.getSurface()), recordSessionStateCallback, handler);
    } catch (CameraAccessException e) {
        Log.d(LOG_ID, "onOpened: " + e.getMessage());
    }
}
I've tried messing with the order of calls and the output format/encoder with no luck. I am sure that I have all the required permissions. Thanks in advance for your time!
This device most likely supports camera2 at the LEGACY level; check what the output of INFO_SUPPORTED_HARDWARE_LEVEL is.
LEGACY devices are effectively running camera2 on top of the legacy android.hardware.Camera API (more complex than that, but roughly true); as a result, their capabilities via camera2 are restricted.
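To check this on a device, you read CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL for the camera in question. As an illustration only, the sketch below is plain Java with the documented constant values hard-coded (0 = LIMITED, 1 = FULL, 2 = LEGACY, 3 = LEVEL_3); in real code you would compare against the CameraMetadata constants themselves rather than raw integers:

```java
// Illustrative decoder for the INFO_SUPPORTED_HARDWARE_LEVEL integer.
// The numeric values mirror the documented CameraMetadata constants.
public class HardwareLevel {
    public static String name(int level) {
        switch (level) {
            case 0:  return "LIMITED";
            case 1:  return "FULL";
            case 2:  return "LEGACY";
            case 3:  return "LEVEL_3";
            default: return "UNKNOWN(" + level + ")";
        }
    }
}
```

If the decoded level comes back LEGACY, the restricted recording resolutions described above are expected behaviour, not a bug in your code.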
The maximum recording resolution is one significant problem; android.hardware.Camera records videos via a magic path that the LEGACY mapping layer cannot directly use (there's no Surface involved). As a result, camera2 LEGACY can only record at the maximum preview resolution supported by android.hardware.Camera, not at the maximum recording resolution.
Sounds like this device has no support for 1:1 1080p preview, which is pretty unusual for a device launched so recently.
You can verify whether the set of supported preview sizes in the deprecated Camera API matches the list in your error; if it doesn't, there may be an OS bug in generating the list, so it'd be good to know.
But in general, you can't request sizes that aren't enumerated in the camera's CameraCharacteristics StreamConfigurationMap, no matter what the feature list on the side of the box says. Sometimes the OEM camera app has magic hooks to enable features that normal apps can't get to, often because the feature only works in some very specific set of circumstances which normal apps wouldn't know how to replicate.
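A simple pre-flight check follows from this: only pass a size to MediaRecorder.setVideoSize() if the camera actually enumerates it. The sketch below is plain Java with the enumerated sizes modelled as "WxH" strings (hypothetical helper names; on-device the list would come from the camera's StreamConfigurationMap):

```java
// Sketch of a pre-flight size check: confirm the requested recording size
// appears in the set of sizes the camera enumerates before configuring
// MediaRecorder. Sizes are modelled as "WxH" strings for illustration.
public class RecorderSizeCheck {
    public static boolean isEnumerated(String[] supported, int width, int height) {
        String wanted = width + "x" + height;
        for (String s : supported) {
            if (s.equals(wanted)) {
                return true;
            }
        }
        return false;
    }
}
```

Against the list in the error message above, this check would reject 1920x1080 up front instead of failing at session configuration.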

Surface for OpenCV with Android: already connected

I'm trying to connect my Android app with the OpenCV library, and I need to use the native camera to have more control over the camera options. To do this I found http://nezarobot.blogspot.it/2016/03/android-surfacetexture-camera2-opencv.html, which is what I need.
My problem is that when I use this code, with some small changes, and launch it, my app crashes with 3 errors reported:
E/BufferQueueProducer: [SurfaceTexture-0-31525-0] connect(P): already connected (cur=4 req=2)
D/PlateNumberDetection/DetectionBasedTracker: ANativeWindow_lock failed with error code -22
A/libc: Fatal signal 11 (SIGSEGV), code 1, fault addr 0x315e9858 in tid 31735 (CameraBackgroun)
I have tried closing the camera before the JNI call, and then I can capture and show only the first frame; after that I need to restart the camera, and I can't recreate the same thread by itself.
Here I take the frame and send it to the NDK:
private final ImageReader.OnImageAvailableListener mOnImageAvailableListener
        = new ImageReader.OnImageAvailableListener() {
    @Override
    public void onImageAvailable(ImageReader reader) {
        Image image;
        try {
            image = reader.acquireLatestImage();
            if (image == null) {
                return;
            }
            if (image.getFormat() != ImageFormat.YUV_420_888) {
                throw new IllegalArgumentException("image must have format YUV_420_888.");
            }
            Image.Plane[] planes = image.getPlanes();
            if (planes[1].getPixelStride() != 1 && planes[1].getPixelStride() != 2) {
                throw new IllegalArgumentException(
                        "src chroma plane must have a pixel stride of 1 or 2: got "
                                + planes[1].getPixelStride());
            }
            mNativeDetector.detect(image.getWidth(), image.getHeight(), planes[0].getBuffer(), surface);
        } catch (IllegalStateException e) {
            Log.e(TAG, "Too many images queued for saving, dropping image for request: ", e);
            return;
        }
        image.close();
    }
};
and here I manage the camera preview
protected void createCameraPreview() {
    try {
        SurfaceTexture texture = textureView.getSurfaceTexture();
        assert texture != null;
        texture.setDefaultBufferSize(imageDimension.getWidth(), imageDimension.getHeight());
        surface = new Surface(texture);
        captureRequestBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
        captureRequestBuilder.addTarget(mImageReader.get().getSurface());
        BlockingSessionCallback sessionCallback = new BlockingSessionCallback();
        List<Surface> outputSurfaces = new ArrayList<>();
        outputSurfaces.add(mImageReader.get().getSurface());
        outputSurfaces.add(new Surface(textureView.getSurfaceTexture()));
        cameraDevice.createCaptureSession(outputSurfaces, sessionCallback, mBackgroundHandler);
        try {
            Log.d(TAG, "waiting on session.");
            cameraCaptureSessions = sessionCallback.waitAndGetSession(SESSION_WAIT_TIMEOUT_MS);
            try {
                captureRequestBuilder.set(CaptureRequest.CONTROL_MODE, CaptureRequest.CONTROL_MODE_AUTO);
                Log.d(TAG, "setting repeating request");
                cameraCaptureSessions.setRepeatingRequest(captureRequestBuilder.build(),
                        mCaptureCallback, mBackgroundHandler);
            } catch (CameraAccessException e) {
                e.printStackTrace();
            }
        } catch (TimeoutRuntimeException e) {
            Toast.makeText(AydaMainActivity.this, "Failed to configure capture session.", Toast.LENGTH_SHORT).show();
        }
    } catch (CameraAccessException e) {
        e.printStackTrace();
    }
}
Have you tried that code without your "some small change" first? I have tried that project and it worked well on multiple devices. So it will be useful to first establish if it does not work on your phone at all, or if there is a problem in your modifications.

Android camera fails in IntentService on Lollipop

I'm trying to take a picture in the background from an IntentService. On Android 4.4 all is OK, but on Android 5.1 I get an error:
03-18 14:35:54.497 7659-8956/xyz.bringoff.proximityphoto.app E/InvisibleCameraService: Can't use a camera: Fail to connect to camera service
My code looks like this:
private void handleActionShot() {
    mSensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
    mProximity = mSensorManager.getDefaultSensor(Sensor.TYPE_PROXIMITY);
    releaseCamera();
    mCamera = getCameraInstance();
}

public Camera getCameraInstance() {
    Camera c = null;
    int numCams = Camera.getNumberOfCameras();
    if (numCams > 0) {
        try {
            c = Camera.open(Camera.CameraInfo.CAMERA_FACING_BACK);
        } catch (RuntimeException e) {
            Log.e(TAG, "Can't use a camera: " + e.getMessage());
            releaseCamera();
            return null;
        }
    }
    if (c != null) {
        c.setParameters(getProperParametersForCurrentDevice(c));
        c.lock();
    }
    return c;
}
I did not find a documented difference between camera requests on these two Android versions.
I have found the problem. The receiver caught the intent with STATE_OFFHOOK twice and accordingly started the IntentService twice. I got a camera instance on the first start, so by the second time onHandleIntent() ran, the camera was already locked. So, I added a check in the onHandleIntent() method:
if (ACTION_SHOT.equals(action) && !mAlreadyRunning) {
    mAlreadyRunning = true;
    handleActionShot();
}
and now it works.
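The plain boolean guard is fine for a single-threaded IntentService, but if the two starts could ever race, an atomic check-and-set is the safer form of the same idea. A minimal sketch (class and method names are mine, not part of any Android API):

```java
import java.util.concurrent.atomic.AtomicBoolean;

// Sketch of a race-safe version of the mAlreadyRunning guard: only the
// first caller wins; later callers are rejected until reset() is called.
public class ShotGuard {
    private final AtomicBoolean running = new AtomicBoolean(false);

    /** Returns true only for the caller that transitions idle -> running. */
    public boolean tryStart() {
        return running.compareAndSet(false, true);
    }

    /** Allows the next shot, e.g. after the picture has been saved. */
    public void reset() {
        running.set(false);
    }
}
```

Inside onHandleIntent() one would then call tryStart() instead of testing and setting the flag in two separate steps.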
