Capture image in service without preview (Android)

Problem statement:
When someone tries to unlock the device with the wrong pattern / PIN, my application should trigger an alarm and send an alert SMS to the registered mobile number. It should also capture an image of the person trying to unlock the device and send this image to the registered email ID.
What I have achieved:
I get the notification of a wrong pattern / PIN in my DeviceAdmin class.
I start a service for the background tasks. This service plays the alarm successfully.
I send an alert SMS to the registered mobile number.
I send an alert email to the registered email ID successfully (but without the image).
I am confused about how I can capture an image in a background IntentService while the device is locked, and do so without any preview.
Obviously I cannot use the camera intent, because startActivityForResult can't be called from a Service. Also, I don't want the user to have to capture the image by opening the camera app.
My research led me to these posts already.
Can I use Android Camera in service without preview?
How to Capture Image When Device is Locked
The issue is:
The Camera API is deprecated, and the Camera2 API requires a minimum SDK version of 21,
but my client's requirement is minSdkVersion 15, which I can't change. I am unable to figure out what I should do now. Any reference or help, please?
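(For reference, the split point is API 21, i.e. Build.VERSION_CODES.LOLLIPOP, so any approach mixing the two camera stacks would have to branch on the runtime API level. A rough sketch with placeholder method names, not code from my app:)
// Rough sketch only: branch on the runtime API level (Camera2 needs API 21 / LOLLIPOP).
// captureWithCamera2() and captureWithLegacyCamera() are placeholder names, not real methods.
private void captureHiddenPhoto() {
    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.LOLLIPOP) {
        captureWithCamera2();        // android.hardware.camera2 path, API 21+
    } else {
        captureWithLegacyCamera();   // deprecated android.hardware.Camera path, API 15-20
    }
}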

I solved my issue with the help of this blog.
So I capture the image within the background service using this code:
@Override
public void onStart(Intent intent, int startId) {
    mCamera = getAvailableFrontCamera(); // globally declared instance of Camera
    if (mCamera == null) {
        mCamera = Camera.open(); // take the rear-facing camera only if no front camera is available
    }

    SurfaceView sv = new SurfaceView(getApplicationContext());
    SurfaceTexture surfaceTexture = new SurfaceTexture(10);
    try {
        mCamera.setPreviewTexture(surfaceTexture);
        //mCamera.setPreviewDisplay(sv.getHolder());
        parameters = mCamera.getParameters();
        //set camera parameters
        mCamera.setParameters(parameters);
        //This boolean is used because the app crashes while writing images to file if simultaneous calls are made to takePicture
        if (safeToCapture) {
            mCamera.startPreview();
            mCamera.takePicture(null, null, mCall);
        }
    } catch (IOException e) {
        e.printStackTrace();
    }

    //Get a surface
    sHolder = sv.getHolder();
    //tells Android that this surface will have its data constantly replaced
    sHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
}
Camera.PictureCallback mCall = new Camera.PictureCallback() {
    public void onPictureTaken(byte[] data, Camera camera) {
        safeToCapture = false;
        //write the JPEG data obtained from the camera to a file
        FileOutputStream outStream = null;
        try {
            // create a File object for the parent directory
            File myDirectory = new File(Environment.getExternalStorageDirectory() + "/Test");
            // have the object build the directory structure, if needed
            myDirectory.mkdirs();
            //SimpleDateFormat for getting the current time for a unique image name
            SimpleDateFormat curTimeFormat = new SimpleDateFormat("ddMMyyyyhhmmss");
            String curTime = curTimeFormat.format(new java.util.Date());
            // create a File object for the output file
            outStream = new FileOutputStream(myDirectory + "/user" + curTime + ".jpg");
            outStream.write(data);
            outStream.close();
            mCamera.release();
            mCamera = null;
            String strImagePath = Environment.getExternalStorageDirectory() + "/" + myDirectory.getName() + "/user" + curTime + ".jpg";
            sendEmailWithImage(strImagePath);
            Log.d("CAMERA", "picture clicked - " + strImagePath);
        } catch (FileNotFoundException e) {
            Log.d("CAMERA", e.getMessage());
        } catch (IOException e) {
            Log.d("CAMERA", e.getMessage());
        }
        safeToCapture = true; //set the boolean back to true after saving the file
    }
};
private Camera getAvailableFrontCamera() {
    int cameraCount = 0;
    Camera cam = null;
    Camera.CameraInfo cameraInfo = new Camera.CameraInfo();
    cameraCount = Camera.getNumberOfCameras();
    for (int camIdx = 0; camIdx < cameraCount; camIdx++) {
        Camera.getCameraInfo(camIdx, cameraInfo);
        if (cameraInfo.facing == Camera.CameraInfo.CAMERA_FACING_FRONT) {
            try {
                cam = Camera.open(camIdx);
            } catch (RuntimeException e) {
                Log.e("CAMERA", "Camera failed to open: " + e.getLocalizedMessage());
            }
        }
    }
    return cam;
}
//Send the email using the JavaMail API, as the user will not be allowed to choose
//an available application via a Chooser dialog and an intent.
public void sendEmailWithImage(String imageFile) {
    .
    .
    .
}
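The body of sendEmailWithImage isn't shown here. Purely as a rough sketch of the JavaMail approach (the SMTP host, port, credentials, addresses and subject line are placeholders, not values from my app), it could look something like this:
// Sketch only - not the original implementation. Uses javax.mail.* (Session, MimeMessage,
// Transport, ...) and javax.activation (DataHandler, FileDataSource). Placeholder SMTP settings.
public void sendEmailWithImage(String imageFile) {
    try {
        Properties props = new Properties();
        props.put("mail.smtp.host", "smtp.example.com");   // placeholder host
        props.put("mail.smtp.port", "465");
        props.put("mail.smtp.auth", "true");
        props.put("mail.smtp.socketFactory.class", "javax.net.ssl.SSLSocketFactory");

        Session session = Session.getInstance(props, new javax.mail.Authenticator() {
            @Override
            protected PasswordAuthentication getPasswordAuthentication() {
                return new PasswordAuthentication("sender@example.com", "password"); // placeholders
            }
        });

        MimeMessage message = new MimeMessage(session);
        message.setFrom(new InternetAddress("sender@example.com"));
        message.addRecipient(Message.RecipientType.TO, new InternetAddress("registered@example.com"));
        message.setSubject("Security alert: failed unlock attempt");

        MimeBodyPart textPart = new MimeBodyPart();
        textPart.setText("Someone tried to unlock the device. Photo attached.");

        MimeBodyPart imagePart = new MimeBodyPart();
        imagePart.setDataHandler(new DataHandler(new FileDataSource(imageFile)));
        imagePart.setFileName(new File(imageFile).getName());

        Multipart multipart = new MimeMultipart();
        multipart.addBodyPart(textPart);
        multipart.addBodyPart(imagePart);
        message.setContent(multipart);

        Transport.send(message); // network call - fine inside an IntentService worker thread
    } catch (MessagingException e) {
        Log.e("EMAIL", "Failed to send alert email", e);
    }
}
This would also need the JavaMail-for-Android libraries on the classpath and the android.permission.INTERNET permission in the manifest.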
The following uses-feature declarations and permissions will be needed in the manifest file:
<uses-feature
    android:name="android.hardware.camera"
    android:required="false" />
<uses-feature
    android:name="android.hardware.camera.front"
    android:required="false" />

<uses-permission android:name="android.permission.SEND_MAIL" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.CAMERA" />
I have set the required property to false, so that users will be able to install my app even on devices without any camera available. Maybe this can help someone in the future.
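Since the camera features are declared as optional, the capture path should also be guarded at runtime. A small sketch of such a check (not part of the original answer):
// Sketch: skip the capture step gracefully on devices with no camera at all.
PackageManager pm = getPackageManager();
boolean hasAnyCamera = pm.hasSystemFeature(PackageManager.FEATURE_CAMERA)
        || pm.hasSystemFeature(PackageManager.FEATURE_CAMERA_FRONT);
if (!hasAnyCamera) {
    Log.w("CAMERA", "No camera on this device - sending the alert without a photo");
    // fall back to the alarm + SMS/email alert only
}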

Related

Size not in valid set when recording without preview

I am trying to record video using a Vivo X20 (7.1.1) and the camera2 api without using a preview and without recording sound (Strictly recording HD Video only).
I'm currently stuck because I cannot figure out how to successfully call MediaRecorder.setVideoSize() and record a video in HD. Currently when I run the app the log shows the error: Surface with size (w=1920, h=1080) and format 0x1 is not valid, size not in valid set: [1440x1080, 1280x960, 1280x720, 960x540, 800x480, 720x480, 768x432, 640x480, 384x288, 352x288, 320x240, 176x144]
The phone's stock camera app can record video up to 4K so I'm definitely missing something here. There are a total of two camera devices identified by CameraManager. When I use getOutPutFormats() from CameraCharacteristics it shows the same valid set of resolutions for both cameras and it is the same range as the above error message.
The below is the code I am using to initialize MediaRecorder and initiate a capture session:
public void StartRecordingVideo() {
    Initialize();
    recordingVideo = true;
    cameraManager = (CameraManager) this.getSystemService(Context.CAMERA_SERVICE);
    try {
        if (ActivityCompat.checkSelfPermission(this, Manifest.permission.CAMERA) == PackageManager.PERMISSION_GRANTED) {
            String[] cameraIDs = cameraManager.getCameraIdList();
            //LogAllCameraInfo();
            if (cameraIDs != null) {
                for (int x = 0; x < cameraIDs.length; x++) {
                    Log.d(LOG_ID, "ID: " + cameraIDs[x]);
                }
            }
            cameraManager.openCamera(deviceCameraID, cameraStateCallback, handler);
            Log.d(LOG_ID, "Successfully opened camera");
        } else {
            throw new IllegalAccessException();
        }
    } catch (Exception e) {
        recordingVideo = false;
        Log.e(LOG_ID, "Error during record video start: " + e.getMessage());
    }
}
private void Initialize() {
    videoRecordThread = new HandlerThread("video_capture");
    videoRecordThread.start();
    handler = new Handler(videoRecordThread.getLooper());
    try {
        vidRecorder = new MediaRecorder();
        vidRecorder.setVideoSource(MediaRecorder.VideoSource.SURFACE);
        vidRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
        vidRecorder.setVideoFrameRate(30);
        vidRecorder.setCaptureRate(30);
        vidRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.DEFAULT);
        vidRecorder.setVideoEncodingBitRate(10000000);
        vidRecorder.setVideoSize(1920, 1080);
        String videoFilename = Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_DOWNLOADS)
                + File.separator + System.currentTimeMillis() + ".mp4";
        vidRecorder.setOutputFile(videoFilename);
        Log.d(LOG_ID, "Starting video: " + videoFilename);
        vidRecorder.prepare();
    } catch (Exception e) {
        Log.e(LOG_ID, "Error during Initialize: " + e.getMessage());
    }
}
And the onReady/onSurfacePrepared/Camera onOpened callbacks:
@Override
public void onReady(CameraCaptureSession session) {
    Log.d(LOG_ID, "onReady: ");
    super.onReady(session);
    try {
        CaptureRequest.Builder builder = deviceCamera.createCaptureRequest(CameraDevice.TEMPLATE_RECORD);
        builder.addTarget(vidRecorder.getSurface());
        CaptureRequest request = builder.build();
        session.setRepeatingRequest(request, null, handler);
        vidRecorder.start();
    } catch (CameraAccessException e) {
        Log.d(LOG_ID, "Error on Ready: " + e.getMessage());
    }
}

@Override
public void onSurfacePrepared(CameraCaptureSession session, Surface surface) {
    Log.d(LOG_ID, "onSurfacePrepared: ");
    super.onSurfacePrepared(session, surface);
}

@Override
public void onOpened(CameraDevice camera) {
    Log.d(LOG_ID, "onOpened: ");
    deviceCamera = camera;
    try {
        camera.createCaptureSession(Arrays.asList(vidRecorder.getSurface()), recordSessionStateCallback, handler);
    } catch (CameraAccessException e) {
        Log.d(LOG_ID, "onOpened: " + e.getMessage());
    }
}
I've tried messing with the order of calls and the output format/encoder with no luck. I am sure that I have all the required permissions. Thanks in advance for your time!
This device most likely supports camera2 at the LEGACY level; check what the output of INFO_SUPPORTED_HARDWARE_LEVEL is.
LEGACY devices are effectively running camera2 on top of the legacy android.hardware.Camera API (more complex than that, but roughly true); as a result, their capabilities via camera2 are restricted.
The maximum recording resolution is one significant problem; android.hardware.Camera records videos via a magic path that the LEGACY mapping layer cannot directly use (there's no Surface involved). As a result, camera2 LEGACY can only record at the maximum preview resolution supported by android.hardware.Camera, not at the maximum recording resolution.
Sounds like this device has no support for 1:1 1080p preview, which is pretty unusual for a device launched so recently.
You can verify whether the set of supported preview sizes in the deprecated Camera API matches the list you get in your error; if it doesn't, then there may be an OS bug in generating the list, so it'd be good to know.
But in general, you can't request sizes that aren't enumerated in the CameraCharacteristics StreamConfigurationMap for the camera, no matter what the feature list on the side of the box says. Sometimes the OEM camera app has magic hooks to enable features that normal apps can't get to, often because the feature only works in some very specific set of circumstances, which normal apps wouldn't know how to replicate.
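A quick way to check both points (a sketch only; the camera id "0", the MediaRecorder output class, and the context variable are assumptions):
// Sketch: log the camera2 hardware level and size lists, then the old API's preview sizes.
try {
    CameraManager manager = (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
    CameraCharacteristics chars = manager.getCameraCharacteristics("0"); // assuming the back camera is "0"

    Integer level = chars.get(CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL);
    Log.d("CamCheck", "Hardware level: " + level
            + " (LEGACY = " + CameraMetadata.INFO_SUPPORTED_HARDWARE_LEVEL_LEGACY + ")");

    StreamConfigurationMap map = chars.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
    Log.d("CamCheck", "camera2 MediaRecorder sizes: "
            + Arrays.toString(map.getOutputSizes(MediaRecorder.class)));
} catch (CameraAccessException e) {
    Log.e("CamCheck", "camera2 query failed", e);
}

// The same question asked of the deprecated API, to compare against the "valid set" in the error.
Camera legacyCamera = Camera.open(0);
for (Camera.Size s : legacyCamera.getParameters().getSupportedPreviewSizes()) {
    Log.d("CamCheck", "Old API preview size: " + s.width + "x" + s.height);
}
legacyCamera.release();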

Unable to take a picture by using the camera

I am trying to develop an Android application that will take pictures in a service. Before implementing the service, I am trying to implement the functionality on a button click. I've used the code available at the following location:
how to capture an image in background without using the camera application
However, I am getting an exception while executing the Camera.takePicture() call. LogCat shows the following:
12-10 14:46:56.135 589-18809/? E/QCameraStateMachine: int32_t qcamera::QCameraStateMachine::procEvtPreviewReadyState(qcamera_sm_evt_enum_t, void *): Error!! cannot handle evt(24) in state(1)
12-10 14:46:56.135 589-4574/? I/QCamera2HWI: [KPI Perf] static int qcamera::QCamera2HardwareInterface::pre_take_picture(struct camera_device *): X
12-10 14:46:56.135 589-4574/? E/QCamera2HWI: static int qcamera::QCamera2HardwareInterface::take_picture(struct camera_device *): pre_take_picture failed with ret = -38
12-10 14:46:56.135 589-4574/? I/QCameraHalWatchdog: Stopped Watchdog Thread (0ms)[take_picture]
I could not find any information on "cannot handle evt(24) in state(1)", nor on "qcamera::QCamera2HardwareInterface::pre_take_picture(struct camera_device *): X".
I have not copied the code here since it is already present in the link I've provided above.
My minimum compile version is KitKat and I am using a Motorola (Lenovo) device.
Your help is highly appreciated.
Edit:
Here is the code
Camera.PictureCallback mCall = new Camera.PictureCallback() {
    public void onPictureTaken(byte[] data, Camera camera) {
        Log.d(TAG, "------ In onPictureTaken");
        // decode the data obtained by the camera into a Bitmap
        // display.setImageBitmap(photo);
        Bitmap bitmapPicture = BitmapFactory.decodeByteArray(data, 0, data.length);
    }
};
...
Camera c = null;
try {
    c = Camera.open();
    SurfaceView sv = new SurfaceView(this);
    SurfaceHolder holder = sv.getHolder();
    holder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
    c.setPreviewDisplay(holder);
    Camera.Parameters params = c.getParameters();
    params.setJpegQuality(100);
    c.setParameters(params);
    c.startPreview();
    holder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
    c.setPreviewDisplay(holder);
    c.takePicture(null, null, mCall);
} catch (Exception e) {
    Log.e(TAG, "---- problems with camera operation " + e.getMessage(), e);
    e.printStackTrace();
    Log.d(TAG, "------ 333333 ");
} finally {
    if (c != null) {
        c.stopPreview();
        c.release();
    }
}

Tokbox video publish unfocused (Android)

I am using TokBox for video chat and I want to take pictures of printed documents. When I try this on a Samsung Galaxy S7 Edge, my captured image is so unfocused that I cannot read it. When I try this on a Nexus 6P, the image is fine.
It is not a resolution problem; I am always using CameraCaptureResolution.HIGH.
Any thoughts?
I fixed it:
I used this class:
https://github.com/opentok/opentok-android-sdk-samples/blob/master/Custom-Video-Driver/app/src/main/java/com/tokbox/android/tutorials/customvideodriver/CustomVideoCapturer.java
Changed the init function:
@Override
public void init() {
    mCamera = Camera.open(mCameraIndex);
    mCurrentDeviceInfo = new Camera.CameraInfo();
    Camera.getCameraInfo(mCameraIndex, mCurrentDeviceInfo);
    try {
        //set camera to continually auto-focus
        Camera.Parameters params = mCamera.getParameters();
        params.setFocusMode(Camera.Parameters.FOCUS_MODE_CONTINUOUS_PICTURE);
        mCamera.setParameters(params);
    } catch (Exception e) {
        e.printStackTrace();
    }
}
and swapCamera as well, so that every time the back camera comes into play it has continuous autofocus enabled (a sketch of that change is below).
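The swapCamera change itself isn't shown in the answer; a hedged sketch of the kind of helper it would call after opening the new camera (the method name is an assumption, not part of the OpenTok sample):
// Sketch: apply continuous autofocus whenever a camera is (re)opened, if the mode is supported.
private void applyContinuousFocus(Camera camera) {
    try {
        Camera.Parameters params = camera.getParameters();
        if (params.getSupportedFocusModes()
                .contains(Camera.Parameters.FOCUS_MODE_CONTINUOUS_PICTURE)) {
            params.setFocusMode(Camera.Parameters.FOCUS_MODE_CONTINUOUS_PICTURE);
            camera.setParameters(params);
        }
    } catch (RuntimeException e) {
        e.printStackTrace(); // some front cameras reject focus-mode changes
    }
}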
And in my Activity, onConnected:
CustomVideoCapturer mCapturer = new CustomVideoCapturer(a);
mPublisher = new Publisher.Builder(this)
        .capturer(mCapturer)
        .resolution(Publisher.CameraCaptureResolution.HIGH)
        .build();

My camera does not work on my new device (NullPointerException) [duplicate]

This question already has answers here:
What is a NullPointerException, and how do I fix it?
(12 answers)
Closed 5 years ago.
After I changed my device, my camera suddenly stopped working and started throwing a
java.lang.NullPointerException: Attempt to invoke virtual method 'android.hardware.Camera$Parameters android.hardware.Camera.getParameters()' on a null object reference
I tried changing my catch (RuntimeException ex) to finally,
but that instead gave me a
"Fail to connect to camera service" exception.
public void surfaceCreated(SurfaceHolder surfaceHolder) {
    String PreviewFPS;
    String Previewsize;
    String Displayor;
    PreviewFPS = setingPreferences.getString("previewfps", "");
    Previewsize = setingPreferences.getString("screensize", "");
    Displayor = setingPreferences.getString("orientation", "");
    String[] size = Previewsize.split(",");
    try {
        camera = Camera.open();
    } catch (RuntimeException ex) {
    }
    Camera.Parameters parameters;
    parameters = camera.getParameters();
    //modify the parameters
    parameters.setPreviewFrameRate(Integer.parseInt(PreviewFPS));
    parameters.setPreviewSize(Integer.parseInt(size[0]), Integer.parseInt(size[1]));
    camera.setParameters(parameters);
    parameters.setFocusMode(Camera.Parameters.FOCUS_MODE_CONTINUOUS_PICTURE);
    camera.setDisplayOrientation(Integer.parseInt(Displayor));
    try {
        camera.setPreviewDisplay(surfaceHolder);
        camera.startPreview();
    } catch (Exception ex) {
    }
}
manifest
<uses-permission android:name="android.permission.CAMERA" />
<uses-feature android:name="android.hardware.camera" />
<uses-feature android:name="android.hardware.camera.autofocus" />
The manufacturer of the device investigated the code and told me that the reason was that it couldn't run the scanner and the camera at the same time:
Hi Kewin
 
Would you please share the source code to me? as we haven't tested if it can run both the camera and scanner
 
We checked, the scanner and camera can not be opened in same time.
 
Best Regards,
So my fix was to run my camera and my scanner in two separate activities with separate layouts. Hopefully this will help someone else.
You need all the camera code inside the try block, and you should read the stack traces if you actually print them:
try {
    camera = Camera.open();
    Camera.Parameters parameters = camera.getParameters();
    ...
} catch (RuntimeException ex) {
    ex.printStackTrace();
}
// camera remains null if an exception is caught...

Android Camera2, take picture continuously

I need to take pictures continuously with the Camera2 API. It works fine on high-end devices (for instance a Nexus 5X), but on slower ones (for instance a Samsung Galaxy A3), the preview freezes.
The code is a bit long, so I'm posting only the most relevant parts:
Method called to start my preview:
private void startPreview() {
    SurfaceTexture texture = mTextureView.getSurfaceTexture();
    if (texture != null) {
        try {
            // We configure the size of the default buffer to be the size of the camera preview we want.
            texture.setDefaultBufferSize(mPreviewSize.getWidth(), mPreviewSize.getHeight());
            // This is the output Surface we need to start the preview.
            Surface surface = new Surface(texture);
            // We set up a CaptureRequest.Builder with the output Surface.
            mPreviewRequestBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
            mPreviewRequestBuilder.addTarget(surface);
            // Here, we create a CameraCaptureSession for the camera preview.
            mCameraDevice.createCaptureSession(Arrays.asList(surface, mImageReader.getSurface()),
                    new CameraCaptureSession.StateCallback() {
                        @Override
                        public void onConfigured(@NonNull CameraCaptureSession cameraCaptureSession) {
                            // If the camera is already closed, return:
                            if (mCameraDevice == null) { return; }
                            // When the session is ready, we start displaying the preview.
                            mCaptureSession = cameraCaptureSession;
                            // Auto focus should be continuous for camera preview.
                            mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_MODE,
                                    CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
                            mPreviewRequest = mPreviewRequestBuilder.build();
                            // Start the preview
                            try {
                                mCaptureSession.setRepeatingRequest(mPreviewRequest, null, mPreviewBackgroundHandler);
                            } catch (CameraAccessException e) {
                                e.printStackTrace();
                            }
                        }

                        @Override
                        public void onConfigureFailed(@NonNull CameraCaptureSession cameraCaptureSession) {
                            Log.e(TAG, "Configure failed");
                        }
                    }, null);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }
}
Method called to take a picture:
private void takePicture() {
    try {
        CaptureRequest.Builder captureBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
        captureBuilder.addTarget(mImageReader.getSurface());
        mCaptureSession.capture(captureBuilder.build(), null, mCaptureBackgroundHandler);
    } catch (CameraAccessException e) {
        e.printStackTrace();
    }
}
And here is my ImageReader:
private final ImageReader.OnImageAvailableListener mOnImageAvailableListener = new ImageReader.OnImageAvailableListener() {
    @Override
    public void onImageAvailable(final ImageReader reader) {
        mSaveBackgroundHandler.post(new Runnable() {
            @Override
            public void run() {
                // Set the destination file:
                File destination = new File(getExternalFilesDir(null), "image_" + mNumberOfImages + ".jpg");
                mNumberOfImages++;
                // Acquire the latest image:
                Image image = reader.acquireLatestImage();
                // Save the image:
                ByteBuffer buffer = image.getPlanes()[0].getBuffer();
                byte[] bytes = new byte[buffer.remaining()];
                buffer.get(bytes);
                FileOutputStream output = null;
                try {
                    output = new FileOutputStream(destination);
                    output.write(bytes);
                } catch (IOException e) {
                    e.printStackTrace();
                } finally {
                    image.close();
                    if (null != output) {
                        try {
                            output.close();
                        } catch (IOException e) {
                            e.printStackTrace();
                        }
                    }
                }
                // Take a new picture if needed:
                if (mIsTakingPictures) {
                    takePicture();
                }
            }
        });
    }
};
I have a button that toggles the mIsTakingPictures boolean and makes the first takePicture call.
To recap, I'm using 3 threads:
one for the preview
one for the capture
one for the image saving
What could be causing this freeze?
It's impossible to avoid losing frames in your preview when you are taking images all the time on weak devices. The only way to avoid this is on devices which support TEMPLATE_ZERO_SHUTTER_LAG, using a reprocessable capture session. The documentation about this is pretty horrible, and finding a way to implement it can be an odyssey. I had this problem a few months ago and finally found a way to implement it:
How to use a reprocessCaptureRequest with camera2 API
In that answer you can also find some Google CTS tests which implement a reprocessable capture session and shoot burst captures with the ZSL template.
Finally, you can also use a CaptureRequest.Builder with both your preview surface and the image reader surface attached (sketched below); in that case your preview will keep working the whole time and you will also save each frame as a new picture. But you will still have the freeze problem.
I also tried implementing a burst capture using a handler which dispatches a new capture call every 100 milliseconds; this second option performed pretty well and avoided frame rate loss, but you will not get as many captures per second as with the two-ImageReader option.
I hope my answer helps you a bit; the Camera2 API is still a bit complex and there are not many examples or much information about it.
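For reference, a sketch of that "both surfaces on the repeating request" variant, reusing the names from the question (previewSurface stands for the Surface built from the TextureView); this is an illustration, not tested code:
// Sketch: every preview frame is also delivered to the ImageReader, so each frame can be saved.
// The session must have been created with both surfaces, as in the question's createCaptureSession call.
try {
    CaptureRequest.Builder builder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
    builder.addTarget(previewSurface);             // Surface built from the TextureView
    builder.addTarget(mImageReader.getSurface());  // same reader used for stills in the question
    builder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
    mCaptureSession.setRepeatingRequest(builder.build(), null, mPreviewBackgroundHandler);
} catch (CameraAccessException e) {
    e.printStackTrace();
}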
One thing I noticed on low-end devices: the preview stops after a capture, even when using the Camera1 API, so it has to be restarted manually, thus producing a small preview freeze when capturing a high-resolution picture.
But the Camera2 API provides the possibility to get an uncompressed (YUV) image when taking a still capture, which wasn't possible on the devices I have when using Camera1 (Huawei P7, Sony Xperia E5, Wiko UFeel). Using this feature is much faster than capturing a JPEG (probably because of the JPEG encoding), so the preview can be restarted earlier and the preview freeze is shorter. Of course, with this solution you'll have to convert the picture from YUV to JPEG in a background task.
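A sketch of the setup side of that idea (the size is a placeholder and the YUV-to-JPEG conversion is deliberately left out):
// Sketch: request uncompressed YUV frames instead of JPEG so the still capture returns faster.
ImageReader yuvReader = ImageReader.newInstance(1920, 1080, ImageFormat.YUV_420_888, 2);
yuvReader.setOnImageAvailableListener(new ImageReader.OnImageAvailableListener() {
    @Override
    public void onImageAvailable(ImageReader reader) {
        Image image = reader.acquireLatestImage();
        if (image != null) {
            // Hand the YUV planes to a background task that converts/encodes them to JPEG,
            // then close the Image so the reader can reuse the buffer. Conversion not shown.
            image.close();
        }
    }
}, mSaveBackgroundHandler);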
