How to crop SurfaceView in CameraSource for OCR - android

How do I crop the SurfaceView used by CameraSource so that OCR runs only on the part of the picture that is inside the frame?
I followed this tutorial, and for now the whole surface is recognized, while I want to recognize only a part of the SurfaceView, as barcode scanners do.
But the user should still see the picture of the full surface.
I tried to use TextureView, but it's not compatible with CameraSource.
private void startCameraSource() {
//Create the TextRecognizer
final TextRecognizer textRecognizer = new TextRecognizer.Builder(getApplicationContext()).build();
if (!textRecognizer.isOperational()) {
Log.w(TAG, "Detector dependencies not loaded yet");
} else {
//Initialize camerasource to use high resolution and set Autofocus on.
mCameraSource = new CameraSource.Builder(getApplicationContext(), textRecognizer)
.setFacing(CameraSource.CAMERA_FACING_BACK)
.setRequestedPreviewSize(1280, 1024)
.setAutoFocusEnabled(true)
.setRequestedFps(2.0f)
.build();
/*
Add callback to the SurfaceView and check if camera permission is granted.
If permission is granted we can start our cameraSource and pass it to surfaceView
*/
surfaceView.getHolder().addCallback(new SurfaceHolder.Callback() {
@Override
public void surfaceCreated(SurfaceHolder holder) {
try {
if (ActivityCompat.checkSelfPermission(getApplicationContext(),
Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED) {
return;
}
mCameraSource.start(surfaceView.getHolder());
} catch (IOException e) {
e.printStackTrace();
}
}
@Override
public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
}
@Override
public void surfaceDestroyed(SurfaceHolder holder) {
mCameraSource.stop();
}
});
//Set the TextRecognizer's Processor.
textRecognizer.setProcessor(new Detector.Processor<TextBlock>() {
@Override
public void release() {
}
/**
* Detect all the text from the camera using TextBlock, append the values into a StringBuilder,
* and set the result to the TextView.
*/
@Override
public void receiveDetections(Detector.Detections<TextBlock> detections) {
final SparseArray<TextBlock> items = detections.getDetectedItems();
if (items.size() != 0) {
mTextView.post(new Runnable() {
@Override
public void run() {
StringBuilder stringBuilder = new StringBuilder();
for (int i = 0; i < items.size(); i++) {
TextBlock item = items.valueAt(i);
stringBuilder.append(item.getValue());
stringBuilder.append("\n");
}
String detectedText = stringBuilder.toString().replace("\n", "").replace("\r", "");
detectedText = detectedText.replaceAll("\\D+", "");
String comparisonResult = queryEquiNoDatabase(detectedText);
System.out.println(detectedText);
if (comparisonResult == null) {
textViewRectangle.setBackgroundResource(R.drawable.textview_border_detecting);
} else {
textViewRectangle.setBackgroundResource(R.drawable.textview_border_end);
Intent intentAfterCameraScan = new Intent(getApplicationContext(), TransformerBoxActivity.class);
intentAfterCameraScan.putExtra("equino", comparisonResult);
intentAfterCameraScan.putExtra("isOcrLast", true);
createEffect(intentAfterCameraScan, 1500);
textRecognizer.release();
}
}
});
}
}
});
}
}
The picture of the activity:
https://imgur.com/a/wc5Yjjb
What is actually recognized:
https://imgur.com/5LkmFzH
I expect to recognize only the things that are inside the borders, but for now the whole surface is used for recognition.
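One approach that fits this setup (it mirrors the BoxDetector wrapper posted in the QR-code question further down this page) is to wrap the TextRecognizer in a delegating Detector that crops every frame to the border rectangle before detection, while the SurfaceView keeps showing the full preview. A minimal sketch, not a drop-in solution: the box dimensions and the centered placement are assumptions, and the rectangle is in camera frame coordinates, not view coordinates.
public class CroppedTextDetector extends Detector<TextBlock> {
    private final Detector<TextBlock> mDelegate;
    private final int mBoxWidth, mBoxHeight;

    CroppedTextDetector(Detector<TextBlock> delegate, int boxWidth, int boxHeight) {
        mDelegate = delegate;
        mBoxWidth = boxWidth;
        mBoxHeight = boxHeight;
    }

    @Override
    public SparseArray<TextBlock> detect(Frame frame) {
        int width = frame.getMetadata().getWidth();
        int height = frame.getMetadata().getHeight();
        // Centered crop box in frame coordinates, clamped to the frame bounds.
        Rect box = new Rect(
                Math.max(0, (width - mBoxWidth) / 2),
                Math.max(0, (height - mBoxHeight) / 2),
                Math.min(width, (width + mBoxWidth) / 2),
                Math.min(height, (height + mBoxHeight) / 2));
        // Re-encode the NV21 preview data cropped to the box, then decode it back to a Bitmap.
        YuvImage yuvImage = new YuvImage(frame.getGrayscaleImageData().array(),
                ImageFormat.NV21, width, height, null);
        ByteArrayOutputStream stream = new ByteArrayOutputStream();
        yuvImage.compressToJpeg(box, 100, stream);
        byte[] jpeg = stream.toByteArray();
        Bitmap cropped = BitmapFactory.decodeByteArray(jpeg, 0, jpeg.length);
        Frame croppedFrame = new Frame.Builder()
                .setBitmap(cropped)
                .setRotation(frame.getMetadata().getRotation())
                .build();
        return mDelegate.detect(croppedFrame);
    }
}
The CameraSource would then be built with new CroppedTextDetector(textRecognizer, 400, 200) instead of the bare textRecognizer; mapping the on-screen border into frame coordinates is still up to you.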

Related

flash light is not working when camera is on

I'm creating a barcode-reading application using Google Vision, and it also includes a flash on/off function. I was able to implement the flash using CameraManager, as in the following code, because the Camera API is now deprecated. But when the camera screen is on, the flashlight does not work. When the camera is frozen (I stop the CameraSource once a barcode is detected), the flashlight works. I want to toggle the flashlight regardless of whether the camera view is running or stopped, and I need this done without using the deprecated APIs. Can someone tell me how to solve this problem? Thanks in advance.
private void setupBarcode(){
cameraPreview = (SurfaceView) findViewById(R.id.cameraPreview);
txtResult = findViewById(R.id.txtResult);
barcodeDetector = new BarcodeDetector.Builder(this)
.setBarcodeFormats(Barcode.ALL_FORMATS)
.build();
cameraSource = new CameraSource
.Builder(this, barcodeDetector)
.setAutoFocusEnabled(true)
.build();
//Add Event
cameraPreview.getHolder().addCallback(new SurfaceHolder.Callback() {
@Override
public void surfaceCreated(SurfaceHolder surfaceHolder) {
if (ActivityCompat.checkSelfPermission(getApplicationContext(), android.Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED) {
//Request permission
ActivityCompat.requestPermissions(ScanActivity.this,
new String[]{Manifest.permission.CAMERA}, RequestCameraPermissionID);
return;
}
try {
cameraSource.start(cameraPreview.getHolder());
txtResult.setText("");
} catch (IOException e) {
e.printStackTrace();
}
}
@Override
public void surfaceChanged(SurfaceHolder surfaceHolder, int i, int i1, int i2) {
}
@Override
public void surfaceDestroyed(SurfaceHolder surfaceHolder) {
cameraSource.stop();
}
});
barcodeDetector.setProcessor(new Detector.Processor<Barcode>() {
@Override
public void release() {
}
@Override
public void receiveDetections(Detector.Detections<Barcode> detections) {
final SparseArray<Barcode> qrcodes = detections.getDetectedItems();
if (qrcodes.size() != 0) {
txtResult.post(new Runnable() {
@Override
public void run() {
okButton.setEnabled(true);
txtResult.setText(qrcodes.valueAt(0).displayValue);
cameraSource.stop();
}
});
}
}
});
}
private void flashLightOn() {
CameraManager cameraManager = (CameraManager) getSystemService(Context.CAMERA_SERVICE);
try {
String cameraId = cameraManager.getCameraIdList()[0];
cameraManager.setTorchMode(cameraId, true);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
Use the open-source CameraSource.java file from the Google Vision library in your project.
Get the file from here.
Then use the following code:
public void switchFlashLight(boolean status) {
cameraSource.setFlashMode(status ? Camera.Parameters.FLASH_MODE_TORCH : Camera.Parameters.FLASH_MODE_OFF);
}
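For example, the toggle could be wired to a button like this (a sketch; btnFlash and the flashOn field are illustrative names, not from the answer):
private boolean flashOn = false;

private void setupFlashButton() {
    Button btnFlash = (Button) findViewById(R.id.btnFlash);
    btnFlash.setOnClickListener(new View.OnClickListener() {
        @Override
        public void onClick(View v) {
            flashOn = !flashOn;
            // Sets the flash mode on the same Camera instance that runs the preview,
            // which is why it keeps working while the preview is live.
            switchFlashLight(flashOn);
        }
    });
}
As far as I can tell, this works where CameraManager.setTorchMode fails because the torch has to be controlled through the Camera instance that currently holds the device, and the bundled CameraSource.java exposes exactly that.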

Mobile Vision QR code calls intent multiple times

I made a QR scanner app using Google Mobile Vision. The app is simple: it scans a QR code, decodes it, delivers the value to the result class, and the result is shown in the result layout. But the problem is that when I try to scan a QR code, the result class is somehow called multiple times. Here's my MainActivity code:
public class MainActivity extends AppCompatActivity {
private static final String TAG = "MainActivity";
SurfaceView surfaceView;
BarcodeDetector barcodeDetector;
CameraSource cameraSource;
final int RequestCameraID = 1001;
BoxDetector boxDetector;
@Override
protected void onStop() {
super.onStop();
Toast.makeText(getApplicationContext(),"Stop",Toast.LENGTH_LONG).show();
}
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
surfaceView = findViewById(R.id.cameraView);
surfaceView.setZOrderMediaOverlay(true);
if (ContextCompat.checkSelfPermission(this, Manifest.permission.CAMERA) != PERMISSION_GRANTED) {
ActivityCompat.requestPermissions(this, new String[]{Manifest.permission.CAMERA}, PERMISSION_GRANTED);
}
Casting();
}
private void Casting() {
barcodeDetector = new BarcodeDetector.Builder(this)
.setBarcodeFormats(Barcode.QR_CODE)
.build();
boxDetector = new BoxDetector(barcodeDetector, 300, 300);
if (!barcodeDetector.isOperational()) {
Toast.makeText(MainActivity.this, "Sorry couldn't setup the detector", Toast.LENGTH_LONG).show();
}
cameraSource = new CameraSource.Builder(MainActivity.this, boxDetector)
.setFacing(CameraSource.CAMERA_FACING_BACK)
.setRequestedFps(30)
.setAutoFocusEnabled(true)
.setRequestedPreviewSize(1280, 720)
.build();
surfaceView.getHolder().addCallback(new SurfaceHolder.Callback() {
@Override
public void surfaceCreated(SurfaceHolder holder) {
if (ActivityCompat.checkSelfPermission(MainActivity.this, Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED) {
ActivityCompat.requestPermissions(MainActivity.this, new String[]{Manifest.permission.CAMERA},RequestCameraID);
return;
}
try {
cameraSource.start(surfaceView.getHolder());
} catch (IOException e) {
e.printStackTrace();
}
}
@Override
public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) { }
@Override
public void surfaceDestroyed(SurfaceHolder holder) {
cameraSource.stop();
}
});
boxDetector.setProcessor(new Detector.Processor<Barcode>() {
@Override
public void release() {
}
@Override
public void receiveDetections(Detector.Detections<Barcode> detections) {
final SparseArray<Barcode> barcodeSparseArray = detections.getDetectedItems();
if(barcodeSparseArray.size()!=0){
Handler handler = new Handler(Looper.getMainLooper());
handler.post(new Runnable() {
@Override
public void run() {
cameraSource.release();
barcodeDetector.release();
Intent intent = new Intent(MainActivity.this, ResultActivity.class);
intent.putExtra("Result",barcodeSparseArray.valueAt(0).displayValue);
Casting();
startActivity(intent);
}
});
}
}
});
}
public void cS() {
barcodeDetector = new BarcodeDetector.Builder(this)
.setBarcodeFormats(Barcode.QR_CODE)
.build();
boxDetector = new BoxDetector(barcodeDetector, 300, 300);
if (!barcodeDetector.isOperational()) {
Toast.makeText(MainActivity.this, "Sorry couldn't setup the detector", Toast.LENGTH_LONG).show();
}
cameraSource = new CameraSource.Builder(MainActivity.this, boxDetector)
.setFacing(CameraSource.CAMERA_FACING_BACK)
.setRequestedFps(30)
.setAutoFocusEnabled(true)
.setRequestedPreviewSize(1280, 720)
.build();
}
@Override
public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions, @NonNull int[] grantResults) {
switch (requestCode) {
case RequestCameraID: {
if (grantResults[0] == PackageManager.PERMISSION_GRANTED) {
if (ActivityCompat.checkSelfPermission(this, Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED) {
return;
}
try {
cameraSource.start(surfaceView.getHolder());
} catch (IOException e) {
e.printStackTrace();
}
}
}
}
}}
The BoxDetector code is:
public class BoxDetector extends Detector {
private Detector mDelegate;
private int mBoxWidth, mBoxHeight;
BoxDetector(Detector delegate, int boxWidth, int boxHeight) {
mDelegate = delegate;
mBoxWidth = boxWidth;
mBoxHeight = boxHeight;
}
public SparseArray detect(Frame frame) {
int width = frame.getMetadata().getWidth();
int height = frame.getMetadata().getHeight();
int right = ((width / 2) + (mBoxHeight / 2)) -150 ;
int left = ((width / 2) - (mBoxHeight / 2)) - 150;
int bottom = ((height / 2) + (mBoxWidth / 2));
int top = ((height / 2) - ((mBoxWidth) / 2));
YuvImage yuvImage = new YuvImage(frame.getGrayscaleImageData().array(), ImageFormat.NV21, width, height, null);
ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream();
yuvImage.compressToJpeg(new Rect(left, top, right, bottom), 100, byteArrayOutputStream);
byte[] jpegArray = byteArrayOutputStream.toByteArray();
Bitmap bitmap = BitmapFactory.decodeByteArray(jpegArray, 0, jpegArray.length);
Frame croppedFrame =
new Frame.Builder()
.setBitmap(bitmap)
.setRotation(frame.getMetadata().getRotation())
.build();
return mDelegate.detect(croppedFrame);
}
public void run(){
try {
Thread.sleep(300);
} catch (InterruptedException e) {
e.printStackTrace();
}
}
public boolean setFocus(int id) {
return mDelegate.setFocus(id);
}
}
That's it. Thank you!
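A side note on the multiple-call problem itself: receiveDetections fires for every processed frame, and detections that were already queued can still post their Runnable after release() has been called. A common guard (a sketch, not from the original post; the handled flag is an illustrative name) is to make the handoff one-shot:
private final AtomicBoolean handled = new AtomicBoolean(false);

@Override
public void receiveDetections(Detector.Detections<Barcode> detections) {
    final SparseArray<Barcode> barcodes = detections.getDetectedItems();
    // compareAndSet guarantees only the first detection batch launches the result screen.
    if (barcodes.size() != 0 && handled.compareAndSet(false, true)) {
        new Handler(Looper.getMainLooper()).post(new Runnable() {
            @Override
            public void run() {
                cameraSource.release();
                barcodeDetector.release();
                Intent intent = new Intent(MainActivity.this, ResultActivity.class);
                intent.putExtra("Result", barcodes.valueAt(0).displayValue);
                startActivity(intent);
            }
        });
    }
}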

Is it possible to select particular text using Google's vision API?

I am designing an app where I scan text using the camera and then use that text to fetch more details. To do that I am using Google's Vision API. But by default the API reads all the text that is visible in the image, as shown below.
As you can see from the above image, the app recognizes all the text in front of the camera, but I would like to scan just "Hello World". Is it possible to use some kind of touch event to focus only on the desired text?
Please find the code used for text recognition below:
private void startCameraSource() {
//Create the TextRecognizer
final TextRecognizer textRecognizer = new TextRecognizer.Builder(getApplicationContext()).build();
if (!textRecognizer.isOperational()) {
Log.w(TAG, "Detector dependencies not loaded yet");
} else {
//Initialize camerasource to use high resolution and set Autofocus on.
mCameraSource = new CameraSource.Builder(getApplicationContext(), textRecognizer)
.setFacing(CameraSource.CAMERA_FACING_BACK)
.setRequestedPreviewSize(1280, 1024)
.setAutoFocusEnabled(true)
.setRequestedFps(2.0f)
.build();
/**
* Add callback to the SurfaceView and check if camera permission is granted.
* If permission is granted we can start our cameraSource and pass it to surfaceView
*/
mCameraView.getHolder().addCallback(new SurfaceHolder.Callback() {
@Override
public void surfaceCreated(SurfaceHolder holder) {
try {
if (ActivityCompat.checkSelfPermission(getApplicationContext(),
Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED) {
ActivityCompat.requestPermissions(MainActivity.this,
new String[]{Manifest.permission.CAMERA},
requestPermissionID);
return;
}
mCameraSource.start(mCameraView.getHolder());
} catch (IOException e) {
e.printStackTrace();
}
}
@Override
public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
}
@Override
public void surfaceDestroyed(SurfaceHolder holder) {
mCameraSource.stop();
}
});
//Set the TextRecognizer's Processor.
textRecognizer.setProcessor(new Detector.Processor<TextBlock>() {
@Override
public void release() {
}
/**
* Detect all the text from the camera using TextBlock, append the values into a StringBuilder,
* and set the result to the TextView.
*/
@Override
public void receiveDetections(Detector.Detections<TextBlock> detections) {
final SparseArray<TextBlock> items = detections.getDetectedItems();
if (items.size() != 0 ){
mTextView.post(new Runnable() {
@Override
public void run() {
StringBuilder stringBuilder = new StringBuilder();
for(int i=0;i<items.size();i++){
TextBlock item = items.valueAt(i);
stringBuilder.append(item.getValue());
stringBuilder.append("\n");
}
mTextView.setText(stringBuilder.toString());
}
});
}
}
});
}
}
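One way to approach the touch idea (a sketch under assumptions, not a confirmed solution): record the last touch position on the preview in an OnTouchListener, map it into frame coordinates, and keep only the TextBlocks whose bounding box contains that point. The frameWidth, frameHeight, lastTouchX and lastTouchY variables below are assumed to be maintained elsewhere, and the scaling assumes the preview fills the view without letterboxing:
// Inside receiveDetections' run(), replacing the unconditional append loop.
float scaleX = (float) frameWidth / mCameraView.getWidth();
float scaleY = (float) frameHeight / mCameraView.getHeight();
int touchX = (int) (lastTouchX * scaleX); // lastTouchX/Y: saved in an OnTouchListener
int touchY = (int) (lastTouchY * scaleY);
StringBuilder stringBuilder = new StringBuilder();
for (int i = 0; i < items.size(); i++) {
    TextBlock item = items.valueAt(i);
    // getBoundingBox() is in frame coordinates, so compare against the mapped touch point.
    if (item.getBoundingBox().contains(touchX, touchY)) {
        stringBuilder.append(item.getValue()).append("\n");
    }
}
mTextView.setText(stringBuilder.toString());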

Android java.lang.IllegalArgumentException: previewSize must not be taller than activeArray

My application was working well on a large number of phones. However, when I installed it on my old Android phone, the following error is thrown and the application crashes while taking a photo:
Android java.lang.IllegalArgumentException: previewSize must not be taller than activeArray
Photo capturing code:
public class Camera1 extends AppCompatActivity {
private static final String TAG = "AndroidCameraApi";
private TextureView textureView;
private static final SparseIntArray ORIENTATIONS = new SparseIntArray();
static {
ORIENTATIONS.append(Surface.ROTATION_0, 90);
ORIENTATIONS.append(Surface.ROTATION_90, 0);
ORIENTATIONS.append(Surface.ROTATION_180, 270);
ORIENTATIONS.append(Surface.ROTATION_270, 180);
}
private Bitmap scaled;
private Bitmap mBitmapToSave;
private Bitmap mBitmapToSave1;
private String cameraId;
protected CameraDevice cameraDevice;
protected CameraCaptureSession cameraCaptureSessions;
protected CaptureRequest captureRequest;
protected CaptureRequest.Builder captureRequestBuilder;
private Size imageDimension;
private ImageReader imageReader;
private File file;
private com.google.android.gms.vision.face.FaceDetector detector;
private static final int REQUEST_CAMERA_PERMISSION = 200;
private boolean mFlashSupported;
private Handler mBackgroundHandler;
private HandlerThread mBackgroundThread;
private int width = 640;
private int height = 480;
private int index;
//Image request code
private int PICK_IMAGE_REQUEST = 1;
private int a=0;
//storage permission code
private static final int STORAGE_PERMISSION_CODE = 123;
//Bitmap to get image from gallery
private Bitmap bitmap;
//Uri to store the image uri
private Uri filePath;
private String name,dl_no,truck_id, tstatus;
private float l_value;
private String dl;
private int c=0;
File fileToUpload;
int f=0;
private String uuid;
String latitude;
String longitude;
String time1;
String date1;
private int mSensorOrientation;
CameraCharacteristics characteristics;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_android_camera2_api);
setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_PORTRAIT);
textureView = (TextureView) findViewById(R.id.texture);
assert textureView != null;
textureView.setSurfaceTextureListener(textureListener);
detector = new FaceDetector.Builder(getApplicationContext())
.setMode(FaceDetector.ACCURATE_MODE)
.build();
uuid = UUID.randomUUID().toString();
Bundle extras = getIntent().getExtras();
if (extras != null) {
l_value=extras.getFloat("v");
tstatus=extras.getString("status");
name=extras.getString("name");
dl_no=extras.getString("dl");
truck_id=extras.getString("tid");
latitude=extras.getString("lat");
longitude=extras.getString("lon");
time1 =extras.getString("t");
date1=extras.getString("d");
}
fileToUpload = new File(Environment.getExternalStorageDirectory() + "/" + "/Faceapp/"+name+"_"+dl_no+"_"+truck_id+"_"+latitude+"_"+longitude+"_"+time1+"_"+date1+"_"+a+".jpg");
//Intent uplaod_intent = new Intent(Camera1.this, uploadRest.class);
// Add extras to the bundle
// Start the service
// Camera1.this.startService(uplaod_intent);
}
private int getOrientation(int rotation) {
// Sensor orientation is 90 for most devices, or 270 for some devices (e.g. Nexus 5X).
// We have to take that into account and rotate JPEG properly.
// For devices with orientation of 90, we simply return our mapping from ORIENTATIONS.
// For devices with orientation of 270, we need to rotate the JPEG 180 degrees.
return (ORIENTATIONS.get(rotation) + mSensorOrientation + 270) % 360;
}
TextureView.SurfaceTextureListener textureListener = new TextureView.SurfaceTextureListener() {
@Override
public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
//open your camera here
openCamera();
}
@Override
public void onSurfaceTextureSizeChanged(SurfaceTexture surface, int width, int height) {
// Transform your captured image size according to the surface width and height
}
@Override
public boolean onSurfaceTextureDestroyed(SurfaceTexture surface) {
return false;
}
@Override
public void onSurfaceTextureUpdated(SurfaceTexture surface) {
}
};
private final CameraDevice.StateCallback stateCallback = new CameraDevice.StateCallback() {
@Override
public void onOpened(CameraDevice camera) {
//This is called when the camera is open
Log.e(TAG, "onOpened");
cameraDevice = camera;
createCameraPreview();
}
@Override
public void onDisconnected(CameraDevice camera) {
cameraDevice.close();
}
@Override
public void onError(CameraDevice camera, int error) {
cameraDevice.close();
cameraDevice = null;
}
};
final CameraCaptureSession.CaptureCallback captureCallbackListener = new CameraCaptureSession.CaptureCallback() {
@Override
public void onCaptureCompleted(CameraCaptureSession session, CaptureRequest request, TotalCaptureResult result) {
super.onCaptureCompleted(session, request, result);
Toast.makeText(Camera1.this, "Saved:" + file, Toast.LENGTH_SHORT).show();
createCameraPreview();
}
};
protected void startBackgroundThread() {
mBackgroundThread = new HandlerThread("Camera Background");
mBackgroundThread.start();
mBackgroundHandler = new Handler(mBackgroundThread.getLooper());
}
protected void stopBackgroundThread() {
mBackgroundThread.quitSafely();
try {
mBackgroundThread.join();
mBackgroundThread = null;
mBackgroundHandler = null;
} catch (InterruptedException e) {
e.printStackTrace();
}
}
public static void goToCompletedActivity(Context mContext) {
Intent login = new Intent(mContext, Completed.class);
mContext.startActivity(login);
}
private void userLogin() {
//first getting the values
final String Driver_id = dl_no;
class UserLogin extends AsyncTask<Void, Void, String> {
ProgressBar progressBar;
@Override
protected void onPreExecute() {
super.onPreExecute();
progressBar = (ProgressBar) findViewById(R.id.progressBar);
//progressBar.setVisibility(View.VISIBLE);
}
@Override
protected void onPostExecute(String s) {
super.onPostExecute(s);
// progressBar.setVisibility(View.GONE);
try {
//converting response to json object
JSONObject obj = new JSONObject(s);
//if no error in response
if (!obj.getBoolean("error")) {
Toast.makeText(getApplicationContext(), obj.getString("message"), Toast.LENGTH_SHORT).show();
//getting the user from the response
JSONObject userJson = obj.getJSONObject("user");
//creating a new user object
User user = new User(
userJson.getString("Driver_id"),
userJson.getString("Driver_name"),
userJson.getString("Truck_id"),
userJson.getString("Trainingstatus")
);
//storing the user in shared preferences
SharedPrefManager.getInstance(getApplicationContext()).userLogin(user);
//starting the profile activity
finish();
startActivity(new Intent(getApplicationContext(), ProfileActivity.class));
} else {
Toast.makeText(getApplicationContext(), "Invalid Driver ID", Toast.LENGTH_SHORT).show();
}
} catch (JSONException e) {
e.printStackTrace();
}
}
@Override
protected String doInBackground(Void... voids) {
//creating request handler object
RequestHandler requestHandler = new RequestHandler();
//creating request parameters
HashMap<String, String> params = new HashMap<>();
params.put("Driver_id", Driver_id);
//returning the response
return requestHandler.sendPostRequest(URLs.URL_LOGIN, params);
}
}
UserLogin ul = new UserLogin();
ul.execute();
}
public void launchuploadservice() {
// Construct our Intent specifying the Service
Intent i = new Intent(this, Upload.class);
// Add extras to the bundle
i.putExtra("name", name);
i.putExtra("dl", dl_no);
i.putExtra("tid", truck_id);
i.putExtra("lat", latitude);
i.putExtra("lon", longitude);
i.putExtra("status", tstatus);
i.putExtra("t", time1);
i.putExtra("d", date1);
// Start the service
startService(i);
}
protected void takePicture() {
if (null == cameraDevice) {
Log.e(TAG, "cameraDevice is null");
return;
}
final CameraManager manager = (CameraManager) getSystemService(Context.CAMERA_SERVICE);
try {
CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraDevice.getId());
Size[] jpegSizes = null;
if (characteristics != null) {
jpegSizes = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP).getOutputSizes(ImageFormat.JPEG);
}
if (jpegSizes != null && 0 < jpegSizes.length) {
width = jpegSizes[0].getWidth();
height = jpegSizes[0].getHeight();
}
ImageReader reader = ImageReader.newInstance(width, height, ImageFormat.JPEG, 1);
List<Surface> outputSurfaces = new ArrayList<Surface>(2);
outputSurfaces.add(reader.getSurface());
outputSurfaces.add(new Surface(textureView.getSurfaceTexture()));
final CaptureRequest.Builder captureBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
captureBuilder.addTarget(reader.getSurface());
captureBuilder.set(CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_MODE_AUTO);
// Orientation
int displayRotation = this.getWindowManager().getDefaultDisplay().getRotation();
int sensorOrientation = characteristics.get(CameraCharacteristics.SENSOR_ORIENTATION);
mSensorOrientation = characteristics.get(CameraCharacteristics.SENSOR_ORIENTATION);
boolean swappedDimensions = false;
switch (displayRotation) {
case Surface.ROTATION_0:
case Surface.ROTATION_180:
if (sensorOrientation == 90 || sensorOrientation == 270) {
if (mSensorOrientation == 90 || mSensorOrientation == 270) {
swappedDimensions = true;
}
}
break;
case Surface.ROTATION_90:
case Surface.ROTATION_270:
if (sensorOrientation == 0 || sensorOrientation == 180) {
if (mSensorOrientation == 0 || mSensorOrientation == 180) {
swappedDimensions = true;
}
}
break;
}
int rotation = this.getWindowManager().getDefaultDisplay().getRotation();
captureBuilder.set(CaptureRequest.JPEG_ORIENTATION, ORIENTATIONS.get(rotation));
captureBuilder.set(CaptureRequest.JPEG_ORIENTATION, getOrientation(rotation));
ImageReader.OnImageAvailableListener readerListener = new ImageReader.OnImageAvailableListener() {
@Override
public void onImageAvailable(ImageReader reader) {
Image image = null ;
try {
image = reader.acquireLatestImage();
ByteBuffer buffer = image.getPlanes()[0].getBuffer();
byte[] bytes = new byte[buffer.capacity()];
buffer.get(bytes);
mBitmapToSave = BitmapFactory.decodeByteArray(bytes, 0, bytes.length);
if (detector.isOperational() && mBitmapToSave != null) {
Frame frame = new Frame.Builder()
.setBitmap(mBitmapToSave)
//.setImageData(buffer, width, height, YUV_420_888)
//.setRotation(getWindowManager().getDefaultDisplay().getRotation())
.build();
SparseArray<Face> faces = detector.detect(frame);
for (index = 0; index < faces.size(); ++index) {
Face face = faces.valueAt(index);
}
if (faces.size() == 0) {
Toast.makeText(Camera1.this, "No Face" + "\n", Toast.LENGTH_SHORT).show();
saveImageToDisk(bytes);
MediaPlayer mediaPlayer = MediaPlayer.create(getApplicationContext(), R.raw.not);
mediaPlayer.start();
// mBitmapToSave.recycle();
} else {
saveImageToDisk(bytes);
Toast.makeText(Camera1.this, "Face Found " + "\n", Toast.LENGTH_SHORT).show();
launchuploadservice();
}
}
} catch (Exception ee) {
}
finally {
if(image!=null)
image.close();
}
}
};
reader.setOnImageAvailableListener(readerListener, mBackgroundHandler);
final CameraCaptureSession.CaptureCallback captureListener = new CameraCaptureSession.CaptureCallback() {
@Override
public void onCaptureCompleted(CameraCaptureSession session, CaptureRequest request, TotalCaptureResult result) {
super.onCaptureCompleted(session, request, result);
createCameraPreview();
}
};
cameraDevice.createCaptureSession(outputSurfaces, new CameraCaptureSession.StateCallback() {
@Override
public void onConfigured(CameraCaptureSession session) {
try {
session.capture(captureBuilder.build(), captureListener, mBackgroundHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
@Override
public void onConfigureFailed(CameraCaptureSession session) {
}
}, mBackgroundHandler);
mBitmapToSave = null;
} catch(CameraAccessException e){
e.printStackTrace();
}
}
private void saveImageToDisk(final byte[] bytes) {
final File file = new File(Environment.getExternalStorageDirectory() + "/"+"/Faceapp/"+name+"_"+dl_no+"_"+truck_id+"_"+longitude+"_"+latitude+"_"+time1+"_"+date1 + "_.jpg");
try (final OutputStream output = new FileOutputStream(file)) {
output.write(bytes);
//this.picturesTaken.put(file.getPath(), bytes);
} catch (IOException e) {
Log.e(TAG, "Exception occurred while saving picture to external storage ", e);
}
}
protected void createCameraPreview() {
try {
SurfaceTexture texture = textureView.getSurfaceTexture();
assert texture != null;
texture.setDefaultBufferSize(imageDimension.getWidth(), imageDimension.getHeight());
Surface surface = new Surface(texture);
captureRequestBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
captureRequestBuilder.addTarget(surface);
cameraDevice.createCaptureSession(Arrays.asList(surface), new CameraCaptureSession.StateCallback(){
@Override
public void onConfigured(@NonNull CameraCaptureSession cameraCaptureSession) {
//The camera is already closed
if (null == cameraDevice) {
return;
}
// When the session is ready, we start displaying the preview.
cameraCaptureSessions = cameraCaptureSession;
updatePreview();
}
@Override
public void onConfigureFailed(@NonNull CameraCaptureSession cameraCaptureSession) {
Toast.makeText(Camera1.this, "Configuration change", Toast.LENGTH_SHORT).show();
}
}, null);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
private void openCamera() {
CameraManager manager = (CameraManager) getSystemService(Context.CAMERA_SERVICE);
Log.e(TAG, "is camera open");
try {
cameraId = manager.getCameraIdList()[1];
CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraId);
StreamConfigurationMap map = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
assert map != null;
imageDimension = map.getOutputSizes(SurfaceTexture.class)[0];
// Add permission for camera and let user grant the permission
if (ActivityCompat.checkSelfPermission(this, Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED && ActivityCompat.checkSelfPermission(this, Manifest.permission.WRITE_EXTERNAL_STORAGE) != PackageManager.PERMISSION_GRANTED) {
ActivityCompat.requestPermissions(Camera1.this, new String[]{Manifest.permission.CAMERA, Manifest.permission.WRITE_EXTERNAL_STORAGE}, REQUEST_CAMERA_PERMISSION);
return;
}
manager.openCamera(cameraId, stateCallback, null);
} catch (CameraAccessException e) {
e.printStackTrace();
}
Log.e(TAG, "openCamera X");
}
protected void updatePreview() {
if(null == cameraDevice) {
Log.e(TAG, "updatePreview error, return");
}
captureRequestBuilder.set(CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_MODE_AUTO);
try {
cameraCaptureSessions.setRepeatingRequest(captureRequestBuilder.build(), null, mBackgroundHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
private void closeCamera() {
if (null != cameraDevice) {
cameraDevice.close();
cameraDevice = null;
}
if (null != imageReader) {
//image.close();
imageReader.close();
imageReader = null;
}
}
@Override
public void onRequestPermissionsResult(int requestCode, String permissions[], int[] grantResults) {
if (requestCode == REQUEST_CAMERA_PERMISSION) {
if (grantResults[0] == PackageManager.PERMISSION_DENIED) {
// close the app
Toast.makeText(Camera1.this, "Sorry!!!, you can't use this app without granting permission", Toast.LENGTH_LONG).show();
finish();
}
}
}
// @Override
/* public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions, @NonNull int[] grantResults) {
if (requestCode == REQUEST_CAMERA_PERMISSION) {
if (grantResults[0] == PackageManager.PERMISSION_DENIED) {
// close the app
Toast.makeText(Camera1.this, "Sorry!!!, you can't use this app without granting permission", Toast.LENGTH_LONG).show();
finish();
}
}
}
*/
@Override
protected void onResume() {
final Intent intent = new Intent(Camera1.this, Completed.class);
super.onResume();
Log.e(TAG, "onResume");
startBackgroundThread();
if (textureView.isAvailable()) {
openCamera();
} else {
textureView.setSurfaceTextureListener(textureListener);
}
final int PICTURES_LIMIT = 1;
final Timer timer = new Timer();
timer.schedule(new TimerTask() {
int pictureNo=0;
public void run() {
if (pictureNo>PICTURES_LIMIT){
timer.cancel();
finish();
} else {
takePicture();
pictureNo++;
}
}
},10, 5500);
}
@Override
protected void onPause() {
super.onPause();
Log.e(TAG, "onPause");
closeCamera();
stopBackgroundThread();
}
}
Based on the code found here, it's pretty self-explanatory that the old device's preview size is smaller than the size that you're trying to crop.
Check the getPreviewCropRectangleUnzoomed function in the code that I mentioned above. The documentation of the function states the cause of the error specifically. From the documentation:
/**
* Calculate the effective crop rectangle for this preview viewport;
* assumes the preview is centered to the sensor and scaled to fit across one of the dimensions
* without skewing.
*
* <p>The preview size must be a subset of the active array size; the resulting
* rectangle will also be a subset of the active array rectangle.</p>
*
* <p>The unzoomed crop rectangle is calculated only.</p>
*
* @param activeArray active array dimensions, in sensor space
* @param previewSize size of the preview buffer render target, in pixels (not in sensor space)
* @return a rectangle which serves as the preview stream's effective crop region (unzoomed),
* in sensor space
*
* @throws NullPointerException
* if any of the args were {@code null}
* @throws IllegalArgumentException
* if {@code previewSize} is wider or taller than {@code activeArray}
*/
Check the part "The preview size must be a subset of the active array size; the resulting rectangle will also be a subset of the active array rectangle.", which declares that the preview size must be smaller than the actual active array size.
In this case, you might consider wrapping the picture-taking call in a try/catch block:
try {
takePicture();
} catch (IllegalArgumentException e) {
Toast.makeText(this, "Your phone is too old", Toast.LENGTH_SHORT).show();
}
Hope that helps.
I believe this may happen on very old devices with small image sensors, although it seems to me (only guessing) more like a mistake by the manufacturer when implementing the camera2 legacy wrapper.
A solution could be, when resolving the optimal preview size, to also make sure the preview size does not exceed the size specified by SENSOR_INFO_ACTIVE_ARRAY_SIZE, which can be queried from the CameraCharacteristics:
CameraCharacteristics.SENSOR_INFO_ACTIVE_ARRAY_SIZE
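A rough sketch of that check (the helper name chooseSafePreviewSize is mine, not from the answer):
private Size chooseSafePreviewSize(CameraCharacteristics characteristics) {
    Rect activeArray = characteristics.get(CameraCharacteristics.SENSOR_INFO_ACTIVE_ARRAY_SIZE);
    StreamConfigurationMap map = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
    Size[] choices = map.getOutputSizes(SurfaceTexture.class);
    for (Size size : choices) {
        // Keep only sizes that fit inside the sensor's active array.
        if (size.getWidth() <= activeArray.width() && size.getHeight() <= activeArray.height()) {
            return size;
        }
    }
    return choices[choices.length - 1]; // fallback: the last (usually smallest) advertised size
}
In openCamera, imageDimension = map.getOutputSizes(SurfaceTexture.class)[0] would then become imageDimension = chooseSafePreviewSize(characteristics).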

Android: How to improve the performance of my camera

I am developing an Android camera app, using camera2 to implement the camera module. The problem is that when I take a picture with it, the image size is too large, about 9 MB, so my gallery becomes very slow. What is the reason for this?
I tested it on different phones; the image sizes differ, but they are all still too large. I tried photo.compress(Bitmap.CompressFormat.JPEG, 50, out) to reduce the size, but image quality is very important to me, so I don't want to reduce the resolution. Here is my camera code:
public class MainActivity extends AppCompatActivity {
private Size previewsize;
private Size jpegSizes[] = null;
private TextureView textureView;
private CameraDevice cameraDevice;
private CaptureRequest.Builder previewBuilder;
private CameraCaptureSession previewSession;
private static VirtualFileSystem vfs;
ImageButton getpicture;
ImageButton btnShow;
Button btnSetting;
private static final SparseIntArray ORIENTATIONS = new SparseIntArray();
static {
ORIENTATIONS.append(Surface.ROTATION_0, 90);
ORIENTATIONS.append(Surface.ROTATION_90, 0);
ORIENTATIONS.append(Surface.ROTATION_180, 270);
ORIENTATIONS.append(Surface.ROTATION_270, 180);
}
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
textureView = (TextureView) findViewById(R.id.textureview);
textureView.setSurfaceTextureListener(surfaceTextureListener);
getpicture = (ImageButton) findViewById(R.id.getpicture);
btnShow = (ImageButton) findViewById(R.id.button2);
btnSetting = (Button) findViewById(R.id.btn_setting);
btnSetting.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
startActivity(new Intent(MainActivity.this, Setting.class));
}
});
btnShow.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
startActivity(new Intent(MainActivity.this, Gallery2.class));
}
});
getpicture.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
getPicture();
}
});
}
void getPicture() {
if (cameraDevice == null) {
return;
}
CameraManager manager = (CameraManager) getSystemService(Context.CAMERA_SERVICE);
try {
CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraDevice.getId());
if (characteristics != null) {
jpegSizes = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP).getOutputSizes(ImageFormat.JPEG);
}
int width = 640, height = 480;
if (jpegSizes != null && jpegSizes.length > 0) {
width = jpegSizes[0].getWidth();
height = jpegSizes[0].getHeight();
}
ImageReader reader = ImageReader.newInstance(width, height, ImageFormat.JPEG, 1);
List<Surface> outputSurfaces = new ArrayList<Surface>(2);
outputSurfaces.add(reader.getSurface());
outputSurfaces.add(new Surface(textureView.getSurfaceTexture()));
final CaptureRequest.Builder capturebuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
capturebuilder.addTarget(reader.getSurface());
capturebuilder.set(CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_MODE_AUTO);
int rotation = getWindowManager().getDefaultDisplay().getRotation();
capturebuilder.set(CaptureRequest.JPEG_ORIENTATION, ORIENTATIONS.get(rotation));
ImageReader.OnImageAvailableListener imageAvailableListener = new ImageReader.OnImageAvailableListener() {
@Override
public void onImageAvailable(ImageReader reader) {
Image image = null;
try {
image = reader.acquireLatestImage();
ByteBuffer buffer = image.getPlanes()[0].getBuffer();
byte[] bytes = new byte[buffer.capacity()];
buffer.get(bytes);
Bitmap photo = BitmapFactory.decodeByteArray(bytes, 0, bytes.length, null);
ByteArrayOutputStream out = new ByteArrayOutputStream();
photo.compress(Bitmap.CompressFormat.JPEG, 50, out);
save(out);
photo = Bitmap.createScaledBitmap(photo, 120, 120, false);
btnShow.setImageBitmap(photo);
save(out);
} catch (Exception ee) {
} finally {
if (image != null)
image.close();
}
}
void save(ByteArrayOutputStream bytes) {
File file12 = getOutputMediaFile();
OutputStream outputStream = null;
try {
outputStream = new FileOutputStream(file12);
outputStream.write(bytes.toByteArray());
} catch (Exception e) {
e.printStackTrace();
} finally {
try {
if (outputStream != null)
outputStream.close();
} catch (Exception e) {
}
}
}
};
HandlerThread handlerThread = new HandlerThread("takepicture");
handlerThread.start();
final Handler handler = new Handler(handlerThread.getLooper());
reader.setOnImageAvailableListener(imageAvailableListener, handler);
final CameraCaptureSession.CaptureCallback previewSSession = new CameraCaptureSession.CaptureCallback() {
@Override
public void onCaptureStarted(CameraCaptureSession session, CaptureRequest request, long timestamp, long frameNumber) {
super.onCaptureStarted(session, request, timestamp, frameNumber);
}
@Override
public void onCaptureCompleted(CameraCaptureSession session, CaptureRequest request, TotalCaptureResult result) {
super.onCaptureCompleted(session, request, result);
startCamera();
}
};
cameraDevice.createCaptureSession(outputSurfaces, new CameraCaptureSession.StateCallback() {
@Override
public void onConfigured(CameraCaptureSession session) {
try {
session.capture(capturebuilder.build(), previewSSession, handler);
} catch (Exception e) {
}
}
@Override
public void onConfigureFailed(CameraCaptureSession session) {
}
}, handler);
} catch (Exception e) {
}
}
public void openCamera() {
CameraManager manager = (CameraManager) getSystemService(Context.CAMERA_SERVICE);
try {
String camerId = manager.getCameraIdList()[0];
CameraCharacteristics characteristics = manager.getCameraCharacteristics(camerId);
StreamConfigurationMap map = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
previewsize = map.getOutputSizes(SurfaceTexture.class)[0];
if (ActivityCompat.checkSelfPermission(this, Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED) {
return;
}
manager.openCamera(camerId, stateCallback, null);
}catch (Exception e)
{
}
}
private TextureView.SurfaceTextureListener surfaceTextureListener=new TextureView.SurfaceTextureListener() {
@Override
public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
openCamera();
}
@Override
public void onSurfaceTextureSizeChanged(SurfaceTexture surface, int width, int height) {
}
@Override
public boolean onSurfaceTextureDestroyed(SurfaceTexture surface) {
return false;
}
@Override
public void onSurfaceTextureUpdated(SurfaceTexture surface) {
}
};
private CameraDevice.StateCallback stateCallback=new CameraDevice.StateCallback() {
@Override
public void onOpened(CameraDevice camera) {
cameraDevice=camera;
startCamera();
}
@Override
public void onDisconnected(CameraDevice camera) {
}
@Override
public void onError(CameraDevice camera, int error) {
}
};
@Override
protected void onPause() {
super.onPause();
if(cameraDevice!=null)
{
cameraDevice.close();
}
}
void startCamera()
{
if(cameraDevice==null||!textureView.isAvailable()|| previewsize==null)
{
return;
}
SurfaceTexture texture=textureView.getSurfaceTexture();
if(texture==null)
{
return;
}
texture.setDefaultBufferSize(previewsize.getWidth(),previewsize.getHeight());
Surface surface=new Surface(texture);
try
{
previewBuilder=cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
}catch (Exception e)
{
}
previewBuilder.addTarget(surface);
try
{
cameraDevice.createCaptureSession(Arrays.asList(surface), new CameraCaptureSession.StateCallback() {
@Override
public void onConfigured(CameraCaptureSession session) {
previewSession=session;
getChangedPreview();
}
@Override
public void onConfigureFailed(CameraCaptureSession session) {
}
},null);
}catch (Exception e)
{
}
}
void getChangedPreview()
{
if(cameraDevice==null)
{
return;
}
previewBuilder.set(CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_MODE_AUTO);
HandlerThread thread=new HandlerThread("changed Preview");
thread.start();
Handler handler=new Handler(thread.getLooper());
try
{
previewSession.setRepeatingRequest(previewBuilder.build(), null, handler);
}catch (Exception e){}
}
@Override
public boolean onCreateOptionsMenu(Menu menu) {
// Inflate the menu; this adds items to the action bar if it is present.
getMenuInflater().inflate(R.menu.menu, menu);
return true;
}
@Override
public boolean onOptionsItemSelected(MenuItem item) {
// Handle action bar item clicks here. The action bar will
// automatically handle clicks on the Home/Up button, so long
// as you specify a parent activity in AndroidManifest.xml.
int id = item.getItemId();
//noinspection SimplifiableIfStatement
if (id == R.id.action_settings) {
return true;
}
return super.onOptionsItemSelected(item);
}
private File getOutputMediaFile() {
String timeStamp = new SimpleDateFormat("yyyyMMdd_HHmmss")
.format(new Date());
File mediaFile = new File(Environment.getExternalStorageDirectory() + "/MyCamAppCipher1/" + timeStamp + ".jpg");
return mediaFile;
}
@Override
protected void onDestroy() {
super.onDestroy();
}
}
In this piece of code, you're selecting the first available resolution, which is usually the highest one:
int width = 640, height = 480;
if (jpegSizes != null && jpegSizes.length > 0) {
width = jpegSizes[0].getWidth();
height = jpegSizes[0].getHeight();
}
Aside from doing what you're suggesting (reducing image quality by calling photo.compress(Bitmap.CompressFormat.JPEG, 50, out)), your alternative is to select a smaller camera resolution by iterating the jpegSizes array and picking a lower one.
Take into account that to do so you should use relative comparisons (i.e. width >= minWidth), or find the middle resolution, but always select one of the sizes available in this array. Note that this array varies from phone to phone (it depends on the camera characteristics).
For example, let's say that you need a minimum of 3M pixels (2048x1536). You could have the following code:
int width = Integer.MAX_VALUE, height = Integer.MAX_VALUE;
for (int i = 0; i < jpegSizes.length; i++) {
int currWidth = jpegSizes[i].getWidth();
int currHeight = jpegSizes[i].getHeight();
if ((currWidth < width && currHeight < height) && // smallest resolution so far
(currWidth >= 2048 && currHeight >= 1536)) { // at least 3M pixels
width = currWidth;
height = currHeight;
}
}
Notice that there are two conditions:
in the first line, we're looking for the smallest resolution seen so far (note also the initial values for width and height, Integer.MAX_VALUE, which mean that at least the first candidate will always match this condition)
in the second line, we're making sure it is at least 3M pixels
So, combined, this code selects the smallest available resolution that is at least 3M pixels.
Finally, you may add a fallback condition: if no matching resolution is found (i.e. width and height are still Integer.MAX_VALUE), select the first one as you're currently doing, as sketched below.
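That fallback could look like this, appended right after the loop:
// Fallback: no size matched the constraints, so keep the original first-size behaviour.
if (width == Integer.MAX_VALUE) {
    width = jpegSizes[0].getWidth();
    height = jpegSizes[0].getHeight();
}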
I have built a camera app with camera2 too, and saving images into disk is nearly immediate, even with 12 or 15 MB picture sizes (no need to compress to 50%). When you say your gallery is slow: are you loading full-size images into thumbnails? Are you displaying full-size images? Try loading small pictures for your thumbnails and display.
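For the thumbnail side of it, the usual trick is BitmapFactory.Options.inSampleSize, which decodes a downsampled bitmap instead of the full image. A minimal sketch; savedJpegPath stands for the path of the image you saved:
BitmapFactory.Options options = new BitmapFactory.Options();
options.inSampleSize = 8; // decode at roughly 1/8 of each dimension; tune to your thumbnail size
Bitmap thumbnail = BitmapFactory.decodeFile(savedJpegPath, options);
btnShow.setImageBitmap(thumbnail);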
