Getting image from SurfaceView to ImageView? - android

I'm having a little trouble getting an image/drawable or a bitmap from a SurfaceView that works as a camera preview.
final CameraSurfaceView cameraSurfaceView = new CameraSurfaceView(this);
LinearLayout ll = (LinearLayout)findViewById(R.id.linearLayout1);
ll.addView(cameraSurfaceView); // THIS WORKS
ImageView ivCam = (ImageView) findViewById(R.id.ivCam);
ivCam.setImageBitmap(cameraSurfaceView.getDrawingCache()); // THIS DOESN'T :(
Any suggestions? Thanks!
EDIT:
@Override
public void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.main);
    final CameraSurfaceView cameraSurfaceView = new CameraSurfaceView(this);
    LinearLayout ll = (LinearLayout) findViewById(R.id.LLS);
    ll.addView(cameraSurfaceView); // THIS WORKS
}
///////////////////////////////////////////////////////////////////////////////////////////////
public class CameraSurfaceView extends SurfaceView implements SurfaceHolder.Callback
{
    private SurfaceHolder holder;
    private Camera camera;

    public CameraSurfaceView(Context context)
    {
        super(context);
        // Initiate the SurfaceHolder properly
        this.holder = this.getHolder();
        this.holder.addCallback(this);
        this.holder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
    }

    @Override
    public void surfaceCreated(SurfaceHolder holder)
    {
        try
        {
            this.camera = Camera.open();
            this.camera.setPreviewDisplay(this.holder);
            this.camera.setPreviewCallback(new PreviewCallback() {
                public void onPreviewFrame(byte[] _data, Camera _camera) {
                    Camera.Parameters params = _camera.getParameters();
                    int w = params.getPreviewSize().width;
                    int h = params.getPreviewSize().height;
                    int format = params.getPreviewFormat();
                    YuvImage image = new YuvImage(_data, format, w, h, null);
                    ByteArrayOutputStream out = new ByteArrayOutputStream();
                    Rect area = new Rect(0, 0, w, h);
                    image.compressToJpeg(area, 50, out);
                    Bitmap bm = BitmapFactory.decodeByteArray(out.toByteArray(), 0, out.size());
                    ImageView ivCam = (ImageView) findViewById(R.id.imageView1);
                    ivCam.setImageBitmap(bm); // NULL POINTER EXCEPTION HERE!
                }
            });
        }
        catch (IOException ioe)
        {
            ioe.printStackTrace(System.out);
        }
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height)
    {
        this.camera.startPreview();
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder)
    {
        this.camera.stopPreview();
        this.camera.release();
        this.camera = null;
    }

    public Camera getCamera()
    {
        return this.camera;
    }
}

It's far more complicated than that. The background of the SurfaceView is not the camera preview. You have to have a class that implements Camera.PreviewCallback. Once you have that, you can get a byte array containing the image that the preview sends. On some phones, you can set the preview to be a JPEG, in which case you can decode it straight away with BitmapFactory. On other phones that don't support that feature, you'll get by default a YUV 4:2:0 image that you have to convert into a JPEG yourself.
On Android 2.2+, you can convert the YUV image to a JPEG like so:
int w = params.getPreviewSize().width;
int h = params.getPreviewSize().height;
int format = params.getPreviewFormat();
YuvImage image = new YuvImage(data, format, w, h, null);
ByteArrayOutputStream out = new ByteArrayOutputStream();
Rect area = new Rect(0, 0, w, h);
image.compressToJpeg(area, 50, out);
Bitmap bm = BitmapFactory.decodeByteArray(out.toByteArray(), 0, out.size());
ivCam.setImageBitmap(bm);
If you're targeting older models, you have to use a conversion algorithm like the one here.
http://blog.tomgibara.com/post/132956174/yuv420-to-rgb565-conversion-in-android
A SO source:
Getting frames from Video Image in Android
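Building on the links above, a minimal sketch of that NV21 (YUV420sp) to ARGB conversion looks roughly like this. The integer-approximation constants are the commonly used ones; the method name and the usage lines are illustrative, not taken from the linked posts:
// Decodes an NV21 (YUV420sp) preview frame into an ARGB_8888 pixel array.
static void decodeYUV420SP(int[] rgb, byte[] yuv420sp, int width, int height) {
    final int frameSize = width * height;
    for (int j = 0, yp = 0; j < height; j++) {
        int uvp = frameSize + (j >> 1) * width, u = 0, v = 0;
        for (int i = 0; i < width; i++, yp++) {
            int y = (0xff & yuv420sp[yp]) - 16;
            if (y < 0) y = 0;
            if ((i & 1) == 0) {
                v = (0xff & yuv420sp[uvp++]) - 128;
                u = (0xff & yuv420sp[uvp++]) - 128;
            }
            int y1192 = 1192 * y;
            int r = y1192 + 1634 * v;
            int g = y1192 - 833 * v - 400 * u;
            int b = y1192 + 2066 * u;
            // Clamp each channel before packing into ARGB.
            if (r < 0) r = 0; else if (r > 262143) r = 262143;
            if (g < 0) g = 0; else if (g > 262143) g = 262143;
            if (b < 0) b = 0; else if (b > 262143) b = 262143;
            rgb[yp] = 0xff000000 | ((r << 6) & 0xff0000) | ((g >> 2) & 0xff00) | ((b >> 10) & 0xff);
        }
    }
}
// Usage inside onPreviewFrame:
int[] rgb = new int[w * h];
decodeYUV420SP(rgb, data, w, h);
Bitmap bm = Bitmap.createBitmap(rgb, w, h, Bitmap.Config.ARGB_8888);
ivCam.setImageBitmap(bm);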
EDIT:
If all you want is to show the camera view, then you just add the SurfaceView that your camera is using to a layout that is already displayed, as you did in your question. It is already displaying the preview.

Related

Scaling the image and setting in Image View reduces image quality and squeezes it

I am trying to make a custom camera, and after taking a picture I set it in an ImageView in the same activity in which I set up the camera. I have been able to take photos, but before setting the image in the ImageView I have to scale it, which reduces the image quality. Is there any way to show the real image instead of scaling it?
My images are below. The first one is the real view of the camera, which is the SurfaceView:
After taking the photo it becomes:
The code I am using is:
Camera.PictureCallback picture = new Camera.PictureCallback() {
    @Override
    public void onPictureTaken(byte[] data, Camera camera) {
        mCamera.stopPreview();
        surface_view.setVisibility(View.INVISIBLE);
        setupImageDisplay(data);
    }
};

private void setupImageDisplay(byte[] data) {
    photo = BitmapFactory.decodeByteArray(data, 0, data.length);
    photo = scaleDown(photo, true); // scaling down bitmap
    imageview_photo.setImageBitmap(photo); // setting bitmap in imageview
}
public Bitmap scaleDown(Bitmap realImage, boolean filter) {
    int screenWidth = width;
    int screenHeight = height;
    Bitmap scaled;
    if (getResources().getConfiguration().orientation == Configuration.ORIENTATION_PORTRAIT) {
        // Notice that width and height are reversed
        scaled = Bitmap.createScaledBitmap(realImage, screenHeight, screenWidth, filter);
        int w = scaled.getWidth();
        int h = scaled.getHeight();
        // Setting post rotate to 90
        Matrix mtx = new Matrix();
        if (camera_id == Camera.CameraInfo.CAMERA_FACING_FRONT) {
            float[] mirrorY = {-1, 0, 0, 0, 1, 0, 0, 0, 1};
            Matrix matrixMirrorY = new Matrix();
            matrixMirrorY.setValues(mirrorY);
            mtx.postConcat(matrixMirrorY);
        }
        mtx.postRotate(90);
        // Rotating Bitmap
        realImage = Bitmap.createBitmap(scaled, 0, 0, w, h, mtx, filter);
    } else { // LANDSCAPE MODE
        // No need to reverse width and height
        scaled = Bitmap.createScaledBitmap(realImage, screenHeight, screenWidth, filter);
        int w = scaled.getWidth();
        int h = scaled.getHeight();
        // Setting post rotate to 180
        Matrix mtx = new Matrix();
        if (camera_id == Camera.CameraInfo.CAMERA_FACING_FRONT) {
            float[] mirrorY = {-1, 0, 0, 0, 1, 0, 0, 0, 1};
            Matrix matrixMirrorY = new Matrix();
            matrixMirrorY.setValues(mirrorY);
            mtx.postConcat(matrixMirrorY);
        }
        mtx.postRotate(180);
        // Rotating Bitmap
        realImage = Bitmap.createBitmap(scaled, 0, 0, w, h, mtx, filter);
    }
    return realImage;
}
After taking the photo, the image looks squeezed. Is there any way for the image to keep its proportions after scaling?
You can create a separate, temporary file that stores a thumbnail-sized version of the image. You can make a POJO like this to store both images: display the smaller one and keep the original file for full quality.
public class Image {
    File fullSize;
    File Thumbnail;

    public Image(File fullSize, File thumbnail) {
        this.fullSize = fullSize;
        Thumbnail = thumbnail;
    }

    public File getFullSize() {
        return fullSize;
    }

    public void setFullSize(File fullSize) {
        this.fullSize = fullSize;
    }

    public File getThumbnail() {
        return Thumbnail;
    }

    public void setThumbnail(File thumbnail) {
        Thumbnail = thumbnail;
    }
}
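If it helps, a rough sketch of filling that POJO follows. The file paths, the 200x200 thumbnail size, and the use of ThumbnailUtils are my own assumptions, not part of the original answer:
// Sketch: given the full-size JPEG file, write a small thumbnail next to it
// and wrap both in the Image POJO above. Paths and sizes are placeholders.
File fullSizeFile = new File(getExternalFilesDir(null), "photo.jpg");      // hypothetical path
Bitmap full = BitmapFactory.decodeFile(fullSizeFile.getAbsolutePath());
Bitmap thumb = ThumbnailUtils.extractThumbnail(full, 200, 200);

File thumbFile = new File(getExternalFilesDir(null), "photo_thumb.jpg");   // hypothetical path
try (FileOutputStream out = new FileOutputStream(thumbFile)) {
    thumb.compress(Bitmap.CompressFormat.JPEG, 90, out);
}

Image image = new Image(fullSizeFile, thumbFile);
imageview_photo.setImageBitmap(thumb); // show the small one, keep fullSize for later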

Android-Ocr using Tesseract in Portrait

I used the ocr sample in this link https://github.com/rmtheis/android-ocr
Everything is working fine, but I want it in portrait view. I followed the steps in this link, Zxing Camera in Portrait mode on Android, to enable OCR with tess-two in portrait mode. The view is portrait now, but the camera is still taking the picture in landscape mode.
Any help?
final class PreviewCallback implements Camera.PreviewCallback {
    private static final String TAG = PreviewCallback.class.getSimpleName();
    private final CameraConfigurationManager configManager;
    private Handler previewHandler;
    private int previewMessage;

    PreviewCallback(CameraConfigurationManager configManager) {
        this.configManager = configManager;
    }

    void setHandler(Handler previewHandler, int previewMessage) {
        this.previewHandler = previewHandler;
        this.previewMessage = previewMessage;
    }

    // Preview data arrives here in the default (NV21) format.
    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        Point cameraResolution = configManager.getCameraResolution();
        Handler thePreviewHandler = previewHandler;
        if (cameraResolution != null && thePreviewHandler != null) {
            Message message = thePreviewHandler.obtainMessage(previewMessage, cameraResolution.x,
                    cameraResolution.y, data);
            message.sendToTarget();
            previewHandler = null;
        } else {
            Log.d(TAG, "Got preview callback, but no handler or resolution available");
        }
    }
}
Are you using the preview data with this method:
public void onPreviewFrame(byte[] data, Camera camera) {}
If yes, then I can help you, since I am doing a very similar project (which will be open-sourced soon).
Here is the code that I am using to rotate the preview image:
public static Bitmap getBitmapImageFromYUV(byte[] data, int width,
        int height, int degree, Rect rect) {
    Bitmap bitmap = getBitmapImageFromYUV(data, width, height, rect);
    return rotateBitmap(bitmap, degree, rect);
}

public static Bitmap rotateBitmap(Bitmap source, float angle, Rect rect) {
    Matrix matrix = new Matrix();
    matrix.postRotate(angle);
    source = Bitmap.createBitmap(source, 0, 0, source.getWidth(),
            source.getHeight(), matrix, true);
    source = Bitmap.createBitmap(source, rect.left, rect.top, rect.width(), rect.height());
    if (mShouldSavePreview)
        saveBitmap(source);
    return source;
}

public static Bitmap getBitmapImageFromYUV(byte[] data, int width,
        int height, Rect rect) {
    YuvImage yuvimage = new YuvImage(data, ImageFormat.NV21, width, height, null);
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    yuvimage.compressToJpeg(new Rect(0, 0, width, height), 90, baos);
    byte[] jdata = baos.toByteArray();
    BitmapFactory.Options bitmapFactoryOptions = new BitmapFactory.Options();
    bitmapFactoryOptions.inPreferredConfig = Bitmap.Config.ARGB_8888;
    Bitmap bmp = BitmapFactory.decodeByteArray(jdata, 0, jdata.length, bitmapFactoryOptions);
    Log.d(TAG, "getBitmapImageFromYUV w:" + bmp.getWidth() + " h:" + bmp.getHeight());
    return bmp;
}
Guys, I found the solution!
Replace the following code in the function ocrDecode(byte[] data, int width, int height) in the DecodeHandler.java file:
beepManager.playBeepSoundAndVibrate();
activity.displayProgressDialog();

// *************SHARNOUBY CODE
// Rotate the luminance data 90 degrees so OCR sees a portrait frame.
byte[] rotatedData = new byte[data.length];
for (int y = 0; y < height; y++) {
    for (int x = 0; x < width; x++)
        rotatedData[x * height + height - y - 1] = data[x + y * width];
}
int tmp = width;
width = height;
height = tmp;
//******************************

// Launch OCR asynchronously, so we get the dialog box displayed immediately
new OcrRecognizeAsyncTask(activity, baseApi, rotatedData, width, height)
        .execute();
...The problem was in the switch case in the function handleMessage(Message message): the second case, which calls the rotation code, was never triggered.

OnPreviewFrame data image to imageView

I'm trying to get the byte[] from the preview of the camera, convert it to a bitmap, and display it in an ImageView with imageView.setImageBitmap().
I've managed to start the preview and display it on a SurfaceView, but I don't know how to convert the byte[] data (which I think comes in a YUV format) into an RGB bitmap to display in an ImageView.
The code I'm trying is the following:
camera = Camera.open();
parameters = camera.getParameters();
camera.setParameters(parameters);
surfaceHolder = surfaceView.getHolder();
surfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
camera.setPreviewDisplay(surfaceHolder);
camera.setPreviewCallback(this);
camera.startPreview();
And the preview callback is this:
@Override
public void onPreviewFrame(byte[] data, Camera camera) {
    Camera.Parameters parameters = camera.getParameters();
    int width = parameters.getPreviewSize().width;
    int height = parameters.getPreviewSize().height;
    ByteArrayOutputStream outstr = new ByteArrayOutputStream();
    Rect rect = new Rect(0, 0, width, height);
    YuvImage yuvimage = new YuvImage(data, ImageFormat.NV21, width, height, null);
    yuvimage.compressToJpeg(rect, 100, outstr);
    Bitmap bmp = BitmapFactory.decodeByteArray(outstr.toByteArray(), 0, outstr.size());
    imgView1.setImageBitmap(bmp);
}
The preview works, but the ImageView remains empty.
Any ideas?
It is possible you did not open the Camera in the UI thread. However, you need to ensure setImageBitmap is called in the UI thread:
@Override
public void onPreviewFrame(final byte[] data, Camera camera) {
    Camera.Parameters parameters = camera.getParameters();
    int width = parameters.getPreviewSize().width;
    int height = parameters.getPreviewSize().height;
    YuvImage yuv = new YuvImage(data, parameters.getPreviewFormat(), width, height, null);
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    yuv.compressToJpeg(new Rect(0, 0, width, height), 50, out);
    byte[] bytes = out.toByteArray();
    final Bitmap bitmap = BitmapFactory.decodeByteArray(bytes, 0, bytes.length);
    MyActivity.this.runOnUiThread(new Runnable() {
        @Override
        public void run() {
            ((ImageView) findViewById(R.id.loopback)).setImageBitmap(bitmap);
        }
    });
}

How to change area of scan Zbar?

I want to change the area of the camera scan. Right now the image I scan is as big as the device screen. I'm trying to crop the image to analyze, so that only the center of the preview is the source for scanning. Is there any option to set the captured preview to be smaller, or is creating a Bitmap from the byte[] data and cropping it the only way to get a smaller area? I tried to read up on this, but the documentation for ZBar on Android is very poor (compared to iOS).
Picture here:
https://postimg.cc/image/4wk4u0mln/
MainActivity
public class MainActivity extends Activity
{
    private Camera mCamera;
    private Context context;
    private CameraPreview mPreview;
    private Handler autoFocusHandler;
    TextView scanText;
    Button scanButton;
    ImageScanner scanner;
    private PowerManager.WakeLock wl;
    private boolean barcodeScanned = false;
    private boolean previewing = true;

    static {
        System.loadLibrary("iconv");
    }

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_PORTRAIT);
        autoFocusHandler = new Handler();
        mCamera = getCameraInstance();
        context = getApplicationContext();

        /* Instance barcode scanner */
        scanner = new ImageScanner();
        scanner.setConfig(0, Config.X_DENSITY, 3);
        scanner.setConfig(0, Config.Y_DENSITY, 3);

        mPreview = new CameraPreview(this, mCamera, previewCb, autoFocusCB);
        FrameLayout preview = (FrameLayout) findViewById(R.id.cameraPreview);
        preview.addView(mPreview);
        scanText = (TextView) findViewById(R.id.scanText);
    }

    @Override
    public void onPause() {
        super.onPause();
        releaseCamera();
    }

    @Override
    protected void onResume() {
        // TODO Auto-generated method stub
        onStart();
    }

    @Override
    protected void onStop() {
        // TODO Auto-generated method stub
        super.onStop();
        finish();
    }

    /** A safe way to get an instance of the Camera object. */
    public static Camera getCameraInstance() {
        Camera c = null;
        try {
            c = Camera.open();
        } catch (Exception e) {
        }
        return c;
    }

    private void releaseCamera() {
        if (mCamera != null) {
            previewing = false;
            mCamera.setPreviewCallback(null);
            mCamera.release();
            mCamera = null;
        }
    }

    private Runnable doAutoFocus = new Runnable() {
        @Override
        public void run() {
            if (previewing)
                mCamera.autoFocus(autoFocusCB);
        }
    };

    PreviewCallback previewCb = new PreviewCallback() {
        @Override
        public void onPreviewFrame(byte[] data, Camera camera) {
            Camera.Parameters parameters = camera.getParameters();
            Size size = parameters.getPreviewSize();
            // HERE we read the captured picture from the preview
            Image barcode = new Image(size.width, size.height, "Y800");
            barcode.setData(data);
            int result = scanner.scanImage(barcode);
            if (result != 0) {
                previewing = false;
                mCamera.setPreviewCallback(null);
                mCamera.stopPreview();
                SymbolSet syms = scanner.getResults();
                for (Symbol sym : syms) {
                    if (sym.getType() == Symbol.CODE128) {
                        scanText.setText(sym.getData());
                        MediaPlayer mp = MediaPlayer.create(context, R.raw.beep_ok);
                        mp.start();
                    } else {
                        MediaPlayer mp = MediaPlayer.create(context, R.raw.beep_wrong);
                        mp.start();
                    }
                    mCamera.setPreviewCallback(previewCb);
                    mCamera.startPreview();
                    previewing = true;
                    mCamera.autoFocus(autoFocusCB);
                }
            }
        }
    };

    // Mimic continuous auto-focusing
    AutoFocusCallback autoFocusCB = new AutoFocusCallback() {
        @Override
        public void onAutoFocus(boolean success, Camera camera) {
            autoFocusHandler.postDelayed(doAutoFocus, 1000);
        }
    };

    // Method to crop a Bitmap in case of use
    public Bitmap scaleCenterCrop(Bitmap source, int newHeight, int newWidth) {
        int sourceWidth = source.getWidth();
        int sourceHeight = source.getHeight();
        float xScale = (float) newWidth / sourceWidth;
        float yScale = (float) newHeight / sourceHeight;
        float scale = Math.max(xScale, yScale);
        float scaledWidth = scale * sourceWidth;
        float scaledHeight = scale * sourceHeight;
        float left = (newWidth - scaledWidth) / 2;
        float top = (newHeight - scaledHeight) / 2;
        RectF targetRect = new RectF(left, top, left + scaledWidth, top + scaledHeight);
        Bitmap dest = Bitmap.createBitmap(newWidth, newHeight, source.getConfig());
        Canvas canvas = new Canvas(dest);
        canvas.drawBitmap(source, null, targetRect, null);
        return dest;
    }
}
CameraPreview.java
public class CameraPreview extends SurfaceView implements SurfaceHolder.Callback {
    private SurfaceHolder mHolder;
    private Camera mCamera;
    private PreviewCallback previewCallback;
    private AutoFocusCallback autoFocusCallback;

    public CameraPreview(Context context, Camera camera,
                         PreviewCallback previewCb,
                         AutoFocusCallback autoFocusCb) {
        super(context);
        mCamera = camera;
        previewCallback = previewCb;
        autoFocusCallback = autoFocusCb;

        /*
         * Set camera to continuous focus if supported, otherwise use
         * software auto-focus. Only works for API level >= 9.
         */
        /*
        Camera.Parameters parameters = camera.getParameters();
        for (String f : parameters.getSupportedFocusModes()) {
            if (f == Parameters.FOCUS_MODE_CONTINUOUS_PICTURE) {
                mCamera.setFocusMode(Parameters.FOCUS_MODE_CONTINUOUS_PICTURE);
                autoFocusCallback = null;
                break;
            }
        }
        */

        // Install a SurfaceHolder.Callback so we get notified when the
        // underlying surface is created and destroyed.
        mHolder = getHolder();
        mHolder.addCallback(this);
        // deprecated setting, but required on Android versions prior to 3.0
        mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
    }

    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        // The Surface has been created, now tell the camera where to draw the preview.
        try {
            mCamera.setPreviewDisplay(holder);
        } catch (IOException e) {
            Log.d("DBG", "Error setting camera preview: " + e.getMessage());
        }
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        // Camera preview released in activity
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
        /*
         * If your preview can change or rotate, take care of those events here.
         * Make sure to stop the preview before resizing or reformatting it.
         */
        if (mHolder.getSurface() == null) {
            // preview surface does not exist
            return;
        }

        // stop preview before making changes
        try {
            mCamera.stopPreview();
        } catch (Exception e) {
            // ignore: tried to stop a non-existent preview
        }

        try {
            // Hard code camera surface rotation 90 degs to match Activity view in portrait
            mCamera.setDisplayOrientation(90);
            mCamera.setPreviewDisplay(mHolder);
            mCamera.setPreviewCallback(previewCallback);
            mCamera.startPreview();
            mCamera.autoFocus(autoFocusCallback);
        } catch (Exception e) {
            Log.d("DBG", "Error starting camera preview: " + e.getMessage());
        }
    }
}
In the PreviewCallback you can actually crop the scanning area. In the onPreviewFrame method, after calling barcode.setData(), you can call barcode.setCrop(left, top, width, height). All measurements are in pixels.
Also make sure to set the crop size with respect to the preview image size, not the device screen size.
Note: make sure that the x-axis is always the longest side, even when you hold the phone in portrait mode.
Please refer to the image below for a better understanding of the arguments.
This solution is specific to ZBar for Android.
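A rough sketch of how that fits into the onPreviewFrame above; the centered rectangle and its size are placeholder values, not from the original answer:
// Inside onPreviewFrame, after barcode.setData(data):
// crop to a centered window; all values are in preview-image pixels,
// and size.width is the long side even when the phone is held in portrait.
int cropWidth = size.width / 2;
int cropHeight = size.height / 2;
int cropLeft = (size.width - cropWidth) / 2;
int cropTop = (size.height - cropHeight) / 2;
barcode.setCrop(cropLeft, cropTop, cropWidth, cropHeight);
int result = scanner.scanImage(barcode);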

How to get Android camera preview data?

My camera app displays a camera preview on the screen and also processes it in the background. Here is the relevant code, condensed as much as possible (e.g. no error handling or field declarations shown):
public final class CameraView extends SurfaceView implements
        SurfaceHolder.Callback, Runnable, PreviewCallback {

    public CameraView(Context context, AttributeSet attrs) {
        super(context, attrs);
        mHolder = getHolder();
        mHolder.addCallback(this);
        mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
    }

    void openCamera() {
        // Called from parent activity after setting content view to CameraView
        mCamera = Camera.open();
        mCamera.setPreviewCallbackWithBuffer(this);
    }

    public void surfaceCreated(SurfaceHolder holder) {
        new Thread(this).start();

        // Set CameraView to the optimal camera preview size
        final Camera.Parameters params = mCamera.getParameters();
        final List<Camera.Size> sizes = params.getSupportedPreviewSizes();
        final int screenWidth = ((View) getParent()).getWidth();
        int minDiff = Integer.MAX_VALUE;
        Camera.Size bestSize = null;
        if (getResources().getConfiguration().orientation
                == Configuration.ORIENTATION_LANDSCAPE) {
            // Find the camera preview width that best matches the
            // width of the surface.
            for (Camera.Size size : sizes) {
                final int diff = Math.abs(size.width - screenWidth);
                if (diff < minDiff) {
                    minDiff = diff;
                    bestSize = size;
                }
            }
        } else {
            // Find the camera preview HEIGHT that best matches the
            // width of the surface, since the camera preview is rotated.
            mCamera.setDisplayOrientation(90);
            for (Camera.Size size : sizes) {
                final int diff = Math.abs(size.height - screenWidth);
                if (Math.abs(size.height - screenWidth) < minDiff) {
                    minDiff = diff;
                    bestSize = size;
                }
            }
        }
        final int previewWidth = bestSize.width;
        final int previewHeight = bestSize.height;

        ViewGroup.LayoutParams layoutParams = getLayoutParams();
        layoutParams.height = previewHeight;
        layoutParams.width = previewWidth;
        setLayoutParams(layoutParams);

        params.setPreviewFormat(ImageFormat.NV21);
        mCamera.setParameters(params);

        int size = previewWidth * previewHeight *
                ImageFormat.getBitsPerPixel(params.getPreviewFormat()) / 8;
        mBuffer = new byte[size];
        mCamera.addCallbackBuffer(mBuffer);
        mCamera.setPreviewDisplay(mHolder);
        mCamera.startPreview();
    }

    public void onPreviewFrame(byte[] data, Camera camera) {
        CameraView.this.notify();
    }

    public void run() {
        mThreadRun = true;
        while (mThreadRun) {
            synchronized (this) {
                this.wait();
                processFrame(mBuffer); // convert to RGB and rotate - not shown
            }
            // Request a new frame from the camera by putting
            // the buffer back into the queue
            mCamera.addCallbackBuffer(mBuffer);
        }
        mHolder.removeCallback(this);
        mCamera.stopPreview();
        mCamera.setPreviewCallback(null);
        mCamera.release();
        mCamera = null;
    }

    public void surfaceDestroyed(SurfaceHolder holder) {
        mThreadRun = false;
    }
}
On all devices, the camera preview displays properly, and on most (emulator, Samsung Galaxy S3, etc.) the data stored in mBuffer is also correct (after NV21 to RGB conversion and rotation, of course). However, a number of devices do not supply the correct data in onPreviewFrame. I'm sure that the data is being converted to RGB correctly after it's received, so the problem appears to be in the raw data supplied to mBuffer. I've noticed this bug report relating to the YV12 (alias YUV420p) camera preview format, but I'm using the old default, NV21 (alias YUV420sp), which must be supported according to the compatibility standard (see 7.5.3.2, bottom of page 29).
For example, for this scene (shown here in Camera Preview on the Samsung Galaxy Tab 2):
the data passed to mBuffer on the Tab 2 looks like:
and on the Motorola Droid 4 looks like:
What is the correct way to get Android camera preview data across all devices?
Edit: for processFrame(), I used OpenCV to convert to RGB and rotate. See this answer and this answer.
The only problem was that I didn't set the preview width and height:
params.setPreviewSize(previewWidth, previewHeight);
mCamera.setParameters(params);
This meant that the height and width I allocated for the array (proportional to previewWidth * previewHeight) tended to be a lot larger than the size of the actual data being returned (proportional to the default preview width and preview height). On some phones, the default was the same size as previewWidth and previewHeight, so there was no issue.
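Put differently, the camera has to be told to produce frames at the size the buffer was allocated for; roughly, reusing the question's variable names:
// Tell the camera to produce preview frames at the size we measured,
// then size the callback buffer to match.
params.setPreviewSize(previewWidth, previewHeight);
params.setPreviewFormat(ImageFormat.NV21);
mCamera.setParameters(params);

int bufferSize = previewWidth * previewHeight
        * ImageFormat.getBitsPerPixel(params.getPreviewFormat()) / 8;
mCamera.addCallbackBuffer(new byte[bufferSize]);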
You can also try this
public void takeSnapPhoto() {
    camera.setOneShotPreviewCallback(new Camera.PreviewCallback() {
        @Override
        public void onPreviewFrame(byte[] data, Camera camera) {
            Camera.Parameters parameters = camera.getParameters();
            int format = parameters.getPreviewFormat();
            // YUV formats require more conversion
            if (format == ImageFormat.NV21 || format == ImageFormat.YUY2 || format == ImageFormat.NV16) {
                int w = parameters.getPreviewSize().width;
                int h = parameters.getPreviewSize().height;
                // Get the YuvImage
                YuvImage yuv_image = new YuvImage(data, format, w, h, null);
                // Convert YUV to JPEG
                Rect rect = new Rect(0, 0, w, h);
                ByteArrayOutputStream output_stream = new ByteArrayOutputStream();
                yuv_image.compressToJpeg(rect, 100, output_stream);
                byte[] byt = output_stream.toByteArray();
                FileOutputStream outStream = null;
                try {
                    // Write to SD Card
                    File file = createFileInSDCard(FOLDER_PATH, "Image_" + System.currentTimeMillis() + ".jpg");
                    //Uri uriSavedImage = Uri.fromFile(file);
                    outStream = new FileOutputStream(file);
                    outStream.write(byt);
                    outStream.close();
                } catch (FileNotFoundException e) {
                    e.printStackTrace();
                } catch (IOException e) {
                    e.printStackTrace();
                } finally {
                }
            }
        }
    });
}
From working on Barcode Scanner, which relies heavily on preview data, I feel like I've seen every bug under the sun. My suggestion is simply to not call setPreviewFormat() and let it use the default. Fortunately, the default is what you want here. There seem to be fewer devices that fail to get the default right than devices that botch the call to setPreviewFormat(). Try that at least; it may or may not be the fix.
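A minimal sketch of that approach inside the preview callback (illustrative only; the branch bodies are left to your own decoding code):
// Rely on the device's default preview format instead of forcing one,
// and only decode formats you actually recognize.
Camera.Parameters params = camera.getParameters();
int format = params.getPreviewFormat(); // NV21 by default on compliant devices
if (format == ImageFormat.NV21) {
    // decode the YUV420sp buffer as usual
} else {
    Log.w("Preview", "Unexpected preview format: " + format);
}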
