Hi, I am developing a real-time image processing application on Android, using PreviewCallback to get the image data for every frame. On tablet devices the preview data that comes back is very large, so it is hard to process it in real time.
My question is: is there any way to get lower-resolution data from the camera preview?
CAMERA PREVIEW CODE:

public void onPreviewFrame(byte[] data, Camera camera) {
    Camera.Parameters params = camera.getParameters();
    Log.v("image format", Integer.toString(params.getPreviewFormat()));
    // Frame capturing via frameManager
    frameManager.initCamFrame(params.getPreviewSize().width,
            params.getPreviewSize().height, data);
}
You can call parameters.setPreviewSize(width, height), but you need to do it before the camera preview starts. And you need to use a supported value; see the previous answer.
You also should not call camera.getParameters() on every frame; just do it once and save the values to a variable. You have only limited time in onPreviewFrame, because the byte[] data buffer is overwritten on each frame, so do only the essential work there.
And you should use setPreviewCallbackWithBuffer; it improves performance quite a bit - check this post.
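Something like this, done once during setup (an untested sketch; mCamera, mHolder, chosenSize and the cached fields are illustrative names, not from your code):

Camera.Parameters params = mCamera.getParameters();
// chosenSize must be one of params.getSupportedPreviewSizes()
params.setPreviewSize(chosenSize.width, chosenSize.height);
mCamera.setParameters(params);

// cache these once instead of calling getParameters() in onPreviewFrame()
mPreviewWidth = chosenSize.width;
mPreviewHeight = chosenSize.height;
mPreviewFormat = params.getPreviewFormat();

mCamera.setPreviewDisplay(mHolder); // throws IOException
mCamera.startPreview();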
Are you aware you can get a list of supported preview sizes from the camera parameters by calling getSupportedPreviewSizes()? The devices I've tried this on all returned a sorted list, although sometimes in ascending and sometimes in descending order. You'll probably want to manually iterate the list to find the 'smallest' preview size, or sort it first and grab the first item.
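For example, a small sketch of the "find the smallest size" step, since the list ordering varies by device:

List<Camera.Size> sizes = parameters.getSupportedPreviewSizes();
Camera.Size smallest = Collections.min(sizes, new Comparator<Camera.Size>() {
    @Override
    public int compare(Camera.Size a, Camera.Size b) {
        return a.width * a.height - b.width * b.height;
    }
});
parameters.setPreviewSize(smallest.width, smallest.height);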
You can try this:

@Override
public void onPreviewFrame(byte[] data, Camera camera) {
    try {
        byte[] baos = convertYuvToJpeg(data, camera);
        StringBuilder dataBuilder = new StringBuilder();
        dataBuilder.append("data:image/jpeg;base64,")
                .append(Base64.encodeToString(baos, Base64.DEFAULT));
        mSocket.emit("newFrame", dataBuilder.toString());
    } catch (Exception e) {
        Log.d("########", "ERROR");
    }
}
public byte[] convertYuvToJpeg(byte[] data, Camera camera) {
    Camera.Size previewSize = camera.getParameters().getPreviewSize();
    YuvImage image = new YuvImage(data, ImageFormat.NV21,
            previewSize.width, previewSize.height, null);
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    int quality = 20; // set quality
    // this line decreases the image quality
    image.compressToJpeg(new Rect(0, 0, previewSize.width, previewSize.height), quality, baos);
    return baos.toByteArray();
}
I am trying to take a picture using the Android camera. I have a requirement to capture a 1600 (w) x 1200 (h) image (a 3rd-party vendor requirement). My code seems to work fine for many phone cameras, but setPictureSize causes a crash on some phones (Samsung Galaxy S4, Samsung Galaxy Note) and a streaked picture on others (Nexus 7 tablet). On at least the Nexus, the size I want shows up in the getSupportedPictureSizes list.
I have tried specifying the orientation but it didn't help. Taking the picture with the default picture size works fine.
Here is an example of the streaking:
For my image capture I have a requirement of 1600x1200, jpg, 30% compression, so I am capturing a JPG file.
I think I have three choices:
1) Figure out how to capture the 1600x1200 size without a crash or streaking, or
2) Figure out how to change the size of the default picture size to a JPG that is 1600x1200.
3) Something else that is currently unknown to me.
I have found some other postings that have similar issues but not quite the same. I am in my 2nd day of trying things but am not finding a solution. Here is one posting that got close:
Camera picture to Bitmap results in messed up image (none of the suggestions helped me)
Here is the section of my code that worked fine until I ran into the S4/Note/Nexus 7. I have added a bit of debugging code for now:
Camera.Parameters parameters = mCamera.getParameters();
Camera.Size size = getBestPreviewSize(width, height, parameters);

if (size != null) {
    int pictureWidth = 1600;
    int pictureHeight = 1200;

    // testing
    Camera.Size test = parameters.getPictureSize();
    List<Camera.Size> testSizes = parameters.getSupportedPictureSizes();
    for (int i = 0; i < testSizes.size(); i++) {
        test = testSizes.get(i);
    }
    test = testSizes.get(3);
    // get(3) is 1600 x 1200
    pictureWidth = test.width;
    pictureHeight = test.height;

    parameters.setPictureFormat(ImageFormat.JPEG);
    parameters.setPictureSize(pictureWidth, pictureHeight);
    parameters.setJpegQuality(30);
    parameters.setPreviewSize(size.width, size.height);

    // catch any exception
    try {
        // make sure the preview is stopped
        mCamera.stopPreview();
        mCamera.setParameters(parameters);
        didConfig = true;
    } catch (Exception e) {
        // some error presentation was removed for brevity
        // since didConfig is not set to TRUE it will fail gracefully
    }
}
Here is the section of my code that saves the JPG file:
PictureCallback jpegCallback = new PictureCallback() {
    public void onPictureTaken(byte[] data, Camera camera) {
        if (data.length > 0) {
            String fileName = "image.jpg";
            File file = new File(getFilesDir(), fileName);
            String filePath = file.getAbsolutePath();
            boolean goodWrite = false;
            try {
                OutputStream os = new FileOutputStream(file);
                os.write(data);
                os.close();
                goodWrite = true;
            } catch (IOException e) {
                goodWrite = false;
            }
            if (goodWrite) {
                // go on to the Preview
            } else {
                // TODO return an error to the calling activity
            }
        }
        Log.d(TAG, "onPictureTaken - jpeg");
    }
};
Any suggestions on how to correctly set up the camera parameters for taking photos, or on how to crop or resize the resulting photo, would be great, especially if it works with older cameras (API level 8 or later)! Since I need the full width of the picture, I can only crop off the top.
Thanks!
EDIT: Here is what I ended up doing:
I started by processing the Camera.Parameters getSupportedPictureSizes to use the first one that had the height and width both greater than my desired size, AND the same width:height ratio. I set the Camera parameters to that picture size.
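Roughly, that selection step looked like this (a sketch of the idea rather than my exact code):

Camera.Size chosen = null;
for (Camera.Size s : parameters.getSupportedPictureSizes()) {
    boolean bigEnough = s.width >= 1600 && s.height >= 1200;
    boolean sameRatio = s.width * 1200 == s.height * 1600; // 4:3, compared without floats
    if (bigEnough && sameRatio) {
        chosen = s;
        break;
    }
}
if (chosen != null) {
    parameters.setPictureSize(chosen.width, chosen.height);
}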
Then once the picture was taken:
BitmapFactory.Options options = new BitmapFactory.Options();
options.inPurgeable = true;

// convert the byte array to a bitmap, taking care to allow for garbage collection
Bitmap original = BitmapFactory.decodeByteArray(input, 0, input.length, options);

// resize the bitmap to my desired scale
Bitmap resized = Bitmap.createScaledBitmap(original, 1600, 1200, true);

// create a new byte array and output the bitmap to a compressed JPG
ByteArrayOutputStream blob = new ByteArrayOutputStream();
resized.compress(Bitmap.CompressFormat.JPEG, 30, blob);

// recycle the memory since bitmaps seem to have slightly different garbage collection
original.recycle();
resized.recycle();

byte[] desired = blob.toByteArray();
Then I write out the desired jpg to a file for upload.
test = testSizes.get(3);
// get(3) is 1600 x 1200
There is no requirement that the array have 4+ elements, let alone that the fourth element be 1600x1200.
1) Figure out how to capture the 1600x1200 size without a crash or streaking
There is no guarantee that every device is capable of taking a picture with that exact resolution. You cannot specify arbitrary values for the resolution -- it must be one of the supported picture sizes. Some devices support arbitrary values, while other devices will give you corrupted output (as is the case here) or will flat-out crash.
2) Figure out how to change the size of the default picture size to a JPG that is 1600x1200
I am not aware that there is a "default picture size", and, beyond that, such a size will be immutable, since it is the default. Changing the picture size is your option #1 above.
3) Something else that is currently unknown to me.
For devices that support a resolution that is bigger on both axes, take a picture in that resolution, then crop to 1600x1200.
For all other devices, where one or both axes are smaller than desired, take a picture in whatever resolution suits you (largest, closest match to 4:3 aspect ratio, etc.), and then stretch/crop to get to 1600x1200.
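For the crop case, a center-crop sketch (it assumes the decoded capture is at least 1600x1200; decode options are omitted):

Bitmap full = BitmapFactory.decodeByteArray(data, 0, data.length);
int x = (full.getWidth() - 1600) / 2;
int y = (full.getHeight() - 1200) / 2;
Bitmap cropped = Bitmap.createBitmap(full, x, y, 1600, 1200);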
This question already has answers here:
How to get raw preview data from Camera object at least 15 frames per second in Android?
(6 answers)
Closed 9 years ago.
My application currently has a preview screen and I want it to capture many frames per second for processing. At the moment my preview only stores an image every second, but I need a much higher capture rate. Any help would be appreciated.
Another problem (if you can help with that too) is that my images are rotated 90 degrees when they appear on my SD card. No solutions from the internet have helped me with these problems so far :(
Thanks :)
public class MyCameraSurfaceView extends SurfaceView implements SurfaceHolder.Callback {

    private SurfaceHolder mHolder;
    private Camera mCamera;

    public MyCameraSurfaceView(Context context, Camera camera) {
        super(context);
        mCamera = camera;

        // Install a SurfaceHolder.Callback so we get notified when the
        // underlying surface is created and destroyed.
        mHolder = getHolder();
        mHolder.addCallback(this);
        // deprecated setting, but required on Android versions prior to 3.0
        mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
        // If your preview can change or rotate, take care of those events here.
        // Make sure to stop the preview before resizing or reformatting it.
        mCamera.setDisplayOrientation(90);

        if (mHolder.getSurface() == null) {
            // preview surface does not exist
            return;
        }

        // stop preview before making changes
        try {
            mCamera.stopPreview();
        } catch (Exception e) {
            // ignore: tried to stop a non-existent preview
        }

        // make any resize, rotate or reformatting changes here

        // start preview with new settings
        try {
            mCamera.setPreviewDisplay(mHolder);
            Camera.Parameters parameters = mCamera.getParameters();
            List<Camera.Size> sizes = parameters.getSupportedPreviewSizes();
            parameters.setPreviewSize(sizes.get(0).width, sizes.get(0).height);
            mCamera.setParameters(parameters);
            mCamera.startPreview();
        } catch (Exception e) {
        }

        mCamera.setPreviewCallback(new Camera.PreviewCallback() {
            public void onPreviewFrame(byte[] data, Camera camera) {
                Camera.Parameters parameters = camera.getParameters();
                int format = parameters.getPreviewFormat();
                // YUV formats require more conversion
                if (format == ImageFormat.NV21 || format == ImageFormat.YUY2 || format == ImageFormat.NV16) {
                    int w = parameters.getPreviewSize().width;
                    int h = parameters.getPreviewSize().height;
                    // Get the YUV image
                    YuvImage yuv_image = new YuvImage(data, format, w, h, null);
                    // Convert YUV to JPEG
                    Rect rect = new Rect(0, 0, w, h);
                    ByteArrayOutputStream output_stream = new ByteArrayOutputStream();
                    yuv_image.compressToJpeg(rect, 10, output_stream);
                    byte[] byt = output_stream.toByteArray();
                    FileOutputStream outStream = null;
                    try {
                        outStream = new FileOutputStream(String.format(
                                "/sdcard/bb%d.jpg", System.currentTimeMillis() / 1000));
                        outStream.write(byt);
                        outStream.close();
                    } catch (FileNotFoundException e) {
                        e.printStackTrace();
                    } catch (IOException e) {
                        e.printStackTrace();
                    }
                }
            }
        });
    }

    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        // The Surface has been created, now tell the camera where to draw the preview.
        try {
            mCamera.setDisplayOrientation(90);
            mCamera.setPreviewDisplay(holder);
            mCamera.startPreview();
        } catch (IOException e) {
        }
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
    }
}
First, note that Camera.Parameters allows you to set the frame rate with setPreviewFrameRate (deprecated in favor of setPreviewFpsRange). The value must be one of the supported values reported by getSupportedPreviewFrameRates() or getSupportedPreviewFpsRange().
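For instance, a rough sketch (untested) of asking for the highest supported fps range:

Camera.Parameters params = camera.getParameters();
List<int[]> ranges = params.getSupportedPreviewFpsRange();
// the list is documented to be sorted in ascending order, so take the last entry
int[] best = ranges.get(ranges.size() - 1);
params.setPreviewFpsRange(best[Camera.Parameters.PREVIEW_FPS_MIN_INDEX],
        best[Camera.Parameters.PREVIEW_FPS_MAX_INDEX]);
camera.setParameters(params);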
Second, handling preview frames in a byte[] is going to restrict your frame rate severely because of the amount of data that has to be copied around. If you want to write unmodified full-frame YUV data to disk then you don't currently have a choice. If you can cope with compression artifacts, and you have Android 4.3 or later, you can just save the data as an MPEG video and read the frames back later. See the CameraToMpegTest.java sample on this page for a code example.
Rotating an image by 90 or 180 degrees is straightforward to code. The Bitmap class can do it if you don't want to write it yourself.
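For example, once you have a Bitmap, something like this (a sketch, not tied to the code above; `source` is the decoded frame) rotates it 90 degrees:

Matrix matrix = new Matrix();
matrix.postRotate(90);
Bitmap rotated = Bitmap.createBitmap(source, 0, 0,
        source.getWidth(), source.getHeight(), matrix, true);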
setPreviewCallback() is an easy, but less efficient, way to request preview frames. The main problem is that the framework may be very busy allocating the byte[] chunks to fill, and the garbage collector may exact a heavy price. The preferred method is setPreviewCallbackWithBuffer(), but even this does not guarantee the desired frame rate, as can be seen in How to get raw preview data from Camera object at least 15 frames per second in Android?
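A minimal sketch of the buffered variant (the buffer size calculation assumes a YUV preview format):

Camera.Parameters params = camera.getParameters();
Camera.Size size = params.getPreviewSize();
int bitsPerPixel = ImageFormat.getBitsPerPixel(params.getPreviewFormat());
byte[] buffer = new byte[size.width * size.height * bitsPerPixel / 8];

camera.addCallbackBuffer(buffer);
camera.setPreviewCallbackWithBuffer(new Camera.PreviewCallback() {
    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        // ... process the frame ...
        camera.addCallbackBuffer(data); // return the buffer so the next frame can reuse it
    }
});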
Scenario:
I need to take a picture as fast as possible and save it to SD Card. It would be fantastic if I could do it in around 0.2 seconds both taking the picture and saving it.
What I did so far:
As normal I've created a SurfaceView to handle the Camera preview and initialized the camera object. The quality of the image doesn't need to be very high, that's why I am not using the largest resolution possible and also no autofocus is required.
I set the parameters like this:
Parameters parameters = camera.getParameters();
parameters.set("jpeg-quality", 70);
parameters.setPictureFormat(ImageFormat.JPEG);

List<Camera.Size> sizes = parameters.getSupportedPictureSizes();
Size size = sizes.get((sizes.size() - 1) / 2); // choose a medium resolution
parameters.setPictureSize(size.width, size.height);
camera.setParameters(parameters);
camera.setDisplayOrientation(90);

List<Size> sizes2 = parameters.getSupportedPreviewSizes();
Size size2 = sizes2.get(0);
parameters.setPreviewSize(size2.width, size2.height);

camera.setPreviewDisplay(holder);
camera.startPreview();
I save the image to SD card very simple with:
PictureCallback handlePictureStorage = new PictureCallback() {
    @Override
    public void onPictureTaken(byte[] data, Camera camera) {
        FileOutputStream outStream = null;
        try {
            outStream = new FileOutputStream(String.format("/sdcard/%d.jpg", System.currentTimeMillis()));
            outStream.write(data);
            outStream.close();
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
};
After making a few tests, on my Galaxy Nexus, the result looks like:
Setting picture size to: width=1600 height=1200
Jpeg quality : 70, Picture format JPEG
Fire take picture at: 00:13:23.603
Start saving picture on SD Card at: 00:13:23.956
Finished saving picture on SD Card at: 00:13:23.990
This is almost 0.4 seconds.
Is there a way to tweak the camera parameters even more to gain some speed? The resolution is OK, and so is the picture quality. I know there are apps on the market that take 30 pictures per second, but I think they use buffering to achieve that speed. However, as you can see, most of the time is spent taking the picture rather than saving it to the card. It would be great if I could tweak this a bit more.
After doing a bit of testing with multiple parameters, the conclusion is that not much is left to be done. Here are some parameters I've set:
// set color effects to none
cameraParameters.setColorEffect(Camera.Parameters.EFFECT_NONE);

// set antibanding to none
if (cameraParameters.getAntibanding() != null) {
    cameraParameters.setAntibanding(Camera.Parameters.ANTIBANDING_OFF);
}

// set white balance
if (cameraParameters.getWhiteBalance() != null) {
    cameraParameters.setWhiteBalance(Camera.Parameters.WHITE_BALANCE_CLOUDY_DAYLIGHT);
}

// set flash
if (cameraParameters.getFlashMode() != null) {
    cameraParameters.setFlashMode(Camera.Parameters.FLASH_MODE_OFF);
}

// set zoom
if (cameraParameters.isZoomSupported()) {
    cameraParameters.setZoom(0);
}

// set focus mode
cameraParameters.setFocusMode(Camera.Parameters.FOCUS_MODE_INFINITY);
However, the best idea is to get the full list of parameters supported by the camera as a string, and try to tweak them. To get the string, use the flatten() method of Camera.Parameters - http://developer.android.com/reference/android/hardware/Camera.Parameters.html#flatten()
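For example, just dump it to the log and read it from there (a one-liner, using the same cameraParameters as above):

Log.d("CameraParams", cameraParameters.flatten());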
But in order to get images really quickly, I had to use the preview with a buffer and, for each frame received, try to save it to the SD card in a thread. The picture quality isn't fantastic, but it's a start.
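Roughly what I mean by saving in a thread (a sketch; the actual file-writing code is omitted and the executor field name is just illustrative):

private final ExecutorService frameSaver = Executors.newSingleThreadExecutor();

public void onPreviewFrame(byte[] data, Camera camera) {
    // the preview buffer is reused, so copy it before leaving the callback
    final byte[] copy = data.clone();
    frameSaver.execute(new Runnable() {
        @Override
        public void run() {
            // compress `copy` (e.g. YuvImage.compressToJpeg) and write it to the SD card here
        }
    });
    camera.addCallbackBuffer(data); // only needed with setPreviewCallbackWithBuffer()
}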
If quality doesn't matter, maybe you could look into using something other than JPEG and compare execution times:
http://developer.android.com/reference/android/graphics/ImageFormat.html
EDIT:
After playing around with it for a few hours, I came to believe that the problem is the image quality. For example, the first image is how it came from the camera; the decoder can't read it. The second image has been turned into B/W with adjusted contrast, and the decoder reads it fine.
Since the demo app that comes with zxing is able to read the first image off the monitor in a few seconds, I think the problem might be some setting deep within the zxing library. It doesn't wait long enough to process the image, but spits out NotFound almost instantly.
I'm making a simple QR-reader app. Here's a screenshot.
The top black area is a surfaceview, that shows frames from the camera. It works fine, only you can't see it in the screenshot.
Then, when I press the button, a bitmap is taken from that surfaceview, placed on an ImageView below and is attempted to be read by the zxing library.
Yet it will give out a NotFoundException. :/
10-17 19:53:15.382: WARN/System.err(2238): com.google.zxing.NotFoundException
10-17 19:53:15.382: WARN/dalvikvm(2238): getStackTrace() called but no trace available
On the other hand, if I crop the QR image from this screenshot, place it into the ImageView (instead of the camera feed) and try to decode it, it works fine. Therefore the QR image itself and its quality are OK... but then why doesn't it decode in the first scenario?
Thanks!
public void dec(View v)
{
    ImageView ivCam2 = (ImageView) findViewById(R.id.imageView2);
    ivCam2.setImageBitmap(bm);
    BitmapDrawable drawable = (BitmapDrawable) ivCam2.getDrawable();
    Bitmap bMap = drawable.getBitmap();

    TextView textv = (TextView) findViewById(R.id.mytext);
    LuminanceSource source = new RGBLuminanceSource(bMap);
    BinaryBitmap bitmap = new BinaryBitmap(new HybridBinarizer(source));
    Reader reader = new MultiFormatReader();

    try {
        Result result = reader.decode(bitmap);
        Global.text = result.getText();
        byte[] rawBytes = result.getRawBytes();
        BarcodeFormat format = result.getBarcodeFormat();
        ResultPoint[] points = result.getResultPoints();
        textv.setText(Global.text);
    } catch (NotFoundException e) {
        textv.setText("NotFoundException");
    } catch (ChecksumException e) {
        textv.setText("ChecksumException");
    } catch (FormatException e) {
        textv.setText("FormatException");
    }
}
how the bitmap is created:
@Override
public void surfaceCreated(SurfaceHolder holder)
{
    try
    {
        this.camera = Camera.open();
        this.camera.setPreviewDisplay(this.holder);
        this.camera.setPreviewCallback(new PreviewCallback() {
            public void onPreviewFrame(byte[] _data, Camera _camera) {
                Camera.Parameters params = _camera.getParameters();
                int w = params.getPreviewSize().width;
                int h = params.getPreviewSize().height;
                int format = params.getPreviewFormat();
                YuvImage image = new YuvImage(_data, format, w, h, null);

                ByteArrayOutputStream out = new ByteArrayOutputStream();
                Rect area = new Rect(0, 0, w, h);
                image.compressToJpeg(area, 50, out);
                bm = BitmapFactory.decodeByteArray(out.toByteArray(), 0, out.size());
            }
        });
    }
    catch (IOException ioe)
    {
        ioe.printStackTrace(System.out);
    }
}
I wrote this code. Returning quickly isn't a problem. Decoding is very fast on a mobile, and very very fast on a desktop.
The general answer to this type of question is that some images just aren't going to decode. That's life -- the heuristics don't always get it right. But I don't think that is the problem here.
QR codes don't decode without a minimal white "quiet zone" around them. The image beyond its borders is considered white for this purpose. But in your raw camera image, there's little border around the code and it's not all considered white by the binarizer, I'd bet.
Still, there's more you can do. Set the TRY_HARDER hint to the decoder, for one, to have it spend a lot more CPU to try to decode. You can also try a different Binarizer implementation than the default HybridBinarizer.
(The rest looks just fine. I assume that RGBLuminanceSource is getting data in the format it expects; it ought to from Bitmap)
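A sketch of the two suggestions above (a hint map plus an alternative binarizer), reusing the same `source` and `reader` as in your code:

// TRY_HARDER costs CPU but copes better with low-quality images
Map<DecodeHintType, Object> hints = new EnumMap<DecodeHintType, Object>(DecodeHintType.class);
hints.put(DecodeHintType.TRY_HARDER, Boolean.TRUE);

// optionally swap the binarizer
BinaryBitmap bitmap = new BinaryBitmap(new GlobalHistogramBinarizer(source));

Result result = reader.decode(bitmap, hints); // still throws NotFoundException on failure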
See this: http://zxing.org/w/docs/javadoc/com/google/zxing/NotFoundException.html The exception means that a barcode wasn't found in the image. My suggestion would be to use the workaround that works for you instead of trying to decode the un-cropped image.
I would like to save a preview frame as a jpeg image.
I have tried to write the following code:
public void onPreviewFrame(byte[] _data, Camera _camera)
{
    if (settings.isRecording())
    {
        Camera.Parameters params = _camera.getParameters();
        params.setPictureFormat(PixelFormat.JPEG);
        _camera.setParameters(params);
        String path = "ImageDir" + frameCount;
        fileRW.setPath(path);
        fileRW.WriteToFile(_data);
        frameCount++;
    }
}
but it's not possible to open a saved file as a jpeg image. Does anyone know how to save preview frames as jpeg images?
Thanks
Check out this code, I hope it helps:

camera.setPreviewCallback(new PreviewCallback() {
    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        Camera.Parameters parameters = camera.getParameters();
        Size size = parameters.getPreviewSize();
        YuvImage image = new YuvImage(data, ImageFormat.NV21,
                size.width, size.height, null);
        Rect rectangle = new Rect();
        rectangle.bottom = size.height;
        rectangle.top = 0;
        rectangle.left = 0;
        rectangle.right = size.width;

        ByteArrayOutputStream out2 = new ByteArrayOutputStream();
        image.compressToJpeg(rectangle, 100, out2);

        // write the JPEG bytes out (the original snippet used a DataInputStream,
        // which cannot be written to; an output stream is needed; the path here is illustrative)
        try {
            FileOutputStream out = new FileOutputStream("/sdcard/preview.jpg");
            out.write(out2.toByteArray());
            out.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
});
camera.startPreview();
You have to convert it manually, there are some examples on the android-developers list if you browse the archive - mostly dealing with the format (luminance/chrominance,etc) conversion, then writing the image to a bitmap, then saving to a file.
It's all a bit of a pain really.
I set the preview format with Camera.Parameters.setPreviewFormat(PixelFormat.JPEG) before starting the preview, but it seems that it doesn't really set the preview format.
By the way, the default preview format is YCbCr_420_SP.
You must first check which preview formats your device supports by calling getSupportedPreviewFormats(). Make sure JPEG is supported before calling setPreviewFormat(PixelFormat.JPEG).
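Something along these lines (a sketch; note that PixelFormat.JPEG is the old constant, ImageFormat.JPEG is the current one):

Camera.Parameters params = camera.getParameters();
List<Integer> formats = params.getSupportedPreviewFormats();
if (formats.contains(ImageFormat.JPEG)) {
    params.setPreviewFormat(ImageFormat.JPEG);
} else {
    // NV21 must be supported for preview on every device
    params.setPreviewFormat(ImageFormat.NV21);
}
camera.setParameters(params);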
JPEG is not a format for Camera Preview. As official documentation says,
"Only ImageFormat.NV21 and ImageFormat.YUY2 are supported for now"
In order to get a picture from the camera preview, you need to set a supported preview format, as below:

Camera.Parameters parameters = camera.getParameters();
parameters.setPreviewFormat(ImageFormat.NV21); // or ImageFormat.YUY2
camera.setParameters(parameters);
After that, you compress & save JPEG as in Dany's example.
_data probably isn't in JPEG format. Did you call Camera.Parameters.setPreviewFormat(PixelFormat.JPEG) before calling startPreview()?