Android Camera.takePicture() saves pictures with 176x144 pixels - android

I have programmed an app with a camera preview. When I take a photo now, the app saves it at 176 x 144 pixels. Can I change the width and the height?
I use the Camera API, not Camera2.
When I take photos with the stock camera app, it saves images at 2368 x 4208 pixels.
This is where the photo is taken:
public void onPictureTaken(byte[] data, Camera camera) {
Bitmap rawImg = BitmapFactory.decodeByteArray(data, 0, data.length);
Log.e("TakePicture", "Picture callback reached");
shotAnalyser.startAnalysation(rawImg, data);
}
Here I grab the data and create a Bitmap:
public void startAnalysation(final Bitmap rawImg, final byte[] data){
Bitmap rawBitmap = BitmapFactory.decodeByteArray(data, 0, data.length);
// Determine the required image properties
int width = rawBitmap.getWidth();
int height = rawBitmap.getHeight();
}
Here is where I tried to change the picture size:
public Camera getCameraInstance(){
Camera c = null;
try {
releaseCameraAndPreview();
c = Camera.open(Camera.CameraInfo.CAMERA_FACING_BACK); // attempt to get a Camera instance
Log.e(LOG, "CameraInstance: " + c + " RUNS");
Camera.Parameters parameters = c.getParameters();
parameters.set("jpeg-quality", 70);
parameters.setPictureFormat(PixelFormat.JPEG);
parameters.setPictureSize(2048, 1232);
c.setParameters(parameters);
}
catch (Exception e){
// An empty catch hides the real problem: setParameters() throws if the
// requested picture size is not supported, and the driver keeps its default.
Log.e(LOG, "Camera unavailable or parameters rejected", e);
}
return c; // returns null if camera is unavailable
}
I've googled a lot, but I can't find anything.
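Since setParameters() is wrapped in an empty catch, a request for an unsupported size fails silently and the driver stays at its default. A minimal sketch of the selection step, written as plain Java over width/height pairs (a stand-in for the Camera.Size list returned by getSupportedPictureSizes(), so the logic can be checked off-device):

```java
// Sketch: choose the supported size whose area is closest to a requested size.
// The int[2] pairs stand in for android.hardware.Camera.Size entries
// (a substitution made here so this runs outside Android).
public class PictureSizePicker {
    public static int[] closestSupportedSize(int[][] supported, int reqW, int reqH) {
        long target = (long) reqW * reqH;
        int[] best = null;
        long bestDiff = Long.MAX_VALUE;
        for (int[] size : supported) {
            long diff = Math.abs((long) size[0] * size[1] - target);
            if (diff < bestDiff) {
                bestDiff = diff;
                best = size;
            }
        }
        return best;
    }
}
```

On the device you would run the same loop over parameters.getSupportedPictureSizes(), call parameters.setPictureSize(best.width, best.height), and only then camera.setParameters(parameters).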

Related

Combine two images into one orientation issue

I want to make an app where the user can take a picture with the back camera and then with the front camera.
After that I have two bitmaps, and I want to combine them into one image.
This is the code I use for the front camera parameters:
//Set up picture orientation for saving...
Camera.Parameters parameters = theCamera.getParameters();
parameters.setRotation(90);
frontCamera.setParameters(parameters);
//Set up camera preview and set orientation to PORTRAIT
frontCamera.stopPreview();
frontCamera.setDisplayOrientation(90);
frontCamera.setPreviewDisplay(holder);
frontCamera.startPreview();
This is the code I use for taking a picture with the front camera:
cameraObject.takePicture(shutterCallback, rawCallback, jpegCallback);
Callback for taking a picture with the back camera:
PictureCallback jpegCallback = new PictureCallback() {
@Override
public void onPictureTaken(byte[] data, Camera camera) {
backBitmap = decodeBitmapFromByte(data, 0, 800, 800);
frontCameraObject.release();
initFrontCamera();
}
};
NOTE: Similar code is used for taking a picture with the front camera. I get two bitmaps, and then I try to combine them with the code below, but the saved bitmap has the wrong orientation.
This is the code I use for combining the two bitmaps, frontBitmap and backBitmap:
public Bitmap combineImages(Bitmap c, Bitmap s, String loc)
{
Bitmap cs = null;
int w = c.getWidth() + s.getWidth();
int h;
if(c.getHeight() >= s.getHeight()){
h = c.getHeight();
}else{
h = s.getHeight();
}
cs = Bitmap.createBitmap(w, h, Bitmap.Config.ARGB_8888);
Canvas comboImage = new Canvas(cs);
comboImage.drawBitmap(c, 0f, 0f, null);
comboImage.drawBitmap(s, c.getWidth(), 0f, null);
String tmpImg = String.valueOf(System.currentTimeMillis()) + ".png";
OutputStream os = null;
try {
os = new FileOutputStream(loc + tmpImg);
cs.compress(CompressFormat.PNG, 100, os);
} catch (IOException e) {
Log.e("combineImages", "problem combining images", e);
}
return cs;
}
NOTE: The image with the bottle of water was taken with the back camera, the other with the front camera.
Try changing comboImage.drawBitmap(s, c.getWidth(), 0f, null); to
comboImage.drawBitmap(s, 0f, c.getHeight(), null);
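Note that this one-line fix stacks the second bitmap below the first, while combineImages() still sizes the canvas for a side-by-side layout (summed widths). The canvas arithmetic has to change with it; here is the layout math alone as plain Java (the Bitmap/Canvas calls themselves need a device):

```java
// Sketch: canvas size and draw offset for stacking two images vertically.
// Returns {canvasWidth, canvasHeight, secondX, secondY}.
public class StackLayout {
    public static int[] verticalStack(int w1, int h1, int w2, int h2) {
        int canvasW = Math.max(w1, w2); // widest image wins
        int canvasH = h1 + h2;          // heights add up
        return new int[] {canvasW, canvasH, 0, h1}; // second image drawn at (0, h1)
    }
}
```

In combineImages() this means w = Math.max(c.getWidth(), s.getWidth()), h = c.getHeight() + s.getHeight(), and drawBitmap(s, 0f, c.getHeight(), null).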

Capturing Preview Frame in Portrait

Is there any way to acquire the preview frame directly in portrait inside onPreviewFrame method?
I've tried:
camera.setDisplayOrientation(90);
but this seems to affect only the display. The docs say:
Set the clockwise rotation of preview display in degrees. This affects
the preview frames and the picture displayed after snapshot. This
method is useful for portrait mode applications.
This does not affect the order of byte array passed in onPreviewFrame(byte[], Camera), JPEG pictures, or recorded videos.
This method is not allowed to be called during preview.
I'm targeting API level >= 8, and I have a portrait-locked app. I want to avoid manually rotating the byte array of frame data.
Many thanks in advance.
Try this, it will work:
public void takeSnapPhoto() {
camera.setOneShotPreviewCallback(new Camera.PreviewCallback() {
@Override
public void onPreviewFrame(byte[] data, Camera camera) {
Camera.Parameters parameters = camera.getParameters();
int format = parameters.getPreviewFormat();
//YUV formats require more conversion
if (format == ImageFormat.NV21 || format == ImageFormat.YUY2 || format == ImageFormat.NV16) {
int w = parameters.getPreviewSize().width;
int h = parameters.getPreviewSize().height;
// Get the YUV image
YuvImage yuv_image = new YuvImage(data, format, w, h, null);
// Convert YUV to JPEG
Rect rect = new Rect(0, 0, w, h);
ByteArrayOutputStream output_stream = new ByteArrayOutputStream();
yuv_image.compressToJpeg(rect, 100, output_stream);
byte[] byt = output_stream.toByteArray();
FileOutputStream outStream = null;
try {
// Write to SD Card
File file = createFileInSDCard(FOLDER_PATH, "Image_"+System.currentTimeMillis()+".jpg");
//Uri uriSavedImage = Uri.fromFile(file);
outStream = new FileOutputStream(file);
outStream.write(byt);
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
} finally {
// Close in finally so the stream is released even if write() throws
if (outStream != null) {
try { outStream.close(); } catch (IOException ignored) {}
}
}
}
}
});
}
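The snippet above saves the frame exactly as the sensor delivers it, i.e. still in landscape. If the frame really has to be portrait before further processing, the NV21 buffer can be rotated by hand; a sketch of a 90-degree clockwise rotation (plain Java, so it can be checked off-device) that walks the Y plane and then the interleaved VU plane:

```java
// Sketch: rotate an NV21 preview frame 90 degrees clockwise.
// NV21 layout: width*height luma (Y) bytes, then width*height/2 bytes of
// interleaved V,U at half resolution. The rotated frame has swapped dimensions.
public class Nv21Rotate {
    public static byte[] rotateCw90(byte[] data, int width, int height) {
        byte[] out = new byte[data.length];
        int i = 0;
        // Y plane: output row x is the input column x, read bottom-up.
        for (int x = 0; x < width; x++) {
            for (int y = height - 1; y >= 0; y--) {
                out[i++] = data[y * width + x];
            }
        }
        // VU plane: same walk, but over 2-byte V,U pairs at half resolution.
        int size = width * height;
        for (int x = 0; x < width; x += 2) {
            for (int y = height / 2 - 1; y >= 0; y--) {
                out[i++] = data[size + y * width + x];     // V
                out[i++] = data[size + y * width + x + 1]; // U
            }
        }
        return out;
    }
}
```

Pass the rotated buffer to YuvImage with width and height swapped. Rotating every frame on the CPU is not free, so it is usually better to rotate only the frames you actually keep.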

Low Quality after scaling Bitmap to lower resolution

I am trying to scale down a Bitmap, but the edges are not as clear as in the original image; it is a little blurred.
I have checked out the below link
Bad image quality after resizing/scaling bitmap
Please help
Bitmap bitmap = BitmapFactory.decodeByteArray(data, 0,data.length);
Matrix m=new Matrix();
m.postRotate(90);
//m.postScale((float)0.5,(float) 0.5);
//Added for merging
//Bitmap mutableBitmap = bitmap.copy(Bitmap.Config.ARGB_8888, true);
Bitmap mutableBitmap = Bitmap.createBitmap(bitmap,0,0,600,400,m,false);
Paint paint = new Paint(Paint.FILTER_BITMAP_FLAG);
Canvas canvas = new Canvas(mutableBitmap);
View v=(View)relLay.getParent();
v.setDrawingCacheEnabled(true);
v.buildDrawingCache();
Options options = new BitmapFactory.Options();
options.inScaled = false;
//options.inJustDecodeBounds=true;
options.inSampleSize=2;
//Bitmap viewCapture = v.getDrawingCache().copy(Bitmap.Config.ARGB_8888, true);
// Bitmap newImage = Bitmap.createScaledBitmap(viewCapture, viewCapture.getWidth()/2, viewCapture.getHeight()/2, true);
ByteArrayOutputStream stream = new ByteArrayOutputStream();
(v.getDrawingCache().copy(Bitmap.Config.ARGB_8888, true)).compress(Bitmap.CompressFormat.PNG, 100, stream);
byte[] byteArray = stream.toByteArray();
Bitmap viewCapture= BitmapFactory.decodeByteArray(byteArray, 0, byteArray.length, options);
v.setDrawingCacheEnabled(false);
Rect src = new Rect(0, 0, viewCapture.getWidth(), viewCapture.getHeight());
Log.d("TEST",""+viewCapture.getWidth()+" "+viewCapture.getHeight());
//Destination RectF sized to the camera picture
Rect dst = new Rect(0, 0, mutableBitmap.getWidth(), mutableBitmap.getHeight());
Log.d("Test",""+mutableBitmap.getWidth()+" "+mutableBitmap.getHeight());
canvas.drawBitmap(viewCapture, src, dst, paint);
// Bitmap newImage = Bitmap.createScaledBitmap(viewCapture, 400,600, true);
mutableBitmap.compress(Bitmap.CompressFormat.PNG, 100, fos);
The viewCapture element gets blurred out if I try to scale
Let me try to help you.
First of all, if your desired captured picture size is 600*400 or similar, then I would say set the camera parameters to your desired size (one that is supported by the camera) and you will get the small image in the camera's picture-taken callback.
Note: but make sure you first check which picture sizes the device camera supports, then set one of them.
Here is one example I tested on a Galaxy Nexus.
Galaxy Nexus Camera Supported Picture sizes
Supported Picture Size. Width: 2592 * height : 1944
Supported Picture Size. Width: 2592 * height : 1728
Supported Picture Size. Width: 2592 * height : 1458
Supported Picture Size. Width: 2048 * height : 1536
Supported Picture Size. Width: 1600 * height : 1200
Supported Picture Size. Width: 1280 * height : 1024
Supported Picture Size. Width: 1152 * height : 864
Supported Picture Size. Width: 1280 * height : 960
Supported Picture Size. Width: 640 * height : 480
Supported Picture Size. Width: 320 * height : 240
Below is sample code that will help you.
CameraActivity.java
public class CameraActivity extends Activity implements SurfaceHolder.Callback,
OnClickListener
{
static final String TAG = "CameraActivity";
Camera camera;
SurfaceView surfaceView;
SurfaceHolder surfaceHolder;
boolean previewing = false;
@Override
protected void onCreate(Bundle savedInstanceState)
{
super.onCreate(savedInstanceState);
setContentView(R.layout.main);
getWindow().setFormat(PixelFormat.UNKNOWN);
surfaceView = (SurfaceView) findViewById(R.id.surfaceview);
surfaceHolder = surfaceView.getHolder();
surfaceHolder.addCallback(this);
surfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
}
@Override
public void surfaceChanged(SurfaceHolder arg0, int arg1, int arg2, int arg3)
{
// TODO Auto-generated method stub
}
@Override
public void surfaceCreated(SurfaceHolder arg0)
{
// TODO Auto-generated method stub
Log.v(TAG, "surfaceCreated get called");
camera = Camera.open();
camera.setDisplayOrientation(90); //to get portrait display
if (camera != null) {
try {
//Here is the main logic
// We are setting camera parameters as our desired picture size
Camera.Parameters parameters = camera.getParameters();
List<Size> sizes = parameters.getSupportedPictureSizes();
Camera.Size result = null;
for (int i=0;i<sizes.size();i++)
{
result = (Size) sizes.get(i);
Log.i("PictureSize", "Supported Size. Width: " + result.width + " height: " + result.height);
if(result.width == 640)
{
parameters.setPreviewSize(result.width, result.height);//640*480
parameters.setPictureSize(result.width, result.height);
//Now if camera support for 640*480 pictures size you will get captured image as same size
}
else
{
//to do here
}
}
camera.setParameters(parameters);
camera.setPreviewDisplay(surfaceHolder);
camera.startPreview();
previewing = true;
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
}
@Override
public void surfaceDestroyed(SurfaceHolder arg0) {
// TODO Auto-generated method stub
Log.v(TAG, "surfaceDestroyed get called");
camera.stopPreview();
camera.release();
camera = null;
previewing = false;
}
}
Let me know your comments on this.

How to capture preview image frames from Camera Application in Android Programming?

I am writing an app to capture the camera preview frames and convert it to bitmap in Android. Here is my code:
Camera.PreviewCallback previewCallback = new Camera.PreviewCallback()
{
public void onPreviewFrame(byte[] data, Camera camera)
{
try
{
BitmapFactory.Options opts = new BitmapFactory.Options();
Bitmap bitmap = BitmapFactory.decodeByteArray(data, 0, data.length);//,opts);
}
catch(Exception e)
{
}
}
};
mCamera = Camera.open();
mCamera.setPreviewCallback(previewCallback);
After I start the preview, the callback gets called with data, but the bitmap is null.
What did I do wrong when converting the byte array to a Bitmap?
In the onPreviewFrame() function, you should check the image format first.
This is an NV21 example:
public void onPreviewFrame(byte[] data, Camera camera)
{
Parameters parameters = camera.getParameters();
imageFormat = parameters.getPreviewFormat();
if (imageFormat == ImageFormat.NV21)
{
Rect rect = new Rect(0, 0, PreviewSizeWidth, PreviewSizeHeight);
YuvImage img = new YuvImage(data, ImageFormat.NV21, PreviewSizeWidth, PreviewSizeHeight, null);
OutputStream outStream = null;
File file = new File(NowPictureFileName);
try
{
outStream = new FileOutputStream(file);
img.compressToJpeg(rect, 100, outStream);
outStream.flush();
outStream.close();
}
catch (FileNotFoundException e)
{
e.printStackTrace();
}
catch (IOException e)
{
e.printStackTrace();
}
}
}
For another way to take pictures, check out this article: how to use camera in android
Have you tried decoding the preview frame data to RGB before you use BitmapFactory? The default format is YUV, which I'm not sure is compatible with BitmapFactory. Dave Manpearl's decode method can be found here:
Getting frames from Video Image in Android
Let me know if it works.
Cheers,
Paul
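For reference, the decode step both answers point to reduces to per-pixel arithmetic over the two NV21 planes. A minimal plain-Java sketch, using one common set of BT.601 coefficients (exact constants vary between implementations), so it can be sanity-checked off-device:

```java
// Sketch: convert an NV21 frame to packed ARGB ints.
// NV21 layout: width*height Y bytes, then interleaved V,U at half resolution.
public class Nv21ToRgb {
    public static int[] nv21ToArgb(byte[] nv21, int width, int height) {
        int[] argb = new int[width * height];
        int frameSize = width * height;
        for (int j = 0; j < height; j++) {
            for (int i = 0; i < width; i++) {
                int y = (0xff & nv21[j * width + i]) - 16;
                if (y < 0) y = 0;
                // Each 2x2 block of pixels shares one V,U pair.
                int uvIndex = frameSize + (j >> 1) * width + (i & ~1);
                int v = (0xff & nv21[uvIndex]) - 128;
                int u = (0xff & nv21[uvIndex + 1]) - 128;
                int r = Math.round(1.164f * y + 1.596f * v);
                int g = Math.round(1.164f * y - 0.813f * v - 0.391f * u);
                int b = Math.round(1.164f * y + 2.018f * u);
                r = Math.max(0, Math.min(255, r));
                g = Math.max(0, Math.min(255, g));
                b = Math.max(0, Math.min(255, b));
                argb[j * width + i] = 0xff000000 | (r << 16) | (g << 8) | b;
            }
        }
        return argb;
    }
}
```

On Android the resulting array can be wrapped with Bitmap.createBitmap(argb, width, height, Bitmap.Config.ARGB_8888); in practice, YuvImage.compressToJpeg() followed by BitmapFactory, as in the first answer, is usually simpler.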

Best way to scale size of camera picture before saving to SD

The code below is executed as the JPEG picture callback after takePicture() is called. If I save data to disk, it is a 1280x960 JPEG. I've tried to change the picture size, but that's not possible, as no smaller size is supported. JPEG is the only available picture format.
PictureCallback jpegCallback = new PictureCallback() {
public void onPictureTaken(byte[] data, Camera camera) {
FileOutputStream out = null;
Bitmap bm = BitmapFactory.decodeByteArray(data, 0, data.length);
Bitmap sbm = Bitmap.createScaledBitmap(bm, 640, 480, false);
// ... save sbm (rest of callback omitted)
}
};
data.length is something like 500k, as expected. After executing BitmapFactory.decodeByteArray(), bm has a height and width of -1, so it appears the operation is failing.
It's unclear to me whether Bitmap can handle JPEG data. I would think not, but I have seen some code examples that seem to indicate it can.
Does data need to be in bitmap format before decoding and scaling?
If so, how to do this?
Thanks!
In your surfaceCreated, you could set the camera's picture size, as shown in the code below:
public void surfaceCreated(SurfaceHolder holder) {
camera = Camera.open();
try {
camera.setPreviewDisplay(holder);
Camera.Parameters p = camera.getParameters();
p.set("jpeg-quality", 70);
p.setPictureFormat(PixelFormat.JPEG);
p.setPictureSize(640, 480);
camera.setParameters(p);
} catch (IOException e) {
e.printStackTrace();
}
}
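If no smaller picture size is supported (as the question states), the JPEG can still be decoded at reduced resolution instead of decoding full-size and then calling createScaledBitmap(). BitmapFactory.Options.inSampleSize takes a power of two, and choosing it is plain arithmetic:

```java
// Sketch: largest power-of-two inSampleSize that keeps the decoded image
// at least reqW x reqH. Mirrors the usual Android pattern for loading
// large bitmaps; the decode itself needs BitmapFactory on a device.
public class SampleSize {
    public static int compute(int srcW, int srcH, int reqW, int reqH) {
        int inSampleSize = 1;
        while (srcW / (inSampleSize * 2) >= reqW && srcH / (inSampleSize * 2) >= reqH) {
            inSampleSize *= 2;
        }
        return inSampleSize;
    }
}
```

Then options.inSampleSize = SampleSize.compute(1280, 960, 640, 480); followed by BitmapFactory.decodeByteArray(data, 0, data.length, options); yields roughly 640x480 directly from the JPEG bytes, which is cheaper than decoding at full size and scaling afterwards.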

Categories

Resources