In my application I can open the camera and take a picture. The picture is stored at its full size of 2448x3264 pixels on the SD card. How can I configure my application to save the picture at a size of 90x90 pixels instead of 2448x3264 pixels?
To open the camera and capture an image I use the following methods:
/*
* Capturing Camera Image will launch the camera app and request image capture
*/
private void captureImage() {
Intent intent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
fileUri = getOutputMediaFileUri(MEDIA_TYPE_IMAGE);
intent.putExtra(MediaStore.EXTRA_OUTPUT, fileUri);
// start the image capture Intent
startActivityForResult(intent, CAMERA_CAPTURE_IMAGE_REQUEST_CODE);
}
private Uri getOutputMediaFileUri(int type) {
return Uri.fromFile(getOutputMediaFile(type));
}
private File getOutputMediaFile(int type) {
// External sdcard location
File mediaStorageDir = new File(Environment.getExternalStoragePublicDirectory
(Environment.DIRECTORY_PICTURES), IMAGE_DIRECTORY_NAME);
// Create the storage directory if it does not exist
if (!mediaStorageDir.exists()) {
if (!mediaStorageDir.mkdirs()) {
Log.d(IMAGE_DIRECTORY_NAME, "Oops! Failed to create " + IMAGE_DIRECTORY_NAME + " directory");
return null;
}
}
// Create a media file name
String timeStamp = new SimpleDateFormat("yyyyMMdd_HHmmss", Locale.getDefault()).format(new Date());
File mediaFile;
if (type == MEDIA_TYPE_IMAGE) {
mediaFile = new File(mediaStorageDir.getPath() + File.separator + "IMG_" + timeStamp + ".jpg");
}
else {
return null;
}
return mediaFile;
}
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
// if the result is capturing Image
if (requestCode == CAMERA_CAPTURE_IMAGE_REQUEST_CODE) {
if (resultCode == RESULT_OK) {
/*
try {
decodeUri(this, fileUri, 90, 90);
} catch (FileNotFoundException e) {
e.printStackTrace();
}
*/
// successfully captured the image
Toast.makeText(getApplicationContext(),
"Picture successfully captured", Toast.LENGTH_SHORT).show();
} else if (resultCode == RESULT_CANCELED) {
// user cancelled Image capture
Toast.makeText(getApplicationContext(),
"User cancelled image capture", Toast.LENGTH_SHORT).show();
} else {
// failed to capture image
Toast.makeText(getApplicationContext(),
"Sorry! Failed to capture image", Toast.LENGTH_SHORT).show();
}
}
}
public static Bitmap decodeUri(Context c, Uri uri, final int requiredWidth, final int requiredHeight) throws FileNotFoundException {
BitmapFactory.Options o = new BitmapFactory.Options();
o.inJustDecodeBounds = true;
BitmapFactory.decodeStream(c.getContentResolver().openInputStream(uri), null, o);
int width_tmp = o.outWidth, height_tmp = o.outHeight;
int scale = 1;
while(true) {
if(width_tmp / 2 < requiredWidth || height_tmp / 2 < requiredHeight)
break;
width_tmp /= 2;
height_tmp /= 2;
scale *= 2;
}
BitmapFactory.Options o2 = new BitmapFactory.Options();
o2.inSampleSize = scale;
return BitmapFactory.decodeStream(c.getContentResolver().openInputStream(uri), null, o2);
}
@Override
protected void onRestoreInstanceState(Bundle savedInstanceState) {
super.onRestoreInstanceState(savedInstanceState);
// get the file url
fileUri = savedInstanceState.getParcelable("file_uri");
}
I hope that someone can help me with this. I am trying to load the captured images into a small ImageView. Thanks in advance.
No, you cannot control the picture size when you use the MediaStore.ACTION_IMAGE_CAPTURE Intent. You can achieve this if you implement your own "custom camera" (there are plenty of working samples on the Internet, including mine).
The byte array received in onPictureTaken() is a JPEG buffer. Look at this Java package for image manipulation: http://mediachest.sourceforge.net/mediautil/ (there is an Android port on GitHub). It has very powerful and efficient methods to scale down a JPEG without decoding it into a Bitmap and back.
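For illustration, here is a minimal sketch of the custom-camera approach using the (deprecated) android.hardware.Camera API. It simply picks the smallest picture size the device actually supports, since devices rarely offer exactly 90x90; all names here are illustrative, not the poster's actual code.
Camera camera = Camera.open();
Camera.Parameters params = camera.getParameters();
// Pick the smallest supported picture size as an approximation of 90x90;
// most devices will not offer that exact resolution.
Camera.Size smallest = params.getSupportedPictureSizes().get(0);
for (Camera.Size size : params.getSupportedPictureSizes()) {
    if (size.width * size.height < smallest.width * smallest.height) {
        smallest = size;
    }
}
params.setPictureSize(smallest.width, smallest.height);
camera.setParameters(params);
// After camera.takePicture(...), the JPEG buffer arrives here:
Camera.PictureCallback jpegCallback = new Camera.PictureCallback() {
    @Override
    public void onPictureTaken(byte[] data, Camera cam) {
        // data is the JPEG; write it to your target file on the SD card.
    }
};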
Here is a method that takes the path of the picture saved on the SD card and returns the image at the required size as a Bitmap. All you have to do is pass the image path and get the resized image back.
private Bitmap processTakenPicture(String fullPath) {
int targetW = 90; //your required width
int targetH = 90; //your required height
BitmapFactory.Options bmOptions = new BitmapFactory.Options();
bmOptions.inJustDecodeBounds = true;
BitmapFactory.decodeFile(fullPath, bmOptions);
int scaleFactor = 1;
scaleFactor = calculateInSampleSize(bmOptions, targetW, targetH);
bmOptions.inJustDecodeBounds = false;
bmOptions.inSampleSize = scaleFactor * 2; // double the calculated sample size to shrink the decoded bitmap further
bmOptions.inPurgeable = true;
Bitmap bitmap = BitmapFactory.decodeFile(fullPath, bmOptions);
return bitmap;
}
private int calculateInSampleSize(BitmapFactory.Options options, int reqWidth,
int reqHeight) {
// Raw height and width of image
final int height = options.outHeight;
final int width = options.outWidth;
int inSampleSize = 1;
if (height > reqHeight || width > reqWidth) {
if (width > height) {
inSampleSize = Math.round((float) height / (float) reqHeight);
} else {
inSampleSize = Math.round((float) width / (float) reqWidth);
}
}
return inSampleSize;
}
After you have decoded the original image, you can use:
Bitmap.createScaledBitmap(photo, width, height, true);
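For the original 90x90 requirement, a rough sketch of combining the sampled decode with createScaledBitmap and writing the result back to disk could look like this (decodeUri() is the helper from the question; "thumb.jpg" and the use of getExternalFilesDir() are just assumptions for the example):
FileOutputStream out = null;
try {
    Bitmap sampled = decodeUri(this, fileUri, 90, 90);               // roughly-sized decode
    Bitmap thumb = Bitmap.createScaledBitmap(sampled, 90, 90, true); // exact 90x90
    File thumbFile = new File(getExternalFilesDir(Environment.DIRECTORY_PICTURES), "thumb.jpg");
    out = new FileOutputStream(thumbFile);
    thumb.compress(Bitmap.CompressFormat.JPEG, 90, out);             // 90 here is JPEG quality, not a size
} catch (IOException e) {
    e.printStackTrace();
} finally {
    if (out != null) {
        try { out.close(); } catch (IOException ignored) {}
    }
}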
Here is another question where someone had the same problem. They use the following:
Bitmap ThumbImage = ThumbnailUtils.extractThumbnail(BitmapFactory.decodeFile(imagePath), THUMBSIZE, THUMBSIZE);
Related
I am trying to show the picture taken by the camera. Sometimes it works, but it usually gives me this error:
java.lang.OutOfMemoryError
at android.graphics.BitmapFactory.nativeDecodeStream(Native Method)
at android.graphics.BitmapFactory.decodeStreamInternal(BitmapFactory.java:727)
at android.graphics.BitmapFactory.decodeStream(BitmapFactory.java:703)
at android.graphics.BitmapFactory.decodeStream(BitmapFactory.java:741)
at android.provider.MediaStore$Images$Media.getBitmap(MediaStore.java:847)
at com.dima.polimi.rentall.NewProduct.onActivityResult(NewProduct.java:207)
at android.app.Activity.dispatchActivityResult(Activity.java:5773)
at android.app.ActivityThread.deliverResults(ActivityThread.java:3710)
at android.app.ActivityThread.handleSendResult(ActivityThread.java:3757)
at android.app.ActivityThread.access$1400(ActivityThread.java:170)
at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1352)
at android.os.Handler.dispatchMessage(Handler.java:102)
at android.os.Looper.loop(Looper.java:146)
at android.app.ActivityThread.main(ActivityThread.java:5635)
at java.lang.reflect.Method.invokeNative(Native Method)
at java.lang.reflect.Method.invoke(Method.java:515)
at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:1291)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:1107)
at dalvik.system.NativeStart.main(Native Method)
I have tried the solutions from multiple questions but I never managed to make it work.
This is my code:
mImageView.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
dispatchTakePictureIntent();
}
});
dispatchTakePictureIntent:
String mCurrentPhotoPath;
private void dispatchTakePictureIntent() {
Intent cameraIntent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
if (cameraIntent.resolveActivity(getPackageManager()) != null) {
// Create the File where the photo should go
File photoFile = null;
try {
photoFile = createImageFile();
} catch (IOException ex) {
// Error occurred while creating the File
Log.i("", "IOException");
}
// Continue only if the File was successfully created
if (photoFile != null) {
cameraIntent.putExtra(MediaStore.EXTRA_OUTPUT, Uri.fromFile(photoFile));
startActivityForResult(cameraIntent, REQUEST_TAKE_PHOTO);
}
}
}
private File createImageFile() throws IOException {
// Create an image file name
String timeStamp = new SimpleDateFormat("yyyyMMdd_HHmmss").format(new Date());
String imageFileName = "JPEG_" + timeStamp + "_";
File storageDir = Environment.getExternalStoragePublicDirectory(
Environment.DIRECTORY_PICTURES);
File image = File.createTempFile(
imageFileName, // prefix
".jpg", // suffix
storageDir // directory
);
// Save a file: path for use with ACTION_VIEW intents
mCurrentPhotoPath = "file:" + image.getAbsolutePath();
return image;
}
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
if (requestCode == REQUEST_TAKE_PHOTO && resultCode == RESULT_OK) {
try {
Bitmap mImageBitmap = MediaStore.Images.Media.getBitmap(this.getContentResolver(), Uri.parse(mCurrentPhotoPath));
mImageView.setImageBitmap(mImageBitmap);
} catch (IOException e) {
e.printStackTrace();
}
}
}
Finally I got it to work; the solution is to decode the bitmap like this:
Bitmap b = BitmapUtility.decodeSampledBitmapFromResource(image.getAbsolutePath(), 540, 360);
BitmapUtility:
public class BitmapUtility {
public static Bitmap decodeSampledBitmapFromResource(String path, int reqWidth, int reqHeight) {
// First decode with inJustDecodeBounds=true to check dimensions
final BitmapFactory.Options options = new BitmapFactory.Options();
options.inJustDecodeBounds = true;
BitmapFactory.decodeFile(path,options);
// Calculate inSampleSize
options.inSampleSize = calculateInSampleSize(options, reqWidth, reqHeight);
// Decode bitmap with inSampleSize set
options.inJustDecodeBounds = false;
return BitmapFactory.decodeFile(path, options);
}
private static int calculateInSampleSize(
BitmapFactory.Options options, int reqWidth, int reqHeight) {
// Raw height and width of image
final int height = options.outHeight;
final int width = options.outWidth;
int inSampleSize = 1;
if (height > reqHeight || width > reqWidth) {
final int halfHeight = height / 2;
final int halfWidth = width / 2;
// Calculate the largest inSampleSize value that is a power of 2 and keeps both
// height and width larger than the requested height and width.
while ((halfHeight / inSampleSize) > reqHeight
&& (halfWidth / inSampleSize) > reqWidth) {
inSampleSize *= 2;
}
}
return inSampleSize;
}
}
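Wiring this into onActivityResult might then look roughly like the sketch below; it assumes the File returned by createImageFile() is kept in a field (here hypothetically named mPhotoFile) so its plain path is available after the capture returns.
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    if (requestCode == REQUEST_TAKE_PHOTO && resultCode == RESULT_OK) {
        // mPhotoFile is a hypothetical field holding the File from createImageFile()
        Bitmap b = BitmapUtility.decodeSampledBitmapFromResource(mPhotoFile.getAbsolutePath(), 540, 360);
        mImageView.setImageBitmap(b);
    }
}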
In my app I have to let the user take a photo with the camera or choose one from the gallery.
My code works fine on some devices, but when I run it on a Galaxy S4 and take a photo with the camera, the photo doesn't show in the ImageView (choosing from the gallery works fine). The problem occurs only when taking a photo with the camera on the Galaxy S4.
I scale the picture before displaying it in the ImageView.
Here is my code to take the picture:
Intent intent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
File f = new File(android.os.Environment.getExternalStorageDirectory(), "temp.jpg");
intent.putExtra(MediaStore.EXTRA_OUTPUT, Uri.fromFile(f));
startActivityForResult(intent, 1);
This is my code after taking the photo:
if (requestCode == 1)
{
File f = new File(Environment.getExternalStorageDirectory().toString());
File image_file = new File(Environment.getExternalStorageDirectory().toString());
for (File temp : f.listFiles())
{
if (temp.getName().equals("temp.jpg"))
{
f = temp;
break;
}
}
try
{
Bitmap bitmap;
//BitmapFactory.Options bitmapOptions = new BitmapFactory.Options();
//bitmap=decodeFile(f.getAbsolutePath());//Collapse image
//setImageOnBitmap(bitmap);//Set the image on the imageview
BitmapHandler b=new BitmapHandler(getApplicationContext());
bitmap=b.decodeFileAsPath(f.getAbsolutePath(),"camera");
setImageOnBitmap(bitmap);
String path = android.os.Environment
.getExternalStorageDirectory()
+ File.separator
+ "Phoenix" + File.separator + "default";
f.delete();
}
catch (Exception e)
{
e.printStackTrace();
}
}
This is my function to decode the image file:
public static Bitmap decodeSampledBitmapFromFile(String path, int reqWidth, int reqHeight)
{ // BEST QUALITY MATCH
//First decode with inJustDecodeBounds=true to check dimensions
final BitmapFactory.Options options = new BitmapFactory.Options();
options.inJustDecodeBounds = true;
BitmapFactory.decodeFile(path, options);
// Calculate inSampleSize, Raw height and width of image
final int height = options.outHeight;
final int width = options.outWidth;
options.inPreferredConfig = Bitmap.Config.RGB_565;
int inSampleSize = 1;
if (height > reqHeight)
{
inSampleSize = Math.round((float)height / (float)reqHeight);
}
int expectedWidth = width / inSampleSize;
if (expectedWidth > reqWidth)
{
//if(Math.round((float)width / (float)reqWidth) > inSampleSize) // If bigger SampSize..
inSampleSize = Math.round((float)width / (float)reqWidth);
}
options.inSampleSize = inSampleSize;
// Decode bitmap with inSampleSize set
options.inJustDecodeBounds = false;
return BitmapFactory.decodeFile(path, options);
}
I had a similar issue on the S4, and this is what solved it for me:
mImageView.setLayerType(View.LAYER_TYPE_SOFTWARE, null);
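For context, a minimal sketch of where that call could sit, assuming mImageView is the ImageView showing the photo:
// Force software rendering for this view before displaying the bitmap
mImageView.setLayerType(View.LAYER_TYPE_SOFTWARE, null);
mImageView.setImageBitmap(bitmap);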
Hope this helps.
What I'm doing/trying to do is:
1. Take a photo
2. Save it
3. Load/display it into a Bitmap
Opening the built-in camera application:
public void openCamera() {
Intent takePictureIntent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
File file = new File(Environment.getExternalStorageDirectory()+ File.separator + "image.jpg");
takePictureIntent.putExtra(MediaStore.EXTRA_OUTPUT, Uri.fromFile(file));
startActivityForResult(takePictureIntent, REQUEST_IMAGE_CAPTURE);
}
onActivityResult:
protected void onActivityResult(int requestCode, int resultCode, Intent data){
//Check that request code matches ours:
if (requestCode == REQUEST_IMAGE_CAPTURE){
//Get our saved file into a bitmap object:
File file = new File(Environment.getExternalStorageDirectory()+File.separator + "image.jpg");
Bitmap image = decodeSampledBitmapFromFile(file.getAbsolutePath(), 1000, 700);
}
}
decodeSampledBitmapFromFile:
public static Bitmap decodeSampledBitmapFromFile(String path, int reqWidth, int reqHeight)
{ // BEST QUALITY MATCH
//First decode with inJustDecodeBounds=true to check dimensions
final BitmapFactory.Options options = new BitmapFactory.Options();
options.inJustDecodeBounds = true;
BitmapFactory.decodeFile(path, options);
// Calculate inSampleSize, Raw height and width of image
final int height = options.outHeight;
final int width = options.outWidth;
options.inPreferredConfig = Bitmap.Config.RGB_565;
int inSampleSize = 1;
if (height > reqHeight)
{
inSampleSize = Math.round((float)height / (float)reqHeight);
}
int expectedWidth = width / inSampleSize;
if (expectedWidth > reqWidth)
{
//if(Math.round((float)width / (float)reqWidth) > inSampleSize) // If bigger SampSize..
inSampleSize = Math.round((float)width / (float)reqWidth);
}
options.inSampleSize = inSampleSize;
// Decode bitmap with inSampleSize set
options.inJustDecodeBounds = false;
return BitmapFactory.decodeFile(path, options);
}
I was hoping to load it into a Bitmap. If anyone could point me in the right direction, that would be very helpful!
In order to display a Bitmap, you'll have to use an ImageView. Once you have both the bitmap and the reference to your ImageView, call ImageView.setImageBitmap(Bitmap) to display your bitmap.
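For example, inside the onActivityResult above it could look like this sketch (R.id.photo_view is a hypothetical layout id):
// Decode the saved file and hand the bitmap to an ImageView from the layout
ImageView photoView = (ImageView) findViewById(R.id.photo_view);
Bitmap image = decodeSampledBitmapFromFile(file.getAbsolutePath(), 1000, 700);
photoView.setImageBitmap(image);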
In the Android app that I am making, a user takes a photo which I want to display later in a few activities. I've run into an "out of memory error" while displaying the currently taken photo in an ImageView, so I decided to use the code from http://developer.android.com/training/displaying-bitmaps/load-bitmap.html to do it efficiently. Here are the methods that I'm using:
public static int calculateInSampleSize(
BitmapFactory.Options options, int reqWidth, int reqHeight) {
final int height = options.outHeight;
final int width = options.outWidth;
int inSampleSize = 1;
if (height > reqHeight || width > reqWidth) {
final int halfHeight = height / 2;
final int halfWidth = width / 2;
while ((halfHeight / inSampleSize) > reqHeight
&& (halfWidth / inSampleSize) > reqWidth) {
inSampleSize *= 2;
}
}
return inSampleSize;
}
and
public static Bitmap decodeSampledBitmap(Uri mUri, int reqWidth, int reqHeight) {
final BitmapFactory.Options options = new BitmapFactory.Options();
options.inJustDecodeBounds = true;
BitmapFactory.decodeFile(mUri.getPath(), options);
options.inSampleSize = calculateInSampleSize(options, reqWidth, reqHeight);
options.inJustDecodeBounds = false;
return BitmapFactory.decodeFile(mUri.getPath(), options);
}
I let the user take a photo with this method:
static final int REQUEST_IMAGE_CAPTURE = 1;
private void dispatchTakePictureIntent(){
Intent takePictureIntent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
if(takePictureIntent.resolveActivity(getPackageManager()) != null){
startActivityForResult(takePictureIntent, REQUEST_IMAGE_CAPTURE);
}
}
And then receive the taken picture info:
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data){
if(requestCode == REQUEST_IMAGE_CAPTURE && resultCode == Activity.RESULT_OK){
Uri imageUri = null;
if(data != null){
imageUri = data.getData();
}
ImageView imageView = (ImageView)findViewById(R.id.new_photo);
imageView.setImageBitmap(decodeSampledBitmap(imageUri, imageView.getWidth(), imageView.getHeight()));
}
}
My app crashes and I get the message: "File not found or no such directory".
I checked what imageUri.getPath() returns, and in my case it's:
"/external/images/media/1777" (which seems quite strange to me because I'm not using an SD card),
while the taken photo is actually saved in "/storage/emulated/0/DCIM/100ANDRO/DSC_0052.JPG". Do you have any idea what I am doing wrong?
The Intent extra size limit is approximately 1 MB up to Gingerbread.
On Jelly Bean I found a magic number of 86389; if you send anything above this, it will throw an out-of-memory exception.
Solution: pass the image Uri; don't pass the complete Bitmap object.
Use the following code just before startActivityForResult(takePictureIntent, REQUEST_IMAGE_CAPTURE):
takePictureIntent.putExtra(MediaStore.EXTRA_OUTPUT, getImageUri());
Where getImageUri() is:
private Uri getImageUri() {
// Store image in dcim
File myDir = new File(Environment.getExternalStorageDirectory()
+ "/My_App");
if(!myDir.exists())
myDir.mkdir();
File file = new File(myDir, "MyImage.png");
Uri imgUri = Uri.fromFile(file);
return imgUri;
}
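Since data.getData() is typically null when EXTRA_OUTPUT is supplied, you then read the photo back from the Uri you passed in. A sketch, assuming the Uri from getImageUri() is kept in a hypothetical field mOutputUri:
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    if (requestCode == REQUEST_IMAGE_CAPTURE && resultCode == Activity.RESULT_OK) {
        ImageView imageView = (ImageView) findViewById(R.id.new_photo);
        // decodeSampledBitmap() is the question's helper; mOutputUri is the file Uri given to the camera
        imageView.setImageBitmap(decodeSampledBitmap(mOutputUri, imageView.getWidth(), imageView.getHeight()));
    }
}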
Read the official Android docs on this for further clarification.
I am trying to take an image at specified dimensions and save it to a desired location on the SD card. I am using intent.putExtra to take the image via the default camera application.
Here is the code:
public void onClick(View v) {
//Setting up the URI for the desired location
imageFile = "bmp"+v.getId()+".png";
File f = new File (folder,imageFile);
imageUri = Uri.fromFile(f);
//Setting the desired size parameters
private Camera mCamera;
Camera.Parameters parameters = mCamera.getParameters();
parameters.setPreviewSize(width, height);
mCamera.setParameters(parameters);
//Passing intent.PutExtras to defaul camera activity
Intent i = new Intent(android.provider.MediaStore.ACTION_IMAGE_CAPTURE);
i.putExtra(android.provider.MediaStore.EXTRA_OUTPUT, imageUri);
startActivityForResult(i,CAMERA_PIC_REQUEST);
}
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
super.onActivityResult(requestCode, resultCode, data);
if(resultCode == RESULT_OK){
return;
}
The camera activity force closes after taking an image.
Is it possible to modify the size of the images taken by the default camera activity in this way?
Or is a separate camera application necessary?
If your image is saved as a file, create a bitmap from that file, decrease its size using this method, and pass the bitmap to your Activity:
public static Bitmap decodeFile(File file, int requiredSize) {
try {
// Decode image size
BitmapFactory.Options o = new BitmapFactory.Options();
o.inJustDecodeBounds = true;
BitmapFactory.decodeStream(new FileInputStream(file), null, o);
// The new size we want to scale to
// Find the correct scale value. It should be the power of 2.
int width_tmp = o.outWidth, height_tmp = o.outHeight;
int scale = 1;
while (true) {
if (width_tmp / 2 < requiredSize
|| height_tmp / 2 < requiredSize)
break;
width_tmp /= 2;
height_tmp /= 2;
scale *= 2;
}
// Decode with inSampleSize
BitmapFactory.Options o2 = new BitmapFactory.Options();
o2.inSampleSize = scale;
return BitmapFactory.decodeStream(new FileInputStream(file), null,
o2);
} catch (FileNotFoundException e) {
e.printStackTrace(); // log rather than silently swallowing the exception
}
return null;
}
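A possible way to use this, sketched under the assumption that f is the File passed to EXTRA_OUTPUT and that 90x90 is the target size (the exact-size step is optional):
// Decode the captured file at a reduced sample size, then scale to an exact size if needed
Bitmap small = decodeFile(f, 90);
if (small != null) {
    Bitmap exact = Bitmap.createScaledBitmap(small, 90, 90, true);
    // exact can now be shown in an ImageView or written back to disk
}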