I would like to load a cropped version of a bitmap image into a Bitmap object, without loading the original bitmap as well.
Is this at all possible without writing custom loading routines to handle the raw data?
Thanks,
Sandor
It's actually very straightforward to do. Use:
Bitmap yourBitmap = Bitmap.createBitmap(sourceBitmap, startX, startY, width, height);
Note, however, that this requires the source bitmap to be decoded first, which is exactly what you want to avoid.
Update: use BitmapRegionDecoder
Try this:
InputStream istream = null;
try {
    istream = this.getContentResolver().openInputStream(yourBitmapUri);
} catch (FileNotFoundException e1) {
    e1.printStackTrace();
}

BitmapRegionDecoder decoder = null;
try {
    decoder = BitmapRegionDecoder.newInstance(istream, false);
} catch (IOException e) {
    e.printStackTrace();
}

// Rect takes the (left, top, right, bottom) coordinates of the region to decode
Bitmap bMap = decoder.decodeRegion(new Rect(startX, startY, endX, endY), null);
imageView.setImageBitmap(bMap);
@RKN
Your method can also throw an OutOfMemoryError - if the cropped bitmap exceeds the VM budget.
My method combines yours with protection against this exception
(l, t, r, b are fractions of the image dimensions, between 0 and 1):
Bitmap cropBitmap(ContentResolver cr, String file, float l, float t, float r, float b)
{
    try
    {
        // First decode with inJustDecodeBounds=true to check dimensions
        BitmapFactory.Options options = new BitmapFactory.Options();
        options.inJustDecodeBounds = true;
        BitmapFactory.decodeFile(file, options);
        int oWidth = options.outWidth;
        int oHeight = options.outHeight;

        InputStream istream = cr.openInputStream(Uri.fromFile(new File(file)));
        BitmapRegionDecoder decoder = BitmapRegionDecoder.newInstance(istream, false);
        if (decoder != null)
        {
            options = new BitmapFactory.Options();
            int startingSize = 1;
            // 2073600 = 1920 * 1080; start with a larger sample size if the crop exceeds ~2 MP
            if ((r - l) * oWidth * (b - t) * oHeight > 2073600)
                startingSize = (int) ((r - l) * oWidth * (b - t) * oHeight / 2073600) + 1;
            for (options.inSampleSize = startingSize; options.inSampleSize <= 32; options.inSampleSize++)
            {
                try
                {
                    return decoder.decodeRegion(new Rect((int) (l * oWidth), (int) (t * oHeight), (int) (r * oWidth), (int) (b * oHeight)), options);
                }
                catch (OutOfMemoryError e)
                {
                    // Continue the for loop with a larger inSampleSize if an OutOfMemoryError occurs
                }
            }
        }
        else
            return null;
    }
    catch (FileNotFoundException e)
    {
        e.printStackTrace();
    }
    catch (IOException e)
    {
        e.printStackTrace();
    }
    return null;
}
It returns the largest bitmap that can be decoded within memory limits, or null.
Use RapidDecoder.
And simply do this
import rapid.decoder.BitmapDecoder;
Rect bounds = new Rect(left, top, right, bottom);
Bitmap bitmap = BitmapDecoder.from(getResources(), R.drawable.image)
.region(bounds).decode();
It requires Android 2.2 or above.
You can load a scaled version of the bitmap, without fully loading the original, using the following algorithm:
Calculate the maximum possible inSampleSize that still yields an
image larger than your target.
Load the image using
BitmapFactory.decodeFile(file, options), passing inSampleSize as an
option.
Resize to the desired dimensions using
Bitmap.createScaledBitmap().
Check the following post,
Android: Resize a large bitmap file to scaled output file, for further details; a rough sketch of the steps is shown below.
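For illustration, a minimal sketch of those steps, assuming a file path and target dimensions (targetWidth/targetHeight are placeholders, not from the answer):
// 1. Read only the dimensions of the image
BitmapFactory.Options options = new BitmapFactory.Options();
options.inJustDecodeBounds = true;
BitmapFactory.decodeFile(file, options);

// 2. Pick the largest power-of-2 inSampleSize that still yields an image above the target size
int inSampleSize = 1;
while (options.outWidth / (inSampleSize * 2) >= targetWidth
        && options.outHeight / (inSampleSize * 2) >= targetHeight) {
    inSampleSize *= 2;
}

// 3. Decode the subsampled image, then resize exactly to the requested dimensions
options.inJustDecodeBounds = false;
options.inSampleSize = inSampleSize;
Bitmap sampled = BitmapFactory.decodeFile(file, options);
Bitmap scaled = Bitmap.createScaledBitmap(sampled, targetWidth, targetHeight, true);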
My app is an OCR app based on Tesseract. It runs OCR on camera pictures. Users can take many pictures and put them into an OCR queue. For better accuracy, I want to keep a high-quality image (I chose a minimum size of 1024 x 768, maybe larger in the future, JPEG, 100% quality). When users take many pictures, there are three things to do:
Save the image data byte[] to a file and correct the EXIF data.
Correct the image orientation based on the device's orientation. I know there are some answers saying the image that comes out of the camera is not oriented automatically and has to be corrected from the file, like here and here. I'm not sure about that; I can set up the camera preview orientation correctly, but the resulting images aren't oriented correctly.
Load a bitmap from the taken picture, convert it to grayscale and save it to another file for the OCR task.
And here is my attempt:
public static boolean saveBitmap(byte[] bitmapData, int orientation, String imagePath, String grayScalePath) throws Exception {
Boolean rotationSuccess = false;
BitmapFactory.Options options = new BitmapFactory.Options();
options.inPreferredConfig = Bitmap.Config.ARGB_8888;
Bitmap originalBm = null;
Bitmap bitmapRotate = null;
Bitmap grayScale = null;
FileOutputStream outStream = null;
try {
// save directly from byte[] to file
saveBitmap(bitmapData, imagePath);
// down sample
options.inJustDecodeBounds = true;
BitmapFactory.decodeFile(imagePath, options);
int sampleSize = calculateInSampleSize(options, Config.CONFIG_IMAGE_WIDTH, Config.CONFIG_IMAGE_HEIGHT);
options.inJustDecodeBounds = false;
options.inSampleSize = sampleSize;
originalBm = BitmapFactory.decodeFile(imagePath, options);
Matrix mat = new Matrix();
mat.postRotate(orientation);
bitmapRotate = Bitmap.createBitmap(originalBm, 0, 0, originalBm.getWidth(), originalBm.getHeight(), mat, true);
originalBm.recycle();
originalBm = null;
outStream = new FileOutputStream(new File(imagePath));
bitmapRotate.compress(CompressFormat.JPEG, 100, outStream);
// convert to gray scale
grayScale = UIUtil.convertToGrayscale(bitmapRotate);
saveBitmap(grayScale, grayScalePath);
grayScale.recycle();
grayScale = null;
bitmapRotate.recycle();
bitmapRotate = null;
rotationSuccess = true;
} catch (OutOfMemoryError e) {
e.printStackTrace();
System.gc();
} finally {
if (originalBm != null) {
originalBm.recycle();
originalBm = null;
}
if (bitmapRotate != null) {
bitmapRotate.recycle();
bitmapRotate = null;
}
if (grayScale != null) {
grayScale.recycle();
grayScale = null;
}
if (outStream != null) {
try {
outStream.close();
} catch (IOException e) {
}
outStream = null;
}
}
Log.d(TAG,"save completed");
return rotationSuccess;
}
Save to file directly from byte[]
public static void saveBitmap(byte[] bitmapData, String fileName) throws Exception {
File file = new File(fileName);
FileOutputStream fos;
BufferedOutputStream bos = null;
try {
final int bufferSize = 1024 * 4;
fos = new FileOutputStream(file);
bos = new BufferedOutputStream(fos, bufferSize);
bos.write(bitmapData);
bos.flush();
} catch (Exception ex) {
throw ex;
} finally {
if (bos != null) {
bos.close();
}
}
}
Calculate scale size
public static int calculateInSampleSize(BitmapFactory.Options options, int reqWidth, int reqHeight) {
// Raw height and width of image
final int height = options.outHeight;
final int width = options.outWidth;
int inSampleSize = 1;
if (height > reqHeight || width > reqWidth) {
final int halfHeight = height / 2;
final int halfWidth = width / 2;
// Calculate the largest inSampleSize value that is a power of 2 and
// keeps both
// height and width larger than the requested height and width.
while ((halfHeight / inSampleSize) > reqHeight && (halfWidth / inSampleSize) > reqWidth) {
inSampleSize *= 2;
}
}
return inSampleSize;
}
When saving completes, the image is loaded into a thumbnail ImageView by UIL. The problem is that the save task is very slow (it takes a few seconds before the save completes and the image appears in the view), and sometimes I get an OutOfMemory exception. Are there any ideas for speeding up the save task and avoiding the OutOfMemory exception?
Any help would be appreciated!
P/S: at first I tried to convert the byte[] to a bitmap instead of saving it to a file, and then rotate and convert it to grayscale, but I still got the issues above.
Update: here is the grayscale bitmap process:
public static Bitmap convertToGrayscale(Bitmap bmpOriginal) {
int width, height;
height = bmpOriginal.getHeight();
width = bmpOriginal.getWidth();
Bitmap bmpGrayscale = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
Canvas c = new Canvas(bmpGrayscale);
Paint paint = new Paint();
ColorMatrix cm = new ColorMatrix();
cm.setSaturation(0);
ColorMatrixColorFilter f = new ColorMatrixColorFilter(cm);
paint.setColorFilter(f);
c.drawBitmap(bmpOriginal, 0, 0, paint);
return bmpGrayscale;
}
The OutOfMemory exception seldom occurred (just a few times) and I can't reproduce it now.
Update:
Since you're still saying that the method takes too long, I would define a callback interface:
interface BitmapCallback {
    void onBitmapSaveComplete(Bitmap bitmap, int orientation);
    void onBitmapRotateAndBWComplete(Bitmap bitmap);
}
Let your activity implement the interface above, convert the byte[] to a bitmap at the top of your saveBitmap method, and fire the first callback before the first call to save. In the activity, rotate the ImageView based on the orientation parameter and set a black/white filter on the ImageView to make the user think the bitmap is already black and white; make sure these calls to the ImageView happen on the main thread. Keep your old method otherwise as it is (all the steps need to be done anyway). Something like:
public static boolean saveBitmap(byte[] bitmapData, int orientation, String imagePath, String grayScalePath, BitmapCallback callback) throws Exception {
Boolean rotationSuccess = false;
BitmapFactory.Options options = new BitmapFactory.Options();
options.inPreferredConfig = Bitmap.Config.ARGB_8888;
Bitmap originalBm = null;
Bitmap bitmapRotate = null;
Bitmap grayScale = null;
FileOutputStream outStream = null;
try {
// TODO: convert the byte[] to a Bitmap, making sure the image is not larger than your wanted size (1024x768)
callback.onBitmapSaveComplete(bitmap, orientation);
// save directly from byte[] to file
saveBitmap(bitmapData, imagePath);
.
.
// same as old
.
.
saveBitmap(grayScale, grayScalePath);
// conversion done callback with the real fixed bitmap
callback.onBitmapRotateAndBWComplete(grayScale);
grayScale.recycle();
grayScale = null;
bitmapRotate.recycle();
bitmapRotate = null;
rotationSuccess = true;
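For illustration, a rough sketch of the activity side (an assumption based on the description above, not code from the answer; the grayscale preview filter mirrors what convertToGrayscale() does with a ColorMatrix):
public class CaptureActivity extends Activity implements BitmapCallback {

    private ImageView imageView; // hypothetical field, bound elsewhere

    @Override
    public void onBitmapSaveComplete(final Bitmap bitmap, final int orientation) {
        runOnUiThread(new Runnable() {
            @Override
            public void run() {
                // Show the unprocessed bitmap immediately, rotated to match the device orientation
                imageView.setRotation(orientation);
                ColorMatrix cm = new ColorMatrix();
                cm.setSaturation(0); // fake black/white until the real grayscale bitmap is ready
                imageView.setColorFilter(new ColorMatrixColorFilter(cm));
                imageView.setImageBitmap(bitmap);
            }
        });
    }

    @Override
    public void onBitmapRotateAndBWComplete(final Bitmap bitmap) {
        runOnUiThread(new Runnable() {
            @Override
            public void run() {
                // Swap in the real rotated, grayscale bitmap and drop the preview filter
                imageView.setColorFilter(null);
                imageView.setRotation(0);
                imageView.setImageBitmap(bitmap);
            }
        });
    }
}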
How do you set up your camera? What might be causing the long execution time in the first saveBitmap call is that you are using the default camera picture size settings instead of reading the supported camera picture sizes and choosing the best fit for your 1024x768 needs. You might be capturing and saving multi-megapixel images when in the end you need less than one megapixel (1024x768). Something like this in code:
Camera camera = Camera.open();
Camera.Parameters params = camera.getParameters();
List<Camera.Size> sizes = params.getSupportedPictureSizes();
// Loop over the supported sizes and find the best match, larger than 1024x768
This is probably where you will save most of the time, if you are not doing it already. And do it only once, during some initialization phase; one possible version of that loop is sketched below.
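A hedged sketch of the selection loop (it simply picks the smallest supported size that still covers 1024x768; params and camera are the objects from the snippet above):
// Pick the smallest supported picture size that is still at least 1024x768
Camera.Size best = null;
for (Camera.Size size : params.getSupportedPictureSizes()) {
    if (size.width >= 1024 && size.height >= 768) {
        if (best == null || (long) size.width * size.height < (long) best.width * best.height) {
            best = size;
        }
    }
}
if (best != null) {
    params.setPictureSize(best.width, best.height);
    camera.setParameters(params);
}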
Increase the buffer to 8 kB in saveBitmap (change the 1024 * 4 to 1024 * 8). This improves write performance a little, though it probably won't save significant time.
To save/reuse bitmap memory, consider using the inBitmap field of BitmapFactory.Options (available post-Honeycomb): point it at your bitmapRotate bitmap and pass the options down to your convertToGrayscale method so it does not have to allocate yet another bitmap. Read about inBitmap here: inBitmap. A rough example of reuse during decoding follows.
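A minimal, hedged sketch of inBitmap reuse when decoding (note the restrictions: before KitKat the reused bitmap must be mutable, exactly the same size as the decoded image, and inSampleSize must be 1; reusable and imagePath are placeholders):
BitmapFactory.Options options = new BitmapFactory.Options();
options.inMutable = true;      // the decoded bitmap must be mutable to be reusable later
options.inBitmap = reusable;   // reuse this bitmap's pixel memory if possible
options.inSampleSize = 1;      // required for reuse on pre-KitKat devices
Bitmap decoded;
try {
    decoded = BitmapFactory.decodeFile(imagePath, options);
} catch (IllegalArgumentException e) {
    // The supplied bitmap did not satisfy the reuse constraints; fall back to a normal decode
    options.inBitmap = null;
    decoded = BitmapFactory.decodeFile(imagePath, options);
}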
I am trying to make an application that fetches an image from storage and fits it to the size the application needs, so that it looks good.
Scaling bitmaps from memory can be very memory intensive. To avoid crashing your app on old devices, I recommend doing this.
I use these two methods to load a bitmap and scale it down. I split them into two functions. createLightweightScaledBitmapFromStream() uses options.inSampleSize to perform a rough scaling to the required dimensions. Then, createScaledBitmapFromStream() uses a more memory intensive Bitmap.createScaledBitmap() to finish scaling the image to the desired resolution.
Call createScaledBitmapFromStream() and you should be all set.
Lightweight scaling
public static Bitmap createLightweightScaledBitmapFromStream(InputStream is, int minShrunkWidth, int minShrunkHeight, Bitmap.Config config) {
BufferedInputStream bis = new BufferedInputStream(is, 32 * 1024);
try {
BitmapFactory.Options options = new BitmapFactory.Options();
if (config != null) {
options.inPreferredConfig = config;
}
final BitmapFactory.Options decodeBoundsOptions = new BitmapFactory.Options();
decodeBoundsOptions.inJustDecodeBounds = true;
bis.mark(Integer.MAX_VALUE);
BitmapFactory.decodeStream(bis, null, decodeBoundsOptions);
bis.reset();
final int width = decodeBoundsOptions.outWidth;
final int height = decodeBoundsOptions.outHeight;
Log.v("Original bitmap dimensions: %d x %d", width, height);
int sampleRatio = Math.max(width / minShrunkWidth, height / minShrunkHeight);
if (sampleRatio >= 2) {
options.inSampleSize = sampleRatio;
}
Log.v("Bitmap sample size = %d", options.inSampleSize);
Bitmap ret = BitmapFactory.decodeStream(bis, null, options);
Log.d("Sampled bitmap size = %d X %d", options.outWidth, options.outHeight);
return ret;
} catch (IOException e) {
Log.e("Error resizing bitmap from InputStream.", e);
} finally {
Util.ensureClosed(bis);
}
return null;
}
Final Scaling (Calls lightweight scaling first)
public static Bitmap createScaledBitmapFromStream(InputStream is, int maxWidth, int maxHeight, Bitmap.Config config) {
// Start by grabbing the bitmap from file, sampling down a little first if the image is huge.
Bitmap tempBitmap = createLightweightScaledBitmapFromStream(is, maxWidth, maxHeight, config);
Bitmap outBitmap = tempBitmap;
int width = tempBitmap.getWidth();
int height = tempBitmap.getHeight();
// Find the greatest ratio difference, as this is what we will shrink both sides by.
float ratio = calculateBitmapScaleFactor(width, height, maxWidth, maxHeight);
if (ratio < 1.0f) { // Don't blow up small images, only shrink bigger ones.
int newWidth = (int) (ratio * width);
int newHeight = (int) (ratio * height);
Log.v("Scaling image further down to %d x %d", newWidth, newHeight);
outBitmap = Bitmap.createScaledBitmap(tempBitmap, newWidth, newHeight, true);
Log.d("Final bitmap dimensions: %d x %d", outBitmap.getWidth(), outBitmap.getHeight());
tempBitmap.recycle();
}
return outBitmap;
}
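calculateBitmapScaleFactor() is not shown in the answer; presumably it is something along these lines (an assumption, not the original helper):
// Hypothetical helper: returns the factor needed to fit (width x height)
// inside (maxWidth x maxHeight) while preserving the aspect ratio.
public static float calculateBitmapScaleFactor(int width, int height, int maxWidth, int maxHeight) {
    float widthRatio = (float) maxWidth / width;
    float heightRatio = (float) maxHeight / height;
    return Math.min(widthRatio, heightRatio); // a value below 1.0f means the image must shrink
}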
I am working on an Android application. The application has a view containing lots of images. I hit an error; I will try to give as much information as possible, hoping someone can give me some suggestions.
The application was working great in all local testing. However, I received lots of crashes from users: java.lang.OutOfMemoryError: bitmap size exceeds VM budget
This is the stack trace
0 java.lang.OutOfMemoryError: bitmap size exceeds VM budget
1 at android.graphics.Bitmap.nativeCreate(Native Method)
2 at android.graphics.Bitmap.createBitmap(Bitmap.java:507)
3 at android.graphics.Bitmap.createBitmap(Bitmap.java:474)
4 at android.graphics.Bitmap.createScaledBitmap(Bitmap.java:379)
5 at android.graphics.BitmapFactory.finishDecode(BitmapFactory.java:498)
6 at android.graphics.BitmapFactory.decodeStream(BitmapFactory.java:473)
7 at android.graphics.BitmapFactory.decodeResourceStream(BitmapFactory.java:336)
8 at android.graphics.BitmapFactory.decodeResource(BitmapFactory.java:359)
9 at android.graphics.BitmapFactory.decodeResource(BitmapFactory.java:385)
My biggest problem is that I was not able to reproduce the issue locally even on old devices.
I have implemented lots of things to try to resolve this:
No memory leaks: I made sure there are no memory leaks at all. I remove the views when I don't need them, I recycle all bitmaps and make sure the garbage collector works as it should, and I implemented all the necessary cleanup in onDestroy().
Image size scaled correctly: Before getting the image I read its dimensions and calculate the inSampleSize.
Heap size: I also check the max heap size before getting the image and make sure there is enough space; if there is not enough, I rescale the image accordingly.
Code to calculate the correct inSampleSize
public static int calculateInSampleSize(BitmapFactory.Options options, int reqWidth, int reqHeight)
{
// Raw height and width of image
final int height = options.outHeight;
final int width = options.outWidth;
int inSampleSize = 1;
if(height > reqHeight || width > reqWidth)
{
if(width > height)
{
inSampleSize = Math.round((float) height / (float) reqHeight);
}
else
{
inSampleSize = Math.round((float) width / (float) reqWidth);
}
}
return inSampleSize;
}
Code to get the bitmap
// decodes image and scales it to reduce memory consumption
private static Bitmap decodeFile(File file, int newWidth, int newHeight)
{// target size
try
{
Bitmap bmp = MediaStore.Images.Media.getBitmap(getContext().getContentResolver(), Uri.fromFile(file));
if(bmp == null)
{
// avoid concurrence
// Decode image size
BitmapFactory.Options option = new BitmapFactory.Options();
// option = getBitmapOutput(file);
option.inDensity = res.getDisplayMetrics().densityDpi < DisplayMetrics.DENSITY_HIGH ? 120 : 240;
option.inTargetDensity = res.getDisplayMetrics().densityDpi;
if(newHeight > 0 && newWidth > 0)
option.inSampleSize = calculateInSampleSize(option, newWidth, newHeight);
option.inJustDecodeBounds = false;
byte[] decodeBuffer = new byte[12 * 1024];
option.inTempStorage = decodeBuffer;
option.inPurgeable = true;
option.inInputShareable = true;
option.inScaled = true;
bmp = BitmapFactory.decodeStream(new FileInputStream(file), null, option);
if(bmp == null)
{
return null;
}
}
else
{
int inDensity = res.getDisplayMetrics().densityDpi < DisplayMetrics.DENSITY_HIGH ? 120 : 240;
int inTargetDensity = res.getDisplayMetrics().densityDpi;
if(inDensity != inTargetDensity)
{
int newBmpWidth = (bmp.getWidth() * inTargetDensity) / inDensity;
int newBmpHeight = (bmp.getHeight() * inTargetDensity) / inDensity;
bmp = Bitmap.createScaledBitmap(bmp, newBmpWidth, newBmpHeight, true);
}
}
return bmp;
}
catch(Exception e)
{
Log.e("Error calling Application.decodeFile Method params: " + Arrays.toString(new Object[]{file }), e);
}
return null;
}
Code to calculate image size based on Heap size for older devices
private void calculateImagesSize()
{
// only for android older than HoneyComb that does not support large heap
if(Build.VERSION.SDK_INT < Constants.HONEYCOMB)
{
long maxHeapSize = Runtime.getRuntime().maxMemory();
long maxImageHeap = maxHeapSize - 10485760;
if(Application.getResource().getDisplayMetrics().densityDpi >= DisplayMetrics.DENSITY_XHIGH)
{
maxImageHeap -= 12 * 1048576;
}
if(maxImageHeap < (30 * 1048576))
{
int screenHeight = Math.min(Application.getResource().getDisplayMetrics().heightPixels, Application.getResource()
.getDisplayMetrics().widthPixels);
long maxImageSize = maxImageHeap / 100;
long maxPixels = maxImageSize / 4;
long maxHeight = (long) Math.sqrt(maxPixels / 1.5);
if(maxHeight < screenHeight)
{
drawableHeight = (int) maxHeight;
drawableWidth = (int) (drawableHeight * 1.5);
}
}
}
}
I think the problem is with the heap; maybe sometimes the OS doesn't allow the application to use the max heap size. Also, my biggest problem is that I was not able to reproduce the issue, so when I try a fix I have to wait a while to see whether users are still getting the error.
What more could I try to avoid out-of-memory issues? Any suggestions would be greatly appreciated. Thanks a lot.
Just use this function to decode. It worked for me; I was getting the same error and this solved it.
public static Bitmap decodeFile(File f, int requiredWidth, int requiredHeight) {
    try {
        // Decode the image size only
        BitmapFactory.Options o = new BitmapFactory.Options();
        o.inJustDecodeBounds = true;
        BitmapFactory.decodeStream(new FileInputStream(f), null, o);

        // Find the correct scale value. It should be a power of 2.
        int scale = 1;
        while (o.outWidth / scale / 2 >= requiredWidth && o.outHeight / scale / 2 >= requiredHeight)
            scale *= 2;

        // Decode with inSampleSize
        BitmapFactory.Options o2 = new BitmapFactory.Options();
        o2.inSampleSize = scale;
        return BitmapFactory.decodeStream(new FileInputStream(f), null, o2);
    } catch (FileNotFoundException e) {
        e.printStackTrace();
    }
    return null;
}
Hi, you have to decode the file. For this, try the following method.
public static Bitmap new_decode(File f) {
    // Decode the image size only
    BitmapFactory.Options o = new BitmapFactory.Options();
    o.inJustDecodeBounds = true;
    o.inDither = false;         // disable dithering
    o.inPurgeable = true;       // tell the gc that the pixel data can be cleared if memory is needed
    o.inInputShareable = true;  // share the input data rather than copying it
    try {
        BitmapFactory.decodeStream(new FileInputStream(f), null, o);
    } catch (FileNotFoundException e1) {
        e1.printStackTrace();
        return null;
    }

    // Find the correct scale value: halve until both sides fit within the required size
    final int REQUIRED_SIZE = 300;
    int width_tmp = o.outWidth, height_tmp = o.outHeight;
    int scale = 1;
    while (width_tmp / 2 >= REQUIRED_SIZE && height_tmp / 2 >= REQUIRED_SIZE) {
        width_tmp /= 2;
        height_tmp /= 2;
        scale *= 2;
    }

    // Decode with inSampleSize so the full-resolution bitmap is never held in memory
    BitmapFactory.Options o2 = new BitmapFactory.Options();
    o2.inSampleSize = scale;
    o2.inDither = false;
    o2.inPurgeable = true;
    o2.inInputShareable = true;
    try {
        Bitmap bitmap = BitmapFactory.decodeStream(new FileInputStream(f), null, o2);
        System.out.println(" IW " + width_tmp);
        System.out.println(" IH " + height_tmp);
        // Scale exactly to the computed target dimensions
        return Bitmap.createScaledBitmap(bitmap, width_tmp, height_tmp, true);
    } catch (OutOfMemoryError e) {
        e.printStackTrace();
        System.gc();
        return null;
    } catch (FileNotFoundException e) {
        e.printStackTrace();
        return null;
    }
}
By reducing/scaling the size of the image you can get rid of the Out of Memory exception.
Try this:
BitmapFactory.Options options = new BitmapFactory.Options();
options.inSampleSize = 6;
Bitmap receipt = BitmapFactory.decodeFile(photo.toString(), options); // decodes from a file; adjust inSampleSize to your needs
I wrote a summary of suggestions in another StackOverFlow question: Android: BitmapFactory.decodeStream() out of memory with a 400KB file with 2MB free heap
Actually, the problem comes down to how much memory a decoded image needs. Bitmaps take up a lot of memory, especially rich images like photographs, and different devices capture images at different resolutions (different phones have cameras with different pixel counts). On Android, the memory a captured image needs when decoded depends directly on its pixel dimensions, so a high-resolution photo may simply not fit on a phone with little memory.
Android allocates a limited heap, as little as 16 MB, to each application. If the decoded image needs more than is available, java.lang.OutOfMemoryError: bitmap size exceeds VM budget occurs and the application crashes.
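As a rough worked example (not from the original answer): an 8 MP photo of 3264 x 2448 pixels decoded as ARGB_8888 needs about 3264 * 2448 * 4 bytes, roughly 32 MB, which is already twice a 16 MB heap before anything else is allocated.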
Refer to this:
http://developer.android.com/training/displaying-bitmaps/index.html
If you want to avoid OOM, you can catch the OutOfMemoryError and increase the sampleSize until the image can be decoded:
private Bitmap getBitmapSafely(Resources res, int id, int sampleSize) {
// res = context.getResources(), id = R.drawable.yourimageid
Bitmap bitmap = null;
BitmapFactory.Options options = new BitmapFactory.Options();
options.inPurgeable = true;
options.inSampleSize = sampleSize;
try {
bitmap = BitmapFactory.decodeResource(res,
id, options);
} catch (OutOfMemoryError oom) {
Log.w("ImageView", "OOM with sampleSize " + sampleSize, oom);
System.gc();
bitmap = getBitmapSafely(res, id, sampleSize + 1);
}
return bitmap;
}
Hope it helps.
Note that catching an Error like this is not really proper practice; it is just a workaround.
I've created a small app that works with images from the gallery or camera.
It all works fine, but:
On a device with a small screen and little memory (HTC Desire) I have some images downloaded at full size from another phone, and they are much larger (that phone has an 8 MP camera).
If I try to load such an image, huge for my small device, it crashes immediately.
So, how do I detect this and downsize the image, but still load it properly?
I do scale images down after they are loaded, but this should happen before the crash occurs.
Tnx.
InputStream in = null;
try {
in = getContentResolver().openInputStream(data.getData());
} catch (FileNotFoundException e) {
e.printStackTrace();
}
// get picture size.
BitmapFactory.Options options = new BitmapFactory.Options();
options.inJustDecodeBounds = true;
BitmapFactory.decodeStream(in, null, options);
try {
in.close();
} catch (IOException e) {
e.printStackTrace();
}
// resize the picture for memory.
int screenH = getResources().getDisplayMetrics().heightPixels; //800
int screenW = getResources().getDisplayMetrics().widthPixels; //480
int widthRatio = options.outWidth / screenW;
int heightRatio = options.outHeight / screenH;
Log.w("Width ratio", Integer.toString(widthRatio));
Log.w("Height ratio", Integer.toString(heightRatio));
int sampleSize = Math.max(widthRatio, heightRatio);
options.inSampleSize = sampleSize;
options.inJustDecodeBounds = false;
try {
in = getContentResolver().openInputStream(data.getData());
} catch (FileNotFoundException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
// convert to bitmap with declared size.
Globals.INSTANCE.imageBmp = BitmapFactory.decodeStream(in, null, options);
try {
in.close();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
You can avoid loading the full bitmap into memory by first setting
inJustDecodeBounds = true
inJustDecodeBounds lets you decode only the bounds of the image, without decoding the pixels. Given the height and width of your bitmap, you can then downsample it using
inSampleSize
as the docs say:
If set to a value > 1, requests the decoder to subsample the original
image, returning a smaller image to save memory.
int tmpWidth = bitmapWidth;
int tmpHeight = bitmapHeight;
int requiredSize = ...
int ratio = 1;
while (true) {
    if (tmpWidth / 2 < requiredSize
            || tmpHeight / 2 < requiredSize)
        break;
    tmpWidth /= 2;
    tmpHeight /= 2;
    ratio *= 2;
}
EDIT: for a 32-bit bitmap (e.g. ARGB_8888) the memory required is width * height * 4 bytes.
Android bitmap size exceeds VM budget.
My app is getting this error frequently. I have two questions.
Do I need to recycle anything in my About activity (it contains some ImageViews, Buttons and TextViews)?
What is the difference between .recycle() and System.gc()?
You should always try to recycle bitmaps after you have used them.
As far as I understand, you should avoid calling System.gc().
Calling recycle() frees the bitmap's pixel data and lets the remaining object be garbage collected; a short sketch follows.
I hope this helps.
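A minimal sketch of recycling in onDestroy() (the field name is a placeholder, not from the question):
@Override
protected void onDestroy() {
    super.onDestroy();
    // Hypothetical bitmap field backing one of the ImageViews in this activity
    if (aboutBitmap != null && !aboutBitmap.isRecycled()) {
        aboutBitmap.recycle();   // release the pixel memory explicitly
        aboutBitmap = null;      // drop the reference so the object itself can be collected
    }
    // No System.gc() call: let the runtime decide when to collect
}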
I got the same problem while picking images from the camera.
I resized the image's bitmap using the following code:
Bitmap bitmap = resizeBitMapImage(picturePath, 75, 91);
profilePic.setImageBitmap(bitmap);
private Bitmap resizeBitMapImage(String filePath, int targetWidth,
int targetHeight) {
Bitmap bitMapImage = null;
// First, get the dimensions of the image
Options options = new Options();
options.inJustDecodeBounds = true;
BitmapFactory.decodeFile(filePath, options);
double sampleSize = 0;
// Only scale if we need to
// (16384 buffer for img processing)
Boolean scaleByHeight = Math.abs(options.outHeight - targetHeight) >= Math
.abs(options.outWidth - targetWidth);
if (options.outHeight * options.outWidth * 2 >= 16384) {
// Load, scaling to smallest power of 2 that'll get it <= desired
// dimensions
sampleSize = scaleByHeight ? options.outHeight / targetHeight
: options.outWidth / targetWidth;
sampleSize = (int) Math.pow(2d,
Math.floor(Math.log(sampleSize) / Math.log(2d)));
}
// Do the actual decoding
options.inJustDecodeBounds = false;
options.inTempStorage = new byte[128];
while (true) {
try {
options.inSampleSize = (int) sampleSize;
bitMapImage = BitmapFactory.decodeFile(filePath, options);
break;
} catch (OutOfMemoryError ex) {
// Decoding ran out of memory; increase the sample size and retry
sampleSize = (sampleSize < 1) ? 2 : sampleSize * 2;
}
}
return bitMapImage;
}