Android Bitmap OutOfMemory issue

My application has more than 350 images, which are decoded from a database. I create a bitmap from each image's data and scale it based on the device's screen resolution. When I tried to hold all of these bitmaps in memory, I got an OutOfMemoryError. BitmapFactory.Options.inPurgeable has been recommended in various places as a way to avoid OutOfMemoryErrors.
BitmapFactory.Options options = new BitmapFactory.Options();
options.inPurgeable = true;
options.inInputShareable = true;
Bitmap bitmap = BitmapFactory.decodeByteArray(imageData, 0, size, options);
...
..
Bitmap scaledBitmap = Bitmap.createScaledBitmap(bitmap, reqWidth, reqHeight, true);
I cache this scaled bitmap in a HashMap and use it for an ImageView. I am again getting an OutOfMemory exception while loading the bitmaps into memory. I don't understand whether inPurgeable is working in my case. Does the scaled bitmap keep a reference to the byte array? Since I am using the scaled bitmap, does it still benefit from the inPurgeable option passed to decodeByteArray? I can't figure out how to handle this bitmap memory issue. I'd appreciate your help.

350 images is quite a lot. Are you sure you need them all at once?
Also: because you create a scaled bitmap from each decoded bitmap, you have every image in memory twice, so effectively 700 bitmaps, which is far too much. Check whether it would be better to use inSampleSize (or inScaled) on the options so the decode itself produces the smaller bitmap; that gets you back to 350 and reduces each bitmap's memory footprint.
Even optimized, I still think 350 images is just too many. You should consider a lazy-loading solution.

You may try BitmapFactory.Options.inScaled and BitmapFactory.Options.inScreenDensity to get the scaled bitmap directly from the decode.
You also need a better way to cache bitmaps in memory. Hold a WeakReference to each Bitmap in the HashMap, or switch to a LinkedHashMap for a simple LRU cache implementation.
You don't really need to cache all the images, since they will never all be displayed on one screen at once. A sketch of such an LRU cache is shown below.
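A minimal sketch of the LinkedHashMap-based LRU idea, assuming a fixed entry count (the class name and capacity handling are illustrative, not from the original post):
import java.util.LinkedHashMap;
import java.util.Map;
import android.graphics.Bitmap;

// Evicts the least-recently-accessed bitmap once maxEntries is exceeded.
public class BitmapLruCache extends LinkedHashMap<String, Bitmap> {
    private final int maxEntries;

    public BitmapLruCache(int maxEntries) {
        super(16, 0.75f, true); // accessOrder = true gives LRU ordering
        this.maxEntries = maxEntries;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<String, Bitmap> eldest) {
        return size() > maxEntries;
    }
}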

Wow, it sounds crazy to try to keep 350 bitmaps in memory at once. Even if they were small I wouldn't recommend it. Surely you won't be able to show all of these bitmaps at once, so what's the point of keeping all of them in memory at the same time?
You should really look into using something like Square's Picasso library for handling image loading and scaling. Picasso handles "ImageView recycling and download cancelation in an adapter", "automatic memory and disk caching", and "complex image transformations with minimal memory use".
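A minimal sketch of loading one of the images with Picasso (assuming Picasso 2.x is on the classpath; the context variable and target dimensions are placeholders):
// Picasso decodes, resizes, and caches the image off the main thread.
Picasso.with(context)
       .load(new File(path))
       .resize(reqWidth, reqHeight) // decode no larger than needed
       .centerInside()
       .into(imageView);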

Use this method to reduce the image size first (the File argument points to a photo on the SD card):
// Decodes an image file and scales it down to reduce memory consumption.
private Bitmap decodeFile(File f) {
    try {
        // Decode only the image bounds to learn its size.
        BitmapFactory.Options o = new BitmapFactory.Options();
        o.inJustDecodeBounds = true;
        FileInputStream stream1 = new FileInputStream(f);
        BitmapFactory.decodeStream(stream1, null, o);
        stream1.close();

        // Find the correct scale value. It should be a power of 2.
        final int REQUIRED_SIZE = 40;
        int width_tmp = o.outWidth, height_tmp = o.outHeight;
        int scale = 1;
        while (true) {
            if (width_tmp / 2 <= REQUIRED_SIZE || height_tmp / 2 <= REQUIRED_SIZE)
                break;
            width_tmp /= 2;
            height_tmp /= 2;
            scale *= 2;
        }

        // Decode the full image with inSampleSize.
        BitmapFactory.Options o2 = new BitmapFactory.Options();
        o2.inSampleSize = scale;
        FileInputStream stream2 = new FileInputStream(f);
        Bitmap bitmap = BitmapFactory.decodeStream(stream2, null, o2);
        stream2.close();
        return bitmap;
    } catch (FileNotFoundException e) {
        e.printStackTrace();
    } catch (IOException e) {
        e.printStackTrace();
    }
    return null;
}
// Here is how to call the above method.
String path = "/mnt/sdcard/DCIM/camera/IMG_2001.jpg";
Drawable background = hash_map.get(path);
if (background == null) {
    try {
        Bitmap bitmap = decodeFile(new File(path));
        background = new BitmapDrawable(bitmap);
        if (hash_map.size() > 600) {
            // Prevent the HashMap from growing too large.
            hash_map.clear();
        }
        hash_map.put(path, background);
    } catch (Throwable e) {
        // In case there is an exception, like running out of memory.
        if (e instanceof OutOfMemoryError) {
            hash_map.clear();
        }
    }
}

Related

Is it wise to use Glide to load XML vectors

Does Glide know how to differentiate between raster images and vectors? Is there any advantage to loading an XML vector using Glide in terms of memory usage and cache management?
GlideApp.with(imageView)
.load(R.drawable.my_xml_vector)
.into(imageView);
If I choose to load it directly, should I worry about recycling it?
imageView.setImageResource(R.drawable.my_xml_vector);
In both cases, if you are loading large images you should scale the image to your needs before using it. In some cases, not doing so can produce an OutOfMemoryError. You can use the following method to scale the image before loading it.
iv.setImageBitmap(decodeResource(getResources(), R.drawable.big_image));
private static Bitmap decodeResource(Resources res, int id) {
    Bitmap bitmap = null;
    BitmapFactory.Options options = new BitmapFactory.Options();
    for (options.inSampleSize = 1; options.inSampleSize <= 32; options.inSampleSize++) {
        try {
            bitmap = BitmapFactory.decodeResource(res, id, options);
            Log.d(TAG_LOG, "Decoded successfully for sampleSize " + options.inSampleSize);
            break;
        } catch (OutOfMemoryError outOfMemoryError) {
            // If an OutOfMemoryError occurred, continue the loop with the next inSampleSize value.
            Log.e(TAG_LOG, "OutOfMemoryError while reading file for sampleSize " + options.inSampleSize
                    + ", retrying with a higher value");
        }
    }
    return bitmap;
}
In some cases, Glide fails to load vector images or only partially loads them. The best way I have seen to load vector images is as a placeholder or as an error image, like this:
GlideApp.with(mContext)
        .load("")
        .error(R.drawable.my_vector)
        .into(holder.imageView);

Bitmap size much greater than file size from which it's loaded

I'm using the following code to retrieve a bitmap from an image file in Android:
BitmapFactory.Options options = new BitmapFactory.Options();
options.inPreferredConfig = Bitmap.Config.ARGB_8888;
Bitmap bitmap = BitmapFactory.decodeFile(path, options);
However, the size of the bitmap is more than double the size of the file. E.g., for a file of 520 KB, the bitmap size is around 1.3 MB. Is there a way I can get a bitmap of the same size as the file?
The bitmap size is the raw, uncompressed pixel data in memory. You can calculate the size as 4 bytes per pixel (with your ARGB_8888 setting).
Your file is probably in a compressed format like JPEG. Without compression it would take up about the same space as the bitmap.
The reason the bitmap is kept uncompressed in memory is basically performance. You can read the data and work with it much faster in uncompressed form. For example, to see what color a specific pixel has, you would otherwise need to decompress the data first; that takes time, and the more pixels you check, the more time it would take.
When reading the file from storage, the decompression takes a negligible amount of time compared to the complete read, so you will not see a big performance impact compared to "on demand" decompression.
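A quick sanity check of that arithmetic, assuming hypothetical decoded dimensions of roughly 570x570 pixels for the 520 KB file (the dimensions are made up for illustration):
// ARGB_8888 stores 4 bytes per pixel, independent of the file's compressed size.
int width = 570, height = 570;                  // hypothetical decoded dimensions
long expectedBytes = (long) width * height * 4; // ~1.3 MB

// On a real Bitmap you can confirm the allocation directly:
int actualBytes = bitmap.getByteCount();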
Try it like this; it may help you:
public static Bitmap decodeFile(File f, int WIDTH, int HEIGHT) {
    try {
        // Decode only the image bounds.
        BitmapFactory.Options o = new BitmapFactory.Options();
        o.inJustDecodeBounds = true;
        BitmapFactory.decodeStream(new FileInputStream(f), null, o);

        // The new size we want to scale to.
        final int REQUIRED_WIDTH = WIDTH;
        final int REQUIRED_HEIGHT = HEIGHT;

        // Find the correct scale value. It should be a power of 2.
        int scale = 1;
        while (o.outWidth / scale / 2 >= REQUIRED_WIDTH && o.outHeight / scale / 2 >= REQUIRED_HEIGHT)
            scale *= 2;

        // Decode with inSampleSize.
        BitmapFactory.Options o2 = new BitmapFactory.Options();
        o2.inSampleSize = scale;
        return BitmapFactory.decodeStream(new FileInputStream(f), null, o2);
    } catch (FileNotFoundException e) {
        e.printStackTrace();
    }
    return null;
}

Does anyone know if there is a limitation on the size of the file BitmapFactory.decodeFile can decode?

I use the following code. With large image files it returns null; small ones display fine. Does anyone know if there is a size limitation with BitmapFactory.decodeFile?
File imgFile = new File(path);
if (imgFile.exists()) {
    Bitmap myBitmap = BitmapFactory.decodeFile(imgFile.getAbsolutePath());
    ImageView imageView = (ImageView) findViewById(R.id.imageView1);
    if (myBitmap != null) {
        imageView.setImageBitmap(myBitmap);
    } else {
        Toast.makeText(MainActivity.this, "NULL", Toast.LENGTH_LONG).show();
    }
} else {
    Toast.makeText(MainActivity.this, "NO FILE", Toast.LENGTH_LONG).show();
}
How large is it? How many MBs?
Do you get an OutOfMemoryError? Each app gets a specific amount of memory (heap size), and a large bitmap can exceed this limit, so you get null back.
EDIT:
Unless the big bitmap, by chance, is corrupted or in some weird format that the app could not decode? Try another large bitmap.
From experience there is a limit: when you hit the texture resolution limit for OpenGL you will see the following type of error message:
Bitmap too large to be uploaded into a texture (4128x2322, max=4096x4096)
You may also get an OutOfMemoryError, as many have noted.
First of all you should do this:
final BitmapFactory.Options options = new BitmapFactory.Options();
options.inJustDecodeBounds = true;
BitmapFactory.decodeStream(inputStream, null, options);
final int width = options.outWidth;
final int height = options.outHeight;
Now that you have the width and height, it's time to make an intelligent decision about how to rescale it.
I would at this point look at the full code example (really only two methods to copy/paste) from Google at Loading Large Bitmaps Efficiently.
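A condensed sketch along the lines of that guide (slightly simplified; this version halves the sample size until the decoded image is no larger than the requested size, and the target dimensions are whatever your layout needs):
public static int calculateInSampleSize(BitmapFactory.Options options, int reqWidth, int reqHeight) {
    final int height = options.outHeight;
    final int width = options.outWidth;
    int inSampleSize = 1;
    // Keep doubling the sample size until both dimensions fit the requested size.
    while ((height / inSampleSize) > reqHeight || (width / inSampleSize) > reqWidth) {
        inSampleSize *= 2;
    }
    return inSampleSize;
}

public static Bitmap decodeSampledBitmapFromFile(String path, int reqWidth, int reqHeight) {
    // First pass: read only the bounds.
    final BitmapFactory.Options options = new BitmapFactory.Options();
    options.inJustDecodeBounds = true;
    BitmapFactory.decodeFile(path, options);

    // Second pass: decode with the computed inSampleSize.
    options.inSampleSize = calculateInSampleSize(options, reqWidth, reqHeight);
    options.inJustDecodeBounds = false;
    return BitmapFactory.decodeFile(path, options);
}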

Android Rotate image/photo

I have a problem with the rotation of images.
I try:
Matrix matrix = new Matrix();
matrix.postRotate(90);

BitmapFactory.Options options = new BitmapFactory.Options();
options.inSampleSize = 4;
Bitmap pic = BitmapFactory.decodeFile(filePath, options);
Bitmap rotatedPhoto = Bitmap.createBitmap(pic, 0, 0, pic.getWidth(), pic.getHeight(), matrix, true);
photo.setImageBitmap(rotatedPhoto);

try {
    FileOutputStream stream = new FileOutputStream(filePath);
    rotatedPhoto.compress(CompressFormat.JPEG, 100, stream);
    stream.close();
} catch (FileNotFoundException e) {
    e.printStackTrace();
} catch (IOException e) {
    e.printStackTrace();
}
The picture rotates, but a lot of quality is lost.
How do I solve this problem? How do I rotate the image without losing quality?
Thank you!
Update:
And how do I rotate the image without losing resolution?
I think your problem arises because you are setting inSampleSize to 4, which means the returned image will be a factor of 4 smaller than the original image.
http://developer.android.com/reference/android/graphics/BitmapFactory.Options.html#inSampleSize
Try setting options.inSampleSize to 1 - does this help?
Be careful when dealing with images though - you have very little memory to play with in an Android app. Loading just a couple of large images into memory at once can often cause your app to crash.
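A minimal sketch of that suggestion: decode at full resolution (inSampleSize = 1) and rotate, keeping in mind that a full-size camera image may itself exceed the heap, so the sample size may need to be raised on low-memory devices (exception handling omitted for brevity):
BitmapFactory.Options options = new BitmapFactory.Options();
options.inSampleSize = 1; // full resolution; raise this if you hit OutOfMemoryError
Bitmap pic = BitmapFactory.decodeFile(filePath, options);

Matrix matrix = new Matrix();
matrix.postRotate(90);
Bitmap rotated = Bitmap.createBitmap(pic, 0, 0, pic.getWidth(), pic.getHeight(), matrix, true);

// Compress with a high JPEG quality to keep re-encoding losses small.
FileOutputStream out = new FileOutputStream(filePath);
rotated.compress(Bitmap.CompressFormat.JPEG, 100, out);
out.close();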

OutOfMemory exception when loading bitmap from external storage

In my application I load a couple of images from JPEG and PNG files. When I place all those files in the assets directory and load them this way, everything is OK:
InputStream stream = getAssets().open(path);
Bitmap bitmap = BitmapFactory.decodeStream(stream, null, null);
stream.close();
return new BitmapDrawable(bitmap);
But when I try to load the exact same images from the SD card, I get an OutOfMemory exception!
InputStream stream = new FileInputStream("/mnt/sdcard/mydata/" + path);
Bitmap bitmap = BitmapFactory.decodeStream(stream, null, null);
stream.close();
return new BitmapDrawable(bitmap);
This is what I get in the log:
11-05 00:53:31.003: ERROR/dalvikvm-heap(13183): 827200-byte external allocation too large for this process.
11-05 00:53:31.003: ERROR/GraphicsJNI(13183): VM won't let us allocate 827200 bytes
...
11-05 00:53:31.053: ERROR/AndroidRuntime(13183): Caused by: java.lang.OutOfMemoryError: bitmap size exceeds VM budget
11-05 00:53:31.053: ERROR/AndroidRuntime(13183): at android.graphics.BitmapFactory.nativeDecodeStream(Native Method)
...
Why can this happen?
UPDATE: I tried both of these on a real device; it seems that I can't load more than 12 MB of bitmaps into whatever is called "external memory" (this is not the SD card).
I tried all the approaches mentioned here and in other resources, but I came to the conclusion that setting the ImageView's reference to null will solve the issue:
public Bitmap getimage(String path, ImageView iv) throws IOException {
    // iv is passed in so it can be set to null and released from memory.
    iv = null;
    InputStream stream = new FileInputStream("/mnt/sdcard/mydata/" + path);
    Bitmap bitmap = BitmapFactory.decodeStream(stream, null, null);
    stream.close();
    stream = null;
    return bitmap;
}
And you are done!
Note: Though it may solve the above problem, I would suggest you also check Tom van Zummeren's optimized image loading.
Also check out SoftReference: all SoftReferences pointing to softly reachable objects are guaranteed to be cleared before the VM throws an OutOfMemoryError.
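A minimal sketch of a SoftReference-backed bitmap cache (the class and method names are illustrative, not from the original answer):
import java.lang.ref.SoftReference;
import java.util.HashMap;
import java.util.Map;
import android.graphics.Bitmap;

// Entries can be reclaimed by the GC under memory pressure instead of triggering an OOM.
public class SoftBitmapCache {
    private final Map<String, SoftReference<Bitmap>> cache = new HashMap<>();

    public void put(String key, Bitmap bitmap) {
        cache.put(key, new SoftReference<>(bitmap));
    }

    public Bitmap get(String key) {
        SoftReference<Bitmap> ref = cache.get(key);
        return ref != null ? ref.get() : null; // null if never cached or already cleared
    }
}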
When doing a lot with bitmaps, don't debug the app - just run it. The debugger will leave memory leaks.
Bitmaps are very expensive. If possible, scale them down on load by creating BitmapFactory.Options and setting inSampleSize to >1.
EDIT: Also, be sure to check your app for memory leaks. Leaking a Bitmap (having static Bitmaps is an excellent way to do that) will quickly exhaust your available memory.
Probably nothing wrong with your API usage, I guess all we can do is infer that using the AssetManager involves less behind-the-scenes heap allocation than opening a random file from the SD card.
800KB is a serious allocation in anybody's book... this will doubtless be for the decompressed image pixels. Given that you know the size of the image, what depth is it? If it's 32bpp then try overriding that using inPreferredConfig.
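A minimal sketch of that override, assuming the image has no alpha channel (the file path is borrowed from the question, purely for illustration):
// RGB_565 uses 2 bytes per pixel instead of 4, halving the decoded size.
BitmapFactory.Options options = new BitmapFactory.Options();
options.inPreferredConfig = Bitmap.Config.RGB_565;
Bitmap bitmap = BitmapFactory.decodeFile("/mnt/sdcard/mydata/" + path, options);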
This is a fairly common issue which all of us face while loading images from the sdcard.
The solution, as I found, was to first use inJustDecodeBounds while loading the image with decodeFileDescriptor. That does not actually decode the image, but gives you its size. Now I can scale it appropriately (using the options) so the image is resized for the display area. It's needed because the phone's limited memory can easily be eaten up by a 5 MP image. I believe this is the most elegant solution.
There are two issues here:
Bitmap memory isn't in the VM heap but rather in the native heap - see BitmapFactory OOM driving me nuts.
Garbage collection for the native heap is lazier than for the VM heap, so you need to be quite aggressive about calling bitmap.recycle() and setting bitmap = null every time you go through an Activity's onPause or onDestroy.
Instead of loading it from the SD Card directly, why not move the image to the cache in the phone's internal storage using getCacheDir() or use a temp directory to store the images in?
See this, this on external memory usage. Also, this article may be of relevance to you.
Use the code below and you should avoid the following error: java.lang.OutOfMemoryError: bitmap size exceeds VM budget
BitmapFactory.Options bounds = new BitmapFactory.Options();
bounds.inSampleSize = 4;
myBitmap = BitmapFactory.decodeFile(imgFile.getAbsolutePath(), bounds);
picturesView.setImageBitmap(myBitmap);
The best solution I found, edited according to my needs:
public static Bitmap getImageBitmap(String path) throws IOException {
    // Allocate files and objects outside of timing loops.
    File file = new File(path);
    RandomAccessFile in = new RandomAccessFile(file, "rws");
    final FileChannel channel = in.getChannel();
    final int fileSize = (int) channel.size();
    final byte[] testBytes = new byte[fileSize];
    final ByteBuffer buff = ByteBuffer.allocate(fileSize);
    final byte[] buffArray = buff.array();
    @SuppressWarnings("unused")
    final int buffBase = buff.arrayOffset();

    // Read from the channel into the buffer, then batch-read from the buffer into the byte array.
    channel.position(0);
    channel.read(buff);
    buff.flip();
    buff.get(testBytes);
    in.close();

    long time1 = System.currentTimeMillis();
    Bitmap bmp = Bitmap_process(buffArray);
    long time2 = System.currentTimeMillis();
    System.out.println("Time taken to load: " + (time2 - time1) + "ms");
    return bmp;
}
public static Bitmap Bitmap_process(byte[] buffArray) {
    BitmapFactory.Options options = new BitmapFactory.Options();
    options.inDither = false;                    // disable dithering
    options.inPurgeable = true;                  // pixels may be purged if the system needs memory
    options.inInputShareable = true;             // share the input data so the bitmap can be re-decoded after a purge
    options.inTempStorage = new byte[32 * 1024]; // temporary storage for decoding
    options.inSampleSize = 1;
    return BitmapFactory.decodeByteArray(buffArray, 0, buffArray.length, options);
}
Thanks to all the threads, I've found a solution that works for me on a real device. The trick is all about using:
BitmapFactory.Options opts=new BitmapFactory.Options();
opts.inSampleSize = (int) (bitmap_size / target_size); // if the original bitmap is bigger
But for me this was not enough. My original image (taken from the Camera app) was 3264x2448. The correct ratio for me was 3, since I wanted a plain 1024x768 image.
But setting inSampleSize to 3 was not enough: I still got an out-of-memory exception.
So in the end I opted for an iterative approach: I start from the computed scale and increase it until I stop getting an OOM error.
For me that ended up at a sample size of 4.
// Decode with inSampleSize
BitmapFactory.Options o2 = new BitmapFactory.Options();
// o2.inSampleSize = scale;
float trueScale = o.outWidth / 1024;
o2.inPurgeable = true;
o2.inDither = false;
Bitmap b = null;
do {
    o2.inSampleSize = (int) trueScale;
    Log.d(TAG, "Scale is " + trueScale);
    try {
        b = BitmapFactory.decodeStream(new FileInputStream(f), null, o2);
    } catch (OutOfMemoryError e) {
        Log.e(TAG, "Error decoding image at sampling " + trueScale + ", resampling.. " + e);
        System.gc();
        try {
            Thread.sleep(50);
        } catch (InterruptedException e1) {
            e1.printStackTrace();
        }
    }
    trueScale += 1;
} while (b == null && trueScale < 10);
return b;
You must not depend on the GC to reclaim your bitmap memory.
You must explicitly recycle bitmaps when they are no longer needed.
See the Bitmap method:
void recycle()
Free up the memory associated with this bitmap's pixels, and mark the bitmap as "dead", meaning it will throw an exception if getPixels() or setPixels() is called, and will draw nothing.
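A minimal sketch of that call, e.g. in an Activity's onDestroy (the null check is just defensive):
// Release the pixel memory as soon as the bitmap is no longer displayed.
if (bitmap != null && !bitmap.isRecycled()) {
    bitmap.recycle();
    bitmap = null;
}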
Try it another way:
Bitmap bmpOrignal = BitmapFactory.decodeFile("/sdcard/mydata/" + path);
inSampleSize lets you resize the image as it is read.
getLength() on AssetFileDescriptor gives you the size of the file.
You can vary inSampleSize according to getLength() to prevent OutOfMemory, like this:
private final int MAX_SIZE = 500000;

public Bitmap readBitmap(Uri selectedImage) {
    Bitmap bm = null;
    AssetFileDescriptor fileDescriptor = null;
    try {
        fileDescriptor = this.getContentResolver().openAssetFileDescriptor(selectedImage, "r");
        long size = fileDescriptor.getLength();
        BitmapFactory.Options options = new BitmapFactory.Options();
        options.inSampleSize = (int) (size / MAX_SIZE);
        bm = BitmapFactory.decodeFileDescriptor(fileDescriptor.getFileDescriptor(), null, options);
    } catch (Exception e) {
        e.printStackTrace();
    } finally {
        try {
            if (fileDescriptor != null) fileDescriptor.close();
        } catch (IOException e) {}
    }
    return bm;
}
