Effective Bitmap Scaling for Album Art in Android

I just added support for album art in my Android application and I've encountered a problem where displaying the album art in a layout causes the application memory to spike and the playback service is eventually killed to free up memory. I believe the issue is that I'm adding the extracted album art to the layout without compressing it. This results in a large image having to be cached in memory. The code I'm using to make the Bitmap is:
byte[] blob = mCursor.getBlob(mCursor.getColumnIndexOrThrow(Media.MediaColumns.PICTURE));
if (blob != null) {
    return BitmapFactory.decodeByteArray(blob, 0, blob.length);
}
Is it possible to uniformly scale/compress these Bitmaps to reduce their memory footprint? Also, is there a way to compress directly from the byte array (rather than an InputStream)?

Try this:
BitmapFactory.Options opt = new BitmapFactory.Options();
opt.inSampleSize = 2; // decode at half the width/height, i.e. a quarter of the pixels
if (blob != null) {
    return BitmapFactory.decodeByteArray(blob, 0, blob.length, opt);
}
More info about inSampleSize, from the documentation:
public int inSampleSize
Added in API level 1
If set to a value > 1, requests the decoder to subsample the original image, returning a smaller image to save memory. The sample size is the number of pixels in either dimension that correspond to a single pixel in the decoded bitmap. For example, inSampleSize == 4 returns an image that is 1/4 the width/height of the original, and 1/16 the number of pixels. Any value <= 1 is treated the same as 1. Note: the decoder will try to fulfill this request, but the resulting bitmap may have different dimensions than precisely what has been requested. Also, powers of 2 are often faster/easier for the decoder to honor.
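Building on that, if you'd rather derive the sample size from the image itself instead of hard-coding 2, one common pattern is a two-pass decode: read only the bounds, then pick a power-of-2 factor. This is only a sketch; the 200px target and the helper name are made up for illustration.
static Bitmap decodeSampledByteArray(byte[] blob, int targetSize) {
    // First pass: read dimensions only, no pixel memory is allocated.
    BitmapFactory.Options bounds = new BitmapFactory.Options();
    bounds.inJustDecodeBounds = true;
    BitmapFactory.decodeByteArray(blob, 0, blob.length, bounds);

    // Pick the largest power of 2 that keeps both sides at or above targetSize.
    int sampleSize = 1;
    while (bounds.outWidth / (sampleSize * 2) >= targetSize
            && bounds.outHeight / (sampleSize * 2) >= targetSize) {
        sampleSize *= 2;
    }

    // Second pass: decode the subsampled bitmap.
    BitmapFactory.Options opts = new BitmapFactory.Options();
    opts.inSampleSize = sampleSize;
    return BitmapFactory.decodeByteArray(blob, 0, blob.length, opts);
}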

Related

BitmapFactory unexpectedly downsamples image

In my Android application, I upload and download JPEG images. For some images, BitmapFactory downsamples the downloaded image, and I don't understand why or how to stop it from doing so.
I log the Bitmap object size when I upload and download the images:
Uploading image with size 3840x2160
Opening received image of size 2601584B degradation 0 decoded 384x216
So here my image width and height are divided by 10 after decoding by BitmapFactory.
For another image I go from 3480x4640 to 217x290, so a division by 16. I first thought it could be caused by image dpi metadata, so I tried to force it when opening the image with the following code:
BitmapFactory.Options opts = new BitmapFactory.Options();
opts.inSampleSize = 1 << degradation;
opts.inScaled = false;
opts.inDensity = 1;
opts.inTargetDensity = 1;
opts.inScreenDensity = 1;
final Bitmap original = BitmapFactory.decodeByteArray(data, 0, data.length, opts);
Log.d(Tags.IMAGES, "Opening received image of size " + data.length + " degradation " + degradation + " decoded " + original.getWidth() + "x" + original.getHeight());
As you can see, the number of bytes (2MB) I give BitmapFactory is consistent with a large image, not with a roughly 300x200px image. So my uploading and downloading code are correct; I have the right data before decoding.
I forced all the densities to avoid scaling, disabled scaling outright, and my sample size is 1 (since my degradation is 0), so I don't force any subsampling. I still end up with a very small image. And BitmapFactory.decodeByteArray quickly drops into native code, so I can't really debug it.
I tried:
setting all the densities in the BitmapFactory.Options object to 0, but this didn't change a thing.
setting the density of the bitmap before uploading; no effect (originally these two subsampled images had a density of 420).
reading the size of the bitmap before decoding it fully (via inJustDecodeBounds); the size returned this way is also too small.
not passing any options to the decodeByteArray method; the image is still too small.
OK, after more investigation I found the issue, and it was in another part of the code. Because of a race condition in the application, I was concurrently writing both the image and a miniature of it to the same file. So I ended up with a large image file whose beginning had been overwritten with a small image, and that is why BitmapFactory produced this small bitmap.
Sorry, the issue was not in this part of the code, so you couldn't have found it here.
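For anyone who hits the same symptom: the fix here is simply to make sure the full image and the miniature never share a file. A rough sketch (the method name and all variables are hypothetical, not the asker's code):
static void saveImagePair(File dir, String id, byte[] fullJpegBytes, Bitmap miniature) throws IOException {
    // Give each variant its own file so concurrent writers can't clobber each other.
    File full = new File(dir, id + "_full.jpg");
    File thumb = new File(dir, id + "_thumb.jpg");
    try (FileOutputStream out = new FileOutputStream(full)) {
        out.write(fullJpegBytes);
    }
    try (FileOutputStream out = new FileOutputStream(thumb)) {
        miniature.compress(Bitmap.CompressFormat.JPEG, 85, out);
    }
}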

Bitmap.compress doesn't decrease the Byte count

I need to compress an image to send it to my server. I am trying to do it this way:
private Bitmap compressImage(Bitmap bitmapImg) {
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    bitmapImg.compress(Bitmap.CompressFormat.JPEG, 50, out);
    Bitmap compressed = BitmapFactory.decodeStream(new ByteArrayInputStream(out.toByteArray()));
    return compressed;
}
But when I compare the Byte count of the original Bitmap object and the compressed one, I get the same number:
Log.e("UNCOMPRESSED", Integer.toString(mBitmapImg.getByteCount()));
E/UNCOMPRESSED: 23970816
Log.e("COMPRESSED", Integer.toString(compressedBitmapImg.getByteCount()));
E/COMPRESSED: 23970816
How can I fix this to have a smaller file?
But when I compare the Byte count of the original Bitmap object and the compressed one, I get the same number:
The size of a Bitmap in memory is based only on its resolution (width and height in pixels) and its bit depth (the number of bytes per pixel, which controls how many colors each pixel can represent).
How can I fix this to have a smaller file?
You do not have a file. You have a Bitmap object in memory. An image file is usually stored in a compressed form. In particular, this is true for JPEG, PNG, WebP, and GIF, the four major image formats used in Android. So, for example, out.toByteArray() will be smaller than 23,970,816 bytes.
Moreover, you are not sending a Bitmap to the server. You are sending an image to the server. You need to read the documentation for the server, or talk to the server developers, to determine what image format(s) they support and how to send the image to the server (ideally, something efficient like an HTTP PUT).
If you want to reduce the in-memory size of the Bitmap, scale it to a lower-resolution image (e.g., via createScaledBitmap()).
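To make that concrete, here is a sketch that combines the two points: scale the Bitmap down for memory, then compress it to JPEG bytes for the upload. The 1024px cap and the method name are placeholders, not something from the question.
static byte[] prepareForUpload(Bitmap original) {
    // Cap the longest side at 1024px (arbitrary example value).
    int maxDim = 1024;
    float scale = Math.min(1f, maxDim / (float) Math.max(original.getWidth(), original.getHeight()));
    Bitmap scaled = Bitmap.createScaledBitmap(original,
            Math.round(original.getWidth() * scale),
            Math.round(original.getHeight() * scale),
            true); // filter = true for a smoother downscale
    // The JPEG byte array is the "file" that actually goes to the server.
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    scaled.compress(Bitmap.CompressFormat.JPEG, 50, out);
    return out.toByteArray();
}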
You can change your bitmap format to RGB_565 from ARGB_8888. That will cut your bitmap's memory footprint in half, but it also means a loss of quality. Unfortunately, that's about the most you can do with a Bitmap.
Having said that, the compression method you're using should work fine for most situations. It's also the advocated method on a number of platforms; Firebase, for example, documents it.
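For reference, a minimal sketch of the RGB_565 approach (jpegBytes here is just a placeholder for whatever encoded image data you have):
// Decode straight into RGB_565: 2 bytes per pixel instead of 4, no alpha channel.
BitmapFactory.Options opts = new BitmapFactory.Options();
opts.inPreferredConfig = Bitmap.Config.RGB_565;
Bitmap smaller = BitmapFactory.decodeByteArray(jpegBytes, 0, jpegBytes.length, opts);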

90MB heap size when loading drawables

I want to show a preview of some images from the user's device. For this purpose I search the external storage for image files, and for every folder that contains some, I list the folder's name and an image from that folder in a ListView.
On my Nexus there are 6 folders containing images, so I have 6 listview items.
I load all images using:
Drawable.createFromPath(file.getAbsolutePath())
And cache the resulting drawable in a HashMap in order to prevent loading the same image multiple times.
However, the heap grows from 20MB to over 90MB, and while the images are loading the app's response is delayed by about 2 seconds. Pretty bad.
I have no idea how the heap can grow to 90MB from 6 images that are around 50KB each, but whatever. To fix this I tried to load subsampled bitmaps from the images; however, whenever I load them I get an OutOfMemoryError.
I have verified multiple times that not more than those 6 images are loaded.
What can I do?
What you should do is analyze your app's memory usage using the MAT tool, as described in this amazing article. This tool will help you identify potential memory leaks and see what exactly triggers the heap growth.
I have used the tool that Egor described and there were no memory leaks. The images took about 30MB of heap, and that amount can only be reduced by reducing the image size. My solution uses subsampling and now takes about 3MB of heap:
private Drawable loadDrawable(Context context, File file) {
    Drawable drawable = drawables.get(file);
    if (drawable == null) {
        final int targetSize = 500;
        // read only the image bounds to get the subsampling factor
        BitmapFactory.Options opts = new BitmapFactory.Options();
        opts.inJustDecodeBounds = true;
        BitmapFactory.decodeFile(file.getAbsolutePath(), opts);
        int largest = opts.outWidth > opts.outHeight ? opts.outWidth : opts.outHeight;
        int factor = Math.round(largest / (float) targetSize);
        // load the bitmap with subsampling
        opts = new BitmapFactory.Options();
        opts.inSampleSize = factor;
        Bitmap bitmap = BitmapFactory.decodeFile(file.getAbsolutePath(), opts);
        drawable = new BitmapDrawable(context.getResources(), bitmap);
        drawables.put(file, drawable);
    }
    return drawable;
}
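For context, a hypothetical call site for the method above, e.g. when binding a row of the folder list (icon and folderPreviewFile are illustrative names, not from the original post):
// Inside the adapter's getView(): show the cached, subsampled preview for this folder.
icon.setImageDrawable(loadDrawable(context, folderPreviewFile));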

Large bitmaps not loading efficiently from sdcard in Android

By following this link, I have written the code below to show a large bitmap from the SD card.
try {
    InputStream lStreamToImage = context.getContentResolver().openInputStream(Uri.parse(imagePath));
    BitmapFactory.Options options = new BitmapFactory.Options();
    options.inJustDecodeBounds = true;
    BitmapFactory.decodeStream(lStreamToImage, null, options);
    options.inSampleSize = 8; // decrease the size of the decoded image
    options.inPreferredConfig = Bitmap.Config.ARGB_4444;
    options.inJustDecodeBounds = false;
    bitmap = BitmapFactory.decodeStream(lStreamToImage, null, options);
} catch (Exception e) {}
image.setImageBitmap(bitmap);
But it is not returning the bitmap (I mean, it returns null). In Logcat it shows the message below repeatedly:
08-02 17:21:04.389: D/skia(19359): --- SkImageDecoder::Factory returned null
If I comment out the options.inJustDecodeBounds line and rerun it, it works fine, but slowly. The developer guide link I provided above says to use inJustDecodeBounds to load bitmaps efficiently.
Please tell me where I am going wrong.
inJustDecodeBounds does not load bitmaps. That's the point of it. It loads the dimensions of the bitmap without loading the actual bitmap, so you can do any pre-processing or checking on the bitmap before you actually load it. This is helpful if you, say, were having memory issues and needed to check whether loading a bitmap would crash your program.
The reason your bitmap might be loading slowly is because it's probably very large and SD cards are very slow.
EDIT:
From the documentation:
If set to true, the decoder will return null (no bitmap), but the out... fields will still be set, allowing the caller to query the bitmap without having to allocate the memory for its pixels.
Edit 2:
Looking at your code against the example provided by Google, it looks like you are doing roughly the same thing. The reason it's returning null is possibly that your InputStream has been consumed by the first decode and is thus no longer positioned at the beginning of the image data (Google's example uses a resource ID rather than an InputStream).
From the code you supplied here, here's what I've figured. You are ALWAYS setting the sample size to 8, regardless of what the first decoding gives you. The reason Google decodes the first time is to figure out what the actual size of the bitmap is versus what they want. They determine that the bitmap is ZxZ dimensions and they want YxY dimensions, so they calculate the sample size they should use for the second decoding. You are not doing this. You are simply retrieving the dimensions of the bitmap and not using them. THEN you set the sample size to a hard-coded 8, switch to a hard-coded ARGB_4444 config, and decode the full bitmap into memory. In other words, these three lines are not being used:
BitmapFactory.Options options = new BitmapFactory.Options();
options.inJustDecodeBounds = true;
BitmapFactory.decodeStream(lStreamToImage, null, options);
Setting inJustDecodeBounds merely gives you the bitmap's dimensions without putting the bitmap into memory. It doesn't make decoding more efficient by itself; it's meant to let you load bitmaps into a smaller memory space when they are too big, because you can decide in advance what size they should be without decoding the whole thing.
The reason decoding the bitmap is slow might merely be a CPU thing. Depending on the size of your bitmap, you're loading it through an InputStream from the SD card, which is a slow operation in itself.
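To tie the two points together, here is a sketch of the asker's snippet with both fixes applied: the sample size is derived from the measured bounds, and a fresh InputStream is opened for the second decode. The 1024px target is arbitrary, and exception handling is left to the surrounding try/catch.
// First pass: measure only.
BitmapFactory.Options options = new BitmapFactory.Options();
options.inJustDecodeBounds = true;
InputStream measureStream = context.getContentResolver().openInputStream(Uri.parse(imagePath));
BitmapFactory.decodeStream(measureStream, null, options);
measureStream.close();

// Pick a power-of-2 sample size that brings the longest side near 1024px.
int target = 1024;
int sample = 1;
while (options.outWidth / (sample * 2) >= target && options.outHeight / (sample * 2) >= target) {
    sample *= 2;
}

// Second pass: decode for real from a fresh stream.
options = new BitmapFactory.Options();
options.inSampleSize = sample;
InputStream imageStream = context.getContentResolver().openInputStream(Uri.parse(imagePath));
bitmap = BitmapFactory.decodeStream(imageStream, null, options);
imageStream.close();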

BitmapFactory.Options.inSampleSize not being honored?

I have a photo on disk with dimensions 2560 x 1920. This is often too large to load into memory, so I'm trying to use BitmapFactory.Options.inSampleSize to conserve memory. From the docs:
inSampleSize: If set to a value > 1, requests the decoder to subsample the original image, returning a smaller image to save memory.
This is how I use it:
BitmapFactory.Options optsDownSample = new BitmapFactory.Options();
optsDownSample.inSampleSize = 3;
Bitmap bmp = BitmapFactory.decodeFile(path, optsDownSample);
but the app still sometimes crashes on the last line there, and from Logcat I can see it's trying to allocate ~5MB. I suspect this is because the downsampling is not really being honored.
Anyone else know what could be going on here, am I using inSampleSize incorrectly?
Thanks
I'm also struggling to understand how to use BitmapFactory.Options. Based on all the documentation I've read, I believe you are just missing optsDownSample.inJustDecodeBounds = false;, as indicated on the Android Developers site.
Best of luck!
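As a side note grounded in the inSampleSize documentation quoted earlier on this page (powers of 2 are easier for the decoder to honor), a minimal sketch that sticks to a power of 2 would be:
// Sketch: a power-of-2 sample size; 2560x1920 decoded at inSampleSize = 4
// comes out at roughly 640x480, using about 1/16 of the pixel memory.
BitmapFactory.Options optsDownSample = new BitmapFactory.Options();
optsDownSample.inSampleSize = 4;
Bitmap bmp = BitmapFactory.decodeFile(path, optsDownSample);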
