While trying to get an image's dimensions in pixels from an ImageView, I found that its width is 3 times the width of the original jpg file.
I put in a jpg file whose dimensions are 800 x 600, but the code below reports 2400 as its width.
Bitmap bitmap = ((BitmapDrawable) imgv.getDrawable()).getBitmap();
float fwidth = bitmap.getWidth();
Log.d("width0", Float.toString(fwidth)); // logs 2400.0 instead of the expected 800.0
I checked the jpg file's dimensions again, but they had not changed (800 x 600).
I also searched for a solution, but in other users' experience the code above reports the correct dimensions of the bitmap.
What have I done incorrectly?
Can anyone give me some advice?
Thanks for your help.
This is the solution I found on a website.
I needed to change the Options value when decoding the resource so that the original image is not scaled. I had to use the three-parameter overload of the decodeResource function, not the two-parameter one.
The third parameter is a BitmapFactory.Options object specifying that the original bitmap should not be scaled. Now I get 800 when I call the bitmap's getWidth() function.
Resources res = getResources();
int id = R.drawable.map10;
BitmapFactory.Options options = new BitmapFactory.Options();
options.inScaled = false; // decode at the original pixel size, without density scaling
// options.inSampleSize = 3;
Bitmap bb = BitmapFactory.decodeResource(res, id, options);
float fwidth = bb.getWidth();
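To display the unscaled bitmap, you can then set it on the ImageView from the question (imgv) directly:
imgv.setImageBitmap(bb); // getWidth() on this bitmap now reports 800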
Make sure that your ImageView is set to:
height : wrap_content
width : wrap_content
scaleType: matrix (applies no scaling)
I have a PNG image file with 2236x971px dimensions as a resource.
I want to scale it down by a factor of two (to half its size). However, when I use this code:
BitmapFactory.Options bo = new BitmapFactory.Options();
bo.inSampleSize = 2;
Bitmap decodedBitmap = BitmapFactory.decodeResource(getResources(), R.drawable.image, bo);
decodedBitmap.getWidth() and decodedBitmap.getHeight() report width: 1677, height: 728 → only a 25% reduction in size instead of the expected 50%. Why is that?
I am running the code on API 19.
The reason is that your resource gets scaled according to your screen density when it is loaded. Put your image in the drawable-nodpi folder, or open an input stream to your resource and decode that stream instead.
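A minimal sketch of the stream approach, assuming the same R.drawable.image resource from the question; it bypasses density scaling because no resource metrics are attached to the stream:
InputStream is = getResources().openRawResource(R.drawable.image);
BitmapFactory.Options bo = new BitmapFactory.Options();
bo.inSampleSize = 2; // decode at half width and half height
Bitmap decodedBitmap = BitmapFactory.decodeStream(is, null, bo); // 1118 x 485, as expected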
I have images at 1080p resolution, and I do not want to use the different variants of images that we normally put in the res folder. I am going to install this app on 1K devices with random images, so it is not feasible to have different versions of each image.
Can we scale them at runtime and still get the best quality?
To scale images, use the following:
Bitmap bitmap = BitmapFactory.decodeResource(
        getResources(), R.drawable.app_bg);
Bitmap scaledBitmap = Bitmap.createScaledBitmap(bitmap, width, height, true);
Then assign this scaledBitmap to an ImageView or any other view. This will scale the original bitmap to the requested width and height. To get the width and height of the device screen, use the following:
Display display = getWindowManager().getDefaultDisplay();
Point size = new Point();
display.getSize(size);
int width = size.x;
int height = size.y;
EDIT
To avoid running out of memory (an eventual OutOfMemoryError), call scaledBitmap.recycle() once you are done using the bitmap.
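Putting the pieces together inside an Activity, the whole flow might look like this (imageView is an assumed reference to the target view):
Display display = getWindowManager().getDefaultDisplay();
Point size = new Point();
display.getSize(size);

Bitmap bitmap = BitmapFactory.decodeResource(getResources(), R.drawable.app_bg);
Bitmap scaledBitmap = Bitmap.createScaledBitmap(bitmap, size.x, size.y, true);
bitmap.recycle(); // the full-size original is no longer needed
imageView.setImageBitmap(scaledBitmap);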
Since loading large bitmaps to show on a device screen in Android (the correct way) is not a trivial task, I took a look at some tutorials on how to do it effectively. I'm already aware that you need to do the following in order to make a memory-efficient image loader method:
BitmapFactory.Options options = new BitmapFactory.Options();
options.inJustDecodeBounds = true; // decode only the bounds, so no pixel memory is allocated on the heap
BitmapFactory.decodeResource(getResources(), R.drawable.myimage, options);
According to the Google tutorial you should decode a sampled-down version of the image. Up to this point I understand; you make a method like this:
public static Bitmap decodeSampledBitmapFromResource(Resources res, int resId,
        int reqWidth, int reqHeight) {
    // First decode with inJustDecodeBounds=true to check dimensions
    final BitmapFactory.Options options = new BitmapFactory.Options();
    options.inJustDecodeBounds = true;
    BitmapFactory.decodeResource(res, resId, options);
    // Calculate inSampleSize
    options.inSampleSize = calculateInSampleSize(options, reqWidth, reqHeight);
    // Decode bitmap with inSampleSize set
    options.inJustDecodeBounds = false;
    return BitmapFactory.decodeResource(res, resId, options);
}
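The method relies on a calculateInSampleSize helper; the version shown in the Android developer guide on loading large bitmaps looks like this:
public static int calculateInSampleSize(BitmapFactory.Options options,
        int reqWidth, int reqHeight) {
    // Raw dimensions of the image, filled in by the inJustDecodeBounds pass
    final int height = options.outHeight;
    final int width = options.outWidth;
    int inSampleSize = 1;

    if (height > reqHeight || width > reqWidth) {
        final int halfHeight = height / 2;
        final int halfWidth = width / 2;
        // Find the largest power of 2 that keeps both dimensions
        // at least as large as the requested width and height
        while ((halfHeight / inSampleSize) >= reqHeight
                && (halfWidth / inSampleSize) >= reqWidth) {
            inSampleSize *= 2;
        }
    }
    return inSampleSize;
}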
Finally, you call the method like this:
imageView.setImageBitmap(decodeSampledBitmapFromResource(getResources(), R.drawable.big_image, 200, 200));
And here is where my doubt lies: how do I know the best size values for a given device's screen size/resolution? Is there a method I can embed in my code that returns the optimal dimensions for loading an image without a pixelated effect, yet not so big that it blows up the VM heap? This is one of the biggest challenges I'm facing in my project right now. I searched this link (and others I don't remember), but I couldn't find the answer I'm looking for.
As a suggestion:
You can define the approximate width and height of your ImageView as a portion of the device's width and height in pixels, for example:
ImageView width = 0.8 of device width & ImageView height = 0.4 of device height
Then calculate the device's width and height in pixels; from those you can get the actual required ImageView size for the relevant portion.
Then pass the calculated ImageView width and height into the decodeSampledBitmapFromResource method, as sketched below.
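A minimal sketch of that suggestion; the 0.8 and 0.4 fractions are the example portions above, and imageView and R.drawable.big_image are assumed names:
DisplayMetrics metrics = getResources().getDisplayMetrics();
int reqWidth = (int) (metrics.widthPixels * 0.8f);   // ImageView ≈ 0.8 of device width
int reqHeight = (int) (metrics.heightPixels * 0.4f); // ImageView ≈ 0.4 of device height
Bitmap b = decodeSampledBitmapFromResource(getResources(), R.drawable.big_image, reqWidth, reqHeight);
imageView.setImageBitmap(b);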
In my application, on most devices, I am interested in loading and displaying a bitmap exactly as it exists in the resource file:
options.inScaled = false;
backgroundImage = BitmapFactory.decodeResource(getResources(), R.drawable.demo_frame, options);
On certain (small) devices, I want to reduce the size of the bitmap, so I call the following:
backgroundImage = Bitmap.createScaledBitmap(backgroundImage, Scale(backgroundImage.getWidth()), Scale(backgroundImage.getHeight()), false);
where the Scale function applies a scaling factor to shrink the image. The bitmap returned reports the desired values from getWidth() and getHeight(), but the image is displayed at its original size, i.e. without the scaling factor applied. I assume this has something to do with the options.inScaled parameter, but the documentation did not enlighten me. Can someone give a more detailed explanation of the inScaled parameter and how I can override it in this case? Thanks.
I am running into a strange issue. I have multiple images in my Android project stored as .png files under res\drawable. I was able to easily extract the images at runtime and convert them to a bitmap like this:
Drawable d = getResources().getDrawable(R.drawable.imageId);
Bitmap bitmap = ((BitmapDrawable)d).getBitmap();
This works great, and the image gets scaled correctly no matter what screen density the device has. All my images are 200 x 200 pixels, and my image layout is configured as 200 dip x 200 dip.
Now I have stored all the images as blobs in an SQLite database due to scalability issues, and I am extracting them at runtime and converting them to bitmaps like this:
byte[] bb = cursor.getBlob(columnIndex);
Bitmap bitmap = BitmapFactory.decodeByteArray(bb, 0, bb.length);
The image displays fine if the screen density is the standard 160 dpi. But at any lower or higher density the image doesn't scale; it stays at the 200 x 200 pixels meant for 160 dpi. So on a smaller screen (120 dpi) the image takes up more space than it should, and on a larger screen (240 dpi) it takes up less space than it should.
Has anyone else run into this bizarre issue? Any explanation, workaround, solution will be really appreciated.
Thanks much in advance!
Okay, I finally got it to work by using createScaledBitmap().
After creating a bitmap from the blob, I calculate the scaling factor and then the new width and height. I then use those in the createScaledBitmap() call. Here's the code that works:
public static Bitmap getImageFromBlob(Context context, byte[] mBlob)
{
    // a Context parameter is needed, since a static helper cannot call getWindowManager() itself
    byte[] bb = mBlob;
    Bitmap b = BitmapFactory.decodeByteArray(bb, 0, bb.length);
    DisplayMetrics metrics = context.getResources().getDisplayMetrics();
    // scale from the default density (160 dpi) to the device density
    int newWidth = (b.getWidth() * metrics.densityDpi) / DisplayMetrics.DENSITY_DEFAULT;
    int newHeight = (b.getHeight() * metrics.densityDpi) / DisplayMetrics.DENSITY_DEFAULT;
    return Bitmap.createScaledBitmap(b, newWidth, newHeight, true);
}
This link provided the clue:
difference between methods to scale a bitmap
Use decodeByteArray(byte[] data, int offset, int length, BitmapFactory.Options opts) and in opts set inDensity to, say, 160.
Worked for me, see this. But I have to say, I think SQLite is doing it right, and the Android drawable is incorrectly sized.
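A sketch of that approach, assuming bb is the blob read from the cursor. Note that when decoding raw bytes there is no resource context to supply the target density, so inTargetDensity has to be set as well for the scaling to take effect:
BitmapFactory.Options opts = new BitmapFactory.Options();
opts.inDensity = DisplayMetrics.DENSITY_DEFAULT; // 160: treat the bytes as a medium-density image
opts.inTargetDensity = getResources().getDisplayMetrics().densityDpi; // scale to the device density
opts.inScaled = true; // the default, shown here for clarity
Bitmap bitmap = BitmapFactory.decodeByteArray(bb, 0, bb.length, opts);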