I'm having trouble cleanly down-scaling images on Android. I'm looking to scale small PNG images between arbitrary sizes of about 10-100% of their original size.
I've created a sample image to demonstrate the problem and exacerbate the unusual behaviors I'm seeing in Android's image scaler:
The above image is a screenshot from an Android device with some annotations added. I've also added a second column on the left showing how the same images are rendered with linear scaling by GIMP (the GNU Image Manipulation Program).
The base image consists of a checkerboard pattern background of red and blue pixels. On that background I've drawn some 1px-wide yellow lines and fairly thin green text. The image is 288x288 pixels.
When scaling the image to 1/3 of its original dimensions, Android seems to simply grab one in nine pixels, throwing out all other data. Some of the yellow lines disappear entirely as a result. Remarkably, the checkerboard pattern remains intact (which is simply a result of every 3rd pixel being used).
When scaling the image to a dimension of near-but-not-exactly 50% of its original size, e.g., 142x142 or 143x143, the scaler creates some fairly large anomalies/artifacts on the image.
At 50% size (144x144), the image looks correct.
The test image does bring out the worst in the image scaler, but "normal" PNG icon images are severely impacted as well. From roughly 10-33% the images aren't properly resampled and thus appear extremely "bitmapped", and some larger images show very strange anomalies at certain sizes.
If anyone knows a means to disable this strange scaling behavior, even at a performance cost, I'd greatly appreciate knowing about it. It can certainly be solved by writing an algorithm that works directly on the pixels of bitmaps, but I'm hopeful that isn't the only option.
Also noteworthy is the fact that all image work is being done with the ARGB_8888 Bitmap.Config. I've tried manipulating image size by setting maxWidth/maxHeight on ImageViews, by using Bitmap.createScaledBitmap(), and by using Bitmap.createBitmap() with a Matrix. All attempts give the same result. Bitmap filtering is enabled.
Thanks again for any suggestions!
Using Bitmap.createScaledBitmap() and Bitmap.createBitmap() with a Matrix amounts to the same thing; see the source of Bitmap.createScaledBitmap() (which hasn't changed since Android 2).
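For reference, createScaledBitmap is essentially a thin wrapper; a rough sketch of what both calls boil down to (src, dstWidth and dstHeight are placeholders):

// Roughly equivalent to Bitmap.createScaledBitmap(src, dstWidth, dstHeight, true):
Matrix m = new Matrix();
m.setScale((float) dstWidth / src.getWidth(), (float) dstHeight / src.getHeight());
Bitmap scaled = Bitmap.createBitmap(src, 0, 0, src.getWidth(), src.getHeight(), m, true /* filter */);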
On Android 4.0+, using a matrix (as Bitmap.createScaledBitmap does) allows hardware-accelerated operations if enabled (enabled by default on 4.1+, IIRC), so we don't have direct control over what is being done or how it is done.
That means you'll have to implement your own scaling method with the desired (here, linear) filtering: either by processing the pixels directly, or by using OpenGL ES with the right filter, though that may not be available on all devices.
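If you go the pixel-processing route, here is a minimal sketch of a bilinear resampler working directly on the ARGB pixels (class and method names are mine, not a framework API; for very strong downscales a box/averaging filter would preserve even more detail):

import android.graphics.Bitmap;

/** Minimal bilinear resampler working directly on pixel data (sketch, not production code). */
public final class BilinearScaler {

    public static Bitmap scale(Bitmap src, int dstW, int dstH) {
        int srcW = src.getWidth();
        int srcH = src.getHeight();
        int[] in = new int[srcW * srcH];
        int[] out = new int[dstW * dstH];
        src.getPixels(in, 0, srcW, 0, 0, srcW, srcH);

        float xRatio = (float) (srcW - 1) / dstW;
        float yRatio = (float) (srcH - 1) / dstH;

        for (int y = 0; y < dstH; y++) {
            float fy = y * yRatio;
            int y0 = (int) fy;
            float dy = fy - y0;
            for (int x = 0; x < dstW; x++) {
                float fx = x * xRatio;
                int x0 = (int) fx;
                float dx = fx - x0;

                // Four neighboring source pixels around the sample point.
                int c00 = in[y0 * srcW + x0];
                int c10 = in[y0 * srcW + x0 + 1];
                int c01 = in[(y0 + 1) * srcW + x0];
                int c11 = in[(y0 + 1) * srcW + x0 + 1];

                out[y * dstW + x] = mix(mix(c00, c10, dx), mix(c01, c11, dx), dy);
            }
        }
        Bitmap dst = Bitmap.createBitmap(dstW, dstH, Bitmap.Config.ARGB_8888);
        dst.setPixels(out, 0, dstW, 0, 0, dstW, dstH);
        return dst;
    }

    // Interpolate each ARGB channel separately.
    private static int mix(int a, int b, float t) {
        int aa = (a >>> 24), ar = (a >> 16) & 0xFF, ag = (a >> 8) & 0xFF, ab = a & 0xFF;
        int ba = (b >>> 24), br = (b >> 16) & 0xFF, bg = (b >> 8) & 0xFF, bb = b & 0xFF;
        int na = (int) (aa + (ba - aa) * t);
        int nr = (int) (ar + (br - ar) * t);
        int ng = (int) (ag + (bg - ag) * t);
        int nb = (int) (ab + (bb - ab) * t);
        return (na << 24) | (nr << 16) | (ng << 8) | nb;
    }
}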
Related
I'm new to Android development. I have a library here that produces LCD images that come from a device through USB. These images are small, 120x80 pixels, and 1-bit. I want to show these images on the UI.
All documents I've found explain how to show bitmaps from image files (PNG etc.) or how to show them from an app's resources. I did find out that I can add an ImageView to the UI. Then, for each incoming 120x80 pixel image, create a Bitmap instance, fill it with pixels, and assign it to the ImageView. However, I do not know if this yields the best performance here.
It is also important to keep in mind that nearest neighbor scaling must be used here. Bilinear filtering with such a small image produces a result that is too blurry.
From my experience with other languages, it seems wasteful to create a Bitmap instance for each incoming image. But perhaps I am wrong here.
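For illustration, this is roughly the approach I'm considering, reusing one mutable Bitmap instead of allocating a new one per frame (the variable names and the frame callback are just placeholders on my side):

// Created once; the ImageView keeps drawing this same mutable Bitmap.
Bitmap frame = Bitmap.createBitmap(120, 80, Bitmap.Config.ARGB_8888);
imageView.setImageBitmap(frame);
imageView.getDrawable().setFilterBitmap(false); // nearest neighbor when the view scales it up

// Called for each incoming 1-bit image, already expanded to ARGB ints elsewhere.
void onFrame(int[] argbPixels) {
    frame.setPixels(argbPixels, 0, 120, 0, 0, 120, 80);
    imageView.invalidate(); // redraw with the updated pixels
}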
Suggestions?
I've been having problems with large images being resized for UI use in Android.
Look at this image, it's an ImageView:
The original image (that arc is a progress bar) is around 10 times bigger than what you see here. In UWP (Universal Windows Platform) we had no problem using a very large image, but here on Android I believe it's the nearest-neighbour method used to fit images into UI elements which, as you can see, causes jagged edges.
Is there any way to switch it into another method? Like Bicubic? It happens in all Android versions I've tested (4.1, 5.0, 6.0).
Just to mention, I'm using Xamarin 4, which I don't believe is a contributing factor here.
No luck searching the internet; I'm afraid I'm the only one having this problem.
Thanks.
As mentioned above, you should prefer a vector image to a raster (pixel) image.
But if you have to use a raster image, you could use BitmapRegionDecoder to decode the image line by line and write your own resampling algorithm (e.g., bilinear interpolation, which is much better than nearest neighbor) to resize it, typically on the JNI side.
Another possible way is to use the filter parameter when calling the Bitmap.createBitmap overload that takes a Matrix; since your original image isn't large enough to cause an OOM, just set it to true, which reduces the artifacts.
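A minimal sketch of that second option, assuming original is your decoded Bitmap and 10% is the target size:

Matrix m = new Matrix();
m.postScale(0.1f, 0.1f); // e.g. shrink to 10% of the original
Bitmap small = Bitmap.createBitmap(
        original, 0, 0, original.getWidth(), original.getHeight(),
        m, /* filter = */ true);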
You should use Vector Images instead of Bitmap Images.
Bitmap x Vector
A bitmap represents an image as a grid of colored pixels, whereas a vector image is described by geometric shapes (lines, curves) with colors.
The main advantage of a vector image is that it can be scaled without losing definition.
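For example, if your arc is provided as a vector drawable resource (ic_arc is an assumed name), it can be set on the ImageView and stays crisp at any size:

// res/drawable/ic_arc.xml is an assumed VectorDrawable resource.
Drawable arc = AppCompatResources.getDrawable(context, R.drawable.ic_arc);
imageView.setImageDrawable(arc);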
I have a background PNG in my Android application. I would like to support lots of displays, but I have this one obstacle - many displays have many resolutions and aspect ratios. I would like to make sure my background is displayed properly, and to do it more elegantly than just creating 10+ cropped PNG files in Photoshop.
My idea would be to import one fairly large picture into the project. The app would find out the screen dimensions and simply specify start (x, y) and end points that "crop" the picture and display it without any distortion.
Is there a way of doing it?
I think Bitmap.createBitmap() is the method you're after. It simply lets you cut out a defined portion of a bitmap.
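A rough sketch, assuming source is your large background Bitmap, this runs inside an Activity, and you want a centered crop matching the screen (the centering math is just one choice):

DisplayMetrics dm = getResources().getDisplayMetrics();
int cropW = Math.min(dm.widthPixels, source.getWidth());
int cropH = Math.min(dm.heightPixels, source.getHeight());
int x = (source.getWidth() - cropW) / 2;  // center the crop horizontally
int y = (source.getHeight() - cropH) / 2; // and vertically
Bitmap background = Bitmap.createBitmap(source, x, y, cropW, cropH);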
I'm developing a fairly simple 2D game for Android using OpenGL ES. I want it to run on almost any device on the market. The problem I face is with the textures I map. If the image I want to map has a different size than its target object, the image quality is reduced (even when the image is bigger than the object) and it gets pixelated. I want very sharp images, and the only way to get them is to use textures and objects with the same dimensions. But in this case I'd need 4-5 textures representing the same image at different dimensions to support enough screen resolutions. I'm using PNG images and the following GL methods:
gl.glClear(GL10.GL_COLOR_BUFFER_BIT);
gl.glEnable(GL10.GL_TEXTURE_2D);
gl.glEnable(GL10.GL_BLEND);
gl.glBlendFunc(GL10.GL_SRC_ALPHA, GL10.GL_ONE_MINUS_SRC_ALPHA);
// drawing happens here
gl.glDisable(GL10.GL_BLEND);
There is no problem for me to create that many images, but the size of the game would grow. Am I doing anything wrong, or is it just that OpenGL cannot resize images like Photoshop?
Have you experimented with alternative sampling options, like GL_NEAREST or GL_LINEAR? (see glTexParameteri).
I don't think this is the fault of OpenGL, but when you display textures that are not at their native resolution (or an even multiple of it), there will always be some loss of exactness (because the image must be resampled at non-pixel boundaries). GL_LINEAR may alleviate some of the pixelated look at the expense of being slightly blurry, which may or may not be noticeable.
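For example, when you create/bind each texture, something along these lines selects linear sampling (a sketch using the GL10 API from the question; textureId is a placeholder):

gl.glBindTexture(GL10.GL_TEXTURE_2D, textureId);
gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MIN_FILTER, GL10.GL_LINEAR);
gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MAG_FILTER, GL10.GL_LINEAR);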
I've been trying to optimize my app with RGB_565 textures rather than RGBA_8888 textures. All is well except when I try to re-size these textures. Is there some hard-coded reason why I can't simply make these textures smaller?
I'm trying to keep all my textures larger than necessary so I can scale down, and thus dynamically scale the textures for larger screens to get better image quality. This doesn't seem to work with RGB_565, or I'm missing something :-?
Should I just create a few different copies and dynamically LOAD the correct one rather than SCALE it? Thanks for any help you can offer!
[EDIT]
The whole point here is that I can't dynamically re-size/scale these textures once they're loaded into the app. They stretch, as if pulled from the top right and bottom left of the screen, even when I try to reduce their size...?
It has to do with the GL blend functions that can be used. I didn't get into it and simply re-sized the textures statically for now, but when I do get into it again I'll be sure to post the code!