I've been trying to optimize my app by using RGB_565 textures rather than RGBA_8888 textures. All is well except when I try to resize these textures. Is there some hard-coded reason why I can't simply make these textures smaller?
I'm trying to keep all my textures larger than necessary so I can scale down, letting me dynamically scale the textures for larger screens and get better image quality. This doesn't seem to work with RGB_565, or I'm missing something.
Should I just create a few different copies and dynamically LOAD the correct one rather than SCALE it? Thanks for any help you can offer!
[EDIT]
The whole point here is that I can't dynamically resize/scale these textures once they're loaded into the app. When I try to reduce their size, they stretch as if pulled from the top-right and bottom-left of the screen.
It has to do with the GL blend functions that can be used. I didn't get into it and simply resized the textures statically for now, but when I do get into it again I'll be sure to post the code!
I've been having problems with large images being resized for UI use in Android.
Look at this image; it's an ImageView:
The original image (that arc is a progress bar) is around 10 times bigger than what you see here. In UWP (Windows platform) we had no problem using a very large image, but here on Android I believe the nearest-neighbour method is used for fitting images into UI elements, which, as you can see, causes sharp edges.
Is there any way to switch to another method, like bicubic? It happens in all the Android versions I've tested (4.1, 5.0, 6.0).
Just to mention, I'm using Xamarin 4, which I don't believe is a contributing factor here.
I've had no luck searching the internet; I'm afraid I'm the only one having this problem.
Thanks.
As mentioned in the other answer, you should prefer a vector image to a pixel image.
But if you have to use a pixel image, you could use BitmapRegionDecoder to decode the image in strips and write your own resampling algorithm (such as bilinear interpolation, which is much better than nearest neighbour) to resize the image, typically on the JNI side.
Another possible approach, since your original image does not cause OOM issues, is to use the "filter" parameter when calling the Bitmap.createBitmap method: just set it to true, and it helps reduce the artifacts.
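A minimal sketch of that last suggestion (the source bitmap variable and the 10% scale factor are placeholders for illustration, not from the answer):
Matrix matrix = new Matrix();
matrix.postScale(0.1f, 0.1f); // e.g. shrink the oversized source to 10%
// The trailing boolean is the "filter" parameter: true enables bilinear filtering.
Bitmap scaled = Bitmap.createBitmap(source, 0, 0,
        source.getWidth(), source.getHeight(), matrix, true);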
You should use vector images instead of bitmap images.
Bitmap vs. Vector
A bitmap represents an image as a grid of colored pixels, whereas a vector image is described by geometric shapes (lines, curves) and colors.
The main advantage of a vector image is that it can be scaled without losing definition.
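As a rough illustration in plain Android Java (the resource name R.drawable.progress_arc and the ImageView are assumptions; the Xamarin equivalent is analogous):
// Sketch: load a vector asset with the support library so it also works pre-5.0,
// then hand it to the ImageView; it scales to any size without jagged edges.
VectorDrawableCompat arc = VectorDrawableCompat.create(
        getResources(), R.drawable.progress_arc, getTheme());
imageView.setImageDrawable(arc);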
I'm having trouble cleanly down-scaling images on Android. I'm looking to scale small PNG images to arbitrary sizes between about 10% and 100% of their original size.
I've created a sample image to demonstrate the problem and exacerbate the unusual behaviors I'm seeing in Android's image scaler:
The above image is a screenshot from an Android device with some annotations added. I've also added the same images in a second column on the left side showing how they are rendered with a linear scaling by "The GIMP" (GNU Image Manipulation Program).
The base image consists of a checkerboard pattern background of red and blue pixels. On that background I've drawn some 1px-wide yellow lines and fairly thin green text. The image is 288x288 pixels.
When scaling the image to 1/3 of its original dimensions, Android seems to simply grab one in nine pixels, throwing out all other data. Some of the yellow lines disappear entirely as a result. Remarkably, the checkerboard pattern remains intact (which is simply a result of every 3rd pixel being used).
When scaling the image to a dimension of near-but-not-exactly 50% of its original size, e.g., 142x142 or 143x143, the scaler creates some fairly large anomalies/artifacts on the image.
At 50% size (144x144), the image looks correct.
The test image does bring out the worst of the image scaler, but "normal" PNG icon images are severely impacted as well. From 10-33% or so the images aren't properly resampled, and thus appear extremely "bitmapped". And certain larger size images have very strange anomalies in them at certain sizes.
If anyone knows a means to disable this strange scaling behavior, even at a performance cost, I'd greatly appreciate knowing about it. It can certainly be solved by writing an algorithm that works directly on the pixels of bitmaps, but I'm hopeful that isn't the only option.
Also noteworthy: all image work is being done with ARGB_8888 Bitmap.Configs. I've tried manipulating image size by setting maxWidth/maxHeight on ImageViews, by using Bitmap.createScaledBitmap(), and by using Bitmap.createBitmap() with a Matrix. All attempts produce the same result. Bitmap filtering is enabled.
Thanks again for any suggestions!
Using Bitmap.createScaledBitmap() and Bitmap.createBitmap() with a Matrix amounts to the same thing; see the source for Bitmap.createScaledBitmap (which hasn't changed since Android 2).
On Android 4.0+, using a matrix (as Bitmap.createScaledBitmap does) allows hardware-accelerated operations when enabled (enabled by default on 4.1+, IIRC), so we don't have direct control over what is being done or how it is done.
That means you'll have to implement your own scaling method with the desired (here, linear) filtering, either by processing the pixels yourself or by using OpenGL ES with the right filter, though that may not be available on all devices.
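One lighter-weight workaround (not the pixel-level approach described above, just a common approximation, with a hypothetical helper name) is to halve the bitmap repeatedly with filtering enabled before the final resize, so each filtered step discards only a little information:
// Hypothetical helper: progressive halving approximates a box filter and
// avoids the worst of the pick-one-pixel-in-nine artifacts.
static Bitmap downscale(Bitmap src, int dstWidth, int dstHeight) {
    Bitmap current = src;
    while (current.getWidth() / 2 >= dstWidth
            && current.getHeight() / 2 >= dstHeight) {
        current = Bitmap.createScaledBitmap(
                current, current.getWidth() / 2, current.getHeight() / 2, true);
    }
    // Final step to the exact target size, still with bilinear filtering.
    return Bitmap.createScaledBitmap(current, dstWidth, dstHeight, true);
}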
I'm developing a fairly simple 2D game for Android using OpenGL ES, and I want it to run on almost any device on the market. The problem I face is with the textures I map: if the image I want to map has a different size than its target object, the image quality is reduced (even when the image is bigger than the object) and it gets pixelated. I want very sharp images, and the only way I get them is to use textures and objects with the same dimensions. But in that case I need four or five textures representing the same image at different dimensions to support enough screen resolutions. I'm using PNG images and the following GL methods:
glClear(GL10.GL_COLOR_BUFFER_BIT);
glEnable(GL10.GL_TEXTURE_2D);
glEnable(GL10.GL_BLEND);
glBlendFunc(GL10.GL_SRC_ALPHA, GL10.GL_ONE_MINUS_SRC_ALPHA);
//drawing happens here
glDisable(GL10.GL_BLEND);
It's no problem for me to create that many images, but the size of the game will grow. Am I doing anything wrong, or is it just that OpenGL cannot resize images the way Photoshop does?
Have you experimented with alternative sampling options, like GL_NEAREST or GL_LINEAR? (See glTexParameteri.)
I don't think this is OpenGL's fault, but when you display textures that are not at their native resolution (or an even multiple of it), there will always be some loss of exactness, because the image must be resampled at non-pixel boundaries. GL_LINEAR may alleviate some of the pixelated look at the expense of being slightly blurry, which may or may not be noticeable.
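For context, a minimal GL ES 1.x sketch (matching the GL10 calls in the question; gl and textureId are assumed to come from your existing setup) that switches both filters to linear:
// Set the minification/magnification filters right after binding the texture.
gl.glBindTexture(GL10.GL_TEXTURE_2D, textureId);
gl.glTexParameterf(GL10.GL_TEXTURE_2D,
        GL10.GL_TEXTURE_MIN_FILTER, GL10.GL_LINEAR);
gl.glTexParameterf(GL10.GL_TEXTURE_2D,
        GL10.GL_TEXTURE_MAG_FILTER, GL10.GL_LINEAR);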
So I'm trying to make a game with just a simple static background at the moment, but when I draw it to the screen (no scaling is being done, as the resolution of the image is the same as the screen), the bottom portion of the image is drawn incorrectly: the bottom few hundred pixels of the image are exactly the same going down the image. Sorry it's so difficult to explain, but being new here I can't actually post an image of what is going wrong.
Right now I'm just using a simple sprite to render this background image. Here is the code being used:
// background layer: another image
background = CCSprite.sprite("WaterBackground.png");
// position the background at the center of the screen
background.setPosition(CGPoint.make(GAME_WIDTH/2, GAME_HEIGHT/2));
addChild(background);
Am I doing something wrong here? Does Cocos2D not support such large images (800x1280) for sprites?
Any help would be greatly appreciated!
Since I am now able to upload images, here are visuals of what is going wrong:
And the problem in my game:
As you can see, the problem is hard to describe. It only exists with this larger image; I scaled it down manually in GIMP and then scaled it up in the game, and it looked fine (except for being a lower resolution). I also tried scaling down the larger image, which still resulted in the same problem. Hopefully you guys can help me, as I have no clue what could possibly cause this error, especially since I read that Cocos2D's maximum supported image size is 2048x2048 and my image is well within that.
Thanks for any help you guys can provide!
This is due to limitations on the size of textures. Cocos2d-android supports images with a maximum size of 1024x1024 pixels.
I faced the same problem and was looking for a way to solve it.
EDIT
I found the solution
In the cocos2d project, open the file CCTexture2d.java in the org.cocos2d.opengl package and change kMaxTextureSize from 1024 to 2048.
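The edit amounts to something like the following (the exact declaration in CCTexture2d.java varies between cocos2d-android versions, so treat this as approximate):
// in org.cocos2d.opengl.CCTexture2d (cocos2d-android source)
static final int kMaxTextureSize = 2048; // was 1024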
I'm not certain, as from your code and from looking at the cocos2d code I can't see a definite reason why this would be happening, but given the number of sprites you've got on the screen, I'd definitely take a look at my answer to this question, as you may well be hitting one of cocos2d's quirky little rendering faults around multiple sprites. It can't hurt to try sprite sheets, and it's certainly the correct way to do sprites in cocos.
Also, definitely create yourself a helper utility to determine the scaling ratio of a device compared to your original image sizes. Unlike on iPhone, Android has a practically unlimited variety of screen resolutions, and a simple bit of "scale as you load" utility code now will save you lots of time in the future if you want to run this on any other device.
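As a rough sketch of that kind of helper (DESIGN_WIDTH/DESIGN_HEIGHT and the method name are hypothetical; the CCDirector call assumes cocos2d-android):
// Hypothetical "scale as you load" helper: compare the device screen size
// to the resolution the art was originally authored for.
private static final float DESIGN_WIDTH = 800f;   // assumed authoring size
private static final float DESIGN_HEIGHT = 1280f;

static float screenScale() {
    CGSize winSize = CCDirector.sharedDirector().winSize();
    float scaleX = winSize.width / DESIGN_WIDTH;
    float scaleY = winSize.height / DESIGN_HEIGHT;
    return Math.min(scaleX, scaleY); // keep the aspect ratio; use max to fill instead
}
// Usage when loading a sprite: sprite.setScale(screenScale());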
You could try making a single method for adding the background; I do it like this, and hopefully it will be helpful for you. Here addBackGroundScene is my method for adding the background, and scrXScaleFactor handles the scaling for my screen:
private void addBackGroundScene() {
    // addBackgroundAtOrigin is my own helper that creates the background sprite
    CCSprite bgForLevel1 = addBackgroundAtOrigin("prelevel1/bgMenu.jpg");
    bgForLevel1.setScaleX(scrXScaleFactor); // scale horizontally to fit the screen
    bgForLevel1.setAnchorPoint(0, 0);       // anchor at the bottom-left corner
    addChild(bgForLevel1);
}
I don't have any problem managing textures, but I haven't worked much with loading textures from images. All I know is that the texture needs to be 2^i by 2^i in size.
What's the best technique for loading an arbitrary image into a texture? If the image is not square, I can fit it into the square and add two black bars to fill what is missing, but I'm not sure how to do the stretching.
So, if I have an image of, let's say, 800x600 and I want to put it in a 512x512 square, what's the best trick for copying the pixels into the texture? Or, specifically on Android, are there functions that would do that for me? In short, I want to resize the 800x600 image to 512x384 and put it in the texture, while preserving as much information as I can.
The OP answered his own question with:
SOLVED: a friend showed me some references that cover what I want.
To resize, you can do it with Android's Bitmap class. You can specify a Matrix, just like the one used in OpenGL, to resize the Bitmap. From that point, I expect the pixels to be well preserved and I'll be able to put them in the texture.
If they come back and post the answer here themselves and accept it, I will delete this answer.
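For reference, a minimal sketch of that approach (the original bitmap variable, the 512x384 target, and the Canvas padding step are assumptions for illustration): scale with filtering, pad into a 512x512 power-of-two bitmap, then upload it with GLUtils.
// Scale 800x600 down to 512x384 with bilinear filtering.
Bitmap scaled = Bitmap.createScaledBitmap(original, 512, 384, true);

// Pad into a 512x512 power-of-two bitmap; the unused strip is filled with black.
Bitmap potBitmap = Bitmap.createBitmap(512, 512, Bitmap.Config.ARGB_8888);
Canvas canvas = new Canvas(potBitmap);
canvas.drawColor(Color.BLACK);
canvas.drawBitmap(scaled, 0, 0, null); // image occupies the top 512x384 region

// Upload to the currently bound texture.
GLUtils.texImage2D(GL10.GL_TEXTURE_2D, 0, potBitmap, 0);
potBitmap.recycle();
scaled.recycle();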