On emulators running Android 4.0 or 4.0.3, I am seeing horrible colour banding which I can't seem to get rid of. On every other Android version I have tested, gradients look smooth.
I have a SurfaceView which is configured as RGBX_8888, and the banding is not present in the rendered canvas. If I manually dither the image by overlaying a noise pattern at the end of rendering I can make the gradients smooth again, though obviously at a cost to performance which I'd rather avoid.
So the banding is being introduced later. I can only assume that, on 4.0+, my SurfaceView is being quantized to a lower bit-depth at some point between it being drawn and being displayed, and I can see from a screen capture that gradients are stepping 8 values at a time in each channel, suggesting a quantization to 555 (not 565).
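(For reference, the surface format is set roughly like this; this is a sketch rather than the exact code from my project, and the view lookup is a placeholder:)
SurfaceView surfaceView = (SurfaceView) findViewById(R.id.surface); // placeholder id
surfaceView.getHolder().setFormat(PixelFormat.RGBX_8888); // 8 bits per channel, no alpha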
I added the following to my Activity onCreate function, but it made no difference.
getWindow().setFormat(PixelFormat.RGBA_8888);
getWindow().addFlags(WindowManager.LayoutParams.FLAG_DITHER);
I also tried putting the above in onAttachedToWindow() instead, but there was still no change.
(I believe that RGBA_8888 is the default window format anyway for 2.2 and above, so it's little surprise that explicitly setting that format has no effect on 4.0+.)
Which leaves the question, if the source is 8888 and the destination is 8888, what is introducing the quantization/banding and why does it only appear on 4.0+?
Very puzzling. I wonder if anyone can shed some light?
Try this:
BitmapFactory.Options options = new BitmapFactory.Options();
options.inPreferredConfig = Bitmap.Config.ARGB_8888; // decode at full 32-bit depth instead of letting it drop to 565
Bitmap gradient = BitmapFactory.decodeResource(getResources(), R.drawable.gradient, options);
findViewById(R.id.main).setBackgroundDrawable(new BitmapDrawable(gradient)); // use the full-depth bitmap as the background
Turning on the emulator's "Use Host GPU" option fixed the color problems for me; credit goes to this answer: https://stackoverflow.com/a/17166234/1287459
In my case I was using Android 4.2.2.
I'm writing an app that requires displaying a semi-transparent PNG layer over a camera preview. Everything was fine until I wanted to publish it and make sure it works also on Android 2.x. It seems that on older versions of Android, the camera preview causes the drawable (in my case, a subclass of ImageView) to not show. When I get rid of the preview, it works just fine - the drawable is visible as it should. It works like this both on the emulator and on real devices.
Here is how it looks on Android 2.3:
and on 4.2.2:
I think there would be too much code to paste here, so I've isolated the problematic parts into a small project: http://krzeminski.it/wp-content/uploads/2013/09/DrawableTest.zip. The most interesting (and probably guilty) class is CameraPreview.
Also, I'm not sure why the preview itself doesn't work. I've read that on Android 2.x emulators the test image from the emulated camera is just plain white, so I assumed that's OK. However, my friend tested the app on his phone with Android 2.3 and the preview appeared to be plain black. I guess it's a subject for a separate question, but maybe you'll notice something in the code.
I've spent probably 2 days now to solve these two problems, so any clues would be really helpful. Thank you!
I ran into this issue a while back. I remember a post on SO recommending that you shouldn't use ImageView#getImageMatrix(). The reasoning given was:
public Matrix getImageMatrix():
Return the view's optional matrix. This is applied to the view's drawable when it is drawn. If there is no matrix, this method will return an identity matrix. Do not change this matrix in place but make a copy. If you want a different matrix applied to the drawable, be sure to call setImageMatrix().
Even after reading through this, I couldn't (and still don't) understand what difference it makes. I sorted the problem out by using:
Matrix matrix = new Matrix();
Maybe you're dealing with the same issue. Give this a try.
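For completeness, a minimal sketch of what that fix looks like in context (the ImageView lookup and the transform value are placeholders, not from my original code):
ImageView imageView = (ImageView) findViewById(R.id.image); // placeholder id
Matrix matrix = new Matrix(imageView.getImageMatrix()); // work on a copy, never modify the returned matrix in place
matrix.postScale(2f, 2f); // example transform; use whatever your view actually needs
imageView.setImageMatrix(matrix); // hand the modified copy back to the view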
I don't have an Android 2.3 device, but I think you can try two things:
1. Change the FrameLayout to a RelativeLayout.
2. Programmatically add the views:
mLayout.addView(mPreview);   // add the camera preview first so it sits at the bottom of the z-order
mLayout.addView(mImageView); // the semi-transparent overlay is added afterwards, so it is drawn on top
mLayout.addView(mTextView);
Could someone explain what is really happening when setting inDither = true in the context of configuring a bitmap in Android?
On developer.android.com one can read about the static field
Config.RGB_565
This configuration can produce slight visual artifacts depending on the configuration of the source. For instance, without dithering, the result might show a greenish tint. To get better results dithering should be applied
I had this problem until I followed this recommendation, that is:
options.inPreferredConfig = Config.RGB_565;
options.inDither = true;
So my question: how does one understand inDither in Android? It's one thing to know when to use a piece of syntax ... another to fully understand it.
Thanks in advance!
When the number of supported colors is low, moving from one color to another (a gradient) produces visible bands, because there are fewer steps in between.
Dithering reduces this by adding noise at the color steps: with dither, noise built from the available colors gives the illusion of colors that aren't available.
RGB_565 has lower precision (2 bytes per pixel) than ARGB_8888 (4 bytes per pixel). Because of its smaller color range, RGB_565 bitmaps can show banding. Hence the dither flag is used to improve the perceived image and give an illusion of more colors.
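As an illustration, this is roughly how the flag is applied when decoding (the resource id and variable name are placeholders):
BitmapFactory.Options options = new BitmapFactory.Options();
options.inPreferredConfig = Bitmap.Config.RGB_565; // 16-bit: 5 bits red, 6 bits green, 5 bits blue
options.inDither = true; // let the decoder add noise instead of hard 565 steps
Bitmap bitmap = BitmapFactory.decodeResource(getResources(), R.drawable.gradient, options);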
In a custom view I have, I'm using Canvas.drawBitmap with a source Bitmap that is wider than 2048px. This of course causes problems when hardware acceleration is enabled, with LogCat spewing out "W/OpenGLRenderer(4968): Bitmap too large to be uploaded into a texture" each time drawBitmap is called.
So to work around this I tried calling setLayerType(View.LAYER_TYPE_SOFTWARE, null) on my view. The only problem is that it doesn't seem to help. When I try to run my app, LogCat will still give the warnings, and nothing ends up being drawn.
I cannot understand why is this so. This article on the Android site clearly states:
You can disable hardware acceleration for an individual view at runtime with the following code:
myView.setLayerType(View.LAYER_TYPE_SOFTWARE, null);
Is there something I'm doing wrong? Why is Android not respecting the setLayerType call?
[in case it matters: I'm performing my testing on a Samsung Galaxy Tab 10.1 running Android 3.2]
Try to resize the bitmap first. Use createBitmap.
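For example, something along these lines, using createScaledBitmap (a convenience wrapper around createBitmap). This is only a sketch: the 2048 px limit comes from the warning in the question, and the source bitmap and the Canvas from onDraw are assumed names.
int maxSize = 2048; // the GL texture limit the warning refers to; other devices may differ
if (source.getWidth() > maxSize || source.getHeight() > maxSize) {
    float scale = Math.min((float) maxSize / source.getWidth(), (float) maxSize / source.getHeight());
    source = Bitmap.createScaledBitmap(source, Math.round(source.getWidth() * scale), Math.round(source.getHeight() * scale), true); // filter = true for smoother downscaling
}
canvas.drawBitmap(source, 0, 0, null);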
I've run into issues with banding of my PNG files. Digging into the problem has yielded two solutions. Both make sense individually, but together they don't. The solutions I've discovered:
1) Move the PNG file into the "raw" folder. This prevents AAPT from "optimizing" the image, which results in banding.
2) Change the pixel format of your Activity's window to RGBA_8888 (i.e. in onCreate add the line "getWindow().setFormat(PixelFormat.RGBA_8888)", as sketched below). On Android 2.2 and lower the default pixel format is 16-bit (565).
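A minimal sketch of solution 2, for reference (the layout resource name is a placeholder):
@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    getWindow().setFormat(PixelFormat.RGBA_8888); // ask for a 32-bit window surface instead of the 16-bit default
    setContentView(R.layout.main); // placeholder layout
}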
I have tried both of these and they correct the banding effect in my images, however now I am even more confused as to what Android is doing.
On the one hand, if I leave my PNG in the drawable folder it is "optimized" which results in a banding effect in the image. It magically goes away when I change the pixel format to 32-bit. If the image was "optimized" though, I would have expected the banding to remain.
On the other hand, if I move the PNG to the raw folder it will retain the nice gradient and display nicely even though the pixelFormat is supposedly 16-bit.
If anyone has any insight into what is going on I would appreciate it.
Thanks,
-Dan
I believe it's quite simple:
You have to think of the pixel format of your Activity (RGBA_8888) as a DEFAULT optimization for your bitmaps.
If it is not set then, prior to 2.2, your bitmap will by default be compressed to RGB_565.
But if you were to create a bitmap programmatically and set it to RGBA_8888, it would be used as such by the app.
The same applies when you put your bitmap in the raw folder: it will not be compressed at all and will be used as is, even though the default PixelFormat is still RGB_565.
I have some .png files in my app. I need to load these during runtime, and get the exact colors of certain pixels from them. It's important, that I do not want to scale these pictures. I don't show them on the UI directly, they serve as maps.
Now, on Android 1.5, there's no problem with this. I put these images in the '/res/drawable' dir, load them with BitmapFactory into a Bitmap object, and use it to get the color of the desired pixels. E.g. pixel (100, 50) has the color RGB(100, 1, 100).
On Android 2.2 though, the same procedure results in varying colors (for the same pixel), so I get RGB(99, 3, 102) / RGB(101, 2, 99) / etc. for the same (100, 50) pixel. I checked the resolution of the Bitmap object, and it seems it didn't get scaled.
Could somebody explain why I get distorted colour values?
Solved: It appears that on Android 2.2 I have to set the bitmap configuration explicitly. Somehow versions below 2.2 managed to do this (or maybe fewer configs are supported on those and the system guessed the config correctly; I don't know).
Anyways, here's the code I use now:
BitmapFactory.Options opt = new BitmapFactory.Options();
opt.inDither = false; // don't let the decoder add noise to the pixel values
opt.inPreferredConfig = Bitmap.Config.ARGB_8888; // decode at full 32-bit depth so colour values stay exact
Bitmap mask = BitmapFactory.decodeResource(getResources(), R.drawable.picture, opt);
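As a quick check, reading the pixel back now gives the exact values from the example above:
int color = mask.getPixel(100, 50);
int red = Color.red(color);     // 100
int green = Color.green(color); // 1
int blue = Color.blue(color);   // 100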
Make yourself a bitmap that is entirely the same color as the pixel in question, with the same resolution as the one you're currently using. Load it up and check the RGB values of the same pixel (or any pixel) you are having problems with.
This should tell you whether your problem is scaling (which is what I think it is) or possibly a problem in the color translation.
If you don't find an answer quickly, my pragmatist streak would ask how hard it is to parse the .png yourself, to get completely deterministic results independent of any changes in the platform.
My 2.3 devices (Nexus One and S) worked fine without setting "opt.inPreferredConfig", but it appears that 2.2 requires it for accurate RGBs.