How to create a view cache in RGB565? - android

On Android 2.2 the default view drawing cache is ARGB_8888.
How can I create a view's cache in RGB565?
Thanks!

The simplest way is to drop the least significant bits of each channel, i.e.:
newR = R >> 3; // keep 5 bits
newG = G >> 2; // keep 6 bits
newB = B >> 3; // keep 5 bits
However, I suppose a more thorough method might include some form of dithering, to minimise banding.
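For illustration, here's a minimal sketch of packing an ARGB_8888 pixel into RGB565 using exactly those shifts (plain Java, no particular Android API assumed):

public final class Rgb565 {
    // Packs a 32-bit ARGB_8888 pixel into a 16-bit RGB565 value by dropping
    // the least significant bits: 5 bits red, 6 bits green, 5 bits blue.
    public static short fromArgb8888(int pixel) {
        int r = (pixel >> 16) & 0xFF;
        int g = (pixel >> 8) & 0xFF;
        int b = pixel & 0xFF;
        return (short) (((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
    }
}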

Related

Tensorflow lite object detection, Android Studio, ssd mobilenet v2, same structure different tflite file but almost 0 detection

I want to make an object detection application based on this GitHub project: https://github.com/bendahouwael/Vehicle-Detection-App-Android.
That GitHub code uses a tflite model based on ssd mobilenet v1, so I made my own custom model based on ssd mobilenet v2. I followed this link https://colab.research.google.com/drive/1qXn9q6m5ug7EWJsJov6mHaotHhCUY-wG?usp=sharing to make my own TFLITE model.
From https://netron.app/ I checked the model structure; both are almost the same. Please see the pictures below.
The first picture shows the SSD MOBILENET V1 structure.
The second picture shows my own custom model based on SSD MOBILENET V2.
I think both models' structures are the same, so I just pasted my own model into the app code (into the asset folder) along with the label txt file.
The application showed the real-time camera image well, but it did not detect the objects I wanted it to detect. I know the ssd mobilenet v1 model type is uint8 and my own model (based on ssd mobilenet v2) is float32, but I guess this is not the problem because the code has a setting for whether the model is quantized or not.
So if anyone has any ideas, please tell me why my application works so badly.
ps1) I forgot to mention debugging: it did not show any error messages, which makes this much harder to work on.
If you look closely at the INPUT part:
with MobileNet V1 you have: type: uint8[1, 300, 300, 1]
with MobileNet V2 you have: type: float[1, 300, 300, 1]
This means the first model is quantized (more info: here) and uses integer values for its weights and biases (this is done for inference speed).
Now if you go to your TFLite Object Detection class (it may be named differently), you will usually have a method called recognizeImage(), similar to this (this is the part where you fill the ByteBuffer):
imgData.rewind();
for (int i = 0; i < inputSize; ++i) {
    for (int j = 0; j < inputSize; ++j) {
        int pixelValue = intValues[i * inputSize + j];
        if (isModelQuantized) {
            // Quantized model
            imgData.put((byte) ((pixelValue >> 16) & 0xFF));
            imgData.put((byte) ((pixelValue >> 8) & 0xFF));
            imgData.put((byte) (pixelValue & 0xFF));
        } else { // Float model
            imgData.putFloat((((pixelValue >> 16) & 0xFF) - IMAGE_MEAN) / IMAGE_STD);
            imgData.putFloat((((pixelValue >> 8) & 0xFF) - IMAGE_MEAN) / IMAGE_STD);
            imgData.putFloat(((pixelValue & 0xFF) - IMAGE_MEAN) / IMAGE_STD);
        }
    }
}
where:
private static final float IMAGE_MEAN = 128.0f;
private static final float IMAGE_STD = 128.0f;
So in the first case set isModelQuantized = true, and for MobileNet V2 set isModelQuantized = false.
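If you'd rather not hard-code that flag, one option is to query the input tensor's type at runtime; this is a minimal sketch assuming the standard org.tensorflow.lite Java API (the ModelTypeCheck helper name is made up):

import org.tensorflow.lite.DataType;
import org.tensorflow.lite.Interpreter;

public final class ModelTypeCheck {
    // A quantized SSD MobileNet model exposes a uint8 input tensor,
    // while the float model exposes float32.
    public static boolean isQuantized(Interpreter interpreter) {
        return interpreter.getInputTensor(0).dataType() == DataType.UINT8;
    }
}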

Using Hardware Layer in Custom View onDraw

So I'm trying to understand how I can properly use hardware acceleration (when available) in a custom View that is persistently animating. This is the basic premise of my onDraw():
canvas.drawColor(mBackgroundColor);
for (Layer layer : mLayers) {
    canvas.save();
    canvas.translate(layer.x, layer.y);
    // Draw that number of images in a grid, offset by -1
    for (int i = -1; i < layer.xCount - 1; i++) {
        for (int j = -1; j < layer.yCount - 1; j++) {
            canvas.drawBitmap(layer.bitmap, layer.w * i, layer.h * j, null);
        }
    }
    // If the layer's x has moved past its width, reset back to a seamless position
    layer.x += ((difference * layer.xSpeed) / 1000f);
    float xOverlap = layer.x % layer.w;
    if (xOverlap > 0) {
        layer.x = xOverlap;
    }
    // If the layer's y has moved past its height, reset back to a seamless position
    layer.y += ((difference * layer.ySpeed) / 1000f);
    float yOverlap = layer.y % layer.h;
    if (yOverlap > 0) {
        layer.y = yOverlap;
    }
    canvas.restore();
}
// Redraw the view
ViewCompat.postInvalidateOnAnimation(this);
I'm enabling hardware layers in onAttachedToWindow() and disabling them in onDetachedFromWindow(), but I'm trying to understand whether or not I'm actually using it. Essentially, the i/j loop that calls drawBitmap() never changes; the only thing that changes is the Canvas translation. Is the Bitmap automatically saved to the GPU as a texture behind the scenes, or is there something I need to do manually to do so?
On what view(s) are you setting View.LAYER_TYPE_HARDWARE exactly? If you are setting a hardware layer on the view that contains the drawing code shown above, you are causing the system to do a lot more work than necessary. Since you are only drawing bitmaps you don't need to do anything here. If you call Canvas.drawBitmap() the framework will cache the resulting OpenGL texture on your behalf.
You could however optimize your code a little more. Instead of calling drawBitmap(), you could use child views. If you move these children using the offset*() methods (or setX()/setY()) the framework will apply further optimizations to avoid calling the draw() methods again.
In general, hardware layers should be set on views that are expensive to draw and whose content won't change often (so pretty much the opposite of what you're doing :)
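A minimal sketch of the child-view idea mentioned above, assuming the custom view can be restructured as a ViewGroup (LayerContainer, addLayer and moveLayer are hypothetical names):

import android.content.Context;
import android.graphics.Bitmap;
import android.widget.FrameLayout;
import android.widget.ImageView;

public class LayerContainer extends FrameLayout {
    public LayerContainer(Context context) {
        super(context);
    }

    // One ImageView per layer; the bitmap is drawn (and its texture cached) once.
    public ImageView addLayer(Bitmap bitmap) {
        ImageView child = new ImageView(getContext());
        child.setImageBitmap(bitmap);
        addView(child);
        return child;
    }

    // Called once per animation frame: repositioning a child with setX()/setY()
    // lets the framework avoid re-running its draw() method.
    public void moveLayer(ImageView child, float x, float y) {
        child.setX(x);
        child.setY(y);
    }
}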
You can use Android's Tracer for OpenGL ES to see whether your view issues OpenGL commands.
From developer.android.com
Tracer is a tool for analyzing OpenGL for Embedded Systems (ES) code in your Android application. The tool allows you to capture OpenGL ES commands and frame by frame images to help you understand how your graphics commands are being executed.
There is also a tutorial about Android Performance Study by Romain Guy which describes its use almost step by step.

Undo/Redoing adjustments to whole image (Android)

Note:
Please bear with me, this (imho) is not a duplicate of the dozen questions asking about undoing in paint/draw scenarios.
Background:
I've been developing an image processing application, using Processing for Android and now I'm trying to implement a simple, one-step undo/redo functionality.
My initial (undo-friendly) idea was to apply the adjustments to the downsampled preview image only, keep an array of adjustment actions, and apply them to the original image at save time. I had to scrap this idea for two reasons:
some of the actions take a few seconds to finish, and if we have a few of these, it will make the already slow saving process tediously slower.
some actions (e.g. color-noise reduction) produce drastically different (wrong) results when applied to the downsampled image instead of the full-sized image. But anyways this is a less serious problem...
So I decided to go with storing the before/after images.
Problem:
Unfortunately buffering the images in memory is not an option because of memory limitations. So what I'm doing at the moment is saving the before/after images to internal storage.
But that creates a performance/quality dilemma:
jpeg is fast (i.e. ~500ms to save on my Xperia Arc S) but degrades the quality beyond acceptability after two/three iterations.
png is of course lossless, but is super slow (~7000ms to save) which makes it impractical.
bmp I guess would probably be fast, but android does not encode bmp (I think processing for android saves "file.bmp" as tiff).
tiff has somewhat acceptable performance (~1500ms to save), but android does not decode tiff.
I also tried writing the raw pixel array to a file using this function:
void writeData(String filename, int[] data) {
    try {
        DataOutputStream dos = new DataOutputStream(
                new BufferedOutputStream(openFileOutput(filename, Context.MODE_PRIVATE)));
        for (int i = 0; i < data.length; i++) {
            dos.writeInt(data[i]);
        }
        dos.close();
    } catch (IOException e) {
        e.printStackTrace();
    }
}
but it takes above 2000ms to finish, so I gave up on it for now.
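(Presumably much of that cost is the per-pixel writeInt() call; an untested sketch of the same write done in bulk through a ByteBuffer and FileChannel, assuming it lives in the same Activity as writeData(), would be:)

// Needs java.nio.ByteBuffer, java.io.FileOutputStream and java.io.IOException imports.
void writeDataBulk(String filename, int[] data) {
    try {
        // Pack all pixels into one buffer and hand it to the channel in a single
        // call instead of issuing one writeInt() per pixel.
        FileOutputStream fos = openFileOutput(filename, Context.MODE_PRIVATE);
        ByteBuffer buffer = ByteBuffer.allocate(data.length * 4);
        buffer.asIntBuffer().put(data);
        fos.getChannel().write(buffer);
        fos.close();
    } catch (IOException e) {
        e.printStackTrace();
    }
}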
Questions:
Is there a faster way of writing/reading the data for this purpose?
...or should I go back to the initial idea and try to solve its problems as much as possible?
Any other suggestions?
Update:
I came up with this method to write the raw data:
void saveRAW2(String filename) {
    byte[] bytes = new byte[orig.pixels.length * 3];
    orig.loadPixels(); // orig = my original PImage, duh!
    int index = 0;
    for (int i = 0; i < bytes.length; i++) {
        bytes[i++] = (byte) ((orig.pixels[index] >> 16) & 0xff);
        bytes[i++] = (byte) ((orig.pixels[index] >> 8) & 0xff);
        bytes[i]   = (byte) (orig.pixels[index] & 0xff);
        index++;
    }
    saveBytes(filename, bytes);
}
...and it takes less than 1000ms to finish.
It runs 3 times faster than that if I write the file to the SD card, but I guess I can't count on that being the same on every phone, right?
Anyways, I'm using this method to read the saved data back into orig.pixels:
void loadRAW(String filename) {
    byte[] bytes = loadBytes(filename);
    int index = 0;
    int count = bytes.length / 3;
    for (int i = 0; i < count; i++) {
        orig.pixels[i] =
                0xFF000000 |
                (bytes[index++] & 0xff) << 16 |
                (bytes[index++] & 0xff) << 8 |
                (bytes[index++] & 0xff);
    }
    orig.updatePixels();
}
This takes ~1500ms to finish. Any ideas for optimizing that?
I'd recommend finding out where the Android "scratch disk" area is and processing your images as tiles, caching them on that scratch disk. This might be a bit slower than straight memory use, but it means you can do your image editing without running into memory limitations, and (provided Android's SDK has a sensible API) writing the tiles to a full file shouldn't take incredibly long. That said, you've kind of moved from Processing to plain Java, so the question isn't really about Processing anymore... and my answer is probably not as good as one from someone who's intimately familiar with the Android SDK.
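A rough sketch of that tiling idea, assuming the app's cache directory (Context.getCacheDir()) serves as the scratch disk; the tile size and file naming here are arbitrary:

// Splits a bitmap into tiles and caches each tile's raw pixels on disk, so the
// full-size image never has to stay in memory.
void cacheTiles(Bitmap source, File cacheDir, int tileSize) throws IOException {
    for (int y = 0; y < source.getHeight(); y += tileSize) {
        for (int x = 0; x < source.getWidth(); x += tileSize) {
            int w = Math.min(tileSize, source.getWidth() - x);
            int h = Math.min(tileSize, source.getHeight() - y);
            int[] pixels = new int[w * h];
            source.getPixels(pixels, 0, w, x, y, w, h);

            FileOutputStream fos = new FileOutputStream(
                    new File(cacheDir, "tile_" + x + "_" + y + ".raw"));
            ByteBuffer buffer = ByteBuffer.allocate(pixels.length * 4);
            buffer.asIntBuffer().put(pixels);
            fos.getChannel().write(buffer);
            fos.close();
        }
    }
}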

How do you get the selection color in Android?

I need to get the selection color used by Android to draw ListView and EditText selection. I know these controls use selectors to draw their states, but I've written a few widgets that I want to match the selection color of the platform they are running on, and the drawables that are defined can't do that because they use 9-patch images instead of colors.
I've looked all through the Android source and haven't found a color selector or color constant I can use to get the color I'm looking for.
You could try this:
android:background="?android:attr/selectableItemBackground"
You can get all the resources from here:
android-sdk-windows\platforms\android-<Desired API Level>\data\res\drawable-hdpi
Or you can use the drawable directly like this:
android:background="@android:drawable/list_selector_background"
or like this:
Drawable d = getResources().getDrawable(android.R.drawable.list_selector_background);
Take a look at this answer. You can browse the platform colors and styles at the Android GitHub mirror. Please note these values can be different for every version of Android. If you want to use platform styles, just don't create your own custom selectors for the controls. Hope it will help.
You could just get the list_selector_background drawable, as explained by Jignesh, and then find its average color as shown in this answer (I'd do it in your initialization code so you don't waste time processing it on every draw, but hey, that's premature optimization). That should be consistent enough with the theme to let your widgets match as needed.
Your code could look like this:
public static int getPlatformSelectionColor(Context context) {
    Bitmap bitmap = BitmapFactory.decodeResource(context.getResources(),
            android.R.drawable.list_selector_background);
    long redBucket = 0;
    long greenBucket = 0;
    long blueBucket = 0;
    long pixelCount = 0;
    for (int y = 0; y < bitmap.getHeight(); y++) {
        for (int x = 0; x < bitmap.getWidth(); x++) {
            int pixel = bitmap.getPixel(x, y); // packed ARGB int
            pixelCount++;
            redBucket += Color.red(pixel);
            greenBucket += Color.green(pixel);
            blueBucket += Color.blue(pixel);
            // does alpha matter?
        }
    }
    return Color.rgb((int) (redBucket / pixelCount),
            (int) (greenBucket / pixelCount),
            (int) (blueBucket / pixelCount));
}
You can easily achieve this for Google-owned devices but not for other manufacturers, as most manufacturers have overridden the default colors, layouts, backgrounds, etc. for almost all versions of Android, including Jelly Bean.
So ideally it's not recommended, and it is also tough to follow every manufacturer's design guidelines.

How can I check the status bar colour in Android?

How can I check the status bar colour in Android?
Why do I need to check this? I have created my status bar icons as per the design guidelines; however, on some devices (e.g. the Samsung Galaxy S running Android 2.1) the status bar is black.
The recommended status bar icons look great on Android 2.3 (Nexus S) and in the emulator (Android 2.1 and earlier) with the default light grey status bar. However, the black icons recommended for pre-2.3 don't look clear on the Samsung Galaxy S.
I would like to provide a white icon if the device is running Android 2.1 or 2.2 with a black status bar.
Basically the problem is that the Android design guidelines don't really cover phones that changed the UI, like the Samsung Galaxy S. For example, the Samsung Galaxy S running Android 2.1 should have a light grey status bar but it has a black one, which doesn't fit with Google's design guidelines.
I've developed the following method to estimate the lightness of the status bar background. It actually gets the background color of a status bar item, but I assume the whole status bar should have a similar color. I use this method to decide whether to load the black or the white version of my status notification icon.
/**
 * Returns an estimated value of the lightness of the status bar background.
 * @return lightness in the range [0..255]
 */
private int getStatusBarBackgroundLightValue() {
    // better this than nothing
    Drawable bg = getResources().getDrawable(android.R.drawable.status_bar_item_background);
    int height = Math.max(1, bg.getIntrinsicHeight());
    int width = Math.max(1, bg.getIntrinsicWidth());
    Bitmap bitmap = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
    Canvas canvas = new Canvas(bitmap);
    bg.setBounds(0, 0, width, height);
    bg.draw(canvas);
    long sum = 0;
    for (int x = 0; x < width; x++) {
        for (int y = 0; y < height; y++) {
            int color = bitmap.getPixel(x, y);
            int r = (color >> 16) & 0xFF;
            int g = (color >> 8) & 0xFF;
            int b = color & 0xFF;
            // HSL lightness of this pixel: average of the min and max channel
            int max = Math.max(r, Math.max(g, b));
            int min = Math.min(r, Math.min(g, b));
            int l = (min + max) / 2;
            sum += l;
        }
    }
    bitmap.recycle();
    sum = sum / (width * height);
    // should be [0..255]
    return (int) Math.min(255, Math.max(sum, 0));
}
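A possible way to use the returned value (the icon resource names here are hypothetical):

// Pick whichever notification icon contrasts with the estimated background.
int lightness = getStatusBarBackgroundLightValue();
int iconRes = (lightness < 128)
        ? R.drawable.ic_stat_notify_white  // dark status bar -> white icon
        : R.drawable.ic_stat_notify_black; // light status bar -> black icon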
Usually you are supposed to handle this via the manifest file, but there is a way to get the value at runtime:
use Build.VERSION.SDK_INT and check it against the Build.VERSION_CODES constants:
http://developer.android.com/reference/android/os/Build.VERSION.html
You can find a more interesting read in an Android dev blog post here.
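For example, a check along those lines (a sketch; GINGERBREAD corresponds to Android 2.3, so anything below it is 2.2 or earlier):

// Use the alternate (white) icon on pre-2.3 devices, per the question above.
boolean preGingerbread = Build.VERSION.SDK_INT < Build.VERSION_CODES.GINGERBREAD;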
--edit--
You will have to figure out what device you are on from BUILD, but don't forget people use custom themes as well...
Short of ripping up the framework and the status bar app, that is going to be the best you can do... here is a link if you want to try digging into the StatusBar and getting it from the raw framework.jar.
Here is a link to someone who describes how he handled playing with the status bar.
Using adb (you'll find it in the Android development kit) with the "pull" command, or just a root file manager, you need to access /system/framework/framework-res.apk. To decompress it (it's essentially a zip), use(!) apktool. Inside, you'll find the bitmaps of the graphics. Select the ones containing "statusbar"; just look at them and you'll see whether you want to mess with them or not.
The tricky part is getting them back in, for which you'll need to recompress with apktool. The bitmaps have an extra line with info on stretching them; apktool gets rid of those, or better put, just hides them.
The really hard part is changing the clock and notification colors from white to black; for that you'll need to modify classes.dex, which is the part where I got stuck.
The easy method is to download and install CyanogenMod, which lets you access these parameters.
Cheers.
