Get two specific pixels from the bitmap for processing - android

I have an image in which I want to process pixels number 159 and 160 specifically. I am running this code but it gives me an ArrayIndexOutOfBoundsException. I think the problem is the way I have declared the array.
int[] twoPositions = {159, 160};
bitmap.getPixels(twoPositions, 0, width, 0, 0, width, height); // exception line (ArrayIndexOutOfBoundsException)
How do I declare the array in a way that will give me the pixels at row 0, columns 159 and 160?
thank you in advance...
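For what it's worth, a minimal sketch of one way to read exactly those two pixels: request a 2x1 region starting at x=159, y=0, and size the destination array to that region (this assumes the bitmap is at least 161 pixels wide):
int[] twoPixels = new int[2]; // destination sized to the requested region, not the whole bitmap
// getPixels(pixels, offset, stride, x, y, width, height)
// the stride only has to cover the requested width (2), not the bitmap width
bitmap.getPixels(twoPixels, 0, 2, 159, 0, 2, 1);
int pixel159 = twoPixels[0]; // ARGB color at column 159, row 0
int pixel160 = twoPixels[1]; // ARGB color at column 160, row 0
For just two pixels, two bitmap.getPixel(x, y) calls would also do the job.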

Related

Android: Working with bitmaps without OutOfMemoryError

First, the important lines of code:
Bitmap bitmap = BitmapFactory.decodeResource(context.getResources(), getDrawableForLvl(drawable));
int []pixels = new int[bitmap.getWidth()*bitmap.getHeight()];
bitmap.getPixels(pixels, 0, bitmap.getWidth(), 0, 0, bitmap.getWidth(), bitmap.getHeight());
//..
pixels = null; // please gc remove this huge data
System.gc();
So I'm working on an Android app (a game) with multiple levels. Every level is an image that I load as a bitmap on level start. Then I have to analyse every pixel (line 3). For that I create an array (line 2). The images are 1280x800, so the array has a size of over one million ints. No problems here, and since it's a local variable it should be destroyed on method return. But it's Java, so it's not -.- Depending on the device, the garbage collector may or may not run fast enough. So when a user starts and closes levels very quickly it produces a java.lang.OutOfMemoryError in line 2. I guess that's because the old array(s) weren't removed yet and now multiple of them are filling the memory.
I could make the array a static member, so it's created once and is always available; then it's not possible to have multiple instances of it. But I think that's bad coding style, because it also occupies 4 MB when it's not needed.
I don't need all pixels at the same time. I'm splitting the images into more than a hundred rectangles, so I could use a much smaller array and fill it with pixels piece by piece. But I have trouble understanding the method's parameters. Can someone help me here?
There is also a method to get just one pixel at position x,y, but I guess a million function calls would be pretty slow.
Does someone have a better idea? There is no way to force an object out of memory in Java, is there?
Update 1:
As vzoha suggested, to get only the first quarter:
int []pixels = new int[bitmap.getWidth()/2*bitmap.getHeight()/2];
bitmap.getPixels(pixels, 0, bitmap.getWidth(), 0, 0, bitmap.getWidth()/2, bitmap.getHeight()/2);
This gives an ArrayIndexOutOfBoundsException. I think the call just fetches the pixels of the first quarter, but it still expects an array sized for the full bitmap and leaves the other fields untouched.
Update 2: I guess I can do it row by row (half row by half row for the first quarter):
int[] pixels = new int[bitmap.getWidth()/2 * bitmap.getHeight()/2];
for (int row = 0; row < bitmap.getHeight()/2; ++row)
    bitmap.getPixels(pixels, row * bitmap.getWidth()/2, bitmap.getWidth(), 0, row, bitmap.getWidth()/2, 1);
But when I do that for 20x10 parts it's not much better than getting each pixel by itself. Well, it is much better, but the method should be capable of doing it in a single call, shouldn't it? I just don't get this "stride" parameter: "The number of entries in pixels[] to skip between rows (must be >= bitmap's width). Can be negative." How can it be negative when it must be >= width?
The size in pixels doesn't directly translate to how much memory the image will take up. Bitmaps in Android (before ART) are notoriously difficult to use heavily while avoiding OOM exceptions, enough so that there's a page dedicated to how to use them efficiently. The problem is normally that there is actually enough memory available, but it has become fragmented and there isn't a single contiguous block of the size you need.
My first suggestion (for a quick win) would be to decode the bitmap with a specific config: you should be able to occupy only 1/4 of the memory you were previously using by switching to ALPHA_8 for your bitmap config. This will use 1 byte per pixel instead of 4 (the default is ARGB_8888):
BitmapFactory.Options options = new BitmapFactory.Options();
options.inPreferredConfig = Bitmap.Config.ALPHA_8;
bitmap = BitmapFactory.decodeResource(getResources(), getDrawableForLvl(drawable), options);
My next suggestion would be to scale your bitmap to start with and place appropriately sized versions in your hdpi, xhdpi and xxhdpi folders.
In fact it is pretty simple to get just the pixels of a specific area. But the documentation is wrong about the stride parameter. It says:
public void getPixels (int[] pixels, int offset, int stride, int x, int y, int width, int height)
stride: The number of entries in pixels[] to skip between rows (must be >= bitmap's width). Can be negative.
But the truth is it can (and in most cases must) be less than the bitmap's width; it just has to be greater than or equal to the passed width (the second-to-last parameter of getPixels), meaning the width of the area from which I want the pixels.
So to get the pixels of the first quarter of a bitmap:
int[] pixels = new int[(bitmap.getWidth()>>1) * (bitmap.getHeight()>>1)]; // quarter of the bitmap size (the parentheses matter: >> binds more loosely than *)
bitmap.getPixels(pixels, 0, bitmap.getWidth()>>1, 0, 0, bitmap.getWidth()>>1, bitmap.getHeight()>>1);
Or to get a specific rect with x,y (upper left corner) and width, height:
int []pixels = new int[width*height];
bitmap.getPixels(pixels, 0, width, x, y, width, height);
So pretty simple. It was just the wrong documentation that put a twist in my brain.
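As a follow-up illustration of the rect form, here is a minimal sketch that walks a bitmap tile by tile with one small reusable array (the 64x40 tile size and the processTile call are made up for the example and assume the bitmap dimensions divide evenly):
final int tileW = 64; // hypothetical tile size; 1280x800 splits into 20x20 such tiles
final int tileH = 40;
int[] tile = new int[tileW * tileH]; // small reusable buffer instead of one huge array
for (int y = 0; y < bitmap.getHeight(); y += tileH) {
    for (int x = 0; x < bitmap.getWidth(); x += tileW) {
        // stride == tileW because rows are packed tightly in the small array
        bitmap.getPixels(tile, 0, tileW, x, y, tileW, tileH);
        processTile(tile, x, y, tileW, tileH); // hypothetical per-tile analysis
    }
}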

OpenGL glReadPixels returns incorrect values

I'm struggling with OpenGL and can't find a solution for the following problem.
I'm using OpenGL with the NDK and trying to generate a ByteBuffer of pixels from my texture and send it to native C.
The texture image is a black and white image, meaning byte-wise:
black: expected result - 0(R) 0(G) 0(B) 1(A) - 0001
white: expected result - 1(R) 1(G) 1(B) 1(A) - 1111
The sign is not relevant here since both numbers are positive.
This is true because in Java the primitive types are always signed.
Complication: we are getting a buffer of unsigned bytes
so the expectations are:
black: 0(R) 0(G) 0(B) 1(A) - 0001 stays the same because the leftmost bit of every byte is 0.
white: 1(R) 1(G) 1(B) 1(A) - the sign bit is always on (meaning a negative number), so -127, -127, -127, -127
BUT, the strange output I'm getting is: -10 -10 -10 -1 (RGBA), -10 -10 -10 -1, -10 -10 -10 -1, -10 -10 -10 -1
Why is there a sign if I asked for unsigned bytes?
Edit:
A partial explanation is that Java types are still signed, so in the debugger/logs/prints the value shows as negative depending on the leftmost (most significant) bit. I still don't know why this bit is 1.
Why are the values -10 and -1? At least the alpha byte should always be 1.
I'm setting the ByteBuffer as following:
int[] frame = new int[1];
GLES20.glGenFramebuffers(1, frame, 0);
RendererUtils.checkGlError("glGenFramebuffers");
GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, frame[0]);
RendererUtils.checkGlError("glBindFramebuffer");
GLES20.glFramebufferTexture2D(GLES20.GL_FRAMEBUFFER, GLES20.GL_COLOR_ATTACHMENT0, GLES20.GL_TEXTURE_2D, texture, 0);
RendererUtils.checkGlError("glFramebufferTexture2D");
ByteBuffer buffer = ByteBuffer.allocate(width * height * 4);
//GLES20.glPixelStorei(pname, param);
GLES20.glReadPixels(0, 0, width, height, GLES20.GL_RGBA, arrayNativeType, buffer); // todo take unsigned int
RendererUtils.checkGlError("glReadPixels");
GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
RendererUtils.checkGlError("glBindFramebuffer");
GLES20.glDeleteFramebuffers(1, new int[]{0}, 0);
RendererUtils.checkGlError("glDeleteFramebuffer");
Unbelievable answer: I checked everything memory- and NDK-wise, and in the end the problem was in my black & white shader.
Explanation: the previous, wrong shader took the highest RGB value and set r, g and b according to it; that is how we got the black and white image:
float greyValue = max(fragColor.r, max(fragColor.g, fragColor.b));
gl_FragColor = vec4(greyValue, greyValue, greyValue, 1.0);
The error is that this way I don't know the precise color and therefore its byte representation.
The fix was to output 1 instead of greyValue, and to output 0 for r, g and b if greyValue is 0.
I hope it's clear.

compare 2 images to avoid duplication

I am comparing 2 similar images and would like to see if both are similar. Currently I use this code:
public void foo(Bitmap bitmapFoo) {
    int[] pixels;
    int height = bitmapFoo.getHeight();
    int width = bitmapFoo.getWidth();
    pixels = new int[height * width];
    bitmapFoo.getPixels(pixels, 0, width, 1, 1, width - 1, height - 1);
}
and I call the function foo(img1), where:
img1=(Bitmap)data.getExtras().get("data");
I would like to know how to get the result of the above getPixels call. I tried assigning it to a variable but it did not work. Should it have a return type, and what format is it in?
And also, how do I compare 2 images?
Also, both images may have different dimensions depending on the mobile camera the snapshot is taken with.
Also, can it recognize if the same image is shot in the morning and at night?
Thanks in Advance.
This code will compare each pixel from the base image with the corresponding pixel of the other image.
If both pixels at location (x,y) are the same, it adds that pixel to the result image unchanged. Otherwise it modifies the pixel and adds it to the result image.
If the baseline image's height or width is larger than the other image's, it adds red for the extra pixels that are not present in the other image.
Both image file formats should be the same for the comparison.
The code uses the base image's file format to create the resultant image file, and the resultant image contains highlighted portions where differences were observed.
Here is a link to the code, with a sample example attached.
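The linked code is not reproduced here, but a minimal sketch of the same idea could look like this (the method name is hypothetical; the inversion of differing pixels and Color.RED for the extra area follow the description above):
import android.graphics.Bitmap;
import android.graphics.Color;

public static Bitmap diffBitmaps(Bitmap base, Bitmap other) {
    int w = base.getWidth(), h = base.getHeight();
    Bitmap result = Bitmap.createBitmap(w, h, Bitmap.Config.ARGB_8888);
    for (int y = 0; y < h; y++) {
        for (int x = 0; x < w; x++) {
            if (x < other.getWidth() && y < other.getHeight()) {
                int p1 = base.getPixel(x, y);
                int p2 = other.getPixel(x, y);
                // identical pixels are copied; differing pixels are modified (here: inverted)
                result.setPixel(x, y, p1 == p2 ? p1 : (p1 ^ 0x00FFFFFF));
            } else {
                // area that exists only in the base image is marked red
                result.setPixel(x, y, Color.RED);
            }
        }
    }
    return result;
}
For large camera images, bulk getPixels()/setPixels() on int arrays would be considerably faster than these per-pixel calls, as discussed earlier on this page.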
If you want to copy the pixels of the bitmap into an int array, the easiest way is:
int height = bitmapFoo.getHeight();
int width = bitmapFoo.getWidth();
int[] pixels = new int[height * width];
bitmapFoo.copyPixelsToBuffer(IntBuffer.wrap(pixels)); // copyPixelsToBuffer takes a java.nio.Buffer, not a raw array
See the documentation
I should warn you that you will have to handle this with care; otherwise you will get an OutOfMemoryError.
To get All Pixels
bitmapFoo.copyPixelsToBuffer(IntBuffer.wrap(pixels));
or
bitmapFoo.getPixels(pixels, 0, width, 0, 0, width, height);
To get One Pixel
The two arguments must be integers within the ranges [0, getWidth()-1] and [0, getHeight()-1] respectively:
int pix = bitmapFoo.getPixel(x, y);

OutOfMemoryError while creating a bitmap

I want to send a fax from my app.
A fax document has a resolution of 1728 x 2444 pixels.
So I create a bitmap, add text and/or pictures and encode it to CCITT (Huffman):
Bitmap image = Bitmap.createBitmap(1728, 2444, Config.ALPHA_8);
Canvas canvas = new Canvas(image);
canvas.drawText("This is a fax", 100, 100, new Paint());
ByteBuffer buffer = ByteBuffer.allocateDirect(image.getWidth() * image.getHeight());
image.copyPixelsToBuffer(buffer);
image.recycle();
encodeCCITT(buffer, width, height);
This works perfectly on my Galaxy S II (64 MB heap size), but not on the emulator (24 MB). After creating the second fax page I get "4223232-byte external allocation too large for this process...java.lang.OutOfMemoryError" while allocating the buffer.
I already reduced the color depth from ARGB_8888 (4 bytes per pixel) to ALPHA_8 (1 byte), because fax pages are monochrome anyway.
I need this resolution and I need to have access to the pixels for encoding.
What is the best way?
Android doesn't support 1-Bpp bitmaps, and the Java heap size limit of 24/32/48MB is part of Android. Real devices can't allocate more than the Java heap limit no matter how much RAM they have. There appear to be only two possible solutions:
1) Work within the limitations of the Java heap.
2) Use native code (NDK).
In native code you can allocate the entire available RAM of the device. The only downside is that you will need to write your own code to edit and encode your bitmap.
In addition to BitBank's already good answer: you have to null the reference if you want the garbage collector to actually clean up your bitmap. The documentation for recycle() states:
This is an advanced call, and normally need not be called, since the
normal GC process will free up this memory when there are no more
references to this bitmap.
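A minimal sketch of that cleanup, using the image variable from the question:
// Drop the reference so the GC can reclaim the pixel data.
if (image != null) {
    image.recycle(); // the call the quoted documentation refers to; frees the pixel data immediately
    image = null;    // without nulling, the reference keeps the bitmap reachable
}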
Instead of copying all pixels to a ByteBuffer, you can copy them step by step, here with an int[] array, so you need less memory:
int countLines = 100; // assumes the height is a multiple of countLines
int[] pixels = new int[width * countLines];
for (int y = 0; y < height; y += countLines) {
    image.getPixels(pixels, 0, width, 0, y, width, countLines);
    // do something with pixels...
    image.setPixels(pixels, 0, width, 0, y, width, countLines);
}

Android getPixels() possibly a silly mistake?

Okay, this is quite simple to understand, but for some bizarre reason I can't get it working.. I've simplified this example from the actual code.
InputStream is = context.getResources().openRawResource(R.raw.someimage);
Bitmap bitmap = BitmapFactory.decodeStream(is);
try
{
    int[] pixels = new int[32*32];
    bitmap.getPixels(pixels, 0, 800, 0, 0, 32, 32);
}
catch (ArrayIndexOutOfBoundsException ex)
{
    Log.e("testing", "ArrayIndexOutOfBoundsException", ex);
}
Why on earth do I keep getting an ArrayIndexOutOfBoundsException? The pixels array is 32x32 and as far as I'm aware I'm using getPixels correctly. The image dimensions are 800x800 and I am attempting to retrieve a 32x32 section. The image is a 32-bit PNG which is being reported as ARGB_8888.
Any ideas? even if I'm being an idiot! I'm about to throw the keyboard out of the window :D
Use the width of the requested region as the stride, in your case 32:
bitmap.getPixels(pixels, 0, 32, 0, 0, 32, 32);
A row gap of 800 causes your pixel array to go out of bounds.
"I'm about to throw the keyboard out of the window " funny lol
You're overflowing the destination buffer because you're asking for a stride of 800 entries between rows.
http://developer.android.com/reference/android/graphics/Bitmap.html#getPixels%28int[],%20int,%20int,%20int,%20int,%20int,%20int%29
You are getting the OutOfBounds exception because the stride is applied to the pixels array, not to the original bitmap, so in your case you're trying to write 32*800 entries, which don't fit into your array.
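To see what an 800-entry stride is actually for: it lets you place the copied region at its original position inside a full-size destination array. A minimal sketch with the sizes from the question (x and y are arbitrary here):
int[] full = new int[800 * 800]; // destination with 800 entries per row
int x = 96, y = 200; // arbitrary top-left corner of the 32x32 region
// offset points at (x, y) inside the big array; the stride of 800 skips to the next row
bitmap.getPixels(full, y * 800 + x, 800, x, y, 32, 32);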
