I need to turn the actual pixel values of a Bitmap into greyscale.
I've found solutions that apply a filter, like the one linked here.
The problem is that it only changes how the image looks on screen while leaving the bitmap itself as is. I need to change the actual bitmap so that I can work on the pixels later and save it.
I tried to read up on color theory but only got more lost.
I'm new to Android development. I have a library here that produces LCD images coming from a device over USB. These images are small (120x80 pixels) and 1-bit. I want to show these images on the UI.
All documents I've found explain how to show bitmaps from image files (PNG etc.) or how to show them from an app's resources. I did find out that I can add an ImageView to the UI. Then, for each incoming 120x80 pixel image, create a Bitmap instance, fill it with pixels, and assign it to the ImageView. However, I do not know if this yields the best performance here.
It is also important to keep in mind that nearest neighbor scaling must be used here. Bilinear filtering with such a small image produces a result that is too blurry.
From my experience with other languages, it seems wasteful to create a Bitmap instance for each incoming image. But perhaps I am wrong here.
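To make the question concrete, this is roughly the direction I'm considering: reuse one mutable Bitmap for every frame and turn bitmap filtering off so that up-scaling in the view stays nearest-neighbour (the class and method names below are just placeholders I made up).

import android.graphics.Bitmap;
import android.graphics.drawable.BitmapDrawable;
import android.widget.ImageView;

public class LcdFrameRenderer {

    // One mutable Bitmap reused for every incoming frame instead of
    // allocating a new instance per frame.
    private final Bitmap frame = Bitmap.createBitmap(120, 80, Bitmap.Config.ARGB_8888);

    public void attach(ImageView view) {
        BitmapDrawable drawable = new BitmapDrawable(view.getResources(), frame);
        drawable.setFilterBitmap(false); // nearest-neighbour when the view scales up
        view.setImageDrawable(drawable);
    }

    // argbPixels: 120*80 ints already expanded from the 1-bit device data.
    public void render(ImageView view, int[] argbPixels) {
        frame.setPixels(argbPixels, 0, 120, 0, 0, 120, 80);
        view.invalidate(); // the bitmap object is unchanged, so force a redraw
    }
}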
Suggestions?
What I'm trying to do is save an edited bitmap that is composed of two overlaid bitmaps. My application allows the user to draw on top of a picture and save it.
My problem is that when I save the resulting image it gets smaller, even with the quality set to 100. So if the user saves and edits the image multiple times, it will keep getting smaller and smaller.
I save the bitmap with this:
result.compress(Bitmap.CompressFormat.JPEG, 100, fos);
I debugged the code, and at this point the width and height are fine, but after saving, the image shrinks.
I've researched questions about this, but the ones I've found had no answers that could help me.
What I need is a way to save a bitmap without shrinking it, but it has to be in an image format like JPG, PNG, etc.
Thanks in advance.
My guess is that the problem is not at the time you save the image; the issue is when you are editing it (creating the layers). Check that mechanism; maybe you are setting the image size there.
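As a rough illustration of what I mean (the helper name, and the assumption that the drawing layer is a second bitmap, are mine), the composition step would normally keep the photo's own dimensions, something like:

import android.graphics.Bitmap;
import android.graphics.Canvas;
import android.graphics.Rect;

public final class CompositeSketch {

    // Draws the user's drawing layer onto a full-size mutable copy of the
    // photo. The overlay is stretched to the photo's dimensions instead of
    // the photo being shrunk to the view/overlay size.
    public static Bitmap composite(Bitmap photo, Bitmap overlay) {
        Bitmap result = photo.copy(Bitmap.Config.ARGB_8888, true);
        Canvas canvas = new Canvas(result);
        Rect dst = new Rect(0, 0, result.getWidth(), result.getHeight());
        canvas.drawBitmap(overlay, null, dst, null);
        return result;
    }
}

If instead the result is created at the view's (or overlay's) size, each save/edit cycle will shrink the picture a little more, which matches what you're describing.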
I'm having trouble cleanly down-scaling images on Android. I'm looking to scale small PNG images between arbitrary sizes of about 10-100% of their original size.
I've created a sample image to demonstrate the problem and exacerbate the unusual behaviors I'm seeing in Android's image scaler:
The above image is a screenshot from an Android device with some annotations added. I've also added the same images in a second column on the left side showing how they are rendered with a linear scaling by "The GIMP" (GNU Image Manipulation Program).
The base image consists of a checkerboard pattern background of red and blue pixels. On that background I've drawn some 1px-wide yellow lines and fairly thin green text. The image is 288x288 pixels.
When scaling the image to 1/3 of its original dimensions, Android seems to simply grab one in nine pixels, throwing out all other data. Some of the yellow lines disappear entirely as a result. Remarkably, the checkerboard pattern remains intact (which is simply a result of every 3rd pixel being used).
When scaling the image to a dimension of near-but-not-exactly 50% of its original size, e.g., 142x142 or 143x143, the scaler creates some fairly large anomalies/artifacts on the image.
At 50% size (144x144), the image looks correct.
The test image does bring out the worst of the image scaler, but "normal" PNG icon images are severely impacted as well. From 10-33% or so the images aren't properly resampled, and thus appear extremely "bitmapped". And certain larger size images have very strange anomalies in them at certain sizes.
If anyone knows a means to disable this strange scaling behavior, even at a performance cost, I'd greatly appreciate knowing about it. It can certainly be solved by writing an algorithm that works directly on the pixels of bitmaps, but I'm hopeful that isn't the only option.
Also noteworthy is the fact that all image work is being done with ARGB_8888 Bitmap.Configs. I've tried manipulating the image size by setting maxWidth/maxHeight on ImageViews, by using Bitmap.createScaledBitmap(), and by using Bitmap.createBitmap() with a Matrix. All attempts have the same result. Bitmap filtering is enabled.
Thanks again for any suggestions!
Using Bitmap.createScaledBitmap() and Bitmap.createBitmap() with a Matrix amounts to the same thing; see the source of Bitmap.createScaledBitmap (which hasn't changed since Android 2).
On Android 4.0+, using a matrix (as Bitmap.createScaledBitmap does) allows hardware-accelerated operations when enabled (on by default on 4.1+, IIRC), so we don't have direct control over what is done or how it is done.
That means you'll have to implement your own scaling with the desired (here, linear) filtering: either by processing the pixels yourself, or by using OpenGL ES with the right filter, though that may not be available on all devices.
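For reference, a software bilinear resampler working directly on the pixels might look roughly like this (a sketch only, not what the platform does, and it assumes the source is at least 2x2 pixels; for very strong downscales an area-average/box filter would arguably give even cleaner results):

import android.graphics.Bitmap;

public final class BilinearScaler {

    // Software bilinear resample of an ARGB_8888 bitmap.
    public static Bitmap scale(Bitmap src, int dstW, int dstH) {
        int srcW = src.getWidth();
        int srcH = src.getHeight();
        int[] in = new int[srcW * srcH];
        src.getPixels(in, 0, srcW, 0, 0, srcW, srcH);

        int[] out = new int[dstW * dstH];
        float xRatio = (float) (srcW - 1) / dstW;
        float yRatio = (float) (srcH - 1) / dstH;

        for (int y = 0; y < dstH; y++) {
            float sy = y * yRatio;
            int y0 = (int) sy;
            float fy = sy - y0;
            for (int x = 0; x < dstW; x++) {
                float sx = x * xRatio;
                int x0 = (int) sx;
                float fx = sx - x0;

                // Four neighbouring source pixels around (sx, sy).
                int c00 = in[y0 * srcW + x0];
                int c10 = in[y0 * srcW + x0 + 1];
                int c01 = in[(y0 + 1) * srcW + x0];
                int c11 = in[(y0 + 1) * srcW + x0 + 1];

                int a = mix(c00 >>> 24, c10 >>> 24, c01 >>> 24, c11 >>> 24, fx, fy);
                int r = mix((c00 >> 16) & 0xFF, (c10 >> 16) & 0xFF, (c01 >> 16) & 0xFF, (c11 >> 16) & 0xFF, fx, fy);
                int g = mix((c00 >> 8) & 0xFF, (c10 >> 8) & 0xFF, (c01 >> 8) & 0xFF, (c11 >> 8) & 0xFF, fx, fy);
                int b = mix(c00 & 0xFF, c10 & 0xFF, c01 & 0xFF, c11 & 0xFF, fx, fy);

                out[y * dstW + x] = (a << 24) | (r << 16) | (g << 8) | b;
            }
        }
        return Bitmap.createBitmap(out, dstW, dstH, Bitmap.Config.ARGB_8888);
    }

    // Weighted average of the four neighbouring samples for one channel.
    private static int mix(int p00, int p10, int p01, int p11, float fx, float fy) {
        float top = p00 + (p10 - p00) * fx;
        float bottom = p01 + (p11 - p01) * fx;
        return (int) (top + (bottom - top) * fy + 0.5f);
    }
}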
Hi all,
I want to make a filter just like Instagram's. I use an ImageView with a colorFilter to achieve the effect, but I don't know how to save the filtered image as a file. If I save the bitmap directly, the original bitmap is stored without the filter effect. If I store the ImageView's pixels, its size is not the same as the bitmap's. And I don't want to calculate a new bitmap pixel by pixel, for efficiency reasons. I have been blocked on this problem for days. Would anybody help me?
Thanks.
BR
QiuPing
I think this might be of help to you. Check out the Saving Internal and Saving External sections.
http://developer.android.com/guide/topics/data/data-storage.html
You're going to want to get your image into a bitmap, like this:
Save bitmap to location
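Something along these lines should work once you have the filtered bitmap in hand (a minimal sketch; I'm assuming internal storage and PNG, and the class/method names are my own):

import android.content.Context;
import android.graphics.Bitmap;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;

public final class BitmapSaver {

    // Writes the (already filtered) bitmap to the app's internal storage.
    // PNG is used here because it is lossless; the file name is arbitrary.
    public static File save(Context context, Bitmap bitmap, String fileName) throws IOException {
        File file = new File(context.getFilesDir(), fileName);
        FileOutputStream fos = new FileOutputStream(file);
        try {
            bitmap.compress(Bitmap.CompressFormat.PNG, 100, fos);
        } finally {
            fos.close();
        }
        return file;
    }
}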
I found a better solution now.
I used a Canvas on the original bitmap and painted with a ColorMatrixColorFilter.
Different ColorMatrixColorFilters serve different purposes: changing brightness, changing saturation, changing contrast. Combining them is how the filters were created.
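For anyone else who lands here, the approach I mean looks roughly like this (the class and method names are my own; greyscale is just the saturation-0 case):

import android.graphics.Bitmap;
import android.graphics.Canvas;
import android.graphics.ColorMatrix;
import android.graphics.ColorMatrixColorFilter;
import android.graphics.Paint;

public final class FilterUtils {

    // Draws the source through a ColorMatrixColorFilter into a new bitmap,
    // so the filtered pixels are actually baked in (unlike setColorFilter
    // on an ImageView, which only changes how the view is drawn).
    public static Bitmap applyColorMatrix(Bitmap src, ColorMatrix matrix) {
        Bitmap result = Bitmap.createBitmap(src.getWidth(), src.getHeight(),
                Bitmap.Config.ARGB_8888);
        Canvas canvas = new Canvas(result);
        Paint paint = new Paint();
        paint.setColorFilter(new ColorMatrixColorFilter(matrix));
        canvas.drawBitmap(src, 0, 0, paint);
        return result;
    }

    // Example: greyscale is simply saturation set to 0.
    public static Bitmap toGreyscale(Bitmap src) {
        ColorMatrix grey = new ColorMatrix();
        grey.setSaturation(0f);
        return applyColorMatrix(src, grey);
    }
}

The resulting bitmap can then be compressed to JPG/PNG as usual, since the filter effect is now in the pixel data itself.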
In short, I am unable to access all the pixels of a bitmap image.
I have used an intent to fire the native Camera app and return a Bitmap image to my application's activity. The data is definitely a Bitmap object: I can display it, get its height/width, etc., and access some pixels using getPixel(). However, when I use the values of getHeight() and getWidth(), I get an array-out-of-bounds error. By trial and error I have found I can only access a reduced portion of the image; for example, with one image that returned height and width values of 420 and 380, I could only access up to around 200, 100. I then do some image processing and use setPixel() on the original image. When I display the image it shows the processed pixels in that, say, 200x100 region and the rest unchanged, so the pixels are obviously there and accessible by Android, but not by me. I have spoken to other people who have also had this problem with images.
Does anyone know anything more about this, the reasons for it, or a workaround?
Many thanks in advance.
It seems that there's no way around this. Does anyone think it would be better/possible to access the image directly in memory, maybe using the NDK?
You won't be able to access the pixel at (getWidth(), getHeight()) in any image because, like everything else, pixel coordinates are 0-indexed. The valid range is (0 to getWidth()-1, 0 to getHeight()-1), so the bottom-right pixel is obtained with b.getPixel(b.getWidth()-1, b.getHeight()-1).
Got an answer from Albert Pucciani on the Android forums. I now create an int buffer and copy the pixels into it, then use get() and put() to read and write the pixels. It's also much quicker to use get() and put() on the buffer than getPixel()/setPixel() on the Bitmap class. I still need to test whether this returns all the pixels to the buffer for all images.
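Roughly what that looks like, as far as I understand it (sketch only; note the buffer holds pixels in the bitmap's native byte order, not the ARGB order that getPixel() returns):

import android.graphics.Bitmap;
import java.nio.IntBuffer;

public final class PixelBufferSketch {

    // Copies every pixel into an IntBuffer, processes the values there,
    // and writes them back. Absolute get()/put() on the buffer is much
    // faster than per-pixel Bitmap.getPixel()/setPixel() calls.
    // The bitmap must be mutable and ARGB_8888 for this to work as written.
    public static void processAllPixels(Bitmap bitmap) {
        IntBuffer buffer = IntBuffer.allocate(bitmap.getWidth() * bitmap.getHeight());
        bitmap.copyPixelsToBuffer(buffer);

        for (int i = 0; i < buffer.limit(); i++) {
            int pixel = buffer.get(i);
            // ... image processing on 'pixel' goes here ...
            buffer.put(i, pixel);
        }

        buffer.rewind(); // copyPixelsFromBuffer reads from the current position
        bitmap.copyPixelsFromBuffer(buffer);
    }
}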
After more testing I have discovered this is simply a memory issue, as the amount of memory allocated to each process includes all of its bitmaps.