Creating a 9-patch procedurally? [duplicate]

I have a requirement on my Android application that parts of the graphics should be customizable, by retrieving new colors and images from the server side. Some of these images are nine-patch images.
I can't find a way to create and display these nine-patch images (that have been retrieved over the network).
The nine-patch images are retrieved and kept in the application as Bitmaps. In order to create a NinePatchDrawable, you either need the corresponding NinePatch or the chunk (byte[]) of the NinePatch. The NinePatch can NOT be loaded from the Resources, since the images don't exist in /res/drawable/. Furthermore, in order to create the NinePatch, you need the chunk of the NinePatch. So it all comes down to the chunk.
The question is then: how does one format/generate the chunk from an existing Bitmap (containing the NinePatch information)?
I've searched through the Android source code and the web and I can't seem to find any examples of this. To make things worse, all decoding of NinePatch resources seems to be done natively.
Has anyone had any experience with this kind of issue?
I'm targeting API level 4, if that is of importance.

getNinePatchChunk works just fine. It returned null because you were giving Bitmap a "source" ninepatch. It needs a "compiled" ninepatch image.
There are two types of ninepatch file formats in the Android world ("source" and "compiled"). The source version is where you add the 1px transparency border everywhere -- when you compile your app into an .apk later, aapt will convert your *.9.png files to the binary format that Android expects. This is where the png file gets its "chunk" metadata.
Okay, now down to business.
Client code, something like this:
InputStream stream = ...; // e.g. the image downloaded from your server
Bitmap bitmap = BitmapFactory.decodeStream(stream);
byte[] chunk = bitmap.getNinePatchChunk();
boolean result = NinePatch.isNinePatchChunk(chunk);
NinePatchDrawable patchy = new NinePatchDrawable(bitmap, chunk, new Rect(), null);
Server-side, you need to prepare your images. You can use the Android Binary Resource Compiler. This automates some of the pain away from creating a new Android project just to compile some *.9.png files into the Android native format. If you were to do this manually, you would essentially make a project and throw in some *.9.png files ("source" files), compile everything into the .apk format, unzip the .apk file, then find the *.9.png file, and that's the one you send to your clients.
Also: I don't know if BitmapFactory.decodeStream knows about the npTc chunk in these png files, so it may or may not be treating the image stream correctly. The existence of Bitmap.getNinePatchChunk suggests that BitmapFactory might-- you could go look it up in the upstream codebase.
In the event that it does not know about the npTc chunk and your images are being screwed up significantly, then my answer changes a little.
Instead of sending the compiled ninepatch images to the client, you write a quick Android app to load compiled images and spit out the byte[] chunk. Then, you transmit this byte array to your clients along with a regular image-- no transparent borders, not the "source" ninepatch image, not the "compiled" ninepatch image. You can directly use the chunk to create your object.
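As an illustration, here is a minimal sketch of that approach, assuming the server delivers the plain PNG and the pre-extracted chunk as two separate byte arrays (the class and method names below are made up, not part of any API):

import android.content.res.Resources;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.graphics.NinePatch;
import android.graphics.Rect;
import android.graphics.drawable.NinePatchDrawable;

public final class RemoteNinePatch {
    // pngBytes: a plain PNG without the 1px nine-patch border;
    // chunkBytes: the byte[] previously read via getNinePatchChunk() from the compiled image.
    public static NinePatchDrawable fromNetwork(Resources res, byte[] pngBytes, byte[] chunkBytes) {
        Bitmap bitmap = BitmapFactory.decodeByteArray(pngBytes, 0, pngBytes.length);
        if (bitmap == null || !NinePatch.isNinePatchChunk(chunkBytes)) {
            return null; // not decodable, or not a usable nine-patch chunk
        }
        return new NinePatchDrawable(res, bitmap, chunkBytes, new Rect(), null);
    }
}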
Another alternative is to use object serialization to send ninepatch images (NinePatch) to your clients, such as with JSON or the built-in serializer.
Edit If you really, really need to construct your own chunk byte array, I would start by looking at do_9patch, isNinePatchChunk, Res_png_9patch and Res_png_9patch::serialize() in ResourceTypes.cpp. There's also a home-made npTc chunk reader from Dmitry Skiba. I can't post links, so if someone can edit my answer that would be cool.
do_9patch:
https://android.googlesource.com/platform/frameworks/base/+/gingerbread/tools/aapt/Images.cpp
isNinePatchChunk: http://netmite.com/android/mydroid/1.6/frameworks/base/core/jni/android/graphics/NinePatch.cpp
struct Res_png_9patch: https://scm.sipfoundry.org/rep/sipX/main/sipXmediaLib/contrib/android/android_2_0_headers/frameworks/base/include/utils/ResourceTypes.h
Dmitry Skiba stuff: http://code.google.com/p/android4me/source/browse/src/android/graphics/Bitmap.java

If you need to create 9Patches on the fly check out this gist I made: https://gist.github.com/4391807
You pass it any bitmap and then give it cap insets similar to iOS.

I created a tool to build a NinePatchDrawable from an (uncompiled) NinePatch bitmap.
See https://gist.github.com/knight9999/86bec38071a9e0a781ee .
The method
NinePatchDrawable createNinePatchDrawable(Resources res, Bitmap bitmap)
helps you.
For example,
ImageView imageView = (ImageView) findViewById(R.id.imageview);
Bitmap bitmap = loadBitmapAsset("my_nine_patch_image.9.png", this);
NinePatchDrawable drawable = NinePatchBitmapFactory.createNinePatchDrawable(getResources(), bitmap);
imageView.setBackground( drawable );
where
public static final Bitmap loadBitmapAsset(String fileName, Context context) {
    final AssetManager assetManager = context.getAssets();
    BufferedInputStream bis = null;
    try {
        bis = new BufferedInputStream(assetManager.open(fileName));
        return BitmapFactory.decodeStream(bis);
    } catch (IOException e) {
        e.printStackTrace();
    } finally {
        if (bis != null) {
            try {
                bis.close();
            } catch (IOException ignored) {
            }
        }
    }
    return null;
}
In this sample case, my_nine_patch_image.9.png is under the assets directory.

There is no need to use the Android Binary Resource Compiler to prepare compiled 9-patch PNGs; the aapt tool that ships with the Android SDK is enough. The command line looks like this (c is short for crunch):
aapt.exe c -v -S /path/to/project -C /path/to/destination

So you basically want to create a NinePatchDrawable on demand, don't you? I tried the following code, maybe it works for you:
InputStream in = getResources().openRawResource(R.raw.test);
Drawable d = NinePatchDrawable.createFromStream(in, null);
System.out.println(d.getMinimumWidth() + ":" + d.getMinimumHeight());
I think this should work. You just have to change the first line to get the InputStream from the web. getNinePatchChunk() is not intended to be called by developers according to the documentation, and might break in the future.
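For example, a rough sketch of pulling the image from a URL and letting createFromStream do the decoding (the method name, URL handling and error handling here are placeholders, not a tested recipe):

import android.graphics.drawable.Drawable;
import android.graphics.drawable.NinePatchDrawable;

import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public static Drawable loadNinePatchFromUrl(String urlString) {
    HttpURLConnection conn = null;
    try {
        conn = (HttpURLConnection) new URL(urlString).openConnection();
        InputStream in = conn.getInputStream();
        // createFromStream returns a NinePatchDrawable when the PNG carries
        // a compiled npTc chunk, otherwise a plain BitmapDrawable.
        return NinePatchDrawable.createFromStream(in, null);
    } catch (Exception e) {
        e.printStackTrace();
        return null;
    } finally {
        if (conn != null) conn.disconnect();
    }
}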

WORKING AND TESTED - RUNTIME NINEPATCH CREATION
This is my implementation of a nine-patch builder for Android. With this class and the code examples below, you can create NinePatches at runtime from any Bitmap.
public class NinePatchBuilder {
    int width, height;
    Bitmap bitmap;
    Resources resources;
    private ArrayList<Integer> xRegions = new ArrayList<Integer>();
    private ArrayList<Integer> yRegions = new ArrayList<Integer>();

    public NinePatchBuilder(Resources resources, Bitmap bitmap) {
        width = bitmap.getWidth();
        height = bitmap.getHeight();
        this.bitmap = bitmap;
        this.resources = resources;
    }

    public NinePatchBuilder(int width, int height) {
        this.width = width;
        this.height = height;
    }

    public NinePatchBuilder addXRegion(int x, int width) {
        xRegions.add(x);
        xRegions.add(x + width);
        return this;
    }

    public NinePatchBuilder addXRegionPoints(int x1, int x2) {
        xRegions.add(x1);
        xRegions.add(x2);
        return this;
    }

    public NinePatchBuilder addXRegion(float xPercent, float widthPercent) {
        int xtmp = (int) (xPercent * this.width);
        xRegions.add(xtmp);
        xRegions.add(xtmp + (int) (widthPercent * this.width));
        return this;
    }

    public NinePatchBuilder addXRegionPoints(float x1Percent, float x2Percent) {
        xRegions.add((int) (x1Percent * this.width));
        xRegions.add((int) (x2Percent * this.width));
        return this;
    }

    public NinePatchBuilder addXCenteredRegion(int width) {
        int x = (int) ((this.width - width) / 2);
        xRegions.add(x);
        xRegions.add(x + width);
        return this;
    }

    public NinePatchBuilder addXCenteredRegion(float widthPercent) {
        int width = (int) (widthPercent * this.width);
        int x = (int) ((this.width - width) / 2);
        xRegions.add(x);
        xRegions.add(x + width);
        return this;
    }

    public NinePatchBuilder addYRegion(int y, int height) {
        yRegions.add(y);
        yRegions.add(y + height);
        return this;
    }

    public NinePatchBuilder addYRegionPoints(int y1, int y2) {
        yRegions.add(y1);
        yRegions.add(y2);
        return this;
    }

    public NinePatchBuilder addYRegion(float yPercent, float heightPercent) {
        int ytmp = (int) (yPercent * this.height);
        yRegions.add(ytmp);
        yRegions.add(ytmp + (int) (heightPercent * this.height));
        return this;
    }

    public NinePatchBuilder addYRegionPoints(float y1Percent, float y2Percent) {
        yRegions.add((int) (y1Percent * this.height));
        yRegions.add((int) (y2Percent * this.height));
        return this;
    }

    public NinePatchBuilder addYCenteredRegion(int height) {
        int y = (int) ((this.height - height) / 2);
        yRegions.add(y);
        yRegions.add(y + height);
        return this;
    }

    public NinePatchBuilder addYCenteredRegion(float heightPercent) {
        int height = (int) (heightPercent * this.height);
        int y = (int) ((this.height - height) / 2);
        yRegions.add(y);
        yRegions.add(y + height);
        return this;
    }

    public byte[] buildChunk() {
        if (xRegions.size() == 0) {
            xRegions.add(0);
            xRegions.add(width);
        }
        if (yRegions.size() == 0) {
            yRegions.add(0);
            yRegions.add(height);
        }
        /* example code from an answer above
        // The 9 patch segment is not a solid color.
        private static final int NO_COLOR = 0x00000001;
        ByteBuffer buffer = ByteBuffer.allocate(56).order(ByteOrder.nativeOrder());
        //was translated
        buffer.put((byte)0x01);
        //divx size
        buffer.put((byte)0x02);
        //divy size
        buffer.put((byte)0x02);
        //color size
        buffer.put((byte)0x02);
        //skip
        buffer.putInt(0);
        buffer.putInt(0);
        //padding
        buffer.putInt(0);
        buffer.putInt(0);
        buffer.putInt(0);
        buffer.putInt(0);
        //skip 4 bytes
        buffer.putInt(0);
        buffer.putInt(left);
        buffer.putInt(right);
        buffer.putInt(top);
        buffer.putInt(bottom);
        buffer.putInt(NO_COLOR);
        buffer.putInt(NO_COLOR);
        return buffer; */
        int NO_COLOR = 1; // 0x00000001
        int COLOR_SIZE = 9; // could change, may be 2 or 6 or 15 - but has no effect on output
        int arraySize = 1 + 2 + 4 + 1 + xRegions.size() + yRegions.size() + COLOR_SIZE;
        ByteBuffer byteBuffer = ByteBuffer.allocate(arraySize * 4).order(ByteOrder.nativeOrder());
        byteBuffer.put((byte) 1); // was translated
        byteBuffer.put((byte) xRegions.size()); // divisions x
        byteBuffer.put((byte) yRegions.size()); // divisions y
        byteBuffer.put((byte) COLOR_SIZE); // color size
        // skip
        byteBuffer.putInt(0);
        byteBuffer.putInt(0);
        // padding -- always 0 -- left right top bottom
        byteBuffer.putInt(0);
        byteBuffer.putInt(0);
        byteBuffer.putInt(0);
        byteBuffer.putInt(0);
        // skip
        byteBuffer.putInt(0);
        for (int rx : xRegions)
            byteBuffer.putInt(rx); // regions left right left right ...
        for (int ry : yRegions)
            byteBuffer.putInt(ry); // regions top bottom top bottom ...
        for (int i = 0; i < COLOR_SIZE; i++)
            byteBuffer.putInt(NO_COLOR);
        return byteBuffer.array();
    }

    public NinePatch buildNinePatch() {
        byte[] chunk = buildChunk();
        if (bitmap != null)
            return new NinePatch(bitmap, chunk, null);
        return null;
    }

    public NinePatchDrawable build() {
        NinePatch ninePatch = buildNinePatch();
        if (ninePatch != null)
            return new NinePatchDrawable(resources, ninePatch);
        return null;
    }
}
Now we can use the builder to create a NinePatch, a NinePatchDrawable, or just the nine-patch chunk.
Example:
NinePatchBuilder builder = new NinePatchBuilder(getResources(), bitmap);
NinePatchDrawable drawable = builder.addXCenteredRegion(2).addYCenteredRegion(2).build();

// or add multiple patches
NinePatchBuilder builder = new NinePatchBuilder(getResources(), bitmap);
builder.addXRegion(30, 2).addXRegion(50, 1).addYRegion(20, 4);
byte[] chunk = builder.buildChunk();
NinePatch ninepatch = builder.buildNinePatch();
NinePatchDrawable drawable = builder.build();

// if you don't need the NinePatch and only want the chunk:
NinePatchBuilder builder = new NinePatchBuilder(width, height);
byte[] chunk = builder.addXCenteredRegion(1).addYCenteredRegion(1).buildChunk();
Just copy and paste the NinePatchBuilder class into a Java file and use the examples above to create NinePatches on the fly at runtime, at any resolution.

The Bitmap class provides a method to do this: yourBitmap.getNinePatchChunk(). I've never used it, but it seems like that's what you're looking for.

Related

AndroidSVG fuzzy edges on image

I want to display a barcode on Android. As input I get an SVG string. As an SVG library I use AndroidSVG. I used the sample code from the library website and everything seemed to be fine. But when I zoom in on the image, I get distorted edges (anti-aliasing?). I tried disabling all the flags, but the image still has fuzzy edges. What could be wrong with my code?
Picture: (screenshot omitted; zoom in to the maximum and you will see the fuzzy edges.)
Code:
private void loadQRCode(String svgString) {
    SVG svg = null;
    try {
        svg = SVG.getFromString(svgString);
    } catch (SVGParseException e) {
        e.printStackTrace();
    }
    if (svg.getDocumentWidth() != -1) {
        int widthPx = Utils.pxFromDp(400);
        int heightPx = Utils.pxFromDp(300);
        svg.setDocumentWidth(widthPx);
        svg.setDocumentHeight(heightPx);
        int width = (int) Math.ceil(svg.getDocumentWidth());
        int height = (int) Math.ceil(svg.getDocumentHeight());
        Bitmap newBM = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
        Canvas bmcanvas = new Canvas(newBM);
        final DrawFilter filter = new PaintFlagsDrawFilter(Paint.ANTI_ALIAS_FLAG | Paint.FILTER_BITMAP_FLAG | Paint.DITHER_FLAG, 0);
        bmcanvas.setDrawFilter(filter);
        barcode.setLayerType(View.LAYER_TYPE_SOFTWARE, null);
        bmcanvas.drawRGB(255, 255, 255);
        svg.renderToCanvas(bmcanvas);
        barcode.setImageBitmap(newBM);
    }
}
If the edges of the bars do not lie exactly on pixel boundaries, you will get anti-aliasing. On a high resolution screen, this should not normally be visible.
However, in your code, you are rendering the SVG to a bitmap and setting the bitmap on an ImageView. If that ImageView has a size larger than the bitmap, i.e. greater than 400 x 300, then the anti-aliased pixels in that bitmap will likely be rendered larger and thus more visible.
One solution is to avoid using a bitmap. Use a Picture/PictureDrawable instead. That way the barcode will be rendered at highest quality no matter what size it is. As vector graphics are supposed to be.
Follow the example on this page:
http://bigbadaboom.github.io/androidsvg/use_with_ImageView.html
So your code should probably look something like the following:
private void loadQRCode(String svgString) {
    try {
        SVG svg = SVG.getFromString(svgString);
        barcode.setLayerType(View.LAYER_TYPE_SOFTWARE, null);
        Drawable drawable = new PictureDrawable(svg.renderToPicture());
        barcode.setImageDrawable(drawable);
    } catch (SVGParseException e) {
        e.printStackTrace();
    }
}
If for some reason you need to use bitmaps - maybe you are caching them or something - then you should watch for changes in the size of the ImageView and then recreate the bitmap at the new size. So the bitmap is always the same size as the ImageView to which it is assigned.
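If you do stick with bitmaps, here is a rough sketch of watching for size changes and re-rendering at the ImageView's actual size (svg and barcode are the objects from the question; this is untested):

barcode.addOnLayoutChangeListener(new View.OnLayoutChangeListener() {
    @Override
    public void onLayoutChange(View v, int left, int top, int right, int bottom,
                               int oldLeft, int oldTop, int oldRight, int oldBottom) {
        int width = right - left;
        int height = bottom - top;
        if (width <= 0 || height <= 0) return;
        if (width == oldRight - oldLeft && height == oldBottom - oldTop) return; // size unchanged

        // Re-render the SVG at exactly the view's current size
        svg.setDocumentWidth(width);
        svg.setDocumentHeight(height);
        Bitmap bm = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
        Canvas canvas = new Canvas(bm);
        canvas.drawRGB(255, 255, 255);
        svg.renderToCanvas(canvas);
        barcode.setImageBitmap(bm);
    }
});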

Image from Android expansion file appears too small

I am in the process of implementing expansion files for my Android app.
So far, I've placed several images and audio files into the main expansion file (I am not using the patch expansion file). None of the files have been compressed.
Through the app, I am able to download the expansion file and play the audio files without any problems. While an audio file from the expansion file is playing, I also display an image from the expansion file. However, the image is considerably smaller than I expected.
The image is 320x400px. Before implementing the expansion files, it was displayed as expected in my app. However, after implementation, it looks like the image shrank to about 50px wide (the height shrank in proportion).
I then tried the solution offered in How to create a drawable from a stream without resizing it. While the image does appear slightly larger, it is still much smaller than what I want it to be (looks like it's about 100x125 px now). Currently, my code for displaying the image looks like this:
public void displayImageFromExpansionFile() {
    Bitmap b = BitmapFactory.decodeStream(fileStream);
    b.setDensity(Bitmap.DENSITY_NONE);
    Drawable d = new BitmapDrawable(this.getResources(), b);
    imageToDisplay.setImageDrawable(d);
}
public void showImg(int imgNum) {
    switch (imgNum) {
        case 1:
            try {
                if ((getResources().getConfiguration().screenLayout &
                        Configuration.SCREENLAYOUT_SIZE_MASK) == Configuration.SCREENLAYOUT_SIZE_SMALL) {
                    fileStream = expansionFile.getInputStream("filepath inside expansion file for small image");
                } else if ((getResources().getConfiguration().screenLayout &
                        Configuration.SCREENLAYOUT_SIZE_MASK) == Configuration.SCREENLAYOUT_SIZE_NORMAL) {
                    fileStream = expansionFile.getInputStream("filepath inside expansion file for normal image");
                } else if ((getResources().getConfiguration().screenLayout &
                        Configuration.SCREENLAYOUT_SIZE_MASK) == Configuration.SCREENLAYOUT_SIZE_LARGE) {
                    fileStream = expansionFile.getInputStream("filepath inside expansion file for large image");
                } else {
                    fileStream = expansionFile.getInputStream("filepath inside expansion file for xlarge image");
                }
                displayImageFromExpansionFile();
            } catch (Exception e) {
                e.printStackTrace();
            }
            break;
        // more cases here
    }
}
It still seems as if the image is not being displayed at its actual size. When I examine the image inside the expansion file, I can see that it is still at 320x400px. However, the app is not displaying the image at these dimensions.
What could I do to get the app to display the image at its correct dimensions?
Thanks!
---UPDATE---
I've also tried the code below, with no difference in results. It still looks to be about 100x125px, instead of 320x400px, like its original size.
public void displayImageFromExpansionFile(int bWidth, int bHeight) {
    BitmapFactory.Options bfo = new BitmapFactory.Options();
    bfo.inPreferredConfig = Bitmap.Config.ARGB_8888;
    Bitmap b = BitmapFactory.decodeStream(fileStream, null, bfo);
    b.setDensity(Bitmap.DENSITY_NONE);
    imageToDisplay.setImageBitmap(b);
}
The only thing that's worked so far is Bitmap.createScaledBitmap(b, bWidth, bHeight, true);
On my phone, doubling the image's original dimensions (to 640x800 px) using the above method brings the image up to its expected size, but I would imagine that the image might appear at different sizes on different phones (probably because of screen density/size). When I tried doubling the xlarge image dimensions and viewed it on my tablet, the image appears larger than it should.
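A hedged sketch of scaling by the device density instead of hard-coding doubled dimensions (the helper name is hypothetical, and it assumes the source art was drawn for mdpi, i.e. 160 dpi):

public Bitmap scaleForDensity(Bitmap src) {
    // density is 1.0 on mdpi, 1.5 on hdpi, 2.0 on xhdpi, ...
    float density = getResources().getDisplayMetrics().density;
    int width = Math.round(src.getWidth() * density);
    int height = Math.round(src.getHeight() * density);
    return Bitmap.createScaledBitmap(src, width, height, true);
}

Alternatively, instead of Bitmap.DENSITY_NONE, calling b.setDensity(DisplayMetrics.DENSITY_MEDIUM) tells the framework the bitmap was authored for mdpi, so it should be scaled automatically when drawn.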
In the end, I fiddled around with the layout XML file that was displaying the shrunken image. I changed the attributes of the imageView like so:
Width / Height: fill_parent (both)
Scale Type: fitCenter
Now, the images are slightly bigger than I initially expected, but I think it looks better this way. And now the images are of a consistent size. Problem solved!
If anyone else is having trouble with what I've been experiencing, hope this helps.
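For reference, here is a rough sketch of setting the same attributes in code rather than in the layout XML (this assumes the imageToDisplay view from the snippets above and a LinearLayout parent; adjust the LayoutParams class to your container):

imageToDisplay.setLayoutParams(new LinearLayout.LayoutParams(
        ViewGroup.LayoutParams.MATCH_PARENT,    // width: fill_parent
        ViewGroup.LayoutParams.MATCH_PARENT));  // height: fill_parent
imageToDisplay.setScaleType(ImageView.ScaleType.FIT_CENTER);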
That's what worked for me:
public static Bitmap getBitmapExt(Context context, String name) {
    Bitmap bitmap = null;
    try {
        if (expansionFile == null) {
            PackageInfo pInfo = context.getPackageManager().getPackageInfo(context.getPackageName(), 0);
            expansionFile = APKExpansionSupport.getAPKExpansionZipFile(context, pInfo.versionCode, 0);
        }
        InputStream is = expansionFile.getInputStream("images/" + name + ".png");
        bitmap = BitmapFactory.decodeStream(is);
    } catch (Exception e) {
        e.printStackTrace();
    }
    return bitmap;
}
https://gist.github.com/uebi-belli/3c07323c8055a3b66ccac0d388b2b013
Hope that helps =)

NinePatchDrawable does not get padding from chunk

I need help with NinePatchDrawable:
My app can download themes from the network.
Almost all things work fine, except 9-Patch PNGs.
final Bitmap bubble = getFromTheme("bubble");
if (bubble == null) return null;
final byte[] chunk = bubble.getNinePatchChunk();
if (!NinePatch.isNinePatchChunk(chunk)) return null;
NinePatchDrawable d = new NinePatchDrawable(getResources(), bubble, chunk, new Rect(), null);
v.setBackgroundDrawable(d);
d = null;
System.gc();
getFromTheme() loads the Bitmap from the SD card. The 9-Patch PNGs are already compiled, which means they include the required chunk.
The way I convert the Bitmap to a NinePatchDrawable object seems to be working, because the image stretches just as I drew it.
The only thing that doesn't work is the padding. I already tried to set the padding to the view like this:
final Rect rect = new Rect(); // or just use the new Rect() set
d.getPadding(rect); // in the constructor
v.setPadding(rect.left, rect.top, rect.right, rect.bottom);
d.getPadding(rect) should fill the variable rect with the padding read from the chunk, shouldn't it? But it doesn't.
Result: The TextView (v) does not show the text in the content area of the 9-Patch image. The padding is 0 on every side.
Thanks for reading.
Finally, I did it. Android wasn't interpreting the chunk data correctly. There might be a bug. So you have to deserialize the chunk yourself to get the padding data.
Here we go:
package com.dragonwork.example;

import android.graphics.Rect;

import java.nio.ByteBuffer;
import java.nio.ByteOrder;

class NinePatchChunk {

    public static final int NO_COLOR = 0x00000001;
    public static final int TRANSPARENT_COLOR = 0x00000000;

    public final Rect mPaddings = new Rect();

    public int mDivX[];
    public int mDivY[];
    public int mColor[];

    private static void readIntArray(final int[] data, final ByteBuffer buffer) {
        for (int i = 0, n = data.length; i < n; ++i)
            data[i] = buffer.getInt();
    }

    private static void checkDivCount(final int length) {
        if (length == 0 || (length & 0x01) != 0)
            throw new RuntimeException("invalid nine-patch: " + length);
    }

    public static NinePatchChunk deserialize(final byte[] data) {
        final ByteBuffer byteBuffer =
                ByteBuffer.wrap(data).order(ByteOrder.nativeOrder());

        if (byteBuffer.get() == 0) return null; // is not serialized

        final NinePatchChunk chunk = new NinePatchChunk();
        chunk.mDivX = new int[byteBuffer.get()];
        chunk.mDivY = new int[byteBuffer.get()];
        chunk.mColor = new int[byteBuffer.get()];

        checkDivCount(chunk.mDivX.length);
        checkDivCount(chunk.mDivY.length);

        // skip 8 bytes
        byteBuffer.getInt();
        byteBuffer.getInt();

        chunk.mPaddings.left = byteBuffer.getInt();
        chunk.mPaddings.right = byteBuffer.getInt();
        chunk.mPaddings.top = byteBuffer.getInt();
        chunk.mPaddings.bottom = byteBuffer.getInt();

        // skip 4 bytes
        byteBuffer.getInt();

        readIntArray(chunk.mDivX, byteBuffer);
        readIntArray(chunk.mDivY, byteBuffer);
        readIntArray(chunk.mColor, byteBuffer);

        return chunk;
    }
}
Use the class above as follows:
final byte[] chunk = bitmap.getNinePatchChunk();
if (NinePatch.isNinePatchChunk(chunk)) {
    textView.setBackgroundDrawable(new NinePatchDrawable(getResources(),
            bitmap, chunk, NinePatchChunk.deserialize(chunk).mPaddings, null));
}
And it will work perfectly!
It's actually slightly more complicated than that, but what it boils down to is pretty simple:
The padding rect is returned by BitmapFactory.decodeStream(InputStream, Rect, Options). There is no version of decodeByteArray() which can return the padding rect.
The whole nine-patch API is a bit silly:
decodeByteArray() calls nativeDecodeByteArray(), which is presumably more efficient than nativeDecodeStream() on a ByteArrayInputStream, but obviously the devs never expected you to want to decode a nine-patch from memory.
The padding rect is only used by nine-patches, so it makes more sense for it to be part of NinePatch instead of BitmapFactory. Sadly, NinePatch.java is not much more than a wrapper that passes the bitmap and nine-patch chunk to drawing methods (and most of the NinePatch.draw() calls aren't thread-safe due to the call to mRect.set(location)).
NinePatchDrawable doesn't offer a way to take a NinePatch and a padding rect, which makes NinePatch somewhat useless in application code (unless you want to do the padding yourself). There is no NinePatchDrawable.getNinePatch() or NinePatch.getBitmap().
This comment sums it up pretty well:
ugh. The decodeStream contract is that we have already allocated
the pad rect, but if the bitmap does not had a ninepatch chunk,
then the pad will be ignored. If we could change this to lazily
alloc/assign the rect, we could avoid the GC churn of making new
Rects only to drop them on the floor.
My fix is fairly simple:
public final class NinePatchWrapper {

    private final Bitmap mBitmap;
    private final Rect mPadding;

    /**
     * The caller must ensure that the bitmap and padding are not modified after
     * this method returns. We could copy them, but Bitmap.createBitmap(Bitmap)
     * does not copy the nine-patch chunk on some Android versions.
     */
    public NinePatchWrapper(Bitmap bitmap, Rect padding) {
        mBitmap = bitmap;
        mPadding = padding;
    }

    public NinePatchDrawable newDrawable(Resources resources) {
        return new NinePatchDrawable(resources, mBitmap, mBitmap.getNinePatchChunk(), mPadding, null);
    }
}
...
public NinePatchWrapper decodeNinePatch(byte[] byteArray, int density) {
    Rect padding = new Rect();
    ByteArrayInputStream stream = new ByteArrayInputStream(byteArray);
    Bitmap bitmap = BitmapFactory.decodeStream(stream, padding, null);
    bitmap.setDensity(density);
    return new NinePatchWrapper(bitmap, padding);
}
Untested, since it's greatly simplified. In particular, you might want to check that the nine-patch chunk is valid.
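One way to do that check, as a minimal sketch (the helper name is made up):

public static boolean isValidNinePatch(Bitmap bitmap) {
    byte[] chunk = bitmap.getNinePatchChunk();
    // isNinePatchChunk() rejects null and malformed chunk data.
    return chunk != null && NinePatch.isNinePatchChunk(chunk);
}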
I've never seen an example where the padding isn't included as part of the 9-patch itself, via the guide lines on the right and bottom edges of the source image.
To do this you should first construct a NinePatch and then create your Drawable from it:
NinePatch ninePatch = new NinePatch(bitmap, chunk, srcName);
NinePatchDrawable d = new NinePatchDrawable(res, ninePatch);
However, you seem to be constructing your Drawable with an empty rectangle:
NinePatchDrawable d = new NinePatchDrawable(getResources(), bubble, chunk, new Rect(), null);
If you want to programmatically specify the padding, try this:
Rect paddingRectangle = new Rect(left, top, right, bottom);
NinePatchDrawable d = new NinePatchDrawable(getResources(), bubble, chunk, paddingRectangle, null);
A bit late to the party, but here is how I solved it:
I use the decoder method that NinePatchDrawable provides; it reads the padding correctly:
var myDrawable = NinePatchDrawable.createFromStream(sr, null);
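A short usage sketch (stream stands for whatever InputStream the compiled 9-patch is loaded from, and view is the target view; both are placeholders):

Drawable d = NinePatchDrawable.createFromStream(stream, null);
Rect padding = new Rect();
if (d != null && d.getPadding(padding)) {
    // padding now holds the values read from the chunk
    view.setPadding(padding.left, padding.top, padding.right, padding.bottom);
}
view.setBackgroundDrawable(d);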

Access to raw data in ARGB_8888 Android Bitmap

I am trying to access the raw data of a Bitmap in ARGB_8888 format on Android, using the copyPixelsToBuffer and copyPixelsFromBuffer methods. However, invocation of those calls seems to always apply the alpha channel to the rgb channels. I need the raw data in a byte[] or similar (to pass through JNI; yes, I know about bitmap.h in Android 2.2, cannot use that).
Here is a sample:
// Create 1x1 Bitmap with alpha channel, 8 bits per channel
Bitmap one = Bitmap.createBitmap(1,1,Bitmap.Config.ARGB_8888);
one.setPixel(0,0,0xef234567);
Log.v("?","hasAlpha() = "+Boolean.toString(one.hasAlpha()));
Log.v("?","pixel before = "+Integer.toHexString(one.getPixel(0,0)));
// Copy Bitmap to buffer
byte[] store = new byte[4];
ByteBuffer buffer = ByteBuffer.wrap(store);
one.copyPixelsToBuffer(buffer);
// Change value of the pixel
int value=buffer.getInt(0);
Log.v("?", "value before = "+Integer.toHexString(value));
value = (value >> 8) | 0xffffff00;
buffer.putInt(0, value);
value=buffer.getInt(0);
Log.v("?", "value after = "+Integer.toHexString(value));
// Copy buffer back to Bitmap
buffer.position(0);
one.copyPixelsFromBuffer(buffer);
Log.v("?","pixel after = "+Integer.toHexString(one.getPixel(0,0)));
The log then shows
hasAlpha() = true
pixel before = ef234567
value before = 214161ef
value after = ffffff61
pixel after = 619e9e9e
I understand that the order of the ARGB channels is different; that's fine. But I don't want the alpha channel to be applied upon every copy (which is what it seems to be doing).
Is this how copyPixelsToBuffer and copyPixelsFromBuffer are supposed to work? Is there any way to get the raw data in a byte[]?
Added in response to answer below:
Putting in buffer.order(ByteOrder.nativeOrder()); before the copyPixelsToBuffer does change the result, but still not in the way I want it:
pixel before = ef234567
value before = ef614121
value after = ffffff41
pixel after = ff41ffff
Seems to suffer from essentially the same problem (alpha being applied upon each copyPixelsFrom/ToBuffer).
One way to access data in a Bitmap is to use the getPixels() method. Below you can find an example I used to get a grayscale image from ARGB data and then back from a byte array to a Bitmap (of course, if you need RGB you reserve 3x the bytes and save them all...):
/* Free-to-use licence by Sami Varjo (but nice if you retain this line) */
public final class BitmapConverter {

    private BitmapConverter() {}

    /**
     * Get grayscale data from an ARGB image into a byte array
     */
    public static byte[] ARGB2Gray(Bitmap img) {
        int width = img.getWidth();
        int height = img.getHeight();

        int[] pixels = new int[height * width];
        byte grayIm[] = new byte[height * width];

        img.getPixels(pixels, 0, width, 0, 0, width, height);

        int pixel = 0;
        int count = width * height;
        while (count-- > 0) {
            int inVal = pixels[pixel];
            // Get the pixel channel values from the int
            double r = (double) ((inVal & 0x00ff0000) >> 16);
            double g = (double) ((inVal & 0x0000ff00) >> 8);
            double b = (double) (inVal & 0x000000ff);
            grayIm[pixel++] = (byte) (0.2989 * r + 0.5870 * g + 0.1140 * b);
        }

        return grayIm;
    }

    /**
     * Create a grayscale bitmap from a byte array
     */
    public static Bitmap gray2ARGB(byte[] data, int width, int height) {
        int count = height * width;
        int[] outPix = new int[count];

        int pixel = 0;
        while (count-- > 0) {
            int val = data[pixel] & 0xff; // convert byte to unsigned
            outPix[pixel++] = 0xff000000 | val << 16 | val << 8 | val;
        }

        Bitmap out = Bitmap.createBitmap(outPix, 0, width, width, height, Bitmap.Config.ARGB_8888);
        return out;
    }
}
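A quick round-trip usage sketch of the class above (src stands for any ARGB_8888 bitmap you already have):

byte[] gray = BitmapConverter.ARGB2Gray(src);
Bitmap grayBitmap = BitmapConverter.gray2ARGB(gray, src.getWidth(), src.getHeight());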
My guess is that this might have to do with the byte order of the ByteBuffer you are using. ByteBuffer uses big endian by default.
Set endianess on the buffer with
buffer.order(ByteOrder.nativeOrder());
See if it helps.
Moreover, copyPixelsFromBuffer/copyPixelsToBuffer does not change the pixel data in any way. They are copied raw.
I realize this is very stale and probably won't help you now, but I came across this recently in trying to get copyPixelsFromBuffer to work in my app. (Thank you for asking this question, btw! You saved me tons of time in debugging.) I'm adding this answer in the hopes it helps others like me going forward...
Although I haven't used this yet to ensure that it works, it looks like, as of API level 19, we'll finally have a way to specify not to "apply the alpha" (a.k.a. premultiply) within Bitmap. They're adding a setPremultiplied(boolean) method that should help in situations like this going forward by allowing us to specify false.
I hope this helps!
This is an old question, but I ran into the same issue and just figured out that the bitmap bytes are pre-multiplied. You can set the bitmap (as of API 19) not to pre-multiply the buffer, but the API makes no guarantee.
From the docs:
public final void setPremultiplied(boolean premultiplied)
Sets whether the bitmap should treat its data as pre-multiplied.
Bitmaps are always treated as pre-multiplied by the view system and Canvas for performance reasons. Storing un-pre-multiplied data in a Bitmap (through setPixel, setPixels, or BitmapFactory.Options.inPremultiplied) can lead to incorrect blending if drawn by the framework.
This method will not affect the behaviour of a bitmap without an alpha channel, or if hasAlpha() returns false.
Calling createBitmap or createScaledBitmap with a source Bitmap whose colors are not pre-multiplied may result in a RuntimeException, since those functions require drawing the source, which is not supported for un-pre-multiplied Bitmaps.
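Based on those docs, here is a hedged sketch of what reading back un-premultiplied pixels could look like on API 19+ (reusing the 1x1 example from the question; untested):

Bitmap one = Bitmap.createBitmap(1, 1, Bitmap.Config.ARGB_8888);
one.setPremultiplied(false); // API 19+: keep the stored bytes un-premultiplied
one.setPixel(0, 0, 0xef234567);

byte[] store = new byte[4];
ByteBuffer buffer = ByteBuffer.wrap(store).order(ByteOrder.nativeOrder());
one.copyPixelsToBuffer(buffer); // store should now hold the raw, un-premultiplied RGBA bytes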
