In my project I have a bitmap image. I need to convert this picture to a byte array so that I can manipulate some bytes and then save it as an image again.
With this code: image = BitmapFactory.decodeResource(context.getResources(), R.drawable.tasnim); I have access to the width and height, but how can I get access to the bytes of this image?
Thanks
AFAIK the most correct way is:
ByteBuffer copyToBuffer(Bitmap bitmap) {
    int size = bitmap.getHeight() * bitmap.getRowBytes();
    ByteBuffer buffer = ByteBuffer.allocateDirect(size);
    bitmap.copyPixelsToBuffer(buffer);
    return buffer;
}
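If you then need a plain byte[], and want to push the edited bytes back into the bitmap, something like this should work (a sketch; it assumes the bitmap is mutable, e.g. decoded with BitmapFactory.Options.inMutable or obtained via Bitmap.copy()):
ByteBuffer buffer = copyToBuffer(bitmap);
buffer.rewind();
byte[] bytes = new byte[buffer.remaining()];
buffer.get(bytes);                    // raw pixel bytes; layout depends on bitmap.getConfig()
// ... manipulate bytes here ...
buffer.rewind();
buffer.put(bytes);
buffer.rewind();
bitmap.copyPixelsFromBuffer(buffer);  // requires a mutable bitmap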
I'm assuming the OP wants to manipulate the pixels, not the header information of the image.
Assuming your image is a Bitmap:
int w = image.getWidth(), h = image.getHeight();
int[] rgbStream = new int[w * h];
image.getPixels(rgbStream, 0, w, 0, 0, w, h);
Of course, this gets you the pixel values as integers, but you can always convert them again:
int t = w * h;
for (int i = 0; i < t; i++) {
    int pixel = rgbStream[i];      // get pixel value (ARGB)
    int A = (pixel >> 24) & 0xFF;  // isolate alpha value
    int R = (pixel >> 16) & 0xFF;  // isolate red channel
    int G = (pixel >> 8) & 0xFF;   // isolate green channel
    int B = pixel & 0xFF;          // isolate blue channel
    // NOTE: A, R, G, B can each be cast to a byte
}
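And for the "save it as an image" part of the question, a minimal sketch (assuming image is mutable and outFile is a File of your choosing):
image.setPixels(rgbStream, 0, w, 0, 0, w, h);          // write the modified pixels back
FileOutputStream fos = new FileOutputStream(outFile);
image.compress(Bitmap.CompressFormat.PNG, 100, fos);   // or JPEG with a quality value
fos.close();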
I'm trying to write a couple of methods to convert an Android Bitmap to an RGBA byte array and then back to a Bitmap. The problem is that I can't seem to get the formula right, because the colors always come back wrong. I have tried several different assumptions, but to no avail.
So, this is the method to convert from Bitmap to RGBA that I think is fine:
public static byte[] bitmapToRgba(Bitmap bitmap) {
int[] pixels = new int[bitmap.getWidth() * bitmap.getHeight()];
byte[] bytes = new byte[pixels.length * 4];
bitmap.getPixels(pixels, 0, bitmap.getWidth(), 0, 0, bitmap.getWidth(), bitmap.getHeight());
int i = 0;
for (int pixel : pixels) {
// Get components assuming the pixel is ARGB
int A = (pixel >> 24) & 0xff;
int R = (pixel >> 16) & 0xff;
int G = (pixel >> 8) & 0xff;
int B = pixel & 0xff;
bytes[i++] = (byte) R;
bytes[i++] = (byte) G;
bytes[i++] = (byte) B;
bytes[i++] = (byte) A;
}
return bytes;
}
And this is the method aimed at creating back a bitmap from those bytes that is not working as expected:
public static Bitmap bitmapFromRgba(int width, int height, byte[] bytes) {
int[] pixels = new int[bytes.length / 4];
int j = 0;
// It turns out Bitmap.Config.ARGB_8888 is in reality RGBA_8888!
// Source: https://stackoverflow.com/a/47982505/1160360
// Now, according to my own experiments, it seems it is ABGR... this sucks.
// So we have to change the order of the components
for (int i = 0; i < pixels.length; i++) {
byte R = bytes[j++];
byte G = bytes[j++];
byte B = bytes[j++];
byte A = bytes[j++];
int pixel = (A << 24) | (B << 16) | (G << 8) | R;
pixels[i] = pixel;
}
Bitmap bitmap = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
bitmap.copyPixelsFromBuffer(IntBuffer.wrap(pixels));
return bitmap;
}
That's my last implementation, though I have tried several different ones without success. I'm assuming createBitmap expects ABGR in spite of specifying ARGB_8888, because I have done experiments hardcoding all the pixels to values like:
0xff_ff_00_00 -> got blue
0xff_00_ff_00 -> got green
0xff_00_00_ff -> got red
Anyway maybe that assumption is wrong and a consequence of some other mistaken one before.
I think the main problem may be related to the use of signed numeric values, since there are no unsigned ones in Java (well, there is some support in Java 8+, but on one hand I don't think it should be necessary to use it, and on the other it is not available on the older Android versions I need to support).
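(For illustration, this is the kind of effect I mean with the signed bytes:)
byte r = (byte) 0xFF;          // stored as -1
int bad = r << 16;             // 0xFFFF0000 because of sign extension
int good = (r & 0xFF) << 16;   // 0x00FF0000, which is what I actually want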
Any help will be very appreciated.
Thanks a lot in advance!
I solved it myself. There are a number of issues, but they all began with this line:
bitmap.copyPixelsFromBuffer(IntBuffer.wrap(pixels));
That seems to mix up the color components in the wrong way. Maybe it's something related to byte order (little/big endian stuff); in any case, I worked around it by using setPixels instead:
bitmap.setPixels(pixels, 0, width, 0, 0, width, height);
This is the final code that's working as expected, just in case it's useful for someone else:
public static byte[] bitmapToRgba(Bitmap bitmap) {
if (bitmap.getConfig() != Bitmap.Config.ARGB_8888)
throw new IllegalArgumentException("Bitmap must be in ARGB_8888 format");
int[] pixels = new int[bitmap.getWidth() * bitmap.getHeight()];
byte[] bytes = new byte[pixels.length * 4];
bitmap.getPixels(pixels, 0, bitmap.getWidth(), 0, 0, bitmap.getWidth(), bitmap.getHeight());
int i = 0;
for (int pixel : pixels) {
// Get components assuming the pixel is ARGB
int A = (pixel >> 24) & 0xff;
int R = (pixel >> 16) & 0xff;
int G = (pixel >> 8) & 0xff;
int B = pixel & 0xff;
bytes[i++] = (byte) R;
bytes[i++] = (byte) G;
bytes[i++] = (byte) B;
bytes[i++] = (byte) A;
}
return bytes;
}
public static Bitmap bitmapFromRgba(int width, int height, byte[] bytes) {
int[] pixels = new int[bytes.length / 4];
int j = 0;
for (int i = 0; i < pixels.length; i++) {
int R = bytes[j++] & 0xff;
int G = bytes[j++] & 0xff;
int B = bytes[j++] & 0xff;
int A = bytes[j++] & 0xff;
int pixel = (A << 24) | (R << 16) | (G << 8) | B;
pixels[i] = pixel;
}
Bitmap bitmap = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
bitmap.setPixels(pixels, 0, width, 0, 0, width, height);
return bitmap;
}
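Usage is then just a round trip through the byte array; for example (assuming sourceBitmap is an ARGB_8888 bitmap):
byte[] rgba = bitmapToRgba(sourceBitmap);
// e.g. zero out the blue channel (the bytes are laid out R, G, B, A)
for (int i = 2; i < rgba.length; i += 4) rgba[i] = 0;
Bitmap result = bitmapFromRgba(sourceBitmap.getWidth(), sourceBitmap.getHeight(), rgba);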
I am doing some image processing which requires converting an RGB bitmap image to the YCbCr color space. I retrieve the RGB value for each pixel and apply the conversion matrix to it.
public void convertRGB (View v) {
if (imageLoaded) {
int width = inputBM.getWidth();
int height = inputBM.getHeight();
int pixel;
int alpha, red, green, blue;
int Y,Cb,Cr;
outputBM = Bitmap.createBitmap(width, height, inputBM.getConfig());
for (int x = 0; x < width; x++) {
for (int y = 0; y < height; y++) {
pixel = inputBM.getPixel(x, y);
alpha = Color.alpha(pixel);
red = Color.red(pixel);
green = Color.green(pixel);
blue = Color.blue(pixel);
Y = (int) (0.299 * red + 0.587 * green + 0.114 * blue);
Cb = (int) (128-0.169 * red-0.331 * green + 0.500 * blue);
Cr = (int) (128+0.500 * red - 0.419 * green - 0.081 * blue);
int p = (Y << 24) | (Cb << 16) | (Cr<<8);
outputBM.setPixel(x,y,p);
}
}
comImgView.setImageBitmap(outputBM);
}
}
The problem is that the output color is different from the original. I tried to use BufferedImage, but it does not work on Android.
Original:
After Conversion:
May I know what is the correct way to handle a YCbCr image in Android Java?
Try converting it using the code below:
ByteArrayOutputStream out = new ByteArrayOutputStream();
YuvImage yuvImage = new YuvImage(your_yuv_data, ImageFormat.NV21, width, height, null);
yuvImage.compressToJpeg(new Rect(0, 0, width, height), 50, out);
byte[] imageBytes = out.toByteArray();
Bitmap image = BitmapFactory.decodeByteArray(imageBytes, 0, imageBytes.length);
iv.setImageBitmap(image);
Check the documentation for a detailed description of the YuvImage class.
For my android app I am getting a ByteBuffer from native code. It contains the pixel color values to create a bitmap.
Original image -
I used copyPixelsFromBuffer on the bitmap, but I am getting incorrect colors when displaying the bitmap.
Here is the code for this approach -
Approach 1
ByteBuffer buffer = ...
Bitmap bitmap = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
buffer.rewind();
bitmap.copyPixelsFromBuffer(buffer);
Approx. time - ~0.4 ms
Result - Wrong colors -
Approach 2
Next I tried setPixels. It still gives wrong colors, is more than 10 times slower, and uses extra memory for the int[]. Please note that buffer.hasArray() is false, so I can't get the array from the buffer.
ByteBuffer buffer = ...
Bitmap bitmap = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
buffer.rewind();
int[] pixels = new int[width * height];
for (int i = 0; i < width * height; i++) {
int a = buffer.get();
int r = buffer.get();
int g = buffer.get();
int b = buffer.get();
pixels[i] = a << 24 | r << 16 | g << 8 | b;
}
bitmap.setPixels(pixels, 0, width, 0, 0, width, height);
Approx. time - ~4.0 ms
Result - Wrong colors -
Approach 3
This time I used setPixels, but with the pixel values taken from an IntBuffer view of the ByteBuffer. The colors are correct, but the time is still high and there is extra memory allocation.
ByteBuffer buffer = ...
IntBuffer intBuffer = buffer.asIntBuffer();
Bitmap bitmap = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
buffer.rewind();
int[] pixels = new int[width * height];
for (int i = 0; i < width * height; i++) {
pixels[i] = intBuffer.get();
}
bitmap.setPixels(pixels, 0, width, 0, 0, width, height);
Approx. time - ~3.0 ms
Result - Correct colors -
Any hints on why I am getting wrong colors with copyPixelsFromBuffer? I want to use it instead of setPixels as it is faster and does not require extra memory allocation.
I figured out the problem - even though the Bitmap.Config is said to be ARGB_8888, it really is RGBA. I think it is a huge bug in the Android developer documentation and code.
The same issue has been noted in this question - Is Android's ARGB_8888 Bitmap internal format always RGBA?
And the NDK documentation correctly notes the format as ANDROID_BITMAP_FORMAT_RGBA_8888.
The solution is simple: create the buffer in RGBA format, or switch the channels on the Java side, something like below -
// bufferCopy is a second ByteBuffer of the same capacity
for (int i = 0; i < width * height; i++) {
    byte a = buffer.get();
    byte r = buffer.get();
    byte g = buffer.get();
    byte b = buffer.get();
    bufferCopy.put(r);
    bufferCopy.put(g);
    bufferCopy.put(b);
    bufferCopy.put(a);
}
This is not very efficient code, but gets the job done.
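For completeness, the same channel swap can be done in bulk and fed straight into the bitmap; a rough sketch, assuming buffer holds width * height pixels as 4 bytes each in A, R, G, B order:
byte[] px = new byte[width * height * 4];
buffer.rewind();
buffer.get(px);                  // copy the ARGB bytes out of the (direct) buffer
for (int i = 0; i < px.length; i += 4) {
    byte a = px[i];
    px[i]     = px[i + 1];       // R
    px[i + 1] = px[i + 2];       // G
    px[i + 2] = px[i + 3];       // B
    px[i + 3] = a;               // A
}
bitmap.copyPixelsFromBuffer(ByteBuffer.wrap(px));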
The Bitmap.Config.ARGB_8888 documentation mentions, "Use this formula to pack into 32 bits:
int color = (A & 0xff) << 24 | (B & 0xff) << 16 | (G & 0xff) << 8 | (R & 0xff);". So the documentation calls for the color to be packed in what is effectively ABGR bit order even though the config is named ARGB_8888; according to this post it is formally referred to here as ANDROID_BITMAP_FORMAT_RGBA_8888 (ABGR_8888 backwards). In any case, this Kotlin function will convert your colors for use with copyPixelsFromBuffer.
fun androidBitmapFormatRGBA8888(color: Int): Int {
val a = (color shr 24) and 255
val r = (color shr 16) and 255
val g = (color shr 8) and 255
val b = color and 255
return (a shl 24) or (b shl 16) or (g shl 8) or r
}
I have a camera (the "deprecated" API) and a camera PreviewCallback in which I get frames. I use that to take pictures (not takePicture or PictureCallback, because I want speed over quality) and save them as JPEG; code snippet below:
#Override
public synchronized void onPreviewFrame(byte[] frame, Camera camera) {
YuvImage yuv = new YuvImage(frame, previewFormat, width, height, null);
ByteArrayOutputStream out = new ByteArrayOutputStream();
yuv.compressToJpeg(new Rect(0, 0, width, height), 50, out);
out.toByteArray(); // this is my JPEG
// ... save this JPEG
}
I want to draw text, a timestamp, into the frame. I know how to do this by converting to a Bitmap, using a Canvas, drawing the text and converting back again, but this method is relatively slow (and that's not even the biggest issue). I also need the frame byte array for other things, i.e. putting it into a video, and I would like to know if there is some method or library to write text directly into that frame. I don't use the preview (or rather, I use a dummy preview); I'm asking how to change the frame byte array itself. I bet I'd need to do some mumbo-jumbo on the byte array to make it work. The frame is in the standard(?) format (YUV420/NV21).
Edit:
I managed to get a working function (getNV21). Of course, it is far from efficient as it creates a new Bitmap every frame and draws the text to it, but at least I have something to work with that writes directly into the YUV image. Still, answers would be appreciated.
private Bitmap drawText(String text, int rotation) {
Bitmap bitmap = Bitmap.createBitmap(maxWidth, maxHeight, Bitmap.Config.ARGB_8888);
int width = bitmap.getWidth();
int height = bitmap.getHeight();
Canvas canvas = new Canvas(bitmap);
Paint paint = new Paint(Paint.ANTI_ALIAS_FLAG);
paint.setColor(Color.rgb(255, 255, 255));
paint.setStrokeWidth(height/36);
paint.setTextSize(height/36);
paint.setShadowLayer(5f, 0f, 0f, Color.BLACK);
paint.setTypeface(Typeface.MONOSPACE);
if (rotation == 0 || rotation == 180) {
canvas.rotate(rotation,width/2,height/2);
} else {
canvas.translate(Math.abs(width-height), 0);
int w = width/2 - Math.abs(width/2-height/2);
int h = height/2;
canvas.rotate(-rotation,w,h);
}
canvas.drawText(text, 10, height-10, paint);
return bitmap;
}
byte [] getNV21(byte[] yuv, int inputWidth, int inputHeight, Bitmap scaled) {
int [] argb = new int[inputWidth * inputHeight];
scaled.getPixels(argb, 0, inputWidth, 0, 0, inputWidth, inputHeight);
encodeYUV420SP(yuv, argb, inputWidth, inputHeight);
scaled.recycle();
return yuv;
}
private static void encodeYUV420SP(byte[] yuv420sp, int[] argb, int width, int height) {
final int frameSize = width * height;
int yIndex = 0;
int uvIndex = frameSize;
int a, R, G, B, Y, U, V;
int index = 0;
for (int j = 0; j < height; j++) {
for (int i = 0; i < width; i++,index++,yIndex++) {
a = (argb[index] & 0xff000000) >> 24; // a is not used obviously
R = (argb[index] & 0xff0000) >> 16;
G = (argb[index] & 0xff00) >> 8;
B = (argb[index] & 0xff) >> 0;
if (R == 0 && G == 0 && B == 0) {
if (j % 2 == 0 && index % 2 == 0) uvIndex+=2;
continue;
}
// well known RGB to YUV algorithm
Y = ( ( 66 * R + 129 * G + 25 * B + 128) >> 8) + 16;
U = ( ( -38 * R - 74 * G + 112 * B + 128) >> 8) + 128;
V = ( ( 112 * R - 94 * G - 18 * B + 128) >> 8) + 128;
// NV21 has a plane of Y and interleaved planes of VU each sampled by a factor of 2
// meaning for every 4 Y pixels there are 1 V and 1 U. Note the sampling is every other
// pixel AND every other scanline.
yuv420sp[yIndex] = (byte) ((Y < 0) ? 0 : ((Y > 255) ? 255 : Y));
if (j % 2 == 0 && index % 2 == 0) {
yuv420sp[uvIndex++] = (byte) ((V < 0) ? 0 : ((V > 255) ? 255 : V));
yuv420sp[uvIndex++] = (byte) ((U < 0) ? 0 : ((U > 255) ? 255 : U));
}
}
}
}
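For reference, this is roughly how I wire these helpers together in onPreviewFrame (assuming maxWidth/maxHeight equal the preview width/height so the overlay lines up, and timestampText/rotation are computed elsewhere):
Bitmap overlay = drawText(timestampText, rotation);  // white text on an otherwise transparent bitmap
getNV21(frame, width, height, overlay);              // burns the text pixels into the NV21 frame in place
YuvImage yuv = new YuvImage(frame, previewFormat, width, height, null);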
I'm fairly new to programming on the android platform. I'm having some problems massaging an int array to turn it into a Bitmap.
Each element in the int array is a number from 0 to 255, and represents a grey-scale pixel color (so 0 would be black, and 255 would be white, etc...). Here's my code:
int nPixels = 262144;
int width = 512;
int height = 512;
int[] converted = new int[nPixels];
int alpha = 255;
for (int i = 0; i < nPixels; i++) {
//greyscale, so all r, g, and b have the same value
converted[i] = (alpha << 24) | (pixels[i] << 16) | (pixels[i] << 8) | pixels[i];
}
Bitmap bm = Bitmap.createBitmap(converted, width, height, Bitmap.Config.ARGB_8888);
The resulting bitmap has height and width of -1, which indicates that the createBitmap function didn't work. What is wrong with my method?
Oh, I took the bit-shifting part for building the color from http://developer.android.com/resources/samples/ApiDemos/src/com/example/android/apis/graphics/CreateBitmap.html.
Edit: I thought it might have been because alpha was 256 instead of 255, but I still get the same error.
How did you learn that your Bitmap has a [-1, -1] size?
To get the size, use:
int w = bm.getWidth();
int h = bm.getHeight();
This is working fine.
I'm not sure if you can create the bitmap directly from the int array. You may need to iterate through the pixels of the bitmap and set each pixel's value from the array using the bitmap's setPixel method:
bitmap.setPixel(int x, int y, int color)
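A minimal sketch of that approach, reusing converted, width and height from the question (Bitmap.setPixels would do the same thing in one call):
Bitmap bm = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
for (int y = 0; y < height; y++) {
    for (int x = 0; x < width; x++) {
        bm.setPixel(x, y, converted[y * width + x]);  // one grey-scale ARGB pixel at a time
    }
}
// or, in a single call:
// bm.setPixels(converted, 0, width, 0, 0, width, height);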