I am reading a raw image from the network. This image has been read by an image sensor, not from a file.
These are the things I know about the image:
~ Height & Width
~ Total size (in bytes)
~ 8-bit grayscale
~ 1 byte/pixel
I'm trying to convert this image to a Bitmap to display in an ImageView.
Here's what I tried:
BitmapFactory.Options opt = new BitmapFactory.Options();
opt.outHeight = shortHeight; //360
opt.outWidth = shortWidth;//248
imageBitmap = BitmapFactory.decodeByteArray(imageArray, 0, imageSize, opt);
decodeByteArray returns null, since it cannot decode my image.
I also tried reading it directly from the input stream, without converting it to a byte array first:
imageBitmap = BitmapFactory.decodeStream(imageInputStream, null, opt);
This returns null as well.
I've searched on this & other forums, but cannot find a way to achieve this.
Any ideas?
EDIT: I should add that the first thing I did was to check whether the stream actually contains the raw image. I verified this using other applications (iPhone/Windows MFC), and they are able to read it and display the image correctly. I just need to figure out a way to do this in Java/Android.
Android does not support grayscale bitmaps. So the first thing is to expand every gray byte into a 32-bit ARGB pixel: alpha is 0xff, and R, G and B are each copies of the source image's byte value. Then create the bitmap on top of that array.
Also (see comments), it seems that the device treats 0 as white and 255 as black, so we have to invert the source bits.
So, let's assume that the source image is in a byte array called src. Here's the code:
byte[] src; // the raw 8-bit grayscale pixels; comes from somewhere...
byte[] bits = new byte[src.length * 4]; // that's where the RGBA array goes
for (int i = 0; i < src.length; i++) {
    bits[i * 4] =
    bits[i * 4 + 1] =
    bits[i * 4 + 2] = (byte) ~src[i]; // inverted gray value copied into R, G and B
    bits[i * 4 + 3] = (byte) 0xff;    // the alpha
}
// Now put these nice RGBA pixels into a Bitmap object
Bitmap bm = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
bm.copyPixelsFromBuffer(ByteBuffer.wrap(bits)); // ByteBuffer is java.nio.ByteBuffer
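From there the bitmap can go straight into the view from the question (imageView here is a stand-in name for that ImageView):
imageView.setImageBitmap(bm);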
Once I did something like this to decode the byte stream obtained from the camera preview callback. Note that this overload of createBitmap() takes an int[] of packed colors, not a raw byte[], so the bytes have to be repacked first:
Bitmap.createBitmap(imageColors, previewWidth, previewHeight,
        Bitmap.Config.ARGB_8888);
Give it a try.
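For the 8-bit grayscale data in the question, building that imageColors array could look like this (a minimal sketch; imageBytes, previewWidth and previewHeight are stand-in names carried over from the snippet above):
int[] imageColors = new int[previewWidth * previewHeight];
for (int i = 0; i < imageColors.length; i++) {
    int gray = imageBytes[i] & 0xff;             // unsigned byte value
    imageColors[i] = 0xff000000                  // opaque alpha
            | (gray << 16) | (gray << 8) | gray; // same value in R, G and B
}
Bitmap bm = Bitmap.createBitmap(imageColors, previewWidth, previewHeight,
        Bitmap.Config.ARGB_8888);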
The conversion loop above can take a lot of time to turn an 8-bit image into RGBA; a 640x800 image can take more than 500 ms. A quicker solution is to use the ALPHA_8 format for the bitmap, which takes the gray bytes as they are (they land in the alpha channel), and use a color filter on the view to turn that alpha back into something visible:
// set up a color filter that inverts the alpha channel (needed in my case)
float[] mx = new float[] {
        1.0f, 0, 0, 0, 0,    // red
        0, 1.0f, 0, 0, 0,    // green
        0, 0, 1.0f, 0, 0,    // blue
        0, 0, 0, -1.0f, 255  // alpha: out = 255 - in
};
ColorMatrixColorFilter cf = new ColorMatrixColorFilter(mx);
imageView.setColorFilter(cf);
// now set only the alpha channel of the image; without the conversion step this is a lot faster
Bitmap bm = Bitmap.createBitmap(width, height, Bitmap.Config.ALPHA_8);
bm.copyPixelsFromBuffer(ByteBuffer.wrap(src)); // src is not modified, it's just the 8-bit grayscale array
imageView.setImageBitmap(bm);
Use Drawable.createFromStream(). Here's how to do it with an HttpResponse, but you can get the InputStream any way you want:
InputStream stream = response.getEntity().getContent();
Drawable drawable = Drawable.createFromStream(stream, "Get Full Image Task");
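Then hand it to the view (imageView being whatever ImageView should show the result):
imageView.setImageDrawable(drawable);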
I have to draw a byte-array image, which continuously comes from a server, to a GLSurfaceView in a Xamarin Android project. The code does not work for a direct buffer draw, but it works for a bitmap draw.
My code works if I convert the byte array to a bitmap and draw that:
mBitmap = { .. create bitmap from byte[].. }
GLUtils.TexImage2D(GLES20.GlTexture2d, 0, mBitmap, 0);
If I put the following code in the same place, nothing is drawn. I want to draw the byte array using GLES20.GlTexImage2D(). The code that does not work is:
// mFrameBuffer is my byte array
ByteBuffer buffer = ByteBuffer.AllocateDirect(mFrameBuffer.Length);
buffer.Order (ByteOrder.NativeOrder());
buffer.Put (mFrameBuffer);
buffer.Position (0);
GLES20.GlTexImage2D (GLES20.GlTexture2d, 0, GLES20.GlRgb565, mWidth, mHeight, 0, GLES20.GlRgb565, GLES20.GlUnsignedByte, buffer);
How can I draw the image with GlTexImage2D from a direct byte array?
After buffer.Position(0) you should bind a texture to a texture unit:
int[] ids = new int[1];
GLES20.glGenTextures(1, ids, 0); // generate a valid texture name first
int id = ids[0];
GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, id);
After this, check your bitmap config; if it's not RGB565 you should use the matching color format in glTexImage2D, for example GLES20.GL_RGBA if the bitmap config is Bitmap.Config.ARGB_8888.
GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA, mWidth, mHeight, 0,
        GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, buffer);
Then set your screen as the framebuffer target:
GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
And after these steps, simply draw your texture.
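One thing worth adding here (my addition, a common gotcha rather than part of the original steps): a newly created texture defaults to mipmapped minification, and without mipmaps it is incomplete and samples as black in ES 2.0, so set the filters explicitly:
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D,
        GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D,
        GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);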
I found the solution by changing the format from GLES20.GlRgb565 to GLES20.GlRgba.
GLES20.GlRgb565 and GLES20.GlRgba4 are not valid values to pass to GLES20.GlTexImage2D(), as documented for glTexImage2D.
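Applied to the call from the question (and assuming the buffer really holds four bytes per pixel in RGBA order), the working call becomes:
GLES20.GlTexImage2D(GLES20.GlTexture2d, 0, GLES20.GlRgba, mWidth, mHeight, 0,
        GLES20.GlRgba, GLES20.GlUnsignedByte, buffer);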
I am developing an app in which I need to send bitmaps, as pixel data, from Android to iOS through sockets to preview the Android bitmaps. I send the pixel data from Android successfully, but when I convert the pixel data back to a bitmap image on iOS, the resulting image displays with a blue layer on it.
Following is my Android code.
int[] pixels = new int[scaledBitmap.getHeight() * scaledBitmap.getWidth()];
scaledBitmap.getPixels(pixels, 0, scaledBitmap.getWidth(), 0,
0, scaledBitmap.getWidth(), scaledBitmap.getHeight());
ByteBuffer byteBuffer = ByteBuffer.allocate(pixels.length * 4);
IntBuffer intBuffer = byteBuffer.asIntBuffer();
intBuffer.put(pixels);
byte[] imagedata = byteBuffer.array();
osRC.write(imagedata ); // osRC outputstream of destination IOS socket.
Following is the code on the iOS side that receives the buffer:
- (UIImage *)createUIImage:(char *)buffer :(int)width :(int)height
{
CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = nil;
context = CGBitmapContextCreate(buffer, width, height, 8, width * 4,rgbColorSpace,kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);//kCGBitmapByteOrder32Little
// create image from data in graphics context
CGImageRef newImage = CGBitmapContextCreateImage(context);
CGColorSpaceRelease(rgbColorSpace);
CGContextRelease(context);
UIImage* image = [UIImage imageWithCGImage:newImage];
CGImageRelease(newImage);
return image;
}
I also tried this with different color spaces on Android, and also changing the channel order between RGB and ARGB. After these changes I still have the same problem: the resulting image displays on iOS, but with some color layer over it, and most of the time this layer is blue.
I have also tried to convert the pixel data on Android with the following code.
ByteBuffer buffer = ByteBuffer.allocate(bytesPerRow * height);
scaledBitmap.copyPixelsToBuffer(buffer);
byte[] imagedata = buffer.array();
osRC.write(imagedata); // osRC is the output stream of the destination iOS socket
I hope this makes clear what I'm trying to do. I need to know how to get the pixel data from Android and send it to iOS so that iOS displays the same bitmap as Android.
Your help will be much appreciated. Thanks in advance.
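For what it's worth, the blue cast is the classic symptom of a channel-order mismatch, so here is a guess at the cause: Bitmap.getPixels() returns packed ARGB ints, and ByteBuffer is big-endian by default, so the bytes go over the wire as A,R,G,B per pixel, while the CGBitmapContext above (kCGBitmapByteOrder32Little with alpha first) reads B,G,R,A. A minimal sketch of reordering on the Android side, reusing pixels from the code above:
// repack the ARGB ints into the B,G,R,A byte order the iOS context expects
byte[] imagedata = new byte[pixels.length * 4];
for (int i = 0; i < pixels.length; i++) {
    int p = pixels[i];
    imagedata[i * 4]     = (byte) p;         // blue
    imagedata[i * 4 + 1] = (byte) (p >> 8);  // green
    imagedata[i * 4 + 2] = (byte) (p >> 16); // red
    imagedata[i * 4 + 3] = (byte) (p >> 24); // alpha
}
osRC.write(imagedata);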
I want to read monochrome image data from disk in a binary format (unsigned byte) and display it as an OpenGL ES 2 texture in Android. I am currently using Eclipse and the AVD emulator.
I am able to read the data from disk using an InputStream, and then convert the byte data to int to allow me to use the createBitmap method.
My hope was to create a monochrome bitmap by using ALPHA_8 as the bitmap format, but if I do that the texture appears as solid black when rendered. If I change the bitmap format to RGB_565 I can see parts of the image but of course the color is all scrambled because it is the wrong data format.
I have tried adding extra parameters to texImage2D() to try to force the texture format and source data type, but Eclipse shows an error if I use any of the opengl texture format codes in the texImage2D arguments.
I'm at a loss, can anyone tell me how to edit this to get a monochrome texture into OpenGL ES?
int w = 640;
int h = 512;
int nP = w * h; //no. of pixels
//load the binary data
byte[] byteArray = new byte[nP];
try {
InputStream fis = mContext.getResources()
.openRawResource(R.raw.testimage); //testimage is a binary file of U8 image data
fis.read(byteArray);
fis.close();
} catch(IOException e) {
// Ignore.
}
System.out.println(byteArray[1]);
//convert byte to int to work with createBitmap (is there a better way to do this?)
int[] intArray = new int[nP];
for (int i=0; i < nP; i++)
{
intArray[i] = byteArray[i];
}
//create bitmap from intArray and send to texture
Bitmap img = Bitmap.createBitmap(intArray, w, h, Bitmap.Config.ALPHA_8);
GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, img, 0);
img.recycle();
//as the code is, the image is black; if I change ALPHA_8 to RGB_565 then I see a corrupted image
Once you have loaded the bitmap into a byte array, you can also use glTexImage2D directly with your byte array. It would be something along these lines:
byte[] data = yourByteData; // bitmapLength bytes of 8-bit image data
ByteBuffer buffer = ByteBuffer.allocateDirect(bitmapLength);
buffer.put(data);
buffer.position(0);
GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_LUMINANCE,
        bitmapWidth, bitmapHeight, 0, GLES20.GL_LUMINANCE,
        GLES20.GL_UNSIGNED_BYTE, buffer);
This assigns each byte value to R, G and B alike, with alpha set to one.
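One caveat worth adding (my addition, not part of the original answer): GL unpacks texture rows with 4-byte alignment by default, so one-byte-per-pixel data whose width is not a multiple of 4 will shear unless the alignment is loosened first:
// tell GL that rows of this texture are tightly packed, 1-byte aligned
GLES20.glPixelStorei(GLES20.GL_UNPACK_ALIGNMENT, 1);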
According to the createBitmap docs, that int array is interpreted as an array of Color, which is "(alpha << 24) | (red << 16) | (green << 8) | blue". So, when you're loading those bytes and populating your int array, you're currently putting the data in the blue slot instead of the alpha slot. As such, your alpha values are all zero, which I'd actually expect to result in a clear texture. I believe you want
intArray[i] = byteArray[i] << 24;
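Dropped into the loop from the question, that is:
for (int i = 0; i < nP; i++) {
    intArray[i] = (byteArray[i] & 0xff) << 24; // gray byte into the alpha slot
}
(The & 0xff mask is not strictly needed, since the shift pushes the sign-extension bits out anyway, but it makes the intent explicit.)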
I have code to load an image from the SD card and post it to an ImageView.
Mat mRgba = Highgui.imread(dir);
Bitmap bmp = Bitmap.createBitmap(mRgba.cols(), mRgba.rows(),Bitmap.Config.ARGB_8888);
Utils.matToBitmap(mRgba, bmp);
mImage.setImageBitmap(bmp, true, null, 5.0f);
The image is loaded, but the colors are wrong. They look swapped (not exactly inverted).
Here is an image comparison.
I tried to load the image with
Bitmap bmp = BitmapFactory.decodeFile(dir);
It worked correctly, but I have to use Highgui.imread.
What's wrong with my code?
You will have to use something like this (Highgui.imread loads the channels in BGR order, while Android bitmaps expect RGB, hence the cvtColor step):
Mat inputImage = Highgui.imread(pathToFile);
Mat tmp = new Mat();
Imgproc.cvtColor(inputImage, tmp, Imgproc.COLOR_BGR2RGB);
Bitmap imageToShow = Bitmap.createBitmap(tmp.cols(), tmp.rows(), Bitmap.Config.ARGB_8888);
Utils.matToBitmap(tmp, imageToShow);
You're trying to load a bitmap supposing that the image is 8-bit/color RGBA: are you sure of that?
Also note that ARGB is not RGBA. You may need to re-arrange the bytes of each pixel. Something like
int pixel = get_the_pixel(); // packed ARGB from the bitmap
int alpha = pixel >>> 24;    // alpha is the high byte in ARGB
pixel = pixel << 8 | alpha;  // rotate to RGBA
set_the_pixel(pixel);
You'll want to do something more efficient than the accessor methods shown here, but you get the idea.
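The same rearrangement in bulk, as a sketch (assuming the ints came from Bitmap.getPixels(), i.e. packed ARGB):
// rotate every packed ARGB int to RGBA, in place
static void argbToRgba(int[] pixels) {
    for (int i = 0; i < pixels.length; i++) {
        int p = pixels[i];
        pixels[i] = (p << 8) | (p >>> 24);
    }
}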
I want to print a Bitmap on a mobile Bluetooth printer (Bixolon SPP-R200); the SDK doesn't offer direct methods to print an in-memory image. So I thought about converting a Bitmap like this:
Bitmap bitmap = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
to a monochrome bitmap. I am drawing black text on the Bitmap given above using a Canvas, which works well. However, when I convert the above Bitmap to a byte array, the printer seems to be unable to handle those bytes. I suspect I need an array with one bit per pixel (a pixel would be either white = 1 or black = 0).
As there seems to be no convenient, out-of-the-box way to do that, one idea I had was to use:
bitmap.getPixels(pixels, offset, stride, x, y, width, height)
to obtain the pixels. I assume I'd have to use it as follows:
int width = bitmap.getWidth();
int height = bitmap.getHeight();
int [] pixels = new int [width * height];
bitmap.getPixels(pixels, 0, width, 0, 0, width, height);
However - I am not sure about a few things:
In getPixels, does it make sense to simply pass the width as the "stride" argument?
I guess I'd have to evaluate the color information of each pixel and switch it to either black or white (and write this value into a new target byte array which I would ultimately pass to the printer)?
How do I best evaluate each pixel's color information in order to decide whether it should be black or white? (The rendered Bitmap is black paint on a white background.)
Does this approach make sense at all? Is there an easier way? It's not enough to just make the bitmap black & white, the main issue is to reduce the color information for each pixel into one bit.
UPDATE
As suggested by Reuben, I'll first convert the Bitmap to a monochrome bitmap and then iterate over each pixel:
int width = bitmap.getWidth();
int height = bitmap.getHeight();
int[] pixels = new int[width * height];
bitmap.getPixels(pixels, 0, width, 0, 0, width, height);
// Iterate over height
for (int y = 0; y < height; y++) {
    int offset = y * width; // start of this row in the pixels array
    // Iterate over width
    for (int x = 0; x < width; x++) {
        int pixel = bitmap.getPixel(x, y);
    }
}
Now Reuben suggested to "read the lowest byte of each 32-bit pixel", which relates to my question about how to evaluate the pixel color. My last question in this regard: do I get the lowest byte by simply doing this:
// Using the pixel from bitmap.getPixel(x,y)
int lowestByte = pixel & 0xff;
You can convert the image to monochrome 32bpp using a ColorMatrix.
Bitmap bmpMonochrome = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
Canvas canvas = new Canvas(bmpMonochrome);
ColorMatrix ma = new ColorMatrix();
ma.setSaturation(0); // zero saturation = grayscale
Paint paint = new Paint();
paint.setColorFilter(new ColorMatrixColorFilter(ma));
canvas.drawBitmap(bmpSrc, 0, 0, paint); // bmpSrc is the original color bitmap
That simplifies the color->monochrome conversion. Now you can just do a getPixels() and read the lowest byte of each 32-bit pixel. If it's <128 it's a 0, otherwise it's a 1.
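Packing those thresholded pixels into the one-bit-per-pixel array the printer expects could then look like this (a sketch; it assumes MSB-first bit order within each byte and no per-row padding, both of which depend on the printer protocol):
int width = bmpMonochrome.getWidth();
int height = bmpMonochrome.getHeight();
int[] pixels = new int[width * height];
bmpMonochrome.getPixels(pixels, 0, width, 0, 0, width, height);
byte[] packed = new byte[(width * height + 7) / 8];
for (int i = 0; i < pixels.length; i++) {
    int luminance = pixels[i] & 0xff; // lowest byte, as described above
    if (luminance >= 128) {
        packed[i / 8] |= 0x80 >> (i % 8); // white = 1, per the question
    }
}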
Well, I think it's quite late now to reply to this thread, but I was also working on this stuff some time back and decided to build my own library that will convert any jpg or png image to a 1bpp .bmp. Most printers that require 1bpp images will support it (tested on one of those :)).
Here you can find the library as well as a test project that uses it to make a monochrome single-channel image. Feel free to change it..:)
https://github.com/acdevs/1bpp-monochrome-android
Enjoy..!! :)
You should convert each pixel into HSV space and use the value to determine whether the pixel in the target image should be black or white:
Bitmap bwBitmap = Bitmap.createBitmap( bitmap.getWidth(), bitmap.getHeight(), Bitmap.Config.RGB_565 );
float[] hsv = new float[ 3 ];
for( int col = 0; col < bitmap.getWidth(); col++ ) {
for( int row = 0; row < bitmap.getHeight(); row++ ) {
Color.colorToHSV( bitmap.getPixel( col, row ), hsv );
if( hsv[ 2 ] > 0.5f ) {
bwBitmap.setPixel( col, row, 0xffffffff );
} else {
bwBitmap.setPixel( col, row, 0xff000000 );
}
}
}
return bwBitmap;
Converting to monochrome with exactly the same size as the original bitmap is not enough to print.
Printers can only print each dot as monochrome, because each spot of ink has only one color, so they must use many more dots than there are pixels and adjust their size and density to emulate grayscale. This technique is called halftoning. You can see that printers often have a resolution of at least 600 dpi, normally 1200-4800 dpi, while display screens often top out at 200-300 ppi.
So your monochrome bitmap should be at least 3 times the original resolution on each side.
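To make the halftoning idea concrete (my addition; a minimal error-diffusion sketch, not what any particular printer driver does), Floyd-Steinberg dithering spreads the rounding error of each black-or-white decision onto the not-yet-processed neighbors:
// gray: one luminance value (0..255) per pixel, row-major, width * height entries;
// after the call every entry is 0 (black) or 255 (white)
static void floydSteinberg(int[] gray, int width, int height) {
    for (int y = 0; y < height; y++) {
        for (int x = 0; x < width; x++) {
            int i = y * width + x;
            int oldVal = gray[i];
            int newVal = oldVal < 128 ? 0 : 255; // the black-or-white decision
            gray[i] = newVal;
            int err = oldVal - newVal; // diffuse the rounding error
            if (x + 1 < width) gray[i + 1] += err * 7 / 16;
            if (y + 1 < height) {
                if (x > 0) gray[i + width - 1] += err * 3 / 16;
                gray[i + width] += err * 5 / 16;
                if (x + 1 < width) gray[i + width + 1] += err * 1 / 16;
            }
        }
    }
}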