Drawing a byte array image to a GLSurfaceView using GLES20.GlTexImage2D() in Xamarin.Android

I have to draw a byte array image, which continuously arrives from a server, to a GLSurfaceView in a Xamarin.Android project. The code works when I draw via a bitmap, but not when I upload the buffer directly.
My code works if I convert the byte array to a bitmap and draw that:
mBitmap = { .. create bitmap from byte[].. }
GLUtils.TexImage2D(GLES20.GlTexture2d, 0, mBitmap, 0);
If I replace that call with the following, nothing is drawn. I want to upload the byte array directly with GLES20.GlTexImage2D(). My non-working code is:
// mFrameBuffer is my byte array
ByteBuffer buffer = ByteBuffer.AllocateDirect(mFrameBuffer.Length);
buffer.Order (ByteOrder.NativeOrder());
buffer.Put (mFrameBuffer);
buffer.Position (0);
GLES20.GlTexImage2D (GLES20.GlTexture2d, 0, GLES20.GlRgb565, mWidth, mHeight, 0, GLES20.GlRgb565, GLES20.GlUnsignedByte, buffer);
How can I draw the image with GlTexImage2D() from a raw byte array?

After buffer.Position(0) you should bind a texture to a texture unit. Note that a valid texture name has to be generated first; binding a made-up id such as -1 will not work:
int[] ids = new int[1];
GLES20.glGenTextures(1, ids, 0);
GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, ids[0]);
After this, check your bitmap config; if it's not RGB565 you should use the corresponding color format in glTexImage2D, for example GLES20.GL_RGBA if the bitmap config is Bitmap.Config.ARGB_8888.
GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA, mWidth, mHeight, 0,
        GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, buffer);
Then make the default framebuffer (the screen) the render target again:
GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
After these steps, simply draw your texture.
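Putting those steps together, here is a minimal setup sketch in Java-style GLES20 naming (mWidth, mHeight and buffer as in the question; the Xamarin bindings mirror these names in Gl… casing). The filter calls matter: without them the default minification filter expects mipmaps and the texture samples as black:
int[] tex = new int[1];
GLES20.glGenTextures(1, tex, 0);
GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, tex[0]);
// Non-mipmapped textures are "incomplete" under the default
// GL_NEAREST_MIPMAP_LINEAR min filter, so set a non-mipmap filter.
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
// Upload 4-byte RGBA pixels from the direct buffer prepared earlier.
GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA,
        mWidth, mHeight, 0, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, buffer);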

I found the solution: change the format from GLES20.GlRgb565 to GLES20.GlRgba.
GLES20.GlRgb565 and GLES20.GlRgba4 are not valid values to pass to GLES20.GlTexImage2D(), as the glTexImage2D reference confirms.
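If the server actually sends packed 16-bit 565 pixels rather than 4-byte RGBA, the valid ES 2.0 combination keeps the data but names the format differently (a sketch in Java GLES20 naming; the corresponding Xamarin constants should be GlRgb and GlUnsignedShort565):
// GL_RGB565 is a renderbuffer format; glTexImage2D wants the
// unsized GL_RGB format with the packed 5-6-5 type instead.
GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGB,
        mWidth, mHeight, 0, GLES20.GL_RGB, GLES20.GL_UNSIGNED_SHORT_5_6_5, buffer);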

Related

How do I pass a Texture2D into Android and save it as a Bitmap?

I have a Unity scene where I want to pass a Texture2D into Android, do some processing, then save it as a bitmap. My code isn't working. For simplicity's sake I removed the processing part, and I'm just trying to save the image as a bitmap.
On the Unity side there's some initialization for the Android package and this line:
_pluginInterface.CallStatic("ProcessImage", testTexture.GetNativeTexturePtr().ToInt32(), testTexture.width, testTexture.height);
On the Java side:
public static void ProcessImage(int ptr, int width, int height) {
    Bitmap b = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888); // just so I get the right length
    int byteCount = b.getByteCount();
    ByteBuffer inputBuffer = ByteBuffer.allocate(byteCount);
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, ptr);
    GLES20.glReadPixels(0, 0, width, height, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, inputBuffer);
    Bitmap bmp = Bitmap.createBitmap(width, height, Bitmap.Config.RGB_565);
    inputBuffer.rewind();
    bmp.copyPixelsFromBuffer(inputBuffer);
    SaveToFileDebug(bmp); // basic function that saves the bitmap
}
As you can probably tell, I'm all over the place with the encodings, and I think that might be the problem; they don't seem to match between Unity and Android.
Thanks in advance.
Instead of such acrobatic coding, just use GetRawTextureData and use the returned byte array to create your bitmap.
Example:
byte[] raw = testTexture.GetRawTextureData();
_pluginInterface.CallStatic("YourMethod", raw, testTexture.width, testTexture.height);
On the Java side, note that BitmapFactory.decodeByteArray only decodes compressed formats such as PNG or JPEG, so it returns null for raw pixel data; build the bitmap with copyPixelsFromBuffer instead, as sketched below.
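A minimal sketch of that Java side, assuming the Unity texture uses TextureFormat.RGBA32 (an assumption; other Unity texture formats lay the bytes out differently):
import java.nio.ByteBuffer;
import android.graphics.Bitmap;

public static Bitmap bitmapFromRawRgba(byte[] raw, int width, int height) {
    // ARGB_8888 bitmaps store pixels as R,G,B,A bytes in memory, which
    // matches Unity's RGBA32 raw layout, so the bytes copy over as-is.
    Bitmap bmp = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
    bmp.copyPixelsFromBuffer(ByteBuffer.wrap(raw));
    return bmp;
}
Unity raw texture data is stored bottom-up, so the resulting bitmap may come out vertically flipped; flip it with a Matrix-based createBitmap call if that matters.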

OpenGL - Convert Texture to Array

In my Android app, I have a frame buffer object that gives me the rendered scene as a texture.
The app is an origami game where the user can fold a paper freely: on every fold, the currently rendered scene is saved to a texture using the FBO, and I then redraw the paper with new coordinates and that texture attached, so it looks like folded paper. This way the user can fold the paper as many times as he wants.
On every frame I want to check the rendered scene to determine whether the user has reached the final shape (assume I have the final shape as a 2D array filled with 0s and 1s: 0 for transparent and 1 for colored pixels).
What I want is to somehow convert this texture to a 2D array filled with 0s and 1s, 0 for a transparent pixel and 1 for a colored pixel of the texture. I need this so I can compare the result with the previously known 2D array and determine whether the texture is the shape I want.
Is it possible to save the texture data to an array? I can't use glReadPixels directly, because it is too heavy to call every frame.
Here is the FBO class (I want to get renderTex[0] as an array):
public class FBO {
    int[] fb, renderTex;
    int texW;
    int texH;

    public FBO(int width, int height) {
        texW = width;
        texH = height;
        fb = new int[1];
        renderTex = new int[1];
    }

    public void setup(GL10 gl) {
        // generate the framebuffer
        ((GL11ExtensionPack) gl).glGenFramebuffersOES(1, fb, 0);
        gl.glEnable(GL10.GL_TEXTURE_2D);
        gl.glGenTextures(1, renderTex, 0); // generate texture
        gl.glBindTexture(GL10.GL_TEXTURE_2D, renderTex[0]);
        gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MIN_FILTER, GL10.GL_NEAREST);
        gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MAG_FILTER, GL10.GL_NEAREST);
        gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_WRAP_S, GL10.GL_CLAMP_TO_EDGE);
        gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_WRAP_T, GL10.GL_CLAMP_TO_EDGE);
        //texBuffer = ByteBuffer.allocateDirect(buf.length*4).order(ByteOrder.nativeOrder()).asIntBuffer();
        //gl.glTexEnvf(GL10.GL_TEXTURE_ENV, GL10.GL_TEXTURE_ENV_MODE, GL10.GL_MODULATE);
        gl.glTexImage2D(GL10.GL_TEXTURE_2D, 0, GL10.GL_RGBA, texW, texH, 0, GL10.GL_RGBA, GL10.GL_UNSIGNED_BYTE, null);
        gl.glDisable(GL10.GL_TEXTURE_2D);
    }

    public boolean RenderStart(GL10 gl) {
        Log.d("TextureAndFBO", "" + renderTex[0] + " And " + fb[0]);
        // Bind the framebuffer
        ((GL11ExtensionPack) gl).glBindFramebufferOES(GL11ExtensionPack.GL_FRAMEBUFFER_OES, fb[0]);
        // specify texture as color attachment
        ((GL11ExtensionPack) gl).glFramebufferTexture2DOES(GL11ExtensionPack.GL_FRAMEBUFFER_OES, GL11ExtensionPack.GL_COLOR_ATTACHMENT0_OES, GL10.GL_TEXTURE_2D, renderTex[0], 0);
        int error = gl.glGetError();
        if (error != GL10.GL_NO_ERROR) {
            Log.d("err", "FIRST Background Load GLError: " + error + " ");
        }
        int status = ((GL11ExtensionPack) gl).glCheckFramebufferStatusOES(GL11ExtensionPack.GL_FRAMEBUFFER_OES);
        if (status != GL11ExtensionPack.GL_FRAMEBUFFER_COMPLETE_OES) {
            Log.d("err", "SECOND Background Load GLError: " + status + " ");
            return true;
        }
        gl.glClear(GL10.GL_COLOR_BUFFER_BIT);
        return true;
    }

    public void RenderEnd(GL10 gl) {
        ((GL11ExtensionPack) gl).glBindFramebufferOES(GL11ExtensionPack.GL_FRAMEBUFFER_OES, 0);
        gl.glClear(GL10.GL_COLOR_BUFFER_BIT | GL10.GL_DEPTH_BUFFER_BIT);
        gl.glEnable(GL10.GL_TEXTURE_2D);
        gl.glBindTexture(GL10.GL_TEXTURE_2D, 0);
        gl.glColor4f(1.0f, 1.0f, 1.0f, 1.0f);
        gl.glDisable(GL10.GL_TEXTURE_2D);
    }

    public int getTexture() {
        return renderTex[0];
    }

    public int getFBO() {
        return fb[0];
    }
}
If you are using OpenGL ES 3.0 or later, a PBO would be a good solution. But I think you can also use EGLImage, because that only needs OpenGL ES 1.1 or 2.0.
The function to create an EGLImageKHR is:
EGLImageKHR eglCreateImageKHR(EGLDisplay dpy,
EGLContext ctx,
EGLenum target,
EGLClientBuffer buffer,
const EGLint *attrib_list)
To allocate an ANativeWindowBuffer, Android has a simple wrapper called GraphicBuffer:
GraphicBuffer *window = new GraphicBuffer(width, height, PIXEL_FORMAT_RGBA_8888,
        GraphicBuffer::USAGE_SW_READ_OFTEN | GraphicBuffer::USAGE_HW_TEXTURE);
struct ANativeWindowBuffer *buffer = window->getNativeBuffer();
EGLImageKHR image = eglCreateImageKHR(eglGetCurrentDisplay(), EGL_NO_CONTEXT,
        EGL_NATIVE_BUFFER_ANDROID, (EGLClientBuffer)buffer, attribs);
To read pixels from an FBO, use one of the two methods below:
void EGLImageTargetTexture2DOES(enum target, eglImageOES image)
void EGLImageTargetRenderbufferStorageOES(enum target, eglImageOES image)
These two methods will establish all the properties of the target GL_TEXTURE_2D or GL_RENDERBUFFER:
uint8_t *ptr;
glBindTexture(GL_TEXTURE_2D, texture_id);
glEGLImageTargetTexture2DOES(GL_TEXTURE_2D, image);
window->lock(GraphicBuffer::USAGE_SW_READ_OFTEN, &ptr);
memcpy(pixels, ptr, width * height * 4);
window->unlock();
To accomplish what you want, you need to use a PBO (Pixel Buffer Object): you can map it and read it as if it were a regular array.
OpenGL ARB_pixel_buffer_object extension is very close to
ARB_vertex_buffer_object. It simply expands ARB_vertex_buffer_object
extension in order to store not only vertex data but also pixel data
into the buffer objects. This buffer object storing pixel data is
called Pixel Buffer Object (PBO). ARB_pixel_buffer_object extension
borrows all VBO framework and APIs, plus, adds 2 additional "target"
tokens. These tokens assist the PBO memory manger (OpenGL driver) to
determine the best location of the buffer object; system memory,
shared memory or video memory. Also, the target tokens clearly specify
that the bound PBO will be used in one of 2 different operations;
GL_PIXEL_PACK_BUFFER_ARB to transfer pixel data to a PBO, or
GL_PIXEL_UNPACK_BUFFER_ARB to transfer pixel data from PBO.
It can be created similarly to other buffer objects:
glGenBuffers(1, &pbo);
glBindBuffer(GL_PIXEL_PACK_BUFFER, pbo);
glBufferData(GL_PIXEL_PACK_BUFFER, size, 0, GL_DYNAMIC_READ);
Then you can read from an FBO (or a texture) easily:
glReadBuffer(GL_COLOR_ATTACHMENT0);
glBindBuffer(GL_PIXEL_PACK_BUFFER, pbo);
glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, 0);
GLubyte *array = (GLubyte*)glMapBufferRange(GL_PIXEL_PACK_BUFFER, 0, size, GL_MAP_READ_BIT);
// TODO: Do your checking of the shape inside of this 'array' pointer or copy it somewhere using memcpy()
glUnmapBuffer(GL_PIXEL_PACK_BUFFER);
glBindBuffer(GL_PIXEL_PACK_BUFFER, 0);
Here GL_COLOR_ATTACHMENT0 is used as input; see the specification of glReadBuffer for details on how to select the front or back buffer.
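For completeness, the same mapping step through the Java GLES30 bindings, followed by the loop that builds the 0/1 array the question asks for. This is a sketch: it assumes the PBO has already been filled by glReadPixels as above, and that size, width and height match that setup:
ByteBuffer mapped = (ByteBuffer) GLES30.glMapBufferRange(
        GLES30.GL_PIXEL_PACK_BUFFER, 0, size, GLES30.GL_MAP_READ_BIT);
int[][] grid = new int[height][width];
for (int y = 0; y < height; y++) {
    for (int x = 0; x < width; x++) {
        // The alpha byte is the 4th of each RGBA pixel; non-zero means "colored".
        grid[y][x] = mapped.get((y * width + x) * 4 + 3) != 0 ? 1 : 0;
    }
}
GLES30.glUnmapBuffer(GLES30.GL_PIXEL_PACK_BUFFER);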

How do I load 8-bit binary image data into an OpenGL ES 2 texture on Android

I want to read monochrome image data from disk in a binary format (unsigned byte) and display it as an OpenGL ES 2 texture in Android. I am currently using Eclipse and the AVD emulator.
I am able to read the data from disk using an InputStream, and then convert the byte data to int to allow me to use the createBitmap method.
My hope was to create a monochrome bitmap by using ALPHA_8 as the bitmap format, but if I do that the texture appears as solid black when rendered. If I change the bitmap format to RGB_565 I can see parts of the image but of course the color is all scrambled because it is the wrong data format.
I have tried adding extra parameters to texImage2D() to try to force the texture format and source data type, but Eclipse shows an error if I use any of the OpenGL texture format constants in the texImage2D arguments.
I'm at a loss, can anyone tell me how to edit this to get a monochrome texture into OpenGL ES?
int w = 640;
int h = 512;
int nP = w * h; // no. of pixels

// load the binary data
byte[] byteArray = new byte[nP];
try {
    InputStream fis = mContext.getResources()
            .openRawResource(R.raw.testimage); // testimage is a binary file of U8 image data
    fis.read(byteArray);
    fis.close();
} catch (IOException e) {
    // Ignore.
}
System.out.println(byteArray[1]);

// convert byte to int to work with createBitmap (is there a better way to do this?)
int[] intArray = new int[nP];
for (int i = 0; i < nP; i++) {
    intArray[i] = byteArray[i];
}

// create bitmap from intArray and send to texture
Bitmap img = Bitmap.createBitmap(intArray, w, h, Bitmap.Config.ALPHA_8);
GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, img, 0);
img.recycle();
// as the code is, the image is black; if I change ALPHA_8 to RGB_565 then I see a corrupted image
Once you have the image data in a byte array, you can also pass it to glTexImage2D directly, without going through a Bitmap. It would be something along these lines:
byte[] data = yourByteData; // length = bitmapWidth * bitmapHeight
ByteBuffer buffer = ByteBuffer.allocateDirect(data.length);
buffer.put(data);
buffer.position(0);
GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_LUMINANCE,
        bitmapWidth, bitmapHeight, 0, GLES20.GL_LUMINANCE,
        GLES20.GL_UNSIGNED_BYTE, buffer);
This assigns each byte value to R, G and B (the same value for each) and sets alpha to one.
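One caveat worth adding: glTexImage2D assumes 4-byte row alignment by default, so a single-channel image whose width is not a multiple of 4 uploads skewed unless the unpack alignment is relaxed first (one extra line before the upload above; 640 happens to be safe, but it is a classic gotcha):
// Rows of 1-byte-per-pixel data are tightly packed; tell GL so.
GLES20.glPixelStorei(GLES20.GL_UNPACK_ALIGNMENT, 1);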
According to the createBitmap docs, that int array is interpreted as an array of Color, which is "(alpha << 24) | (red << 16) | (green << 8) | blue". So, when you're loading those bytes and populating your int array, you're currently putting the data in the blue slot instead of the alpha slot. As such, your alpha values are all zero, which I'd actually expect to result in a clear texture. I believe you want
intArray[i] = byteArray[i] << 24;
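So the question's conversion loop becomes (a sketch; only the placement of the byte changes, and the left shift discards the sign-extension bits of the promoted byte, so no mask is needed):
for (int i = 0; i < nP; i++) {
    // Move the grayscale byte into the alpha slot of the packed Color int.
    intArray[i] = byteArray[i] << 24;
}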

int array to opengl texture in android

I'm trying to add some effects to the camera preview in Android. I found some things on the internet, but I got stuck when creating the texture. I use the function decodeYUV420SP(), which returns an int[width*height] RGB array with a hex color value in each array position.
Now I want to create an OpenGL texture from this array, but I don't know how. I can convert each hex value to its separate R, G, B components and pass them to OpenGL, but it doesn't work.
I do something like this:
mNewTexture = new int[width * height * 4];
for (int i = 0; i < mRGB.length; i = i + 4) {
    mNewTexture[i]     = getR(mRGB[i]); // R
    mNewTexture[i + 1] = getG(mRGB[i]); // G
    mNewTexture[i + 2] = getB(mRGB[i]); // B
    mNewTexture[i + 3] = getA(mRGB[i]); // A
}
This converts each hex value to RGBA components (0 to 255). Then I do this to create the OpenGL texture:
gl.glBindTexture(GL10.GL_TEXTURE_2D, tex);
gl.glTexImage2D(GL10.GL_TEXTURE_2D, 0, GL10.GL_RGBA, 1024, 512, 0, GL10.GL_RGBA, GL10.GL_FLOAT, FloatBuffer.wrap(mNewTexture));
gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MIN_FILTER, GL10.GL_LINEAR);
However something is wrong, because it doesn't work.
Any idea?
Why do you try to wrap your int array in a FloatBuffer? Most of your conversions are unnecessary.
Just take your original array, wrap it in an IntBuffer (ByteBuffer.wrap only accepts a byte[]), and pass it to glTexImage2D with the type GL_UNSIGNED_BYTE. There's no need to create a new array from what you already have.
gl.glTexImage2D(GL10.GL_TEXTURE_2D, 0, GL10.GL_RGBA, 1024, 512, 0, GL10.GL_RGBA, GL10.GL_UNSIGNED_BYTE, IntBuffer.wrap(mRGB));
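If the upload then works but the colors look swapped, that is likely because decodeYUV420SP packs ARGB ints while GL reads the buffer as R,G,B,A bytes (on a little-endian device the int bytes come out as B,G,R,A). A repacking sketch under that assumption, slotting in before the glTexImage2D call:
// Repack ARGB ints into an RGBA-ordered direct byte buffer for glTexImage2D.
ByteBuffer texBuffer = ByteBuffer.allocateDirect(mRGB.length * 4);
for (int argb : mRGB) {
    texBuffer.put((byte) ((argb >> 16) & 0xff));  // R
    texBuffer.put((byte) ((argb >> 8) & 0xff));   // G
    texBuffer.put((byte) (argb & 0xff));          // B
    texBuffer.put((byte) ((argb >>> 24) & 0xff)); // A
}
texBuffer.position(0);
gl.glTexImage2D(GL10.GL_TEXTURE_2D, 0, GL10.GL_RGBA, 1024, 512, 0,
        GL10.GL_RGBA, GL10.GL_UNSIGNED_BYTE, texBuffer);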

How to get a Bitmap from a raw image

I am reading a raw image from the network. This image has been read by an image sensor, not from a file.
These are the things I know about the image:
~ Height & Width
~ Total size (in bytes)
~ 8-bit grayscale
~ 1 byte/pixel
I'm trying to convert this image to a bitmap to display in an imageview.
Here's what I tried:
BitmapFactory.Options opt = new BitmapFactory.Options();
opt.outHeight = shortHeight; //360
opt.outWidth = shortWidth;//248
imageBitmap = BitmapFactory.decodeByteArray(imageArray, 0, imageSize, opt);
decodeByteArray returns null, since it cannot decode my image.
I also tried reading it directly from the input stream, without converting it to a Byte Array first:
imageBitmap = BitmapFactory.decodeStream(imageInputStream, null, opt);
This returns null as well.
I've searched on this & other forums, but cannot find a way to achieve this.
Any ideas?
EDIT: I should add that the first thing I did was to check that the stream actually contains the raw image. I did this using other applications (iPhone/Windows MFC) and they are able to read and display the image correctly. I just need to figure out a way to do this in Java/Android.
Android does not support grayscale bitmaps. So, first thing, you have to extend every byte to a 32-bit ARGB int: alpha is 0xff, and R, G and B are copies of the source pixel's byte value. Then create the bitmap on top of that array.
Also (see comments), it seems that the device treats 0 as white and 1 as black, so we have to invert the source bits.
So, let's assume the source image is in a byte array called src. Here's the code:
byte[] src; // comes from somewhere...
byte[] bits = new byte[src.length * 4]; // that's where the RGBA array goes
for (int i = 0; i < src.length; i++) {
    bits[i * 4] =
    bits[i * 4 + 1] =
    bits[i * 4 + 2] = (byte) ~src[i]; // invert the source bits (the cast is needed: ~ yields an int)
    bits[i * 4 + 3] = (byte) 0xff;    // the alpha
}
// Now put these nice RGBA pixels into a Bitmap object
Bitmap bm = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
bm.copyPixelsFromBuffer(ByteBuffer.wrap(bits));
Once I did something like this to decode the byte stream obtained from the camera preview callback (note that this createBitmap overload expects an int[] of packed ARGB colors, so the preview bytes have to be converted first):
Bitmap.createBitmap(imageBytes, previewWidth, previewHeight, Bitmap.Config.ARGB_8888);
Give it a try.
for (int i = 0; i < src.length; i++) {
    bits[i * 4] = bits[i * 4 + 1] = bits[i * 4 + 2] = (byte) ~src[i]; // invert the source bits
    bits[i * 4 + 3] = (byte) 0xff; // the alpha
}
The conversion loop above can take a lot of time: converting a 640x800 8-bit image to RGBA can take more than 500 ms. A quicker solution is to use the ALPHA_8 format for the bitmap and apply a color filter:
// set up a color filter to invert the alpha channel (needed in my case)
float[] mx = new float[] {
        1.0f, 0, 0, 0, 0,    // red
        0, 1.0f, 0, 0, 0,    // green
        0, 0, 1.0f, 0, 0,    // blue
        0, 0, 0, -1.0f, 255  // alpha
};
ColorMatrixColorFilter cf = new ColorMatrixColorFilter(mx);
imageView.setColorFilter(cf);
// then set only the alpha channel of the image; this is much faster without the conversion step
Bitmap bm = Bitmap.createBitmap(width, height, Bitmap.Config.ALPHA_8);
bm.copyPixelsFromBuffer(ByteBuffer.wrap(src)); // src is not modified; it's just the 8-bit grayscale array
imageView.setImageBitmap(bm);
Use Drawable.createFromStream. Here's how to do it with an HttpResponse, but you can obtain the InputStream any way you want. Note this only works if the stream contains a compressed image format that BitmapFactory understands (PNG, JPEG, etc.), not raw pixel data:
InputStream stream = response.getEntity().getContent();
Drawable drawable = Drawable.createFromStream(stream, "Get Full Image Task");
