GL_LUMINANCE with byte array on OpenGL ES 2.0 - Android

I'm rendering YUV data decoded by libvpx (the WebM decoding library) with an OpenGL ES 2.0 shader on Android.
Both snippets below upload the same byte array, but the second one is not drawn correctly.
Success:
// e.g. unsigned char *p = yuv.y, yuv.u or yuv.v;
for (int dy = 0; dy < hh; dy++) {
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, dy, ww, 1, GL_LUMINANCE, GL_UNSIGNED_BYTE, p);
    p += ww;
}
Fail:
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, ww, hh, GL_LUMINANCE, GL_UNSIGNED_BYTE, p);
I'm not very knowledgeable about OpenGL, so I don't understand why this happens.
I also suspect that calling glTexSubImage2D once per line hurts performance. Can this be improved?

My best guess is that the data you are passing to glTexSubImage2D is not correctly aligned.
From the glTexSubImage2D Reference page for OpenGL ES 2.0:
Storage parameter GL_UNPACK_ALIGNMENT, set by glPixelStorei, affects the way that data is read out of client memory. See glPixelStorei for a description.
Passing a single line at a time hides the problem: you advance the pointer by exactly ww bytes yourself, so the row padding that GL would otherwise assume between rows never comes into play, and each call succeeds.
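If that is the cause, the usual fix is to lower the unpack alignment before uploading. A minimal sketch, assuming ww is the exact row stride of each plane:

// GL_UNPACK_ALIGNMENT defaults to 4, but Y/U/V planes are tightly packed
// at 1 byte per texel, so rows whose width is not a multiple of 4 are
// misread. Dropping the alignment to 1 lets the whole plane upload at once.
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, ww, hh, GL_LUMINANCE, GL_UNSIGNED_BYTE, p);

This also removes the per-line call overhead, since the whole plane now goes up in a single call.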

How to get 24-bit color information by glReadPixels() on Android?

To save an image processed by OpenGL ES, I wrote the following code, and it works well.
ByteBuffer bb = ByteBuffer.allocate(mWidth * mHeight * 4);
mGL.glReadPixels(0, 0, mWidth, mHeight, GL_RGBA, GL_UNSIGNED_BYTE, bb);
try {
    TJCompressor tjCompressor = new TJCompressor(bb.array(), 0, 0, mWidth, 0, mHeight, TJ.PF_RGB);
    tjCompressor.setJPEGQuality(100);
    tjCompressor.setSubsamp(TJ.SAMP_444);
    return tjCompressor.compress(0);
} catch (Exception e) {
    e.printStackTrace();
}
After that, to get 24-bit color information without the alpha channel (to save memory and processing time), I changed the first two lines as follows.
ByteBuffer bb = ByteBuffer.allocate(mWidth * mHeight * 3);
mGL.glReadPixels(0, 0, mWidth, mHeight, GL_RGB, GL_UNSIGNED_BYTE, bb);
Additionally, I removed EGL_ALPHA_SIZE from the EGLConfig of mGL (a GL10 instance), and I passed GLES20.GL_RGB as the internal format when calling GLUtils.texImage2D().
However, something is wrong: the resulting image is entirely black, and when I checked the bb buffer after the glReadPixels() call, all the data was zero. I need advice. Help, please.
In core GLES2, the only valid format/type combos for glReadPixels are:
- GL_RGBA / GL_UNSIGNED_BYTE
- an optional framebuffer-specific format/type pair, queried via glGetIntegerv with GL_IMPLEMENTATION_COLOR_READ_FORMAT and GL_IMPLEMENTATION_COLOR_READ_TYPE respectively
In GLES2 without extensions, if GL_IMPLEMENTATION_COLOR_READ_FORMAT/GL_IMPLEMENTATION_COLOR_READ_TYPE don't yield something useful, you're stuck with GL_RGBA/GL_UNSIGNED_BYTE, unfortunately.
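A minimal sketch of that query, written here against the C headers (in Java the same calls go through GLES20.glGetIntegerv with an int[] out-parameter); width, height, and pixels are placeholders:

GLint readFormat = 0, readType = 0;
glGetIntegerv(GL_IMPLEMENTATION_COLOR_READ_FORMAT, &readFormat);
glGetIntegerv(GL_IMPLEMENTATION_COLOR_READ_TYPE, &readType);
// A 3-byte-per-pixel readback is only legal if the driver reports
// GL_RGB / GL_UNSIGNED_BYTE here; otherwise use the always-valid RGBA path.
if (readFormat == GL_RGB && readType == GL_UNSIGNED_BYTE) {
    glReadPixels(0, 0, width, height, GL_RGB, GL_UNSIGNED_BYTE, pixels);
} else {
    glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, pixels);
}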
With GLES3, you can glReadPixels into a buffer bound to GL_PIXEL_PACK_BUFFER and then glMapBufferRange it, though again you're limited by format.
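A sketch of that GLES3 path, assuming pbo is a buffer object already allocated with at least width * height * 4 bytes of storage:

// GLES3 readback through a pixel pack buffer (PBO).
glBindBuffer(GL_PIXEL_PACK_BUFFER, pbo);
glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, 0); // 0 = byte offset into the PBO
void *data = glMapBufferRange(GL_PIXEL_PACK_BUFFER, 0, width * height * 4, GL_MAP_READ_BIT);
// ... copy or inspect the pixels ...
glUnmapBuffer(GL_PIXEL_PACK_BUFFER);
glBindBuffer(GL_PIXEL_PACK_BUFFER, 0);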
I'll note that drivers are prone to emulating tightly packed 24-bit rgb8 formats, natively implementing only better-aligned formats like rgba8, rgb565, and rgba4. A renderable format exposed as "rgb8" is potentially just rgbx8 behind the scenes.
It's highly driver dependent, but if you don't care about keeping the contents of the framebuffer around, you might be able to win back some memory bandwidth with EXT_discard_framebuffer (or glInvalidateFramebuffer in GLES3).
You might also look into EGL_KHR_lock_surface3.

Treating an image from glReadPixels with OpenCV and returning it as a texture

I am trying to start with some basic operations with OpenCV and GLES20 on Android using C++.
I use CameraGLSurfaceView and its onCameraTexture(...) callback, whose calls I forward to my native library.
The calls flow correctly: I can read the framebuffer into a vector and pass it straight to a texture unchanged, and it works as expected.
But when I try to work with the pixels, the image comes out broken.
My C++ code:
cv::Mat in(w, h, CV_8UC4);
cv::Mat out(w, h, CV_8UC4);
glReadPixels(0, 0, w, h, GL_RGBA, GL_UNSIGNED_BYTE, in.data);
// following operations break image >>
cv::cvtColor(in, out, CV_RGBA2BGRA);
cv::flip(out, in, 0);
cv::cvtColor(in, out, CV_BGRA2RGBA);
// << prev operations break image
glBindTexture(GL_TEXTURE_2D, (GLuint) tex2);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0, GL_RGBA, GL_UNSIGNED_BYTE, out.ptr());
glBindTexture(GL_TEXTURE_2D, 0);
in.release();
out.release();
Without the marked operations, the picture goes into the texture and is displayed fine.
I understand that my mistake is in how I convert formats between OpenGL and OpenCV.
How to convert formats properly?
It was my mistake with the Mat sizes:
cv::Mat in(w,h,CV_8UC4) should be cv::Mat in(h,w,CV_8UC4)
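For context: the cv::Mat constructor takes (rows, cols), i.e. (height, width). A corrected sketch using the names from the question:

// cv::Mat(rows, cols, type): rows = image height, cols = image width.
cv::Mat in(h, w, CV_8UC4);
cv::Mat out(h, w, CV_8UC4);
glReadPixels(0, 0, w, h, GL_RGBA, GL_UNSIGNED_BYTE, in.data);
cv::cvtColor(in, out, CV_RGBA2BGRA); // now operates on correctly shaped data

With the dimensions swapped, the cvtColor/flip chain was walking the pixel rows with the wrong stride, which would explain the broken image.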

OpenGL ES 2.0 glDrawElements index pointer error

I'm having trouble texturing a cube with a different texture per face. I can draw the cube with one texture on all the faces, but when I try to use multiple textures it fails. The way I'm trying to do it is like so:
// my index array, located in a header file
#define NUM_IMAGE_OBJECT_INDEX 36
static const unsigned short cubeIndices[NUM_IMAGE_OBJECT_INDEX] =
{
     0, 1, 2,  2, 3, 0,   // front
     4, 5, 6,  6, 7, 4,   // right
     8, 9,10, 10,11, 8,   // top
    12,13,14, 14,15,12,   // left
    16,17,18, 18,19,16,   // bottom
    20,21,22, 22,23,20    // back
};
Now, in my rendering function, this works for drawing the cube with a single texture:
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, iconTextureID);
glDrawElements(GL_TRIANGLES, NUM_IMAGE_OBJECT_INDEX, GL_UNSIGNED_SHORT, 0);
This does not work:
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, iconTextureID);
glDrawElements(GL_TRIANGLES, NUM_IMAGE_OBJECT_INDEX, GL_UNSIGNED_SHORT, (const GLvoid*)&cubeIndices[0]);
which, judging by some other examples, should amount to the same thing. Ultimately I would like to do something like this:
for (int i = 0; i < 6; i++) {
    iconTextureID = textureID[i];
    glBindTexture(GL_TEXTURE_2D, iconTextureID);
    glDrawElements(GL_TRIANGLES, 6, GL_UNSIGNED_SHORT, (const GLvoid*)&cubeIndices[i*6]); // indices 0-5 use texture 1, 6-11 use texture 2, etc.
}
Does anyone know what could be wrong with this indexing? I've basically copy-pasted this code from an Android project (which works); now I'm trying to do the same on iOS.
In OpenGL ES 2.0, index data can come from either buffer objects or pointers to client memory. Your code is evidently using a buffer object, even though you don't show where you create it, upload your client index array into it, or bind it with glBindBuffer(GL_ELEMENT_ARRAY_BUFFER) before rendering; it must be there, or your code would have crashed. When a buffer is bound to GL_ELEMENT_ARRAY_BUFFER, OpenGL expects the "pointer" given to glDrawElements to be a byte offset into the buffer object, not a client-memory pointer.
This is why copy-and-paste coding is a bad idea. Where you copied from was probably using client memory; you are not.
If you want your looping code to work, you need to do the pointer arithmetic yourself:
for (int i = 0; i < 6; i++)
{
    iconTextureID = textureID[i];
    glBindTexture(GL_TEXTURE_2D, iconTextureID);
    glDrawElements(GL_TRIANGLES, 6, GL_UNSIGNED_SHORT, reinterpret_cast<void*>(i * 6 * sizeof(GLushort)));
}
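Alternatively, if you really do want to pass client-memory pointers the way the copied Android code presumably did, unbind the element array buffer first. A sketch:

// With no buffer bound to GL_ELEMENT_ARRAY_BUFFER, the final argument of
// glDrawElements is once again treated as a client-memory pointer.
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, 0);
for (int i = 0; i < 6; i++) {
    glBindTexture(GL_TEXTURE_2D, textureID[i]);
    glDrawElements(GL_TRIANGLES, 6, GL_UNSIGNED_SHORT, &cubeIndices[i * 6]);
}

The offset-based version is generally preferable, since it keeps the index data on the GPU.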

Can't draw loaded models in OpenGL ES 1.x with C++

I load OBJ models and try to render them with OpenGL ES using the Android NDK:
class ObjModel {
public:
    ObjModel();
    ~ObjModel();
    int numVertex, numNormal, numTexCoord, numTriange;
    float *vertexArray;
    float *normalArray;
    float *texCoordArray;
    unsigned short *indexArray;
    void loadModel(string fileName);
};
model->loadModel(filename);
glVertexPointer(3, GL_FLOAT, 0, &(model->vertexArray[0]));
glNormalPointer(GL_FLOAT, 0, &(model->normalArray[0]));
glDrawElements(GL_TRIANGLES, model->numTriange, GL_UNSIGNED_SHORT, &(model->indexArray[0]));
The model is not rendered fully; I see only part of it.
I checked the data in the arrays and it is parsed properly, so I think the only issue might be how I'm passing the arguments. Am I doing it right?
Hope this helps! I think you are just missing a factor of 3: glDrawElements takes the number of indices to draw, not the number of triangles, and each triangle uses three indices.
glDrawElements(GL_TRIANGLES, 3 * model->numTriange, GL_UNSIGNED_SHORT, &(model->indexArray[0]));

Android OpenGL ES 2.0 -- glReadPixels() and glTexImage2D() drawing a black texture?

I'm working on some Android code for caching and redrawing a framebuffer object's color buffer between the loss and recreation of EGL contexts. Development is primarily happening on a Xoom tablet running Honeycomb. Anyway, what I'm trying to do is store the result of calling glReadPixels() on the FBO in a direct ByteBuffer, then use that buffer with glTexImage2D() and draw it back into the (now cleared) framebuffer. All of this seems to work fine — the ByteBuffer contains the right values ([-1, 0, 0, -1] etc. for a pixel, according to Java's inability to understand unsigned bytes), no GlErrors seem to be thrown, and the quad is drawn to the right part of the screen (currently the top-left quarter of the framebuffer for testing purposes).
However, no matter what I try, glTexImage2D() always outputs a plain black texture. I've had some issues with this before — when displaying Bitmaps, I eventually gave up trying to use the basic GLES20.glTexImage2D() with Buffers and skipped to using GLUtils.glTexImage2D(), which processes the Bitmap for you. Unfortunately, that's less of an option here (I did actually try converting the ByteBuffer to a Bitmap so I could use GLUtils, without much success), so I've really run out of ideas.
Can anyone think of anything that could be causing glTexImage2D() to not correctly process a perfectly good ByteBuffer? Any and all suggestions would be welcome.
ByteBuffer pixelBuffer;

void storePixels() {
    try {
        GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fbuf);
        pixelBuffer = ByteBuffer.allocateDirect(width * height * 4).order(ByteOrder.nativeOrder());
        GLES20.glReadPixels(0, 0, width, height, GL20.GL_RGBA, GL20.GL_UNSIGNED_BYTE, pixelBuffer);
        GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
        gfx.checkGlError("store Pixels");
    } catch (OutOfMemoryError e) {
        pixelBuffer = null;
    }
}
void redrawPixels() {
    GLES20.glBindFramebuffer(GL20.GL_FRAMEBUFFER, fbuf);
    int[] texId = new int[1];
    GLES20.glGenTextures(1, texId, 0);
    int bufferTex = texId[0];
    GLES20.glBindTexture(GL20.GL_TEXTURE_2D, bufferTex);
    GLES20.glTexParameterf(GL20.GL_TEXTURE_2D, GL20.GL_TEXTURE_MAG_FILTER, GL20.GL_LINEAR);
    GLES20.glTexParameterf(GL20.GL_TEXTURE_2D, GL20.GL_TEXTURE_MIN_FILTER, GL20.GL_LINEAR);
    GLES20.glTexParameterf(GL20.GL_TEXTURE_2D, GL20.GL_TEXTURE_WRAP_S, repeatX ? GL20.GL_REPEAT : GL20.GL_CLAMP_TO_EDGE);
    GLES20.glTexParameterf(GL20.GL_TEXTURE_2D, GL20.GL_TEXTURE_WRAP_T, repeatY ? GL20.GL_REPEAT : GL20.GL_CLAMP_TO_EDGE);
    GLES20.glTexImage2D(GL20.GL_TEXTURE_2D, 0, GL20.GL_RGBA, width, height, 0, GL20.GL_RGBA, GL20.GL_UNSIGNED_BYTE, pixelBuffer);
    gfx.drawTexture(bufferTex, width, height, Transform.IDENTITY, width/2, height/2, false, false, 1);
    GLES20.glDeleteTextures(1, IntBuffer.wrap(new int[] {bufferTex}));
    pixelBuffer = null;
    GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
}
gfx.drawTexture() builds a quad and draws it to the currently bound framebuffer, by the way. That code has been well-tested in other parts of my project — it shouldn't be the issue here.
For those of you playing along at home, this code is in fact totally valid. Remember when I swore blind that "gfx.drawTexture() has been well-tested and shouldn't be the issue here"? Yeah, it was totally the issue. I was buffering vertices to draw without actually flushing them through a glDrawElements() call. Whoops.
