When a framebuffer is used on Android, the scene does not render

I am trying to switch my application over to using framebuffer/renderbuffer objects (with a view to rendering to a texture later), but whenever I use my own framebuffer/renderbuffer, nothing renders. When the default framebuffer is left bound, everything renders fine; all colour, depth and stencil related features work exactly as expected.
Initialisation code:
if (frameBuffer == 0 || !GLES20.glIsFramebuffer(frameBuffer)) {
    int[] retVal = new int[1];
    GLES20.glGenFramebuffers(1, retVal, 0);
    frameBuffer = retVal[0];
    retVal[0] = 0;
    if (renderBufferColour != 0 || !GLES20.glIsRenderbuffer(renderBufferColour)) {
        GLES20.glGenRenderbuffers(1, retVal, 0);
        renderBufferColour = retVal[0];
    }
    if (renderBufferDepth != 0 || !GLES20.glIsRenderbuffer(renderBufferDepth)) {
        GLES20.glGenRenderbuffers(1, retVal, 0);
        renderBufferDepth = retVal[0];
    }
    if (renderBufferStencil != 0 || !GLES20.glIsRenderbuffer(renderBufferStencil)) {
        GLES20.glGenRenderbuffers(1, retVal, 0);
        renderBufferStencil = retVal[0];
    }
}
Framebuffer enabling code (commenting this out allows rendering using the default buffer):
GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, frameBuffer);
GLES20.glBindRenderbuffer(GLES20.GL_RENDERBUFFER, renderBufferColour);
GLES20.glRenderbufferStorage(GLES20.GL_RENDERBUFFER, GLES20.GL_RGBA4, width, height);
GLES20.glFramebufferRenderbuffer(
GLES20.GL_FRAMEBUFFER,
GLES20.GL_COLOR_ATTACHMENT0,
GLES20.GL_RENDERBUFFER,
renderBufferColour);
GLES20.glBindRenderbuffer(GLES20.GL_RENDERBUFFER, renderBufferDepth);
GLES20.glRenderbufferStorage(GLES20.GL_RENDERBUFFER, GLES20.GL_DEPTH_COMPONENT16, width, height);
GLES20.glFramebufferRenderbuffer(
GLES20.GL_FRAMEBUFFER,
GLES20.GL_DEPTH_ATTACHMENT,
GLES20.GL_RENDERBUFFER,
renderBufferDepth);
GLES20.glBindRenderbuffer(GLES20.GL_RENDERBUFFER, renderBufferStencil);
GLES20.glRenderbufferStorage(GLES20.GL_RENDERBUFFER, GLES20.GL_STENCIL_INDEX8, width, height);
GLES20.glFramebufferRenderbuffer(
GLES20.GL_FRAMEBUFFER,
GLES20.GL_STENCIL_ATTACHMENT,
GLES20.GL_RENDERBUFFER,
renderBufferStencil);
if (GLES20.glCheckFramebufferStatus(GLES20.GL_FRAMEBUFFER) != GLES20.GL_FRAMEBUFFER_COMPLETE)
throw new RuntimeException(String.format("Failed to make complete framebuffer object %x", GLES20.glCheckFramebufferStatus(GLES20.GL_FRAMEBUFFER)));
The exception at the end is not thrown, and when checking glGetError() after each step, no error is returned. Am I missing something here?

OK, after reading the OpenGL ES 2.0 manual, I came across this paragraph:
By allowing the images of a renderbuffer to be attached to a framebuffer, OpenGL ES provides a mechanism to support off-screen rendering. Further, by allowing the images of a texture to be attached to a framebuffer, OpenGL ES provides a mechanism to support render to texture.
It appears that application-created framebuffers are always off-screen, not on-screen as I had expected. If I have read this correctly, that is why there is no image on the screen.
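In other words, to actually see the result you still have to get the off-screen image onto the default framebuffer yourself. A minimal sketch of the usual work-around (replacing the colour renderbuffer with a texture attachment so the result can be sampled; drawFullScreenQuad() is a hypothetical helper that draws a textured quad with an ordinary pass-through shader):
// Attach a texture (instead of the colour renderbuffer) so the off-screen result can be sampled.
int[] tex = new int[1];
GLES20.glGenTextures(1, tex, 0);
int sceneTexture = tex[0];
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, sceneTexture);
GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA, width, height,
        0, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, null);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_NEAREST);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_NEAREST);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);

// Make the texture the colour attachment of the off-screen framebuffer.
GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, frameBuffer);
GLES20.glFramebufferTexture2D(GLES20.GL_FRAMEBUFFER, GLES20.GL_COLOR_ATTACHMENT0,
        GLES20.GL_TEXTURE_2D, sceneTexture, 0);

// ... render the scene into the off-screen framebuffer here ...

// Then present it by drawing the texture into the default (on-screen) framebuffer.
GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
drawFullScreenQuad(sceneTexture); // hypothetical textured-quad helper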

Related

Use glEGLImageTargetTexture2DOES to replace glReadPixels on Android

Given a textureId, I need to extract pixel data from the texture.
glReadPixels works, but it is extremely slow on some devices, even with an FBO/PBO (on a Xiaomi MI 5 it takes 65 ms, and is even slower with a PBO). So I decided to use AHardwareBuffer and EGLImageKHR, which should be much faster. However, I cannot get it to work: the screen goes black and nothing is read into the buffer.
//attach inputTexture to FBO
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, textureId, 0);
//allocate hardware buffer
AHardwareBuffer* buffer = nullptr;
AHardwareBuffer_Desc desc = {};
desc.width = width;
desc.height = height;
desc.layers = 1;
desc.usage = AHARDWAREBUFFER_USAGE_CPU_READ_OFTEN | AHARDWAREBUFFER_USAGE_CPU_WRITE_NEVER
| AHARDWAREBUFFER_USAGE_GPU_COLOR_OUTPUT ;
desc.format = AHARDWAREBUFFER_FORMAT_R8G8B8A8_UNORM;
//create eglImageKHR
EGLint eglAttributes[] = {EGL_IMAGE_PRESERVED_KHR, EGL_TRUE, EGL_NONE};
EGLClientBuffer clientBuffer = eglGetNativeClientBufferANDROID(buffer);
EGLImageKHR eglImageKhr = eglCreateImageKHR(eglDisplay, eglContext,
EGL_NATIVE_BUFFER_ANDROID, clientBuffer, eglAttributes);
//read pixels to hardware buffer ????
glBindTexture(GL_TEXTURE_2D, textureId);
glEGLImageTargetTexture2DOES(GL_TEXTURE_2D, eglImageKhr);
//copy pixels into memory
void *readPtr;
AHardwareBuffer_lock(buffer, AHARDWAREBUFFER_USAGE_CPU_READ_OFTEN, -1, nullptr, (void**)&readPtr);
memcpy(writePtr, readPtr, width * 4);
AHardwareBuffer_unlock(buffer, nullptr);
This is my code with glReadPixels; it reads the pixels correctly after attaching the texture to the framebuffer.
GLES30.glBindFramebuffer(GLES30.GL_FRAMEBUFFER, fboBuffer!![0])
GLES30.glFramebufferTexture2D(GLES30.GL_FRAMEBUFFER,
GLES30.GL_COLOR_ATTACHMENT0,
GLES30.GL_TEXTURE_2D,
inputTextureId,
0)
GLES30.glReadPixels(0, 0, width, height, GLES30.GL_RGBA, GLES30.GL_UNSIGNED_BYTE, byteBuffer)
Please tell me where I went wrong :(
//read pixels to hardware buffer ????
glBindTexture(GL_TEXTURE_2D, textureId);
glEGLImageTargetTexture2DOES(GL_TEXTURE_2D, eglImageKhr);
This doesn't do any reading of data. It just sets up the texture to refer to the EGLImage. If you want data to be copied into it, you need to either call glCopyTexImage2D() or bind it to a framebuffer and render into it.
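For example, the copy step could look roughly like this (a sketch using the Java GLES20 bindings; it assumes fbo already has the source texture attached as COLOR_ATTACHMENT0, as in the question, and that hwBufferTexture is a texture separate from the source textureId that was bound to the EGLImage - the names are illustrative):
// With the source texture attached to the bound framebuffer, copy its pixels into the
// EGLImage-backed texture; the copy lands in the underlying AHardwareBuffer.
GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fbo);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, hwBufferTexture);
GLES20.glCopyTexSubImage2D(GLES20.GL_TEXTURE_2D, 0, 0, 0, 0, 0, width, height);
GLES20.glFinish(); // make sure the GPU has finished the copy before AHardwareBuffer_lock()
GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);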

OpenGL - Convert Texture to Array

In my Android app, I have a frame buffer object that gives me the rendered scene as a texture.
The app is an origami game where the user can fold a paper freely:
On every fold, the currently rendered scene is saved to a texture using the FBO, and I then redraw the paper with new coordinates and the new texture attached to it, so that it looks like folded paper. This way the user can fold the paper as many times as they want.
On every frame I want to check the rendered scene to determine whether the user has reached the final shape (assume I have the final shape in a 2D array filled with 0s and 1s, 0 for transparency and 1 for coloured pixels).
What I want is to somehow convert this texture into a 2D array filled with 0s and 1s:
0 for a transparent pixel, and 1 for a coloured pixel of the texture.
I need this so I can then compare the result with a previously known 2D array to determine whether the texture matches the shape I want.
Is it possible to save the texture data to an array?
I can't use glReadPixels because it is too heavy and it isn't possible to call it every frame.
Here is the FBO class (I want to have renderTex[0] as an array):
public class FBO {
int [] fb, renderTex;
int texW;
int texH;
public FBO(int width,int height){
texW = width;
texH = height;
fb = new int[1];
renderTex= new int[1];
}
public void setup(GL10 gl){
// generate
((GL11ExtensionPack)gl).glGenFramebuffersOES(1, fb, 0);
gl.glEnable(GL10.GL_TEXTURE_2D);
gl.glGenTextures(1, renderTex, 0);// generate texture
gl.glBindTexture(GL10.GL_TEXTURE_2D, renderTex[0]);
gl.glTexParameterf(GL10.GL_TEXTURE_2D,
GL10.GL_TEXTURE_MIN_FILTER, GL10.GL_NEAREST);
gl.glTexParameterf(GL10.GL_TEXTURE_2D,
GL10.GL_TEXTURE_MAG_FILTER, GL10.GL_NEAREST);
gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_WRAP_S,
GL10.GL_CLAMP_TO_EDGE);
gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_WRAP_T,
GL10.GL_CLAMP_TO_EDGE);
//texBuffer = ByteBuffer.allocateDirect(buf.length*4).order(ByteOrder.nativeOrder()).asIntBuffer();
//gl.glTexEnvf(GL10.GL_TEXTURE_ENV, GL10.GL_TEXTURE_ENV_MODE,GL10.GL_MODULATE);
gl.glTexImage2D(GL10.GL_TEXTURE_2D, 0, GL10.GL_RGBA, texW, texH, 0, GL10.GL_RGBA, GL10.GL_UNSIGNED_BYTE, null);
gl.glDisable(GL10.GL_TEXTURE_2D);
}
public boolean RenderStart(GL10 gl){
Log.d("TextureAndFBO", ""+renderTex[0] + " And " +fb[0]);
// Bind the framebuffer
((GL11ExtensionPack)gl).glBindFramebufferOES(GL11ExtensionPack.GL_FRAMEBUFFER_OES, fb[0]);
// specify texture as color attachment
((GL11ExtensionPack)gl).glFramebufferTexture2DOES(GL11ExtensionPack.GL_FRAMEBUFFER_OES, GL11ExtensionPack.GL_COLOR_ATTACHMENT0_OES, GL10.GL_TEXTURE_2D, renderTex[0], 0);
int error = gl.glGetError();
if (error != GL10.GL_NO_ERROR) {
Log.d("err", "FIRST Background Load GLError: " + error+" ");
}
int status = ((GL11ExtensionPack)gl).glCheckFramebufferStatusOES(GL11ExtensionPack.GL_FRAMEBUFFER_OES);
if (status != GL11ExtensionPack.GL_FRAMEBUFFER_COMPLETE_OES)
{
Log.d("err", "SECOND Background Load GLError: " + status+" ");;
return true;
}
gl.glClear(GL10.GL_COLOR_BUFFER_BIT);
return true;
}
public void RenderEnd(GL10 gl){
((GL11ExtensionPack)gl).glBindFramebufferOES(GL11ExtensionPack.GL_FRAMEBUFFER_OES, 0);
gl.glClear(GL10.GL_COLOR_BUFFER_BIT | GL10.GL_DEPTH_BUFFER_BIT);
gl.glEnable(GL10.GL_TEXTURE_2D);
gl.glBindTexture(GL10.GL_TEXTURE_2D, 0);
gl.glColor4f(1.0f,1.0f,1.0f,1.0f);
gl.glDisable(GL10.GL_TEXTURE_2D);
}
public int getTexture(){
return renderTex[0];
}
public int getFBO(){
return fb[0];
}
}
If you are using OpenGL ES 3.0 or later, a PBO would be a good solution. But I think you can use EGLImage, because that only needs OpenGL ES 1.1 or 2.0.
The function to create an EGLImageKHR is:
EGLImageKHR eglCreateImageKHR(EGLDisplay dpy,
EGLContext ctx,
EGLenum target,
EGLClientBuffer buffer,
const EGLint *attrib_list)
To allocate an ANativeWindowBuffer, Android has a simple wrapper called GraphicBuffer:
GraphicBuffer *window = new GraphicBuffer(width, height, PIXEL_FORMAT_RGBA_8888, GraphicBuffer::USAGE_SW_READ_OFTEN | GraphicBuffer::USAGE_HW_TEXTURE);
struct ANativeWindowBuffer *buffer = window->getNativeBuffer();
EGLImageKHR image = eglCreateImageKHR(eglGetCurrentDisplay(), EGL_NO_CONTEXT, EGL_NATIVE_BUFFER_ANDROID, (EGLClientBuffer)buffer, attribs); // attribs: e.g. {EGL_IMAGE_PRESERVED_KHR, EGL_TRUE, EGL_NONE}
To read pixels from an FBO, use one of the two methods below:
void EGLImageTargetTexture2DOES(enum target, eglImageOES image)
void EGLImageTargetRenderbufferStorageOES(enum target, eglImageOES image)
These two methods will establish all the properties of the target GL_TEXTURE_2D or GL_RENDERBUFFER:
uint8_t *ptr;
glBindTexture(GL_TEXTURE_2D, texture_id);
glEGLImageTargetTexture2DOES(GL_TEXTURE_2D, image);
window->lock(GraphicBuffer::USAGE_SW_READ_OFTEN, &ptr);
memcpy(pixels, ptr, width * height * 4);
window->unlock();
To accomplish what you want, you need to use a PBO (Pixel Buffer Object): you can map it and read it as if it were a regular array.
OpenGL ARB_pixel_buffer_object extension is very close to ARB_vertex_buffer_object. It simply expands ARB_vertex_buffer_object extension in order to store not only vertex data but also pixel data into the buffer objects. This buffer object storing pixel data is called Pixel Buffer Object (PBO). ARB_pixel_buffer_object extension borrows all VBO framework and APIs, plus, adds 2 additional "target" tokens. These tokens assist the PBO memory manager (OpenGL driver) to determine the best location of the buffer object; system memory, shared memory or video memory. Also, the target tokens clearly specify that the bound PBO will be used in one of 2 different operations; GL_PIXEL_PACK_BUFFER_ARB to transfer pixel data to a PBO, or GL_PIXEL_UNPACK_BUFFER_ARB to transfer pixel data from PBO.
It can be created similarly to other buffer objects:
glGenBuffers(1, &pbo);
glBindBuffer(GL_PIXEL_PACK_BUFFER, pbo);
glBufferData(GL_PIXEL_PACK_BUFFER, size, 0, GL_DYNAMIC_READ);
Then you can read from an FBO (or a texture) easily:
glReadBuffer(GL_COLOR_ATTACHMENT0);
glBindBuffer(GL_PIXEL_PACK_BUFFER, pbo);
glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, 0);
GLubyte *array = (GLubyte*)glMapBufferRange(GL_PIXEL_PACK_BUFFER, 0, size, GL_MAP_READ_BIT);
// TODO: Do your checking of the shape inside of this 'array' pointer or copy it somewhere using memcpy()
glUnmapBuffer(GL_PIXEL_PACK_BUFFER);
glBindBuffer(GL_PIXEL_PACK_BUFFER, 0);
Here GL_COLOR_ATTACHMENT0 is used as the input - see the specification of glReadBuffer for details on how to specify the front or back buffer instead.
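On Android this translates fairly directly to the GLES30 Java bindings. A rough sketch (assuming texW and texH are the FBO dimensions and the FBO from the class above is bound for reading):
int size = texW * texH * 4;
int[] pbo = new int[1];
GLES30.glGenBuffers(1, pbo, 0);
GLES30.glBindBuffer(GLES30.GL_PIXEL_PACK_BUFFER, pbo[0]);
GLES30.glBufferData(GLES30.GL_PIXEL_PACK_BUFFER, size, null, GLES30.GL_DYNAMIC_READ);

// With a PBO bound, glReadPixels writes into the PBO instead of client memory,
// so the call does not stall waiting for the CPU copy.
GLES30.glReadBuffer(GLES30.GL_COLOR_ATTACHMENT0);
GLES30.glReadPixels(0, 0, texW, texH, GLES30.GL_RGBA, GLES30.GL_UNSIGNED_BYTE, 0);

// Map the PBO (ideally a frame or two later, to avoid stalling) and build the 0/1 array,
// e.g. from the alpha channel of each pixel.
java.nio.ByteBuffer pixels = (java.nio.ByteBuffer) GLES30.glMapBufferRange(
        GLES30.GL_PIXEL_PACK_BUFFER, 0, size, GLES30.GL_MAP_READ_BIT);
// ... inspect pixels here ...
GLES30.glUnmapBuffer(GLES30.GL_PIXEL_PACK_BUFFER);
GLES30.glBindBuffer(GLES30.GL_PIXEL_PACK_BUFFER, 0);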

OpenGL ES 2.0 render to framebuffer/texture results in black texture

I'm using libgdx, but this is pretty much vanilla OpenGL ES 2.0 stuff. Just try to ignore the Gdx.gl prefix everywhere ^^ I'm testing it on my desktop as well as an Android device, and it's the same story in both cases.
I have the following code in my window resize event. It is supposed to delete the framebuffer and associated textures if they were already created, and then make new ones of the right size. I'm not sure whether it is even correct to delete the textures and framebuffer the way I am doing it.
if (depthTexture >= 0)
{
Gdx.gl.glDeleteTexture(depthTexture);
depthTexture = -1;
}
if (colorTexture >= 0)
{
Gdx.gl.glDeleteTexture(colorTexture);
colorTexture = -1;
}
if (depthBuffer >= 0)
{
Gdx.gl.glDeleteFramebuffer(depthBuffer);
depthBuffer = -1;
}
IntBuffer intBuffer = BufferUtils.newIntBuffer(16); // See http://lwjgl.org/forum/index.php?topic=1314.0;wap2
intBuffer.clear();
Gdx.gl.glGenFramebuffers(1, intBuffer);
frameBuffer = intBuffer.get(0);
intBuffer.clear();
Gdx.gl.glGenTextures(1, intBuffer);
colorTexture = intBuffer.get(0);
Gdx.gl.glBindTexture(GL20.GL_TEXTURE_2D, colorTexture);
Gdx.gl.glTexImage2D(GL20.GL_TEXTURE_2D, 0, GL20.GL_RGBA, width, height
, 0, GL20.GL_RGBA, GL20.GL_UNSIGNED_BYTE, null);
Gdx.gl.glTexParameteri(GL20.GL_TEXTURE_2D, GL20.GL_TEXTURE_MIN_FILTER, GL20.GL_NEAREST);
Gdx.gl.glTexParameteri(GL20.GL_TEXTURE_2D, GL20.GL_TEXTURE_MAG_FILTER, GL20.GL_NEAREST);
Gdx.gl.glBindTexture(GL20.GL_TEXTURE_2D, 0);
intBuffer.clear();
Gdx.gl.glGenTextures(1, intBuffer);
depthTexture = intBuffer.get(0);
Gdx.gl.glBindTexture(GL20.GL_TEXTURE_2D, depthTexture);
Gdx.gl.glTexImage2D(GL20.GL_TEXTURE_2D, 0, GL20.GL_DEPTH_COMPONENT, width, height
, 0, GL20.GL_DEPTH_COMPONENT, GL20.GL_UNSIGNED_SHORT, null);
Gdx.gl.glTexParameteri(GL20.GL_TEXTURE_2D, GL20.GL_TEXTURE_MIN_FILTER, GL20.GL_NEAREST);
Gdx.gl.glTexParameteri(GL20.GL_TEXTURE_2D, GL20.GL_TEXTURE_MAG_FILTER, GL20.GL_NEAREST);
Gdx.gl.glBindTexture(GL20.GL_TEXTURE_2D, 0);
Gdx.gl.glBindFramebuffer(GL20.GL_FRAMEBUFFER, frameBuffer);
Gdx.gl.glFramebufferTexture2D(GL20.GL_FRAMEBUFFER, GL20.GL_COLOR_ATTACHMENT0
, GL20.GL_TEXTURE_2D, colorTexture, 0);
Gdx.gl.glFramebufferTexture2D(GL20.GL_FRAMEBUFFER, GL20.GL_DEPTH_ATTACHMENT
, GL20.GL_TEXTURE_2D, depthTexture, 0);
int status = Gdx.gl.glCheckFramebufferStatus(GL20.GL_FRAMEBUFFER);
if (status != GL20.GL_FRAMEBUFFER_COMPLETE)
{
System.out.println("frame buffer not complete. status " + Integer.toHexString(status));
System.exit(0);
}
Gdx.gl.glBindFramebuffer(GL20.GL_FRAMEBUFFER, 0);
status = Gdx.gl.glCheckFramebufferStatus(GL20.GL_FRAMEBUFFER);
if (status != GL20.GL_FRAMEBUFFER_COMPLETE)
{
System.out.println("default buffer not complete. status " + Integer.toHexString(status));
System.exit(0);
}
I am not at all sure whether I have made mistakes in setting up the framebuffer or the colour and depth texture attachments. Anyway, on to the rendering loop:
// update cameras and things
// setup rendering to off screen framebuffer
Gdx.gl.glBindFramebuffer(GL20.GL_FRAMEBUFFER, frameBuffer);
Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT | GL20.GL_DEPTH_BUFFER_BIT);
Gdx.gl.glViewport(0, 0, Gdx.graphics.getWidth(), Gdx.graphics.getHeight());
// draw things
// setup rendering to default framebuffer
Gdx.gl.glBindFramebuffer(GL20.GL_FRAMEBUFFER, 0);
Gdx.gl.glViewport(0, 0, Gdx.graphics.getWidth(), Gdx.graphics.getHeight());
Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT | GL20.GL_DEPTH_BUFFER_BIT);
shader.begin();
// setup shader stuff
Gdx.gl.glActiveTexture(0);
Gdx.gl.glBindTexture(GL20.GL_TEXTURE_2D, depthTexture);
shader.setUniformi("u_fbDepth", 0);
Gdx.gl.glActiveTexture(1);
Gdx.gl.glBindTexture(GL20.GL_TEXTURE_2D, colorTexture);
shader.setUniformi("u_fbColor", 1);
// draw things with shader
shader.end();
Again, I am not sure I am setting things up the right way. The idea here is hopefully pretty clear: render to the off-screen framebuffer, then use the depth and colour textures from that framebuffer as textures to sample in the final shader that renders to the default framebuffer.
However, the depth and colour textures that end up in my fragment shader are just empty. Black screen. I know the fragment shader is not the problem - if I sample a different texture, I see that texture as expected. I know the drawing itself is not the problem - if I render what I would normally render to the off-screen framebuffer directly to the default framebuffer, I see what I expect.
I got it. There's a bit of a gotcha with setting active textures. The function glActiveTexture expects one of the GL_TEXTURE0-style constants, but the shader uniform just wants the integer from the constant name.
Basically,
Gdx.gl.glActiveTexture(0);
Gdx.gl.glBindTexture(GL20.GL_TEXTURE_2D, depthTexture);
shader.setUniformi("u_fbDepth", 0);
Gdx.gl.glActiveTexture(1);
Gdx.gl.glBindTexture(GL20.GL_TEXTURE_2D, colorTexture);
shader.setUniformi("u_fbColor", 1);
needed to be
Gdx.gl.glActiveTexture(GL20.GL_TEXTURE0);
Gdx.gl.glBindTexture(GL20.GL_TEXTURE_2D, depthTexture);
shader.setUniformi("u_fbDepth", 0);
Gdx.gl.glActiveTexture(GL20.GL_TEXTURE1);
Gdx.gl.glBindTexture(GL20.GL_TEXTURE_2D, colorTexture);
shader.setUniformi("u_fbColor", 1);

Replacing glReadPixels with EGL_KHR_image_base for faster pixel copy

I am trying to use EGL_KHR_image_base in an Android native process in order to replace glReadPixels, because it is too slow (220 ms for 1280x800 RGBA).
This is what I have so far, but it produces an empty buffer (only zeros):
uint8_t *ptr;
GLuint mTexture;
status_t error;
GraphicBufferAlloc* mGraphicBufferAlloc = new GraphicBufferAlloc();
sp<GraphicBuffer> window = mGraphicBufferAlloc->createGraphicBuffer(width, height, PIXEL_FORMAT_RGBA_8888, GraphicBuffer::USAGE_SW_READ_OFTEN | GraphicBuffer::USAGE_HW_TEXTURE,&error);
EGLClientBuffer buffer = (EGLClientBuffer)window->getNativeBuffer();
EGLint eglImageAttributes[] = {EGL_WIDTH, width, EGL_HEIGHT, height, EGL_MATCH_FORMAT_KHR, EGL_FORMAT_RGBA_8888_KHR, EGL_IMAGE_PRESERVED_KHR, EGL_TRUE, EGL_NONE};
EGLImageKHR image = eglCreateImageKHR(eglGetCurrentDisplay(), EGL_NO_CONTEXT, EGL_NATIVE_BUFFER_ANDROID,buffer, eglImageAttributes);
glGenTextures(1, &mTexture);
glBindTexture(GL_TEXTURE_2D, mTexture);
glEGLImageTargetTexture2DOES(GL_TEXTURE_2D, image);
window->lock(GraphicBuffer::USAGE_SW_READ_OFTEN, (void**)&ptr);
memcpy(texture, ptr, width * height * 4);
window->unlock();
What am I doing wrong?
You're creating an empty buffer and then reading the contents out of it. Walking through the code:
GraphicBufferAlloc* mGraphicBufferAlloc = new GraphicBufferAlloc();
sp<GraphicBuffer> window = mGraphicBufferAlloc->createGraphicBuffer(width, height, PIXEL_FORMAT_RGBA_8888, GraphicBuffer::USAGE_SW_READ_OFTEN | GraphicBuffer::USAGE_HW_TEXTURE,&error);
This creates a new GraphicBuffer (sometimes referred to as a "gralloc buffer"), with the specified dimensions and pixel format. The usage flags allow it to be used as a texture or read from software, which is what you want.
EGLClientBuffer buffer = (EGLClientBuffer)window->getNativeBuffer();
EGLint eglImageAttributes[] = {EGL_WIDTH, width, EGL_HEIGHT, height, EGL_MATCH_FORMAT_KHR, EGL_FORMAT_RGBA_8888_KHR, EGL_IMAGE_PRESERVED_KHR, EGL_TRUE, EGL_NONE};
EGLImageKHR image = eglCreateImageKHR(eglGetCurrentDisplay(), EGL_NO_CONTEXT, EGL_NATIVE_BUFFER_ANDROID,buffer, eglImageAttributes);
This takes the ANativeWindowBuffer (a single gralloc buffer, not a whole window/buffer queue) and attaches an EGLImage "handle" to it.
glGenTextures(1, &mTexture);
glBindTexture(GL_TEXTURE_2D, mTexture);
glEGLImageTargetTexture2DOES(GL_TEXTURE_2D, image);
This creates a new texture object and attaches the EGLImage to it. So now the GraphicBuffer's memory can be used as a texture.
window->lock(GraphicBuffer::USAGE_SW_READ_OFTEN, (void**)&ptr);
memcpy(texture, ptr, width * height * 4);
window->unlock();
This locks the buffer for reading, copies the data out of it, and unlocks it. Since you haven't drawn anything, there's nothing to read.
For this to do something useful, you have to render something into the texture. You can do this by creating an FBO and attaching the texture to it as the color buffer, or by using glCopyTexImage2D() to copy pixels from the framebuffer to the texture.
I was able to get your example to work by adding the following before the call to grallocBuffer->lock():
glCopyTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 0, 0, width, height, 0);
glFinish();
The glFinish() is necessary to ensure that GL has finished copying the pixels before we try to look at them.
Edit: my office-mate suggested that the GL_TEXTURE_2D needs to be GL_TEXTURE_EXTERNAL_OES. Haven't tried it yet.
Edit: see also this question.
GraphicBuffer is part of the Android source code, not the NDK.
See this for reference:
https://github.com/fuyufjh/GraphicBuffer

Using OpenGL ES 2.0 FrameBuffer (FBO) and Stencil in android Native code (ndk)

I am trying to create a framebuffer object and use a stencil buffer inside a native Android application using the NDK (r5b). The target device is running Froyo (2.2), which supports OpenGL ES 2.0.
So, I've been writing lots of GL code in my C++ native libs and haven't run into any problems except for this one. I just can't seem to make it work.
Here's a code snippet for the framebuffer creation. The completeness check passes, but the screen remains completely black. It's as if the FBO I am creating is not really bound to the GL surface that is created by the Java part of the app. The rest of my app code is fine; if I remove the FBO creation and binding, everything works perfectly, except that I don't have the stencil working, which I need for my app.
GLint backingWidth = 1024;
GLint backingHeight = 1024;
//Create the FrameBuffer and binds it
glGenFramebuffers(1, &_defaultFramebuffer);
checkGlError("glGenFramebuffers");
glBindFramebuffer(GL_FRAMEBUFFER, _defaultFramebuffer);
checkGlError("glBindFramebuffer");
//Create the RenderBuffer for offscreen rendering // Color
glGenRenderbuffers(1, &_colorRenderbuffer);
checkGlError("glGenRenderbuffers color");
glBindRenderbuffer(GL_RENDERBUFFER, _colorRenderbuffer);
checkGlError("glBindRenderbuffer color");
glRenderbufferStorage(GL_RENDERBUFFER, GL_RGBA4, backingWidth, backingHeight);
checkGlError("glRenderbufferStorage color");
//Create the RenderBuffer for offscreen rendering // Depth
glGenRenderbuffers(1, &_depthRenderbuffer);
checkGlError("glGenRenderbuffers depth");
glBindRenderbuffer(GL_RENDERBUFFER, _depthRenderbuffer);
checkGlError("glBindRenderbuffer depth");
glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT16, backingWidth, backingHeight);
checkGlError("glRenderbufferStorage depth");
//Create the RenderBuffer for offscreen rendering // Stencil
glGenRenderbuffers(1, &_stencilRenderbuffer);
checkGlError("glGenRenderbuffers stencil");
glBindRenderbuffer(GL_RENDERBUFFER, _stencilRenderbuffer);
checkGlError("glBindRenderbuffer stencil");
glRenderbufferStorage(GL_RENDERBUFFER, GL_STENCIL_INDEX8, backingWidth, backingHeight);
checkGlError("glRenderbufferStorage stencil");
// bind renderbuffers to framebuffer object
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, _depthRenderbuffer);
checkGlError("glFramebufferRenderbuffer depth");
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_RENDERBUFFER, _colorRenderbuffer);
checkGlError("glFramebufferRenderbuffer color");
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_STENCIL_ATTACHMENT, GL_RENDERBUFFER, _stencilRenderbuffer);
checkGlError("glFramebufferRenderbuffer stencil");
//Test for FrameBuffer completeness
GLenum status = glCheckFramebufferStatus(GL_FRAMEBUFFER);
checkGlError("glCheckFramebufferStatus");
switch (status)
{
case GL_FRAMEBUFFER_COMPLETE: LOGI("\n\n\nFLIPBOOM : FBO complete GL_FRAMEBUFFER_COMPLETE %x\n\n\n", status);break;
case GL_FRAMEBUFFER_INCOMPLETE_ATTACHMENT: LOGI("\n\n\nFLIPBOOM : FBO GL_FRAMEBUFFER_INCOMPLETE_ATTACHMENT %x\n\n\n", status);break;
case GL_FRAMEBUFFER_INCOMPLETE_MISSING_ATTACHMENT: LOGI("\n\n\nFLIPBOOM : FBO FRAMEBUFFER_INCOMPLETE_MISSING_ATTACHMENT %x\n\n\n", status);break;
case GL_FRAMEBUFFER_INCOMPLETE_DIMENSIONS: LOGI("\n\n\nFLIPBOOM : FBO FRAMEBUFFER_INCOMPLETE_DIMENSIONS %x\n\n\n", status);break;
case GL_FRAMEBUFFER_UNSUPPORTED: LOGI("\n\n\nFLIPBOOM : FBO GL_FRAMEBUFFER_UNSUPPORTED %x\n\n\n", status);break;
default : LOGI("\n\n\nFLIPBOOM : failed to make complete framebuffer object %x\n\n\n", status);
}
I've also tried rendering to a 2D texture instead of the renderbuffer... that didn't work either.
So, is there a way I can fix this? Am I getting something wrong here? If anyone has any ideas, please let me know... I've been spending way too much time on this problem... hehe ;)
Thanks in advance!
Cheers!
EDIT:
OK, I've managed to make the stencil buffer work, but the FBOs are just not working. I think OpenGL ES 2.0 is not fully supported by Android (using NDK r5b here, btw). I think the method stubs are defined but not fully implemented, or the GLSurfaceView that is created doesn't link correctly with the FBOs.
As for the stencil buffer, I had to do
glEnable(GL_DEPTH_TEST);
and remove the usage of glDepthMask in order for it to work correctly.
@Zennichimaro, for the stencil buffer usage:
During initialisation:
glBindFramebuffer(GL_FRAMEBUFFER, 0);
glEnable(GL_DEPTH_TEST);
During rendering:
glViewport(0, 0, GetViewWidth(), GetViewHeight());
checkGlError("glViewport");
if (_firstRenderDone == false)
{
glClearDepthf( 0.9f );
glDepthMask( GL_TRUE );
glClear( GL_DEPTH_BUFFER_BIT );
glDepthMask( GL_FALSE );
_firstRenderDone = true;
}
glClearColor(M_channelToFloat(_backgroundColor.r),
M_channelToFloat(_backgroundColor.g),
M_channelToFloat(_backgroundColor.b),
M_channelToFloat(_backgroundColor.a));
checkGlError("glClearColor");
glClearStencil( 0 );
checkGlError("glClearStencil");
glClear( GL_COLOR_BUFFER_BIT | GL_STENCIL_BUFFER_BIT );
checkGlError("glClear");
_stencilLayer = 1;
//use our custom shaders
if( _program )
{
glUseProgram(_program);
if( transformMatrix3x3 != NULL )
{
glUniformMatrix3fv( _uniforms[OGL_UNIFORM_TRANSFORM], 1, false, transformMatrix3x3 );
}
// reset the shading.
glUniform1f( _uniforms[ OGL_UNIFORM_SHADE ], 0.0f );
}
//Do the actual drawing (Triangle Slip)
if( object )
{
_isRender = true;
object->OglDraw(this);
_isRender = false;
}
When I need to use the stencil I use one of the following methods, depending on what I need:
void GlEs2Renderer::StencilStartMask()
{
if (!USE_STENCIL) //For debugging purpose
return;
glEnable(GL_STENCIL_TEST);
//Turn off writing to the Color Buffer and Depth Buffer
//We want to draw to the Stencil Buffer only
glColorMask( GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE );
//Set 1 into the stencil buffer
glStencilFunc( GL_ALWAYS, NewStencilLayer(), 0xFFFFFFFF );
glStencilOp( GL_ZERO, GL_ZERO, GL_REPLACE );
}
void GlEs2Renderer::StencilUseMask()
{
if (!USE_STENCIL) //For debugging purpose
return;
//Turn back on Color Buffer and Depth Buffer
glColorMask( GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE );
//Only write to the Stencil Buffer where 1 is set
glStencilFunc( GL_EQUAL, StencilLayer(), 0xFFFFFFFF);
//Keep the content of the Stencil Buffer
glStencilOp( GL_KEEP, GL_KEEP, GL_KEEP );
}
void GlEs2Renderer::StencilOverlayMask()
{
if (!USE_STENCIL) //For debugging purpose
return;
//Turn back on Color Buffer and Depth Buffer
glColorMask( GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE );
glDepthMask(true);
//Only write to the Stencil Buffer where 1 is set
glStencilFunc( GL_EQUAL, StencilLayer(), 0xFFFFFFFF);
//Keep the content of the Stencil Buffer and increase when z passed
glStencilOp(GL_KEEP, GL_KEEP, GL_INCR);
}
And finally I use the double-pass technique to draw inside the stencil... Here's an example:
glVertexAttribPointer(OGL_ATTRIB_VERTEX, 2, GL_FLOAT, 0, 0, _triangles);
glEnableVertexAttribArray(OGL_ATTRIB_VERTEX);
glVertexAttribPointer(OGL_ATTRIB_COLOR, 4, GL_UNSIGNED_BYTE, 1, 0, _colors);
glEnableVertexAttribArray(OGL_ATTRIB_COLOR);
glContext->StencilStartMask();
glDrawArrays(GL_TRIANGLE_STRIP, 0, _nPoints);
glContext->StencilUseMask();
glDrawArrays(GL_TRIANGLE_STRIP, 0, _nPoints);
glContext->StencilEndMask();
My code is fairly complex, so it's hard to post only what's related to the stencil, but I hope it'll help ;)
