I am in a situation where I need to render to a texture using an FBO, and I need to remove some pixels from the buffer. That is, I need to write the color (0,0,0,0) into an RGBA framebuffer. This will later allow me to render this texture on top of another one, and the 'zero pixels' will not show since I use blending.
The problem is that I cannot write that color while GL_BLEND is enabled: with blending on, the zero-alpha pixels never end up in the texture.
So I tried disabling GL_BLEND just before rendering to the texture, and 1. it continued blending as if nothing had happened, and 2. at some point it renders all of the graphics without blending, although it clearly receives the glEnable(GL_BLEND) call.
Here is most of the code:
gl.glEnableClientState(GL11.GL_VERTEX_ARRAY);
gl.glEnableClientState(GL11.GL_TEXTURE_COORD_ARRAY);
gl.glDisable(GL11.GL_BLEND);
gl.glBindTexture(GL11.GL_TEXTURE_2D, 0);
gl.glFrontFace(GL11.GL_CW);
gl.glVertexPointer(3, GL11.GL_FLOAT, 0, vertexBuffer);
gl.glTexCoordPointer(2, GL11.GL_FLOAT, 0, textureBuffer);
gl.glPushMatrix();
gl.glTranslatef(start_x+adv_x/2+adv_x*(float)col_idx, start_y+adv_y/2+adv_y*(float)j, -1.0f);
gl.glScalef(adv_x/2, adv_y/2, 1.0f);
gl.glColor4f(0,0,0,0);
gl.glDrawArrays(GL11.GL_TRIANGLE_STRIP, 0, vertex.length / 3);
gl.glColor4f(1f, 1f, 1f, 1f);
gl.glPopMatrix();
gl.glEnable(GL11.GL_BLEND);
gl.glBlendFunc(GL11.GL_SRC_ALPHA, GL11.GL_ONE_MINUS_SRC_ALPHA);
gl.glDisableClientState(GL11.GL_VERTEX_ARRAY);
gl.glDisableClientState(GL11.GL_TEXTURE_COORD_ARRAY);
This is code from an Android application, written in Java.
What could be the problem?
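For what it's worth, one way to write exact (0,0,0,0) pixels that does not depend on the blend state at all would be a scissored clear. This is only a minimal sketch of that alternative, assuming the FBO is already bound; eraseX, eraseY, eraseW and eraseH are hypothetical names for the region to punch out:
gl.glClearColor(0f, 0f, 0f, 0f);
gl.glEnable(GL11.GL_SCISSOR_TEST);
// Scissor coordinates are framebuffer pixels with the origin at the lower left.
gl.glScissor(eraseX, eraseY, eraseW, eraseH);
gl.glClear(GL11.GL_COLOR_BUFFER_BIT);
gl.glDisable(GL11.GL_SCISSOR_TEST);
glClear ignores blending entirely, so this writes the zero color regardless of GL_BLEND, but I would still like to understand why disabling GL_BLEND does not behave as expected.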
Related
I am drawing objects in OpenGL in this order and z-order:
1. a background quad,
2. background objects = images with titles (textures),
3. a translucent black quad which darkens the background objects (a colored quad with a low alpha value),
4. foreground objects = images with titles (textures; the one in the center has a white quad behind it).
Currently, it looks like this:
This is not as it's supposed to be: the five objects around "Chikinki" (Timid Tiger, The Bishops, ...) should be as bright as the "Chikinki" picture. Also, all titles (including Chikinki's) are darker than the textures really are. It seems as though the white quad behind the "Chikinki" image somehow forces OpenGL to draw the picture in front of it correctly. Without the translucent black quad (which is there to darken all objects except the one in the middle and the five around it), everything is fine. So I suspect that this translucent quad somehow bleeds through, although the other objects are drawn after it and have a higher z-value.
As mentioned, the objects are drawn back-to-front, and also laid out back-to-front z-wise. Blending is always enabled. The white rectangle has blending function: gl.glBlendFunc(GL10.GL_SRC_ALPHA, GL10.GL_ONE_MINUS_SRC_ALPHA); and all images have function gl.glBlendFunc(GL10.GL_ONE, GL10.GL_ONE_MINUS_SRC_ALPHA);. Depth-test is enabled.
Relevant texture-fill quad drawing code (too dim in the screenshot):
gl.glPushMatrix();
gl.glEnable(GL10.GL_BLEND);
gl.glBlendFunc(GL10.GL_ONE, GL10.GL_ONE_MINUS_SRC_ALPHA);
gl.glActiveTexture(GL10.GL_TEXTURE0);
gl.glBindTexture(GL10.GL_TEXTURE_2D, mTextureName);
gl.glTexParameterx(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_WRAP_S, GL10.GL_REPEAT);
gl.glTexParameterx(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_WRAP_T, GL10.GL_REPEAT);
gl.glFrontFace(GL10.GL_CW);
gl.glVertexPointer(3, GL10.GL_FLOAT, 0, mFVertexBuffer);
gl.glEnable(GL10.GL_TEXTURE_2D);
gl.glTexCoordPointer(2, GL10.GL_FLOAT, 0, mTexBuffer);
gl.glDrawArrays(GL10.GL_TRIANGLE_STRIP, 0, 4);
mFVertexBuffer.position(0);
mTexBuffer.position(0);
mIndexBuffer.position(0);
gl.glDisable(GL10.GL_TEXTURE_2D);
gl.glDisable(GL10.GL_BLEND);
gl.glPopMatrix();
Relevant color-fill quad drawing code (shown OK in the screenshot):
gl.glPushMatrix();
gl.glEnable(GL10.GL_BLEND);
gl.glBlendFunc(GL10.GL_ONE, GL10.GL_ONE_MINUS_SRC_ALPHA);
gl.glFrontFace(GL10.GL_CW);
gl.glVertexPointer(3, GL10.GL_FLOAT, 0, mFVertexBuffer);
gl.glColor4f(mRed, mGreen, mBlue, mAlpha);
gl.glDrawArrays(GL10.GL_TRIANGLE_STRIP, 0, 4);
mFVertexBuffer.position(0);
mIndexBuffer.position(0);
gl.glPopMatrix();
gl.glDisable(GL10.GL_BLEND);
Have you got any clue what could be causing the darkening of these objects which should by all means be fully opaque, considering that they are drawn AFTER the translucent quad and have a HIGHER z-value?
SOLUTION
Tim pointed out in the comments that texture drawing is modulated by the current color (in particular its alpha value). Calling gl.glColor4f(1, 1, 1, 1); before drawing the images did the trick, undoing the earlier gl.glColor4f() call that had set a lower-than-1 alpha value.
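In other words, with the default GL_MODULATE texture environment the texture color is multiplied by the current color, so the low alpha left over from the darkening quad dims every textured quad drawn afterwards. A minimal sketch of the fix in context (the drawDarkeningQuad/drawForegroundImages names are only illustrative, not from the actual code):
// The translucent darkening quad leaves a low-alpha current color behind.
gl.glColor4f(0f, 0f, 0f, darkenAlpha);   // hypothetical alpha used for the darkening quad
drawDarkeningQuad(gl);
// Reset the current color so GL_MODULATE no longer dims the textures that follow.
gl.glColor4f(1f, 1f, 1f, 1f);
drawForegroundImages(gl);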
I'm developing a game for Android, and this is my first experience with OpenGL.
When the application loads, I create my vertex and texture buffers and load images from drawable resources, using GLUtils.texImage2D to bind each image to a texture in an array.
I was wondering if glBindTexture() was the correct function to use when changing the texture to produce animation.
public void onDraw(GL10 gl){
sprite.animate();
gl.glBindTexture(GL10.GL_TEXTURE_2D, textures[sprite.frameNumber]);
sprite.draw(gl);
}
Code Explanation
sprite.animate() - changes the frame number depending on the uptime clock (SystemClock.uptimeMillis()); a possible sketch is shown after the draw code below
sprite.draw() - does the actual drawing:
gl.glEnableClientState(GL10.GL_VERTEX_ARRAY);
gl.glEnableClientState(GL10.GL_TEXTURE_COORD_ARRAY);
gl.glVertexPointer(3, GL10.GL_FLOAT, 0, vertexBuffer);
gl.glTexCoordPointer(2, GL10.GL_FLOAT, 0, textureBuffer);
gl.glDrawArrays(GL10.GL_TRIANGLE_STRIP, 0, vertices.length / 3);
//Disable the client state before leaving
gl.glDisableClientState(GL10.GL_VERTEX_ARRAY);
gl.glDisableClientState(GL10.GL_TEXTURE_COORD_ARRAY);
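For illustration, sprite.animate() might look roughly like this (a hypothetical sketch; the constant and field names are made up, as the description above only says the frame is derived from the uptime clock):
// Advance the frame based on elapsed uptime; FRAME_DURATION_MS and frameCount are hypothetical.
public void animate() {
    long now = android.os.SystemClock.uptimeMillis();
    frameNumber = (int) ((now / FRAME_DURATION_MS) % frameCount);
}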
The function does work, but I wanted to confirm that it is the correct one to use, or whether there is an alternative way to do this.
Binding a different texture to animate is one way to do what you want.
A more popular way of doing this is to have all your animation frames in one big texture (pack all the individual frames into a single large rectangle): to draw a different frame, just change the texture coordinates.
For example, pack four frames of animation in a big 2x2 square
1|2
3|4
Then, to display frame 1, you'd use the texture coordinates (0,0), (0.5,0), (0.5,0.5), (0,0.5), and the rest should be obvious.
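As a rough sketch of how the per-frame texture coordinates could be built for such a 2x2 atlas (assuming the quad is drawn as a triangle strip; the method and variable names are only illustrative, and it uses java.nio.ByteBuffer/ByteOrder/FloatBuffer):
// Texture coordinates for one frame of a 2x2 sprite atlas.
// col/row select the frame; each frame covers half of the texture on each axis.
// Depending on how the atlas image is loaded, rows may need to be counted from the bottom instead.
private FloatBuffer frameTexCoords(int frameIndex) {
    int col = frameIndex % 2;
    int row = frameIndex / 2;
    float u0 = col * 0.5f, v0 = row * 0.5f;
    float u1 = u0 + 0.5f,  v1 = v0 + 0.5f;
    // Order matches a GL_TRIANGLE_STRIP quad: bottom-left, bottom-right, top-left, top-right.
    float[] coords = { u0, v1,  u1, v1,  u0, v0,  u1, v0 };
    ByteBuffer bb = ByteBuffer.allocateDirect(coords.length * 4);
    bb.order(ByteOrder.nativeOrder());
    FloatBuffer fb = bb.asFloatBuffer();
    fb.put(coords).position(0);
    return fb;
}
The onDraw code would then call gl.glTexCoordPointer(2, GL10.GL_FLOAT, 0, frameTexCoords(sprite.frameNumber)) instead of binding a different texture per frame (in practice you would cache one buffer per frame rather than rebuilding it every draw).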
I am trying to develop an Android drawing app. In this app I am using textures to draw.
I want to draw translucent textures one over another to get a continuous translucent line, but I am only able to get this:
As you can see in part B, the circular translucent textures appear one over the other. I want the line to appear as in part A.
I am adding the drawing information for each texture (width, height, x and y coordinates) to the ArrayList spriteArray.
The blending function used is gl.glBlendFunc(GL10.GL_ONE, GL10.GL_ONE_MINUS_SRC_ALPHA);
To set the opacity of the texture: gl.glColor4f(1, 1, 1, opacityValue);
I have also tried gl.glBlendFunc(GL10.GL_SRC_ALPHA, GL10.GL_ONE_MINUS_SRC_ALPHA); it does not work either.
Any help to solve this problem will be appreciated.
If you need more information please tell me.
drawFrame code:
gl.glMatrixMode(GL10.GL_TEXTURE);
gl.glEnable(GL10.GL_TEXTURE_2D);
gl.glBlendFunc(GL10.GL_ONE, GL10.GL_ONE_MINUS_SRC_ALPHA);
gl.glColor4f(1, 1, 1, opacityValue);
for (int x = 0; x < GLSurfaceView.mSpriteArray.size(); x++) {
int mTextureName = GLSurfaceView.mSpriteArray.get(x).textureName;
gl.glBindTexture(GL10.GL_TEXTURE_2D, mTextureName);
// Draw using the draw-texture extension; glDrawTexfOES takes window-space x, y, z, width, height.
((GL11Ext) gl).glDrawTexfOES(x, y, 0, width, height);
}
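For what it's worth, one common way to get the result in part A is to render the whole stroke at full alpha into an offscreen layer (an FBO-backed texture) and then composite that layer once at the brush opacity, so overlapping stamps never add up. Only the composite pass is sketched below; strokeLayerTexture, surfaceWidth and surfaceHeight are hypothetical names, and the FBO setup itself is omitted:
// Composite the finished stroke layer once, at the brush opacity.
gl.glEnable(GL10.GL_TEXTURE_2D);
gl.glEnable(GL10.GL_BLEND);
gl.glBlendFunc(GL10.GL_SRC_ALPHA, GL10.GL_ONE_MINUS_SRC_ALPHA);
gl.glColor4f(1f, 1f, 1f, opacityValue);                      // opacity applied once to the whole stroke
gl.glBindTexture(GL10.GL_TEXTURE_2D, strokeLayerTexture);    // hypothetical: texture the stroke was rendered into
// Assumes the GL_TEXTURE_CROP_RECT_OES parameter has been set for this texture, as glDrawTexfOES requires.
((GL11Ext) gl).glDrawTexfOES(0, 0, 0, surfaceWidth, surfaceHeight);  // hypothetical surface size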
I'm actually drawing a cube and checking rotation problems with it, but for this I need to draw a point at the OpenGL coordinate (0, 0, -1) on the screen. I'm using a perspective projection, MyGLSurfaceView, and Android 1.5 with OpenGL ES 1.x.
How can I draw a black or white point at the OpenGL coordinate (0, 0, -1) on the screen?
If you want to be able to draw directly in window space, then the easiest thing would be to temporarily load the modelview and projection matrices with the identity and draw a GL_POINT at the location that you need. So that'd be something like:
glMatrixMode(GL_MODELVIEW);
glPushMatrix();
glLoadIdentity();
glMatrixMode(GL_PROJECTION);
glPushMatrix();
glLoadIdentity();
// draw the point here; specifics depend on whether you
// favour VBOs, plain client-side vertex arrays, etc.
// e.g. (assuming you don't have any client state enabled
// on entry and don't care about leaving the vertex array
// enabled on exit)
GLfloat vertexLocation[] = {0.0f, 0.0f, -1.0f};
glColor4f(0.0f, 0.0f, 0.0f, 1.0f);
glEnableClientState(GL_VERTEX_ARRAY);
glVertexPointer(3, GL_FLOAT, 0, vertexLocation);
glDrawArrays(GL_POINTS, 0, 1);
// end of example to plot a GL_POINT
glPopMatrix();
glMatrixMode(GL_MODELVIEW);
glPopMatrix();
// and possibly restore yourself to some other matrix mode
// if, atypically, the rest of your code doesn't assume modelview
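For completeness, here is a rough Java translation of the same idea for Android's GL10 interface, assuming a gl instance from the renderer and a java.nio FloatBuffer for the single vertex (a sketch, not drop-in code):
// Temporarily load identity matrices so the point is given directly in clip space.
gl.glMatrixMode(GL10.GL_MODELVIEW);
gl.glPushMatrix();
gl.glLoadIdentity();
gl.glMatrixMode(GL10.GL_PROJECTION);
gl.glPushMatrix();
gl.glLoadIdentity();
// One vertex at (0, 0, -1), in a direct native-order buffer.
ByteBuffer bb = ByteBuffer.allocateDirect(3 * 4);
bb.order(ByteOrder.nativeOrder());
FloatBuffer pointBuffer = bb.asFloatBuffer();
pointBuffer.put(new float[] {0f, 0f, -1f}).position(0);
gl.glColor4f(0f, 0f, 0f, 1f);                 // black point
gl.glPointSize(5f);                           // make it easier to spot
gl.glEnableClientState(GL10.GL_VERTEX_ARRAY);
gl.glVertexPointer(3, GL10.GL_FLOAT, 0, pointBuffer);
gl.glDrawArrays(GL10.GL_POINTS, 0, 1);
gl.glDisableClientState(GL10.GL_VERTEX_ARRAY);
// Restore the previous matrices (projection first, since it is the current mode).
gl.glPopMatrix();
gl.glMatrixMode(GL10.GL_MODELVIEW);
gl.glPopMatrix();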
In my application I'm trying to create a mesh that is shaded by a single directional light. The problem I'm facing is that I can't seem to get the light to take my normals into account at all.
It works fine if I set the normals on a per-triangle-strip basis, but if I try to render a series of triangles with normals set via glNormalPointer, the entire mesh is rendered in a single color (identical to what I get if I skip the glNormalPointer call altogether).
My mesh render method looks like this:
public void render(GL10 gl) {
gl.glFrontFace(GL10.GL_CW);
gl.glNormalPointer(GL10.GL_FLOAT, 0, normalBuffer);
gl.glVertexPointer(3, GL10.GL_FLOAT, 0, vertexBuffer);
gl.glDrawElements(GL10.GL_TRIANGLES, indexBuffer.capacity(), GL10.GL_UNSIGNED_BYTE, indexBuffer);
}
You should call
glEnableClientState(GL_NORMAL_ARRAY);
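In the Java/GL10 render method above, that would look something like this (assuming the client-state enables live in render(); disabling them afterwards is optional but keeps the state tidy):
public void render(GL10 gl) {
    gl.glFrontFace(GL10.GL_CW);
    gl.glEnableClientState(GL10.GL_VERTEX_ARRAY);
    gl.glEnableClientState(GL10.GL_NORMAL_ARRAY);   // without this, the glNormalPointer data is ignored
    gl.glVertexPointer(3, GL10.GL_FLOAT, 0, vertexBuffer);
    gl.glNormalPointer(GL10.GL_FLOAT, 0, normalBuffer);
    gl.glDrawElements(GL10.GL_TRIANGLES, indexBuffer.capacity(), GL10.GL_UNSIGNED_BYTE, indexBuffer);
    gl.glDisableClientState(GL10.GL_NORMAL_ARRAY);
    gl.glDisableClientState(GL10.GL_VERTEX_ARRAY);
}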