Android OpenGL rendering garbage

I've been writing some simple shaders and I'm running into an error that happens randomly: when I start rendering my scene, the mesh is sometimes drawn with extra vertices, and if I kill the activity and then reopen it, it sometimes renders without the extra vertices.
My guess is that the memory on the GPU is not completely wiped when I kill the activity. What's stranger is that these extra polygons are sometimes rendered using my shader logic and other times as if they were filled with random squares.
I'm going crazy over this. I've reviewed all the code, from where I read the OBJ file to where I set the vertex attributes. If you have seen this before, please let me know. BTW, I'm using a Motorola Milestone with Android 2.1.
This is the code where I create a simple triangle and set the vertex attributes:
//This is where I create the mesh
mMesh = new Mesh();
mMesh.setVertices(new float[]{-0.5f, 0f,  0.5f,
                               0.5f, 0f, -0.5f,
                              -0.5f, 0f, -0.5f});
ArrayList<VertexAttribute> attributes = new ArrayList<VertexAttribute>();
attributes.add(new VertexAttribute(Usage.Position, 3, ProgramShader.POSITION_ATTRIBUTE));
VertexAttributes vertexAttributes = new VertexAttributes(attributes.toArray(new VertexAttribute[attributes.size()]));
mMesh.setVertexAttributes(vertexAttributes);
...
...
.......
//This is where I send the mesh to opengl
for (VertexAttribute attr : mVertexAttributes.getAttributes().values()) {
    mVertexBuffer.position(attr.offset);
    int handler = shader.getHandler(attr.alias);
    if (handler != -1) {
        try {
            GLES20.glVertexAttribPointer(handler, attr.numComponents, GLES20.GL_FLOAT, false, mVertexAttributes.vertexSize, mVertexBuffer);
            GLES20.glEnableVertexAttribArray(handler);
        } catch (RuntimeException e) {
            Log.d("CG", attr.alias);
            throw e;
        }
    }
}
//(length = 3 for a triangle)
GLES20.glDrawArrays(GLES20.GL_TRIANGLES, 0, length);
Here are some screenshots for you to see the issue:
Screenshots
Also here is a link to a video I took when I run the app on the phone.
Video

So... I found the problem. It was a really dumb thing I was doing:
// On this line I was passing length, where length was the number of floats
// in the triangle's vertex array: 9 (x, y, z for each vertex)
GLES20.glDrawArrays(GLES20.GL_TRIANGLES, 0, length);
// I had to divide that by the number of components per vertex,
// so when a vertex only has position attributes (x, y, z) it is divided by 3;
// with more, e.g. normals, it is divided by 6 (x, y, z, normalX, normalY, normalZ)
GLES20.glDrawArrays(GLES20.GL_TRIANGLES, 0, length/mVertexAttributes.vertexNumComponents);
I hope this helps others.
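In short, the count argument of glDrawArrays is a number of vertices, not a number of floats. A minimal sketch of the distinction (the helper vertexCount is an illustrative name, not from the code above):

// Hypothetical helper: number of vertices in a flat float array, where
// numComponents is floats per vertex (3 for position only, 6 for position + normal, ...)
static int vertexCount(float[] vertices, int numComponents) {
    return vertices.length / numComponents;
}

// The triangle above: 9 floats, 3 components per vertex -> 3 vertices.
float[] triangle = {-0.5f, 0f, 0.5f,   0.5f, 0f, -0.5f,   -0.5f, 0f, -0.5f};
GLES20.glDrawArrays(GLES20.GL_TRIANGLES, 0, vertexCount(triangle, 3)); // count = 3, not 9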

Related

Libgdx: Framebuffer for "Fog of War"-Effect

I am writing an RTS game for Android and I want a "Fog of War" effect on the player's units. Only a "circle" around each unit should show the background map; wherever no player unit is located, the screen should be black. I don't want to use shaders.
I have a first version of it working. I render the map to the default framebuffer, then I have a second framebuffer (similar to lighting techniques) which is completely black. Where the player's units are, I batch-draw a texture that is completely transparent and has a white circle with blurred edges in its middle.
Finally I draw the second (light) FrameBuffer's colorTexture over the first one using Gdx.gl.glBlendFunc(GL20.GL_DST_COLOR, GL20.GL_ZERO);
The visual effect now is that indeed the whole map is black and a circle around my units is visible - but a lot of white color is added.
The reason is pretty clear as I drew the light textures for the units like this:
lightBuffer.begin();
Gdx.gl.glBlendFunc(GL20.GL_SRC_ALPHA, GL20.GL_ONE);
Gdx.gl.glEnable(GL20.GL_BLEND);
Gdx.gl.glClearColor(0.1f, 0.1f, 0.1f, 1f);
Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
batch.setProjectionMatrix(camera.combined);
batch.begin();
batch.setColor(1f, 1f, 1f, 1f);
for (RTSTroopAction i : unitList) {
    batch.draw(lightSprite, i.getX() + (i.getWidth() / 2) - 230, i.getY() + (i.getHeight() / 2) - 230, 460, 460); //, 0, 0, lightSprite.getWidth(), lightSprite.getHeight(), false, true);
}
batch.end();
lightBuffer.end();
However, I don't want the "white stuff" added to the original texture, I just want the original background to shine through. How can I achieve that?
I think the key is the blend functions, but I haven't been able to figure out which values to use yet.
Thanks to Tenfour04 pointing me in the right direction, I was able to find the solution. First of all, the problem is not directly within batch.end(). The problem is that a SpriteBatch maintains its own blendFunc settings. These get applied when flush() is called (end() also calls it).
However, the batch also calls flush() when it draws a TextureRegion bound to a different texture than the one used in the previous draw() call.
So in my original code, whatever blendFunc I had set was always overridden when I called batch.draw(lightBuffer, ...). The solution is to use the SpriteBatch's setBlendFunction() and not Gdx.gl.glBlendFunc().
The total working code finally looks like this:
lightBuffer.begin();
Gdx.gl.glBlendFunc(GL20.GL_SRC_ALPHA, GL20.GL_ONE_MINUS_SRC_ALPHA);
Gdx.gl.glEnable(GL20.GL_BLEND);
// start rendering to the lightBuffer
// set the ambient color values, this is the "global" light of your scene
Gdx.gl.glClearColor(0.1f, 0.1f, 0.1f, 1f);
Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
// start rendering the lights
batch.setProjectionMatrix(camera.combined);
batch.begin();
// set the color of your light (red,green,blue,alpha values)
batch.setColor(1f, 1f, 1f, 1f);
for (RTSTroopAction i : unitList) {
    if (i.getOwnerId() == game.getCallback().getPlayerId()) {
        batch.draw(lightSprite, i.getX() + (i.getWidth() / 2) - 230, i.getY() + (i.getHeight() / 2) - 230, 460, 460); //, 0, 0, lightSprite.getWidth(), lightSprite.getHeight(), false, true);
    }
}
batch.end();
lightBuffer.end();
// now we render the lightBuffer to the default "frame buffer"
// with the right blending !
Gdx.gl.glEnable(GL20.GL_BLEND);
Gdx.gl.glBlendFunc(GL20.GL_ZERO, GL20.GL_SRC_COLOR);
batch.setProjectionMatrix(getStage().getCamera().combined);
batch.enableBlending();
batch.setBlendFunction(GL20.GL_ZERO, GL20.GL_SRC_COLOR);
batch.begin();
Gdx.gl.glEnable(GL20.GL_BLEND);
Gdx.gl.glBlendFunc(GL20.GL_ZERO, GL20.GL_SRC_COLOR);
batch.draw(lightBufferRegion,0, 0, getStage().getWidth(), getStage().getHeight());
batch.end();
batch.setBlendFunction(GL20.GL_SRC_ALPHA, GL20.GL_ONE_MINUS_SRC_ALPHA);
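Why GL_ZERO, GL_SRC_COLOR does the trick: the blend result is src * GL_ZERO + dst * GL_SRC_COLOR, i.e. the map already in the framebuffer gets multiplied by the light buffer, so white circles keep the map and dark areas go black. A tiny plain-Java illustration of that per-channel arithmetic (not libgdx API, just the math):

// Per-channel result of glBlendFunc(GL_ZERO, GL_SRC_COLOR):
// out = src * 0 + dst * src = dst * src
static float blend(float src, float dst) {
    return src * 0f + dst * src;
}

// src = light-buffer value, dst = map color already on screen
float mapChannel = 0.8f;
float insideCircle  = blend(1.0f, mapChannel); // 0.8 -> map shows through
float blurredEdge   = blend(0.5f, mapChannel); // 0.4 -> darkened
float outsideCircle = blend(0.0f, mapChannel); // 0.0 -> black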

Android glGetFloatv called unimplemented "OpenGL ES API" [duplicate]

I'm learning OpenGL ES and would like a more intuitive interface for 3D objects than the one suggested by Google in the TouchRotateActivity sample.
In order to do that, I would like to multiply my ModelView matrix by the ModelView matrix from the previous state.
But I ran into the following problem: glGetFloatv returns 0 values in my float array, and I don't understand why (my ModelView matrix is not empty: if it were, I wouldn't get my cube on the screen).
Could someone help me figure out what the problem is? Here are the changes in the code.
private float[] previous;

public CubeRenderer() {
    mCube = new Cube();
    previous = new float[16];
}

public void onDrawFrame(GL10 gl) {
    GL11 gl11 = (GL11) gl;
    gl11.glClear(GL11.GL_COLOR_BUFFER_BIT | GL11.GL_DEPTH_BUFFER_BIT);
    gl11.glMatrixMode(GL11.GL_MODELVIEW);
    gl11.glLoadIdentity();
    gl11.glTranslatef(0, 0, -3.0f);
    gl11.glRotatef(mAngleX, 0, 1, 0);
    gl11.glRotatef(mAngleY, 1, 0, 0);
    gl11.glEnableClientState(GL11.GL_VERTEX_ARRAY);
    gl11.glEnableClientState(GL11.GL_COLOR_ARRAY);
    /*if (!previous.equals(new float[16]))
        gl11.glMultMatrixf(previous, 0);*/
    gl11.glGetFloatv(GL11.GL_MODELVIEW_MATRIX, previous, 0);
    Log.d("taille matrice", Integer.toString(previous.length));
    for (int i = 0; i < previous.length; i++)
        Log.d(Integer.toString(i), Float.toString(previous[i]));
    mCube.draw(gl11);
}
Thank you in advance.
Depending on your device you might be using the PixelFlinger software GL renderer, which unfortunately does not implement glGetFloat, at least as of version 1.2. Checking the logcat output should reveal messages to this effect if this is the case.
The solution is to handle the matrices yourself so there's no need to retrieve them from OpenGL in the first place. Like so.
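A minimal sketch of that approach, assuming you keep the model-view matrix in a float[16] on the Java side with android.opengl.Matrix and load it each frame, so the values are always available without glGetFloatv (the field and method mirror the question's renderer but are illustrative):

import android.opengl.Matrix;

private final float[] modelView = new float[16];

public void onDrawFrame(GL10 gl) {
    GL11 gl11 = (GL11) gl;
    gl11.glClear(GL11.GL_COLOR_BUFFER_BIT | GL11.GL_DEPTH_BUFFER_BIT);

    // Build the matrix on the CPU instead of letting GL accumulate it...
    Matrix.setIdentityM(modelView, 0);
    Matrix.translateM(modelView, 0, 0, 0, -3.0f);
    Matrix.rotateM(modelView, 0, mAngleX, 0, 1, 0);
    Matrix.rotateM(modelView, 0, mAngleY, 1, 0, 0);

    // ...then load the finished matrix. The Java copy in modelView is always
    // available, so there is nothing to read back with glGetFloatv.
    gl11.glMatrixMode(GL11.GL_MODELVIEW);
    gl11.glLoadMatrixf(modelView, 0);

    mCube.draw(gl11);
}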
I don't program in Java, so for all I know, your problem could be in the way the memory is being passed to glGetFloatv. In any case, I found this page floating around out there, maybe it will help you.

Adding a Vertex to a libgdx Mesh

The title says it all, basically.
You can reserve more than you need when creating a mesh:
mesh = new Mesh(false, 100, 0, new VertexAttribute(Usage.Position, 3, "a_position"));
But there is no method for adding a vertex. You can get the FloatBuffer and add to that, but I get strange results. I also tried mesh.setVertices with an offset, but that does not work either.
I debugged by drawing points. It works until I try adding a vertex by any means (even if I tweak the offsets to account for 3 floats per vertex).
Copied code segment:
mesh.setVertices(new float[] {
    -0.5f, -0.5f, 0,
     0.5f, -0.5f, 0/*,
    -0.5f,  0.5f, 0.f*/});        // works if I uncomment this
mesh.setVertices(new float[]{-0.5f, 0.5f, 0.f}, 6, 3); // but comment this out
I also tried
squareMesh.setVertices(new float[]{-0.5f,0.5f,0.f}, 2, 3);
Thanks :)
Adding is a bad idea because the underlying buffers must be recreated and that takes up a lot of time. Instead, one should allocate meshes in bulk and only render what is currently used.
For example, mesh.render(GL10.GL_TRIANGLES, 0, num_triangles);
As for updating the buffer:
FloatBuffer fbuftmp = mesh.getVerticesBuffer();
BufferUtils.copy(buf,fbuftmp,fbuftmp.capacity(),0);
where buf is a float array.
Use BufferUtils.copy for the reason explained here.
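A minimal sketch of that pattern with the old Mesh/GL10 API from the question: keep a CPU-side float[] sized for the reserved capacity, copy it into the mesh after every change, and render only the vertices actually in use (verts and usedVertices are illustrative names):

// Room for 100 position-only vertices (3 floats each).
Mesh mesh = new Mesh(false, 100, 0, new VertexAttribute(Usage.Position, 3, "a_position"));

float[] verts = new float[100 * 3]; // CPU-side copy of the vertex data
int usedVertices = 0;

// "Add" a vertex by writing into the array and bumping the counter.
verts[usedVertices * 3]     = -0.5f;
verts[usedVertices * 3 + 1] =  0.5f;
verts[usedVertices * 3 + 2] =  0f;
usedVertices++;

// Push the array into the mesh's buffer and draw only what is used.
FloatBuffer fbuf = mesh.getVerticesBuffer();
BufferUtils.copy(verts, fbuf, fbuf.capacity(), 0);
mesh.render(GL10.GL_POINTS, 0, usedVertices);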

Android OpenGL ES 2.0 -- glReadPixels() and glTexImage2D() drawing a black texture?

I'm working on some Android code for caching and redrawing a framebuffer object's color buffer between the loss and recreation of EGL contexts. Development is primarily happening on a Xoom tablet running Honeycomb. Anyway, what I'm trying to do is store the result of calling glReadPixels() on the FBO in a direct ByteBuffer, then use that buffer with glTexImage2D() and draw it back into the (now cleared) framebuffer. All of this seems to work fine — the ByteBuffer contains the right values ([-1, 0, 0, -1] etc. for a pixel, according to Java's inability to understand unsigned bytes), no GlErrors seem to be thrown, and the quad is drawn to the right part of the screen (currently the top-left quarter of the framebuffer for testing purposes).
However, no matter what I try, glTexImage2D() always outputs a plain black texture. I've had some issues with this before — when displaying Bitmaps, I eventually gave up trying to use the basic GLES20.glTexImage2D() with Buffers and skipped to using GLUtils.glTexImage2D(), which processes the Bitmap for you. Unfortunately, that's less of an option here (I did actually try converting the ByteBuffer to a Bitmap so I could use GLUtils, without much success), so I've really run out of ideas.
Can anyone think of anything that could be causing glTexImage2D() to not correctly process a perfectly good ByteBuffer? Any and all suggestions would be welcome.
ByteBuffer pixelBuffer;

void storePixels() {
    try {
        GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fbuf);
        pixelBuffer = ByteBuffer.allocateDirect(width * height * 4).order(ByteOrder.nativeOrder());
        GLES20.glReadPixels(0, 0, width, height, GL20.GL_RGBA, GL20.GL_UNSIGNED_BYTE, pixelBuffer);
        GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
        gfx.checkGlError("store Pixels");
    } catch (OutOfMemoryError e) {
        pixelBuffer = null;
    }
}

void redrawPixels() {
    GLES20.glBindFramebuffer(GL20.GL_FRAMEBUFFER, fbuf);
    int[] texId = new int[1];
    GLES20.glGenTextures(1, texId, 0);
    int bufferTex = texId[0];
    GLES20.glBindTexture(GL20.GL_TEXTURE_2D, bufferTex);
    GLES20.glTexParameterf(GL20.GL_TEXTURE_2D, GL20.GL_TEXTURE_MAG_FILTER, GL20.GL_LINEAR);
    GLES20.glTexParameterf(GL20.GL_TEXTURE_2D, GL20.GL_TEXTURE_MIN_FILTER, GL20.GL_LINEAR);
    GLES20.glTexParameterf(GL20.GL_TEXTURE_2D, GL20.GL_TEXTURE_WRAP_S, repeatX ? GL20.GL_REPEAT
            : GL20.GL_CLAMP_TO_EDGE);
    GLES20.glTexParameterf(GL20.GL_TEXTURE_2D, GL20.GL_TEXTURE_WRAP_T, repeatY ? GL20.GL_REPEAT
            : GL20.GL_CLAMP_TO_EDGE);
    GLES20.glTexImage2D(GL20.GL_TEXTURE_2D, 0, GL20.GL_RGBA, width, height, 0, GL20.GL_RGBA, GL20.GL_UNSIGNED_BYTE, pixelBuffer);
    gfx.drawTexture(bufferTex, width, height, Transform.IDENTITY, width / 2, height / 2, false, false, 1);
    GLES20.glDeleteTextures(1, IntBuffer.wrap(new int[] {bufferTex}));
    pixelBuffer = null;
    GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
}
gfx.drawTexture() builds a quad and draws it to the currently bound framebuffer, by the way. That code has been well-tested in other parts of my project — it shouldn't be the issue here.
For those of you playing along at home, this code is in fact totally valid. Remember when I swore blind that gfx.drawTexture() "has been well-tested and shouldn't be the issue here"? Yeah, it was totally the issue there. I was buffering vertices to draw without actually flushing them through a glDrawElements() call. Whoops.
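The takeaway: if your draw calls go through a batching layer, the geometry may only be queued, so anything that depends on it having actually been drawn (here, deleting the texture right afterwards) needs an explicit flush first. A hypothetical sketch of that idea (flushQueuedQuads, queuedIndices and queuedIndexCount are illustrative names, not part of the project above):

// Hypothetical batching state inside the gfx layer.
java.nio.ShortBuffer queuedIndices; // indices queued by drawTexture()
int queuedIndexCount;

// Submit whatever has been queued so far.
void flushQueuedQuads() {
    if (queuedIndexCount == 0) return;
    GLES20.glDrawElements(GLES20.GL_TRIANGLES, queuedIndexCount,
            GLES20.GL_UNSIGNED_SHORT, queuedIndices);
    queuedIndexCount = 0;
}

// In redrawPixels(): flush before deleting the texture the queued quad uses.
gfx.drawTexture(bufferTex, width, height, Transform.IDENTITY, width / 2, height / 2, false, false, 1);
flushQueuedQuads();
GLES20.glDeleteTextures(1, IntBuffer.wrap(new int[] {bufferTex}));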

OpenGL Depth Buffer issue on Android

I am developing a 3D rendering engine for Android and I have run into some issues with the depth buffer. I am drawing some cubes: one big one and two small ones that fall on top of the bigger one. While rendering I can see that something with the depth buffer is obviously wrong, as seen in this screenshot:
The screenshot was taken on an HTC Hero (running Android 2.3.4) with OpenGL ES 1.1. The whole application is (still) targeted at OpenGL ES 1.1, and it looks the same on the emulator.
These are the calls in my onSurfaceCreated method in the renderer:
public void onSurfaceCreated(GL10 gl, EGLConfig config) {
    Log.d(TAG, "onsurfacecreated method called");

    int[] depthbits = new int[1];
    gl.glGetIntegerv(GL_DEPTH_BITS, depthbits, 0);
    Log.d(TAG, "Depth Bits: " + depthbits[0]);

    gl.glDisable(GL_DITHER);
    gl.glHint(GL_PERSPECTIVE_CORRECTION_HINT, GL_FASTEST);
    gl.glClearColor(1, 1, 1, 1);
    gl.glClearDepthf(1f);
    gl.glEnable(GL_CULL_FACE);
    gl.glShadeModel(GL_SMOOTH);
    gl.glEnable(GL_DEPTH_TEST);

    gl.glMatrixMode(GL_PROJECTION);
    gl.glLoadMatrixf(
            GLUtil.matrix4fToFloat16(mFrustum.getProjectionMatrix()), 0);

    setLights(gl);
}
The GL call for the depth bits returns 16 on the device and 0 on the emulator. It would have made sense if it only failed on the emulator, since there is obviously no depth buffer present there. (I've tried setting the EGLConfigChooser to true so it would create a config with as close to a 16-bit depth buffer as possible, but that didn't help on the emulator. It wasn't necessary on the device.)
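For reference, a minimal sketch of requesting a depth buffer explicitly on the GLSurfaceView (standard Android API; the view and renderer names are illustrative):

GLSurfaceView view = new GLSurfaceView(context);
// Request RGBA8888 with a 16-bit depth buffer and no stencil.
// The simpler setEGLConfigChooser(true) only asks for "some" depth buffer.
view.setEGLConfigChooser(8, 8, 8, 8, 16, 0);
view.setRenderer(new MyRenderer()); // MyRenderer stands in for the renderer above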
In my onDrawFrame method I make the following OpenGL calls:
gl.glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
gl.glClearDepthf(1);
And then for each of the cubes:
gl.glEnableClientState(GL_VERTEX_ARRAY);
gl.glFrontFace(GL_CW);
gl.glVertexPointer(3, GL_FIXED, 0, mVertexBuffer);
// gl.glColorPointer(4, GL_FIXED, 0, mColorBuffer);
gl.glEnableClientState(GL_NORMAL_ARRAY);
gl.glNormalPointer(GL_FIXED, 0, mNormalBuffer);
// gl.glEnable(GL_TEXTURE_2D);
// gl.glTexCoordPointer(2, GL_FLOAT, 0, mTexCoordsBuffer);
gl.glDrawElements(GL_TRIANGLES, mIndexBuffer.capacity(),
GL_UNSIGNED_SHORT, mIndexBuffer);
gl.glDisableClientState(GL_NORMAL_ARRAY);
gl.glDisableClientState(GL_VERTEX_ARRAY);
What am I missing? If more code is needed just ask.
Thanks for any advice!
I got it to work correctly now. The problem was not OpenGL. It was (as Banthar mentioned) a problem with the projection matrix. I am managing the projection matrix myself and the calculation of the final matrix was somehow corrupted (or at least not what OpenGL expects). I can't remember where I got the algorithm for my calculation, but once I changed it to the way OpenGL calculates the projection matrix (or directly call glFrustumf(...)) it worked fine.
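If you go the direct route, a minimal sketch of setting up the projection with glFrustumf in onSurfaceChanged (standard GL10 API; the near/far values are illustrative):

public void onSurfaceChanged(GL10 gl, int width, int height) {
    gl.glViewport(0, 0, width, height);

    float aspect = (float) width / height;
    gl.glMatrixMode(GL10.GL_PROJECTION);
    gl.glLoadIdentity();
    // left, right, bottom, top, near, far -- near must be > 0 for a perspective frustum
    gl.glFrustumf(-aspect, aspect, -1f, 1f, 1f, 20f);

    gl.glMatrixMode(GL10.GL_MODELVIEW);
    gl.glLoadIdentity();
}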
Try enabling:
glDepthFunc(GL_LEQUAL);
glDepthMask(true);
