I'm currently learning OpenGL ES programming on Android (2.1). I started with the obligatory rotating cube. It rotates fine, but I can't get the depth buffer to work: the polygons are always drawn in the order the GL commands submit them, regardless of depth. This is what I do during GL initialization:
gl.glClearColor(.5f, .5f, .5f, 1);
gl.glShadeModel(GL10.GL_SMOOTH);
gl.glClearDepthf(1f);
gl.glEnable(GL10.GL_DEPTH_TEST);
gl.glDepthFunc(GL10.GL_LEQUAL);
gl.glHint(GL10.GL_PERSPECTIVE_CORRECTION_HINT, GL10.GL_NICEST);
On surface-change I do this:
gl.glViewport(0, 0, width, height);
gl.glMatrixMode(GL10.GL_PROJECTION);
gl.glLoadIdentity();
GLU.gluPerspective(gl, 45.0f, (float) width / (float) height, 0.1f, 100f);
When I enable backface culling, everything looks correct. But backface culling is only a speed optimization, so shouldn't it also render correctly with just the depth buffer? What is missing here?
Found it myself. It wasn't the GL code, it was the Android code:
view.setEGLConfigChooser(false);
The "false" in this line explicitly says that no Z-Buffer should be allocated. After switching it to "true" it worked perfectly.
I was using the GL2JNIView provided in the hello-gl2 sample in NDK r10, and I was having this issue as well. The problem was that when creating the GL2JNIView object I didn't specify the depth size in the constructor, which the private ConfigChooser class needs in order to find an EGL configuration with a depth buffer.
public GL2JNIView(Context context, boolean translucent, int depth, int stencil) { ... }
public GL2JNIView(Context context) { ... }
myView = new GL2JNIView(this, false, 16, 0); // translucent = false, 16-bit depth, no stencil
instead of
myView = new GL2JNIView(this);
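For context, the depth and stencil arguments end up in the EGL attribute list that ConfigChooser hands to eglChooseConfig. A rough sketch of what that list looks like (illustrative, not the sample's exact code):
int[] configSpec = {
    EGL10.EGL_RED_SIZE, 5,
    EGL10.EGL_GREEN_SIZE, 6,
    EGL10.EGL_BLUE_SIZE, 5,
    EGL10.EGL_DEPTH_SIZE, 16,     // this is what the "16" constructor argument asks for
    EGL10.EGL_STENCIL_SIZE, 0,
    EGL10.EGL_RENDERABLE_TYPE, 4, // EGL_OPENGL_ES2_BIT
    EGL10.EGL_NONE
};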
The following code draws two cubes (one red, one green) from Android Studio with JNI, using OpenGL ES. On the virtual device the result is correct; on a real device, however, the result looks strange.
The structure (the 3D model and its 2D projection) is correct, but the colors differ from the AVD and the depth test appears not to work. What is the problem?
Simply put:
AVD: correct result (red and green cubes, depth test working, with a specific camera pose)
Real device: strange result (same camera pose as the AVD, but the colors are different, both cubes are green, and the depth test is not working)
float color1[] = {1.0f, 0.0f, 0.0f}; // red
float color2[] = {0.0f, 1.0f, 0.0f}; // green
int mColorHandle1;
int mColorHandle2;

glViewport(0, 1280, 720, 1280); // glViewport(x, y, width, height)
glEnable(GL_DEPTH_TEST);

// first cube (red)
glUniform4f(mColorHandle1, color1[0], color1[1], color1[2], 1.0f);
glVertexAttribPointer(gvPositionHandle, 3, GL_FLOAT, GL_FALSE, 0, gTriangleVertices1);
glEnableVertexAttribArray(gvPositionHandle);
glDrawArrays(GL_TRIANGLES, 0, 36);

// second cube (green)
glUniform4f(mColorHandle2, color2[0], color2[1], color2[2], 1.0f);
glDrawArrays(GL_TRIANGLES, 36, 36);

glDisableVertexAttribArray(gvPositionHandle);
glDisable(GL_DEPTH_TEST);
I want to do additive blending on the camera preview's SurfaceTexture, which is bound to my OpenGL context.
When I enable blending I get a weirdly rendered texture (about 20 noisy squares). If I call GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT); the preview is correct, but then I lose my additive blending because I have cleared the buffer!
(Screenshot: with the glClear call)
(Screenshot: without the glClear call)
I have no clue what the problem is; I'm a newbie to OpenGL ES. Any suggestions?
I'm only including the relevant code; ask for any other part if necessary.
public void onSurfaceCreated(GL10 unused, EGLConfig config) {
//compile shader, link program.... etc omitted for brevity
//enable additive blending
GLES20.glEnable(GLES20.GL_BLEND);
GLES20.glBlendFunc(GLES20.GL_ONE, GLES20.GL_ONE);
GLES20.glBlendEquation(GLES30.GL_MAX); // note: GL_MAX is an ES 3.0 constant; on an ES 2.0 context it requires the EXT_blend_minmax extension
}
public void onDrawFrame(GL10 unused) {
// Do not want to do this, but if i do preview is normal but i lose my blending
//GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
surfaceTexture.updateTexImage();
GLES20.glUseProgram(_onscreenShader);
int th = GLES20.glGetUniformLocation(_onscreenShader, "sTexture");
GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, _rawVideoTexture);
GLES20.glUniform1i(th, 0);
GLES20.glVertexAttribPointer(_onscreenPositionAttribute, 2, GLES20.GL_FLOAT, false, 4 * 2, pVertex);
GLES20.glVertexAttribPointer(_onscreenUVAttribute, 2, GLES20.GL_FLOAT, false, 4 * 2, pTexCoord);
GLES20.glEnableVertexAttribArray(_onscreenPositionAttribute);
GLES20.glEnableVertexAttribArray(_onscreenUVAttribute);
GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);
}
Most Android devices have a PowerVR, Mali, Adreno or Vivante GPU core, all of which are tile-based (deferred) renderers. This architecture requires a glClear at the start of each frame to tell the OpenGL ES driver when it can discard its internal triangle-binning queues as well as the framebuffer contents. If you skip the glClear, this caching design does not work properly, and you will get weird results that differ from one GPU type to another. So you really must do the glClear.
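If it helps, a sketch of what that means in practice: clear both buffers at the top of every frame before drawing the preview quad.
public void onDrawFrame(GL10 unused) {
    GLES20.glClearColor(0f, 0f, 0f, 1f);
    GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
    // ... draw the camera preview quad as before ...
}
If the effect relies on the previous frame's contents surviving, the usual approach (not covered above) is to accumulate into an offscreen framebuffer-object texture that you manage yourself, and still clear the window surface every frame.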
I use the following code to retrieve the depth buffer:
FloatBuffer pixels = ByteBuffer
.allocateDirect(4).order(ByteOrder.nativeOrder()).asFloatBuffer();
GLES20.glReadPixels(pointx, pointy, 1, 1,
GLES20.GL_DEPTH_COMPONENT16, GLES20.GL_FLOAT, pixels);
The problem is that whichever point I request, pixels always gives me 0.0.
I have enabled the following in onSurfaceCreated:
GLES20.glEnable(GLES20.GL_DEPTH_TEST);
GLES20.glEnable(GLES20.GL_BLEND);
GLES20.glBlendFunc(GLES20.GL_SRC_ALPHA, GLES20.GL_ONE_MINUS_SRC_ALPHA);
GLES20.glDepthFunc(GLES20.GL_LEQUAL);
GLES20.glDepthMask(true);
GLES20.glClearColor(1, 1, 1, 1);
I've been struggling with this issue for days! Please help.
According to the OpenGL ES 2.0 docs, glReadPixels() doesn't support reading the depth buffer. What does glGetError() return?
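As a quick check, here is a sketch of logging the error right after the read (pointx, pointy and pixels are the variables from the question):
GLES20.glReadPixels(pointx, pointy, 1, 1,
        GLES20.GL_DEPTH_COMPONENT16, GLES20.GL_FLOAT, pixels);
int error = GLES20.glGetError();
// On an ES 2.0 context this will most likely be GL_INVALID_ENUM (0x0500),
// because GL_DEPTH_COMPONENT16 is not a valid read format here.
Log.e("DepthRead", "glReadPixels error: 0x" + Integer.toHexString(error));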
When I run my program on an emulator, it displays an ImageView splash screen and then a black screen for the remainder of the application, which uses GLSurfaceViews. The OpenGL code runs fine on my phone. I have tested the program on two computers (one low-performance, one high-performance) and neither displays the GLSurfaceViews. I have also tested the emulator with some of the OpenGL demos from the Google API demos site, and those don't display on either computer either. My program uses OpenGL ES 1.1, but I have also tried OpenGL ES 1.0 to no avail. How can I get OpenGL to display on an emulator? Thanks.
Here's an example of some simple square-rendering code that doesn't work on the emulator:
public void onDrawFrame(GL10 gl) {
//This works
gl.glClearColor(_red, _green, _blue, 1.0f);
gl.glClear(GL10.GL_COLOR_BUFFER_BIT);
//This doesn't
float vertices[] = { .5f, .5f, 0, .5f, -.5f, 0, -.5f, .5f, 0, -.5f, -.5f, 0 };
FloatBuffer vertexSquareBuffer = ByteBuffer.allocateDirect(4 * 3 * 4)
.order(ByteOrder.nativeOrder()).asFloatBuffer();
vertexSquareBuffer.put(vertices);
vertexSquareBuffer.position(0);
gl.glColor4f(1, 1, 0, 0.5f);
gl.glVertexPointer(3, GL10.GL_FLOAT, 0, vertexSquareBuffer);
gl.glDrawArrays(GL10.GL_TRIANGLE_STRIP, 0,4);
}
Well, there are some possibilities. First, let's check whether something that is known to work does run on your emulator. Please try this tutorial and run it through your emulator: http://www.droidnova.com/android-3d-game-tutorial-part-i,312.html (do the first part).
That way we will know whether the problem is with your code or with the emulator.
After that, you may need to check whether all the required shared-object libraries are present.
Let me know in the comments how you go.
I am developing a 3D rendering engine for Android and have run into some issues with the depth buffer. I am drawing some cubes: one big one and two small ones that fall on top of the bigger one. While rendering, it is obvious that something with the depth buffer is wrong, as seen in this screenshot:
The screenshot was taken on an HTC Hero (running Android 2.3.4) with OpenGL ES 1.1. The whole application is (still) targeted at OpenGL ES 1.1, and it looks the same on the emulator.
These are the calls in my onSurfaceCreated method in the renderer:
public void onSurfaceCreated(GL10 gl, EGLConfig config) {
Log.d(TAG, "onsurfacecreated method called");
int[] depthbits = new int[1];
gl.glGetIntegerv(GL_DEPTH_BITS, depthbits, 0);
Log.d(TAG, "Depth Bits: " + depthbits[0]);
gl.glDisable(GL_DITHER);
gl.glHint(GL_PERSPECTIVE_CORRECTION_HINT, GL_FASTEST);
gl.glClearColor(1, 1, 1, 1);
gl.glClearDepthf(1f);
gl.glEnable(GL_CULL_FACE);
gl.glShadeModel(GL_SMOOTH);
gl.glEnable(GL_DEPTH_TEST);
gl.glMatrixMode(GL_PROJECTION);
gl.glLoadMatrixf(
GLUtil.matrix4fToFloat16(mFrustum.getProjectionMatrix()), 0);
setLights(gl);
}
The GL call for the depth bits returns 16 on the device and 0 on the emulator. It would have made sense if it had only failed on the emulator, since there is obviously no depth buffer present there. (I tried setting the EGLConfigChooser to true so that it would create a config with as close to a 16-bit depth buffer as possible, but that didn't help on the emulator; it wasn't necessary on the device.)
In my onDrawFrame method I make the following OpenGL Calls:
gl.glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
gl.glClearDepthf(1);
And then for each of the cubes:
gl.glEnableClientState(GL_VERTEX_ARRAY);
gl.glFrontFace(GL_CW);
gl.glVertexPointer(3, GL_FIXED, 0, mVertexBuffer);
// gl.glColorPointer(4, GL_FIXED, 0, mColorBuffer);
gl.glEnableClientState(GL_NORMAL_ARRAY);
gl.glNormalPointer(GL_FIXED, 0, mNormalBuffer);
// gl.glEnable(GL_TEXTURE_2D);
// gl.glTexCoordPointer(2, GL_FLOAT, 0, mTexCoordsBuffer);
gl.glDrawElements(GL_TRIANGLES, mIndexBuffer.capacity(),
GL_UNSIGNED_SHORT, mIndexBuffer);
gl.glDisableClientState(GL_NORMAL_ARRAY);
gl.glDisableClientState(GL_VERTEX_ARRAY);
What am I missing? If more code is needed just ask.
Thanks for any advice!
I got it to work correctly now. The problem was not OpenGL; it was (as Banthar mentioned) a problem with the projection matrix. I manage the projection matrix myself, and the calculation of the final matrix was somehow wrong (or at least not what OpenGL expects). I can't remember where I got the algorithm for my calculation, but once I changed it to the way OpenGL computes the projection matrix (or called glFrustumf(...) directly), it worked fine.
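For illustration, a sketch (mine, not the answer's exact code) of letting OpenGL set up the projection via glFrustumf instead of a hand-rolled matrix; the field of view and clip planes are just example values:
private void setProjection(GL10 gl, int width, int height) {
    float near = 0.1f, far = 100f, fovDegrees = 45f;
    float top = near * (float) Math.tan(Math.toRadians(fovDegrees / 2.0));
    float right = top * ((float) width / height);
    gl.glMatrixMode(GL10.GL_PROJECTION);
    gl.glLoadIdentity();
    gl.glFrustumf(-right, right, -top, top, near, far);
    gl.glMatrixMode(GL10.GL_MODELVIEW);
}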
Try setting these as well:
gl.glDepthFunc(GL_LEQUAL);
gl.glDepthMask(true);