I am having problems with my simple 2D OpenGL game.
It's really weird: textures display correctly and so on, but when I tried to create a particle effect with simple changing colors, the wrong colors are read from the buffer for some reason. I am using Android's OpenGL ES 1.1, but it's the same with any version of OpenGL that uses VBOs.
I initialize the screen and a red triangle is displayed, but according to my color buffer it should be white. What's wrong?
GL11 gl11 = (GL11) gl;
gl11.glLoadIdentity();
gl11.glClear(GL_COLOR_BUFFER_BIT);
GLU.gluLookAt(gl, 0f, 0f, -container.getCamera().getScale(), 0f, 0f, 0f, 0f, -1f, 0f);
gl11.glEnableClientState(GL10.GL_VERTEX_ARRAY);
gl11.glEnableClientState(GL10.GL_COLOR_ARRAY);
gl11.glTranslatef(container.getCamera().getX(), container.getCamera().getY(), 0.0f);
container.addParticle(new ColouredParticle(-container.getCamera().getX(), -container.getCamera().getY(), (float)Math.random(), (float)Math.random(), 0f, 5000));
particleColorBufferPointer = createFloatBuffer(gl11, GL11.GL_ARRAY_BUFFER, new float[] {
1f, 1f, 1f, 1f,
1f, 1f, 1f, 1f,
1f, 1f, 1f, 1f,
});
gl11.glBindBuffer(GL11.GL_ARRAY_BUFFER, particleColorBufferPointer);
gl11.glColorPointer(4, GL10.GL_FLOAT, 0, 0);
particleVertexBufferPointer = createFloatBuffer(gl11, GL11.GL_ARRAY_BUFFER, new float[]{
-0.1f, -0.05f,
0.1f, -0.05f,
0.0f, 0.05f
});
gl11.glBindBuffer(GL11.GL_ARRAY_BUFFER, particleVertexBufferPointer);
gl11.glVertexPointer(2, GL10.GL_FLOAT, 0, 0);
gl11.glDrawArrays(GL11.GL_TRIANGLES, 0, 3);
gl11.glDeleteBuffers(2, new int[]{particleVertexBufferPointer, particleColorBufferPointer}, 0);
gl11.glDisableClientState(GL10.GL_VERTEX_ARRAY);
gl11.glDisableClientState(GL10.GL_COLOR_ARRAY);
int error = gl11.glGetError();
if (error != GL11.GL_NO_ERROR) {
    Log.v(TAG, "error " + Integer.toHexString(error));
}
It's just a simple hardcoded float array containing the triangle vertices and colors, but as I said, the colors come out wrong. What can possibly go wrong with such a small amount of code? The color doesn't change to anything but black, as if it only reads the red value from the buffer. Also, it gives absolutely no error at all!
createFloatBuffer method:
private int createFloatBuffer(GL11 gl, int type, float[] data) {
    int[] bufferPointerBuffer = new int[1];
    gl.glGenBuffers(1, bufferPointerBuffer, 0);
    int bufferPointer = bufferPointerBuffer[0];
    gl.glBindBuffer(type, bufferPointer);
    FloatBuffer dataBuffer = createFloatBuffer(data);
    gl.glBufferData(type, data.length * FLOAT_SIZE, dataBuffer, GL_STATIC_DRAW);
    gl.glBindBuffer(type, 0); // unbind with 0 (binding -1 just binds another buffer name)
    return bufferPointer;
}
Wow... it was all because I had a texture bound, and for some reason it tried to draw with it even though I had texture coords disabled. I wish I had just listened when I was told to disable any state that's no longer needed. Thanks guys! Problem solved!
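For reference, a minimal sketch of that kind of state cleanup before drawing the untextured particles (assuming a texture and texcoord array were enabled earlier for the sprites; the exact calls depend on what state your sprite code sets up):
gl11.glDisable(GL10.GL_TEXTURE_2D);                     // particles are untextured
gl11.glDisableClientState(GL10.GL_TEXTURE_COORD_ARRAY); // no texcoord pointer either
gl11.glBindBuffer(GL11.GL_ARRAY_BUFFER, 0);             // drop any leftover VBO binding
// ... then set up the color/vertex pointers and call glDrawArrays as above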
Related
I have a problem with the setLookAtM function. My goal is to create a cube within a cube, something like this (yep, it's Paint :P):
So basically everything works... almost... I have the smaller cube and I have the bigger one.
However, there is a problem. I created the bigger one with coords from -1 to 1, and now I want to upscale it. With a scale of 1.0f I have something like this (the inner cube is rotating):
And that's good, but now... when I try to scale the bigger cube (so that it looks like in the Paint drawing), the image goes black or white (I guess it's because the "camera" looks at the white cube, but I still don't know why my inner cube disappears :/). I don't understand what I'm doing wrong. Here is my code:
public void onDrawFrame(GL10 unused) {
    float[] scratch = new float[16];
    GLES20.glClear(GLES20.GL_DEPTH_BUFFER_BIT | GLES20.GL_COLOR_BUFFER_BIT);
    GLES20.glEnable(GLES20.GL_DEPTH_TEST);
    Matrix.setLookAtM(mViewMatrix, 0, 0, 0, -5.0f, 0f, 0f, -1.0f, 0f, 1.0f, 0.0f);
    Matrix.multiplyMM(mMVPMatrix, 0, mProjectionMatrix, 0, mViewMatrix, 0);
    mRoom.mScale = 1.0f;
    Matrix.setIdentityM(mScaleMatrix, 0);
    Matrix.scaleM(mScaleMatrix, 0, mRoom.mScale, mRoom.mScale, mRoom.mScale);
    float[] scaleTempMatrix = new float[16];
    Matrix.multiplyMM(scaleTempMatrix, 0, mMVPMatrix, 0, mScaleMatrix, 0);
    mRoom.draw(scaleTempMatrix);
When I set for example:
mRoom.mScale = 3.0f;
And
Matrix.setLookAtM(mViewMatrix, 0, 0, 0, -2.0f, 0f, 0f, 0.0f, 1.0f, 1.0f, 0.0f);
My camera should be at (0, 0, -2) looking at (0, 0, -1), and it should be inside the white cube (since the scale is 3.0, the coords should be from -3 to 3, right?). But all I get is a white screen, without the smaller cube rotating inside :/
If your scale is 3x in this code, then your visible coordinate range is actually going to be [-1/3,1/3].
You are thinking about things backwards; it might help to consider the order in which the scale operation is applied. Right now you are scaling the object-space coordinates, then applying the view matrix, and then the projection. It may not look that way, but that is how matrix multiplication in GL works: GL effectively flips the operands when it multiplies matrices, and matrix multiplication is not commutative.
I believe this is what you actually want:
public void onDrawFrame(GL10 unused) {
    float[] scratch = new float[16];
    GLES20.glClear(GLES20.GL_DEPTH_BUFFER_BIT | GLES20.GL_COLOR_BUFFER_BIT);
    GLES20.glEnable(GLES20.GL_DEPTH_TEST);
    Matrix.setLookAtM(mViewMatrix, 0, 0, 0, -5.0f, 0f, 0f, -1.0f, 0f, 1.0f, 0.0f);
    mRoom.mScale = 3.0f;
    Matrix.setIdentityM(mScaleMatrix, 0);
    Matrix.scaleM(mScaleMatrix, 0, mRoom.mScale, mRoom.mScale, mRoom.mScale);
    Matrix.multiplyMM(mMVPMatrix, 0, mScaleMatrix, 0, mProjectionMatrix, 0);
    Matrix.multiplyMM(mMVPMatrix, 0, mMVPMatrix, 0, mViewMatrix, 0);
    mRoom.draw(mMVPMatrix);
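One caveat on the snippet above: as far as I know, Matrix.multiplyMM is not guaranteed to work in place (the result values are undefined if the result array overlaps an input), so it may be safer to route the second multiply through the scratch array that is already declared:
Matrix.multiplyMM(scratch, 0, mScaleMatrix, 0, mProjectionMatrix, 0); // scratch = S * P
Matrix.multiplyMM(mMVPMatrix, 0, scratch, 0, mViewMatrix, 0);         // mMVPMatrix = S * P * V
mRoom.draw(mMVPMatrix);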
In my GLSurfaceView.Renderer, I'm drawing my scene in two parts. The first group is offset and rotated, while the second is aligned to the "camera", so I apply a glRotate and glTranslate, then apply the exact opposite glTranslate and glRotate. On some devices, this works fine, but on others, the entire scene slowly rotates off "center".
Images: At start and After ~5 mins
Here's the onDrawFrame() function:
@Override
public void onDrawFrame(GL10 gl) {
    gl.glClear(GL10.GL_COLOR_BUFFER_BIT | GL10.GL_DEPTH_BUFFER_BIT);
    gl.glEnableClientState(GL10.GL_VERTEX_ARRAY);
    // rotate card table
    gl.glRotatef(-20f, 1f, 0f, 0f);
    gl.glRotatef(  5f, 0f, 1f, 0f);
    gl.glRotatef(-10f, 0f, 0f, 1f);
    gl.glTranslatef( 1f, 3f, 0f);
    mDeck.draw(gl);
    // undo rotate for buttons/overlays
    gl.glTranslatef(-1f, -3f, 0f);
    gl.glRotatef( 10f, 0f, 0f, 1f);
    gl.glRotatef( -5f, 0f, 1f, 0f);
    gl.glRotatef( 20f, 1f, 0f, 0f);
    mOverlayBtns.draw(gl);
    mPass.draw(gl);
}
The full source can be found on GitHub, here.
Is there something obvious I'm missing? Is there a better way to handle rotating, then rotating back?
So far, I've noticed this problem on:
Nexus 4
HTC Sensation
Another LG phone whose name I can't remember
These devices do not show the problem:
Galaxy Nexus
Nexus 7
Xoom
Your code was applying a set of affine transformations each frame. The problem is that these transformations are cumulative, so you were applying them over and over. You either need to reset the currently loaded MODELVIEW matrix using glLoadIdentity (http://www.khronos.org/opengles/sdk/1.1/docs/man/glLoadIdentity.xml), or push the current matrix, load your new matrix, draw, and then pop the matrix back (as you do in your fix).
I would put a glLoadIdentity at the beginning of each frame. With your fix, I would also do this:
gl.glPushMatrix();
gl.glLoadIdentity(); // Be sure we start clean
// rotate card table
gl.glRotatef(-20f, 1f, 0f, 0f);
gl.glRotatef( 5f, 0f, 1f, 0f);
gl.glRotatef(-10f, 0f, 0f, 1f);
gl.glTranslatef( 1f, 3f, 0f);
mDeck.draw(gl);
gl.glPopMatrix();
Hope that helps.
I've found a fix for the issue, but I'm still hoping someone will be able to explain the problem I'm seeing above.
Instead of rotating the scene and then rotating back, I can use glPushMatrix and glPopMatrix to isolate the two groups. That way, the glTranslate and glRotate calls that move the scene back are unnecessary.
@Override
public void onDrawFrame(GL10 gl) {
    gl.glClear(GL10.GL_COLOR_BUFFER_BIT | GL10.GL_DEPTH_BUFFER_BIT);
    gl.glEnableClientState(GL10.GL_VERTEX_ARRAY);
    gl.glPushMatrix();
    // rotate card table
    gl.glRotatef(-20f, 1f, 0f, 0f);
    gl.glRotatef(  5f, 0f, 1f, 0f);
    gl.glRotatef(-10f, 0f, 0f, 1f);
    gl.glTranslatef( 1f, 3f, 0f);
    mDeck.draw(gl);
    gl.glPopMatrix();
    gl.glPushMatrix();
    // undo rotate for buttons/overlays
    // gl.glTranslatef(-1f, -3f, 0f);
    // gl.glRotatef( 10f, 0f, 0f, 1f);
    // gl.glRotatef( -5f, 0f, 1f, 0f);
    // gl.glRotatef( 20f, 1f, 0f, 0f);
    mOverlayBtns.draw(gl);
    mPass.draw(gl);
    gl.glPopMatrix();
}
EDIT 2: Take a look at the Triangle2d sample of this GitHub project for a complete, working example.
EDIT: See the accepted answer for a link with a good explanation of how the orthographic matrix works. In the end, I tweaked the provided code a little:
float[] mvp = {
2f/width, 0f, 0f, 0f,
0f, -2f/height, 0f, 0f,
0f, 0f, 0f, 0f,
-1f, 1f, 0f, 1f
};
Please note that my z is fixed at 0 and w is fixed at 1. This matrix puts the origin (0,0) at the bottom-left of the screen; if you want the origin at the top-left, try:
float[] mvp = {
2f/width, 0f, 0f, 0f,
0f, 2f/height, 0f, 0f,
0f, 0f, 0f, 0f,
-1f, -1f, 0f, 1f
};
Another problem was the call to GLES20.glUniformMatrix4fv, which I changed to:
FloatBuffer b = ByteBuffer.allocateDirect(mvp.length * 4).order(ByteOrder.nativeOrder()).asFloatBuffer();
b.put(mvp).position(0);
GLES20.glUniformMatrix4fv(uMvpPos, b.limit() / mvp.length, false, b);
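(For clarity, the second argument is the number of matrices being uploaded, not the number of floats, so b.limit() / mvp.length works out to 1 here. The float[] overload would do the same job in one line, assuming uMvpPos is the uniform location from earlier:)
GLES20.glUniformMatrix4fv(uMvpPos, 1, false, mvp, 0);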
If you want to mess with this a bit, try this online calculator. Just remember that the rows in your source file will be columns in the calculator.
Original Problem:
I'm trying to draw a 2D triangle using OpenGL ES 2.0 on Android, but so far, not much success. These are my shaders:
(Vertex shader)
uniform mat4 uMvp;
attribute vec3 aPosition;
attribute vec3 aColor;
varying vec4 vColor;
void main() {
vColor = vec4(aColor.xyz, 1.0);
vec4 position = vec4(aPosition.xyz, 1.0);
gl_Position = uMvp * position;
};
(Fragment shader)
precision mediump float;
varying vec4 vColor;
void main(void)
{
gl_FragColor = vColor;
};
Then, in the onSurfaceChanged method of GLSurfaceView.Renderer, I put the following:
public void onSurfaceChanged(GL10 gl, int width, int height)
{
    // here I load and compile the shaders and link the program.
    // in the end, I use GLES20.glUseProgram(programHandle);
    // (code omitted)
    // create the matrix for the uniform
    int uMvpPos = GLES20.glGetUniformLocation(programHandle, "uMvp");
    float[] mvp = {width, 0f, 0f, 0f,
                   0f, -height, 0f, 0f,
                   0f, 0f, -2f, 0f,
                   -1f, 1, -1f, 1};
    GLES20.glUniformMatrix4fv(uMvpPos, mvp.length * 4, false, mvp, 0);
    // set viewport and clear color to white
    GLES20.glViewport(0, 0, width, height);
    GLES20.glClearColor(1f, 1f, 1f, 1.0f);
}
I used the values of the matrix shown in this question. My intent here is to work with coordinates the same way a canvas works: (0,0) at the top-left of the screen and (width, height) at the bottom-right.
And last but not least, this is the code of onDrawFrame:
public void onDrawFrame(GL10 gl)
{
    int aPos = GLES20.glGetAttribLocation(programHandle, "aPosition");
    int aColor = GLES20.glGetAttribLocation(programHandle, "aColor");
    // assuming I correctly set up my coordinate system,
    // these are the triangle coordinates and color
    float[] data =
    {
        // XYZ, RGB
        100f, 100f, 0f,
        1f, 0f, 0f,
        50f, 50f, 0f,
        1f, 0f, 0f,
        150f, 50f, 0f,
        1f, 0f, 0f,
    };
    // put all my data into a float buffer
    // the '* 4' is because a float has 4 bytes
    FloatBuffer dataVertex = ByteBuffer.allocateDirect(data.length * 4).order(ByteOrder.nativeOrder()).asFloatBuffer();
    dataVertex.put(data).position(0);
    // set the POSITION values
    // attribute, dataSize(number of elements), data type, normalized?, stride(bytes), data
    GLES20.glEnableVertexAttribArray(aPos);
    GLES20.glVertexAttribPointer(aPos, 3, GLES20.GL_FLOAT, false, 6 * 4, dataVertex);
    // set the COLOR values
    dataVertex.position(3); // offset
    GLES20.glEnableVertexAttribArray(aColor);
    GLES20.glVertexAttribPointer(aColor, 3, GLES20.GL_FLOAT, false, 6 * 4, dataVertex);
    // put a white background
    GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
    // and finally draw the triangle!
    GLES20.glDrawArrays(GLES20.GL_TRIANGLES, 0, 3);
}
The end result is... a boring white screen, without the triangle. I guess I'm committing a very simple mistake, but I just can't spot it. Any thoughts?
Your orthographic projection matrix is wrong.
Orthographic projection is defined as (in column major order):
2/(r-l), 0, 0, 0,
0, 2/(t-b), 0, 0,
0, 0, 2/(f-n), 0,
(r+l)/(l-r), (t+b)/(b-t), (f+n)/(n-f), 1
In your case r=0, l=width, t=height, b=0, f=1, n=0 and you get:
-2/width, 0, 0, 0,
0, 2/height, 0, 0,
0, 0, 2, 0,
1, -1, -1, 1
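For comparison, Android's Matrix helper can also build an orthographic matrix for you. A minimal sketch for the canvas-style mapping the question asks for (origin at the top-left), assuming it runs where the question builds its matrix, so width, height, and uMvpPos are in scope:
float[] mvp = new float[16];
// left = 0, right = width, bottom = height, top = 0 gives (0,0) at the top-left with y pointing down
Matrix.orthoM(mvp, 0, 0f, width, height, 0f, -1f, 1f);
GLES20.glUniformMatrix4fv(uMvpPos, 1, false, mvp, 0);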
I want to display arbitrary shapes using OpenGL ES on an Android device. The problem is that my code doesn't even work for easy shapes like a rectangle (which I am going to use below).
I think something is wrong with the glTranslatef. I've adjusted all the values, but I can't figure out what it is.
The Rectangle is defined by the points P(0,0,0), P(0,1,0), P(1,1,0), P(1,0,0). In the Activity I implemented the GLSurfaceView.Renderer like this:
private static FloatBuffer getVertexCoords() {
    float coords[] = {
        0f, 0f, 0f, // first triangle first point
        0f, 1f, 0f, // first triangle second point
        1f, 1f, 0f, // first triangle third point
        1f, 1f, 0f, // second triangle first point
        1f, 0f, 0f, // second triangle second point
        0f, 0f, 0f, // second triangle third point
    };
    ByteBuffer vbb = ByteBuffer.allocateDirect(coords.length * 4); // n coords * 4 bytes per float
    vbb.order(ByteOrder.nativeOrder());
    FloatBuffer trianglesVB = vbb.asFloatBuffer();
    trianglesVB.put(coords);
    trianglesVB.position(0);
    return trianglesVB;
}
@Override
public void onDrawFrame(GL10 gl) {
    gl.glClear(GL10.GL_COLOR_BUFFER_BIT | GL10.GL_DEPTH_BUFFER_BIT);
    gl.glMatrixMode(GL10.GL_MODELVIEW);
    gl.glLoadIdentity();
    gl.glTranslatef(0f, 0f, -4f);
    gl.glEnableClientState(GL10.GL_VERTEX_ARRAY); // glBegin
    gl.glColor4f(1.0f, 0.0f, 0.0f, 1.0f);
    gl.glVertexPointer(3, GL10.GL_FLOAT, 0, getVertexCoords());
    gl.glDrawArrays(GL10.GL_TRIANGLES, 0, 2 * 3 * 3); // triangles * points * coords
    gl.glDisableClientState(GL10.GL_VERTEX_ARRAY); // glEnd
    int error = gl.glGetError();
    if (error != GL10.GL_NO_ERROR) {
        Log.e(TAG, "OpenGL ES Error: " + error);
    }
}
@Override
public void onSurfaceChanged(GL10 gl, int width, int height) {
    // think this one doesn't matter
}
@Override
public void onSurfaceCreated(GL10 gl, EGLConfig config) {
    gl.glClearColor(1.0f, 1.0f, 1.0f, 1.0f); // white background
    gl.glFrontFace(GL10.GL_CW); // front face is clockwise
}
I think you need a projection matrix in there somewhere. If you don't set one, then you are drawing directly in normalized device coordinates, where the only valid z values are in the range [-1, 1].
Put simply, your triangle is outside the displayed depth range.
Try adding a simple projection matrix to onSurfaceCreated:
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(-10, 10, -10, 10, 0, 10);
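In the GL10 Java bindings that might look roughly like this (a sketch; I've put it in onSurfaceChanged so glViewport can be set at the same time, and the 0..10 depth range covers the glTranslatef to z = -4):
@Override
public void onSurfaceChanged(GL10 gl, int width, int height) {
    gl.glViewport(0, 0, width, height);
    gl.glMatrixMode(GL10.GL_PROJECTION);
    gl.glLoadIdentity();
    gl.glOrthof(-10f, 10f, -10f, 10f, 0f, 10f); // near/far bracket the triangle at z = -4
    gl.glMatrixMode(GL10.GL_MODELVIEW);
}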
I'm trying to learn OpenGL ES 2.0 for Android, but I am finding it hard to locate good introductory tutorials... I found This, but it only explains glDrawArrays and not glDrawElements... I'm trying to convert code from my ES 1.1 model-loading class, but I feel that drawing plain arrays might be too slow...
So what I'm asking is how would I convert the following to work in ES 2.0?
How an object is stored (seems to be fine):
private ShortBuffer SindexBuffer;
private FloatBuffer SvertexBuffer;
private FloatBuffer StexBuffer;
private void initSprite()
{
    float[] fcol = {1, 1, 1, 1};
    float[] coords = {
        0.5f, 0.5f, 0f,
        -0.5f, 0.5f, 0f,
        0.5f, -0.5f, 0f,
        -0.5f, -0.5f, 0f
    };
    short[] index = {
        0, 1, 2,
        1, 3, 2
    };
    float[] texCoords = {
        0, 1,
        1, 1,
        0, 0,
        1, 0
    };
    // float has 4 bytes
    ByteBuffer vbb = ByteBuffer.allocateDirect(coords.length * 4);
    vbb.order(ByteOrder.nativeOrder());
    SvertexBuffer = vbb.asFloatBuffer();
    ByteBuffer tC = ByteBuffer.allocateDirect(texCoords.length * 4);
    tC.order(ByteOrder.nativeOrder());
    StexBuffer = tC.asFloatBuffer();
    StexBuffer.put(texCoords);
    StexBuffer.position(0);
    // short has 2 bytes
    ByteBuffer ibb = ByteBuffer.allocateDirect(index.length * 2);
    ibb.order(ByteOrder.nativeOrder());
    SindexBuffer = ibb.asShortBuffer();
    SvertexBuffer.put(coords);
    SindexBuffer.put(index);
    SvertexBuffer.position(0);
    SindexBuffer.position(0);
    ByteBuffer fbb = ByteBuffer.allocateDirect(fcol.length * 4);
    fbb.order(ByteOrder.nativeOrder());
}
And how it is drawn (this is where I need help):
public void drawSprite(GL10 gl, int tex)
{
    gl.glEnableClientState(GL10.GL_VERTEX_ARRAY);
    gl.glEnableClientState(GL10.GL_TEXTURE_COORD_ARRAY);
    gl.glBindTexture(GL10.GL_TEXTURE_2D, tex);
    // defines the vertices we want to draw
    gl.glVertexPointer(3, GL10.GL_FLOAT, 0, SvertexBuffer);
    gl.glTexCoordPointer(2, GL10.GL_FLOAT, 0, StexBuffer);
    // draw the vertices
    gl.glDrawElements(GL10.GL_TRIANGLES, 6, GL10.GL_UNSIGNED_SHORT, SindexBuffer);
    gl.glDisableClientState(GL10.GL_VERTEX_ARRAY);
    gl.glDisableClientState(GL10.GL_TEXTURE_COORD_ARRAY);
}
(though the texture part isn't that important yet... not up to that yet).
Thanks to anyone who can help... oh, and does anyone know any simple introductions to ES 2.0 for Android aimed at people who used 1.1 first?
Translate it like this:
float[] coords = {
    0.5f, 0.5f, 0f,   // index 0
    -0.5f, 0.5f, 0f,  // index 1
    0.5f, -0.5f, 0f,  // index 2
    -0.5f, -0.5f, 0f  // index 3
};
And you are telling OpenGL to plot using this sequence of indices:
0, 1, 2, 1, 3, 2.
This saves memory because the indices are shorts, while each vertex needs 3 floats.
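As for the actual ES 2.0 call, here is a rough sketch of the indexed draw (assuming a linked program handle named programHandle with an aPosition attribute, and the same SvertexBuffer/SindexBuffer as above):
int aPos = GLES20.glGetAttribLocation(programHandle, "aPosition");
GLES20.glEnableVertexAttribArray(aPos);
SvertexBuffer.position(0);
GLES20.glVertexAttribPointer(aPos, 3, GLES20.GL_FLOAT, false, 0, SvertexBuffer);
SindexBuffer.position(0);
// 6 indices, read as unsigned shorts, straight from the client-side index buffer
GLES20.glDrawElements(GLES20.GL_TRIANGLES, 6, GLES20.GL_UNSIGNED_SHORT, SindexBuffer);
GLES20.glDisableVertexAttribArray(aPos);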
Regards