Doesn't rotate after applying transformation matrix [duplicate] - android

This question already has answers here (closed 10 years ago).
Possible duplicate: Is Google’s Android OpenGL tutorial teaching incorrect linear algebra?
Learning OpenGL ES 2.0 on Android. Using Emulator, running Android 4.1.
Copied and pasted snippets from the Android Developer site / OpenGL.
Updated the onDrawFrame method, pasted below.
Added Matrix.setIdentityM(mRotationMatrix, 0) since it was a null matrix.
Changed mAngle to angle (line 16).
public void onDrawFrame(GL10 unused) {
// Redraw background color
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
// Set the camera position (View matrix)
Matrix.setLookAtM(mVMatrix, 0, 0, 0, -3, 0f, 0f, 0f, 0f, 1.0f, 0.0f);
// Calculate the projection and view transformation
Matrix.multiplyMM(mMVPMatrix, 0, mProjMatrix, 0, mVMatrix, 0);
// Create a rotation transformation for the triangle
long time = SystemClock.uptimeMillis() % 4000L;
float angle = 0.090f * ((int) time);
Matrix.setIdentityM(mRotationMatrix, 0); //added
Matrix.setRotateM(mRotationMatrix, 0, angle, 0, 0, 1.0f); //changed
// Combine the rotation matrix with the projection and camera view
Matrix.multiplyMM(mMVPMatrix, 0, mRotationMatrix, 0, mMVPMatrix, 0);
// Draw shape
mTriangle.draw(mMVPMatrix);
}
I also commented out setRenderMode(RENDERMODE_WHEN_DIRTY);
Yet the triangle drawn did not rotate. Where did I go wrong?

Thanks to this question here, I learned how to solve this.
Just edit the vertex shader code: uMVPMatrix is important; without it the projection is not applied.
private final String vertexShaderCode =
"attribute vec4 vPosition;" +
"uniform mat4 uMVPMatrix;" +
"void main() {" +
" gl_Position = uMVPMatrix * vPosition;" +
"}";

Related

OpenGL ES 2 orthographic projection and rotation

My rotation works normally this way:
public void onDrawFrame(GL10 unused) {
float[] scratch = new float[16];
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
Matrix.setLookAtM(mViewMatrix, 0, 0f ,0f, -3f, 0f, 0f, 0f, 0f, 1.0f, 0.0f);
Matrix.multiplyMM(mMVPMatrix, 0, mProjectionMatrix, 0, mViewMatrix, 0);
Matrix.setRotateM(mRotationMatrix, 0, i, 0, 0, 1.0f);
Matrix.multiplyMM(scratch, 0, mMVPMatrix, 0, mRotationMatrix, 0);
mTriangle.draw(scratch);
}
with this vertex shader:
private final String vertexShaderCode =
"uniform mat4 uMVPMatrix;" +
"attribute vec4 vPosition;" +
"void main() {" +
" gl_Position = uMVPMatrix * vPosition;" +
"}";
But when I want to change my (0, 0) coordinate to be at the top left, like this:
private final String vertexShaderCode =
"uniform mat4 uMVPMatrix;" +
"attribute vec4 vPosition;" +
"void main() {" +
" gl_Position = vec4 ( vPosition.x * 2.0 / 1280.0 - 1.0," +
"vPosition.y* - 2.0 / 800.0 + 1.0," +
"vPosition.z, " +
"1.0);" +
"}";*
With these vertices:
static float triangleCoords[] = {
300.0f, 0.0f, 0.0f,
0.0f, 500.0f, 0.0f,
1280.0f, 500.0f, 0.0f
};
Picture: http://i.stack.imgur.com/aRQa9.jpg
My triangle appears the way I want, but I can't see my rotation anymore. To solve this I tried a lot of things, such as multiplying this by uMVPMatrix; then I can see the rotation, but I always get coordinate problems: the triangle appears in the middle of the screen, and my triangle's vertices don't fit the size they should:
Picture: http://i.stack.imgur.com/tH02H.png
My Android screen resolution is 1280*800. I would just like to get this triangle, with those vertices, to rotate without being modified this way.
Can someone explain how to do this?
EDIT :
Thanks to Andon M. Coleman and SAKrisT, I was able to get my origin point to the 0:0 Android coordinate at the top left.
If it can help:
public void onDrawFrame(GL10 unused) {
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
Matrix.orthoM(mProjectionMatrix, 0, left, right, top, bottom, near, far);
Matrix.setLookAtM(mViewMatrix, 0, 0, 0, 1.0f, 0.0f, 0f, 0f, 0f, 1.0f, 0.0f);
Matrix.multiplyMM(mMVPMatrix, 0, mProjectionMatrix, 0, mViewMatrix, 0);
Matrix.setRotateM(mRotationMatrix, 0, 0, 0, 0, 1.0f);
Matrix.multiplyMM(mProjectionMatrix, 0, mMVPMatrix, 0, mRotationMatrix, 0);
mTriangle.draw(mProjectionMatrix);
}
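For reference, here is a hedged sketch of one set of projection values that gives a 1280x800 surface a top-left origin (these literals are assumptions, not the asker's exact left/right/top/bottom/near/far values). Note that orthoM's parameter order is left, right, bottom, top, near, far:
// Map x to 0..1280 left-to-right and y to 0..800 top-to-bottom by swapping
// bottom/top; near/far just need to bracket the geometry's distance from the
// camera (1 unit here, since the eye sits at z = 1 looking at z = 0).
Matrix.orthoM(mProjectionMatrix, 0,
        0f, 1280f,   // left, right
        800f, 0f,    // bottom, top (flipped so y increases downward)
        0.1f, 10f);  // near, far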

Correct vertex shader code? OpenGL ES 2.0

Edit Code added, please see below
Edit 2 - Screenshots from device included at bottom along with explanation
Edit 3 - New code added
I have 2 classes, a renderer and a custom 'quad' class.
I have these declared at class level in my renderer class:
final float[] mMVPMatrix = new float[16];
final float[] mProjMatrix = new float[16];
final float[] mVMatrix = new float[16];
And in my onSurfaceChanged method I have:
@Override
public void onSurfaceChanged(GL10 gl, int width, int height) {
GLES20.glViewport(0, 0, width, height);
float ratio = (float) width / height;
Matrix.frustumM(mProjMatrix, 0, -ratio, ratio, -1, 1, 3, 7);
}
and....
public void onSurfaceCreated(GL10 gl, EGLConfig config) {
// TODO Auto-generated method stub
myBitmap = BitmapFactory.decodeResource(curView.getResources(), R.drawable.box);
//Create new Dot objects
dot1 = new Quad();
dot1.setTexture(curView, myBitmap);
dot1.setSize(300,187); //These numbers are the size but are redundant/not used at the moment.
myBitmap.recycle();
//Set colour to black
GLES20.glClearColor(0, 0, 0, 1);
}
And finally from this class, onDrawFrame:
@Override
public void onDrawFrame(GL10 gl) {
// TODO Auto-generated method stub
//Paint the screen the colour defined in onSurfaceCreated
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
// Set the camera position (View matrix) so looking from the front
Matrix.setLookAtM(mVMatrix, 0, 0, 0, 3, 0f, 0f, 0f, 0f, 1.0f, 0.0f);
// Combine
Matrix.multiplyMM(mMVPMatrix, 0, mProjMatrix, 0, mVMatrix, 0);
dot1.rotateQuad(0,0,45, mMVPMatrix); //x,y,angle and matrix passed in
}
Then, in my quad class:
This declared at class level:
private float[] mRotationMatrix = new float[16];
private final float[] mMVPMatrix = new float[16];
private final float[] mProjMatrix = new float[16];
private final float[] mVMatrix = new float[16];
private int mMVPMatrixHandle;
private int mPositionHandle;
private int mRotationHandle;
//Create our vertex shader
String strVShader =
"uniform mat4 uMVPMatrix;" +
"uniform mat4 uRotate;" +
"attribute vec4 a_position;\n"+
"attribute vec2 a_texCoords;" +
"varying vec2 v_texCoords;" +
"void main()\n" +
"{\n" +
// "gl_Position = a_position * uRotate;\n"+
// "gl_Position = uRotate * a_position;\n"+
"gl_Position = a_position * uMVPMatrix;\n"+
// "gl_Position = uMVPMatrix * a_position;\n"+
"v_texCoords = a_texCoords;" +
"}";
//Fragment shader
String strFShader =
"precision mediump float;" +
"varying vec2 v_texCoords;" +
"uniform sampler2D u_baseMap;" +
"void main()" +
"{" +
"gl_FragColor = texture2D(u_baseMap, v_texCoords);" +
"}";
Then the method for setting the texture (I don't think this is relevant to this problem though!):
public void setTexture(GLSurfaceView view, Bitmap imgTexture){
this.imgTexture=imgTexture;
iProgId = Utils.LoadProgram(strVShader, strFShader);
iBaseMap = GLES20.glGetUniformLocation(iProgId, "u_baseMap");
iPosition = GLES20.glGetAttribLocation(iProgId, "a_position");
iTexCoords = GLES20.glGetAttribLocation(iProgId, "a_texCoords");
texID = Utils.LoadTexture(view, imgTexture);
}
And finally, my 'rotateQuad' method (which currently is supposed to draw and rotate the quad).
public void rotateQuad(float x, float y, int angle, float[] mvpMatrix){
Matrix.setRotateM(mRotationMatrix, 0, angle, 0, 0, 0.1f);
// Matrix.translateM(mRotationMatrix, 0, 0, 0, 0); //Removed temporarily
// Combine the rotation matrix with the projection and camera view
Matrix.multiplyMM(mvpMatrix, 0, mRotationMatrix, 0, mvpMatrix, 0);
float[] vertices = {
-.5f,.5f,0, 0,0,
.5f,.5f,0, 1,0,
-.5f,-.5f,0, 0,1,
.5f,-.5f,0, 1,1
};
vertexBuf = ByteBuffer.allocateDirect(vertices.length * 4).order(ByteOrder.nativeOrder()).asFloatBuffer();
vertexBuf.put(vertices).position(0);
//Bind the correct texture
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, texID);
//Use program
GLES20.glUseProgram(iProgId);
// get handle to shape's transformation matrix
mMVPMatrixHandle = GLES20.glGetUniformLocation(iProgId, "uMVPMatrix");
// Apply the projection and view transformation
GLES20.glUniformMatrix4fv(mMVPMatrixHandle, 1, false, mvpMatrix, 0);
// get handle to shape's rotation matrix
mRotationHandle = GLES20.glGetUniformLocation(iProgId, "uRotate");
// Pass the rotation matrix to the shader
GLES20.glUniformMatrix4fv(mRotationHandle, 1, false, mRotationMatrix, 0);
//Set starting position for vertices
vertexBuf.position(0);
//Specify attributes for vertex
GLES20.glVertexAttribPointer(iPosition, 3, GLES20.GL_FLOAT, false, 5 * 4, vertexBuf);
//Enable attribute for position
GLES20.glEnableVertexAttribArray(iPosition);
//Set starting position for texture
vertexBuf.position(3);
//Specify attributes for texture coordinates
GLES20.glVertexAttribPointer(iTexCoords, 2, GLES20.GL_FLOAT, false, 5 * 4, vertexBuf);
//Enable attribute for texture
GLES20.glEnableVertexAttribArray(iTexCoords);
//Draw it
GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);
}
For Edit 2:
This is my quad as drawn in the center of the screen. No rotation.
This is the same quad rotated by +45 degrees with the line "gl_Position = a_position * uMVPMatrix;" in my vertex shader (it's from a different project now, so the shader variable is a_position and not vPosition); it looks correct!
However, this is the same quad rotated by +45 degrees with the two shader variables switched (so the line reads "gl_Position = uMVPMatrix * a_position;"). As you can see, it's not quite right.
Also, just a side note: you can't see it here as the square is symmetrical, but each method also rotates in the opposite direction to the other...
Any help appreciated.
It's really impossible to tell, because we don't know what you are passing to these two variables.
OpenGL uses column-major format, so if vPosition is in fact a vector and uMVPMatrix is a matrix, then the first option is correct, if this is in your shader.
If this is not in your shader but in your program code, then there is not enough information.
If you are using the first option but getting unexpected results, you are likely not computing your matrix properly or not passing the correct vertices.
Normally in the vertex shader you should multiply the positions by the MVP, that is:
gl_Position = uMVPMatrix * vPosition;
When you change the order this should work...
Thanks to all for the help.
I managed to track down the problem (for the most part). I will show what I did.
It was the following line:
Matrix.multiplyMM(mvpMatrix, 0, mvpMatrix, 0, mRotationMatrix, 0);
As you can see I was multiplying the matrices and storing them back into one that I was using in the multiplication.
So I created a new matrix called mvpMatrix2 and stored the results in that. Then passed that to my vertex shader.
//Multiply matrices
Matrix.multiplyMM(mvpMatrix2, 0, mvpMatrix, 0, mRotationMatrix, 0);
//get handle to shape's transformation matrix
mMVPMatrixHandle = GLES20.glGetUniformLocation(iProgId, "uMVPMatrix");
//Give to vertex shader variable
GLES20.glUniformMatrix4fv(mMVPMatrixHandle, 1, false, mvpMatrix2, 0);
After applying this, there is no distortion (and also, with regard to my other question here, Using Matrix.Rotate in OpenGL ES 2.0, I am able to translate the centre of the quad). I say 'for the most part' because when I rotate it, it rotates backwards (so if I ask for +45 degrees, clockwise, it actually rotates the quad by -45 degrees, anti-clockwise).
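As a hedged footnote on that leftover direction issue: negating the angle when building the rotation matrix is the usual fix (same field names as in the code above assumed):
// Rotating by -angle makes a requested +45 degrees come out clockwise again.
Matrix.setRotateM(mRotationMatrix, 0, -angle, 0, 0, 1.0f);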
But hopefully, this will help anyone who has a similar problem in the future.

Using Matrix. Rotate in OpenGL ES 2.0

Edit - Added more code
Having a lot of problems attempting to correctly rotate my quad using OpenGL ES 2.0.
It always seems to rotate around the centre of the screen coordinates. I'm trying to get it to rotate around its own centre (for 2D, so the z axis only).
I've been experimenting with Matrix.translate as shown below. However, changing the x or y position here simply draws the quad at a different place on the screen, but when it rotates, it again rotates around the centre of the screen. Please could someone explain how to get it to spin around its own z axis (like a wheel)?
Thanks, here are the relevant lines of code - if more is needed, please ask and I will post. (Please note, I've looked at a lot of similar questions on SO and the wider internet but I've not managed to find an answer thus far).
Thanks.
//Set rotation
Matrix.setRotateM(mRotationMatrix, 0, -angle, 0, 0, 1.0f);
//Testing translation
Matrix.translateM(mRotationMatrix, 0, -.5f, .5f, 0f);
// Combine the rotation matrix with the projection and camera view
Matrix.multiplyMM(mvpMatrix, 0, mRotationMatrix, 0, mvpMatrix, 0);
My Shaders (declared at class level)
private final String vertexShaderCode =
"uniform mat4 uMVPMatrix;" +
"attribute vec4 vPosition;" +
"void main() {" +
" gl_Position = uMVPMatrix * vPosition;" +
"}";
private final String fragmentShaderCode =
"precision mediump float;" +
"uniform vec4 vColor;" +
"void main() {" +
" gl_FragColor = vColor;" +
"}";
From onSurfaceChanged
float ratio = (float) width / height;
Matrix.frustumM(mProjMatrix, 0, -ratio, ratio, -1, 1, 3, 7);
In My onDrawFrame method
// Set the camera position (View matrix)
Matrix.setLookAtM(mVMatrix, 0, 0, 0, 3, 0f, 0f, 0f, 0f, 1.0f, 0.0f);
//Calculate the projection and view transformation
Matrix.multiplyMM(mMVPMatrix, 0, mProjMatrix, 0, mVMatrix, 0);
I've encountered the same problems (weird distortions and everything else); my solution, based on Android Training > Displaying Graphics with OpenGL ES > Adding Motion, is below.
(Head over to my detailed post for more detail if needed:
OpenGL ES Android Matrix Transformations.)
Set a mModelMatrix to identity Matrix
Matrix.setIdentityM(mModelMatrix, 0); // initialize to identity matrix
Apply translation to the mModelMatrix
Matrix.translateM(mModelMatrix, 0, -0.5f, 0, 0); // translation to the left
Apply rotation to a mRotationMatrix (angles in degrees)
Matrix.setRotateM(mRotationMatrix, 0, mAngle, 0, 0, -1.0f);
Combine rotation and translation via Matrix.multiplyMM
mTempMatrix = mModelMatrix.clone();
Matrix.multiplyMM(mModelMatrix, 0, mTempMatrix, 0, mRotationMatrix, 0);
Combine the model matrix with the projection and camera view
mTempMatrix = mMVPMatrix.clone();
Matrix.multiplyMM(mMVPMatrix, 0, mTempMatrix, 0, mModelMatrix, 0);
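Putting the steps above together, here is a minimal hedged sketch of a complete onDrawFrame (field names such as mModelMatrix, mTempMatrix and mQuad are assumptions; the projection/view setup matches the question's code):
@Override
public void onDrawFrame(GL10 unused) {
    GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
    // Projection * View, as in the question.
    Matrix.setLookAtM(mVMatrix, 0, 0, 0, 3, 0f, 0f, 0f, 0f, 1.0f, 0.0f);
    Matrix.multiplyMM(mMVPMatrix, 0, mProjMatrix, 0, mVMatrix, 0);
    // Model = Translation * Rotation, per the steps above.
    Matrix.setIdentityM(mModelMatrix, 0);
    Matrix.translateM(mModelMatrix, 0, -0.5f, 0, 0);
    Matrix.setRotateM(mRotationMatrix, 0, mAngle, 0, 0, -1.0f);
    mTempMatrix = mModelMatrix.clone();
    Matrix.multiplyMM(mModelMatrix, 0, mTempMatrix, 0, mRotationMatrix, 0);
    // MVP = (Projection * View) * Model, written via a copy so that no
    // input of multiplyMM aliases its output.
    mTempMatrix = mMVPMatrix.clone();
    Matrix.multiplyMM(mMVPMatrix, 0, mTempMatrix, 0, mModelMatrix, 0);
    mQuad.draw(mMVPMatrix);
}
Because the model matrix ends up as Translation * Rotation, the quad spins about its own centre and is then carried to the translated position.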
Here is a walkthrough. Let's say you were to draw a teapot... the modelMatrix would be an identity to start with. The shape is centered on the origin like this:
Verify this is what you have before you continue...
Once you have that, you should apply the rotation to the model matrix, compile and run - you get a rotated copy...
Once you have this you can translate:
So for you, all you appear to need to do is verify that when the rotation matrix is the identity, e.g.
Matrix.setIdentityM(mRotationMatrix, 0);
the shape is in the center. If it is not, move it to the center.
Once it is in the center, apply the rotation, e.g.
Matrix.setIdentityM(mRotationMatrix, 0);
<as needed movement to center>
Matrix.rotateM(mRotationMatrix, 0, -angle, 0, 0, 1.0f);
<any other translation you want>
Do it in steps to make your life easy so you see what is going on.
Rotation usually occurs around the origin, so you want to rotate your quad before you translate it. If you rotate after you translate, then the quad will first be moved away from the origin, then rotated around the origin.
Without knowing how your Matrix functions are implemented, we cannot advise on whether you are using them correctly. All you've shown us is the functions' interface.
But in general, rotate before you translate.
Apply your operations backwards:
1st- Matrix.translateM(mRotationMatrix, 0, -.5f, .5f, 0f);
2nd- Matrix.rotateM(mRotationMatrix, 0, -angle, 0, 0, 1.0f); (rotateM rather than setRotateM, so the translation from the first step is not overwritten)
It will rotate around its own center
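Here is a hedged sketch of that order using the Android Matrix helpers (the local array names are assumptions; mvpMatrix is the combined projection*view passed in from onDrawFrame):
// Build Model = Translate * Rotate, then MVP = (Projection * View) * Model.
float[] model = new float[16];
float[] scratch = new float[16];
Matrix.setIdentityM(model, 0);
Matrix.translateM(model, 0, -.5f, .5f, 0f);    // 1st: move the quad into place
Matrix.rotateM(model, 0, -angle, 0, 0, 1.0f);  // 2nd: spin it about its own centre
// Write into scratch so mvpMatrix is never both an input and the output.
Matrix.multiplyMM(scratch, 0, mvpMatrix, 0, model, 0);
// ...then pass 'scratch' (not mvpMatrix) to glUniformMatrix4fv.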

Android OpenGL weirdness with the setLookAtM method

As a beginner to Android and OpenGL ES 2.0, I'm testing simple things and seeing how it goes.
I downloaded the sample at http://developer.android.com/training/graphics/opengl/touch.html .
I changed the code to check whether I could animate a rotation of the camera around the (0,0,0) point, the center of the square.
So I did this:
public void onDrawFrame(GL10 unused) {
// Draw background color
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
// Set the camera position (View matrix)
long time = SystemClock.uptimeMillis() % 4000L;
float angle = ((float) (2*Math.PI)/ (float) 4000) * ((int) time);
Matrix.setLookAtM(mVMatrix, 0, (float) (3*Math.sin(angle)), 0, (float) (3.0f*Math.cos(angle)), 0 ,0, 0, 0f, 1.0f, 0.0f);
// Calculate the projection and view transformation
Matrix.multiplyMM(mMVPMatrix, 0, mProjMatrix, 0, mVMatrix, 0);
// Draw square
mSquare.draw(mMVPMatrix);
}
I expected the camera to always look at the center of the square (the (0,0,0) point), but that's not what happens. The camera is indeed rotating around the square, but the square does not stay in the center of the screen; instead it is moving along the X axis:
I also expected that if we gave eyeX and eyeY the same values as centerX and centerY, like this:
Matrix.setLookAtM(mVMatrix, 0, 1, 1, -3, 1, 1, 0, 0f, 1.0f, 0.0f);
the square would keep its shape (I mean, your field of vision would be dragged, but along a plane parallel to the square), but that's also not what happens:
This is my projection matrix:
float ratio = (float) width / height;
// this projection matrix is applied to object coordinates
// in the onDrawFrame() method
Matrix.frustumM(mProjMatrix, 0, -ratio, ratio, -1, 1, 2, 7);
What is going on here?
Looking at the source code of the example you downloaded, I can see why you're having that problem: it has to do with the order of the matrix multiplication.
Typically in OpenGL source you see matrices set up such that
transformed vertex = projMatrix * viewMatrix * modelMatrix * input vertex
However, in the example program that you downloaded, the shader is set up like this:
" gl_Position = vPosition * uMVPMatrix;"
With the position on the other side of the matrix. You can work with OpenGL in this way, but it requires that you reverse the lhs/rhs of your matrix multiplications.
Long story short, in your case, you should change your shader to read:
" gl_Position = uMVPMatrix * vPosition;"
and then I believe you will get the expected behavior.

Is Google's Android OpenGL tutorial teaching incorrect linear algebra?

After helping another user with a question regarding the Responding to Touch Events Android tutorial, I downloaded the source code, and was quite baffled by what I saw. The tutorial seems to not be able to decide whether it wants to use row vectors or column vectors, and it looks all mixed up to me.
On the Android Matrix page, they claim that their convention is column-vector/column-major, which is typical of OpenGL.
Am I right, or is there something I am missing? Here are the relevant bits of it:
Start out by creating an MVPMatrix by multiplying mProjMatrix * mVMatrix. So far so good.
// Set the camera position (View matrix)
Matrix.setLookAtM(mVMatrix, 0, 0, 0, -3, 0f, 0f, 0f, 0f, 1.0f, 0.0f);
// Calculate the projection and view transformation
Matrix.multiplyMM(mMVPMatrix, 0, mProjMatrix, 0, mVMatrix, 0)
Next they are appending a rotation to the left hand side of the MVPMatrix? This seems a little weird.
// Create a rotation for the triangle
Matrix.setRotateM(mRotationMatrix, 0, mAngle, 0, 0, -1.0f);
// Combine the rotation matrix with the projection and camera view
Matrix.multiplyMM(mMVPMatrix, 0, mRotationMatrix, 0, mMVPMatrix, 0)
Uploading in non-transposed order.
GLES20.glUniformMatrix4fv(mMVPMatrixHandle, 1, false, mvpMatrix, 0);
Finally in their shader, a vector*matrix multiplication?
// the matrix must be included as a modifier of gl_Position
" gl_Position = vPosition * uMVPMatrix;"
Adding this all together, we get:
gl_Position = vPosition * mRotation * mProjection * mView;
Which is not correct by any stretch of my imagination. Is there any explanation that I'm not seeing as to what's going on here?
As the guy who wrote that OpenGL tutorial, I can confirm that the example code is incorrect. Specifically, the order of the factors in the shader code should be reversed:
" gl_Position = uMVPMatrix * vPosition;"
As to the application of the rotation matrix, the order of the factors should also be reversed so that the rotation is the last factor. The rule of thumb is that matrices are applied in right-to-left order, and the rotation is applied first (it's the "M" part of "MVP"), so it needs to be the rightmost operand. Furthermore, you should use a scratch matrix for this calculation, as recommended by Ian Ni-Lewis (see his more complete answer, below):
float[] scratch = new float[16];
// Combine the rotation matrix with the projection and camera view
Matrix.multiplyMM(scratch, 0, mMVPMatrix, 0, mRotationMatrix, 0);
Thanks for calling attention to this problem. I'll get the training class and sample code fixed as soon as I can.
Edit: This issue has now been corrected in the downloadable sample code and the OpenGL ES training class, including comments on the correct order of the factors. Thanks for the feedback, folks!
The tutorial is incorrect, but many of the mistakes either cancel each other out or are not obvious in this very limited context (fixed camera centered at (0,0), rotation around Z only). The rotation is backwards, but otherwise it kind of looks right. (To see why it's wrong, try a less trivial camera: set the eye and lookAt to y=1, for instance.)
One of the things that made this very hard to debug is that the Matrix methods don't do any alias detection on their inputs. The tutorial code makes it seem like you can call Matrix.multiplyMM with the same matrix used as both an input and the result. This isn't true. But because the implementation multiplies a column at a time, it's far less obvious that something is wrong if the right hand side is reused (as in the current code, where mMVPMatrix is the rhs and the result) than if the left hand side is reused. Each column on the left is read before the corresponding column in the result is written, so the output will be correct even if the LHS is overwritten. But if the right-hand side is the same as the result, then its first column will be overwritten before it's finished being read.
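To make the aliasing hazard concrete, here is a minimal contrast (the only change is giving multiplyMM an output array that is not one of its inputs):
// Risky: mMVPMatrix is both the right-hand input and the result.
Matrix.multiplyMM(mMVPMatrix, 0, mRotationMatrix, 0, mMVPMatrix, 0);
// Safe: write into a scratch array that aliases neither input.
float[] scratch = new float[16];
Matrix.multiplyMM(scratch, 0, mRotationMatrix, 0, mMVPMatrix, 0);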
So the tutorial code is at a sort of local maximum: it seems like it works, and if you change any one thing, it breaks spectacularly. Which leads one to believe that wrong as it looks, it might just be correct. ;-)
Anyway, here's some replacement code that gets what I think is the intended result.
Java code:
@Override
public void onDrawFrame(GL10 unused) {
float[] scratch = new float[16];
// Draw background color
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
// Set the camera position (View matrix)
Matrix.setLookAtM(mVMatrix, 0, 0, 0, -3, 0f, 0f, 0f, 0f, 1.0f, 0.0f);
// Calculate the projection and view transformation
Matrix.multiplyMM(mMVPMatrix, 0, mProjMatrix, 0, mVMatrix, 0);
// Draw square
mSquare.draw(mMVPMatrix);
// Create a rotation for the triangle
Matrix.setRotateM(mRotationMatrix, 0, mAngle, 0, 0, 1.0f);
// Combine the rotation matrix with the projection and camera view
Matrix.multiplyMM(scratch, 0, mMVPMatrix, 0, mRotationMatrix, 0);
// Draw triangle
mTriangle.draw(scratch);
}
Shader code:
gl_Position = uMVPMatrix * vPosition;
NB: these fixes make the projection correct, but they also reverse the direction of rotation. That's because the original code applied the transformations in the wrong order. Think of it this way: instead of rotating the object clockwise, it was rotating the camera counterclockwise. When you fix the order of operations so that the rotation is applied to the object instead of the camera, then the object starts going counterclockwise. It's not the matrix that's wrong; it's the angle that was used to create the matrix.
So to get the 'correct' result, you also need to flip the sign of mAngle.
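In code, that sign flip is just the following (assuming the mAngle field from the replacement code above):
// Negate the angle so the triangle spins in the same direction as before the fix.
Matrix.setRotateM(mRotationMatrix, 0, -mAngle, 0, 0, 1.0f);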
I solved this problem as follows:
@Override
public void onDrawFrame(GL10 unused) {
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
Matrix.setLookAtM(mViewMatrix, 0, 0, 0, -1f, 0f, 0f, 0f, 0f, 1.0f, 0.0f);
Matrix.setRotateM(mModelMatrix, 0, mAngle, 0, 0, 1.0f);
Matrix.translateM(mModelMatrix, 0, 0.4f, 0.0f, 0);
mSquare.draw(mProjMatrix,mViewMatrix,mModelMatrix);
}
@Override
public void onSurfaceChanged(GL10 unused, int width, int height) {
...
Matrix.frustumM(mProjMatrix, 0, -ratio, ratio, -1, 1, 1, 99);
}
class Square {
private final String vertexShaderCode =
"uniform mat4 uPMatrix; \n" +
"uniform mat4 uVMatrix; \n" +
"uniform mat4 uMMatrix; \n" +
"attribute vec4 vPosition; \n" +
"void main() { \n" +
" gl_Position = uPMatrix * uVMatrix * uMMatrix * vPosition; \n" +
"} \n";
...
public void draw(float[] mpMatrix,float[] mvMatrix,float[]mmMatrix) {
...
mPMatrixHandle = GLES20.glGetUniformLocation(mProgram, "uPMatrix");
mVMatrixHandle = GLES20.glGetUniformLocation(mProgram, "uVMatrix");
mMMatrixHandle = GLES20.glGetUniformLocation(mProgram, "uMMatrix");
GLES20.glUniformMatrix4fv(mPMatrixHandle, 1, false, mpMatrix, 0);
GLES20.glUniformMatrix4fv(mVMatrixHandle, 1, false, mvMatrix, 0);
GLES20.glUniformMatrix4fv(mMMatrixHandle, 1, false, mmMatrix, 0);
...
}
}
I’m working on the same issue and here’s what I found:
I believe that Joe’s sample is CORRECT,
including
the order of the factors in the shader code:
gl_Position = vPosition * uMVPMatrix;
To verify it, just try to rotate the triangle with the factor order reversed;
it will stretch the triangle to a vanishing point at 90 degrees.
The real problem seems to be in setLookAtM function.
In Joe’s sample parameters are:
Matrix.setLookAtM(mVMatrix, 0,
0f, 0f,-3f, 0f, 0f, 0f, 0f, 1f, 0f );
which is perfectly logical as well.
However, the resulting view matrix looks weird to me:
-1 0 0 0
0 1 0 0
0 0 -1 0
0 0 -3 1
As we can see, this matrix will invert the X coordinate,
since the first element is -1,
which will lead to a left/right flip on the screen.
It will also reverse the Z order, but let's focus on the X coordinate here.
I think the setLookAtM function is also working correctly.
However, since the Matrix class is NOT part of OpenGL,
it can use some other coordinate system,
for example, regular screen coordinates with the Y axis pointing down.
This is just a guess; I didn't really verify it.
Possible solutions:
We can build the desired view matrix manually;
the code is:
Matrix.setIdentityM(mVMatrix,0);
mVMatrix[14] = -3f;
OR
we can try to trick the setLookAtM function by giving it
reversed camera coordinates:
0, 0, +3 (instead of -3).
Matrix.setLookAtM(mVMatrix, 0,
0f, 0f, 3f, 0f, 0f, 0f, 0f, 1f, 0f );
The resulting view matrix will be:
1 0 0 0
0 1 0 0
0 0 1 0
0 0 -3 1
That’s exactly what we need.
Now the camera behaves as expected, and the sample works correctly.
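If you want to check this on your own device, here is a quick, hedged way to dump what setLookAtM produces (using android.util.Log and java.util.Arrays; Android stores matrices column-major, so the translation sits in elements 12-14):
float[] v = new float[16];
Matrix.setLookAtM(v, 0, 0f, 0f, 3f, 0f, 0f, 0f, 0f, 1f, 0f);
// For an eye at +3 on z this logs an identity rotation with 0, 0, -3
// in elements 12..14 - the matrix shown above.
Log.d("ViewMatrix", Arrays.toString(v));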
When trying to move the triangle, none of the other suggestions worked for me with the current, updated Android example code except for the following.
The following link contains the answer. It took over a day to locate. I'm posting it here to help others, as I have seen this post many times: OpenGL ES Android Matrix Transformations
