The next translate depends on the previous scale in a GL10 object - android

I set up my GL10 object this way in my 2D Android game:
public void onSurfaceChanged(GL10 gl, int width, int height) {
    gl.glViewport(0, 0, width, height);
    gl.glMatrixMode(GL10.GL_PROJECTION);
    gl.glLoadIdentity();
    gl.glOrthof(0, width, height, 0, -1f, 1f);
    gl.glMatrixMode(GL10.GL_MODELVIEW);
    gl.glLoadIdentity();
}
When I scale my GL10 object in this way (for example):
gl.glScalef(50, 50, 0);
Then a call to a translate transformation such as:
gl.glTranslatef(1, 1, 0f);
Causes a translate transformation of 50 pixels instead of 1 pixel. The same goes for a rotate transformation.
How can I apply a translate transformation afterwards without knowing the previous transformations? Is that possible?

I'm not an OpenGL guru, but what I did instead was modify the vertices directly.
Let's say we had:
vertices = new float[12];
vertices[0] = -0.25f;
vertices[1] = -0.25f;
vertices[2] = 0.0f;
vertices[3] = -0.25f;
vertices[4] = 0.25f;
vertices[5] = 0.0f;
vertices[6] = 0.25f;
vertices[7] = -0.25f;
vertices[8] = 0.0f;
vertices[9] = 0.25f;
vertices[10] = 0.25f;
vertices[11] = 0.0f;
So to zoom in I used:
gl.glPushMatrix();
...
// grow the quad outward as distance decreases relative to mGameRange
vertices[0] = -0.25f - 1.75f * (mGameRange - distance) / mGameRange;
vertices[1] = -0.25f - 1.75f * (mGameRange - distance) / mGameRange;
vertices[3] = -0.25f - 1.75f * (mGameRange - distance) / mGameRange;
vertices[4] = 0.25f + 1.75f * (mGameRange - distance) / mGameRange;
vertices[6] = 0.25f + 1.75f * (mGameRange - distance) / mGameRange;
vertices[7] = -0.25f - 1.75f * (mGameRange - distance) / mGameRange;
vertices[9] = 0.25f + 1.75f * (mGameRange - distance) / mGameRange;
vertices[10] = 0.25f + 1.75f * (mGameRange - distance) / mGameRange;
// refill the vertex buffer with the updated coordinates
vertexBuffer.clear();
vertexBuffer.put(vertices);
vertexBuffer.position(0);
...
gl.glPopMatrix();
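For completeness, the snippet above assumes a vertexBuffer that was allocated elsewhere. A minimal sketch of that setup (standard direct-NIO allocation; not part of the original answer) could look like this:
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

// 12 floats (4 vertices * x, y, z), 4 bytes per float
ByteBuffer bb = ByteBuffer.allocateDirect(vertices.length * 4);
bb.order(ByteOrder.nativeOrder());            // OpenGL expects native byte order
FloatBuffer vertexBuffer = bb.asFloatBuffer();
vertexBuffer.put(vertices);
vertexBuffer.position(0);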

Try putting your scale call inside a glPushMatrix()/glPopMatrix() pair and doing your translate call after the pop. There is a good discussion of this here.
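A minimal sketch of that approach (the drawScaledSprite() call is just a placeholder for whatever is drawn at 50x scale):
gl.glPushMatrix();
gl.glScalef(50, 50, 1);       // the scale only affects what is drawn before the pop
drawScaledSprite(gl);         // placeholder draw call
gl.glPopMatrix();
// the modelview matrix is back to its previous state here, so with the
// pixel-based ortho projection above this moves by 1 pixel, not 50
gl.glTranslatef(1, 1, 0);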

What I've found is that if you want the translations to be independent, you can undo a translation by translating in the opposite direction after you finish drawing, for example:
gl.glTranslatef(position.X, position.Y, 0);
gl.glDrawArrays(...); //first triangle
gl.glTranslatef(-position.X, -position.Y, 0); // note how they are negative instead of positive
gl.glDrawArrays(...); //second triangle
In this code the second triangle will be drawn at (0, 0, 0) and the first will be drawn at (position.X, position.Y, 0). This tends to work for me, but I still recommend pushing and popping the matrix before and after doing these translations.
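A sketch of that push/pop version, keeping the same elided glDrawArrays(...) calls as in the answer:
gl.glPushMatrix();
gl.glTranslatef(position.X, position.Y, 0);
gl.glDrawArrays(...);   // first triangle, at position.X / position.Y
gl.glPopMatrix();       // restores the matrix, so no reverse translate is needed
gl.glDrawArrays(...);   // second triangle, at the origin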

Related

OpenGL ES 1.0: World coordinates to screen coordinates

For 3D picking I planned to do this:
- Get the touch coordinates (x, y).
- Choose a vertex from the vertex buffer of my model (xM, yM, zM).
- Project (xM, yM, zM) onto screen coordinates myself: (xM, yM, zM) ---> (xP, yP, ...)
- Then check for a match, for example sqrt((x - xP)^2 + (y - yP)^2) < SOME_EPS.
For projecting I saved the frustum matrix in mProjectionMatrix:
gl.glFrustumf(-ratio / q, ratio / q, -1 / q, 1 / q, 1, 25);
Matrix.frustumM(mProjectionMatrix, 0, -ratio / q, ratio / q, -1 / q, 1 / q, 1, 25);
and saved the model transform in mAccRotation:
gl.glLoadMatrixf(mAccRotation, 0);
So the testing function turned into this (TESTIFY_VERT is one of the vertices in my model):
public void touch(float x, float y) {
    float TESTIFY_VERT[] = {0.0f, 0.0f, -1.5f, 1.0f}; // first vertex in L0
    float Resulted[] = new float[4];
    float rMatrix[] = new float[16];
    Matrix.multiplyMM(rMatrix, 0, mProjectionMatrix, 0, mAccRotation, 0);
    Matrix.multiplyMV(Resulted, 0, rMatrix, 0, TESTIFY_VERT, 0);
}
So I tried to use Resulted[0], Resulted[1] as (xP, yP),
and also tried (Resulted[0] + 1) * (WIDTH / 2.0f), (-Resulted[1] + 1) * (HEIGHT / 2.0f).
This doesn't work. Why?
Can you give me any advice?
PS: I have seen all similar questions and they don't answer my problem.
Maybe you are missing the perspective divide. Divide Resulted[0] and Resulted[1] by Resulted[3] and use the ratios as (xP, yP), i.e.:
float xP = Resulted[0] / Resulted[3];
float yP = Resulted[1] / Resulted[3];
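Combining that divide with the viewport mapping the asker already tried, a full clip-space-to-screen sketch could look like this (WIDTH and HEIGHT are assumed to be the viewport size in pixels):
// perspective divide: clip space -> normalized device coordinates (-1..1)
float ndcX = Resulted[0] / Resulted[3];
float ndcY = Resulted[1] / Resulted[3];
// NDC -> window coordinates, with y flipped to match touch coordinates
float xP = (ndcX + 1) * (WIDTH / 2.0f);
float yP = (-ndcY + 1) * (HEIGHT / 2.0f);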

Touch coords and OpenGL space coords

I'm having a hard time implementing the rotation and movement of an on-screen OpenGL shape. Basically, what I want to achieve is being able to control the shape using the touch screen. Wherever I touch, the shape should rotate towards that point and start moving towards it until it gets there.
Here's the code that sets up the frustum and the camera in onSurfaceChanged():
glViewport(0, 0, width, height);
float sizeRatio = (float) width / height;
Matrix.frustumM(
        projectionMatrix, 0, -sizeRatio, sizeRatio, -1.0f, 1.0f, 1.0f, 20.0f
);
Matrix.setLookAtM(
        viewMatrix, 0, 0, 0, -60, 0f, 0f, 0f, 0.0f, 1.0f, 0.0f
);
Matrix.multiplyMM(globalMvpMatrix, 0, projectionMatrix, 0, viewMatrix, 0);
and here's how the touch input is handled in onTouchEvent() (shape is an object that stores position and rotation and then draws the shape on the screen):
lastTouchX = event.getX();
lastTouchY = event.getY();
float shapePosX = shape.getPositionX();
float shapePosY = shape.getPositionY();
int[] viewport = new int[4];
glGetIntegerv(GL_VIEWPORT, viewport, 0);
float[] unprojectedTouchCoords = new float[4];
GLU.gluUnProject(
        lastTouchX, lastTouchY, 0,
        viewMatrix, 0,
        projectionMatrix, 0,
        viewport, 0,
        unprojectedTouchCoords, 0
);
float unprojectedX = unprojectedTouchCoords[0] / unprojectedTouchCoords[3];
float unprojectedY = unprojectedTouchCoords[1] / unprojectedTouchCoords[3];
float rotation = 90 - (float) Math.toDegrees(Math.atan2(
shapePosY - unprojectedY, shapePosX - unprojectedX
));
shape.setRotation(rotation);
float moveX = 2 * (float) Math.cos(Math.toRadians(rotation));
float moveY = 2 * (float) Math.sin(Math.toRadians(rotation));
shape.move(moveX, moveY);
However, it doesn't seem to work well. The shape is moving in the wrong direction, and the rotation is only correct if the position of the shape is (0, 0). In any other case, it breaks.
I guess this problem involves distance between the camera and the shape in the OpenGL space, but I have no idea where and how to fix that. I tried bringing the camera closer to the shape but with no apparent improvement.
Can you please help me with this one? I'm getting really frustrated.
I haven't looked deeply at your code, but I really want to give you a suggestion: the order of transformations matters in OpenGL, because they are cumulative.
This means that rotate-then-translate is different from translate-then-rotate.
Here is a good starting point for understanding what I'm saying.
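A small sketch of what "cumulative" means in practice, using android.opengl.Matrix (the 90-degree angle and unit translation are arbitrary values chosen for illustration):
float[] a = new float[16];
float[] b = new float[16];

// rotate first, then translate: the translation is applied along the rotated
// axes, so a vertex at the origin ends up at (0, 1, 0)
Matrix.setIdentityM(a, 0);
Matrix.rotateM(a, 0, 90, 0, 0, 1);   // 90 degrees about z
Matrix.translateM(a, 0, 1, 0, 0);

// translate first, then rotate: the object is rotated about its own origin and
// only then moved, so the same vertex ends up at (1, 0, 0)
Matrix.setIdentityM(b, 0);
Matrix.translateM(b, 0, 1, 0, 0);
Matrix.rotateM(b, 0, 90, 0, 0, 1);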
Seems like you have a coordinate-calculation problem. You need to @Override the onTouchEvent method of your GLSurfaceView; the rest is below:
private final float TOUCH_SCALE_FACTOR = 180.0f / 320;
private float mPreviousX;
private float mPreviousY;

@Override
public boolean onTouchEvent(MotionEvent e) {
    // MotionEvent reports input details from the touch screen
    // and other input controls. In this case, you are only
    // interested in events where the touch position changed.
    float x = e.getX();
    float y = e.getY();
    switch (e.getAction()) {
        case MotionEvent.ACTION_MOVE:
            float dx = x - mPreviousX;
            float dy = y - mPreviousY;
            // reverse direction of rotation above the mid-line
            if (y > getHeight() / 2) {
                dx = dx * -1;
            }
            // reverse direction of rotation to left of the mid-line
            if (x < getWidth() / 2) {
                dy = dy * -1;
            }
            mRenderer.setAngle(
                    mRenderer.getAngle() +
                    ((dx + dy) * TOUCH_SCALE_FACTOR)); // = 180.0f / 320
            requestRender();
    }
    mPreviousX = x;
    mPreviousY = y;
    return true;
}
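One assumption behind that snippet: requestRender() only has an effect if the view is set to render on demand. A sketch of the corresponding GLSurfaceView setup (the class and renderer names are placeholders):
import android.content.Context;
import android.opengl.GLSurfaceView;

public class MyGLSurfaceView extends GLSurfaceView {
    private final MyGLRenderer mRenderer;

    public MyGLSurfaceView(Context context) {
        super(context);
        mRenderer = new MyGLRenderer();
        setRenderer(mRenderer);
        // redraw only when requestRender() is called, e.g. from onTouchEvent()
        setRenderMode(GLSurfaceView.RENDERMODE_WHEN_DIRTY);
    }
}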

How to read the depth of a pixel with OpenGL ES 1? (Z coordinate of a pixel on the screen)

I need to pass the winZ value of a pixel to gluUnProject. To obtain the winZ value I need to read the depth value at a given pixel, which is a normalised z coordinate.
In C the way to do it is: glReadPixels(winX, winY, 1, 1, GL_DEPTH_COMPONENT, GL_FLOAT, &winZ);
The problem is that I'm programming for Android 1.5 and OpenGL ES 1, so I can't use glReadPixels with GL_DEPTH_COMPONENT.
How can I obtain the depth of a pixel on the screen?
Solved: it is simply impossible to do this on OpenGL ES 1.x.
If someone finds a way, tell me, but after weeks of searching I simply think it is impossible.
Check out this nice post, I think it has what you're looking for:
http://afqa123.com/2012/04/03/fixing-gluunproject-in-android-froyo/
It gives a custom GLU unproject which lets you calculate the near and far vectors:
private boolean unProject(float winx, float winy, float winz,
        float[] modelMatrix, int moffset,
        float[] projMatrix, int poffset,
        int[] viewport, int voffset,
        float[] obj, int ooffset) {
    float[] finalMatrix = new float[16];
    float[] in = new float[4];
    float[] out = new float[4];
    Matrix.multiplyMM(finalMatrix, 0, projMatrix, poffset,
            modelMatrix, moffset);
    if (!Matrix.invertM(finalMatrix, 0, finalMatrix, 0))
        return false;
    in[0] = winx;
    in[1] = winy;
    in[2] = winz;
    in[3] = 1.0f;
    // Map x and y from window coordinates
    in[0] = (in[0] - viewport[voffset]) / viewport[voffset + 2];
    in[1] = (in[1] - viewport[voffset + 1]) / viewport[voffset + 3];
    // Map to range -1 to 1
    in[0] = in[0] * 2 - 1;
    in[1] = in[1] * 2 - 1;
    in[2] = in[2] * 2 - 1;
    Matrix.multiplyMV(out, 0, finalMatrix, 0, in, 0);
    if (out[3] == 0.0f)
        return false;
    out[0] /= out[3];
    out[1] /= out[3];
    out[2] /= out[3];
    obj[ooffset] = out[0];
    obj[ooffset + 1] = out[1];
    obj[ooffset + 2] = out[2];
    return true;
}
In order to get the 3D coordinates of a point x/y on the screen, you only have to call the new method once for each intersection with the near and far clipping planes:
// near clipping plane intersection
float[] objNear = new float[4];
unProject(screenX, screenY, 0.1f,
        viewMatrix, 0, projectionMatrix, 0, view, 0, objNear, 0);
// far clipping plane intersection
float[] objFar = new float[4];
unProject(screenX, screenY, 1.0f,
        viewMatrix, 0, projectionMatrix, 0, view, 0, objFar, 0);
You can then calculate the difference vector (objFar - objNear) and check what location in 3D space corresponds to the screen event.
This is a simple function I use to find the distance between two vectors:
public float distanceBetweenVectors(float[] vec1, float[] vec2) {
    float distance = 0;
    // Simple math
    distance = (float) Math.sqrt(
            Math.pow((vec1[0] - vec2[0]), 2) +
            Math.pow((vec1[1] - vec2[1]), 2) +
            Math.pow((vec1[2] - vec2[2]), 2) +
            Math.pow((vec1[3] - vec2[3]), 2));
    return distance;
}
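Building on that, the usual next step for picking is to treat objNear and objFar as a ray and measure how far a candidate point is from that ray. A sketch of that calculation (the method name and the plain point-to-line distance formula are mine, not from the linked post):
// distance from a 3D point to the ray that runs from 'near' towards 'far'
public float distancePointToRay(float[] point, float[] near, float[] far) {
    // normalized direction of the ray
    float dx = far[0] - near[0], dy = far[1] - near[1], dz = far[2] - near[2];
    float len = (float) Math.sqrt(dx * dx + dy * dy + dz * dz);
    dx /= len; dy /= len; dz /= len;
    // vector from the ray origin to the point
    float px = point[0] - near[0], py = point[1] - near[1], pz = point[2] - near[2];
    // projection of that vector onto the ray direction
    float t = px * dx + py * dy + pz * dz;
    // closest point on the ray, then the distance from it to the point
    float cx = near[0] + t * dx, cy = near[1] + t * dy, cz = near[2] + t * dz;
    float ex = point[0] - cx, ey = point[1] - cy, ez = point[2] - cz;
    return (float) Math.sqrt(ex * ex + ey * ey + ez * ez);
}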

Android OpenGL 3D picking

I'm on Android OpenGL-ES 2.0 and, after all the limitations that come with it, I can't figure out how to map 2D screen touches to the 3D points I have. I can't get the right results.
I'm trying to implement shooting a ray into the point cloud, so that I can compare my points' distances to the ray and find the closest point.
public class OpenGLRenderer extends Activity implements GLSurfaceView.Renderer {
    public PointCloud ptCloud;
    MatrixGrabber mg = new MatrixGrabber();
    ...

    public void onDrawFrame(GL10 gl) {
        gl.glDisable(GL10.GL_COLOR_MATERIAL);
        gl.glDisable(GL10.GL_BLEND);
        gl.glDisable(GL10.GL_LIGHTING);
        // Background drawing
        if (customBackground)
            gl.glClearColor(backgroundRed, backgroundGreen, backgroundBlue, 1.0f);
        else
            gl.glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
        if (PointCloud.doneParsing == true) {
            if (envDone == false)
                setupEnvironment();
            // Clears the screen and depth buffer.
            gl.glClear(GL10.GL_COLOR_BUFFER_BIT | GL10.GL_DEPTH_BUFFER_BIT);
            gl.glMatrixMode(GL10.GL_PROJECTION);
            gl.glLoadIdentity();
            GLU.gluPerspective(gl, 55.0f, (float) screenWidth / (float) screenHeight, 10.0f, 10000.0f);
            gl.glMatrixMode(GL10.GL_MODELVIEW);
            gl.glLoadIdentity();
            GLU.gluLookAt(gl, eyeX, eyeY, eyeZ,
                    centerX, centerY, centerZ,
                    upX, upY, upZ);
            if (pickPointTrigger)
                pickPoint(gl);
            gl.glPushMatrix();
            gl.glTranslatef(_xTranslate, _yTranslate, _zTranslate);
            gl.glTranslatef(centerX, centerY, centerZ);
            gl.glRotatef(_xAngle, 1f, 0f, 0f);
            gl.glRotatef(_yAngle, 0f, 1f, 0f);
            gl.glRotatef(_zAngle, 0f, 0f, 1f);
            gl.glTranslatef(-centerX, -centerY, -centerZ);
            ptCloud.draw(gl);
            gl.glPopMatrix();
        }
    }
}
Here is my picking function. I've set the location to the middle of the screen just for debugging purposes:
public void pickPoint(GL10 gl) {
    mg.getCurrentState(gl);
    double mvmatrix[] = new double[16];
    double projmatrix[] = new double[16];
    int viewport[] = {0, 0, screenWidth, screenHeight};
    for (int i = 0; i < 16; i++) {
        mvmatrix[i] = mg.mModelView[i];
        projmatrix[i] = mg.mProjection[i];
    }
    mg.getCurrentState(gl);
    float realY = ((float) (screenHeight) - pickY);
    float nearCoords[] = { 0.0f, 0.0f, 0.0f, 0.0f };
    float farCoords[] = { 0.0f, 0.0f, 0.0f, 0.0f };
    GLU.gluUnProject(screenWidth / 2, screenHeight / 2, 0.0f, mg.mModelView, 0, mg.mProjection, 0,
            viewport, 0, nearCoords, 0);
    GLU.gluUnProject(screenWidth / 2, screenHeight / 2, 1.0f, mg.mModelView, 0, mg.mProjection, 0,
            viewport, 0, farCoords, 0);
    System.out.println("Near: " + nearCoords[0] + "," + nearCoords[1] + "," + nearCoords[2]);
    System.out.println("Far: " + farCoords[0] + "," + farCoords[1] + "," + farCoords[2]);
    // Plot the points in the scene
    nearMarker.set(nearCoords);
    farMarker.set(farCoords);
    markerOn = true;
    double diffX = nearCoords[0] - farCoords[0];
    double diffY = nearCoords[1] - farCoords[1];
    double diffZ = nearCoords[2] - farCoords[2];
    double rayLength = Math.sqrt(Math.pow(diffX, 2) + Math.pow(diffY, 2) + Math.pow(diffZ, 2));
    System.out.println("rayLength: " + rayLength);
    pickPointTrigger = false;
}
Changing the perspective zNear and zFar doesn't have the expected results: how could the far point of a 1.0-1000.0 perspective be 11 units away?
GLU.gluPerspective(gl, 55.0f, (float) screenWidth / (float) screenHeight, 1.0f ,100.0f);
.....
07-18 11:23:50.430: INFO/System.out(31795): Near: 57.574852,-88.60514,37.272636
07-18 11:23:50.430: INFO/System.out(31795): Far: 0.57574844,0.098602295,0.2700405
07-18 11:23:50.430: INFO/System.out(31795): rayLength: 111.74275719790872
GLU.gluPerspective(gl, 55.0f, (float) width / (float) height, 10.0f , 1000.0f);
...
07-18 11:25:12.420: INFO/System.out(31847): Near: 5.7575016,-7.965394,3.6339219
07-18 11:25:12.420: INFO/System.out(31847): Far: 0.057574987,0.90500546,-0.06634784
07-18 11:25:12.420: INFO/System.out(31847): rayLength: 11.174307289026638
Looking for any suggestions or, hopefully, bugs you see in my code. Much appreciated. I'm bountying as much as I can (this has been a problem for a while).
I'm working on this too - it's a very irritating problem. I have two potential leads:
1. Somehow, the resulting z depends on where the camera is, and not in the way you'd expect. When the camera z is at 0, the resulting z is -1, no matter what winZ is. Up until now I've mainly been looking at the resulting z, so I don't have any exact figures on the other coordinates, but I messed around with my code and your code just now, and I've discovered that the reported ray length increases the farther the camera gets from (0, 0, 0). At (0, 0, 0), the ray length is reported to be 0. An hour or so ago, I gathered a bunch of points (cameraZ, winZ, resultZ) and plugged them into Mathematica. The result seems to indicate a hyperbolic sort of relationship: with one of the variables fixed, the other causes the resulting z to vary linearly, with the rate of change depending on the fixed variable.
2. My second lead is from http://www.gamedev.net/topic/420427-gluunproject-question/; swordfish quotes a formula:
WinZ = (1.0f/fNear-1.0f/fDistance)/(1.0f/fNear-1.0f/fFar)
Now, this doesn't seem to match up with the data I collected, but it's probably worth a look. I think I'm going to see if I can figure out how the math of this thing works and figure out what's wrong. Let me know if you figure anything out. Oh, also, here's the formula fitted to the data I collected:
-0.11072114015496763 - 10.000231721597817 x - 0.0003149873867479971 x^2 - 0.8633277851535017 y + 9.990256062051143 x y + 8.767260632968973*^-9 y^2
Wolfram Alpha plots it like so:
http://www.wolframalpha.com/input/?i=Plot3D[-0.11072114015496763%60+-+10.000231721597817%60+x+-++++0.0003149873867479971%60+x^2+-+0.8633277851535017%60+y+%2B++++9.990256062051143%60+x+y+%2B+8.767260632968973%60*^-9+y^2+%2C+{x%2C+-15%2C++++15}%2C+{y%2C+0%2C+1}]
AHA! Success! As near as I can tell, gluUnProject is just plain broken - or nobody understands how to use it at all. Anyway, I made a function that properly undoes the gluProject function, which appears to be what is really used to draw to the screen in some fashion! Code is as follows:
public float[] unproject(float rx, float ry, float rz) { //TODO Factor in projection matrix
    float[] modelInv = new float[16];
    if (!android.opengl.Matrix.invertM(modelInv, 0, mg.mModelView, 0))
        throw new IllegalArgumentException("ModelView is not invertible.");
    float[] projInv = new float[16];
    if (!android.opengl.Matrix.invertM(projInv, 0, mg.mProjection, 0))
        throw new IllegalArgumentException("Projection is not invertible.");
    float[] combo = new float[16];
    android.opengl.Matrix.multiplyMM(combo, 0, modelInv, 0, projInv, 0);
    float[] result = new float[4];
    float vx = viewport[0];
    float vy = viewport[1];
    float vw = viewport[2];
    float vh = viewport[3];
    float[] rhsVec = {((2 * (rx - vx)) / vw) - 1, ((2 * (ry - vy)) / vh) - 1, 2 * rz - 1, 1};
    android.opengl.Matrix.multiplyMV(result, 0, combo, 0, rhsVec, 0);
    float d = 1 / result[3];
    float[] endResult = {result[0] * d, result[1] * d, result[2] * d};
    return endResult;
}

public float distanceToDepth(float distance) {
    return ((1 / fNear) - (1 / distance)) / ((1 / fNear) - (1 / fFar));
}
It currently assumes the following global variables:
mg - a MatrixGrabber with current matrices
viewport - a float[4] with the viewport ({x, y, width, height})
The variables it takes are equivalent to the ones that gluUnProject was supposed to take. For example:
float[] xyz = {0, 0, 0};
xyz = unproject(mouseX, viewport[3] - mouseY, 1);
This will return the point under the mouse, on the far plane. I also added a function to convert between a specified distance from the camera and its 0-1...representation...thing. Like so:
unproject(mouseX, viewport[3] - mouseY, distanceToDepth(5));
This will return the point under the mouse 5 units from the camera.
I tested this with the method given in the question - I checked the distance between the near plane and the far plane. With fNear of 0.1 and fFar of 100, the distance should be 99.9. I have consistently gotten about 99.8977, regardless of position or orientation of the camera, as far as I can tell. Haha, good to have that figured out. Let me know if you do/don't have any problems with it, or if you want me to rewrite it to take inputs instead of using global variables. Hopefully this helps a few people; I had been wondering about this for a few days before seriously trying to fix it.
Hey, so, having figured out how it's supposed to work, I've figured out what they missed in implementing gluUnProject: they forgot (intended not to, and didn't tell anyone?) to divide by the fourth element of the resulting vector, which normalizes the homogeneous vector. gluProject sets it to 1 before applying matrices, so it needs to be 1 again when you're done undoing them. Long story short, you can actually use gluUnProject, but you need to pass it a float[4] and then divide all the resulting coordinates by the fourth one, like so:
float[] xyzw = {0, 0, 0, 0};
android.opengl.GLU.gluUnProject(rx, ry, rz, mg.mModelView, 0, mg.mProjection, 0, this.viewport, 0, xyzw, 0);
xyzw[0] /= xyzw[3];
xyzw[1] /= xyzw[3];
xyzw[2] /= xyzw[3];
//xyzw[3] /= xyzw[3];
xyzw[3] = 1;
return xyzw;
xyzw should now contain the relevant space coordinates. This seems to work exactly the same as the one I cobbled together. It might be a little bit faster; I think they combined one of the steps.

HUD with shaders (OpenGL ES 2.0)

How do I draw a HUD using shaders on OpenGL ES 2.0?
I have a shader which draws a textured quad on screen; it uses an MVP matrix. The quad has its own vertices, which are independent of the view position and so on (because of the MVP matrix):
Matrix.frustumM(mProjMatrix, 0, -ratio, ratio, -1, 1, 3f, 17);
Matrix.setLookAtM(mVMatrix, 0, 0, 0, -5, 0f, 0f, 0f, 0f, 1.0f, 0.0f);
I'd like to show the same quad in the top right corner (like a button or some other HUD element).
As I understand it, I need to create an ortho matrix instead of using frustumM, but what should I do after that? How should the vertex shader use the vertices of the quad?
OK, you have your ortho matrix and quad, so what's the problem? Translate the modelview matrix of your quad to the desired position (x, y, z = 0), multiply it by the ortho matrix, pass the multiplied matrix to the vertex shader, multiply the vertex position by your matrix, and done :). I am not using any lookAt function in my code to do this; I have my own code for matrix computation (it's partially code from some bada tutorial), and for the projection matrix I have another function:
void
Letter::Ortho(Matrix* result, float fovy, float aspect, float nearZ, float farZ)
{
    GLfloat frustumW, frustumH;
    frustumH = tanf(fovy / 360.0f * PI) * nearZ;
    frustumW = frustumH * aspect;
    Frustum(result, -frustumW, frustumW, -frustumH, frustumH, nearZ, farZ);
}

void
Letter::LoadIdentity(Matrix* result)
{
    memset(result, 0x0, sizeof(Matrix));
    result->m[0][0] = 1.0f;
    result->m[1][1] = 1.0f;
    result->m[2][2] = 1.0f;
    result->m[3][3] = 1.0f;
}

void
Letter::Frustum(Matrix *result, float left, float right, float bottom, float top, float nearZ, float farZ)
{
    float deltaX = right - left;
    float deltaY = top - bottom;
    float deltaZ = farZ - nearZ;
    Matrix frustum;

    if ((nearZ <= 0.0f) || (farZ <= 0.0f) ||
        (deltaX <= 0.0f) || (deltaY <= 0.0f) || (deltaZ <= 0.0f))
    {
        return;
    }

    frustum.m[0][0] = 2.0f * nearZ / deltaX;
    frustum.m[0][1] = frustum.m[0][2] = frustum.m[0][3] = 0.0f;
    frustum.m[1][1] = 2.0f * nearZ / deltaY;
    frustum.m[1][0] = frustum.m[1][2] = frustum.m[1][3] = 0.0f;
    frustum.m[2][0] = (right + left) / deltaX;
    frustum.m[2][1] = (top + bottom) / deltaY;
    frustum.m[2][2] = -(nearZ + farZ) / deltaZ;
    frustum.m[2][3] = -1.0f;
    frustum.m[3][2] = -2.0f * nearZ * farZ / deltaZ;
    frustum.m[3][0] = frustum.m[3][1] = frustum.m[3][3] = 0.0f;
    Multiply(result, &frustum, result);
}
So, with this code:
LoadIdentity(&matPerspective);
Ortho(&matPerspective, 60.0f, TEXMANAGER.aspect, -1.0f, 20.0f);
LoadIdentity(&matModelview);
Translate(&matModelview, x, y, z);
Scale(&matModelview, size);
//Rotate(&matModelview, 0.0f, 1.0f, 0.0f, 1.0f);
Multiply(&posMatrix, &matModelview, &matPerspective);
And pass posMatrix to the shader :)
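For the Android/Java side of the question, a minimal sketch of the same idea with android.opengl.Matrix is shown below. It assumes a GL ES 2.0 setup where the MVP matrix is uploaded as a uniform; screenWidth, screenHeight, mvpHandle and the 80x80-pixel size are placeholder names and values, not from the question:
// orthographic projection in pixel units: (0, 0) is the top-left corner
float[] orthoMatrix = new float[16];
Matrix.orthoM(orthoMatrix, 0, 0, screenWidth, screenHeight, 0, -1, 1);

// model matrix that places a 1x1 unit quad near the top-right corner, 80x80 px
float[] modelMatrix = new float[16];
Matrix.setIdentityM(modelMatrix, 0);
Matrix.translateM(modelMatrix, 0, screenWidth - 100, 20, 0);
Matrix.scaleM(modelMatrix, 0, 80, 80, 1);

// HUD MVP = projection * model; no view/lookAt matrix is needed for a HUD
float[] hudMvpMatrix = new float[16];
Matrix.multiplyMM(hudMvpMatrix, 0, orthoMatrix, 0, modelMatrix, 0);

// the vertex shader only has to multiply the incoming position by this matrix:
//   uniform mat4 uMVPMatrix;
//   attribute vec4 aPosition;
//   void main() { gl_Position = uMVPMatrix * aPosition; }
GLES20.glUniformMatrix4fv(mvpHandle, 1, false, hudMvpMatrix, 0);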
