I have an array containing the heights of the vertices of a terrain map.
When I first draw the terrain it looks fine:
But as I rotate it about the z-axis, parts of the shape seem to be projected behind vertices at the back:
90 degree rotation (z-axis):
~180 degree rotation (z-axis):
Besides my implementation of the map, my code is fairly simple:
Vertex Shader:
attribute vec4 position;
attribute vec4 color;
uniform mat4 matrix;
varying vec4 interpolated_color;
void main() {
gl_Position = matrix * position;
interpolated_color = color;
}
Fragment shader:
precision mediump float;
varying vec4 interpolated_color;
void main(){
gl_FragColor = interpolated_color;
}
Renderer:
public class MapRenderer implements GLSurfaceView.Renderer {
...
@Override
public void onSurfaceCreated(GL10 gl, EGLConfig config) {
GLES20.glClearColor(1.0f, 0.0f,0.0f, 1.0f);
map = mapGen.composeMap(); //gets array with vertices heights
mapView = new MapView(context, map, mapGen);
}
@Override
public void onSurfaceChanged(GL10 gl, int width, int height) {
GLES20.glViewport(0, 0, width, height);
float aspect_ratio = (float) width/height;
Matrix.perspectiveM(projectionMatrix, 0, 45, aspect_ratio, 1f, 10f);
}
@Override
public void onDrawFrame(GL10 gl) {
float[] scratch = new float[16];
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
Matrix.setIdentityM(modelMatrix, 0);
Matrix.translateM(modelMatrix, 0, 0, 0, -4);
Matrix.rotateM(modelMatrix, 0, -cameraAngle, 1, 0, 0); //cameraAngle initialized at 0 changes with user input
Matrix.rotateM(modelMatrix, 0, mapAngle, 0, 0, 1); //mapAngle initialized at 0 changes with user input
Matrix.multiplyMM(scratch, 0, projectionMatrix, 0, modelMatrix, 0);
mapView.draw(scratch);
}
}
MapView Class:
public void draw(float[] mvpMatrix){
int matrix = GLES20.glGetUniformLocation(program, "matrix");
GLES20.glUniformMatrix4fv(matrix, 1, false, mvpMatrix, 0);
//nFaces and facesBuffer are class variables
GLES20.glDrawElements(GLES20.GL_TRIANGLES, nFaces*3, GLES20.GL_UNSIGNED_SHORT, facesBuffer);
}
I tried turning face culling on and off to see if it made any difference, but it did not.
Changing the projection matrix also did not seem to have any effect besides changing the angle at which the error starts to occur. It seems to happen from ~90 degrees up to ~270 degrees when using Matrix.perspectiveM, and exactly at 90 and 270 when using Matrix.orthoM.
I also checked whether OpenGL returned any errors with glGetError() and did not get anything.
My vertices are sorted in the buffer sequentially, from the one located at (-1, 1, 0) to the last one located at (1, -1, 0). I don't know if that could cause this issue or, even if it did, how I could solve it in OpenGL ES 2 to support rotation about the z-axis.
The depth test needs to be enabled in order for OpenGL to take distance into consideration.
On the Renderer:
public void onSurfaceCreated(GL10 gl, EGLConfig config) {
GLES20.glClearColor(1.0f, 0.0f,0.0f, 1.0f);
GLES20.glEnable(GLES20.GL_DEPTH_TEST);
map = mapGen.composeMap(); //gets array with vertices heights
mapView = new MapView(context, map, mapGen);
}
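Note that enabling the depth test only helps if the surface actually has a depth buffer. Most GLSurfaceView setups get one by default, but if yours doesn't, you can request it explicitly where the view is created. A minimal sketch (the Activity-side setup isn't shown in the question, and the MapRenderer constructor is assumed):
// In the Activity, before setRenderer(): ask EGL for a config with
// RGBA8888 color and a 16-bit depth buffer.
GLSurfaceView glSurfaceView = new GLSurfaceView(this);
glSurfaceView.setEGLContextClientVersion(2);
glSurfaceView.setEGLConfigChooser(8, 8, 8, 8, 16, 0);
glSurfaceView.setRenderer(new MapRenderer(this)); // assumed constructor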
Related
Using Android Studio, my code renders an array of floats as a texture passed to GLSL, with one float per texel in the range 0 to 1, like a grayscale texture. For that I use GL_LUMINANCE as the internalFormat and format for glTexImage2D, and GL_FLOAT for type. Running the app on an Android emulator (which uses my PC's GPU) works fine, but on a real device (Samsung Galaxy S7) calling glTexImage2D gives error 1282, GL_INVALID_OPERATION. I thought it might be a problem with non-power-of-two textures, but the width and height are certainly powers of two.
The code uses Jos Stam's fluid simulation C code (compiled with the NDK, not ported), which outputs density values for a grid.
mSizeN is the width (same as height) of the fluid simulation grid, although 2 is added to it by the fluid sim for boundary conditions, so the width of the array returned is mSizeN + 2; 128 in this case.
The coordinate system is set up as an orthographic projection with 0.0,0.0 the top left of the screen, 1.0,1.0 is the bottom right. I just draw a full screen quad and use the interpolated position across the quad in GLSL as texture coordinates to the array containing density values. Nice easy way to render it.
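The vertex shader isn't shown in this post; a plausible sketch, inferred from the u_ProjectionMatrix and a_Position names used in the renderer below, would be:
// Forwards the quad-local position so the fragment stage can use it
// as a texture coordinate (the quad spans 0..1 in x and y).
uniform mat4 u_ProjectionMatrix;
attribute vec4 a_Position;
varying vec4 v_Position;
void main() {
    v_Position = a_Position;
    gl_Position = u_ProjectionMatrix * a_Position;
}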
This is the renderer class.
public class GLFluidsimRenderer implements GLWallpaperService.Renderer {
private final String TAG = "GLFluidsimRenderer";
private FluidSolver mFluidSolver = new FluidSolver();
private float[] mProjectionMatrix = new float[16];
private final FloatBuffer mFullScreenQuadVertices;
private Context mActivityContext;
private int mProgramHandle;
private int mProjectionMatrixHandle;
private int mDensityArrayHandle;
private int mPositionHandle;
private int mGridSizeHandle;
private final int mBytesPerFloat = 4;
private final int mStrideBytes = 3 * mBytesPerFloat;
private final int mPositionOffset = 0;
private final int mPositionDataSize = 3;
private int mDensityTexId;
public static int mSizeN = 126;
public GLFluidsimRenderer(final Context activityContext) {
mActivityContext = activityContext;
final float[] fullScreenQuadVerticesData = {
0.0f, 0.0f, 0.0f,
0.0f, 1.0f, 0.0f,
1.0f, 0.0f, 0.0f,
1.0f, 1.0f, 0.0f,
};
mFullScreenQuadVertices = ByteBuffer.allocateDirect(fullScreenQuadVerticesData.length * mBytesPerFloat)
.order(ByteOrder.nativeOrder()).asFloatBuffer();
mFullScreenQuadVertices.put(fullScreenQuadVerticesData).position(0);
}
public void onTouchEvent(MotionEvent event) {
}
@Override
public void onSurfaceCreated(GL10 glUnused, EGLConfig config) {
GLES20.glDisable(GLES20.GL_DEPTH_TEST);
GLES20.glClearColor(0.5f, 0.5f, 0.5f, 0.5f);
String vertShader = AssetReader.getStringAsset(mActivityContext, "fluidVertShader");
String fragShader = AssetReader.getStringAsset(mActivityContext, "fluidFragDensityShader");
final int vertexShaderHandle = ShaderHelper.compileShader(GLES20.GL_VERTEX_SHADER, vertShader);
final int fragmentShaderHandle = ShaderHelper.compileShader(GLES20.GL_FRAGMENT_SHADER, fragShader);
mProgramHandle = ShaderHelper.createAndLinkProgram(vertexShaderHandle, fragmentShaderHandle,
new String[] {"a_Position"});
mDensityTexId = TextureHelper.loadTextureLumF(mActivityContext, null, mSizeN + 2, mSizeN + 2);
}
@Override
public void onSurfaceChanged(GL10 glUnused, int width, int height) {
mFluidSolver.init(width, height, mSizeN);
GLES20.glViewport(0, 0, width, height);
Matrix.setIdentityM(mProjectionMatrix, 0);
Matrix.orthoM(mProjectionMatrix, 0, 0.0f, 1.0f, 1.0f, 0.0f, 0.0f, 1.0f);
}
@Override
public void onDrawFrame(GL10 glUnused) {
GLES20.glClear(GLES20.GL_DEPTH_BUFFER_BIT | GLES20.GL_COLOR_BUFFER_BIT);
GLES20.glUseProgram(mProgramHandle);
mFluidSolver.step();
TextureHelper.updateTextureLumF(mFluidSolver.get_density(), mDensityTexId, mSizeN + 2, mSizeN + 2);
mProjectionMatrixHandle = GLES20.glGetUniformLocation(mProgramHandle, "u_ProjectionMatrix");
mDensityArrayHandle = GLES20.glGetUniformLocation(mProgramHandle, "u_aDensity");
mGridSizeHandle = GLES20.glGetUniformLocation(mProgramHandle, "u_GridSize");
mPositionHandle = GLES20.glGetAttribLocation(mProgramHandle, "a_Position");
double start = System.nanoTime();
drawQuad(mFullScreenQuadVertices);
double end = System.nanoTime();
}
private void drawQuad(final FloatBuffer aQuadBuffer) {
// Pass in the position information
aQuadBuffer.position(mPositionOffset);
GLES20.glVertexAttribPointer(mPositionHandle, mPositionDataSize, GLES20.GL_FLOAT, false,
mStrideBytes, aQuadBuffer);
GLES20.glEnableVertexAttribArray(mPositionHandle);
// Attach density array to texture unit 0
GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, mDensityTexId);
GLES20.glUniform1i(mDensityArrayHandle, 0);
// Pass in the actual size of the grid.
GLES20.glUniform1i(mGridSizeHandle, mSizeN + 2);
GLES20.glUniformMatrix4fv(mProjectionMatrixHandle, 1, false, mProjectionMatrix, 0);
GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);
}
}
Here's the texture helper functions.
public static int loadTextureLumF(final Context context, final float[] data, final int width, final int height) {
final int[] textureHandle = new int[1];
GLES20.glGenTextures(1, textureHandle, 0);
if (textureHandle[0] != 0) {
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureHandle[0]);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_NEAREST);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_NEAREST);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
GLES20.glPixelStorei(GLES20.GL_UNPACK_ALIGNMENT, 1);
GLES20.glPixelStorei(GLES20.GL_PACK_ALIGNMENT, 1);
GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_LUMINANCE,
(int) width, (int) height, 0, GLES20.GL_LUMINANCE, GLES20.GL_FLOAT,
(data != null ? FloatBuffer.wrap(data) : null));
}
if (textureHandle[0] == 0)
throw new RuntimeException("Error loading texture.");
return textureHandle[0];
}
public static void updateTextureLumF(final float[] data, final int texId, final int w, final int h) {
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, texId);
GLES20.glTexSubImage2D(GLES20.GL_TEXTURE_2D, 0, 0, 0, (int)w, (int)h, GLES20.GL_LUMINANCE, GLES20.GL_FLOAT, (data != null ? FloatBuffer.wrap(data) : null));
}
Fragment shader.
precision mediump float;
uniform sampler2D u_aDensity;
uniform int u_GridSize;
varying vec4 v_Color;
varying vec4 v_Position;
void main()
{
gl_FragColor = texture2D(u_aDensity, vec2(v_Position.x, v_Position.y));
}
Is the combination of GL_FLOAT and GL_LUMINANCE unsupported in OpenGL ES 2?
Android emulator pic:
Edit:
To add, am I right in saying that each floating point value will be reduced to an 8-bit integer component when transferred with glTexImage2D (etc.), so the majority of the floating point precision will be lost? In that case, it might be best to rethink the implementation of the simulator to output fixed point. That can be done easily; Stam even describes it in his paper.
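A sketch of what that quantization could look like on the Java side (a hypothetical helper, not part of the code above): clamp each density to [0, 1], scale to an unsigned byte, then upload the result with GL_UNSIGNED_BYTE instead of GL_FLOAT.
// Hypothetical helper: quantize the solver's float densities to bytes.
static byte[] toLuminanceBytes(float[] density) {
    byte[] out = new byte[density.length];
    for (int i = 0; i < density.length; i++) {
        float v = Math.max(0f, Math.min(1f, density[i]));
        out[i] = (byte) (v * 255f + 0.5f); // round to nearest 8-bit level
    }
    return out;
}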
Table 3.4 of the spec shows the "Valid pixel format and type combinations" for use with glTexImage2D. For GL_LUMINANCE, the only option is GL_UNSIGNED_BYTE.
OES_texture_float is the relevant extension you'd need to check for.
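You can check for it at runtime once a GL context is current, for example in onSurfaceCreated:
// Must run on the GL thread with a current context.
String extensions = GLES20.glGetString(GLES20.GL_EXTENSIONS);
boolean hasFloatTextures =
        extensions != null && extensions.contains("OES_texture_float");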
An alternative approach which would work on more devices is to pack your data into the multiple channels of an RGBA texture. Here is some discussion about packing a float value into an 8888. Note, however, that not all OpenGL ES 2 devices even support 8888 render targets; you might have to pack into a 4444.
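For illustration, the usual GLSL-side decode of a value packed across the four bytes of an RGBA8 texel looks roughly like this (a sketch using the names from the fragment shader above; the linked discussion covers the matching encode and its edge cases):
// Each channel holds a successive base-255 "digit" of the value, so a
// dot product reconstructs an approximate float in [0, 1).
vec4 texel = texture2D(u_aDensity, vec2(v_Position.x, v_Position.y));
float value = dot(texel, vec4(1.0, 1.0/255.0, 1.0/65025.0, 1.0/16581375.0));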
Or you could use OpenGL ES 3. Android is up to 61.3% support for OpenGL ES 3, according to this.
EDIT: On re-reading more carefully, there probably isn't any benefit in using anything higher than an 8-bit texture, because when you write the texture to gl_FragColor in your fragment shader you are copying into a 565 or 8888 framebuffer, so any extra precision is lost at that point anyway.
I'm trying to draw a simple line drawing connecting several vertices in OpenGL ES. However, the line is drawn inverted or in a different position from where it should be drawn. I've attached the class for the line drawing below
ConnectingPath.java
--------------------
public class ConnectingPath {
int positionBufferId;
PointF[] verticesList;
public float vertices[];
public FloatBuffer vertexBuffer;
public ConnectingPath(LinkedList<PointF> verticesList, float[] colors)
{
List<PointF> tempCorners = verticesList;
int i = 0;
this.verticesList = new PointF[tempCorners.size()];
for (PointF corner : tempCorners) {
this.verticesList[i++] = corner;
}
}
public float[] getTransformedVertices()
{
float z;
List<Float> finalVertices = new ArrayList<Float>();
finalVertices.clear();
for(PointF point : verticesList){
finalVertices.add(point.x);
finalVertices.add(point.y);
finalVertices.add(0.0f);
}
int i = 0;
float[] verticesArray = new float[finalVertices.size()];
for (Float f : finalVertices) {
verticesArray[i++] = (f != null ? f : Float.NaN);
}
return verticesArray;
}
public void initBooth(){
vertices = this.getTransformedVertices();
for(Float f : vertices){
Log.d("Mapsv3--", f + "");
}
ByteBuffer bb = ByteBuffer.allocateDirect(vertices.length * 4);
bb.order(ByteOrder.nativeOrder());
vertexBuffer = bb.asFloatBuffer();
vertexBuffer.put(vertices);
vertexBuffer.position(0);
int[] buffers = new int[1];
GLES11.glGenBuffers(1, buffers, 0);
GLES11.glBindBuffer(GLES11.GL_ARRAY_BUFFER, buffers[0]);
GLES11.glBufferData(GLES11.GL_ARRAY_BUFFER, 4 * vertices.length, vertexBuffer, GLES11.GL_STATIC_DRAW);
positionBufferId = buffers[0];
}
public void Render(GL10 gl){
GLES11.glPushMatrix();
GLES11.glBindBuffer(GLES11.GL_ARRAY_BUFFER, positionBufferId);
GLES11.glEnableClientState(GL10.GL_VERTEX_ARRAY);
GLES11.glVertexPointer(3, GL10.GL_FLOAT, 0, 0);
GLES11.glBindBuffer(GLES11.GL_ARRAY_BUFFER, 0);
GLES11.glFrontFace(GL10.GL_CW);
GLES11.glLineWidth(10.0f);
GLES11.glColor4f(0.0f,0.0f,0.0f,1.0f);
GLES11.glDrawArrays(GL10.GL_LINE_STRIP, 0, verticesList.length);
GLES11.glDisableClientState(GL10.GL_VERTEX_ARRAY);
GLES11.glPopMatrix();
}
}
Drawing code :
Renderer.java
--------------
// Variables here
public void onSurfaceChanged(GL10 gl, int width, int height) {
viewWidth = width;
viewHeight = height;
}
public void onSurfaceCreated(GL10 gl, EGLConfig config) {
gl.glEnable(GL10.GL_TEXTURE_2D); //Enable Texture Mapping
gl.glShadeModel(GL10.GL_SMOOTH); //Enable Smooth Shading
gl.glClearColor(1.0f, 1.0f, 1.0f, 1.0f); //White Background
gl.glClearDepthf(1.0f); //Depth Buffer Setup
gl.glEnable(GL10.GL_DEPTH_TEST); //Enables Depth Testing
gl.glDepthFunc(GL10.GL_LEQUAL);
gl.glHint(GL10.GL_PERSPECTIVE_CORRECTION_HINT, GL10.GL_NICEST);
}
public void onDrawFrame(GL10 gl) {
gl.glClear(GL10.GL_COLOR_BUFFER_BIT | GL10.GL_DEPTH_BUFFER_BIT);
gl.glMatrixMode(GL10.GL_PROJECTION);
gl.glLoadIdentity();
GLU.gluOrtho2D(gl, -viewWidth/2, viewWidth/2, -viewHeight/2,viewHeight/2);
gl.glTranslatef(center.x,center.y,0);
gl.glMatrixMode(GL10.GL_MODELVIEW);
gl.glLoadIdentity();
gl.glTranslatef(0,0, 0);
gl.glColor4f(1.0f, 1.0f, 1.0f, 1.0f);
gl.glEnable(GL10.GL_CULL_FACE);
gl.glCullFace(GL10.GL_FRONT);
if(connectingPath!=null){
connectingPath.Render(gl);
}
gl.glDisable(GL10.GL_CULL_FACE);
gl.glLoadIdentity();
}
Screenshot:
The drawing in OpenGL seems to be inverted for you due to the way OpenGL defines its screen coordinates. In contrast to most 2D drawing APIs, the origin is located in the bottom left corner, which means that y axis values increase when moving upwards. A very nice explanation is available in the OpenGL common pitfalls (number 12):
Given a sheet of paper, people write from the top of the page to the bottom. The origin for writing text is at the upper left-hand margin of the page (at least in European languages). However, if you were to ask any decent math student to plot a few points on an X-Y graph, the origin would certainly be at the lower left-hand corner of the graph. Most 2D rendering APIs mimic writers and use a 2D coordinate system where the origin is in the upper left-hand corner of the screen or window (at least by default). On the other hand, 3D rendering APIs adopt the mathematically minded convention and assume a lower left-hand origin for their 3D coordinate systems.
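If you want the top-left origin that 2D APIs use, one simple fix (a sketch based on the gluOrtho2D call in the renderer above) is to flip the vertical range of the projection so that y increases downwards:
// Swapping the bottom and top arguments mirrors the y axis.
GLU.gluOrtho2D(gl, -viewWidth/2, viewWidth/2, viewHeight/2, -viewHeight/2);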
Edit: Code added, please see below.
Edit 2: Screenshots from device included at the bottom, along with an explanation.
Edit 3: New code added.
I have 2 classes, a renderer and a custom 'quad' class.
I have these declared at class level in my renderer class:
final float[] mMVPMatrix = new float[16];
final float[] mProjMatrix = new float[16];
final float[] mVMatrix = new float[16];
And in my onSurfaceChanged method I have:
@Override
public void onSurfaceChanged(GL10 gl, int width, int height) {
GLES20.glViewport(0, 0, width, height);
float ratio = (float) width / height;
Matrix.frustumM(mProjMatrix, 0, -ratio, ratio, -1, 1, 3, 7);
}
and....
public void onSurfaceCreated(GL10 gl, EGLConfig config) {
// TODO Auto-generated method stub
myBitmap = BitmapFactory.decodeResource(curView.getResources(), R.drawable.box);
//Create new Dot objects
dot1 = new Quad();
dot1.setTexture(curView, myBitmap);
dot1.setSize(300,187); //These numbers are the size but are redundant/not used at the moment.
myBitmap.recycle();
//Set colour to black
GLES20.glClearColor(0, 0, 0, 1);
}
And finally from this class, onDrawFrame:
@Override
public void onDrawFrame(GL10 gl) {
// TODO Auto-generated method stub
//Paint the screen the colour defined in onSurfaceCreated
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
// Set the camera position (View matrix) so looking from the front
Matrix.setLookAtM(mVMatrix, 0, 0, 0, 3, 0f, 0f, 0f, 0f, 1.0f, 0.0f);
// Combine
Matrix.multiplyMM(mMVPMatrix, 0, mProjMatrix, 0, mVMatrix, 0);
dot1.rotateQuad(0,0,45, mMVPMatrix); //x,y,angle and matrix passed in
}
Then, in my quad class:
This declared at class level:
private float[] mRotationMatrix = new float[16];
private final float[] mMVPMatrix = new float[16];
private final float[] mProjMatrix = new float[16];
private final float[] mVMatrix = new float[16];
private int mMVPMatrixHandle;
private int mPositionHandle;
private int mRotationHandle;
//Create our vertex shader
String strVShader =
"uniform mat4 uMVPMatrix;" +
"uniform mat4 uRotate;" +
"attribute vec4 a_position;\n"+
"attribute vec2 a_texCoords;" +
"varying vec2 v_texCoords;" +
"void main()\n" +
"{\n" +
// "gl_Position = a_position * uRotate;\n"+
// "gl_Position = uRotate * a_position;\n"+
"gl_Position = a_position * uMVPMatrix;\n"+
// "gl_Position = uMVPMatrix * a_position;\n"+
"v_texCoords = a_texCoords;" +
"}";
//Fragment shader
String strFShader =
"precision mediump float;" +
"varying vec2 v_texCoords;" +
"uniform sampler2D u_baseMap;" +
"void main()" +
"{" +
"gl_FragColor = texture2D(u_baseMap, v_texCoords);" +
"}";
Then the method for setting the texture (I don't think this is relevant to this problem, though!):
public void setTexture(GLSurfaceView view, Bitmap imgTexture){
this.imgTexture=imgTexture;
iProgId = Utils.LoadProgram(strVShader, strFShader);
iBaseMap = GLES20.glGetUniformLocation(iProgId, "u_baseMap");
iPosition = GLES20.glGetAttribLocation(iProgId, "a_position");
iTexCoords = GLES20.glGetAttribLocation(iProgId, "a_texCoords");
texID = Utils.LoadTexture(view, imgTexture);
}
And finally, my 'rotateQuad' method (which currently is supposed to draw and rotate the quad).
public void rotateQuad(float x, float y, int angle, float[] mvpMatrix){
Matrix.setRotateM(mRotationMatrix, 0, angle, 0, 0, 0.1f);
// Matrix.translateM(mRotationMatrix, 0, 0, 0, 0); //Removed temporarily
// Combine the rotation matrix with the projection and camera view
Matrix.multiplyMM(mvpMatrix, 0, mRotationMatrix, 0, mvpMatrix, 0);
float[] vertices = {
-.5f,.5f,0, 0,0,
.5f,.5f,0, 1,0,
-.5f,-.5f,0, 0,1,
.5f,-.5f,0, 1,1
};
vertexBuf = ByteBuffer.allocateDirect(vertices.length * 4).order(ByteOrder.nativeOrder()).asFloatBuffer();
vertexBuf.put(vertices).position(0);
//Bind the correct texture
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, texID);
//Use program
GLES20.glUseProgram(iProgId);
// get handle to shape's transformation matrix
mMVPMatrixHandle = GLES20.glGetUniformLocation(iProgId, "uMVPMatrix");
// Apply the projection and view transformation
GLES20.glUniformMatrix4fv(mMVPMatrixHandle, 1, false, mvpMatrix, 0);
// get handle to shape's rotation matrix
mRotationHandle = GLES20.glGetUniformLocation(iProgId, "uRotate");
// Apply the projection and view transformation
GLES20.glUniformMatrix4fv(mRotationHandle, 1, false, mRotationMatrix, 0);
//Set starting position for vertices
vertexBuf.position(0);
//Specify attributes for vertex
GLES20.glVertexAttribPointer(iPosition, 3, GLES20.GL_FLOAT, false, 5 * 4, vertexBuf);
//Enable attribute for position
GLES20.glEnableVertexAttribArray(iPosition);
//Set starting position for texture
vertexBuf.position(3);
//Specify attributes for vertex
GLES20.glVertexAttribPointer(iTexCoords, 2, GLES20.GL_FLOAT, false, 5 * 4, vertexBuf);
//Enable attribute for texture
GLES20.glEnableVertexAttribArray(iTexCoords);
//Draw it
GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);
}
For Edit 2:
This is my quad as drawn in the center of the screen. No rotation.
This is the same quad rotated at +45 degrees with the line "gl_Position = a_position * uMVPMatrix;" in my vertex shader (it's from a different project now, so the shader variable is a_position and not vPosition); it looks correct!
However, this is the same quad rotated at +45 degrees with the 2 shader variables switched (so the line reads "gl_Position = uMVPMatrix * a_position;") - as you can see, it's not quite right.
Also, just a side note: you can't see it here as the square is symmetrical, but each method also rotates in the opposite direction to the other....
Any help appreciated.
It's really impossible to tell because we don't know what you are passing to these two variables.
OpenGL uses column-major format, so if vPosition is in fact a vector and uMVPMatrix is a matrix, then the first option is correct, if this is in your shader.
If this is not in your shader but in your program code, then there is not enough information.
If you are using the first option but getting unexpected results, you are likely not computing your matrix properly or not passing the correct vertices.
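A quick standalone way to see Android's column-major layout for yourself (a sketch using android.opengl.Matrix; the Log call is only there to print the result):
float[] m = new float[16];
Matrix.setIdentityM(m, 0);
Matrix.translateM(m, 0, 2f, 3f, 4f);
// Column-major storage: the translation lands in elements 12, 13, 14.
Log.d("Matrix", m[12] + ", " + m[13] + ", " + m[14]); // 2.0, 3.0, 4.0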
Normally in the vertex shader you should multiply the positions by the MVP, that is:
gl_Position = uMVPMatrix * vPosition;
When you change the order, this should work...
Thanks to all for the help.
I managed to track down the problem (for the most part). I will show what I did.
It was the following line:
Matrix.multiplyMM(mvpMatrix, 0, mvpMatrix, 0, mRotationMatrix, 0);
As you can see, I was multiplying the matrices and storing the result back into one of the matrices used in the multiplication.
So I created a new matrix called mvpMatrix2 and stored the result in that, then passed that to my vertex shader.
//Multiply matrices
Matrix.multiplyMM(mvpMatrix2, 0, mvpMatrix, 0, mRotationMatrix, 0);
//get handle to shape's transformation matrix
mMVPMatrixHandle = GLES20.glGetUniformLocation(iProgId, "uMVPMatrix");
//Give to vertex shader variable
GLES20.glUniformMatrix4fv(mMVPMatrixHandle, 1, false, mvpMatrix2, 0);
After applying this, there is no distortion (and also, with regard to my other question here, Using Matrix.Rotate in OpenGL ES 2.0, I am able to translate the centre of the quad). I say 'for the most part' because when I rotate it, it rotates backwards: if I say rotate +45 degrees (clockwise), it actually rotates the quad by -45 degrees (anti-clockwise).
But hopefully, this will help anyone who has a similar problem in the future.
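For reference, a condensed sketch of the corrected matrix handling (variable names as in the code above; negating the angle is an assumption, to make a positive argument rotate clockwise on screen):
float[] mvpMatrix2 = new float[16];
// multiplyMM must not write its result into one of its own inputs,
// hence the separate output matrix.
Matrix.setRotateM(mRotationMatrix, 0, -angle, 0, 0, 1f);
Matrix.multiplyMM(mvpMatrix2, 0, mvpMatrix, 0, mRotationMatrix, 0);
GLES20.glUniformMatrix4fv(mMVPMatrixHandle, 1, false, mvpMatrix2, 0);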
Are there any special emulator settings needed to run OpenGL apps?
I have already set the "GPU emulation" property to "yes".
I am trying to run an Android sample live wallpaper, using the sample source found at this link. The desired output is a rotating triangle.
After a little effort I got the app running, but it doesn't draw anything in the emulator; when I tested it on a device it works. In the emulator it still just shows a green screen. I found a discussion on it in Google Groups here and tried to set the viewport as suggested there, but it still doesn't show any result. In onSurfaceChanged I added this line:
gl.glViewport(0, 0, width, height);
Is this the correct way to set the viewport?
This is my renderer class:
public class MyRenderer implements GLWallpaperService.Renderer {
GLTriangle mTriangle;
public void onDrawFrame(GL10 gl) {
gl.glClearColor(0.2f, 0.4f, 0.2f, 1f);
gl.glClear(GL10.GL_COLOR_BUFFER_BIT | GL10.GL_DEPTH_BUFFER_BIT);
gl.glMatrixMode(GL10.GL_MODELVIEW);
autoRotate(gl);
gl.glColor4f(.2f, 0f, .5f, 1f);
mTriangle.draw(gl);
}
public void onSurfaceChanged(GL10 gl, int width, int height) {
gl.glViewport(0, 0, width, height);
gl.glMatrixMode(GL10.GL_PROJECTION);
gl.glLoadIdentity();
GLU.gluPerspective(gl, 60f, (float)width/(float)height, 1f, 100f);
gl.glMatrixMode(GL10.GL_MODELVIEW);
gl.glLoadIdentity();
gl.glTranslatef(0, 0, -5);
}
public void onSurfaceCreated(GL10 gl, EGLConfig config) {
mTriangle = new GLTriangle();
gl.glClearDepthf(1f);
gl.glEnable(GL10.GL_DEPTH_TEST);
gl.glDepthFunc(GL10.GL_LEQUAL);
}
/**
* Called when the engine is destroyed. Do any necessary clean up because
* at this point your renderer instance is now done for.
*/
public void release() {
}
private void autoRotate(GL10 gl) {
gl.glRotatef(1, 0, 1, 0);
gl.glRotatef(0.5f, 1, 0, 0);
}
}
Here is the GLTriangle class:
import java.nio.FloatBuffer;
import java.nio.ShortBuffer;
import javax.microedition.khronos.opengles.GL10;
public class GLTriangle {
private FloatBuffer _vertexBuffer;
private final int _nrOfVertices = 3;
private ShortBuffer _indexBuffer;
public GLTriangle() {
init();
}
private void init() {
// We use ByteBuffer.allocateDirect() to get memory outside of
// the normal, garbage collected heap. I think this is done
// because the buffer is subject to native I/O.
// See http://download.oracle.com/javase/1.4.2/docs/api/java/nio/ByteBuffer.html#direct
// 3 is the number of coordinates to each vertex.
_vertexBuffer = BufferFactory.createFloatBuffer(_nrOfVertices * 3);
_indexBuffer = BufferFactory.createShortBuffer(_nrOfVertices);
// Coordinates for the vertices of the triangle.
float[] coords = {
-1f, -1f, 0f, // (x1, y1, z1)
1f, -1f, 0f, // (x2, y2, z2)
0f, 1f, 0f // (x3, y3, z3)
};
short[] _indicesArray = {0, 1, 2};
_vertexBuffer.put(coords);
_indexBuffer.put(_indicesArray);
_vertexBuffer.position(0);
_indexBuffer.position(0);
}
public void draw(GL10 gl) {
// 3 coordinates in each vertex
// 0 is the space between each vertex. They are densely packed
// in the array, so the value is 0
gl.glVertexPointer(3, GL10.GL_FLOAT, 0, getVertexBuffer());
// Draw the primitives, in this case, triangles.
gl.glDrawElements(GL10.GL_TRIANGLES, _nrOfVertices, GL10.GL_UNSIGNED_SHORT, _indexBuffer);
}
private FloatBuffer getVertexBuffer() {
return _vertexBuffer;
}
}
What's going wrong here? Is there better sample code for an OpenGL live wallpaper?
At last, I found it.
What I need to do is just add
gl.glEnableClientState(GL10.GL_VERTEX_ARRAY);
to the onSurfaceCreated method, along with the line
gl.glViewport(0, 0, width, height);
in the onSurfaceChanged method of the MyRenderer class.
I found a similar question on Stack Overflow itself [but the solution that worked for me is not marked as correct :( ]
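Putting both fixes together, the relevant parts of MyRenderer end up looking like this (a condensed sketch):
public void onSurfaceCreated(GL10 gl, EGLConfig config) {
    mTriangle = new GLTriangle();
    gl.glEnableClientState(GL10.GL_VERTEX_ARRAY); // the missing line
    gl.glClearDepthf(1f);
    gl.glEnable(GL10.GL_DEPTH_TEST);
    gl.glDepthFunc(GL10.GL_LEQUAL);
}
public void onSurfaceChanged(GL10 gl, int width, int height) {
    gl.glViewport(0, 0, width, height); // set the viewport on every size change
    // ...projection setup as before...
}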
I'm trying to make my first app based on OpenGL. I'm trying to draw a triangle, and when running the app it just displays a black screen with no triangle.
1 - I don't know where my mistake is.
2 - Is there any good book/tutorial on OpenGL ES for Android beginners?
Triangle Class:
public class Triangle {
private FloatBuffer vertxBuffer;
protected static byte indices[] = {
//Face definition:
0,1,3, //lower-right triangle of the face is drawn with vertices vertices[0]->vertices[1]->vertices[3] (->vertices[0])
0,3,2 //upper-right triangle of the face is drawn with vertices vertices[0]->vertices[3]->vertices[2] (->vertices[0])
};
public float vertices[] = {
-0.5f, -0.5f, 0.0f,
0.5f, -0.5f, 0.0f,
0.0f, 0.5f, 0.0f
};
public Triangle() {
// float has 4 bytes, so we allocate for each coordinate 4 bytes.
//what is the difference between ByteBuffer.allocateDirect AND ByteBuffer.allocate???
ByteBuffer vertexByteBuffer = ByteBuffer.allocateDirect(vertices.length * 4);
vertexByteBuffer.order(ByteOrder.nativeOrder());
// allocate the memory from the byte buffer
vertxBuffer = vertexByteBuffer.asFloatBuffer();
//fill the vertex buffer with the vertices
vertxBuffer.put(vertices);
// set the cursor position to the beginning of the buffer
vertexByteBuffer.position(0);
}
protected static ByteBuffer indexBuffer;
static {
indexBuffer = ByteBuffer.allocateDirect(indices.length);
indexBuffer.put(indices);
indexBuffer.position(0);
}
public void draw(GL10 gl) {
// Because we store the Triangle vertices " Coordinates " in a FloatBuffer
// we need to enable OpenGL to read from it.
gl.glEnableClientState(GL10.GL_VERTEX_ARRAY);
//set the color for the Triangle (r, g, b, alpha) alpha is between 0-1
gl.glColor4f(2.0f, 1.0f, 0.0f, 0.5f);
// point to our vertex buffer to extract the vertices from it.
//(numberOfVertices, Which type of data the buffer Holds, offset, our Buffer containing the Vertices)
gl.glVertexPointer(3, GL10.GL_FLOAT, 0, vertxBuffer);
//draw the vertices as triangle strip
gl.glDrawArrays(GL10.GL_TRIANGLE_STRIP, 0, vertices.length/3 );
gl.glDrawElements(gl.GL_TRIANGLES, indices.length, gl.GL_UNSIGNED_BYTE, indexBuffer);
//disable the client state before leaving
gl.glDisableClientState(GL10.GL_VERTEX_ARRAY);
}
}
GLRenderer Class:
public class GLRenderer implements Renderer{
private Triangle triangle;
public GLRenderer() {
this.triangle = new Triangle();
}
@Override
public void onDrawFrame(GL10 gl) {
// TODO Auto-generated method stub
// clear screen and depth buffer
gl.glClear(GL10.GL_COLOR_BUFFER_BIT | GL10.GL_DEPTH_BUFFER_BIT);
// reset the mode view matrix
gl.glLoadIdentity();
// Drawing
gl.glTranslatef(0.0f, 0.0f, -5.0f);
triangle.draw(gl);
}
@Override
public void onSurfaceChanged(GL10 gl, int width, int height) {
// TODO Auto-generated method stub
if (height == 0) {
height = 1;
}
//Reset the current View Port
gl.glViewport(0, 0, width, height);
//Select the Projection Matrix
gl.glMatrixMode(GL10.GL_PROJECTION);
// Reset the Projection Matrix
gl.glLoadIdentity();
// calculate the aspect ratio of the window
GLU.gluPerspective(gl, 45.0f, (float) width/(float) height, 0.1f , 100.0f);
gl.glMatrixMode(GL10.GL_MODELVIEW);
gl.glLoadIdentity();
}
@Override
public void onSurfaceCreated(GL10 gl, EGLConfig config) {
// TODO Auto-generated method stub
gl.glClearColor(0, 0, 0, 1.0f);
gl.glClearDepthf(1.0f);
gl.glEnable(GL10.GL_DEPTH_TEST);
gl.glDepthFunc(GL10.GL_LEQUAL);
gl.glHint(GL10.GL_PERSPECTIVE_CORRECTION_HINT, GL10.GL_NICEST);
gl.glShadeModel(GL10.GL_SMOOTH);
gl.glDisable(GL10.GL_DITHER);
}
}
OpenGLRenderActivity Class:
public class OpenGLRenderActivity extends Activity {
/** Called when the activity is first created. */
private GLSurfaceView gLSurfaceView;
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.main);
//requestWindowFeature(Window.FEATURE_NO_TITLE);
getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN,
WindowManager.LayoutParams.FLAG_FULLSCREEN);
gLSurfaceView = new GLSurfaceView(this);
gLSurfaceView.setRenderer(new GLRenderer());
setContentView(gLSurfaceView);
}
@Override
protected void onResume() {
super.onResume();
gLSurfaceView.onResume();
}
@Override
protected void onPause() {
super.onPause();
gLSurfaceView.onPause();
}
}
I think you need to declare the indices for your vertices (in order to compose the lines and faces):
protected static byte indices[] = {
//Face definition:
0,1,3, //lower-right triangle of the face is drawn with vertices vertices[0]->vertices[1]->vertices[3] (->vertices[0])
0,3,2 //upper-right triangle of the face is drawn with vertices vertices[0]->vertices[3]->vertices[2] (->vertices[0])
};
protected static ByteBuffer indexBuffer;
static {
indexBuffer = ByteBuffer.allocateDirect(indices.length);
indexBuffer.put(indices);
indexBuffer.position(0);
}
then pass those indices to your draw call:
(replace
gl.glDrawArrays(GL10.GL_TRIANGLE_STRIP, 0, vertices.length/3 );
with
gl.glDrawElements(GL10.GL_TRIANGLES, indices.length, GL10.GL_UNSIGNED_BYTE, indexBuffer);
)
As for tutorials, you've got great Android versions of the NeHe tutorials here:
http://insanitydesign.com/wp/projects/nehe-android-ports/
It's more code than textual explanation, but they start you off really slowly, and the code is well commented. Actually, I think one of the first tutorials simply shows how to draw a triangle, just like you're trying here.
Your triangle gets clipped by the near clipping plane. Try moving it into the range between the near and far values of gluPerspective.
This Android tutorial mentions the requirement of a vertex shader, fragment shader and program. I may have missed it, but I don't see anything related to that in your code. In my case, the fix for the invisible triangle was to fix a typo in my fragment shader.