I'm drawing on Android with coordinates between -1.0 and 1.0, but the renderer is mapping these coordinates to the wrong places. In landscape, the drawing doesn't extend all the way to the edges, and in portrait, it extends slightly past the screen borders. I'm trying to make it fill the screen. Here's my code:
package com.mycompany.brickbreaker;
import android.opengl.GLES20;
import android.opengl.GLSurfaceView;
import android.opengl.Matrix;
import android.util.Log;
import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;
public class MyGLRenderer implements GLSurfaceView.Renderer {
private BrickBreaker game;
public MyGLRenderer(BrickBreaker activity) {
super();
game = activity;
}
//mMVPMatrix stands for "Model View Projection Matrix"
private final float[] mMVPMatrix = new float[16];
private final float[] mProjectionMatrix = new float[16];
private final float[] mViewMatrix = new float[16];
public void onSurfaceCreated(GL10 unused, EGLConfig config) {
//Set the background frame color
GLES20.glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
}
@Override
public void onDrawFrame(GL10 unused){
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
//Set the camera position (View Matrix)
Matrix.setLookAtM(mViewMatrix, 0, 0, 0, -3, 0f, 0f, 0f, 0f, 1.0f, 0f);
//Calculate the projection and view transformation
Matrix.multiplyMM(mMVPMatrix, 0, mProjectionMatrix, 0, mViewMatrix, 0);
//Draw the rectangle!
game.draw(mMVPMatrix);
}
@Override
public void onSurfaceChanged(GL10 unused, int width, int height){
GLES20.glViewport(0, 0, width, height);
Log.d("Touch", "Surface changed at height/width " + height + "/" + width);
float ratio = (float) width/(float) height;
//This projection matrix is applied to object coordinates in
//the onDrawFrame() method
Matrix.frustumM(mProjectionMatrix, 0, -ratio, ratio, -1, 1, 3, 7);
}
public static int loadShader(int type, String shaderCode){
//create a shader object of the given type
int shader = GLES20.glCreateShader(type);
//add the code to the shader, and compile it
GLES20.glShaderSource(shader, shaderCode);
GLES20.glCompileShader(shader);
return shader;
}
}
I'm not an expert at OpenGL, but I do have a working engine.
In your onSurfaceChanged, you pass in the old projection matrix. You should add the following code:
//Clear the matrices
for(int i=0;i<16;i++)
{
mProjectionMatrix[i] = 0.0f;
mViewMatrix[i] = 0.0f;
mMVPMatrix[i] = 0.0f;
}
// Setup our screen width and height for normal sprite translation.
Matrix.orthoM(mProjectionMatrix, 0, 0f, width, 0.0f, height, 0, 50);
// Set the camera position (View matrix)
Matrix.setLookAtM(mViewMatrix, 0, 0f, 0f, 1f, 0f, 0f, 0f, 0f, 1.0f, 0.0f);
// Calculate the projection and view transformation
Matrix.multiplyMM(mMVPMatrix, 0, mProjectionMatrix, 0, mViewMatrix, 0);
Hopefully that fixes your problem.
So I've solved this. What was going on is that when the projection matrix is created with Matrix.frustumM(mProjectionMatrix, 0, -ratio, ratio, ...), the OpenGL coordinate space is no longer -1 to 1 in both X and Y: it stays -1 to 1 in Y but becomes -ratio to ratio in X. Multiplying every x-coordinate I used by the ratio passed to frustumM made everything happy again.
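Roughly what that looks like in code (a sketch, not my exact code; toGlX is just an illustrative helper and it assumes the renderer keeps the ratio computed in onSurfaceChanged in a field):
private float ratio; // the same (float) width / (float) height passed to Matrix.frustumM()
// Map a logical x in [-1, 1] onto the frustum's [-ratio, ratio] range.
private float toGlX(float logicalX) {
return logicalX * ratio;
}
// Example: a brick that should touch the right edge of the screen in landscape.
float brickRight = toGlX(1.0f); // equals ratio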
Related
I need to plot a few coordinates on screen. The coordinates with a zero Z value get displayed, but the ones with non-zero Z values aren't being displayed. I tried normalizing the coordinates before plotting them, which worked: when normalized, all the coordinates get plotted. But with unnormalized coordinates, the vertices with non-zero Z stay hidden.
OpenGL Version : ES 2.0
Coordinates:
float squareCoords[] = {
202.00002f, 244.00002f, 0.0f,
440.00003f, 625.00006f, 0.0f,
440.00003f, 625.00006f, 0.0f,
690.00006f, 186.0f,0.0f,
202.00002f, 244.00002f, 50.0f,
440.00003f, 625.00006f, 50.0f,
440.00003f, 625.00006f, 50.0f,
690.00006f, 186.0f, 50.0f
};
indices:
short[] drawOrder = {
0,1,2,3,
0,4,
1,5,
2,6,
4,5,6,7
};
Draw Code:
GLES20.glDrawElements(
GLES20.GL_LINES, drawOrder.length,
GLES20.GL_UNSIGNED_SHORT, drawListBuffer);
On Surface Changed code:
public void onSurfaceChanged(GL10 unused, int width, int height) {
mWidth = width;
mHeight = height;
GLES20.glViewport(0, 0, mWidth, mHeight);
float ratio = (float) mWidth / mHeight;
// this projection matrix is applied to object coordinates
// in the onDrawFrame() method
Matrix.orthoM(mProjMatrix, 0, 0f, width, 0.0f, height, 0, 50);
}
OnDraw:
public void onDrawFrame(GL10 unused) {
Square square = new Square();
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT|GLES20.GL_DEPTH_BUFFER_BIT);
if (mFirstDraw)
mFirstDraw = false;
long time = SystemClock.uptimeMillis() % 4000L;
float angle = 0.090f * ((int) time);
// float angle = 90;
// Matrix.setRotateM(mRotationMatrix, 0, angle, 0, 0, -1.0f);
// angle += 0.7f;
if (angle > 360f)
angle = 0f;
Matrix.setLookAtM(mVMatrix, 0, 0f, 0f, 4f, 0f, 0f, 0f, 0f, 1f, 0f);
// projection x view = modelView
Matrix.multiplyMM(mMVPMatrix, 0, mProjMatrix, 0, mVMatrix, 0);
// Creating rotation matrix
Matrix.setRotateM(rotationMatrix, 0, angle, -1f, 0f, 0f);
// rotation x camera = modelView
float[] duplicateMatrix = Arrays.copyOf(mMVPMatrix, 16);
Matrix.multiplyMM(mMVPMatrix, 0, duplicateMatrix, 0, rotationMatrix, 0);
square.draw(mMVPMatrix);
}
I'm rotating the diagram to figure out whether the vertices on the Z axis are drawn.
I personally think this line is the culprit; here I've given a far value of 50 and a near value of 0. I wonder what these values should be:
Matrix.orthoM(mProjMatrix, 0, 0f, width, 0.0f, height, 0, 50);
The problem here was that the value of far wasn't high enough. I set far to 500:
Matrix.orthoM(mProjectionMatrix, 0, 0f, width, 0.0f, height,0, 500);
and changed the coordinates to:
float squareCoords[] = {
202.00002f, 244.00002f, 0.0f,
440.00003f, 625.00006f, 0.0f,
440.00003f, 625.00006f, 0.0f,
690.00006f, 186.0f,0.0f,
202.00002f, 244.00002f, 200.0f,
440.00003f, 625.00006f, 200.0f,
440.00003f, 625.00006f, 200.0f,
690.00006f, 186.0f, 200.0f
};
It's working now.
For performance reasons I moved from canvas.drawBitmap to OpenGL ES for 2D drawing.
This is the 4x1 sprite sheet:
To make it work I have the following class:
public Vulcan(ScreenObjectsView objectsView, int vulkanSpriteID, Context context) {
this.b = BitmapFactory.decodeResource(context.getResources(), vulkanSpriteID);
// 1x4
height = b.getHeight();
width = b.getWidth()/4;
WindowManager wm = (WindowManager) context.getSystemService(Context.WINDOW_SERVICE);
Display display = wm.getDefaultDisplay();
x = display.getWidth()/2-width/2; // deprecated
y = display.getHeight()-height; // deprecated
}
public void update() {
frameFreq++;
if(frameFreq > 0){
currentFrame = ++currentFrame % 4;
frameFreq = 0;
}
}
@Override
public void draw(Canvas canvas) {
update();
int srcX = currentFrame * width;
Rect src = new Rect(srcX, 0, srcX+width, height);
Rect dst = new Rect(x, y, x+width, y+height);
canvas.drawBitmap(b, src, dst, null);
}
Every so often I take the Rect and shift it from left to right (in a loop):
currentFrame = ++currentFrame % 4;
So far so good.
How can I animate the above sprite sheet in OpenGL ES?
I already know how to draw and move objects in OpenGL ES (thanks to a good demo), but I don't know how to work with sprites.
Any ideas, links, snippets of code?
[Edit]
It makes no difference to me whether I use a sprite sheet or 4 separate images (one per frame), and so on.
Strange that I still haven't received any answer or direction.
Thank you,
[Edit 2]
Following what Aert says, I implemented the code below and it works, but it seems messy: too much code for OpenGL ES. For each frame (I have 4) I need to create a FloatBuffer.
Maybe someone has a shorter way, or I did something wrong.
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;
import javax.microedition.khronos.opengles.GL10;
public class DevQuestSpriteBase {
private static final String LOG_TAG = "Fess";//DevQuestSpriteBase.class.getSimpleName();
protected int mFrame = 0;
protected int mSwitcher = 0;
private int textureCount = 1; // frame animation
protected int[] textures = new int[textureCount]; // frame animation
// texture and verts
protected FloatBuffer vertexBuffer,
textureBuffer1,
textureBuffer2,
textureBuffer3,
textureBuffer4;
ByteBuffer bb1;
protected float vertices[] = {
0f,0f,0.0f,
1f,0f,0.0f,
0f,1f,0.0f,
1f,1f,0.0f
};
/** 1 frame */
protected float texture1[] = {
0.0f, 1.0f,
0.0f, 0.0f,
0.25f, 1.0f,
0.25f, 0.0f
};
/** 2 frame */
protected float texture2[] = {
0.25f, 1.0f,
0.25f, 0.0f,
0.5f, 1.0f,
0.5f, 0.0f
};
/** 3 frame */
protected float texture3[] = {
0.5f, 1.0f,
0.5f, 0.0f,
0.75f, 1.0f,
0.75f, 0.0f
};
/** 4 frame */
protected float texture4[] = {
0.75f, 1.0f,
0.75f, 0.0f,
1.0f, 1.0f,
1.0f, 0.0f
};
public DevQuestSpriteBase(){
// vertices buffer
bb1 = ByteBuffer.allocateDirect(vertices.length * 4);
bb1.order(ByteOrder.nativeOrder());
vertexBuffer = bb1.asFloatBuffer();
vertexBuffer.put(vertices);
vertexBuffer.position(0);
// texture buffer
bb1 = ByteBuffer.allocateDirect(texture1.length * 4);
bb1.order(ByteOrder.nativeOrder());
textureBuffer1 = bb1.asFloatBuffer();
textureBuffer1.put(texture1);
textureBuffer1.position(0);
//#########################################################
// texture buffer
bb1 = ByteBuffer.allocateDirect(texture2.length * 4);
bb1.order(ByteOrder.nativeOrder());
textureBuffer2 = bb1.asFloatBuffer();
textureBuffer2.put(texture2);
textureBuffer2.position(0);
//#########################################################
// texture buffer
bb1 = ByteBuffer.allocateDirect(texture3.length * 4);
bb1.order(ByteOrder.nativeOrder());
textureBuffer3 = bb1.asFloatBuffer();
textureBuffer3.put(texture3);
textureBuffer3.position(0);
//#########################################################
// texture buffer
bb1 = ByteBuffer.allocateDirect(texture4.length * 4);
bb1.order(ByteOrder.nativeOrder());
textureBuffer4 = bb1.asFloatBuffer();
textureBuffer4.put(texture4);
textureBuffer4.position(0);
}
private void update() {
if(mSwitcher == 5){
mFrame = ++mFrame % 4;
mSwitcher = 0;
// Log.e(LOG_TAG, "DevQuestSpriteBase :: " + mFrame);
}
else{
mSwitcher++;
}
}
public void draw(GL10 gl){
gl.glBindTexture(GL10.GL_TEXTURE_2D, textures[0]);
if(mFrame == 0){
gl.glTexCoordPointer(2, GL10.GL_FLOAT, 0, textureBuffer1);
}
else if(mFrame == 1){
gl.glTexCoordPointer(2, GL10.GL_FLOAT, 0, textureBuffer2);
}
else if(mFrame == 2){
gl.glTexCoordPointer(2, GL10.GL_FLOAT, 0, textureBuffer3);
}
else if(mFrame == 3){
gl.glTexCoordPointer(2, GL10.GL_FLOAT, 0, textureBuffer4);
}
gl.glDrawArrays(GL10.GL_TRIANGLE_STRIP, 0, 4);
//Log.e(LOG_TAG, "DevQuestSpriteBase :: draw");
update();
gl.glEnableClientState(GL10.GL_VERTEX_ARRAY);
gl.glEnableClientState(GL10.GL_TEXTURE_COORD_ARRAY);
gl.glVertexPointer(3, GL10.GL_FLOAT, 0, vertexBuffer);
//gl.glTexCoordPointer(2, GL10.GL_FLOAT, 0, textureBuffer1);
gl.glDrawArrays(GL10.GL_TRIANGLE_STRIP, 0, vertices.length / 3);
gl.glDisableClientState(GL10.GL_VERTEX_ARRAY);
gl.glDisableClientState(GL10.GL_TEXTURE_COORD_ARRAY);
}
public int[] getTextures() {
return textures;
}
}
Without going into a lot of detail, you need to do the following (assuming you are already drawing a sprite using 4 vertices):
Define the texture coordinates corresponding to the vertices of the sprite for each animation frame, e.g.
texCoordsFrame1 = [0.0f, 0.0f, 0.25f, 0.0f, 0.0f, 1.0f, 0.25f, 1.0f];
Upload the spritesheet texture, e.g.
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, imageData);
Draw using the texture coordinates corresponding to the frame you want to show when required, e.g.
...
glBindTexture(GL_TEXTURE_2D, texture[0]);
glTexCoordPointer(2, GL_FLOAT, 0, texCoordsFrame1);
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
Alternatively, you can upload the separate frames as individual textures, but that is undesirable from a performance point of view.
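On Android specifically, the spritesheet upload is usually done from a Bitmap rather than a raw byte array. A minimal sketch (textureId from glGenTextures and bitmap holding the decoded sprite sheet are assumed here), using android.opengl.GLUtils:
gl.glBindTexture(GL10.GL_TEXTURE_2D, textureId);
gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MIN_FILTER, GL10.GL_NEAREST);
gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MAG_FILTER, GL10.GL_LINEAR);
// GLUtils works out the pixel format and type from the Bitmap for you.
GLUtils.texImage2D(GL10.GL_TEXTURE_2D, 0, bitmap, 0);
bitmap.recycle();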
There are a few gotchas:
When using GLES1, you can only use power-of-two textures. In that case you'll have to scale the texture or pad it up to a power-of-two size and adjust the texture coordinates.
The bitmap vs GL y-coordinate direction difference is a bit confusing, and you might end up with a vertically flipped sprite.
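As for the edit about the four nearly identical texture buffers: one possible shortcut (my own sketch; FRAME_COUNT, frameTexBuffer and selectFrame are made-up names, not from the code above) is to keep a single FloatBuffer and rewrite its contents whenever the frame changes:
private static final int FRAME_COUNT = 4;
// 4 vertices * 2 floats per texture coordinate * 4 bytes per float
private final FloatBuffer frameTexBuffer = ByteBuffer.allocateDirect(4 * 2 * 4)
.order(ByteOrder.nativeOrder())
.asFloatBuffer();
private void selectFrame(int frame) {
float u0 = frame / (float) FRAME_COUNT; // left edge of this frame in the sheet
float u1 = (frame + 1) / (float) FRAME_COUNT; // right edge of this frame
float[] texCoords = {
u0, 1.0f,
u0, 0.0f,
u1, 1.0f,
u1, 0.0f
};
frameTexBuffer.clear();
frameTexBuffer.put(texCoords);
frameTexBuffer.position(0);
}
Then draw() only needs to call selectFrame(mFrame) and pass frameTexBuffer to glTexCoordPointer, and the if/else chain plus the four texture arrays go away.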
Are there any special emulator settings needed to run OpenGL apps?
I already set the "GPU emulation" property to "yes".
I am trying to run an Android sample live wallpaper, using the sample source found at this link; the desired output is a rotating triangle.
After a little effort I got the app running. It works when I test it on a device, but in the emulator it doesn't draw anything and just shows a green screen. I found a discussion about this on Google Groups here and tried to set the viewport as suggested there, but it still doesn't show any result. In onSurfaceChanged I added this line:
gl.glViewport(0, 0, width, height);
Is this the correct way to set the viewport?
This is my renderer class:
public class MyRenderer implements GLWallpaperService.Renderer {
GLTriangle mTriangle;
public void onDrawFrame(GL10 gl) {
gl.glClearColor(0.2f, 0.4f, 0.2f, 1f);
gl.glClear(GL10.GL_COLOR_BUFFER_BIT | GL10.GL_DEPTH_BUFFER_BIT);
gl.glMatrixMode(GL10.GL_MODELVIEW);
autoRotate(gl);
gl.glColor4f(.2f, 0f, .5f, 1f);
mTriangle.draw(gl);
}
public void onSurfaceChanged(GL10 gl, int width, int height) {
gl.glViewport(0, 0, width, height);
gl.glMatrixMode(GL10.GL_PROJECTION);
gl.glLoadIdentity();
GLU.gluPerspective(gl, 60f, (float)width/(float)height, 1f, 100f);
gl.glMatrixMode(GL10.GL_MODELVIEW);
gl.glLoadIdentity();
gl.glTranslatef(0, 0, -5);
}
public void onSurfaceCreated(GL10 gl, EGLConfig config) {
mTriangle = new GLTriangle();
gl.glClearDepthf(1f);
gl.glEnable(GL10.GL_DEPTH_TEST);
gl.glDepthFunc(GL10.GL_LEQUAL);
}
/**
* Called when the engine is destroyed. Do any necessary clean up because
* at this point your renderer instance is now done for.
*/
public void release() {
}
private void autoRotate(GL10 gl) {
gl.glRotatef(1, 0, 1, 0);
gl.glRotatef(0.5f, 1, 0, 0);
}
}
Here is the GLTriangle class:
import java.nio.FloatBuffer;
import java.nio.ShortBuffer;
import javax.microedition.khronos.opengles.GL10;
public class GLTriangle {
private FloatBuffer _vertexBuffer;
private final int _nrOfVertices = 3;
private ShortBuffer _indexBuffer;
public GLTriangle() {
init();
}
private void init() {
// We use ByteBuffer.allocateDirect() to get memory outside of
// the normal, garbage collected heap. I think this is done
// because the buffer is subject to native I/O.
// See http://download.oracle.com/javase/1.4.2/docs/api/java/nio/ByteBuffer.html#direct
// 3 is the number of coordinates to each vertex.
_vertexBuffer = BufferFactory.createFloatBuffer(_nrOfVertices * 3);
_indexBuffer = BufferFactory.createShortBuffer(_nrOfVertices);
// Coordinates for the vertexes of the triangle.
float[] coords = {
-1f, -1f, 0f, // (x1, y1, z1)
1f, -1f, 0f, // (x2, y2, z2)
0f, 1f, 0f // (x3, y3, z3)
};
short[] _indicesArray = {0, 1, 2};
_vertexBuffer.put(coords);
_indexBuffer.put(_indicesArray);
_vertexBuffer.position(0);
_indexBuffer.position(0);
}
public void draw(GL10 gl) {
// 3 coordinates in each vertex
// 0 is the space between each vertex. They are densely packed
// in the array, so the value is 0
gl.glVertexPointer(3, GL10.GL_FLOAT, 0, getVertexBuffer());
// Draw the primitives, in this case, triangles.
gl.glDrawElements(GL10.GL_TRIANGLES, _nrOfVertices, GL10.GL_UNSIGNED_SHORT, _indexBuffer);
}
private FloatBuffer getVertexBuffer() {
return _vertexBuffer;
}
}
What's going wrong here? Is there better sample code for an OpenGL live wallpaper?
At last I found it!
What I needed to do was just add
gl.glEnableClientState(GL10.GL_VERTEX_ARRAY);
to the onSurfaceCreated method, along with the line
gl.glViewport(0, 0, width, height);
in the onSurfaceChanged method of the MyRenderer class.
I found a similar question on Stack Overflow itself [but the solution that worked for me is not marked as correct :( ]
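For completeness, the two methods end up looking roughly like this (a sketch based on the code above, not a verbatim copy):
public void onSurfaceCreated(GL10 gl, EGLConfig config) {
mTriangle = new GLTriangle();
gl.glClearDepthf(1f);
gl.glEnable(GL10.GL_DEPTH_TEST);
gl.glDepthFunc(GL10.GL_LEQUAL);
// The missing piece: without this, the vertex pointer set in GLTriangle.draw() is ignored.
gl.glEnableClientState(GL10.GL_VERTEX_ARRAY);
}
public void onSurfaceChanged(GL10 gl, int width, int height) {
// Keep the GL viewport in sync with the surface size (matters on the emulator too).
gl.glViewport(0, 0, width, height);
gl.glMatrixMode(GL10.GL_PROJECTION);
gl.glLoadIdentity();
GLU.gluPerspective(gl, 60f, (float) width / (float) height, 1f, 100f);
gl.glMatrixMode(GL10.GL_MODELVIEW);
gl.glLoadIdentity();
gl.glTranslatef(0, 0, -5);
}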
I want to display arbitrary shapes using OpenGL ES on an Android device. The problem is that my code doesn't even work for simple shapes like a rectangle (which I am going to use below).
I think something is wrong with the glTranslatef call. I've adjusted all the values but I can't figure out what it is.
The rectangle is defined by the points P(0,0,0), P(0,1,0), P(1,1,0), P(1,0,0). In the Activity I implemented the GLSurfaceView.Renderer like this:
private static FloatBuffer getVertexCoords() {
float coords[] = {
0f, 0f, 0f, // first triangle first point
0f, 1f, 0f, // first triangle second point
1f, 1f, 0f, // first triangle third point
1f, 1f, 0f, // second triangle first point
1f, 0f, 0f, // second triangle second point
0f, 0f, 0f, // second triangle third point
};
ByteBuffer vbb = ByteBuffer.allocateDirect(coords.length * 4); // n coords * 4 bytes per float
vbb.order(ByteOrder.nativeOrder());
FloatBuffer trianglesVB = vbb.asFloatBuffer();
trianglesVB.put(coords);
trianglesVB.position(0);
return trianglesVB;
}
@Override
public void onDrawFrame(GL10 gl) {
gl.glClear(GL10.GL_COLOR_BUFFER_BIT | GL10.GL_DEPTH_BUFFER_BIT);
gl.glMatrixMode(GL10.GL_MODELVIEW);
gl.glLoadIdentity();
gl.glTranslatef(0f, 0f, -4f);
gl.glEnableClientState(GL10.GL_VERTEX_ARRAY); // glBegin
gl.glColor4f(1.0f, 0.0f, 0.0f, 1.0f);
gl.glVertexPointer(3, GL10.GL_FLOAT, 0, getVertexCoords());
gl.glDrawArrays(GL10.GL_TRIANGLES, 0, 2 * 3 * 3); // triangles * points * coords
gl.glDisableClientState(GL10.GL_VERTEX_ARRAY); // glEnd
int error = gl.glGetError();
if (error != GL10.GL_NO_ERROR) {
Log.e(TAG, "OpenGL ES Error: " + error);
}
}
@Override
public void onSurfaceChanged(GL10 gl, int width, int height) {
// think this one doesn't matter
}
@Override
public void onSurfaceCreated(GL10 gl, EGLConfig config) {
gl.glClearColor(1.0f, 1.0f, 1.0f, 1.0f); // white background
gl.glFrontFace(GL10.GL_CW); // front face is clockwise
}
I think you need a projection matrix in there somewhere. If you don't set one, then you are drawing directly in normalized device coordinates, in which the only valid z-values range from -1 to 1.
Put simply, your triangle is outside the displayed depth range.
Try adding a simple projection matrix to onSurfaceCreated:
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(-10, 10, -10, 10, 0, 10);
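In context, onSurfaceCreated might look like this (a sketch; note that on Android's GL10 interface the call is glOrthof and goes through the gl handle):
@Override
public void onSurfaceCreated(GL10 gl, EGLConfig config) {
gl.glClearColor(1.0f, 1.0f, 1.0f, 1.0f); // white background
gl.glFrontFace(GL10.GL_CW); // front face is clockwise
// Simple orthographic projection; the quad translated to z = -4 now lies inside the 0..10 depth range.
gl.glMatrixMode(GL10.GL_PROJECTION);
gl.glLoadIdentity();
gl.glOrthof(-10, 10, -10, 10, 0, 10);
gl.glMatrixMode(GL10.GL_MODELVIEW);
}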
I am working on a program which draws polygons according to user input.
I have problems drawing triangles using GL_TRIANGLES. I used the same code below to draw a square and it worked well. However, if I want to draw only one triangle, it does not work.
Can anyone help me?
public class Triangle extends Shape{
private FloatBuffer vertexBuffer;
private FloatBuffer _colorBuffer;
private ShortBuffer indexBuffer;
private float vertices[] = {
-0.5f, -0.5f, 0.5f, // 0
0.5f, -0.5f, 0.5f, // 1
0f, -0.5f, -0.5f, // 2
};
private short[] indices = { 0, 2, 1 };
float[] colors = {1f, 1f, 0f, 1f };
public Triangle() {
ByteBuffer vbb = ByteBuffer.allocateDirect(vertices.length * 4);
vbb.order(ByteOrder.nativeOrder());
vertexBuffer = vbb.asFloatBuffer();
vertexBuffer.put(vertices);
vertexBuffer.position(0);
ByteBuffer ibb = ByteBuffer.allocateDirect(indices.length * 2);
ibb.order(ByteOrder.nativeOrder());
indexBuffer = ibb.asShortBuffer();
indexBuffer.put(indices);
indexBuffer.position(0);
ByteBuffer cbb = ByteBuffer.allocateDirect(colors.length * 4);
cbb.order(ByteOrder.nativeOrder());
_colorBuffer = cbb.asFloatBuffer();
_colorBuffer.put(colors);
_colorBuffer.position(0);
}
public void draw(GL10 gl) {
gl.glFrontFace(GL10.GL_CCW);
gl.glEnable(GL10.GL_CULL_FACE);
gl.glCullFace(GL10.GL_BACK);
gl.glEnableClientState(GL10.GL_VERTEX_ARRAY);
gl.glEnableClientState(GL10.GL_COLOR_ARRAY);
gl.glVertexPointer(3, GL10.GL_FLOAT, 0, vertexBuffer);
gl.glColorPointer(4, GL10.GL_FLOAT, 0, _colorBuffer);
gl.glDrawElements(GL10.GL_TRIANGLES, indices.length,GL10.GL_UNSIGNED_SHORT, indexBuffer);
gl.glDisableClientState(GL10.GL_VERTEX_ARRAY);
gl.glDisable(GL10.GL_CULL_FACE);
}
}
Edit:
I call my Triangle class from here; maybe I have made a mistake here:
public class OpenGLRenderer implements Renderer {
String name;
ArrayList<Shape> myArr;
private float angle, x,y,z;
public OpenGLRenderer(String nm ) {
name =nm;
myArr = new ArrayList<Shape>();
x=0;
y=0;
z=-3;
}
@Override
public void onDrawFrame(GL10 gl) {
//clear the screen and depth buffer
gl.glClear(GL10.GL_COLOR_BUFFER_BIT | GL10.GL_DEPTH_BUFFER_BIT);
gl.glLoadIdentity();
gl.glTranslatef(x, y, z);
for (Shape t : myArr)
{
if (t instanceof Rectangle)
{
// gl.glTranslatef(0, 0, -4);
((Rectangle )t).draw(gl);
}
if (t instanceof Square)
{ //gl.glTranslatef(0, 1, 0);
((Square )t).draw(gl);}
if (t instanceof Pyramid){
((Pyramid)t).draw(gl);
if (t instanceof Triangle){
((Triangle)t).draw(gl);
}
if (t instanceof Line){
((Line)t).draw(gl);
}
}
}//for
}
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
@Override
public void onSurfaceChanged(GL10 gl, int width, int height) {
gl.glEnableClientState(GL10.GL_VERTEX_ARRAY);
gl.glEnableClientState(GL10.GL_COLOR_ARRAY);
gl.glViewport(0, 0, width, height);
gl.glMatrixMode(GL10.GL_PROJECTION);
gl.glLoadIdentity(); //reset the projection matrix
GLU.gluPerspective(gl, 45.0f, (float)width/(float)height,
0.1f, 100.0f); //calculate the aspect ratio of window
gl.glMatrixMode(GL10.GL_MODELVIEW);
gl.glLoadIdentity();
}
@Override
public void onSurfaceCreated(GL10 gl, EGLConfig config) {
//set the bg as black
gl.glClearColor(0.0f, 0.0f, 0.0f, 0.5f);
gl.glShadeModel(GL10.GL_SMOOTH);
//depth buffer setup
gl.glClearDepthf(1.0f);
gl.glEnable(GL10.GL_DEPTH_TEST);
gl.glDepthFunc(GL10.GL_LEQUAL);
gl.glHint(GL10.GL_PERSPECTIVE_CORRECTION_HINT, GL10.GL_NICEST);
}
public void addshape(Shape s) {
y = y + 0.1f;
myArr.add(s);
}
}
It looks to me like you don't have enough colors in your colors array. This might result in an invisible triangle depending on the initial garbage values in your _colorBuffer.
(Edit) try:
float[] colors = {
1f, 1f, 0f, 1f,
1f, 1f, 0f, 1f,
1f, 1f, 0f, 1f
};
First, Martin is correct about the colors array. It needs to have a color (4 values in your case) for every vertex, so 12 values in total.
Next, at the moment your triangle lies in the x-z plane, and since you don't make any changes to the modelview matrix (except translating along the z-axis), you should see just a line, if anything (think of a sheet of paper viewed from the side).
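For what it's worth, a triangle in the x-y plane with counter-clockwise winding (to match the glFrontFace(GL10.GL_CCW) set in draw()) would look something like this (a sketch):
private float vertices[] = {
-0.5f, -0.5f, 0f, // 0
0.5f, -0.5f, 0f, // 1
0f, 0.5f, 0f // 2
};
private short[] indices = { 0, 1, 2 };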
But your real problem is your draw loop. I guess you're not only new to OpenGL, but also to Java and object-oriented programming in general. This whole design is rubbish; this is exactly what virtual functions are for in object-oriented code. Just let each shape implement its own draw method, so the only thing you need to do is
for (Shape t : myArr)
t.draw(gl);
This assumes Shape has an abstract draw method that the subclasses implement. But that is more of a design flaw. The actual error is that the braces of the ifs are broken: at the moment the triangle is only drawn if t is an instance of both Pyramid and Triangle, so draw is never called for triangles (and also not for lines).
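A minimal version of that base class, just to make the suggestion concrete (a sketch; the subclass names come from the loop above):
public abstract class Shape {
public abstract void draw(GL10 gl);
}
With that in place, Rectangle, Square, Pyramid, Triangle and Line each override draw(GL10), myArr can be declared as ArrayList<Shape>, and the whole instanceof chain (together with its brace problem) disappears.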
Here is my triangle code from a project which works. It looks like your indices and colour arrays are different:
package com.martynhaigh.Vortex;
import android.view.animation.Transformation;
import javax.microedition.khronos.opengles.GL10;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;
import java.nio.ShortBuffer;
public class Triangle {
private ShortBuffer _indexBuffer;
private FloatBuffer _vertexBuffer;
private FloatBuffer _colorBuffer;
static float _xAngle, _yAngle;
private int _nrOfVertices;
Triangle() {
float[] coords = {
-0.5f, -0.5f, 0f, // (x1, y1, z1)
0.5f, -0.5f, 0f, // (x2, y2, z2)
0f, 0.5f, 0f // (x3, y3, z3)
}; // 9
_nrOfVertices = coords.length / 3;
float[] colors = {
1f, 0f, 0f, 1f, // point 1
0f, 1f, 0f, 1f, // point 2
0f, 0f, 1f, 1f // point 3
}; // 12
short[] indices = {0, 1, 2}; // 3
// float has 4 bytes, coordinate * 4 bytes
ByteBuffer vbb = ByteBuffer.allocateDirect(coords.length * 4); //36
vbb.order(ByteOrder.nativeOrder());
_vertexBuffer = vbb.asFloatBuffer();
// short has 2 bytes, indices * 2 bytes
ByteBuffer ibb = ByteBuffer.allocateDirect(indices.length * 2); // 6
ibb.order(ByteOrder.nativeOrder());
_indexBuffer = ibb.asShortBuffer();
// float has 4 bytes, colors (RGBA) * 4 bytes
ByteBuffer cbb = ByteBuffer.allocateDirect(colors.length * 4); // 48
cbb.order(ByteOrder.nativeOrder());
_colorBuffer = cbb.asFloatBuffer();
_vertexBuffer.put(coords);
_indexBuffer.put(indices);
_colorBuffer.put(colors);
_vertexBuffer.position(0);
_indexBuffer.position(0);
_colorBuffer.position(0);
}
public void onDraw(GL10 gl) {
// set rotation
gl.glRotatef(_xAngle, 1f, 0f, 0f);
gl.glRotatef(_yAngle, 0f, 1f, 0f);
// set the color of our element
//gl.glColor4f(0.5f, 0f, 0f, 0.5f);
gl.glColorPointer(4, GL10.GL_FLOAT, 0, _colorBuffer);
// define the vertices we want to draw
gl.glVertexPointer(3, GL10.GL_FLOAT, 0, _vertexBuffer);
// finally draw the vertices
gl.glDrawElements(GL10.GL_TRIANGLES, _nrOfVertices, GL10.GL_UNSIGNED_SHORT, _indexBuffer);
}
public void setXAngle(float angle) {
_xAngle = angle;
}
public float getXAngle() {
return _xAngle;
}
public void setYAngle(float angle) {
_yAngle = angle;
}
public float getYAngle() {
return _yAngle;
}
}