I am trying to create a triangle in OpenGL ES, but the app crashes on the line gl.glDrawElements(GL10.GL_TRIANGLES, pIndex.length, GL10.GL_UNSIGNED_SHORT, pBuff); inside the draw method of the code below.
public class GLTriangleEx {
private float vertices[] = {
0f, 1f, // point 0
1f, -1f, // point 1
-1f, -1f // point 2
};
private FloatBuffer vertBuff;
private short[] pIndex = {0, 1, 2};
private ShortBuffer pBuff;
public GLTriangleEx() {
ByteBuffer bBuff = ByteBuffer.allocateDirect(vertices.length * 4);
bBuff.order(ByteOrder.nativeOrder());
vertBuff = bBuff.asFloatBuffer();
vertBuff.put(vertices);
vertBuff.position(0);
ByteBuffer pbBuff = ByteBuffer.allocateDirect(pIndex.length * 2);
pbBuff.order(ByteOrder.nativeOrder());
pBuff = pbBuff.asShortBuffer();
pBuff.put(pIndex);
pbBuff.position(0);
}
public void draw(GL10 gl){
gl.glFrontFace(GL10.GL_CW);
gl.glEnableClientState(GL10.GL_VERTEX_ARRAY);
gl.glVertexPointer(2, GL10.GL_FLOAT, 0, vertBuff);
// app crashes here.
gl.glDrawElements(GL10.GL_TRIANGLES, pIndex.length, GL10.GL_UNSIGNED_SHORT, pBuff);
gl.glDisableClientState(GL10.GL_VERTEX_ARRAY);
}
}
The crash logcat is:
java.lang.ArrayIndexOutOfBoundsException: remaining() < count < needed
at com.google.android.gles_jni.GLImpl.glDrawElements(Native Method)
at com.mobility.opengleslearning.GLTriangleEx.draw(GLTriangleEx.java:45)
at com.mobility.opengleslearning.GLRenderer.onDrawFrame(GLRenderer.java:38)
at android.opengl.GLSurfaceView$GLThread.guardedRun(GLSurfaceView.java:1522)
at android.opengl.GLSurfaceView$GLThread.run(GLSurfaceView.java:1239)
I have checked the links below for help, but they were of no use in my case:
Android OpenGL error: "remaining() < needed" and Android 4.4
Beginning to learn OpenGL ES. Drawing quad
You need to rewind the ShortBuffer you use for the indices. In this code:
ByteBuffer pbBuff = ByteBuffer.allocateDirect(pIndex.length * 2);
pbBuff.order(ByteOrder.nativeOrder());
pBuff = pbBuff.asShortBuffer();
pBuff.put(pIndex);
pbBuff.position(0);
you're rewinding pbBuff, which is the underlying ByteBuffer.
asShortBuffer() returns a view buffer that shares the underlying data with the original buffer. From the documentation (emphasis added by me):
A view buffer is simply another buffer whose content is backed by the byte buffer. Changes to the byte buffer's content will be visible in the view buffer, and vice versa; the two buffers' position, limit, and mark values are independent.
So pBuff, which is your view buffer, has its own position. You need to rewind the view buffer, which is the buffer you use later:
pBuff.position(0);
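Put together, the corrected index-buffer setup in the constructor looks like this (a minimal sketch; only the last line changes):
ByteBuffer pbBuff = ByteBuffer.allocateDirect(pIndex.length * 2); // 2 bytes per short
pbBuff.order(ByteOrder.nativeOrder());
pBuff = pbBuff.asShortBuffer(); // view buffer with its own position
pBuff.put(pIndex);              // advances pBuff's position to 3
pBuff.position(0);              // rewind the view buffer, not pbBuff
With pBuff rewound, its remaining() equals pIndex.length and glDrawElements no longer throws.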
Related
I have the book "OpenGL ES 2 for Android: A Quick-Start Guide", and it goes through a good tutorial on OpenGL and Android. The issue I am having, though, is that its examples don't use index buffers for the creation of their shapes.
I am trying to texture a square: I define the 4 vertices of the square (plus an S and a T coordinate at each vertex for the texture) and then render using the index buffer. However, the color of the square comes only from the bottom-left corner of the PNG texture file, and the texture is not rendered correctly on my square.
This is my render function:
public void onDrawFrame(float[] matrixViewProjection)
{
super.onDrawFrame(matrixViewProjection);
GLES20.glUseProgram(this.shaderProgram);
int posHandle = GLES20.glGetAttribLocation(shaderProgram,"vPosition");
GLES20.glEnableVertexAttribArray(posHandle);
GLES20.glVertexAttribPointer(posHandle,coordsPerVertex,GLES20.GL_FLOAT,false,vertexStride,vertexBuffer);
int colHandle = GLES20.glGetUniformLocation(shaderProgram,"vColor");
GLES20.glUniform4fv(colHandle,1,color,0);
int mvpHandle = GLES20.glGetUniformLocation(shaderProgram,"uMVPMatrix");
GLES20.glUniformMatrix4fv(mvpHandle,1,false,matrixSum,0);
GLES20.glDrawElements(GLES20.GL_TRIANGLES,indexBufferCount,GLES20.GL_UNSIGNED_SHORT,drawListBuffer);
GLES20.glDisableVertexAttribArray(posHandle);
}
And this is my constructor for my square object:
public ShapeSquare(Context context, int program, float size)
{
float squareCoords[] = {
-(size/2.0f),-(size/2.0f),0f, 0f, 0f,
-(size/2.0f), (size/2.0f),0f, 0f, 1f,
(size/2.0f), (size/2.0f),0f, 1f, 1f,
(size/2.0f),-(size/2.0f),0f, 0f, 0f
};
short drawOrder[] = {0,1,3,1,2,3};
indexBufferCount = drawOrder.length;
ByteBuffer bb = ByteBuffer.allocateDirect(squareCoords.length*4);
bb.order(ByteOrder.nativeOrder());
vertexBuffer = bb.asFloatBuffer();
vertexBuffer.put(squareCoords);
vertexBuffer.position(0);
ByteBuffer dlb = ByteBuffer.allocateDirect(drawOrder.length*2);
dlb.order(ByteOrder.nativeOrder());
drawListBuffer = dlb.asShortBuffer();
drawListBuffer.put(drawOrder);
drawListBuffer.position(0);
shaderProgram = program;
textureID = TextureHelper.loadTexture(context, R.raw.texture1);
}
And these are some of the definitions I have, although I don't know if they are correct
protected static int coordsPerVertex = 3;
protected int vertexCount = 12/coordsPerVertex;
protected static final int vertexStride = coordsPerVertex*4+8;
Here is what is rendered...
And this is the texture I have. (Take note of the bottom left corner)
You have to specify two arrays of vertex attribute data (glVertexAttribPointer): one for the vertex coordinates and one for the texture coordinates:
int posHandle = GLES20.glGetAttribLocation(shaderProgram, "vPosition");
int texHandle = GLES20.glGetAttribLocation(shaderProgram, ???);
GLES20.glEnableVertexAttribArray(posHandle);
GLES20.glEnableVertexAttribArray(texHandle);
vertexBuffer.position(0);
GLES20.glVertexAttribPointer(posHandle, coordsPerVertex, GLES20.GL_FLOAT, false,
vertexStride, vertexBuffer);
vertexBuffer.position(3);
GLES20.glVertexAttribPointer(texHandle, 2, GLES20.GL_FLOAT, false,
vertexStride, vertexBuffer); // texture coordinates have 2 components (u, v)
Note that vertexBuffer contains tuples of 5 elements (x, y, z, u, v), so the stride is 5*4 = 20 bytes. The offset of the vertex coordinates is 0, and the offset of the texture coordinates is 3*4 bytes, i.e. 3 elements into the float buffer, which is why the buffer is repositioned to 3 before the second glVertexAttribPointer call.
Since you are not using vertex buffer objects, you may prefer to create separate arrays (and separate float buffers) for the two attributes.
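For illustration, a minimal sketch of that alternative with two tightly packed client-side buffers (variable names are illustrative, and the texture coordinates shown are the conventional corner values rather than the ones in the question):
float[] positions = {
-(size/2f), -(size/2f), 0f,
-(size/2f),  (size/2f), 0f,
 (size/2f),  (size/2f), 0f,
 (size/2f), -(size/2f), 0f
};
float[] texCoords = {
0f, 0f,
0f, 1f,
1f, 1f,
1f, 0f
};
FloatBuffer positionBuffer = ByteBuffer.allocateDirect(positions.length * 4)
.order(ByteOrder.nativeOrder()).asFloatBuffer();
positionBuffer.put(positions).position(0);
FloatBuffer texCoordBuffer = ByteBuffer.allocateDirect(texCoords.length * 4)
.order(ByteOrder.nativeOrder()).asFloatBuffer();
texCoordBuffer.put(texCoords).position(0);
// Tightly packed arrays: stride 0, no repositioning between calls.
GLES20.glVertexAttribPointer(posHandle, 3, GLES20.GL_FLOAT, false, 0, positionBuffer);
GLES20.glVertexAttribPointer(texHandle, 2, GLES20.GL_FLOAT, false, 0, texCoordBuffer);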
I'm trying to render a subdivided mesh with a displacement texture on it and a color texture. To do so I go through every pixel, create a vertex for it, and move that vertex according to a black and white image I have. The problem is that when I render it, I get something that looks a bit like TV snow.
Here's the relevant code:
public Plane(Bitmap image, Bitmap depth)
{
this.image = image; //color image
this.depth = depth; //BW depth image
this.w = image.getWidth();
this.h = image.getHeight();
vertexCoords = vertexArray(); //places vertices in 3d
drawOrder = orderArray(); //sets the draw order
colorCoords = colorArray(); //sets color per vertex
ByteBuffer bb = ByteBuffer.allocateDirect(vertexCoords.length * 4);
bb.order(ByteOrder.nativeOrder());
vertexBuffer = bb.asFloatBuffer();
vertexBuffer.put(vertexCoords);
vertexBuffer.position(0);
ByteBuffer dlb = ByteBuffer.allocateDirect(drawOrder.length * 4);
dlb.order(ByteOrder.nativeOrder());
drawListBuffer = dlb.asShortBuffer();
drawListBuffer.put(drawOrder);
drawListBuffer.position(0);
ByteBuffer cbb = ByteBuffer.allocateDirect(colorCoords.length * 4);
cbb.order(ByteOrder.nativeOrder());
colorBuffer = cbb.asFloatBuffer();
colorBuffer.put(colorCoords);
colorBuffer.position(0);
}
public void draw(GL10 gl) {
// Counter-clockwise winding.
gl.glFrontFace(GL10.GL_CCW);
// Enable face culling.
gl.glEnable(GL10.GL_CULL_FACE);
// What faces to remove with the face culling.
gl.glCullFace(GL10.GL_BACK);
// Enabled the vertices buffer for writing and to be used during
// rendering.
gl.glEnableClientState(GL10.GL_VERTEX_ARRAY);
// Specifies the location and data format of an array of vertex
// coordinates to use when rendering.
gl.glVertexPointer(3, GL10.GL_FLOAT, 0, vertexBuffer);
// Enable the color array buffer to be used during rendering.
gl.glEnableClientState(GL10.GL_COLOR_ARRAY); // NEW LINE ADDED.
// Point out the where the color buffer is.
gl.glColorPointer(4, GL10.GL_FLOAT, 0, colorBuffer); // NEW LINE ADDED.
gl.glDrawElements(GL10.GL_TRIANGLES, drawOrder.length,
GL10.GL_UNSIGNED_SHORT, drawListBuffer);
// Disable the vertices buffer.
gl.glDisableClientState(GL10.GL_VERTEX_ARRAY);
// Disable face culling.
gl.glDisable(GL10.GL_CULL_FACE);
gl.glDisableClientState(GL10.GL_COLOR_ARRAY);
}
What can I do to actually view the model instead of this snow? The patterns change if I turn my screen on and off, and they sometimes change randomly. It seems that the colors present in the original bitmap are also present in the snow (the snow color changes with different pictures), so I know I'm doing something partly right; I just don't know what's wrong here.
EDIT: here's the code for vertexArray()
public float[] vertexArray()
{
int totalPoints = w*h;
float[] arr = new float[totalPoints*3];
int i = 0;
for(int y = 0; y<h; y++)
{
for(int x = 0; x<w; x++)
{
arr[i] = x * 0.01f;
arr[i+1] = y * 0.01f;
arr[i+2] = 1.0f;//getDepth(x,y);
i+=3;
}
}
return arr;
}
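For reference, getDepth(x, y) from the commented-out line is not shown in the question; a hypothetical implementation that reads heights from the black-and-white depth bitmap could look like this (the scale factor is arbitrary):
public float getDepth(int x, int y)
{
// Use the red channel of the grayscale depth bitmap (0-255) as a height.
int pixel = depth.getPixel(x, y);
float brightness = android.graphics.Color.red(pixel) / 255.0f;
return brightness * 0.5f; // arbitrary scale for the displacement
}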
Currently I am working on a 3D game engine for Android. The project is just getting started, but there is a problem I can't solve.
I want to use vertex buffer objects (VBOs) for rendering models loaded from an .obj file.
Here is the code:
public class Mesh {
private final int mBytesPerFloat = 4;
private FloatBuffer vertices;
private FloatBuffer normals;
private IntBuffer faces;
private int vertexBuffer;
private int normalBuffer;
private int indexBuffer;
public Mesh(float[] vertices, float[] normals, int[] faces, int shaderProgram) {
this.normals = ByteBuffer.allocateDirect(normals.length * mBytesPerFloat)
.order(ByteOrder.nativeOrder())
.asFloatBuffer();
this.normals.put(normals).position(0);
this.faces = ByteBuffer.allocateDirect(faces.length * mBytesPerFloat)
.order(ByteOrder.nativeOrder())
.asIntBuffer();
this.faces.put(faces).position(0);
this.vertices = ByteBuffer.allocateDirect(vertices.length * mBytesPerFloat)
.order(ByteOrder.nativeOrder())
.asFloatBuffer();
this.vertices.put(vertices).position(0);
ByteBuffer bb = ByteBuffer.allocateDirect(8);
bb.order(ByteOrder.nativeOrder());
IntBuffer buffer = bb.asIntBuffer();
GLES20.glGenBuffers(1, buffer);
vertexBuffer = buffer.get(0);
GLES20.glBindBuffer(GLES20.GL_ARRAY_BUFFER, vertexBuffer);
GLES20.glBufferData(GLES20.GL_ARRAY_BUFFER, this.vertices.capacity() * mBytesPerFloat, this.vertices, GLES20.GL_STATIC_DRAW);
bb = ByteBuffer.allocateDirect(8);
bb.order(ByteOrder.nativeOrder());
buffer = bb.asIntBuffer();
GLES20.glGenBuffers(1, buffer);
normalBuffer = buffer.get(0);
GLES20.glBindBuffer(GLES20.GL_ARRAY_BUFFER, normalBuffer);
GLES20.glBufferData(GLES20.GL_ARRAY_BUFFER, this.normals.capacity() * mBytesPerFloat, this.normals, GLES20.GL_STATIC_DRAW);
bb = ByteBuffer.allocateDirect(8);
bb.order(ByteOrder.nativeOrder());
buffer = bb.asIntBuffer();
GLES20.glGenBuffers(1, buffer);
indexBuffer = buffer.get(0);
GLES20.glBindBuffer(GLES20.GL_ELEMENT_ARRAY_BUFFER, indexBuffer);
GLES20.glBufferData(GLES20.GL_ELEMENT_ARRAY_BUFFER, this.faces.capacity() * mBytesPerFloat, this.faces, GLES20.GL_STATIC_DRAW);
}
public Mesh() {
// TODO Auto-generated constructor stub
}
public void render(int shaderProgram) {
GLES20.glBindBuffer(GLES20.GL_ARRAY_BUFFER, vertexBuffer);
GLES20.glVertexAttribPointer(GLES20.glGetAttribLocation(shaderProgram, "vertex"), 3, GLES20.GL_FLOAT, false, 0, 0);
GLES20.glEnableVertexAttribArray(GLES20.glGetAttribLocation(shaderProgram, "vertex"));
GLES20.glBindBuffer(GLES20.GL_ARRAY_BUFFER, normalBuffer);
GLES20.glVertexAttribPointer(GLES20.glGetAttribLocation(shaderProgram, "normal"), 3, GLES20.GL_FLOAT, false, 0, 0);
GLES20.glEnableVertexAttribArray(GLES20.glGetAttribLocation(shaderProgram, "normal"));
GLES20.glBindBuffer(GLES20.GL_ELEMENT_ARRAY_BUFFER, indexBuffer);
GLES20.glDrawElements(GLES20.GL_TRIANGLES, vertices.capacity()/3, GLES20.GL_INT, faces);
}
}
The problem is that I do not see anything.
The problem is not the shader I am using (which I know works correctly), and the model-view and projection matrices are correct, so the object should be visible if it were rendered, which it isn't.
Does anyone have an idea what the problem might be?
glDrawElements doesn't accept GL_INT as an index type, so the glDrawElements call most likely just fails (which could easily have been checked with a few simple glGetError calls for the sake of debugging). Try GL_UNSIGNED_INT instead (and an unsigned buffer type for faces, if such a thing exists).
In fact, OpenGL ES 2.0 only supports GL_UNSIGNED_INT indices when the OES_element_index_uint extension is available; without it you need 2-byte GL_UNSIGNED_SHORT indices. Either way you definitely need an unsigned type (what sense does a signed type make for indices anyway? But OK, that's Java). And a few glGetError calls might do wonders here.
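As a rough sketch, assuming the face indices fit into 16 bits, the short-index route could look like this (variable names follow the question's code; store the index count somewhere for use in render()):
// In the constructor: convert the int indices to shorts and upload them (2 bytes each).
short[] faceIndices = new short[faces.length];
for (int i = 0; i < faces.length; i++) {
faceIndices[i] = (short) faces[i];
}
ShortBuffer faceBuffer = ByteBuffer.allocateDirect(faceIndices.length * 2)
.order(ByteOrder.nativeOrder())
.asShortBuffer();
faceBuffer.put(faceIndices).position(0);
GLES20.glBindBuffer(GLES20.GL_ELEMENT_ARRAY_BUFFER, indexBuffer);
GLES20.glBufferData(GLES20.GL_ELEMENT_ARRAY_BUFFER, faceIndices.length * 2, faceBuffer, GLES20.GL_STATIC_DRAW);
// In render(): with the index buffer bound, draw using the number of indices and a byte offset.
GLES20.glBindBuffer(GLES20.GL_ELEMENT_ARRAY_BUFFER, indexBuffer);
GLES20.glDrawElements(GLES20.GL_TRIANGLES, faceIndices.length, GLES20.GL_UNSIGNED_SHORT, 0);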
When I apply a color buffer and blending to a solid circle, the color in the first 20 degrees does not display properly; I get some sort of color ribbon, which is not what it is supposed to be. Have I done something wrong in my code?
public class Circle {
boolean circleChecked;
private int points=361;
private float vertices[]={0.0f,0.0f,0.0f};
private float[] fogcolor = {0.2f,0.4f,0.7f,0.9f};
private FloatBuffer vertBuff, textureBuffer;
private FloatBuffer colorBuffer; // Buffer for color-array (NEW)
float texData[] = null;
private float[] colors = { // Colors for the vertices (NEW)
0.7f,0.7f,0.7f,0.5f,
0.7f,0.7f,0.7f,0.5f,
0.7f,0.7f,0.7f,0.5f
};
float theta = 0;
int[] textures = new int[1];
int R=1;
float textCoordArray[] =
{
-R,
(float) (R * (Math.sqrt(2) + 1)),
-R,
-R,
(float) (R * (Math.sqrt(2) + 1)),
-R
};
public Circle(float size, float positionX, float positionY){
vertices = new float[(points)*3];
for(int i=0;i<3;i+=3){
vertices[i]=positionX * size;
vertices[i+1]=positionY *size;
vertices[i+2]=0.51f;
}
for(int i=3;i<(points)*3;i+=3)
{
vertices[i]=((float) ( Math.cos(theta))/3+positionX) * size;
vertices[i+1]=((float) (Math.sin(theta))/3+positionY) *size;
vertices[i+2]=0.5f;
theta += Math.PI / 90;
}
ByteBuffer bBuff=ByteBuffer.allocateDirect(vertices.length*4);
bBuff.order(ByteOrder.nativeOrder());
vertBuff=bBuff.asFloatBuffer();
vertBuff.put(vertices);
vertBuff.position(0);
// Setup color-array buffer. Colors in float. A float has 4 bytes (NEW)
ByteBuffer cbb = ByteBuffer.allocateDirect(colors.length * 4);
cbb.order(ByteOrder.nativeOrder()); // Use native byte order (NEW)
colorBuffer = cbb.asFloatBuffer(); // Convert byte buffer to float (NEW)
colorBuffer.put(colors); // Copy data into buffer (NEW)
colorBuffer.position(0); // Rewind (NEW)
ByteBuffer bBuff2=ByteBuffer.allocateDirect(textCoordArray.length * 4 * 360);
bBuff2.order(ByteOrder.nativeOrder());
textureBuffer=bBuff2.asFloatBuffer();
textureBuffer.put(textCoordArray);
textureBuffer.position(0);
}
public void draw(GL10 gl){
//gl.glDisable(GL10.GL_LIGHTING);
gl.glEnableClientState(GL10.GL_VERTEX_ARRAY);
//gl.glColor4f(0.8f, 0.8f, 0.8f, 1);
gl.glEnableClientState(GL10.GL_COLOR_ARRAY);
gl.glVertexPointer(3, GL10.GL_FLOAT, 0, vertBuff);
gl.glColorPointer(4, GL10.GL_FLOAT, 0, colorBuffer);
// if(circleChecked){
// gl.glColor4f(0.2f, 0.4f, 0.8f, 1);
//}
//gl.glEnable(GL10.GL_TEXTURE_2D);
gl.glEnable(GL10.GL_BLEND);
gl.glPushMatrix();
gl.glFogf(GL10.GL_FOG_MODE, GL10.GL_LINEAR);
gl.glFogf(GL10.GL_FOG_START, 3.0f);
gl.glFogf(GL10.GL_FOG_END, 5.0f);
float fogColor[] = {1f, 0.0f, 0.5f, 1.0f};
gl.glFogfv(GL10.GL_FOG_COLOR, fogColor, 0);
gl.glFogf(GL10.GL_FOG_DENSITY, 0.9f);
gl.glEnable(GL10.GL_FOG);
gl.glBlendFunc(GL10.GL_SRC_ALPHA, GL10.GL_ONE_MINUS_SRC_ALPHA);
//gl.glBindTexture(GL10.GL_TEXTURE_2D, textures[0]); //4
//gl.glTexCoordPointer(2, GL10.GL_FLOAT,0, textureBuffer); //5
// gl.glEnableClientState(GL10.GL_TEXTURE_COORD_ARRAY);
gl.glDrawArrays(GL10.GL_TRIANGLE_FAN, 0, points/2);
gl.glDisableClientState(GL10.GL_COLOR_ARRAY); // Disable color-array (NEW)
gl.glDisableClientState(GL10.GL_VERTEX_ARRAY);
gl.glDisableClientState(GL10.GL_TEXTURE_COORD_ARRAY);
gl.glPopMatrix();
//gl.glDisable(GL10.GL_FOG);
}
}
The problem is in your color array. glDrawArrays (gl.glDrawArrays(GL10.GL_TRIANGLE_FAN, 0, points/2);) takes values from every buffer whose client state you have enabled (e.g. gl.glEnableClientState(GL10.GL_COLOR_ARRAY)). The number of values it reads equals the last parameter, in your case points/2, but your color buffer contains only 3 color entries. The result is that only the first of your triangles has correct color mapping; all the rest is garbage, and the results are unpredictable.
Although this might seem inefficient in your case, you will need to repeat those color values in the for loop where you set your vertex coordinates, so that the color buffer has the same length as vertBuff. By length I mean the number of entries, not bytes, where one color entry consists of 4 floats and one position entry consists of 3 floats in your case.
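A minimal sketch of that change in the constructor, using one RGBA color per vertex (the color values are taken from the question's colors array):
float[] colors = new float[points * 4]; // one RGBA entry per vertex
for (int i = 0; i < points; i++) {
colors[i * 4]     = 0.7f; // R
colors[i * 4 + 1] = 0.7f; // G
colors[i * 4 + 2] = 0.7f; // B
colors[i * 4 + 3] = 0.5f; // A
}
ByteBuffer cbb = ByteBuffer.allocateDirect(colors.length * 4);
cbb.order(ByteOrder.nativeOrder());
colorBuffer = cbb.asFloatBuffer();
colorBuffer.put(colors);
colorBuffer.position(0);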
I spent almost a whole day trying to render a simple polygon using OpenGL ES 1.1 and vertex buffers, but no luck. I searched and searched, but I haven't found much.
This is what I have so far:
public class Polygon {
int bufferId = 0;
private FloatBuffer vertexBuffer; // Buffer for vertex-array
private float[] vertices = { // Vertices for the square
-1.0f, -1.0f, 0.0f, // 0. left-bottom
1.0f, -1.0f, 0.0f, // 1. right-bottom
-1.0f, 1.0f, 0.0f, // 2. left-top
1.0f, 1.0f, 0.0f // 3. right-top
};
private ByteBuffer indexBuffer;
private byte[] indices = {0, 1, 2, 3}; // Indices to above vertices (in CCW)
// Constructor - Setup the vertex buffer
public Polygon() {
// Setup vertex array buffer. Vertices in float. A float has 4 bytes
ByteBuffer vbb = ByteBuffer.allocateDirect(vertices.length * 4);
vbb.order(ByteOrder.nativeOrder()); // Use native byte order
vertexBuffer = vbb.asFloatBuffer(); // Convert from byte to float
vertexBuffer.put(vertices); // Copy data into buffer
vertexBuffer.position(0); // Rewind
indexBuffer = ByteBuffer.allocateDirect(indices.length);
indexBuffer.put(indices);
indexBuffer.position(0);
int[] buffers = new int[1];
GLES11.glGenBuffers(1, buffers, 0);
bufferId = buffers[0];
GLES11.glBindBuffer(GLES11.GL_ARRAY_BUFFER, bufferId);
GLES11.glBufferData(GLES11.GL_ARRAY_BUFFER, vertices.length, vertexBuffer, GLES11.GL_STATIC_DRAW);
GLES11.glBindBuffer(GLES11.GL_ARRAY_BUFFER, 0);
}
// Render the shape
public void draw(GL10 gl) {
GLES11.glBindBuffer(GLES11.GL_ARRAY_BUFFER, bufferId);
GLES11.glEnableClientState(GL10.GL_VERTEX_ARRAY);
GLES11.glVertexPointer(3, GLES11.GL_FLOAT, 0, 0);
GLES11.glDrawArrays(GL10.GL_TRIANGLE_STRIP, 0, vertices.length);
GLES11.glDisableClientState(GL10.GL_VERTEX_ARRAY);
GLES11.glBindBuffer(GLES11.GL_ARRAY_BUFFER, 0);
}
}
It doesn't render anything, and there is no relevant error in the Android logcat.
I omitted the rest of the code. The problem is obviously in this class, since it works fine when I change the draw method to this:
public void draw(GL10 gl) {
GLES11.glEnableClientState(GL10.GL_VERTEX_ARRAY);
GLES11.glVertexPointer(3, GLES11.GL_FLOAT, 0, vertexBuffer);
GLES11.glDrawArrays(GL10.GL_TRIANGLE_STRIP, 0, vertices.length);
GLES11.glDisableClientState(GL10.GL_VERTEX_ARRAY);
}
So, what am I doing wrong?
Don't look in logcat for errors; check for OpenGL errors with glGetError() and verify that it returns zero (no error) rather than nonzero (an error).
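For example, a small helper like this (the method name is illustrative; Log is android.util.Log) can be called after each GL call while debugging:
private static void checkGlError(String where) {
int error = GLES11.glGetError();
if (error != GLES11.GL_NO_ERROR) {
// GL_NO_ERROR is 0; anything else means a preceding call failed.
Log.e("GL", where + ": glError 0x" + Integer.toHexString(error));
}
}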
vertices.length is the wrong argument to glDrawArrays. You want to supply the number of vertices, not the number of floats. It should be vertices.length/3 (3 floats per vertex).
You're currently drawing way past your array into some garbage data, so I'm not sure what kind of consequences that could have.
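Putting that into the VBO path, a minimal sketch of the corrected draw method (note, as an aside, that glBufferData also takes a size in bytes, so the upload in the constructor likely wants vertices.length * 4 rather than vertices.length):
// In the constructor: the size passed to glBufferData is in bytes (4 bytes per float).
GLES11.glBufferData(GLES11.GL_ARRAY_BUFFER, vertices.length * 4, vertexBuffer, GLES11.GL_STATIC_DRAW);
// In draw(): the count is the number of vertices, not floats.
public void draw(GL10 gl) {
GLES11.glBindBuffer(GLES11.GL_ARRAY_BUFFER, bufferId);
GLES11.glEnableClientState(GL10.GL_VERTEX_ARRAY);
GLES11.glVertexPointer(3, GLES11.GL_FLOAT, 0, 0);
GLES11.glDrawArrays(GL10.GL_TRIANGLE_STRIP, 0, vertices.length / 3);
GLES11.glDisableClientState(GL10.GL_VERTEX_ARRAY);
GLES11.glBindBuffer(GLES11.GL_ARRAY_BUFFER, 0);
}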