I'm new to OpenGL ES 2, and I have read many topics about how to draw a circle in OpenGL ES 2 on Android. Based on Drawing Shapes and this code found on gamedev.net, I can draw triangles and squares, but I still don't know how to draw a circle. I now have three ways to draw a circle:
Generate vertices along a circle and use glDrawArrays(GL_LINES, ...). Depending on how many vertices you generate, this will yield a nice and crisp result.
Use a pregenerated texture of a circle (with alpha transparency) and map it onto a quad. This will give very smooth graphics and allows for a 'thick' circle, but it is not as flexible: even with mipmapping, you'll want the texture to be about the same size as the quad you render it on.
Use a fragment shader.
But how do I implement them?
I definitely do not recommend rendering a circle through geometry. It has two major disadvantages:
It is slow. If you want acceptable accuracy you need a lot of vertices, and each of these vertices needs to be processed in the shader. For a true circle you would need as many vertices as your circle has pixels.
It is not really flexible. Having different circles, and styling and coloring them, is hard to manage.
There is another method, which I personally use in every graphics API: render a triangle or a square/quad and use the fragment shader to make only the desired pixels (based on an equation) visible. It is very easy to understand, it is flexible, and it is fast. It needs blending, but that is not really hard to get working.
Steps:
Initialize your buffers with data. You need a vertex buffer for the vertices, an index buffer for the indices if you're using a square geometry, and a texture-coordinate buffer for your texture coordinates.
For a square I recommend using -1.0 as the lowest and 1.0 as the highest texture coordinate, because then you are able to use the unit circle equation.
In your fragment-shader, use something like this:
if ((textureCoord.x * textureCoord.x) + (textureCoord.y * textureCoord.y) <= 1.0)
{
// Render colored and desired transparency
}
else
{
// Render with 0.0 in alpha channel
}
The inequality (textureCoord.x * textureCoord.x) + (textureCoord.y * textureCoord.y) <= 1.0 describes a filled circle: since you want a disc, you render every pixel within that range, not just the border. You can change the condition to get other results, for example just the outline.
And that is it. It is not very complex to implement, so I won't post full rendering code here; everything interesting happens in the fragment shader.
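Putting the steps together, a complete fragment shader for this approach could look like the following sketch (the vColor uniform and vTextureCoord varying are assumed names; wire them up to match your own vertex shader):

```glsl
precision mediump float;

uniform vec4 vColor;          // fill color, set from the application
varying vec2 vTextureCoord;   // interpolated quad coordinate in [-1.0, 1.0]

void main() {
    // Unit-circle test: x*x + y*y <= 1.0 keeps the pixel
    if (dot(vTextureCoord, vTextureCoord) <= 1.0) {
        gl_FragColor = vColor;
    } else {
        // Fully transparent; requires blending to be enabled
        gl_FragColor = vec4(0.0, 0.0, 0.0, 0.0);
    }
}
```

Remember to enable blending in your GL setup, e.g. glEnable(GL_BLEND) with glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA), so the transparent fragments actually disappear.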
If you want to create geometry for the circle, do something like this:
int vertexCount = 30;
float radius = 1.0f;
float center_x = 0.0f;
float center_y = 0.0f;
// Create a buffer for vertex data
float buffer[] = new float[vertexCount * 2]; // (x,y) for each vertex
int idx = 0;
// Center vertex for triangle fan
buffer[idx++] = center_x;
buffer[idx++] = center_y;
// Outer vertices of the circle
int outerVertexCount = vertexCount - 1;
for (int i = 0; i < outerVertexCount; ++i) {
float percent = (i / (float) (outerVertexCount - 1));
float rad = (float) (percent * 2 * Math.PI);
// Vertex position
float outer_x = (float) (center_x + radius * Math.cos(rad));
float outer_y = (float) (center_y + radius * Math.sin(rad));
buffer[idx++] = outer_x;
buffer[idx++] = outer_y;
}
// Create the VBO from the buffer with glBufferData()
Then you can draw it with glDrawArrays(), either as GL_LINE_LOOP (contour only) or as GL_TRIANGLE_FAN (filled shape):
// Draw the circle contour (skip the center vertex at the start of the buffer)
glDrawArrays(GL_LINE_LOOP, 1, outerVertexCount);
// Draw circle as a filled shape
glDrawArrays(GL_TRIANGLE_FAN, 0, vertexCount);
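To sanity-check the generation loop above outside of GL, the same logic can be run as plain Java (a sketch using the names from the snippet):

```java
class CircleFan {
    // Builds a flat (x, y) array for a triangle fan: the center vertex
    // first, then outerVertexCount points on the circumference; the last
    // outer vertex repeats the first so the fan closes.
    static float[] build(int vertexCount, float radius,
                         float centerX, float centerY) {
        float[] buffer = new float[vertexCount * 2];
        int idx = 0;
        buffer[idx++] = centerX;
        buffer[idx++] = centerY;
        int outerVertexCount = vertexCount - 1;
        for (int i = 0; i < outerVertexCount; ++i) {
            float percent = i / (float) (outerVertexCount - 1);
            float rad = (float) (percent * 2 * Math.PI);
            buffer[idx++] = (float) (centerX + radius * Math.cos(rad));
            buffer[idx++] = (float) (centerY + radius * Math.sin(rad));
        }
        return buffer;
    }
}
```

The closure of the fan (last outer vertex equal to the first) is what makes GL_TRIANGLE_FAN produce a gap-free disc.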
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;
import android.opengl.GLES20;
import android.util.Log;
public class Circle {
private int mProgram, mPositionHandle, mColorHandle, mMVPMatrixHandle ;
private FloatBuffer mVertexBuffer;
private float vertices[] = new float[364 * 3];
float color[] = { 0.63671875f, 0.76953125f, 0.22265625f, 1.0f };
private final String vertexShaderCode =
"uniform mat4 uMVPMatrix;" +
"attribute vec4 vPosition;" +
"void main() {" +
" gl_Position = uMVPMatrix * vPosition;" +
"}";
private final String fragmentShaderCode =
"precision mediump float;" +
"uniform vec4 vColor;" +
"void main() {" +
" gl_FragColor = vColor;" +
"}";
Circle(){
vertices[0] = 0;
vertices[1] = 0;
vertices[2] = 0;
for(int i =1; i <364; i++){
vertices[(i * 3)+ 0] = (float) (0.5 * Math.cos((3.14/180) * (float)i ));
vertices[(i * 3)+ 1] = (float) (0.5 * Math.sin((3.14/180) * (float)i ));
vertices[(i * 3)+ 2] = 0;
}
Log.v("Thread",""+vertices[0]+","+vertices[1]+","+vertices[2]);
ByteBuffer vertexByteBuffer = ByteBuffer.allocateDirect(vertices.length * 4);
vertexByteBuffer.order(ByteOrder.nativeOrder());
mVertexBuffer = vertexByteBuffer.asFloatBuffer();
mVertexBuffer.put(vertices);
mVertexBuffer.position(0);
int vertexShader = loadShader(GLES20.GL_VERTEX_SHADER, vertexShaderCode);
int fragmentShader = loadShader(GLES20.GL_FRAGMENT_SHADER, fragmentShaderCode);
mProgram = GLES20.glCreateProgram(); // create empty OpenGL ES Program
GLES20.glAttachShader(mProgram, vertexShader); // add the vertex shader to program
GLES20.glAttachShader(mProgram, fragmentShader); // add the fragment shader to program
GLES20.glLinkProgram(mProgram);
}
public static int loadShader(int type, String shaderCode){
int shader = GLES20.glCreateShader(type);
GLES20.glShaderSource(shader, shaderCode);
GLES20.glCompileShader(shader);
return shader;
}
public void draw (float[] mvpMatrix){
GLES20.glUseProgram(mProgram);
// get handle to vertex shader's vPosition member
mPositionHandle = GLES20.glGetAttribLocation(mProgram, "vPosition");
// Enable a handle to the triangle vertices
GLES20.glEnableVertexAttribArray(mPositionHandle);
// Prepare the triangle coordinate data
GLES20.glVertexAttribPointer(mPositionHandle, 3,
GLES20.GL_FLOAT, false,12
,mVertexBuffer);
// get handle to fragment shader's vColor member
mColorHandle = GLES20.glGetUniformLocation(mProgram, "vColor");
// Set color for drawing the triangle
GLES20.glUniform4fv(mColorHandle, 1, color, 0);
mMVPMatrixHandle = GLES20.glGetUniformLocation(mProgram, "uMVPMatrix");
// Apply the projection and view transformation
GLES20.glUniformMatrix4fv(mMVPMatrixHandle, 1, false, mvpMatrix, 0);
// Draw the triangle
GLES20.glDrawArrays(GLES20.GL_TRIANGLE_FAN, 0, 364);
// Disable vertex array
GLES20.glDisableVertexAttribArray(mPositionHandle);
}
}
This is a modified version of the above answer. It also includes the code to color the circle. Most of the calls are OpenGL ES 1 style, though. Mind the naming convention of the class Toilet, LOL. If you need the code of the other classes where I render OpenGL as well, let me know.
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;
import javax.microedition.khronos.opengles.GL10;
public class Toilet {
// Circle variables
int circlePoints = 30;
float radius = 1.0f;
float center_x = 0.0f;
float center_y = 0.0f;
// Outer vertices of the circle i.e. excluding the center_x, center_y
int circumferencePoints = circlePoints-1;
// Circle vertices and buffer variables
int vertices = 0;
float circleVertices[] = new float[circlePoints*2];
private FloatBuffer toiletBuff; // 4 bytes per float
// Color values
private float rgbaValues[] = {
1, 1, 0, .5f,
.25f, 0, .85f, 1,
0, 1, 1, 1
};
private FloatBuffer colorBuff;
public Toilet()
{
// The initial buffer values
circleVertices[vertices++] = center_x;
circleVertices[vertices++] = center_y;
// Set circle vertices values
for (int i = 0; i < circumferencePoints; i++)
{
float percent = (i / (float) (circumferencePoints - 1));
float radians = (float) (percent * 2 * Math.PI);
// Vertex position
float outer_x = (float) (center_x + radius * Math.cos(radians));
float outer_y = (float) (center_y + radius * Math.sin(radians));
circleVertices[vertices++] = outer_x;
circleVertices[vertices++] = outer_y;
}
// Each float has four bytes
ByteBuffer toiletByteBuff = ByteBuffer
.allocateDirect(circleVertices.length * 4);
// Garbage collector won't throw this away
toiletByteBuff.order(ByteOrder.nativeOrder());
toiletBuff = toiletByteBuff.asFloatBuffer();
toiletBuff.put(circleVertices);
toiletBuff.position(0);
// Each float has four bytes
ByteBuffer clrBuff = ByteBuffer.allocateDirect(rgbaValues.length * 4);
// Garbage collector won't throw this away
clrBuff.order(ByteOrder.nativeOrder());
colorBuff = clrBuff.asFloatBuffer();
colorBuff.put(rgbaValues);
colorBuff.position(0);
}
// Draw methods
public void draw(GL10 gl) {
// Get the front face
gl.glFrontFace(GL10.GL_CW); // Front facing is clockwise
gl.glEnableClientState(GL10.GL_VERTEX_ARRAY);
// Enable color array
gl.glEnableClientState(GL10.GL_COLOR_ARRAY);
// Pointer to the buffer
gl.glVertexPointer(2, GL10.GL_FLOAT, 0, toiletBuff);
// Pointer to color
gl.glColorPointer(4, GL10.GL_FLOAT, 0, colorBuff);
// Draw hollow circle
//gl.glDrawArrays(GL10.GL_LINE_LOOP, 1, circumferencePoints);
// Draw circle as filled shape
gl.glDrawArrays(GL10.GL_TRIANGLE_FAN, 0, circlePoints);
gl.glDisableClientState(GL10.GL_COLOR_ARRAY);
gl.glDisableClientState(GL10.GL_VERTEX_ARRAY);
}
}
One major flaw I noticed in 'goal's post: You can't change the position of the circle.
Here is the fix. Notice the end of the first two lines in the 'for' loop.
vertices[0] = 0;
vertices[1] = 0;
vertices[2] = 0;
for (int i = 1; i < 364; i++) {
vertices[(i * 3)+ 0] = (float) (0.5 * Math.cos((3.14/180) * (float)i ) + vertices[0]);
vertices[(i * 3)+ 1] = (float) (0.5 * Math.sin((3.14/180) * (float)i ) + vertices[1]);
vertices[(i * 3)+ 2] = 0;
}
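The same idea as a small standalone helper, using Math.toRadians instead of the 3.14/180 approximation (a sketch; the 364-entry layout matches the code above):

```java
class CenteredCircle {
    // Entry 0 is the center (cx, cy, 0); entries 1..363 walk the
    // circumference one degree at a time, offset by the center.
    static float[] build(float cx, float cy, float radius) {
        float[] vertices = new float[364 * 3];
        vertices[0] = cx;
        vertices[1] = cy;
        vertices[2] = 0f;
        for (int i = 1; i < 364; i++) {
            double rad = Math.toRadians(i);
            vertices[i * 3]     = (float) (cx + radius * Math.cos(rad));
            vertices[i * 3 + 1] = (float) (cy + radius * Math.sin(rad));
            vertices[i * 3 + 2] = 0f;
        }
        return vertices;
    }
}
```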
I am trying to render an opaque torus in Android using OpenGL ES 2.0. When I added colour, following this guide, I noticed an artefact when viewing the torus from certain perspectives. I have linked an image that shows this, though the animation here may make it clearer.
After some initial reading, I considered that it could be a depth buffer issue, since it appears that the interior rear surface is being rendered over the exterior, front-facing surface that should be seen. However, changing the view frustum near/far limits to try to maximise the separation between surfaces hasn't helped.
I am certain that the vertices themselves are correct, from rendering using GLES20.GL_LINES instead of GLES20.GL_TRIANGLES. Any ideas what could be causing this artefact?
Below is the code for the surface:
public class Miller {
private FloatBuffer verticesBuffer;
private ShortBuffer indicesBuffer;
final int nTheta = 50; // Number of divisions per 2pi theta.
final int nPhi = 50; // And per 2pi phi.
private int mProgramHandle;
private final int POSITION_DATA_SIZE_IN_ELEMENTS = 3; // Number of elements per coordinate per vertex (x,y,z)
private final int COLOR_DATA_SIZE_IN_ELEMENTS = 4; // Number of elements per colour per vertex (r,g,b,a)
private final int BYTES_PER_FLOAT = 4; // Number of bytes used per float.
private final int BYTES_PER_SHORT = 2; // Number of bytes used per short.
private final int POSITION_DATA_SIZE = POSITION_DATA_SIZE_IN_ELEMENTS * nTheta * nPhi;
private final int COLOR_DATA_SIZE = COLOR_DATA_SIZE_IN_ELEMENTS * nTheta * nPhi;
final int STRIDE = (POSITION_DATA_SIZE_IN_ELEMENTS + COLOR_DATA_SIZE_IN_ELEMENTS)* BYTES_PER_FLOAT;
// Use to access and set the view transformation
private int mMVPMatrixHandle;
private final String fragmentShaderCode =
"precision mediump float;" +
"varying vec4 vColor;" +
"void main() {" +
" gl_FragColor = vColor;" +
"}";
private final String vertexShaderCode =
"uniform mat4 uMVPMatrix;" +
"attribute vec4 aColor;" +
"attribute vec4 aPosition;" +
"varying vec4 vColor;" +
"void main() {" +
" vColor = aColor;" +
" gl_Position = uMVPMatrix * aPosition;" +
"}";
private float a; // Minor radius
private float R0; // Major radius
int nVertices = nTheta*nPhi; // Number of vertices
Miller(float minrad, float majrad) {
this.R0 = majrad/3.0f; // Rescale.
this.a = minrad/3.0f;
ByteBuffer buffer1 = ByteBuffer.allocateDirect(nVertices * (POSITION_DATA_SIZE_IN_ELEMENTS + COLOR_DATA_SIZE_IN_ELEMENTS) * BYTES_PER_FLOAT );
buffer1.order(ByteOrder.nativeOrder());
verticesBuffer = buffer1.asFloatBuffer();
for (int iTheta = 0; iTheta < nTheta; iTheta++) {
float theta = (float) (iTheta * 2 * Math.PI / nTheta);
for (int iPhi = 0; iPhi < nPhi; iPhi++) {
float phi = (float) (iPhi * 2 * Math.PI / nPhi);
// Circular torus vertices
float x = (float) ((R0 + a * Math.cos(theta)) * Math.cos(phi));
float y = (float) (a * Math.sin(theta));
float z = (float) ((R0 + a * Math.cos(theta)) * Math.sin(phi));
verticesBuffer.put(x);
verticesBuffer.put(y);
verticesBuffer.put(z);
float mod = (float)Math.sqrt(x*x + y*y + z*z); // Distance from origin to point
float cx = (float)Math.pow(Math.sin(phi),2);
float cy = (float)Math.pow(Math.sin(phi),2);
float cz = (float)Math.pow(Math.cos(phi),2); // colours
// Add colours according to position
verticesBuffer.put(cx);
verticesBuffer.put(cy);
verticesBuffer.put(cz);
verticesBuffer.put(1.0f); // Opaque
}
}
verticesBuffer.position(0);
// Create a buffer for the indices: 6 indices (two triangles) of 2 bytes each per quad
ByteBuffer buffer2 = ByteBuffer.allocateDirect(nPhi *nTheta * POSITION_DATA_SIZE_IN_ELEMENTS * BYTES_PER_SHORT * 2);
buffer2.order(ByteOrder.nativeOrder());
indicesBuffer = buffer2.asShortBuffer();
for (int iTheta = 0; iTheta < nTheta; iTheta++) {
for (int iPhi = 0; iPhi < nPhi; iPhi++) {
int f = iTheta* nPhi + iPhi; // First vertex
int s,fp1,sp1; // Initialise second, first plus 1, second plus 1.
if (iTheta != nTheta-1) { // Triangles that link back to theta=0
s = f + nPhi;
} else {
s = iPhi;
}
if (iPhi != nPhi-1) { // Triangles that link back to phi = 0
fp1 = f+1;
sp1 = s+1;
} else {
fp1 = f-iPhi;
sp1 = s-iPhi;
}
indicesBuffer.put((short)f); // First triangle
indicesBuffer.put((short)fp1);
indicesBuffer.put((short)s);
indicesBuffer.put((short)s); // Second triangle
indicesBuffer.put((short)fp1);
indicesBuffer.put((short)sp1);
}
}
indicesBuffer.position(0);
int vertexShaderHandle = TokGLRenderer.loadShader(GLES20.GL_VERTEX_SHADER, // Load vertex shader - acquire handle.
vertexShaderCode);
int fragmentShaderHandle = TokGLRenderer.loadShader(GLES20.GL_FRAGMENT_SHADER, // And fragment shader handle.
fragmentShaderCode);
// create empty OpenGL ES Program
mProgramHandle = GLES20.glCreateProgram();
// add the vertex shader to program
GLES20.glAttachShader(mProgramHandle, vertexShaderHandle);
// add the fragment shader to program
GLES20.glAttachShader(mProgramHandle, fragmentShaderHandle);
// Bind attributes
GLES20.glBindAttribLocation(mProgramHandle, 0, "aPosition");
GLES20.glBindAttribLocation(mProgramHandle, 1, "aColor");
// creates OpenGL ES program executables
GLES20.glLinkProgram(mProgramHandle);
}
private int mPositionHandle;
private int mNormalHandle;
private int mColorHandle;
public void draw(float[] mvpMatrix) {
// Add program to OpenGL ES environment
GLES20.glUseProgram(mProgramHandle);
// get handle to vertex shader's aPosition member
mPositionHandle = GLES20.glGetAttribLocation(mProgramHandle, "aPosition");
// get handle to fragment shader's vColor member
mColorHandle = GLES20.glGetAttribLocation(mProgramHandle, "aColor"); // USED TO BE vColor
// get handle to shape's transformation matrix
mMVPMatrixHandle = GLES20.glGetUniformLocation(mProgramHandle, "uMVPMatrix");
// Set color for drawing the triangle
//GLES20.glUniform4fv(mColorHandle, 1, color, 0);
// Pass in the position information
verticesBuffer.position(0);
GLES20.glVertexAttribPointer(mPositionHandle, POSITION_DATA_SIZE_IN_ELEMENTS, GLES20.GL_FLOAT, false, STRIDE, verticesBuffer);
GLES20.glEnableVertexAttribArray(mPositionHandle); // Enable handle to position of vertices
// Pass in the colour information
verticesBuffer.position(POSITION_DATA_SIZE_IN_ELEMENTS);
GLES20.glVertexAttribPointer(mColorHandle, COLOR_DATA_SIZE_IN_ELEMENTS, GLES20.GL_FLOAT, false, STRIDE, verticesBuffer);
GLES20.glEnableVertexAttribArray(mColorHandle); // Enable handle to colour of vertices
// Pass the projection and view transformation to the shader
GLES20.glUniformMatrix4fv(mMVPMatrixHandle, 1, false, mvpMatrix, 0);
// Draw vertices linked by triangles.
GLES20.glDrawElements(GLES20.GL_TRIANGLES, 6*nTheta*nPhi, GLES20.GL_UNSIGNED_SHORT, indicesBuffer);
// Disable vertex array
GLES20.glDisableVertexAttribArray(mPositionHandle);
GLES20.glDisableVertexAttribArray(mColorHandle);
}
}
and for the renderer:
public class TokGLRenderer implements GLSurfaceView.Renderer {
// mMVPMatrix is an abbreviation for "Model View Projection Matrix"
private final float[] mMVPMatrix = new float[16];
private final float[] mProjectionMatrix = new float[16];
private final float[] mViewMatrix = new float[16];
private Miller surf;
public void onSurfaceCreated(GL10 unused) {
surf = new Miller(0.96f, 3.1439243f);
}
public void onSurfaceChanged(GL10 gl10, int width, int height) {
GLES20.glViewport(0,0, width, height);
float ratio = (float) width / height;
// this projection matrix is applied to object coordinates
// in the onDrawFrame() method
float zoom = 0.9f;
Matrix.frustumM(mProjectionMatrix, 0, -ratio/zoom, ratio/zoom, -1f/zoom, 1f/zoom, 7f, 11f);
}
private float[] mRotationMatrix = new float[16];
public void onDrawFrame(GL10 unused) {
// Redraw background color
GLES20.glClear(GLES20.GL_DEPTH_BUFFER_BIT | GLES20.GL_COLOR_BUFFER_BIT);
// Set the camera position (View matrix)
Matrix.setLookAtM(mViewMatrix, 0, 5f, 5f, 5f, 0f, 0f, 0f, 0f, 1.0f, 0.0f);
// Calculate the projection and view transformation
Matrix.multiplyMM(mMVPMatrix, 0, mProjectionMatrix, 0, mViewMatrix, 0);
surf.draw(mMVPMatrix);
}
public static int loadShader(int type, String shaderCode){
// create a vertex shader type (GLES20.GL_VERTEX_SHADER)
// or a fragment shader type (GLES20.GL_FRAGMENT_SHADER)
int shader = GLES20.glCreateShader(type);
// add the source code to the shader and compile it
GLES20.glShaderSource(shader, shaderCode);
GLES20.glCompileShader(shader);
return shader;
}
}
To make depth testing work you have to enable it (GLES20.glEnable(GLES20.GL_DEPTH_TEST)) and you have to make sure the surface actually has a depth buffer.
In GLSurfaceView this can be done with the 5th parameter of setEGLConfigChooser, e.g. for a depth buffer size of 16 bits:
setEGLConfigChooser(8, 8, 8, 8, 16, 0)
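Put together, the setup could look like this (a sketch, not runnable outside Android; TokSurfaceView is an assumed name for your GLSurfaceView subclass):

```java
public class TokSurfaceView extends GLSurfaceView {
    public TokSurfaceView(Context context) {
        super(context);
        setEGLContextClientVersion(2);
        // red, green, blue, alpha, depth, stencil bit sizes:
        // request a 16-bit depth buffer
        setEGLConfigChooser(8, 8, 8, 8, 16, 0);
        setRenderer(new TokGLRenderer());
    }
}
```

Then call GLES20.glEnable(GLES20.GL_DEPTH_TEST) in onSurfaceCreated() of the renderer before drawing.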
I am trying to implement a 3D application for Android, and I am having trouble when drawing 3D objects, like a cone for example.
The problem is that I can't notice the transitions between the different faces; all of them are drawn with the same color.
I think I need to add shading to the polygons, but I can't find any tutorial showing me how to do that.
This is the code I am using to draw a cone.
public class Cone{
float baseSize = 0f;
float height = 0f;
protected final float[] mTransformMatrix = new float[16];
private FloatBuffer vertexBuffer;
private final int mProgram;
private final String vertexShaderCode =
// This matrix member variable provides a hook to manipulate
// the coordinates of the objects that use this vertex shader
"uniform mat4 uMVPMatrix;" +
"attribute vec4 vPosition;" +
"void main() {" +
// the matrix must be included as a modifier of gl_Position
// Note that the uMVPMatrix factor *must be first* in order
// for the matrix multiplication product to be correct.
" gl_Position = uMVPMatrix * vPosition;" +
"}";
// Use to access and set the view transformation
private int mMVPMatrixHandle;
private final String fragmentShaderCode =
"precision mediump float;" +
"uniform vec4 vColor;" +
"void main() {" +
" gl_FragColor = vColor;" +
"}";
// number of coordinates per vertex in this array
static final int COORDS_PER_VERTEX = 3;
static float topCoords[] = new float[30];
static float baseCoords[] = new float[30];
static float lineCoords[] = new float[96];
// Set color with red, green, blue and alpha (opacity) values
float color[] = { 1f, 0f, 0f, 1.0f };
float linecolor[] = { 1f, 1f, 1f, 1.0f };
public Cone(float baseSize , float height) {
this.baseSize = baseSize;
this.height = height;
float ang = (float) ((2*Math.PI) / 8);
Matrix.setIdentityM(mTransformMatrix, 0);
// initialize vertex byte buffer for shape coordinates
ByteBuffer bb = ByteBuffer.allocateDirect(
// (number of coordinate values * 4 bytes per float)
(topCoords.length * 2 + lineCoords.length) * 4);
// use the device hardware's native byte order
bb.order(ByteOrder.nativeOrder());
// create a floating point buffer from the ByteBuffer
vertexBuffer = bb.asFloatBuffer();
// add the coordinates to the FloatBuffer
topCoords[0] = 0;
topCoords[1] = height;
topCoords[2] = 0;
baseCoords[0]= 0;
baseCoords[1]= 0;
baseCoords[2]= 0;
for(int i=1; i < 10;i++) {
topCoords[i*3] = this.baseSize * (float) Math.cos(i*ang);
topCoords[i*3 + 1] = 0;
topCoords[i*3 + 2] = this.baseSize * (float) Math.sin(i*ang);
baseCoords[i*3] = this.baseSize * (float) Math.cos(i*ang);
baseCoords[i*3 + 1] = 0;
baseCoords[i*3 + 2] = this.baseSize * (float) Math.sin(i*ang);
}
for (int i = 0 ; i < 8 ; i ++) {
lineCoords[i*6] = 0;
lineCoords[i*6 + 1] = height;
lineCoords[i*6 + 2] = 0;
lineCoords[i*6 + 3] = this.baseSize *(float) Math.cos((i+1)*ang);
lineCoords[i*6 + 4] = 0;
lineCoords[i*6 + 5] = this.baseSize * (float) Math.sin((i+1)*ang);
}
int j = 0;
for (int i = 8 ; i < 16 ; i++){
lineCoords[i*6] = this.baseSize *(float) Math.cos((j+1)*ang);
lineCoords[i*6 + 1] = 0;
lineCoords[i*6 + 2] = this.baseSize * (float) Math.sin((j+1)*ang);
lineCoords[i*6 + 3] = this.baseSize *(float) Math.cos((j+2)*ang);
lineCoords[i*6 + 4] = 0;
lineCoords[i*6 + 5] = this.baseSize * (float) Math.sin((j+2)*ang);
j++;
}
vertexBuffer.put(topCoords);
vertexBuffer.put(baseCoords);
vertexBuffer.put(lineCoords);
// set the buffer to read the first coordinate
vertexBuffer.position(0);
int vertexShader = MyGLRenderer.loadShader(GLES20.GL_VERTEX_SHADER,
vertexShaderCode);
int fragmentShader = MyGLRenderer.loadShader(GLES20.GL_FRAGMENT_SHADER,
fragmentShaderCode);
// create empty OpenGL ES Program
mProgram = GLES20.glCreateProgram();
// add the vertex shader to program
GLES20.glAttachShader(mProgram, vertexShader);
// add the fragment shader to program
GLES20.glAttachShader(mProgram, fragmentShader);
// creates OpenGL ES program executables
GLES20.glLinkProgram(mProgram);
}
private int mPositionHandle;
private int mColorHandle;
private final int topVertexCount = topCoords.length / COORDS_PER_VERTEX;
private final int lineVertexCount = lineCoords.length / COORDS_PER_VERTEX;
private final int vertexStride = COORDS_PER_VERTEX * 4; // 4 bytes per vertex
public void draw(float[] mvpMatrix) {
// Add program to OpenGL ES environment
GLES20.glUseProgram(mProgram);
// get handle to vertex shader's vPosition member
mPositionHandle = GLES20.glGetAttribLocation(mProgram, "vPosition");
// Enable a handle to the triangle vertices
GLES20.glEnableVertexAttribArray(mPositionHandle);
// Prepare the triangle coordinate data
GLES20.glVertexAttribPointer(mPositionHandle, COORDS_PER_VERTEX,
GLES20.GL_FLOAT, false,
vertexStride, vertexBuffer);
// get handle to fragment shader's vColor member
mColorHandle = GLES20.glGetUniformLocation(mProgram, "vColor");
// Set color for drawing the cone
GLES20.glUniform4fv(mColorHandle, 1, color, 0);
// get handle to shape's transformation matrix
mMVPMatrixHandle = GLES20.glGetUniformLocation(mProgram, "uMVPMatrix");
// Pass the projection and view transformation to the shader
GLES20.glUniformMatrix4fv(mMVPMatrixHandle, 1, false, mvpMatrix, 0);
// Draw the cone
GLES20.glDrawArrays(GLES20.GL_TRIANGLE_FAN, 0, topVertexCount);
//Draw base
GLES20.glDrawArrays(GLES20.GL_TRIANGLE_FAN, topVertexCount, topVertexCount);
//Draw cone lines
GLES20.glUniform4fv(mColorHandle, 1, linecolor, 0);
GLES20.glDrawArrays(GLES20.GL_LINES, topVertexCount*2, lineVertexCount);
// Disable vertex array
GLES20.glDisableVertexAttribArray(mPositionHandle);
}
}
Thanks for the help
Your fragment shader code indeed assigns the same color to every fragment it processes. There are a number of different ways you can add "lighting" to your scene. 'Gouraud shading' is one of the easiest to implement with modern shaders: it computes a light intensity at each vertex of a triangle, based on the vertex normal and the light direction, and interpolates that intensity across the triangle. In modern shading languages (including OpenGL ES 2), this interpolation is done for you.
There are many other possible lighting models; however, most (if not all, including Gouraud shading) will require that you generate vertex normals, which you are not doing in your cone mesh generation code.
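As a sketch of what per-vertex (Gouraud) diffuse lighting could look like in ES 2 shaders; the aNormal attribute, the uLightDir uniform, and the 0.2 ambient floor are all assumptions you would adapt to your own code:

```glsl
// Vertex shader
uniform mat4 uMVPMatrix;
uniform vec3 uLightDir;   // normalized directional light, model space
attribute vec4 vPosition;
attribute vec3 aNormal;   // per-vertex normal, generated with the mesh
varying float vIntensity;

void main() {
    // Lambertian diffuse term with a small ambient floor; the varying
    // is interpolated across the triangle automatically
    vIntensity = max(dot(normalize(aNormal), uLightDir), 0.2);
    gl_Position = uMVPMatrix * vPosition;
}

// Fragment shader
precision mediump float;
uniform vec4 vColor;
varying float vIntensity;

void main() {
    gl_FragColor = vec4(vColor.rgb * vIntensity, vColor.a);
}
```

For a faceted look on the cone, you would compute one normal per face (from the cross product of two edge vectors) and duplicate vertices per face rather than sharing them.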
I have been working on an OpenGL ES 2.0 Android project. My objective is to create a circle. From the help that I got online, I managed to run this code, which is supposed to give a circle in the view, but instead I got an oval. I tried many ways to make it a circle. Any help is appreciated.
Thanks
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;
import android.opengl.GLES20;
import android.util.Log;
public class Circle {
private int mProgram, mPositionHandle, mColorHandle, mMVPMatrixHandle;
private FloatBuffer mVertexBuffer;
private float vertices[] = new float[364 * 3];
float color[] = { 0.63671875f, 0.76953125f, 0.22265625f, 1.0f };
private final String vertexShaderCode = "uniform mat4 uMVPMatrix;"
+ "attribute vec4 vPosition;" + "void main() {"
+ " gl_Position = uMVPMatrix * vPosition;" + "}";
private final String fragmentShaderCode = "precision mediump float;"
+ "uniform vec4 vColor;" + "void main() {"
+ " gl_FragColor = vColor;" + "}";
Circle() {
vertices[0] = 0;
vertices[1] = 0;
vertices[2] = 0;
for (int i = 1; i < 364; i++) {
vertices[(i * 3) + 0] = (float) (0.5 * Math.cos((3.14 / 180)
* (float) i) + vertices[0]);
vertices[(i * 3) + 1] = (float) (0.5 * Math.sin((3.14 / 180)
* (float) i) + vertices[1]);
vertices[(i * 3) + 2] = 0;
}
Log.v("Thread", "" + vertices[0] + "," + vertices[1] + ","
+ vertices[2]);
ByteBuffer vertexByteBuffer = ByteBuffer
.allocateDirect(vertices.length * 4);
vertexByteBuffer.order(ByteOrder.nativeOrder());
mVertexBuffer = vertexByteBuffer.asFloatBuffer();
mVertexBuffer.put(vertices);
mVertexBuffer.position(0);
int vertexShader = loadShader(GLES20.GL_VERTEX_SHADER, vertexShaderCode);
int fragmentShader = loadShader(GLES20.GL_FRAGMENT_SHADER,
fragmentShaderCode);
mProgram = GLES20.glCreateProgram(); // create empty OpenGL ES Program
GLES20.glAttachShader(mProgram, vertexShader); // add the vertex shader
// to program
GLES20.glAttachShader(mProgram, fragmentShader); // add the fragment
// shader to program
GLES20.glLinkProgram(mProgram);
}
public static int loadShader(int type, String shaderCode) {
int shader = GLES20.glCreateShader(type);
GLES20.glShaderSource(shader, shaderCode);
GLES20.glCompileShader(shader);
return shader;
}
public void draw(float[] mvpMatrix) {
GLES20.glUseProgram(mProgram);
// get handle to vertex shader's vPosition member
mPositionHandle = GLES20.glGetAttribLocation(mProgram, "vPosition");
// Enable a handle to the triangle vertices
GLES20.glEnableVertexAttribArray(mPositionHandle);
// Prepare the triangle coordinate data
GLES20.glVertexAttribPointer(mPositionHandle, 3, GLES20.GL_FLOAT,
false, 12, mVertexBuffer);
// get handle to fragment shader's vColor member
mColorHandle = GLES20.glGetUniformLocation(mProgram, "vColor");
// Set color for drawing the triangle
GLES20.glUniform4fv(mColorHandle, 1, color, 0);
mMVPMatrixHandle = GLES20.glGetUniformLocation(mProgram, "uMVPMatrix");
// Apply the projection and view transformation
GLES20.glUniformMatrix4fv(mMVPMatrixHandle, 1, false, mvpMatrix, 0);
// Draw the triangle
GLES20.glDrawArrays(GLES20.GL_TRIANGLE_FAN, 0, 364);
// Disable vertex array
GLES20.glDisableVertexAttribArray(mPositionHandle);
}
}
And the output I am getting is this
The vertex coordinates span a (normalised) range of -1 to +1, and you are generating vertices from -0.5 to +0.5.
Now, where does the GL engine draw? To a buffer provided by the window system, in this case the Android window system (ultimately the Linux kernel display driver at the lowest level). It provides a buffer with a width×height of, say, 480×640 (your screenshot shows that width < height). So the drawn circle on the screen has a width of 0.5*480 but a height of 0.5*640, hence the oval shape.
You will have to scale down your y coordinate by the ratio (screenwidth/screenheight) to get a true circle. You can query the dimensions using either the EGL API or the Android API in your application.
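A sketch of that correction at vertex-generation time (widthPx and heightPx would come from onSurfaceChanged(); alternatively, bake the ratio into your projection matrix instead):

```java
class AspectCircle {
    // Returns the (x, y) circle vertex at the given angle, with y scaled
    // by width/height so equal NDC radii map to equal pixel radii.
    static float[] vertex(double deg, float radius, int widthPx, int heightPx) {
        float aspect = (float) widthPx / heightPx;
        float x = (float) (radius * Math.cos(Math.toRadians(deg)));
        float y = (float) (radius * Math.sin(Math.toRadians(deg))) * aspect;
        return new float[] { x, y };
    }
}
```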
Hello guys, I am a beginner to OpenGL.
I am trying to follow the Android developers tutorials, but I am not able to see the triangle.
What is wrong?
I created a triangle in onSurfaceCreated() and called its draw method inside onDrawFrame() of the Renderer class.
Triangle class:
public class Triangle {
private final String vertexShaderCode =
"attribute vec4 vPosition;" +
"void main() {" +
" gl_Position = vPosition;" +
"}";
private final int vertexCount = triangleCoords.length / COORDS_PER_VERTEX;
private final int vertexStride = COORDS_PER_VERTEX * 4;
private int mProgram,mPositionHandle,mColorHandle;
private final String fragmentShaderCode =
"precision mediump float;" +
"uniform vec4 vColor;" +
"void main() {" +
" gl_FragColor = vColor;" +
"}";
private FloatBuffer vertexBuffer;
// number of coordinates per vertex in this array
static final int COORDS_PER_VERTEX = 3;
static float triangleCoords[] = { // in counterclockwise order:
0.0f, 0.622008459f, 0.0f, // top
-0.5f, -0.311004243f, 0.0f, // bottom left
0.5f, -0.311004243f, 0.0f // bottom right
};
// Set color with red, green, blue and alpha (opacity) values
float color[] = { 0.63671875f, 0.76953125f, 0.22265625f, 1.0f };
public Triangle() {
// initialize vertex byte buffer for shape coordinates
ByteBuffer bb = ByteBuffer.allocateDirect(
// (number of coordinate values * 4 bytes per float)
triangleCoords.length * 4);
// use the device hardware's native byte order
bb.order(ByteOrder.nativeOrder());
// create a floating point buffer from the ByteBuffer
vertexBuffer = bb.asFloatBuffer();
// add the coordinates to the FloatBuffer
vertexBuffer.put(triangleCoords);
// set the buffer to read the first coordinate
vertexBuffer.position(0);
int vertexShader = loadShader(GLES20.GL_VERTEX_SHADER, vertexShaderCode);
int fragmentShader = loadShader(GLES20.GL_FRAGMENT_SHADER, fragmentShaderCode);
mProgram = GLES20.glCreateProgram(); // create empty OpenGL ES Program
GLES20.glAttachShader(mProgram, vertexShader); // add the vertex shader to program
GLES20.glAttachShader(mProgram, fragmentShader); // add the fragment shader to program
GLES20.glLinkProgram(mProgram);
}
public static int loadShader(int type, String shaderCode){
// create a vertex shader type (GLES20.GL_VERTEX_SHADER)
// or a fragment shader type (GLES20.GL_FRAGMENT_SHADER)
int shader = GLES20.glCreateShader(type);
// add the source code to the shader and compile it
GLES20.glShaderSource(shader, shaderCode);
GLES20.glCompileShader(shader);
return shader;
}
public void draw() {
// Add program to OpenGL ES environment
GLES20.glUseProgram(mProgram);
// get handle to vertex shader's vPosition member
mPositionHandle = GLES20.glGetAttribLocation(mProgram, "vPosition");
// Enable a handle to the triangle vertices
GLES20.glEnableVertexAttribArray(mPositionHandle);
// Prepare the triangle coordinate data
GLES20.glVertexAttribPointer(mPositionHandle, COORDS_PER_VERTEX,
GLES20.GL_FLOAT, false,
vertexStride, vertexBuffer);
// get handle to fragment shader's vColor member
mColorHandle = GLES20.glGetUniformLocation(mProgram, "vColor");
// Set color for drawing the triangle
GLES20.glUniform4fv(mColorHandle, 1, color, 0);
// Draw the triangle
GLES20.glDrawArrays(GLES20.GL_TRIANGLES, 0, vertexCount);
// Disable vertex array
GLES20.glDisableVertexAttribArray(mPositionHandle);
}
}
You are missing code to set up a projection matrix and viewport. You also need to call eglSwapBuffers(), unless you are using GLSurfaceView, which does that for you. You can use an ortho projection for simplicity, and it should be multiplied with each vPosition in your vertex shader.
This is how you can use and construct a projection matrix:
Ortho(-1.0f, -1.0f, 1.0f, 1.0f, 1.0f, -1.0f);
glUniformMatrix4fv(iProjectionMatrixLocation, 1, GL_FALSE, (const GLfloat *)&m_mViewProj);
glViewport(0, 0, m_iWidth, m_iHeight);
...
// Construct a matrix for an orthographic projection view.
void Button::Ortho(float left, float top, float right, float bottom, float nearPlane, float farPlane)
{
float rcplmr = 1.0f / (left - right);
float rcpbmt = 1.0f / (bottom - top);
float rcpnmf = 1.0f / (nearPlane - farPlane);
m_mViewProj.f0 = -2.0f * rcplmr;
m_mViewProj.f1 = 0.0f;
m_mViewProj.f2 = 0.0f;
m_mViewProj.f3 = 0.0f;
m_mViewProj.f4 = 0.0f;
m_mViewProj.f5 = -2.0f * rcpbmt;
m_mViewProj.f6 = 0.0f;
m_mViewProj.f7 = 0.0f;
m_mViewProj.f8 = 0.0f;
m_mViewProj.f9 = 0.0f;
m_mViewProj.f10 = -2.0f * rcpnmf;
m_mViewProj.f11 = 0.0f;
m_mViewProj.f12 = (right + left) * rcplmr;
m_mViewProj.f13 = (top + bottom) * rcpbmt;
m_mViewProj.f14 = (nearPlane + farPlane) * rcpnmf;
m_mViewProj.f15 = 1.0f;
}
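On Android you don't have to hand-roll this matrix; android.opengl.Matrix.orthoM builds an equivalent one for you. As a sanity check, here is a plain-Java port of the column-major math above (class and method names are mine), verifying that the corners of the view volume map to the corners of normalized device coordinates:

```java
// Plain-Java port of the Ortho() math above (column-major, 16 floats).
// Used here only to check the x/y mapping: (left, top) should land on
// (-1, 1) and (right, bottom) on (1, -1) in NDC.
public class OrthoCheck {

    public static float[] ortho(float left, float top, float right,
                                float bottom, float nearPlane, float farPlane) {
        float rcplmr = 1.0f / (left - right);
        float rcpbmt = 1.0f / (bottom - top);
        float rcpnmf = 1.0f / (nearPlane - farPlane);
        float[] m = new float[16]; // all other entries stay 0
        m[0]  = -2.0f * rcplmr;
        m[5]  = -2.0f * rcpbmt;
        m[10] = -2.0f * rcpnmf;
        m[12] = (right + left) * rcplmr;
        m[13] = (top + bottom) * rcpbmt;
        m[14] = (nearPlane + farPlane) * rcpnmf;
        m[15] = 1.0f;
        return m;
    }

    // Transform (x, y) by the matrix, ignoring z.
    public static float[] project(float[] m, float x, float y) {
        return new float[] { m[0] * x + m[12], m[5] * y + m[13] };
    }
}
```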
The third article here will help:
http://montgomery1.com/opengl/
I've just started learning OpenGL for Android and I'm having a weird problem when drawing lines. All I want to do is draw a line based on a finger motion, but as soon as I start swiping I always get an extra line following my motion from the origin (0,0).
here a picture:
http://imageshack.us/photo/my-images/137/screenshot2012061312174.jpg/
The arrow symbolizes my finger motion, and the line starting at the origin (red circle) is the mentioned line following my entire motion.
Don't be bothered by the Coords array; I know this isn't best practice, but I debugged the entire program and couldn't find any bugs involving this array.
I should probably mention that the ArrayList points contains all my generated points.
I've been trying to figure this out for quite a while now, but I'm really stuck; any suggestion would be helpful.
This is my entire render class.
public class HelloOpenGLES20Renderer implements GLSurfaceView.Renderer {
private FloatBuffer triangleVB;
private int mProgram;
private int maPositionHandle;
public ArrayList<PointWrapper> points;
private int muMVPMatrixHandle;
private float[] mMVPMatrix = new float[16];
private float[] mMMatrix = new float[16];
private float[] mVMatrix = new float[16];
private float[] mProjMatrix = new float[16];
private int[] viewport = new int[4];
private ArrayList<Float> coordinates;
float[] Coords = new float[100000];
boolean first;
private int counter;
private PointWrapper last;
private final String vertexShaderCode =
// This matrix member variable provides a hook to manipulate
// the coordinates of the objects that use this vertex shader
"uniform mat4 uMVPMatrix; \n" +
"attribute vec4 vPosition; \n" + "void main(){ \n" +
// the matrix must be included as a modifier of gl_Position
" gl_Position = uMVPMatrix * vPosition; \n" +
"} \n";
private final String fragmentShaderCode = "precision mediump float; \n"
+ "void main(){ \n"
+ " gl_FragColor = vec4 (0.63671875, 0.76953125, 0.22265625, 1.0); \n"
+ "} \n";
private int loadShader(int type, String shaderCode) {
// create a vertex shader type (GLES20.GL_VERTEX_SHADER)
// or a fragment shader type (GLES20.GL_FRAGMENT_SHADER)
int shader = GLES20.glCreateShader(type);
// add the source code to the shader and compile it
GLES20.glShaderSource(shader, shaderCode);
GLES20.glCompileShader(shader);
return shader;
}
public HelloOpenGLES20Renderer() {
points = new ArrayList<PointWrapper>();
first = true;
this.counter = 0;
last = new PointWrapper();
coordinates = new ArrayList<Float>();
}
private float[] convertCoordinates(PointWrapper f) {
float[] vector = new float[4];
GLU.gluUnProject(f.point.x, f.point.y, 0.0f, mVMatrix, 0, mProjMatrix,
0, viewport, 0, vector, 0);
return vector;
}
private void initShapes() {
ArrayList<PointWrapper> points2 = new ArrayList<PointWrapper>(points);
float[] vector;
if (!points2.isEmpty()) {
if(points2.size()%2==1){
points2.remove(points2.size()-1);
}
for (int i = counter/2; i < points2.size(); i++) {
vector = convertCoordinates(points2.get(i));
Coords[counter] = vector[0] / vector[3];
Coords[counter+1] = -1 * (vector[1] / vector[3]);
counter= counter+2;
}
}
// initialize vertex Buffer for triangle
ByteBuffer vbb = ByteBuffer.allocateDirect(
// (# of coordinate values * 4 bytes per float)
Coords.length * 4);
vbb.order(ByteOrder.nativeOrder());// use the device hardware's native
// byte order
triangleVB = vbb.asFloatBuffer(); // create a floating point buffer from
// the ByteBuffer
triangleVB.put(Coords); // add the coordinates to the
// FloatBuffer
triangleVB.position(0); // set the buffer to read the first coordinate
}
public void onSurfaceCreated(GL10 unused, EGLConfig config) {
// Set the background frame color
GLES20.glClearColor(0.5f, 0.5f, 0.5f, 1.0f);
// initialize the triangle vertex array
// initShapes();
int vertexShader = loadShader(GLES20.GL_VERTEX_SHADER, vertexShaderCode);
int fragmentShader = loadShader(GLES20.GL_FRAGMENT_SHADER,
fragmentShaderCode);
mProgram = GLES20.glCreateProgram(); // create empty OpenGL Program
GLES20.glAttachShader(mProgram, vertexShader); // add the vertex shader
// to program
GLES20.glAttachShader(mProgram, fragmentShader); // add the fragment
// shader to program
GLES20.glLinkProgram(mProgram); // creates OpenGL program executables
// get handle to the vertex shader's vPosition member
maPositionHandle = GLES20.glGetAttribLocation(mProgram, "vPosition");
}
public void onDrawFrame(GL10 unused) {
// Redraw background color
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
initShapes();
// Add program to OpenGL environment
GLES20.glUseProgram(mProgram);
// Prepare the triangle data
GLES20.glVertexAttribPointer(maPositionHandle, 2, GLES20.GL_FLOAT,
false, 0, triangleVB);
GLES20.glEnableVertexAttribArray(maPositionHandle);
// Apply a ModelView Projection transformation
Matrix.multiplyMM(mMVPMatrix, 0, mProjMatrix, 0, mVMatrix, 0);
GLES20.glUniformMatrix4fv(muMVPMatrixHandle, 1, false, mMVPMatrix, 0);
GLES20.glLineWidth(5f);
GLES20.glDrawArrays(GLES20.GL_LINE_STRIP, 0, counter);
}
public void onSurfaceChanged(GL10 unused, int width, int height) {
GLES20.glViewport(0, 0, width, height);
float ratio = (float) width / height;
viewport[0] = 0;
viewport[1] = 0;
viewport[2] = width;
viewport[3] = height;
// this projection matrix is applied to object coodinates
// in the onDrawFrame() method
Matrix.frustumM(mProjMatrix, 0, -ratio, ratio, -1, 1, 3, 7);
muMVPMatrixHandle = GLES20.glGetUniformLocation(mProgram, "uMVPMatrix");
Matrix.setLookAtM(mVMatrix, 0, 0, 0, -3, 0f, 0f, 0f, 0f, 1.0f, 0.0f);
}
}
My thanks in advance.
for (int i = counter/2; i < points2.size(); i++) {
vector = convertCoordinates(points2.get(i));
Coords[counter] = vector[0] / vector[3];
Coords[counter+1] = -1 * (vector[1] / vector[3]);
counter= counter+2;
}
You have initialized Coords to hold 100000 floats, and Java initializes them all to 0. When the loop finishes, 'counter' holds the number of floats you have written into the array.
What you pass to glDrawArrays should be the number of VERTICES to draw, so in this case half of 'counter'.
GLES20.glDrawArrays(GLES20.GL_LINE_STRIP, 0, counter);
Because you pass 'counter' instead of 'counter'/2, glDrawArrays reads 'counter'/2 extra vertices past the ones you wrote, and those are all (0,0); that is the stray line to the origin. The quickest fix would be to pass 'counter'/2 to glDrawArrays, but I'd suggest a clearer approach:
numOfVertices = points2.size(); //make field
int counter = 0; //make local
for (int i = 0; i < numOfVertices; i++) {
vector = convertCoordinates(points2.get(i));
Coords[counter] = vector[0] / vector[3];
Coords[counter+1] = -1 * (vector[1] / vector[3]);
counter= counter+2;
}
and then
GLES20.glDrawArrays(GLES20.GL_LINE_STRIP, 0, numOfVertices);
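To make the floats-versus-vertices relationship explicit, the packing can be sketched like this in plain Java; the Point class here is a stand-in for the question's PointWrapper, and the class/method names are mine:

```java
import java.util.List;

// Sketch of the fix above: pack one (x, y) pair per point and pass the
// number of VERTICES (not floats) to glDrawArrays.
public class LinePacker {

    // Stand-in for the question's PointWrapper.
    public static class Point {
        public final float x, y;
        public Point(float x, float y) { this.x = x; this.y = y; }
    }

    // Returns a tightly packed float array: [x0, y0, x1, y1, ...]
    public static float[] pack(List<Point> points) {
        float[] coords = new float[points.size() * 2];
        int counter = 0;
        for (Point p : points) {
            coords[counter]     = p.x;
            coords[counter + 1] = p.y;
            counter += 2; // two floats per vertex
        }
        return coords;
    }

    // The count to hand to glDrawArrays(GL_LINE_STRIP, 0, n):
    public static int vertexCount(float[] coords) {
        return coords.length / 2;
    }
}
```

Sizing the array to exactly points.size() * 2 also avoids drawing any of the zero-filled tail that caused the bug in the first place.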