I created an app that uses OpenGL ES 2.0 on an HTC Desire S.
It works on the HTC, but not on a Samsung Galaxy Tab 10.1.
The program cannot be linked (GLES20.glGetProgramiv(mProgram, GLES20.GL_LINK_STATUS, linOk, 0) gives -1) and glGetError() gives me error 1282 (Invalid Operation).
When I replace this line (in the shader):
graph_coord.z = (texture2D(mytexture, graph_coord.xy / 2.0 + 0.5).r);
with
graph_coord.z = 0.2;
it also works on the Galaxy Tab.
My shader looks like this:
private final String vertexShaderCode =
"attribute vec2 coord2d;" +
"varying vec4 graph_coord;" +
"uniform mat4 texture_transform;" +
"uniform mat4 vertex_transform;" +
"uniform sampler2D mytexture;" +
"void main(void) {" +
" graph_coord = texture_transform * vec4(coord2d, 0, 1);" +
" graph_coord.z = (texture2D(mytexture, graph_coord.xy / 2.0 + 0.5).r);" +
" gl_Position = vertex_transform * vec4(coord2d, graph_coord.z, 1);" +
"}";
This is where the shaders are attached:
mProgram = GLES20.glCreateProgram(); // create empty OpenGL Program
GLES20.glAttachShader(mProgram, vertexShader); // add the vertex shader to program
GLES20.glAttachShader(mProgram, fragmentShader); // add the fragment shader to program
GLES20.glLinkProgram(mProgram); // create OpenGL program executables
int linOk[] = new int[1];
GLES20.glGetProgramiv(mProgram, GLES20.GL_LINK_STATUS, linOk,0);
And the texture is loaded here:
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, texture_id[0]);
GLES20.glTexImage2D(
GLES20.GL_TEXTURE_2D, // target
0, // level, 0 = base, no mipmap
GLES20.GL_LUMINANCE, // internalformat
size, // width
size, // height
0, // border, always 0 in OpenGL ES
GLES20.GL_LUMINANCE, // format
GLES20.GL_UNSIGNED_BYTE, // type
values
);
This seems to be a limitation of the Nvidia Tegra GPU. I was able to reproduce the error on a Tegra 3 GPU. Even though texture lookups in the vertex shader are in theory part of OpenGL ES 2.0, according to Nvidia the number of vertex shader texture units (GL_MAX_VERTEX_TEXTURE_IMAGE_UNITS) for Tegra is 0 (PDF: OpenGL ES 2.0 Development for the Tegra Platform).
You have to use texture2DLod() instead of texture2D() to do texture lookups in the vertex shader.
GLSL specs, section 8.7 Texture Lookup Functions: http://www.khronos.org/files/opengles_shading_language.pdf
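For reference, a minimal sketch of that workaround (I have not verified it on Tegra myself): the lookup line of the vertex shader string above rewritten with texture2DLod(), which takes an explicit LOD as its third argument, plus a runtime query of the vertex-texture limit so the app can detect devices that report 0 units.
// Inside vertexShaderCode, replacing the texture2D() line:
"  graph_coord.z = texture2DLod(mytexture, graph_coord.xy / 2.0 + 0.5, 0.0).r;" +

// Java side: query how many texture image units the vertex shader actually has.
int[] maxVertexTextureUnits = new int[1];
GLES20.glGetIntegerv(GLES20.GL_MAX_VERTEX_TEXTURE_IMAGE_UNITS, maxVertexTextureUnits, 0);
// Nvidia documents this value as 0 on Tegra (see the PDF referenced above).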
I'm trying to implement a ModelViewer that can visualize triangulated shapes with realistic lighting.
Since realistic lighting doesn't seem possible with OpenGL ES 1.0 and I need a way to convey depth on a single-colored object, the project uses OpenGL ES 2.0, which is new to me.
The object itself consists of triangles that are drawn using:
GLES20.glDrawArrays(GLES20.GL_TRIANGLES, 0, vertexCount);
As a test project, I've set up simple shaders that draw the object using the model-view-projection matrix. After that, my intention was to implement lighting, but I cannot get past the first steps.
As soon as I work with varying variables, the vertex attributes are no longer found.
Removing the varying variables works, but I need to pass data. I've searched for days for a way to implement vertex and fragment shaders that do more than present an object at a location.
// VERTEX SHADER CODE
attribute vec4 v_Position;
uniform mat4 u_MVPMatrix;
uniform vec4 u_Color;
varying vec4 v_Color;
void main() {
gl_Position = u_MVPMatrix * v_Position;
v_Color = u_Color;
};
// FRAGMENT SHADER CODE
precision mediump float;
varying vec4 v_Color;
void main() {
gl_FragColor = v_Color;
}
The whole Class:
public class Object3D {
private FloatBuffer vertexBuffer;
// number of coordinates per vertex in this array
static final int COORDS_PER_VERTEX = 3;
// static float triangleCoords[] = { // in counterclockwise order:
// 0.0f, 0.622008459f, 0.0f, // top
// -0.5f, -0.311004243f, 0.0f, // bottom left
// 0.5f, -0.311004243f, 0.0f // bottom right
// };
float[] triangleCoords;
// Set color with red, green, blue and alpha (opacity) values
float[] colors = { 0.63671875f, 0.76953125f, 0.22265625f, 1.0f };
private final String vertexShaderCode =
"attribute vec4 v_Position;" +
"uniform float u_Color" +
"uniform mat4 u_MVPMatrix;" +
// outgoing
"varying vec4 v_Color" +
"void main() {" +
// the matrix must be included as a modifier of gl_Position
// Note that the uMVPMatrix factor *must be first* in order
// for the matrix multiplication product to be correct.
"gl_Position = u_MVPMatrix * v_Position;" +
"v_Color = u_Color;" +
"}";
private final String fragmentShaderCode =
"precision mediump float;" +
"varying vec4 v_Color;" +
"void main() {" +
"gl_FragColor = v_Color;" +
"}";
// Use to access and set the view transformation
private int mMVPMatrixHandle;
private final int mProgram;
private int mPositionHandle;
private int mColorHandle;
private final int vertexCount;
private final int vertexStride;
public Object3D(float[] triangleCoords) {
this.triangleCoords = triangleCoords;
vertexCount = triangleCoords.length / COORDS_PER_VERTEX;
vertexStride = COORDS_PER_VERTEX * 4; // 4 bytes per float coordinate
this.colors = new float[4*vertexCount];
for (int i = 0; i < colors.length; i+=4) {
colors[i] = 0.63671875f;
colors[i+1] = 0.76953125f;
colors[i+2] = 0.22265625f;
colors[i+3] = 1.0f;
}
// initialize vertex byte buffer for shape coordinates
ByteBuffer bb = ByteBuffer.allocateDirect(
// (number of coordinate values * 4 bytes per float)
triangleCoords.length * 4);
// use the device hardware's native byte order
bb.order(ByteOrder.nativeOrder());
// create a floating point buffer from the ByteBuffer
vertexBuffer = bb.asFloatBuffer();
// add the coordinates to the FloatBuffer
vertexBuffer.put(triangleCoords);
// set the buffer to read the first coordinate
vertexBuffer.position(0);
int vertexShader = MyGLRenderer.loadShader(GLES20.GL_VERTEX_SHADER,
vertexShaderCode);
int fragmentShader = MyGLRenderer.loadShader(GLES20.GL_FRAGMENT_SHADER,
fragmentShaderCode);
// create empty OpenGL ES Program
mProgram = GLES20.glCreateProgram();
// add the vertex shader to program
GLES20.glAttachShader(mProgram, vertexShader);
// add the fragment shader to program
GLES20.glAttachShader(mProgram, fragmentShader);
GLES20.glBindAttribLocation(mProgram, 0, "v_Position");
// GLES20.glBindAttribLocation(mProgram, 1, "vColor");
// creates OpenGL ES program executables
GLES20.glLinkProgram(mProgram);
}
public void draw(float[] mvpMatrix) { // pass in the calculated transformation matrix
// Add program to OpenGL environment
GLES20.glUseProgram(mProgram);
// get handle to vertex shader's vPosition member
mPositionHandle = GLES20.glGetAttribLocation(mProgram, "v_Position");
// if (mPositionHandle == -1) {
// throw new RuntimeException(
// "Could not get attrib location for v_Position");
// }
// Enable a handle to the triangle vertices
GLES20.glEnableVertexAttribArray(mPositionHandle);
// Prepare the triangle coordinate data
GLES20.glVertexAttribPointer(mPositionHandle, COORDS_PER_VERTEX,
GLES20.GL_FLOAT, false,
vertexStride, vertexBuffer);
// get handle to fragment shader's vColor member
mColorHandle = GLES20.glGetUniformLocation(mProgram, "v_Color");
// Set color for drawing the triangle
GLES20.glUniform4fv(mColorHandle, 1, colors, 0);
// get handle to shape's transformation matrix
mMVPMatrixHandle = GLES20.glGetUniformLocation(mProgram, "u_MVPMatrix");
// Apply the projection and view transformation
GLES20.glUniformMatrix4fv(mMVPMatrixHandle, 1, false, mvpMatrix, 0);
// // get handle to shape's transformation matrix
// int mColorHandleU = GLES20.glGetUniformLocation(mProgram, "u_Color");
//
// // Apply the projection and view transformation
// GLES20.glUniform4fv(mColorHandleU, 1, new float[] {}, 0);
// Draw the triangle
GLES20.glDrawArrays(GLES20.GL_TRIANGLES, 0, vertexCount);
// Disable vertex array
GLES20.glDisableVertexAttribArray(mPositionHandle);
}
}
If I want to do the necessary color calculations, I need varying variables to pass information from the vertex shader to the fragment shader. However, I cannot seem to get this working.
The error I keep getting is:
2019-05-14 21:54:25.122 8281-8316/com.example.opengles20 E/emuglGLESv2_enc: device/generic/goldfish-opengl/system/GLESv2_enc/GL2Encoder.cpp:s_glEnableVertexAttribArray:892 GL error 0x501
Info: Invalid vertex attribute index. Wanted index: 4294967295. Max index: 16
2019-05-14 21:54:25.123 8281-8316/com.example.opengles20 E/emuglGLESv2_enc: device/generic/goldfish-opengl/system/GLESv2_enc/GL2Encoder.cpp:s_glVertexAttribPointer:604 GL error 0x501
Info: Invalid vertex attribute index. Wanted index: 4294967295. Max index: 16
2019-05-14 21:54:25.124 8281-8316/com.example.opengles20 E/emuglGLESv2_enc: device/generic/goldfish-opengl/system/GLESv2_enc/GL2Encoder.cpp:s_glDisableVertexAttribArray:901 GL error 0x501
Info: Invalid vertex attribute index. Wanted index: 4294967295. Max index: 16
2019-05-14 21:54:25.237 8281-8316/com.example.opengles20 E/emuglGLESv2_enc: device/generic/goldfish-opengl/system/GLESv2_enc/GL2Encoder.cpp:s_glEnableVertexAttribArray:892 GL error 0x501
Info: Invalid vertex attribute index. Wanted index: 4294967295. Max index: 16
2019-05-14 21:54:25.237 8281-8316/com.example.opengles20 E/emuglGLESv2_enc: device/generic/goldfish-opengl/system/GLESv2_enc/GL2Encoder.cpp:s_glVertexAttribPointer:604 GL error 0x501
Info: Invalid vertex attribute index. Wanted index: 4294967295. Max index: 16
2019-05-14 21:54:25.238 8281-8316/com.example.opengles20 E/emuglGLESv2_enc: device/generic/goldfish-opengl/system/GLESv2_enc/GL2Encoder.cpp:s_glDisableVertexAttribArray:901 GL error 0x501
Info: Invalid vertex attribute index. Wanted index: 4294967295. Max index: 16
Also, when I enable the following check, the exception is thrown:
mPositionHandle = GLES20.glGetAttribLocation(mProgram, "v_Position");
if (mPositionHandle == -1) {
throw new RuntimeException(
"Could not get attrib location for v_Position");
}
I know that for attributes, the instruction flow goes like this:
GLES20.glBindAttribLocation(...);
-- Link Shader Program --
attributeHandle = GLES20.glGetAttribLocation(programHandle, "a_AttributenName");
GLES20.glEnableVertexAttribArray(attributeHandle);
GLES20.glVertexAttribPointer(attributeHandle, ..., buffer);
The instruction sequence for uniforms goes like this:
uniformHandle = GLES20.glGetUniformLocation(mProgram, "u_UniformName");
// do something with it, for example:
GLES20.glUniform4fv(uniformHandle, ...);
But what is there to do for varying variables?
Thanks in advance!
private final String vertexShaderCode =
"attribute vec4 v_Position;" +
"uniform float u_Color" +
"uniform mat4 u_MVPMatrix;" +
// outgoing
"varying vec4 v_Color" +
"void main() {" +
You're missing a semicolon after u_Color and also after v_Color. Presumably your vertex shader is not compiling, and that's cascading down into the errors you're seeing.
It's a pain, but it really does save time in the long run to check for errors after every OpenGL ES call (glGetError). Getting detailed shader compile error logs is also fiddly but worth putting in place - see here (glGetShaderInfoLog, GL_COMPILE_STATUS).
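For example, a minimal sketch of those checks using android.util.Log (shader, program, and TAG here stand for whatever shader/program IDs and log tag your loader already uses), run right after compiling each shader and after linking the program:
int[] compiled = new int[1];
GLES20.glGetShaderiv(shader, GLES20.GL_COMPILE_STATUS, compiled, 0);
if (compiled[0] == 0) {
    // Print the driver's compile log, then throw the shader away.
    Log.e(TAG, "Shader compile failed: " + GLES20.glGetShaderInfoLog(shader));
    GLES20.glDeleteShader(shader);
}

int[] linked = new int[1];
GLES20.glGetProgramiv(program, GLES20.GL_LINK_STATUS, linked, 0);
if (linked[0] == 0) {
    Log.e(TAG, "Program link failed: " + GLES20.glGetProgramInfoLog(program));
}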
I'm trying to render a simple 3D scene. It's basically a surface wrapped in a texture. I've tested it on 4 devices and only one (Galaxy Note 4) renders the texture properly. The other three phones (HTC Evo 3D, LG G2, Sony Xperia Z1) render the texture as a single flat color, which seems to be the average color of the texture. E.g.: original image and rendered texture.
My first guess was there's something wrong with my fragment shader. But it's very basic and copied from a book "OpenGL ES 2 for Android":
private final String vertexShaderCode =
"uniform mat4 uMVPMatrix;"+
"attribute vec4 vPosition;"+
"attribute vec2 a_TextureCoordinates;"+
"varying vec2 v_TextureCoordinates;"+
"void main()"+
"{"+
"v_TextureCoordinates = a_TextureCoordinates;"+
"gl_Position = uMVPMatrix * vPosition;"+
"}";
private final String fragmentShaderCode =
"precision mediump float;"+
"uniform sampler2D u_TextureUnit;"+
"varying vec2 v_TextureCoordinates;"+
"void main()"+
"{"+
"gl_FragColor = texture2D(u_TextureUnit, v_TextureCoordinates);"+
"}";
Loading texture:
texCoordHandle = glGetAttribLocation(program, "a_TextureCoordinates");
texUnitHandle = glGetUniformLocation(program, "u_TextureUnit");
texHandle = new int[1];
glGenTextures(1, texHandle, 0);
BitmapFactory.Options opt = new BitmapFactory.Options();
opt.inScaled = false;
Bitmap bitmap = BitmapFactory.decodeResource(context.getResources(),
R.drawable.way_texture_pink_square_512, opt);
glBindTexture(GL_TEXTURE_2D, texHandle[0]);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
texImage2D(GL_TEXTURE_2D, 0, bitmap, 0);
glGenerateMipmap(GL_TEXTURE_2D);
bitmap.recycle();
glBindTexture(GL_TEXTURE_2D, 0);
I hold the vertices and texture coords in different float buffers, and draw them in a loop:
glUseProgram(program);
glUniformMatrix4fv(mvpMatrixHandle, 1, false, vpMatrix, 0);
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, texHandle[0]);
glUniform1i(texUnitHandle, 0);
glEnableVertexAttribArray(positionHandle);
WaySection[] ws = waySects.get(currWaySect);
for( int i = 0 ; i < ws.length ; i++ ) {
glVertexAttribPointer(
positionHandle, COORDS_PER_VERTEX,
GL_FLOAT, false,
vertexStride, ws[i].vertexBuffer);
glVertexAttribPointer(texCoordHandle, COORDS_PER_TEX_VER,
GL_FLOAT, false,
texStride, ws[i].texBuffer);
glDrawArrays(GL_TRIANGLE_FAN, 0,
ws[i].vertices.length / COORDS_PER_VERTEX);
}
glDisableVertexAttribArray(positionHandle);
What I've done:
Changed textures, tried different sizes (power of two and not), different alpha.
Tried switching on/off different options, like: GL_CULL_FACE, GL_BLEND, GL_TEXTURE_MIN(MAG)_FILTER, GL_TEXTURE_WRAP...
Checked for OpenGL errors in every possible place. glGetError() returns no error.
Checked the code in debugger. The textures converted into Bitmap looked fine before and after loading them into OpenGL.
Help, please :)
Problem solved. The erratic behavior was caused by a missing glEnableVertexAttribArray(texCoordHandle); call in onDraw(). The Note 4 was OK with that; it required only positionHandle (the vertices) to be enabled. The other phones apparently weren't too happy. Maybe because the Note 4 supports GL ES 3.1?
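A minimal sketch of the fix in the draw loop above, using the same handle names; the only new lines are the enable (and matching disable) for the texture-coordinate attribute:
glEnableVertexAttribArray(positionHandle);
glEnableVertexAttribArray(texCoordHandle); // this was the missing call
// ... glVertexAttribPointer(...) and glDrawArrays(...) for each section, as before ...
glDisableVertexAttribArray(texCoordHandle);
glDisableVertexAttribArray(positionHandle);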
Thanks for your time.
I'm very new to Android, Eclipse, and OpenGL ES 2.0 (and object-oriented programming in general), but a few days ago I got a program working (based on a few examples I found on the Internet) that shows a cube with each face displaying a different 512 x 512 bitmap image (a PNG loaded and converted once during initialization).
My system is a 32-bit Vista PC, and I've been using Bluestacks as my emulator (version 0.8.11, released June 22, 2014, Android version 4.0.4). I believe its API level is 15. I have no physical Android devices to test my programs on at this time.
I now want to try a technique to use a dynamic texture.
I've looked for complete Android app examples using a dynamic texture (in Java), but I found only one, and when I run it, it doesn't work on my Bluestacks.
Here's the page that has a link to the complete project containing the program:
http://blog.shayanjaved.com/2011/05/13/android-opengl-es-2-0-render-to-texture/
When I run it on my Bluestacks, all I see is a black screen.
One thing I noticed is that the texture size is huge. I changed the size to 512 x 512 (which I know works because my cube program uses 512 x 512 textures), but I still see only a black screen.
(It would be nice if somebody could try to run this program on his/her physical Android device/s and/or Bluestacks and post the result.)
The following is the method in which a dynamic texture is created and prepared for its use:
/**
* Sets up the framebuffer and renderbuffer to render to texture
*/
private void setupRenderToTexture() {
fb = new int[1];
depthRb = new int[1];
renderTex = new int[1];
// generate
GLES20.glGenFramebuffers(1, fb, 0);
GLES20.glGenRenderbuffers(1, depthRb, 0);
GLES20.glGenTextures(1, renderTex, 0);
// generate color texture
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, renderTex[0]);
// parameters
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S,
GLES20.GL_CLAMP_TO_EDGE);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T,
GLES20.GL_CLAMP_TO_EDGE);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER,
GLES20.GL_LINEAR);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER,
GLES20.GL_LINEAR);
// create it
// create an empty intbuffer first?
int[] buf = new int[texW * texH];
texBuffer = ByteBuffer.allocateDirect(buf.length
* FLOAT_SIZE_BYTES).order(ByteOrder.nativeOrder()).asIntBuffer();
GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGB, texW, texH, 0, GLES20.GL_RGB, GLES20.GL_UNSIGNED_SHORT_5_6_5, texBuffer);
// create render buffer and bind 16-bit depth buffer
GLES20.glBindRenderbuffer(GLES20.GL_RENDERBUFFER, depthRb[0]);
GLES20.glRenderbufferStorage(GLES20.GL_RENDERBUFFER, GLES20.GL_DEPTH_COMPONENT16, texW, texH);
/*** PART BELOW SHOULD BE DONE IN onDrawFrame ***/
// bind framebuffer
/*GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fb[0]);
// specify texture as color attachment
GLES20.glFramebufferTexture2D(GLES20.GL_FRAMEBUFFER, GLES20.GL_COLOR_ATTACHMENT0, GLES20.GL_TEXTURE_2D, renderTex[0], 0);
// attach render buffer as depth buffer
GLES20.glFramebufferRenderbuffer(GLES20.GL_FRAMEBUFFER, GLES20.GL_DEPTH_ATTACHMENT, GLES20.GL_RENDERBUFFER, depthRb[0]);
// check status
int status = GLES20.glCheckFramebufferStatus(GLES20.GL_FRAMEBUFFER);*/
}
Does the above piece of code seem sound?
Oh, one more thing:
I've been using the following as my vertex and fragment shaders. They've been working fine for showing a cube with static textures. (Normal data is read in, but I'm not using any lights right now.)
private final String vertexShaderCode =
// This matrix member variable provides a hook to manipulate
// the coordinates of the objects that use this vertex shader
"uniform mat4 uMVPMatrix;" +
"attribute vec4 a_Position;" +
"attribute vec4 a_Color;" +
"attribute vec3 a_Normal;" +
"attribute vec2 a_TexCoordinate;" + // Per-vertex texture coordinate information we will pass in.
"varying vec4 v_Color;" +
"varying vec2 v_TexCoordinate;" + // This will be passed into the fragment shader.
"void main() {" +
// The matrix must be included as a modifier of gl_Position.
// Note that the uMVPMatrix factor *must be first* in order
// for the matrix multiplication product to be correct.
" gl_Position = uMVPMatrix * a_Position;" +
" v_Color = a_Color;" +
" vec3 difuse = a_Normal;" +
" v_TexCoordinate = a_TexCoordinate;" + // pass through
"}";
private final String fragmentShaderCode =
"precision mediump float;" +
"uniform sampler2D u_Texture;" + // The input texture.
"varying vec4 v_Color;" +
"varying vec2 v_TexCoordinate;" + // Interpolated texture coordinate per fragment.
"void main() {" +
" gl_FragColor = (v_Color * texture2D(u_Texture, v_TexCoordinate));" +
"}";
Should they work when I render to a texture? Or do I have to have separate vertex and fragment shaders when I'm rendering to a texture?
Thank you.
The problem is that the result of the FBO copy is filled with whatever pixel is at texture coordinate 0,0 of the source texture.
If I edit the shader to render a gradient based on texture coordinate position, the fragment shader fills the whole result as if it had texture coordinate 0, 0 fed into it.
If I edit the triangle strip vertices, things behave as expected, so I think the camera and geometry are set up right. It's just that the two-triangle quad is all one color when it should reflect either my input texture or at least my position-gradient shader!
I've ported this code nearly line for line from a working iOS example.
This is running alongside Unity3D, so don't assume any GL settings are at their defaults; the engine is likely fiddling with them before my code starts.
Here's the FBO copy operation:
GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, mFrameBuffer);
checkGlError("glBindFramebuffer");
GLES20.glViewport(0, 0, TEXTURE_WIDTH*4, TEXTURE_HEIGHT*4);
checkGlError("glViewport");
GLES20.glDisable(GLES20.GL_BLEND);
GLES20.glDisable(GLES20.GL_DEPTH_TEST);
GLES20.glDepthMask(false);
GLES20.glDisable(GLES20.GL_CULL_FACE);
GLES20.glBindBuffer(GLES20.GL_ARRAY_BUFFER, 0);
GLES20.glBindBuffer(GLES20.GL_ELEMENT_ARRAY_BUFFER, 0);
GLES20.glPolygonOffset(0.0f, 0.0f);
GLES20.glDisable(GLES20.GL_POLYGON_OFFSET_FILL);
checkGlError("fbo setup");
// Load the shaders if we have not done so
if (mProgram <= 0) {
createProgram();
Log.i(TAG, "InitializeTexture created program with ID: " + mProgram);
if (mProgram <= 0)
Log.e(TAG, "Failed to initialize shaders!");
}
// Set up the program
GLES20.glUseProgram(mProgram);
checkGlError("glUseProgram");
GLES20.glUniform1i(mUniforms[UNIFORM_TEXTURE], 0);
checkGlError("glUniform1i");
// clear the scene
GLES20.glClearColor(0.0f,0.0f, 0.1f, 1.0f);
checkGlError("glClearColor");
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
// Bind our source texture
GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
checkGlError("glActiveTexture");
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, mSourceTexture);
checkGlError("glBindTexture");
GLES20.glFrontFace( GLES20.GL_CW );
// Our object to render
ByteBuffer imageVerticesBB = ByteBuffer.allocateDirect(8 * 4);
imageVerticesBB.order(ByteOrder.nativeOrder());
FloatBuffer imageVertices = imageVerticesBB.asFloatBuffer();
imageVertices.put(new float[]{
-1.0f, -1.0f,
1.0f, -1.0f,
-1.0f, 1.0f,
1.0f, 1.0f}
);
imageVertices.position(0);
// The object's texture coordinates
ByteBuffer textureCoordinatesBB = ByteBuffer.allocateDirect(8 * 4);
imageVerticesBB.order(ByteOrder.nativeOrder());
FloatBuffer textureCoordinates = textureCoordinatesBB.asFloatBuffer();
textureCoordinates.put(new float[]{
0.0f, 1.0f,
1.0f, 1.0f,
0.0f, 0.0f,
1.0f, 0.0f}
);
textureCoordinates.position(0);
// Update attribute values.
GLES20.glEnableVertexAttribArray(ATTRIB_VERTEX);
GLES20.glVertexAttribPointer(ATTRIB_VERTEX, 2, GLES20.GL_FLOAT, false, 0, imageVertices);
GLES20.glEnableVertexAttribArray(ATTRIB_TEXTUREPOSITON);
GLES20.glVertexAttribPointer(ATTRIB_TEXTUREPOSITON, 2, GLES20.GL_FLOAT, false, 0, textureCoordinates);
// Draw the quad
GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);
If you want to dive in, I've put up a nice gist with the update loop, setup and shaders here: https://gist.github.com/acgourley/7783624
I'm checking in the result of this as an Android port of UnityFBO (MIT License), so all help is appreciated and will be shared more broadly.
The declarations of your vertex shader output and fragment shader input do not match for the texture coordinate varying (different precision qualifiers). Ordinarily this would not be an issue, but for the reasons discussed below, using highp in your fragment shader may come back to bite you.
Vertex shader:
attribute vec4 position;
attribute mediump vec4 textureCoordinate;
varying mediump vec2 coordinate;
void main()
{
gl_Position = position;
coordinate = textureCoordinate.xy;
}
Fragment shader:
varying highp vec2 coordinate;
uniform sampler2D texture;
void main()
{
gl_FragColor = texture2D(texture, coordinate);
}
In OpenGL ES 2.0, highp is an optional feature in fragment shaders. You should not declare anything highp in a fragment shader unless GL_FRAGMENT_PRECISION_HIGH is defined by the preprocessor.
GLSL ES 1.0 Specification, section 4.5.4 "Available Precision Qualifiers", p. 36:
The built-in macro GL_FRAGMENT_PRECISION_HIGH is defined to one on systems supporting highp precision in the fragment language
#define GL_FRAGMENT_PRECISION_HIGH 1
and is not defined on systems not supporting highp precision in the fragment language. When defined, this macro is available in both the vertex and fragment languages. The highp qualifier is an optional feature in the fragment language and is not enabled by #extension.
The bottom line is that you need to check whether the fragment shader supports highp precision before declaring something highp, or rewrite the declaration in your fragment shader to use mediump. I cannot see much reason for arbitrarily increasing the precision of the vertex shader's coordinates in the fragment shader; I would honestly expect it to be written as highp in both the vertex and fragment shaders, or kept mediump in both.
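As a hedged sketch (written in the Java-string shader style used in the other snippets here, with your coordinate/texture names), the fragment shader could guard the highp declaration on that macro; the preprocessor directives need real newlines, hence the \n:
private final String fragmentShaderCode =
    "#ifdef GL_FRAGMENT_PRECISION_HIGH\n" +
    "varying highp vec2 coordinate;\n" +
    "#else\n" +
    "varying mediump vec2 coordinate;\n" +
    "#endif\n" +
    "uniform sampler2D texture;\n" +
    "void main()\n" +
    "{\n" +
    "    gl_FragColor = texture2D(texture, coordinate);\n" +
    "}\n";
The simpler alternative, as noted above, is to declare the varying mediump in both shaders so the two declarations match.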
I'm working on a simple Pong-type game to get to grips with OpenGL and Android, and I seem to have hit a brick wall in terms of performance.
I've got my game logic on a separate thread, with draw commands sent to the GL thread through a blocking queue. The problem is that I'm stuck at around 40 fps, and nothing I've tried seems to improve the framerate.
To keep things simple I set up OpenGL with:
GLES20.glDisable(GLES20.GL_CULL_FACE);
GLES20.glDisable(GLES20.GL_DEPTH_TEST);
GLES20.glDisable(GLES20.GL_BLEND);
Setup of the OpenGL program and the drawing is handled by the following class:
class GLRectangle {
private final String vertexShaderCode =
"precision lowp float;" +
// This matrix member variable provides a hook to manipulate
// the coordinates of the objects that use this vertex shader
"uniform lowp mat4 uMVPMatrix;" +
"attribute lowp vec4 vPosition;" +
"void main() {" +
// the matrix must be included as a modifier of gl_Position
" gl_Position = vPosition * uMVPMatrix;" +
"}";
private final String fragmentShaderCode =
"precision lowp float;" +
"uniform lowp vec4 vColor;" +
"void main() {" +
" gl_FragColor = vColor;" +
"}";
protected static int mProgram = 0;
private static ShortBuffer drawListBuffer;
private static short drawOrder[] = { 0, 1, 2, 0, 2, 3};//, 4, 5, 6, 4, 6, 7 }; // order to draw vertices
// number of coordinates per vertex in this array
private static final int COORDS_PER_VERTEX = 3;
private static final int vertexStride = COORDS_PER_VERTEX * 4; // bytes per vertex
GLRectangle(){
int vertexShader = GameRenderer.loadShader(GLES20.GL_VERTEX_SHADER, vertexShaderCode);
int fragmentShader = GameRenderer.loadShader(GLES20.GL_FRAGMENT_SHADER, fragmentShaderCode);
mProgram = GLES20.glCreateProgram(); // create empty OpenGL ES Program
GLES20.glAttachShader(mProgram, vertexShader); // add the vertex shader to program
GLES20.glAttachShader(mProgram, fragmentShader); // add the fragment shader to program
GLES20.glLinkProgram(mProgram); // creates OpenGL ES program executables
// initialize byte buffer for the index list
ByteBuffer dlb = ByteBuffer.allocateDirect(
// (# of coordinate values * 2 bytes per short)
drawOrder.length * 2);
dlb.order(ByteOrder.nativeOrder());
drawListBuffer = dlb.asShortBuffer();
drawListBuffer.put(drawOrder);
drawListBuffer.position(0);
}
protected static void Draw(Drawable dObj, float mvpMatrix[])
{
FloatBuffer vertexBuffer = dObj.vertexBuffer;
GLES20.glUseProgram(mProgram);
//GameRenderer.checkGlError("glUseProgram");
// get handle to fragment shader's vColor member
int mColorHandle = GLES20.glGetUniformLocation(mProgram, "vColor");
//GameRenderer.checkGlError("glGetUniformLocation");
// get handle to shape's transformation matrix
int mMVPMatrixHandle = GLES20.glGetUniformLocation(mProgram, "uMVPMatrix");
//GameRenderer.checkGlError("glGetUniformLocation");
// get handle to vertex shader's vPosition member
int mPositionHandle = GLES20.glGetAttribLocation(mProgram, "vPosition");
//GameRenderer.checkGlError("glGetAttribLocation");
// Apply the projection and view transformation
GLES20.glUniformMatrix4fv(mMVPMatrixHandle, 1, false, mvpMatrix, 0);
//GameRenderer.checkGlError("glUniformMatrix4fv");
// Set color for drawing the quad
GLES20.glUniform4fv(mColorHandle, 1, dObj.color, 0);
//GameRenderer.checkGlError("glUniform4fv");
// Enable a handle to the square vertices
GLES20.glEnableVertexAttribArray(mPositionHandle);
//GameRenderer.checkGlError("glEnableVertexAttribArray");
// Prepare the square coordinate data
GLES20.glVertexAttribPointer(mPositionHandle, COORDS_PER_VERTEX,
GLES20.GL_FLOAT, false,
vertexStride, vertexBuffer);
//GameRenderer.checkGlError("glVertexAttribPointer");
// Draw the square
GLES20.glDrawElements(GLES20.GL_TRIANGLES, drawOrder.length,
GLES20.GL_UNSIGNED_SHORT, drawListBuffer);
//GameRenderer.checkGlError("glDrawElements");
// Disable vertex array
GLES20.glDisableVertexAttribArray(mPositionHandle);
//GameRenderer.checkGlError("glDisableVertexAttribArray");
}
}
I've done plenty of profiling and googling, but can't find anything to make this run faster. I've included a screenshot of the DDMS output:
To me, it looks like glClear is causing GLThread to sleep for 23 ms... though I doubt that's really the case.
I have absolutely no idea how I can make this more efficient; there's nothing fancy going on. In my quest for better rendering performance I have switched to the multi-threaded approach I described, turned off alpha blending and depth testing, changed to a batched drawing approach (not applicable for this simple example), and switched everything to lowp in the shaders.
Any assistance with getting this to 60fps would be greatly appreciated!
Bruce
Edit: Well, talk about overthinking a problem. It turns out that I've had power-saving mode switched on for the past week... it seems to lock rendering to 40 fps.
This behavior occurs when Power Saving mode is switched on, on a Galaxy S3.
It appears the power-saving mode locks the framerate to 40 fps. Switching it off easily achieved the desired 60 fps.
Not sure if you have control over the EGL setup for the device surface, but if you do, there is a way to make updates run in "non-sync" mode.
egl.eglSwapInterval( dpy, 0 )
This isn't available on all devices but allows some control of your rendering.
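For example, a hedged sketch using the EGL14 bindings (API 17 and above), called on the GL thread while the EGL context is current, e.g. from onSurfaceCreated; not every driver honors an interval of 0:
import android.opengl.EGL14;

// Request buffer swaps without waiting for vsync; pass 1 to restore the default behavior.
boolean ok = EGL14.eglSwapInterval(EGL14.eglGetCurrentDisplay(), 0);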