I am new to OpenGL ES on Android and am learning by working through some examples. I am using two programs to draw three objects. The following code loads a texture and draws a square, but the square appears black instead of having the texture applied.
My fragment shader:
precision mediump float;
uniform sampler2D u_Texture;
varying vec2 v_TexCoordinate;
void main() {
gl_FragColor = texture2D(u_Texture, v_TexCoordinate);
}
My vertex shader:
attribute vec2 a_TexCoordinate;
varying vec2 v_TexCoordinate;
attribute vec4 a_Position;
uniform mat4 u_Matrix;
void main() {
gl_Position = u_Matrix * a_Position;
v_TexCoordinate = a_TexCoordinate;
}
My object vertex buffer:
float [] vBufferFloat = new float[] {
-0.2f, -0.2f, 1f,
0.2f, -0.2f, 1f,
0.2f, 0.2f, 1f,
-0.2f, 0.2f, 1f,
-0.2f, -0.2f, 1f,
};
My texture coordinate buffer:
float [] texCoordinate = new float[] {
-0.2f, -0.2f,
0.2f, -0.2f,
0.2f, 0.2f,
-0.2f, 0.2f,
-0.2f, -0.2f,
};
My onSurfaceCreated and onDrawFrame code:
public void onSurfaceCreated() {
cloudRendereProgram = ShaderHelper.createProgram(mContext, R.raw.sky_texture_vertex_shader, R.raw.sky_texture_fragment_shader);
cloudTextureId = Utils.loadTexture(mContext, com.elpis.gamecontroller.R.drawable.cloud);
aTextureLocation = GLES20.glGetAttribLocation(cloudRendereProgram, "a_TexCoordinate");
uMatrixLocation = GLES20.glGetUniformLocation(cloudRendereProgram, "u_Matrix");
aPositionLocation = GLES20.glGetAttribLocation(cloudRendereProgram, "a_Position");
uTextureLocation = GLES20.glGetUniformLocation(cloudRendereProgram, "u_Texture");
}
public void onDrawFrame() {
float [] mVMatrix = new float[16];
GLES20.glUseProgram(cloudRendereProgram);
GLES20.glVertexAttribPointer(aPositionLocation, 3, GLES20.GL_FLOAT, false, 0, vBuff.buffer);
GLES20.glEnableVertexAttribArray(aPositionLocation);
Matrix.multiplyMM(mVMatrix, 0, modelMatrix, 0, projectionMatrix, 0);
GLES20.glUniformMatrix4fv(uMatrixLocation, 1, false, mVMatrix, 0);
GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, this.cloudTextureId);
GLES20.glUniform1i(uTextureLocation, 0);
GLES20.glVertexAttribPointer(aTextureLocation, 2, GLES20.GL_FLOAT, false, 0, texBuff.buffer);
GLES20.glEnableVertexAttribArray(aTextureLocation);
GLES20.glDrawArrays(GLES20.GL_TRIANGLE_FAN, 0, 5);
}
And my texture loader helper:
public static int loadTexture(Context ctx, int resId) {
final int [] textureHandle = new int[1];
GLES20.glGenTextures(1, textureHandle, 0);
if (textureHandle[0] == 0)
return 0;
final BitmapFactory.Options options = new Options();
options.inScaled = false;
final Bitmap imgTexture = BitmapFactory.decodeResource(ctx.getResources(), resId);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureHandle[0]);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_NEAREST);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_NEAREST);
GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, imgTexture, 0);
imgTexture.recycle();
return textureHandle[0];
}
When I run the application, all I see is a black square instead of the cloud texture, so I would appreciate it if anyone could point me in the right direction.
Also, a quick question: is it valid to create multiple OpenGL program objects with different shaders and use them in the same frame?
[UPDATE]
The problem was in onDrawFrame(). I had to call vBuff.buffer.position(0) and texBuff.buffer.position(0) before the glVertexAttribPointer calls in order to draw the texture correctly.
public void onDrawFrame() {
float [] mVMatrix = new float[16];
GLES20.glUseProgram(cloudRendereProgram);
// FIX
vBuff.buffer.position(0);
texBuff.buffer.position(0);
// END FIX
GLES20.glVertexAttribPointer(aPositionLocation, 3, GLES20.GL_FLOAT, false, 0, vBuff.buffer);
GLES20.glEnableVertexAttribArray(aPositionLocation);
Matrix.multiplyMM(mVMatrix, 0, modelMatrix, 0, projectionMatrix, 0);
GLES20.glUniformMatrix4fv(uMatrixLocation, 1, false, mVMatrix, 0);
GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, this.cloudTextureId);
GLES20.glUniform1i(uTextureLocation, 0);
GLES20.glVertexAttribPointer(aTextureLocation, 2, GLES20.GL_FLOAT, false, 0, texBuff.buffer);
GLES20.glEnableVertexAttribArray(aTextureLocation);
GLES20.glDrawArrays(GLES20.GL_TRIANGLE_FAN, 0, 5);
}
You're not calling glEnableVertexAttribArray for your texture coords. Add this line to your onDrawFrame():
GLES20.glEnableVertexAttribArray(aTextureLocation);
Call glGenerateMipmap after you upload the texture with texImage2D() in your loadTexture() function, to make sure all mipmap levels are valid:
glGenerateMipmap(GLES20.GL_TEXTURE_2D);
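As a rough sketch of where that call could sit in the posted loadTexture() (mipmap levels are only actually sampled if a mipmapped minification filter is selected, so the filter line below is optional and not part of the original code):
// Upload level 0, then let the driver build the remaining mipmap levels.
GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, imgTexture, 0);
GLES20.glGenerateMipmap(GLES20.GL_TEXTURE_2D);
// Optional: switch to a mipmapped minification filter so the generated levels are used.
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR_MIPMAP_LINEAR);
imgTexture.recycle();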
Also, move these calls from your onSurfaceCreated() function to the start of onDrawFrame():
aTextureLocation = GLES20.glGetAttribLocation(cloudRendereProgram, "a_TexCoordinate");
uMatrixLocation = GLES20.glGetUniformLocation(cloudRendereProgram, "u_Matrix");
aPositionLocation = GLES20.glGetAttribLocation(cloudRendereProgram, "a_Position");
uTextureLocation = GLES20.glGetUniformLocation(cloudRendereProgram, "u_Texture");
(It could be that these variables are not bound properly, or that the GL context is not yet fully set up when onSurfaceCreated() runs.)
A debugging tip for OpenGL ES: add this function to your code (it's from the Android OpenGL ES samples):
public static void checkGlError(String glOperation) {
int error;
while ((error = GLES20.glGetError()) != GLES20.GL_NO_ERROR) {
Log.e(TAG, glOperation + ": glError " + error);
throw new RuntimeException(glOperation + ": glError " + error);
}
}
You can call it after each OpenGL ES call and pass in a String with whatever debugging message you want. If anything goes wrong, you'll get an exception instead of a silent failure that leaves you scratching your head trying to figure out what went wrong. Make sure to remove these checks from your release build to avoid force closes.
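For example, in the onDrawFrame() from the question it might be used like this (the message strings are arbitrary labels):
GLES20.glUseProgram(cloudRendereProgram);
checkGlError("glUseProgram");
GLES20.glVertexAttribPointer(aPositionLocation, 3, GLES20.GL_FLOAT, false, 0, vBuff.buffer);
checkGlError("glVertexAttribPointer a_Position");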
Related
I have this code that should move my circle (drawn on a square):
float[] scratch = new float[16];
float[] move = new float[16];
Matrix.setIdentityM(move, 0);
Matrix.translateM(move, 0, 100, 100, 0);
Matrix.multiplyMM(scratch, 0, projectionMatrix, 0, move, 0);
mCircle.draw(scratch);
projectionMatrix is the camera:
Matrix.orthoM(projectionMatrix, 0, 0, width, height, 0, -1f, 1f);
But when I execute the code I get this (screenshot of the incorrect result omitted).
I followed the code from the Android Developer guide. My fragment and vertex shaders:
precision highp float;
uniform float uRadius;
vec2 center = vec2(uRadius, uRadius);
vec2 coord = vec2(gl_FragCoord.x, 1080. - gl_FragCoord.y);
vec2 position = coord - center;
uniform vec4 uColor;
void main()
{
if (length(position) > uRadius) {
discard;
}
gl_FragColor = uColor;
}
--------------------------------
uniform mat4 uMatrix;
attribute vec4 aPosition;
void main()
{
gl_Position = uMatrix * aPosition;
}
My main loop:
public void onDrawFrame(GL10 unused) {
// Draw background color
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
float[] scratch = new float[16];
float[] move = new float[16];
Matrix.setIdentityM(move, 0);
Matrix.translateM(move, 0, 50, 50, 0);
Matrix.multiplyMM(scratch, 0, projectionMatrix, 0, move, 0);
mCircle.draw(scratch);
}
And the circle's draw function is:
public void draw(float[] projectionMatrix) {
GLES20.glUseProgram(mProgram);
// get handle to vertex shader's vPosition member
int mPositionHandle = GLES20.glGetAttribLocation(mProgram, "aPosition");
// Enable a handle to the triangle vertices
GLES20.glEnableVertexAttribArray(mPositionHandle);
// Prepare the triangle coordinate data
GLES20.glVertexAttribPointer(
mPositionHandle, COORDS_PER_VERTEX,
GLES20.GL_FLOAT, false,
vertexStride, vertexBuffer);
// get handle to fragment shader's vColor member
int mColorHandle = GLES20.glGetUniformLocation(mProgram, "uColor");
// Set color for drawing the triangle
GLES20.glUniform4fv(mColorHandle, 1, mColor, 0);
int radiusHandle = GLES20.glGetUniformLocation(mProgram, "uRadius");
MyGLRenderer.checkGlError("glGetUniformLocation");
GLES20.glUniform1f(radiusHandle, mRadius);
// get handle to shape's transformation matrix
int mMVPMatrixHandle = GLES20.glGetUniformLocation(mProgram, "uMatrix");
MyGLRenderer.checkGlError("glGetUniformLocation");
// Apply the projection and view transformation
GLES20.glUniformMatrix4fv(mMVPMatrixHandle, 1, false, projectionMatrix, 0);
MyGLRenderer.checkGlError("glUniformMatrix4fv");
// Draw the square
GLES20.glDrawElements(
GLES20.GL_TRIANGLES, drawOrder.length,
GLES20.GL_UNSIGNED_SHORT, drawListBuffer);
// Disable vertex array
GLES20.glDisableVertexAttribArray(mPositionHandle);
}
After some hours I finally found the logical error in the fragment shader: the center of the circle always stayed the same. I added two more uniforms for the offset/position, and now it works.
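For anyone hitting the same issue, here is a minimal sketch of that idea; the uCenter uniform name is hypothetical (the original fix used two separate offset uniforms), and 1080.0 is assumed to be the surface height as in the original shader:
precision highp float;
uniform float uRadius;
uniform vec2 uCenter; // hypothetical: the circle's center in window coordinates, updated each frame
uniform vec4 uColor;
void main()
{
    vec2 coord = vec2(gl_FragCoord.x, 1080.0 - gl_FragCoord.y);
    if (length(coord - uCenter) > uRadius) {
        discard;
    }
    gl_FragColor = uColor;
}
On the Java side the center would then be uploaded each frame with something like GLES20.glUniform2f(GLES20.glGetUniformLocation(mProgram, "uCenter"), cx, cy), where cx and cy match the translation applied in onDrawFrame.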
So I'm trying to learn OpenGL ES 2.0 and create a textured rectangle. Apparently I didn't follow all the instructions, and now I've ended up with just an oddly colored square.
Here are my shaders.
final String vertexShader =
"uniform mat4 u_MVPMatrix; \n" // A constant representing the combined model/view/projection matrix.
+ "attribute vec2 a_TexCoordinate;\n" // Per-vertex texture coordinate information we will pass in.
+ "attribute vec4 a_Position; \n" // Per-vertex position information we will pass in.
// + "attribute vec4 a_Color; \n" // Per-vertex color information we will pass in.
// + "varying vec4 v_Color; \n" // This will be passed into the fragment shader.
+ "varying vec2 v_TexCoordinate; \n" // This will be passed into the fragment shader.
+ "void main() \n" // The entry point for our vertex shader.
+ "{ \n"
+ " v_TexCoordinate = a_TexCoordinate;\n" // Pass the texture coordinate through to the fragment shader.
// It will be interpolated across the triangle.
+ " gl_Position = u_MVPMatrix \n" // gl_Position is a special variable used to store the final position.
+ " * a_Position; \n" // Multiply the vertex by the matrix to get the final point in
+ "} \n";
final String fragmentShader =
"precision mediump float; \n" // Set the default precision to medium. We don't need as high of a
// precision in the fragment shader.
+ "uniform sampler2D u_Texture; \n" // The input texture.
+ "uniform vec4 u_Color; \n" // This is the color from the vertex shader interpolated across the
// triangle per fragment.
+ "varying vec2 v_TexCoordinate; \n" // Interpolated texture coordinate per fragment.
+ "void main() \n" // The entry point for our fragment shader.
+ "{ \n"
+ " gl_FragColor = texture2D(u_Texture, v_TexCoordinate); \n"
+ "} \n";
Here is the loadTexture function:
public static int loadTexture(final Context context, final int resourceId)
{
final int[] textureHandle = new int[1];
GLES20.glGenTextures(1, textureHandle, 0);
if (textureHandle[0] != 0)
{
final BitmapFactory.Options options = new BitmapFactory.Options();
options.inScaled = false; // No pre-scaling
// Read in the resource
final Bitmap bitmap = BitmapFactory.decodeResource(context.getResources(), resourceId, options);
// Bind to the texture in OpenGL
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureHandle[0]);
// Set filtering
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_NEAREST);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_NEAREST);
// Load the bitmap into the bound texture.
GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, bitmap, 0);
// Recycle the bitmap, since its data has been loaded into OpenGL.
bitmap.recycle();
}
if (textureHandle[0] == 0)
{
throw new RuntimeException("Error loading texture.");
}
return textureHandle[0];
}
And here is onDrawFrame:
@Override
public void onDrawFrame(GL10 gl) {
// Set the background clear color to gray.
GLES20.glClear(GLES20.GL_DEPTH_BUFFER_BIT | GLES20.GL_COLOR_BUFFER_BIT);
GLES20.glClearColor(0.5f, 0.5f, 0.5f, 0.5f);
mTextureUniformHandle = GLES20.glGetUniformLocation(programHandle, "u_Texture");
mTextureCoordinateHandle = GLES20.glGetAttribLocation(programHandle, "a_TexCoordinate");
// Set the active texture unit to texture unit 0.
GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
// Bind the texture to this unit.
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, mTextureDataHandle);
// Tell the texture uniform sampler to use this texture in the shader by binding to texture unit 0.
GLES20.glUniform1i(mTextureUniformHandle, 0);
// Draw the triangle facing straight on.
Matrix.setIdentityM(mModelMatrix, 0);
Matrix.translateM(mModelMatrix, 0, Fields.screen_width / 2, Fields.screen_height / 2, 0);
Matrix.scaleM(mModelMatrix, 0, Fields.screen_width / 4, Fields.screen_width / 4, 0);
drawTriangle(mTriangle1Vertices);
}
Any advice for getting rid of the weird brown square and actually displaying the texture?
Edit: Here are the texture coordinates:
public GL20Renderer(Context context) {
final float[] triangle1VerticesData = {
// X, Y, Z,
// R, G, B, A
-1f, 1f, 0.0f,
-1, -1, 0.0f,
1f, 1f, 0.0f,
-1f, -1f, 0.0f,
1f, -1f, 0.0f,
1f, 1f, 0.0f,
};
// Initialize the buffers.
mTriangle1Vertices = ByteBuffer.allocateDirect(triangle1VerticesData.length * mBytesPerFloat)
.order(ByteOrder.nativeOrder()).asFloatBuffer();
mTriangle1Vertices.put(triangle1VerticesData).position(0);
final float[] triangle1TextureCoordinateData =
{
0.0f, 0.0f,
0.0f, 1.0f,
1.0f, 0.0f,
0.0f, 1.0f,
1.0f, 1.0f,
1.0f, 0.0f
};
mTriangleTextureCoordinates = ByteBuffer.allocateDirect(triangle1TextureCoordinateData.length * mBytesPerFloat)
.order(ByteOrder.nativeOrder()).asFloatBuffer();
mTriangleTextureCoordinates.put(triangle1TextureCoordinateData).position(0);
this.context = context;
}
And here is the drawTriangle function:
private void drawTriangle(final FloatBuffer aTriangleBuffer)
{
// Pass in the position information
aTriangleBuffer.position(mPositionOffset);
GLES20.glVertexAttribPointer(mPositionHandle, mPositionDataSize, GLES20.GL_FLOAT, false,
mStrideBytes, aTriangleBuffer);
GLES20.glEnableVertexAttribArray(mPositionHandle);
int colorHandle = GLES20.glGetUniformLocation(programHandle, "u_Color");
// Set color for drawing the triangle
GLES20.glUniform4f(colorHandle, 0.0f, 0.8f, 1.0f, 1.0f);
// This multiplies the view matrix by the model matrix, and stores the result in the MVP matrix
// (which currently contains model * view).
Matrix.multiplyMM(mMVPMatrix, 0, mViewMatrix, 0, mModelMatrix, 0);
// This multiplies the modelview matrix by the projection matrix, and stores the result in the MVP matrix
// (which now contains model * view * projection).
Matrix.multiplyMM(mMVPMatrix, 0, mProjectionMatrix, 0, mMVPMatrix, 0);
GLES20.glUniformMatrix4fv(mMVPMatrixHandle, 1, false, mMVPMatrix, 0);
GLES20.glDrawArrays(GLES20.GL_TRIANGLES, 0, 6);
}
Your drawTriangle code doesn't set the texture coordinates anywhere. Add something like this:
// Use the FloatBuffer that holds the texture coordinates (triangle1TextureCoordinateData is the raw float[]).
mTriangleTextureCoordinates.position(mPositionOffset * 2 / 3);
GLES20.glVertexAttribPointer(mTextureCoordinateHandle, 2, GLES20.GL_FLOAT, false,
0, mTriangleTextureCoordinates);
GLES20.glEnableVertexAttribArray(mTextureCoordinateHandle);
Note that you can set the stride to zero if your data is tightly packed in the buffer.
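For instance, the question's separate texture-coordinate buffer holds only two floats per vertex, so it is tightly packed and stride 0 works; an interleaved x, y, z, s, t buffer would need the full per-vertex byte stride (the interleaved variant below is purely illustrative):
// Tightly packed texcoord-only buffer: stride 0 (or 2 * 4 bytes, which is equivalent).
GLES20.glVertexAttribPointer(mTextureCoordinateHandle, 2, GLES20.GL_FLOAT, false, 0, mTriangleTextureCoordinates);
// Interleaved x,y,z,s,t buffer: stride = 5 floats * 4 bytes per vertex (illustrative only).
// GLES20.glVertexAttribPointer(mTextureCoordinateHandle, 2, GLES20.GL_FLOAT, false, 5 * 4, interleavedBuffer);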
In brief: when I feed one fragment shader two textures with two different sets of texture coordinates, I see only the first texture. But when I use one set of texture coordinates for both textures, it works fine and I can see both textures.
I work on photo filters and use OpenGL ES 2.0 to implement them. Some filters use an additional texture: the first texture is a photo and the second is a tracery (an overlay pattern).
Here is my vertex shader:
attribute vec4 position;
attribute vec4 inputTextureCoordinate;
attribute vec4 inputTextureCoordinate2;
varying vec2 textureCoordinate;
varying vec2 textureCoordinate2;
void main() {
gl_Position = position;
textureCoordinate = inputTextureCoordinate.xy;
textureCoordinate2 = inputTextureCoordinate2.xy;
}
Here is my fragment shader
precision mediump float;
uniform sampler2D inputImageTexture1;
uniform sampler2D inputImageTexture2;
varying vec2 textureCoordinate;
varying vec2 textureCoordinate2;
void main() {
mediump vec4 color1 = texture2D(inputImageTexture1, textureCoordinate);
mediump vec4 color2 = texture2D(inputImageTexture2, textureCoordinate2);
mediump vec3 colorResult = mix(color1.rgb, color2.rgb, 0.5);
gl_FragColor = vec4(colorResult, 1.0);
}
In my code I use a GLSurfaceView.Renderer implementation.
Initialization of coordinates:
static final float CUBE[] = {-1.0f, 1.0f, 1.0f, 1.0f, -1.0f, -1.0f, 1.0f, -1.0f,};
public static final float COORDINATES1[] = {0.0f, 1.0f, 1.0f, 1.0f, 0.0f, 0.0f, 1.0f, 0.0f,};
public static final float COORDINATES2[] = {0.0f, 0.0f, 1.0f, 0.0f, 0.0f, 1.0f, 1.0f, 1.0f,};
...
mGLCubeBuffer = ByteBuffer.allocateDirect(CUBE.length * 4).order(ByteOrder.nativeOrder()).asFloatBuffer();
mGLCubeBuffer.put(CUBE).position(0);
mGLTextureCoordinates1 = ByteBuffer.allocateDirect(COORDINATES1.length * 4).order(ByteOrder.nativeOrder())
.asFloatBuffer();
mGLTextureCoordinates1.clear();
mGLTextureCoordinates1.put(COORDINATES1).position(0);
mGLTextureCoordinates2 = ByteBuffer.allocateDirect(COORDINATES2.length * 4).order(ByteOrder.nativeOrder())
.asFloatBuffer();
mGLTextureCoordinates2.clear();
mGLTextureCoordinates1.put(COORDINATES2).position(0);
The onSurfaceCreated method:
@Override
public void onSurfaceCreated(GL10 gl, EGLConfig config) {
GLES20.glClearColor(0, 0, 0, 1);
GLES20.glDisable(GLES20.GL_DEPTH_TEST);
GLES20.glDisable(GLES20.GL_DEPTH_BITS);
String vertexShader = RawResourceReader.readTextFileFromRawResource(mContext, R.raw.test_vertex);
String fragmentShader = RawResourceReader.readTextFileFromRawResource(mContext, R.raw.test_fragment);
mGLProgId = loadProgram(vertexShader, fragmentShader);
mGLAttribPosition = GLES20.glGetAttribLocation(mGLProgId, "position");
mGLAttribTextureCoordinate = GLES20.glGetAttribLocation(mGLProgId, "inputTextureCoordinate");
mGLAttribTextureCoordinate2 = GLES20.glGetAttribLocation(mGLProgId, "inputTextureCoordinate2");
mGLUniformTexture1 = GLES20.glGetUniformLocation(mGLProgId, "inputImageTexture1");
mGLUniformTexture2 = GLES20.glGetUniformLocation(mGLProgId, "inputImageTexture2");
mTexture1 = loadTexture(mContext, R.drawable.photo);
mTexture2 = loadTexture(mContext, R.drawable.formula1);
}
The onDrawFrame method:
@Override
public void onDrawFrame(GL10 gl) {
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
GLES20.glUseProgram(mGLProgId);
mGLCubeBuffer.position(0);
GLES20.glVertexAttribPointer(mGLAttribPosition, 2, GLES20.GL_FLOAT, false, 0, mGLCubeBuffer);
GLES20.glEnableVertexAttribArray(mGLAttribPosition);
//set first coordinates
mGLTextureCoordinates1.position(0);
GLES20.glVertexAttribPointer(mGLAttribTextureCoordinate, 2, GLES20.GL_FLOAT, false, 0, mGLTextureCoordinates1);
GLES20.glEnableVertexAttribArray(mGLAttribTextureCoordinate);
//set second coordinates
mGLTextureCoordinates2.position(0);
GLES20.glVertexAttribPointer(mGLAttribTextureCoordinate2, 2, GLES20.GL_FLOAT, false, 0, mGLTextureCoordinates2);
GLES20.glEnableVertexAttribArray(mGLAttribTextureCoordinate2);
GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, mTexture1);
GLES20.glUniform1i(mGLUniformTexture1, 0);
GLES20.glActiveTexture(GLES20.GL_TEXTURE1);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, mTexture2);
GLES20.glUniform1i(mGLUniformTexture2, 1);
GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);
GLES20.glDisableVertexAttribArray(mGLAttribPosition);
GLES20.glDisableVertexAttribArray(mGLAttribTextureCoordinate);
GLES20.glDisableVertexAttribArray(mGLAttribTextureCoordinate2);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, 0);
}
The significant part of the loadTexture method:
GLES20.glGenTextures(1, textureHandle, 0);
// Bind to the texture in OpenGL
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureHandle[0]);
// Set filtering
GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
GLES20.glBlendFunc(GLES20.GL_SRC_ALPHA, GLES20.GL_ONE_MINUS_CONSTANT_ALPHA);
GLES20.glEnable(GLES20.GL_BLEND);
// Load the bitmap into the bound texture.
GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, bitmap, 0);
// Recycle the bitmap, since its data has been loaded into OpenGL.
bitmap.recycle();
Note that it works fine on iOS, but a library is used there. I tried the jp.co.cyberagent.android.gpuimage library, but it has a few bugs and doesn't handle this case properly.
I want to know how to solve this problem. Maybe it's some setting I don't know about, or something else. I'm new to OpenGL and hope for your help.
You can't use GLUtils.texImage2D() to load alpha textures on Android. This is a common problem that Google really should document better. The problem is that the Bitmap class converts all images into pre-multiplied format, but that does not work with OpenGL ES unless the images are completely opaque. The best solution is to use native code. This article gives more detail on this:
http://software.intel.com/en-us/articles/porting-opengl-games-to-android-on-intel-atom-processors-part-1
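If native code is not an option, one possible workaround (a sketch only, assuming API level 19 or higher; behaviour should be verified on your target devices) is to ask BitmapFactory not to premultiply the pixels before uploading:
// Sketch: decode without premultiplied alpha (requires API 19+), then upload as usual.
final BitmapFactory.Options options = new BitmapFactory.Options();
options.inScaled = false;
options.inPremultiplied = false; // keep straight (non-premultiplied) alpha
final Bitmap bitmap = BitmapFactory.decodeResource(context.getResources(), resourceId, options);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureHandle[0]);
GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, bitmap, 0);
bitmap.recycle();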
Edit: code added, please see below.
Edit 2: screenshots from the device included at the bottom, along with an explanation.
Edit 3: new code added.
I have two classes, a renderer and a custom 'quad' class.
I have these declared at class level in my renderer class:
final float[] mMVPMatrix = new float[16];
final float[] mProjMatrix = new float[16];
final float[] mVMatrix = new float[16];
And in my onSurfaceChanged method I have:
@Override
public void onSurfaceChanged(GL10 gl, int width, int height) {
GLES20.glViewport(0, 0, width, height);
float ratio = (float) width / height;
Matrix.frustumM(mProjMatrix, 0, -ratio, ratio, -1, 1, 3, 7);
}
and....
public void onSurfaceCreated(GL10 gl, EGLConfig config) {
// TODO Auto-generated method stub
myBitmap = BitmapFactory.decodeResource(curView.getResources(), R.drawable.box);
//Create new Dot objects
dot1 = new Quad();
dot1.setTexture(curView, myBitmap);
dot1.setSize(300,187); //These numbers are the size but are redundant/not used at the moment.
myBitmap.recycle();
//Set colour to black
GLES20.glClearColor(0, 0, 0, 1);
}
And finally from this class, onDrawFrame:
@Override
public void onDrawFrame(GL10 gl) {
// TODO Auto-generated method stub
//Paint the screen the colour defined in onSurfaceCreated
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
// Set the camera position (View matrix) so looking from the front
Matrix.setLookAtM(mVMatrix, 0, 0, 0, 3, 0f, 0f, 0f, 0f, 1.0f, 0.0f);
// Combine
Matrix.multiplyMM(mMVPMatrix, 0, mProjMatrix, 0, mVMatrix, 0);
dot1.rotateQuad(0,0,45, mMVPMatrix); //x,y,angle and matrix passed in
}
Then, in my quad class:
This is declared at class level:
private float[] mRotationMatrix = new float[16];
private final float[] mMVPMatrix = new float[16];
private final float[] mProjMatrix = new float[16];
private final float[] mVMatrix = new float[16];
private int mMVPMatrixHandle;
private int mPositionHandle;
private int mRotationHandle;
//Create our vertex shader
String strVShader =
"uniform mat4 uMVPMatrix;" +
"uniform mat4 uRotate;" +
"attribute vec4 a_position;\n"+
"attribute vec2 a_texCoords;" +
"varying vec2 v_texCoords;" +
"void main()\n" +
"{\n" +
// "gl_Position = a_position * uRotate;\n"+
// "gl_Position = uRotate * a_position;\n"+
"gl_Position = a_position * uMVPMatrix;\n"+
// "gl_Position = uMVPMatrix * a_position;\n"+
"v_texCoords = a_texCoords;" +
"}";
//Fragment shader
String strFShader =
"precision mediump float;" +
"varying vec2 v_texCoords;" +
"uniform sampler2D u_baseMap;" +
"void main()" +
"{" +
"gl_FragColor = texture2D(u_baseMap, v_texCoords);" +
"}";
Then the method for setting the texture (I don't think this is relevant to this problem, though!):
public void setTexture(GLSurfaceView view, Bitmap imgTexture){
this.imgTexture=imgTexture;
iProgId = Utils.LoadProgram(strVShader, strFShader);
iBaseMap = GLES20.glGetUniformLocation(iProgId, "u_baseMap");
iPosition = GLES20.glGetAttribLocation(iProgId, "a_position");
iTexCoords = GLES20.glGetAttribLocation(iProgId, "a_texCoords");
texID = Utils.LoadTexture(view, imgTexture);
}
And finally, my 'rotateQuad' method (which is currently supposed to draw and rotate the quad):
public void rotateQuad(float x, float y, int angle, float[] mvpMatrix){
Matrix.setRotateM(mRotationMatrix, 0, angle, 0, 0, 0.1f);
// Matrix.translateM(mRotationMatrix, 0, 0, 0, 0); //Removed temporarily
// Combine the rotation matrix with the projection and camera view
Matrix.multiplyMM(mvpMatrix, 0, mRotationMatrix, 0, mvpMatrix, 0);
float[] vertices = {
-.5f,.5f,0, 0,0,
.5f,.5f,0, 1,0,
-.5f,-.5f,0, 0,1,
.5f,-.5f,0, 1,1
};
vertexBuf = ByteBuffer.allocateDirect(vertices.length * 4).order(ByteOrder.nativeOrder()).asFloatBuffer();
vertexBuf.put(vertices).position(0);
//Bind the correct texture
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, texID);
//Use program
GLES20.glUseProgram(iProgId);
// get handle to shape's transformation matrix
mMVPMatrixHandle = GLES20.glGetUniformLocation(iProgId, "uMVPMatrix");
// Apply the projection and view transformation
GLES20.glUniformMatrix4fv(mMVPMatrixHandle, 1, false, mvpMatrix, 0);
// get handle to shape's rotation matrix
mRotationHandle = GLES20.glGetUniformLocation(iProgId, "uRotate");
// Apply the projection and view transformation
GLES20.glUniformMatrix4fv(mRotationHandle, 1, false, mRotationMatrix, 0);
//Set starting position for vertices
vertexBuf.position(0);
//Specify attributes for vertex
GLES20.glVertexAttribPointer(iPosition, 3, GLES20.GL_FLOAT, false, 5 * 4, vertexBuf);
//Enable attribute for position
GLES20.glEnableVertexAttribArray(iPosition);
//Set starting position for texture
vertexBuf.position(3);
//Specify attributes for vertex
GLES20.glVertexAttribPointer(iTexCoords, 2, GLES20.GL_FLOAT, false, 5 * 4, vertexBuf);
//Enable attribute for texture
GLES20.glEnableVertexAttribArray(iTexCoords);
//Draw it
GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);
}
For Edit 2:
This is my quad drawn in the center of the screen, with no rotation.
This is the same quad rotated by +45 degrees with gl_Position = a_position * uMVPMatrix; in my vertex shader (it's from a different project now, so the shader variable is a_position rather than vPosition); it looks correct!
However, this is the same quad rotated by +45 degrees with the two operands switched, so the line reads gl_Position = uMVPMatrix * a_position; and, as you can see, it's not quite right.
Also, as a side note: you can't see it here because the square is symmetrical, but each version also rotates in the opposite direction to the other.
Any help appreciated.
It's really impossible to tell because we don't know what you are passing to these two variables.
OpenGL uses column-major matrix storage, so if vPosition is in fact a vector and uMVPMatrix is a matrix, then the first option is correct, if this is in your shader.
If this is not in your shader but in your program code, then there is not enough information.
If you are using the first option but getting unexpected results, you are likely not computing your matrix properly or not passing the correct vertices.
Normally in the vertex shader you should multiply the position by the MVP matrix, that is:
gl_Position = uMVPMatrix * vPosition;
When you change the order to this, it should work.
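As a sketch, the matching Java-side construction would be projection * view * model, written into a matrix that is not also one of the inputs (variable names are taken from the question where possible; the scratch and mvp arrays are illustrative):
float[] scratch = new float[16];
float[] mvp = new float[16];
// scratch = view * model (here the quad's rotation acts as the model transform)
Matrix.multiplyMM(scratch, 0, mVMatrix, 0, mRotationMatrix, 0);
// mvp = projection * view * model
Matrix.multiplyMM(mvp, 0, mProjMatrix, 0, scratch, 0);
GLES20.glUniformMatrix4fv(mMVPMatrixHandle, 1, false, mvp, 0);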
Thanks to all for the help.
I managed to track down the problem (for the most part). I will show what I did.
It was the following line:
Matrix.multiplyMM(mvpMatrix, 0, mvpMatrix, 0, mRotationMatrix, 0);
As you can see, I was multiplying two matrices and storing the result back into one of the matrices used as an input to the multiplication.
So I created a new matrix called mvpMatrix2, stored the result in that, and then passed it to my vertex shader.
//Multiply matrices
Matrix.multiplyMM(mvpMatrix2, 0, mvpMatrix, 0, mRotationMatrix, 0);
//get handle to shape's transformation matrix
mMVPMatrixHandle = GLES20.glGetUniformLocation(iProgId, "uMVPMatrix");
//Give to vertex shader variable
GLES20.glUniformMatrix4fv(mMVPMatrixHandle, 1, false, mvpMatrix2, 0);
After applying this there is no distortion (and also, with regard to my other question, 'Using Matrix.Rotate in OpenGL ES 2.0', I am able to translate the centre of the quad). I say 'for the most part' because when I rotate the quad it rotates backwards: if I ask for +45 degrees (clockwise), it actually rotates by -45 degrees (anti-clockwise).
But hopefully, this will help anyone who has a similar problem in the future.
I'm trying to get some textures rendered on quads, but the screen turns out empty.
(OpenGL ES 2.0)
I formerly used a static color shader on the quads, and the quads did appear on screen, so the positioning is fine.
UPDATE: Now I can see black textures only...
Here is my code:
Vertex Shader:
uniform mat4 uTMatrix;
uniform mat4 uMVPMatrix;
attribute vec4 vPosition;
attribute vec4 vTex;
attribute vec4 inputTextureCoordinate;
varying vec2 TexCoord0;
void main(){
TexCoord0 = inputTextureCoordinate.xy;
gl_Position = uMVPMatrix * uTMatrix * vPosition;
}
Fragment shader:
varying highp vec2 TexCoord0;
uniform sampler2D colorMap;
void main(){
gl_FragColor = texture2D(colorMap, TexCoord0);
}
onDrawFrame():
Matrix.multiplyMM(mMVPMatrix, 0, mProjMatrix, 0, mVMatrix, 0);
GLES20.glUniformMatrix4fv(muMVPMatrixHandle, 1, false, mMVPMatrix, 0);
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
GLES20.glUseProgram(mProgram);
final float halfw = mWidth / 2f;
final int rowHeight = mRowHeight;
final float r = (float) rowHeight / halfw;
int i = mFirstRow;
final float initial = (float) ((i * rowHeight) - mScroll) / halfw;
Matrix.setIdentityM(mTMatrix, 0);
Matrix.translateM(mTMatrix, 0, 0, -initial, 0);
GLES20.glFrontFace(GLES20.GL_CW);
GLES20.glVertexAttribPointer(maPositionHandle, 3, GLES20.GL_FLOAT,
false, 0, mVertBuffer);
GLES20.glEnableVertexAttribArray(maPositionHandle);
GLES20.glVertexAttribPointer(maTexHandle, 3, GLES20.GL_FLOAT,
false, 0, mTexBuffer);
GLES20.glEnableVertexAttribArray(maTexHandle);
GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
GLES20.glEnable(GLES20.GL_TEXTURE_2D);
final int l = mLastRow;
for (; i <= l; i++) {
if (/* A check to see if the bitmap is cached */) {
GLES20.glUniform1i(muTextureHandle, i);
GLES20.glUniformMatrix4fv(muTMatrixHandle, 1, false,
mTMatrix, 0);
GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);
}
Matrix.translateM(mTMatrix, 0, 0, -r, 0);
}
GLES20.glDisableVertexAttribArray(maPositionHandle);
GLES20.glDisableVertexAttribArray(maTexHandle);
loadTexture:
public void loadTexture(GL10 gl, Context c) {
int[] texture = new int[1];
texture[0] = mRow;
GLES20.glDeleteTextures(1, texture, 0);
texture[0] = mRow;
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, texture[0]);
// Create Nearest Filtered Texture
GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D,
GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_NEAREST);
GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D,
GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
// Different possible texture parameters, e.g.
// GLES20.GL_CLAMP_TO_EDGE
GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D,
GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_REPEAT);
GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D,
GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_REPEAT);
Bitmap bitmap = mBitmap;
if (bitmap == null) {
bitmap = mEmptyBitmap;
}
GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, bitmap, 0);
}
Additionally, there's this one line that I'm not sure what it's supposed to do:
if (/* A check to see if the bitmap is cached */) {
GLES20.glUniform1i(muTextureHandle, i); // ??
...
}
I'm guessing your goal is to sample from texture unit GL_TEXTURE0 every time, in which case you should put the constant value 0 into muTextureHandle, or remove this line altogether since zero is the default.
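Put differently, here is a sketch of the draw loop with the sampler fixed on unit 0 and the per-row texture bound before each draw (isRowCached() is a hypothetical stand-in for the elided cache check, and the bind reuses the row index as the texture name because that is what the posted loadTexture() does):
GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
GLES20.glUniform1i(muTextureHandle, 0); // the sampler always reads from texture unit 0
for (; i <= l; i++) {
    if (isRowCached(i)) { // hypothetical helper replacing the original check
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, i); // bind this row's texture object
        GLES20.glUniformMatrix4fv(muTMatrixHandle, 1, false, mTMatrix, 0);
        GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);
    }
    Matrix.translateM(mTMatrix, 0, 0, -r, 0);
}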