Draw invisible triangle in OpenGL ES 2.0 without using a shader - Android

I want to draw an invisible triangle in OpenGL ES 2.0. I thought setting the alpha channel to zero would do it.
Here is how I am passing my triangle vertices.
In my constructor I initialize my triangle vertices:
final float[] triangle2VerticesData = {
// X, Y, Z,
// R, G, B, A
-0.5f, -0.25f, 0.0f,
1.0f, 1.0f, 0.0f, 0.0f, // Alpha is zero
0.5f, -0.25f, 0.0f,
0.0f, 1.0f, 1.0f, 0.0f, // Alpha is zero
0.0f, 0.559016994f, 0.0f,
1.0f, 0.0f, 1.0f, 0.0f}; // Alpha is zero
Extra Information:
In onSurfaceCreated:
GLES20.glClearColor(0.5f, 0.5f, 0.5f, 0.5f);
// ... creating and attaching shaders, creating the view matrix using Matrix.setLookAtM
In onDrawFrame:
GLES20.glClear(GLES20.GL_DEPTH_BUFFER_BIT | GLES20.GL_COLOR_BUFFER_BIT);
// ... performing a simple rotation and passing the MVP matrix to the shaders
In Shaders:
final String vertexShader =
"uniform mat4 u_MVPMatrix; \n"
+ "attribute vec4 a_Position; \n"
+ "attribute vec4 a_Color; \n"
+ "varying vec4 v_Color; \n"
+ "void main() \n"
+ "{ \n"
+ " v_Color = a_Color; \n"
+ " gl_Position = u_MVPMatrix \n"
+ " * a_Position; \n"
+ "} \n";
final String fragmentShader =
"precision mediump float; \n"
+ "varying vec4 v_Color; \n"
+ "void main() \n"
+ "{ \n"
+ " gl_FragColor = v_Color; \n"
+ "}
I am able to change the color of the triangle by playing with the R, G, B values.
BUT setting A (alpha) to 1.0f or 0.0f does not produce any change in transparency/invisibility.
Can anyone tell me where I am going wrong?

If you want the triangle to be completely invisible, simply don't render it. This is the best approach for performance and, with a little extra logic, it should be easy to accomplish.
If you want some transparency, i.e. not completely invisible, then be sure to enable blending. (There is plenty of information online about how to do blending; too much to explain in one post.)
GLES20.glEnable(GLES20.GL_BLEND);
and, after rendering your transparent objects,
GLES20.glDisable(GLES20.GL_BLEND);
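For illustration, a typical per-frame sequence with blending might look like this rough sketch (drawTriangle and mMVPMatrix are hypothetical names standing in for your own draw code):
// Inside onDrawFrame, after clearing the buffers:
GLES20.glEnable(GLES20.GL_BLEND);
// Standard "source over" blending; without a blend function the per-vertex
// alpha has no visible effect.
GLES20.glBlendFunc(GLES20.GL_SRC_ALPHA, GLES20.GL_ONE_MINUS_SRC_ALPHA);
drawTriangle(mMVPMatrix); // hypothetical helper that issues the actual draw call
GLES20.glDisable(GLES20.GL_BLEND);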

Related

Android/OpenGL ES: drawing 2 external textures, each taking half of the screen (left and right half)

I have successfully bound 2 external OES textures to my shader. Now I want each texture to take half of the screen (left for one texture, right for the other). How do I go about doing it? Example:
http://vicceskep.hu/kepek/vicces_funny_007445.jpg
(a random image from Google)
Each half should show the full picture of its texture. It would be nice to have an efficient method for doing it. The code I am currently referencing is the bigflake / Grafika code from GitHub.
Visit http://bigflake.com/mediacodec/CameraToMpegTest.java.txt to check the code out.
Okay, I will give a more in-depth clarification of my question, as I do not have much knowledge about 3D projections in OpenGL. Sorry for the numerous edits to the question.
This is my vertex shader code currently:
private static final String VERTEX_SHADER =
// uMVPMatrix is an identity matrix
"uniform mat4 uMVPMatrix;\n" +
// These are SurfaceTexture.getTransformMatrix() matrices
"uniform mat4 uSTMatrixOne;\n" +
"uniform mat4 uSTMatrixTwo;\n" +
"attribute vec4 aPosition;\n" +
"attribute vec4 aTextureCoord;\n" +
"varying vec2 vTextureCoord;\n" +
"varying vec2 vTextureCoordTwo;\n" +
"void main() {\n" +
" gl_Position = uMVPMatrix * aPosition;\n" +
" vTextureCoord = (uSTMatrix * aTextureCoord).xy;\n" +
" vTextureCoordTwo = (uSTMatrixTwo* aTextureCoord).xy ;\n" +
"}\n";
This is my fragment shader code, which currently does an overlay:
private static final String FRAGMENT_SHADER =
"#extension GL_OES_EGL_image_external : require\n" +
"precision mediump float;\n" + // highp here doesn't seem to matter
"varying vec2 vTextureCoord;\n" +
"varying vec2 vTextureCoordTwo;\n" +
"uniform samplerExternalOES sTextureOne;\n" +
"uniform samplerExternalOES sTextureTwo;\n" +
"void main() {\n" +
" lowp vec4 pixelTop = texture2D(sTextureOne, vTextureCoord);\n" +
" lowp vec4 pixelBot = texture2D(sTextureTwo, vTextureCoordTwo);" +
" gl_FragColor = pixelTop + pixelBot;\n" +
"}\n";
As for aPosition and the texture coordinate, this is where they are currently referenced from. It would be nice if someone explained how mTriangleVerticesData works too.
private final float[] mTriangleVerticesData = {
// X, Y, Z, U, V
-1.0f, -1.0f, 0, 0.f, 0.f, // bottom left
1.0f, -1.0f, 0, 1.f, 0.f, // bottom right
-1.0f, 1.0f, 0, 0.f, 1.f, // top left
1.0f, 1.0f, 0, 1.f, 1.f, // top right
};
GLES20.glVertexAttribPointer(maPositionHandle, 3, GLES20.GL_FLOAT, false,
TRIANGLE_VERTICES_DATA_STRIDE_BYTES, mTriangleVertices);
checkGlError("glVertexAttribPointer maPosition");
GLES20.glEnableVertexAttribArray(maPositionHandle);
checkGlError("glEnableVertexAttribArray maPositionHandle");
mTriangleVertices.position(TRIANGLE_VERTICES_DATA_UV_OFFSET);
GLES20.glVertexAttribPointer(maTextureHandle, 2, GLES20.GL_FLOAT, false,
TRIANGLE_VERTICES_DATA_STRIDE_BYTES, mTriangleVertices);
My 2 external texture bindings currently:
GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, mTextureID);
//Cam Code
//Set texture to be active
GLES20.glActiveTexture(GLES20.GL_TEXTURE1);
GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, mTwoTextureID);
You can do it the way @Sung suggested, but conditional statements and loops in shaders, especially fragment shaders, are slow. It's better to render 2 different polygons.
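For illustration, a minimal sketch of the two-polygon approach (the quad arrays and the drawQuad helper are hypothetical, not from the code above): draw two quads, one covering the left half of clip space and one the right half, and bind a different external texture before each draw. The fragment shader then only needs a single samplerExternalOES.
// X, Y, Z, U, V (left half of the screen: clip-space x from -1 to 0)
private final float[] mLeftQuadData = {
-1.0f, -1.0f, 0, 0.f, 0.f,
0.0f, -1.0f, 0, 1.f, 0.f,
-1.0f, 1.0f, 0, 0.f, 1.f,
0.0f, 1.0f, 0, 1.f, 1.f,
};
// Right half of the screen (clip-space x from 0 to 1)
private final float[] mRightQuadData = {
0.0f, -1.0f, 0, 0.f, 0.f,
1.0f, -1.0f, 0, 1.f, 0.f,
0.0f, 1.0f, 0, 0.f, 1.f,
1.0f, 1.0f, 0, 1.f, 1.f,
};
// In onDrawFrame:
GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, mTextureID);
drawQuad(mLeftQuadData); // hypothetical helper: sets the attrib pointers from this
                         // data and calls glDrawArrays(GL_TRIANGLE_STRIP, 0, 4)
GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, mTwoTextureID);
drawQuad(mRightQuadData);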
Yeah, I finally got it. I used 2 different programs to draw, used mTriangleVerticesData to adjust the image proportions, and used gl_Position to shift the image.

Draw an image-based texture in OpenGL - Android (Wear)

I am drawing an image-based texture using OpenGL in Android, but the image is only partially drawn, as shown below.
My code:
@Override
public void onGlContextCreated() {
super.onGlContextCreated();
shaders = new ShadersDla();
float[] vts = { // x, y, s, t.
-1,  1, 1, 1,
-1,  1, 0, 0,
 1, -1, 1, 1,
 1,  1, 1, 0
};
// AllocateDirect prevents the GC moving this memory.
vtBuffer = ByteBuffer.allocateDirect(vts.length * 4)
.order(ByteOrder.nativeOrder())
.asFloatBuffer();
vtBuffer.put(vts);
}
@Override
public void onGlSurfaceCreated(int width, int height) {
super.onGlSurfaceCreated(width, height);
float aspectRatio = (float) width / height;
float dist = .001f;
Matrix.frustumM(projectionMatrix, 0,
-aspectRatio * dist, aspectRatio * dist, // Left, right.
-dist, dist, // Bottom, top.
dist, 100); // Near, far.
makeTexture();
}
Shader
private static final String VERTEX_SHADER =
// Pass in the modelview matrix as a constant.
"uniform mat4 u_mvpMatrix; \n"
// Pass in the position and texture coordinates per vertex.
+ "attribute vec4 a_position; \n"
+ "attribute vec2 a_texCoord; \n"
// Varyings are sent on to the fragment shader.
+ "varying vec2 v_texCoord; \n"
+ "void main() { \n"
// Transform the vertex coordinate into clip coordinates.
+ " gl_Position = u_mvpMatrix * a_position; \n"
// Pass through the texture coordinate.
+ " v_texCoord = a_texCoord; \n"
+ "} \n";
I need some help with this. Kindly guide me to an easy way; I'm new to Android and OpenGL.
Change the vertex/texture coordinate data to:
{ -1.0f,  1.0f, 0, 0,
   1.0f,  1.0f, 1, 0,
  -1.0f, -1.0f, 0, 1,
   1.0f, -1.0f, 1, 1 };
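In the context of the question's onGlContextCreated, the corrected array might be used like this (a sketch: only the values change, plus an explicit rewind of the buffer):
float[] vts = { // x, y, s, t
-1.0f,  1.0f, 0, 0, // top left
 1.0f,  1.0f, 1, 0, // top right
-1.0f, -1.0f, 0, 1, // bottom left
 1.0f, -1.0f, 1, 1  // bottom right
};
vtBuffer = ByteBuffer.allocateDirect(vts.length * 4)
.order(ByteOrder.nativeOrder())
.asFloatBuffer();
vtBuffer.put(vts);
vtBuffer.position(0); // rewind so drawing reads from the first vertex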

Issue with ID Matrix in GLES2

I have the following code in my C file...
static const char gVertexShader[] =
"attribute vec4 vPosition;\n"
"attribute vec4 vid;\n"
"varying vec4 fragColor; \n"
"attribute vec4 inColor; \n"
"void main() {\n"
" gl_Position = vid * vPosition;\n"
" fragColor = inColor; \n"
"}\n";
static const char gFragmentShader[] = "precision mediump float;\n"
"varying vec4 fragColor; \n"
"void main() {\n"
" gl_FragColor = fragColor;\n"
"}\n";
.....
GLuint gvPositionHandle;
GLuint gvColorHandle;
GLuint gvIDHandle;
....
GLfloat id[] = { 1.0f, 0.0f, 0.0f, 0.0f,
0.0f, 1.0f, 0.0f, 0.0f,
0.0f, 0.0f, 1.0f, 0.0f,
0.0f, 0.0f, 0.0f, 1.0f};
....
glVertexAttribPointer(gvPositionHandle, 2, GL_FLOAT, GL_FALSE, 0,
gTriangleVertices1);
glVertexAttribPointer(gvColorHandle, 4, GL_FLOAT, GL_FALSE, 0, current1);
glVertexAttribPointer(gvIDHandle, 4, GL_FLOAT, GL_FALSE, 0, id);
checkGlError("glVertexAttribPointer");
glEnableVertexAttribArray(gvPositionHandle);
glEnableVertexAttribArray(gvColorHandle);
glEnableVertexAttribArray(gvIDHandle);
checkGlError("glEnableVertexAttribArray");
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
It works fine without the identity matrix...
Then, after I add the identity matrix (so I would not expect any change, since it is an identity matrix), I see the following...
This looks really wrong. Can anyone see what I am doing wrong? Is this a column-major vs. row-major issue, or something like that?
Using mat4 instead of vec4
So I noticed that I should probably be using mat4 for my transformation, so I changed it to the following...
static const char gVertexShader[] =
"attribute vec4 vPosition;\n"
"uniform mat4 vid;\n"
"varying vec4 fragColor; \n"
"attribute vec4 inColor; \n"
"void main() {\n"
" gl_Position = vid * vPosition ;\n"
" fragColor = inColor; \n"
"}\n";
GLint location = glGetUniformLocation(gProgram, "vid");
glUniformMatrix4fv(location, 1, GL_FALSE, id);
Now at first this appears to work, however when I change it to something like this...
GLfloat id[] = { 10.0f, 0.0f, 0.0f, 0.0f,
0.0f, 10.0f, 0.0f, 0.0f,
0.0f, 0.0f, 10.0f, 0.0f,
0.0f, 0.0f, 0.0f, 10.0f};
It does not zoom, so I have a feeling it is not right yet...

OpenGL ES 2.0 Scaling not working

I need to scale an object in OpenGL ES 2.0. Shaders:
private final String vertexShaderCode =
"uniform mat4 uMVPMatrix;" +
"attribute vec4 vPosition;" +
"void main() {" +
//the matrix must be included as a modifier of gl_Position
" gl_Position = vPosition * uMVPMatrix;" +
"}";
private final String fragmentShaderCode =
"precision mediump float;" +
"uniform vec4 vColor;" +
"void main() {" +
" gl_FragColor = vColor;" +
"}";
Projection:
Matrix.orthoM(mProjMatrix,0,
-1.0f, // Left
1.0f, // Right
-1.0f / ratio, // Bottom
1.0f / ratio, // Top
0.01f, // Near
10000.0f);
Drawing setup:
// Set the camera position (View matrix)
Matrix.setLookAtM(mVMatrix, 0, 0, 0, -3, 0f, 0f, 0f, 0f, 1.0f, 0.0f);
// Calculate the projection and view transformation
Matrix.multiplyMM(mMVPMatrix, 0, mProjMatrix, 0, mVMatrix, 0);
Actual render:
float[] scale = {5f,5f,1f};
Matrix.scaleM(scale_matrix, 0, scale[0], scale[1], scale[2]);
Matrix.multiplyMM(r_matrix, 0, scale_matrix, 0, mMVPMatrix, 0);
// Combine the rotation matrix with the projection and camera view
// Apply the projection and view transformation
GLES20.glUniformMatrix4fv(mMVPMatrixHandle, 1, false, r_matrix, 0);
And it will not scale. I can see the triangle and I can rotate it, but scaling does not work.
Since vectors are column vectors in OpenGL, you have to change the order of the matrix multiplication in your vertex shader:
gl_Position = uMVPMatrix * vPosition;
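Applied to the question's code, the corrected vertex shader would then read:
private final String vertexShaderCode =
"uniform mat4 uMVPMatrix;" +
"attribute vec4 vPosition;" +
"void main() {" +
// Matrix on the left, vector on the right: the standard column-vector convention.
"  gl_Position = uMVPMatrix * vPosition;" +
"}";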

OpenGL ES 2.0 Not rendering correctly

This is supposed to render a cube. It looks like some parts of the rear faces are rendering in front of the ones closest to the camera. This happens even if I set it farther away. This is from my renderer:
public void onDrawFrame(GL10 unused) {
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
// Set the camera position
Matrix.setLookAtM(mVMatrix, 0, 0, 0, -3f, 0f, 0f, 0f, 0f, 1.0f, 0.0f);
Matrix.setIdentityM(mModelMatrix, 0);
Matrix.setRotateM(mModelMatrix, 0, mAngle, 0f, 1f, 0.0f);
Matrix.multiplyMM(mMVPMatrix, 0, mVMatrix, 0, mModelMatrix, 0);
Matrix.multiplyMM(mMVPMatrix, 0, mProjMatrix, 0, mMVPMatrix, 0);
// Draw object
cube.draw(mMVPMatrix, context);
mAngle++;
}
And my object's draw method:
public void draw(float[] mvpMatrix, Context context) {
GLES20.glUseProgram(mProgram);
GLES20.glVertexAttribPointer(mPositionHandle, COORDS_PER_VERTEX,
GLES20.GL_FLOAT, false,
vertexStride, vertexBuffer);
GLES20.glVertexAttribPointer(mTexHandle, 2,
GLES20.GL_FLOAT, false,
8, textureBuffer);
GLES20.glEnableVertexAttribArray(mPositionHandle);
GLES20.glEnableVertexAttribArray(mTexHandle);
GLES20.glActiveTexture ( GLES20.GL_TEXTURE0 );
GLES20.glBindTexture ( GLES20.GL_TEXTURE_2D, mTextureID);
GLES20.glUniform1i ( mSampler, 0 );
mMVPMatrixHandle = GLES20.glGetUniformLocation(mProgram, "uMVPMatrix");
GLES20.glUniformMatrix4fv(mMVPMatrixHandle, 1, false, mvpMatrix, 0);
GLES20.glDrawArrays(GLES20.GL_TRIANGLES, 0, vertexCount);
GLES20.glDisableVertexAttribArray(mPositionHandle);
GLES20.glDisableVertexAttribArray(mTexHandle);
}
And my shaders:
String vertexShaderCode =
"uniform mat4 uMVPMatrix;" +
"uniform float u_offset; \n" +
"attribute vec4 a_position; \n" +
"attribute vec2 a_texCoord; \n" +
"varying vec2 v_texCoord; \n" +
"void main() \n" +
"{ \n" +
" gl_Position = uMVPMatrix * a_position; \n" +
" gl_Position.x += u_offset;\n" +
" v_texCoord = a_texCoord; \n" +
"} \n";
String fragmentShaderCode =
"precision mediump float; \n" +
"varying vec2 v_texCoord; \n" +
"uniform sampler2D s_texture; \n" +
"void main() \n" +
"{ \n" +
" gl_FragColor = texture2D(s_texture, v_texCoord); \n" +
"} \n";
And the result (picture):
http://i.imgur.com/eWI2Uom.png
Thanks
Assuming you're using the depth buffer, you don't seem to be clearing it in your onDrawFrame function. Try:
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT|GLES20.GL_DEPTH_BUFFER_BIT);
I'm not sure about Android-specific code as I program on iOS, but it's my understanding from reading "OpenGL ES 2.0 Programming Guide" (by Munshi et al.) that very little differs. Here's what my code looks like from a recent small project:
After setting up your framebuffer and color-renderbuffer, as you've already done, set up the depth buffer.
GLuint depthRenderbuffer;
GLint backingWidth, backingHeight;
glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_WIDTH, &backingWidth);
glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_HEIGHT, &backingHeight);
glGenRenderbuffers(1, &depthRenderbuffer);
glBindRenderbuffer(GL_RENDERBUFFER, depthRenderbuffer);
glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT16, backingWidth, backingHeight);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, depthRenderbuffer);
It's a good idea to include the part that sets the depth buffer dimensions based on the color renderbuffer, using glGetRenderbufferParameteriv(), because it ensures that no matter what, they're going to match.
One side note: on iOS it's recommended to set up the color renderbuffer storage directly from the underlying iOS drawing layer; setting up the depth renderbuffer, however, requires the call to glRenderbufferStorage() instead, as shown above.
You'll also want to include the following lines of code to your draw routine:
glClearDepthf(1.0);
glEnable(GL_DEPTH_TEST);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
If you don't make the call to glEnable(GL_DEPTH_TEST), it seems to glitch at first and then work just fine, at least in my iOS implementation. glClearDepthf(1.0) clears the depth buffer all the way to the far plane, as opposed to a value of 0.0, which clears it to the near plane.
It looks like you may have some Android-specific code, but hopefully this gets you off to the right start. Cheers!
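For the Android side of that advice, the setup is usually much shorter, since GLSurfaceView can allocate the depth buffer for you. A minimal sketch (glSurfaceView stands for your GLSurfaceView instance; the rest mirrors the question's renderer):
// When configuring the GLSurfaceView (before setRenderer):
glSurfaceView.setEGLContextClientVersion(2);
glSurfaceView.setEGLConfigChooser(8, 8, 8, 8, 16, 0); // R, G, B, A, depth, stencil bits
// In onSurfaceCreated: enable depth testing once.
GLES20.glEnable(GLES20.GL_DEPTH_TEST);
// In onDrawFrame: clear both color and depth every frame.
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);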
