Android OpenGL ES 2.0 alpha blending issue - half-transparent textures

I have a problem with alpha blending in my game: when I draw a surface with an alpha texture, what is supposed to be invisible is invisible, but the parts that are supposed to be visible are half-transparent. It depends on the amount of light - the closer the surface is to the light source, the better it looks, but in shadow such objects almost disappear.
I enable Alpha Blending:
GLES20.glEnable(GLES20.GL_BLEND);
then I set the function:
GLES20.glBlendFunc(GLES20.GL_SRC_ALPHA, GLES20.GL_ONE_MINUS_SRC_ALPHA);
or
GLES20.glBlendFunc(GLES20.GL_ONE, GLES20.GL_ONE_MINUS_SRC_ALPHA);
The effect is still the same. I use 48-bit PNG files with an alpha channel.
My fragment shader looks like this:
final String fragmentShader =
"precision mediump float; \n"
+"uniform vec3 u_LightPos; \n"
+"uniform sampler2D u_Texture; \n"
+"varying vec3 v_Position; \n"
+"varying vec4 v_Color; \n"
+"varying vec3 v_Normal; \n"
+"varying vec2 v_TexCoordinate; \n"
+"void main() \n"
+"{ \n"
+"float distance = length(u_LightPos - v_Position); \n"
+"vec3 lightVector = normalize(u_LightPos - v_Position); \n"
+"float diffuse = max(dot(v_Normal, lightVector), 0.0); \n"
+"diffuse = diffuse * (" + Float.toString((float)Brightness) +" / (1.0 + (0.08 * distance))); \n"
+"diffuse = diffuse; \n"
//+3
+"gl_FragColor = (v_Color * diffuse * texture2D(u_Texture, v_TexCoordinate)); \n"
+"} \n";
And the vertex shader:
uniform mat4 u_MVPMatrix;
uniform mat4 u_MVMatrix;
attribute vec4 a_Position;
attribute vec4 a_Color;
attribute vec3 a_Normal;
attribute vec2 a_TexCoordinate;
varying vec3 v_Position;
varying vec4 v_Color;
varying vec3 v_Normal;
varying vec2 v_TexCoordinate;
void main()
{
v_Position = vec3(u_MVMatrix * a_Position);
v_Color = a_Color;
v_TexCoordinate = a_TexCoordinate;
v_Normal = vec3(u_MVMatrix * vec4(a_Normal, 0.0));
gl_Position = u_MVPMatrix * a_Position;
}
Thanks for any suggestions. :)

Your fragment shader multiplies all 4 (RGBA) components of the texture color with the diffuse factor. This will make your alpha component trend towards zero whenever the diffuse light disappears, turning your sprites nearly invisible.
To fix it, change your code to something like this:
gl_FragColor = v_Color * texture2D(u_Texture, v_TexCoordinate);
gl_FragColor.rgb *= diffuse;
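Dropped back into the shader string from the question, the whole fragment shader would then look roughly like this (just a sketch; Brightness is the Java field the question already interpolates into the source):
// Sketch of the question's fragment shader with the fix applied: alpha now comes
// only from v_Color and the texture, while diffuse only scales the RGB channels.
final String fragmentShader =
      "precision mediump float;                                          \n"
    + "uniform vec3 u_LightPos;                                          \n"
    + "uniform sampler2D u_Texture;                                      \n"
    + "varying vec3 v_Position;                                          \n"
    + "varying vec4 v_Color;                                             \n"
    + "varying vec3 v_Normal;                                            \n"
    + "varying vec2 v_TexCoordinate;                                     \n"
    + "void main()                                                       \n"
    + "{                                                                 \n"
    + "  float distance = length(u_LightPos - v_Position);               \n"
    + "  vec3 lightVector = normalize(u_LightPos - v_Position);          \n"
    + "  float diffuse = max(dot(v_Normal, lightVector), 0.0);           \n"
    + "  diffuse = diffuse * (" + Float.toString((float) Brightness) + " / (1.0 + (0.08 * distance))); \n"
    + "  gl_FragColor = v_Color * texture2D(u_Texture, v_TexCoordinate); \n"
    + "  gl_FragColor.rgb *= diffuse;                                    \n"
    + "} \n";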

Related

OpenGL ES color errors with MIX function in Android

In my OpenGL ES program I take two input textures and use this fragment shader:
varying highp vec2 textureCoordinate;
varying vec2 textureCoordinate2;
uniform sampler2D inputImageTexture;
uniform sampler2D inputImageTexture2;
uniform lowp float intensity;
void main() {
highp vec4 newColor = texture2D(inputImageTexture2,textureCoordinate2);
highp vec4 vColor = texture2D(inputImageTexture, textureCoordinate);
newColor.r = newColor.r + vColor.r * (1.0 - newColor.a);
newColor.g = newColor.g + vColor.g * (1.0 - newColor.a);
newColor.b = newColor.b + vColor.b * (1.0 - newColor.a);
gl_FragColor = vec4(mix(vColor.rgb, newColor.rgb, 0.86), newColor.a);
}
But the result has some "error color" in it. How can I fix this?
I found the answer myself: it happened because I was reading from and writing to the same FBO texture. After I changed that, it works fine!
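For anyone hitting the same thing: sampling a texture that is also the current render target is undefined in OpenGL ES 2.0, and the usual fix is to ping-pong between two FBOs. A minimal sketch (width, height and the variable names are made up):
int[] fbo = new int[2];
int[] tex = new int[2];
GLES20.glGenFramebuffers(2, fbo, 0);
GLES20.glGenTextures(2, tex, 0);
for (int i = 0; i < 2; i++) {
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, tex[i]);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
    GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA, width, height, 0,
            GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, null);
    GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fbo[i]);
    GLES20.glFramebufferTexture2D(GLES20.GL_FRAMEBUFFER, GLES20.GL_COLOR_ATTACHMENT0,
            GLES20.GL_TEXTURE_2D, tex[i], 0);
}
int src = 0, dst = 1;
// Each pass: bind fbo[dst], sample tex[src], draw, then swap src and dst,
// so a pass never reads the texture it is currently writing to.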

Change color based on location

I would like to draw a simple square and change the color dynamically so that it is brightest in the center and darkest at the borders.
However, when I render it I can't see anything.
here is my vertex shader:
uniform mat4 u_Matrix;
attribute vec4 a_Position;
attribute vec4 a_Color;
varying vec4 v_Color;
varying vec4 pos;
void main()
{
v_Color = a_Color;
pos = u_Matrix * a_Position;
gl_Position = pos;
}
And here is my fragment shader:
precision mediump float;
varying vec4 v_Color;
varying vec4 pos;
void main()
{
float len = length(normalize(pos));
gl_FragColor = vec4(1-len , 1-len , 1-len , 1);
}
You can't reference gl_Position in the fragment shader.
Either hand the position over to the fragment shader through a varying:
attribute vec4 a_Position;
varying vec4 pos;
void main()
{
pos = a_Position;
gl_Position = a_Position;
}
or use gl_FragCoord in the fragment shader - e.g. float dist = length(gl_FragCoord.xy - viewPortCenter); - if you want to work in window coordinates.
Also, rather use the built-in length()/distance() functions, which can be faster, instead of doing your own distance calculation.
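Following the first suggestion, a minimal working pair could look like this (a sketch written as Android-style source strings; it assumes the square's vertices lie roughly within [-1, 1] around the origin in object space, and note that GLSL ES needs 1.0 - len rather than 1 - len, since there is no implicit int-to-float conversion):
final String vertexSrc =
      "uniform mat4 u_Matrix;                  \n"
    + "attribute vec4 a_Position;              \n"
    + "varying vec4 pos;                       \n"
    + "void main() {                           \n"
    + "  pos = a_Position;                     \n"
    + "  gl_Position = u_Matrix * a_Position;  \n"
    + "}                                       \n";
final String fragmentSrc =
      "precision mediump float;                      \n"
    + "varying vec4 pos;                             \n"
    + "void main() {                                 \n"
    + "  float len = length(pos.xy);                 \n" // distance from the center; length(normalize(pos)) would always be 1.0
    + "  gl_FragColor = vec4(vec3(1.0 - len), 1.0);  \n"
    + "}                                             \n";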

OpenGL - why is normalize() not idempotent?

I am using OpenGL ES 2.0 to develop an Android game in Java. Currently I am writing my own vertex and fragment shader. I encountered a weird problem in my fragment shader: normalize(u_LightPos - v_Position) is DIFFERENT from normalize(normalize(u_LightPos - v_Position)), where u_LightPos is a uniform and v_Position a varying.
Why is normalize() not idempotent? Why do I have to call it twice to get an actually normalized (length-1) vector? This is very confusing.
EDIT:
Here is the vertex shader:
uniform mat4 u_MVPMatrix;
uniform mat4 u_MVMatrix;
attribute vec4 a_Position;
attribute vec3 a_Normal;
varying vec3 v_Position;
varying vec3 v_Normal;
void main() {
v_Position = vec3(u_MVMatrix * a_Position);
v_Normal = vec3(u_MVMatrix * vec4(a_Normal, 0.0));
gl_Position = u_MVPMatrix * a_Position;
}
And here is the fragment shader:
precision mediump float;
uniform vec3 u_LightPos;
uniform vec4 u_Color;
varying vec3 v_Position;
varying vec3 v_Normal;
void main() {
float distance = length(u_LightPos - v_Position);
vec3 lightVector = normalize(normalize(u_LightPos - v_Position));
float diffuse = max(dot(v_Normal, lightVector), 0.0);
gl_FragColor = u_Color * diffuse;
}
If I don't double-normalize the lightVector, the dot product will be > 1.1, as I have tested. And no, normalizing v_Normal doesn't change that.
It's a precision issue. Setting the precision to highp resolves the problem. u_LightPos and v_Position differed by too much, resulting in a value that was too large to properly normalize.
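In the question's shader that is just the first line of the fragment shader source, e.g. (a sketch of the corrected shader):
final String fragmentShader =
      "precision highp float;                                   \n" // was mediump
    + "uniform vec3 u_LightPos;                                 \n"
    + "uniform vec4 u_Color;                                    \n"
    + "varying vec3 v_Position;                                 \n"
    + "varying vec3 v_Normal;                                   \n"
    + "void main() {                                            \n"
    + "  float distance = length(u_LightPos - v_Position);      \n"
    + "  vec3 lightVector = normalize(u_LightPos - v_Position); \n" // a single normalize is now enough
    + "  float diffuse = max(dot(v_Normal, lightVector), 0.0);  \n"
    + "  gl_FragColor = u_Color * diffuse;                      \n"
    + "}                                                        \n";
Note that highp support in fragment shaders is optional in OpenGL ES 2.0, so it may not be available on every device.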

Vertex Shader fails to compile. Can't find the mistake

EDIT:
It seems the mistake was that I am not allowed to compile the shader in a separate thread. Since I had just moved the object loading into a background thread, the error message appeared; I just didn't think that could be the reason for it.
My current vertex shader fails to compile for some reason. The error message I'm getting is empty, and I can't find the mistake.
uniform mat4 u_MVPMatrix;
uniform mat4 u_MVMatrix;
uniform vec3 u_CameraPos;
attribute vec4 a_Position;
attribute vec4 a_Color;
attribute vec3 a_Normal;
varying vec3 v_Position;
varying vec4 v_Color;
varying vec3 v_Normal;
varying vec3 v_CameraPosition;
varying vec4 v_Ambient;
void main()
{
v_Position = vec3(u_MVMatrix * a_Position);
v_Color = a_Color;
v_Normal = vec3(u_MVMatrix * vec4(a_Normal, 0.0));
//v_CameraPosition = vec3(u_MVMatrix * vec4(u_CameraPos, 0.0)); // taken out to debug
v_CameraPosition = u_CameraPos;
gl_Position = u_MVPMatrix * a_Position;
}
The fragment shader for this one is:
precision mediump float;
uniform vec3 u_LightPos;
uniform vec4 u_Light;
uniform vec4 u_Ambient;
uniform vec3 u_LightDirection;
uniform vec3 u_CameraPos;
varying vec4 v_Ambient;
varying vec3 v_Position;
varying vec4 v_Color;
varying vec3 v_Normal;
varying vec3 v_CameraPosition;
// The entry point for our fragment shader.
void main()
{
float distance = length(u_LightPos - v_Position);
vec3 lightVector = normalize(u_LightPos - v_Position);
float diffuse = max(dot(v_Normal, lightVector), 0.1);
diffuse = diffuse * (1.0 / (1.0 + (0.25 * distance * distance)));
// specular lighting removed due to debugging
gl_FragColor = v_Color * (u_Ambient + (diffuse * u_Light));
}
"Trying" to get an error message:
Log.d(TAG, "Compilation -> " + GLES20.glGetShaderInfoLog(iShader));
returns an empty string, and
Log.e(TAG, "Vertex Shader Failed -> " + GLES20.glGetError());
simply returns 0.
I am developing for OpenGL ES 2.0 on Android - are there any Android-specific restrictions I'm unaware of?
Thank you for any help!
OpenGL contexts work only per-thread, so you are correct. If you want to create a background loading thread you need to not only create a new context in that thread, but also make sure it shares resources with your main context (the third parameter of eglCreateContext). Be aware that sharing context resources might not work on some (older) devices.
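For reference, a rough sketch of the shared-context setup with EGL14 (display, config and mainContext are assumed to exist already; the variable names are made up):
int[] contextAttribs = { EGL14.EGL_CONTEXT_CLIENT_VERSION, 2, EGL14.EGL_NONE };
// The third argument is the share_context mentioned above: the loader context then
// shares textures, buffer objects and shaders with the main rendering context.
EGLContext loaderContext = EGL14.eglCreateContext(display, config, mainContext, contextAttribs, 0);
// On the loader thread, make it current against some surface (a tiny pbuffer is enough)
// before calling glCreateShader / glCompileShader there.
int[] pbufferAttribs = { EGL14.EGL_WIDTH, 1, EGL14.EGL_HEIGHT, 1, EGL14.EGL_NONE };
EGLSurface loaderSurface = EGL14.eglCreatePbufferSurface(display, config, pbufferAttribs, 0);
EGL14.eglMakeCurrent(display, loaderSurface, loaderSurface, loaderContext);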

Scaled ModelMatrix messes with Diffuse lighting

So, I have a hopefully simple question:
I have a simple cube, and I'm using Matrix.scaleM to scale the model-view matrix and compress the cube (there's a reason for this, trust me).
This works; the cube shrinks. However, my fragment shader no longer properly applies the diffuse light source to the top and bottom of the cube. The shader code is as follows.
precision mediump float;
uniform vec3 u_LightPos;
uniform sampler2D u_Texture;
uniform sampler2D u_Texture2;
varying vec3 v_Position;
varying vec4 v_Color;
varying vec3 v_Normal; // Interpolated normal for this fragment.
varying vec2 v_TexCoordinate; // Interpolated texture coordinate per fragment.
// The entry point for our fragment shader.
void main()
{
float distance = length(u_LightPos - v_Position);
// Get a lighting direction vector from the light to the vertex.
vec3 lightVector = normalize(u_LightPos - v_Position);
// Calculate the dot product of the light vector and vertex normal. If the normal and light vector are
// pointing in the same direction then it will get max illumination.
float diffuse = max(dot(v_Normal, lightVector), 0.0);
mediump float emptyness = 0.0;
mediump float half_emptyness = 0.1;
// Add attenuation.
diffuse = diffuse * (1.0 / (1.0 + (0.10 * distance)));
// Add ambient lighting
diffuse = diffuse + 0.3;
vec4 textColor1 = texture2D(u_Texture, v_TexCoordinate);
vec4 textColor2 = texture2D(u_Texture2, v_TexCoordinate);
// Multiply the color by the diffuse illumination level and texture value to get final output color.
if(textColor2.w == emptyness){
diffuse = diffuse * (1.0 / (1.0 + (0.10 * distance)));
gl_FragColor = ( diffuse * textColor1 );//v_Color *
gl_FragColor.a = 1.0;
} else{
diffuse = diffuse * (1.0 / (1.0 + (0.75 * distance)));
gl_FragColor = ( diffuse * textColor1 );//v_Color *
gl_FragColor.a = 0.0;
}
}
So, any ideas?
And I know the color is a little...odd. That's for a completely different reason.
EDIT: As requested, the vertex shader:
uniform mat4 u_MVPMatrix;
uniform mat4 u_MVMatrix;
attribute vec4 a_Position;
attribute vec4 a_Color;
attribute vec3 a_Normal;
attribute vec2 a_TexCoordinate;
varying vec3 v_Position; // This will be passed into the fragment shader.
varying vec4 v_Color; // This will be passed into the fragment shader.
varying vec3 v_Normal; // This will be passed into the fragment shader.
varying vec2 v_TexCoordinate; // This will be passed into the fragment shader.
// The entry point for our vertex shader.
void main()
{
// Transform the vertex into eye space.
v_Position = vec3(u_MVMatrix * a_Position);
// Pass through the color.
v_Color = a_Color;
// Pass through the texture coordinate.
v_TexCoordinate = a_TexCoordinate;
// Transform the normal's orientation into eye space.
v_Normal = vec3(u_MVMatrix * vec4(a_Normal, 0.0));
float halfer = 2.0;
// gl_Position is a special variable used to store the final position.
// Multiply the vertex by the matrix to get the final point in normalized screen coordinates.
gl_Position = u_MVPMatrix * a_Position;
}
You'll need the inverse-transpose of your model-view matrix to transform the normals, like this:
Shader:
uniform mat4 u_IT_MVMatrix;
...
v_Normal = vec3(u_IT_MVMatrix * vec4(a_Normal, 0.0));
In your Java code you create the matrix from your regular MV matrix like this:
invertM(tempMatrix, 0, modelViewMatrix, 0);
transposeM(it_modelViewMatrix, 0, tempMatrix, 0);
Then you'll just need to pass this into the shader as a uniform.
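On the Java side, building and uploading it might look like this (a sketch; programHandle and modelViewMatrix are assumed to come from your existing renderer):
float[] tempMatrix = new float[16];
float[] it_modelViewMatrix = new float[16];
Matrix.invertM(tempMatrix, 0, modelViewMatrix, 0);
Matrix.transposeM(it_modelViewMatrix, 0, tempMatrix, 0);
int itMVMatrixHandle = GLES20.glGetUniformLocation(programHandle, "u_IT_MVMatrix");
GLES20.glUniformMatrix4fv(itMVMatrixHandle, 1, false, it_modelViewMatrix, 0);
Since the interpolated normal is no longer unit length after scaling, it can also help to normalize(v_Normal) in the fragment shader before the dot product.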
