OpenGL - why is normalize() not idempotent? - android

I am using OpenGL ES 2.0 to develop an Android game in Java. Currently I am writing my own vertex and fragment shaders. I ran into a weird problem in my fragment shader: normalize(u_LightPos - v_Position) is DIFFERENT from normalize(normalize(u_LightPos - v_Position)), where u_LightPos is a uniform and v_Position a varying.
Why is normalize() not idempotent? Why do I have to call it twice to get an actually normalized (unit-length) vector? This is very confusing.
EDIT:
Here is the vertex shader:
uniform mat4 u_MVPMatrix;
uniform mat4 u_MVMatrix;
attribute vec4 a_Position;
attribute vec3 a_Normal;
varying vec3 v_Position;
varying vec3 v_Normal;
void main() {
v_Position = vec3(u_MVMatrix * a_Position);
v_Normal = vec3(u_MVMatrix * vec4(a_Normal, 0.0));
gl_Position = u_MVPMatrix * a_Position;
}
And here is the fragment shader:
precision mediump float;
uniform vec3 u_LightPos;
uniform vec4 u_Color;
varying vec3 v_Position;
varying vec3 v_Normal;
void main() {
float distance = length(u_LightPos - v_Position);
vec3 lightVector = normalize(normalize(u_LightPos - v_Position));
float diffuse = max(dot(v_Normal, lightVector), 0.0);
gl_FragColor = u_Color * diffuse;
}
If I don't double-normalize the lightVector, the dot product can exceed 1.1, as I have tested. And no, normalizing v_Normal doesn't change that.

It's a precision issue. Setting the precision to highp resolves the problem: u_LightPos and v_Position differed by too much, so the intermediate difference vector was too large for mediump to represent and normalize accurately.
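The failure mode can be sketched outside GLSL. A minimal Java sketch, assuming mediump is implemented as FP16-like half precision (common on mobile GPUs; the class and helper names are mine, and mantissa rounding is ignored):

```java
public class MediumpSketch {
    // Crude stand-in for a mediump float: clamp to the FP16 finite range.
    // (Real hardware also rounds the mantissa to ~10 bits; ignored here.)
    static float toMediump(float x) {
        final float MAX = 65504.0f; // largest finite FP16 value
        if (x > MAX) return Float.POSITIVE_INFINITY;
        if (x < -MAX) return Float.NEGATIVE_INFINITY;
        return x;
    }

    // length() as a mediump shader would evaluate it: the squared
    // components are intermediate mediump values and can overflow.
    static float mediumpLength(float x, float y, float z) {
        float sq = toMediump(toMediump(x * x) + toMediump(y * y) + toMediump(z * z));
        return toMediump((float) Math.sqrt(sq));
    }

    public static void main(String[] args) {
        // u_LightPos far from v_Position: a big difference vector.
        float dx = 300.0f, dy = 200.0f, dz = 100.0f;
        // 300 * 300 = 90000 > 65504, so the intermediate square overflows:
        System.out.println(mediumpLength(dx, dy, dz)); // Infinity
        // normalize() divides by that length, so the result is not a
        // unit vector; highp avoids the overflow.
    }
}
```

In milder cases the squares merely lose precision rather than overflowing, so the first normalize() yields a vector whose length is noticeably off from 1; that is why the second normalize() appeared to help, while raising the precision fixes the root cause.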


Vertex Shader fails to compile. Can't find the mistake

EDIT:
It seems the mistake was that I am not allowed to compile the shader in a separate thread? I had just moved the object loading into a threaded environment when the error message came up; I just didn't think that could be the reason for it.
My current vertex shader fails to compile for some reason. The error message I'm getting is empty, and I can't find the mistake.
uniform mat4 u_MVPMatrix;
uniform mat4 u_MVMatrix;
uniform vec3 u_CameraPos;
attribute vec4 a_Position;
attribute vec4 a_Color;
attribute vec3 a_Normal;
varying vec3 v_Position;
varying vec4 v_Color;
varying vec3 v_Normal;
varying vec3 v_CameraPosition;
varying vec4 v_Ambient;
void main()
{
v_Position = vec3(u_MVMatrix * a_Position);
v_Color = a_Color;
v_Normal = vec3(u_MVMatrix * vec4(a_Normal, 0.0));
//v_CameraPosition = vec3(u_MVMatrix * vec4(u_CameraPos, 0.0)); // taken out to debug
v_CameraPosition = u_CameraPos;
gl_Position = u_MVPMatrix * a_Position;
}
The fragment shader for this one is:
precision mediump float;
uniform vec3 u_LightPos;
uniform vec4 u_Light;
uniform vec4 u_Ambient;
uniform vec3 u_LightDirection;
uniform vec3 u_CameraPos;
varying vec4 v_Ambient;
varying vec3 v_Position;
varying vec4 v_Color;
varying vec3 v_Normal;
varying vec3 v_CameraPosition;
// The entry point for our fragment shader.
void main()
{
float distance = length(u_LightPos - v_Position);
vec3 lightVector = normalize(u_LightPos - v_Position);
float diffuse = max(dot(v_Normal, lightVector), 0.1);
diffuse = diffuse * (1.0 / (1.0 + (0.25 * distance * distance)));
// specular lighting removed due to debugging
gl_FragColor = v_Color * (u_Ambient + (diffuse * u_Light));
}
"Trying" to get an error message:
Log.d(TAG, "Compilation -> " + GLES20.glGetShaderInfoLog(iShader));
returns an empty string, and
Log.e(TAG, "Vertex Shader Failed -> " + GLES20.glGetError());
simply returns 0.
I am developing for OpenGL ES 2.0 on Android; are there any Android-specific restrictions I am unaware of?
Thank you for any help!
OpenGL contexts work only per-thread, so you are correct. If you want a background loading thread, you need to not only create a second context in that thread, but also make sure it shares resources with your main context (the third parameter of eglCreateContext). Be aware that sharing context resources might not work on some (older) devices.

GL_BYTE normals not interpreted properly in Android shader

My shader works GREAT on iOS and Win32. The problem is Android. All devices really botch the normals if I send them as GL_BYTE. If I convert them to GL_FLOAT, everything's peachy.
glVertexAttribPointer(program->slots[Shaders::A_NORMAL_SLOT], 3, GL_FLOAT, GL_FALSE, 0, frame->normals);
glVertexAttribPointer(program->slots[Shaders::A_NORMAL_SLOT], 3, GL_BYTE, GL_TRUE, 0, frame->normals);
and YES, 'frame->normals' is of type float* in the first example, and of type char* in the second example. Like I said, this exact same code works great on iOS/Win32.
What's going on here? Is this some kind of Android bug that it won't accept byte normals?
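For reference, this is what a correctly behaving driver should do with normalized GL_BYTE attributes. Per the OpenGL ES 2.0 spec's conversion rules, a normalized signed byte c maps to (2c + 1) / (2^8 - 1), landing in [-1, 1]. A quick Java sketch of that conversion (class and method names are mine):

```java
public class ByteNormalSketch {
    // GLES 2.0 mapping for a *normalized* signed byte attribute:
    // f = (2c + 1) / (2^8 - 1), which lands in [-1, 1].
    static float normalizedByte(byte c) {
        return (2.0f * c + 1.0f) / 255.0f;
    }

    public static void main(String[] args) {
        System.out.println(normalizedByte((byte) 127));  // 1.0
        System.out.println(normalizedByte((byte) -128)); // -1.0
        // If a driver produces values far outside [-1, 1] (e.g. by
        // ignoring the 'normalized' flag), byte normals will be botched
        // even though the same data works elsewhere.
    }
}
```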
My shader, Vertex:
uniform mat3 normal_matrix;
uniform mat4 mvp_matrix;
uniform vec3 u_lightpos;
uniform vec3 u_eyepos;
uniform vec4 u_color;
attribute vec4 a_vertex;
attribute vec3 a_normal;
attribute vec2 a_texcoord0;
attribute vec4 a_color;
varying vec2 v_texcoord0;
varying vec3 v_viewDir;
varying vec3 v_lightDir;
varying vec3 v_normal;
varying vec4 v_color;
void main( void )
{
vec4 fvObjectPosition = mvp_matrix * (a_vertex);
v_viewDir = u_eyepos;
v_lightDir = u_lightpos;
v_normal = normalize(normal_matrix * a_normal);
v_color = u_color * a_color;
v_texcoord0 = a_texcoord0.xy;
gl_Position = fvObjectPosition;
}
Fragment:
#ifdef GL_ES
precision mediump float;
#endif
uniform vec4 u_specular;
uniform vec4 u_diffuse;
uniform float u_specularpower;
uniform sampler2D t_texture0;
varying vec2 v_texcoord0;
varying vec3 v_viewDir;
varying vec3 v_lightDir;
varying vec3 v_normal;
varying vec4 v_color;
void main( void )
{
float fNDotL = dot( v_normal, v_lightDir );
vec3 fvReflection = normalize( ( ( 2.0 * v_normal ) * fNDotL ) - v_lightDir );
vec4 fvBaseColor = v_color * texture2D( t_texture0, v_texcoord0 );
vec4 fvTotalDiffuse = vec4((u_diffuse * fNDotL).xyz, 1.0) * fvBaseColor;
vec4 fvTotalSpecular = u_specular * ( pow( max( 0.0, dot( fvReflection, -v_viewDir ) ), u_specularpower ) );
gl_FragColor = vec4(( fvBaseColor + fvTotalDiffuse + fvTotalSpecular ).xyz, v_color.w);
}

Android OpenGL 2.0 Alpha Blending issue - half transparent textures

I have this problem with alpha blending in my game: when I draw a surface with an alpha texture, what is supposed to be invisible is invisible, but parts that are supposed to be visible are half-transparent. It depends on the amount of light: the closer the object is to the light source, the better it looks, but in shadows such objects almost disappear.
I enable Alpha Blending:
GLES20.glEnable(GLES20.GL_BLEND);
then I set the function:
GLES20.glBlendFunc(GLES20.GL_SRC_ALPHA, GLES20.GL_ONE_MINUS_SRC_ALPHA);
or
GLES20.glBlendFunc(GLES20.GL_ONE, GLES20.GL_ONE_MINUS_SRC_ALPHA);
the effect is still the same. I use 48-bit PNG files with an alpha channel.
my fragment shader looks like this:
final String fragmentShader =
"precision mediump float; \n"
+"uniform vec3 u_LightPos; \n"
+"uniform sampler2D u_Texture; \n"
+"varying vec3 v_Position; \n"
+"varying vec4 v_Color; \n"
+"varying vec3 v_Normal; \n"
+"varying vec2 v_TexCoordinate; \n"
+"void main() \n"
+"{ \n"
+"float distance = length(u_LightPos - v_Position); \n"
+"vec3 lightVector = normalize(u_LightPos - v_Position); \n"
+"float diffuse = max(dot(v_Normal, lightVector), 0.0); \n"
+"diffuse = diffuse * (" + Float.toString((float)Brightness) +" / (1.0 + (0.08 * distance))); \n"
+"gl_FragColor = (v_Color * diffuse * texture2D(u_Texture, v_TexCoordinate)); \n"
+"} \n";
and vertex shader:
uniform mat4 u_MVPMatrix;
uniform mat4 u_MVMatrix;
attribute vec4 a_Position;
attribute vec4 a_Color;
attribute vec3 a_Normal;
attribute vec2 a_TexCoordinate;
varying vec3 v_Position;
varying vec4 v_Color;
varying vec3 v_Normal;
varying vec2 v_TexCoordinate;
void main()
{
v_Position = vec3(u_MVMatrix * a_Position);
v_Color = a_Color;
v_TexCoordinate = a_TexCoordinate;
v_Normal = vec3(u_MVMatrix * vec4(a_Normal, 0.0));
gl_Position = u_MVPMatrix * a_Position;
}
Thx for any suggestions:)
Your fragment shader multiplies all 4 (RGBA) components of the texture color with the diffuse factor. This will make your alpha component trend towards zero whenever the diffuse light disappears, turning your sprites nearly invisible.
To fix your code change it to something like this:
gl_FragColor = v_Color * texture2D(u_Texture, v_TexCoordinate);
gl_FragColor.rgb *= diffuse;
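The effect of the fix can be checked with the blend equation itself. A small Java sketch of one channel of GL_SRC_ALPHA / GL_ONE_MINUS_SRC_ALPHA blending (the numeric values are made up for illustration):

```java
public class AlphaBlendSketch {
    // One channel of glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA):
    static float blend(float src, float srcAlpha, float dst) {
        return src * srcAlpha + dst * (1.0f - srcAlpha);
    }

    public static void main(String[] args) {
        float texColor = 0.8f; // opaque texel color
        float texAlpha = 1.0f; // opaque texel alpha
        float diffuse  = 0.2f; // fragment sitting in shadow
        float dst      = 0.0f; // black background

        // Buggy shader: diffuse scales RGBA, so alpha drops to 0.2 too.
        float buggy = blend(texColor * diffuse, texAlpha * diffuse, dst);
        // Fixed shader: diffuse scales only RGB; alpha stays 1.0.
        float ok = blend(texColor * diffuse, texAlpha, dst);

        System.out.println(buggy); // ~0.03 -> mostly background shows through
        System.out.println(ok);    // ~0.16 -> dark but fully opaque
    }
}
```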

Scaled ModelMatrix messes with Diffuse lighting

So, I have a hopefully simple question:
I have a simple cube, and I'm using Matrix.scaleM to scale the model-view matrix and compress the cube (there's a reason for this, trust me).
This works; the cube shrinks. However, my fragment shader no longer properly applies the diffuse light source to the top and bottom of the cube. The shader code is as follows.
precision mediump float;
uniform vec3 u_LightPos;
uniform sampler2D u_Texture;
uniform sampler2D u_Texture2;
varying vec3 v_Position;
varying vec4 v_Color;
varying vec3 v_Normal; // Interpolated normal for this fragment.
varying vec2 v_TexCoordinate; // Interpolated texture coordinate per fragment.
// The entry point for our fragment shader.
void main()
{
float distance = length(u_LightPos - v_Position);
// Get a lighting direction vector from the light to the vertex.
vec3 lightVector = normalize(u_LightPos - v_Position);
// Calculate the dot product of the light vector and vertex normal. If the normal and light vector are
// pointing in the same direction then it will get max illumination.
float diffuse = max(dot(v_Normal, lightVector), 0.0);
mediump float emptyness = 0.0;
mediump float half_emptyness = 0.1;
// Add attenuation.
diffuse = diffuse * (1.0 / (1.0 + (0.10 * distance)));
// Add ambient lighting
diffuse = diffuse + 0.3;
vec4 textColor1 = texture2D(u_Texture, v_TexCoordinate);
vec4 textColor2 = texture2D(u_Texture2, v_TexCoordinate);
// Multiply the color by the diffuse illumination level and texture value to get final output color.
if(textColor2.w == emptyness){
diffuse = diffuse * (1.0 / (1.0 + (0.10 * distance)));
gl_FragColor = ( diffuse * textColor1 );//v_Color *
gl_FragColor.a = 1.0;
} else{
diffuse = diffuse * (1.0 / (1.0 + (0.75 * distance)));
gl_FragColor = ( diffuse * textColor1 );//v_Color *
gl_FragColor.a = 0.0;
}
}
So, any ideas?
And I know the color is a little...odd. That's for a completely different reason.
EDIT: As requested, the vertex Shader:
uniform mat4 u_MVPMatrix;
uniform mat4 u_MVMatrix;
attribute vec4 a_Position;
attribute vec4 a_Color;
attribute vec3 a_Normal;
attribute vec2 a_TexCoordinate;
varying vec3 v_Position; // This will be passed into the fragment shader.
varying vec4 v_Color; // This will be passed into the fragment shader.
varying vec3 v_Normal; // This will be passed into the fragment shader.
varying vec2 v_TexCoordinate; // This will be passed into the fragment shader.
// The entry point for our vertex shader.
void main()
{
// Transform the vertex into eye space.
v_Position = vec3(u_MVMatrix * a_Position);
// Pass through the color.
v_Color = a_Color;
// Pass through the texture coordinate.
v_TexCoordinate = a_TexCoordinate;
// Transform the normal's orientation into eye space.
v_Normal = vec3(u_MVMatrix * vec4(a_Normal, 0.0));
float halfer = 2.0;
// gl_Position is a special variable used to store the final position.
// Multiply the vertex by the matrix to get the final point in normalized screen coordinates.
gl_Position = u_MVPMatrix * a_Position;
}
You'll need the inverse-transpose of your model-view matrix, like this:
Shader:
uniform mat4 u_IT_MVMatrix;
...
v_Normal = vec3(u_IT_MVMatrix * vec4(a_Normal, 0.0));
In your Java code you create the matrix from your regular MV matrix like this:
invertM(tempMatrix, 0, modelViewMatrix, 0);
transposeM(it_modelViewMatrix, 0, tempMatrix, 0);
Then you'll just need to pass this into the shader as a uniform.
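Why the inverse-transpose is needed can be verified with plain arithmetic. A toy Java example of my own, using a non-uniform scale and checking that a normal stays perpendicular to a transformed tangent only when it is transformed by the inverse-transpose:

```java
public class NormalMatrixSketch {
    // Dot product of two 3-vectors.
    static double dot(double[] a, double[] b) {
        return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
    }

    public static void main(String[] args) {
        // Non-uniform model-view scale: x stretched by 2 (a diagonal matrix M).
        double sx = 2.0, sy = 1.0, sz = 1.0;

        // A surface tangent and its normal, perpendicular in model space.
        double[] tangent = {1.0, -1.0, 0.0};
        double[] normal  = {1.0,  1.0, 0.0};

        // Tangent directions transform by M itself.
        double[] tM = {sx * tangent[0], sy * tangent[1], sz * tangent[2]};

        // Wrong: transforming the normal by M too.
        double[] nM = {sx * normal[0], sy * normal[1], sz * normal[2]};

        // Right: transforming by the inverse-transpose of M.
        // For a diagonal matrix that is just the reciprocal diagonal.
        double[] nIT = {normal[0] / sx, normal[1] / sy, normal[2] / sz};

        System.out.println(dot(tM, nM));  // 3.0 -> no longer perpendicular
        System.out.println(dot(tM, nIT)); // 0.0 -> still perpendicular
    }
}
```

With a uniform scale the two results differ only by length (which normalize() hides), which is why the bug surfaces exactly when the cube is compressed non-uniformly.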

OpenGL ES 2.0 Phong shader strange result, makes my object transparent when texture is enabled!

I've been stuck for several days now, trying to make my shader work properly.
When I don't attach a texture to my object, I multiply the ambient by the light color, and I get a dark object when there is no light and a properly illuminated one when a light source is activated.
The problem is that when I attach a texture and multiply it by the ambient and light color, I get a transparent object that shows up only when a light source is activated, and you can even see through the object while it is illuminated!
I've tried several code snippets from the internet, but I always get the same result. What am I doing wrong here? I'm desperate...
The application is developed on Android.
Here is my Vertex Shader:
uniform mat4 uMVPMatrix;
uniform mat4 normalMatrix;
// eye pos
uniform vec3 eyePos;
// position and normal of the vertices
attribute vec4 aPosition;
attribute vec3 aNormal;
// texture variables
uniform float hasTexture;
varying float tex;
attribute vec2 textureCoord;
varying vec2 tCoord;
// lighting
uniform vec4 lightPos;
uniform vec4 lightColor;
// material
uniform vec4 matAmbient;
uniform vec4 matDiffuse;
uniform vec4 matSpecular;
uniform float matShininess;
// normals to pass on
varying vec3 vNormal;
varying vec3 EyespaceNormal;
varying vec3 lightDir, eyeVec;
void main() {
// pass on texture variables
tex = hasTexture;
tCoord = textureCoord;
// normal
EyespaceNormal = vec3(normalMatrix * vec4(aNormal, 1.0));
// the vertex position
vec4 position = uMVPMatrix * aPosition;
// light dir
lightDir = lightPos.xyz - position.xyz;
eyeVec = -position.xyz;
gl_Position = uMVPMatrix * aPosition;
}
And here is my Fragment shader:
precision mediump float;
// texture variables
uniform sampler2D texture1; // color texture
varying float tex;
varying vec2 tCoord;
varying vec3 vNormal;
varying vec3 EyespaceNormal;
// light
uniform vec4 lightPos;
uniform vec4 lightColor;
// material
uniform vec4 matAmbient;
uniform vec4 matDiffuse;
uniform vec4 matSpecular;
uniform float matShininess;
// eye pos
uniform vec3 eyePos;
// from vertex s
varying vec3 lightDir, eyeVec;
void main() {
vec4 b = lightColor;
vec4 c = matAmbient;
vec4 d = matDiffuse;
vec4 e = matSpecular;
vec3 g = eyePos;
float f = matShininess;
vec3 N = normalize(EyespaceNormal);
vec3 E = normalize(eyeVec);
vec3 L = normalize(lightDir);
// Reflect the vector. Use this or reflect(incidentV, N);
vec3 reflectV = reflect(-L, N);
// Get lighting terms
vec4 ambientTerm;
if (tex >= 1.0) {
ambientTerm = texture2D(texture1, tCoord);
}
else
ambientTerm = matAmbient * lightColor;
vec4 diffuseTerm = matDiffuse * max(dot(N, L), 0.0);
vec4 specularTerm = matSpecular * pow(max(dot(reflectV, E), 0.0), matShininess);
gl_FragColor = ambientTerm * diffuseTerm + specularTerm;
}
Thanks in advance.
OK, I found it, thanks to JPD002. I was revising the shader again and found out that it has to be:
vec4 diffuseTerm = matDiffuse * max(dot(N, L), 0.0);
vec4 specularTerm = matSpecular * pow(max(dot(reflectV, E), 1.0), matShininess);
Thanks JDP002, it is always good to have 4 eyes on code rather than 2 =D
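One mechanism that can produce exactly this symptom (an assumption on my part, not confirmed in the thread): with a texture bound, ambientTerm is the raw texel, and gl_FragColor = ambientTerm * diffuseTerm scales the texel's alpha by the diffuse factor, so unlit fragments end up with zero alpha. A Java sketch of that arithmetic (the helper name is mine):

```java
public class PhongAlphaSketch {
    // Alpha of the fragment if the texel (ambientTerm) is multiplied by
    // the diffuse term, mirroring: ambientTerm * matDiffuse * max(dot(N,L), 0.0)
    static double litAlpha(double texAlpha, double nDotL) {
        return texAlpha * Math.max(nDotL, 0.0);
    }

    public static void main(String[] args) {
        double texAlpha = 1.0; // fully opaque texel
        // Fragment facing away from the light: dot(N, L) < 0.
        System.out.println(litAlpha(texAlpha, -0.3)); // 0.0 -> fully transparent
        // Fragment facing the light: partially opaque, scaling with light.
        System.out.println(litAlpha(texAlpha, 0.5));  // 0.5
        // Assigning gl_FragColor.a from the texel directly, instead of
        // letting it go through the lighting product, avoids the see-through
        // object.
    }
}
```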
