This is as much debug output as I can obtain.
As you can see, glGetProgramInfoLog() returns the following message:
"Invalid vertex shader. Link cannot proceed."
In the Eclipse IDE, the yellow line marks the code line that has just executed (via step-over), and the green line with the arrow pointing at it marks the line that will execute next.
I would prefer an error that tells me which line of my vertex shader is wrong, or at least something that points me in the right direction. With such a vague message, all I can do is ask for help here.
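(For reference, per-line compile errors normally come from the shader object's info log, queried right after compilation, rather than from the program's link-time log. Below is a minimal sketch assuming Android's GLES20 bindings; compileVertexShader and vertexShaderSource are placeholder names, not code from the project.)
import android.opengl.GLES20;
import android.util.Log;

// Compiles a vertex shader and logs the driver's per-line errors on failure.
static int compileVertexShader(String vertexShaderSource) {
    int shader = GLES20.glCreateShader(GLES20.GL_VERTEX_SHADER);
    GLES20.glShaderSource(shader, vertexShaderSource);
    GLES20.glCompileShader(shader);
    int[] compiled = new int[1];
    GLES20.glGetShaderiv(shader, GLES20.GL_COMPILE_STATUS, compiled, 0);
    if (compiled[0] == 0) {
        // Most drivers report messages in the form "0:<line>: error: ...".
        Log.e("Shader", GLES20.glGetShaderInfoLog(shader));
        GLES20.glDeleteShader(shader);
        return 0;
    }
    return shader;
}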
The vertex shader code looks like this:
uniform mat4 u_mvpMatrix;
uniform mat4 u_mvMatrix;
uniform vec3 u_lightPosition;
attribute vec4 a_position;
attribute vec4 a_color;
attribute vec3 a_normal;
varying vec4 v_color;
void main() {
    vec3 modelViewVertex = vec3(u_mvMatrix * a_position);
    vec3 modelViewNormal = vec3(u_mvMatrix * vec4(a_normal, 0.0));
    float distance = length(u_lightPosition - modelViewVertex);
    vec3 lightVector = normalize(u_lightPosition - modelViewVertex);
    float diffuse = max(dot(modelViewNormal, lightVector), 0.1);
    diffuse = diffuse * (1.0 / (0.25 * distance * distance));
    v_color = a_color * diffuse;
    gl_Position = u_mvpMatrix * a_position;
}
I don't know where I went wrong. Can you help me find the problem? If you need more info, I'll add it. Thanks in advance.
EDIT 1:
There is no debug info from glGetShaderInfoLog().
EDIT 2:
Tried a combination of glGetShaderiv() and glGetShaderInfoLog(); still no luck.
Found the error!
All this time, I was editing the wrong source file!
I kept saving and checking, saving and checking, never realizing that the source file shown above isn't actually included in the project. The program reported my vertex shader as invalid because it was compiling an empty vertex shader, not the newly modified version (the file I had been editing this whole time).
I'm sorry for wasting everybody's time.
New to OpenGL and GLSL.
I am using OpenGL ES 3.0, and my GLSL version is #version 300 es.
I want to get the pixel (ARGB data) at every position in my vertex shader (vertex texture fetch). I have verified that my Android tablet supports vertex texture fetch.
Now I pass the texture (image) and the texture coordinates to the vertex shader
and execute
GLES30.glDrawArrays(GLES30.GL_TRIANGLE_STRIP, 0, 4);
Is this the right way, or should I use GL_POINTS?
If I use GL_POINTS, how do I pass the texture coordinates?
Could you provide any sample/example code that does a full pixel read (ARGB) in the vertex shader?
Attaching my vertex shader:
uniform sampler2D sTexture;
in vec4 aTextureCoord;
out vec3 colorFactor;
vec2 vTextureCoord;
vec4 tex;
void main()
{
    vTextureCoord = aTextureCoord.xy;
    tex = texture(sTexture, vTextureCoord);
    float luminance = 0.299 * tex.r + 0.587 * tex.g + 0.114 * tex.b;
    colorFactor = vec3(1.0, 1.0, 1.0);
    gl_Position = vec4(-1.0 + (luminance * 0.00784313725), 0.0, 0.0, 1.0);
    gl_PointSize = 1.0;
}
My texture coordinates passed are
{0.f, 1.f}
{1.f, 1.f}
{0.f, 0.f}
{1.f, 0.f}
and the shader is triggered by
GLES30.glDrawArrays(GLES30.GL_TRIANGLE_STRIP, 0, 4);
Just declare a sampler and sample it exactly as per usual. E.g.:
#version 150
in vec4 position;
in vec2 texCoordinate;
uniform sampler2D texID;
uniform mat4 modelViewProjection;
void main()
{
    gl_Position = modelViewProjection * (position + texture(texID, texCoordinate));
}
That will sample a 4D vector from the texture unit texID at the location texCoordinate, using it to perturb position prior to applying the model-view-projection matrix. The type of geometry you're drawing makes no difference.
gl_Position = vec4(-1.0 + (luminance * 0.00784313725), 0.0, 0.0, 1.0);
That's clever. It's not going to work, but that's clever.
The thing that is confusing everyone is what you haven't told us. That you're computing the histogram by using blending. That you compute the location for each fragment based on the luminance, so you get lots of overlap. And blending just adds everything together, thus producing your histogram.
FYI: It's always best to explain what it is you're actually trying to accomplish in your question, rather than hoping that someone can deduce what you're attempting to do.
That's not going to work because you only have four vertices in this case. You have lots of fragments, but they will be generated based on interpolation from your 4 vertices. And you can't change the position of a fragment from within a fragment shader.
If you want to do what you're trying to do, you still have to render one vertex for every texel you fetch. You still need to use GL_POINTS.
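A hedged sketch of that point-per-texel approach, assuming GLES30 on Android: gl_VertexID (core in GLSL ES 3.00) stands in for per-point texture coordinates, and uTexSize, texWidth, and texHeight are assumed names rather than code from the question.
// Vertex shader: one GL_POINTS vertex per texel, indexed by gl_VertexID.
// (The fragment shader just writes a small constant value per point.)
String histogramVertexShader =
        "#version 300 es\n" +
        "uniform sampler2D sTexture;\n" +
        "uniform ivec2 uTexSize;\n" +
        "void main() {\n" +
        "  ivec2 texel = ivec2(gl_VertexID % uTexSize.x, gl_VertexID / uTexSize.x);\n" +
        "  vec4 tex = texelFetch(sTexture, texel, 0);\n" +
        "  float luminance = 0.299 * tex.r + 0.587 * tex.g + 0.114 * tex.b;\n" +
        "  gl_Position = vec4(-1.0 + 2.0 * luminance, 0.0, 0.0, 1.0);\n" +
        "  gl_PointSize = 1.0;\n" +
        "}\n";

// Additive blending accumulates overlapping points into histogram bins.
GLES30.glEnable(GLES30.GL_BLEND);
GLES30.glBlendFunc(GLES30.GL_ONE, GLES30.GL_ONE);
GLES30.glDrawArrays(GLES30.GL_POINTS, 0, texWidth * texHeight);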
I would like to be able to pass more per-vertex-data to my own custom shaders in kivy than the usual vertex coords + texture coords. Specifically, I would like to pass a value that says which animation frame should be used in selecting the texture coords.
I found an example (http://shadowmint.blogspot.com/2013/10/kivy-textured-quad-easy-right-no.html), and succeeded in changing the format of the vertices passed to a mesh using an argument to the constructor of the Mesh, like this:
Mesh(mode='triangles', fmt=[('v_pos', 2, 'float'),
                            ('v_tex0', 2, 'float'),
                            ('v_frame_i', 1, 'float')])
I can then set the vertices to be drawn to something like this:
vertices = [x-r,y-r, uvpos[0],uvpos[1],animationFrame,
x-r,y+r, uvpos[0],uvpos[1]+uvsize[1],animationFrame,
x+r,y-r, uvpos[0]+uvsize[0],uvpos[1],animationFrame,
x+r,y+r, uvpos[0]+uvsize[0],uvpos[1]+uvsize[1],animationFrame,
x+r,y-r, uvpos[0]+uvsize[0],uvpos[1],animationFrame,
x-r,y+r, uvpos[0],uvpos[1]+uvsize[1],animationFrame,
]
This works well when I run on Ubuntu, but when I run on my Android device the texture either doesn't draw, or the vertex or texture coordinate data looks corrupt or misaligned.
Here is my shader code in case it is relevant. Again, this all behaves as I want when I run on Ubuntu, but not when I run on the Android device.
---VERTEX SHADER---
#ifdef GL_ES
precision highp float;
#endif
/* vertex attributes */
attribute vec2 v_pos;
attribute vec2 v_tex0;
attribute float v_frame_i; // for animation
/* uniform variables */
uniform mat4 modelview_mat;
uniform mat4 projection_mat;
uniform vec4 color;
uniform float opacity;
uniform float sqrtNumFrames; // the width/height of the sprite-sheet
uniform float frameWidth;
/* Outputs to the fragment shader */
varying vec4 frag_color;
varying vec2 tc;
void main() {
    frag_color = color * vec4(1.0, 1.0, 1.0, opacity);
    gl_Position = projection_mat * modelview_mat * vec4(v_pos.xy, 0.0, 1.0);
    float f = round(v_frame_i);
    tc = v_tex0;
    float w = (1.0 / sqrtNumFrames);
    tc *= w;
    tc.x += w * mod(f, sqrtNumFrames);     //////////// I think that the problem might
    tc.y += w * round(f / sqrtNumFrames);  ///////////// be related to this code, here?
}
---FRAGMENT SHADER---
#ifdef GL_ES
precision highp float;
#endif
/* Outputs from the vertex shader */
varying vec4 frag_color;
varying vec2 tc;
/* uniform texture samplers */
uniform sampler2D texture0;
uniform vec2 player_pos;
uniform vec2 window_size; // in pixels
void main(void) {
    gl_FragColor = frag_color * texture2D(texture0, tc);
}
I wonder if it may have to do with the GLSL version and int/float math (in particular in identifying which image from the sprite sheet to draw; see the comments in the GLSL code). Could one GLSL version be running on my desktop and another on the device?
Any suggestions for things to experiment with would be much appreciated!
After looking at the log from the running version on the Android device (a Moto X phone), I saw that the custom shader was not linking. This appeared to be due to the use of the function round(x), which I replaced with floor(x + 0.5) in both places, and the shader now works properly on both the phone and my desktop.
I think the problem is that the GLSL versions supported on the phone and on my PC are different, but I am not 100% certain about this.
I am trying to make an app with OpenGL ES 2. I have two devices for testing.
Unfortunately, there seems to be a difference between them.
I have the following "basic" shader code for testing:
// vertex shader
uniform mat4 uVMatrix; // View Matrix
uniform mat4 uPMatrix; // perspective Matrix
uniform vec3 uLight1Pos; // LightPos
attribute vec4 aPosition;
attribute vec4 aColor;
varying vec4 vColor;
void main() {
    vColor = aColor;
    mat4 MVPMatrix = uPMatrix * uVMatrix;
    gl_Position = MVPMatrix * aPosition;
}
// fragment shader
precision mediump float; // ES 2.0 fragment shaders have no default float precision
uniform mat4 uVMatrix;
uniform vec3 uLight1Pos;
varying vec4 vColor;
varying vec3 vPosition;
void main() {
    vec3 light2Pos = uLight1Pos;
    gl_FragColor = vColor;
}
The problem is, some uniforms can't be found.
I attach and link the shaders as usual, but when checking the handles like this:
uVMatrixHandle = GLES20.glGetUniformLocation(mProgram, "uVMatrix");
uPMatrixHandle = GLES20.glGetUniformLocation(mProgram, "uPMatrix");
uLight1PosHandle = GLES20.glGetUniformLocation(mProgram, "uLight1Pos");
I get different values.
On my Galaxy S1 they are 2, 1, 3 (all valid values, even if in a weird order).
On my Galaxy S3 they are 0, 1, -1 (so the last one cannot be found).
What am I doing wrong? Do I have to declare uniforms differently on the S3 (Mali GPU)?
I realized that I have to "use" the uniforms in order to get a reference; that is why I wrote vec3 light2Pos = uLight1Pos;. If I don't do this, I get no reference on the S1 either.
Thank you for your help!
Tobias
- EDIT -
Weirdly enough, I tried changing the uniform in the vertex shader from a vec3 to a mat4:
uniform mat4 uLight1Pos;
mat4 lPos = uLight1Pos;
It appears that matrices work fine, and I can get a handle when using a matrix. How come?
Your "use" doesn't contribute to the output gl_FragColor. You just proved that the S3 has a better compiler than the S1 at removing dead code.
GLSL compilers tend to eliminate unused uniforms which don't contribute to the result. This behavior is in compliance with the OpenGL specification.
In your case, the compiler is allowed to eliminate the dead line vec3 light2Pos = uLight1Pos;, leaving the uLight1Pos uniform inactive, so glGetUniformLocation returns -1 for it.
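A minimal defensive pattern for this, sketched with Android's GLES20 bindings (lightPos is a placeholder float array; the other names mirror the question):
// The uniform may have been optimized away, in which case its location is -1.
int uLight1PosHandle = GLES20.glGetUniformLocation(mProgram, "uLight1Pos");
if (uLight1PosHandle != -1) {
    GLES20.glUniform3fv(uLight1PosHandle, 1, lightPos, 0);
}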
I'm using a fragment shader that uses the dFdx and dFdy functions to calculate the normal of the face in view, for a flat-shaded appearance. This shader has been running fine in GLES 2.0 and 3.0. Inexplicably, the shader doesn't work on Android 4.4 (KitKat, GLES 3.0).
(Solved! Taking individual derivatives for each component solves the problem.)
In order to check the error, I prepared these shaders:
//Vertex Shader
#version 300 es
precision highp float;
precision highp int;
uniform mat4 PMatrix; //Projection Matrix (varies according to camera)
uniform mat4 MVMatrix; //Model View Matrix (no change)
in vec3 vPosition;
out vec3 vPos;
void main()
{
    gl_Position = PMatrix * MVMatrix * vec4(vPosition.xyz, 1.0);
    vPos = (MVMatrix * vec4(vPosition.xyz, 1.0)).xyz;
}
// Fragment shader
#version 300 es
#extension GL_OES_standard_derivatives : enable
precision highp float;
precision highp int;
in vec3 vPos;
out vec4 fragColor; // output declaration (fragColor is not a built-in)
void main()
{
    // doesn't run correctly:
    // vec3 fdx = dFdx(vPos);
    // vec3 fdy = dFdy(vPos);
    // ***Solved!*** this runs correctly in KitKat:
    vec3 fdx = vec3(dFdx(vPos.x), dFdx(vPos.y), dFdx(vPos.z));
    vec3 fdy = vec3(dFdy(vPos.x), dFdy(vPos.y), dFdy(vPos.z));
    vec3 N = normalize(cross(fdx, fdy));
    fragColor = vec4(N, 1.0);
}
When drawing a cube on Android < 4.4, the colors remain fixed for each side, independently of camera position (correct: the color identifies each face normal). On Android 4.4, the colors vary as you move the camera.
Analyzing each derivative separately:
1. fragColor = vec4(normalize(fdx), 1.0); gives colors that change constantly (wrong).
2. fragColor = vec4(normalize(fdy), 1.0); gives colors that remain quasi-stable (quasi-OK).
Is this a bug in the implementation of these functions in Android 4.4, or are we doing something wrong?
You said that this was originally a GLES2 shader, and you are using highp in a fragment shader unconditionally? That is a disaster waiting to happen, because GLES2 implementations are not required to support highp in fragment shaders. Likewise, support for dFdx(), dFdy(), and fwidth() is optional.
You need to check for GL_OES_standard_derivatives and GL_FRAGMENT_PRECISION_HIGH in the GLES2 implementation of this fragment shader.
To that end, you might consider the accuracy hint for derivatives:
GL_FRAGMENT_SHADER_DERIVATIVE_HINT_OES (GLES2, if the extension is supported)
GL_FRAGMENT_SHADER_DERIVATIVE_HINT (GLES3)
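A sketch of those GLES2 runtime checks, assuming Android's GLES20 bindings. The 0x8B8B constant is the GL_FRAGMENT_SHADER_DERIVATIVE_HINT_OES enum value from the extension spec (the GLES20 class does not expose it by name), and GL_FRAGMENT_PRECISION_HIGH is a macro tested inside the shader source itself:
// Derivatives are optional in GLES2: check the extension string first.
String extensions = GLES20.glGetString(GLES20.GL_EXTENSIONS);
boolean hasDerivatives = extensions.contains("GL_OES_standard_derivatives");
if (hasDerivatives) {
    // Ask for accurate rather than fast derivatives.
    GLES20.glHint(0x8B8B /* GL_FRAGMENT_SHADER_DERIVATIVE_HINT_OES */, GLES20.GL_NICEST);
}

// highp availability is checked from GLSL via the built-in macro:
String precisionHeader =
        "#ifdef GL_FRAGMENT_PRECISION_HIGH\n" +
        "precision highp float;\n" +
        "#else\n" +
        "precision mediump float;\n" +
        "#endif\n";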
Individual derivatives in each component solve the problem in KitKat:
// Replace vec3 fdx = dFdx(vPos) by:
vec3 fdx = vec3(dFdx(vPos.x),dFdx(vPos.y),dFdx(vPos.z));
// Replace vec3 fdy = dFdy(vPos) by:
vec3 fdy = vec3(dFdy(vPos.x),dFdy(vPos.y),dFdy(vPos.z));
I've seen similar things on some AMD desktop setups.
Where dFdy() on a full .xyz vector worked fine on NVIDIA/Intel, I had to take the derivative per component to get correct results on some AMD cards.
I want to use shading on my OpenGL objects, but I can't seem to access GLSL functions in my OpenGL package. Is there a GLSL package available for OpenGL ES in Eclipse?
EDIT: As Tim pointed out, shaders are written as text files and then loaded using glShaderSource. I have a shader I once wrote for a C++ ray-tracing application, but I am really confused about how to go about implementing this in Java. Suppose I have a 2D square drawn in my Renderer class using a GL object MySquare; how would I go about implementing the Java equivalent of the shader files below?
Shader.vert
varying vec3 N;
varying vec3 v;
void main()
{
    // Need to transform the normal into eye space.
    N = normalize(gl_NormalMatrix * gl_Normal);
    // Always have to transform vertex positions so they end
    // up in the right place on the screen.
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}
Shader.frag
// Fragment shader for per-pixel Phong interpolation and shading.
// The "varying" keyword means that the parameter's value is interpolated
// between the nearby vertices.
varying vec3 N;
varying vec3 v;
//Used for Environmental mapping shader calculations
const vec3 xUnitVec=vec3(1.0, 0.0, 0.0), yUnitVec=vec3(1.0, 1.0, 0.0);
uniform vec3 BaseColor, MixRatio;
uniform sampler2D EnvMap;
void main()
{
    // The scene's ambient light.
    vec4 ambient = gl_LightModel.ambient * gl_FrontMaterial.ambient;
    // The normal vector is generally not normalized after being
    // interpolated across a triangle. Here we normalize it.
    vec3 Normal = normalize(N);
    // Since the vertex is in eye space, the direction to the
    // viewer is simply the normalized vector from v to the origin.
    vec3 Viewer = -normalize(v);
    // Get the lighting direction and normalize it.
    vec3 Light = normalize(gl_LightSource[0].position.xyz);
    // Compute the halfway vector.
    vec3 Half = normalize(Viewer + Light);
    // Compute a factor to prevent light leakage from below the surface.
    float B = 1.0;
    if (dot(Normal, Light) < 0.0) B = 0.0;
    // Compute the geometric terms of diffuse and specular.
    float diffuseShade = max(dot(Normal, Light), 0.0);
    float specularShade =
        B * pow(max(dot(Half, Normal), 0.0), gl_FrontMaterial.shininess);
    // Compute the product of the geometric terms with the material
    // and lighting values.
    vec4 diffuse = diffuseShade * gl_FrontLightProduct[0].diffuse;
    vec4 specular = specularShade * gl_FrontLightProduct[0].specular;
    ambient += gl_FrontLightProduct[0].ambient;
    // Assign the final color.
    gl_FragColor = ambient + diffuse + specular + gl_FrontMaterial.emission;
}
Check out the tutorials over at learnopengles.com. They'll answer all the questions you have.
There's no 'Java equivalent' of a shader file. The shader is written in GLSL. The shader will be the same whether your OpenGL is wrapped in Java, C++, Python, or anything else. Aside from small API differences between OpenGL and OpenGL ES, you can upload the same shader source in Java as you used in C++, character for character. (One caveat: fixed-function built-ins like gl_ModelViewProjectionMatrix and gl_LightSource, used above, do not exist in OpenGL ES 2.0's GLSL; those have to become your own uniforms and attributes.)
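To make the Java side concrete, here is a minimal sketch of uploading GLSL text with Android's GLES20 bindings; vertexSrc and fragmentSrc are placeholder strings holding the shader sources, and error checking is omitted for brevity:
// Create and compile both GLSL stages, then link them into a program.
int vs = GLES20.glCreateShader(GLES20.GL_VERTEX_SHADER);
GLES20.glShaderSource(vs, vertexSrc);
GLES20.glCompileShader(vs);
int fs = GLES20.glCreateShader(GLES20.GL_FRAGMENT_SHADER);
GLES20.glShaderSource(fs, fragmentSrc);
GLES20.glCompileShader(fs);
int program = GLES20.glCreateProgram();
GLES20.glAttachShader(program, vs);
GLES20.glAttachShader(program, fs);
GLES20.glLinkProgram(program);
GLES20.glUseProgram(program); // bind before setting uniforms and drawing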
You can use this source:
https://github.com/markusfisch/ShaderEditor
The ShaderView class is the key!
The GLSL files are in the raw folder.