Custom Shader not working on Android Device (using LibGDX)

I've been starting out using LibGDX to develop for PC as well as Android. I'm currently working on a little project that requires me to write custom shaders in GLSL (based on full-screen quads). Coming from an HLSL environment, I had a little trouble setting the whole shader system up in combination with SpriteBatch (as I want to keep the code simple where I can).
I have the following shader code working on desktop:
Vertexshader:
attribute vec4 a_position;
uniform mat4 u_projectionViewMatrix;
void main()
{
gl_Position = a_position * u_projectionViewMatrix;
}
Fragmentshader:
uniform vec2 effectOrigin;
uniform vec2 effectDir;
void main()
{
float distToOrigin = distance(effectOrigin.xy, gl_FragCoord.xy);
gl_FragColor = vec4(mod(distToOrigin+effectDir.x*30, 30)-15, 0.0, 0.0, 1.0);
}
As I said, it works on desktop (Windows) and gives me a nice circle pattern. On Android, however, it doesn't compile and I just get the cleared background color. I suspected it had something to do with OpenGL ES 2 permissions on Android, so I added this line to the manifest:
<uses-feature android:glEsVersion="0x00020000" android:required="true" />
but this didn't seem to make a difference.
I also thought it might have something to do with the precision of the floats and vectors, but I wasn't able to figure out how I would have to change them in order to fix it.
Is there someone who can help me on this? I haven't been able to find an answer elsewhere!
Thanks in advance,
Yuri
P.S. Is there a way to see what went wrong during compilation of the shaders? I know shaders are hard to debug, but it would be awesome to have at least some idea of where the compilation failed.

Fixed it using P.T.´s suggestion on logging the compilation!
There were two problems in the fragment shader, the fixed shader looks like this:
precision mediump float;
uniform vec2 effectOrigin;
uniform vec2 effectDir;
void main()
{
float distToOrigin = distance(effectOrigin.xy, gl_FragCoord.xy);
gl_FragColor = vec4(mod(distToOrigin+effectDir.x*30.0, 30.0)-15.0, 0.0, 0.0, 1.0);
}
I added the precision definition at the top, and also changed the constant values from integers to floats (e.g. 15 to 15.0) to fix it!
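For anyone else who hits this, the compile-log check P.T. suggested looks roughly like this in LibGDX (a minimal sketch, not my exact code; the file names are placeholders):
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.graphics.glutils.ShaderProgram;

public class ShaderLoader {
    public static ShaderProgram load() {
        ShaderProgram.pedantic = false; // don't throw on unused uniforms/attributes
        ShaderProgram shader = new ShaderProgram(
                Gdx.files.internal("shaders/effect.vert"),   // placeholder path
                Gdx.files.internal("shaders/effect.frag"));  // placeholder path
        if (!shader.isCompiled()) {
            // On Android this is where errors like the missing precision qualifier show up.
            Gdx.app.error("Shader", shader.getLog());
        }
        return shader;
    }
}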
Thanks P.T.!

Related

Rendering from GL_TEXTURE_EXTERNAL_OES to GL_TEXTURE_2D but only one color flashes

I'm pretty new to OpenGL ES and currently have a problem rendering my video output on a screen in Unity.
I was developing a video player project with Unity. I bought the EasyMovieTexture plugin and, years ago, replaced its video player module with another open-source player (Ijkplayer), which has worked fine ever since.
Now I want to replace it with a newer VLC build using libvlcjni. I compiled it and simply swapped out the old Ijkplayer, but it didn't work as I expected. The screen just flashes a single color taken from each video frame, although the video keeps advancing and the audio track plays normally.
Screenshot: a test scene with a screen (sorry, there's a mistake with the texcoord, but the point is the color flashing).
I'd like to provide some further information in the hope that someone can help (sorry if I have some mistakes or misunderstandings):
As far as I know, these video player modules need a Surface (or SurfaceTexture) in the Android layer, with the video decoder acting as the data producer. The SurfaceTexture consumes data from the producer and exposes it as a texture of type GL_TEXTURE_EXTERNAL_OES, which can be consumed and displayed directly by components like TextureView. This texture data cannot be consumed in the Unity layer, however, unless I use a GLSL shader that samples the OES texture directly, like this:
// Unity shader with GLSL
GLSLPROGRAM
#pragma only_renderers gles3
#include "UnityCG.glslinc"
// ...
// Ignoring vertex shader
// ...
in vec2 textureCoord;
layout(binding = 0) uniform samplerExternalOES _MainTex;
out vec4 fragColor;
void main()
{
fragColor = texture(_MainTex, textureCoord);
}
ENDGLSL
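For context, the producer side I described above is usually wired up roughly like this (a minimal sketch of the standard Android APIs, not code from EasyMovieTexture or libvlcjni):
import android.graphics.SurfaceTexture;
import android.opengl.GLES11Ext;
import android.opengl.GLES20;
import android.view.Surface;

// Sketch of the producer side: the decoder writes into a Surface backed by a
// SurfaceTexture, which exposes the frames as a GL_TEXTURE_EXTERNAL_OES texture.
public class OesTextureSource {
    private int oesTextureId;
    private SurfaceTexture surfaceTexture;
    private Surface surface;

    // Must be called on the thread that owns the GL context.
    public Surface create() {
        int[] tex = new int[1];
        GLES20.glGenTextures(1, tex, 0);
        oesTextureId = tex[0];
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, oesTextureId);
        GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);

        surfaceTexture = new SurfaceTexture(oesTextureId);
        surface = new Surface(surfaceTexture); // hand this to the video player
        return surface;
    }

    // Call once per frame, again on the GL thread, before sampling the texture.
    public void updateFrame() {
        surfaceTexture.updateTexImage();
    }
}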
My approach was to convert the texture to GL_TEXTURE_2D with the native library that came along with the EasyMovieTexture plugin. I cannot provide the source code of this .so library here, but I've decompiled it in IDA Pro and I know it works alongside GLES, rendering the external texture data to another 2D texture using a framebuffer object and an external shader program.
Here is a random example to illustrate the procedure; it is NOT the actual code from the binary: FilterFBOTexture.java
Though I cannot edit the .so file, luckily it reads the shader program from two external files:
// vertex shader
attribute highp vec3 inVertex;
attribute mediump vec3 inNormal;
attribute mediump vec2 inTexCoord;
uniform highp mat4 MVPMatrix;
uniform mediump vec2 TexCoordMove;
varying mediump vec2 TexCoord;
void main()
{
highp vec4 vPos = vec4(0,0,0,1);
vPos.x = ( inTexCoord.x * 2.0 - 1.0 );
vPos.y = ( inTexCoord.y * 2.0 - 1.0 );
gl_Position = vPos;
mediump vec4 vec4Temp = vec4(inTexCoord.x - TexCoordMove.x,inTexCoord.y - TexCoordMove.y,0,1.0);
vec4Temp = MVPMatrix * vec4Temp;
vec4Temp.xyz = vec4Temp.xyz / vec4Temp.w;
TexCoord = vec4Temp.xy;
}
// fragment shader
#extension GL_OES_EGL_image_external : require
uniform samplerExternalOES sTexture;
uniform lowp float AlphaValue;
varying mediump vec2 TexCoord;
void main()
{
lowp vec4 color = texture2D(sTexture, TexCoord) ;
color = vec4(color.rgb, color.a * AlphaValue);
gl_FragColor = color;
}
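For reference, my understanding of the FBO step the native library performs boils down to something like this (a rough sketch of the general technique, not the decompiled code; the quad/uniform setup is omitted and stands behind a placeholder):
import android.opengl.GLES11Ext;
import android.opengl.GLES20;

// Rough sketch: render a GL_TEXTURE_EXTERNAL_OES texture into a plain
// GL_TEXTURE_2D through a framebuffer object, so the 2D texture can be
// handed to Unity.
public class OesToTexture2d {
    private int fbo;
    private int targetTex;
    private int width;
    private int height;

    public int create(int width, int height) {
        this.width = width;
        this.height = height;
        int[] ids = new int[1];

        // Destination GL_TEXTURE_2D that Unity will sample.
        GLES20.glGenTextures(1, ids, 0);
        targetTex = ids[0];
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, targetTex);
        GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA, width, height,
                0, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, null);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D,
                GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D,
                GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);

        // FBO with the 2D texture as its color attachment.
        GLES20.glGenFramebuffers(1, ids, 0);
        fbo = ids[0];
        GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fbo);
        GLES20.glFramebufferTexture2D(GLES20.GL_FRAMEBUFFER, GLES20.GL_COLOR_ATTACHMENT0,
                GLES20.GL_TEXTURE_2D, targetTex, 0);
        if (GLES20.glCheckFramebufferStatus(GLES20.GL_FRAMEBUFFER)
                != GLES20.GL_FRAMEBUFFER_COMPLETE) {
            throw new RuntimeException("FBO incomplete");
        }
        GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
        return targetTex;
    }

    // Per frame: draw a fullscreen quad with the external-sampler program while
    // the FBO is bound; drawQuad stands in for the omitted vertex/uniform setup
    // and the glDrawArrays call.
    public void blit(int oesTextureId, int program, Runnable drawQuad) {
        GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fbo);
        GLES20.glViewport(0, 0, width, height);
        GLES20.glUseProgram(program);
        GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, oesTextureId);
        drawQuad.run();
        GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
    }
}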
I don't know whether I have to check the vertex shader or just dive into the source code of libvlcjni so that I can correctly render my video output. Any ideas would be greatly appreciated, thanks.
Update 2022-11-4:
I changed course and began using VLC for Android.
Big thanks to #mfkl; I created an issue on the VLC repo a few days ago:
https://code.videolan.org/videolan/vlc-unity/-/issues/164
The problem still remains, but at least I have something to work on now.

LibGDX shader working everywhere but my Android device

As a continuation from my previous question (GLSL : Accessing an array in a for-loop hinders performance), I have encountered an entirely new and annoying problem.
So, I have a shader that performs a black hole effect.
The shader works perfectly on my computer, the Android emulator, and ShaderToy, but for some reason, even though the code is exactly the same, it does not work on my Android device.
The problem occurs when I zoom in too far. For whatever reason, when my zoom reaches a certain point, the whole background zooms in, then zooms out and goes crazy. Like this:
When it should look like this:
However, it does work on my device if I change this:
#ifdef GL_ES
precision mediump float;
#endif
to this:
#ifdef GL_ES
precision highp float;
#endif
The problem with this is that it also decreases my FPS from 60 down to ~40.
I believe the problem is that my Android device's OpenGL version is "OpenGL ES 3.0" according to Gdx.gl.glGetString(GL20.GL_VERSION).
But I cannot figure out how to set my version to OpenGL 2.0 since the AndroidApplicationConfiguration class is giving me little to no options.
I've tried putting <uses-feature android:glEsVersion="0x00020000" android:required="true" /> in the manifest, but it still prints "OpenGL ES 3.0".
And I still don't even know if this is actually the cause of the problem or not, so that's why I'm asking here. Thank you for taking the time to read/answer my question :).
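For reference, this is roughly what my launcher setup looks like (a rough sketch; MyGame stands in for my ApplicationListener, and I'm not certain every LibGDX version exposes the useGL30 flag shown here):
import android.os.Bundle;
import com.badlogic.gdx.backends.android.AndroidApplication;
import com.badlogic.gdx.backends.android.AndroidApplicationConfiguration;

public class AndroidLauncher extends AndroidApplication {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        AndroidApplicationConfiguration config = new AndroidApplicationConfiguration();
        config.useGL30 = false; // assumption: with this flag off, an ES 2.0 context is requested
        initialize(new MyGame(), config); // MyGame is a placeholder ApplicationListener
    }
}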
P.S. Here's the Shader code:
#ifdef GL_ES
precision mediump float;
#endif
const int MAX_HOLES = 4;
uniform sampler2D u_sampler2D;
varying vec2 vTexCoord0;
struct BlackHole {
vec2 position;
float radius;
float deformRadius;
};
uniform vec2 screenSize;
uniform vec2 cameraPos;
uniform float cameraZoom;
uniform BlackHole blackHole[MAX_HOLES];
void main() {
vec2 pos = vTexCoord0;
float black = 0.0;
for (int i = 0; i < MAX_HOLES; i++) {
BlackHole hole = blackHole[i];
vec2 position = (hole.position - cameraPos.xy) / cameraZoom + screenSize*0.5;
float radius = hole.radius / cameraZoom;
float deformRadius = hole.deformRadius / cameraZoom;
vec2 deltaPos = vec2(position.x - gl_FragCoord.x, position.y - gl_FragCoord.y);
float dist = length(deltaPos);
float distToEdge = max(deformRadius - dist, 0.0);
float dltR = max(sign(radius - dist), 0.0);
black = min(black+dltR, 1.0);
pos += (distToEdge * normalize(deltaPos) / screenSize);
}
gl_FragColor = (1.0 - black) * texture2D(u_sampler2D, pos) + black * vec4(0, 0, 0, 1);
}
As you have found, the issue is down to a lack of precision in fp16 (mediump), which is fixed by using fp32 (highp). Most maths units have double the throughput for fp16 versus fp32, which also explains the drop in performance.
Querying the driver GLES version will return the maximum supported version, not the version of the current EGL context, so what you are seeing is expected.
Also please note that "highp" is optional in OpenGL ES 2.0 fragment shaders, so there is no guarantee that your shader will work on every GPU in an OpenGL ES 2.0 context. The Mali-4xx series only supports fp16 fragment shaders, for example (I think some of the OpenGL ES 2.0 Vivante GPUs do the same, based on past experience).
In OpenGL ES 3.0 highp is mandatory in fragment shaders, so it would be guaranteed to work there.
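If you want to confirm what a given device actually supports, you can query the fragment shader precision at runtime; a minimal sketch using the plain Android GLES bindings (call it while a GL context is current):
import android.opengl.GLES20;

// Sketch: query which float precisions the fragment shader stage really supports.
// range[0]/range[1] are the log2 of the min/max representable magnitude,
// precision[0] is roughly the number of mantissa bits (all zeros means unsupported).
public final class PrecisionCheck {
    public static void log() {
        int[] range = new int[2];
        int[] precision = new int[1];
        GLES20.glGetShaderPrecisionFormat(GLES20.GL_FRAGMENT_SHADER,
                GLES20.GL_HIGH_FLOAT, range, 0, precision, 0);
        android.util.Log.i("Precision", "fragment highp float: range 2^-" + range[0]
                + "..2^" + range[1] + ", " + precision[0] + " mantissa bits");
        GLES20.glGetShaderPrecisionFormat(GLES20.GL_FRAGMENT_SHADER,
                GLES20.GL_MEDIUM_FLOAT, range, 0, precision, 0);
        android.util.Log.i("Precision", "fragment mediump float: range 2^-" + range[0]
                + "..2^" + range[1] + ", " + precision[0] + " mantissa bits");
    }
}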

Why does my simple fragment shader work on iOS but not Android under Cocos2d-x 3.2?

I have modified a working grey_scale fragment shader to change all of the non-transparent pixels to purple. For some reason it works great on iOS but on Android the transparent parts of the image are visible (although mostly transparent). Can anybody see what I am doing wrong?
The working grey_scale shader contains the lines that are commented out. I added the last line.
#ifdef GL_ES
precision mediump float;
#endif
varying vec4 v_fragmentColor;
varying vec2 v_texCoord;
void main(void)
{
vec4 c = texture2D(CC_Texture0, v_texCoord);
//gl_FragColor.xyz = vec3(0.2126*c.r + 0.7152*c.g + 0.0722*c.b);
//gl_FragColor.w = c.w;
gl_FragColor = vec4(0.5, 0.0, 0.4, c.w);
}
It turns out that I need to apply the alpha to all of the colors:
gl_FragColor = vec4(0.5*c.w, 0.0*c.w, 0.4*c.w, c.w);
I am not sure why the old method didn't work. The image uses premultiplied alpha, so I guess the cocos renderer assumes it (or was somehow told to use it by TexturePacker). So the shader needs to re-multiply the color values by the alpha in order to behave the same way?
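For completeness, in GL terms the difference is just which blend function the renderer assumes; a small sketch using the Android GLES bindings for illustration (cocos2d-x itself is C++, but the calls are the same):
import android.opengl.GLES20;

// Sketch: the two common blend setups. If the renderer uses the premultiplied
// setup (GL_ONE as the source factor), the shader must output colors that are
// already multiplied by alpha, which is why scaling rgb by c.w fixes it.
public final class BlendModes {
    public static void premultipliedAlpha() {
        GLES20.glEnable(GLES20.GL_BLEND);
        GLES20.glBlendFunc(GLES20.GL_ONE, GLES20.GL_ONE_MINUS_SRC_ALPHA);
    }

    public static void straightAlpha() {
        GLES20.glEnable(GLES20.GL_BLEND);
        GLES20.glBlendFunc(GLES20.GL_SRC_ALPHA, GLES20.GL_ONE_MINUS_SRC_ALPHA);
    }
}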
A screenshot would be worth a thousand words.
Are you sure GL_ES is defined? If not, you will have unspecified precision for float (according to specs, it is unspecified for fragment shaders) which can lead even to compilation errors on certain OpenGL ES drivers. I'd play around with that float precision first and see the difference on Android.
I'm not sure about vec3() with a single parameter. Does the following notation work:
float a = 0.2126*c.r + 0.7152*c.g + 0.0722*c.b;
gl_FragColor.xyz = vec3(a, a, a);
Or this, with a single assignment of gl_FragColor:
float a = 0.2126*c.r + 0.7152*c.g + 0.0722*c.b;
gl_FragColor = vec4(a, a, a, c.w);
On a side note, you may want to declare numeric literals as constants in order not to consume uniform space. More info: Declaring constants instead of literals in vertex shader. Standard practice, or needless rigor?
Like this:
const float COEFF_R = 0.2126;
const float COEFF_G = 0.7152;
const float COEFF_B = 0.0722;
float a = COEFF_R*c.r + COEFF_G*c.g + COEFF_B*c.b;
gl_FragColor = vec4(a, a, a, c.w);

Android GLKBaseEffect equivalent (already existing OpenGL shaders)

I'm learning OpenGL ES 2.0 in Android, do you know of a library providing already existing shaders?
I'm on a project with a friend who's developing on iOS, he told me that he can use GLKBaseEffect to avoid devolping custom shaders, as long as we don't need complex features. Is there an equivalent of that BaseEffect in Android?
I'm asking this because the two of us have been assigned this project by a professor, who told us that it's not important for this project to develop custom shaders, so I'm guessing there is a compilation of basical shaders that I can browse.
Is that correct?
Thank you for your help!
Android doesn't have anything like the GLKBaseEffect class, but keep in mind that shaders only exist to be programmable, and they are not hard at all if you keep the shader code simple.
If you don't want to do any post-processing of the image, you never need to touch the fragment shader; the simple pair below is all you need.
Vertex shader
attribute vec4 position;
attribute vec4 inputTextureCoordinate;
varying vec2 textureCoordinate;
void main(void)
{
gl_Position = position;
textureCoordinate = inputTextureCoordinate.xy;
}
Fragment shader
precision mediump float;
uniform sampler2D texture0;
varying vec2 textureCoordinate;
void main()
{
gl_FragColor = texture2D(texture0, textureCoordinate);
}
Now you only need to supply three things, just as you would anywhere else: the position, the texture coordinate, and the texture itself. :)
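If it helps, wiring those three things up with the plain GLES 2.0 Java API looks roughly like this; a minimal sketch that assumes a GL context is current and that vertexSrc/fragmentSrc hold the two shaders above (error checking omitted for brevity):
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;
import android.opengl.GLES20;

public final class SimpleTexturedQuad {
    private static int compile(int type, String src) {
        int shader = GLES20.glCreateShader(type);
        GLES20.glShaderSource(shader, src);
        GLES20.glCompileShader(shader);
        return shader;
    }

    public static void draw(String vertexSrc, String fragmentSrc, int textureId) {
        int program = GLES20.glCreateProgram();
        GLES20.glAttachShader(program, compile(GLES20.GL_VERTEX_SHADER, vertexSrc));
        GLES20.glAttachShader(program, compile(GLES20.GL_FRAGMENT_SHADER, fragmentSrc));
        GLES20.glLinkProgram(program);
        GLES20.glUseProgram(program);

        // Fullscreen quad: x, y, u, v per vertex (triangle strip).
        float[] quad = {
                -1f, -1f, 0f, 0f,
                 1f, -1f, 1f, 0f,
                -1f,  1f, 0f, 1f,
                 1f,  1f, 1f, 1f,
        };
        FloatBuffer buf = ByteBuffer.allocateDirect(quad.length * 4)
                .order(ByteOrder.nativeOrder()).asFloatBuffer();
        buf.put(quad).position(0);

        // Position and texture coordinate attributes from the vertex shader above.
        int posLoc = GLES20.glGetAttribLocation(program, "position");
        int texLoc = GLES20.glGetAttribLocation(program, "inputTextureCoordinate");
        GLES20.glEnableVertexAttribArray(posLoc);
        buf.position(0);
        GLES20.glVertexAttribPointer(posLoc, 2, GLES20.GL_FLOAT, false, 4 * 4, buf);
        GLES20.glEnableVertexAttribArray(texLoc);
        buf.position(2);
        GLES20.glVertexAttribPointer(texLoc, 2, GLES20.GL_FLOAT, false, 4 * 4, buf);

        // Bind the texture to unit 0 and point the sampler at it.
        GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureId);
        GLES20.glUniform1i(GLES20.glGetUniformLocation(program, "texture0"), 0);

        GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);
    }
}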

Android GLES20.glBlendEquation not working?

I've been trying to make a 2.5D engine with depth and normal map textures for a few weeks now, not unlike what's used here: Linky. After concluding that writing a depth map from a texture in the fragment shader was impossible, because ES 2.0 is missing the gl_FragDepth variable, I found a tutorial for iOS where they used glBlendEquation with the modes GL_MIN/GL_MAX to "fake" depth buffering of the fragment into a framebuffer texture: Linky. Unfortunately, GLES20.glBlendEquation makes the application crash on both my phones (SGS 1/2) with an UnsupportedOperationException. So I'm wondering if anyone has used this function with any success? GL_MIN/GL_MAX also seem to be missing from the Android OpenGL ES 2.0 spec, so I'm probably out of luck here...
Any ideas?
BTW, it does seem to work in GL11Ext, but since I'm using the fragment shader for normal mapping this won't work for me.
I was experimenting on my Vega tablet (Tegra) and this worked for me:
fragment shader:
#extension GL_NV_shader_framebuffer_fetch : require
// makes gl_LastFragColor accessible
precision highp float;
varying vec2 v_texcoord;
uniform sampler2D n_sampler;
void main()
{
vec4 v_tex = texture2D(n_sampler, v_texcoord);
gl_FragColor = min(gl_LastFragColor, v_tex); // MIN blending
}
Pretty easy, huh? But I'm afraid this will be NV-only.
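If you go this route, it's worth checking for the extension at runtime before relying on it; a small sketch:
import android.opengl.GLES20;

// Sketch: fall back to another technique if the NVIDIA extension is missing.
public final class FramebufferFetchSupport {
    public static boolean isAvailable() {
        String extensions = GLES20.glGetString(GLES20.GL_EXTENSIONS);
        return extensions != null
                && extensions.contains("GL_NV_shader_framebuffer_fetch");
    }
}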
