Strange texture result when mixing two camera textures in GLSL - Android

I'm making a simple image filter app on Android and implemented a low-pass filter using the same method as GPUImage (https://github.com/BradLarson/GPUImage):
it blends the previous and current camera frames and renders the mixture.
So I created a buffer FBO, render the current camera frame into it, and re-use it as a texture that the low-pass filter shader mixes with the next camera frame.
I tested my code on several smartphones (Galaxy S10, Nexus 6P, etc.) and it worked well. However, on a Galaxy S8 (Mali-G71) the result is strange and I don't know what is wrong.
These are the wrong results:
Here is my code:
Fragment shader:
precision mediump float;

varying vec2 vTextureCoord;
uniform sampler2D sTexture;   // current camera frame
uniform sampler2D sTexture1;  // previous mixed frame from the buffer FBO
uniform float filterStrength;

void main() {
    vec4 texColor0 = texture2D(sTexture, vTextureCoord);
    vec4 texColor1 = texture2D(sTexture1, vTextureCoord);
    gl_FragColor = mix(texColor0, texColor1, filterStrength);
}
What can cause these results?
Thanks in advance.

The artifacts look tile-aligned for Mali, so if I had to guess, you are reading the currently bound framebuffer color attachment as an input texture at the same time as writing into it.
This is "implementation-defined" behavior in the specification, and concurrent reads and writes will definitely do bad things on a tile-based renderer like Mali.

Related

Rendering from GL_TEXTURE_EXTERNAL_OES to GL_TEXTURE_2D but only one color flashes

I'm pretty new to OpenGL ES and currently have a problem rendering my video output on a screen in Unity.
I was developing a video player project with Unity. I bought the EasyMovieTexture plugin and, years ago, replaced its video player module with another open-source video player (Ijkplayer), which has worked fine ever since.
Now I want to replace it with a newer VLC build using libvlcjni. I compiled it and simply swapped out the old Ijkplayer, but it didn't work as I expected. The screen just flashes a single color taken from each video frame, although the video keeps advancing and the audio track plays normally.
Screenshot - a test scene with a screen (sorry, there's a mistake with the texcoords, but note the single-color flashing)
I'd like to provide some further information in the hope that it helps (sorry if I have any mistakes or misunderstandings):
As far as I know, these video player modules need a Surface (or SurfaceTexture) in the Android layer, and the video decoder works as a data producer. The SurfaceTexture consumes data from the producer and turns it into a texture of type GL_TEXTURE_EXTERNAL_OES, which can be consumed and displayed directly by components like TextureView. But this texture data cannot be consumed in the Unity layer unless I use a GLSL shader that samples the OES texture directly, like this:
// Unity shader with GLSL
GLSLPROGRAM
#pragma only_renderers gles3
#include "UnityCG.glslinc"

// ...
// Ignoring vertex shader
// ...

in vec2 textureCoord;
layout(binding = 0) uniform samplerExternalOES _MainTex;
out vec4 fragColor;

void main()
{
    fragColor = texture(_MainTex, textureCoord);
}
ENDGLSL
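To make the producer/consumer setup described above concrete, here is a minimal Android-side sketch of how such an OES texture is typically created and handed to a decoder as a Surface. This is an assumption-level illustration (the OesTextureSource class and its method names are hypothetical), not code from EasyMovieTexture or libvlcjni:

import android.graphics.SurfaceTexture;
import android.opengl.GLES11Ext;
import android.opengl.GLES20;
import android.view.Surface;

public class OesTextureSource {
    private int oesTexId;
    private SurfaceTexture surfaceTexture;
    private Surface surface;

    // Must be called on the GL thread that owns the rendering context.
    public Surface create() {
        int[] tex = new int[1];
        GLES20.glGenTextures(1, tex, 0);
        oesTexId = tex[0];
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, oesTexId);
        GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);

        surfaceTexture = new SurfaceTexture(oesTexId); // consumer side
        surface = new Surface(surfaceTexture);         // producer side, given to the decoder
        return surface;
    }

    // Call once per rendered frame, on the GL thread, before sampling the OES texture.
    public void updateFrame(float[] texMatrix16) {
        surfaceTexture.updateTexImage();
        surfaceTexture.getTransformMatrix(texMatrix16); // apply to the texture coordinates
    }
}

The Surface returned by create() is what the video decoder renders into; updateTexImage() must run on the GL thread before every frame that samples the external texture, otherwise stale frames are sampled.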
My approach was to convert the texture to GL_TEXTURE_2D with the native library that comes with the EasyMovieTexture plugin. I cannot provide the source code of this .so library, but I've decompiled it in IDA Pro and I know it works with GLES, rendering the external texture data into another 2D texture using a framebuffer object and an external shader program.
Here is an arbitrary example to illustrate the procedure; it is NOT the exact code from the binary: FilterFBOTexture.java
Though I cannot edit the .so file, luckily it reads two external files as the shader program:
// vertex shader
attribute highp vec3 inVertex;
attribute mediump vec3 inNormal;
attribute mediump vec2 inTexCoord;
uniform highp mat4 MVPMatrix;
uniform mediump vec2 TexCoordMove;
varying mediump vec2 TexCoord;

void main()
{
    highp vec4 vPos = vec4(0, 0, 0, 1);
    vPos.x = ( inTexCoord.x * 2.0 - 1.0 );
    vPos.y = ( inTexCoord.y * 2.0 - 1.0 );
    gl_Position = vPos;

    mediump vec4 vec4Temp = vec4(inTexCoord.x - TexCoordMove.x, inTexCoord.y - TexCoordMove.y, 0, 1.0);
    vec4Temp = MVPMatrix * vec4Temp;
    vec4Temp.xyz = vec4Temp.xyz / vec4Temp.w;
    TexCoord = vec4Temp.xy;
}
// fragment shader
#extension GL_OES_EGL_image_external : require
uniform samplerExternalOES sTexture;
uniform lowp float AlphaValue;
varying mediump vec2 TexCoord;

void main()
{
    lowp vec4 color = texture2D(sTexture, TexCoord);
    color = vec4(color.rgb, color.a * AlphaValue);
    gl_FragColor = color;
}
I don't know whether I should check the vertex shader or just dive into the source code of libvlcjni so that I can render my video output correctly. Any ideas would be appreciated, thanks.
Update 2022-11-04:
I changed direction and began using VLC for Android.
Big thanks to #mfkl; I created an issue on the VLC repo a few days ago.
https://code.videolan.org/videolan/vlc-unity/-/issues/164
The problem still remains, but at least I have something to work with now.

Render FBO to same FBO

I am trying to create a ghost-like camera filter. This requires mixing the previous frame into the current one. I use one FBO to do the mixing and a second one simply to put the content on the screen.
My implementation works on 4 out of 5 devices I have tried. On the fifth (a Samsung Galaxy S7) I get some random pixels.
The simplest shader that reproduces the error is the following (the frame counter and cropping are just for debugging). The expected result is a line in the center of the screen gradually moving up.
#extension GL_OES_EGL_image_external : require
precision mediump float;

uniform samplerExternalOES camTexture;
uniform sampler2D fbo;
uniform int frame_no;
varying vec2 v_CamTexCoordinate;

void main()
{
    vec2 uv = v_CamTexCoordinate;
    if (frame_no < 10) {
        gl_FragColor = texture2D(camTexture, uv);
    } else {
        if (uv.y > 0.2 && uv.y < 0.8 && uv.x > 0.2 && uv.x < 0.8)
            gl_FragColor = texture2D(fbo, uv + vec2(0.0, 0.005));
        else
            gl_FragColor = texture2D(camTexture, uv);
    }
}
But on the Samsung I get some correct pixels and some random ones, as in the following sample: some black and other random pixels moving up together with the camera's pixels. Any idea what might be wrong?
Fault sample
Correct sample

Repeated frames on Android Mali-400

On Android devices with a Mali-400 GPU (Samsung Galaxy S II, Samsung Galaxy S3 Mini, Samsung Galaxy Note II), the screen will at random times start showing repeated frames.
Example from 0:51 until 1:01 on the following video https://www.youtube.com/watch?v=5-p6Oy0BZmg
It seems as if new frames aren't being rendered, and what was in the old buffer is being shown again. The game continues to advance behind the repeated frames.
This doesn't happen on other GPUs.
I read about using glFlush or glFinish, but GLSurfaceView takes care of this when doing eglSwapBuffers after onDrawFrame.
I've read about quirks of the Mali-400, like it being better to use varyings for texture coords, or to use lowp, but it doesn't help. Here are the shaders for reference:
Vertex shader:
// Vertex Shader
attribute vec4 position;
attribute vec4 colorModelIn;
attribute vec4 colorVertexIn;
varying lowp vec4 colorOut;
uniform mat4 modelViewProjectionMatrix;
uniform mat4 modelMatrix;
attribute vec2 TexCoordIn;
varying lowp vec2 TexCoordOut;
uniform bool bUseVertexColor;

void main()
{
    if( bUseVertexColor ){
        colorOut = colorVertexIn * colorModelIn;
    } else {
        colorOut = colorModelIn;
    }
    TexCoordOut = TexCoordIn;
    gl_Position = modelViewProjectionMatrix * modelMatrix * position;
}
Fragment shader:
// Fragment shader
varying lowp vec4 colorOut;
varying lowp vec2 TexCoordOut;
uniform sampler2D Texture;
uniform bool bUseTexture;

void main()
{
    if( bUseTexture ){
        gl_FragColor = colorOut * texture2D(Texture, TexCoordOut);
    } else {
        gl_FragColor = colorOut;
    }
}
I'm aware that these shaders aren't optimal, and that I'm going down the path of reproducing the fixed pipeline.
The rendering goes back to normal after some time, or after touching the screen. The only reason I can think of for it returning to normal on touch is that I use color coding to detect the touched object: I render an image to the back buffer and glReadPixels from it, then overwrite the back buffer with the normal game image.
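For reference, that picking read boils down to something like the following sketch (assuming GLES20; the ColorPicker class name is just illustrative):

import android.opengl.GLES20;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public final class ColorPicker {
    // Reads the RGBA value under (x, y), given in view coordinates with a top-left origin.
    public static int readPickColor(int x, int y, int surfaceHeight) {
        ByteBuffer pixel = ByteBuffer.allocateDirect(4).order(ByteOrder.nativeOrder());
        // glReadPixels uses a bottom-left origin, so flip the y coordinate.
        GLES20.glReadPixels(x, surfaceHeight - y - 1, 1, 1,
                GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, pixel);
        pixel.rewind();
        int r = pixel.get() & 0xFF;
        int g = pixel.get() & 0xFF;
        int b = pixel.get() & 0xFF;
        int a = pixel.get() & 0xFF;
        return (r << 24) | (g << 16) | (b << 8) | a; // packed RGBA identifies the touched object
    }
}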
I'm out of ideas on how to attack this problem.
EDIT
After following Muzza's advice I started to log GL errors. glGetInteger and glBindBuffer report out-of-memory.
Above I said that the problem solves itself after a while. When that happens these appear in the logs:
01-23 21:57:52.956: D/WebView(9860): onSizeChanged - w:480 h:75
01-23 21:57:53.126: D/TilesManager(9860): new EGLContext from framework: 40e00bd0
01-23 21:57:53.126: D/GLWebViewState(9860): Reinit shader
01-23 21:57:53.171: D/GLWebViewState(9860): Reinit transferQueue
This can happen if the OpenGL state becomes invalid in some way. The graphics drivers can just skip frames entirely. Check Logcat to see if there is any output from the drivers, and add glGetError() calls throughout your code to see if any error comes up there.
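A small helper makes it easy to sprinkle such checks through the render loop; this is only a sketch assuming plain GLES20 (the GlErrors name is illustrative):

import android.opengl.GLES20;
import android.util.Log;

public final class GlErrors {
    private static final String TAG = "GlErrors";

    // Call after suspicious GL calls; logs every pending error with a label.
    public static void check(String where) {
        int error;
        while ((error = GLES20.glGetError()) != GLES20.GL_NO_ERROR) {
            Log.e(TAG, where + ": glError 0x" + Integer.toHexString(error));
        }
    }
}

Calling GlErrors.check("after glDrawArrays") after each draw call narrows down where the errors first appear.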

Custom Shader not working on Android Device (using LibGDX)

I've started out using LibGDX to develop for PC as well as Android. I'm currently working on a little project that requires me to write custom shaders in GLSL (based on full-screen quads). Coming from an HLSL environment, I had a little trouble setting the whole shader system up in combination with SpriteBatch (as I want to keep the code simple where I can).
I have the following shader code working on desktop:
Vertex shader:
attribute vec4 a_position;
uniform mat4 u_projectionViewMatrix;

void main()
{
    gl_Position = a_position * u_projectionViewMatrix;
}
Fragment shader:
uniform vec2 effectOrigin;
uniform vec2 effectDir;

void main()
{
    float distToOrigin = distance(effectOrigin.xy, gl_FragCoord.xy);
    gl_FragColor = vec4(mod(distToOrigin+effectDir.x*30, 30)-15, 0.0, 0.0, 1.0);
}
As I said, it works on desktop (Windows) and gives me a nice circle pattern. On Android, however, it doesn't compile and I just get the cleared background color. I suspected it had something to do with declaring OpenGL ES 2 support on Android, so I added this line to the manifest:
<uses-feature android:glEsVersion="0x00020000" android:required="true" />, but this didn't seem to make a difference.
I also thought it might have something to do with the precision of the floats and vectors, but I wasn't able to figure out how I would have to change them in order to fix it.
Is there someone who can help me with this? I haven't been able to find an answer elsewhere!
Thanks in advance,
Yuri
P.S. Is there a way to see what went wrong during compilation of the shaders? I know debugging shaders is hard, but it would be awesome to have at least some idea of where it went wrong during compilation.
Fixed it using P.T.'s suggestion of logging the compilation!
There were two problems in the fragment shader, the fixed shader looks like this:
precision mediump float;

uniform vec2 effectOrigin;
uniform vec2 effectDir;

void main()
{
    float distToOrigin = distance(effectOrigin.xy, gl_FragCoord.xy);
    gl_FragColor = vec4(mod(distToOrigin+effectDir.x*30.0, 30.0)-15.0, 0.0, 0.0, 1.0);
}
I added the precision definition at the top, and also changed the constant values from integers to floats (e.g. 15 to 15.0) to fix it!
Thanks P.T.!
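For anyone else hitting this: the compile log P.T. suggested can be read straight from LibGDX's ShaderProgram. A minimal sketch (the ShaderLoader class name and shader file paths are placeholders):

import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.graphics.glutils.ShaderProgram;

public class ShaderLoader {
    public static ShaderProgram load(String vertexPath, String fragmentPath) {
        ShaderProgram shader = new ShaderProgram(
                Gdx.files.internal(vertexPath),
                Gdx.files.internal(fragmentPath));
        if (!shader.isCompiled()) {
            // getLog() contains the driver's compile/link messages, which is where
            // errors like missing precision qualifiers or int/float mismatches show up.
            Gdx.app.error("ShaderLoader", shader.getLog());
        }
        return shader;
    }
}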

Android GLES20.glBlendEquation not working?

I've been trying to make a 2.5D engine with depth and normal map textures for a few weeks now, not unlike what's used here Linky. After thinking that drawing a depth map from a texture in the fragment shader was impossible due to ES 2.0 missing the gl_FragDepth variable, I found a tutorial for iOS where they used glBlendEquation with the mode GL_MIN/GL_MAX to "fake" depth buffering of the fragment to a framebuffer texture Linky. Unfortunately, GLES20.glBlendEquation makes the application crash on both my phones (SGS 1/2) with an UnsupportedOperationException. So I'm wondering if anyone has used this function with any success? GL_MIN/GL_MAX also seems to be missing from the Android OpenGL ES 2.0 spec, so I'm probably out of luck here...
Any ideas?
BTW, it does seem to work in GL11Ext, but since I'm using the fragment shader for normal mapping, this won't work for me.
I was experimenting on my Vega tablet (Tegra) and this worked for me:
Fragment shader:
#extension GL_NV_shader_framebuffer_fetch : require
// makes gl_LastFragColor accessible
precision highp float;

varying vec2 v_texcoord;
uniform sampler2D n_sampler;

void main()
{
    vec4 v_tex = texture2D(n_sampler, v_texcoord);
    gl_FragColor = min(gl_LastFragColor, v_tex); // MIN blending
}
Pretty easy, huh? But I'm afraid this will be NV-only.
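For non-NVIDIA devices, whether min/max blending is available at all on ES 2.0 can be probed at runtime via the extension string, since GL_EXT_blend_minmax is what supplies those blend equations there. A minimal sketch assuming GLES20 (the BlendMinMax class name is illustrative):

import android.opengl.GLES20;

public final class BlendMinMax {
    // Values from the EXT_blend_minmax extension; GLES20 itself does not define them.
    public static final int GL_MIN_EXT = 0x8007;
    public static final int GL_MAX_EXT = 0x8008;

    public static boolean isSupported() {
        String extensions = GLES20.glGetString(GLES20.GL_EXTENSIONS);
        return extensions != null && extensions.contains("GL_EXT_blend_minmax");
    }

    public static void useMinBlending() {
        if (isSupported()) {
            GLES20.glBlendEquation(GL_MIN_EXT); // keep the minimum of source and destination
        }
    }
}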
