OpenGL ES 2.0 SL writing out from fragment shader to texture - android

Hopefully this is a really simple question.
How do I write out to a texture attached to a framebuffer from within a fragment shader? I assume it is just gl_FragColor, or am I supposed to define/use a different variable, like gl_Data[0]?
Frag Shader:
precision mediump float;
varying vec2 vTextureCoord;
uniform sampler2D displayTexture;
void main() {
gl_FragColor = texture2D(displayTexture, vTextureCoord);
}
This question is not about how to set up a texture for writing to, just how to write out from within the fragment shader. I just want to make sure I have this piece of the puzzle.

Your assumption is correct. All the drawing code is the same whether you render to a renderbuffer or to an attached texture, even the shaders: in OpenGL ES 2.0 you simply write to gl_FragColor, and the output lands in whatever color attachment is bound to the current framebuffer.

Related

Rendering from GL_TEXTURE_EXTERNAL_OES to GL_TEXTURE_2D but only one color flashes

I'm pretty new to opengl-es and currently have a problem rendering my video output on a screen in Unity.
I was developing a video player project with Unity. I bought the EasyMovieTexture plugin and, years ago, replaced its video player module with another open-source player (Ijkplayer), which has worked fine all this time.
Now I want to replace it with a newer VLC build using libvlcjni. I compiled it and simply swapped out the old Ijkplayer, but it didn't work as I expected: the screen just flashes a single color from each video frame, although the video keeps advancing and the audio track plays normally.
Screenshot - a test scene with a screen (sorry, there is a mistake with the texcoord, but the point is the color flashing)
I'd like to provide some further information in the hope of finding help (sorry if I have any mistakes or misunderstandings):
As far as I know, these video player modules need a Surface (or SurfaceTexture) in the Android layer, and the video decoder works as a data producer. The SurfaceTexture consumes data from the producer and exposes it as a texture of type GL_TEXTURE_EXTERNAL_OES, which can be directly consumed and displayed by components like TextureView. But this texture data cannot be consumed in the Unity layer unless I use a GLSL shader that samples the OES texture directly, like this:
// Unity shader with GLSL
GLSLPROGRAM
#pragma only_renderers gles3
#include "UnityCG.glslinc"
// ...
// Ignoring vertex shader
// ...
in vec2 textureCoord;
layout(binding = 0) uniform samplerExternalOES _MainTex;
out vec4 fragColor;
void main()
{
fragColor = texture(_MainTex, textureCoord);
}
ENDGLSL
My approach was to convert the texture to GL_TEXTURE_2D with the native library that came with the EasyMovieTexture plugin. I cannot provide the source code of this .so library, but I've decompiled it in IDA Pro, and I know it works with GLES, rendering the external texture data into another 2D texture using a framebuffer object and a separate shader program.
Here is a rough example to illustrate the procedure; it is NOT the exact code from the binary: FilterFBOTexture.java
Though I cannot edit the .so file, luckily it reads the shader program from two external files:
// vertex shader
attribute highp vec3 inVertex;
attribute mediump vec3 inNormal;
attribute mediump vec2 inTexCoord;
uniform highp mat4 MVPMatrix;
uniform mediump vec2 TexCoordMove;
varying mediump vec2 TexCoord;
void main()
{
highp vec4 vPos = vec4(0,0,0,1);
vPos.x = ( inTexCoord.x * 2.0 - 1.0 );
vPos.y = ( inTexCoord.y * 2.0 - 1.0 );
gl_Position = vPos;
mediump vec4 vec4Temp = vec4(inTexCoord.x - TexCoordMove.x,inTexCoord.y - TexCoordMove.y,0,1.0);
vec4Temp = MVPMatrix * vec4Temp;
vec4Temp.xyz = vec4Temp.xyz / vec4Temp.w;
TexCoord = vec4Temp.xy;
}
// fragment shader
#extension GL_OES_EGL_image_external : require
uniform samplerExternalOES sTexture;
uniform lowp float AlphaValue;
varying mediump vec2 TexCoord;
void main()
{
lowp vec4 color = texture2D(sTexture, TexCoord) ;
color = vec4(color.rgb, color.a * AlphaValue);
gl_FragColor = color;
}
I don't know whether I should check the vertex shader or just dive into the source code of libvlcjni so that I can correctly render my video output. Any ideas are appreciated, thanks.
Update 2022-11-04:
I changed course and began using VLC for Android.
Big thanks to @mfkl; I created an issue a few days ago on the VLC repo:
https://code.videolan.org/videolan/vlc-unity/-/issues/164
The problem still remains, but at least I have something to work with now.

OpenGL ES 2.0 or 3.0: Pow function issues

On OpenGL ES 3.0 on Android, when I apply pow(mPow) during preprocessing and pow(1.0/mPow) during post-processing, the image displays abnormally.
At first I thought there was a problem with one of the intermediate fragment shaders, but even after removing all of them the output was still wrong. What is the reason for this?
Initial rendering process:
preprocessing -> other filters -> post-processing
Modified rendering process code (other filters and post-processing removed):
#version 300 es
precision highp float;
uniform sampler2D mTexture;
uniform float mPow;
in vec2 vTexCoord;
out vec4 vFragColor;
void main() {
vec4 vFragColor1 = pow(texture(mTexture, vTexCoord), vec4(mPow));
vFragColor = pow(vFragColor1, vec4(1.0/mPow));
}
mPow range: 0.0-100.0
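One thing worth checking (an assumption, not something confirmed by the post): if the intermediate result of pow(texture(...), vec4(mPow)) is written to an ordinary 8-bit RGBA render target between passes, it gets quantized to 256 levels and clamped to [0, 1], and the inverse pow(..., vec4(1.0/mPow)) then amplifies that quantization error enormously for large mPow. A quick host-side simulation of the round trip illustrates the effect:

```java
public class PowRoundTrip {
    // Simulate storing a value in an 8-bit UNORM channel:
    // clamp to [0, 1], then quantize to 256 levels.
    static double quantize8(double v) {
        v = Math.max(0.0, Math.min(1.0, v));
        return Math.round(v * 255.0) / 255.0;
    }

    // Round-trip x through pow(p) -> 8-bit texture -> pow(1/p),
    // mimicking the two render passes.
    static double roundTrip(double x, double p) {
        double stored = quantize8(Math.pow(x, p)); // what the next pass reads
        return Math.pow(stored, 1.0 / p);
    }

    public static void main(String[] args) {
        double x = 0.5;
        // With a small exponent the round trip survives quantization:
        System.out.println("p=2:  " + roundTrip(x, 2.0));
        // With a large exponent, 0.5^50 quantizes to exactly 0 in 8 bits,
        // so the inverse pow cannot recover anything:
        System.out.println("p=50: " + roundTrip(x, 50.0));
    }
}
```

If this is the cause, using a float render target (GL_RGBA16F with EXT_color_buffer_float in ES 3.0, or GL_RGBA32F) for the intermediate pass, or fusing both pow calls into one shader, should fix it.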

GL_OES_EGL_image_external_essl3 extension in compute shaders does not work properly

I use the GL_OES_EGL_image_external_essl3 extension to access camera pictures in GLSL. For fragment shaders it works fine. Here is my simple fragment shader:
#version 320 es
#extension GL_OES_EGL_image_external_essl3 : require
precision mediump float;
uniform samplerExternalOES cameraTexture;
in vec2 v_TexCoordinate;
out vec4 fragmentColor;
void main() {
fragmentColor = texture(cameraTexture, v_TexCoordinate);
}
I can see the picture from the camera.
However, when I insert a simple compute shader stage into the pipeline that only copies data from that external image to a new texture, which I then display, I see only a black screen. I also generate a red line in the compute shader for debugging.
Here is the code of that compute shader:
#version 320 es
#extension GL_OES_EGL_image_external_essl3 : require
precision mediump float;
layout(local_size_x = LOCAL_SIZE, local_size_y = LOCAL_SIZE) in;
layout(binding=1, rgba8) uniform mediump writeonly image2D outputImage;
uniform samplerExternalOES cameraTexture;
void main() {
ivec2 position = ivec2(gl_GlobalInvocationID.xy);
vec4 cameraColor = texture(cameraTexture, vec2(gl_GlobalInvocationID.xy)/1024.);
imageStore(outputImage, position, cameraColor);
// generate a red line to see that in general the texture
// that is produced by the compute shader is displayed on the
// screen
if (position.x == 100) imageStore(outputImage, position, vec4(1,0,0,1));
}
So it seems to access the texture in the same way, but texture() returns vec4(0, 0, 0, 1) instead. So the screen is black.
In both cases I bind the texture like this:
glActiveTexture(GL_TEXTURE0)
glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, mCameraTextureId)
glUniform1i(cameraUniformHandle, 0)
Why does this extension not work properly in my compute shader? Is it supposed to work in compute shaders at all?
My platform is Samsung Galaxy S7 (Mali GPU).
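As far as I know, support for samplerExternalOES in compute shaders is not guaranteed: drivers commonly expose the extension for fragment (and sometimes vertex) stages only, which would match the black result, so it's worth testing the same sampler in a vertex shader or checking the driver's shader compile/link logs. Independently of that, the division by 1024.0 assumes the camera image is 1024x1024 and samples texel edges rather than centers; texture() with normalized coordinates samples most faithfully at half-texel offsets. A small host-side sketch of that mapping (the 1024 size is taken from the shader above):

```java
public class TexelCoords {
    // Map an integer texel position to the normalized coordinate of its
    // center. Texel centers sit at half-offsets, not at integer grid
    // points, so (pos + 0.5) / size is the coordinate texture() expects.
    static float normalizedCenter(int texel, int size) {
        return (texel + 0.5f) / size;
    }

    public static void main(String[] args) {
        // gl_GlobalInvocationID.xy / 1024.0 would sample texel *edges*;
        // the centers are offset by half a texel:
        System.out.println(normalizedCenter(0, 1024));    // first texel center
        System.out.println(normalizedCenter(1023, 1024)); // last texel center
    }
}
```

With nearest filtering the edge-vs-center distinction usually doesn't change the output, so this alone would not explain a fully black screen, but it is cheap to rule out.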

GL_TEXTURE_EXTERNAL_OES texture render in fragment shader

I am using OpenGL ES on Android.
I want to use a SurfaceTexture with camera.setPreviewTexture to get the camera preview texture, and then render this texture on screen.
Here is my question: I use a samplerExternalOES type to declare the sampler in my fragment shader source. Some articles I found say that GL_TEXTURE_EXTERNAL_OES is usually used for YUV data, yet I don't think my shader converts YUV to RGB; it processes every fragment the same way. Still, it works well. Why?
And how can I convert YUV to RGB format with a shader?
Thanks.
Fragment shader Code :
#extension GL_OES_EGL_image_external : require
precision mediump float;
uniform samplerExternalOES uInputTex;
varying vec2 vTexCoord;
void main(){
gl_FragColor = vec4(texture2D(uInputTex, vTexCoord).rgb, 1.0); // note: 1.0f is invalid in GLSL ES 1.00; float literals take no suffix
}
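It works because samplerExternalOES hides the conversion from you: per the OES_EGL_image_external extension, sampling an external texture returns RGBA, so the driver performs any YUV-to-RGB conversion internally. You only need to do it yourself if you read the raw YUV planes (e.g. from an ImageReader) into ordinary textures. In that case, a common choice is the BT.601 full-range equations; a host-side sketch of the same per-pixel math you would write in a fragment shader (assuming full-range BT.601, which is an assumption - the camera may deliver limited-range or BT.709 data):

```java
public class YuvToRgb {
    // BT.601 full-range YUV -> RGB, the same arithmetic a fragment
    // shader would apply to the sampled Y, U, V channels:
    //   r = y + 1.402    * (v - 128)
    //   g = y - 0.344136 * (u - 128) - 0.714136 * (v - 128)
    //   b = y + 1.772    * (u - 128)
    static int[] convert(int y, int u, int v) {
        double r = y + 1.402 * (v - 128);
        double g = y - 0.344136 * (u - 128) - 0.714136 * (v - 128);
        double b = y + 1.772 * (u - 128);
        return new int[] { clamp(r), clamp(g), clamp(b) };
    }

    // Clamp to the displayable 8-bit range, as the shader's implicit
    // clamp to [0.0, 1.0] would.
    static int clamp(double c) {
        return (int) Math.max(0, Math.min(255, Math.round(c)));
    }

    public static void main(String[] args) {
        int[] white = convert(255, 128, 128); // peak luma, neutral chroma
        int[] black = convert(0, 128, 128);   // zero luma, neutral chroma
        System.out.println(white[0] + "," + white[1] + "," + white[2]);
        System.out.println(black[0] + "," + black[1] + "," + black[2]);
    }
}
```

In GLSL the same constants go into a mat3 multiply after subtracting vec3(0.0, 0.5, 0.5) from the sampled (y, u, v), with everything in the 0.0-1.0 range instead of 0-255.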

Android GLKBaseEffect equivalent (already existing OpenGL shaders)

I'm learning OpenGL ES 2.0 in Android, do you know of a library providing already existing shaders?
I'm on a project with a friend who's developing on iOS, he told me that he can use GLKBaseEffect to avoid devolping custom shaders, as long as we don't need complex features. Is there an equivalent of that BaseEffect in Android?
I'm asking this because the two of us have been assigned this project by a professor, who told us that it's not important for this project to develop custom shaders, so I'm guessing there is a compilation of basical shaders that I can browse.
Is that correct?
Thank you for your help!
Android doesn't provide an equivalent of the GLKBaseEffect class, but shaders are not hard at all if you stick to simple shader code like the pair below.
If you don't want to do any image post-processing, you never need to change the fragment shader; that is all you have to do.
Vertex shader
attribute vec4 position;
attribute vec4 inputTextureCoordinate;
varying vec2 textureCoordinate;
void main(void)
{
gl_Position = position;
textureCoordinate = inputTextureCoordinate.xy;
}
Fragment shader
uniform sampler2D texture0;
varying vec2 textureCoordinate;
void main()
{
gl_FragColor = texture2D(texture0, textureCoordinate);
}
Now you only need to supply three things - the position, the texture coordinate, and the texture itself - just as you would anywhere else.
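A minimal sketch of the vertex data those shaders expect (names are illustrative, not from any library): a full-screen quad as a triangle strip, with position and texture coordinate interleaved per vertex:

```java
public class FullScreenQuad {
    // Four vertices of a triangle strip covering clip space, interleaved
    // as x, y (position) followed by s, t (texture coordinate).
    static final float[] VERTICES = {
        -1f, -1f,   0f, 0f,  // bottom-left
         1f, -1f,   1f, 0f,  // bottom-right
        -1f,  1f,   0f, 1f,  // top-left
         1f,  1f,   1f, 1f,  // top-right
    };
    static final int FLOATS_PER_VERTEX = 4;

    public static void main(String[] args) {
        System.out.println(VERTICES.length / FLOATS_PER_VERTEX + " vertices");
    }
}
```

The stride passed to glVertexAttribPointer would then be 4 * Float.BYTES, with the texture-coordinate attribute offset by two floats; depending on where the texture comes from (Bitmap vs. SurfaceTexture) you may need to flip the t coordinates.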
