GL_TEXTURE_EXTERNAL_OES texture render in fragment shader - android

I am using OpenGL ES on Android.
I want to use a SurfaceTexture with camera.setPreviewTexture to get the camera preview as a texture, and then render that texture on screen.
Here is my question: I declare the sampler in my fragment shader with the samplerExternalOES type. Articles I found through Google say that GL_TEXTURE_EXTERNAL_OES is usually used for YUV data, yet my shader never converts YUV to RGB; it processes every fragment like an ordinary texture lookup, and it still renders correctly. Why does that work?
And how can I convert YUV to RGB myself in a shader?
Thanks, and please forgive my poor English.
Fragment shader code:
#extension GL_OES_EGL_image_external : require
precision mediump float;
uniform samplerExternalOES uInputTex;
varying vec2 vTexCoord;
void main() {
    // Note: GLSL ES 1.00 has no 'f' suffix for float literals, so write 1.0 rather than 1.0f.
    gl_FragColor = vec4(texture2D(uInputTex, vTexCoord).rgb, 1.0);
}
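For reference, if you ever do need to do the conversion yourself (for example, when reading the camera planes from something like an ImageReader rather than through a SurfaceTexture), here is a minimal sketch of a manual YUV-to-RGB fragment shader. It assumes an NV21-style frame split into a Y plane uploaded as GL_LUMINANCE and an interleaved VU plane uploaded as GL_LUMINANCE_ALPHA; the uniform names yTexture/uvTexture and the BT.601 coefficients are my own assumptions, not part of the original setup. With samplerExternalOES none of this is needed, since the external-image extension already returns RGB values from the lookup.
precision mediump float;
varying vec2 vTexCoord;
// Hypothetical inputs: Y plane as GL_LUMINANCE, interleaved VU plane as GL_LUMINANCE_ALPHA.
uniform sampler2D yTexture;
uniform sampler2D uvTexture;
void main() {
    float y = texture2D(yTexture, vTexCoord).r;
    // NV21 interleaves V then U, so V lands in the luminance (.r) channel and U in alpha (.a).
    float v = texture2D(uvTexture, vTexCoord).r - 0.5;
    float u = texture2D(uvTexture, vTexCoord).a - 0.5;
    // BT.601 full-range YUV -> RGB.
    float r = y + 1.402 * v;
    float g = y - 0.344 * u - 0.714 * v;
    float b = y + 1.772 * u;
    gl_FragColor = vec4(r, g, b, 1.0);
}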

Related

Rendering from GL_TEXTURE_EXTERNAL_OES to GL_TEXTURE_2D but only one color flashes

I'm pretty new to OpenGL ES and currently have a problem rendering my video output on a screen in Unity.
I was developing a video player project with Unity. I bought the EasyMovieTexture plugin and, years ago, replaced its video player module with another open-source player (Ijkplayer), which has worked fine all this time.
Now I want to replace it with a newer VLC build using libvlcjni. I compiled it and simply swapped out the old Ijkplayer, but it didn't work as I expected: the screen just flashes a single color taken from each video frame, while the video keeps going and the audio track plays normally.
Screenshot: a test scene with a screen (sorry, there is a mistake with the texcoord, but the point is that only a flashing color is shown).
I'd like to provide some further information in the hope of finding some help (sorry if I have some mistakes or misunderstandings):
As far as I know, these video player modules need a Surface (or SurfaceTexture) on the Android side, with the video decoder acting as the data producer. The SurfaceTexture consumes data from the producer and turns it into a texture of type GL_TEXTURE_EXTERNAL_OES, which can be consumed and displayed directly by components like TextureView. This texture data cannot be consumed in the Unity layer, however, unless I use a GLSL shader that samples the OES texture directly, like this:
// Unity shader with GLSL
GLSLPROGRAM
#pragma only_renderers gles3
#include "UnityCG.glslinc"
// ...
// Ignoring vertex shader
// ...
in vec2 textureCoord;
layout(binding = 0) uniform samplerExternalOES _MainTex;
out vec4 fragColor;
void main()
{
    fragColor = texture(_MainTex, textureCoord);
}
ENDGLSL
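Outside of Unity's wrapper, a plain GLSL ES 3.00 fragment shader that samples an external texture also needs the ESSL-3 flavour of the extension directive. A minimal sketch (keeping the _MainTex name from the snippet above, and dropping layout(binding = 0) since binding qualifiers on samplers require ES 3.1) might look like this, assuming Unity is not already injecting the directive for you:
#version 300 es
#extension GL_OES_EGL_image_external_essl3 : require
precision mediump float;
in vec2 textureCoord;
uniform samplerExternalOES _MainTex;
out vec4 fragColor;
void main()
{
    // An external sampler is used exactly like a 2D sampler; the driver resolves
    // the underlying YUV/RGB layout of the decoded video frame at sample time.
    fragColor = texture(_MainTex, textureCoord);
}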
My approach was to convert the texture to GL_TEXTURE_2D with the native library that comes with the EasyMovieTexture plugin. I can't provide the source code of this .so library, but I've decompiled it in IDA Pro, and I know it works with GLES and renders the external texture data into another 2D texture using a framebuffer object and an external shader program.
Here is a random example to illustrate the procedure; it is NOT the exact code from the binary: FilterFBOTexture.java
Though I cannot edit the .so file, luckily it reads the shader program from two external files:
// vertex shader
attribute highp vec3 inVertex;
attribute mediump vec3 inNormal;
attribute mediump vec2 inTexCoord;
uniform highp mat4 MVPMatrix;
uniform mediump vec2 TexCoordMove;
varying mediump vec2 TexCoord;
void main()
{
    highp vec4 vPos = vec4(0.0, 0.0, 0.0, 1.0);
    vPos.x = inTexCoord.x * 2.0 - 1.0;
    vPos.y = inTexCoord.y * 2.0 - 1.0;
    gl_Position = vPos;
    mediump vec4 vec4Temp = vec4(inTexCoord.x - TexCoordMove.x, inTexCoord.y - TexCoordMove.y, 0.0, 1.0);
    vec4Temp = MVPMatrix * vec4Temp;
    vec4Temp.xyz = vec4Temp.xyz / vec4Temp.w;
    TexCoord = vec4Temp.xy;
}
// fragment shader
#extension GL_OES_EGL_image_external : require
uniform samplerExternalOES sTexture;
uniform lowp float AlphaValue;
varying mediump vec2 TexCoord;
void main()
{
    lowp vec4 color = texture2D(sTexture, TexCoord);
    color = vec4(color.rgb, color.a * AlphaValue);
    gl_FragColor = color;
}
I don't know whether I need to check the vertex shader or dive into the source code of libvlcjni to render my video output correctly. I'd be grateful for any ideas, thanks.
Update 2022-11-4:
In the end I switched to VLC for Android.
Big thanks to #mfkl; I created an issue on the VLC repo a few days ago.
https://code.videolan.org/videolan/vlc-unity/-/issues/164
The problem still remains but at least I can work for something now.

OpenGL ES 2.0 or 3.0: Pow function issues

With OpenGL ES 3.0 on Android, when I apply pow(x) during preprocessing and pow(1.0/x) during post-processing, the image displays abnormally.
At first I thought there was a problem with one of the intermediate fragment shaders, but even after I removed all of them the artifacts remained. What is the reason for this?
Initial rendering process:
preprocessing -> other filters -> post-processing
Modified rendering process (other filters and the separate post-processing pass removed); the shader below now applies both pow operations:
#version 300 es
precision highp float;
uniform sampler2D mTexture;
uniform float mPow;
in vec2 vTexCoord;
out vec4 vFragColor;
void main() {
    vec4 vFragColor1 = pow(texture(mTexture, vTexCoord), vec4(mPow));
    vFragColor = pow(vFragColor1, vec4(1.0 / mPow));
}
mPow range: 0.0 to 100.0
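I can only guess from the snippet, but two things are worth ruling out. First, GLSL defines pow(x, y) only for x > 0, or for x == 0 with y > 0, so a pure-black texel or mPow == 0.0 already gives undefined results. Second, when the two pow calls normally run in separate passes, the intermediate value is clamped and quantized by the 8-bit render target between them, so pow followed by the inverse pow does not round-trip. A hedged sketch of the combined shader with the base and exponent guarded (the 1e-4 epsilon is an arbitrary choice of mine):
#version 300 es
precision highp float;
uniform sampler2D mTexture;
uniform float mPow;
in vec2 vTexCoord;
out vec4 vFragColor;
void main() {
    // Keep the base strictly positive: pow() is undefined for a zero base with a
    // non-positive exponent, and tiny bases raised to a large mPow underflow to 0.0,
    // which the inverse pow can no longer recover.
    vec4 base = max(texture(mTexture, vTexCoord), vec4(1e-4));
    float e = max(mPow, 1e-4); // avoid 1.0 / 0.0 when mPow is 0.0
    vec4 tmp = pow(base, vec4(e));
    vFragColor = pow(tmp, vec4(1.0 / e));
}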

ESSL Separated Shader Objects : Output from vertex requires Input from fragment

Has anybody tried separate shader objects on Android with ESSL 3.1? I want to create a vertex shader with many output variables, but pair it with a fragment shader that ignores them.
Vertex Shader
#version 310 es
#extension GL_EXT_shader_io_blocks: enable
precision highp float;
out gl_PerVertex { vec4 gl_Position; float gl_PointSize; };
out vec2 texCoord0;
in vec4 attr_Vertex;
in vec2 attr_TexCoord;
void main()
{
    gl_Position = attr_Vertex;
    texCoord0 = attr_TexCoord;
}
Fragment Shader
#version 310 es
#extension GL_EXT_shader_io_blocks: enable
precision highp float;
void main() {} // Do nothing, (ie, for a depth only shader)
I get this error during pipeline validation:
glValidateProgramPipeline(ppo);
glGetProgramPipelineiv(ppo, GL_VALIDATE_STATUS, &status);
Output from glGetProgramPipelineInfoLog:
Error: output texCoord0 not declared in input from next stage.
On Android with ES 3.2, the message is
error: The fragment stage's input interface doesn't match preceding stage's output
However, this works on desktop OpenGL and on iOS with GL_EXT_separate_shader_objects.
I have many vertex shaders and want to perform a depth-only pass (or similar) with a special fragment shader that doesn't need the inputs, or ignores them.
Why is "output not declared in input from next stage" an error? At worst it should be a warning, not an error. And why does it happen on ESSL 3.1 and not on desktop? (The same shader also works on desktop, i.e. no error.)
According to the documentation:
With separable program objects, interfaces between shader stages may
involve the outputs from one program object and the inputs from a
second program object
Apparently layout(location = xx) is required for validation not to fail?
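From the validation messages, it looks like the ES validator wants every output of the vertex stage to have a matching input in the fragment stage, matched either by name or by an explicit location. A sketch of the depth-only fragment shader with a matching but unused input is below; the corresponding vertex output would then be declared as layout(location = 0) out vec2 texCoord0;. Whether name matching alone is enough on your driver, or whether the explicit location is required as you suspect, is something to test.
#version 310 es
#extension GL_EXT_shader_io_blocks : enable
precision highp float;
// Mirror the vertex stage's output even though this depth-only shader never reads it,
// so the separable pipeline's stage interfaces match at validation time.
layout(location = 0) in vec2 texCoord0;
void main() {} // depth-only: no color output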

OpenGL ES 2.0 SL writing out from fragment shader to texture

Hopefully this is a really simple question.
How do I write out to a texture attached to a framebuffer from within a fragment shader? I assume it is just gl_FragColor; am I supposed to define/use a different variable, like gl_FragData[0]?
Fragment shader:
precision mediump float;
varying vec2 vTextureCoord;
uniform sampler2D displayTexture;
void main() {
    gl_FragColor = texture2D(displayTexture, vTextureCoord);
}
This question is not about how to set up a texture for writing to, just about how to write out from within the fragment shader. I just want to make sure I have this piece of the puzzle.
Your assumption is correct. All the drawing code, including the shaders, is the same whether you are rendering to a renderbuffer or to an attached texture: whatever you write to gl_FragColor ends up in the attachment of the currently bound framebuffer.
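For what it's worth, here is a sketch of the same shader written against gl_FragData instead: in GLSL ES 1.00, gl_FragData[0] and gl_FragColor are equivalent ways to write the fragment color (a shader may write one or the other, not both), so no special variable is needed for an FBO-attached texture.
precision mediump float;
varying vec2 vTextureCoord;
uniform sampler2D displayTexture;
void main() {
    // Equivalent to writing gl_FragColor: the value lands in whatever is attached
    // to GL_COLOR_ATTACHMENT0 of the currently bound framebuffer.
    gl_FragData[0] = texture2D(displayTexture, vTextureCoord);
}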

Android GLKBaseEffect equivalent (already existing OpenGL shaders)

I'm learning OpenGL ES 2.0 on Android; do you know of a library that provides ready-made shaders?
I'm on a project with a friend who's developing on iOS. He told me that he can use GLKBaseEffect to avoid developing custom shaders, as long as we don't need complex features. Is there an equivalent of that BaseEffect on Android?
I'm asking this because the two of us were assigned this project by a professor, who told us that developing custom shaders is not important for this project, so I'm guessing there is a collection of basic shaders that I can browse.
Is that correct?
Thank you for your help!
Android doesn't ship anything like the GLKBaseEffect class, but shaders exist precisely to be programmable, and they are not hard at all if you stick to simple shader code.
If you don't want to do any post-processing of the image, you don't need to change the fragment shader below; that is all you have to do.
Vertex shader
attribute vec4 position;
attribute vec4 inputTextureCoordinate;
varying vec2 textureCoordinate;
void main(void)
{
    gl_Position = position;
    textureCoordinate = inputTextureCoordinate.xy;
}
Fragment shader
precision mediump float;
uniform sampler2D texture0;
varying vec2 textureCoordinate;
void main()
{
    gl_FragColor = texture2D(texture0, textureCoordinate);
}
Now you only need to supply three things: the position, the texture coordinate, and the texture itself :) just as you would anywhere else.
