I'm rendering a libgdx mesh with a shader whose color alpha is less than 1.0.
On the first frame, the alpha value set in the shader is ignored and rendered as 1.0; all following frames are fine.
The same thing happened to me in a previous project drawing lines and shapes with glDrawArrays, and I never found the solution.
Code in libgdx render() loop:
Gdx.gl20.glClearColor(0, 0, 0, 1);
Gdx.gl20.glClear(GL20.GL_COLOR_BUFFER_BIT | GL20.GL_DEPTH_BUFFER_BIT);
Gdx.gl20.glEnable(GL20.GL_BLEND);
MeshManager_colorFill.meshShader.begin();
MeshManager_colorFill.meshShader.setUniformMatrix("u_worldView", Snappr.camMatrix);
meshArray_colorFill.get(i).render(MeshManager_colorFill.meshShader, GL20.GL_TRIANGLE_FAN, 0, meshArray_colorFill.get(i).getNumVertices());
MeshManager_colorFill.meshShader.end();
Gdx.gl20.glDisable(GL20.GL_BLEND);
My shaders (compiled in create()):
public static final String meshVertexShader_colorFill =
"attribute vec2 a_position; \n" +
"uniform mat4 u_worldView;\n" +
"void main() \n" +
"{ \n" +
" gl_Position = u_worldView * vec4(a_position.xy, 0, 1); \n" +
"} \n" ;
public static final String meshFragmentShader_colorFill =
"precision mediump float;\n" +
"void main() \n" +
"{ \n" +
" gl_FragColor = vec4(1,1,1,0.2);\n" +
"}";
How do I get the very first frame to render as it should?
Thanks
glBlendFunc in create() does the trick, specifically: "Gdx.gl20.glBlendFunc(GL20.GL_SRC_ALPHA, GL20.GL_ONE_MINUS_SRC_ALPHA);". I'm using a SpriteBatch to render a font right after the mesh, and SpriteBatch sets glBlendFunc internally; that looks like the reason all the other frames were fine.
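For reference, a minimal sketch of where that call can go (method and field names follow the question's code; the blend function is global GL state, so setting it once is enough as long as nothing else changes it):
@Override
public void create() {
    Gdx.gl20.glBlendFunc(GL20.GL_SRC_ALPHA, GL20.GL_ONE_MINUS_SRC_ALPHA);
    // ... compile MeshManager_colorFill.meshShader, build the meshes, etc.
}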
I found that ModelBatch.begin(...) disables blending by calling Gdx.gl.glDisable(GL20.GL_BLEND), so make sure to enable blending after calling ModelBatch.begin():
modelBatch.begin(camera); // resets Blending to false
// enable Blending
Gdx.gl.glEnable(GL20.GL_BLEND);
Gdx.gl.glBlendFunc(GL20.GL_SRC_ALPHA, GL20.GL_ONE_MINUS_SRC_ALPHA);
// draw mesh
mesh.render(shader, GL20.GL_TRIANGLES);
modelBatch.end();
Source https://stackoverflow.com/a/66820414/2413469
Related
My Android program needs the glBlitFramebuffer() function to copy a framebuffer object, but glBlitFramebuffer() is only supported on OpenGL ES 3.0+ devices, and I want to support OpenGL ES 2.0+ devices.
Is there any solution/alternative for this function?
Bind the texture that is used as the color attachment of the source framebuffer
Bind the destination framebuffer
Draw a full-screen quad (if you need stretched or offset reading, manipulate the vertex/texture coordinates)
Fetch the data from the bound texture in the fragment shader and write it to gl_FragColor
I've created a CopyShader that simply uses a shader to copy from a texture to a framebuffer.
private static final String SHADER_VERTEX = ""
+ "attribute vec4 a_Position;\n"
+ "varying highp vec2 v_TexCoordinate;\n"
+ "void main() {\n"
+ " v_TexCoordinate = a_Position.xy * 0.5 + 0.5;\n"
+ " gl_Position = a_Position;\n"
+ "}\n";
private static final String SHADER_FRAGMENT = ""
+ "uniform sampler2D u_Texture;\n"
+ "varying highp vec2 v_TexCoordinate;\n"
+ "void main() {\n"
+ " gl_FragColor = texture2D(u_Texture, v_TexCoordinate);\n"
+ "}\n”;
Use these as your shaders, then set u_Texture to the texture you want to copy from, bind the framebuffer you want to write to, draw a full-screen quad, and you should be set.
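As a rough, untested sketch of that last step (copyProgram, aPositionLoc, uTextureLoc, srcTexture, dstFbo, dstWidth/dstHeight and quadBuffer are assumed to have been created elsewhere; quadBuffer holds a full-screen triangle strip in NDC, e.g. {-1,-1, 1,-1, -1,1, 1,1}):
GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, dstFbo);       // destination FBO
GLES20.glViewport(0, 0, dstWidth, dstHeight);
GLES20.glUseProgram(copyProgram);
GLES20.glActiveTexture(GLES20.GL_TEXTURE0);                    // source texture on unit 0
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, srcTexture);
GLES20.glUniform1i(uTextureLoc, 0);
GLES20.glEnableVertexAttribArray(aPositionLoc);
GLES20.glVertexAttribPointer(aPositionLoc, 2, GLES20.GL_FLOAT, false, 0, quadBuffer);
GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);           // full-screen quad
GLES20.glDisableVertexAttribArray(aPositionLoc);
GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);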
Is there a way to transform a libgdx Texture into a grayscale image? So far I have duplicated the images that I want in grayscale and converted them manually, but I don't think that is the best solution because my game uses more and more images and this takes a lot of disk space.
Thought I'd share this for anyone wanting to use some copy/paste code.
import com.badlogic.gdx.graphics.glutils.ShaderProgram;
public class GrayscaleShader {
static String vertexShader = "attribute vec4 a_position;\n" +
"attribute vec4 a_color;\n" +
"attribute vec2 a_texCoord0;\n" +
"\n" +
"uniform mat4 u_projTrans;\n" +
"\n" +
"varying vec4 v_color;\n" +
"varying vec2 v_texCoords;\n" +
"\n" +
"void main() {\n" +
" v_color = a_color;\n" +
" v_texCoords = a_texCoord0;\n" +
" gl_Position = u_projTrans * a_position;\n" +
"}";
static String fragmentShader = "#ifdef GL_ES\n" +
" precision mediump float;\n" +
"#endif\n" +
"\n" +
"varying vec4 v_color;\n" +
"varying vec2 v_texCoords;\n" +
"uniform sampler2D u_texture;\n" +
"\n" +
"void main() {\n" +
" vec4 c = v_color * texture2D(u_texture, v_texCoords);\n" +
" float grey = (c.r + c.g + c.b) / 3.0;\n" +
" gl_FragColor = vec4(grey, grey, grey, c.a);\n" +
"}";
public static ShaderProgram grayscaleShader = new ShaderProgram(vertexShader,
fragmentShader);
}
To use it call
spriteBatch.setShader(GrayscaleShader.grayscaleShader)
And when you're done with grayscale don't forget to call
spriteBatch.setShader(null);
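Putting it together, a small sketch of a render() body (spriteBatch, grayTexture and colorTexture are placeholder fields, not part of the original answer):
spriteBatch.begin();
spriteBatch.setShader(GrayscaleShader.grayscaleShader); // flushes the batch and switches shaders
spriteBatch.draw(grayTexture, 0, 0);
spriteBatch.setShader(null);                            // back to the default SpriteBatch shader
spriteBatch.draw(colorTexture, 0, 100);
spriteBatch.end();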
You should be able to write a GLSL shader that renders a texture in grayscale. This requires OpenGL ES 2.x, and it doesn't really "transform" the texture; it just renders it to the display as grayscale.
For a detailed tutorial on shaders that includes a grayscale shader check out: https://github.com/mattdesl/lwjgl-basics/wiki/ShaderLesson3
(Libgdx doesn't really define the GLSL shader API, that's passed through from OpenGL, so most tutorials or code you find on the web for regular OpenGL should work.)
For a more direct hack, just take the Libgdx SpriteBatch shader and change the fragment shader so it averages the rgb components. (You can define your own ShaderProgram and provide it to a SpriteBatch to use.)
Change the body of the fragment shader to something like this (untested, so it may not compile):
+ " vec4 c = v_color * texture2D(u_texture, v_texCoords);\n" //
+ " float grey = (c.r + c.g + c.b) / 3.0;\n" //
+ " gl_FragColor = vec4(grey, grey, grey, c.a);\n" //
You can load textures as luminance only, or as luminance plus alpha, in GLES (see glTexImage2D). In libgdx you can specify Pixmap.Format.Intensity (luminance) or Pixmap.Format.LuminanceAlpha (luminance and alpha) when instantiating the Texture; this gives you a grayscale texture.
You still need two textures loaded (one color, one grayscale), but they can share the same source image, and the luminance-only texture uses only 1 byte per pixel in memory.
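A sketch of what that looks like when loading (the path is a placeholder, and it's worth checking on-device that the RGB-to-luminance conversion looks the way you expect):
Texture colorTexture = new Texture(Gdx.files.internal("image.png"));
Texture grayTexture  = new Texture(Gdx.files.internal("image.png"), Pixmap.Format.Intensity, false);
// use Pixmap.Format.LuminanceAlpha instead if you also need the alpha channel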
A more efficient solution is to implement a shader as suggested by P.T., but that is only available from GLES 2.0 onward.
It has been quite a while since I worked with matrix calculations and mapped them to expected results, and it's a bit intimidating. I'm curious whether anyone has mapped an ImageView matrix (android.graphics.Matrix) to an OpenGL matrix (android.opengl.Matrix) in order to update a texture based on the image behind it.
There will always be a photograph directly behind the texture, and the texture in front should keep the same scale and translation as the ImageView (I'm actually using the ImageTouchView library). The ImageView scaling and zooming works as expected.
So, while the image is being resized, I want to update the OpenGL ES 2.0 shader code in the project so that the texture changes size with it. I thought I could do this with callbacks and by directly manipulating the shader inputs, but it doesn't seem to be working. I'm not sure whether this is a shader problem or a problem with how the matrix is passed. The trigger fires from the ImageTouchView whenever a matrix change is detected.
#Override
public void onMatrixChanged(Matrix matrix)
{
photoViewRenderer_.updateTextureMatrix(matrix);
photoSurfaceView_.requestRender();
}
When I receive that matrix data and try to apply the new matrix to the texture, I get a black screen on top of the ImageTouchView, so this could very well just be an OpenGL problem in my code.
The shader code looks like this; I'll start with these shaders and add more code as requested based on feedback.
private final String vertexShader_ =
"uniform float zoom;\n" +
"attribute vec4 a_position;\n" +
"attribute vec4 a_texCoord;\n" +
"attribute vec3 a_translation;\n" +
"varying vec2 v_texCoord;\n" +
"void main() {\n" +
" gl_Position = a_position * vec4(a_translation, 1.0);\n" +
" v_texCoord = a_texCoord.xy * zoom;\n" +
"}\n";
Before adding the vec4(a_translation, 1.0) part, it seemed to be working as expected: the image was displayed directly on top of the ImageTouchView at the same size. So it is probably the shader, but I cannot rule out that the data coming in from the image matrix is corrupting the texture transform and placing it way off screen. I also don't know what default matrix to use for a_translation to check against.
Edit:
The black screen is actually no longer an issue: setting the default a_position data to private float[] positionTranslationData_ = {1.0f, 1.0f, 1.0f, 1.0f, 1.0f, 1.0f, 1.0f, 1.0f, 1.0f}; brought the image back. But the texture is still not being manipulated by the input from the onMatrixChanged(Matrix matrix) callback in the ImageTouchView.
If you want to translate the image (by moving where the pixels are rendered) you have two options.
The first is to translate gl_Position:
private final String vertexShader_ =
"uniform float zoom;\n" +
"attribute vec4 a_position;\n" +
"attribute vec4 a_texCoord;\n" +
"attribute vec3 a_translation;\n" +
"varying vec2 v_texCoord;\n" +
"void main() {\n" +
" gl_Position = a_position + vec4(a_translation, 1.0);\n" + // Note the + here
" v_texCoord = a_texCoord.xy * zoom;\n" +
"}\n";
The second is to apply an affine transform to gl_Position using a 4x4 matrix (be aware that the 4th component of a_position must be 1.0):
private final String vertexShader_ =
"uniform float zoom;\n" +
"attribute vec4 a_position;\n" +
"attribute vec4 a_texCoord;\n" +
"attribute mat4 a_translation;\n" +
"varying vec2 v_texCoord;\n" +
"void main() {\n" +
" gl_Position = a_position*a_translation;\n" +
" v_texCoord = a_texCoord.xy * zoom;\n" +
"}\n";
If you want to move the texture (rather than the quad or whatever you are rendering), you can apply the same logic to v_texCoord; this applies a translation and/or rotation to the output texture coordinates.
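For example, a sketch with a hypothetical u_texOffset uniform (set from Java each frame, e.g. via glUniform2f) that shifts the texture instead of the geometry:
private final String vertexShader_ =
    "uniform float zoom;\n" +
    "uniform vec2 u_texOffset;\n" +   // hypothetical uniform carrying the translation
    "attribute vec4 a_position;\n" +
    "attribute vec4 a_texCoord;\n" +
    "varying vec2 v_texCoord;\n" +
    "void main() {\n" +
    "  gl_Position = a_position;\n" +
    "  v_texCoord = a_texCoord.xy * zoom + u_texOffset;\n" +
    "}\n";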
One likely reason for your problem is that your current code multiplies component-wise: a_position.x is multiplied by a_translation.x, the same for y, and so on, which I don't think is what you are trying to achieve.
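If the goal is to feed the android.graphics.Matrix from onMatrixChanged into the shader, it is usually easier to pass it as a mat4 uniform than as an attribute. A sketch of the conversion (it only copies the affine part into a column-major 4x4 and ignores the perspective row; mapping view pixel coordinates into clip space is a separate step, and uTranslationLoc is a hypothetical uniform handle):
float[] values = new float[9];
matrix.getValues(values); // row-major: {scaleX, skewX, transX, skewY, scaleY, transY, persp0, persp1, persp2}
float[] glMatrix = {
    values[0], values[3], 0f, 0f,   // column 0
    values[1], values[4], 0f, 0f,   // column 1
    0f,        0f,        1f, 0f,   // column 2
    values[2], values[5], 0f, 1f    // column 3 (translation)
};
GLES20.glUniformMatrix4fv(uTranslationLoc, 1, false, glMatrix, 0);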
I've found an AndEngine sample with a blur effect and it works fine; I found that I can use the ShaderProgram class and apply shaders in the sprite's preDraw (setShaders).
Can I use the same ShaderProgram class without attaching it to a sprite?
For example, in onDraw, use a shader that just draws a green triangle on the scene (like in the Android OpenGL example). Is there a simple way to use ShaderProgram this way (how can I attach shaders outside of a sprite's preDraw)?
I've tried something like this:
BrushShaderProgram bs = BrushShaderProgram.getInstance();
bs.link(pGLState);
GLES20.glUseProgram(bs.getId());
GLES20.glVertexAttribPointer(bs.maPositionHandle, 3, GLES20.GL_FLOAT, false, 12, triangleVB);
GLES20.glEnableVertexAttribArray(bs.maPositionHandle);
GLES20.glDrawArrays(GLES20.GL_TRIANGLES, 0, 3);
The shader is simple:
public static final String VERTEXSHADER =
"attribute vec4 vPosition; \n" +
"void main(){ \n" +
" gl_Position = vPosition; \n" +
"} \n";
public static final String FRAGMENTSHADER = "precision mediump float; \n" +
"void main(){ \n" +
" gl_FragColor = vec4 (0.63671875, 0.76953125, 0.22265625, 1.0); \n" +
"} \n";
Why not create a Triangle class, just like the Rectangle class?
In my application I have to use the Android camera and OpenGL ES.
I also have to apply an effect to the camera preview using two files named one.vsh and one.fsh, but I don't know how to use those files with OpenGL ES.
I also don't know how to make the Android camera work with OpenGL ES so that the effect from those two files can be applied.
Please help me with this.
Thanks.
Well, I haven't tested this with the Android camera, but you can of course use the shader files in the onSurfaceCreated method, like below:
//
// Initialize the shader and program object
//
public void onSurfaceCreated(GL10 glUnused, EGLConfig config) {
String vShaderStr = "uniform mat4 u_mvpMatrix; \n"
+ "attribute vec4 a_position; \n"
+ "void main() \n"
+ "{ \n"
+ " gl_Position = u_mvpMatrix * a_position; \n"
+ "} \n";
String fShaderStr = "precision mediump float; \n"
+ "void main() \n"
+ "{ \n"
+ " gl_FragColor = vec4( 1.0, 0.0, 0.0, 1.0 ); \n"
+ "} \n";
// Load the shaders and get a linked program object
mProgramObject = ESShader.loadProgram(vShaderStr, fShaderStr);
// Get the attribute locations
mPositionLoc = GLES20.glGetAttribLocation(mProgramObject, "a_position");
// Get the uniform locations
mMVPLoc = GLES20.glGetUniformLocation(mProgramObject, "u_mvpMatrix");
// Generate the vertex data
mCube.genCube(1.0f);
// Starting rotation angle for the cube
mAngle = 45.0f;
GLES20.glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
}
Just replace those strings with the vertex and fragment shader source you want to use.
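Since the question mentions one.vsh and one.fsh, here is a small sketch (assuming the files sit in the app's assets folder) for reading them into the two strings above:
// Reads a shader source file from assets/ into a String (minimal error handling).
private String loadShaderSource(Context context, String fileName) {
    StringBuilder source = new StringBuilder();
    try (BufferedReader reader = new BufferedReader(
            new InputStreamReader(context.getAssets().open(fileName)))) {
        String line;
        while ((line = reader.readLine()) != null) {
            source.append(line).append('\n');
        }
    } catch (IOException e) {
        throw new RuntimeException("Could not load shader " + fileName, e);
    }
    return source.toString();
}
// Then, inside onSurfaceCreated (a Context has to be handed to the renderer):
// String vShaderStr = loadShaderSource(context, "one.vsh");
// String fShaderStr = loadShaderSource(context, "one.fsh");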
Hope this helps.