Is there a way to transform a libgdx Texture to a grayscale image? So far I have duplicated the images I want to grayscale and converted them manually, but I don't think that is the best solution, because my game uses more and more images and they take up a lot of disk space.
Thought I'd share this for anyone wanting to use some copy/paste code.
import com.badlogic.gdx.graphics.glutils.ShaderProgram;

public class GrayscaleShader {
    static String vertexShader = "attribute vec4 a_position;\n" +
            "attribute vec4 a_color;\n" +
            "attribute vec2 a_texCoord0;\n" +
            "\n" +
            "uniform mat4 u_projTrans;\n" +
            "\n" +
            "varying vec4 v_color;\n" +
            "varying vec2 v_texCoords;\n" +
            "\n" +
            "void main() {\n" +
            "    v_color = a_color;\n" +
            "    v_texCoords = a_texCoord0;\n" +
            "    gl_Position = u_projTrans * a_position;\n" +
            "}";

    static String fragmentShader = "#ifdef GL_ES\n" +
            "    precision mediump float;\n" +
            "#endif\n" +
            "\n" +
            "varying vec4 v_color;\n" +
            "varying vec2 v_texCoords;\n" +
            "uniform sampler2D u_texture;\n" +
            "\n" +
            "void main() {\n" +
            "    vec4 c = v_color * texture2D(u_texture, v_texCoords);\n" +
            "    float grey = (c.r + c.g + c.b) / 3.0;\n" +
            "    gl_FragColor = vec4(grey, grey, grey, c.a);\n" +
            "}";

    public static ShaderProgram grayscaleShader = new ShaderProgram(vertexShader,
            fragmentShader);
}
To use it, call
spriteBatch.setShader(GrayscaleShader.grayscaleShader);
And when you're done with grayscale, don't forget to call
spriteBatch.setShader(null);
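For example, a minimal render-loop sketch might look like this (camera and texture are placeholders for your own objects):
// Sketch only: draw one texture in grayscale, then restore the default shader.
spriteBatch.setProjectionMatrix(camera.combined);
spriteBatch.setShader(GrayscaleShader.grayscaleShader);
spriteBatch.begin();
spriteBatch.draw(texture, 0, 0);
spriteBatch.end();
spriteBatch.setShader(null); // back to the default SpriteBatch shader for normal drawing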
You should be able to write a GLSL shader that renders a texture in grayscale. This requires OpenGL ES 2.0, and it doesn't really "transform" the texture; it just renders it to the display as grayscale.
For a detailed tutorial on shaders that includes a grayscale shader check out: https://github.com/mattdesl/lwjgl-basics/wiki/ShaderLesson3
(Libgdx doesn't really define the GLSL shader API, that's passed through from OpenGL, so most tutorials or code you find on the web for regular OpenGL should work.)
For a more direct hack, just take the Libgdx SpriteBatch shader and change the fragment shader so it averages the rgb components. (You can define your own ShaderProgram and provide it to a SpriteBatch to use.)
Change body of the fragment shader to something like this (untested, so may not compile):
+ " vec4 c = v_color * texture2D(u_texture, v_texCoords);\n" //
+ " float grey = (c.r + c.g + c.b) / 3.0;\n" //
+ " gl_FragColor = vec4(grey, grey, grey, c.a);\n" //
You can load textures as luminance only, or luminance plus alpha, in GLES (see glTexImage2D). In libgdx you can specify Pixmap.Format.Intensity (luminance) or Pixmap.Format.LuminanceAlpha (luminance and alpha) when instantiating the Texture. This gives you a grayscale texture.
You still need two textures loaded (one color, one grayscale), but they can share the same source file, and the luminance-only version uses just 1 byte per pixel in memory.
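For example, a sketch of loading the same file twice, once in full color and once as a luminance-only texture ("image.png" is a placeholder):
// Full-color version for normal rendering.
Texture colorTexture = new Texture(Gdx.files.internal("image.png"));
// Grayscale version: libgdx converts the pixels to a single-channel format on upload,
// so it costs only one byte per pixel in memory.
Texture grayTexture = new Texture(Gdx.files.internal("image.png"), Pixmap.Format.Intensity, false);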
A more efficient solution is to implement a shader as suggested by P.T., but that is only available from GLES 2.0 onward.
Solved. Please see my answer below.
---------------------------
Edit 3:
It's almost solved. Last push please.
I tried the scale and the modulateGlow variants that Rabbid76 suggested, and the butterflies still come out transparent. Please look at the butterfly on the chair and the big one on the floor.
And here is one of the original images:
---------------------------
Edit 2:
Thanks to Rabbid76, it's almost solved!
The left image shows the current result with Rabbid76's solution. The right image shows the expected result. As you can see, the butterflies became lighter/more transparent.
---------------------------
Edit 1:
I tried adding the suggested glBlendFunc and it didn't help, although it affected the glow effect a little bit. I also needed to add GLES20.glEnable(GLES20.GL_BLEND), because blending is disabled by default.
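For reference, the blending setup I added looks roughly like this (the exact glBlendFunc parameters I tried aren't quoted here):
// Blending is disabled by default in GLES 2.0, so enable it before drawing the overlay.
GLES20.glEnable(GLES20.GL_BLEND);
GLES20.glBlendFunc(GLES20.GL_SRC_ALPHA, GLES20.GL_ONE_MINUS_SRC_ALPHA); // typical alpha blending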
---------------------------
I'm using this repository
https://github.com/MasayukiSuda/GPUVideo-android
It contains filters that can be applied to the camera preview. One of the filters is named "WATERMARK". When I change the watermark image to one with a glow effect, the glow becomes black.
The image is located at GPUVideo-android-master\sample\src\main\res\drawable-nodpi\sample_bitmap.png. To trigger the problem, it should be replaced with an image that has a glow effect; for example, I generated this one: https://drive.google.com/file/d/1u8PtIftovPbXRLnI3OR9_1Kv37L5s96_/view?usp=sharing
You can see the issue with the white glow effect that becomes black:
Just run the app, click on "Camera Record", then choose "Portrait" and from the left list choose "WATERMARK" (the 5th from the end). Don't forget to change the image I mentioned.
The relevant class is GlWatermarkFilter. It extends GlOverlayFilter, which contains the following shader:
private final static String FRAGMENT_SHADER =
"precision mediump float;\n" +
"varying vec2 vTextureCoord;\n" +
"uniform lowp sampler2D sTexture;\n" +
"uniform lowp sampler2D oTexture;\n" +
"void main() {\n" +
" lowp vec4 textureColor = texture2D(sTexture, vTextureCoord);\n" +
" lowp vec4 textureColor2 = texture2D(oTexture, vTextureCoord);\n" +
" \n" +
" gl_FragColor = mix(textureColor, textureColor2, textureColor2.a);\n" +
"}\n";
and this class also contains this setup:
GLES20.glGenTextures(1, textures, 0);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textures[0]);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
This class extends GlFilter which contains:
protected static final String DEFAULT_VERTEX_SHADER =
"attribute highp vec4 aPosition;\n" +
"attribute highp vec4 aTextureCoord;\n" +
"varying highp vec2 vTextureCoord;\n" +
"void main() {\n" +
"gl_Position = aPosition;\n" +
"vTextureCoord = aTextureCoord.xy;\n" +
"}\n";
protected static final String DEFAULT_FRAGMENT_SHADER =
"precision mediump float;\n" +
"varying highp vec2 vTextureCoord;\n" +
"uniform lowp sampler2D sTexture;\n" +
"void main() {\n" +
"gl_FragColor = texture2D(sTexture, vTextureCoord);\n" +
"}\n";
Please help me figure out how to fix it. I know it's something related to semi-transparent alpha that this camera pipeline doesn't display correctly. My goal is to record a video with glowing butterflies.
Most likely, the color channels of the glow effect texture fade out at the edges of the glowing area, meaning they have a gradient from white to black there. If that is the case, mixing the textures is the wrong operation. You need to add the glow effect (textureColor2) to the base texture (textureColor):
gl_FragColor = textureColor + textureColor2;
Alternatively, you could scale the glow effect texture by its alpha channel:
gl_FragColor = textureColor + textureColor2 * textureColor2.a;
If the glow effect seems too bright, you can tone it down by scaling the glow texture:
float scale = 0.5;
gl_FragColor = textureColor + textureColor2 * scale;
Or you can modulate the glow effect through the color texture:
vec4 modulateGlow = textureColor2 * textureColor;
gl_FragColor = textureColor + modulateGlow;
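For completeness, this is roughly what the whole fragment shader from the question would look like with the modulated variant dropped in (an untested sketch, not the repository's code):
private final static String FRAGMENT_SHADER =
        "precision mediump float;\n" +
        "varying vec2 vTextureCoord;\n" +
        "uniform lowp sampler2D sTexture;\n" +
        "uniform lowp sampler2D oTexture;\n" +
        "void main() {\n" +
        "  lowp vec4 textureColor = texture2D(sTexture, vTextureCoord);\n" +
        "  lowp vec4 textureColor2 = texture2D(oTexture, vTextureCoord);\n" +
        "  lowp vec4 modulateGlow = textureColor2 * textureColor;\n" +
        "  gl_FragColor = textureColor + modulateGlow;\n" +
        "}\n";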
This solves all the issues, so please notice the last two lines: alphaDivisor un-premultiplies the overlay color (the step() term avoids a division by zero when the alpha is 0), and mix() then blends the result over the base texture by the overlay's alpha:
private final static String FRAGMENT_SHADER =
"precision mediump float;\n" +
"varying vec2 vTextureCoord;\n" +
"uniform lowp sampler2D sTexture;\n" +
"uniform lowp sampler2D oTexture;\n" +
"void main() {\n" +
" lowp vec4 textureColor = texture2D(sTexture, vTextureCoord);\n" +
" lowp vec4 textureColor2 = texture2D(oTexture, vTextureCoord);\n" +
" lowp float alphaDivisor = textureColor2.a + step(textureColor2.a, 0.0);\n" +
" gl_FragColor = vec4(mix(textureColor.rgb, textureColor2.rgb / alphaDivisor, textureColor2.a), textureColor.a);\n" +
"}\n";
My Android program needs the glBlitFramebuffer() function to copy a framebuffer object, but glBlitFramebuffer() is only supported on OpenGL ES 3.0+ devices, and I want to support OpenGL ES 2.0+ devices.
Is there any solution/alternative for this function?
Bind the texture that is used as the color attachment of the source framebuffer
Bind the destination framebuffer
Draw a fullscreen quad (if you need stretched or offset reads, manipulate the vertex/texture coordinates)
Fetch the data from the bound texture in the fragment shader and write it to gl_FragColor; a sketch of these steps follows below
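A rough sketch of those steps in Android Java (destinationFbo, sourceTexture, quadBuffer, width, and height are placeholders; copyProgram is assumed to be a linked pass-through program with an a_Position attribute and a u_Texture sampler, like the CopyShader in the next answer):
// Render into the destination framebuffer instead of the screen.
GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, destinationFbo);
GLES20.glViewport(0, 0, width, height);
// Bind the source texture to unit 0 and point the sampler at it.
GLES20.glUseProgram(copyProgram);
GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, sourceTexture);
GLES20.glUniform1i(GLES20.glGetUniformLocation(copyProgram, "u_Texture"), 0);
// Draw a fullscreen quad; the fragment shader writes the sampled color to gl_FragColor.
int positionHandle = GLES20.glGetAttribLocation(copyProgram, "a_Position");
GLES20.glEnableVertexAttribArray(positionHandle);
GLES20.glVertexAttribPointer(positionHandle, 2, GLES20.GL_FLOAT, false, 0, quadBuffer); // (-1,-1) (1,-1) (-1,1) (1,1)
GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);
GLES20.glDisableVertexAttribArray(positionHandle);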
I've created a CopyShader that simply uses a shader to copy from a texture to a framebuffer.
private static final String SHADER_VERTEX = ""
+ "attribute vec4 a_Position;\n"
+ "varying highp vec2 v_TexCoordinate;\n"
+ "void main() {\n"
+ " v_TexCoordinate = a_Position.xy * 0.5 + 0.5;\n"
+ " gl_Position = a_Position;\n"
+ "}\n";
private static final String SHADER_FRAGMENT = ""
+ ""
+ "uniform sampler2D u_Texture;\n"
+ "varying highp vec2 v_TexCoordinate;\n"
+ "void main() {\n"
+ " gl_FragColor = texture2D(u_Texture, v_TexCoordinate);\n"
+ "}\n”;
Use these as your shaders, and then just set u_Texture to the texture you want to copy from, and bind the framebuffer you want to write to, and you should be set.
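Building the program from those two strings is the usual compile/attach/link sequence (a sketch; compile and link status checks omitted for brevity):
int vs = GLES20.glCreateShader(GLES20.GL_VERTEX_SHADER);
GLES20.glShaderSource(vs, SHADER_VERTEX);
GLES20.glCompileShader(vs);
int fs = GLES20.glCreateShader(GLES20.GL_FRAGMENT_SHADER);
GLES20.glShaderSource(fs, SHADER_FRAGMENT);
GLES20.glCompileShader(fs);
int copyProgram = GLES20.glCreateProgram();
GLES20.glAttachShader(copyProgram, vs);
GLES20.glAttachShader(copyProgram, fs);
GLES20.glLinkProgram(copyProgram);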
I want to change a pixel color of a Bitmap with OpenGL ES by picking it at given coordinates. What do I have to do? Any advice?
fragmentShader = riGraphicTools.loadShader(GLES20.GL_FRAGMENT_SHADER, riGraphicTools.test_Image);
riGraphicTools.sp_Image = GLES20.glCreateProgram();
GLES20.glAttachShader(riGraphicTools.sp_Image, fragmentShader);
public static final String test_Image =
"precision mediump float;" +
"varying vec2 v_texCoord;" +
"uniform sampler2D s_texture;" +
"void main() {" +
" gl_FragColor = texture2D( s_texture, v_texCoord );" +
"}";
I don't think ES 2.0 supports writable textures from within a shader, so instead of modifying sp_Image on the fly, just render to a texture with an FBO and use that texture in place of your original sp_Image.
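A minimal sketch of that FBO setup (width and height are placeholders; error checks omitted). Render the original image plus your pixel edits into this texture, then bind it wherever sp_Image was used:
int[] tex = new int[1];
int[] fbo = new int[1];
// Create an empty RGBA texture that will receive the rendered result.
GLES20.glGenTextures(1, tex, 0);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, tex[0]);
GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA, width, height, 0,
        GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, null);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
// Attach it to a framebuffer object so it can be rendered into.
GLES20.glGenFramebuffers(1, fbo, 0);
GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fbo[0]);
GLES20.glFramebufferTexture2D(GLES20.GL_FRAMEBUFFER, GLES20.GL_COLOR_ATTACHMENT0,
        GLES20.GL_TEXTURE_2D, tex[0], 0);
// ...draw the original texture plus the changed pixel(s) here, then unbind:
GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);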
This should really be a simple task, but I fail miserably. First a confession: I know very little about OpenGL and the few things I've learned come from various tutorials (some of which are probably outdated and deprecated)
I'm trying to draw a quad using a single color defined at runtime. The quad is drawn in the correct size and position but with the wrong color. This is my vertex and fragment shader:
public static final String VS_SOLIDCOLOR =
"uniform mat4 uMVPMatrix;" +
"attribute vec4 aPosition;" +
"attribute vec4 aColor;" +
"varying vec4 vColor;" +
"void main() {" +
" gl_Position = uMVPMatrix * aPosition;" +
" vColor = aColor;" +
"}";
public static final String FS_SOLIDCOLOR =
"precision mediump float;" +
"varying vec4 vColor;" +
"void main() {" +
" gl_FragColor = vColor;" +
"}";
The color is set like this (I have removed gl error check and check for colorHandle == -1):
float r = 1.0f;
float g = 0.0f;
float b = 0.0f;
float a = 1.0f;
final int colorHandle = GLES20.glGetAttribLocation(shaderProgramSolidColor, "aColor");
GLES20.glVertexAttrib4f(colorHandle, r, g, b, a);
I would have expected the above code to result in solid red triangles, but they end up as solid yellow. From what I've read the fragment shader will interpolate the vColor vector, but I don't want that (and I'm not sure why it ends up as yellow). How do I set a color at runtime and get that drawn unchanged on my triangles?
PS If I were to do this in the fragment shader I would get a solid red color for my entire quad:
gl_FragColor = vec4(1,0,0,1);
Let me know if you need me to post more code.
I found what I did wrong. I was under the impression that I could only set variables in the vertex shader at runtime, but it's possible to set them in the fragment shader as well. My shaders should look like this:
public static final String VS_SOLIDCOLOR =
"uniform mat4 uMVPMatrix;" +
"attribute vec4 aPosition;" +
"void main() {" +
" gl_Position = uMVPMatrix * aPosition;" +
"}";
public static final String FS_SOLIDCOLOR =
"precision mediump float;" +
"uniform vec4 uColor;" +
"void main() {" +
" gl_FragColor = uColor;" +
"}";
And the color is set like this:
final int colorHandle = GLES20.glGetUniformLocation(shaderProgramSolidColor, "uColor");
GLES20.glUniform4f(colorHandle, r, g, b, a);
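One thing worth noting, shown in this small sketch (shaderProgramSolidColor is assumed to be linked already): glUniform4f writes to the currently active program, so glUseProgram has to come first.
GLES20.glUseProgram(shaderProgramSolidColor);
final int colorHandle = GLES20.glGetUniformLocation(shaderProgramSolidColor, "uColor");
GLES20.glUniform4f(colorHandle, 1.0f, 0.0f, 0.0f, 1.0f); // solid red
// ...then set aPosition/uMVPMatrix and issue the draw call as before.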
This vertex shader code works on every device except for the Galaxy Note 2.
gl_Position = uMVPMatrix * vPosition;
whereas if I reverse the matrix multiplication to
gl_Position = vPosition * uMVPMatrix;
things actually appear.
Unfortunately, reversing the order would require me to completely rewrite my transformations library.
Does anyone have any insight into what could be causing this? Is it an OpenGL driver bug on the device?
Shader code
private final String vertexShaderCode =
// This matrix member variable provides a hook to manipulate
// the coordinates of the objects that use this vertex shader
"uniform mat4 uMVPMatrix;" +
"attribute vec4 vPosition;" +
"attribute vec2 a_TexCoordinate;" +
"varying vec2 v_TexCoordinate;" +
"void main() {" +
// the matrix must be included as a modifier of gl_Position
"v_TexCoordinate = a_TexCoordinate;" +
"gl_Position = uMVPMatrix * vPosition;" +
"}";
private final String fragmentShaderCode =
"precision mediump float;" +
"uniform sampler2D u_Texture;" +
"varying vec2 v_TexCoordinate;" +
"void main() {" +
" gl_FragColor = texture2D(u_Texture, v_TexCoordinate);" +
//" gl_FragColor = vec4(v_TexCoordinate, 0, 1);" +
"}";
This is not specific to the Galaxy Note 2; it is a question of math conventions. GLSL (like HLSL's default packing) uses column-major order, so the correct way is to multiply MATRIX x VECTOR.
Alternatively, you can ask GL to transpose the matrix when you upload it, via the third parameter of glUniformMatrix4fv:
glUniformMatrix4fv(h_Uniforms[UNIFORMS_PROJECTION], 1, GL_FALSE or GL_TRUE, g_proxtrans.s);
If the layout of the matrix you upload does not match what the shader expects, the result will be wrong, so check which convention your transformation library uses and try the transpose option.
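A sketch of the usual Android path, assuming program is linked and projectionMatrix/modelViewMatrix come from android.opengl.Matrix (which produces column-major data, matching GLSL). Note that OpenGL ES 2.0 itself only accepts GL_FALSE for the transpose flag, so on ES 2.0 any transposition has to happen on the CPU side:
// Build the MVP on the CPU in column-major order, as GLSL expects...
float[] mvpMatrix = new float[16];
Matrix.multiplyMM(mvpMatrix, 0, projectionMatrix, 0, modelViewMatrix, 0);
// ...and upload it untransposed; the shader keeps gl_Position = uMVPMatrix * vPosition.
int mvpHandle = GLES20.glGetUniformLocation(program, "uMVPMatrix");
GLES20.glUniformMatrix4fv(mvpHandle, 1, false, mvpMatrix, 0);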