libgdx - changing sprite color to white - android

I want to colorize a sprite so that RGB channels are all 1 and alpha remains unchanged.
I gather this should be done with shaders, but the two accepted answers on StackOverflow ("Change sprite color into white" and "libgdx changing sprite color while hurt") don't work for me: the result is transparent, and they don't work on http://shdr.bkcore.com/ either.

All you need to do is replace each RGB channel with 1.0 in the fragment shader.
Vertex shader: this is like the default one in SpriteBatch, with the vertex color removed since you aren't using it:
attribute vec4 a_position;
attribute vec2 a_texCoord0;
uniform mat4 u_projTrans;
varying vec2 v_texCoords; // this declaration was missing; the fragment shader reads it
void main()
{
    v_texCoords = a_texCoord0;
    gl_Position = u_projTrans * a_position;
}
Fragment shader: grab just the alpha value from the texture:
#ifdef GL_ES
precision lowp float; // since the only value we're storing is part of a color
#endif
varying vec2 v_texCoords;
uniform sampler2D u_texture;
void main()
{
    float alpha = texture2D(u_texture, v_texCoords).a;
    gl_FragColor = vec4(1.0, 1.0, 1.0, alpha);
}
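A minimal usage sketch on the libgdx side (not part of the original answer; the shader file names are placeholders):
// Load the two shaders above and draw the sprite with them.
ShaderProgram whiteShader = new ShaderProgram(
        Gdx.files.internal("white.vert"),
        Gdx.files.internal("white.frag"));
if (!whiteShader.isCompiled()) {
    Gdx.app.error("shader", whiteShader.getLog());
}
batch.setShader(whiteShader);
batch.begin();
sprite.draw(batch);
batch.end();
batch.setShader(null); // restore SpriteBatch's default shader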

Related

Android OpenGL ES 2.0: How to zoom texture with depth map?

In Android OpenGL ES 2.0, I bind a bitmap and a depth-map bitmap to fragment_shader.glsl:
precision mediump float;
uniform sampler2D sDepth;
uniform sampler2D sTexture;
uniform float time;
varying vec2 varyTexCoord;
void main() {
    vec4 depth = texture2D(sDepth, varyTexCoord);
    gl_FragColor = texture2D(sTexture, varyTexCoord);
}
sTexture: the original bitmap
sDepth: the depth map (ARGB_8888, 0~255, 0 = far, 255 = near)
vertex_shader.glsl:
attribute vec4 vPosition;
attribute vec2 vTexCoord;
uniform mat4 vMatrix;
varying vec2 varyTexCoord;
void main() {
    gl_Position = vPosition;
    varyTexCoord = vTexCoord;
}
Now I want to make a parallax effect: according to the depth value, zoom in on the image, with near areas zooming in more than far areas.
Can you give me some ideas? Thank you.
Create a mesh based on a regular grid, say 10x10, in (x, y) space.
Set the z coordinate from the depth map.
Make uv coordinates.
Render the mesh with your color texture, and use gl_Position = projectionMatrix * scale * vertex; (see the sketch below).
Try different grid dimensions to find the best one.
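A rough shader-side sketch of that recipe (an illustration only: it assumes the depth map can be sampled in the vertex shader, i.e. vertex texture fetch, which is optional in OpenGL ES 2.0, and the attribute/uniform names are made up):
attribute vec2 aGridPos;        // grid vertex in [-1, 1] x [-1, 1]
uniform sampler2D sDepth;       // the depth map from the question
uniform mat4 projectionMatrix;
uniform mat4 scale;             // the zoom; animate this for the parallax
varying vec2 varyTexCoord;
void main() {
    varyTexCoord = aGridPos * 0.5 + 0.5;          // grid position -> uv
    float z = texture2D(sDepth, varyTexCoord).r;  // 0.0 = far, 1.0 = near
    vec4 vertex = vec4(aGridPos, z, 1.0);
    gl_Position = projectionMatrix * scale * vertex;
}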

Why is there an offset when I render this overlay?

I use Vuforia SDK to render the video stream of my phone's camera on the screen.
So the texture is generated by the Vuforia library, not me.
The shaders used to render this background are:
// Vertex Shader
attribute vec4 a_position;
attribute vec2 a_textureCoords;
varying vec2 v_textureCoords;
uniform mat4 u_projectionMatrix;
void main()
{
    gl_Position = u_projectionMatrix * a_position;
    v_textureCoords = a_textureCoords;
}
// Fragment Shader
varying highp vec2 v_textureCoords;
uniform sampler2D u_currentTexture;
void main()
{
    vec4 currentColor = texture2D(u_currentTexture, v_textureCoords);
    gl_FragColor = currentColor;
}
Now, I want an overlay in the upper-left corner of the screen:
I don't want this overlay to display only a pink texture, but rather a multiply blend of the pink texture and the background texture. Note that the textures do not have the same coordinates.
But for now, let's forget about the blending and just render the background texture in the shader program of the pink texture. So in the end, yes, one should see no difference between the background-only version and the background-with-overlay version.
As you can see (look at the painting and the top of the chair), there is a small offset...
The shaders used to render the overlay are:
// Vertex Shader
attribute vec4 a_position;
attribute vec2 a_currentTextureCoords;
varying vec2 v_currentTextureCoords;
void main()
{
    gl_Position = a_position;
    v_currentTextureCoords = a_currentTextureCoords;
}
// Fragment Shader
varying highp vec2 v_currentTextureCoords;
uniform sampler2D u_currentTexture;
uniform sampler2D u_backgroundTexture;
void main()
{
    vec2 screenSize = vec2(1080.0, 1920.0);
    vec2 cameraResolution = vec2(720.0, 1280.0);
    vec2 texelSize = vec2(1.0 / screenSize.x, 1.0 / screenSize.y);
    vec2 scaleFactor = vec2(cameraResolution.x / screenSize.x, cameraResolution.y / screenSize.y);
    vec2 uv = gl_FragCoord.xy * texelSize * scaleFactor;
    uv = vec2(scaleFactor.y - uv.y, scaleFactor.x - uv.x);
    vec4 backgroundColor = texture2D(u_backgroundTexture, uv);
    gl_FragColor = backgroundColor;
}
Are my calculations wrong?
Why do you need this line?
uv = vec2(scaleFactor.y - uv.y, scaleFactor.x - uv.x);
I'm not sure what arithmetic relationship the absolute texture coordinates have with the scale factor that would call for an addition or a subtraction ...
P.S. It's not related to your question, but your shaders will be shorter and easier to read if you just use the vector operations in the language. For example, replace:
vec2 scaleFactor = vec2(cameraResolution.x / screenSize.x, cameraResolution.y / screenSize.y);
... with ...
vec2 scaleFactor = cameraResolution / screenSize;
As long as the vector types are the same length, it will do exactly what you expect with a lot less typing ...
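For what it's worth, a guess rather than a fix: if the goal is simply to sample the background at this fragment's position on screen, flipping y because the camera image is inverted relative to screen space, the usual pattern is:
vec2 uv = gl_FragCoord.xy / screenSize; // [0, 1] across the viewport
uv.y = 1.0 - uv.y;                      // flip vertically if the image is upside down
vec4 backgroundColor = texture2D(u_backgroundTexture, uv);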

GLSL - rippling effect, bad performance on Android

I needed to create a rippling effect for one sprite in my game. Here's the vertex shader:
attribute vec4 a_position; // just taking in the necessary attributes
attribute vec2 a_texCoord0;
uniform mat4 u_projTrans; // combination of view and projection matrix
varying vec2 v_texCoords;
void main() {
    v_texCoords = a_texCoord0;
    gl_Position = u_projTrans * a_position; // as I said, it is a sprite, so no need for a model matrix
}
and here's the fragment:
#ifdef GL_ES
precision mediump float;
#endif
varying vec2 v_texCoords;
uniform sampler2D u_texture; // texture of the sprite
uniform float time;
void main()
{
    vec2 uv;
    if (time > 0.0) { // time is > 0.0 when I want the rippling effect to be applied
        vec2 cPos = -1.0 + 2.0 * v_texCoords.xy; // converting tex coords to -1..1
        float cLength = length(cPos); // taking its length
        uv = v_texCoords.xy + (cPos / cLength) * cos(cLength * 12.0 - time * 4.0) * 0.03; // the rippling displacement
    }
    else {
        uv = v_texCoords.xy; // if I don't want the rippling effect, I use the normal tex coords
    }
    vec4 tex = texture2D(u_texture, uv); // sampling the texture
    gl_FragColor = tex;
}
It all works fine and the performance is fine on PC, but when running it on Android the performance is a lot worse... As you can see, the shaders are trivial, but somehow they are expensive. Anyway, the sprite I draw is about 2000-4000 px wide and 720 px high. Also, when I replace v_texCoords with a different vector (for example vec2(1.0, 1.0)) in the cPos calculation (vec2 cPos = -1.0 + 2.0 * v_texCoords.xy;), the performance improves heavily.
I don't really know what's so expensive there. If anyone has some advice, I'd be happy. Thanks in advance.
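One thing worth trying (a sketch, not an answer from the thread): drop the per-fragment branch and switch shaders on the CPU when the effect is off, so the plain path pays nothing. The names rippleActive, rippleShader and elapsed are illustrative; "time" matches the uniform in the question:
if (rippleActive) {
    batch.setShader(rippleShader);
    batch.begin();
    rippleShader.setUniformf("time", elapsed); // set after begin() so the program is bound
} else {
    batch.setShader(null); // SpriteBatch falls back to its default shader
    batch.begin();
}
sprite.draw(batch);
batch.end();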

Lighting on OpenGL ES sphere not smooth

In OpenGL ES 2.0 for Android, I am drawing a sphere. The sphere appears on the screen as a circle, so I need to add lighting. When I added lighting, instead of being smooth as it would be in real life, there is a sharp line between light and dark, as shown here:
However, I want it to look like this, where the shading is much smoother and blended:
Here is my vertex shader code:
uniform mat4 u_Matrix;
uniform vec3 u_VectorToLight;
attribute vec4 a_Position;
attribute vec3 a_Color;
attribute vec3 a_Normal;
varying vec3 v_Color;
void main() {
    v_Color = a_Color;
    vec3 scaledNormal = a_Normal;
    scaledNormal = normalize(scaledNormal);
    float diffuse = max(dot(scaledNormal, u_VectorToLight), 0.0);
    v_Color *= diffuse;
    float ambient = 0.1;
    v_Color += ambient;
    gl_Position = u_Matrix * a_Position;
}
And my fragment shader:
precision mediump float;
varying vec3 v_Color;
void main() {
    gl_FragColor = vec4(v_Color, 1.0);
}
The normal is calculated by taking the vector from the center of the sphere to the point on the sphere, then normalizing it (giving it a length of 1).
Here is how I set the colors:
vertices[offset++] = Color.red(color);
vertices[offset++] = Color.green(color);
vertices[offset++] = Color.blue(color);
Where color is 0xffea00.
The problem is with the range of color values you use. OpenGL operates with color component values in the range [0.0, 1.0]. But you are specifying colors in the range [0, 255] in your Java code.
You have two options to fix this:
Divide the color values you get from the Color class by 255.0f (see the sketch below).
Specify the colors with type GL_UNSIGNED_BYTE. To do this, store the values in an array/buffer with element type byte, store those values in a VBO, and then set up the vertex attribute with:
glVertexAttribPointer(attrLoc, 3, GL_UNSIGNED_BYTE, GL_TRUE, stride, 0);
Note the value for the 4th argument. While it does not matter for GL_FLOAT attributes, it is critical that you use GL_TRUE in this case, because the byte values need to be normalized.
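The first option, applied to the snippet above:
// Normalize the [0, 255] values from Color to OpenGL's [0.0, 1.0] range.
vertices[offset++] = Color.red(color) / 255.0f;
vertices[offset++] = Color.green(color) / 255.0f;
vertices[offset++] = Color.blue(color) / 255.0f;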

Send alpha value to fragment shader

I have a fragment shader:
precision mediump float;
uniform vec4 vColor;
uniform sampler2D u_Texture;  // the input texture
varying vec2 v_TexCoordinate; // interpolated texture coordinate per fragment
varying vec4 vAlpha;
void main() {
    gl_FragColor = vec4(texture2D(u_Texture, v_TexCoordinate).xyz, texture2D(u_Texture, v_TexCoordinate).w * vAlpha[3]);
}
And a vertex shader:
attribute vec3 vPosition;
attribute vec2 aTextureCoord; // per-vertex texture coordinate information we will pass in
attribute vec4 aAlpha;
uniform mat4 uMVPMatrix; // this declaration was missing from the snippet; it is needed to compile
varying vec2 v_TexCoordinate;
varying vec4 vAlpha;
void main() {
    v_TexCoordinate = aTextureCoord;
    vAlpha = aAlpha;
    gl_Position = uMVPMatrix * vec4(vPosition, 1.0);
}
And I try to set the alpha value for the texture from my program:
private final int mAlphaHandle;
private float[] color = {0.5f};
Set:
mAlphaHandle = GLES20.glGetAttribLocation(mProgram, "aAlpha");
Using:
GLES20.glEnableVertexAttribArray(mAlphaHandle);
GLES20.glVertexAttribPointer(mAlphaHandle, 1, GLES20.GL_FLOAT, false, 4, alphaBuffer);
Blending is enabled.
GLES20.glEnable(GLES20.GL_BLEND);
GLES20.glBlendFunc(GLES20.GL_SRC_ALPHA, GLES20.GL_ONE_MINUS_SRC_ALPHA);
But I see no effect on the screen. So the question is: why do I get no effect on the screen? I actually want to change the alpha value dynamically in the future.
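A likely culprit, noted here since the thread shows no answer: glVertexAttribPointer supplies only 1 float per vertex, so GL fills the missing components of the vec4 aAlpha with defaults (0, 0, 1), and vAlpha[3] therefore reads 1.0 for every fragment. Either read vAlpha[0], supply 4 floats per vertex, or, if the alpha is the same across the whole sprite anyway, use a uniform instead; a sketch with a made-up uniform name:
// In the fragment shader, declare: uniform float uAlpha;
// and multiply by uAlpha instead of vAlpha[3].
int alphaHandle = GLES20.glGetUniformLocation(mProgram, "uAlpha");
GLES20.glUniform1f(alphaHandle, 0.5f); // call while the program is in use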
