I am using code from How can I pass multiple textures to a single shader?. It works fine until I load and bind new textures after my bumpmap image. I can only get this to work if my bumpmap is the last texture loaded and bound, even though all my images are the same size. Here's my code...
public int[] textureIDs = new int[]{1, 2, 3, 4, 5, 6, 7, 8, 9, 10}; //texture image IDs
//load bitmap and bind texture (done 10 times); textureIndex is 1-10
Bitmap bitmap = BitmapFactory.decodeResource(context.getResources(), imageiD);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureIDs[textureIndex]);
GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, bitmap, 0);
bitmap.recycle();
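For context, the loading snippet above omits texture-name generation and filter setup. Below is a minimal sketch of what a complete per-texture load step typically includes; the glGenTextures call and the filter parameters are assumptions, not part of the original code (note that if GL_TEXTURE_MIN_FILTER is never set, the default expects mipmaps and an incomplete texture samples as black in ES 2.0):
// Assumed setup: let GL allocate texture names instead of hard-coding 1-10
int[] textureIDs = new int[10];
GLES20.glGenTextures(10, textureIDs, 0);
// Per-texture load step (repeated for each textureIndex)
Bitmap bitmap = BitmapFactory.decodeResource(context.getResources(), imageiD);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureIDs[textureIndex]);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, bitmap, 0);
bitmap.recycle();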
//render
shaderProgram = GraphicTools.sp_ImageBump;
GLES20.glUseProgram(shaderProgram);
t1 = GLES20.glGetUniformLocation(shaderProgram, "u_texture");
GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, globals.textureIDs[1]);//textureIndex
GLES20.glUniform1i(t1, 0);
t2 = GLES20.glGetUniformLocation(shaderProgram, "u_bumptex");
GLES20.glActiveTexture(GLES20.GL_TEXTURE1);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, globals.textureIDs[2]);//bumpMapIndex);
GLES20.glUniform1i(t2, 1);
GLES20.glActiveTexture(GLES20.GL_TEXTURE0);//added this, it allows me to pass 2 textures to the shaders, otherwise TEXTURE1 is black
//fragment shader
precision mediump float;
uniform sampler2D u_bumptex;
uniform sampler2D u_texture;
varying vec2 v_texCoord;
void main()
{
vec4 bumpColor = texture2D(u_bumptex, v_texCoord);//v_bumpCoord);// get bump map color, just use green channel for brightness
gl_FragColor = texture2D(u_texture, v_texCoord) * bumpColor.g;
}
I have a byte buffer of YUV-NV12 formatted image data. When I try to convert it to RGB, I get an output with a stretched colour (chroma) layer like in the image below.
I followed this great answer, which explains how to convert YUV-NV21 to RGB. Since NV12 is just NV21 with the U and V data swapped, the only change I should need to make is to swap the u and v values in the fragment shader.
Vertex shader:
precision mediump float;
uniform mat4 uMVPMatrix;
attribute vec4 vPosition;
attribute vec4 vTextureCoordinate;
varying vec2 position;
void main()
{
gl_Position = uMVPMatrix * vPosition;
position = vTextureCoordinate.xy;
}
Fragment shader:
precision mediump float;
varying vec2 position;
uniform sampler2D uTextureY;
uniform sampler2D uTextureUV;
void main()
{
float y, u, v;
y = texture2D(uTextureY, position).r;
u = texture2D(uTextureUV, position).a - 0.5;
v = texture2D(uTextureUV, position).r - 0.5;
float r, g, b;
r = y + 1.13983 * v;
g = y - 0.39465 * u - 0.58060 * v;
b = y + 2.03211 * u;
gl_FragColor = vec4(r, g, b, 1.0);
}
I split the image data into two ByteBuffers, mYBuffer and mUVBuffer. mSourceImage is just a Buffer which contains the image data as bytes.
ByteBuffer bb = (ByteBuffer) mSourceImage;
if (bb == null) {
return;
}
int size = mWidth * mHeight;
bb.position(0).limit(size);
mYBuffer = bb.slice();
bb.position(size).limit(bb.remaining());
mUVBuffer = bb.slice();
Generating textures:
GLES20.glGenTextures(2, mTexture, 0);
for(int i = 0; i < 2; i++) {
GLES20.glActiveTexture(GLES20.GL_TEXTURE0 + i);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, mTexture[i]);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
}
Passing buffer data to textures:
mTextureYHandle = GLES20.glGetUniformLocation(mProgramId, "uTextureY");
mTextureUVHandle = GLES20.glGetUniformLocation(mProgramId, "uTextureUV");
GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, mTexture[0]);
GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_LUMINANCE, mWidth, mHeight, 0, GLES20.GL_LUMINANCE, GLES20.GL_UNSIGNED_BYTE, mYBuffer);
GLES20.glUniform1i(mTextureYHandle, 0);
GLES20.glActiveTexture(GLES20.GL_TEXTURE1);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, mTexture[1]);
GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_LUMINANCE_ALPHA, mWidth / 2, mHeight / 2, 0, GLES20.GL_LUMINANCE_ALPHA, GLES20.GL_UNSIGNED_BYTE, mUVBuffer);
GLES20.glUniform1i(mTextureUVHandle, 1);
I couldn't figure out why I'm getting such an output. Any help would be much appreciated.
Never mind, it was a tiny mistake in my code.
When splitting the byte buffer, I had used bb.position(size).limit(bb.remaining()) for the UV buffer. For some reason, bb.remaining() becomes 0 after getting some frames (this is actually a camera preview). Therefore I changed it to bb.position(size).limit(size + size / 2).
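Concretely, the corrected split looks something like this (a sketch reusing the variable names from the question; the interleaved UV plane of NV12 is half the size of the Y plane, hence size + size / 2):
ByteBuffer bb = (ByteBuffer) mSourceImage;
int size = mWidth * mHeight;
bb.position(0).limit(size);                // Y plane: width * height bytes
mYBuffer = bb.slice();
bb.position(size).limit(size + size / 2);  // UV plane: half as many bytes
mUVBuffer = bb.slice();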
Also the assumption I made by reading this,
the only change I should do is to replace u and v values in the fragment shader
appears to be wrong. It turns out that GLES20.GL_LUMINANCE_ALPHA will always put the U byte into the A component of the texture, and the V byte into the R, G and B components (you can use any one of them). Hence there is no need to swap the u and v values in the fragment shader (I have edited my question with the correct fragment shader code).
I will keep the question hoping this would help someone in the future.
I am using OpenGL ES 2.0 shaders to generate fractals. It has worked well, until I decided that black and white is not enough and I need a palette. I pass the palette as a 1D texture to the shader, but all I get is a black screen.
The shader is based on this one, except that the texture passed in is 2D (nx1), because OpenGL ES 2.0 does not allow 1D textures; hence the pixel color is obtained by
gl_FragColor = texture2D(palette, vec2((j == int(iterations) ? 0.0 : float(j)) / iterations, 0.5));
(I am not sure about the 0.5 here).
The relevant texture loading code:
Bitmap bitmap = Bitmap.createBitmap(colors, colors.length, 1, Bitmap.Config.ARGB_8888);
int handle = ShaderUtils.loadTexture(bitmap);
GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, handle);
GLES20.glUniform1i(paletteHandle, handle);
[...]
public static int loadTexture(Bitmap bitmap)
{
final int[] textureHandle = new int[1];
GLES20.glGenTextures(1, textureHandle, 0);
if (textureHandle[0] == 0)
{
throw new RuntimeException("Error generating texture name.");
}
// Bind to the texture in OpenGL
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureHandle[0]);
// Set filtering
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER,
GLES20.GL_NEAREST);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER,
GLES20.GL_NEAREST);
// Load the bitmap into the bound texture.
GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, bitmap, 0);
// Recycle the bitmap, since its data has been loaded into OpenGL.
bitmap.recycle();
return textureHandle[0];
}
Vertex shader:
attribute vec4 vPosition;
void main() {
gl_Position = vPosition;
}
Fragment shader:
precision mediump float;
uniform sampler2D palette;
uniform float centerX;
uniform float centerY;
uniform float scale;
uniform float iterations;
uniform vec2 resolution;
#define maxiter 1024
void main() {
vec2 center = vec2(centerX, centerY);
vec2 coord = vec2(gl_FragCoord.x, gl_FragCoord.y) / resolution;
vec2 c = (coord - center) / scale;
int j = 0;
vec2 z = c;
for(int i = 0; i<maxiter; i++) {
if (float(i) >= iterations) break;
j++;
float x = (z.x * z.x - z.y * z.y) + c.x;
float y = (z.y * z.x + z.x * z.y) + c.y;
if((x * x + y * y) > 4.0) break;
z.x = x;
z.y = y;
}
gl_FragColor = texture2D(palette, vec2((j == int(iterations) ? 0.0 : float(j)) / iterations, 0.5));
// vec3 color = vec3(float(j)/float(iterations));
// gl_FragColor = vec4(color, 1.0);
}
The problem is that this is very hard to debug. From inside the IDE I made sure that the bitmap contains proper data, and there are no OpenGL errors in logcat. The shader works without the texture, so the texture is probably the problem here. What could be the cause?
The value you have to set on the texture sampler uniform is not the "name" of the texture object; it has to be the index of the texture unit:
GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, handle);
// this is wrong
//GLES20.glUniform1i(paletteHandle, handle);
GLES20.glUniform1i(paletteHandle, 0); // 0, because of GLES20.GL_TEXTURE0
See OpenGL 4.6 API Compatibility Profile Specification; 7.10 Samplers; page 154:
Samplers are special uniforms used in the OpenGL Shading Language to identify the texture object used for each texture lookup. The value of a sampler indicates the texture image unit being accessed. Setting a sampler’s value to i selects texture image unit number i.
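So, purely as an illustration, if this program also sampled a second texture bound to texture unit 1, the corresponding call would be GLES20.glUniform1i(otherSamplerHandle, 1), where otherSamplerHandle is a hypothetical uniform location obtained via glGetUniformLocation.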
I have a bitmap on my device (which is a 6x1 cubemap) that I want to render on all the faces of the cube.
InputStream is = getContext().getResources().openRawResource(R.raw.photo);
Bitmap bitmap = BitmapFactory.decodeStream(is);
int bytes = bitmap.getByteCount();
ByteBuffer pixels = ByteBuffer.allocate(bytes);
bitmap.copyPixelsToBuffer(pixels);
Here is my vertex shader:
uniform mat4 uMVPMatrix;
uniform mat4 uSTMatrix;
attribute vec4 aPosition;
attribute vec4 aTextureCoord;
attribute vec4 aColor;
varying vec2 vTextureCoord;
varying vec4 vColor;
void main() {
gl_Position = uMVPMatrix * aPosition;
vTextureCoord = (uSTMatrix * aTextureCoord).xy;
vColor = aColor;
}
Here is my fragment shader:
precision mediump float;
varying vec2 vTextureCoord;
varying vec4 vColor;
uniform samplerCube sTexture;
void main() {
gl_FragColor = textureCube(sTexture, vec3(vTextureCoord, 1.0)) * vColor;
}
And here is what I am doing in my renderer in onSurfaceCreated():
GLES20.glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
int[] texIds = new int[1];
GLES20.glGenTextures(1, texIds, 0);
m360PhotoTextureId = texIds[0];
GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
GLES20.glBindTexture(
GLES20.GL_TEXTURE_CUBE_MAP,
mTextureId);
for (int i = 0 ; i < 6 ; i++ ){
pixels.position(0);
GLES20.glTexImage2D(
GLES20.GL_TEXTURE_CUBE_MAP_POSITIVE_X + i,
0,
GLES20.GL_RGBA,
1,
1,
0,
GLES20.GL_RGBA,
GLES20.GL_UNSIGNED_BYTE,
pixels);
}
GLES20.glTexParameteri(
GLES20.GL_TEXTURE_CUBE_MAP,
GLES20.GL_TEXTURE_MIN_FILTER,
GLES20.GL_LINEAR);
GLES20.glTexParameteri(
GLES20.GL_TEXTURE_CUBE_MAP,
GLES20.GL_TEXTURE_MAG_FILTER,
GLES20.GL_LINEAR);
GLES20.glTexParameteri(
GLES20.GL_TEXTURE_CUBE_MAP,
GLES20.GL_TEXTURE_WRAP_S,
GLES20.GL_CLAMP_TO_EDGE);
GLES20.glTexParameteri(
GLES20.GL_TEXTURE_CUBE_MAP,
GLES20.GL_TEXTURE_WRAP_T,
GLES20.GL_CLAMP_TO_EDGE);
GLES20.glBindTexture(GLES20.GL_TEXTURE_CUBE_MAP, 0);
All I see is a black screen in my texture view, when I expect to see the photo (in pixels) rendered on all the faces of the cube.
Any pointers or help would be appreciated.
I tried:
Vertex shader:
uniform mat4 uMVPMatrix;
attribute vec4 aPosition;
varying vec3 vTextureCoord;
void main() {
gl_Position = uMVPMatrix * aPosition;
vTextureCoord = aPosition.xyz;
}
Fragment shader:
precision mediump float;
varying vec3 vTextureCoord;
uniform samplerCube sTexture;
void main() {
gl_FragColor = textureCube(sTexture, vTextureCoord);
}
But I get the same black screen.
A cubemap texture is a texture whose images represent the faces of a cube. The "texture coordinate" for a cubemap texture is the vector direction from the center of the cube which points to the color you want to use.
You are trying to use a regular old 2D texture coordinate, with a third component probably added to silence the compiler. You must provide directions, not 2D coordinates. You can generate them in the vertex shader from your position. But that requires knowing what space aPosition is, and you didn't tell me. So I can't show you how to do that.
Regardless, the vertex shader needs to be providing a 3D direction for the texture coordinate. It should either be generated or passed from a VS input.
Note that your program may have other problems. But this is the problem that can be deduced from your code.
I'm trying to render a simple 3D scene. It's basically a surface wrapped in a texture. I've tested it on 4 devices and only one (Galaxy Note 4) renders the texture properly. The other three phones (HTC Evo 3D, LG G2, Sony Xperia Z1) render the texture with all texels in one color, which seems to be an average color of the texture. E.g.: original image and rendered texture.
My first guess was that there's something wrong with my fragment shader. But it's very basic, copied from the book "OpenGL ES 2 for Android":
private final String vertexShaderCode =
"uniform mat4 uMVPMatrix;"+
"attribute vec4 vPosition;"+
"attribute vec2 a_TextureCoordinates;"+
"varying vec2 v_TextureCoordinates;"+
"void main()"+
"{"+
"v_TextureCoordinates = a_TextureCoordinates;"+
"gl_Position = uMVPMatrix * vPosition;"+
"}";
private final String fragmentShaderCode =
"precision mediump float;"+
"uniform sampler2D u_TextureUnit;"+
"varying vec2 v_TextureCoordinates;"+
"void main()"+
"{"+
"gl_FragColor = texture2D(u_TextureUnit, v_TextureCoordinates);"+
"}";
Loading texture:
texCoordHandle = glGetAttribLocation(program, "a_TextureCoordinates");
texUnitHandle = glGetUniformLocation(program, "u_TextureUnit");
texHandle = new int[1];
glGenTextures(1, texHandle, 0);
BitmapFactory.Options opt = new BitmapFactory.Options();
opt.inScaled = false;
Bitmap bitmap = BitmapFactory.decodeResource(context.getResources(),
R.drawable.way_texture_pink_square_512, opt);
glBindTexture(GL_TEXTURE_2D, texHandle[0]);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
texImage2D(GL_TEXTURE_2D, 0, bitmap, 0);
glGenerateMipmap(GL_TEXTURE_2D);
bitmap.recycle();
glBindTexture(GL_TEXTURE_2D, 0);
I hold the vertices and texture coords in different float buffers, and draw them in a loop:
glUseProgram(program);
glUniformMatrix4fv(mvpMatrixHandle, 1, false, vpMatrix, 0);
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, texHandle[0]);
glUniform1i(texUnitHandle, 0);
glEnableVertexAttribArray(positionHandle);
WaySection[] ws = waySects.get(currWaySect);
for( int i = 0 ; i < ws.length ; i++ ) {
glVertexAttribPointer(
positionHandle, COORDS_PER_VERTEX,
GL_FLOAT, false,
vertexStride, ws[i].vertexBuffer);
glVertexAttribPointer(texCoordHandle, COORDS_PER_TEX_VER,
GL_FLOAT, false,
texStride, ws[i].texBuffer);
glDrawArrays(GL_TRIANGLE_FAN, 0,
ws[i].vertices.length / COORDS_PER_VERTEX);
}
glDisableVertexAttribArray(positionHandle);
What I've done:
Changed textures, tried different sizes (power of two and not), different alpha.
Tried switching on/off different options, like: GL_CULL_FACE, GL_BLEND, GL_TEXTURE_MIN(MAG)_FILTER, GL_TEXTURE_WRAP...
Checked for OpenGL errors in every possible place. glGetError() returns no error.
Checked the code in debugger. The textures converted into Bitmap looked fine before and after loading them into OpenGL.
Help, please :)
Problem solved. The erratic behavior was caused by the lack of a glEnableVertexAttribArray(texCoordHandle); call in onDraw(). The Note 4 was OK with that; it required only positionHandle (the vertices) to be enabled. The other phones apparently weren't too happy. Maybe because the Note 4 supports GL ES 3.1?
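For reference, a minimal sketch of the fix in onDraw(), using the handles from the code above:
glEnableVertexAttribArray(positionHandle);
glEnableVertexAttribArray(texCoordHandle);   // this call was the missing piece
// ... the glVertexAttribPointer / glDrawArrays loop from the question ...
glDisableVertexAttribArray(texCoordHandle);
glDisableVertexAttribArray(positionHandle);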
Thanks for your time.
I'm trying to draw multiple hexagons on the screen that have an alpha channel. The image is this:
So, I load the texture into the program and that's OK. When it runs, the alpha channel is blended with the background color and that's OK too, but when two hexagons overlap, the overlapped part becomes the color of the background! Below is the picture:
Of course, this is not the effect that I expected. I want them to overlap without this background being drawn over the other texture. Here is my code for drawing:
GLES20.glUseProgram(Program);
hVertex = GLES20.glGetAttribLocation(Program,"vPosition");
hColor = GLES20.glGetUniformLocation(Program, "vColor");
uTexture = GLES20.glGetUniformLocation(Program, "u_Texture");
hTexture = GLES20.glGetAttribLocation(Program, "a_TexCoordinate");
hMatrix = GLES20.glGetUniformLocation(Program, "uMVPMatrix");
GLES20.glVertexAttribPointer(hVertex, 3, GLES20.GL_FLOAT, false, 0, bVertex);
GLES20.glEnableVertexAttribArray(hVertex);
GLES20.glUniform4fv(hColor, 1, Color, 0);
GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, hTexture);
GLES20.glUniform1i(uTexture, 0);
GLES20.glVertexAttribPointer(hTexture, 2, GLES20.GL_FLOAT, false, 0, bTexture);
GLES20.glEnableVertexAttribArray(hTexture);
GLES20.glBlendFunc(GLES20.GL_ONE, GLES20.GL_ONE_MINUS_SRC_ALPHA);
GLES20.glEnable(GLES20.GL_BLEND);
x=-1;y=0;z=0;
for (int i=0;i<10;i++) {
Matrix.setIdentityM(ModelMatrix, 0);
Matrix.translateM(ModelMatrix, 0, x, y, z);
x+=0.6f;
Matrix.multiplyMM(ModelMatrix, 0, ModelMatrix, 0, ProjectionMatrix, 0);
GLES20.glUniformMatrix4fv(hMatrix, 1, false, ModelMatrix, 0);
GLES20.glDrawElements(GLES20.GL_TRIANGLES, DrawOrder.length, GLES20.GL_UNSIGNED_SHORT, bDrawOrder);
}
GLES20.glDisable(GLES20.GL_BLEND);
GLES20.glDisableVertexAttribArray(hVertex);
}
And my fragment shader:
public final String fragmentShaderCode =
"precision mediump float;" +
"uniform vec4 vColor;" +
"uniform sampler2D u_Texture;" +
"varying vec2 v_TexCoordinate;" +
"void main() {" +
" gl_FragColor = vColor * texture2D(u_Texture, v_TexCoordinate);" +
"}";
and my renderer code:
super(context);
setEGLContextClientVersion(2);
getHolder().setFormat(PixelFormat.TRANSLUCENT);
setEGLConfigChooser(8, 8, 8, 8, 8, 8);
renderer = new GLRenderer(context);
setRenderer(renderer);
I already tried different functions with glBlendFunc but nothing seems to work. Does anyone know what the problem is? I'm really lost. If you need any more code, just ask!
Thank you!
My guess is that you need to disable the depth test when drawing these. Since they all appear at the same depth, when you draw your leftmost ring, it writes into the depth buffer for every pixel in the quad, even the transparent ones.
Then when you draw the next quad to the right, the pixels which overlap don't get drawn because they fail the depth test, so you just get a blank area where it intersects with the first quad.
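In code, the suggestion amounts to something like this around the blended draws (a sketch; glDepthMask(false) is the alternative if you want to keep the depth test but stop the transparent hexagons from writing depth):
GLES20.glDisable(GLES20.GL_DEPTH_TEST);   // or: GLES20.glDepthMask(false);
GLES20.glEnable(GLES20.GL_BLEND);
GLES20.glBlendFunc(GLES20.GL_ONE, GLES20.GL_ONE_MINUS_SRC_ALPHA);
// ... draw the hexagons ...
GLES20.glDisable(GLES20.GL_BLEND);
GLES20.glEnable(GLES20.GL_DEPTH_TEST);    // restore state, or GLES20.glDepthMask(true);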