LibGDX SpriteBatch Multitexture possible? - android

I'm happily using the SpriteBatch class of the LibGDX Framework.
My aim is to modify the representation of the sprite through a shader.
batch = new SpriteBatch(2, shaderProgram);
I copied the default shader from the SpriteBatch class and added another sampler2D uniform:
+ "uniform sampler2D u_Texture2;\n"//
Is there a working way to pass the second texture to the shader? Doing it like this always ends up in a screen showing nothing but the clear color.
batch.begin();
texture2.bind(1);
shaderProgram.setUniformi("u_Texture2", 1);
batch.draw(spriteTexture,positions[0].x,positions[0].y);
batch.draw(spriteTexture,positions[1].x,positions[1].y);
batch.end();
Each texture alone works. Drawing manually with the help of the Mesh class works as expected. So what can I do to use the convenience of SpriteBatch?
Thanks for any help.

I guess the problem is related to texture bindings. SpriteBatch assumes that the active texture unit is 0, so it calls
lastTexture.bind(); instead of lastTexture.bind(0);
The problem is that the active unit you leave it with is 1 (as you call texture2.bind(1); in your code), so texture unit 0 is never bound, and maybe that's what causes the blank screen.
I would add a Gdx.gl.glActiveTexture(GL20.GL_TEXTURE0); before the draw calls. I'm not quite sure it will solve the problem, but it's a start!
EDIT:
I tried my suggested solution and it works! :D To be clearer, you should be doing this:
batch.begin();
texture2.bind(1);
shaderProgram.setUniformi("u_Texture2", 1);
Gdx.gl.glActiveTexture(GL20.GL_TEXTURE0); // required by SpriteBatch: it needs
// texture unit 0 active, so that when it calls bind(), it has the desired effect.
batch.draw(spriteTexture,positions[0].x,positions[0].y);
batch.draw(spriteTexture,positions[1].x,positions[1].y);
batch.end();
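For completeness, here is what the full modified fragment shader could look like, kept as a Java string constant the same way SpriteBatch builds its default shader. The multiply blend of the two samples is just an assumption for illustration; substitute whatever combination your effect needs:

```java
public class TwoTextureShader {
    // SpriteBatch's default fragment shader, extended with a second sampler.
    // u_texture is bound to unit 0 by SpriteBatch itself; u_Texture2 must be
    // bound to unit 1 manually, as in the snippet above.
    public static final String FRAGMENT = ""
            + "#ifdef GL_ES\n"
            + "precision mediump float;\n"
            + "#endif\n"
            + "varying vec4 v_color;\n"
            + "varying vec2 v_texCoords;\n"
            + "uniform sampler2D u_texture;\n"
            + "uniform sampler2D u_Texture2;\n"
            + "void main() {\n"
            + "  vec4 base  = texture2D(u_texture, v_texCoords);\n"
            + "  vec4 extra = texture2D(u_Texture2, v_texCoords);\n"
            + "  gl_FragColor = v_color * base * extra;\n" // hypothetical blend
            + "}\n";

    public static void main(String[] args) {
        // Sanity check: the second sampler is declared in the source.
        System.out.println(FRAGMENT.contains("uniform sampler2D u_Texture2;"));
    }
}
```

Pass this string (together with the unchanged default vertex shader) to the ShaderProgram you hand to the SpriteBatch constructor.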

Related

How do you get a texture to appear on a plane in ARcore?

I have been working with the ARCore demo code provided by Google in Android Studio; I would like to avoid using Unity if I can to complete this task.
By default the plane is shown as white triangles, with the negative space transparent. I would like to change that plane to be a texture that can be tiled throughout the environment, for example a grass texture.
The default image the plane uses is a file called trigrid.png and that is defined in the HelloArActivity.java.
https://github.com/google-ar/arcore-android-sdk/blob/master/samples/java_arcore_hello_ar/app/src/main/java/com/google/ar/core/examples/java/helloar/HelloArActivity.java
I tried to replace that with an image file that was just a grass texture, named floor.png, but the plane just appears all white and doesn't display the grass at all.
try {
    mPlaneRenderer.createOnGlThread(/*context=*/this, "floor.png");
} catch (IOException e) {
    Log.e(TAG, "Failed to read plane texture");
}
I have tried adding
GLES20.glEnable(GLES20.GL_BLEND);
in the drawPlanes function, but that didn't seem to help. I also commented out some of the color changes in drawPlanes:
//GLES20.glClearColor(1, 1, 1, 1);
//GLES20.glColorMask(false, false, false, true);
//GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
//GLES20.glColorMask(true, true, true, true);
I'm not sure what is required to make the texture show. It could have to do with the plane_fragment.shader file, but I don't have any experience with shaders.
Any insight would be helpful.
The shaders are very important. If you want to do any graphics programming in OpenGL, you need to know at least a little bit about shaders. They are programs that run on the graphics processing unit (GPU) to determine the color of every pixel of every frame. A quick intro to shaders is here: https://youtu.be/AyNZG_mqGVE.
To answer your question, you can use a new fragment shader which just draws your texture and does not mix in other colors. This is a quick and dirty solution; in the long term you will definitely want to clean up the code so it no longer references the uniform variables that are no longer used.
Specifically:
Create a new file called plane_simple_fragment.shader in the src/main/res/raw directory (it needs to be under res/raw so it is accessible as R.raw.plane_simple_fragment).
Open it in the editor and add the following code:
uniform sampler2D u_Texture;
varying vec3 v_TexCoordAlpha;

void main() {
    gl_FragColor = texture2D(u_Texture, v_TexCoordAlpha.xy);
}
Then in PlaneRenderer change to the new shader by replacing R.raw.plane_fragment with R.raw.plane_simple_fragment.

Libgdx mask by shader with different textures

I'm following this tutorial
https://github.com/mattdesl/lwjgl-basics/wiki/ShaderLesson4
to implement a masking technique in libgdx. Everything works fine with the example code.
But the example code only uses a single set of (background, mask, foreground) textures. When I apply it to my case with different sets of textures, I cannot get it to work unless I call batch.flush() after each batch.draw(tex0, ...).
The cycle is:
batch.begin();

tex1 = foreground1;
tex2 = mask1;
tex1.bind(1);
tex2.bind(2);
Gdx.gl.glActiveTexture(GL20.GL_TEXTURE0);
batch.draw(tex0, ...);
batch.flush();

tex1 = foreground2;
tex2 = mask2;
tex1.bind(1);
tex2.bind(2);
Gdx.gl.glActiveTexture(GL20.GL_TEXTURE0);
batch.draw(tex0, ...);
batch.flush();

batch.end();
If I don't use batch.flush(), all masks and foregrounds come out identical. But using batch.flush() severely reduces performance when I test on my Android phone.
Thanks in advance, any help is highly appreciated

Unexpected gl_stack_underflow error

I am working on a small game which uses OpenGL.
I was getting a gl_stack_underflow error. I have gone through the code, and there is one glPushMatrix for each glPopMatrix. Any ideas what else could be causing this error?
Did you maybe do a
glMatrixMode(GL_MODELVIEW);
/* ... */
glPushMatrix();
"balanced" by a
glMatrixMode(GL_PROJECTION);
/* ... */
glPopMatrix();
? It matters which matrix stack is active at the time of a push/pop operation.
Anyway, you shouldn't use the OpenGL built-in matrix operations at all. Use something like GLM, Eigen or linmath.h to build the matrices as part of your program's data structures, and just load the matrices you require with glLoadMatrix or, when you finally go for shaders, glUniform.
And no, the OpenGL built-in matrix operations are not GPU accelerated, so there's no benefit at all in using them.
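The pitfall can be seen without any GL at all. Below is a toy model of the per-mode matrix stacks (two Java deques standing in for the real driver-side stacks; an assumption for illustration, not actual GL code): pushing while MODELVIEW is active and popping while PROJECTION is active drains the wrong stack, which is exactly the mismatch that produces GL_STACK_UNDERFLOW even though every push has a matching pop.

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Toy model of OpenGL's per-mode matrix stacks. Not real GL: the deques stand
// in for the driver-side stacks, and glPopMatrix returns false where real GL
// would record GL_STACK_UNDERFLOW.
public class MatrixStacks {
    enum Mode { MODELVIEW, PROJECTION }

    private final Deque<float[]> modelview = new ArrayDeque<>();
    private final Deque<float[]> projection = new ArrayDeque<>();
    private Mode mode = Mode.MODELVIEW;

    void glMatrixMode(Mode m) { mode = m; }

    private Deque<float[]> current() {
        return mode == Mode.MODELVIEW ? modelview : projection;
    }

    void glPushMatrix() { current().push(new float[16]); }

    boolean glPopMatrix() { // false == GL_STACK_UNDERFLOW
        if (current().isEmpty()) return false;
        current().pop();
        return true;
    }

    public static void main(String[] args) {
        MatrixStacks gl = new MatrixStacks();
        gl.glMatrixMode(Mode.MODELVIEW);
        gl.glPushMatrix();                    // push on the MODELVIEW stack
        gl.glMatrixMode(Mode.PROJECTION);
        System.out.println(gl.glPopMatrix()); // false: PROJECTION stack is empty
        gl.glMatrixMode(Mode.MODELVIEW);
        System.out.println(gl.glPopMatrix()); // true: this pop balances the push
    }
}
```

The push and pop are "balanced" in the source, yet the pop hits an empty stack because a glMatrixMode call slipped in between.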

How to properly mix drawing calls and changes of a sampler value with a single shader program?

I'm trying to draw two objects using two different textures with one shader program in OpenGL ES 2.0 for Android. The first object should have texture0 and the second should have texture1.
In the fragment shader I have:
uniform sampler2D tex;
and in the Java code:
int tiu0 = 0;
int tiu1 = 1;
int texLoc = glGetUniformLocation(program, "tex");
glUseProgram(program);
// bind texture0 to texture image unit 0
glActiveTexture(GL_TEXTURE0 + tiu0);
glBindTexture(GL_TEXTURE_2D, texture0);
// bind texture1 to texture image unit 1
glActiveTexture(GL_TEXTURE0 + tiu1);
glBindTexture(GL_TEXTURE_2D, texture1);
glUniform1i(texLoc, tiu0);
// success: glGetError returns GL_NO_ERROR, glGetUniformiv returns 0 for texLoc
drawFirstObject(); // should have texture0
glUniform1i(texLoc, tiu1);
// success: glGetError returns GL_NO_ERROR, glGetUniformiv returns 1 for texLoc
drawSecondObject(); // should have texture1
Running on a Samsung Galaxy Ace with Android 2.3.3, both objects get texture0. Similar code runs correctly in OpenGL 2.0 on my desktop computer.
If I remove drawFirstObject, the second object will have texture1.
If I remove drawSecondObject, the first object will have texture0.
If somewhere between drawFirstObject and drawSecondObject I change the program for a while:
glUseProgram(0); // can be also any valid program other than the program from the next call
glUseProgram(program);
then both objects will have texture1.
Values of uniforms different from sampler2D are always set correctly.
I know I can draw the two objects with different textures using only one texture image unit and binding appropriate texture to that texture image unit before drawing the object, but I also want to know what's going on here.
Is something wrong with my code? Is it possible in OpenGL ES 2.0 to draw the objects with different textures by only switching between texture image units, as shown in the code? If it's impossible, is that difference between OpenGL 2.0 (where it's possible) and OpenGL ES 2.0 documented anywhere? I can't find it.
After hours of further research I've found out that this problem is specific to the Adreno 200 GPU used in my Samsung Galaxy Ace (GT-S5830). It seems like the Adreno 200 driver assigns the texture to the sampler in the first call to a drawing function and after that ignores any changes to the sampler value (glUniform1i(samplerLocation, textureImageUnit)) until one of two things occurs:
glUseProgram is called with a different shader program,
a different texture is bound to any texture image unit used by the shader program.
There's a thread in the forums of the manufacturer of Adreno 200 GPU describing the very same problem.
So if you call drawing functions several times with the same shader program and with different textures bound before, there are two workarounds to the described problem:
Call glUseProgram(0); glUseProgram(yourDrawingProgram); before every drawing function.
Before every drawing call, bind a different texture to at least one texture image unit used by your shader program. This solution can be difficult to maintain, because if you bind a texture that is already bound to that texture image unit, the problem remains. So in this case the easiest solution is to not change sampler values at all, and instead bind the textures for all texture image units used by the shader program before every drawing call.
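The described latching behaviour is easy to model in plain Java. The sketch below is a toy simulation (an assumption for illustration, not actual driver code); it reproduces the symptom from the question and shows how workaround 1 fixes it:

```java
// Toy model of the Adreno 200 sampler latch described above. pendingSampler is
// what glUniform1i last wrote; latchedSampler is what a draw actually uses.
public class Adreno200Model {
    private int pendingSampler = 0;
    private Integer latchedSampler = null;

    void glUniform1i(int textureImageUnit) { pendingSampler = textureImageUnit; }

    // Stands in for glUseProgram(0); glUseProgram(program); -- clears the latch.
    void rebindProgram() { latchedSampler = null; }

    int draw() { // returns the texture image unit actually sampled
        if (latchedSampler == null) latchedSampler = pendingSampler;
        return latchedSampler;
    }

    public static void main(String[] args) {
        Adreno200Model gl = new Adreno200Model();
        gl.glUniform1i(0);
        System.out.println(gl.draw()); // 0 -- first object gets texture0
        gl.glUniform1i(1);
        System.out.println(gl.draw()); // 0 -- the uniform change is ignored!
        gl.rebindProgram();            // workaround 1: cycle glUseProgram
        System.out.println(gl.draw()); // 1 -- now texture1 is sampled
    }
}
```

Without the rebind, both draws sample unit 0, which is exactly the "both objects have texture0" symptom from the question.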

Android test with OpenGL ES Shaders

I have written a test class that should only paint the application icon onto the screen. So far no luck. What am I doing wrong?
public class GLTester
{
    void test(final Context context)
    {
        BitmapFactory.Options options = new BitmapFactory.Options();
        options.inScaled = false;
        bitmap = BitmapFactory.decodeResource(context.getResources(), R.drawable.icon, options);
        setupGLES();
        createProgram();
        setupTexture();
        draw();
    }

    void draw()
    {
        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
        glUseProgram(glProgramHandle);
    }
}
A couple of things.
I presume your squareVertices buffer is supposed to contain 4 vec3s, but your shader is set up for vec4s. Maybe this is OK, but it seems odd to me.
You are also not setting up any sort of projection matrix with glFrustum or glOrtho, nor any sort of viewing matrix with something like Matrix.setLookAtM. You should always try to keep the vertex pipeline in mind. Look at slide 2 from this lecture: https://wiki.engr.illinois.edu/download/attachments/195761441/3-D+Transformational+Geometry.pptx?version=2&modificationDate=1328223370000
I think what is happening is that your squareVertices are going through this pipeline and coming out on the other side as pixel coordinates. So your image is probably a very tiny speck in the corner of your screen, since you are using vertices from -1.0 to 1.0.
As a shameless side note, I posted some code on SourceForge that makes it possible to keep shaders in files in your assets folder, and work on them there, instead of embedding them in your .java files as strings: https://sourceforge.net/projects/androidopengles/
There's an example project in the files section that uses this shader helper.
I hope some part of this rambling was helpful. :)
It looks pretty good, but I see that, for one thing, you are calling glUniform inside setupTexture while the shader is not currently bound. You should only call glUniform after calling glUseProgram.
I don't know if this is the problem, because I would guess it would probably default to 0 anyway, but I don't know for sure.
Other than that, you should get familiar with calling glGetError to check whether any error conditions are pending.
Also, when creating shaders, it's a good habit to check their compile status with glGetShaderiv(GL_COMPILE_STATUS), plus glGetShaderInfoLog if the compile fails, and similarly for programs with glGetProgramiv/glGetProgramInfoLog.
