Libgdx: Framebuffer for "Fog of War" effect - Android

I am writing an RTS game for Android and I want a "Fog of War" effect around the player's units: only a "circle" around each unit reveals the background map, while everywhere no player unit is located the screen should be black. I don't want to use shaders.
I have a first version of it working. I render the map to the default framebuffer, and I have a second FrameBuffer (similar to lighting techniques) which is cleared to black. At each player unit's position I then batch-draw a texture that is completely transparent except for a white circle with blurred edges in its middle.
Finally I draw the second (light) FrameBuffer's color texture over the first one using Gdx.gl.glBlendFunc(GL20.GL_DST_COLOR, GL20.GL_ZERO);
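(For reference: glBlendFunc(GL20.GL_DST_COLOR, GL20.GL_ZERO) multiplies the two buffers per channel, result = src * dst, so the white circles in the light texture let the map show through unchanged while the black areas stay black.)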
The visual effect is that the whole map is indeed black and a circle around my units is visible, but a lot of white is added to it.
The reason is pretty clear, since I draw the light textures for the units like this:
lightBuffer.begin();
Gdx.gl.glBlendFunc(GL20.GL_SRC_ALPHA, GL20.GL_ONE);
Gdx.gl.glEnable(GL20.GL_BLEND);
Gdx.gl.glClearColor(0.1f, 0.1f, 0.1f, 1f);
Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
batch.setProjectionMatrix(camera.combined);
batch.begin();
batch.setColor(1f, 1f, 1f, 1f);
for (RTSTroopAction i : unitList) {
    batch.draw(lightSprite, i.getX() + (i.getWidth() / 2) - 230, i.getY() + (i.getHeight() / 2) - 230, 460, 460); //, 0, 0, lightSprite.getWidth(), lightSprite.getHeight(), false, true);
}
batch.end();
lightBuffer.end();
However, I don't want the "white stuff" added to the original texture, I just want the original background to shine through. How can I achieve that?
I think it comes down to the blend functions, but I have not been able to figure out which values to use yet.

Thanks to Tenfour04 pointing me in the right direction, I was able to find the solution. First of all, the problem is not directly within batch.end(). The problem is that a SpriteBatch maintains its own blend function settings, and these get applied when flush() is called (end() calls it as well).
However, the batch also calls flush() when it draws a TextureRegion that is bound to a different texture than the one used in the previous draw() call.
So in my original code, whatever blend function I had set via Gdx.gl was always overridden when I called batch.draw(lightBuffer, ...). The solution is to use the SpriteBatch's setBlendFunction() instead of Gdx.gl.glBlendFunc().
The complete working code finally looks like this:
lightBuffer.begin();
Gdx.gl.glBlendFunc(GL20.GL_SRC_ALPHA, GL20.GL_ONE_MINUS_SRC_ALPHA);
Gdx.gl.glEnable(GL20.GL_BLEND);
// start rendering to the lightBuffer
// set the ambient color values, this is the "global" light of your scene
Gdx.gl.glClearColor(0.1f, 0.1f, 0.1f, 1f);
Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
// start rendering the lights
batch.setProjectionMatrix(camera.combined);
batch.begin();
// set the color of your light (red,green,blue,alpha values)
batch.setColor(1f, 1f, 1f, 1f);
for (RTSTroopAction i : unitList) {
    if (i.getOwnerId() == game.getCallback().getPlayerId()) {
        batch.draw(lightSprite, i.getX() + (i.getWidth() / 2) - 230, i.getY() + (i.getHeight() / 2) - 230, 460, 460); //, 0, 0, lightSprite.getWidth(), lightSprite.getHeight(), false, true);
    }
}
batch.end();
lightBuffer.end();
// now we render the lightBuffer to the default "frame buffer"
// with the right blending !
Gdx.gl.glEnable(GL20.GL_BLEND);
Gdx.gl.glBlendFunc(GL20.GL_ZERO, GL20.GL_SRC_COLOR);
batch.setProjectionMatrix(getStage().getCamera().combined);
batch.enableBlending();
batch.setBlendFunction(GL20.GL_ZERO, GL20.GL_SRC_COLOR);
batch.begin();
Gdx.gl.glEnable(GL20.GL_BLEND);
Gdx.gl.glBlendFunc(GL20.GL_ZERO, GL20.GL_SRC_COLOR);
batch.draw(lightBufferRegion,0, 0, getStage().getWidth(), getStage().getHeight());
batch.end();
batch.setBlendFunction(GL20.GL_SRC_ALPHA, GL20.GL_ONE_MINUS_SRC_ALPHA);
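For completeness, a minimal sketch of how lightBuffer and lightBufferRegion could be set up (the format, size and flip are assumptions, since this part is not shown above). The region is flipped vertically because a FrameBuffer's color texture comes out upside down in libgdx:
lightBuffer = new FrameBuffer(Pixmap.Format.RGBA8888, Gdx.graphics.getWidth(), Gdx.graphics.getHeight(), false); // screen-sized FBO for the light map (assumed setup)
lightBufferRegion = new TextureRegion(lightBuffer.getColorBufferTexture());
lightBufferRegion.flip(false, true); // FBO textures are upside down, so flip the region vertically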

Related

Issue with ShapeRenderer behavior on Android

I'm using ShapeRenderer to draw a circular indicator under a mask texture. Everything works perfectly and as expected on desktop, but when running the same code on Android the shape rendering is always on top. Another strange difference is that the shape rendering calls seem inverted, such that the first shape is drawn on top.
I've reproduced the problem with a June 2020 libgdx nightly build and with the 20/09/2020 nightly build.
Here is the code:
myConstructor() {
    sr = new ShapeRenderer();
    sr.scale(1 + 1.1f * GUI.ZOOM_RATIO, 1 + 1.1f * GUI.ZOOM_RATIO, 0f);
}

draw(Batch g, float arg) {
    g.end(); // I have some batch rendering before
    sr.begin(ShapeType.Filled);
    sr.setColor(Color.CYAN);
    sr.circle(indX, indY, indicatorW + GUI.fit2Density(2));
    sr.setColor(Color.DARK_GRAY);
    sr.arc(indX, indY, indicatorW, 0f, degrees);
    sr.end();
    g.begin();
    g.setColor(1f, 1f, 1f, 1f);
    g.draw(coolDownIndic, indX - coolDownIndic.getWidth() / 2 - GUI.fit2Density(2),
            indY - coolDownIndic.getHeight() / 2);
}
On desktop I see the texture rendered over the arc and the arc over the circle. This order is exactly inverted on Android; you can see the expected behavior in the "Desktop rendering" screenshot and the incorrect behavior in the "Android rendering" one (the purple cloud on top is a particle emitter effect, not linked to the issue).
I guess this is a bug. Note that I'm using the scale method on the ShapeRenderer; I'm not sure if it can be related.
Any help would be appreciated.

Result difference between Android virtual device and real device

The following code draws two different cubes (red/green) in Android Studio via JNI using OpenGL ES. On the virtual device the result is correct; on a real device, however, the result looks strange.
The structure (the 3D model and its 2D projection) is correct, but the color differs from the AVD. In addition it looks like the depth test is not working. What's the problem?
Simply speaking:
AVD: gives the correct result (red and green cube, depth test on, with a specific camera pose)
Real device: gives a strange result (same camera pose as the AVD, but the color is different, there are two green cubes, and the depth test is not working)
float color1[] = {1.0f, 0.0f, 0.0f};
float color2[] = {0.0f, 1.0f, 0.0f};
int mColorHandle1;
int mColorHandle2;
glViewport(0,1280,720,1280);
glEnable(GL_DEPTH_TEST);
glUniform4f(mColorHandle1, color1[0], color1[1], color1[2], 1.0f);
glVertexAttribPointer(gvPositionHandle, 3, GL_FLOAT, GL_FALSE, 0, gTriangleVertices1);
glEnableVertexAttribArray(gvPositionHandle);
glDrawArrays(GL_TRIANGLES, 0, 36);
glUniform4f(mColorHandle2, color2[0], color2[1], color2[2], 1.0f);
glDrawArrays(GL_TRIANGLES, 36, 36);
glDisableVertexAttribArray(gvPositionHandle);
glDisable(GL_DEPTH_TEST);

Starfield optimization in libgdx

I want to create a static starfield in libgdx.
My first approach was to create a Decal for each star and a DecalBatch over them.
When I draw a Decal I use a billboarding technique on it:
star.decal.setRotation(camera.direction, camera.up);
Next, I wanted to animate the alphas on the decals, so I randomize them from time to time:
star.decal.setColor(1, 1, 1, 0.6f + ((float) Math.random() * 0.4f));
It works, but my FPS went down from 55 to 25 (because of my 500-1000 stars).
Can I use only one batch call in some way? Maybe a particle material with a single vertex list in GL_POINT mode that always faces the camera?
How can I do this in libgdx?
The Batch is way more complex than what you need: on every frame it copies all the vertices of the sprites into another array and does calculations on them to work out the scale, rotation, etc.
As you suspect, GL_POINT sprites will be much faster; even a mid-range device should be able to render something like 2000 points with different positions and colors at 60 fps.
Here is some old code of mine. It's in C and uses OpenGL ES 1.1; there is probably a simpler way to do it in libgdx (see the sketch after the snippet).
glDisableClientState(GL_COLOR_ARRAY);
glDisableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_COLOR_ARRAY);
glEnable (GL_POINT_SPRITE_OES);
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, TXTparticle);
glTexEnvi(GL_POINT_SPRITE_OES, GL_COORD_REPLACE_OES, GL_TRUE);
glPointSize(30);
glColorPointer(4, GL_FLOAT, 32, particlesC);//particlesC the vertices color
glVertexPointer(3, GL_FLOAT, 24, particlesV);//particlesV the vertices
glDrawArrays(GL_POINTS, 0, vertvitLenght/6);
glDisable( GL_POINT_SPRITE_OES );
glDisable(GL_TEXTURE_2D);
glDisableClientState(GL_COLOR_ARRAY);
glDisableClientState(GL_VERTEX_ARRAY);
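For a libgdx-only version of the same idea, a rough, untested sketch could put all stars into a single Mesh and render it as GL_POINTS in one draw call (the attribute names, the starVertices array and the custom ShaderProgram are assumptions; the vertex shader must also set gl_PointSize, since GLES 2.0 has no fixed point size):
// One static mesh: 3 position floats + 1 packed color float per star.
Mesh starMesh = new Mesh(true, starCount, 0,
        new VertexAttribute(Usage.Position, 3, "a_position"),
        new VertexAttribute(Usage.ColorPacked, 4, "a_color"));
starMesh.setVertices(starVertices); // filled once: x, y, z, packedColor for each star
shader.begin();
shader.setUniformMatrix("u_projTrans", camera.combined);
starMesh.render(shader, GL20.GL_POINTS); // single draw call for the whole starfield
shader.end();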

Android - high fps drop with OpenGL ES 1.1

I am trying to make a simple 2D game. I have noticed that the onDrawFrame() function is called at 60 FPS with vsync. However, for some reason on older devices, if the rendering time exceeds 1 ms it skips a frame every few frames and the animation doesn't look smooth at all. And I am talking about devices that aren't too old...
90% of the drawing code looks like this:
gl.glBindTexture(GL11.GL_TEXTURE_2D, MainProgram.glSurfaceView.renderer.ResourceIdToTexture(R.drawable.tank));
gl.glFrontFace(GL11.GL_CW);
gl.glVertexPointer(3, GL11.GL_FLOAT, 0, vertexBuffer);
gl.glTexCoordPointer(2, GL11.GL_FLOAT, 0, tankTextureBuffer);
float[] glCoords = World.WorldCoordsToGLCoords(tank.GetPosition().PosX(), tank.GetPosition().PosY());
float[] glDims = {0.025f, 0.025f};
gl.glPushMatrix();
gl.glTranslatef(glCoords[0], glCoords[1]+glDims[1], -1.0f);
gl.glScalef(glDims[0], glDims[1], 1.0f);
float[] tc = tank.GetOwner().GetColor().asFloatArray();
gl.glColor4f(tc[0],tc[1],tc[2],tc[3]);
gl.glDrawArrays(GL11.GL_TRIANGLE_STRIP, 0, vertex.length / 3);
gl.glColor4f(1f, 1f, 1f, 1f);
gl.glPopMatrix();
Still, it only takes about 1 millisecond to process all of the drawing, and I have noticed that the FPS (I measured the number of times per second surfaceView.drawFrame is called) drops from 60 to around 30, and it looks almost disgusting.
How do other games achieve such smooth animations even on the oldest devices?
And how do I achieve this with GLES11?

Android OpenGL rendering garbage

I've been writing some simple shaders and I'm encountering an error that happens randomly: when I start rendering my scene, sometimes the mesh is rendered with extra vertices, and if I kill the activity and then open the same activity again it sometimes renders without them.
My guess is that the memory on the GPU is not completely wiped when I kill the activity. What's even weirder is that these extra polygons are sometimes rendered using my shader logic and other times they render as if they were filled with random squares.
It's driving me crazy; I've reviewed all the code, from where I read the .obj file to where I set the vertex attributes. If you have seen this before, please let me know. BTW, I'm using a Motorola Milestone with Android 2.1.
This is the related code, where I create a simple triangle and set the attributes of the vertices:
//This is where I create the mesh
mMesh = new Mesh();
mMesh.setVertices(new float[]{-0.5f, 0f, 0.5f,
0.5f, 0f, -0.5f,
-0.5f, 0f, -0.5f});
ArrayList<VertexAttribute> attributes = new ArrayList<VertexAttribute>();
attributes.add(new VertexAttribute(Usage.Position, 3, ProgramShader.POSITION_ATTRIBUTE));
VertexAttributes vertexAttributes = new VertexAttributes(attributes.toArray(new VertexAttribute[attributes.size()]));
mMesh.setVertexAttributes(vertexAttributes);
...
...
.......
//This is where I send the mesh to opengl
for (VertexAttribute attr : mVertexAttributes.getAttributes().values()) {
    mVertexBuffer.position(attr.offset);
    int handler = shader.getHandler(attr.alias);
    if (handler != -1) {
        try {
            GLES20.glVertexAttribPointer(handler, attr.numComponents, GLES20.GL_FLOAT, false, mVertexAttributes.vertexSize, mVertexBuffer);
            GLES20.glEnableVertexAttribArray(handler);
        } catch (RuntimeException e) {
            Log.d("CG", attr.alias);
            throw e;
        }
    }
}
//(length = 3 for a triangle)
GLES20.glDrawArrays(GLES20.GL_TRIANGLES, 0, length);
Here are some screenshots for you to see the issue:
Screenshots
Also, here is a link to a video I took while running the app on the phone.
Video
So... I found the problem. It was a really dumb thing I was doing:
// On this line I was passing length, where length was the size of the
// vertex float array for the triangle, i.e. "9" (x, y, z for each vertex).
GLES20.glDrawArrays(GLES20.GL_TRIANGLES, 0, length);
// I had to divide that by the number of components per vertex,
// so when a vertex only has position attributes (x, y, z) it is divided by 3;
// when you have more, e.g. normals, it is divided by 6 (x, y, z, normalX, normalY, normalZ).
GLES20.glDrawArrays(GLES20.GL_TRIANGLES, 0, length / mVertexAttributes.vertexNumComponents);
I hope this helps others.
