OpenGL ES Android: why does my point cloud fade during rotate

I load an STL mesh and draw it correctly (using GL_TRIANGLES): it rotates nicely, changes colour, the lights stay in position while the mesh moves, everything is great. Then I switch off the triangles and display just the vertices (using GL_POINTS). Now when I rotate (and even when I display the triangles and the vertices together), the points seem to fade out as I rotate, as if they were lit from only one side.
Does this ring any bells with anyone?
Thanks for any help.
Baz

It may just be a perception artefact. If you still have lighting enabled, the points are of course lit like the triangle vertices (depending on their normals), meaning they actually have an orientation, even if that is not intuitive for points. So they may get darker or brighter as you rotate them into or away from the light. The change just seems more evident because you don't have the other surface points to fill the gaps and compensate for the dimming. Try disabling lighting and they should keep their color when rotating.
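As an illustration, here is a minimal sketch of the fixed-function diffuse term that causes the dimming (the class and method names are made up): each lit point's brightness is max(0, dot(normal, lightDirection)), so a normal rotated away from the light drives the point to black.

```java
// Sketch of the diffuse term that dims lit GL_POINTS:
// brightness = max(0, dot(normal, lightDirection)), both unit vectors.
public class PointDiffuse {

    // Diffuse factor for a unit normal and a unit vector toward the light.
    static double brightness(double nx, double ny, double nz,
                             double lx, double ly, double lz) {
        return Math.max(0.0, nx * lx + ny * ly + nz * lz);
    }

    public static void main(String[] args) {
        // Normal facing the light: fully lit.
        System.out.println(brightness(0, 0, 1, 0, 0, 1)); // 1.0
        // Same point after rotating its normal 90 degrees away: it goes dark.
        System.out.println(brightness(1, 0, 0, 0, 0, 1)); // 0.0
    }
}
```

With `glDisable(GL_LIGHTING)` this term is skipped entirely and the points keep their plain vertex colour regardless of orientation.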

Related

Missing triangles on 3D sphere with Libgdx

I'm trying to build a 3d transparent globe on android (with transparent regions in place of the water regions). I'm doing it by creating a sphere model with Libgdx and then filling it with a .png texture of the earth that has transparent water regions. It works fine, except that after I disable cull face (to be able to see the back face of the sphere), some triangles go missing and the back face vanishes as I rotate the 3d model: Pic1, Pic2. If I rotate the sphere to some other angles it appears to work fine and I can see the back face of the globe without problems.
I put here some relevant code:
render:
Gdx.gl.glViewport(0, 0, Gdx.graphics.getWidth(), Gdx.graphics.getHeight());
Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT | GL20.GL_DEPTH_BUFFER_BIT);
camController.update();
modelBatch.begin(cam);
modelBatch.render(instance, environment);
modelBatch.end();
I've tried all possible values for DepthTestAttribute, but it seems there is no way to get rid of this very strange effect. Please give me some advice; many thanks in advance.
In the case of general geometry, one common approach is to sort the triangles, and render them in back to front order.
However, with the special properties of a sphere, I believe there is a simpler and more efficient approach that should work well. With a sphere, you always have exactly one layer of back-facing triangles and exactly one layer of front-facing triangles. So if you render the back-facing triangles first, followed by the front-facing triangles, you get an order where a triangle rendered earlier is never in front of a triangle rendered later, which is sufficient to get correct transparency rendering.
You already figured out how to render only the front-facing triangles: by enabling culling of the back faces. Rendering only the back-facing triangles is the same thing, except that you cull the front faces. So the code looks like this:
glBlendFunc(...);
glEnable(GL_BLEND);
glEnable(GL_CULL_FACE);
glClear(...);
glCullFace(GL_FRONT);
// draw sphere (only the back-facing triangles are rendered)
glCullFace(GL_BACK);
// draw sphere (only the front-facing triangles are rendered)
This looks like the typical conflict between transparency and depth testing.
A transparent pixel is drawn, but it still writes a value to the depth buffer, causing pixels behind it (which are supposed to be visible) to be discarded due to a failed depth test.
The same happens here: some elements in front are drawn first, writing to the depth buffer even though parts of them are translucent, so the elements behind them are not drawn.
A quick and dirty fix would be to discard pixels with an alpha value below a certain threshold by enabling alpha testing (or discarding them in your shader in later OpenGL versions). However, in most cases this will result in visible artifacts.
A better way would be to sort the individual elements/triangles of the globe from back to front relative to the camera and draw them in that order.
I would also suggest reading the OpenGL wiki article on transparency sorting: https://www.opengl.org/wiki/Transparency_Sorting
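The sorting step can be sketched in a few lines (class and method names here are made up, and view-space coordinates are assumed, with the camera looking down -z so that the most negative z is farthest away):

```java
import java.util.Arrays;
import java.util.Comparator;

// Sketch: order triangles back to front by centroid depth before drawing
// them with blending enabled, so nothing drawn earlier sits in front of
// anything drawn later.
public class DepthSort {

    // tri = {x0,y0,z0, x1,y1,z1, x2,y2,z2} in view space
    static float centroidZ(float[] tri) {
        return (tri[2] + tri[5] + tri[8]) / 3f;
    }

    // Returns a copy ordered farthest-first, ready to draw in that order.
    static float[][] backToFront(float[][] tris) {
        float[][] sorted = tris.clone();
        Arrays.sort(sorted, Comparator.comparingDouble(DepthSort::centroidZ));
        return sorted;
    }
}
```

Note the sort has to be redone whenever the camera moves relative to the model, which is why the sphere's fixed back-faces-then-front-faces order is the cheaper trick when it applies.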

Drawing a 2d convex hull shape on a 3d terrain

I have a 3d mesh, which is a terrain. This runs perfectly fine btw, but I want to have shapes moving across this terrain. These shapes are flat on the landscape and are blob-like: they can change shape and should follow the contours and the heightmap of the terrain. These shapes can be painted on the landscape or flow over it; that doesn't matter.
The shapes are meant to be blocks of armies moving across the map, and this should happen in real time! Also: they are 2d convex hull shapes, and they are just one colour with an alpha value (like blue with alpha 0.25f).
The only problem is: I can't figure out how to do this and the question is: Can anyone tell me how to do it?
My first thought was to copy the terrain vertex matrix, push it up a bit so it sits on top of the terrain, load this buffer into a VBO, and update the index buffer according to the position and shape needed, then draw the shape. This is rather slow and inefficient, especially when the shape is moving and changing. Also, the resolution of the heightmap is 175x175, so the movement is not at all smooth but rather jagged.
Then I thought (but I'm rather new to this area) of uploading the shape outlines to the fragment shader of the terrain and letting the shader decide whether a point lies in that area, changing the colour accordingly. This was also a really slow option, but if anyone sees potential and a good way to do this, tell me!
The next option was to draw directly onto the texture, which is still in the failing stage. If someone has any good ideas on how to draw a scene to a flat area and then put that on a terrain mesh, that would be great!
So, does anyone have a solution for drawing a shape (or multiple shapes) on a terrain? That would be awesome. Thanks in advance!
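For the fragment-shader option mentioned above, the core test (is this terrain point inside the 2d convex hull?) is cheap per pixel. Here is a Java stand-in for what the shader would compute (class name made up; hull vertices assumed in counter-clockwise order):

```java
// Sketch: a point is inside a convex polygon iff it lies on the same side
// of every edge, walking the vertices in counter-clockwise order.
public class ConvexHullTest {

    // hull = {x0,y0, x1,y1, ...} in counter-clockwise order
    static boolean contains(float[] hull, float px, float py) {
        int n = hull.length / 2;
        for (int i = 0; i < n; i++) {
            float ax = hull[2 * i],             ay = hull[2 * i + 1];
            float bx = hull[2 * ((i + 1) % n)], by = hull[2 * ((i + 1) % n) + 1];
            // 2d cross product of edge (a->b) with (a->p);
            // negative means p is on the right of the edge, i.e. outside.
            float cross = (bx - ax) * (py - ay) - (by - ay) * (px - ax);
            if (cross < 0) return false;
        }
        return true;
    }

    public static void main(String[] args) {
        float[] square = {0, 0, 1, 0, 1, 1, 0, 1}; // unit square, CCW
        System.out.println(contains(square, 0.5f, 0.5f)); // true
        System.out.println(contains(square, 2f, 2f));     // false
    }
}
```

In a shader the hull vertices would arrive as a small uniform array, so the per-fragment cost is one loop of this size; the slowness usually comes from very large hulls or many shapes, not from the test itself.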

Seamlessly layering transparent sprites in OpenGL ES

I am working on an Android app, based on the LibGDX framework (though I don't think that should affect this problem too much), and I am having trouble finding a way to get the results I want when drawing with transparent sprites. The problem is that the sprites visibly layer on top of each other where they overlap, similar to what is displayed in this image:
This is pretty unsightly for some of what I want to do, and even completely breaks other parts. What I would like them to do is merge together seamlessly, like so:
The only success I have had thus far is to draw the entire sequence of sprites on a separate texture at full opacity, and then draw that texture back with the desired opacity. I had this working moderately well, and could likely make it work for most of what I need, but the big problem is that these things are drawn dynamically onto the screen, and the process of modifying a fairly large texture and sending it back is pretty taxing on mobile devices, causing unacceptable performance.
I've spent a good chunk of time looking for more ideal solutions, including experimenting with blend modes and coming up with quirky formulas that balanced out alpha and color values in ways to even things out, but nothing was particularly successful. My guess is that the only viable route for this is the previously mentioned way of creating a texture and applying the alpha difference to that, but I am unsure of the best way to make that work with lower powered mobile devices.
There might be a few other ways to do this. The most straightforward would be to attach a stencil buffer, draw the circles to the stencil first, and then draw a full-screen rect with the desired color+alpha using the stencil; this should be much faster than an FBO with a separate texture.
Another thing that might work is drawing those circles first with blending disabled and then drawing your whole scene over them with an inverted blendFunc, but note this might be impossible if other elements also need blending.
Third, instead of using the stencil you could use the alpha channel of your render buffer. Use a color mask to draw only to alpha and draw the circles, then re-enable RGB on the color mask and draw the full-screen rect with the appropriate blendFunc. Also note that if the previous shapes used blending, you will need to clear the alpha to 1.0 before doing this (color mask set to alpha only, blending disabled, draw a full-screen rect with a color whose alpha is 1.0).
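A little blending arithmetic shows why the overlap is visible in the first place, and why all of the approaches above blend exactly once. This is a sketch on one color channel; over() mirrors glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA), and the names are made up:

```java
// Sketch: blending a 25%-alpha shape twice over white darkens the overlap,
// while mask-then-blend-once gives a uniform result everywhere.
public class OverlapBlend {

    // Standard "over" blend for one channel: src*a + dst*(1-a).
    static double over(double src, double dst, double a) {
        return src * a + dst * (1 - a);
    }

    public static void main(String[] args) {
        double white = 1.0, blue = 0.0, a = 0.25; // red channel of a blue sprite
        double once  = over(blue, white, a); // 0.75: single sprite
        double twice = over(blue, once, a);  // 0.5625: visibly darker overlap
        System.out.println(once + " " + twice);
    }
}
```

The stencil and alpha-mask techniques both reduce the overlapping shapes to a single coverage region first, so every covered pixel gets the 0.75 result and the seam disappears.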

Preventing gaps/borders when using texture for sprite sheet

I am using textured quads to render a grid of tiles from a sprite sheet. Unfortunately when rendered, there are small gaps between the individual tiles:
Changing the texture parameters to scale the texture using GL_NEAREST rather than GL_LINEAR fixes this, but results in artifacts within the textured quad itself. Is there some way to prevent GL_LINEAR from interpolating using pixels outside of the specified UV coordinates? Any other suggestions for how to fix this?
For reference, here's the sprite sheet I am using:
This looks like a precision problem with your texture maps. Are you using floats (32-bit) or something smaller? And how do you calculate the coordinates?
Also, leaving a 1-pixel border between textures sometimes helps (you sometimes get a rounding error).
Myself, I use this program: http://www.texturepacker.com/ (not affiliated in any way). You get the texture map and UV coordinates from it, you can specify padding around the textures, and it can also extrude the last color around your texture, so even if you get weird rounding problems you can always get a perfect seam.
I would check your precision and calculations first, though.
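Besides padding, a common fix is to inset the UVs by half a texel so GL_LINEAR always samples from inside the tile and never averages in a neighbouring tile's edge pixels. A sketch, with made-up names, assuming square tiles packed in a grid:

```java
// Sketch: half-texel UV inset for a tile in a sprite sheet. Sampling from
// the tile's pixel centers keeps GL_LINEAR from bleeding neighbouring tiles.
public class TileUV {

    // Returns {u0, v0, u1, v1} for the tile at grid cell (tileX, tileY).
    static float[] insetUV(int tileX, int tileY, int tileSize,
                           int sheetW, int sheetH) {
        float u0 = (tileX * tileSize + 0.5f) / sheetW;
        float v0 = (tileY * tileSize + 0.5f) / sheetH;
        float u1 = ((tileX + 1) * tileSize - 0.5f) / sheetW;
        float v1 = ((tileY + 1) * tileSize - 0.5f) / sheetH;
        return new float[]{u0, v0, u1, v1};
    }
}
```

For a 32-pixel tile in a 256x256 sheet this maps the quad to texels 0.5 through 31.5 instead of 0 through 32, which is exactly representable in float because the divisors are powers of two.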

openGL fading background

I'm trying to create a particular effect where I have a bunch of particles on the screen which leave trails that slowly fade. I was hoping that I could simply use glClear with a small alpha value to do this but that doesn't seem to be working.
My second idea was to draw a black face on the front of the screen, but this doesn't seem to give me the right effect: the particles fade but the background doesn't really fade. My next idea is to render to a texture and fade that texture, but that's a lot of extra work and I'm not sure it will solve my problem. Can anyone think of a way to do this? Am I missing something?
Edit: Also, I'm having trouble finding information about rendering to a texture on Android. If anyone has links to articles, that would be great.
Assuming that your 'particles' are just a bunch of textured sprites, you can simply add color data for each vertex of the sprite using glColorPointer(). The color you set for the vertices will then be blended with the texture of the sprite. You can easily update these values to achieve a 'fading' effect.
E.g. if you set RGBA = (1,1,1,1) for each vertex, the sprite will appear as before (no translucency), set RGBA = (1,0,0,1) the sprite will appear red (no translucency), set RGBA = (0.5,0.5,0.5,0.5) the sprite will appear half translucent, etc. You will have to set the correct glBlendFunc() beforehand to get the desired behaviour!
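A minimal sketch of that per-frame fading update (the class name, array layout, and decay factor are just examples, assuming glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA) and the color array bound with glColorPointer):

```java
// Sketch: fade a sprite by scaling its per-vertex RGBA each frame, matching
// the (0.5, 0.5, 0.5, 0.5) half-translucent example above.
public class TrailFade {

    // Multiply every RGBA component by decay (e.g. 0.5f per frame).
    static void fade(float[] rgba, float decay) {
        for (int i = 0; i < rgba.length; i++) rgba[i] *= decay;
    }

    public static void main(String[] args) {
        // One quad: 4 vertices, each RGBA, starting fully white and opaque.
        float[] quad = {1,1,1,1, 1,1,1,1, 1,1,1,1, 1,1,1,1};
        for (int frame = 0; frame < 3; frame++) fade(quad, 0.5f);
        System.out.println(quad[3]); // alpha after 3 frames: 0.125
    }
}
```

Re-uploading a handful of vertex colors per sprite each frame is far cheaper than regenerating a full-screen texture, which is what makes this approach mobile-friendly.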
Cheers, Aert.
