I need to draw a growing 3D line using OpenGL on an Android device.
The problem is that I need to draw lines that scale, with a "laser"-type effect on them.
Originally I thought of drawing simple GL lines or line loops, but they won't scale if the camera is moved closer to them, as in a fly-through.
My next thought was to generate a cylinder mesh and extrude it in real time, the way I would a line, handling 90-degree turns by adding a 45-degree rotation after extruding a new cylinder from the end point, then turning the end another 45 degrees and extruding the next cylinder to form the new line extension, and so on.
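For the cylinder-extrusion idea, the core of each step is generating a ring of vertices around the current segment axis and stitching it to the previous ring. Below is a minimal sketch of the ring generation in plain Java; the class and method names are made up for illustration, not from any particular engine.

```java
// Sketch: generate one ring of vertices for a cylinder segment.
public class CylinderSegment {
    // Returns 'sides' vertices (x,y,z triplets) forming a circle of the given
    // radius around 'center', lying in the plane perpendicular to 'axis'.
    public static float[] ring(float[] center, float[] axis, float radius, int sides) {
        // Normalize the axis.
        float len = (float) Math.sqrt(axis[0]*axis[0] + axis[1]*axis[1] + axis[2]*axis[2]);
        float ax = axis[0]/len, ay = axis[1]/len, az = axis[2]/len;
        // Pick any vector not parallel to the axis, then build two
        // perpendicular in-plane basis vectors u and v via cross products.
        float rx = Math.abs(ax) < 0.9f ? 1 : 0;
        float ry = Math.abs(ax) < 0.9f ? 0 : 1;
        float rz = 0;
        float ux = ay*rz - az*ry, uy = az*rx - ax*rz, uz = ax*ry - ay*rx;
        float ulen = (float) Math.sqrt(ux*ux + uy*uy + uz*uz);
        ux /= ulen; uy /= ulen; uz /= ulen;
        float vx = ay*uz - az*uy, vy = az*ux - ax*uz, vz = ax*uy - ay*ux;
        float[] out = new float[sides * 3];
        for (int i = 0; i < sides; i++) {
            double a = 2.0 * Math.PI * i / sides;
            float c = (float) Math.cos(a) * radius, s = (float) Math.sin(a) * radius;
            out[i*3]     = center[0] + c*ux + s*vx;
            out[i*3 + 1] = center[1] + c*uy + s*vy;
            out[i*3 + 2] = center[2] + c*uz + s*vz;
        }
        return out;
    }
}
```

Growing the line then amounts to emitting a ring at each new end point and connecting it to the previous ring with a strip of triangles; the 45-degree turns described above correspond to orienting a ring's axis halfway between the old and new segment directions.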
Problem with cylinders is the near clipping plane will clip through them.
Anyone have a better thought or idea they can throw at me for this?
Problem with cylinders is the near clipping plane will clip through them.
This will be the case with any kind of geometry. You can, however, use depth clamping to avoid some of the effects of clipping. See http://arcsynthesis.org/gltut/Positioning/Tut05%20Depth%20Clamping.html for details.
I am trying to develop an Android app which draws a straight line directly in front of me. My reference point is my phone, which means that the line has to be parallel to my phone's left side.
I have following issues:
I am using Sceneform, so there is no "onSurfaceCreated" callback (?).
I assume that the white dots show the detected surface. I would then expect that wherever a surface is detected, I can place a Shape on it, but sometimes I can't. And sometimes I can place a Shape even where there are no visible white dots.
When I try to draw a line between the points (0,0,0) and (1,0,0), it is not always parallel to the left side of my phone. I assume the reason is the following: the angle between the left-bottom corner and the left-top corner of the detected surface is not zero (taking the phone's left side as the y-axis and its bottom as the x-axis), and this angle changes each time I reopen the app.
These are more theory questions than implementation ones, so I need someone to confirm or refute them, or give me a guideline.
1) There is no method like onSurfaceCreated in Sceneform.
2) Not all detected planes are covered with white dots. This is intentional: if every detected plane were rendered with white dots, it would confuse the users.
3) When you talk about the points (0,0,0) and (1,0,0), do you mean world position or local position? Either way, you cannot draw a line parallel to the left side of your phone with that approach, because the world coordinate system is fixed when the session starts and is not aligned with your phone's orientation.
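Point 3 can be made concrete with plain vector math (no ARCore dependency; the rotation-matrix input below is a hypothetical stand-in for the camera pose): a world-space direction is parallel to the phone's left edge only if, once transformed into camera space, it has no x or z component.

```java
// Sketch: check whether a world-space segment direction is parallel to the
// phone's y-axis (its left edge) by rotating it into camera space.
// 'rot' is a 3x3 row-major world-to-camera rotation matrix (a hypothetical
// input; with ARCore you would derive it from the camera pose).
public class ParallelCheck {
    public static boolean parallelToPhoneY(float[] rot, float[] dirWorld, float eps) {
        // Transform the direction into camera-local coordinates.
        float x = rot[0]*dirWorld[0] + rot[1]*dirWorld[1] + rot[2]*dirWorld[2];
        float y = rot[3]*dirWorld[0] + rot[4]*dirWorld[1] + rot[5]*dirWorld[2];
        float z = rot[6]*dirWorld[0] + rot[7]*dirWorld[1] + rot[8]*dirWorld[2];
        // Parallel to the phone's y-axis iff x and z vanish relative to y.
        return Math.abs(x) < eps * Math.abs(y) && Math.abs(z) < eps * Math.abs(y);
    }
}
```

Since the camera pose changes every frame, a line that passes this test at one moment will fail it as soon as the phone moves, which is why fixed world coordinates like (0,0,0) to (1,0,0) cannot stay parallel to the phone.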
Could someone explain how a convex path is calculated? I need to draw a cubic curve plus some lines, but the path is then reported as non-convex. However, when I leave only the lines, or only the cubic, it is convex. The problem is that I need an irregularly shaped background, and a convex path for the shadow outline, but I can't work out how to connect the cubic with some lines and still get a convex path, if that is even possible.
A path is convex if it has a single contour, and only ever curves in a single direction.
Convex means the contour keeps bending/rotating in one direction, and one direction only. You really have to make sure that all your angles and curves add up: where a curve connects to a line, it has to meet at the same angle or be "more convex". I hope the following two images clear this up.
The picture below is not convex, and that is likely your problem too. The line connects to a curve, but the curve meets it at a different angle, so the contour changes direction at the joint. See how the line goes down, but instead of continuing the downward motion the curve suddenly goes up again. Instead of keeping one direction, the contour reverses for a moment where the line and curve meet.
The image above is exaggerated for clarity, but even small errors in the connection between line and curve will trigger the error.
The next line connects to a curve at a steeper angle. This is convex and won't be a problem. See how the whole contour keeps a single motion in one direction: depending on which way you follow it, it keeps turning left (or right).
I answered because I faced a similar issue recently and I feel your pain. I recommend pen and paper to double- and triple-check the math, and a small epsilon value to account for rounding errors. You really have to nail the math, because if your line and curve connection is off by even a tiny amount, it will throw that exception.
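The "keeps turning in one direction" rule can also be checked numerically: flatten the path into points and verify that the cross products of consecutive edges never change sign. A sketch with made-up names, using the epsilon idea mentioned above:

```java
// Sketch: test whether a closed polygon (a flattened path) is convex by
// checking that consecutive edge cross products never change sign.
public class ConvexTest {
    public static boolean isConvex(float[] xs, float[] ys, float eps) {
        int n = xs.length;
        boolean sawPos = false, sawNeg = false;
        for (int i = 0; i < n; i++) {
            float ax = xs[(i+1)%n] - xs[i],       ay = ys[(i+1)%n] - ys[i];
            float bx = xs[(i+2)%n] - xs[(i+1)%n], by = ys[(i+2)%n] - ys[(i+1)%n];
            float cross = ax*by - ay*bx;          // z of the 2D cross product
            if (cross >  eps) sawPos = true;      // left turn
            if (cross < -eps) sawNeg = true;      // right turn
            if (sawPos && sawNeg) return false;   // turn direction changed
        }
        return true;
    }
}
```

On Android you could sample points along the Path with PathMeasure and run a check like this to find exactly where the cubic-to-line joint breaks convexity.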
Sorry for my bad paint skills
I'm trying to build a transparent 3D globe on Android (transparent over the water regions), and I'm doing it by creating a sphere model with libGDX and texturing it with a .png of the earth whose water regions are transparent. It works fine, except that after I disable face culling (to be able to see the back face of the sphere), some triangles go missing and the back face vanishes as I rotate the 3D model: Pic1, Pic2. At some other angles the sphere renders fine and I can see the back face of the globe without problems.
I put here some relevant code:
render:
Gdx.gl.glViewport(0, 0, Gdx.graphics.getWidth(), Gdx.graphics.getHeight());
Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT | GL20.GL_DEPTH_BUFFER_BIT);
camController.update();
modelBatch.begin(cam);
modelBatch.render(instance, environment);
modelBatch.end();
I've tried all possible values for DepthTestAttribute, but it seems there is no way to get rid of this very strange effect. Please give me some advice; many thanks in advance.
In the case of general geometry, one common approach is to sort the triangles, and render them in back to front order.
However, with the special properties of a sphere, I believe there is a simpler and more efficient approach that should work well. With a sphere, you always have exactly one layer of back-facing triangles and exactly one layer of front-facing triangles. So if you render the back-facing triangles first, followed by the front-facing triangles, you get an order where a triangle rendered earlier is never in front of a triangle rendered later, which is sufficient to get correct transparency rendering.
You already figured out how to render only the front-facing triangles: by enabling culling of the back faces. Rendering only the back-facing triangles works the same way, except that you cull the front faces. So the code looks like this:
glBlendFunc(...);
glEnable(GL_BLEND);
glEnable(GL_CULL_FACE);
glClear(...);
glCullFace(GL_FRONT);
// draw sphere
glCullFace(GL_BACK);
// draw sphere
This looks like the typical conflict between transparency and depth testing.
A transparent pixel is drawn, but it still writes a value to the depth buffer, causing pixels behind it (which are supposed to be visible) to be discarded due to a failed depth test.
The same happens here. Some elements in front are drawn first, writing to the depth buffer even though parts of them are translucent, so elements behind them are not drawn.
A quick and dirty fix is to discard pixels with an alpha value below a certain threshold, by enabling alpha testing (or discarding them in your shader in newer OpenGL versions). In most cases, however, this results in visible artifacts.
A better way is to sort the individual elements/triangles of the globe from back to front relative to the camera and draw them in that order.
I would also suggest reading the OpenGL wiki page on transparency sorting: https://www.opengl.org/wiki/Transparency_Sorting
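That back-to-front sort can be sketched with plain arrays (no libGDX types; a real implementation would sort the mesh's own triangle data, and re-sort whenever the camera moves):

```java
import java.util.Arrays;
import java.util.Comparator;

// Sketch: sort triangles back to front by the squared distance of their
// centroid from the camera, then draw them in that order.
public class TriangleSort {
    // tris: array of triangles, each 9 floats (three x,y,z vertices).
    public static void sortBackToFront(float[][] tris, float[] cam) {
        // Ascending by negative distance = farthest triangle first.
        Arrays.sort(tris, Comparator.comparingDouble(
                (float[] t) -> -squaredDistToCentroid(t, cam)));
    }

    static double squaredDistToCentroid(float[] t, float[] cam) {
        double cx = (t[0] + t[3] + t[6]) / 3.0 - cam[0];
        double cy = (t[1] + t[4] + t[7]) / 3.0 - cam[1];
        double cz = (t[2] + t[5] + t[8]) / 3.0 - cam[2];
        return cx*cx + cy*cy + cz*cz;
    }
}
```

For a sphere, the two-pass culling trick above avoids this per-frame sort entirely, which is why it is the cheaper option here.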
I am working on a project that uses scale/translate/rotate. I know that I need to rotate first (at the bottom of the list of transformations). I currently push the matrix before every object's draw and pop it afterwards; this is done in the draw() method I created. I have tested by removing the drawing of objects systematically, and by applying no transformation other than the rotation to the object in question.
This is my problem: when the object is at 0/360 degrees it is a perfect square; as it rotates it stretches along the x-axis (still keeping the properties of a rectangle, if you follow me), and at 270 degrees (straight down the x-axis) the stretch reaches its maximum. The stretch corresponds directly to the angle the object is facing. I am setting the point of origin to (0,0) before I re-rotate.
I am wondering whether this is a common newbie mistake, whether it is working as intended and I need to compensate, or whether my code has a flaw that I couldn't find in a few hours of research and digging. I will post code if requested, but given the nature of the problem and the checking I have already done, it may take quite a bit of space.
Any input would be greatly appreciated.
Thanks in advance!
I changed the image so that I could see more of what was happening. It doesn't appear to be rotating the way I intended, so that might be the cause of all this.
After fixing the angle, I have noticed that the slope it's being drawn from/to is off. I am assuming that I have a problem with my points.
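One common cause of angle-dependent stretching like the one described above is a non-uniform scale (for example, an aspect-ratio correction) applied after the rotation instead of before it. The sketch below only illustrates that effect in 2D; it is not a diagnosis of this particular code, and all names are made up.

```java
// Sketch: show why the order of rotate and non-uniform scale matters.
// With scaleThenRotate, a point keeps the same distance from the origin at
// every angle; with rotateThenScale, the drawn length varies with the angle.
public class TransformOrder {
    public static float[] scaleThenRotate(float x, float y, float sx, float sy, double deg) {
        return rotate(sx * x, sy * y, deg);
    }

    public static float[] rotateThenScale(float x, float y, float sx, float sy, double deg) {
        float[] r = rotate(x, y, deg);
        return new float[]{ sx * r[0], sy * r[1] };
    }

    static float[] rotate(float x, float y, double deg) {
        double a = Math.toRadians(deg);
        return new float[]{
            (float) (x * Math.cos(a) - y * Math.sin(a)),
            (float) (x * Math.sin(a) + y * Math.cos(a)) };
    }
}
```

If the stretch grows and shrinks as the angle changes, that is the rotateThenScale pattern: a scale of (2,1) leaves a vector of length 2 at 0 degrees but length 1 at 90 degrees, which matches the angle-dependent stretch described.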
I am using textured quads to render a grid of tiles from a sprite sheet. Unfortunately when rendered, there are small gaps between the individual tiles:
Changing the texture parameters to sample the texture with GL_NEAREST rather than GL_LINEAR fixes this, but results in artifacts within the textured quad itself. Is there some way to prevent GL_LINEAR from interpolating with pixels outside the specified UV coordinates? Any other suggestions for how to fix this?
For reference, here's the sprite sheet I am using:
This looks like a precision problem with your texture coordinates. Are you using floats (32-bit) or something smaller? And how do you calculate the coordinates?
Also, leaving a 1-pixel border between textures sometimes helps (you always run into a rounding error eventually).
Myself, I use http://www.texturepacker.com/ (not affiliated in any way): it gives you the texture map and the UV coordinates, lets you specify padding around the textures, and can extrude the last color around each texture, so even if you hit weird rounding problems you still get a perfect seam.
I would check your precision and calculations first, though.
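Alongside padding and edge extrusion, a common fix is to inset each tile's UV rectangle by half a texel, so GL_LINEAR never interpolates across a tile boundary. A sketch assuming square tiles in a regular grid (names are made up):

```java
// Sketch: compute UVs for tile (col, row) in a sprite sheet, inset by half
// a texel so bilinear filtering never reads pixels from a neighbouring tile.
public class TileUV {
    // Returns {u0, v0, u1, v1} for the tile, in [0,1] texture coordinates.
    public static float[] tileUV(int col, int row, int tileSize,
                                 int sheetW, int sheetH) {
        float half = 0.5f;  // half a texel, in pixels
        float u0 = (col * tileSize + half) / sheetW;
        float v0 = (row * tileSize + half) / sheetH;
        float u1 = ((col + 1) * tileSize - half) / sheetW;
        float v1 = ((row + 1) * tileSize - half) / sheetH;
        return new float[]{u0, v0, u1, v1};
    }
}
```

The trade-off is that each tile loses half a pixel at each edge; combining the inset with a duplicated (extruded) border row of pixels, as the packer tool above can generate, avoids even that.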