Android OpenGL - Help with C++ Conversion

I'm trying to convert this C++ code to Java for Android. However, I'm using GL10, and glGetDoublev apparently isn't supported on OpenGL ES. How else can I perform this function?
// Get point by reading z buffer and unprojecting to object coords
void Pick(int x, int y)
{
    GLint viewport[4];
    GLdouble mvmatrix[16], projmatrix[16];
    glGetIntegerv(GL_VIEWPORT, viewport);
    glGetDoublev(GL_MODELVIEW_MATRIX, mvmatrix);
    glGetDoublev(GL_PROJECTION_MATRIX, projmatrix);

    int winx = x;
    int winy = winHeight - y;
    GLfloat winz = 0.0;
    GLdouble objx = 0.0;
    GLdouble objy = 0.0;
    GLdouble objz = 0.0;

    // Get winz for given winx and winy
    glReadPixels(winx, winy, 1, 1, GL_DEPTH_COMPONENT, GL_FLOAT, &winz);

    // Make sure there was something at the point
    if (winz >= 1.0)
    {
        qDebug("Nothing picked");
    }
    else
    {
        // Get object coords from win coords
        gluUnProject((GLdouble)winx, (GLdouble)winy, (GLdouble)winz,
                     mvmatrix, projmatrix, viewport,
                     &objx, &objy, &objz);
        qDebug("Pick: win=%d,%d,%.3f obj=%.3f,%.3f,%.3f",
               winx, winy, winz, objx, objy, objz);

        // Place a marker at that position
        Marker marker;
        marker.point.x = objx;
        marker.point.y = objy;
        marker.point.z = objz;
        markerList << marker;

        // limit to two markers
        if (markerList.count() > 2)
            markerList.pop_front();

        Rebuild();
    }
}

I ran into this problem in my own Android OpenGL ES 1.0 forays. OpenGL ES 1.0 does not let you read the matrices back directly (as far as I know, glGetFloatv was not implemented in 1.0), so to do what you want you need to write a wrapper that tracks the matrices yourself.
If you use OpenGL ES 1.1, you can use glGetFloatv, since it has been implemented there.
Here is the website where I originally found the solution:
http://www.41post.com/1540/programming/android-opengl-get-the-modelview-matrix-on-15-cupcake
All implementation details are there.
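
For reference, the idea is simply to shadow every matrix call you make. Here is a minimal sketch of such a tracking class (the names are illustrative, not the exact classes from the linked article), mirroring a modelview stack with android.opengl.Matrix:

import android.opengl.Matrix;

// Tracks one matrix stack in parallel with the fixed-function GL stack.
// Call the mirror method every time you make the corresponding gl call.
public class MatrixStack {
    private final float[][] stack = new float[32][16];
    private int top = 0;

    public MatrixStack() {
        Matrix.setIdentityM(stack[0], 0);
    }

    public void glLoadIdentity() {
        Matrix.setIdentityM(stack[top], 0);
    }

    public void glTranslatef(float x, float y, float z) {
        Matrix.translateM(stack[top], 0, x, y, z);
    }

    public void glRotatef(float angle, float x, float y, float z) {
        Matrix.rotateM(stack[top], 0, angle, x, y, z);
    }

    public void glScalef(float x, float y, float z) {
        Matrix.scaleM(stack[top], 0, x, y, z);
    }

    public void glPushMatrix() {
        System.arraycopy(stack[top], 0, stack[top + 1], 0, 16);
        top++;
    }

    public void glPopMatrix() {
        top--;
    }

    // What glGetFloatv(GL_MODELVIEW_MATRIX, ...) would have returned
    public void getMatrix(float[] out, int offset) {
        System.arraycopy(stack[top], 0, out, offset, 16);
    }
}

With one instance tracking GL_MODELVIEW and another tracking GL_PROJECTION, the arrays from getMatrix() can be fed straight into GLU.gluUnProject together with the depth value read back by glReadPixels.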

Related

Augmented Reality + Bullet Physics - trouble with rayTest/Ray picking

I am trying to pick objects in the Bullet physics world, but all I seem to be able to pick is the floor/ground plane! I am using the Vuforia SDK and have altered the ImageTargets demo code. I used the following code to project my touched screen points into the 3D world:
void projectTouchPointsForBullet(QCAR::Vec2F point, QCAR::Vec3F &lineStart, QCAR::Vec3F &lineEnd, QCAR::Matrix44F &modelViewMatrix)
{
    QCAR::Vec4F normalisedVector((2 * point.data[0] / screenWidth - 1),
                                 (2 * (screenHeight - point.data[1]) / screenHeight - 1),
                                 -1,
                                 1);
    QCAR::Matrix44F modelViewProjection;
    SampleUtils::multiplyMatrix(&projectionMatrix.data[0], &modelViewMatrix.data[0], &modelViewProjection.data[0]);
    QCAR::Matrix44F inversedMatrix = SampleMath::Matrix44FInverse(modelViewProjection);

    QCAR::Vec4F near_point = SampleMath::Vec4FTransform(normalisedVector, inversedMatrix);
    near_point.data[3] = 1.0 / near_point.data[3];
    near_point = QCAR::Vec4F(near_point.data[0] * near_point.data[3], near_point.data[1] * near_point.data[3], near_point.data[2] * near_point.data[3], 1);

    normalisedVector.data[2] = 1.0; // z coordinate now 1
    QCAR::Vec4F far_point = SampleMath::Vec4FTransform(normalisedVector, inversedMatrix);
    far_point.data[3] = 1.0 / far_point.data[3];
    far_point = QCAR::Vec4F(far_point.data[0] * far_point.data[3], far_point.data[1] * far_point.data[3], far_point.data[2] * far_point.data[3], 1);

    lineStart = QCAR::Vec3F(near_point.data[0], near_point.data[1], near_point.data[2]);
    lineEnd = QCAR::Vec3F(far_point.data[0], far_point.data[1], far_point.data[2]);
}
When I try a ray test in my physics world I only seem to be hitting the ground plane! Here is the code for the ray test call:
QCAR::Vec3F lineStart, lineEnd;
projectTouchPointsForBullet(QCAR::Vec2F(touch1.tapX, touch1.tapY), lineStart, lineEnd, modelViewMatrix);

btVector3 btRayFrom = btVector3(lineEnd.data[0], lineEnd.data[1], lineEnd.data[2]);
btVector3 btRayTo = btVector3(lineStart.data[0], lineStart.data[1], lineStart.data[2]);
btCollisionWorld::ClosestRayResultCallback rayCallback(btRayFrom, btRayTo);
dynamicsWorld->rayTest(btRayFrom, btRayTo, rayCallback);
if (rayCallback.hasHit())
{
    // my bodies have char* messages attached to them to determine what has been touched
    char* pPhysicsData = reinterpret_cast<char*>(rayCallback.m_collisionObject->getUserPointer());
    btRigidBody* pBody = btRigidBody::upcast(rayCallback.m_collisionObject);
    if (pBody && pPhysicsData)
    {
        LOG("handleTouches:: notifyOnTouchEvent from physics world!!!");
        notifyOnTouchEvent(env, obj, 0, 0, pPhysicsData);
    }
}
I know I am predominantly looking top-down, so I am bound to hit the ground plane, and at least I know my touch is being correctly projected into the world. But I have objects lying on the ground plane and I can't seem to touch them! Any pointers would be greatly appreciated :)
I found out why I wasn't able to touch the objects: I am scaling the objects up when they are drawn, so I had to scale the view matrix by the same value before projecting my touch point into the 3D world. (EDIT: I also had the btRayFrom and btRayTo input coordinates reversed; it is now fixed.)
// top of code
float kObjectScale = 100.0f;
...

// inside touch handler method
SampleUtils::scalePoseMatrix(kObjectScale, kObjectScale, kObjectScale, &modelViewMatrix.data[0]);
projectTouchPointsForBullet(QCAR::Vec2F(touch1.tapX, touch1.tapY), lineStart, lineEnd, modelViewMatrix);
btVector3 btRayFrom = btVector3(lineStart.data[0], lineStart.data[1], lineStart.data[2]);
btVector3 btRayTo = btVector3(lineEnd.data[0], lineEnd.data[1], lineEnd.data[2]);
My touches are projected correctly now :)

Android 3D Surface Plot

My requirement is to create a 3D surface plot (it should also display the x, y, and z axes) from a list of (x, y, z) data points. The 3D visualization should be done on Android.
My inputs: I'm currently planning on using OpenGL 1.0 and Java. I'm also considering Adore3d, min3d, and the rgl package, which use OpenGL 1.0. I'm good at Java, but a novice at 3D programming.
Time frame: 2 months
I would like to know the best way to go about it. Is OpenGL 1.0 good for 3D surface plotting? Any other packages/libraries that can be used with Android?
Well, you can plot the surface using OpenGL 1.0 or OpenGL 2.0. All you need to do is draw the axes as lines and the surface as triangles. If you have your heightfield data, you would do:
float[][] surface;
int width, height; // 2D surface data and its dimensions

GL.glBegin(GL.GL_LINES);
GL.glVertex3f(0, 0, 0);      // line starting at 0, 0, 0
GL.glVertex3f(width, 0, 0);  // line ending at width, 0, 0
GL.glVertex3f(0, 0, 0);      // line starting at 0, 0, 0
GL.glVertex3f(0, 0, height); // line ending at 0, 0, height
GL.glVertex3f(0, 0, 0);      // line starting at 0, 0, 0
GL.glVertex3f(0, 50, 0);     // line ending at 0, 50, 0 (50 is the maximal value in surface[])
GL.glEnd();
// display the axes

GL.glBegin(GL.GL_TRIANGLES);
for(int x = 1; x < width; ++x) {
    for(int y = 1; y < height; ++y) {
        float a = surface[x - 1][y - 1];
        float b = surface[x][y - 1];
        float c = surface[x][y];
        float d = surface[x - 1][y];
        // get four points on the surface (they form a quad)
        GL.glVertex3f(x - 1, a, y - 1);
        GL.glVertex3f(x, b, y - 1);
        GL.glVertex3f(x, c, y);
        // draw triangle abc
        GL.glVertex3f(x - 1, a, y - 1);
        GL.glVertex3f(x, c, y);
        GL.glVertex3f(x - 1, d, y);
        // draw triangle acd
    }
}
GL.glEnd();
// display the data
This draws simple axes and the heightfield, all in white. It should be pretty straightforward to extend it from here.
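Note that the snippet above is desktop-style immediate mode; OpenGL ES on Android has no glBegin/glEnd, so the same triangles have to go through a vertex array. Here is a minimal sketch of the equivalent surface drawing for ES 1.0, assuming a GL10 instance gl and the surface/width/height variables from above:

import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;
import javax.microedition.khronos.opengles.GL10;

void drawSurface(GL10 gl, float[][] surface, int width, int height) {
    // two triangles (6 vertices, 3 floats each) per grid cell
    float[] verts = new float[(width - 1) * (height - 1) * 6 * 3];
    int i = 0;
    for (int x = 1; x < width; ++x) {
        for (int y = 1; y < height; ++y) {
            float a = surface[x - 1][y - 1], b = surface[x][y - 1];
            float c = surface[x][y],        d = surface[x - 1][y];
            // triangle abc
            verts[i++] = x - 1; verts[i++] = a; verts[i++] = y - 1;
            verts[i++] = x;     verts[i++] = b; verts[i++] = y - 1;
            verts[i++] = x;     verts[i++] = c; verts[i++] = y;
            // triangle acd
            verts[i++] = x - 1; verts[i++] = a; verts[i++] = y - 1;
            verts[i++] = x;     verts[i++] = c; verts[i++] = y;
            verts[i++] = x - 1; verts[i++] = d; verts[i++] = y;
        }
    }
    // vertex arrays must live in a direct, native-order buffer
    FloatBuffer fb = ByteBuffer.allocateDirect(verts.length * 4)
            .order(ByteOrder.nativeOrder()).asFloatBuffer();
    fb.put(verts).position(0);

    gl.glEnableClientState(GL10.GL_VERTEX_ARRAY);
    gl.glVertexPointer(3, GL10.GL_FLOAT, 0, fb);
    gl.glDrawArrays(GL10.GL_TRIANGLES, 0, verts.length / 3);
    gl.glDisableClientState(GL10.GL_VERTEX_ARRAY);
}

The axis lines can be handled the same way with GL10.GL_LINES and a second, much smaller buffer.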
Re the second part of your question:
Any other packages/libraries that can be used with Android?
Yes, it's now possible to draw an Android 3D Surface Plot with SciChart.
Link to Android Chart features page
Link to Android 3D Surface Plot example
Lots of configurations are possible including drawing wireframe, gradient colour maps, contours and real-time updates.
Disclosure: I'm the tech lead on the SciChart team.

Problems using gluUnProject

Basically I have an application for Android 1.5 with a GLSurfaceView class that shows a simple square polygon on the screen. I want to learn to add a new functionality: moving the square by touching it with a finger. I mean that when the user touches the square and moves the finger, the square should move with the finger, until the finger releases the screen.
I'm trying to use gluUnProject to obtain the OpenGL coordinates that match the exact position of the finger; then I will call glTranslatef on the polygon, and I will get the polygon moved to that position (I hope).
The problem is that something is going wrong with gluUnProject; it is giving me this exception: java.lang.IllegalArgumentException: length - offset < n on the call to gluUnProject.
First of all, I'm passing 0 as the win Z coordinate because I don't know what I have to pass there, since a window position only has X and Y coordinates. I tested passing 1 as the Z coordinate and I get the same exception.
float[] outputCoords = getOpenGLCoords(event.getX(), event.getY(), 0);
x = outputCoords[0];
y = outputCoords[1];
z = outputCoords[2];
...
public float[] getOpenGLCoords(float xWin, float yWin, float zWin)
{
    int screenW = SectionManager.instance.getDisplayWidth();
    int screenH = SectionManager.instance.getDisplayHeight();

    // CODE FOR TRANSLATING FROM SCREEN COORDINATES TO OPENGL COORDINATES
    mg.getCurrentProjection(MyGl);
    mg.getCurrentModelView(MyGl);
    float[] modelMatrix = mg.mModelView;
    float[] projMatrix = mg.mProjection;

    int[] mView = new int[4];
    mView[0] = 0;
    mView[1] = 0;
    mView[2] = screenW; // width
    mView[3] = screenH; // height

    float[] outputCoords = new float[3];
    GLU.gluUnProject(xWin, yWin, zWin, modelMatrix, 0, projMatrix, 0, mView, 0, outputCoords, 0);
    return outputCoords;
}
I answered the same question here; basically the gluUnProject function expects your outputCoords array to have size 4 instead of 3. Note that these are homogeneous coordinates, so you still have to divide the first three components by the fourth one if you're doing a perspective projection.
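In other words, the end of getOpenGLCoords would become something like this (a minimal sketch reusing the variable names from the question):

// gluUnProject writes x, y, z and w, so the array needs four slots
float[] outputCoords = new float[4];
GLU.gluUnProject(xWin, yWin, zWin, modelMatrix, 0, projMatrix, 0, mView, 0, outputCoords, 0);

// divide by w to get back from homogeneous to 3D coordinates
float objX = outputCoords[0] / outputCoords[3];
float objY = outputCoords[1] / outputCoords[3];
float objZ = outputCoords[2] / outputCoords[3];
return new float[] { objX, objY, objZ };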

gluUnProject always returns zero

I need gluUnProject to convert screen coordinates to world coordinates, and right now I just about have it working. When my app runs it accurately reports the screen coordinates, which I know are stored in my renderer thread, and pumps them out. Unfortunately the screen coordinates seem to have no effect on the world coordinates, which remain at zero.
Here is my gluUnProject method
public void vector3(GL11 gl){
    int[] viewport = new int[4];
    float[] modelview = new float[16];
    float[] projection = new float[16];
    float winx, winy, winz;
    float[] newcoords = new float[4];

    gl.glGetIntegerv(GL11.GL_VIEWPORT, viewport, 0);
    ((GL11) gl).glGetFloatv(GL11.GL_MODELVIEW_MATRIX, modelview, 0);
    ((GL11) gl).glGetFloatv(GL11.GL_PROJECTION_MATRIX, projection, 0);

    winx = (float)setx;
    winy = (float)viewport[3] - sety;
    winz = 0;

    GLU.gluUnProject(winx, winy, winz, modelview, 0, projection, 0, viewport, 0, newcoords, 0);
    posx = (int)newcoords[0];
    posy = (int)newcoords[1];
    posz = (int)newcoords[2];

    Log.d(TAG, "x= " + String.valueOf(posx));
    Log.d(TAG, "y= " + String.valueOf(posy));
    Log.d(TAG, "z= " + String.valueOf(posz));
}
Now I've searched and found this forum post, and they came to the conclusion that it was to do with using glGetFloatv instead of glGetDoublev, but glGetDoublev does not seem to be supported by GL11:
The method glGetDoublev(int, float[], int) is undefined for the type GL11
and also
The method glGetDoublev(int, double[], int) is undefined for the type GL11
Should the double versus float distinction matter, and if so, how do I go about using doubles?
Thank you
EDIT:
I was told that gluUnProject fails when too close to the near or far clipping plane, so I set winz to -5 when near is 0 and far is -10. This had no effect on the output.
I also logged each part of the newcoords[] array and they all return NaN (not a number). Could this be the problem, or is it something higher up in the algorithm?
I'm guessing you're working on the emulator? Its OpenGL implementation is rather buggy, and after testing I found that it returns all zeroes for the following calls:
gl11.glGetIntegerv(GL11.GL_VIEWPORT, viewport, 0);
gl11.glGetFloatv(GL11.GL_MODELVIEW_MATRIX, modelview, 0);
gl11.glGetFloatv(GL11.GL_PROJECTION_MATRIX, projection, 0);
The gluUnProject() function needs to calculate the inverse of the combined modelview-projection matrix, and since these are all zeroes, the inverse does not exist and will consist of NaNs. The resulting newcoords vector is therefore also all NaNs.
Try it on a device with a proper OpenGL implementation, it should work. Keep in mind to still divide by newcoords[3] though ;-)
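If you want to guard against this at runtime, a small check along these lines (a sketch with illustrative names, not an official API) would flag the broken readback before gluUnProject divides by garbage:

// sketch: true if the fetched matrix is all zeroes, which on the emulator
// signals the broken glGetFloatv readback described above
private static boolean isAllZero(float[] m) {
    for (float v : m) {
        if (v != 0f) return false;
    }
    return true;
}

// usage, right after the glGetFloatv calls:
// if (isAllZero(modelview) || isAllZero(projection)) { /* log and bail out */ }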

Problems Using Wavefront .obj's texture coordinates in Android OpenGL ES

I'm writing an Android app using OpenGL ES. I followed some online tutorials and managed to load up a textured cube using hard-coded vertices/indices/texture coordinates.
As a next step I wrote a parser for Wavefront .obj files. I made a mock file using the vertices etc. from the tutorial, which loads fine.
However, when I use a file made with a 3D modelling package, all the textures get messed up.
Below is how I'm currently getting the texture coordinates:
First I load all the texture coordinates, the vt's, into a big vector.
Next I find the first two texture coordinates for each f triangle (so f 1/2/3 2/5/2 3/4/1 means I take the 2nd and 5th texture coordinates; since .obj starts counting from 1 not 0, I subtract 1 from the index, then multiply by 2 to get the x coordinate's position in my vt array, and add 1 to that for the y coordinate's position).
I take those texture coordinates that I just found and add them to another vector.
Once I've gone through all the vertices, I turn the vector into a FloatBuffer and pass that to glTexCoordPointer in my draw method.
Here is the code for parsing the file:
private void openObjFile(String filename, Context context, GL10 gl){
    Vector<String> lines = openFile(filename, context); // opens the file
    Vector<String[]> tokens = new Vector<String[]>();
    Vector<Float> vertices = new Vector<Float>();
    Vector<Float> textureCoordinates = new Vector<Float>();
    Vector<Float> vertexNormals = new Vector<Float>();

    // tokenise
    for(int i = 0; i < lines.size(); i++){
        String line = lines.get(i);
        tokens.add(line.split(" "));
    }

    for(int j = 0; j < tokens.size(); j++){
        String[] linetokens = tokens.get(j);

        // get rid of comments
        //if(linetokens[0].equalsIgnoreCase("#")){
        //    tokens.remove(j);
        //}

        // get texture from .mtl file
        if(linetokens[0].equalsIgnoreCase("mtllib")){
            parseMaterials(linetokens[1], context, gl);
        }

        // vertices
        if(linetokens[0].equalsIgnoreCase("v")){
            vertices.add(Float.valueOf(linetokens[1]));
            vertices.add(Float.valueOf(linetokens[2]));
            vertices.add(Float.valueOf(linetokens[3]));
        }

        // texture coordinates
        if(linetokens[0].equalsIgnoreCase("vt")){
            textureCoordinates.add(Float.valueOf(linetokens[1]));
            textureCoordinates.add(Float.valueOf(linetokens[2]));
        }

        // vertex normals
        if(linetokens[0].equalsIgnoreCase("vn")){
            vertexNormals.add(Float.valueOf(linetokens[1]));
            vertexNormals.add(Float.valueOf(linetokens[2]));
            vertexNormals.add(Float.valueOf(linetokens[3]));
        }
    }

    // vertices
    this.vertices = GraphicsUtil.getFloatBuffer(vertices);

    Mesh mesh = null;
    Vector<Short> indices = null;
    Vector<Float> textureCoordinatesMesh = null;
    Vector<Float> vertexNormalsMesh = null;

    for(int j = 0; j < tokens.size(); j++){
        String[] linetokens = tokens.get(j);
        if(linetokens[0].equalsIgnoreCase("g")){
            if(mesh != null){
                mesh.setIndices(GraphicsUtil.getShortBuffer(indices));
                mesh.setNumindices(indices.size());
                mesh.setNormals(GraphicsUtil.getFloatBuffer(vertexNormalsMesh));
                mesh.setTextureCoordinates(GraphicsUtil.getFloatBuffer(textureCoordinatesMesh));
                meshes.add(mesh);
            }
            mesh = new Mesh();
            indices = new Vector<Short>();
            textureCoordinatesMesh = new Vector<Float>();
            vertexNormalsMesh = new Vector<Float>();
        } else if(linetokens[0].equalsIgnoreCase("usemtl")){
            String material_name = linetokens[1];
            for(int mn = 0; mn < materials.size(); mn++){
                if(materials.get(mn).getName().equalsIgnoreCase(material_name)){
                    mesh.setTextureID(materials.get(mn).getTextureID());
                    mn = materials.size();
                }
            }
        } else if(linetokens[0].equalsIgnoreCase("f")){
            for(int v = 1; v < linetokens.length; v++){
                String[] vvtvn = linetokens[v].split("/");

                short index = Short.parseShort(vvtvn[0]);
                index -= 1;
                indices.add(index);

                if(v != 3){
                    int texturePosition = (Integer.parseInt(vvtvn[1]) - 1) * 2;
                    float xcoord = textureCoordinates.get(texturePosition);
                    float ycoord = textureCoordinates.get(texturePosition + 1);

                    // normalise
                    if(xcoord > 1 || ycoord > 1){
                        xcoord = xcoord / Math.max(xcoord, ycoord);
                        ycoord = ycoord / Math.max(xcoord, ycoord);
                    }

                    textureCoordinatesMesh.add(xcoord);
                    textureCoordinatesMesh.add(ycoord);
                }

                int normalPosition = (Integer.parseInt(vvtvn[2]) - 1) * 3;
                vertexNormalsMesh.add(vertexNormals.get(normalPosition));
                vertexNormalsMesh.add(vertexNormals.get(normalPosition)+1);
                vertexNormalsMesh.add(vertexNormals.get(normalPosition)+2);
            }
        }
    }

    if(mesh != null){
        mesh.setIndices(GraphicsUtil.getShortBuffer(indices));
        mesh.setNumindices(indices.size());
        mesh.setNormals(GraphicsUtil.getFloatBuffer(vertexNormalsMesh));
        mesh.setTextureCoordinates(GraphicsUtil.getFloatBuffer(textureCoordinatesMesh));
        meshes.add(mesh);
    } // adding the final mesh
}
And here is the code for drawing:
public void draw(GL10 gl){
    gl.glEnableClientState(GL10.GL_VERTEX_ARRAY);

    // Counter-clockwise winding.
    gl.glFrontFace(GL10.GL_CCW);
    gl.glEnable(GL10.GL_CULL_FACE);
    gl.glCullFace(GL10.GL_BACK);

    // Pass the vertex buffer in
    gl.glVertexPointer(3, GL10.GL_FLOAT, 0, vertices);

    for(int i = 0; i < meshes.size(); i++){
        meshes.get(i).draw(gl);
    }

    // Disable the buffers
    gl.glDisableClientState(GL10.GL_VERTEX_ARRAY);
}

public void draw(GL10 gl){
    if(textureID >= 0){
        // Enable textures
        gl.glEnable(GL10.GL_TEXTURE_2D);
        // Get specific texture.
        gl.glBindTexture(GL10.GL_TEXTURE_2D, textureID);
        // Use UV coordinates.
        gl.glEnableClientState(GL10.GL_TEXTURE_COORD_ARRAY);
        // Pass in texture coordinates
        gl.glTexCoordPointer(2, GL10.GL_FLOAT, 0, textureCoordinates);
    }

    // Pass in vertex normals
    gl.glNormalPointer(GL10.GL_FLOAT, 0, normals);
    gl.glEnableClientState(GL10.GL_NORMAL_ARRAY);

    gl.glDrawElements(GL10.GL_TRIANGLES, numindices, GL10.GL_UNSIGNED_SHORT, indices);

    if(textureID >= 0){
        // Disable buffers
        gl.glDisableClientState(GL10.GL_NORMAL_ARRAY);
        gl.glDisableClientState(GL10.GL_TEXTURE_COORD_ARRAY);
    }
}
I'd really appreciate any help with this. It's frustrating to be not quite able to load the model from a file, and I'm really not sure what I'm doing wrong or missing here.
I have to admit to being a little confused by the framing of your code. Specific things I would expect to be an issue:

- You decline to copy a texture coordinate to the final mesh list for the third vertex associated with any face; this should put all of your coordinates out of sync after the first two.
- Your texture coordinate normalisation step is unnecessary (to the extent that I'm not sure why it's in there) and probably broken: what if xcoord is larger than ycoord on the first line, then smaller on the second?
- OBJ considers (0, 0) to be the top left of a texture while OpenGL considers it to be the bottom left, so unless you've set the texture matrix stack to invert texture coordinates in code not shown, you need to invert them yourself, e.g. textureCoordinatesMesh.add(1.0f - ycoord); (see the sketch after this list).

Besides that, a couple of generic OBJ comments that I'm sure you're already well aware of and that don't relate to the problem here: you should expect to handle files that don't supply normals, and files that supply neither normals nor texture coordinates (you currently assume both are present); and OBJ can hold faces with an arbitrary number of vertices, not just triangles. But they're always planar and convex, so you can just draw them as a fan or break them into triangles as though they were a fan.
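
Putting the first and third points together, here is a minimal sketch of what the f-handling loop body could look like, reusing the variable names from the question. Note it also reads the normal components as get(normalPosition + 1) and get(normalPosition + 2); the posted code adds 1 and 2 to the looked-up value instead, which looks like a misplaced parenthesis.

// sketch only: assumes every face is a triangle with v/vt/vn indices present
for(int v = 1; v < linetokens.length; v++){
    String[] vvtvn = linetokens[v].split("/");
    indices.add((short)(Short.parseShort(vvtvn[0]) - 1));

    // one texture coordinate pair per vertex, no normalisation,
    // y flipped for OpenGL's bottom-left origin
    int texturePosition = (Integer.parseInt(vvtvn[1]) - 1) * 2;
    textureCoordinatesMesh.add(textureCoordinates.get(texturePosition));
    textureCoordinatesMesh.add(1.0f - textureCoordinates.get(texturePosition + 1));

    int normalPosition = (Integer.parseInt(vvtvn[2]) - 1) * 3;
    vertexNormalsMesh.add(vertexNormals.get(normalPosition));
    vertexNormalsMesh.add(vertexNormals.get(normalPosition + 1));
    vertexNormalsMesh.add(vertexNormals.get(normalPosition + 2));
}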
