I have started to learn OpenGL ES 2.0. I am currently trying to make an object dynamic by calling the translateM method on it in the onDrawFrame() method. Unfortunately, when I do this, the object appears briefly and then disappears, and I'm not sure what is going on. If I put the same translation code in onSurfaceChanged(), it works. Here is my code (without the x variable that should move it along the x-axis, so I know it won't move; but even with the dynamic value removed, the object still disappears):
package com.background.gl.glcirclebackgroundanimation;
import static android.opengl.GLES20.GL_COLOR_BUFFER_BIT;
import static android.opengl.GLES20.glClear;
import static android.opengl.GLES20.glClearColor;
import static android.opengl.GLES20.glViewport;
import static android.opengl.Matrix.multiplyMM;
import static android.opengl.Matrix.setIdentityM;
import static android.opengl.Matrix.translateM;
import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;
import android.content.Context;
import android.opengl.GLSurfaceView.Renderer;
import com.background.gl.helper.ColorShaderProgram;
import com.background.gl.helper.TextureShaderProgram;
import com.background.gl.objects.GLCircle;
import com.background.gl.objects.Mallet;
import com.background.gl.objects.Table;
import com.background.gl.util.MatrixHelper;
import com.background.gl.util.TextureHelper;
public class CircleDynamicBackgroundRenderer implements Renderer {
private final Context context;
private final float[] projectionMatrix = new float[16];
private final float[] modelMatrix = new float[16];
private Table table;
private Mallet mallet;
private GLCircle circle;
float x = 0.01f;
private TextureShaderProgram textureProgram;
private ColorShaderProgram colorProgram;
private int texture;
public CircleDynamicBackgroundRenderer(Context context) {
this.context = context;
}
@Override
public void onSurfaceChanged(GL10 glUnused, int width, int height) {
glViewport(0, 0, width, height);
MatrixHelper.perspectiveM(projectionMatrix, 45, (float) width
/ (float) height, 1f, 10f);
setIdentityM(modelMatrix, 0);
}
@Override
public void onSurfaceCreated(GL10 glUnused, EGLConfig config) {
glClearColor(0.0f, 0.0f, 1.0f, 0.0f);
table = new Table();
mallet = new Mallet();
circle = new GLCircle();
textureProgram = new TextureShaderProgram(context);
colorProgram = new ColorShaderProgram(context);
texture = TextureHelper.loadTexture(context, R.drawable.air_hockey_surface);
//texture2 = TextureHelper.loadTexture(context, R.drawable.air_hockey_surface_2);
}
@Override
public void onDrawFrame(GL10 glUnused) {
//Clear the rendering surface
glClear(GL_COLOR_BUFFER_BIT);
//x+=0.01f;
translateM(modelMatrix, 0, 0f, 0f, -10f);
final float[] temp = new float[16];
multiplyMM(temp, 0, projectionMatrix, 0, modelMatrix, 0);
System.arraycopy(temp, 0, projectionMatrix, 0, temp.length);
textureProgram.useProgram();
//Pass data into our shaders(u_matrix) and enable/bind the texture
//textureProgram.setUniforms2(projectionMatrix, texture, texture2);
textureProgram.setUniforms(projectionMatrix, texture);
//Bind our [vertex array] data to our shaders(attribute data)
circle.bindData(textureProgram);
//Draw it
circle.draw();
/*
// Draw the mallets.
colorProgram.useProgram();
colorProgram.setUniforms(projectionMatrix);
mallet.bindData(colorProgram);
mallet.draw();*/
}
}
I've searched for multiple solutions but I couldn't get any of them to work. I could really use some assistance. :)
Edit: Is it that, because I'm multiplying the matrices, the object is automatically moved back -10 every frame? That would explain my issue.
Edit2: So I changed it from -10f to -0.01f, and it works. But when I try the same for the x-axis, it just stays the same. Why is that? Is it because the projection matrix divides all the values by w, so that the z changes but not x or y? How would I move it left and right along the x-axis then?
Edit3: My width and height are 768 and 1184. So why is there such a big difference if I translate it along the x-axis by 0.1f? Are the width and height 768 and 1184, or 1 and 1?
Edit4: OK, now even if I remove the translateM from onDrawFrame, it still moves. I'm lost.
This line is wrong:
System.arraycopy(temp, 0, projectionMatrix, 0, temp.length);
I don't know what you intended with it, but you should only ever overwrite the projection matrix when the window aspect ratio changes. The rest of the time that matrix has to stay constant, and you must not write to it. Because modelMatrix is never reset and the product is copied back into projectionMatrix every frame, the -10 translation accumulates and the object quickly moves past the far clipping plane, which is why it disappears. What you need to pass to the shader is the product of the projection and model (translation) matrices: when you multiply them, store the result in temp and pass temp to the shader instead.
Try it
translateM(modelMatrix, 0, 0.5f, 0f, -10f);
The X and Y values must be in the range [-1, 1].
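For reference, here is a minimal sketch of how onDrawFrame could look with that fix applied, assuming the same TextureShaderProgram and GLCircle classes from the question; the key points are rebuilding modelMatrix every frame and never writing back into projectionMatrix:

@Override
public void onDrawFrame(GL10 glUnused) {
    glClear(GL_COLOR_BUFFER_BIT);

    // Rebuild the model matrix every frame instead of accumulating translations.
    setIdentityM(modelMatrix, 0);
    translateM(modelMatrix, 0, x, 0f, -10f); // x may change per frame for animation

    // projectionMatrix stays untouched; the product goes to the shader instead.
    final float[] modelViewProjection = new float[16];
    multiplyMM(modelViewProjection, 0, projectionMatrix, 0, modelMatrix, 0);

    textureProgram.useProgram();
    textureProgram.setUniforms(modelViewProjection, texture);
    circle.bindData(textureProgram);
    circle.draw();
}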
Related
I'm writing an Android app using OpenGL ES and encountered this problem in the Nexus 5 emulator that comes with Android Studio. I have reduced my code to this small app, which simply draws a box going back and forth:
package net.jesbus.stuttertest;
import android.app.Activity;
import android.opengl.GLSurfaceView;
import android.os.Bundle;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;
import java.nio.ShortBuffer;
import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;
public class MainActivity extends Activity
{
@Override
protected void onCreate(Bundle savedInstanceState)
{
super.onCreate(savedInstanceState);
// Create GLSurfaceView
GLSurfaceView glsv = new GLSurfaceView(this);
glsv.setEGLConfigChooser(8, 8, 8, 8, 16, 0);
// Create GLSurfaceView.Renderer
glsv.setRenderer(new GLSurfaceView.Renderer()
{
float step = 0;
boolean direction = false;
ShortBuffer iBuff;
FloatBuffer vBuff;
@Override
public void onSurfaceCreated(GL10 gl, EGLConfig config)
{
// Generate vertices index buffer
short[] pIndex = {0, 1, 2, 3};
ByteBuffer pbBuff = ByteBuffer.allocateDirect(pIndex.length * 2);
pbBuff.order(ByteOrder.nativeOrder());
iBuff = pbBuff.asShortBuffer();
iBuff.put(pIndex);
iBuff.position(0);
// Generate vertices buffer
float[] vs = new float[]
{
-1, +1, 0,
+1, +1, 0,
-1, -1, 0,
+1, -1, 0,
};
ByteBuffer bBuff = ByteBuffer.allocateDirect(vs.length * 4);
bBuff.order(ByteOrder.nativeOrder());
vBuff = bBuff.asFloatBuffer();
vBuff.put(vs);
vBuff.position(0);
}
@Override
public void onDrawFrame(final GL10 gl)
{
// Animation calculation
step += direction ? 0.02f : -0.02f;
if (step > 1) direction = false;
else if (step < 0) direction = true;
// Set background color
gl.glClearColor(0.7f, 0.7f, 1, 1);
// Clear screen
gl.glClear(GL10.GL_COLOR_BUFFER_BIT | GL10.GL_DEPTH_BUFFER_BIT);
// Set matrix to correct location
gl.glLoadIdentity();
gl.glTranslatef(-1 + step * 2, 0, 0);
gl.glScalef(0.25f, 0.4f, 1);
// Draw box
gl.glEnableClientState(GL10.GL_VERTEX_ARRAY);
gl.glFrontFace(GL10.GL_CW);
gl.glVertexPointer(3, GL10.GL_FLOAT, 0, vBuff);
gl.glDrawElements(GL10.GL_TRIANGLE_STRIP, 4, GL10.GL_UNSIGNED_SHORT, iBuff);
gl.glDisableClientState(GL10.GL_VERTEX_ARRAY);
}
@Override
public void onSurfaceChanged(GL10 gl, int width, int height)
{
}
});
setContentView(glsv);
}
}
I looked at it frame by frame, and it seems that instead of showing the next frame, it shows the previous frame, and then skips the frame it was supposed to show and continues:
The circles represent frames produced in onDrawFrame, and the arrows represent the flow of time.
Video showing the problem
I don't know exactly how threading is used with OpenGL, but try this:
// Animation calculation
synchronized (this) {
step += direction ? 0.02f : -0.02f;
if (step > 1) direction = false;
else if (step < 0) direction = true;
}
or make the whole onDrawFrame() method synchronized if the compiler consents and OpenGL doesn't lock up...
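A minimal sketch of that alternative, assuming the anonymous renderer from the question; only the method signature changes and the body stays the same:

@Override
public synchronized void onDrawFrame(final GL10 gl)
{
    // Animation calculation, now guarded against concurrent access
    step += direction ? 0.02f : -0.02f;
    if (step > 1) direction = false;
    else if (step < 0) direction = true;

    // ... rest of the drawing code unchanged ...
}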
I'm drawing on Android with coordinates between -1.0 and 1.0, but the renderer is mapping these coordinates to the wrong places. In landscape, the width doesn't extend to the corners, and in portrait, the width extends over the screen borders slightly. I'm trying to make this fill the screen resolution. Here's my code:
package com.mycompany.brickbreaker;
import android.opengl.GLES20;
import android.opengl.GLSurfaceView;
import android.opengl.Matrix;
import android.util.Log;
import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;
public class MyGLRenderer implements GLSurfaceView.Renderer {
private BrickBreaker game;
public MyGLRenderer(BrickBreaker activity) {
super();
game = activity;
}
//mMVPMatrix stands for "Model View Projection Matrix"
private final float[] mMVPMatrix = new float[16];
private final float[] mProjectionMatrix = new float[16];
private final float[] mViewMatrix = new float[16];
public void onSurfaceCreated(GL10 unused, EGLConfig config) {
//Set the background frame color
GLES20.glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
}
@Override
public void onDrawFrame(GL10 unused){
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
//Set the camera position (View Matrix)
Matrix.setLookAtM(mViewMatrix, 0, 0, 0, -3, 0f, 0f, 0f, 0f, 1.0f, 0f);
//Calculate the projection and view transformation
Matrix.multiplyMM(mMVPMatrix, 0, mProjectionMatrix, 0, mViewMatrix, 0);
//Draw the rectangle!
game.draw(mMVPMatrix);
}
@Override
public void onSurfaceChanged(GL10 unused, int width, int height){
GLES20.glViewport(0, 0, width, height);
Log.d("Touch", "Surface changed at height/width " + height + "/" + width);
float ratio = (float) width/(float) height;
//This projection matrix is applied to object coordinates in
//the onDrawFrame() method
Matrix.frustumM(mProjectionMatrix, 0, -ratio, ratio, -1, 1, 3, 7);
}
public static int loadShader(int type, String shaderCode){
//create the shader type
int shader = GLES20.glCreateShader(type);
//add the code to the shader, and compile it
GLES20.glShaderSource(shader, shaderCode);
GLES20.glCompileShader(shader);
return shader;
}
}
I'm not an expert at OpenGL, but I do have a working engine.
In your onSurfaceChanged, you pass in the old projectionMatrix.
You should add the following code:
//Clear the matrices
for(int i=0;i<16;i++)
{
mProjectionMatrix[i] = 0.0f;
mViewMatrix[i] = 0.0f;
mMVPMatrix[i] = 0.0f;
}
// Setup our screen width and height for normal sprite translation.
Matrix.orthoM(mProjectionMatrix, 0, 0f, width, 0.0f, height, 0, 50);
// Set the camera position (View matrix)
Matrix.setLookAtM(mViewMatrix, 0, 0f, 0f, 1f, 0f, 0f, 0f, 0f, 1.0f, 0.0f);
// Calculate the projection and view transformation
Matrix.multiplyMM(mMVPMatrix, 0, mProjectionMatrix, 0, mViewMatrix, 0);
Hopefully that fixes your problem.
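As a small illustration (not from the original answer): with orthoM spanning 0..width and 0..height, vertex positions are given in pixels rather than the -1..1 range, so a 100 x 100 px quad in the lower-left corner of the screen would look like this:

// Under Matrix.orthoM(mProjectionMatrix, 0, 0f, width, 0.0f, height, 0, 50),
// x runs from 0 to width and y from 0 to height, both in pixels.
float[] quadCoords = {
      0f,   0f, 0f,   // bottom-left
    100f,   0f, 0f,   // bottom-right
      0f, 100f, 0f,   // top-left
    100f, 100f, 0f    // top-right
};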
So I've solved this. When the projection matrix is created with Matrix.frustumM(mProjectionMatrix, 0, -ratio, ratio, ...), the OpenGL coordinate space is changed from a -1 to 1 range in both X and Y into a -1 to 1 range in Y with a -ratio to ratio range in X. Multiplying every x-coordinate I used by the ratio passed to frustumM made everything happy again.
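A minimal sketch of that fix, assuming ratio is computed the same way as in onSurfaceChanged; the screenRelativeX name is just for illustration:

// ratio = (float) width / (float) height, as in onSurfaceChanged.
// frustumM(-ratio, ratio, -1, 1, ...) makes the visible x range [-ratio, ratio],
// so an x value meant on a -1..1 screen-relative scale is scaled up before use.
float screenRelativeX = 0.5f;            // halfway toward the right edge
float worldX = screenRelativeX * ratio;  // the x-coordinate actually passed to OpenGL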
I am developing an Android-based tracking system using the camera and other sensors. I am interested in reading the transformation matrix from AndAR rather than displaying some object (e.g. a cube) when the marker is detected. I have another tracking system, developed using a flavor of ARToolkit called jARToolkit, that runs on a desktop machine and gives the transformation matrix between the web camera and the pattern.
Right now I am getting the transformation matrix from AndAR, but if I compare it with the transformation matrix that I am getting from jARToolkit, it is totally different. The reason could be one of the following problems:
The surface preview that I see on Android is always rotated by 90 degrees, so my X and Y coordinates in the translation matrix exchange their positions.
I am not sure about the unit of the translation matrix. It comes to around 4 units per cm in the physical world, but there is no way for me to verify this.
I would appreciate it if anyone could help me address these questions or let me know if I am missing something. Thanks in advance.
Following is the code that I am using. It is pretty much the same as in the AndAR documentation.
boolean keepRunning = true;
CustomObject object_hiro = null; // declared here so it is in scope for the loop below
try {
ARToolkit artoolkit = getArtoolkit();
object_hiro = new CustomObject("test_hiro", "hiro.patt", 80.0, new double[] { 0, 0 });
artoolkit.registerARObject(object_hiro);
}
catch (AndARException ex)
{
System.out.println("");
}
while(keepRunning)
{
double[] transMatrix = (double[]) object_hiro.getTransMatrix();
}
and here is the CustomObject.java -
import java.nio.FloatBuffer;
import javax.microedition.khronos.opengles.GL10;
import edu.dhbw.andar.ARObject;
import edu.dhbw.andar.pub.SimpleBox;
import edu.dhbw.andar.util.GraphicsUtil;
/**
* An example of an AR object being drawn on a marker.
* @author tobi
*
*/
public class CustomObject extends ARObject {
public CustomObject(String name, String patternName,
double markerWidth, double[] markerCenter) {
super(name, patternName, markerWidth, markerCenter);
float mat_ambientf[] = {0f, 1.0f, 0f, 1.0f};
float mat_flashf[] = {0f, 1.0f, 0f, 1.0f};
float mat_diffusef[] = {0f, 1.0f, 0f, 1.0f};
float mat_flash_shinyf[] = {50.0f};
mat_ambient = GraphicsUtil.makeFloatBuffer(mat_ambientf);
mat_flash = GraphicsUtil.makeFloatBuffer(mat_flashf);
mat_flash_shiny = GraphicsUtil.makeFloatBuffer(mat_flash_shinyf);
mat_diffuse = GraphicsUtil.makeFloatBuffer(mat_diffusef);
}
public CustomObject(String name, String patternName,
double markerWidth, double[] markerCenter, float[] customColor) {
super(name, patternName, markerWidth, markerCenter);
float mat_flash_shinyf[] = {50.0f};
mat_ambient = GraphicsUtil.makeFloatBuffer(customColor);
mat_flash = GraphicsUtil.makeFloatBuffer(customColor);
mat_flash_shiny = GraphicsUtil.makeFloatBuffer(mat_flash_shinyf);
mat_diffuse = GraphicsUtil.makeFloatBuffer(customColor);
}
private SimpleBox box = new SimpleBox();
private FloatBuffer mat_flash;
private FloatBuffer mat_ambient;
private FloatBuffer mat_flash_shiny;
private FloatBuffer mat_diffuse;
/**
* Everything drawn here will be drawn directly onto the marker,
* as the corresponding translation matrix will already be applied.
*/
@Override
public final void draw(GL10 gl) {
super.draw(gl);
gl.glMaterialfv(GL10.GL_FRONT_AND_BACK, GL10.GL_SPECULAR,mat_flash);
gl.glMaterialfv(GL10.GL_FRONT_AND_BACK, GL10.GL_SHININESS, mat_flash_shiny);
gl.glMaterialfv(GL10.GL_FRONT_AND_BACK, GL10.GL_DIFFUSE, mat_diffuse);
gl.glMaterialfv(GL10.GL_FRONT_AND_BACK, GL10.GL_AMBIENT, mat_ambient);
//draw cube
gl.glColor4f(1.0f, 0f, 0, 1.0f);
gl.glTranslatef( 0.0f, 0.0f, 12.5f );
box.draw(gl);
}
@Override
public void init(GL10 gl) {
// TODO Auto-generated method stub
}
}
Please let me know if I need to provide additional information.. Thanks
The original C ARToolKit has two types of transformation associated with a marker:
a 3x4 matrix (computer vision matrix from the pose estimation, obtained from arGetTransMat)
a 4x4 matrix (an OpenGL-like matrix, obtained from argConvGLcpara with the above 3x4 matrix).
In AndAR:
3x4 matrix: can be obtained by calling getTransMatrix() from your ARObject.
4x4 matrix: not publicly accessible from your ARObject; the matrix is stored in glCameraMatrix (see the code of ARObject.java).
In JARToolKit:
3x4 matrix: can be obtained by calling getTransMatrix
4x4 matrix: can be obtained by calling getCamTransMatrix
Maybe you are accessing a different matrix in AndAR than in JARToolKit.
The unit is relative to your marker size. Generally it's in mm; the 80.0 parameter in your declaration of object_hiro represents an 80 mm marker width. You can print your marker at this size so that your physical marker and the virtual content match.
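If the 80 mm marker width is correct, the translation part of the 3x4 matrix should be in millimetres. Here is a hedged sketch of reading it and converting to centimetres for comparison with the desktop output; the row-major 3x4 layout with the translation in indices 3, 7 and 11 is an assumption about AndAR's getTransMatrix and should be verified against ARObject.java:

double[] transMatrix = (double[]) object_hiro.getTransMatrix();

// Assumed layout: 3x4 row-major, translation in the last column (indices 3, 7, 11).
double txCm = transMatrix[3]  / 10.0; // mm -> cm
double tyCm = transMatrix[7]  / 10.0;
double tzCm = transMatrix[11] / 10.0;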
Are there any special emulator settings needed to run OpenGL apps?
I already set the "GPU emulation" property to "yes".
I am trying to run an Android sample live wallpaper, using the sample source found at this link. The desired output is a rotating triangle.
After a little effort I got the app running, but it doesn't draw anything in the emulator; when I tested on a device it works. In the emulator it still just shows a green screen. I found a discussion about it in Google Groups here and tried to set the viewport as suggested there, but it still doesn't show any result. In onSurfaceChanged I added this line:
gl.glViewport(0, 0, width, height);
Is this the correct way to set the viewport?
This is my renderer class:
public class MyRenderer implements GLWallpaperService.Renderer {
GLTriangle mTriangle;
public void onDrawFrame(GL10 gl) {
gl.glClearColor(0.2f, 0.4f, 0.2f, 1f);
gl.glClear(GL10.GL_COLOR_BUFFER_BIT | GL10.GL_DEPTH_BUFFER_BIT);
gl.glMatrixMode(GL10.GL_MODELVIEW);
autoRotate(gl);
gl.glColor4f(.2f, 0f, .5f, 1f);
mTriangle.draw(gl);
}
public void onSurfaceChanged(GL10 gl, int width, int height) {
gl.glViewport(0, 0, width, height);
gl.glMatrixMode(GL10.GL_PROJECTION);
gl.glLoadIdentity();
GLU.gluPerspective(gl, 60f, (float)width/(float)height, 1f, 100f);
gl.glMatrixMode(GL10.GL_MODELVIEW);
gl.glLoadIdentity();
gl.glTranslatef(0, 0, -5);
}
public void onSurfaceCreated(GL10 gl, EGLConfig config) {
mTriangle = new GLTriangle();
gl.glClearDepthf(1f);
gl.glEnable(GL10.GL_DEPTH_TEST);
gl.glDepthFunc(GL10.GL_LEQUAL);
}
/**
* Called when the engine is destroyed. Do any necessary clean up because
* at this point your renderer instance is now done for.
*/
public void release() {
}
private void autoRotate(GL10 gl) {
gl.glRotatef(1, 0, 1, 0);
gl.glRotatef(0.5f, 1, 0, 0);
}
}
Here is the GLTriangle class:
import java.nio.FloatBuffer;
import java.nio.ShortBuffer;
import javax.microedition.khronos.opengles.GL10;
public class GLTriangle {
private FloatBuffer _vertexBuffer;
private final int _nrOfVertices = 3;
private ShortBuffer _indexBuffer;
public GLTriangle() {
init();
}
private void init() {
// We use ByteBuffer.allocateDirect() to get memory outside of
// the normal, garbage collected heap. I think this is done
// because the buffer is subject to native I/O.
// See http://download.oracle.com/javase/1.4.2/docs/api/java/nio/ByteBuffer.html#direct
// 3 is the number of coordinates to each vertex.
_vertexBuffer = BufferFactory.createFloatBuffer(_nrOfVertices * 3);
_indexBuffer = BufferFactory.createShortBuffer(_nrOfVertices);
// Coordinates for the vertexes of the triangle.
float[] coords = {
-1f, -1f, 0f, // (x1, y1, z1)
1f, -1f, 0f, // (x2, y2, z2)
0f, 1f, 0f // (x3, y3, z3)
};
short[] _indicesArray = {0, 1, 2};
_vertexBuffer.put(coords);
_indexBuffer.put(_indicesArray);
_vertexBuffer.position(0);
_indexBuffer.position(0);
}
public void draw(GL10 gl) {
// 3 coordinates in each vertex
// 0 is the space between each vertex. They are densely packed
// in the array, so the value is 0
gl.glVertexPointer(3, GL10.GL_FLOAT, 0, getVertexBuffer());
// Draw the primitives, in this case, triangles.
gl.glDrawElements(GL10.GL_TRIANGLES, _nrOfVertices, GL10.GL_UNSIGNED_SHORT, _indexBuffer);
}
private FloatBuffer getVertexBuffer() {
return _vertexBuffer;
}
}
What's going wrong here? Is there better sample code for an OpenGL live wallpaper?
AT LAST I FOUND IT!
What I needed to do was just add
gl.glEnableClientState(GL10.GL_VERTEX_ARRAY);
to the onSurfaceCreated method, along with the line
gl.glViewport(0, 0, width, height);
in the onSurfaceChanged method of my MyRenderer class.
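Putting both lines in place, a minimal sketch of how the two callbacks end up looking in MyRenderer (everything else stays as in the question):

public void onSurfaceCreated(GL10 gl, EGLConfig config) {
    mTriangle = new GLTriangle();
    gl.glClearDepthf(1f);
    gl.glEnable(GL10.GL_DEPTH_TEST);
    gl.glDepthFunc(GL10.GL_LEQUAL);
    // Without this, glVertexPointer/glDrawElements draw nothing on some targets.
    gl.glEnableClientState(GL10.GL_VERTEX_ARRAY);
}

public void onSurfaceChanged(GL10 gl, int width, int height) {
    gl.glViewport(0, 0, width, height);
    gl.glMatrixMode(GL10.GL_PROJECTION);
    gl.glLoadIdentity();
    GLU.gluPerspective(gl, 60f, (float) width / (float) height, 1f, 100f);
    gl.glMatrixMode(GL10.GL_MODELVIEW);
    gl.glLoadIdentity();
    gl.glTranslatef(0, 0, -5);
}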
I found a similar question on Stack Overflow itself [but the solution that worked for me is not marked as correct :( ].
I have been trying to make a point rotate around another point in OpenGL ES for Android. It works, but in a particular way: as the rotation angle gets bigger (i.e. close to 90°), the point gets further away from the centre of rotation. Eventually the point rotates around the centre of rotation in an elliptical orbit, but I want it to rotate in a circular fashion. Does anyone know how I could do this? Thank you.
package org.example.pointtest;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;
import javax.microedition.khronos.opengles.GL10;
public class LegRoot
{
public FloatBuffer hipVertexBuffer;
float[]hip={1.75f,-2.75f,0.0f};//0 hip
float[]knee={1.75f,-6.75f,0.0f};//1 knee
float[]ankle={1.75f,-10.75f,0.0f};//2 ankle
public float distance2D(float[]origin,float[]extremity)
{
float a=extremity[0]-origin[0];
float b=extremity[1]-origin[1];
float c=extremity[2]-origin[2];
float[] d={a,b,c};
return d[1];
}
public LegRoot()
{
float []hippoint=
{
1.75f,-2.75f,0.0f
};//0 hip
ByteBuffer vbb = ByteBuffer.allocateDirect(1*3*4);
vbb.order(ByteOrder.nativeOrder());
hipVertexBuffer=vbb.asFloatBuffer();
hipVertexBuffer.put(hippoint);
hipVertexBuffer.position(0);
}
public void hip(GL10 gl)
{
gl.glEnableClientState(GL10.GL_VERTEX_ARRAY);
gl.glVertexPointer(3, GL10.GL_FLOAT, 0,hipVertexBuffer);// root joint transformation matrix(supposition)
gl.glPushMatrix();
gl.glColor4f(1f, 0f, 0f, 1f);
gl.glRotatef(0f, 0, 0, 1);
gl.glTranslatef(0f,0f, 0);
gl.glDrawArrays(GL10.GL_POINTS, 0, 1);
gl.glPopMatrix();
}
public void knee(GL10 gl)
{
gl.glPushMatrix();
gl.glTranslatef(-hip[0], -hip[1], 0);
gl.glRotatef(0f, 0, 0, 1);
gl.glTranslatef(hip[0], hip[1], 0);
gl.glTranslatef(0,distance2D(hip,knee), 0);
hip(gl);
gl.glPopMatrix();
}
public void ankle(GL10 gl)
{
gl.glPushMatrix();
gl.glTranslatef(-knee[0], -knee[1], 0);
gl.glRotatef(90f, 0, 0, 1);
gl.glTranslatef(knee[0], knee[1], 0);
gl.glTranslatef(0, distance2D(knee, ankle), 0);
knee(gl);
gl.glPopMatrix();
}
}
Just to have an answer here, I copied my comment.
You probably have a glScale in your matrix. You could try loading the identity matrix before drawing to ensure there is no scale involved, but that would also remove all the other transformations you did before.
Also note: when you have a non-square display and don't take the aspect ratio of your display into account, things will also look scaled.
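A minimal sketch of compensating for the aspect ratio in onSurfaceChanged, assuming a fixed-function GL10 setup like the one in the question; the ±10 extent is arbitrary and would need to match your scene:

public void onSurfaceChanged(GL10 gl, int width, int height) {
    gl.glViewport(0, 0, width, height);
    float ratio = (float) width / (float) height;

    gl.glMatrixMode(GL10.GL_PROJECTION);
    gl.glLoadIdentity();
    // Widen the x range by the aspect ratio so one unit is the same size on both axes;
    // without this, a circular rotation looks elliptical on a non-square screen.
    gl.glOrthof(-10f * ratio, 10f * ratio, -10f, 10f, -1f, 1f);

    gl.glMatrixMode(GL10.GL_MODELVIEW);
    gl.glLoadIdentity();
}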
For your second problem: can you post the rotate statement you mean? It might also be better to open a new question for a new problem.