order of android rotation matrix conversion - android

Android uses the following code to calculate the rotation matrix:
float Ax = gravity[0];
float Ay = gravity[1];
float Az = gravity[2];
final float Ex = geomagnetic[0];
final float Ey = geomagnetic[1];
final float Ez = geomagnetic[2];
float Hx = Ey*Az - Ez*Ay;
float Hy = Ez*Ax - Ex*Az;
float Hz = Ex*Ay - Ey*Ax;
final float normH = (float)Math.sqrt(Hx*Hx + Hy*Hy + Hz*Hz);
if (normH < 0.1f) {
// device is close to free fall (or in space?), or close to
// magnetic north pole. Typical values are > 100.
return false;
}
final float invH = 1.0f / normH;
Hx *= invH;
Hy *= invH;
Hz *= invH;
final float invA = 1.0f / (float)Math.sqrt(Ax*Ax + Ay*Ay + Az*Az);
Ax *= invA;
Ay *= invA;
Az *= invA;
final float Mx = Ay*Hz - Az*Hy;
final float My = Az*Hx - Ax*Hz;
final float Mz = Ax*Hy - Ay*Hx;
if (R != null) {
if (R.length == 9) {
R[0] = Hx; R[1] = Hy; R[2] = Hz;
R[3] = Mx; R[4] = My; R[5] = Mz;
R[6] = Ax; R[7] = Ay; R[8] = Az;
} else if (R.length == 16) {
R[0] = Hx; R[1] = Hy; R[2] = Hz; R[3] = 0;
R[4] = Mx; R[5] = My; R[6] = Mz; R[7] = 0;
R[8] = Ax; R[9] = Ay; R[10] = Az; R[11] = 0;
R[12] = 0; R[13] = 0; R[14] = 0; R[15] = 1;
}
}
I would like to know the logic behind this.
Also, what is the order of rotation used?
I want to correct the rotation using the rotation matrix, so the order of calculation used by Android is important.

Android assumes the gravity parameter is a vector lying along the world sky axis. That is, if (w_1, w_2, w_3) is the world basis, where w_1 is a unit vector pointing East, w_2 is a unit vector pointing North, and w_3 is a unit vector pointing toward the sky, then the gravity parameter is a multiple of w_3. Therefore the normalized gravity parameter is w_3.
The code also assumes the geomagnetic field parameter is a vector lying in the plane spanned by w_2 and w_3. Thus the cross product of the geomagnetic field parameter and the gravity parameter, once normalized, is a unit vector orthogonal to the plane spanned by w_2 and w_3. Therefore this product is just w_1.
Now the cross product of w_3 and w_1 is w_2. Thus you get the change of basis from the device coordinates to the world coordinates.
I do not understand what you mean by "the order of rotation used", so I cannot answer that question.
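To make the change of basis concrete, here is a minimal sketch in plain Java (not Android code; the sensor values are made up for illustration) that builds the same rows H, M, A and uses them to express a device-frame vector in East/North/Up coordinates:

public class RotationDemo {
    // Cross product helper: returns a x b.
    static float[] cross(float[] a, float[] b) {
        return new float[] {
            a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]
        };
    }

    static float[] normalize(float[] v) {
        float n = (float) Math.sqrt(v[0] * v[0] + v[1] * v[1] + v[2] * v[2]);
        return new float[] { v[0] / n, v[1] / n, v[2] / n };
    }

    public static void main(String[] args) {
        // Example sensor readings in device coordinates (made-up values).
        float[] gravity     = { 0.0f, 4.9f, 8.5f };    // A: points toward the sky after normalizing
        float[] geomagnetic = { 1.0f, 20.0f, -35.0f }; // E: assumed to lie in the North-Sky plane

        float[] H = normalize(cross(geomagnetic, gravity)); // East  (w_1) in device coords
        float[] A = normalize(gravity);                     // Up    (w_3) in device coords
        float[] M = cross(A, H);                            // North (w_2) in device coords

        // Rows of R are the world axes expressed in device coordinates,
        // always in the order East, North, Up -- exactly how getRotationMatrix fills R.
        float[][] R = { H, M, A };

        // Multiplying R by a device-frame vector yields its East/North/Up components
        // (the change of basis described above).
        float[] deviceVec = { 0f, 0f, 1f }; // the device Z axis
        float east  = R[0][0] * deviceVec[0] + R[0][1] * deviceVec[1] + R[0][2] * deviceVec[2];
        float north = R[1][0] * deviceVec[0] + R[1][1] * deviceVec[1] + R[1][2] * deviceVec[2];
        float up    = R[2][0] * deviceVec[0] + R[2][1] * deviceVec[1] + R[2][2] * deviceVec[2];
        System.out.println("device Z in world frame: E=" + east + " N=" + north + " U=" + up);
    }
}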

Related

Detect Device Near or Away from Ear - Android code

I need to know whether the device is near the ear or not by using sensors.
I tried using the proximity sensor; I want to combine the accelerometer and gyroscope sensors to determine exactly whether the device is near or far from the ear.
Code for Proximity
@Override
public void onSensorChanged(SensorEvent event) {
float distance = event.values[0];
if (event.sensor.getType() == Sensor.TYPE_PROXIMITY) {
if (distance < mProximity.getMaximumRange()) {
iv.setText("Near");
} else {
iv.setText("far");
}
}
}
This is what I got from the Android documentation. I am sure you can dig more to get some answers to your problem, but this should be enough to get you started. You can also do some research about position sensors in Android; the documentation is quite useful.
// Create a constant to convert nanoseconds to seconds.
private static final float NS2S = 1.0f / 1000000000.0f;
private final float[] deltaRotationVector = new float[4];
private float timestamp;
public void onSensorChanged(SensorEvent event) {
// This timestep's delta rotation to be multiplied by the current rotation
// after computing it from the gyro sample data.
if (timestamp != 0) {
final float dT = (event.timestamp - timestamp) * NS2S;
// Axis of the rotation sample, not normalized yet.
float axisX = event.values[0];
float axisY = event.values[1];
float axisZ = event.values[2];
// Calculate the angular speed of the sample
float omegaMagnitude = (float) Math.sqrt(axisX*axisX + axisY*axisY + axisZ*axisZ);
// Normalize the rotation vector if it's big enough to get the axis
// (that is, EPSILON should represent your maximum allowable margin of error)
if (omegaMagnitude > EPSILON) {
axisX /= omegaMagnitude;
axisY /= omegaMagnitude;
axisZ /= omegaMagnitude;
}
// Integrate around this axis with the angular speed by the timestep
// in order to get a delta rotation from this sample over the timestep
// We will convert this axis-angle representation of the delta rotation
// into a quaternion before turning it into the rotation matrix.
float thetaOverTwo = omegaMagnitude * dT / 2.0f;
float sinThetaOverTwo = (float) Math.sin(thetaOverTwo);
float cosThetaOverTwo = (float) Math.cos(thetaOverTwo);
deltaRotationVector[0] = sinThetaOverTwo * axisX;
deltaRotationVector[1] = sinThetaOverTwo * axisY;
deltaRotationVector[2] = sinThetaOverTwo * axisZ;
deltaRotationVector[3] = cosThetaOverTwo;
}
timestamp = event.timestamp;
float[] deltaRotationMatrix = new float[9];
SensorManager.getRotationMatrixFromVector(deltaRotationMatrix,
deltaRotationVector);
// User code should concatenate the delta rotation we computed with the current rotation
// in order to get the updated rotation.
// rotationCurrent = rotationCurrent * deltaRotationMatrix;
}
}
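The final comment above leaves the matrix concatenation to user code. A minimal sketch of that step, assuming rotationCurrent is a row-major 3x3 float[9] initialized to the identity (the helper name multiply3x3 is just for illustration):

    // result = a * b for row-major 3x3 matrices stored as float[9].
    static float[] multiply3x3(float[] a, float[] b) {
        float[] out = new float[9];
        for (int row = 0; row < 3; row++) {
            for (int col = 0; col < 3; col++) {
                out[row * 3 + col] = a[row * 3]     * b[col]
                                   + a[row * 3 + 1] * b[3 + col]
                                   + a[row * 3 + 2] * b[6 + col];
            }
        }
        return out;
    }

    // At the end of onSensorChanged:
    // rotationCurrent = multiply3x3(rotationCurrent, deltaRotationMatrix);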

How does Android's SensorManager.getRotation work (i.e. the math)?

I generally understand what it's intended to do ("Computes the inclination matrix I as well as the rotation matrix R transforming a vector from the device coordinate system to the world's coordinate system") but I don't get it how it works.
The javadoc is well written and the source is available here, but I don't understand the math (for example, what is the mathematical/physical meaning of the values Hx, Hy and Hz? For example: Hx = Ey*Az - Ez*Ay). Also, what happens later in the method?
I've pasted the code from the GrepCode link above, leaving the source line numbers for easy reference. Thank you.
971 public static boolean getRotationMatrix(float[] R, float[] I,
972 float[] gravity, float[] geomagnetic) {
973 // TODO: move this to native code for efficiency
974 float Ax = gravity[0];
975 float Ay = gravity[1];
976 float Az = gravity[2];
977 final float Ex = geomagnetic[0];
978 final float Ey = geomagnetic[1];
979 final float Ez = geomagnetic[2];
980 float Hx = Ey*Az - Ez*Ay;
981 float Hy = Ez*Ax - Ex*Az;
982 float Hz = Ex*Ay - Ey*Ax;
983 final float normH = (float)Math.sqrt(Hx*Hx + Hy*Hy + Hz*Hz);
984 if (normH < 0.1f) {
985 // device is close to free fall (or in space?), or close to
986 // magnetic north pole. Typical values are > 100.
987 return false;
988 }
989 final float invH = 1.0f / normH;
990 Hx *= invH;
991 Hy *= invH;
992 Hz *= invH;
993 final float invA = 1.0f / (float)Math.sqrt(Ax*Ax + Ay*Ay + Az*Az);
994 Ax *= invA;
995 Ay *= invA;
996 Az *= invA;
997 final float Mx = Ay*Hz - Az*Hy;
998 final float My = Az*Hx - Ax*Hz;
999 final float Mz = Ax*Hy - Ay*Hx;
1000 if (R != null) {
1001 if (R.length == 9) {
1002 R[0] = Hx; R[1] = Hy; R[2] = Hz;
1003 R[3] = Mx; R[4] = My; R[5] = Mz;
1004 R[6] = Ax; R[7] = Ay; R[8] = Az;
1005 } else if (R.length == 16) {
1006 R[0] = Hx; R[1] = Hy; R[2] = Hz; R[3] = 0;
1007 R[4] = Mx; R[5] = My; R[6] = Mz; R[7] = 0;
1008 R[8] = Ax; R[9] = Ay; R[10] = Az; R[11] = 0;
1009 R[12] = 0; R[13] = 0; R[14] = 0; R[15] = 1;
1010 }
1011 }
1012 if (I != null) {
1013 // compute the inclination matrix by projecting the geomagnetic
1014 // vector onto the Z (gravity) and X (horizontal component
1015 // of geomagnetic vector) axes.
1016 final float invE = 1.0f / (float)Math.sqrt(Ex*Ex + Ey*Ey + Ez*Ez);
1017 final float c = (Ex*Mx + Ey*My + Ez*Mz) * invE;
1018 final float s = (Ex*Ax + Ey*Ay + Ez*Az) * invE;
1019 if (I.length == 9) {
1020 I[0] = 1; I[1] = 0; I[2] = 0;
1021 I[3] = 0; I[4] = c; I[5] = s;
1022 I[6] = 0; I[7] =-s; I[8] = c;
1023 } else if (I.length == 16) {
1024 I[0] = 1; I[1] = 0; I[2] = 0;
1025 I[4] = 0; I[5] = c; I[6] = s;
1026 I[8] = 0; I[9] =-s; I[10]= c;
1027 I[3] = I[7] = I[11] = I[12] = I[13] = I[14] = 0;
1028 I[15] = 1;
1029 }
1030 }
1031 return true;
1032 }
They are the cross product. The assumption is that one of the parameters passed to the method is gravity, and thus is the Sky axis. The other assumption is that the magnetic parameter passed in lies in the North-Sky plane. Thus the cross product of these two vectors is a vector orthogonal to the North-Sky plane, which is East. Now the cross product of Sky and East is a vector orthogonal to both of these, which is North. Normalizing all of these gives an orthonormal basis for the world coordinates.
In the calculation above, H is East and M is North.
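For completeness, a short usage sketch (not from the original answer; it assumes you already copy the latest accelerometer and magnetometer readings into float[3] buffers named lastAccelerometer and lastMagnetometer) showing how R and I are typically consumed:

    float[] R = new float[9];
    float[] I = new float[9];
    float[] orientation = new float[3];

    if (SensorManager.getRotationMatrix(R, I, lastAccelerometer, lastMagnetometer)) {
        // Rows of R are East, North, Up expressed in device coordinates.
        SensorManager.getOrientation(R, orientation);         // azimuth, pitch, roll in radians
        float inclination = SensorManager.getInclination(I);  // magnetic dip angle in radians
        float azimuthDeg = (float) Math.toDegrees(orientation[0]);
    }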

Andengine - Shooting bullets in front of rotating gun

Hello, I searched in the forum but couldn't find a helpful answer.
I'm making a game with AndEngine and I've been stuck for 3 days on shooting from a rotating sprite.
This is my code and how I rotate the gun. I tried to shoot a bullet here, but it shoots from the wrong starting point; I want to shoot the bullet from the end of the gun.
@Override
public boolean onSceneTouchEvent(Scene pScene, TouchEvent pSceneTouchEvent) {
if(pSceneTouchEvent.isActionMove()){
final float dX = pSceneTouchEvent.getX() - machine.getX();
final float dY = pSceneTouchEvent.getY() - machine.getY();
float angle = (float) Math.atan2(dX,dY);
float rotation = MathUtils.radToDeg(angle) + 1;
machine.setRotation(rotation - 90);
Log.d("BUG",machine.getRotation() + "");
if(machine.getRotation() >= 84 ){
machine.setRotation(84);
}
if(machine.getRotation() <= -54 ){
machine.setRotation(-54);
}
final int incrementXValue = 15;
long sElapsed = System.currentTimeMillis() - lastFire;
if(bulletsAmout > 0 && sElapsed > cooldownBetweenShoot * cdModd){
e = new Entity(0,machine.getY());
e.setRotation(getRotation());
SceneManager.getInstance().getCurrentScene().attachChild(e);
float x2 = (float) (machine.getSceneCenterCoordinates()[0] + machine.getWidth() /2 * Math.cos(machine.getRotation()));
float y2 = (float) (machine.getSceneCenterCoordinates()[1] + machine.getWidth() /2 * Math.sin(machine.getRotation()));
float realX = (float) (Math.toRadians(x2) + machine.getWidth());
realY = (float) Math.toRadians(y2);
bullets = new Sprite(realX,realY, resourcesManager.bulletRegion.deepCopy(), vbom){
protected void onManagedUpdate(float pSecondsElapsed) {
float currentX = this.getX();
this.setX(currentX + incrementXValue);
super.onManagedUpdate(pSecondsElapsed);
}
};
bullets.setScale(0.06f);
e.attachChild(bullets);
projectilesToBeAdded.add(bullets);
bulletsAmout--;
lastFire = System.currentTimeMillis();
setBulletsText(bulletsAmout);
resourcesManager.pistolSound.play();
}
return true;
}
return false;
}
Assuming you are using GLES2-AnchorCenter:
You can position the bullet by setting it to the position of the end of the gun, which you can get by calling gun.convertLocalToSceneCoordinates(gunMuzzleX, gunMuzzleY).
Then set the bullet's rotation to the rotation of the gun.
Apply velocity to the bullet. Calculate the speed vector as follows: FloatMath.sin(rotationOfBulletInRadians) * speed and FloatMath.cos(rotationOfBulletInRadians) * speed.
Be aware that you have to pass the rotation to the sin and cos functions in radians, NOT in degrees!
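A hedged sketch of that recipe using the names from the question (the muzzle offsets gunMuzzleX/gunMuzzleY, the speed value, and which trig function feeds which axis depend on your sprite's default orientation, so treat them as assumptions to adjust):

    // Muzzle position in scene coordinates (gunMuzzleX/gunMuzzleY are local coords of the gun tip).
    final float[] muzzle = machine.convertLocalToSceneCoordinates(gunMuzzleX, gunMuzzleY);

    final Sprite bullet = new Sprite(muzzle[0], muzzle[1],
            resourcesManager.bulletRegion.deepCopy(), vbom);
    bullet.setRotation(machine.getRotation());

    // Velocity from the gun's rotation; AndEngine rotations are in degrees, so convert first.
    final float rotationRad = (float) Math.toRadians(machine.getRotation());
    final float speed = 15f; // pixels per update step, an assumption
    final float vx = (float) Math.cos(rotationRad) * speed;
    final float vy = (float) Math.sin(rotationRad) * speed;

From there you can move the bullet each update by adding vx and vy to its position, much like the onManagedUpdate override in the question.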
So I found how to fix that.
The problem is in this line of code :
e = new Entity(0,machine.getY());
It should be:
e = new Entity(machine.getX() - (machine.getHeight() / 2), machine.getY());

Compute rotation matrix using the magnetic field

The getRotationMatrix method has the signature public static boolean getRotationMatrix(float[] R, float[] I, float[] gravity, float[] geomagnetic).
How can I calculate the float[] gravity?
I found a code sample that calculates the orientation using both the accelerometer and the magnetic field:
boolean success = SensorManager.getRotationMatrix(
matrixR,
matrixI,
valuesAccelerometer,
valuesMagneticField);
if(success){
SensorManager.getOrientation(matrixR, matrixValues);
double azimuth = Math.toDegrees(matrixValues[0]);
double pitch = Math.toDegrees(matrixValues[1]);
double roll = Math.toDegrees(matrixValues[2]);
readingAzimuth.setText("Azimuth: " + String.valueOf(azimuth));
readingPitch.setText("Pitch: " + String.valueOf(pitch));
readingRoll.setText("Roll: "+String.valueOf(roll));
}
My questions are:
Is the orientation value the rotation matrix value?
If not, how can I implement this code to get the rotation matrix value using the magnetic field?
To get the rotation matrix I use this code:
public void onSensorChanged(SensorEvent sensorEvent) {
if (timestamp != 0) {
final double dT = (sensorEvent.timestamp - timestamp) * NS2S;
double magneticX = sensorEvent.values[0];
double magneticY = sensorEvent.values[1];
double magneticZ = sensorEvent.values[2];
double omegaMagnitude =Math.sqrt(magneticX*magneticX + magneticY*magneticY + magneticZ*magneticZ);
if (omegaMagnitude > EPSILON) {
magneticX /= omegaMagnitude;
magneticY /= omegaMagnitude;
magneticZ /= omegaMagnitude;
}
double thetaOverTwo = omegaMagnitude * dT / 2.0f;
double sinThetaOverTwo =Math.sin(thetaOverTwo);
double cosThetaOverTwo = Math.cos(thetaOverTwo);
deltaRotationVector[0] = (double) (sinThetaOverTwo * magneticX);
deltaRotationVector[1] = (double) (sinThetaOverTwo * magneticY);
deltaRotationVector[2] = (double) (sinThetaOverTwo * magneticZ);
deltaRotationVector[3] = cosThetaOverTwo;
}
double[] deltaRotationMatrix = new double[9];
SensorManager.getRotationMatrixFromVector(deltaRotationMatrix, deltaRotationVector);
}
But the problem is that getRotationMatrixFromVector is reported as undefined here. Any idea?
Orientation is not a rotation matrix, as it only provides you with angles relative to magnetic North. You can obtain the rotation matrix (direction cosine matrix), which will help you transform coordinates from your device frame to the Earth's frame, from the azimuth, pitch and roll angles (all in radians). [The matrix itself appeared as an image in the original answer; the next answer gives the Android form explicitly.]
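As a side note on the question "how can I calculate the float[] gravity?": gravity is simply the latest accelerometer (or TYPE_GRAVITY) reading copied into a float[3], alongside the magnetometer reading. A minimal sketch, assuming a listener registered for both sensor types and the field names shown here:

    private final float[] valuesAccelerometer = new float[3];
    private final float[] valuesMagneticField = new float[3];
    private final float[] matrixR = new float[9];
    private final float[] matrixI = new float[9];
    private final float[] matrixValues = new float[3];

    @Override
    public void onSensorChanged(SensorEvent event) {
        switch (event.sensor.getType()) {
            case Sensor.TYPE_ACCELEROMETER:
                System.arraycopy(event.values, 0, valuesAccelerometer, 0, 3);
                break;
            case Sensor.TYPE_MAGNETIC_FIELD:
                System.arraycopy(event.values, 0, valuesMagneticField, 0, 3);
                break;
        }
        // Note: SensorManager's rotation-matrix methods take float[] arrays, not double[].
        if (SensorManager.getRotationMatrix(matrixR, matrixI,
                valuesAccelerometer, valuesMagneticField)) {
            SensorManager.getOrientation(matrixR, matrixValues); // azimuth, pitch, roll in radians
        }
    }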
I know that this is an old thread but in case it helps, for Android I think the 3x3 rotation matrix is actually given by a variation of the approved answer. To be specific, in Android the rotation matrix is
(cos φ cos ψ - sin φ sin ψ sin θ)     sin φ cos θ     (cos φ sin ψ + sin φ cos ψ sin θ)
-(sin φ cos ψ + cos φ sin ψ sin θ)    cos φ cos θ     (-sin φ sin ψ + cos φ cos ψ sin θ)
-sin ψ cos θ                          -sin θ           cos ψ cos θ
where
φ = azimuth
θ = pitch
ψ = roll
which corresponds to the 3x3 Android rotation matrix R[0] to R[8] (matrixR in the question) via
R[0] R[1] R[2]
R[3] R[4] R[5]
R[6] R[7] R[8]
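If you want to sanity-check that formula against what Android computes, a small sketch (reusing matrixR and matrixValues from the question; purely illustrative) is:

    float phi   = matrixValues[0]; // azimuth, radians
    float theta = matrixValues[1]; // pitch, radians
    float psi   = matrixValues[2]; // roll, radians

    float[] fromAngles = {
        (float) ( Math.cos(phi) * Math.cos(psi) - Math.sin(phi) * Math.sin(psi) * Math.sin(theta)),
        (float) ( Math.sin(phi) * Math.cos(theta)),
        (float) ( Math.cos(phi) * Math.sin(psi) + Math.sin(phi) * Math.cos(psi) * Math.sin(theta)),
        (float) (-(Math.sin(phi) * Math.cos(psi) + Math.cos(phi) * Math.sin(psi) * Math.sin(theta))),
        (float) ( Math.cos(phi) * Math.cos(theta)),
        (float) (-Math.sin(phi) * Math.sin(psi) + Math.cos(phi) * Math.cos(psi) * Math.sin(theta)),
        (float) (-Math.sin(psi) * Math.cos(theta)),
        (float) (-Math.sin(theta)),
        (float) ( Math.cos(psi) * Math.cos(theta))
    };
    // Each fromAngles[i] should match matrixR[i] up to sensor noise and rounding.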

Following a straight line (via Path?)

I'm working on a game which will use projectiles. So I've made a Projectile class and a new instance is created when the user touches the screen:
@Override
public boolean onTouch(View v, MotionEvent e){
float touch_x = e.getX();
float touch_y = e.getY();
new Projectile(touch_x, touch_y);
return true;
}
And the Projectile class:
public class Projectile{
float target_x;
float target_y;
Path line;
public Projectile(float x, float y){
target_x = x;
target_y = y;
line = new Path();
line.moveTo(MyGame.mPlayerXPos, MyGame.mPlayerYPos);
line.lineTo(target_x, target_y);
}
}
So this makes a Path with 2 points, the player's position and the touch coords. My question is: how can you access points on this line? For example, if I wanted to get the x,y coords of the Projectile at the half-way point of the line, or the point the Projectile would be at after 100 ticks (moving at a speed of X pixels/tick)?
I also need the Projectile to continue moving after it reaches the final point. Do I need to use line.addPath(line) to keep extending the Path?
EDIT
I managed to get the Projectiles moving in a straight line, but they're going in strange directions. I had to fudge some code up:
private void moveProjectiles(){
ListIterator<Projectile> it = Registry.proj.listIterator();
while ( it.hasNext() ){
Projectile p = it.next();
p.TimeAlive++;
double dist = p.TimeAlive * p.Speed;
float dx = (float) (Math.cos(p.Angle) * dist);
float dy = (float) (Math.sin(p.Angle) * dist);
p.xPos += dx;
p.yPos += -dy;
}
}
The Angle must be the problem.. I'm using this method, which works perfectly:
private double getDegreesFromTouchEvent(float x, float y){
double delta_x = x - mCanvasWidth/2;
double delta_y = mCanvasHeight/2 - y;
double radians = Math.atan2(delta_y, delta_x);
return Math.toDegrees(radians);
}
However, it returns 0-180 for touches above the center of the screen, and 0 to -180 for touches below. Is this a problem?
The best way to model this is with parametric equations. No need to use trig functions.
class Path {
private final float x1,y1,x2,y2,distance;
public Path( float x1, float y1, float x2, float y2) {
this.x1 = x1;
this.y1 = y1;
this.x2 = x2;
this.y2 = y2;
this.distance = (float) Math.sqrt( (x2-x1)*(x2-x1)+(y2-y1)*(y2-y1));
}
public PointF position( float t) { // PointF, since android.graphics.Point only takes ints
return new PointF( (1-t)*x1 + t*x2,
(1-t)*y1 + t*y2);
}
public PointF position( float ticks, float speed) {
float t = ticks * speed / distance;
return position( t);
}
}
Path p = new Path(...);
// get halfway point
p.position( 0.5f);
// get position after 100 ticks at 1.5 pixels per tick
p.position( 100, 1.5f);
From geometry, if it's a straight line you can calculate any point on it by using polar coordinates.
If you find the angle of the line:
ang = arctan((target_y - player_y) / (target_x - player_x))
Then any point on the line can be found using trig:
x = cos(ang) * dist_along_line
y = sin(ang) * dist_along_line
If you wanted the midpoint, then you just take dist_along_line to be half the length of the line:
dist_along_line = line_length / 2 = (sqrt((target_y - player_y)^2 + (target_x - player_x)^2)) / 2
If you wanted to consider the point after 100 ticks, moving at a speed of X pixels / tick:
dist_along_line = 100 * X
Hopefully someone can comment on a way to do this more directly using the android libs.
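A compact Java sketch of that approach (using Math.atan2 on the raw deltas so the quadrant comes out right; everything stays in screen coordinates, so no sign flipping is needed):

    // Point on the line from (playerX, playerY) toward (targetX, targetY),
    // distAlongLine pixels from the player (can exceed the line length).
    static float[] pointAlongLine(float playerX, float playerY,
                                  float targetX, float targetY,
                                  float distAlongLine) {
        double ang = Math.atan2(targetY - playerY, targetX - playerX);
        return new float[] {
            playerX + (float) (Math.cos(ang) * distAlongLine),
            playerY + (float) (Math.sin(ang) * distAlongLine)
        };
    }

    // Midpoint: distAlongLine = lineLength / 2
    // After 100 ticks at X pixels/tick: distAlongLine = 100 * X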
First of all, the Path class is to be used for drawing, not for calculation of the projectile location.
So your Projectile class could have the following attributes:
float positionX;
float positionY;
float velocityX;
float velocityY;
The velocity is calculated from the targetX, targetY, playerX and playerY like so:
float distance = (float) Math.sqrt(Math.pow(targetX - playerX, 2) + Math.pow(targetY - playerY, 2));
velocityX = (targetX - playerX) * speed / distance;
velocityY = (targetY - playerY) * speed / distance;
Your position after 20 ticks is
x = positionX + 20 * velocityX;
y = positionY + 20 * velocityY;
The time it takes to reach the target is
ticksToTarget = distance / speed;
The location of the half-way point is
halfWayX = positionX + velocityX * (ticksToTarget / 2);
halfWayY = positionY + velocityY * (ticksToTarget / 2);
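Tying those formulas together, a minimal Projectile sketch under the same assumptions (field and method names here are purely illustrative):

public class Projectile {
    float positionX, positionY;
    final float velocityX, velocityY;

    public Projectile(float playerX, float playerY,
                      float targetX, float targetY, float speed) {
        positionX = playerX;
        positionY = playerY;
        float distance = (float) Math.hypot(targetX - playerX, targetY - playerY);
        velocityX = (targetX - playerX) * speed / distance;
        velocityY = (targetY - playerY) * speed / distance;
    }

    // Advance by one tick; the projectile keeps moving in the same direction past the target.
    public void tick() {
        positionX += velocityX;
        positionY += velocityY;
    }
}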
