Android: axes vectors from orientation/rotational angles?

So there are a couple of methods in the Android SensorManager to get your phone's orientation:
float[] rotational = new float[9];
float[] orientation = new float[3];
SensorManager.getRotationMatrix(rotational, whatever, whatever, whatever);
SensorManager.getOrientation(rotational, orientation);
This gives you a rotation matrix called "rotational" and an array of 3 orientation angles called "orientation". However, I can't use the angles in my AR program - what I need is the actual vectors which represent the axes.
For example, in this image from Wikipedia:
I'm basically being given the α, β, and γ angles (though not exactly, since I don't have an N - I'm being given the angles from each of the blue axes), and I need to find vectors which represent the X, Y, and Z axes (red in the image). Does anyone know how to do this conversion? The directions on Wikipedia are very complicated, and my attempts to follow them have not worked. Also, I think the data that Android gives you may be in a slightly different order or format than what the conversion directions on Wikipedia expect.
Or as an alternative to these conversions, does anyone know any other ways to get the X, Y, and Z axes from the camera's perspective? (Meaning, what vector is the camera looking down? And what vector does the camera consider to be "up"?)

The rotation matrix in Android provides a rotation from the body (a.k.a. device) frame to the world (a.k.a. inertial) frame. A normal back-facing camera appears in landscape mode on the screen. This is the native mode for a tablet, so the camera has the following axes in the device frame:
camera_x_tablet_body = (1,0,0)
camera_y_tablet_body = (0,1,0)
camera_z_tablet_body = (0,0,1)
On a phone, where portrait is the native mode, rotating the device into landscape with the top turned to the left gives:
camera_x_phone_body = (0,-1,0)
camera_y_phone_body = (1,0,0)
camera_z_phone_body = (0,0,1)
Now applying the rotation matrix will put this in the world frame, so (for rotation matrix R[] of size 9):
camera_x_tablet_world = (R[0],R[3],R[6]);
camera_y_tablet_world = (R[1],R[4],R[7]);
camera_z_tablet_world = (R[2],R[5],R[8]);
In general, you can use SensorManager.remapCoordinateSystem(); for the phone example above, Display.getRotation() returns Surface.ROTATION_90 and the remapping gives the axes listed above.
But if the device is rotated differently (ROTATION_270, for example), the remapping will be different.
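For illustration, here is a minimal sketch (not from the original answer) of remapping based on the current display rotation before reading the axes out of the matrix; the AXIS_* pairs and the variable names (display, gravityVals, geoMagVals) are assumptions for a back-facing camera and may need adjusting:
float[] inR = new float[9];
float[] outR = new float[9];
SensorManager.getRotationMatrix(inR, null, gravityVals, geoMagVals);
switch (display.getRotation()) {            // display from getWindowManager().getDefaultDisplay()
    case Surface.ROTATION_90:               // landscape, top turned to the left
        SensorManager.remapCoordinateSystem(inR, SensorManager.AXIS_Y, SensorManager.AXIS_MINUS_X, outR);
        break;
    case Surface.ROTATION_270:              // landscape, top turned to the right
        SensorManager.remapCoordinateSystem(inR, SensorManager.AXIS_MINUS_Y, SensorManager.AXIS_X, outR);
        break;
    default:                                // ROTATION_0 / ROTATION_180: use the matrix as-is
        outR = inR;
        break;
}
// Device axes expressed in the world frame are then the columns of outR,
// e.g. device x-axis = (outR[0], outR[3], outR[6]).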
Also, an aside: the best way to get orientation in Android is to listen for Sensor.TYPE_ROTATION_VECTOR events. On most platforms (i.e. Gingerbread or newer) these are filled with the best available orientation estimate. The event values are actually the vector part of a unit quaternion. You can get the full quaternion using this (the last two lines are a way to get the rotation matrix):
float[] vec = event.values.clone();
float[] quat = new float[4];
SensorManager.getQuaternionFromVector(quat, vec);      // quat = (w, x, y, z)
float[] rotMat = new float[9];
SensorManager.getRotationMatrixFromVector(rotMat, vec); // note: this call takes the rotation vector itself, not the quaternion
More information at: http://www.sensorplatforms.com/which-sensors-in-android-gets-direct-input-what-are-virtual-sensors
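For context, here is a minimal (assumed) wiring sketch for TYPE_ROTATION_VECTOR; sensorManager and the anonymous listener are placeholders, not part of the original answer:
SensorManager sensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
Sensor rotationVector = sensorManager.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR);
sensorManager.registerListener(new SensorEventListener() {
    @Override
    public void onSensorChanged(SensorEvent event) {
        float[] rotMat = new float[9];
        SensorManager.getRotationMatrixFromVector(rotMat, event.values);
        // the columns of rotMat are the device axes expressed in world (East-North-Up) coordinates
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}, rotationVector, SensorManager.SENSOR_DELAY_GAME);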

SensorManager.getRotationMatrix(rotational, null, gravityVals, geoMagVals);
// camera's x-axis
Vector u = new Vector(-rotational[1], -rotational[4], -rotational[7]); // right of phone (in landscape mode)
// camera's y-axis
Vector v = new Vector(rotational[0], rotational[3], rotational[6]); // top of phone (in landscape mode)
// camera's z-axis (negative into the scene)
Vector n = new Vector(rotational[2], rotational[5], rotational[8]); // front of phone (the screen)
// world axes (x,y,z):
// +x is East
// +y is North
// +z is sky

The rotation matrix that you receive from getRotationMatrix() is based on the gravity field and the magnetic field - in other words X points East, Y points North, and Z points toward the sky (see http://developer.android.com/reference/android/hardware/SensorManager.html).
To the point of your question, I think the three rotation values can be used directly as angles about those axes, but you should apply them in reverse order:
"For either Euler or Tait-Bryan angles, it is very simple to convert from an intrinsic (rotating axes) to an extrinsic (static axes) convention, and vice-versa: just swap the order of the operations. An (α, β, γ) rotation using X-Y-Z intrinsic convention is equivalent to a (γ, β, α) rotation using Z-Y-X extrinsic convention; this is true for all Euler or Tait-Bryan axis combinations."
Source: Wikipedia
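To make the order swap concrete, here is a small illustrative sketch (not from the answer) composing the same elementary rotations both ways; the 3x3 helpers are assumptions:
static double[][] mul(double[][] a, double[][] b) {
    double[][] c = new double[3][3];
    for (int i = 0; i < 3; i++)
        for (int j = 0; j < 3; j++)
            for (int k = 0; k < 3; k++)
                c[i][j] += a[i][k] * b[k][j];
    return c;
}
static double[][] rotX(double a) { return new double[][]{{1, 0, 0}, {0, Math.cos(a), -Math.sin(a)}, {0, Math.sin(a), Math.cos(a)}}; }
static double[][] rotY(double b) { return new double[][]{{Math.cos(b), 0, Math.sin(b)}, {0, 1, 0}, {-Math.sin(b), 0, Math.cos(b)}}; }
static double[][] rotZ(double g) { return new double[][]{{Math.cos(g), -Math.sin(g), 0}, {Math.sin(g), Math.cos(g), 0}, {0, 0, 1}}; }

// intrinsic X-Y-Z by (alpha, beta, gamma): each new rotation post-multiplies
double[][] intrinsic = mul(mul(rotX(alpha), rotY(beta)), rotZ(gamma));
// extrinsic Z-Y-X by (gamma, beta, alpha): each new rotation pre-multiplies,
// so building it step by step ends up as the exact same product
double[][] extrinsic = rotZ(gamma);
extrinsic = mul(rotY(beta), extrinsic);
extrinsic = mul(rotX(alpha), extrinsic);
// intrinsic and extrinsic are now element-wise equal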
I hope this helps!

Related

Placing an object with a given compass bearing in ARCore

I'd like to place a north facing arrow into an ARCore world using sceneform. I am trying to understand the correct system of transformations to go from the phone's compass to sceneform's quaternions.
This was the code I used to solve the problem:
// Get the phone's pose in ARCore
Pose deviceOrientedPose = frame.getCamera().getDisplayOrientedPose().compose(
        Pose.makeInterpolated(
                Pose.IDENTITY,
                Pose.makeRotation(0, 0, (float) Math.sqrt(0.5f), (float) Math.sqrt(0.5f)),
                dhelper.getRotation()));
float[] devquat = deviceOrientedPose.getRotationQuaternion();
//Get the phone's heading in relation to the real world
float heading = compassListener.getBearing(); //Use ROTATION_VECTOR sensor
Quaternion deviceFrame = new Quaternion();
deviceFrame.set(devquat[0],devquat[1],devquat[2],devquat[3]);
double[] rpy = quat2rpy(deviceFrame);
//Rotate around y axis... rotation in this case is the desired rotation
float rotAngle = ((360 - rotation) + heading + (float) Math.toDegrees(rpy[1]) + 360) % 360;
Quaternion qt = Quaternion.axisAngle(Vector3.up(),rotAngle);
Essentially, the idea here is to acquire the device pose regardless of orientation. Subsequently, I listen to the bearing as given by the ROTATION_VECTOR sensor. Given that ARCore's coordinate system is right-handed, rotation in the ARCore world does not follow the convention where a heading is described as clockwise from North. (360-rotation)+heading essentially rotates the anchor to face north (via +heading), then applies the rotation of the desired heading (360-rotation), and applies this on top of the camera's current rotation ((float)Math.toDegrees(rpy[1])+360).
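The quat2rpy helper referenced above is not shown; a minimal sketch under the usual aerospace-style conversion (the axis ordering is an assumption and may need adjusting for Sceneform's Y-up frame) might look like this:
// Hypothetical quat2rpy: converts a Sceneform Quaternion (x, y, z, w fields) into
// roll/pitch/yaw in radians; pick the element corresponding to rotation about the up axis for rpy[1]
static double[] quat2rpy(Quaternion q) {
    double[] rpy = new double[3];
    // roll: rotation about x
    rpy[0] = Math.atan2(2.0 * (q.w * q.x + q.y * q.z), 1.0 - 2.0 * (q.x * q.x + q.y * q.y));
    // pitch: rotation about y (clamped to avoid NaN from rounding at the poles)
    double s = 2.0 * (q.w * q.y - q.z * q.x);
    rpy[1] = Math.asin(Math.max(-1.0, Math.min(1.0, s)));
    // yaw: rotation about z
    rpy[2] = Math.atan2(2.0 * (q.w * q.z + q.x * q.y), 1.0 - 2.0 * (q.y * q.y + q.z * q.z));
    return rpy;
}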

How to get the euler-angles from the rotation vector (Sensor.TYPE_ROTATION_VECTOR)

I rotated my Android device around the x axis (from -180 degrees to 180 degrees); see the image below.
And I assume only the rotation vector's x value changes. Y and z may have some noise, but there should not be much difference among the values.
However, I receive this. Kindly see
https://docs.google.com/spreadsheets/d/1ZLoSKI8XNjI1v4exaXxsuMtzP0qWTP5Uu4C3YTwnsKo/edit?usp=sharing
I suspect my sensor has some problem.
Any idea? Thank you very much.
Jimmy
Your sensor is fine. The rotation vector entries cannot simply be related to the rotation angle around a particular axis. The SensorEvent structure consists of timestamp, sensor, accuracy and values. Depending on the sensor, the values float[] varies in size from 1 to 5. The rotation vector's values are based on a unit quaternion, all together representing the orientation of the world frame relative to your smartphone-fixed frame.
They are unitless, and positive rotation is counter-clockwise.
The orientation of the phone is represented by the rotation necessary to align the East-North-Up coordinates with the phone's coordinates. That is, applying the rotation to the world frame (X,Y,Z) would align it with the phone coordinates (x,y,z).
If the rotation vector were written as a rotation matrix, one could write v_body = R_rot_vec * v_world, pushing a world vector into the smartphone-fixed description.
Furthermore about the vector:
The three elements of the rotation vector are equal to the last three components of a unit quaternion <cos(θ/2), x*sin(θ/2), y*sin(θ/2), z*sin(θ/2)>.
Q: So what to do with it? Depending on your Euler-angle convention (24 possible sequences, 12 of them valid) you can calculate the corresponding angles u := [ψ,θ,φ], e.g. by applying the 123 sequence:
If you already have the rotation matrix entries, get the Euler angles like so, using
the 321 sequence:
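In code, one common form of the 321 extraction looks roughly like this (a sketch assuming a row-major 3x3 matrix R[0..8] that maps body to world; the signs depend on the exact convention, and Android's own SensorManager.getOrientation() performs a comparable extraction with its own conventions):
double yaw   = Math.atan2(R[3], R[0]);   // psi,   rotation about z
double pitch = Math.asin(-R[6]);         // theta, rotation about y
double roll  = Math.atan2(R[7], R[8]);   // phi,   rotation about x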
with q1-3 always being values[0-2] (don't get confused by u_ijk, as the Diebel reference uses different conventions compared to the standard). But wait, your linked table only has 3 values, which is similar to what I get. This is one SensorEvent of mine; the last three are printed from values[]:
timestamp sensortype accuracy values[0] values[1] values[2]
23191581386897 11 -75 -0.0036907701 -0.014922042 0.9932963
4 quaternion components - 3 values = 1 unknown. The first component q0 is redundant information (the documentation also says it should be available under values[3], depending on your API level), so we can use the norm (= length) to calculate q0 from the other three: set ||q|| = 1 and solve for q0. Now all of q0-q3 are known.
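In code, recovering q0 looks roughly like this (SensorManager.getQuaternionFromVector() will also produce a full normalized quaternion for you):
// Sketch: recover the scalar part q0 from the three rotation-vector components using ||q|| = 1;
// the max(0, ...) guards against rounding pushing the sum slightly above 1
float[] rv = event.values;                        // x*sin(θ/2), y*sin(θ/2), z*sin(θ/2)
float sumSq = rv[0] * rv[0] + rv[1] * rv[1] + rv[2] * rv[2];
float q0 = (float) Math.sqrt(Math.max(0f, 1f - sumSq));
float[] q = { q0, rv[0], rv[1], rv[2] };          // (q0, q1, q2, q3)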
Furthermore, my Android 4.4.2 device does not provide the fourth value, the estimated heading accuracy (in radians), inside values[4], so I evaluate event.accuracy instead:
for (SensorEvent e : currentEvent) {
    if (e != null) {
        String toMsg = "";
        for (int i = 0; i < e.values.length; i++) {
            toMsg += " " + String.valueOf(e.values[i]);
        }
        iBinder.msgString(String.valueOf(e.timestamp) + " " + String.valueOf(e.sensor.getType()) + " " + String.valueOf(e.accuracy) + toMsg, 0);
    }
}
Put those equations into code and you will get things sorted.
Here is a short conversion helper, converting quaternions using either XYZ or ZYX. It can be run from the shell; see the linked github repo. (BSD-licensed)
The relevant part for XYZ
/* quaternion to euler in XYZ (seq: 123) */
double* quat2eulerxyz(double* q) {
    /* euler angles */
    double psi   = atan2( -2.*(q[2]*q[3] - q[0]*q[1]) , q[0]*q[0] - q[1]*q[1] - q[2]*q[2] + q[3]*q[3] );
    double theta = asin( 2.*(q[1]*q[3] + q[0]*q[2]) );
    double phi   = atan2( 2.*(-q[1]*q[2] + q[0]*q[3]) , q[0]*q[0] + q[1]*q[1] - q[2]*q[2] - q[3]*q[3] );
    /* save vars by simply pushing them back into the array and returning */
    q[1] = psi;
    q[2] = theta;
    q[3] = phi;
    return q;
}
Here are some examples applying quaternions to Euler angles:
**Q:** What does the sequence ijk stand for? Take two coordinate frames A and B superposed on each other (all axes aligned) and start rotating frame B: first about the i-axis by angle `psi`, then about the j-axis by angle `theta`, and last about the k-axis by angle `phi`. It could also be α, β, γ for i, j, k. *I don't use the numbers as they are confusing (Diebel vs. other papers).*
R(psi,theta,phi) = R_z(phi)R_y(theta)R_x(psi)
The trick is that the elementary rotations are applied from right to left, although we read the sequence from left to right.
Those are the three elementary rotations you go through to get from
A to B: *v_B = R(psi,theta,phi) v_A*
**Q:** So how do you get the Euler angles/quaternion to turn from [0°,0°,0°] to e.g. [0°,90°,0°]? First align both frames from the pictures, i.e. the known device frame B with the "invisible" world frame A. You are done superposing when the angles all reach [0°,0°,0°]. Just figure out where north, south and east are where you are sitting right now and point the device frame B in those directions. Now when you rotate 90° counter-clockwise around the y-axis, you will have the desired [0°,90°,0°] when converting the quaternion.
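As a worked example for that last rotation (illustration only, using the quaternion layout above):
// 90° counter-clockwise about y: unit quaternion (q0, q1, q2, q3) = (cos 45°, 0, sin 45°, 0)
double theta = Math.toRadians(90);
double q0 = Math.cos(theta / 2);   // ≈ 0.707
double q2 = Math.sin(theta / 2);   // ≈ 0.707
// as a rotation vector (the last three components): values ≈ {0, 0.707, 0}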
*Julian*
*Kinematics source: Diebel (Stanford) [11], with solid info on the mechanics background (careful: Diebel denotes XYZ as u_321 (1,2,3) while ZYX is u_123 (3,2,1)); [12] is a good starting point.*

Correct way to use ONLY Gyroscope and Accelerometer to get the reliable current angle in any axis on ANDROID

A week ago I didn't know anything about Android motion sensors. After learning about the amazing thing called Virtual Reality, I started to research which sensors are used to achieve those results. Then I had an idea for an app, but I still don't know which sensors I should use for the situation below:
I have to get the phone's orientation relative to itself. I mean, I should be able to isolate each axis in degrees. Something like this:
In this case, using the gyroscope, I think this variation is on the Z axis.
Using ONLY the gyroscope I got a good result for this situation, but after some repetitions I ran into a famous problem with gyro sensors: drift.
After this tutorial:
http://www.thousand-thoughts.com/articles/#articles
things became clearer in my head, but I still have problems like latency between the real movement and the output, and wrong outputs when I change the device orientation (I think gravity is to blame for that).
Is there some code example showing how to get 0-360 degrees for each axis using ONLY the gyroscope and accelerometer sensors?
(I may have made some English mistakes. Sorry for that.)
The following code will give you the correct lean angle, but only if your phone's Z axis rotation is 0 (like the way you illustrated it).
When you start changing the Z axis as well it becomes problematic; I'm still working on that. (Degrees have a minus "-" sign when leaning left and a "+" sign when leaning right.)
float[] mGravity;
float[] mGeomagnetic;
float[] temp = new float[9];
float[] RR = new float[9];
// Load the rotation matrix into temp
SensorManager.getRotationMatrix(temp, null, mGravity, mGeomagnetic);
// Remap to the camera's point of view
SensorManager.remapCoordinateSystem(temp, SensorManager.AXIS_X, SensorManager.AXIS_Z, RR);
// Read out the orientation values
float[] values = new float[3];
SensorManager.getOrientation(RR, values);
double degrees = Math.toDegrees(values[2]);
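For completeness, the snippet assumes mGravity and mGeomagnetic have already been filled in; a minimal (assumed) listener doing that could look like this:
@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
        mGravity = event.values.clone();
    } else if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) {
        mGeomagnetic = event.values.clone();
    }
    if (mGravity != null && mGeomagnetic != null) {
        // run the rotation/remap/orientation code above
    }
}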

Acceleration from device's coordinate system into absolute coordinate system

From my Android device I can read an array of linear acceleration values (in the device's coordinate system) and an array of absolute orientation values (in Earth's coordinate system). What I need is to obtain the linear acceleration values in the latter coord. system.
How can I convert them?
EDIT after Ali's reply in comment:
All right, so if I understand correctly, when I measure the linear acceleration, the position of the phone does not matter at all, because the readings are given in Earth's coordinate system. Right?
But I just did a test where I put the phone in different positions and got acceleration along different axes. There are 3 pairs of pictures - the first ones show how I placed the device (sorry for my Paint "master skill") and the second ones show the readings from the linear acceleration sensor:
device put on left side
device lying on back
device standing
And now - why, in the third case, does the acceleration occur along the Z axis (not Y) if the device position doesn't matter?
I finally managed to solve it! So to get acceleration vector in Earth's coordinate system you need to:
get the rotation matrix (as a float[16] so it can be used later by the android.opengl.Matrix class) from SensorManager.getRotationMatrix() (using Sensor.TYPE_GRAVITY and Sensor.TYPE_MAGNETIC_FIELD sensor values as parameters),
use android.opengl.Matrix.invertM() on the rotation matrix to invert it (not transpose!),
use Sensor.TYPE_LINEAR_ACCELERATION sensor to get linear acceleration vector (in device's coord. sys.),
use android.opengl.Matrix.multiplyMV() to multiply the rotation matrix by linear acceleration vector.
And there you have it! I hope I will save some precious time for others.
Thanks to Edward Falk and Ali for the hints!!
Based on @alex's answer, here is the code snippet:
private float[] gravityValues = null;
private float[] magneticValues = null;

@Override
public void onSensorChanged(SensorEvent event) {
    if ((gravityValues != null) && (magneticValues != null)
            && (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER)) {

        float[] deviceRelativeAcceleration = new float[4];
        deviceRelativeAcceleration[0] = event.values[0];
        deviceRelativeAcceleration[1] = event.values[1];
        deviceRelativeAcceleration[2] = event.values[2];
        deviceRelativeAcceleration[3] = 0;

        // Change the device-relative acceleration values to earth-relative values
        // X axis -> East
        // Y axis -> North Pole
        // Z axis -> Sky
        float[] R = new float[16], I = new float[16], earthAcc = new float[16];
        SensorManager.getRotationMatrix(R, I, gravityValues, magneticValues);

        float[] inv = new float[16];
        android.opengl.Matrix.invertM(inv, 0, R, 0);
        android.opengl.Matrix.multiplyMV(earthAcc, 0, inv, 0, deviceRelativeAcceleration, 0);
        Log.d("Acceleration", "Values: (" + earthAcc[0] + ", " + earthAcc[1] + ", " + earthAcc[2] + ")");
    } else if (event.sensor.getType() == Sensor.TYPE_GRAVITY) {
        gravityValues = event.values;
    } else if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) {
        magneticValues = event.values;
    }
}
According to the documentation you get the linear acceleration in the phone's coordinate system.
You can transform any vector from the phone's coordinate system to the Earth's coordinate system by multiplying it with the rotation matrix. You can get the rotation matrix from getRotationMatrix().
(Perhaps there already is a function doing this multiplication for you but I don't do Android programming and I am not familiar with its API.)
A nice tutorial on the rotation matrix is the Direction Cosine Matrix IMU: Theory manuscript. Good luck!
OK, first of all, if you're trying to do actual inertial navigation on Android, you've got your work cut out for you. The cheap little sensors used in smartphones are just not precise enough, although there has been some interesting work done on inertial navigation over small distances, such as inside a building. There are probably papers on the subject you can dig up. Google "Motion Interface Developers Conference" and you might find something useful - that's a conference that Invensense put on a couple of months ago.
Second, no, linear acceleration is in device coordinates, not world coordinates. You'll have to convert yourself, which means knowing the device's 3-d orientation.
What you want to do is use a version of Android that supports the virtual sensors TYPE_GRAVITY and TYPE_LINEAR_ACCELERATION. You'll need a device with gyros to get reasonably accurate and precise readings.
Internally, the system combines gyros, accelerometers, and magnetometers in order to come up with true values for the device orientation. This effectively splits the accelerometer device into its gravity and acceleration components.
So what you want to do is set up sensor listeners for TYPE_GRAVITY, TYPE_LINEAR_ACCELERATION, and TYPE_MAGNETIC_FIELD. Use the gravity and magnetometer data as inputs to SensorManager.getRotationMatrix() in order to get the rotation matrix that will transform world coordinates into device coordinates or vice versa. In this case, you'll want the "versa" part. That is, convert the linear acceleration input to world coordinates by multiplying it by the transpose of the orientation matrix.
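A rough sketch of that conversion, assuming a 3x3 matrix from getRotationMatrix() and placeholder variable names (gravity, geomagnetic, linearAcc); note that the getRotationMatrix() documentation describes R as mapping device coordinates to world coordinates, so transpose it if your matrix maps the other way:
float[] R = new float[9];
SensorManager.getRotationMatrix(R, null, gravity, geomagnetic);
// linearAcc is a TYPE_LINEAR_ACCELERATION reading in device coordinates
float east  = R[0] * linearAcc[0] + R[1] * linearAcc[1] + R[2] * linearAcc[2];
float north = R[3] * linearAcc[0] + R[4] * linearAcc[1] + R[5] * linearAcc[2];
float up    = R[6] * linearAcc[0] + R[7] * linearAcc[1] + R[8] * linearAcc[2];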

Android sensor: getRotationMatrix() returns wrong values, why?

It's been several days since I started using this function and I have not yet succeeded in obtaining valid results.
What I want is basically to convert the acceleration vector from the device's coordinate system to the real-world coordinate system. I know that this is possible, because I have the acceleration in relative coordinates and I know the orientation of the device in the real-world system.
Reading Android Developers, it seems that using getRotationMatrix() I get R = rotation matrix.
So if I want A (the acceleration vector in the world system) from A' (the acceleration vector in the phone system) I must simply do:
A = R * A'
But I can't understand why the vector A ALWAYS has the first and second components equal to zero (example: +0.00; -0.00; +6.43).
My current code is similar to this:
public void onSensorChanged(SensorEvent event) {
    synchronized (this) {
        switch (event.sensor.getType()) {
            case Sensor.TYPE_ACCELEROMETER:
                accelerometervalues = event.values.clone();
                break;
            case Sensor.TYPE_MAGNETIC_FIELD:
                geomagneticmatrix = event.values.clone();
                break;
        }
        if (geomagneticmatrix != null && accelerometervalues != null) {
            float[] Rs = new float[16];
            float[] I = new float[16];
            SensorManager.getRotationMatrix(Rs, I, accelerometervalues, geomagneticmatrix);

            float resultVec[] = new float[4];
            float relativacc[] = new float[4];
            relativacc[0] = accelerationvalues[0];
            relativacc[1] = accelerationvalues[1];
            relativacc[2] = accelerationvalues[2];
            relativacc[3] = 0;
            Matrix.multiplyMV(resultVec, 0, Rs, 0, relativacc, 0);
            // resultVec[] is the acceleration vector relative to the world coordinate system... but it doesn't WORK!!!!!
        }
    }
}
This question is very similar to this one: Transforming accelerometer's data from device's coordinates to real world coordinates, but there I can't find the solution... I have tried everything.
Please help me, I need help!!!
UPDATE:
Now my code is below; I tried to write out the matrix product explicitly, but nothing changed:
float[] Rs = new float[9];
float[] I = new float[9];
SensorManager.getRotationMatrix(Rs, I, accelerationvalues, geomagneticmatrix);
float resultVec[] = new float[4];
resultVec[0]=Rs[0]*accelerationvalues[0]+Rs[1]*accelerationvalues[1]+Rs[2]*accelerationvalues[2];
resultVec[1]=Rs[3]*accelerationvalues[0]+Rs[4]*accelerationvalues[1]+Rs[5]*accelerationvalues[2];
resultVec[2]=Rs[6]*accelerationvalues[0]+Rs[7]*accelerationvalues[1]+Rs[8]*accelerationvalues[2];
Here is some example of the data read and the result:
Rs separated by " ": Rs[0] Rs[1] ... Rs[8]
Av separated by " ": accelerationvalues[0] ... accelerationvalues[2]
rV separated by " ": resultVec[0] ... resultVec[2]
As you can notice, the components on the x and y axes in the real world are (around) zero even if you move the phone quickly. Instead, the relative acceleration vector correctly detects each movement!!!
SOLUTION
The errors in the numbers come from float multiplication, which is not the same as double multiplication.
This adds to the fact that the rotation matrix isn't constant if the phone is accelerating, even if its orientation stays the same.
So it is impossible to translate the acceleration vector into absolute coordinates during motion...
It's hard, but it's the reality.
Finally I found the answer:
The errors in the numbers come from float multiplication, which is not the same as double multiplication. Here is the solution.
This adds to the fact that the rotation matrix isn't constant if the phone is accelerating, even if its orientation stays the same. So it is impossible to translate the acceleration vector into absolute coordinates during motion... It's hard, but it's the reality.
FYI, the orientation vector is made from magnetometer data AND the gravity vector. This causes a circular problem: converting relative acceleration needs the orientation, which needs the magnetic field AND gravity, but we only know gravity when the phone is at rest, which we can only tell from the relative acceleration... so we are back at the beginning.
This is confirmed on Android Developers, where it is explained that the rotation matrix gives a true result only when the phone isn't accelerating (e.g. they mention free fall, in which case there is no gravity measurement) and when it isn't in an irregular magnetic field:
The matrices returned by this function are meaningful only when the
device is not free-falling and it is not close to the magnetic north.
If the device is accelerating, or placed into a strong magnetic field,
the returned matrices may be inaccurate.
In other words, mostly useless...
You can verify this by doing simple experiments on a table with the Android sensors or something like that.
You must track down this arithmetic error before you worry about rotation, acceleration or anything else.
You have confirmed that
resultVec[0]=Rs[0]*accelerationvalues[0];
gives you
Rs[0]: 0.24105562
accelerationValues[0]: 6.891896
resultVec[0]: 1.1920929E-7
So once again, simplify. Try this:
Rs[0] = 0.2;
resultVec[0] = Rs[0] * 6.8;
EDIT:
The last one gave resultVec[0]=1.36, so let's try this:
Rs[0] = 0.2;
accelerationValues[0] = 6.8;
resultVec[0] = Rs[0] * accelerationValues[0];
If you do the sums, using the printed values you have appended, I get
`(0.00112, -0.0004, 10)`
which is not as small as what you have. Therefore there is an arithmetic error!
Could the problem be that you are using accelerationvalues[] in the last block, and accelerometervalues[] later?
I have developed several applications that make use of Android sensors, so I am answering one of your questions based on my experience:
But I can't understand why the vector A ALWAYS has the first and
second components zero (example: +0.00; -0.00; +6.43)
I have observed this problem with the acceleration sensor and the magnetic field sensor too. The readings are zero for some of the axes (two as you point out, or just one on other occasions). This problem happens when you have just enabled the sensors (registerListener()) and I assume that it is related to some kind of sensor initialization.
In the case of the acceleration sensor, I have observed that just a small shake of the device makes it start giving correct readings.
The correct solution would be for the method onAccuracyChanged() to give correct information about the sensor state. It should return a status of SensorManager.SENSOR_STATUS_UNRELIABLE, but instead it permanently returns SensorManager.SENSOR_STATUS_ACCURACY_HIGH on all physical devices that I have tested so far. With onAccuracyChanged() properly implemented, you could ignore bad readings or ask the user to wait while the sensor is being initialized.
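A small sketch of the kind of guard described above (the field name and sensor type are assumptions):
private int accelAccuracy = SensorManager.SENSOR_STATUS_UNRELIABLE;

@Override
public void onAccuracyChanged(Sensor sensor, int accuracy) {
    if (sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
        accelAccuracy = accuracy;
    }
}

@Override
public void onSensorChanged(SensorEvent event) {
    if (accelAccuracy == SensorManager.SENSOR_STATUS_UNRELIABLE) {
        return; // ignore readings until the sensor settles
    }
    // ...use event.values...
}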
