I am trying to remap a device that has an alternate coordinate system.
The sensor is reporting values that are rotated 90° around the X axis. The format is a Quaternion in standard Android Rotation Vector notation. If I use the data unmodified I can hold the device 90° offset and successfully call getOrientation via:
private void updateOrientationFromVector(float[] rotationVector) {
    float[] rotationMatrix = new float[9];
    SensorManager.getRotationMatrixFromVector(rotationMatrix, rotationVector);

    final int worldAxisForDeviceAxisX = SensorManager.AXIS_X;
    final int worldAxisForDeviceAxisY = SensorManager.AXIS_Z;

    float[] adjustedRotationMatrix = new float[9];
    SensorManager.remapCoordinateSystem(rotationMatrix, worldAxisForDeviceAxisX,
            worldAxisForDeviceAxisY, adjustedRotationMatrix);

    // Transform rotation matrix into azimuth/pitch/roll
    float[] orientation = new float[3];
    SensorManager.getOrientation(adjustedRotationMatrix, orientation);

    // Convert radians to degrees (roughly; 1 rad ≈ 57.3°)
    float azimuth = orientation[0] * 57;
    float pitch = orientation[1] * 57;
    float roll = orientation[2] * 57;

    // Normalize for readability
    if (azimuth < 0) {
        azimuth = azimuth + 360;
    }

    Log.d("Orientation", "Azimuth: " + azimuth + "° Pitch: " + pitch + "° Roll: " + roll + "°");
}
This code works fine for all normal Android devices. If I hold a reference phone in front of me as shown, the data is converted properly and shows my correct bearings. But when I use this test device, I must hold it at the wrong orientation to show me the correct bearings.
I want to pre-process the data from this test device to rotate the axes so that this device matches all other Android devices. This will let the display logic stay generic.
Unfortunately I have tried many different techniques and none are working.
First, I tried using the Android calls again:
private fun rotateQuaternionAxes(rotationVector: FloatArray): FloatArray {
    val rotationMatrix = FloatArray(9)
    SensorManager.getRotationMatrixFromVector(rotationMatrix, rotationVector)

    val worldAxisForDeviceAxisX = SensorManager.AXIS_X
    val worldAxisForDeviceAxisY = SensorManager.AXIS_Z

    val adjustedRotationMatrix = FloatArray(9)
    SensorManager.remapCoordinateSystem(rotationMatrix, worldAxisForDeviceAxisX,
        worldAxisForDeviceAxisY, adjustedRotationMatrix)

    val axisRemappedData = Quaternion.fromRotationMatrix(adjustedRotationMatrix)
    val rotationData = floatArrayOf(
        axisRemappedData.x,
        axisRemappedData.y,
        axisRemappedData.z,
        axisRemappedData.w
    )
    return rotationData
}
My private Quaternion.fromRotationMatrix is not shown here; it came from euclideanspace.com.
When I pre-process my rotation data with this, the logic works for everything, except north and south are swapped! East and west are correct, and my pitch and roll are correct.
So I decided to follow the suggestions for Rotating a Quaternion on 1-Axis with the following code:
private fun rotateQuaternionAxes(rotationVector: FloatArray): FloatArray {
    // https://stackoverflow.com/questions/4436764/rotating-a-quaternion-on-1-axis
    // Device X+ is towards power button; Y+ is toward camera; Z+ towards nav buttons
    // So rotate the reported data 90 degrees around X and the axes move appropriately
    val sensorQuaternion: Quaternion = Quaternion(rotationVector[0], rotationVector[1], rotationVector[2], rotationVector[3])
    val manipulationQuaternion = Quaternion.axisAngle(-1.0f, 0.0f, 0.0f, 90.0f)
    val axisRemappedData = Quaternion.multiply(sensorQuaternion, manipulationQuaternion)
    val rotationData = floatArrayOf(
        axisRemappedData.x,
        axisRemappedData.y,
        axisRemappedData.z,
        axisRemappedData.w
    )
    //LogUtil.debug("Orientation Orig: $sensorQuaternion Rotated: $axisRemappedData")
    return rotationData
}
This does the exact same thing! Everything is fine, except north and south are mirrored, leaving east and west correct.
My Quaternion math came from sceneform-android-sdk and I double-checked it against several online sources.
I also tried simply reinterpreting the same data with its components reordered, following Convert quaternion to a different coordinate system.
private fun rotateQuaternionAxes(rotationVector: FloatArray): FloatArray {
    // No change:
    //val rotationData = floatArrayOf(x_val, y_val, z_val, w_val)
    val x_val = rotationVector[0]
    val y_val = rotationVector[1]
    val z_val = rotationVector[2]
    val w_val = rotationVector[3]
    val rotationData = floatArrayOf(x_val, z_val, -y_val, w_val)
    return rotationData
}
This was not even close. I played with the axes and ended up finding that rotationData = floatArrayOf(-z_val, -x_val, y_val, w_val) had correct pitch and roll, but the azimuth was completely non-functional. So I've abandoned a simple remapping as an option.
Since the Android remapCoordinateSystem and the quaternion math give the same result, they seem mathematically equivalent. And multiple sources indicate they should accomplish what I'm trying to do.
Can anyone explain why remapping my axes would swap north and south? I believe I am getting a quaternion reflection instead of a rotation. There is no physical point on the device that tracks the direction it shows.
Answer
As you said, it looks like you are expecting your data to be in the East-North-Up (ENU) frame of reference (FoR) but you are seeing data in an East-Down-North (EDN) FoR. The link you cited for converting a quaternion to another coordinate system converts from an ENU to an NDW FoR - which evidently is not what you are looking for.
There are two ways you can solve this. Either use another rotation matrix, or swap your variables. Using another rotation matrix means doing more computation - but if you really want to learn how to do this, you can check out my self-plug introduction to quaternions for reference frame rotations.
The easiest way would be to swap your variables by recognizing that your X axis is not changing, but your expected Y is measured in z' and your expected Z is measured in -y', where X, Y, Z are the expected FoR and x', y', z' are the actual measured FoR. The following "swaps" should allow you to get the same behavior as your other Android devices:
x_expected = x_actual
y_expected = z_actual
z_expected = -y_actual
!!! HOWEVER !!! If your measurements are given in quaternions, then you will have to use a rotation matrix. If your measurements are given as X,Y,Z measurements, you can get away with the swap provided above.
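For vector-valued readings, that swap is just a re-indexing. A minimal Kotlin sketch (the function name is made up for this illustration):

// Remap a measured (x', y', z') vector into the expected frame, following the
// rules above: X = x', Y = z', Z = -y'. Valid for vector readings only, not quaternions.
fun remapToExpectedFrame(actual: FloatArray): FloatArray =
    floatArrayOf(actual[0], actual[2], -actual[1])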
ENU/NED/NDW Notation
East-North-Up and all other similar axes notations are defined by the order of the coordinate system, expressed as X, then Y, and lastly Z, with respect to a Global inertial (static) Frame of Reference. I've defined your expected coordinate system as if you were to lay your phone flat on the ground with the screen of the phone facing the sky and the top of your phone pointing Northward.
Related
I have a server function that detects and estimates the pose of an ArUco marker in an image.
Using the function estimatePoseSingleMarkers I found the rotation and translation vectors.
I need to use these values in an Android app with ARCore to create a Pose.
The documentation says that Pose needs two float array (rotation and translation): https://developers.google.com/ar/reference/java/arcore/reference/com/google/ar/core/Pose.
float[] newT = new float[] { t[0], t[1], t[2] };
Quaternion q = Quaternion.axisAngle(new Vector3(r[0], r[1], r[2]), 90);
float[] newR = new float[]{ q.x, q.y, q.z, q.w };
Pose pose = new Pose(newT, newR);
The position of the 3D object placed in this pose is totally random.
What am I doing wrong?
This is a snapshot of the server image after estimating the pose and drawing the axes. The image I receive is rotated by 90°; I'm not sure whether that is related.
cv::aruco::estimatePoseSingleMarkers (link) returns the rotation vector in Rodrigues format. Following the docs:
w = norm( r ) // angle of rotation in radians
r = r/w // unit axis of rotation
thus
float w = sqrt( r[0]*r[0] + r[1]*r[1] + r[2]*r[2] );
// handle w==0.0 separately
// Get a new Quaternion using an axis/angle (degrees) to define the rotation
Quaternion q = Quaternion.axisAngle(new Vector3(r[0]/w, r[1]/w, r[2]/w), w * 180.0/3.14159 );
should work, except for the right-angle rotation mentioned - that is, provided the lens parameters are fed to estimatePoseSingleMarkers correctly, or at least to reasonable accuracy.
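A self-contained sketch of the same conversion in Kotlin, independent of the Sceneform Quaternion class and with the w == 0 case handled (the function name is made up for this illustration; the result is in x, y, z, w order):

import kotlin.math.cos
import kotlin.math.sin
import kotlin.math.sqrt

// Convert an OpenCV Rodrigues rotation vector r (unit axis scaled by the angle,
// in radians) into a unit quaternion [x, y, z, w].
fun rodriguesToQuaternion(r: FloatArray): FloatArray {
    val w = sqrt(r[0] * r[0] + r[1] * r[1] + r[2] * r[2])  // rotation angle in radians
    if (w < 1e-9f) return floatArrayOf(0f, 0f, 0f, 1f)     // no rotation: identity quaternion
    val s = sin(w / 2) / w                                  // scales r down to axis * sin(w/2)
    return floatArrayOf(r[0] * s, r[1] * s, r[2] * s, cos(w / 2))
}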
I have implemented the compass reading according to the usual recommendations that I could find on the web. I use the ROTATION_VECTOR sensor type and I transform it into the (azimuth, pitch, roll) triple using the standard API calls. Here's my code:
fun Fragment.receiveAzimuthUpdates(
    azimuthChanged: (Float) -> Unit,
    accuracyChanged: (Int) -> Unit
) {
    val sensorManager = activity!!.getSystemService(Context.SENSOR_SERVICE)
            as SensorManager
    val sensor = sensorManager.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR)!!
    sensorManager.registerListener(OrientationListener(azimuthChanged, accuracyChanged),
        sensor, 10_000)
}

private class OrientationListener(
    private val azimuthChanged: (Float) -> Unit,
    private val accuracyChanged: (Int) -> Unit
) : SensorEventListener {
    private val rotationMatrix = FloatArray(9)
    private val orientation = FloatArray(3)

    override fun onSensorChanged(event: SensorEvent) {
        if (event.sensor.type != Sensor.TYPE_ROTATION_VECTOR) return
        SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values)
        SensorManager.getOrientation(rotationMatrix, orientation)
        azimuthChanged(orientation[0])
    }

    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) {
        if (sensor.type == Sensor.TYPE_ROTATION_VECTOR) {
            accuracyChanged(accuracy)
        }
    }
}
This results in behavior that's quite good when you hold the phone horizontally, like you would a real compass. However, when you hold it like a camera, upright and in front of you, the reading breaks down. If you tilt it even slightly beyond upright, so it leans towards you, the azimuth turns to the opposite direction (sudden 180 degree rotation).
Apparently this code tracks the orientation of the phone's y-axis, which becomes vertical on an upright phone, and its ground orientation is towards you when the phone leans towards you.
What could I do to improve this behavior so it's not sensitive to the phone's pitch?
Analysis
Apparently this code tracks the orientation of the phone's y-axis, which becomes vertical on an upright phone, and its ground orientation is towards you when the phone leans towards you.
Yes, this is correct. You can inspect the code of getOrientation() to see what's going on:
public static float[] getOrientation(float[] R, float[] values) {
    /*
     * / R[ 0]   R[ 1]   R[ 2] \
     * | R[ 3]   R[ 4]   R[ 5] |
     * \ R[ 6]   R[ 7]   R[ 8] /
     */
    values[0] = (float) Math.atan2(R[1], R[4]);
    ...
values[0] is the azimuth value you got.
You can interpret the rotation matrix R as the components of the vectors that point in the device's three major axes:
column 0: vector pointing to phone's right
column 1: vector pointing to phone's up
column 2: vector pointing to phone's front
The vectors are described from the perspective of the Earth's coordinate system (east, north, and sky).
With this in mind we can interpret the code in getOrientation():
select the phone's up axis (matrix column 1, stored in array elements 1, 4, 7)
project it to the Earth's horizontal plane (this is easy, just ignore the sky component stored in element 7)
Use atan2 to deduce the angle from the remaining east and north components of the vector.
There's another subtlety hiding here: the signature of atan2 is
public static double atan2(double y, double x);
Note the parameter order: y, then x. But getOrientation passes the arguments in the east, north order. This achieves two things:
makes north the reference axis (in geometry it's the x axis)
mirrors the angle: geometrical angles are anti-clockwise, but azimuth must be the clockwise angle from north
Naturally, when the phone's up axis goes vertical ("skyward") and then beyond, its azimuth flips by 180 degrees. We can fix this in a very simple way: we'll use the phone's right axis instead. Note the following:
when the phone is horizontal and facing north, its right axis is aligned with the east axis. The east axis, in the Earth's coordinate system, is the "x" geometrical axis, so our 0-angle reference is correct out-of-the-box.
when the phone turns right (eastwards), its azimuth should rise, but its geometrical angle goes negative. Therefore we must flip the sign of the geometrical angle.
Solution
So our new formula is this:
val azimuth = -atan2(R[3], R[0])
And this trivial change is all you need! No need to call getOrientation, just apply this to the rotation matrix.
Improved Solution
So far, so good. But what if the user is using the phone in the landscape orientation? The phone's axes are unaffected, but now the user perceives the phone's "left" or "right" direction as "ahead" (depending on how the user turned the phone). We can correct for this by inspecting the Display.rotation property. If the screen is rotated, we'll use the up axis of the phone to play the same role as the right axis above.
So the full code of the orientation listener becomes this:
private class OrientationListener(
    private val activity: Activity,
    private val azimuthChanged: (Float) -> Unit,
    private val accuracyChanged: (Int) -> Unit
) : SensorEventListener {
    private val rotationMatrix = FloatArray(9)

    override fun onSensorChanged(event: SensorEvent) {
        if (event.sensor.type != Sensor.TYPE_ROTATION_VECTOR) return
        SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values)
        val (matrixColumn, sense) = when (val rotation =
            activity.windowManager.defaultDisplay.rotation
        ) {
            Surface.ROTATION_0 -> Pair(0, 1)
            Surface.ROTATION_90 -> Pair(1, -1)
            Surface.ROTATION_180 -> Pair(0, -1)
            Surface.ROTATION_270 -> Pair(1, 1)
            else -> error("Invalid screen rotation value: $rotation")
        }
        val x = sense * rotationMatrix[matrixColumn]
        val y = sense * rotationMatrix[matrixColumn + 3]
        azimuthChanged(-atan2(y, x))
    }

    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) {
        if (sensor.type == Sensor.TYPE_ROTATION_VECTOR) {
            accuracyChanged(accuracy)
        }
    }
}
With this code, you're getting the exact same behavior as on Google Maps.
Let's say you have the acceleration readings in all three dimensions, i.e. X, Y and Z. How do you infer from the readings whether the phone was tilted left or right? The readings are generated every 20 ms.
I actually want the logic of inferring the tilt from the readings. The tilt needs to be smooth.
A tilt can be detected in a number of different ways. You can take into account one axis, two axes, or all three, depending on how accurate you want it to be and how much you feel like fighting with the maths.
If you use only one axis, it is quite simple. Imagine the phone is lying completely horizontal and you move it like this:
Using just one axis, say the x axis, will be enough, since you can accurately detect a change in that axis: even a small movement produces a change in its value.
But if your application only reads that axis and the user holds the phone almost vertical, the change on the x axis will be very small even when the phone is rotated through a large angle.
Anyway, for applications that only need coarse resolution, a single axis can be used.
Referring to basic trigonometry, the projection of the gravity vector on the x-axis produces an output acceleration equal to the sine of the angle between the accelerometer x-axis and the horizon.
This means that having the values of an axis (those are acceleration values) you can calculate the angle in which the device is.
This means that the value given to you by the sensor equals 9.8 * sin(angle), so by doing the maths you can get the actual angle.
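For example, a one-line sketch of that calculation (assuming the device is held still, so the reading is dominated by gravity):

import kotlin.math.asin

// Estimate the tilt of the x axis from a single accelerometer reading (m/s^2).
// The ratio is clamped to [-1, 1] to guard against sensor noise before asin.
fun tiltAngleDegrees(ax: Float, g: Float = 9.81f): Double =
    Math.toDegrees(asin((ax / g).coerceIn(-1f, 1f).toDouble()))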
But don't worry, you don't even have to do this. Since the values are more or less proportional to the angle (as the table in the referenced article shows), you can work directly with the sensor value without worrying about exactly which angle it represents, provided you don't need great accuracy: a change in the value means a roughly proportional change in the angle, so with a few tests you will find out how big the change must be to be relevant to you.
So, if you take the value over time and compare readings with each other, you can figure out how big the rotation was. To do this:
Consider just one axis; this will be the x axis.
Write a function that gets the difference in the sensor value for that axis between one call and the next.
Decide on a maximum time and a minimum sensor difference that you will consider a valid movement (e.g. a big rotation counts only if it is fast enough, and a fast movement counts only if the change in angle is big enough).
If you detect two measurements that satisfy those conditions, note that half a tilt has been done (in a boolean, for instance) and start measuring again, but now the new reference value is the value that was considered the half tilt.
If the last difference was positive, you now need a negative difference, and vice versa; this is the "coming back" motion. So keep comparing the new reference value with the new values coming from the sensor and see whether one satisfies the conditions you decided on above.
If you find a valid value (satisfying both the difference and the time conditions), you have a tilt. But if you don't get a good value and the time runs out, reset everything: let your reference value be the last one, reset the timers, set the half-tilt-done boolean back to false, and keep measuring.
I hope this is good enough for you. You can surely find libraries or code snippets to help you out with this, but I think it is good, as you say, to understand the logic of inferring the tilt from the readings; a rough code sketch of this logic follows below.
The pictures were taken from this article, which I recommend reading if you want to improve the accuracy and consider two or three axes for the tilt.
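Here is the rough sketch promised above: a plain-Kotlin state machine for a single axis following those steps. The thresholds and the caller-supplied timestamp are assumptions to be tuned and wired up for a real device.

import kotlin.math.abs

// Call onReading() with each new accelerometer x value and a timestamp in ms;
// it returns true once a full tilt gesture (out and back, fast enough) is seen.
class TiltDetector(
    private val minDelta: Float = 3.0f,   // minimum change in the x reading (assumed threshold)
    private val maxTimeMs: Long = 500L    // maximum time allowed per half of the tilt (assumed)
) {
    private var reference = 0f
    private var referenceTimeMs = -1L
    private var halfTiltDone = false
    private var halfTiltSign = 0

    fun onReading(x: Float, nowMs: Long): Boolean {
        if (referenceTimeMs < 0) {                    // first sample: just record the reference
            reset(x, nowMs)
            return false
        }
        val delta = x - reference
        if (nowMs - referenceTimeMs > maxTimeMs) {    // too slow: reset everything
            reset(x, nowMs)
            return false
        }
        if (!halfTiltDone && abs(delta) >= minDelta) {
            halfTiltDone = true                       // first half of the tilt detected
            halfTiltSign = if (delta > 0) 1 else -1
            reference = x
            referenceTimeMs = nowMs
            return false
        }
        if (halfTiltDone && delta * halfTiltSign <= -minDelta) {
            reset(x, nowMs)                           // came back in time: full tilt detected
            return true
        }
        return false
    }

    private fun reset(x: Float, nowMs: Long) {
        reference = x
        referenceTimeMs = nowMs
        halfTiltDone = false
        halfTiltSign = 0
    }
}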
The commonsware Sensor Monitor app does a pretty good job with this. It converts the sensor readouts to X, Y, Z values on each sensor reading, so it's pretty easy from there to determine which way the device is moving.
https://github.com/commonsguy/cw-omnibus/tree/master/Sensor/Monitor
Another item worth noting (from the Commonsware book):
There are four standard delay periods, defined as constants on the
SensorManager class:
SENSOR_DELAY_NORMAL, which is what most apps would use for broad changes, such as detecting a screen rotating from portrait to landscape
SENSOR_DELAY_UI, for non-game cases where you want to update the UI continuously based upon sensor readings
SENSOR_DELAY_GAME, which is faster (less delay) than SENSOR_DELAY_UI, to try to drive a higher frame rate
SENSOR_DELAY_FASTEST, which is the “firehose” of sensor readings, without delay
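For example, registering an accelerometer listener at SENSOR_DELAY_GAME could look like this (a sketch; context and listener stand for whatever Context and SensorEventListener you already have):

val sensorManager = context.getSystemService(Context.SENSOR_SERVICE) as SensorManager
sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER)?.let { accelerometer ->
    sensorManager.registerListener(listener, accelerometer, SensorManager.SENSOR_DELAY_GAME)
}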
You can use the accelerometer and magnetic field sensor to accomplish this. You can call this method from your onSensorChanged method to detect whether the phone was tilted upwards. This currently only works if the phone is held horizontally. Check the actual blog post for a more complete solution.
http://www.ahotbrew.com/how-to-detect-forward-and-backward-tilt/
public boolean isTiltUpward()
{
    if (mGravity != null && mGeomagnetic != null)
    {
        float R[] = new float[9];
        float I[] = new float[9];

        boolean success = SensorManager.getRotationMatrix(R, I, mGravity, mGeomagnetic);

        if (success)
        {
            float orientation[] = new float[3];
            SensorManager.getOrientation(R, orientation);

            /*
             * If the roll is positive, you're in reverse landscape (landscape right), and if the roll is negative you're in landscape (landscape left)
             *
             * Similarly, you can use the pitch to differentiate between portrait and reverse portrait.
             * If the pitch is positive, you're in reverse portrait, and if the pitch is negative you're in portrait.
             *
             * orientation -> azimuth, pitch and roll
             */
            pitch = orientation[1];
            roll = orientation[2];

            inclineGravity = mGravity.clone();

            double norm_Of_g = Math.sqrt(inclineGravity[0] * inclineGravity[0] + inclineGravity[1] * inclineGravity[1] + inclineGravity[2] * inclineGravity[2]);

            // Normalize the accelerometer vector
            inclineGravity[0] = (float) (inclineGravity[0] / norm_Of_g);
            inclineGravity[1] = (float) (inclineGravity[1] / norm_Of_g);
            inclineGravity[2] = (float) (inclineGravity[2] / norm_Of_g);

            // Checks if device is flat on ground or not
            int inclination = (int) Math.round(Math.toDegrees(Math.acos(inclineGravity[2])));

            /*
             * Float obj1 = new Float("10.2");
             * Float obj2 = new Float("10.20");
             * int retval = obj1.compareTo(obj2);
             *
             * if (retval > 0) {
             *     System.out.println("obj1 is greater than obj2");
             * } else if (retval < 0) {
             *     System.out.println("obj1 is less than obj2");
             * } else {
             *     System.out.println("obj1 is equal to obj2");
             * }
             */
            Float objPitch = new Float(pitch);
            Float objZero = new Float(0.0);
            Float objZeroPointTwo = new Float(0.2);
            Float objZeroPointTwoNegative = new Float(-0.2);

            int objPitchZeroResult = objPitch.compareTo(objZero);
            int objPitchZeroPointTwoResult = objZeroPointTwo.compareTo(objPitch);
            int objPitchZeroPointTwoNegativeResult = objPitch.compareTo(objZeroPointTwoNegative);

            if (roll < 0 && ((objPitchZeroResult > 0 && objPitchZeroPointTwoResult > 0) || (objPitchZeroResult < 0 && objPitchZeroPointTwoNegativeResult > 0)) && (inclination > 30 && inclination < 40))
            {
                return true;
            }
            else
            {
                return false;
            }
        }
    }

    return false;
}
Is this what you're looking for?
public class AccelerometerHandler implements SensorEventListener
{
    float accelX;
    float accelY;
    float accelZ;

    public AccelerometerHandler(Context paramContext)
    {
        // "sensor" == Context.SENSOR_SERVICE, 1 == Sensor.TYPE_ACCELEROMETER,
        // and the final 1 == SensorManager.SENSOR_DELAY_GAME
        SensorManager localSensorManager = (SensorManager) paramContext.getSystemService("sensor");
        if (localSensorManager.getSensorList(1).size() != 0)
            localSensorManager.registerListener(this, (Sensor) localSensorManager.getSensorList(1).get(0), 1);
    }

    public float getAccelX()
    {
        return this.accelX;
    }

    public float getAccelY()
    {
        return this.accelY;
    }

    public float getAccelZ()
    {
        return this.accelZ;
    }

    public void onAccuracyChanged(Sensor paramSensor, int paramInt)
    {
    }

    public void onSensorChanged(SensorEvent paramSensorEvent)
    {
        this.accelX = paramSensorEvent.values[0];
        this.accelY = paramSensorEvent.values[1];
        this.accelZ = paramSensorEvent.values[2];
    }
}
I wish to get my phone's current orientation by the following method:
Get the initial orientation (azimuth) first via getRotationMatrix() and getOrientation().
Add the integration of gyroscope reading over time to it to get the current orientation.
Phone Orientation:
The phone's x-y plane is fixed parallel with the ground plane. i.e., is in a "texting-while-walking" orientation.
"getOrientation()" Returnings:
Android API allows me to easily get the orientation, i.e., azimuth, pitch, roll, from getOrientation().
Please note that this method always returns its value within the range: [0, -PI] and [o, PI].
My Problem:
Since the integration of the gyroscope reading, denoted by dR, may be quite big, when I do CurrentOrientation += dR the CurrentOrientation may exceed the [-PI, 0] and [0, PI] ranges.
What manipulations are needed so that I can ALWAYS get the current orientation within the [-PI, 0] and [0, PI] ranges?
I have tried the following in Python, but I highly doubt its correctness.
rotation = scipy.integrate.trapz(gyroSeries, timeSeries) # integration
if (headingDirection - rotation) < -np.pi:
headingDirection += 2 * np.pi
elif (headingDirection - rotation) > np.pi:
headingDirection -= 2 * np.pi
# Complementary Filter
headingDirection = ALPHA * (headingDirection - rotation) + (1 - ALPHA) * np.mean(azimuth[np.array(stepNo.tolist()) == i])
if headingDirection < -np.pi:
headingDirection += 2 * np.pi
elif headingDirection > np.pi:
headingDirection -= 2 * np.pi
Remarks
This is NOT that simple, because it involves the following trouble-makers:
The orientation sensor reading goes from 0 to -PI, and then DIRECTLY JUMPS to +PI and gradually gets back to 0 via +PI/2.
The integration of the gyroscope reading also leads to some trouble: should I add dR to the orientation or subtract dR?
Do please refer to the Android Documentations first, before giving a confirmed answer.
Estimated answers will not help.
The orientation sensor actually derives its readings from the real magnetometer and the accelerometer.
I guess maybe this is the source of the confusion. Where is this stated in the documentation? More importantly, does the documentation somewhere explicitly state that the gyro readings are ignored? As far as I know the method described in this video is implemented:
Sensor Fusion on Android Devices: A Revolution in Motion Processing
This method uses the gyros and integrates their readings. This pretty much renders the rest of the question moot; nevertheless I will try to answer it.
The orientation sensor is already integrating the gyro readings for you, that is how you get the orientation. I don't understand why you are doing it yourself.
You are not doing the integration of the gyro readings properly, it is more complicated than CurrentOrientation += dR (which is incorrect). If you need to integrate the gyro readings (I don't see why, the SensorManager is already doing it for you) please read Direction Cosine Matrix IMU: Theory how to do it properly (Equation 17).
Don't try integrating with Euler angles (aka azimuth, pitch, roll), nothing good will come out.
Please use either quaternions or rotation matrices in your computations instead of Euler angles. If you work with rotation matrices, you can always convert them to Euler angles, see
Computing Euler angles from a rotation matrix by Gregory G. Slabaugh
(The same is true for quaternions.) There are (in the non-degenerate case) two ways to represent a rotation, that is, you will get two sets of Euler angles. Pick the one that is in the range you need. (In the case of gimbal lock, there are infinitely many Euler angles; see the PDF above.) Just promise you won't start using Euler angles again in your computations after the rotation matrix to Euler angles conversion.
It is unclear what you are doing with the complementary filter. You can implement a pretty damn good sensor fusion based on the Direction Cosine Matrix IMU: Theory manuscript, which is basically a tutorial. It's not trivial to do it but I don't think you will find a better, more understandable tutorial than this manuscript.
One thing that I had to discover myself when I implemented sensor fusion based on this manuscript was that the so-called integral windup can occur. I took care of it by bounding the TotalCorrection (page 27). You will understand what I am talking about if you implement this sensor fusion.
UPDATE: Here I answer your questions that you posted in comments after accepting the answer.
I think the compass gives me my current orientation by using gravity and magnetic field, right? Is gyroscope used in the compass?
Yes, if the phone is more or less stationary for at least half a second, you can get a good orientation estimate by using gravity and the compass only. Here is how to do it: Can anyone tell me whether gravity sensor is as a tilt sensor to improve heading accuracy?
No, the gyroscopes are not used in the compass.
Could you please kindly explain why the integration done by me is wrong? I understand that if my phone's pitch points up, euler angle fails. But any other things wrong with my integration?
There are two unrelated things: (i) the integration should be done differently, (ii) Euler angles are trouble because of the Gimbal lock. I repeat, these two are unrelated.
As for the integration: here is a simple example how you can actually see what is wrong with your integration. Let x and y be the axes of the horizontal plane in the room. Get a phone in your hands. Rotate the phone around the x axis (of the room) by 45 degrees, then around the y axis (of the room) by 45 degrees. Then, repeat these steps from the beginning but now rotate around the y axis first, and then around the x axis. The phone ends up in a totally different orientation. If you do the integration according to CurrentOrientation += dR you will see no difference! Please read the above linked Direction Cosine Matrix IMU: Theory manuscript if you want to do the integration properly.
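To see the same order-dependence numerically without a phone in hand, here is a small stand-alone sketch (plain Kotlin, no Android APIs) that composes two 45-degree rotations in both orders and prints two different matrices:

import kotlin.math.cos
import kotlin.math.sin

// Row-major 3x3 rotation matrices about the x and y axes (angle in radians).
fun rotX(a: Double) = arrayOf(
    doubleArrayOf(1.0, 0.0, 0.0),
    doubleArrayOf(0.0, cos(a), -sin(a)),
    doubleArrayOf(0.0, sin(a), cos(a))
)

fun rotY(a: Double) = arrayOf(
    doubleArrayOf(cos(a), 0.0, sin(a)),
    doubleArrayOf(0.0, 1.0, 0.0),
    doubleArrayOf(-sin(a), 0.0, cos(a))
)

// Standard matrix product c = a * b.
fun mul(a: Array<DoubleArray>, b: Array<DoubleArray>) =
    Array(3) { i -> DoubleArray(3) { j -> (0..2).sumOf { k -> a[i][k] * b[k][j] } } }

fun main() {
    val a = Math.toRadians(45.0)
    // Rotating about x first and then y is Ry * Rx; the other order is Rx * Ry.
    println("x then y: " + mul(rotY(a), rotX(a)).contentDeepToString())
    println("y then x: " + mul(rotX(a), rotY(a)).contentDeepToString())
    // The two printed matrices differ, so simple angle addition cannot track 3D rotation.
}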
As for the Euler angles: they screw up the stability of the application and it is enough for me not to use them for arbitrary rotations in 3D.
I still don't understand why you are trying to do it yourself, why you don't want to use the orientation estimate provided by the platform. Chances are, you cannot do better than that.
I think you should avoid the deprecated "orientation sensor" and use sensor fusion methods like getRotationVector and getRotationMatrix, which already implement fusion algorithms (notably Invensense's) that use gyroscope data.
If you want a simple sensor fusion algorithm, a so-called balance filter (see http://www.filedump.net/dumped/filter1285099462.pdf) can be used. The approach is illustrated in http://postimg.org/image/9cu9dwn8z/
It integrates the gyroscope to get the angle, then high-pass filters the result to remove drift, and adds it to the smoothed accelerometer and compass results. The integrated, high-pass-filtered gyro data and the accelerometer/compass data are weighted so that the two parts add to one, so that the output is an accurate estimate in units that make sense.
For the balance filter, the time constant may be tweaked to tune the response. The shorter the time constant, the better the response, but the more acceleration noise will be allowed to pass through.
To see how this works, imagine you have the newest gyro data point (in rad/s) stored in gyro, the newest angle measurement from the accelerometer stored in angle_acc, and dt is the time from the last gyro sample until now. Then your new angle would be calculated using
angle = b * (angle + gyro*dt) + (1 - b) * (angle_acc);
You may start by trying b = 0.98, for instance. You will also probably want to use a fast gyroscope measurement interval dt so the gyro doesn't drift more than a couple of degrees before the next measurement is taken. The balance filter is useful and simple to implement, but it is not the ideal sensor fusion approach.
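A minimal sketch of that filter as a small class (the names and the 0.98 coefficient just mirror the description above; angles in radians, rates in rad/s):

// Complementary ("balance") filter for a single axis.
class BalanceFilter(private val b: Float = 0.98f) {
    var angle = 0f
        private set

    // gyroRate: angular rate from the gyro (rad/s); angleAcc: angle from the
    // accelerometer/compass (rad); dt: seconds since the previous gyro sample.
    fun update(gyroRate: Float, angleAcc: Float, dt: Float): Float {
        angle = b * (angle + gyroRate * dt) + (1 - b) * angleAcc
        return angle
    }
}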
Invensense’s approach involves some clever algorithms and probably some form of Kalman filter.
Source: Professional Android Sensor Programming, Adam Stroud.
If the azimuth value is inaccurate due to magnetic interference, there is nothing that you can do to eliminate it as far as I know. To get a stable reading of the azimuth you need to filter the accelerometer values if TYPE_GRAVITY is not available. If TYPE_GRAVITY is not available, then I am pretty sure that the device does not have a gyro, so the only filter that you can use is low pass filter. The following code is an implementation of a stable compass using TYPE_GRAVITY and TYPE_MAGNETIC_FIELD.
public class Compass implements SensorEventListener
{
    public static final float TWENTY_FIVE_DEGREE_IN_RADIAN = 0.436332313f;
    public static final float ONE_FIFTY_FIVE_DEGREE_IN_RADIAN = 2.7052603f;

    private SensorManager mSensorManager;
    private float[] mGravity;
    private float[] mMagnetic;
    // If the device is flat mOrientation[0] = azimuth, mOrientation[1] = pitch
    // and mOrientation[2] = roll, otherwise mOrientation[0] is equal to Float.NAN
    private float[] mOrientation = new float[3];
    private LinkedList<Float> mCompassHist = new LinkedList<Float>();
    private float[] mCompassHistSum = new float[]{0.0f, 0.0f};
    private int mHistoryMaxLength;

    public Compass(Context context)
    {
        mSensorManager = (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
        // Adjust the history length to fit your need; the faster the sensor rate,
        // the larger the value needed for a stable result.
        mHistoryMaxLength = 20;
    }

    public void registerListener(int sensorRate)
    {
        Sensor magneticSensor = mSensorManager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD);
        if (magneticSensor != null)
        {
            mSensorManager.registerListener(this, magneticSensor, sensorRate);
        }
        Sensor gravitySensor = mSensorManager.getDefaultSensor(Sensor.TYPE_GRAVITY);
        if (gravitySensor != null)
        {
            mSensorManager.registerListener(this, gravitySensor, sensorRate);
        }
    }

    public void unregisterListener()
    {
        mSensorManager.unregisterListener(this);
    }
    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy)
    {
    }

    @Override
    public void onSensorChanged(SensorEvent event)
    {
        if (event.sensor.getType() == Sensor.TYPE_GRAVITY)
        {
            mGravity = event.values.clone();
        }
        else if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD)
        {
            mMagnetic = event.values.clone();
        }
        if (!(mGravity == null || mMagnetic == null))
        {
            getOrientation();
        }
    }
    private void getOrientation()
    {
        float[] rotMatrix = new float[9];
        if (SensorManager.getRotationMatrix(rotMatrix, null,
                mGravity, mMagnetic))
        {
            float inclination = (float) Math.acos(rotMatrix[8]);
            // device is flat
            if (inclination < TWENTY_FIVE_DEGREE_IN_RADIAN
                    || inclination > ONE_FIFTY_FIVE_DEGREE_IN_RADIAN)
            {
                float[] orientation = SensorManager.getOrientation(rotMatrix, mOrientation);
                mCompassHist.add(orientation[0]);
                mOrientation[0] = averageAngle();
            }
            else
            {
                mOrientation[0] = Float.NAN;
                clearCompassHist();
            }
        }
    }
    private void clearCompassHist()
    {
        mCompassHistSum[0] = 0;
        mCompassHistSum[1] = 0;
        mCompassHist.clear();
    }

    public float averageAngle()
    {
        int totalTerms = mCompassHist.size();
        if (totalTerms > mHistoryMaxLength)
        {
            float firstTerm = mCompassHist.removeFirst();
            mCompassHistSum[0] -= Math.sin(firstTerm);
            mCompassHistSum[1] -= Math.cos(firstTerm);
            totalTerms -= 1;
        }
        float lastTerm = mCompassHist.getLast();
        mCompassHistSum[0] += Math.sin(lastTerm);
        mCompassHistSum[1] += Math.cos(lastTerm);
        float angle = (float) Math.atan2(mCompassHistSum[0] / totalTerms, mCompassHistSum[1] / totalTerms);
        return angle;
    }
}
In your activity, instantiate a Compass object (say in onCreate), call registerListener in onResume and unregisterListener in onPause:
private Compass mCompass;

@Override
protected void onCreate(Bundle savedInstanceState)
{
    super.onCreate(savedInstanceState);

    mCompass = new Compass(this);
}

@Override
protected void onPause()
{
    super.onPause();

    mCompass.unregisterListener();
}

@Override
protected void onResume()
{
    super.onResume();

    mCompass.registerListener(SensorManager.SENSOR_DELAY_NORMAL);
}
It's better to let Android's implementation of orientation detection handle it. Yes, the values you get are from -PI to PI, and you can convert them to degrees (0-360). Some relevant parts:
Saving data to be processed:
@Override
public void onSensorChanged(SensorEvent sensorEvent) {
    switch (sensorEvent.sensor.getType()) {
        case Sensor.TYPE_ACCELEROMETER:
            mAccValues[0] = sensorEvent.values[0];
            mAccValues[1] = sensorEvent.values[1];
            mAccValues[2] = sensorEvent.values[2];
            break;
        case Sensor.TYPE_MAGNETIC_FIELD:
            mMagValues[0] = sensorEvent.values[0];
            mMagValues[1] = sensorEvent.values[1];
            mMagValues[2] = sensorEvent.values[2];
            break;
    }
}
Calculating roll, pitch and yaw (azimuth). mR and mI are arrays holding the rotation and inclination matrices, and mO is a temporary array. The array mResults holds the values in degrees at the end:
private void updateData() {
    SensorManager.getRotationMatrix(mR, mI, mAccValues, mMagValues);

    /**
     * arg 2: which world (according to the app) axis the device's x axis aligns with
     * arg 3: which world (according to the app) axis the device's y axis aligns with
     *
     * world x = app's x = app's east
     * world y = app's y = app's north
     * device x = device's left side = device's east
     * device y = device's top side = device's north
     */
    switch (mDispRotation) {
        case Surface.ROTATION_90:
            SensorManager.remapCoordinateSystem(mR, SensorManager.AXIS_Y, SensorManager.AXIS_MINUS_X, mR2);
            break;
        case Surface.ROTATION_270:
            SensorManager.remapCoordinateSystem(mR, SensorManager.AXIS_MINUS_Y, SensorManager.AXIS_X, mR2);
            break;
        case Surface.ROTATION_180:
            SensorManager.remapCoordinateSystem(mR, SensorManager.AXIS_MINUS_X, SensorManager.AXIS_MINUS_Y, mR2);
            break;
        case Surface.ROTATION_0:
        default:
            mR2 = mR;
    }

    SensorManager.getOrientation(mR2, mO);

    //--upside down when abs(roll) > 90--
    if (Math.abs(mO[2]) > PI_BY_TWO) {
        //--fix: azimuth always to true north, even when device upside down, realistic--
        mO[0] = -mO[0];

        //--fix: roll never upside down, even when device upside down, unrealistic--
        //mO[2] = mO[2] > 0 ? PI - mO[2] : -(PI - Math.abs(mO[2]));

        //--fix: pitch comes from the opposite side when device goes upside down, realistic--
        mO[1] = -mO[1];
    }

    CircleUtils.convertRadToDegrees(mO, mOut);
    CircleUtils.normalize(mOut);

    //--write--
    mResults[0] = mOut[0];
    mResults[1] = mOut[1];
    mResults[2] = mOut[2];
}
I have a sensor manager that returns a rotationMatrix based on the device's magnetometer and accelerometer. I have been trying to also calculate the yaw, pitch and roll of the user's device, but am finding that pitch and roll interfere with each other and give inaccurate results. Is there a way to extract yaw, pitch and roll of a device from the rotationMatrix?
EDIT
Trying to interpret Blender's answer below, which I am thankful for but not quite there yet, I am trying to get the angles from a rotation matrix like this:
float R[] = phoneOri.getMatrix();
double rmYaw = Math.atan2(R[4], R[0]);
double rmPitch = Math.acos(-R[8]);
double rmRoll = Math.atan2(R[9], R[10]);
I don't know whether I am referencing the wrong parts of the matrix, but I am not getting the results I would expect.
I was hoping to get values in degrees, but I am getting strange numbers instead.
My matrix comes from my sensor manager, which looks like this:
public void onSensorChanged(SensorEvent evt) {
    int type = evt.sensor.getType();

    if (type == Sensor.TYPE_ORIENTATION) {
        yaw = evt.values[0];
        pitch = evt.values[1];
        roll = evt.values[2];
    }

    if (type == Sensor.TYPE_MAGNETIC_FIELD) {
        orientation[0] = (orientation[0] * 1 + evt.values[0]) * 0.5f;
        orientation[1] = (orientation[1] * 1 + evt.values[1]) * 0.5f;
        orientation[2] = (orientation[2] * 1 + evt.values[2]) * 0.5f;
    } else if (type == Sensor.TYPE_ACCELEROMETER) {
        acceleration[0] = (acceleration[0] * 2 + evt.values[0]) * 0.33334f;
        acceleration[1] = (acceleration[1] * 2 + evt.values[1]) * 0.33334f;
        acceleration[2] = (acceleration[2] * 2 + evt.values[2]) * 0.33334f;
    }

    if ((type == Sensor.TYPE_MAGNETIC_FIELD) || (type == Sensor.TYPE_ACCELEROMETER)) {
        float newMat[] = new float[16];
        SensorManager.getRotationMatrix(newMat, null, acceleration, orientation);
        if (displayOri == 0 || displayOri == 2) {
            SensorManager.remapCoordinateSystem(newMat,
                    SensorManager.AXIS_X * -1, SensorManager.AXIS_MINUS_Y * -1, newMat);
        } else {
            SensorManager.remapCoordinateSystem(newMat,
                    SensorManager.AXIS_Y, SensorManager.AXIS_MINUS_X, newMat);
        }
        matrix = newMat;
    }
}
Sample matrix when the device is lying face up on a table:
0.9916188, -0.12448014, -0.03459576, 0.0
0.12525482, 0.9918981, 0.021199778, 0.0
0.031676512,-0.025355382, 0.9991765, 0.0
0.0, 0.0, 0.0, 1
ANSWER
double rmPitch = Math.toDegrees( Math.acos(R[10]));
I believe Blender's answer is not correct, since he gave a transformation from Rotation matrix to Euler angles (z-x-z extrinsic), and Roll Pitch Yaw are a different kind of Euler angles (z-y-x extrinsic).
The actual transformation formula would rather be:
yaw   = atan2(R(2,1), R(1,1));
pitch = atan2(-R(3,1), sqrt(R(3,2)^2 + R(3,3)^2));
roll  = atan2(R(3,2), R(3,3));
Source
Feedback: this implementation turned out to lack numerical stability near the singularity of the representation (gimbal lock). Therefore, in C++ I recommend using the Eigen library with the following line of code:
R.eulerAngles(2,1,0).reverse();
(More details here)
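Translating those 1-indexed formulas to the row-major float[9] matrix that SensorManager.getRotationMatrix fills in gives a sketch like this (same z-y-x convention; the gimbal-lock caveat above still applies):

import kotlin.math.atan2
import kotlin.math.sqrt

// R is the 3x3 row-major rotation matrix from SensorManager.getRotationMatrix,
// so R(i, j) in the formulas above corresponds to R[3 * (i - 1) + (j - 1)] here.
// Returns yaw, pitch and roll in radians.
fun yawPitchRoll(R: FloatArray): FloatArray {
    val yaw = atan2(R[3], R[0])
    val pitch = atan2(-R[6], sqrt(R[7] * R[7] + R[8] * R[8]))
    val roll = atan2(R[7], R[8])
    return floatArrayOf(yaw, pitch, roll)
}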
Yaw, pitch and roll correspond to Euler angles. You can convert a transformation matrix to Euler angles pretty easily:
SensorManager provides SensorManager.getOrientation() to get all three angles.