Compute relative orientation given azimuth, pitch, and roll in Android?

When I listen to the orientation event in an Android app, I get a SensorEvent, which contains 3 floats: azimuth, pitch, and roll in relation to the real-world axes.
Now say I am building an app like Labyrinth, but I don't want to force the user to be over the phone and hold it such that the xy plane is parallel to the ground. Instead I want to allow the user to hold the phone as they wish: lying down or, perhaps, sitting and holding the phone at an angle. In other words, I need to calibrate the phone in accordance with the user's preference.
How can I do that?
Also note that I believe that my answer has to do with getRotationMatrix and getOrientation, but I am not sure how!
Please help! I've been stuck at this for hours.

For a Labyrinth-style app, you probably care more about the acceleration (gravity) vector than about the axes' orientation. This vector, in the phone coordinate system, is given by the combination of the three accelerometer measurements, rather than by the rotation angles. Specifically, only the x and y readings should affect the ball's motion.
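For illustration only (not part of the original answer), here is a minimal sketch of that idea; the returned pair would feed whatever object moves the ball, and the sign convention depends on your screen axes:

import android.hardware.Sensor
import android.hardware.SensorEvent

// Returns the X/Y tilt components that should drive the ball, or null for other sensors.
fun ballAcceleration(event: SensorEvent): Pair<Float, Float>? {
    if (event.sensor.type != Sensor.TYPE_ACCELEROMETER) return null
    // event.values holds the acceleration (dominated by gravity) along the device
    // X, Y, Z axes; only X and Y matter for the ball, Z is ignored.
    return Pair(-event.values[0], event.values[1])
}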
If you do actually need the orientation, then the 3 angular readings represent the 3 Euler angles. However, I suspect you don't really need the angles themselves, but rather the rotation matrix R, which is returned by the getRotationMatrix() API. Once you have this matrix, it is basically the calibration you are looking for. When you want to transform a vector in world coordinates to your device coordinates, multiply it by the inverse of this matrix (where, in this special case, inv(R) = transpose(R)).
So, following the example I found in the documentation, if you want to transform the world gravity vector g ([0 0 g]) to the device coordinates, multiply it by inv(R):
g = inv(R) * g
(note that this should give you the same result as reading the accelerometers)
Possible APIs to use here: the invertM() and multiplyMV() methods of the android.opengl.Matrix class.
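To make that concrete, here is a minimal sketch (my own, not from the answer), assuming accelValues and magValues hold recent accelerometer and magnetometer readings:

import android.hardware.SensorManager
import android.opengl.Matrix

// Sketch only: transform the world gravity vector into device coordinates.
fun gravityInDeviceCoords(accelValues: FloatArray, magValues: FloatArray): FloatArray {
    // A 16-element array makes getRotationMatrix() fill a 4x4 matrix; note it is
    // row-major, while android.opengl.Matrix expects column-major, so transpose first.
    val rRowMajor = FloatArray(16)
    SensorManager.getRotationMatrix(rRowMajor, null, accelValues, magValues)
    val r = FloatArray(16)
    Matrix.transposeM(r, 0, rRowMajor, 0)

    val rInv = FloatArray(16)
    Matrix.invertM(rInv, 0, r, 0)   // for a pure rotation this equals the transpose

    val gWorld = floatArrayOf(0f, 0f, SensorManager.GRAVITY_EARTH, 0f)  // [0 0 g], w = 0 for a direction
    val gDevice = FloatArray(4)
    Matrix.multiplyMV(gDevice, 0, rInv, 0, gWorld, 0)                   // g_device = inv(R) * g_world
    return gDevice   // the first three components should roughly match the raw accelerometer at rest
}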

I don't know of any Android-specific APIs for this, but all you want to do is decrease the azimuth (and the other angles) by a certain amount, right? So you move the "origin" from (0,0,0) to whatever orientation the user picked. In pseudocode:
myGetOrientation:
return getOrientation() - origin
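One concrete (hypothetical) reading of that pseudocode: store the orientation angles once as a reference and subtract them from later readings. A rough Kotlin sketch, with all names my own:

import android.hardware.SensorManager

// The first call defines the "origin"; every later call returns the orientation relative to it.
val rotationMatrix = FloatArray(9)
val angles = FloatArray(3)          // azimuth, pitch, roll in radians
var reference: FloatArray? = null   // set on the first reading (or on a "calibrate" tap)

fun relativeOrientation(accel: FloatArray, magnetic: FloatArray): FloatArray {
    SensorManager.getRotationMatrix(rotationMatrix, null, accel, magnetic)
    SensorManager.getOrientation(rotationMatrix, angles)
    val ref = reference ?: angles.copyOf().also { reference = it }
    return floatArrayOf(angles[0] - ref[0], angles[1] - ref[1], angles[2] - ref[2])
}

Subtracting Euler angles like this is only a rough calibration; the rotation-matrix approach in the answer above handles large tilts more gracefully.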

Related

Android accelerometer mapping movement to custom coordinate system

If I have a custom coordinate system X - left/right, Y - forward/backward, Z - up/down that is represented on my PC screen inside my Unreal project, how would I map the accelerometer values in a way that when I move my phone toward the PC screen (regardless of the phone orientation) my Y value goes up, and the same for the other axes?
I got something similar working with rotation by taking a "reference" rotation quaternion, inverting it and multiplying it by the current rotation quaternion, but I'm stuck on how to transform movement.
An example of my problem: if I'm moving my phone up with the screen pointing at the sky, my Z axis increases, which is what I want; but when I point my phone screen at my PC screen and move it forward, the Z axis again goes up, whereas in this case I would want my Y value to increase.
There is a similar question, Acceleration from device's coordinate system into absolute coordinate system, but that doesn't really solve my problem since I don't want to depend on the location of north for Y, and so on.
Clarification of question intent
It sounds like what you want is the acceleration of your device with respect to your laptop. As you correctly mentioned, the similar question Acceleration from device's coordinate system into absolute coordinate system maps the local accelerometer data of a device with respect to a global frame of reference (FoR) (the Cartesian "flat" Earth FoR to be specific - as opposed to the ultra-realistic spherical Earth FoR).
What you know
From your device, you know the local Phone FoR, and from the link above, you can also find the behavior of your device with respect to a flat Earth FoR via a rotation matrix, which I'll call R_EP for Rotation into the Earth FoR from the Phone FoR. In order to represent the acceleration of your device with respect to your laptop, you will need to know how your laptop is oriented and positioned with respect to either your phone's FoR (A), or the flat Earth FoR (B), or some other FoR that is known to both your laptop and your phone (but I'll ignore this last case because the method is identical to B).
What you'll need
In the first case, A, this will allow you to construct a rotation matrix which I'll call R_LP for Rotation in Laptop FoR from Phone FoR - and that would be super convenient because that's your answer. But alas, life isn't fun without a little bit of a challenge.
In the second case, B, this will allow you to construct a rotation matrix which I'll call R_LE for Rotation in Laptop FoR from Earth FoR. Because the Hamilton product is associative (but NOT commutative: Are quaternions generally multiplied in an order opposite to matrices?), you can find the acceleration of your phone with respect to your laptop by daisy-chaining the rotations, like so:
a_P]L = R_LE * R_EP * a_P]P
Where the ] means "in the frame of", and a_P is acceleration of the Phone. So a_P]L is the acceleration of the Phone in the Laptop FoR, and a_P]P is the acceleration of the Phone in the Phone's FoR.
NOTE When "daisy-chaining" rotation matrices, it's important that they follow a specific order. Always make sure that the rotation matrices are multiplied in the correct order, see Sections 2.6 and 3.1.4 in [1] for more information.
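As a sketch of the daisy-chaining itself (my own helper functions; R_LE and R_EP are assumed to have been obtained as described above, as row-major 9-element arrays):

// Multiply two 3x3 row-major matrices: out = a * b
fun multiply3x3(a: FloatArray, b: FloatArray): FloatArray {
    val out = FloatArray(9)
    for (row in 0..2) for (col in 0..2) {
        out[3 * row + col] = a[3 * row] * b[col] +
                             a[3 * row + 1] * b[3 + col] +
                             a[3 * row + 2] * b[6 + col]
    }
    return out
}

// Apply a 3x3 row-major matrix to a 3-vector
fun apply3x3(m: FloatArray, v: FloatArray): FloatArray = floatArrayOf(
    m[0] * v[0] + m[1] * v[1] + m[2] * v[2],
    m[3] * v[0] + m[4] * v[1] + m[5] * v[2],
    m[6] * v[0] + m[7] * v[1] + m[8] * v[2]
)

// a_P]L = R_LE * R_EP * a_P]P -- note the order: the rightmost matrix acts first.
fun accelerationInLaptopFrame(rLE: FloatArray, rEP: FloatArray, accelPhone: FloatArray): FloatArray {
    val rLP = multiply3x3(rLE, rEP)
    return apply3x3(rLP, accelPhone)
}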
Hint
To define your laptop's FoR (orientation and position) with respect to the global "flat" Earth FoR, you can place your phone on your laptop and set the current orientation and position as your laptop's FoR. This will let you construct R_LE.
Misconceptions
A rotation quaternion, q, is NEITHER the orientation NOR attitude of one frame of reference relative to another. Instead, it represents a "midpoint" vector normal to the rotation plane about which vectors from one frame of reference are rotated to the other. This is why defining quaternions to rotate from a GLOBAL frame to a local frame (or vice-versa) is incredibly important. The ENU to NED rotation is a perfect example, where the rotation quaternion is [0; sqrt(2)/2; sqrt(2)/2; 0], a "midpoint" between the two abscissa (X) axes (in both the global and local frames of reference). If you do the "right hand rule" with your three fingers pointing along the ENU orientation, and rapidly switch back and forth from the NED orientation, you'll see that the rotation from both FoR's is simply a rotation about [1; 1; 0] in the Global FoR.
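As a quick check of that ENU-to-NED example (my own working, using the scalar-first quaternion convention), a 180-degree rotation about the unit axis n = (1, 1, 0)/sqrt(2) gives exactly that quaternion, and expanding it to a matrix shows what it does:

q = [\cos\tfrac{\pi}{2};\ \sin\tfrac{\pi}{2}\,n] = [0;\ \tfrac{\sqrt{2}}{2};\ \tfrac{\sqrt{2}}{2};\ 0],
\qquad
R(n,\pi) = 2\,n n^{\mathsf{T}} - I =
\begin{bmatrix} 0 & 1 & 0 \\ 1 & 0 & 0 \\ 0 & 0 & -1 \end{bmatrix}

which swaps the East and North components and flips Up to Down, i.e. takes ENU coordinates to NED.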
References
I cannot recommend the following open-source reference highly enough:
[1] "Quaternion kinematics for the error-state Kalman filter" by Joan SolĂ . https://hal.archives-ouvertes.fr/hal-01122406v5
For a "playground" to experiment with, and gain a "hands-on" understanding of quaternions:
[2] Visualizing quaternions, an explorable video series. Lessons by Grant Sanderson, technology by Ben Eater. https://eater.net/quaternions

Implementing an accurate compass with Android [duplicate]

I'm using the Android gravity and magnetic field sensors to calculate orientation via SensorManager.getRotationMatrix and SensorManager.getOrientation. This gives me the azimuth, pitch and roll numbers. The results look sensible when the device is lying flat on a table.
However, I've disabled switching between portrait and landscape in the manifest, so that getWindowManager().getDefaultDisplay().getRotation() is always zero. When I rotate the device by 90 degrees so that it's standing vertically, I run into trouble. Sometimes the numbers seem quite wrong, and I've realised that this relates to gimbal lock. However, other apps don't seem to have this problem. For example, I've compared my app against two free sensor test apps (Sensor Tester (Dicotomica) and Sensor Monitoring (R's Software)). My app agrees with these apps when the device is flat, but as I rotate the device into the vertical position there can be significant differences. The two apps seem to agree with each other, so how do they get around this problem?
When the device is not flat, you have to call remapCoordinateSystem(inR, AXIS_X, AXIS_Z, outR) before calling getOrientation.
The azimuth returned by getOrientation is obtained by orthogonally projecting the device's unit Y axis onto the world East-North plane and then calculating the angle between the resulting projection vector and the North axis.
Now we normally think of direction as the direction the back camera is pointing. That is the direction of -Z, where Z is the device axis pointing out of the screen. When the device is flat we don't think about direction and accept whatever is given. But when it is not flat, we expect the direction of -Z. getOrientation, however, calculates the direction of the Y axis, so we need to swap the Y and Z axes before calling getOrientation. That is exactly what remapCoordinateSystem(inR, AXIS_X, AXIS_Z, outR) does: it keeps the X axis intact and maps Z to Y.
So how do you know when to remap and when not to? You can check the inclination:
float inclination = (float) Math.acos(rotationMatrix[8]);
if (inclination < TWENTY_FIVE_DEGREE_IN_RADIAN
        || inclination > ONE_FIFTY_FIVE_DEGREE_IN_RADIAN) {
    // device is flat: just call getOrientation
} else {
    // device is not flat: call remapCoordinateSystem first
}
The inclination above is the angle between the device screen and the world East-North plane. It shows how much the device is tilted.
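Putting the two pieces together, a Kotlin sketch of the whole flow might look like this (variable names are mine):

import android.hardware.SensorManager
import kotlin.math.acos

fun computeOrientation(accel: FloatArray, magnetic: FloatArray): FloatArray {
    val r = FloatArray(9)
    val remapped = FloatArray(9)
    val orientation = FloatArray(3)
    SensorManager.getRotationMatrix(r, null, accel, magnetic)
    val inclination = acos(r[8].toDouble())   // tilt of the screen away from the East-North plane
    val flat = inclination < Math.toRadians(25.0) || inclination > Math.toRadians(155.0)
    return if (flat) {
        SensorManager.getOrientation(r, orientation)           // azimuth follows the device Y axis
    } else {
        SensorManager.remapCoordinateSystem(r, SensorManager.AXIS_X, SensorManager.AXIS_Z, remapped)
        SensorManager.getOrientation(remapped, orientation)    // azimuth now follows -Z (back camera)
    }
}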
I think the best way of defining your orientation angles when the device isn't flat is to use a more appropriate angular co-ordinate system than the standard Euler angles that you get from SensorManager.getOrientation(...). I suggest the one that I describe here on math.stackexchange.com. I've also put some code that implements it in an answer here. Apart from a good definition of azimuth, it also has a definition of the pitch angle, which is exactly the angle given by Math.acos(rotationMatrix[8]) mentioned in another answer here.
You can get full details from the two links that I've given in the first paragraph. However, in summary, your rotation matrix R from SensorManager.getRotationMatrix(...) is

R = [ Ex  Ey  Ez ]
    [ Nx  Ny  Nz ]
    [ Gx  Gy  Gz ]

where (Ex, Ey, Ez), (Nx, Ny, Nz) and (Gx, Gy, Gz) are the vectors pointing due East, due North, and in the direction of gravity, expressed in device coordinates. The azimuth angle that you want can then be computed from the entries of this matrix; the exact formula is given in the two links above.
I'll add an example that worked for me, using Hoan's answer, and also highlight some new resources that are available now and may help when looking at apps that need to use the gyro sensors.
Firstly, there is a Google codelab available with a sample app - I found the completed sample app very useful for understanding a device's behaviour.
(Screenshot: the sample app displaying the current azimuth, pitch and roll values.)
The codelab and the final app version are available at these links (at the time of writing):
https://developer.android.com/codelabs/advanced-android-training-sensor-orientation#0
https://github.com/google-developer-training/android-advanced/tree/master/TiltSpot
In particular this allows you to experiment as follows (it's easier to do than to describe):
Put your device down flat on a table and experiment with rotating it on the table while flat, then lifting the top and bottom (i.e. lift the top of the display, where your front camera typically is, off the table while leaving the bottom, where your home button typically is if you have one, on the table). Similarly, lift the left and right edges of the device. Observe the azimuth, pitch and roll values in the app.
Put your device against a wardrobe door in portrait mode and again experiment with moving it around. Open the door to see the effect as well. Rotate to landscape to see the difference.
The key thing I think you may see from the above is that everything is very simple and works well when flat and everything is complex and interrelated when not flat.
Looking specifically at a use case where I had to display the pitch and roll when the device is vertical in portrait mode, the following worked for me:
when (event?.sensor?.type) {
    Sensor.TYPE_ROTATION_VECTOR -> {
        // Calculate the rotation matrix from the rotation vector
        val rotMatrix = FloatArray(9)
        val rotationMatrixAdjusted = FloatArray(9)
        val orientation = FloatArray(3)
        SensorManager.getRotationMatrixFromVector(rotMatrix, event.values)
        // Remap for a device held vertically in portrait orientation
        SensorManager.remapCoordinateSystem(rotMatrix,
            SensorManager.AXIS_Y, SensorManager.AXIS_MINUS_X,
            rotationMatrixAdjusted)
        SensorManager.getOrientation(rotationMatrixAdjusted, orientation)
        val rollAngleRadians = orientation[1]
        val pitchAngleRadians = orientation[2]
        // Report results back to listener
        thisListener.onRollPitchEvent(rollAngleRadians, pitchAngleRadians)
    }
}

Getting magnetic field values in global coordinates

For an Android application, I need to get magnetic field measurements along the axes of the global (world) coordinate system. Here is how I'm planning (guessing) to implement this. Please correct me if necessary. Also, please note that the question is about the algorithmic part of the task, not about the Android sensor APIs - I have experience with the latter.
The first step is to obtain TYPE_MAGNETIC_FIELD sensor data (M) and TYPE_ACCELEROMETER sensor data (G). The second is supposed to be used according to Android's documentation, but I'm not sure if it shouldn't be TYPE_GRAVITY instead (again as G), because the accelerometer does not seem to provide pure gravity.
The next step is to get the rotation matrices via getRotationMatrix(R, I, G, M), where R and I are the rotation and inclination matrices respectively.
And now comes the most questionable part: in order to convert the M vector into the world's coordinate system, I suppose I should multiply [R * I] * M.
I'm not sure this is the correct way of transforming the magnetic field reading into another basis. Also, I don't know if remapCoordinateSystem should be used in addition to, or as a replacement for, something above.
If there exists some source code which does this thing already, I'd appreciate posting a link, but I don't want to use big general purposes libraries (for example, for augmented reality support) for this specific task, because I'd like to keep it as simple as possible.
P.S.
I decided to add some information to the original post for clarity.
Let us suppose a device rests on a table and continuously reads data from its magnetic sensor. Each measurement contains 3 values, representing the magnetic field along the X, Y, Z axes of the device's local coordinate system. I take it that I can neglect environmental field fluctuations (smoothed by a low-pass filter), so these 3 values should remain almost the same for as long as the device stays in place. If we rotate the device around any axis, the values change, because we change the local coordinate system. But the field itself has not actually changed. So I want to translate the local X, Y, Z field measurements into X', Y', Z' such that they keep their respective values regardless of device rotation, provided that the device is not moved from its location (only rotated).
I've implemented the algorithm described above and got regular and noticeable changes in the values X', Y', Z' obtained through the suggested transformations, so there is something wrong with it.
P.P.S.
By chance I've found an exact duplicate of my question here on SO - How can I get the magnetic field vector, independent of the device rotation? - but unfortunately the answer contains my suggestions, and the OP of that question confirms that they do not work.
The coordinates of M with respect to the world coordinate system are just the multiplication R * M.
The rotation matrix R is, mathematically, the change-of-basis matrix from the device coordinate system to the world coordinate system.
Let X, Y, Z be the device coordinate basis and W_1, W_2, W_3 be the world coordinate basis; then
M = m_1 X + m_2 Y + m_3 Z
and also
M = c_1 W_1 + c_2 W_2 + c_3 W_3
where R * (m_1, m_2, m_3)^T = (c_1, c_2, c_3)^T.
The low-pass filter is only used to filter out accelerations in the X, Y directions. remapCoordinateSystem is used to change the order of the basis, i.e. changing from W_1, W_2, W_3 to W_1, W_3, W_2.
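As a sketch of that computation (names are mine), the multiplication R * M can be written out directly:

import android.hardware.SensorManager

// Rotate the raw magnetometer vector into world coordinates using R from
// getRotationMatrix(). While the device is only rotated in place, the returned
// vector should stay roughly constant.
fun magneticInWorldCoords(gravity: FloatArray, magnetic: FloatArray): FloatArray? {
    val r = FloatArray(9)
    if (!SensorManager.getRotationMatrix(r, null, gravity, magnetic)) return null
    return floatArrayOf(
        r[0] * magnetic[0] + r[1] * magnetic[1] + r[2] * magnetic[2],   // East component
        r[3] * magnetic[0] + r[4] * magnetic[1] + r[5] * magnetic[2],   // North component
        r[6] * magnetic[0] + r[7] * magnetic[1] + r[8] * magnetic[2]    // Up component
    )
}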
The magnetometer sensor on your device returns a 3-vector in device coordinates. You can use getRotationMatrix() to get a matrix that could be used to convert that device-coordinates vector to world coordinates. You could also learn about quaternions and use TYPE_ROTATION_VECTOR directly. However, there's no quaternion library in Android (that I know of) and that's a discussion beyond the scope of this question.
However, none of this will do you any good because the device orientation information is based in part on the value from the magnetometers. In other words, the device will always tell you that the magnetic vector is facing exactly North.
Now, what you can do is get the magnetic dip. This is one of the outputs from getRotationMatrix() (the inclination matrix I), although you'll have to convert that matrix to an angle for it to be useful. That, too, is beyond the scope of this question.
Finally, your last option is to build a table which is level and which has an arrow on it pointing true north. (You'll have to align it by the stars at night or something.) Then, place your device flat on the table with the top of the device facing north. In this case, device coordinates will be the same as world coordinates and the magnetometer sensor will produce the values you want.
Your comments indicate that you're interested in local variations. There's simply no way to get true north with your Android device alone. Theoretically, you could build a table as I described, and then walk around holding the device in strictly the same orientation as before, keeping an eye on the table for reference. I doubt you could pull it off, though.
You could try using gyros in your app to help you keep the device oriented exactly the same way at all times, but the gyros in any Android device you use are likely to drift too much for this to work.
Or perhaps we still don't understand what you're trying to do. Bottom line, though, is that you simply cannot get a global coordinate system with an Android device alone -- whatever you get will always be aligned with the local magnetic field at that exact spot.

Changing sensor coordinate system in android

Android provides sensor data in the device coordinate system no matter how the device is oriented. Is there any way to get sensor data in a 'gravity' coordinate system? I mean, no matter how the device is oriented, I want accelerometer data and orientation in a coordinate system where the y-axis points toward the sky, the x-axis toward the east and the z-axis toward the south pole.
I took a look at remapCoordinateSystem but it seems to be limited to only swapping axes. I guess for orientation I will have to do some low-level rotation matrix transformation (or is there a better solution?). But what about acceleration data? Is there any way to get the data relative to a coordinate system that is fixed (a sort of world coordinate system)?
The reason I need this is that I'm trying to detect some simple motion gestures while the phone is in a pocket, and it would be easier for me to have all data in a coordinate system related to the user rather than the device coordinate system (which will have a slightly different orientation in different users' pockets).
Well, you basically get the orientation relative to North when starting - for this you use the accelerometer and the magnetic field sensor to compute the orientation (or the deprecated orientation sensor).
Once you have it you can compute a rotation matrix (Direction Cosine Matrix) from those azimuth, pitch and roll angles. Multiplying your acceleration vector by that matrix will transform your device-frame movements into Earth-frame ones.
As your device will change its orientation as time goes by, you'll need to update it. To do so, retrieve the gyroscope's data and update your Direction Cosine Matrix for each new value. You could also recompute the orientation from the accelerometer and magnetometer just like the first time, but it's less accurate.
My solution involves a DCM, but you could also use quaternions; it's just a matter of choice. Feel free to ask more if needed. I hope this is what you wanted to know!
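A sketch of the first step only (my own names; keeping the DCM updated from the gyroscope is not shown):

import android.hardware.SensorManager

// Build the DCM from the accelerometer + magnetometer, then rotate the
// device-frame acceleration into the Earth frame.
fun accelerationInEarthFrame(accel: FloatArray, magnetic: FloatArray): FloatArray? {
    val dcm = FloatArray(9)
    if (!SensorManager.getRotationMatrix(dcm, null, accel, magnetic)) return null
    return floatArrayOf(
        dcm[0] * accel[0] + dcm[1] * accel[1] + dcm[2] * accel[2],   // East
        dcm[3] * accel[0] + dcm[4] * accel[1] + dcm[5] * accel[2],   // North
        dcm[6] * accel[0] + dcm[7] * accel[1] + dcm[8] * accel[2]    // Up (still includes gravity)
    )
}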

How to remove Gravity factor from Accelerometer readings in Android 3-axis accelerometer

Can anyone help with removing the g factor from accelerometer readings?
I am using a SensorEventListener with the onSensorChanged() method to get Sensor.TYPE_ACCELEROMETER data. I need only pure acceleration values in all directions, so in any state where the device is stable (or moving at constant speed), it should give roughly (0.0, 0.0, 0.0).
Currently, depending on its pitch and roll, it gives me varying output because of the g force acting on each axis.
I hope there is some formula to remove this, as I also get orientation values (pitch and roll) from the Sensor.TYPE_ORIENTATION listener. I have tried a few but they didn't work.
You can use a low-pass filter.
Do this for each of your sensor values:
g = 0.9 * g + 0.1 * v
Where v is your current sensor value and g is a global variable initially set to zero. Mind that you'll need as many g variables as you have axes.
With v = v - g you can eliminate the gravity factor from your sensor value.
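In code form (a sketch; the array names are mine), the same filter applied to all three axes looks like this:

// The running gravity estimates, one per axis, starting at zero.
val gravity = FloatArray(3)

fun removeGravity(v: FloatArray): FloatArray {
    val linear = FloatArray(3)
    for (i in 0..2) {
        gravity[i] = 0.9f * gravity[i] + 0.1f * v[i]   // low-pass: isolate gravity
        linear[i] = v[i] - gravity[i]                  // what's left is the linear acceleration
    }
    return linear
}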
Use Sensor.TYPE_LINEAR_ACCELERATION instead of Sensor.TYPE_ACCELEROMETER
Take a look at the following link:
http://developer.android.com/reference/android/hardware/SensorEvent.html
Just subtract out g (~9.8 m/s^2) times the bottom row of the rotation matrix (the world z direction expressed in device coordinates).
Or to be more explicit about it, let
a = your accelerometer reading,
R = your rotation matrix (as a 9-long vector).
Then what you want is
(a[0]-g*R[6], a[1]-g*R[7], a[2]-g*R[8]).
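Or, as a small Kotlin sketch of the same subtraction (using SensorManager.GRAVITY_EARTH for g):

import android.hardware.SensorManager

// a: accelerometer reading, r: 9-element rotation matrix from getRotationMatrix()
fun subtractGravity(a: FloatArray, r: FloatArray): FloatArray {
    val g = SensorManager.GRAVITY_EARTH   // ~9.81 m/s^2
    return floatArrayOf(a[0] - g * r[6], a[1] - g * r[7], a[2] - g * r[8])
}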
Differentiating a function of time with respect to time rids you of any constant terms.
So by taking the derivative of the accelerometer's signal you'll get the "Jerk", which you can then re-integrate in order to get the non-constant part of the acceleration you're looking for.
In layman's terms: take a sample from the accelerometer every second and subtract it from the previous sample. If the answer is (very close to) zero, you're not accelerating relative to the earth. If the result is non-zero, integrate it (in this case, multiply by one second) and you have your acceleration.
Two things, though:
- Look out for noise in the signal; round off your input.
- Don't expect hyper-accurate results from on-chip accelerometers. You can use them to detect shaking or changes in orientation, but not really for knowing how many g's you're experiencing while making sharp turns in your car.
One way (for devices with only an accelerometer) is to remove the gravity vector from the accelerometer data by subtracting the values that would be read in the static case for the same orientation. But as the orientation is itself calculated from the acceleration readings, not independently, this is not very accurate.
A gyroscope may help in this case, but few Android devices have a true gyroscope, and using its raw readings is not so simple.
You need to assume two coordinate systems:
1- a fixed global system.
2- a moving coordinate system whose origin moves and rotates as the sensor does.
In the global system, g is always parallel to the z axis, but in the moving system it is not.
So all you have to do is compute the 3x3 rotation matrix from the orientation angles (yaw, pitch and roll); you can find the formulas everywhere.
Then multiply this rotation matrix by the 3x1 acceleration vector measured by the sensor.
This transforms the coordinates and expresses the values in the fixed global system.
The only thing left afterward is to simply subtract g from the z value.
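A sketch of those steps (my own names; it uses getRotationMatrix() rather than composing the matrix from the angles by hand):

import android.hardware.SensorManager

fun linearAccelerationGlobal(accel: FloatArray, magnetic: FloatArray): FloatArray? {
    val r = FloatArray(9)
    if (!SensorManager.getRotationMatrix(r, null, accel, magnetic)) return null
    // Rotate the device-frame reading into the fixed global frame...
    val x = r[0] * accel[0] + r[1] * accel[1] + r[2] * accel[2]
    val y = r[3] * accel[0] + r[4] * accel[1] + r[5] * accel[2]
    val z = r[6] * accel[0] + r[7] * accel[1] + r[8] * accel[2]
    // ...then subtract g from the (now vertical) z value.
    return floatArrayOf(x, y, z - SensorManager.GRAVITY_EARTH)
}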
