Implementing an accurate compass with Android [duplicate]

I'm using the Android gravity and magnetic field sensors to calculate orientation via SensorManager.getRotationMatrix and SensorManager.getOrientation. This gives me the azimuth, pitch and roll numbers. The results look sensible when the device is lying flat on a table.
However, I've disabled switching between portrait and landscape in the manifest, so that getWindowManager().getDefaultDisplay().getRotation() is always zero. When I rotate the device by 90 degrees so that it's standing vertically, I run into trouble. Sometimes the numbers seem quite wrong, and I've realised that this relates to gimbal lock. However, other apps don't seem to have this problem. For example, I've compared my app against two free sensor test apps (Sensor Tester (Dicotomica) and Sensor Monitoring (R's Software)). My app agrees with these apps when the device is flat, but as I rotate the device into the vertical position there can be significant differences. The two apps seem to agree with each other, so how do they get around this problem?

When the device is not flat, you have to call remapCoordinateSystem(inR, AXIS_X, AXIS_Z, outR); before calling getOrientation.
The azimuth returned by getOrientation is obtained by orthogonally projecting the device's unit Y axis onto the world East-North plane and then calculating the angle between the resulting projection and the North axis.
Now, we normally think of direction as the direction the back camera is pointing. That is the direction of -Z, where Z is the device axis pointing out of the screen. When the device is flat we do not think of direction and accept whatever is given. But when it is not flat we expect the direction of -Z. However, getOrientation calculates the direction of the Y axis, so we need to swap the Y and Z axes before calling getOrientation. That is exactly what remapCoordinateSystem(inR, AXIS_X, AXIS_Z, outR) does: it keeps the X axis intact and maps Z to Y.
So how do you know when to remap and when not to? You can check:
private static final float TWENTY_FIVE_DEGREE_IN_RADIAN = 0.436332f;
private static final float ONE_FIFTY_FIVE_DEGREE_IN_RADIAN = 2.705260f;

float inclination = (float) Math.acos(rotationMatrix[8]);
if (inclination < TWENTY_FIVE_DEGREE_IN_RADIAN
        || inclination > ONE_FIFTY_FIVE_DEGREE_IN_RADIAN) {
    // Device is (close to) flat: just call getOrientation
    SensorManager.getOrientation(rotationMatrix, orientation);
} else {
    // Device is tilted up: remap, then call getOrientation
    SensorManager.remapCoordinateSystem(rotationMatrix,
            SensorManager.AXIS_X, SensorManager.AXIS_Z, remappedMatrix);
    SensorManager.getOrientation(remappedMatrix, orientation);
}
The inclination above is the angle between the device screen and the world East-North plane. It shows how much the device is tilting.

I think the best way of defining your orientation angles when the device isn't flat is to use a more appropriate angular co-ordinate system than the standard Euler angles that you get from SensorManager.getOrientation(...). I suggest the one that I describe here on math.stackexchange.com. I've also put some code that implements it in an answer here. Apart from a good definition of azimuth, it also has a better definition of the pitch angle, which is exactly the angle given by Math.acos(rotationMatrix[8]) mentioned in another answer here.
You can get full details from the two links that I've given in the first paragraph. However, in summary, your rotation matrix R from SensorManager.getRotationMatrix(...) is

R = | Ex  Ey  Ez |
    | Nx  Ny  Nz |
    | Gx  Gy  Gz |

where (Ex, Ey, Ez), (Nx, Ny, Nz) and (Gx, Gy, Gz) are the vectors pointing due East, due North, and in the direction of gravity, all expressed in device coordinates (in array terms, Ez = rotationMatrix[2] and Nz = rotationMatrix[5]). Then the azimuth angle that you want is given by

azimuth = atan2(-Ez, -Nz)
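For concreteness, here is a minimal Kotlin sketch of those two formulas (the function name is mine; it assumes gravity and geomagnetic are float[3] readings from the corresponding sensors, and uses the indices above, where r[2] = Ez, r[5] = Nz and r[8] = Gz):

import android.hardware.SensorManager
import kotlin.math.acos
import kotlin.math.atan2

// Sketch: camera-pointing azimuth and tilt straight from the rotation matrix.
// Returns null when getRotationMatrix fails (e.g. during free fall).
fun azimuthAndPitch(gravity: FloatArray, geomagnetic: FloatArray): Pair<Float, Float>? {
    val r = FloatArray(9)
    if (!SensorManager.getRotationMatrix(r, null, gravity, geomagnetic)) return null
    val azimuth = atan2(-r[2], -r[5])  // atan2(-Ez, -Nz): heading of the -Z (back camera) axis
    val pitch = acos(r[8])             // acos(Gz): 0 when flat, pi/2 when vertical
    return azimuth to pitch
}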

I'll add an example that worked for me, using Hoan's answer, and also highlight some new resources that are available now and may help when looking at apps that need to use the gyro sensors.
Firstly, there is a Google codelab available with a sample app - I found the completed sample app very useful for understanding a device's behaviour.
The codelab and the final app version are available at these links (at the time of writing):
https://developer.android.com/codelabs/advanced-android-training-sensor-orientation#0
https://github.com/google-developer-training/android-advanced/tree/master/TiltSpot
In particular this allows you to experiment as follows (it's easier to do than to describe...):
Put your device down flat on a table and experiment with rotating it on the table while flat. Lift the top and bottom edges (i.e. lift the top of the display, where your front camera typically is, off the table while leaving the bottom, where your home button typically is if you have one, on the table). Similarly, lift the left and right edges of the device. Observe the azimuth, pitch and roll values in the app.
Put your device against a wardrobe door in portrait mode and again experiment with moving it around. Open the door to see the effect as well. Rotate to landscape to see the difference.
The key thing I think you may see from the above is that everything is very simple and works well when the device is flat, and everything is complex and interrelated when it is not.
Looking specifically at a use case I had, displaying the pitch and roll when vertical in portrait mode, the following worked for me:
when (event?.sensor?.type) {
    Sensor.TYPE_ROTATION_VECTOR -> {
        // Calculate the rotation matrix from the rotation vector
        val rotMatrix = FloatArray(9)
        val rotationMatrixAdjusted = FloatArray(9)
        val orientation = FloatArray(3)
        SensorManager.getRotationMatrixFromVector(rotMatrix, event.values)
        // Remap for a device held vertically in portrait orientation
        SensorManager.remapCoordinateSystem(rotMatrix,
            SensorManager.AXIS_Y, SensorManager.AXIS_MINUS_X,
            rotationMatrixAdjusted)
        SensorManager.getOrientation(rotationMatrixAdjusted, orientation)
        val rollAngleRadians = orientation[1]
        val pitchAngleRadians = orientation[2]
        // Report results back to the listener
        thisListener.onRollPitchEvent(rollAngleRadians, pitchAngleRadians)
    }
}

Related

Android accelerometer mapping movement to custom coordinate system

If I have a custom coordinate system X - left/right, Y - forward/backward, Z - up/down that is represented on my PC screen inside my Unreal project, how would I map the accelerometer values in a way that when I move my phone toward the PC screen (regardless of the phone's orientation) my Y value goes up, and the same for the other axes?
I got something similar working with rotation by taking a "reference" rotation quaternion, inverting it and multiplying it by the current rotation quaternion, but I'm stuck on how to transform movement.
An example of my problem: if I'm moving my phone up with the screen pointing at the sky, my Z axis increases, which is what I want; but when I point my phone screen at my PC screen and move it forward, the Z axis again goes up, when I would want my Y value to increase in this case.
There is a similar question, Acceleration from device's coordinate system into absolute coordinate system, but that doesn't really solve my problem, since I don't want to depend on the location of north for Y, and so on.
Clarification of question intent
It sounds like what you want is the acceleration of your device with respect to your laptop. As you correctly mentioned, the similar question Acceleration from device's coordinate system into absolute coordinate system maps the local accelerometer data of a device with respect to a global frame of reference (FoR) (the Cartesian "flat" Earth FoR to be specific - as opposed to the ultra-realistic spherical Earth FoR).
What you know
From your device, you know the local Phone FoR, and from the link above, you can also find the behavior of your device with respect to a flat Earth FoR via a rotation matrix, which I'll call R_EP for Rotation in Earth FoR from Phone FoR. In order to represent the acceleration of your device with respect to your laptop, you will need to know how your laptop is oriented and positioned with respect to either your phone's FoR (A), or the flat Earth FoR (B), or some other FoR that is known to both your laptop and your phone (I'll ignore this last case because the method is identical to B).
What you'll need
In the first case, A, this will allow you to construct a rotation matrix which I'll call R_LP for Rotation in Laptop FoR from Phone FoR - and that would be super convenient because that's your answer. But alas, life isn't fun without a little bit of a challenge.
In the second case, B, this will allow you to construct a rotation matrix which I'll call R_LE for Rotation in Laptop FoR from Earth FoR. Because the Hamilton product is associative (but NOT commutative: Are quaternions generally multiplied in an order opposite to matrices?), you can find the acceleration of your phone with respect to your laptop by daisy-chaining the rotations, like so:
a_P]L = R_LE * R_EP * a_P]P
Where the ] means "in the frame of", and a_P is acceleration of the Phone. So a_P]L is the acceleration of the Phone in the Laptop FoR, and a_P]P is the acceleration of the Phone in the Phone's FoR.
NOTE When "daisy-chaining" rotation matrices, it's important that they follow a specific order. Always make sure that the rotation matrices are multiplied in the correct order, see Sections 2.6 and 3.1.4 in [1] for more information.
Hint
To define your laptop's FoR (orientation and position) with respect to the global "flat" Earth FoR, you can place your phone on your laptop and set the current orientation and position as your laptop's FoR. This will let you construct R_LE.
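A minimal Kotlin sketch of this calibrate-then-chain idea (the class and helper names are mine; it assumes R_EP comes from a TYPE_ROTATION_VECTOR event, and does the small matrix algebra by hand, using the row-major 3x3 layout documented for getRotationMatrixFromVector, to keep the conventions explicit):

import android.hardware.SensorManager

// Hypothetical helper: maps a phone-frame acceleration into the laptop's frame.
class RelativeFrameMapper {
    private val rEP = FloatArray(9)   // R_EP: Earth-from-Phone, refreshed on every event
    private var rLE = FloatArray(9)   // R_LE: Laptop-from-Earth, frozen at calibration

    // Feed this from onSensorChanged with a TYPE_ROTATION_VECTOR event's values.
    fun onRotationVector(rotationVector: FloatArray) {
        SensorManager.getRotationMatrixFromVector(rEP, rotationVector)
    }

    // Call once while the phone lies on the laptop: at that instant the Laptop
    // FoR coincides with the Phone FoR, so R_LE = R_PE = transpose(R_EP).
    fun calibrate() {
        rLE = transpose(rEP)
    }

    // a_P]L = R_LE * R_EP * a_P]P
    fun phoneAccelInLaptopFrame(aPhone: FloatArray): FloatArray =
        mulMV(mulMM(rLE, rEP), aPhone)

    private fun transpose(m: FloatArray) = floatArrayOf(
        m[0], m[3], m[6],
        m[1], m[4], m[7],
        m[2], m[5], m[8])

    private fun mulMM(a: FloatArray, b: FloatArray): FloatArray {
        val out = FloatArray(9)
        for (i in 0..2) for (j in 0..2)
            out[3 * i + j] = a[3 * i] * b[j] + a[3 * i + 1] * b[3 + j] + a[3 * i + 2] * b[6 + j]
        return out
    }

    private fun mulMV(m: FloatArray, v: FloatArray) = floatArrayOf(
        m[0] * v[0] + m[1] * v[1] + m[2] * v[2],
        m[3] * v[0] + m[4] * v[1] + m[5] * v[2],
        m[6] * v[0] + m[7] * v[1] + m[8] * v[2])
}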
Misconceptions
A rotation quaternion, q, is NEITHER the orientation NOR attitude of one frame of reference relative to another. Instead, it represents a "midpoint" vector normal to the rotation plane about which vectors from one frame of reference are rotated to the other. This is why defining quaternions to rotate from a GLOBAL frame to a local frame (or vice-versa) is incredibly important. The ENU to NED rotation is a perfect example, where the rotation quaternion is [0; sqrt(2)/2; sqrt(2)/2; 0], a "midpoint" between the two abscissa (X) axes (in both the global and local frames of reference). If you do the "right hand rule" with your three fingers pointing along the ENU orientation, and rapidly switch back and forth from the NED orientation, you'll see that the rotation from both FoR's is simply a rotation about [1; 1; 0] in the Global FoR.
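To make the ENU-to-NED example concrete, here is a small self-contained Kotlin sketch (no Android dependencies; all names are mine) that rotates the ENU basis vectors by q = [0; sqrt(2)/2; sqrt(2)/2; 0] and prints that they land on the NED axes:

import kotlin.math.sqrt

// Quaternion as (w, x, y, z); rotating vector v is q * v * conj(q) (Hamilton product).
data class Quat(val w: Double, val x: Double, val y: Double, val z: Double) {
    operator fun times(o: Quat) = Quat(
        w * o.w - x * o.x - y * o.y - z * o.z,
        w * o.x + x * o.w + y * o.z - z * o.y,
        w * o.y - x * o.z + y * o.w + z * o.x,
        w * o.z + x * o.y - y * o.x + z * o.w)
    fun conj() = Quat(w, -x, -y, -z)
    fun rotate(v: DoubleArray): DoubleArray {
        val r = this * Quat(0.0, v[0], v[1], v[2]) * conj()
        return doubleArrayOf(r.x, r.y, r.z)
    }
}

fun main() {
    val s = sqrt(2.0) / 2
    val qEnuToNed = Quat(0.0, s, s, 0.0)  // the "midpoint" rotation described above
    // ENU basis: East, North, Up. Expected images in NED (up to rounding):
    // East -> (0, 1, 0), North -> (1, 0, 0), Up -> (0, 0, -1).
    for (v in listOf(doubleArrayOf(1.0, 0.0, 0.0),
                     doubleArrayOf(0.0, 1.0, 0.0),
                     doubleArrayOf(0.0, 0.0, 1.0))) {
        println(qEnuToNed.rotate(v).toList())
    }
}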
References
I cannot recommend the following open-source reference highly enough:
[1] "Quaternion kinematics for the error-state Kalman filter" by Joan Solà. https://hal.archives-ouvertes.fr/hal-01122406v5
For a "playground" to experiment with, and gain a "hands-on" understanding of quaternions:
[2] Visualizing quaternions, An explorable video series. Lessons by Grant Sanderson. Technology by Ben Eater https://eater.net/quaternions

World reference values for the accelerometer sensor when walking

I have been trying to develop a Pedestrian Dead Reckoning application for Android, and after taking care of the step detection and step length components, I have decided to tackle the orientation determination problem.
After stumbling on a couple of posts regarding coordinate transformation (and even chatting with a frequent answerer), I have been getting gradually better results, but there are still some things that bother me.
The experiment:
I walked forward Northward, turned back, and walked back Southward. Repeated the procedure towards West, then East.
Issues:
1. I expected, while walking straight in several directions, to have the X and Y values oscillate with the footsteps, and a relatively stable Z value throughout. Instead, the Y values behave this way, with the Z value having its expected behavior. How come? Does it have anything to do with me not using remapCoordinates()? (see Graph 1)
2. I expected the angle plots to jump around 180º and -180º, but why do they also do it around 35º? (see Graph 2)
Notes:
I am using gravity and magnetometer values to compute the rotation matrix, and multiplying the sensor vector by it using OpenGL's multiplyMV();
I am not using remapCoordinates(), because I thought I didn't need to: the phone is upright in my pocket (Y points up/down, Z usually forward) and should tilt about 45º backwards and forwards at worst;
Azimuth values seem OK, and do not have the oscillation described in issue 2. (see Graph 3)
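For reference, a Kotlin sketch of the transformation described in the notes above (the helper name is mine, and I've written the 3x3 multiply by hand rather than via OpenGL's multiplyMV, to keep the row-major layout documented for getRotationMatrix explicit):

import android.hardware.SensorManager

// Sketch: rotate a device-frame acceleration reading into world (East/North/Up)
// coordinates. gravity, geomagnetic and accel are float[3] readings from
// TYPE_GRAVITY, TYPE_MAGNETIC_FIELD and TYPE_LINEAR_ACCELERATION respectively.
fun accelToWorld(gravity: FloatArray, geomagnetic: FloatArray, accel: FloatArray): FloatArray? {
    val r = FloatArray(9)
    if (!SensorManager.getRotationMatrix(r, null, gravity, geomagnetic)) return null
    // world = R * device, with R in the row-major layout the docs describe
    return floatArrayOf(
        r[0] * accel[0] + r[1] * accel[1] + r[2] * accel[2],  // East
        r[3] * accel[0] + r[4] * accel[1] + r[5] * accel[2],  // North
        r[6] * accel[0] + r[7] * accel[1] + r[8] * accel[2])  // Up
}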
Graphs:
World Reference Gravity Acceleration
(blue is X, red is Y, green is Z)
World Reference Gravity Angles
(blue is atan2(Y, X), red is atan2(Z, Y) and green is atan2(Z, X))
Orientation Values
(blue is azimuth, red is pitch and green is roll)

How to get the rotation between accelerometer's axis and motion vector?

I'm working on a Pedestrian Navigation System on Android. I am currently trying to get the rotation matrix between the 3-axis accelerometer referential and the motion that is applied to the device.
Let's say you are walking straight forward with the device in your hand, but the Y axis of the accelerometer (thus the device's Y axis) is not oriented in the same direction as the one you're heading (basically you're holding the device in an awkward way).
Then if I want to apply distances (based on step detection), it would be wrong to apply them in the accelerometer referential (whose orientation I know): they have to be rotated according to the motion heading.
This is why I would like to know whether there is a method to process the accelerometer readings and turn them into rotation angles (or a matrix). Such a method has to avoid integrating the acceleration even once, as the error is abysmal on cheap accelerometers (otherwise it would be pretty easy to do, I think).
EDIT: The magnetometer and gyroscope help you find the device's orientation even when stationary, but don't tell you in which direction the device is moving. I have the first, and I'm searching for the second. Basically:
Human Referential (motion direction) -> Device Referential (or accelerometer referential) -> Earth Referential
and I'm searching for the rotation matrix to take distances from HR to DR, which I would then combine with the rotation matrix I already have to go from DR to ER.
I found an article which might be useful, here is the link.
The idea is based upon acquiring the accelerometer data on the 3 axes (which gives an n-by-3 matrix) and finding the PCA (principal component analysis) of that matrix, i.e. the vectors along which the data has the highest energy.
Ideally, the main vector (the one with the highest energy) points upwards, and the second one is the heading direction.
You can read it all in the article.
I tried implementing the algorithm in Matlab; the result is OK (not great).
Hope you can do better (I would like to hear about good results).
Hope this helps
Ariel
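A rough Kotlin sketch of the PCA step as described above (all names are mine; it uses power iteration with deflation on the covariance of the n-by-3 sample matrix):

import kotlin.math.sqrt

// Rough sketch: estimate the two dominant principal directions of n accelerometer
// samples (each a 3-vector). Ideally the first is the vertical, the second the heading.
fun principalDirections(samples: List<DoubleArray>): Pair<DoubleArray, DoubleArray> {
    val n = samples.size
    val mean = DoubleArray(3)
    for (s in samples) for (i in 0..2) mean[i] += s[i] / n
    // 3x3 covariance matrix of the mean-centered samples
    val c = Array(3) { DoubleArray(3) }
    for (s in samples)
        for (i in 0..2) for (j in 0..2)
            c[i][j] += (s[i] - mean[i]) * (s[j] - mean[j]) / n
    // Power iteration: repeatedly apply the matrix to converge on its
    // dominant eigenvector (assumes a non-degenerate starting guess)
    fun powerIteration(m: Array<DoubleArray>): DoubleArray {
        var v = doubleArrayOf(1.0, 1.0, 1.0)
        repeat(100) {
            val w = DoubleArray(3) { i -> (0..2).sumOf { j -> m[i][j] * v[j] } }
            val norm = sqrt(w[0] * w[0] + w[1] * w[1] + w[2] * w[2])
            v = DoubleArray(3) { i -> w[i] / norm }
        }
        return v
    }
    val first = powerIteration(c)
    // Deflate: subtract the first component's contribution, then iterate again
    val lambda = (0..2).sumOf { i -> (0..2).sumOf { j -> first[i] * c[i][j] * first[j] } }
    val deflated = Array(3) { i -> DoubleArray(3) { j -> c[i][j] - lambda * first[i] * first[j] } }
    return first to powerIteration(deflated)
}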

Android - Steering with orientation sensor

I am writing a simple 2d game similar to breakout or pong, where you move a bar along the bottom of the screen by tipping the phone left and right.
I am trying to use the orientation sensor to achieve this movement, however I've run into some issues.
After printing out the orientation sensor's values for a while, I decided that rotation in the Z axis gave me what I wanted: if I hold the screen horizontal and perpendicular to the floor (x: 0, y: 0, z: 90), rotating left and right in Z takes away from 90. So the value 90 - z works as an "amount of steering" value.
However, this perpendicular-to-the-floor position is not very natural to play with; people are much more likely to hold the phone at about 45 degrees in the Z axis, at which point all these values I have been using mess up completely, the steering goes weird, and people are unhappy.
I guess what I really need is to detect rotation in some z-axis even though the XY plane the phone is in is constantly changing. Is there some clever way to do this with Maths?
EDIT: Just found the OrientationEventListener - Exactly what I needed!
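For anyone landing here, a minimal Kotlin sketch of wiring up OrientationEventListener (the steering mapping at the end is my own illustration, not part of the API):

import android.content.Context
import android.view.OrientationEventListener

// OrientationEventListener reports the device's rotation in degrees (0..359,
// 0 = upright in the natural orientation), derived from the accelerometer,
// independent of the display's fixed portrait/landscape setting.
fun makeSteeringListener(context: Context, onSteer: (Float) -> Unit): OrientationEventListener =
    object : OrientationEventListener(context) {
        override fun onOrientationChanged(orientation: Int) {
            // Reported when the device is too close to flat to measure rotation
            if (orientation == OrientationEventListener.ORIENTATION_UNKNOWN) return
            // Map 0..359 to a signed steering amount around "upright portrait"
            val steer = ((orientation + 180) % 360 - 180).toFloat()
            onSteer(steer)
        }
    }

// Usage: val listener = makeSteeringListener(this) { steer -> bar.move(steer) }
// Call listener.enable() in onResume() and listener.disable() in onPause().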

Compute relative orientation given azimuth, pitch, and roll in android?

When I listen to the orientation event in an Android app, I get a SensorEvent, which contains 3 floats - azimuth, pitch, and roll in relation to the real-world axes.
Now say I am building an app like Labyrinth, but I don't want to force the user to be over the phone and hold the phone such that the XY plane is parallel to the ground. Instead I want to allow the user to hold the phone as they wish, lying down, or perhaps sitting down and holding the phone at an angle. In other words, I need to calibrate the phone in accordance with the user's preference.
How can I do that?
Also note that I believe that my answer has to do with getRotationMatrix and getOrientation, but I am not sure how!
Please help! I've been stuck at this for hours.
For a Labyrinth-style app, you probably care more about the acceleration (gravity) vector than the axes' orientation. This vector, in the phone's coordinate system, is given by the combination of the three accelerometer measurements, rather than the rotation angles. Specifically, only the x and y readings should affect the ball's motion.
If you do actually need the orientation, then the 3 angular readings represent the 3 Euler angles. However, I suspect you probably don't really need the angles themselves, but rather the rotation matrix R, which is returned by the getRotationMatrix() API. Once you have this matrix, it is basically the calibration that you are looking for. When you want to transform a vector in world coordinates to your device coordinates, you should multiply it by the inverse of this matrix (where, in this special case, inv(R) = transpose(R)).
So, following the example I found in the documentation, if you want to transform the world gravity vector g ([0 0 g]) to the device coordinates, multiply it by inv(R):
g = inv(R) * g
(note that this should give you the same result as reading the accelerometers)
Possible APIs to use here: the invertM() and multiplyMV() methods of the android.opengl.Matrix class.
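As a Kotlin sketch of that calibration idea (the function names are mine; I've written the small 3x3 math by hand instead of invertM()/multiplyMV(), using inv(R) = transpose(R), so the row-major convention documented for getRotationMatrix stays explicit):

import android.hardware.SensorManager

// Sketch: capture R at a reference pose, then use inv(R) = transpose(R)
// to map world vectors into device coordinates.
fun worldToDevice(r: FloatArray, world: FloatArray): FloatArray = floatArrayOf(
    // transpose(R) * world: each output is a column of R dotted with the vector
    r[0] * world[0] + r[3] * world[1] + r[6] * world[2],
    r[1] * world[0] + r[4] * world[1] + r[7] * world[2],
    r[2] * world[0] + r[5] * world[1] + r[8] * world[2])

// Example: world gravity [0, 0, g] expressed in device coordinates should
// match the accelerometer reading when the device is at rest.
fun gravityInDeviceCoords(r: FloatArray): FloatArray =
    worldToDevice(r, floatArrayOf(0f, 0f, SensorManager.GRAVITY_EARTH))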
I don't know of any Android-specific APIs for this, but all you want to do is decrease the azimuth by a certain amount, right? So you move the "origin" from (0,0,0) to wherever the user wants. In pseudocode:
myGetRotationMatrix:
return getRotationMatrix() - origin
