World reference values for the accelerometer sensor when walking - Android

I have been trying to develop a Pedestrian Dead Reckoning application for Android, and after taking care of the step detection and step length components, I have decided to tackle the orientation determination problem.
After stumbling on a couple of posts regarding coordinate transformation (and even chatting with a frequent answerer), I have been getting gradually better results, but there are still some things that bother me.
The experiment:
I walked forward Northward, turned back, and walked back Southward. I repeated the procedure towards West, then East.
Issues:
1. I expected, while walking straight in several directions, the X and Y values to oscillate with the footsteps, and the Z value to stay relatively stable throughout. Instead, the Y values behave this way, with the Z value having its expected behavior. How come? Does it have anything to do with me not using remapCoordinates()? (see Graph 1)
2. I expected the angle plots to jump around 180º and -180º, but why do they also do it around 35º? (see Graph 2)
Notes:
- I am using gravity and magnetometer values to compute the rotation matrix, and multiplying it using OpenGL's multiplyMV() (see the sketch after these notes);
- I am not using remapCoordinates(), because I thought I didn't need to: the phone is upright in my pocket (Y points up/down, Z usually forward) and should be displaced 45º backwards and forwards, at worst;
- Azimuth values seem OK, and do not have the oscillation described in issue 2. (see Graph 3)
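For reference, here is a minimal sketch of that pipeline (my code, not the asker's; the function name and inputs are hypothetical). It multiplies by the row-major matrix from getRotationMatrix() by hand, since android.opengl.Matrix.multiplyMV() expects a column-major 4x4 array and would otherwise apply the transpose:

import android.hardware.SensorManager

// gravity, geomagnetic and accel hold the latest TYPE_GRAVITY,
// TYPE_MAGNETIC_FIELD and TYPE_ACCELEROMETER readings (hypothetical names).
fun deviceToWorld(gravity: FloatArray, geomagnetic: FloatArray, accel: FloatArray): FloatArray? {
    val r = FloatArray(9)
    if (!SensorManager.getRotationMatrix(r, null, gravity, geomagnetic)) return null
    // world = R * accel; the rows of R are the world East/North/up axes
    // expressed in device coordinates
    return floatArrayOf(
        r[0] * accel[0] + r[1] * accel[1] + r[2] * accel[2],  // East
        r[3] * accel[0] + r[4] * accel[1] + r[5] * accel[2],  // North
        r[6] * accel[0] + r[7] * accel[1] + r[8] * accel[2]   // up
    )
}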
Graphs:
Graph 1: World Reference Gravity Acceleration (blue is X, red is Y, green is Z)
Graph 2: World Reference Gravity Angles (blue is atan2(Y, X), red is atan2(Z, Y), green is atan2(Z, X))
Graph 3: Orientation Values (blue is azimuth, red is pitch, green is roll)

Related

Interpretation of Rotation Vector Measurements

I use Android devices (Xiaomi Mi9-SE (Android 10) & Huawei P10 (Android 9)) to collect movement data from bicycle rides. Among other things, I am measuring the Rotation Vector, which describes the rotation of the smartphone relative to the earth's reference frame (Android documentation).
The sensor returns four values, which represent the rotation as a quaternion (w,x,y,z).
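(Aside, not from the original post: a minimal sketch of how such a quaternion can be read out on Android; SensorManager.getQuaternionFromVector() stores w first.)

import android.hardware.SensorEvent
import android.hardware.SensorManager

// Convert a TYPE_ROTATION_VECTOR event into the (w, x, y, z) quaternion
fun quaternionOf(event: SensorEvent): FloatArray {
    val q = FloatArray(4)
    SensorManager.getQuaternionFromVector(q, event.values)
    return q  // q[0] = w, q[1] = x, q[2] = y, q[3] = z
}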
In my experiment I recorded a bike ride with two smartphones, which were glued to each other in the same alignment. When I plot the values returned by the rotation sensors, they seem to be similar in some phases and dissimilar in others (cf. 1, 2).
However, when in a dissimilar phase (e.g. ~22:12:30h), it seems like the rotations are still somehow the same, but the axes are swapped. I plotted two corresponding rotations (cf. 3, 4). If the x and y axes were swapped and mirrored on the new y, the rotations would become similar. I wonder if this behavior is due to inaccurate/bad sensors or whether I missed something when working with quaternions.
It would be really great if someone could tell me if/where I am missing a point here. (One way to check is sketched below.)
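One way to test the mounting-offset hypothesis (my sketch, not part of the original post): compute the relative rotation q_rel = q1 * conj(q2) between the two phones at each timestamp. If the two sensors agreed up to a fixed mounting alignment, q_rel would stay roughly constant over the whole ride instead of changing between phases.

// Quaternions as FloatArray(4) in (w, x, y, z) order
fun conjugate(q: FloatArray) = floatArrayOf(q[0], -q[1], -q[2], -q[3])

// Hamilton product q * p
fun multiply(q: FloatArray, p: FloatArray) = floatArrayOf(
    q[0] * p[0] - q[1] * p[1] - q[2] * p[2] - q[3] * p[3],
    q[0] * p[1] + q[1] * p[0] + q[2] * p[3] - q[3] * p[2],
    q[0] * p[2] - q[1] * p[3] + q[2] * p[0] + q[3] * p[1],
    q[0] * p[3] + q[1] * p[2] - q[2] * p[1] + q[3] * p[0]
)

// Relative rotation between the two phones at one timestamp
fun relative(q1: FloatArray, q2: FloatArray) = multiply(q1, conjugate(q2))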

Implementing an accurate compass with Android [duplicate]

I'm using the Android gravity and magnetic field sensors to calculate orientation via SensorManager.getRotationMatrix and SensorManager.getOrientation. This gives me the azimuth, pitch, and roll numbers. The results look sensible when the device is lying flat on a table.
However, I've disabled switching between portrait and landscape in the manifest, so that getWindowManager().getDefaultDisplay().getRotation() is always zero. When I rotate the device by 90 degrees so that it's standing vertically, I run into trouble. Sometimes the numbers seem quite wrong, and I've realised that this relates to gimbal lock. However, other apps don't seem to have this problem. For example, I've compared my app against two free sensor test apps (Sensor Tester (Dicotomica) and Sensor Monitoring (R's Software)). My app agrees with these apps when the device is flat, but as I rotate the device into the vertical position there can be significant differences. The two apps seem to agree with each other, so how do they get around this problem?
When the device is not flat, you have to call remapCoordinateSystem(inR, AXIS_X, AXIS_Z, outR); before calling getOrientation.
The azimuth returned by getOrientation is obtained by orthogonally projecting the device's Y axis onto the world East-North plane and then calculating the angle between the resulting projection and the North axis.
Now, we normally think of direction as the direction in which the back camera is pointing. That is the direction of -Z, where Z is the device axis pointing out of the screen. When the device is flat we do not think of direction and accept whatever is given. But when it is not flat, we expect the direction of -Z. Yet getOrientation calculates the direction of the Y axis, so we need to swap the Y and Z axes before calling getOrientation. That is exactly what remapCoordinateSystem(inR, AXIS_X, AXIS_Z, outR) does: it keeps the X axis intact and maps Z to Y.
So how do you know when to remap and when not to? You can check:
// thresholds for deciding whether the device is flat
final float TWENTY_FIVE_DEGREE_IN_RADIAN = (float) Math.toRadians(25);
final float ONE_FIFTY_FIVE_DEGREE_IN_RADIAN = (float) Math.toRadians(155);

// rotationMatrix[8] is the cosine of the angle between the device's Z axis
// and the world's vertical axis
float inclination = (float) Math.acos(rotationMatrix[8]);
if (inclination < TWENTY_FIVE_DEGREE_IN_RADIAN
        || inclination > ONE_FIFTY_FIVE_DEGREE_IN_RADIAN) {
    // device is flat (or face down): just call getOrientation
} else {
    // device is tilted: call remapCoordinateSystem first
}
The inclination above is the angle between the device screen and the world East-North plane. It shows how much the device is tilting.
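Putting the pieces of this answer together, a minimal end-to-end sketch (in Kotlin; the 25°/155° thresholds are the ones from the snippet above):

import android.hardware.SensorManager
import kotlin.math.acos

// Build R from the gravity and geomagnetic readings, check the inclination,
// remap if the device is tilted, then read the orientation angles (radians).
fun computeOrientation(gravity: FloatArray, geomagnetic: FloatArray): FloatArray? {
    val r = FloatArray(9)
    if (!SensorManager.getRotationMatrix(r, null, gravity, geomagnetic)) return null
    val orientation = FloatArray(3)
    val inclination = acos(r[8])
    return if (inclination < Math.toRadians(25.0) || inclination > Math.toRadians(155.0)) {
        // device is (nearly) flat or face down: use R as-is
        SensorManager.getOrientation(r, orientation)
    } else {
        // device is tilted: swap Y and Z before reading the azimuth
        val remapped = FloatArray(9)
        SensorManager.remapCoordinateSystem(r,
            SensorManager.AXIS_X, SensorManager.AXIS_Z, remapped)
        SensorManager.getOrientation(remapped, orientation)
    }
}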
I think the best way of defining your orientation angles when the device isn't flat is to use a more appropriate angular co-ordinate system than the standard Euler angles that you get from SensorManager.getOrientation(...). I suggest the one that I describe here on math.stackexchange.com. I've also put some code that implements it in an answer here. Apart from a good definition of azimuth, it also has a definition of the pitch angle, which is exactly the angle given by Math.acos(rotationMatrix[8]) mentioned in another answer here.
You can get full details from the two links that I've given in the first paragraph. However, in summary, your rotation matrix R from SensorManager.getRotationMatrix(...) is

R = | Ex  Ey  Ez |
    | Nx  Ny  Nz |
    | Gx  Gy  Gz |

where (Ex, Ey, Ez), (Nx, Ny, Nz) and (Gx, Gy, Gz) are the vectors pointing due East, due North, and in the direction of gravity, each expressed in device coordinates. Then the azimuth angle that you want (the compass bearing of the -Z axis discussed above) is given by

azimuth = atan2(-Ez, -Nz)
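In code, with the row-major matrix from getRotationMatrix (so R[2] = Ez and R[5] = Nz), that is simply (my sketch, consistent with the reconstruction above):

import kotlin.math.atan2

// Bearing of the -Z (back camera) axis, in radians
fun cameraAzimuthRadians(r: FloatArray): Float = atan2(-r[2], -r[5])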
I'll add an example that worked for me, using Hoan's answer, and also highlight some new resources that are available now and may help when looking at apps that need to use the gyro sensors.
Firstly, there is a Google codelab available with a sample app - I found the completed sample app very useful for understanding a device's behaviour.
(The sample app displays the current azimuth, pitch, and roll values on screen.)
The codelab and the final app version are available at these links (at the time of writing):
https://developer.android.com/codelabs/advanced-android-training-sensor-orientation#0
https://github.com/google-developer-training/android-advanced/tree/master/TiltSpot
In particular, this allows you to experiment as follows (it's easier to do than to describe...):
Put your device down flat on a table and experiment with rotating it while flat, and with lifting the top and bottom (i.e. lift the top of the display, where your front camera typically is, off the table while leaving the bottom, where your home button typically is if you have one, on the table). Similarly, lift the left and right edges of the device. Observe the azimuth, pitch, and roll values in the app.
Put your device against a wardrobe door in portrait mode and again experiment with moving it around. Open the door to see the effect as well. Rotate to landscape to see the difference.
The key thing I think you may see from the above is that everything is very simple and works well when the device is flat, and everything is complex and interrelated when it is not.
Looking specifically at a use case where I had to display the pitch and roll when vertical in portrait mode, the following worked for me:
when (event?.sensor?.type) {
    Sensor.TYPE_ROTATION_VECTOR -> {
        // Calculate the rotation matrix from the rotation vector
        val rotMatrix = FloatArray(9)
        val rotationMatrixAdjusted = FloatArray(9)
        val orientation = FloatArray(3)
        SensorManager.getRotationMatrixFromVector(rotMatrix, event.values)
        // Remap for a device held vertically in portrait
        SensorManager.remapCoordinateSystem(rotMatrix,
            SensorManager.AXIS_Y, SensorManager.AXIS_MINUS_X,
            rotationMatrixAdjusted)
        SensorManager.getOrientation(rotationMatrixAdjusted, orientation)
        val rollAngleRadians = orientation[1]
        val pitchAngleRadians = orientation[2]
        // Report results back to the listener
        thisListener.onRollPitchEvent(rollAngleRadians, pitchAngleRadians)
    }
}

Android, remove gravity from the accelerometer using the compass

First of all, I assume the device is at rest at the first measurement, so the acceleration I have is just gravity; second, I won't use a low-pass filter.
Android gives me the linear acceleration and the compass values. My guess is that I can use the compass to rotate the acceleration into the earth reference frame and so remove gravity.
My guess is that if I calculate the difference between two compass measurements, I get a quantity that indicates how the cellphone has rotated, and so, if I add it to the initial gravity vector, I can rotate that too.
So at the i-th measurement, my guess is this:
# a[i] contains the 3-axis acceleration at time i
# b[i] contains the 3-axis compass values at time i
b[i] = numpy.sin(b[i] / (180. / math.pi))  # normalize the compass values from 0 to 1
# since b is a unit vector, "de-normalize" it to gravity's magnitude
b[i] = b[i] * sqrt(g**2.)
deltab = b[i] - b[i-1]
# at the very beginning of the code I had something like g = a[0]
g = g - deltab  # I've also tried with the plus sign
It doesn't work, but I can't see the problem. Any ideas?
EDIT: I'm also trying the following method, which again doesn't work, and I don't know why:
I've found how to compute the rotation matrix given the rotation axis and the angle.
Here is how to build this matrix: http://en.wikipedia.org/wiki/Rotation_matrix#Rotation_matrix_from_axis_and_angle
The angle I've chosen is the one given by the scalar product of the old and the new compass directions.
The axis they rotate about is the cross product between the old and the new vector (I guess, and this may be wrong).
So: if I get the rotation of the compass, then build the rotation matrix, and then apply it to my initial gravity vector, is that the correct way to rotate the gravity vector?
# a[i] contains the 3-axis acceleration at time i
# b[i] contains the 3-axis compass values at time i
omega = cross(b[i], b[i-1])
# the normalized dot product is cos(theta), so take the arccos
theta = arccos(dot(b[i], b[i-1]) / sqrt(dot(b[i], b[i]) * dot(b[i-1], b[i-1])))
M = rotationMatrix(omega, theta)
aWithoutGravity[i] = dot(M, a[i])
As you describe it, this problem cannot be solved. If you had two coordinate systems, you could describe one in terms of the other, but you only have two vectors (described within the coordinate system of the phone), and single vectors introduce ambiguities, since you really only know the angle between them.
For example, consider taking an initial measurement of the gravity and magnetic field vectors, and now rotate and spin your phone. You can again measure the magnetic field vector, but imagine you don't know the gravity vector: where will it be? All you know is its angle relative to the magnetic field vector (and you might also assume you know its magnitude), so the possibilities basically form a cone around the magnetic field vector (or a ring, if you assume the magnitude). But since you don't have exact information about it, you can't use it to completely determine the linear acceleration. That is, given a measured acceleration (physically, gravity + linear acceleration), every vector in the cone of possible gravity vectors would imply a different linear acceleration.
It does get you part of the way there: you now have a complicated geometry problem in which only a certain range of linear acceleration vectors will be consistent with the "gravity cone", but the problem becomes much more complicated than a simple subtraction, and there's no a priori reason to believe that this smaller subset of possible linear acceleration vectors tells you anything useful.
For example, consider that the original reading of the compass is (1,0,0), and the reading for acceleration is (6, 6, 0). If the phone is then just rotated about (1,0,0), other readings of acceleration are possible, such as (6, -6, 0), (6, 0, 6), (6, 0, -6), and many in between. So, because of this ambiguity, one can't tell based purely on the compass reading whether (6, 6, 0) changed into, say, (6, -6, 0) because of the acceleration or because the phone was rotated.
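A quick numerical check of that ambiguity (my sketch; the axis and the readings are the ones from the example above):

import kotlin.math.cos
import kotlin.math.sin

// Rotate v about the x axis (1, 0, 0) by the given angle in radians
fun rotateAboutX(v: FloatArray, angle: Float): FloatArray {
    val (x, y, z) = v
    return floatArrayOf(x, y * cos(angle) - z * sin(angle), y * sin(angle) + z * cos(angle))
}

fun main() {
    val a = floatArrayOf(6f, 6f, 0f)
    // The compass reading (1, 0, 0) is unchanged by all of these rotations,
    // yet the accelerometer reading is not:
    for (deg in listOf(90, 180, 270)) {
        val rotated = rotateAboutX(a, Math.toRadians(deg.toDouble()).toFloat())
        println("$deg deg -> ${rotated.toList()}")  // ~(6,0,6), (6,-6,0), (6,0,-6)
    }
}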
The compass values (i.e. the values from the Android sensor Sensor.TYPE_MAGNETIC_FIELD) have components in both the gravity direction and the direction of magnetic north. So you can use them, in conjunction with the gravity vector, to work out the direction of East, because that's given by the vector cross product of the two vectors (see the sketch below). In fact, depending on where you are in the world, the Sensor.TYPE_MAGNETIC_FIELD component in the gravity direction can be quite large.
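A sketch of that cross product (my code; gravity here is the upward-pointing vector as reported by TYPE_GRAVITY or a resting accelerometer):

import kotlin.math.sqrt

// East is perpendicular to both the magnetic field and gravity, so the
// normalized cross product of the two raw device-frame readings points East.
fun east(magnetic: FloatArray, gravity: FloatArray): FloatArray {
    val e = floatArrayOf(
        magnetic[1] * gravity[2] - magnetic[2] * gravity[1],
        magnetic[2] * gravity[0] - magnetic[0] * gravity[2],
        magnetic[0] * gravity[1] - magnetic[1] * gravity[0]
    )
    val norm = sqrt(e[0] * e[0] + e[1] * e[1] + e[2] * e[2])
    return floatArrayOf(e[0] / norm, e[1] / norm, e[2] / norm)
}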
Also note that Android has two sensors, Sensor.TYPE_GRAVITY and Sensor.TYPE_LINEAR_ACCELERATION, which are derived from Sensor.TYPE_ACCELEROMETER. And it's always true that the readings satisfy
Sensor.TYPE_ACCELEROMETER = Sensor.TYPE_GRAVITY + Sensor.TYPE_LINEAR_ACCELERATION
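That identity means any one of the three readings can be derived from the other two; for example, a sketch of computing linear acceleration without the virtual sensor (device coordinates, hypothetical names):

// linear = accelerometer - gravity, per the identity above
fun linearAcceleration(accelerometer: FloatArray, gravity: FloatArray) = floatArrayOf(
    accelerometer[0] - gravity[0],
    accelerometer[1] - gravity[1],
    accelerometer[2] - gravity[2]
)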

Android - Steering with orientation sensor

I am writing a simple 2d game similar to breakout or pong, where you move a bar along the bottom of the screen by tipping the phone left and right.
I am trying to use the orientation sensor to achieve this movement; however, I've run into some issues.
After printing out the orientation sensor's values for a while, I decided that rotation in the Z axis gave me what I wanted: if I hold the screen horizontal and perpendicular to the floor (x: 0, y: 0, z: 90), rotating left and right takes the Z value away from 90. So the value 90 - z is fine for use as an "amount of steering" value.
However, this perpendicular-to-the-floor position is not very natural to play in; people are much more likely to hold the phone at about 45 degrees in the Z axis, at which point all the values I have been using break down completely, the steering goes weird, and people are unhappy.
I guess what I really need is to detect rotation in some z-axis even though the XY plane the phone is in is constantly changing. Is there some clever way to do this with Maths?
EDIT: Just found the OrientationEventListener - Exactly what I needed!
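For completeness, a minimal sketch of that listener (my example, not the asker's code). It reports the device's rotation in degrees 0..359 around the axis out of the screen, regardless of how far back the phone is tilted:

import android.content.Context
import android.view.OrientationEventListener

class SteeringListener(context: Context) : OrientationEventListener(context) {
    override fun onOrientationChanged(orientation: Int) {
        if (orientation == ORIENTATION_UNKNOWN) return  // device is nearly flat
        // use `orientation` as the steering input
    }
}
// call enable() in onResume() and disable() in onPause()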

Compute relative orientation given azimuth, pitch, and roll in android?

When I listen for orientation events in an Android app, I get a SensorEvent which contains 3 floats - azimuth, pitch, and roll in relation to the real world's axes.
Now say I am building an app like Labyrinth, but I don't want to force the user to be over the phone and hold it such that the xy plane is parallel to the ground. Instead, I want to allow the user to hold the phone as they wish, lying down or, perhaps, sitting and holding the phone at an angle. In other words, I need to calibrate the phone in accordance with the user's preference.
How can I do that?
Also note that I believe that my answer has to do with getRotationMatrix and getOrientation, but I am not sure how!
Please help! I've been stuck at this for hours.
For a Labyrinth-style app, you probably care more about the acceleration (gravity) vector than about the axes' orientation. This vector, in the phone's coordinate system, is given by the combination of the three accelerometer measurements, rather than by the rotation angles. Specifically, only the x and y readings should affect the ball's motion.
If you do actually need the orientation, then the 3 angular readings represent the 3 Euler angles. However, I suspect you probably don't really need the angles themselves, but rather the rotation matrix R, which is returned by the getRotationMatrix() API. Once you have this matrix, it is basically the calibration that you are looking for. When you want to transform a vector in world coordinates to your device coordinates, you should multiply it by the inverse of this matrix (where, in this special case, inv(R) = transpose(R)).
So, following the example I found in the documentation, if you want to transform the world gravity vector g ([0 0 g]) to device coordinates, multiply it by inv(R):
g_device = inv(R) * g_world
(note that this should give you the same result as reading the accelerometers)
Possible APIs to use here: the invertM() and multiplyMV() methods of the android.opengl.Matrix class.
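A hedged sketch of that transformation (my code). One caveat: android.opengl.Matrix works on column-major 4x4 arrays while getRotationMatrix fills its array row-major, so a naive invertM()/multiplyMV() combination can pick up an unwanted transpose; since R is orthonormal and inv(R) = transpose(R), this sketch just multiplies by the transpose by hand:

import android.hardware.SensorManager

// Transform the world gravity vector (0, 0, g) into device coordinates.
fun worldGravityInDeviceCoords(gravity: FloatArray, geomagnetic: FloatArray): FloatArray? {
    val r = FloatArray(9)
    if (!SensorManager.getRotationMatrix(r, null, gravity, geomagnetic)) return null
    val g = floatArrayOf(0f, 0f, SensorManager.GRAVITY_EARTH)
    // g_device = transpose(R) * g_world; the columns of R are the device
    // axes expressed in world coordinates. The result should roughly match
    // the raw accelerometer reading.
    return floatArrayOf(
        r[0] * g[0] + r[3] * g[1] + r[6] * g[2],
        r[1] * g[0] + r[4] * g[1] + r[7] * g[2],
        r[2] * g[0] + r[5] * g[1] + r[8] * g[2]
    )
}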
I don't know of any Android-specific APIs, but all you want to do is decrease the azimuth by a certain amount, right? So you move the "origin" from (0,0,0) to whatever they want. In pseudocode:
myGetRotationMatrix:
    return getRotationMatrix() - origin
