Yaw, pitch, roll of the -Z axis from the Android sensors

The yaw, pitch and roll we get from Android's SensorManager.getOrientation() are all relative to the Y axis of the phone. By this I mean the yaw and pitch say where the Y axis points, and the roll is about the Y axis (and my screen orientation is fixed to landscape, so the Y axis doesn't change). But what I want is the yaw, pitch and roll of the negative Z axis (which points into the phone): if my phone screen were a landscape window in the cockpit of a plane, what would the yaw, pitch and roll be?

If I understand what you're saying, you are seeing the current yaw, pitch and roll being returned as if the +y axis were the default 'front' vector and the +z axis were the default 'up' vector. You would like to do a coordinate transform such that your yaw, pitch, and roll are calculated with the -z axis as the default 'front' vector and the +x axis as the default 'up' vector.
First, you'll need to compute the current front and up vectors from your yaw, pitch and roll in the current configuration. This can be done using 3D rotation matrices:
http://en.wikipedia.org/wiki/Rotation_matrix#Rotations_in_three_dimensions
Use the yaw angle to rotate about z, pitch as a rotation about x, and roll as a rotation about y. Multiply these matrices together, then multiply the front vector (0, 1, 0) and up vector (0, 0, 1) by the result to get your new front and up vectors.
Next you'll need to compute the new yaw, pitch, and roll angles. Yaw is the angle between the front vector projected into the yz plane (set the x value to 0) and the -z vector (0, 0, -1); pitch is the angle between the front vector projected onto the xz plane and the -z vector; and roll is the angle between the up vector projected onto the xy plane and the +x vector (1, 0, 0). If we let Fx be the x component of the front vector, Fy the y component, and so on, we get:
yaw = acos( ((0, 0, -1) dot (0, Fy, Fz)) / sqrt(Fy*Fy + Fz*Fz) )
pitch = acos( ((0, 0, -1) dot (Fx, 0, Fz)) / sqrt(Fx*Fx + Fz*Fz) )
roll = acos( ((1, 0, 0) dot (Ux, Uy, 0)) / sqrt(Ux*Ux + Uy*Uy) )
You should be able to do this for other vectors if I've chosen the wrong ones.
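The steps above can be sketched as a self-contained example. Assumptions: yaw is applied about z, pitch about x, roll about y, composed as R = Rz(yaw) * Rx(pitch) * Ry(roll) — adjust the order to whatever your source actually uses. Note the new-pitch formula degenerates (0/0) when the front vector lies along the y axis.

```java
// Sketch of the coordinate transform described above.
public class AxisRemap {
    static double[][] rotZ(double a) {
        return new double[][] { { Math.cos(a), -Math.sin(a), 0 },
                                { Math.sin(a),  Math.cos(a), 0 },
                                { 0, 0, 1 } };
    }
    static double[][] rotX(double a) {
        return new double[][] { { 1, 0, 0 },
                                { 0, Math.cos(a), -Math.sin(a) },
                                { 0, Math.sin(a),  Math.cos(a) } };
    }
    static double[][] rotY(double a) {
        return new double[][] { {  Math.cos(a), 0, Math.sin(a) },
                                { 0, 1, 0 },
                                { -Math.sin(a), 0, Math.cos(a) } };
    }
    static double[][] mulM(double[][] a, double[][] b) {
        double[][] r = new double[3][3];
        for (int i = 0; i < 3; i++)
            for (int j = 0; j < 3; j++)
                for (int k = 0; k < 3; k++)
                    r[i][j] += a[i][k] * b[k][j];
        return r;
    }
    static double[] mulV(double[][] m, double[] v) {
        return new double[] { m[0][0]*v[0] + m[0][1]*v[1] + m[0][2]*v[2],
                              m[1][0]*v[0] + m[1][1]*v[1] + m[1][2]*v[2],
                              m[2][0]*v[0] + m[2][1]*v[1] + m[2][2]*v[2] };
    }
    // Returns { newYaw, newPitch, newRoll } for the -z front / +x up frame.
    static double[] remappedAngles(double yaw, double pitch, double roll) {
        double[][] r = mulM(rotZ(yaw), mulM(rotX(pitch), rotY(roll)));
        double[] f = mulV(r, new double[] { 0, 1, 0 }); // old front (+y)
        double[] u = mulV(r, new double[] { 0, 0, 1 }); // old up (+z)
        return new double[] {
            Math.acos(-f[2] / Math.hypot(f[1], f[2])),  // vs -z in yz plane
            Math.acos(-f[2] / Math.hypot(f[0], f[2])),  // vs -z in xz plane
            Math.acos( u[0] / Math.hypot(u[0], u[1]))   // vs +x in xy plane
        };
    }
}
```

For example, pitching the phone forward by 90° (so the old +y front now points along -z) should give zero new yaw and pitch, with the old +z up landing 90° away from +x.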

The body-axis conventions used in Android are not the same as those used in aerospace (for good reasons). The x axis of a plane is equivalent to the y axis of an Android device, and likewise the y axis of a plane is related to the x axis of the device, positive or negative depending on ENU or NED conventions. A good article on yaw, pitch and roll conventions can be found here, and may be a start toward answering your question: http://www.sensorplatforms.com/understanding-orientation-conventions-mobile-platforms/

SensorManager.getOrientation returns values that are not in angles

I am trying to implement a business-logic layer that tells me at what degree, from 180 to -180, the device currently sits on the x axis. It will be used for a camera screen that requires the user to hold the phone vertically.
So in order to do that, I listen to both the TYPE_ACCELEROMETER and the TYPE_MAGNETIC_FIELD sensor types, as suggested in the official docs -
https://developer.android.com/guide/topics/sensors/sensors_position#sensors-pos-prox
And I ended up with the following code -
override fun onSensorChanged(event: SensorEvent?) {
    val value = event?.values ?: return
    var accelerometerReading = floatArrayOf()
    var magnetometerReading = floatArrayOf()
    when (event.sensor.type) {
        TYPE_ACCELEROMETER -> {
            accelerometerReading = floatArrayOf(value[0], value[1], value[2])
        }
        TYPE_MAGNETIC_FIELD -> {
            magnetometerReading = floatArrayOf(value[0], value[1], value[2])
        }
    }
    val rotationMatrix = FloatArray(9)
    if (magnetometerReading.isEmpty() || accelerometerReading.isEmpty()) return
    SensorManager.getRotationMatrix(rotationMatrix, FloatArray(9), accelerometerReading, magnetometerReading)
    val orientationAngles = FloatArray(3)
    SensorManager.getOrientation(rotationMatrix, orientationAngles) // always returns the same values that are provided to it. why?
}
As you can see, I implemented the exact same code as the official docs say to, but the values I get are nowhere near the range of 180 to -180 in any of the three elements of the orientationAngles array. I get values that are very similar to the input I give it, something like [-0.051408034, -0.007878973, 0.04735359], which is some random irrelevant data for me.
Any idea why this would happen, and how to indeed get what I want, which is the x-axis angle of the device?
Edit:
I'll try to simplify what I want.
Imagine holding a device in portrait mode, locked, with it facing you. In a perfect portrait stance I want to get a 90-degree value from the sensor. When the user tilts the device either left or right, the values should either go down to 0 or up to 180 (which side it is doesn't matter). All I need is these two-dimensional x-axis values.
It gives the angles in radians, not degrees. Almost nothing in math uses degrees beyond grade-school math; radians are the norm. Radians generally go from 0 to 2π, equaling 0 to 360 degrees. The formula to convert is degrees = radians / π * 180.
According to the docs, the angles returned by getOrientation are in radians in the range -π to π, with 0 defined individually for each angle. From the docs:
values[0]: Azimuth, angle of rotation about the -z axis. This value represents the angle between the device's y axis and the magnetic north pole. When facing north, this angle is 0, when facing south, this angle is π. Likewise, when facing east, this angle is π/2, and when facing west, this angle is -π/2. The range of values is -π to π.
values[1]: Pitch, angle of rotation about the x axis. This value represents the angle between a plane parallel to the device's screen and a plane parallel to the ground. Assuming that the bottom edge of the device faces the user and that the screen is face-up, tilting the top edge of the device toward the ground creates a positive pitch angle. The range of values is -π/2 to π/2.
values[2]: Roll, angle of rotation about the y axis. This value represents the angle between a plane perpendicular to the device's screen and a plane perpendicular to the ground. Assuming that the bottom edge of the device faces the user and that the screen is face-up, tilting the left edge of the device toward the ground creates a positive roll angle. The range of values is -π to π.
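Putting the two fixes together: convert getOrientation()'s radians to degrees, and compute the 0..180 portrait-tilt from the question's edit directly from the accelerometer. A sketch — screenTiltDegrees is a hypothetical helper, and it assumes the device is held roughly vertical and is not otherwise accelerating:

```java
// Degrees conversion plus the portrait-tilt described in the question's edit.
public class TiltDemo {
    // getOrientation() fills orientationAngles with radians in [-pi, pi].
    static double toDegrees(float radians) {
        return Math.toDegrees(radians);
    }
    // Project gravity onto the screen (x, y) plane and measure it against
    // the device x axis: upright portrait -> 90, right edge down -> 0,
    // left edge down -> 180, matching the behaviour asked for in the edit.
    static double screenTiltDegrees(float ax, float ay) {
        return Math.toDegrees(Math.atan2(ay, ax));
    }
}
```

With the device upright in portrait, gravity reads roughly (0, 9.81, 0) on the accelerometer, giving 90°; rotating toward landscape moves the value toward 0° or 180°.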

Android device orientation, pitch calculation not consistent

I am developing an Android application which requires the device inclination for real-time processing. The device is inclined on a surface.
I wanted to calculate the angle, and for this I used a project on GitHub to calculate the pitch value. But the pitch values returned by this method are not accurate over multiple tests; there is some margin of error in the pitch value most of the time.
And the same program tested on another phone shows a different pitch value in the same position (laying the phones on a table).
Is there any way I can get accurate pitch values across multiple devices?
I used S6 and OnePlus 2 devices.
The rotation matrix is defined by applying roll first, then pitch, and finally the yaw rotation. You can get the phone into the same position if you apply pitch first, then roll, and again finally yaw. This is why you expect a certain pitch but get unexpected values.
To prove this to yourself, play with the phone: bring it into some random position by applying angle rotations in a certain order, and then try to get to the same position by a different order of rotations (a good position to try is the phone vertical, as if holding it in front of your face, tilted a bit to the side).
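The order-dependence is easy to demonstrate numerically. Here pitch is modeled as a rotation about x and roll as a rotation about y (the Android convention); the same two 90° rotations applied in opposite orders send the phone's y axis to two different places:

```java
// Demonstration that rotation order matters.
public class RotationOrder {
    static double[] rotX(double[] v, double a) { // pitch
        return new double[] { v[0],
                              Math.cos(a) * v[1] - Math.sin(a) * v[2],
                              Math.sin(a) * v[1] + Math.cos(a) * v[2] };
    }
    static double[] rotY(double[] v, double a) { // roll
        return new double[] { Math.cos(a) * v[0] + Math.sin(a) * v[2],
                              v[1],
                             -Math.sin(a) * v[0] + Math.cos(a) * v[2] };
    }
}
```

Pitching 90° then rolling 90° lands the y axis on +x, while rolling first and then pitching lands it on +z — the same angles, two different final positions.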
Most of the time you would use code like this:
int rotation = ((WindowManager) getApplicationContext().getSystemService(Context.WINDOW_SERVICE)).getDefaultDisplay().getRotation();
if(rotation == 0) // Default display rotation is portrait
SensorManager.remapCoordinateSystem(Rmat, SensorManager.AXIS_MINUS_X, SensorManager.AXIS_Y, R2);
else // Default display rotation is landscape
SensorManager.remapCoordinateSystem(Rmat, SensorManager.AXIS_Y, SensorManager.AXIS_MINUS_X, R2);
to make it more intuitive. This is how you would virtually change the order: you still have the rotation matrix defined by applying roll first, then pitch, and finally the yaw rotation, but these rotations are now defined against a new XYZ coordinate system.
public abstract void onSensorChanged (int sensor, float[] values)
Added in API level 1
Called when sensor values have changed. The length and contents of the values array vary depending on which sensor is being monitored. See SensorManager for details on possible sensor types.
Definition of the coordinate system used below.
The X axis refers to the screen's horizontal axis (the small edge in portrait mode, the long edge in landscape mode) and points to the right.
The Y axis refers to the screen's vertical axis and points towards the top of the screen (the origin is in the lower-left corner).
The Z axis points toward the sky when the device is lying on its back on a table.
IMPORTANT NOTE: The axis are swapped when the device's screen orientation changes. To access the unswapped values, use indices 3, 4 and 5 in values[].
SENSOR_ORIENTATION, SENSOR_ORIENTATION_RAW:
All values are angles in degrees.
values[0]: Azimuth, rotation around the Z axis (0<=azimuth<360). 0 = North, 90 = East, 180 = South, 270 = West
values[1]: Pitch, rotation around X axis (-180<=pitch<=180), with positive values when the z-axis moves toward the y-axis.
values[2]: Roll, rotation around Y axis (-90<=roll<=90), with positive values when the z-axis moves toward the x-axis.
Note that this definition of yaw, pitch and roll is different from the traditional definition used in aviation where the X axis is along the long side of the plane (tail to nose).
And the difference between phones is expected.

How can I get horizontal rotation angle whatever device orientation?

I can get the horizontal rotation angle by calculating the roll value (according to the definition of iOS Device Motion) when the device is in portrait.
The x, y, z-axis of the mobile device:
But when the device is in landscape, the y axis is horizontal and the x axis is vertical. How can I get the angle? The pitch value is not correct. I have tried to exchange x and y in the quaternion, but it did not work.
Furthermore, how can I get the angle when the device is in between portrait and landscape, for example when you tilt the device 30 degrees about the z axis?
Is there a unified quaternion or rotation matrix to calculate the angle whatever the device orientation?
Probably you want to get the angle of some device "direction". For example, if you want the angle of the Z axis projected onto the ground, do exactly this:
Transform the Z (0, 0, 1) vector into the current device rotation.
You now have the direction of the "ZinWorld" vector in some world frame.
According to the world frame's orientation, project "ZinWorld" onto the ground plane. For example, if in the default world frame Z points up, then just take the X and Y components of "ZinWorld".
Take atan2(X, Y) to get the proper angle.
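The steps above can be sketched directly, assuming R is the row-major 3x3 matrix filled in by SensorManager.getRotationMatrix() (device to world, where the world frame is ENU with Z up). Column 2 of R is the device Z axis expressed in world coordinates, and its X (east) and Y (north) components are the ground projection:

```java
// Heading of the device Z axis projected onto the ground plane.
public class HeadingFromZ {
    static double zHeadingDegrees(float[] r) {
        double east  = r[2]; // ZinWorld.x
        double north = r[5]; // ZinWorld.y
        return Math.toDegrees(Math.atan2(east, north));
    }
}
```

For a device held on its side so the screen (device z) faces east, the third column of R is (1, 0, 0) and the heading comes out as 90°.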

Conversion from phone axes to world coordinate axes

I am working on a project where I want to convert my Android phone's axes onto the world coordinate axes. I have the acceleration along the phone's 3 axes (x, y, z) and the three angles, being the azimuth, pitch and roll. How can I proceed in virtually converting my phone's axes to the world coordinate axes (true north, true east)?
What I am doing so far is giving me wrong results.
This is my approach.
Let's say
-0.030029837 -0.008528218 -0.199289320 is the acceleration along the phone's X, Y, Z axes, and
0.01618620 0.48581530 0.19617330 are the three angles (in radians), being the azimuth, pitch and roll.
To get the acceleration along "true north" I am simply taking the acceleration along the Y axis and multiplying it by the cos of all three angles, i.e.
-0.008528218*cos(0.19617330)*cos(0.48581530)*cos(0.01618620) = -0.00739584
To get the acceleration along "true east" I am simply taking the acceleration along the X axis and multiplying it by the cos of all three angles, i.e.
-0.030029837*cos(0.19617330)*cos(0.48581530)*cos(0.01618620) = -0.026042471
To get the acceleration along the vertical upwards direction I am simply taking the acceleration along the Z axis and multiplying it by the cos of two of the angles, i.e.
-0.199289320*cos(0.19617330)*cos(0.48581530) = -0.172850298
Now, to check whether I have done this correctly, I did a consistency test.
The magnitude of the acceleration along the phone's X, Y, Z axes should be equal to the magnitude of the acceleration along true east, true north and vertically upwards.
But these are coming out unequal:
0.20 is not equal to 0.17.
Where am I going wrong? It would be of great help if someone out there could help me out.
Thanks a lot in advance.
I am assuming that your azimuth, pitch and roll angles represent the angles by which the phone's axes deviate with respect to the world axes. If my assumption is right, then the following lines of code will fulfill your needs.
float yaw = (float) Math.toDegrees(0.01618620);
float pitch = (float) Math.toDegrees(0.48581530);
float roll = (float) Math.toDegrees(0.19617330);
float acc[] = { -0.030029837f, -0.008528218f, -0.199289320f, 1 };
float[] matrix = new float[16];
Matrix.setIdentityM(matrix, 0);
Matrix.rotateM(matrix, 0, yaw, 0, 0, 1);
Matrix.rotateM(matrix, 0, pitch, 0, 1, 0);
Matrix.rotateM(matrix, 0, roll, 1, 0, 0);
float accT[] = new float[4];
Matrix.multiplyMV(accT, 0, matrix, 0, acc, 0);
float rawMagnitude = Matrix.length(acc[0], acc[1], acc[2]);
System.out.println(rawMagnitude);
float transformedMagnitude = Matrix.length(accT[0], accT[1], accT[2]);
System.out.println(transformedMagnitude);
The console printed magnitudes are
0.2017195
0.20171948
I used android.opengl.Matrix. The static methods used here are self-explanatory; for more details, see the API documentation.
If my assumption is exactly opposite, that is, if your azimuth, pitch and roll angles represent the angles by which the world axes deviate with respect to the phone's axes, then just negate your azimuth, pitch and roll angles.
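As a cross-check without the Android dependency, the same Z-Y-X sequence as the rotateM() calls above can be built from plain arrays. A rotation never changes a vector's length, which is exactly the consistency test from the question, and why scaling each axis by cosines cannot be right:

```java
// Plain-array version of the Z-Y-X rotation sequence used above.
public class WorldFrame {
    static double[][] rot(double a, int axis) {
        double c = Math.cos(a), s = Math.sin(a);
        switch (axis) {
            case 0:  return new double[][] { {1,0,0}, {0,c,-s}, {0,s,c} }; // x
            case 1:  return new double[][] { {c,0,s}, {0,1,0}, {-s,0,c} }; // y
            default: return new double[][] { {c,-s,0}, {s,c,0}, {0,0,1} }; // z
        }
    }
    static double[][] mulM(double[][] a, double[][] b) {
        double[][] r = new double[3][3];
        for (int i = 0; i < 3; i++)
            for (int j = 0; j < 3; j++)
                for (int k = 0; k < 3; k++)
                    r[i][j] += a[i][k] * b[k][j];
        return r;
    }
    static double[] mulV(double[][] m, double[] v) {
        return new double[] { m[0][0]*v[0] + m[0][1]*v[1] + m[0][2]*v[2],
                              m[1][0]*v[0] + m[1][1]*v[1] + m[1][2]*v[2],
                              m[2][0]*v[0] + m[2][1]*v[1] + m[2][2]*v[2] };
    }
    // Same order as rotateM: yaw about z, then pitch about y, then roll about x.
    static double[] toWorld(double yaw, double pitch, double roll, double[] acc) {
        return mulV(mulM(rot(yaw, 2), mulM(rot(pitch, 1), rot(roll, 0))), acc);
    }
    static double norm(double[] v) {
        return Math.sqrt(v[0]*v[0] + v[1]*v[1] + v[2]*v[2]);
    }
}
```

Running the question's sample values through this transform reproduces the magnitude-preservation shown by the console output above.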

Sensor value interpretation

I am currently trying to understand the sensor values I get from code similar to this.
The yaw/azimuth value seems to be okay. The problem is the pitch value, because I get -90° when the device is upright, and tilting back and forward leads to the same values.
Let's say I tilt by 45° forward: the value is -45°, so it's the same as tilting the device 45° backward.
Like this, I cannot determine the device pitch over 360°.
Can somebody help me with that?
Taken from http://developer.android.com/reference/android/hardware/SensorListener.html:
All values are angles in degrees.
values[0]: Azimuth, rotation around the Z axis (0<=azimuth<360). 0 = North, 90 = East, 180 = South, 270 = West
values[1]: Pitch, rotation around X axis (-180<=pitch<=180), with positive values when the z-axis moves toward the y-axis.
values[2]: Roll, rotation around Y axis (-90<=roll<=90), with positive values when the z-axis moves toward the x-axis.
Note that this definition of yaw, pitch and roll is different from the traditional definition used in aviation where the X axis is along the long side of the plane (tail to nose).
So pitch is -180° to 180° instead of 0° to 360°. The difference is that forward shows -45° and backward should show 45°, right?
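One common workaround for the fold: a pitch derived from an asin cannot tell forward from backward tilts, but deriving it yourself from the gravity vector with atan2 keeps the full range. A sketch; it assumes the accelerometer reading is pure gravity (device held still), and uses 0 = flat on its back, 90 = upright, ±180 = face down:

```java
// Full-range pitch from the gravity vector.
public class FullPitch {
    // Forward and backward tilts now land on different values.
    static double pitchDegrees(float ay, float az) {
        return Math.toDegrees(Math.atan2(ay, az));
    }
}
```

With this convention, tilting 45° back from upright gives 45°, while tilting 45° forward gives 135° — the two cases are no longer ambiguous.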
