I am using the Rotation Vector sensor to try to track how many degrees of rotation (0-360) around the X axis (i.e. wrist movement) the user moves.
I am using SensorManager's getOrientation to get the yaw, pitch and roll this way:
float[] rotationMatrix = new float[9];
SensorManager.getRotationMatrixFromVector(rotationMatrix, sensorData.getValues());
float[] orientation = new float[3];
SensorManager.getOrientation(rotationMatrix, orientation);
With the watch face pointing up it gives 0 degrees like I want, but when I rotate I only get values within +/- 90 degrees: rotating left gives me +90 degrees and rotating right gives me -90 degrees. If I continue rotating, say from +90 degrees, I start getting negative degrees, so when the watch is face down (180 degrees) it shows 0 again. Being limited to 90 degree increments is going to make it difficult to accurately get the actual rotation angle.
Is there a way to go from +/- 90 degree increments to 0-360 degrees?
You can use the z rotation value to find out whether the watch is face up or down. This determines the perspective you are viewing from and gives you 4 quadrants, instead of the 2 that you started with.
If z is positive: watch is face up. If z is negative, it is face down.
From here, you can create a coordinate system for conversion.
If the watch is facing up, the coordinate is 90 + x_rotation. Since rotating right gives [-90, 0) and rotating left yields [0, 90], this gives degrees of [-90+90, 90+90] = [0, 180], which is what we would expect of a watch facing up on the wrist.
If the watch is facing down, the coordinate is 270 + x_rotation. Quadrant 3 gives negative values down to -90, making the values range over [270-90, 270] = [180, 270]. Likewise, quadrant 4 has positive values up to 90, creating a range of [270, 270+90]. So a watch facing down gives [180, 360].
Combined, we get the complete [0,360] device rotation range.
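The mapping above can be sketched in a few lines. This is only an illustration of the described quadrant logic (the class and method names are mine); xDegrees is the +/-90 pitch reading converted to degrees, and z is the z rotation value used to decide face up vs. face down:

```java
public class WristRotation {
    // Converts a +/-90 degree pitch reading into a 0-360 wrist angle,
    // using the sign of the z rotation to pick the half of the circle.
    public static float toFullCircle(float xDegrees, float z) {
        if (z >= 0) {
            // Face up: [-90, 90] maps to [0, 180]
            return 90f + xDegrees;
        } else {
            // Face down: [-90, 90] maps to [180, 360]
            return 270f + xDegrees;
        }
    }
}
```

For example, face up with a 0 degree pitch reading yields 90, matching the watch flat on the wrist in the middle of the face-up range.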
Hope this helps!
I am trying to implement a business logic layer that tells me at what degree, from 180 to -180, the device currently sits on the x axis. It will be used for a camera screen that requires the user to hold the phone vertically.
So in order to do so, I listen to both the TYPE_ACCELEROMETER and the TYPE_MAGNETIC_FIELD sensor types, as suggested in the official docs -
https://developer.android.com/guide/topics/sensors/sensors_position#sensors-pos-prox
And I ended up with the following code -
override fun onSensorChanged(event: SensorEvent?) {
    val value = event?.values ?: return

    var accelerometerReading = floatArrayOf()
    var magnetometerReading = floatArrayOf()
    when (event.sensor.type) {
        TYPE_ACCELEROMETER -> {
            accelerometerReading = floatArrayOf(value[0], value[1], value[2])
        }
        TYPE_MAGNETIC_FIELD -> {
            magnetometerReading = floatArrayOf(value[0], value[1], value[2])
        }
    }

    val rotationMatrix = FloatArray(9)
    if (magnetometerReading.isEmpty() || accelerometerReading.isEmpty()) return
    SensorManager.getRotationMatrix(rotationMatrix, FloatArray(9), accelerometerReading, magnetometerReading)

    val orientationAngles = FloatArray(3)
    SensorManager.getOrientation(rotationMatrix, orientationAngles) // always returns the same values that are provided to it. why?
}
As you can see, I implemented the exact same code as the official docs say to, but the values I get are nowhere near the range of 180 to -180 in any of the three elements of the orientationAngles array. I get values that are very similar to the input I give it, something like [-0.051408034, -0.007878973, 0.04735359], which is some random irrelevant data to me.
Any idea why would this happen and how to indeed get what I want which is the x axis angle of the device?
Edit:
I'll try to simplify what I want.
Imagine holding a device in portrait mode, orientation locked, with it facing you. In a perfect portrait stance I want to get a 90 degree value from the sensor. When the user tilts the device either left or right, the values would either go down to 0 or up to 180 (which side is which doesn't matter). All I need are these two-dimensional x-axis values.
It gives the angles in radians, not degrees. Almost nothing in math uses degrees beyond grade-school math; radians are the norm. Radians generally go from 0 to 2π, equaling 0 to 360 degrees. The formula to convert is degrees = radians / π * 180.
According to the docs, getOrientation returns radians in the range -π to π, where 0 is defined per angle. From the docs:
values[0]: Azimuth, angle of rotation about the -z axis. This value represents the angle between the device's y axis and the magnetic north pole. When facing north, this angle is 0, when facing south, this angle is π. Likewise, when facing east, this angle is π/2, and when facing west, this angle is -π/2. The range of values is -π to π.
values[1]: Pitch, angle of rotation about the x axis. This value represents the angle between a plane parallel to the device's screen and a plane parallel to the ground. Assuming that the bottom edge of the device faces the user and that the screen is face-up, tilting the top edge of the device toward the ground creates a positive pitch angle. The range of values is -π/2 to π/2.
values[2]: Roll, angle of rotation about the y axis. This value represents the angle between a plane perpendicular to the device's screen and a plane perpendicular to the ground. Assuming that the bottom edge of the device faces the user and that the screen is face-up, tilting the left edge of the device toward the ground creates a positive roll angle. The range of values is -π to π.
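So the [-0.05, -0.008, 0.05] values above are small angles in radians, not garbage. A minimal sketch of the conversion (the class name is mine; Java's built-in Math.toDegrees does the radians / π * 180 step):

```java
public class OrientationDegrees {
    // Converts the three radian angles from getOrientation() to degrees.
    public static double[] toDegrees(float[] orientationRadians) {
        double[] deg = new double[3];
        for (int i = 0; i < 3; i++) {
            deg[i] = Math.toDegrees(orientationRadians[i]);
        }
        return deg;
    }
}
```

A pitch reading of -0.0514 rad, for instance, is only about -2.9 degrees, i.e. a nearly flat device.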
I am developing an Android application which requires the device inclination for real-time processing. The device is inclined on a surface.
I wanted to calculate the angle, and for this I used a project from GitHub to calculate the pitch value. But the pitch values returned by this method are not accurate over multiple tests; there is some margin of error in the pitch value most of the time.
The same program tested on another phone shows a different pitch value in the same position (laying the phones on the table).
Is there any way I can get accurate pitch values across multiple devices?
I used S6 and OnePlus 2 devices.
The rotation matrix is defined by applying roll first, then pitch, and finally the yaw rotation. You can get the phone into the same position if you apply pitch first, then roll, and again finally yaw. This is why you expect a certain pitch but get inaccurate values.
To prove this to yourself, play with the phone: bring it into some random position by applying angle rotations in a certain order, and then try to reach the same position by a different order of rotations (a good position to try is the phone vertical, as if held in front of your face, tilted a bit to the side).
Most of the time you would use code like this:
int rotation = ((WindowManager) getApplicationContext().getSystemService(Context.WINDOW_SERVICE)).getDefaultDisplay().getRotation();
if (rotation == 0) // Default display rotation is portrait
    SensorManager.remapCoordinateSystem(Rmat, SensorManager.AXIS_MINUS_X, SensorManager.AXIS_Y, R2);
else // Default display rotation is landscape
    SensorManager.remapCoordinateSystem(Rmat, SensorManager.AXIS_Y, SensorManager.AXIS_MINUS_X, R2);
to make it more intuitive. This is how you would virtually change the order: you still have the rotation matrix defined by applying roll first, then pitch, and finally yaw, but these rotations are now defined against a new XYZ coordinate system.
public abstract void onSensorChanged (int sensor, float[] values)
Added in API level 1
Called when sensor values have changed. The length and contents of the values array vary depending on which sensor is being monitored. See SensorManager for details on possible sensor types.
Definition of the coordinate system used below.
The X axis refers to the screen's horizontal axis (the small edge in portrait mode, the long edge in landscape mode) and points to the right.
The Y axis refers to the screen's vertical axis and points towards the top of the screen (the origin is in the lower-left corner).
The Z axis points toward the sky when the device is lying on its back on a table.
IMPORTANT NOTE: The axes are swapped when the device's screen orientation changes. To access the unswapped values, use indices 3, 4 and 5 in values[].
SENSOR_ORIENTATION, SENSOR_ORIENTATION_RAW:
All values are angles in degrees.
values[0]: Azimuth, rotation around the Z axis (0<=azimuth<360). 0 = North, 90 = East, 180 = South, 270 = West
values[1]: Pitch, rotation around X axis (-180<=pitch<=180), with positive values when the z-axis moves toward the y-axis.
values[2]: Roll, rotation around Y axis (-90<=roll<=90), with positive values when the z-axis moves toward the x-axis.
Note that this definition of yaw, pitch and roll is different from the traditional definition used in aviation where the X axis is along the long side of the plane (tail to nose).
And the difference between phones is expected.
I'm trying to figure out how to calculate the inclination angle of head movement: just up-down movement, not moving to the left or right side. I have a wearable device (I guess the problem would be the same for mobile devices) with an accelerometer sensor, and it is placed on my forehead.
So far, I think it is better to leave the linear acceleration out and just work with the gravity acceleration. I'm reading and reading, trying to understand better what exactly I'm looking for, but my brain is not maths material. In a starting position of standing still (a stationary position), all the gravity acceleration falls onto one axis, let's say X. When I start producing movement with my head, for instance moving my head down (like looking at the ground), the gravity acceleration no longer falls onto a single axis but is shared between two (for instance between X and Z). If I'm right and this is the right approach (is it?)...
How can I calculate this angle without doing crazy maths?
Is this problem exactly the same as calculating the pitch angle of an accelerometer?
Can I use raw data for this, or do I need calibrated data?
To make it clear, let's imagine that my device axes are placed in the same position as in this photo, and that in the starting position the gravity falls onto the X axis (landscape position).
EDIT
Using the following formulas for pitch and roll (thanks to #Nuclearman's link):
Roll = atan2(Y, Z) * 180/M_PI;
Pitch = atan2(X, sqrt(Y*Y + Z*Z)) * 180/M_PI
And this code I wrote:
/**
 * Function to calculate the inclination angle or pitch of the head movement
 */
float calcInclinationAngle(int iDataPos){
    float xVal = fAccel[0][iDataPos];
    float yVal = fAccel[1][iDataPos];
    float zVal = fAccel[2][iDataPos];
    float pitch = 0;
    //Pitch in rad format
    pitch = atan2(xVal, sqrt(pow(yVal,2)+pow(zVal,2)));
    //Pitch in degrees
    pitch = pitch * 180/PI;
    return pitch;
}
I always get wrong angles. When the gravity force is entirely on the X axis, I get around 45 degrees (instead of 0), and if I rotate the device 180 degrees (just flipping the sign of the gravity force), I get around 17-18 degrees. I have been playing with the atan2 parameters, but the spread of the angles is always the same (25-35 degrees). Just throwing out a question... should I be working with calibrated data instead of raw data?
Edit 2
I have made some progress by "cheating" a bit, because I'm totally stuck. Now I normalize the data, and instead of using atan I take the inverse sine, pitch = asin(xVal), which actually gives me a range of 90 degrees that seems to fit my device rotation (although, for example, it gives me 135 degrees instead of... 45, but I "fix" that by subtracting 90 degrees from all the angles). Anyway, I need a 180 degree range, because at the moment moving backwards or forwards does not make a difference in the obtained angles.
Edit adding some pictures and information
Calculating the pitch as pitch = atan2(xVal, sqrt(pow(yVal,2)+pow(zVal,2))) * (180/PI), I obtain the following angles. Just in case it is useful, the accelerometer values (raw data) in position 1 are: ACCX 2936, ACCY 2152, ACCZ 1883.
Position 1 (gravity falls into X axis): 45-46 degrees
Position 2 (rotating aprox 45 from starting point): 38
Position 3 (rotating aprox 90 from starting point): 28
Position 4 (rotating aprox 180 from starting point): 18-19
Position 5 (> 180 < 360): 18-46.
SOLUTION
Just in case someone bumps into something similar in the future: the main problem was the range of the raw data. Once I mapped it into a [-1g, 1g] range and fixed a few things about the coordinates, it worked.
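That fix can be sketched as follows. The scale factor here is hypothetical (it depends on the accelerometer's resolution and range; check the datasheet), and the class name is mine; the point is only the raw-counts-to-g mapping before atan2:

```java
public class PitchFromRawAccel {
    // Hypothetical scale: raw counts per 1 g. Device dependent; the
    // raw values in the question (~2000-3000) suggest counts, not g.
    static final float COUNTS_PER_G = 2048f;

    // Maps raw counts into [-1g, 1g], then computes pitch in degrees.
    public static double pitchDegrees(float rawX, float rawY, float rawZ) {
        double x = rawX / COUNTS_PER_G;
        double y = rawY / COUNTS_PER_G;
        double z = rawZ / COUNTS_PER_G;
        return Math.toDegrees(Math.atan2(x, Math.sqrt(y * y + z * z)));
    }
}
```

Note that atan2 is scale-invariant only when all three axes share the same scale and zero offset, which is exactly why un-normalized raw data gave skewed angles.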
Measure just the gravity orientation. There are two approaches I know of:
1. Use a smoothing (FIR) filter to filter out quick changes. For example, remember the last N measurements and output their average. This is easy and continuous, but not precise.
2. Check the acceleration vector size and ignore all measurements where it does not match gravity:
a = sqrt( ax^2 + ay^2 + az^2 )
if (fabs(a*scale - 9.81) > 0.1) ignore...
where a is the size of the acceleration vector,
(ax, ay, az) is the measured acceleration vector (local to your device),
scale is the scale of your accelerometer, converting a value to actual units like [m/s^2] or [N/kg] (device accelerometers are usually already in [g], so the scale is 9.81),
9.81 is the gravity in the area of measurement,
and 0.1 is the accuracy of the acceleration-size check (change it to your needs).
This approach is a bit slower, and not continuous, because during acceleration it will not produce an output (you can keep using the last valid output), but it is much more precise.
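The gravity-size check from approach 2 can be sketched as a small predicate (class and method names are mine; the 9.81 and 0.1 constants follow the numbers above):

```java
public class GravityGate {
    static final float GRAVITY = 9.81f;     // local gravity [m/s^2]
    static final float TOLERANCE = 0.1f;    // accuracy of the size check

    // Returns true when the measured vector length matches gravity,
    // i.e. the device is not otherwise accelerating and the sample
    // can be trusted as a pure gravity direction.
    public static boolean isUsable(float ax, float ay, float az, float scale) {
        float a = (float) Math.sqrt(ax * ax + ay * ay + az * az);
        return Math.abs(a * scale - GRAVITY) <= TOLERANCE;
    }
}
```

Samples rejected by this gate are simply skipped; the orientation keeps the last accepted value, which is the "not continuous" trade-off mentioned above.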
Now the formulas should work. From what I read, your axes are:
x - up, down
y - left, right
z - forward, backward
I do not know the orientation, so just negate the output angle if I hit it the wrong way.
(In the original figure: green is the gravity, blue the measured values, red your pitch angle ang.)
ang = asin(az*scale/9.81)
Do not forget to avoid calling asin with a parameter outside the range <-1.0, +1.0>!!!
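The clamping that warning asks for can look like this (a sketch, class name mine; scale and 9.81 as defined above):

```java
public class PitchAngle {
    // Computes ang = asin(az*scale/9.81) in degrees, clamping the
    // argument to [-1, 1] so measurement noise cannot push it out of
    // asin's domain and produce NaN.
    public static double pitchDegrees(float az, float scale) {
        double s = az * scale / 9.81;
        if (s > 1.0) s = 1.0;
        if (s < -1.0) s = -1.0;
        return Math.toDegrees(Math.asin(s));
    }
}
```

Without the clamp, a sample measuring even slightly more than 1 g on that axis would return NaN and poison any smoothing filter downstream.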
When the device is rotated by some amount, a simple cube has to rotate by the same amount but in the opposite direction. For example, the cube has to rotate 45 degrees to the left if the device is rotated 45 degrees to the right. Or when the pitch is 30 degrees, the cube has to rotate -30 degrees around the X axis. When the yaw is 10 degrees, the cube has to rotate -10 degrees around the Z axis. I've used getRotationMatrixFromVector followed by getOrientation like so:
if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR) {
    SensorManager.getRotationMatrixFromVector(mRotationMatrix, event.values);
    SensorManager.getOrientation(mRotationMatrix, orientationVals);

    azimuthVal = (Math.round((Math.toDegrees(orientationVals[0]))*100.0)/100.0);
    pitchVal   = (Math.round((Math.toDegrees(orientationVals[1]))*100.0)/100.0);
    rollVal    = (Math.round((Math.toDegrees(orientationVals[2]))*100.0)/100.0);
}
But the problem with it is that a change in pitch affects roll and vice versa. As a result, when the device is rotated around the X axis, the pitch value changes -> the roll changes -> the cube rotates not only around X but also around Y, which I don't want.
I've looked around the internet and many refer to quaternions as a solution, but how can I apply quaternions to my specific application, given that I need to know the amount of degrees the device is rotated by along an axis?
Gimbal lock happens when you extract (Euler) rotation angles from the rotation matrix: at certain specific rotations we lose a degree of freedom in the equations relating the rotation matrix components to the rotation angles, and the actual rotation angles are not recoverable.
So in your code it may happen at:
SensorManager.getOrientation(mRotationMatrix, orientationVals);
You should somehow resolve the issue before extracting the rotation angles. This can be done by inspecting the quaternion's components, as is explained here:
http://www.euclideanspace.com/maths/geometry/rotations/conversions/quaternionToEuler/
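A plain-Java sketch of the conversion on that page, including its singularity (gimbal-lock) check. The quaternion is taken as (w, x, y, z), matching the order SensorManager.getQuaternionFromVector writes into its output array; the class name is mine:

```java
public class QuaternionToEuler {
    // Returns {heading, attitude, bank} in radians, guarding the
    // north/south pole singularities as described on euclideanspace.com.
    public static double[] toEuler(double w, double x, double y, double z) {
        double test = x * y + z * w;
        if (test > 0.499) {   // singularity at north pole
            return new double[] { 2 * Math.atan2(x, w), Math.PI / 2, 0 };
        }
        if (test < -0.499) {  // singularity at south pole
            return new double[] { -2 * Math.atan2(x, w), -Math.PI / 2, 0 };
        }
        double heading  = Math.atan2(2 * y * w - 2 * x * z, 1 - 2 * y * y - 2 * z * z);
        double attitude = Math.asin(2 * test);
        double bank     = Math.atan2(2 * x * w - 2 * y * z, 1 - 2 * x * x - 2 * z * z);
        return new double[] { heading, attitude, bank };
    }
}
```

Near the singularity the heading and bank become coupled (only their sum or difference is defined), which is exactly the cross-talk between pitch and roll observed in the question.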
I want to rotate the cube when the user touches it. Rotating it is quite simple, but the actual problem is that when the user tilts it, it should always come to rest on a particular face (the cube has 6 faces, and I mean that only one face should be visible at a time). Please give your suggestion if anyone has worked on that.
In the case of a cube this is simple: the face normals are the Cartesian axes. So one looks straight at a face if you constrain the rotations around the Cartesian axes (X, Y, Z) so that the rotation angles are multiples of pi/2 = 90°.
So in your code, when the user stops interacting, set the rotation angles to the nearest multiple of 90°:
fmod(round(angle/90) * 90, 360); // degrees
fmod(round(angle/(pi/2)) * pi/2, 2*pi); // radians
Either snap there instantly ("hard"), or animate it.
If your object is not a cube but arbitrary, you need to find the additional rotation for the face to become perpendicular to the view axis. This angle is determined by acos( scalar_product(normalize(face_normal), normalize(view_axis)) ), and the axis of rotation is given by cross_product(normalize(face_normal), normalize(view_axis)).
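That last computation can be sketched directly (vectors as 3-element arrays; both inputs are assumed already normalized, as stated above; class and method names are mine):

```java
public class FaceAlignment {
    // Angle in radians between a normalized face normal and view axis:
    // acos of their dot product, clamped against rounding error.
    public static double angleTo(double[] n, double[] v) {
        double dot = n[0] * v[0] + n[1] * v[1] + n[2] * v[2];
        dot = Math.max(-1.0, Math.min(1.0, dot));
        return Math.acos(dot);
    }

    // Rotation axis: cross product of face normal and view axis.
    public static double[] rotationAxis(double[] n, double[] v) {
        return new double[] {
            n[1] * v[2] - n[2] * v[1],
            n[2] * v[0] - n[0] * v[2],
            n[0] * v[1] - n[1] * v[0]
        };
    }
}
```

Rotating the object by that angle about that axis brings the chosen face normal onto the view axis; for the degenerate case where the vectors are parallel, the cross product is zero and no extra rotation is needed.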