I'm trying to figure out how to calculate the inclination angle of head movement, just the up-down movement, not left-right movement. I have a wearable device (I guess the problem would be the same for mobile devices) with an accelerometer sensor, and it is placed on my forehead.
So far, I think it is better to leave the linear acceleration out and just work with the gravity acceleration. I've been reading and reading, trying to understand better what exactly I'm looking for, but my brain is not maths material. In a starting position of standing still (stationary), all the gravity acceleration falls on one axis, let's say X. When I start moving my head, for instance moving it down (like looking at the ground), the gravity acceleration no longer falls on a single axis but is shared between two axes (for instance X and Z). If I'm right and this is the right approach (is it?)...
How can I calculate this angle without doing crazy maths?
Is this problem exactly the same as calculating the pitch angle of an accelerometer?
Can I use raw data for this, or do I need calibrated data?
To make it clear, let's imagine that my device's axes are placed in the same position as in this photo and that in the starting position the gravity falls on the X axis (landscape position).
EDIT
Using the following formulas for pitch and roll (thanks to Nuclearman's link):
Roll = atan2(Y, Z) * 180/M_PI;
Pitch = atan2(X, sqrt(Y*Y + Z*Z)) * 180/M_PI
And this code I wrote:
/**
 * Function to calculate the inclination angle or pitch of the head movement.
 */
float calcInclinationAngle(int iDataPos){
    float xVal = fAccel[0][iDataPos];
    float yVal = fAccel[1][iDataPos];
    float zVal = fAccel[2][iDataPos];
    float pitch = 0;
    //Pitch in radians
    pitch = atan2(xVal, sqrt(pow(yVal, 2) + pow(zVal, 2)));
    //Pitch in degrees
    pitch = pitch * 180 / PI;
    return pitch;
}
I always get wrong angles. When the gravity force is entirely on the X axis, I get around 45 degrees (instead of 0), and if I rotate the device 180 degrees (just flipping the sign of the gravity force), I get around 17-18 degrees. I have been playing with the atan2 parameters, but the range of angles is always the same (25-35 degrees). Just throwing a question out there...should I be working with calibrated data instead of raw data?
Edit 2
I have made some progress by "cheating" a bit, because I'm totally stuck. Now I normalize the data and, instead of using atan, I do pitch = 1/sin(xVal), which actually gives me a range of 90 degrees that seems to fit my device rotation (although, for example, it gives me 135 degrees instead of...45, but I "fix" that by subtracting 90 degrees from all the angles). Anyway, I need a 180-degree range, because at the moment moving backwards or forwards does not make a difference in the obtained angles.
Edit adding some pictures and information
Calculating the pitch as pitch = atan2(xVal, sqrt(pow(yVal,2)+pow(zVal,2))) * (180/PI), I obtain the following angles. Just in case it is useful, the accelerometer values (raw data) in position 1 are: ACCX 2936, ACCY 2152, ACCZ 1883.
Position 1 (gravity falls on the X axis): 45-46 degrees
Position 2 (rotated approx. 45 degrees from the starting point): 38
Position 3 (rotated approx. 90 degrees from the starting point): 28
Position 4 (rotated approx. 180 degrees from the starting point): 18-19
Position 5 (> 180, < 360): 18-46
SOLUTION
Just in case someone bumps into something similar in the future: the main problem was the range of the raw data. Once I mapped it into a [-1g, 1g] range and fixed a few things about the coordinates, it worked.
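For anyone wondering what that mapping looks like in practice, here is a minimal Kotlin sketch. The zero-g offset and counts-per-g constants are assumptions for illustration only; the real values come from your sensor's datasheet or from a simple calibration (resting the device on each axis):

import kotlin.math.atan2
import kotlin.math.sqrt

// Hypothetical calibration constants, for illustration only: take the real
// zero-g offset and counts-per-g from the sensor's datasheet or a calibration step.
const val ZERO_G_OFFSET = 2048f   // raw reading at 0 g (assumed)
const val COUNTS_PER_G = 1024f    // raw counts per 1 g (assumed)

// Map a raw accelerometer count into g, roughly [-1g, +1g] when stationary.
fun rawToG(raw: Int): Float = (raw - ZERO_G_OFFSET) / COUNTS_PER_G

// Pitch in degrees from readings already expressed in g.
fun pitchDegrees(xG: Float, yG: Float, zG: Float): Float {
    val pitchRad = atan2(xG, sqrt(yG * yG + zG * zG))
    return Math.toDegrees(pitchRad.toDouble()).toFloat()
}

fun main() {
    // Raw sample from position 1 above: ACCX 2936, ACCY 2152, ACCZ 1883
    println(pitchDegrees(rawToG(2936), rawToG(2152), rawToG(1883)))
}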
Measure just the gravity orientation. There are two approaches I know of:
1. Use a smoothing (FIR) filter to filter out quick changes:
for example, remember the last N measurements
and output their average.
This is easy and continuous, but not precise.
2. Check the acceleration vector magnitude
and ignore all measurements where it does not match gravity (see the sketch after this list):
a = sqrt( ax^2 + ay^2 + az^2 )
if (fabs(a*scale - 9.81) > 0.1) ignore...
where a is the magnitude of the acceleration vector,
(ax, ay, az) is the measured acceleration vector (local to your device),
scale is the scale of your accelerometer, converting a raw value to actual units like [m/s^2] or [N/kg]
(device accelerometers usually already report in [g], so the scale is 9.81),
9.81 is the gravity in the area of measurement,
and 0.1 is the accuracy of the magnitude check (change it to your needs).
This approach is a bit slower
and not continuous, because during acceleration it produces no output (you can reuse the last valid output),
but it is much more precise.
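Here is a minimal Kotlin sketch of both approaches; the window size, gravity constant and tolerance are assumptions you would tune to your own sensor:

import kotlin.math.abs
import kotlin.math.sqrt

// Approach 1: moving-average (FIR) smoother. Window size of 16 is an assumption; tune it.
class MovingAverage(private val size: Int = 16) {
    private val window = ArrayDeque<Float>()
    private var sum = 0f

    fun add(sample: Float): Float {
        window.addLast(sample)
        sum += sample
        if (window.size > size) sum -= window.removeFirst()
        return sum / window.size   // smoothed value
    }
}

// Approach 2: accept a sample only when its magnitude matches gravity.
// `scale` converts the sensor value to m/s^2 (use 9.81f if it already reports in g).
fun isGravityOnly(ax: Float, ay: Float, az: Float, scale: Float,
                  gravity: Float = 9.81f, tolerance: Float = 0.1f): Boolean {
    val a = sqrt(ax * ax + ay * ay + az * az)
    return abs(a * scale - gravity) <= tolerance
}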
Now the formulas should work. From what I read, your axes are:
x - up,down
y - left,right
z - forward,backward
I do not know the orientation, so just negate the output angle if I got it the wrong way.
green is the gravity
blue is measured values
red is your pitch angle (ang)
ang=asin(az*scale/9.81)
Do not forget to avoid calling asin with a parameter outside the range [-1.0, +1.0]!
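As a small illustration of that last point, here is one way (a Kotlin sketch, with placeholder names) to compute the pitch while clamping the asin argument:

import kotlin.math.asin

// Pitch in radians from the forward/backward axis reading, following the formula above.
// coerceIn keeps noise from pushing the argument outside [-1, +1].
fun pitchFromAxis(az: Float, scale: Float, gravity: Float = 9.81f): Float {
    val ratio = (az * scale / gravity).coerceIn(-1f, 1f)
    return asin(ratio)
}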
Related
I am trying to implement a business logic layer that tells me at what degree, from 180 to -180, the device currently is on the x axis. It will be used for a camera screen that requires the user to hold the phone vertically.
So, in order to do so, I listen to both the TYPE_ACCELEROMETER and the TYPE_MAGNETIC_FIELD sensor types, as suggested in the official docs -
https://developer.android.com/guide/topics/sensors/sensors_position#sensors-pos-prox
And I ended up with the following code -
override fun onSensorChanged(event: SensorEvent?) {
    val value = event?.values ?: return

    var accelerometerReading = floatArrayOf()
    var magnetometerReading = floatArrayOf()
    when (event.sensor.type) {
        TYPE_ACCELEROMETER -> {
            accelerometerReading = floatArrayOf(value[0], value[1], value[2])
        }
        TYPE_MAGNETIC_FIELD -> {
            magnetometerReading = floatArrayOf(value[0], value[1], value[2])
        }
    }

    val rotationMatrix = FloatArray(9)
    if (magnetometerReading.isEmpty() || accelerometerReading.isEmpty()) return
    SensorManager.getRotationMatrix(rotationMatrix, FloatArray(9), accelerometerReading, magnetometerReading)

    val orientationAngles = FloatArray(3)
    SensorManager.getOrientation(rotationMatrix, orientationAngles) // always returns the same values that are provided to it. why?
}
As you can see, I implemented exactly the same code as the official docs say to, but the values I get have nothing to do with the range of 180 to -180, in any of the three elements of the orientationAngles array. I get values that are very similar to the input I give it, something like [-0.051408034, -0.007878973, 0.04735359], which is some random, irrelevant data to me.
Any idea why this would happen, and how to indeed get what I want, which is the x-axis angle of the device?
Edit:
I'll try to simplify what I want.
Imagine holding a device in portrait mode, locked, with it facing you. In a perfect portrait stance I want to get a 90-degree value from the sensor. When the user tilts the device either left or right, the value would go down towards 0 or up towards 180 (which side it is doesn't matter). All I need is this two-dimensional x-axis value.
It gives the angles in radians, not degrees. Almost nothing in math uses degrees beyond grade-school math; radians are the norm. Radians generally go from 0 to 2*pi, corresponding to 0 to 360 degrees. The formula to convert is degrees = radians / pi * 180.
According to the docs, the angles returned by getOrientation are in radians, in the range -PI to PI, with the zero reference defined per angle. From the docs:
values[0]: Azimuth, angle of rotation about the -z axis. This value represents the angle between the device's y axis and the magnetic north pole. When facing north, this angle is 0, when facing south, this angle is π. Likewise, when facing east, this angle is π/2, and when facing west, this angle is -π/2. The range of values is -π to π.
values[1]: Pitch, angle of rotation about the x axis. This value represents the angle between a plane parallel to the device's screen and a plane parallel to the ground. Assuming that the bottom edge of the device faces the user and that the screen is face-up, tilting the top edge of the device toward the ground creates a positive pitch angle. The range of values is -π/2 to π/2.
values[2]: Roll, angle of rotation about the y axis. This value represents the angle between a plane perpendicular to the device's screen and a plane perpendicular to the ground. Assuming that the bottom edge of the device faces the user and that the screen is face-up, tilting the left edge of the device toward the ground creates a positive roll angle. The range of values is -π to π.
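If degrees are what you actually want, a small Kotlin sketch of the conversion over the getOrientation output (the helper name is made up) could look like this:

// Convert the radian outputs of SensorManager.getOrientation() to degrees:
// azimuth -180..180, pitch -90..90, roll -180..180 (per the docs quoted above).
fun orientationToDegrees(orientationAngles: FloatArray): FloatArray =
    FloatArray(orientationAngles.size) { i ->
        Math.toDegrees(orientationAngles[i].toDouble()).toFloat()
    }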
I have been developing an Android application which requires the device inclination for real-time processing. The device is inclined on a surface.
I wanted to calculate the angle, and for this I used a project on GitHub to calculate the pitch value, but the pitch values returned by this method are not accurate over multiple tests; most of the time there is some margin of error in the pitch value.
And the same program, tested on another phone, shows a different pitch value in the same position (laying the phones on the table).
Is there any way I can get accurate pitch values across multiple devices?
I used an S6 and a OnePlus 2.
The rotation matrix is defined by applying the roll first, then the pitch, and finally the yaw rotation. You can get the phone into the same position if you apply pitch first, then roll, and finally yaw again. This is why you expect a certain pitch but get inaccurate values.
To prove this to yourself, play with the phone: bring it into some random position by applying angle rotations in a certain order, and then try to get to the same position by a different order of rotations (a good position to try is the phone vertical, like keeping it in front of your face and tilted a bit to the side).
Most of the time you would use code like this:
int rotation = ((WindowManager) getApplicationContext().getSystemService(Context.WINDOW_SERVICE)).getDefaultDisplay().getRotation();
if (rotation == 0) // Default display rotation is portrait
    SensorManager.remapCoordinateSystem(Rmat, SensorManager.AXIS_MINUS_X, SensorManager.AXIS_Y, R2);
else // Default display rotation is landscape
    SensorManager.remapCoordinateSystem(Rmat, SensorManager.AXIS_Y, SensorManager.AXIS_MINUS_X, R2);
to make it more intuitive. This is how you would virtually change the order: you still have the rotation matrix defined by applying roll first, then pitch, and finally yaw, but these rotations are now defined against a new XYZ coordinate system.
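To show where the remap sits in the usual pipeline, here is a rough Kotlin sketch; the helper name and parameters are made up for illustration, and rotation is the Display.getRotation() value from the snippet above:

import android.hardware.SensorManager

// Accelerometer + magnetometer readings in, remapped orientation angles (radians) out.
fun remappedOrientation(accel: FloatArray, magnet: FloatArray, rotation: Int): FloatArray? {
    val rMat = FloatArray(9)
    val remapped = FloatArray(9)
    if (!SensorManager.getRotationMatrix(rMat, null, accel, magnet)) return null
    if (rotation == 0) {   // default display rotation is portrait
        SensorManager.remapCoordinateSystem(rMat, SensorManager.AXIS_MINUS_X, SensorManager.AXIS_Y, remapped)
    } else {               // default display rotation is landscape
        SensorManager.remapCoordinateSystem(rMat, SensorManager.AXIS_Y, SensorManager.AXIS_MINUS_X, remapped)
    }
    val orientation = FloatArray(3)
    SensorManager.getOrientation(remapped, orientation)
    return orientation     // azimuth, pitch, roll in radians
}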
public abstract void onSensorChanged (int sensor, float[] values)
Added in API level 1
Called when sensor values have changed. The length and contents of the values array vary depending on which sensor is being monitored. See SensorManager for details on possible sensor types.
Definition of the coordinate system used below.
The X axis refers to the screen's horizontal axis (the small edge in portrait mode, the long edge in landscape mode) and points to the right.
The Y axis refers to the screen's vertical axis and points towards the top of the screen (the origin is in the lower-left corner).
The Z axis points toward the sky when the device is lying on its back on a table.
IMPORTANT NOTE: The axis are swapped when the device's screen orientation changes. To access the unswapped values, use indices 3, 4 and 5 in values[].
SENSOR_ORIENTATION, SENSOR_ORIENTATION_RAW:
All values are angles in degrees.
values[0]: Azimuth, rotation around the Z axis (0<=azimuth<360). 0 = North, 90 = East, 180 = South, 270 = West
values[1]: Pitch, rotation around X axis (-180<=pitch<=180), with positive values when the z-axis moves toward the y-axis.
values[2]: Roll, rotation around Y axis (-90<=roll<=90), with positive values when the z-axis moves toward the x-axis.
Note that this definition of yaw, pitch and roll is different from the traditional definition used in aviation where the X axis is along the long side of the plane (tail to nose).
And the difference between phones is expected.
I have been going over the accelerometer tutorial and some aspects are still confusing to me:
values[0]: Acceleration minus Gx on the x-axis
values[1]: Acceleration minus Gy on the y-axis
values[2]: Acceleration minus Gz on the z-axis
The question is: when the device lies flat on a table in its default orientation, it outputs +9.81 instead of -9.81.
If the device lies flat on the table, the Z-axis points down, so Gz is 9.81;
therefore values[2] should be (0 - Gz) = (0 - 9.81) = -9.81.
So why is that?
With the X and Y axes there is no such confusion.
The simple answer (i.e. not rigorous, and it will have real physicists turning in their graves) is that the usual convention is that if the Z acceleration is negative the device is accelerating downwards, i.e. in this case towards the centre of the earth.
Since it is staying still in a gravitational field it 'feels like' it is accelerating upwards (i.e. positive acceleration) just to stay in place.
(This is all explained pretty clearly in the SDK docs: Using the Accelerometer.)
At first I thought:
x - tilting the phone left-right and vice versa
y - tilting it upside down and vice versa
z - lifting the phone up and down
But it doesn't seem to work like that. Can anyone correct me if I am wrong?
I want to know these values in order to work out the angle of the phone. Is it possible to do that using these values?
Acceleration is a vector quantity, i.e. it has both a magnitude and a direction in which it acts.
Keeping this in mind, and looking at the image in the link provided by Nicolai Ditlev Krogh Krüger, the right side of your phone is the positive X direction, the top side is the positive Y direction, and the screen side is the positive Z direction.
Now force = mass x acceleration,
therefore acceleration = force / mass.
Now, the force of gravity always acts on your device. To keep it from falling you have to apply an equal force but in the "OPPOSITE" direction.
Now, when you put your phone on a horizontal table, the table applies an upward force (from the back of the phone towards the screen); this is the positive Z direction. So the acceleration here is force/mass of your phone in the positive Z direction. Since the phone is not moving, the force from the table equals gravity, and there is no other force in any other direction. The force of gravity = mass x 9.8 (approx.). Thus you see a value close to +10 for the Z parameter of the accelerometer, and the other two are near zero.
Take one more example. This time hold your phone so that the right side is down and the left side is up, holding it near the earpiece and mic (the opposite edges). In this scenario you are applying an upward force against gravity, to keep the phone from falling, in the negative X direction (remember that on the phone the positive X direction points towards the right side of the phone). Thus you will see a value close to -10 for the X parameter of the accelerometer.
If the user holds the phone at an angle (say tilting the right side of the phone down), the component of gravity along the phone's X axis becomes mass x 9.8 x cos(a), where 'a' is the angle between the X axis and the vertical (gravity) direction. Thus the acceleration along X now becomes 9.8 x cos(a). Since the maximum value of cos(a) is 1 (at a = 0 degrees, i.e. the X axis pointing straight down), the user will see a value less than 9.8 for the X parameter of the accelerometer at any other angle. If the angle a = 60 degrees (approx.), then the X parameter will show a value close to 5, as cos(60) = 0.5.
This is how the values change in the accelerometer output.
The user can use the values of X, Y and Z to calculate the actual angle of tilt of the device from any plane (say the horizontal plane).
Suppose the X parameter shows a value of 6. Then the tilt is acos(6/9.8) (that is, cos inverse, and the result will be in radians). To convert radians to degrees, multiply by 180/pi. Thus the final angle of tilt 'a' = acos(6/9.8) * (180/pi) in degrees.
This is how you come to know the tilt of your phone and can use it to simulate force on digital objects on the screen. More tilt means more acceleration and hence more force.
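A minimal Kotlin sketch of that last calculation (assuming the reading is in m/s^2; strictly, the angle it gives is the one between the phone's X axis and the gravity direction):

import kotlin.math.acos

// Tilt in degrees from the X reading, as in the acos(6/9.8) example above.
// Clamping guards against readings slightly above gravity due to noise.
fun tiltFromX(xAccel: Float, gravity: Float = 9.8f): Float {
    val ratio = (xAccel / gravity).coerceIn(-1f, 1f)
    return Math.toDegrees(acos(ratio).toDouble()).toFloat()
}

// tiltFromX(6f) is roughly 52 degrees; tiltFromX(5f) is roughly 60 degrees.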
After obtaining the accelerometer values into mValuesAccel[3] and the magnetic field values into mValuesMagnet[3], I am executing the following methods.
SensorManager.getRotationMatrix(mRotationMatrix, null, mValuesAccel, mValuesMagnet);
SensorManager.getOrientation(mRotationMatrix, mValuesOrientation);
After executing above methods, the values obtained in mValuesOrientation are:
mValuesOrientation[0]:
i) if the -ve X-axis faces west, then 0 degrees;
ii) if it faces south, then -90 degrees;
iii) if it faces north, then -180 or 180 degrees;
iv) if it faces east, then 90 degrees.
So, it lies between -180 degrees and 180 degrees.
NOTE: 1) After converting the X value obtained into degrees, you will get the above values.
2) The X-axis runs from the origin to the right side, where the origin is at the bottom left of your phone.
mValuesOrientation[1]: if you lift your phone (which is lying on the table) so that the top of the phone faces the sky, you get negative values, and positive values vice versa.
mValuesOrientation[2]: if you tilt your phone (which is lying on the table), like turning a page in a book, to the left side it gives negative values and to the right side it gives positive values.
I have a square that rotates to a random angle and then travels in a straight line in the direction it is pointing. It does this by using a variable as its x axis and then calling
Variable++
each frame.
Unfortunately, I cannot work out how to return the exact position of the square, because the square can be travelling at any angle and therefore doesn't rigidly follow the world coordinate grid. This means that the x variable is not the shape's x coordinate.
How do I return the shape's exact coordinates, and how do I do it in such a way that I can have two squares drawn from the same class behaving differently?
So you've got a measure of distance from where the object started along its internal sideways axis and a measure of the angle between that axis and the horizontal?
If so, then the formula you want is simple trigonometry. Assuming the object started at (x, y) and has travelled 'distance' units along an axis at an angle of 'angle' with the horizontal, then the current position (x', y') is:
x' = x + distance * cos(angle)
y' = y + distance * sin(angle)
If you have the origin in the lower left of the screen and axes arranged graph paper style with x increasing to the right and y increasing as you go upward, that assumes that the angle is measured anticlockwise and that the object is heading along positive x when angle is zero.
If you'll permit a hand-waving explanation, the formula works because one definition of sine and cosine is that they are the (x, y) coordinates of the point on the outside of a unit circle at the specified angle. It also matches the very first thing most people learn about trigonometry: that sine is 'opposite over hypotenuse' and cosine is 'adjacent over hypotenuse'. In this case your hypotenuse has length 'distance', and you want the 'opposite' and 'adjacent' lengths of a right-angled triangle that coincides with the axes.
Assuming Android follows J2SE in this area, the one thing to watch out for is that Math.sin and Math.cos take an angle in radians, whereas OpenGL's rotatef takes an argument in degrees. Math.toDegrees and Math.toRadians can do the conversion for you.
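Putting those two formulas and the radians caveat together, a small Kotlin sketch (the names are just for illustration) could look like:

import kotlin.math.cos
import kotlin.math.sin

// Position of an object that started at (startX, startY) and travelled `distance`
// units along a heading of `angleDegrees` (0 = along +x, measured anticlockwise).
fun positionAfter(startX: Float, startY: Float, distance: Float, angleDegrees: Float): Pair<Float, Float> {
    val angleRad = Math.toRadians(angleDegrees.toDouble())
    val x = startX + distance * cos(angleRad).toFloat()
    val y = startY + distance * sin(angleRad).toFloat()
    return x to y
}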
When you made the shape you should have already specified its X and Y coordinates. I'm not too sure what you mean when you say you can't find the coordinates?
Also make sure you do frame-independent movement; currently you are adding one to your variable on every loop of your program. This means that if it runs at 60 frames per second (FPS) it will move 60 units, but if it runs at 30 FPS it will move at half the speed.
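One common way to do that (a sketch, with made-up names and an arbitrary speed) is to scale the movement by the time elapsed since the last frame; because each instance keeps its own state, two squares created from the same class can also move independently:

// Frame-independent movement: advance by speed * elapsed seconds instead of
// a fixed amount per frame, so 30 FPS and 60 FPS cover the same distance.
class MovingSquare(var distance: Float = 0f, private val speedUnitsPerSecond: Float = 100f) {
    private var lastFrameNanos = System.nanoTime()

    fun update() {
        val now = System.nanoTime()
        val deltaSeconds = (now - lastFrameNanos) / 1_000_000_000f
        lastFrameNanos = now
        distance += speedUnitsPerSecond * deltaSeconds
    }
}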