I am currently implementing a speedometer by receiving orientation data from my phone. I am using
SensorManager.getRotationMatrix(R, I, gravity, geomagnetic);
float[] orientation = new float[3];
SensorManager.getOrientation(R, orientation);
float azimuth = orientation[0]; // radians, -pi..pi
double azimuthD = Math.toDegrees(azimuth);
if (azimuthD < 0) azimuthD = 360 + azimuthD; // normalize to 0..360 degrees
With this I am able to receive the rotation data from my phone, such as the azimuth etc.
Anyway, this works fine while the device is lying on a table or similar. But when it rotates around a fixed point (in my case the device is mounted on a wheel and spinning at a certain speed), the values are far from accurate. I believe that, since I am using the gravity and geomagnetic sensors, the forces acting on these sensors while rotating could conflict with each other. As the wheel turns, the device's rotation changes relative to a fixed point, but the device's local orientation on the wheel stays the same.
How can I access the orientation of the device while it's turning without running into a lot of noisy data?
I read a bit about the `Sensor.TYPE_ROTATION_VECTOR` property, but couldn't quite figure out how it works. I also read about the possibility to remap the coordinate system, but how is that supposed to help, since my phone is never exactly vertical to the floor, but rather at an angle of 5°-10°?
I would appreciate any help.
Cheers,
viehlieb
I guess I found my answer.
The solution was to throw away all the code I posted above and, obviously, use the gyroscope.
The gyroscope values measure the angular velocity of the device's rotation. The coordinate system used is the device's own coordinate system. In my case the relevant value was the rotation around the z-axis.
Values are in radians per second, which can be converted to m/s by dividing by 2π (to get revolutions per second) and multiplying by the wheel's circumference. So the trick was in the onSensorChanged method:
if (sensorEvent.sensor.getType() == Sensor.TYPE_GYROSCOPE) {
    gyroscope = sensorEvent.values;
    // Angular velocity around the device's z-axis, in rad/s
    double rotZ = gyroscope[2];
    double degrees = Math.toDegrees(rotZ);
    // Calculate the speed: revolutions/s * circumference (2.23 m) gives m/s; * 3.6 gives km/h
    float speed = (float) (degrees / 360.0 * 2.23 * 3.6);
}
If you'd like smoother values, you can store them in an array and calculate the average. Remember to clear the array every 20th call (or so) to onSensorChanged. With SENSOR_DELAY_GAME registered there is sufficient data to average over.
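As a minimal sketch of that averaging (field names like speedSamples are my own, not from the original post):
private final float[] speedSamples = new float[20];
private int sampleCount = 0;

private void addSpeedSample(float speed) {
    speedSamples[sampleCount++] = speed;
    if (sampleCount == speedSamples.length) {
        float sum = 0f;
        for (float s : speedSamples) sum += s;
        float averageSpeed = sum / speedSamples.length; // smoothed km/h reading
        sampleCount = 0; // "clear" the buffer every 20th call
        // ... display averageSpeed ...
    }
}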
Related
A week ago I didn't know anything about Android motion sensors. After learning about the amazing thing called Virtual Reality, I started to research which sensors are used to get those results. Then I had an idea for an app, but I still don't know which sensors I should use for the situation below:
I have to get the phone's orientation in reference to itself. I mean, I should be able to isolate each axis in degrees. Something like this:
In this case, using the gyroscope, I think that this variation is on the Z axis.
Using ONLY the gyroscope I had a good result for this situation, but after some repetitions I ran into the famous problem of the gyro sensor: drift.
After this tutorial:
http://www.thousand-thoughts.com/articles/#articles
things became clearer in my head, but I am still having problems, like latency between the real movement and the output, and wrong outputs when I change the device orientation (I think gravity is the culprit for that).
Is there some code example about how to get 0 - 360 degrees for each axis using ONLY the gyroscope and accelerometer sensors?
(I may have made some English mistakes. Sorry for that.)
The following code will give you the correct lean angle, but only if your phone's Z-axis rotation is 0 (like the way you illustrated it).
When you start changing the Z axis as well, it becomes problematic; I'm still working on that. (Degrees have a minus "-" sign when leaning left and a "+" sign when leaning right.)
float[] mGravity;      // latest accelerometer values
float[] mGeomagnetic;  // latest magnetic field values
float[] temp = new float[9];
float[] RR = new float[9];
// Load the rotation matrix into temp
SensorManager.getRotationMatrix(temp, null, mGravity, mGeomagnetic);
// Remap to the camera's point of view
SensorManager.remapCoordinateSystem(temp,
        SensorManager.AXIS_X, SensorManager.AXIS_Z, RR);
// Extract the orientation values
float[] values = new float[3];
SensorManager.getOrientation(RR, values);
double degrees = Math.toDegrees(values[2]); // roll, in degrees
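Note that this snippet assumes mGravity and mGeomagnetic have already been filled elsewhere; a minimal, hypothetical wiring (not part of the original snippet) could look like this:
@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
        mGravity = event.values.clone();
    } else if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) {
        mGeomagnetic = event.values.clone();
    }
    // ...run the lean-angle code above once both arrays are non-null...
}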
Hi, I am creating an application in which the user holds the phone upright and then rotates it around the y axis (similar to taking a panorama).
I need to detect the angle of rotation. In iOS this was fairly simple with the gyroscope sensor, but I am not having the same luck with Android. If anyone could point me in the right direction that would be great.
Assuming your Y axis points to the center of the earth, the value you are looking for is called the azimuth.
To monitor its change you will need to register a listener for TYPE_ACCELEROMETER and TYPE_MAGNETIC_FIELD events:
mngr = (SensorManager)getSystemService(Context.SENSOR_SERVICE);
accelerometer = mngr.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
magneticField = mngr.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD);
int rate = SensorManager.SENSOR_DELAY_GAME; // or other
mngr.registerListener(sensorListener, accelerometer, rate);
mngr.registerListener(sensorListener, magneticField, rate);
And within the listener, call:
// R must first be filled by SensorManager.getRotationMatrix() from the latest
// accelerometer and magnetic field values (see the sketch below)
float[] values = new float[3];
SensorManager.getOrientation(R, values);
float current_azimuth_val = values[0]; // <---------- azimuth, in radians
Note that the quality, and latency, of the data you will obtain is highly hardware dependent.
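For completeness, a minimal end-to-end sketch (the field and listener names are my own, not from the original answer): feed the latest accelerometer and magnetic field readings into getRotationMatrix(), then read the azimuth out of getOrientation():
private float[] lastAccel;
private float[] lastMagnetic;

private final SensorEventListener sensorListener = new SensorEventListener() {
    @Override public void onAccuracyChanged(Sensor sensor, int accuracy) {}

    @Override public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
            lastAccel = event.values.clone();
        } else if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) {
            lastMagnetic = event.values.clone();
        }
        if (lastAccel == null || lastMagnetic == null) return;

        float[] R = new float[9];
        if (SensorManager.getRotationMatrix(R, null, lastAccel, lastMagnetic)) {
            float[] values = new float[3];
            SensorManager.getOrientation(R, values);
            float current_azimuth_val = values[0]; // radians, -pi..pi
        }
    }
};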
There are various sensors available that can be managed through a SensorManager. Of course, since every manufacturer decides whether or not to put a particular sensor on the hardware platform for a given model, you have to check whether one exists. Some devices have a gyro like iOS devices do; on others the same can be achieved with the accelerometer and magnetometer sensors in its place.
You can get started here: http://developer.android.com/guide/topics/sensors/sensors_overview.html
From my Android device I can read an array of linear acceleration values (in the device's coordinate system) and an array of absolute orientation values (in Earth's coordinate system). What I need is to obtain the linear acceleration values in the latter coord. system.
How can I convert them?
EDIT after Ali's reply in comment:
All right, so if I understand correctly, when I measure the linear acceleration, the orientation of the phone does not matter at all, because the readings are given in Earth's coordinate system. Right?
But I just did a test where I put the phone in different positions and got acceleration on different axes. There are 3 pairs of pictures: the first ones show how I positioned the device (sorry for my Paint "master skill") and the second ones show the readings provided by the linear acceleration sensor:
device put on left side
device lying on back
device standing
And now: why, in the third case, does the acceleration occur along the Z axis (not Y), if the device's position doesn't matter?
I finally managed to solve it! To get the acceleration vector in Earth's coordinate system you need to:
get the rotation matrix (as float[16], so it can be used later by the android.opengl.Matrix class) from SensorManager.getRotationMatrix() (using Sensor.TYPE_GRAVITY and Sensor.TYPE_MAGNETIC_FIELD sensor values as parameters),
use android.opengl.Matrix.invertM() on the rotation matrix to invert it (not transpose!),
use the Sensor.TYPE_LINEAR_ACCELERATION sensor to get the linear acceleration vector (in the device's coordinate system),
use android.opengl.Matrix.multiplyMV() to multiply the inverted rotation matrix by the linear acceleration vector.
And there you have it! I hope this saves some precious time for others.
Thanks to Edward Falk and Ali for the hints!
Based on @alex's answer, here is the code snippet:
private float[] gravityValues = null;
private float[] magneticValues = null;

@Override
public void onSensorChanged(SensorEvent event) {
    if ((gravityValues != null) && (magneticValues != null)
            && (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER)) {
        float[] deviceRelativeAcceleration = new float[4];
        deviceRelativeAcceleration[0] = event.values[0];
        deviceRelativeAcceleration[1] = event.values[1];
        deviceRelativeAcceleration[2] = event.values[2];
        deviceRelativeAcceleration[3] = 0;
        // Change the device-relative acceleration values to earth-relative values
        // X axis -> East
        // Y axis -> North Pole
        // Z axis -> Sky
        float[] R = new float[16], I = new float[16], earthAcc = new float[16];
        SensorManager.getRotationMatrix(R, I, gravityValues, magneticValues);
        float[] inv = new float[16];
        android.opengl.Matrix.invertM(inv, 0, R, 0);
        android.opengl.Matrix.multiplyMV(earthAcc, 0, inv, 0, deviceRelativeAcceleration, 0);
        Log.d("Acceleration", "Values: (" + earthAcc[0] + ", " + earthAcc[1] + ", " + earthAcc[2] + ")");
    } else if (event.sensor.getType() == Sensor.TYPE_GRAVITY) {
        gravityValues = event.values.clone(); // clone: the system reuses the values array
    } else if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) {
        magneticValues = event.values.clone();
    }
}
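For this snippet to receive events, listeners have to be registered for all three sensor types; a sketch, assuming the enclosing class implements SensorEventListener:
SensorManager mgr = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
int rate = SensorManager.SENSOR_DELAY_GAME;
mgr.registerListener(this, mgr.getDefaultSensor(Sensor.TYPE_ACCELEROMETER), rate);
mgr.registerListener(this, mgr.getDefaultSensor(Sensor.TYPE_GRAVITY), rate);
mgr.registerListener(this, mgr.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD), rate);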
According to the documentation you get the linear acceleration in the phone's coordinate system.
You can transform any vector from the phone's coordinate system to the Earth's coordinate system by multiplying it with the rotation matrix. You can get the rotation matrix from getRotationMatrix().
(Perhaps there already is a function doing this multiplication for you but I don't do Android programming and I am not familiar with its API.)
A nice tutorial on the rotation matrix is the Direction Cosine Matrix IMU: Theory manuscript. Good luck!
OK, first of all, if you're trying to do actual inertial navigation on Android, you've got your work cut out for you. The cheap little sensors used in smartphones are just not precise enough. Although there has been some interesting work done on inertial navigation over small distances, such as inside a building. There are probably papers on the subject you can dig up. Google "Motion Interface Developers Conference" and you might find something useful -- that's a conference that Invensense put on a couple of months ago.
Second, no, linear acceleration is in device coordinates, not world coordinates. You'll have to convert it yourself, which means knowing the device's 3-d orientation.
What you want to do is use a version of Android that supports the virtual sensors TYPE_GRAVITY and TYPE_LINEAR_ACCELERATION. You'll need a device with gyros to get reasonably accurate and precise readings.
Internally, the system combines gyros, accelerometers, and magnetometers in order to come up with true values for the device orientation. This effectively splits the accelerometer device into its gravity and acceleration components.
So what you want to do is set up sensor listeners for TYPE_GRAVITY, TYPE_LINEAR_ACCELERATION, and TYPE_MAGNETIC_FIELD. Use the gravity and magnetometer data as inputs to SensorManager.getRotationMatrix() in order to get the rotation matrix that will transform world coordinates into device coordinates or vice versa. In this case, you'll want the "versa" part: that is, convert the linear acceleration input to world coordinates by multiplying it by the transpose of the orientation matrix.
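As a sketch of that last step (the helper below is my own, not from this answer): since a rotation matrix is orthogonal, its transpose equals its inverse, so multiplying by the transpose of the 3x3 matrix from getRotationMatrix() maps a device-frame vector into the world frame, consistent with the invertM() approach shown earlier in this thread:
// Transform a device-frame vector into the world frame using R^T
// (r is the row-major float[9] from getRotationMatrix()).
static float[] toWorldFrame(float[] r, float[] deviceVec) {
    float[] worldVec = new float[3];
    for (int i = 0; i < 3; i++) {
        // Row i of R^T is column i of R: elements r[i], r[i + 3], r[i + 6]
        worldVec[i] = r[i] * deviceVec[0]
                + r[i + 3] * deviceVec[1]
                + r[i + 6] * deviceVec[2];
    }
    return worldVec;
}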
It's been several days since I started using this function and I have not yet succeeded in obtaining valid results.
What I want is basically to convert the acceleration vector from the device's coordinate system to the real-world coordinate system. I know that it is possible, because I have the acceleration in relative coordinates and I know the orientation of the device in the real-world system.
Reading the Android developer documentation, it seems that using getRotationMatrix() I get R = the rotation matrix.
So if I want A (the acceleration vector in the world system) from A' (the acceleration vector in the phone system), I simply have to compute:
A = R * A'
But I can't understand why the vector A ALWAYS has the first and second components zero (example: +0.00; -0.00; +6.43).
My current code is similar to this:
public void onSensorChanged(SensorEvent event) {
    synchronized (this) {
        switch (event.sensor.getType()) {
            case Sensor.TYPE_ACCELEROMETER:
                accelerometervalues = event.values.clone();
                break;
            case Sensor.TYPE_MAGNETIC_FIELD:
                geomagneticmatrix = event.values.clone();
                break;
        }
        if (geomagneticmatrix != null && accelerometervalues != null) {
            float[] Rs = new float[16];
            float[] I = new float[16];
            SensorManager.getRotationMatrix(Rs, I, accelerometervalues, geomagneticmatrix);
            float resultVec[] = new float[4];
            float relativacc[] = new float[4];
            relativacc[0] = accelerationvalues[0];
            relativacc[1] = accelerationvalues[1];
            relativacc[2] = accelerationvalues[2];
            relativacc[3] = 0;
            Matrix.multiplyMV(resultVec, 0, Rs, 0, relativacc, 0);
            // resultVec[] should be the acceleration vector relative to the
            // world coordinate system... but it doesn't WORK!!!!!
        }
    }
}
This question is very similar to Transforming accelerometer's data from device's coordinates to real world coordinates, but there I can't find the solution... I have tried every approach.
Please help me, I need help!
UPDATE:
Now my code is below; I have tried writing out the matrix product explicitly, but nothing changed:
float[] Rs = new float[9];
float[] I = new float[9];
SensorManager.getRotationMatrix(Rs, I, accelerationvalues, geomagneticmatrix);
float resultVec[] = new float[4];
// Explicit row-major 3x3 matrix * vector product
resultVec[0] = Rs[0]*accelerationvalues[0] + Rs[1]*accelerationvalues[1] + Rs[2]*accelerationvalues[2];
resultVec[1] = Rs[3]*accelerationvalues[0] + Rs[4]*accelerationvalues[1] + Rs[5]*accelerationvalues[2];
resultVec[2] = Rs[6]*accelerationvalues[0] + Rs[7]*accelerationvalues[1] + Rs[8]*accelerationvalues[2];
Here is an example of the data read and the result:
Rs, separated by " ": Rs[0] Rs[1] ... Rs[8]
Av, separated by " ": accelerationvalues[0] ... accelerationvalues[2]
rV, separated by " ": resultVec[0] ... resultVec[2]
As you can notice, the components on the x and y axes in the real world are zero (or around zero) even if you move the phone quickly. The relative acceleration vector, instead, correctly detects each movement!!!
SOLUTION
The errors in the numbers come from float multiplication, which is not the same as double multiplication.
Added to this is the fact that the rotation matrix isn't constant if the phone is accelerating, even while keeping the same orientation.
So it is impossible to translate the acceleration vector to absolute coordinates during motion...
It's hard, but it's the reality.
Finally I found the answer:
The errors in the numbers come from float multiplication, which is not the same as double multiplication. Here is the solution.
Added to this is the fact that the rotation matrix isn't constant if the phone is accelerating, even while keeping the same orientation. So it is impossible to translate the acceleration vector to absolute coordinates during motion... It's hard, but it's the reality.
FYI, the orientation vector is built from magnetometer data AND the gravity vector. This causes a circular problem: converting relative acceleration needs the orientation, which needs the magnetic field AND gravity, but we can isolate gravity from the relative acceleration only while the phone is at rest... so we are back at the beginning.
This is confirmed in the Android developer documentation, where it is explained that the rotation matrix gives a true result only when the phone isn't accelerating (e.g. they mention free fall; in that case there shouldn't be a gravity measurement) and when it isn't in an irregular magnetic field:
The matrices returned by this function are meaningful only when the
device is not free-falling and it is not close to the magnetic north.
If the device is accelerating, or placed into a strong magnetic field,
the returned matrices may be inaccurate.
In other words: completely useless...
You can verify this by doing a simple experiment on a table with an Android sensor app or something like that.
You must track down this arithmetic error before you worry about rotation, acceleration or anything else.
You have confirmed that
resultVec[0]=Rs[0]*accelerationvalues[0];
gives you
Rs[0]: 0.24105562
accelerationValues[0]: 6.891896
resultVec[0]: 1.1920929E-7
So once again, simplify. Try this:
Rs[0] = 0.2;
resultVec[0] = Rs[0] * 6.8
EDIT:
The last one gave resultVec[0]=1.36, so let's try this:
Rs[0] = 0.2;
accelerationValues[0] = 6.8
resultVec[0] = Rs[0] * accelerationValues[0];
If you do the sums, using the printed values you have appended, I get
`(0.00112, -0.0004, 10)`
which is not as small as what you have. Therefore there is an arithmetic error!
Could the problem be that you are using accelerationvalues[] in the last block, and accelerometervalues[] later?
I have developed several applications that make use of Android sensors, so I am answering one of your questions according to my experience:
But I can't understand why the vector A ALWAYS has the first and second components zero (example: +0.00; -0.00; +6.43).
I have observed this problem with the acceleration sensor and the magnetic field sensor, too. The readings are zero for some of the axes (two, as you point out, or just one on other occasions). This problem happens when you have just enabled the sensors (registerListener()) and I assume that it is related to some kind of sensor initialization.
In the case of the acceleration sensor, I have observed that just a small shake of the device makes it start giving correct sensor readings.
The correct solution would be for the method onAccuracyChanged() to give correct information about the sensor state. It should be returning a status of SensorManager.SENSOR_STATUS_UNRELIABLE, but instead it permanently returns SensorManager.SENSOR_STATUS_ACCURACY_HIGH on all physical devices that I have tested so far. With onAccuracyChanged() properly implemented, you could ignore bad readings or ask the user to wait while the sensor is being initialized, for example as sketched below.
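A hypothetical guard (a sketch only, assuming onAccuracyChanged() reported real statuses; the field name is my own):
private int accelAccuracy = SensorManager.SENSOR_STATUS_ACCURACY_HIGH;

@Override
public void onAccuracyChanged(Sensor sensor, int accuracy) {
    if (sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
        accelAccuracy = accuracy;
    }
}

@Override
public void onSensorChanged(SensorEvent event) {
    if (accelAccuracy == SensorManager.SENSOR_STATUS_UNRELIABLE) {
        return; // skip readings until the sensor settles
    }
    // ...use event.values as usual...
}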
The values I'm getting for accel, x, y and z below are not as expected.
It seems to be acting as a tilt sensor rather than an accelerometer.
When I throw the phone in the air and catch it, the accel value doesn't change by more than about 10%. Contrast this with rotating the phone randomly, where I get much larger variations of 50-100%!
What could explain this? I simply want to detect when the phone is in freefall (and/or impacting something).
SensorManager sm = (SensorManager) getSystemService(SENSOR_SERVICE);
Sensor sensor = sm.getDefaultSensor(SensorManager.SENSOR_ACCELEROMETER);
sm.registerListener(
    sel = new SensorEventListener() {
        @Override public void onAccuracyChanged(Sensor arg0, int arg1) {}
        @Override public void onSensorChanged(SensorEvent arg0) {
            double x = se.values[0];
            double y = se.values[1];
            double z = se.values[2];
            double accel = Math.sqrt(
                    Math.pow(x, 2) +
                    Math.pow(y, 2) +
                    Math.pow(z, 2));
        }
    },
    sensor,
    SensorManager.SENSOR_DELAY_NORMAL
);
(As a side question: the values for x, y and z seem much higher than they should be, with accel averaging about 50-80 when standing still. Shouldn't it be around 9.8?)
The x, y and z values seem very sensitive to changes in the orientation of the phone, but not at all representative of acceleration. Am I missing something?
Example values with phone still, lying on back:
Accel = 85.36, x = 6.8, y = 45.25 z = 30.125
I had to replace
Sensor sensor = sm.getDefaultSensor(SensorManager.SENSOR_ACCELEROMETER);
with
Sensor sensor = sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
It could be because when you throw the phone it stays in almost the same plane or angle and changes at low speed, whereas when you turn the phone it changes its course and orientation rapidly, and that gives larger variations in the result. "Accelerometer" may be a misnomer for a multi-function device; there could be a selection parameter for the function you really want to get results from.
With the phone lying on its back you should get close to zero on the X and Y axes and about 9.8 on the Z axis. The 9.8 is of course due to gravity.
My first thought would be that there is something wrong with the phone, and I would suggest trying the same code on another phone.
However, I notice that there is something wrong in the math but haven't figured out what yet.
With x, y, z having the values you mention, the resultant (square root of the sum of squares) works out to 54.78 rather than the 85.36 you mention in your post.
I'm quite new to Java so I cannot easily spot what might be wrong, and haven't had the opportunity yet to try that piece of code on my phone, but I think the math is simple enough for me to determine that the result is wrong (or at least I hope so).
The other thing to check (assuming you figure out the math problem) is that the small change when you throw the phone in the air might simply be due to slow response time. The accelerometer output may simply be changing too slowly, so by the time the phone has landed the output wouldn't have changed that much. The response can be improved by using SENSOR_DELAY_GAME or SENSOR_DELAY_FASTEST instead of normal.
By the way, shouldn't that be arg0.values[] rather than se.values[]? Where does se come from? The sensor values go into the argument of onSensorChanged (arg0 in this case), so I cannot figure out how they are supposed to end up in se. (But then again, there are many things in Java I still don't understand.)
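Coming back to the original goal of detecting free fall: once the sensor-type and se/arg0 issues above are fixed, a minimal magnitude check could look like this (a sketch; the 2.0 m/s² threshold is my own choice, not from any answer here):
static boolean isFreeFall(float[] values) {
    // At rest the magnitude is ~9.8 m/s^2; in free fall it drops toward zero.
    double magnitude = Math.sqrt(values[0] * values[0]
            + values[1] * values[1]
            + values[2] * values[2]);
    return magnitude < 2.0;
}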