I am using the SensorEvent API to get data for my app from different sensors (more specifically: TYPE_ROTATION_VECTOR, TYPE_GRAVITY, TYPE_GYROSCOPE, TYPE_LINEAR_ACCELERATION). Now, I know that iOS has the so-called CMAttitudeReferenceFrameXTrueNorthZVertical, which gives all the sensor values with respect to true north, with the z axis always vertical.
I couldn't find anything similar in Android, so I am thinking of translating the coordinate system manually. I am also considering the remapCoordinateSystem method. However, I still don't know how to get the data with respect to true north. Has anyone dealt with something similar before?
This answer is inspired by the class used here:
You should have an onSensorChanged method in your SensorEventListener:
private final float[] mRotationMatrix = new float[9];
private final float[] mChangedRotationMatrix = new float[9];
private final float[] mOrientation = new float[3];

@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR) {
        // Get the rotation matrix from the sensor (this will be referenced to magnetic north)
        SensorManager.getRotationMatrixFromVector(mRotationMatrix, event.values);
        // Change the coordinate system (the new matrix ends up in mChangedRotationMatrix)
        SensorManager.remapCoordinateSystem(mRotationMatrix, SensorManager.AXIS_Y,
                SensorManager.AXIS_MINUS_X, mChangedRotationMatrix);
        // Get the orientation angles of the new coordinate system
        SensorManager.getOrientation(mChangedRotationMatrix, mOrientation);
        // Get the magnetic heading
        float magneticHeading = (float) Math.toDegrees(mOrientation[0]);
        // Adjust accordingly by calculating true north, with either the "computeTrueNorth"
        // method from the link above or the method in the link below
    }
}
To get the true north in degrees you may use this answer
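If you have a recent location fix, a minimal sketch of that adjustment could look like the following (GeomagneticField is the standard Android API for this; the "location" and "magneticHeading" variables are assumptions):
// Sketch: convert a magnetic heading to a true heading using the local declination.
// Assumes "location" is a recent android.location.Location fix.
GeomagneticField geoField = new GeomagneticField(
        (float) location.getLatitude(),
        (float) location.getLongitude(),
        (float) location.getAltitude(),
        System.currentTimeMillis());
// getDeclination() is positive when magnetic north is east of true north
float trueHeading = magneticHeading + geoField.getDeclination();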
Related
Hi, I am creating an application in which the user holds the phone upright and then rotates it around the y axis (similar to taking a panorama).
I need to detect the angle of rotation. In iOS this was fairly simple with the gyroscope sensor, but I am not having the same luck with Android. If anyone could point me in the right direction that would be great.
Assuming your Y axis points to the center of the earth, the value you are looking for is called the azimuth.
To monitor its change you will need to register a listener for TYPE_ACCELEROMETER and TYPE_MAGNETIC_FIELD events:
mngr = (SensorManager)getSystemService(Context.SENSOR_SERVICE);
accelerometer = mngr.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
magneticField = mngr.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD);
int rate = SensorManager.SENSOR_DELAY_GAME; // or other
mngr.registerListener(sensorListener, accelerometer, rate);
mngr.registerListener(sensorListener, magneticField, rate);
And within the listener, build the rotation matrix from the latest readings of both sensors and read off the azimuth:
// accelValues / magValues are assumed to hold the most recent event.values
// from the accelerometer and the magnetic field sensor, respectively
float[] R = new float[9], I = new float[9], values = new float[3];
SensorManager.getRotationMatrix(R, I, accelValues, magValues);
SensorManager.getOrientation(R, values);
float current_azimuth_val = values[0]; // <---------- azimuth, in radians
Note that the quality and latency of the data you obtain are highly hardware dependent.
There are various sensors available that can be managed through a SensorManager. Of course, since every manufacturer decides whether or not to put a particular sensor on the hardware platform for each model, you have to check whether one exists. Some devices have a gyro like iOS devices do; on others the same job can be done with the accelerometer and magnetometer in its place.
You can get started here: http://developer.android.com/guide/topics/sensors/sensors_overview.html
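For example, a minimal existence check might look like this (a sketch; getDefaultSensor() returns null when the hardware lacks that sensor):
SensorManager mgr = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
Sensor gyro = mgr.getDefaultSensor(Sensor.TYPE_GYROSCOPE);
if (gyro == null) {
    // No gyroscope on this device; fall back to accelerometer + magnetic field sensors
    Sensor accel = mgr.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
    Sensor mag = mgr.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD);
}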
From my Android device I can read an array of linear acceleration values (in the device's coordinate system) and an array of absolute orientation values (in Earth's coordinate system). What I need is to obtain the linear acceleration values in the latter coord. system.
How can I convert them?
EDIT after Ali's reply in a comment:
All right, so if I understand correctly: when I measure the linear acceleration, the orientation of the phone does not matter at all, because the readings are given in Earth's coordinate system. Right?
But I just did a test where I put the phone in different positions and got acceleration in different axes. There are 3 pairs of pictures - the first ones show how I put the device (sorry for my Paint "master skill") and the second ones show readings from data provided by the linear acc. sensor:
device put on left side
device lying on back
device standing
And now: why, in the third case, does the acceleration occur along the Z axis (not Y), if the device's position doesn't matter?
I finally managed to solve it! So, to get the acceleration vector in Earth's coordinate system you need to:
get the rotation matrix (a float[16] so it can be used later by the android.opengl.Matrix class) from SensorManager.getRotationMatrix() (using Sensor.TYPE_GRAVITY and Sensor.TYPE_MAGNETIC_FIELD sensor values as parameters),
use android.opengl.Matrix.invertM() on the rotation matrix to invert it (not transpose!),
use the Sensor.TYPE_LINEAR_ACCELERATION sensor to get the linear acceleration vector (in the device's coordinate system),
use android.opengl.Matrix.multiplyMV() to multiply the inverted rotation matrix by the linear acceleration vector.
And there you have it! I hope this saves some precious time for others.
Thanks to Edward Falk and Ali for the hints!
Based on @alex's answer, here is the code snippet:
private float[] gravityValues = null;
private float[] magneticValues = null;

@Override
public void onSensorChanged(SensorEvent event) {
    if ((gravityValues != null) && (magneticValues != null)
            && (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER)) {
        float[] deviceRelativeAcceleration = new float[4];
        deviceRelativeAcceleration[0] = event.values[0];
        deviceRelativeAcceleration[1] = event.values[1];
        deviceRelativeAcceleration[2] = event.values[2];
        deviceRelativeAcceleration[3] = 0;

        // Change the device-relative acceleration values to earth-relative values
        // X axis -> East
        // Y axis -> North Pole
        // Z axis -> Sky
        float[] R = new float[16], I = new float[16], earthAcc = new float[16];
        SensorManager.getRotationMatrix(R, I, gravityValues, magneticValues);

        float[] inv = new float[16];
        android.opengl.Matrix.invertM(inv, 0, R, 0);
        android.opengl.Matrix.multiplyMV(earthAcc, 0, inv, 0, deviceRelativeAcceleration, 0);
        Log.d("Acceleration", "Values: (" + earthAcc[0] + ", " + earthAcc[1] + ", " + earthAcc[2] + ")");
    } else if (event.sensor.getType() == Sensor.TYPE_GRAVITY) {
        gravityValues = event.values;
    } else if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) {
        magneticValues = event.values;
    }
}
According to the documentation you get the linear acceleration in the phone's coordinate system.
You can transform any vector from the phone's coordinate system to the Earth's coordinate system by multiplying it with the rotation matrix. You can get the rotation matrix from getRotationMatrix().
(Perhaps there already is a function doing this multiplication for you but I don't do Android programming and I am not familiar with its API.)
A nice tutorial on the rotation matrix is the Direction Cosine Matrix IMU: Theory manuscript. Good luck!
OK, first of all, if you're trying to do actual inertial navigation on Android, you've got your work cut out for you. The cheap little sensors used in smartphones are just not precise enough, although there has been some interesting work done on inertial navigation over small distances, such as inside a building. There are probably papers on the subject you can dig up. Google "Motion Interface Developers Conference" and you might find something useful; that's a conference that Invensense put on a couple of months ago.
Second, no, linear acceleration is in device coordinates, not world coordinates. You'll have to convert it yourself, which means knowing the device's 3-D orientation.
What you want to do is use a version of Android that supports the virtual sensors TYPE_GRAVITY and TYPE_LINEAR_ACCELERATION. You'll need a device with gyros to get reasonably accurate and precise readings.
Internally, the system combines gyros, accelerometers, and magnetometers in order to come up with true values for the device orientation. This effectively splits the accelerometer reading into its gravity and linear acceleration components.
So what you want to do is to set up sensor listeners for TYPE_GRAVITY, TYPE_LINEAR_ACCELERATION, and TYPE_MAGNETIC_FIELD. Use the gravity and magnetic field data as inputs to SensorManager.getRotationMatrix() in order to get the rotation matrix that will transform world coordinates into device coordinates or vice versa. In this case, you'll want the "versa" part. That is, convert the linear acceleration input to world coordinates by multiplying it by the transpose of the orientation matrix.
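A minimal sketch of that last step, assuming R is the length-9 rotation matrix filled by SensorManager.getRotationMatrix() and acc holds a TYPE_LINEAR_ACCELERATION reading (both names are placeholders):
// Multiply acc by the transpose of R: columns of R are used instead of rows.
// If the result looks wrong, try the non-transposed multiply; conventions vary.
float[] worldAcc = new float[3];
for (int i = 0; i < 3; i++) {
    worldAcc[i] = R[i] * acc[0] + R[i + 3] * acc[1] + R[i + 6] * acc[2];
}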
It's been several days since I started using this function, and I have not yet succeeded in obtaining valid results.
What I want is basically to convert the acceleration vector from the device's coordinate system to real-world coordinates. I know it is possible, because I have the acceleration in relative coordinates and I know the orientation of the device in the real-world system.
Reading the Android Developers documentation, it seems that by using getRotationMatrix() I get R, the rotation matrix.
So if I want A (the acceleration vector in the world system) from A' (the acceleration vector in the phone system), I simply have to compute:
A = R * A'
But I can't understand why the vector A ALWAYS has its first and second components zero (example: +0,00; -0,00; +6,43).
My current code is similar to this:
public void onSensorChanged(SensorEvent event) {
    synchronized (this) {
        switch (event.sensor.getType()) {
            case Sensor.TYPE_ACCELEROMETER:
                accelerometervalues = event.values.clone();
                break;
            case Sensor.TYPE_MAGNETIC_FIELD:
                geomagneticmatrix = event.values.clone();
                break;
        }
        if (geomagneticmatrix != null && accelerometervalues != null) {
            float[] Rs = new float[16];
            float[] I = new float[16];
            SensorManager.getRotationMatrix(Rs, I, accelerometervalues, geomagneticmatrix);

            float resultVec[] = new float[4];
            float relativacc[] = new float[4];
            relativacc[0] = accelerationvalues[0];
            relativacc[1] = accelerationvalues[1];
            relativacc[2] = accelerationvalues[2];
            relativacc[3] = 0;
            Matrix.multiplyMV(resultVec, 0, Rs, 0, relativacc, 0);
            // resultVec[] is the acceleration vector relative to the world coordinate system... but it doesn't WORK!!!!!
        }
    }
}
This question is very similar to Transforming accelerometer's data from device's coordinates to real world coordinates, but there I couldn't find the solution... I have tried every way.
Please help me, I need help!!!
UPDATE:
Now my code is below. I tried writing out the matrix product explicitly, but nothing changed:
float[] Rs = new float[9];
float[] I = new float[9];
SensorManager.getRotationMatrix(Rs, I, accelerationvalues, geomagneticmatrix);
float resultVec[] = new float[4];
resultVec[0]=Rs[0]*accelerationvalues[0]+Rs[1]*accelerationvalues[1]+Rs[2]*accelerationvalues[2];
resultVec[1]=Rs[3]*accelerationvalues[0]+Rs[4]*accelerationvalues[1]+Rs[5]*accelerationvalues[2];
resultVec[2]=Rs[6]*accelerationvalues[0]+Rs[7]*accelerationvalues[1]+Rs[8]*accelerationvalues[2];
Here are some examples of the data read and the results:
Rs, separated by " ": Rs[0] Rs[1] ... Rs[8]
Av, separated by " ": accelerationvalues[0] ... accelerationvalues[2]
rV, separated by " ": resultVec[0] ... resultVec[2]
As you can see, the components on the x and y axes in the real world are (around) zero even if you move the phone quickly. The relative acceleration vector, instead, correctly detects every movement!!!
SOLUTION
The errors in the numbers come from float multiplication, which is not the same as double multiplication.
Added to this is the fact that the rotation matrix isn't constant if the phone is accelerating, even if its orientation stays the same.
So it is impossible to translate the acceleration vector to absolute coordinates during motion...
It's hard, but that's the reality.
Finally I found the answer:
The errors in the numbers come from float multiplication, which is not the same as double multiplication; the solution for that is here. Added to this is the fact that the rotation matrix isn't constant if the phone is accelerating, even at the same orientation. So it is impossible to translate the acceleration vector to absolute coordinates during motion... It's hard, but that's the reality.
FYI, the orientation vector is built from the magnetometer data AND the gravity vector. This causes a cyclic problem: converting relative acceleration needs the orientation, the orientation needs the magnetic field AND gravity, but we only know gravity from relative acceleration when the phone is still... so we are back where we started.
This is confirmed in the Android Developers documentation, which explains that the rotation matrix gives true results only when the phone isn't accelerating (e.g. they mention free fall, where there is no usable gravity measurement) and when it isn't in an irregular magnetic field:
The matrices returned by this function are meaningful only when the
device is not free-falling and it is not close to the magnetic north.
If the device is accelerating, or placed into a strong magnetic field,
the returned matrices may be inaccurate.
In other words, it is completely useless...
You can verify this with a simple experiment on a table, using a sensor-readout app (such as "Android Sensor") or something similar.
You must track down this arithmetic error before you worry about rotation, acceleration or anything else.
You have confirmed that
resultVec[0]=Rs[0]*accelerationvalues[0];
gives you
Rs[0]: 0.24105562
accelerationValues[0]: 6.891896
resultVec[0]: 1.1920929E-7
So once again, simplify. Try this:
Rs[0] = 0.2f;
resultVec[0] = Rs[0] * 6.8f;
EDIT:
The last one gave resultVec[0]=1.36, so let's try this:
Rs[0] = 0.2f;
accelerationValues[0] = 6.8f;
resultVec[0] = Rs[0] * accelerationValues[0];
If you do the sums, using the printed values you have appended, I get
`(0.00112, -0.0004, 10)`
which is not as small as what you have. Therefore there is an arithmetic error!
Could the problem be that you are using accelerationvalues[] in the last block, and accelerometervalues[] later?
I have developed several applications that make use of Android sensors, so I am answering one of your questions according to my experience:
But I can't understand why the vector A ALWAYS has its first and second
components zero (example: +0,00; -0,00; +6,43)
I have observed this problem with the acceleration sensor and the magnetic field sensor too. The readings are zero for some of the axes (two, as you point out, or just one on other occasions). This problem happens right after you have enabled the sensors (registerListener()), and I assume it is related to some kind of sensor initialization.
In the case of the acceleration sensor, I have observed that just a small shake of the device makes it start giving correct readings.
The correct solution would be for the onAccuracyChanged() method to give correct information about the sensor state. It should report a status of SensorManager.SENSOR_STATUS_UNRELIABLE, but instead it permanently reports SensorManager.SENSOR_STATUS_ACCURACY_HIGH on all physical devices I have tested so far. With onAccuracyChanged() properly implemented, you could ignore bad readings, or ask the user to wait while the sensor is being initialized, as sketched below.
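For illustration, here is a sketch of how that would look if onAccuracyChanged() behaved as documented (the flag name is mine; SensorEvent also carries its own per-reading accuracy field):
private boolean accelerometerReliable = true;

@Override
public void onAccuracyChanged(Sensor sensor, int accuracy) {
    if (sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
        accelerometerReliable = (accuracy != SensorManager.SENSOR_STATUS_UNRELIABLE);
    }
}

@Override
public void onSensorChanged(SensorEvent event) {
    // Skip readings while the sensor is still initializing or unreliable
    if (!accelerometerReliable || event.accuracy == SensorManager.SENSOR_STATUS_UNRELIABLE) {
        return;
    }
    // ... use event.values here ...
}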
My Android application shows the direction of a particular place in the world, and therefore it needs to get the compass bearing in degrees.
This is the code I've been using to calculate the degrees:
public void getDirection() {
    mySensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
    List<Sensor> mySensors = mySensorManager.getSensorList(Sensor.TYPE_ORIENTATION);
    if (mySensors.size() > 0) {
        mySensorManager.registerListener(mySensorEventListener, mySensors.get(0), SensorManager.SENSOR_DELAY_UI);
    } else {
        TextView alert = (TextView) findViewById(R.id.instruct);
        alert.setText(getString(R.string.direction_not_found));
        myCompassView.setVisibility(View.INVISIBLE);
    }
}
private SensorEventListener mySensorEventListener = new SensorEventListener() {
    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        compassBearing = (float) event.values[0];
        float bearing = compassBearing - templeBearing;
        if (bearing < 0) {
            bearing = 360 + bearing;
        }
        myCompassView.updateDirection(bearing);
    }
};
This method usually works, but sometimes it just gets north wrong. What do I have to do to get a more accurate reading?
I have a couple of suggestions for you:
1) Your device may not be calibrated. To calibrate it, move it around in a figure of 8 (see this). If you don't know whether your device is calibrated, run some tests by pointing the device at known cardinal directions and comparing the values. Typically, if a device is not calibrated, you will see large variations in the azimuth value for small rotations. That is what I would be worried about.
Now, don't forget that the sensor gives you the bearing to magnetic north, not true north! The difference is known as the declination of the magnetic field, and its value changes from place to place and over time due to changes in Earth's magnetic field. This app can compute some of the values for you with reasonable accuracy. I wouldn't be too worried about this, as the declination is typically small, but you might be looking for good precision (where I live, the declination is currently 3º).
2) Stay away from metal objects and things that generate a strong magnetic field. For example, don't run tests with your phone near a computer or any physical keyboard! This is pure poison for testing compass geolocation. Some apps can measure the intensity of the magnetic field (if the device supports it); when you get close to metal stuff you will see higher values and strong changes in direction. For fun, there are also some "metal detectors": this app recognises changes in the magnetic field and vibrates when you are close to metal objects or things that magnetically interfere with the device.
3) Remember to update the bearing when you tilt your device into landscape mode (this article is a must-read!). This is because the azimuth value is based on the rotation around the axis perpendicular to the plane of the phone. When you rotate the device to landscape, this value changes by +/-90º! This is not resolved by disabling the application's landscape mode! You have to handle it programmatically by analysing rotations around the other two axes (pitch and roll). This is not trivial, but there are examples around the net; a sketch follows below.
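As a sketch of the programmatic route (the axis choices follow the usual remap recipe, but treat them as assumptions to verify on your device): remap the rotation matrix according to the current display rotation before calling getOrientation():
// Sketch: compensate the azimuth for the screen rotation.
// Assumes inR was filled by SensorManager.getRotationMatrix().
int rotation = getWindowManager().getDefaultDisplay().getRotation();
float[] outR = new float[9];
switch (rotation) {
    case Surface.ROTATION_90:
        SensorManager.remapCoordinateSystem(inR, SensorManager.AXIS_Y,
                SensorManager.AXIS_MINUS_X, outR);
        break;
    case Surface.ROTATION_270:
        SensorManager.remapCoordinateSystem(inR, SensorManager.AXIS_MINUS_Y,
                SensorManager.AXIS_X, outR);
        break;
    default:
        outR = inR; // portrait; ROTATION_180 omitted for brevity
        break;
}
float[] orientation = new float[3];
SensorManager.getOrientation(outR, orientation);
float azimuthDegrees = (float) Math.toDegrees(orientation[0]);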
Edit: If you are interested in some code, check out Mixare; it is an open-source augmented reality framework under GPL3 for Android. Take a look at their code regarding orientation, compass geolocation and bearing.
PS: I don't have any sort of connection with the creators of the mentioned applications.
Currently, I'm trying to rotate a 3D cube using orientation sensor values obtained through the getRotation() method. Some unexpected behavior occurs when the Android device is rotated beyond certain bounds. For instance, if I make the device 'stand up', the value of the 'roll' just goes crazy.
I'm also experiencing a phenomenon similar to so-called gimbal lock. The only difference is that I'm experiencing the problem even before applying the sensor values to the 3D rotation. When I try to change the 'pitch' value by rotating the device around only the 'pitch' axis, the 'yaw' value also changes along with the pitch rotation. It seems completely unreasonable to me.
Could somebody help me? I've been stuck on this problem for a month.
This is a common problem with yaw, pitch and roll. You cannot get rid of it as long as you are using yaw, pitch and roll (Euler angles). This video explains why.
I use rotation matrices instead of Euler angles in my motion sensing application. For an introduction to rotation matrices I recommend:
Direction Cosine Matrix IMU: Theory
Rotation matrices work like a charm.
Quaternions are also very popular and said to be the most stable.
[This answer was copied from here.]
Using quaternions to compute yaw/pitch/roll won't do much to solve any problem: you keep the problem of gimbal lock, which near a pitch of +/-90 can drive yaw and roll (actually yaw-roll at the north pole) crazy under slight changes or noise in the underlying quaternion.
However, if you use yaw, pitch and roll values to perform a rotation of a 3D object, the rotation shouldn't exhibit any odd behavior near the gimbal-lock position. It's just that an ambiguity in yaw and roll arises, and large variations in yaw and roll do not imply that the actual orientation is going crazy; the orientation is simply insensitive to large changes in yaw-roll near a pitch of 90.
BUT, also note that phones and browsers for HTML5 do not properly implement yaw, pitch and roll per conventions for Android. Here is a good blog for reference:
http://www.sensorplatforms.com/understanding-orientation-conventions-mobile-platforms/
Here is a basic example; it reads the acceleration vector, which equals gravity while the device is at rest. Note that you can change the sensor type and the sampling rate; more details here
SensorManager sensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
Sensor sensor = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
sensorManager.registerListener(new SensorEventListener() {
    @Override
    public void onSensorChanged(SensorEvent event) {
        float x = event.values[0];
        float y = event.values[1];
        float z = event.values[2];
        // Magnitude of the acceleration vector; ~9.81 m/s^2 when the device is at rest
        double total = Math.sqrt(x * x + y * y + z * z);
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
    }
}, sensor, SensorManager.SENSOR_DELAY_FASTEST);
Well, if you are running on a phone, quaternions are the best, and you should use them.
With rotation matrices and Euler angles you can easily run into the term "gimbal lock", which happens frequently with violent user motion.
Gimbal lock is the loss of one degree of freedom in a three-dimensional, three-gimbal mechanism that occurs when the axes of two of the three gimbals are driven into a parallel configuration, "locking" the system into rotation in a degenerate two-dimensional space.
Rotation matrices and Euler angles are fine for slow-moving robot motion.
For details on quaternion concatenation and converting a point to a new system, you can refer to the wiki link:
https://en.wikipedia.org/wiki/Quaternion
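As a concrete sketch of what "converting a point" means (my own example, not from the link): Android can hand you a unit quaternion straight from the rotation vector sensor via SensorManager.getQuaternionFromVector(), and rotating a vector v by q is v' = q v q*, which expands to the cross-product form below:
// Sketch: rotate a vector with a unit quaternion, no matrices involved.
// Assumes rv holds a TYPE_ROTATION_VECTOR reading (event.values); which frame
// you end up in depends on the sensor's convention, so verify the direction
// against getRotationMatrixFromVector().
float[] q = new float[4]; // q[0] = w, q[1..3] = x, y, z
SensorManager.getQuaternionFromVector(q, rv);

float w = q[0], x = q[1], y = q[2], z = q[3];
float[] v = {1f, 0f, 0f}; // example input vector

// t = 2 * (q_xyz cross v)
float tx = 2f * (y * v[2] - z * v[1]);
float ty = 2f * (z * v[0] - x * v[2]);
float tz = 2f * (x * v[1] - y * v[0]);

// v' = v + w * t + (q_xyz cross t)
float[] rotated = {
    v[0] + w * tx + (y * tz - z * ty),
    v[1] + w * ty + (z * tx - x * tz),
    v[2] + w * tz + (x * ty - y * tx),
};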