Android: Problems calculating the Orientation of the Device

I'm trying to build a simple Augmented Reality app, so I started working with sensor data.
According to this thread (Android compass example) and this example (http://www.codingforandroid.com/2011/01/using-orientation-sensors-simple.html), the orientation calculated using Sensor.TYPE_ACCELEROMETER and Sensor.TYPE_MAGNETIC_FIELD doesn't really fit.
So I'm not able to get "good" values. The azimuth values don't make any sense at all: if I just move the phone upwards, the value changes drastically, and even if I just rotate the phone, the values don't represent the phone's orientation.
Does anybody have an idea how to improve the value quality with the given example?

In what kind of orientation do you use this sample app? From what is written in this code, the only orientation supported is Portrait, or flat on the table, depending on the device. What do you mean by "good"?
It is normal that the values are not "good" when rotating the device: the device coordinate system is assumed to be fixed in Portrait, or flat, I don't know (Y axis vertical along the screen pointing up, Z axis pointing out of the screen from the center of the screen, X axis perpendicular to the Y axis, pointing right along the screen). Because of this, rotating the device will not rotate the device coordinate system with it; you'll have to remap it.
But if you want the heading of the device in Portrait orientation, here is a piece of code that works well for me:
@Override
public void onSensorChanged(SensorEvent event)
{
    // It is good practice to check that we received the proper sensor event
    if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR)
    {
        // Convert the rotation vector to a 4x4 rotation matrix.
        SensorManager.getRotationMatrixFromVector(mRotationMatrix, event.values);
        SensorManager.remapCoordinateSystem(mRotationMatrix,
                SensorManager.AXIS_X, SensorManager.AXIS_Z,
                mRotationMatrix);
        SensorManager.getOrientation(mRotationMatrix, orientationVals);
        // Optionally convert the result from radians to degrees
        orientationVals[0] = (float) Math.toDegrees(orientationVals[0]);
        orientationVals[1] = (float) Math.toDegrees(orientationVals[1]);
        orientationVals[2] = (float) Math.toDegrees(orientationVals[2]);
        tv.setText(" Yaw: " + orientationVals[0] + "\n Pitch: "
                + orientationVals[1] + "\n Roll (not used): "
                + orientationVals[2]);
    }
}
You'll get the heading (or azimuth) in:
orientationVals[0]
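For completeness, here is a minimal sketch of the setup this snippet assumes; the field sizes and the delay constant are my assumptions, not part of the original answer:
// Assumed fields: a 4x4 rotation matrix and a 3-element orientation buffer
private final float[] mRotationMatrix = new float[16];
private final float[] orientationVals = new float[3];

@Override
protected void onResume() {
    super.onResume();
    // Register for the fused rotation-vector sensor
    // (the enclosing Activity is assumed to implement SensorEventListener)
    SensorManager sm = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
    Sensor rv = sm.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR);
    sm.registerListener(this, rv, SensorManager.SENSOR_DELAY_UI);
}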

Tíbó's answer is good, but if you log the roll value, you will see irregular numbers.
(Roll is important for AR browsers.)
This is due to
SensorManager.remapCoordinateSystem(mRotationMatrix,
        SensorManager.AXIS_X, SensorManager.AXIS_Z,
        mRotationMatrix);
You have to use different matrices for the input and output of the remap. The following code works for me with a correct roll value:
@Override
public void onSensorChanged(SensorEvent event)
{
    // It is good practice to check that we received the proper sensor event
    if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR)
    {
        // Convert the rotation vector to a 4x4 rotation matrix.
        SensorManager.getRotationMatrixFromVector(mRotationMatrixFromVector, event.values);
        // Note: separate input and output matrices for the remap
        SensorManager.remapCoordinateSystem(mRotationMatrixFromVector,
                SensorManager.AXIS_X, SensorManager.AXIS_Z,
                mRotationMatrix);
        SensorManager.getOrientation(mRotationMatrix, orientationVals);
        // Optionally convert the result from radians to degrees
        orientationVals[0] = (float) Math.toDegrees(orientationVals[0]);
        orientationVals[1] = (float) Math.toDegrees(orientationVals[1]);
        orientationVals[2] = (float) Math.toDegrees(orientationVals[2]);
        tv.setText(" Yaw: " + orientationVals[0] + "\n Pitch: "
                + orientationVals[1] + "\n Roll: "
                + orientationVals[2]);
    }
}

Probably late to the party. Anyway, here is how I got the azimuth:
private final int sensorType = Sensor.TYPE_ROTATION_VECTOR;
float[] rotMat = new float[9];
float[] vals = new float[3];

@Override
public void onSensorChanged(SensorEvent event) {
    sensorHasChanged = false;
    if (event.sensor.getType() == sensorType) {
        SensorManager.getRotationMatrixFromVector(rotMat, event.values);
        SensorManager.remapCoordinateSystem(rotMat,
                SensorManager.AXIS_X, SensorManager.AXIS_Y,
                rotMat);
        SensorManager.getOrientation(rotMat, vals);
        azimuth = deg(vals[0]); // in degrees [-180, +180]
        pitch = deg(vals[1]);
        roll = deg(vals[2]);
        sensorHasChanged = true;
    }
}
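(deg() isn't shown in the answer; presumably it's just a radians-to-degrees helper along these lines:)
// Assumed helper: convert radians to degrees
private static float deg(float rad) {
    return (float) Math.toDegrees(rad);
}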
Hope it helps

Have you tried the combined (sensor-fusion) type Sensor.TYPE_ROTATION_VECTOR? This may give better results.
Go to https://developer.android.com/reference/android/hardware/SensorEvent.html and search for 'rotation_vector'.

Here's a Kotlin approach with all the necessary matrices included (the array sizes matter, and the answers above originally left them out):
// This is determined from the deprecated Sensor.TYPE_ORIENTATION
var lastOrientation: FloatArray = FloatArray(3)
var lastHeading: Float = 0f
var currentHeading: Float = 0f

// This is from the non-deprecated Sensor.TYPE_ROTATION_VECTOR
var lastVectorOrientation: FloatArray = FloatArray(5)
var lastVectorHeading: Float = 0f
var currentVectorHeading: Float = 0f

override fun onSensorChanged(event: SensorEvent) {
    when (event.sensor?.type) {
        null -> return
        Sensor.TYPE_ORIENTATION -> {
            lastOrientation = event.values
            lastHeading = currentHeading
            currentHeading = abs(event.values[0].roundToInt().toFloat())
        }
        Sensor.TYPE_ROTATION_VECTOR -> {
            lastVectorOrientation = event.values
            lastVectorHeading = currentVectorHeading
            val tempRotationMatrix = FloatArray(9)
            val tempOrientationMatrix = FloatArray(3)
            getRotationMatrixFromVector(tempRotationMatrix, event.values)
            remapCoordinateSystem(tempRotationMatrix, AXIS_X, AXIS_Z, tempRotationMatrix)
            getOrientation(tempRotationMatrix, tempOrientationMatrix)
            currentVectorHeading = Math.toDegrees(tempOrientationMatrix[0].toDouble()).toFloat()
            if (currentVectorHeading < 0) {
                // heading = 360 - abs(negative heading), which is really 360 + (-heading)
                currentVectorHeading += 360f
            }
        }
        else -> return
    }
}
I've also included the deprecated Sensor.TYPE_ORIENTATION for anybody wanting to see the difference between the two approaches. There is a several-degree difference when using the deprecated method vs the updated approach.

Related

Translating Accelerometer Vectors from Device to Earth Coordinates

I have scoured the internet and StackOverflow to find a way to translate Android accelerometer vectors from the device coordinate system to the Earth coordinate system. I am using Sensor.TYPE_ROTATION_VECTOR to do this. I am accessing the sensors using NativeScript and using MathJS to do the matrix computations.
1. I grab the rotation vector using Sensor.TYPE_ROTATION_VECTOR.
2. I calculate the rotation matrix following the Android code for getRotationMatrixFromVector() given at https://android.googlesource.com/platform/frameworks/base/+/master/core/java/android/hardware/SensorManager.java.
2.1 I calculate q0, with q1 = acceleration.x, q2 = acceleration.y, q3 = acceleration.z.
2.2 I tried calculating the rotation matrix using both the float[9] and float[16] matrix sizes, but I can't get either of them to work.
2.3 I tried inverting and transposing both these rotation matrices.
3. I then multiply the rotation matrix (tried normal, inverted, and transposed) by the 4x1 matrix [[accel.x],[accel.y],[accel.z],[0.0]].
When I look at my newly translated acceleration values, they aren't rotated at all. If I point my device's x-axis toward the Earth's north and accelerate it in this direction, my acceleration.x will be >1 and the other values almost 0; here my acceleration.y should be the one translated to >1. Similarly, if I point my device's y-axis toward the sky and accelerate it in this direction, my acceleration.y reads >1 when my acceleration.z should read >1.
Can anyone tell me what I am doing wrong in these steps? The consensus seems to be: grab the rotation vector, transform it to a rotation matrix, invert the matrix, then multiply this inverted matrix by the accelerometer vector. It doesn't work for me. Thanks.
var r;
var rinv;
accelerometer.startAccelerometerUpdates(function (accel) {
    if (accel.sensortype == 11) {
        var q0 = Math.sqrt(1 - accel.x*accel.x - accel.y*accel.y - accel.z*accel.z);
        var q1 = accel.x;
        var q2 = accel.y;
        var q3 = accel.z;
        // calculate rotation matrix from unit quaternion
        var sq1 = 2*q1*q1;
        var sq2 = 2*q2*q2;
        var sq3 = 2*q3*q3;
        var q1q2 = 2*q1*q2;
        var q3q0 = 2*q3*q0;
        var q1q3 = 2*q1*q3;
        var q2q0 = 2*q2*q0;
        var q2q3 = 2*q2*q3;
        var q1q0 = 2*q1*q0;
        //r = math.matrix([[1-sq2-sq3,q1q2-q3q0,q1q3+q2q0],[q1q2+q3q0,1-sq1-sq3,q2q3-q1q0],[q1q3-q2q0,q2q3+q1q0,1-sq1-sq2]]);
        r = math.matrix([[1-sq2-sq3,q1q2-q3q0,q1q3+q2q0,0.0],[q1q2+q3q0,1-sq1-sq3,q2q3-q1q0,0.0],[q1q3-q2q0,q2q3+q1q0,1-sq1-sq2,0.0],[0.0,0.0,0.0,1.0]]);
        rinv = math.inv(r);
    }
    if (accel.sensortype == 10) {
        // filter accelerometer errors
        if (Math.abs(accel.x) < 10 || Math.abs(accel.y) < 10 || Math.abs(accel.z) < 10) {
            if ((Math.abs(accel.x) > .15 && Math.abs(accel.x)/oldAX < 2) || (Math.abs(accel.y) > .15 && Math.abs(accel.y)/oldAY < 2) || (Math.abs(accel.z) > .15 && Math.abs(accel.z)/oldAZ < 2)) {
                var Ad = math.matrix([[accel.x],[accel.y],[accel.z],[0.0]]);
                // transform acceleration values from device coordinates to Earth coordinates
                var Ag = math.multiply(rinv, Ad);
                var ax = Ag.get([0,0]);
                var ay = Ag.get([1,0]);
                var az = Ag.get([2,0]);
                if (ax > 1 || ay > 1 || az > 1) { // only show large values for easier analysis
                    page.getViewById("rotationLabel").text = "Earth Axes Acceleration \n x: " + ax + "\ny: " + ay + "\nz: " + az;
                }
            }
        }
    }
});
I expect the device's accelerometer values to be translated into the Earth's coordinate system, as described in the rotation vector portion of this page: https://developer.android.com/guide/topics/sensors/sensors_motion
The NativeScript-Accelerometer-Advanced plugin is coded incorrectly. Calculating the unit quaternion value q0 is not necessary, as it is provided by the sensor. Dividing by gravity (9.81 m/s^2) is not necessary, as it is not included in the sensor values. I fixed this in my local copy, and my rotation method now works correctly:
1. Get the rotation vector.
2. Calculate the rotation matrix.
3. Invert the rotation matrix.
4. Multiply the inverted rotation matrix by the accelerometer vector with [1.0] appended as the 4th row.
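For anyone doing the same thing directly on Android, here is a minimal Java sketch of those four steps using SensorManager and android.opengl.Matrix; the field names are my own illustration, and since the rotation matrix carries no translation, the value of the appended 4th component makes no difference here:
// Illustrative field: latest inverted rotation matrix, or null until the first rotation event
private float[] invRotation = null;

@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR) {
        // Steps 1-3: rotation vector -> 4x4 rotation matrix -> inverted matrix
        float[] r = new float[16];
        SensorManager.getRotationMatrixFromVector(r, event.values);
        float[] inv = new float[16];
        android.opengl.Matrix.invertM(inv, 0, r, 0);
        invRotation = inv;
    } else if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER && invRotation != null) {
        // Step 4: append a 4th component and rotate into Earth coordinates
        float[] deviceAccel = { event.values[0], event.values[1], event.values[2], 1.0f };
        float[] earthAccel = new float[4];
        android.opengl.Matrix.multiplyMV(earthAccel, 0, invRotation, 0, deviceAccel, 0);
    }
}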

Using a rotation matrix to convert Euler units to 360 degrees

In Android, I am using the accelerometer and magnetic field sensor to calculate spatial positioning, as shown in the code below. The getRotationMatrix method generates values that are in real-world units, with azimuth, pitch and roll. Azimuth and roll give values in the range 0 to 180 or 0 to -180. Pitch, however, gives values from 0 to 90 or 0 to -90. That's a problem for my app, because I need to determine unique locations regardless of how the device is oriented. With roll, you can have 2 locations with the same value.
I need to apply a matrix transformation that remaps the sensor values to values that range from 0 to 360 degrees (actually, 360 wouldn't be valid since it's the same as 0, and anything close to 360 would result in a number like 359.99999...).
I am not a mathematician and don't know how to use matrices, let alone use them in Android, but I am aware that this is what is required to get the 0 to 360 degree conversion. It would be nice if the matrix also took care of the azimuth and roll so that they also produce values from 0 to 360, but if that isn't possible, that's fine, since unique positions can still be derived from their sensor values. Any suggestions on how I create this matrix transformation?
@Override
public void onSensorChanged(SensorEvent event)
{
    try
    {
        if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER)
            accelerometerValues = event.values;
        else if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD)
            magneticFieldValues = event.values;

        if ((accelerometerValues != null) && (magneticFieldValues != null))
        {
            float[] R = new float[9];
            float[] I = new float[9];
            boolean success = SensorManager.getRotationMatrix(R, I, accelerometerValues, magneticFieldValues);
            if (success)
            {
                float[] values = new float[3];
                SensorManager.getOrientation(R, values);
                // Convert from radians to degrees if preferred.
                values[0] = (float) Math.toDegrees(values[0]); // Azimuth
                values[1] = (float) Math.toDegrees(values[1]); // Pitch
                values[2] = (float) Math.toDegrees(values[2]); // Roll
            }
        }
    }
    catch (Exception ex)
    {
    }
}
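As an aside: if the goal is only to bring azimuth and roll from [-180, 180] into [0, 360), a simple normalization after getOrientation is enough; for example, inside the success block above:
// Map a degree value from [-180, 180] to [0, 360)
values[0] = (values[0] + 360f) % 360f;
That said, this does not solve the pitch ambiguity described in the EDIT below.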
EDIT:
The raw event values for the pitch do not give you unique values as you rotate the device through 360 degrees, so I highly doubt any matrix transformation is going to produce the results I am after. Maybe I am using the wrong sensors.

Using Android gyroscope instead of accelerometer. I find lots of bits and pieces, but no complete code

The Sensor Fusion video looks great, but there's no code:
http://www.youtube.com/watch?v=C7JQ7Rpwn2k&feature=player_detailpage#t=1315s
Here is my code, which just uses the accelerometer and compass. I also use a Kalman filter on the 3 orientation values, but that's too much code to show here. Ultimately, this works OK, but the result is either too jittery or too laggy depending on what I do with the results and how low I make the filtering factors.
/** Just accelerometer and magnetic sensors */
public abstract class SensorsListener2
        implements SensorEventListener
{
    /** The lower this is, the greater the preference which is given to previous values. (slows change) */
    private static final float accelFilteringFactor = 0.1f;
    private static final float magFilteringFactor = 0.01f;

    public abstract boolean getIsLandscape();

    @Override
    public void onSensorChanged(SensorEvent event) {
        Sensor sensor = event.sensor;
        int type = sensor.getType();
        switch (type) {
            case Sensor.TYPE_MAGNETIC_FIELD:
                mags[0] = event.values[0] * magFilteringFactor + mags[0] * (1.0f - magFilteringFactor);
                mags[1] = event.values[1] * magFilteringFactor + mags[1] * (1.0f - magFilteringFactor);
                mags[2] = event.values[2] * magFilteringFactor + mags[2] * (1.0f - magFilteringFactor);
                isReady = true;
                break;
            case Sensor.TYPE_ACCELEROMETER:
                accels[0] = event.values[0] * accelFilteringFactor + accels[0] * (1.0f - accelFilteringFactor);
                accels[1] = event.values[1] * accelFilteringFactor + accels[1] * (1.0f - accelFilteringFactor);
                accels[2] = event.values[2] * accelFilteringFactor + accels[2] * (1.0f - accelFilteringFactor);
                break;
            default:
                return;
        }

        if (mags != null && accels != null && isReady) {
            isReady = false;
            SensorManager.getRotationMatrix(rot, inclination, accels, mags);
            boolean isLandscape = getIsLandscape();
            if (isLandscape) {
                outR = rot;
            } else {
                // Remap the coordinates to work in portrait mode.
                SensorManager.remapCoordinateSystem(rot, SensorManager.AXIS_X, SensorManager.AXIS_Z, outR);
            }
            SensorManager.getOrientation(outR, values);
            double x180pi = 180.0 / Math.PI;
            float azimuth = (float) (values[0] * x180pi);
            float pitch = (float) (values[1] * x180pi);
            float roll = (float) (values[2] * x180pi);
            // In landscape mode swap pitch and roll and invert the pitch.
            if (isLandscape) {
                float tmp = pitch;
                pitch = -roll;
                roll = -tmp;
                azimuth = 180 - azimuth;
            } else {
                pitch = -pitch - 90;
                azimuth = 90 - azimuth;
            }
            onOrientationChanged(azimuth, pitch, roll);
        }
    }

    private float[] mags = new float[3];
    private float[] accels = new float[3];
    private boolean isReady;
    private float[] rot = new float[9];
    private float[] outR = new float[9];
    private float[] inclination = new float[9];
    private float[] values = new float[3];

    /**
     Azimuth: angle between the magnetic north direction and the Y axis, around the Z axis (0 to 359). 0=North, 90=East, 180=South, 270=West
     Pitch: rotation around X axis (-180 to 180), with positive values when the z-axis moves toward the y-axis.
     Roll: rotation around Y axis (-90 to 90), with positive values when the x-axis moves toward the z-axis.
     */
    public abstract void onOrientationChanged(float azimuth, float pitch, float roll);
}
I tried to figure out how to add gyroscope data, but I am just not doing it right. The Google doc at http://developer.android.com/reference/android/hardware/SensorEvent.html shows some code to get a delta matrix from the gyroscope data. The idea seems to be that I'd crank down the filters for the accelerometer and magnetic sensors so that they are really stable. That would keep track of the long-term orientation.
Then, I'd keep a history of the most recent N delta matrices from the gyroscope. Each time I get a new one, I'd drop off the oldest one and multiply them all together to get a final matrix, which I'd multiply against the stable matrix returned by the accelerometer and magnetic sensors.
This doesn't seem to work. Or, at least, my implementation of it does not work. The result is far more jittery than just the accelerometer. Increasing the size of the gyroscope history actually increases the jitter, which makes me think that I'm not calculating the right values from the gyroscope.
public abstract class SensorsListener3
        implements SensorEventListener
{
    /** The lower this is, the greater the preference which is given to previous values. (slows change) */
    private static final float kFilteringFactor = 0.001f;
    private static final float magKFilteringFactor = 0.001f;

    public abstract boolean getIsLandscape();

    @Override
    public void onSensorChanged(SensorEvent event) {
        Sensor sensor = event.sensor;
        int type = sensor.getType();
        switch (type) {
            case Sensor.TYPE_MAGNETIC_FIELD:
                mags[0] = event.values[0] * magKFilteringFactor + mags[0] * (1.0f - magKFilteringFactor);
                mags[1] = event.values[1] * magKFilteringFactor + mags[1] * (1.0f - magKFilteringFactor);
                mags[2] = event.values[2] * magKFilteringFactor + mags[2] * (1.0f - magKFilteringFactor);
                isReady = true;
                break;
            case Sensor.TYPE_ACCELEROMETER:
                accels[0] = event.values[0] * kFilteringFactor + accels[0] * (1.0f - kFilteringFactor);
                accels[1] = event.values[1] * kFilteringFactor + accels[1] * (1.0f - kFilteringFactor);
                accels[2] = event.values[2] * kFilteringFactor + accels[2] * (1.0f - kFilteringFactor);
                break;
            case Sensor.TYPE_GYROSCOPE:
                gyroscopeSensorChanged(event);
                break;
            default:
                return;
        }

        if (mags != null && accels != null && isReady) {
            isReady = false;
            SensorManager.getRotationMatrix(rot, inclination, accels, mags);
            boolean isLandscape = getIsLandscape();
            if (isLandscape) {
                outR = rot;
            } else {
                // Remap the coordinates to work in portrait mode.
                SensorManager.remapCoordinateSystem(rot, SensorManager.AXIS_X, SensorManager.AXIS_Z, outR);
            }
            if (gyroUpdateTime != 0) {
                matrixHistory.mult(matrixTmp, matrixResult);
                outR = matrixResult;
            }
            SensorManager.getOrientation(outR, values);
            double x180pi = 180.0 / Math.PI;
            float azimuth = (float) (values[0] * x180pi);
            float pitch = (float) (values[1] * x180pi);
            float roll = (float) (values[2] * x180pi);
            // In landscape mode swap pitch and roll and invert the pitch.
            if (isLandscape) {
                float tmp = pitch;
                pitch = -roll;
                roll = -tmp;
                azimuth = 180 - azimuth;
            } else {
                pitch = -pitch - 90;
                azimuth = 90 - azimuth;
            }
            onOrientationChanged(azimuth, pitch, roll);
        }
    }

    private void gyroscopeSensorChanged(SensorEvent event) {
        // This timestep's delta rotation to be multiplied by the current rotation
        // after computing it from the gyro sample data.
        if (gyroUpdateTime != 0) {
            final float dT = (event.timestamp - gyroUpdateTime) * NS2S;
            // Axis of the rotation sample, not normalized yet.
            float axisX = event.values[0];
            float axisY = event.values[1];
            float axisZ = event.values[2];
            // Calculate the angular speed of the sample
            float omegaMagnitude = (float) Math.sqrt(axisX*axisX + axisY*axisY + axisZ*axisZ);
            // Normalize the rotation vector if it's big enough to get the axis
            if (omegaMagnitude > EPSILON) {
                axisX /= omegaMagnitude;
                axisY /= omegaMagnitude;
                axisZ /= omegaMagnitude;
            }
            // Integrate around this axis with the angular speed by the timestep
            // in order to get a delta rotation from this sample over the timestep.
            // We will convert this axis-angle representation of the delta rotation
            // into a quaternion before turning it into the rotation matrix.
            float thetaOverTwo = omegaMagnitude * dT / 2.0f;
            float sinThetaOverTwo = (float) Math.sin(thetaOverTwo);
            float cosThetaOverTwo = (float) Math.cos(thetaOverTwo);
            deltaRotationVector[0] = sinThetaOverTwo * axisX;
            deltaRotationVector[1] = sinThetaOverTwo * axisY;
            deltaRotationVector[2] = sinThetaOverTwo * axisZ;
            deltaRotationVector[3] = cosThetaOverTwo;
        }
        gyroUpdateTime = event.timestamp;
        SensorManager.getRotationMatrixFromVector(deltaRotationMatrix, deltaRotationVector);
        // User code should concatenate the delta rotation we computed with the current rotation
        // in order to get the updated rotation.
        // rotationCurrent = rotationCurrent * deltaRotationMatrix;
        matrixHistory.add(deltaRotationMatrix);
    }

    private float[] mags = new float[3];
    private float[] accels = new float[3];
    private boolean isReady;
    private float[] rot = new float[9];
    private float[] outR = new float[9];
    private float[] inclination = new float[9];
    private float[] values = new float[3];

    // gyroscope stuff
    private long gyroUpdateTime = 0;
    private static final float NS2S = 1.0f / 1000000000.0f;
    private float[] deltaRotationMatrix = new float[9];
    private final float[] deltaRotationVector = new float[4];
    // TODO: I have no idea how small this value should be.
    private static final float EPSILON = 0.000001f;
    private float[] matrixMult = new float[9];
    private MatrixHistory matrixHistory = new MatrixHistory(100);
    private float[] matrixTmp = new float[9];
    private float[] matrixResult = new float[9];

    /**
     Azimuth: angle between the magnetic north direction and the Y axis, around the Z axis (0 to 359). 0=North, 90=East, 180=South, 270=West
     Pitch: rotation around X axis (-180 to 180), with positive values when the z-axis moves toward the y-axis.
     Roll: rotation around Y axis (-90 to 90), with positive values when the x-axis moves toward the z-axis.
     */
    public abstract void onOrientationChanged(float azimuth, float pitch, float roll);
}

public class MatrixHistory
{
    public MatrixHistory(int size) {
        vals = new float[size][];
    }

    public void add(float[] val) {
        synchronized (vals) {
            vals[ix] = val;
            ix = (ix + 1) % vals.length;
            if (ix == 0)
                full = true;
        }
    }

    public void mult(float[] tmp, float[] output) {
        synchronized (vals) {
            int count = full ? vals.length : ix;
            if (count == 0)
                return;
            for (int i = 0; i < count; ++i) {
                if (i == 0) {
                    System.arraycopy(vals[i], 0, output, 0, vals[i].length);
                } else {
                    MathUtils.multiplyMatrix3x3(output, vals[i], tmp);
                    System.arraycopy(tmp, 0, output, 0, tmp.length);
                }
            }
        }
    }

    private int ix = 0;
    private boolean full = false;
    private float[][] vals;
}
The second block of code contains my changes to the first block, which add the gyroscope to the mix.
Specifically, the filtering factor for the accelerometer is made smaller (making the value more stable). The MatrixHistory class keeps track of the last 100 gyroscope deltaRotationMatrix values, which are calculated in the gyroscopeSensorChanged method; a simpler running-matrix alternative is sketched just below.
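(For reference: the commented line rotationCurrent = rotationCurrent * deltaRotationMatrix above suggests a simpler bookkeeping than MatrixHistory: keep one running gyro rotation and fold each new delta into it, instead of re-multiplying the whole history on every event. A minimal sketch, assuming the same MathUtils.multiplyMatrix3x3 helper that MatrixHistory uses:)
// Hypothetical alternative to MatrixHistory: one running gyro rotation.
private final float[] gyroRotation = new float[] { 1, 0, 0, 0, 1, 0, 0, 0, 1 }; // identity
private final float[] gyroTmp = new float[9];

private void applyGyroDelta(float[] deltaRotationMatrix) {
    // rotationCurrent = rotationCurrent * deltaRotationMatrix
    MathUtils.multiplyMatrix3x3(gyroRotation, deltaRotationMatrix, gyroTmp);
    System.arraycopy(gyroTmp, 0, gyroRotation, 0, 9);
}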
I've seen many questions on this site on this topic. They've helped me get to this point, but I cannot figure out what to do next. I really wish the Sensor Fusion guy had just posted some code somewhere. He obviously had it all put together.
Well, +1 to you for even knowing what a Kalman filter is. If you'd like, I'll edit this post and give you the code I wrote a couple years ago to do what you're trying to do.
But first, I'll tell you why you don't need it.
Modern implementations of the Android sensor stack use Sensor Fusion, as Stan mentioned above. This just means that all of the available data -- accel, mag, gyro -- is collected together in one algorithm, and then all the outputs are read back out in the form of Android sensors.
Edit: I just stumbled on this superb Google Tech Talk on the subject: Sensor Fusion on Android Devices: A Revolution in Motion Processing. Well worth the 45 minutes to watch it if you're interested in the topic.
In essence, Sensor Fusion is a black box. I've looked into the source code of the Android implementation, and it's a big Kalman filter written in C++. Some pretty good code in there, and far more sophisticated than any filter I ever wrote, and probably more sophisticated than what you're writing. Remember, these guys are doing this for a living.
I also know that at least one chipset manufacturer has their own sensor fusion implementation. The manufacturer of the device then chooses between the Android and the vendor implementation based on their own criteria.
Finally, as Stan mentioned above, Invensense has their own sensor fusion implementation at the chip level.
Anyway, what it all boils down to is that the built-in sensor fusion in your device is likely to be superior to anything you or I could cobble together. So what you really want to do is to access that.
In Android, there are both physical and virtual sensors. The virtual sensors are the ones that are synthesized from the available physical sensors. The best-known example is TYPE_ORIENTATION which takes accelerometer and magnetometer and creates roll/pitch/heading output. (By the way, you should not use this sensor; it has too many limitations.)
But the important thing is that newer versions of Android contain these two new virtual sensors:
TYPE_GRAVITY is the accelerometer input with the effect of motion filtered out
TYPE_LINEAR_ACCELERATION is the accelerometer with the gravity component filtered out.
These two virtual sensors are synthesized through a combination of accelerometer input and gyro input.
Another notable sensor is TYPE_ROTATION_VECTOR which is a Quaternion synthesized from accelerometer, magnetometer, and gyro. It represents the full 3-d orientation of the device with the effects of linear acceleration filtered out.
However, Quaternions are a little bit abstract for most people, and since you're likely working with 3-d transformations anyway, your best approach is to combine TYPE_GRAVITY and TYPE_MAGNETIC_FIELD via SensorManager.getRotationMatrix().
One more point: if you're working with a device running an older version of Android, you need to detect that you're not receiving TYPE_GRAVITY events and use TYPE_ACCELEROMETER instead. Theoretically, this would be a place to use your own Kalman filter, but if your device doesn't have sensor fusion built in, it probably doesn't have gyros either.
Anyway, here's some sample code to show how I do it.
// Requires 1.5 or above
class Foo extends Activity implements SensorEventListener {
    SensorManager sensorManager;
    float[] gData = new float[3]; // Gravity or accelerometer
    float[] mData = new float[3]; // Magnetometer
    float[] orientation = new float[3];
    float[] Rmat = new float[9];
    float[] R2 = new float[9];
    float[] Imat = new float[9];
    boolean haveGrav = false;
    boolean haveAccel = false;
    boolean haveMag = false;

    onCreate() {
        // Get the sensor manager from system services
        sensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
    }

    onResume() {
        super.onResume();
        // Register our listeners
        Sensor gsensor = sensorManager.getDefaultSensor(Sensor.TYPE_GRAVITY);
        Sensor asensor = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
        Sensor msensor = sensorManager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD);
        sensorManager.registerListener(this, gsensor, SensorManager.SENSOR_DELAY_GAME);
        sensorManager.registerListener(this, asensor, SensorManager.SENSOR_DELAY_GAME);
        sensorManager.registerListener(this, msensor, SensorManager.SENSOR_DELAY_GAME);
    }

    public void onSensorChanged(SensorEvent event) {
        switch (event.sensor.getType()) {
            case Sensor.TYPE_GRAVITY:
                gData[0] = event.values[0];
                gData[1] = event.values[1];
                gData[2] = event.values[2];
                haveGrav = true;
                break;
            case Sensor.TYPE_ACCELEROMETER:
                if (haveGrav) break; // don't need it, we have better
                gData[0] = event.values[0];
                gData[1] = event.values[1];
                gData[2] = event.values[2];
                haveAccel = true;
                break;
            case Sensor.TYPE_MAGNETIC_FIELD:
                mData[0] = event.values[0];
                mData[1] = event.values[1];
                mData[2] = event.values[2];
                haveMag = true;
                break;
            default:
                return;
        }

        if ((haveGrav || haveAccel) && haveMag) {
            SensorManager.getRotationMatrix(Rmat, Imat, gData, mData);
            SensorManager.remapCoordinateSystem(Rmat,
                    SensorManager.AXIS_Y, SensorManager.AXIS_MINUS_X, R2);
            // Orientation isn't as useful as a rotation matrix, but
            // we'll show it here anyway.
            SensorManager.getOrientation(R2, orientation);
            float incl = SensorManager.getInclination(Imat);
            Log.d(TAG, "mh: " + (int) (orientation[0] * DEG));
            Log.d(TAG, "pitch: " + (int) (orientation[1] * DEG));
            Log.d(TAG, "roll: " + (int) (orientation[2] * DEG));
            Log.d(TAG, "yaw: " + (int) (orientation[0] * DEG));
            Log.d(TAG, "inclination: " + (int) (incl * DEG));
        }
    }
}
Hmmm; if you happen to have a Quaternion library handy, it's probably simpler just to receive TYPE_ROTATION_VECTOR and convert that to an array.
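For example, a minimal sketch of that shortcut; SensorManager.getQuaternionFromVector stores the quaternion as w, x, y, z:
// In onSensorChanged, for a Sensor.TYPE_ROTATION_VECTOR event:
float[] q = new float[4];
SensorManager.getQuaternionFromVector(q, event.values);
// q[0] = w, q[1..3] = x, y, z; hand this to your quaternion library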
To the question of where to find complete code, here's the default implementation from Android Jelly Bean: https://android.googlesource.com/platform/frameworks/base/+/jb-release/services/sensorservice/
Start by checking fusion.cpp/h.
It uses Modified Rodrigues Parameters (close to Euler angles) instead of quaternions. In addition to orientation, the Kalman filter estimates gyro drift. For measurement updates it uses the magnetometer and, a bit incorrectly, acceleration (specific force).
To make use of the code you should either be a wizard or know the basics of INS and KF. Many parameters have to be fine-tuned for the filter to work. As Edward adequately put it, these guys are doing this for a living.
At least in Google's Galaxy Nexus this default implementation is left unused and is overridden by Invensense's proprietary system.

Transforming accelerometer data from device coordinates to real-world coordinates

I'm really sorry if this is a very basic question, but I have no choice but to ask it: how do you translate accelerometer data from device coordinates to real-world coordinates?
I mean, assuming that the accelerometer is giving me something like (Ax, Ay, Az) in device coordinates, what transformations should I apply to obtain (Ax', Ay', Az') in real-world coordinates, so I can use the acceleration vector in real-world coordinates to calculate whether the device is accelerating north, east, south-west, etc.?
I have been working on this issue for the past few days. At first I thought it shouldn't be difficult, but after searching dozens of pages I haven't come up with anything functional.
By the way, here is some code showing what I've implemented so far:
private SensorEventListener mSensorEventListener = new SensorEventListener() {
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
    }

    public void onSensorChanged(SensorEvent event) {
        switch (event.sensor.getType()) {
            case Sensor.TYPE_ACCELEROMETER:
                accelerometervalues = event.values.clone();
                AX.setText(accelerometervalues[0] + "");
                AY.setText(accelerometervalues[1] + "");
                AZ.setText(accelerometervalues[2] + "");
                break;
            case Sensor.TYPE_ORIENTATION:
                orientationvalues = event.values.clone();
                azimuth.setText(orientationvalues[0] + "");
                pitch.setText(orientationvalues[1] + "");
                roll.setText(orientationvalues[2] + "");
                break;
            case Sensor.TYPE_MAGNETIC_FIELD:
                geomagneticmatrix = event.values.clone();
                TAX.setText(geomagneticmatrix[0] + "");
                TAY.setText(geomagneticmatrix[1] + "");
                TAZ.setText(geomagneticmatrix[2] + "");
                break;
        }
        if (geomagneticmatrix != null && accelerometervalues != null) {
            float[] R = new float[16];
            float[] I = new float[16];
            SensorManager.getRotationMatrix(R, I, accelerometervalues, geomagneticmatrix);
            // What should I do here to transform the components of accelerometervalues into real world acceleration components??
        }
    }
};
I have:
A vector of accelerations in native coordinates in accelerometervalues.
A vector of magnetic field values in geomagneticmatrix.
Azimuth, pitch and roll in orientationvalues.
The rotation matrix R.
The inclination matrix I.
I think all the necessary information is there: azimuth, pitch and roll should describe the displacement of the device's coordinate system relative to the real-world coordinate system, and I believe R can even be used as a true-north vector inside the device's coordinates.
It seems to me that obtaining the acceleration values in the real world is just a mathematical transformation away from those data. I just can't figure it out.
Thanks in advance.
Edited:
I have tried directly multiplying the components of accelerometervalues with the rotation matrix R (trueaccel = accel * R), but it didn't work:
trueacceleration[0]= accelerometervalues[0]*R[0]+accelerometervalues[1]*R[1]+accelerometervalues[2]*R[2];
trueacceleration[1]= accelerometervalues[0]*R[1]+accelerometervalues[1]*R[4]+accelerometervalues[2]*R[7];
trueacceleration[2]= accelerometervalues[0]*R[2]+accelerometervalues[1]*R[5]+accelerometervalues[2]*R[8];
I have also tried multiplying accelerometervalues with the inclination matrix I, and multiplying with both R and I (trueaccel = accel * R * I); that didn't work either. Neither does calling remapCoordinateSystem() and then multiplying in any of the previous forms.
Does anybody have an idea of what I am doing wrong?
OK, I have worked this out mathematically myself, so please bear with me.
If you want to translate an acceleration vector accelerometervalues into an acceleration vector trueacceleration expressed in real-world coordinates, once you have azimuth, pitch and roll stored in an orientationvalues vector, just do the following:
trueacceleration[0] =(float) (accelerometervalues[0]*(Math.cos(orientationvalues[2])*Math.cos(orientationvalues[0])+Math.sin(orientationvalues[2])*Math.sin(orientationvalues[1])*Math.sin(orientationvalues[0])) + accelerometervalues[1]*(Math.cos(orientationvalues[1])*Math.sin(orientationvalues[0])) + accelerometervalues[2]*(-Math.sin(orientationvalues[2])*Math.cos(orientationvalues[0])+Math.cos(orientationvalues[2])*Math.sin(orientationvalues[1])*Math.sin(orientationvalues[0])));
trueacceleration[1] = (float) (accelerometervalues[0]*(-Math.cos(orientationvalues[2])*Math.sin(orientationvalues[0])+Math.sin(orientationvalues[2])*Math.sin(orientationvalues[1])*Math.cos(orientationvalues[0])) + accelerometervalues[1]*(Math.cos(orientationvalues[1])*Math.cos(orientationvalues[0])) + accelerometervalues[2]*(Math.sin(orientationvalues[2])*Math.sin(orientationvalues[0])+ Math.cos(orientationvalues[2])*Math.sin(orientationvalues[1])*Math.cos(orientationvalues[0])));
trueacceleration[2] = (float) (accelerometervalues[0]*(Math.sin(orientationvalues[2])*Math.cos(orientationvalues[1])) + accelerometervalues[1]*(-Math.sin(orientationvalues[1])) + accelerometervalues[2]*(Math.cos(orientationvalues[2])*Math.cos(orientationvalues[1])));
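One caveat, as a hedge: Sensor.TYPE_ORIENTATION reports azimuth, pitch and roll in degrees, while Math.sin and Math.cos expect radians, so depending on where orientationvalues comes from, a conversion may be needed first:
// TYPE_ORIENTATION values are degrees; Java trig functions expect radians
double azimuthRad = Math.toRadians(orientationvalues[0]);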
Try this; it's working for me:
private float[] gravityValues = null;
private float[] magneticValues = null;
private SensorManager mSensorManager = null;

private void registerSensorListener(Context context) {
    mSensorManager = (SensorManager) context.getSystemService(SENSOR_SERVICE);
    mSensorManager.registerListener(this,
            mSensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),
            SensorManager.SENSOR_DELAY_FASTEST);
    mSensorManager.registerListener(this,
            mSensorManager.getDefaultSensor(Sensor.TYPE_GYROSCOPE),
            SensorManager.SENSOR_DELAY_FASTEST);
    mSensorManager.registerListener(this,
            mSensorManager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD),
            SensorManager.SENSOR_DELAY_FASTEST);
    mSensorManager.registerListener(this,
            mSensorManager.getDefaultSensor(Sensor.TYPE_GRAVITY),
            SensorManager.SENSOR_DELAY_FASTEST);
}

@Override
public void onSensorChanged(SensorEvent event) {
    if ((gravityValues != null) && (magneticValues != null)
            && (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER)) {
        float[] deviceRelativeAcceleration = new float[4];
        deviceRelativeAcceleration[0] = event.values[0];
        deviceRelativeAcceleration[1] = event.values[1];
        deviceRelativeAcceleration[2] = event.values[2];
        deviceRelativeAcceleration[3] = 0;
        Log.d("Raw Acceleration::", "Values: (" + event.values[0] + ", " + event.values[1] + ", " + event.values[2] + ")");

        // Change the device relative acceleration values to earth relative values
        // X axis -> East
        // Y axis -> North Pole
        // Z axis -> Sky
        float[] R = new float[16], I = new float[16], earthAcc = new float[16];
        SensorManager.getRotationMatrix(R, I, gravityValues, magneticValues);
        float[] inv = new float[16];
        android.opengl.Matrix.invertM(inv, 0, R, 0);
        android.opengl.Matrix.multiplyMV(earthAcc, 0, inv, 0, deviceRelativeAcceleration, 0);
        Log.d("Earth Acceleration", "Values: (" + earthAcc[0] + ", " + earthAcc[1] + ", " + earthAcc[2] + ")");
    } else if (event.sensor.getType() == Sensor.TYPE_GRAVITY) {
        gravityValues = event.values;
    } else if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) {
        magneticValues = event.values;
    }
}
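A side note on the invertM call above: a rotation matrix is orthonormal, so its inverse is just its transpose, and the same result can be had a bit more cheaply (a small sketch, assuming the same 4x4 R as above):
float[] inv = new float[16];
android.opengl.Matrix.transposeM(inv, 0, R, 0); // for a pure rotation, transpose == inverse
android.opengl.Matrix.multiplyMV(earthAcc, 0, inv, 0, deviceRelativeAcceleration, 0);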
You need to know the reference coordinate system that gives you the orientation of your device within 'real' world coordinates. Without that information, it looks impossible to transform your data into anything useful.
For example, does your device have a type of 'directional' sensor that would help make sense of the accelerometer data (a gyro and compass, for example)?
I am dealing with the same problem. What you can do, since you have the R[] matrix, is multiply it by your acceleration vector, and voilà:
float[] resultVec = new float[4];
Matrix.multiplyMV(resultVec, 0, R, 0, accelerometervalues, 0);
PS: accelerometervalues must be a 4-field vector; just add 0 in the last field.
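One caveat, hedged: android.opengl.Matrix.multiplyMV expects a length-16 (4x4) matrix and length-4 vectors, so R has to be requested as a float[16] from getRotationMatrix for this to line up; for example:
float[] R = new float[16], I = new float[16];
float[] accel4 = new float[4]; // accelerometervalues with 0 appended, as in the PS
float[] resultVec = new float[4];
if (SensorManager.getRotationMatrix(R, I, accelerometervalues, geomagneticmatrix)) {
    System.arraycopy(accelerometervalues, 0, accel4, 0, 3);
    Matrix.multiplyMV(resultVec, 0, R, 0, accel4, 0);
}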
This is what I used to map accelerometer data from the local (mobile) frame of reference to the Earth frame of reference, to get rid of the orientation dependency. In the Earth frame the Z-axis points toward the sky and should read approximately 9.81 m/sec^2. One phenomenon that I couldn't understand: when I put the phone on a revolving chair in any orientation and rotate it at constant speed, the XEarth and YEarth values show the rotation with a 90-degree phase shift and oscillate like sine/cosine waves, which I assume are the North and East axes.
public void onSensorChanged(SensorEvent event) {
    switch (event.sensor.getType()) {
        case Sensor.TYPE_ACCELEROMETER:
            System.arraycopy(event.values, 0, accel, 0, 3);
            // To get the quaternion representation of the accelerometer data
            SensorManager.getQuaternionFromVector(quatA, event.values);
            q1.w = quatA[0]; q1.x = quatA[1]; q1.y = quatA[2]; q1.z = quatA[3];
            break;
        case Sensor.TYPE_ROTATION_VECTOR:
            SensorManager.getRotationMatrixFromVector(rotationMatrix1, event.values);
            System.arraycopy(event.values, 0, rotationVector, 0, 3);
            SensorManager.getQuaternionFromVector(quat, event.values);
            q2.w = quat[0]; q2.x = quat[1]; q2.y = quat[2]; q2.z = quat[3];
            rotationMatrix2 = getRotationMatrixFromQuaternion(q2);
            // You can use rotationMatrix1 or rotationMatrix2
            rotationResult = matrixMultiplication(accel, rotationMatrix2);
            // Accel data rotated into the earth frame of reference:
            // rotationResult[0], rotationResult[1], rotationResult[2]
            break;
    }
}

private float[] getRotationMatrixFromQuaternion(Quaternion q22) {
    float[] q = new float[4];
    float[] result = new float[9];
    q[0] = q22.w;
    q[1] = q22.x;
    q[2] = q22.y;
    q[3] = q22.z;
    result[0] = q[0]*q[0] + q[1]*q[1] - q[2]*q[2] - q[3]*q[3];
    result[1] = 2 * (q[1]*q[2] - q[0]*q[3]);
    result[2] = 2 * (q[1]*q[3] + q[0]*q[2]);
    result[3] = 2 * (q[1]*q[2] + q[0]*q[3]);
    result[4] = q[0]*q[0] - q[1]*q[1] + q[2]*q[2] - q[3]*q[3];
    result[5] = 2 * (q[2]*q[3] - q[0]*q[1]);
    result[6] = 2 * (q[1]*q[3] - q[0]*q[2]);
    result[7] = 2 * (q[2]*q[3] + q[0]*q[1]);
    result[8] = q[0]*q[0] - q[1]*q[1] - q[2]*q[2] + q[3]*q[3];
    return result;
}

private float[] matrixMultiplication(float[] A, float[] B) {
    float[] result = new float[3];
    result[0] = A[0] * B[0] + A[1] * B[1] + A[2] * B[2];
    result[1] = A[0] * B[3] + A[1] * B[4] + A[2] * B[5];
    result[2] = A[0] * B[6] + A[1] * B[7] + A[2] * B[8];
    return result;
}

Convert values from Sensor.TYPE_ORIENTATION to Euler angles?

I have to write a compass app in Android. The only thing the user sees on the screen is a cube with a red wall which has to point north. This is not important. What's important is that I need to rotate that cube according to the rotation of the device itself, so that the red wall continues to point north no matter how the phone is being held. My code is simple and straightforward:
@Override
public void onSensorChanged(SensorEvent event) {
    synchronized (this) {
        switch (event.sensor.getType()) {
            case Sensor.TYPE_ACCELEROMETER:
                direction = event.values[2];
                break;
            case Sensor.TYPE_ORIENTATION:
                if (direction < 0) {
                    angleX = event.values[1];
                    angleY = -event.values[2];
                    angleZ = event.values[0];
                } else {
                    angleX = -event.values[1];
                    angleY = -event.values[2];
                    angleZ = event.values[0];
                }
                break;
        }
    }
}
I have added this extra direction variable that simply stores whether the phone's display is pointing downwards or upwards. I don't know if I need it, but it seems to fix some bugs. I am using the SensorSimulator for Android, but whenever my pitch slider goes into the [-90, 90] interval the other variables get mixed up. It's like they get a 180-degree offset. But I can't detect when I am in this interval, because the range of the pitch is from -90 to 90, so I can move that slider from left to right and I will always be in that interval.
This was all just to show you how far my code has advanced. I am not saying how this problem should be solved, because I would probably only steer myself into a dead end. You see, I have been trying to write this app for 3 days now, and you can imagine how pissed my boss is. I have read all sorts of tutorials and tried every formula I could find or think of. So please help me. All I have to do is know how to rotate my cube, the rotation angles of which are EULER ANGLES in degrees.
Here's some code I wrote to do something pretty similar, really only caring about the rotation of the device in the roll direction. Hope it helps! It just uses the accelerometer values to determine the pitch and roll; no need to get the orientation of the view.
public void onSensorChanged(SensorEvent event) {
    float x = -1 * event.values[0] / SensorManager.GRAVITY_EARTH;
    float y = -1 * event.values[1] / SensorManager.GRAVITY_EARTH;
    float z = -1 * event.values[2] / SensorManager.GRAVITY_EARTH;

    float signedRawRoll = (float) (Math.atan2(x, y) * 180 / Math.PI);
    float unsignedRawRoll = Math.abs(signedRawRoll);
    float rollSign = signedRawRoll / unsignedRawRoll;
    float rawPitch = Math.abs(z * 180);

    // Use a basic low-pass filter to only keep the gravity in the accelerometer values for the X and Y axes.
    // Adjust the filter weight based on pitch, as roll is harder to define as pitch approaches 180.
    float filterWeight = rawPitch > 165 ? 0.85f : 0.7f;
    float newUnsignedRoll = filterWeight * Math.abs(this.roll) + (1 - filterWeight) * unsignedRawRoll;
    this.roll = rollSign * newUnsignedRoll;
    if (Float.isInfinite(this.roll) || Float.isNaN(this.roll)) {
        this.roll = 0;
    }
    this.pitch = filterWeight * this.pitch + (1 - filterWeight) * rawPitch;

    for (IAngleListener listener : listeners) {
        listener.deviceRollAndPitch(this.roll, this.pitch);
    }
}
