Detect whether the device is near to or away from the ear (Android)

I need to know whether the device is near to the ear or not by using sensors.
I tried using the proximity sensor; I want to combine the accelerometer and gyroscope sensors to determine more reliably whether the device is near to or far from the ear.
Code for the proximity sensor:
@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_PROXIMITY) {
        float distance = event.values[0];
        if (distance < mProximity.getMaximumRange()) {
            iv.setText("Near");
        } else {
            iv.setText("Far");
        }
    }
}

This is what I got from the Android documentation. I am sure you can dig further to find answers to your problem, but this should be enough to get you started. You can also do some research on position sensors in Android; the documentation is quite useful:
// Create a constant to convert nanoseconds to seconds.
private static final float NS2S = 1.0f / 1000000000.0f;
// Minimum angular speed below which the axis is not normalized.
private static final float EPSILON = 0.000000001f;
private final float[] deltaRotationVector = new float[4];
private float timestamp;

public void onSensorChanged(SensorEvent event) {
    // This timestep's delta rotation is to be multiplied by the current rotation
    // after computing it from the gyro sample data.
    if (timestamp != 0) {
        final float dT = (event.timestamp - timestamp) * NS2S;
        // Axis of the rotation sample, not normalized yet.
        float axisX = event.values[0];
        float axisY = event.values[1];
        float axisZ = event.values[2];
        // Calculate the angular speed of the sample.
        float omegaMagnitude = (float) Math.sqrt(axisX * axisX + axisY * axisY + axisZ * axisZ);
        // Normalize the rotation vector if it's big enough to get the axis
        // (that is, EPSILON should represent your maximum allowable margin of error).
        if (omegaMagnitude > EPSILON) {
            axisX /= omegaMagnitude;
            axisY /= omegaMagnitude;
            axisZ /= omegaMagnitude;
        }
        // Integrate around this axis with the angular speed by the timestep
        // in order to get a delta rotation from this sample over the timestep.
        // We will convert this axis-angle representation of the delta rotation
        // into a quaternion before turning it into the rotation matrix.
        float thetaOverTwo = omegaMagnitude * dT / 2.0f;
        float sinThetaOverTwo = (float) Math.sin(thetaOverTwo);
        float cosThetaOverTwo = (float) Math.cos(thetaOverTwo);
        deltaRotationVector[0] = sinThetaOverTwo * axisX;
        deltaRotationVector[1] = sinThetaOverTwo * axisY;
        deltaRotationVector[2] = sinThetaOverTwo * axisZ;
        deltaRotationVector[3] = cosThetaOverTwo;
    }
    timestamp = event.timestamp;
    float[] deltaRotationMatrix = new float[9];
    SensorManager.getRotationMatrixFromVector(deltaRotationMatrix, deltaRotationVector);
    // User code should concatenate the delta rotation we computed with the current rotation
    // in order to get the updated rotation.
    // rotationCurrent = rotationCurrent * deltaRotationMatrix;
}
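Keep in mind that the gyroscope and accelerometer cannot measure the distance to the ear by themselves; the proximity sensor already gives you near/far, and the motion sensors can only make that reading more robust. Below is a minimal, hedged sketch (not the definitive approach) that gates the proximity reading with the device tilt derived from the accelerometer, reusing the mProximity and iv fields from your snippet. The 45-degree threshold and the mProximityIsNear / mGravityValues field names are assumptions, and it presumes listeners are registered for both TYPE_PROXIMITY and TYPE_ACCELEROMETER.
// Sketch: treat the phone as "at the ear" only when the proximity sensor
// reports "near" AND the device is roughly upright (as it is when held
// against the ear). Thresholds and field names are illustrative assumptions.
private boolean mProximityIsNear = false;
private final float[] mGravityValues = new float[3];

@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_PROXIMITY) {
        mProximityIsNear = event.values[0] < mProximity.getMaximumRange();
    } else if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
        System.arraycopy(event.values, 0, mGravityValues, 0, 3);
    }
    updateEarState();
}

private void updateEarState() {
    // Tilt of the device's y-axis against gravity: ~0 degrees when the phone
    // is held upright, ~90 degrees when it lies flat on a table.
    double norm = Math.sqrt(mGravityValues[0] * mGravityValues[0]
            + mGravityValues[1] * mGravityValues[1]
            + mGravityValues[2] * mGravityValues[2]);
    if (norm == 0) return; // no accelerometer sample yet
    double tiltDegrees = Math.toDegrees(Math.acos(mGravityValues[1] / norm));
    boolean roughlyUpright = tiltDegrees < 45.0;
    iv.setText(mProximityIsNear && roughlyUpright ? "Near ear" : "Far");
}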

Related

2D values from Quaternion

I'm working on an AR application in Android with the Epson Moverio BT-200.
I have a quaternion whose values are updated by my sensor fusion algorithm.
In my application I'm trying to move a 2D item by changing its left and top margin values when I move my head.
I'd like to know how I can extract, from the quaternion values, only the "horizontal" and "vertical" movements.
I could extract the pitch and roll values from the quaternion, but I read that there are several problems with Euler angles. Could I do this working only with quaternions?
This is my current code. I solved the problem by using quaternions for the algorithm, and at the end I extract the Euler angles from the rotation matrix.
This is the algorithm that takes the values from the sensors:
private static final float NS2S = 1.0f / 1000000000.0f;
private final Quaternion deltaQuaternion = new Quaternion();
private Quaternion quaternionGyroscope = new Quaternion();
private Quaternion quaternionRotationVector = new Quaternion();
private long timestamp;
private static final double EPSILON = 0.1f;
private double gyroscopeRotationVelocity = 0;
private boolean positionInitialised = false;
private int panicCounter;
private static final float DIRECT_INTERPOLATION_WEIGHT = 0.005f;
private static final float OUTLIER_THRESHOLD = 0.85f;
private static final float OUTLIER_PANIC_THRESHOLD = 0.65f;
private static final int PANIC_THRESHOLD = 60;
@Override
public void onSensorChanged(SensorEvent event) {
if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR) {
// Process rotation vector (just save it)
float[] q = new float[4];
// Calculate angle. Starting with API_18, Android will provide this value as event.values[3], but if not, we have to calculate it manually.
SensorManager.getQuaternionFromVector(q, event.values);
// Store in quaternion
quaternionRotationVector.setXYZW(q[1], q[2], q[3], -q[0]);
if (!positionInitialised) {
// Override
quaternionGyroscope.set(quaternionRotationVector);
positionInitialised = true;
}
} else if (event.sensor.getType() == Sensor.TYPE_GYROSCOPE) {
// Process Gyroscope and perform fusion
// This timestep's delta rotation to be multiplied by the current rotation
// after computing it from the gyro sample data.
if (timestamp != 0) {
final float dT = (event.timestamp - timestamp) * NS2S;
// Axis of the rotation sample, not normalized yet.
float axisX = event.values[0];
float axisY = event.values[1];
float axisZ = event.values[2];
// Calculate the angular speed of the sample
gyroscopeRotationVelocity = Math.sqrt(axisX * axisX + axisY * axisY + axisZ * axisZ);
// Normalize the rotation vector if it's big enough to get the axis
if (gyroscopeRotationVelocity > EPSILON) {
axisX /= gyroscopeRotationVelocity;
axisY /= gyroscopeRotationVelocity;
axisZ /= gyroscopeRotationVelocity;
}
// Integrate around this axis with the angular speed by the timestep
// in order to get a delta rotation from this sample over the timestep
// We will convert this axis-angle representation of the delta rotation
// into a quaternion before turning it into the rotation matrix.
double thetaOverTwo = gyroscopeRotationVelocity * dT / 2.0f;
double sinThetaOverTwo = Math.sin(thetaOverTwo);
double cosThetaOverTwo = Math.cos(thetaOverTwo);
deltaQuaternion.setX((float) (sinThetaOverTwo * axisX));
deltaQuaternion.setY((float) (sinThetaOverTwo * axisY));
deltaQuaternion.setZ((float) (sinThetaOverTwo * axisZ));
deltaQuaternion.setW(-(float) cosThetaOverTwo);
// Move current gyro orientation
deltaQuaternion.multiplyByQuat(quaternionGyroscope, quaternionGyroscope);
// Calculate dot-product to calculate whether the two orientation sensors have diverged
// (if the dot-product is closer to 0 than to 1), because it should be close to 1 if both are the same.
float dotProd = quaternionGyroscope.dotProduct(quaternionRotationVector);
// If they have diverged, rely on gyroscope only (this happens on some devices when the rotation vector "jumps").
if (Math.abs(dotProd) < OUTLIER_THRESHOLD) {
// Increase panic counter
if (Math.abs(dotProd) < OUTLIER_PANIC_THRESHOLD) {
panicCounter++;
}
// Directly use Gyro
setOrientationQuaternionAndMatrix(quaternionGyroscope);
} else {
// Both are nearly saying the same. Perform normal fusion.
// Interpolate with a fixed weight between the two absolute quaternions obtained from gyro and rotation vector sensors
// The weight should be quite low, so the rotation vector corrects the gyro only slowly, and the output keeps responsive.
Quaternion interpolate = new Quaternion();
quaternionGyroscope.slerp(quaternionRotationVector, interpolate, DIRECT_INTERPOLATION_WEIGHT);
// Use the interpolated value between gyro and rotationVector
setOrientationQuaternionAndMatrix(interpolate);
// Override current gyroscope-orientation
quaternionGyroscope.copyVec4(interpolate);
// Reset the panic counter because both sensors are saying the same again
panicCounter = 0;
}
if (panicCounter > PANIC_THRESHOLD) {
Log.d("Rotation Vector",
"Panic counter is bigger than threshold; this indicates a Gyroscope failure. Panic reset is imminent.");
if (gyroscopeRotationVelocity < 3) {
Log.d("Rotation Vector",
"Performing Panic-reset. Resetting orientation to rotation-vector value.");
// Manually set position to whatever rotation vector says.
setOrientationQuaternionAndMatrix(quaternionRotationVector);
// Override current gyroscope-orientation with corrected value
quaternionGyroscope.copyVec4(quaternionRotationVector);
panicCounter = 0;
} else {
Log.d("Rotation Vector",
String.format(
"Panic reset delayed due to ongoing motion (user is still shaking the device). Gyroscope Velocity: %.2f > 3",
gyroscopeRotationVelocity));
}
}
}
timestamp = event.timestamp;
}
}
private void setOrientationQuaternionAndMatrix(Quaternion quaternion) {
Quaternion correctedQuat = quaternion.clone();
// We inverted w in the deltaQuaternion, because currentOrientationQuaternion required it.
// Before converting it back to matrix representation, we need to revert this process
correctedQuat.w(-correctedQuat.w());
synchronized (syncToken) {
// Use gyro only
currentOrientationQuaternion.copyVec4(quaternion);
// Set the rotation matrix as well to have both representations
SensorManager.getRotationMatrixFromVector(currentOrientationRotationMatrix.matrix, correctedQuat.ToArray());
}
}
And this is how I take the euler angles rotation values:
/**
 * @return the current rotation of the device as Euler angles
 */
public EulerAngles getEulerAngles() {
float[] angles = new float[3];
float[] remappedOrientationMatrix = new float[16];
SensorManager.remapCoordinateSystem(currentOrientationRotationMatrix.getMatrix(), SensorManager.AXIS_X,
SensorManager.AXIS_Z, remappedOrientationMatrix);
SensorManager.getOrientation(remappedOrientationMatrix, angles);
return new EulerAngles(angles[0], angles[1], angles[2]);
}
I solved my problem with this solution. Now it won't be difficult to move my 2D object with these sensor values. Sorry for the length of my answer, but I hope it can be useful for someone :)
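In case it helps anyone doing the same thing: once the fused orientation is available as Euler angles, mapping head movement to the 2D item's margins can be as simple as scaling the azimuth and pitch deltas. Below is a rough sketch only; the PIXELS_PER_RADIAN scale, the itemView parameter, and the EulerAngles accessors getAzimuth()/getPitch() are assumptions, not part of the code above.
// Sketch: move a view horizontally with azimuth (yaw) changes and vertically
// with pitch changes. Scale factor and reference angles are illustrative
// assumptions; tune them for the actual display.
private static final float PIXELS_PER_RADIAN = 500f;
private float referenceAzimuth;
private float referencePitch;
private boolean referenceSet = false;

private void moveItem(View itemView, EulerAngles eulerAngles) {
    if (!referenceSet) {
        // Remember the orientation at startup as the "centre" of the screen.
        referenceAzimuth = eulerAngles.getAzimuth();
        referencePitch = eulerAngles.getPitch();
        referenceSet = true;
    }
    RelativeLayout.LayoutParams params =
            (RelativeLayout.LayoutParams) itemView.getLayoutParams();
    // Horizontal movement follows the azimuth delta, vertical the pitch delta.
    params.leftMargin = (int) ((eulerAngles.getAzimuth() - referenceAzimuth) * PIXELS_PER_RADIAN);
    params.topMargin = (int) ((eulerAngles.getPitch() - referencePitch) * PIXELS_PER_RADIAN);
    itemView.setLayoutParams(params);
}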

How to update the azimuth with the rotation matrix from gyroscope?

Suppose I have my current orientation as (azimuth, pitch, roll). Now I wish to update my orientation with the gyroscope. According to the codes given by the Android development web, I can obtain the so-called deltaRotationMatrix as follows:
// Create a constant to convert nanoseconds to seconds.
private static final float NS2S = 1.0f / 1000000000.0f;
private static final float EPSILON = 0.000000001f;
private final float[] deltaRotationVector = new float[4];
private float timestamp;
public void onSensorChanged(SensorEvent event) {
// This timestep's delta rotation to be multiplied by the current rotation
// after computing it from the gyro sample data.
if (timestamp != 0) {
final float dT = (event.timestamp - timestamp) * NS2S;
// Axis of the rotation sample, not normalized yet.
float axisX = event.values[0];
float axisY = event.values[1];
float axisZ = event.values[2];
// Calculate the angular speed of the sample
float omegaMagnitude = (float) Math.sqrt(axisX*axisX + axisY*axisY + axisZ*axisZ);
// Normalize the rotation vector if it's big enough to get the axis
// (that is, EPSILON should represent your maximum allowable margin of error)
if (omegaMagnitude > EPSILON) {
axisX /= omegaMagnitude;
axisY /= omegaMagnitude;
axisZ /= omegaMagnitude;
}
// Integrate around this axis with the angular speed by the timestep
// in order to get a delta rotation from this sample over the timestep
// We will convert this axis-angle representation of the delta rotation
// into a quaternion before turning it into the rotation matrix.
float thetaOverTwo = omegaMagnitude * dT / 2.0f;
float sinThetaOverTwo = (float) Math.sin(thetaOverTwo);
float cosThetaOverTwo = (float) Math.cos(thetaOverTwo);
deltaRotationVector[0] = sinThetaOverTwo * axisX;
deltaRotationVector[1] = sinThetaOverTwo * axisY;
deltaRotationVector[2] = sinThetaOverTwo * axisZ;
deltaRotationVector[3] = cosThetaOverTwo;
}
timestamp = event.timestamp;
float[] deltaRotationMatrix = new float[9];
SensorManager.getRotationMatrixFromVector(deltaRotationMatrix, deltaRotationVector);
// User code should concatenate the delta rotation we computed with the current rotation
// in order to get the updated rotation.
// rotationCurrent = rotationCurrent * deltaRotationMatrix;
}
}
How should I proceed with this snippet so as to update my orientation?
You just need to multiply the currentRotationMatrix by the deltaRotationMatrix and then call SensorManager.getOrientation(). You will need to implement a matrix multiplication method. You will also need an initial currentRotationMatrix; you can use the accelerometer and magnetic field sensor with SensorManager.getRotationMatrix to get it. Alternatively, you could use TYPE_ROTATION_VECTOR to get the initial currentRotationMatrix.
currentRotationMatrix = matrixMultiplication(
        currentRotationMatrix,
        deltaRotationMatrix);
SensorManager.getOrientation(currentRotationMatrix,
        gyroscopeOrientation);
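For the initial currentRotationMatrix and the matrix multiplication, a minimal sketch could look like the following (this is only one way to set it up; accelValues and magnetValues are assumed to hold the latest accelerometer and magnetometer readings):
// Sketch: build the initial rotation matrix once from accelerometer +
// magnetometer, then keep updating it with the gyroscope delta rotations.
private float[] currentRotationMatrix = new float[9];
private final float[] gyroscopeOrientation = new float[3];
private boolean rotationInitialised = false;

private void initialiseRotationMatrix(float[] accelValues, float[] magnetValues) {
    float[] inclinationMatrix = new float[9];
    if (SensorManager.getRotationMatrix(currentRotationMatrix, inclinationMatrix,
            accelValues, magnetValues)) {
        rotationInitialised = true;
    }
}

// Straightforward 3x3 row-major matrix product, result = A * B.
private float[] matrixMultiplication(float[] a, float[] b) {
    float[] result = new float[9];
    for (int row = 0; row < 3; row++) {
        for (int col = 0; col < 3; col++) {
            result[row * 3 + col] = a[row * 3] * b[col]
                    + a[row * 3 + 1] * b[3 + col]
                    + a[row * 3 + 2] * b[6 + col];
        }
    }
    return result;
}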
Unfortunately, what you will realize is that even the TYPE_GYROSCOPE sensor, which is supposed to be calibrated for drift, doesn't do a very good job, and the integrated orientation quickly drifts away from the device's actual rotation. Frustrating.
I have a GitHub repo with all of this implemented here
And a working project on the Play Store here

Removing Noise from recorded accelerometer/gyroscope data

I have seen some answers on how to reduce the noise of, for example, the accelerometer x, y, z values while listening, but my problem is a bit different.
I already have some recorded data (in CSV files) and I would like to remove/reduce the noise afterwards, if that's possible.
Here is the data that was recorded:
X,Y,Z from gyroscope
Delta 0-3 from gyroscope, which was calculated in this way:
axisX = 0;
axisY = 0;
axisZ = 0;
// This timestep's delta rotation to be multiplied by the
// current rotation
// after computing it from the gyro sample data.
if (timestamp != 0) {
final float dT = (event.timestamp - timestamp) * NS2S;
// Axis of the rotation sample, not normalized yet.
axisX = event.values[0];
axisY = event.values[1];
axisZ = event.values[2];
// Calculate the angular speed of the sample
float omegaMagnitude = FloatMath.sqrt(axisX * axisX + axisY
* axisY + axisZ * axisZ);
// Normalize the rotation vector if it's big enough to get
// the axis (that is, EPSILON should represent your maximum
// allowable margin of error)
if (omegaMagnitude > 0.000000001f) {
axisX /= omegaMagnitude;
axisY /= omegaMagnitude;
axisZ /= omegaMagnitude;
}
// Integrate around this axis with the angular speed by the
// timestep in order to get a delta rotation from this
// sample over the timestep We will convert this axis-angle
// representation of the delta rotation into a quaternion
// before turning it into the rotation matrix.
float thetaOverTwo = omegaMagnitude * dT / 2.0f;
float sinThetaOverTwo = FloatMath.sin(thetaOverTwo);
float cosThetaOverTwo = FloatMath.cos(thetaOverTwo);
deltaRotationVector[0] = sinThetaOverTwo * axisX;
deltaRotationVector[1] = sinThetaOverTwo * axisY;
deltaRotationVector[2] = sinThetaOverTwo * axisZ;
deltaRotationVector[3] = cosThetaOverTwo;
}
timestamp = event.timestamp;
float[] deltaRotationMatrix = new float[9];
SensorManager.getRotationMatrixFromVector(deltaRotationMatrix,deltaRotationVector);
Pitch/Roll/Azimuth/Inclination, which was calculated in this way:
// Calculation of the orientation through the
// magnetic-field and accelerometer sensors.
if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER)
mGravity = event.values;
if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD)
mGeomagnetic = event.values;
if (mGravity != null && mGeomagnetic != null) {
float R[] = new float[9];
float I[] = new float[9];
boolean success = SensorManager.getRotationMatrix(R, I, mGravity, mGeomagnetic);
if (success) {
float orientation[] = new float[3];
SensorManager.getOrientation(R, orientation);
// get the current orientation
// orientation consist of: azimut, pitch and roll in radians
azimut = orientation[0] * (180 / (float) java.lang.Math.PI);
pitch = orientation[1] * (180 / (float) java.lang.Math.PI);
roll = orientation[2] * (180 / (float) java.lang.Math.PI);
inclination = SensorManager.getInclination(I) * (180 / (float) java.lang.Math.PI);
}
}
The X/Y/Z from accelerometer wasn't written in the files.
So my question is:
Can I remove the noise from this data?
Thanks in advance.
I do not know if it is too late for you; I am writing in case you still need it.
You can implement some kind of filter on the recorded data. A low-pass filter is typical. Otherwise, try a complementary filter.
Personally, I preferred the Kalman filter, although it is a bit computationally expensive.
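For example, since the data is already in CSV files, a first-order low-pass (exponential moving average) filter can be run over each recorded column offline. A minimal sketch, assuming each column has already been parsed into a float array; the ALPHA smoothing factor is an illustrative value:
// Sketch: offline exponential moving average, one pass per recorded column.
// ALPHA in (0, 1]; smaller values filter more aggressively but lag more.
private static final float ALPHA = 0.15f;

static float[] lowPass(float[] samples) {
    float[] filtered = new float[samples.length];
    if (samples.length == 0) {
        return filtered;
    }
    filtered[0] = samples[0];
    for (int i = 1; i < samples.length; i++) {
        // y[i] = y[i-1] + ALPHA * (x[i] - y[i-1])
        filtered[i] = filtered[i - 1] + ALPHA * (samples[i] - filtered[i - 1]);
    }
    return filtered;
}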
Since you don't have the accelerometer recorded, if I understand correctly what you use is the orientation. I would recommend converting the Euler angles to a quaternion representation and using averaging to smooth the data; this is not regular averaging, see below.
You can implement a rolling-window filter by averaging, using this MATLAB code example:
https://stackoverflow.com/a/29315869/6589074
All the best,
Lev

Gyroscope Issues with Device Orientation

I'm getting pitch and roll data from my device's gyroscope using this tutorial: http://www.thousand-thoughts.com/2012/03/android-sensor-fusion-tutorial/
All the readings are extremely accurate (there's a filter applied to the code in the tutorial to eliminate gyro drift). Unfortunately, the code only works when my device is placed flat on a surface that is parallel to the ground. The ideal position for my app would be with the top of the device pointing straight up (i.e., the device is perpendicular to the ground with the screen facing the user). Whenever I orient my device in this position, the pitch values go to +90 degrees (as expected). What I would like to do is set this position as the 0-degree point (or initial position) for my device, so that the pitch readings are 0 degrees when my device is oriented upright (in portrait mode) with the screen facing the user.
I asked the author of the tutorial for help with this issue and he responded:
"If you want to have the upright position as the initial one, you will have to rotate your frame of reference accordingly. The simplest way would be to rotate the resulting rotation matrix by -90 degrees about the x-axis. But you have to be careful about at which point in the algorithm to apply this rotation. Always remember that rotations are not commutative operations. To be more specific on this, I would have to review the code again, since I haven’t worked with it for a while now."
I'm really confused and stumped as to how to rotate my frame of reference. The bottom line is that I have no idea how to rotate the matrix by -90 degrees about the x-axis. If someone could help me out with this part, it would be fantastic. Here's my code in case anyone would like to refer to it:
public class AttitudeDisplayIndicator extends SherlockActivity implements SensorEventListener {
private SensorManager mSensorManager = null;
// angular speeds from gyro
private float[] gyro = new float[3];
// rotation matrix from gyro data
private float[] gyroMatrix = new float[9];
// orientation angles from gyro matrix
private float[] gyroOrientation = new float[3];
// magnetic field vector
private float[] magnet = new float[3];
// accelerometer vector
private float[] accel = new float[3];
// orientation angles from accel and magnet
private float[] accMagOrientation = new float[3];
// final orientation angles from sensor fusion
private float[] fusedOrientation = new float[3];
// accelerometer and magnetometer based rotation matrix
private float[] rotationMatrix = new float[9];
public static final float EPSILON = 0.000000001f;
private static final float NS2S = 1.0f / 1000000000.0f;
private float timestamp;
private boolean initState = true;
public static final int TIME_CONSTANT = 30;
public static final float FILTER_COEFFICIENT = 0.98f;
private Timer fuseTimer = new Timer();
// The following members are only for displaying the sensor output.
public Handler mHandler;
DecimalFormat d = new DecimalFormat("#.##");
//ADI background image.
private ImageView adiBackground;
//ADI axes.
private ImageView adiAxes;
//ADI frame.
private ImageView adiFrame;
//Layout.
private RelativeLayout layout;
//Pitch and Roll TextViews.
private TextView pitchAngleText;
private TextView bankAngleText;
//Instantaneous output values from sensors as the device moves.
public static double pitch;
public static double roll;
//Matrix for rotating the ADI (roll).
Matrix mMatrix = new Matrix();
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_attitude_display_indicator);
gyroOrientation[0] = 0.0f;
gyroOrientation[1] = 0.0f;
gyroOrientation[2] = 0.0f;
// initialise gyroMatrix with identity matrix
gyroMatrix[0] = 1.0f; gyroMatrix[1] = 0.0f; gyroMatrix[2] = 0.0f;
gyroMatrix[3] = 0.0f; gyroMatrix[4] = 1.0f; gyroMatrix[5] = 0.0f;
gyroMatrix[6] = 0.0f; gyroMatrix[7] = 0.0f; gyroMatrix[8] = 1.0f;
// get sensorManager and initialise sensor listeners
mSensorManager = (SensorManager) this.getSystemService(SENSOR_SERVICE);
initListeners();
// wait for one second until gyroscope and magnetometer/accelerometer
// data is initialised, then schedule the complementary filter task
fuseTimer.scheduleAtFixedRate(new calculateFusedOrientationTask(),
1000, TIME_CONSTANT);
mHandler = new Handler();
adiBackground = (ImageView) findViewById(R.id.adi_background);
adiFrame = (ImageView) findViewById(R.id.adi_frame);
adiAxes = (ImageView) findViewById(R.id.adi_axes);
layout = (RelativeLayout) findViewById(R.id.adi_layout);
new Color();
layout.setBackgroundColor(Color.rgb(150, 150, 150));
pitchAngleText = (TextView) findViewById(R.id.pitch_angle_text);
bankAngleText = (TextView) findViewById(R.id.bank_angle_text);
}
// This function registers sensor listeners for the accelerometer, magnetometer and gyroscope.
public void initListeners(){
mSensorManager.registerListener(this,
mSensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),
SensorManager.SENSOR_DELAY_FASTEST);
mSensorManager.registerListener(this,
mSensorManager.getDefaultSensor(Sensor.TYPE_GYROSCOPE),
SensorManager.SENSOR_DELAY_FASTEST);
mSensorManager.registerListener(this,
mSensorManager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD),
SensorManager.SENSOR_DELAY_FASTEST);
}
@Override
public void onAccuracyChanged(Sensor sensor, int accuracy) {
}
@Override
public void onSensorChanged(SensorEvent event) {
switch(event.sensor.getType()) {
case Sensor.TYPE_ACCELEROMETER:
// copy new accelerometer data into accel array and calculate orientation
System.arraycopy(event.values, 0, accel, 0, 3);
calculateAccMagOrientation();
break;
case Sensor.TYPE_GYROSCOPE:
// process gyro data
gyroFunction(event);
break;
case Sensor.TYPE_MAGNETIC_FIELD:
// copy new magnetometer data into magnet array
System.arraycopy(event.values, 0, magnet, 0, 3);
break;
}
}
// calculates orientation angles from accelerometer and magnetometer output
public void calculateAccMagOrientation() {
if(SensorManager.getRotationMatrix(rotationMatrix, null, accel, magnet)) {
SensorManager.getOrientation(rotationMatrix, accMagOrientation);
}
}
// This function is borrowed from the Android reference
// at http://developer.android.com/reference/android/hardware/SensorEvent.html#values
// It calculates a rotation vector from the gyroscope angular speed values.
private void getRotationVectorFromGyro(float[] gyroValues,
float[] deltaRotationVector,
float timeFactor)
{
float[] normValues = new float[3];
// Calculate the angular speed of the sample
float omegaMagnitude =
(float)Math.sqrt(gyroValues[0] * gyroValues[0] +
gyroValues[1] * gyroValues[1] +
gyroValues[2] * gyroValues[2]);
// Normalize the rotation vector if it's big enough to get the axis
if(omegaMagnitude > EPSILON) {
normValues[0] = gyroValues[0] / omegaMagnitude;
normValues[1] = gyroValues[1] / omegaMagnitude;
normValues[2] = gyroValues[2] / omegaMagnitude;
}
// Integrate around this axis with the angular speed by the timestep
// in order to get a delta rotation from this sample over the timestep
// We will convert this axis-angle representation of the delta rotation
// into a quaternion before turning it into the rotation matrix.
float thetaOverTwo = omegaMagnitude * timeFactor;
float sinThetaOverTwo = (float)Math.sin(thetaOverTwo);
float cosThetaOverTwo = (float)Math.cos(thetaOverTwo);
deltaRotationVector[0] = sinThetaOverTwo * normValues[0];
deltaRotationVector[1] = sinThetaOverTwo * normValues[1];
deltaRotationVector[2] = sinThetaOverTwo * normValues[2];
deltaRotationVector[3] = cosThetaOverTwo;
}
// This function performs the integration of the gyroscope data.
// It writes the gyroscope based orientation into gyroOrientation.
public void gyroFunction(SensorEvent event) {
// don't start until first accelerometer/magnetometer orientation has been acquired
if (accMagOrientation == null)
return;
// initialisation of the gyroscope based rotation matrix
if(initState) {
float[] initMatrix = new float[9];
initMatrix = getRotationMatrixFromOrientation(accMagOrientation);
float[] test = new float[3];
SensorManager.getOrientation(initMatrix, test);
gyroMatrix = matrixMultiplication(gyroMatrix, initMatrix);
initState = false;
}
// copy the new gyro values into the gyro array
// convert the raw gyro data into a rotation vector
float[] deltaVector = new float[4];
if(timestamp != 0) {
final float dT = (event.timestamp - timestamp) * NS2S;
System.arraycopy(event.values, 0, gyro, 0, 3);
getRotationVectorFromGyro(gyro, deltaVector, dT / 2.0f);
}
// measurement done, save current time for next interval
timestamp = event.timestamp;
// convert rotation vector into rotation matrix
float[] deltaMatrix = new float[9];
SensorManager.getRotationMatrixFromVector(deltaMatrix, deltaVector);
// apply the new rotation interval on the gyroscope based rotation matrix
gyroMatrix = matrixMultiplication(gyroMatrix, deltaMatrix);
// get the gyroscope based orientation from the rotation matrix
SensorManager.getOrientation(gyroMatrix, gyroOrientation);
}
private float[] getRotationMatrixFromOrientation(float[] o) {
float[] xM = new float[9];
float[] yM = new float[9];
float[] zM = new float[9];
float sinX = (float)Math.sin(o[1]);
float cosX = (float)Math.cos(o[1]);
float sinY = (float)Math.sin(o[2]);
float cosY = (float)Math.cos(o[2]);
float sinZ = (float)Math.sin(o[0]);
float cosZ = (float)Math.cos(o[0]);
// rotation about x-axis (pitch)
xM[0] = 1.0f; xM[1] = 0.0f; xM[2] = 0.0f;
xM[3] = 0.0f; xM[4] = cosX; xM[5] = sinX;
xM[6] = 0.0f; xM[7] = -sinX; xM[8] = cosX;
// rotation about y-axis (roll)
yM[0] = cosY; yM[1] = 0.0f; yM[2] = sinY;
yM[3] = 0.0f; yM[4] = 1.0f; yM[5] = 0.0f;
yM[6] = -sinY; yM[7] = 0.0f; yM[8] = cosY;
// rotation about z-axis (azimuth)
zM[0] = cosZ; zM[1] = sinZ; zM[2] = 0.0f;
zM[3] = -sinZ; zM[4] = cosZ; zM[5] = 0.0f;
zM[6] = 0.0f; zM[7] = 0.0f; zM[8] = 1.0f;
// rotation order is y, x, z (roll, pitch, azimuth)
float[] resultMatrix = matrixMultiplication(xM, yM);
resultMatrix = matrixMultiplication(zM, resultMatrix);
return resultMatrix;
}
private float[] matrixMultiplication(float[] A, float[] B) {
float[] result = new float[9];
result[0] = A[0] * B[0] + A[1] * B[3] + A[2] * B[6];
result[1] = A[0] * B[1] + A[1] * B[4] + A[2] * B[7];
result[2] = A[0] * B[2] + A[1] * B[5] + A[2] * B[8];
result[3] = A[3] * B[0] + A[4] * B[3] + A[5] * B[6];
result[4] = A[3] * B[1] + A[4] * B[4] + A[5] * B[7];
result[5] = A[3] * B[2] + A[4] * B[5] + A[5] * B[8];
result[6] = A[6] * B[0] + A[7] * B[3] + A[8] * B[6];
result[7] = A[6] * B[1] + A[7] * B[4] + A[8] * B[7];
result[8] = A[6] * B[2] + A[7] * B[5] + A[8] * B[8];
return result;
}
class calculateFusedOrientationTask extends TimerTask {
public void run() {
float oneMinusCoeff = 1.0f - FILTER_COEFFICIENT;
/*
* Fix for 179° <--> -179° transition problem:
* Check whether one of the two orientation angles (gyro or accMag) is negative while the other one is positive.
* If so, add 360° (2 * math.PI) to the negative value, perform the sensor fusion, and remove the 360° from the result
* if it is greater than 180°. This stabilizes the output in positive-to-negative-transition cases.
*/
// azimuth
if (gyroOrientation[0] < -0.5 * Math.PI && accMagOrientation[0] > 0.0) {
fusedOrientation[0] = (float) (FILTER_COEFFICIENT * (gyroOrientation[0] + 2.0 * Math.PI) + oneMinusCoeff * accMagOrientation[0]);
fusedOrientation[0] -= (fusedOrientation[0] > Math.PI) ? 2.0 * Math.PI : 0;
}
else if (accMagOrientation[0] < -0.5 * Math.PI && gyroOrientation[0] > 0.0) {
fusedOrientation[0] = (float) (FILTER_COEFFICIENT * gyroOrientation[0] + oneMinusCoeff * (accMagOrientation[0] + 2.0 * Math.PI));
fusedOrientation[0] -= (fusedOrientation[0] > Math.PI)? 2.0 * Math.PI : 0;
}
else {
fusedOrientation[0] = FILTER_COEFFICIENT * gyroOrientation[0] + oneMinusCoeff * accMagOrientation[0];
}
// pitch
if (gyroOrientation[1] < -0.5 * Math.PI && accMagOrientation[1] > 0.0) {
fusedOrientation[1] = (float) (FILTER_COEFFICIENT * (gyroOrientation[1] + 2.0 * Math.PI) + oneMinusCoeff * accMagOrientation[1]);
fusedOrientation[1] -= (fusedOrientation[1] > Math.PI) ? 2.0 * Math.PI : 0;
}
else if (accMagOrientation[1] < -0.5 * Math.PI && gyroOrientation[1] > 0.0) {
fusedOrientation[1] = (float) (FILTER_COEFFICIENT * gyroOrientation[1] + oneMinusCoeff * (accMagOrientation[1] + 2.0 * Math.PI));
fusedOrientation[1] -= (fusedOrientation[1] > Math.PI)? 2.0 * Math.PI : 0;
}
else {
fusedOrientation[1] = FILTER_COEFFICIENT * gyroOrientation[1] + oneMinusCoeff * accMagOrientation[1];
}
// roll
if (gyroOrientation[2] < -0.5 * Math.PI && accMagOrientation[2] > 0.0) {
fusedOrientation[2] = (float) (FILTER_COEFFICIENT * (gyroOrientation[2] + 2.0 * Math.PI) + oneMinusCoeff * accMagOrientation[2]);
fusedOrientation[2] -= (fusedOrientation[2] > Math.PI) ? 2.0 * Math.PI : 0;
}
else if (accMagOrientation[2] < -0.5 * Math.PI && gyroOrientation[2] > 0.0) {
fusedOrientation[2] = (float) (FILTER_COEFFICIENT * gyroOrientation[2] + oneMinusCoeff * (accMagOrientation[2] + 2.0 * Math.PI));
fusedOrientation[2] -= (fusedOrientation[2] > Math.PI)? 2.0 * Math.PI : 0;
}
else {
fusedOrientation[2] = FILTER_COEFFICIENT * gyroOrientation[2] + oneMinusCoeff * accMagOrientation[2];
}
// overwrite gyro matrix and orientation with fused orientation
// to compensate gyro drift
gyroMatrix = getRotationMatrixFromOrientation(fusedOrientation);
System.arraycopy(fusedOrientation, 0, gyroOrientation, 0, 3);
// update sensor output in GUI
mHandler.post(updateOrientationDisplayTask);
}
}
Thanks in advance for your help!
The theory...
I'm not really sure about the format in which your "frame of reference" matrix is represented, but typically rotations are done with matrix multiplication.
Basically, you would take your "frame of reference matrix" and multiply it by a 90 degrees rotation matrix.
Such a matrix can be found on Wikipedia:
Three-dimensional rotation matrices
Since your angle is 90 degrees, the sines and cosines resolve to 0, 1 or -1, which you can plug directly into the matrix instead of computing them. For example, a matrix that rotates 90 degrees counter-clockwise about the x-axis looks like this:
1 0 0
0 0 1
0 -1 0
Also, please note that matrices like these operate on row vectors of x y z coordinates.
So for example, if you have a point in space that is at (2,5,7) and you would like to rotate it using the above matrix, you would have to do the following operation:
|2 5 7| * | 1  0  0 |
          | 0  0  1 |
          | 0 -1  0 |
which gives [2 -7 5].
...applied to your code
I have glanced quickly at your code and it seems like the modification you need to make involves the output of calculateAccMagOrientation() because it is used to initialize the orientation of the device.
1: public void calculateAccMagOrientation() {
2: if(SensorManager.getRotationMatrix(rotationMatrix, null, accel, magnet)) {
3: SensorManager.getOrientation(rotationMatrix, accMagOrientation);
4: }
5: }
Line 2 in the above snippet is where you get your initial rotationMatrix. Try multiplying rotationMatrix by a hand-crafted 90-degree rotation matrix before calling getOrientation at line 3. I think this will effectively re-align your reference orientation:
public void calculateAccMagOrientation() {
    if(SensorManager.getRotationMatrix(rotationMatrix, null, accel, magnet)) {
        rotationMatrix = matrixMultiplication(rotationMatrix, my90DegRotationMatrix);
        SensorManager.getOrientation(rotationMatrix, accMagOrientation);
    }
}
Please note that depending on how the angles work in Android, you might need to use a 90-degree clockwise rotation matrix instead of a counter-clockwise one; a sketch of both candidate matrices follows below.
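For reference, here is a hedged sketch of what the hand-crafted my90DegRotationMatrix could look like in the row-major float[9] layout used by matrixMultiplication above (my90DegRotationMatrix is just the placeholder name from the snippet; which variant you need depends on the sign convention, so try both):
// 90-degree rotation about the x-axis, row-major 3x3 layout as used by
// SensorManager rotation matrices and matrixMultiplication() above.
// Counter-clockwise variant (as in the example matrix above):
private static final float[] ROTATE_X_CCW_90 = {
        1f,  0f, 0f,
        0f,  0f, 1f,
        0f, -1f, 0f
};

// Clockwise variant (the transpose / inverse of the matrix above):
private static final float[] ROTATE_X_CW_90 = {
        1f, 0f,  0f,
        0f, 0f, -1f,
        0f, 1f,  0f
};

// Pick one and pass it as my90DegRotationMatrix:
// rotationMatrix = matrixMultiplication(rotationMatrix, ROTATE_X_CCW_90);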
Alternative solution
It just occurred to me, maybe you could also simply subtract 90 from the final pitch result before displaying it?

Compute rotation matrix using the magnetic field

The getRotationMatrix reference gives the signature public static boolean getRotationMatrix(float[] R, float[] I, float[] gravity, float[] geomagnetic).
Here, how can I calculate the float[] gravity parameter?
I found a code sample that calculates the orientation using both the accelerometer and the magnetic field sensor:
boolean success = SensorManager.getRotationMatrix(
        matrixR,
        matrixI,
        valuesAccelerometer,
        valuesMagneticField);
if (success) {
    SensorManager.getOrientation(matrixR, matrixValues);
    double azimuth = Math.toDegrees(matrixValues[0]);
    double pitch = Math.toDegrees(matrixValues[1]);
    double roll = Math.toDegrees(matrixValues[2]);
    readingAzimuth.setText("Azimuth: " + String.valueOf(azimuth));
    readingPitch.setText("Pitch: " + String.valueOf(pitch));
    readingRoll.setText("Roll: " + String.valueOf(roll));
}
My questions are:
Is the orientation value the same as the rotation matrix value?
If not, how can I adapt this code to get the rotation matrix value using the magnetic field?
To get the rotation matrix I use this code:
public void onSensorChanged(SensorEvent sensorEvent) {
if (timestamp != 0) {
final double dT = (sensorEvent.timestamp - timestamp) * NS2S;
double magneticX = sensorEvent.values[0];
double magneticY = sensorEvent.values[1];
double magneticZ = sensorEvent.values[2];
double omegaMagnitude =Math.sqrt(magneticX*magneticX + magneticY*magneticY + magneticZ*magneticZ);
if (omegaMagnitude > EPSILON) {
magneticX /= omegaMagnitude;
magneticY /= omegaMagnitude;
magneticZ /= omegaMagnitude;
}
double thetaOverTwo = omegaMagnitude * dT / 2.0f;
double sinThetaOverTwo =Math.sin(thetaOverTwo);
double cosThetaOverTwo = Math.cos(thetaOverTwo);
deltaRotationVector[0] = (double) (sinThetaOverTwo * magneticX);
deltaRotationVector[1] = (double) (sinThetaOverTwo * magneticY);
deltaRotationVector[2] = (double) (sinThetaOverTwo * magneticZ);
deltaRotationVector[3] = cosThetaOverTwo;
}
double[] deltaRotationMatrix = new double[9];
SensorManager.getRotationMatrixFromVector(deltaRotationMatrix, deltaRotationVector);
}
But the problem is that getRotationMatrixFromVector is said to be undefined for SensorManager. Any idea?
Orientation is not a rotation matrix; it only gives you angles relative to magnetic North. You can obtain the rotation matrix (Direction Cosine Matrix) that transforms coordinates from your device frame to the Earth's frame this way:
[rotation matrix formula omitted - it was given as an image in the original answer]
with
φ = azimuth (radians)
θ = pitch (radians)
ψ = roll (radians)
I know that this is an old thread, but in case it helps: for Android, I think the 3x3 rotation matrix is actually given by a variation of the accepted answer. To be specific, in Android the rotation matrix is
( cosφ cosψ - sinφ sinψ sinθ)    sinφ cosθ    ( cosφ sinψ + sinφ cosψ sinθ)
-(sinφ cosψ + cosφ sinψ sinθ)    cosφ cosθ    (-sinφ sinψ + cosφ cosψ sinθ)
 -sinψ cosθ                      -sinθ          cosψ cosθ
where
φ = azimuth
θ = pitch
ψ = roll
which corresponds to the 3x3 Android rotation matrix R[0] to R[8] (matrixR in the question) via
R[0] R[1] R[2]
R[3] R[4] R[5]
R[6] R[7] R[8]
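If you want to check this numerically, here is a small illustrative sketch (the method name is mine, not from the question) that rebuilds the matrix above from the angles returned by SensorManager.getOrientation() so it can be compared element by element with matrixR:
// Sketch: rebuild the rotation matrix from azimuth (phi), pitch (theta) and
// roll (psi) as returned by SensorManager.getOrientation(), using the
// decomposition shown above. Comparing it to matrixR verifies the convention.
static float[] rotationMatrixFromOrientation(float azimuth, float pitch, float roll) {
    float cosPhi = (float) Math.cos(azimuth), sinPhi = (float) Math.sin(azimuth);
    float cosTheta = (float) Math.cos(pitch), sinTheta = (float) Math.sin(pitch);
    float cosPsi = (float) Math.cos(roll),  sinPsi = (float) Math.sin(roll);

    return new float[] {
            cosPhi * cosPsi - sinPhi * sinPsi * sinTheta,
            sinPhi * cosTheta,
            cosPhi * sinPsi + sinPhi * cosPsi * sinTheta,

            -(sinPhi * cosPsi + cosPhi * sinPsi * sinTheta),
            cosPhi * cosTheta,
            -sinPhi * sinPsi + cosPhi * cosPsi * sinTheta,

            -sinPsi * cosTheta,
            -sinTheta,
            cosPsi * cosTheta
    };
}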
