I'm getting pitch and roll data from my device's gyroscope using this tutorial: http://www.thousand-thoughts.com/2012/03/android-sensor-fusion-tutorial/
All the readings are extremely accurate (a filter is applied in the tutorial code to eliminate gyro drift). Unfortunately, the code only works as expected when my device is placed flat on a surface that is parallel to the ground. The ideal position for my app would be with the top of the device pointing straight up (i.e., the device perpendicular to the ground with the screen facing the user). Whenever I orient my device in this position, the pitch value goes to +90 degrees (as expected). What I would like to do is set this position as the 0 degree point (the initial position) for my device, so that the pitch reading is 0 degrees when the device is upright (in portrait mode) with the screen facing the user.
I asked the author of the tutorial for help with this issue and he responded:
"If you want to have the upright position as the initial one, you will have to rotate your frame of reference accordingly. The simplest way would be to rotate the resulting rotation matrix by -90 degrees about the x-axis. But you have to be careful about at which point in the algorithm to apply this rotation. Always remember that rotations are not commutative operations. To be more specific on this, I would have to review the code again, since I haven’t worked with it for a while now."
I'm stumped as to how to rotate my frame of reference. I guess the bottom line is that I have no idea how to rotate the matrix by -90 degrees about the x-axis. If someone could help me out with this part, it would be fantastic. Here's my code in case anyone would like to refer to it:
public class AttitudeDisplayIndicator extends SherlockActivity implements SensorEventListener {
private SensorManager mSensorManager = null;
// angular speeds from gyro
private float[] gyro = new float[3];
// rotation matrix from gyro data
private float[] gyroMatrix = new float[9];
// orientation angles from gyro matrix
private float[] gyroOrientation = new float[3];
// magnetic field vector
private float[] magnet = new float[3];
// accelerometer vector
private float[] accel = new float[3];
// orientation angles from accel and magnet
private float[] accMagOrientation = new float[3];
// final orientation angles from sensor fusion
private float[] fusedOrientation = new float[3];
// accelerometer and magnetometer based rotation matrix
private float[] rotationMatrix = new float[9];
public static final float EPSILON = 0.000000001f;
private static final float NS2S = 1.0f / 1000000000.0f;
private float timestamp;
private boolean initState = true;
public static final int TIME_CONSTANT = 30;
public static final float FILTER_COEFFICIENT = 0.98f;
private Timer fuseTimer = new Timer();
// The following members are only for displaying the sensor output.
public Handler mHandler;
DecimalFormat d = new DecimalFormat("#.##");
//ADI background image.
private ImageView adiBackground;
//ADI axes.
private ImageView adiAxes;
//ADI frame.
private ImageView adiFrame;
//Layout.
private RelativeLayout layout;
//Pitch and Roll TextViews.
private TextView pitchAngleText;
private TextView bankAngleText;
//Instantaneous output values from sensors as the device moves.
public static double pitch;
public static double roll;
//Matrix for rotating the ADI (roll).
Matrix mMatrix = new Matrix();
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_attitude_display_indicator);
gyroOrientation[0] = 0.0f;
gyroOrientation[1] = 0.0f;
gyroOrientation[2] = 0.0f;
// initialise gyroMatrix with identity matrix
gyroMatrix[0] = 1.0f; gyroMatrix[1] = 0.0f; gyroMatrix[2] = 0.0f;
gyroMatrix[3] = 0.0f; gyroMatrix[4] = 1.0f; gyroMatrix[5] = 0.0f;
gyroMatrix[6] = 0.0f; gyroMatrix[7] = 0.0f; gyroMatrix[8] = 1.0f;
// get sensorManager and initialise sensor listeners
mSensorManager = (SensorManager) this.getSystemService(SENSOR_SERVICE);
initListeners();
// wait for one second until gyroscope and magnetometer/accelerometer
// data is initialised, then schedule the complementary filter task
fuseTimer.scheduleAtFixedRate(new calculateFusedOrientationTask(),
1000, TIME_CONSTANT);
mHandler = new Handler();
adiBackground = (ImageView) findViewById(R.id.adi_background);
adiFrame = (ImageView) findViewById(R.id.adi_frame);
adiAxes = (ImageView) findViewById(R.id.adi_axes);
layout = (RelativeLayout) findViewById(R.id.adi_layout);
new Color();
layout.setBackgroundColor(Color.rgb(150, 150, 150));
pitchAngleText = (TextView) findViewById(R.id.pitch_angle_text);
bankAngleText = (TextView) findViewById(R.id.bank_angle_text);
}
// This function registers sensor listeners for the accelerometer, magnetometer and gyroscope.
public void initListeners(){
mSensorManager.registerListener(this,
mSensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),
SensorManager.SENSOR_DELAY_FASTEST);
mSensorManager.registerListener(this,
mSensorManager.getDefaultSensor(Sensor.TYPE_GYROSCOPE),
SensorManager.SENSOR_DELAY_FASTEST);
mSensorManager.registerListener(this,
mSensorManager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD),
SensorManager.SENSOR_DELAY_FASTEST);
}
@Override
public void onAccuracyChanged(Sensor sensor, int accuracy) {
}
@Override
public void onSensorChanged(SensorEvent event) {
switch(event.sensor.getType()) {
case Sensor.TYPE_ACCELEROMETER:
// copy new accelerometer data into accel array and calculate orientation
System.arraycopy(event.values, 0, accel, 0, 3);
calculateAccMagOrientation();
break;
case Sensor.TYPE_GYROSCOPE:
// process gyro data
gyroFunction(event);
break;
case Sensor.TYPE_MAGNETIC_FIELD:
// copy new magnetometer data into magnet array
System.arraycopy(event.values, 0, magnet, 0, 3);
break;
}
}
// calculates orientation angles from accelerometer and magnetometer output
public void calculateAccMagOrientation() {
if(SensorManager.getRotationMatrix(rotationMatrix, null, accel, magnet)) {
SensorManager.getOrientation(rotationMatrix, accMagOrientation);
}
}
// This function is borrowed from the Android reference
// at http://developer.android.com/reference/android/hardware/SensorEvent.html#values
// It calculates a rotation vector from the gyroscope angular speed values.
private void getRotationVectorFromGyro(float[] gyroValues,
float[] deltaRotationVector,
float timeFactor)
{
float[] normValues = new float[3];
// Calculate the angular speed of the sample
float omegaMagnitude =
(float)Math.sqrt(gyroValues[0] * gyroValues[0] +
gyroValues[1] * gyroValues[1] +
gyroValues[2] * gyroValues[2]);
// Normalize the rotation vector if it's big enough to get the axis
if(omegaMagnitude > EPSILON) {
normValues[0] = gyroValues[0] / omegaMagnitude;
normValues[1] = gyroValues[1] / omegaMagnitude;
normValues[2] = gyroValues[2] / omegaMagnitude;
}
// Integrate around this axis with the angular speed by the timestep
// in order to get a delta rotation from this sample over the timestep
// We will convert this axis-angle representation of the delta rotation
// into a quaternion before turning it into the rotation matrix.
float thetaOverTwo = omegaMagnitude * timeFactor;
float sinThetaOverTwo = (float)Math.sin(thetaOverTwo);
float cosThetaOverTwo = (float)Math.cos(thetaOverTwo);
deltaRotationVector[0] = sinThetaOverTwo * normValues[0];
deltaRotationVector[1] = sinThetaOverTwo * normValues[1];
deltaRotationVector[2] = sinThetaOverTwo * normValues[2];
deltaRotationVector[3] = cosThetaOverTwo;
}
// This function performs the integration of the gyroscope data.
// It writes the gyroscope based orientation into gyroOrientation.
public void gyroFunction(SensorEvent event) {
// don't start until first accelerometer/magnetometer orientation has been acquired
if (accMagOrientation == null)
return;
// initialisation of the gyroscope based rotation matrix
if(initState) {
float[] initMatrix = new float[9];
initMatrix = getRotationMatrixFromOrientation(accMagOrientation);
float[] test = new float[3];
SensorManager.getOrientation(initMatrix, test);
gyroMatrix = matrixMultiplication(gyroMatrix, initMatrix);
initState = false;
}
// copy the new gyro values into the gyro array
// convert the raw gyro data into a rotation vector
float[] deltaVector = new float[4];
if(timestamp != 0) {
final float dT = (event.timestamp - timestamp) * NS2S;
System.arraycopy(event.values, 0, gyro, 0, 3);
getRotationVectorFromGyro(gyro, deltaVector, dT / 2.0f);
}
// measurement done, save current time for next interval
timestamp = event.timestamp;
// convert rotation vector into rotation matrix
float[] deltaMatrix = new float[9];
SensorManager.getRotationMatrixFromVector(deltaMatrix, deltaVector);
// apply the new rotation interval on the gyroscope based rotation matrix
gyroMatrix = matrixMultiplication(gyroMatrix, deltaMatrix);
// get the gyroscope based orientation from the rotation matrix
SensorManager.getOrientation(gyroMatrix, gyroOrientation);
}
private float[] getRotationMatrixFromOrientation(float[] o) {
float[] xM = new float[9];
float[] yM = new float[9];
float[] zM = new float[9];
float sinX = (float)Math.sin(o[1]);
float cosX = (float)Math.cos(o[1]);
float sinY = (float)Math.sin(o[2]);
float cosY = (float)Math.cos(o[2]);
float sinZ = (float)Math.sin(o[0]);
float cosZ = (float)Math.cos(o[0]);
// rotation about x-axis (pitch)
xM[0] = 1.0f; xM[1] = 0.0f; xM[2] = 0.0f;
xM[3] = 0.0f; xM[4] = cosX; xM[5] = sinX;
xM[6] = 0.0f; xM[7] = -sinX; xM[8] = cosX;
// rotation about y-axis (roll)
yM[0] = cosY; yM[1] = 0.0f; yM[2] = sinY;
yM[3] = 0.0f; yM[4] = 1.0f; yM[5] = 0.0f;
yM[6] = -sinY; yM[7] = 0.0f; yM[8] = cosY;
// rotation about z-axis (azimuth)
zM[0] = cosZ; zM[1] = sinZ; zM[2] = 0.0f;
zM[3] = -sinZ; zM[4] = cosZ; zM[5] = 0.0f;
zM[6] = 0.0f; zM[7] = 0.0f; zM[8] = 1.0f;
// rotation order is y, x, z (roll, pitch, azimuth)
float[] resultMatrix = matrixMultiplication(xM, yM);
resultMatrix = matrixMultiplication(zM, resultMatrix);
return resultMatrix;
}
private float[] matrixMultiplication(float[] A, float[] B) {
float[] result = new float[9];
result[0] = A[0] * B[0] + A[1] * B[3] + A[2] * B[6];
result[1] = A[0] * B[1] + A[1] * B[4] + A[2] * B[7];
result[2] = A[0] * B[2] + A[1] * B[5] + A[2] * B[8];
result[3] = A[3] * B[0] + A[4] * B[3] + A[5] * B[6];
result[4] = A[3] * B[1] + A[4] * B[4] + A[5] * B[7];
result[5] = A[3] * B[2] + A[4] * B[5] + A[5] * B[8];
result[6] = A[6] * B[0] + A[7] * B[3] + A[8] * B[6];
result[7] = A[6] * B[1] + A[7] * B[4] + A[8] * B[7];
result[8] = A[6] * B[2] + A[7] * B[5] + A[8] * B[8];
return result;
}
class calculateFusedOrientationTask extends TimerTask {
public void run() {
float oneMinusCoeff = 1.0f - FILTER_COEFFICIENT;
/*
* Fix for 179° <--> -179° transition problem:
* Check whether one of the two orientation angles (gyro or accMag) is negative while the other one is positive.
* If so, add 360° (2 * math.PI) to the negative value, perform the sensor fusion, and remove the 360° from the result
* if it is greater than 180°. This stabilizes the output in positive-to-negative-transition cases.
*/
// azimuth
if (gyroOrientation[0] < -0.5 * Math.PI && accMagOrientation[0] > 0.0) {
fusedOrientation[0] = (float) (FILTER_COEFFICIENT * (gyroOrientation[0] + 2.0 * Math.PI) + oneMinusCoeff * accMagOrientation[0]);
fusedOrientation[0] -= (fusedOrientation[0] > Math.PI) ? 2.0 * Math.PI : 0;
}
else if (accMagOrientation[0] < -0.5 * Math.PI && gyroOrientation[0] > 0.0) {
fusedOrientation[0] = (float) (FILTER_COEFFICIENT * gyroOrientation[0] + oneMinusCoeff * (accMagOrientation[0] + 2.0 * Math.PI));
fusedOrientation[0] -= (fusedOrientation[0] > Math.PI)? 2.0 * Math.PI : 0;
}
else {
fusedOrientation[0] = FILTER_COEFFICIENT * gyroOrientation[0] + oneMinusCoeff * accMagOrientation[0];
}
// pitch
if (gyroOrientation[1] < -0.5 * Math.PI && accMagOrientation[1] > 0.0) {
fusedOrientation[1] = (float) (FILTER_COEFFICIENT * (gyroOrientation[1] + 2.0 * Math.PI) + oneMinusCoeff * accMagOrientation[1]);
fusedOrientation[1] -= (fusedOrientation[1] > Math.PI) ? 2.0 * Math.PI : 0;
}
else if (accMagOrientation[1] < -0.5 * Math.PI && gyroOrientation[1] > 0.0) {
fusedOrientation[1] = (float) (FILTER_COEFFICIENT * gyroOrientation[1] + oneMinusCoeff * (accMagOrientation[1] + 2.0 * Math.PI));
fusedOrientation[1] -= (fusedOrientation[1] > Math.PI)? 2.0 * Math.PI : 0;
}
else {
fusedOrientation[1] = FILTER_COEFFICIENT * gyroOrientation[1] + oneMinusCoeff * accMagOrientation[1];
}
// roll
if (gyroOrientation[2] < -0.5 * Math.PI && accMagOrientation[2] > 0.0) {
fusedOrientation[2] = (float) (FILTER_COEFFICIENT * (gyroOrientation[2] + 2.0 * Math.PI) + oneMinusCoeff * accMagOrientation[2]);
fusedOrientation[2] -= (fusedOrientation[2] > Math.PI) ? 2.0 * Math.PI : 0;
}
else if (accMagOrientation[2] < -0.5 * Math.PI && gyroOrientation[2] > 0.0) {
fusedOrientation[2] = (float) (FILTER_COEFFICIENT * gyroOrientation[2] + oneMinusCoeff * (accMagOrientation[2] + 2.0 * Math.PI));
fusedOrientation[2] -= (fusedOrientation[2] > Math.PI)? 2.0 * Math.PI : 0;
}
else {
fusedOrientation[2] = FILTER_COEFFICIENT * gyroOrientation[2] + oneMinusCoeff * accMagOrientation[2];
}
// overwrite gyro matrix and orientation with fused orientation
// to compensate gyro drift
gyroMatrix = getRotationMatrixFromOrientation(fusedOrientation);
System.arraycopy(fusedOrientation, 0, gyroOrientation, 0, 3);
// update sensor output in GUI
mHandler.post(updateOrientationDisplayTask);
}
}
Thanks in advance for your help!
The theory...
I'm not really sure about the format in which your "frame of reference" matrix is represented, but typically rotations are done with matrix multiplication.
Basically, you would take your "frame of reference matrix" and multiply it by a 90 degrees rotation matrix.
Such a matrix can be found on Wikipedia:
Three-dimensional rotation matrices
Since your angle is 90 degrees, your sines and cosines would resolve to 1's or 0's which you can plug directly into the matrix instead of computing the sines and cosines. For example, a matrix that would rotate 90 degrees counter-clockwise about the x axis would look like this:
1 0 0
0 0 1
0 -1 0
Also, please note that matrices like these operate on row vectors of x y z coordinates.
So for example, if you have a point in space that is at (2,5,7) and you would like to rotate it using the above matrix, you would have to do the following operation:
          |1  0  0|
|2 5 7| x |0  0  1|
          |0 -1  0|
which gives [2 -7 5].
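To make that arithmetic concrete, here is a minimal Java sketch (the helper name rotateRowVector is mine, not something from the code above) that multiplies a 1x3 row vector by a 3x3 matrix stored row-major in a float[9], using the same storage convention as the matrixMultiplication method in the question:
// Minimal sketch: row vector times a row-major 3x3 matrix.
public static float[] rotateRowVector(float[] v, float[] m) {
    return new float[] {
        v[0] * m[0] + v[1] * m[3] + v[2] * m[6],
        v[0] * m[1] + v[1] * m[4] + v[2] * m[7],
        v[0] * m[2] + v[1] * m[5] + v[2] * m[8]
    };
}

// Example: rotate the point (2, 5, 7) with the 90-degree x-axis matrix above.
float[] rotX90 = {
    1f,  0f, 0f,
    0f,  0f, 1f,
    0f, -1f, 0f
};
float[] rotated = rotateRowVector(new float[] {2f, 5f, 7f}, rotX90);
// rotated is now {2f, -7f, 5f}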
...applied to your code
I have glanced quickly at your code and it seems like the modification you need to make involves the output of calculateAccMagOrientation() because it is used to initialize the orientation of the device.
1: public void calculateAccMagOrientation() {
2: if(SensorManager.getRotationMatrix(rotationMatrix, null, accel, magnet)) {
3: SensorManager.getOrientation(rotationMatrix, accMagOrientation);
4: }
5: }
Line 2 in the above snippet is where you get your initial rotationMatrix. Try multiplying rotationMatrix by a hand-crafted 90-degree rotation matrix before calling getOrientation at line 3. I think this will effectively re-align your reference orientation:
public void calculateAccMagOrientation() {
if(SensorManager.getRotationMatrix(rotationMatrix, null, accel, magnet)) {
rotationMatrix = matrixMultiplication(rotationMatrix, my90DegRotationMatrix);
SensorManager.getOrientation(rotationMatrix, accMagOrientation);
}
}
Please note that depending on how the angles work in Android, you might need to use a 90 degrees clockwise rotation matrix instead of a counter-clockwise.
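For reference, a hand-crafted matrix of this kind could look like the following; my90DegRotationMatrix above is only an illustrative name, and since sign conventions differ you may need to try both directions:
// Hypothetical constants for the suggestion above (row-major float[9]).
// 90 degrees about the x-axis in one direction...
private static final float[] ROT_X_PLUS_90 = {
    1f, 0f,  0f,
    0f, 0f, -1f,
    0f, 1f,  0f
};
// ...and in the opposite direction, in case the convention is the other way around.
private static final float[] ROT_X_MINUS_90 = {
    1f,  0f, 0f,
    0f,  0f, 1f,
    0f, -1f, 0f
};
// Usage, as in the snippet above:
// rotationMatrix = matrixMultiplication(rotationMatrix, ROT_X_MINUS_90);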
Alternative solution
It just occurred to me, maybe you could also simply subtract 90 from the final pitch result before displaying it?
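As a rough sketch of that idea (assuming the pitch is read from fusedOrientation[1], which is in radians, and converted to degrees for display as in the rest of the code):
// Hypothetical display tweak: shift the zero point so that the upright position
// (which currently reads about +90 degrees) is reported as 0 degrees instead.
double pitchDegrees = Math.toDegrees(fusedOrientation[1]) - 90.0;
pitchAngleText.setText(d.format(pitchDegrees));
Note that a plain offset only relabels the displayed value; unlike rotating the reference frame, it does not move the +/-90 degree singularity of the Euler angles away from the upright position.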
Related
I am trying to figure out how to point the device marker where the user is heading. I tried the code below to get the bearing angle between the true north and where the user is heading. Once the bearing is found, I would apply it to the device marker and rotate it based on that bearing, but it only works when the device is not moving.
I tried out the code I found from the link below.
Code Source Link
// raw inputs from Android sensors
float m_Norm_Gravity; // length of raw gravity vector received in onSensorChanged(...). NB: should be about 10
float[] m_NormGravityVector = null; // Normalised gravity vector (i.e. length of this vector is 1), which points straight up into space
float m_Norm_MagField; // length of raw magnetic field vector received in onSensorChanged(...).
float[] m_NormMagFieldValues; // Normalised magnetic field vector, (i.e. length of this vector is 1)
// accuracy specifications. SENSOR_UNAVAILABLE if unknown, otherwise SensorManager.SENSOR_STATUS_UNRELIABLE, SENSOR_STATUS_ACCURACY_LOW, SENSOR_STATUS_ACCURACY_MEDIUM or SENSOR_STATUS_ACCURACY_HIGH
int m_GravityAccuracy; // accuracy of gravity sensor
int m_MagneticFieldAccuracy; // accuracy of magnetic field sensor
// values calculated once gravity and magnetic field vectors are available
float[] m_NormEastVector = new float[3]; // normalised cross product of raw gravity vector with magnetic field values, points east
float[] m_NormNorthVector = new float[3]; // Normalised vector pointing to magnetic north
boolean m_OrientationOK = false; // set true if m_azimuth_radians and m_pitch_radians have successfully been calculated following a call to onSensorChanged(...)
float m_azimuth_radians; // angle of the device from magnetic north
float m_pitch_radians; // tilt angle of the device from the horizontal. m_pitch_radians = 0 if the device if flat, m_pitch_radians = Math.PI/2 means the device is upright.
float m_pitch_axis_radians; // angle which defines the axis for the rotation m_pitch_radians
@Override
public void onSensorChanged(SensorEvent sensorEvent) {
int SensorType = sensorEvent.sensor.getType();
switch(SensorType) {
case Sensor.TYPE_GRAVITY:
if (m_NormGravityVector == null) m_NormGravityVector = new float[3];
System.arraycopy(sensorEvent.values, 0, m_NormGravityVector, 0, m_NormGravityVector.length);
m_Norm_Gravity = (float)Math.sqrt(m_NormGravityVector[0]*m_NormGravityVector[0] + m_NormGravityVector[1]*m_NormGravityVector[1] + m_NormGravityVector[2]*m_NormGravityVector[2]);
for(int i=0; i < m_NormGravityVector.length; i++) m_NormGravityVector[i] /= m_Norm_Gravity;
break;
case Sensor.TYPE_MAGNETIC_FIELD:
if (m_NormMagFieldValues == null) m_NormMagFieldValues = new float[3];
System.arraycopy(sensorEvent.values, 0, m_NormMagFieldValues, 0, m_NormMagFieldValues.length);
m_Norm_MagField = (float)Math.sqrt(m_NormMagFieldValues[0]*m_NormMagFieldValues[0] + m_NormMagFieldValues[1]*m_NormMagFieldValues[1] + m_NormMagFieldValues[2]*m_NormMagFieldValues[2]);
for(int i=0; i < m_NormMagFieldValues.length; i++) m_NormMagFieldValues[i] /= m_Norm_MagField;
break;
}
if (m_NormGravityVector != null && m_NormMagFieldValues != null) {
// first calculate the horizontal vector that points due east
float East_x = m_NormMagFieldValues[1] * m_NormGravityVector[2] - m_NormMagFieldValues[2] * m_NormGravityVector[1];
float East_y = m_NormMagFieldValues[2] * m_NormGravityVector[0] - m_NormMagFieldValues[0] * m_NormGravityVector[2];
float East_z = m_NormMagFieldValues[0] * m_NormGravityVector[1] - m_NormMagFieldValues[1] * m_NormGravityVector[0];
float norm_East = (float) Math.sqrt(East_x * East_x + East_y * East_y + East_z * East_z);
if (m_Norm_Gravity * m_Norm_MagField * norm_East < 0.1f) { // Typical values are > 100.
m_OrientationOK = false; // device is close to free fall (or in space?), or close to magnetic north pole.
} else {
m_NormEastVector[0] = East_x / norm_East;
m_NormEastVector[1] = East_y / norm_East;
m_NormEastVector[2] = East_z / norm_East;
// next calculate the horizontal vector that points due north
float M_dot_G = (m_NormGravityVector[0] * m_NormMagFieldValues[0] + m_NormGravityVector[1] * m_NormMagFieldValues[1] + m_NormGravityVector[2] * m_NormMagFieldValues[2]);
float North_x = m_NormMagFieldValues[0] - m_NormGravityVector[0] * M_dot_G;
float North_y = m_NormMagFieldValues[1] - m_NormGravityVector[1] * M_dot_G;
float North_z = m_NormMagFieldValues[2] - m_NormGravityVector[2] * M_dot_G;
float norm_North = (float) Math.sqrt(North_x * North_x + North_y * North_y + North_z * North_z);
m_NormNorthVector[0] = North_x / norm_North;
m_NormNorthVector[1] = North_y / norm_North;
m_NormNorthVector[2] = North_z / norm_North;
// take account of screen rotation away from its natural rotation
int rotation = getActivity().getWindowManager().getDefaultDisplay().getRotation();
float screen_adjustment = 0;
switch (rotation) {
case Surface.ROTATION_0:
screen_adjustment = 0;
break;
case Surface.ROTATION_90:
screen_adjustment = (float) Math.PI / 2;
break;
case Surface.ROTATION_180:
screen_adjustment = (float) Math.PI;
break;
case Surface.ROTATION_270:
screen_adjustment = 3 * (float) Math.PI / 2;
break;
}
// NB: the rotation matrix has now effectively been calculated. It consists of the three vectors m_NormEastVector[], m_NormNorthVector[] and m_NormGravityVector[]
// calculate all the required angles from the rotation matrix
// NB: see https://math.stackexchange.com/questions/381649/whats-the-best-3d-angular-co-ordinate-system-for-working-with-smartfone-apps
float sin = m_NormEastVector[1] - m_NormNorthVector[0], cos = m_NormEastVector[0] + m_NormNorthVector[1];
m_azimuth_radians = (float) (sin != 0 && cos != 0 ? Math.atan2(sin, cos) : 0);
m_pitch_radians = (float) Math.acos(m_NormGravityVector[2]);
sin = -m_NormEastVector[1] - m_NormNorthVector[0];
cos = m_NormEastVector[0] - m_NormNorthVector[1];
float aximuth_plus_two_pitch_axis_radians = (float) (sin != 0 && cos != 0 ? Math.atan2(sin, cos) : 0);
m_pitch_axis_radians = (float) (aximuth_plus_two_pitch_axis_radians - m_azimuth_radians) / 2;
m_azimuth_radians += screen_adjustment;
m_pitch_axis_radians += screen_adjustment;
m_OrientationOK = true;
currentAzimuth = (float) Math.toDegrees(m_azimuth_radians);
}
}
}
@Override
public void onAccuracyChanged(Sensor sensor, int accuracy) {
int SensorType = sensor.getType();
switch(SensorType) {
case Sensor.TYPE_GRAVITY: m_GravityAccuracy = accuracy; break;
case Sensor.TYPE_MAGNETIC_FIELD: m_MagneticFieldAccuracy = accuracy; break;
}
}
I need to know whether the device is near the ear or not by using sensors.
I tried using the proximity sensor; I want to combine the accelerometer and gyroscope sensors to determine exactly whether the device is near or far from the ear.
Code for Proximity
@Override
public void onSensorChanged(SensorEvent event) {
float distance = event.values[0];
if (event.sensor.getType() == Sensor.TYPE_PROXIMITY) {
if (distance < mProximity.getMaximumRange()) {
iv.setText("Near");
} else {
iv.setText("far");
}
}
}
This is what I got from the Android documentation. I am sure you can dig more to get some answers to your problem, but this should be enough to get you started. You can also do some research about position sensors in Android; the documentation is quite useful:
// Create a constant to convert nanoseconds to seconds.
private static final float NS2S = 1.0f / 1000000000.0f;
private final float[] deltaRotationVector = new float[4];
private float timestamp;
public void onSensorChanged(SensorEvent event) {
// This timestep's delta rotation to be multiplied by the current rotation
// after computing it from the gyro sample data.
if (timestamp != 0) {
final float dT = (event.timestamp - timestamp) * NS2S;
// Axis of the rotation sample, not normalized yet.
float axisX = event.values[0];
float axisY = event.values[1];
float axisZ = event.values[2];
// Calculate the angular speed of the sample
float omegaMagnitude = (float) Math.sqrt(axisX*axisX + axisY*axisY + axisZ*axisZ);
// Normalize the rotation vector if it's big enough to get the axis
// (that is, EPSILON should represent your maximum allowable margin of error)
if (omegaMagnitude > EPSILON) {
axisX /= omegaMagnitude;
axisY /= omegaMagnitude;
axisZ /= omegaMagnitude;
}
// Integrate around this axis with the angular speed by the timestep
// in order to get a delta rotation from this sample over the timestep
// We will convert this axis-angle representation of the delta rotation
// into a quaternion before turning it into the rotation matrix.
float thetaOverTwo = omegaMagnitude * dT / 2.0f;
float sinThetaOverTwo = (float) Math.sin(thetaOverTwo);
float cosThetaOverTwo = (float) Math.cos(thetaOverTwo);
deltaRotationVector[0] = sinThetaOverTwo * axisX;
deltaRotationVector[1] = sinThetaOverTwo * axisY;
deltaRotationVector[2] = sinThetaOverTwo * axisZ;
deltaRotationVector[3] = cosThetaOverTwo;
}
timestamp = event.timestamp;
float[] deltaRotationMatrix = new float[9];
SensorManager.getRotationMatrixFromVector(deltaRotationMatrix,
deltaRotationVector);
// User code should concatenate the delta rotation we computed with the current rotation
// in order to get the updated rotation.
// rotationCurrent = rotationCurrent * deltaRotationMatrix;
}
}
I need to get the device orientation. As far as I know, the TYPE_ACCELEROMETER and TYPE_MAGNETIC_FIELD sensors are usually used for this. My problem is that SensorManager.getDefaultSensor returns null for the geomagnetic sensor. It returns null for the TYPE_ORIENTATION sensor too.
manager = (SensorManager) getSystemService(SENSOR_SERVICE);
Sensor sensorAcc = manager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER), //normal object
sensorMagn = manager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD); //null
orientationListener = new OrientationSensorListener();
manager.registerListener(orientationListener, sensorAcc, 10);
manager.registerListener(orientationListener, sensorMagn, 10);
I need another way to get the device orientation.
Orientation can be decomposed into three Euler angles: pitch, roll and azimuth.
With accelerometer data alone, you cannot compute the azimuth, nor the sign of your pitch.
You can try something like this to get an estimate of your pitch and roll:
private final float[] mMagnet = new float[3]; // magnetic field vector
private final float[] mAcceleration = new float[3]; // accelerometer vector
private final float[] mAccMagOrientation = new float[3]; // orientation angles from mAcceleration and mMagnet
private float[] mRotationMatrix = new float[9]; // accelerometer and magnetometer based rotation matrix
public void onSensorChanged(SensorEvent event) {
switch (event.sensor.getType()) {
case Sensor.TYPE_ACCELEROMETER:
System.arraycopy(event.values, 0, mAcceleration, 0, 3); // save data
calculateAccMagOrientation(); // then calculate new orientation
break;
case Sensor.TYPE_MAGNETIC_FIELD:
System.arraycopy(event.values, 0, mMagnet, 0, 3); // save data
break;
default: break;
}
}
public void calculateAccMagOrientation() {
if (SensorManager.getRotationMatrix(mRotationMatrix, null, mAcceleration, mMagnet))
SensorManager.getOrientation(mRotationMatrix, mAccMagOrientation);
else { // Most likely there is no magnetometer data
double gx, gy, gz;
gx = mAcceleration[0] / 9.81f;
gy = mAcceleration[1] / 9.81f;
gz = mAcceleration[2] / 9.81f;
// http://theccontinuum.com/2012/09/24/arduino-imu-pitch-roll-from-accelerometer/
float pitch = (float) -Math.atan(gy / Math.sqrt(gx * gx + gz * gz));
float roll = (float) -Math.atan(gx / Math.sqrt(gy * gy + gz * gz));
float azimuth = 0; // Impossible to guess
mAccMagOrientation[0] = azimuth;
mAccMagOrientation[1] = pitch;
mAccMagOrientation[2] = roll;
mRotationMatrix = getRotationMatrixFromOrientation(mAccMagOrientation);
}
}
public static float[] getRotationMatrixFromOrientation(float[] o) {
float[] xM = new float[9];
float[] yM = new float[9];
float[] zM = new float[9];
float sinX = (float) Math.sin(o[1]);
float cosX = (float) Math.cos(o[1]);
float sinY = (float) Math.sin(o[2]);
float cosY = (float) Math.cos(o[2]);
float sinZ = (float) Math.sin(o[0]);
float cosZ = (float) Math.cos(o[0]);
// rotation about x-axis (pitch)
xM[0] = 1.0f; xM[1] = 0.0f; xM[2] = 0.0f;
xM[3] = 0.0f; xM[4] = cosX; xM[5] = sinX;
xM[6] = 0.0f; xM[7] = -sinX; xM[8] = cosX;
// rotation about y-axis (roll)
yM[0] = cosY; yM[1] = 0.0f; yM[2] = sinY;
yM[3] = 0.0f; yM[4] = 1.0f; yM[5] = 0.0f;
yM[6] = -sinY; yM[7] = 0.0f; yM[8] = cosY;
// rotation about z-axis (azimuth)
zM[0] = cosZ; zM[1] = sinZ; zM[2] = 0.0f;
zM[3] = -sinZ; zM[4] = cosZ; zM[5] = 0.0f;
zM[6] = 0.0f; zM[7] = 0.0f; zM[8] = 1.0f;
// rotation order is y, x, z (roll, pitch, azimuth)
float[] resultMatrix = matrixMultiplication(xM, yM);
resultMatrix = matrixMultiplication(zM, resultMatrix);
return resultMatrix;
}
public static float[] matrixMultiplication(float[] A, float[] B) {
float[] result = new float[9];
result[0] = A[0] * B[0] + A[1] * B[3] + A[2] * B[6];
result[1] = A[0] * B[1] + A[1] * B[4] + A[2] * B[7];
result[2] = A[0] * B[2] + A[1] * B[5] + A[2] * B[8];
result[3] = A[3] * B[0] + A[4] * B[3] + A[5] * B[6];
result[4] = A[3] * B[1] + A[4] * B[4] + A[5] * B[7];
result[5] = A[3] * B[2] + A[4] * B[5] + A[5] * B[8];
result[6] = A[6] * B[0] + A[7] * B[3] + A[8] * B[6];
result[7] = A[6] * B[1] + A[7] * B[4] + A[8] * B[7];
result[8] = A[6] * B[2] + A[7] * B[5] + A[8] * B[8];
return result;
}
To get the angle of rotation when the device is not flat, extend OrientationEventListener. Its onOrientationChanged callback will give you the device orientation. See https://developer.android.com/reference/android/view/OrientationEventListener.html for details.
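A minimal sketch of that approach, assuming it runs inside an Activity (OrientationEventListener is an abstract class you extend; it reports the angle in degrees clockwise from the device's natural orientation and works even when the device is held upright):
OrientationEventListener orientationListener = new OrientationEventListener(this, SensorManager.SENSOR_DELAY_NORMAL) {
    @Override
    public void onOrientationChanged(int orientation) {
        if (orientation == OrientationEventListener.ORIENTATION_UNKNOWN) {
            return; // device is close to flat, the angle is unreliable
        }
        // orientation is 0..359 degrees, clockwise from the natural orientation
        Log.d("Orientation", "angle = " + orientation);
    }
};
if (orientationListener.canDetectOrientation()) {
    orientationListener.enable(); // call disable() in onPause() to save battery
}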
I did something like this:
public class MainActivity extends AppCompatActivity
{
SensorManager sensorManager;
Sensor sensor;
ImageView imageViewProtractorPointer;
/////////////////////////////////////
///////////// onResume //////////////
/////////////////////////////////////
@Override
protected void onResume()
{
super.onResume();
// register sensor listener again if return to application
if(sensor !=null) sensorManager.registerListener(sensorListener,sensor,SensorManager.SENSOR_DELAY_NORMAL);
}
/////////////////////////////////////
///////////// onCreate //////////////
/////////////////////////////////////
@Override
protected void onCreate(Bundle savedInstanceState)
{
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
imageViewProtractorPointer = (ImageView)findViewById(R.id.imageView2);
// get the SensorManager
sensorManager = (SensorManager)getSystemService(Context.SENSOR_SERVICE);
// get the Sensor ACCELEROMETER
sensor = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
}
/////////////////////////////////////
///////////// onPause ///////////////
/////////////////////////////////////
@Override
protected void onPause()
{
// Unregister the sensor listener to prevent battery drain if not in use
super.onPause();
if(sensor !=null) sensorManager.unregisterListener(sensorListener);
}
/////////////////////////////////////////////
/////////// SensorEventListener /////////////
/////////////////////////////////////////////
SensorEventListener sensorListener = new SensorEventListener()
{
@Override
public void onSensorChanged(SensorEvent sensorEvent)
{
// I will use values from 0 to 9, without decimals
int x = (int)sensorEvent.values[0];
int y = (int)sensorEvent.values[1];
int angle = 0;
if(y>=0 && x<=0) angle = x*10;
if(x<=0 && y<=0) angle = (y*10)-90;
if(x>=0 && y<=0) angle = (-x*10)-180;
if(x>=0 && y>=0) angle = (-y*10)-270;
imageViewProtractorPointer.setRotation((float)angle);
}
@Override
public void onAccuracyChanged(Sensor sensor, int i){}
};
}
If you want to understand my if statements, see this image:
For my use case I lock the screen in portrait mode and use two images to show the angle on screen; this is my screenshot:
I still have to make it a little better; I just haven't had enough time for it.
I hope this helps. If you need the full code, let me know.
You have to add some permissions to the manifest. The docs state:
the default sensor matching the requested type and wakeUp properties if one exists and the application has the necessary permissions, or null otherwise
I know it sounds counter-intuitive, but apparently the permissions you need are:
<uses-permission android:name="android.permission.ACCESS_COARSE_LOCATION"/>
<uses-permission android:name="android.permission.ACCESS_FINE_LOCATION"/>
<uses-permission android:name="android.permission.ACCESS_MOCK_LOCATION"/>
<uses-permission android:name="android.permission.INTERNET"/>
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE"/>
(or a sub-set of those).
See here: manifest.xml when using sensors
https://developer.android.com/guide/topics/sensors/sensors_position.html
https://developer.android.com/guide/topics/sensors/sensors_motion.html
https://developer.android.com/guide/topics/sensors/sensors_overview.html
Which sensor for rotating android phone?
You can try the GEOMAGNETIC_ROTATION_VECTOR like this:
private SensorManager mSensorManager;
private Sensor mSensor;
...
mSensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
mSensor = mSensorManager.getDefaultSensor(Sensor.TYPE_GEOMAGNETIC_ROTATION_VECTOR);
And compute the sensor info with this:
...
// Rotation matrix based on current readings from accelerometer and magnetometer.
final float[] rotationMatrix = new float[9];
SensorManager.getRotationMatrix(rotationMatrix, null, accelerometerReading, magnetometerReading);
// Express the updated rotation matrix as three orientation angles.
final float[] orientationAngles = new float[3];
SensorManager.getOrientation(rotationMatrix, orientationAngles);
Extracted from the Android docs: https://developer.android.com/guide/topics/sensors/sensors_position.html
Add the proper permissions in the manifest.
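Note that the snippet above is the accelerometer/magnetometer path from the docs; if you register the rotation-vector sensor itself, you would typically build the rotation matrix straight from the event values instead. A minimal sketch, inside the registered listener:
@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_GEOMAGNETIC_ROTATION_VECTOR) {
        float[] rotationMatrix = new float[9];
        float[] orientationAngles = new float[3];
        // Build the rotation matrix directly from the rotation-vector reading...
        SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);
        // ...and convert it into azimuth / pitch / roll (radians).
        SensorManager.getOrientation(rotationMatrix, orientationAngles);
    }
}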
Hope this helps.
For anyone who is still confused by this problem: say you want to get the orientation of your phone (azimuth, pitch and roll), but sometimes the magnetic field is unstable, so the orientation you get is unstable too. The answers above may help you get the pitch and roll angles, but they say a stable azimuth is impossible to get that way. So what should you do?
If you only care about the orientation and don't care about where north is, here is my suggestion: try this sensor, it works great in my case:
TYPE_GAME_ROTATION_VECTOR
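A minimal sketch of using it (field and listener names are mine; the azimuth it yields is relative to an arbitrary starting reference rather than to north, which is exactly why it stays stable when the magnetic field is noisy):
SensorManager sm = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
Sensor gameRotation = sm.getDefaultSensor(Sensor.TYPE_GAME_ROTATION_VECTOR);

SensorEventListener gameRotationListener = new SensorEventListener() {
    @Override
    public void onSensorChanged(SensorEvent event) {
        float[] rotationMatrix = new float[9];
        float[] orientation = new float[3];
        SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);
        SensorManager.getOrientation(rotationMatrix, orientation);
        // orientation[0] = relative azimuth, orientation[1] = pitch, orientation[2] = roll (radians)
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
};

if (gameRotation != null) {
    sm.registerListener(gameRotationListener, gameRotation, SensorManager.SENSOR_DELAY_GAME);
}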
I have seen some answers about how to reduce the noise of, for example, the accelerometer x, y, z values while listening, but my problem is a bit different.
I have some recorded data already (in CSV files) and I would like to remove/reduce the noise afterwards, if that's possible.
Here is the data that was recorded:
X,Y,Z from gyroscope
Delta 0-3 from gyroscope, which was calculated in this way:
axisX = 0;
axisY = 0;
axisZ = 0;
// This timestep's delta rotation to be multiplied by the
// current rotation
// after computing it from the gyro sample data.
if (timestamp != 0) {
final float dT = (event.timestamp - timestamp) * NS2S;
// Axis of the rotation sample, not normalized yet.
axisX = event.values[0];
axisY = event.values[1];
axisZ = event.values[2];
// Calculate the angular speed of the sample
float omegaMagnitude = FloatMath.sqrt(axisX * axisX + axisY
* axisY + axisZ * axisZ);
// Normalize the rotation vector if it's big enough to get
// the axis (that is, EPSILON should represent your maximum
// allowable margin of error)
if (omegaMagnitude > 0.000000001f) {
axisX /= omegaMagnitude;
axisY /= omegaMagnitude;
axisZ /= omegaMagnitude;
}
// Integrate around this axis with the angular speed by the
// timestep in order to get a delta rotation from this
// sample over the timestep We will convert this axis-angle
// representation of the delta rotation into a quaternion
// before turning it into the rotation matrix.
float thetaOverTwo = omegaMagnitude * dT / 2.0f;
float sinThetaOverTwo = FloatMath.sin(thetaOverTwo);
float cosThetaOverTwo = FloatMath.cos(thetaOverTwo);
deltaRotationVector[0] = sinThetaOverTwo * axisX;
deltaRotationVector[1] = sinThetaOverTwo * axisY;
deltaRotationVector[2] = sinThetaOverTwo * axisZ;
deltaRotationVector[3] = cosThetaOverTwo;
}
timestamp = event.timestamp;
float[] deltaRotationMatrix = new float[9];
SensorManager.getRotationMatrixFromVector(deltaRotationMatrix,deltaRotationVector);
Pitch/Roll/Azimuth/Inclination, which was calculated in this way:
// Calculation of the orientation through the
// magnetic-field and accelerometer sensors.
if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER)
mGravity = event.values;
if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD)
mGeomagnetic = event.values;
if (mGravity != null && mGeomagnetic != null) {
float R[] = new float[9];
float I[] = new float[9];
boolean success = SensorManager.getRotationMatrix(R, I, mGravity, mGeomagnetic);
if (success) {
float orientation[] = new float[3];
SensorManager.getOrientation(R, orientation);
// get the current orientation
// orientation consist of: azimut, pitch and roll in radians
azimut = orientation[0] * (180 / (float) java.lang.Math.PI);
pitch = orientation[1] * (180 / (float) java.lang.Math.PI);
roll = orientation[2] * (180 / (float) java.lang.Math.PI);
inclination = SensorManager.getInclination(I) * (180 / (float) java.lang.Math.PI);
}
}
The X/Y/Z from accelerometer wasn't written in the files.
So my question is:
Can I remove the noise from this data?
Thanks in advance.
I do not know if it is too late for you; I am just writing in case you still need it.
You can apply some kind of filter to it. A low-pass filter is typical; otherwise, try a complementary filter.
Personally I prefer a Kalman filter, although it is a bit more computationally expensive.
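For example, a simple first-order low-pass filter (an exponential moving average) can be run over each recorded CSV column after the fact; a minimal sketch, where alpha is a smoothing factor you would have to tune for your sampling rate:
// Offline first-order low-pass filter for one recorded column (e.g. gyro X).
// alpha close to 1.0 means heavy smoothing, close to 0.0 means almost none.
public static float[] lowPass(float[] samples, float alpha) {
    float[] filtered = new float[samples.length];
    if (samples.length == 0) return filtered;
    filtered[0] = samples[0];
    for (int i = 1; i < samples.length; i++) {
        filtered[i] = alpha * filtered[i - 1] + (1f - alpha) * samples[i];
    }
    return filtered;
}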
Since you don't have the accelerometer recorded, and if I understand correctly what you use is the orientation, I would recommend converting the Euler angles to a quaternion representation and using averaging to smooth the data. This is not regular averaging; see below.
You can implement a rolling-window filter by averaging, using this MATLAB code example:
https://stackoverflow.com/a/29315869/6589074
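A rough Java sketch of the conversion step only (the y-x-z composition order is an assumption based on the convention used elsewhere in this thread, so check it against your data; the averaging itself then follows the linked approach: sign-align the quaternions in the window, average them component-wise and renormalise):
// Quaternions stored as {w, x, y, z} (hypothetical helpers, not from the question).
static float[] quatFromAxisAngle(float ax, float ay, float az, float angle) {
    float s = (float) Math.sin(angle / 2.0);
    return new float[] { (float) Math.cos(angle / 2.0), ax * s, ay * s, az * s };
}

static float[] quatMultiply(float[] a, float[] b) {
    return new float[] {
        a[0]*b[0] - a[1]*b[1] - a[2]*b[2] - a[3]*b[3],
        a[0]*b[1] + a[1]*b[0] + a[2]*b[3] - a[3]*b[2],
        a[0]*b[2] - a[1]*b[3] + a[2]*b[0] + a[3]*b[1],
        a[0]*b[3] + a[1]*b[2] - a[2]*b[1] + a[3]*b[0]
    };
}

// azimuth, pitch, roll in radians -> one quaternion per sample
static float[] quatFromOrientation(float azimuth, float pitch, float roll) {
    float[] qz = quatFromAxisAngle(0f, 0f, 1f, azimuth);
    float[] qx = quatFromAxisAngle(1f, 0f, 0f, pitch);
    float[] qy = quatFromAxisAngle(0f, 1f, 0f, roll);
    return quatMultiply(qz, quatMultiply(qx, qy)); // same z * (x * y) order as the matrices above
}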
All the best,
Lev
getRotationMatrix has the signature public static boolean getRotationMatrix(float[] R, float[] I, float[] gravity, float[] geomagnetic).
How can I calculate the float[] gravity?
I found a sample of code that calculates the orientation using both the accelerometer and the magnetic field:
boolean success = SensorManager.getRotationMatrix(
matrixR,
matrixI,
valuesAccelerometer,
valuesMagneticField);
if(success){
SensorManager.getOrientation(matrixR, matrixValues);
double azimuth = Math.toDegrees(matrixValues[0]);
double pitch = Math.toDegrees(matrixValues[1]);
double roll = Math.toDegrees(matrixValues[2]);
readingAzimuth.setText("Azimuth: " + String.valueOf(azimuth));
readingPitch.setText("Pitch: " + String.valueOf(pitch));
readingRoll.setText("Roll: "+String.valueOf(roll));
}
My questions are:
Is the orientation value the same as the rotation matrix value?
If not, how can I implement this code to get the rotation matrix value using the magnetic field?
To get the rotation matrix i use this code
public void onSensorChanged(SensorEvent sensorEvent) {
if (timestamp != 0) {
final double dT = (sensorEvent.timestamp - timestamp) * NS2S;
double magneticX = sensorEvent.values[0];
double magneticY = sensorEvent.values[1];
double magneticZ = sensorEvent.values[2];
double omegaMagnitude =Math.sqrt(magneticX*magneticX + magneticY*magneticY + magneticZ*magneticZ);
if (omegaMagnitude > EPSILON) {
magneticX /= omegaMagnitude;
magneticY /= omegaMagnitude;
magneticZ /= omegaMagnitude;
}
double thetaOverTwo = omegaMagnitude * dT / 2.0f;
double sinThetaOverTwo =Math.sin(thetaOverTwo);
double cosThetaOverTwo = Math.cos(thetaOverTwo);
deltaRotationVector[0] = (double) (sinThetaOverTwo * magneticX);
deltaRotationVector[1] = (double) (sinThetaOverTwo * magneticY);
deltaRotationVector[2] = (double) (sinThetaOverTwo * magneticZ);
deltaRotationVector[3] = cosThetaOverTwo;
}
double[] deltaRotationMatrix = new double[9];
SensorManager.getRotationMatrixFromVector(deltaRotationMatrix, deltaRotationVector);
}
But the problem is that getRotationMatrixFromVector is reported as undefined for SensorManager. Any idea?
Orientation is not a rotation matrix; it only gives you angles relative to magnetic north. You can obtain the rotation matrix (direction cosine matrix), which lets you transform coordinates from the device frame to the Earth's frame, from those angles as follows:
(rotation matrix image from the original answer not shown here)
with
φ = azimuth (radians)
θ = pitch (radians)
ψ = roll (radians)
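As a side note, here is a minimal sketch of what that transformation looks like in code: given a row-major rotation matrix R (for example the one returned by SensorManager.getRotationMatrix), multiplying it with a vector expressed in device coordinates yields that vector in world coordinates (x east, y magnetic north, z up). The helper name is mine:
// world = R * device, with R stored row-major in a float[9]
public static float[] deviceToWorld(float[] R, float[] v) {
    return new float[] {
        R[0] * v[0] + R[1] * v[1] + R[2] * v[2],
        R[3] * v[0] + R[4] * v[1] + R[5] * v[2],
        R[6] * v[0] + R[7] * v[1] + R[8] * v[2]
    };
}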
I know that this is an old thread, but in case it helps: for Android I think the 3x3 rotation matrix is actually given by a variation of the accepted answer. To be specific, in Android the rotation matrix is
 (cosφ cosψ - sinφ sinψ sinθ)    sinφ cosθ    ( cosφ sinψ + sinφ cosψ sinθ)
-(sinφ cosψ + cosφ sinψ sinθ)    cosφ cosθ    (-sinφ sinψ + cosφ cosψ sinθ)
         -sinψ cosθ               -sinθ               cosψ cosθ
where
φ = azimuth
θ = pitch
ψ = roll
which corresponds to the 3x3 Android rotation matrix R[0] to R[8] (matrixR in the question) via
R[0] R[1] R[2]
R[3] R[4] R[5]
R[6] R[7] R[8]
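One way to sanity-check this correspondence is to recover the three angles from a given rotation matrix with the same relations that SensorManager.getOrientation applies to a 3x3 R:
// azimuth (phi), pitch (theta) and roll (psi) in radians, from a row-major 3x3 R
float azimuth = (float) Math.atan2(R[1], R[4]);
float pitch   = (float) Math.asin(-R[7]);
float roll    = (float) Math.atan2(-R[6], R[8]);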