I am trying to bring a motion parallax effect to a view using the accelerometer sensor type, since it is supported by most Android devices.
I have the following code in my onSensorChanged() method:
float[] g = sensorEvent.values.clone();
float norm_Of_g = (float) Math.sqrt(g[0] * g[0] + g[1] * g[1] + g[2] * g[2]);
// Normalize the accelerometer vector
g[0] = g[0] / norm_Of_g;
g[1] = g[1] / norm_Of_g;
g[2] = g[2] / norm_Of_g;
if (startingAngle == 999) {
startingAngle = (float) (Math.acos(g[0]) * 180 / Math.PI);
currentAngle = (float) (Math.acos(g[0]) * 180 / Math.PI);
lowerLimitAngle = currentAngle - kRotationAngleLimit;
upperLimitAngle = currentAngle + kRotationAngleLimit;
initialScrollX = scrollView.getScrollX();
pixelsPerDegree = simpleDraweeView.getDrawable().getIntrinsicWidth() / (2 * kRotationAngleLimit);
}
currentAngle = (float) (Math.acos(g[0]) * 180 / Math.PI);
if (currentAngle >= lowerLimitAngle && currentAngle <= upperLimitAngle) {
xOffset = (startingAngle - currentAngle) * pixelsPerDegree + initialScrollX;
}
scrollView.setScrollX((int) xOffset);
This works perfectly when I use TYPE_ROTATION_VECTOR.
With TYPE_ACCELEROMETER, however, the readings fluctuate, which makes the scroll jerky.
Please help me with this.
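A common way to tame raw accelerometer noise before computing the angle is a simple exponential low-pass filter. The sketch below is a minimal, self-contained illustration, not code from the question; the ALPHA value is a hypothetical tuning constant you would adjust for your device.

```java
// Minimal low-pass filter sketch for smoothing jerky accelerometer values.
// ALPHA is a hypothetical tuning constant (smaller = smoother but laggier).
public class LowPass {
    static final float ALPHA = 0.15f;

    // output = output + ALPHA * (input - output), applied per axis
    static float[] filter(float[] input, float[] output) {
        if (output == null) return input.clone();
        for (int i = 0; i < input.length; i++) {
            output[i] = output[i] + ALPHA * (input[i] - output[i]);
        }
        return output;
    }

    public static void main(String[] args) {
        float[] smoothed = null;
        // a noisy sequence hovering around 9.81 on one axis
        float[][] samples = {{9.5f, 0, 0}, {10.2f, 0, 0}, {9.7f, 0, 0}};
        for (float[] s : samples) {
            smoothed = filter(s, smoothed);
        }
        // the filtered value stays near gravity instead of jumping around
        System.out.println(smoothed[0] > 9.4f && smoothed[0] < 10.0f);
    }
}
```

You would call filter() on sensorEvent.values before the normalization step, reusing the previous output as state between onSensorChanged() calls.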
How to measure the tilt of the phone in XY plane using accelerometer in Android
In Android, using Jetpack Compose, I am trying to implement a spirit-level app.
I get pitch, tilt, and azimuth values from TYPE_ROTATION_VECTOR, and it works fine in portrait mode. But in landscape mode the coordinate system seems to stay the same:
when I tilt the phone, the sensor still reports it as pitch, as if the X and Y axes were still in their portrait positions. How can I get pitch and tilt values relative to landscape mode?
I am new to development; I am sorry if my question doesn't make sense.
I tried following the How to measure the tilt of the phone in XY plane using accelerometer in Android link.
Here is my code:
if (event?.sensor?.type == Sensor.TYPE_ROTATION_VECTOR) {
// get the rotation vector values
val g = event.values.clone()
// Normalize the values
val norm: Double = sqrt(g[0].toDouble() * g[0] + g[1] * g[1] + g[2] * g[2] + g[3] * g[3])
g[0] /= norm.toFloat()
g[1] /= norm.toFloat()
g[2] /= norm.toFloat()
g[3] /= norm.toFloat()
// get all axis values
val x: Float = g[0]
val y: Float = g[1]
val z: Float = g[2]
val w: Float = g[3]
//Calculate Pitch in degrees (-180 to 180)
val sinP: Float = 2f * (w * x + y * z)
val cosP: Float = 1f - 2f * (x * x + y * y)
pitch = atan2(sinP, cosP) * (180/ Math.PI.toFloat())
//Calculate Tilt in degrees (-90 to 90)
val sinT = 2f * (w * y - z * x)
tilt = if (abs(sinT) >= 1)
(Math.PI / 2).withSign(sinT.toDouble()).toFloat() * (180 / Math.PI.toFloat())
else asin(sinT) * (180 / Math.PI.toFloat())
//Calculate Azimuth in degrees (0 to 360; 0 = North, 90 = East, 180 = South, 270 = West)
val sinA = 2f * (w * z + x * y)
val cosA = 1f - 2f * (y * y + z * z)
azimuth = atan2(sinA, cosA) * (180 / Math.PI.toFloat())
}
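One way to make the angles track the landscape orientation is to compose the sensor quaternion with a fixed rotation about the device z axis before extracting pitch and tilt (the framework also provides SensorManager.remapCoordinateSystem for remapping rotation matrices). The sketch below, in plain Java rather than Kotlin, demonstrates only the quaternion algebra; the -90 degree compensation is an assumption and may need to be +90 depending on which landscape rotation is active.

```java
// Sketch: compensating a landscape screen rotation by rotating the
// TYPE_ROTATION_VECTOR quaternion about the device z axis before
// extracting pitch/tilt. The -90 degree sign is an assumption.
public class QuatRemap {
    // Hamilton product; quaternions stored as {x, y, z, w}, matching
    // the first four entries of the rotation-vector sensor values
    static double[] multiply(double[] a, double[] b) {
        return new double[] {
            a[3] * b[0] + a[0] * b[3] + a[1] * b[2] - a[2] * b[1],
            a[3] * b[1] - a[0] * b[2] + a[1] * b[3] + a[2] * b[0],
            a[3] * b[2] + a[0] * b[1] - a[1] * b[0] + a[2] * b[3],
            a[3] * b[3] - a[0] * b[0] - a[1] * b[1] - a[2] * b[2]
        };
    }

    public static void main(String[] args) {
        double s = Math.sin(Math.PI / 4), c = Math.cos(Math.PI / 4);
        double[] q = {0, 0, s, c};      // device rotated 90 deg about z
        double[] comp = {0, 0, -s, c};  // -90 deg compensation
        double[] r = multiply(q, comp);
        // the compensation cancels the screen rotation: result ~ identity
        System.out.println(Math.abs(r[3] - 1) < 1e-9 && Math.abs(r[2]) < 1e-9);
    }
}
```

In a real app you would choose the compensation sign from the current Display.getRotation() value rather than hard-coding it.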
How do I get roll around the Y axis? I have this working when the device is flat, but not when it is held vertically in front of my face. Please help me with this.
float[] gravity = new float[3];
private float[] acceleration = new float[3];
// global variables declared above
if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
lowPassFilter.filter(event.values);
System.arraycopy(acceleration, 0, this.acceleration, 0, acceleration.length);
gravity = event.values.clone();
double norm_Of_g = Math.sqrt(gravity[0] * gravity[0] + gravity[1] * gravity[1] + gravity[2] * gravity[2]);
// Normalize the accelerometer vector
gravity[0] = (float) (gravity[0] / norm_Of_g);
gravity[1] = (float) (gravity[1] / norm_Of_g);
gravity[2] = (float) (gravity[2] / norm_Of_g);
int rotation = (int) Math.round(Math.toDegrees(Math.atan2(gravity[0], gravity[1])));
tvXYZ.setText(String.format("X:%.2f, Y:%.2f, Z:%.2f\nArray Size: %d, Position in Array : %d",
gravity[0], gravity[1], gravity[2], list.size(), rotation));
imageView.setRotationY(rotation);
}
I'm getting pitch and roll data from my device's gyroscope using this tutorial: http://www.thousand-thoughts.com/2012/03/android-sensor-fusion-tutorial/
All the readings are extremely accurate (there's a filter applied to the code in the tutorial to eliminate gyro drift). Unfortunately, the code only works when my device is placed flat on a surface that is parallel to the ground. The most ideal position for my app to work would be with the top of the device pointing straight up (ie, the device is perpendicular to the ground with the screen facing the user). Whenever I orient my device in this position, the pitch values go to +90 degrees (as expected). What I would like to do is set this position as the 0 degree point (or initial position) for my device so that the pitch readings are 0 degrees when my device is oriented upright (in portrait mode) with the screen facing the user.
I asked the author of the tutorial for help with this issue, and he responded:
"If you want to have the upright position as the initial one, you will have to rotate your frame of reference accordingly. The simplest way would be to rotate the resulting rotation matrix by -90 degrees about the x-axis. But you have to be careful about at which point in the algorithm to apply this rotation. Always remember that rotations are not commutative operations. To be more specific on this, I would have to review the code again, since I haven’t worked with it for a while now."
I'm really confused and stumped as to how to rotate my frame of reference. I guess the bottom line is that I have no idea how to rotate the matrix by -90 degrees about the x-axis. If someone could help me out with this part, it would be fantastic. Here's my code in case anyone would like to refer to it:
public class AttitudeDisplayIndicator extends SherlockActivity implements SensorEventListener {
private SensorManager mSensorManager = null;
// angular speeds from gyro
private float[] gyro = new float[3];
// rotation matrix from gyro data
private float[] gyroMatrix = new float[9];
// orientation angles from gyro matrix
private float[] gyroOrientation = new float[3];
// magnetic field vector
private float[] magnet = new float[3];
// accelerometer vector
private float[] accel = new float[3];
// orientation angles from accel and magnet
private float[] accMagOrientation = new float[3];
// final orientation angles from sensor fusion
private float[] fusedOrientation = new float[3];
// accelerometer and magnetometer based rotation matrix
private float[] rotationMatrix = new float[9];
public static final float EPSILON = 0.000000001f;
private static final float NS2S = 1.0f / 1000000000.0f;
private float timestamp;
private boolean initState = true;
public static final int TIME_CONSTANT = 30;
public static final float FILTER_COEFFICIENT = 0.98f;
private Timer fuseTimer = new Timer();
// The following members are only for displaying the sensor output.
public Handler mHandler;
DecimalFormat d = new DecimalFormat("#.##");
//ADI background image.
private ImageView adiBackground;
//ADI axes.
private ImageView adiAxes;
//ADI frame.
private ImageView adiFrame;
//Layout.
private RelativeLayout layout;
//Pitch and Roll TextViews.
private TextView pitchAngleText;
private TextView bankAngleText;
//Instantaneous output values from sensors as the device moves.
public static double pitch;
public static double roll;
//Matrix for rotating the ADI (roll).
Matrix mMatrix = new Matrix();
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_attitude_display_indicator);
gyroOrientation[0] = 0.0f;
gyroOrientation[1] = 0.0f;
gyroOrientation[2] = 0.0f;
// initialise gyroMatrix with identity matrix
gyroMatrix[0] = 1.0f; gyroMatrix[1] = 0.0f; gyroMatrix[2] = 0.0f;
gyroMatrix[3] = 0.0f; gyroMatrix[4] = 1.0f; gyroMatrix[5] = 0.0f;
gyroMatrix[6] = 0.0f; gyroMatrix[7] = 0.0f; gyroMatrix[8] = 1.0f;
// get sensorManager and initialise sensor listeners
mSensorManager = (SensorManager) this.getSystemService(SENSOR_SERVICE);
initListeners();
// wait for one second until gyroscope and magnetometer/accelerometer
// data is initialised then schedule the complementary filter task
fuseTimer.scheduleAtFixedRate(new calculateFusedOrientationTask(),
1000, TIME_CONSTANT);
mHandler = new Handler();
adiBackground = (ImageView) findViewById(R.id.adi_background);
adiFrame = (ImageView) findViewById(R.id.adi_frame);
adiAxes = (ImageView) findViewById(R.id.adi_axes);
layout = (RelativeLayout) findViewById(R.id.adi_layout);
layout.setBackgroundColor(Color.rgb(150, 150, 150));
pitchAngleText = (TextView) findViewById(R.id.pitch_angle_text);
bankAngleText = (TextView) findViewById(R.id.bank_angle_text);
}
// This function registers sensor listeners for the accelerometer, magnetometer and gyroscope.
public void initListeners(){
mSensorManager.registerListener(this,
mSensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),
SensorManager.SENSOR_DELAY_FASTEST);
mSensorManager.registerListener(this,
mSensorManager.getDefaultSensor(Sensor.TYPE_GYROSCOPE),
SensorManager.SENSOR_DELAY_FASTEST);
mSensorManager.registerListener(this,
mSensorManager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD),
SensorManager.SENSOR_DELAY_FASTEST);
}
@Override
public void onAccuracyChanged(Sensor sensor, int accuracy) {
}
@Override
public void onSensorChanged(SensorEvent event) {
switch(event.sensor.getType()) {
case Sensor.TYPE_ACCELEROMETER:
// copy new accelerometer data into accel array and calculate orientation
System.arraycopy(event.values, 0, accel, 0, 3);
calculateAccMagOrientation();
break;
case Sensor.TYPE_GYROSCOPE:
// process gyro data
gyroFunction(event);
break;
case Sensor.TYPE_MAGNETIC_FIELD:
// copy new magnetometer data into magnet array
System.arraycopy(event.values, 0, magnet, 0, 3);
break;
}
}
// calculates orientation angles from accelerometer and magnetometer output
public void calculateAccMagOrientation() {
if(SensorManager.getRotationMatrix(rotationMatrix, null, accel, magnet)) {
SensorManager.getOrientation(rotationMatrix, accMagOrientation);
}
}
// This function is borrowed from the Android reference
// at http://developer.android.com/reference/android/hardware/SensorEvent.html#values
// It calculates a rotation vector from the gyroscope angular speed values.
private void getRotationVectorFromGyro(float[] gyroValues,
float[] deltaRotationVector,
float timeFactor)
{
float[] normValues = new float[3];
// Calculate the angular speed of the sample
float omegaMagnitude =
(float)Math.sqrt(gyroValues[0] * gyroValues[0] +
gyroValues[1] * gyroValues[1] +
gyroValues[2] * gyroValues[2]);
// Normalize the rotation vector if it's big enough to get the axis
if(omegaMagnitude > EPSILON) {
normValues[0] = gyroValues[0] / omegaMagnitude;
normValues[1] = gyroValues[1] / omegaMagnitude;
normValues[2] = gyroValues[2] / omegaMagnitude;
}
// Integrate around this axis with the angular speed by the timestep
// in order to get a delta rotation from this sample over the timestep
// We will convert this axis-angle representation of the delta rotation
// into a quaternion before turning it into the rotation matrix.
float thetaOverTwo = omegaMagnitude * timeFactor;
float sinThetaOverTwo = (float)Math.sin(thetaOverTwo);
float cosThetaOverTwo = (float)Math.cos(thetaOverTwo);
deltaRotationVector[0] = sinThetaOverTwo * normValues[0];
deltaRotationVector[1] = sinThetaOverTwo * normValues[1];
deltaRotationVector[2] = sinThetaOverTwo * normValues[2];
deltaRotationVector[3] = cosThetaOverTwo;
}
// This function performs the integration of the gyroscope data.
// It writes the gyroscope based orientation into gyroOrientation.
public void gyroFunction(SensorEvent event) {
// don't start until first accelerometer/magnetometer orientation has been acquired
if (accMagOrientation == null)
return;
// initialisation of the gyroscope based rotation matrix
if(initState) {
float[] initMatrix = new float[9];
initMatrix = getRotationMatrixFromOrientation(accMagOrientation);
float[] test = new float[3];
SensorManager.getOrientation(initMatrix, test);
gyroMatrix = matrixMultiplication(gyroMatrix, initMatrix);
initState = false;
}
// copy the new gyro values into the gyro array
// convert the raw gyro data into a rotation vector
float[] deltaVector = new float[4];
if(timestamp != 0) {
final float dT = (event.timestamp - timestamp) * NS2S;
System.arraycopy(event.values, 0, gyro, 0, 3);
getRotationVectorFromGyro(gyro, deltaVector, dT / 2.0f);
}
// measurement done, save current time for next interval
timestamp = event.timestamp;
// convert rotation vector into rotation matrix
float[] deltaMatrix = new float[9];
SensorManager.getRotationMatrixFromVector(deltaMatrix, deltaVector);
// apply the new rotation interval on the gyroscope based rotation matrix
gyroMatrix = matrixMultiplication(gyroMatrix, deltaMatrix);
// get the gyroscope based orientation from the rotation matrix
SensorManager.getOrientation(gyroMatrix, gyroOrientation);
}
private float[] getRotationMatrixFromOrientation(float[] o) {
float[] xM = new float[9];
float[] yM = new float[9];
float[] zM = new float[9];
float sinX = (float)Math.sin(o[1]);
float cosX = (float)Math.cos(o[1]);
float sinY = (float)Math.sin(o[2]);
float cosY = (float)Math.cos(o[2]);
float sinZ = (float)Math.sin(o[0]);
float cosZ = (float)Math.cos(o[0]);
// rotation about x-axis (pitch)
xM[0] = 1.0f; xM[1] = 0.0f; xM[2] = 0.0f;
xM[3] = 0.0f; xM[4] = cosX; xM[5] = sinX;
xM[6] = 0.0f; xM[7] = -sinX; xM[8] = cosX;
// rotation about y-axis (roll)
yM[0] = cosY; yM[1] = 0.0f; yM[2] = sinY;
yM[3] = 0.0f; yM[4] = 1.0f; yM[5] = 0.0f;
yM[6] = -sinY; yM[7] = 0.0f; yM[8] = cosY;
// rotation about z-axis (azimuth)
zM[0] = cosZ; zM[1] = sinZ; zM[2] = 0.0f;
zM[3] = -sinZ; zM[4] = cosZ; zM[5] = 0.0f;
zM[6] = 0.0f; zM[7] = 0.0f; zM[8] = 1.0f;
// rotation order is y, x, z (roll, pitch, azimuth)
float[] resultMatrix = matrixMultiplication(xM, yM);
resultMatrix = matrixMultiplication(zM, resultMatrix);
return resultMatrix;
}
private float[] matrixMultiplication(float[] A, float[] B) {
float[] result = new float[9];
result[0] = A[0] * B[0] + A[1] * B[3] + A[2] * B[6];
result[1] = A[0] * B[1] + A[1] * B[4] + A[2] * B[7];
result[2] = A[0] * B[2] + A[1] * B[5] + A[2] * B[8];
result[3] = A[3] * B[0] + A[4] * B[3] + A[5] * B[6];
result[4] = A[3] * B[1] + A[4] * B[4] + A[5] * B[7];
result[5] = A[3] * B[2] + A[4] * B[5] + A[5] * B[8];
result[6] = A[6] * B[0] + A[7] * B[3] + A[8] * B[6];
result[7] = A[6] * B[1] + A[7] * B[4] + A[8] * B[7];
result[8] = A[6] * B[2] + A[7] * B[5] + A[8] * B[8];
return result;
}
class calculateFusedOrientationTask extends TimerTask {
public void run() {
float oneMinusCoeff = 1.0f - FILTER_COEFFICIENT;
/*
* Fix for 179° <--> -179° transition problem:
* Check whether one of the two orientation angles (gyro or accMag) is negative while the other one is positive.
* If so, add 360° (2 * math.PI) to the negative value, perform the sensor fusion, and remove the 360° from the result
* if it is greater than 180°. This stabilizes the output in positive-to-negative-transition cases.
*/
// azimuth
if (gyroOrientation[0] < -0.5 * Math.PI && accMagOrientation[0] > 0.0) {
fusedOrientation[0] = (float) (FILTER_COEFFICIENT * (gyroOrientation[0] + 2.0 * Math.PI) + oneMinusCoeff * accMagOrientation[0]);
fusedOrientation[0] -= (fusedOrientation[0] > Math.PI) ? 2.0 * Math.PI : 0;
}
else if (accMagOrientation[0] < -0.5 * Math.PI && gyroOrientation[0] > 0.0) {
fusedOrientation[0] = (float) (FILTER_COEFFICIENT * gyroOrientation[0] + oneMinusCoeff * (accMagOrientation[0] + 2.0 * Math.PI));
fusedOrientation[0] -= (fusedOrientation[0] > Math.PI)? 2.0 * Math.PI : 0;
}
else {
fusedOrientation[0] = FILTER_COEFFICIENT * gyroOrientation[0] + oneMinusCoeff * accMagOrientation[0];
}
// pitch
if (gyroOrientation[1] < -0.5 * Math.PI && accMagOrientation[1] > 0.0) {
fusedOrientation[1] = (float) (FILTER_COEFFICIENT * (gyroOrientation[1] + 2.0 * Math.PI) + oneMinusCoeff * accMagOrientation[1]);
fusedOrientation[1] -= (fusedOrientation[1] > Math.PI) ? 2.0 * Math.PI : 0;
}
else if (accMagOrientation[1] < -0.5 * Math.PI && gyroOrientation[1] > 0.0) {
fusedOrientation[1] = (float) (FILTER_COEFFICIENT * gyroOrientation[1] + oneMinusCoeff * (accMagOrientation[1] + 2.0 * Math.PI));
fusedOrientation[1] -= (fusedOrientation[1] > Math.PI)? 2.0 * Math.PI : 0;
}
else {
fusedOrientation[1] = FILTER_COEFFICIENT * gyroOrientation[1] + oneMinusCoeff * accMagOrientation[1];
}
// roll
if (gyroOrientation[2] < -0.5 * Math.PI && accMagOrientation[2] > 0.0) {
fusedOrientation[2] = (float) (FILTER_COEFFICIENT * (gyroOrientation[2] + 2.0 * Math.PI) + oneMinusCoeff * accMagOrientation[2]);
fusedOrientation[2] -= (fusedOrientation[2] > Math.PI) ? 2.0 * Math.PI : 0;
}
else if (accMagOrientation[2] < -0.5 * Math.PI && gyroOrientation[2] > 0.0) {
fusedOrientation[2] = (float) (FILTER_COEFFICIENT * gyroOrientation[2] + oneMinusCoeff * (accMagOrientation[2] + 2.0 * Math.PI));
fusedOrientation[2] -= (fusedOrientation[2] > Math.PI)? 2.0 * Math.PI : 0;
}
else {
fusedOrientation[2] = FILTER_COEFFICIENT * gyroOrientation[2] + oneMinusCoeff * accMagOrientation[2];
}
// overwrite gyro matrix and orientation with fused orientation
// to compensate gyro drift
gyroMatrix = getRotationMatrixFromOrientation(fusedOrientation);
System.arraycopy(fusedOrientation, 0, gyroOrientation, 0, 3);
// update sensor output in GUI
mHandler.post(updateOrientationDisplayTask);
}
}
Thanks in advance for your help!
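As a side note before the answers: the core of calculateFusedOrientationTask is the complementary-filter blend fused = c * gyro + (1 - c) * accMag. A minimal numeric check of that blend, using the FILTER_COEFFICIENT value from the code above and made-up angle values:

```java
// Numeric check of the complementary-filter blend used in
// calculateFusedOrientationTask: fused = c * gyro + (1 - c) * accMag.
// The 0.50/0.40 angle values are made up for illustration.
public class FuseCheck {
    public static void main(String[] args) {
        float c = 0.98f;     // FILTER_COEFFICIENT from the question
        float gyro = 0.50f;  // gyro-integrated angle (rad)
        float accMag = 0.40f; // accel/magnet angle (rad)
        float fused = c * gyro + (1f - c) * accMag;
        // fused stays close to the low-drift gyro value while slowly
        // being pulled toward the absolute accel/magnet reference
        System.out.println(fused > 0.49f && fused < 0.50f);
    }
}
```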
The theory...
I'm not really sure about the format in which your "frame of reference" matrix is represented, but typically rotations are done with matrix multiplication.
Basically, you would take your "frame of reference matrix" and multiply it by a 90 degrees rotation matrix.
Such a matrix can be found on Wikipedia:
Three-dimensional rotation matrices
Since your angle is 90 degrees, your sines and cosines would resolve to 1's or 0's which you can plug directly into the matrix instead of computing the sines and cosines. For example, a matrix that would rotate 90 degrees counter-clockwise about the x axis would look like this:
1 0 0
0 0 1
0 -1 0
Also, please note that matrices like these operate on row vectors of x y z coordinates.
So for example, if you have a point in space that is at (2,5,7) and you would like to rotate it using the above matrix, you would have to do the following operation:
            |1  0  0|
|2 5 7|  x  |0  0  1|
            |0 -1  0|
Which gives [2 -7 5]
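That multiplication can be verified with a few lines of plain Java implementing the row-vector-times-matrix convention described above:

```java
// Verifies the worked example: rotating the point (2, 5, 7) by a
// 90-degree counter-clockwise rotation about the x axis, using the
// row-vector convention (point on the left, matrix on the right).
public class RotateCheck {
    public static void main(String[] args) {
        double[][] rx = {{1, 0, 0}, {0, 0, 1}, {0, -1, 0}};
        double[] p = {2, 5, 7};
        double[] out = new double[3];
        // out[col] = sum over rows of p[row] * rx[row][col]
        for (int col = 0; col < 3; col++)
            for (int row = 0; row < 3; row++)
                out[col] += p[row] * rx[row][col];
        System.out.println(out[0] + " " + out[1] + " " + out[2]);
    }
}
```

This prints 2.0 -7.0 5.0, matching the hand computation.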
...applied to your code
I have glanced quickly at your code and it seems like the modification you need to make involves the output of calculateAccMagOrientation() because it is used to initialize the orientation of the device.
1: public void calculateAccMagOrientation() {
2: if(SensorManager.getRotationMatrix(rotationMatrix, null, accel, magnet)) {
3: SensorManager.getOrientation(rotationMatrix, accMagOrientation);
4: }
5: }
At line 2 in the above snippet is where you get your initial rotationMatrix. Try multiplying rotationMatrix by a hand crafted 90 degrees rotation matrix before calling getOrientation at line 3. I think this will effectively re-align your reference orientation:
public void calculateAccMagOrientation() {
if(SensorManager.getRotationMatrix(rotationMatrix, null, accel, magnet)) {
rotationMatrix = matrixMultiplication(rotationMatrix, my90DegRotationMatrix);
SensorManager.getOrientation(rotationMatrix, accMagOrientation);
}
}
Please note that depending on how the angles work in Android, you might need to use a 90 degrees clockwise rotation matrix instead of a counter-clockwise.
Alternative solution
It just occurred to me, maybe you could also simply subtract 90 from the final pitch result before displaying it?
I tried to use the Z-axis data from SensorEvent.values, but it doesn't detect rotation of my phone in the XY plane, i.e. around the Z axis.
I am using this as a reference for the co-ordinate axes. Is it correct?
How do I measure that motion using accelerometer values?
These games do something similar: Extreme Skater, Doodle Jump.
PS: my phone orientation will be landscape.
Essentially, there are two cases here: the device is either lying flat or not flat. Flat here means the angle between the surface of the device screen and the world XY plane (I call it the inclination) is less than 25 degrees or larger than 155 degrees. Think of the phone lying flat, or tilted up just a little from a table.
First you need to normalize the accelerometer vector.
That is, if g is the vector returned by the accelerometer sensor event values, then in code:
float[] g = event.values.clone();
double norm_Of_g = Math.sqrt(g[0] * g[0] + g[1] * g[1] + g[2] * g[2]);
// Normalize the accelerometer vector
g[0] = (float) (g[0] / norm_Of_g);
g[1] = (float) (g[1] / norm_Of_g);
g[2] = (float) (g[2] / norm_Of_g);
Then the inclination can be calculated as
int inclination = (int) Math.round(Math.toDegrees(Math.acos(g[2])));
Thus
if (inclination < 25 || inclination > 155)
{
// device is flat
}
else
{
// device is not flat
}
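Putting the normalization and the inclination formula together, here is a minimal self-contained check; the inputs are idealized unit-length gravity directions, not real sensor data:

```java
// Checks the inclination formula from the answer above with two
// idealized gravity directions (unit vectors, not real sensor samples).
public class InclinationCheck {
    static int inclination(float[] raw) {
        double norm = Math.sqrt(raw[0] * raw[0] + raw[1] * raw[1] + raw[2] * raw[2]);
        // angle between the device z axis and world "up"
        return (int) Math.round(Math.toDegrees(Math.acos(raw[2] / norm)));
    }

    public static void main(String[] args) {
        // face-up on a table: gravity along +z -> inclination 0 (flat)
        System.out.println(inclination(new float[] {0f, 0f, 1f}));
        // held upright in portrait: gravity along +y -> inclination 90
        System.out.println(inclination(new float[] {0f, 1f, 0f}));
    }
}
```

The first case falls inside the flat band (inclination < 25), the second does not.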
For the case of lying flat, you have to use a compass to see how much the device has rotated from the starting position.
For the case of not flat, the rotation (tilt) is calculated as follows:
int rotation = (int) Math.round(Math.toDegrees(Math.atan2(g[0], g[1])));
Now rotation = 0 means the device is in its normal position: portrait without any tilt for most phones, and probably landscape for a tablet. So if you hold a phone as in your picture above and start rotating, the rotation value will change, and when the phone reaches landscape the rotation will be 90 or -90, depending on the direction of rotation.
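A small self-contained check of the atan2 formula with idealized gravity vectors (note that atan2 does not actually require the vector to be normalized first, since it only depends on the ratio of the two components):

```java
// Checks the rotation formula atan2(g[0], g[1]) with idealized
// gravity vectors; the 9.81 magnitudes stand in for real readings.
public class TiltCheck {
    static int rotation(float[] g) {
        return (int) Math.round(Math.toDegrees(Math.atan2(g[0], g[1])));
    }

    public static void main(String[] args) {
        // upright portrait: gravity along +y -> rotation 0
        System.out.println(rotation(new float[] {0f, 9.81f, 0f}));
        // rotated into landscape: gravity along +x -> rotation 90
        System.out.println(rotation(new float[] {9.81f, 0f, 0f}));
    }
}
```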
The accelerometer is sufficient for checking whether the phone is flat, as Hoan very nicely demonstrated.
For anyone who arrives here looking not only to check whether the phone is flat, but also what the rotation of the phone is, that can be achieved with the Rotation Vector motion sensor.
private double pitch, tilt, azimuth;
@Override
public void onSensorChanged(SensorEvent event) {
//Get Rotation Vector Sensor Values
double[] g = convertFloatsToDoubles(event.values.clone());
//Normalise
double norm = Math.sqrt(g[0] * g[0] + g[1] * g[1] + g[2] * g[2] + g[3] * g[3]);
g[0] /= norm;
g[1] /= norm;
g[2] /= norm;
g[3] /= norm;
//Set values to commonly known quaternion letter representatives
double x = g[0];
double y = g[1];
double z = g[2];
double w = g[3];
//Calculate Pitch in degrees (-180 to 180)
double sinP = 2.0 * (w * x + y * z);
double cosP = 1.0 - 2.0 * (x * x + y * y);
pitch = Math.atan2(sinP, cosP) * (180 / Math.PI);
//Calculate Tilt in degrees (-90 to 90)
double sinT = 2.0 * (w * y - z * x);
if (Math.abs(sinT) >= 1)
tilt = Math.copySign(Math.PI / 2, sinT) * (180 / Math.PI);
else
tilt = Math.asin(sinT) * (180 / Math.PI);
//Calculate Azimuth in degrees (0 to 360; 0 = North, 90 = East, 180 = South, 270 = West)
double sinA = 2.0 * (w * z + x * y);
double cosA = 1.0 - 2.0 * (y * y + z * z);
azimuth = Math.atan2(sinA, cosA) * (180 / Math.PI);
}
private double[] convertFloatsToDoubles(float[] input)
{
if (input == null)
return null;
double[] output = new double[input.length];
for (int i = 0; i < input.length; i++)
output[i] = input[i];
return output;
}
Then to check if the phone is flat you can simply compare the tilt and pitch values with a tolerance values. For example
public boolean flatEnough(double degreeTolerance) {
return tilt <= degreeTolerance && tilt >= -degreeTolerance && pitch <= degreeTolerance && pitch >= -degreeTolerance;
}
The advantage to doing it this way is you can check if the phone is being held in any specific rotation.
It is worth noting that the app's orientation will not affect the values of pitch, tilt, and azimuth.
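The pitch and tilt formulas above can be sanity-checked with a hand-built quaternion. The example below constructs the quaternion for a pure 30-degree rotation about the x axis and confirms the formulas recover pitch = 30 and tilt = 0:

```java
// Sanity check of the quaternion-to-angle formulas from the answer
// above, using a hand-built quaternion for a pure 30-degree pitch.
public class QuatAngles {
    public static void main(String[] args) {
        // quaternion for 30 degrees about x: (sin 15, 0, 0, cos 15)
        double half = Math.toRadians(15);
        double x = Math.sin(half), y = 0, z = 0, w = Math.cos(half);

        double sinP = 2.0 * (w * x + y * z);
        double cosP = 1.0 - 2.0 * (x * x + y * y);
        double pitch = Math.atan2(sinP, cosP) * (180 / Math.PI);

        double sinT = 2.0 * (w * y - z * x);
        double tilt = Math.asin(sinT) * (180 / Math.PI);

        System.out.println(Math.round(pitch) + " " + Math.round(tilt));
    }
}
```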
Working off of the perfect response from @Dan:
he missed a very slight bit of information that @davy307 pointed out.
When initializing mAccelerometer, you must define it as Sensor.TYPE_ROTATION_VECTOR; otherwise it will not have the fourth rotation-vector value (g[3]) and will throw an ArrayIndexOutOfBoundsException.
mSensorManager = (SensorManager)getSystemService(Context.SENSOR_SERVICE);
mAccelerometer = mSensorManager.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR);
Otherwise, this is a perfect solution... Appreciated!
I am working on rotating a rectangle using the orientation values from the gyroscope sensor on an Android 3.1 device.
I have to rotate my device very fast to get values of 1.0 or more.
Here is the code:
final float currentRotVector[] = { 1, 0, 0, 0 };
if (timestamp != 0)
{
final float dT = (event.timestamp - timestamp) * NS2S;
// Axis of the rotation sample, not normalized yet.
// Calculate the angular speed of the sample
float omegaMagnitude = (float) Math.sqrt(X * X + Y * Y + Z * Z);
// Normalize the rotation vector if it's big enough to get the axis
if (omegaMagnitude > EPSILON)
{
X /= omegaMagnitude;
Y /= omegaMagnitude;
Z /= omegaMagnitude;
}
// Integrate around this axis with the angular speed by the timestep
// in order to get a delta rotation from this sample over the timestep
// We will convert this axis-angle representation of the delta rotation
// into a quaternion before turning it into the rotation matrix.
float thetaOverTwo = dT * omegaMagnitude / 2.0f;
float sinThetaOverTwo = (float) Math.sin(thetaOverTwo);
float cosThetaOverTwo = (float) Math.cos(thetaOverTwo);
deltaRotationVector[0] = cosThetaOverTwo;
deltaRotationVector[1] = sinThetaOverTwo * X;
deltaRotationVector[2] = sinThetaOverTwo * Y;
deltaRotationVector[3] = sinThetaOverTwo * Z;
/* quaternion multiplication
Reference: http://www.cprogramming.com/tutorial/3d/quaternions.html
*/
currentRotVector[0] = deltaRotationVector[0] * currentRotVector[0] -
deltaRotationVector[1] * currentRotVector[1] -
deltaRotationVector[2] * currentRotVector[2] -
deltaRotationVector[3] * currentRotVector[3];
currentRotVector[1] = deltaRotationVector[0] * currentRotVector[1] +
deltaRotationVector[1] * currentRotVector[0] +
deltaRotationVector[2] * currentRotVector[3] -
deltaRotationVector[3] * currentRotVector[2];
currentRotVector[2] = deltaRotationVector[0] * currentRotVector[2] -
deltaRotationVector[1] * currentRotVector[3] +
deltaRotationVector[2] * currentRotVector[0] +
deltaRotationVector[3] * currentRotVector[1];
currentRotVector[3] = deltaRotationVector[0] * currentRotVector[3] +
deltaRotationVector[1] * currentRotVector[2] -
deltaRotationVector[2] * currentRotVector[1] +
deltaRotationVector[3] * currentRotVector[0];
final float rad2deg = (float) (180.0f / Math.PI);
RotAngle = currentRotVector[0] * rad2deg;
axisX = currentRotVector[1];
axisY = currentRotVector[2];
axisZ = currentRotVector[3];
Log.i("Sensor Orientation GyroScope", "axisX: " + axisX + //
" axisY: " + axisY + //
" axisZ: " + axisZ + //
" RotAngle: " + RotAngle);
}
timestamp = event.timestamp;
I am getting some outputs like
axisX: 0.69363713 axisY: 0.18359372 axisZ: 0.0228636 RotAngle: 36.7191
And because of the axis values, the output rectangle looks skewed when the device is lying flat on the table.
Is there any problem in the above code?
The values are measured in rad/s. This has been standardized starting from Android 2.3.
To get values of about 1.0, you have to turn at a speed of almost 60 deg/s.
Some devices running earlier Android versions return (returned) values in degrees/s, but these are just a few. As an example, the LG Optimus Black (P970) with Android 2.2 is one of the devices returning deg/s, but this is not the common case.
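The conversion behind that statement is plain radians-to-degrees: 1.0 rad/s is about 57.3 deg/s, which is why readings near 1.0 only appear during fairly fast rotation:

```java
// Shows why gyroscope readings near 1.0 require fast rotation:
// TYPE_GYROSCOPE reports rad/s, and 1 rad/s is about 57.3 deg/s.
public class GyroUnits {
    public static void main(String[] args) {
        double degPerSec = Math.toDegrees(1.0);
        System.out.println(Math.round(degPerSec));
    }
}
```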