I'm using Android's TYPE_ROTATION_VECTOR sensor to determine rotation around the X, Y, and Z axes and use these measurements in my application. But when I rotate the phone around the X axis, the roll value (rotation around Y) also changes, and I don't want that. How can I ignore changes in roll when I rotate around X? Here is a piece of my code:
private final float[] mRotationMatrix = new float[16];
private float[] orientationVals = new float[3];

if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR) {
    SensorManager.getRotationMatrixFromVector(mRotationMatrix, event.values);
    SensorManager.getOrientation(mRotationMatrix, orientationVals);
    azimuthVal = Math.round(Math.toDegrees(orientationVals[0]) * 100.0) / 100.0;
    pitchVal = Math.round(Math.toDegrees(orientationVals[1]) * 100.0) / 100.0;
    rollVal = Math.round(Math.toDegrees(orientationVals[2]) * 100.0) / 100.0;
}
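What you are seeing is a property of the Euler-angle extraction itself, not a sensor error: the three angles returned by getOrientation are coupled. A minimal plain-Java sketch (using the same extraction formulas the AOSP getOrientation applies, pitch = asin(-R[7]) and roll = atan2(-R[6], R[8])) shows that a pure rotation about X flips the extracted roll by 180 degrees once the rotation passes 90 degrees:

```java
public final class EulerCoupling {
    /** Extracts [pitch, roll] in degrees from a pure rotation about X by thetaDeg. */
    static double[] pitchRollForXRotation(double thetaDeg) {
        double t = Math.toRadians(thetaDeg);
        double c = Math.cos(t), s = Math.sin(t);
        // Row-major rotation matrix for a rotation about the X axis only.
        double[] R = { 1, 0, 0,   0, c, -s,   0, s, c };
        double pitch = Math.asin(-R[7]);         // stays within [-90, 90]
        double roll  = Math.atan2(-R[6], R[8]);  // jumps to +-180 once cos(t) < 0
        return new double[] { Math.toDegrees(pitch), Math.toDegrees(roll) };
    }
}
```

So past 90 degrees of pitch, the roll reading jumps even though the device never rotated about Y; filtering roll near that region, or working with the quaternion directly, avoids the jump.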
In my app I am calculating azimuth, pitch and roll using accelerometer and magnetometer values:
float[] rotationMatrix = new float[9];
float[] inclinationMatrix = new float[9];
private float[] mMagnetometerValue = new float[3]; //event.values
private float[] mAccelerometerValue = new float[3]; //event.values
SensorManager.getRotationMatrix(rotationMatrix, inclinationMatrix, mAccelerometerValue, mMagnetometerValue);
// Orientation.
float[] orientation = new float[3];
SensorManager.getOrientation(rotationMatrix, orientation);
Now the orientation array contains azimuth, pitch and roll.
As the documentation for getOrientation says:
values[0]: Azimuth, angle of rotation about the -z axis
values[1]: Pitch, angle of rotation about the x axis
values[2]: Roll, angle of rotation about the y axis
So far, so good.
But testing on another phone failed, because that phone has only an accelerometer: no magnetometer, gyroscope, or rotation vector sensor. After debugging, I found that the phone has a device orientation sensor. The only doc I found is this Google gist.
Okay, give it a try. But this sensor returns only one float value, not a float array,
so I cannot pass it as a parameter to the SensorManager.getRotationMatrix function.
Now the question:
How can I retrieve pitch and roll with SensorManager.getRotationMatrix and SensorManager.getOrientation, or without these functions, using the accelerometer and this device orientation sensor?
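If only an accelerometer is available, pitch and roll (though not azimuth) can be estimated from the gravity vector alone. The sketch below is plain Java math, not an Android API call; the sign conventions are chosen to match getOrientation, under the assumption that the rotation matrix's third row is proportional to the measured gravity:

```java
public final class AccelTilt {
    /** Pitch in degrees from raw accelerometer axes (Android convention: x right, y up, z out of screen). */
    static double pitchDegrees(double ax, double ay, double az) {
        double g = Math.sqrt(ax * ax + ay * ay + az * az); // magnitude of gravity
        return Math.toDegrees(Math.asin(-ay / g));
    }

    /** Roll in degrees from raw accelerometer axes. */
    static double rollDegrees(double ax, double ay, double az) {
        return Math.toDegrees(Math.atan2(-ax, az));
    }
}
```

With the device flat and face up (a = (0, 0, 9.81)) both angles are 0; held upright in portrait (a = (0, 9.81, 0)) pitch is -90 degrees, matching Android's convention. Note these values are noisy under linear acceleration, since the accelerometer cannot distinguish gravity from motion.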
I want to implement the rotation vector sensor. I have searched as much as I could about this sensor, but I am confused on a few points. I understand that I should use quaternions rather than Euler angles to make my app more stable. I will use this sensor's data for gesture recognition. So I want to use quaternions, but I don't understand how to convert to quaternions after using the remapCoordinateSystem method. Or am I wrong about what I want to do? Even a little help would be very useful.
@Override
public void onSensorChanged(SensorEvent event) {
    float[] rotationMatrix = new float[9];
    SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);

    float[] adjustedRotationMatrix = new float[9];
    SensorManager.remapCoordinateSystem(rotationMatrix,
            SensorManager.AXIS_X, SensorManager.AXIS_Z, adjustedRotationMatrix);

    float[] orientation = new float[3];
    SensorManager.getOrientation(adjustedRotationMatrix, orientation);

    float yaw = orientation[0];
    float pitch = orientation[1];
    float roll = orientation[2];

    yawTextView.setText(getResources().getString(R.string.yaw_sensor, yaw));
    pitchTextView.setText(getResources().getString(R.string.pitch_sensor, pitch));
    rollTextView.setText(getResources().getString(R.string.roll_sensor, roll));
}
Thank you.
See Euler angles to quaternion conversion.
Alternatively, if you'd like to calculate the quaternion components directly from the rotation matrix, see this page.
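For concreteness, here is a plain-Java sketch of the Euler-to-quaternion conversion (assuming intrinsic yaw-pitch-roll order, angles in radians, quaternion returned as [w, x, y, z]):

```java
public final class EulerToQuat {
    /** Quaternion [w, x, y, z] for intrinsic yaw (Z), then pitch (Y), then roll (X), in radians. */
    static double[] toQuaternion(double yaw, double pitch, double roll) {
        double cy = Math.cos(yaw * 0.5),   sy = Math.sin(yaw * 0.5);
        double cp = Math.cos(pitch * 0.5), sp = Math.sin(pitch * 0.5);
        double cr = Math.cos(roll * 0.5),  sr = Math.sin(roll * 0.5);
        double w = cr * cp * cy + sr * sp * sy;
        double x = sr * cp * cy - cr * sp * sy;
        double y = cr * sp * cy + sr * cp * sy;
        double z = cr * cp * sy - sr * sp * cy;
        return new double[] { w, x, y, z };
    }
}
```

On Android, SensorManager.getQuaternionFromVector(Q, event.values) also returns the quaternion as [w, x, y, z] directly from a rotation-vector reading, so for this sensor no manual conversion from Euler angles is needed at all.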
I want to get the rotation around the X, Y, and Z axes.
You can obtain Azimuth, Pitch and Roll in this way:
private SensorManager sensorManager;
...
// Rotation matrix based on current readings from accelerometer and magnetometer.
final float[] rotationMatrix = new float[9];
SensorManager.getRotationMatrix(rotationMatrix, null,
accelerometerReading, magnetometerReading);
// Express the updated rotation matrix as three orientation angles.
final float[] orientationAngles = new float[3];
SensorManager.getOrientation(rotationMatrix, orientationAngles);
Source: https://developer.android.com/guide/topics/sensors/sensors_position#java
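For reference, the three angles that getOrientation returns can be reproduced from the row-major 3x3 rotation matrix with these formulas (a plain-Java sketch of the extraction as implemented in AOSP):

```java
public final class OrientationAngles {
    /** Returns [azimuth, pitch, roll] in radians from a row-major 3x3 rotation matrix R. */
    static double[] fromRotationMatrix(float[] R) {
        double azimuth = Math.atan2(R[1], R[4]);  // rotation about the -Z axis
        double pitch   = Math.asin(-R[7]);        // rotation about the X axis
        double roll    = Math.atan2(-R[6], R[8]); // rotation about the Y axis
        return new double[] { azimuth, pitch, roll };
    }
}
```

With the identity matrix all three angles are 0, and a 90-degree turn about the vertical axis shows up only in azimuth, which is a handy sanity check when debugging remapCoordinateSystem setups.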
I need to extract rotation of an Android device about the device Z-axis
I am currently using the ROTATION_VECTOR sensor and the following code, where rotVec is cloned from the values member of the SensorEvent:
float[] q = new float[4];
float[] globalRot = new float[9];
float[] localRot = new float[9];
float[] euler = new float[3];
SensorManager.getQuaternionFromVector(q, rotVec);
SensorManager.getRotationMatrixFromVector(globalRot,q);
SensorManager.remapCoordinateSystem(globalRot, SensorManager.AXIS_Y, SensorManager.AXIS_Z, localRot);
SensorManager.getOrientation(localRot, euler);
float rotation = euler[1];
As the device gets closer to flat, the reported value drifts from the true value. Also, the value is inverted beyond ±90 degrees (0 is defined as Y pointing straight up).
What am I doing wrong, and what calls should I be using?
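One way to avoid the Euler extraction entirely is to compute the rotation about Z straight from the quaternion. A plain-Java sketch, assuming a unit quaternion in [w, x, y, z] order (as returned by getQuaternionFromVector); whether this matches the remapped axes above depends on your setup:

```java
public final class YawFromQuat {
    /** Rotation about the Z axis (yaw) in radians from a unit quaternion [w, x, y, z]. */
    static double yaw(double w, double x, double y, double z) {
        return Math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z));
    }
}
```

For the identity quaternion this gives 0, and for a pure 90-degree rotation about Z it gives pi/2, without the sign flip and drift that the angle triple exhibits near degenerate orientations.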
I'm developing an Android 2.2 application.
I want to know when the user moves the device up or down, and when they move it to the left or to the right. The device will be at rest when mounted vertically. In other words, this is for an augmented reality application using the camera (Y axis along the camera's axis) where the rotation angles are needed:
remapCoordinateSystem(inR, AXIS_X, AXIS_Z, outR);
This is my code:
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER)
        accelValues = event.values;
    if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD)
        magneValues = event.values;
    updateOrientation(calculateOrientation());
}

private float[] calculateOrientation() {
    float[] values = new float[3];
    float[] R = new float[9];
    float[] outR = new float[9];
    SensorManager.getRotationMatrix(R, null, accelValues, magneValues);
    SensorManager.remapCoordinateSystem(R,
            SensorManager.AXIS_X,
            SensorManager.AXIS_Z,
            outR);
    SensorManager.getOrientation(outR, values);
    // Convert from radians to degrees.
    values[0] = (float) Math.toDegrees(values[0]); // Azimuth, rotation around Z
    values[1] = (float) Math.toDegrees(values[1]); // Pitch, rotation around X
    values[2] = (float) Math.toDegrees(values[2]); // Roll, rotation around Y
    return values;
}
but I'm not sure how I can tell whether the user moves the device to the left or to the right.
Also, how can I know whether the user is walking?
There are azimuth, pitch, and roll values, but I don't know how to use them.
Any advice?
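As one possible sketch (the helper below is hypothetical, not from the question): a left/right rotation can be detected by comparing consecutive azimuth readings and normalizing the wrap-around at ±180 degrees, since azimuth increases as the device turns clockwise:

```java
public final class TurnDetector {
    /** Signed change between two azimuth readings in degrees, normalized to (-180, 180].
     *  Positive means the device turned right (clockwise); negative means left. */
    static double azimuthDelta(double previousDeg, double currentDeg) {
        double d = currentDeg - previousDeg;
        while (d <= -180.0) d += 360.0; // e.g. 170 -> -170 is really a +20 turn
        while (d > 180.0)   d -= 360.0;
        return d;
    }
}
```

Detecting walking is a different problem: orientation angles alone won't show it, and a typical approach is step detection from accelerometer magnitude peaks (or the TYPE_STEP_DETECTOR sensor on devices that have it).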