I have implemented listeners for both the Rotation Vector and Orientation sensors. I know the latter is deprecated, but I wanted to test both.
I know the Rotation Vector is a fusion sensor and the recommended one, but according to it, north (value[0] returned by getOrientation(rotationMatrix, value), where a bearing of 0 means north) doesn't match north from the Orientation sensor.
I've also compared against several apps from the Play Store; the Orientation sensor values seem to be closer to theirs.
Moreover, my azimuth (value[0] from TYPE_ROTATION_VECTOR followed by getOrientation) often just shoots up and keeps oscillating between -180 and 180.
P.S. getRotationMatrix(float[] R, float[] I, float[] gravity, float[] geomagnetic) also gives the same result as the Rotation Vector.
public final void onSensorChanged(SensorEvent event)
{
    float[] rotationMatrix;
    switch (event.sensor.getType())
    {
        // ...
        case Sensor.TYPE_ROTATION_VECTOR:
            rotationMatrix = new float[16];
            SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);
            determineOrientation(rotationMatrix);
            break;
        case Sensor.TYPE_ORIENTATION:
            sensorZValue.setText("" + event.values[0]); // rotation about the geographical Z axis
            sensorXValue.setText("" + event.values[1]); // rotation about the geographical X axis
            sensorYValue.setText("" + event.values[2]); // rotation about the geographical Y axis
            break;
    } // switch ends
}
private void determineOrientation(float[] rotationMatrix)
{
    float[] orientationValues = new float[3];
    SensorManager.getOrientation(rotationMatrix, orientationValues);
    double azimuth = Math.toDegrees(orientationValues[0]);
    double pitch = Math.toDegrees(orientationValues[1]);
    double roll = Math.toDegrees(orientationValues[2]);
    sensorZValue.setText(String.valueOf(azimuth)); // rotation about the geographical Z axis
    sensorXValue.setText(String.valueOf(pitch));   // rotation about the geographical X axis
    sensorYValue.setText(String.valueOf(roll));    // rotation about the geographical Y axis
}
I want to determine the angle between the phone's Y axis and the vector pointing north, so that was my initial implementation.
Please suggest what I'm missing.
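A side note on the oscillation between -180 and 180: part of it is the wrap-around at the ±180 boundary, where headings such as 179 and -179 are nearly the same direction, so naive averaging or display jumps wildly. One way to smooth azimuths across that boundary is to average their sine/cosine components. This is a minimal, Android-independent sketch; the class and method names are my own:

```java
// Hypothetical helper: averages noisy azimuth readings (in degrees) by
// summing their sine/cosine components, which avoids the jump at +/-180.
public final class AzimuthSmoother {
    public static double meanAzimuthDegrees(double[] samplesDeg) {
        double sumSin = 0, sumCos = 0;
        for (double d : samplesDeg) {
            double r = Math.toRadians(d);
            sumSin += Math.sin(r);
            sumCos += Math.cos(r);
        }
        // atan2 maps back into (-180, 180], consistent with getOrientation()
        return Math.toDegrees(Math.atan2(sumSin, sumCos));
    }
}
```

Averaging 179 and -179 this way yields ±180 rather than 0, which is what a compass user expects.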
I think this will help:
Android Compass that can Compensate for Tilt and Pitch
It calculates north using more reliable sources.
Hope this helps.
In Android, I am using the accelerometer and magnetic field sensors to calculate spatial positioning, as shown in the code below. The getRotationMatrix method generates real-world azimuth, pitch, and roll values. Azimuth and roll are reported in the range 0 to 180 or 0 to -180; pitch, however, is reported from 0 to 90 or 0 to -90. That's a problem for my app because I need to determine a unique orientation regardless of how the device is held: with pitch, two orientations can have the same value.
I need to apply a matrix transformation that remaps the sensor values to the range 0 to 360 degrees (actually, 360 itself wouldn't be valid since it's the same as 0; anything close to 360 would come out as something like 359.99999...).
I am not a mathematician and don't know how to use matrices, let alone use them in Android, but I am aware that this is what is required to get the 0 to 360 degree conversion. It would be nice if the transformation also took care of azimuth and roll so that they too produce values from 0 to 360, but if that isn't possible, that's fine, since unique positions can still be derived from their sensor values. Any suggestions on how to create this matrix transformation?
@Override
public void onSensorChanged(SensorEvent event)
{
    try
    {
        if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER)
            accelerometerValues = event.values.clone(); // copy: the event's buffer may be reused
        else if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD)
            magneticFieldValues = event.values.clone();
        if ((accelerometerValues != null) && (magneticFieldValues != null))
        {
            float[] R = new float[9];
            float[] I = new float[9];
            boolean success = SensorManager.getRotationMatrix(R, I, accelerometerValues, magneticFieldValues);
            if (success)
            {
                float[] values = new float[3];
                SensorManager.getOrientation(R, values);
                // Convert from radians to degrees if preferred.
                values[0] = (float) Math.toDegrees(values[0]); // Azimuth
                values[1] = (float) Math.toDegrees(values[1]); // Pitch
                values[2] = (float) Math.toDegrees(values[2]); // Roll
            }
        }
    }
    catch (Exception ex)
    {
        // Swallowing exceptions hides errors; at minimum, log them.
    }
}
EDIT:
The raw pitch values do not give unique readings as you rotate the device through 360 degrees, so I highly doubt any matrix transformation is going to produce the results I am after. Maybe I am using the wrong sensors.
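On the range question above: no matrix is needed for that part. A simple modulo shift remaps azimuth and roll from (-180, 180] into [0, 360). It cannot help pitch, though; as the EDIT notes, getOrientation() only ever reports pitch in [-90, 90], so no range remap alone can make it unique. A minimal plain-Java sketch, with naming of my own:

```java
// Hypothetical helper: shifts an angle reported in (-180, 180] by
// getOrientation() into the compass range [0, 360).
public final class AngleRange {
    public static float toCompassRange(float degrees) {
        float d = degrees % 360f;
        return d < 0 ? d + 360f : d;
    }
}
```

For example, a reading of -90 degrees becomes 270, and values already in [0, 360) pass through unchanged.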
I just did my project for reading the gyroscope sensor. If I lay my phone flat on the table, the gyroscope results are:
Roll (X): 5.326322E-7
Pitch (Y): 5.326322E-7
Yaw (Z): 5.326322E-7
Logically, the result should be 0, because the phone is lying still on the table. Can anybody help me? My code is below. Thank you very much in advance for any response :).
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_GYROSCOPE)
    {
        float rollX = event.values[0];
        float pitchY = event.values[1];
        float yawZ = event.values[2];
        koordinatrollX.setText("Orientation X (Roll): " + Float.toString(rollX));
        koordinatpitchY.setText("Orientation Y (Pitch): " + Float.toString(pitchY));
        koordinatyawZ.setText("Orientation Z (Yaw): " + Float.toString(yawZ));
    }
}
E-7 means 10^-7, so the value is very close to 0. You can't expect the hardware to be perfectly calibrated or the table to be perfectly flat. You could let the user recalibrate based on the surface the phone rests on, but the result you are getting is already essentially zero.
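If a stationary device should read exactly 0, a small dead-zone threshold can be applied on top of that calibration. A minimal sketch; the 0.01 rad/s floor is an assumption of mine, so tune it for the actual hardware:

```java
// Sketch of a dead-zone filter: readings below an assumed noise floor are
// clamped to exactly 0; larger readings pass through untouched.
public final class DeadZone {
    public static float apply(float value, float threshold) {
        return Math.abs(value) < threshold ? 0f : value;
    }
}
```

With a threshold of 0.01, the 5.326322E-7 readings above would all be reported as 0.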
I'm trying to build a simple augmented reality app, so I started working with sensor data.
According to this thread (Android compass example) and this example (http://www.codingforandroid.com/2011/01/using-orientation-sensors-simple.html), calculating the orientation from Sensor.TYPE_ACCELEROMETER and Sensor.TYPE_MAGNETIC_FIELD doesn't really fit my needs.
I'm not able to get "good" values. The azimuth values don't make any sense at all: if I just tilt the phone upward, the value changes drastically, and even when I simply rotate the phone the values don't represent the phone's orientation.
Does anybody have an idea how to improve the quality of the values in the given example?
In what kind of orientation do you use this sample app? From what is written in this code, the only orientation supported is portrait or flat on the table; it depends on the device. What do you mean by "good"?
It is normal that the values are not "good" when rotating the device. The device coordinate system is defined with the device in portrait (or flat, depending on the device): the Y axis runs vertically along the screen pointing up, the Z axis points out of the screen from its center, and the X axis is perpendicular to the Y axis, pointing right along the screen. Rotating the device does not rotate the device coordinate system, so you have to remap it.
But if you want the heading of the device in portrait orientation, here is a piece of code that works well for me:
@Override
public void onSensorChanged(SensorEvent event)
{
    // It is good practice to check that we received the proper sensor event
    if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR)
    {
        // Convert the rotation vector to a rotation matrix.
        SensorManager.getRotationMatrixFromVector(mRotationMatrix, event.values);
        SensorManager.remapCoordinateSystem(mRotationMatrix,
                SensorManager.AXIS_X, SensorManager.AXIS_Z,
                mRotationMatrix);
        SensorManager.getOrientation(mRotationMatrix, orientationVals);
        // Optionally convert the result from radians to degrees
        orientationVals[0] = (float) Math.toDegrees(orientationVals[0]);
        orientationVals[1] = (float) Math.toDegrees(orientationVals[1]);
        orientationVals[2] = (float) Math.toDegrees(orientationVals[2]);
        tv.setText(" Yaw: " + orientationVals[0] + "\n Pitch: "
                + orientationVals[1] + "\n Roll (not used): "
                + orientationVals[2]);
    }
}
You'll get the heading (or azimuth) in:
orientationVals[0]
Tíbó's answer is good, but if you log the roll value you will see irregular numbers.
(Roll is important for AR browsers.)
This is due to:
SensorManager.remapCoordinateSystem(mRotationMatrix,
        SensorManager.AXIS_X, SensorManager.AXIS_Z,
        mRotationMatrix);
You have to use different matrices for the input and output of the remap. The following code works for me and gives a correct roll value:
@Override
public void onSensorChanged(SensorEvent event)
{
    // It is good practice to check that we received the proper sensor event
    if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR)
    {
        // Convert the rotation vector to a rotation matrix.
        SensorManager.getRotationMatrixFromVector(mRotationMatrixFromVector, event.values);
        // Note: separate input and output matrices for the remap.
        SensorManager.remapCoordinateSystem(mRotationMatrixFromVector,
                SensorManager.AXIS_X, SensorManager.AXIS_Z,
                mRotationMatrix);
        SensorManager.getOrientation(mRotationMatrix, orientationVals);
        // Optionally convert the result from radians to degrees
        orientationVals[0] = (float) Math.toDegrees(orientationVals[0]);
        orientationVals[1] = (float) Math.toDegrees(orientationVals[1]);
        orientationVals[2] = (float) Math.toDegrees(orientationVals[2]);
        tv.setText(" Yaw: " + orientationVals[0] + "\n Pitch: "
                + orientationVals[1] + "\n Roll: "
                + orientationVals[2]);
    }
}
Probably late to the party, but anyway, here is how I got the azimuth:
private final int sensorType = Sensor.TYPE_ROTATION_VECTOR;
float[] rotMat = new float[9];
float[] vals = new float[3];
@Override
public void onSensorChanged(SensorEvent event) {
    sensorHasChanged = false;
    if (event.sensor.getType() == sensorType) {
        SensorManager.getRotationMatrixFromVector(rotMat, event.values);
        // Remapping in place is fine here since only azimuth is needed;
        // use a separate output matrix if you also need a correct roll.
        SensorManager.remapCoordinateSystem(rotMat,
                SensorManager.AXIS_X, SensorManager.AXIS_Y,
                rotMat);
        SensorManager.getOrientation(rotMat, vals);
        azimuth = deg(vals[0]); // in degrees, [-180, +180]; deg() converts radians to degrees
        pitch = deg(vals[1]);
        roll = deg(vals[2]);
        sensorHasChanged = true;
    }
}
Hope it helps
Have you tried the combined (sensor-fusion) type Sensor.TYPE_ROTATION_VECTOR? It may give better results.
Go to https://developer.android.com/reference/android/hardware/SensorEvent.html and search for 'rotation_vector'.
Here's a Kotlin approach with all the necessary matrices included (for some reason the previous answers leave out the array sizes, which matter):
// This is determined from the deprecated Sensor.TYPE_ORIENTATION
var lastOrientation: FloatArray = FloatArray(3)
var lastHeading: Float = 0f
var currentHeading: Float = 0f

// This is from the non-deprecated Sensor.TYPE_ROTATION_VECTOR
var lastVectorOrientation: FloatArray = FloatArray(5)
var lastVectorHeading: Float = 0f
var currentVectorHeading: Float = 0f

override fun onSensorChanged(event: SensorEvent) {
    when (event.sensor?.type) {
        null -> return
        Sensor.TYPE_ORIENTATION -> {
            lastOrientation = event.values
            lastHeading = currentHeading
            currentHeading = abs(event.values[0].roundToInt().toFloat())
        }
        Sensor.TYPE_ROTATION_VECTOR -> {
            lastVectorOrientation = event.values
            lastVectorHeading = currentVectorHeading
            val tempRotationMatrix = FloatArray(9)
            val tempOrientationMatrix = FloatArray(3)
            getRotationMatrixFromVector(tempRotationMatrix, event.values)
            remapCoordinateSystem(tempRotationMatrix, AXIS_X, AXIS_Z, tempRotationMatrix)
            getOrientation(tempRotationMatrix, tempOrientationMatrix)
            currentVectorHeading = Math.toDegrees(tempOrientationMatrix[0].toDouble()).toFloat()
            if (currentVectorHeading < 0) {
                // Shift from (-180, 180] to [0, 360): 360 + (negative heading)
                currentVectorHeading += 360f
            }
        }
        else -> return
    }
}
I've also included the deprecated Sensor.TYPE_ORIENTATION for anybody wanting to see the difference between the two approaches. There is a difference of several degrees between the deprecated method and the updated approach.
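To quantify that difference, the two headings have to be compared with wrap-around in mind, since readings such as 359 and 1 are only 2 degrees apart. A minimal plain-Java sketch; the class and method names are my own:

```java
// Hypothetical helper: signed difference between two headings in degrees,
// wrapped into (-180, 180], for comparing two compass readings fairly.
public final class HeadingDelta {
    public static float delta(float a, float b) {
        float d = (a - b) % 360f;
        if (d > 180f) d -= 360f;
        if (d <= -180f) d += 360f;
        return d;
    }
}
```

For example, delta(350, 10) is -20 rather than 340, so a small physical difference stays small numerically.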
I have to write a compass app for Android. The only thing the user sees on the screen is a cube with a red wall that has to point north. That part is not important. What's important is that I need to rotate the cube according to the rotation of the device itself, so that the red wall keeps pointing north no matter how the phone is held. My code is simple and straightforward:
@Override
public void onSensorChanged(SensorEvent event) {
    synchronized (this) {
        switch (event.sensor.getType()) {
            case Sensor.TYPE_ACCELEROMETER:
                direction = event.values[2];
                break;
            case Sensor.TYPE_ORIENTATION:
                if (direction < 0) {
                    angleX = event.values[1];
                    angleY = -event.values[2];
                    angleZ = event.values[0];
                } else {
                    angleX = -event.values[1];
                    angleY = -event.values[2];
                    angleZ = event.values[0];
                }
                break;
        }
    }
}
I added the extra direction variable, which simply stores whether the phone's display is pointing downwards or upwards. I don't know if I need it, but it seems to fix some bugs. I am using the SensorSimulator for Android, but whenever the pitch slider enters the [-90, 90] interval the other variables get mixed up, as if they received a 180-degree offset. And I can't detect when I am in that interval, because pitch only ranges from -90 to 90, so I can move the slider from left to right and always be inside it.
This was all just to show how far my code has advanced. I am not saying how the problem should be solved, because I would probably only steer myself into a dead end. You see, I have been trying to write this app for 3 days now, and you can imagine how annoyed my boss is. I have read all sorts of tutorials and tried every formula I could find or think of. So please help me. All I need to know is how to rotate my cube, whose rotation angles are EULER ANGLES in degrees.
Here's some code I wrote to do something pretty similar, really only caring about the device's rotation in the roll direction. Hope it helps! It uses only the accelerometer values to determine roll and pitch; there is no need to get the orientation of the view.
public void onSensorChanged(SensorEvent event) {
    float x = -1 * event.values[0] / SensorManager.GRAVITY_EARTH;
    float y = -1 * event.values[1] / SensorManager.GRAVITY_EARTH;
    float z = -1 * event.values[2] / SensorManager.GRAVITY_EARTH;
    float signedRawRoll = (float) (Math.atan2(x, y) * 180 / Math.PI);
    float unsignedRawRoll = Math.abs(signedRawRoll);
    float rollSign = signedRawRoll / unsignedRawRoll;
    float rawPitch = Math.abs(z * 180);
    // Use a basic low-pass filter to keep only the gravity component of the
    // accelerometer values for the X and Y axes; adjust the filter weight
    // based on pitch, as roll is harder to define as pitch approaches 180.
    float filterWeight = rawPitch > 165 ? 0.85f : 0.7f;
    float newUnsignedRoll = filterWeight * Math.abs(this.roll) + (1 - filterWeight) * unsignedRawRoll;
    this.roll = rollSign * newUnsignedRoll;
    if (Float.isInfinite(this.roll) || Float.isNaN(this.roll)) {
        this.roll = 0;
    }
    this.pitch = filterWeight * this.pitch + (1 - filterWeight) * rawPitch;
    for (IAngleListener listener : listeners) {
        listener.deviceRollAndPitch(this.roll, this.pitch);
    }
}
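The smoothing above is an exponential moving average. Isolated from the sensor plumbing, the update rule is just a weighted blend of the previous estimate and the new sample; a larger weight means heavier smoothing and a slower response. A minimal sketch, with naming of my own:

```java
// Sketch of one step of an exponential (low-pass) filter:
// new estimate = weight * previous + (1 - weight) * sample.
public final class LowPass {
    public static float step(float previous, float sample, float weight) {
        return weight * previous + (1f - weight) * sample;
    }
}
```

For example, with a weight of 0.7, a jump from 0 to 10 moves the estimate only about 30% of the way toward the new sample on the first step.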