I've noticed that when I hold my phone just below the horizon I get readings > -90. As I begin to tilt the phone so it's pointing towards the sky, the pitch reflects around -90: -88, -90, -88, and so on, as it's tilted from the ground to the sky.
Has anyone experienced this before? (It doesn't seem to be related to remapCoordinateSystem; I've previously commented it out.) When the phone's camera is pointed towards the ground the pitch reading is zero, and when it's pointed towards the ceiling it's also zero.
Thanks for any help.
@Override
public void onSensorChanged(SensorEvent event) {
    synchronized (MainActivity.this) { // TiltController
        switch (event.sensor.getType()) {
            case Sensor.TYPE_MAGNETIC_FIELD:
                mMagneticValues = event.values.clone();
                break;
            case Sensor.TYPE_ACCELEROMETER:
                mAccelerometerValues = event.values.clone();
                break;
        }
        if (mMagneticValues != null && mAccelerometerValues != null) {
            SensorManager.getRotationMatrix(R, null, mAccelerometerValues, mMagneticValues);
            Display display = ((WindowManager) getSystemService(Context.WINDOW_SERVICE)).getDefaultDisplay();
            int rotation = display.getRotation();
            switch (rotation) {
                case Surface.ROTATION_90: // landscape
                    SensorManager.remapCoordinateSystem(R, SensorManager.AXIS_MINUS_Y, SensorManager.AXIS_MINUS_X, R); // shouldn't be the same R in and out
                    break;
                case Surface.ROTATION_0: // portrait
                    SensorManager.remapCoordinateSystem(R, SensorManager.AXIS_Y, SensorManager.AXIS_X, R); // shouldn't be the same R in and out
                    break;
            }
            float[] orientation = new float[3];
            SensorManager.getOrientation(R, orientation);
            mAzimuth = orientation[0];
            mPitch = orientation[1];
            mRoll = orientation[2];
            dirText.setText("Azimuth, Pitch, Roll || " + radStr(mAzimuth) + ", " + radStr(mPitch) + ", " + radStr(mRoll));
            glView.rotate((float) Math.toDegrees(mAzimuth), (float) Math.toDegrees(mPitch), (float) Math.toDegrees(mRoll));
            //glView.azimuth = (float) Math.toDegrees(smooth(mAzimuth));
            glView.pitch = (float) Math.toDegrees(smooth(mPitch)); // -90 makes it center
            //glView.roll = (float) Math.toDegrees(smooth(-mRoll));
            //Log.i("Azimuth, Pitch, Roll", mAzimuth + ", " + mPitch + ", " + mRoll);
        }
    }
}
A temporary fix for my needs involves rotating the matrix before calling getOrientation:
Matrix.rotateM(R, 0, 90, 0, 1, 0);
However, if you do a complete flip, it experiences the same issue. (This should be solved by adding in the azimuth.) But it isn't a brilliant solution.
So if others are reading this and are trying to make the horizon 0 degrees: rotate the matrix beforehand, as opposed to rotating your display (I'm using OpenGL).
I ended up locking the app in landscape and applying the following:
SensorManager.getRotationMatrix(R, null, mAccelerometerValues, mMagneticValues);
Display display = ((WindowManager) getSystemService(Context.WINDOW_SERVICE)).getDefaultDisplay();
int rotation = display.getRotation();
SensorManager.remapCoordinateSystem(R, SensorManager.AXIS_X, SensorManager.AXIS_Z, R); // shouldn't be the same R in and out
float[] orientation = new float[3];
SensorManager.getOrientation(R, orientation);
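For anyone wondering why the reflection happens at all: SensorManager.getOrientation() derives pitch with an arcsine, so pitch is confined to [-90°, 90°] and folds back as the phone passes straight up or straight down. A minimal sketch of a full-range alternative, assuming the 9-element R from getRotationMatrix() and that roll stays near zero (the R[8] term changes sign otherwise):

// Full-range tilt (illustrative): getOrientation() computes asin(-R[7]),
// which folds at +/-90 degrees; atan2 keeps the quadrant, giving (-180, 180].
float fullTilt = (float) Math.toDegrees(Math.atan2(-R[7], R[8]));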
I'm trying to build a simple augmented reality app, so I started working with the sensor data.
According to this thread (Android compass example) and this example (http://www.codingforandroid.com/2011/01/using-orientation-sensors-simple.html), the calculation of the orientation using Sensor.TYPE_ACCELEROMETER and Sensor.TYPE_MAGNETIC_FIELD doesn't really fit.
So I'm not able to get "good" values. The azimuth values don't make any sense at all: if I just move the phone upwards, the value changes drastically. Even if I just rotate the phone, the values don't represent the phone's orientation.
Has anybody an idea how to improve the value quality, given the example above?
In what kind of orientation do you use this sample app? From what is written in this code, the only orientation supported is portrait or flat on the table, depending on the device. What do you mean by "good"?
It is normal that the values are not "good" when rotating the device. The device coordinate system is defined with the device in portrait, or flat on the table (Y axis vertical along the screen pointing up, Z axis pointing out of the screen coming from the center of the screen, X axis perpendicular to the Y axis going to the right along the screen). Rotating the device does not rotate the device coordinate system with it; you'll have to remap it.
But if you want the heading of the device in portrait orientation, here is a piece of code that works well for me:
@Override
public void onSensorChanged(SensorEvent event)
{
    // It is good practice to check that we received the proper sensor event
    if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR)
    {
        // Convert the rotation-vector to a 4x4 matrix.
        SensorManager.getRotationMatrixFromVector(mRotationMatrix, event.values);
        SensorManager.remapCoordinateSystem(mRotationMatrix,
                SensorManager.AXIS_X, SensorManager.AXIS_Z,
                mRotationMatrix);
        SensorManager.getOrientation(mRotationMatrix, orientationVals);
        // Optionally convert the result from radians to degrees
        orientationVals[0] = (float) Math.toDegrees(orientationVals[0]);
        orientationVals[1] = (float) Math.toDegrees(orientationVals[1]);
        orientationVals[2] = (float) Math.toDegrees(orientationVals[2]);
        tv.setText(" Yaw: " + orientationVals[0] + "\n Pitch: "
                + orientationVals[1] + "\n Roll (not used): "
                + orientationVals[2]);
    }
}
You'll get the heading (or azimuth) in:
orientationVals[0]
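For completeness: this assumes the activity implements SensorEventListener and registers for the rotation-vector sensor; a minimal sketch of that setup (field names and the delay constant are illustrative):

private SensorManager mSensorManager;

@Override
protected void onResume() {
    super.onResume();
    mSensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
    Sensor rotationVector = mSensorManager.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR);
    mSensorManager.registerListener(this, rotationVector, SensorManager.SENSOR_DELAY_UI);
}

@Override
protected void onPause() {
    super.onPause();
    // Unregister to stop sensor delivery and save battery
    mSensorManager.unregisterListener(this);
}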
The answer from Tíbó is good, but if you log the roll value, you will see irregular numbers. (Roll is important for AR browsers.) This is due to:
SensorManager.remapCoordinateSystem(mRotationMatrix,
        SensorManager.AXIS_X, SensorManager.AXIS_Z,
        mRotationMatrix);
You have to use different matrices for the input and output of the remap. The following code works for me with a correct roll value:
@Override
public void onSensorChanged(SensorEvent event)
{
    // It is good practice to check that we received the proper sensor event
    if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR)
    {
        // Convert the rotation-vector to a 4x4 matrix.
        SensorManager.getRotationMatrixFromVector(mRotationMatrixFromVector, event.values);
        SensorManager.remapCoordinateSystem(mRotationMatrixFromVector,
                SensorManager.AXIS_X, SensorManager.AXIS_Z,
                mRotationMatrix);
        SensorManager.getOrientation(mRotationMatrix, orientationVals);
        // Optionally convert the result from radians to degrees
        orientationVals[0] = (float) Math.toDegrees(orientationVals[0]);
        orientationVals[1] = (float) Math.toDegrees(orientationVals[1]);
        orientationVals[2] = (float) Math.toDegrees(orientationVals[2]);
        tv.setText(" Yaw: " + orientationVals[0] + "\n Pitch: "
                + orientationVals[1] + "\n Roll (not used): "
                + orientationVals[2]);
    }
}
Probably late to the party. Anyway, here is how I got the azimuth:
private final int sensorType = Sensor.TYPE_ROTATION_VECTOR;
float[] rotMat = new float[9];
float[] vals = new float[3];

@Override
public void onSensorChanged(SensorEvent event) {
    sensorHasChanged = false;
    if (event.sensor.getType() == sensorType) {
        SensorManager.getRotationMatrixFromVector(rotMat, event.values);
        SensorManager.remapCoordinateSystem(rotMat,
                SensorManager.AXIS_X, SensorManager.AXIS_Y,
                rotMat);
        SensorManager.getOrientation(rotMat, vals);
        azimuth = deg(vals[0]); // in degrees [-180, +180]
        pitch = deg(vals[1]);
        roll = deg(vals[2]);
        sensorHasChanged = true;
    }
}
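The deg() helper isn't an SDK method; presumably it's just a radians-to-degrees conversion, something like:

// Assumed helper: getOrientation() returns radians
private float deg(float radians) {
    return (float) Math.toDegrees(radians);
}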
Hope it helps
Have you tried the combined (sensor-fusion) type Sensor.TYPE_ROTATION_VECTOR? It may give better results: go to https://developer.android.com/reference/android/hardware/SensorEvent.html and search for 'rotation_vector'.
Here's a Kotlin approach with all the necessary matrices included (for some reason the previous answers leave out the array sizes, which matter):
// This is determined from the deprecated Sensor.TYPE_ORIENTATION
var lastOrientation: FloatArray = FloatArray(3)
var lastHeading: Float = 0f
var currentHeading: Float = 0f

// This is from the non-deprecated Sensor.TYPE_ROTATION_VECTOR
var lastVectorOrientation: FloatArray = FloatArray(5)
var lastVectorHeading: Float = 0f
var currentVectorHeading: Float = 0f

override fun onSensorChanged(event: SensorEvent) {
    when (event.sensor?.type) {
        null -> return
        Sensor.TYPE_ORIENTATION -> {
            lastOrientation = event.values
            lastHeading = currentHeading
            currentHeading = abs(event.values[0].roundToInt().toFloat())
        }
        Sensor.TYPE_ROTATION_VECTOR -> {
            lastVectorOrientation = event.values
            lastVectorHeading = currentVectorHeading
            val tempRotationMatrix = FloatArray(9)
            val tempOrientationMatrix = FloatArray(3)
            getRotationMatrixFromVector(tempRotationMatrix, event.values)
            remapCoordinateSystem(tempRotationMatrix, AXIS_X, AXIS_Z, tempRotationMatrix)
            getOrientation(tempRotationMatrix, tempOrientationMatrix)
            currentVectorHeading = Math.toDegrees(tempOrientationMatrix[0].toDouble()).toFloat()
            if (currentVectorHeading < 0) {
                currentVectorHeading += 360f // heading = 360 - abs(neg heading), i.e. 360 + (-heading)
            }
        }
        else -> return
    }
}
I've also included the deprecated Sensor.TYPE_ORIENTATION for anybody wanting to see the difference between the two approaches. There is a difference of several degrees when using the deprecated method versus the updated approach.
I'm trying to get the direction of the camera in Android. I have code that works perfectly in portrait (I test it by slowly turning in a circle and looking at updates one second apart), but it isn't working at all in landscape: the numbers seem to change randomly. It also gets totally out of whack after switching from portrait to landscape. Here's my code:
public void onSensorChanged(SensorEvent event) {
    switch (event.sensor.getType()) {
        case Sensor.TYPE_ACCELEROMETER:
            accelerometerValues = event.values.clone();
            break;
        case Sensor.TYPE_MAGNETIC_FIELD:
            geomagneticMatrix = event.values.clone();
            break;
        default:
            break;
    }
    if (geomagneticMatrix != null && accelerometerValues != null) {
        float[] R = new float[16];
        float[] I = new float[16];
        float[] outR = new float[16];
        // Get the rotation matrix, then remap it from camera surface to world coordinates
        SensorManager.getRotationMatrix(R, I, accelerometerValues, geomagneticMatrix);
        SensorManager.remapCoordinateSystem(R, SensorManager.AXIS_X, SensorManager.AXIS_Z, outR);
        float[] values = new float[4];
        SensorManager.getOrientation(outR, values);
        float direction = normalizeDegrees((float) Math.toDegrees(values[0]));
        float pitch = normalizeDegrees((float) Math.toDegrees(values[1]));
        float roll = normalizeDegrees((float) Math.toDegrees(values[2]));
        if ((int) direction != (int) lastDirection) {
            lastDirection = direction;
            for (CompassListener listener : listeners) {
                listener.onDirectionChanged(lastDirection, pitch, roll);
            }
        }
    }
}
Any ideas what I'm doing wrong? I freely admit I don't 100% understand this. I also don't know why Google deprecated the orientation sensor; it seems like a common enough need.
Did you consider that when you change from portrait to landscape, the accelerometer axes change relative to the screen? What was the screen's Y axis, for example, now lies along the device's X axis, and so on. This might be one source of the strange behavior.
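If that is the cause, a sketch of compensating for the display rotation before getOrientation(); the ROTATION_90 pair is the example from the SensorManager docs for a device used as a compass, and composing this with the camera-style remap (AXIS_X, AXIS_Z) used above is left out for brevity:

// Sketch: map the screen's X/Y axes back onto device axes per rotation.
int screenRotation = getWindowManager().getDefaultDisplay().getRotation();
int axisX, axisY;
switch (screenRotation) {
    case Surface.ROTATION_90:  axisX = SensorManager.AXIS_Y;       axisY = SensorManager.AXIS_MINUS_X; break;
    case Surface.ROTATION_180: axisX = SensorManager.AXIS_MINUS_X; axisY = SensorManager.AXIS_MINUS_Y; break;
    case Surface.ROTATION_270: axisX = SensorManager.AXIS_MINUS_Y; axisY = SensorManager.AXIS_X;       break;
    default:                   axisX = SensorManager.AXIS_X;       axisY = SensorManager.AXIS_Y;       break; // ROTATION_0
}
SensorManager.remapCoordinateSystem(R, axisX, axisY, outR);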
I seem to have solved it, or at least improved it to the point where I know what the problem was. I put in a filter such that instead of delivering a single sensor reading, I remember the last reading and apply a delta to it: each new sensor point is allowed to add a maximum of 5 degrees. This completely filters out the weird hops and forces the value to converge. I still see an occasional odd jump, but I figure what I need is a more sophisticated filter. New code:
public void onSensorChanged(SensorEvent event) {
    if (event.accuracy == SensorManager.SENSOR_STATUS_UNRELIABLE)
        return;
    switch (event.sensor.getType()) {
        case Sensor.TYPE_ACCELEROMETER:
            accelerometerValues = event.values.clone();
            break;
        case Sensor.TYPE_MAGNETIC_FIELD:
            geomagneticMatrix = event.values.clone();
            break;
    }
    if (geomagneticMatrix != null && accelerometerValues != null) {
        float[] R = new float[16];
        float[] I = new float[16];
        float[] outR = new float[16];
        // Get the rotation matrix, then remap it from camera surface to world coordinates
        SensorManager.getRotationMatrix(R, I, accelerometerValues, geomagneticMatrix);
        SensorManager.remapCoordinateSystem(R, SensorManager.AXIS_X, SensorManager.AXIS_Z, outR);
        float[] values = new float[4];
        SensorManager.getOrientation(outR, values);
        int direction = filterChange(normalizeDegrees(Math.toDegrees(values[0])));
        int pitch = normalizeDegrees(Math.toDegrees(values[1]));
        int roll = normalizeDegrees(Math.toDegrees(values[2]));
        if (direction != lastDirection) {
            lastDirection = direction;
            lastPitch = pitch;
            lastRoll = roll;
            for (CompassListener listener : listeners) {
                listener.onDirectionChanged(lastDirection, pitch, roll);
            }
        }
    }
}
// Normalize degrees to [0, 360) instead of [-180, 180)
private int normalizeDegrees(double degrees) {
    return (int) ((degrees + 360) % 360);
}

// We want to ignore large bumps in individual readings, so we cap the number of degrees we can change per report
private int filterChange(int newDir) {
    int change = newDir - lastDirection;
    int circularChange = newDir - (lastDirection + 360);
    int smallestChange;
    if (Math.abs(change) < Math.abs(circularChange)) {
        smallestChange = change;
    } else {
        smallestChange = circularChange;
    }
    // Clamp the winning delta to +/-5 degrees per report
    smallestChange = Math.max(Math.min(smallestChange, 5), -5);
    return lastDirection + smallestChange;
}
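One candidate for a more sophisticated filter: smooth the sine and cosine of the heading separately and recombine them with atan2, which avoids both the 0/360 seam and hard caps. A minimal sketch (the smoothing factor is illustrative):

// Smooth a circular quantity (heading in degrees) without seam artifacts.
private double smoothSin, smoothCos;
private static final double ALPHA = 0.15; // 0..1, higher = less smoothing

private double smoothHeading(double headingDegrees) {
    double rad = Math.toRadians(headingDegrees);
    smoothSin += ALPHA * (Math.sin(rad) - smoothSin);
    smoothCos += ALPHA * (Math.cos(rad) - smoothCos);
    double smoothed = Math.toDegrees(Math.atan2(smoothSin, smoothCos));
    return (smoothed + 360) % 360; // back to [0, 360)
}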
What I want to happen is to remap the coordinate system when the phone is turned away from its "natural" orientation, so that when the phone is in landscape it reads the same values as if it were being held in portrait.
I'm checking whether the rotation equals Surface.ROTATION_90, and if so, remapping the coordinate system.
I admit I don't quite understand how to do it properly and could use a little guidance.
So, you need to run these two methods:
SensorManager.getRotationMatrix(inR, I, grav, mag);
SensorManager.remapCoordinateSystem(inR, SensorManager.AXIS_Y, SensorManager.AXIS_MINUS_X, outR);
What's required to pass into these methods? I created a new float array, then passed just the orientation-sensor data as the magnetic-field argument, which didn't work. So I registered both the accelerometer and magnetic-field sensors, fed the data from both of those to the getRotationMatrix method, and I always get a NullPointerException (even though the JavaDoc says some arguments can be null). I even tried passing data to each argument and still got a NullPointerException.
My question is: what is the proper data that I need to pass into the getRotationMatrix method?
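For reference, a minimal sketch of the data flow those two methods expect; the NullPointerException usually means one of the readings is still null because that sensor hasn't delivered an event yet:

// Both sensors must be registered on this listener. getRotationMatrix()
// needs a 9- (or 16-) element output array plus the latest readings from
// TYPE_ACCELEROMETER and TYPE_MAGNETIC_FIELD; passing null readings NPEs.
private float[] grav; // latest accelerometer values
private float[] mag;  // latest magnetic-field values

@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER)
        grav = event.values.clone();
    else if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD)
        mag = event.values.clone();
    if (grav == null || mag == null)
        return; // wait until both sensors have reported

    float[] inR = new float[9];
    float[] outR = new float[9];
    if (SensorManager.getRotationMatrix(inR, null, grav, mag)) {
        SensorManager.remapCoordinateSystem(inR,
                SensorManager.AXIS_Y, SensorManager.AXIS_MINUS_X, outR);
    }
}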
I found that a very simple way to do this is the one used in the SDK sample AccelerometerPlay.
First you get your display like this, for example in onResume():
WindowManager windowManager = (WindowManager) getSystemService(WINDOW_SERVICE);
mDisplay = windowManager.getDefaultDisplay();
Then in onSensorChanged() you can use this simple code:
@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() != Sensor.TYPE_ACCELEROMETER)
        return;
    switch (mDisplay.getRotation()) {
        case Surface.ROTATION_0:
            mSensorX = event.values[0];
            mSensorY = event.values[1];
            break;
        case Surface.ROTATION_90:
            mSensorX = -event.values[1];
            mSensorY = event.values[0];
            break;
        case Surface.ROTATION_180:
            mSensorX = -event.values[0];
            mSensorY = -event.values[1];
            break;
        case Surface.ROTATION_270:
            mSensorX = event.values[1];
            mSensorY = -event.values[0];
            break;
    }
}
Hope this will help.
This is my code, and it works without NPEs. Note that I have just one listener, but you have to register it to listen to both sensors (ACCELEROMETER and MAGNETIC_FIELD).
private SensorEventListener mOrientationSensorsListener = new SensorEventListener() {
    private float[] mR = new float[9];
    private float[] mRemappedR = new float[9];
    private float[] mGeomagneticVector = new float[3];
    private float[] mGravityVector = new float[3];

    @Override
    public void onSensorChanged(SensorEvent event) {
        synchronized (this) {
            if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
                mGravityVector = Util.exponentialSmoothing(event.values, mGravityVector, 0.2f);
            } else if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) {
                mGeomagneticVector = Util.exponentialSmoothing(event.values, mGeomagneticVector, 0.5f);
                SensorManager.getRotationMatrix(mR, null, mGravityVector, mGeomagneticVector);
                SensorManager.remapCoordinateSystem(mR, SensorManager.AXIS_Y, SensorManager.AXIS_MINUS_X, mRemappedR);
            }
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
};
The exponentialSmoothing method does some smoothing of the sensor results and looks like this (the alpha value can go from 0 to 1, where 1 means no smoothing at all; with alpha = 0.2, for example, each new sample moves the output 20% of the way toward the input):
public static float[] exponentialSmoothing(float[] input, float[] output, float alpha) {
    for (int i = 0; i < input.length; i++) {
        output[i] = output[i] + alpha * (input[i] - output[i]);
    }
    return output;
}
As for the synchronized bit: I'm not sure it's needed; I just read about it somewhere and added it.
My augmented reality app needs the compass bearing of the camera view, and there are plenty of examples of getting the direction from the SensorManager.
However, I'm finding the resulting value differs depending on the phone's orientation: landscape rotated to the right is about 10 degrees different from landscape rotated to the left (the difference between ROTATION_0 and ROTATION_180 is less, but still there). This difference is enough to ruin any AR effect.
Is it something to do with calibration? (I'm not convinced I'm doing the figure-of-eight thing properly; I've tried various ways I've found on YouTube.)
Any ideas why there's a difference? Have I messed up on the rotation matrix stuff? I have the option of restricting the app to a single orientation, but it still concerns me that the compass reading isn't very accurate (even though after filtering it's fairly stable).
public void onSensorChanged(SensorEvent event) {
    if (event.accuracy == SensorManager.SENSOR_STATUS_UNRELIABLE) {
        return;
    }
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) mGravity = event.values;
    if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) mGeomagnetic = event.values;
    if (mGravity != null && mGeomagnetic != null) {
        float[] rotationMatrixA = mRotationMatrixA;
        if (SensorManager.getRotationMatrix(rotationMatrixA, null, mGravity, mGeomagnetic)) {
            float[] rotationMatrixB = mRotationMatrixB;
            Display display = getWindowManager().getDefaultDisplay();
            int deviceRot = display.getRotation();
            switch (deviceRot) {
                // portrait - normal
                case Surface.ROTATION_0:
                    SensorManager.remapCoordinateSystem(rotationMatrixA,
                            SensorManager.AXIS_X, SensorManager.AXIS_Z,
                            rotationMatrixB);
                    break;
                // rotated left (landscape - keys to bottom)
                case Surface.ROTATION_90:
                    SensorManager.remapCoordinateSystem(rotationMatrixA,
                            SensorManager.AXIS_Z, SensorManager.AXIS_MINUS_X,
                            rotationMatrixB);
                    break;
                // upside down
                case Surface.ROTATION_180:
                    SensorManager.remapCoordinateSystem(rotationMatrixA,
                            SensorManager.AXIS_X, SensorManager.AXIS_Z,
                            rotationMatrixB);
                    break;
                // rotated right
                case Surface.ROTATION_270:
                    SensorManager.remapCoordinateSystem(rotationMatrixA,
                            SensorManager.AXIS_MINUS_Z, SensorManager.AXIS_X,
                            rotationMatrixB);
                    break;
                default:
                    break;
            }
            float[] dv = new float[3];
            SensorManager.getOrientation(rotationMatrixB, dv);
            // add to smoothing filter
            fd.AddLatest((double) dv[0]);
        }
        mDraw.invalidate();
    }
}
Try this
public void onSensorChanged(SensorEvent event) {
    if (event.accuracy == SensorManager.SENSOR_STATUS_UNRELIABLE) {
        return;
    }
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) mGravity = event.values.clone();
    if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) mGeomagnetic = event.values.clone();
    if (mGravity != null && mGeomagnetic != null) {
        float[] rotationMatrixA = mRotationMatrixA;
        if (SensorManager.getRotationMatrix(rotationMatrixA, null, mGravity, mGeomagnetic)) {
            float[] rotationMatrixB = mRotationMatrixB;
            SensorManager.remapCoordinateSystem(rotationMatrixA,
                    SensorManager.AXIS_X, SensorManager.AXIS_Z,
                    rotationMatrixB);
            float[] dv = new float[3];
            SensorManager.getOrientation(rotationMatrixB, dv);
            // add to smoothing filter
            fd.AddLatest((double) dv[0]);
        }
        mDraw.invalidate();
    }
}
You do not need the switch statement; there seems to be a lot of confusion concerning getRotationMatrix, remapCoordinateSystem and getOrientation among Stack Overflow questions.
I will probably write a detailed explanation of these in the near future.
Hoan's answer is actually incorrect because it doesn't account for the display rotation. This is the correct answer.
I'm developing an Android 2.2 application.
I want to know when the user moves the device up or down, and when they move it to the left or to the right. The device will be at rest when mounted vertically. In other words, I'm using the camera (Y axis along the camera's axis) for an augmented reality application where the rotation angles are needed:
remapCoordinateSystem(inR, AXIS_X, AXIS_Z, outR);
This is my code:
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER)
        accelValues = event.values;
    if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD)
        magneValues = event.values;
    updateOrientation(calculateOrientation());
}

private float[] calculateOrientation() {
    float[] values = new float[3];
    float[] R = new float[9];
    float[] outR = new float[9];
    SensorManager.getRotationMatrix(R, null, accelValues, magneValues);
    SensorManager.remapCoordinateSystem(R,
            SensorManager.AXIS_X,
            SensorManager.AXIS_Z,
            outR);
    SensorManager.getOrientation(outR, values);
    // Convert from radians to degrees.
    values[0] = (float) Math.toDegrees(values[0]); // Azimuth, rotation around Z
    values[1] = (float) Math.toDegrees(values[1]); // Pitch, rotation around X
    values[2] = (float) Math.toDegrees(values[2]); // Roll, rotation around Y
    return values;
}
But I'm not sure how I can tell whether the user moves the device to the left or to the right.
And how can I tell if the user is walking?
There are azimuth, pitch and roll values, but I don't know how to use them.
Any advice?
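One illustrative way to read these values: azimuth is the compass heading, so a change in azimuth tells you the device rotated left or right; walking (translation) doesn't show up in these angles at all, and is usually detected from periodic peaks in accelerometer magnitude instead. A sketch of the left/right part, assuming the degree values from calculateOrientation() above (the wrap-around handling is the important bit):

// Illustrative sketch: detect left/right rotation from the azimuth delta.
private float lastAzimuth = Float.NaN;

private void updateOrientation(float[] values) {
    float azimuth = values[0]; // degrees
    if (!Float.isNaN(lastAzimuth)) {
        float delta = azimuth - lastAzimuth;
        // Wrap into (-180, 180] so crossing north doesn't look like a huge jump
        if (delta > 180) delta -= 360;
        if (delta <= -180) delta += 360;
        if (delta > 0) {
            // device rotated to the right
        } else if (delta < 0) {
            // device rotated to the left
        }
    }
    lastAzimuth = azimuth;
}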