Hello, I want to ask which sensor is best for finding your current orientation: the orientation sensor, or the combination of accelerometer and magnetometer (compass)? I have seen a lot of augmented reality apps and I wonder which one is best. Some of these use the orientation sensor to find the azimuth, whereas others use the accelerometer and magnetometer. As far as I know, the orientation sensor is deprecated.
I found that if you need a free framework, Mixare is really great.
You can use the rotation vector sensor, which is typically backed by the gyroscope fused with the other motion sensors, to find the rotation vector. It can be used to get the azimuth.
SensorManager mSensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
Sensor rSensor = mSensorManager.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR);
mSensorManager.registerListener(this, rSensor, SensorManager.SENSOR_DELAY_NORMAL);
And in the onSensorChanged() callback:
float[] mRotationMatrix = new float[9];
float[] mValues = new float[3];
SensorManager.getRotationMatrixFromVector(mRotationMatrix, event.values);
SensorManager.getOrientation(mRotationMatrix, mValues);
float azimuth = (float) Math.toDegrees(mValues[0]);
But on some devices this sensor may not be available. In that case you can use a combination of the accelerometer and the magnetometer.
Sensor mSensor = mSensorManager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD);
Sensor aSensor = mSensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
And in the onSensorChanged() callback:
switch (event.sensor.getType()) {
    case Sensor.TYPE_MAGNETIC_FIELD:
        magnetic = event.values.clone();
        break;
    case Sensor.TYPE_ACCELEROMETER:
        accelerometer = event.values.clone();
        break;
}
if (magnetic != null && accelerometer != null) {
    float[] Rot = new float[9];
    float[] I = new float[9];
    SensorManager.getRotationMatrix(Rot, I, accelerometer, magnetic);
    float[] outR = new float[9];
    SensorManager.remapCoordinateSystem(Rot, SensorManager.AXIS_X,
            SensorManager.AXIS_Z, outR);
    float[] values = new float[3];
    SensorManager.getOrientation(outR, values);
    azimuth = values[0]; // in radians; use Math.toDegrees() for degrees
    magnetic = null;
    accelerometer = null;
}
The rotation vector sensor provides the better result of the two options.
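To choose between the two at runtime, a minimal availability check could look like this (a sketch; getDefaultSensor() returns null when the device lacks the sensor):

Sensor rotation = mSensorManager.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR);
if (rotation != null) {
    // Preferred: the fused rotation vector sensor.
    mSensorManager.registerListener(this, rotation, SensorManager.SENSOR_DELAY_UI);
} else {
    // Fallback: accelerometer + magnetometer.
    mSensorManager.registerListener(this,
            mSensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),
            SensorManager.SENSOR_DELAY_UI);
    mSensorManager.registerListener(this,
            mSensorManager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD),
            SensorManager.SENSOR_DELAY_UI);
}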
I need to find out the device orientation, but I must do it on demand, after the sensors have been sampled, rather than waiting for an event to happen.
I saw this code:
float[] rotationMatrix = new float[9];
float[] inclinationMatrix = new float[9];
float[] accelerometer; // values from sensor
float[] magnetic;      // values from sensor
SensorManager.getRotationMatrix(rotationMatrix, inclinationMatrix,
        accelerometer, magnetic);
int inclination = (int) Math.round(Math.toDegrees(Math.acos(rotationMatrix[8])));
if (inclination < 90) {
    // face up
}
if (inclination > 90) {
    // face down
}
but the accelerometer and magnetic arguments must be initialized first. Please help me out.
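One way to have them initialized is to keep listeners registered and cache the most recent readings, then compute the matrix on demand (a sketch; the lastAccelerometer/lastMagnetic field names are hypothetical):

private float[] lastAccelerometer; // latest TYPE_ACCELEROMETER values
private float[] lastMagnetic;      // latest TYPE_MAGNETIC_FIELD values

@Override
public void onSensorChanged(SensorEvent event) {
    // Clone the values: the framework reuses the event.values array.
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
        lastAccelerometer = event.values.clone();
    } else if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) {
        lastMagnetic = event.values.clone();
    }
}

// Later, on demand (e.g. from a timer), once both arrays are non-null:
SensorManager.getRotationMatrix(rotationMatrix, inclinationMatrix,
        lastAccelerometer, lastMagnetic);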
Android has the function SensorManager.getRotationMatrix, which you can use to get a rotation matrix expressed as a float[]. I can do the maths to multiply some other vector by it (e.g. one coming out of the accelerometer). Obviously it's better to re-use existing functionality if possible, though. Is there a standard way in Android to do this?
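For reference, the multiplication itself is short enough to write by hand; here is a sketch for the 3x3 row-major matrix that getRotationMatrix fills in (rotateVector is a hypothetical helper; android.opengl.Matrix.multiplyMV also exists, but it expects 4x4 column-major OpenGL-style matrices):

// Multiply a 3x3 row-major rotation matrix (float[9] from
// SensorManager.getRotationMatrix) by a 3-element vector.
static float[] rotateVector(float[] r, float[] v) {
    return new float[] {
        r[0] * v[0] + r[1] * v[1] + r[2] * v[2],
        r[3] * v[0] + r[4] * v[1] + r[5] * v[2],
        r[6] * v[0] + r[7] * v[1] + r[8] * v[2],
    };
}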
Why don't you take the acceleration provided by the device's sensor?
If the device has an accelerometer sensor and a gyroscope sensor, you are better off taking the values from the Sensor directly.
@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
        timestampAccSys = System.currentTimeMillis();
        timestampAcc = event.timestamp;
        valueAccX = event.values[0];
        valueAccY = event.values[1];
        valueAccZ = event.values[2];
        newValueAcc = true;
    }
}
I am building an Android application which logs the compass heading of the device (in degrees) to a file. There are two methods to get these degrees:
Method 1:
SensorManager mSensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
Sensor orientationSensor = mSensorManager.getDefaultSensor(Sensor.TYPE_ORIENTATION);
mSensorManager.registerListener(this, orientationSensor, SensorManager.SENSOR_DELAY_NORMAL);

public void onSensorChanged(SensorEvent event) {
    float azimuthInDegrees = event.values[0];
}
Method 2:
SensorManager mSensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
Sensor accelerometer = mSensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
Sensor magnetometer = mSensorManager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD);
mSensorManager.registerListener(this, accelerometer, SensorManager.SENSOR_DELAY_NORMAL);
mSensorManager.registerListener(this, magnetometer, SensorManager.SENSOR_DELAY_NORMAL);

float[] mGravity;
float[] mGeomagnetic;

public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
        mGravity = event.values;
    }
    if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) {
        mGeomagnetic = event.values;
    }
    if (mGravity != null && mGeomagnetic != null) {
        float[] R = new float[9];
        float[] I = new float[9];
        boolean success = SensorManager.getRotationMatrix(R, I, mGravity, mGeomagnetic);
        if (success) {
            float[] orientation = new float[3];
            SensorManager.getOrientation(R, orientation);
            float azimuthInDegrees = ((float) Math.toDegrees(orientation[0]) + 360) % 360;
        }
    }
}
I tried out both methods by pointing my device North (which should read around 360 degrees).
Method 1 returns perfect results, but unfortunately this method is deprecated:
359.6567
359.5034
359.859
359.76212
359.8878
359.87048
359.8356
359.80356
359.81192
359.7671
359.84668
359.88528
Method 2 also returns good results but sometimes (randomly) it returns an incorrect degree:
359.91495
359.83652
263.67697
359.67993
359.70038
359.688
359.71155
359.70276
359.6984
359.6429
270.6323
359.62302
359.49954
359.44757
359.47803
359.4947
359.39572
As you can see, some incorrect degrees are randomly returned with the second method. The device is calibrated and I think that the problem is with the second method as the first method returns perfect results. Can you guys help me out?
The problem is in the assignment of mGravity and mGeomagnetic: it should be event.values.clone(). With mGravity = event.values you only store a reference to an array that the sensor framework reuses between callbacks. So as soon as onSensorChanged is called again for the magnetic sensor, mGravity points at data that has already been overwritten and can hold any value.
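A minimal fix applied to the snippet above:

// Copy the event data; the framework recycles the event.values array.
if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
    mGravity = event.values.clone();
}
if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) {
    mGeomagnetic = event.values.clone();
}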
I'm trying to get the direction of the camera in Android. I have code that's working perfectly in portrait (I test it by slowly turning in a circle and looking at updates 1s apart), but it isn't working at all in landscape: the numbers seem to change randomly. It also gets totally out of whack after switching from portrait to landscape. Here's my code:
public void onSensorChanged(SensorEvent event) {
    switch (event.sensor.getType()) {
        case Sensor.TYPE_ACCELEROMETER:
            accelerometerValues = event.values.clone();
            break;
        case Sensor.TYPE_MAGNETIC_FIELD:
            geomagneticMatrix = event.values.clone();
            break;
        default:
            break;
    }
    if (geomagneticMatrix != null && accelerometerValues != null) {
        float[] R = new float[16];
        float[] I = new float[16];
        float[] outR = new float[16];
        // Get the rotation matrix, then remap it from camera surface to world coordinates
        SensorManager.getRotationMatrix(R, I, accelerometerValues, geomagneticMatrix);
        SensorManager.remapCoordinateSystem(R, SensorManager.AXIS_X, SensorManager.AXIS_Z, outR);
        float[] values = new float[4];
        SensorManager.getOrientation(outR, values);
        float direction = normalizeDegrees((float) Math.toDegrees(values[0]));
        float pitch = normalizeDegrees((float) Math.toDegrees(values[1]));
        float roll = normalizeDegrees((float) Math.toDegrees(values[2]));
        if ((int) direction != (int) lastDirection) {
            lastDirection = direction;
            for (CompassListener listener : listeners) {
                listener.onDirectionChanged(lastDirection, pitch, roll);
            }
        }
    }
}
Any ideas what I'm doing wrong? I freely admit I don't 100% understand this. I also don't know why Google deprecated the orientation sensor; it seems like a common enough desire.
Did you consider that when you change from portrait to landscape, the accelerometer axes change relative to the screen? The Y-axis becomes the X-axis, and so on. This might be one source of the strange behavior.
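One common pattern (a sketch, assuming mDisplay holds the default Display as shown further below) is to pick the remap axes from the current screen rotation before calling getOrientation():

// Choose remap axes according to the current screen rotation.
int axisX = SensorManager.AXIS_X;
int axisY = SensorManager.AXIS_Y;
switch (mDisplay.getRotation()) {
    case Surface.ROTATION_90:
        axisX = SensorManager.AXIS_Y;
        axisY = SensorManager.AXIS_MINUS_X;
        break;
    case Surface.ROTATION_180:
        axisX = SensorManager.AXIS_MINUS_X;
        axisY = SensorManager.AXIS_MINUS_Y;
        break;
    case Surface.ROTATION_270:
        axisX = SensorManager.AXIS_MINUS_Y;
        axisY = SensorManager.AXIS_X;
        break;
}
SensorManager.remapCoordinateSystem(R, axisX, axisY, outR);
SensorManager.getOrientation(outR, values);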
I seem to have solved it, or at least improved it to the point where I know what the problem was. I put in a filter: instead of delivering a single sensor reading, I remember the last reading and apply a delta to it, allowing each new sensor point to add a maximum of 5 degrees. This completely filters out the weird hops and forces the value to converge. I still see an occasional odd jump, but I figure what I need is a more sophisticated filter. New code:
public void onSensorChanged(SensorEvent event) {
    if (event.accuracy == SensorManager.SENSOR_STATUS_UNRELIABLE)
        return;
    switch (event.sensor.getType()) {
        case Sensor.TYPE_ACCELEROMETER:
            accelerometerValues = event.values.clone();
            break;
        case Sensor.TYPE_MAGNETIC_FIELD:
            geomagneticMatrix = event.values.clone();
            break;
    }
    if (geomagneticMatrix != null && accelerometerValues != null) {
        float[] R = new float[16];
        float[] I = new float[16];
        float[] outR = new float[16];
        // Get the rotation matrix, then remap it from camera surface to world coordinates
        SensorManager.getRotationMatrix(R, I, accelerometerValues, geomagneticMatrix);
        SensorManager.remapCoordinateSystem(R, SensorManager.AXIS_X, SensorManager.AXIS_Z, outR);
        float[] values = new float[4];
        SensorManager.getOrientation(outR, values);
        int direction = filterChange(normalizeDegrees(Math.toDegrees(values[0])));
        int pitch = normalizeDegrees(Math.toDegrees(values[1]));
        int roll = normalizeDegrees(Math.toDegrees(values[2]));
        if (direction != lastDirection) {
            lastDirection = direction;
            lastPitch = pitch;
            lastRoll = roll;
            for (CompassListener listener : listeners) {
                listener.onDirectionChanged(lastDirection, pitch, roll);
            }
        }
    }
}
// Normalize an angle in degrees to the range 0..360 instead of -180..180
private int normalizeDegrees(double degrees) {
    return (int) ((degrees + 360) % 360);
}
// We want to ignore large bumps in individual readings, so we cap the number of degrees we can change per report
private int filterChange(int newDir) {
    int change = newDir - lastDirection;
    int circularChange = newDir - (lastDirection + 360);
    int smallestChange;
    if (Math.abs(change) < Math.abs(circularChange)) {
        smallestChange = change;
    } else {
        smallestChange = circularChange;
    }
    // Clamp the chosen delta to +/-5 degrees per reading.
    smallestChange = Math.max(Math.min(smallestChange, 5), -5);
    return lastDirection + smallestChange;
}
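As a starting point for that "more sophisticated filter", a wrap-aware variant could take the shortest signed angular difference before clamping, so crossings in both directions (359 to 1 and 1 to 359) are handled symmetrically (a sketch; filterChangeShortest is a hypothetical replacement for filterChange):

private int filterChangeShortest(int newDir) {
    // Shortest signed delta in [-180, 180); the +540 keeps the modulo operand positive.
    int delta = ((newDir - lastDirection + 540) % 360) - 180;
    delta = Math.max(Math.min(delta, 5), -5); // cap at 5 degrees per reading
    return (lastDirection + delta + 360) % 360;
}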
What I want to happen is to remap the coordinate system when the phone is turned away from its "natural" orientation, so that when using a phone in landscape it reads the same values as if it were being held in portrait.
I'm checking to see if rotation equals Surface.ROTATION_90, and if so, then remap the coordinate system.
I admit I don't quite understand how to do it properly, and could use a little guidance.
So, you need to run these two methods:
SensorManager.getRotationMatrix(inR, I, grav, mag);
SensorManager.remapCoordinateSystem(inR, SensorManager.AXIS_Y,SensorManager.AXIS_MINUS_X, outR);
What's required to pass into these methods? I created a new float array, then passed just the orientation sensor data to the mag field argument, which didn't work. So I registered both the accelerometer and magnetic field sensors, fed the data from both of those to the getRotationMatrix method, and I always get a NullPointerException (even though the JavaDoc says some arguments can be null). I even tried passing data to each argument and still got a NullPointerException.
My question is, what is the proper data that I need to pass into the getRotationMatrix method?
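For reference, here is a sketch of arguments that should work (lastAccelerometerValues and lastMagneticValues are hypothetical placeholders for the latest cloned readings from each sensor):

float[] inR = new float[9];
float[] outR = new float[9];
float[] grav = lastAccelerometerValues; // length-3 values from a TYPE_ACCELEROMETER event
float[] mag = lastMagneticValues;       // length-3 values from a TYPE_MAGNETIC_FIELD event
if (grav != null && mag != null
        && SensorManager.getRotationMatrix(inR, null, grav, mag)) { // the inclination matrix may be null
    SensorManager.remapCoordinateSystem(inR, SensorManager.AXIS_Y,
            SensorManager.AXIS_MINUS_X, outR);
}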
I found that a very simple way to do this is the one used in the SDK sample AccelerometerPlay.
First you get your display like this, for example in onResume():
WindowManager windowManager = (WindowManager) getSystemService(WINDOW_SERVICE);
mDisplay = windowManager.getDefaultDisplay();
Then in onSensorChanged() you can use this simple code:
@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() != Sensor.TYPE_ACCELEROMETER)
        return;
    switch (mDisplay.getRotation()) {
        case Surface.ROTATION_0:
            mSensorX = event.values[0];
            mSensorY = event.values[1];
            break;
        case Surface.ROTATION_90:
            mSensorX = -event.values[1];
            mSensorY = event.values[0];
            break;
        case Surface.ROTATION_180:
            mSensorX = -event.values[0];
            mSensorY = -event.values[1];
            break;
        case Surface.ROTATION_270:
            mSensorX = event.values[1];
            mSensorY = -event.values[0];
            break;
    }
}
Hope this will help.
This is my code, and it works without NPEs. Note that I have just one listener, but you have to register it to listen to both sensors (ACCELEROMETER and MAGNETIC_FIELD).
private SensorEventListener mOrientationSensorsListener = new SensorEventListener() {
    private float[] mR = new float[9];
    private float[] mRemappedR = new float[9];
    private float[] mGeomagneticVector = new float[3];
    private float[] mGravityVector = new float[3];

    @Override
    public void onSensorChanged(SensorEvent event) {
        synchronized (this) {
            if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
                mGravityVector = Util.exponentialSmoothing(event.values, mGravityVector, 0.2f);
            } else if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) {
                mGeomagneticVector = Util.exponentialSmoothing(event.values, mGeomagneticVector, 0.5f);
                SensorManager.getRotationMatrix(mR, null, mGravityVector, mGeomagneticVector);
                SensorManager.remapCoordinateSystem(mR, SensorManager.AXIS_Y, SensorManager.AXIS_MINUS_X, mRemappedR);
            }
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
        // no-op
    }
};
The exponentialSmoothing method does some smoothing of the sensor results and looks like this (the alpha value can go from 0 to 1, where 1 means no smoothing at all):
public static float[] exponentialSmoothing(float[] input, float[] output, float alpha) {
    for (int i = 0; i < input.length; i++) {
        output[i] = output[i] + alpha * (input[i] - output[i]);
    }
    return output;
}
As for the synchronized bit -- I'm not sure that it is needed, just read it somewhere and added it.
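For completeness, registering that single listener for both sensors might look like this (a sketch):

mSensorManager.registerListener(mOrientationSensorsListener,
        mSensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),
        SensorManager.SENSOR_DELAY_UI);
mSensorManager.registerListener(mOrientationSensorsListener,
        mSensorManager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD),
        SensorManager.SENSOR_DELAY_UI);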