I'm having some trouble implementing an augmented reality app for Android and I'm hoping someone can help me. (Sorry for my English...)
Basically, I'm getting values from the accelerometer and magnetic field sensors, then I call remapCoordinateSystem(inR, AXIS_X, AXIS_Z, outR) and eventually getOrientation...
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
        mGravity = event.values;
    }
    if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) {
        mGeomagnetic = event.values;
    }
    if (mGravity != null && mGeomagnetic != null) {
        float R[] = new float[9];
        float I[] = new float[9];
        boolean success = SensorManager.getRotationMatrix(R, I, mGravity, mGeomagnetic);
        float outR[] = new float[9];
        SensorManager.remapCoordinateSystem(R, SensorManager.AXIS_X, SensorManager.AXIS_Z, outR);
        if (success) {
            float orientation[] = new float[3];
            SensorManager.getOrientation(outR, orientation);
            // Here I pass the orientation values to the object, which is rendered by the min3d framework
        }
    }
}
Am I reading the values correctly? Or do I need to transform them into degrees? I'm lost...
Then I rotate my 3D object with the values I have read from the sensors... but the object isn't moving at all.
public void updateScene() {
    objModel.rotation().y = _orientation[2];
    objModel.rotation().x = _orientation[1];
    objModel.rotation().z = _orientation[0];
}
OpenGL is not my friend... so I'm not sure I'm transforming correctly. Does the order of the rotation axes matter, and which value from orientation should correspond to which axis of the 3D object loaded by min3d?
If this is not the path I should follow... could someone guide me to the correct one, please? I've been fighting with this for a few weeks.
Thank you so much... (StackOverflow lover)
I had an issue with
mGeomagnetic = event.values;
You should write
mGeomagnetic = event.values.clone();
instead. The system reuses the array behind event.values for later events, so without clone() the values you stored can be silently overwritten by the next sensor reading.
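Separately, on the degrees question: SensorManager.getOrientation() returns radians, so if whatever consumes the values expects degrees (min3d's rotation() does, as far as I know), convert first. A minimal sketch using the orientation array from the code above:

// getOrientation() yields radians (azimuth, pitch, roll); convert for degree-based APIs
float azimuthDeg = (float) Math.toDegrees(orientation[0]);
float pitchDeg   = (float) Math.toDegrees(orientation[1]);
float rollDeg    = (float) Math.toDegrees(orientation[2]);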
I have an app which uses orientation data, and it works very well using the pre-API-8 method of a Sensor.TYPE_ORIENTATION. Smoothing that data was relatively easy.
I am trying to update the code to avoid this deprecated approach. The new standard approach is to replace the single Sensor.TYPE_ORIENTATION with a Sensor.TYPE_ACCELEROMETER and Sensor.TYPE_MAGNETIC_FIELD combination. As that data is received, it is sent (via SensorManager.getRotationMatrix()) to SensorManager.getOrientation(). This (theoretically) returns the same information as Sensor.TYPE_ORIENTATION did (apart from different units and axis orientation).
However, this approach seems to generate data which is much more jittery (i.e. noisy) than the deprecated method (which still works). So, comparing the same information on the same device, the deprecated method provides much less noisy data than the current method.
How do I get the same (less noisy) data that the deprecated method used to provide?
To make my question a little clearer: I have read various answers on this subject, and I have tried all sorts of filters: a simple KF / IIR low-pass as you suggest, and median filters between 5 and 19 points, but so far I have yet to get anywhere close to the smoothness of the data the phone supplies via TYPE_ORIENTATION.
Apply a low-pass filter to your sensor output.
This is my low-pass filter method:
private static final float ALPHA = 0.5f; // lower alpha should equal smoother movement
...
private float[] applyLowPassFilter(float[] input, float[] output) {
    if (output == null) return input;
    for (int i = 0; i < input.length; i++) {
        output[i] = output[i] + ALPHA * (input[i] - output[i]);
    }
    return output;
}
Apply it like so:
float[] mGravity;
float[] mGeomagnetic;

@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER)
        mGravity = applyLowPassFilter(event.values.clone(), mGravity);
    if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD)
        mGeomagnetic = applyLowPassFilter(event.values.clone(), mGeomagnetic);
    if (mGravity != null && mGeomagnetic != null) {
        float R[] = new float[9];
        float I[] = new float[9];
        boolean success = SensorManager.getRotationMatrix(R, I, mGravity, mGeomagnetic);
        if (success) {
            float orientation[] = new float[3];
            SensorManager.getOrientation(R, orientation);
            azimuth = -orientation[0];
            invalidate();
        }
    }
}
This is obviously code for a compass; remove what you don't need.
Also, take a look at this SE question How to implement low pass filter using java
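If you want the amount of smoothing to be independent of the sensor delay you register with, ALPHA can be derived from a time constant instead of being hard-coded. A sketch (the 0.25 s constant and ~50 Hz rate are assumed values, not from the answer above):

// RC low-pass discretization: alpha = dt / (RC + dt)
float timeConstant = 0.25f;              // seconds of smoothing; larger = smoother but laggier
float dt = 0.02f;                        // sample interval, roughly 50 Hz at SENSOR_DELAY_GAME
float alpha = dt / (timeConstant + dt);  // ~0.074 with these numbers; feed this into the filter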
It turns out that there is another, not particularly documented, way to get orientation data. Hidden in the list of sensor types is TYPE_ROTATION_VECTOR. So, set one up:
Sensor mRotationVectorSensor = sensorManager.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR);
sensorManager.registerListener(this, mRotationVectorSensor, SensorManager.SENSOR_DELAY_GAME);
Then:
@Override
public void onSensorChanged(SensorEvent event) {
    final int eventType = event.sensor.getType();
    if (eventType != Sensor.TYPE_ROTATION_VECTOR) return;
    long timeNow = System.nanoTime();
    float mOrientationData[] = new float[3];
    calcOrientation(mOrientationData, event.values.clone());
    // Do what you want with mOrientationData
}
The key mechanism is going from the incoming rotation data to an orientation vector via a rotation matrix. The slightly frustrating thing is that the orientation vector comes from quaternion data in the first place, but I can't see how to get the quaternion delivered directly. (If you ever wondered how quaternions relate to orientation and rotation information, and why they are used, see here.)
private void calcOrientation(float[] orientation, float[] incomingValues) {
    // Get the quaternion (getQuaternionFromVector returns {w, x, y, z})
    float[] quatF = new float[4];
    SensorManager.getQuaternionFromVector(quatF, incomingValues);
    // Get the rotation matrix
    //
    // This is a variant on the code presented in
    // http://www.euclideanspace.com/maths/geometry/rotations/conversions/quaternionToMatrix/
    // which has been altered for scaling and (I think) a different axis arrangement. It
    // tells you the rotation required to get between the phone's axis
    // system and the earth's.
    //
    // Phone axis system:
    // https://developer.android.com/guide/topics/sensors/sensors_overview.html#sensors-coords
    //
    // Earth axis system:
    // https://developer.android.com/reference/android/hardware/SensorManager.html#getRotationMatrix(float[], float[], float[], float[])
    //
    // Background information:
    // https://en.wikipedia.org/wiki/Rotation_matrix
    //
    float[][] rotMatF = new float[3][3];
    rotMatF[0][0] = quatF[1]*quatF[1] + quatF[0]*quatF[0] - 0.5f;
    rotMatF[0][1] = quatF[1]*quatF[2] - quatF[3]*quatF[0];
    rotMatF[0][2] = quatF[1]*quatF[3] + quatF[2]*quatF[0];
    rotMatF[1][0] = quatF[1]*quatF[2] + quatF[3]*quatF[0];
    rotMatF[1][1] = quatF[2]*quatF[2] + quatF[0]*quatF[0] - 0.5f;
    rotMatF[1][2] = quatF[2]*quatF[3] - quatF[1]*quatF[0];
    rotMatF[2][0] = quatF[1]*quatF[3] - quatF[2]*quatF[0];
    rotMatF[2][1] = quatF[2]*quatF[3] + quatF[1]*quatF[0];
    rotMatF[2][2] = quatF[3]*quatF[3] + quatF[0]*quatF[0] - 0.5f;
    // Get the orientation of the phone from the rotation matrix
    //
    // There is some discussion of this at
    // http://stackoverflow.com/questions/30279065/how-to-get-the-euler-angles-from-the-rotation-vector-sensor-type-rotation-vecto
    // in particular equation 451.
    //
    final float rad2deg = (float) (180.0 / Math.PI);
    orientation[0] = (float) Math.atan2(-rotMatF[1][0], rotMatF[0][0]) * rad2deg;
    orientation[1] = (float) Math.atan2(-rotMatF[2][1], rotMatF[2][2]) * rad2deg;
    // The matrix above is uniformly scaled by 0.5, which cancels in the atan2
    // ratios but not in asin, so the argument is doubled here to recover the
    // true matrix element.
    orientation[2] = (float) Math.asin(2f * rotMatF[2][0]) * rad2deg;
    if (orientation[0] < 0) orientation[0] += 360;
}
This seems to give data very similar in feel (I haven't run numeric tests) to the old TYPE_ORIENTATION data: it was usable for motion control of the device with marginal filtering.
There is also helpful information here, and a possible alternative solution here.
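Incidentally, the framework can also build the rotation matrix from the rotation vector for you via SensorManager.getRotationMatrixFromVector(), which avoids the hand-rolled matrix above if you only need the angles. A sketch (inside the same onSensorChanged; the defensive truncation is for devices that report more than four values, which older platform builds of the method reject):

// Let SensorManager convert the rotation vector directly
float[] vec = event.values.length > 4
        ? java.util.Arrays.copyOf(event.values, 4)
        : event.values;
float[] rotationMatrix = new float[9];
float[] angles = new float[3];
SensorManager.getRotationMatrixFromVector(rotationMatrix, vec);
SensorManager.getOrientation(rotationMatrix, angles); // radians: azimuth, pitch, roll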
Here's what worked out for me, using SensorManager.SENSOR_DELAY_GAME for a fast update rate, i.e.:
@Override
protected void onResume() {
    super.onResume();
    sensor_manager.registerListener(this, sensor_manager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER), SensorManager.SENSOR_DELAY_GAME);
    sensor_manager.registerListener(this, sensor_manager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD), SensorManager.SENSOR_DELAY_GAME);
}
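And the counterpart in onPause(), so the sensors don't keep running while the activity is hidden (a standard sketch; sensor_manager is the same field as above):

@Override
protected void onPause() {
    super.onPause();
    // Stop receiving updates for both sensors while in the background
    sensor_manager.unregisterListener(this);
}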
MOVING AVERAGE
(less efficient)
private float[] gravity;
private float[] geomagnetic;
private float azimuth;
private float pitch;
private float roll;

@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER)
        gravity = moving_average_gravity(event.values);
    if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD)
        geomagnetic = moving_average_geomagnetic(event.values);
    if (gravity != null && geomagnetic != null) {
        float R[] = new float[9];
        float I[] = new float[9];
        boolean success = SensorManager.getRotationMatrix(R, I, gravity, geomagnetic);
        if (success) {
            float orientation[] = new float[3];
            SensorManager.getOrientation(R, orientation);
            azimuth = (float) Math.toDegrees(orientation[0]);
            pitch = (float) Math.toDegrees(orientation[1]);
            roll = (float) Math.toDegrees(orientation[2]);
            // if (roll > -46F && roll < 46F) view.setTranslationX((roll / 45F) * max_translation);
            // tilt from -45° to 45° to x-translate a view positioned centrally in a layout, from 0 to max_translation
            Log.i("TAG", "azimuth: " + azimuth + " | pitch: " + pitch + " | roll: " + roll);
        }
    }
}
private ArrayList<Float[]> moving_gravity;
private ArrayList<Float[]> moving_geomagnetic;
private static final float moving_average_size = 12; // change

private float[] moving_average_gravity(float[] gravity) {
    if (moving_gravity == null) {
        moving_gravity = new ArrayList<>();
        for (int i = 0; i < moving_average_size; i++) {
            moving_gravity.add(new Float[]{0F, 0F, 0F});
        }
        return new float[]{0F, 0F, 0F};
    }
    moving_gravity.remove(0);
    moving_gravity.add(new Float[]{gravity[0], gravity[1], gravity[2]});
    return moving_average(moving_gravity);
}

private float[] moving_average_geomagnetic(float[] geomagnetic) {
    if (moving_geomagnetic == null) {
        this.moving_geomagnetic = new ArrayList<>();
        for (int i = 0; i < moving_average_size; i++) {
            moving_geomagnetic.add(new Float[]{0F, 0F, 0F});
        }
        return new float[]{0F, 0F, 0F};
    }
    moving_geomagnetic.remove(0);
    moving_geomagnetic.add(new Float[]{geomagnetic[0], geomagnetic[1], geomagnetic[2]});
    return moving_average(moving_geomagnetic);
}

private float[] moving_average(ArrayList<Float[]> moving_values) {
    float[] moving_average = new float[]{0F, 0F, 0F};
    for (int i = 0; i < moving_average_size; i++) {
        moving_average[0] += moving_values.get(i)[0];
        moving_average[1] += moving_values.get(i)[1];
        moving_average[2] += moving_values.get(i)[2];
    }
    moving_average[0] = moving_average[0] / moving_average_size;
    moving_average[1] = moving_average[1] / moving_average_size;
    moving_average[2] = moving_average[2] / moving_average_size;
    return moving_average;
}
LOW PASS FILTER
(more efficient)
private float[] gravity;
private float[] geomagnetic;
private float azimuth;
private float pitch;
private float roll;

@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER)
        gravity = LPF(event.values.clone(), gravity);
    if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD)
        geomagnetic = LPF(event.values.clone(), geomagnetic);
    if (gravity != null && geomagnetic != null) {
        float R[] = new float[9];
        float I[] = new float[9];
        boolean success = SensorManager.getRotationMatrix(R, I, gravity, geomagnetic);
        if (success) {
            float orientation[] = new float[3];
            SensorManager.getOrientation(R, orientation);
            azimuth = (float) Math.toDegrees(orientation[0]);
            pitch = (float) Math.toDegrees(orientation[1]);
            roll = (float) Math.toDegrees(orientation[2]);
            // if (roll > -46F && roll < 46F) view.setTranslationX((roll / 45F) * max_translation);
            // tilt from -45° to 45° to x-translate a view positioned centrally in a layout, from 0 to max_translation
            Log.i("TAG", "azimuth: " + azimuth + " | pitch: " + pitch + " | roll: " + roll);
        }
    }
}

private static final float ALPHA = 1 / 16F; // adjust sensitivity
private float[] LPF(float[] input, float[] output) {
    if (output == null) return input;
    for (int i = 0; i < input.length; i++) {
        output[i] = output[i] + ALPHA * (input[i] - output[i]);
    }
    return output;
}
N.B.
A moving average of 12 values is used instead, as per here.
A low-pass filter with ALPHA = 0.0625 (1/16) is used instead, as per here.
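If you keep the moving average but want it cheaper, a fixed ring buffer avoids the ArrayList remove(0) shift on every sample. A sketch under the same 12-sample window (my variant, not from the code above):

private final float[][] ring = new float[12][3]; // 12-sample window, 3 axes
private final float[] sum = new float[3];        // running sum per axis
private int head = 0, filled = 0;

private float[] movingAverage(float[] sample) {
    for (int i = 0; i < 3; i++) {
        sum[i] -= ring[head][i];   // drop the oldest sample from the running sum
        ring[head][i] = sample[i];
        sum[i] += sample[i];
    }
    head = (head + 1) % ring.length;
    if (filled < ring.length) filled++;
    float[] avg = new float[3];
    for (int i = 0; i < 3; i++) avg[i] = sum[i] / filled;
    return avg;
}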
I need to be able to measure the change of the accelerometer, not its actual position. I need to move a sprite around with it, and it only works when the device is in its "default" position, if you will, flat on a table. I need to be able to measure its change. I have heard matrix multiplication and the like are needed, but are there any easier ways to do this?
When you register a SensorEventListener, you must override onSensorChanged(). This delivers the latest readings on the X, Y and Z axes. If you want to calculate the rotation matrix from this data, you also need to use the magnetic field sensor. This should get you going:
// Deduced from accelerometer data
private float[] gravityMatrix = new float[3];
// Magnetic field
private float[] geomagneticMatrix = new float[3];
// Output rotation matrix (declaration added; the original implied this field)
private float[] mRotationMatrix = new float[16];
private boolean sensorReady = false;
private int counter = 0;

public void onSensorChanged(SensorEvent event) {
    // Note: this array is passed as the inclination-matrix *output* parameter
    // of getRotationMatrix() below, so its contents get overwritten.
    float[] mIdentityMatrix = new float[16];
    mIdentityMatrix[0] = 1;
    mIdentityMatrix[4] = 1;
    mIdentityMatrix[8] = 1;
    mIdentityMatrix[12] = 1;
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
        if (counter++ % 10 == 0) {
            gravityMatrix = event.values.clone();
            sensorReady = true;
        }
    }
    if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD)
        geomagneticMatrix = event.values.clone();
    if (sensorReady)
        SensorManager.getRotationMatrix(mRotationMatrix,
                mIdentityMatrix, gravityMatrix, geomagneticMatrix);
}
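To turn the matrix computed above into tilt angles, the usual follow-up is SensorManager.getOrientation(); a sketch (mRotationMatrix is the float[16] field above, which getOrientation also accepts, as far as I can tell):

// Convert the rotation matrix into azimuth/pitch/roll (radians)
float[] angles = new float[3];
SensorManager.getOrientation(mRotationMatrix, angles);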
I am building an Android application which logs the compass bearing of the device to a file. There are two methods to get these degrees:
Method 1:
SensorManager mSensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
Sensor orientationSensor = mSensorManager.getDefaultSensor(Sensor.TYPE_ORIENTATION);
mSensorManager.registerListener(this, orientationSensor, SensorManager.SENSOR_DELAY_NORMAL);

public void onSensorChanged(SensorEvent event) {
    float azimuthInDegrees = event.values[0];
}
Method 2:
SensorManager mSensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
Sensor accelerometer = mSensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
Sensor magnetometer = mSensorManager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD);
mSensorManager.registerListener(this, accelerometer, SensorManager.SENSOR_DELAY_NORMAL);
mSensorManager.registerListener(this, magnetometer, SensorManager.SENSOR_DELAY_NORMAL);

float[] mGravity;
float[] mGeomagnetic;

public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
        mGravity = event.values;
    }
    if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) {
        mGeomagnetic = event.values;
    }
    if (mGravity != null && mGeomagnetic != null) {
        float R[] = new float[9];
        float I[] = new float[9];
        boolean success = SensorManager.getRotationMatrix(R, I, mGravity, mGeomagnetic);
        if (success) {
            float orientation[] = new float[3];
            SensorManager.getOrientation(R, orientation);
            float azimuthInDegrees = ((float) Math.toDegrees(orientation[0]) + 360) % 360;
        }
    }
}
I tried out both methods by placing my device in the North direction (which is around 360 degrees):
Method 1 returns perfect results but unfortunately this method is deprecated:
359.6567
359.5034
359.859
359.76212
359.8878
359.87048
359.8356
359.80356
359.81192
359.7671
359.84668
359.88528
Method 2 also returns good results but sometimes (randomly) it returns an incorrect degree:
359.91495
359.83652
263.67697
359.67993
359.70038
359.688
359.71155
359.70276
359.6984
359.6429
270.6323
359.62302
359.49954
359.44757
359.47803
359.4947
359.39572
As you can see, some incorrect degrees are randomly returned with the second method. The device is calibrated and I think that the problem is with the second method as the first method returns perfect results. Can you guys help me out?
The problem is in the assignment of mGravity and mGeomagnetic: it should be event.values.clone(). mGravity has class scope, but by writing mGravity = event.values you store a reference to an array the system reuses for subsequent events. So as soon as onSensorChanged is called again, say for the magnetic field sensor, the array mGravity points to can be overwritten and thus can hold any value.
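A minimal sketch of the fix in the question's handler:

if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
    mGravity = event.values.clone();      // copy: the system reuses event.values
}
if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) {
    mGeomagnetic = event.values.clone();  // same here
}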
What I want to happen is to remap the coordinate system when the phone is turned away from its "natural" orientation, so that when the phone is in landscape it reads the same values as if it were being held in portrait.
I'm checking to see if rotation equals Surface.ROTATION_90, and if so, then remap the coordinate system.
I admit I don't quite understand how to do it properly, and could use a little guidance.
So, you need to run these two methods:
SensorManager.getRotationMatrix(inR, I, grav, mag);
SensorManager.remapCoordinateSystem(inR, SensorManager.AXIS_Y,SensorManager.AXIS_MINUS_X, outR);
What's required to pass into these methods? I created a new float array, then passed just the orientation sensor data to the mag field argument, which didn't work. So I registered both the accelerometer and magnetic field sensors, fed the data from both of those to the getRotationMatrix method, and I always get a NullPointerException (even though the JavaDoc says some arguments can be null). I even tried passing data to each argument, and still got a NullPointerException.
My question is, what is the proper data that I need to pass into the getRotationMatrix method?
I found that a very simple way to do this is the one used in the SDK sample AccelerometerPlay.
First you get your display like this, for example in onResume():
WindowManager windowManager = (WindowManager) getSystemService(WINDOW_SERVICE);
mDisplay = windowManager.getDefaultDisplay();
Then in onSensorChanged() you can use this simple code:
@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() != Sensor.TYPE_ACCELEROMETER)
        return;
    switch (mDisplay.getRotation()) {
        case Surface.ROTATION_0:
            mSensorX = event.values[0];
            mSensorY = event.values[1];
            break;
        case Surface.ROTATION_90:
            mSensorX = -event.values[1];
            mSensorY = event.values[0];
            break;
        case Surface.ROTATION_180:
            mSensorX = -event.values[0];
            mSensorY = -event.values[1];
            break;
        case Surface.ROTATION_270:
            mSensorX = event.values[1];
            mSensorY = -event.values[0];
            break;
    }
}
Hope this will help.
This is my code, and it works without NPEs. Note that I have just one listener, but you have to register it to listen to both sensors (ACCELEROMETER and MAGNETIC_FIELD).
private SensorEventListener mOrientationSensorsListener = new SensorEventListener() {
    private float[] mR = new float[9];
    private float[] mRemappedR = new float[9];
    private float[] mGeomagneticVector = new float[3];
    private float[] mGravityVector = new float[3];

    @Override
    public void onSensorChanged(SensorEvent event) {
        synchronized (this) {
            if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
                mGravityVector = Util.exponentialSmoothing(event.values, mGravityVector, 0.2f);
            } else if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) {
                mGeomagneticVector = Util.exponentialSmoothing(event.values, mGeomagneticVector, 0.5f);
                SensorManager.getRotationMatrix(mR, null, mGravityVector, mGeomagneticVector);
                SensorManager.remapCoordinateSystem(mR, SensorManager.AXIS_Y, SensorManager.AXIS_MINUS_X, mRemappedR);
            }
        }
    }

    // Required by the interface; added so the snippet compiles as-is
    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
};
The exponentialSmoothing method does some smoothing of the sensor results and looks like that (the alpha value can go from 0 to 1, where 1 means no smoothing at all):
public static float[] exponentialSmoothing(float[] input, float[] output, float alpha) {
    for (int i = 0; i < input.length; i++) {
        output[i] = output[i] + alpha * (input[i] - output[i]);
    }
    return output;
}
As for the synchronized bit -- I'm not sure that it is needed, just read it somewhere and added it.
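To read angles out of mRemappedR once it has been filled, the usual follow-up is (a minimal sketch):

// Azimuth, pitch, roll in radians, from the remapped matrix
float[] orientation = new float[3];
SensorManager.getOrientation(mRemappedR, orientation);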
My augmented reality app needs the compass bearing of the camera view, and there's plenty of examples of getting the direction from the sensormanager.
However, I'm finding the resulting value differs depending on the phone orientation: landscape rotated to the right is about 10 degrees different from landscape rotated to the left (the difference between ROTATION_0 and ROTATION_180 is less, but still there). This difference is enough to ruin any AR effect.
Is it something to do with calibration? (I'm not convinced I'm doing the figure-of-8 thing properly; I've tried various ways I've found on YouTube.)
Any ideas why there's a difference? Have I messed up the rotation matrix stuff? I have the option of restricting the app to a single orientation, but it still concerns me that the compass reading isn't very accurate (even though it's fairly stable after filtering).
public void onSensorChanged(SensorEvent event) {
    if (event.accuracy == SensorManager.SENSOR_STATUS_UNRELIABLE) {
        return;
    }
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) mGravity = event.values;
    if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) mGeomagnetic = event.values;
    if (mGravity != null && mGeomagnetic != null) {
        float[] rotationMatrixA = mRotationMatrixA;
        if (SensorManager.getRotationMatrix(rotationMatrixA, null, mGravity, mGeomagnetic)) {
            float[] rotationMatrixB = mRotationMatrixB;
            Display display = getWindowManager().getDefaultDisplay();
            int deviceRot = display.getRotation();
            switch (deviceRot) {
                // portrait - normal
                case Surface.ROTATION_0:
                    SensorManager.remapCoordinateSystem(rotationMatrixA,
                            SensorManager.AXIS_X, SensorManager.AXIS_Z,
                            rotationMatrixB);
                    break;
                // rotated left (landscape - keys to bottom)
                case Surface.ROTATION_90:
                    SensorManager.remapCoordinateSystem(rotationMatrixA,
                            SensorManager.AXIS_Z, SensorManager.AXIS_MINUS_X,
                            rotationMatrixB);
                    break;
                // upside down
                case Surface.ROTATION_180:
                    SensorManager.remapCoordinateSystem(rotationMatrixA,
                            SensorManager.AXIS_X, SensorManager.AXIS_Z,
                            rotationMatrixB);
                    break;
                // rotated right
                case Surface.ROTATION_270:
                    SensorManager.remapCoordinateSystem(rotationMatrixA,
                            SensorManager.AXIS_MINUS_Z, SensorManager.AXIS_X,
                            rotationMatrixB);
                    break;
                default:
                    break;
            }
            float[] dv = new float[3];
            SensorManager.getOrientation(rotationMatrixB, dv);
            // add to smoothing filter
            fd.AddLatest((double) dv[0]);
        }
        mDraw.invalidate();
    }
}
Try this
public void onSensorChanged(SensorEvent event) {
    if (event.accuracy == SensorManager.SENSOR_STATUS_UNRELIABLE) {
        return;
    }
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) mGravity = event.values.clone();
    if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) mGeomagnetic = event.values.clone();
    if (mGravity != null && mGeomagnetic != null) {
        float[] rotationMatrixA = mRotationMatrixA;
        if (SensorManager.getRotationMatrix(rotationMatrixA, null, mGravity, mGeomagnetic)) {
            float[] rotationMatrixB = mRotationMatrixB;
            SensorManager.remapCoordinateSystem(rotationMatrixA,
                    SensorManager.AXIS_X, SensorManager.AXIS_Z,
                    rotationMatrixB);
            float[] dv = new float[3];
            SensorManager.getOrientation(rotationMatrixB, dv);
            // add to smoothing filter
            fd.AddLatest((double) dv[0]);
        }
        mDraw.invalidate();
    }
}
You do not need the switch statement; there seems to be a lot of confusion concerning getRotationMatrix, remapCoordinateSystem and getOrientation in Stack Overflow questions. I will probably write a detailed explanation of these in the near future.
Hoan's answer is actually incorrect because it doesn't account for the display rotation. This is the correct answer.