I need to measure the change in the accelerometer readings, not the device's absolute position. I'm moving a sprite around with it, and it only works when the device's 'default' position, if you will, is flat on a table. I have heard that matrix multiplication and the like are needed, but are there any easier ways to measure the change?
When you register a SensorEventListener, you must override onSensorChanged(). This delivers the current values on the X, Y, and Z axes. If you want to calculate the rotation matrix from this data, you also need the magnetic field sensor. This should get you going:
// Deduced from accelerometer data
private float[] gravityMatrix = new float[3];
// Magnetic field
private float[] geomagneticMatrix = new float[3];
// Filled in by getRotationMatrix(); note that its second argument is an
// *output* (the inclination matrix), not an input identity matrix
private float[] mRotationMatrix = new float[9];
private float[] mInclinationMatrix = new float[9];
private boolean sensorReady = false;
private int counter = 0;

public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
        // Sample only every 10th accelerometer event
        if (counter++ % 10 == 0) {
            gravityMatrix = event.values.clone();
            sensorReady = true;
        }
    }
    if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD)
        geomagneticMatrix = event.values.clone();
    if (sensorReady)
        SensorManager.getRotationMatrix(mRotationMatrix,
                mInclinationMatrix, gravityMatrix, geomagneticMatrix);
}
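If all you actually need is the change between successive readings rather than a full rotation matrix, a simpler option is to keep the previous sample and subtract; a minimal sketch (the field names are illustrative, not from the code above):
// Track the change between successive accelerometer readings.
private float[] lastValues; // previous accelerometer sample
private final float[] delta = new float[3];

public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() != Sensor.TYPE_ACCELEROMETER) return;
    if (lastValues == null) {
        lastValues = event.values.clone(); // first sample, nothing to diff yet
        return;
    }
    for (int i = 0; i < 3; i++) {
        delta[i] = event.values[i] - lastValues[i]; // change since last event
    }
    lastValues = event.values.clone();
    // Move the sprite by delta[0], delta[1] here.
}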
I have an app which uses orientation data and works very well using the pre-API-8 method of a Sensor.TYPE_ORIENTATION sensor. Smoothing that data was relatively easy.
I am trying to update the code to avoid this deprecated approach. The new standard approach is to replace the single Sensor.TYPE_ORIENTATION with a Sensor.TYPE_ACCELEROMETER and Sensor.TYPE_MAGNETIC_FIELD combination. As that data is received, it is sent (via SensorManager.getRotationMatrix()) to SensorManager.getOrientation(). This (theoretically) returns the same information as Sensor.TYPE_ORIENTATION did (apart from different units and axis orientation).
However, this approach seems to generate data which is much more jittery (i.e. noisy) than the deprecated method (which still works): comparing the same information on the same device, the deprecated method provides much less noisy data than the current one.
How do I get the actual same (less noisy) data that the deprecated method used to provide?
To make my question a little clearer: I have read various answers on this subject, and I have tried all sorts of filters: a simple KF / IIR low pass as you suggest, and median filters between 5 and 19 points, but so far I have yet to get anywhere close to the smoothness of the data the phone supplies via TYPE_ORIENTATION.
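For reference, a per-axis median filter of the sort described might look like the following sketch (reconstructed for illustration, not the exact code tried; it needs java.util.ArrayDeque and java.util.Arrays):
// Minimal sketch of a per-axis median filter over the last WINDOW samples.
private static final int WINDOW = 9; // 5..19 points, as mentioned above
private final ArrayDeque<float[]> samples = new ArrayDeque<>();

private float[] medianFilter(float[] input) {
    samples.addLast(input.clone());
    if (samples.size() > WINDOW) samples.removeFirst();
    int n = samples.size();
    float[] out = new float[3];
    float[] axis = new float[n];
    for (int i = 0; i < 3; i++) {
        int j = 0;
        for (float[] s : samples) axis[j++] = s[i]; // gather this axis
        Arrays.sort(axis);
        out[i] = axis[n / 2]; // median of this axis
    }
    return out;
}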
Apply a low-pass filter to your sensor output.
This is my low-pass filter method:
private static final float ALPHA = 0.5f; // lower alpha should equal smoother movement
...
private float[] applyLowPassFilter(float[] input, float[] output) {
    if (output == null) return input;
    for (int i = 0; i < input.length; i++) {
        output[i] = output[i] + ALPHA * (input[i] - output[i]);
    }
    return output;
}
Apply it like so:
float[] mGravity;
float[] mGeomagnetic;

@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER)
        mGravity = applyLowPassFilter(event.values.clone(), mGravity);
    if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD)
        mGeomagnetic = applyLowPassFilter(event.values.clone(), mGeomagnetic);
    if (mGravity != null && mGeomagnetic != null) {
        float[] R = new float[9];
        float[] I = new float[9];
        boolean success = SensorManager.getRotationMatrix(R, I, mGravity, mGeomagnetic);
        if (success) {
            float[] orientation = new float[3];
            SensorManager.getOrientation(R, orientation);
            azimuth = -orientation[0];
            invalidate();
        }
    }
}
This is obviously code for a compass, remove what you don't need.
Also, take a look at this SE question: How to implement low pass filter using java.
It turns out that there is another, not particularly documented, way to get orientation data. Hidden in the list of sensor types is TYPE_ROTATION_VECTOR. So, set one up:
Sensor mRotationVectorSensor = sensorManager.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR);
sensorManager.registerListener(this, mRotationVectorSensor, SensorManager.SENSOR_DELAY_GAME);
Then:
@Override
public void onSensorChanged(SensorEvent event) {
    final int eventType = event.sensor.getType();
    if (eventType != Sensor.TYPE_ROTATION_VECTOR) return;
    long timeNow = System.nanoTime();
    float[] mOrientationData = new float[3];
    calcOrientation(mOrientationData, event.values.clone());
    // Do what you want with mOrientationData
}
The key mechanism is going from the incoming rotation data to an orientation vector via a rotation matrix. The slightly frustrating thing is that the orientation vector comes from quaternion data in the first place, but I can't see how to get the quaternion delivered directly. (If you ever wondered how quaternions relate to orientation and rotation information, and why they are used, see here.)
private void calcOrientation(float[] orientation, float[] incomingValues) {
    // Get the quaternion (quatF[0] = w, quatF[1..3] = x, y, z)
    float[] quatF = new float[4];
    SensorManager.getQuaternionFromVector(quatF, incomingValues);

    // Get the rotation matrix
    //
    // This is a variant on the code presented in
    // http://www.euclideanspace.com/maths/geometry/rotations/conversions/quaternionToMatrix/
    // which has been altered for scaling and (I think) a different axis arrangement.
    // It tells you the rotation required to get between the phone's axis system
    // and the earth's. Note that for a unit quaternion each entry below is half
    // the standard rotation matrix entry; the factor cancels in the atan2
    // ratios, but not inside asin (see below).
    //
    // Phone axis system:
    // https://developer.android.com/guide/topics/sensors/sensors_overview.html#sensors-coords
    //
    // Earth axis system:
    // https://developer.android.com/reference/android/hardware/SensorManager.html#getRotationMatrix(float[], float[], float[], float[])
    //
    // Background information:
    // https://en.wikipedia.org/wiki/Rotation_matrix
    //
    float[][] rotMatF = new float[3][3];
    rotMatF[0][0] = quatF[1]*quatF[1] + quatF[0]*quatF[0] - 0.5f;
    rotMatF[0][1] = quatF[1]*quatF[2] - quatF[3]*quatF[0];
    rotMatF[0][2] = quatF[1]*quatF[3] + quatF[2]*quatF[0];
    rotMatF[1][0] = quatF[1]*quatF[2] + quatF[3]*quatF[0];
    rotMatF[1][1] = quatF[2]*quatF[2] + quatF[0]*quatF[0] - 0.5f;
    rotMatF[1][2] = quatF[2]*quatF[3] - quatF[1]*quatF[0];
    rotMatF[2][0] = quatF[1]*quatF[3] - quatF[2]*quatF[0];
    rotMatF[2][1] = quatF[2]*quatF[3] + quatF[1]*quatF[0];
    rotMatF[2][2] = quatF[3]*quatF[3] + quatF[0]*quatF[0] - 0.5f;

    // Get the orientation of the phone from the rotation matrix
    //
    // There is some discussion of this at
    // http://stackoverflow.com/questions/30279065/how-to-get-the-euler-angles-from-the-rotation-vector-sensor-type-rotation-vecto
    // in particular equation 451.
    //
    final float rad2deg = (float) (180.0 / Math.PI);
    orientation[0] = (float) Math.atan2(-rotMatF[1][0], rotMatF[0][0]) * rad2deg;
    orientation[1] = (float) Math.atan2(-rotMatF[2][1], rotMatF[2][2]) * rad2deg;
    // The 0.5 scaling does not cancel inside asin, so restore the full value
    orientation[2] = (float) Math.asin(2f * rotMatF[2][0]) * rad2deg;
    if (orientation[0] < 0) orientation[0] += 360;
}
This seems to give data very similar in feel (I haven't run numeric tests) to the old TYPE_ORIENTATION data: it was usable for motion control of the device with marginal filtering.
There is also helpful information here, and a possible alternative solution here.
Here's what worked for me, using SensorManager.SENSOR_DELAY_GAME for fast updates:
@Override
protected void onResume() {
    super.onResume();
    sensor_manager.registerListener(this, sensor_manager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER), SensorManager.SENSOR_DELAY_GAME);
    sensor_manager.registerListener(this, sensor_manager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD), SensorManager.SENSOR_DELAY_GAME);
}
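And the counterpart in onPause(), so the sensors stop delivering events (and draining the battery) while the activity is in the background:
@Override
protected void onPause() {
    super.onPause();
    // Stop receiving sensor events while the activity is not visible
    sensor_manager.unregisterListener(this);
}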
MOVING AVERAGE
(less efficient)
private float[] gravity;
private float[] geomagnetic;
private float azimuth;
private float pitch;
private float roll;
@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER)
        gravity = moving_average_gravity(event.values);
    if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD)
        geomagnetic = moving_average_geomagnetic(event.values);
    if (gravity != null && geomagnetic != null) {
        float[] R = new float[9];
        float[] I = new float[9];
        boolean success = SensorManager.getRotationMatrix(R, I, gravity, geomagnetic);
        if (success) {
            float[] orientation = new float[3];
            SensorManager.getOrientation(R, orientation);
            azimuth = (float) Math.toDegrees(orientation[0]);
            pitch = (float) Math.toDegrees(orientation[1]);
            roll = (float) Math.toDegrees(orientation[2]);
            // Tilt from -45° to 45° to x-translate a view positioned centrally
            // in a layout, from 0 to max_translation:
            // if (roll > -46F && roll < 46F) view.setTranslationX((roll / 45F) * max_translation);
            Log.i("TAG", "azimuth: " + azimuth + " | pitch: " + pitch + " | roll: " + roll);
        }
    }
}
private ArrayList<Float[]> moving_gravity;
private ArrayList<Float[]> moving_geomagnetic;
private static final int moving_average_size = 12; // change to taste

private float[] moving_average_gravity(float[] gravity) {
    if (moving_gravity == null) {
        moving_gravity = new ArrayList<>();
        for (int i = 0; i < moving_average_size; i++) {
            moving_gravity.add(new Float[]{0F, 0F, 0F});
        }
        return new float[]{0F, 0F, 0F};
    }
    moving_gravity.remove(0);
    moving_gravity.add(new Float[]{gravity[0], gravity[1], gravity[2]});
    return moving_average(moving_gravity);
}

private float[] moving_average_geomagnetic(float[] geomagnetic) {
    if (moving_geomagnetic == null) {
        moving_geomagnetic = new ArrayList<>();
        for (int i = 0; i < moving_average_size; i++) {
            moving_geomagnetic.add(new Float[]{0F, 0F, 0F});
        }
        return new float[]{0F, 0F, 0F};
    }
    moving_geomagnetic.remove(0);
    moving_geomagnetic.add(new Float[]{geomagnetic[0], geomagnetic[1], geomagnetic[2]});
    return moving_average(moving_geomagnetic);
}

private float[] moving_average(ArrayList<Float[]> moving_values) {
    float[] moving_average = new float[]{0F, 0F, 0F};
    for (int i = 0; i < moving_average_size; i++) {
        moving_average[0] += moving_values.get(i)[0];
        moving_average[1] += moving_values.get(i)[1];
        moving_average[2] += moving_values.get(i)[2];
    }
    moving_average[0] /= moving_average_size;
    moving_average[1] /= moving_average_size;
    moving_average[2] /= moving_average_size;
    return moving_average;
}
LOW PASS FILTER
(more efficient)
private float[] gravity;
private float[] geomagnetic;
private float azimuth;
private float pitch;
private float roll;
@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER)
        gravity = LPF(event.values.clone(), gravity);
    if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD)
        geomagnetic = LPF(event.values.clone(), geomagnetic);
    if (gravity != null && geomagnetic != null) {
        float[] R = new float[9];
        float[] I = new float[9];
        boolean success = SensorManager.getRotationMatrix(R, I, gravity, geomagnetic);
        if (success) {
            float[] orientation = new float[3];
            SensorManager.getOrientation(R, orientation);
            azimuth = (float) Math.toDegrees(orientation[0]);
            pitch = (float) Math.toDegrees(orientation[1]);
            roll = (float) Math.toDegrees(orientation[2]);
            // Tilt from -45° to 45° to x-translate a view positioned centrally
            // in a layout, from 0 to max_translation:
            // if (roll > -46F && roll < 46F) view.setTranslationX((roll / 45F) * max_translation);
            Log.i("TAG", "azimuth: " + azimuth + " | pitch: " + pitch + " | roll: " + roll);
        }
    }
}

private static final float ALPHA = 1 / 16F; // adjust sensitivity

private float[] LPF(float[] input, float[] output) {
    if (output == null) return input;
    for (int i = 0; i < input.length; i++) {
        output[i] = output[i] + ALPHA * (input[i] - output[i]);
    }
    return output;
}
N.B. I used a moving average of 12 values instead, as per here, and a low-pass filter of ALPHA = 0.0625 (1/16F) instead, as per here.
I'm having a really annoying problem with an AR view acting like a compass. When I hold the phone in portrait (so that the screen is pointing at my face), I call remapCoordinateSystem so that the pitch is 0 when holding it in portrait. Then the azimuth (compass functionality) is perfect, but as soon as I tilt the phone the azimuth gets ruined: if I bend forward the azimuth increases, and if I bend backwards it decreases.
I use 2 sensors to get the readings, Sensor.TYPE_MAGNETIC_FIELD and Sensor.TYPE_GRAVITY.
I use a low-pass filter, which is pretty basic: it's implemented with an alpha constant and applied directly to the values read from the sensors.
Here is my code:
float[] rotationMatrix = new float[9];
SensorManager.getRotationMatrix(rotationMatrix, null, gravitymeterValues,
        magnetometerValues);
float[] remappedRotationMatrix = new float[9];
SensorManager.remapCoordinateSystem(rotationMatrix, SensorManager.AXIS_X,
        SensorManager.AXIS_Z, remappedRotationMatrix);
float[] results = new float[3];
SensorManager.getOrientation(remappedRotationMatrix, results);

float azimuth = (float) (results[0] * 180 / Math.PI);
if (azimuth < 0) {
    azimuth += 360;
}
float pitch = (float) (results[1] * 180 / Math.PI);
float roll = (float) (results[2] * 180 / Math.PI);
As you see there is no magic here. I call this piece of code when the gravitymeterValues and the magnetometerValues are ready to be used.
My question is how do I stop the azimuth from going crazy when I tilt the phone?
I checked a free app on the Google Play Store, Compass, and it hasn't solved this problem either, but I hope there is a solution.
I have 2 solutions in mind:
Make the AR view only work in very constrained pitch angles; right now I have something like pitch >= -5 && pitch <= 30. If this isn't fulfilled, the user is shown a screen asking him/her to rotate the phone to portrait (see the sketch after this list).
Somehow use the pitch to suppress the azimuth; this seems like a pretty device-specific solution, though, but of course I'm open to suggestions.
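For what it's worth, option 1 boils down to something like this (a minimal sketch; updateArView and showRotateToPortraitHint are hypothetical placeholders, not real methods):
// Sketch of option 1: trust the azimuth only inside a pitch band.
private static final float MIN_PITCH = -5f;
private static final float MAX_PITCH = 30f;

private void handleOrientation(float azimuth, float pitch) {
    if (pitch >= MIN_PITCH && pitch <= MAX_PITCH) {
        updateArView(azimuth);      // hypothetical: feed the AR overlay
    } else {
        showRotateToPortraitHint(); // hypothetical: prompt the user
    }
}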
I can also add that I've been searching for a couple of hours for a decent solution, and I haven't found anything better than 2) above.
Thanks in advance!
For complete code see https://github.com/hoananguyen/dsensor
Keep a history and average it out. I do not know the correct interpretation of pitch and roll, so the following code is for azimuth only.
Class members
private List<float[]> mRotHist = new ArrayList<float[]>();
private int mRotHistIndex;
// Change the value so that the azimuth is stable and fit your requirement
private int mHistoryMaxLength = 40;
float[] mGravity;
float[] mMagnetic;
float[] mRotationMatrix = new float[9];
// The direction of the back camera; only valid if the device is tilted up by
// at least 25 degrees.
private float mFacing = Float.NaN;
public static final float TWENTY_FIVE_DEGREE_IN_RADIAN = 0.436332313f;
public static final float ONE_FIFTY_FIVE_DEGREE_IN_RADIAN = 2.7052603f;
onSensorChanged
@Override
public void onSensorChanged(SensorEvent event)
{
    if (event.sensor.getType() == Sensor.TYPE_GRAVITY)
    {
        mGravity = event.values.clone();
    }
    else
    {
        mMagnetic = event.values.clone();
    }
    if (mGravity != null && mMagnetic != null)
    {
        if (SensorManager.getRotationMatrix(mRotationMatrix, null, mGravity, mMagnetic))
        {
            // Inclination is the degree of tilt of the device, independent of
            // orientation (portrait or landscape). If it is less than 25 or
            // more than 155 degrees, the device is considered lying flat.
            float inclination = (float) Math.acos(mRotationMatrix[8]);
            if (inclination < TWENTY_FIVE_DEGREE_IN_RADIAN
                    || inclination > ONE_FIFTY_FIVE_DEGREE_IN_RADIAN)
            {
                // mFacing is undefined, so we need to clear the history
                clearRotHist();
                mFacing = Float.NaN;
            }
            else
            {
                setRotHist();
                // mFacing = azimuth, in radians
                mFacing = findFacing();
            }
        }
    }
}
private void clearRotHist()
{
    if (DEBUG) { Log.d(TAG, "clearRotHist()"); }
    mRotHist.clear();
    mRotHistIndex = 0;
}

private void setRotHist()
{
    if (DEBUG) { Log.d(TAG, "setRotHist()"); }
    float[] hist = mRotationMatrix.clone();
    if (mRotHist.size() == mHistoryMaxLength)
    {
        mRotHist.remove(mRotHistIndex);
    }
    mRotHist.add(mRotHistIndex++, hist);
    mRotHistIndex %= mHistoryMaxLength;
}

private float findFacing()
{
    if (DEBUG) { Log.d(TAG, "findFacing()"); }
    float[] averageRotHist = average(mRotHist);
    return (float) Math.atan2(-averageRotHist[2], -averageRotHist[5]);
}

public float[] average(List<float[]> values)
{
    float[] result = new float[9];
    for (float[] value : values)
    {
        for (int i = 0; i < 9; i++)
        {
            result[i] += value[i];
        }
    }
    for (int i = 0; i < 9; i++)
    {
        result[i] = result[i] / values.size();
    }
    return result;
}
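If mFacing needs to be displayed as a compass bearing, a small conversion sketch (the method name is illustrative; mFacing is in radians and may be NaN while the device lies flat):
// Convert mFacing (radians, NaN while flat) to a 0..360 degree bearing.
private float bearingDegrees(float facingRadians)
{
    if (Float.isNaN(facingRadians)) return Float.NaN; // undefined while flat
    float deg = (float) Math.toDegrees(facingRadians);
    return (deg + 360f) % 360f;
}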
I'm trying to get the direction of the camera in Android. I have code that works perfectly in portrait (I test it by slowly turning in a circle and looking at updates 1 s apart), but it isn't working at all in landscape: the numbers seem to change randomly. It also gets totally out of whack after switching from portrait to landscape. Here's my code:
public void onSensorChanged(SensorEvent event) {
    switch (event.sensor.getType()) {
        case Sensor.TYPE_ACCELEROMETER:
            accelerometerValues = event.values.clone();
            break;
        case Sensor.TYPE_MAGNETIC_FIELD:
            geomagneticMatrix = event.values.clone();
            break;
        default:
            break;
    }
    if (geomagneticMatrix != null && accelerometerValues != null) {
        float[] R = new float[16];
        float[] I = new float[16];
        float[] outR = new float[16];
        // Get the rotation matrix, then remap it from camera surface to world coordinates
        SensorManager.getRotationMatrix(R, I, accelerometerValues, geomagneticMatrix);
        SensorManager.remapCoordinateSystem(R, SensorManager.AXIS_X, SensorManager.AXIS_Z, outR);
        float[] values = new float[4];
        SensorManager.getOrientation(outR, values);
        float direction = normalizeDegrees((float) Math.toDegrees(values[0]));
        float pitch = normalizeDegrees((float) Math.toDegrees(values[1]));
        float roll = normalizeDegrees((float) Math.toDegrees(values[2]));
        if ((int) direction != (int) lastDirection) {
            lastDirection = direction;
            for (CompassListener listener : listeners) {
                listener.onDirectionChanged(lastDirection, pitch, roll);
            }
        }
    }
}
Any ideas what I'm doing wrong? I freely admit I don't 100% understand this. I also don't know why Google deprecated the orientation sensor; it seems like a common enough desire.
Did you consider that when you change from portrait to landscape, the accelerometer axes change? For example, the Y-axis becomes the Z-axis, and so on. This might be one source of the strange behavior.
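One common way to compensate, sketched here under the assumption that R has been filled by SensorManager.getRotationMatrix() and that mDisplay is the default display (as in an answer further down), is to remap the coordinate system before calling getOrientation():
// Sketch: remap for a 90-degree-rotated (landscape) display before
// extracting the orientation.
float[] outR = new float[16];
if (mDisplay.getRotation() == Surface.ROTATION_90) {
    SensorManager.remapCoordinateSystem(R,
            SensorManager.AXIS_Y, SensorManager.AXIS_MINUS_X, outR);
} else {
    outR = R; // natural orientation: use the matrix as-is
}
float[] values = new float[3];
SensorManager.getOrientation(outR, values);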
I seem to have solved it, or at least improved it to the point where I know what the problem was. I put in a filter such that instead of delivering a single sensor reading, I remember the last reading and apply a delta to it. Each new sensor point is allowed to add a maximum of 5 degrees. This completely filters out the weird hops and forces convergence to a value. I still see an occasional odd jump, but I figure what I need is a more sophisticated filter. New code:
public void onSensorChanged(SensorEvent event) {
    if (event.accuracy == SensorManager.SENSOR_STATUS_UNRELIABLE)
        return;
    switch (event.sensor.getType()) {
        case Sensor.TYPE_ACCELEROMETER:
            accelerometerValues = event.values.clone();
            break;
        case Sensor.TYPE_MAGNETIC_FIELD:
            geomagneticMatrix = event.values.clone();
            break;
    }
    if (geomagneticMatrix != null && accelerometerValues != null) {
        float[] R = new float[16];
        float[] I = new float[16];
        float[] outR = new float[16];
        // Get the rotation matrix, then remap it from camera surface to world coordinates
        SensorManager.getRotationMatrix(R, I, accelerometerValues, geomagneticMatrix);
        SensorManager.remapCoordinateSystem(R, SensorManager.AXIS_X, SensorManager.AXIS_Z, outR);
        float[] values = new float[4];
        SensorManager.getOrientation(outR, values);
        int direction = filterChange(normalizeDegrees(Math.toDegrees(values[0])));
        int pitch = normalizeDegrees(Math.toDegrees(values[1]));
        int roll = normalizeDegrees(Math.toDegrees(values[2]));
        if (direction != lastDirection) {
            lastDirection = direction;
            lastPitch = pitch;
            lastRoll = roll;
            for (CompassListener listener : listeners) {
                listener.onDirectionChanged(lastDirection, pitch, roll);
            }
        }
    }
}

// Normalize a degree value to 0..360 instead of -180..180
private int normalizeDegrees(double degrees) {
    return (int) ((degrees + 360) % 360);
}

// We want to ignore large bumps in individual readings, so cap the number
// of degrees the direction can change per report.
private int filterChange(int newDir) {
    int change = newDir - lastDirection;
    int circularChange = newDir - (lastDirection + 360);
    int smallestChange;
    if (Math.abs(change) < Math.abs(circularChange)) {
        smallestChange = change;
    } else {
        smallestChange = circularChange;
    }
    // Clamp the smallest change to +/-5 degrees per reading
    smallestChange = Math.max(Math.min(smallestChange, 5), -5);
    return lastDirection + smallestChange;
}
What I want to happen is to remap the coordinate system when the phone is turned away from its "natural" orientation, so that when the phone is in landscape it reads the same values as if it were being held in portrait.
I'm checking to see if rotation equals Surface.ROTATION_90, and if so, then remap the coordinate system.
I admit I don't quite understand how to do it properly, and could use a little guidance.
So, you need to run these two methods:
SensorManager.getRotationMatrix(inR, I, grav, mag);
SensorManager.remapCoordinateSystem(inR, SensorManager.AXIS_Y, SensorManager.AXIS_MINUS_X, outR);
What's required to pass into these methods? I created a new float array, then passed just the orientation-sensor data as the magnetic field, which didn't work. So I registered both the accelerometer and magnetic field sensors, fed the data from both of those to the getRotationMatrix method, and I always get a NullPointerException (even though the JavaDoc says some arguments can be null). I even tried passing data to each argument, and still got a NullPointerException.
My question is, what is the proper data that I need to pass into the getRotationMatrix method?
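For what it's worth, that NullPointerException usually means getRotationMatrix was called before both sensors had delivered values. A minimal sketch of the proper inputs (a float[9] or float[16] output array, an optional inclination output that may legitimately be null, then the latest accelerometer and magnetic field vectors):
private float[] grav, mag; // latest accelerometer / magnetic field samples

@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER)
        grav = event.values.clone();
    else if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD)
        mag = event.values.clone();
    if (grav == null || mag == null) return; // wait until both have arrived

    float[] inR = new float[9];
    float[] outR = new float[9];
    // The second argument (the inclination matrix) may be null.
    if (SensorManager.getRotationMatrix(inR, null, grav, mag)) {
        SensorManager.remapCoordinateSystem(inR,
                SensorManager.AXIS_Y, SensorManager.AXIS_MINUS_X, outR);
    }
}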
I found that a very simple way to do this is the one used in the SDK sample AccelerometerPlay.
First you get your display like this, for example in onResume():
WindowManager windowManager = (WindowManager) getSystemService(WINDOW_SERVICE);
mDisplay = windowManager.getDefaultDisplay();
Then in onSensorChanged() you can use this simple code:
@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() != Sensor.TYPE_ACCELEROMETER)
        return;
    switch (mDisplay.getRotation()) {
        case Surface.ROTATION_0:
            mSensorX = event.values[0];
            mSensorY = event.values[1];
            break;
        case Surface.ROTATION_90:
            mSensorX = -event.values[1];
            mSensorY = event.values[0];
            break;
        case Surface.ROTATION_180:
            mSensorX = -event.values[0];
            mSensorY = -event.values[1];
            break;
        case Surface.ROTATION_270:
            mSensorX = event.values[1];
            mSensorY = -event.values[0];
            break;
    }
}
Hope this will help.
This is my code, and it works without NPEs. Note that I have just one listener, but you have to register it to listen to both sensors (ACCELEROMETER and MAGNETIC_FIELD).
private SensorEventListener mOrientationSensorsListener = new SensorEventListener() {
    private float[] mR = new float[9];
    private float[] mRemappedR = new float[9];
    private float[] mGeomagneticVector = new float[3];
    private float[] mGravityVector = new float[3];

    @Override
    public void onSensorChanged(SensorEvent event) {
        synchronized (this) {
            if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
                mGravityVector = Util.exponentialSmoothing(event.values, mGravityVector, 0.2f);
            } else if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) {
                mGeomagneticVector = Util.exponentialSmoothing(event.values, mGeomagneticVector, 0.5f);
                SensorManager.getRotationMatrix(mR, null, mGravityVector, mGeomagneticVector);
                SensorManager.remapCoordinateSystem(mR, SensorManager.AXIS_Y, SensorManager.AXIS_MINUS_X, mRemappedR);
            }
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
        // No-op; required by the SensorEventListener interface
    }
};
The exponentialSmoothing method does some smoothing of the sensor results and looks like this (the alpha value can go from 0 to 1, where 1 means no smoothing at all):
public static float[] exponentialSmoothing(float[] input, float[] output, float alpha) {
    for (int i = 0; i < input.length; i++) {
        output[i] = output[i] + alpha * (input[i] - output[i]);
    }
    return output;
}
As for the synchronized bit: I'm not sure that it is needed; I just read it somewhere and added it.
I'm having some troubles implementing an augmented reality app for android and I'm hoping that someone could help me. (Sorry for my English...)
Basically I'm getting values from the accelerometer and magnetic field sensors, then, as I read, remapCoordinateSystem(inR, AXIS_X, AXIS_Z, outR)... and eventually getOrientation...
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
        mGravity = event.values;
    }
    if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) {
        mGeomagnetic = event.values;
    }
    if (mGravity != null && mGeomagnetic != null) {
        float[] R = new float[9];
        float[] I = new float[9];
        boolean success = SensorManager.getRotationMatrix(R, I, mGravity, mGeomagnetic);
        float[] outR = new float[9];
        SensorManager.remapCoordinateSystem(R, SensorManager.AXIS_X, SensorManager.AXIS_Z, outR);
        if (success) {
            float[] orientation = new float[3];
            SensorManager.getOrientation(outR, orientation);
            // Here I pass the orientation values to the object, which is rendered by the min3d framework
        }
    }
}
Am I getting the values correctly? Or do I need to transform them into degrees? I'm lost...
Then I rotate my 3D object with the values I have read from the sensors... but the object isn't moving at all.
public void updateScene() {
    objModel.rotation().y = _orientation[2];
    objModel.rotation().x = _orientation[1];
    objModel.rotation().z = _orientation[0];
}
OpenGL is not my friend... so I'm not sure I'm transforming correctly. What is the order of the rotation axes, or does it not matter? And which value from orientation should correspond to which axis of the 3D object loaded by Min3D?
If this is not the path I must follow... could someone guide me to the correct one please? It's been a few weeks fighting with this.
Thank you so much... (StackOverflow lover)
I had an issue with
mGeomagnetic = event.values;
You should write
mGeomagnetic = event.values.clone();
instead.
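On the degrees question above: getOrientation() returns radians, so if the renderer expects degrees (min3d appears to, though treat that as an assumption), convert before assigning:
// getOrientation() yields azimuth/pitch/roll in radians;
// convert to degrees if the renderer expects them (assumed here for min3d).
objModel.rotation().y = (float) Math.toDegrees(_orientation[2]);
objModel.rotation().x = (float) Math.toDegrees(_orientation[1]);
objModel.rotation().z = (float) Math.toDegrees(_orientation[0]);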