Get min and max values for getOrientation on accelerometer? - android

I am currently working with the accelerometer on Android, and have run into an interesting situation. I need to find out the maximum values for the following, WITHOUT requiring the end user to flop their phone around in some elaborate calibration routine. The values I am looking for specifically are...
@Override
public void onSensorChanged(SensorEvent event) {
    SensorManager.getOrientation(outR, xyz);
    float X, Y, XMax, YMax;
    X = xyz[1];
    Y = xyz[2];
    XMax = ???;
    YMax = ???;
}
I've poked around online, but have not had much luck determining what the values would be (it appears they change from phone to phone), nor have I seen anyone mention a way to pull those values from somewhere.
Does anyone know how I might determine the maximum/minimum values of xyz[1] and xyz[2]?
UPDATE: An Example of what I mean is the following:
When I directly report the outputted float from xyz[1] and move my phone forward and backward all the way, I get the following results:
min value it reports: -1.50 (give or take a tenth)
max value it reports: 1.50 (give or take a tenth)
When I roll my phone left and right, I get the following for xyz[2]:
min value it reports: -3.14 (give or take a tenth)
max value it reports: 3.14 (give or take a tenth)
I was told this changes between phones; however, that could be wrong.

I don't know what max/min values you're talking about, but getOrientation will populate a vector with the following data:
xyz[0]: azimuth [-π,π]
xyz[1]: pitch [-π,π]
xyz[2]: roll [-π/2,π/2]
From docs:
All three angles above are in radians and positive in the
counter-clockwise direction.
Refer also to SensorManager.java (around line 1094):
https://android.googlesource.com/platform/frameworks/base/+/b267554/core/java/android/hardware/SensorManager.java
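For reference, getOrientation is normally fed a rotation matrix produced by getRotationMatrix() from paired accelerometer and magnetometer readings. A minimal sketch of that typical usage (accelValues and magValues are assumed to hold the latest TYPE_ACCELEROMETER and TYPE_MAGNETIC_FIELD readings):

float[] outR = new float[9];
float[] inclination = new float[9];
float[] xyz = new float[3];
// getRotationMatrix() returns false when the readings are unusable
// (e.g. the device is in free fall).
if (SensorManager.getRotationMatrix(outR, inclination, accelValues, magValues)) {
    SensorManager.getOrientation(outR, xyz);
    float azimuth = xyz[0]; // [-π, π]
    float pitch   = xyz[1]; // [-π, π] per the docs
    float roll    = xyz[2]; // [-π/2, π/2] per the docs
}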

Related

Calibrating 3d Accelerometer for 2d Game

I am making a 2D game. The phone is held horizontally, and a character moves up/down and left/right to avoid obstacles. The character is controlled by the accelerometer on the phone. Everything works fine if the player doesn't mind that (0,0) (the point where the character stands still) is where the phone is held perfectly flat. In this scenario it's possible to just read the Y and X values directly and use them to control the character. The accelerometer values are between -10 and 10 (they get multiplied by an acceleration constant to decide the movement speed of the character); libgdx is the framework used.
The problem is that holding the phone perfectly flat isn't very comfortable, so the idea is to calibrate it so that (0,0) is set to the phone's position at a specific point in time.
Which brings me to my question: how would I do this? I tried just reading the current X and Y values and subtracting them. The problem with that is that when the phone is held at a 90 degree angle, the X offset value is 10 (which is the max value), so it ends up being impossible to move because the reading will never go over 10 (10 - 10 = 0). The Z axis has to come into play here somehow; I'm just not sure how.
Thanks for the help. I tried explaining as best I can; I did try searching for the solution, but I don't even know what the proper term is for what I'm looking for.
An old question, but I am providing the answer here as I couldn't find a good answer for Android or LibGDX anywhere. The code below is based on a solution someone posted for iOS (sorry, I have lost the reference).
You can do this in three parts:
Capture a vector representing the neutral direction:
Vector3 tiltCalibration = new Vector3(
        Gdx.input.getAccelerometerX(),
        Gdx.input.getAccelerometerY(),
        Gdx.input.getAccelerometerZ() );
Transform this vector into a rotation matrix:
public void initTiltControls( Vector3 tiltCalibration ) {
    Vector3.tmp.set( 0, 0, 1 );
    Vector3.tmp2.set( tiltCalibration ).nor();
    Quaternion rotateQuaternion = new Quaternion().setFromCross( Vector3.tmp, Vector3.tmp2 );
    Matrix4 m = new Matrix4( Vector3.Zero, rotateQuaternion, new Vector3( 1f, 1f, 1f ) );
    this.calibrationMatrix = m.inv();
}
Whenever you need inputs from the accelerometer, first run them through the rotation matrix:
public void handleAccelerometerInputs( float x, float y, float z ) {
    Vector3.tmp.set( x, y, z );
    Vector3.tmp.mul( this.calibrationMatrix );
    x = Vector3.tmp.x;
    y = Vector3.tmp.y;
    z = Vector3.tmp.z;
    // use x, y and z here
    ...
}
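Note that the static Vector3.tmp temporaries used above were removed in later libGDX versions. A self-contained sketch of the same logic with local vectors (the method names are mine; the math is unchanged):

private Matrix4 calibrationMatrix;

public void initTiltControls(Vector3 tiltCalibration) {
    Vector3 up = new Vector3(0, 0, 1);
    Vector3 neutral = new Vector3(tiltCalibration).nor();
    // Quaternion rotating z onto the neutral direction, inverted into a
    // matrix that maps raw readings into the calibrated frame.
    Quaternion rotate = new Quaternion().setFromCross(up, neutral);
    calibrationMatrix = new Matrix4(Vector3.Zero, rotate, new Vector3(1f, 1f, 1f)).inv();
}

public Vector3 calibrated(float x, float y, float z) {
    return new Vector3(x, y, z).mul(calibrationMatrix);
}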
For a simple solution you can look at the methods:
Gdx.input.getAzimuth(), Gdx.input.getPitch(), Gdx.input.getRoll()
The downside is that those somehow use the internal compass to give your device's rotation relative to North/South/East/West. I only tested that very briefly, so I'm not 100% sure about it. Might be worth a look.
The more complex method involves some trigonometry: basically, you have to calculate the angle the phone is held at from Gdx.input.getAccelerometerX/Y/Z(). It must be something like this (for rotation along the longer side of the phone):
Math.atan(Gdx.input.getAccelerometerX() / Gdx.input.getAccelerometerZ());
For both approaches you then store the initial angle and subtract it again later on. You have to watch out for the ranges though: Math.atan(...) only returns values between -Pi/2 and Pi/2, while Math.atan2(...) covers the full -Pi to Pi.
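A minimal sketch of that angle-offset idea (the axis choice follows the atan example above and the variable names are mine; atan2 is used to avoid the range problem):

// Capture the neutral tilt angle once, when the player calibrates.
float neutralAngle = (float) Math.atan2(
        Gdx.input.getAccelerometerX(),
        Gdx.input.getAccelerometerZ());

// Later, each frame: current tilt relative to the neutral position.
float currentAngle = (float) Math.atan2(
        Gdx.input.getAccelerometerX(),
        Gdx.input.getAccelerometerZ());
float tilt = currentAngle - neutralAngle;
// Wrap into [-PI, PI] so crossing the boundary doesn't jump.
if (tilt > Math.PI)  tilt -= 2 * Math.PI;
if (tilt < -Math.PI) tilt += 2 * Math.PI;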
Hopefully that'll get you started somehow. You might search for "Accelerometer to pitch/roll/rotation" and similar, too.

Android sensor: getRotationMatrix() returns wrong values, why?

It's been several days since I started using this function, and I have not yet succeeded in obtaining valid results.
What I want is basically to convert the acceleration vector from the device's coordinate system to real-world coordinates. I know that it is possible, because I have the acceleration in relative coordinates and I know the orientation of the device in the real-world system.
Reading the Android developer docs, it seems that using getRotationMatrix() I get R = rotation matrix.
So if I want A (the acceleration vector in the world system) from A' (the acceleration vector in the phone system), I simply must do:
A = R * A'
But I can't understand why the vector A ALWAYS has its first and second components zero (example: +0.00; -0.00; +6.43).
My current code is similar to this:
public void onSensorChanged(SensorEvent event) {
    synchronized (this) {
        switch (event.sensor.getType()) {
            case Sensor.TYPE_ACCELEROMETER:
                accelerometervalues = event.values.clone();
                break;
            case Sensor.TYPE_MAGNETIC_FIELD:
                geomagneticmatrix = event.values.clone();
                break;
        }
        if (geomagneticmatrix != null && accelerometervalues != null) {
            float[] Rs = new float[16];
            float[] I = new float[16];
            SensorManager.getRotationMatrix(Rs, I, accelerometervalues, geomagneticmatrix);
            float resultVec[] = new float[4];
            float relativacc[] = new float[4];
            relativacc[0] = accelerationvalues[0];
            relativacc[1] = accelerationvalues[1];
            relativacc[2] = accelerationvalues[2];
            relativacc[3] = 0;
            Matrix.multiplyMV(resultVec, 0, Rs, 0, relativacc, 0);
            // resultVec[] is the acceleration vector relative to the world coordinate system... but it doesn't WORK!!!!!
        }
    }
}
This question is very similar to Transforming accelerometer's data from device's coordinates to real world coordinates, but I can't find the solution there... I have tried every way I could think of.
Please help me, I need help!!!
UPDATE:
Now my code is below. I tried writing out the matrix product explicitly, but nothing changed:
float[] Rs = new float[9];
float[] I = new float[9];
SensorManager.getRotationMatrix(Rs, I, accelerationvalues, geomagneticmatrix);
float[] resultVec = new float[4];
resultVec[0] = Rs[0]*accelerationvalues[0] + Rs[1]*accelerationvalues[1] + Rs[2]*accelerationvalues[2];
resultVec[1] = Rs[3]*accelerationvalues[0] + Rs[4]*accelerationvalues[1] + Rs[5]*accelerationvalues[2];
resultVec[2] = Rs[6]*accelerationvalues[0] + Rs[7]*accelerationvalues[1] + Rs[8]*accelerationvalues[2];
Here is an example of the data read and the resulting values (the actual log was posted as an image; each line listed, separated by spaces, Rs[0]...Rs[8], then accelerationvalues[0]...accelerationvalues[2], then resultVec[0]...resultVec[2]).
As you can see, the x and y components in real-world coordinates are (around) zero even if you move the phone quickly, while the relative acceleration vector correctly detects every movement!!!
SOLUTION: see my answer below.
Finally I found the answer:
The errors in the numbers come from float multiplication, which is not the same as double multiplication. That is the source of the tiny values.
Add to this the fact that the rotation matrix isn't constant while the phone is accelerating, even if its orientation stays the same. So it is impossible to translate the acceleration vector to absolute coordinates during motion... It's hard, but it's the reality.
FYI, the orientation vector is built from magnetometer data AND the gravity vector. This causes a circular problem: converting relative acceleration needs the orientation, which needs the magnetic field AND gravity, but we only know gravity from relative acceleration when the phone is still... so we are back at the beginning.
This is confirmed in the Android developer docs, where it is explained that the rotation matrix gives a true result only when the phone isn't accelerating (e.g. they mention free fall; indeed there should be no gravity measurement then) and isn't in an irregular magnetic field:
The matrices returned by this function are meaningful only when the
device is not free-falling and it is not close to the magnetic north.
If the device is accelerating, or placed into a strong magnetic field,
the returned matrices may be inaccurate.
In other words: fully useless...
You can confirm this by doing a simple experiment on a table with a sensor-viewer app or something like that.
You must track down this arithmetic error before you worry about rotation, acceleration or anything else.
You have confirmed that
resultVec[0]=Rs[0]*accelerationvalues[0];
gives you
Rs[0]: 0.24105562
accelerationValues[0]: 6.891896
resultVec[0]: 1.1920929E-7
So once again, simplify. Try this:
Rs[0] = 0.2f;
resultVec[0] = Rs[0] * 6.8f;
EDIT:
The last one gave resultVec[0]=1.36, so let's try this:
Rs[0] = 0.2f;
accelerationValues[0] = 6.8f;
resultVec[0] = Rs[0] * accelerationValues[0];
If you do the sums using the printed values you have appended, I get `(0.00112, -0.0004, 10)`, which is not as small as what you have. Therefore there is an arithmetic error!
Could the problem be that you are using accelerationvalues[] in the last block, and accelerometervalues[] later?
I have developed several applications that make use of Android sensors, so I am answering one of your questions according to my experience:
But I can't understand why the vector A ALWAYS has its first and second components zero (example: +0.00; -0.00; +6.43)
I have observed this problem with the acceleration sensor and the magnetic field sensor, too. The readings are zero for some of the axes (two, as you point out, or just one on other occasions). This problem happens when you have just enabled the sensors (registerListener()), and I assume that it is related to some kind of sensor initialization.
In the case of the acceleration sensor, I have observed that just a small shake of the device makes it start giving correct readings.
The correct solution would be for onAccuracyChanged() to give correct information about the sensor state. It should be returning a status of SensorManager.SENSOR_STATUS_UNRELIABLE, but instead it permanently returns SensorManager.SENSOR_STATUS_ACCURACY_HIGH on all physical devices that I have tested so far. With onAccuracyChanged() properly implemented, you could ignore bad readings or ask the user to wait while the sensor is being initialized.
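A minimal sketch of what that gating would look like if onAccuracyChanged() behaved as documented (the sensorReliable field is my own illustrative name):

private boolean sensorReliable = true;

@Override
public void onAccuracyChanged(Sensor sensor, int accuracy) {
    // In practice many devices never report UNRELIABLE, as noted above.
    sensorReliable = (accuracy != SensorManager.SENSOR_STATUS_UNRELIABLE);
}

@Override
public void onSensorChanged(SensorEvent event) {
    if (!sensorReliable) {
        return; // Ignore readings until the sensor settles.
    }
    // ... use event.values here ...
}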

android remove gravity from accelerometer readings

I am developing an application for Android where I need to remove gravity from accelerometer readings. I have read multiple discussions on this problem, and I have also found an algorithm here, but I didn't really understand it.
I want to filter gravity from each axis, not from the total acceleration.
Could you please help me out? My code should be something like:
public void onSensorChanged(SensorEvent sensorEvent) {
    float vals[] = sensorEvent.values;
    float accelerationX = filterGravity(vals[0]);
    float accelerationY = filterGravity(vals[1]);
    float accelerationZ = filterGravity(vals[2]);
}
What code should I place in the filterGravity() method?
For a basic solution you need a low-pass filter; other approaches like a Kalman filter are pretty tough regarding the math behind them. A simple example for Android is one click away from your link, at http://developer.android.com/reference/android/hardware/SensorEvent.html#values.
Simply put, a low-pass filter builds a weighted average over all your history values. If you have, for example, a filtering factor of 0.1, it means that 10% of your current value is added to the previous mean value: newMeanValue = 10% of currentValue + 90% of oldMeanValue. That means even if there is an abrupt peak, it will only push your mean value up slowly, because of the 10%.
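A sketch of that filter applied per axis, following the pattern from the linked docs (ALPHA here is the 0.1 filtering factor from the example above; it needs tuning):

private static final float ALPHA = 0.1f;
private final float[] gravity = new float[3];
private final float[] linear = new float[3];

@Override
public void onSensorChanged(SensorEvent event) {
    for (int i = 0; i < 3; i++) {
        // Low-pass: 10% current reading, 90% previous estimate -> gravity.
        gravity[i] = ALPHA * event.values[i] + (1 - ALPHA) * gravity[i];
        // The high-pass remainder approximates gravity-free acceleration.
        linear[i] = event.values[i] - gravity[i];
    }
}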
Linear acceleration is what you need. Check Sensor.TYPE_LINEAR_ACCELERATION here.
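A short registration sketch (assuming this runs inside an Activity and listener is your SensorEventListener); the null check matters because not every device exposes this virtual sensor:

SensorManager sm = (SensorManager) getSystemService(SENSOR_SERVICE);
Sensor linearAccel = sm.getDefaultSensor(Sensor.TYPE_LINEAR_ACCELERATION);
if (linearAccel != null) {
    sm.registerListener(listener, linearAccel, SensorManager.SENSOR_DELAY_GAME);
} else {
    // Fall back to TYPE_ACCELEROMETER plus a low-pass filter (see above).
}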
If you do not have a phone with TYPE_LINEAR_ACCELERATION, you are stuck with TYPE_ACCELERATION, which cannot separate gravity (tilt) from linear acceleration.
One option is to apply the low-pass filter. Another approach is using sensor fusion if the gyroscope is available. Both approaches have their advantages and disadvantages.
I have lots of working examples in the open source project Acceleration Explorer.
Not too sure what you are trying to accomplish, but if you are looking for a normalized value per axis (roughly between -1 and 1), then all you do is divide the reading by 10:
public void onSensorChanged(SensorEvent sensorEvent) {
    float vals[] = sensorEvent.values;
    float accelerationX = (vals[0] / 10);
    float accelerationY = (vals[1] / 10);
    float accelerationZ = (vals[2] / 10);
}
In a game environment, when the sensor is at full tilt your object will be at its maximum speed.

Accelerometer values behaving like tilt meter?

The values I'm getting for accel, x, y and z below are not as expected.
It seems to be acting as a Tilt sensor rather than accelerometer.
When I throw the phone in the air and catch it, the accel value doesn't change by more than about 10%. Contrast this with rotating the phone randomly, where I get much larger variations of 50-100%!
What could explain this? I simply want to detect when the phone is in freefall, (and/or impacting something).
SensorManager sm = (SensorManager) getSystemService(SENSOR_SERVICE);
Sensor sensor = sm.getDefaultSensor(SensorManager.SENSOR_ACCELEROMETER);
sm.registerListener(
    sel = new SensorEventListener() {
        @Override public void onAccuracyChanged(Sensor arg0, int arg1) {}
        @Override public void onSensorChanged(SensorEvent arg0) {
            double x = se.values[0];
            double y = se.values[1];
            double z = se.values[2];
            double accel = Math.sqrt(
                    Math.pow(x, 2) +
                    Math.pow(y, 2) +
                    Math.pow(z, 2));
        }
    },
    sensor,
    SensorManager.SENSOR_DELAY_NORMAL
);
(As a side question, the values for x, y and z seem much higher than they should be, with accel averaging about 50-80 when standing still. Shouldn't it be around 9.8?)
The x, y and z values seem very sensitive to changes in the orientation of the phone, but not at all representative of acceleration. Am I missing something??
Example values with phone still, lying on back:
Accel = 85.36, x = 6.8, y = 45.25, z = 30.125
I had to replace
Sensor sensor = sm.getDefaultSensor(SensorManager.SENSOR_ACCELEROMETER);
with
Sensor sensor = sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
It could be that when you throw the phone it stays in almost the same plane or angle and moves at low speed, whereas when you turn the phone it changes its course and orientation rapidly, and that gives higher values in the result. "Accelerometer" may be a misnomer for a multi-function device; there could be a selection parameter for the function you really want results from.
With the phone lying on its back you should get close to zero on the X and Y sensors and about 9.8 on the Z sensor. The 9.8 is of course due to gravity.
My first thought would be that there is something wrong with the phone, and I would suggest trying the same code on another phone.
However, I notice that there is something wrong in the math, but I haven't figured out what yet.
With x, y, z having the values you mention, the resultant (square root of the sum of squares) works out to 54.78 rather than the 85.36 you mention in your post.
I'm quite new to Java, so I cannot easily spot what might be wrong and haven't yet had the opportunity to try that piece of code on my phone, but I think the math is simple enough for me to determine that the result is wrong (or at least I hope so).
The other thing to check (assuming you figure out the math problem) is that the small change when you throw the phone in the air might simply be due to the slow response time. The accelerometer output may simply be changing too slowly, so by the time the phone has landed the output wouldn't have changed that much. The response can be improved by using SENSOR_DELAY_GAME or SENSOR_DELAY_FASTEST instead of NORMAL.
By the way, shouldn't that be arg0.values[] rather than se.values[]? Where does the se come from? The sensor values go into the argument of onSensorChanged (arg0 in this case), so I cannot figure out how they are supposed to end up in se. (But then again, there are many things in Java I still don't understand.)

How to detect walking with Android accelerometer

I'm writing an application and my aim is to detect when a user is walking.
I'm using a Kalman filter like this:
float kFilteringFactor = 0.6f;

gravity[0] = (accelerometer_values[0] * kFilteringFactor) + (gravity[0] * (1.0f - kFilteringFactor));
gravity[1] = (accelerometer_values[1] * kFilteringFactor) + (gravity[1] * (1.0f - kFilteringFactor));
gravity[2] = (accelerometer_values[2] * kFilteringFactor) + (gravity[2] * (1.0f - kFilteringFactor));

linear_acceleration[0] = (accelerometer_values[0] - gravity[0]);
linear_acceleration[1] = (accelerometer_values[1] - gravity[1]);
linear_acceleration[2] = (accelerometer_values[2] - gravity[2]);

float magnitude = 0.0f;
magnitude = (float) Math.sqrt(linear_acceleration[0] * linear_acceleration[0]
        + linear_acceleration[1] * linear_acceleration[1]
        + linear_acceleration[2] * linear_acceleration[2]);
magnitude = Math.abs(magnitude);

if (magnitude > 0.2)
    // walking
The array gravity[] is initialized with 0s.
I can detect whether a user is walking or not (by looking at the magnitude of the acceleration vector), but my problem is that when a user is not walking and simply moves the phone, it looks as if he is walking.
Am I using the right filter?
Is it right to watch only the magnitude of the vector, or do I have to look at the individual values?
Google provides an API for this called DetectedActivity, which can be obtained using the ActivityRecognitionApi. Those docs can be accessed here and here.
DetectedActivity has the method public int getType() to get the current activity of the user, and also public int getConfidence(), which returns a value from 0 to 100. The higher the value returned by getConfidence(), the more certain the API is that the user is performing the returned activity (a usage sketch follows the constant summary below).
Here is a constant summary of what is returned by getType():
int IN_VEHICLE The device is in a vehicle, such as a car.
int ON_BICYCLE The device is on a bicycle.
int ON_FOOT The device is on a user who is walking or running.
int RUNNING The device is on a user who is running.
int STILL The device is still (not moving).
int TILTING The device angle relative to gravity changed significantly.
int UNKNOWN Unable to detect the current activity.
int WALKING The device is on a user who is walking.
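A hedged usage sketch (assuming Google Play Services is configured and activity updates were requested elsewhere via ActivityRecognitionClient.requestActivityUpdates() with a PendingIntent pointing at this service):

import android.app.IntentService;
import android.content.Intent;
import com.google.android.gms.location.ActivityRecognitionResult;
import com.google.android.gms.location.DetectedActivity;

public class ActivityDetectionService extends IntentService {
    public ActivityDetectionService() {
        super("ActivityDetectionService");
    }

    @Override
    protected void onHandleIntent(Intent intent) {
        if (ActivityRecognitionResult.hasResult(intent)) {
            ActivityRecognitionResult result = ActivityRecognitionResult.extractResult(intent);
            DetectedActivity activity = result.getMostProbableActivity();
            // The confidence cutoff of 75 is an arbitrary illustration.
            if (activity.getType() == DetectedActivity.WALKING
                    && activity.getConfidence() > 75) {
                // The user is probably walking.
            }
        }
    }
}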
My first intuition would be to run an FFT analysis on the sensor history, and see what frequencies have high magnitudes when walking.
It's essentially seeing what walking "sounds like": treating the accelerometer inputs like a microphone and looking for the frequencies that are loud when walking (in other words, at what frequency the biggest acceleration is happening).
I'd guess you'd be looking for a high magnitude at some low frequency (like footstep rate) or maybe something else. It would be interesting to see the data.
My guess is you run the FFT and look for the magnitude at some frequency to be greater than some threshold, or the difference between magnitudes of two of the frequencies is more than some amount. Again, the actual data would determine how you attempt to detect it.
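To make the idea concrete, here is a naive sketch of a frequency scan over a buffer of acceleration magnitudes (buffer size, sample rate, band limits, and the power threshold are all illustrative assumptions to be tuned on real data; a real implementation would use an FFT library instead of this O(n²) DFT):

public static boolean looksLikeWalking(float[] magnitudes, float sampleRateHz) {
    int n = magnitudes.length;
    double bestPower = 0, bestFreq = 0;
    double threshold = 1000; // assumption: tune on recorded data
    for (int k = 1; k < n / 2; k++) {
        double freq = k * sampleRateHz / n;
        if (freq < 0.5 || freq > 5.0) continue; // only plausible step rates
        double re = 0, im = 0;
        for (int t = 0; t < n; t++) { // naive DFT at bin k
            double angle = 2 * Math.PI * k * t / n;
            re += magnitudes[t] * Math.cos(angle);
            im -= magnitudes[t] * Math.sin(angle);
        }
        double power = re * re + im * im;
        if (power > bestPower) { bestPower = power; bestFreq = freq; }
    }
    // Walking if the dominant energy sits at a typical step rate (~1-3 Hz).
    return bestFreq >= 1.0 && bestFreq <= 3.0 && bestPower > threshold;
}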
For walking detection I use the derivative applied to the smoothed signal from the accelerometer. When the derivative is greater than a threshold value, I can suggest that it was a step. But I guess that it's not best practice; furthermore, it only works when the phone is placed in a pants pocket.
The following code was used in this app https://play.google.com/store/apps/details?id=com.tartakynov.robotnoise
@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() != Sensor.TYPE_ACCELEROMETER) {
        return;
    }
    final float z = smooth(event.values[2]); // scalar kalman filter
    if (Math.abs(z - mLastZ) > LEG_THRSHOLD_AMPLITUDE) {
        mInactivityCount = 0;
        int currentActivity = (z > mLastZ) ? LEG_MOVEMENT_FORWARD : LEG_MOVEMENT_BACKWARD;
        if (currentActivity != mLastActivity) {
            mLastActivity = currentActivity;
            notifyListeners(currentActivity);
        }
    } else {
        if (mInactivityCount > LEG_THRSHOLD_INACTIVITY) {
            if (mLastActivity != LEG_MOVEMENT_NONE) {
                mLastActivity = LEG_MOVEMENT_NONE;
                notifyListeners(LEG_MOVEMENT_NONE);
            }
        } else {
            mInactivityCount++;
        }
    }
    mLastZ = z;
}
EDIT: I don't think it's accurate enough, since when walking normally the average acceleration would be near 0. The most you could do by measuring acceleration is detect when someone starts or stops walking (but, as you said, it's difficult to filter that out from the device being moved by someone standing in one place).
So what I wrote earlier probably wouldn't work anyway:
You can "predict" whether the user is moving by discarding the times when the user is not moving (obvious). The first two options coming to my mind are:
Check whether the phone is "hidden", using the proximity and light sensors (optional). This method is less accurate but easier.
Control the continuity of the movement: if the phone has been moving for more than, say, 10 seconds and the movement is not negligible, then you consider that the user is walking. I know it's not perfect either, but it's difficult without using any kind of positioning; by the way, why don't you just use the LocationManager?
Try detecting the up and down oscillations, the fore and aft oscillations, and the frequency of each, and make sure they stay aligned within bounds on average. You would be detecting walking, and specifically that person's gait style, which should remain relatively constant for several steps at a time to qualify as moving.
As long as the last 3 oscillations line up within reason, conclude that walking is occurring, as long as the following is also true:
You measure horizontal acceleration and update a velocity value with it. Velocity will drift with time, so you need to keep a moving average of velocity smoothed over the time of a step; as long as it doesn't drift by more than, say, half of walking speed per 3 oscillations, it's walking, but only if it initially rose to walking speed within a short time, i.e. half a second or 2 oscillations perhaps.
All of that should just about cover it.
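A rough sketch of that velocity bookkeeping (the sample period, smoothing factor, and speed thresholds are all illustrative assumptions, not tuned values):

private float velocity = 0f;     // integrated horizontal velocity (m/s)
private float smoothedVel = 0f;  // moving average of the velocity
private static final float DT = 0.02f;         // assumed 50 Hz sampling
private static final float EMA_ALPHA = 0.05f;  // roughly one step of smoothing
private static final float WALK_SPEED = 1.4f;  // typical walking speed, m/s

public boolean updateAndCheckWalking(float horizontalAccel) {
    velocity += horizontalAccel * DT;             // integrate acceleration
    smoothedVel += EMA_ALPHA * (velocity - smoothedVel);
    // Walking while the smoothed estimate stays near walking speed;
    // drifting beyond roughly half walking speed breaks the assumption.
    return Math.abs(smoothedVel) > WALK_SPEED / 2
            && Math.abs(smoothedVel) < WALK_SPEED * 1.5f;
}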
Of course, a little AI would help make things simpler, or just as complex but amazingly accurate, if you considered all of these as inputs to a NN, i.e. as preprocessing.
