I'm writing an application and my aim is to detect when a user is walking.
I'm using a Kalman filter like this:
float kFilteringFactor = 0.6f;

// low-pass filter to isolate gravity
gravity[0] = (accelerometer_values[0] * kFilteringFactor) + (gravity[0] * (1.0f - kFilteringFactor));
gravity[1] = (accelerometer_values[1] * kFilteringFactor) + (gravity[1] * (1.0f - kFilteringFactor));
gravity[2] = (accelerometer_values[2] * kFilteringFactor) + (gravity[2] * (1.0f - kFilteringFactor));

// subtract gravity to get linear acceleration
linear_acceleration[0] = accelerometer_values[0] - gravity[0];
linear_acceleration[1] = accelerometer_values[1] - gravity[1];
linear_acceleration[2] = accelerometer_values[2] - gravity[2];

float magnitude = (float) Math.sqrt(
        linear_acceleration[0] * linear_acceleration[0]
        + linear_acceleration[1] * linear_acceleration[1]
        + linear_acceleration[2] * linear_acceleration[2]);

if (magnitude > 0.2f) {
    // walking
}
The array gravity[] is initialized with 0s.
I can detect when a user is walking or not (by looking at the magnitude of the acceleration vector), but my problem is that when the user is not walking and just moves the phone, it looks as if they are walking.
Am I using the right filter?
Is it right to watch only the magnitude of the vector, or do I have to look at the individual axis values?
Google provides an API for this called DetectedActivity that can be obtained using the ActivityRecognitionApi. Those docs can be accessed here and here.
DetectedActivity has the method public int getType() to get the current activity of the user and also public int getConfidence() which returns a value from 0 to 100. The higher the value returned by getConfidence(), the more certain the API is that the user is performing the returned activity.
Here is a summary of the constants returned by getType() (a minimal usage sketch follows the list):
IN_VEHICLE - The device is in a vehicle, such as a car.
ON_BICYCLE - The device is on a bicycle.
ON_FOOT - The device is on a user who is walking or running.
RUNNING - The device is on a user who is running.
STILL - The device is still (not moving).
TILTING - The device angle relative to gravity changed significantly.
UNKNOWN - Unable to detect the current activity.
WALKING - The device is on a user who is walking.
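For reference, here is a minimal sketch of consuming these results, assuming activity updates are delivered to a BroadcastReceiver via a PendingIntent; the receiver name and the confidence cut-off are placeholders of mine, not part of the official docs:

import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import com.google.android.gms.location.ActivityRecognitionResult;
import com.google.android.gms.location.DetectedActivity;

public class ActivityUpdateReceiver extends BroadcastReceiver {
    @Override
    public void onReceive(Context context, Intent intent) {
        if (ActivityRecognitionResult.hasResult(intent)) {
            ActivityRecognitionResult result = ActivityRecognitionResult.extractResult(intent);
            DetectedActivity mostProbable = result.getMostProbableActivity();
            // Treat the user as walking only when the API is reasonably confident.
            if (mostProbable.getType() == DetectedActivity.WALKING
                    && mostProbable.getConfidence() >= 75) {
                // ... user is walking
            }
        }
    }
}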
My first intuition would be to run an FFT analysis on the sensor history, and see what frequencies have high magnitudes when walking.
It's essentially seeing what walking "sounds like": treat the accelerometer inputs like a microphone and see which frequencies are loud while walking (in other words, at what frequency the biggest accelerations happen).
I'd guess you'd be looking for a high magnitude at some low frequency (around the footstep rate) or maybe something else. It would be interesting to see the data.
My guess is you run the FFT and check whether the magnitude at some frequency is greater than some threshold, or whether the difference between the magnitudes of two of the frequencies exceeds some amount. Again, the actual data would determine how you attempt to detect it.
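Something along these lines could serve as a starting point; it is only a sketch, and the buffer size, sample rate and the 1-3 Hz "step band" are assumptions to verify against real recordings:

// Naive DFT over a buffer of linear-acceleration magnitudes; returns the
// frequency (Hz) of the strongest non-DC bin. Buffer length and sample rate
// are illustrative (e.g. ~256 samples at ~50 Hz).
static double dominantFrequencyHz(float[] samples, double sampleRateHz) {
    int n = samples.length;
    double bestMagnitude = 0;
    int bestBin = 0;
    for (int k = 1; k <= n / 2; k++) {              // skip k = 0 (the DC component)
        double re = 0, im = 0;
        for (int t = 0; t < n; t++) {
            double angle = 2 * Math.PI * k * t / n;
            re += samples[t] * Math.cos(angle);
            im -= samples[t] * Math.sin(angle);
        }
        double magnitude = Math.sqrt(re * re + im * im);
        if (magnitude > bestMagnitude) {
            bestMagnitude = magnitude;
            bestBin = k;
        }
    }
    return bestBin * sampleRateHz / n;              // convert bin index to Hz
}

// Possible decision rule: walking if the dominant frequency falls in the typical
// step band, e.g. 1.0 Hz < f < 3.0 Hz (a threshold on the bin magnitude would
// also be needed in practice).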
For walking detection I use the derivative applied to the smoothed signal from the accelerometer. When the derivative is greater than a threshold value, I take it as a step. But I guess that it's not best practice; furthermore, it only works when the phone is in a pants pocket.
The following code was used in this app https://play.google.com/store/apps/details?id=com.tartakynov.robotnoise
@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() != Sensor.TYPE_ACCELEROMETER) {
        return;
    }
    final float z = smooth(event.values[2]); // scalar kalman filter
    if (Math.abs(z - mLastZ) > LEG_THRSHOLD_AMPLITUDE) {
        mInactivityCount = 0;
        int currentActivity = (z > mLastZ) ? LEG_MOVEMENT_FORWARD : LEG_MOVEMENT_BACKWARD;
        if (currentActivity != mLastActivity) {
            mLastActivity = currentActivity;
            notifyListeners(currentActivity);
        }
    } else {
        if (mInactivityCount > LEG_THRSHOLD_INACTIVITY) {
            if (mLastActivity != LEG_MOVEMENT_NONE) {
                mLastActivity = LEG_MOVEMENT_NONE;
                notifyListeners(LEG_MOVEMENT_NONE);
            }
        } else {
            mInactivityCount++;
        }
    }
    mLastZ = z;
}
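The smooth() helper is not included in the answer; a minimal scalar Kalman-style filter along these lines would fit, where the noise constants Q and R are guesses that need tuning (this is my reconstruction, not the original app's code):

// 1-D Kalman filter used to smooth the raw z-axis reading.
private float mEstimate = 0f;          // current state estimate
private float mErrorCovariance = 1f;   // current estimate uncertainty
private static final float Q = 0.05f;  // process noise (assumed)
private static final float R = 0.5f;   // measurement noise (assumed)

private float smooth(float measurement) {
    // predict step: the model is "the value stays the same", so only uncertainty grows
    mErrorCovariance += Q;
    // update step: blend prediction and measurement by the Kalman gain
    float gain = mErrorCovariance / (mErrorCovariance + R);
    mEstimate += gain * (measurement - mEstimate);
    mErrorCovariance *= (1f - gain);
    return mEstimate;
}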
EDIT: I don't think it's accurate enough, since when walking normally the average acceleration is near 0. The most you could do by measuring acceleration is detect when someone starts or stops walking (but as you said, it's difficult to separate that from the device simply being moved by someone standing in one place).
So... what I wrote earlier probably wouldn't work anyway:
You can "predict" whether the user is moving by discarding the cases where the user is clearly not moving. The first two options that come to mind are:
Check whether the phone is "hidden", using the proximity and (optionally) light sensors. This method is less accurate but easier.
Check the continuity of the movement: if the phone has been moving for more than, say, 10 seconds and the movement is not negligible, then consider that the user is walking. I know it's not perfect either, but it's difficult without using any kind of positioning. By the way... why don't you just use LocationManager?
Try detecting the up-and-down oscillations and the fore-and-aft oscillations, plus the frequency of each, and make sure they stay aligned within bounds on average. You would then be detecting walking, and specifically that person's gait style, which should remain relatively constant for several steps in a row to qualify as moving.
As long as the last 3 oscillations line up within reason, conclude that walking is occurring, provided this also holds:
Measure horizontal acceleration and update a velocity value with it. Velocity will drift with time, so keep a moving average of velocity smoothed over the duration of a step; as long as it doesn't drift by more than, say, half of walking speed per 3 oscillations, it counts as walking, but only if it initially rose to walking speed within a short time, i.e. half a second or perhaps 2 oscillations (a rough sketch of this check appears below).
All of that should just about cover it.
Of course, a little AI would help make things simpler (or just as complex) but amazingly accurate, if you fed all of these signals as inputs to a neural network, i.e. as preprocessing.
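A very rough sketch of that velocity/drift check, with the smoothing factor, walking speed and drift bounds all being illustrative constants rather than tested values:

// Integrate horizontal acceleration into a velocity and keep an exponential
// moving average of its magnitude; all constants here are placeholders to tune.
private float vx = 0f, vy = 0f;                  // integrated horizontal velocity (m/s)
private float avgSpeed = 0f;                     // speed smoothed over roughly one step
private long lastTimestampNs = 0;
private static final float ALPHA = 0.05f;        // EMA factor, approximates a step-length window
private static final float WALKING_SPEED = 1.4f; // typical walking speed, m/s

boolean updateAndCheck(float ax, float ay, long timestampNs) {
    boolean plausibleWalk = false;
    if (lastTimestampNs != 0) {
        float dt = (timestampNs - lastTimestampNs) * 1e-9f;
        vx += ax * dt;
        vy += ay * dt;
        float speed = (float) Math.sqrt(vx * vx + vy * vy);
        avgSpeed = ALPHA * speed + (1f - ALPHA) * avgSpeed;
        // Walking is plausible only while the smoothed speed stays near walking
        // speed; large drift beyond it indicates integration error, not motion.
        plausibleWalk = avgSpeed > 0.5f * WALKING_SPEED && avgSpeed < 2f * WALKING_SPEED;
    }
    lastTimestampNs = timestampNs;
    return plausibleWalk;
}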
I am using the code below to identify the movement of the device, i.e. I would like to know whether the device is moving or not. I also use the Google Activity Recognition API, which provides different activity modes like WALKING, ON_FOOT, STILL, etc. without using GPS. I would like to achieve the same with sensors, but I am not able to get it accurate enough.
The issue with the following code is that as soon as I move the device quickly, e.g. when I pick it up from the table, I get the result "moving", whereas it's not actually moving.
// called from onSensorChanged, using the TYPE_ACCELEROMETER sensor
double speed = getAccelerometer(event.values);

// then check the speed
if (speed > 0.9 && speed < 1.1) {
    // device is not moving
} else {
    // device is moving
}

/**
 * @return the magnitude of the acceleration vector, in units of g
 */
private double getAccelerometer(float[] values) {
    // Movement
    float x = values[0];
    float y = values[1];
    float z = values[2];
    float accelerationSquareRoot =
            (float) ((x * x + y * y + z * z) / (9.80665 * 9.80665));
    return Math.sqrt(accelerationSquareRoot);
}
Can anyone guide me on how to make this logic accurate enough to identify whether the device is moving or not?
The accelerometer returns acceleration data, and according to Newton's 2nd law, if the acceleration is constant then the body is either not moving or moving at constant speed (the latter is practically impossible in your case).
Therefore, if you keep reading the same data on all three axes (or better, values within a fairly strict range) from the accelerometer over time, it means the phone is not moving; otherwise it is.
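A minimal sketch of that check; the tolerance band and the number of consecutive stable samples are assumptions that need tuning per device:

// Report "still" once successive readings stay inside a small band on every axis.
private static final float TOLERANCE = 0.3f;   // m/s^2, assumed sensor noise band
private static final int STABLE_SAMPLES = 20;  // consecutive stable readings required
private final float[] last = new float[3];
private int stableCount = 0;

boolean isStill(float[] values) {
    boolean withinBand = Math.abs(values[0] - last[0]) < TOLERANCE
            && Math.abs(values[1] - last[1]) < TOLERANCE
            && Math.abs(values[2] - last[2]) < TOLERANCE;
    System.arraycopy(values, 0, last, 0, 3);
    stableCount = withinBand ? stableCount + 1 : 0;
    return stableCount >= STABLE_SAMPLES;
}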
For that purpose you should use the Activity Recognition API, which gives you events like moving, stopped, driving, etc. Activity recognition uses sensor data, plus the location service when it is running. For more on what it is and how to use it, see the link below:
https://developers.google.com/location-context/activity-recognition/
I have an Android smartphone with an accelerometer, a compass and a gyroscope. I want to calculate the displacement using these sensors.
I already tried the basic method, i.e.
final velocity = initial velocity + (acceleration * time taken)
distance = time taken * speed
But I am unable to get the correct displacement. Every time I try the same displacement I get different results.
The equation you may be looking for is:
Velocity = (Gravity*Acceleration)/(2*PI*freq)
A correct use of (metric) units for this equation would be:
Gravity = mm/s² = 9806.65
Acceleration = average acceleration over 1 second
Frequency = Hz (of the acceleration waveform over 1 second)
For example, if you gathered data from all 3 axes of the accelerometer, you would do the following to get an acceleration waveform (in raw values) for 3D space:
inputArray[i] = sqrt(X*X + Y*Y + Z*Z);
Once the data is collected, only use the number of samples that would have been collected in one second (if there is a 1 ms delay between values, only use 1000 values).
Add the values together and divide by the number of samples to get your average (you may need to make all values positive first if the accelerometer data contains negative values); you could use the following loop to do this before finding the average:
for (i = 0; i < 1000; i++) {
    if (inputArray[i] < 0) {
        inputArray[i] = inputArray[i] - (inputArray[i] * 2); // flip the sign (absolute value)
    }
}
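The averaging step itself is then just a sum over the same window (1000 samples assumed, matching the loop above):

// Average of the rectified waveform collected above.
double sum = 0;
for (int i = 0; i < 1000; i++) {
    sum += inputArray[i];
}
double averageAcceleration = sum / 1000.0;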
Once you have the acceleration average output you need to perform the equation above.
static double PI = 3.1415926535897932384626433832795;
static double gravity = 9806.65;

double Accel2mms(double accel, double freq) {
    double result = 0;
    result = (gravity * accel) / (2 * PI * freq);
    return result;
}
An example could be that the average acceleration is 3 g over 1 second in a swing:
NOTE: This calculation is based on a sinusoidal waveform, so the frequency is representative of the physical movement of the accelerometer, not the frequency of the sampling rate.
Accel2mms(3, 1);
3 g over 1 second with a frequency of 1 (1 swing in one direction) = 4682.330468 mm/s, or about 4.7 m/s.
Hope this is something like what you're looking for.
Bear in mind this calculation is based on a sinusoidal waveform, but it is being adapted to a single movement (frequency 1), so it may not be very accurate. But in theory it should work.
As @rIHaNJiTHiN mentioned in the comments, there is no reliable way to get displacement from 2nd and 3rd order sensors (sensors that measure derivatives of displacement like velocity and acceleration).
GPS is the only way to measure absolute displacement, though its precision and accuracy are not great over short distances and short time periods (and in certain places with a bad signal).
I am trying to calculate the approximate position of an Android phone in a room. I tried different methods such as location (which is terrible indoors) and gyroscope+compass. I only need to know the approximate position after walking for 5-10 seconds, so I think the integration of linear acceleration could be enough. I know the error is terrible because of error propagation, but maybe it will work in my setup. I only need the approximate position to point a camera at the Android phone.
I coded the double integration, but I am doing something wrong. If the phone is static on a table, the position (x,y,z) keeps increasing anyway. What is the problem?
static final float NS2S = 1.0f / 1000000000.0f;
float[] last_values = null;
float[] velocity = null;
float[] position = null;
float[] acceleration = null;
long last_timestamp = 0;

SensorManager mSensorManager;
Sensor mAccelerometer;

public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() != Sensor.TYPE_LINEAR_ACCELERATION)
        return;

    if (last_values != null) {
        float dt = (event.timestamp - last_timestamp) * NS2S;

        // constant bias offsets subtracted from the raw readings
        acceleration[0] = (float) event.values[0] - (float) 0.0188;
        acceleration[1] = (float) event.values[1] - (float) 0.00217;
        acceleration[2] = (float) event.values[2] + (float) 0.01857;

        for (int index = 0; index < 3; ++index) {
            velocity[index] += (acceleration[index] + last_values[index]) / 2 * dt;
            position[index] += velocity[index] * dt;
        }
    } else {
        last_values = new float[3];
        acceleration = new float[3];
        velocity = new float[3];
        position = new float[3];
        velocity[0] = velocity[1] = velocity[2] = 0f;
        position[0] = position[1] = position[2] = 0f;
    }
    System.arraycopy(acceleration, 0, last_values, 0, 3);
    last_timestamp = event.timestamp;
}
These are the positions I get when the phone is on the table (no motion). The (x,y,z) values keep increasing even though the phone is still.
And these are the positions after calculating the moving average for each axis and subtracting it from each measurement. The phone is also still.
How can I improve the code, or is there another method to get the approximate position inside a room?
There are unavoidable measurement errors in the accelerometer. These are caused by tiny vibrations in the table, imperfections in the manufacturing, etc. Accumulating these errors over time results in a random walk. This is why positioning systems can only use accelerometers as an aid (dead reckoning between fixes) through some filter; they still require an absolute position reference such as GPS (which doesn't work well indoors).
There is a great deal of current research for indoor positioning systems. Some areas of research into systems that can take advantage of existing infrastructure are WiFi and LED lighting positioning. There is no obvious solution yet, but I'm sure we'll need a dedicated solution for accurate, reliable indoor positioning.
You said the position always keeps increasing. Do you mean the x, y, and z components only ever become positive, even after resetting several times? Or do you mean the position keeps drifting from zero?
If you output the raw acceleration measurements when the phone is still you should see the measurement errors. Put a bunch of these measurements in an Excel spreadsheet. Calculate the mean and the standard deviation. The mean should be zero for all axes. If not there is a bias that you can remove in your code with a simple averaging filter (calculate a running average and subtract that from each result). The standard deviation will show you how far you can expect to drift in each axis after N time steps as standard_deviation * sqrt(N). This should help you mathematically determine the expected accuracy as a function of time (or N time steps).
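As a concrete sketch of that bias-removal idea (an incremental running mean per axis; in practice you might freeze the bias after a calibration period instead of updating it forever):

// Estimate a per-axis bias as a running mean and subtract it from each reading.
private final float[] bias = new float[3];
private long sampleCount = 0;

float[] removeBias(float[] accel) {
    sampleCount++;
    float[] corrected = new float[3];
    for (int i = 0; i < 3; i++) {
        // incremental mean: bias converges to the long-term average of the axis
        bias[i] += (accel[i] - bias[i]) / sampleCount;
        corrected[i] = accel[i] - bias[i];
    }
    return corrected;
}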
Brian is right, there are already deployed indoor positioning systems that work with infrastructure that you can easily find in (almost) any room.
One of the solutions that has proven to be most reliable is WiFi fingerprinting. I recommend you take a look at indoo.rs - www.indoo.rs - they are pioneers in the industry and have a pretty developed system already.
This may not be the most elegant or reliable solution, but in my case it serves the purpose.
Note: in my case, I am grabbing a location before the user even enters the activity that needs indoor positioning, and I am only concerned with a rough estimate of how much they have moved around.
I have a sensor manager that creates a rotation matrix based on the device orientation (using Sensor.TYPE_ROTATION_VECTOR). That obviously doesn't give me movement forward, backward, or side to side, only the device orientation. With that orientation I have a good idea of the user's bearing in degrees (which way they are facing), and using the step detector sensor available in KitKat 4.4, I make the assumption that a step is 1 meter in the direction the user is facing.
Again, I know this is not foolproof or very accurate, but depending on your purpose it might be a simple enough solution.
Every time a step is detected I basically call this function:
public void computeNewLocationByStep() {
    Location newLocal = new Location("");

    double vAngle = getBearingInDegrees(); // returns my user's bearing
    double vDistance = 1 / g.kEarthRadiusInMeters; // kEarthRadiusInMeters = 6353000
    vAngle = Math.toRadians(vAngle);

    double vLat1 = Math.toRadians(_location.getLatitude());
    double vLng1 = Math.toRadians(_location.getLongitude());

    double vNewLat = Math.asin(Math.sin(vLat1) * Math.cos(vDistance) +
            Math.cos(vLat1) * Math.sin(vDistance) * Math.cos(vAngle));
    double vNewLng = vLng1 + Math.atan2(Math.sin(vAngle) * Math.sin(vDistance) * Math.cos(vLat1),
            Math.cos(vDistance) - Math.sin(vLat1) * Math.sin(vNewLat));

    newLocal.setLatitude(Math.toDegrees(vNewLat));
    newLocal.setLongitude(Math.toDegrees(vNewLng));

    stepCount = 0;
    _location = newLocal;
}
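For completeness, the per-step trigger would come from the KitKat step detector. The wiring below is my assumption of how the listener is hooked up, not code from the original answer, and it assumes it runs inside an Activity or Service:

// Register the hardware step detector (API 19+) and recompute the location per step.
SensorManager sm = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
Sensor stepDetector = sm.getDefaultSensor(Sensor.TYPE_STEP_DETECTOR);
sm.registerListener(new SensorEventListener() {
    @Override
    public void onSensorChanged(SensorEvent event) {
        // TYPE_STEP_DETECTOR fires one event (value 1.0) per detected step
        computeNewLocationByStep();
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
    }
}, stepDetector, SensorManager.SENSOR_DELAY_NORMAL);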
Using only the phone's (Android) built in accelerometer, how would I go about finding its velocity?
I have been tinkering with the maths of this but whatever function I come up with tends to lead to exponential growth of the velocity. I am working from the assumption that on startup of the App, the phone is at a standstill. This should definitely make finding the velocity (at least roughly) possible.
I have a decent background in physics and math too, so I shouldn't have any difficulty with any concepts here.
How should I do it?
That will really depend on what the acceleration is and for how long. A mild, long acceleration could be measurable, but any sudden increase in acceleration, followed by a constant velocity, will make your measurements quite difficult and prone to error.
Assuming constant acceleration, the formula is extremely simple: a = (V1-V0)/t . So, knowing the time and the acceleration, and assuming V0 = 0, then V1 = a*t
In the real world you probably won't have a constant acceleration, so you should calculate the delta V for each measurement and add up all those changes in velocity to get the final velocity. Always bear in mind that you won't have continuous acceleration data, so this is the most feasible approach (i.e. real data vs. integral calculus theory).
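A minimal sketch of accumulating those delta-V contributions (trapezoidal rule over TYPE_LINEAR_ACCELERATION events, timestamps in nanoseconds as SensorEvent delivers them; the field names are mine):

// Accumulate velocity from linear-acceleration events (gravity already removed).
private final float[] velocity = new float[3];
private final float[] lastAccel = new float[3];
private long lastTimestamp = 0;

void integrate(SensorEvent event) {
    if (lastTimestamp != 0) {
        float dt = (event.timestamp - lastTimestamp) * 1.0e-9f;  // ns -> s
        for (int i = 0; i < 3; i++) {
            // trapezoidal rule: average of previous and current acceleration
            velocity[i] += 0.5f * (lastAccel[i] + event.values[i]) * dt;
        }
    }
    System.arraycopy(event.values, 0, lastAccel, 0, 3);
    lastTimestamp = event.timestamp;
}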
In any case, even in the best scenario, you will end up with a very high error margin, so I do not recommend this approach for any app that truly depends on real velocities.
First, you have to remove the acceleration due to gravity from the accelerometer data. Then it's just a matter of integrating the acceleration to get the velocity. Don't forget that acceleration and velocity are properly vectors, not scalars, and that you will also have to track rotation of the phone in space to properly determine the orientation of the acceleration vector with respect to the calculated velocity vector.
There is nothing else to do but agree with the reasonable arguments put forward in all the great answers above; however, if you are the pragmatic type like me, I need to come up with a solution that works somehow.
I suffered a similar problem to yours and decided to make my own solution after not finding any online. I only needed a simple "tilt" input for controlling a game, so this solution will probably NOT work for more complex needs; however, I decided to share it in case others were looking for something similar.
NOTE: I have pasted my entire code here, and it is free to use for any purpose.
Basically, what I do in my code is look for an accelerometer sensor. If none is found, tilt feedback is disabled. If an accelerometer is present, I look for a magnetic field sensor, and if that is present too, I get my tilt angle the recommended way, by combining accelerometer and magnetic field data.
public TiltSensor(Context c) {
    man = (SensorManager) c.getSystemService(Context.SENSOR_SERVICE);

    mag_sensor = man.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD);
    acc_sensor = man.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);

    has_mag = man.registerListener(this, mag_sensor, delay);
    has_acc = man.registerListener(this, acc_sensor, delay);

    if (has_acc) {
        tiltAvailble = true;
        if (has_mag) {
            Log.d("TiltCalc", "Using accelerometer + compass.");
        } else {
            Log.d("TiltCalc", "Using only accelerometer.");
        }
    } else {
        tiltAvailble = false;
        Log.d("TiltCalc", "No acceptable hardware found, tilt not available.");
        // No use in having listeners registered
        pause();
    }
}
If, however, only the accelerometer is present, I fall back to accumulating the acceleration, which is continuously damped (multiplied by 0.99) to remove any drift. For my simple tilt needs this works great.
@Override
public void onSensorChanged(SensorEvent e) {
    final float[] vals = e.values;
    final int type = e.sensor.getType();
    switch (type) {
        case (Sensor.TYPE_ACCELEROMETER): {
            needsRecalc = true;
            if (!has_mag) {
                System.arraycopy(accelerometer, 0, old_acc, 0, 3);
            }
            System.arraycopy(vals, 0, accelerometer, 0, 3);
            if (!has_mag) {
                for (int i = 0; i < 3; i++) {
                    // Accumulate changes
                    final float sensitivity = 0.08f;
                    dampened_acc[i] += (accelerometer[i] - old_acc[i]) * sensitivity;
                    // Even out drift over time
                    dampened_acc[i] *= 0.99;
                }
            }
        }
        break;
        case (Sensor.TYPE_MAGNETIC_FIELD): {
            needsRecalc = true;
            System.arraycopy(vals, 0, magnetic_field, 0, 3);
        }
        break;
    }
}
In conclusion, I will just repeat that this is probably not "correct" in any way; it simply works as a simple input to a game. To use this code I just do something like the following (yes, magic constants are bad, mkay):
Ship ship = mShipLayer.getShip();
mTiltSensor.getTilt(vals);
float deltaY = -vals[1] * 2;//1 is the index of the axis we are after
float offset = ((deltaY - (deltaY / 1.5f)));
if (null != ship) {
ship.setOffset(offset);
}
Enjoi!
Integrating acceleration to get velocity is an unstable problem and your error will diverge after a couple of seconds or so. Phone accelerometers are also not very accurate, which doesn't help, and some of them don't allow you to distinguish between tilt and translation easily, in which case you're really in trouble.
The accelerometers in a phone are pretty much useless for such a task. You need highly accurate accelerometers with very low drift - something which is way beyond what you will find in a phone. At best you might get useful results for a few seconds, or if very lucky for a minute or two after which the results become meaningless.
Also, you need to have a three axis gyroscope which you would use to integrate the velocity in the right direction. Some phones have gyros, but they are even poorer than the accelerometers as far as drift and accuracy are concerned.
One possibly useful application though would be to use the accelerometers in conjunction with gyros or the magnetic compass to fill in for missing data from the GPS. Each time the GPS gives a good fix one would reset the initial conditions of position, speed and orientation and the accelerometers would provide the data until the next valid GPS fix.
Gravity is going to destroy all of your measurements. The phone, at a standstill, is experiencing a constant upward (yes, UP) acceleration. An accelerometer can't distinguish between acceleration and gravity (technically, they are the same), so the integrated velocity would grow extremely large after a few seconds.
If you never tilt your accelerometer even slightly, then you can simply subtract the constant gravitational pull from the z-axis (or whichever axis is pointing up/down), but that's quite unlikely.
Basically, you have to use a complicated combination of a gyroscope/magnetometer and an accelerometer to calculate the exact direction of gravity and then subtract it from the measured acceleration.
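On Android, the fused sensors already do a lot of that work; a minimal sketch using TYPE_GRAVITY to remove gravity from the raw accelerometer readings (you could equally just read TYPE_LINEAR_ACCELERATION, which reports this difference directly):

// Subtract the fused gravity estimate from the raw accelerometer readings.
private final float[] gravity = new float[3];
private final float[] linear = new float[3];

@Override
public void onSensorChanged(SensorEvent event) {
    switch (event.sensor.getType()) {
        case Sensor.TYPE_GRAVITY:
            System.arraycopy(event.values, 0, gravity, 0, 3);
            break;
        case Sensor.TYPE_ACCELEROMETER:
            for (int i = 0; i < 3; i++) {
                linear[i] = event.values[i] - gravity[i];   // gravity-free acceleration
            }
            break;
    }
}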
v = Integral(a) ?
Generally though, I'd think the inaccuracies in the accelerometers would make this quite tough
If the phone is at a standstill, you have ZERO acceleration, so your speed is 0. You should probably get location data from GPS together with the associated time samples and compute velocity as distance over time.
I want to include a fairly simple fall-detection algorithm in my application. At the moment, in onSensorChanged(), I am taking the absolute value of the current x, y, z values and subtracting SensorManager.GRAVITY_EARTH (9.8 m/s²) from it. The resulting value has to be bigger than a threshold value 10 times in a row to set a flag saying a fall has been detected by the accelerometer; the threshold value is about 8 m/s².
I'm also comparing the orientation of the phone as soon as the threshold is passed with its orientation once the threshold is no longer being passed; this sets another flag saying the orientation sensor has detected a fall.
When both flags are set, an event occurs to check whether the user is OK, etc. My problem is with the threshold: when the phone is held straight up, the absolute value from the accelerometer is about 9.8 m/s², but when I hold it still at an angle it can be over 15 m/s². This causes other events to trigger the fall detection, and if I increase the threshold to avoid that, it won't detect falls.
Can anyone give me some advice on what values I should use or how to improve my method? Many thanks.
First, I want to remind you that you cannot just add the x, y, z values together as they are; you have to use vector mathematics. This is why you get values of over 15 m/s². As long as the phone is not moving, the vector length should always be about 9.8 m/s². You calculate it using SQRT(x*x + y*y + z*z). If you need more information, you can read about vector mathematics; maybe http://en.wikipedia.org/wiki/Euclidean_vector#Length is a good start.
I also suggest another algorithm: in free fall, all three of the x, y, z values of the accelerometer should be near zero. (At least, that's what I learned in physics classes a long time ago in school.) So maybe you can use a rule like: if the vector length of x, y, z is <= 3 m/s², you detect a free fall; and if the vector length then rises to a value over 20 m/s², you detect the landing.
Those thresholds are just a wild guess. Maybe you could record the x, y, z values in a test application, then move the phone around, and analyze offline how the values (and their vector length) behave to get a feeling for which thresholds are sensible.
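A sketch of that two-threshold idea on the vector length; the thresholds are the wild guesses mentioned above, and onFallDetected() is a hypothetical callback:

// Free-fall followed by impact, detected on the acceleration magnitude.
private static final float FREE_FALL_THRESHOLD = 3f;   // m/s^2, assumed
private static final float IMPACT_THRESHOLD = 20f;     // m/s^2, assumed
private boolean inFreeFall = false;

void onAccelerationSample(float x, float y, float z) {
    float magnitude = (float) Math.sqrt(x * x + y * y + z * z);
    if (magnitude < FREE_FALL_THRESHOLD) {
        inFreeFall = true;                 // phone is (nearly) weightless
    } else if (inFreeFall && magnitude > IMPACT_THRESHOLD) {
        inFreeFall = false;
        onFallDetected();                  // hypothetical callback: possible fall
    }
}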
I have actually published a paper on this issue. Please feel free to check out "iFall" at ww2.cs.fsu.edu/~sposaro
We basically take the root sum of squares and look for three things:
1. Lower threshold broken, i.e. falling.
2. Upper threshold broken, i.e. hitting the ground.
3. Flatline around 1 g, i.e. a "long lie": lying on the ground for an extended period of time.
I forgot to update this thread, but iFall is now available on the Android Market.
Also check out ww2.cs.fsu.edu/~sposaro/iFall for more information
It's possible using the accelerometer sensor.
Write this in the sensor-changed listener:
// Note: assumes sensor holds the sensor type (e.g. event.sensor.getType()) and
// values holds the event values (event.values); last_x/y/z, lastUpdate and result
// are fields of the listener.
if (sensor == Sensor.TYPE_ACCELEROMETER) {
    long curTime = System.currentTimeMillis();
    // only allow one update every 100 ms
    if ((curTime - lastUpdate) > 100) {
        long diffTime = (curTime - lastUpdate);
        lastUpdate = curTime;

        x = values[SensorManager.DATA_X];
        y = values[SensorManager.DATA_Y];
        z = values[SensorManager.DATA_Z];

        float speed = Math.abs(x + y + z - last_x - last_y - last_z) / diffTime * 10000;
        Log.d("getShakeDetection", "speed: " + speed);
        if (speed > DashplexManager.getInstance().SHAKE_THRESHOLD) {
            result = true;
        }

        last_x = x;
        last_y = y;
        last_z = z;
    }
}
}