Is there a way to get the velocity without GPS in Android? - android

Is there a way to get the velocity without GPS in Android? I don't need the accurate values.

Well, sort of, but you will need to do a lot of processing.
You can make frequent accelerometer readings and integrate the values once to get velocity. This won't give you an accurate starting velocity, but after a while it will probably work (unless you start while the phone is already moving, e.g. driving along in a car). See also this post.
Now, some pseudocode:
We start at t = 0 and sample the acceleration on all three axes at a fixed interval dt:
a = get_acceleration()
vx = vx + (a.x - gravity.x) * dt;
vy = vy + (a.y - gravity.y) * dt;
vz = vz + (a.z - gravity.z) * dt;
After doing this for a few seconds at a high sample rate (e.g. 50 Hz), the accumulated sum approximates the velocity. You will also need to work out which way up your device is, and therefore how much of each acceleration component is due to gravity, and compensate for it.
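If you would rather let Android remove gravity for you, here is a minimal sketch of the same integration using Sensor.TYPE_LINEAR_ACCELERATION; the class and field names are illustrative, not from the original answer, and drift will still accumulate quickly:

import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;

// Sketch only: assumes the listener is registered for TYPE_LINEAR_ACCELERATION elsewhere.
public class VelocityEstimator implements SensorEventListener {
    private float vx, vy, vz;         // integrated velocity in m/s
    private long lastTimestampNs = 0; // timestamp of the previous sample

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() != Sensor.TYPE_LINEAR_ACCELERATION) return;
        if (lastTimestampNs != 0) {
            float dt = (event.timestamp - lastTimestampNs) * 1e-9f; // ns -> s
            vx += event.values[0] * dt;
            vy += event.values[1] * dt;
            vz += event.values[2] * dt;
        }
        lastTimestampNs = event.timestamp;
    }

    public float speed() {
        return (float) Math.sqrt(vx * vx + vy * vy + vz * vz);
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
        // not needed for this sketch
    }
}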

Related

Velocity from Accelerometer

I came to the conclusion that it is impossible to determine the velocity from the accelerometer in an Android device. Assuming my initial velocity to be zero, the integration scheme employed to get the velocity would be
v_ib_b = old_v_ib_b + f_ib_b * dt,
where v_ib_b is the velocity in the inertial body frame resolved along the body axes and f_ib_b is the force measured by the accelerometer.
If I want to compute the velocity of the smartphone in the local navigation frame ENU the following formula can be used (simplified Bortz equation neglecting coriolis/sculling and transport rate):
v_eb_n = old_v_eb_n + (C_b_n * (f_ib_b + 0.5 * cross(omega_ib_b, f_ib_b)) - g) * dt
where C_b_n is the rotation matrix body to ENU.
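In code, that update is just the formula transcribed. Here is a Java sketch; the small vector helpers are illustrative, and g is assumed to be the gravity vector in the ENU frame, roughly {0, 0, 9.81}:

// Direct transcription of the velocity update above (a sketch, not a full INS).
static double[] updateVelocity(double[] vOld, double[][] C_b_n,
                               double[] f_ib_b, double[] omega_ib_b,
                               double[] g, double dt) {
    double[] corrected = add(f_ib_b, scale(cross(omega_ib_b, f_ib_b), 0.5));
    double[] accelEnu = subtract(matVec(C_b_n, corrected), g);
    return add(vOld, scale(accelEnu, dt));
}

static double[] add(double[] a, double[] b) { return new double[]{a[0]+b[0], a[1]+b[1], a[2]+b[2]}; }
static double[] subtract(double[] a, double[] b) { return new double[]{a[0]-b[0], a[1]-b[1], a[2]-b[2]}; }
static double[] scale(double[] a, double s) { return new double[]{a[0]*s, a[1]*s, a[2]*s}; }

static double[] cross(double[] a, double[] b) {
    return new double[]{
        a[1]*b[2] - a[2]*b[1],
        a[2]*b[0] - a[0]*b[2],
        a[0]*b[1] - a[1]*b[0]
    };
}

static double[] matVec(double[][] m, double[] v) {
    return new double[]{
        m[0][0]*v[0] + m[0][1]*v[1] + m[0][2]*v[2],
        m[1][0]*v[0] + m[1][1]*v[1] + m[1][2]*v[2],
        m[2][0]*v[0] + m[2][1]*v[1] + m[2][2]*v[2]
    };
}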
I attempted to use Sensor.TYPE_ROTATION_VECTOR, MadgwickAHRS and my own sensor fusion, respectively, to obtain the orientation and hence the rotation matrix into the ENU frame. No matter which one was used, this is the output of integrating the velocity over a few minutes while riding a bike (the smartphone was mounted to the handlebar facing the sky):
and this is the raw data integrated:
I assume the accelerometer performs poorly: it is noisy, and the errors accumulate over time, so the velocity cannot be recovered. Any ideas?
The relatively small changes in acceleration will be mixed in with the noise. Is something blocking the use of the GPS data? Another direction would be combining the gyroscope data with the accelerometer data for better noise reduction, creating a full IMU.

Measuring how much the phone has moved linearly as well as angularly?

I am trying to make an app that has a reset button and displays how much the phone has moved linearly as well as angularly. It has been difficult because I don't know much about the accelerometer and gyroscope, or even Android programming, so I have been thinking about first using the data I recorded with the app called Sensor Kinetic to calculate how much the phone has moved linearly and angularly.
The two images below are the sample data (time, X, Y, Z). I believe it can be done with these data, but I haven't made progress in the past few hours of researching. I know they are not synced in terms of time, but I only need the starting point and the last point for the measurement.
gyroscope_sample_data (rad/s):
accelerometer_sample_data (m/s^2):
I have been trying to look up how to measure the distance angularly and linearly, and I found a lot of discussion about it, but nothing that seems to have done it this way or measured both linear and angular distance.
Edit: I wrote this to calculate the distance the phone moves linearly from the accelerometer data, but I got around 103 meters for only rotating the phone around me. I am also not sure about the gyroscope for angular distance either, so let me know what you all think.
import math

def acc_calculation(dataset):
    dx, dy, dz = 0.0, 0.0, 0.0
    vx, vy, vz = 0.0, 0.0, 0.0
    acceleration_x, acceleration_y, acceleration_z = dataset['X_value'], dataset['Y_value'], dataset['Z_value']
    for i in range(1, len(dataset['time'])):
        dt = float(dataset['time'][i]) - float(dataset['time'][i-1])
        print(dt)  # debug: sample interval
        # trapezoidal integration: acceleration -> velocity
        vx += (float(acceleration_x[i-1]) + float(acceleration_x[i])) / 2.0 * dt
        vy += (float(acceleration_y[i-1]) + float(acceleration_y[i])) / 2.0 * dt
        vz += (float(acceleration_z[i-1]) + float(acceleration_z[i])) / 2.0 * dt
        # integration: velocity -> displacement
        dx += vx * dt
        dy += vy * dt
        dz += vz * dt
    # straight-line distance between the first and last points
    dl = math.sqrt(dx**2 + dy**2 + dz**2)
    return dl
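For the angular part mentioned above, a rough sketch of the analogous gyroscope integration (written in Java, like most of the other code on this page) could look like the following. It integrates the magnitude of the angular rate to get the total angle swept, in radians; the array arguments are assumed to hold the exported samples:

// Sketch: total angular distance from gyroscope samples (rad/s) via trapezoidal integration.
static double gyroAngularDistance(double[] time, double[] gx, double[] gy, double[] gz) {
    double totalAngle = 0.0;
    for (int i = 1; i < time.length; i++) {
        double dt = time[i] - time[i - 1];
        // magnitude of the angular-rate vector at the previous and current sample
        double w0 = Math.sqrt(gx[i - 1] * gx[i - 1] + gy[i - 1] * gy[i - 1] + gz[i - 1] * gz[i - 1]);
        double w1 = Math.sqrt(gx[i] * gx[i] + gy[i] * gy[i] + gz[i] * gz[i]);
        totalAngle += 0.5 * (w0 + w1) * dt;
    }
    return totalAngle; // radians swept in total
}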
I am trying to post more related links, but Stack Overflow doesn't let me. If anyone has any leads, especially detailed information from apps or work that has already done this, I would really appreciate it.
Accelerometer & Gyro Tutorial:
http://www.instructables.com/id/Accelerometer-Gyro-Tutorial/step3/Combining-the-Accelerometer-and-Gyro/#intro
How can I find distance traveled with a gyroscope and accelerometer?

How to find the displacement distance using the accelerometer sensor in an Android smartphone?

I have an Android smartphone containing an accelerometer, a compass sensor and a gyroscope. I want to calculate the displacement distance using these sensors.
I already tried the basic method, i.e.,
final velocity = initial velocity + (acceleration * time taken)
distance = time taken * speed
But I am unable to get the correct displacement. Every time I try the same displacement I get different results.
The equation you may be looking for is:
Velocity = (Gravity * Acceleration) / (2 * PI * frequency)
A correct use of units for this equation (metric) would be:
Gravity = 9806.65 mm/s²
Acceleration = average acceleration over 1 second
Frequency = Hz (of the acceleration waveform over 1 second)
For example, if you gathered data from all 3 axes of the accelerometer, you would do the following to get an acceleration waveform (in raw values) for 3D space:
inputArray[i] = sqrt(X*X + Y*Y + Z*Z);
Once the data is collected, only use the number of samples that would have been collected in one second (if there is a 1 ms delay between values, use 1000 values).
Add the values together and divide by the number of samples to get your average (you may need to make all values positive first, if the accelerometer data contains negative values); you could use the following loop to do this before finding the average.
for (i = 0; i < 1000; i++) {
    if (inputArray[i] < 0) {
        inputArray[i] = -inputArray[i];
    }
}
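Averaging the rectified samples, as described above, might then look like this (a sketch; the array and variable names are assumptions):

double sum = 0;
for (int i = 0; i < inputArray.length; i++) {
    sum += inputArray[i];
}
// average acceleration over the one-second window, fed into Accel2mms() below
double averageAccel = sum / inputArray.length;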
Once you have the acceleration average output you need to perform the equation above.
static double PI = 3.1415926535897932384626433832795;
static double gravity = 9806.65;

double Accel2mms(double accel, double freq) {
    double result = (gravity * accel) / (2 * PI * freq);
    return result;
}
An example could be that the average acceleration is 3 gs over 1 second in a swing:
NOTE: This calculation is based on a sinusoidal waveform, so the frequency is representative of the physical movement of the accelerometer, not of the sampling rate.
Accel2mms(3, 1);
3 g over 1 second with a frequency of 1 (1 swing in one direction) = 4682.330468 mm/s, or about 4.7 m/s.
Hope this is something like what you're looking for.
Bear in mind this calculation is based on a sinusoidal waveform but is being adapted to a single movement (frequency 1), so it may not be very accurate. But in theory it should work.
As @rIHaNJiTHiN mentioned in the comments, there is no reliable way to get displacement from 2nd- and 3rd-order sensors (sensors that measure derivatives of displacement, like velocity and acceleration).
GPS is the only way to measure absolute displacement, though its precision and accuracy are not very high over short distances and short time periods (and in certain places with a bad signal).

Android - How to approach fall detection algorithm

I want to be able to include a fairly simple fall detection algorithm in my application. At the moment, in onSensorChanged(), I am getting the absolute value of the current x, y, z readings and subtracting SensorManager.GRAVITY_EARTH (9.8 m/s²) from it. The resulting value has to be bigger than a threshold value 10 times in a row to set a flag saying a fall has been detected by the accelerometer; the threshold value is about 8 m/s².
Also, I'm comparing the orientation of the phone as soon as the threshold has been passed with its orientation once the threshold is no longer being passed; this sets another flag saying the orientation sensor has detected a fall.
When both flags are set, an event occurs to check whether the user is OK, etc. My problem is with the threshold: when the phone is held straight up, the absolute value from the accelerometer is about 9.8 m/s², but when I hold it still at an angle it can be over 15 m/s². This causes other events to trigger the fall detection, and if I increase the threshold to avoid that, it won't detect falls.
Can anyone give me some advice on what values I should use, or on how to improve my method? Many thanks.
First, I want to remind you that you cannot just add the x, y, z values together as they are; you have to use vector mathematics. This is why you get values of over 15 m/s². As long as the phone is not moving, the vector sum should always be about 9.8 m/s². You calculate it using SQRT(x*x + y*y + z*z). If you need more information, you can read about vector mathematics; maybe http://en.wikipedia.org/wiki/Euclidean_vector#Length is a good start.
I also suggest another algorithm: in free fall, all three of the x, y, z values of the accelerometer should be near zero. (At least, that's what I learned in physics classes a long time ago in school.) So maybe you can use a rule like: if the vector sum of x, y, z is <= 3 m/s², you detect a free fall; and if the vector sum then rises to a value over 20 m/s², you detect the landing.
Those thresholds are just a wild guess. Maybe you should just record the x, y, z values in a test application, move the phone around, and then analyze offline how the values (and their vector sum) behave, to get a feeling for which thresholds are sensible.
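A rough sketch of that free-fall-then-impact idea, using the vector magnitude; the thresholds are the wild guesses from above, and onFallDetected() is a hypothetical callback:

// Inside a SensorEventListener registered for the accelerometer:
private static final float FREE_FALL_THRESHOLD = 3.0f;  // m/s^2, wild guess from above
private static final float IMPACT_THRESHOLD = 20.0f;    // m/s^2, wild guess from above
private boolean inFreeFall = false;

@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() != Sensor.TYPE_ACCELEROMETER) return;
    float x = event.values[0], y = event.values[1], z = event.values[2];
    float magnitude = (float) Math.sqrt(x * x + y * y + z * z);
    if (magnitude < FREE_FALL_THRESHOLD) {
        inFreeFall = true;                 // near zero-g: candidate free fall
    } else if (inFreeFall && magnitude > IMPACT_THRESHOLD) {
        inFreeFall = false;
        onFallDetected();                  // hypothetical callback for the landing
    }
}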
I have actually published a paper on this issue. Please feel free to check out "iFall" at ww2.cs.fsu.edu/~sposaro
We basically take the root sum of squares and look for 3 things:
1. Lower threshold broken, i.e. falling.
2. Upper threshold broken, i.e. hitting the ground.
3. Flatline around 1 g, i.e. a "long lie": lying on the ground for an extended period of time.
I forgot to update this thread, but iFall is now available on the Android Market.
Also check out ww2.cs.fsu.edu/~sposaro/iFall for more information.
It's possible using the accelerometer sensor.
Write this in the sensor changed listener:
if (sensor == Sensor.TYPE_ACCELEROMETER) {
    long curTime = System.currentTimeMillis();
    // only allow one update every 100 ms
    if ((curTime - lastUpdate) > 100) {
        long diffTime = (curTime - lastUpdate);
        lastUpdate = curTime;

        x = values[SensorManager.DATA_X];
        y = values[SensorManager.DATA_Y];
        z = values[SensorManager.DATA_Z];

        float speed = Math.abs(x + y + z - last_x - last_y - last_z) / diffTime * 10000;
        Log.d("getShakeDetection", "speed: " + speed);
        if (speed > DashplexManager.getInstance().SHAKE_THRESHOLD) {
            result = true;
        }

        last_x = x;
        last_y = y;
        last_z = z;
    }
}

How to detect walking with Android accelerometer

I'm writing an application and my aim is to detect when a user is walking.
I'm using a Kalman filter like this:
float kFilteringFactor = 0.6f;

// low-pass filter to isolate the gravity component
gravity[0] = (accelerometer_values[0] * kFilteringFactor) + (gravity[0] * (1.0f - kFilteringFactor));
gravity[1] = (accelerometer_values[1] * kFilteringFactor) + (gravity[1] * (1.0f - kFilteringFactor));
gravity[2] = (accelerometer_values[2] * kFilteringFactor) + (gravity[2] * (1.0f - kFilteringFactor));

// subtract gravity to get linear acceleration
linear_acceleration[0] = (accelerometer_values[0] - gravity[0]);
linear_acceleration[1] = (accelerometer_values[1] - gravity[1]);
linear_acceleration[2] = (accelerometer_values[2] - gravity[2]);

float magnitude = (float) Math.sqrt(linear_acceleration[0] * linear_acceleration[0]
        + linear_acceleration[1] * linear_acceleration[1]
        + linear_acceleration[2] * linear_acceleration[2]);
magnitude = Math.abs(magnitude);

if (magnitude > 0.2)
    // walking
The gravity[] array is initialized with zeros.
I can detect whether a user is walking or not (by looking at the magnitude of the acceleration vector), but my problem is that when a user is not walking and just moves the phone, it looks as if they are walking.
Am I using the right filter?
Is it right to watch only the magnitude of the vector, or do I have to look at the individual values?
Google provides an API for this called DetectedActivity that can be obtained using the ActivityRecognitionApi. Those docs can be accessed here and here.
DetectedActivity has the method public int getType() to get the current activity of the user and also public int getConfidence() which returns a value from 0 to 100. The higher the value returned by getConfidence(), the more certain the API is that the user is performing the returned activity.
Here is a summary of the constants returned by getType():
int IN_VEHICLE The device is in a vehicle, such as a car.
int ON_BICYCLE The device is on a bicycle.
int ON_FOOT The device is on a user who is walking or running.
int RUNNING The device is on a user who is running.
int STILL The device is still (not moving).
int TILTING The device angle relative to gravity changed significantly.
int UNKNOWN Unable to detect the current activity.
int WALKING The device is on a user who is walking.
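A minimal sketch of consuming those results, assuming activity updates have already been requested with ActivityRecognition.getClient(context).requestActivityUpdates(...) and a PendingIntent pointed at this receiver; the class name and the confidence cutoff are illustrative:

import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.util.Log;

import com.google.android.gms.location.ActivityRecognitionResult;
import com.google.android.gms.location.DetectedActivity;

public class ActivityUpdateReceiver extends BroadcastReceiver {
    @Override
    public void onReceive(Context context, Intent intent) {
        if (!ActivityRecognitionResult.hasResult(intent)) return;
        ActivityRecognitionResult result = ActivityRecognitionResult.extractResult(intent);
        DetectedActivity mostProbable = result.getMostProbableActivity();
        // Treat the user as walking only when the API is reasonably confident.
        if (mostProbable.getType() == DetectedActivity.WALKING
                && mostProbable.getConfidence() > 75) {
            Log.d("ActivityUpdateReceiver", "User appears to be walking");
        }
    }
}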
My first intuition would be to run an FFT analysis on the sensor history, and see what frequencies have high magnitudes when walking.
It's essentially seeing what walking "sounds like": treating the accelerometer inputs like a microphone and seeing which frequencies are loud when walking (in other words, at what frequency the biggest acceleration is happening).
I'd guess you'd be looking for a high magnitude at some low frequency (like footstep rate) or maybe something else. It would be interesting to see the data.
My guess is you run the FFT and look for the magnitude at some frequency to be greater than some threshold, or the difference between magnitudes of two of the frequencies is more than some amount. Again, the actual data would determine how you attempt to detect it.
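As a simpler stand-in for a full FFT, you could check the magnitude of a single DFT bin near a plausible step frequency (around 2 Hz). The sketch below assumes a buffer of accelerometer magnitudes and a sample rate you supply; the threshold is something you would tune against real data:

// Magnitude of one DFT bin of the accelerometer-magnitude buffer (naive single-bin DFT,
// not an FFT), normalized by the buffer length.
static double binMagnitude(double[] samples, double sampleRateHz, double targetHz) {
    double re = 0.0, im = 0.0;
    for (int n = 0; n < samples.length; n++) {
        double angle = 2.0 * Math.PI * targetHz * n / sampleRateHz;
        re += samples[n] * Math.cos(angle);
        im -= samples[n] * Math.sin(angle);
    }
    return Math.sqrt(re * re + im * im) / samples.length;
}

// Usage sketch: walkingLikely = binMagnitude(buffer, 50.0, 2.0) > SOME_THRESHOLD;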
For walking detection I use the derivative applied to the smoothed signal from the accelerometer. When the derivative is greater than a threshold value, I can assume it was a step. But I guess that it's not best practice; furthermore, it only works when the phone is placed in a pants pocket.
The following code was used in this app https://play.google.com/store/apps/details?id=com.tartakynov.robotnoise
@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() != Sensor.TYPE_ACCELEROMETER) {
        return;
    }
    final float z = smooth(event.values[2]); // scalar kalman filter
    if (Math.abs(z - mLastZ) > LEG_THRSHOLD_AMPLITUDE) {
        mInactivityCount = 0;
        int currentActivity = (z > mLastZ) ? LEG_MOVEMENT_FORWARD : LEG_MOVEMENT_BACKWARD;
        if (currentActivity != mLastActivity) {
            mLastActivity = currentActivity;
            notifyListeners(currentActivity);
        }
    } else {
        if (mInactivityCount > LEG_THRSHOLD_INACTIVITY) {
            if (mLastActivity != LEG_MOVEMENT_NONE) {
                mLastActivity = LEG_MOVEMENT_NONE;
                notifyListeners(LEG_MOVEMENT_NONE);
            }
        } else {
            mInactivityCount++;
        }
    }
    mLastZ = z;
}
EDIT: I don't think it's accurate enough, since when walking normally the average acceleration is near 0. The most you could do by measuring acceleration is detect when someone starts or stops walking (but, as you said, it's difficult to separate that from the device being moved by someone standing in one place).
So what I wrote earlier probably wouldn't work anyway:
You can "predict" whether the user is moving by discarding the cases where the user is clearly not moving. The first two options that come to mind are:
Check whether the phone is "hidden", using the proximity sensor and (optionally) the light sensor. This method is less accurate but easier.
Check the continuity of the movement: if the phone has been moving for more than, say, 10 seconds and the movement is not negligible, then consider that the user is walking. I know it's not perfect either, but it's difficult without using any kind of positioning; by the way, why don't you just use the LocationManager?
Try detecting the up-and-down oscillations and the fore-and-aft oscillations, along with the frequency of each, and make sure they stay aligned within bounds on average. That way you detect walking, and specifically that person's gait style, which should remain relatively constant for several steps at a time to qualify as moving.
As long as the last 3 oscillations line up within reason, conclude that walking is occurring, provided this also holds:
Measure horizontal acceleration and update a velocity value with it. The velocity will drift over time, so keep a moving average of the velocity smoothed over the duration of a step; as long as it doesn't drift by more than, say, half of walking speed per 3 oscillations, it's walking, but only if it initially rose to walking speed within a short time, i.e. half a second or perhaps 2 oscillations.
All of that should just about cover it.
Of course, a little AI would help make things simpler, or just as complex but amazingly accurate, if you fed all of these as inputs to a neural network, i.e. as preprocessing.
