Approximate indoor positioning using integration of linear acceleration - Android

I am trying to calculate the approximate position of an Android phone in a room. I tried different methods such as location (which is terrible indoors) and gyroscope+compass. I only need to know the approximate position after walking for 5-10 seconds, so I think double integration of the linear acceleration could be enough. I know the error is terrible because it propagates, but maybe it will work in my setup. I only need the approximate position to point a camera at the Android phone.
I coded the double integration but I am doing something wrong. If the phone is static on a table, the position (x, y, z) keeps increasing. What is the problem?
static final float NS2S = 1.0f / 1000000000.0f; // nanoseconds to seconds
float[] last_values = null;
float[] velocity = null;
float[] position = null;
float[] acceleration = null;
long last_timestamp = 0;
SensorManager mSensorManager;
Sensor mAccelerometer;

public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() != Sensor.TYPE_LINEAR_ACCELERATION)
        return;
    if (last_values != null) {
        float dt = (event.timestamp - last_timestamp) * NS2S;
        // subtract manually measured per-axis offsets
        acceleration[0] = event.values[0] - 0.0188f;
        acceleration[1] = event.values[1] - 0.00217f;
        acceleration[2] = event.values[2] + 0.01857f;
        for (int index = 0; index < 3; ++index) {
            // trapezoidal integration: acceleration -> velocity -> position
            velocity[index] += (acceleration[index] + last_values[index]) / 2 * dt;
            position[index] += velocity[index] * dt;
        }
    } else {
        last_values = new float[3];
        acceleration = new float[3];
        velocity = new float[3];
        position = new float[3];
        velocity[0] = velocity[1] = velocity[2] = 0f;
        position[0] = position[1] = position[2] = 0f;
    }
    System.arraycopy(acceleration, 0, last_values, 0, 3);
    last_timestamp = event.timestamp;
}
These are the positions I get when the phone is lying on the table (no motion). The (x, y, z) values keep increasing even though the phone is still.
And these are the positions after calculating a moving average for each axis and subtracting it from each measurement. The phone is also still.
How can I improve the code, or is there another method to get the approximate position inside a room?

There are unavoidable measurement errors in the accelerometer. These are caused by tiny vibrations in the table, imperfections in manufacturing, etc. Accumulating these errors over time results in a random walk. This is why positioning systems can only use accelerometers as a positioning aid through some filter; the dead reckoning they provide still needs to be corrected by an absolute reference such as GPS (which doesn't work well indoors).
There is a great deal of current research into indoor positioning systems. Some areas of research into systems that can take advantage of existing infrastructure are WiFi and LED-lighting positioning. There is no obvious winner yet, but accurate, reliable indoor positioning will almost certainly need a dedicated solution.
You said the position always keeps increasing. Do you mean the x, y, and z components only ever become positive, even after resetting several times? Or do you mean the position keeps drifting from zero?
If you output the raw acceleration measurements while the phone is still, you should see the measurement errors. Put a bunch of these measurements in an Excel spreadsheet and calculate the mean and the standard deviation. The mean should be zero for all axes; if not, there is a bias that you can remove in your code with a simple averaging filter (calculate a running average and subtract it from each result). The standard deviation tells you roughly how far you can expect to drift in each axis after N time steps, on the order of standard_deviation * sqrt(N). This should help you mathematically estimate the expected accuracy as a function of time (or N time steps).
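A minimal sketch of that running-average bias removal, assuming the fields live in the same sensor listener class as above (the window size is illustrative):

private static final int WINDOW = 200;                     // roughly a few seconds of samples
private final float[][] samples = new float[3][WINDOW];
private final float[] bias = new float[3];
private int sampleCount = 0;

private void updateBias(float[] values) {
    int slot = sampleCount % WINDOW;
    for (int axis = 0; axis < 3; axis++) {
        samples[axis][slot] = values[axis];
    }
    sampleCount++;
    int n = Math.min(sampleCount, WINDOW);
    for (int axis = 0; axis < 3; axis++) {
        float sum = 0f;
        for (int i = 0; i < n; i++) sum += samples[axis][i];
        bias[axis] = sum / n;                               // running mean = estimated bias
    }
}

// In onSensorChanged(), use the bias-corrected values instead of the raw ones:
// acceleration[i] = event.values[i] - bias[i];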

Brian is right: there are already deployed indoor positioning systems that work with infrastructure you can easily find in (almost) any room.
One of the solutions that has proven most reliable is WiFi fingerprinting. I recommend you take a look at indoo.rs - www.indoo.rs - they are pioneers in the industry and already have a fairly well-developed system.

This may not be the most elegant or reliable solution, but in my case it serves the purpose.
Note: in my case I am grabbing a location before the user can even enter the activity that needs indoor positioning, and I am only concerned with a rough estimate of how much they have moved around.
I have a sensor manager that creates a rotation matrix based on the device orientation (using Sensor.TYPE_ROTATION_VECTOR). That obviously doesn't give me movement forward, backward, or side to side, only the device orientation. With that orientation I have a good idea of the user's bearing in degrees (which way they are facing), and using the step detector sensor (Sensor.TYPE_STEP_DETECTOR) available since KitKat 4.4, I assume that a step is 1 meter in the direction the user is facing.
Again, I know this is not foolproof or very accurate, but depending on your purpose this too might be a simple solution.
Every time a step is detected I basically call this function:
public void computeNewLocationByStep() {
    Location newLocal = new Location("");
    double vAngle = getBearingInDegrees();               // returns my user's bearing
    double vDistance = 1 / g.kEarthRadiusInMeters;       // kEarthRadiusInMeters = 6353000
    vAngle = Math.toRadians(vAngle);
    double vLat1 = Math.toRadians(_location.getLatitude());
    double vLng1 = Math.toRadians(_location.getLongitude());
    // move 1 m along the current bearing on the Earth's surface (great-circle formula)
    double vNewLat = Math.asin(Math.sin(vLat1) * Math.cos(vDistance) +
            Math.cos(vLat1) * Math.sin(vDistance) * Math.cos(vAngle));
    double vNewLng = vLng1 + Math.atan2(Math.sin(vAngle) * Math.sin(vDistance) * Math.cos(vLat1),
            Math.cos(vDistance) - Math.sin(vLat1) * Math.sin(vNewLat));
    newLocal.setLatitude(Math.toDegrees(vNewLat));
    newLocal.setLongitude(Math.toDegrees(vNewLng));
    stepCount = 0;
    _location = newLocal;
}
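For completeness, a hedged sketch of how this function could be driven by the hardware step detector; the registration calls are standard Android API, but the wiring shown here is only illustrative:

// Illustrative wiring: call computeNewLocationByStep() once per detected step.
SensorManager sm = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
Sensor stepDetector = sm.getDefaultSensor(Sensor.TYPE_STEP_DETECTOR);
sm.registerListener(new SensorEventListener() {
    @Override
    public void onSensorChanged(SensorEvent event) {
        // TYPE_STEP_DETECTOR delivers one event (value 1.0) per step taken
        computeNewLocationByStep();
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}, stepDetector, SensorManager.SENSOR_DELAY_NORMAL);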

Related

How to detect device is moving or not using sensor android

I am using the code below to identify whether the device is moving. I also use the Google Activity APIs, which provide different activity modes like WALKING, ON_FOOT, STILL, etc. without using GPS. I would like to achieve the same with sensors, but I am not able to get it accurate.
The issue with the following code is that as soon as I move the device briefly (for example, pick it up from the table), I get the result "moving", even though the device is not actually being carried anywhere.
// called from onSensorChanged using the TYPE_ACCELEROMETER sensor
double speed = getAccelerometer(event.values);
// then checking the "speed" (really the magnitude in units of g)
if (speed > 0.9 && speed < 1.1) {
    // device is not moving
} else {
    // device is moving
}

/**
 * @return the acceleration magnitude expressed in multiples of g
 */
private double getAccelerometer(float[] values) {
    float x = values[0];
    float y = values[1];
    float z = values[2];
    float accelerationSquared =
            (float) ((x * x + y * y + z * z) / (9.80665 * 9.80665));
    return Math.sqrt(accelerationSquared);
}
Can anyone guide me on how to make this logic accurate enough to identify whether the device is moving or not?
The accelerometer returns acceleration data, and by Newton's laws a body whose net acceleration is zero is either not moving or moving at constant speed (the latter is quite unlikely in your case).
Therefore, if you keep reading (almost) the same data on all three axes over time - or, better, values within a fairly strict range - the phone is not moving; otherwise it is.
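A sketch of that "readings stay within a strict range" idea, keeping a short window of magnitudes and declaring the device stationary when the spread is small; the window length and tolerance are illustrative and need tuning per device:

private static final int WINDOW = 50;              // roughly 1 second at SENSOR_DELAY_GAME
private static final float TOLERANCE = 0.3f;       // m/s^2
private final float[] magnitudes = new float[WINDOW];
private int index = 0, filled = 0;

private boolean isStationary(float[] values) {
    float magnitude = (float) Math.sqrt(
            values[0] * values[0] + values[1] * values[1] + values[2] * values[2]);
    magnitudes[index] = magnitude;
    index = (index + 1) % WINDOW;
    if (filled < WINDOW) {
        filled++;
        return false;                              // not enough data yet
    }
    float min = Float.MAX_VALUE, max = -Float.MAX_VALUE;
    for (float m : magnitudes) {
        min = Math.min(min, m);
        max = Math.max(max, m);
    }
    return (max - min) < TOLERANCE;                // stationary if the spread stays small
}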
For this purpose you can use the Activity Recognition API, which provides events like walking, still, in vehicle, etc. Activity recognition uses sensor data and, when available, the location service. For more on what it is and how to use it, see the link below:
https://developers.google.com/location-context/activity-recognition/

Android accelerometer calibration?

TL;DR
How come the accelerometer values I get from Sensor.TYPE_ACCELEROMETER are slightly offset? I don't mean by gravity, but by some small error that varies from axis to axis and phone to phone.
Can I calibrate the accelerometer? Or is there a standard way of compensating for these errors?
I'm developing an app that has a need for as precise acceleration measurements as possible (mainly vertical acceleration, i.e. same direction as gravity).
I've been doing A LOT of testing, and it turns out that the raw values I get from Sensor.TYPE_ACCELEROMETER are off. If I let the phone rest on a perfectly horizontal surface with the screen up, the accelerometer shows a Z-value of 9.0, where it should be about 9.81. Likewise, if I put the phone in portrait or landscape mode, the X and Y accelerometer values show about 9.6 instead of 9.81.
This of course affects my vertical acceleration, as I'm using SensorManager.getRotationMatrixFromVector(), to calculate the vertical acceleration, resulting in a vertical acceleration that is off by a different amount depending on the rotation of the device.
Now, before anyone jumps the gun and mentions that I should try using Sensor.TYPE_LINEAR_ACCELERATION instead, I must point out that I'm actually doing that as well, in parallel with TYPE_ACCELEROMETER. Using the gravity sensor I then calculate the vertical acceleration (as described in this answer). The funny thing is that I get EXACTLY the same result as the method that uses the raw accelerometer, SensorManager.getRotationMatrixFromVector() and matrix multiplication (and finally subtracting gravity).
The only way I'm able to get almost exactly zero vertical acceleration for a stationary phone in any rotation is to take the raw accelerometer values, add an offset (from earlier observations, i.e. X+0.21, Y+0.21 and Z+0.81) and then do the rotation-matrix math to get world-coordinate accelerations. Note that it's not just the calculated vertical acceleration that is wrong - the raw values from Sensor.TYPE_ACCELEROMETER themselves are off, which I would think rules out other error sources such as the gyroscope.
I have tested this on two different phones (Samsung Galaxy S5 and Sony Xperia Z3 compact), and both have these accelerometer value deviances - but of course not the same values on both phones.
How come the values of Sensor.TYPE_ACCELEROMETER are off, and is there a better way of "calibrating" the accelerometer than simply observing how much they deviate from gravity and adding the difference to the values before using them?
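For reference, a minimal sketch of the correction described above (adding the empirically observed offsets, then rotating into world coordinates). The variables rotationVectorValues and rawAccel are placeholders for the latest rotation-vector and accelerometer readings, and the offsets are the example values from the question:

float[] rotationMatrix = new float[9];
SensorManager.getRotationMatrixFromVector(rotationMatrix, rotationVectorValues);

// add the empirically observed per-axis offsets to the raw readings
float ax = rawAccel[0] + 0.21f;
float ay = rawAccel[1] + 0.21f;
float az = rawAccel[2] + 0.81f;

// world = R * device (row-major 3x3 matrix times column vector); row 2 gives world Z
float worldZ = rotationMatrix[6] * ax + rotationMatrix[7] * ay + rotationMatrix[8] * az;
float verticalAccel = worldZ - SensorManager.GRAVITY_EARTH;   // remove gravity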
You should calibrate the gains, offsets, and angles of the 3 accelerometer axes.
Unfortunately it's not possible to cover the whole topic in depth here.
I'll write a small introduction describing the basic concept, and then post a link to the code of a simple clinometer that implements the calibration.
The calibration routine can be done with 7 measurements (take the mean of a good number of samples for each) in different orthogonal positions of your choice, so that each axis sees both +-0 and +-g. For example:
STEP 1 = Lay flat
STEP 2 = Rotate 180°
STEP 3 = Lay on the left side
STEP 4 = Rotate 180°
STEP 5 = Lay vertical
STEP 6 = Rotate 180° upside-down
STEP 7 = Lay face down
Then you can use the 7 measurements mean[][] to calculate offsets and gains:
calibrationOffset[0] = (mean[0][2] + mean[0][3]) / 2;
calibrationOffset[1] = (mean[1][4] + mean[1][5]) / 2;
calibrationOffset[2] = (mean[2][0] + mean[2][6]) / 2;
calibrationGain[0] = (mean[0][2] - mean[0][3]) / (STANDARD_GRAVITY * 2);
calibrationGain[1] = (mean[1][4] - mean[1][5]) / (STANDARD_GRAVITY * 2);
calibrationGain[2] = (mean[2][0] - mean[2][6]) / (STANDARD_GRAVITY * 2);
using the values of mean[axis][step], where STANDARD_GRAVITY = 9.81.
Then apply the gain and offset corrections to the measurements:
for (int i = 0; i < 7; i++) {
    mean[0][i] = (mean[0][i] - calibrationOffset[0]) / calibrationGain[0];
    mean[1][i] = (mean[1][i] - calibrationOffset[1]) / calibrationGain[1];
    mean[2][i] = (mean[2][i] - calibrationOffset[2]) / calibrationGain[2];
}
and finally calculate the correction angles:
for (int i = 0; i < 7; i++) {
    angle[0][i] = (float) (Math.toDegrees(Math.asin(mean[0][i]
            / Math.sqrt(mean[0][i] * mean[0][i] + mean[1][i] * mean[1][i] + mean[2][i] * mean[2][i]))));
    angle[1][i] = (float) (Math.toDegrees(Math.asin(mean[1][i]
            / Math.sqrt(mean[0][i] * mean[0][i] + mean[1][i] * mean[1][i] + mean[2][i] * mean[2][i]))));
    angle[2][i] = (float) (Math.toDegrees(Math.asin(mean[2][i]
            / Math.sqrt(mean[0][i] * mean[0][i] + mean[1][i] * mean[1][i] + mean[2][i] * mean[2][i]))));
}
calibrationAngle[2] = (angle[0][0] + angle[0][1])/2; // angle 0 = X axis
calibrationAngle[1] = -(angle[1][0] + angle[1][1])/2; // angle 1 = Y axis
calibrationAngle[0] = -(angle[1][3] - angle[1][2])/2; // angle 2 = Z axis
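Once calibrationOffset[] and calibrationGain[] are known, each new live reading can be corrected the same way before any further processing; a minimal sketch, assuming the usual onSensorChanged(SensorEvent event) callback:

float[] corrected = new float[3];
for (int axis = 0; axis < 3; axis++) {
    // remove the per-axis bias, then rescale by the per-axis gain
    corrected[axis] = (event.values[axis] - calibrationOffset[axis]) / calibrationGain[axis];
}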
You can find a simple but complete implementation of a 3-axis calibration in this open-source Clinometer app: https://github.com/BasicAirData/Clinometer.
The APK and the Google Play Store link are also available there if you want to try it.
You can find the calibration routine in CalibrationActivity.java;
The calibration parameters are applied in ClinometerActivity.java.
Furthermore, you can find a very good technical article that covers 3-axis calibration in depth here: https://www.digikey.it/it/articles/using-an-accelerometer-for-inclination-sensing.

constant rotating device- how to receive valid orientation data?

I am currently implementing a speedometer by receiving orientation data from my phone. I am using
SensorManager.getRotationMatrix(R, I, gravity, geomagnetic);
float[] orientation = new float[3];
SensorManager.getOrientation(R, orientation);
float azimuth = orientation[0];
double azimuthD = Math.toDegrees(azimuth);
if (azimuthD < 0) azimuthD = 360 + azimuthD;
With this I am able to receive the rotation data from my phone, such as azimuth etc.
Anyway, this works fine while the device is placed on a table or something. But when it rotates around a fixed point (in my case the device is mounted on a wheel and rotating at a certain speed) the values are far from accurate. I believe that, since I am using the gravity and geomagnetic sensors, the forces involved in the rotation interfere with these sensors. As the wheel turns, the rotation changes relative to a point, but the local device orientation stays the same.
How can I obtain the orientation of the device while it's turning without running into a lot of noisy data?
I read a bit about the Sensor.TYPE_ROTATION_VECTOR sensor but couldn't quite figure out how it works. I also read about the possibility of remapping the coordinate system, but how is that supposed to help, since my phone is never exactly vertical to the floor - more like at an angle of 5°-10°?
I would appreciate any help.
Cheers,
viehlieb
I guess I found my answer.
The solution was to throw away all the code I posted above and use the gyroscope instead.
The gyroscope values measure the angular velocity of the device's rotation, in the device's own coordinate system. In my case the relevant value was the rotation around the z-axis.
Values are in radians per second, which can be converted to linear speed using the wheel's circumference. So the trick was in the onSensorChanged method:
if (sensorEvent.sensor.getType() == Sensor.TYPE_GYROSCOPE) {
    gyroscope = sensorEvent.values;
    double rotZ = gyroscope[2];                       // angular velocity around z, rad/s
    double degrees = Math.toDegrees(gyroscope[2]);    // same, in degrees/s
    // degrees/360 = revolutions per second; * circumference (2.23 m) = m/s; * 3.6 = km/h
    float speed = (float) degrees / 360.0f * 2.23f * 3.6f;
}
If you'd like more accurate values, you can store them in an array and calculate the average. Remember to clear the array roughly every 20th call to onSensorChanged. With SENSOR_DELAY_GAME registered there is enough data to build a useful average.
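A minimal sketch of that averaging: accumulate about 20 angular-rate samples, convert the mean to wheel speed, then reset. The field names are illustrative and the circumference is the 2.23 m from the code above:

private static final int SAMPLES = 20;
private static final float CIRCUMFERENCE_M = 2.23f;
private float rotZSum = 0f;
private int rotZCount = 0;

private void onGyroSample(float rotZRadPerSec) {
    rotZSum += rotZRadPerSec;
    rotZCount++;
    if (rotZCount >= SAMPLES) {
        float meanRadPerSec = rotZSum / rotZCount;
        // rad/s -> revolutions/s -> m/s -> km/h
        float speedKmh = (float) (meanRadPerSec / (2 * Math.PI)) * CIRCUMFERENCE_M * 3.6f;
        rotZSum = 0f;
        rotZCount = 0;
        // use speedKmh ...
    }
}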

Android - How to approach fall detection algorithm

I want to feature a fairly simple fall detection algorithm in my application. At the moment, in onSensorChanged(), I take the absolute value of the current x, y, z values and subtract SensorManager.GRAVITY_EARTH (9.81 m/s²). The resulting value has to be bigger than a threshold value 10 times in a row to set a flag saying a fall has been detected by the accelerometer; the threshold value is about 8 m/s².
I'm also comparing the orientation of the phone as soon as the threshold is passed with its orientation once the threshold is no longer being passed; this sets another flag saying the orientation sensor has detected a fall.
When both flags are set, an event occurs to check whether the user is OK, etc. My problem is with the threshold: when the phone is held straight up, the absolute value from the accelerometer is about 9.8 m/s², but when I hold it still at an angle it can be over 15 m/s². This causes other events to trigger the fall detection, and if I increase the threshold to avoid that, it won't detect falls.
Can anyone give me some advice on what values I should use, or how to improve my method? Many thanks.
First, I want to remind you that you cannot just add the x, y, z values together as they are; you have to use vector mathematics. This is why you get values of over 15 m/s². As long as the phone is not moving, the vector magnitude should always be about 9.8 m/s². You calculate it as sqrt(x*x + y*y + z*z). If you need more information, read up on vector mathematics; http://en.wikipedia.org/wiki/Euclidean_vector#Length is a good start.
I also suggest another algorithm: in free fall, all three of the x, y, z values of the accelerometer should be near zero (at least, that's what I learned in physics classes a long time ago). So maybe you can use a rule like: if the vector magnitude of x, y, z is <= 3 m/s², detect a free fall; and if the magnitude then rises above 20 m/s², detect the landing.
Those thresholds are just a wild guess. Maybe record the x, y, z values in a test application, move the phone around, and then analyse offline how the values (and their vector magnitude) behave to get a feeling for which thresholds are sensible.
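A minimal sketch of that two-threshold idea (near-zero magnitude suggests free fall, a large spike shortly afterwards suggests the impact); the thresholds are the rough guesses above and onFallDetected() is a hypothetical callback:

private static final float FREE_FALL_THRESHOLD = 3.0f;   // m/s^2
private static final float IMPACT_THRESHOLD = 20.0f;     // m/s^2
private boolean inFreeFall = false;

private void checkFall(float x, float y, float z) {
    double magnitude = Math.sqrt(x * x + y * y + z * z);
    if (magnitude < FREE_FALL_THRESHOLD) {
        inFreeFall = true;                                // possible fall in progress
    } else if (inFreeFall && magnitude > IMPACT_THRESHOLD) {
        inFreeFall = false;
        onFallDetected();                                 // hypothetical callback
    }
}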
I have actually published a paper on this issue. Please feel free to check out "iFall" at ww2.cs.fsu.edu/~sposaro
We basically take the root sum of squares and look for 3 things:
1. Lower threshold broken, i.e. falling
2. Upper threshold broken, i.e. hitting the ground
3. Flatline around 1 g, i.e. a "long lie", lying on the ground for an extended period of time
I forgot to update this thread, but iFall is now available on the Android Market.
Also check out ww2.cs.fsu.edu/~sposaro/iFall for more information
It's possible using the accelerometer sensor.
Write this in the sensor-changed listener:
if (sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
    long curTime = System.currentTimeMillis();
    // only allow one update every 100 ms
    if ((curTime - lastUpdate) > 100) {
        long diffTime = (curTime - lastUpdate);
        lastUpdate = curTime;
        x = values[SensorManager.DATA_X];
        y = values[SensorManager.DATA_Y];
        z = values[SensorManager.DATA_Z];
        // change in the summed axis values per unit time, scaled up
        float speed = Math.abs(x + y + z - last_x - last_y - last_z) / diffTime * 10000;
        Log.d("getShakeDetection", "speed: " + speed);
        if (speed > DashplexManager.getInstance().SHAKE_THRESHOLD) {
            result = true;
        }
        last_x = x;
        last_y = y;
        last_z = z;
    }
}

How to detect walking with Android accelerometer

I'm writing an application and my aim is to detect when a user is walking.
I'm using a Kalman filter like this:
float kFilteringFactor = 0.6f;

// smooth the raw readings to estimate gravity, then subtract it to get linear acceleration
gravity[0] = (accelerometer_values[0] * kFilteringFactor) + (gravity[0] * (1.0f - kFilteringFactor));
gravity[1] = (accelerometer_values[1] * kFilteringFactor) + (gravity[1] * (1.0f - kFilteringFactor));
gravity[2] = (accelerometer_values[2] * kFilteringFactor) + (gravity[2] * (1.0f - kFilteringFactor));

linear_acceleration[0] = (accelerometer_values[0] - gravity[0]);
linear_acceleration[1] = (accelerometer_values[1] - gravity[1]);
linear_acceleration[2] = (accelerometer_values[2] - gravity[2]);

float magnitude = 0.0f;
magnitude = (float) Math.sqrt(linear_acceleration[0] * linear_acceleration[0]
        + linear_acceleration[1] * linear_acceleration[1]
        + linear_acceleration[2] * linear_acceleration[2]);
magnitude = Math.abs(magnitude);

if (magnitude > 0.2)
    // walking
The array gravity[] is initialized with 0s.
I can detect when a user is walking or not (by looking at the magnitude of the acceleration vector), but my problem is that when a user is not walking and just moves the phone, it looks as if they are walking.
Am I using the right filter?
Is it right to watch only the magnitude of the vector, or do I have to look at the individual components?
Google provides an API for this called DetectedActivity, which can be obtained using the ActivityRecognitionApi; see the official documentation for both classes.
DetectedActivity has the method public int getType() to get the current activity of the user, and also public int getConfidence(), which returns a value from 0 to 100. The higher the value returned by getConfidence(), the more certain the API is that the user is performing the returned activity.
Here is a summary of the constants returned by getType():
int IN_VEHICLE The device is in a vehicle, such as a car.
int ON_BICYCLE The device is on a bicycle.
int ON_FOOT The device is on a user who is walking or running.
int RUNNING The device is on a user who is running.
int STILL The device is still (not moving).
int TILTING The device angle relative to gravity changed significantly.
int UNKNOWN Unable to detect the current activity.
int WALKING The device is on a user who is walking.
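A hedged sketch of how these pieces fit together, assuming the play-services-location dependency; ActivityReceiver is a hypothetical BroadcastReceiver in your app:

// request activity updates roughly every 3 seconds
Intent intent = new Intent(context, ActivityReceiver.class);
PendingIntent pendingIntent =
        PendingIntent.getBroadcast(context, 0, intent, PendingIntent.FLAG_UPDATE_CURRENT);
ActivityRecognition.getClient(context).requestActivityUpdates(3000, pendingIntent);

// then, in ActivityReceiver.onReceive(Context context, Intent intent):
if (ActivityRecognitionResult.hasResult(intent)) {
    ActivityRecognitionResult result = ActivityRecognitionResult.extractResult(intent);
    DetectedActivity activity = result.getMostProbableActivity();
    if (activity.getType() == DetectedActivity.WALKING && activity.getConfidence() > 75) {
        // the user is probably walking
    }
}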
My first intuition would be to run an FFT analysis on the sensor history, and see what frequencies have high magnitudes when walking.
It's essentially seeing what walking "sounds like": treat the accelerometer inputs like a microphone and see which frequencies are loud when walking (in other words, at what frequency the biggest acceleration happens).
I'd guess you'd be looking for a high magnitude at some low frequency (like footstep rate) or maybe something else. It would be interesting to see the data.
My guess is you run the FFT and look for the magnitude at some frequency to be greater than some threshold, or the difference between magnitudes of two of the frequencies is more than some amount. Again, the actual data would determine how you attempt to detect it.
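A rough sketch of that idea: check how much energy the magnitude signal has in the typical step band (around 1-3 Hz). A plain DFT is used here for clarity; the buffer length, sample rate, and threshold are all illustrative:

private static final float SAMPLE_RATE_HZ = 50f;   // depends on the sensor delay used

private boolean looksLikeWalking(float[] magnitudeBuffer) {
    int n = magnitudeBuffer.length;
    double bandEnergy = 0;
    for (int k = 1; k < n / 2; k++) {
        double freq = k * SAMPLE_RATE_HZ / n;
        if (freq < 1.0 || freq > 3.0) continue;    // only look at the step-rate band
        double re = 0, im = 0;
        for (int t = 0; t < n; t++) {
            double angle = 2 * Math.PI * k * t / n;
            re += magnitudeBuffer[t] * Math.cos(angle);
            im -= magnitudeBuffer[t] * Math.sin(angle);
        }
        bandEnergy += (re * re + im * im) / ((double) n * n);
    }
    return bandEnergy > 0.5;                        // threshold must be found empirically
}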
For walking detection I use the derivative applied to a smoothed accelerometer signal. When the derivative exceeds a threshold value, I assume it was a step. But I guess that's not best practice; furthermore, it only works when the phone is in a pants pocket.
The following code was used in this app https://play.google.com/store/apps/details?id=com.tartakynov.robotnoise
@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() != Sensor.TYPE_ACCELEROMETER) {
        return;
    }
    final float z = smooth(event.values[2]); // scalar Kalman filter
    if (Math.abs(z - mLastZ) > LEG_THRSHOLD_AMPLITUDE) {
        mInactivityCount = 0;
        int currentActivity = (z > mLastZ) ? LEG_MOVEMENT_FORWARD : LEG_MOVEMENT_BACKWARD;
        if (currentActivity != mLastActivity) {
            mLastActivity = currentActivity;
            notifyListeners(currentActivity);
        }
    } else {
        if (mInactivityCount > LEG_THRSHOLD_INACTIVITY) {
            if (mLastActivity != LEG_MOVEMENT_NONE) {
                mLastActivity = LEG_MOVEMENT_NONE;
                notifyListeners(LEG_MOVEMENT_NONE);
            }
        } else {
            mInactivityCount++;
        }
    }
    mLastZ = z;
}
EDIT: I don't think this is accurate enough, since when walking normally the average acceleration is near 0. The most you can do by measuring acceleration is detect when someone starts or stops walking (but, as you said, it's difficult to distinguish that from the device being moved by someone standing in one place).
So what I wrote earlier probably wouldn't work anyway:
You can "predict" whether the user is moving by ruling out the cases where the user is clearly not moving. The first two options that come to mind are:
Check whether the phone is "hidden", using the proximity and (optionally) light sensors. This method is less accurate but easier.
Check the continuity of the movement: if the phone has been moving for more than, say, 10 seconds and the movement is not negligible, consider that the user is walking. I know this isn't perfect either, but it's difficult without using any kind of positioning. By the way, why don't you just use LocationManager?
Try detecting the up-and-down oscillations, the fore-and-aft oscillations, and the frequency of each, and make sure they stay within bounds on average. You would then be detecting walking - and specifically that person's gait - which should remain relatively constant for several steps in a row to qualify as moving.
As long as the last 3 oscillations line up within reason, conclude that walking is occurring, provided this also holds:
Measure horizontal acceleration and update a velocity value with it. Velocity will drift with time, so keep a moving average of velocity smoothed over the duration of a step; as long as it doesn't drift by more than, say, half of walking speed per 3 oscillations, it's walking - but only if it initially rose to walking speed within a short time, i.e. half a second or about 2 oscillations.
All of that should just about cover it.
Of course, a little AI would help: treating all of these as preprocessed inputs to a neural network could make things simpler (or just as complex) but remarkably accurate.
