Using LinearAcceleration and time passed to get distance traveled - Android [duplicate]

This question already has answers here:
How to use Accelerometer to measure distance for Android Application Development
(2 answers)
Closed 9 years ago.
I know I am opening up a can of worms by trying to get the linear motion of a device using the accelerometer, but please just humor me.
I am trying to figure out the right formula to take the Sensor.TYPE_LINEAR_ACCELERATION values (which I believe are the normal accelerometer data minus gravity) and essentially say "this much time has passed and I have accelerated by x amount since last time, so I have traveled d amount."
It should be something like distanceTraveledOnX = linearAccelerationOfX * timePassed;
Easy enough in the real world, right? If I have been going 1 mile a minute for 10 minutes, then I have traveled 10 miles: speed * time = distance.
The problem is I'm not sure what unit of measure the linearAcceleration values use. I know my timePassed is in nanoseconds, as I am saying (in my onSensorChanged):
currentTime = System.nanoTime(); // var of type double (note: nanoTime() actually returns a long)
timePassed = currentTime - lastTime;
lastTime = currentTime;
Can someone please help me figure out the formula for turning the linearAcceleration value into a distance over each nanosecond time slice?
Thanks.
EDIT
Here is the code I'm currently using, but I'm always getting 0:
public void onSensorChanged(SensorEvent evt) {
    if (type == Sensor.TYPE_LINEAR_ACCELERATION) {
        newTime = System.currentTimeMillis()/1000;
        float oldVelocity = lastTime1 - lastTime0;
        float newVelocity = newTime - lastTime1;
        if (oldVelocity < 1) oldVelocity = 1;
        newX = lastX1 + ((lastX1 - lastX0)/oldVelocity)*newVelocity + (evt.values[0]/2)*(newVelocity*newVelocity);
        lastX0 = lastX1;
        lastX1 = newX;
        lastTime0 = lastTime1;
        lastTime1 = newTime;
        Log.v("SENSOR MAN LINEAR", "new X:" + newX);
    }
}

This stuff is high school physics, and if you don't know the difference between acceleration and velocity, you'll need to review it before you have any hope here.
I can tell you this much: the linear acceleration readings from a cell phone or tablet aren't remotely precise or accurate enough to do what you want without constant correction (via gps or other methods). There is an entire field of study trying to solve this problem. I've attended conferences on it.
That said, you also need to take into account that the orientation of your device will change, unless this is some sort of special application, e.g. the device is strapped onto a sled which can only move in one direction.
Let's assume that case, and assume that the device is strapped to your sled with the right side of the device (+X axis) aligned in the direction of travel. Let's also assume that the initial position of the sled is known (call it X0) when the program starts, and that the initial velocity is zero.
Your code looks approximately like this:
double x0; // previous position, meters
double x;  // current position
double v0; // previous velocity, meters/second
double v;  // current velocity
long t0;   // previous time, nanoseconds
long t;    // current time

public void onStart() {
    x0 = getInitialPosition();
    x = x0;
    v0 = 0;
    v = 0;
    t0 = System.nanoTime(); // ideally, seed this from the first event.timestamp, since sensor timestamps use their own timebase
    // Enable sensors; left as an exercise for the reader
}

public void onSensorChanged(SensorEvent event) {
    // Assume linear acceleration is the only active sensor
    double accel = event.values[0]; // X axis is our axis of acceleration
    t = event.timestamp;
    double dt = (t - t0) * 1e-9; // nanoseconds to seconds
    v = v0 + accel * dt;
    x = x0 + v * dt;
    t0 = t;
    v0 = v;
    x0 = x;
}
This is by no means a complete solution. Doing this right involves differential equations which I'm not equipped to explain here (translation: I've forgotten everything I learned in college). However, if your acceleration value is accurate enough, and your time slice is short enough, this is viable.
If you need to solve this in more than one direction, it's only slightly more complicated provided that the device never changes orientation. If it does, then you also need to capture the rotation sensor and learn about quaternions and rotation matrices.
And even if you do everything right, errors will still accumulate, so now you want some sort of correction factor based on GPS, known geometry of the environment (e.g. if you're indoors and the software has a map of the building, it can make corrections when you turn a corner), and other environmental clues such as WiFi hotspots in known locations.
You might want to read up on Kalman filters at this point.
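For a taste of what a correction step looks like, here is a crude sketch that nudges the integrated position toward a GPS fix whenever one arrives (getGpsX() is a hypothetical helper; a real Kalman filter would weight the correction by the estimated uncertainty of each source instead of a fixed gain):
// Hedged sketch: fixed-gain blend of the dead-reckoned x toward a GPS fix.
static final double GPS_GAIN = 0.2; // made-up trust in GPS per fix; tune empirically

void onGpsFix() {
    double gpsX = getGpsX(); // hypothetical helper returning a position in meters
    x = (1.0 - GPS_GAIN) * x + GPS_GAIN * gpsX; // pull the estimate toward the measurement
    x0 = x; // keep the integrator state consistent with the corrected position
}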
Executive summary: this is a HARD problem in the general case, and if you solve it, there's probably fame and fortune waiting for you.

Well, the correct form, known from school, is
finalXPosition = (linearAcceleration*timePassed^2)/2 + initialVelocity*timePassed + initialXPosition
finalVelocity = initialVelocity + linearAcceleration*timePassed
Chaining these chunks you'll get your theoretical values.
In practice, the best results are achieved by regular calibration of initialXPosition and initialVelocity through GPS.
A simple example that integrates calibrated horizontal acceleration in onSensorChanged:
class Integrator {
    private float position = 0f;
    private float velocity = 0f;

    public void setGpsPosition(float gpsPosition) {
        position = gpsPosition;
    }

    public void setGpsVelocity(float gpsVelocity) {
        velocity = gpsVelocity;
    }

    public void onAccelerationChangeHandler(float acceleration, float timePassed) {
        position += acceleration*timePassed*timePassed/2f + velocity*timePassed;
        velocity += acceleration*timePassed;
    }

    public float getCurrentPosition() {
        return position;
    }
}
usage for x-acceleration:
Integrator integrator = new Integrator();
long lastTime = 0;

public void onSensorChanged(SensorEvent evt) {
    if (evt.sensor.getType() == Sensor.TYPE_LINEAR_ACCELERATION) {
        long newTime = System.currentTimeMillis();
        // float division, milliseconds to seconds; seed lastTime before the first event
        integrator.onAccelerationChangeHandler(evt.values[0], (newTime - lastTime) / 1000f);
        lastTime = newTime;
    }
}
Please note that beyond a minute timescale the accumulated error makes this all meaningless without GPS correction. Understand that if you are walking at constant speed, the sensor won't give you anything at all.

Related

DJI MobileSDK I want to specify the distance with VirtualStick

I want to move forward by specifying a distance of 5 meters.
I want to know the relationship between the units of the mPitch, mRoll, mYaw, and mThrottle values passed as arguments to mFlightController.sendVirtualStickFlightControlData and the flight distance.
What value must be set for mRoll to advance the MAVIC by 5 meters?
I want to know the calculation method.
This is the example code:
// What value must be set for mRoll to advance MAVIC by 5 meters?
float distance = 0.5f;
float rollJoyControlMaxSpeed = 10;
float mPitch = 0.0f;
float mRoll = (float) (rollJoyControlMaxSpeed * distance);
float mYaw = 0.0f;
float mThrottle = 0.0f;
mFlightController.sendVirtualStickFlightControlData(
        new FlightControlData(mPitch, mRoll, mYaw, mThrottle),
        new CommonCallbacks.CompletionCallback() {
            @Override
            public void onResult(DJIError djiError) {
                if (djiError != null) {
                    setResultToToast(djiError.getDescription());
                }
            }
        });
What you're asking for is inertial navigation and the DJI SDK contains no class or function for that at the moment.
In inertial navigation the standard approach is to use the velocity and acceleration in order to calculate the distance moved so far.
The velocity is provided by FlightControllerStateCallback:
aircraft.getFlightController().setStateCallback(new FlightControllerState.Callback() {
    @Override
    public void onUpdate(@NonNull FlightControllerState flightControllerState) {
        // get current velocity
        float velocityX = flightControllerState.getVelocityX();
        float velocityY = flightControllerState.getVelocityY();
        float velocityZ = flightControllerState.getVelocityZ();
    }
});
The acceleration values are not accessible.
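Since only the velocity is exposed, one option is to integrate it over time inside that callback to estimate the distance flown so far. A minimal sketch (timing the updates via System.nanoTime() is an assumption on my part):
private long lastNanos = 0;
private double distanceX = 0; // meters accumulated along the aircraft X axis

aircraft.getFlightController().setStateCallback(new FlightControllerState.Callback() {
    @Override
    public void onUpdate(@NonNull FlightControllerState state) {
        long now = System.nanoTime();
        if (lastNanos != 0) {
            double dt = (now - lastNanos) * 1e-9;   // seconds since the last update
            distanceX += state.getVelocityX() * dt; // v * dt, summed over updates
        }
        lastNanos = now;
    }
});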
In theory you need to fly at a speed of 1 m/s for 5 seconds to reach your goal of 5 meters. In reality you need to account for the acceleration and braking distance.
The most promising approach I see is to test how much time the Mavic needs to accelerate to a certain speed and also how long the braking distance is in relation to the speed. Based on these you should be able to approximate the distance flown at an acceptable level. Here's some pseudocode of how I imagine it working:
set velocity to 1 m/s
-> the drone needs 2 seconds to achieve the speed and will fly 1.5m during that time. [1]
once the velocity has been reached (information from callback), wait 2 seconds.
-> the drone flies 2m
set velocity to 0
-> the braking distance for 1m/s is 1.5m [1]
the total distance flown: 5m
Please note that these values [1] are purely made up and you need to get approximate values for these via probably tedious testing.
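To make the pseudocode concrete, here is a rough Java sketch of the timed approach. Everything here is hedged: whether roll maps to forward motion depends on the coordinate system and flight orientation mode, virtual stick mode must already be enabled, the SDK generally expects virtual stick commands to be re-sent continuously, and the 2-second cruise comes from the made-up numbers above:
// Rough sketch: command ~1 m/s "forward" for 2 s, then stop.
// A Timer re-issues the command every 100 ms, since virtual stick
// commands are expected to be sent continuously.
final FlightControlData go = new FlightControlData(0f, 1f, 0f, 0f);   // pitch, roll, yaw, throttle
final FlightControlData stop = new FlightControlData(0f, 0f, 0f, 0f);
final long endMillis = System.currentTimeMillis() + 2000;             // cruise time from the estimate above
final Timer timer = new Timer();
timer.schedule(new TimerTask() {
    @Override
    public void run() {
        boolean done = System.currentTimeMillis() >= endMillis;
        mFlightController.sendVirtualStickFlightControlData(done ? stop : go, null);
        if (done) timer.cancel();
    }
}, 0, 100);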

Vertical orientation degree - Android

Does anyone know how to get a smooth vertical orientation degree in Android?
I already tried OrientationEventListener as shown below, but it's very noisy. I already tried all rates (Normal, UI, Game, and Fastest); all showed the same result.
myOrientationEventListener = new OrientationEventListener(this, SensorManager.SENSOR_DELAY_NORMAL) {
    @Override
    public void onOrientationChanged(int arg0) {
        orientaion = arg0;
        Log.i("orientaion", "orientaion:" + orientaion);
    }
};
So there are two things going on that can affect what you need.
Sensor delay. Android provides four different sensor delay modes: SENSOR_DELAY_UI, SENSOR_DELAY_NORMAL, SENSOR_DELAY_GAME, and SENSOR_DELAY_FASTEST, where SENSOR_DELAY_NORMAL has the longest interval between two data points and SENSOR_DELAY_FASTEST has the shortest. The shorter the interval, the higher the data sampling rate (number of samples per second). A higher sampling rate gives you more "responsive" data but greater noise, while a lower sampling rate gives you more "laggy" but smoother data.
Noise filtering. With the above in mind, you need to decide which route you want to take. Does your application need fast response? If it does, you probably want to choose a higher sampling rate. Does your application need smooth data? I guess this is obviously YES given the context of the question, which means you need noise filtering. For sensor data, noise is mostly high frequency in nature (noise value oscillates very fast with time). So a low pass filter (LPF) is generally adequate.
A simple way to implement LPF is exponential smoothing. To integrate with your code:
int orientation = <init value>;
float update_rate = <value between 0 and 1>;
myOrientationEventListener = new OrientationEventListener(this, SensorManager.SENSOR_DELAY_NORMAL) {
    @Override
    public void onOrientationChanged(int arg0) {
        orientation = (int) (orientation * (1f - update_rate) + arg0 * update_rate);
        Log.i("orientation", "orientation:" + orientation);
    }
};
A larger update_rate means the resulting data is less smooth, which should be intuitive: if update_rate == 1f, it falls back to your original code. Another note about update_rate is that it depends on the time interval between updates (related to the sensor delay modes). You can probably tune this value to find one that works for you, but if you want to know exactly how it works, check the alpha value definition under Electronic low-pass filters -> Discrete-time realization.
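For reference, that discrete-time definition boils down to the following (a small sketch; rc is the smoothing time constant you choose, dt the interval between sensor updates):
// alpha for exponential smoothing, from the discrete-time RC low-pass filter
// dt: seconds between updates; rc: time constant in seconds (your smoothing knob)
float lowPassAlpha(float dt, float rc) {
    return dt / (rc + dt); // longer intervals or smaller rc weight new samples more
}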
I had a similar problem showing an artificial horizon on my device. The low pass filter (LPF) solved this issue.
However, you need to consider that when you use the orientation angle in degrees and apply the LPF to it blindly, the result is faulty when the device is in portrait mode and turned from left to right or the opposite. The reason is the jump between 359 and 0 degrees. Therefore I recommend converting the degrees into radians and applying the LPF to the sin and cos values of the orientation angle.
Further, I recommend using a dynamic alpha or update rate for the LPF. A static value for the alpha might be perfect on your device but not on any other.
The following class filters based on radians and uses a dynamic alpha as described above:
import static java.lang.Math.*;

class Filter {
    private static final float TIME_CONSTANT = .297f;
    private static final float NANOS = 1000000000.0f;
    private static final int MAX = 360;

    private double alpha;
    private float timestamp;
    private float timestampOld;
    private int count;
    private int[] values;

    Filter() {
        timestamp = System.nanoTime();
        timestampOld = System.nanoTime();
        values = new int[0];
    }

    int filter(int input) {
        // there is no need to filter if we have only one value
        if (values.length == 0) {
            values = new int[] {0, input};
            return input;
        }
        // filter based on last element from array and input
        int filtered = filter(values[1], input);
        // new array based on previous result and filter
        values = new int[] {values[1], filtered};
        return filtered;
    }

    private int filter(int previous, int current) {
        calculateAlpha();
        // convert to radians
        double radPrev = toRadians(previous);
        double radCurrent = toRadians(current);
        // filter based on sin & cos
        double sumSin = filter(sin(radPrev), sin(radCurrent));
        double sumCos = filter(cos(radPrev), cos(radCurrent));
        // calculate result angle
        double radRes = atan2(sumSin, sumCos);
        // convert radians to degrees, round, and normalize (modulo of 360)
        long round = round(toDegrees(radRes));
        return (int) ((MAX + round) % MAX);
    }

    // dynamic alpha
    private void calculateAlpha() {
        timestamp = System.nanoTime();
        float diff = timestamp - timestampOld;
        count++; // increment first so the average interval is defined on the first call
        double dt = 1 / (count / (diff / NANOS));
        alpha = dt / (TIME_CONSTANT + dt);
    }

    private double filter(double previous, double current) {
        return previous + alpha * (current - previous);
    }
}
For further reading, see this discussion.

How to detect left and right tilt of an android device mounted with an accelerometer?

Let's say you have the acceleration readings in all 3 dimensions, i.e. X, Y and Z. How do you infer from the readings that the phone was tilted left or right? The readings are generated every 20ms.
I actually want the logic of inferring the tilt from the readings. The tilt needs to be smooth.
A tilt can be detected in a number of different ways. You can take into account 1 axis, 2 axes, or all 3, depending on how accurate you want it to be and how much you feel like fighting with maths.
If you use only one axis, it is quite simple. Think of the mobile lying completely horizontal and being tilted over on one edge. Using just one axis, let's say axis X, will be enough, since you can accurately detect a change in that axis position; even a small movement produces a change on the axis.
But if your application is only reading that axis and the user holds the phone almost vertical, the difference on the X axis will be really small even when rotating the phone through a big angle.
Anyway, for applications that only need coarse resolution, a single axis can be used.
Referring to basic trigonometry, the projection of the gravity vector on the x-axis produces an output acceleration equal to the sine of the angle between the accelerometer x-axis and the horizon.
This means that having the values of an axis (those are acceleration values) you can calculate the angle at which the device is held.
That is, the value given to you by the sensor is equal to 9.8 * sine of the angle, so doing the maths you can get the actual angle.
But don't worry, you don't even have to do this. Since the values are more or less proportional to the angle, you can work directly with the value of the sensor, without caring much about which angle it represents, if you don't need great accuracy; a change in that value means a proportional change in the angle, so with a few tests you will find out how big the change should be in order to be relevant to you.
So, if you take the value over time and compare one reading to the next, you can figure out how big the rotation was. For this:
1. Consider just one axis; this will be axis X.
2. Write a function to get the difference in the sensor value for that axis between one function call and the next.
3. Decide on a maximum time and a minimum sensor difference that you will consider a valid movement (e.g. a big rotation is good, but only if it is fast enough, and a fast movement is good only if the difference in the angle is big enough).
4. If you detect two measurements that meet those conditions, take note of a half tilt done (in a boolean, for instance), and start measuring again, but now the new reference value is the value that was considered half tilt.
5. If the last difference was positive, now you need a negative difference, and if the last difference was negative, now you need a positive difference; this is the coming back. So start taking values, comparing the new reference value with the new values coming from the sensor, and see if one meets what you decided in point 3.
6. If you find a valid value (meeting the value-difference and time conditions), you have a tilt. But if you don't get a good value and the time is consumed, reset everything: let your reference value be the last one, reset the timers, reset the half-tilt-done boolean to false, and keep measuring.
I hope this is good enough for you; a rough sketch of this logic in code follows below. For sure you can find some libraries or code snippets to help you out with this, but I think it is good, as you say, to know the logic of inferring the tilt from the readings.
The pictures were taken from this article, which I recommend reading if you want to improve the accuracy and consider 2 or 3 axes for the tilt.
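As promised, a minimal sketch of the half-tilt state machine described in the steps above. The threshold and timing values are made up and need tuning, and samples are assumed to come from the raw accelerometer X axis:
// Hedged sketch of the half-tilt state machine from the steps above.
class TiltDetector {
    private static final float MIN_DELTA = 2.0f; // minimum sensor difference (made up; tune)
    private static final long MAX_MILLIS = 500;  // maximum time for a valid movement (made up; tune)

    private float reference;      // value new samples are compared against
    private long referenceTime;   // when the reference was taken
    private boolean halfTiltDone; // true once the outward half of the tilt was seen
    private float halfTiltSign;   // +1 or -1: direction of the first half

    TiltDetector(float initialX) {
        reset(initialX);
    }

    private void reset(float value) {
        reference = value;
        referenceTime = System.currentTimeMillis();
        halfTiltDone = false;
    }

    /** Feed each X-axis sample; returns true when a full tilt (out and back) is detected. */
    boolean onSample(float x) {
        long now = System.currentTimeMillis();
        float delta = x - reference;
        if (now - referenceTime > MAX_MILLIS) {
            reset(x); // too slow: start over from the current value
            return false;
        }
        if (!halfTiltDone) {
            if (Math.abs(delta) >= MIN_DELTA) {
                halfTiltSign = Math.signum(delta); // half tilt done; remember its direction
                reference = x;                     // the half-tilt value is the new reference
                referenceTime = now;               // restart the timer for the way back
                halfTiltDone = true;
            }
        } else if (Math.signum(delta) == -halfTiltSign && Math.abs(delta) >= MIN_DELTA) {
            reset(x); // movement out and back: a full tilt
            return true;
        }
        return false;
    }
}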
The commonsware Sensor Monitor app does a pretty good job with this. It converts the sensor readouts to X, Y, Z values on each sensor reading, so it's pretty easy from there to determine which way the device is moving.
https://github.com/commonsguy/cw-omnibus/tree/master/Sensor/Monitor
Another item worth noting (from the Commonsware book):
There are four standard delay periods, defined as constants on the SensorManager class:
SENSOR_DELAY_NORMAL, which is what most apps would use for broad changes, such as detecting a screen rotating from portrait to landscape
SENSOR_DELAY_UI, for non-game cases where you want to update the UI continuously based upon sensor readings
SENSOR_DELAY_GAME, which is faster (less delay) than SENSOR_DELAY_UI, to try to drive a higher frame rate
SENSOR_DELAY_FASTEST, which is the “firehose” of sensor readings, without delay
You can use the accelerometer and magnetic field sensor to accomplish this. You can call this method in your onSensorChanged method to detect whether the phone was tilted upwards. This currently only works if the phone is held horizontally. Check the actual blog post for a more complete solution.
http://www.ahotbrew.com/how-to-detect-forward-and-backward-tilt/
public boolean isTiltUpward() {
    if (mGravity != null && mGeomagnetic != null) {
        float[] R = new float[9];
        float[] I = new float[9];
        boolean success = SensorManager.getRotationMatrix(R, I, mGravity, mGeomagnetic);
        if (success) {
            float[] orientation = new float[3];
            SensorManager.getOrientation(R, orientation);
            /*
             * If the roll is positive, you're in reverse landscape (landscape right),
             * and if the roll is negative you're in landscape (landscape left).
             *
             * Similarly, you can use the pitch to differentiate between portrait and
             * reverse portrait. If the pitch is positive, you're in reverse portrait,
             * and if the pitch is negative you're in portrait.
             *
             * orientation -> azimuth, pitch and roll
             */
            pitch = orientation[1];
            roll = orientation[2];
            inclineGravity = mGravity.clone();
            double norm_Of_g = Math.sqrt(inclineGravity[0] * inclineGravity[0]
                    + inclineGravity[1] * inclineGravity[1]
                    + inclineGravity[2] * inclineGravity[2]);
            // Normalize the accelerometer vector
            inclineGravity[0] = (float) (inclineGravity[0] / norm_Of_g);
            inclineGravity[1] = (float) (inclineGravity[1] / norm_Of_g);
            inclineGravity[2] = (float) (inclineGravity[2] / norm_Of_g);
            // Checks if device is flat on the ground or not
            int inclination = (int) Math.round(Math.toDegrees(Math.acos(inclineGravity[2])));
            // Float.compareTo returns > 0, < 0, or 0 depending on the ordering of the two values
            Float objPitch = new Float(pitch);
            Float objZero = new Float(0.0);
            Float objZeroPointTwo = new Float(0.2);
            Float objZeroPointTwoNegative = new Float(-0.2);
            int objPitchZeroResult = objPitch.compareTo(objZero);
            int objPitchZeroPointTwoResult = objZeroPointTwo.compareTo(objPitch);
            int objPitchZeroPointTwoNegativeResult = objPitch.compareTo(objZeroPointTwoNegative);
            return roll < 0
                    && ((objPitchZeroResult > 0 && objPitchZeroPointTwoResult > 0)
                        || (objPitchZeroResult < 0 && objPitchZeroPointTwoNegativeResult > 0))
                    && (inclination > 30 && inclination < 40);
        }
    }
    return false;
}
Is this what you're looking for?
public class AccelerometerHandler implements SensorEventListener {
    float accelX;
    float accelY;
    float accelZ;

    public AccelerometerHandler(Context paramContext) {
        // the original (decompiled) code used the raw constants "sensor", 1 and 1,
        // which correspond to the named constants below
        SensorManager localSensorManager = (SensorManager) paramContext.getSystemService(Context.SENSOR_SERVICE);
        if (localSensorManager.getSensorList(Sensor.TYPE_ACCELEROMETER).size() != 0)
            localSensorManager.registerListener(this,
                    localSensorManager.getSensorList(Sensor.TYPE_ACCELEROMETER).get(0),
                    SensorManager.SENSOR_DELAY_GAME);
    }

    public float getAccelX() {
        return this.accelX;
    }

    public float getAccelY() {
        return this.accelY;
    }

    public float getAccelZ() {
        return this.accelZ;
    }

    public void onAccuracyChanged(Sensor paramSensor, int paramInt) {
    }

    public void onSensorChanged(SensorEvent paramSensorEvent) {
        this.accelX = paramSensorEvent.values[0];
        this.accelY = paramSensorEvent.values[1];
        this.accelZ = paramSensorEvent.values[2];
    }
}

calculating position with the LinearAcceleration

I am trying to calculate a new X position based on Sensor.TYPE_LINEAR_ACCELERATION that I will be applying to my Android game.
I am starting with the following variables, all set to 0:
float newX=0;
float lastX0 =0;
float lastX1 =0;
and my Time Variables set initially like so:
float newTime = System.currentTimeMillis()/1000;
float lastTime0 = lastTime1 = newTime;
Then my onSensorChanged looks like this:
public void onSensorChanged(SensorEvent evt) {
    if (type == Sensor.TYPE_LINEAR_ACCELERATION) {
        newTime = System.currentTimeMillis()/1000;
        float oldDeltaTime = lastTime1 - lastTime0;
        float newDeltaTime = newTime - lastTime1;
        if (oldDeltaTime < 1) oldDeltaTime = 1;
        newX = lastX1 + ((lastX1 - lastX0)/oldDeltaTime)*newDeltaTime + (evt.values[0]/2)*(newDeltaTime*newDeltaTime);
        lastX0 = lastX1;
        lastX1 = newX;
        lastTime0 = lastTime1;
        lastTime1 = newTime;
        Log.v("SENSOR MAN LINEAR", "new X:" + newX);
    }
}
But I am getting 0 all the time for newX in my log.
Has anyone successfully translated LinearAcceleration into position?
Can anyone figure out what I am doing wrong?
I am crying inside!
My experience is that the errors associated with measuring Sensor.TYPE_LINEAR_ACCELERATION make it impossible to get a good estimate of position from Sensor.TYPE_LINEAR_ACCELERATION alone.
I'm not sure why you're getting zero all the time. To debug that, I'd simply Log.v all your variables to work out what the problem is.
I'm also not sure that your formula for newX is making the best use of the sensor values. Acceleration is a change of speed, and your implementation doesn't seem to correspond to that.
Finally, the event values depend on the orientation of the device. So to have any chance of success, you need to combine with compass sensor readings to try and work out how the Sensor.TYPE_LINEAR_ACCELERATION values map to directions in the real world.
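Instead of raw compass readings, one concrete way to do that mapping is the rotation vector sensor: build a rotation matrix from it and rotate each device-frame acceleration into world coordinates. A rough sketch (both sensors are assumed to be registered on the same listener; registration is omitted):
private final float[] rotationMatrix = new float[9];
private final float[] worldAccel = new float[3];

@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR) {
        SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);
    } else if (event.sensor.getType() == Sensor.TYPE_LINEAR_ACCELERATION) {
        // world = R * device: rotate the device-frame vector into the world frame
        float ax = event.values[0], ay = event.values[1], az = event.values[2];
        worldAccel[0] = rotationMatrix[0]*ax + rotationMatrix[1]*ay + rotationMatrix[2]*az;
        worldAccel[1] = rotationMatrix[3]*ax + rotationMatrix[4]*ay + rotationMatrix[5]*az;
        worldAccel[2] = rotationMatrix[6]*ax + rotationMatrix[7]*ay + rotationMatrix[8]*az;
    }
}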

How can I get the direction of movement using an accelerometer?

I'm developing an Android application and I would like to know if it is possible to detect the direction of movement with one axis fixed. For example, I want to put my phone on a table and detect the direction when I move it (left, right, up and down). The distance is not necessary; I just want to know the accurate direction.
Yes.
Using SensorEventListener.onSensorChanged(SensorEvent event) you can determine the values provided along the X and Y axes. You would need to record these values and then compare them to any new values that you receive on subsequent calls to onSensorChanged to get a delta value. If the delta value on one axis is positive then the device is moving one way; if it's negative, it's moving the opposite way.
You will probably need to fine tune both the rate at which you receive accelerometer events and the threshold at which you consider a delta value to indicate a change in direction.
Here's a quick code example of what I'm talking about:
public class AccelerometerExample extends Activity implements SensorEventListener {
    TextView textView;
    StringBuilder builder = new StringBuilder();
    float[] history = new float[2];
    String[] direction = {"NONE", "NONE"};

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        textView = new TextView(this);
        setContentView(textView);
        SensorManager manager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
        Sensor accelerometer = manager.getSensorList(Sensor.TYPE_ACCELEROMETER).get(0);
        manager.registerListener(this, accelerometer, SensorManager.SENSOR_DELAY_GAME);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        float xChange = history[0] - event.values[0];
        float yChange = history[1] - event.values[1];
        history[0] = event.values[0];
        history[1] = event.values[1];
        if (xChange > 2) {
            direction[0] = "LEFT";
        } else if (xChange < -2) {
            direction[0] = "RIGHT";
        }
        if (yChange > 2) {
            direction[1] = "DOWN";
        } else if (yChange < -2) {
            direction[1] = "UP";
        }
        builder.setLength(0);
        builder.append("x: ");
        builder.append(direction[0]);
        builder.append(" y: ");
        builder.append(direction[1]);
        textView.setText(builder.toString());
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
        // nothing to do here
    }
}
This code will only provide you with the general direction of movement on the X and Y axes. To get a more fine-grained determination of direction (e.g. to attempt to mimic the movement of a computer mouse) you might find that a phone's accelerometer is not fit for purpose.
To attempt this, I would first set the sensor delay to SensorManager.SENSOR_DELAY_FASTEST and keep a list of multiple history events, so that I could detect movement over time and not be influenced by the slight movements in the opposite direction that often happen when taking accelerometer measurements at such a fine level. You would also need to measure the amount of time that has passed to help calculate an accurate measure of movement, as suggested in your comments.
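As a sketch of that history idea (the window length of 8 is an arbitrary assumption), averaging the last few deltas suppresses single-sample reversals:
// Hedged sketch: smooth X-axis deltas over a small sliding window.
private static final int WINDOW = 8;          // arbitrary; tune for your sampling rate
private final float[] deltas = new float[WINDOW];
private int index = 0;
private float lastX = 0f;

/** Feed each new X sample; returns the average recent delta (sign = direction). */
private float smoothedDeltaX(float x) {
    deltas[index] = x - lastX;
    index = (index + 1) % WINDOW; // ring buffer of recent deltas
    lastX = x;
    float sum = 0f;
    for (float d : deltas) sum += d;
    return sum / WINDOW;
}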
From what I've read, with the sensors one can detect only accelerations and phone orientation. So you can easily detect the start of the movement (and in which direction) and the stop of the movement, since the velocity is changing, so there is acceleration (when stopping, the acceleration is against the velocity direction).
If the phone is moving with constant velocity, the accelerometer will give zero values (linear acceleration, which subtracts gravity). So in order to know whether the phone is moving, you should compute the velocity at each instant by
V(t) = V(t-1) + a*dt
in which:
V(t-1) is the known velocity at the previous instant,
V(t) is the velocity at the current instant,
a is the acceleration (take the acceleration at the previous instant, or the mean acceleration between the previous and current instant).
The problem is that due to the uncertainty of the sensor values, you might end up summing small errors at each instant, which will lead to erroneous velocity values. You'll probably have to apply a low-pass filter to the values.
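To make that concrete, here is a minimal sketch of the integration with a simple low-pass filter in front of it (the 0.8 smoothing factor is a made-up starting point to tune):
// Hedged sketch: low-pass filter the acceleration, then integrate to velocity.
private static final float SMOOTHING = 0.8f; // made up; closer to 1 = smoother, laggier
private float filteredAccel = 0f;
private float velocityX = 0f;
private long lastNanos = 0;

@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() != Sensor.TYPE_LINEAR_ACCELERATION) return;
    if (lastNanos != 0) {
        float dt = (event.timestamp - lastNanos) * 1e-9f; // nanoseconds to seconds
        // exponential smoothing to damp sensor noise
        filteredAccel = SMOOTHING * filteredAccel + (1 - SMOOTHING) * event.values[0];
        velocityX += filteredAccel * dt;                  // V(t) = V(t-1) + a*dt
    }
    lastNanos = event.timestamp;
}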
