I want to detect whether the user has taken a turn on the road while driving, using the sensors on an Android phone. How do I code this? I am collecting data live from all the sensors (accelerometer, location, rotation, geomagnetic) and storing it on the SD card. So now I just want to know whether the user has taken a turn and in which direction he has turned.
I assume the registration of the sensor is done properly. You can detect the direction by using the orientation sensor (deprecated) as follows:
@Override
public void onSensorChanged(SensorEvent event) {
    float azimuth_angle = event.values[0]; // azimuth in degrees
    int precision = 2;
    if (prevAzimuth - azimuth_angle < -precision) {
        Log.v("->", "RIGHT");
    } else if (prevAzimuth - azimuth_angle > precision) {
        Log.v("<-", "LEFT");
    }
    prevAzimuth = azimuth_angle;
}
Note: "prevAzimuth" is declared as a field (global to the class). You can change the "precision" value to whatever you want. We need this value because we do not want output after every trivial change in the azimuth angle. However, too large a precision value gives imprecise results. For me, "2" is optimal.
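One caveat the snippet above doesn't handle: the azimuth wraps around at 360°, so a turn across North (359° to 1°) is misread as a huge swing in the opposite direction. A minimal sketch of a fix, assuming azimuth in degrees (the helper name is mine, not from the answer):

// Hypothetical helper: normalize the azimuth difference to (-180, 180]
// so the 359 -> 1 degree wrap-around reads as a small right turn,
// not a large left one.
private float azimuthDelta(float prev, float current) {
    float delta = current - prev;
    if (delta > 180f) delta -= 360f;
    else if (delta < -180f) delta += 360f;
    return delta;
}

You would then compare azimuthDelta(prevAzimuth, azimuth_angle) against precision instead of the raw difference.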
If you are tracking location coordinates, you can also track the shift in angle between successive locations.
angle = atan2(Y2 - Y1, X2 - X1) * 180 / PI
(atan2 rather than a plain arctan of the quotient, so the quadrant of the heading is preserved)
See this answer for calculating x and y.
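As a concrete illustration of the location-based approach, here is a minimal sketch (not from the original answer) that detects a turn from consecutive GPS fixes using Android's Location.bearingTo(), which implements the atan2-based formula above; the 30° threshold is a guess to tune:

import android.location.Location;
import android.util.Log;

private Location prevFix;
private float prevBearing = Float.NaN;
private static final float TURN_THRESHOLD_DEG = 30f; // hypothetical

void onLocationChanged(Location fix) {
    if (prevFix != null) {
        float bearing = prevFix.bearingTo(fix); // degrees East of true North
        if (!Float.isNaN(prevBearing)) {
            float delta = bearing - prevBearing;
            // normalize to (-180, 180] to survive the North wrap-around
            if (delta > 180f) delta -= 360f;
            if (delta < -180f) delta += 360f;
            if (delta > TURN_THRESHOLD_DEG) Log.v("->", "RIGHT");
            else if (delta < -TURN_THRESHOLD_DEG) Log.v("<-", "LEFT");
        }
        prevBearing = bearing;
    }
    prevFix = fix;
}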
Note that the decision to use sensor values rests on the assumption that the device is never rotated with respect to the vehicle, which is unrealistic in practice.
Related
I am using the code below to identify movement of the device, i.e. I would like to know whether the device is moving or not. I also use the Google Activity APIs, which provide different activity modes like WALKING, ON_FOOT, STILL, etc. without using GPS. I would like to achieve the same with sensors, but I am not able to get it accurate.
The issue with the following code is that as soon as I move the device quickly, for example picking it up from the table, I get the result "moving", even though it is not actually moving.
// called from onSensorChanged using the TYPE_ACCELEROMETER sensor
double speed = getAccelerometer(event.values);
// then checking the speed
if (speed > 0.9 && speed < 1.1) {
    // device is not moving
} else {
    // device is moving
}
/**
 * @return the magnitude of the acceleration vector, in units of g
 */
private double getAccelerometer(float[] values) {
    float x = values[0];
    float y = values[1];
    float z = values[2];
    // squared magnitude of the acceleration vector, normalized by g^2
    float accelerationSquareRoot =
            (float) ((x * x + y * y + z * z) / (9.80665 * 9.80665));
    return Math.sqrt(accelerationSquareRoot); // ~1.0 when at rest
}
Can anyone guide me on how to make this logic accurate, so that I can identify whether the device is moving or not?
The accelerometer returns acceleration data, and according to Newton's laws of motion, if the measured acceleration stays constant (just gravity, with no extra force), the body is either at rest or moving with constant velocity (the latter is practically impossible in your case).
Therefore, if you keep reading (nearly) the same data on all three axes over time, i.e. values confined to a fairly strict range, the phone is not moving; otherwise it is.
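A minimal sketch of that idea, assuming a ring buffer of recent magnitude readings; the window size and the 0.1 m/s² tolerance are guesses to tune against recorded data, not a definitive implementation:

private static final int WINDOW = 50;
private final float[] window = new float[WINDOW];
private int index = 0, filled = 0;

boolean isStationary(float x, float y, float z) {
    window[index] = (float) Math.sqrt(x * x + y * y + z * z);
    index = (index + 1) % WINDOW;
    if (filled < WINDOW) { filled++; return false; } // not enough data yet
    float min = Float.MAX_VALUE, max = -Float.MAX_VALUE;
    for (float m : window) {
        if (m < min) min = m;
        if (m > max) max = m;
    }
    // readings confined to a narrow band around 1 g => device at rest
    // (or in unaccelerated motion, which a handheld phone rarely is)
    return (max - min) < 0.1f;
}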
For this purpose you should use the Activity Recognition API, which delivers events such as still, walking, in-vehicle, etc. Activity recognition uses sensor data, with help from the location service when it is running. For more on what it is and how to use it, see the link below:
https://developers.google.com/location-context/activity-recognition/
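For example, a minimal registration sketch using the Play Services client (DetectionService is a hypothetical IntentService of your own; on newer Android versions the activity-recognition runtime permission is also required):

import android.app.PendingIntent;
import android.content.Context;
import android.content.Intent;
import com.google.android.gms.location.ActivityRecognition;

void startActivityUpdates(Context context) {
    Intent intent = new Intent(context, DetectionService.class);
    PendingIntent pi = PendingIntent.getService(
            context, 0, intent, PendingIntent.FLAG_UPDATE_CURRENT);
    // deliver detected activities to the PendingIntent roughly every 3 s
    ActivityRecognition.getClient(context).requestActivityUpdates(3000, pi);
}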
I am currently implementing a speedometer by receiving orientation data from my phone. I am using:
SensorManager.getRotationMatrix(R, I, gravity, geomagnetic);
float orientation[] = new float[3];
SensorManager.getOrientation(R, orientation);
float azimuth = orientation[0];
double azimuthD = Math.toDegrees(azimuth);
if(azimuthD < 0) azimuthD = 360 + azimuthD;
With this I am able to receive the rotation data from my phone, such as the azimuth.
Anyway, this works fine while the device is placed on a table or similar. But when it rotates around a point (in my case the device is fixed to a wheel rotating at a certain speed), the values are far from accurate. I believe that, since I am using the gravity and geomagnetic sensors, the forces involved in the rotation interfere with those sensors. As the wheel turns, the orientation relative to a fixed point changes, but the device's own local rotation stays the same.
How can I access the orientation of the device while it's turning without running into a lot of noisy data?
I read a bit about the Sensor.TYPE_ROTATION_VECTOR property, but couldn't quite figure out how it works. I also read about the possibility of remapping the coordinate system, but how is that supposed to help, given that my phone is always roughly vertical to the floor, off by an angle of only 5°-10°?
I would appreciate any help.
Cheers,
viehlieb
I guess I found my answer.
The solution was to throw away all the code I posted above and use the gyroscope instead.
The gyroscope measures the angular velocity of the device's rotation, in the device's own coordinate system. In my case the relevant value was the rotation around the z-axis.
Values are in radians per second, which can be mapped to m/s if you factor in the wheel's circumference. So the trick was in the onSensorChanged method:
if (sensorEvent.sensor.getType() == Sensor.TYPE_GYROSCOPE) {
    gyroscope = sensorEvent.values;
    double rotZ = gyroscope[2];            // angular velocity around z, rad/s
    double degrees = Math.toDegrees(rotZ); // degrees per second
    // degrees/360 = revolutions per second; times the 2.23 m circumference
    // gives m/s; times 3.6 converts to km/h
    float speed = (float) degrees / 360.0f * 2.23f * 3.6f;
}
If you'd like more stable values, you can store them in an array and calculate the average. Remember to clear the array every 20th call (or so) to onSensorChanged. With SENSOR_DELAY_GAME registered, there is enough data to average over.
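A minimal sketch of that averaging, with illustrative names (publishSpeed is a hypothetical callback, not from the answer):

private final float[] samples = new float[20];
private int count = 0;

void onSpeedSample(float speed) {
    samples[count++] = speed;
    if (count == samples.length) {
        float sum = 0f;
        for (float s : samples) sum += s;
        float average = sum / samples.length;
        count = 0;                 // "clear the array" every 20th call
        publishSpeed(average);     // hypothetical callback
    }
}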
I want to implement a fairly simple fall detection algorithm in my application. At the moment, in onSensorChanged(), I take the absolute value of the current x, y, z magnitude and subtract SensorManager.GRAVITY_EARTH (about 9.8 m/s²) from it. The resulting value has to exceed a threshold 10 times in a row to set a flag saying a fall has been detected by the accelerometer; the threshold is about 8 m/s².
I also compare the orientation of the phone as soon as the threshold is first exceeded with its orientation once the threshold is no longer exceeded; this sets another flag saying the orientation sensor has detected a fall.
When both flags are set, an event occurs to check whether the user is OK, etc. My problem is with the threshold: when the phone is held straight up, the absolute value from the accelerometer is about 9.8 m/s², but when I hold it still at an angle it can read over 15 m/s². This lets other events trigger the fall detection, and if I increase the threshold to avoid that, it won't detect falls.
Can anyone give me some advice on what values I should use, or how to improve my method? Many thanks.
First, I want to remind you that you cannot just add the x, y, z values together as they are; you have to use vector mathematics. This is why you get values over 15 m/s². As long as the phone is not moving, the vector sum should always be about 9.8 m/s². You calculate it as SQRT(x*x + y*y + z*z). If you need more information, http://en.wikipedia.org/wiki/Euclidean_vector#Length is a good start.
I also suggest another algorithm: in free fall, all three x, y, z values of the accelerometer should be near zero. (At least, that's what I learned in physics classes long ago in school.) So maybe you can use a rule like: if the vector sum of x, y, z is <= 3 m/s², detect a free fall; and if the vector sum then rises above 20 m/s², detect the landing.
Those thresholds are just a wild guess. Maybe record the x, y, z values in a test application, move the phone around, and then analyze offline how the values (and their vector sum) behave, to get a feeling for which thresholds are sensible.
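A minimal sketch of that two-phase rule, using the guessed thresholds above (the handler name and all numbers are illustrative, not a definitive implementation):

private boolean inFreeFall = false;

void onAccelerometer(float x, float y, float z) {
    // vector sum of the axes, in m/s^2 (about 9.8 when the phone is at rest)
    double magnitude = Math.sqrt(x * x + y * y + z * z);
    if (magnitude <= 3.0) {
        inFreeFall = true;        // near 0 g: the phone may be falling
    } else if (inFreeFall && magnitude >= 20.0) {
        inFreeFall = false;
        onFallDetected();         // hypothetical handler: check on the user
    }
    // a real detector would also clear inFreeFall after a timeout, since
    // a landing must follow the free-fall phase within a second or so
}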
I have actually published a paper on this issue. Please feel free to check out "iFall" at ww2.cs.fsu.edu/~sposaro.
We basically take the root sum of squares and look for three things:
1. Lower threshold broken, i.e. falling.
2. Upper threshold broken, i.e. hitting the ground.
3. Flatline around 1 g, i.e. a "long lie": lying on the ground for an extended period of time.
I forgot to update this thread, but iFall is now available on the Android Market.
Also check out ww2.cs.fsu.edu/~sposaro/iFall for more information
It's possible using the accelerometer sensor.
Write this in the sensor-changed listener:
if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
    long curTime = System.currentTimeMillis();
    // only allow one update every 100 ms
    if ((curTime - lastUpdate) > 100) {
        long diffTime = (curTime - lastUpdate);
        lastUpdate = curTime;
        x = event.values[0];
        y = event.values[1];
        z = event.values[2];
        // change in the summed axes per unit time, scaled up
        float speed = Math.abs(x + y + z - last_x - last_y - last_z) / diffTime * 10000;
        Log.d("getShakeDetection", "speed: " + speed);
        if (speed > DashplexManager.getInstance().SHAKE_THRESHOLD) {
            result = true;
        }
        last_x = x;
        last_y = y;
        last_z = z;
    }
}
I am working on a project which includes an Android application that is used for controlling/steering.
Speed: when you tilt the phone forward/backward (pitch), it simulates giving gas and braking.
Direction: when you tilt the phone left/right (roll), it simulates steering to the left and right.
I have already written some code which seemed to work fine. But when I took a closer look, I found that some values were acting weird.
When I tilt the phone forward/backward to control the speed, it works perfectly; I get the expected speed and direction values. But when I tilt the phone left/right to control the direction, some values seem to get corrupted: tilting left/right not only changes the direction value (roll) but also affects the speed value (pitch).
For extra information:
Programming for Android 2.2
Device is an Google Nexus One
Holding the device in portrait
The most relevant code I use to read the sensor values is as follows:
public void onSensorChanged(SensorEvent sensorEvent) {
    synchronized (this) {
        if (sensorEvent.sensor.getType() == Sensor.TYPE_ORIENTATION) {
            float azimuth = sensorEvent.values[0]; // rotation around the z-axis
            float pitch = sensorEvent.values[1];   // rotation around the x-axis
            float roll = sensorEvent.values[2];    // rotation around the y-axis
            System.out.println("pitch: " + pitch);
            System.out.println("roll: " + roll);
            System.out.println("--------------------");
            // convert the sensor values to the actual speed and direction values
            float speed = (pitch * 2.222f) + 200;
            float direction = roll * -2.222f;
        }
    }
}
So when I run the code and look at the printed values, tilting the device left/right seems to affect the pitch value as well. How come? And how can I get a pure pitch value while rolling, so that tilting the phone left/right doesn't corrupt it?
You could read up on gimbal lock; that's bitten me before. With Euler angles (azimuth/pitch/roll), the three axes are not independent: near certain orientations, a rotation about one axis bleeds into the readings of the others.
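One common way to reduce the interdependence is to skip TYPE_ORIENTATION and extract angles from a (possibly remapped) rotation matrix yourself. A sketch under that assumption, not the answerer's code; gravity and geomagnetic are the latest accelerometer and magnetometer readings:

float[] inR = new float[9];
float[] outR = new float[9];
float[] orientationVals = new float[3];
if (SensorManager.getRotationMatrix(inR, null, gravity, geomagnetic)) {
    // remap for a device held upright rather than flat on a table
    SensorManager.remapCoordinateSystem(inR,
            SensorManager.AXIS_X, SensorManager.AXIS_Z, outR);
    SensorManager.getOrientation(outR, orientationVals);
    float pitch = (float) Math.toDegrees(orientationVals[1]);
    float roll = (float) Math.toDegrees(orientationVals[2]);
}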
I'm writing an application and my aim is to detect when a user is walking.
I'm filtering out gravity with a simple low-pass filter, like this:
float kFilteringFactor=0.6f;
gravity[0] = (accelerometer_values[0] * kFilteringFactor) + (gravity[0] * (1.0f - kFilteringFactor));
gravity[1] = (accelerometer_values[1] * kFilteringFactor) + (gravity[1] * (1.0f - kFilteringFactor));
gravity[2] = (accelerometer_values[2] * kFilteringFactor) + (gravity[2] * (1.0f - kFilteringFactor));
linear_acceleration[0] = (accelerometer_values[0] - gravity[0]);
linear_acceleration[1] = (accelerometer_values[1] - gravity[1]);
linear_acceleration[2] = (accelerometer_values[2] - gravity[2]);
float magnitude = (float) Math.sqrt(
        linear_acceleration[0] * linear_acceleration[0]
      + linear_acceleration[1] * linear_acceleration[1]
      + linear_acceleration[2] * linear_acceleration[2]);
if (magnitude > 0.2f) {
    // walking
}
The array gravity[] is initialized with 0s.
I can detect whether a user is walking by looking at the magnitude of the linear-acceleration vector, but my problem is that when a user is not walking and just moves the phone, it looks as if he is walking.
Am I using the right filter?
Is it right to watch only the magnitude of the vector, or do I have to look at the individual components?
Google provides an API for this called DetectedActivity that can be obtained using the ActivityRecognitionApi. Those docs can be accessed here and here.
DetectedActivity has the method public int getType() to get the current activity of the user and also public int getConfidence() which returns a value from 0 to 100. The higher the value returned by getConfidence(), the more certain the API is that the user is performing the returned activity.
Here is a summary of the constants returned by getType():
int IN_VEHICLE: The device is in a vehicle, such as a car.
int ON_BICYCLE: The device is on a bicycle.
int ON_FOOT: The device is on a user who is walking or running.
int RUNNING: The device is on a user who is running.
int STILL: The device is still (not moving).
int TILTING: The device angle relative to gravity changed significantly.
int UNKNOWN: Unable to detect the current activity.
int WALKING: The device is on a user who is walking.
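A minimal sketch of reading the result inside the component that receives the API's PendingIntent (the 75 confidence cut-off is an arbitrary choice, not part of the API):

import android.content.Intent;
import com.google.android.gms.location.ActivityRecognitionResult;
import com.google.android.gms.location.DetectedActivity;

void handleDetectionIntent(Intent intent) {
    if (!ActivityRecognitionResult.hasResult(intent)) return;
    ActivityRecognitionResult result = ActivityRecognitionResult.extractResult(intent);
    DetectedActivity activity = result.getMostProbableActivity();
    if (activity.getType() == DetectedActivity.WALKING
            && activity.getConfidence() >= 75) {
        // the user is most likely walking
    }
}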
My first intuition would be to run an FFT analysis on the sensor history, and see what frequencies have high magnitudes when walking.
It's essentially seeing what walking "sounds like": treating the accelerometer inputs like a microphone and finding the frequencies that are loud when walking (in other words, at what frequency the biggest acceleration happens).
I'd guess you'd be looking for a high magnitude at some low frequency (like footstep rate) or maybe something else. It would be interesting to see the data.
My guess is you run the FFT and look for the magnitude at some frequency to exceed a threshold, or for the difference between the magnitudes of two frequencies to exceed some amount. Again, the actual data would determine how you detect it.
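A sketch of the frequency idea (not from the answer): a naive DFT over a buffer of acceleration magnitudes, summing spectral energy in the typical step-rate band. The sample rate and the 1-3 Hz band are assumptions to tune against real recordings:

// returns the spectral energy of the samples between loHz and hiHz
static double bandEnergy(float[] samples, double sampleRateHz,
                         double loHz, double hiHz) {
    int n = samples.length;
    double energy = 0.0;
    for (int k = 1; k < n / 2; k++) {            // skip DC, stop at Nyquist
        double freq = k * sampleRateHz / n;      // frequency of bin k, in Hz
        if (freq < loHz || freq > hiHz) continue;
        double re = 0.0, im = 0.0;
        for (int t = 0; t < n; t++) {            // correlate with bin k
            double angle = 2.0 * Math.PI * k * t / n;
            re += samples[t] * Math.cos(angle);
            im -= samples[t] * Math.sin(angle);
        }
        energy += (re * re + im * im) / n;       // bin power
    }
    return energy;
}

Walking could then be flagged when, say, bandEnergy(buffer, 50, 1, 3) dominates the total energy; the real thresholds have to come from recorded data, as the answer says.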
For walking detection I use the derivative of the smoothed accelerometer signal. When the derivative exceeds a threshold value, I take it as a step. I guess that's not best practice, though, and it only works when the phone is placed in a trouser pocket.
The following code was used in this app https://play.google.com/store/apps/details?id=com.tartakynov.robotnoise
@Override
public void onSensorChanged(SensorEvent event) {
if (event.sensor.getType() != Sensor.TYPE_ACCELEROMETER){
return;
}
final float z = smooth(event.values[2]); // scalar kalman filter
if (Math.abs(z - mLastZ) > LEG_THRSHOLD_AMPLITUDE)
{
mInactivityCount = 0;
int currentActivity = (z > mLastZ) ? LEG_MOVEMENT_FORWARD : LEG_MOVEMENT_BACKWARD;
if (currentActivity != mLastActivity){
mLastActivity = currentActivity;
notifyListeners(currentActivity);
}
} else {
if (mInactivityCount > LEG_THRSHOLD_INACTIVITY) {
if (mLastActivity != LEG_MOVEMENT_NONE){
mLastActivity = LEG_MOVEMENT_NONE;
notifyListeners(LEG_MOVEMENT_NONE);
}
} else {
mInactivityCount++;
}
}
mLastZ = z;
}
EDIT: I don't think it's accurate enough, since when walking normally the average acceleration is near 0. The most you can do by measuring acceleration is detect when someone starts or stops walking (but, as you said, it's difficult to distinguish that from the device being moved by someone standing in one place).
So... what I wrote earlier probably wouldn't work anyway:
You can "predict" whether the user is moving by discarding the cases where the user is clearly not moving. The first two options that come to mind are:
Check whether the phone is "hidden", using the proximity sensor and (optionally) the light sensor. This method is less accurate but easier.
Check the continuity of the movement: if the phone has been moving for more than, say, 10 seconds and the movement is not negligible, consider the user to be walking. I know it's not perfect either, but it's difficult without using any kind of positioning. By the way, why don't you just use the LocationManager?
Try detecting the up-and-down oscillations and the fore-and-aft oscillations, plus the frequency of each, and make sure they stay aligned within bounds on average. That way you detect walking, and specifically that person's gait style, which should remain relatively constant over several consecutive steps to qualify as moving.
As long as the last three oscillations line up within reason, conclude that walking is occurring, provided this also holds:
Measure horizontal acceleration and update a velocity value with it (sketched below). The velocity will drift with time, so keep a moving average of velocity smoothed over the duration of a step. As long as it doesn't drift by more than, say, half of walking speed per three oscillations, it's walking, but only if it initially rose to walking speed within a short time, i.e. half a second or perhaps two oscillations.
All of that should just about cover it.
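A rough sketch of the velocity bookkeeping described above; the decay factor and smoothing constant are guesses under the stated assumptions, not a definitive implementation:

private float velocity = 0f;          // integrated horizontal velocity, m/s
private float smoothedVelocity = 0f;  // moving average over roughly one step

void onHorizontalAccel(float accel, float dtSeconds) {
    velocity += accel * dtSeconds;    // integrate acceleration
    velocity *= 0.98f;                // leaky integration to cap sensor drift
    float alpha = 0.1f;               // smoothing constant, ~one step at 50 Hz
    smoothedVelocity = alpha * velocity + (1f - alpha) * smoothedVelocity;
    // treat as "walking" while smoothedVelocity stays near typical walking
    // speed (~1.4 m/s) and reached it within about half a second of starting
}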
Of course, a little AI would help make things simpler (or just as complex, but amazingly accurate) if you fed all of these as inputs to a neural network, i.e. as preprocessing.