Video explaining it for those who do not understand
THIS QUESTION HAS NOT BEEN ANSWERED CORRECTLY; PLEASE TRY TO ANSWER IT WITH ANOTHER SOLUTION (the 100 bounty is out of date)
Same question but better explained
This question was accepted as correct, but it is not at all. I tried it with my old ZTE device and it worked most of the time, but now I have a Samsung Galaxy A5 2016 and it doesn't work, nor does it on an LG G3.
The thing is, using the accelerometer and some other sensors, I have to be able to detect either of the two movements that I made in the video.
There are two movements:
Smashing it (with a little bit of velocity)
Free fall
I'll let you decide and convince me which is the better and EASIER option; by better I mean one that works on most devices.
A stationary device will have a gravity norm of +9.81, which corresponds to the acceleration of the device (0 m/s²) minus the force of gravity (-9.81 m/s²). Thus if the device starts moving downward from stationary, the gravity norm will be less than 9.81. A device in free fall will have a gravity norm of 0.
Below is how to determine if the device starts moving downward. It will not be able to determine whether the device is moving downward if the device is already moving downward with constant speed, since in this case there is no acceleration downward and the gravity norm should be around 9.81.
You need to use TYPE_GRAVITY. If the device does not have TYPE_GRAVITY, then low pass filter TYPE_ACCELEROMETER to get the gravity vector.
As above, a stationary device will have a gravity vector with norm equal to 9.81. However, this value varies slightly between devices, so you first need to determine this stationary gravity norm. You can do this by registering for TYPE_GRAVITY or TYPE_ACCELEROMETER and asking the user to lay the device flat and then press a button. Once the button is pressed, the app calculates the norm of the gravity vector in onSensorChanged.
private float mStationaryGravityNorm;
private float mDeviation = 0.01f;
private int mCount;
private boolean mIsCalculatingStationaryGravityNorm = true;

Button button = findViewById(R.id.button);
button.setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View v) {
        // register the sensor listener here
    }
});

@Override
public void onSensorChanged(SensorEvent event) {
    float norm = (float) Math.sqrt(event.values[0] * event.values[0]
            + event.values[1] * event.values[1]
            + event.values[2] * event.values[2]);
    // Average 100 readings to obtain the stationary gravity norm.
    if (mIsCalculatingStationaryGravityNorm) {
        if (mCount++ < 100) {
            mStationaryGravityNorm += norm;
        } else {
            mStationaryGravityNorm /= 100;
            mIsCalculatingStationaryGravityNorm = false;
        }
    } else if (norm < mStationaryGravityNorm - mDeviation) {
        // moving down
    }
}
PS: For moving up and down you do want to calculate gravity. When the device is stationary, the gravity norm is approximately 9.81 (depending on the device). If the device is moving down, there is a downward acceleration, so the gravity norm will be less than 9.81; if the device is moving up, the gravity norm will be more than 9.81. So by comparing the gravity norm against the stationary gravity norm, you will know whether the device is moving up or down. This is independent of the device orientation. TYPE_GRAVITY will give better accuracy, but if the device does not have this type then low-pass filtering TYPE_ACCELEROMETER will give you the gravity vector.
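Here is a minimal sketch of such a low-pass filter (not from the original answer), assuming you feed it event.values from TYPE_ACCELEROMETER; the ALPHA smoothing factor is an assumption and should be tuned to your sampling rate.

private static final float ALPHA = 0.8f; // assumed smoothing factor, tune to your sample rate
private final float[] mGravity = new float[3];

// Call with event.values from TYPE_ACCELEROMETER to approximate the gravity vector.
private float[] lowPass(float[] accelerometerValues) {
    for (int i = 0; i < 3; i++) {
        mGravity[i] = ALPHA * mGravity[i] + (1 - ALPHA) * accelerometerValues[i];
    }
    return mGravity;
}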
If you want to see if the device is in free fall, you should check whether the norm is close to zero.
http://developer.android.com/guide/topics/sensors/sensors_motion.html
@Override
public void onSensorChanged(SensorEvent event) {
    double norm = Math.sqrt(Math.pow(event.values[0], 2)
            + Math.pow(event.values[1], 2)
            + Math.pow(event.values[2], 2));
    // In free fall the norm is near zero (it can never be negative),
    // so compare against a small threshold instead of 0.
    if (norm < 1.0) {
        // free fall detected
    }
}
Related
I want to detect if the user has taken a turn on the road while driving, using the sensors on the Android phone. How do I code this? I am collecting data live from all the sensors (accelerometer, location, rotation, geomagnetic) and storing them on the SD card. So now I just want to know whether the user has taken a turn and in which direction he has turned.
I assume the registration of the sensor is done properly. You can detect the direction by using the orientation sensor (deprecated) as follows:
@Override
public void onSensorChanged(SensorEvent event) {
float azimuth_angle = event.values[0];
int precision = 2;
if (prevAzimuth - azimuth_angle < precision * -1)
Log.v("->", "RIGHT");
else if (prevAzimuth - azimuth_angle > precision)
Log.v("<-", "LEFT");
prevAzimuth = azimuth_angle;
}
Note: the variable "prevAzimuth" is declared as a class-level (global) variable. You can change the "precision" value to whatever you want. We need it because we do not want output after every trivial change in the azimuth angle. However, too large a precision gives imprecise results. For me, "2" is optimal.
If you are tracking location coordinates, you can also track shifts between the angle from previous locations.
angle = arctan((Y2 - Y1) / (X2 - X1)) * 180 / PI
See this answer for calculating x and y.
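As a rough illustration of that formula (not from the original answer), here is a sketch that computes the heading between two consecutive points and classifies the turn. Math.atan2 is used to avoid division by zero, TURN_THRESHOLD_DEG is an assumed value, and the sketch uses mathematical angles (counterclockwise from the x axis), not compass azimuth.

static final double TURN_THRESHOLD_DEG = 30.0; // assumed threshold, not a tuned value

// Heading of the segment (x1, y1) -> (x2, y2) in degrees, counterclockwise from the x axis.
static double headingDeg(double x1, double y1, double x2, double y2) {
    return Math.toDegrees(Math.atan2(y2 - y1, x2 - x1));
}

// With x = east and y = north, a positive heading change is a left (counterclockwise) turn.
static String turnDirection(double prevHeading, double newHeading) {
    double delta = newHeading - prevHeading;
    while (delta > 180) delta -= 360;   // normalize to (-180, 180]
    while (delta <= -180) delta += 360; // so wrap-around at +/-180 degrees is handled
    if (delta > TURN_THRESHOLD_DEG) return "LEFT";
    if (delta < -TURN_THRESHOLD_DEG) return "RIGHT";
    return "STRAIGHT";
}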
The decision to use sensor values is based on the unrealistic assumption that the device is never rotated with respect to the vehicle.
The values I'm getting for accel, x, y and z below are not as expected.
It seems to be acting as a tilt sensor rather than an accelerometer.
When I throw the phone in the air and catch it, the accel value doesn't change by more than about 10%. Contrast this to when I rotate the phone randomly, I get much larger variations of 50-100%!
What could explain this? I simply want to detect when the phone is in freefall, (and/or impacting something).
SensorManager sm = (SensorManager)getSystemService(SENSOR_SERVICE);
Sensor sensor = sm.getDefaultSensor(SensorManager.SENSOR_ACCELEROMETER);
sm.registerListener(
sel = new SensorEventListener(){
@Override public void onAccuracyChanged(Sensor arg0, int arg1) {}
@Override public void onSensorChanged(SensorEvent arg0) {
double x = se.values[0];
double y = se.values[1];
double z = se.values[2];
double accel = Math.sqrt(
Math.pow(x, 2) +
Math.pow(y, 2) +
Math.pow(z, 2));
}
},
sensor,
SensorManager.SENSOR_DELAY_NORMAL
);
(As a side question, the values for x, y and z seem much higher than they should be, with accel averaging at about 50-80, when standing still? Shouldn't it be around 9.8?)
The x, y and z values seem very sensitive to changes in the orientation of the phone, but not at all representative of acceleration. Am I missing something??
Example values with phone still, lying on back:
Accel = 85.36, x = 6.8, y = 45.25 z = 30.125
I had to replace
Sensor sensor = sm.getDefaultSensor(SensorManager.SENSOR_ACCELEROMETER);
with
Sensor sensor = sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
It could be because when you throw the phone it stays in almost the same plane or angle and moves at low speed, whereas when you turn the phone it changes its course and orientation rapidly, and that gives higher values in the result. "Accelerometer" may be a misnomer for a multi-function device; there could be a selection parameter for the function you really want results from.
With the phone lying on its back you should get close to zero on the X and Y sensors and about 9.8 on the Z sensor. The 9.8 is of course due to gravity.
My first thought would be that there is something wrong with the phone, and I would suggest trying the same code on another phone.
However, I notice that there is something wrong in the math, but I haven't figured out what yet.
With x, y, z having the values you mention, the resultant (square root of the sum of squares) works out to 54.78 rather than the 85.36 you mention in your post.
I'm quite new to Java so I cannot easily spot what might be wrong and haven't had the opportunity yet to try that piece of code on my phone, but I think the math is simple enough for me to determine that the result is wrong. (or at least I hope so).
The other thing to check (assuming you figure out the math problem) is that the small change when you throw the phone in the air might simply be due to the slow response time. The accelerometer output may simply be changing too slowly, so by the time the phone has landed the output wouldn't have changed that much. The response can be improved by using SENSOR_DELAY_GAME or SENSOR_DELAY_FASTEST instead of normal.
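For instance, a minimal sketch of registering with a faster delay, assuming `listener` is the SensorEventListener from the question:

SensorManager sm = (SensorManager) getSystemService(SENSOR_SERVICE);
Sensor accel = sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
// SENSOR_DELAY_GAME delivers events noticeably faster than SENSOR_DELAY_NORMAL
sm.registerListener(listener, accel, SensorManager.SENSOR_DELAY_GAME);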
By the way, shouldn't that be arg0.values[] rather than se.values[]? Where does the se come from? The sensor values go into the argument of the onSensorChanged (arg0 in this case) so I cannot figure out how they are supposed to end up in se. (But then again there are many things in Java I still don't understand)
I want to be able to feature a fairly simple fall detection algorithm in my application. At the moment in onSensorChanged(), I am getting the absolute value of the current x, y, z values and subtracting SensorManager.GRAVITY_EARTH (9.8 m/s²) from this. The resulting value has to be bigger than a threshold value 10 times in a row to set a flag saying a fall has been detected by the accelerometer; the threshold value is about 8 m/s².
Also, I'm comparing the orientation of the phone as soon as the threshold has been passed with its orientation when the threshold is no longer being passed; this sets another flag saying the orientation sensor has detected a fall.
When both flags are set, an event occurs to check if the user is OK, etc. My problem is with the threshold: when the phone is held straight up the absolute value of the accelerometer is about 9.8 m/s², but when I hold it still at an angle it can be over 15 m/s². This is causing other events to trigger the fall detection, and if I increase the threshold to avoid that, it won't detect falls.
Can anyone give me some advice here with what possible values i should use or how to even improve my method? Many thanks.
First, I want to remind you that you cannot just add the x, y, z values together as they are; you have to use vector mathematics. This is why you get values of over 15 m/s². As long as the phone is not moving, the vector sum should always be about 9.8 m/s². You calculate it using SQRT(x*x + y*y + z*z). If you need more information, you can read about vector mathematics; maybe http://en.wikipedia.org/wiki/Euclidean_vector#Length is a good start.
I also suggest another algorithm: in free fall, all three of the x, y, z values of the accelerometer should be near zero. (At least, that's what I learned in physics classes a long time ago in school.) So maybe you can use a rule like: if the vector sum of x, y, z <= 3 m/s², then you detect a free fall. And if the vector sum then rises to a value over 20 m/s², then you detect the landing.
Those thresholds are just a wild guess. Maybe you just record the x, y, z values in a test application, then move the phone around, and then analyze offline how the values (and their norm and vector sum) behave to get a feeling for which thresholds are sensible.
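As a rough, untested sketch of that idea (the 3 m/s² and 20 m/s² thresholds are the wild guesses from above, and onFallDetected() is a hypothetical callback in your app, not a framework method):

private boolean mInFreeFall = false;

@Override
public void onSensorChanged(SensorEvent event) {
    float x = event.values[0], y = event.values[1], z = event.values[2];
    double magnitude = Math.sqrt(x * x + y * y + z * z);
    if (magnitude < 3.0) {
        mInFreeFall = true;               // near weightless: the device is falling
    } else if (mInFreeFall && magnitude > 20.0) {
        mInFreeFall = false;
        onFallDetected();                 // hypothetical callback in your app
    }
}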
I have actually published a paper on this issue. Please feel free to check out "iFall" at ww2.cs.fsu.edu/~sposaro
We basically take the root sum of squares and look for 3 things (a rough sketch follows the list):
1. Lower threshold broken, i.e. falling
2. Upper threshold broken, i.e. hitting the ground
3. Flatline around 1 g, i.e. a "long lie", lying on the ground for an extended period of time
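The third check is the one the earlier sketch does not cover, so here is a rough sketch of a "long lie" detector: after an impact, the magnitude should stay close to 1 g with little variation for a while. LONG_LIE_MS and the 1.5 m/s² band are assumptions, not values from the paper.

private static final long LONG_LIE_MS = 10000;  // assumed duration for a "long lie"
private long mFlatSince = -1;

// Feed the root sum of squares and the current time in milliseconds;
// returns true once the magnitude has stayed near 1 g long enough.
private boolean checkLongLie(double magnitude, long nowMs) {
    boolean nearOneG = Math.abs(magnitude - SensorManager.GRAVITY_EARTH) < 1.5;
    if (!nearOneG) {
        mFlatSince = -1;                  // any movement resets the timer
        return false;
    }
    if (mFlatSince < 0) mFlatSince = nowMs;
    return nowMs - mFlatSince >= LONG_LIE_MS;
}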
I forgot to update this thread, but iFall is now available on the Android Market.
Also check out ww2.cs.fsu.edu/~sposaro/iFall for more information
It's possible using the accelerometer sensor.
Write this in the sensor-changed listener:
if (sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
long curTime = System.currentTimeMillis();
// only allow one update every 100ms.
if ((curTime - lastUpdate) > 100) {
long diffTime = (curTime - lastUpdate);
lastUpdate = curTime;
x = values[SensorManager.DATA_X];
y = values[SensorManager.DATA_Y];
z = values[SensorManager.DATA_Z];
float speed = Math.abs(x + y + z - last_x - last_y - last_z) / diffTime * 10000;
Log.d("getShakeDetection", "speed: " + speed);
if (speed > DashplexManager.getInstance().SHAKE_THRESHOLD) {
result = true;
}
last_x = x;
last_y = y;
last_z = z;
}
}
I am working on a project which includes an Android application which is used for
controlling/steering.
Speed: When you tilt the phone forwards/backwards (pitch) it simulates giving gas and braking.
Direction: When you tilt the phone left/right (roll) it simulates steering to the left and right.
I have already written some code which seemed to work fine. But when I took a closer look, I found that some values are acting weird.
When I tilt the phone forward/backward to handle the speed, it works perfectly; I get the expected speed and direction values. But when I tilt the phone to the left/right to handle the direction, it seems to corrupt some values. Tilting to the left/right doesn't only change the direction value (roll); it also affects the speed value (pitch).
For extra information:
Programming for Android 2.2
Device is a Google Nexus One
Holding the device in portrait
The most relevant code I use to read the sensor values is as follows:
public void onSensorChanged(SensorEvent sensorEvent)
{
    synchronized (this)
    {
        if (sensorEvent.sensor.getType() == Sensor.TYPE_ORIENTATION)
        {
            float azimuth = sensorEvent.values[0]; // azimuth, rotation around the z-axis
            float pitch = sensorEvent.values[1];   // pitch, rotation around the x-axis
            float roll = sensorEvent.values[2];    // roll, rotation around the y-axis

            System.out.println("pitch: " + pitch);
            System.out.println("roll: " + roll);
            System.out.println("--------------------");

            // Convert the sensor values to the actual speed and direction values
            float speed = (pitch * 2.222f) + 200;
            float direction = roll * -2.222f;
            // ... speed and direction are used further on ...
        }
    }
}
So when I run the code and look at the printed values, tilting the device left/right seems to affect the pitch value as well. How come? And how can I get the pure pitch value when 'roll'-ing, so that tilting the phone left/right doesn't affect/corrupt the pitch value?
You could read up on Gimbal lock. That's bitten me before.
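Not from the original answer, but a commonly suggested alternative to the deprecated orientation sensor is to combine TYPE_ACCELEROMETER and TYPE_MAGNETIC_FIELD through SensorManager.getRotationMatrix() and getOrientation(). A minimal sketch, assuming mAccel and mMagnetic hold the latest values from each sensor:

float[] mAccel = new float[3];     // latest TYPE_ACCELEROMETER values
float[] mMagnetic = new float[3];  // latest TYPE_MAGNETIC_FIELD values

void computeOrientation() {
    float[] rotation = new float[9];
    float[] inclination = new float[9];
    if (SensorManager.getRotationMatrix(rotation, inclination, mAccel, mMagnetic)) {
        float[] orientation = new float[3];
        SensorManager.getOrientation(rotation, orientation); // values in radians
        float azimuth = (float) Math.toDegrees(orientation[0]); // rotation about z
        float pitch   = (float) Math.toDegrees(orientation[1]); // rotation about x
        float roll    = (float) Math.toDegrees(orientation[2]); // rotation about y
        // use pitch for speed and roll for direction, as in the question
    }
}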
I'm writing an application and my aim is to detect when a user is walking.
I'm using a Kalman filter like this:
float kFilteringFactor=0.6f;
gravity[0] = (accelerometer_values[0] * kFilteringFactor) + (gravity[0] * (1.0f - kFilteringFactor));
gravity[1] = (accelerometer_values[1] * kFilteringFactor) + (gravity[1] * (1.0f - kFilteringFactor));
gravity[2] = (accelerometer_values[2] * kFilteringFactor) + (gravity[2] * (1.0f - kFilteringFactor));
linear_acceleration[0] = (accelerometer_values[0] - gravity[0]);
linear_acceleration[1] = (accelerometer_values[1] - gravity[1]);
linear_acceleration[2] = (accelerometer_values[2] - gravity[2]);
float magnitude = 0.0f;
magnitude = (float)Math.sqrt(linear_acceleration[0]*linear_acceleration[0]+linear_acceleration[1]*linear_acceleration[1]+linear_acceleration[2]*linear_acceleration[2]);
magnitude = Math.abs(magnitude);
if(magnitude>0.2)
//walking
The array gravity[] is initialized with 0s.
I can detect when a user is walking or not (by looking at the magnitude of the acceleration vector), but my problem is that when a user is not walking and just moves the phone, it looks as though he is walking.
Am I using the right filter?
Is it right to watch only the magnitude of the vector, or do I have to look at the individual values?
Google provides an API for this called DetectedActivity that can be obtained using the ActivityRecognitionApi. Those docs can be accessed here and here.
DetectedActivity has the method public int getType() to get the current activity of the user and also public int getConfidence() which returns a value from 0 to 100. The higher the value returned by getConfidence(), the more certain the API is that the user is performing the returned activity.
Here is a constant summary of what is returned by getType():
int IN_VEHICLE The device is in a vehicle, such as a car.
int ON_BICYCLE The device is on a bicycle.
int ON_FOOT The device is on a user who is walking or running.
int RUNNING The device is on a user who is running.
int STILL The device is still (not moving).
int TILTING The device angle relative to gravity changed significantly.
int UNKNOWN Unable to detect the current activity.
int WALKING The device is on a user who is walking.
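As a rough sketch of how the result might be consumed (assuming activity updates were requested with a PendingIntent via ActivityRecognitionClient#requestActivityUpdates; the 75% confidence cut-off is an arbitrary assumption):

import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import com.google.android.gms.location.ActivityRecognitionResult;
import com.google.android.gms.location.DetectedActivity;

public class ActivityUpdateReceiver extends BroadcastReceiver {
    @Override
    public void onReceive(Context context, Intent intent) {
        if (ActivityRecognitionResult.hasResult(intent)) {
            ActivityRecognitionResult result = ActivityRecognitionResult.extractResult(intent);
            DetectedActivity activity = result.getMostProbableActivity();
            // Treat the user as walking only when the API is reasonably confident.
            if (activity.getType() == DetectedActivity.WALKING && activity.getConfidence() > 75) {
                // user is walking
            }
        }
    }
}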
My first intuition would be to run an FFT analysis on the sensor history, and see what frequencies have high magnitudes when walking.
It's essentially seeing what walking "sounds like", treating the accelerometer sensor inputs like a microphone and seeing the frequencies that are loud when walking (in other words, at what frequency is the biggest acceleration happening).
I'd guess you'd be looking for a high magnitude at some low frequency (like footstep rate) or maybe something else. It would be interesting to see the data.
My guess is you run the FFT and look for the magnitude at some frequency to be greater than some threshold, or the difference between magnitudes of two of the frequencies is more than some amount. Again, the actual data would determine how you attempt to detect it.
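Not from the answer above, but a rough, untested sketch of the idea: buffer the accelerometer norm, compute a naive DFT over a sliding window, and check whether the energy in a typical step-frequency band (about 1-3 Hz) dominates. The window size, sample rate, band and threshold are all assumptions.

public class StepFrequencyDetector {
    private static final int WINDOW = 128;          // samples per analysis window (assumed)
    private static final double SAMPLE_RATE = 50.0; // Hz, assumed accelerometer rate
    private final double[] buffer = new double[WINDOW];
    private int index = 0;

    // Feed one accelerometer norm sample; returns true when a full window
    // suggests walking (most of the energy sits in the 1-3 Hz step band).
    public boolean addSample(double norm) {
        buffer[index++] = norm;
        if (index < WINDOW) {
            return false;
        }
        index = 0;
        return bandEnergy(1.0, 3.0) > 0.5 * bandEnergy(0.5, SAMPLE_RATE / 2);
    }

    // Sum of DFT magnitudes between loHz and hiHz (naive DFT; fine for 128 samples).
    private double bandEnergy(double loHz, double hiHz) {
        double mean = 0;
        for (double v : buffer) {
            mean += v;
        }
        mean /= WINDOW; // remove the DC/gravity component
        double energy = 0;
        for (int k = 1; k <= WINDOW / 2; k++) {
            double freq = k * SAMPLE_RATE / WINDOW;
            if (freq < loHz || freq > hiHz) {
                continue;
            }
            double re = 0, im = 0;
            for (int n = 0; n < WINDOW; n++) {
                double angle = 2 * Math.PI * k * n / WINDOW;
                re += (buffer[n] - mean) * Math.cos(angle);
                im -= (buffer[n] - mean) * Math.sin(angle);
            }
            energy += Math.sqrt(re * re + im * im);
        }
        return energy;
    }
}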
For walking detection I use the derivative applied to the smoothed signal from the accelerometer. When the derivative is greater than a threshold value, I can assume it was a step. But I guess that it's not best practice; furthermore, it only works when the phone is placed in a pants pocket.
The following code was used in this app https://play.google.com/store/apps/details?id=com.tartakynov.robotnoise
@Override
public void onSensorChanged(SensorEvent event) {
if (event.sensor.getType() != Sensor.TYPE_ACCELEROMETER){
return;
}
final float z = smooth(event.values[2]); // scalar kalman filter
if (Math.abs(z - mLastZ) > LEG_THRSHOLD_AMPLITUDE)
{
mInactivityCount = 0;
int currentActivity = (z > mLastZ) ? LEG_MOVEMENT_FORWARD : LEG_MOVEMENT_BACKWARD;
if (currentActivity != mLastActivity){
mLastActivity = currentActivity;
notifyListeners(currentActivity);
}
} else {
if (mInactivityCount > LEG_THRSHOLD_INACTIVITY) {
if (mLastActivity != LEG_MOVEMENT_NONE){
mLastActivity = LEG_MOVEMENT_NONE;
notifyListeners(LEG_MOVEMENT_NONE);
}
} else {
mInactivityCount++;
}
}
mLastZ = z;
}
EDIT: I don't think it's accurate enough, since when walking normally the average acceleration would be near 0. The most you could do by measuring acceleration is detect when someone starts or stops walking (but as you said, it's difficult to distinguish that from the device being moved by someone standing in one place).
So... what I wrote earlier, probably wouldn't work anyway:
You can "predict" whether the user is moving by discarding when the user is not moving (obvious), And first two options coming to my mind are:
Check whether the phone is "hidden", using proximity and light sensor (optional). This method is less accurate but easier.
Checking the continuity of the movement: if the phone is moving for more than, say, 10 seconds and the movement is not negligible, then you consider that the user is walking. I know it's not perfect either, but it's difficult without using any kind of positioning. By the way... why don't you just use LocationManager?
Try detecting the up-and-down oscillations and the fore-and-aft oscillations, along with the frequency of each, and make sure they stay aligned within bounds on average. That way you detect walking, and specifically that person's gait style, which should remain relatively constant for several steps in a row to qualify as moving.
As long as the last 3 oscillations line up within reason, conclude that walking is occurring, provided this also holds:
You measure horizontal acceleration and update a velocity value with it. Velocity will drift with time, so keep a moving average of velocity smoothed over the time of a step; as long as it doesn't drift more than, say, half of walking speed per 3 oscillations, it's walking, but only if it initially rose to walking speed within a short time, i.e. half a second or 2 oscillations perhaps.
All of that should just about cover it.
Of course, a little AI would help make things simpler (or just as complex but amazingly accurate) if you considered all of these as inputs to a neural network, i.e. as preprocessing.