Issue while trying to create a fall detection algorithm (Android)

I am currently trying to write a fall detection algorithm on Android. I have successfully detected free fall; however, when the phone lands, the resulting vector is always around 1 G, so I am not able to set an upper threshold. Here is my code for the method:
if (mySensor.getType() == Sensor.TYPE_ACCELEROMETER) {
    float x = sensorEvent.values[0];
    float y = sensorEvent.values[1];
    float z = sensorEvent.values[2];
    double totalAccel = Math.sqrt(Math.pow(x, 2) + Math.pow(y, 2) + Math.pow(z, 2));
    if (totalAccel < FALL_THRESHOLD && !isFalling) {
        isFalling = true;
    }
    if (isFalling && totalAccel > FALL_THRESHOLD) {
        isFalling = false;
        TextView view = (TextView) findViewById(R.id.values);
        view.setText("" + totalAccel);
    }
}
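A common way to handle the landing is to treat a fall as a two-phase event: once free fall is detected, look for an impact spike within a short time window instead of comparing against a single threshold on the very next sample, and register the listener with SENSOR_DELAY_FASTEST so the short spike is actually sampled. The sketch below only illustrates that idea and is not the original poster's code; FALL_THRESHOLD is reused from the question, while IMPACT_THRESHOLD, IMPACT_WINDOW_MS, fallStartMillis and onFallDetected() are hypothetical names with made-up values that would need tuning on a real device.

// Hypothetical constants; values need tuning per device.
private static final double FALL_THRESHOLD = 4.0;     // m/s^2, "near free fall"
private static final double IMPACT_THRESHOLD = 25.0;  // m/s^2, "hard landing"
private static final long IMPACT_WINDOW_MS = 1000;    // look for impact within 1 s of free fall
private boolean isFalling = false;
private long fallStartMillis = 0;

private void handleAccel(float x, float y, float z) {
    double totalAccel = Math.sqrt(x * x + y * y + z * z);
    if (!isFalling && totalAccel < FALL_THRESHOLD) {
        // Phase 1: free fall detected.
        isFalling = true;
        fallStartMillis = System.currentTimeMillis();
    } else if (isFalling) {
        long elapsed = System.currentTimeMillis() - fallStartMillis;
        if (totalAccel > IMPACT_THRESHOLD) {
            // Phase 2: impact spike within the window, treat as a fall.
            isFalling = false;
            onFallDetected(totalAccel); // hypothetical callback
        } else if (elapsed > IMPACT_WINDOW_MS) {
            // No impact seen in time; reset and keep watching.
            isFalling = false;
        }
    }
}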

Related

Android Linear Acceleration: negative values

I am programming with the accelerometer in order to measure a distance. I have already read that integrating the acceleration twice accumulates a significant error, but I don't care about that; I have enough precision for what I want to do.
The problem is that my position on the x axis is always negative, and I don't know where it comes from, because I move my phone in all directions and I still get negative values.
This is my code, which I picked up and arranged from the internet:
@Override
public void onSensorChanged(SensorEvent event) {
    if (last_values != null) {
        float dt = (event.timestamp - last_timestamp) * NS2S;
        acceleration[0] = event.values[0];
        acceleration[1] = event.values[1];
        acceleration[2] = event.values[2];
        for (int index = 0; index < 3; ++index) {
            velocity[index] += (acceleration[index] + last_values[index]) / 2 * dt;
            position[index] += velocity[index] * dt;
        }
    } else {
        last_values = new float[3];
        acceleration = new float[3];
        velocity = new float[3];
        position = new float[3];
        velocity[0] = velocity[1] = velocity[2] = 0f;
        position[0] = position[1] = position[2] = 0f;
    }
    xarr.add(position[0]);
    yarr.add(position[1]);
    zarr.add(position[2]);
    tvX.setText(String.valueOf(position[0]));
    tvY.setText(String.valueOf(position[1]));
    tvZ.setText(String.valueOf(position[2]));
    last_timestamp = event.timestamp;
}
And these are a few X-axis values:
X axis
-0,001147982
-0,003418462
-0,006807898
-0,011325749
-0,01697435
-0,023762027
-0,031681452
-0,04072018
-0,050905682
-0,062279336
-0,07478787
-0,088445522
-0,103226766
-0,119143963
-0,136174649
-0,154338345
-0,17362912
-0,19407627
-0,215692967
Please help.
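For what it's worth, a constant bias on the raw accelerometer (including any residual gravity component if the phone is not perfectly level) integrates into velocity linearly and into position quadratically, which matches the steadily growing negative X values above. A common first step is to estimate that bias while the device is at rest and subtract it before integrating; this only helps while the orientation stays fixed, since gravity is folded into the bias. The snippet below is just a sketch of that idea, not the poster's code; CALIBRATION_SAMPLES, biasSamples and calibrate() are made-up names.

// Hypothetical bias calibration: average the first N samples while the phone is at rest.
private static final int CALIBRATION_SAMPLES = 200;
private final float[] bias = new float[3];
private int biasSamples = 0;

private boolean calibrate(float[] values) {
    if (biasSamples < CALIBRATION_SAMPLES) {
        for (int i = 0; i < 3; i++) {
            bias[i] += values[i];
        }
        biasSamples++;
        if (biasSamples == CALIBRATION_SAMPLES) {
            for (int i = 0; i < 3; i++) {
                bias[i] /= CALIBRATION_SAMPLES; // sums become averages
            }
        }
        return false; // still calibrating, do not integrate yet
    }
    // Remove the estimated bias before integrating.
    for (int i = 0; i < 3; i++) {
        values[i] -= bias[i];
    }
    return true;
}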

How to get smooth orientation data in android

I have an app which uses orientation data, and it works very well using the pre-API-8 approach of a Sensor.TYPE_ORIENTATION sensor. Smoothing that data was relatively easy.
I am trying to update the code to avoid using this deprecated approach. The new standard approach is to replace the single Sensor.TYPE_ORIENTATION with a Sensor.TYPE_ACCELEROMETER and Sensor.TYPE_MAGNETIC_FIELD combination. As that data is received, it is sent (via SensorManager.getRotationMatrix()) to SensorManager.getOrientation(). This (theoretically) returns the same information as Sensor.TYPE_ORIENTATION did (apart from different units and axis orientation).
However, this approach seems to generate data which is much more jittery (i.e. noisy) than the deprecated method (which still works). So, if you compare the same information on the same device, the deprecated method provides much less noisy data than the current method.
How do I get the same (less noisy) data that the deprecated method used to provide?
To make my question a little clearer: I have read various answers on this subject, and I have tried all sorts of filters: a simple KF / IIR low-pass as you suggest, and median filters between 5 and 19 points, but so far I have yet to get anywhere close to the smoothness of the data the phone supplies via TYPE_ORIENTATION.
Apply a low-pass filter to your sensor output.
This is my low-pass filter method:
private static final float ALPHA = 0.5f;
// lower alpha should equal smoother movement
...
private float[] applyLowPassFilter(float[] input, float[] output) {
    if (output == null) return input;
    for (int i = 0; i < input.length; i++) {
        output[i] = output[i] + ALPHA * (input[i] - output[i]);
    }
    return output;
}
Apply it like so:
float[] mGravity;
float[] mGeomagnetic;

@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER)
        mGravity = applyLowPassFilter(event.values.clone(), mGravity);
    if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD)
        mGeomagnetic = applyLowPassFilter(event.values.clone(), mGeomagnetic);
    if (mGravity != null && mGeomagnetic != null) {
        float R[] = new float[9];
        float I[] = new float[9];
        boolean success = SensorManager.getRotationMatrix(R, I, mGravity, mGeomagnetic);
        if (success) {
            float orientation[] = new float[3];
            SensorManager.getOrientation(R, orientation);
            azimuth = -orientation[0];
            invalidate();
        }
    }
}
This is obviously code for a compass; remove what you don't need.
Also, take a look at this SE question How to implement low pass filter using java
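If you want to choose ALPHA less arbitrarily, the usual discrete-time low-pass relation is alpha = dt / (tau + dt), where dt is the sample period and tau is the smoothing time constant you want. A small sketch under that assumption (the 20 ms and 200 ms figures are just example values, not from the answer above):

// alpha = dt / (tau + dt): larger tau means smaller alpha and heavier smoothing.
// Example: samples arriving every 20 ms, desired time constant of 200 ms.
private static float computeAlpha(float dtSeconds, float tauSeconds) {
    return dtSeconds / (tauSeconds + dtSeconds);
}

// computeAlpha(0.020f, 0.200f) is roughly 0.09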
It turns out that there is another, not particularly documented, way to get orientation data. Hidden in the list of sensor types is TYPE_ROTATION_VECTOR. So, set one up:
Sensor mRotationVectorSensor = sensorManager.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR);
sensorManager.registerListener(this, mRotationVectorSensor, SensorManager.SENSOR_DELAY_GAME);
Then:
@Override
public void onSensorChanged(SensorEvent event) {
    final int eventType = event.sensor.getType();
    if (eventType != Sensor.TYPE_ROTATION_VECTOR) return;
    long timeNow = System.nanoTime();
    float mOrientationData[] = new float[3];
    calcOrientation(mOrientationData, event.values.clone());
    // Do what you want with mOrientationData
}
The key mechanism is going from the incoming rotation data to an orientation vector via a rotation matrix. The slightly frustrating thing is that the orientation vector comes from quaternion data in the first place, but I can't see how to get the quaternion delivered directly. (If you ever wondered how quaternions relate to orientation and rotation information, and why they are used, see here.)
private void calcOrientation(float[] orientation, float[] incomingValues) {
    // Get the quaternion
    float[] quatF = new float[4];
    SensorManager.getQuaternionFromVector(quatF, incomingValues);

    // Get the rotation matrix
    //
    // This is a variant on the code presented in
    // http://www.euclideanspace.com/maths/geometry/rotations/conversions/quaternionToMatrix/
    // which has been altered for scaling and (I think) a different axis arrangement. It
    // tells you the rotation required to get between the phone's axis system and the earth's.
    //
    // Phone axis system:
    // https://developer.android.com/guide/topics/sensors/sensors_overview.html#sensors-coords
    //
    // Earth axis system:
    // https://developer.android.com/reference/android/hardware/SensorManager.html#getRotationMatrix(float[], float[], float[], float[])
    //
    // Background information:
    // https://en.wikipedia.org/wiki/Rotation_matrix
    //
    float[][] rotMatF = new float[3][3];
    rotMatF[0][0] = quatF[1] * quatF[1] + quatF[0] * quatF[0] - 0.5f;
    rotMatF[0][1] = quatF[1] * quatF[2] - quatF[3] * quatF[0];
    rotMatF[0][2] = quatF[1] * quatF[3] + quatF[2] * quatF[0];
    rotMatF[1][0] = quatF[1] * quatF[2] + quatF[3] * quatF[0];
    rotMatF[1][1] = quatF[2] * quatF[2] + quatF[0] * quatF[0] - 0.5f;
    rotMatF[1][2] = quatF[2] * quatF[3] - quatF[1] * quatF[0];
    rotMatF[2][0] = quatF[1] * quatF[3] - quatF[2] * quatF[0];
    rotMatF[2][1] = quatF[2] * quatF[3] + quatF[1] * quatF[0];
    rotMatF[2][2] = quatF[3] * quatF[3] + quatF[0] * quatF[0] - 0.5f;

    // Get the orientation of the phone from the rotation matrix
    //
    // There is some discussion of this at
    // http://stackoverflow.com/questions/30279065/how-to-get-the-euler-angles-from-the-rotation-vector-sensor-type-rotation-vecto
    // in particular equation 451.
    //
    final float rad2deg = (float) (180.0 / Math.PI);
    orientation[0] = (float) Math.atan2(-rotMatF[1][0], rotMatF[0][0]) * rad2deg;
    orientation[1] = (float) Math.atan2(-rotMatF[2][1], rotMatF[2][2]) * rad2deg;
    orientation[2] = (float) Math.asin(rotMatF[2][0]) * rad2deg;
    if (orientation[0] < 0) orientation[0] += 360;
}
This seems to give data very similar in feel (I haven't run numeric tests) to the old TYPE_ORIENTATION data: it was usable for motion control of the device with marginal filtering.
There is also helpful information here, and a possible alternative solution here.
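For completeness: SensorManager can also build the rotation matrix directly from the rotation-vector event via getRotationMatrixFromVector(), which avoids the hand-written quaternion-to-matrix step above. This is only a minimal sketch of that route, with no claim that it behaves identically to the code above:

@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() != Sensor.TYPE_ROTATION_VECTOR) return;
    float[] rotationMatrix = new float[9];
    float[] orientation = new float[3];
    // Build the rotation matrix straight from the rotation vector...
    SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);
    // ...then extract azimuth/pitch/roll (returned in radians).
    SensorManager.getOrientation(rotationMatrix, orientation);
    float azimuthDeg = (float) Math.toDegrees(orientation[0]);
    float pitchDeg = (float) Math.toDegrees(orientation[1]);
    float rollDeg = (float) Math.toDegrees(orientation[2]);
    // Use azimuthDeg / pitchDeg / rollDeg as needed.
}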
Here's what worked for me, using SensorManager.SENSOR_DELAY_GAME for fast updates:
@Override
protected void onResume() {
    super.onResume();
    sensor_manager.registerListener(this, sensor_manager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER), SensorManager.SENSOR_DELAY_GAME);
    sensor_manager.registerListener(this, sensor_manager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD), SensorManager.SENSOR_DELAY_GAME);
}
MOVING AVERAGE
(less efficient)
private float[] gravity;
private float[] geomagnetic;
private float azimuth;
private float pitch;
private float roll;

@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER)
        gravity = moving_average_gravity(event.values);
    if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD)
        geomagnetic = moving_average_geomagnetic(event.values);
    if (gravity != null && geomagnetic != null) {
        float R[] = new float[9];
        float I[] = new float[9];
        boolean success = SensorManager.getRotationMatrix(R, I, gravity, geomagnetic);
        if (success) {
            float orientation[] = new float[3];
            SensorManager.getOrientation(R, orientation);
            azimuth = (float) Math.toDegrees(orientation[0]);
            pitch = (float) Math.toDegrees(orientation[1]);
            roll = (float) Math.toDegrees(orientation[2]);
            //if(roll>-46F && roll<46F)view.setTranslationX((roll/45F)*max_translation); //tilt from -45° to 45° to x-translate a view positioned centrally in a layout, from 0 - max_translation
            Log.i("TAG", "azimuth: " + azimuth + " | pitch: " + pitch + " | roll: " + roll);
        }
    }
}
private ArrayList<Float[]> moving_gravity;
private ArrayList<Float[]> moving_geomagnetic;
private static final float moving_average_size = 12; // change

private float[] moving_average_gravity(float[] gravity) {
    if (moving_gravity == null) {
        moving_gravity = new ArrayList<>();
        for (int i = 0; i < moving_average_size; i++) {
            moving_gravity.add(new Float[]{0F, 0F, 0F});
        }
        return new float[]{0F, 0F, 0F};
    }
    moving_gravity.remove(0);
    moving_gravity.add(new Float[]{gravity[0], gravity[1], gravity[2]});
    return moving_average(moving_gravity);
}

private float[] moving_average_geomagnetic(float[] geomagnetic) {
    if (moving_geomagnetic == null) {
        this.moving_geomagnetic = new ArrayList<>();
        for (int i = 0; i < moving_average_size; i++) {
            moving_geomagnetic.add(new Float[]{0F, 0F, 0F});
        }
        return new float[]{0F, 0F, 0F};
    }
    moving_geomagnetic.remove(0);
    moving_geomagnetic.add(new Float[]{geomagnetic[0], geomagnetic[1], geomagnetic[2]});
    return moving_average(moving_geomagnetic);
}

private float[] moving_average(ArrayList<Float[]> moving_values) {
    float[] moving_average = new float[]{0F, 0F, 0F};
    for (int i = 0; i < moving_average_size; i++) {
        moving_average[0] += moving_values.get(i)[0];
        moving_average[1] += moving_values.get(i)[1];
        moving_average[2] += moving_values.get(i)[2];
    }
    moving_average[0] = moving_average[0] / moving_average_size;
    moving_average[1] = moving_average[1] / moving_average_size;
    moving_average[2] = moving_average[2] / moving_average_size;
    return moving_average;
}
LOW PASS FILTER
(more efficient)
private float[] gravity;
private float[] geomagnetic;
private float azimuth;
private float pitch;
private float roll;

@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER)
        gravity = LPF(event.values.clone(), gravity);
    if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD)
        geomagnetic = LPF(event.values.clone(), geomagnetic);
    if (gravity != null && geomagnetic != null) {
        float R[] = new float[9];
        float I[] = new float[9];
        boolean success = SensorManager.getRotationMatrix(R, I, gravity, geomagnetic);
        if (success) {
            float orientation[] = new float[3];
            SensorManager.getOrientation(R, orientation);
            azimuth = (float) Math.toDegrees(orientation[0]);
            pitch = (float) Math.toDegrees(orientation[1]);
            roll = (float) Math.toDegrees(orientation[2]);
            //if(roll>-46F && roll<46F)view.setTranslationX((roll/45F)*max_translation); //tilt from -45° to 45° to x-translate a view positioned centrally in a layout, from 0 - max_translation
            Log.i("TAG", "azimuth: " + azimuth + " | pitch: " + pitch + " | roll: " + roll);
        }
    }
}

private static final float ALPHA = 1 / 16F; // adjust sensitivity

private float[] LPF(float[] input, float[] output) {
    if (output == null) return input;
    for (int i = 0; i < input.length; i++) {
        output[i] = output[i] + ALPHA * (input[i] - output[i]);
    }
    return output;
}
N.B.
A moving average of 12 values was used instead, as per here.
A low-pass filter with ALPHA = 0.0625 was used instead, as per here.
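As a side note on the moving-average variant: ArrayList.remove(0) shifts every element and the Float[] boxing allocates on every sample, whereas a fixed-size ring buffer of primitives does the same averaging with no allocation in the steady state. A minimal sketch of that alternative, with names of my own choosing rather than the code above:

// Hypothetical ring-buffer moving average over the last WINDOW samples per axis.
private static final int WINDOW = 12;
private final float[][] samples = new float[WINDOW][3];
private final float[] sum = new float[3];
private int index = 0;
private int count = 0;

private float[] movingAverage(float[] values) {
    for (int axis = 0; axis < 3; axis++) {
        sum[axis] -= samples[index][axis];   // drop the oldest sample
        samples[index][axis] = values[axis]; // overwrite it with the newest
        sum[axis] += values[axis];
    }
    index = (index + 1) % WINDOW;
    if (count < WINDOW) count++;
    float[] avg = new float[3];
    for (int axis = 0; axis < 3; axis++) {
        avg[axis] = sum[axis] / count;
    }
    return avg;
}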

Android detect phone lifting action

I want to perform some activity when the user lifts the phone from a flat surface. The method I am using right now is to detect shake motion using the phone's accelerometer, with the following code:
sensorMan = (SensorManager) getSystemService(SENSOR_SERVICE);
accelerometer = sensorMan.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
sensorMan.registerListener(this, accelerometer, SensorManager.SENSOR_STATUS_ACCURACY_HIGH);

public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
        mGravity = event.values.clone();
        // Shake detection
        float x = mGravity[0];
        float y = mGravity[1];
        float z = mGravity[2];
        mAccelLast = mAccelCurrent;
        mAccelCurrent = FloatMath.sqrt(x * x + y * y + z * z);
        float delta = mAccelCurrent - mAccelLast;
        mAccel = mAccel * 0.9f + delta;
        if (mAccel > 0.9) {
            // Perform certain tasks.
        }
    }
}
The issue I am facing with this code is that the 0.9f threshold is sometimes reached even when the phone is still on the flat surface. I tried logging the mAccel value and found it ranging from 9.0 to 0.4 even when the phone is not being touched. Is there any guaranteed way to detect the phone's lift movement?
Solved the issue. All I wanted to do was to check the "Y" value stated in the question and see whether it was greater than 1.0.
Note that if the phone is held in a vertical position, Y is always around 9.8, but in such cases you can check X instead. In my case the user had to lift the phone and would tilt it at some point, so I put in a check of if(y >= 1.0 && y <= 2.0);
EDIT : UPDATED CODE
@Override
public void onSensorChanged(SensorEvent event) {
    try {
        if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
            mGravity = event.values.clone();
            // Shake detection
            float x = mGravity[0];
            float y = mGravity[1];
            float z = mGravity[2];
            float yAbs = Math.abs(mGravity[1]);
            mAccelLast = mAccelCurrent;
            mAccelCurrent = FloatMath.sqrt(x * x + y * y + z * z);
            float delta = mAccelCurrent - mAccelLast;
            mAccel = mAccel * 0.9f + delta;
            if (yAbs > 2.0 && yAbs < 4.0 && !isAlerted() && !isCallActive()) {
                alert();
            }
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
}
I would add the gyroscope to the detection routine too. The phone gets accelerated AND rotates up from x=0, y=0, z=0 to, let's say, y=120; that's the trigger.
Look here for information on how to use it.
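A minimal sketch of what that could look like with Sensor.TYPE_GYROSCOPE, which reports angular speed in rad/s around each axis; the threshold and the onPossibleLift() callback are hypothetical and would need tuning:

// Hypothetical: flag a possible pickup when the device rotates faster than
// about 1 rad/s around the X or Y axis (tilting up from a flat surface).
private static final float ROTATION_THRESHOLD = 1.0f; // rad/s, needs tuning

private void handleGyro(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_GYROSCOPE) {
        float rateX = event.values[0];
        float rateY = event.values[1];
        if (Math.abs(rateX) > ROTATION_THRESHOLD || Math.abs(rateY) > ROTATION_THRESHOLD) {
            // Combine with the accelerometer check before firing the action.
            onPossibleLift(); // hypothetical callback
        }
    }
}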
Another sensor for lift detection would be the proximity sensor: when the phone lies flat on the desk the distance would be 0, and if it's picked up that value would rise quickly.
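A sketch of that proximity idea, assuming the usual Sensor.TYPE_PROXIMITY behaviour where values[0] is a distance in centimetres and many devices only report "near" (0) or the maximum range; whether the sensor is actually covered while the phone lies on a desk depends on how it is lying, so this is only an illustration:

// Hypothetical: register Sensor.TYPE_PROXIMITY and react when the reading
// jumps from "covered" (sensor blocked) to "far".
private float lastProximity = -1f;

private void handleProximity(SensorEvent event) {
    float distance = event.values[0]; // cm; many sensors are binary (0 or max range)
    if (lastProximity == 0f && distance > 0f) {
        // Sensor was covered and is now clear: a likely pickup.
        onPossibleLift(); // hypothetical callback
    }
    lastProximity = distance;
}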

Increase the Range of Android Accelerometer

This question has been asked in a few ways but there has been no conclusive answer to it.
I am trying to find out the maximum range of the accelerometer on Android phones.
Some forums claim ±2 G and some ±3.5 G.
The accelerometer hardware (the LSM330, which is in the S4) has a higher range, up to 16 G.
http://www.st.com/st-web-ui/static/active/en/resource/technical/document/datasheet/DM00059856.pdf
I wrote an application to practically find this range and loaded it onto an S4. The following picture shows the readings.
Clearly, the maximum range in each direction is 2Gs.
Is there a way to increase this range and if so, how?
Has anyone found a larger default range on other Android phones?
For those interested, here is the relevant part of my code:
public class MainActivity extends Activity implements SensorEventListener {

    Sensor accelerometer;
    SensorManager sm;
    TextView maxValue;
    TextView realTimeValues;
    TextView realTimeResultant;
    TextView maxValues;
    TextView maxResultant;
    float x = 0;
    float y = 0;
    float z = 0;
    float res = 0;
    float xMax = 0;
    float yMax = 0;
    float zMax = 0;
    float resMax = 0;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        sm = (SensorManager) getSystemService(SENSOR_SERVICE);
        accelerometer = sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
        sm.registerListener(this, accelerometer, SensorManager.SENSOR_DELAY_FASTEST);
        maxValue = (TextView) findViewById(R.id.MaxValue);
        realTimeValues = (TextView) findViewById(R.id.RealTimeValues);
        realTimeResultant = (TextView) findViewById(R.id.RealTimeResultant);
        maxValues = (TextView) findViewById(R.id.maxValues);
        maxResultant = (TextView) findViewById(R.id.maxResultant);
        float max = accelerometer.getMaximumRange();
        maxValue.setText("Max range: " + max);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() != Sensor.TYPE_ACCELEROMETER)
            return;
        x = event.values[0];
        y = event.values[1];
        z = event.values[2];
        res = (float) Math.sqrt(x * x + y * y + z * z);
        realTimeValues.setText("X: " + x + "\nY: " + y + "\nZ: " + z);
        realTimeResultant.setText(res + " m/s^2");
        if (Math.abs(x) > Math.abs(xMax))
            xMax = x;
        if (Math.abs(y) > Math.abs(yMax))
            yMax = y;
        if (Math.abs(z) > Math.abs(zMax))
            zMax = z;
        if (res > resMax)
            resMax = res;
        maxValues.setText("X: " + xMax + "\nY: " + yMax + "\nZ: " + zMax);
        maxResultant.setText(resMax + " m/s^2");
    }
}
It seems impossible to do so.
It should also be noted that the linear acceleration sensor cannot really increase the range of accelerometer readings, even though its values can exceed 2 G (on some phones). That increase is only due to the way linear acceleration is calculated (essentially the use of a filter), which sometimes produces values higher than 2 G if the direction is switched quickly enough.
See here for the calculation of linear acceleration: http://developer.android.com/guide/topics/sensors/sensors_motion.html
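If you want to see what a particular device reports for each sensor, Sensor.getMaximumRange() can be queried for both side by side. A small sketch (the "RANGE" log tag is arbitrary, and the null checks matter because not every device has a linear acceleration sensor):

SensorManager sm = (SensorManager) getSystemService(SENSOR_SERVICE);
Sensor accel = sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
Sensor linear = sm.getDefaultSensor(Sensor.TYPE_LINEAR_ACCELERATION);
if (accel != null) {
    Log.i("RANGE", "Accelerometer max range: " + accel.getMaximumRange() + " m/s^2");
}
if (linear != null) {
    Log.i("RANGE", "Linear acceleration max range: " + linear.getMaximumRange() + " m/s^2");
}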

Compass readings on SGS III

My app needs to show the current bearing of the device using its compass. The code I'm using (below) works perfectly fine on my Galaxy Nexus and Galaxy One, but the compass is spinning around wildly on a Samsung Galaxy S III. I've tried doing a figure-8 to recalibrate the device, but that doesn't change anything. The weird thing is that other compass apps downloaded from Google Play work just fine on the SIII. What could be the issue here?
float[] mGravity;
float[] mGeomagnetic;

public void onSensorChanged(SensorEvent event) {
    float azimuth = 0f;
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER)
        mGravity = event.values;
    if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD)
        mGeomagnetic = event.values;
    if (mGravity != null && mGeomagnetic != null) {
        float R[] = new float[9];
        float I[] = new float[9];
        boolean success = SensorManager.getRotationMatrix(R, I, mGravity, mGeomagnetic);
        if (success) {
            float orientation[] = new float[3];
            SensorManager.getOrientation(R, orientation);
            azimuth = orientation[0]; // orientation contains: azimuth, pitch and roll
        }
    }
    // Discard 0.0 values
    if (azimuth == 0.0) { return; }
    // Convert the sensor value to degrees
    azimuth = (float) Math.toDegrees(azimuth); // same as azimuth = -azimuth*360/(2*3.14159f);
    // Smooth the sensor output
    azimuth = smoothValues(azimuth);
}
// From http://stackoverflow.com/questions/4699417/android-compass-orientation-on-unreliable-low-pass-filter
// SmoothFactorCompass: The easing float that defines how smooth the movement will be (1 is no smoothing and 0 is never updating, my default is 0.5).
// SmoothThresholdCompass: The threshold in which the distance is big enough to turn immediately (0 is jump always, 360 is never jumping, my default is 30).
static final float SmoothFactorCompass = 0.5f;
static final float SmoothThresholdCompass = 30.0f;
float oldCompass = 0.0f;

private float smoothValues(float newCompass) {
    if (Math.abs(newCompass - oldCompass) < 180) {
        if (Math.abs(newCompass - oldCompass) > SmoothThresholdCompass) {
            oldCompass = newCompass;
        } else {
            oldCompass = oldCompass + SmoothFactorCompass * (newCompass - oldCompass);
        }
    } else {
        if (360.0 - Math.abs(newCompass - oldCompass) > SmoothThresholdCompass) {
            oldCompass = newCompass;
        } else {
            if (oldCompass > newCompass) {
                oldCompass = (oldCompass + SmoothFactorCompass * ((360 + newCompass - oldCompass) % 360) + 360) % 360;
            } else {
                oldCompass = (oldCompass - SmoothFactorCompass * ((360 - newCompass + oldCompass) % 360) + 360) % 360;
            }
        }
    }
    return oldCompass;
}
I am currently investigating the compass mechanism on Android, and I would recommend starting with a low-pass filter in your case.
What you need to do is apply a low-pass filter to both the ACCELEROMETER and MAGNETIC_FIELD sensor data.
Here is how I implemented that:
private float[] accel;
private float[] geomagnetic;
float R[] = new float[9];
float I[] = new float[9];
float orientation[] = new float[3];

@Override
public void onSensorChanged(SensorEvent event)
{
    synchronized (this)
    {
        float azimuth = -1f;
        if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER)
            accel = lowPass(event.values.clone(), accel);
        if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD)
            geomagnetic = lowPass(event.values.clone(), geomagnetic);
        if (accel != null && geomagnetic != null)
        {
            boolean success = SensorManager.getRotationMatrix(R, I, accel, geomagnetic);
            SensorManager.remapCoordinateSystem(R, SensorManager.AXIS_X, SensorManager.AXIS_Z, R);
            if (success)
            {
                SensorManager.getOrientation(R, orientation);
                azimuth = orientation[0]; // orientation contains: azimuth, pitch and roll
                float newHeading = azimuth * 360 / (2 * 3.14159f);
                // do what you need to do with the new heading
            }
        }
    }
}

/*
 * Time smoothing constant for the low-pass filter, 0 ≤ alpha ≤ 1;
 * a smaller value basically means more smoothing.
 * See: http://en.wikipedia.org/wiki/Low-pass_filter#Discrete-time_realization
 */
static final float ALPHA = 0.15f;

/**
 * @see http://en.wikipedia.org/wiki/Low-pass_filter#Algorithmic_implementation
 * @see http://developer.android.com/reference/android/hardware/SensorEvent.html#values
 */
protected float[] lowPass(float[] input, float[] output)
{
    if (output == null)
        return input;
    for (int i = 0; i < input.length; i++)
    {
        output[i] = output[i] + ALPHA * (input[i] - output[i]);
    }
    return output;
}
Maybe you should clone the values from the sensor reading first and then low-pass filter them, i.e. use event.values.clone(). If this does not work: I know the Sony Xperia phones are very sensitive, unlike Samsung, which is quite stable.
In my view, the S3 readings are quite OK under Android OS 4.3.
My recent experience with a couple of data feedback cases forced me to apply the rotation-vector logic on a Sony phone fitted with an InvenSense sensor, and it seemed to work, although I still do not like the choppy response; I have to live with that kind of response on Sony Android 4.3 OS. It also appears to me that any phone fitted with an InvenSense sensor needs the rotation vector applied for it to work properly.
