I am trying to make a simple Android application that detects the direction in which the device is moved: from right to left, or from top to bottom. The movement is only in 2D, for example with the device lying on a table or a board. For example, when the movement is from right to left, the Left value shall be true and the Right value false, and so on.
Code of onCreate Method
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_main);

    mSensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
    mAccelerometer = mSensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
    mSensorManager.registerListener(this, mAccelerometer, SensorManager.SENSOR_DELAY_NORMAL);

    tvLeft = (TextView) findViewById(R.id.textViewLeft);
    tvRight = (TextView) findViewById(R.id.textViewRight);
    tvTop = (TextView) findViewById(R.id.textViewTop);
    tvButton = (TextView) findViewById(R.id.textViewBoutton);
}
Code of onSensorChanged Method
public void onSensorChanged(SensorEvent event) {
    float x = event.values[0];
    float y = event.values[1];
    float z = event.values[2];

    long curTime = System.currentTimeMillis();
    // only process one update every 100 ms
    if ((curTime - lastUpdate) > 100) {
        lastUpdate = curTime;

        float diffX = last_x - x;
        float diffY = last_y - y;
        Log.d("Mhd", Float.toString(diffX) + " X Diff");
        Log.d("Mhd", Float.toString(diffY) + " y Diff");
        diffX = Float.parseFloat(String.format("%1$,.1f", diffX));
        diffY = Float.parseFloat(String.format("%1$,.1f", diffY));

        if (x > last_x && Math.abs(diffX) > 0.1) {
            tvLeft.setText("T");
            tvRight.setText("F");
        } else if (x < last_x && Math.abs(diffX) > 0.1) {
            tvRight.setText("T");
            tvLeft.setText("F");
        }

        if (y > last_y && Math.abs(diffY) > 0.1) {
            tvTop.setText("T");
            tvButton.setText("F");
        } else if (y < last_y && Math.abs(diffY) > 0.1) {
            tvButton.setText("T");
            tvTop.setText("F");
        }
    }

    last_x = x;
    last_y = y;
    last_z = z;
}
Finally, the program does not give an error, but the generated data is not correct, and even when the device is stable data is still being generated.
Please, any idea how to improve my code?
You need to think about what an accelerometer is and what it does. It's basically a force meter. It detects a force on the device and measures it. It always tells you the current level of force (which will actually never be 0 due to gravity, unless you're in outer space). It does not detect motion. It does not return values only when moved. It returns values every time it's asked, which with SENSOR_DELAY_NORMAL means every few hundred milliseconds.
The data will never be 100% correct. The data will always have noise, because all physical sensors do. Reverberations from old forces, mild forces from people walking around the room, your table not being 100% steady: the sensor will pick up all of those minor changes. It will also be wrong by some amount each time, because sensors are only so accurate. You need to look for large changes only; if you're looking for very small changes, you need much better hardware than comes in phones.
Finally, it captures forces, not motions. That means if you were to move the device at constant speed, no change would be detected: an object at constant speed has no force applied to it. It will only catch changes in speed, i.e. accelerations and decelerations. You need to look for these rather than assuming any change means it's moving. For example, moving the device left would show an acceleration to the left, followed by zero acceleration, then an acceleration to the right (really a deceleration) as you stopped, followed by nothing.
You need to rework your whole approach. You need much, much higher noise thresholds, and you need to look for accelerations and decelerations to determine whether the device is moving; otherwise the deceleration will look like a move in the wrong direction. It's a lot more complicated than what you have here.
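To make that concrete, here is a minimal sketch for the X axis only: it ignores anything below a noise threshold and reports a move only when a strong push is followed by a strong push in the opposite direction (the stop). The THRESHOLD value and the onMoveLeft()/onMoveRight() callbacks are illustrative assumptions, not tuned constants or standard APIs.

    // Sketch only: detect a left/right move on the X axis by requiring an
    // acceleration above a noise threshold followed by a deceleration.
    private static final float THRESHOLD = 2.0f; // m/s^2, well above typical noise (assumed value)
    private int xState = 0; // 0 = idle, +1 = pushed toward +X, -1 = pushed toward -X

    @Override
    public void onSensorChanged(SensorEvent event) {
        float ax = event.values[0];

        if (xState == 0) {
            if (ax > THRESHOLD) {
                xState = 1;              // strong push toward +X started
            } else if (ax < -THRESHOLD) {
                xState = -1;             // strong push toward -X started
            }
        } else if (xState == 1 && ax < -THRESHOLD) {
            onMoveRight();               // push then stop: the device really moved toward +X
            xState = 0;
        } else if (xState == -1 && ax > THRESHOLD) {
            onMoveLeft();                // hypothetical callbacks, not part of the original code
            xState = 0;
        }
    }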
Video explaining it, for those who do not understand
The accepted answer is not correct; please try to answer it with another solution (the 100 bounty is out of date).
Same question, but better explained
This question was accepted as answered, but the accepted answer is not correct at all. I tried it with my old ZTE device and it worked most of the time, but now I have a Samsung Galaxy A5 (2016) and it doesn't work, nor does it on an LG G3.
The thing is, using the accelerometer and some other sensors, I have to be able to detect either of the two movements that I made in the video.
There are two movements:
Smashing it down (with a little bit of velocity)
Free fall
I'll let you decide and convince me which is the better and easier option; by better I mean one that works on most devices.
A stationary device will have a gravity norm of +9.81, which corresponds to the acceleration measured by the device (0 m/s² of motion minus the acceleration of gravity, which is -9.81 m/s²). Thus, if the device starts moving downward from rest, the measured gravity norm will be less than 9.81. A device in free fall will have a gravity norm of approximately 0.
Below is how to determine whether the device starts moving downward. It will not be able to detect downward motion if the device is already moving downward at constant speed, since in that case there is no downward acceleration and the gravity norm stays around 9.81.
You need to use TYPE_GRAVITY. If the device does not have TYPE_GRAVITY, then low-pass filter TYPE_ACCELEROMETER to get the gravity vector.
As above, a stationary device will have a gravity vector with a norm equal to 9.81. However, this value varies slightly between devices, so you first need to determine this stationary gravity norm. You can do that by registering for TYPE_GRAVITY or TYPE_ACCELEROMETER and asking the user to lay the device flat and then press a button. Once the button is pressed, the app calculates the norm of the gravity vector in onSensorChanged.
private float mStationaryGravityNorm;
private float mDeviation = 0.01f;
private int mCount;
private boolean mIsCalculatingStationGravityNorm = true;

Button button = findViewById(R.id.button);
button.setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View v) {
        // register sensor
    }
});

@Override
public void onSensorChanged(SensorEvent event) {
    float gravityNorm = (float) Math.sqrt(event.values[0] * event.values[0]
            + event.values[1] * event.values[1]
            + event.values[2] * event.values[2]);

    if (mIsCalculatingStationGravityNorm) {
        // Average out 100 gravity values to get the stationary norm.
        if (mCount++ < 100) {
            mStationaryGravityNorm += gravityNorm;
        } else {
            mStationaryGravityNorm /= 100;
            mIsCalculatingStationGravityNorm = false;
        }
    } else {
        if (gravityNorm < mStationaryGravityNorm - mDeviation) {
            // moving down
        }
    }
}
PS: For moving up and down you do want to work with gravity. When the device is stationary, the gravity norm is approximately 9.81 (depending on the device). Now, if the device is moving down there is an acceleration downward, so the gravity norm will be less than 9.81, and if the device is moving up the gravity norm will be more than 9.81. So by comparing the gravity norm against the stationary gravity norm, you will know whether the device is moving up or down, independently of the device orientation. TYPE_GRAVITY gives better accuracy, but if the device does not have this type, then low-pass filtering TYPE_ACCELEROMETER will give you the gravity vector.
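In case TYPE_GRAVITY is unavailable, a minimal sketch of such a low-pass filter over TYPE_ACCELEROMETER could look like this; the smoothing factor ALPHA is an assumed value that you would tune for your sensor rate.

    // Sketch: approximate the gravity vector by low-pass filtering TYPE_ACCELEROMETER.
    private final float[] mGravity = new float[3];
    private static final float ALPHA = 0.8f; // assumed smoothing factor

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
            for (int i = 0; i < 3; i++) {
                mGravity[i] = ALPHA * mGravity[i] + (1 - ALPHA) * event.values[i];
            }
            float gravityNorm = (float) Math.sqrt(mGravity[0] * mGravity[0]
                    + mGravity[1] * mGravity[1]
                    + mGravity[2] * mGravity[2]);
            // Compare gravityNorm against the calibrated stationary norm as above.
        }
    }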
If you want to see whether the device is in free fall, you should check whether the norm is close to zero.
http://developer.android.com/guide/topics/sensors/sensors_motion.html
public void onSensorChanged(SensorEvent event) {
    double norm = Math.sqrt(event.values[0] * event.values[0]
            + event.values[1] * event.values[1]
            + event.values[2] * event.values[2]);
    // In free fall the acceleration norm drops toward 0 instead of ~9.81,
    // so compare it against a small threshold (the value here is illustrative).
    boolean freeFall = norm < 2.0;
    if (freeFall) {
        // free fall detected
    }
}
I want to detect whether the user has taken a turn on the road while driving, using the sensors on an Android phone. How do I code this? I am collecting data live from all the sensors (accelerometer, location, rotation, geomagnetic) and storing it on the SD card. Now I just want to know whether the user has taken a turn and in which direction.
I assume the registration of the sensor is done properly. You can detect the direction by using the orientation sensor (deprecated) as follows:
@Override
public void onSensorChanged(SensorEvent event) {
    float azimuth_angle = event.values[0];
    int precision = 2;
    if (prevAzimuth - azimuth_angle < precision * -1)
        Log.v("->", "RIGHT");
    else if (prevAzimuth - azimuth_angle > precision)
        Log.v("<-", "LEFT");
    prevAzimuth = azimuth_angle;
}
Note: the variable prevAzimuth is declared as a field (global). You can change the precision value to whatever you want; we need it because we do not want output after every trivial change in the azimuth angle. However, too large a precision value gives imprecise results. For me, 2 is optimal.
If you are tracking location coordinates, you can also track the change in angle between successive locations.
angle = arctan((Y2 - Y1) / (X2 - X1)) * 180 / PI
See this answer for calculating x and y.
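As a side note, Math.atan2 computes the same angle while handling all quadrants and the X2 == X1 case; here is a small sketch (the method name and parameters are placeholders):

    // Sketch: heading in degrees from point (x1, y1) to point (x2, y2).
    static double headingDegrees(double x1, double y1, double x2, double y2) {
        double angle = Math.toDegrees(Math.atan2(y2 - y1, x2 - x1));
        return angle; // in (-180, 180]; compare successive headings to spot a turn
    }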
The decision to use sensor values is based on the unrealistic assumption that the device is never rotated with respect to the vehicle.
I am trying to calculate the approximate position of an Android phone in a room. I tried different methods such as location (which is terrible indoors) and gyroscope + compass. I only need to know the approximate position after walking for 5-10 seconds, so I think integrating the linear acceleration could be enough. I know the error is terrible because it propagates, but maybe it will work in my setup. I only need the approximate position in order to point a camera at the Android phone.
I coded the double integration, but I am doing something wrong. If the phone is static on a table, the position (x, y, z) always keeps increasing. What is the problem?
static final float NS2S = 1.0f / 1000000000.0f;
float[] last_values = null;
float[] velocity = null;
float[] position = null;
float[] acceleration = null;
long last_timestamp = 0;
SensorManager mSensorManager;
Sensor mAccelerometer;

public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() != Sensor.TYPE_LINEAR_ACCELERATION)
        return;

    if (last_values != null) {
        float dt = (event.timestamp - last_timestamp) * NS2S;

        // subtract a fixed, hand-measured bias from each axis
        acceleration[0] = event.values[0] - 0.0188f;
        acceleration[1] = event.values[1] - 0.00217f;
        acceleration[2] = event.values[2] + 0.01857f;

        // trapezoidal integration: acceleration -> velocity -> position
        for (int index = 0; index < 3; ++index) {
            velocity[index] += (acceleration[index] + last_values[index]) / 2 * dt;
            position[index] += velocity[index] * dt;
        }
    } else {
        last_values = new float[3];
        acceleration = new float[3];
        velocity = new float[3];
        position = new float[3];
        velocity[0] = velocity[1] = velocity[2] = 0f;
        position[0] = position[1] = position[2] = 0f;
    }

    System.arraycopy(acceleration, 0, last_values, 0, 3);
    last_timestamp = event.timestamp;
}
These are the positions I get when the phone is on the table (no motion). The (x, y, z) values keep increasing even though the phone is still.
And these are the positions after calculating a moving average for each axis and subtracting it from each measurement. The phone is also still.
How can I improve the code, or is there another method to get the approximate position inside a room?
There are unavoidable measurement errors in the accelerometer. These are caused by tiny vibrations in the table, imperfections in manufacturing, and so on. Accumulating these errors over time results in a random walk. This is why positioning systems can only use accelerometers as an aid, through some kind of filter; they still need an absolute position reference such as GPS (which doesn't work well indoors).
There is a great deal of current research into indoor positioning systems. Areas of research into systems that can take advantage of existing infrastructure include WiFi and LED-lighting positioning. There is no obvious solution yet, but I'm sure we'll need a dedicated solution for accurate, reliable indoor positioning.
You said the position always keeps increasing. Do you mean the x, y, and z components only ever become positive, even after resetting several times? Or do you mean the position keeps drifting away from zero?
If you output the raw acceleration measurements while the phone is still, you should see the measurement errors. Put a bunch of these measurements in an Excel spreadsheet and calculate the mean and the standard deviation. The mean should be zero for all axes; if not, there is a bias that you can remove in your code with a simple averaging filter (calculate a running average and subtract it from each result). The standard deviation shows how far you can expect to drift in each axis after N time steps, as standard_deviation * sqrt(N). This should help you mathematically determine the expected accuracy as a function of time (or N time steps).
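A rough sketch of that averaging filter, assuming you can collect samples while the phone is known to be still; the 200-sample window and the method names are arbitrary choices for illustration.

    // Sketch: estimate a per-axis bias from N samples taken while the phone is
    // still, then subtract that bias from later accelerometer readings.
    private static final int CALIBRATION_SAMPLES = 200; // assumed window size
    private final float[] biasSum = new float[3];
    private final float[] bias = new float[3];
    private int biasSamples = 0;

    void onStillSample(float[] values) {
        if (biasSamples < CALIBRATION_SAMPLES) {
            for (int i = 0; i < 3; i++) biasSum[i] += values[i];
            if (++biasSamples == CALIBRATION_SAMPLES) {
                for (int i = 0; i < 3; i++) bias[i] = biasSum[i] / CALIBRATION_SAMPLES;
            }
        }
    }

    float[] corrected(float[] values) {
        return new float[] { values[0] - bias[0], values[1] - bias[1], values[2] - bias[2] };
    }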
Brian is right, there are already deployed indoor positioning systems that work with infrastructure that you can easily find in (almost) any room.
One of the solutions that has proven to be most reliable is WiFi fingerprinting. I recommend you take a look at indoo.rs - www.indoo.rs - they are pioneers in the industry and have a pretty developed system already.
This may not be the most elegant or reliable solution, but in my case it serves the purpose.
Note: in my case, I am grabbing a location before the user can even enter the activity that needs indoor positioning, and I am only concerned with a rough estimate of how much they have moved around.
I have a sensor manager that creates a rotation matrix based on the device orientation (using Sensor.TYPE_ROTATION_VECTOR). That obviously doesn't give me movement forward, backward, or side to side, but only the device orientation. With that orientation I have a good idea of the user's bearing in degrees (which way they are facing), and using the step detector available in KitKat 4.4, I make the assumption that a step is 1 meter in the direction the user is facing.
Again, I know this is not foolproof or very accurate, but depending on your purpose this too might be a simple solution.
Every time a step is detected, I basically call this function:
public void computeNewLocationByStep() {
    // Move the last known location 1 meter along the user's current bearing
    // using the standard destination-point formula on a sphere.
    Location newLocal = new Location("");
    double vAngle = getBearingInDegrees(); // returns my user's bearing
    double vDistance = 1 / g.kEarthRadiusInMeters; // kEarthRadiusInMeters = 6353000;
    vAngle = Math.toRadians(vAngle);
    double vLat1 = Math.toRadians(_location.getLatitude());
    double vLng1 = Math.toRadians(_location.getLongitude());

    double vNewLat = Math.asin(Math.sin(vLat1) * Math.cos(vDistance) +
            Math.cos(vLat1) * Math.sin(vDistance) * Math.cos(vAngle));
    double vNewLng = vLng1 + Math.atan2(Math.sin(vAngle) * Math.sin(vDistance) * Math.cos(vLat1),
            Math.cos(vDistance) - Math.sin(vLat1) * Math.sin(vNewLat));

    newLocal.setLatitude(Math.toDegrees(vNewLat));
    newLocal.setLongitude(Math.toDegrees(vNewLng));
    stepCount = 0;
    _location = newLocal;
}
The values I'm getting for accel, x, y and z below are not as expected.
It seems to be acting as a tilt sensor rather than an accelerometer.
When I throw the phone in the air and catch it, the accel value doesn't change by more than about 10%. Contrast this with when I rotate the phone randomly: I get much larger variations of 50-100%!
What could explain this? I simply want to detect when the phone is in free fall (and/or impacting something).
SensorManager sm = (SensorManager) getSystemService(SENSOR_SERVICE);
Sensor sensor = sm.getDefaultSensor(SensorManager.SENSOR_ACCELEROMETER);
sm.registerListener(
        sel = new SensorEventListener() {
            @Override public void onAccuracyChanged(Sensor arg0, int arg1) {}
            @Override public void onSensorChanged(SensorEvent arg0) {
                double x = se.values[0];
                double y = se.values[1];
                double z = se.values[2];
                double accel = Math.sqrt(
                        Math.pow(x, 2) +
                        Math.pow(y, 2) +
                        Math.pow(z, 2));
            }
        },
        sensor,
        SensorManager.SENSOR_DELAY_NORMAL
);
(As a side question, the values for x, y and z seem much higher than they should be, with accel averaging about 50-80 when standing still. Shouldn't it be around 9.8?)
The x, y and z values seem very sensitive to changes in the orientation of the phone, but not at all representative of acceleration. Am I missing something?
Example values with phone still, lying on back:
Accel = 85.36, x = 6.8, y = 45.25 z = 30.125
I had to replace
Sensor sensor = sm.getDefaultSensor(SensorManager.SENSOR_ACCELEROMETER);
with
Sensor sensor = sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
It could be because when you throw the phone it stays in roughly the same plane and orientation and moves at low speed, whereas when you turn the phone it changes its course and orientation rapidly, which gives higher values in the result. "Accelerometer" may be a misnomer for a multi-function device; there could be a selection parameter for the function you really want to get results from.
With the phone lying on its back you should get close to zero on the X and Y axes and about 9.8 on the Z axis. The 9.8 is of course due to gravity.
My first thought would be that there is something wrong with the phone, and I would suggest trying the same code on another phone.
However, I notice that there is something wrong in the math, but I haven't figured out what yet.
With x, y and z having the values you mention, the resultant (square root of the sum of squares) works out to 54.78 rather than the 85.36 you mention in your post.
I'm quite new to Java, so I cannot easily spot what might be wrong and haven't had the opportunity yet to try that piece of code on my phone, but I think the math is simple enough for me to determine that the result is wrong (or at least I hope so).
The other thing to check (assuming you figure out the math problem) is that the small change when you throw the phone in the air might simply be due to the slow response time. The accelerometer output may simply be changing too slowly, so by the time the phone has landed the output wouldn't have changed that much. The response can be improved by using SENSOR_DELAY_GAME or SENSOR_DELAY_FASTEST instead of SENSOR_DELAY_NORMAL.
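For example, using the same sm, sel and sensor variables from the question's snippet, a faster registration would be (sketch only):

    // Same registration as in the question, just with a faster delivery rate.
    sm.registerListener(sel, sensor, SensorManager.SENSOR_DELAY_GAME);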
By the way, shouldn't that be arg0.values[] rather than se.values[]? Where does se come from? The sensor values go into the argument of onSensorChanged (arg0 in this case), so I cannot figure out how they are supposed to end up in se. (But then again, there are many things in Java I still don't understand.)
I want to add a fairly simple fall-detection algorithm to my application. At the moment, in onSensorChanged() I take the absolute value of the current x, y, z readings and subtract SensorManager.GRAVITY_EARTH (9.81 m/s²) from it. The resulting value has to be bigger than a threshold value 10 times in a row to set a flag saying a fall has been detected by the accelerometer; the threshold value is about 8 m/s².
I'm also comparing the orientation of the phone as soon as the threshold has been passed with its orientation once the threshold is no longer being passed; this sets another flag saying the orientation sensor has detected a fall.
When both flags are set, an event occurs to check whether the user is OK, etc. My problem is with the threshold: when the phone is held straight up, the absolute value from the accelerometer is about 9.8 m/s², but when I hold it still at an angle it can be over 15 m/s². This causes other events to trigger the fall detection, and if I increase the threshold to avoid that, it won't detect falls.
Can anyone give me some advice on what values I should use, or how to improve my method? Many thanks.
First, I want to remind you that you cannot just add the x, y, z values together as they are; you have to use vector mathematics. This is why you get values of over 15 m/s². As long as the phone is not moving, the vector magnitude should always be about 9.8 m/s². You calculate it as sqrt(x*x + y*y + z*z). If you need more information, you can read about vector mathematics; http://en.wikipedia.org/wiki/Euclidean_vector#Length is a good start.
I also suggest another algorithm: in free fall, all three of the x, y, z values of the accelerometer should be near zero (at least, that's what I learned in physics classes a long time ago in school). So maybe you can use a rule like: if the vector magnitude of (x, y, z) is <= 3 m/s², detect a free fall, and if the magnitude then rises to a value over 20 m/s², detect the landing.
Those thresholds are just a wild guess. Maybe record the x, y, z values in a test application, move the phone around, and then analyze offline how the values (and their vector magnitude) behave, to get a feeling for which thresholds are sensible.
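A minimal sketch of that free-fall-then-landing rule, using the untuned 3 m/s² and 20 m/s² guesses from above; onFallDetected() is a hypothetical callback, not an existing API.

    // Sketch: free fall when the magnitude drops near zero, landing when it spikes.
    private static final double FREE_FALL_THRESHOLD = 3.0;  // m/s^2, untuned guess
    private static final double IMPACT_THRESHOLD = 20.0;    // m/s^2, untuned guess
    private boolean inFreeFall = false;

    @Override
    public void onSensorChanged(SensorEvent event) {
        double magnitude = Math.sqrt(event.values[0] * event.values[0]
                + event.values[1] * event.values[1]
                + event.values[2] * event.values[2]);

        if (!inFreeFall && magnitude <= FREE_FALL_THRESHOLD) {
            inFreeFall = true;               // phone appears to be falling
        } else if (inFreeFall && magnitude >= IMPACT_THRESHOLD) {
            inFreeFall = false;
            onFallDetected();                // hypothetical callback for the landing
        }
    }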
I have actually published a paper on this issue. Please feel free to check out "iFall" at ww2.cs.fsu.edu/~sposaro
We basically take the root sum of squares and look for three things (a rough sketch follows the list):
1. Lower threshold broken, i.e. falling
2. Upper threshold broken, i.e. hitting the ground
3. Flatline around 1 g, i.e. a "long lie": lying on the ground for an extended period of time
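Here is a rough sketch of that three-stage check; all thresholds, the 10-second long-lie window, the FallStage names, and the reportFall() callback are illustrative assumptions, not the values or code from the paper.

    // Sketch: three-stage fall check on the root-sum-of-squares magnitude,
    // expressed in units of 1 g (i.e. rss / 9.81). All constants are assumed.
    enum FallStage { IDLE, FALLING, IMPACT }

    private FallStage stage = FallStage.IDLE;
    private long impactTime;

    void onMagnitude(double g, long nowMillis) {
        switch (stage) {
            case IDLE:
                if (g < 0.4) stage = FallStage.FALLING;   // lower threshold broken: falling
                break;
            case FALLING:
                if (g > 2.5) {                            // upper threshold broken: hit the ground
                    stage = FallStage.IMPACT;
                    impactTime = nowMillis;
                }
                break;
            case IMPACT:
                // If the magnitude is still flat around 1 g more than 10 s after
                // the impact, treat it as a long lie and report the fall.
                if (Math.abs(g - 1.0) < 0.15 && nowMillis - impactTime > 10000) {
                    reportFall();                         // hypothetical callback
                    stage = FallStage.IDLE;
                }
                break;
        }
    }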
I forgot to update this thread, but iFall is now available on the Android Market.
Also check out ww2.cs.fsu.edu/~sposaro/iFall for more information
Its possible using the Accelerometer sensor.
Write this in the sensor changed listener..
if (sensor == Sensor.TYPE_ACCELEROMETER) {
long curTime = System.currentTimeMillis();
// only allow one update every 100ms.
if ((curTime - lastUpdate) > 100) {
long diffTime = (curTime - lastUpdate);
lastUpdate = curTime;
x = values[SensorManager.DATA_X];
y = values[SensorManager.DATA_Y];
z = values[SensorManager.DATA_Z];
float speed = Math.abs(x + y + z - last_x - last_y - last_z) / diffTime * 10000;
Log.d("getShakeDetection", "speed: " + speed);
if (speed > DashplexManager.getInstance().SHAKE_THRESHOLD) {
result = true;
}
last_x = x;
last_y = y;
last_z = z;
}
}