My Android application shows the direction to a particular place in the world, and therefore it needs the compass bearing in degrees.
This is the code I've been using to calculate the degrees:
public void getDirection() {
    mySensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
    List<Sensor> mySensors = mySensorManager.getSensorList(Sensor.TYPE_ORIENTATION);
    if (mySensors.size() > 0) {
        mySensorManager.registerListener(mySensorEventListener, mySensors.get(0), SensorManager.SENSOR_DELAY_UI);
    } else {
        TextView alert = (TextView) findViewById(R.id.instruct);
        alert.setText(getString(R.string.direction_not_found));
        myCompassView.setVisibility(View.INVISIBLE);
    }
}
private SensorEventListener mySensorEventListener = new SensorEventListener() {
    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
        // not used
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        compassBearing = (float) event.values[0]; // azimuth in degrees
        float bearing = compassBearing - templeBearing;
        if (bearing < 0) {
            bearing += 360;
        }
        myCompassView.updateDirection(bearing);
    }
};
This method usually works, but sometimes it just picks the wrong north. What do I have to do to get a more accurate reading?
I have a couple suggestions for you:
1) Your device may not be calibrated. To calibrate it, move it around in a figure-eight pattern (see this). If you don't know whether your device is calibrated, do some tests by pointing it at a known cardinal direction and comparing the values. Typically, if a device is not calibrated, you will see large variations in the azimuth value for small rotations. That is what I would be worried about.
Also, don't forget that the sensor gives you the bearing to magnetic north, not true north! The difference is known as the magnetic declination, and its value changes from place to place and over time due to changes in the Earth's magnetic field. This app can compute the value for you with reasonable accuracy. I wouldn't be too worried about this, since the declination is typically small, but you might be after good precision (where I live the declination is currently 3º).
2) Stay away from metal objects or anything that generates a strong magnetic field. For example, don't run tests with the phone near a computer or a physical keyboard! That is pure poison for compass testing. Some apps can measure the intensity of the magnetic field (if the device supports it); when you get close to metal objects you will see higher values and sharp changes in direction. For fun, there are also some "metal detectors": this app recognises changes in the magnetic field and vibrates when you are close to a metal object or anything that magnetically interferes with the device.
3) Remember to update the bearing when you tilt your device into landscape mode (this article is a must-read!). The azimuth value is based on rotation around the axis perpendicular to the plane of the phone, so when you rotate the device to landscape the value changes by +/-90º. This is not solved by disabling landscape mode for your application! You have to handle it programmatically by analysing rotation around the other two axes (pitch and roll). It's not trivial, but there are examples around the net; a rough sketch follows below.
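For completeness, here is a minimal sketch of what I mean, using the accelerometer and magnetic field sensors rather than the deprecated orientation sensor, remapping the coordinate system for landscape, and correcting to true north with GeomagneticField. The field names, lastKnownLocation, and the fixed ROTATION_90 remapping are my own assumptions, and myCompassView is reused from your question; treat it as a starting point, not drop-in code:

// Sketch only: true-north azimuth from accelerometer + magnetometer, with
// landscape remapping and declination correction. Assumes a listener registered
// for both TYPE_ACCELEROMETER and TYPE_MAGNETIC_FIELD, and that
// lastKnownLocation has been obtained elsewhere (e.g. from a LocationManager).
private final float[] accel = new float[3];
private final float[] magnetic = new float[3];
private final float[] rotationMatrix = new float[9];
private final float[] remapped = new float[9];
private final float[] orientation = new float[3];

@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
        System.arraycopy(event.values, 0, accel, 0, 3);
    } else if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) {
        System.arraycopy(event.values, 0, magnetic, 0, 3);
    }
    if (!SensorManager.getRotationMatrix(rotationMatrix, null, accel, magnetic)) {
        return; // not enough data from both sensors yet
    }
    // Remap the axes so the azimuth stays correct in landscape
    // (this axis pair assumes Surface.ROTATION_90; choose per display rotation).
    SensorManager.remapCoordinateSystem(rotationMatrix,
            SensorManager.AXIS_Y, SensorManager.AXIS_MINUS_X, remapped);
    SensorManager.getOrientation(remapped, orientation);
    float azimuthMagnetic = (float) Math.toDegrees(orientation[0]);

    // Correct from magnetic north to true north.
    GeomagneticField geoField = new GeomagneticField(
            (float) lastKnownLocation.getLatitude(),
            (float) lastKnownLocation.getLongitude(),
            (float) lastKnownLocation.getAltitude(),
            System.currentTimeMillis());
    float azimuthTrue = (azimuthMagnetic + geoField.getDeclination() + 360f) % 360f;
    myCompassView.updateDirection(azimuthTrue);
}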
Edit: If you are interested in some code, check out Mixare; it is an open-source augmented reality framework for Android under the GPL v3. Take a look at its code for orientation, compass geolocation and bearing.
PS: I don't have any sort of connection with the creators of the mentioned applications.
Two days ago I published a game on the Google Play Store that uses motion sensors. It is a boxing game: when users shake the smartphone, their score is evaluated automatically. To evaluate the score it uses the TYPE_LINEAR_ACCELERATION sensor.
The problem is that after I published the game, some users sent me their scores, and I saw that on some smartphones it is easy to reach 900 points while on others it is hard to reach 500. In other words, if the same user shakes different phones with the same force, on phone X he gets (for example) 400 points and on phone Y he gets (for example) 850 points.
Why does this inequality occur?
My guess is that some smartphones report lower values while others report higher ones.
My implementation (roughly):
sensorManager = (SensorManager) getActivity().getSystemService(Context.SENSOR_SERVICE);
sensorManager.registerListener(this, sensorManager.getDefaultSensor(Sensor.TYPE_LINEAR_ACCELERATION), SensorManager.SENSOR_DELAY_FASTEST);
@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_LINEAR_ACCELERATION) {
        float[] values = event.values;
        float x = values[0]; // X-axis acceleration
        // ...
        // ...
    }
}
I can share a link to my application if that is allowed; please let me know.
Note: to explain my algorithm: to compute the score I take the maximum acceleration reported by the sensor and keep that maximum in a variable. When the user taps the show-score button, I multiply it by 40 to get the score.
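For clarity, here is a minimal sketch of the scoring scheme described above, tracking the peak linear-acceleration magnitude and multiplying it by 40; the field and method names are mine, not the actual game code:

private float maxMagnitude = 0f; // peak |a| seen so far (assumed field)

@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_LINEAR_ACCELERATION) {
        float x = event.values[0], y = event.values[1], z = event.values[2];
        float magnitude = (float) Math.sqrt(x * x + y * y + z * z);
        if (magnitude > maxMagnitude) {
            maxMagnitude = magnitude; // keep the strongest shake
        }
    }
}

private int computeScore() {
    return Math.round(maxMagnitude * 40f); // score = peak magnitude * 40
}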
I do not understand your question. TYPE_LINEAR_ACCELERATION is a virtual sensor that gets its data from a combination of the accelerometer and the gyroscope.
Different smartphones do not use exactly the same sensor hardware, so obviously the data will differ slightly between devices.
You haven't described how you calculate the score.
I would take the magnitude across the three axes (the square root of the sum of squares) and use that as the score:
score = getMagnitude(values);
and the function:
private float getMagnitude(float[] v) {
    return (float) Math.sqrt(v[0] * v[0] + v[1] * v[1] + v[2] * v[2]);
}
I found my problem. I am using Sensor.TYPE_LINEAR_ACCELERATION, which combines the gyroscope and the accelerometer.
What is the operating logic of the TYPE_LINEAR_ACCELERATION sensor?
It uses the accelerometer and gyroscope sensors, roughly following a formula like
linear acceleration = k1 × accelerometer + k2 × gyroscope
where k1 and k2 are constants.
For the gyroscope there is (I think) no maximum value, but for the accelerometer different smartphones use different maximum ranges.
Typical maximum ranges are ±g (9.8 m/s²), ±2g, ±4g and ±8g.
So different smartphones give different results because users usually hit the sensor's maximum value when they shake the phone.
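One way to reduce the device-to-device difference, assuming the saturation explanation above is right, would be to normalize the measured peak by the accelerometer's reported maximum range before scoring. A rough sketch (field names are my own):

// Normalize the shake strength by the device's accelerometer range (0..1),
// so a phone that saturates at 2g and one that saturates at 8g score comparably.
Sensor accelerometer = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
float maxRange = accelerometer.getMaximumRange(); // e.g. ~19.6 on a ±2g device

float normalized = Math.min(maxMagnitude / maxRange, 1f);
int score = Math.round(normalized * 1000f); // scale to a 0..1000 score (arbitrary)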
I want to detect whether the device is facing up (not angled, but flat to the ground, facing up).
On most devices, when facing up the z value is between 9 and 10.
However, on a Nexus 7, when facing up the z value is between 6 and 8.
My code was:
if(z_value > 9.0) {
// device facing up
}
else {
// device is in angled
}
However, the code above doesn't work anymore, since the Nexus 7 never reaches a z_value of 9.
How can I detect whether the device is facing (entirely) up or not? (I'm not asking about z_value > 0.)
My full code is below:
@Override
protected void onStart() {
    super.onStart();
    try {
        sensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
        List<Sensor> sensorList = sensorManager.getSensorList(Sensor.TYPE_ACCELEROMETER);
        if (sensorList.size() > 0) {
            accelerometerPresent = true;
            accelerometerSensor = sensorList.get(0);
        } else {
            accelerometerPresent = false;
        }
        if (accelerometerPresent) {
            sensorManager.registerListener(accelerometerListener, accelerometerSensor, SensorManager.SENSOR_DELAY_UI);
        }
    } catch (Exception e) {
        // ignored
    }
}
private SensorEventListener accelerometerListener = new SensorEventListener() {
    @Override
    public void onAccuracyChanged(Sensor arg0, int arg1) {}

    @Override
    public void onSensorChanged(SensorEvent arg0) {
        float z_value = arg0.values[2];
        Log.d("test", "z:" + z_value);
    }
};
Note #1
arg0.sensor.getMaximumRange() returns 19.6133 for the Nexus 7, a value the sensor never actually reaches.
Note #2
If you shake the device, the z_value tends to go a little higher (sometimes 8 to 8.5).
If you tilt the device steadily, the z_value doesn't reach 8 (its apparent maximum).
Apparently the device is poorly calibrated. A well-calibrated device lying flat should return about 9.81 m/s^2, the gravitational acceleration.
What you could do instead: compare the z value to the x and y values. If the z value dominates, the device is facing up. For example:
if (z / Math.sqrt(x*x + y*y + z*z + 1.0e-6) > 0.9) { // facing up
I added the 1.0e-6 term so that you won't accidentally divide by zero.
This heuristic requires testing and tweaking, but I guess you get the idea. Good luck!
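Put together, a minimal sketch of that heuristic inside your listener might look like this (the 0.9 threshold is just a starting value to tune):

@Override
public void onSensorChanged(SensorEvent event) {
    double x = event.values[0];
    double y = event.values[1];
    double z = event.values[2];
    // Normalize z by the total magnitude so the check is independent of the
    // device's absolute calibration (9.8 on most phones, ~7 on this Nexus 7).
    double ratio = z / Math.sqrt(x * x + y * y + z * z + 1.0e-6);
    boolean facingUp = ratio > 0.9; // tune this threshold per your testing
    Log.d("test", "facingUp=" + facingUp + " ratio=" + ratio);
}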
The general advice about these kinds of sensors is to avoid them, or at least not to build your application around really precise measurements.
The problem with these sensors is that their value ranges vary between devices and they are quite noisy, so you have to normalize the values yourself, and even then the accuracy is not very reliable.
I forget the exact name of the talk, but even Google and the Android team suggest not relying on these sensors for decimal-level precision or expecting great accuracy; also remember that every component in these smartphones and tablets is really cheap, and the sensors themselves usually cost no more than about $1.
Try to write your code so that it only needs the general direction or axis; if you need a value, stick to integers and rough calculations, and do not expect precision down to the decimals.
Also remember that Android doesn't offer a normalized view of these values, so you have to deal with that in your code.
Using only the phone's (Android) built-in accelerometer, how would I go about finding its velocity?
I have been tinkering with the maths, but whatever function I come up with tends to lead to exponential growth of the velocity. I am working from the assumption that on startup of the app, the phone is at a standstill. This should definitely make finding the velocity (at least roughly) possible.
I have a decent background in physics and math too, so I shouldn't have any difficulty with any concepts here.
How should I do it?
That will really depend on what the acceleration is and for how long. A mild, long acceleration could be measurable, but any sudden increase in acceleration, followed by a constant velocity, will make your measurements quite difficult and prone to error.
Assuming constant acceleration, the formula is extremely simple: a = (v1 - v0) / t. So, knowing the time and the acceleration, and assuming v0 = 0, you get v1 = a * t.
In the real world you probably won't have constant acceleration, so you should calculate the change in velocity (delta v) for each measurement and add up all those changes to get the final velocity. You won't have continuous acceleration data, so this is the most feasible approach (i.e. real sampled data versus the integral from theory).
Either way, even in the best scenario you will end up with a very large error margin, so I do not recommend this approach for any app that truly depends on real velocities.
First, you have to remove the acceleration due to gravity from the accelerometer data. Then it's just a matter of integrating the acceleration to get the velocity. Don't forget that acceleration and velocity are vectors, not scalars, and that you will also have to track the rotation of the phone in space to keep the acceleration vector in the same frame as the calculated velocity vector.
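As a rough illustration of that integration step (and nothing more, given the drift everyone warns about), here is a sketch that sums gravity-free samples from TYPE_LINEAR_ACCELERATION into a velocity vector in the device's own frame; a real implementation would also rotate each sample into a world frame using the rotation vector sensor:

// Naive velocity estimate: v += a * dt, using the gravity-free virtual sensor.
// Expect significant drift after a few seconds; this is only a sketch.
private final float[] velocity = new float[3];
private long lastTimestampNs = 0;

@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() != Sensor.TYPE_LINEAR_ACCELERATION) return;
    if (lastTimestampNs != 0) {
        float dt = (event.timestamp - lastTimestampNs) * 1.0e-9f; // ns -> s
        for (int i = 0; i < 3; i++) {
            velocity[i] += event.values[i] * dt; // accumulate delta-v per sample
        }
    }
    lastTimestampNs = event.timestamp;
}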
There is nothing to do but agree with the reasonable arguments in the other answers; however, if you are the pragmatic type like me, you still need a solution that works somehow.
I had a similar problem and decided to roll my own solution after not finding one online. I only needed a simple "tilt" input for controlling a game, so this solution will probably NOT work for more complex needs, but I decided to share it in case others were looking for something similar.
NOTE: I have pasted my entire code here, and it is free to use for any purpose.
Basically, the code looks for an accelerometer. If none is found, tilt feedback is disabled. If an accelerometer is present, I also look for a magnetic field sensor, and if that is present too, I compute the tilt angle the recommended way by combining accelerometer and magnetic field data.
public TiltSensor(Context c) {
    man = (SensorManager) c.getSystemService(Context.SENSOR_SERVICE);
    mag_sensor = man.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD);
    acc_sensor = man.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
    has_mag = man.registerListener(this, mag_sensor, delay);
    has_acc = man.registerListener(this, acc_sensor, delay);
    if (has_acc) {
        tiltAvailble = true;
        if (has_mag) {
            Log.d("TiltCalc", "Using accelerometer + compass.");
        } else {
            Log.d("TiltCalc", "Using only accelerometer.");
        }
    } else {
        tiltAvailble = false;
        Log.d("TiltCalc", "No acceptable hardware found, tilt not available.");
        // No use in having listeners registered
        pause();
    }
}
If only the accelerometer is present, I fall back to accumulating the acceleration, continuously damped (multiplied by 0.99) to remove drift. For my simple tilt needs this works great.
@Override
public void onSensorChanged(SensorEvent e) {
    final float[] vals = e.values;
    final int type = e.sensor.getType();
    switch (type) {
        case (Sensor.TYPE_ACCELEROMETER): {
            needsRecalc = true;
            if (!has_mag) {
                System.arraycopy(accelerometer, 0, old_acc, 0, 3);
            }
            System.arraycopy(vals, 0, accelerometer, 0, 3);
            if (!has_mag) {
                for (int i = 0; i < 3; i++) {
                    // Accumulate changes
                    final float sensitivity = 0.08f;
                    dampened_acc[i] += (accelerometer[i] - old_acc[i]) * sensitivity;
                    // Even out drift over time
                    dampened_acc[i] *= 0.99;
                }
            }
        }
        break;
        case (Sensor.TYPE_MAGNETIC_FIELD): {
            needsRecalc = true;
            System.arraycopy(vals, 0, magnetic_field, 0, 3);
        }
        break;
    }
}
In conclusion, I'll repeat that this is probably not "correct" in any way; it simply works as a simple input to a game. To use this code I do something like the following (yes, magic constants are bad, mkay):
Ship ship = mShipLayer.getShip();
mTiltSensor.getTilt(vals);
float deltaY = -vals[1] * 2; // 1 is the index of the axis we are after
float offset = deltaY - (deltaY / 1.5f);
if (null != ship) {
    ship.setOffset(offset);
}
Enjoi!
Integrating acceleration to get velocity is an unstable problem and your error will diverge after a couple of seconds or so. Phone accelerometers are also not very accurate, which doesn't help, and some of them don't allow you to distinguish between tilt and translation easily, in which case you're really in trouble.
The accelerometers in a phone are pretty much useless for such a task. You need highly accurate accelerometers with very low drift - something which is way beyond what you will find in a phone. At best you might get useful results for a few seconds, or if very lucky for a minute or two after which the results become meaningless.
Also, you need to have a three axis gyroscope which you would use to integrate the velocity in the right direction. Some phones have gyros, but they are even poorer than the accelerometers as far as drift and accuracy are concerned.
One possibly useful application though would be to use the accelerometers in conjunction with gyros or the magnetic compass to fill in for missing data from the GPS. Each time the GPS gives a good fix one would reset the initial conditions of position, speed and orientation and the accelerometers would provide the data until the next valid GPS fix.
Gravity is going to ruin all of your measurements. The phone, at a standstill, experiences a constant upward (yes, up) acceleration. An accelerometer can't distinguish between acceleration and gravity (technically they are equivalent), so the integration would reach extremely high velocities after a few seconds.
If you never tilt your accelerometer even slightly, then you can simply subtract the constant gravitational pull from the z-axis (or whichever axis points up/down), but that's quite unlikely.
Basically, you have to use a more complicated combination of a gyroscope/magnetometer and an accelerometer to work out the exact direction of gravity and then subtract it.
v = ∫ a dt ?
Generally, though, I'd think the inaccuracies in the accelerometers would make this quite tough.
If the phone is at a standstill, you have zero acceleration, so your speed is 0. You should probably instead get location data from GPS, take the associated time stamps, and compute velocity as distance over time.
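A minimal sketch of that GPS-based approach, either using the speed the fix already carries or computing distance over time between two fixes (listener registration and permissions omitted):

// Rough speed from consecutive GPS fixes; assumes a registered LocationListener.
private Location lastFix;

@Override
public void onLocationChanged(Location location) {
    if (location.hasSpeed()) {
        float speedMps = location.getSpeed(); // speed reported by the GPS chip, m/s
        Log.d("speed", "GPS speed: " + speedMps);
    } else if (lastFix != null) {
        float distanceM = lastFix.distanceTo(location);               // metres
        float dtS = (location.getTime() - lastFix.getTime()) / 1000f; // seconds
        if (dtS > 0) {
            Log.d("speed", "Computed speed: " + (distanceM / dtS));
        }
    }
    lastFix = location;
}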
The values I'm getting for accel, x, y and z below are not what I expected.
It seems to be acting as a tilt sensor rather than an accelerometer.
When I throw the phone in the air and catch it, the accel value doesn't change by more than about 10%. Contrast this with when I rotate the phone randomly: I get much larger variations of 50-100%!
What could explain this? I simply want to detect when the phone is in free fall (and/or impacting something).
SensorManager sm = (SensorManager) getSystemService(SENSOR_SERVICE);
Sensor sensor = sm.getDefaultSensor(SensorManager.SENSOR_ACCELEROMETER);
sm.registerListener(
    sel = new SensorEventListener() {
        @Override public void onAccuracyChanged(Sensor arg0, int arg1) {}
        @Override public void onSensorChanged(SensorEvent arg0) {
            double x = se.values[0];
            double y = se.values[1];
            double z = se.values[2];
            double accel = Math.sqrt(
                Math.pow(x, 2) +
                Math.pow(y, 2) +
                Math.pow(z, 2));
        }
    },
    sensor,
    SensorManager.SENSOR_DELAY_NORMAL
);
(As a side question, the values for x, y and z seem much higher than they should be, with accel averaging at about 50-80 when standing still. Shouldn't it be around 9.8?)
The x, y and z values seem very sensitive to changes in the orientation of the phone, but not at all representative of acceleration. Am I missing something?
Example values with phone still, lying on back:
Accel = 85.36, x = 6.8, y = 45.25 z = 30.125
I had to replace
Sensor sensor = sm.getDefaultSensor(SensorManager.SENSOR_ACCELEROMETER);
with
Sensor sensor = sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
It could be because when you throw the phone it stays in roughly the same plane and orientation and moves at low speed, whereas when you turn the phone its course and orientation change rapidly, which gives larger changes in the result. "Accelerometer" may be a misnomer for a multi-function device; there could be a selection parameter for the function you really want results from.
With the phone lying on its back you should get close to zero on the X and Y axes and about 9.8 on the Z axis. The 9.8 is, of course, due to gravity.
My first thought would be that there is something wrong with the phone, and I would suggest trying the same code on another phone.
However, I notice that there is also something wrong in the math, though I haven't figured out what yet.
With x, y and z at the values you mention, the resultant (square root of the sum of squares) works out to 54.78 rather than the 85.36 in your post.
I'm quite new to Java, so I cannot easily spot what might be wrong, and I haven't yet had the chance to try that piece of code on my phone, but I think the math is simple enough for me to determine that the result is wrong (or at least I hope so).
The other thing to check (assuming you figure out the math problem) is that the small change when you throw the phone in the air might simply be due to the slow response time. The accelerometer output may be changing too slowly, so by the time the phone has landed the output hasn't changed much. The response can be improved by using SENSOR_DELAY_GAME or SENSOR_DELAY_FASTEST instead of SENSOR_DELAY_NORMAL.
By the way, shouldn't that be arg0.values[] rather than se.values[]? Where does se come from? The sensor values arrive in the argument of onSensorChanged (arg0 in this case), so I cannot figure out how they are supposed to end up in se. (But then again there are many things in Java I still don't understand.)
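For what it's worth, a corrected version of the listener along those lines might look like the sketch below, reading from the callback's own event parameter and using the non-deprecated Sensor.TYPE_ACCELEROMETER constant (this is my reconstruction, not the original poster's code):

SensorManager sm = (SensorManager) getSystemService(SENSOR_SERVICE);
Sensor sensor = sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
sm.registerListener(new SensorEventListener() {
    @Override public void onAccuracyChanged(Sensor sensor, int accuracy) {}

    @Override public void onSensorChanged(SensorEvent event) {
        double x = event.values[0];
        double y = event.values[1];
        double z = event.values[2];
        // Total acceleration magnitude: ~9.8 at rest, ~0 in free fall, spikes on impact.
        double accel = Math.sqrt(x * x + y * y + z * z);
        Log.d("accel", "magnitude=" + accel);
    }
}, sensor, SensorManager.SENSOR_DELAY_GAME); // faster delay for throw/impact detection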
I'm writing an application and my aim is to detect when a user is walking.
I'm using a Kalman filter like this:
float kFilteringFactor=0.6f;
gravity[0] = (accelerometer_values[0] * kFilteringFactor) + (gravity[0] * (1.0f - kFilteringFactor));
gravity[1] = (accelerometer_values[1] * kFilteringFactor) + (gravity[1] * (1.0f - kFilteringFactor));
gravity[2] = (accelerometer_values[2] * kFilteringFactor) + (gravity[2] * (1.0f - kFilteringFactor));
linear_acceleration[0] = (accelerometer_values[0] - gravity[0]);
linear_acceleration[1] = (accelerometer_values[1] - gravity[1]);
linear_acceleration[2] = (accelerometer_values[2] - gravity[2]);
float magnitude = 0.0f;
magnitude = (float)Math.sqrt(linear_acceleration[0]*linear_acceleration[0]+linear_acceleration[1]*linear_acceleration[1]+linear_acceleration[2]*linear_acceleration[2]);
magnitude = Math.abs(magnitude);
if (magnitude > 0.2) {
    // walking
}
The gravity[] array is initialized with zeros.
I can detect whether a user is walking (by looking at the magnitude of the acceleration vector), but my problem is that when a user is not walking and just moves the phone, it looks as if he is walking.
Am I using the right filter?
Is it right to watch only the magnitude of the vector, or do I have to look at the individual components?
Google provides an API for this called DetectedActivity, which can be obtained through the ActivityRecognitionApi. The docs can be accessed here and here.
DetectedActivity has the method public int getType() to get the current activity of the user, and also public int getConfidence(), which returns a value from 0 to 100. The higher the value returned by getConfidence(), the more certain the API is that the user is performing the returned activity.
Here is a summary of the constants returned by getType():
int IN_VEHICLE The device is in a vehicle, such as a car.
int ON_BICYCLE The device is on a bicycle.
int ON_FOOT The device is on a user who is walking or running.
int RUNNING The device is on a user who is running.
int STILL The device is still (not moving).
int TILTING The device angle relative to gravity changed significantly.
int UNKNOWN Unable to detect the current activity.
int WALKING The device is on a user who is walking.
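A rough sketch of wiring that up with the Play Services ActivityRecognitionClient (request updates into a PendingIntent, then read the most probable DetectedActivity in a receiver); the receiver class name, interval and confidence cutoff are my own choices:

// Request activity updates (requires the Google Play Services location dependency
// and, on newer Android versions, the ACTIVITY_RECOGNITION permission).
PendingIntent pi = PendingIntent.getBroadcast(context, 0,
        new Intent(context, ActivityReceiver.class), PendingIntent.FLAG_UPDATE_CURRENT);
ActivityRecognition.getClient(context).requestActivityUpdates(10_000L, pi);

// In the receiver, inspect the most probable activity:
public class ActivityReceiver extends BroadcastReceiver {
    @Override
    public void onReceive(Context context, Intent intent) {
        if (ActivityRecognitionResult.hasResult(intent)) {
            DetectedActivity activity =
                    ActivityRecognitionResult.extractResult(intent).getMostProbableActivity();
            if (activity.getType() == DetectedActivity.WALKING && activity.getConfidence() > 75) {
                Log.d("activity", "User is probably walking");
            }
        }
    }
}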
My first intuition would be to run an FFT analysis on the sensor history and see which frequencies have high magnitudes when walking.
It's essentially seeing what walking "sounds like": treat the accelerometer input like a microphone and look at which frequencies are loud when walking (in other words, at what frequency the biggest accelerations happen).
I'd guess you'd be looking for a high magnitude at some low frequency (around the footstep rate), or maybe something else. It would be interesting to see the data.
My guess is you run the FFT and check whether the magnitude at some frequency is greater than some threshold, or whether the difference between the magnitudes of two frequencies exceeds some amount. Again, the actual data would determine how you detect it.
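To make that concrete, here is a small sketch that buffers acceleration magnitudes and checks (with a naive single-bin DFT, so no FFT library is needed) whether there is significant energy in the typical step-frequency band of roughly 1-3 Hz; the window size, sample rate and threshold are assumptions to tune against real recordings:

// Naive spectral check for walking: is there energy around 1-3 Hz?
public final class WalkingDetector {
    private static final int N = 128;            // samples in the analysis window
    private static final float SAMPLE_RATE = 50; // assumed accelerometer rate, Hz
    private final float[] buffer = new float[N];
    private int index = 0;

    /** Feed one acceleration-magnitude sample; returns true when a full window looks like walking. */
    public boolean addSample(float magnitude) {
        buffer[index++] = magnitude;
        if (index < N) return false;
        index = 0;
        return bandEnergy(1.0f, 3.0f) > 50f; // threshold is an assumption, tune experimentally
    }

    /** Sum of DFT bin magnitudes between loHz and hiHz (direct evaluation, O(N) per bin). */
    private float bandEnergy(float loHz, float hiHz) {
        float energy = 0f;
        int loBin = Math.max(1, Math.round(loHz * N / SAMPLE_RATE));
        int hiBin = Math.round(hiHz * N / SAMPLE_RATE);
        for (int k = loBin; k <= hiBin; k++) {
            double re = 0, im = 0;
            for (int n = 0; n < N; n++) {
                double angle = 2 * Math.PI * k * n / N;
                re += buffer[n] * Math.cos(angle);
                im -= buffer[n] * Math.sin(angle);
            }
            energy += Math.hypot(re, im);
        }
        return energy;
    }
}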
For walking detection I use the derivative applied to the smoothed accelerometer signal. When the derivative is greater than a threshold value, I assume it was a step. I guess it's not best practice, though, and it only works when the phone is in a trouser pocket.
The following code was used in this app: https://play.google.com/store/apps/details?id=com.tartakynov.robotnoise
@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() != Sensor.TYPE_ACCELEROMETER) {
        return;
    }
    final float z = smooth(event.values[2]); // scalar Kalman filter
    if (Math.abs(z - mLastZ) > LEG_THRSHOLD_AMPLITUDE) {
        mInactivityCount = 0;
        int currentActivity = (z > mLastZ) ? LEG_MOVEMENT_FORWARD : LEG_MOVEMENT_BACKWARD;
        if (currentActivity != mLastActivity) {
            mLastActivity = currentActivity;
            notifyListeners(currentActivity);
        }
    } else {
        if (mInactivityCount > LEG_THRSHOLD_INACTIVITY) {
            if (mLastActivity != LEG_MOVEMENT_NONE) {
                mLastActivity = LEG_MOVEMENT_NONE;
                notifyListeners(LEG_MOVEMENT_NONE);
            }
        } else {
            mInactivityCount++;
        }
    }
    mLastZ = z;
}
EDIT: I don't think it's accurate enough, since when walking normally the average acceleration is near 0. The most you could do by measuring acceleration is detect when someone starts or stops walking (but, as you said, it's difficult to distinguish that from the device being moved by someone standing in one place).
So what I wrote earlier probably wouldn't work anyway:
You can "predict" whether the user is moving by discarding the cases when the user is clearly not moving (obvious), and the first two options that come to mind are:
Check whether the phone is "hidden", using the proximity sensor and optionally the light sensor. This method is less accurate but easier (a sketch follows after this list).
Check the continuity of the movement: if the phone has been moving for more than, say, 10 seconds and the movement is not negligible, then consider the user to be walking. I know it's not perfect either, but it's difficult without using any kind of positioning. By the way, why don't you just use the LocationManager?
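For the first option, a minimal sketch of the proximity check (treating anything closer than the sensor's maximum range as "hidden", e.g. in a pocket); the logging tag and structure are my own:

// "Hidden" check via the proximity sensor: most phones report either 0 (near)
// or the maximum range (far), so compare against getMaximumRange().
SensorManager sm = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
Sensor proximity = sm.getDefaultSensor(Sensor.TYPE_PROXIMITY);
sm.registerListener(new SensorEventListener() {
    @Override public void onAccuracyChanged(Sensor sensor, int accuracy) {}

    @Override public void onSensorChanged(SensorEvent event) {
        boolean hidden = event.values[0] < event.sensor.getMaximumRange();
        Log.d("proximity", hidden ? "Phone is covered (maybe in a pocket)" : "Phone is uncovered");
    }
}, proximity, SensorManager.SENSOR_DELAY_NORMAL);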
Try detecting the up-and-down oscillations, the fore-and-aft oscillations and the frequency of each, and check that on average they stay aligned within bounds. You are effectively detecting that person's gait, which should stay relatively constant over several consecutive steps to qualify as walking.
As long as the last 3 oscillations line up within reason, conclude that walking is occurring, provided this also holds:
Measure the horizontal acceleration and update a velocity value with it. The velocity will drift over time, so keep a moving average of velocity smoothed over the duration of a step; as long as it doesn't drift by more than, say, half of walking speed per 3 oscillations, it's walking, but only if it initially rose to walking speed within a short time, i.e. half a second or perhaps 2 oscillations.
All of that should just about cover it.
Of course, a little AI would help make things simpler, or just as complex but amazingly accurate, if you fed all of these signals into a neural network as preprocessed inputs.