Hi everyone, I have two questions.
1) I have not been able to find out which units Location.distanceBetween expects the latitude and longitude in. Is it degrees or microdegrees? And what is the unit of the distance it returns?
Sorry for these noob questions, but I have not been able to find anything in the documentation.
2) On Windows XP, using Eclipse 3.3.2, the emulator does not send coordinates properly. Whether I set them by hand or load a GPX file, the LocationListener is never invoked. I have tried this same code on Ubuntu and it works fine. Does anyone know how I can solve this? There is no Linux installed in the office, and I cannot take my personal laptop there.
Thanks a lot in advance!
1) From the Android source:
public static void distanceBetween(double startLatitude, double startLongitude,
        double endLatitude, double endLongitude, float[] results) {
    if (results == null || results.length < 1) {
        throw new IllegalArgumentException("results is null or has length < 1");
    }
    computeDistanceAndBearing(startLatitude, startLongitude,
            endLatitude, endLongitude, results);
}
And inside computeDistanceAndBearing there is a comment that says:
// Based on http://www.ngs.noaa.gov/PUBS_LIB/inverse.pdf
// using the "Inverse Formula" (section 4)
so I would check there.
2) Go to Settings > Applications > Development and check that "Allow mock locations" is enabled.
I had problems with the emulator not getting locations from DDMS, and it turned out to be because I had a non-English locale, which appears to break it. You can try adding "-Duser.language=en" to the start parameters for java in the DDMS start script and see if it helps. The bug is here.
Oh, and I don't know the input units for lat/long, but the resulting float is in metres according to the API docs.
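For what it's worth, a minimal sketch of how the call is typically used, with the coordinates passed as plain decimal degrees and the result read back in metres (the helper name and the coordinate values are just made up for illustration):

import android.location.Location;

public final class DistanceDemo {
    // Returns the distance in metres between two points given in decimal degrees.
    public static float distanceInMetres(double startLat, double startLng,
                                         double endLat, double endLng) {
        float[] results = new float[1];
        Location.distanceBetween(startLat, startLng, endLat, endLng, results);
        return results[0];
    }
}

Calling distanceInMetres(52.5200, 13.4050, 52.5206, 13.4094), for example, returns a value on the order of a few hundred metres; the inputs are decimal degrees, not microdegrees.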
I am working on both an Android and an iOS application.
Both applications use Google's PlaceAutocomplete controller to get a location's lat/lng. On iOS we get the lat/lng up to 6 decimal places, sometimes 5, whereas on Android we get more than 6 decimal places. So the precision of the same location's coordinates differs between Android and iOS.
For example, consider the location Pune:
Android LatLng: 18.520431, 73.856744
iOS LatLng: 18.52043, 73.856744
As you can see, the latitudes of the same location differ in precision.
Is there a way to avoid this, as my application needs to compare these lat/lngs?
You should not rely on the precision of the coordinates to compare them, because it can change or, as you have seen, vary between platforms.
Instead, you can set a tolerance to determine whether two locations are the same. For example:
float YOUR_TOLERANCE = 1; // 1 meter
if (location.distanceTo(otherLocation) < YOUR_TOLERANCE) {
    // Both locations are considered the same
}
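If you only have the raw doubles returned by PlaceAutocomplete rather than Location objects, the same idea works with the static Location.distanceBetween. A sketch, assuming that is your situation (sameLocation and toleranceMetres are just illustrative names):

import android.location.Location;

// Sketch: compare two raw lat/lng pairs with a distance tolerance in metres.
public static boolean sameLocation(double lat1, double lng1,
                                   double lat2, double lng2,
                                   float toleranceMetres) {
    float[] results = new float[1];
    Location.distanceBetween(lat1, lng1, lat2, lng2, results);
    return results[0] < toleranceMetres;
}

With the Pune values above, sameLocation(18.520431, 73.856744, 18.52043, 73.856744, 1f) returns true, since dropping the sixth decimal place moves the point by roughly a tenth of a metre.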
In Android there are also three types of location providers:
GPS_PROVIDER
NETWORK_PROVIDER
PASSIVE_PROVIDER
In my coding experience I have found that if you use:
locationManager.requestLocationUpdates(LocationManager.GPS_PROVIDER, 5000, 10, new MyLocationListener());
you get coordinates with very long decimal precision, often 14+ digits, whereas if you use the fused provider:
LocationServices.FusedLocationApi.requestLocationUpdates(mGoogleApiClient, mLocationRequest, my_google_listener);
you get 6 to 7 digits of precision. Try it!
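If all you need is for the stored values to match across platforms, another option is to round both sides to a common number of decimal places before comparing; 5 decimal places is roughly 1.1 m of latitude, which matches the iOS precision mentioned above. A small sketch (roundTo is just an illustrative helper):

// Sketch: normalise coordinates to a fixed number of decimal places before comparing.
public static double roundTo(double degrees, int decimalPlaces) {
    double factor = Math.pow(10, decimalPlaces);
    return Math.round(degrees * factor) / factor;
}

roundTo(18.520431, 5) and roundTo(18.52043, 5) both give 18.52043, so the Android and iOS values above compare equal. The distance-based tolerance is still the more robust check.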
I'm using google.maps.geometry.spherical.computeDistanceBetween() to compute the distance between relatively close points (10-30 meters). This works perfectly on Linux (Chrome and Firefox), but sometimes gives me crazy results on Android. One case I got was with this:
var p1 = new google.maps.LatLng(-22.960584,-43.206687999999986);
var p2 = new google.maps.LatLng(-22.960584,-43.206939000000034);
alert(google.maps.geometry.spherical.computeDistanceBetween(p1,p2));
It should give 25 meters or so, yet once I got hundreds of thousands of meters. Again, I don't always get crazy values, just "sometimes", probably related to doing lots of computations?
Is this a well-known bug? If it is, I cannot use this method and will have to write my own.
Thanks,
L.
As far as I can tell, this is an Android bug. I think it could very well explain why in the MyTracks app I usually get random points on other continents and huge distances.
I computed the distance with the function below instead, and now I always get correct values. If this is what it looks like, it is a very serious bug for any Android app that uses this function.
In case anyone cares, this is the distance function between two LatLng points p and q:
function dist(p, q) {
    var c = Math.PI / 180;
    // Google (gives randomly wrong results in Android!)
    //return google.maps.geometry.spherical.computeDistanceBetween(p, q);
    // Chord
    //return 9019995.5222 * Math.sqrt((1 - Math.cos(c*(p.lat() - q.lat())))
    //    + (1 - Math.cos(c*(p.lng() - q.lng()))) * Math.cos(c*p.lat()) * Math.cos(c*q.lat()));
    // Taylor for chord
    return 111318.845 * Math.sqrt(Math.pow(p.lat() - q.lat(), 2)
        + Math.pow(p.lng() - q.lng(), 2) * Math.cos(c*p.lat()) * Math.cos(c*q.lat()));
}
Notice that these are the computations for the chord, that is, the distance in R^3, not the geodesic distance on the sphere. That is certainly more than enough for hiking/car-travel computations using GPS. I ended up using the Taylor expansion since it is precise to 1/10 mm and easier on the CPU.
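For anyone hitting this from native Android code rather than the JavaScript Maps API, a direct Java port of the Taylor-for-chord formula above (a sketch with a made-up method name; inputs in decimal degrees, result in metres, valid for short distances only):

// Sketch: Java port of the Taylor-for-chord approximation from the function above.
public static double chordDistanceMetres(double lat1, double lng1,
                                         double lat2, double lng2) {
    double c = Math.PI / 180;
    return 111318.845 * Math.sqrt(Math.pow(lat1 - lat2, 2)
            + Math.pow(lng1 - lng2, 2) * Math.cos(c * lat1) * Math.cos(c * lat2));
}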
I am writing an application using PhoneGap to store an updated lat/lon every 5 seconds in a MySQL database. I would like my users to be able to see the total distance traveled since starting the app.
I've taken a look at the PhoneGap geolocation API and cannot see a way to calculate the total distance traveled based on lat/lon updates. Is there a way to accomplish this?
EDIT: @Drew, thanks for the link. I have looked it over and the JS version of the Haversine formula looks straightforward. The difficult part will be the way PhoneGap pulls and stores lat/lon. Currently my function to get and send the location to MySQL is:
function geo_success(position) {
    $("#status p").text("Tracking active");
    $('#status').removeClass("stopped").addClass("active");
    $('button').text("Stop tracking");
    latlon.lat = position.coords.latitude;
    latlon.lon = position.coords.longitude;
    latlon.alt = position.coords.altitude;
    if (!position.coords.speed) { latlon.speed = 0; }
    else { latlon.speed = position.coords.speed; }
    if (first) {
        intervalId = setInterval(send, 5000);
    }
    first = false;
}
Is there a way you can think of to store the latest values as lat1/lon1 and keep the previous ones as lat2/lon2, cycling the newest incoming coordinates through those two sets of variables? That way I could take the distance d returned by the Haversine formula and store it in the DB (to be able to sum it up later). Many thanks.
You would have to write that yourself: take the coordinates every 5 seconds, do some algebra to determine the distance between the new pair and the previous pair, add it to a running total somewhere, then repeat for the next 5 seconds.
For the actual algorithm for calculating the distance, look at this answer.
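A sketch of that bookkeeping, written in Java for consistency with the rest of this page (the same structure carries straight over to the PhoneGap JavaScript); DistanceTracker and everything in it is a made-up illustration, using a standard Haversine on a 6371 km sphere:

// Sketch: keep the previous fix, add the Haversine distance to a running total
// each time a new fix arrives, and persist the total (or each leg) as needed.
public class DistanceTracker {
    private static final double EARTH_RADIUS_M = 6371000.0;

    private Double prevLat = null;
    private Double prevLon = null;
    private double totalMetres = 0.0;

    // Call with every new fix (e.g. every 5 seconds); returns the running total.
    public double onNewFix(double lat, double lon) {
        if (prevLat != null) {
            totalMetres += haversineMetres(prevLat, prevLon, lat, lon);
        }
        prevLat = lat;
        prevLon = lon;
        return totalMetres;
    }

    private static double haversineMetres(double lat1, double lon1, double lat2, double lon2) {
        double dLat = Math.toRadians(lat2 - lat1);
        double dLon = Math.toRadians(lon2 - lon1);
        double a = Math.sin(dLat / 2) * Math.sin(dLat / 2)
                + Math.cos(Math.toRadians(lat1)) * Math.cos(Math.toRadians(lat2))
                * Math.sin(dLon / 2) * Math.sin(dLon / 2);
        return 2 * EARTH_RADIUS_M * Math.atan2(Math.sqrt(a), Math.sqrt(1 - a));
    }
}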
I am creating an app in Android where I need to detect whether the person has fallen down. I know this question has been asked and answered (use vector mathematics) in other forums, but I am not getting accurate results out of it.
Below is my code to detect the fall:
@Override
public void onSensorChanged(SensorEvent arg0) {
    if (arg0.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
        double gvt = SensorManager.STANDARD_GRAVITY;
        float vals[] = arg0.values;
        //int sensor = arg0.sensor.getType();
        double xx = arg0.values[0];
        double yy = arg0.values[1];
        double zz = arg0.values[2];
        // Magnitude of the acceleration vector (about 9.8 m/s^2 when the phone is at rest).
        double aaa = Math.round(Math.sqrt(Math.pow(xx, 2)
                + Math.pow(yy, 2)
                + Math.pow(zz, 2)));
        if (aaa <= 6.0) {          // possible free fall
            min = true;
            //mintime = System.currentTimeMillis();
        }
        if (min == true) {
            i++;
            if (aaa >= 13.5) {     // possible impact
                max = true;
            }
        }
        if (min == true && max == true) {
            Toast.makeText(FallDetectionActivity.this, "FALL DETECTED!!!!!", Toast.LENGTH_LONG).show();
            i = 0;
            min = false;
            max = false;
        }
        if (i > 4) {               // no impact followed the dip, so reset
            i = 0;
            min = false;
            max = false;
        }
    }
}
To explain the above code: I take the vector magnitude and check whether it drops to 6 or below (during the fall) and then suddenly rises to 13.5 or above (on landing) to confirm the fall.
Now, I was told in the forums that when the device is still, the vector magnitude should be about 9.8; during a fall it should be close to 0, and it should go to around 20 on landing. This doesn't seem to happen in my case. Can anybody suggest where I am going wrong?
There is a guy who developed an Android app for that. Maybe you can get some information from his site: http://ww2.cs.fsu.edu/~sposaro/iFall/. He also wrote a paper explaining how he detects the fall. It is really interesting; you should check it out!
Link to the paper: http://ww2.cs.fsu.edu/~sposaro/publications/iFall.pdf
Summarizing, the fall detection is based on the resultant of the X-Y-Z acceleration. Based on this value:
When falling, the fall generally starts with a free-fall period, making the resultant drop significantly below 1 g.
On impact with the ground, there is a peak in the amplitude of the resultant, with values higher than 3 g.
After that, if the person cannot move due to the fall, the resultant will remain close to 1 g.
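Note that those thresholds are expressed in g, while onSensorChanged gives you m/s², so the 6.0 and 13.5 thresholds in the question correspond to roughly 0.6 g and 1.4 g. A small sketch of the conversion, assuming the accelerometer values are passed straight in (magnitudeInG is a made-up name):

import android.hardware.SensorManager;

// Sketch: express the acceleration magnitude in g so it can be compared against
// the thresholds above (well below 1 g in free fall, above 3 g on impact).
public static double magnitudeInG(float x, float y, float z) {
    double magnitude = Math.sqrt(x * x + y * y + z * z);   // m/s^2
    return magnitude / SensorManager.GRAVITY_EARTH;        // GRAVITY_EARTH is about 9.81 m/s^2
}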
The following will happen if the person / phone falls down:
the absolute value of the acceleration vector goes to 0 (with some noise, of course)
there will be a fair spike in the absolute vector value on landing (up to the maximal value the accelerometer can provide)
When the phone is immobile, you have a vector whose modulus equals earth gravity, pointing up.
Your code is basically correct, but I would use some averaging, because the accelerometers used in phones are cheap crap: noisy and lacking precision.
Adding averaging to your signal means taking a moving average. It depends on your window size. For example, say I have a vector with the numbers 1, 2, 3, 4, 5, 6 and my window size is 2. The moving average then takes every two consecutive numbers and averages them: (1+2)/2, then move on to the next pair, (2+3)/2, and so on.
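A minimal sketch of that idea applied to the magnitude samples (the class name and window size are arbitrary choices for illustration):

import java.util.ArrayDeque;
import java.util.Deque;

// Sketch: simple moving average over the last N samples.
public class MovingAverage {
    private final int windowSize;
    private final Deque<Double> window = new ArrayDeque<>();
    private double sum = 0.0;

    public MovingAverage(int windowSize) {
        this.windowSize = windowSize;
    }

    // Feed each new magnitude sample in; get back the smoothed value.
    public double add(double value) {
        window.addLast(value);
        sum += value;
        if (window.size() > windowSize) {
            sum -= window.removeFirst();
        }
        return sum / window.size();
    }
}

Feeding 1, 2, 3, 4, 5, 6 through new MovingAverage(2) gives 1, 1.5, 2.5, 3.5, 4.5, 5.5, which matches the pairs above; you would then run the free-fall/impact checks on the smoothed magnitude instead of the raw one.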
I have a location with latitude and longitude and want to get a new location that is a distance of x meters from that location at an angle of d degrees. This would be the reverse of Location.distanceBetween(). Is there an Android API to do that? I know I could program such a function myself, but I wonder if there is already an API for it.
There are some formulae and sample code (JavaScript) for this here: Movable Type Scripts. Look for 'Destination point given distance and bearing from start point'.
Here's an excerpt of the JavaScript from the site:
var lat2 = Math.asin( Math.sin(lat1)*Math.cos(d/R) +
Math.cos(lat1)*Math.sin(d/R)*Math.cos(brng) );
var lon2 = lon1 + Math.atan2(Math.sin(brng)*Math.sin(d/R)*Math.cos(lat1),
Math.cos(d/R)-Math.sin(lat1)*Math.sin(lat2));
In the above code, d is the distance, brng is the bearing in radians (as are lat1 and lon1, since the Math trig functions work in radians), and R is the Earth's radius.
Porting this to Java should be trivial.
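For example, a sketch of such a port (the method name and the choice of returning a double[] of {lat, lon} are mine; inputs in decimal degrees and metres, spherical Earth with a mean radius of 6371 km):

// Sketch: destination point given a start point, distance and bearing.
public static double[] destinationPoint(double latDeg, double lonDeg,
                                        double distanceMetres, double bearingDeg) {
    double R = 6371000.0;                  // mean Earth radius in metres
    double lat1 = Math.toRadians(latDeg);
    double lon1 = Math.toRadians(lonDeg);
    double brng = Math.toRadians(bearingDeg);
    double dR = distanceMetres / R;        // angular distance

    double lat2 = Math.asin(Math.sin(lat1) * Math.cos(dR)
            + Math.cos(lat1) * Math.sin(dR) * Math.cos(brng));
    double lon2 = lon1 + Math.atan2(Math.sin(brng) * Math.sin(dR) * Math.cos(lat1),
            Math.cos(dR) - Math.sin(lat1) * Math.sin(lat2));

    return new double[] { Math.toDegrees(lat2), Math.toDegrees(lon2) };
}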
This is called the "first geodetic problem" (also known as the direct geodesic problem), which will probably help you find an algorithm on Google if you need to implement this yourself.
Implement it yourself for now, but do expect such a function to show up in the platform at some point, so code accordingly: wrap it in your own function and add a few unit tests.
In the future, add the following to your function:
def myFunc(args):
    res = ...   # compute it yourself
    if debug:
        res2 = ...   # make the API call
        assert res == res2
    return res
And some time later:
def myFunc(args):
    return ...   # just make the API call
And some time later, remove the function altogether.
Here is the reverse of it:
SphericalUtil.computeOffsetOrigin(loc1, dist, angle);
It also has an equivalent of distanceBetween:
SphericalUtil.computeDistanceBetween(...);
The library: SphericalUtil, from the Google Maps Android utility library.
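A minimal usage sketch, assuming the Google Maps Android utility library (com.google.maps.android:android-maps-utils) is on the classpath; note that computeOffset goes from an origin to a destination given a distance and heading, while computeOffsetOrigin solves for the origin instead. The coordinates here are example values:

import com.google.android.gms.maps.model.LatLng;
import com.google.maps.android.SphericalUtil;

public final class OffsetDemo {
    public static void main(String[] args) {
        // Move 100 m from a start point at a heading of 45 degrees,
        // then measure the distance back as a sanity check.
        LatLng start = new LatLng(18.520430, 73.856744);   // example coordinates
        LatLng destination = SphericalUtil.computeOffset(start, 100.0, 45.0);
        double metres = SphericalUtil.computeDistanceBetween(start, destination);
        System.out.println(destination + " is about " + metres + " m away");  // roughly 100 m
    }
}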