I am developing an augmented reality Android application based on real-time location.
The concept is simple: the application should show some places around me. I have
researched this intensively, yet I am still running into issues. I have my GPS coordinates
and the target place's GPS coordinates.
My question is: how can I work out what my phone's camera is looking at (for example, a building)?
What is the logical way to solve something like this?
Augmented reality transforms a real-world coordinate system into the camera's coordinate system. In location-based AR, the real-world system is the geographic coordinate system: we convert the GPS coordinate (latitude, longitude, altitude) to a navigation coordinate (East, North, Up), then transform the navigation coordinate into the camera coordinate and display it on the camera view.
I have created a demo for you:
https://github.com/dat-ng/ar-location-based-android
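For reference, here is a minimal sketch of the GPS-to-ENU step described above (my own code, not taken from the demo). It uses a flat-earth approximation that is adequate for nearby places; the earth-radius constant and the method name are my own choices:

public class GpsToEnu {
    private static final double EARTH_RADIUS = 6371000; // metres (mean radius)

    /** Returns {east, north}: metre offsets from the device to the target. */
    public static double[] toEnu(double lat, double lon, double targetLat, double targetLon) {
        double dLat = Math.toRadians(targetLat - lat);
        double dLon = Math.toRadians(targetLon - lon);
        double north = dLat * EARTH_RADIUS;
        double east = dLon * EARTH_RADIUS * Math.cos(Math.toRadians(lat));
        return new double[] { east, north };
    }
}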
First, check which sensors the device provides. If the device supports TYPE_ROTATION_VECTOR, register a sensor listener for it; otherwise fall back to TYPE_MAGNETIC_FIELD and TYPE_ACCELEROMETER.
SensorManager mSensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
rSensor = mSensorManager.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR);
if (rSensor == null) {
    mSensor = mSensorManager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD);
    aSensor = mSensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
}
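A possible registration step to go with the above (a sketch, assuming the enclosing class implements SensorEventListener; the delay constant is a typical choice, not mandated by the answer):

if (rSensor != null) {
    mSensorManager.registerListener(this, rSensor, SensorManager.SENSOR_DELAY_GAME);
} else {
    mSensorManager.registerListener(this, mSensor, SensorManager.SENSOR_DELAY_GAME);
    mSensorManager.registerListener(this, aSensor, SensorManager.SENSOR_DELAY_GAME);
}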
Then, in the onSensorChanged function, you should calculate the azimuth:
if (rSensor == null) {
    switch (event.sensor.getType()) {
        case Sensor.TYPE_MAGNETIC_FIELD:
            magnetic = event.values.clone();
            break;
        case Sensor.TYPE_ACCELEROMETER:
            accelerometer = event.values.clone();
            break;
    }
    if (magnetic != null && accelerometer != null) {
        Rot = new float[9];
        I = new float[9];
        SensorManager.getRotationMatrix(Rot, I, accelerometer, magnetic);
        float[] outR = new float[9];
        SensorManager.remapCoordinateSystem(Rot, SensorManager.AXIS_X,
                SensorManager.AXIS_Z, outR);
        SensorManager.getOrientation(outR, values);
        azimuth = Math.toDegrees(values[0]); // getOrientation returns radians
        magnetic = null;
        accelerometer = null;
    }
} else {
    SensorManager.getRotationMatrixFromVector(mRotationMatrix, event.values);
    SensorManager.getOrientation(mRotationMatrix, mValues);
    azimuth = Math.toDegrees(mValues[0]);
}
You can use this azimuth to plot the location on the camera view:
double angle = bearing(myLatitude, myLongitude, dLatitude, dLongitude) - azimuth;
double xAxis, yAxis;

if (angle < 0)
    angle = (angle + 360) % 360;

xAxis = Math.sin(Math.toRadians(angle)) * dist;
yAxis = Math.sqrt(Math.pow(dist, 2) - Math.pow(xAxis, 2));

if (angle > 90 && angle < 270)
    yAxis *= -1;

double xAxisPosition = angle * (screenWidth / 90d);
xAxis = xAxisPosition - spotImageWidth / 2;

float x, y;
if (angle <= 45)
    x = (float) ((screenWidth / 2) + xAxis);
else if (angle >= 315)
    x = (float) ((screenWidth / 2) - ((screenWidth * 4) - xAxis));
else
    x = (float) (screenWidth * 9); // push the spot far off-screen: the target is behind you

y = (float) ((screenHeight - 300) - (i * 100)); // i = index of this spot in the list
protected static double bearing(double lat1, double lon1, double lat2, double lon2) {
    double longDiff = Math.toRadians(lon2 - lon1);
    double la1 = Math.toRadians(lat1);
    double la2 = Math.toRadians(lat2);
    double y = Math.sin(longDiff) * Math.cos(la2);
    double x = Math.cos(la1) * Math.sin(la2) - Math.sin(la1) * Math.cos(la2) * Math.cos(longDiff);
    double result = Math.toDegrees(Math.atan2(y, x));
    return (result + 360.0d) % 360.0d;
}
x and y give the screen coordinates of the destination location.
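An alternative projection for x (a sketch of my own, not from the answer above): map the relative angle onto the screen using the camera's horizontal field of view instead of the magic constants. The 60-degree FOV is an assumption; on a real device, query Camera.Parameters#getHorizontalViewAngle():

double horizontalFov = 60.0; // degrees (assumed)
double relAngle = angle <= 180 ? angle : angle - 360; // -180..180, 0 = straight ahead
float xFov = (float) (screenWidth / 2 + relAngle / (horizontalFov / 2) * (screenWidth / 2));
// values outside [0, screenWidth] mean the target is outside the camera's view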
The first step is to use the sensors to get the direction the back camera is facing. You can read more about sensors at http://developer.android.com/reference/android/hardware/SensorManager.html
Once you are done coding with the sensors, come back and ask the next question.
Try the DroidAR SDK: https://github.com/bitstars/droidar. This is an AR SDK for Android, and most of your problems should be solved by it. There are also video manuals. You can also look into the code if you just need certain parts for your project.
There are two directions involved in this problem: the device's and the target's.
The azimuth of the device is shown below:
This information can be collected from the sensors. However, if the orientation of the device is not fixed, you should also call SensorManager.remapCoordinateSystem.
The azimuth to the target is shown below:
It's probably the best figure I could find on the internet.
Once you have the device location, the azimuth to the target can be computed by:
azi = Math.abs(Math.toDegrees(Math.atan((tlon - lon) / (tlat - lat))));
where tlat and tlon are the target's GPS location and lat and lon are the device's location.
The value of this expression lies in -90 to +90, which is not what an azimuth really is, so three additional corrections are needed:
if ((tlon - lon) > 0 && (tlat - lat) < 0) {
    azi = 180 - azi;
}
if ((tlon - lon) < 0 && (tlat - lat) < 0) {
    azi = 180 + azi;
}
if ((tlon - lon) < 0 && (tlat - lat) > 0) {
    azi = 360 - azi;
}
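Gathered into one helper for clarity (a wrapper of my own, following the answer's math; tlat == lat, which would divide by zero, is left unhandled as in the answer):

static double azimuthTo(double lat, double lon, double tlat, double tlon) {
    double azi = Math.abs(Math.toDegrees(Math.atan((tlon - lon) / (tlat - lat))));
    if ((tlon - lon) > 0 && (tlat - lat) < 0) azi = 180 - azi;
    if ((tlon - lon) < 0 && (tlat - lat) < 0) azi = 180 + azi;
    if ((tlon - lon) < 0 && (tlat - lat) > 0) azi = 360 - azi;
    return azi;
}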
After all of this is done, it is easy to detect whether a target is in your sight.
Hope this helps.
Related
I want to display an arrow at my location on a Google Maps view that shows my direction relative to a destination location (instead of north).
a) I have calculated north using the sensor values from the magnetometer and accelerometer. I know this is correct because it lines up with the compass on the Google Maps view.
b) I have calculated the initial bearing from my location to the destination location using myLocation.bearingTo(destLocation).
I'm missing the last step: given these two values (a and b), what formula do I use to get the direction the phone is pointing relative to the destination location?
Appreciate any help for an addled mind!
OK, I figured this out. For anyone else trying to do this, you need:
a) heading: your heading from the hardware compass, in degrees east of magnetic north
b) bearing: the bearing from your location to the destination location, in degrees east of true north:
myLocation.bearingTo(destLocation);
c) declination: the difference between true north and magnetic north
The heading returned from the magnetometer + accelerometer is in degrees east of magnetic north (-180 to +180), so you need the difference between true north and magnetic north at your location. This difference varies depending on where you are on Earth. You can obtain it using the GeomagneticField class:
GeomagneticField geoField;

private final LocationListener locationListener = new LocationListener() {
    public void onLocationChanged(Location location) {
        geoField = new GeomagneticField(
                Double.valueOf(location.getLatitude()).floatValue(),
                Double.valueOf(location.getLongitude()).floatValue(),
                Double.valueOf(location.getAltitude()).floatValue(),
                System.currentTimeMillis()
        );
        ...
    }
};
Armed with these, you can calculate the angle of the arrow to draw on your map, showing where you are facing in relation to your destination object rather than true north.
First adjust your heading with the declination:
heading += geoField.getDeclination();
Second, you need to offset the direction in which the phone is facing (heading) from the target destination rather than from true north. This is the part I got stuck on. The heading value returned from the compass gives you a value that describes where magnetic north is (in degrees east of true north) relative to where the phone is pointing. So, e.g., if the value is -10, you know that magnetic north is 10 degrees to your left. The bearing gives you the angle of your destination in degrees east of true north. So, after you've compensated for the declination, you can use the formula below to get the desired result:
heading = myBearing - (myBearing + heading);
You'll then want to convert from degrees east of true north (-180 to +180) into normal degrees (0 to 360):
Math.round(-heading / 360 + 180)
@Damian - the idea is very good and I agree with the answer, but when I used your code I got wrong values, so I wrote this myself (somebody said the same in your comments). Counting heading with the declination is good, I think, but later I used something like this:
heading = (bearing - heading) * -1;
instead of Damian's code:
heading = myBearing - (myBearing + heading);
and, for changing -180 to 180 into 0 to 360:
private float normalizeDegree(float value) {
    if (value >= 0.0f && value <= 180.0f) {
        return value;
    } else {
        return 180 + (180 + value);
    }
}
and then when you want to rotate your arrow you can use code like this:
private void rotateArrow(float angle) {
    Matrix matrix = new Matrix();
    arrowView.setScaleType(ScaleType.MATRIX);
    matrix.postRotate(angle, 100f, 100f);
    arrowView.setImageMatrix(matrix);
}
where arrowView is an ImageView with the arrow picture, and the 100f parameters in postRotate are pivX and pivY.
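Putting the pieces together (a sketch of my own; bearing comes from bearingTo() and heading from the compass, with the declination already applied as described above):

float direction = normalizeDegree((bearing - heading) * -1);
rotateArrow(direction);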
I hope this helps somebody.
Here, an arrow on a compass shows the direction from your location to the Kaaba (the destination location).
You can simply use bearingTo in this way; bearingTo gives you the direct angle from your location to the destination location:
Location userLoc = new Location("service Provider");
// Get longitude, latitude and altitude of the current location from GPS and set them on userLoc
userLoc.setLongitude(longitude);
userLoc.setLatitude(latitude);
userLoc.setAltitude(altitude);

Location destinationLoc = new Location("service Provider");
destinationLoc.setLatitude(21.422487);  // Kaaba latitude
destinationLoc.setLongitude(39.826206); // Kaaba longitude
float bearTo = userLoc.bearingTo(destinationLoc);
bearingTo gives you a range from -180 to 180, which confuses things a bit. We need to convert this value into a range from 0 to 360 to get the correct rotation.
This table compares what we really want with what bearingTo gives us:
+-----------+--------------+
| bearingTo | Real bearing |
+-----------+--------------+
| 0 | 0 |
+-----------+--------------+
| 90 | 90 |
+-----------+--------------+
| 180 | 180 |
+-----------+--------------+
| -90 | 270 |
+-----------+--------------+
| -135 | 225 |
+-----------+--------------+
| -180 | 180 |
+-----------+--------------+
So we have to add this code after bearTo:
// If bearTo is smaller than 0, add 360 to get the clockwise rotation.
if (bearTo < 0) {
    bearTo = bearTo + 360;
    // e.g. bearTo = -100 + 360 = 260
}
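The same normalisation can also be written as a one-liner, if you prefer:

bearTo = (bearTo + 360) % 360; // equivalent to the if-block above for bearingTo's -180..180 range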
You need to implement SensorEventListener and its functions (onSensorChanged, onAccuracyChanged) and write all of the logic inside onSensorChanged.
The complete code for a Qibla-direction compass is here:
public class QiblaDirectionCompass extends Service implements SensorEventListener {
    public static ImageView image, arrow;

    // Record the compass picture angle turned
    private float currentDegree = 0f;
    private float currentDegreeNeedle = 0f;

    Context context;
    Location userLoc = new Location("service Provider");

    // Device sensor manager
    private static SensorManager mSensorManager;
    private Sensor sensor;
    public static TextView tvHeading;

    public QiblaDirectionCompass(Context context, ImageView compass, ImageView needle, TextView heading, double longi, double lati, double alti) {
        image = compass;
        arrow = needle;
        // TextView that will tell the user what degree he is heading
        tvHeading = heading;
        userLoc.setLongitude(longi);
        userLoc.setLatitude(lati);
        userLoc.setAltitude(alti);

        // Initialize the device's sensor capabilities
        mSensorManager = (SensorManager) context.getSystemService(SENSOR_SERVICE);
        sensor = mSensorManager.getDefaultSensor(Sensor.TYPE_ORIENTATION); // deprecated, but still present on most devices
        if (sensor != null) {
            // Register a listener for the system's orientation sensor
            mSensorManager.registerListener(this, sensor, SensorManager.SENSOR_DELAY_GAME); // or SENSOR_DELAY_FASTEST
        } else {
            Toast.makeText(context, "Not Supported", Toast.LENGTH_SHORT).show();
        }
        this.context = context;
    }

    @Override
    public void onCreate() {
        Toast.makeText(context, "Started", Toast.LENGTH_SHORT).show();
        mSensorManager.registerListener(this, sensor, SensorManager.SENSOR_DELAY_GAME); // or SENSOR_DELAY_FASTEST
        super.onCreate();
    }

    @Override
    public void onDestroy() {
        mSensorManager.unregisterListener(this);
        Toast.makeText(context, "Destroy", Toast.LENGTH_SHORT).show();
        super.onDestroy();
    }

    @Override
    public void onSensorChanged(SensorEvent sensorEvent) {
        // head = the angle you have rotated the phone away from magnetic north
        float head = Math.round(sensorEvent.values[0]);

        Location destinationLoc = new Location("service Provider");
        destinationLoc.setLatitude(21.422487);  // Kaaba latitude
        destinationLoc.setLongitude(39.826206); // Kaaba longitude

        // bearTo = the angle from true north to the destination location, from the point where you are currently standing
        float bearTo = userLoc.bearingTo(destinationLoc);

        GeomagneticField geoField = new GeomagneticField(
                Double.valueOf(userLoc.getLatitude()).floatValue(),
                Double.valueOf(userLoc.getLongitude()).floatValue(),
                Double.valueOf(userLoc.getAltitude()).floatValue(),
                System.currentTimeMillis());
        head -= geoField.getDeclination(); // converts magnetic north into true north

        if (bearTo < 0) {
            bearTo = bearTo + 360;
            // e.g. bearTo = -100 + 360 = 260
        }

        // This is where we choose to point the needle
        float direction = bearTo - head;
        // If the direction is smaller than 0, add 360 to get the clockwise rotation.
        if (direction < 0) {
            direction = direction + 360;
        }

        tvHeading.setText("Heading: " + Float.toString(head) + " degrees");

        RotateAnimation raQibla = new RotateAnimation(currentDegreeNeedle, direction, Animation.RELATIVE_TO_SELF, 0.5f, Animation.RELATIVE_TO_SELF, 0.5f);
        raQibla.setDuration(210);
        raQibla.setFillAfter(true);
        arrow.startAnimation(raQibla);
        currentDegreeNeedle = direction;

        // Create a rotation animation for the compass dial (counter-rotate by the heading)
        RotateAnimation ra = new RotateAnimation(currentDegree, -head, Animation.RELATIVE_TO_SELF, 0.5f, Animation.RELATIVE_TO_SELF, 0.5f);
        // How long the animation will take
        ra.setDuration(210);
        // Keep the end state after the animation finishes
        ra.setFillAfter(true);
        // Start the animation
        image.startAnimation(ra);
        currentDegree = -head;
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int i) {
    }

    @Nullable
    @Override
    public IBinder onBind(Intent intent) {
        return null;
    }
}
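Hypothetical usage from an Activity (the view ids match the layout below; longitude, latitude and altitude come from your own location code):

ImageView compass = (ImageView) findViewById(R.id.imageCompass);
ImageView needle = (ImageView) findViewById(R.id.needle);
TextView heading = (TextView) findViewById(R.id.heading);
QiblaDirectionCompass qibla = new QiblaDirectionCompass(this, compass, needle, heading, longitude, latitude, altitude);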
The XML code is here:
<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:orientation="vertical"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:background="@drawable/flag_pakistan">

    <TextView
        android:id="@+id/heading"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:textColor="@color/colorAccent"
        android:layout_centerHorizontal="true"
        android:layout_marginBottom="100dp"
        android:layout_marginTop="20dp"
        android:text="Heading: 0.0" />

    <RelativeLayout
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_below="@+id/heading"
        android:layout_centerVertical="true"
        android:layout_centerHorizontal="true">

        <ImageView
            android:id="@+id/imageCompass"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:scaleType="centerInside"
            android:layout_centerVertical="true"
            android:layout_centerHorizontal="true"
            android:src="@drawable/images_compass" />

        <ImageView
            android:id="@+id/needle"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:layout_centerVertical="true"
            android:layout_centerHorizontal="true"
            android:scaleType="centerInside"
            android:src="@drawable/arrow2" />
    </RelativeLayout>
</RelativeLayout>
I know this is a little old, but for the sake of folks like myself coming from Google who didn't find a complete answer here, here are some extracts from my app which put the arrows inside a custom ListView.
Location loc;                       // Will hold last known location
Location wptLoc = new Location(""); // Waypoint location
float dist = -1;
float bearing = 0;
float heading = 0;
float arrow_rotation = 0;

LocationManager lm = (LocationManager) getSystemService(Context.LOCATION_SERVICE);
loc = lm.getLastKnownLocation(LocationManager.GPS_PROVIDER);
if (loc == null) { // No recent GPS fix
    Criteria criteria = new Criteria();
    criteria.setAccuracy(Criteria.ACCURACY_FINE);
    criteria.setAltitudeRequired(false);
    criteria.setBearingRequired(true);
    criteria.setCostAllowed(true);
    criteria.setSpeedRequired(false);
    loc = lm.getLastKnownLocation(lm.getBestProvider(criteria, true));
}

if (loc != null) {
    wptLoc.setLongitude(cursor.getFloat(2)); // Cursor is from SimpleCursorAdapter
    wptLoc.setLatitude(cursor.getFloat(3));
    dist = loc.distanceTo(wptLoc);
    bearing = loc.bearingTo(wptLoc); // -180 to 180
    heading = loc.getBearing();      // 0 to 360
    // Where the arrow should point: normalise the bearing to 0..360,
    // subtract the direction of travel, and wrap into 0..360 again.
    arrow_rotation = (360 + ((bearing + 360) % 360) - heading) % 360;
}
I'm willing to bet it could be simplified, but it works!
getLastKnownLocation was used since this code was called from a new SimpleCursorAdapter.ViewBinder().
onLocationChanged contains a call to notifyDataSetChanged().
The code below (also from the SimpleCursorAdapter.ViewBinder()) sets the image rotation and list-row colours (only applied at a single columnIndex, mind you):
LinearLayout ll = ((LinearLayout) view.getParent());
ll.setBackgroundColor(bc);
int childcount = ll.getChildCount();
for (int i = 0; i < childcount; i++) {
    View v = ll.getChildAt(i);
    if (v instanceof TextView) ((TextView) v).setTextColor(fc);
    if (v instanceof ImageView) {
        ImageView img = (ImageView) v;
        img.setImageResource(R.drawable.ic_arrow);
        Matrix matrix = new Matrix();
        img.setScaleType(ScaleType.MATRIX);
        matrix.postRotate(arrow_rotation, img.getWidth() / 2, img.getHeight() / 2);
        img.setImageMatrix(matrix);
    }
}
In case you're wondering, I did away with the magnetic-sensor dramas; it wasn't worth the hassle in my case.
I hope somebody finds this as useful as I usually do when Google brings me to Stack Overflow!
I'm no expert in map reading/navigation and so on, but surely 'directions' are absolute rather than relative; or, in reality, they are relative to N or S, which are themselves fixed/absolute.
Example: suppose an imaginary line drawn between you and your destination corresponds to 'absolute' SE (a bearing of 135 degrees relative to magnetic N). Now suppose your phone is pointing NW: if you draw an imaginary line from an imaginary object on the horizon to your destination, it will pass through your location and have an angle of 180 degrees. Now, 180 degrees in the sense of a compass actually refers to S, but the destination is not 'due S' of the imaginary object your phone is pointing at; moreover, if you travelled to that imaginary point, your destination would still be SE of where you moved to.
In reality, the 180-degree line actually tells you the destination is 'behind you' relative to the way the phone (and presumably you) are pointing.
Having said that, however, if what you want is to calculate the angle of a line from the imaginary point to your destination (passing through your location) in order to draw a pointer towards your destination: simply subtract the (absolute) bearing of the destination from the absolute bearing of the imaginary object, and ignore a negation if present. E.g. NW - SE is 315 - 135 = 180, so draw the pointer to point at the bottom of the screen, indicating 'behind you'.
EDIT: I got the maths slightly wrong. Subtract the smaller of the bearings from the larger, then subtract the result from 360 to get the angle at which to draw the pointer on the screen.
If you are in the same timezone:
Convert GPS to UTM:
http://www.ibm.com/developerworks/java/library/j-coordconvert/
http://stackoverflow.com/questions/176137/java-convert-lat-lon-to-utm
UTM coordinates give you a simple X/Y 2D grid.
Calculate the angle between both UTM locations (see the sketch below):
http://forums.groundspeak.com/GC/index.php?showtopic=146917
This gives the direction as if you were looking north.
So, whatever you rotate relative to north, just subtract this angle.
If both points form a 45-degree UTM angle and you are 5 degrees east of north, your arrow will point at 40 degrees from north.
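A minimal sketch of the angle step (a helper of my own, not from the links above), assuming you already have the two easting/northing pairs in the same UTM zone:

static double utmBearing(double e1, double n1, double e2, double n2) {
    // atan2(dEast, dNorth) is measured clockwise from grid north, like a compass bearing
    double deg = Math.toDegrees(Math.atan2(e2 - e1, n2 - n1));
    return (deg + 360) % 360;
}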
Here is how I have done it:
Canvas g = new Canvas( compass );
Paint p = new Paint( Paint.ANTI_ALIAS_FLAG );
float rotation = display.getOrientation() * 90;
g.translate( -box.left, -box.top );
g.rotate( -bearing - rotation, box.exactCenterX(), box.exactCenterY() );
drawCompass( g, p );
drawNeedle( g, p );
This is the best way to get the bearing from a Location object on a Google Map:

float targetBearing = 90;
Location endingLocation = new Location("ending point");
Location startingLocation = new Location("starting point");
startingLocation.setLatitude(mGoogleMap.getCameraPosition().target.latitude);
startingLocation.setLongitude(mGoogleMap.getCameraPosition().target.longitude);
endingLocation.setLatitude(mLatLng.latitude);
endingLocation.setLongitude(mLatLng.longitude);
targetBearing = startingLocation.bearingTo(endingLocation);
The formula gives the bearing using the coordinates of the start point towards the end point.
The following code will give you the bearing (an angle between 0 and 360):
private double bearing(Location startPoint, Location endPoint) {
    double longitude1 = startPoint.getLongitude();
    double latitude1 = Math.toRadians(startPoint.getLatitude());
    double longitude2 = endPoint.getLongitude();
    double latitude2 = Math.toRadians(endPoint.getLatitude());
    double longDiff = Math.toRadians(longitude2 - longitude1);
    double y = Math.sin(longDiff) * Math.cos(latitude2);
    double x = Math.cos(latitude1) * Math.sin(latitude2) - Math.sin(latitude1) * Math.cos(latitude2) * Math.cos(longDiff);
    // atan2 returns -180..180; normalise into 0..360 as promised above
    return (Math.toDegrees(Math.atan2(y, x)) + 360) % 360;
}
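Example use (sample coordinates of my own choosing; the result is roughly the initial bearing from Berlin to Paris, about 246 degrees):

Location start = new Location("start");
start.setLatitude(52.5200);
start.setLongitude(13.4050);
Location end = new Location("end");
end.setLatitude(48.8566);
end.setLongitude(2.3522);
double deg = bearing(start, end);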
This works for me; I hope it will work for others as well.
I am in the process of figuring this out now, but it seems the math depends on where you and your target are on Earth relative to true and magnetic north. For example:
float thetaMeThem = 0.0f;
if (myLocation.bearingTo(targetLocation) > myLocation.getBearing()) {
    thetaMeThem = myLocation.bearingTo(targetLocation) - azimuth + declination;
}
See Sensor.TYPE_ORIENTATION for azimuth.
See getDeclination() for the declination.
This assumes the declination is negative (west of true north) and theirBearing > yourBearing.
If the declination is positive and yourBearing > theirBearing, another option:
float thetaMeThem = 0.0f;
if (myLocation.bearingTo(targetLocation) < myLocation.getBearing()) {
    thetaMeThem = azimuth - (myLocation.bearingTo(targetLocation) - declination);
}
I haven't tested this fully but playing with the angles on paper got me here.
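For what it's worth, both cases collapse into a single expression if you normalise at the end (my own simplification, untested on a device, and assuming getDeclination()'s convention of positive declination east of true north):

float theta = myLocation.bearingTo(targetLocation) - (azimuth + declination);
theta = ((theta % 360f) + 360f) % 360f; // wrap into 0..360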
Terminology: the difference between true north and magnetic north is known as "variation", not declination. The difference between what your compass reads and the magnetic heading is known as "deviation", and it varies with heading. A compass swing identifies device errors and allows corrections to be applied if the device has correction built in. A magnetic compass will have a deviation card which describes the device error on any heading.
Declination: a term used in astro navigation. Declination is like latitude: it reports how far a star is from the celestial equator. To find the declination of a star, follow an hour circle "straight down" from the star to the celestial equator. The angle from the star to the celestial equator along the hour circle is the star's declination.
This question has been asked before, but I didn't get any clear idea from it, so I am asking it again. I have been struggling with it for two days.
I want to calculate the distance between my device's camera and an object in front of the camera.
I tried to use the sensor managers as suggested in the following link, but I am getting wrong results.
I have used two sensors: the accelerometer and the magnetic field sensor.
I calculate the distance based on the orientation of the device, as suggested in the link above.
The onSensorChanged callback in the sensor listener:
@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER)
        gravity = event.values.clone();
    if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD)
        geoMagnetic = event.values.clone();
    if (gravity != null && geoMagnetic != null) {
        float R[] = new float[9];
        float I[] = new float[9];
        boolean success = SensorManager.getRotationMatrix(R, I, gravity, geoMagnetic);
        if (success) {
            float orientation[] = new float[3];
            SensorManager.getOrientation(R, orientation);
            // 57.29578 = 180 / pi, i.e. radians to degrees
            azimuth = 57.29578F * orientation[0];
            pitch = 57.29578F * orientation[1];
            roll = 57.29578F * orientation[2];
        }
    }
}
Portrait:

double dist = Math.abs((float) (1.4f * Math.tan(roll * Math.PI / 180))); // 1.4f is presumably the device's height above the ground, in metres

Landscape:

double dist = Math.abs((float) (1.4f * Math.tan(pitch * Math.PI / 180)));
Please provide information about how to calculate the distance. Any ideas and suggestions are appreciated.
Thanks in advance.
The Sensor Fusion video looks great, but there's no code:
http://www.youtube.com/watch?v=C7JQ7Rpwn2k&feature=player_detailpage#t=1315s
Here is my code, which just uses the accelerometer and compass. I also use a Kalman filter on the three orientation values, but that's too much code to show here. Ultimately, this works OK, but the result is either too jittery or too laggy depending on what I do with the results and how low I make the filtering factors.
/** Just accelerometer and magnetic sensors */
public abstract class SensorsListener2 implements SensorEventListener {
    /** The lower this is, the greater the preference given to previous values. (Slows change.) */
    private static final float accelFilteringFactor = 0.1f;
    private static final float magFilteringFactor = 0.01f;

    public abstract boolean getIsLandscape();

    @Override
    public void onSensorChanged(SensorEvent event) {
        Sensor sensor = event.sensor;
        int type = sensor.getType();
        switch (type) {
            case Sensor.TYPE_MAGNETIC_FIELD:
                mags[0] = event.values[0] * magFilteringFactor + mags[0] * (1.0f - magFilteringFactor);
                mags[1] = event.values[1] * magFilteringFactor + mags[1] * (1.0f - magFilteringFactor);
                mags[2] = event.values[2] * magFilteringFactor + mags[2] * (1.0f - magFilteringFactor);
                isReady = true;
                break;
            case Sensor.TYPE_ACCELEROMETER:
                accels[0] = event.values[0] * accelFilteringFactor + accels[0] * (1.0f - accelFilteringFactor);
                accels[1] = event.values[1] * accelFilteringFactor + accels[1] * (1.0f - accelFilteringFactor);
                accels[2] = event.values[2] * accelFilteringFactor + accels[2] * (1.0f - accelFilteringFactor);
                break;
            default:
                return;
        }

        if (mags != null && accels != null && isReady) {
            isReady = false;
            SensorManager.getRotationMatrix(rot, inclination, accels, mags);
            boolean isLandscape = getIsLandscape();
            if (isLandscape) {
                outR = rot;
            } else {
                // Remap the coordinates to work in portrait mode.
                SensorManager.remapCoordinateSystem(rot, SensorManager.AXIS_X, SensorManager.AXIS_Z, outR);
            }
            SensorManager.getOrientation(outR, values);
            double x180pi = 180.0 / Math.PI;
            float azimuth = (float) (values[0] * x180pi);
            float pitch = (float) (values[1] * x180pi);
            float roll = (float) (values[2] * x180pi);
            // In landscape mode swap pitch and roll and invert the pitch.
            if (isLandscape) {
                float tmp = pitch;
                pitch = -roll;
                roll = -tmp;
                azimuth = 180 - azimuth;
            } else {
                pitch = -pitch - 90;
                azimuth = 90 - azimuth;
            }
            onOrientationChanged(azimuth, pitch, roll);
        }
    }

    private float[] mags = new float[3];
    private float[] accels = new float[3];
    private boolean isReady;
    private float[] rot = new float[9];
    private float[] outR = new float[9];
    private float[] inclination = new float[9];
    private float[] values = new float[3];

    /**
     * Azimuth: angle between the magnetic north direction and the Y axis, around the Z axis (0 to 359). 0=North, 90=East, 180=South, 270=West.
     * Pitch: rotation around the X axis (-180 to 180), with positive values when the z-axis moves toward the y-axis.
     * Roll: rotation around the Y axis (-90 to 90), with positive values when the x-axis moves toward the z-axis.
     */
    public abstract void onOrientationChanged(float azimuth, float pitch, float roll);
}
I tried to figure out how to add gyroscope data, but I am just not doing it right. The Google doc at http://developer.android.com/reference/android/hardware/SensorEvent.html shows some code to get a delta matrix from the gyroscope data. The idea seems to be that I'd crank down the filters for the accelerometer and magnetic sensors so that they were really stable. That would keep track of the long-term orientation.
Then I'd keep a history of the most recent N delta matrices from the gyroscope. Each time I got a new one, I'd drop the oldest one and multiply them all together to get a final matrix, which I would multiply against the stable matrix returned by the accelerometer and magnetic sensors.
This doesn't seem to work; or, at least, my implementation of it does not work. The result is far more jittery than just the accelerometer. Increasing the size of the gyroscope history actually increases the jitter, which makes me think that I'm not calculating the right values from the gyroscope.
public abstract class SensorsListener3 implements SensorEventListener {
    /** The lower this is, the greater the preference given to previous values. (Slows change.) */
    private static final float kFilteringFactor = 0.001f;
    private static final float magKFilteringFactor = 0.001f;

    public abstract boolean getIsLandscape();

    @Override
    public void onSensorChanged(SensorEvent event) {
        Sensor sensor = event.sensor;
        int type = sensor.getType();
        switch (type) {
            case Sensor.TYPE_MAGNETIC_FIELD:
                mags[0] = event.values[0] * magKFilteringFactor + mags[0] * (1.0f - magKFilteringFactor);
                mags[1] = event.values[1] * magKFilteringFactor + mags[1] * (1.0f - magKFilteringFactor);
                mags[2] = event.values[2] * magKFilteringFactor + mags[2] * (1.0f - magKFilteringFactor);
                isReady = true;
                break;
            case Sensor.TYPE_ACCELEROMETER:
                accels[0] = event.values[0] * kFilteringFactor + accels[0] * (1.0f - kFilteringFactor);
                accels[1] = event.values[1] * kFilteringFactor + accels[1] * (1.0f - kFilteringFactor);
                accels[2] = event.values[2] * kFilteringFactor + accels[2] * (1.0f - kFilteringFactor);
                break;
            case Sensor.TYPE_GYROSCOPE:
                gyroscopeSensorChanged(event);
                break;
            default:
                return;
        }

        if (mags != null && accels != null && isReady) {
            isReady = false;
            SensorManager.getRotationMatrix(rot, inclination, accels, mags);
            boolean isLandscape = getIsLandscape();
            if (isLandscape) {
                outR = rot;
            } else {
                // Remap the coordinates to work in portrait mode.
                SensorManager.remapCoordinateSystem(rot, SensorManager.AXIS_X, SensorManager.AXIS_Z, outR);
            }

            if (gyroUpdateTime != 0) {
                matrixHistory.mult(matrixTmp, matrixResult);
                outR = matrixResult;
            }

            SensorManager.getOrientation(outR, values);
            double x180pi = 180.0 / Math.PI;
            float azimuth = (float) (values[0] * x180pi);
            float pitch = (float) (values[1] * x180pi);
            float roll = (float) (values[2] * x180pi);
            // In landscape mode swap pitch and roll and invert the pitch.
            if (isLandscape) {
                float tmp = pitch;
                pitch = -roll;
                roll = -tmp;
                azimuth = 180 - azimuth;
            } else {
                pitch = -pitch - 90;
                azimuth = 90 - azimuth;
            }
            onOrientationChanged(azimuth, pitch, roll);
        }
    }

    private void gyroscopeSensorChanged(SensorEvent event) {
        // This timestep's delta rotation is to be multiplied by the current rotation
        // after computing it from the gyro sample data.
        if (gyroUpdateTime != 0) {
            final float dT = (event.timestamp - gyroUpdateTime) * NS2S;
            // Axis of the rotation sample, not normalized yet.
            float axisX = event.values[0];
            float axisY = event.values[1];
            float axisZ = event.values[2];

            // Calculate the angular speed of the sample
            float omegaMagnitude = (float) Math.sqrt(axisX * axisX + axisY * axisY + axisZ * axisZ);

            // Normalize the rotation vector if it's big enough to get the axis
            if (omegaMagnitude > EPSILON) {
                axisX /= omegaMagnitude;
                axisY /= omegaMagnitude;
                axisZ /= omegaMagnitude;
            }

            // Integrate around this axis with the angular speed by the timestep
            // in order to get a delta rotation from this sample over the timestep.
            // We will convert this axis-angle representation of the delta rotation
            // into a quaternion before turning it into the rotation matrix.
            float thetaOverTwo = omegaMagnitude * dT / 2.0f;
            float sinThetaOverTwo = (float) Math.sin(thetaOverTwo);
            float cosThetaOverTwo = (float) Math.cos(thetaOverTwo);
            deltaRotationVector[0] = sinThetaOverTwo * axisX;
            deltaRotationVector[1] = sinThetaOverTwo * axisY;
            deltaRotationVector[2] = sinThetaOverTwo * axisZ;
            deltaRotationVector[3] = cosThetaOverTwo;
        }
        gyroUpdateTime = event.timestamp;
        SensorManager.getRotationMatrixFromVector(deltaRotationMatrix, deltaRotationVector);
        // User code should concatenate the delta rotation we computed with the current rotation
        // in order to get the updated rotation.
        // rotationCurrent = rotationCurrent * deltaRotationMatrix;
        matrixHistory.add(deltaRotationMatrix);
    }

    private float[] mags = new float[3];
    private float[] accels = new float[3];
    private boolean isReady;
    private float[] rot = new float[9];
    private float[] outR = new float[9];
    private float[] inclination = new float[9];
    private float[] values = new float[3];

    // Gyroscope state
    private long gyroUpdateTime = 0;
    private static final float NS2S = 1.0f / 1000000000.0f;
    private float[] deltaRotationMatrix = new float[9];
    private final float[] deltaRotationVector = new float[4];
    // TODO: I have no idea how small this value should be.
    private static final float EPSILON = 0.000001f;
    private float[] matrixMult = new float[9];
    private MatrixHistory matrixHistory = new MatrixHistory(100);
    private float[] matrixTmp = new float[9];
    private float[] matrixResult = new float[9];

    /**
     * Azimuth: angle between the magnetic north direction and the Y axis, around the Z axis (0 to 359). 0=North, 90=East, 180=South, 270=West.
     * Pitch: rotation around the X axis (-180 to 180), with positive values when the z-axis moves toward the y-axis.
     * Roll: rotation around the Y axis (-90 to 90), with positive values when the x-axis moves toward the z-axis.
     */
    public abstract void onOrientationChanged(float azimuth, float pitch, float roll);
}
public class MatrixHistory {
    public MatrixHistory(int size) {
        vals = new float[size][];
    }

    public void add(float[] val) {
        synchronized (vals) {
            vals[ix] = val;
            ix = (ix + 1) % vals.length;
            if (ix == 0)
                full = true;
        }
    }

    public void mult(float[] tmp, float[] output) {
        synchronized (vals) {
            if (full) {
                for (int i = 0; i < vals.length; ++i) {
                    if (i == 0) {
                        System.arraycopy(vals[i], 0, output, 0, vals[i].length);
                    } else {
                        MathUtils.multiplyMatrix3x3(output, vals[i], tmp);
                        System.arraycopy(tmp, 0, output, 0, tmp.length);
                    }
                }
            } else {
                if (ix == 0)
                    return;
                for (int i = 0; i < ix; ++i) {
                    if (i == 0) {
                        System.arraycopy(vals[i], 0, output, 0, vals[i].length);
                    } else {
                        MathUtils.multiplyMatrix3x3(output, vals[i], tmp);
                        System.arraycopy(tmp, 0, output, 0, tmp.length);
                    }
                }
            }
        }
    }

    private int ix = 0;
    private boolean full = false;
    private float[][] vals;
}
The second block of code contains my changes to the first block, adding the gyroscope to the mix.
Specifically, the filtering factor for the accelerometer is made smaller (making the value more stable), and the MatrixHistory class keeps track of the last 100 gyroscope deltaRotationMatrix values, which are calculated in the gyroscopeSensorChanged method.
I've seen many questions on this site on this topic. They've helped me get to this point, but I cannot figure out what to do next. I really wish the Sensor Fusion guy had just posted some code somewhere. He obviously had it all put together.
Well, +1 to you for even knowing what a Kalman filter is. If you'd like, I'll edit this post and give you the code I wrote a couple years ago to do what you're trying to do.
But first, I'll tell you why you don't need it.
Modern implementations of the Android sensor stack use Sensor Fusion, as Stan mentioned above. This just means that all of the available data -- accel, mag, gyro -- is collected together in one algorithm, and then all the outputs are read back out in the form of Android sensors.
Edit: I just stumbled on this superb Google Tech Talk on the subject: Sensor Fusion on Android Devices: A Revolution in Motion Processing. Well worth the 45 minutes to watch it if you're interested in the topic.
In essence, Sensor Fusion is a black box. I've looked into the source code of the Android implementation, and it's a big Kalman filter written in C++. Some pretty good code in there, far more sophisticated than any filter I ever wrote, and probably more sophisticated than what you're writing. Remember, these guys are doing this for a living.
I also know that at least one chipset manufacturer has their own sensor fusion implementation. The manufacturer of the device then chooses between the Android and the vendor implementation based on their own criteria.
Finally, as Stan mentioned above, Invensense has their own sensor fusion implementation at the chip level.
Anyway, what it all boils down to is that the built-in sensor fusion in your device is likely to be superior to anything you or I could cobble together. So what you really want to do is to access that.
In Android, there are both physical and virtual sensors. The virtual sensors are the ones that are synthesized from the available physical sensors. The best-known example is TYPE_ORIENTATION which takes accelerometer and magnetometer and creates roll/pitch/heading output. (By the way, you should not use this sensor; it has too many limitations.)
But the important thing is that newer versions of Android contain these two new virtual sensors:
TYPE_GRAVITY is the accelerometer input with the effect of motion filtered out
TYPE_LINEAR_ACCELERATION is the accelerometer with the gravity component filtered out.
These two virtual sensors are synthesized through a combination of accelerometer input and gyro input.
Another notable sensor is TYPE_ROTATION_VECTOR which is a Quaternion synthesized from accelerometer, magnetometer, and gyro. It represents the full 3-d orientation of the device with the effects of linear acceleration filtered out.
However, quaternions are a little bit abstract for most people, and since you're likely working with 3-D transformations anyway, your best approach is to combine TYPE_GRAVITY and TYPE_MAGNETIC_FIELD via SensorManager.getRotationMatrix().
One more point: if you're working with a device running an older version of Android, you need to detect that you're not receiving TYPE_GRAVITY events and use TYPE_ACCELEROMETER instead. Theoretically, this would be a place to use your own Kalman filter, but if your device doesn't have sensor fusion built in, it probably doesn't have gyros either.
Anyway, here's some sample code to show how I do it.
// Requires Android 1.5 or above
class Foo extends Activity implements SensorEventListener {
    private static final String TAG = "Foo";
    private static final float DEG = (float) (180.0 / Math.PI); // radians to degrees

    SensorManager sensorManager;
    float[] gData = new float[3]; // Gravity or accelerometer
    float[] mData = new float[3]; // Magnetometer
    float[] orientation = new float[3];
    float[] Rmat = new float[9];
    float[] R2 = new float[9];
    float[] Imat = new float[9];
    boolean haveGrav = false;
    boolean haveAccel = false;
    boolean haveMag = false;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Get the sensor manager from system services
        sensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
    }

    @Override
    protected void onResume() {
        super.onResume();
        // Register our listeners
        Sensor gsensor = sensorManager.getDefaultSensor(Sensor.TYPE_GRAVITY);
        Sensor asensor = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
        Sensor msensor = sensorManager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD);
        sensorManager.registerListener(this, gsensor, SensorManager.SENSOR_DELAY_GAME);
        sensorManager.registerListener(this, asensor, SensorManager.SENSOR_DELAY_GAME);
        sensorManager.registerListener(this, msensor, SensorManager.SENSOR_DELAY_GAME);
    }

    public void onSensorChanged(SensorEvent event) {
        switch (event.sensor.getType()) {
            case Sensor.TYPE_GRAVITY:
                gData[0] = event.values[0];
                gData[1] = event.values[1];
                gData[2] = event.values[2];
                haveGrav = true;
                break;
            case Sensor.TYPE_ACCELEROMETER:
                if (haveGrav) break; // don't need it, we have better
                gData[0] = event.values[0];
                gData[1] = event.values[1];
                gData[2] = event.values[2];
                haveAccel = true;
                break;
            case Sensor.TYPE_MAGNETIC_FIELD:
                mData[0] = event.values[0];
                mData[1] = event.values[1];
                mData[2] = event.values[2];
                haveMag = true;
                break;
            default:
                return;
        }

        if ((haveGrav || haveAccel) && haveMag) {
            SensorManager.getRotationMatrix(Rmat, Imat, gData, mData);
            SensorManager.remapCoordinateSystem(Rmat,
                    SensorManager.AXIS_Y, SensorManager.AXIS_MINUS_X, R2);
            // Orientation isn't as useful as a rotation matrix, but
            // we'll show it here anyway.
            SensorManager.getOrientation(R2, orientation);
            float incl = SensorManager.getInclination(Imat);
            Log.d(TAG, "mh: " + (int) (orientation[0] * DEG));
            Log.d(TAG, "pitch: " + (int) (orientation[1] * DEG));
            Log.d(TAG, "roll: " + (int) (orientation[2] * DEG));
            Log.d(TAG, "yaw: " + (int) (orientation[0] * DEG));
            Log.d(TAG, "inclination: " + (int) (incl * DEG));
        }
    }

    public void onAccuracyChanged(Sensor sensor, int accuracy) {
        // not used
    }
}
Hmmm; if you happen to have a Quaternion library handy, it's probably simpler just to receive TYPE_ROTATION_VECTOR and convert that to an array.
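For completeness, a minimal sketch of that alternative, using standard SensorManager calls inside onSensorChanged for a TYPE_ROTATION_VECTOR event:

float[] quat = new float[4];
float[] R = new float[9];
SensorManager.getQuaternionFromVector(quat, event.values); // the quaternion itself
SensorManager.getRotationMatrixFromVector(R, event.values); // or go straight to a rotation matrix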
To the question of where to find complete code, here's the default implementation from Android Jelly Bean: https://android.googlesource.com/platform/frameworks/base/+/jb-release/services/sensorservice/
Start by checking fusion.cpp/h.
It uses Modified Rodrigues Parameters (close to Euler angles) instead of quaternions. In addition to orientation, the Kalman filter estimates gyro drift. For measurement updates it uses the magnetometer and, somewhat incorrectly, acceleration (specific force).
To make use of the code you should either be a wizard or know the basics of INS and Kalman filtering. Many parameters have to be fine-tuned for the filter to work. As Edward adequately put it, these guys are doing this for a living.
At least in Google's Galaxy Nexus, this default implementation is left unused and is overridden by Invensense's proprietary system.
I'm trying to make the MyLocationOverlay compass point to a particular location: I want to show the direction from the current user's location to a given location. So far I've managed to calculate a direction and pass it to MyLocationOverlay, but the compass arrow starts pointing to different locations chaotically. Sometimes it points in the right direction, but usually it shows complete nonsense.
Is there a way to make the compass work as it should?
This is how I calculate the direction in my activity
@Override
public void onSensorChanged(SensorEvent event) {
    if (location != null && carOverlay.size() > 0
            && !map.getUserOverlay().isPointingToNorth()) {
        float azimuth = event.values[0];
        azimuth = azimuth * 180 / (float) Math.PI;
        GeomagneticField geoField = new GeomagneticField(
                Double.valueOf(location.getLatitude()).floatValue(),
                Double.valueOf(location.getLongitude()).floatValue(),
                Double.valueOf(location.getAltitude()).floatValue(),
                System.currentTimeMillis());
        azimuth += geoField.getDeclination();
        GeoPoint point = carOverlay.getItem(0).getPoint();
        Location target = new Location(provider);
        float lat = (float) (point.getLatitudeE6() / 1E6f);
        float lon = (float) (point.getLongitudeE6() / 1E6f);
        target.setLatitude(lat);
        target.setLongitude(lon);
        float bearing = location.bearingTo(target);
        float direction = azimuth - bearing;
        map.getUserOverlay().putCompassDirection(direction);
    }
}
This is the overridden method in my custom overlay:
@Override
protected void drawCompass(Canvas canvas, float bearing) {
    if (pointToNorth) {
        super.drawCompass(canvas, bearing);
    } else {
        super.drawCompass(canvas, compassDirection);
    }
}
Why do you calculate your bearing/azimuth yourself? MyLocationOverlay has the method getOrientation() to give you the azimuth. I would definitely leave onSensorChanged untouched and let MyLocationOverlay calculate this for you. Normally, to draw your compass correctly, you just have to call enableCompass().
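A rough sketch of that suggestion (treat the exact getOrientation() return type as an assumption; check it for your Maps library version):

myLocationOverlay.enableCompass();
float azimuth = myLocationOverlay.getOrientation(); // compass azimuth maintained by the overlay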
I have to write a compass app for Android. The only thing the user sees on the screen is a cube with a red wall which has to point north. This isn't the important part. What's important is that I need to rotate the cube according to the rotation of the device itself, so that the red wall keeps pointing north no matter how the phone is held. My code is simple and straightforward:
@Override
public void onSensorChanged(SensorEvent event) {
    synchronized (this) {
        switch (event.sensor.getType()) {
            case Sensor.TYPE_ACCELEROMETER:
                direction = event.values[2];
                break;
            case Sensor.TYPE_ORIENTATION:
                if (direction < 0) {
                    angleX = event.values[1];
                    angleY = -event.values[2];
                    angleZ = event.values[0];
                } else {
                    angleX = -event.values[1];
                    angleY = -event.values[2];
                    angleZ = event.values[0];
                }
                break;
        }
    }
}
I have added this extra direction variable that simply stores whether the phone's display is pointing downwards or upwards. I don't know if I need it, but it seems to fix some bugs. I am using the SensorSimulator for Android, but whenever my pitch slider goes into the [-90, 90] interval, the other variables get mixed up. It's like they get a 180-degree offset. But I can't detect when I am in this interval, because the range of the pitch is from -90 to 90, so I can move that slider from left to right and I will always be in that interval.
This was all just to show you how far my code has advanced. I am not saying how this problem should be solved, because I would probably only steer myself into a dead end. You see, I have been trying to write this app for three days now, and you can imagine how annoyed my boss is. I have read all sorts of tutorials and tried every formula I could find or think of. So please help me. All I have to do is know how to rotate my cube, the rotation angles of which are Euler angles in degrees.
Here's some code I wrote to do something pretty similar, really only caring about the rotation of the device in the roll direction. Hope it helps! It just uses the accelerometer values to determine the pitch and roll; there's no need to get the orientation of the view.
public void onSensorChanged(SensorEvent event) {
    float x = -1 * event.values[0] / SensorManager.GRAVITY_EARTH;
    float y = -1 * event.values[1] / SensorManager.GRAVITY_EARTH;
    float z = -1 * event.values[2] / SensorManager.GRAVITY_EARTH;

    float signedRawRoll = (float) (Math.atan2(x, y) * 180 / Math.PI);
    float unsignedRawRoll = Math.abs(signedRawRoll);
    float rollSign = signedRawRoll / unsignedRawRoll;
    float rawPitch = Math.abs(z * 180);

    // Use a basic low-pass filter to keep only the gravity component of the
    // accelerometer values for the X and Y axes; adjust the filter weight based
    // on pitch, as roll is harder to define as pitch approaches 180.
    float filterWeight = rawPitch > 165 ? 0.85f : 0.7f;
    float newUnsignedRoll = filterWeight * Math.abs(this.roll) + (1 - filterWeight) * unsignedRawRoll;
    this.roll = rollSign * newUnsignedRoll;
    if (Float.isInfinite(this.roll) || Float.isNaN(this.roll)) {
        this.roll = 0;
    }

    this.pitch = filterWeight * this.pitch + (1 - filterWeight) * rawPitch;
    for (IAngleListener listener : listeners) {
        listener.deviceRollAndPitch(this.roll, this.pitch);
    }
}