Android Device (GPS) Direction

Using Location.getBearing() I seem to get randomly changing bearings.
That is, I can turn the device around slowly and it won't notice; it just reports its own seemingly random bearings.
I know the device is working, because the "You are here" icon in the Maps app on the tablet slowly rotates as I rotate the device.
Is there a proper way of getting the bearing? I am using the GPS. Maybe there is a better way to determine which direction you are facing.

Try getting the bearing from the accelerometer and the magnetic field sensor instead.
Here's a tutorial: http://android-coding.blogspot.co.at/2012/03/create-our-android-compass.html

Location.getBearing() returns the bearing derived from your GPS movement, i.e. the direction of travel between fixes. It is not a real-time representation of the heading of your device, and it tells you nothing while you are standing still. The Google Maps app uses the device's built-in orientation sensors to get the direction you are facing.
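If you go the sensor route, you also need to register for both sensors. A minimal registration sketch (not from the original answers; it assumes an Activity that implements SensorEventListener and uses the field names from the code further below):

private SensorManager sensorManager;
private final float[] valuesAccelerometer = new float[3];
private final float[] valuesMagneticField = new float[3];

@Override
protected void onResume() {
    super.onResume();
    sensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
    // getRotationMatrix() needs readings from both sensors
    sensorManager.registerListener(this,
            sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),
            SensorManager.SENSOR_DELAY_NORMAL);
    sensorManager.registerListener(this,
            sensorManager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD),
            SensorManager.SENSOR_DELAY_NORMAL);
}

@Override
protected void onPause() {
    super.onPause();
    sensorManager.unregisterListener(this); // stop listening to save battery
}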

Following on from herom's answer, using the link http://android-coding.blogspot.co.at/2012/03/create-our-android-compass.html
I extended my class to implement the sensor listener: extends Activity implements SensorEventListener
I implemented it as suggested, but modified it to take the orientation of the screen into account.
Here is the code I went with:
@Override
public void onSensorChanged(SensorEvent event) {
    switch (event.sensor.getType()) {
        case Sensor.TYPE_ACCELEROMETER:
            for (int i = 0; i < 3; i++) {
                valuesAccelerometer[i] = event.values[i];
            }
            break;
        case Sensor.TYPE_MAGNETIC_FIELD:
            for (int i = 0; i < 3; i++) {
                valuesMagneticField[i] = event.values[i];
            }
            break;
    }
    boolean success = SensorManager.getRotationMatrix(
            matrixR,
            matrixI,
            valuesAccelerometer,
            valuesMagneticField);
    if (success) {
        SensorManager.getOrientation(matrixR, matrixValues);
        double azimuth = Math.toDegrees(matrixValues[0]);
        //double pitch = Math.toDegrees(matrixValues[1]);
        //double roll = Math.toDegrees(matrixValues[2]);
        WindowManager mWindowManager = (WindowManager) getSystemService(WINDOW_SERVICE);
        Display mDisplay = mWindowManager.getDefaultDisplay();
        float degToAdd = 0.0f; // ROTATION_0 needs no offset
        if (mDisplay.getRotation() == Surface.ROTATION_90)
            degToAdd = 90.0f;
        if (mDisplay.getRotation() == Surface.ROTATION_180)
            degToAdd = 180.0f;
        if (mDisplay.getRotation() == Surface.ROTATION_270)
            degToAdd = 270.0f;
        mapView.setFacingDirection((float) (azimuth + degToAdd)); // DEGREES NOT RADIANS
    }
}
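One caveat (an editor's note, not part of the original answer): getOrientation() reports the azimuth in the range -180..180 degrees, so after adding the display offset the value can leave the 0..360 range. A simple normalization before passing it on:

// normalize the combined heading into 0..360 degrees
float heading = (float) ((azimuth + degToAdd + 360.0) % 360.0);
mapView.setFacingDirection(heading);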

Related

Android camera affects sensors (Accelerometer & Magnetic Field) while phone faces user

For an application I'm making, I need a camera and a compass. The application is locked to landscape mode in the manifest.
First I implemented the compass. As suggested in the Android documentation, I used two sensors: accelerometer and magnetic field. This is how I've done it:
I have my activity implement SensorEventListener. In onCreate() I initialize my sensorManager using:
sManager = (SensorManager) getSystemService(SENSOR_SERVICE);
I register my listeners in onResume() like so:
sManager.registerListener(this, sManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),SensorManager.SENSOR_DELAY_NORMAL);
sManager.registerListener(this, sManager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD),SensorManager.SENSOR_DELAY_NORMAL);
and of course unregister them in onPause().
I don't use onAccuracyChanged(). This is what I do in onSensorChanged():
@Override
public void onSensorChanged(SensorEvent event) {
    switch (event.sensor.getType()) {
        case Sensor.TYPE_MAGNETIC_FIELD:
            mags = event.values.clone();
            break;
        case Sensor.TYPE_ACCELEROMETER:
            accels = event.values.clone();
            break;
    }
    if (mags != null && accels != null) {
        gravity = new float[9];
        magnetic = new float[9];
        SensorManager.getRotationMatrix(gravity, magnetic, accels, mags);
        float[] outGravity = new float[9];
        float inclination = (float) Math.acos(gravity[8]);
        if (inclination < Math.toRadians(25)
                || inclination > Math.toRadians(155)) {
            // device is close to flat. Remap for landscape.
            SensorManager.remapCoordinateSystem(gravity, SensorManager.AXIS_Y, SensorManager.AXIS_MINUS_X, outGravity);
            SensorManager.getOrientation(outGravity, values);
        } else {
            // device is not flat. Remap for landscape and perpendicular.
            SensorManager.remapCoordinateSystem(gravity, SensorManager.AXIS_X, SensorManager.AXIS_Z, outGravity);
            SensorManager.getOrientation(outGravity, values);
        }
        azimuth = Math.round(Math.toDegrees(values[0]));
    }
}
As you can see, I differentiate between the phone lying flat on the table and the user holding it up (as you would when taking a picture). When I use this code alone, everything works more or less: I get correct azimuth values both when the phone is lying on the table and when it is held perpendicular to the table (about 5-10 degrees of difference, but I can live with that).
The problem starts when adding the camera preview to the application.
I have my activity implement SurfaceHolder.Callback. I initialize my camera in onCreate():
SurfaceView cameraView = (SurfaceView) findViewById(R.id.camera_view);
surfaceHolder = cameraView.getHolder();
surfaceHolder.addCallback(this);
if (Build.VERSION.SDK_INT < Build.VERSION_CODES.HONEYCOMB) {
    surfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
}
This is how I implement the interface:
@Override
public void surfaceCreated(SurfaceHolder surfaceHolder) {
    camera = Camera.open();
    camera.setDisplayOrientation(0);
}

@Override
public void surfaceChanged(SurfaceHolder surfaceHolder, int format, int width, int height) {
    if (isCameraOn) {
        camera.stopPreview();
        isCameraOn = false;
    }
    if (camera != null) {
        try {
            camera.setPreviewDisplay(surfaceHolder);
            camera.startPreview();
            isCameraOn = true;
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}

@Override
public void surfaceDestroyed(SurfaceHolder surfaceHolder) {
    camera.stopPreview();
    camera.release();
    camera = null;
}
When I add the camera code to my project and show the preview on the phone's screen, my sensors suddenly stop working properly when the phone is held perpendicular. If the phone is lying flat on the table, the azimuth values I get are correct. When the phone is held perpendicular to the table, my azimuth values are off by about 40 degrees (though stable).
I've tried looking for a solution (both by myself and online), but so far my efforts have been in vain. I would love to get some direction on how to tackle this problem. Thanks!
First, the TYPE_MAGNETIC_FIELD sensor is not available on all devices.
You can use the TYPE_ACCELEROMETER sensor alone to accomplish your requirement.
Retrieve the accelerometer sensor:
Sensor accelerometer = mSensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
Copy the values when the sensor change event is called:
@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER)
        mGravity = event.values.clone(); // clone: the event's array is reused by the system
}
Then you can use the function below to get the angle values for all axes.
public int[] getDeviceAngles() {
    float[] g = mGravity.clone();
    double normOfG = Math.sqrt(g[0] * g[0] + g[1] * g[1] + g[2] * g[2]);
    // Normalize the accelerometer vector
    g[0] = (float) (g[0] / normOfG);
    g[1] = (float) (g[1] / normOfG);
    g[2] = (float) (g[2] / normOfG);
    int x = (int) Math.round(Math.toDegrees(Math.atan2(g[1], g[0])));
    int pitch = (int) Math.round(Math.toDegrees(Math.atan2(g[1], g[2])));
    int rollValue = (int) Math.round(Math.toDegrees(Math.atan2(g[2], g[0])));
    int pitchValue = pitch * -1;
    int[] values = new int[3];
    values[0] = x;
    values[1] = pitchValue;
    values[2] = rollValue;
    // values contains: azimuth, pitch and roll
    return values;
}
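A minimal usage sketch (my addition; note that an accelerometer alone measures orientation relative to gravity, so treat the first value as a rotation angle rather than a true magnetic compass heading):

if (mGravity != null) {
    int[] angles = getDeviceAngles();
    int rotation = angles[0]; // rotation derived from the gravity vector
    int pitch = angles[1];
    int roll = angles[2];
    Log.d("Sensors", "rotation=" + rotation + " pitch=" + pitch + " roll=" + roll);
}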

Using accelerometer and magnetic sensor to detect the directions

I am working on an Android project where I need to detect the exact direction the user turns the phone (i.e. left, right, top or bottom). I have managed to detect the directions using the accelerometer together with the magnetic sensor, but I couldn't fix the maximum values for a turn.
The problem I face is this:
I am unable to make it detect the center point/position.
When a user turns the phone left from the central position, it detects it as a left turn.
However, when I return the phone to the central position, it is detected as a second movement, also to the left. My understanding is this: if the sensor had the central position (point of origin) correctly fixed, the movements would be detected correctly. But because the central position is not fixed, this is not happening.
Kindly guide me in fixing it. Do let me know if you need further clarifications.
My code is below:
public void onSensorChanged(SensorEvent event) {
    azimuth = event.values[0]; // azimuth
    pitch = event.values[1];   // pitch
    roll = event.values[2];    // roll
    if (pitch < -45 && pitch > -135) {
        currentSide = Side.TOP;
    } else if (pitch > 45 && pitch < 135) {
        currentSide = Side.BOTTOM;
    } else if (roll > 45) {
        currentSide = Side.RIGHT;
    } else if (roll < -45) {
        currentSide = Side.LEFT;
    }
    if (currentSide != null && !currentSide.equals(oldSide)) {
        switch (currentSide) {
            case TOP:
                listener.onTopUp();
                break;
            case BOTTOM:
                listener.onBottomUp();
                break;
            case LEFT:
                listener.onLeftUp();
                break;
            case RIGHT:
                listener.onRightUp();
                break;
        }
        oldSide = currentSide;
    }
    // forward orientation to the OrientationListener
    listener.onOrientationChanged(azimuth, pitch, roll);
}
Thanks for helping!

How to get phone heading for augmented reality?

Possible duplicate: Android getOrientation Azimuth gets polluted when phone is tilted
I am new to Android development and I am trying to get the direction the user is facing.
After searching the web I am pretty sure I need to use remapCoordinateSystem; the Android documentation says it is suitable for an augmented reality app.
I am trying to use some existing code to be sure this is the right approach.
(Ultimately I want to create an augmented reality application.)
The following code should return the user's facing in degrees:
Code:
Sensor sensor = event.sensor;
if (sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
    gravity = event.values.clone();
} else if (sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) {
    geomagnetic = event.values.clone();
}
if (gravity != null && geomagnetic != null) {
    _rM = SensorManager.getRotationMatrix(R, null, gravity, geomagnetic);
    if (_rM) {
        // get the current screen rotation (landscape, portrait)
        int mScreenRotation = activity.getWindowManager().getDefaultDisplay().getRotation();
        switch (mScreenRotation) {
            case Surface.ROTATION_0:
                axisX = SensorManager.AXIS_X;
                axisY = SensorManager.AXIS_Y;
                break;
            case Surface.ROTATION_90:
                axisX = SensorManager.AXIS_Y;
                axisY = SensorManager.AXIS_MINUS_X;
                break;
            case Surface.ROTATION_180: // not handled by my phone, so I can't test it
                axisX = SensorManager.AXIS_MINUS_X;
                axisY = SensorManager.AXIS_MINUS_Y;
                break;
            case Surface.ROTATION_270:
                axisX = SensorManager.AXIS_MINUS_Y;
                axisY = SensorManager.AXIS_X;
                break;
            default:
                break;
        }
        SensorManager.remapCoordinateSystem(R, axisX, axisY, outR);
        SensorManager.getOrientation(outR, gO);
    }
}
userFacing = (float) Math.toDegrees(gO[0]); // radians to degrees
if (userFacing < 0) { userFacing += 360; }  // map -180..180 to 0..360
return userFacing;
Actually I am able to get an accurate compass heading when the phone is facing down, but when I move the phone (as a user would do) the results are very inaccurate: when I tilt the device, the direction can shift by 40°...
I saw this link: https://stackoverflow.com/questions/17979238/android-getorientation-azimuth-gets-polluted-when-phone-is-tilted
But if I stay tilted, the results become inaccurate again (it's an average, so that's expected!)
And I need the compass to work at all times...
I thought about using TYPE_GYROSCOPE, but not all devices have a gyroscope, so I need another solution for all the other devices!
Hope you can understand my problem, and sorry for my bad English! (I'm French)
Solution
OK, so after porting the application to iOS, I decided to check whether the Android methods were as good as I thought. A month ago I posted a solution, and it was really wrong. I made a big mistake with remapCoordinateSystem: I used AXIS_X and AXIS_Y to get my matrix and then tried to correct the values myself. As @HoanNguyen suggested, we just have to use AXIS_X and AXIS_Z and Android handles the rest.
So here is the final code. It is a lot shorter and easier:
if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
    gravity = lowPass(event.values.clone(), gravity);
} else if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) {
    geomagnetic = lowPass(event.values.clone(), geomagnetic);
}
if (geomagnetic != null && gravity != null) {
    if (SensorManager.getRotationMatrix(R_in, null, gravity, geomagnetic)) {
        SensorManager.remapCoordinateSystem(R_in, SensorManager.AXIS_X, SensorManager.AXIS_Z, R_out);
        SensorManager.getOrientation(R_out, gO);
    }
}
And to get the magnetic north heading, you just use gO[0], e.g. heading = gO[0]; (caution: the returned values are in radians).
Hope it can help someone!
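The lowPass() helper used above is not defined in the post; a common implementation (an assumption on my part, not the author's code) is simple exponential smoothing:

// exponentially smooth new readings against the previous output;
// ALPHA is assumed here - smaller values smooth more but react more slowly
static final float ALPHA = 0.25f;

protected float[] lowPass(float[] input, float[] output) {
    if (output == null) return input; // first reading: nothing to smooth against
    for (int i = 0; i < input.length; i++) {
        output[i] = output[i] + ALPHA * (input[i] - output[i]);
    }
    return output;
}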
Both your code and the solution you found are wrong, if by "phone facing" you mean the opposite direction of the device z-axis. For augmented reality, you just call
remapCoordinateSystem(inR, AXIS_X, AXIS_Z, outR);
independently of the device orientation. The azimuth returned by getOrientation() then gives the direction the phone is facing with respect to magnetic north.
These two calls amount to projecting the device z-axis onto the XY plane of the world coordinates and then calculating the direction of the resulting vector.
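For example, converting the resulting azimuth to a 0..360 degree heading (a sketch; gO is assumed to be the float[3] passed to getOrientation()):

float headingDeg = (float) Math.toDegrees(gO[0]); // azimuth in radians -> degrees
if (headingDeg < 0) {
    headingDeg += 360f; // map -180..180 to 0..360
}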

Updating a TextView based on sensor data

I'm trying to update a TextView based on sensor input, more precisely the pitch. I have no problem getting the sensor data, converting it to degrees, and displaying it in a TextView.
The problem is that I want different numbers displayed based on the pitch in degrees. I have written an if/else-if statement and placed it in onSensorChanged, but apart from the initial number it does not update.
@Override
public void onSensorChanged(SensorEvent event) {
    switch (event.sensor.getType()) {
        case Sensor.TYPE_ACCELEROMETER:
            for (int i = 0; i < 3; i++) {
                valuesAccelerometer[i] = event.values[i];
            }
            break;
        case Sensor.TYPE_MAGNETIC_FIELD:
            for (int i = 0; i < 3; i++) {
                valuesMagneticField[i] = event.values[i];
            }
            break;
    }
    boolean success = SensorManager.getRotationMatrix(
            matrixR,
            matrixI,
            valuesAccelerometer,
            valuesMagneticField);
    if (success) {
        SensorManager.getOrientation(matrixR, matrixValues);
        // float to double
        double pitch = Math.toDegrees(matrixValues[1]);
        // 1 decimal
        pitch = Math.abs(round(pitch, 0));
        // set TextView vinkel to degrees
        vinkel.setText(String.valueOf(pitch));
        // find tube size from the EditText
        String tubesizestring = tubesize.getText().toString();
        if (tubesizestring == "1000") { // note: comparing Strings with == is itself a bug; use equals()
            if (pitch >= 0.6) {
                kwh.setText("2,69");
            } else if (pitch >= 1.0) {
                kwh.setText("3,47");
            } else if (pitch >= 2.0) {
                kwh.setText("4,90");
            } else if (pitch >= 5.0) {
                kwh.setText("7,75");
            } else if (pitch >= 10.0) {
                kwh.setText("10,96");
            } else if (pitch >= 20.0) {
                kwh.setText("15,50");
            } else if (pitch >= 30.0) {
                kwh.setText("18,99");
            } else {
                kwh.setText("more than 30 degrees");
            }
        }
    }
}
I hope it is clear what I'm trying to do; otherwise, please ask.
Hope somebody can point me in the right direction.
It doesn't work because your logic is fundamentally flawed. Assume the pitch is around 25: it is greater than 0.6, so only the first branch ever executes, since the others are else-if statements. To get it to work, reverse the order of the checks:
if (pitch >= 30.0) {
    kwh.setText("18,99");
} else if (pitch >= 20.0) {
    kwh.setText("15,50");
} else if (pitch >= 10.0) {
    kwh.setText("10,96");
} else if (pitch >= 5.0) {
    kwh.setText("7,75");
} else if (pitch >= 2.0) {
    kwh.setText("4,90");
} else if (pitch >= 1.0) {
    kwh.setText("3,47");
} else if (pitch >= 0.6) {
    kwh.setText("2,69");
} else {
    kwh.setText("less than 0,6 degrees"); // after reordering, this branch means pitch < 0.6
}

Android Camera AutoFocus on Demand

The built-in camcorder app (like the one on the HTC EVO) seems to call camera.autoFocus() only when the preview image changes. If you hold the camera steady, no autofocus happens.
I would like to duplicate this behavior while camera.startPreview() is active, as in the initial preview setup code below:
camera = Camera.open(); // Camera.open() is a static method
Camera.Parameters parameters = camera.getParameters();
List<String> focusModes = parameters.getSupportedFocusModes();
if (focusModes.contains(Camera.Parameters.FOCUS_MODE_AUTO)) {
    parameters.setFocusMode(Camera.Parameters.FOCUS_MODE_AUTO);
}
camera.setParameters(parameters);
camera.setPreviewDisplay(holder);
camera.startPreview();
All the examples I found for autoFocus() seem to call it every 500 ms to 2000 ms, or once just before the picture is taken or recording is started.
The EVO camcorder app seems to use a sensor or an algorithm to trigger autoFocus(). However this trigger is done, it works exceptionally well. Does anyone know how to trigger autoFocus() on demand when it is needed, such as when the camera is moved closer to or farther from the subject, or is panned slightly?
Thank you,
Gerry
Android has supported continuous autofocus since API level 9 (Gingerbread) via FOCUS_MODE_CONTINUOUS_VIDEO. It works better than calling Camera.autoFocus() periodically.
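A minimal sketch of enabling it (checking for support first, since not every device offers the mode):

Camera.Parameters params = camera.getParameters();
List<String> modes = params.getSupportedFocusModes();
if (modes.contains(Camera.Parameters.FOCUS_MODE_CONTINUOUS_VIDEO)) {
    params.setFocusMode(Camera.Parameters.FOCUS_MODE_CONTINUOUS_VIDEO);
    camera.setParameters(params); // camera now refocuses continuously on its own
}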
I had the same problem in one of my applications.
My solution was to use a sensor listener and trigger autofocus when the user moved the device beyond some threshold. Here is the code.
public void setCameraFocus(AutoFocusCallback autoFocus) {
    if (mCamera.getParameters().getFocusMode().equals(Camera.Parameters.FOCUS_MODE_AUTO) ||
        mCamera.getParameters().getFocusMode().equals(Camera.Parameters.FOCUS_MODE_MACRO)) {
        mCamera.autoFocus(autoFocus);
    }
}
The callback for auto focus:
// this is the autofocus callback
private AutoFocusCallback myAutoFocusCallback = new AutoFocusCallback() {
    public void onAutoFocus(boolean autoFocusSuccess, Camera arg1) {
        //Wait.oneSec();
        mAutoFocus = true;
    }
};
And this is how the focus call is triggered:
public void onSensorChanged(SensorEvent event) {
    if (mInvalidate) {
        mView.invalidate();
        mInvalidate = false;
    }
    float x = event.values[0];
    float y = event.values[1];
    float z = event.values[2];
    if (!mInitialized) {
        mLastX = x;
        mLastY = y;
        mLastZ = z;
        mInitialized = true;
    }
    float deltaX = Math.abs(mLastX - x);
    float deltaY = Math.abs(mLastY - y);
    float deltaZ = Math.abs(mLastZ - z);
    // refocus when the device moves on any axis, but only while not already autofocusing
    if ((deltaX > .5 || deltaY > .5 || deltaZ > .5) && mAutoFocus) {
        mAutoFocus = false;
        mPreview.setCameraFocus(myAutoFocusCallback);
    }
    mLastX = x;
    mLastY = y;
    mLastZ = z;
}
You can see the complete project here: http://adblogcat.com/a-camera-preview-with-a-bounding-box-like-google-goggles/
It is quite possible to trigger a refocus with a simpler technique: if you flash a white box within the camera's view (from code, not a real box), it will rapidly trigger a refocus. I own the EVO 4G and one of the previous posters is correct: since the Gingerbread update it continually refocuses without ever needing to change what it is looking at.
For taking pictures, you can set FOCUS_MODE_CONTINUOUS_PICTURE (available since API level 14). From the documentation:
Applications can call autoFocus(AutoFocusCallback) in this mode. If the autofocus is in the middle of scanning, the focus callback will return when it completes. If the autofocus is not scanning, the focus callback will immediately return with a boolean that indicates whether the focus is sharp or not. The apps can then decide if they want to take a picture immediately or to change the focus mode to auto, and run a full autofocus cycle.
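In practice that means you can run a focus check just before capturing, something like this sketch (jpegCallback is an assumed Camera.PictureCallback, not from the quoted documentation):

camera.autoFocus(new Camera.AutoFocusCallback() {
    @Override
    public void onAutoFocus(boolean success, Camera cam) {
        // fires immediately if focus is already sharp, otherwise when scanning completes
        cam.takePicture(null, null, jpegCallback);
    }
});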
I would make use of the SensorEventListener. All you need to do is listen to sensor events and fire the autofocus once the phone's orientation has changed by a sufficient threshold.
http://developer.android.com/reference/android/hardware/SensorEventListener.html
