Android: using RotateAnimation to point towards a specific location

I am trying to create an arrow that points towards a specific coordinate, like a compass.
I am using Sensor.TYPE_ACCELEROMETER and Sensor.TYPE_MAGNETIC_FIELD to calculate the azimuth angle (I used this question as a reference: Using orientation sensor to point towards a specific location):
// calculate the rotation matrix
if (gravity != null && geomag != null) {
    boolean success = SensorManager.getRotationMatrix(inR, I, gravity, geomag);
    if (success) {
        SensorManager.getOrientation(inR, orientVals);
        azimuth = Math.toDegrees(orientVals[0]);
        pitch = Math.toDegrees(orientVals[1]);
        roll = Math.toDegrees(orientVals[2]);
    }
}
azimuth += geomagneticField.getDeclination();
//azimuth = Math.round(sensorEvent.values[0]);
float bearing = lastLocation.bearingTo(targetLocation);
float angle = (float) azimuth + bearing;
I then use RotateAnimation to rotate the arrow itself:
// the +90 below is because the ImageView arrow points left by default
RotateAnimation animation = new RotateAnimation(
        -(angle + 90),
        -(angle + 90),
        Animation.RELATIVE_TO_SELF, 0.5f,
        Animation.RELATIVE_TO_SELF, 0.5f);
However, when I tested four different targetLocations, one in each direction (north, east, south, and west), the arrow only pointed correctly towards the east and west locations. For locations to the north and south, the arrow was rotated by 180 degrees, meaning it pointed in exactly the opposite direction. If I add 180 degrees to the rotation, the arrow correctly points to the north and south locations, but then the east and west locations are wrong.
It is really frustrating and this makes absolutely no sense to me.
I would be really really thankful if someone could help me out here! Thanks in advance!

I did some further tests and realized that the bearing and azimuth angles each have the range (0...180, -180...0), going clockwise and starting from 0, whereas RotateAnimation takes angles from 0...360 clockwise. Therefore I converted the azimuth and bearing angles by adding 360 whenever they were smaller than 0 before proceeding with the calculations.
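For illustration, a minimal sketch of that conversion (the helper name is mine, not from the original code):
// Map an angle from (-180, 180] to [0, 360), keeping the clockwise direction.
private static float normalize(float degrees) {
    return degrees < 0f ? degrees + 360f : degrees;
}
// Hypothetical usage with the values computed above:
float normalizedAzimuth = normalize((float) azimuth);
float normalizedBearing = normalize(bearing);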


SensorManager.getOrientation returns values that are not in angles

I am trying to implement a business logic layer that tells me the device's current angle around the x axis, in the range 180 to -180 degrees. It will be used for a camera screen that requires the user to hold the phone vertically.
To do so, I listen to both the TYPE_ACCELEROMETER and TYPE_MAGNETIC_FIELD sensor types, as suggested in the official docs -
https://developer.android.com/guide/topics/sensors/sensors_position#sensors-pos-prox
And I ended up with the following code -
override fun onSensorChanged(event: SensorEvent?) {
    val value = event?.values ?: return
    var accelerometerReading = floatArrayOf()
    var magnetometerReading = floatArrayOf()
    when (event.sensor.type) {
        TYPE_ACCELEROMETER -> {
            accelerometerReading = floatArrayOf(value[0], value[1], value[2])
        }
        TYPE_MAGNETIC_FIELD -> {
            magnetometerReading = floatArrayOf(value[0], value[1], value[2])
        }
    }
    val rotationMatrix = FloatArray(9)
    if (magnetometerReading.isEmpty() || accelerometerReading.isEmpty()) return
    SensorManager.getRotationMatrix(rotationMatrix, FloatArray(9), accelerometerReading, magnetometerReading)
    val orientationAngles = FloatArray(3)
    SensorManager.getOrientation(rotationMatrix, orientationAngles) // always returns the same values that are provided to it. why?
}
As you can see, I implemented the exact code from the official docs, but none of the three elements of the orientationAngles array is anywhere near the range of 180 to -180. I get values very similar to the input, something like [-0.051408034, -0.007878973, 0.04735359], which looks like random, irrelevant data to me.
Any idea why this happens and how to get what I want, namely the device's angle around the x axis?
Edit:
I'll try to simplify what I want.
Imagine holding the device in portrait mode, facing you. In a perfect portrait stance I want to get a value of 90 degrees from the sensor. When the user tilts the device left or right, the value should go down towards 0 or up towards 180 (which side is which doesn't matter). All I need is this two-dimensional x-axis angle.
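For what it's worth, here is a minimal sketch (in Java) of one way to get exactly that two-dimensional angle from the accelerometer alone; this is my own suggestion, not taken from the answer below:
// Assumes the device is held roughly vertical and event comes from TYPE_ACCELEROMETER.
// atan2 of the gravity components in the screen plane gives the tilt angle:
// about 90 when upright in portrait, moving towards 0 or 180 as the device
// tilts sideways (it goes negative once tilted past horizontal).
float tiltDeg = (float) Math.toDegrees(Math.atan2(event.values[1], event.values[0]));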
getOrientation gives the angles in radians, not degrees. Almost nothing in math uses degrees beyond grade-school math; radians are the norm. Radians generally go from 0 to 2π, corresponding to 0 to 360 degrees. The conversion formula is degrees = radians / π * 180.
According to the docs, the angles returned by getOrientation are radians in the range -π to π, with the meaning of 0 defined per angle. From the docs:
values[0]: Azimuth, angle of rotation about the -z axis. This value represents the angle between the device's y axis and the magnetic north pole. When facing north, this angle is 0, when facing south, this angle is π. Likewise, when facing east, this angle is π/2, and when facing west, this angle is -π/2. The range of values is -π to π.
values[1]: Pitch, angle of rotation about the x axis. This value represents the angle between a plane parallel to the device's screen and a plane parallel to the ground. Assuming that the bottom edge of the device faces the user and that the screen is face-up, tilting the top edge of the device toward the ground creates a positive pitch angle. The range of values is -π/2 to π/2.
values[2]: Roll, angle of rotation about the y axis. This value represents the angle between a plane perpendicular to the device's screen and a plane perpendicular to the ground. Assuming that the bottom edge of the device faces the user and that the screen is face-up, tilting the left edge of the device toward the ground creates a positive roll angle. The range of values is -π to π.
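So to get values in the degree range the question expects, convert each radian output to degrees. A minimal sketch in Java (orientationAngles is the array filled by getOrientation above):
// Convert getOrientation's radian outputs to degrees.
// Azimuth and roll end up in (-180, 180], pitch in [-90, 90].
float azimuthDeg = (float) Math.toDegrees(orientationAngles[0]);
float pitchDeg = (float) Math.toDegrees(orientationAngles[1]);
float rollDeg = (float) Math.toDegrees(orientationAngles[2]);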

Conversion from phone axes to world coordinate axes

I am working on a project where I want to convert readings from my Android phone's axes to the world coordinate axes. I have the acceleration along the phone's three axes (x, y, z) and the three angles: azimuth, pitch, and roll. How can I convert the phone's axes to the world coordinate axes (true north, true east)?
What I have been doing so far gives me the wrong result.
This is my approach.
Let's say
-0.030029837, -0.008528218, -0.199289320 is the acceleration along the phone's X, Y, Z axes, and
0.01618620, 0.48581530, 0.19617330 are the three angles (in radians): azimuth, pitch, and roll.
To get the acceleration along "true north" I simply take the acceleration along the Y axis and multiply it by the cosines of all three angles, i.e.
-0.008528218 * cos(0.19617330) * cos(0.48581530) * cos(0.01618620) = -0.00739584
To get the acceleration along "true east" I simply take the acceleration along the X axis and multiply it by the cosines of all three angles, i.e.
-0.030029837 * cos(0.19617330) * cos(0.48581530) * cos(0.01618620) = -0.026042471
To get the acceleration along "vertically upwards" I simply take the acceleration along the Z axis and multiply it by the cosines of two of the angles, i.e.
-0.199289320 * cos(0.19617330) * cos(0.48581530) = -0.172850298
Now, to check whether I have done this correctly, I did a consistency test:
the magnitude of the acceleration along the phone's X, Y, Z axes should equal the magnitude of the acceleration along true east, true north, and vertically upwards.
But they come out unequal:
0.20 is not equal to 0.17.
Where am I going wrong? It would be a great help if someone could help me out.
Thanks a lot in advance.
I am assuming that your azimuth, pitch and roll angles represent how much the phone's axes are rotated with respect to the world axes. If my assumption is right, then the following lines of code will do what you need.
float yaw = (float) Math.toDegrees(0.01618620);
float pitch = (float) Math.toDegrees(0.48581530);
float roll = (float) Math.toDegrees(0.19617330);

// Homogeneous 4-vector: the acceleration along the phone's axes plus a trailing 1.
float[] acc = { -0.030029837f, -0.008528218f, -0.199289320f, 1 };

// Build the phone-to-world rotation by applying yaw (z), pitch (y), roll (x) in turn.
float[] matrix = new float[16];
Matrix.setIdentityM(matrix, 0);
Matrix.rotateM(matrix, 0, yaw, 0, 0, 1);
Matrix.rotateM(matrix, 0, pitch, 0, 1, 0);
Matrix.rotateM(matrix, 0, roll, 1, 0, 0);

// Transform the acceleration vector into world coordinates.
float[] accT = new float[4];
Matrix.multiplyMV(accT, 0, matrix, 0, acc, 0);

// Consistency check: a rotation preserves the vector's magnitude.
float rawMagnitude = Matrix.length(acc[0], acc[1], acc[2]);
System.out.println(rawMagnitude);
float transformedMagnitude = Matrix.length(accT[0], accT[1], accT[2]);
System.out.println(transformedMagnitude);
The magnitudes printed to the console are
0.2017195
0.20171948
I used android.opengl.Matrix. The static methods used here are self-explanatory; for more details, see the API documentation.
If my assumption is exactly the opposite, that is, if your azimuth, pitch and roll angles represent how much the world axes deviate with respect to the phone's axes, then just make your azimuth, pitch and roll angles negative.
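Equivalently (my own note, not part of the original answer), since a pure rotation matrix's inverse is its transpose, you could keep the angles as they are and transpose the matrix instead of negating them:
// Hypothetical alternative: undo the rotation by transposing it.
float[] inverse = new float[16];
Matrix.transposeM(inverse, 0, matrix, 0);
Matrix.multiplyMV(accT, 0, inverse, 0, acc, 0);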

Android getOrientation() returns azimuth, positive or negative?

Assumption: the phone is held flat (parallel with the ground).
I am using getRotationMatrix() and getOrientation(float[] R, float[] values) to get the azimuth. Under this assumption, the azimuth is simply values[0].
The documentation says:
All three angles above are in radians and positive in the counter-clockwise direction.
Then I checked: when my phone's y axis points North, the azimuth is indeed 0.
However, here comes the problem: when my phone's y axis points West, i.e. I rotated my phone counter-clockwise to make it point from North to West, the azimuth is negative!
Shouldn't the angle be positive when the phone is rotated counter-clockwise from North?
Where does it go wrong?
No. If the positive Z direction pointed to the sky, then the angle would indeed be positive when the y axis points West. But in the coordinate system used by getOrientation, the positive Z direction points down toward the earth, so what looks counter-clockwise from above is actually clockwise, and West comes out negative.
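If you want a conventional compass heading that grows clockwise from 0 to 360, one common conversion (a sketch under that assumption, using the values array from getOrientation) is:
// Convert the azimuth (radians, -π..π, negative towards West) into a
// 0..360-degree clockwise compass heading.
float heading = (float) ((Math.toDegrees(values[0]) + 360.0) % 360.0);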

android compass rotateanimation

I'm making a compass that points to a user-defined location. I'm using RotateAnimation to rotate the needle. When the needle lines up with the phone's heading, I know the phone is pointing in the direction I want. However, I want the needle to point in the correct direction regardless of the phone's azimuth.
The problem is that it seems that rotateanimation does not rotate the needle according to the real world coordinates, and instead is relative to the phone's screen. So a 58 degree rotation of the needle does not match a 58 degree rotation in the real world. Is this true or am I making a mistake in my code?
The compass is meant to be used by placing the phone's back flat on a surface. I've also tried outputting the azimuth and it reads like this:
Azimuth    Actual phone angle
0          0
45         90
90         180
When it gets close to a full circle, the reading bounces between 120 and 340.
Here's the code:
direction = 360 - azimuth + rotate;
RotateAnimation animate = new RotateAnimation(rotateLast, direction, Animation.RELATIVE_TO_SELF, 0.5f, Animation.RELATIVE_TO_SELF, 0.5f);
animate.setFillAfter(true);
animate.setInterpolator(new LinearInterpolator());
animate.setDuration(10);
needle.startAnimation(animate);
rotateLast = direction;
azimuth is the phone's azimuth from the sensor, rotate is the user-specified direction (in degrees from north), and direction is the required rotation of the needle.
rotateLast is the last position of the needle; I use it because without it the needle snaps back to zero degrees and flickers.
Thanks,
P.S. this has been driving me crazy
I figured it out: my math was all wrong and I had misunderstood how the azimuth affects the rotation.
I realized, when I rotated the image by just the azimuth, that the needle ended up pointing north. All I needed to do was add the user direction on top of that. The math I had been using caused the needle to rotate in unpredictable ways.
The answer is simply:
RotateAnimation animate = new RotateAnimation(-azimuth+rotation, -azimuth+rotation, Animation.RELATIVE_TO_SELF, 0.5f, Animation.RELATIVE_TO_SELF, 0.5f);
The -azimuth is because of how RotateAnimation rotates counter-clockwise.
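For completeness, a sketch of how that one-liner might be wired up, reusing the setFillAfter/setDuration/startAnimation calls from the code in the question:
RotateAnimation animate = new RotateAnimation(
        -azimuth + rotation, -azimuth + rotation,
        Animation.RELATIVE_TO_SELF, 0.5f,
        Animation.RELATIVE_TO_SELF, 0.5f);
animate.setFillAfter(true); // keep the needle at its final angle
animate.setDuration(10);    // short duration, as in the question's code
needle.startAnimation(animate);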

Sensor value interpretation

I am currently trying to understand the sensor values I get from code similar to this.
The yaw/azimuth value seems to be okay. The problem is the pitch value, because I get -90° when the device is upright, and tilting back and forward leads to the same values.
Let's say I tilt 45° forward - the value is -45°, the same as tilting the device 45° backward.
This way I cannot determine the device's pitch over a full 360°.
Can somebody help me with that?
Taken from http://developer.android.com/reference/android/hardware/SensorListener.html:
All values are angles in degrees.
values[0]: Azimuth, rotation around the Z axis (0<=azimuth<360). 0 = North, 90 = East, 180 = South, 270 = West
values[1]: Pitch, rotation around X axis (-180<=pitch<=180), with positive values when the z-axis moves toward the y-axis.
values[2]: Roll, rotation around Y axis (-90<=roll<=90), with positive values when the z-axis moves toward the x-axis.
Note that this definition of yaw, pitch and roll is different from the traditional definition used in aviation where the X axis is along the long side of the plane (tail to nose).
So pitch runs from -180° to 180° instead of 0° to 360°. The difference is that forward shows -45° and backward should show 45°, right?
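If getOrientation only ever reports pitch between -90° and 90°, as the question observes, one workaround (my own suggestion, not part of the answer above) is to use the sign of the accelerometer's z reading to tell forward from backward tilt and unfold the value into -180°..180°:
// pitchDeg: pitch from getOrientation, converted to degrees (-90..90).
// accelZ: raw accelerometer z value; positive while the screen faces up,
// negative while it faces down, near zero when the device is upright.
float fullPitch = (accelZ >= 0f)
        ? pitchDeg
        : Math.copySign(180f, pitchDeg) - pitchDeg;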
