Translating Accelerometer Vectors from Device to Earth Coordinates - android

I have scoured the internet and StackOverflow to find a way to translate Android accelerometer vectors from the device coordinate system to the Earth coordinate system. I am using Sensor.TYPE_ROTATION_VECTOR to do this. I am accessing the sensors through NativeScript and using MathJS for the matrix computations.
1. I grab the rotation vector using Sensor.TYPE_ROTATION_VECTOR.
2. I calculate the rotation matrix following the Android code for getRotationMatrixFromVector() given at https://android.googlesource.com/platform/frameworks/base/+/master/core/java/android/hardware/SensorManager.java.
---2.1 I calculate q0 = sqrt(1 - q1^2 - q2^2 - q3^2), with q1 = acceleration.x, q2 = acceleration.y, q3 = acceleration.z.
---2.2 I tried calculating the rotation matrix using both the float[9] and float[16] matrix sizes, but I can't get either of them to work.
---2.3 I tried inverting and transposing both these rotation matrices.
3. I then multiply the rotation matrix (tried normal, inverted, and transposed) by the 4x1 matrix [[accel.x],[accel.y],[accel.z],[0.0]].
When I look at my newly translated acceleration values, they aren't rotated at all. If I point my device's x-axis toward the Earth's north and accelerate it in this direction, my acceleration.x will be >1 and the other values almost 0; here the translated acceleration.y should be >1, since the Earth's y-axis points north. Similarly, if I point my device's y-axis toward the sky and accelerate it in this direction, my acceleration.y reads >1 when the translated acceleration.z should read >1.
Can anyone tell me what I am doing wrong in these steps? The consensus seems to be: grab the rotation vector, transform it to a rotation matrix, invert the matrix, then multiply this inverted matrix by the accelerometer vector. It doesn't work for me. Thanks.
var r;
var rinv;
accelerometer.startAccelerometerUpdates(function (accel) {
    if (accel.sensortype == 11) { // Sensor.TYPE_ROTATION_VECTOR
        // reconstruct the quaternion's scalar part from the vector part
        var q0 = Math.sqrt(1 - accel.x*accel.x - accel.y*accel.y - accel.z*accel.z);
        var q1 = accel.x;
        var q2 = accel.y;
        var q3 = accel.z;
        // calculate rotation matrix from unit quaternion
        var sq1 = 2*q1*q1;
        var sq2 = 2*q2*q2;
        var sq3 = 2*q3*q3;
        var q1q2 = 2*q1*q2;
        var q3q0 = 2*q3*q0;
        var q1q3 = 2*q1*q3;
        var q2q0 = 2*q2*q0;
        var q2q3 = 2*q2*q3;
        var q1q0 = 2*q1*q0;
        //r = math.matrix([[1-sq2-sq3,q1q2-q3q0,q1q3+q2q0],[q1q2+q3q0,1-sq1-sq3,q2q3-q1q0],[q1q3-q2q0,q2q3+q1q0,1-sq1-sq2]]); // 3x3 variant
        r = math.matrix([[1-sq2-sq3,q1q2-q3q0,q1q3+q2q0,0.0],[q1q2+q3q0,1-sq1-sq3,q2q3-q1q0,0.0],[q1q3-q2q0,q2q3+q1q0,1-sq1-sq2,0.0],[0.0,0.0,0.0,1.0]]);
        rinv = math.inv(r);
    }
    if (accel.sensortype == 10) { // Sensor.TYPE_LINEAR_ACCELERATION
        // filter accelerometer errors (oldAX/oldAY/oldAZ hold the previous
        // sample's magnitudes and are maintained elsewhere in my code)
        if (Math.abs(accel.x) < 10 || Math.abs(accel.y) < 10 || Math.abs(accel.z) < 10) {
            if ((Math.abs(accel.x) > .15 && Math.abs(accel.x)/oldAX < 2) || (Math.abs(accel.y) > .15 && Math.abs(accel.y)/oldAY < 2) || (Math.abs(accel.z) > .15 && Math.abs(accel.z)/oldAZ < 2)) {
                var Ad = math.matrix([[accel.x],[accel.y],[accel.z],[0.0]]);
                // transform acceleration values from device coordinates to Earth coordinates
                var Ag = math.multiply(rinv, Ad);
                var ax = Ag.get([0,0]);
                var ay = Ag.get([1,0]);
                var az = Ag.get([2,0]);
                if (ax > 1 || ay > 1 || az > 1) { // only show large values for easier analysis
                    page.getViewById("rotationLabel").text = "Earth Axes Acceleration \n x: " + ax + "\ny: " + ay + "\nz: " + az;
                }
            }
        }
    }
});
I expect the device's accelerometer values to be translated to the Earth's coordinate system, as described in the rotation vector portion of this page:
https://developer.android.com/guide/topics/sensors/sensors_motion

The NativeScript-Accelerometer-Advanced plugin is coded incorrectly. Calculating the unit quaternion value q0 is not necessary, as the sensor already provides it (as the fourth value of the rotation vector). Dividing by gravity (9.81 m/s^2) is not necessary either, as that scaling is not part of the sensor values. I fixed both in my local copy and my rotation method is now working correctly (a plain-Java sketch of the same pipeline follows the steps below):
1. Get rotation vector
2. Calculate rotation matrix
3. Invert rotation matrix
4. Multiply inverted rotation matrix by accelerometer vector with [1.0] appended as the 4th row
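For reference, here is a minimal sketch of the same four steps in plain Android Java (the question itself uses NativeScript/MathJS; the class and field names below are my own). One caveat: SensorManager fills the rotation matrix in row-major order while android.opengl.Matrix expects column-major, so the flat array is transposed before multiplying; for a pure rotation the transpose equals the inverse, which is exactly step 3. The fourth vector component is 0 here because acceleration is a direction; with a pure rotation matrix (no translation), appending 1.0 as in step 4 yields the same x/y/z.

import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.opengl.Matrix;

public class EarthFrameListener implements SensorEventListener {
    private final float[] r = new float[16];    // rotation matrix from the sensor (row-major)
    private final float[] rinv = new float[16]; // its transpose (= inverse for a pure rotation)
    private final float[] earth = new float[4]; // acceleration in Earth coordinates

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR) {
            // Steps 1 and 2: rotation vector -> 4x4 rotation matrix (q0 comes from the sensor).
            SensorManager.getRotationMatrixFromVector(r, event.values);
            // Step 3: "invert"; for a pure rotation matrix, transpose == inverse.
            Matrix.transposeM(rinv, 0, r, 0);
        } else if (event.sensor.getType() == Sensor.TYPE_LINEAR_ACCELERATION) {
            // Step 4: rotate the device-frame acceleration into the Earth frame.
            float[] device = { event.values[0], event.values[1], event.values[2], 0f };
            Matrix.multiplyMV(earth, 0, rinv, 0, device, 0);
            // earth[0], earth[1], earth[2] are now the east/north/up components.
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}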

Related

opengl object vibrate after moving a distance

I have an object which moves on a terrain, and a third-person camera follows it. After I move it for some distance in different directions, it begins shaking or vibrating even when it is not moving, and the camera rotates around it. This is the movement code for the object:
double& delta = engine.getDeltaTime();
GLfloat velocity = delta * movementSpeed;
glm::vec3 t(glm::vec3(0, 0, 1) * (velocity * 3.0f));
// translate the object's matrix before rendering
matrix = glm::translate(matrix, t);
// get the forward vector of the matrix
glm::vec3 f(matrix[2][0], matrix[2][1], matrix[2][2]);
f = glm::normalize(f);
f = f * (velocity * 3.0f);
f = -f;
camera.translate(f);
and the camera rotation code is:
void Camera::rotate(GLfloat xoffset, GLfloat yoffset, glm::vec3& c, double& delta, GLboolean constrainpitch) {
    xoffset *= (delta * this->rotSpeed);
    yoffset *= (delta * this->rotSpeed);
    pitch += yoffset;
    yaw += xoffset;
    if (constrainpitch) {
        if (pitch >= maxPitch) {
            pitch = maxPitch;
            yoffset = 0;
        }
        if (pitch <= minPitch) {
            pitch = minPitch;
            yoffset = 0;
        }
    }
    glm::quat Qx(glm::angleAxis(glm::radians(yoffset), glm::vec3(1.0f, 0.0f, 0.0f)));
    glm::quat Qy(glm::angleAxis(glm::radians(xoffset), glm::vec3(0.0f, 1.0f, 0.0f)));
    glm::mat4 rotX = glm::mat4_cast(Qx);
    glm::mat4 rotY = glm::mat4_cast(Qy);
    view = glm::translate(view, c);
    view = rotX * view;
    view = view * rotY;
    view = glm::translate(view, -c);
}
float is sometimes not enough.
I use double-precision matrices on the CPU side to avoid such problems. But as you are on Android, that might not be possible. For the GPU, use floats again, as there are no 64-bit interpolators yet.
Big numbers are usually the problem
If your world is big, then you are passing big numbers into the equations, multiplying any errors, and only at the final stage is everything translated relative to the camera position, meaning the errors stay multiplied while the numbers get clamped, so the error-to-data ratio gets big.
To lower this problem, before rendering convert all vertices to a coordinate system with its origin at or near your camera. You can ignore rotations; just offset the positions.
This way you will get higher errors only far away from the camera, which with perspective is not visible anyway... For more info see:
ray and ellipsoid intersection accuracy improvement
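As a minimal sketch of that idea (in Java here, although the answer's context is C++/GLM; names are illustrative): keep world positions in double precision, subtract the camera position first, and only then narrow to float for the GPU.

// Convert a world-space position to a camera-relative one in double precision,
// then narrow to float: the large common offset cancels before any float rounding.
static float[] toCameraRelative(double[] worldPos, double[] cameraPos) {
    return new float[] {
        (float) (worldPos[0] - cameraPos[0]),
        (float) (worldPos[1] - cameraPos[1]),
        (float) (worldPos[2] - cameraPos[2])
    };
}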
Use a cumulative transform matrix instead of Euler angles (a small sketch of this follows)
For more info see Understanding 4x4 homogenous transform matrices and all the links at the bottom of that answer.
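A sketch of the cumulative-matrix idea (in Java, using android.opengl.Matrix for the 4x4 math; the per-frame delta would come from whatever small rotation you compute that frame): keep one persistent transform and fold small deltas into it, instead of rebuilding the pose from Euler angles every frame.

float[] model = new float[16];  // persistent cumulative transform
// once, at startup: android.opengl.Matrix.setIdentityM(model, 0);

// Fold one frame's small rotation into the cumulative transform.
void applyDelta(float[] deltaRotation) { // 4x4, column-major
    float[] result = new float[16];
    android.opengl.Matrix.multiplyMM(result, 0, model, 0, deltaRotation, 0);
    System.arraycopy(result, 0, model, 0, 16);
}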
This sounds like a numerical effect to me. Even small offsets coming from your game object will influence the rotation of the following camera; with small movements and rotations it looks like a vibrating object or camera.
So what you can do is:
Check if the movement is above a threshold value before calculating a new rotation for your camera
When you are above this threshold: do a linear interpolation between the old and the new rotation using the lerp algorithm for the quaternion (see this Unity answer to get a better understanding of how your code can look: Unity lerp discussion); a small sketch follows
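A minimal sketch of that interpolation step (in Java; the math is language-agnostic and mirrors glm::slerp, with quaternions stored as {x, y, z, w}):

// Spherical linear interpolation between unit quaternions a and b, t in [0, 1].
static float[] slerp(float[] a, float[] b, float t) {
    float dot = a[0]*b[0] + a[1]*b[1] + a[2]*b[2] + a[3]*b[3];
    float[] bb = b.clone();
    if (dot < 0f) {                 // take the shorter arc
        dot = -dot;
        for (int i = 0; i < 4; i++) bb[i] = -bb[i];
    }
    if (dot > 0.9995f) {            // nearly parallel: fall back to normalized lerp
        float[] q = new float[4];
        float norm = 0f;
        for (int i = 0; i < 4; i++) { q[i] = a[i] + t * (bb[i] - a[i]); norm += q[i] * q[i]; }
        norm = (float) Math.sqrt(norm);
        for (int i = 0; i < 4; i++) q[i] /= norm;
        return q;
    }
    float theta = (float) Math.acos(dot);
    float sinTheta = (float) Math.sin(theta);
    float wa = (float) (Math.sin((1 - t) * theta) / sinTheta);
    float wb = (float) (Math.sin(t * theta) / sinTheta);
    float[] q = new float[4];
    for (int i = 0; i < 4; i++) q[i] = wa * a[i] + wb * bb[i];
    return q;
}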

Get quaternion from Android gyroscope?

The official development documentation suggests the following way of obtaining the quaternion from the 3D rotation rate vector (wx, wy, wz).
// Create a constant to convert nanoseconds to seconds.
private static final float NS2S = 1.0f / 1000000000.0f;
// Your maximum allowable margin of error when normalizing (pick your own value).
private static final float EPSILON = 0.000000001f;
private final float[] deltaRotationVector = new float[4];
private float timestamp;

public void onSensorChanged(SensorEvent event) {
    // This timestep's delta rotation to be multiplied by the current rotation
    // after computing it from the gyro sample data.
    if (timestamp != 0) {
        final float dT = (event.timestamp - timestamp) * NS2S;
        // Axis of the rotation sample, not normalized yet.
        float axisX = event.values[0];
        float axisY = event.values[1];
        float axisZ = event.values[2];
        // Calculate the angular speed of the sample
        float omegaMagnitude = (float) Math.sqrt(axisX*axisX + axisY*axisY + axisZ*axisZ);
        // Normalize the rotation vector if it's big enough to get the axis
        // (that is, EPSILON should represent your maximum allowable margin of error)
        if (omegaMagnitude > EPSILON) {
            axisX /= omegaMagnitude;
            axisY /= omegaMagnitude;
            axisZ /= omegaMagnitude;
        }
        // Integrate around this axis with the angular speed by the timestep
        // in order to get a delta rotation from this sample over the timestep.
        // We will convert this axis-angle representation of the delta rotation
        // into a quaternion before turning it into the rotation matrix.
        float thetaOverTwo = omegaMagnitude * dT / 2.0f;
        float sinThetaOverTwo = (float) Math.sin(thetaOverTwo);
        float cosThetaOverTwo = (float) Math.cos(thetaOverTwo);
        deltaRotationVector[0] = sinThetaOverTwo * axisX;
        deltaRotationVector[1] = sinThetaOverTwo * axisY;
        deltaRotationVector[2] = sinThetaOverTwo * axisZ;
        deltaRotationVector[3] = cosThetaOverTwo;
    }
    timestamp = event.timestamp;
    float[] deltaRotationMatrix = new float[9];
    SensorManager.getRotationMatrixFromVector(deltaRotationMatrix, deltaRotationVector);
    // User code should concatenate the delta rotation we computed with the current rotation
    // in order to get the updated rotation.
    // rotationCurrent = rotationCurrent * deltaRotationMatrix;
}
My question is:
It is quite different from the acceleration case, where computing the resultant acceleration from the accelerations ALONG the three axes makes sense.
I am really confused why the resultant rotation rate can also be computed from the component rotation rates AROUND the three axes. It does not make sense to me.
Why would this method, finding the magnitude of the composite rotation rate, even work?
Since your title does not really match your questions, I'm trying to answer as much as I can.
Gyroscopes don't give an absolute orientation (as the ROTATION_VECTOR does) but only rotational velocities around the axes they are built to 'rotate' around. This is due to the design and construction of a gyroscope: picture a spinning disc suspended in a frame; due to the laws of physics it does not want to change its rotation, so when you rotate the frame, you can measure these rotations.
Now if you want to obtain something like the 'current rotational state' from the gyroscope, you will have to start with an initial rotation, call it q0, and constantly add those tiny little rotational differences that the gyroscope is measuring around the axes to it: q1 = q0 + gyro0, q2 = q1 + gyro1, ... (where '+' stands for composing rotations, not ordinary addition).
In other words: the gyroscope gives you the difference it has rotated around the three constructed axes, so you are not composing absolute values but small deltas.
Now this is very general and leaves a couple of questions unanswered:
Where do I get an initial position from? Answer: Have a look at the Rotation Vector Sensor - you can use the Quaternion obtained from there as an initialisation
How to 'sum' q and gyro?
Depending on the current representation of the rotation: if you use a rotation matrix, a simple matrix multiplication will do the job, as suggested in the comments (note that this matrix-multiplication implementation is not efficient!):
/**
 * Performs naive n^3 matrix multiplication and returns C = A * B
 *
 * @param A Matrix in the array form (e.g. 3x3 => 9 values)
 * @param B Matrix in the array form (e.g. 3x3 => 9 values)
 * @return A * B
 */
public float[] naivMatrixMultiply(float[] B, float[] A) {
    int mA, nA, mB, nB;
    mA = nA = (int) Math.sqrt(A.length);
    mB = nB = (int) Math.sqrt(B.length);
    if (nA != mB)
        throw new RuntimeException("Illegal matrix dimensions.");
    float[] C = new float[mA * nB];
    for (int i = 0; i < mA; i++)
        for (int j = 0; j < nB; j++)
            for (int k = 0; k < nA; k++)
                C[i + nA * j] += A[i + nA * k] * B[k + nB * j];
    return C;
}
To use this method, imagine that mRotationMatrix holds the current state; these lines do the job:
SensorManager.getRotationMatrixFromVector(deltaRotationMatrix, deltaRotationVector);
mRotationMatrix = naivMatrixMultiply(mRotationMatrix, deltaRotationMatrix);
// Apply rotation matrix in OpenGL
gl.glMultMatrixf(mRotationMatrix, 0);
If you choose to use quaternions, imagine again that mQuaternion contains the current state:
// Perform Quaternion multiplication
mQuaternion.multiplyByQuat(deltaRotationVector);
// Apply Quaternion in OpenGL
gl.glRotatef((float) (2.0f * Math.acos(mQuaternion.getW()) * 180.0f / Math.PI),mQuaternion.getX(),mQuaternion.getY(), mQuaternion.getZ());
Quaternion multiplication is described here, in equation (23). Make sure you apply the multiplication correctly, since it is not commutative!
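For reference, a sketch of that product in Java, with components ordered (x, y, z, w) to match deltaRotationVector above; this is the standard Hamilton product, not the actual multiplyByQuat implementation referenced here:

// Hamilton product q = q1 * q2 for quaternions stored as {x, y, z, w}.
// Note the order: rotating by q1 then q2 composes differently than q2 then q1.
static float[] multiplyQuat(float[] q1, float[] q2) {
    float x1 = q1[0], y1 = q1[1], z1 = q1[2], w1 = q1[3];
    float x2 = q2[0], y2 = q2[1], z2 = q2[2], w2 = q2[3];
    return new float[] {
        w1*x2 + x1*w2 + y1*z2 - z1*y2,  // x
        w1*y2 - x1*z2 + y1*w2 + z1*x2,  // y
        w1*z2 + x1*y2 - y1*x2 + z1*w2,  // z
        w1*w2 - x1*x2 - y1*y2 - z1*z2   // w
    };
}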
If you simply want to know the rotation of your device (and I assume this is what you ultimately want), I strongly recommend the ROTATION_VECTOR sensor. On the other hand, gyroscopes are quite precise for measuring rotational velocity and have a very good dynamic response, but they suffer from drift and don't give you an absolute orientation (relative to magnetic north or to gravity).
UPDATE: If you want to see a full example, you can download the source-code for a simple demo-app from https://bitbucket.org/apacha/sensor-fusion-demo.
Makes sense to me. Acceleration sensors typically work by having some measurable quantity change when force is applied along the axis being measured. E.g. if gravity is pulling down on the sensor measuring that axis, it conducts electricity better. So now you can tell how hard gravity, or acceleration in some direction, is pulling. Easy.
Meanwhile, gyros are things that spin (OK, or bounce back and forth in a straight line like a tweaked diving board). The gyro is spinning; now you spin, and the gyro is going to look like it is spinning faster or slower depending on the direction you spun. Or if you try to move it, it will resist and try to keep going the way it is going. So you just get a rotation change out of measuring it. Then you have to figure out the motion from the change by integrating all the changes over time.
Typically none of these things is a single sensor, either. They are often three different sensors, all arranged perpendicular to each other and each measuring a different axis. Sometimes all the sensors are on the same chip, but they are still different things on the chip, measured separately.

Using a rotation matrix to convert euler units to 360 degrees

In Android, I am using the accelerometer and magnetic field sensor to calculate spatial positioning, as shown in the code below. The getRotationMatrix method generates values that are in real-world units, with azimuth, pitch and roll. Azimuth and roll give values in the range of 0 to 180 or 0 to -180. Pitch, however, gives values from 0 to 90 or 0 to -90. That's a problem for my app because I need to determine unique locations regardless of how the device is oriented; with roll, you can have two locations with the same value.
I need to apply a matrix transformation that remaps the sensor values to values that range from 0 to 360 degrees (actually, 360 wouldn't be valid, since it's the same as 0, and anything close to 360 would result in a number like 359.99999...).
I am not a mathematician and don't know how to use matrices, let alone use them in Android, but I am aware that this is what is required to get the 0-to-360-degree conversion. It would be nice if the matrix also took care of the azimuth and roll so that they too produce values from 0 to 360, but if that isn't possible, that's fine, since unique positions can still be derived from their sensor values. Any suggestions on how I create this matrix transformation?
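(For the plain range remap described above, no matrix is needed; a one-line helper with an illustrative name maps a value in (-180, 180] to [0, 360). It does not fix the pitch degeneracy noted in the edit below.)

// Maps e.g. -90 to 270; results stay in [0, 360).
static float to360(float degrees) {
    return (degrees % 360f + 360f) % 360f;
}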
@Override
public void onSensorChanged(SensorEvent event)
{
    try
    {
        if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER)
            accelerometerValues = event.values;
        else if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD)
            magneticFieldValues = event.values;

        if ((accelerometerValues != null) && (magneticFieldValues != null))
        {
            float[] R = new float[9];
            float[] I = new float[9];
            boolean success = SensorManager.getRotationMatrix(R, I, accelerometerValues, magneticFieldValues);

            if (success)
            {
                float[] values = new float[3];
                SensorManager.getOrientation(R, values);

                // Convert from radians to degrees if preferred.
                values[0] = (float) Math.toDegrees(values[0]); // Azimuth
                values[1] = (float) Math.toDegrees(values[1]); // Pitch
                values[2] = (float) Math.toDegrees(values[2]); // Roll
            }
        }
    }
    catch (Exception ex)
    {
    }
}
EDIT:
The raw event values for the pitch do not give you unique values as you rotate the device 360 degrees, so I highly doubt any matrix transformation is going to produce the results I am after. Maybe I am using the wrong sensors.

Android: Problems calculating the Orientation of the Device

I'm trying to build a simple augmented reality app, so I started working with sensor data.
According to this thread (Android compass example) and this example (http://www.codingforandroid.com/2011/01/using-orientation-sensors-simple.html), the calculation of the orientation using Sensor.TYPE_ACCELEROMETER and Sensor.TYPE_MAGNETIC_FIELD doesn't really fit.
So I'm not able to get "good" values. The azimuth values don't make any sense at all: if I just tilt the phone upward, the value changes extremely. Even if I just rotate the phone, the values don't represent the phone's orientation.
Does anybody have an idea how to improve the value quality according to the given example?
In what kind of orientation do you use this sample app? From what is written in this code, the only orientation supported is portrait or flat on the table; it depends on the device. What do you mean by "good"?
It is normal that the values are not "good" when rotating the device: the device coordinate system is supposed to work in portrait or flat (the Y axis vertical along the screen pointing up, the Z axis pointing out of the screen from the center of the screen, the X axis perpendicular to the Y axis, going to the right along the screen). Given this, rotating the device will not rotate the device coordinate system; you'll have to remap it.
But if you want the heading of the device in portrait orientation, here is a piece of code that works well for me:
@Override
public void onSensorChanged(SensorEvent event)
{
    // It is good practice to check that we received the proper sensor event
    if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR)
    {
        // Convert the rotation-vector to a 4x4 matrix.
        SensorManager.getRotationMatrixFromVector(mRotationMatrix, event.values);
        SensorManager.remapCoordinateSystem(mRotationMatrix,
                SensorManager.AXIS_X, SensorManager.AXIS_Z,
                mRotationMatrix);
        SensorManager.getOrientation(mRotationMatrix, orientationVals);

        // Optionally convert the result from radians to degrees
        orientationVals[0] = (float) Math.toDegrees(orientationVals[0]);
        orientationVals[1] = (float) Math.toDegrees(orientationVals[1]);
        orientationVals[2] = (float) Math.toDegrees(orientationVals[2]);

        tv.setText(" Yaw: " + orientationVals[0] + "\n Pitch: "
                + orientationVals[1] + "\n Roll (not used): "
                + orientationVals[2]);
    }
}
You'll get the heading (or azimuth) in:
orientationVals[0]
The answer from Tíbó is good, but if you log the roll value, you will see irregular numbers.
(Roll is important for AR browsers.)
This is due to:
SensorManager.remapCoordinateSystem(mRotationMatrix,
SensorManager.AXIS_X, SensorManager.AXIS_Z,
mRotationMatrix);
You have to use different matrices for the input and output of the remap. The following code works for me, with a correct roll value:
@Override
public void onSensorChanged(SensorEvent event)
{
    // It is good practice to check that we received the proper sensor event
    if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR)
    {
        // Convert the rotation-vector to a 4x4 matrix.
        SensorManager.getRotationMatrixFromVector(mRotationMatrixFromVector, event.values);
        SensorManager.remapCoordinateSystem(mRotationMatrixFromVector,
                SensorManager.AXIS_X, SensorManager.AXIS_Z,
                mRotationMatrix);
        SensorManager.getOrientation(mRotationMatrix, orientationVals);

        // Optionally convert the result from radians to degrees
        orientationVals[0] = (float) Math.toDegrees(orientationVals[0]);
        orientationVals[1] = (float) Math.toDegrees(orientationVals[1]);
        orientationVals[2] = (float) Math.toDegrees(orientationVals[2]);

        tv.setText(" Yaw: " + orientationVals[0] + "\n Pitch: "
                + orientationVals[1] + "\n Roll (not used): "
                + orientationVals[2]);
    }
}
Probably late to the party. Anyway, here is how I got the azimuth:
private final int sensorType = Sensor.TYPE_ROTATION_VECTOR;
float[] rotMat = new float[9];
float[] vals = new float[3];

@Override
public void onSensorChanged(SensorEvent event) {
    sensorHasChanged = false;
    if (event.sensor.getType() == sensorType) {
        SensorManager.getRotationMatrixFromVector(rotMat, event.values);
        SensorManager.remapCoordinateSystem(rotMat,
                SensorManager.AXIS_X, SensorManager.AXIS_Y,
                rotMat);
        SensorManager.getOrientation(rotMat, vals);
        azimuth = deg(vals[0]); // in degrees [-180, +180]
        pitch = deg(vals[1]);
        roll = deg(vals[2]);
        sensorHasChanged = true;
    }
}
Hope it helps
Have you tried the combined (sensor-fusion) type Sensor.TYPE_ROTATION_VECTOR? This may give better results:
Go to https://developer.android.com/reference/android/hardware/SensorEvent.html and search for 'rotation_vector'.
Here's a Kotlin approach with all the necessary matrices included (for some reason the previous answers leave out the array sizes, which matter):

// This is determined from the deprecated Sensor.TYPE_ORIENTATION
var lastOrientation: FloatArray = FloatArray(3)
var lastHeading: Float = 0f
var currentHeading: Float = 0f

// This is from the non-deprecated Sensor.TYPE_ROTATION_VECTOR
var lastVectorOrientation: FloatArray = FloatArray(5)
var lastVectorHeading: Float = 0f
var currentVectorHeading: Float = 0f

override fun onSensorChanged(event: SensorEvent) {
    when (event.sensor?.type) {
        null -> return
        Sensor.TYPE_ORIENTATION -> {
            lastOrientation = event.values
            lastHeading = currentHeading
            currentHeading = abs(event.values[0].roundToInt().toFloat())
        }
        Sensor.TYPE_ROTATION_VECTOR -> {
            lastVectorOrientation = event.values
            lastVectorHeading = currentVectorHeading
            val tempRotationMatrix = FloatArray(9)
            val tempOrientationMatrix = FloatArray(3)
            getRotationMatrixFromVector(tempRotationMatrix, event.values)
            remapCoordinateSystem(tempRotationMatrix, AXIS_X, AXIS_Z, tempRotationMatrix)
            getOrientation(tempRotationMatrix, tempOrientationMatrix)
            currentVectorHeading = Math.toDegrees(tempOrientationMatrix[0].toDouble()).toFloat()
            if (currentVectorHeading < 0) {
                // heading = 360 - abs(negative heading), which is really 360 + (-heading)
                currentVectorHeading += 360f
            }
        }
        else -> return
    }
}
I've also included the deprecated Sensor.TYPE_ORIENTATION for anybody wanting to see the difference between the two approaches. There is a several degree difference when using the deprecated method vs the updated approach.

Can I rotate 3D car object

Can I use a rotating animation without using any OpenGL or any 3rd-party tool? I just want to apply a clockwise rotation to a 3D car object in a fixed layout.
Unless you have some sort of sprite animation sequence that emulates a 3D car, I don't see how you can do it, to be honest. I might be wrong, though, and may have missed something in Android, but to me this is a classic OpenGL situation.
By sprite animation I also mean any sort of 2D sequence of images, for example animated GIFs.
1) Well, you first have to set the car in 3D space.
To model a 3D object, you must define all the vertices of the car, as if you were modeling with points; after linking these points you get the car in wireframe.
Point A {x = 3, y = 10, z = 8}
Point B {x = 5, y = 12, z = 2}
Point C {x = 6, y = 40, z = 6}
Point D {x = 7, y = 12, z = 3}
Point E {x = 3, y = 10, z = 8}
...
2) After modeling the car as points, you must define the links between points to form lines; this is called wireframe modeling. Modeling surfaces is different, but the wireframe already gives a good idea of the object in the real world.
Line AB = Point A -> Point B
Line BC = Point B -> Point C
3) To spin the car about itself, you should position it at the center of coordinates, applying to each point a translation T whose components equal the distance from the object's center to the origin of the axes.
New point x:
x = x - Tx
New point y:
y = y - Ty
New point z:
z = z - Tz
4) With the car in position, to spin it by an angle "g" you should apply to each point the rotation transformation; for a rotation around the x-axis the formulas are:
Find the new point x:
xt = (x * 1) + (y * 0) + (z * 0);
Find the new point y:
yt = (x * 0) + (y * Math.cos(g)) + (z * (-Math.sin(g)));
Find the new point z:
zt = (x * 0) + (y * Math.sin(g)) + (z * Math.cos(g));
5) After applying the rotation, you must return the object to its point of origin, performing the reverse of the translation in step 3.
New point x:
x = x + Tx
New point y:
y = y + Ty
New point z:
z = z + Tz
This is a rough explanation, but it is the most basic way; of course there are more complex formulas that perform these transformations faster, but I put it this way to be more didactic.
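Putting steps 3 to 5 together, a compact sketch (in Java; the {x, y, z} array layout and method name are my own):

// Rotate point p by angle g (radians) around an x-axis through 'center':
// translate to the origin, rotate, translate back.
static double[] rotatePoint(double[] p, double[] center, double g) {
    // Step 3: move to the origin.
    double x = p[0] - center[0];
    double y = p[1] - center[1];
    double z = p[2] - center[2];
    // Step 4: rotate around the x-axis.
    double yt = y * Math.cos(g) - z * Math.sin(g);
    double zt = y * Math.sin(g) + z * Math.cos(g);
    // Step 5: move back.
    return new double[] { x + center[0], yt + center[1], zt + center[2] };
}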
