I am using an adapted version of Android's getRotationMatrix in a C++ program that reads the phone's sensor data over the network and calculates the device's orientation matrix.
The function works fine and calculates the device's orientation. Unfortunately, Ogre3D uses a different axis system than the device. So even though rotation about the x-axis works fine, the y and z axes are wrong. Holding the device level and pointing north gives the identity matrix. When I pitch, the rotation is correct, but when I roll and yaw the rotations are swapped: roll is yaw in Ogre3D and vice versa.
(Ogre3d)                      (Device)

^ +y-axis                     ^ +z-axis
*                             *
*                             *
*                             *       ^ +y-axis
*                             *      *
*                             *     *
*                             *    *
************> +x-axis         ************> +x-axis
*
*
v +z-axis
A quick look at the two axis systems suggests that Ogre's system (on the left) is essentially the device's system rotated 90 degrees counter-clockwise about the x-axis.
I tried to experiment with various combinations when I first assign the sensor values, before the matrix is calculated, but no combination seems to work correctly. How would I make sure that the rotation matrix getRotationMatrix() produces displays correctly in Ogre3D?
For reference, here is the function that calculates the matrix:
bool getRotationMatrix() {
    // Sensor data coming through the network are stored in
    // accel (accelerometer) and mag (geomagnetic) vars
    // which the function has access to.
    float Ax = accel[0]; float Ay = accel[1]; float Az = accel[2];
    float Ex = mag[0]; float Ey = mag[1]; float Ez = mag[2];

    // H = E x A (cross product)
    float Hx = Ey * Az - Ez * Ay;
    float Hy = Ez * Ax - Ex * Az;
    float Hz = Ex * Ay - Ey * Ax;

    float normH = (float) Math::Sqrt(Hx * Hx + Hy * Hy + Hz * Hz);
    if (normH < 0.1f) {
        // Device is close to free fall (or in space?), or close to
        // the magnetic north pole. Typical values are > 100.
        return false;
    }
    float invH = 1.0f / normH;
    Hx *= invH;
    Hy *= invH;
    Hz *= invH;

    float invA = 1.0f / (float) Math::Sqrt(Ax * Ax + Ay * Ay + Az * Az);
    Ax *= invA;
    Ay *= invA;
    Az *= invA;

    // M = A x H (cross product)
    float Mx = Ay * Hz - Az * Hy;
    float My = Az * Hx - Ax * Hz;
    float Mz = Ax * Hy - Ay * Hx;

    // Ogre3D's Matrix3 is column-major whereas getRotationMatrix produces
    // a row-major matrix, thus I have transposed it here.
    orientation[0][0] = Hx; orientation[0][1] = Mx; orientation[0][2] = Ax;
    orientation[1][0] = Hy; orientation[1][1] = My; orientation[1][2] = Ay;
    orientation[2][0] = Hz; orientation[2][1] = Mz; orientation[2][2] = Az;
    return true;
}
Why not just add the one additional rotation you've already identified before you use it in Ogre?
I found the problem. In my function, I was putting the unit vectors calculated from the cross products into columns, whereas I should have been putting them into the rows of their appointed Matrix3 cells as usual. Something about row-major versus column-major confused me, even though I was referring to the elements with 2D [][] indexing.
Multiplying the outcome of the matrix calculation function with this matrix:
1 0 0
0 0 1
0 -1 0
Then pitching the whole result by another π/2 about the x-axis solved the remap problem, but I fear my geometry is inverted.
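For illustration, a minimal Java-style sketch of that change of basis (my own code, not from the original post), treating orientation as a plain 3x3 array; whether the conversion matrix belongs on the left or the right depends on your row/column-major convention:
float[][] c = { { 1, 0, 0 },
                { 0, 0, 1 },
                { 0, -1, 0 } };   // rotates the device frame 90 degrees about x
float[][] remapped = new float[3][3];
for (int i = 0; i < 3; i++)
    for (int j = 0; j < 3; j++)
        for (int k = 0; k < 3; k++)
            remapped[i][j] += c[i][k] * orientation[k][j];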
I don't know much about matrix rotation, but if the systems rotate like you are showing, I think that you should do the following:
X Axis stays the same way, so:
float Ax = accel[0];
float Ex = mag[0];
Y axis in (Ogre3d) is the Z axis in (Device), so:
float Ay = accel[2];
float Ey = mag[2];
Z axis in (Ogre3d) is the opposite of the Y axis in (Device), so:
float Az = accel[1] * (-1);
float Ez = mag[1] * (-1);
Try that
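Putting those three assignments together (a sketch, using the same accel/mag arrays as the question):
// Remap device axes into Ogre's frame before building the matrix:
// Ogre x = device x, Ogre y = device z, Ogre z = -device y.
float Ax = accel[0], Ay = accel[2], Az = -accel[1];
float Ex = mag[0],   Ey = mag[2],   Ez = -mag[1];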
Related
I need to know whether the device is near the ear or not by using sensors.
I tried using the proximity sensor; I want to combine the accelerometer and gyroscope sensors to find exactly whether the device is near or far from the ear.
Code for the proximity sensor:
@Override
public void onSensorChanged(SensorEvent event) {
    float distance = event.values[0];
    if (event.sensor.getType() == Sensor.TYPE_PROXIMITY) {
        if (distance < mProximity.getMaximumRange()) {
            iv.setText("Near");
        } else {
            iv.setText("far");
        }
    }
}
This is what I got from the Android documentation. I am sure you can dig more to get some answers to your problem, but this should be enough to get you started. You can also do some research about position sensors in Android; the documentation is quite useful.
// Create a constant to convert nanoseconds to seconds.
private static final float NS2S = 1.0f / 1000000000.0f;
// Threshold below which a gyro sample is treated as noise (tune as needed).
private static final float EPSILON = 0.000000001f;
private final float[] deltaRotationVector = new float[4];
private float timestamp;

public void onSensorChanged(SensorEvent event) {
    // This timestep's delta rotation to be multiplied by the current rotation
    // after computing it from the gyro sample data.
    if (timestamp != 0) {
        final float dT = (event.timestamp - timestamp) * NS2S;
        // Axis of the rotation sample, not normalized yet.
        float axisX = event.values[0];
        float axisY = event.values[1];
        float axisZ = event.values[2];

        // Calculate the angular speed of the sample.
        float omegaMagnitude = (float) Math.sqrt(axisX * axisX + axisY * axisY + axisZ * axisZ);

        // Normalize the rotation vector if it's big enough to get the axis
        // (that is, EPSILON should represent your maximum allowable margin of error).
        if (omegaMagnitude > EPSILON) {
            axisX /= omegaMagnitude;
            axisY /= omegaMagnitude;
            axisZ /= omegaMagnitude;
        }

        // Integrate around this axis with the angular speed by the timestep
        // in order to get a delta rotation from this sample over the timestep.
        // We will convert this axis-angle representation of the delta rotation
        // into a quaternion before turning it into the rotation matrix.
        float thetaOverTwo = omegaMagnitude * dT / 2.0f;
        float sinThetaOverTwo = (float) Math.sin(thetaOverTwo);
        float cosThetaOverTwo = (float) Math.cos(thetaOverTwo);
        deltaRotationVector[0] = sinThetaOverTwo * axisX;
        deltaRotationVector[1] = sinThetaOverTwo * axisY;
        deltaRotationVector[2] = sinThetaOverTwo * axisZ;
        deltaRotationVector[3] = cosThetaOverTwo;
    }
    timestamp = event.timestamp;
    float[] deltaRotationMatrix = new float[9];
    SensorManager.getRotationMatrixFromVector(deltaRotationMatrix, deltaRotationVector);
    // User code should concatenate the delta rotation we computed with the current rotation
    // in order to get the updated rotation.
    // rotationCurrent = rotationCurrent * deltaRotationMatrix;
}
I have some data from a sensor device, i.e., accelerometer x, y, z and a quaternion. Using this information I would like to render a "line image" in an Android OpenGL view. Can someone help me with the part on how to convert the acceleration values to something that can be used by OpenGL's glTranslatef and glRotatef functions?
You shouldn't be using the deprecated glTranslate or glRotate; instead, just manage the correct transformation matrices directly.
Converting a quaternion to a matrix can be done with the following (sourced from my previous code):
float[] quatToMat(Quaternion q)
{
    // Based on the algorithm on Wikipedia:
    // http://en.wikipedia.org/wiki/Rotation_matrix#Quaternion
    float w = q.scalar();
    float x = q.x();
    float y = q.y();
    float z = q.z();
    float n = x*x + y*y + z*z + w*w;
    float s = n == 0 ? 0 : 2 / n;
    float wx = s * w * x, wy = s * w * y, wz = s * w * z;
    float xx = s * x * x, xy = s * x * y, xz = s * x * z;
    float yy = s * y * y, yz = s * y * z, zz = s * z * z;
    return new float[]{ 1 - (yy + zz), xy + wz,       xz - wy,       0,
                        xy - wz,       1 - (xx + zz), yz + wx,       0,
                        xz + wy,       yz - wx,       1 - (xx + yy), 0,
                        0,             0,             0,             1 };
}
If you still want to use the fixed-function pipeline, then push that matrix into glMultMatrix.
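For example, a hypothetical usage sketch inside onDrawFrame(GL10 gl); the Quaternion q and the position px/py/pz are assumptions (the position would come from your own integration of the accelerometer, which drifts badly). The array layout above is what glMultMatrixf expects when read column-major:
float[] m = quatToMat(q);
gl.glMatrixMode(GL10.GL_MODELVIEW);
gl.glLoadIdentity();
gl.glTranslatef(px, py, pz);   // position part of the transform
gl.glMultMatrixf(m, 0);        // orientation from the quaternion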
Suppose I have my current orientation as (azimuth, pitch, roll). Now I wish to update my orientation with the gyroscope. According to the code given on the Android developer site, I can obtain the so-called deltaRotationMatrix as follows:
(The code is the same gyroscope integration snippet from the Android documentation quoted in the answer above.)
How should I proceed with this snippet so as to update my orientation?
You just need to multiply the deltaRotationMatrix by the currentRotationMatrix and then make a call to SensorManager.getOrientation(). You will need to implement a matrix multiplication method (a sketch follows the snippet below). You will also need an initial currentRotationMatrix; you can use the acceleration sensor and magnetic sensor with SensorManager.getRotationMatrix and SensorManager.getOrientation to get the initial currentRotationMatrix. Alternatively, you could use TYPE_ROTATION_VECTOR to get the initial currentRotationMatrix.
currentRotationMatrix = matrixMultiplication(
        currentRotationMatrix,
        deltaRotationMatrix);
SensorManager.getOrientation(currentRotationMatrix,
        gyroscopeOrientation);
Unfortunately, what you will realize is that even the TYPE_GYROSCOPE sensor, which is supposed to be calibrated for drift, doesn't do a very good job, and the sensor quickly drifts out of rotation with the device. Frustrating.
I have a GitHub repo with all of this implemented here
And a working project on the Play Store here
I'm implementing cubic Bézier curve logic in one of my Android applications.
I've implemented the cubic Bézier curve code on a canvas in onDraw of a custom view.
// Path to draw cubic bezier curve
Path cubePath = new Path();
// Move to startPoint(200,200) (P0)
cubePath.moveTo(200,200);
// Cubic to with ControlPoint1(200,100) (C1), ControlPoint2(300,100) (C2) , EndPoint(300,200) (P1)
cubePath.cubicTo(200,100,300,100,300,200);
// Draw on Canvas
canvas.drawPath(cubePath, paint);
I have visualized the above code in the following image.
[Updated]
Logic for selecting the first control points (a code sketch follows the list below). I've taken:
baseX = 200, baseY = 200, and curve_size = X of End Point - X of Start Point
Start Point : x = baseX and y = baseY
Control Point 1 : x = baseX and y = baseY - curve_size
Control Point 2 : x = baseX + curve_size and y = baseY - curve_size
End Point : x = baseX + curve_size and y = baseY
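A minimal sketch of that logic with the values from the question (the variable names are mine):
// Control-point logic from the list above, parameterized.
float baseX = 200f, baseY = 200f;
float curveSize = 300f - baseX;                 // X of end point - X of start point
Path cubePath = new Path();
cubePath.moveTo(baseX, baseY);                  // start point
cubePath.cubicTo(baseX, baseY - curveSize,      // control point 1
        baseX + curveSize, baseY - curveSize,   // control point 2
        baseX + curveSize, baseY);              // end point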
I want to allow the user to change the End Point of the above curve and, based on the new End Point, invalidate the canvas.
But the problem is that the curve is shaped by the two control points, which need to be recalculated when the End Point changes.
For example, I just want to find the new Control Points when the End Point changes from (300,200) to (250,250),
as in the following image:
Please help me calculate the two new Control Points based on the new End Point, so that the curve keeps the same shape it had with the previous end point.
I referred to the following links while searching:
http://pomax.github.io/bezierinfo/
http://jsfiddle.net/hitesh24by365/jHbVE/3/
http://en.wikipedia.org/wiki/B%C3%A9zier_curve
http://cubic-bezier.com/
Any reference link is also appreciated in an answer to this question.
Changing the endpoint means two things: a rotation about p0 and a scaling factor.
The scaling factor (let's call it s) is len(p2 - p0) / len(p1 - p0).
For the rotation factor (let's call it r) I defer you to Calculating the angle between three points in android, which also gives a platform-specific implementation. You can check correctness by scaling/rotating p1 in relation to p0: you should get p2 as a result.
Next, apply the scaling and rotation with respect to p0 to c1 and c2. For convenience I will call the new c1 'd1' and the new c2 'd2'.
d1 = rot(c1 - p0, r) * s + p0
d2 = rot(c2 - p0, r) * s + p0
To define some pseudocode for rot() (rotation, http://en.wikipedia.org/wiki/Rotation_%28mathematics%29):
rot(point p, double angle){
    point q;
    q.x = p.x * cos(angle) - p.y * sin(angle);
    q.y = p.x * sin(angle) + p.y * cos(angle);
    return q;
}
Your Bézier curve is now scaled and rotated in relation to p0, with p1 moved to p2.
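In concrete terms, a small Java sketch of d1 (my own code; r is the rotation in radians and s the scale factor from above):
// Map the old control point c1 to the new curve by rotating and
// scaling its offset from the start point p0.
float vx = c1.x - p0.x;
float vy = c1.y - p0.y;
PointF d1 = new PointF(
        (float) (vx * Math.cos(r) - vy * Math.sin(r)) * s + p0.x,
        (float) (vx * Math.sin(r) + vy * Math.cos(r)) * s + p0.y);
// d2 is obtained the same way from c2.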
Firstly, I would ask you to look into the following articles:
Bezier Curves
Why B-Spline Curve
B-Spline Curve Summary
What you are trying to implement is a piecewise composite Bézier curve. From the Summary page, for n control points (including start/end) you get (n - 1)/3 piecewise Bézier curves.
The control points shape the curve literally. If you don't give proper control points along with the new point, you will not be able to create a smoothly connected Bézier curve. Generating them will not work, as it is too complex and there is no universally accepted way.
If you don't have, or don't want to give, extra control points, you should use a Catmull-Rom spline, which passes through all control points and is C1 continuous (the derivative is continuous at every point on the curve).
Links for Catmull Rom Spline in java/android :
http://hawkesy.blogspot.in/2010/05/catmull-rom-spline-curve-implementation.html
https://github.com/Dongseob-Park/catmull-rom-spline-curve-android
catmull-rom splines for Android (similar to your question)
The bottom line is: if you don't have the control points, don't use a cubic Bézier curve. Generating them is the problem, not the solution.
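If it helps, a uniform Catmull-Rom segment can itself be drawn with Path.cubicTo, since its Bézier control points have a closed form. A small sketch (my own code, not from the links above) for the segment from p1 to p2, given its neighbours p0 and p3:
// Derive the cubic Bézier control points of the Catmull-Rom segment
// p1 -> p2 (uniform parameterization), then draw it with the Path API.
PointF c1 = new PointF(p1.x + (p2.x - p0.x) / 6f, p1.y + (p2.y - p0.y) / 6f);
PointF c2 = new PointF(p2.x - (p3.x - p1.x) / 6f, p2.y - (p3.y - p1.y) / 6f);
path.cubicTo(c1.x, c1.y, c2.x, c2.y, p2.x, p2.y);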
It seems that you are rotating and scaling a square where you know the bottom two points and need to calculate the other two. The two known points form two triangles with the other two, so we just need to find the third point of each triangle. Suppose the end point is (x1, y1):
PointF c1 = calculateTriangle(x0, y0, x1, y1, true);  // find left third point
PointF c2 = calculateTriangle(x0, y0, x1, y1, false); // find right third point

cubePath.reset();
cubePath.moveTo(x0, y0);
cubePath.cubicTo(c1.x, c1.y, c2.x, c2.y, x1, y1);

private PointF calculateTriangle(float x1, float y1, float x2, float y2, boolean left) {
    PointF result = new PointF(0, 0);
    float dy = y2 - y1;
    float dx = x2 - x1;
    float dangle = (float) (Math.atan2(dy, dx) - Math.PI / 2f);
    float sideDist = (float) Math.sqrt(dx * dx + dy * dy); // square
    if (left) {
        result.x = (float) (Math.cos(dangle) * sideDist + x1);
        result.y = (float) (Math.sin(dangle) * sideDist + y1);
    } else {
        result.x = (float) (Math.cos(dangle) * sideDist + x2);
        result.y = (float) (Math.sin(dangle) * sideDist + y2);
    }
    return result;
}
...
There is another way to do this, where it does not matter how many points you have between the first and the last point in the path, or even what its shape is.
// Find scale
float oldDist = (float) Math.sqrt((x1 - x0) * (x1 - x0) + (y1 - y0) * (y1 - y0));
float newDist = (float) Math.sqrt((x2 - x0) * (x2 - x0) + (y2 - y0) * (y2 - y0));
float scale = newDist / oldDist;

// Find angle (atan2 is in radians; Matrix.postRotate expects degrees)
float oldAngle = (float) Math.toDegrees(Math.atan2(y1 - y0, x1 - x0) - Math.PI / 2f);
float newAngle = (float) Math.toDegrees(Math.atan2(y2 - y0, x2 - x0) - Math.PI / 2f);
float angle = newAngle - oldAngle;

// Set up the transform about the fixed start point (x0, y0)
Matrix matrix = new Matrix();
matrix.postScale(scale, scale, x0, y0);
matrix.postRotate(angle, x0, y0);

// Transform the path
cubePath.transform(matrix);
A small variant on the suggestion by Lumis:
// Find scale
float oldDist = (float) Math.sqrt((x1 - x0) * (x1 - x0) + (y1 - y0) * (y1 - y0));
float newDist = (float) Math.sqrt((x2 - x0) * (x2 - x0) + (y2 - y0) * (y2 - y0));
float scale = newDist / oldDist;

// Find angle (converted to degrees for Matrix.postRotate)
float oldAngle = (float) Math.toDegrees(Math.atan2(y1 - y0, x1 - x0));
float newAngle = (float) Math.toDegrees(Math.atan2(y2 - y0, x2 - x0));
float angle = newAngle - oldAngle;

Matrix matrix = new Matrix();
matrix.postScale(scale, scale);
matrix.postRotate(angle);

// Map only the control points as vectors (translation is ignored)
float[] p = { c1.x, c1.y, c2.x, c2.y };
matrix.mapVectors(p);
PointF newC1 = new PointF(p[0], p[1]);
PointF newC2 = new PointF(p[2], p[3]);
I tried to use the Z axis data from SensorEvent.values, but it doesn't detect rotation of my phone in the XY plane, i.e. around the Z axis.
I am using this as a reference for the coordinate axes. Is it correct?
How do I measure that motion using accelerometer values?
These games do something similar: Extreme Skater, Doodle Jump.
PS: my phone orientation will be landscape.
Essentially, there are two cases here: the device is lying flat, or it is not flat. Flat here means the angle between the surface of the device screen and the world xy plane (I call it the inclination) is less than 25 degrees or larger than 155 degrees. Think of the phone lying flat or tilted up just a little bit from a table.
First you need to normalize the accelerometer vector.
That is, if g is the vector returned by the accelerometer sensor event values. In code:
float[] g = new float[3];
g = event.values.clone();

double norm_Of_g = Math.sqrt(g[0] * g[0] + g[1] * g[1] + g[2] * g[2]);

// Normalize the accelerometer vector
g[0] = (float) (g[0] / norm_Of_g);
g[1] = (float) (g[1] / norm_Of_g);
g[2] = (float) (g[2] / norm_Of_g);
Then the inclination can be calculated as
int inclination = (int) Math.round(Math.toDegrees(Math.acos(g[2])));
Thus
if (inclination < 25 || inclination > 155)
{
    // device is flat
}
else
{
    // device is not flat
}
For the case of lying flat, you have to use a compass to see how much the device has rotated from the starting position.
For the case of not flat, the rotation (tilt) is calculated as follows:
int rotation = (int) Math.round(Math.toDegrees(Math.atan2(g[0], g[1])));
Now rotation == 0 means the device is in its normal position: portrait without any tilt for most phones, and probably landscape for tablets. So if you hold a phone as in your picture above and start rotating, the rotation will change, and when the phone is in landscape the rotation will be 90 or -90, depending on the direction of rotation.
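Putting the pieces together, a minimal onSensorChanged sketch (my own assembly of the fragments above, assuming TYPE_ACCELEROMETER events):
@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() != Sensor.TYPE_ACCELEROMETER) return;

    float[] g = event.values.clone();
    double norm = Math.sqrt(g[0] * g[0] + g[1] * g[1] + g[2] * g[2]);
    g[0] = (float) (g[0] / norm);
    g[1] = (float) (g[1] / norm);
    g[2] = (float) (g[2] / norm);

    int inclination = (int) Math.round(Math.toDegrees(Math.acos(g[2])));
    if (inclination < 25 || inclination > 155) {
        // Flat: fall back to the compass for rotation around the z axis.
    } else {
        // Not flat: tilt relative to the normal (portrait) position.
        int rotation = (int) Math.round(Math.toDegrees(Math.atan2(g[0], g[1])));
    }
}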
The accelerometer is sufficient for checking if the phone is flat, as Hoan very nicely demonstrated.
For anyone who arrives here looking not only to check if the phone is flat, but also what the rotation of the phone is, that can be achieved through the Rotation Vector motion sensor.
private double pitch, tilt, azimuth;

@Override
public void onSensorChanged(SensorEvent event) {
    // Get rotation vector sensor values
    double[] g = convertFloatsToDoubles(event.values.clone());

    // Normalize
    double norm = Math.sqrt(g[0] * g[0] + g[1] * g[1] + g[2] * g[2] + g[3] * g[3]);
    g[0] /= norm;
    g[1] /= norm;
    g[2] /= norm;
    g[3] /= norm;

    // Set values to commonly known quaternion letter representatives
    double x = g[0];
    double y = g[1];
    double z = g[2];
    double w = g[3];

    // Calculate pitch in degrees (-180 to 180)
    double sinP = 2.0 * (w * x + y * z);
    double cosP = 1.0 - 2.0 * (x * x + y * y);
    pitch = Math.atan2(sinP, cosP) * (180 / Math.PI);

    // Calculate tilt in degrees (-90 to 90)
    double sinT = 2.0 * (w * y - z * x);
    if (Math.abs(sinT) >= 1)
        tilt = Math.copySign(Math.PI / 2, sinT) * (180 / Math.PI);
    else
        tilt = Math.asin(sinT) * (180 / Math.PI);

    // Calculate azimuth in degrees (0 to 360; 0 = North, 90 = East, 180 = South, 270 = West)
    double sinA = 2.0 * (w * z + x * y);
    double cosA = 1.0 - 2.0 * (y * y + z * z);
    azimuth = Math.atan2(sinA, cosA) * (180 / Math.PI);
}

private double[] convertFloatsToDoubles(float[] input) {
    if (input == null)
        return null;
    double[] output = new double[input.length];
    for (int i = 0; i < input.length; i++)
        output[i] = input[i];
    return output;
}
Then to check if the phone is flat, you can simply compare the tilt and pitch values with a tolerance value. For example:
public boolean flatEnough(double degreeTolerance) {
    return tilt <= degreeTolerance && tilt >= -degreeTolerance
            && pitch <= degreeTolerance && pitch >= -degreeTolerance;
}
The advantage of doing it this way is that you can check if the phone is being held in any specific rotation.
It is worth noting that the app's orientation will not affect the values of pitch, tilt, and azimuth.
Working off of the perfect response from @Dan:
He missed a very slight bit of information that @davy307 pointed out.
When initializing the mAccelerometer, you must define it as Sensor.TYPE_ROTATION_VECTOR; otherwise, it will not have the third rotation vector component and will throw an ArrayIndexOutOfBounds exception.
mSensorManager = (SensorManager)getSystemService(Context.SENSOR_SERVICE);
mAccelerometer = mSensorManager.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR);
Otherwise, this is a perfect solution... Appreciated!
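For completeness, a registration sketch to go with that initialization (the listener rate is my choice; this assumes the activity implements SensorEventListener):
// Register in onResume, unregister in onPause to save battery.
@Override
protected void onResume() {
    super.onResume();
    mSensorManager.registerListener(this, mAccelerometer,
            SensorManager.SENSOR_DELAY_GAME);
}

@Override
protected void onPause() {
    super.onPause();
    mSensorManager.unregisterListener(this);
}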