I'm trying to move an object to a new x,y position based on the user's touch location, but I've hit a brick wall.
Currently I'm coding the movement of each axis manually, which produces a scripted "x then y" motion and a squared-off path. Ideally I want the sprite to follow a straight line to the touch position, not a square.
My basic movement calculation is here:
// check target not met on x axis
if (spriteX != spriteTargetX) {
    // check if it's left or right
    if (spriteTargetX < spriteX) {
        spriteX -= spriteSpeed;
    } else {
        spriteX += spriteSpeed;
    }
}
if (spriteY != spriteTargetY) {
    // check if it's up or down
    if (spriteTargetY < spriteY) {
        spriteY -= spriteSpeed;
    } else {
        spriteY += spriteSpeed;
    }
}
The above algorithm always results in a square movement. I honestly don't know whether I should be performing some kind of distance/angle calculation. Any ideas?
One simple way to do this is to represent the direction of movement as a unit vector and multiply it by the speed. I'll list the basic steps, give an example where you are at (1,1) and wish to move to (4,5) with speed 2, and finish with a short code sketch:
1. Get the difference between the destination and the current position: diff.x = 3, diff.y = 4.
2. Get the total distance from the current position to the destination: dist = sqrt(3^2 + 4^2) = 5.
3. Divide diff.x and diff.y by the distance: diff.x = 0.6, diff.y = 0.8.
4. Multiply diff.x and diff.y by the desired speed: diff.x = 1.2, diff.y = 1.6.
5. Add diff.x and diff.y to the sprite's x and y, respectively: sprite.x = 2.2, sprite.y = 2.6.
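Here is a minimal Java sketch of those steps, reusing the field names from your snippet (spriteX, spriteTargetX, spriteSpeed, etc., assumed to be floats). The snap-to-target check keeps the sprite from oscillating once it is closer than one step:

// Move the sprite one step of length spriteSpeed along the straight
// line toward the target; snap to the target when closer than one step.
void moveTowardTarget() {
    float diffX = spriteTargetX - spriteX;
    float diffY = spriteTargetY - spriteY;
    float dist = (float) Math.sqrt(diffX * diffX + diffY * diffY);
    if (dist <= spriteSpeed) {
        // Closer than one step: snap to the target to avoid overshooting.
        spriteX = spriteTargetX;
        spriteY = spriteTargetY;
        return;
    }
    // Normalize the difference into a unit vector, then scale by speed.
    spriteX += diffX / dist * spriteSpeed;
    spriteY += diffY / dist * spriteSpeed;
}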
Is there a way to get the angle that the ARCore camera is facing relative to one of ARCore's axes, in Euler angles? I understand that we can get the forward vector of the ARCore camera, but I am unsure how to calculate the desired angle from it.
I was probably overthinking the problem. Here is a simple way to get the angle between the camera's facing direction and the -Z axis (a positive angle means the camera faces to the right of the -Z axis). The same method can be used to find the angle relative to a different axis.
// get the camera
Camera arCamera = arFragment.getArSceneView().getScene().getCamera();

// get the forward vector of the camera, flattened onto the XZ plane
Vector3 cameraPos = arCamera.getWorldPosition();
Vector3 cameraForward = Vector3.add(cameraPos, arCamera.getForward().scaled(1.0f));
Vector3 forwardVector = Vector3.subtract(cameraForward, cameraPos);
forwardVector = new Vector3(forwardVector.x, 0, forwardVector.z).normalized();

// unsigned angle between the forward vector and the -Z axis
double degreesFromCamToNegZAxis = Vector3.angleBetweenVectors(Vector3.forward(), forwardVector);

// take the dot product with the world right vector
// to see on which side of the -Z axis the camera is facing
float dotProduct = Vector3.dot(Vector3.right(), forwardVector);
// value is between -1 and 1;
// if negative, the forward vector points to the left of the -Z axis
if (dotProduct < 0) {
    degreesFromCamToNegZAxis = -degreesFromCamToNegZAxis;
}
If you want to measure from a different axis, just swap in the reference vectors for the axis you are relating to, like so. In this example a positive angle is to the right of the positive Z axis.
double degreesFromCamToZAxis = Vector3.angleBetweenVectors(Vector3.back(), forwardVector);

// take the dot product between the two vectors
// to see from what side of the Z axis it is facing
float dotProduct = Vector3.dot(Vector3.left(), forwardVector);
// value is between -1 and 1
// if negative, forward vector is facing to the left of Z axis
if (dotProduct < 0) {
    degreesFromCamToZAxis = -degreesFromCamToZAxis;
}
Here is an image that visualizes the coordinate system ARCore uses (same as Android's Sensors).
[Image: ARCore coordinate system reference]
I want to pinch-zoom around a specific coordinate on a tiled 15f x 15f 3D board, i.e. I don't want to zoom around the origin, so I need to pan the board accordingly.
I am using a PerspectiveCamera (near = 0.1f, far = 100f). For a perfect fit of the screen the camera sits at approximately z = 13.4 before zooming.
Now what (I think) I want to do is to:
Unproject the screen coordinates (in the GestureDetector pinch method), done once at the start of each pinch zoom:
float icx = (initialPointer1.x + initialPointer2.x) / 2;
float icy = (initialPointer1.y + initialPointer2.y) / 2;
pointToMaintain = new Vector3(icx, icy, 0f);
mCamera.unproject(pointToMaintain);
Then, for each zoom cycle (as I adjust mCamera.z accordingly and call mCamera.update()), I project the point back to screen coordinates:
Vector3 pointNewPos = new Vector3(pointToMaintain);
mCamera.project(pointNewPos);
Then calculate the delta and pan accordingly:
int dX = (int) (pointNewPos.x - icx);
int dY = (int) (pointNewPos.y - icy);
pan(...); /* I.e. mCamera.translate(...) */
My problem is that mCamera.z starts out above pointToMaintain.z but then drops below it as the user moves their fingers:

step  cam.z  ptm.z  dX   dY
0     13.40
1     13.32  13.30  12   134
2     13.24  13.30  12  -188
...

(0) is the original value of mCamera.z before zooming starts. (1) is not valid? However (2) should be OK.
My questions are:
(1) How can I get a "valid" pointToMaintain when unprojecting the screen coordinates on the camera, i.e. a point that stays in front of the camera as it zooms in? (I guess the point ends up at 13.30 because near = 0.1f, but as seen above this produces weird screen coordinates.)
(2) Is this a good strategy for moving the tile board closer to the coordinates the user pinch-zoomed on?
To maintain the focus point, I use this code:
Note: this code relies on overloaded operators; if your language lacks them, replace the vector operators with the corresponding methods (add, subtract, etc.):
void zoomAt(float changeAmount, Vector2D focus) {
    float newZoom = thisZoom + changeAmount;
    offset = focus - ((focus - offset) * newZoom / thisZoom);
    thisZoom = newZoom;
}
Where:
focus = current center point to maintain
offset = distance from (0,0)
thisZoom = current zoom amount (starts at 1)
changeAmount = value by which to increase or decrease the zoom
It took me four tries over three years to get it done, and it turned out to be pretty easy once you draw it out: it's just two triangles.
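For reference, here is a plain-Java sketch of the same idea without operator overloading (the ZoomState class and field names are mine, for illustration):

// Minimal sketch of focal-point zooming without operator overloading.
class ZoomState {
    float thisZoom = 1f;   // current zoom amount (starts at 1)
    float offsetX = 0f;    // distance of the content from (0,0)
    float offsetY = 0f;

    // Zoom by changeAmount while keeping (focusX, focusY) fixed on screen.
    void zoomAt(float changeAmount, float focusX, float focusY) {
        float newZoom = thisZoom + changeAmount;
        // offset = focus - (focus - offset) * newZoom / thisZoom
        offsetX = focusX - (focusX - offsetX) * newZoom / thisZoom;
        offsetY = focusY - (focusY - offsetY) * newZoom / thisZoom;
        thisZoom = newZoom;
    }
}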
TL;DR
How come the accelerometer values I get from Sensor.TYPE_ACCELEROMETER are slightly offset? I don't mean by gravity, but by some small error that varies from axis to axis and phone to phone.
Can I calibrate the accelerometer? Or is there a standard way of compensating for these errors?
I'm developing an app that has a need for as precise acceleration measurements as possible (mainly vertical acceleration, i.e. same direction as gravity).
I've been doing A LOT of testing, and it turns out that the raw values I get from Sensor.TYPE_ACCELEROMETER are off. If I let the phone rest on a perfectly horizontal surface with the screen up, the accelerometer shows a Z-value of 9.0, where it should be about 9.81. Likewise, if I put the phone in portrait or landscape mode, the X- and Y- accelerometer values show about 9.6 instead of 9.81.
This of course affects my vertical acceleration, as I'm using SensorManager.getRotationMatrixFromVector() to calculate it, resulting in a vertical acceleration that is off by a different amount depending on the rotation of the device.
Now, before anyone jumps the gun and mentions that I should try Sensor.TYPE_LINEAR_ACCELERATION instead, I must point out that I'm actually doing that as well, in parallel with TYPE_ACCELEROMETER. Using the gravity sensor I then calculate the vertical acceleration (as described in this answer). The funny thing is that I get EXACTLY the same result as the method that uses the raw accelerometer, SensorManager.getRotationMatrixFromVector() and matrix multiplication (and finally subtracting gravity).
The only way I'm able to get almost exactly zero vertical acceleration for a stationary phone in any rotation is to take the raw accelerometer values, add an offset (from earlier observations, i.e. X+0.21, Y+0.21 and Z+0.81) and then perform the rotation-matrix step to get the world coordinate system accelerations. Note that it's not just the calculated vertical acceleration that is wrong; the raw values from Sensor.TYPE_ACCELEROMETER themselves are off, which I would think excludes other error sources like the gyroscope sensor, etc.
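To make that workaround concrete, here is a minimal sketch of it (the method name and wrapping are mine; the offsets are the ones I observed on my phones, so they are illustrative only; rawAccel and rotationVector are the latest values[] from the accelerometer and rotation vector sensors):

// Sketch: offset-correct the raw values, rotate them into world
// coordinates, and subtract gravity to get vertical acceleration.
static float verticalAcceleration(float[] rawAccel, float[] rotationVector) {
    float[] corrected = {
            rawAccel[0] + 0.21f,    // X offset observed on this phone
            rawAccel[1] + 0.21f,    // Y offset
            rawAccel[2] + 0.81f };  // Z offset
    float[] R = new float[9];
    SensorManager.getRotationMatrixFromVector(R, rotationVector);
    // world acceleration = R * device acceleration (R is row-major 3x3);
    // the third row yields the world Z (up) component
    float worldZ = R[6] * corrected[0] + R[7] * corrected[1] + R[8] * corrected[2];
    return worldZ - SensorManager.GRAVITY_EARTH;
}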
I have tested this on two different phones (Samsung Galaxy S5 and Sony Xperia Z3 compact), and both have these accelerometer value deviances - but of course not the same values on both phones.
How come the values of Sensor.TYPE_ACCELEROMETER are off, and is there a better way of "calibrating" the accelerometer than simply observing how much the values deviate from gravity and adding the difference before using them?
You should calibrate the gains, offsets, and angles of the 3 accelerometer axes.
Unfortunately it's not possible to cover the whole topic in depth here.
I'll write a small introduction describing the basic concept, and then post a link to the code of a simple Clinometer that implements the calibration.
The calibration routine can be done with 7 measurements (take the mean value of a good number of samples) in different orthogonal positions of your choice, in order to capture all the ±0 and ±g values of your accelerometers. For example:
STEP 1 = Lay flat
STEP 2 = Rotate 180°
STEP 3 = Lay on the left side
STEP 4 = Rotate 180°
STEP 5 = Lay vertical
STEP 6 = Rotate 180° upside-down
STEP 7 = Lay face down
Then you can use the 7 measurement means mean[][] to calculate the offsets and gains:
calibrationOffset[0] = (mean[0][2] + mean[0][3]) / 2;
calibrationOffset[1] = (mean[1][4] + mean[1][5]) / 2;
calibrationOffset[2] = (mean[2][0] + mean[2][6]) / 2;
calibrationGain[0] = (mean[0][2] - mean[0][3]) / (STANDARD_GRAVITY * 2);
calibrationGain[1] = (mean[1][4] - mean[1][5]) / (STANDARD_GRAVITY * 2);
calibrationGain[2] = (mean[2][0] - mean[2][6]) / (STANDARD_GRAVITY * 2);
using the values of mean[axis][step], where STANDARD_GRAVITY = 9.81.
Then apply the gain and offset corrections to the measurements:
for (int i = 0; i < 7; i++) {
    mean[0][i] = (mean[0][i] - calibrationOffset[0]) / calibrationGain[0];
    mean[1][i] = (mean[1][i] - calibrationOffset[1]) / calibrationGain[1];
    mean[2][i] = (mean[2][i] - calibrationOffset[2]) / calibrationGain[2];
}
and finally calculate the correction angles:
for (int i = 0; i < 7; i++) {
    angle[0][i] = (float) Math.toDegrees(Math.asin(mean[0][i]
            / Math.sqrt(mean[0][i] * mean[0][i] + mean[1][i] * mean[1][i] + mean[2][i] * mean[2][i])));
    angle[1][i] = (float) Math.toDegrees(Math.asin(mean[1][i]
            / Math.sqrt(mean[0][i] * mean[0][i] + mean[1][i] * mean[1][i] + mean[2][i] * mean[2][i])));
    angle[2][i] = (float) Math.toDegrees(Math.asin(mean[2][i]
            / Math.sqrt(mean[0][i] * mean[0][i] + mean[1][i] * mean[1][i] + mean[2][i] * mean[2][i])));
}
calibrationAngle[2] = (angle[0][0] + angle[0][1])/2; // angle 0 = X axis
calibrationAngle[1] = -(angle[1][0] + angle[1][1])/2; // angle 1 = Y axis
calibrationAngle[0] = -(angle[1][3] - angle[1][2])/2; // angle 2 = Z axis
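At runtime, the offset/gain part of the calibration can then be applied to each raw reading; a minimal sketch, assuming the calibrationOffset[] and calibrationGain[] arrays computed above are available as float[] fields:

// Apply the offset/gain calibration to each raw accelerometer reading.
@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
        float[] calibrated = new float[3];
        for (int axis = 0; axis < 3; axis++) {
            calibrated[axis] = (event.values[axis] - calibrationOffset[axis])
                    / calibrationGain[axis];
        }
        // use calibrated[] instead of event.values[] from here on
    }
}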
You can find a simple but complete implementation of 3-axis calibration in this open-source Clinometer app: https://github.com/BasicAirData/Clinometer.
There are also the APK and the Google Play Store link if you want to try it.
You can find the calibration routine in CalibrationActivity.java;
the calibration parameters are applied in ClinometerActivity.java.
Furthermore, you can find a very good technical article that goes deeper into 3-axis calibration here: https://www.digikey.it/it/articles/using-an-accelerometer-for-inclination-sensing.
I need to find the angle of a vehicle's turn, measured in degrees.
Location points update at equal intervals (1 sec), so the device records about 4-5 points during a turn. I sketched this schematically in the picture.
Is it possible to calculate the angle of turn using Location? If it is possible, how?
What I tried:
Create two geometric vectors from points 3, 4 and 1, 2 respectively and find the angle between those vectors. I calculated the vector coordinates like Vector1 = (lat2 - lat1; lon2 - lon1). I'm not sure this approach can be applied to Location coordinates.
Use location1.bearingTo(location2). But this doesn't give the expected results; it seems to return "compass" bearings. Perhaps I could use it somehow, but I'm not sure.
I also tried a few trigonometric formulas, like here or here or here, but they didn't give the expected angle.
EDIT: Solution
The accepted answer works great. But to complete the answer I have to show my angleDifference method. This one works for me:
public int getAngleDifference(int currentAngle) {
    int r = 0;
    angleList.add(currentAngle);
    if (angleList.size() == 4) {
        int d = Math.abs(angleList.get(0) - angleList.get(3)) % 360;
        r = d > 180 ? 360 - d : d;
        angleList.clear();
    }
    return r;
}
I add points to the list until there are 4 of them, and then calculate the angle difference between the 1st and 4th points for better results.
Hope it will help someone!
vect1 = LatLon2 - LatLon1; // vector subtraction
vect2 = LatLon4 - LatLon3;
By definition, the dot product has the property:
vect1.vect2 = ||vect1||*||vect2||*Cos(theta)
Here's a breakdown of the notation
The term vect1.vect2 is the dot product of vect1 and vect2.
The general form of a dot product can be broken down component-wise. Let v1 = <x1,y1> and v2 = <x2,y2>; for two arbitrary vectors v1 and v2 the dot product is:
v1.v2 = x1*x2 + y1*y2
and the magnitude of some arbitrary vector v is:
||v|| = sqrt(v.v), which is a scalar.
The above is equivalent to the Euclidean distance formula with components x and y:
||v|| = sqrt(x^2 + y^2)
Getting the angle
Find a value for theta given the two vectors vect1 and vect2:
theta = Math.ArcCos(vect1.vect2/(||vect1||*||vect2||))
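Putting the pieces together, a minimal Java sketch (the method name is mine; it treats the lat/lon deltas as planar coordinates, which is only an approximation away from the equator):

// Angle in degrees between the vector p1->p2 and the vector p3->p4,
// treating latitude/longitude deltas as planar x/y coordinates.
static double turnAngleDegrees(double lat1, double lon1, double lat2, double lon2,
                               double lat3, double lon3, double lat4, double lon4) {
    double v1x = lat2 - lat1, v1y = lon2 - lon1;  // vect1
    double v2x = lat4 - lat3, v2y = lon4 - lon3;  // vect2
    double dot = v1x * v2x + v1y * v2y;           // vect1 . vect2
    double mag1 = Math.sqrt(v1x * v1x + v1y * v1y);
    double mag2 = Math.sqrt(v2x * v2x + v2y * v2y);
    return Math.toDegrees(Math.acos(dot / (mag1 * mag2)));
}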
Approach 1 does not work as you described it: lat/lon are not Cartesian coordinates (one degree of longitude expressed in meters is not the same as one degree of latitude; they are only equal at the equator). You would first have to transform into a (local) Cartesian system.
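Such a transformation can be as simple as an equirectangular approximation; a sketch (my own illustration, adequate over the short distances between consecutive GPS fixes):

// Project lat/lon onto a local plane: scale longitude by cos(latitude)
// so that one unit covers the same ground distance on both axes.
static double[] toLocalXY(double lat, double lon, double refLat) {
    double x = Math.toRadians(lon) * Math.cos(Math.toRadians(refLat));
    double y = Math.toRadians(lat);
    return new double[] { x, y };  // radians of arc; multiply by Earth's radius for meters
}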
There is also an error in the drawing: the angle marked with "?" is placed on the wrong side. You most probably want the angle 180° - ?.
In your example the car is turning less than 90°, although your angle shows more than 90°.
To understand this better, make another drawing where the car turns left by only 10 degrees. In your drawing this would come out as 170°, which is wrong.
Approach 2 works better, but you need to sum up the angle differences.
You have to write yourself a method
double angleDifference(double angle1, double angle2);
This looks easier than it is, although the code is only a few lines long.
Make sure you have test cases that cover the behaviour when crossing the 360° limit.
Example: a turn from bearing 10 to bearing 350 should give either 20 or -20, depending on whether you want the method to return the absolute value or the relative (signed) angle. A sketch of such a method follows below.
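A minimal sketch (my version returns the signed angle; positive means angle2 lies clockwise of angle1, and the normalization handles the 360° wrap-around):

// Smallest signed difference between two bearings, in degrees.
// Result is in (-180, 180]; positive means angle2 is clockwise of angle1.
static double angleDifference(double angle1, double angle2) {
    double d = (angle2 - angle1) % 360.0;    // Java % keeps the dividend's sign
    if (d <= -180.0) d += 360.0;
    if (d > 180.0) d -= 360.0;
    return d;
}
// angleDifference(10, 350)  -> -20.0 (counter-clockwise)
// angleDifference(350, 10)  ->  20.0 (clockwise)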
I rotated my Android device about the x axis (from -180 degrees to 180 degrees); see the image below.
I assumed only the rotation vector's x value would change. Y and z may carry some noise, but there shouldn't be much variation among their values.
However, this is what I receive. Kindly see
https://docs.google.com/spreadsheets/d/1ZLoSKI8XNjI1v4exaXxsuMtzP0qWTP5Uu4C3YTwnsKo/edit?usp=sharing
I suspect my sensor has some problem.
Any idea? Thank you very much.
Jimmy
Your sensor is fine. The rotation vector entries cannot simply be related to the rotation angle around a particular axis. The SensorEvent structure consists of timestamp, sensor, accuracy, and values. Depending on the sensor, the values float[] varies in size from 1 to 5. The rotation vector's values are based on a unit quaternion, all together forming a vector that represents the orientation of the world frame relative to your smartphone-fixed frame.
They are unitless and positive counter-clockwise.
The orientation of the phone is represented by the rotation necessary to align the East-North-Up coordinates with the phone's coordinates. That is, applying the rotation to the world frame (X,Y,Z) would align it with the phone coordinates (x,y,z).
If the vector were a rotation matrix, one could write it as v_body = R_rot_vec * v_world, pushing a world vector into the smartphone-fixed description.
Furthermore, about the vector:
The three elements of the rotation vector are equal to the last three components of a unit quaternion <cos(θ/2), x*sin(θ/2), y*sin(θ/2), z*sin(θ/2)>.
Q: So what to do with it? Depending on your Euler-angle convention (24 possible sequences, 12 valid ones) you can calculate the corresponding angles u := [ψ,θ,φ], e.g. by applying the 123 sequence.
If you already have the rotation matrix entries, you can get the Euler angles from them likewise, via the 321 sequence,
with q1-q3 always being values[0-2] (don't get confused by u_ijk, as the reference (Diebel) uses different conventions compared to the standard). But wait, your linked table only has 3 values, which is similar to what I get. This is one SensorEvent of mine; the last three numbers are printed from values[]:
timestamp        sensortype  accuracy  values[0]      values[1]     values[2]
23191581386897   11          -75       -0.0036907701  -0.014922042  0.9932963
4 quaternion components - 3 values = 1 unknown. The first component q0 is redundant information (the documentation also says it should be available under values[3], depending on your API level), so we can use the norm (length) to calculate q0 from the other three: set ||q|| = 1 and solve for q0. Now all of q0-q3 are known.
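In code the reconstruction looks something like this sketch (the helper name is mine; the clamp guards against tiny negative values from floating-point noise). Note that newer API levels also provide SensorManager.getQuaternionFromVector(), which does the same job:

// Reconstruct the scalar component q0 of the unit quaternion
// from the three rotation vector values, using ||q|| = 1.
static double[] fullQuaternion(float[] values) {
    double q1 = values[0], q2 = values[1], q3 = values[2];
    double t = 1.0 - (q1 * q1 + q2 * q2 + q3 * q3);
    double q0 = Math.sqrt(Math.max(t, 0.0));  // clamp floating-point noise
    return new double[] { q0, q1, q2, q3 };
}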
Furthermore, my Android 4.4.2 does not provide the fourth value, the estimated heading accuracy (in radians), in values[4], so I evaluate event.accuracy:
for (SensorEvent e : currentEvent) {
    if (e != null) {
        String toMsg = "";
        for (int i = 0; i < e.values.length; i++) {
            toMsg += " " + String.valueOf(e.values[i]);
        }
        iBinder.msgString(String.valueOf(e.timestamp) + " "
                + String.valueOf(e.sensor.getType()) + " "
                + String.valueOf(e.accuracy) + toMsg, 0);
    }
}
Put those equations into code and you will get things sorted.
Here is a short conversion helper that converts quaternions using either XYZ or ZYX. It can be run from the shell: github (BSD-licensed).
The relevant part for XYZ:
/* quaternion to euler in XYZ (seq: 123) */
double* quat2eulerxyz(double* q) {
    /* euler angles */
    double psi   = atan2( -2.*(q[2]*q[3] - q[0]*q[1]), q[0]*q[0] - q[1]*q[1] - q[2]*q[2] + q[3]*q[3] );
    double theta = asin( 2.*(q[1]*q[3] + q[0]*q[2]) );
    double phi   = atan2( 2.*(-q[1]*q[2] + q[0]*q[3]), q[0]*q[0] + q[1]*q[1] - q[2]*q[2] - q[3]*q[3] );
    /* save vars by simply pushing them back into the array and returning */
    q[1] = psi;
    q[2] = theta;
    q[3] = phi;
    return q;
}
Here are some examples of applying quaternions to Euler angles:
**Q:** What does the sequence ijk stand for? Take two coordinate frames A and B superposed on each other (all axes within each other) and start rotating frame B: first about the i-axis by angle psi, then about the j-axis by angle theta, and last about the k-axis by angle phi. It could also be α, β, γ for i, j, k. *I don't pick up the numbers, as they are confusing (Diebel vs. other papers).*
R(psi,theta,phi) = R_z(phi) R_y(theta) R_x(psi)
The trick is that the elementary rotations are applied from right to left, although we read the sequence from left to right.
Those are the three elementary rotations you go through to get from A to B: *v_B = R(psi,theta,phi) v_A*
**Q:** So how do you get the Euler angles/quaternion that turn from [0°,0°,0°] to e.g. [0°,90°,0°]? First align both frames from the pictures, i.e. the known device frame B to the "invisible" world frame A. You are done superposing when the angles all reach [0°,0°,0°]. Just figure out where north, south and east are where you are sitting right now, and point the device frame B in those directions. Now when you rotate the device 90° counter-clockwise around the y-axis, converting the quaternion will give the desired [0°,90°,0°]; a worked example follows below.
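As a numeric sanity check (my own worked example, reusing the formulas from quat2eulerxyz above in Java): a 90° rotation about the y-axis corresponds to the unit quaternion <cos(45°), 0, sin(45°), 0>:

// 90° about the y-axis: q = <cos(45°), 0, sin(45°), 0>
double q0 = Math.cos(Math.toRadians(45)), q1 = 0.0;
double q2 = Math.sin(Math.toRadians(45)), q3 = 0.0;
// same formulas as in quat2eulerxyz, with the asin argument
// clamped against floating-point noise
double psi   = Math.atan2(-2.0 * (q2 * q3 - q0 * q1),
        q0 * q0 - q1 * q1 - q2 * q2 + q3 * q3);
double s     = 2.0 * (q1 * q3 + q0 * q2);
double theta = Math.asin(Math.max(-1.0, Math.min(1.0, s)));
double phi   = Math.atan2(2.0 * (-q1 * q2 + q0 * q3),
        q0 * q0 + q1 * q1 - q2 * q2 - q3 * q3);
System.out.printf("[%.1f, %.1f, %.1f] degrees%n",
        Math.toDegrees(psi), Math.toDegrees(theta), Math.toDegrees(phi));
// prints [0.0, 90.0, 0.0]; theta = 90° is the gimbal-lock configuration,
// where psi and phi are not individually unique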
*Julian*
*Kinematics source: [Source Diebel (Stanford)][11], with solid info on the mechanics background (careful: for Diebel XYZ is denoted u_321 (1,2,3) while ZYX is u_123 (3,2,1)); [this][12] is a good starting point.*