Move an image based on phone movement (ROTATION_VECTOR) - android

I'm very new to working with the sensors on an Android device. I have the following code (which I obtained from various tutorials).
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR) {
        float[] orientationVals = new float[3];
        // Convert the rotation-vector to a 4x4 matrix.
        SensorManager.getRotationMatrixFromVector(mRotationMatrix, event.values);
        SensorManager.getOrientation(mRotationMatrix, orientationVals);
        orientationVals[0] = (float) Math.toDegrees(orientationVals[0]);
        orientationVals[1] = (float) Math.toDegrees(orientationVals[1]);
        orientationVals[2] = (float) Math.toDegrees(orientationVals[2]);
        txtAzimuth.setText("Azimuth : " + d.format(orientationVals[0]) + '°');
        txtPitch.setText("Pitch : " + d.format(orientationVals[1]) + '°');
        txtRoll.setText("Roll : " + d.format(orientationVals[2]) + '°');
    }
}
Okay, so I've got these values. How do I use them to move an image (a marker of sorts) based on the movement of the phone? For example, if I move the phone to the right, the image should move to the left, and when the phone moves back to the left, the image should move right.
I've done a lot of googling, and what I've found is mostly OpenGL stuff, like this:
OpenGl world using camera to find object
That question asks for what I want, except that its object moves randomly; I want the object's position to be decided by me.
I don't want to use OpenGL for this, as it would be overkill for my app. Could I somehow map the values produced by the sensors to moving the image?
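One way, without OpenGL, is to keep the first azimuth as a reference and translate the view by the difference. Here is a minimal sketch of that idea; imgMarker, moveMarker, and PIXELS_PER_DEGREE are hypothetical names, not part of the code above, and the sensitivity factor needs tuning by experiment:
private static final float PIXELS_PER_DEGREE = 10f; // assumed sensitivity, tune experimentally
private float initialAzimuth = Float.NaN;

// Call from onSensorChanged with the azimuth in degrees (orientationVals[0] above).
private void moveMarker(float azimuthDeg) {
    if (Float.isNaN(initialAzimuth)) {
        initialAzimuth = azimuthDeg; // remember the starting heading
    }
    // Negative sign: moving the phone right moves the image left, and vice versa.
    float dx = -(azimuthDeg - initialAzimuth) * PIXELS_PER_DEGREE;
    imgMarker.setTranslationX(dx); // imgMarker is a hypothetical ImageView field
}
The same idea applied to pitch (orientationVals[1]) with setTranslationY would cover vertical movement.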

Related

Azimuth mirrored after remapping axis coordinate system

I am trying to remap a device that has an alternate coordinate system.
The sensor is reporting values that are rotated 90° around the X axis. The format is a quaternion in standard Android Rotation Vector notation. If I use the data unmodified, I can hold the device at a 90° offset and successfully call getOrientation via:
private void updateOrientationFromVector(float[] rotationVector) {
    float[] rotationMatrix = new float[9];
    SensorManager.getRotationMatrixFromVector(rotationMatrix, rotationVector);
    final int worldAxisForDeviceAxisX = SensorManager.AXIS_X;
    final int worldAxisForDeviceAxisY = SensorManager.AXIS_Z;
    float[] adjustedRotationMatrix = new float[9];
    SensorManager.remapCoordinateSystem(rotationMatrix, worldAxisForDeviceAxisX,
            worldAxisForDeviceAxisY, adjustedRotationMatrix);
    // Transform rotation matrix into azimuth/pitch/roll
    float[] orientation = new float[3];
    SensorManager.getOrientation(adjustedRotationMatrix, orientation);
    // Convert radians to degrees (1 rad ≈ 57.3°)
    float azimuth = orientation[0] * 57;
    float pitch = orientation[1] * 57;
    float roll = orientation[2] * 57;
    // Normalize for readability
    if (azimuth < 0) {
        azimuth = azimuth + 360;
    }
    Log.d("Orientation", "Azimuth: " + azimuth + "° Pitch: " + pitch + "° Roll: " + roll + "°");
}
This code works fine for all normal Android devices. If I hold a reference phone in front of me as shown, the data is converted properly and shows my correct bearings. But with this test device, I must hold it at the wrong orientation to get the correct bearings.
I want to pre-process the data from this test device to rotate the axes so that this device matches all other Android devices. This will let the display logic be generic.
Unfortunately, I have tried many different techniques and none are working.
First, I tried to use the Android calls again:
private fun rotateQuaternionAxes(rotationVector: FloatArray): FloatArray {
    val rotationMatrix = FloatArray(9)
    SensorManager.getRotationMatrixFromVector(rotationMatrix, rotationVector)
    val worldAxisForDeviceAxisX = SensorManager.AXIS_X
    val worldAxisForDeviceAxisY = SensorManager.AXIS_Z
    val adjustedRotationMatrix = FloatArray(9)
    SensorManager.remapCoordinateSystem(rotationMatrix, worldAxisForDeviceAxisX,
            worldAxisForDeviceAxisY, adjustedRotationMatrix)
    val axisRemappedData = Quaternion.fromRotationMatrix(adjustedRotationMatrix)
    val rotationData = floatArrayOf(
            axisRemappedData.x,
            axisRemappedData.y,
            axisRemappedData.z,
            axisRemappedData.w
    )
    return rotationData
}
My private Quaternion.fromRotationMatrix is not shown here; it came from euclideanspace.com.
When I pre-process my rotation data with this, the logic works for everything, except north and south are swapped! East and west are correct, and my pitch and roll are correct.
So I decided to follow the suggestions for Rotating a Quaternion on 1-Axis with the following code:
private fun rotateQuaternionAxes(rotationVector: FloatArray): FloatArray {
    // https://stackoverflow.com/questions/4436764/rotating-a-quaternion-on-1-axis
    // Device X+ is towards power button; Y+ is toward camera; Z+ towards nav buttons
    // So rotate the reported data 90 degrees around X and the axes move appropriately
    val sensorQuaternion: Quaternion = Quaternion(rotationVector[0], rotationVector[1], rotationVector[2], rotationVector[3])
    val manipulationQuaternion = Quaternion.axisAngle(-1.0f, 0.0f, 0.0f, 90.0f)
    val axisRemappedData = Quaternion.multiply(sensorQuaternion, manipulationQuaternion)
    val rotationData = floatArrayOf(
            axisRemappedData.x,
            axisRemappedData.y,
            axisRemappedData.z,
            axisRemappedData.w
    )
    //LogUtil.debug("Orientation Orig: $sensorQuaternion Rotated: $axisRemappedData")
    return rotationData
}
This does the exact same thing! Everything is fine, except north and south are mirrored, leaving east and west correct.
My Quaternion math came from sceneform-android-sdk and I double-checked it against several online sources.
I also tried simply changing my data by reading the same components in a different order, per Convert quaternion to a different coordinate system:
private fun rotateQuaternionAxes(rotationVector: FloatArray): FloatArray {
    val x_val = rotationVector[0]
    val y_val = rotationVector[1]
    val z_val = rotationVector[2]
    val w_val = rotationVector[3]
    // No change would be: floatArrayOf(x_val, y_val, z_val, w_val)
    val rotationData = floatArrayOf(x_val, z_val, -y_val, w_val)
    return rotationData
}
This was not even close. I played with the axes and found that rotationData = floatArrayOf(-z_val, -x_val, y_val, w_val) had correct pitch and roll, but the azimuth was completely non-functional. So I've abandoned a simple remapping as an option.
Since the Android remapCoordinateSystem and the quaternion math give the same result, they seem mathematically equivalent, and multiple sources indicate they should accomplish what I'm trying to do.
Can anyone explain why remapping my axes would swap north and south? I believe I am getting a quaternion reflection instead of a rotation; there is no physical point on the device that tracks the direction it shows.
Answer
As you said, it looks like you are expecting your data to be in the East-North-Up (ENU) Frame of Reference (FoR) but you are seeing data in an East-Down-North (EDN) FoR. The link you cited to convert a quaternion to another coordinate system converts from an ENU to an NDW FoR, which evidently is not what you are looking for.
There are two ways you can solve this. Either use another rotation matrix, or swap your variables. Using another rotation matrix means doing more computation - but if you really want to learn how to do this, you can check out my self-plug introduction to quaternions for reference frame rotations.
The easiest way would be to swap your variables by recognizing that your X axis is not changing, but your expected Y is measured on z' and your expected Z is measured on -y', where X, Y, Z are the expected FoR and x', y', z' are the actual measured FoR. The following "swaps" should give you the same behavior as your other Android devices:
x_expected = x_actual
y_expected = z_actual
z_expected = -y_actual
!!! HOWEVER !!! If your measurements are given in quaternions, then you will have to use a rotation matrix. If your measurements are given as X,Y,Z measurements, you can get away with the swap provided above.
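For the vector case, a minimal sketch of that swap, assuming raw x'/y'/z' measurements (e.g., accelerometer samples) and a helper name of my choosing:
// Remap a measured vector from the device's EDN frame to the expected ENU frame.
float[] remapVector(float[] measured) {
    // X is unchanged; expected Y is measured on z'; expected Z is measured on -y'.
    return new float[] { measured[0], measured[2], -measured[1] };
}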
ENU/NED/NDW Notation
East-North-Up and all other similar axes notations are defined by the order of the coordinate system, expressed as X, then Y, and lastly Z, with respect to a Global inertial (static) Frame of Reference. I've defined your expected coordinate system as if you were to lay your phone flat on the ground with the screen of the phone facing the sky and the top of your phone pointing Northward.

ARCore Pose and Aruco estimatePoseSingleMarkers

I have a server function that detects and estimates the pose of an ArUco marker from an image.
Using the function estimatePoseSingleMarkers I found the rotation and translation vectors.
I need to use these values in an Android app with ARCore to create a Pose.
The documentation says that Pose needs two float arrays (rotation and translation): https://developers.google.com/ar/reference/java/arcore/reference/com/google/ar/core/Pose.
float[] newT = new float[] { t[0], t[1], t[2] };
Quaternion q = Quaternion.axisAngle(new Vector3(r[0], r[1], r[2]), 90);
float[] newR = new float[]{ q.x, q.y, q.z, q.w };
Pose pose = new Pose(newT, newR);
The position of the 3D object placed at this pose is totally random.
What am I doing wrong?
This is a snapshot from the server image after estimating and drawing the axes. The image I receive is rotated by 90°; I'm not sure if that's related.
cv::aruco::estimatePoseSingleMarkers (link) returns the rotation vector in Rodrigues format. Following the docs:
w = norm( r ) // angle of rotation in radians
r = r/w // unit axis of rotation
thus
float w = (float) Math.sqrt(r[0]*r[0] + r[1]*r[1] + r[2]*r[2]);
// handle w == 0.0 separately
// Get a new Quaternion using an axis/angle (degrees) to define the rotation
Quaternion q = Quaternion.axisAngle(new Vector3(r[0]/w, r[1]/w, r[2]/w), (float) Math.toDegrees(w));
should work, apart from the right-angle rotation mentioned, provided the lens parameters fed to estimatePoseSingleMarkers are correct or at least reasonably accurate.
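Putting the two steps together, a sketch of the conversion, assuming r is the Rodrigues vector (radians) and t the translation from the server, both float[3], with the Sceneform-style Quaternion and Vector3 classes from the question:
// Convert the Rodrigues rotation vector to an axis/angle quaternion, then build the Pose.
float w = (float) Math.sqrt(r[0]*r[0] + r[1]*r[1] + r[2]*r[2]); // rotation angle in radians
float[] newT = new float[] { t[0], t[1], t[2] };
Pose pose;
if (w < 1e-6f) {
    pose = new Pose(newT, new float[] { 0f, 0f, 0f, 1f }); // no rotation: identity quaternion
} else {
    Quaternion q = Quaternion.axisAngle(
            new Vector3(r[0] / w, r[1] / w, r[2] / w), // unit rotation axis
            (float) Math.toDegrees(w));                // axisAngle expects degrees
    pose = new Pose(newT, new float[] { q.x, q.y, q.z, q.w });
}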

rotate 3D Object (Spatial) with compass

I've generated a 3D overlay using jMonkey in my Android app. Everything works fine: my ninja model is walking in a loop. Awesome!
Now I want to rotate the camera according to the direction of the phone. I thought the compass was the best way, but unfortunately I have a lot of problems. So here I go.
I've created a method that is invoked in the activity:
public void rotate(float x, float y, float z) {
    Log.d(TAG, "simpleUpdate: new rotation: " + y);
    newX = x;
    newY = y;
    newZ = z;
    newPosition = true;
}
In the simpleUpdate method I've handled it this way:
if (newPosition && ninja != null) {
    Log.d(TAG, "simpleUpdate: rotation: " + newY);
    ninja.rotate((float) Math.toRadians(newX), (float) Math.toRadians(newY), (float) Math.toRadians(newZ));
    newPosition = false;
}
In my activity I'm checking whether the phone moved:
if (lastAzimuth != (int) azimuthInDegress) {
    lastAzimuth = (int) azimuthInDegress;
(I cast to int so the distortion won't be such a big problem.)
    if ((com.fixus.towerdefense.model.SuperimposeJME) app != null) {
        ((com.fixus.towerdefense.model.SuperimposeJME) app).rotate(0f, azimuthInDegress, 0f);
    }
}
At the moment I want to rotate it only around the Y axis.
Now the main problem is that the rotation is more like a jump than a rotation. When I move my phone a bit and get a 6-degree difference (I see this in my log), the model rotates by something like 90 degrees and then turns back. This has nothing to do with the rotation or change taken from my compass.
Any ideas?
UPDATE
I think I got it. The rotate method rotates from the current state by the value I pass, so it ends up being the old Y rotation plus the new value. Instead I'm now passing the difference between the current value and the old value, and it looks almost fine. Is this the right way?
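That is the usual fix when a rotate() call is relative rather than absolute. A minimal sketch of the delta approach; the onAzimuthChanged callback is a hypothetical name, and rotate is the method shown above:
private float lastAzimuthDeg = 0f;

// Hypothetical callback fed the compass azimuth in degrees.
public void onAzimuthChanged(float azimuthDeg) {
    float deltaDeg = azimuthDeg - lastAzimuthDeg;
    // Take the shortest path across the 0/360 boundary.
    if (deltaDeg > 180f) deltaDeg -= 360f;
    if (deltaDeg < -180f) deltaDeg += 360f;
    lastAzimuthDeg = azimuthDeg;
    rotate(0f, deltaDeg, 0f); // rotate only by the change since the last reading
}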

how do I convert Sensor data to degrees

I am getting various sensor readings from my device (programming for Android) and I am looking to get the roll (which seems to be a number from 1 to 100) converted into an angle in degrees, and also to convert the magnetometer heading into degrees.
Any simple equations would be appreciated; my geometry is a fleeting memory.
public void onSensorChanged(SensorEvent evt) {
    int type = evt.sensor.getType();
    if (type == Sensor.TYPE_ORIENTATION) {
        azimuth = evt.values[0]; // azimuth: rotation around the z-axis
        pitch = evt.values[1];   // pitch: rotation around the x-axis
        roll = evt.values[2];    // roll: rotation around the y-axis
    }
    // Smoothing the sensor data a bit seems like a good idea.
    if (type == Sensor.TYPE_MAGNETIC_FIELD) {
        orientation[0] = (orientation[0] * 1 + evt.values[0]) * 0.5f;
        orientation[1] = (orientation[1] * 1 + evt.values[1]) * 0.5f;
        orientation[2] = (orientation[2] * 1 + evt.values[2]) * 0.5f;
    } else if (type == Sensor.TYPE_ACCELEROMETER) {
        acceleration[0] = (acceleration[0] * 2 + evt.values[0]) * 0.33334f;
        acceleration[1] = (acceleration[1] * 2 + evt.values[1]) * 0.33334f;
        acceleration[2] = (acceleration[2] * 2 + evt.values[2]) * 0.33334f;
    }
    if ((type == Sensor.TYPE_MAGNETIC_FIELD) || (type == Sensor.TYPE_ACCELEROMETER)) {
        float[] newMat = new float[16];
        //Toast toast = Toast.makeText(ctx.getApplicationContext(), "accel", Toast.LENGTH_SHORT);
        //toast.show();
        SensorManager.getRotationMatrix(newMat, null, acceleration, orientation);
        if (displayOri == 0 || displayOri == 2) {
            SensorManager.remapCoordinateSystem(newMat,
                    SensorManager.AXIS_X * -1, SensorManager.AXIS_MINUS_Y * -1, newMat);
        } else {
            SensorManager.remapCoordinateSystem(newMat,
                    SensorManager.AXIS_Y, SensorManager.AXIS_MINUS_X, newMat);
        }
        matrix = newMat;
    }
}
I should add that I am not sure I just want roll. My app locks in landscape mode, but obviously the user can turn their phone to any angle on any axis, so I probably need all three of the above to get the angle I'm looking for.
The angle in question is as if the user is looking through their phone at the real world, no matter how they are holding it, and I want the angle they are looking at off the horizon.
For instance, if they are looking at the horizon I want 90 degrees returned; if they are looking straight up in the sky I should get 180, and straight down -180. Then I will also want the degrees from magnetic north that they are looking, using the magnetometer.
values[2] should already contain a degree value; that's mentioned in the reference:
values[2]: Roll, rotation around the Y axis (-90 <= roll <= 90), with positive values when the z-axis moves toward the x-axis.
Update
Take a look at this picture: http://developer.android.com/images/axis_device.png
Here you see the blue axis, the Y axis. When your phone turns around it, that's called "rolling". The angle of the rotation will be contained in values[2].
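For the landscape "angle off the horizon" asked about above, one hedged sketch is to remap the rotation matrix so that pitch is measured looking through the screen, as the Android docs describe for remapCoordinateSystem; matrix is the field from the question's code, and the 90° offset is the question's own convention:
float[] remapped = new float[16];
float[] ori = new float[3];
// Remap so that "looking through the screen" corresponds to pitch.
SensorManager.remapCoordinateSystem(matrix, SensorManager.AXIS_X, SensorManager.AXIS_Z, remapped);
SensorManager.getOrientation(remapped, ori);
float elevationDeg = (float) Math.toDegrees(ori[1]) + 90f; // assumed offset: 90° at the horizon
float headingDeg = (float) Math.toDegrees(ori[0]);         // degrees from magnetic north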

Rotating the Canvas impacts TouchEvents

I have a map application using an in-house map engine on Android. I'm working on a rotating map view that rotates the map based on the phone's orientation using the sensor service. All works fine except for dragging the map when the phone is pointing somewhere other than north. For example, if the phone is facing west, dragging the map up still moves the map to the south rather than to the east as would be expected. I'm assuming translating the canvas is one possible solution, but I'm honestly not sure of the correct way to do this.
Here is the code I'm using to rotate the Canvas:
public void dispatchDraw(Canvas canvas) {
    canvas.save(Canvas.MATRIX_SAVE_FLAG);
    // mHeading is the orientation from the Sensor
    canvas.rotate(-mHeading, origin[X], origin[Y]);
    mCanvas.delegate = canvas;
    super.dispatchDraw(mCanvas);
    canvas.restore();
}
What is the best approach to make dragging the map consistent regardless of the phone's orientation? The SensorManager has a remapCoordinateSystem() method, but it's not clear that this will resolve my problem.
You can trivially get the delta x and delta y between two consecutive move events. To correct these values for your canvas rotation you can use some simple trigonometry:
void correctPointForRotate(PointF delta, float rotation) {
    // Get the angle of movement (0=up, 90=right, 180=down, 270=left)
    double a = Math.atan2(-delta.x, delta.y);
    a = Math.toDegrees(a); // a now ranges -180 to +180
    a += 180;
    // Adjust the angle by the amount the map is rotated around the center point
    a += rotation;
    a = Math.toRadians(a);
    // Calculate new corrected panning deltas
    double hyp = Math.sqrt(delta.x * delta.x + delta.y * delta.y);
    delta.x = (float) (hyp * Math.sin(a));
    delta.y = -(float) (hyp * Math.cos(a));
}
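Hypothetical usage inside a touch handler, correcting the raw drag delta by the current heading before panning; lastX, lastY, mHeading, and scrollMapBy stand in for the app's own fields:
// Inside onTouchEvent, for ACTION_MOVE:
PointF delta = new PointF(event.getX() - lastX, event.getY() - lastY);
correctPointForRotate(delta, mHeading); // undo the canvas rotation
scrollMapBy(delta.x, delta.y);          // pan the map in map coordinates
lastX = event.getX();
lastY = event.getY();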
