Android - Steering with orientation sensor

I am writing a simple 2D game similar to Breakout or Pong, where you move a bar along the bottom of the screen by tipping the phone left and right.
I am trying to use the orientation sensor to achieve this movement, but I've run into some issues.
After printing out the orientation sensor's values for a while, I decided rotation in the Z axis gave me what I wanted. If I hold the screen horizontal and perpendicular to the floor (x: 0, y: 0, z: 90), rotating left and right in Z takes away from 90, so the value 90 - z works as an "amount of steering" value.
However, this perpendicular-to-the-floor position is not very natural to play with; people are much more likely to hold the phone at about 45 degrees in the Z axis, at which point all the values I have been using mess up completely, steering goes weird, and people are unhappy.
I guess what I really need is to detect rotation about some Z axis even though the XY plane the phone is in is constantly changing. Is there some clever way to do this with maths?
EDIT: Just found the OrientationEventListener - Exactly what I needed!
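For reference, a minimal Kotlin sketch of that approach (the SteeringListener name and the ±45 degree clamp are illustrative, not from the original post):

import android.content.Context
import android.hardware.SensorManager
import android.view.OrientationEventListener

class SteeringListener(context: Context) :
    OrientationEventListener(context, SensorManager.SENSOR_DELAY_GAME) {

    override fun onOrientationChanged(orientation: Int) {
        // orientation is 0..359 degrees clockwise from the device's natural
        // position, or ORIENTATION_UNKNOWN when the device is nearly flat.
        if (orientation == ORIENTATION_UNKNOWN) return
        // Fold into a signed tilt around 0 and clamp to a steering range.
        val tilt = if (orientation > 180) orientation - 360 else orientation
        val steering = tilt.coerceIn(-45, 45)
        // Feed `steering` into the game loop here.
    }
}

Enable it with SteeringListener(this).enable() (and call disable() in onPause()).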

Related

Measure sideways tilt of upright device

Some camera apps have a feature where they display a line on the screen which is always parallel to the horizon, no matter how the phone is tilted sideways. By "tilted sideways" I mean rotating the device around an axis that is perpendicular to the screen.
I have tried most of the "normal" rotation functions, such as combining the ROTATION_VECTOR sensor with getRotationMatrix() and getOrientation(), but none of the resulting axes seem to correspond to the one I'm looking for.
I also tried using only the accelerometer, normalizing all three axes and detecting how much gravity is in the X axis. This works decently when the device is perfectly upright (i.e. not tilted forward/backward). But as soon as it's tilted forward or backward, the measured sideways tilt gets increasingly inaccurate, since gravity is now acting on two axes at the same time.
Any ideas on how to achieve this kind of sideways rotation detection in a way that works even if the phone is tilted/pitched slightly forward/backward?
The matrix returned by getRotationMatrix converts from the phone's local coordinate system to the world coordinate system. Its columns are therefore the principal axes of the phone, the first being the phone's X axis (the positive horizontal axis when holding the phone in portrait mode) and the second being the Y axis.
To obtain the horizon's direction on the phone's screen, find the line of intersection between the horizontal plane (world space) and the phone's plane. First find the coordinates of the world Z axis (pointing to the sky) in the phone's local basis, i.e. transpose(R) * [0, 0, 1]; the XY part of this, given by R[2][0], R[2][1], is the vertical direction in screen space. The required horizon line direction is then R[2][1], -R[2][0].
When the phone is close to being horizontal, this vector becomes very small in magnitude and the horizon is no longer well-defined. Simply stop updating the horizon line below some threshold.
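A minimal Kotlin sketch of this, assuming rotationMatrix is the flat 9-element array filled by SensorManager.getRotationMatrix() (the 0.1 threshold is a guess to be tuned):

import kotlin.math.hypot

// Returns the horizon direction in screen space, or null when the
// phone is nearly horizontal and the horizon is ill-defined.
fun horizonDirection(rotationMatrix: FloatArray): Pair<Float, Float>? {
    // Third row of R = world Z (sky) expressed in the phone's basis.
    val upX = rotationMatrix[6] // R[2][0]
    val upY = rotationMatrix[7] // R[2][1]
    if (hypot(upX, upY) < 0.1f) return null
    // The horizon is perpendicular to the screen-space "up" vector.
    return Pair(upY, -upX)
}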

Accelerometer x and y values are exchanged on smartphones and tablets

I use the accelerometer in my app, and the activity is always in landscape mode. The X axis is used for speeding up (imagine something like a racing game) and the Y axis is used to steer left and right. Now I have the following problem:
On my smartphone in landscape mode the X axis is the shorter one and the Y axis the longer one. The camera is positioned at the shorter side.
On my tablet the axes are swapped, and the camera is on the longer side. That means the X axis is the shorter one (right/left) and the Y axis is the longer one (backward/forward), so the user needs to tilt the tablet left and right for speed and backward and forward for steering. That is not what I want.
Can anyone tell me a way to find out whether the axes are swapped? I thought about detecting whether the device is a tablet or a smartphone, but there are also tablets where the axes (and so the camera position) are the same as on my smartphone.
Edit: I made a little picture with Paint (I know I'm not a good painter^^). I hope you now understand my problem better. The top one is the smartphone, the bottom one is the tablet, both in landscape mode. The circle is the camera, which shows that the Y axis is always where the camera is.
I solved my problem by getting the default device orientation and using it to determine whether I need to use X or Y for speed.
See https://stackoverflow.com/a/9888357/1794338 for getting the default device orientation.
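A Kotlin sketch of the approach from that link, which compares the current display rotation with the current Configuration to infer the device's natural orientation (defaultDisplay is deprecated on newer APIs but keeps the example short):

import android.content.Context
import android.content.res.Configuration
import android.view.Surface
import android.view.WindowManager

fun defaultDeviceOrientation(context: Context): Int {
    val wm = context.getSystemService(Context.WINDOW_SERVICE) as WindowManager
    val config = context.resources.configuration
    val rotation = wm.defaultDisplay.rotation
    val landscapeAtNaturalRotation =
        (rotation == Surface.ROTATION_0 || rotation == Surface.ROTATION_180) &&
            config.orientation == Configuration.ORIENTATION_LANDSCAPE
    val portraitAtRotatedPosition =
        (rotation == Surface.ROTATION_90 || rotation == Surface.ROTATION_270) &&
            config.orientation == Configuration.ORIENTATION_PORTRAIT
    return if (landscapeAtNaturalRotation || portraitAtRotatedPosition)
        Configuration.ORIENTATION_LANDSCAPE // tablet-style device
    else
        Configuration.ORIENTATION_PORTRAIT // phone-style device
}

A device whose default orientation is landscape needs the X/Y roles swapped relative to a portrait-default phone.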

Implementing an accurate compass with Android [duplicate]

I'm using the Android gravity and magnetic field sensors to calculate orientation via SensorManager.getRotationMatrix and SensorManager.getOrientation. This gives me the azimuth, pitch and roll numbers. The results look sensible when the device is lying flat on a table.
However, I've disabled switching between portrait and landscape in the manifest, so that getWindowManager().getDefaultDisplay().getRotation() is always zero. When I rotate the device by 90 degrees so that it's standing vertically, I run into trouble. Sometimes the numbers seem quite wrong, and I've realised that this relates to gimbal lock. However, other apps don't seem to have this problem. For example, I've compared my app against two free sensor test apps (Sensor Tester (Dicotomica) and Sensor Monitoring (R's Software)). My app agrees with these apps when the device is flat, but as I rotate the device into the vertical position there can be significant differences. The two apps seem to agree with each other, so how do they get around this problem?
When the device is not flat, you have to call remapCoordinateSystem(inR, AXIS_X, AXIS_Z, outR); before calling getOrientation.
The azimuth returned by getOrientation is obtained by orthogonally projecting the device's unit Y axis into the world East-North plane and then calculating the angle between the resulting projection and the North axis.
Now we normally think of direction as the direction in which the back camera is pointing. That is the direction of -Z, where Z is the device axis pointing out of the screen. When the device is flat we do not think of direction and accept whatever is given. But when it is not flat, we expect it to be the direction of -Z. Since getOrientation calculates the direction of the Y axis, we need to swap the Y and Z axes before calling getOrientation. That is exactly what remapCoordinateSystem(inR, AXIS_X, AXIS_Z, outR) does: it keeps the X axis intact and maps Z to Y.
So how do you know when to remap and when not to? You can decide by checking the inclination:
float inclination = (float) Math.acos(rotationMatrix[8]);
if (inclination < TWENTY_FIVE_DEGREE_IN_RADIAN
        || inclination > ONE_FIFTY_FIVE_DEGREE_IN_RADIAN) {
    // Device is (nearly) flat: just call getOrientation.
    SensorManager.getOrientation(rotationMatrix, orientation);
} else {
    // Device is upright: remap before calling getOrientation.
    SensorManager.remapCoordinateSystem(rotationMatrix,
            SensorManager.AXIS_X, SensorManager.AXIS_Z, remappedMatrix);
    SensorManager.getOrientation(remappedMatrix, orientation);
}
The inclination above is the angle between the device screen and the world East-North plane. It shows how much the device is tilting.
I think the best way of defining your orientation angles when the device isn't flat is to use a more appropriate angular co-ordinate system than the standard Euler angles that you get from SensorManager.getOrientation(...). I suggest the one that I describe here on math.stackexchange.com. I've also put some code that implements it in an answer here. Apart from a good definition of azimuth, it also has a definition of the pitch angle, which is exactly the angle given by Math.acos(rotationMatrix[8]) that is mentioned in another answer here.
You can get full details from the two links that I've given in the first paragraph. However, in summary, your rotation matrix R from SensorManager.getRotationMatrix(...) is

R = | Ex  Ey  Ez |
    | Nx  Ny  Nz |
    | Gx  Gy  Gz |

where (Ex, Ey, Ez), (Nx, Ny, Nz) and (Gx, Gy, Gz) are the vectors pointing due East, due North, and in the direction of gravity, each expressed in device coordinates. Since the device's -Z axis (the direction the back camera points) then has East component -Ez and North component -Nz in world coordinates, the azimuth angle that you want is given by

azimuth = atan2(-Ez, -Nz)
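In code, with the flat 9-element matrix from SensorManager.getRotationMatrix, Ez and Nz sit at indices 2 and 5, so a Kotlin sketch of this (under the row layout above; cameraAzimuth is an illustrative name) is:

import kotlin.math.atan2

// Heading of the back camera (-Z axis), clockwise from North, in radians.
fun cameraAzimuth(r: FloatArray): Float = atan2(-r[2], -r[5])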
I'll add an example that worked for me, using Hoan's answer, and also highlight some new resources that are available now and may help when looking at apps that need to use the gyro sensors.
Firstly, there is a Google codelab available with a sample app - I found the completed sample app very useful for understanding a device's behaviour.
The app displays the azimuth, pitch and roll values on screen. The codelab and the final app version are available at these links (at the time of writing):
https://developer.android.com/codelabs/advanced-android-training-sensor-orientation#0
https://github.com/google-developer-training/android-advanced/tree/master/TiltSpot
In particular, this allows you to experiment as follows (it's easier to do than to describe...):
Put your device down flat on a table and experiment with rotating it on the table while flat, then lifting the top and bottom (i.e. lift the top of the display, where your front camera typically is, from the table while leaving the bottom, where your home button typically is if you have one, on the table). Similarly, lift the left and right edges of the device. Observe the azimuth, pitch and roll values in the app.
Put your device against a wardrobe door in portrait mode and again experiment with moving it around. Open the door to see the effect too. Rotate to landscape to see the difference.
The key thing I think you may see from the above is that everything is very simple and works well when flat and everything is complex and interrelated when not flat.
Looking specifically at a use case I had, displaying the pitch and roll when vertical in portrait mode, the following worked for me:
when (event?.sensor?.type) {
    Sensor.TYPE_ROTATION_VECTOR -> {
        // Calculate the rotation matrix from the rotation vector.
        val rotMatrix = FloatArray(9)
        val rotationMatrixAdjusted = FloatArray(9)
        val orientation = FloatArray(3)
        SensorManager.getRotationMatrixFromVector(rotMatrix, event.values)
        // Remap for a device held vertically in portrait mode.
        SensorManager.remapCoordinateSystem(rotMatrix,
            SensorManager.AXIS_Y, SensorManager.AXIS_MINUS_X,
            rotationMatrixAdjusted)
        SensorManager.getOrientation(rotationMatrixAdjusted, orientation)
        // After this remap, indices 1 and 2 hold roll and pitch for this use case.
        val rollAngleRadians = orientation[1]
        val pitchAngleRadians = orientation[2]
        // Report results back to the listener.
        thisListener.onRollPitchEvent(rollAngleRadians, pitchAngleRadians)
    }
}
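For completeness, the block above runs inside onSensorChanged() of a SensorEventListener; registration (e.g. in an Activity's onResume()) looks something like this, where SENSOR_DELAY_UI is just a reasonable default:

val sensorManager = getSystemService(Context.SENSOR_SERVICE) as SensorManager
sensorManager.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR)?.also { sensor ->
    sensorManager.registerListener(this, sensor, SensorManager.SENSOR_DELAY_UI)
}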

Unity device movement

I'm trying to move a game object when I raise/lower (shake) my phone, but I don't know how to get the device's movement. I already know about Input.acceleration, but that just gives me the device's rotation, and I want its actual movement.
Is this possible and how would I go about doing it?
The accelerometer reads the sum of changes in movement (acceleration) plus the constant force of gravity. You can use it for directional movement in two ways:
1. Detect changes of gravity angle - when the device is not moving (or is moving at a constant speed) and is parallel to the ground, the accelerometer will read Earth's gravity, i.e. new Vector3(0, -9.81, 0). When the device is tilted (so not parallel to the ground), the vector's length will still be 9.81, but it will be rotated a bit, like new Vector3(3.03, -9.32, 0) (this example is a rotation in one axis by 18 degrees, i.e. pi/10 radians). Using this will yield this kind of controls: https://www.youtube.com/watch?v=3EU3ip4k0uE
2. Detect peaks of acceleration - when the device is still or moving with constant speed, the length of the acceleration vector will be equal to 9.81; when it changes or starts moving, this number will change. You can detect these changes and interpret them as a one-time momentary movement, like pressing an arrow (see the sketch below).
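Though the question is about Unity, the peak-detection idea in #2 looks roughly like this with Android's sensor API in Kotlin (the ShakeDetector name and the 3 m/s^2 threshold are illustrative):

import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager
import kotlin.math.abs
import kotlin.math.sqrt

// Fires when the measured acceleration magnitude deviates from gravity
// by more than a threshold, i.e. the device started or stopped moving.
class ShakeDetector(private val onShake: () -> Unit) : SensorEventListener {
    private val threshold = 3f // m/s^2, tune per device

    override fun onSensorChanged(event: SensorEvent) {
        if (event.sensor.type != Sensor.TYPE_ACCELEROMETER) return
        val (x, y, z) = event.values
        val magnitude = sqrt(x * x + y * y + z * z)
        if (abs(magnitude - SensorManager.GRAVITY_EARTH) > threshold) {
            onShake() // momentary movement detected
        }
    }

    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) = Unit
}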
There are alternatives that don't use an accelerometer; for example, you can detect a printed marker with Vuforia: https://www.youtube.com/watch?v=8RnlBEU5tkI - and interpret the relative position to the marker as an action, in a similar fashion to how you'd detect acceleration change in #1.

World reference values for the accelerometer sensor when walking

I have been trying to develop a Pedestrian Dead Reckoning application for Android, and after taking care of the step detection and step length components, I have decided to tackle the orientation determination problem.
After stumbling on a couple of posts regarding coordinate transformation (and even chatting with a frequent answerer), I have been getting gradually better results, but there are still some things that bother me.
The experiment:
I walked forward Northward, turned back, and walked back Southward. I then repeated the procedure towards West, then East.
Issues:
1. I expected, while walking straight in several directions, to have the X and Y values oscillate with the footsteps, and to have a relatively stable Z value throughout. Instead, only the Y values behave this way, with the Z value having its expected behavior. How come? Does it have anything to do with me not using remapCoordinateSystem()? (see Graph 1)
2. I expected the angle plots to jump around 180º and -180º, but why do they also do it around 35º? (see Graph 2)
Notes:
I am using gravity and magnetometer values to compute the rotation matrix, and multiplying it using OpenGL's multiplyMV() (a sketch follows these notes);
I am not using remapCoordinateSystem(), because I thought I didn't need to: the phone is upright in my pocket (Y points up/down, Z usually forward) and should displace itself 45º backwards and forwards, at worst;
Azimuth values seem OK, and do not have the oscillation described in issue 2. (see Graph 3)
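A minimal Kotlin sketch of that device-to-world transformation, assuming gravity and geomagnetic hold the latest TYPE_GRAVITY and TYPE_MAGNETIC_FIELD readings (the toWorldFrame name is illustrative):

import android.hardware.SensorManager
import android.opengl.Matrix

// Rotate a device-frame vector (e.g. an acceleration) into world
// coordinates (X = East, Y = North, Z = Up).
fun toWorldFrame(accel: FloatArray, gravity: FloatArray, geomagnetic: FloatArray): FloatArray? {
    val r = FloatArray(16) // getRotationMatrix also accepts a 4x4 output array
    if (!SensorManager.getRotationMatrix(r, null, gravity, geomagnetic)) return null
    // getRotationMatrix fills the array row-major, while android.opengl.Matrix
    // expects column-major, so transpose before multiplying.
    val rT = FloatArray(16)
    Matrix.transposeM(rT, 0, r, 0)
    val deviceVec = floatArrayOf(accel[0], accel[1], accel[2], 0f) // w = 0: a direction
    val worldVec = FloatArray(4)
    Matrix.multiplyMV(worldVec, 0, rT, 0, deviceVec, 0)
    return worldVec // [East, North, Up, 0]
}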
Graphs:
Graph 1 - World Reference Gravity Acceleration (blue is X, red is Y, green is Z)
Graph 2 - World Reference Gravity Angles (blue is atan2(Y/X), red is atan2(Z/Y), green is atan2(Z/X))
Graph 3 - Orientation Values (blue is azimuth, red is pitch, green is roll)
