Measure sideways tilt of upright device - android

Some camera apps have a feature where they display a line on the screen which is always parallel to the horizon, no matter how the phone is tilted sideways. By "tilted sideways" I mean rotating the device around an axis that is perpendicular to the screen.
I have tried most of the "normal" rotation functions, such as combining the ROTATION_VECTOR sensor with getRotationMatrix() and getOrientation(), but none of the resulting axes seem to correspond to the one I'm looking for.
I also tried using only the accelerometer, normalizing all three axes and detecting how much gravity is in the X-axis. This works decently when the device is perfectly upright (i.e. not tilted forward/backward). But as soon as it's tilted forward or backward, the measured sideways tilt gets increasingly inaccurate, since gravity is now acting on two axes at the same time.
Any ideas on how to achieve this kind of sideways rotation detection in a way that works even if the phone is tilted/pitched slightly forward/backward?

The result of getRotationMatrix converts from the phone's local coordinate system to the world coordinate system. Its columns are therefore the principal axes of the phone, with the first being the phone's X-axis (the positive horizontal axis when holding the phone in portrait mode), and the second being the Y-axis.
To obtain the horizon's direction on the phone's screen, find the line of intersection between the horizontal plane (world space) and the phone's plane. First find the coordinates of the world Z-axis (pointing to the sky) in the phone's local basis, i.e. transpose(R) * [0, 0, 1]; the XY coordinates of this, given by R[2][0] and R[2][1], are the vertical direction in screen space. The required horizon line direction is then (R[2][1], -R[2][0]).
When the phone is close to horizontal, this vector becomes very small in magnitude and the horizon is no longer well-defined. Simply stop updating the horizon line when the magnitude falls below some threshold.
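A minimal Java sketch of the above, assuming it lives inside a SensorEventListener registered for the ROTATION_VECTOR sensor, row-major indexing of the float[9] rotation matrix (so R[2][0] and R[2][1] are elements 6 and 7), and an arbitrary 0.1 magnitude threshold:

    private final float[] rotationMatrix = new float[9];

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() != Sensor.TYPE_ROTATION_VECTOR) return;
        SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);

        // World Z-axis ("up") expressed in phone coordinates = third row of R.
        float upX = rotationMatrix[6];   // R[2][0]
        float upY = rotationMatrix[7];   // R[2][1]

        // The horizon direction on screen is perpendicular to (upX, upY).
        float horizonX = upY;
        float horizonY = -upX;

        // Near-horizontal phone: the horizon is ill-defined, keep the previous line.
        if (Math.hypot(horizonX, horizonY) > 0.1) {
            float angleDeg = (float) Math.toDegrees(Math.atan2(horizonY, horizonX));
            // rotate the on-screen line to angleDeg here
        }
    }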


Android accelerometer mapping movement to custom coordinate system

If I have a custom coordinate system, X - left/right, Y - forward/backward, Z - up/down, that is represented on my PC screen inside my Unreal project, how would I map the accelerometer values so that when I move my phone toward the PC screen (regardless of the phone's orientation) my Y value goes up, and likewise for the other axes?
I got something similar working with rotation by taking a "reference" rotation quaternion, inverting it and multiplying it by the current rotation quaternion, but I'm stuck on how to transform movement.
An example of my problem: if I move my phone up with the screen pointing at the sky, my Z value increases, which is what I want; but when I point my phone's screen at my PC screen and move it forward, Z again goes up, whereas in this case I would want my Y value to increase.
There is a similar question, Acceleration from device's coordinate system into absolute coordinate system, but that doesn't really solve my problem since I don't want to depend on the location of north for Y, and so on.
Clarification of question intent
It sounds like what you want is the acceleration of your device with respect to your laptop. As you correctly mentioned, the similar question Acceleration from device's coordinate system into absolute coordinate system maps the local accelerometer data of a device with respect to a global frame of reference (FoR) (the Cartesian "flat" Earth FoR to be specific - as opposed to the ultra-realistic spherical Earth FoR).
What you know
From your device, you know the local Phone FoR, and from the link above, you can also find the behavior of your device with respect to a flat Earth FoR with a rotation matrix, which I'll call R_EP for Rotation in Earth FoR from Phone FoR. In order to represent the acceleration of your device with respect to your laptop, you will need to know how your laptop is oriented and positioned with respect to either your phone's FoR (A), the flat Earth FoR (B), or some other FoR that is known to both your laptop and your phone (which I'll ignore, because the method is identical to B).
What you'll need
In the first case, A, this will allow you to construct a rotation matrix which I'll call R_LP for Rotation in Laptop FoR from Phone FoR - and that would be super convenient because that's your answer. But alas, life isn't fun without a little bit of a challenge.
In the second case, B, this will allow you to construct a rotation matrix which I'll call R_LE for Rotation in Laptop FoR from Earth FoR. Because the Hamilton product is associative (but NOT commutative: Are quaternions generally multiplied in an order opposite to matrices?), you can find the acceleration of your phone with respect to your laptop by daisy-chaining the rotations, like so:
a_P]L = R_LE * R_EP * a_P]P
Where the ] means "in the frame of", and a_P is acceleration of the Phone. So a_P]L is the acceleration of the Phone in the Laptop FoR, and a_P]P is the acceleration of the Phone in the Phone's FoR.
NOTE When "daisy-chaining" rotation matrices, it's important that they follow a specific order. Always make sure that the rotation matrices are multiplied in the correct order, see Sections 2.6 and 3.1.4 in [1] for more information.
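A minimal sketch of that daisy-chain in Java, where rLE and rEP are placeholder names for row-major 3x3 rotation matrices you have already obtained (e.g. via SensorManager.getRotationMatrixFromVector() and the calibration step described below):

    // Multiply a row-major 3x3 rotation matrix by a 3-vector.
    static float[] rotate(float[] r, float[] v) {
        return new float[] {
            r[0] * v[0] + r[1] * v[1] + r[2] * v[2],
            r[3] * v[0] + r[4] * v[1] + r[5] * v[2],
            r[6] * v[0] + r[7] * v[1] + r[8] * v[2]
        };
    }

    // a_P]L = R_LE * R_EP * a_P]P  -- note the order: the rightmost rotation is applied first.
    float[] accelInLaptop = rotate(rLE, rotate(rEP, accelInPhone));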
Hint
To define your laptop's FoR (orientation and position) with respect to the global "flat" Earth FoR, you can place your phone on your laptop and set the current orientation and position as your laptop's FoR. This will let you construct R_LE.
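One hedged way to realize that hint in code: while the phone is lying on the laptop, capture its rotation matrix (which at that moment is R_EL, the laptop's orientation in the Earth FoR) and invert it by transposition, since rotation matrices are orthogonal. The names rEL and rLE below are placeholders:

    // The inverse of a rotation matrix is its transpose (row-major 3x3).
    static float[] transpose(float[] r) {
        return new float[] {
            r[0], r[3], r[6],
            r[1], r[4], r[7],
            r[2], r[5], r[8]
        };
    }

    // rEL: rotation matrix captured while the phone rests on the laptop.
    float[] rLE = transpose(rEL); // fixed Laptop-from-Earth rotation, reused for every later sample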
Misconceptions
A rotation quaternion, q, is NEITHER the orientation NOR attitude of one frame of reference relative to another. Instead, it represents a "midpoint" vector normal to the rotation plane about which vectors from one frame of reference are rotated to the other. This is why defining quaternions to rotate from a GLOBAL frame to a local frame (or vice-versa) is incredibly important. The ENU to NED rotation is a perfect example, where the rotation quaternion is [0; sqrt(2)/2; sqrt(2)/2; 0], a "midpoint" between the two abscissa (X) axes (in both the global and local frames of reference). If you do the "right hand rule" with your three fingers pointing along the ENU orientation, and rapidly switch back and forth from the NED orientation, you'll see that the rotation from both FoR's is simply a rotation about [1; 1; 0] in the Global FoR.
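As a quick sanity check of that ENU/NED example (using the [w; x; y; z] ordering from [1]): expanding q = [0; sqrt(2)/2; sqrt(2)/2; 0] into its rotation matrix gives

    R(q) = [ 0  1  0 ]
           [ 1  0  0 ]
           [ 0  0 -1 ]

which swaps the first two axes and negates the third, i.e. it maps East-North-Up exactly onto North-East-Down, consistent with a 180-degree rotation about [1; 1; 0].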
References
I cannot recommend the following open-source reference highly enough:
[1] "Quaternion kinematics for the error-state Kalman filter" by Joan SolĂ . https://hal.archives-ouvertes.fr/hal-01122406v5
For a "playground" to experiment with, and gain a "hands-on" understanding of quaternions:
[2] Visualizing quaternions, An explorable video series. Lessons by Grant Sanderson. Technology by Ben Eater https://eater.net/quaternions

Unity device movement

I'm trying to move a game object when I raise/lower (shake) my phone, but I don't know how to get the device's movement. I already know about Input.acceleration, but that just gives me the device's rotation and I want its actual movement.
Is this possible and how would I go about doing it?
The accelerometer reads the sum of changes in movement (acceleration) plus the constant force of gravity. You can use it for directional movement in two ways:
Detect changes of the gravity angle - when the device is not moving (or is moving at a constant speed) and is parallel to the ground, the accelerometer will read Earth's gravity, i.e. new Vector3(0, -9.81, 0). When the device is tilted (so not parallel to the ground), the vector's length will still be 9.81, but it will be rotated a bit, like new Vector3(3.03, -9.32, 0) (this example is a rotation in one axis by 18 degrees, i.e. pi/10). Using this will yield this kind of controls: https://www.youtube.com/watch?v=3EU3ip4k0uE
Detect peaks of acceleration. When the device is still or moving with constant speed, the length of the acceleration vector will equal 9.81; when it changes or starts moving, this number will change. You can detect these changes and interpret them as a one-time, momentary movement (like pressing an arrow key) - see the sketch below.
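A minimal sketch of this second approach, written for Android's sensor API in Java rather than Unity (the same magnitude test applies to Unity's Input.acceleration); the shake threshold is an arbitrary value that will need tuning per device:

    private static final float SHAKE_THRESHOLD = 2.0f; // m/s^2 above/below gravity, tune as needed

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() != Sensor.TYPE_ACCELEROMETER) return;
        float x = event.values[0], y = event.values[1], z = event.values[2];
        float magnitude = (float) Math.sqrt(x * x + y * y + z * z);
        // At rest (or constant velocity) the magnitude is ~9.81; a large deviation is a "shake".
        if (Math.abs(magnitude - SensorManager.GRAVITY_EARTH) > SHAKE_THRESHOLD) {
            // momentary movement detected - treat it like a one-off input event
        }
    }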
There are alternatives that don't use an accelerometer; for example, you can detect a printed marker with Vuforia: https://www.youtube.com/watch?v=8RnlBEU5tkI - and interpret the relative position to the marker as an action, in a similar fashion to how you'd detect an acceleration change in #1.

What is the meaning of the android accelerometer values?

I am trying to understand how to use the data from the accelerometer.
When the phone is moved from the horizontal through 180 degrees, the values of the z-axis go from +g to -g (0 when vertical).
If I move the phone smoothly and slowly from the vertical to the left, the values go from 0 to +g. However, if I move the phone sharply to the left, the values first go negative, presumably due to acceleration.
So, as negative values can represent different situations, how can I tell the difference between negative values due to acceleration to the left and negative values due to tilting to the right?
The accelerometer values correspond to the acceleration felt on that axis of the phone at any given time. For example, when the phone is in a normal upright position you will find a value of one g in the downward direction. You'll need to utilize all 3 axes in order to accurately track the phone's orientation, since gravity will act on a different axis when the phone is rotated.
Sharp movements are due to additional acceleration caused by the force of your movement. Try printing out the values for each axis twice a second or so while you move the phone around very slowly, and you'll get a feel for what the values mean.
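A small Java sketch of that suggestion, assuming it sits inside a SensorEventListener registered for the accelerometer (the 500 ms throttle is an arbitrary choice):

    private long lastLogMs = 0;

    @Override
    public void onSensorChanged(SensorEvent event) {
        long now = System.currentTimeMillis();
        if (now - lastLogMs < 500) return; // roughly twice a second
        lastLogMs = now;
        Log.d("Accel", String.format("x=%.2f  y=%.2f  z=%.2f",
                event.values[0], event.values[1], event.values[2]));
    }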

Vertical movement sensor

I am working on an android app that requires the detection of vertical motion. When moving the tablet upward, the Gyroscope, Accelerometer, and Linear Acceleration sensors give a corresponding value indicating upward or downward motion.
The problem I have is that these sensors will also read an upward/downward motion when you tilt the tablet towards the user or away from the user. For example, the x value in the gyroscope represents the vertical plane. But when you tilt the device forwards, the x value will change.
When I make this motion, the same sensor that reads vertical motion reads a value for this.
The same goes for the rest of the sensors. I have tried to use orientation coupled with the gyro to make a conditional statement: if the pitch is not changing but the x variable is going up/down, then we have vertical motion. The problem with this is that if the user moves it up but tilted slightly, it no longer works. I also tried making it so that if there is a change in tilt, then there is no vertical motion. But it iterates so quickly that there may be a change in tilt for one hundredth of a second, and for the next there isn't.
Is there any way I can read only vertical changes and not changes in the devices pitch?
Here is what I want to detect:
edit:
"Please come up with a mathematically sound definition of what you consider 'moving upwards.'"
This was my initial question: how can I write a function to define when the tablet is moving upwards or downwards? I consider a vertical translation to be moving upwards. Now how do I detect this? I simply do not know where to begin, thank you.
Ok, even though this question is fairly old, I see a lot of confusion in the present answer and comments, so in case anyone finds this, I intend to clear a few things up.
The Gyroscope
First of all, the gyroscope does not measure vertical motion as per your definition (a translatory motion). It measures rotation around each of the axes, which are defined as in the figure below. Thus, tilting your device forwards and backwards indeed rotates it around the x axis, and therefore you will see non-zero values in the x value of your gyroscope sensor.
the x value in the gyroscope represents the vertical plane.
I'm not sure what is meant by "the vertical plane", however the x value certainly does not represent the plane itself nor the orientation of the device within the plane.
The x value of the gyroscope sensor represents the current angular velocity of the device around the x axis (i.e. the rate of change of its rotation).
But when you tilt the device forwards, the x value will change. When I make this motion, the same sensor that reads vertical motion reads a value for this.
Not quite sure what you're referring to here. "The same sensor that reads vertical motion" I assume is the gyroscope, but as previously said, it does not read vertical motion. It does exactly what it says on the tin.
The device coordinate system
This is more in response to user Ali's answer than the original question, but it remains relevant in either case.
The individual outputs of the linear acceleration sensor (or any other sensor for that matter) are expressed in the coordinate system of the device, as shown in the image above. This means if you rotate the device slightly, the outputs will no longer be parallel to any world axis they coincided with before. As such, you will either have to enforce that the device is in a particular orientation for your application, or take the new orientation into account.
The ROTATION_VECTOR sensor, combined with quaternion math or the getRotationMatrixFromVector() method, is one way to translate your measurements from device coordinates to world coordinates. There are other ways to achieve the same goal, but once achieved, the way you hold your device won't matter for measuring vertical motion.
In either case, the axis you're looking for is the y axis, not the z axis.
(If by any chance you meant "along device y axis" as "vertical", then just ignore all the orientation stuff and just use the linear acceleration sensor)
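A minimal sketch of that device-to-world translation, assuming a single listener registered for both TYPE_ROTATION_VECTOR and TYPE_LINEAR_ACCELERATION; the rotation matrix is row-major and maps device coordinates to world coordinates:

    private final float[] rotationMatrix = new float[9];

    @Override
    public void onSensorChanged(SensorEvent event) {
        switch (event.sensor.getType()) {
            case Sensor.TYPE_ROTATION_VECTOR:
                SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);
                break;
            case Sensor.TYPE_LINEAR_ACCELERATION:
                float[] a = event.values; // device coordinates
                // world = R * device: the result no longer depends on how the device is held.
                float wx = rotationMatrix[0] * a[0] + rotationMatrix[1] * a[1] + rotationMatrix[2] * a[2];
                float wy = rotationMatrix[3] * a[0] + rotationMatrix[4] * a[1] + rotationMatrix[5] * a[2];
                float wz = rotationMatrix[6] * a[0] + rotationMatrix[7] * a[1] + rotationMatrix[8] * a[2];
                // use the world-frame components for vertical-motion detection
                break;
        }
    }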
Noise
You mentioned some problems regarding noise and update rates in the question, so I'll just mention it here. The simplest and one of the more common ways to get nice, consistent data from something that varies very often is to use a low-pass filter. What type of filter is best depends on the application, but I find that an exponential moving average filter is viable in most cases.
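For example, a minimal exponential moving average in Java (the smoothing factor ALPHA is an assumption; smaller values smooth more but lag more):

    private static final float ALPHA = 0.1f; // 0 < ALPHA <= 1
    private float filtered = 0f;

    float lowPass(float raw) {
        // Blend each new reading into the running average.
        filtered += ALPHA * (raw - filtered);
        return filtered;
    }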
Finishing thoughts
Note that if you take proper care of the orientation, your transformed linear acceleration output will be a good approximation of vertical motion (well, change in motion) even without filtering out any noise.
Also, if you want to measure vertical "motion", as in velocity, you need to integrate the accelerometer output. For various reasons, this doesn't really turn out too well in most cases, although it is less severe for velocity than when trying to measure position.
OK, I suspect it is only a partial answer.
If you want to detect vertical movement, you only need linear acceleration; the device orientation doesn't matter. See
iOS - How to tell if device is raised/dropped (CoreMotion)
or
how to calculate phone's movement in the vertical direction from rest?
For some reason you are concerned with the device orientation as well, and I have no idea why. I suspect that you want to detect something else. So please tell us more and then I will improve my answer.
UPDATE
I read the post on CoreMotion, and you mentioned that higher z and lower x and y means vertical motion; can you elaborate?
I will write in pseudo code. You measured the (x, y, z) linear acceleration vector. Compute
rel_z = z/sqrt(x^2+y^2+z^2+1.0e-6)
If rel_z > 0.9 then the acceleration towards the z direction dominates (vertical motion). Note that the constant 0.9 is arbitrary and may require tweaking (should be a positive number less than 1). The 1.0e-6 is there to avoid accidental division by zero.
You may have to add another constraint that z is sufficiently large. I don't know your device, whether it measures gravity as 1 or 9.81. I assume it measures it as 1.
So all in all:
if (rel_z > 0.9 && abs(z) > 0.1) { // we have vertical movement
Again, the constant 0.1 is arbitrary and may require tweaking. It should be positive.
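A direct Java translation of the pseudo code, keeping the same arbitrary constants and the stated assumption that gravity reads as 1 rather than 9.81:

    static boolean isVerticalMovement(float x, float y, float z) {
        // The 1.0e-6 term guards against division by zero when the device is at rest.
        double relZ = z / Math.sqrt(x * x + y * y + z * z + 1.0e-6);
        return relZ > 0.9 && Math.abs(z) > 0.1; // both thresholds may need tweaking
    }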
UPDATE 2
I do not want this because rotating it towards me is not moving it upwards
It is moving upwards: The center of mass is moving upwards. My code has the correct behavior.
Please come up with a mathematically sound definition of what you consider "moving upwards."

Android - Steering with orientation sensor

I am writing a simple 2d game similar to breakout or pong, where you move a bar along the bottom of the screen by tipping the phone left and right.
I am trying to use the Orientation sensor to achieve this movement; however, I've run into some issues.
After printing out the orientation sensor's values for a while, I decided rotation in the Z axis gave me what I wanted. If I hold the screen horizontal and perpendicular to the floor (x: 0, y: 0, z: 90), rotating in Z left and right takes away from 90, so the value 90 - z is fine for use as an "amount of steering" value.
However, this perpendicular-to-the-floor position is not very natural to play with; people are much more likely to hold the phone at about 45 degrees in the Z axis, at which point all these values I have been using mess up completely, steering goes weird, and people are unhappy.
I guess what I really need is to detect rotation around some z-axis even though the XY plane the phone is in is constantly changing. Is there some clever way to do this with maths?
EDIT: Just found the OrientationEventListener - Exactly what I needed!
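For anyone landing here later, a minimal sketch of that OrientationEventListener approach, assuming the code runs inside an Activity (so this is a Context); mapping the reported angle to a signed steering value is an illustrative choice:

    OrientationEventListener listener = new OrientationEventListener(this, SensorManager.SENSOR_DELAY_GAME) {
        @Override
        public void onOrientationChanged(int orientation) {
            if (orientation == OrientationEventListener.ORIENTATION_UNKNOWN) return; // device ~flat
            // orientation is 0..359 degrees, 0 = natural portrait; fold it into a signed angle.
            int steering = (orientation <= 180) ? orientation : orientation - 360;
            // use steering to move the bar
        }
    };
    listener.enable();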
