I am confused about the axis directions of Android phones. According to Google's documentation, the three axes are defined as follows:
"The X axis is horizontal and points to the right, the Y axis is vertical and points up and the Z axis points towards the outside of the front face of the screen."
So I think this means that if I put my phone flat on a table, screen up, I should record a negative Z value, because gravity points down, right? However, whether I use an app or my own code, I only ever get a positive Z value. The same thing happens on the other two axes. Can anyone explain this? I tried it on three different phones (Nexus 4, Nexus 5, and HTC One). Thanks.
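This sign behaviour matches how the sensor is specified: the accelerometer reports specific force (the device's acceleration minus gravity) in device coordinates, so a phone at rest reads the negated gravity vector, which is +g on Z when lying screen-up. A minimal plain-Java sketch of that sign convention (the `readingAtRest` helper is hypothetical, for illustration only):

```java
public class AccelerometerSign {
    static final float G = 9.81f;

    // The accelerometer reports specific force: device acceleration minus
    // gravity, in device axes. At rest the acceleration is zero, so the
    // sensor reading is simply the negated gravity vector.
    static float[] readingAtRest(float[] gravityInDeviceAxes) {
        return new float[] {
            -gravityInDeviceAxes[0],
            -gravityInDeviceAxes[1],
            -gravityInDeviceAxes[2],
        };
    }

    public static void main(String[] args) {
        // Flat on a table, screen up: gravity points along -Z in device
        // axes, so the reported Z value comes out positive.
        float[] reading = readingAtRest(new float[] {0f, 0f, -G});
        System.out.println(java.util.Arrays.toString(reading));
    }
}
```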
Related
Some camera apps have a feature where they display a line on the screen which is always parallel to the horizon, no matter how the phone is tilted sideways. By "tilted sideways" I mean rotating the device around an axis that is perpendicular to the screen.
I have tried most of the "normal" rotation functions, such as combining the ROTATION_VECTOR sensor with getRotationMatrix() and getOrientation(), but none of the resulting axes seem to correspond to the one I'm looking for.
I also tried using only the accelerometer, normalizing all three axes and detecting how much gravity is in the X-axis. This works decently when the device is perfectly upright (i.e. not tilted forward/backward). But as soon as it's tilted forward or backward, the measured sideways tilt gets increasingly inaccurate, since gravity is now acting on two axes at the same time.
Any ideas on how to achieve this kind of sideways rotation detection in a way that works even if the phone is tilted/pitched slightly forward/backward?
The result of getRotationMatrix converts from the phone's local coordinate system to the world coordinate system. Its columns are therefore the principal axes of the phone, with the first being the phone's X-axis (the +ve horizontal axis when holding the phone in portrait mode), and the second being the Y.
To obtain the horizon's direction on the phone's screen, the line of intersection between the horizontal plane (world space) and the phone's plane must be found. First find the coordinates of the world Z-axis (pointing to the sky) in the phone's local basis, i.e. transpose(R) * [0, 0, 1]; the XY coordinates of this, given by R[2][0], R[2][1], give the vertical direction in screen space. The required horizon line direction is then R[2][1], -R[2][0].
When the phone is close to horizontal, this vector becomes very small in magnitude and the horizon is no longer well-defined. Simply stop updating the horizon line below some threshold.
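The steps above can be sketched in plain Java, with R as the row-major 3x3 matrix filled by SensorManager.getRotationMatrix() (so R[2][0] is R[6] and R[2][1] is R[7] in flat indexing; the helper names are mine):

```java
public class Horizon {
    // R converts phone coordinates to world coordinates (row-major 3x3).
    // Its third row (R[6], R[7], R[8]) is the world Z-axis (sky) expressed
    // in phone coordinates, i.e. transpose(R) * [0, 0, 1].
    static float[] horizonDirection(float[] R) {
        // Screen-space "up" is (R[6], R[7]); the horizon is perpendicular.
        return new float[] { R[7], -R[6] };
    }

    // When the phone is near horizontal, (R[6], R[7]) shrinks toward zero
    // and the horizon becomes ill-defined: stop updating below a threshold.
    static boolean horizonWellDefined(float[] R, float threshold) {
        return Math.hypot(R[6], R[7]) > threshold;
    }
}
```

With the phone held upright in portrait, the world Z-axis maps to the phone's +Y axis and the horizon comes out horizontal in screen space, as expected.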
I use the accelerometer in my app and the activity is always in landscape mode. The x axis is used for speeding up (imagine something like a racing game) and the y axis is used to control direction left and right. Now I have the following problem:
On my smartphone in landscape mode the x axis is the shorter one and the y axis the longer one. The camera is positioned at the shorter side.
On my tablet the axes are swapped, and the camera is on the longer side. That means the x axis is the shorter one (right/left) and the y axis is the longer one (backward/forward), so the user needs to tilt the tablet left and right for speed and backwards and forwards to control direction. That is not what I want.
Can anyone tell me a way to find out whether the axes are swapped? I thought about detecting whether the device is a tablet or a smartphone, but there are also tablets where the axes (and so the camera position) are the same as on my smartphone.
Edit: I made a little picture with paint (I know I'm not a good painter^^). I hope you now understand my problem better. Top one is the smartphone, bottom one is the tablet - both in landscape mode. The circle is the camera which shows that the y-Axis is always where the camera is.
I solved my problem by getting the default device orientation and using it to determine whether I need to use x or y for speed.
See https://stackoverflow.com/a/9888357/1794338 for getting the default device orientation
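The idea behind the linked answer can be sketched as a pure function of the current display rotation and configuration orientation. The constants are inlined here so the sketch compiles off-device; on Android you would pass `Display.getRotation()` and `Configuration.orientation` and use the real `Surface.ROTATION_*` / `Configuration.ORIENTATION_*` constants:

```java
public class DefaultOrientation {
    // Mirrors of Surface.ROTATION_* and Configuration.ORIENTATION_*.
    static final int ROTATION_0 = 0, ROTATION_90 = 1,
                     ROTATION_180 = 2, ROTATION_270 = 3;
    static final int ORIENTATION_PORTRAIT = 1, ORIENTATION_LANDSCAPE = 2;

    // If the display is at its natural rotation (0 or 180) the current
    // orientation IS the natural one; if it is rotated 90/270, the natural
    // orientation is the opposite of what is currently showing.
    static int defaultOrientation(int rotation, int currentOrientation) {
        boolean natural = (rotation == ROTATION_0 || rotation == ROTATION_180);
        if ((natural && currentOrientation == ORIENTATION_LANDSCAPE)
                || (!natural && currentOrientation == ORIENTATION_PORTRAIT)) {
            return ORIENTATION_LANDSCAPE;
        }
        return ORIENTATION_PORTRAIT;
    }
}
```

A landscape-natural tablet reports ORIENTATION_LANDSCAPE here, so the app knows to swap which accelerometer axis drives speed.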
I am trying to understand how to use the data from the accelerometer.
When the phone is moved from horizontal through 180 degrees, the z-axis values go from +g to -g (0 when vertical).
If I move the phone smoothly, and slowly, from the vertical to the left the values go from 0 to +g. However, if I move the phone sharply, to the left, the values first go negative, presumably due to acceleration.
So, as negative values can represent different situations, how can I tell the difference between negative values due to acceleration to the left and negative values due to tilting to the right?
The accelerometer values correspond to the acceleration felt on each axis of the phone at any given time. For example, when the phone is in a normal upright position you will find a value of one g in the downward direction. You'll need to use all 3 axes in order to accurately track the phone's orientation, since gravity will act on a different axis when the phone is rotated.
Sharp movements are due to additional acceleration caused by the force of your movement. Try printing out the values for each axis twice a second or so while you move the phone around very slowly, and you'll get a feel for what the values mean.
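One common way to separate the two cases is a low-pass filter: the slowly varying component of the raw signal tracks gravity (tilt), and subtracting it leaves the linear acceleration caused by sharp movement. A plain-Java sketch (ALPHA is a tuning assumption, not a fixed Android value):

```java
public class GravityFilter {
    static final float ALPHA = 0.8f; // smoothing factor; tune per update rate
    float[] gravity = new float[3];

    // Low-pass the raw accelerometer sample to estimate gravity (tilt);
    // the residual is linear acceleration (deliberate movement).
    float[] separate(float[] accel) {
        float[] linear = new float[3];
        for (int i = 0; i < 3; i++) {
            gravity[i] = ALPHA * gravity[i] + (1 - ALPHA) * accel[i];
            linear[i] = accel[i] - gravity[i];
        }
        return linear;
    }
}
```

A sharp move to the left shows up in the linear component, while a slow tilt to the right only shifts the gravity estimate, which is how the two negative-value situations can be told apart.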
I have been trying to develop a Pedestrian Dead Reckoning application for Android, and after taking care of the step detection and step length components, I have decided to tackle the orientation determination problem.
After stumbling on a couple of posts regarding coordinate transformation (and even chatting with a frequent answerer), I have been getting gradually better results, but there are still some things that bother me.
The experiment:
I walked forward Northward, turned back, and walked back Southward, then repeated the procedure towards West and then East.
Issues:
1. I expected, while walking straight in several directions, to have the X and Y values oscillate with the footsteps, and a relatively stable Z value throughout. Instead, the Y values behave this way, with the Z value having the expected behavior. How come? Does it have anything to do with me not using remapCoordinates()? (see Graph 1)
2. I expected the angle plots to jump around 180º and -180º, but why do they also do it around 35º? (see Graph 2)
Notes:
- I am using gravity and magnetometer values to compute the rotation matrix, and multiplying with it using OpenGL's multiplyMV();
- I am not using remapCoordinates(), because I thought I didn't need to: the phone is upright in my pocket (Y points up/down, Z usually forward) and should tilt 45º backwards and forwards, at worst;
- Azimuth values seem OK, and do not show the oscillation described in issue 2. (see Graph 3)
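The transformation described in the notes amounts to multiplying the device-frame vector by the rotation matrix from getRotationMatrix(). A plain-Java sketch of the 3x3 equivalent of that multiplyMV() call (the helper name is mine):

```java
public class WorldFrame {
    // world = R * device, where R is the row-major 3x3 matrix filled by
    // SensorManager.getRotationMatrix(R, I, gravity, geomagnetic).
    static float[] toWorld(float[] R, float[] v) {
        return new float[] {
            R[0] * v[0] + R[1] * v[1] + R[2] * v[2],
            R[3] * v[0] + R[4] * v[1] + R[5] * v[2],
            R[6] * v[0] + R[7] * v[1] + R[8] * v[2],
        };
    }
}
```

With the phone upright in a pocket (device Y pointing up), a device-frame vertical vector should land on the world Z-axis after this multiply; if it consistently lands elsewhere, the rotation matrix or the axis assumptions are the place to look.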
Graphs:
World Reference Gravity Acceleration
(blue is X, red is Y, green is Z)
World Reference Gravity Angles
(blue is atan2(Y/X), red is atan2(Z/Y) and green is atan2(Z/X) )
Orientation Values
(blue is azimuth, red is pitch and green is roll)
I am writing a simple 2d game similar to breakout or pong, where you move a bar along the bottom of the screen by tipping the phone left and right.
I am trying to use the Orientation sensor to achieve this movement, however I've run into some issues.
After printing out the orientation sensor's values for a while, I decided that rotation in the Z axis gave me what I wanted: if I hold the screen horizontal and perpendicular to the floor (x: 0, y: 0, z: 90), rotating left and right in Z takes away from 90, so the value 90-z works fine as an "amount of steering" value.
However, this perpendicular-to-the-floor position is not very natural to play with; people are much more likely to hold the phone at about 45 degrees in the Z axis, at which point all the values I have been using mess up completely, steering goes weird, and people are unhappy.
I guess what I really need is to detect rotation in some z-axis even though the XY plane the phone is in is constantly changing. Is there some clever way to do this with Maths?
EDIT: Just found the OrientationEventListener - Exactly what I needed!
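OrientationEventListener reports, via onOrientationChanged(), the device's rotation in degrees (0-359) away from its natural orientation, or ORIENTATION_UNKNOWN when the device is too flat, which sidesteps the pitch problem above. A small helper to turn that angle into a signed steering amount (the neutral angle of 90º for a landscape game is my assumption, not part of the API; the constant is mirrored so the sketch compiles off-device):

```java
public class Steering {
    // Mirrors OrientationEventListener.ORIENTATION_UNKNOWN.
    static final int ORIENTATION_UNKNOWN = -1;
    static final int NEUTRAL_DEG = 90; // assumed landscape "neutral" angle

    // Map the 0-359 angle reported by onOrientationChanged() to a signed
    // steering amount in [-180, 180), or null when the angle is unknown.
    static Integer steering(int orientationDeg) {
        if (orientationDeg == ORIENTATION_UNKNOWN) return null;
        int d = orientationDeg - NEUTRAL_DEG;
        if (d >= 180) d -= 360;
        if (d < -180) d += 360;
        return d;
    }
}
```

On-device this would be called from an anonymous OrientationEventListener subclass, enabled only after canDetectOrientation() returns true.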