How to use Android Accelerometer to control a character?

Say I have a character walking around in 2D space. Let's say my phone is lying flat on a table in landscape orientation.
If I tilt the phone away from me, the character should start moving up. If I tilt it towards me she should start moving down. Same goes for right and left.
The reason I ask here is because I found Google's explanation rather confusing.
http://developer.android.com/reference/android/hardware/SensorManager.html#getRotationMatrix%28float%5b%5d,%20float%5b%5d,%20float%5b%5d,%20float%5b%5d%29
This link implies the x and y are relative to compass coordinates? I can't imagine that's how the accelerometer works. I just want to do this relative to the phone being tilted on a certain axis.
For example, if the phone tilts away from me, it should be easy to say "the phone is tilting ___ radians positively on the y axis." Then I should just be able to use trig to calculate the acceleration on my character.
I guess my real question is: how do I read the accelerometer and determine what angle the phone is tilted to about a given axis? This image details how I currently think the axes are laid out on the phone.
I'm sure this has been asked before, so a link to a good existing answer is awesome as well.
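For what it's worth, here is a minimal sketch of how this is often done with a plain TYPE_ACCELEROMETER listener. The axis signs depend on device orientation, and moveCharacter() is a hypothetical stand-in for your game code:

    import android.hardware.Sensor;
    import android.hardware.SensorEvent;
    import android.hardware.SensorEventListener;

    public class TiltListener implements SensorEventListener {

        @Override
        public void onSensorChanged(SensorEvent event) {
            if (event.sensor.getType() != Sensor.TYPE_ACCELEROMETER) return;

            // With the phone flat on the table, gravity sits almost entirely
            // on z; tilting redistributes it onto x and y.
            float ax = event.values[0]; // left/right tilt
            float ay = event.values[1]; // toward/away tilt
            float az = event.values[2]; // ~9.81 m/s^2 when flat

            // Tilt angle about each axis, in radians (0 when flat).
            double pitch = Math.atan2(ay, az); // tilting away from/toward you
            double roll  = Math.atan2(ax, az); // tilting left/right

            // Hypothetical game hook: map tilt angles to character velocity.
            // moveCharacter((float) roll, (float) pitch);
        }

        @Override
        public void onAccuracyChanged(Sensor sensor, int accuracy) { /* unused */ }
    }

Register it with SensorManager.registerListener(listener, sensor, SensorManager.SENSOR_DELAY_GAME). With the phone flat on the table both angles start near zero and grow as you tilt, so no compass coordinates are involved.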

Related

How to calculate angle of a device keeping device parallel to the ground

I am trying to find the compass-type angle, i.e. when the device is kept parallel to the ground and you rotate it about the axis passing vertically through the table, e.g. the one mentioned in How to detect android device rotation parallel to the ground?
The problem with that already-answered question is that one of its links isn't working and the other uses TYPE_ORIENTATION, which is deprecated.
Currently I am using TYPE_ROTATION_VECTOR, which gives roll (holding the device facing you and tilting it left and right) and pitch (holding the device facing you and tilting it back toward the table). These are calculated accurately. But the azimuth angle that TYPE_ROTATION_VECTOR provides (again holding the device facing you, then rotating it about the line going from floor to ceiling) is not accurate.
So, how do I accurately calculate the angle when the device is lying on a table and you rotate it about the line passing from floor to ceiling?
Apologies if I am unclear; it's difficult to explain in words. If this isn't clear, I will try to explain with images.
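For reference, here is a minimal sketch of the standard way to pull the azimuth out of TYPE_ROTATION_VECTOR while the device lies flat, via getRotationMatrixFromVector() and getOrientation(). It shows the extraction being discussed, not a fix for the accuracy problem:

    import android.hardware.Sensor;
    import android.hardware.SensorEvent;
    import android.hardware.SensorEventListener;
    import android.hardware.SensorManager;

    public class FlatAzimuthListener implements SensorEventListener {

        private final float[] rotationMatrix = new float[9];
        private final float[] orientation = new float[3];

        @Override
        public void onSensorChanged(SensorEvent event) {
            if (event.sensor.getType() != Sensor.TYPE_ROTATION_VECTOR) return;

            SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);
            SensorManager.getOrientation(rotationMatrix, orientation);

            // orientation[0] is the azimuth: rotation about the floor-to-ceiling
            // axis, which is exactly the angle in question while the device
            // lies flat on the table.
            float azimuthDeg = (float) Math.toDegrees(orientation[0]); // -180..180
        }

        @Override
        public void onAccuracyChanged(Sensor sensor, int accuracy) { /* unused */ }
    }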

Does Android's "Game Rotation Vector" sensor ignore rotation around vertical axis?

I'm developing an app that uses Android sensors to help vehicles navigate in an indoor location. As part of my evaluation process of different sensors, I wanted to try the "rotation vector" sensors. For various reasons, magnetic field readings are not very useful for my location, so thus I wanted to try the "Game Rotation Vector" sensor (sensor fusion, available from API level 18 and later). The description states that it is identical to the regular Rotation Vector sensor except no magnetic field information is used to correct for gyroscope drift around the vertical axis.
When looking for information about the Rotation Vector sensors, I came across an example from Google, where they demonstrate the Rotation Vector sensor using a 3D cube. It works pretty well, except for being very sensitive to local magnetic fields (and, since I am far north where the horizontal component of the field is very small, it is even worse for me).
Since long-term drift can be compensated for with other reference data (map information), I wanted to use the Game Rotation Vector sensor for my app. However, when I changed all references from "TYPE_ROTATION_VECTOR" to "TYPE_GAME_ROTATION_VECTOR" in the example code, the cube no longer reacted to rotations around the vertical axis (e.g. me spinning my chair while holding the device in front of me). Tilting the device in the other two directions moved the cube. I also noticed the cube was a lot more "laggy" this time around, reacting very slowly to any movement.
Is this the way the Game Rotation Vector sensor is supposed to work (i.e. ignoring any Z-axis rotations)? It would kind of make sense, since a gamer playing in the back seat shouldn't be affected by the vehicle turning, but at the same time it differs from the description provided by Google (my first link). From the description I was under the impression that it would drift slowly, not ignore rotation altogether.
I would be deeply grateful for any input on this issue.
Best Regards,
John
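(For context, the swap described above is nothing more than registering for the other sensor type; a minimal sketch, where the cube-renderer call is a hypothetical stand-in for the example app's rendering code:)

    // sensorManager obtained earlier via getSystemService(SENSOR_SERVICE).
    Sensor gameRv = sensorManager.getDefaultSensor(Sensor.TYPE_GAME_ROTATION_VECTOR);
    sensorManager.registerListener(new SensorEventListener() {
        private final float[] rotationMatrix = new float[16];

        @Override
        public void onSensorChanged(SensorEvent event) {
            // Same conversion as for TYPE_ROTATION_VECTOR; only the source differs.
            SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);
            // cubeRenderer.setRotationMatrix(rotationMatrix); // hypothetical hook
        }

        @Override
        public void onAccuracyChanged(Sensor sensor, int accuracy) { }
    }, gameRv, SensorManager.SENSOR_DELAY_GAME);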
Ok, just in case anyone happens to find this, here are my findings:
The Game Rotation Vector sensor does detect rotation around the vertical axis. It is quite accurate in most situations.
However, it has a couple of issues. First, while lying still it shows accelerating horizontal drift (whereas a purely gyroscope-based orientation drifts only linearly). For my device, the Game Rotation Vector started out well, but the drift accelerated and finally exceeded 400 degrees over the course of an hour.
Secondly, and even more disturbing, it does not seem to ignore magnetic fields, contrary to the official description (linked in the question). I tried driving around the parking lot with my device fixed on the passenger seat, and the Game Rotation Vector lagged far behind (it was more than 180 degrees off after one full rotation over 40 seconds), while integrated gyroscope data was accurate to within a few degrees. It also showed changes in rotation while the gyroscope was hovering around zero, suggesting that it was in fact compensating for a change in (what I presume to be) the magnetic field.
I still don't know why it acted weird in the test app I linked to before, but I have since decided to use a complementary filter to combine accelerometer and gyro data instead.
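For anyone taking the same route, here is a minimal complementary-filter sketch along those lines. ALPHA and the axis conventions are assumptions that need tuning per device, and note that the accelerometer can only correct pitch and roll, not yaw:

    import android.hardware.Sensor;
    import android.hardware.SensorEvent;
    import android.hardware.SensorEventListener;

    public class ComplementaryFilter implements SensorEventListener {

        private static final float ALPHA = 0.98f;              // gyro weight
        private static final float NS2S = 1.0f / 1_000_000_000.0f; // ns -> s

        private float pitch, roll;     // filtered angles, radians
        private long lastGyroTimestamp;

        @Override
        public void onSensorChanged(SensorEvent event) {
            switch (event.sensor.getType()) {
                case Sensor.TYPE_GYROSCOPE:
                    if (lastGyroTimestamp != 0) {
                        float dt = (event.timestamp - lastGyroTimestamp) * NS2S;
                        // Integrate angular rate (rad/s) into the angle estimates.
                        pitch += event.values[0] * dt;
                        roll  += event.values[1] * dt;
                    }
                    lastGyroTimestamp = event.timestamp;
                    break;

                case Sensor.TYPE_ACCELEROMETER:
                    float ax = event.values[0], ay = event.values[1], az = event.values[2];
                    // The gravity direction gives an absolute (if noisy) pitch/roll.
                    float accPitch = (float) Math.atan2(ay, Math.hypot(ax, az));
                    float accRoll  = (float) Math.atan2(-ax, az);
                    // Blend: mostly gyro for responsiveness, nudged toward the
                    // accelerometer estimate to cancel drift.
                    pitch = ALPHA * pitch + (1 - ALPHA) * accPitch;
                    roll  = ALPHA * roll  + (1 - ALPHA) * accRoll;
                    break;
            }
        }

        @Override
        public void onAccuracyChanged(Sensor sensor, int accuracy) { /* unused */ }
    }

Register the same listener for both TYPE_GYROSCOPE and TYPE_ACCELEROMETER.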

Android Convert device coordinate system to "user" coordinate system

My question is similar to Changing sensor coordinate system in android
I want to be able to compare a user's movements with each other regardless of device orientation. So that when the users holds out the phone in portrait orientation and bends his arm, acceleration readings are the same as when he holds out his phone in landscape and then bends his arm in the same direction.
This is what I call the "user" coordinate system. It differs from the world coordinate system in that it should not matter which compass direction the user is facing, and from device coordinates in that it should not matter how the user holds the device.
It is acceptable in my application to do a calibration step before each movement so the base/resting orientation matrices can be determined. Is it perhaps just a matter of multiplying the matrix of the first movement with the inverse of the second (and then with the new values)?
The answer in the question mentioned seems about right, but I need a more concrete explanation, actual code samples would be ideal.
Note that remapCoordinateSystem won't suffice; it only accepts right angles. I need to be able to work with small deviations, since the device is strapped to a wrist, which might not always be at right angles to the arm.
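The calibration-matrix idea from the question can be sketched like this, assuming rotation-vector readings are available for the resting pose and during the movement (the helper name relativeRotation is mine):

    import android.hardware.SensorManager;
    import android.opengl.Matrix;

    // Both parameters are assumed to be the values[] of TYPE_ROTATION_VECTOR
    // (or TYPE_GAME_ROTATION_VECTOR) events.
    static float[] relativeRotation(float[] restingRotationVector,
                                    float[] currentRotationVector) {
        float[] calibration = new float[16];   // resting pose, captured once
        float[] current = new float[16];       // pose during the movement
        float[] calibrationInv = new float[16];
        float[] relative = new float[16];

        SensorManager.getRotationMatrixFromVector(calibration, restingRotationVector);
        SensorManager.getRotationMatrixFromVector(current, currentRotationVector);

        // A rotation matrix is orthonormal, so its inverse is its transpose.
        Matrix.transposeM(calibrationInv, 0, calibration, 0);
        Matrix.multiplyMM(relative, 0, calibrationInv, 0, current, 0);

        // "relative" describes the movement independent of the resting pose,
        // including the small deviations that remapCoordinateSystem cannot handle.
        return relative;
    }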
I'm currently working on this issue, and I think this might help:
convert device coordinate system to world coordinate system
We can assume that most of the time people use the phone standing, walking, or sitting, which means the user coordinate system shares the same z-axis (gravity) with the world coordinate system, and there is a fixed difference in degrees between the y-axis of the user coordinates (the direction the user's face points) and the y-axis of the world coordinates (north). That difference can be obtained via the TYPE_MAGNETIC_FIELD sensor. Thus we can transform from world coordinates to user coordinates.
What about a user using the phone while lying in bed? For that kind of case, I think a pre-calibration is needed to define the y-axis of the user coordinates, e.g. a movement that tells the phone which direction the user's face points.
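A sketch of that transformation: rotate device-frame acceleration into the world frame, then undo the user's fixed heading offset so the y-axis points out of the user's face. Here gravity and geomagnetic are assumed to be the latest TYPE_GRAVITY and TYPE_MAGNETIC_FIELD readings, and headingOffsetRad is the measured (or pre-calibrated) difference described above:

    import android.hardware.SensorManager;

    static float[] deviceToUser(float[] accelDevice, float[] gravity,
                                float[] geomagnetic, double headingOffsetRad) {
        float[] r = new float[9];
        if (!SensorManager.getRotationMatrix(r, null, gravity, geomagnetic)) {
            return null; // unreliable sample (e.g. device in free fall)
        }

        // Device -> world: the rotation matrix maps device coordinates to
        // world coordinates (x = east, y = north, z = up).
        float[] world = new float[3];
        for (int i = 0; i < 3; i++) {
            world[i] = r[3 * i] * accelDevice[0]
                     + r[3 * i + 1] * accelDevice[1]
                     + r[3 * i + 2] * accelDevice[2];
        }

        // World -> user: rotate about the shared z-axis by the heading offset.
        double c = Math.cos(headingOffsetRad), s = Math.sin(headingOffsetRad);
        return new float[] {
            (float) (c * world[0] - s * world[1]),
            (float) (s * world[0] + c * world[1]),
            world[2]
        };
    }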

Determining heading in Inertial Navigation Systems

I have a question regarding inertial navigation with a mobile device.
I am using an Android tablet for development, but I think the question applies to all types of hardware (even hardware with better sensors).
The most basic question when developing an inertial system is how to determine the
direction of the carrier's movement.
Even if we assume that the magnetometer readings are 100% accurate (which they obviously are not!), there is still the question of the device's orientation relative to the user.
Simple example: if the user is walking north but holds the device with its Y axis pointing north-east (a link to a picture of the different axes: http://developer.android.com/reference/android/hardware/SensorEvent.html),
then the magnetometer will point towards north-east.
How can we tell which way the user is actually heading?
(The same will be true if we use both magnetometer and Gyro for determining heading)
A possible solution would be to use the accelerometer's Y-axis and X-axis readings,
something along the lines of arctan(aY / aX).
(For example, if the user holds the device perfectly straight, then the X axis will read nothing...)
But since the Accelerometer's readings are not stable, it is not so easy...
Does anyone know of an algorithm that actually works? I am sure this is a well known problem, but I can't seem to find references to solutions...
Thanks in advance!
Ariel
See this answer for an idea: by obtaining the acceleration values in relation to the earth, you can then use the atan2 function to compute the actual direction.
You mention the user holds the tablet, and I assume fairly stable (unlike a case I am working on, where the user moves the phone constantly). Yet, for some reason, the user may change the orientation of the device, and this may influence the readings you obtain.
Thus, in the event of an orientation change, you should call remapCoordinateSystem() accordingly to fix the readings you obtain.
NOTE: You can also use the getOrientation() method; the first element of the array it fills in represents the heading (azimuth).
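Putting those pieces together, here is a sketch of the earth-frame atan2 idea. linearAccelDevice is assumed to come from TYPE_LINEAR_ACCELERATION, and gravity/geomagnetic from TYPE_GRAVITY and TYPE_MAGNETIC_FIELD:

    import android.hardware.SensorManager;

    // Returns the heading of the acceleration in radians (0 = north,
    // pi/2 = east), regardless of how the device itself is held.
    static double headingRad(float[] linearAccelDevice,
                             float[] gravity, float[] geomagnetic) {
        float[] r = new float[9];
        if (!SensorManager.getRotationMatrix(r, null, gravity, geomagnetic)) {
            return Double.NaN; // unreliable sample
        }
        // Rotate the device-frame acceleration into the earth frame
        // (x = east, y = north).
        float east  = r[0] * linearAccelDevice[0] + r[1] * linearAccelDevice[1]
                    + r[2] * linearAccelDevice[2];
        float north = r[3] * linearAccelDevice[0] + r[4] * linearAccelDevice[1]
                    + r[5] * linearAccelDevice[2];
        return Math.atan2(east, north);
    }

As noted above, raw accelerometer readings are noisy, so in practice the result needs heavy filtering (e.g. averaging over a stride) before it resembles a walking direction.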
The really right answer is to leave the device sitting there for a while, detect the rotation of the earth, and then compute true north from that. Unfortunately, the gyros in a mobile phone aren't accurate enough for that....

Determining roll angle (up/down) when Android phone is horizontal?

When the Android phone is on its side (horizontal orientation), the roll represents the tilt, so to speak. When the phone is perpendicular to the ground (you are looking directly at the screen), the roll reads 90. However, when you start tilting it forward or backward, as if you wanted to look down or up, the angle just decreases either way.
This means looking up or down 45 degrees gives the same roll of 45 degrees.
How is it possible to know if you are rolling it forward or backward?
I have been looking around for an answer to this, and can't find anyone else with this problem. Judging from different apps, it seems to be possible, so I think I am just missing some relationship to something, and was hoping someone might be able to nudge me in the right direction. Thanks!
The way I fixed this was with the accelerometer data. Watch to see if the Z axis is positive or negative, and then adjust the roll value accordingly.
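A sketch of that fix; the exact sign convention is an assumption and may need flipping depending on how the device reports its axes:

    // Fold the ambiguous 0..90 roll reading into a continuous 0..180 range
    // using the sign of the accelerometer's z value.
    static float resolveRoll(float rawRollDeg, float accelZ) {
        // z >= 0: screen facing somewhat up (tilted backward) -> keep angle;
        // z <  0: screen facing somewhat down (tilted forward) -> mirror it.
        return accelZ >= 0 ? rawRollDeg : 180f - rawRollDeg;
    }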
