My question is similar to Changing sensor coordinate system in android
I want to be able to compare a user's movements with each other regardless of device orientation, so that when the user holds out the phone in portrait orientation and bends his arm, the acceleration readings are the same as when he holds out the phone in landscape and bends his arm in the same direction.
This is what I call the "user" coordinate system. It differs from the world coordinate system in that it should not matter which compass direction the user is facing, and from device coordinates in that it should not matter how the user holds the device.
It is acceptable in my application to perform a calibration step before each movement so that the base/resting orientation matrices can be determined. Is it perhaps just a matter of multiplying the rotation matrix of the first movement by the inverse of the second (and then applying that to the new readings)?
The answer in the linked question seems about right, but I need a more concrete explanation; actual code samples would be ideal.
Note that remapCoordinateSystem won't suffice: it only accepts right angles. I need to handle small deviations, since the device is strapped to a wrist, which might not always be at a right angle to the arm.
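To make the question concrete, here is an untested sketch of what I have in mind, assuming the calibration step captures a baseline rotation matrix via SensorManager.getRotationMatrix() (all names are mine):

    // Rbase: rotation matrix captured during calibration (baseline pose)
    // Rcur:  rotation matrix for the current sample
    // Both are 3x3 row-major arrays as produced by getRotationMatrix().
    float[] Rbase = new float[9];
    float[] Rcur = new float[9];
    float[] Rrel = new float[9]; // rotation from baseline pose to current pose

    // Rrel = Rcur * transpose(Rbase); a rotation matrix's inverse is its transpose.
    for (int row = 0; row < 3; row++) {
        for (int col = 0; col < 3; col++) {
            Rrel[3 * row + col] =
                    Rcur[3 * row] * Rbase[3 * col]
                    + Rcur[3 * row + 1] * Rbase[3 * col + 1]
                    + Rcur[3 * row + 2] * Rbase[3 * col + 2];
        }
    }

Multiplying each new acceleration vector by Rrel would then, if I understand correctly, express the readings relative to the calibrated pose.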
I'm currently working on this issue, and I think this might help:
convert device coordinate system to world coordinate system
We can assume that most of the time people use the phone while standing, walking, or sitting, which means the user coordinate system shares the same z-axis (gravity) with the world coordinate system, and there is a fixed angular difference between the user's y-axis (the direction the user's face points) and the world's y-axis (north). That angular difference can be obtained via the TYPE_MAGNETIC_FIELD sensor, so we can transform from world coordinates to user coordinates.
What about the user using the phone while lying in bed? I think for that kind of case a pre-calibration is needed to define the y-axis of the user coordinate system, e.g. a movement that tells the phone which direction the user's face is pointing.
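A rough sketch of that transform (userAzimuth, captured in such a calibration step, is the user's facing direction relative to north in radians; deviceAzimuth comes from SensorManager.getOrientation(); the sign convention may need flipping depending on your azimuth definition):

    // Rotate a world-frame horizontal vector (wx, wy) into the user frame
    // by the heading offset between device and user, around the shared
    // z-axis (gravity). Names are illustrative.
    static float[] worldToUser(float wx, float wy,
                               float deviceAzimuth, float userAzimuth) {
        double offset = deviceAzimuth - userAzimuth;
        float ux = (float) (wx * Math.cos(offset) + wy * Math.sin(offset));
        float uy = (float) (-wx * Math.sin(offset) + wy * Math.cos(offset));
        return new float[] { ux, uy };
    }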
Related
I am creating an app that will measure the acceleration of a vehicle along each axis using the accelerometer in an Android smartphone. I need to somehow rotate the phone's measurement coordinate system into the vehicle's coordinate system, so that the driver can put the phone in a holder even though the phone will then have some arbitrary rotation relative to the vehicle.
I started with a calibration process in which I tell the user to hold the phone so that it matches the vehicle's coordinate system, and I save the TYPE_GRAVITY sensor's X, Y, and Z values. Then I tell the user to put the phone in the holder and save the TYPE_GRAVITY sensor's X, Y, and Z values again.
Now I need to find the relation between those two vectors so I can use it to correct (rotate) the TYPE_LINEAR_ACCELERATION X, Y, Z data to match the vehicle's coordinate system.
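Here is an untested sketch of the kind of relation I mean, rotating one gravity vector onto the other with Rodrigues' formula (note that rotation around the gravity axis itself cannot be recovered this way):

    // Returns the 3x3 row-major rotation matrix that rotates unit(g1) onto unit(g2).
    // The anti-parallel case (g1 == -g2) is left out for brevity.
    static float[] rotationBetween(float[] g1, float[] g2) {
        double n1 = Math.sqrt(g1[0]*g1[0] + g1[1]*g1[1] + g1[2]*g1[2]);
        double n2 = Math.sqrt(g2[0]*g2[0] + g2[1]*g2[1] + g2[2]*g2[2]);
        double ax = g1[0]/n1, ay = g1[1]/n1, az = g1[2]/n1;
        double bx = g2[0]/n2, by = g2[1]/n2, bz = g2[2]/n2;
        double vx = ay*bz - az*by, vy = az*bx - ax*bz, vz = ax*by - ay*bx; // axis = a x b
        double s = Math.sqrt(vx*vx + vy*vy + vz*vz); // sin(angle)
        double c = ax*bx + ay*by + az*bz;            // cos(angle)
        if (s < 1e-9) return new float[] {1,0,0, 0,1,0, 0,0,1}; // already aligned
        vx /= s; vy /= s; vz /= s;
        double t = 1 - c;
        return new float[] {
            (float)(c + vx*vx*t),    (float)(vx*vy*t - vz*s), (float)(vx*vz*t + vy*s),
            (float)(vx*vy*t + vz*s), (float)(c + vy*vy*t),    (float)(vy*vz*t - vx*s),
            (float)(vx*vz*t - vy*s), (float)(vy*vz*t + vx*s), (float)(c + vz*vz*t)
        };
    }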
Using just a gravity sensor, you'll lack one rotational axis needed to compute what you want.
Imagine your phone being held vertically: the accelerometer will show the direction towards the bottom of the phone. If you now rotate it around the vertical axis, you'll still get the same accelerometer reading. This means you can't recover the rotation around the vertical axis this way.
One solution would be to use a gyroscope. This gives you the entire rotation of the phone, but it increases the hardware requirements of your app.
But the better solution, IMHO, would be to get rid of the calibration process entirely. Cars mostly move straight, with only occasional sideways acceleration, so you could scan your readings for a few seconds and find a 'main' axis and a 'side' axis, as sketched below.
Besides, your calibration process still depends on how precisely the user places the phone, so it may not work as well as you expect.
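A rough sketch of that axis-scanning idea, assuming you have collected a few seconds of horizontal linear-acceleration samples (the dominant eigenvector of their 2x2 covariance gives the 'main' axis; the 'side' axis is perpendicular to it):

    // Angle (radians) of the principal acceleration axis in the phone's x-y plane.
    // Assumes roughly zero-mean samples, which braking/accelerating data mostly is.
    static double mainAxisAngle(float[] xs, float[] ys) {
        double sxx = 0, sxy = 0, syy = 0;
        for (int i = 0; i < xs.length; i++) {
            sxx += xs[i] * xs[i];
            sxy += xs[i] * ys[i];
            syy += ys[i] * ys[i];
        }
        // principal eigenvector angle of [[sxx, sxy], [sxy, syy]]
        return 0.5 * Math.atan2(2 * sxy, sxx - syy);
    }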
I'm developing an app that uses Android sensors to help vehicles navigate in an indoor location. As part of my evaluation process of different sensors, I wanted to try the "rotation vector" sensors. For various reasons, magnetic field readings are not very useful for my location, so thus I wanted to try the "Game Rotation Vector" sensor (sensor fusion, available from API level 18 and later). The description states that it is identical to the regular Rotation Vector sensor except no magnetic field information is used to correct for gyroscope drift around the vertical axis.
When looking for information about the Rotation Vector sensors, I came across an example from Google, where they demonstrate the Rotation Vector sensor with a 3D cube. It works pretty well, except for being very sensitive to local magnetic fields (and, me being far north where the horizontal component of the field is very small, it's even worse).
Since long-term drift can be compensated for with other reference data (map information), I wanted to use the Game Rotation Vector sensor for my app. However, when I changed all references from "TYPE_ROTATION_VECTOR" to "TYPE_GAME_ROTATION_VECTOR" in the example code, the cube no longer reacted to rotations around the vertical axis (e.g. me spinning my chair while holding the device in front of me). Tilting the device in the other two directions moved the cube. I also noticed the cube was a lot more "laggy" this time, reacting very slowly to any movement.
Is this the way the Game Rotation Vector sensor is supposed to work (i.e. ignoring any rotation around the Z axis)? It would kind of make sense, since a gamer playing in the back seat shouldn't be affected by the vehicle turning, but at the same time it differs from the description provided by Google (my first link). From the description I was under the impression that it would drift slowly, not ignore the rotation altogether.
I would be deeply grateful for any input on this issue.
Best Regards,
John
Ok, just in case anyone happens to find this, here are my findings:
The Game Rotation Vector sensor does detect rotation around the vertical axis. It is quite accurate in most situations.
However, it has a couple of issues. First, while lying still it shows accelerating horizontal drift (whereas a purely gyroscope-based orientation drifts linearly). For my device, the Game Rotation Vector started out well, but the drift accelerated and finally exceeded 400 degrees over the course of an hour.
Secondly, and even more disturbing, it does not seem to ignore magnetic fields, contrary to the official description (linked in the question). I tried driving around a parking lot with my device fixed to the passenger seat, and the Game Rotation Vector fell far behind (it was more than 180 degrees off after one full rotation over 40 seconds), while integrated gyroscope data was accurate to within a few degrees. It also showed changes in rotation while the gyroscope was hovering around zero, suggesting that it was in fact compensating for a change in (what I presume to be) the magnetic field.
I still don't know why it acted weird in the test app I linked to before, but I have since decided to use a complementary filter to combine accelerometer and gyro data instead, along the lines of the sketch below.
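For anyone interested, here is a minimal single-axis sketch of such a filter (ALPHA is an assumed tuning value, not something from my tests):

    // Blend the integrated gyro rate with the accelerometer's tilt estimate.
    static final float ALPHA = 0.98f; // gyro weight; higher trusts the gyro more
    float angle = 0f;                 // filtered angle, radians
    long lastTimestampNs = 0;

    void update(float gyroRateRadPerSec, float accelAngleRad, long timestampNs) {
        if (lastTimestampNs != 0) {
            float dt = (timestampNs - lastTimestampNs) * 1e-9f; // ns -> s
            angle = ALPHA * (angle + gyroRateRadPerSec * dt)
                    + (1 - ALPHA) * accelAngleRad;
        }
        lastTimestampNs = timestampNs;
    }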
Say I have a character walking around in 2D space. Let's say my phone is flat on a table in landscape orientation.
If I tilt the phone away from me, the character should start moving up. If I tilt it towards me, she should start moving down. The same goes for right and left.
The reason I ask here is because I found Google's explanation rather confusing.
http://developer.android.com/reference/android/hardware/SensorManager.html#getRotationMatrix%28float%5b%5d,%20float%5b%5d,%20float%5b%5d,%20float%5b%5d%29
This link implies that x and y are relative to compass coordinates. I can't imagine that's how the accelerometer works; I just want to do this relative to the phone being tilted on a certain axis.
For example, should the phone tilt away from me, I feel it should be easy to say "the phone is tilting ___ radians positively on the y axis." Then I should just be able to use trig to calculate the acceleration of my character.
I guess my real question is: how do I read the accelerometer and determine the angle at which the phone is tilting on a given axis? This image details how I currently think the axes are laid out on the phone.
I'm sure this has been asked before, so a link to a good existing answer would be awesome as well.
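One minimal sketch of the accelerometer-plus-atan2 approach (the sign conventions and the mapping to character movement are assumptions and may need flipping):

    // With the phone flat, x and y read ~0 and z reads ~9.81 (gravity).
    // Tilting shifts gravity into x and y; atan2 recovers the tilt angles.
    @Override
    public void onSensorChanged(SensorEvent event) {
        float ax = event.values[0];
        float ay = event.values[1];
        float az = event.values[2];
        double pitch = Math.atan2(ay, Math.sqrt(ax * ax + az * az)); // toward/away tilt
        double roll  = Math.atan2(ax, Math.sqrt(ay * ay + az * az)); // left/right tilt
        // e.g. character speed proportional to the tilt:
        // character.vy = (float) (maxSpeed * Math.sin(pitch));
    }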
So, right now I'm grabbing the accelerometer data and converting it to a rough estimate of the angle at which the phone is being held. For now I'm focused on just the yaw axis.
My area of interest is between 0 and 45 degrees on the yaw axis, so I made a limited-size queue of the past 5 to 10 readings and compared the numbers to determine whether the angle is going up or down. That kind of works, but it is slow and not as precise or reliable as I'd want it to be.
Is there a way to determine which direction the phone is rotating using just the accelerometer (and, I guess, the magnetic field sensor), without keeping a history of past readings? I'm really new to sensor manipulation and Android in general, so any help understanding this would be great.
It's not clear exactly what you're looking for here: position or velocity. Generally speaking, you don't want to get a position measurement by integrating the accelerometer data; there's a lot of error associated with that calculation.
If you literally want the "direction your phone is rotating," rather than the angular position, you can get that directly from the gyroscope sensor, which provides rotational velocities: the sign of the velocity tells you the direction of rotation without storing any history. Be aware that not every phone has a gyroscope sensor, but newer ones generally do.
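A quick sketch of that gyroscope route (the axis index here is for rotation about the screen normal; adjust for the axis you care about):

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() == Sensor.TYPE_GYROSCOPE) {
            float yawRate = event.values[2];        // rad/s about the z axis
            boolean counterClockwise = yawRate > 0; // right-hand rule, screen facing up
        }
    }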
If you want the absolute orientation of the phone (position), you can use the Rotation Vector sensor. This is a fused sensor that automatically combines data from several physical sensors in one go and provides additional accuracy. From it you can get roll, pitch, and yaw with a single measurement. Basically, you first get your data from the TYPE_ROTATION_VECTOR sensor, then pass it to getRotationMatrixFromVector, and feed its output into getOrientation (see the same page as the previous link), which will give you roll-pitch-yaw measurements. You might need to rotate the axes around a bit to get the angles measured positive in the direction you want.
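In code, that pipeline looks roughly like this (listener boilerplate omitted):

    float[] rotationMatrix = new float[9];
    float[] orientation = new float[3];

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR) {
            SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);
            SensorManager.getOrientation(rotationMatrix, orientation);
            float azimuth = orientation[0]; // yaw, radians
            float pitch   = orientation[1]; // radians
            float roll    = orientation[2]; // radians
        }
    }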
I have a question regarding inertial navigation with a mobile device. I am using an Android tablet for development, but I think the question applies to all kinds of hardware (even with better sensors).
The most basic question when developing an inertial system is how to determine the direction of the carrier's movement.
Even if we assume the magnetometer readings are 100% accurate (which they obviously are not!), there is still the question of the device's orientation relative to the user.
A simple example: if the user is walking north but holds the device with its Y axis pointing north-east (a picture of the different axes: http://developer.android.com/reference/android/hardware/SensorEvent.html), then the magnetometer will point towards north-east.
How can we tell which way the user is actually heading?
(The same is true if we use both the magnetometer and the gyro to determine heading.)
A possible solution would be to use the accelerometer's Y-axis and X-axis readings, something in the direction of arctan(a_y / a_x). (For example, if the user holds the device perfectly straight, the X axis will show nothing.)
But since the accelerometer's readings are not stable, it is not so easy...
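One hedged way to make that workable would be to low-pass filter the horizontal components before taking the arctangent (ALPHA is an assumed tuning value):

    static final float ALPHA = 0.1f; // low-pass factor, tune experimentally
    float fx = 0f, fy = 0f;          // filtered horizontal acceleration

    void onAccelSample(float ax, float ay) {
        fx += ALPHA * (ax - fx);
        fy += ALPHA * (ay - fy);
        // offset between the device's y axis and the direction of motion
        double offset = Math.atan2(fx, fy);
    }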
Does anyone know of an algorithm that actually works? I am sure this is a well-known problem, but I can't seem to find references to solutions...
Thanks in advance!
Ariel
See this answer for an idea: by obtaining the acceleration values in relation to the earth, you can then use the atan2 function to compute the actual direction.
You mention that the user holds the tablet, and I assume fairly stably (unlike a case I am working on, where the user moves the phone constantly). Yet, for some reason, the user may change the orientation of the device, and this may influence the readings you obtain.
Thus, in the event of an orientation change, you should call remapCoordinateSystem() accordingly to fix the readings you obtain.
NOTE: You can also use the getOrientation() method; its first field is the heading direction (azimuth).
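A rough sketch of that flow (the AXIS_X/AXIS_Z remap here is just one example, for a device held upright; pick the remap that matches your mounting):

    float[] gravityValues = new float[3];  // latest TYPE_ACCELEROMETER values
    float[] magneticValues = new float[3]; // latest TYPE_MAGNETIC_FIELD values
    float[] inR = new float[9];
    float[] outR = new float[9];
    float[] I = new float[9];
    float[] orientation = new float[3];

    void computeHeading() {
        SensorManager.getRotationMatrix(inR, I, gravityValues, magneticValues);
        SensorManager.remapCoordinateSystem(inR,
                SensorManager.AXIS_X, SensorManager.AXIS_Z, outR);
        SensorManager.getOrientation(outR, orientation);
        float heading = orientation[0]; // first field: heading, radians from north
    }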
The really right answer is to leave the device sitting still for a while, detect the rotation of the earth, and then compute true north from that. Unfortunately, the gyros in a mobile phone aren't accurate enough for that...