Android TYPE_GAME_ROTATION_VECTOR reference

We have a project where we just switched our sensor type from TYPE_ROTATION_VECTOR to TYPE_GAME_ROTATION_VECTOR.
According to Google's docs:
Identical to TYPE_ROTATION_VECTOR except that it doesn't use the geomagnetic field. Therefore the Y axis doesn't point north, but instead to some other reference, that reference is allowed to drift by the same order of magnitude as the gyroscope drift around the Z axis.
My question is (being an Android noob): how can one 'calibrate' that reference to the device's current orientation, meaning the starting orientation the device is in before onSensorChanged comes into play?
What we need is orientation data relative to a reference frame, that frame being the device's initial orientation in space (so deltas are needed, not absolute rotations).
Any help is highly appreciated. I'm mainly an iOS developer, and this is all of one line of code there :S

how can one 'calibrate' that reference to the device's current orientation, meaning the starting orientation the device is in before onSensorChanged comes into play?
You could literally do the maths: measure the yaw/heading at the start and compare it to your current value over time. This number won't be affected by magnetic interference, unlike TYPE_ROTATION_VECTOR.
The trade-off is that your reference can drift.
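A minimal sketch of that approach, assuming a listener registered for TYPE_GAME_ROTATION_VECTOR (the field name is illustrative, and wrap-around at ±π is not handled):

private Float initialYaw = null; // yaw of the very first sample, i.e. the reference

@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() != Sensor.TYPE_GAME_ROTATION_VECTOR) return;

    float[] rotationMatrix = new float[9];
    float[] orientation = new float[3];
    SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);
    SensorManager.getOrientation(rotationMatrix, orientation);

    if (initialYaw == null) {
        initialYaw = orientation[0]; // first sample defines the reference frame
    }
    // Rotation relative to the starting orientation, in radians
    float deltaYaw = orientation[0] - initialYaw;
}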
Just answering this question because I was researching TYPE_GAME_ROTATION_VECTOR myself.
PS: I would guess the reason you'd switch to TYPE_GAME_ROTATION_VECTOR is to avoid magnetic interference from the surroundings. People usually play games indoors, where hard/soft iron can interfere with the compass.
TYPE_GAME_ROTATION_VECTOR docs
TYPE_ROTATION_VECTOR docs

Related

How to retrieve high quality compass orientation (as in Google Maps)?

All of the guides to getting compass orientation in Android I've found have a bug: when you hold the phone in portrait mode and "look" above the horizon, the compass arrow turns 180 degrees from the correct direction.
Google Maps orientation indicator doesn't have this problem.
Another nice thing Google Maps has is that it somehow estimates compass accuracy. Any idea how they do this?
The error you describe is caused by the use of Euler angles, i.e. gimbal lock.
To solve the large angle difference, check whether the device is flat by computing the inclination, either from the gravity sensor or from the accelerometer with a low-pass filter. If it is not flat (an AR application, or inclination > 25 || inclination < 155), remap the coordinate system from (x, y) to (x, z). That will solve the issue.
// Build a rotation matrix from the rotation vector sensor event
SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);
// Device is not flat: remap from (x, y) to (x, z) to avoid gimbal lock
SensorManager.remapCoordinateSystem(rotationMatrix, SensorManager.AXIS_X, SensorManager.AXIS_Z, remappedRM);
SensorManager.getOrientation(remappedRM, orientation);
This solves the gimbal lock. However, the 3D compasses I've seen on Play Store (ordinary compasses don't give correct results because they only map x,y and hit gimbal lock) and every code sample here show a difference when you hold your device lying flat versus with the screen pointing at you. That difference sometimes reaches 10 degrees. I haven't been able to solve it. Mostly the difference is 1-5 degrees, but I sometimes see it rise to 10 degrees, which is not acceptable.
Google measures azimuth from your location. There is a call to find the bearing:
currentLocation.getBearing();
The accuracy of your current location (lat/long) is what determines the accuracy of the bearing.
The ways of building a compass, ordered from most accurate to least:
Use the GPS/Wi-Fi (Google's fused) location and get the bearing from the location.
Use the Rotation Vector (requires the magnetic field sensor; you should check whether it's available). It's a fused sensor (magnetic field + gyroscope + accelerometer plus software) that uses a Kalman filter to smooth values, but check the inclination to remap the orientation.
Gravity/accelerometer + magnetic field sensor. This has terrible noise attached to it; to smooth it, use a moving average or low-pass filter (not for isolating gravity, but as a threshold frequency to prevent high jumps). A sketch of this approach follows below.
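A sketch of that third option, assuming listeners registered for TYPE_ACCELEROMETER and TYPE_MAGNETIC_FIELD (ALPHA is an illustrative smoothing factor you would tune per device):

private static final float ALPHA = 0.1f; // assumed smoothing factor
private final float[] accel = new float[3];
private final float[] magnet = new float[3];

@Override
public void onSensorChanged(SensorEvent event) {
    float[] target;
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) target = accel;
    else if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) target = magnet;
    else return;

    // Low-pass filter to suppress high jumps (not for isolating gravity)
    for (int i = 0; i < 3; i++) {
        target[i] += ALPHA * (event.values[i] - target[i]);
    }

    float[] rotationMatrix = new float[9];
    float[] orientation = new float[3];
    if (SensorManager.getRotationMatrix(rotationMatrix, null, accel, magnet)) {
        SensorManager.getOrientation(rotationMatrix, orientation);
        float azimuthDegrees = (float) Math.toDegrees(orientation[0]); // smoothed heading
    }
}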
Regarding the bug when looking above the horizon:
There are some tricky parts when you work with a compass.
The compass orientation depends on magnetic fields and the phone's orientation, so in order to correct the deviation you need to do some matrix operations.
Check this article, it provides a simple example -> https://www.journal.deviantdev.com/android-compass-azimuth-calculating/
Paraphrasing the article: if the device is not held flat (more than about 45° of deviation), you have to use remapCoordinateSystem() appropriately to get correct results.
Regarding accuracy:
First of all, accuracy relies on the physical components present in the device. Fragmentation is a common problem here: Samsung magnetometers differ from LG's, and the Android system is not going to abstract this away for you.
On the other hand, devices may rely on different types of sensors: pure hardware or sensor fusion. So you're going to experience differences between one device and another.
So this is a mess. But there are some techniques you can apply in order to get accurate, uniform data from sensors (not only the compass; probably GPS too).
Some people discard the first few seconds of sensor data. Some time is needed to calibrate the signal, so the first data retrieved tends to have some deviation. The time you need to discard depends on the manufacturer again, but I'd try 5 seconds or so.
Use interpolators and extrapolators. Many sensors in Android provide a way to retrieve data every so many milliseconds, but this is an abstraction provided by Android. Hardware sensors have their own timing and update the signal when they deem it necessary. The rest of the time, when Android asks the sensor for signal data, the sensor returns the last value, or perhaps some operation on the last signal data provided by the manufacturer (again).
So it is useful to have an abstraction layer (interpolator/extrapolator) which receives the data from the Android system every 20, 50, 1000... milliseconds. This layer performs some operations to achieve uniformity and then passes the data to your app.
The operation here could be some kind of average between the current and last values, an accumulated average, or another kind of normalization.
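A minimal sketch of such a layer, smoothing one channel with a simple moving average (the class name and window size are illustrative):

public class SmoothingLayer {
    private final float[] window; // ring buffer of recent samples
    private int index = 0;
    private int filled = 0;

    public SmoothingLayer(int windowSize) {
        window = new float[windowSize];
    }

    // Feed each raw sample; returns the moving average so far
    public float push(float value) {
        window[index] = value;
        index = (index + 1) % window.length;
        if (filled < window.length) filled++;
        float sum = 0f;
        for (int i = 0; i < filled; i++) sum += window[i];
        return sum / filled;
    }
}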
OK, I figured it out myself. First you need to calculate the bearing from the compass; then the Maps API v2 camera can be rotated.
public void updateCamera(float bearing) {
    // Rebuild the camera position with the new compass bearing; tilt gives a 3D view
    CameraPosition currentPlace = new CameraPosition.Builder()
            .target(new LatLng(centerLatitude, centerLongitude))
            .bearing(bearing).tilt(65.5f).zoom(18f).build();
    googleMap.moveCamera(CameraUpdateFactory.newCameraPosition(currentPlace));
}
Set a SensorEventListener in your code and call this method from the onSensorChanged event. I have added a tilt value so the map rotates in 3D.
Reference: Android Maps v2 rotate mapView with compass
For more details, see http://www.techotopia.com/index.php/Working_with_the_Google_Maps_Android_API_in_Android_Studio
Compasses will give you what you want, but they are noisy, for two reasons. First, they pick up real signal: we live in an environment that is magnetically very noisy, and the compass picks up everything that is magnetic. Second, the reading is not integrated, so it doesn't get the benefit of high-frequency noise being damped out. So try to combine your compass with gyroscope data. This video will help you a lot with using these sensors.
Some more details: you can combine accelerometers as well. In summary, gyroscopes provide orientation, accelerometers provide a correction due to gravity, and compasses provide a correction due to magnetic north.
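One common way to combine them is a complementary filter: integrate the gyroscope for short-term changes and let the compass slowly correct the accumulated drift. A sketch under assumed inputs (gyroZ in rad/s, compassHeading in radians; ALPHA is illustrative and angle wrap-around is not handled):

private static final float ALPHA = 0.98f; // trust the gyro short-term, the compass long-term
private float fusedHeading = 0f;          // radians
private long lastTimestampNs = 0;

public void update(float gyroZ, float compassHeading, long timestampNs) {
    if (lastTimestampNs != 0) {
        float dt = (timestampNs - lastTimestampNs) * 1e-9f;
        float gyroEstimate = fusedHeading + gyroZ * dt;                     // integrate angular rate
        fusedHeading = ALPHA * gyroEstimate + (1 - ALPHA) * compassHeading; // compass correction
    }
    lastTimestampNs = timestampNs;
}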

Redundant motion sensors on android?

I'm developing a tool which receives motion sensor data and sends it to a machine learning algorithm, which ultimately will deduce different types of movement.
I read the Motion sensors guide and it seems there is some redundancy in the data you can get from the sensors. For example, the accelerometer data contains the gravity component, while the linear acceleration data shows acceleration without gravity.
So my question is: do I really need all the sensors to capture all forms of motion, or can I give up some of them?
EDIT: (clarifying the question)
I want to collect the minimal data that will allow me to deduce the same things. What I'm looking for is user behavior: the angle at which the user holds the phone, the way the user moves while using the phone, etc.
The answer I'm looking for should identify sets of sensors with high correlation within them, such that only some of the sensors in each set are required to deduce the same type of motion/movement/rotation/acceleration/etc.
The term "motion" in the question has no precise meaning, so I'll answer more generally.
"The way one holds his phone" is nothing but the orientation of the phone. There are three sensors which individually tell the orientation of the phone:
Accelerometer sensor
Orientation sensor
Rotation Vector sensor
Among them, only the accelerometer is a physical sensor; the other two are virtual sensors (they don't have a dedicated piece of hardware; they use accelerometer data and report the orientation in different formats).
The orientation sensor is deprecated, so you shouldn't use it.
The rotation vector sensor reports the orientation encoded as a quaternion. If your code is based on quaternions, normalize the sensor output using SensorManager.getQuaternionFromVector() and continue. If your code is based on rotation matrices, obtain the rotation matrix by calling SensorManager.getRotationMatrixFromVector() with the sensor output and continue. If you want the orientation alone, get it by calling SensorManager.getOrientation() with the rotation matrix.
Using the accelerometer we can find the orientation, but the recommended approach is to combine it with the magnetic field sensor output. Call SensorManager.getRotationMatrix() with the accelerometer output and the magnetic field sensor output to get the rotation matrix. If your code is based on rotation matrices, just continue. If you want the orientation alone, get it by calling SensorManager.getOrientation() with the rotation matrix obtained previously. If your code is based on quaternions, call SensorManager.getQuaternionFromVector() with the rotation vector obtained previously.
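Both rotation vector flows, sketched inside onSensorChanged:

@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() != Sensor.TYPE_ROTATION_VECTOR) return;

    // Quaternion path
    float[] quaternion = new float[4]; // w, x, y, z
    SensorManager.getQuaternionFromVector(quaternion, event.values);

    // Rotation-matrix path
    float[] rotationMatrix = new float[9];
    SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);

    // Orientation alone: azimuth, pitch, roll in radians
    float[] orientation = new float[3];
    SensorManager.getOrientation(rotationMatrix, orientation);
}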
"The way one moves his phone" - Here I consider four motions.
Change of position (simple translation) and rate of change of position (velocity) - no sensor detects these.
Rate of change of velocity (simple acceleration) - the accelerometer detects it, but the reading also contains the gravity component. Normally we need the acceleration without gravity; this can be calculated simply, as explained here. However, there is another virtual sensor, Linear Acceleration, which does the job for us.
Change of orientation (rotation) - whenever the orientation changes, the accelerometer, orientation, and rotation vector sensors report it (the gyroscope also reports it, but that is explained in the next point). How to use these sensors to get the current orientation is explained in the first part of this answer.
Rate of change of orientation (angular velocity) - whenever the orientation changes, the gyroscope reports it. The output is three numbers representing the angular velocity around the x, y, and z axes, in radians per second.
The gyroscope output is not accurate in the long term and the accelerometer output is not accurate in the short term, so combine them to get a steady output. For details see this question.
Now it is clear that the gyroscope and accelerometer are required at minimum. However, using the wider range of sensors minimizes your work.
You can't choose what you get - each sensor's data is already defined, and you get all of it or nothing. If you look closely, there isn't a place in the public API that lets you ask for specific components.
To back this up, here's a quote from Google's document explaining sensor types:
An accelerometer sensor reports the acceleration of the device along the 3 sensor axes. The measured acceleration includes both the physical acceleration (change of velocity) and the gravity. The measurement is reported in the x, y and z fields of sensors_event_t.acceleration.
If you look into the Android source, the structs here are strictly defined, and the struct for acceleration contains specific fields. So even if you were to get 0 in the fields you don't want, you wouldn't gain anything.
But what you're referring to are two things: base sensors, which are roughly equivalent to the physical sensors on the device, and composite sensors, which combine readings from various physical sensors to produce more useful data.
So while you can't choose what you get from a particular sensor (like "only gravity" or "only acceleration on the Y axis"), composite sensors do give you data that you could compute yourself using only base sensors. Linear acceleration, for instance, is a composition of data from the accelerometer and gyroscope (or magnetic sensor), after some calculations. Similarly, the step detector "sensor" uses only the accelerometer, but interprets the data automatically to give you an event that "yes, someone has made a step" with the single value 1.
If you're feeding raw motion data to some algorithm, I would guess base sensors are what you're looking for. That said, I believe you can still safely register for all sensors (both base and composite) that combined give you all the data you need (and maybe more), without a meaningful battery impact.
For more detailed information on each sensor, refer to Sensor types on the Android website; if you're curious, you can also read a short summary of the sensor stack.
No, you don't need every sensor. Some of the sensors exist as a convenience. Your example of the linear acceleration sensor is one: it gives you the accelerometer's results with gravity taken out. You could do this yourself from the raw accelerometer data, but that takes a bit of math (you need to subtract the gravity vector across all 3 axes) and a bit of know-how (did you remember to calibrate the sensor? It may not read 9.8 at rest. For that matter, 9.8 may not be your local gravity if you're not at sea level). That's a lot of work that would otherwise be repeated by each app, so they created a software "sensor" that sits on top of the accelerometer and provides the computed data. It would be unusual for an app to use the raw and linear accelerometers at the same time; generally it's one or the other. The step counter is another example; it guesses at what a step is based on the accelerometer data. You also wouldn't want calibrated and uncalibrated gyroscope data at the same time.
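For illustration, the do-it-yourself version looks roughly like this (the smoothing constant is an assumption; in practice you would just use TYPE_LINEAR_ACCELERATION):

private static final float ALPHA = 0.8f; // assumed low-pass constant
private final float[] gravity = new float[3];

@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() != Sensor.TYPE_ACCELEROMETER) return;

    float[] linear = new float[3];
    for (int i = 0; i < 3; i++) {
        // Low-pass filter isolates the slowly changing gravity component
        gravity[i] = ALPHA * gravity[i] + (1 - ALPHA) * event.values[i];
        // Subtracting it leaves the linear acceleration
        linear[i] = event.values[i] - gravity[i];
    }
}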
As for what you do need: no clue, since you don't say enough about what you're trying to do. One warning though: you said you're trying to detect motion. You can't do that. You can detect accelerations and rotations; you cannot detect motion at a constant speed. If you're developing any type of app using these sensors, it pays to use the correct terminology and to think in terms of physics and how the physical accelerometer and gyroscope work; otherwise you're going to cause yourself bugs.

How to transform Android magnetometer readings to the world's coordinate space?

I am doing some programming with the Android sensors, and I am confused by the readings of the magnetometer.
The magnetometer reports the magnetic field strength along the three axes of the phone. I observe that at the same location, if the phone's heading changes, the magnetic readings change dramatically.
In my understanding, however, the earth's magnetic field at a specific location should be relatively stable, regardless of how the phone is held.
So my question is: is there any way to transform the raw readings of the 3-axis magnetometer into the world's coordinate system? Accelerometer and orientation data are also available on mobile phones. If so, I suspect the transformed magnetism should stay the same even when the phone's heading changes.
I have referred to the Android source code, specifically the getOrientation() and getRotationMatrix() functions, hoping to get some help from their implementation, but I did not understand it very well. Could someone explain the algorithmic principle behind these functions?
Link to the code of the functions: http://www.netmite.com/android/mydroid/cupcake/frameworks/base/core/java/android/hardware/SensorManager.java
Thanks! I am really anxious for a solution to this question.
This is impossible, since the device does not know its orientation in world space.
Of course, the orientation can be estimated from the sensor input, and that is what getOrientation() and getRotationMatrix() do. However, over long timescales only the measurements of acceleration (i.e. gravity) and the magnetic field provide the necessary information. Gyroscope data can be used to refine the estimate over shorter periods, but getOrientation() is not guaranteed to use it, and that sensor may not even exist on a particular device.
This means that back-transforming via getOrientation() would use the exact same data you want to correct, rendering it useless.
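For reference, the mechanical transform the question asks about would look like the sketch below (accelValues and magnetValues are assumed to hold the latest accelerometer and magnetometer readings). The caveat above still applies: the rotation matrix is itself estimated from the very readings being transformed.

float[] rotationMatrix = new float[9];
if (SensorManager.getRotationMatrix(rotationMatrix, null, accelValues, magnetValues)) {
    // The rotation matrix maps device coordinates to world coordinates (east, north, up)
    float[] worldMag = new float[3];
    for (int row = 0; row < 3; row++) {
        worldMag[row] = rotationMatrix[3 * row]     * magnetValues[0]
                      + rotationMatrix[3 * row + 1] * magnetValues[1]
                      + rotationMatrix[3 * row + 2] * magnetValues[2];
    }
}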

Determining heading in Inertial Navigation Systems

I have a question regarding inertial navigation with a mobile device.
I am using an Android tablet for development, but I think the question applies to all types of hardware (even with better sensors).
The most basic question when developing an inertial system is how to determine the direction of the carrier's movement.
Even if we assume the magnetometer readings are 100% accurate (which they obviously are not!), there is still the question of the device's orientation relative to the user.
A simple example: if the user is walking north but holds the device with its Y axis pointing north-east (a picture of the different axes: http://developer.android.com/reference/android/hardware/SensorEvent.html), then the magnetometer will point towards north-east.
How can we tell which way the user is actually heading?
(The same will be true if we use both magnetometer and Gyro for determining heading)
A possible solution would be to use the accelerometer's Y-axis and X-axis readings, something in the direction of arctan(aY/aX) (for example, if the user holds the device perfectly straight, the X axis will show nothing...).
But since the accelerometer's readings are not stable, it is not so easy...
Does anyone know of an algorithm that actually works? I am sure this is a well known problem, but I can't seem to find references to solutions...
Thanks in advance!
Ariel
See this answer for an idea: by obtaining the acceleration values relative to the earth, you can then use the atan2 function to compute the actual direction of movement.
You mention the user holds the tablet, and I assume fairly stably (unlike a case I am working on, where the user moves the phone constantly). Yet for some reason the user may change the orientation of the device, and this will influence the readings you obtain.
Thus, in the event of an orientation change, you should call remapCoordinateSystem() accordingly to fix the readings.
NOTE: You can also use the getOrientation() method; its first field represents the heading direction.
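A sketch of the atan2 idea, assuming worldAccel already holds the acceleration rotated into the world frame (index 0 = east, 1 = north; the variable name is illustrative):

// Direction of horizontal acceleration as a compass-style heading
double headingRad = Math.atan2(worldAccel[0], worldAccel[1]);
double headingDeg = (Math.toDegrees(headingRad) + 360.0) % 360.0; // normalize to [0, 360)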
The really right answer is to leave the device sitting there for a while, detect the rotation of the earth, and then compute true north from that. Unfortunately, the gyros in a mobile phone aren't accurate enough for that....

Mobile phone sensors

After some hours of searching I'm quite confused, so I'll describe what I've learned; please correct me if I'm wrong:
Light Sensor: measures the surrounding light.
Magnetic Sensor: lets me find the direction of north.
Accelerometer Sensor: reports acceleration along X, Y, Z, whether earth's gravity or any other acceleration.
Proximity Sensor: works just like car parking sensors.
Orientation Sensor: tells the rotation in degrees around the X, Y, Z axes.
I've tried these sensors using some apps from the Android Market like "My sensors", and I can confirm that the accelerometer can't feel you rotating the phone while it lies flat on a table, while the orientation sensor seems to catch all the moves. So I conclude that a compass app uses the magnetic sensor to find where north is, then the orientation sensor to know where you are heading with your device, right?
Switching between portrait and landscape modes uses the accelerometer and checks which axis earth's gravity lies along, right?
Q1: So if everything above is correct, what is a gyroscope? Is it the same as the orientation sensor?
Q2: Is the orientation sensor available on most devices?
Q3: What are other uses of the orientation sensor?
Q4: Why do most websites, even Wikipedia, say that Orientation Sensor == Accelerometer Sensor?
-Rami
OK, first the easy ones...
About Q1: the gyroscope measures angular velocity (radians/second), while orientation is a different quantity, telling us how the device is positioned in space.
About Q2: I would say yes, 3-axis orientation is available on most smartphones, at least those running Android.
Q3: A compass actually uses the orientation values. The magnetic field sensor measures that magnitude (not in degrees), though you can probably calculate the compass values from the magnetic field. Another use: you can tell whether the device is lying face up or face down, for example.
About Q4, this is more difficult. I'm not an expert in accelerometers, but I think most of these "sensors" use the same hardware sensor, which measures the magnetic field and performs the necessary calculations; still, it's better to read more detailed and technical information.
Note that the orientation sensor is now deprecated, and this is written in the docs:
Note: This sensor type exists for legacy reasons, please use getRotationMatrix() in conjunction with remapCoordinateSystem() and getOrientation() to compute these values instead.
So yes, it seems the orientation values are calculated through the accelerometer, but still, orientation is given in degrees and acceleration in m/s^2: different magnitudes. As I said, I think they measure different magnitudes with the same sensor, which is why the API presents them as different kinds of events.
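The replacement flow from the quoted note, as a sketch (accel and magnet are assumed to hold the latest accelerometer and magnetic field readings):

float[] rotationMatrix = new float[9];
float[] remapped = new float[9];
float[] orientation = new float[3];
if (SensorManager.getRotationMatrix(rotationMatrix, null, accel, magnet)) {
    // Identity remap shown; change the axes if your activity is rotated
    SensorManager.remapCoordinateSystem(rotationMatrix, SensorManager.AXIS_X, SensorManager.AXIS_Y, remapped);
    SensorManager.getOrientation(remapped, orientation);
    float azimuthDeg = (float) Math.toDegrees(orientation[0]); // degrees, like the old sensor reported
}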
I hope I haven't made any huge mistakes, because, well, I would also like to clarify some concepts regarding these sensors.
mdelolmo is perfectly right.
I would like to add the following:
About Q4: everyone refers to the orientation sensor as the accelerometer because smartphones use it for the auto-rotate feature. The switching between portrait and landscape modes (often called 'orientation' in layman's terms) is done by determining which axis of the phone gravity lies along. This is done by the accelerometer IC, roughly as sketched below.
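A rough sketch of that idea (values from a TYPE_ACCELEROMETER event; the bare comparison is illustrative, real implementations use thresholds and hysteresis):

// Gravity dominates the Y axis in portrait, the X axis in landscape
boolean portrait = Math.abs(event.values[1]) > Math.abs(event.values[0]);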
The orientation sensor (in Android) uses the accelerometer data and the magnetic data to determine the exact positioning of the device, i.e. the angles it makes with all three axes: azimuth (or yaw), pitch, and roll.
The gyroscope provides the angular velocity of the device.
It is NOT the orientation sensor.
I haven't worked on Android phones but may in the future. However, accelerometers detect acceleration, usually through the motion of a proof mass. They can be used to orient a device roughly because they sense the g-vector, though any orientation remains completely unconstrained in its angle about the g-vector. Accelerometers could in principle be used as gyroscopes, but they aren't, because they would need to be far better than any accelerometer currently made to sense rotation via centripetal acceleration.
Gyroscopes directly measure either angular rate or angle. Most measure angular rate, which is integrated to get angle, so they can be used to measure orientation; but since they are inertial sensors they drift, and so do not provide an absolute orientation. They are, however, excellent sensors for relative rotations, i.e. orientation with respect to a very recent orientation. I hope this helps.
The magnetic sensor measures the direction to magnetic north via the earth's magnetic field. The north direction supplied by the magnetic sensor, combined with the g-vector from the accelerometers, gives full orientation information, because it breaks the symmetry of the orientation about the g-vector. This is only really true when the phone/sensors are not moving. Since I do not know how this is implemented in the phone I can't say much more, but the fact that you need accelerometers plus another reference to get full orientation may be why there is confusion about this subject.
Barometers measure pressure and can be good sensors for measuring changes in height, but they can be fooled by active air-moving systems such as those found in air-conditioned homes and forced-hot-air systems.
If you are not moving and your gyroscopes are sensitive enough to measure the earth's rotation rate, you can do something called gyrocompassing, where the gyroscopes and accelerometers become analytically or physically aligned with the local-level coordinate system. This is how higher-grade gyroscopes and accelerometers determine orientation in systems like aircraft, spacecraft, and ships/submarines. There are many complications, but this is the basic idea.
