Mobile phone sensors - android

After some hours of searching I'm still confused, so I'll summarize what I've learned; please correct me if I'm wrong:
Light Sensor: measures the surrounding (ambient) light.
Magnetic Sensor: gives me the direction of north.
Accelerometer Sensor: measures acceleration along X, Y and Z, including earth's gravity.
Proximity Sensor: works just like a car's parking sensors.
Orientation Sensor: tells the rotation in degrees around the X, Y and Z axes.
I've tried those sensors using some apps from the Android Market like "My sensors", and I can confirm that the accelerometer can't feel it if you rotate your phone while it lies flat on a table, whereas the orientation sensor seems to catch all the moves. So I conclude that a compass app uses the magnetic sensor to see where north is, then the orientation sensor to know which way the device is heading, right?
Switching between portrait and landscape modes uses the accelerometer and checks which axis earth's gravity is on, right?
Q1: So if everything above is correct, what is the gyroscope? Is it the same as the orientation sensor?
Q2: Is the orientation sensor available on most devices?
Q3: What are other uses of the orientation sensor?
Q4: Why do most websites, even Wikipedia, say that the orientation sensor == the accelerometer?
-Rami

Ok, first the easy ones...
About Q1: the gyroscope measures angular velocity (radians/second), while orientation is a different magnitude, telling us how the device is "placed" (I don't really know how to explain something so basic in English).
And about Q2, I would say yes, 3-axis orientation is available on most smartphones, at least those running Android.
Q3: The compass actually uses the orientation values; the magnetic field sensor measures that magnitude (not in degrees), though you can probably calculate the compass values from the magnetic field. Another use... well, you can tell whether the device is lying face up or face down, for example.
About Q4, this is more difficult. I'm not that expert in accelerometers, but I think most of these "sensors" use the same hardware sensor, which measures the magnetic field and makes the necessary calculations; but I insist, it's probably better to read more detailed and technical information.
If you notice, the OrientationSensor is now deprecated, and this is written in the docs:
Note: This sensor type exists for legacy reasons, please use getRotationMatrix() in conjunction with remapCoordinateSystem() and getOrientation() to compute these values instead.
So yes, it seems it calculates the orientation values through the accelerometer, but still, orientation is given in degrees and acceleration in m/s^2, which are different magnitudes. As I told you, I think they measure different magnitudes with the same sensor, and that's why they present different kinds of events in the API.
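For reference, here is a minimal sketch of what that deprecation note suggests, using the public SensorManager calls (the class name and wiring are just illustrative; smoothing and error handling are left out):

import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

// Sketch: compute azimuth/pitch/roll from ACCELEROMETER + MAGNETIC_FIELD,
// as the deprecation note recommends instead of TYPE_ORIENTATION.
public class OrientationReader implements SensorEventListener {
    private final float[] accel = new float[3];
    private final float[] magnetic = new float[3];
    private final float[] rotationMatrix = new float[9];
    private final float[] orientationAngles = new float[3]; // azimuth, pitch, roll (radians)

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
            System.arraycopy(event.values, 0, accel, 0, 3);
        } else if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) {
            System.arraycopy(event.values, 0, magnetic, 0, 3);
        }
        // getRotationMatrix() returns false in free fall or strong magnetic interference.
        if (SensorManager.getRotationMatrix(rotationMatrix, null, accel, magnetic)) {
            SensorManager.getOrientation(rotationMatrix, orientationAngles);
            float azimuthDeg = (float) Math.toDegrees(orientationAngles[0]);
            // orientationAngles[1] = pitch, orientationAngles[2] = roll, both in radians
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}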
I hope I haven't written any huge mistakes, because, well, I would also like to clarify some concepts regarding these sensors.

mdelolmo is perfectly right.
I would like to add the following:
About Q4: everyone refers to the orientation sensor as the accelerometer because smartphones use it for the "Auto-Rotate" feature. Switching between portrait and landscape modes (often called orientation in layman's terms) is done by determining which axis of the phone gravity lies along. This is done by the accelerometer IC (a rough sketch follows).
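Just as an illustration of that idea (this is not the actual Auto-Rotate code; the helper name is made up):

import android.hardware.SensorEvent;

// Sketch only: decide a coarse screen orientation by checking which device axis
// the ~9.81 m/s^2 gravity component is mostly aligned with.
static String roughOrientation(SensorEvent accelEvent) {
    float x = accelEvent.values[0];
    float y = accelEvent.values[1];
    float z = accelEvent.values[2];
    if (Math.abs(z) > Math.abs(x) && Math.abs(z) > Math.abs(y)) {
        return "FLAT";       // gravity along Z: lying on a table
    } else if (Math.abs(y) > Math.abs(x)) {
        return "PORTRAIT";   // gravity along Y: held upright
    } else {
        return "LANDSCAPE";  // gravity along X: held sideways
    }
}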
The orientation sensor (in Android) uses the accelerometer data and the magnetic data to determine the exact positioning of the device, i.e. the angles it makes with all 3 axes. These are azimuth (or yaw), pitch and roll.
The gyroscope provides the angular velocity of the device.
It is NOT the orientation sensor.

I haven't worked on Android phones, but I may in the future. Accelerometers detect acceleration, usually through the motion of a proof mass. They can be used to orient a device roughly because they sense the g-vector, but any rotation about the g-vector is completely unconstrained. Accelerometers could in principle be used as gyroscopes, but they are not used that way, because sensing rotation via centripetal acceleration would require accelerometers far better than any currently made.
Gyroscopes measure either angular rate or angle directly. Most measure angular rate, and the rate is integrated to get angle, so they can be used to measure orientation; but since they are inertial sensors they drift, so they do not provide an absolute orientation. They are, however, excellent sensors for relative rotations, or for orientation relative to a very recent orientation. I hope this helps.
The magnetic sensor measures the direction to magnetic north via the earth's magnetic field. The north direction from the magnetic sensor, combined with the g-vector from the accelerometers, gives full orientation information, because it breaks the symmetry of the orientation about the g-vector. This is only really true when the phone/sensors are not moving. Since I do not know how this is implemented in the phone I can't say much else, but the fact that you need accelerometers plus another reference to get full orientation may be the reason why there is confusion about this subject.
Barometers measure pressure and can be good sensors for measuring changes in height, but they can be fooled by active air-moving systems such as those found in air-conditioned homes and forced-hot-air heating systems.
If you are not moving and your gyroscopes are sensitive enough to measure earth rate, you can do something called gyrocompassing, where the gyroscopes and accelerometers become analytically or physically aligned with the local-level coordinate system. This is how much better gyroscopes and accelerometers measure orientation in systems like aircraft, spacecraft and ships/submarines. There are many complications, but this is the basic idea.

Related

How to find up or down using the device's orientation sensor?

I've been trying to find up (or down, which wouldn't be much harder to find, just multiply up by -1) on Android with no good solution. I need a vector pointing up in the same coordinate system as the one used by the accelerometer. This way, I will be able to:
remove the force added by gravity to the output of the accelerometer
determine whether the device has been shaken in a vertical direction or not.
Gravity Sensor:
First, I thought of using the gravity sensor, which would've been the simplest solution. But! my device did not have a gravity sensor...
Magnetic Field Sensor: So, I thought of using the magnetic field sensor to find two vectors pointing north at two different positions but with the same orientation, and then calculating the cross product of the two vectors to try and find a vector pointing up or down. It didn't work (or it seemed like it didn't).
Magnetic Field with Accelerometer: Then, I thought of doing the cross product, but this time with a vector provided by the magnetic field sensor and one provided by the accelerometer. I couldn't figure out how, because the accelerometer's output mixes the user's acceleration with the gravitational pull, so it doesn't point exactly up while the device is moving.
So, I came to the conclusion that I would need to use the orientation sensor to do this, or rely only on a low-pass filter...
Put simply, how do I determine the vector pointing up in the coordinate system used by the accelerometer, given these sensors:
orientation
accelerometer
magnetic field
Unfortunately your problem isn't particularly simple. I'm not familiar enough with the Android API, but perhaps if you look around you'll find something that attempts to isolate the gravity acceleration from the external device acceleration (I know there are sensor fusion algorithms that do this with some degree of success).
The core of the problem is that the acceleration due to gravity and the local acceleration of the device are indistinguishable (there is no such thing as a gravity sensor). An accelerometer simply records the total acceleration acting on the device, which is the sum of ALL accelerations. It is not trivial to isolate the gravity vector, and the methods I have seen mostly involve using the gyroscope, as the magnetometer is simply too inaccurate and has too much latency.
Instead I suggest that you either assume that the local forces on the device are negligible, so that your acceleration IS the gravity vector, or alternatively, as you mentioned, use a low-pass filter, which for most purposes will suffice (and is a common way of defining DOWN for the purposes of sensor fusion and augmented reality).
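For what it's worth, the low-pass option usually looks roughly like this (the smoothing factor is an arbitrary assumption and would need tuning per device):

import android.hardware.SensorEvent;

// Sketch: isolate gravity with a simple low-pass filter, then subtract it
// from the raw accelerometer reading to get linear acceleration.
public class GravityFilter {
    private static final float ALPHA = 0.8f;     // assumed smoothing factor
    private final float[] gravity = new float[3];
    private final float[] linear = new float[3];

    public float[] onAccelerometer(SensorEvent event) {
        for (int i = 0; i < 3; i++) {
            gravity[i] = ALPHA * gravity[i] + (1 - ALPHA) * event.values[i];
            linear[i] = event.values[i] - gravity[i];
        }
        return linear;   // gravity[] now approximates the up/down direction in device coordinates
    }
}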
All the best!
The magnetic field is not horizontal, which is why your solutions with the magnetometer didn't work. This is called magnetic dip. Generally speaking, if you want to determine the up/down vector you should not rely on the magnetic field, because other solutions are easier to implement and more precise.
The two main solutions to find the up/down vector are:
Low-pass filter on the accelerometer. This may not be precise enough for your purposes.
Sensor fusion of the gyroscope and accelerometer. This will be precise enough, but requires a good head for math to implement (i.e. matrices or quaternions); a rough sketch follows below.
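A bare-bones sketch of option 2, assuming gyroscope events in rad/s; this is a minimal complementary filter, not production-quality fusion, and the gain is an arbitrary assumption:

import android.hardware.SensorEvent;

// Sketch of a complementary filter: propagate the gravity estimate with the
// gyroscope (a world-fixed vector seen from a rotating device evolves as
// dg/dt = -w x g), then nudge it toward the accelerometer reading.
public class GravityFusion {
    private static final float GAIN = 0.02f;     // assumed blend factor
    private final float[] g = {0f, 0f, 9.81f};   // gravity estimate in device coordinates
    private long lastTimestampNs = 0;

    public void onGyroscope(SensorEvent event) {
        if (lastTimestampNs != 0) {
            float dt = (event.timestamp - lastTimestampNs) * 1e-9f;   // ns -> s
            float wx = event.values[0], wy = event.values[1], wz = event.values[2];
            float gx = g[0] - (wy * g[2] - wz * g[1]) * dt;
            float gy = g[1] - (wz * g[0] - wx * g[2]) * dt;
            float gz = g[2] - (wx * g[1] - wy * g[0]) * dt;
            g[0] = gx; g[1] = gy; g[2] = gz;
        }
        lastTimestampNs = event.timestamp;
    }

    public void onAccelerometer(SensorEvent event) {
        // Blend slowly toward the measured (gravity + user acceleration) vector.
        for (int i = 0; i < 3; i++) {
            g[i] = (1 - GAIN) * g[i] + GAIN * event.values[i];
        }
    }

    public float[] upVector() {
        float n = (float) Math.sqrt(g[0] * g[0] + g[1] * g[1] + g[2] * g[2]);
        return new float[] { g[0] / n, g[1] / n, g[2] / n };   // unit "up" in device coordinates
    }
}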
Good luck!

What's the conceptual difference between rotation vector sensor and orientation sensor in Android?

Android provides both the rotation vector sensor and the orientation sensor. I know they return different data, because the rotation vector sensor reports components involving the sine of the rotation angle, while the orientation sensor reports angles. But what's the conceptual difference? I can't understand it from the docs. Which one provides the orientation of the device in three-dimensional space? I'm confused!
The older ORIENTATION sensors report orientation using three angles. The problem with this coordinate system is that it suffers from "gimbal lock": when the actual orientation vector is close to vertical, one of the coordinates goes to 90 or -90 degrees, and the remaining two coordinates become either uninterpretable or dangerously denormalized.
The newer ROTATION sensors report orientation using quaternion coordinates, which are more complicated to work with but don't suffer from the gimbal lock problem. When orientation is reported using quaternion coordinates, you can determine the precise orientation of the device no matter what that orientation is.
Quaternions are also more computationally efficient. You don't need to call expensive trig functions to apply a quaternion rotation to a vector. If the w coordinate isn't supplied, you can still compute w with a single sqrt call, compared to three sin and three cos calls for coordinates in the three-angle Euler form.
Short story: the ORIENTATION-style sensors were done wrong. They were fixed in API 9, by replacing them with ROTATION sensors.
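For example, if only the x, y, z components of the rotation vector are reported, the scalar part can be recovered with that one sqrt (the helper name is just illustrative; the clamp guards against rounding error):

// Recover the scalar part w of the rotation-vector quaternion when only x, y, z are reported.
static float quaternionW(float[] rotationVector) {
    float x = rotationVector[0], y = rotationVector[1], z = rotationVector[2];
    return (float) Math.sqrt(Math.max(0f, 1f - x * x - y * y - z * z));
}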
The ROTATION_VECTOR sensor was introduced in API 9 and is a 'virtual' sensor which combines data from several sensors (usually ACCELEROMETER, GEOMAGNETIC_FIELD and GYROSCOPE) and does some smart calculations to provide more accurate data than the raw data from the ACCELEROMETER and GEOMAGNETIC_FIELD sensors alone. This is called 'sensor fusion'. You can find more info here.
The ORIENTATION sensor is deprecated since it does not provide very accurate data. The documentation suggests using raw data from the ACCELEROMETER and GEOMAGNETIC_FIELD sensors instead.
Unfortunately, I cannot provide any examples of how to use the ROTATION_VECTOR sensor data, since I'm in the process of investigating it right now :)
Just in case you need some examples of how to use the raw data, feel free to ask me and I'll post some, but simple googling may help you better ;)
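Since that answer doesn't include an example yet, here is a minimal sketch of one common usage pattern with the public SensorManager API (class name is illustrative):

import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

// Sketch: listen to TYPE_ROTATION_VECTOR and convert it to azimuth/pitch/roll.
public class RotationVectorReader implements SensorEventListener {
    private final float[] rotationMatrix = new float[9];
    private final float[] orientation = new float[3];

    public void start(SensorManager sensorManager) {
        Sensor rv = sensorManager.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR);
        sensorManager.registerListener(this, rv, SensorManager.SENSOR_DELAY_UI);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        // Build a rotation matrix from the fused rotation vector, then derive the angles.
        SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);
        SensorManager.getOrientation(rotationMatrix, orientation);
        float azimuthDeg = (float) Math.toDegrees(orientation[0]);   // radians -> degrees
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}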
They are conceptually the same, just represented differently.
Have a look at the code of the orientation sensor here.
The parameter of the function for the orientation sensor is the rotation matrix, which in turn is calculated from the rotation vector (the quaternion representation).
On cheap Android phones (unlike higher-end iPhones) the compass will work only when the phone's orientation is close to horizontal (i.e. parallel to the ground surface).
Technically a good compass (i.e. a floating magnetic sphere) should work at any orientation, but the cheap ones don't. Hence, to use the compass, make sure that your phone is horizontal by looking at the ACCELEROMETER before you use the MAGNETOMETER. Hopefully Google will use better magnetometers in the future!
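A crude way to gate on that, as a sketch (the 8.5 m/s^2 threshold is an arbitrary assumption):

// Sketch: trust the compass only when gravity is mostly on the Z axis, i.e. the phone is roughly flat.
static boolean roughlyHorizontal(float[] accelValues) {
    return Math.abs(accelValues[2]) > 8.5f;
}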

How to transform Android magnetometer readings to the world's coordinate space?

I am writing some code using the Android sensors, and I am confused by the readings of the magnetometer sensor.
The magnetometer reports the magnetic strength on the three axes of the phone. And I observe that, at the same location, if the phone's heading changes, the magnetic readings change dramatically.
In my understanding, however, the earth's magnetic field at a specific location should be relatively stable, regardless of how the phone is held.
So, my question is, is there any way to transform the raw readings from the 3-axis magnetometer sensor into the world's coordinate system? The accelerometer and orientation data are also available on mobile phones. If so, I suspect the transformed magnetic field should be the same even if the phone's heading changes.
I have referred to the Android source code, specifically the getOrientation() function and the getRotationMatrix() function. I hoped to get some help from their implementation, but I did not understand it very well. Could someone explain the algorithmic principle behind these functions?
Link to the code of the functions: http://www.netmite.com/android/mydroid/cupcake/frameworks/base/core/java/android/hardware/SensorManager.java
Thanks! I am really anxious to find a solution to this question.
This is impossible, since the device does not know its orientation in world space.
Of course, the orientation can be estimated from the sensor input, and that is what getOrientation() and getRotationMatrix() do. However, on a long timescale only the measurement of acceleration (by gravity) and the magnetic field provide the necessary information. Gyroscope data can be used to refine the estimate for shorter periods, but getOrientation() is not guaranteed to use it, and that sensor may not even exist on the particular device.
This means that back-transforming using getOrientation() would use the exact same data you want to correct, rendering it useless.
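For completeness, the mechanical transformation the question asks for is world = R * device, roughly as below, with the caveat above that R itself is estimated from the very same accelerometer and magnetometer readings:

import android.hardware.SensorManager;

// Sketch: rotate a device-frame magnetometer reading into world coordinates using
// the row-major rotation matrix from getRotationMatrix(). Because R is derived from
// the accelerometer + magnetometer, the result mostly just recovers the expected
// field components (East component near zero, plus horizontal and vertical parts).
static float[] magnetometerToWorld(float[] accel, float[] magnetic) {
    float[] r = new float[9];
    float[] world = new float[3];
    if (SensorManager.getRotationMatrix(r, null, accel, magnetic)) {
        for (int i = 0; i < 3; i++) {
            world[i] = r[3 * i] * magnetic[0] + r[3 * i + 1] * magnetic[1] + r[3 * i + 2] * magnetic[2];
        }
    }
    return world;
}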

Android Sensors - Which of them get direct input?

The Android SDK actually offers a nice interface for accessing the sensors.
But, for example, the linear acceleration sensor can be computed, as the documentation describes, from gravity and acceleration, so there is no real physical counterpart for this sensor; it is rather a, let's call it, "virtual sensor".
For the proximity sensor things are rather clear; I can't imagine it being influenced by other values.
But the GPS sensor could be influenced by the accelerometer when the GPS signal is rather weak; I think the values are then somehow estimated with support from other sensors.
So basically my question is: which sensors get direct input from physical sensors, and which are somehow altered or entirely calculated by the Android SDK?
And how do I get raw input from the sensors?
I appended a list of the sensors available through the Sensor class. GPS, W-LAN, camera, etc. are missing.
//API-Level: 3
TYPE_ACCELEROMETER
TYPE_GYROSCOPE
TYPE_LIGHT
TYPE_MAGNETIC_FIELD
TYPE_PRESSURE
TYPE_PROXIMITY
TYPE_TEMPERATURE
//API-Level: 9 (2.3)
TYPE_GRAVITY
TYPE_LINEAR_ACCELERATION // can be calculated via acc. and grav. (link above)
TYPE_ROTATION_VECTOR
I am pretty sure the GPS at the moment is standalone and gives raw data output.
The orientation sensor is one that I know is not raw from a single sensor; it is actually the fusion of 2 sensors, and in the future possibly more (gyro). As of now the orientation is a combination of the magnetic field sensor (compass) and the accelerometer. Any modern-day compass will use both the compass and the accelerometer to calculate its final direction and to compensate for drift, noise and other interference. If you notice, calculating the orientation with getRotationMatrix() and getOrientation() requires you to listen for both the magnetic field and accelerometer sensors.
I would say the gravity, linear acceleration and rotation vector sensors are not actual sensors, just parts of data from other sensors separated out, mostly from the accelerometer and compass.
Lastly, the pressure and temperature readings each come from a single physical sensor.
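To see what a particular device actually exposes (the name/vendor strings sometimes hint at whether a type is a real chip or a fused "virtual" sensor), you can enumerate the sensors; a quick sketch:

import java.util.List;

import android.hardware.Sensor;
import android.hardware.SensorManager;
import android.util.Log;

// Sketch: dump every sensor the framework reports on this device.
static void dumpSensors(SensorManager sensorManager) {
    List<Sensor> sensors = sensorManager.getSensorList(Sensor.TYPE_ALL);
    for (Sensor s : sensors) {
        Log.d("Sensors", s.getName() + " / " + s.getVendor() + " / type=" + s.getType());
    }
}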

Android Sensors

I have a very basic question about Sensors:
Do magnetic sensors return readings with respect to the phone's initial orientation, or with respect to world coordinates?
What about accelerometers? Do they return values relative to their previous readings, or is each value an independent acceleration relative to the world coordinate system?
I know that gyros return readings relative to the phone's initial orientation. So, how do I convert the yaw, pitch and roll readings from a gyro into the azimuth, pitch and roll readings from the magnetic sensor of a smartphone (I'm using an HTC Hero)?
Thanks!
As mentioned, the gyroscope measures the angular velocity. The third value returned (values[2]) is the angular velocity around the z axis. You can use this value together with the initial value from the magnetometer to calculate the current heading: Theta(i+1) = Theta(i) + Wgyro*deltaT.
You can get the initial heading from the 'Orientation' measurement (values[0]). This measurement depends only on the magnetometer (you can put a magnet or a second phone close to the smartphone and watch the output go crazy).
The second and third values of the 'Orientation' reading depend on the readings of the accelerometer. Since the accelerometer measures gravity, it is possible to calculate the pitch and roll angles from the accelerometer readings on the Y and X axes.
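A minimal sketch of that integration, assuming the heading is seeded once from the orientation reading (values[0], in degrees) and then propagated with the gyro's z-axis rate; it only tracks heading while the phone stays roughly flat, and drift handling is left out:

import android.hardware.SensorEvent;

// Sketch: theta(i+1) = theta(i) + w_gyro * deltaT, seeded from the magnetometer-based heading.
public class HeadingIntegrator {
    private float headingRad;           // current heading estimate
    private long lastTimestampNs = 0;

    public void seedFromOrientation(float azimuthDegrees) {
        headingRad = (float) Math.toRadians(azimuthDegrees);
    }

    public void onGyroscope(SensorEvent event) {
        if (lastTimestampNs != 0) {
            float dt = (event.timestamp - lastTimestampNs) * 1e-9f;   // ns -> s
            headingRad += event.values[2] * dt;                       // z-axis angular rate, rad/s
        }
        lastTimestampNs = event.timestamp;
    }

    public float headingDegrees() {
        return (float) Math.toDegrees(headingRad);
    }
}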
Hope this helps
Ariel
Android sensors (up to Froyo) provide the application with "raw" data; there is a bare minimum of "cooking" (i.e. processing) involved. The accelerometer and compass devices provide absolute acceleration and magnetic data respectively. The gyroscope provides relative angular velocity. Gyroscopes do NOT provide relative data with respect to any specific state/position.
What you need to understand is that gyroscopic data is angular velocity. Angular velocity is simply how fast the phone is rotating (in radians per second on Android). So when you hold it still, the gyro reads (0, 0, 0), and as you rotate it, you get how fast it is rotating. This continues until you hold it still again, when the gyro reading goes back to (0, 0, 0).
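As a quick illustration, listening for the raw gyro values looks roughly like this (class name is illustrative; all three values sit near zero while the phone is still):

import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.util.Log;

// Sketch: log raw angular velocity from the gyroscope.
public class GyroLogger implements SensorEventListener {
    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() == Sensor.TYPE_GYROSCOPE) {
            Log.d("Gyro", "w = (" + event.values[0] + ", "
                    + event.values[1] + ", " + event.values[2] + ") rad/s");
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}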
Theoretically the gyro can be used to "calibrate" the compass, but doing so would require a lot of experimentation on your part. The ideal place to fiddle around would be the sensor HAL.
NOTE: You would need to turn ON all the sensor hardware even if ONLY compass data is required, as you would be cross-referencing the gyro/accel data for it. This means larger power consumption and extremely poor battery life; with all the sensors turned on continuously, the battery of a standard Android phone can drain in 4-5 hours.
You can read more about Android sensors here.
