Android Sensors - Which of them get direct input?

The Android SDK actually offers a nice interface for accessing the sensors.
But the linear acceleration sensor, for example, can be derived from gravity and acceleration as the documentation describes - so there is no real physical counterpart for this sensor; it is rather a, let's call it, "virtual sensor".
For the proximity sensor things are rather clear; I can't imagine it being influenced by other values.
But the GPS "sensor" could be influenced by the accelerometer: when the GPS signal is rather weak, I think the values are somehow estimated with support from other sensors.
So basically my question is: which sensors get direct input from physical sensors, and which are somehow altered or entirely calculated by the Android SDK?
And how do I get raw input from the sensors?
I appended a list of the sensors available through the Sensor class (GPS, W-LAN, camera, etc. are missing):
//API-Level: 3
TYPE_ACCELEROMETER
TYPE_GYROSCOPE
TYPE_LIGHT
TYPE_MAGNETIC_FIELD
TYPE_PRESSURE
TYPE_PROXIMITY
TYPE_TEMPERATURE
//API-Level: 9 (2.3)
TYPE_GRAVITY
TYPE_LINEAR_ACCELERATION // can be calculated via acc. and grav. (link above)
TYPE_ROTATION_VECTOR
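For reference, a minimal sketch that enumerates whatever sensors a given device actually exposes, using the standard SensorManager API (the context variable is assumed to be a valid Context):

// Sketch: list every sensor the device reports; the name/vendor strings often hint
// at whether a type is backed by real hardware or by a vendor's virtual/fusion implementation.
SensorManager sm = (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
for (Sensor s : sm.getSensorList(Sensor.TYPE_ALL)) {
    Log.d("Sensors", s.getName() + " (" + s.getVendor() + "), type=" + s.getType());
}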

I am pretty sure that GPS, at the moment, is standalone and gives raw data output.
The orientation sensor is one that I know is not raw output from a single sensor but is actually the fusion of two sensors, and in the future possibly more (gyro). As of now, the orientation is a combination of the magnetic field sensor (compass) and the accelerometer. Any modern compass will use both the magnetometer and the accelerometer to calculate its final direction and to compensate for drift, noise and other interference. Notice that when calculating the orientation with getRotationMatrix() and getOrientation(), you are required to listen for both the magnetic field and accelerometer sensors.
I would say the gravity, linear acceleration and rotation vector sensors are not actual sensors, just parts of data from other sensors separated out, mostly from the accelerometer and compass.
Lastly, the pressure and temperature readings each come from a single physical sensor.

Related

Redundant motion sensors on android?

I'm developing a tool which receives motion sensor data and sends it to a machine learning algorithm, which ultimately will deduce different types of movement.
I read the Motion sensors guide and it seems there is some redundancy in the data you can get from the sensors. For example: the accelerometer data contains gravity, while the linear acceleration data shows acceleration without the acceleration due to gravity.
So my question is: do I really need all the sensors to capture all forms of motion, or can I give up some of them?
EDIT: (clarifying the question)
I want to collect the minimal data that will allow me to deduce the same things. What I'm looking for is user behavior: the angle at which the user holds his phone, the way the user moves while using his phone, etc.
The answer I'm looking for should include the sets of sensors that have high correlation within them, such that only some of the sensors in each set are required to deduce the same type of motion/movement/rotation/acceleration, etc.
The term "Motion" in the question have no precise meaning. So I answer more generally.
"The way one holds his phone" is nothing but the orientation of the phone.There are three sensors which individually tells the orientation of the phone.
Accelerometer sensor
Orientation sensor
Rotation Vector sensor
Among them, only the accelerometer is a physical sensor; the other two are virtual sensors (they don't have a dedicated piece of hardware - they use accelerometer data and report the orientation in different formats).
The orientation sensor is deprecated, so you shouldn't use it.
The rotation vector sensor reports the orientation encoded as a quaternion. If your code is based on quaternions, convert the sensor output using SensorManager.getQuaternionFromVector() and continue. If your code is based on a rotation matrix, obtain the rotation matrix by calling SensorManager.getRotationMatrixFromVector(), passing the sensor output, and continue. If you want the orientation angles alone, get them by calling SensorManager.getOrientation(), passing the rotation matrix obtained previously.
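As a concrete illustration of those conversions, here is a minimal sketch of an onSensorChanged() handler for a TYPE_ROTATION_VECTOR event (only the standard SensorManager helpers are used; error handling omitted):

// Sketch: turn one TYPE_ROTATION_VECTOR reading into the representation you need.
@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() != Sensor.TYPE_ROTATION_VECTOR) return;

    float[] quaternion = new float[4];                       // w, x, y, z
    SensorManager.getQuaternionFromVector(quaternion, event.values);

    float[] rotationMatrix = new float[9];
    SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);

    float[] orientation = new float[3];                      // azimuth, pitch, roll (radians)
    SensorManager.getOrientation(rotationMatrix, orientation);
}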
Using the accelerometer alone we could find the orientation, but the recommended approach is to combine it with the magnetic field sensor output. Call SensorManager.getRotationMatrix(), passing the accelerometer output and the magnetic field sensor output, to obtain the rotation matrix. If your code is based on a rotation matrix, just continue. If you want the orientation angles alone, get them by calling SensorManager.getOrientation(), passing the rotation matrix obtained previously. If your code is based on quaternions, note that SensorManager.getQuaternionFromVector() expects a rotation vector as input, so you are better served by the rotation vector path described above.
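And the accelerometer + magnetic field path, again only a sketch; lastAccel and lastMagnetic are hypothetical fields assumed to hold the most recent values from the two sensors (e.g. event.values.clone() saved in onSensorChanged()):

// Sketch: derive orientation from cached accelerometer and magnetic field readings.
float[] rotationMatrix = new float[9];
float[] inclinationMatrix = new float[9];
if (SensorManager.getRotationMatrix(rotationMatrix, inclinationMatrix, lastAccel, lastMagnetic)) {
    float[] orientation = new float[3];                      // azimuth, pitch, roll (radians)
    SensorManager.getOrientation(rotationMatrix, orientation);
    float azimuthDegrees = (float) Math.toDegrees(orientation[0]);
}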
"The way one moves his phone" - Here I consider four motions.
Change of position (Simple translation) and rate change of position (velocity) - No sensor to detect them.
Rate of change of velocity (Simple acceleration) - Accelerometer detects it. But it also contains the gravity component. Normally we need acceleration without gravity component. This could be calculated simply as explained here. However there is another virtual sensor called Linear Acceleration which does the job for us.
Change of orientation (Rotation) - Whenever the orientation changes the accelerometer, orientation and rotation vector sensors report us (gyroscope also reports, but is explained in next point). How to use this sensor to get the current orientation is explained in first part of the answer.
Rate of change of orientation (Angular velocity) - Whenever the orientation changes the gyroscope sensor reports. The output is three numbers representing angular acceleration along x, y and z axes. The unit is radians per second.
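For illustration, here is the classic low-pass-filter approach to separating gravity from the raw accelerometer signal. It is only a sketch (the alpha constant and field names are my own choices, not anything fixed by the SDK), and TYPE_LINEAR_ACCELERATION normally does this job better for you:

// Sketch: remove gravity from raw accelerometer data with a simple low-pass filter.
// alpha (~0.8) is a tuning assumption; gravity/linearAccel are fields of the listener.
private final float[] gravity = new float[3];
private final float[] linearAccel = new float[3];

@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() != Sensor.TYPE_ACCELEROMETER) return;
    final float alpha = 0.8f;
    for (int i = 0; i < 3; i++) {
        gravity[i] = alpha * gravity[i] + (1 - alpha) * event.values[i]; // low-pass: isolate gravity
        linearAccel[i] = event.values[i] - gravity[i];                   // subtract gravity
    }
}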
The output of the gyroscope is not accurate in the long term, and the output of the accelerometer is not accurate in the short term, so combine them to get a steady output. For details see this question.
Now it is clear that, at a minimum, the gyroscope and accelerometer are required. However, using a wider range of sensors minimizes our work.
You can't decide what you get - each sensor's data is already defined, and you get all of it or nothing. If you look closely, there isn't a place in the public API that would let you ask for specific components only.
To back this up, here's a quote from Google's document explaining sensor types:
An accelerometer sensor reports the acceleration of the device along the 3 sensor axes. The measured acceleration includes both the physical acceleration (change of velocity) and the gravity. The measurement is reported in the x, y and z fields of sensors_event_t.acceleration.
If you look into the Android source, the structs here are strictly defined, and the struct for acceleration contains specific fields. So even if you got 0 in the fields you don't care about, you wouldn't gain anything.
But what you're referring to are two things - base sensors, which are roughly equivalent to physical sensors on the device, and composite sensors, which combine readings from various physical sensors to get more useful data.
So while you can't decide what you get for a particular sensor (like "only gravity" or "only acceleration in the Y axis"), composite sensors do give you data that you could compute yourself using only base sensors. Linear acceleration, for instance, is a composition of data from the accelerometer and gyroscope (or magnetic sensor), after some calculations. Similarly, the step detector "sensor" uses only the accelerometer, but interprets the data automatically to just give you an event saying "yes, someone has made a step", with a single value of 1.
If you're feeding raw motion data to some algorithms, I would guess base sensors are what you're looking for. That said, I believe you can still safely register for all the sensors (both base and composite) that combined give you all the data you need (and maybe more), without a meaningful battery impact.
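If it helps, registering one listener for a mix of base and composite sensors is just repeated registerListener() calls; a minimal sketch, assuming this code runs in an Activity (or other Context) that implements SensorEventListener:

// Sketch: subscribe to several base and composite sensors with a single listener.
SensorManager sm = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
int[] types = {
        Sensor.TYPE_ACCELEROMETER,        // base
        Sensor.TYPE_GYROSCOPE,            // base
        Sensor.TYPE_LINEAR_ACCELERATION,  // composite
        Sensor.TYPE_ROTATION_VECTOR       // composite
};
for (int type : types) {
    Sensor sensor = sm.getDefaultSensor(type);
    if (sensor != null) {                 // composite sensors may be absent on some devices
        sm.registerListener(this, sensor, SensorManager.SENSOR_DELAY_GAME);
    }
}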
For more detailed information on each of the sensors, refer to Sensor types on the Android website; if you're curious, you can read a short summary of the sensor stack as well.
No, you don't need every sensor. Some of the sensors exist as a convenience to the developer. Your example of the linear acceleration sensor is one - it gives you the accelerometer results with gravity taken out. You could do this yourself from the raw accelerometer data, but that takes a bit of math (you need to subtract the gravity vector over all 3 axes) and a bit of know-how (did you remember to calibrate the sensor? It may not read 9.8 at rest. For that matter, 9.8 may not be your gravity if you're not at sea level). That's a lot of work that would need to be repeated by each app, so they created a software "sensor" that sits on top of the accelerometer and provides the computed data. It would be unusual for an app to use the raw and linear accelerometers at the same time; generally it's one or the other. The step counter is another example of this: it guesses at what a step is based on the accelerometer data. Likewise, you wouldn't want calibrated and uncalibrated gyroscope data side by side.
As for what you do need - no clue; you don't say enough about what you're trying to do. One warning though: you said you're trying to detect motion. You can't do that. You can detect accelerations and rotations; you cannot detect motion at a constant speed. If you're developing any type of app using these sensors, it pays to use the correct terminology and to think in terms of physics and how the physical accelerometer and gyroscope work, otherwise you're going to cause yourself bugs.

Accelerometer vs Gravity sensor

Does the gravity sensor return correct values if the device is in motion? I thought that the gravity sensor uses the accelerometer to recognize the direction of gravity. Are these two sensors different pieces of hardware?
The Gravity sensor is what Android calls a 'software sensor' and calculates its values using more than one hardware sensor.
The software Gravity sensor is only available if the device has a gyroscope.
By combining accelerometer data with gyroscope data, the acceleration due to moving the device can be filtered out, leaving the pure gravity signal. So yes, it will return the right values under motion.
Thus, the Gravity sensor gives a much better signal for device orientation than the accelerometer on its own.
Combining sensor values like this is called sensor fusion, and it is important for high-quality measurements.
The Android Documentation describes the Gravity Sensor.
Unfortunately, many Android devices lack a gyroscope and thus have no Gravity sensor either. This leaves you with a suboptimal signal from the accelerometer alone, giving a lower-quality user experience compared to Android devices with both sensors, and compared to iOS devices.
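If you would rather handle this at runtime than block installs, a minimal sketch of a fallback (assumes an Activity implementing SensorEventListener):

// Sketch: use the composite Gravity sensor when present, otherwise fall back to the accelerometer.
SensorManager sm = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
Sensor gravitySensor = sm.getDefaultSensor(Sensor.TYPE_GRAVITY);  // may be null on gyro-less devices
Sensor source = (gravitySensor != null)
        ? gravitySensor
        : sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
sm.registerListener(this, source, SensorManager.SENSOR_DELAY_UI);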
You can block installations on incompatible devices by using a Google Play requirement specification as follows:
<uses-feature android:name="android.hardware.sensor.gyroscope" />
Albert Einstein answered this question in 1911
A little reflection will show that the law of the equality of the inertial and gravitational mass is equivalent to the assertion that the acceleration imparted to a body by a gravitational field is independent of the nature of the body. For Newton's equation of motion in a gravitational field, written out in full, it is:
(Inertial mass) × (Acceleration) = (Intensity of the gravitational field) × (Gravitational mass).
It is only when there is numerical equality between the inertial and gravitational mass that the acceleration is independent of the nature of the body.
— Albert Einstein
We cannot differentiate the measurement of acceleration from the measurement of gravity, since they are equivalent to the observer. Even if the Android device had two separate sensors, they would both be measuring the same thing.

Does Android Rotation Vector Sensor use the magnetometer readings?

As is known, the so-called "Rotation Vector Sensor" is a virtual sensor.
I know for sure that it takes the readings from the accelerometer and gyroscope for its computation.
However, I now wish to know whether the magnetometer readings are also taken into account in the calculation of the Rotation Vector Sensor.
The documentation describes the relation as follows:
The rotation vector sensor reports the orientation of the device relative to the East-North-Up coordinates frame.
The underlying physical sensors are Accelerometer, Magnetometer, and Gyroscope, while the latter is used as the main orientation change input.
Furthermore source.android.com says
This sensor also uses accelerometer and magnetometer input to make up for gyroscope drift, and it cannot be implemented using only the accelerometer and magnetometer.
In contrast, the Game rotation vector explicitly uses only the accelerometer and gyroscope. I asked a similar question about how the data is actually fused, which is yet to be answered.
The link also directs to this table
Yes - the rotation vector is a combination of the accelerometer, the magnetometer, and sometimes the gyroscope, used to determine the three-dimensional angles at which the Android device lies with respect to the Earth frame coordinates.
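One way to convince yourself is to register for both TYPE_ROTATION_VECTOR and TYPE_GAME_ROTATION_VECTOR (API 18+) and compare the resulting azimuths; the game variant has no magnetometer input, so its heading uses an arbitrary reference and drifts. A rough sketch:

// Sketch: log the azimuth from the magnetometer-backed and magnetometer-free rotation vectors.
@Override
public void onSensorChanged(SensorEvent event) {
    float[] rotationMatrix = new float[9];
    float[] orientation = new float[3];
    SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);
    SensorManager.getOrientation(rotationMatrix, orientation);
    double azimuthDeg = Math.toDegrees(orientation[0]);

    if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR) {
        Log.d("Fusion", "rotation vector azimuth (north-referenced): " + azimuthDeg);
    } else if (event.sensor.getType() == Sensor.TYPE_GAME_ROTATION_VECTOR) {
        Log.d("Fusion", "game rotation vector azimuth (arbitrary reference): " + azimuthDeg);
    }
}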

Mobile phone sensors

After some hours of searching I'm quite confused, so I'll now write down what I have learned; please correct me if I'm wrong:
Light sensor: measures the surrounding light.
Magnetic sensor: I can get the direction of north.
Accelerometer sensor: gives acceleration along X, Y and Z, including earth's gravity (or whatever acceleration is applied).
Proximity sensor: works just like car parking sensors.
Orientation sensor: tells the X, Y and Z angles from their axes.
I've tried those sensors using some apps from the Android Market like "My sensors", and I can confirm that the accelerometer sensor can't tell when you rotate your phone in place while it lies flat on a table, while the orientation sensor seems to catch all the moves. So I conclude that a compass app uses the magnetic sensor to find north, and then the orientation sensor to know where you are heading with your device, right?
Switching between portrait and landscape modes uses the accelerometer sensor and checks which axis earth's gravity lies along, right?
Q1: If everything above is correct, what is the gyroscope? Is it the same as the orientation sensor?
Q2: Is the orientation sensor available on most devices?
Q3: What are other uses of the orientation sensor?
Q4: Why do most websites, even Wikipedia, say that the orientation sensor == the accelerometer sensor?
-Rami
Ok, first the easy ones...
About Q1: the gyroscope measures angular velocity (radians/second), while orientation is a different quantity - it tells us how the device is positioned (I don't really know how to explain something so basic in English).
And about Q2, I would say yes, 3-axis orientation is available on most smartphones, at least those running Android.
Q3: The compass actually uses the orientation values. The magnetic field sensor measures that magnitude (not in degrees), though you can probably calculate the compass values from the magnetic field. Another use... well, you can tell whether the device is lying face up or face down, for example.
About Q4, this is more difficult. I'm not that much of an expert in accelerometers, but I think most of these "sensors" use the same hardware sensor, which measures the magnetic field and makes the necessary calculations; but I insist, it may be better to read more detailed and technical information.
If you notice, the OrientationSensor is now deprecated, and this is written in the docs:
Note: This sensor type exists for legacy reasons, please use getRotationMatrix() in conjunction with remapCoordinateSystem() and getOrientation() to compute these values instead.
So yes, it seems it calculates the orientation values through the accelerometer; but still, orientation is given in degrees and acceleration in m/s^2 - different magnitudes. As I told you, I think they measure different magnitudes with the same sensor, and that's why they present different kinds of events in the API.
I hope I haven't made any huge mistakes, because, well, I would also like to clarify some concepts regarding these sensors.
mdelolmo is perfectly right.
I would like to add the following:
About Q4: everyone refers to the orientation sensor as the accelerometer because smartphones use the accelerometer for the "auto-rotate" feature. The switching between portrait and landscape modes (often called "orientation" in layman's terms) is done by determining along which axis of the phone gravity lies, and this is done by the accelerometer IC (a toy sketch below illustrates the idea).
The orientation sensor (in Android) uses the accelerometer data and the magnetic data to determine the exact positioning of the device, i.e. the angles it makes with all three axes: azimuth (or yaw), pitch and roll.
The gyroscope provides the angular velocity of the device. It is NOT the orientation sensor.
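To make the auto-rotate point concrete, here is a toy sketch (my own simplification, not how the framework actually implements it) that checks which phone axis gravity dominates in a single accelerometer event:

// Toy sketch: guess portrait vs. landscape from which axis carries most of gravity.
// Real auto-rotate logic adds thresholds and hysteresis; this only shows the idea.
@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() != Sensor.TYPE_ACCELEROMETER) return;
    float x = event.values[0];
    float y = event.values[1];
    if (Math.abs(y) > Math.abs(x)) {
        Log.d("AutoRotate", "gravity mostly along Y -> portrait");
    } else {
        Log.d("AutoRotate", "gravity mostly along X -> landscape");
    }
}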
I haven't worked on Android phones but may in the future. Accelerometers usually detect acceleration through the motion of a proof mass. They can be used to roughly orient a device because they sense the g-vector, but any rotation about the g-vector is completely unconstrained. Accelerometers could in principle be used as gyroscopes, but they are not used that way because they would need to be far better than any accelerometer currently made to sense rotation via centripetal acceleration.
Gyroscopes measure either angular rate or angle directly. Most measure angular rate, and the rate is integrated to get the angle, so they can be used to measure orientation; but since they are inertial sensors they drift, and so do not provide an absolute orientation. They are, however, excellent sensors for relative rotations, i.e. orientation with respect to a very recent orientation. I hope this helps.
The magnetic sensor measures the direction of the earth's magnetic field (magnetic north). Combining that north direction with the g-vector from the accelerometers gives full orientation information, because it breaks the symmetry of orientation about the g-vector. This is only really true when the phone/sensors are not moving. Since I do not know how this is implemented in the phone I can't say much else, but the fact that you need accelerometers plus another reference to get full orientation may be the reason there is confusion about this subject.
Barometers measure pressure and can be good sensors for measuring changes in height, but they can be fooled by active air-moving systems such as those found in air-conditioned homes and forced-hot-air heating systems.
If you are not moving and your gyroscopes are sensitive enough to measure the earth's rotation rate, you can do something called gyrocompassing, where the gyroscopes and accelerometers become analytically or physically aligned with the local-level coordinate system. This is how higher-grade gyroscopes and accelerometers measure orientation in systems like aircraft, spacecraft and ships/submarines. There are many complications, but this is the basic idea.

Android Sensors

I have a very basic question about Sensors:
Do magnetic sensors return readings w.r.t the phone's initial orientation or w.r.t the world coordinates?
What about accelerometers? Do they return values w.r.t their previous readings or is each value an independent acceleration relative to the world coordinate system?
I know that gyros return readings relative to the phone's initial orientation. So how do I convert the yaw, pitch and roll readings from a gyro into the azimuth, pitch and roll readings of the magnetic sensor of a smartphone (I'm using an HTC Hero)?
Thanks!
As mentioned, the gyroscope measures the angular velocity.
The third value returned (values[2]) is the angular velocity about the z axis. You can use this value together with the initial heading from the magnetometer to calculate the current heading: Theta(i+1) = Theta(i) + Wgyro * deltaT.
You can get the initial heading from the 'Orientation' measurement (values[0]).
That measurement depends only on the magnetometer (you can put a magnet or a second phone close to the smartphone and watch the output go crazy).
The second and third values of the 'Orientation' output depend on the readings of the accelerometer. Since the accelerometer measures gravity, it is possible to calculate the pitch and roll angles from the accelerometer readings on the Y and X axes.
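A rough sketch of that integration, using the event timestamp (in nanoseconds) for deltaT; the field names are mine, and headingRad would be seeded once from the magnetometer-based orientation (values[0]):

// Sketch: integrate the gyro's z-axis angular velocity to track heading.
private float headingRad;           // Theta, seeded from the initial orientation reading
private long lastTimestampNs = 0;   // timestamp of the previous gyro event

@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() != Sensor.TYPE_GYROSCOPE) return;
    if (lastTimestampNs != 0) {
        float deltaT = (event.timestamp - lastTimestampNs) * 1.0e-9f;   // ns -> s
        headingRad += event.values[2] * deltaT;   // Theta(i+1) = Theta(i) + Wgyro * deltaT
    }
    lastTimestampNs = event.timestamp;
}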
Hope this helps
Ariel
Android sensors (up to Froyo) provide the application with "raw" data; there is a bare minimum of "cooking" (i.e. processing) involved.
The accelerometer and compass devices provide absolute acceleration and magnetic data respectively. The gyroscope provides relative angular velocity; gyroscopes do NOT provide data relative to any specific state/position.
What you need to understand is that gyroscope data is angular velocity, which is simply how fast the phone is rotating (reported in radians per second on Android). So while you hold the phone still, the gyro reads (0,0,0); as you rotate it, you get how fast it is rotating; and once you hold it still again, the gyro reading returns to (0,0,0).
Theoretically the gyro can be used to "calibrate" the compass, but doing so would require a lot of experimentation on your part. The ideal place to fiddle around would be the sensor HAL.
NOTE: You would need to turn ON all the sensor hardware even if ONLY compass data is required, because you will be cross-referencing the gyro/accel data for that. This means larger power consumption and much poorer battery life; with all the sensors turned on continuously, a standard Android phone's battery can drain in 4-5 hours.
You can read more about Android sensors here.
