The Android documentation specifies the third parameter as
float[] gravity
and then states
[0 0 g] = R * gravity (g = magnitude of gravity)
Now, in most of the examples online I see everyone passing accelerometer values to getRotationMatrix, but shouldn't I be passing only gravity values?
For example, if the mobile phone has a gravity sensor,
should I pass its raw output to getRotationMatrix?
If it doesn't have one, should I pass accelerometer values? Should I extract the non-gravity components first (since accelerometer values are acceleration minus g)?
Will gravity sensor values be more reliable than accelerometer values on phones that have that sensor?
Thanks in advance! Guillermo.
I think the reason you only see examples using the accelerometer values is that the gravity sensor was only introduced in API 9, and also that many phones don't provide these values separately from the accelerometer values, or don't have the sensor at all.
Another reason would be that in most cases the results tend to be the same: what the accelerometer sensor outputs is the device's linear acceleration plus gravity, but most of the time the phone will be standing still or moving at a constant velocity, so the device's acceleration will be zero.
From the getRotationMatrix Android docs:
The matrices returned by this function are meaningful only when the device is not free-falling and it is not close to the magnetic north. If the device is accelerating, or placed into a strong magnetic field, the returned matrices may be inaccurate.
Now, you're asking if the gravity data would be more reliable? Well, there is nothing like testing, but I suppose it wouldn't make much difference, and it really depends on your application. Also, obtaining plain gravity values is not trivial and requires filtering, so you could end up with noisy results.
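For completeness, here is a minimal sketch of the fallback this implies: use TYPE_GRAVITY when the device has it, otherwise the accelerometer. The context, listener, gravity and geomagnetic variables are assumed to exist; this is an illustration, not the poster's code.

    // Sketch: prefer the gravity sensor for getRotationMatrix's third
    // parameter when available, fall back to the accelerometer otherwise.
    SensorManager sm = (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
    Sensor gravitySensor = sm.getDefaultSensor(Sensor.TYPE_GRAVITY); // null if absent
    Sensor source = (gravitySensor != null)
            ? gravitySensor
            : sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
    sm.registerListener(listener, source, SensorManager.SENSOR_DELAY_UI);

    // Later, with the latest readings from `source` (gravity) and the
    // magnetometer (geomagnetic), both float[3]:
    float[] R = new float[9];
    float[] I = new float[9];
    boolean ok = SensorManager.getRotationMatrix(R, I, gravity, geomagnetic);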
I have written a simple Activity which is a SensorEventListener for Sensor.TYPE_ACCELEROMETER.
In my onSensorChanged(SensorEvent event) I just take the values in X, Y, Z form and write them to a file.
Appended to each X, Y, Z triple is a label, specific to the activity I am performing,
so each row is X,Y,Z,label.
This is how I obtain my activity profile. I would like suggestions on what operations to perform after data collection so as to remove noise and get the best data for each activity.
The main intent of this data collection is to build a user activity detection application using a neural network library (NeuroPh for Android).
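For reference, a minimal sketch of the logging setup described above (the label, file name and class name are placeholders of mine, not from the original post):

    import java.io.File;
    import java.io.FileWriter;
    import java.io.IOException;
    import java.io.PrintWriter;

    import android.app.Activity;
    import android.hardware.Sensor;
    import android.hardware.SensorEvent;
    import android.hardware.SensorEventListener;
    import android.hardware.SensorManager;
    import android.os.Bundle;

    // Sketch: log each accelerometer sample as an "X,Y,Z,label" CSV row.
    public class ActivityLogger extends Activity implements SensorEventListener {
        private static final String LABEL = "walking"; // hypothetical activity label
        private PrintWriter out;

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            try {
                out = new PrintWriter(new FileWriter(new File(getFilesDir(), "profile.csv"), true));
            } catch (IOException e) {
                throw new RuntimeException(e);
            }
            SensorManager sm = (SensorManager) getSystemService(SENSOR_SERVICE);
            sm.registerListener(this, sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),
                    SensorManager.SENSOR_DELAY_GAME);
        }

        @Override
        public void onSensorChanged(SensorEvent event) {
            // event.values holds X, Y, Z in m/s^2
            out.println(event.values[0] + "," + event.values[1] + ","
                    + event.values[2] + "," + LABEL);
        }

        @Override
        public void onAccuracyChanged(Sensor sensor, int accuracy) {
            // not needed for logging
        }
    }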
Just for fun I wrote a pedometer a few weeks ago, and it would have been able to detect the three activities that you mentioned. I'd make the following observations:

In addition to Sensor.TYPE_ACCELEROMETER, Android also has Sensor.TYPE_GRAVITY and Sensor.TYPE_LINEAR_ACCELERATION. If you log the values of all three, you notice that the values of TYPE_ACCELEROMETER are always equal to the sum of the values of TYPE_GRAVITY and TYPE_LINEAR_ACCELERATION. The onSensorChanged(…) method first gives you TYPE_ACCELEROMETER, followed by TYPE_GRAVITY and TYPE_LINEAR_ACCELERATION, which are the results of its internal methodology for splitting the accelerometer readings into gravity and the acceleration that's not due to gravity. Given that you're interested in the acceleration due to activities rather than the acceleration due to gravity, you may find TYPE_LINEAR_ACCELERATION is better for what you need.

Whatever sensors you use, the X, Y, Z that you're measuring will depend on the orientation of the device. However, for detecting the activities that you mention, the result can't depend on e.g. whether the user is holding the device in a portrait or landscape position, or whether the device is flat or vertical, so the individual values of X, Y and Z won't be any use. Instead you'll have to look at the length of the vector, i.e. sqrt(X*X + Y*Y + Z*Z), which is independent of the device orientation.

You only need to smooth the data if you're feeding it into something which is sensitive to noise. Instead, I'd say that the data is the data, and you'll get the best results if you use mechanisms which aren't sensitive to noise and hence don't need the data to be smoothed. By definition, smoothing is discarding data. You want to design an algorithm that takes noisy data in at one end and outputs the current activity at the other end, so don't prejudge whether it's necessary to include smoothing as part of that algorithm.

Here is a graph of sqrt(X*X + Y*Y + Z*Z) from Sensor.TYPE_ACCELEROMETER which I recorded when I was building my pedometer. The graph shows the readings measured when I walked for 100 steps. The green line is sqrt(X*X + Y*Y + Z*Z), the blue line is an exponentially weighted moving average of the green line which gives me the average level of the green line, and the red line shows my algorithm counting steps. I was able to count the steps just by looking for the maximums and minimums and where the green line crosses the blue line. I didn't use any smoothing or Fast Fourier Transforms. In my experience, for this sort of thing the simplest algorithms often work best, because although complex ones might work in some situations it's harder to predict how they'll behave in all situations. And robustness is a vital characteristic of any algorithm :-).
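As an illustration of the magnitude-plus-moving-average idea, here is a rough reconstruction; the smoothing factor and the crossing rule are my own guesses, not the answerer's actual algorithm:

    import android.hardware.SensorManager;

    // Sketch: orientation-independent magnitude smoothed by an exponentially
    // weighted moving average; count a step on each upward crossing.
    public class StepSignal {
        private static final double ALPHA = 0.1; // smoothing factor (assumed)
        private double average = SensorManager.GRAVITY_EARTH; // start near 1 g
        private boolean above = false;
        private int steps = 0;

        // Feed one accelerometer sample; returns the running step count.
        public int onSample(float x, float y, float z) {
            double magnitude = Math.sqrt(x * x + y * y + z * z);
            average = ALPHA * magnitude + (1 - ALPHA) * average;
            boolean nowAbove = magnitude > average;
            if (nowAbove && !above) {
                steps++; // one step per crossing of the average level
            }
            above = nowAbove;
            return steps;
        }
    }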
This sounds like an interesting problem!
Have you plotted your data against time to get a feel for it, to see what kind of noise you are dealing with, and to help decide how you might pre-process your data for input to the detector?
    A
    ^
    |
    |
    |
    +----------------> time
I'd start with lines for each activity:
|Ax + Ay + Az|
|Vx + Vy + Vz| (approximate by calculating the area of the trapezoids formed by your data points; see the sketch below)
... etc
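Here's a sketch of that trapezoid approximation, under the assumption that you feed it acceleration samples with timestamps in seconds (all names are mine):

    // Sketch: accumulate the integral of acceleration (a velocity estimate)
    // as the area of trapezoids formed by successive data points.
    public class TrapezoidIntegrator {
        private double lastValue = Double.NaN;
        private double lastTime = Double.NaN;
        private double integral = 0.0;

        // value: acceleration sample, time: seconds; returns the running integral.
        public double add(double value, double time) {
            if (!Double.isNaN(lastTime)) {
                integral += 0.5 * (value + lastValue) * (time - lastTime);
            }
            lastValue = value;
            lastTime = time;
            return integral;
        }
    }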
Maybe you can work out the orientation of the phone by attempting to detect gravity, then rotate your vectors to a 'standard' orientation (e.g. positive Z axis = up). If you can do that, then the different axes may become more meaningful. For example, walking (phone in pocket) would tend to show velocity in the horizontal plane, which might be distinguished from walking (phone in hand) by motion in the vertical plane.
As for filters, if the data appears noisy, a simple starting point is to apply a moving average to smooth it. This is a common technique for sensor data in general:
https://en.wikipedia.org/wiki/Moving_average
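A simple moving average over a fixed window might look like this (the window size is up to you; this is only a sketch):

    // Sketch: simple moving average over the last `size` samples.
    public class MovingAverage {
        private final double[] window;
        private int count = 0, next = 0;
        private double sum = 0.0;

        public MovingAverage(int size) {
            window = new double[size];
        }

        // Add a sample and return the current average.
        public double add(double v) {
            if (count == window.length) {
                sum -= window[next]; // drop the oldest sample
            } else {
                count++;
            }
            window[next] = v;
            sum += v;
            next = (next + 1) % window.length;
            return sum / count;
        }
    }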
Also, this post seems relevant to your question:
How to remove Gravity factor from Accelerometer readings in Android 3-axis accelerometer
Things I have identified:
1. The data has to be preprocessed as you need it to be; in my case I just want 3 inputs and one output.
2. The data has to be subjected to smoothing (five-point smoothing or any other technique that suits you best) so that noise gets filtered out (though not completely). A moving average is one such technique (Reference).
3. Linearized data would be good, because you don't know how the data was sampled; use interpolation to help you linearize the data.
4. Finally, use an FFT (Fast Fourier Transform) to extract the recipe out of the dish, that is, to extract features from your dataset!
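Android has no built-in FFT, so purely as an illustration of the feature-extraction step, here is a naive O(n^2) DFT magnitude computation; in practice you'd use a real FFT library:

    // Sketch: magnitude spectrum of a window of samples via a naive DFT,
    // a stand-in for a proper FFT; each bin is a candidate feature.
    public static double[] magnitudeSpectrum(double[] samples) {
        int n = samples.length;
        double[] magnitudes = new double[n / 2]; // bins up to Nyquist
        for (int k = 0; k < n / 2; k++) {
            double re = 0.0, im = 0.0;
            for (int t = 0; t < n; t++) {
                double angle = 2.0 * Math.PI * k * t / n;
                re += samples[t] * Math.cos(angle);
                im -= samples[t] * Math.sin(angle);
            }
            magnitudes[k] = Math.sqrt(re * re + im * im);
        }
        return magnitudes;
    }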
For an Android application, I need to get magnetic field measurements along the axes of the global (world) coordinate system. Here is how I'm planning (guessing) to implement this. Please correct me if necessary. Also, please note that the question is about the algorithmic part of the task, not about the Android sensor APIs; I have experience with the latter.
The first step is to obtain TYPE_MAGNETIC_FIELD sensor data (M) and TYPE_ACCELEROMETER sensor data (G). The second is supposed to be used according to Android's documentation, but I'm not sure whether it shouldn't be TYPE_GRAVITY instead (again as G), because the accelerometer does not seem to provide pure gravity.
The next step is to get the rotation matrices via getRotationMatrix(R, I, G, M), where R and I are the rotation and inclination matrices respectively.
And now comes the most questionable part: in order to convert the M vector into the world's coordinate system, I suppose I should multiply [R * I] * M.
I'm not sure this is a correct way of transforming the magnetic field reading into another basis. Also, I don't know whether remapCoordinateSystem should be used in addition to, or as a replacement for, any of the above.
If there is some source code which does this already, I'd appreciate a link, but I don't want to use big general-purpose libraries (for example, for augmented reality support) for this specific task, because I'd like to keep it as simple as possible.
P.S.
I decided to add some information to the original post for clarity.
Let us suppose a device rests on a table and continuously reads data from its magnetic sensor. Each measurement contains 3 values, representing the magnetic field along the X, Y, Z axes of the device's local coordinate system. I take it that I can neglect environmental field fluctuations (smoothed by a low-pass filter), so these 3 values should remain almost the same for as long as the device stays in place. If we rotate the device around any axis, the values change, because we have changed the local coordinate system; but the field itself has not actually changed. So I want to translate the local X, Y, Z field measurements into X', Y', Z' such that they keep their respective values regardless of device rotation, provided that the device is not moved from its location (only rotated).
I've implemented the algorithm described above and got regular and noticeable changes in the X', Y', Z' values obtained through the suggested transformations, so there is something wrong with it.
P.P.S.
Incidentally, I've found an exact duplicate of my question here on SO - How can I get the magnetic field vector, independent of the device rotation? - but unfortunately the answer contains my own suggestions, and the OP of that question confirms that they do not work.
The coordinates of M with respect to the world coordinate system are given by the multiplication R * M.
The rotation matrix R is mathematically the change-of-basis matrix from the device coordinates to the world coordinates.
Let X, Y, Z be the device coordinate basis and W_1, W_2, W_3 be the world coordinate basis; then
M = m_1 X + m_2 Y + m_3 Z
and also
M = c_1 W_1 + c_2 W_2 + c_3 W_3
where R * (m_1, m_2, m_3)^T = (c_1, c_2, c_3)^T.
A low-pass filter is only used to filter out accelerations in the X, Y directions. remapCoordinateSystem is used to change the order of the basis, i.e. changing from (W_1, W_2, W_3) to (W_1, W_3, W_2).
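In code, the R * M multiplication might look like the sketch below, assuming gravity and geomagnetic are float[3] arrays holding the latest readings from the respective sensors:

    // Sketch: rotate the magnetometer vector M into world coordinates as R * M.
    float[] R = new float[9];
    float[] I = new float[9];
    if (SensorManager.getRotationMatrix(R, I, gravity, geomagnetic)) {
        // R is a row-major 3x3 matrix, so worldM[row] = R[row] . M
        float[] worldM = new float[3];
        for (int row = 0; row < 3; row++) {
            worldM[row] = R[3 * row] * geomagnetic[0]
                        + R[3 * row + 1] * geomagnetic[1]
                        + R[3 * row + 2] * geomagnetic[2];
        }
        // worldM now holds the field expressed in the world basis
    }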
The magnetometer sensor on your device returns a 3-vector in device coordinates. You can use getRotationMatrix() to get a matrix that could be used to convert that device-coordinates vector to world coordinates. You could also learn about quaternions and use TYPE_ROTATION_VECTOR directly. However, there's no quaternion library in Android (that I know of), and that's a discussion beyond the scope of this question.
However, none of this will do you any good because the device orientation information is based in part on the value from the magnetometers. In other words, the device will always tell you that the magnetic vector is facing exactly North.
Now, what you can do is get the magnetic dip. This is one of the outputs from getRotationMatrix(), although you'll have to convert a matrix to an angle for it to be useful. That, too, is beyond the scope of this question.
Finally, your last option is to build a table which is level and which has an arrow on it pointing true north. (You'll have to align it by the stars at night or something.) Then, place your device flat on the table with the top of the device facing north. In this case, device coordinates will be the same as world coordinates and the magnetometer sensor will produce the values you want.
Your comments indicate that you're interested in local variations. There's simply no way to get true north with your Android device alone. Theoretically, you could build a table as I described, and then walk around holding the device in strictly the same orientation as before, keeping an eye on the table for reference. I doubt you could pull it off, though.
You could try using gyros in your app to help you keep the device oriented exactly the same way at all times, but the gyros in any Android device you use are likely to drift too much for this to work.
Or perhaps we still don't understand what you're trying to do. Bottom line, though, is that you simply cannot get a global coordinate system with an Android device alone -- whatever you get will always be aligned with the local magnetic field at that exact spot.
What is the difference between gravity and acceleration sensors in Android? From my point of view the physical value is the same in both cases.
Which one measures the force acting on unit mass inside the device?
ADDITION
The question is: what physical quantity is measured by these sensors? According to the equivalence principle, acceleration and gravity are indistinguishable, and the only way to measure both is with an ordinary (but 3D) spring balance.
The acceleration sensor gives you back the sum of all forces applied to your device, while the gravity sensor returns only the influence of gravity. If you want to exclude gravity from the acceleration, you can use a high-pass filter, or just subtract the gravity sensor values from the acceleration sensor values; I'm not sure which method gives better precision.
Another method is to use Sensor.TYPE_LINEAR_ACCELERATION, which gives exactly (acceleration - gravity); however, you should check that it's available on the device. I've found a few devices which have a working accelerometer but no response from the GRAVITY or LINEAR_ACCELERATION sensors.
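A sketch of that availability check (the context and listener variables are assumed):

    // Sketch: prefer TYPE_LINEAR_ACCELERATION if present, otherwise fall
    // back to the raw accelerometer and remove gravity yourself.
    SensorManager sm = (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
    Sensor linear = sm.getDefaultSensor(Sensor.TYPE_LINEAR_ACCELERATION); // null if absent
    if (linear != null) {
        sm.registerListener(listener, linear, SensorManager.SENSOR_DELAY_UI);
    } else {
        // fall back: subscribe to the accelerometer and subtract gravity
        // (via the gravity sensor or a low-pass filter) in onSensorChanged
        sm.registerListener(listener,
                sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),
                SensorManager.SENSOR_DELAY_UI);
    }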
This link might be helpful: http://www.sensorplatforms.com/which-sensors-in-android-gets-direct-input-what-are-virtual-sensors
The following excerpt summarizes the answer to your question:
"The list... includes both physical sensors and sensor types with values derived from physical sensors, sometimes these are called virtual sensors... The virtual sensors types (gravity, linear acceleration, and rotation vector) provide values derived by combining the results from physical sensors intelligently... The rotation vector is a combination of the accelerometer, the magnetometer, and sometimes the gyroscope to determine the three-dimensional angle along which the Android device lays with respect to the Earth frame coordinates. By knowing the rotation vector of a device, accelerometer data can be separated into gravity and linear acceleration."
This link https://stackoverflow.com/a/8238389/604964 also says that the gravity values are computed using a Butterworth filter.
Can anyone help with removing the g factor from accelerometer readings?
I am using a SensorEventListener with the onSensorChanged() method to get Sensor.TYPE_ACCELEROMETER data. I need only the pure acceleration values in all directions, so whenever the device is stationary (or moving at constant speed), it should read roughly (0.0, 0.0, 0.0).
Currently, depending on its pitch and roll, it gives me varying output reflecting the g force acting on each axis.
I hope there is some formula to remove this, as I also get orientation values (pitch and roll) from a Sensor.TYPE_ORIENTATION listener. I have tried some formulas, but they didn't work.
You can use a low-pass filter.
Do this for each of your sensor values:
g = 0.9 * g + 0.1 * v
Where v is your current sensor value and g is a global variable initially set to zero. Mind that you'll need as many g variables as you have axes.
With v = v - g you can eliminate the gravity factor from your sensor value.
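Put together, a sketch of that filter for all three axes (coefficients as above):

    // Sketch: low-pass filter estimates gravity; subtracting it high-passes
    // the signal, leaving the linear acceleration.
    private final float[] gravity = new float[3]; // one g per axis, starts at zero

    public float[] removeGravity(float[] v) {
        float[] linear = new float[3];
        for (int i = 0; i < 3; i++) {
            gravity[i] = 0.9f * gravity[i] + 0.1f * v[i]; // g = 0.9 * g + 0.1 * v
            linear[i] = v[i] - gravity[i];                // v = v - g
        }
        return linear;
    }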
Use Sensor.TYPE_LINEAR_ACCELERATION instead of Sensor.TYPE_ACCELEROMETER
Take a look at the following link.
http://developer.android.com/reference/android/hardware/SensorEvent.html
Just subtract out g (~9.8m/s^2) times the z direction of the rotation matrix.
Or to be more explicit about it, let
a = your accelerometer reading,
R = your rotation matrix (as a 9-long vector).
Then what you want is
(a[0]-g*R[6], a[1]-g*R[7], a[2]-g*R[8]).
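As a sketch, with a being the accelerometer reading and R the 9-element rotation matrix from SensorManager.getRotationMatrix():

    // Sketch: subtract gravity along the third row of the rotation matrix.
    float g = SensorManager.GRAVITY_EARTH; // ~9.8 m/s^2
    float[] linear = new float[] {
            a[0] - g * R[6],
            a[1] - g * R[7],
            a[2] - g * R[8]
    };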
Differentiating a function of time with respect to time rids you of the constant terms.
So by taking the derivative of the accelerometer's signal you'll get the "jerk", which you can then re-integrate in order to get the non-constant part of the acceleration you're looking for.
In layman's terms: take a sample from the accelerometer every second, and subtract the previous sample from it. If the answer is (very close to) zero, you're not accelerating relative to earth. If the result is non-zero, integrate it (in this case, multiply by one second) and you have your acceleration. A rough sketch of this follows the caveats below.
Two things, though:
- Look out for noise in the signal; round off your input.
- Don't expect hyper-accurate results from on-chip accelerometers. You can use them to detect shaking or changes in orientation, but not really for knowing how many g's you're experiencing while making sharp turns in your car.
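For what it's worth, a minimal sketch of the sample-and-difference idea; the threshold is a guess of mine, and as noted above you should expect noise:

    // Sketch: difference successive once-per-second samples; a near-zero
    // difference means "not accelerating relative to earth".
    public class JerkDetector {
        private static final double EPSILON = 0.05; // noise threshold (assumed)
        private double lastSample = Double.NaN;

        // Call once per second with the accelerometer magnitude.
        public boolean accelerationChanged(double sample) {
            boolean changed = !Double.isNaN(lastSample)
                    && Math.abs(sample - lastSample) > EPSILON;
            lastSample = sample;
            return changed;
        }
    }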
One way (for devices with only an accelerometer) is to remove the gravity vector from the accelerometer data by subtracting the values that would be read in the static case for the same orientation. But since orientation is itself calculated from acceleration readings rather than independently, it's not very accurate.
A gyroscope may help in this case, but few Android devices have a true gyroscope, and using its raw readings is not so simple.
You need to assume two coordinate systems:
1- a fixed global system;
2- a moving coordinate system whose origin moves and rotates as the sensor does.
In the global system, g is always parallel to the z axis, but in the moving system it is not.
So all you have to do is compute the 3x3 rotation matrix from the orientation angles, i.e.
yaw, pitch and roll (you can find the formulas everywhere),
then multiply this rotation matrix by the 3x1 acceleration vector measured by the sensor.
This will transform the coordinates and express the values in the fixed global system.
The only thing left afterwards is to simply subtract g from the z value.
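A sketch of those steps, assuming the convention R = Rz(yaw) * Ry(pitch) * Rx(roll) with angles in radians; check the convention your orientation source actually uses:

    // Sketch: build the rotation matrix from yaw/pitch/roll, rotate the
    // measured acceleration into the global frame, then subtract g from z.
    static double[][] multiply(double[][] a, double[][] b) {
        double[][] c = new double[3][3];
        for (int i = 0; i < 3; i++)
            for (int j = 0; j < 3; j++)
                for (int k = 0; k < 3; k++)
                    c[i][j] += a[i][k] * b[k][j];
        return c;
    }

    static double[] globalAcceleration(double[] acc, double yaw, double pitch, double roll) {
        double[][] rz = {{Math.cos(yaw), -Math.sin(yaw), 0},
                         {Math.sin(yaw),  Math.cos(yaw), 0},
                         {0, 0, 1}};
        double[][] ry = {{Math.cos(pitch), 0, Math.sin(pitch)},
                         {0, 1, 0},
                         {-Math.sin(pitch), 0, Math.cos(pitch)}};
        double[][] rx = {{1, 0, 0},
                         {0, Math.cos(roll), -Math.sin(roll)},
                         {0, Math.sin(roll), Math.cos(roll)}};
        double[][] r = multiply(rz, multiply(ry, rx));
        double[] global = new double[3];
        for (int i = 0; i < 3; i++)
            for (int k = 0; k < 3; k++)
                global[i] += r[i][k] * acc[k];
        global[2] -= 9.81; // subtract g from the global z value
        return global;
    }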
At present I am working on the HAL part of sensors in the Android SDK. We are using a 3-axis BMA-150 accelerometer sensor to get acceleration values with respect to the X, Y, Z axes. I want to know whether this sensor gives its output directly in SI units, by using some calibration technique or otherwise. I also noticed that in the sensor.c file they mention
720.0 LSG = 1G (9.8 m/s^2). What is the relation between LSG and the acceleration due to gravity?
What is meant by LSG?
Why do they multiply the accelerometer X, Y, Z output values by 9.8/720.0f? Please help with this part.
Thanks,
Vinay
Without knowing anything else about the device: 9.8 meters per second squared is the value of the gravitational acceleration at the Earth's surface. The equation you quote seems to be the definition of "LSG", and the only thing that makes sense to define in this context is the unit in which the output is provided. So the device will probably give an output of 720.0 on the axis that is vertical. Multiplying by 9.8/720.0 renormalizes the value to the SI unit (m/s^2).
(The average gravitational acceleration, denoted G (which explains the 1G), is 9.80665 m/s^2, but it varies by a few percent between the equator and the poles, and the device is probably unable to give more than two significant digits of precision anyway.)
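So the conversion the HAL performs is simply a scale factor; a sketch:

    // Sketch: convert raw BMA-150 counts to m/s^2, given 720.0 LSG = 1G.
    static float countsToSi(int rawCounts) {
        return rawCounts * 9.8f / 720.0f;
    }
    // e.g. countsToSi(720) == 9.8f (the vertical axis at rest)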