At present I am working on the HAL part of sensors in the Android SDK. We are using a 3-axis BMA-150 accelerometer to get acceleration values along the X, Y and Z axes. I want to know whether this sensor gives its output directly in SI units through some calibration technique, or not. I also noticed that in the sensor.c file they mention
720.0 LSG = 1G (9.8 m/s^2). What is the relation between LSG and the acceleration due to gravity?
What is meant by LSG?
Why are they multiplying the accelerometer's X, Y and Z output values by 9.8/720.0f? Please help with this part.
Thanks
Vinay
Without knowing anything about the device: 9.8 meters per second squared is the value of gravitational acceleration at the Earth's surface. The equation you quote seems to be the definition of "LSG", and the only thing that makes sense to define in this context is the unit in which the output is provided. So the device will probably output about 720.0 on the vertical axis when it is at rest. By multiplying by 9.8/720.0 you rescale the value to SI units (m/s^2).
(Standard gravitational acceleration, the "1G" in that line, is 9.80665 m/s^2, but it varies by about half a percent between the equator and the poles, and the device is probably unable to give more than two significant digits of precision anyway.)
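For illustration, a minimal sketch (in Java; the class name and the raw reading are just examples) of the conversion the HAL is doing:

    // Sketch of the raw-count -> m/s^2 conversion (values assumed).
    public class LsgToSi {
        static final float COUNTS_PER_G = 720.0f; // 720 LSG = 1 G, per the sensor.c comment
        static final float GRAVITY = 9.8f;        // m/s^2

        static float toMetersPerSecondSquared(float rawCounts) {
            return rawCounts * (GRAVITY / COUNTS_PER_G);
        }

        public static void main(String[] args) {
            // A device lying flat and at rest reports roughly 720 counts on the vertical axis.
            System.out.println(toMetersPerSecondSquared(720.0f)); // ~9.8, i.e. 1 G
        }
    }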
I have been working with Android's calibrated magnetometer for some time, feeding it into our algorithm for the rotation vector values in order to calculate the correct yaw/orientation relative to North. Setting aside the issue of not completely projecting the yaw onto a plane parallel with the ground to get a true yaw independent of pitch, we have noticed that even after we calibrate the magnetometer - using the calibrated magnetometer values and moving the phone in figure eights and other movements/orientations - the calibrated values seem to eventually try to recalibrate.
With this in mind, we decided to start looking specifically at the uncalibrated values given by Android within our JNI code. Within the struct "ASensorEvent" there is "uncalibrated_magnetic", which is the struct "AUncalibratedEvent" - all of this is defined in "android/sensor.h". I assumed that this would give me uncalibrated values; however, I was mistaken - at least on the devices I checked it on - and was given the supposedly calibrated values. Since in "sensor.h" the only sensor type enums that are explicitly defined are...
ASENSOR_TYPE_ACCELEROMETER = 1,
ASENSOR_TYPE_MAGNETIC_FIELD = 2,
ASENSOR_TYPE_GYROSCOPE = 4,
ASENSOR_TYPE_LIGHT = 5,
ASENSOR_TYPE_PROXIMITY = 8
...I decided to directly type in 14, assuming this would give me the uncalibrated magnetometer values, since this is the value associated with the magnetometer outside of JNI: http://developer.android.com/reference/android/hardware/Sensor.html#TYPE_MAGNETIC_FIELD
This gave the uncalibrated magnetometer values that corresponded with those outside of JNI.
So, at this point, we decided to plot the values given and we noticed something strange.
Here you can see that the x-axis shows the y-values and the y-axis the z-values given by the uncalibrated magnetometer - however, the choice of axes is irrelevant, since the effect can be seen across all axes. At the bottom left, you'll notice a "j" figure rotated roughly 150 degrees clockwise. These "j" figure values were from the beginning of data collection and lasted for around 20 seconds.
We haven't always seen this in our data collection, but we have seen it around 50% of the time. I really have no idea what this is. I assume it isn't some weird hard-iron offset, since I would expect such an offset to be close to the offset visible in the majority of the data, and I'd assume it isn't soft-iron skew, because the environment was consistently the same from at least one second in until the end of data collection (which lasted about 200 s), and sometimes was the same throughout the whole trace.
I guess we are starting to speculate that we are not truly getting uncalibrated/raw values.
Thanks in advance.
As written on http://developer.android.com/guide/topics/sensors/sensors_position.html#sensors-pos-magunc
"Factory calibration and temperature compensation are still applied to the magnetic field." Hope it helps!
What is the difference between gravity and acceleration sensors in Android? From my point of view the physical value is the same in both cases.
Which one measures the force acting on unit mass inside the device?
ADDITION
The question is: what physical quantity is measured by these sensors? According to the equivalence principle, acceleration and gravity are indistinguishable, and the only way to measure either is with an ordinary (but 3D) spring balance.
The acceleration sensor gives you back the sum of all forces applied to your device, while the gravity sensor returns only the influence of gravity. If you want to exclude gravity from the acceleration, you may use a high-pass filter or simply subtract the gravity sensor values from the acceleration sensor values -- I'm not sure which method gives better precision.
Another method could be to use Sensor.TYPE_LINEAR_ACCELERATION, which gives exactly (acceleration - gravity); however, you should check whether it's available on the device. I've found a few devices that have a working ACCELEROMETER sensor but give no response from the GRAVITY or LINEAR_ACCELERATION sensors.
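A minimal sketch of both ideas (class and method names are illustrative): prefer TYPE_LINEAR_ACCELERATION when the device reports it, otherwise fall back to subtracting the gravity sensor values from the accelerometer values:

    import android.hardware.Sensor;
    import android.hardware.SensorManager;

    // Sketch: pick a strategy for obtaining linear acceleration, depending on
    // which sensors the device actually exposes.
    public class LinearAccelStrategy {

        public static Sensor pickSensor(SensorManager sm) {
            Sensor linear = sm.getDefaultSensor(Sensor.TYPE_LINEAR_ACCELERATION);
            if (linear != null) {
                return linear; // already reports (acceleration - gravity)
            }
            // Fall back to the raw accelerometer; gravity must then be removed by hand.
            return sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
        }

        // Fallback: subtract the latest gravity sensor reading from the accelerometer reading.
        public static float[] subtractGravity(float[] accel, float[] gravity) {
            return new float[] {
                    accel[0] - gravity[0],
                    accel[1] - gravity[1],
                    accel[2] - gravity[2]
            };
        }
    }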
This link might be helpful: http://www.sensorplatforms.com/which-sensors-in-android-gets-direct-input-what-are-virtual-sensors
The following excerpt summarizes the answer to your question:
"The list... includes both physical sensors and sensor types with values derived from physical sensors, sometimes these are called virtual sensors... The virtual sensors types (gravity, linear acceleration, and rotation vector) provide values derived by combining the results from physical sensors intelligently... The rotation vector is a combination of the accelerometer, the magnetometer, and sometimes the gyroscope to determine the three-dimensional angle along which the Android device lays with respect to the Earth frame coordinates. By knowing the rotation vector of a device, accelerometer data can be separated into gravity and linear acceleration."
This link https://stackoverflow.com/a/8238389/604964 also says that the gravity values are computed using a Butterworth filter.
In the Android documentation, the third parameter of getRotationMatrix is specified as
float[] gravity
and then it is specified that
[0 0 g] = R * gravity (g = magnitude of gravity)
Now, in most of the examples online I see everyone passing accelerometer values to getRotationMatrix, but shouldn't I be passing only gravity values?
For example, if the mobile phone has a gravity sensor,
should I pass its raw output to getRotationMatrix?
If it doesn't have one, should I pass accelerometer values? Should I extract the non-gravity components first (since the accelerometer values are acceleration minus g)?
Will using gravity sensor values be more reliable than using accelerometer values on phones that have that sensor?
Thanks in advance! Guillermo.
I think the reason you only see examples using the accelerometer values is that the gravity sensor was only introduced in API 9, and also that many phones might not provide these values separately from the accelerometer values, or don't have the sensor at all, and so on.
Another reason is that in most cases the result tends to be the same, since what the accelerometer outputs is the device's linear acceleration plus gravity, and most of the time the phone will be standing still or moving at a constant velocity, so the device's linear acceleration will be zero.
From the getRotationMatrix Android docs:
The matrices returned by this function are meaningful only when the device is not free-falling and it is not close to the magnetic north. If the device is accelerating, or placed into a strong magnetic field, the returned matrices may be inaccurate.
Now, you're asking whether the gravity data would be more reliable? Well, there is nothing like testing, but I suppose it wouldn't make much difference, and it really depends on the application you have in mind. Also, obtaining plain gravity values is not trivial and requires filtering, so you could end up with noisy results.
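For reference, a minimal sketch of how either reading can be fed to getRotationMatrix (the two input arrays are assumed to hold the latest readings from the corresponding sensors; the helper class is illustrative):

    import android.hardware.SensorManager;

    // Sketch: build the rotation matrix from the latest gravity (or accelerometer)
    // and magnetometer readings.
    public class RotationMatrixHelper {

        /**
         * @param gravityOrAccel latest values from TYPE_GRAVITY if available,
         *                       otherwise from TYPE_ACCELEROMETER
         * @param geomagnetic    latest values from TYPE_MAGNETIC_FIELD
         * @return 3x3 rotation matrix as a 9-element array, or null if it could not be computed
         */
        public static float[] computeRotationMatrix(float[] gravityOrAccel, float[] geomagnetic) {
            float[] rotation = new float[9];
            float[] inclination = new float[9];
            boolean ok = SensorManager.getRotationMatrix(rotation, inclination,
                    gravityOrAccel, geomagnetic);
            return ok ? rotation : null;
        }
    }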
I am developing an app for Android, for which I need to know how to calculate the movement of the device in the vertical direction.
For example, the device is at rest (point A) and the user picks it up in his hand (point B); there is now a height change between point A and point B. How would I calculate that?
I have already gone through the articles about sensors and accelerometers, but I couldn't really find anything to help me with that. Anyone have any ideas?
If you integrate the acceleration twice you get position but the error is horrible. It is useless in practice. Here is an explanation why (Google Tech Talk) at 23:20. I highly recommend this video.
Now, if you do not need anything accurate, that is a different story. The linear acceleration is available after sensor fusion, as described in the video. See Sensor.TYPE_LINEAR_ACCELERATION at SensorEvent. I would first try a high-pass filter to detect a sudden increase in the linear acceleration along the vertical axis.
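A rough sketch of that idea (the smoothing factor and threshold are made up and would need tuning; the input is assumed to be the component of TYPE_LINEAR_ACCELERATION along the vertical axis, e.g. values[2] when the phone lies flat):

    // Sketch: high-pass filter the vertical linear acceleration and flag
    // sudden increases, e.g. the moment the phone is picked up.
    public class LiftDetector {
        private static final float ALPHA = 0.8f;     // low-pass smoothing factor (assumed)
        private static final float THRESHOLD = 1.5f; // m/s^2, tune experimentally

        private float lowPassZ = 0f;

        /** Feed the (approximately) vertical component of Sensor.TYPE_LINEAR_ACCELERATION. */
        public boolean onLinearAcceleration(float z) {
            lowPassZ = ALPHA * lowPassZ + (1 - ALPHA) * z; // slowly varying part
            float highPassZ = z - lowPassZ;                // sudden changes only
            return highPassZ > THRESHOLD;                  // true on a sharp upward jolt
        }
    }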
I have no idea whether it is good for your application.
You can actually establish (only) the vertical position without measuring acceleration over time. This is accomplished by measuring the angle between the direction to the center of the Earth and the direction to the magnetic north pole.
This only changes (significantly) when the altitude (height) of the phone changes. What you do is use the accelerometer and magnetometer to get two float[3] arrays, treat these as vectors, make them unit vectors, and then the angle between any two unit vectors is arccos(A·M).
Note that is the dot product, i.e. Math.acos(A[0]*M[0] + A[1]*M[1] + A[2]*M[2]). Any change in this angle corresponds to a change in height. Also note that this will have to be calibrated to real units, and the ratio of change in angle to change in height will differ at different locations on Earth; but this is a method of getting an absolute value for height, though of course the angle also becomes skewed when the device is accelerating, or when there are magnets nearby :)
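A small sketch of that computation (array and class names are illustrative; accel comes from the accelerometer, mag from the magnetometer):

    // Sketch: angle between the gravity direction (accelerometer) and the
    // magnetic field direction (magnetometer), from two float[3] readings.
    public class InclinationAngle {

        private static float[] normalize(float[] v) {
            float len = (float) Math.sqrt(v[0] * v[0] + v[1] * v[1] + v[2] * v[2]);
            return new float[] { v[0] / len, v[1] / len, v[2] / len };
        }

        /** Returns the angle in radians between the accelerometer and magnetometer vectors. */
        public static double angleBetween(float[] accel, float[] mag) {
            float[] a = normalize(accel);
            float[] m = normalize(mag);
            double dot = a[0] * m[0] + a[1] * m[1] + a[2] * m[2];
            // Clamp to [-1, 1] to guard against rounding errors before acos.
            dot = Math.max(-1.0, Math.min(1.0, dot));
            return Math.acos(dot);
        }
    }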
You can correlate it with the magnetic field sensor readings, in microtesla.
You can use the fact that distance is the double integral of acceleration over time, with the integrals approximated numerically by summations:
dist = ∫ (∫ a dt) dt ≈ Σ (Σ a·Δt)·Δt = ∫ (speed + constant) dt
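A naive numerical sketch of that double integration (the sample period is assumed constant; as noted above, drift makes this approach unusable for anything accurate):

    // Sketch: naive double integration of acceleration samples.
    // Drift accumulates quickly, so this is for illustration only.
    public class NaiveIntegrator {
        private final float dt;      // sample period in seconds (assumed constant)
        private float velocity = 0f; // m/s
        private float distance = 0f; // m

        public NaiveIntegrator(float samplePeriodSeconds) {
            this.dt = samplePeriodSeconds;
        }

        /** Feed one linear-acceleration sample (m/s^2); returns the running distance estimate. */
        public float addSample(float acceleration) {
            velocity += acceleration * dt; // first integration: speed (+ unknown constant)
            distance += velocity * dt;     // second integration: distance
            return distance;
        }
    }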
Can anyone help with removing the g factor from accelerometer readings?
I am using a SensorEventListener with the onSensorChanged() method to get Sensor.TYPE_ACCELEROMETER data. I need only the pure acceleration values along all axes, so whenever the device is stationary (or moving at constant speed) it should read roughly (0.0, 0.0, 0.0).
Currently, depending on its pitch and roll, it gives me varying output according to the component of g acting on each axis.
I hope there is some formula to remove this, as I also get orientation values (pitch and roll) from a Sensor.TYPE_ORIENTATION listener. I have tried some, but it didn't work.
You can use a low-pass filter.
Do this for each of your sensor values:
g = 0.9 * g + 0.1 * v
where v is your current sensor value and g is a variable initially set to zero. Mind that you'll need one g variable per axis.
With v = v - g you can eliminate the gravity factor from your sensor value.
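A minimal sketch of that filter applied to all three axes (the 0.9/0.1 weights are the ones from above; the class and field names are illustrative):

    // Sketch: isolate gravity with a low-pass filter, then subtract it to get
    // linear acceleration. Feed it event.values from Sensor.TYPE_ACCELEROMETER.
    public class GravityFilter {
        private static final float ALPHA = 0.9f;     // weight of the previous gravity estimate
        private final float[] gravity = new float[3]; // starts at zero

        /** Returns the acceleration with the (estimated) gravity component removed. */
        public float[] removeGravity(float[] v) {
            float[] linear = new float[3];
            for (int i = 0; i < 3; i++) {
                gravity[i] = ALPHA * gravity[i] + (1 - ALPHA) * v[i]; // g = 0.9*g + 0.1*v
                linear[i] = v[i] - gravity[i];                        // v - g
            }
            return linear;
        }
    }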
Use Sensor.TYPE_LINEAR_ACCELERATION instead of Sensor.TYPE_ACCELEROMETER
Take a look at the following link:
http://developer.android.com/reference/android/hardware/SensorEvent.html
Just subtract out g (~9.8 m/s^2) times the z direction (the third row) of the rotation matrix.
Or to be more explicit about it, let
a = your accelerometer reading,
R = your rotation matrix (as a 9-long vector).
Then what you want is
(a[0]-g*R[6], a[1]-g*R[7], a[2]-g*R[8]).
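A compact sketch of that subtraction, where the rotation matrix is obtained from SensorManager.getRotationMatrix using the latest accelerometer and magnetometer readings (class name is illustrative):

    import android.hardware.SensorManager;

    // Sketch: remove gravity from an accelerometer reading by subtracting
    // g times the third row of the rotation matrix (gravity's direction in
    // device coordinates).
    public class GravitySubtraction {
        private static final float G = 9.81f; // m/s^2

        public static float[] linearAcceleration(float[] accel, float[] magnetic) {
            float[] r = new float[9];
            float[] i = new float[9];
            if (!SensorManager.getRotationMatrix(r, i, accel, magnetic)) {
                return null; // e.g. device in free fall, data unusable
            }
            return new float[] {
                    accel[0] - G * r[6],
                    accel[1] - G * r[7],
                    accel[2] - G * r[8]
            };
        }
    }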
Differentiating a function of time with respect to time rids you of its constant terms.
So by taking the derivative of the accelerometer's signal you'll get the jerk, which you can then re-integrate in order to get the non-constant part of the acceleration you're looking for.
In layman's terms: take a sample from the accelerometer every second and subtract it from the previous sample. If the result is (very close to) zero, you're not accelerating relative to the Earth. If the result is non-zero, integrate it (in this case, multiply by one second) and you have your acceleration.
Two things, though:
- Look out for noise in the signal; round off your input.
- Don't expect hyper-accurate results from on-chip accelerometers. You can use them to detect shaking or changes in orientation, but not really for knowing how many g's you're experiencing while making sharp turns in your car.
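A rough sketch of the difference-then-rescale idea described above (the sample period is assumed fixed; the class name is illustrative):

    // Sketch: approximate the jerk by differencing consecutive accelerometer
    // samples; constant components (such as gravity) cancel out.
    public class JerkEstimator {
        private final float dt;          // seconds between samples (assumed fixed)
        private float[] previous = null; // last accelerometer sample

        public JerkEstimator(float samplePeriodSeconds) {
            this.dt = samplePeriodSeconds;
        }

        /** Returns the change in acceleration since the last sample, or null on the first call. */
        public float[] accelerationChange(float[] current) {
            if (previous == null) {
                previous = current.clone();
                return null;
            }
            float[] change = new float[3];
            for (int i = 0; i < 3; i++) {
                float jerk = (current[i] - previous[i]) / dt; // derivative: constant g cancels
                change[i] = jerk * dt;                        // "re-integrate" over one interval
            }
            previous = current.clone();
            return change;
        }
    }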
One way (for devices with only an accelerometer) is to remove the gravity vector from the accelerometer data by subtracting the values that would be measured in the static case for the same orientation. But since the orientation is itself calculated from acceleration readings rather than independently, this is not very accurate.
A gyroscope may help in this case, but few Android devices have a true gyroscope yet, and using its raw readings is not so simple.
You need to assume two coordinate systems:
1- a fixed global system;
2- a moving coordinate system whose origin moves and rotates as the sensor does.
In the global system, g is always parallel to the z axis, but in the moving system it is not.
So all you have to do is compute the 3x3 rotation matrix from the orientation angles, i.e. yaw, pitch and roll (you can find the formulas everywhere).
Then multiply this rotation matrix by the 3x1 acceleration vector measured by the sensor.
This transforms the coordinates and expresses the values in the fixed global system.
The only thing left afterwards is to simply subtract g from the z value.
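A small sketch of that transformation; instead of building the rotation matrix from yaw, pitch and roll by hand, this version uses SensorManager.getRotationMatrix, which already maps device coordinates into the world frame (class name is illustrative):

    import android.hardware.SensorManager;

    // Sketch: rotate the device-frame acceleration into the fixed global frame
    // and subtract g from the (now vertical) z component.
    public class WorldFrameAcceleration {
        private static final float G = 9.81f; // m/s^2

        public static float[] worldLinearAcceleration(float[] accel, float[] magnetic) {
            float[] r = new float[9];
            float[] i = new float[9];
            if (!SensorManager.getRotationMatrix(r, i, accel, magnetic)) {
                return null;
            }
            // world = R * accel  (R is 3x3, stored row-major in a 9-element array)
            float wx = r[0] * accel[0] + r[1] * accel[1] + r[2] * accel[2];
            float wy = r[3] * accel[0] + r[4] * accel[1] + r[5] * accel[2];
            float wz = r[6] * accel[0] + r[7] * accel[1] + r[8] * accel[2];
            return new float[] { wx, wy, wz - G }; // remove gravity from the vertical axis
        }
    }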