I am using an LG Optimus 2X smartphone (with gyroscope and accelerometer sensors) for positioning.
I want to get correct rotation angles from the gyroscope that can be used later for the body-to-earth coordinate transformation. My question is: how can I measure and remove the drift in the gyro sensor?
One way is to average the gyro samples over some period while the phone is stationary and subtract that average from each subsequent sample (sketched below), but this does not seem like a good approach.
When the phone is rotating or otherwise in motion, how can I get drift-free angles?
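For reference, the averaging approach would look roughly like this (a minimal sketch inside a class that implements SensorEventListener; the sample count and field names are just illustrative):

    // Rough gyro-bias estimation: average some samples while the phone is known to be still,
    // then subtract the estimated bias from every later reading.
    private static final int CALIBRATION_SAMPLES = 500;
    private int samplesCollected = 0;
    private final float[] biasSum = new float[3];
    private final float[] bias = new float[3];
    private boolean calibrated = false;

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() != Sensor.TYPE_GYROSCOPE) return;

        if (!calibrated) {
            // The phone must be lying still during this phase.
            for (int i = 0; i < 3; i++) biasSum[i] += event.values[i];
            if (++samplesCollected >= CALIBRATION_SAMPLES) {
                for (int i = 0; i < 3; i++) bias[i] = biasSum[i] / samplesCollected;
                calibrated = true;
            }
            return;
        }

        // Bias-corrected angular rates (rad/s on spec-compliant devices).
        float wx = event.values[0] - bias[0];
        float wy = event.values[1] - bias[1];
        float wz = event.values[2] - bias[2];
        // ... integrate wx, wy, wz over time to get angles (still drifts during motion).
    }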
As far as I know, either the Kalman filter or something similar is implemented in the SensorManager. Check out Sensor Fusion on Android Devices: A Revolution in Motion Processing.
You are trying to solve a problem that is already solved.
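For example, a minimal sketch (class and field names are illustrative) that reads the fused orientation Android exposes through TYPE_ROTATION_VECTOR instead of integrating the gyro by hand:

    import android.hardware.Sensor;
    import android.hardware.SensorEvent;
    import android.hardware.SensorEventListener;
    import android.hardware.SensorManager;

    public class FusedOrientationListener implements SensorEventListener {
        private final float[] rotationMatrix = new float[9];
        private final float[] orientation = new float[3]; // azimuth, pitch, roll (rad)

        public void start(SensorManager sensorManager) {
            Sensor rotationVector = sensorManager.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR);
            if (rotationVector != null) {
                sensorManager.registerListener(this, rotationVector, SensorManager.SENSOR_DELAY_GAME);
            }
        }

        @Override
        public void onSensorChanged(SensorEvent event) {
            if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR) {
                // The platform's sensor-fusion output: already combines gyro and accel
                // (and the magnetometer, depending on the device).
                SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);
                SensorManager.getOrientation(rotationMatrix, orientation);
                // orientation[0..2] = azimuth, pitch, roll in radians.
            }
        }

        @Override
        public void onAccuracyChanged(Sensor sensor, int accuracy) { }
    }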
I am the author of a compass application that integrates data from the magnetic and gyroscope sensors (steady compass). I have tested this application mostly on an LG Optimus Black (the device you can see in the video) running Android 2.2, so I am going to share my experiences:
Gyroscope readings are very accurate. This sensor is just the opposite of the accelerometer and magnetic sensors, which give readings with a lot of jitter.
The readings from the gyroscope (i.e. the angular speed) do not drift at all. However, you will get drift in the orientation estimate if you just integrate the gyroscope readings: since you are integrating samples taken at discrete times, you only obtain an approximation, and the error accumulates with every integration step.
To avoid such drift in the orientation estimate, you must use other input sources to correct the result of integrating the gyroscope data. The solution is to fuse the data coming from the orientation sensor (magnetic + acceleration) with the data coming from the gyroscope.
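As an illustration of that correction, a minimal sketch for the azimuth only (field names and the 0.98 blending weight are illustrative; sign conventions and angle wrap-around near ±π are glossed over):

    // Correct a gyro-integrated azimuth with the drift-free azimuth obtained from
    // accelerometer + magnetometer.
    private final float[] accel = new float[3];
    private final float[] magnetic = new float[3];
    private final float[] rotationMatrix = new float[9];
    private final float[] orientation = new float[3];
    private float fusedAzimuth = 0f;
    private long lastGyroTimestamp = 0;

    @Override
    public void onSensorChanged(SensorEvent event) {
        switch (event.sensor.getType()) {
            case Sensor.TYPE_ACCELEROMETER:
                System.arraycopy(event.values, 0, accel, 0, 3);
                break;
            case Sensor.TYPE_MAGNETIC_FIELD:
                System.arraycopy(event.values, 0, magnetic, 0, 3);
                break;
            case Sensor.TYPE_GYROSCOPE:
                if (lastGyroTimestamp != 0) {
                    float dt = (event.timestamp - lastGyroTimestamp) * 1.0e-9f; // ns -> s
                    // Short term: propagate the azimuth with the gyro.
                    fusedAzimuth -= event.values[2] * dt;
                }
                lastGyroTimestamp = event.timestamp;
                // Long term: pull the estimate slowly toward the accel/magnetometer azimuth.
                if (SensorManager.getRotationMatrix(rotationMatrix, null, accel, magnetic)) {
                    SensorManager.getOrientation(rotationMatrix, orientation);
                    fusedAzimuth = 0.98f * fusedAzimuth + 0.02f * orientation[0];
                }
                break;
        }
    }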
Be careful with LG phones: according to the Android API, the gyroscope should return data in rad/s, but the LG Optimus Black running Froyo gives readings in degrees/s. The update to Android 2.3 has just been released for that phone; I still have to test whether the new version behaves according to the specification.
What Android version does your phone have? Have you tested any application using gyroscope? Did you get the expected results?
Basically, gyros drift over long time periods, whereas accelerometers have no drift but tend to be unstable in the short term. By combining information from both sensors using a Kalman filter you can obtain an accurate attitude estimate. For something less complex you could also use a complementary filter.
See this post for more info:
Combine Gyroscope and Accelerometer Data
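For a single tilt angle, the complementary filter boils down to a couple of lines. A sketch (the 0.98 weight is illustrative and the axis/sign conventions are simplified), assuming gyroscope and accelerometer events arrive at the same listener:

    // Complementary filter for pitch: the accelerometer gives an absolute but noisy tilt
    // angle from gravity, the gyro gives a smooth but drifting rate. Blend the two.
    private float pitch = 0f;        // radians
    private long lastGyroTimestamp = 0;

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() == Sensor.TYPE_GYROSCOPE) {
            if (lastGyroTimestamp != 0) {
                float dt = (event.timestamp - lastGyroTimestamp) * 1.0e-9f; // ns -> s
                pitch += event.values[0] * dt; // integrate the rate around the x axis
            }
            lastGyroTimestamp = event.timestamp;
        } else if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
            // Tilt implied by gravity (only valid when external acceleration is small).
            float accelPitch = (float) Math.atan2(event.values[1], event.values[2]);
            pitch = 0.98f * pitch + 0.02f * accelPitch;
        }
    }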
Related
I am trying to understand the process of sensor fusion, and along with it Kalman filtering.
My goal is to detect a fall of a device using its accelerometer and gyroscope.
Most papers, such as this one, describe how to overcome the drift from the gyroscope and the noise from the accelerometer. Ultimately, sensor fusion gives us better measurements of roll, pitch and yaw, not better acceleration.
Is it possible to get better 'acceleration results' through sensor fusion and in turn use those for fall detection? Better roll, pitch and yaw alone are not enough to detect a fall.
However, this source recommends smoothing the accelerometer (Ax, Ay, Az) and gyroscope (Gx, Gy, Gz) signals individually with a Kalman filter and then using a classification algorithm such as k-NN or clustering to detect a fall via supervised learning.
The classification part is not my problem; my question is whether I should fuse the sensors (3D accelerometer and 3D gyroscope) or smooth them separately, given my goal of detecting a fall.
Several clarifications
A Kalman filter is typically used to perform sensor fusion for position and orientation estimation, usually combining an IMU (accel and gyro) with some non-drifting absolute measurement (computer vision, GPS).
A complementary filter is typically used to obtain a good orientation estimate by combining the accelerometer (noisy but non-drifting) with the gyro (accurate but drifting). You can think of the orientation estimate as driven primarily by the gyro, but corrected by the accelerometer.
For fall detection using an IMU, I believe that acceleration is very important. There is no known way to "correct" the acceleration reading, and thinking along those lines is likely the wrong approach. My suggestion is to use the accelerations as one of your inputs to the system and collect a bunch of data simulating fall situations; you might be surprised how many viable signals are there.
I don't think you need a Kalman filter for fall detection. A simple accelerometer alone is able to detect the fall of the device. If you apply a low-pass filter to smooth the accelerometer signal and check whether the total acceleration stays close to zero for more than a certain duration (in free fall the device accelerates at -g, i.e. 9.8 m/s², so the sensor reads near zero), you can flag it as a fall.
The issue with the above approach is that if the device is rotating fast, the measured acceleration won't be close to zero. For a more robust solution, you can implement a simple complementary filter (search for Mahony) rather than a Kalman filter for this application.
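A minimal sketch of the total-acceleration check described above (the thresholds, the duration and the onFallDetected() callback are illustrative, not tuned values or real API):

    // Flag a fall when the low-pass-filtered acceleration magnitude stays near zero
    // (free fall) for longer than MIN_FREEFALL_NS.
    private static final float FREEFALL_THRESHOLD = 2.0f;    // m/s^2, well below 9.8
    private static final long MIN_FREEFALL_NS = 150000000L;  // 150 ms
    private static final float LPF_ALPHA = 0.8f;

    private float filteredMagnitude = SensorManager.GRAVITY_EARTH;
    private long freefallStart = 0;

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() != Sensor.TYPE_ACCELEROMETER) return;

        float ax = event.values[0], ay = event.values[1], az = event.values[2];
        float magnitude = (float) Math.sqrt(ax * ax + ay * ay + az * az);

        // Simple low-pass filter to smooth the magnitude.
        filteredMagnitude = LPF_ALPHA * filteredMagnitude + (1 - LPF_ALPHA) * magnitude;

        if (filteredMagnitude < FREEFALL_THRESHOLD) {
            if (freefallStart == 0) freefallStart = event.timestamp;
            if (event.timestamp - freefallStart > MIN_FREEFALL_NS) {
                onFallDetected(); // hypothetical callback
            }
        } else {
            freefallStart = 0;
        }
    }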
I'm developing an app that measures a car's accelerations.
Instead of using the raw accelerometer, I read about the linear acceleration sensor and implemented it.
I have two devices to test with: it works fine on a Sony Xperia Z1, but I don't get any values on an old Alcatel One Touch.
Is it because of the Android version (5.1 vs 4.2), because my code is wrong, or is it a hardware limitation?
Taken from Android's documentation:
TYPE_LINEAR_ACCELERATION
Added in API level 9
int TYPE_LINEAR_ACCELERATION
A constant describing a linear acceleration sensor type.
See SensorEvent.values for more details.
Constant Value: 10 (0x0000000a)
This means the constant is available on any device running API 9 or later. Note, however, that the constant being defined does not guarantee that a particular device actually provides the fused linear acceleration sensor; you can check for it at runtime (see the sketch after the quoted response below).
Please take into consideration Lork's response in this post. It states which type of sensor you should use in order to measure certain types of movement.
Update: I'll add the information from Lork's response, as it might help future readers:
TYPE_ACCELEROMETER uses the accelerometer and only the accelerometer. It returns raw accelerometer events, with minimal or no processing at all.
TYPE_GYROSCOPE (if present) uses the gyroscope and only the gyroscope. Like above, it returns raw events (angular speed in rad/s) with no processing at all (no offset / scale compensation).
TYPE_ORIENTATION is deprecated. It returns the orientation as yaw/pitch/roll in degrees. It's not very well defined and can only be relied upon when the device has no "roll". This sensor uses a combination of the accelerometer and the magnetometer. Marginally better results can be obtained using SensorManager's helpers. This sensor is heavily "processed".
TYPE_LINEAR_ACCELERATION, TYPE_GRAVITY, TYPE_ROTATION_VECTOR are "fused" sensors which return respectively the linear acceleration, gravity and rotation vector (a quaternion). It is not defined how these are implemented. On some devices they are implemented in h/w, on some devices they use the accelerometer + the magnetometer, on some other devices they use the gyro.
On Nexus S and Xoom, the gyroscope is currently NOT used. They behave as if there was no gyro available, like on Nexus One or Droid. We are planning to improve this situation in a future release.
Currently, the only way to take advantage of the gyro is to use TYPE_GYROSCOPE and integrate the output by hand.
I hope this helps,
Mathias
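In practice it is worth checking at runtime whether the fused sensor is actually present; if getDefaultSensor() returns null, the device simply doesn't provide it. A rough sketch (assuming this runs inside an Activity and 'listener' is your SensorEventListener; falling back to the raw accelerometer is just one possible choice):

    // Check whether the device exposes a linear acceleration sensor; fall back
    // to the raw accelerometer (and remove gravity yourself) if it doesn't.
    SensorManager sensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
    Sensor linear = sensorManager.getDefaultSensor(Sensor.TYPE_LINEAR_ACCELERATION);

    if (linear != null) {
        sensorManager.registerListener(listener, linear, SensorManager.SENSOR_DELAY_GAME);
    } else {
        // No fused sensor on this device (common on older or low-end hardware).
        Sensor accel = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
        sensorManager.registerListener(listener, accel, SensorManager.SENSOR_DELAY_GAME);
    }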
I am trying to use the phone as a gun, aiming with the gyroscope. I calibrate with the phone or tablet in a certain orientation; that orientation shoots straight. Then, depending on the direction the phone is turned (left/right/up/down), the gun shoots in that direction.
I am using the gyroscope, and all of this works, except that after shooting for about 30 seconds, the gyroscope slowly starts drifting to the left or right. So when I go back to the orientation I calibrated with, it doesn't shoot straight anymore. Does anyone have any experience writing a complementary or Kalman filter to fuse gyro and accelerometer data to give better results in Unity 3D?
I've found this online: http://www.x-io.co.uk/open-source-ahrs-with-x-imu/. It seems to do exactly what I want, but I may be using it wrong: I sometimes get better and sometimes worse results with it. Does anybody have any experience with it?
In the first place, gyro/accelerometer fusion will stabilize your pitch/roll angles, since gravity indicates which direction the ground is. However, you cannot correct "left/right" drift because the actual heading is unknown. Proper heading stabilization cannot be achieved with the gyro and accelerometer alone: it requires additional information.
The example you provide (Madgwick’s MARG/IMU filter) is a filter that can integrate magnetometers ("north" reference), but it has two requirements for getting good results:
The magnetometer has been properly calibrated.
There are no magnetic field disturbances. This is generally not true if you are indoors, or if you are moving close to power lines or metallic structures.
An alternative is to use a video signal to get optical flow information, or to detect when the phone is resting in a fixed position and compensate the gyro biases from time to time.
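As an illustration of that last idea, a rough sketch in Android/Java terms (rather than Unity) that re-estimates the gyro bias whenever the bias-corrected angular rate has stayed very small for a while; the thresholds and window length are illustrative:

    // Refresh the gyro bias estimate whenever the device has been (almost) motionless.
    private static final float STILL_RATE_THRESHOLD = 0.02f; // rad/s, illustrative
    private static final int STILL_WINDOW = 200;             // samples, illustrative

    private final float[] bias = new float[3];
    private final float[] stillSum = new float[3];
    private int stillCount = 0;

    private void onGyroSample(float[] rates) {
        float cx = rates[0] - bias[0];
        float cy = rates[1] - bias[1];
        float cz = rates[2] - bias[2];
        float corrected = (float) Math.sqrt(cx * cx + cy * cy + cz * cz);

        if (corrected < STILL_RATE_THRESHOLD) {
            for (int i = 0; i < 3; i++) stillSum[i] += rates[i];
            if (++stillCount >= STILL_WINDOW) {
                // Still long enough: the average raw rate becomes the new bias estimate.
                for (int i = 0; i < 3; i++) {
                    bias[i] = stillSum[i] / stillCount;
                    stillSum[i] = 0;
                }
                stillCount = 0;
            }
        } else {
            for (int i = 0; i < 3; i++) stillSum[i] = 0;
            stillCount = 0;
        }
        // Use (rates[i] - bias[i]) when integrating the orientation elsewhere.
    }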
I am trying to do an analysis which involves interpreting the results from the various sensors of an Android device. Right now I'm analyzing the magnetic field sensor, which should report (according to this documentation page) the ambient magnetic field in µT (microtesla). Everything is clear so far, but on the two devices I tested on (Galaxy S, Galaxy Gio) the results are different. And by different I mean very different.
For example, with the same orientation on a table, here's what those devices show (just one example):
S2: -2, 12, 60 (approximate values on x, y, z respectively)
Gio: -2, 12, -36 (approximate values on x, y, z respectively)
Even if I switch positions, the results are the same. I also read on this page about the intensity of the magnetic field, and I believe I should measure around 50 µT given my geographical latitude.
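For what it's worth, one way to compare readings against the expected local field strength independently of orientation is the total magnitude |B| = sqrt(x² + y² + z²); a short sketch inside a SensorEventListener (the log tag is illustrative):

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) {
            float x = event.values[0], y = event.values[1], z = event.values[2];
            // Total field strength should match the local geomagnetic field (~50 uT here),
            // regardless of how the device is oriented.
            double magnitude = Math.sqrt(x * x + y * y + z * z);
            Log.d("MagField", "|B| = " + magnitude + " uT");
        }
    }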
Can someone explain what those sensor readings mean?
Does someone know how to explain this behavior?
Thank you,
Iulian
I can just confirm the problem.
As the author of the steady compass application, I have done many tests of the magnetic field sensors of different devices. I have found that one device reported an absolute magnetic field value about 2x the amount reported by another device under the same conditions.
I have done a lot of tests after phone calibration, and I even put the devices in "airplane mode" to try to minimize the electromagnetic interference coming from the device itself. Another comment: plugging in the USB cable can create significant magnetic field variations on some devices.
The worst result I have seen on a given device (after calibration, airplane mode and USB disconnected) is this: using a compass application with the device in the horizontal plane, I take a reading, rotate the phone 90 degrees in the horizontal plane, and take a new compass reading. The two readings differ by more than 15 degrees from the expected 90-degree change!
Fortunately, not all devices include such bad sensors. The best devices show errors of about 2-3 degrees after a 90-degree rotation.
The conclusion: unless you are very confident in your magnetic field sensor because you have tested it and verified good behavior, you should place only limited trust in the readings it reports.
I have also experienced these problems with the sensors. I believe the phones will give different sensor readings based on the hardware used by the manufacturer.
Try checking the sensor's reported accuracy for SENSOR_STATUS_UNRELIABLE. You may have to recalibrate the magnetometer.
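For example, a minimal sketch of reacting to the reported accuracy (the log tag and the suggested recovery are illustrative):

    // The framework reports calibration quality through onAccuracyChanged().
    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
        if (sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD
                && accuracy == SensorManager.SENSOR_STATUS_UNRELIABLE) {
            // Readings can't be trusted right now: ignore them and ask the user to
            // recalibrate, e.g. by moving the phone in a figure-eight pattern.
            Log.w("MagField", "Magnetometer reports SENSOR_STATUS_UNRELIABLE");
        }
    }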
I am working on a project using an HTC Magic which requires data from the electronic compass, including both the accelerometer and the magnetic sensor. But I find that there is a significant latency between the movement of the phone and the triggering of the onSensorChanged event. In other words, the acceleration and magnetic data obtained from the sensors are updated about half a second after my motion. I have several questions about the problem, as follows.
Is the orientation data computed from the acceleration and magnetic data, or is there a physical sensor for orientation?
Does the latency come from the Android API (the event mechanism) or from a physical limitation of the electronic compass?
It is said that the electronic compass is the AK8976A model from Asahi Kasei. Does anybody have the datasheet or know the sampling frequency?
Any ideas for improving the real-time experience?
Thank you in advance!
When you register the SensorEventListener, what rate are you using? You should be using SENSOR_DELAY_GAME to get the best balance between frequent updates and not overdriving the update queue, which can actually cause updates to be delivered more slowly if you use SENSOR_DELAY_FASTEST.
As to your other questions, I think they're somewhat moot: whether the update delay you're seeing is due to the API or to the compass hardware itself, you can't change it.
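For reference, registering both sensors at SENSOR_DELAY_GAME looks roughly like this (assuming the code runs in an Activity that implements SensorEventListener):

    // Register the accelerometer and magnetometer at SENSOR_DELAY_GAME rather than
    // SENSOR_DELAY_FASTEST, so the event queue is not flooded.
    SensorManager sensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
    Sensor accel = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
    Sensor magnetic = sensorManager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD);

    sensorManager.registerListener(this, accel, SensorManager.SENSOR_DELAY_GAME);
    sensorManager.registerListener(this, magnetic, SensorManager.SENSOR_DELAY_GAME);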
I did figure this out. It turns out that in 2.2 you can't use sensor rates other than the standard constants (SENSOR_DELAY_UI, SENSOR_DELAY_NORMAL, etc.). The documentation claims you can also specify the number of microseconds between notifications, but that only works from Android 2.3 (API level 9) onwards.
Once I used the standard constants, it started working on 2.2.