Latency when getting sensor data from gPhone? - android

I am working on a project using the HTC Magic which requires data from the electronic compass, including both the accelerometer and the magnetic sensor. But I find that there is a significant latency between moving the phone and the triggering of the onSensorChanged event. In other words, the acceleration and magnetic data obtained from the sensors are updated about half a second after my motion. I have several questions about the problem, as follows.
Are the orientation data computed from the acceleration and magnetic data, or is there a physical sensor for orientation?
Does the latency result from the Android API (the event mechanism) or from a physical limitation of the electronic compass?
The electronic compass is said to be an AK8976A from Asahi Kasei. Does anybody have the datasheet or know its sampling frequency?
Any ideas for improving the real-time responsiveness?
Thank you in advance!

When you register the SensorEventListener, what rate are you using? You should use SENSOR_DELAY_GAME to get the best balance between frequent updates and not overdriving the update queue, which can actually make updates slower if you use SENSOR_DELAY_FASTEST.
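For reference, a minimal sketch of registering both sensors at SENSOR_DELAY_GAME; the activity name is illustrative:

```java
import android.app.Activity;
import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.os.Bundle;

public class CompassActivity extends Activity implements SensorEventListener {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        SensorManager sm = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
        // SENSOR_DELAY_GAME rather than SENSOR_DELAY_FASTEST: fast enough for
        // responsive orientation without flooding the event queue.
        sm.registerListener(this, sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),
                SensorManager.SENSOR_DELAY_GAME);
        sm.registerListener(this, sm.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD),
                SensorManager.SENSOR_DELAY_GAME);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        // Handle accelerometer / magnetometer updates here.
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}
```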
As to your other questions, I think they're kind of moot: whether the update delay you're seeing is due to the API or to the actual compass itself, you can't change it.

I did figure this out. It turns out that on 2.2 you can't use sensor rates other than the standard SENSOR_DELAY_UI, SENSOR_DELAY_NORMAL, etc. The documentation claims you can also specify the number of microseconds between notifications, but they are lying.
Once I used the standard constants, it started working on 2.2.

Related

Kalman filter sensor fusion for FALL detection: Accelerometer + Gyroscope

I am trying to understand the process of sensor fusion, and along with it, Kalman filtering.
My goal is to detect a fall of a device using its accelerometer and gyroscope.
Most of the papers, such as this one, mention how to overcome drift from the gyroscope and noise from the accelerometer. Eventually the sensor fusion provides better measurements of roll, pitch and yaw, not better acceleration.
Is it possible to get better acceleration results through sensor fusion and in turn use those for fall detection? Better roll, pitch and yaw alone are not enough to detect a fall.
However, this source recommends smoothing the accelerometer (Ax, Ay, Az) and gyroscope (Gx, Gy, Gz) readings individually with a Kalman filter, and then using a classification algorithm such as k-NN or clustering to detect a fall with supervised learning.
The classification part is not my problem; the question is whether I should fuse the sensors (3D accelerometer and 3D gyroscope) or smooth each sensor separately, given my goal of detecting a fall.
Several clarifications:
A Kalman filter is typically used to perform sensor fusion for position and orientation estimation, usually combining an IMU (accelerometer and gyroscope) with some non-drifting absolute measurement (computer vision, GPS).
A complementary filter is typically used to get a good orientation estimate by combining the accelerometer (noisy but non-drifting) and the gyroscope (accurate but drifting). By combining the two, one can get a fairly good orientation estimate; you can think of the estimate as driven primarily by the gyro and corrected by the accelerometer.
For fall detection using an IMU, I believe that acceleration is very important. There is no known way to "correct" the acceleration reading, and thinking of it this way is likely the wrong approach. My suggestion is to use the accelerations as one of your inputs to the system and collect a bunch of data simulating fall situations; you might be surprised how many viable signals are in there.
I don't think you need a Kalman filter to detect a fall. A simple accelerometer alone can detect the fall of a device: apply a low-pass filter to smooth the accelerometer readings, and if the total acceleration stays close to zero (in free fall the device accelerates at -g, i.e. 9.8 m/s²) for more than a certain duration, you can flag it as a fall.
The issue with the above approach is that if the device is rotating fast, the acceleration won't be close to zero. For a robust solution, you can implement a simple complementary filter (search for Mahony) rather than a Kalman filter for this application.
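A rough sketch of that free-fall heuristic; the 2 m/s² cutoff and 300 ms minimum duration are illustrative guesses to tune, not values from the answer:

```java
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

public class FreeFallDetector implements SensorEventListener {
    private static final float ALPHA = 0.8f;               // low-pass smoothing factor
    private static final float FREE_FALL_THRESHOLD = 2.0f; // m/s^2 (assumed)
    private static final long MIN_FALL_MS = 300;           // minimum duration (assumed)

    private float smoothed = SensorManager.GRAVITY_EARTH;  // start near 1 g
    private long fallStartMs = -1;

    @Override
    public void onSensorChanged(SensorEvent event) {
        float ax = event.values[0], ay = event.values[1], az = event.values[2];
        // Total acceleration magnitude, low-pass filtered to suppress jitter.
        float magnitude = (float) Math.sqrt(ax * ax + ay * ay + az * az);
        smoothed = ALPHA * smoothed + (1 - ALPHA) * magnitude;

        long nowMs = event.timestamp / 1_000_000L;          // ns -> ms
        if (smoothed < FREE_FALL_THRESHOLD) {
            if (fallStartMs < 0) {
                fallStartMs = nowMs;                        // free fall just began
            } else if (nowMs - fallStartMs > MIN_FALL_MS) {
                onFallDetected();                           // near-zero g long enough
            }
        } else {
            fallStartMs = -1;                               // reset when gravity returns
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }

    private void onFallDetected() {
        // Notify the application (hypothetical hook).
    }
}
```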

Inbuilt sensor calibration functionality in Android

I am working on an application which uses accelerometer and magnetometer data, and their fused data, to function. There is an inherent need to re-calibrate the magnetometer regularly: the sensor becomes uncalibrated due to a phenomenon called the hard iron effect. My application requires very accurate sensor data (which the hardware is capable of delivering, but noise and uncalibrated values create a roadblock).
I also know that there are built-in calibration routines running in the background on Android, because many times (not always) when the magnetometer is showing wrong values it corrects itself without any input from the user (such as the figure-8 motion). I would like to know how frequently Android performs this calibration, and whether I need to write my own auto-calibration code. The other possibility, if possible, would be to call this built-in calibration function at some frequency within my application. The Android documentation I have gone through provides very little information on this.
SensorEvent has a field called accuracy that changes for magnetometers on the devices I have tested. This may be a good indication of whether the event you receive should be used or not.
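In that spirit, a minimal sketch of gating magnetometer events on the accuracy field; the MEDIUM cutoff is an assumption, so pick whatever your accuracy needs dictate:

```java
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

public class MagnetometerGate implements SensorEventListener {
    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() != Sensor.TYPE_MAGNETIC_FIELD) return;
        if (event.accuracy >= SensorManager.SENSOR_STATUS_ACCURACY_MEDIUM) {
            // Reading is trustworthy enough to use.
        } else {
            // Low or unreliable accuracy: discard the reading and consider
            // prompting the user for a figure-8 calibration motion.
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
        // Also fired when the magnetometer's calibration status changes.
    }
}
```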

Get accelerometer, gyro and magnetometer at the same time on Android

I'm working on sensor fusion with the accelerometer, gyroscope and magnetic field sensors on Android. Thanks to SensorManager, I can be notified of each new value from these sensors.
In reality, and this is the case for my Nexus 5 (I'm not sure about other Android devices), acceleration, rotation rate and magnetic field are sampled at the same time. We can verify this using event.timestamp, as in the snippet below.
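A quick way to check, as a sketch (just log each event's timestamp and compare across sensors):

```java
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.util.Log;

public class TimestampLogger implements SensorEventListener {
    @Override
    public void onSensorChanged(SensorEvent event) {
        // event.timestamp is in nanoseconds; equal values across the
        // accelerometer, gyroscope and magnetometer suggest a shared sample time.
        Log.d("SensorTiming", event.sensor.getName() + " @ " + event.timestamp + " ns");
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}
```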
On other systems (like iOS, xSens, ...), the sensor SDK provides a notification with these three vectors at the same time.
Of course, when I receive acceleration(t), I can write a few lines of code with arrays to wait for rotationRate(t) and magneticField(t). But if there is a way to access these three vectors together directly, it would be very interesting to know!
Another question, related to sensor data:
Is there any advice from the Android team to device manufacturers to provide data in chronological order?
Thank you,
Thibaud
Short answer: no, Android doesn't provide a way to get all the sensor readings as it reads them.
Furthermore, the behavior you've observed with SensorManager, namely that readings from different sensors happen to have the same timestamp, suggesting they were read together, should not be relied upon. There is no documentation that guarantees this behavior (it is likely a quirk of your testing and update configuration), so relying on it could come back to bite you in a future update (and trying to take advantage of it is likely much more difficult to get right, or fast, than the approach outlined below).
Generally, unless all the results are generated by the same sensor, it is impossible to get them all "at the same time". Furthermore, just about all of the sensors are noisy, so you would already need to do some smoothing if you read them as fast as possible.
So, what you could do is sample them pretty quickly, then at specific intervals report the latest sample from each sensor (or some smoothed value that accounts for the delta between sample time and report time). This is a trivial amount of extra code, especially if you're already smoothing noisy sensor data.
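A sketch of that approach, assuming a 20 ms report period and a plain latest-value snapshot (both choices are illustrative, not from the answer):

```java
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.os.Handler;
import android.os.Looper;

public class SnapshotSampler implements SensorEventListener {
    private final float[] accel = new float[3];
    private final float[] gyro = new float[3];
    private final float[] mag = new float[3];
    private final Handler handler = new Handler(Looper.getMainLooper());

    private final Runnable reporter = new Runnable() {
        @Override
        public void run() {
            // Report the latest reading of each sensor as one consistent set.
            process(accel.clone(), gyro.clone(), mag.clone());
            handler.postDelayed(this, 20); // report every 20 ms (assumed)
        }
    };

    public void start(SensorManager sm) {
        sm.registerListener(this, sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),
                SensorManager.SENSOR_DELAY_GAME);
        sm.registerListener(this, sm.getDefaultSensor(Sensor.TYPE_GYROSCOPE),
                SensorManager.SENSOR_DELAY_GAME);
        sm.registerListener(this, sm.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD),
                SensorManager.SENSOR_DELAY_GAME);
        handler.post(reporter);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        // Keep only the most recent sample of each sensor.
        switch (event.sensor.getType()) {
            case Sensor.TYPE_ACCELEROMETER:
                System.arraycopy(event.values, 0, accel, 0, 3);
                break;
            case Sensor.TYPE_GYROSCOPE:
                System.arraycopy(event.values, 0, gyro, 0, 3);
                break;
            case Sensor.TYPE_MAGNETIC_FIELD:
                System.arraycopy(event.values, 0, mag, 0, 3);
                break;
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }

    private void process(float[] a, float[] g, float[] m) {
        // Fuse or hand off the synchronized triple here (hypothetical hook).
    }
}
```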
There is a workaround for this particular problem. When multiple registered listeners are present in an activity at the same time, the timestamps for those events may be misleading. But you can put multiple Fragment objects into the said activity, each with a different context, and listen to each sensor in its own fragment. With this approach the sensor reading timestamps become reliable.
Or listen in parallel threads, if you know about concurrency...

accuracy of android "linear acceleration" versus manual processing of accelerometer?

Reading here: Android Sensors - Which of them get direct input?,
I am wondering if anyone has experience, or a technically detailed link, about the accuracy of the linear acceleration sensor versus manual processing of the raw accelerometer data. E.g., do newer phones have dedicated hardware chips for filtering out gravity, or are most devices just going to filter the same raw source?
Update, a proposed answer for someone to confirm if they have such a device (Xoom, Nexus S, ?):
"If the device has a gyro, or possibly multiple accelerometers, then the returned values for gravity (G) versus external linear acceleration (L) can be fundamentally more accurate than any processing of accelerometer data alone. Without extra sensors, e.g. as on most phones, one could in principle post-process the acceleration (A) to attempt the separation as accurately as what the device is returning, given A = G + L."
It seems the gravity / linear acceleration can be calculated with a low-pass filter, just as described in the Android documentation.
However, only filtering the last value will not do it. I get acceptable results by averaging the accelerometer values over roughly 200 ms (for moderate movement; this will still fail, e.g. when you flip your phone quickly between your fingers).
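The low-pass separation described in the Android documentation looks roughly like this (the 0.8 smoothing factor matches the docs' example; a time-constant-based alpha would be more principled):

```java
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;

public class GravitySeparator implements SensorEventListener {
    private static final float ALPHA = 0.8f; // smoothing factor from the docs' example
    private final float[] gravity = new float[3];
    private final float[] linear = new float[3];

    @Override
    public void onSensorChanged(SensorEvent event) {
        // Low-pass filter isolates gravity; the remainder is linear acceleration.
        for (int i = 0; i < 3; i++) {
            gravity[i] = ALPHA * gravity[i] + (1 - ALPHA) * event.values[i];
            linear[i] = event.values[i] - gravity[i];
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}
```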
Your proposed answer is most likely correct.
You can check the statistics of several smartphone models on Android fragmentation.
For many models, the power consumption reported for the linear acceleration and gravity sensors is just the sum of the accelerometer, gyroscope and magnetometer.
The gyroscope lets you recognize fast angular movement, and it can be used to improve the gravity estimate, which is not possible with just the low-pass filter. For the magnetometer, I'm not sure it really gives you more information.
On my phone (HTC One S) the gravity sensor uses just as much power as the accelerometer, but is still better than my simple filter. So either it is another hardware sensor, or they probably use different weights. I tried weighting the accelerometer data more strongly when its absolute value is closer to gravity, which is nice but still not as good as the actual gravity sensor.
For compatibility reasons I would suggest using a low-pass filter for gravity where possible, as not every smartphone has a gyroscope or the other sensors mentioned.

Gyro Sensor drift and Correct angle Estimation

I am using an LG Optimus 2X smartphone (gyroscope and accelerometer sensors) for positioning.
I want to get correct rotation angles from the gyroscope that can later be used for the body-to-earth coordinate transformation. My questions are:
How can I measure and remove the drift in the gyro sensor?
One way is to average the gyro samples over some time (while the phone is static) and subtract the average from the current sample, which is not a good way.
When the phone is rotating or in motion, how do I get drift-free angles?
As far as I know, either the Kalman filter or something similar is implemented in the SensorManager. Check out Sensor Fusion on Android Devices: A Revolution in Motion Processing.
You are trying to solve a problem that is already solved.
I am the author of a compass application that integrates data from the magnetic and gyroscope sensors (steady compass). I have tested this application mostly on an LG Optimus Black (the device you can see in the video) running Android 2.2, so I am going to share my experiences:
Gyroscope readings are very accurate. This sensor is just the opposite of the accelerometer and magnetic sensors, which give readings with a lot of jitter.
The readings from the gyroscope (i.e. the angular speed) do not drift at all. You will get drift in the orientation estimate if you just integrate the gyroscope readings: since you are integrating samples taken at discrete times, you obtain only an approximation, and it degrades after every integration step.
In order to avoid such drift in the orientation estimate, you must consider other input sources to correct the result of integrating the gyroscope data. The solution is to fuse data coming from the orientation sensor (magnetic + acceleration) with data coming from the gyroscope.
Be careful with LG phones: according to the Android API, the gyroscope should return data in rad/s, but the LG Optimus Black with Froyo gives readings in degrees/s. The update to Android 2.3 has just been released for this phone; I have yet to test whether the new version behaves according to the specification.
What Android version does your phone have? Have you tested any application using the gyroscope? Did you get the expected results?
Basically, gyros drift over long time periods, whereas accelerometers have no drift but tend to be unstable. By combining information from both sensors using a Kalman filter you can obtain an accurate attitude. For something less complex you could also use a complementary filter, as sketched below.
See this post for more info:
Combine Gyroscope and Accelerometer Data
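For reference, a hedged sketch of a roll/pitch complementary filter of the kind recommended above; the 0.98 gyro weight is a typical textbook value, not taken from the answers (a Mahony filter adds a proper PI feedback term on top of this idea):

```java
public class ComplementaryFilter {
    private static final float GYRO_WEIGHT = 0.98f; // typical value (assumed)
    private float roll;  // radians
    private float pitch; // radians

    /**
     * gx, gy: gyroscope rates in rad/s; ax, ay, az: accelerometer in m/s^2;
     * dt: time since the last update, in seconds.
     */
    public void update(float gx, float gy, float ax, float ay, float az, float dt) {
        // Integrating gyro rates is accurate short-term but drifts long-term.
        float gyroRoll = roll + gx * dt;
        float gyroPitch = pitch + gy * dt;

        // The gravity direction gives absolute (drift-free but jittery) angles.
        float accRoll = (float) Math.atan2(ay, az);
        float accPitch = (float) Math.atan2(-ax, Math.sqrt(ay * ay + az * az));

        // Blend: the gyro dominates, the accelerometer slowly pulls the drift out.
        roll = GYRO_WEIGHT * gyroRoll + (1 - GYRO_WEIGHT) * accRoll;
        pitch = GYRO_WEIGHT * gyroPitch + (1 - GYRO_WEIGHT) * accPitch;
    }

    public float getRoll()  { return roll; }
    public float getPitch() { return pitch; }
}
```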
