Inbuilt sensor calibration functionality in Android

I am working on an application that uses accelerometer and magnetometer data, and their fused data, to function. Magnetometers inherently need to be re-calibrated regularly: the sensor becomes uncalibrated due to a phenomenon called the hard iron effect. My application requires very accurate sensor data (which the hardware is capable of delivering, but noise and uncalibrated values create a roadblock). I also know that there are inbuilt calibration routines running in the background on Android, because often (not always) when the magnetometer is showing wrong values it corrects itself without any input from the user (such as the figure-8 motion). I would like to know how frequently Android performs this calibration, and whether I need to write my own auto-calibration code. Another possibility, if it exists, would be to call this inbuilt calibration function at some frequency from within my application. The Android documentation I have gone through provides very little information on this.

SensorEvent has a field called accuracy that changes for magnetometers on the devices I have tested. This may be a good indication of whether the event you receive should be used or not.
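A minimal sketch of gating on that accuracy field. The constants mirror the documented `android.hardware.SensorManager` values (`SENSOR_STATUS_UNRELIABLE` = 0 up to `SENSOR_STATUS_ACCURACY_HIGH` = 3); the choice of MEDIUM as the cut-off is an assumption, not something the platform prescribes:

```java
// Gate magnetometer readings on the accuracy level reported with each event.
public class AccuracyGate {
    // These mirror SensorManager's SENSOR_STATUS_* constants.
    public static final int SENSOR_STATUS_UNRELIABLE = 0;
    public static final int SENSOR_STATUS_ACCURACY_LOW = 1;
    public static final int SENSOR_STATUS_ACCURACY_MEDIUM = 2;
    public static final int SENSOR_STATUS_ACCURACY_HIGH = 3;

    /** Returns true if a reading at this accuracy level is worth using. */
    public static boolean isUsable(int accuracy) {
        return accuracy >= SENSOR_STATUS_ACCURACY_MEDIUM;
    }
}
```

In a real listener you would check `event.accuracy` (or `onAccuracyChanged`) against this threshold before trusting the values.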

Related

Is it possible to build an indoor localization system with Android device inertial sensors?

I'm working on an indoor positioning system and I have a doubt after failing in all my real-world tests:
I have done some work with Android sensor values and some machine learning algorithms (with good theoretical results), but in a real environment I found some problems.
My proposal was to have three phases:
The first phase consists of collecting data through an Android app showing a map with some points. You move to the real position of a point and save the sensor values associated with that point's coordinates.
The second phase consists of creating a machine learning model (in this case, a classifier) to predict the user's position from the sensor values at every moment.
Then we export the classifier to the device and get predictions of the user's position in real time.
The data we stored in the fingerprinting phase (phase 1) was the x, y, z values of the accelerometer, magnetometer and gyroscope given by the Android SensorManager. In a second approach, we used a median filter to remove noise from those values. Our problem is that the way you hold the phone changes the measurements. The reason is that Android sensor values are given in the device coordinate system, so the values vary with the phone's orientation and tilt.
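The median filtering mentioned above can be sketched as a sliding-window filter applied per axis. This is a generic sketch, not the poster's actual code; a median is robust to the occasional spike that a moving average would smear out:

```java
import java.util.Arrays;

// Sliding-window median filter for one sensor axis.
public class MedianFilter {
    private final float[] window; // ring buffer of recent samples
    private int count = 0, next = 0;

    public MedianFilter(int size) {
        window = new float[size];
    }

    /** Adds a sample and returns the median of the current window. */
    public float filter(float sample) {
        window[next] = sample;
        next = (next + 1) % window.length;
        if (count < window.length) count++;
        float[] sorted = Arrays.copyOf(window, count);
        Arrays.sort(sorted);
        return sorted[count / 2]; // middle element (upper median for even counts)
    }
}
```

You would keep one instance per axis per sensor and feed it each incoming value.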
Android Device Coordinate System
So, the question is:
Is it possible (or is there a way) to build an indoor localization system (with a positioning accuracy of around 2-3 meters) that works in real environments, taking into account only Android smartphone sensors (accelerometer, gyroscope and magnetometer) and using machine learning (or other) algorithms?
Thanks in advance!!
There are a few companies that started doing fingerprinting based solely on the magnetometer, but as far as I know they ended up at least mixing it with other technologies, like BLE beacons or similar.
From what I was told, the problem is that magnetic fields can change drastically due to changes inside your building, but also due to changes outside of your scope (e.g. thunderstorms).
Taking one step back, I see another problem with your approach: different device models behave radically differently in terms of the data their sensors provide. To make things worse, the same device may provide very different data today than it did yesterday. This is especially true for the magnetometer - at least in my experience.
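Despite those caveats, the classifier in phase 2 can be sketched as a simple nearest-neighbour lookup over the stored fingerprints. This is a hypothetical illustration (field and class names are invented), and nearest-neighbour is just one simple choice of classifier:

```java
import java.util.List;

// Nearest-neighbour fingerprint classifier: predict the label of the
// stored point whose feature vector is closest to the live reading.
public class FingerprintClassifier {
    public static class Fingerprint {
        final double[] features; // e.g. magnetometer x, y, z at the point
        final String label;      // map point identifier
        public Fingerprint(double[] features, String label) {
            this.features = features;
            this.label = label;
        }
    }

    /** Returns the label of the stored fingerprint closest to the query. */
    public static String predict(List<Fingerprint> db, double[] query) {
        String best = null;
        double bestDist = Double.POSITIVE_INFINITY;
        for (Fingerprint fp : db) {
            double d = 0; // squared Euclidean distance
            for (int i = 0; i < query.length; i++) {
                double diff = fp.features[i] - query[i];
                d += diff * diff;
            }
            if (d < bestDist) { bestDist = d; best = fp.label; }
        }
        return best;
    }
}
```

Note this inherits the orientation problem from the question: unless the features are first rotated into a world frame, the same spot gives different vectors depending on how the phone is held.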

Get accelerometer, gyroscope and magnetometer data at the same time on Android

I'm working on sensor fusion with the accelerometer, gyroscope and magnetic field sensor on Android. Thanks to SensorManager I can be notified of each new value from these sensors.
In reality, and this is the case on my Nexus 5 (I'm not sure about other Android devices), acceleration, rotation rate and magnetic field are sampled at the same time. We can verify this using event.timestamp.
On other systems (like iOS, Xsens...), the sensor SDK provides a notification with these 3 vectors at the same time.
Of course, when I receive an acceleration(t), I can write some lines of code with arrays to wait for rotationRate(t) and magneticField(t). But if there is a way to access these 3 vectors together directly, it would be very interesting to know!
Another question related to sensor data:
Does the Android team advise device manufacturers to provide data in chronological order?
Thank you,
Thibaud
Short answer, no, Android doesn't provide a way to get all the sensor readings as it reads them.
Furthermore, the behavior that you've observed with SensorManager - namely that readings from different sensors happen to have the same timestamp, suggesting that they were read together - should not be relied upon. There is no documentation that guarantees this behavior (it is likely a quirk of your testing and update configuration), so relying on it could come back to bite you in some future update. Trying to take advantage of it is also likely much more difficult to get right, or fast, than the approach I outline below.
Generally, unless all results are generated by the same sensor, it is impossible to get them all "at the same time". Furthermore, just about all the sensors are noisy so you'd already need to do some smoothing if you read them as fast as possible.
So, what you could do is sample them pretty quickly, then at specific intervals, report the latest sample from all sensors (or some smoothed value that accounts for the delta between sample time and report time). This is a trivial amount of extra code, especially if you're already smoothing noisy sensor data.
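The "report the latest sample from all sensors" approach above can be sketched as a small aligner class. This is an illustration, not Android API: the callback names are hypothetical, and in a real app each `on*` method would be driven by `onSensorChanged` for the corresponding sensor type:

```java
// Cache the latest reading from each sensor and hand out an aligned
// snapshot of all three whenever the app asks for one (e.g. on a timer).
public class SensorAligner {
    private float[] accel, gyro, mag;

    public void onAccel(float[] v) { accel = v.clone(); }
    public void onGyro(float[] v)  { gyro = v.clone(); }
    public void onMag(float[] v)   { mag = v.clone(); }

    /** Returns {accel, gyro, mag}, or null until every sensor has reported. */
    public float[][] snapshot() {
        if (accel == null || gyro == null || mag == null) return null;
        return new float[][] { accel, gyro, mag };
    }
}
```

Calling `snapshot()` at a fixed report interval gives you the three vectors "together", at the cost of each being at most one sample period stale.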
There is a workaround for this particular problem. When multiple registered listeners are present in an activity at the same time, the timestamps for those events may be misleading. But you can add multiple Fragment objects to that activity, each with a different context, and listen to each sensor from one of these fragments. With this approach the sensor reading timestamps become reliable.
Or listen in parallel threads if you know about concurrency...

How to/Should I implement a Kalman filter to get accurate Accelerometer data?

I want to get data as accurate as possible from the built-in accelerometer in an Android phone. I want to track two-dimensional movement along the x and y axes, and even small movements must be registered.
When I look at the data from the accelerometer / linear acceleration while the phone is lying flat on a table, it changes a lot when it should be zero.
I have looked at Kalman filters, it seems like a good approach but I am having problems setting up a model.
1. Is a Kalman filter the way to go to get data as accurate as possible from an accelerometer?
2. Will a Kalman filter work? Maybe I have misunderstood, but it seems like the acceleration or the velocity must be constant?
3. How do I set up the model for a Kalman filter? I'm having trouble understanding (among other things) what the process noise is.
A Kalman filter applies when all measurements (of acceleration in this case) are equal to the true value plus a measurement error. (Strictly speaking, that measurement error is the measurement noise; the process noise describes the uncertainty in how the true state evolves between measurements.) For the original Kalman filter to apply, the noise must be normally distributed, i.e. sometimes the error will be positive, sometimes negative, and on average zero.
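To make those terms concrete, here is a minimal scalar Kalman filter under the simplest possible model: a roughly constant true value observed through zero-mean noise. This is a sketch for illustration only, not a model tuned for accelerometer data:

```java
// Scalar Kalman filter for a near-constant state.
// q = process noise variance (how much the true value may drift per step),
// r = measurement noise variance (how noisy each reading is).
public class Kalman1D {
    private double x = 0;   // state estimate
    private double p = 1;   // variance of the estimate
    private final double q, r;
    private boolean initialized = false;

    public Kalman1D(double processNoise, double measurementNoise) {
        q = processNoise;
        r = measurementNoise;
    }

    public double update(double measurement) {
        if (!initialized) { x = measurement; initialized = true; return x; }
        p += q;                     // predict: uncertainty grows by process noise
        double k = p / (p + r);     // Kalman gain: trust in the new measurement
        x += k * (measurement - x); // correct the estimate toward the measurement
        p *= (1 - k);               // uncertainty shrinks after the correction
        return x;
    }
}
```

A small q relative to r gives heavy smoothing; a large q makes the filter track the measurements almost directly.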
If you jerk your Android phone quickly back and forth, there'll be large accelerations. I'd suggest recording the accelerometer readings during that kind of action and reviewing them by eye to see whether the readings do look subject to some kind of normally distributed noise. My guess is that the answer will be "No", i.e. I expect the readings, when plotted on a graph, will be smooth-ish. But if they're not smooth, a Kalman filter could be useful.
If you're trying to use accelerometer readings to work out location, I think your project is doomed to failure. Acceleration is the 2nd derivative of position with respect to time, and I've never heard of anyone being able to integrate the readings with sufficient accuracy to be at all useful.
I have applied a Kalman filter successfully to GPS readings on an Android phone to improve the location estimate. See Smooth GPS data for code that implements a Kalman filter for that. I subsequently wondered whether velocity and perhaps acceleration data could be used to improve the location estimate. Although I never followed up on that idea, see https://dsp.stackexchange.com/questions/8860/more-on-kalman-filter-for-position-and-velocity for the maths that I was considering using.
The optimal way of using all the sensor inputs (GPS, accelerometer, gyroscope, etc) to get a good estimate of location is a very hard (and interesting) problem. To find out more, the key phrase to search for is "Sensor fusion". On this subject, there's an old youtube video at http://www.youtube.com/watch?v=C7JQ7Rpwn2k .
You might find this thread useful - I came across the same issues.
We think the variance when lying flat might be an issue with gimbal lock confusing the calculations, but that's just a theory right now. We've also noticed that the covariance in each axis alters depending on the orientation of the device, which might be gimbal lock interference too, but again, just a theory:
Implement a Kalman filter to smooth data from deviceOrientation API

Magnetic Field Sensor calibration on ANDROID

I'm making an application that works as a compass.
I'm using the accelerometer and the magnetic field sensors to compute the azimuth angle through SensorManager.getOrientation().
I'm searching for something that can improve the magnetic field sensor's accuracy, since I'm getting its accuracy state as UNRELIABLE!
Does anyone know anything about this? I'm looking for something that can either be hardcoded or, for instance, just physically moving the phone until it gets calibrated!
This is not a final answer (I don't know anything for sure), but my understanding from online posts is that waving the phone around in a figure of 8 a few times while the compass is in use is supposed to trigger automatic recalibration. This is what the google maps app suggests, for example. I don't know whether this is dependent on application functionality (something in maps that detects the waving by accelerometer and triggers a recalibration), or something in the android stack, or something specific to per-phone implementations. Try it and see!
Eg discussion: http://androidforums.com/epic-4g-support-troubleshooting/217317-cant-get-compass-calibrate.html
This reference appears to suggest this per-axis / figure-8 rotation process is built-in functionality: http://m.eclipsim.com/gpsstatus/
And here another article that claims this is built-in functionality, and that you don't even need to be running a compass-consuming app for the recalibration to work: http://www.ichimusai.org/2009/06/20/how-to-calibrate-the-htc-magic-compass/
Just a few points:
The figure-8 motion works sometimes and not others - I have no idea why. They really need some code-based way to check whether the figure-8 motion worked (assuming the physical motion is actually required).
They also need a way to detect that calibration is required. I looked at the code for the accuracy output (the UNRELIABLE constant), and once they send it to you they will not send it again; so, for instance, if you calibrate but then come within a strong magnetic field, it will not be re-sent (not sure why they did that).
One not completely reliable way to detect ongoing issues is to use the magnetic sensor output, compute field = sqrt(x*x + y*y + z*z), check that field falls between, say, 25 and 65 (microtesla), and ask the user to calibrate if it does not.
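That magnitude sanity check can be sketched directly; the 25-65 µT band is the rough range of Earth's magnetic field quoted above, so a magnitude well outside it suggests the sensor is uncalibrated or sitting in a local magnetic disturbance:

```java
// Sanity-check a magnetometer reading against Earth's expected field strength.
public class FieldCheck {
    /** Total field strength from the three axis components. */
    public static double magnitude(double x, double y, double z) {
        return Math.sqrt(x * x + y * y + z * z);
    }

    /** True if the magnitude falls in the rough 25-65 microtesla Earth-field band. */
    public static boolean looksCalibrated(double x, double y, double z) {
        double field = magnitude(x, y, z);
        return field >= 25.0 && field <= 65.0;
    }
}
```

As the answer notes, this is not completely reliable: a calibrated sensor next to a magnet will fail the check, and an uncalibrated one can still land inside the band by chance.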
The bottom line, after testing 18 phones, is that I would never depend on an Android-based compass with the current crop of phones; accuracy will always be in question.
I have also found that even if you are lucky and have a fairly reliable phone, you can never be sure it's calibrated without checking it against a real compass, which kind of defeats the purpose.
NOTE: On a lot of the misbehaving phones we have found that the sensor writes a calibration file and a tmp file with the same name. If you delete those files and reboot the phone, the calibration file is recreated with zeroed values, and the cold-start and general calibration problems resolve themselves.
The bad news is that they are stored in /data/misc and require root privileges to get at (thanks, Google & sensor manufacturers!), so even though I suspect this would solve a lot of problems for a lot of developers, it just is not viable from a marketplace-app perspective.
I am developing for Android. I'm using Titanium Alloy as a development tool, with the Titanium Geolocation module.
I have only tested 2 devices [Galaxy Note and S4] against a commercial magnetic compass. Following a calibration process [tilting along the 3 axes] and using 2 different compass apps plus the app I'm working on, the Android compass seems accurate enough for basic use... the correlation was good enough for my purpose, anyway. I also found the device's compass reading to be very sensitive to other magnetic and electrical field interference... an initial mistake I made was to use the compass feature while the device was in a protective cover with a magnetic closure [quite common on tablets]... this interference is particularly strong. I thus need to suggest that users of my app remove such covers, keep the device away from other electronics, and then do the standard calibration before initializing the app.
Another option is:
Go to the sensors test menu by dialing: *#0*#
Then, if you see a red line in the Magnetic Sensor section and a "Need for Calibration" message, you should recalibrate your compass.
How:
According to those guys:
Turn the Samsung Galaxy S5 Mini around all of its axes until the red line in the black circle changes color from red to blue. You can also run through a motion that follows the shape of an 8. It may be that several attempts are needed to calibrate the compass...

Latency when getting sensor data from gPhone?

I am working on a project using the HTC Magic which requires data from the electronic compass, including both the accelerometer and the magnetic sensor. But I find that there is a significant latency between moving the phone and the triggering of the sensorChanged event. In other words, the acceleration and magnetic data obtained from the sensors are updated about half a second after my motion. I have several questions about the problem:
Is the orientation data computed from the acceleration and magnetic data, or is there a physical sensor for orientation?
Does the latency result from the Android API (using the event) or from the physical limitations of the electronic compass?
It is said that the model of the electronic compass is the AK8976A from Asahi Kasei. Does anybody have the datasheet or know its sampling frequency?
Any ideas to improve the real-time experience?
Thank you in advance!
When you register the SensorEventListener, what rate are you using? You should be using SENSOR_DELAY_GAME to get the best balance between frequent updates and not overdriving the update queue - with SENSOR_DELAY_FASTEST, overdriving the queue can actually cause updates to be slower.
As to your other questions, I think they're kind of moot: whether the update delay you're seeing is due to the API or to the actual compass itself, you can't change it.
I did figure this out. It turns out that in 2.2 you can't use sensor rates other than the standard SENSOR_DELAY_UI, SENSOR_DELAY_NORMAL, etc. The documentation claims you can also specify the number of microseconds between notifications, but they are lying.
Once I used the standard constants, it started working on 2.2.
