Reading Accelerometer data in Android

What would be the best method of introducing a delay into the accelerometer readings of an Android application?
This code will be running in an IntentService and it will run for long periods of time (3-6 hours at least). I'd like to make it resource efficient as well.
I'm using Thread.sleep() for now and I was wondering if there is a better way to do this.
Also, in the example provided in the documentation, a high pass filter is used to filter out gravity. But would it be possible to take the sum of the readings on the 3 axes, assume it is greater than 10 when at rest, and simply subtract that value from the sum? (I'm only interested in the net acceleration on the phone, not the directions.)
Thanks!

Dear god, do NOT use sleep to delay accelerometer readings. That won't actually delay the readings, it will just make you react to them more slowly; the system will still be taking readings at the same rate and will in fact queue them up. If you want less frequent readings, specify that when you register your sensor listener; registerListener() takes a rate parameter (e.g. one of the SENSOR_DELAY_* constants).
You wouldn't sum the 3 and subtract 10; you'd sum their squares and subtract 9.8^2 (about 96), or take the square root of the sum of squares and subtract 9.8. But even that isn't quite right, as sensors aren't perfect, and a device at rest may not read exactly 9.8 (especially if you aren't at sea level; gravity is less in Denver than it is in New York). Instead you should take a long-term average and find out what that particular device reads at rest. Which might be what a high pass filter was doing, although I'd have to see the code to know for sure.
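For illustration, here is a minimal sketch of both points: asking for a slower delivery rate when registering the listener, and tracking a per-device resting baseline instead of assuming exactly 9.8. Class and field names are illustrative, not from the answer.

import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

public class NetAccelListener implements SensorEventListener {
    private double baseline = SensorManager.GRAVITY_EARTH; // refined over time
    private static final double ALPHA = 0.001;              // very slow adaptation

    public void start(SensorManager sm) {
        Sensor accel = sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
        // Ask for less frequent readings here instead of using Thread.sleep():
        // SENSOR_DELAY_NORMAL delivers events far less often than SENSOR_DELAY_FASTEST.
        sm.registerListener(this, accel, SensorManager.SENSOR_DELAY_NORMAL);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        float x = event.values[0], y = event.values[1], z = event.values[2];
        double magnitude = Math.sqrt(x * x + y * y + z * z);
        // Slowly learn what this particular device reads at rest; in practice you
        // might only update the baseline when the phone already looks stationary.
        baseline = (1 - ALPHA) * baseline + ALPHA * magnitude;
        double netAcceleration = magnitude - baseline; // roughly 0 when still
        // ... use netAcceleration here
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}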

Related

How do I perform geolocation to get data as accurate and precise as possible?

We are developing a multi-platform (Android and iOS) application in react-native which mainly deals with geolocation data, so it highly depends on the accuracy and precision of that data. For instance, the application accumulates - reads and saves - the user's geolocation data every 5 seconds for, say, 10 minutes (let's call it a track). So, during the track, about 200 measurements are saved. Later on, we perform different computations based on the data accumulated during the track and visualise the results to the user.
While on iPhone the altitude graph of data measured at a physically (nearly) identical altitude has rather linear/smooth characteristics, on Android there tend to be ±5 meter peaks. Let's say we want to measure the total meters climbed during the track, i.e. compute the sum of the differences between succeeding measurements. Now imagine how much the mentioned imprecision influences the result - there might be a 5 meter altitude change every 5 seconds. Although physically your total elevation gain is ±0 meters, the total elevation computed may differ drastically.
So, to eliminate the errors, for us, there are basically two ways:
to make the measurements more accurate and precise (this is natural)
to perform some sort of approximation and adjust the measurements accordingly (this is hacky)
The first way sounds much better, but we are aware of the specific device's hardware limitations - we simply cannot fix inaccurate hardware with our software. The problem is, even on the same device, other commercial apps (e.g. Runtastic) perform the measurement "better" (even offline). That makes me think about the second (approximation) way. This way is rather hacky, and although it can lead to good-looking results in many cases, we think (but cannot prove) there is no approximation so perfect that it will never corrupt the data "too much".
So, our questions are:
What are the techniques to make every altitude measurement as accurate and precise as possible on android? Which tools can be useful to accomplish that and how should we use them to get the best experience? We can think of:
gps altitude
nmea altitude
barometer
online API to get altitude based on latitude and longitude
Is it worth applying some sort of approximation to the accumulated data? If so, what aspects (or even better ideas) should we take into account? Is this technique common?
Is there any different solution we did not mention?
Thanks in advance!
GPS altitude is the best one, but it depends on the GPS chip; there are some with high accuracy (at high cost). Most phone GPS units also fetch assistance data online (to get a good hot start).
To smooth GPS data (or any other), you can use a Kalman filter. It is complex and heavy, but it handles noisy data well and avoids being misled by bad samples.
OR
you can discard misleading packets based on the previous ones (a mean filter) - simple, but it should be coded precisely.
NOTE: I haven't used the barometer; I gave my answer without it. Good luck.
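A minimal sketch of the "mean filter" idea: keep a short window of recent altitude samples and discard any new sample that deviates too far from the window mean. The window size and threshold below are illustrative choices, not values from the answer.

import java.util.ArrayDeque;
import java.util.Deque;

public class AltitudeMeanFilter {
    private static final int WINDOW = 8;          // last 8 accepted samples
    private static final double MAX_JUMP_M = 5.0; // reject > 5 m deviation from the mean
    private final Deque<Double> recent = new ArrayDeque<>();

    /** Add a sample and return the current windowed mean; outliers are replaced by the previous mean. */
    public double filter(double altitudeMeters) {
        double mean = recent.isEmpty() ? altitudeMeters : average();
        boolean outlier = recent.size() >= 3 && Math.abs(altitudeMeters - mean) > MAX_JUMP_M;
        recent.addLast(outlier ? mean : altitudeMeters);
        if (recent.size() > WINDOW) recent.removeFirst();
        return average();
    }

    private double average() {
        double sum = 0;
        for (double v : recent) sum += v;
        return sum / recent.size();
    }
}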
If you have a barometer then its accuracy can be a lot better for altitude than what you get from a phone-quality GPS. That is why high-end bike computers and GPS tracking watches have barometers in them. This is especially true if you are interested in relative altitude differences rather than the actual absolute altitude. To get a good absolute altitude with the barometer you need a sensible calibration scheme, and you need to be aware of the effects of the weather over time.
Whatever your source of altitude data, you should expect to do some form of post-processing on it to get something sensible. What that processing needs to be will depend on your situation and on what you consider important.
For getting the total climb over a track you have 3 cases to consider. The easy case is simply going up or down a single consistent hill: you want to get the correct altitude difference, which is easy to check with a good map. At the other end, going along a totally flat route you want to get zero climb - again nice and easy to check, but not so easy to achieve with typical hardware. In the middle, going over undulating terrain is far harder to check, although it is possibly the most interesting case for the user. Getting this accurate is some form of trade-off against the totally flat situation. I have code doing this in my app, and with the filtering I have I know that the totally flat case will over-read a bit, clocking up 20-30 m of climb in an hour, while the undulating case will under-count by about 1 m per undulation. The single big hill is generally pretty accurate given the known limitations of a barometric system. GPS-only is nothing like as good. Typically going back to an online lookup approach is more consistent for a GPS-based system, but it depends on the quality of the lookup data and the type of terrain.
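As an illustration of the trade-off described above, here is a rough sketch of accumulating total climb while ignoring small fluctuations. The 3 m hysteresis threshold is an arbitrary example value, not the filtering used in the answer's app.

public class TotalClimb {
    private static final double THRESHOLD_M = 3.0; // illustrative hysteresis threshold
    private double reference = Double.NaN;         // last "committed" altitude
    private double totalClimbMeters = 0;

    public void addSample(double filteredAltitudeMeters) {
        if (Double.isNaN(reference)) {
            reference = filteredAltitudeMeters;
            return;
        }
        double delta = filteredAltitudeMeters - reference;
        // Only commit a change once it exceeds the threshold, so noise on a
        // flat route does not keep adding tiny climbs.
        if (Math.abs(delta) >= THRESHOLD_M) {
            if (delta > 0) totalClimbMeters += delta;
            reference = filteredAltitudeMeters;
        }
    }

    public double getTotalClimbMeters() { return totalClimbMeters; }
}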

Averaging sensor data over a small time period or just picking a few values in a small time period

I am working on an Android app that collects accelerometer data. It produces far too much data in the onSensorChanged() listener, and I found out that you cannot really set a sampling rate (you can suggest one but Android can ignore it). I want something like an accelerometer reading every 0.5 seconds for about 10 minutes, for around 10 people. Is it better for me to capture the enormous amount of accelerometer data for 10 minutes and average it out for every 0.5 seconds, or should I find a workaround where I just probe onSensorChanged() every 0.5 seconds? Do you know of any similar solutions out there?
It actually depends on your application, or what you are trying to achieve here.
For example, if you care about accuracy over a short period of time, then let the callbacks come as fast as possible, collect the events and process them your way. This way you will not miss any data and the readings will be accurate.
On the other hand, if you are trying to keep the sensor alive for a longer period of time, then you should limit its rate. It will save your battery and avoid extra calls from the sensor.
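A minimal sketch of the bucketed-averaging approach from the question (collect every event, but emit one averaged reading per 0.5 s), assuming a registered SensorEventListener; the class name is illustrative.

import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;

public class BucketAverager implements SensorEventListener {
    private static final long BUCKET_NS = 500_000_000L; // 0.5 s in nanoseconds
    private long bucketStart = -1;
    private double sumX, sumY, sumZ;
    private int count;

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (bucketStart < 0) bucketStart = event.timestamp;
        sumX += event.values[0];
        sumY += event.values[1];
        sumZ += event.values[2];
        count++;
        if (event.timestamp - bucketStart >= BUCKET_NS) {
            // One averaged reading per half second, however fast the sensor fires.
            onAveragedSample(sumX / count, sumY / count, sumZ / count);
            bucketStart = event.timestamp;
            sumX = sumY = sumZ = 0;
            count = 0;
        }
    }

    protected void onAveragedSample(double x, double y, double z) {
        // store or process the averaged value here
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}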

Tracking phone (relative) height with precision within 10 cm

I am trying to track a changing phone height. The initial phone height can be inputted. I would like to track the height quite precisely (within a ±10 cm range) and for a prolonged period of time (at least a few minutes). Hopefully I would have height estimates every <100 ms. Ideally the phone should be kept in a pocket, so I don't think camera-related methods could work.
I have tried using the barometer, but it seems too inaccurate and slow. I have also tried numerically integrating the accelerometer data, but that seems to work for a few seconds at most.
I have been playing around with this for a while, and I was wondering if there is any better method I did not consider, or if my goal is possible at all. E.g., I could try finding better integration methods for the acceleration. Is my goal feasible?
Added: if I could use Wi-Fi RSSI, combined with acceleration, would that improve precision by much?
Your goal isn't feasible. Accelerometers are way too noisy, and can be affected by ambient vibrations. I'm less familiar with barometers, but they don't tend to be accurate. Remember you don't have scientific equipment in there; you have parts meant for consumer-level use.
Your best bet is actually probably the camera. Take a picture of an object of known height on the ground and use trig to figure out what the angle to it is. Of course that's not suited to general-purpose, continuous height tracking, but it is good enough for some purposes.
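As a rough sketch of the trig involved (my own reading of the suggestion, assuming a level floor and that you can measure the angles of depression from the horizontal to the object's base and top, e.g. from the phone's orientation sensors):

public class CameraHeightEstimate {
    /**
     * @param objectHeightM     known height of the reference object, in meters
     * @param depressionToBase  angle below horizontal to the object's base, in radians
     * @param depressionToTop   angle below horizontal to the object's top, in radians
     * @return estimated camera height above the floor, in meters
     */
    public static double cameraHeight(double objectHeightM,
                                      double depressionToBase,
                                      double depressionToTop) {
        // Two right triangles share the same ground distance d:
        //   tan(base) = H / d   and   tan(top) = (H - h) / d
        // => d = h / (tan(base) - tan(top)),  H = d * tan(base)
        double d = objectHeightM / (Math.tan(depressionToBase) - Math.tan(depressionToTop));
        return d * Math.tan(depressionToBase);
    }
}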

bad Accelerometer data with vibration

I am working on a bike computer app. I was hoping to work out the inclination of the slope using the accelerometer, but things are not working too well.
I have put in test code getting the sensor data. I am just sampling at the UI rate and keeping a moving average over 128 samples, which is about 6 seconds worth. With the phone in hand the data is good and I can calculate a good angle compared to my calibration flat vector.
With the phone mounted on the bike things are not at all good. I expected a good bit of noise, but I was hoping that the large number of samples over the big time window would remove the vibration effects and general bike movements. Unfortunately this just is not working: the magnitude of the acceleration vector is not staying around the 9.8 mark but is dropping lower, which indicates to me that something is not right somewhere.
Here is a plot of the data from part of a test ride.
As you can see, when stationary at the start the magnitude is OK, but once I get going it drops. I am fairly sure the problem is vibration related: I initially descend and there was heavy vibration, I then climb and the vibration is less and the magnitude gets back towards 9.8, but then I drop down quickly on a bad road and the magnitude ends up less than 3.
This is with a Sony Ericsson Xperia Active, which uses a BMA250 sensor; the datasheet suggests the sensor should be capable. My only theory for the cause of the problem is that the range is set to the 2g range and the vibration is causing data to go out of range, and this is causing my problems.
Has anyone seen anything like this?
Has anyone got any ideas on the cause of the problem?
Is there any way to change the sensitivity that I have not found?
Additional information.
OK, I logged the raw sensor data before my filtering. A very small portion is presented here.
The major axis is in green, and on the flat, as I believe this should be without the vibration, it should be about 8.5. There is no obvious clamping in the data, but I get more values below 8.5 than above it. Even if the sensor is set up for its most sensitive 2g range, it looks like the vibration should be OK: I have a maximum value here of just over 15 and a minimum of -10, well within a ±20 range, just not centered correctly on the 8.5 it should be.
I will dig out my other phone, which looks to have a slightly different sensor, a BMA150, and try with that, but unless it is perfect I think I will have to give up on the idea.
I suspect the accelerometer is not linear over such large G ranges. If so, and if there is any asymmetry, it will do what you see.
The solution for that is to pad the accelerometer mount a bit more (foam rubber, bungee cord, whatever), possibly mounting it on a heavier stage to filter the vibration more.
Or (not a good solution) try to model the error and compensate for it.
I used the same handset and by coincidence the same averaging interval of 6 seconds for an application a few years ago and I don't recall seeing the behaviour in the graph.
I'm wondering whether the issue is in the way the 6-second averages are being accumulated. One problem I had is that the sampling interval was not constant but depended on how busy the processor was. A sample is acquired in the specified time, but the calling of the event handler depends on the scheduler. When the processor is unloaded, sampling occurs at a constant frequency, but as the processor works harder the sampling frequency becomes slower and more erratic.
You can write your app to keep processor load low while sampling. What we did was sample for 6 seconds, doing nothing else, then stop sampling and process the last sample set, but this was only partially successful: you can't control other apps running at the same time, and the scheduler shares processor resources across them all. On the Xperia Active I found it could occasionally go out to seconds between samples, which I attributed to garbage collection in one of the JVMs.
The solution for us was to timestamp each sample, then run some quality checks over each sample set and discard those that failed. This is a poor solution, as defining what is good enough is imprecise, and when the user runs another app that uses a lot of resources most sample sets can be discarded, so the app needs additional logic to handle that.
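A minimal sketch of that timestamp-and-quality-check approach; the 100 ms gap limit is an illustrative value, not the threshold used in the original app.

import java.util.ArrayList;
import java.util.List;

public class SampleSetQualityCheck {
    private static final long MAX_GAP_NS = 100_000_000L; // reject gaps > 100 ms

    public static class Sample {
        public final long timestampNs;
        public final float x, y, z;
        public Sample(long timestampNs, float x, float y, float z) {
            this.timestampNs = timestampNs;
            this.x = x; this.y = y; this.z = z;
        }
    }

    private final List<Sample> set = new ArrayList<>();

    public void add(Sample s) { set.add(s); }

    /** True if no two consecutive samples are further apart than MAX_GAP_NS. */
    public boolean passesQualityCheck() {
        for (int i = 1; i < set.size(); i++) {
            if (set.get(i).timestampNs - set.get(i - 1).timestampNs > MAX_GAP_NS) {
                return false; // scheduler stalled; discard this sample set
            }
        }
        return !set.isEmpty();
    }

    public void clear() { set.clear(); }
}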
The current Android API, unavailable on the Xperia Active, should have eliminated this as samples can be batched as described at https://source.android.com/devices/sensors/hal-interface.html#batch_sensor_flags_sampling_period_maximum_report_latency .
If the algorithm assumed a particular number of samples rather than counting them, and the processor worked harder as the bike went faster (though I'm not sure why it would), it would produce something like the first graph, because when the bike is going downhill the magnitude goes down and when going uphill it goes up. There is a lot of speculation there, but a 6-second average giving a magnitude of less than 3 m/s^2 looks implausible from my experience with this sensor.

How to Calibrate Android Accelerometer & Reduce Noise, Eliminate Gravity

So, I've been struggling with this problem for some time, and haven't had any luck tapping the wisdom of the internets and related SO posts on the subject.
I am writing an Android app that uses the ubiquitous Accelerometer, but I seem to be getting an incredible amount of "noise" even while at rest, and can't seem to figure out how to deal with it as my readings need to be relatively accurate. I thought that maybe my phone (HTC Incredible) was dysfunctional, but the sensor seems to work well with other games and apps I've played.
I've tried to use various "filters" but I can't seem to wrap my mind around them. I understand that gravity must be dealt with in some way, and maybe that's where I am going wrong. Currently I have tried this, adapted from a SO answer, which refers to an example from the iPhone SDK:
// Low-pass filter: accel[] slowly tracks the constant (gravity) component.
accel[0] = event.values[0] * kFilteringFactor + accel[0] * (1.0f - kFilteringFactor);
accel[1] = event.values[1] * kFilteringFactor + accel[1] * (1.0f - kFilteringFactor);
// High-pass result: subtracting the gravity estimate leaves the quick changes.
double x = event.values[0] - accel[0];
double y = event.values[1] - accel[1];
The poster says to "play with" the kFilteringFactor value (kFilteringFactor = 0.1f in the example) until satisfied. Unfortunately I still seem to get a lot of noise, and all this seems to do is make the readings come in as tiny decimals, which doesn't help me all that much, and it appears to just make the sensor less sensitive. The math centers of my brain are also atrophied from years of neglect, so I don't completely understand how this filter is working.
Can someone explain to me in some detail how to go about getting a useful reading from the accelerometer? A succinct tutorial would be an incredible help, as I haven't found a really good one (at least aimed at my level of knowledge). I get frustrated because I feel like all of this should be more apparent to me. Any help or direction would be greatly appreciated, and of course I can provide more samples from my code if needed.
I hope I'm not asking to be spoon-fed too much; I wouldn't be asking unless I'd been trying to figure it out for a while. It also looks like there is some interest from other SO members.
To get a useful reading from the accelerometer you can use the magnitude of the acceleration vector: magnitude = sqrt(x*x + y*y + z*z). Using this, when the phone is at rest the magnitude will be that of gravity, about 9.8 m/s^2. So if you subtract that (SensorManager.GRAVITY_EARTH), then when the phone is at rest you will have a reading of about 0. As for noise, Blrfl might be right about cheap accelerometers; even when my phone is at rest, it continuously flickers by a few fractions of a m/s^2. You could just set a small threshold, e.g. 0.4 m/s^2, and if the magnitude doesn't go over that, consider the phone at rest.
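For reference, a minimal sketch of that magnitude-minus-gravity check (the 0.4 m/s^2 threshold is the example value from the answer; the class name is illustrative):

import android.hardware.SensorEvent;
import android.hardware.SensorManager;

public class RestDetector {
    private static final double REST_THRESHOLD = 0.4; // m/s^2

    /** Net acceleration magnitude with gravity removed (roughly 0 at rest). */
    public static double netAcceleration(SensorEvent event) {
        float x = event.values[0], y = event.values[1], z = event.values[2];
        double magnitude = Math.sqrt(x * x + y * y + z * z);
        return magnitude - SensorManager.GRAVITY_EARTH;
    }

    public static boolean isAtRest(SensorEvent event) {
        return Math.abs(netAcceleration(event)) < REST_THRESHOLD;
    }
}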
Partial answer:
Accuracy. If you're looking for high accuracy, the inexpensive accelerometers you find in handsets won't cut the mustard. For comparison, a three-axis sensor suitable for industrial or scientific use runs north of $1,500 for just the sensor; adding the hardware to power it and turn its readings into something a computer can use doubles the price. The sensor in a handset runs well below $5 in quantity.
Noise. Cheap sensors are inaccurate, and inaccuracy translates to noise. An inaccurate sensor that isn't moving won't always show zeros, it will show values on either side within some range. About the best you can do is characterize the sensor while motionless to get some idea how noisy it is and use that to round your measurements to a less-precise scale based on expected error. (In other words, If it's within ±x m/s^2 of zero, it's safe to say the sensor's not moving, but you can't be precisely sure because it could be moving very slowly.) You'll have to do this on every device, because they don't all use the same accelerometer and they all behave differently. I guess that's one advantage the iPhone has: the hardware's pretty much homogeneous.
Gravity. There's some discussion in the SensorEvent documentation about factoring gravity out of what the accelerometer says. You'll notice it bears a lot of similarity to the code you posted, except that it's clearer about what it's doing. :-)
HTH.
How do you deal with jitteriness? You smooth the data. Instead of looking at the sequence of values from the sensor as your values, you average them on an ongoing basis, and the new sequence formed becomes the values you use. This moves each jittery value closer to the moving average. Averaging necessarily gets rid of quick variations in adjacent values, which is why people use the terminology Low (frequency) Pass filtering: data that originally may have varied a lot per sample (or unit time) now varies more slowly.
E.g., instead of using the values 10 6 7 11 7 10, you can average these in many ways. For example, we can compute the next value from an equal weight of the running average (i.e., of your last processed data point) and the next raw data point. Using a 50-50 mix for the above numbers, we'd get 10, 8, 7.5, 9.25, 8.125, 9.0625. This new sequence, our processed data, would be used in lieu of the noisy data. And we could use a different mix than 50-50, of course.
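A minimal sketch of that running average, generalised to an adjustable mix (alpha); with alpha = 0.5 it reproduces the 50-50 example above.

public class RunningAverage {
    private final double alpha;   // 0.5 gives the 50-50 mix in the example
    private double value;
    private boolean initialised;

    public RunningAverage(double alpha) { this.alpha = alpha; }

    public double next(double raw) {
        if (!initialised) {
            value = raw;          // first processed value is the first raw value
            initialised = true;
        } else {
            value = alpha * raw + (1 - alpha) * value;
        }
        return value;
    }
}
// Feeding 10, 6, 7, 11, 7, 10 through new RunningAverage(0.5)
// yields 10, 8, 7.5, 9.25, 8.125, 9.0625, as in the text above.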
As an analogy, imagine you are reporting where a certain person is located using only your eyesight. You have a good view of the wider landscape, but the person is engulfed in a fog. You will see pieces of the body that catch your attention .. a moving left hand, a right foot, shine off eyeglasses, etc, that are jittery, BUT each value is fairly close to the true center of mass. If we run some sort of running averaging, we'd get values that approach the center of mass of that target as it moves through the fog and are in effect more accurate than the values we (the sensor) reported which was made noisy by the fog.
Now it seems like we are losing potentially interesting data to get a boring curve. It makes sense though. If we are trying to recreate an accurate picture of the person in the fog, the first task is to get a good smooth approximation of the center of mass. To this we can then add data from a complementary sensor/measuring process. For example, a different person might be up close to this target. That person might provide very accurate description of the body movements, but might be in the thick of the fog and not know overall where the target is ending up. This is the complementary position to what we first got -- the second data gives detail accurately without a sense of the approximate location. The two pieces of data would be stitched together. We'd low pass the first set (like your problem presented here) to get a general location void of noise. We'd high pass the second set of data to get the detail without unwanted misleading contributions to the general position. We use high quality global data and high quality local data, each set optimized in complementary ways and kept from corrupting the other set (through the 2 filterings).
Specifically, we'd mix in gyroscope data -- data that is accurate in the local detail of the "trees" but gets lost in the forest (drifts) -- into the data discussed here (from accelerometer) which sees the forest well but not the trees.
To summarize, we low pass data from sensors that is jittery but stays close to the "center of mass". We combine this base smooth value with data that is accurate in the detail but drifts, so this second set is high-pass filtered. We get the best of both worlds as we process each group of data to clean it of its incorrect aspects. For the accelerometer, we smooth/low pass the data effectively by running some variation of a running average on its measured values. If we were treating the gyroscope data, we'd do math that effectively keeps the detail (accepts deltas) while rejecting the accumulated error that would eventually grow and corrupt the accelerometer's smooth curve. How? Essentially, we use the actual gyro values (not averages), but use a small number of samples (of deltas) apiece when deriving our total final clean values. Using a small number of deltas keeps the overall average curve mostly along the same averages tracked by the low pass stage (by the averaged accelerometer data), which forms the bulk of each final data point.
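As a rough sketch of that complementary-filter idea (low-pass the accelerometer path, high-pass the integrated gyro path) applied to a tilt angle; the 0.98 weight is an illustrative choice, not taken from the answer.

public class ComplementaryTilt {
    private static final double GYRO_WEIGHT = 0.98; // illustrative mix
    private double angleRad;

    /**
     * @param gyroRateRadPerSec  angular rate around the relevant axis (gyroscope)
     * @param accelAngleRad      tilt angle computed from the accelerometer,
     *                           e.g. Math.atan2(y, z)
     * @param dtSec              time since the previous update
     */
    public double update(double gyroRateRadPerSec, double accelAngleRad, double dtSec) {
        // High-pass the gyro path (integrated deltas), low-pass the accel path.
        angleRad = GYRO_WEIGHT * (angleRad + gyroRateRadPerSec * dtSec)
                 + (1 - GYRO_WEIGHT) * accelAngleRad;
        return angleRad;
    }
}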
