Using the Android accelerometer to calculate an exact relative position of the device - android

I'm trying to use an Android phone's internal sensors to define a 3D object for use in a video game. I therefore need to build a map of the phone's orientation and its location over time, relative to a starting position. I tried creating a test app that kept a float[] (x, y, z) as the 'current location' and kept adding the sensor data to it, but the noise from the sensors made the current location slowly wobble off the grid I was using to visualize the feedback. I've looked through filters online, and all the explanations are just confusing. Is there any way to get a cleaner output from the sensors? Is there a better way of tracking the phone's exact relative location?

You will not be able to accurately track a device's position with a typical phone or tablet. In theory, you can use a device's internal sensors to measure its acceleration in each axis. In theory, you integrate acceleration once to obtain speed and once more to obtain change in position. Thus, in theory, you should be able to calculate a device's position relative to its starting location. HOWEVER, in practice, things don't work out so well.
A device's sensors are not perfect and contain small errors. Integrating these small errors twice turns them into large errors. These large errors accumulate and end up making your calculations useless.
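To make the failure mode concrete, here is a minimal sketch of the naive dead-reckoning approach described above, assuming Android's TYPE_LINEAR_ACCELERATION sensor (gravity already removed); the class name is made up for illustration. Even with the device at rest, the small bias in each sample is integrated twice and the computed position wanders off:

```java
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;

// Naive dead reckoning by double integration. Illustrates the drift
// problem described above; not usable for real position tracking.
public class DeadReckoningListener implements SensorEventListener {
    private final float[] velocity = new float[3]; // m/s
    private final float[] position = new float[3]; // m
    private long lastTimestampNs = 0;

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (lastTimestampNs != 0) {
            float dt = (event.timestamp - lastTimestampNs) * 1e-9f; // ns -> s
            for (int i = 0; i < 3; i++) {
                velocity[i] += event.values[i] * dt; // 1st integration: a -> v
                position[i] += velocity[i] * dt;     // 2nd integration: v -> x
                // Any constant sensor bias in values[i] grows as t^2 here.
            }
        }
        lastTimestampNs = event.timestamp;
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}
```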
Take a look at Project Tango, which contains a bunch of fancy sensors and algorithms to implement 6 DOF tracking. This device was built by a research team and is still a work in progress, so keep in mind that it's a tough problem.

Related

What are the common ways to test that sensors are working properly?

I'm going to create an app which uses the sensors in a smartphone. Before doing that, I want to test the smartphone's sensors so that I can evaluate their correctness. What are the common ways to achieve that? I'm going to use the phone's accelerometer, but I also wonder about the other sensors.
Using a level, verify that a surface is flat. Place the phone on the surface and check that one axis reads 1 g while the other two read zero. You can repeat this for the other two axes to confirm that each reads 1 g under gravity, but you will need a holder that keeps the phone at exactly a right angle to the surface (or, alternatively, stand it on the floor with its back against a wall, once you know the wall and floor are level).
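A minimal sketch of that check in code, assuming the phone lies flat so gravity should fall almost entirely on the z axis; the tolerance is an arbitrary illustrative value, not an Android constant:

```java
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.util.Log;

// Verifies that a phone lying flat reads ~1g on z and ~0 on x and y.
public class FlatCheckListener implements SensorEventListener {
    private static final float TOLERANCE = 0.3f; // m/s^2, illustrative

    @Override
    public void onSensorChanged(SensorEvent event) {
        float x = event.values[0], y = event.values[1], z = event.values[2];
        boolean ok = Math.abs(x) < TOLERANCE
                && Math.abs(y) < TOLERANCE
                && Math.abs(z - SensorManager.GRAVITY_EARTH) < TOLERANCE;
        Log.d("FlatCheck", String.format("x=%.2f y=%.2f z=%.2f -> %s",
                x, y, z, ok ? "reads 1g as expected" : "off"));
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}
```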

Is it possible to build an indoor localization system with Android device inertial sensors?

I'm working on an indoor positioning system, and I have one doubt after all my real-world tests failed:
I have done some work with Android sensor values and some machine learning algorithms (with good theoretical results), but in a real environment I found some problems.
My proposal was to have three phases:
The first phase consists of collecting data through an Android app showing a map with some points. You move to a point's real position and save the sensor values associated with that point's coordinates.
The second phase consists of creating a machine learning model (in this case, a classifier) to predict the user's position from the sensor values at any given time.
Then we export the classifier to the device and get predictions of the user's position in real time.
The data we stored in the fingerprinting phase (phase 1) were the x, y, z values of the accelerometer, magnetometer, and gyroscope as given by the Android SensorManager. In a second approach, we used a median filter to remove noise from those values. Our problem is that the way you hold the phone changes the measurements: Android sensor values are given in the device coordinate system, so they vary with the phone's orientation and tilt.
Android Device Coordinate System
So, the question is:
Is it possible, or is there a way, to build an indoor localization system (with a positioning accuracy of around 2-3 meters) that works in real environments, taking into account only the Android smartphone's sensors (accelerometer, gyroscope, and magnetometer) and using machine learning (or other) algorithms?
Thanks in advance!!
There are a few companies that started doing fingerprinting based solely on the magnetometer, but as far as I know they ended up at least mixing it with other technologies, like BLE beacons or similar.
From what I was told, the problem is that magnetic fields can change drastically due to changes inside your building, but also due to changes outside of your scope (e.g., thunderstorms).
Taking one step back, I see another problem with your approach: different device models behave radically differently in terms of the data their sensors provide. To make things worse, the same device may provide very different data today than it did yesterday. This is especially true for the magnetometer, at least in my experience.
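On the orientation problem raised in the question: a common mitigation (though not a full solution) is to rotate each sample from the device frame into the Earth frame using the rotation vector sensor, so the values depend less on how the phone is held. A minimal sketch, assuming a recent TYPE_ROTATION_VECTOR reading is available alongside each sample:

```java
import android.hardware.SensorManager;

// Rotates a device-frame vector (e.g. one accelerometer sample) into
// the Earth frame, using the latest TYPE_ROTATION_VECTOR values.
public final class FrameRemap {
    public static float[] deviceToEarth(float[] rotationVector, float[] deviceVec) {
        float[] r = new float[9];
        SensorManager.getRotationMatrixFromVector(r, rotationVector);
        // The rotation matrix maps device coordinates to world coordinates.
        float[] world = new float[3];
        for (int i = 0; i < 3; i++) {
            world[i] = r[3 * i] * deviceVec[0]
                     + r[3 * i + 1] * deviceVec[1]
                     + r[3 * i + 2] * deviceVec[2];
        }
        return world;
    }
}
```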

Drawing in 2D space using Accelerometer (gyroscope?)

I am trying to create an application that will track the movement of the device in 2D space. After doing research online, all I could find is that one way to do it is to integrate the linear acceleration twice, but the error is horrible.
Are there any solutions to this problem? I would like to be able to move my phone up, which would cause a vertical line to be drawn on the screen, scaled to how far the phone was moved. Then if I move the phone to the left, a horizontal line would be drawn - effectively allowing me to draw on the screen using movements of the phone.
Can this be done at all? If so, what direction should I take in the development? I don't know where to start...
EDIT: More about the project:
I am trying to make an exercise app that will track the movement of a leg or arm: for example, when you are doing stomach crunches with the phone attached to your ankle by an arm strap.
The app would track repeated movements of the leg.
Unfortunately, the accelerometers in these phones are nowhere near what you need to implement an inertial measurement unit. The big problem is that, since you are integrating twice, you run into the fact that an integration always comes with a constant: ∫x dx = x²/2 + C. This constant is what makes this difficult. To make things worse, you get it twice: once when integrating to get velocity and once more to get position.
One method of fixing this that I have seen in commercial inertial measurement units is called a zero velocity null: you use some other source of data to tell when the motion of the device has stopped so you can zero out the velocity. For example, I saw a project that put an inertial measurement unit on a shoe and zeroed the velocity whenever it detected the shoe being put on the ground, which vastly improved the accuracy. It's possible that you could use a camera or something similar to determine this, though I have not seen it done. If you would like to start experimenting with this, then you are an awesome person and I would love to hear how it turns out.
Edit: I should clarify that the constant I mention above is where the error accumulates. If you can apply a zero velocity null, you periodically drop the accumulated error from your stored current velocity. The error in position will still accumulate, but this keeps it from drifting while the user holds the device relatively still, which may make it passable for drawing.
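A hedged sketch of that idea, assuming the stationarity test is "the gravity-free acceleration magnitude stays near zero for a while"; the threshold and window length are illustrative guesses, not tuned values:

```java
// Zero velocity null: when the device looks stationary, reset the
// integrated velocity to drop the accumulated error.
public class ZuptIntegrator {
    private static final float STILL_THRESHOLD = 0.15f; // m/s^2, illustrative
    private static final int STILL_SAMPLES = 25;        // ~0.5 s at 50 Hz

    private final float[] velocity = new float[3];
    private int stillCount = 0;

    /** linearAccel: gravity-free acceleration (m/s^2); dt: seconds elapsed. */
    public void step(float[] linearAccel, float dt) {
        for (int i = 0; i < 3; i++) {
            velocity[i] += linearAccel[i] * dt;
        }
        double mag = Math.sqrt(linearAccel[0] * linearAccel[0]
                + linearAccel[1] * linearAccel[1]
                + linearAccel[2] * linearAccel[2]);
        stillCount = (mag < STILL_THRESHOLD) ? stillCount + 1 : 0;
        if (stillCount >= STILL_SAMPLES) {
            // Motionless long enough: null the velocity, dropping its error.
            velocity[0] = velocity[1] = velocity[2] = 0f;
        }
    }
}
```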
I know of no way other than integrating the acceleration twice.
Moreover, I think it's not possible unless you also take into account the other sensors that might be in your device (for example, one of my devices exposes 7 (seven) sensors for various physical signals the device might be receiving).
Other than that, remember that sensor data is noisy and almost always must be pre-filtered. For example, you can use the geometric mean of the last 10 samples. That should lower your error by providing smoother input data to the integrating function.
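Note that a geometric mean is awkward for signed accelerometer values; an arithmetic moving average over the last N samples is the more usual smoother. A minimal sketch of one, with N = 10 as suggested above:

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Simple moving average over the last N samples of one axis.
public class MovingAverage {
    private final int size;
    private final Deque<Float> window = new ArrayDeque<>();
    private float sum = 0f;

    public MovingAverage(int size) { this.size = size; }

    /** Adds a sample and returns the smoothed value. */
    public float add(float sample) {
        window.addLast(sample);
        sum += sample;
        if (window.size() > size) {
            sum -= window.removeFirst();
        }
        return sum / window.size();
    }
}
```

One instance per axis (e.g. new MovingAverage(10) for x) smooths the input before it reaches the integrating function.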

How to measure the speed of a car by phone with accelerometer and gyroscope?

I want to know the current speed of a car and reconstruct the path it has traveled. I have an Android phone with an accelerometer and a gyroscope that send me data. This data is in the phone's coordinate system, which probably isn't the same as the car's coordinate system.
How can I transform these accelerations and rotations into the car's coordinate system?
The generic answer to your generic question is no. Acceleration measures changes in speed, so the best you could get from acceleration is the speed variation.
To get the absolute speed you would have to know the initial speed and add the speed change to it:
v(t) = v0 + a*t
So, if you had a car moving along a straight line, with your device fixed to the car, you could easily get the speed changes (although measurement errors will add up and quickly lead to discrepancies).
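A minimal sketch of that idealized straight-line case, assuming the initial speed v0 is known and one sensor axis is aligned with the direction of travel (which, as listed below, are exactly the assumptions that break in practice):

```java
// Straight-line case, device fixed to the car: integrate forward-axis
// acceleration to track speed from a known initial speed v0.
public class SpeedEstimator {
    private float speed; // m/s

    public SpeedEstimator(float initialSpeed) { this.speed = initialSpeed; }

    /** forwardAccel: m/s^2 along the direction of travel; dt: seconds. */
    public float step(float forwardAccel, float dt) {
        speed += forwardAccel * dt; // discrete form of v(t) = v0 + a*t
        return speed;               // sensor bias accumulates here over time
    }
}
```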
In practice you will face many issues trying to implement it, namely:
You need the initial speed to be determined in the same reference frame as the acceleration. This would require some measurements and a lot of trigonometry, as you would get the two values from different sensors at different rates.
The car will not move in a straight line, so your acceleration reference frame will be constantly moving (a lot more trigonometry and calculus).
If the device is in the user's hand, the device's movements relative to the car will add even more calculations (and accumulated errors).
Regards.
You need some sort of external reference (GPS, for example): if you just integrate the acceleration, the error will grow indefinitely.
That is because these sensors are not accurate enough; the error will quickly get out of control. (The linked answer is about position, but the same holds for velocity.)
In the case of a car, you are better off with GPS. If you want to do something fancy, you could enforce environmental constraints deduced from a map, that is, assume that the car travels on roads and not through buildings, etc. You will find more details on this in Chapter 5 of the PhD thesis entitled Pedestrian Localisation for Indoor Environments.
It looks like it's possible to do. I don't have an Android-specific example, but this forum has quite a lot of discussion about it: http://www.edaboard.com/thread119232.html
It would be a lot easier if you used the Android Location class, though. Specifically, the getSpeed() method should give you what you need: http://developer.android.com/reference/android/location/Location.html
The Location class relies on a location provider, though, so your app will require the appropriate permissions.
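A minimal sketch of that approach, assuming ACCESS_FINE_LOCATION has been granted; hasSpeed() guards against fixes that carry no speed estimate:

```java
import android.content.Context;
import android.location.Location;
import android.location.LocationListener;
import android.location.LocationManager;
import android.os.Bundle;
import android.util.Log;

// Reads the provider-computed speed via Location.getSpeed().
// Requires the ACCESS_FINE_LOCATION permission to be granted.
public class SpeedTracker implements LocationListener {
    public void start(Context context) {
        LocationManager lm =
                (LocationManager) context.getSystemService(Context.LOCATION_SERVICE);
        lm.requestLocationUpdates(LocationManager.GPS_PROVIDER, 1000, 0, this);
    }

    @Override
    public void onLocationChanged(Location location) {
        if (location.hasSpeed()) {
            Log.d("SpeedTracker", "speed = " + location.getSpeed() + " m/s");
        }
    }

    @Override public void onStatusChanged(String provider, int status, Bundle extras) { }
    @Override public void onProviderEnabled(String provider) { }
    @Override public void onProviderDisabled(String provider) { }
}
```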
Neither the accelerometer nor the gyroscope delivers anything if the car travels at the same constant speed for some time (constant speed means zero acceleration). The only way is GPS, which provides a calculated speed with every location fix.

How to detect user movement in Android by using WiFi RSSI or accelerometer?

In my project, I want to detect whether a user is moving or not by using either the WiFi RSSI or the accelerometer sensor.
What should I do to achieve that?
It actually all depends on what kind of movement you want to detect.
WiFi RSSIs: From a starting position and scan results (initial RSSIs for newly discovered access points), you can track how their signal quality evolves over time. A short displacement of the user will not be easy to detect, as RSSI values are affected by a large number of parameters (orientation, obstacles, the setup of the room, atmospheric conditions, people around). You would therefore need averaged values (so scans must be performed quickly to gather enough data), and leaving an access point's perimeter would make you lose the information.
Accelerometer: Depends on the quality of the sensor you are using. If you're using the sensors embedded in smartphones, it will be tough: their accuracy is poor, and since you'll need to integrate the values (m/s² to get m/s), the error will grow accordingly. It may also be hard to distinguish real user movement from mere tilting of the device if you're using a mobile phone or tablet (see the sketch after this answer for one way around that).
Without really knowing the details of your project, I believe that RSSIs should be easier to use if the motion you need to detect is not too tiny. If you want something more precise, you'll need considerably more research work.
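For the accelerometer route, a minimal movement detector can threshold how much the acceleration magnitude varies over a short window, so it reacts to actual motion rather than to a static tilt; the window size and threshold below are illustrative guesses:

```java
// Flags movement when the acceleration magnitude varies more than a
// threshold over a sliding window. Gravity contributes a constant ~9.8
// to the magnitude, so a motionless tilted phone still reads "still".
public class MovementDetector {
    private static final int WINDOW = 50;        // samples, illustrative
    private static final double THRESHOLD = 0.5; // m/s^2 std dev, illustrative

    private final double[] mags = new double[WINDOW];
    private int index = 0, filled = 0;

    /** Feed one accelerometer sample (m/s^2); returns true if moving. */
    public boolean addSample(float x, float y, float z) {
        mags[index] = Math.sqrt(x * x + y * y + z * z);
        index = (index + 1) % WINDOW;
        if (filled < WINDOW) filled++;

        double mean = 0;
        for (int i = 0; i < filled; i++) mean += mags[i];
        mean /= filled;

        double variance = 0;
        for (int i = 0; i < filled; i++) {
            double d = mags[i] - mean;
            variance += d * d;
        }
        return Math.sqrt(variance / filled) > THRESHOLD;
    }
}
```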
See Android accelerometer accuracy (Inertial navigation) for RSSI-based indoor localization.
