Determining heading in Inertial Navigation Systems - android

I have a question regarding inertial navigation with a mobile device.
I am using an Android tablet for development, but I think the question applies to
all types of hardware (even with better sensors).
The most basic question when developing an inertial system is how to determine the
direction of the carrier's movement.
Even if we assume that the magnetometer readings are 100% accurate (which they obviously are not!), there is still the question of the device's orientation relative to the user.
Simple example: if the user is walking north but holds the device so that its Y axis points north-east (a link to a picture of the different axes: http://developer.android.com/reference/android/hardware/SensorEvent.html),
then the heading from the magnetometer will point towards north-east.
How can we tell which way the user is actually heading?
(The same will be true if we use both magnetometer and Gyro for determining heading)
A possible solution would be to use the accelerometer's Y-axis and X-axis readings,
something along the lines of arctan(a_y / a_x)
(for example, if the user holds the device perfectly straight, then the X axis will show nothing...).
But since the accelerometer's readings are not stable, it is not so easy...
Does anyone know of an algorithm that actually works? I am sure this is a well known problem, but I can't seem to find references to solutions...
Thanks in advance!
Ariel

See this answer for an idea: by obtaining the acceleration values in relation to the earth, you can then use the atan2 function to compute the actual direction.
You mention the user holds the tablet, and I assume fairly stably (unlike a case I am working on, where the user moves the phone constantly). Still, the user may at some point change the orientation of the device, and this may influence the readings you obtain.
Thus, in the event of an orientation change, you should call remapCoordinateSystem() accordingly to fix the readings you obtain.
NOTE: You can also use the getOrientation() method; the first element of the array it fills is the azimuth, i.e. the heading direction.
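For illustration, here is a minimal sketch of that idea (the class name and structure are mine, not from the answer above): build the rotation matrix from accelerometer and magnetometer readings, read the device azimuth with getOrientation(), and rotate the linear acceleration into the Earth frame so that atan2 of the horizontal components gives the direction of movement rather than the direction the device is pointing. Heavy smoothing would still be needed in practice.

    import android.hardware.Sensor;
    import android.hardware.SensorEvent;
    import android.hardware.SensorEventListener;
    import android.hardware.SensorManager;

    public class HeadingEstimator implements SensorEventListener {
        private final float[] gravity = new float[3];       // TYPE_ACCELEROMETER
        private final float[] geomagnetic = new float[3];   // TYPE_MAGNETIC_FIELD
        private final float[] linearAccel = new float[3];   // TYPE_LINEAR_ACCELERATION
        private float deviceAzimuth;                         // where the device points
        private double movementHeading;                      // direction the user accelerates in

        @Override
        public void onSensorChanged(SensorEvent event) {
            switch (event.sensor.getType()) {
                case Sensor.TYPE_ACCELEROMETER:
                    System.arraycopy(event.values, 0, gravity, 0, 3);
                    break;
                case Sensor.TYPE_MAGNETIC_FIELD:
                    System.arraycopy(event.values, 0, geomagnetic, 0, 3);
                    break;
                case Sensor.TYPE_LINEAR_ACCELERATION:
                    System.arraycopy(event.values, 0, linearAccel, 0, 3);
                    break;
            }

            float[] R = new float[9];
            if (SensorManager.getRotationMatrix(R, null, gravity, geomagnetic)) {
                // Azimuth of the device's Y axis (the "north-east" reading from
                // the question), in radians.
                float[] orientation = new float[3];
                SensorManager.getOrientation(R, orientation);
                deviceAzimuth = orientation[0];

                // Rotate device-frame linear acceleration into the world frame
                // (x = east, y = north, z = up); atan2 then gives the direction
                // of the acceleration, independent of how the device is held.
                float east  = R[0] * linearAccel[0] + R[1] * linearAccel[1] + R[2] * linearAccel[2];
                float north = R[3] * linearAccel[0] + R[4] * linearAccel[1] + R[5] * linearAccel[2];
                movementHeading = Math.atan2(east, north); // 0 = north, pi/2 = east
            }
        }

        @Override
        public void onAccuracyChanged(Sensor sensor, int accuracy) { }
    }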

The really right answer is to leave the device sitting there for a while, detect the rotation of the earth, and then compute true north from that. Unfortunately, the gyros in a mobile phone aren't accurate enough for that....

Related

How to use accelerometer on Android Wear to record rowing stroke

My goal is to have a simple stroke rate detector displayed on my Android Watch (Sony SmartWatch); for this I need to detect when the watch changes from moving forwards to moving backwards.
I have code working that will get the event values (x,y,z) as detected in the onSensorChanged event (and display them on the watch), but I am struggling to make sense of these.
I understand the values report acceleration along the given axis, and I understand that z includes gravity. But if these values report just acceleration, I am not clear how to know when there is a change of direction. I presume a positive number indicates acceleration, a value of 0 means constant speed, and a negative number means deceleration... is that correct? And if so, how can I detect when the watch has changed direction from going forwards to going backwards?
Thanks in advance.
Android Wear is no different than "conventional" Android when it comes to detecting motion. Basically, you need to consider exactly what the accelerometers are recording: raw acceleration (with a lot of noise). To determine motion from that, you need to look at trends over time, and probably integrate the smoothed accelerometer data. It's not trivial.
You'll probably want to use the TYPE_LINEAR_ACCELERATION sensor, because it filters out gravity (which isn't relevant to your use case). And because the watch will probably experience some rotation during a rowing stroke (which you want to factor out), you may need to combine TYPE_ROTATION_VECTOR with the acceleration vector to accurately determine direction changes.
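As a rough illustration of that suggestion (the names and the smoothing constant are assumptions, not part of the answer): low-pass filter TYPE_LINEAR_ACCELERATION along one axis, integrate it into a crude velocity, and count a stroke whenever the velocity changes sign. Drift and choosing the right axis are exactly the non-trivial parts mentioned above.

    import android.hardware.Sensor;
    import android.hardware.SensorEvent;
    import android.hardware.SensorEventListener;

    public class StrokeDetector implements SensorEventListener {
        private static final float ALPHA = 0.8f; // smoothing factor (assumed value)
        private float smoothedAccel;             // smoothed acceleration along the chosen axis
        private float velocity;                  // crude integrated velocity
        private long lastTimestampNs;
        private int strokeCount;

        @Override
        public void onSensorChanged(SensorEvent event) {
            if (event.sensor.getType() != Sensor.TYPE_LINEAR_ACCELERATION) return;

            // Pick one device axis (here Y) as the rowing direction.
            float raw = event.values[1];
            smoothedAccel = ALPHA * smoothedAccel + (1 - ALPHA) * raw;

            if (lastTimestampNs != 0) {
                float dt = (event.timestamp - lastTimestampNs) * 1e-9f; // ns -> s
                float previousVelocity = velocity;
                velocity += smoothedAccel * dt;

                // A sign change in the (rough) velocity suggests the watch
                // reversed direction, i.e. one stroke transition.
                if (previousVelocity > 0 && velocity < 0) {
                    strokeCount++;
                }
            }
            lastTimestampNs = event.timestamp;
        }

        @Override
        public void onAccuracyChanged(Sensor sensor, int accuracy) { }
    }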
A couple of other SO questions which may help point you in the right direction:
how to calculate phone's movement in the vertical direction from rest?
How can I get the direction of movement using an accelerometer?
I've not attempted this specific problem, of course, but I've done enough other work with Android motion sensors to know that you have a good programming challenge ahead of you. Enjoy it!

Reorientation of Accelerometer axes to car's axes in phonegap application

I am currently doing an application to detect potholes on the road through the accelerometer in the phone. The problem I have is that I need to reorient the accelerometer axes to align with the car's axes.
I found the explanation below in a report online, but I do not know how to use GPS to calculate the post-rotation angle or how to monitor the pre-rotation and tilt angles.
The explanation online:
"The phone can lie at any arbitrary orientation and, hence, it’s
embedded accelerometer. Therefore, it must be oriented along the vehicle’s axis before
analyzing the signals. This system uses an algorithm based upon Euler angles for
reorientation. The sensor is virtually rotated along the vehicle’s axis using pre-rotation,
tilt and post-rotation angles (Euler angles). The post-rotation angle is calculated using
GPS, so to avoid extra energy consumption the pre-rotation and tilt angles are
monitored continuously and whenever there is any significant change in these angles,
GPS is turned on and reorientation process is done again."
I have searched for ways to find the device orientation with phonegap but all I can find is the heading orientation plugin which seems to be used to give the compass direction of the phone.
Any advice or even an alternate way of doing this would be greatly appreciated.
I found the ebook "Pervasive Computing: 10th International Conference, Pervasive 2012, Newcastle, UK" useful for understanding how to program the reorientation of the accelerometer axes.
Here's a link to the page in the book that describes the process: https://books.google.ie/books?id=VTy6BQAAQBAJ&pg=PA7&lpg=PA7&dq=pre-rotation,+tilt+post-rotation+matrices&source=bl&ots=Py9GXtE7Io&sig=xfur3P7sv_XaR9ihOAsPXvgGiWw&hl=en&sa=X&ved=0ahUKEwiCmc3V0YfLAhXFPRoKHZuODpMQ6AEIKDAC#v=onepage&q=pre-rotation%2C%20tilt%20post-rotation%20matrices&f=false
Hope this helps anyone else trying to do this.
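For anyone who wants to see the reorientation step itself, here is a minimal sketch of the rotation described in the quoted report, written in Java for clarity (the math ports directly to a PhoneGap/JavaScript plugin). Estimating the angles themselves, gravity for pre-rotation and tilt, GPS bearing for post-rotation, is assumed to happen elsewhere; the class and method names are mine.

    public final class AxisReorientation {

        /** Applies R = Rz(post) * Rx(tilt) * Rz(pre) to a raw accelerometer sample. */
        public static double[] toVehicleFrame(double[] accel,
                                              double preRotation, double tilt, double postRotation) {
            double[] v = rotateZ(accel, preRotation);
            v = rotateX(v, tilt);
            return rotateZ(v, postRotation);
        }

        // Rotation about the Z axis by angle a (radians).
        private static double[] rotateZ(double[] v, double a) {
            double c = Math.cos(a), s = Math.sin(a);
            return new double[] { c * v[0] - s * v[1], s * v[0] + c * v[1], v[2] };
        }

        // Rotation about the X axis by angle a (radians).
        private static double[] rotateX(double[] v, double a) {
            double c = Math.cos(a), s = Math.sin(a);
            return new double[] { v[0], c * v[1] - s * v[2], s * v[1] + c * v[2] };
        }
    }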

I was wondering if anyone knew what the 3 axes on an accelerometer are?

I am trying to create an algorithm to record data from an accelerometer, and I was wondering if anyone knew what the x, y and z axis values are, exactly?
If you have a look at the Sensors Overview:
Measures the acceleration force in m/s^2 that is applied to a device on
all three physical axes (x, y, and z), including the force of gravity.
Well, I can't go into too much detail, but I can give you some insight.
The accelerometer measures the forces acting on the phone, so if you put the phone flat on a table, screen up, it should output roughly x=0, y=0, z=9.81 m/s^2 (which axis carries gravity depends on which way the device is facing). The reason for this is that the table applies a force to the phone that keeps it from falling to the ground. As a consequence, a reading of all zeroes would imply free fall.
This means that often one of the first tasks when gathering accelerometer data is to determine which axis (or combination of axes) reads the force of gravity, and thereby know which part of the phone is facing up.
This guy explains it rather well: https://www.youtube.com/watch?v=KZVgKu6v808
Hope this is clear enough, otherwise attend to the documentation as mharper suggests.
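A tiny sketch of the point above (illustrative names, not from the answer): when the device is at rest, gravity (~9.81 m/s^2) shows up almost entirely on one axis, so the axis with the largest magnitude tells you which side of the phone faces up.

    public final class GravityAxis {

        /** Returns "x", "y" or "z" for the axis closest to the gravity vector. */
        public static String dominantAxis(float[] accel) {
            float ax = Math.abs(accel[0]);
            float ay = Math.abs(accel[1]);
            float az = Math.abs(accel[2]);
            if (ax >= ay && ax >= az) return "x";
            if (ay >= az) return "y";
            return "z";
        }

        public static void main(String[] args) {
            // Example: phone lying flat on a table, screen up.
            float[] flatOnTable = { 0.1f, 0.2f, 9.78f };
            System.out.println(dominantAxis(flatOnTable)); // prints "z"
        }
    }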

Detect horizontal and vertical movement of the Smartphone on a table

I'm trying to write a small Android game where the phone is placed on the table.
On the screen there is a ball, whose movement the user controls by moving the phone.
Throughout the game the user won't lift the phone from the table.
At the beginning, the ball is placed in the middle of the screen.
Pushing the phone away from the user should move the ball toward the top of the smartphone screen, and from the ball's current position, moving the phone back toward the user and to the right will move the ball accordingly.
I read the Android Motion Sensors Guide carefully, but I still can't work out which sensor (or combination of sensors) I should use.
I would love to get any directions.
First of all, TYPE_LINEAR_ACCELERATION, TYPE_ROTATION_VECTOR and TYPE_GRAVITY are not physical sensors; they are derived through sensor fusion.
Secondly, from Android 4+ these fused sensors make use of the device's gyroscope, so they WON'T work if the device doesn't have a gyroscope.
So if you want to make a generic app that runs on all phones, prefer using only the accelerometer (TYPE_ACCELEROMETER).
Now, for your case, since the user won't lift the phone from the table, you can easily subtract the gravity component from the accelerometer readings if you want. See http://developer.android.com/reference/android/hardware/SensorEvent.html
under the Sensor.TYPE_ACCELEROMETER section (code is given there too; a sketch of the same idea follows below).
Then see How can I find distance traveled with a gyroscope and accelerometer? to find the linear displacement; the first answer states there is NO use of the gyroscope. (Or you can just google for finding the displacement/linear velocity from accelerometer readings.)
Hope this all gives you quite a good idea.
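As a reference for the gravity-subtraction step mentioned above, here is a sketch of the low-pass filter approach that the SensorEvent documentation describes (the class name and the filter constant are mine):

    public class LinearAccelerationFilter {
        private static final float ALPHA = 0.8f; // filter constant (assumed value)
        private final float[] gravity = new float[3];
        private final float[] linear = new float[3];

        /** Feed each TYPE_ACCELEROMETER event's values; returns acceleration minus gravity. */
        public float[] filter(float[] accel) {
            for (int i = 0; i < 3; i++) {
                gravity[i] = ALPHA * gravity[i] + (1 - ALPHA) * accel[i]; // low-pass: isolates gravity
                linear[i] = accel[i] - gravity[i];                        // high-pass: isolates motion
            }
            return linear;
        }
    }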
It's really difficult to do this type of linear position sensing using the types of sensors that smartphones have. Acceleration is the second derivative of position with respect to time. So in theory, you could use the accelerometer data, integrating twice in time to achieve the absolute position. In practice, noise makes that calculation inaccurate.
On the other hand, if you're really talking about putting the camera facedown on the table, maybe you can come up with some clever way of using OpenCV's optical flow features to detect motion using the camera, almost like an optical mouse. I don't know how you would light the surface (the flash would probably burn through the battery) but it might be possible to work something out.

How to use the sensor values in the same way as Google Sky Map?

I am trying to replicate how Google Sky Map makes use of sensor inputs. I currently use getOrientation and the accelerometer values to determine "up" and rotation. This works fine while the phone is held perpendicular to the horizon, but it starts to fail when the phone points toward the ground: for example, when the phone is laid flat on a table, spinning it does not affect the reading I use (accel_x), whereas when it is held perpendicular the values are useful and I am able to rotate the display based on them.
(I currently display a basic horizon quad)
I am thinking that I need to make use of more than one value here, but am quite at a loss as to how to do it.
Any pointers?
Also, is there a 2.1 alternative to Display.getRotation?
Thanks in advance
Note: I am only interested in the roll (in the aviation sense) of the phone, regardless of the pitch in which it is held.
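For reference, one hedged sketch of the kind of approach that might help here (not from an accepted answer): build the rotation matrix from accelerometer and magnetometer readings, remap it using the augmented-reality example from the SensorManager.remapCoordinateSystem() documentation so the camera axis becomes the reference, and take roll from getOrientation(). Whether orientation[2] matches the exact roll needed here may take some experimentation.

    import android.hardware.SensorManager;

    public final class RollHelper {

        /** accelValues/magneticValues are the latest accelerometer and magnetic field readings. */
        public static float rollRadians(float[] accelValues, float[] magneticValues) {
            float[] inR = new float[9];
            float[] outR = new float[9];
            float[] orientation = new float[3];

            if (!SensorManager.getRotationMatrix(inR, null, accelValues, magneticValues)) {
                return Float.NaN; // sensors not ready (or device in free fall)
            }
            // Remapping taken from the augmented-reality example in the
            // remapCoordinateSystem() documentation (Y axis along the camera axis).
            SensorManager.remapCoordinateSystem(inR, SensorManager.AXIS_X,
                    SensorManager.AXIS_Z, outR);
            SensorManager.getOrientation(outR, orientation);
            return orientation[2]; // roll, in radians
        }
    }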
