Reorientation of Accelerometer axes to car's axes in phonegap application - android

I am currently building an application to detect potholes in the road using the phone's accelerometer. The problem I have is that I need to reorient the accelerometer axes to align with the car's axes.
I found the explanation below in a report online, but I do not know how to use GPS to calculate the post-rotation angle, or how to monitor the pre-rotation and tilt angles.
The explanation online:
"The phone can lie at any arbitrary orientation and, hence, it’s
embedded accelerometer. Therefore, it must be oriented along the vehicle’s axis before
analyzing the signals. This system uses an algorithm based upon Euler angles for
reorientation. The sensor is virtually rotated along the vehicle’s axis using pre-rotation,
tilt and post-rotation angles (Euler angles). The post-rotation angle is calculated using
GPS, so to avoid extra energy consumption the pre-rotation and tilt angles are
monitored continuously and whenever there is any significant change in these angles,
GPS is turned on and reorientation process is done again."
I have searched for ways to find the device orientation with PhoneGap, but all I can find is the heading orientation plugin, which seems to give only the compass direction of the phone.
Any advice or even an alternate way of doing this would be greatly appreciated.

I found the e-book "Pervasive Computing: 10th International Conference, Pervasive 2012, Newcastle, UK" useful for understanding how to program the reorientation of the accelerometer axes.
Here's a link to the page in the book that describes the process: https://books.google.ie/books?id=VTy6BQAAQBAJ&pg=PA7&lpg=PA7&dq=pre-rotation,+tilt+post-rotation+matrices&source=bl&ots=Py9GXtE7Io&sig=xfur3P7sv_XaR9ihOAsPXvgGiWw&hl=en&sa=X&ved=0ahUKEwiCmc3V0YfLAhXFPRoKHZuODpMQ6AEIKDAC#v=onepage&q=pre-rotation%2C%20tilt%20post-rotation%20matrices&f=false
Hope this helps anyone else trying to do this.
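In case it helps, here is a rough sketch (native Android Java rather than PhoneGap; gravity, geomagnetic, accel and bearingDeg are assumed to come from your own sensor/location plumbing) of an equivalent way to get accelerations in the car's frame. Instead of hand-deriving the pre-rotation/tilt/post-rotation matrices, it uses SensorManager's rotation matrix for the tilt part and the GPS bearing for the post-rotation about the vertical axis:
// Sketch only: "gravity" and "geomagnetic" are low-pass filtered accelerometer / magnetometer readings,
// "accel" is the raw accelerometer sample, "bearingDeg" comes from Location.getBearing() (degrees clockwise from north).
float[] R = new float[9];
if (SensorManager.getRotationMatrix(R, null, gravity, geomagnetic)) {
    // Device frame -> world frame (x = east, y = north, z = up).
    float east  = R[0] * accel[0] + R[1] * accel[1] + R[2] * accel[2];
    float north = R[3] * accel[0] + R[4] * accel[1] + R[5] * accel[2];
    float up    = R[6] * accel[0] + R[7] * accel[1] + R[8] * accel[2];

    // World frame -> vehicle frame: rotate about the vertical axis by the GPS bearing.
    double b = Math.toRadians(bearingDeg);
    float forward  = (float) (east * Math.sin(b) + north * Math.cos(b)); // along the direction of travel
    float lateral  = (float) (east * Math.cos(b) - north * Math.sin(b)); // across the car
    float vertical = up; // still includes ~9.8 m/s^2 of gravity
    // Pothole detection would typically watch for spikes in "vertical".
}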

Related

How to retrieve high quality compass orientation (as in Google Maps)?

All of the guides to getting compass orientation in Android I've found have a bug: when you hold the phone in portrait mode and "look" above the horizon, the compass arrow turns 180 degrees from the correct direction.
Google Maps orientation indicator doesn't have this problem.
Another nice thing Google Maps has is that it somehow estimates compass accuracy. Any idea how they do this?
The error you are asking about is caused by Euler angles, i.e. gimbal lock.
To solve the very large angle difference, check whether your device is flat or not by computing the inclination, either from the gravity sensor or from the accelerometer run through a low-pass filter. If the device is not flat (e.g. an AR-style pose, with inclination roughly between 25° and 155°), remap the coordinate system from x,y to x,z. That will solve the issue.
SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);
SensorManager.remapCoordinateSystem(rotationMatrix, SensorManager.AXIS_X, SensorManager.AXIS_Z, remappedRM);
SensorManager.getOrientation(remappedRM, orientation);
This solves the gimbal lock. However, every 3D compass I've seen on the Play Store (normal compasses don't give correct results because they are only mapped in x,y and hit gimbal lock) and every code sample here shows a difference between keeping your device lying flat and holding it with the screen pointing at you. That difference sometimes reaches 10 degrees, and I haven't been able to solve it. Mostly the difference is between 1 and 5 degrees, but I sometimes see it rise to 10 degrees, which is not acceptable.
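For reference, a rough sketch of the inclination check plus remap described above, assuming a TYPE_ROTATION_VECTOR listener (the 25°/155° thresholds are the ones from this answer):
private final float[] rotationMatrix = new float[9];
private final float[] remappedRM = new float[9];
private final float[] orientation = new float[3];

@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() != Sensor.TYPE_ROTATION_VECTOR) return;
    SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);
    // rotationMatrix[8] is the cosine of the angle between the device z axis and the vertical.
    float inclination = (float) Math.toDegrees(Math.acos(rotationMatrix[8]));
    if (inclination > 25f && inclination < 155f) {
        // Device is held upright-ish: remap x,y -> x,z to avoid gimbal lock in the azimuth.
        SensorManager.remapCoordinateSystem(rotationMatrix, SensorManager.AXIS_X, SensorManager.AXIS_Z, remappedRM);
        SensorManager.getOrientation(remappedRM, orientation);
    } else {
        // Device is lying roughly flat: no remap needed.
        SensorManager.getOrientation(rotationMatrix, orientation);
    }
    float azimuthDeg = (float) Math.toDegrees(orientation[0]);
}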
Google measures azimuth from your location. The bearing can be read straight from the location fix:
currentLocation.getBearing();
The accuracy of the latitude/longitude fix (i.e. of your current location) determines the accuracy of the bearing.
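For completeness, a rough sketch of reading the bearing from location updates (plain LocationManager here; a fused provider works the same way, and this needs the usual location permission):
LocationManager lm = (LocationManager) getSystemService(Context.LOCATION_SERVICE);
lm.requestLocationUpdates(LocationManager.GPS_PROVIDER, 1000, 1, new LocationListener() {
    @Override
    public void onLocationChanged(Location location) {
        if (location.hasBearing()) {
            // Degrees east of true north; only meaningful while the device is actually moving.
            float bearingDeg = location.getBearing();
        }
    }
    @Override public void onStatusChanged(String provider, int status, Bundle extras) {}
    @Override public void onProviderEnabled(String provider) {}
    @Override public void onProviderDisabled(String provider) {}
});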
Ways of building a compass, from most accurate to least accurate:
Use the GPS/Wi-Fi (Google's fused) location and get the bearing from the location, as sketched above.
Use the Rotation Vector (requires a magnetic field sensor, so check that one is available). It is a fused sensor (magnetic field + gyroscope + accelerometer plus software) that uses a Kalman filter to smooth the values, but check the inclination and remap the orientation as shown earlier.
Gravity/accelerometer + magnetic field sensor. This has terrible noise attached to it; to smooth it, use a moving average or low-pass filter (not to isolate gravity, but as a threshold frequency to prevent large jumps), as in the sketch below.
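A rough sketch of that third approach (ALPHA is an assumed smoothing factor you would tune per device):
private final float[] gravity = new float[3];
private final float[] geomagnetic = new float[3];
private static final float ALPHA = 0.1f; // low-pass smoothing factor

@Override
public void onSensorChanged(SensorEvent event) {
    float[] target;
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) target = gravity;
    else if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) target = geomagnetic;
    else return;
    for (int i = 0; i < 3; i++) {
        // Exponential low-pass filter: suppresses the jitter mentioned above.
        target[i] += ALPHA * (event.values[i] - target[i]);
    }
    float[] R = new float[9];
    if (SensorManager.getRotationMatrix(R, null, gravity, geomagnetic)) {
        float[] orientation = new float[3];
        SensorManager.getOrientation(R, orientation);
        float azimuthDeg = (float) Math.toDegrees(orientation[0]); // heading relative to magnetic north
    }
}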
Regarding the bug when looking above the horizon:
There are some tricky parts when you work with the compass.
The compass orientation depends on the magnetic field and on the phone's orientation, so in order to correct the deviation you need to do some matrix operations.
Check this article, it provides a simple example: https://www.journal.deviantdev.com/android-compass-azimuth-calculating/
Paraphrasing the article: if the device is not held flat (more than about ±45° deviation), you have to use remapCoordinateSystem() in a useful way to get correct results.
Regarding accuracy:
First of all, accuracy relies on the physical components present in the device. Fragmentation is a common problem here: Samsung magnetometers are different from LG's, and the Android system is not going to abstract you from this.
On the other hand, devices may rely on different kinds of sensors: plain hardware sensors or sensor fusion. So you are going to see differences between one device and another.
So this is a mess, but there are some techniques you can apply in order to get accurate, uniform data from the sensors (not only for the compass, probably for GPS too).
Some people discard the first few seconds of sensor data. Some time is needed to calibrate the signal, so the first data retrieved tends to have some deviation. How long you need to discard depends on the manufacturer again, but I'd try 5 seconds or so.
Use interpolators and extrapolators. Many sensors in Android let you retrieve data every so many milliseconds, but that is an abstraction provided by Android. Hardware sensors have their own timing and update the signal when they think it is necessary. The rest of the time, when Android asks the sensor for data, it returns the last value, or perhaps some operation over the last values provided by the manufacturer (again).
So it is useful to have an abstraction layer (interpolator / extrapolator) that receives the data from the Android system every 20, 50, 1000... milliseconds, applies some operation to make it uniform, and then passes the data on to your app.
The operation here could be some kind of average between the current and last values, an accumulated average, or another kind of normalization.
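A rough sketch of such a layer using a simple moving average (the class and method names are made up for illustration):
import java.util.ArrayDeque;
import java.util.Deque;

class SmoothedAxis {
    private final Deque<Float> window = new ArrayDeque<>();
    private final int size;

    SmoothedAxis(int size) {
        this.size = size;
    }

    // Feed each raw sample in; get back the moving average the rest of the app should see.
    float push(float raw) {
        window.addLast(raw);
        if (window.size() > size) {
            window.removeFirst();
        }
        float sum = 0f;
        for (float v : window) {
            sum += v;
        }
        return sum / window.size();
    }
}
One instance per axis; a larger window gives smoother but laggier output.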
OK, I figured it out myself. First you need to calculate the bearing from the compass; then the Maps API v2 camera can be rotated.
public void updateCamera(float bearing) {
    CameraPosition currentPlace = new CameraPosition.Builder()
            .target(new LatLng(centerLatitude, centerLongitude))
            .bearing(bearing).tilt(65.5f).zoom(18f).build();
    googleMap.moveCamera(CameraUpdateFactory.newCameraPosition(currentPlace));
}
Set a SensorEventListener in your code and call this method from the onSensorChanged event. I have added a tilt value so the map will rotate in 3D.
Reference: Android Maps v2 rotate mapView with compass
If you want more details, see http://www.techotopia.com/index.php/Working_with_the_Google_Maps_Android_API_in_Android_Studio
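For reference, a rough sketch of that wiring (using the rotation vector sensor rather than the deprecated orientation sensor; in practice you would throttle or smooth the bearing before moving the camera):
SensorManager sm = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
Sensor rotationVector = sm.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR);
sm.registerListener(new SensorEventListener() {
    private final float[] rotationMatrix = new float[9];
    private final float[] orientation = new float[3];

    @Override
    public void onSensorChanged(SensorEvent event) {
        SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);
        SensorManager.getOrientation(rotationMatrix, orientation);
        float bearing = (float) Math.toDegrees(orientation[0]); // azimuth in degrees
        updateCamera(bearing);
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {}
}, rotationVector, SensorManager.SENSOR_DELAY_UI);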
A compass will give you what you want, but it is very noisy, for two reasons. One reason is that it is picking up real signal: we live in an environment that is magnetically very noisy, and the compass picks up everything that is magnetic. The other reason is that it is not integrated, so it doesn't get the benefit of dropping the high-frequency component. So try to combine your compass with gyroscope data; this video will help a lot with using these sensors.
Some more detail: you can combine accelerometers as well. In summary, gyroscopes provide orientation, accelerometers provide a correction due to gravity, and compasses provide a correction toward magnetic north.
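If it helps, a rough sketch of that combination as a basic complementary filter (GAIN is an assumed value to tune; the compass azimuth would come from the accelerometer/magnetometer path shown earlier):
private static final float NS2S = 1.0f / 1000000000.0f; // event.timestamp is in nanoseconds
private static final float GAIN = 0.02f;                 // how strongly the compass corrects the gyro
private float headingDeg = 0f;
private long lastGyroTimestamp = 0;

// Call with each TYPE_GYROSCOPE event: fast and smooth, but drifts.
void onGyro(SensorEvent event) {
    if (lastGyroTimestamp != 0) {
        float dt = (event.timestamp - lastGyroTimestamp) * NS2S;
        // values[2] is the rate about the device z axis, positive counter-clockwise.
        headingDeg -= (float) Math.toDegrees(event.values[2] * dt);
        headingDeg = ((headingDeg % 360f) + 360f) % 360f;
    }
    lastGyroTimestamp = event.timestamp;
}

// Call with each compass azimuth (degrees): noisy, but does not drift.
void onCompassAzimuth(float compassAzimuthDeg) {
    float error = compassAzimuthDeg - headingDeg;
    error = ((error + 540f) % 360f) - 180f; // take the short way around the circle
    headingDeg += GAIN * error;
}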

Sensor Fusion of Accelerometer and Gyroscope in Unity 3D

I am trying to use the phone as a gun, aiming with the gyroscope. I calibrate with the phone or tablet in a certain orientation; that will shoot straight. Then, depending on the direction the phone is turned (left/right/up/down), the gun shoots in that direction.
I am using the gyroscope. And all this works. Except after shooting for about 30 secs, the gyroscope slowly starts drifting towards left or right. So when I go back to the orientation I calibrated with, it doesn't shoot straight anymore. Does anyone have any experience writing a Complementary or Kalman Filter to fuse gyro and accelerometer data to give better results in Unity 3D?
I've found this online - http://www.x-io.co.uk/open-source-ahrs-with-x-imu/. It seems to do exactly what I want, but I may be using it wrong: I sometimes get better and sometimes worse results with it. Does anybody have any experience with it?
First of all, gyro/accelerometer fusion will stabilize your pitch/roll angles, since gravity indicates which way the ground is. However, you cannot correct the "left/right" drift, because the actual heading is unknown. Proper heading stabilization cannot be achieved with the gyro/accelerometer alone: it requires additional information.
The example you provide (Madgwick’s MARG/IMU filter) is a filter that can integrate magnetometers ("north" reference), but it has two requirements for getting good results:
The magnetometer has been properly calibrated.
There are no magnetic field disturbances. This is generally not true if you are indoors, or if you are moving close to power lines or metallic structures.
An alternative is using a video signal to get optical-flow information, or detecting when the phone is resting in a fixed position to compensate for gyro biases from time to time, as sketched below.
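A rough Java sketch of that last idea (the question is about Unity/C#, but the principle is the same; the stillness check itself is up to you):
private final float[] gyroBias = new float[3];
private final float[] biasSum = new float[3];
private int restSamples = 0;

// "deviceAtRest" would come from your own check, e.g. accelerometer magnitude close to 9.81 m/s^2
// with low variance for a second or so.
void onGyro(SensorEvent event, boolean deviceAtRest) {
    if (deviceAtRest) {
        for (int i = 0; i < 3; i++) biasSum[i] += event.values[i];
        restSamples++;
        if (restSamples >= 200) { // roughly a couple of seconds of samples
            for (int i = 0; i < 3; i++) {
                gyroBias[i] = biasSum[i] / restSamples;
                biasSum[i] = 0f;
            }
            restSamples = 0;
        }
    }
    float correctedX = event.values[0] - gyroBias[0];
    float correctedY = event.values[1] - gyroBias[1];
    float correctedZ = event.values[2] - gyroBias[2];
    // Feed the corrected rates into the integration / fusion step.
}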

Determining which direction phone is rotating with just accelerometer

So, right now I'm grabbing the accelerometer data and converting them to a decently rough estimate of the angle at which the phone is being held. For right now I'm just focused on the yaw axis.
My area of interest is between 0 and 45 degrees on the yaw axis, so I made a limited queue of the past 5 to 10 readings and compared the numbers to determine if it's going up or down, which kind of works, but it is slow and not really as precise or reliable as I'd want it to be.
Is there a way you can kind of just determine which direction your phone is rotating with just the accelerometer and the magnetic field sensor I guess, without keeping a history of past readings, or something like that? I'm really new to sensor manipulation and Android in general. Any help understanding would be great.
It's not clear exactly what you're looking for here, position or velocity. Generally speaking, you don't want to get a position measurement by using integration on the accelerometer data. There's a lot of error associated with that calculation.
If you literally want the "direction your phone is rotating," rather than angular position, you can actually get that directly from the gyroscope sensor, which provides rotational velocities. That would let you get the direction it's rotating from the velocity without storing data. You should be aware that not every phone has a gyroscope sensor, but it does seem like the newer ones do.
If you want the absolute orientation of the phone (position), you can use the Rotation Vector sensor. This is a combined sensor that automatically integrates data from several of the sensors in one go, and provides additional accuracy. From this, you can get roll-pitch-yaw with a single measurement. Basically, you first want to get your data from the Rotation_vector sensor. Then you use the sensor data with getRotationMatrixFromVector. You can use the output from that in getOrientation (see the same page as the previous link), which will spit out roll-pitch-yaw measurements for you. You might need to rotate the axes around a bit to get the angles measured positive in the direction you want.
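If it's useful, a rough sketch of the gyroscope approach mentioned above (the 0.1 rad/s dead-band is just an example threshold):
@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() != Sensor.TYPE_GYROSCOPE) return;
    float yawRate = event.values[2]; // rad/s about the device z axis
    if (yawRate > 0.1f) {
        // rotating counter-clockwise (as seen looking at the screen)
    } else if (yawRate < -0.1f) {
        // rotating clockwise
    }
    // values near zero: treat as not rotating
}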

Translate Gyroscope readings to bearing in degrees

I am trying to find the angle of rotation of a car while it makes a turn, using the gyroscope of an Android device. So imagine a car is travelling on a bearing of 168 degrees and makes a right turn on a road. Now I need to calculate the new heading or bearing angle using just the gyroscope. But the values I receive are in radians/sec. I tried integrating these values over the time period dT, but the results are not even close to the actual angles. I thought the rotations are in reference to the device, so I tried to convert the values into real-world coordinates, but I didn't find a good algorithm for that.
Can someone help me or point to the right resources to solve this issue?
EDIT:
I forgot to mention in the question that I am trying to do this without GPS (for the scenario when GPS fails), and I am trying to avoid sensor fusion: I plan to use only the gyroscope, as I am looking for a solution that could run even outside the Android platform. I am also talking to the OBD to get the actual speed of the vehicle. So I am just trying to collect gyroscope data from any client, process it at the back end, and determine just the turning of the vehicle.
You need the rotation vector; see the description here (API level 9).
It uses sensor fusion to get good-quality information about the phone's orientation relative to the earth and magnetic north.
You can also calculate the derivative of the GPS position to estimate the car's turn direction.
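If you do stay gyro-only as described in the question, one common pitfall is the timestamp units: event.timestamp is in nanoseconds. A rough sketch of the integration, valid only while the phone is mounted flat (screen up) so the device z axis matches the car's yaw axis; the 168° start value is just the example bearing from the question:
private static final float NS2S = 1.0f / 1000000000.0f; // nanoseconds to seconds
private long lastTimestamp = 0;
private float headingDeg = 168f; // assumed initial bearing, e.g. from the last good GPS fix

@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() != Sensor.TYPE_GYROSCOPE) return;
    if (lastTimestamp != 0) {
        float dt = (event.timestamp - lastTimestamp) * NS2S;
        // values[2] is rad/s about the device z axis, positive counter-clockwise,
        // so subtract to keep a compass-style (clockwise-positive) heading.
        headingDeg -= (float) Math.toDegrees(event.values[2] * dt);
        headingDeg = ((headingDeg % 360f) + 360f) % 360f;
    }
    lastTimestamp = event.timestamp;
}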

Determining heading in Inertial Navigation Systems

I have a question regarding inertial navigation with a mobile device.
I am using an Android tablet for development, but I think the question applies to all types of hardware (even with better sensors).
The most basic question when developing an inertial system is how to determine the
direction of the carrier's movement.
Even if we assume that the magnetometer readings are 100% accurate (which they are obviously not!), there is still the question of the device's orientation relative to the user.
Simple example: if the user is walking north but holds the device with its Y axis pointing north-east (a link to a picture of the different axes: http://developer.android.com/reference/android/hardware/SensorEvent.html), then the magnetometer will point towards north-east.
How can we tell which way the user is actually heading?
(The same will be true if we use both magnetometer and Gyro for determining heading)
A possible solution would be to use the accelerometer's Y-axis and X-axis readings, something along the lines of arctan(a_y / a_x).
(for example - if the user holds the device perfectly straight, then the X-Axis will show nothing...)
But since the Accelerometer's readings are not stable, it is not so easy...
Does anyone know of an algorithm that actually works? I am sure this is a well known problem, but I can't seem to find references to solutions...
Thanks in advance!
Ariel
See this answer for an idea: by obtaining the acceleration values in relation to the earth, you can then use the atan2 function to compute the actual direction.
You mention that the user holds the tablet, and I assume fairly stably (unlike a case I am working on, where the user moves the phone constantly). Still, the user may change the orientation of the device, and this may influence the readings you obtain.
Thus, in the event of an orientation change, you should call remapCoordinateSystem() accordingly to fix the readings you obtain.
NOTE: You can also use the getOrientation() method; the first field of the result is the heading (azimuth).
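A rough sketch of the atan2 idea from the answer above (gravityValues, magneticValues and linAcc are assumed to come from TYPE_GRAVITY, TYPE_MAGNETIC_FIELD and TYPE_LINEAR_ACCELERATION listeners):
float[] R = new float[9];
if (SensorManager.getRotationMatrix(R, null, gravityValues, magneticValues)) {
    // Rotate the linear acceleration from device coordinates into world coordinates (x east, y north, z up).
    float east  = R[0] * linAcc[0] + R[1] * linAcc[1] + R[2] * linAcc[2];
    float north = R[3] * linAcc[0] + R[4] * linAcc[1] + R[5] * linAcc[2];
    // Direction of the horizontal acceleration, degrees clockwise from (magnetic) north.
    float headingDeg = (float) Math.toDegrees(Math.atan2(east, north));
    // In practice this needs heavy smoothing, since a walking user accelerates in all directions.
}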
The really right answer is to leave the device sitting there for a while, detect the rotation of the earth, and then compute true north from that. Unfortunately, the gyros in a mobile phone aren't accurate enough for that....
