I am trying to estimate heading from accelerometer, gyro and magnetometer data. I have implemented a complementary filter from here.
What I am doing: I hold the phone in my hand, walk 15 steps in a straight line, and try to estimate the Euler angles as described in the link above. But when I plot the raw data, I observe that the magnetometer data deviates. Here are the images of the raw sensor data.
My question is: how do I estimate the Euler angles so that they indicate I am walking in a straight line?
Were you walking indoors, or somewhere there might be electrical or magnetic fields? Judging by the magnetometer graph, that's what's happening. If so, I'm afraid there's no solution to your problem.
Try running a compass app, or even carrying a toy compass, and then walk the same path. I bet you'll see the compass swing as you walk.
I've been in environments where moving a compass three feet down a table can cause it to swing 180 degrees.
As an aside: you almost never want Euler angles. They have degenerate cases. Try to use transformation matrices or quaternions if you can.
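On Android, for instance, you can avoid Euler angles entirely by reading the fused orientation as a quaternion. A minimal sketch, with an illustrative listener class (the SensorManager calls are real APIs):

```java
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

// Illustrative listener: reads the fused ROTATION_VECTOR sensor and converts
// it to a unit quaternion, sidestepping Euler-angle degeneracies entirely.
public class QuaternionListener implements SensorEventListener {
    private final float[] q = new float[4]; // w, x, y, z

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR) {
            SensorManager.getQuaternionFromVector(q, event.values);
            // q now holds the device orientation; compose and interpolate it
            // freely without worrying about gimbal lock.
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { /* unused */ }
}
```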
Related
I am making a Raspberry Pi robot with an FPV (first-person view) camera mounted on a pan/tilt servo. I want to make it VR-compatible by connecting it to my phone. But my phone doesn't have a gyroscope to detect horizontal movements; it does have a magnetometer and an accelerometer. How can I combine data from the accelerometer and magnetometer to make a virtual gyroscope that can move with my camera? I am a noob at all of this.
You should have a rotation vector sensor that is already fusing the two. You will not get better results than it gives.
Note that this will not be as high quality as a proper gyroscope and will have artifacts if the robot moves.
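Since this phone lacks a gyroscope, the relevant variant on Android is the geomagnetic rotation vector (API level 19+), which fuses exactly the accelerometer and magnetometer. A minimal sketch; the class name and sensor delay are illustrative choices:

```java
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

// Illustrative "virtual gyroscope" built from the fused
// accelerometer + magnetometer sensor (no physical gyro needed).
public class VirtualGyro implements SensorEventListener {

    public VirtualGyro(SensorManager sm) {
        Sensor fused = sm.getDefaultSensor(Sensor.TYPE_GEOMAGNETIC_ROTATION_VECTOR);
        sm.registerListener(this, fused, SensorManager.SENSOR_DELAY_GAME);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        float[] rotationMatrix = new float[9];
        float[] orientation = new float[3]; // azimuth, pitch, roll (radians)
        SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);
        SensorManager.getOrientation(rotationMatrix, orientation);
        // orientation[0] (azimuth) could drive the pan servo and
        // orientation[1] (pitch) the tilt servo.
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}
```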
If you're still interested in how to do this yourself, you can get roll and pitch information from the accelerometer, then get the yaw information from the magnetometer, as in the sketch below. It's best to find a library for 3D maths and do this with quaternions or matrices; this looks like a use case where you'd hit gimbal lock easily, so Euler angles will be problematic.
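A rough sketch of that accelerometer + magnetometer computation, using the standard tilt-compensated compass equations. Note this assumes the common aerospace x-forward/z-down axis convention; Android's device axes differ, so you may need to remap them:

```java
// Tilt-compensated compass: roll/pitch from gravity, yaw from the
// tilt-corrected magnetometer. Inputs are raw accelerometer (ax, ay, az)
// and magnetometer (mx, my, mz) readings in the device frame.
static float[] eulerFromAccelMag(float ax, float ay, float az,
                                 float mx, float my, float mz) {
    // Roll and pitch from the direction of gravity.
    double roll  = Math.atan2(ay, az);
    double pitch = Math.atan2(-ax, Math.hypot(ay, az));

    // Rotate the magnetic vector back onto the horizontal plane...
    double bx = mx * Math.cos(pitch)
              + my * Math.sin(pitch) * Math.sin(roll)
              + mz * Math.sin(pitch) * Math.cos(roll);
    double by = my * Math.cos(roll) - mz * Math.sin(roll);

    // ...then heading is the angle of the horizontal field component.
    double yaw = Math.atan2(-by, bx);

    return new float[] { (float) roll, (float) pitch, (float) yaw };
}
```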
I guess you want to use this for VR? Don't try to move the servos to compensate for head movement directly; you'll only build a motion-sickness generator. Look at how timewarp works: you move the servos in the general direction the person is looking and render the video reprojected onto a sphere. That way you get almost zero lag.
I am trying to use the phone for gun aiming with the gyroscope. I calibrate with the phone or tablet in a certain orientation; this shoots straight. Then, depending on the direction the phone is turned (left/right/up/down), the gun shoots in that direction.
I am using the gyroscope, and all of this works, except that after shooting for about 30 seconds, the gyroscope slowly starts drifting left or right, so when I return to the orientation I calibrated with, it no longer shoots straight. Does anyone have experience writing a complementary or Kalman filter to fuse gyro and accelerometer data for better results in Unity 3D?
I've found this online: http://www.x-io.co.uk/open-source-ahrs-with-x-imu/. It seems to do exactly what I want, but I must be using it wrong: I sometimes get better and sometimes worse results with it. Does anybody have experience with it?
First of all, gyro/accelerometer fusion will stabilize your pitch/roll angles, since gravity indicates which direction is down. However, it cannot correct left/right drift, because the actual heading is unknown: proper heading stabilization cannot be achieved with a gyro and accelerometer alone; it requires additional information.
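For reference, here is a minimal complementary-filter sketch of that pitch/roll stabilization (plain Java rather than Unity C#; ALPHA and the method shape are illustrative, not from any particular library):

```java
// pitchRate/rollRate are gyro readings in rad/s (which physical gyro axis
// they come from depends on your device's axis convention); ax, ay, az are
// accelerometer readings; dt is the time step in seconds. ALPHA closer to 1
// trusts the gyro more.
static final float ALPHA = 0.98f;
float pitch = 0f, roll = 0f;

void update(float pitchRate, float rollRate,
            float ax, float ay, float az, float dt) {
    // Accelerometer-only estimates: gravity tells us which way is down.
    float accelPitch = (float) Math.atan2(-ax, Math.hypot(ay, az));
    float accelRoll  = (float) Math.atan2(ay, az);

    // Blend: integrate the gyro for short-term accuracy and pull slowly
    // toward the accelerometer estimate to cancel long-term drift.
    pitch = ALPHA * (pitch + pitchRate * dt) + (1 - ALPHA) * accelPitch;
    roll  = ALPHA * (roll  + rollRate  * dt) + (1 - ALPHA) * accelRoll;

    // Yaw cannot be corrected this way: gravity carries no heading
    // information, which is exactly the left/right drift described above.
}
```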
The example you provide (Madgwick's MARG/IMU filter) is a filter that can integrate magnetometer data (a "north" reference), but it has two requirements for good results:
The magnetometer has been properly calibrated (see the crude sketch after this list).
There are no magnetic field disturbances. This is generally not true if you are indoors, or if you are moving close to power lines or metallic structures.
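On the calibration point, a deliberately crude hard-iron sketch (illustrative only; real calibration also handles soft-iron distortion and per-axis scale): rotate the device through all orientations while sampling, then subtract the midpoint of the per-axis min/max.

```java
// Per-axis extremes observed while the user rotates the device around.
private float minX = Float.MAX_VALUE, maxX = -Float.MAX_VALUE;
private float minY = Float.MAX_VALUE, maxY = -Float.MAX_VALUE;
private float minZ = Float.MAX_VALUE, maxZ = -Float.MAX_VALUE;

void sample(float mx, float my, float mz) {
    minX = Math.min(minX, mx); maxX = Math.max(maxX, mx);
    minY = Math.min(minY, my); maxY = Math.max(maxY, my);
    minZ = Math.min(minZ, mz); maxZ = Math.max(maxZ, mz);
}

float[] removeHardIron(float mx, float my, float mz) {
    // Subtract the center of the min/max box -- the hard-iron offset estimate.
    return new float[] {
        mx - (minX + maxX) / 2f,
        my - (minY + maxY) / 2f,
        mz - (minZ + maxZ) / 2f,
    };
}
```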
An alternative is to use a video signal to obtain optical-flow information, or to detect when the phone is resting in a fixed position and compensate the gyro biases from time to time.
So right now I'm grabbing the accelerometer data and converting it to a fairly rough estimate of the angle at which the phone is being held. For now I'm focused only on the yaw axis.
My area of interest is between 0 and 45 degrees on the yaw axis, so I made a bounded queue of the past 5 to 10 readings and compared the numbers to determine whether the angle is going up or down. That kind of works, but it's slow and not as precise or reliable as I'd like.
Is there a way to determine which direction the phone is rotating, using just the accelerometer and the magnetic field sensor, without keeping a history of past readings? I'm really new to sensor handling and to Android in general. Any help understanding this would be great.
It's not clear exactly what you're looking for here: position or velocity. Generally speaking, you don't want to obtain a position measurement by integrating the accelerometer data; there's a lot of error associated with that calculation.
If you literally want the "direction your phone is rotating", rather than the angular position, you can get that directly from the gyroscope sensor, which reports rotational velocities. That lets you read the direction of rotation straight from the sign of the velocity, without storing any history. Be aware that not every phone has a gyroscope, though newer ones generally do.
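A minimal sketch of that idea, as a fragment inside a SensorEventListener (the THRESHOLD dead-band is a made-up tuning constant, not an Android API value):

```java
// Dead-band (rad/s) to ignore sensor noise -- illustrative value, tune it.
private static final float THRESHOLD = 0.05f;

@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_GYROSCOPE) {
        // Angular velocity around the axis pointing out of the screen.
        float zRate = event.values[2];
        if (zRate > THRESHOLD) {
            // rotating counter-clockwise (as seen facing the screen)
        } else if (zRate < -THRESHOLD) {
            // rotating clockwise
        }
        // No history needed: the sign of the rate is the direction.
    }
}
```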
If you want the absolute orientation of the phone (its angular position), you can use the Rotation Vector sensor. This is a composite sensor that automatically fuses data from several physical sensors in one go and provides additional accuracy. From it, you can get roll, pitch, and yaw with a single measurement. Basically, you first read the data from the ROTATION_VECTOR sensor, then pass it to getRotationMatrixFromVector, and feed the result to getOrientation (see the same page as the previous link), which will spit out roll-pitch-yaw measurements for you. You might need to remap the axes a bit to get the angles measured positive in the direction you want.
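Put together, the pipeline looks roughly like this (a fragment inside a SensorEventListener, using the real SensorManager calls named above):

```java
@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR) {
        float[] rotationMatrix = new float[9];
        float[] angles = new float[3];
        SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);
        SensorManager.getOrientation(rotationMatrix, angles);
        // angles[0] = azimuth (yaw), angles[1] = pitch, angles[2] = roll,
        // all in radians. Use SensorManager.remapCoordinateSystem() first
        // if your desired axes differ from the device's natural ones.
    }
}
```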
I am trying to find the angle of rotation of a car while it makes a turn, using the gyroscope of an Android device. Imagine a car travelling on a bearing of 168 degrees that makes a right turn onto another road; I need to calculate the new heading (bearing) using just the gyroscope. But the values I receive are in radians/sec. I tried integrating them over the time period dT, but the results are not even close to the actual angles. I thought the rotations are relative to the device, so I tried converting the values to real-world coordinates, but I couldn't find a good algorithm for that.
Can someone help me or point to the right resources to solve this issue?
EDIT:
I forgot to mention in the question: I am trying to do this without GPS (for the scenario where GPS fails), and I want to avoid sensor fusion, since I plan to use only the gyroscope and am looking for a solution that could run even outside the Android platform. I am also talking to the OBD to get the actual speed of the vehicle. So I am just trying to collect gyroscope data from any client, process it on the back end, and determine the turning of the vehicle.
You need the rotation vector sensor; see the description here (API level 9).
It uses something called sensor fusion to get good-quality information about the phone's orientation relative to the Earth and magnetic north.
You can also take the derivative of the GPS position to estimate the car's turn direction.
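For the turn-angle use case, an illustrative sketch of the rotation-vector suggestion: sample the azimuth over time and accumulate the wrapped heading differences (field names here are made up):

```java
private float lastAzimuthDeg = Float.NaN;
private float totalTurnDeg = 0f;

@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() != Sensor.TYPE_ROTATION_VECTOR) return;

    float[] r = new float[9];
    float[] o = new float[3];
    SensorManager.getRotationMatrixFromVector(r, event.values);
    SensorManager.getOrientation(r, o);
    float azimuthDeg = (float) Math.toDegrees(o[0]); // heading vs. magnetic north

    if (!Float.isNaN(lastAzimuthDeg)) {
        // Wrap the difference into (-180, 180] so turns across north work.
        float d = azimuthDeg - lastAzimuthDeg;
        if (d > 180f) d -= 360f;
        if (d <= -180f) d += 360f;
        totalTurnDeg += d; // accumulated turn angle; reset between maneuvers
    }
    lastAzimuthDeg = azimuthDeg;
}
```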
I am writing a program that uses the Android sensors, and I am confused by the readings of the magnetometer.
The magnetometer reports the magnetic field strength along the three axes of the phone, and I observe that at the same location, if the phone's heading changes, the readings change dramatically.
In my understanding, however, the Earth's magnetic field at a specific location should be relatively stable, regardless of how the phone is held.
So my question is: is there any way to transform the raw readings from the 3-axis magnetometer into the world coordinate system? Accelerometer and orientation data are also available on mobile phones. If so, I'd expect the transformed field to stay the same even when the phone's heading changes.
I have looked at the Android source code, specifically the getOrientation() and getRotationMatrix() functions, hoping to learn from their implementations, but I did not understand them very well. Could someone explain the principle behind these functions?
Link to the code of the functions: http://www.netmite.com/android/mydroid/cupcake/frameworks/base/core/java/android/hardware/SensorManager.java
Thanks! I am really anxious for a solution to this question.
This is impossible, since the device does not know its orientation in world space.
Of course, the orientation can be estimated from the sensor input, and that is what getOrientation() and getRotationMatrix() do. However, over long timescales only the measurements of acceleration (gravity) and of the magnetic field provide the necessary information. Gyroscope data can be used to refine the estimate over shorter periods, but getOrientation() is not guaranteed to use it, and that sensor may not even exist on the particular device.
This means back-transforming with getOrientation() would use the exact same data you want to correct, rendering it useless.
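To make the circularity concrete, here is a sketch of what the back-transformation would look like with the real SensorManager APIs (variable names are illustrative):

```java
// gravity and geomagnetic are the most recent accelerometer and magnetometer
// readings in device coordinates (float[3] each).
static float[] magnetometerInWorldFrame(float[] gravity, float[] geomagnetic) {
    float[] R = new float[9];
    float[] I = new float[9];
    if (!SensorManager.getRotationMatrix(R, I, gravity, geomagnetic)) {
        return null; // free fall or otherwise invalid data
    }
    float[] m = geomagnetic;
    // R maps device coordinates to world coordinates (x east, y north, z up).
    // By construction the east component of the result is ~zero: R was derived
    // from the very reading being transformed, so the output always "points
    // north" and field disturbances cannot be detected this way.
    return new float[] {
        R[0] * m[0] + R[1] * m[1] + R[2] * m[2],
        R[3] * m[0] + R[4] * m[1] + R[5] * m[2],
        R[6] * m[0] + R[7] * m[1] + R[8] * m[2],
    };
}
```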