I am working on a project that includes an Android application used for controlling/steering.
Speed: when you tilt the phone forwards/backwards (pitch), it simulates giving gas and braking.
Direction: When you tilt the phone left/right (roll) it simulates steering to the left and right.
I have already written some code which seemed to work fine, but when I took a closer look I found that some values were acting weird.
When I tilt the phone forwards/backwards to control the speed, it works perfectly: I get the expected speed and direction values. But when I tilt the phone to the left/right to control the direction, it seems to corrupt some values. Tilting to the left/right doesn't only change the direction value (roll), it also affects the speed value (pitch).
For extra information:
Programming for Android 2.2
Device is a Google Nexus One
Holding the device in portrait
The most relevant code I use to read the sensor values is as follows:
public void onSensorChanged(SensorEvent sensorEvent)
{
    synchronized (this)
    {
        if (sensorEvent.sensor.getType() == Sensor.TYPE_ORIENTATION)
        {
            float azimuth = sensorEvent.values[0]; // azimuth: rotation around the z-axis
            float pitch = sensorEvent.values[1];   // pitch: rotation around the x-axis
            float roll = sensorEvent.values[2];    // roll: rotation around the y-axis

            System.out.println("pitch: " + pitch);
            System.out.println("roll: " + roll);
            System.out.println("--------------------");

            // Convert the sensor values to the actual speed and direction values
            float speed = (pitch * 2.222f) + 200;
            float direction = roll * -2.222f;
        }
    }
}
So when I run the code and look at the printed values, tilting the device left/right seems to affect the pitch value as well. How come? And how can I get the pure pitch value while "roll"-ing, so that tilting the phone to the left/right doesn't affect/corrupt the pitch value?
You could read up on Gimbal lock. That's bitten me before.
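One thing you could try is computing the angles yourself from the rotation matrix instead of reading the deprecated TYPE_ORIENTATION sensor. This is only a minimal sketch: it assumes you also register listeners for TYPE_ACCELEROMETER and TYPE_MAGNETIC_FIELD and keep their latest readings in gravity and geomagnetic arrays (those names are placeholders), and since it still produces Euler angles, the coupling near steep pitch does not fully disappear:

    // Sketch: derive pitch/roll from the rotation matrix instead of the
    // deprecated TYPE_ORIENTATION sensor. Assumes `gravity` and `geomagnetic`
    // hold the latest TYPE_ACCELEROMETER and TYPE_MAGNETIC_FIELD readings.
    float[] R = new float[9];
    float[] I = new float[9];
    if (SensorManager.getRotationMatrix(R, I, gravity, geomagnetic)) {
        float[] orientation = new float[3];
        SensorManager.getOrientation(R, orientation); // values are in radians
        float pitch = (float) Math.toDegrees(orientation[1]); // rotation around x
        float roll  = (float) Math.toDegrees(orientation[2]); // rotation around y
    }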
I am currently implementing a speedometer by receiving orientation data from my phone. I am using:
SensorManager.getRotationMatrix(R, I, gravity, geomagnetic);

float[] orientation = new float[3];
SensorManager.getOrientation(R, orientation);

float azimuth = orientation[0];
double azimuthD = Math.toDegrees(azimuth);
if (azimuthD < 0) azimuthD = 360 + azimuthD;
With this I am able to receive the rotation data from my phone, such as the azimuth.
Anyway, this works fine while the device is placed on a table or similar. But when it rotates around a certain point (in my case the device is fixed on a wheel and rotating at a certain speed) the values are far from accurate. I believe that, since I am using the gravity and geomagnetic sensors, there could be a conflict with the forces that influence these sensors while rotating: as the wheel turns, the rotation changes relative to a point, but the local device rotation stays the same.
How can I access the orientation of the device while it's turning without running into a lot of noisy data?
I read a bit about the Sensor.TYPE_ROTATION_VECTOR property, but couldn't quite figure out how it works. I also read about the possibility to remap the coordinate system, but how is that supposed to help? My phone is never exactly vertical to the floor, more like at an angle of 5°-10°.
I would appreciate any help.
Cheers,
viehlieb
I guess I found my answer.
The solution was to throw away all the code I posted above and use the gyroscope, obviously.
The gyroscope values measure the angular velocity of the device's rotation. The coordinate system used is the device's own coordinate system; in my case the relevant value was the rotation around the z-axis.
Values are in radians per second, which can be mapped to m/s by converting to revolutions per second and multiplying by the wheel's circumference. So the trick was in the onSensorChanged method:
if (sensorEvent.sensor.getType() == Sensor.TYPE_GYROSCOPE) {
    gyroscope = sensorEvent.values;
    double rotZ = gyroscope[2];            // angular velocity around z, in rad/s
    double degrees = Math.toDegrees(rotZ); // deg/s

    // deg/s -> revolutions/s -> m/s (circumference = 2.23 m) -> km/h
    float speed = (float) (degrees / 360.0 * 2.23 * 3.6);
}
If you'd like more accurate values, you could store them in an array and calculate the average. Remember to clear the array every 20th time (or so) that onSensorChanged is called. With SENSOR_DELAY_GAME registered there is sufficient data over which to build the average.
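A minimal sketch of that averaging (a ring buffer that overwrites the oldest of 20 samples has the same effect as clearing the array; the names are illustrative):

    // Illustrative rolling average over the last 20 speed samples.
    // Older samples are overwritten in place, so the array never needs
    // an explicit clear.
    private final float[] samples = new float[20];
    private int sampleCount = 0;

    private float averageSpeed(float newSample) {
        samples[sampleCount % samples.length] = newSample;
        sampleCount++;
        int n = Math.min(sampleCount, samples.length);
        float sum = 0f;
        for (int i = 0; i < n; i++) {
            sum += samples[i];
        }
        return sum / n;
    }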
I'm trying to get the device acceleration in Unity to move an object:
public class ExampleClass : MonoBehaviour {
    public float speed = 10.0F;

    void Update() {
        Vector3 dir = Vector3.zero;

        // we assume that the device is held parallel to the ground
        // and the Home button is in the right hand

        // remap device acceleration axis to game coordinates:
        //  1) XY plane of the device is mapped onto XZ plane
        //  2) rotated 90 degrees around Y axis
        dir.x = -Input.acceleration.y;
        dir.z = Input.acceleration.x;

        // clamp acceleration vector to unit sphere
        if (dir.sqrMagnitude > 1)
            dir.Normalize();

        // Make it move 10 meters per second instead of 10 meters per frame...
        dir *= Time.deltaTime;

        // Move object
        transform.Translate(dir * speed);
    }
}
But when I run the game on my device, the object moves and stops depending on the orientation of the device, not its acceleration.
I also tried to print the Input.acceleration readings:
GUI.Button(new Rect(10, 10, 150, 80), Input.acceleration.x + "\n" + Input.acceleration.y + "\n" + Input.acceleration.z);
and I noticed that the three values change only when I rotate the device, and they always stay between -1 and 1.
I know that the accelerometer is used for measuring acceleration, not rotation, and that the sensor that measures rotation is the gyroscope.
Why is this happening? How can I read acceleration instead of rotation.
Most devices have a gyroscope these days, so try Input.gyro.userAcceleration.
Note that on most Android devices the gyroscope is turned off by default and you need to set Input.gyro.enabled to true.
I want to detect whether the user has taken a turn on the road while driving, using the sensors on an Android phone. How do I code this? I am collecting live data from all the sensors (accelerometer, location, rotation, geomagnetic) and storing it on the SD card. So now I just want to know whether the user has taken a turn and in which direction he has turned.
I assume the registration of the sensor is done properly. You can detect the direction by using the orientation sensor (deprecated) as follows:
@Override
public void onSensorChanged(SensorEvent event) {
    float azimuth_angle = event.values[0];
    int precision = 2;

    if (prevAzimuth - azimuth_angle < precision * -1)
        Log.v("->", "RIGHT");
    else if (prevAzimuth - azimuth_angle > precision)
        Log.v("<-", "LEFT");

    prevAzimuth = azimuth_angle;
}
Note: the variable prevAzimuth is declared as a global. You can change the precision value to whatever you want. We need this value because we do not want output after each trivial change in the azimuth angle; however, too large a precision gives imprecise results. To me, 2 is optimal.
If you are tracking location coordinates, you can also track the shift in heading between consecutive locations:
angle = arctan((Y2 - Y1) / (X2 - X1)) * 180 / PI
See this answer for calculating x and y.
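A sketch of that formula in Java, where x1/y1/x2/y2 are assumed to be the planar coordinates you derived for the previous and current location; Math.atan2 is used instead of a plain arctan so the X2 == X1 case and the quadrant are handled correctly:

    // Heading between two planar positions, in degrees.
    // atan2 handles x2 == x1 and picks the correct quadrant,
    // unlike a plain arctan of the slope.
    double angle = Math.toDegrees(Math.atan2(y2 - y1, x2 - x1));
    if (angle < 0) angle += 360.0; // normalize to [0, 360)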
The decision to use sensor values is based on the unrealistic assumption that the device is never rotated with respect to the vehicle.
Recently I have done some research on using both the accelerometer and the gyroscope to track a smartphone without the help of GPS (see this post):
Indoor Positioning System based on Gyroscope and Accelerometer
For that purpose I need the orientation (pitch, roll, etc.), so here is what I have done so far:
public void onSensorChanged(SensorEvent arg0) {
    if (arg0.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
        accel[0] = arg0.values[0];
        accel[1] = arg0.values[1];
        accel[2] = arg0.values[2];

        // pitch from gravity: angle between the y-axis and the horizontal plane
        pitch = Math.toDegrees(Math.atan2(accel[1],
                Math.sqrt(Math.pow(accel[2], 2) + Math.pow(accel[0], 2))));
        tv2.setText("Pitch: " + pitch + "\n" + "Roll: " + roll);
    } else if (arg0.sensor.getType() == Sensor.TYPE_GYROSCOPE) {
        if (timestamp != 0) {
            final float dT = (arg0.timestamp - timestamp) * NS2S; // ns -> s

            angle[0] += arg0.values[0] * dT; // integrate gyro x
            filtered_angle[0] = 0.98f * (filtered_angle[0] + arg0.values[0] * dT)
                    + 0.02f * (float) pitch;
        }
        timestamp = arg0.timestamp;
    }
}
Here I'm trying to get the angle (just for testing) from my accelerometer (pitch) and from integrating the gyroscope x-axis over time, combining the two with a complementary filter:
filtered_angle[0] = (0.98f) * (filtered_angle[0] + gyro_x * dT) + (0.02f) * (pitch)
with dT being more or less 0.009 seconds.
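The same filter written as a small helper, to make the two terms explicit (the names are illustrative; note that gyroRate * dT and accelPitch must be expressed in the same unit, since the gyroscope delivers rad/s):

    // Complementary filter: the gyro term tracks fast changes, the
    // accelerometer term pulls the estimate back to a drift-free reference.
    static float complementary(float previous, float gyroRate, float dT,
                               float accelPitch, float weight) {
        return weight * (previous + gyroRate * dT) + (1f - weight) * accelPitch;
    }

    // usage, matching the code above:
    // filtered_angle[0] = complementary(filtered_angle[0], arg0.values[0], dT,
    //                                   (float) pitch, 0.98f);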
But I don't know why my angles are not really accurate... When the device is positioned flat on the table (screen facing up):
Pitch (angle from accel) = 1.5 (average)
Integrated gyro = 0 and growing (normal, it's drifting)
Filtered gyro angle = 1.2
and when I lift the phone by 90° (so the screen is facing the wall in front of me):
Pitch (angle from accel) = 86 (maximum)
Integrated gyro = it's way off, OK, that's normal
Filtered gyro angle = 83 (maximum)
So the angles never reach 90, even if I try to lift the phone a bit more...
Why doesn't it go all the way to 90°? Are my calculations wrong, or is the quality of the sensor just bad?
Another thing I'm wondering about: with Android I don't "read out" the sensor values, I'm notified when they change. The problem is that, as you see in the code, the accelerometer and the gyroscope share the same method... so when I compute the filtered angle, I'm using an accelerometer pitch measured 0.009 seconds earlier, no? Could that be the source of my problem?
Thank you!
I can only repeat myself.
You get position by integrating the linear acceleration twice, but the error grows horribly; it is useless in practice. In other words, you are trying to solve the impossible (a rough error illustration follows the links below).
What you actually can do is to track just the orientation.
Roll, pitch and yaw are evil; do not use them. Check the video I already recommended, at 38:25.
Here is an excellent tutorial on how to track orientation with gyros and accelerometers.
Similar questions that you might find helpful:
track small movements of iphone with no GPS
What is the real world accuracy of phone accelerometers when used for positioning?
how to calculate phone's movement in the vertical direction from rest?
iOS: Movement Precision in 3D Space
How to use Accelerometer to measure distance for Android Application Development
Distance moved by Accelerometer
How can I find distance traveled with a gyroscope and accelerometer?
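As a rough illustration of why the double integration is hopeless: a constant accelerometer bias b alone produces a position error of 0.5 * b * t^2, so even a tiny bias swamps the estimate within a minute (the bias value below is made up for illustration):

    // Position error from double-integrating a constant accelerometer bias:
    // error(t) = 0.5 * bias * t^2
    double bias = 0.05; // m/s^2, an illustrative residual bias
    for (int t = 10; t <= 60; t += 10) {
        double error = 0.5 * bias * t * t;
        System.out.printf("after %2d s: %5.1f m%n", t, error);
    }
    // after 60 s the error is already 90 m, useless for indoor positioning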
I wrote a tutorial on the use of the Complementary Filter for orientation tracking with a gyroscope and accelerometer: http://www.pieter-jan.com/node/11. Maybe it can help you.
I tested your code and found that the scale factor is probably not consistent.
Converting the pitch to 0-π (radians) gives a better result, since the gyro rate is in rad/s.
In my test, the filtered result is ~90 degrees.
pitch = (float) Math.toDegrees(Math.atan2(accel[1],
        Math.sqrt(Math.pow(accel[2], 2) + Math.pow(accel[0], 2))));
pitch = pitch * (float) Math.PI / 180.0f; // back to radians, matching the gyro's rad/s

filtered_angle = weight * (filtered_angle + event.values[0] * dT)
        + (1.0f - weight) * pitch;
I tried it, and this will give you an angle of 90...
filtered_angle = (filtered_angle / 83) * 90;
Currently I'm trying to rotate a 3D cube using orientation sensor values obtained via the getRotation() method. Some unexpected behavior occurs when the device is rotated beyond certain bounds. For instance, if I make the device "stand up", the "roll" value just goes crazy.
I'm also experiencing a phenomenon similar to the so-called gimbal lock. The only difference is that I'm experiencing the problem even before applying the sensor values to the 3D rotation. When I try to change the "pitch" value by rotating the device around only the pitch axis, the "yaw" value also changes along with the pitch rotation. It seems completely unreasonable to me.
Could somebody help me? I've been stuck on this problem for a month.
This is a common problem with yaw, pitch and roll. You cannot get rid of it as long as you are using yaw, pitch and roll (Euler angles). This video explains why.
I use rotation matrices instead of Euler angles in my motion sensing application. For an introduction to rotation matrices I recommend:
Direction Cosine Matrix IMU: Theory
Rotation matrices work like a charm.
Quaternions are also very popular and said to be the most stable.
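For Android, a minimal sketch of the rotation-matrix route (using the fused TYPE_ROTATION_VECTOR sensor; the matrix rotates device-frame vectors into world coordinates, and no Euler angles are extracted at any point):

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR) {
            float[] R = new float[9];
            SensorManager.getRotationMatrixFromVector(R, event.values);
            // Rotate the device's z axis into world coordinates with a
            // plain matrix-vector product; use the matrix (or a quaternion)
            // directly rather than extracting yaw/pitch/roll from it.
            float[] v = {0f, 0f, 1f};
            float[] world = new float[3];
            for (int row = 0; row < 3; row++) {
                world[row] = R[3 * row] * v[0] + R[3 * row + 1] * v[1]
                        + R[3 * row + 2] * v[2];
            }
        }
    }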
[This answer was copied from here.]
Using quaternions to compute yaw/pitch/roll won't do much to solve the problem: gimbal lock near a pitch of +/-90° can still drive yaw and roll (actually yaw-roll at the north pole) crazy under slight changes/noise in the underlying quaternion.
However, if you use the yaw, pitch and roll values to perform a rotation of a 3D object, it shouldn't exhibit any odd behavior near the gimbal-lock position. It's just that an ambiguity between yaw and roll arises, and large variations in yaw and roll do not imply that the actual orientation is going crazy, only that the orientation is insensitive to large yaw-roll changes near a pitch of 90°.
BUT, also note that phones and HTML5 browsers do not properly implement yaw, pitch and roll per the Android conventions. Here is a good blog for reference:
http://www.sensorplatforms.com/understanding-orientation-conventions-mobile-platforms/
Here is a basic example; it will return the gravity vector. Note that you can change the sensor type and the sampling speed; more details here:
SensorManager sensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
Sensor sensor = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);

sensorManager.registerListener(new SensorEventListener() {
    @Override
    public void onSensorChanged(SensorEvent event) {
        float x = event.values[0];
        float y = event.values[1];
        float z = event.values[2];
        double total = Math.sqrt(x * x + y * y + z * z); // magnitude, ~9.81 at rest
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
    }
}, sensor, SensorManager.SENSOR_DELAY_FASTEST);
Well, if you are running on a phone:
Quaternions are the best, and you should use them.
With rotation matrices and Euler angles you can easily run into the so-called gimbal lock. It happens frequently with violent user motion.
Gimbal lock is the loss of one degree of freedom in a three-dimensional, three-gimbal mechanism that occurs when the axes of two of the three gimbals are driven into a parallel configuration, "locking" the system into rotation in a degenerate two-dimensional space.
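A quick numerical illustration of that degeneracy, assuming the common ZYX (yaw-pitch-roll) convention: at a pitch of 90° the rotation matrix depends only on the difference between roll and yaw, so two different yaw/roll pairs with the same difference give the exact same orientation:

    // At pitch = 90 deg (ZYX convention) the rotation matrix collapses to a
    // function of (roll - yaw) alone: the pairs (yaw=10, roll=20) and
    // (yaw=30, roll=40) below produce identical matrices.
    public class GimbalLockDemo {
        static double[][] mul(double[][] a, double[][] b) {
            double[][] c = new double[3][3];
            for (int i = 0; i < 3; i++)
                for (int j = 0; j < 3; j++)
                    for (int k = 0; k < 3; k++)
                        c[i][j] += a[i][k] * b[k][j];
            return c;
        }

        // R = Rz(yaw) * Ry(pitch) * Rx(roll), angles in radians
        static double[][] euler(double yaw, double pitch, double roll) {
            double cy = Math.cos(yaw),   sy = Math.sin(yaw);
            double cp = Math.cos(pitch), sp = Math.sin(pitch);
            double cr = Math.cos(roll),  sr = Math.sin(roll);
            double[][] rz = {{cy, -sy, 0}, {sy, cy, 0}, {0, 0, 1}};
            double[][] ry = {{cp, 0, sp}, {0, 1, 0}, {-sp, 0, cp}};
            double[][] rx = {{1, 0, 0}, {0, cr, -sr}, {0, sr, cr}};
            return mul(rz, mul(ry, rx));
        }

        public static void main(String[] args) {
            double p = Math.toRadians(90);
            double[][] a = euler(Math.toRadians(10), p, Math.toRadians(20));
            double[][] b = euler(Math.toRadians(30), p, Math.toRadians(40));
            // Both matrices print identically: roll - yaw = 10 deg in both.
            for (int i = 0; i < 3; i++)
                System.out.println(java.util.Arrays.toString(a[i]) + "  "
                        + java.util.Arrays.toString(b[i]));
        }
    }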
Rotation matrices and Euler angles are good for slow-moving robot motion.
For details on quaternion concatenation and converting a point to a new coordinate system, you can refer to the wiki link:
https://en.wikipedia.org/wiki/Quaternion
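For a concrete starting point, rotating a point p by a unit quaternion q is p' = q p q*; a bare-bones sketch using the expanded form (a w, x, y, z component order is assumed):

    // Rotate point (px, py, pz) by the unit quaternion (w, x, y, z): p' = q p q*.
    // Uses the expanded form to avoid building a full quaternion class.
    static float[] rotate(float w, float x, float y, float z,
                          float px, float py, float pz) {
        // t = 2 * cross(q.xyz, p)
        float tx = 2f * (y * pz - z * py);
        float ty = 2f * (z * px - x * pz);
        float tz = 2f * (x * py - y * px);
        // p' = p + w * t + cross(q.xyz, t)
        return new float[] {
            px + w * tx + (y * tz - z * ty),
            py + w * ty + (z * tx - x * tz),
            pz + w * tz + (x * ty - y * tx),
        };
    }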