Hi, I am creating an application in which the user holds the phone upright and then rotates it around the y-axis (similar to taking a panorama).
I need to detect the angle of rotation. On iOS this was fairly simple with the gyroscope sensor, but I am not having the same luck with Android. If anyone could point me in the right direction, that would be great.
Assuming your y-axis points to the center of the earth, the value you are looking for is called the azimuth.
To monitor its change you will need to register a listener for TYPE_ACCELEROMETER and TYPE_MAGNETIC_FIELD events:
mngr = (SensorManager)getSystemService(Context.SENSOR_SERVICE);
accelerometer = mngr.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
magneticField = mngr.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD);
int rate = SensorManager.SENSOR_DELAY_GAME; // or other
mngr.registerListener(sensorListener, accelerometer, rate);
mngr.registerListener(sensorListener, magneticField, rate);
And within the listener, first build the rotation matrix from the latest accelerometer and magnetometer readings, then extract the orientation:
float[] R = new float[9];
// accelValues / magValues hold the latest readings from the two sensors (see the listener sketch below)
SensorManager.getRotationMatrix(R, null, accelValues, magValues);
float[] values = new float[3];
SensorManager.getOrientation(R, values);
float currentAzimuth = values[0]; // azimuth, in radians <----------
Note that the quality and latency of the data you will obtain are highly hardware dependent.
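For completeness, here is a minimal sketch of a listener that keeps the latest readings from both sensors and feeds them into the calls above (field names such as accelValues, magValues and sensorListener are my own, not from the original answer):
private final float[] accelValues = new float[3];
private final float[] magValues = new float[3];

private final SensorEventListener sensorListener = new SensorEventListener() {
    @Override
    public void onSensorChanged(SensorEvent event) {
        // Keep a copy of the most recent reading from each sensor.
        if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
            System.arraycopy(event.values, 0, accelValues, 0, 3);
        } else if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) {
            System.arraycopy(event.values, 0, magValues, 0, 3);
        }
        float[] R = new float[9];
        float[] orientation = new float[3];
        // getRotationMatrix() returns false while the readings are not yet usable.
        if (SensorManager.getRotationMatrix(R, null, accelValues, magValues)) {
            SensorManager.getOrientation(R, orientation);
            float azimuth = orientation[0]; // radians; 0 = magnetic north
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
};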
There are various sensors available that can be managed through a SensorManager. Of course, since every manufacturer decides whether or not to put a particular sensor on the hardware platform of each model, you have to check whether one exists. Some devices have a gyroscope like iOS devices do; on others the same job can be done with the accelerometer and magnetometer sensors instead.
You can get started here: http://developer.android.com/guide/topics/sensors/sensors_overview.html
Related
I am currently implementing a speedometer by receiving orientation data from my phone. I am using
SensorManager.getRotationMatrix(R, I, gravity, geomagnetic);
float orientation[] = new float[3];
SensorManager.getOrientation(R, orientation);
float azimuth = orientation[0];
double azimuthD = Math.toDegrees(azimuth);
if(azimuthD < 0) azimuthD = 360 + azimuthD;
With this I am able to receive rotation data from my phone, such as the azimuth, etc.
Anyway, this works fine while the device is lying on a table or similar. But when it rotates around a fixed point (in my case the device is mounted on a wheel rotating at a certain speed) the values are far from accurate. I believe that, since I am using the gravity and geomagnetic sensors, the forces acting during rotation may interfere with these sensors. As the wheel turns, the rotation changes relative to a point, but the local device orientation stays the same.
How can I access the orientation of the device while it's turning without running into a lot of noisy data?
I read a bit about the `Sensor.TYPE_ROTATION_VECTOR` property, but couldn't quite figure out how it works. I also read about the possibility of remapping the coordinate system, but how is that supposed to help, given that my phone is never exactly vertical to the floor, but rather at an angle of 5°-10°?
I would appreciate any help.
Cheers,
viehlieb
I guess I found my answer.
The solution was to throw away all the code I posted above and use the gyroscope instead, obviously.
The gyroscope values measure the angular velocity of the device's rotation. The coordinate system used is the device's own coordinate system; in my case the relevant value was the rotation around the z-axis.
Values are in radians per second, which can be converted to m/s by factoring in the wheel's circumference. So the trick was in the onSensorChanged method:
if (sensorEvent.sensor.getType() == Sensor.TYPE_GYROSCOPE) {
    gyroscope = sensorEvent.values;
    double rotZ = gyroscope[2];            // angular velocity around z, in rad/s
    double degrees = Math.toDegrees(rotZ); // deg/s
    // circumference = 2.23 m: deg/s -> revolutions/s -> m/s, then * 3.6 for km/h
    float speed = (float) (degrees / 360.0 * 2.23 * 3.6);
}
If you now want more stable values, you could store them in an array and calculate the average. Remember to clear the array every 20th time (or so) that onSensorChanged is called. With SENSOR_DELAY_GAME registered there is sufficient data over which to build the average.
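A minimal sketch of that averaging idea, using a ring buffer so the oldest sample is overwritten rather than the whole array being cleared (the window size of 20 is just the suggestion above; the helper name smooth() is my own):
private final float[] window = new float[20]; // ~20 samples, as suggested above
private int index = 0;

// Called from onSensorChanged() with each new speed sample; returns the smoothed value.
private float smooth(float speed) {
    window[index] = speed;
    index = (index + 1) % window.length; // overwrite the oldest sample every 20th call
    float sum = 0;
    for (float v : window) sum += v;
    return sum / window.length;
}
Note that until the buffer has filled once, the average will be biased toward zero.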
Which sensor should be used to determine whether the device has been tilted UP or DOWN using Android sensors?
By using the following code:
SensorManager.getRotationMatrix(mRotationMatrix, null, mValuesAccel, mValuesMagnet);
SensorManager.remapCoordinateSystem(mRotationMatrix,
SensorManager.AXIS_Y, SensorManager.AXIS_MINUS_X, R2);
SensorManager.getOrientation(R2, mValuesOrientation);
I'm getting three orientation values (azimuth, pitch, roll).
How can I use these three orientation values to determine whether the device has been tilted up or down?
To get the device tilt, as well as the other attitude angles, you will need to register a listener for TYPE_ACCELEROMETER and TYPE_MAGNETIC_FIELD events:
mngr = (SensorManager)getSystemService(Context.SENSOR_SERVICE);
accelerometer = mngr.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
magneticField = mngr.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD);
int rate = SensorManager.SENSOR_DELAY_GAME; // or other
mngr.registerListener(sensorListener, accelerometer, rate);
mngr.registerListener(sensorListener, magneticField, rate);
Once the listener is active, you will need to call
SensorManager.getOrientation()
to obtain the current values for azimuth (z), pitch (x) and roll (y).
You can find a detailed code example here: http://www.codingforandroid.com/2011/01/using-orientation-sensors-simple.html
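For the up/down decision itself, pitch is the value to watch. A rough sketch (the 20° threshold and the sign convention are assumptions; verify both on your device):
private static final float TILT_THRESHOLD = 20f; // degrees; tune for your app

// mValuesOrientation comes from getOrientation(), as in the snippet above.
float pitchDeg = (float) Math.toDegrees(mValuesOrientation[1]);

if (pitchDeg < -TILT_THRESHOLD) {
    // tilted one way (e.g. top edge raised) -- verify the sign on your hardware
} else if (pitchDeg > TILT_THRESHOLD) {
    // tilted the other way (e.g. top edge lowered)
}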
I think the accelerometer sensor is what should be used to determine whether the device has been tilted up or down...
I want to detect metal using magnetic sensor values. I am getting values continuously, like x=30.00, y=-20.00, z=-13.00.
Now I want to know how to use these values to detect any metal (mathematical calculations, formulas).
The code is:
sensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
// get compass sensor (ie magnetic field)
myCompassSensor = sensorManager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD);
float azimuth = Math.round(event.values[0]);
float pitch = Math.round(event.values[1]);
float roll = Math.round(event.values[2]);
To detect metal, you have to check the intensity of the magnetic field, i.e. the magnitude of the magnetic field vector.
float mag = (float) Math.sqrt(x*x + y*y + z*z); // note: ^ is XOR in Java, not a power operator
Then you need to compare this value to the expected value of the magnetic field at your location on Earth. Luckily, Android provides a class for this: look at GeomagneticField, the reference is here: https://developer.android.com/reference/android/hardware/GeomagneticField.html
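A sketch of how you might obtain that expected magnitude with GeomagneticField (the coordinates are placeholders; in a real app you would get them from a location provider):
// Location of the device; the values below are placeholders.
GeomagneticField geoField = new GeomagneticField(
        48.8566f,                    // latitude, degrees (placeholder)
        2.3522f,                     // longitude, degrees (placeholder)
        35f,                         // altitude, meters (placeholder)
        System.currentTimeMillis()); // time, used by the field model

// getFieldStrength() returns nanoteslas; the sensor reports microteslas.
float expectedMag = geoField.getFieldStrength() / 1000f;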
Then, if the value you read from the sensors is quite far from the expected value, there is "something" (you guessed it: metal) disturbing the Earth's magnetic field in the vicinity of your sensor. A test you could implement, for instance, is the following:
if (mag > 1.4*expectedMag || mag < 0.6*expectedMag) {
//there is a high probability that some metal is close to the sensor
} else {
//everything is normal
}
You should experiment a bit with the 1.4 and 0.6 values so that they fit your application. Note that this is never going to work 100% of the time, because the magnetic sensors on a smartphone are quite cheap and nasty.
With the Android magnetic field sensor you detect magnetic fields, not metals as such. But metals that carry a magnetic field will also be detected, e.g. iron, nickel, etc., because ferrous metals behave the same way as a live electric cable.
It's been several days since I started using this function, and I have not yet succeeded in obtaining valid results.
What I want is basically to convert the acceleration vector from the device's coordinate system to real-world coordinates. I know this is possible, because I have the acceleration in relative coordinates and I know the orientation of the device in the real-world system.
Reading the Android developer docs, it seems that using getRotationMatrix() I get R = the rotation matrix.
So if I want A (the acceleration vector in the world system) from A' (the acceleration vector in the phone system), I simply have to do:
A = R * A'
But I can't understand why the first and second components of vector A are ALWAYS zero (example: +0.00, -0.00, +6.43).
My current code is similar to this:
public void onSensorChanged(SensorEvent event) {
    synchronized (this) {
        switch (event.sensor.getType()) {
            case Sensor.TYPE_ACCELEROMETER:
                accelerometervalues = event.values.clone();
                break;
            case Sensor.TYPE_MAGNETIC_FIELD:
                geomagneticmatrix = event.values.clone();
                break;
        }
        if (geomagneticmatrix != null && accelerometervalues != null) {
            float[] Rs = new float[16];
            float[] I = new float[16];
            SensorManager.getRotationMatrix(Rs, I, accelerometervalues, geomagneticmatrix);
            float[] resultVec = new float[4];
            float[] relativacc = new float[4];
            relativacc[0] = accelerationvalues[0];
            relativacc[1] = accelerationvalues[1];
            relativacc[2] = accelerationvalues[2];
            relativacc[3] = 0;
            Matrix.multiplyMV(resultVec, 0, Rs, 0, relativacc, 0);
            // resultVec[] should be the acceleration in world coordinates... but it doesn't WORK!!!!!
        }
    }
}
This question is very similar to this one: Transforming accelerometer's data from device's coordinates to real world coordinates, but I can't find the solution there... I have tried every approach.
Please help me, I need help!!!
UPDATE:
Now my code is below; I tried writing out the matrix product explicitly, but nothing changed:
float[] Rs = new float[9];
float[] I = new float[9];
SensorManager.getRotationMatrix(Rs, I, accelerationvalues, geomagneticmatrix);
float resultVec[] = new float[4];
resultVec[0]=Rs[0]*accelerationvalues[0]+Rs[1]*accelerationvalues[1]+Rs[2]*accelerationvalues[2];
resultVec[1]=Rs[3]*accelerationvalues[0]+Rs[4]*accelerationvalues[1]+Rs[5]*accelerationvalues[2];
resultVec[2]=Rs[6]*accelerationvalues[0]+Rs[7]*accelerationvalues[1]+Rs[8]*accelerationvalues[2];
Here is an example of the data read and the result (each log line holds the values separated by " "):
Rs: Rs[0] Rs[1] ... Rs[8]
Av: accelerationvalues[0] ... accelerationvalues[2]
rV: resultVec[0] ... resultVec[2]
As you can see, the components on the x and y axes in the real world are (around) zero, even if you move the phone quickly. The relative acceleration vector, instead, correctly detects every movement!!!
SOLUTION
The errors in the numbers come from float multiplication, which is not the same as double multiplication.
Add to this the fact that the rotation matrix isn't constant while the phone is accelerating, even if its orientation stays the same.
So it is impossible to translate the acceleration vector to absolute coordinates during motion...
It's hard, but that's the reality.
Finally I found the answer:
The errors in the numbers come from float multiplication, which is not the same as double multiplication. Here is the solution.
Add to this the fact that the rotation matrix isn't constant while the phone is accelerating, even if its orientation stays the same. So it is impossible to translate the acceleration vector to absolute coordinates during motion... It's hard, but that's the reality.
FYI, the orientation is computed from the magnetometer data AND the gravity vector. This causes a circular problem: converting the relative acceleration needs the orientation, which needs the magnetic field AND gravity, but we only know gravity when the phone is at rest, which we can only tell from the relative acceleration... so we are back where we started.
This is confirmed in the Android developer docs, where it is explained that the rotation matrix gives true results only when the phone isn't accelerating (e.g. they mention free fall, where in fact there should be no gravity measurement) and isn't in an irregular magnetic field:
The matrices returned by this function are meaningful only when the
device is not free-falling and it is not close to the magnetic north.
If the device is accelerating, or placed into a strong magnetic field,
the returned matrices may be inaccurate.
In other words: mostly useless...
You can verify this by doing simple experiments on a table with a sensor-viewer app (such as Android Sensor) or something like that.
You must track down this arithmetic error before you worry about rotation, acceleration or anything else.
You have confirmed that
resultVec[0]=Rs[0]*accelerationvalues[0];
gives you
Rs[0]: 0.24105562
accelerationValues[0]: 6.891896
resultVec[0]: 1.1920929E-7
So once again, simplify. Try this:
Rs[0] = 0.2f;
resultVec[0] = Rs[0] * 6.8f;
EDIT:
The last one gave resultVec[0] = 1.36, so let's try this:
Rs[0] = 0.2f;
accelerationValues[0] = 6.8f;
resultVec[0] = Rs[0] * accelerationValues[0];
If you do the sums using the printed values you appended, I get
`(0.00112, -0.0004, 10)`
which is not as small as what you have. Therefore there is an arithmetic error!
Could the problem be that you are using accelerationvalues[] in the last block, and accelerometervalues[] later?
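If that is indeed the culprit, the fix is simply to use one array consistently, e.g. (a sketch keeping the 4x4 matrix and android.opengl.Matrix.multiplyMV() from the snippet above; relativeAcc is my own name):
float[] Rs = new float[16];
float[] I = new float[16];
SensorManager.getRotationMatrix(Rs, I, accelerometervalues, geomagneticmatrix);

float[] relativeAcc = new float[4];
System.arraycopy(accelerometervalues, 0, relativeAcc, 0, 3); // same array everywhere
relativeAcc[3] = 0;

float[] resultVec = new float[4];
Matrix.multiplyMV(resultVec, 0, Rs, 0, relativeAcc, 0); // android.opengl.Matrix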
I have developed several applications that make use of Android sensors, so I am answering one of your questions according to my experience:
But I can't understand why the first and second components of vector A
are ALWAYS zero (example: +0.00, -0.00, +6.43)
I have observed this problem with the acceleration sensor and the magnetic field sensor too. The readings are zero for some of the axes (two, as you point out, or just one on other occasions). This problem happens when you have just enabled the sensors (registerListener()), and I assume it is related to some kind of sensor initialization.
In the case of the acceleration sensor, I have observed that just a small shake of the device makes it start giving correct readings.
The correct solution would be for the method onAccuracyChanged() to give correct information about the sensor state. It should return a status of SensorManager.SENSOR_STATUS_UNRELIABLE, but instead it permanently returns SensorManager.SENSOR_STATUS_ACCURACY_HIGH on all physical devices that I have tested so far. With onAccuracyChanged() properly implemented, you could ignore bad readings or ask the user to wait while the sensor is being initialized.
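If your device does report accuracy changes, the handling could look roughly like this (a sketch; as noted above, many devices never actually deliver SENSOR_STATUS_UNRELIABLE):
private volatile boolean sensorReliable = true;

@Override
public void onAccuracyChanged(Sensor sensor, int accuracy) {
    // Track whether the sensor currently reports itself as reliable.
    sensorReliable = (accuracy != SensorManager.SENSOR_STATUS_UNRELIABLE);
}

@Override
public void onSensorChanged(SensorEvent event) {
    if (!sensorReliable) {
        return; // ignore readings delivered during initialization
    }
    // ... normal processing ...
}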
Currently, I'm trying to rotate a 3D cube using orientation sensor values obtained through the getRotation() method. Some unexpected behavior appears when the Android device is rotated beyond certain bounds. For instance, if I make the device 'stand up', the value of the roll just goes crazy.
I'm also experiencing a phenomenon similar to the so-called gimbal lock. The only difference is that I'm experiencing the problem even before applying the sensor values to the 3D rotation. When I try to change the pitch value by rotating the device around only the pitch axis, the yaw value also changes along with the pitch rotation. That seems completely unreasonable to me.
Could somebody help me? I've been stuck on this problem for a month.
This is a common problem with yaw, pitch and roll. You cannot get rid of it as long as you are using yaw, pitch and roll (Euler angles). This video explains why.
I use rotation matrices instead of Euler angles in my motion sensing application. For an introduction to rotation matrices I recommend:
Direction Cosine Matrix IMU: Theory
Rotation matrices work like a charm.
Quaternions are also very popular and said to be the most stable.
[This answer was copied from here.]
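On Android, one convenient way to avoid Euler angles entirely is the TYPE_ROTATION_VECTOR sensor, whose output can be converted directly to a rotation matrix or a quaternion (a sketch of mine, not part of the original answer):
@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR) {
        // Build a rotation matrix straight from the rotation vector...
        float[] R = new float[9];
        SensorManager.getRotationMatrixFromVector(R, event.values);

        // ...or, if you prefer quaternions:
        float[] q = new float[4]; // filled as (w, x, y, z)
        SensorManager.getQuaternionFromVector(q, event.values);
    }
}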
Using quaternions to compute yaw/pitch/roll won't do much to solve the problem: near a pitch of +/-90°, gimbal lock (actually a yaw-roll ambiguity at the 'north pole') can still drive yaw and roll crazy under slight changes/noise in the underlying quaternion.
However, if you use yaw, pitch and roll values to perform a rotation of a 3D object, you shouldn't see any odd behavior near the gimbal lock position. It's just that an ambiguity between yaw and roll arises, and large variations in yaw and roll do not imply that the actual orientation is going crazy, only that the orientation is insensitive to large changes in yaw-roll near a pitch of 90°.
BUT, also note that phones and browsers for HTML5 do not properly implement yaw, pitch and roll per the conventions for Android. Here is a good blog post for reference:
http://www.sensorplatforms.com/understanding-orientation-conventions-mobile-platforms/
Here is a basic example; it reads the acceleration vector (which is the gravity vector while the device is at rest) and computes its magnitude. Note that you can change the sensor type and the sampling speed; more details here.
SensorManager sensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
Sensor sensor = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
sensorManager.registerListener(new SensorEventListener() {
    @Override
    public void onSensorChanged(SensorEvent event) {
        float x = event.values[0];
        float y = event.values[1];
        float z = event.values[2];
        // Magnitude of the acceleration vector; ~9.81 m/s^2 when the device is at rest.
        double total = Math.sqrt(x * x + y * y + z * z);
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
    }
}, sensor, SensorManager.SENSOR_DELAY_FASTEST);
Well, if you are running on a phone:
Quaternions are the best, and you should use them.
With rotation matrices and Euler angles you can easily run into the term 'gimbal lock'. It happens frequently with violent user motion.
Gimbal lock is the loss of one degree of freedom in a three-dimensional, three-gimbal mechanism that occurs when the axes of two of the three gimbals are driven into a parallel configuration, "locking" the system into rotation in a degenerate two-dimensional space.
Rotation matrices and Euler angles are fine for slow-moving robot actions.
For details on quaternion concatenation and converting a point to a new system, you can refer to the wiki link:
https://en.wikipedia.org/wiki/Quaternion
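To make the 'convert a point to the new system' part concrete: rotating a vector v by a unit quaternion q = (w, x, y, z) amounts to computing q * v * q^-1. A minimal sketch in plain Java (my own helper, assuming q is already normalized):
// Rotate vector v by unit quaternion q = (w, x, y, z): v' = q * v * q^-1,
// expanded into the standard cross-product form to avoid full quaternion products.
static float[] rotate(float[] q, float[] v) {
    float w = q[0], x = q[1], y = q[2], z = q[3];
    // t = 2 * cross(q_xyz, v)
    float tx = 2f * (y * v[2] - z * v[1]);
    float ty = 2f * (z * v[0] - x * v[2]);
    float tz = 2f * (x * v[1] - y * v[0]);
    // v' = v + w * t + cross(q_xyz, t)
    return new float[] {
        v[0] + w * tx + (y * tz - z * ty),
        v[1] + w * ty + (z * tx - x * tz),
        v[2] + w * tz + (x * ty - y * tx),
    };
}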