I want to calculate the orientation of the phone. The Android documentation says I can do that using getRotationMatrix(float[] R, float[] I, float[] gravity, float[] geomagnetic) and remapCoordinateSystem(float[], int, int, float[]), but the documentation also states: "The matrices returned by this function are meaningful only when the device is not free-falling and it is not close to the magnetic north. If the device is accelerating, or placed into a strong magnetic field, the returned matrices may be inaccurate."
My question is: how do I calculate the phone's orientation while the phone is accelerating, no matter what kind of acceleration it is (free fall, phone attached to a car, etc.)?
getResources().getConfiguration().orientation
http://developer.android.com/reference/android/content/res/Configuration.html#orientation
An accelerometer is NOT designed to measure pitch. It is only by a fortunate natural coincidence that you can establish pitch from a NON-ACCELERATING accelerometer. At all other times it is unreliable for measuring pitch. This has nothing to do with the quality of the sensor; accelerometers simply DO NOT measure pitch. To reliably establish pitch you need a gyroscope. Those devices ARE designed to measure pitch.
Related
I'm using both the gyroscope and the accelerometer on an Android device. All I do is display the values from both sensors. What I don't get is why I would have to use the gyroscope to track device acceleration, and why the device orientation is given by the accelerometer sensor.
I have tested this code on 2 tablets and 3 phones, and the results are the same.
Listeners:
// gyroscope sensor
sensorManager.registerListener(this, sensorManager.getDefaultSensor(Sensor.TYPE_GYROSCOPE), SensorManager.SENSOR_DELAY_NORMAL);
// accelerometer sensor
sensorManager.registerListener(this, sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER), SensorManager.SENSOR_DELAY_NORMAL);
And to get the results I implement SensorEventListener:
@Override
public void onSensorChanged(SensorEvent sensorEvent) {
    switch (sensorEvent.sensor.getStringType()) {
        case Sensor.STRING_TYPE_ACCELEROMETER:
            ((TextView) findViewById(R.id.sensor_accel_data_textView)).setText(String.valueOf(sensorEvent.values[0]));
            ((TextView) findViewById(R.id.sensor_accel_data_textView2)).setText(String.valueOf(sensorEvent.values[1]));
            ((TextView) findViewById(R.id.sensor_accel_data_textView3)).setText(String.valueOf(sensorEvent.values[2]));
            break;
        case Sensor.STRING_TYPE_GYROSCOPE:
            ((TextView) findViewById(R.id.sensor_gyro_data_textView)).setText(String.valueOf(sensorEvent.values[0]));
            ((TextView) findViewById(R.id.sensor_gyro_data_textView2)).setText(String.valueOf(sensorEvent.values[1]));
            ((TextView) findViewById(R.id.sensor_gyro_data_textView3)).setText(String.valueOf(sensorEvent.values[2]));
            break;
    }
}
They are not inverted. The accelerometer gives you ax, ay, az, which are accelerations along the three axes. The gyroscope gives you gx, gy, gz, which are rotation velocities around those same three axes.
Those two sensors can be used independently.
The accelerometer does not give you orientation. There is an orientation sensor, but it is deprecated. Sensor values are reported in the device's coordinate system, so they are orientation dependent, but there are ways to make them orientation independent.
You can install a sensor app from the Play Store and compare its values with yours for testing purposes.
This question is almost a year old, but I just stumbled onto it and got the impression that the root of the question is a misunderstanding of what each sensor actually measures.
So, hopefully the following explanation clarifies a bit that the sensors are not switched, but that in fact accelerometers are used to detect orientation and that the (imho badly named) gyroscope does not provide an absolute orientation.
Accelerometer
You should imagine an accelerometer as a sample mass, which is held by some springs, because that is what an accelerometer is (you can search for MEMS accelerometer to find examples on how this is actually implemented on a micrometer scale). If you accelerate the device, the mass will push against the springs because of inertia and the tiny deflection of the mass is measured to detect the acceleration. However, the mass is not only deflected by an actual acceleration but also by gravitational pull. So, if your phone is resting on the table, the mass is still deflected towards the ground.
So, the "-10" to "10" you see is earth's acceleration at 9.81 m/s². From a physics perspective, this is confusing because the resting device is obviously not being accelerated while the sensor still shows 9.81 m/s², so we get the device acceleration plus earth's acceleration from the sensor. While this is the nightmare of any physics teacher, it is extremely helpful because it tells you where "down" is and hence gives you the orientation of the device (except for rotations around the axis of gravity).
Gyroscope
The sensor called "gyroscope" is another sample mass in your device, which is actively driven to vibrate. Because of the vibration movement it is subject to the Coriolis force (in its frame of reference) and gets deflected when rotating (searching for "MEMS gyroscope" yields even more astonishing devices, especially if you think about the fact that they can detect the Coriolis force on that scale).
However, the deflection does not allow you to determine an absolute angle (as an old-fashioned gyroscope would), but instead it is proportional to the angular velocity (rate of rotation). Therefore, you can expect a reading of zero on all axes for a resting device and will only see a reading for any axis as long as you are rotating the device about this axis.
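As an illustration of "rate of rotation", here is a rough sketch (my own, not from the answer) of how gyroscope readings are typically integrated over time into an angle change; note that any measurement error is integrated too, so the result drifts:
// Hypothetical integration of gyroscope rates into accumulated angles.
// values[] is a gyroscope reading in rad/s, dt the time step in seconds.
private final float[] angleRad = new float[3];

public void onGyroscopeSample(float[] values, float dt) {
    for (int i = 0; i < 3; i++) {
        angleRad[i] += values[i] * dt; // rate * time = angle change
    }
    // A resting device reads ~0 rad/s on all axes, so angleRad stays put,
    // but small errors accumulate: the estimate drifts over time.
}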
Gyroscope vs Accelerometer:
An accelerometer measures how fast the velocity of the object changes along each axis, i.e. its acceleration along X, Y and Z.
A gyroscope measures how fast the object rotates around each axis, i.e. its angular velocity around X, Y and Z.
At the beginning, your X, Y or Z axis will show Earth's gravitational acceleration, measured in meters per second squared: 9.81 m/s².
I am looking for a solution that replaces the deprecated Android sensor Sensor.TYPE_ORIENTATION.
The most reported solution is to combine Sensor.TYPE_ACCELEROMETER and Sensor.TYPE_MAGNETIC_FIELD, then calculate a rotation matrix by using SensorManager#getRotationMatrix and obtain the Euler angles by using SensorManager#getOrientation.
Another reported solution is to use Sensor.TYPE_ROTATION_VECTOR, which also ends up with a rotation matrix and the Euler angles by using SensorManager#getOrientation.
Unfortunately both behave totally differently from TYPE_ORIENTATION when rotating the mobile device. Try both types while your phone is lying on the desk and then pitch it up to 90° (the screen is now facing directly towards you). The calculated Euler angles for azimuth and roll go really wild (because of the gimbal lock problem), while the degree values retrieved with TYPE_ORIENTATION stay pretty stable (not accurate, but quite OK). Every value (yaw, pitch and roll) of TYPE_ORIENTATION seems to be some kind of "projected" degree without the gimbal lock problem.
What would be a way to get similar degrees (for yaw, roll and pitch) without using the deprecated TYPE_ORIENTATION sensor (maybe from the rotation matrix)? How does the TYPE_ORIENTATION algorithm do it internally?
The azimuth in getOrientation is the angle between magnetic north and the projection of the device's y-axis onto the world x-y plane. When the device is pitched up to 90°, that projection is the zero vector, so the azimuth is undefined in this case and can be any value. Physically, trying to find the angle between magnetic north and a vector pointing at the sky does not make sense.
You should look at my project at https://github.com/hoananguyen/dsensor/blob/master/dsensor/src/main/java/com/hoan/dsensor_master/DProcessedSensor.java
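For reference, a minimal sketch of the accelerometer + magnetometer approach discussed above (the SensorManager calls are the real API; the field names and structure are my own simplification):
// Assumes both TYPE_ACCELEROMETER and TYPE_MAGNETIC_FIELD are registered.
private final float[] accel = new float[3];
private final float[] magnet = new float[3];
private final float[] rotation = new float[9];
private final float[] orientation = new float[3];

@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
        System.arraycopy(event.values, 0, accel, 0, 3);
    } else if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) {
        System.arraycopy(event.values, 0, magnet, 0, 3);
    }
    if (SensorManager.getRotationMatrix(rotation, null, accel, magnet)) {
        SensorManager.getOrientation(rotation, orientation);
        // orientation = {azimuth, pitch, roll} in radians. As explained
        // above, the azimuth becomes meaningless near 90° pitch.
    }
}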
Android provides sensor data in the device coordinate system no matter how the device is oriented. Is there any way to get sensor data in a 'gravity' coordinate system? I mean: no matter how the device is oriented, I want accelerometer data and orientation in a coordinate system where the y-axis points toward the sky, the x-axis toward the east and the z-axis toward the south pole.
I took a look at remapCoordinateSystem, but it seems to be limited to swapping axes. I guess for orientation I will have to do some low-level rotation matrix transformation (or is there a better solution?). But what about the acceleration data? Is there any way to get the data relative to a coordinate system that is fixed (a sort of world coordinate system)?
The reason I need this is that I'm trying to detect some simple motion gestures while the phone is in a pocket, and it would be easier for me to have all data in a coordinate system related to the user rather than the device coordinate system (which will be oriented a little differently in different users' pockets).
Well, you basically get the north orientation when starting; for this you use the accelerometer and the magnetic field sensor to compute the orientation (or the deprecated orientation sensor).
Once you have it, you can compute a rotation matrix (a Direction Cosine Matrix) from those azimuth, pitch and roll angles. Multiplying your acceleration vector by that matrix transforms your device-frame movements into Earth-frame ones.
As your device changes its orientation over time, you'll need to keep the matrix updated. To do so, retrieve the gyroscope's data and update your Direction Cosine Matrix for each new value. You could also recompute the orientation from scratch just like the first time, but it's less accurate.
My solution involves a DCM, but you could also use quaternions; it's just a matter of choice. Feel free to ask more if needed. I hope this is what you wanted to know!
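As an illustration (my own sketch, not the answerer's code): given the 9-element rotation matrix R from SensorManager.getRotationMatrix, which maps device coordinates to world coordinates (x east, y north, z sky), the transform is a plain matrix-vector multiply:
// Hypothetical helper: rotate a device-frame acceleration into the Earth frame.
static float[] toEarthFrame(float[] R, float[] a) {
    float[] world = new float[3];
    for (int row = 0; row < 3; row++) {
        world[row] = R[3 * row] * a[0] + R[3 * row + 1] * a[1] + R[3 * row + 2] * a[2];
    }
    // world[2] still contains gravity (~9.81 m/s^2) until you subtract it.
    return world;
}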
I want to know the values of the X, Y and Z axes for any position/movement of the device so I can use these values for my further work. From my searching there are two options: the orientation sensor (which gives values in degrees, as azimuth, pitch and roll) and the accelerometer (which gives values roughly between -10 and 10 for x, y and z).
To my understanding both are the same for my requirement, and I can't find the difference between them. Please can anyone explain them to me in detail with respect to my aim? Which sensor should I use?
There are differences between the two:
The accelerometer detects acceleration in space. The reason it will always detect an acceleration of 9.8 m/s² downwards is that gravity is equivalent to an acceleration in space.
The orientation sensor detects whether your device's axes are rotated away from the real-world frame; it detects tilts and degrees from magnetic north. Please note that this sensor is deprecated and Google recommends you use the accelerometer and magnetometer to calculate orientation.
You'll need the accelerometer to detect movement, so you should use that one, since your aim is to know about this movement.
The orientation sensor gives information about the device's position compared to a reference plane, so you could use it to see whether the device is tilted, upside down or something like that.
Can anyone help with removing the g factor from accelerometer readings?
I am using a SensorEventListener with the onSensorChanged() method to get Sensor.TYPE_ACCELEROMETER data. I need only pure acceleration values in all directions, so whenever the device is stationary (or moving at constant speed), it should read roughly (0.0, 0.0, 0.0).
Currently, depending on its pitch and roll, it gives me variable output because of the g force acting on each axis.
I hope there is some formula to remove this, since I also get orientation values (pitch and roll) from a Sensor.TYPE_ORIENTATION listener. I have tried some formulas, but they didn't work.
You can use a low-pass filter.
Do this for each of your sensor values:
g = 0.9 * g + 0.1 * v
Where v is your current sensor value and g is a global variable initially set to zero. Mind that you'll need as many g variables as you have axes.
With v = v - g you can eliminate the gravity factor from your sensor value.
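Put together as code, the filter from this answer could look like this (my own sketch; the 0.9/0.1 weights are the ones given above and should be tuned to your sample rate):
// Low-pass filter: gravity[] slowly tracks the constant component of the
// accelerometer signal; subtracting it leaves the linear acceleration.
private final float[] gravity = new float[3]; // initially zero

public void onAccelerometerSample(float[] v) {
    float[] linear = new float[3];
    for (int i = 0; i < 3; i++) {
        gravity[i] = 0.9f * gravity[i] + 0.1f * v[i];
        linear[i] = v[i] - gravity[i];
    }
    // linear is now roughly (0, 0, 0) for a device at rest or at constant speed.
}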
Use Sensor.TYPE_LINEAR_ACCELERATION instead of Sensor.TYPE_ACCELEROMETER.
Take a look at the following link:
http://developer.android.com/reference/android/hardware/SensorEvent.html
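Registering for it looks just like the accelerometer example earlier on this page; TYPE_LINEAR_ACCELERATION reports acceleration with gravity already removed by the platform's own sensor fusion (availability depends on the device):
// Same listener pattern as above, but gravity is already subtracted.
sensorManager.registerListener(this,
        sensorManager.getDefaultSensor(Sensor.TYPE_LINEAR_ACCELERATION),
        SensorManager.SENSOR_DELAY_NORMAL);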
Just subtract out g (~9.8 m/s²) times the third row of the rotation matrix.
Or, to be more explicit about it, let
a = your accelerometer reading,
R = your rotation matrix (as a 9-element vector).
Then what you want is
(a[0] - g*R[6], a[1] - g*R[7], a[2] - g*R[8]).
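This works because R[6], R[7], R[8] (the third row of R) are the world z-axis expressed in device coordinates, i.e. the direction in which gravity shows up. As code (a minimal sketch, assuming a and R come from the sensor callbacks as above):
// Remove gravity from a raw accelerometer reading a using the rotation
// matrix R from SensorManager.getRotationMatrix.
final float g = 9.81f;
float[] linear = {
        a[0] - g * R[6],
        a[1] - g * R[7],
        a[2] - g * R[8]
};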
Differentiating a function of time with respect to time rids you of the constants.
So by taking the derivative of the accelerometer's signal you'll get the "jerk", which you can then re-integrate in order to get the non-constant part of the acceleration you're looking for.
In layman's terms: take a sample from the accelerometer every second and subtract it from the previous sample. If the answer is (very close to) zero, your acceleration hasn't changed, so relative to Earth you are still only seeing gravity. If the result is non-zero, integrate it (in this case, multiply by one second) and you have the change in acceleration.
Two things, though:
- Look out for noise in the signal; round off your input.
- Don't expect hyper-accurate results from on-chip accelerometers. You can use them to detect shaking or changes in orientation, but not really for knowing how many g's you're experiencing while making sharp turns in your car.
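A rough numeric sketch of that differencing idea (my own illustration; it only holds while the device's orientation, and hence the gravity component, stays constant):
// Differencing successive samples removes the constant (gravity) part;
// re-summing the differences gives acceleration relative to the first sample.
private float[] previous; // last raw accelerometer sample
private final float[] relative = new float[3];

public void onSample(float[] current) {
    if (previous == null) {
        previous = current.clone(); // baseline, assumed to be gravity only
        return;
    }
    for (int i = 0; i < 3; i++) {
        relative[i] += current[i] - previous[i]; // accumulated change
        previous[i] = current[i];
    }
    // relative[] is the acceleration on top of the initial baseline;
    // noise makes it drift, so round or threshold the input.
}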
One way (for devices with only an accelerometer) is to remove the gravity vector from the accelerometer data by subtracting the values that would be read in the static case for the same orientation. But since the orientation is itself calculated from the acceleration readings, and not independently, this is not very accurate.
A gyroscope may help in this case. But still only a few Android devices have a true gyroscope, and using its raw readings is not so simple.
You need to assume two coordinate systems:
1. A fixed global system.
2. A moving coordinate system whose origin moves and rotates as the sensor does.
In the global system, g is always parallel to the z-axis, but in the moving system it is not.
So all you have to do is compute the 3×3 rotation matrix from the orientation angles, i.e. yaw, pitch and roll (you can find the formulas everywhere), then multiply this rotation matrix by the 3×1 acceleration vector measured by the sensor.
This transforms the coordinates and expresses the values in the fixed global system. The only thing left afterwards is to simply subtract g from the z value.
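A sketch of those formulas in code (my own illustration, using a Z-Y-X yaw/pitch/roll convention; the convention must match however your angles were obtained):
// Build R = Rz(yaw) * Ry(pitch) * Rx(roll) and rotate the device-frame
// acceleration a into the fixed global frame, then remove gravity from z.
static float[] toGlobal(float yaw, float pitch, float roll, float[] a) {
    float cy = (float) Math.cos(yaw),   sy = (float) Math.sin(yaw);
    float cp = (float) Math.cos(pitch), sp = (float) Math.sin(pitch);
    float cr = (float) Math.cos(roll),  sr = (float) Math.sin(roll);

    float[][] R = {
        { cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr },
        { sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr },
        { -sp,     cp * sr,                cp * cr                }
    };

    float[] global = new float[3];
    for (int i = 0; i < 3; i++) {
        global[i] = R[i][0] * a[0] + R[i][1] * a[1] + R[i][2] * a[2];
    }
    global[2] -= 9.81f; // subtract g from the global z value
    return global;
}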