I'm using both the gyroscope and the accelerometer in an Android app. All I do is display the values from both sensors. What I don't get is why, to track device acceleration, I have to use the gyroscope, and why the device orientation is given by the accelerometer.
I have tested this code on 2 tablets and 3 phones, and the results are the same.
Listeners :
// gyroscope sensor
sensorManager.registerListener(this, sensorManager.getDefaultSensor(Sensor.TYPE_GYROSCOPE), SensorManager.SENSOR_DELAY_NORMAL);
// accelerometer sensor
sensorManager.registerListener(this, sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER), SensorManager.SENSOR_DELAY_NORMAL);
And to get the result I implement SensorEventListener :
@Override
public void onSensorChanged(SensorEvent sensorEvent) {
    switch (sensorEvent.sensor.getStringType()) {
        case Sensor.STRING_TYPE_ACCELEROMETER:
            ((TextView) findViewById(R.id.sensor_accel_data_textView)).setText(String.valueOf(sensorEvent.values[0]));
            ((TextView) findViewById(R.id.sensor_accel_data_textView2)).setText(String.valueOf(sensorEvent.values[1]));
            ((TextView) findViewById(R.id.sensor_accel_data_textView3)).setText(String.valueOf(sensorEvent.values[2]));
            break;
        case Sensor.STRING_TYPE_GYROSCOPE:
            ((TextView) findViewById(R.id.sensor_gyro_data_textView)).setText(String.valueOf(sensorEvent.values[0]));
            ((TextView) findViewById(R.id.sensor_gyro_data_textView2)).setText(String.valueOf(sensorEvent.values[1]));
            ((TextView) findViewById(R.id.sensor_gyro_data_textView3)).setText(String.valueOf(sensorEvent.values[2]));
            break;
    }
}
They are not inverted. The accelerometer gives you ax, ay, az, which are accelerations along the 3 axes. The gyroscope gives you gx, gy, gz, which are angular velocities (rates of rotation) around the 3 axes.
Those two sensors can be used independently.
The accelerometer does not give you orientation directly. There is an orientation sensor, but it is deprecated. I think the sensor values are orientation-dependent, but there are ways to make them orientation-independent.
You can install a sensor app from the Play Store and compare its values with yours for testing purposes.
This question is almost a year old, but I just stumbled onto it and got the impression that the root of the question is a misunderstanding of what each sensor actually measures.
So, hopefully the following explanation clarifies that the sensors are not switched: accelerometers are in fact used to detect orientation, and the (imho badly named) gyroscope does not provide an absolute orientation.
Accelerometer
You should imagine an accelerometer as a sample mass, which is held by some springs, because that is what an accelerometer is (you can search for MEMS accelerometer to find examples on how this is actually implemented on a micrometer scale). If you accelerate the device, the mass will push against the springs because of inertia and the tiny deflection of the mass is measured to detect the acceleration. However, the mass is not only deflected by an actual acceleration but also by gravitational pull. So, if your phone is resting on the table, the mass is still deflected towards the ground.
So, the "-10" to "10" you see is earth's acceleration at 9.81 m/s². From a physics perspective, this is confusing because the resting device is obviously not being accelerated while the sensor still shows 9.81 m/s², so we get the device acceleration plus earth's acceleration from the sensor. While this is the nightmare of any physics teacher, it is extremely helpful because it tells you where "down" is and hence gives you the orientation of the device (except for rotations around the axis of gravity).
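That "down" direction is enough to recover tilt. Here is a minimal plain-Java sketch (class name, formulas in the usual convention, and sample values are my own, not from any answer above) of extracting pitch and roll from a gravity-only accelerometer reading:

```java
// Sketch: deriving device tilt from the gravity component of the
// accelerometer. Rotation around the gravity axis stays unknown,
// exactly as described above.
public class TiltFromGravity {

    /** Pitch in degrees: rotation of the device about its X axis. */
    public static double pitchDegrees(double ax, double ay, double az) {
        return Math.toDegrees(Math.atan2(-ax, Math.sqrt(ay * ay + az * az)));
    }

    /** Roll in degrees: rotation of the device about its Y axis. */
    public static double rollDegrees(double ax, double ay, double az) {
        return Math.toDegrees(Math.atan2(ay, az));
    }

    public static void main(String[] args) {
        // Device flat on a table: gravity deflects the mass along +Z only.
        System.out.println(pitchDegrees(0.0, 0.0, 9.81)); // ~0 (flat)
        // Device standing upright on its bottom edge: gravity along +Y.
        System.out.println(rollDegrees(0.0, 9.81, 0.0));  // ~90 (upright)
    }
}
```

Note that this only works while the 9.81 m/s² gravity signal dominates; during real accelerations the same reading mixes motion and gravity, which is exactly the caveat raised later in this thread.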
Gyroscope
The sensor called "gyroscope" is another sample mass in your device, which is actively driven to vibrate. Because of the vibration movement it is subject to the Coriolis force (in its frame of reference) and gets deflected when rotating (searching for "MEMS gyroscope" yields even more astonishing devices, especially if you think about the fact that they can detect the Coriolis force on that scale).
However, the deflection does not allow you to determine an absolute angle (as an old-fashioned gyroscope would), but instead it is proportional to the angular velocity (rate of rotation). Therefore, you can expect a reading of zero on all axes for a resting device and will only see a reading for any axis as long as you are rotating the device about this axis.
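So to turn gyroscope readings into an angle you have to integrate the rate yourself. A toy sketch (names, sample rates and timestep are invented for illustration):

```java
// Sketch: a gyroscope axis reports angular velocity (rad/s), so the
// rotation angle is the sum of rate * elapsed-time over all samples.
public class GyroIntegration {

    public static double integrate(double[] ratesRadPerSec, double dtSeconds) {
        double angle = 0.0;
        for (double rate : ratesRadPerSec) {
            angle += rate * dtSeconds; // each reading contributes rate * dt
        }
        return angle;
    }

    public static void main(String[] args) {
        // Ten readings of 0.5 rad/s sampled at 100 Hz -> about 0.05 rad total.
        double[] rates = new double[10];
        java.util.Arrays.fill(rates, 0.5);
        System.out.println(integrate(rates, 0.01));
    }
}
```

Because every sample carries a small bias and noise, this running sum drifts over time, which is why gyro-only orientation degrades and is usually fused with an absolute reference.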
Gyroscope vs Accelerometer:
The accelerometer measures how fast the object's velocity is changing along each axis: its acceleration along X, Y and Z.
The gyroscope measures how fast the object is rotating around each axis: its angular velocity around X, Y and Z.
At rest, your X, Y or Z reading will show Earth's gravitational acceleration, measured in meters per second squared: 9.81 m/s².
I am collecting accelerometer and gyroscope data from an Android smartwatch while in a moving car.
The goal is to be able to classify, using Hidden Markov Models, whether the subject is the driver of the car or a passenger, looking at e.g. the steering wheel.
When reading about Android sensors, I noticed the following statement:
https://developer.android.com/guide/topics/sensors/sensors_overview.html
When a device is held in its default orientation, the X axis is horizontal and points to the right, the Y axis is vertical and points up, and the Z axis points toward the outside of the screen face.
As I interpret it: the (x, y, z) axes of the accelerometer and the gyroscope will swap according to the orientation of the device. I've read that it is possible to fix the three axes to the world-frame coordinate system, so that for example the Z axis always measures vertical acceleration from the accelerometer.
I think that if the axes are not fixed, the data will generalize poorly for classification purposes. However, I am uncertain whether this is actually an issue, or whether data with unfixed axes is equally good or better.
A sub-question is whether there are sensors other than the accelerometer and gyroscope which could be used for this classification goal.
Thanks in advance!
I'm currently facing some problems reading out the values of the Android rotation vector. As the Android documentation explains, the rotation vector is a sensor-fusion combination of the gyroscope, magnetometer and accelerometer. The reference frame of the rotation vector is X facing east, Y facing north and Z vertical when the device is lying flat.
All the mentioned characteristics are right and work in my application.
Now the problem: the rotation vector seems partly very inaccurate. First, the values of the rotation vector seem to lag behind the actual movement; second, the values are badly affected by fast movements of the device.
To exclude any errors in my own implementation, I tested a demo application from Google as well. The source code is located here. In this demo too, the rotation vector seems partly laggy and unreliable during fast movements.
Has anyone else faced problems with Android's rotation vector?
Here is how I'm doing it:
Register the sensor and listener:
sensorManager.registerListener(this, rotationVectorSensor, SensorManager.SENSOR_DELAY_FASTEST, handler);
Reading out the values and creating a rotation matrix used for OpenGL rendering:
@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR) {
        SensorManager.getRotationMatrixFromVector(rotationMatrix4D, event.values);
    }
}
Edit:
I am developing on a Samsung Galaxy S3 Mini. Maybe the device's sensors are just not that good.
After investigating Android's rotation vector in depth on different devices, I realized that this sensor is indeed really inaccurate.
There are several possibilities to get a more accurate and stable orientation:
1. Smooth the Rotation Vector over time
2. Perform a sensor fusion with the rotation vector and, for example, the gyroscope.
My approach was to store the initial rotation vector and then use the gyroscope to calculate the actual orientation of the device. Since the gyroscope is likely to be affected by drift, I reset the orientation to the orientation of the rotation vector every few seconds.
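A stripped-down, single-angle sketch of that gyro-plus-reset idea (class name, the alpha value and the call pattern are my own, not taken from the app described above):

```java
// Sketch of a complementary filter: the gyro path reacts fast but
// drifts; continuously bleeding the estimate toward an absolute
// reference (e.g. an angle from TYPE_ROTATION_VECTOR) removes drift.
public class DriftCorrectedAngle {
    private double angle;       // current estimate, in radians
    private final double alpha; // gyro weight, close to 1 (e.g. 0.98)

    public DriftCorrectedAngle(double initialAngle, double alpha) {
        this.angle = initialAngle;
        this.alpha = alpha;
    }

    // Called once per sensor event: gyroRate would come from
    // TYPE_GYROSCOPE, referenceAngle from the rotation vector.
    public double update(double gyroRate, double dt, double referenceAngle) {
        double integrated = angle + gyroRate * dt; // fast, but drifts
        // Mostly trust the integrated gyro, but keep pulling the
        // estimate toward the absolute reference.
        angle = alpha * integrated + (1.0 - alpha) * referenceAngle;
        return angle;
    }
}
```

The hard reset "every few seconds" described above is the discrete version of the same trade-off; the continuous blend just avoids visible jumps at the reset instants.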
Android provides sensor data in the device coordinate system, no matter how the device is oriented. Is there any way to get sensor data in a 'gravity' coordinate system? I mean that no matter how the device is oriented, I want accelerometer data and orientation in a coordinate system where the y-axis points toward the sky, the x-axis toward the east and the z-axis toward the south pole.
I took a look at remapCoordinateSystem, but it seems to be limited to swapping axes. I guess for orientation I will have to do some low-level rotation matrix transformation (or is there a better solution?). But what about acceleration data? Is there any way to get the data relative to a fixed coordinate system (a sort of world coordinate system)?
The reason I need this is that I'm trying to detect some simple motion gestures while the phone is in a pocket, and it would be easier for me to have all data in a coordinate system related to the user rather than the device coordinate system (which will have a slightly different orientation in different users' pockets).
Well, you basically get the North orientation when starting: for this you use the accelerometer and the magnetic field sensor to compute the orientation (or the deprecated Orientation sensor).
Once you have it, you can compute a rotation matrix (Direction Cosine Matrix) from those azimuth, pitch and roll angles. Multiplying your acceleration vector by that matrix will transform your device-frame movements into Earth-frame ones.
As your device will change its orientation as time goes by, you'll need to update the matrix. To do so, retrieve the gyroscope's data and update your Direction Cosine Matrix for each new value. You could also recompute the true orientation the same way as at the start, but that is less accurate.
My solution involves a DCM, but you could also use quaternions; it's just a matter of choice. Feel free to ask more if needed. I hope this is what you wanted to know!
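For the acceleration part, the transform itself is just a matrix-vector product. A self-contained sketch (the 3x3 matrix here is hand-built for the example; on Android it would come from SensorManager.getRotationMatrix()):

```java
// Sketch: rotating a device-frame acceleration vector into the
// Earth frame with a Direction Cosine Matrix.
public class EarthFrameAcceleration {

    // result[i] = sum over j of R[i][j] * v[j]
    public static double[] rotate(double[][] r, double[] v) {
        double[] out = new double[3];
        for (int i = 0; i < 3; i++) {
            for (int j = 0; j < 3; j++) {
                out[i] += r[i][j] * v[j];
            }
        }
        return out;
    }

    public static void main(String[] args) {
        // Example DCM: a 90-degree rotation about Z, so the device's
        // X axis maps onto the Earth-frame Y axis.
        double[][] r = {
            { 0, -1, 0 },
            { 1,  0, 0 },
            { 0,  0, 1 },
        };
        double[] deviceAccel = { 1.0, 0.0, 0.0 };
        double[] earthAccel = rotate(r, deviceAccel);
        System.out.println(java.util.Arrays.toString(earthAccel)); // [0.0, 1.0, 0.0]
    }
}
```

With quaternions the update step differs, but the final "device frame to Earth frame" step is still equivalent to applying a rotation like this one.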
I want to calculate the orientation of the phone. The Android documentation says that I can do that using getRotationMatrix(float[] R, float[] I, float[] gravity, float[] geomagnetic) and remapCoordinateSystem(float[], int, int, float[]), but the documentation also states: "The matrices returned by this function are meaningful only when the device is not free-falling and it is not close to the magnetic north. If the device is accelerating, or placed into a strong magnetic field, the returned matrices may be inaccurate."
My question is how to calculate the phone's orientation while the phone is accelerating, no matter what kind of acceleration it is: free fall, phone attached to a car, etc.
getResources().getConfiguration().orientation
http://developer.android.com/reference/android/content/res/Configuration.html#orientation
An accelerometer is NOT designed to calculate pitch. It is only by mere fortunate natural coincidence that you can establish pitch from a NON-ACCELERATING accelerometer. At all other times it is unreliable (for measuring pitch). It has nothing to do with the quality of the sensor; accelerometers simply DO NOT measure pitch. To reliably track pitch you need gyroscopes, which measure the rate of pitch change.
I want to know the X, Y and Z axis values for any position/movement of the device so I can use these values in my further work. As far as I have found, there are two options: the orientation sensor (gives values in degrees, as azimuth, pitch and roll) and the accelerometer (gives values between -10 and 10 for x, y and z).
As per my understanding both are the same for my requirement; I can't find the difference between them. Can anyone explain them in detail with respect to my aim? Which sensor should I use?
There are differences between both:
The accelerometer detects acceleration in space. The reason why it will always detect an acceleration of 9.8 m/s² downwards is that gravity is equivalent to an acceleration in space.
The orientation sensor detects whether your device's axes are rotated from the real world; it detects tilt and degrees from magnetic North. Please note that this sensor is deprecated, and Google recommends you use the accelerometer and magnetometer to calculate orientation.
You'll need the accelerometer to detect movement, so you should use that one, since your aim is to know about this movement.
The orientation sensor gives information about the device's position compared to a reference plane. You could use it to see if the device is tilted, upside down or something like that.
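For reference, here is a plain-Java sketch of the math behind the recommended accelerometer-plus-magnetometer route (it mirrors what SensorManager.getRotationMatrix() and getOrientation() do internally; the class name and the sample field values are invented):

```java
// Sketch: build an east/north/up basis from gravity (accelerometer)
// and the geomagnetic field (magnetometer), then read out the azimuth.
public class OrientationFromAccelMag {

    static double[] cross(double[] a, double[] b) {
        return new double[] {
            a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0],
        };
    }

    static double[] normalize(double[] v) {
        double n = Math.sqrt(v[0] * v[0] + v[1] * v[1] + v[2] * v[2]);
        return new double[] { v[0] / n, v[1] / n, v[2] / n };
    }

    /** Azimuth in degrees (0 = facing magnetic north). */
    public static double azimuthDegrees(double[] gravity, double[] geomagnetic) {
        double[] east  = normalize(cross(geomagnetic, gravity)); // H = E x A
        double[] up    = normalize(gravity);
        double[] north = cross(up, east);                        // M = A x H
        // Same formula getOrientation() uses: atan2(R[1], R[4]).
        return Math.toDegrees(Math.atan2(east[1], north[1]));
    }

    public static void main(String[] args) {
        double[] gravity = { 0, 0, 9.81 }; // device flat on a table
        double[] field   = { 0, 22, -40 }; // made-up field: north + downward
        System.out.println(azimuthDegrees(gravity, field)); // 0.0
    }
}
```

This also shows why the combination fails under acceleration or magnetic disturbance: the gravity and field vectors it relies on are then no longer the true "down" and "north".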