I want to know the X, Y and Z axis values for any position/movement of the device so I can use them in my further work. From what I have found there are two options: the orientation sensor (which gives values in degrees, as azimuth, pitch and roll) and the accelerometer (which gives values roughly between -10 and 10 for x, y and z).
As far as I understand, both would serve my purpose, and I can't see the difference between them. Can anyone explain them in detail with respect to my aim? Which sensor should I use?
There are differences between the two:
The accelerometer detects acceleration in space. The reason it always reports an acceleration of 9.8 m/s² downwards is that gravity is equivalent to an acceleration in space.
The orientation sensor detects how the device's axes are rotated relative to the real world; it reports tilts and the heading in degrees from magnetic north. Note that this sensor is deprecated and Google recommends using the accelerometer and magnetometer together to calculate orientation.
You need the accelerometer to detect movement, so that is the one to use, since your aim is to detect that movement.
The orientation sensor gives information about the device's attitude relative to a reference plane, so you could use it to see whether the device is tilted, upside down, or the like.
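For illustration (not part of the original answer), a minimal sketch of the replacement Google recommends for the deprecated orientation sensor: combine the accelerometer and magnetometer via SensorManager. Listener registration for TYPE_ACCELEROMETER and TYPE_MAGNETIC_FIELD is assumed to happen elsewhere, and identifiers here are illustrative.

// latest samples from the two physical sensors
private final float[] accel = new float[3];
private final float[] magnet = new float[3];

@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
        System.arraycopy(event.values, 0, accel, 0, 3);
    } else if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) {
        System.arraycopy(event.values, 0, magnet, 0, 3);
    }

    float[] rotation = new float[9];
    float[] orientation = new float[3];
    if (SensorManager.getRotationMatrix(rotation, null, accel, magnet)) {
        SensorManager.getOrientation(rotation, orientation);
        // orientation[0], [1], [2] = azimuth, pitch, roll in radians
        float azimuthDeg = (float) Math.toDegrees(orientation[0]);
    }
}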
Related
I'm using both the gyroscope and the accelerometer on Android. All I do is display the values from both sensors. What I don't get is why, to track the device's acceleration, I apparently have to use the gyroscope, and why the device's orientation is given by the accelerometer.
I have tested this code on 2 tablets and 3 phones, and the results are the same.
Listeners:
// gyroscope sensor
sensorManager.registerListener(this, sensorManager.getDefaultSensor(Sensor.TYPE_GYROSCOPE), SensorManager.SENSOR_DELAY_NORMAL);
// accelerometer sensor
sensorManager.registerListener(this, sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER), SensorManager.SENSOR_DELAY_NORMAL);
And to get the results I implement SensorEventListener:
@Override
public void onSensorChanged(SensorEvent sensorEvent) {
    switch (sensorEvent.sensor.getStringType()) {
        case Sensor.STRING_TYPE_ACCELEROMETER:
            ((TextView) findViewById(R.id.sensor_accel_data_textView)).setText(String.valueOf(sensorEvent.values[0]));
            ((TextView) findViewById(R.id.sensor_accel_data_textView2)).setText(String.valueOf(sensorEvent.values[1]));
            ((TextView) findViewById(R.id.sensor_accel_data_textView3)).setText(String.valueOf(sensorEvent.values[2]));
            break;
        case Sensor.STRING_TYPE_GYROSCOPE:
            ((TextView) findViewById(R.id.sensor_gyro_data_textView)).setText(String.valueOf(sensorEvent.values[0]));
            ((TextView) findViewById(R.id.sensor_gyro_data_textView2)).setText(String.valueOf(sensorEvent.values[1]));
            ((TextView) findViewById(R.id.sensor_gyro_data_textView3)).setText(String.valueOf(sensorEvent.values[2]));
            break;
    }
}
They are not inverted. The accelerometer gives you ax, ay, az, which are accelerations along the three axes. The gyroscope gives you gx, gy, gz, which are angular velocities around the three axes.
Those two sensors can be used independently.
The accelerometer does not give you orientation. There is an orientation sensor, but it is deprecated. Sensor values are expressed relative to the device's own axes, though there are ways to make them orientation independent.
You can install a sensor app from the Play Store and compare its output with your values for testing purposes.
This question is almost a year old, but I just stumbled onto it and got the impression that the root of the question is a misunderstanding of what each sensor actually measures.
So, hopefully the following explanation clarifies a bit that the sensors are not switched, but that in fact accelerometers are used to detect orientation and that the (imho badly named) gyroscope does not provide an absolute orientation.
Accelerometer
You should imagine an accelerometer as a sample mass, which is held by some springs, because that is what an accelerometer is (you can search for MEMS accelerometer to find examples on how this is actually implemented on a micrometer scale). If you accelerate the device, the mass will push against the springs because of inertia and the tiny deflection of the mass is measured to detect the acceleration. However, the mass is not only deflected by an actual acceleration but also by gravitational pull. So, if your phone is resting on the table, the mass is still deflected towards the ground.
So, the "-10" to "10" you see is earth's acceleration at 9.81 m/s². From a physics perspective, this is confusing because the resting device is obviously not being accelerated while the sensor still shows 9.81 m/s², so we get the device acceleration plus earth's acceleration from the sensor. While this is the nightmare of any physics teacher, it is extremely helpful because it tells you where "down" is and hence gives you the orientation of the device (except for rotations around the axis of gravity).
Gyroscope
The sensor called "gyroscope" is another sample mass in your device, which is actively driven to vibrate. Because of the vibration movement it is subject to the Coriolis force (in its frame of reference) and gets deflected when rotating (searching for "MEMS gyroscope" yields even more astonishing devices, especially if you think about the fact that they can detect the Coriolis force on that scale).
However, the deflection does not allow you to determine an absolute angle (as an old-fashioned gyroscope would), but instead it is proportional to the angular velocity (rate of rotation). Therefore, you can expect a reading of zero on all axes for a resting device and will only see a reading for any axis as long as you are rotating the device about this axis.
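A hedged sketch of what that means in code: the gyroscope only gives you a rate, so an angle has to be accumulated by integrating over time, and the estimate drifts without correction (field names here are illustrative):

private long lastTimestamp = 0;   // nanoseconds, from the SensorEvent
private double angleAroundZ = 0;  // accumulated rotation about the device z axis, in radians

@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() != Sensor.TYPE_GYROSCOPE) return;
    if (lastTimestamp != 0) {
        double dt = (event.timestamp - lastTimestamp) * 1.0e-9;  // ns -> s
        angleAroundZ += event.values[2] * dt;                    // values[2] = angular rate around z (rad/s)
    }
    lastTimestamp = event.timestamp;
}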
Gyroscope vs Accelerometer:
The accelerometer measures acceleration, i.e. how quickly the device's velocity changes along each axis (X, Y and Z).
The gyroscope measures how fast the device rotates around each axis (X, Y and Z).
Initially, with the device at rest, your X, Y or Z reading will contain Earth's gravitational acceleration, measured in metres per second squared: 9.81 m/s².
I am working on an android app that requires the detection of vertical motion. When moving the tablet upward, the Gyroscope, Accelerometer, and Linear Acceleration sensors give a corresponding value indicating upward or downward motion.
The problem I have is that these sensors will also read an upward/downward motion when you tilt the tablet towards the user or away from the user. For example, the x value in the gyroscope represents the vertical plane. But when you tilt the device forwards, the x value will change.
When I make this motion, the same sensor that reads vertical motion reads a value for this.
The same goes for the rest of the sensors. I have tried using orientation together with the gyro in a conditional: if the pitch is not changing but the x value is going up or down, then we have vertical motion. The problem is that if the user moves the device up while it is tilted slightly, this no longer works. I also tried treating any change in tilt as "no vertical motion", but the loop iterates so quickly that there may be a change in tilt for 1/100 of a second and none in the next.
Is there any way I can read only vertical changes and not changes in the device's pitch?
Here is what I want to detect (illustrated with an image in the original post):
edit:
"Please come up with a mathematically sound definition of what you consider 'moving upwards.'"
This was my initial question, how can I write a function to define when the tablet is moving upwards or downwards? I consider a vertical translation moving upwards. Now how do I detect this? I simply do not know where to begin, thank you.
Ok, even though this question is fairly old, I see a lot of confusion in the present answer and comments, so in case anyone finds this, I intend to clear a few things up.
The Gyroscope
First of all, the gyroscope does not measure vertical motion as per your definition (a translatory motion). It measures rotation around each of the axes, which are defined by Android's device coordinate system (x to the right of the screen, y toward the top, z out of the front face). Thus, tilting your device forwards and backwards does rotate it around the x axis, and therefore you will see non-zero values in the x value of your gyroscope sensor.
the x value in the gyroscope represents the vertical plane.
I'm not sure what is meant by "the vertical plane", however the x value certainly does not represent the plane itself nor the orientation of the device within the plane.
The x value of the gyroscope sensor represents the current angular velocity of the device around the x axis (eg. the change in rotation).
But when you tilt the device forwards, the x value will change. When I make this motion, the same sensor that reads vertical motion reads a value for this.
Not quite sure what you're referring to here. "The same sensor that reads vertical motion" I assume is the gyroscope, but as previously said, it does not read vertical motion. It does exactly what it says on the tin.
The device coordinate system
This is more in response to user Ali's answer than the original question, but it remains relevant in either case.
The individual outputs of the linear acceleration sensor (or any other sensor for that matter) are expressed in the device coordinate system described above. This means that if you rotate the device slightly, the outputs will no longer be parallel to whichever world axes they coincided with before. As such, you will either have to enforce that the device is held in a particular orientation for your application, or take the changing orientation into account.
The ROTATION_VECTOR sensor, combined with quaternion math or the getRotationMatrixFromVector() method, is one way to translate your measurements from device coordinates to world coordinates. There are other ways to achieve the same goal, but once achieved, the way you hold your device won't matter for measuring vertical motion.
In either case, the axis you're looking for is the y axis, not the z axis.
(If by any chance you meant "along device y axis" as "vertical", then just ignore all the orientation stuff and just use the linear acceleration sensor)
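As a rough sketch of the device-to-world transformation mentioned above (identifiers are illustrative, and listeners for TYPE_ROTATION_VECTOR and TYPE_LINEAR_ACCELERATION are assumed to be registered already):

private final float[] rotationMatrix = new float[9];

@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR) {
        SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);
    } else if (event.sensor.getType() == Sensor.TYPE_LINEAR_ACCELERATION) {
        float[] device = event.values;   // acceleration in device coordinates
        float[] world = new float[3];    // acceleration in world coordinates
        for (int i = 0; i < 3; i++) {
            world[i] = rotationMatrix[3 * i] * device[0]
                     + rotationMatrix[3 * i + 1] * device[1]
                     + rotationMatrix[3 * i + 2] * device[2];
        }
        // world[] is now expressed in Android's world frame
        // (X roughly east, Y roughly magnetic north, Z up), per the SensorManager docs.
    }
}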
Noise
You mentioned some problems regarding noise and update rates in the question, so I'll just mention it here. The simplest and one of the more common ways to get nice, consistent data from something that varies very often is to use a low-pass filter. What type of filter is best depends on the application, but I find that an exponential moving average filter is viable in most cases.
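For example, a minimal exponential moving average filter for a single axis might look like this (ALPHA is an assumed smoothing factor you would tune; smaller values smooth more but respond more slowly):

private static final float ALPHA = 0.1f;  // assumed smoothing factor
private float filtered = 0f;

float lowPass(float raw) {
    filtered = filtered + ALPHA * (raw - filtered);  // move the estimate toward the new sample
    return filtered;
}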
Finishing thoughts
Note that if you take proper care of the orientation, your transformed linear acceleration output will be a good approximation of vertical motion (well, change in motion) even without any noise filtering.
Also, if you want to measure vertical "motion" in the sense of velocity, you need to integrate the accelerometer output. For various reasons this doesn't turn out too well in most cases, although the drift is less severe for velocity than it is for position.
OK, I suspect it is only a partial answer.
If you want to detect vertical movement, you only need linear acceleration, the device orientation doesn't matter. See
iOS - How to tell if device is raised/dropped (CoreMotion)
or
how to calculate phone's movement in the vertical direction from rest?
For some reason you are concerned with the device orientation as well, and I have no idea why. I suspect that you want to detect something else. So please tell us more and then I will improve my answer.
UPDATE
I read the post on CoreMotion, and you mentioned that a higher z with lower x and y means vertical motion; can you elaborate?
I will write in pseudo code. You measured the (x, y, z) linear acceleration vector. Compute
rel_z = z/sqrt(x^2+y^2+z^2+1.0e-6)
If rel_z > 0.9 then the acceleration towards the z direction dominates (vertical motion). Note that the constant 0.9 is arbitrary and may require tweaking (should be a positive number less than 1). The 1.0e-6 is there to avoid accidental division by zero.
You may have to add another constraint that z is sufficiently large. I don't know your device, whether it measures gravity as 1 or 9.81. I assume it measures it as 1.
So all in all:
if (rel_z > 0.9 && abs(z) > 0.1) { // we have vertical movement
Again, the constant 0.1 is arbitrary and may require tweaking. It should be positive.
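A direct Java translation of the pseudocode above, keeping the same arbitrary thresholds:

// x, y, z = the measured linear acceleration vector
boolean isVerticalMotion(float x, float y, float z) {
    double relZ = z / Math.sqrt(x * x + y * y + z * z + 1.0e-6);  // 1.0e-6 avoids division by zero
    return relZ > 0.9 && Math.abs(z) > 0.1;                       // both constants may need tweaking
}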
UPDATE 2
I do not want this because rotating it towards me is not moving it upwards
It is moving upwards: The center of mass is moving upwards. My code has the correct behavior.
Please come up with a mathematically sound definition of what you consider "moving upwards."
Android provides sensor data in the device coordinate system no matter how the device is oriented. Is there any way to get sensor data in a "gravity" coordinate system? I mean, no matter how the device is oriented, I want accelerometer data and orientation in a coordinate system where the y-axis points toward the sky, the x-axis toward the east and the z-axis toward the south pole.
I took a look at remapCoordinateSystem, but it seems to be limited to only swapping axes. I guess for the orientation I will have to do some low-level rotation matrix transformation (or is there a better solution?). But what about the acceleration data? Is there any way to get the data relative to a fixed coordinate system (a sort of world coordinate system)?
The reason I need this is that I'm trying to detect some simple motion gestures while the phone is in a pocket, and it would be easier for me to have all the data in a coordinate system related to the user rather than the device coordinate system (which will be oriented a little differently in different users' pockets).
Well you basically get the North orientation when starting - for this you use the accelerometer and the magnetic field sensor to compute orientation (or the deprecated Orientation sensor).
Once you have it you can compute a rotation matrix (Direction Cosine Matrix) from those azimuth, pitch and roll angles. Multiplying your acceleration vector by that matrix will transform your device-frame movements into Earth-frame ones.
As your device will change its orientation over time, you'll need to keep this matrix up to date. To do so, retrieve the gyroscope's data and update your Direction Cosine Matrix with each new value. You could also recompute the orientation from the accelerometer and magnetometer each time, just like the first time, but it's less accurate.
My solution involves DCM, but you could also use quaternions, it's just a matter of choice. Feel free to ask more if needed. I hope this is what you wanted to know !
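A rough sketch of the accelerometer-plus-magnetometer route using Android's SensorManager (accelValues and magnetValues are assumed to hold the latest samples kept by your SensorEventListener; identifiers are illustrative):

float[] R = new float[9];
if (SensorManager.getRotationMatrix(R, null, accelValues, magnetValues)) {
    float[] earth = new float[3];
    for (int i = 0; i < 3; i++) {
        // rotate the device-frame acceleration vector into the Earth frame
        earth[i] = R[3 * i] * accelValues[0]
                 + R[3 * i + 1] * accelValues[1]
                 + R[3 * i + 2] * accelValues[2];
    }
    // earth[] now holds the acceleration in Android's Earth frame
    // (X roughly east, Y roughly magnetic north, Z up).
}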
What is the difference between gravity and acceleration sensors in Android? From my point of view the physical value is the same in both cases.
Which one measures the force acting on unit mass inside the device?
ADDITION
The question is: what physical quantity is measured by these sensors? According to the equivalence principle, acceleration and gravity are indistinguishable, and the only way to measure both is with an ordinary (but three-dimensional) spring balance.
The acceleration sensor gives you back the combined effect of all forces applied to your device (including gravity), while the gravity sensor returns only the influence of gravity. If you want to exclude gravity from the acceleration, you may use a high-pass filter, or just subtract the gravity sensor values from the acceleration sensor values -- I'm not sure which method gives better precision.
Another method is to use Sensor.TYPE_LINEAR_ACCELERATION, which gives exactly (acceleration - gravity); however, you should check whether it's available on the device. I've found a few devices that have a working ACCELERATION sensor but no response from the GRAVITY or LINEAR_ACCELERATION sensors.
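A small sketch of that availability check, together with the high-pass-style fallback of estimating gravity with a low-pass filter and subtracting it (ALPHA and the method name are assumptions, not part of the original answer):

// Availability check for the virtual sensor (illustrative):
boolean hasLinearAcceleration =
        sensorManager.getDefaultSensor(Sensor.TYPE_LINEAR_ACCELERATION) != null;

// Fallback when it is missing: low-pass filter the raw accelerometer to
// estimate gravity, and subtract it to get linear acceleration.
static final float ALPHA = 0.8f;          // assumed smoothing constant, tune as needed
float[] gravity = new float[3];
float[] linearAccel = new float[3];

void onAccelerometerSample(float[] values) {
    for (int i = 0; i < 3; i++) {
        gravity[i] = ALPHA * gravity[i] + (1 - ALPHA) * values[i];  // low-pass -> gravity estimate
        linearAccel[i] = values[i] - gravity[i];                    // remainder -> linear acceleration
    }
}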
This link might be helpful: http://www.sensorplatforms.com/which-sensors-in-android-gets-direct-input-what-are-virtual-sensors
The following excerpt summarizes the answer to your question:
"The list... includes both physical sensors and sensor types with values derived from physical sensors, sometimes these are called virtual sensors... The virtual sensors types (gravity, linear acceleration, and rotation vector) provide values derived by combining the results from physical sensors intelligently... The rotation vector is a combination of the accelerometer, the magnetometer, and sometimes the gyroscope to determine the three-dimensional angle along which the Android device lays with respect to the Earth frame coordinates. By knowing the rotation vector of a device, accelerometer data can be separated into gravity and linear acceleration."
This link https://stackoverflow.com/a/8238389/604964 also says that the gravity values are computed using a Butterworth filter.
I am developing an Android app for which I need to calculate the device's movement upward in the vertical direction.
For example, the device is at rest (point A) and the user picks it up (point B); there is now a height change between point A and point B. How would I calculate that?
I have already gone through the articles about sensors and accelerometers, but I couldn't really find anything to help me with this. Does anyone have any ideas?
If you integrate the acceleration twice you get position but the error is horrible. It is useless in practice. Here is an explanation why (Google Tech Talk) at 23:20. I highly recommend this video.
Now, you do not need anything accurate and that is a different story. The linear acceleration is available after sensor fusion, as described in the video. See Sensor.TYPE_LINEAR_ACCELERATION at SensorEvent. I would first try a high-pass filter to detect sudden increase in the linear acceleration along the vertical axis.
I have no idea whether it is good for your application.
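If it helps, one possible (untested) shape for that high-pass idea is sketched below; verticalAccel is assumed to be the vertical component of TYPE_LINEAR_ACCELERATION, and both constants are guesses that would need tuning:

private static final float ALPHA = 0.05f;    // smoothing factor for the slow baseline
private static final float THRESHOLD = 2.0f; // m/s^2, assumed, tune per device
private float baseline = 0f;

boolean suddenVerticalAcceleration(float verticalAccel) {
    float highPassed = verticalAccel - baseline;     // remove the slow-moving part
    baseline += ALPHA * (verticalAccel - baseline);  // update the baseline estimate
    return Math.abs(highPassed) > THRESHOLD;         // flag a sudden jump
}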
You can actually establish (only) the vertical position without measuring acceleration over time. This is accomplished by measuring the angle between the direction to the center of the earth, and the direction to the magnetic north pole.
This only changes (significantly) when the altitude (height) of the phone changes. What you do is use the accelerometer and magnetometer to get two float[3] arrays, treat these as vectors, make them unit vectors, and then the angle between any two unit vectors is arccos(A · M).
Note that's a dot product, i.e. Math.acos(A[0]*M[0] + A[1]*M[1] + A[2]*M[2]). Any change in this angle corresponds to a change in height. Also note that this will have to be calibrated to real units, and the ratio of change in angle to change in height will differ at various longitudes. But this is a method of getting an absolute value for height, though of course the angle also becomes skewed when undergoing acceleration, or when there are nearby magnets :)
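In code, the angle computation described above might look like this (a and m being the latest accelerometer and magnetometer float[3] samples):

double angleBetween(float[] a, float[] m) {
    double na = Math.sqrt(a[0] * a[0] + a[1] * a[1] + a[2] * a[2]);  // |A|
    double nm = Math.sqrt(m[0] * m[0] + m[1] * m[1] + m[2] * m[2]);  // |M|
    double dot = (a[0] * m[0] + a[1] * m[1] + a[2] * m[2]) / (na * nm);
    return Math.acos(Math.max(-1.0, Math.min(1.0, dot)));           // clamp against rounding errors
}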
You can also correlate it with the magnetic field sensor, which reports values in microtesla.
You can compute distance as the double integral of acceleration over time (numerically, a double summation): speed = ∫ acceleration dt + constant, and distance = ∫ speed dt + constant.
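For completeness, a naive Euler-style double integration for one axis is sketched below; as noted earlier, drift makes this unusable for more than very short intervals without additional corrections (names are illustrative):

private double velocity = 0, distance = 0;
private long lastNs = 0;

void onAccelSample(float accel, long timestampNs) {
    if (lastNs != 0) {
        double dt = (timestampNs - lastNs) * 1.0e-9;  // ns -> s
        velocity += accel * dt;      // first integration: acceleration -> velocity
        distance += velocity * dt;   // second integration: velocity -> position
    }
    lastNs = timestampNs;
}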