I am developing an app on Android for which I need to calculate the movement of the device upward in the vertical direction.
For example, the device is at rest (point A) and the user picks it up (point B); there is now a height change between point A and point B. How would I calculate that?
I have already gone through the articles about sensors and accelerometers, but I couldn't really find anything to help me with that. Anyone have any ideas?
If you integrate the acceleration twice you get position, but the error is horrible. It is useless in practice. Here is an explanation of why (a Google Tech Talk), at 23:20. I highly recommend this video.
Now, if you do not need anything accurate, that is a different story. The linear acceleration is available after sensor fusion, as described in the video; see Sensor.TYPE_LINEAR_ACCELERATION at SensorEvent. I would first try a high-pass filter to detect a sudden increase in the linear acceleration along the vertical axis.
I have no idea whether it is good for your application.
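For what it's worth, a minimal sketch of that high-pass idea (assuming the device starts lying flat so its z axis is roughly vertical; the class name, filter constant, and threshold are all illustrative and would need tuning):

import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;

public class LiftDetector implements SensorEventListener {
    private static final float ALPHA = 0.8f;      // low-pass constant, needs tuning
    private static final float THRESHOLD = 1.5f;  // m/s^2, needs tuning
    private float lowPassZ = 0f;

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() != Sensor.TYPE_LINEAR_ACCELERATION) return;
        float z = event.values[2];
        lowPassZ = ALPHA * lowPassZ + (1 - ALPHA) * z;  // slowly varying component
        float highPassZ = z - lowPassZ;                 // sudden changes only
        if (highPassZ > THRESHOLD) {
            // sudden upward acceleration: the device was probably just picked up
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}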
You can actually establish (only) the vertical position without measuring acceleration over time. This is accomplished by measuring the angle between the direction to the center of the earth, and the direction to the magnetic north pole.
This angle only changes (significantly) when the altitude (height) of the phone changes. What you do is use the accelerometer and magnetometer to get two float[3] arrays, treat these as vectors, make them unit vectors, and then the angle between any two unit vectors is arccos(A·M).
Note that's the dot product, i.e. math.acos(A[0]*M[0] + A[1]*M[1] + A[2]*M[2]). Any change in this angle corresponds to a change in height. Also note that this will have to be calibrated to real units, and the ratio of change in angle to change in height will differ at different locations on Earth. But this is a method of getting an absolute value for height, though of course the angle also becomes skewed when the device is accelerating, or when there are magnets nearby :)
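A small sketch of that angle computation, assuming a and m are the float[3] readings from the accelerometer and magnetometer:

// Angle (in radians) between the accelerometer vector and the magnetometer vector.
static double angleBetween(float[] a, float[] m) {
    double normA = Math.sqrt(a[0] * a[0] + a[1] * a[1] + a[2] * a[2]);
    double normM = Math.sqrt(m[0] * m[0] + m[1] * m[1] + m[2] * m[2]);
    double dot = a[0] * m[0] + a[1] * m[1] + a[2] * m[2];
    return Math.acos(dot / (normA * normM));
}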
You can correlate it with the magnetic field sensor, whose readings are in microtesla.
You can use dist = the double integral of the acceleration over time, which in practice is approximated by a double summation over the samples; the inner integral gives speed plus an unknown constant, and integrating that gives distance.
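As a sketch of what that double summation looks like numerically (keeping in mind that, as explained above, bias and noise make the position estimate drift badly in practice):

// Trivial Euler double integration of sampled acceleration (accel in m/s^2, dt in s).
static double distanceFromAcceleration(double[] accel, double dt) {
    double velocity = 0.0;   // the "+ constant": unknown initial speed, assumed 0 here
    double position = 0.0;
    for (double a : accel) {
        velocity += a * dt;        // first integral: speed
        position += velocity * dt; // second integral: distance
    }
    return position;
}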
First of all, I assume the device is at rest at the first measurement, so the acceleration I have is just gravity; second, I won't use a low-pass filter.
Android gives me the linear acceleration and the compass values. My guess is that I can use the compass to rotate the acceleration into the Earth reference frame and so remove gravity.
My guess is that if I calculate the difference between two compass measurements, I get a quantity that indicates the rotation of the phone, and so, if I add it to the initial gravity vector, I can rotate that too.
Then at the i-th measurement, my guess is this:
import math
import numpy

# a[i] contains the 3-axis acceleration at time i
# b[i] contains the 3-axis compass values at time i

b[i] = numpy.sin(b[i] * math.pi / 180.)  # I normalize the compass values from 0 to 1
# since b is a unit vector I need to "de-normalize" it
b[i] = b[i] * numpy.linalg.norm(g)
deltab = b[i] - b[i-1]
# at the very beginning of the code I had something like g = a[0]
g = g - deltab  # well, I've also tried the plus sign
It doesn't work, but I can't see the problem. Any ideas?
EDIT: I'm also trying the following method, which again doesn't work, and I don't know why:
I've found a way to compute the rotation matrix given the axis vector and the rotation angle.
Here is how to build this matrix: http://en.wikipedia.org/wiki/Rotation_matrix#Rotation_matrix_from_axis_and_angle
The angle I've chosen is the one obtained from the scalar product of the old and the new compass directions.
The axis they rotate about is the cross product of the old and the new vectors (I guess; this may be wrong).
So: if I get the rotation of the compass, then build the rotation matrix, and then apply it to my initial gravity vector, is that the correct way to rotate the gravity vector?
from numpy import cross, dot, arccos, sqrt

# a[i] contains the 3-axis acceleration at time i
# b[i] contains the 3-axis compass values at time i

omega = cross(b[i], b[i-1])  # rotation axis
theta = arccos(dot(b[i], b[i-1]) / sqrt(dot(b[i], b[i]) * dot(b[i-1], b[i-1])))  # rotation angle
M = rotationMatrix(omega, theta)  # axis-and-angle matrix, built as in the Wikipedia link
aWithoutGravity[i] = dot(M, a[i])
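For reference, a sketch of the axis-and-angle matrix from the linked Wikipedia page (Rodrigues' formula); it is written here in Java purely for illustration, but the expressions translate directly to numpy, and it normalizes the axis first:

// Axis-and-angle rotation matrix (Rodrigues' formula); the axis need not be pre-normalized.
static double[][] rotationMatrix(double[] axis, double theta) {
    double n = Math.sqrt(axis[0]*axis[0] + axis[1]*axis[1] + axis[2]*axis[2]);
    double x = axis[0] / n, y = axis[1] / n, z = axis[2] / n;
    double c = Math.cos(theta), s = Math.sin(theta), t = 1 - c;
    return new double[][] {
        { t*x*x + c,   t*x*y - s*z, t*x*z + s*y },
        { t*x*y + s*z, t*y*y + c,   t*y*z - s*x },
        { t*x*z - s*y, t*y*z + s*x, t*z*z + c   }
    };
}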
As you describe it, this problem cannot be solved. If you had two coordinate systems, you could describe one in terms of the other, but you only have two vectors (described within the coordinate system of the phone), and single vectors introduce ambiguities, since you really only know the angle between them.
For example, consider taking an initial measurement of the gravity and magnetic field vector and now rotate and spin your phone. You can again measure the magnetic field vector, but imagine you don't know the gravity vector, where will it be? All you know is its angle relative to the magnetic field vector (and you might also assume you know its magnitude), basically forming a cone around the magnetic field vector (or ring if you assume the magnitude). But since you don't know exact information about it, you can't use it to completely determine the linear acceleration. That is, given a measured acceleration (physically, gravity + linear acceleration), every vector in the cone of possible gravity vectors would imply a different linear acceleration.
It does get you part of the way there, in that you now have a complicated geometry problem where only a certain range of linear acceleration vectors will be consistent with the "gravity cone", but the problem becomes much more complicated than a simple subtraction, and there's no a priori reason to believe that this smaller subset of possible linear acceleration vectors tells you anything useful.
For example, consider that the original reading of the compass is (1,0,0), and the reading for acceleration is (6, 6, 0). If the phone is then just rotated about (1,0,0), other readings of acceleration are possible, such as (6, -6, 0), (6, 0, 6), (6, 0, -6), and many in between. So, because of this ambiguity, one can't tell based purely on the compass reading whether (6, 6, 0) changed into, say, (6, -6, 0) because of the acceleration or because the phone was rotated.
The compass values (i.e. the values from the Android sensor Sensor.TYPE_MAGNETIC_FIELD) have components in both the gravity direction and the direction of magnetic north. So you can use them, in conjunction with the gravity vector, to work out the direction of east, because that is given by the vector cross product of the two vectors. In fact, depending on where you are in the world, the Sensor.TYPE_MAGNETIC_FIELD component in the gravity direction can be quite large.
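A sketch of that cross product, assuming gravity and magnetic are the float[3] readings from the two sensors (the result points east only up to sign and scale, so it is normalized here):

// East is perpendicular to both the gravity vector and the magnetic field vector,
// i.e. their cross product (up to sign), normalized to a unit vector.
static float[] eastDirection(float[] gravity, float[] magnetic) {
    float[] east = {
        gravity[1] * magnetic[2] - gravity[2] * magnetic[1],
        gravity[2] * magnetic[0] - gravity[0] * magnetic[2],
        gravity[0] * magnetic[1] - gravity[1] * magnetic[0]
    };
    float norm = (float) Math.sqrt(east[0]*east[0] + east[1]*east[1] + east[2]*east[2]);
    for (int i = 0; i < 3; i++) east[i] /= norm;
    return east;
}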
Also note that Android has two sensors, Sensor.TYPE_GRAVITY and Sensor.TYPE_LINEAR_ACCELERATION, which are derived from Sensor.TYPE_ACCELEROMETER. And it's always true that the readings satisfy
Sensor.TYPE_ACCELEROMETER = Sensor.TYPE_GRAVITY + Sensor.TYPE_LINEAR_ACCELERATION
I am working on an Android app that requires the detection of vertical motion. When moving the tablet upward, the gyroscope, accelerometer, and linear acceleration sensors give a corresponding value indicating upward or downward motion.
The problem I have is that these sensors will also read an upward/downward motion when you tilt the tablet towards the user or away from the user. For example, the x value in the gyroscope represents the vertical plane. But when you tilt the device forwards, the x value will change.
When I make this motion, the same sensor that reads vertical motion reads a value for this.
The same goes for the rest of the sensors. I have tried using orientation coupled with the gyro in a conditional statement: if the pitch is not changing but the x value is going up or down, then we have vertical motion. The problem with this is that if the user moves the device up while it is tilted slightly, it no longer works. I also tried treating any change in tilt as meaning there is no vertical motion, but the loop iterates so quickly that there may be a change in tilt for 1/100 of a second and none in the next.
Is there any way I can read only vertical changes and not changes in the device's pitch?
Here is what I want to detect:
edit:
"Please come up with a mathematically sound definition of what you consider 'moving upwards.'"
This was my initial question: how can I write a function to determine when the tablet is moving upwards or downwards? I consider a vertical translation to be moving upwards. Now how do I detect this? I simply do not know where to begin; thank you.
Ok, even though this question is fairly old, I see a lot of confusion in the present answer and comments, so in case anyone finds this, I intend to clear a few things up.
The Gyroscope
First of all, the gyroscope does not measure vertical motion as per your definition (a translatory motion). It measures rotation around each of the axes, which are defined as in the figure below. Thus, tilting your device forwards and backwards indeed rotates it around the x axis, and therefore you will see non-zero values in the x value of your gyroscope sensor.
the x value in the gyroscope represents the vertical plane.
I'm not sure what is meant by "the vertical plane", however the x value certainly does not represent the plane itself nor the orientation of the device within the plane.
The x value of the gyroscope sensor represents the current angular velocity of the device around the x axis (i.e. the rate at which the rotation around that axis is changing).
But when you tilt the device forwards, the x value will change. When I make this motion, the same sensor that reads vertical motion reads a value for this.
Not quite sure what you're referring to here. "The same sensor that reads vertical motion" I assume is the gyroscope, but as previously said, it does not read vertical motion. It does exactly what it says on the tin.
The device coordinate system
This is more in response to user Ali's answer than the original question, but it remains relevant in either case.
The individual outputs of the linear acceleration sensor (or any other sensor for that matter) are expressed in the coordinate system of the device, as shown in the image above. This means if you rotate the device slightly, the outputs will no longer be parallel to any world axis they coincided with before. As such, you will either have to enforce that the device is in a particular orientation for your application, or take the new orientation into account.
The ROTATION_VECTOR sensor, combined with quaternion math or the getRotationMatrixFromVector() method, is one way to translate your measurements from device coordinates to world coordinates. There are other ways to achieve the same goal, but once achieved, the way you hold your device won't matter for measuring vertical motion.
In either case, the axis you're looking for is the y axis, not the z axis.
(If by any chance you meant "along device y axis" as "vertical", then just ignore all the orientation stuff and just use the linear acceleration sensor)
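As a sketch of the ROTATION_VECTOR approach (variable names are illustrative; rotationVector and linearAccel hold the latest readings from the two sensors):

// Rotate the device-frame linear acceleration into the world frame, so the result
// no longer depends on how the device is held. In the world frame x points east,
// y points toward magnetic north, and z points up, away from the ground.
float[] R = new float[9];
SensorManager.getRotationMatrixFromVector(R, rotationVector);

float[] worldAccel = new float[3];
for (int row = 0; row < 3; row++) {
    worldAccel[row] = R[3 * row] * linearAccel[0]
                    + R[3 * row + 1] * linearAccel[1]
                    + R[3 * row + 2] * linearAccel[2];
}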
Noise
You mentioned some problems regarding noise and update rates in the question, so I'll just mention it here. The simplest and one of the more common ways to get nice, consistent data from something that varies very often is to use a low-pass filter. What type of filter is best depends on the application, but I find that an exponential moving average filter is viable in most cases.
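A minimal exponential moving average filter, for illustration (ALPHA is an assumed constant you would tune to your sensor rate):

// Exponential moving average (low-pass) filter for a 3-axis sensor reading.
// ALPHA closer to 1 means heavier smoothing.
public class LowPassFilter {
    private static final float ALPHA = 0.9f;
    private final float[] state = new float[3];
    private boolean initialized = false;

    public float[] filter(float[] input) {
        if (!initialized) {
            System.arraycopy(input, 0, state, 0, 3);
            initialized = true;
        } else {
            for (int i = 0; i < 3; i++) {
                state[i] = ALPHA * state[i] + (1 - ALPHA) * input[i];
            }
        }
        return state.clone();
    }
}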
Finishing thoughts
Note that if you take proper care of the orientation, your transformed linear acceleration output will be a good approximation of vertical motion (well, change in motion) without filtering any noise.
Also, if you want to measure vertical "motion", as in velocity, you need to integrate the accelerometer output. For various reasons, this doesn't really turn out too well in most cases, although it is less severe in the case of velocity rather than trying to measure position.
OK, I suspect it is only a partial answer.
If you want to detect vertical movement, you only need the linear acceleration; the device orientation doesn't matter. See
iOS - How to tell if device is raised/dropped (CoreMotion)
or
how to calculate phone's movement in the vertical direction from rest?
For some reason you are concerned with the device orientation as well, and I have no idea why. I suspect that you want to detect something else. So please tell us more and then I will improve my answer.
UPDATE
I read the post on CoreMotion, and you mentioned that higher z and lower x and y means vertical motion; can you elaborate?
I will write in pseudo code. You measured the (x, y, z) linear acceleration vector. Compute
rel_z = z/sqrt(x^2+y^2+z^2+1.0e-6)
If rel_z > 0.9 then the acceleration towards the z direction dominates (vertical motion). Note that the constant 0.9 is arbitrary and may require tweaking (should be a positive number less than 1). The 1.0e-6 is there to avoid accidental division by zero.
You may have to add another constraint that z is sufficiently large. I don't know your device, whether it measures gravity as 1 or 9.81. I assume it measures it as 1.
So all in all:
if (rel_z > 0.9 && abs(z) > 0.1) {
    // we have vertical movement
}
Again, the constant 0.1 is arbitrary and may require tweaking. It should be positive.
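Putting the pseudo code together, a sketch in Java using a TYPE_LINEAR_ACCELERATION reading; since Android reports m/s^2, the values are divided by 9.81 first to match the gravity-as-1 assumption above, and both constants still need tweaking:

// True when the linear acceleration is dominated by its z component and is large enough.
static boolean isVerticalMotion(float[] linearAccel) {
    float x = linearAccel[0] / 9.81f;
    float y = linearAccel[1] / 9.81f;
    float z = linearAccel[2] / 9.81f;
    float relZ = (float) (z / Math.sqrt(x * x + y * y + z * z + 1.0e-6));
    return relZ > 0.9f && Math.abs(z) > 0.1f;
}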
UPDATE 2
I do not want this because rotating it towards me is not moving it upwards
It is moving upwards: The center of mass is moving upwards. My code has the correct behavior.
Please come up with a mathematically sound definition of what you consider "moving upwards."
I'm working on a Pedestrian Navigation System on Android. I am currently trying to get the rotation matrix between the 3-axis accelerometer referential and the motion that is applied to the device.
Let's say you are walking straight forward with the device in your hand, but the Y axis of the accelerometer (thus the device's Y axis) is not oriented in the direction you are heading (basically, you're holding the device in an awkward way).
Then, if I want to apply distances (based on step detection), it would be wrong to apply them in the accelerometer referential (whose orientation I know): they have to be rotated according to the motion heading.
This is why I would like to know if you could enlighten me about a method for turning the accelerometer readings into rotation angles (or a matrix). Such a method has to avoid integrating the acceleration even once, as the error is abysmal on cheap accelerometers (otherwise it would be pretty easy to do, I think).
EDIT: The magnetometer and gyroscope help you find the device's orientation even when stationary, but they don't tell you in which direction the device is moving. I have the first; I'm searching for the second. Basically:
Human Referential (motion direction) -> Device Referential (or accelerometer referential) -> Earth Referential
and I'm searching for the rotation matrix from HR to DR so I can transform the distances, which I would then combine with the rotation matrix I already found to go from DR to ER.
I found an article which might be useful, here is the link.
The idea is based upon acquiring the accelerometer data along the 3 axes (an n-by-3 matrix) and finding the PCA (principal component analysis) of that matrix. The PCA gives the 3 vectors carrying the highest energy in the matrix.
Ideally, the direction of the main vector (the one with the highest energy) is upwards, and the second is the heading direction.
You can read it all in the article.
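As a rough sketch of the idea (not the article's exact implementation): build the 3x3 second-moment ("energy") matrix of the samples and estimate its dominant eigenvector by power iteration; since gravity dominates the readings, that vector should point roughly upwards:

// samples is the n x 3 accelerometer data. Returns a unit vector along the
// highest-energy direction, estimated by power iteration on the 3x3 moment matrix.
static double[] dominantDirection(double[][] samples) {
    double[][] m = new double[3][3];
    for (double[] s : samples)
        for (int i = 0; i < 3; i++)
            for (int j = 0; j < 3; j++)
                m[i][j] += s[i] * s[j] / samples.length;

    double[] v = {1, 0, 0};  // arbitrary starting vector
    for (int iter = 0; iter < 50; iter++) {
        double[] w = new double[3];
        for (int i = 0; i < 3; i++)
            for (int j = 0; j < 3; j++) w[i] += m[i][j] * v[j];
        double norm = Math.sqrt(w[0] * w[0] + w[1] * w[1] + w[2] * w[2]);
        for (int i = 0; i < 3; i++) v[i] = w[i] / norm;
    }
    return v;
}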
I tried implementing the algorithm in MATLAB; the result is OK (not great).
Hope you can do better (I would like to hear about good results).
Hope this helps
Ariel
When I listen to the orientation event in an Android app, I get a SensorEvent, which contains 3 floats: azimuth, pitch, and roll, relative to the real-world axes.
Now say I am building an app like Labyrinth, but I don't want to force the user to be over the phone and hold it such that the xy plane is parallel to the ground. Instead I want to allow the user to hold the phone as they wish, lying down or, perhaps, sitting and holding the phone at an angle. In other words, I need to calibrate the phone in accordance with the user's preference.
How can I do that?
Also note that I believe that my answer has to do with getRotationMatrix and getOrientation, but I am not sure how!
Please help! I've been stuck at this for hours.
For a Labyrinth-style app, you probably care more about the acceleration (gravity) vector than the axes' orientation. This vector, in the phone coordinate system, is given by the combination of the three accelerometer measurements, rather than by the rotation angles. Specifically, only the x and y readings should affect the ball's motion.
If you do actually need the orientation, then the 3 angular readings represent the 3 Euler angles. However, I suspect you probably don't really need the angles themselves, but rather the rotation matrix R, which is returned by the getRotationMatrix() API. Once you have this matrix, it is basically the calibration you are looking for. When you want to transform a vector in world coordinates to your device coordinates, you should multiply it by the inverse of this matrix (where, in this special case, inv(R) = transpose(R)).
So, following the example I found in the documentation, if you want to transform the world gravity vector g ([0 0 g]) to the device coordinates, multiply it by inv(R):
g = inv(R) * g
(note that this should give you the same result as reading the accelerometers)
Possible APIs to use here: the invertM() and multiplyMV() methods of the Matrix class.
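For illustration, here is the same calculation done by hand on the 3x3 matrix, using inv(R) = transpose(R) for a rotation (accel and magnetic are the latest float[3] readings from the accelerometer and magnetometer; doing the multiplication manually sidesteps the column-major convention of android.opengl.Matrix):

// getRotationMatrix() fills R (row-major) so that R maps device coordinates to world
// coordinates; multiplying by its transpose therefore maps world back to device.
float[] R = new float[9];
if (SensorManager.getRotationMatrix(R, null, accel, magnetic)) {
    float[] gWorld = {0f, 0f, SensorManager.GRAVITY_EARTH};  // [0 0 g] in world coordinates
    float[] gDevice = new float[3];
    for (int i = 0; i < 3; i++) {
        // row i of transpose(R) is column i of R
        gDevice[i] = R[i] * gWorld[0] + R[i + 3] * gWorld[1] + R[i + 6] * gWorld[2];
    }
    // gDevice should roughly match the raw accelerometer reading while at rest
}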
I don't know of any Android-specific APIs, but all you want to do is decrease the azimuth by a certain amount, right? So you move the "origin" from (0,0,0) to whatever they want. In pseudocode:
myGetRotationMatrix:
return getRotationMatrix() - origin
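A sketch of that idea with the SensorManager APIs: capture a baseline orientation when the user taps a calibrate button, then report angles relative to it (subtracting Euler angles is only an approximation; composing rotation matrices would be more robust, but it matches the pseudocode above):

// baseline holds azimuth, pitch, roll (radians) captured at calibration time.
private final float[] baseline = new float[3];

void calibrate(float[] R) {                    // R from SensorManager.getRotationMatrix()
    SensorManager.getOrientation(R, baseline);
}

float[] relativeOrientation(float[] R) {
    float[] current = new float[3];
    SensorManager.getOrientation(R, current);
    float[] relative = new float[3];
    for (int i = 0; i < 3; i++) relative[i] = current[i] - baseline[i];
    return relative;
}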
At present I am working on the HAL part of sensors in the Android SDK. We are using a 3-axis BMA-150 accelerometer sensor to get acceleration values with respect to the X, Y, Z axes. I want to know whether this sensor gives its output directly in SI units, whether some calibration technique is involved, or what. I also noticed that in the sensor.c file they mention
720.0 LSG = 1G (9.8 m/s2). What is the relation between LSG and the acceleration due to gravity?
What is meant by LSG?
Why are they multiplying the accelerometer x, y, z output values by 9.8/720.0f? Please help with this part.
Thanks
Vinay
Without knowing anything about the device: 9.8 meters per second squared is the value of gravitational acceleration at the Earth's surface. The equation you quote seems to be the definition of "LSG", and the only thing that makes sense to define in this context is the unit in which the output is provided. So the device will probably output '720.0' on the axis that is vertical while at rest. By multiplying by 9.8/720.0 you renormalize the value to the SI unit (m/s^2).
(Average gravitational acceleration, denoted G (that explains the 1G), is 9.80665 m/s^2, but it varies by about half a percent between the equator and the poles, and the device is probably unable to give more than two significant digits of precision anyway.)
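So the conversion amounts to something like this (a trivial sketch; the helper name is illustrative):

// A raw BMA-150 count of 720 corresponds to 1 g, so scaling by 9.8/720.0 yields m/s^2,
// exactly as done in sensor.c.
static float countsToMetersPerSecondSquared(int rawCount) {
    return rawCount * (9.8f / 720.0f);
}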