How do I filter noise from the accelerometer data in Android? I would like to create a high-pass filter for my sample data so that I can eliminate the low-frequency components and focus on the high-frequency ones. I have read that a Kalman filter might be the best candidate for this, but how do I integrate or use this method in my application, which will be written mostly in Android Java? Or can it be done at all? Perhaps through the Android NDK? And can it be done in real time?
Any idea will be much appreciated. Thank you!
The samples from Apple's SDK actually implement the filtering in an even simpler way, using ramping:
// ramp speed - play with this value until satisfied
const float kFilteringFactor = 0.1f;
// last result storage - keep this definition outside of the function, e.g. in the wrapping object
float accel[3];
// acceleration.x/.y/.z is the input from the sensor;
// result.x/.y/.z is the filtered result.
// accel[] accumulates a low-pass estimate of the signal (mostly gravity);
// subtracting it from the raw reading leaves the high-pass component.
accel[0] = acceleration.x * kFilteringFactor + accel[0] * (1.0f - kFilteringFactor);
accel[1] = acceleration.y * kFilteringFactor + accel[1] * (1.0f - kFilteringFactor);
accel[2] = acceleration.z * kFilteringFactor + accel[2] * (1.0f - kFilteringFactor);
result.x = acceleration.x - accel[0];
result.y = acceleration.y - accel[1];
result.z = acceleration.z - accel[2];
Here's the code for Android, adapted from Apple's adaptive high-pass filter example. Just plug this in and implement onFilteredAccelerometerChanged():
private static final boolean ADAPTIVE_ACCEL_FILTER = true;
float lastAccel[] = new float[3];
float accelFilter[] = new float[3];

public void onAccelerometerChanged(float accelX, float accelY, float accelZ) {
    // high-pass filter
    float updateFreq = 30; // match this to your update speed
    float cutOffFreq = 0.9f;
    float RC = 1.0f / cutOffFreq;
    float dt = 1.0f / updateFreq;
    float filterConstant = RC / (dt + RC);
    float alpha = filterConstant;
    float kAccelerometerMinStep = 0.033f;
    float kAccelerometerNoiseAttenuation = 3.0f;

    if (ADAPTIVE_ACCEL_FILTER) {
        // blend between the normal filter constant and an attenuated one,
        // depending on how fast the signal magnitude is changing
        float d = (float) clamp(Math.abs(norm(accelFilter[0], accelFilter[1], accelFilter[2])
                - norm(accelX, accelY, accelZ)) / kAccelerometerMinStep - 1.0f, 0.0f, 1.0f);
        alpha = d * filterConstant / kAccelerometerNoiseAttenuation + (1.0f - d) * filterConstant;
    }

    accelFilter[0] = alpha * (accelFilter[0] + accelX - lastAccel[0]);
    accelFilter[1] = alpha * (accelFilter[1] + accelY - lastAccel[1]);
    accelFilter[2] = alpha * (accelFilter[2] + accelZ - lastAccel[2]);
    lastAccel[0] = accelX;
    lastAccel[1] = accelY;
    lastAccel[2] = accelZ;
    onFilteredAccelerometerChanged(accelFilter[0], accelFilter[1], accelFilter[2]);
}
For those wondering what the norm() and clamp() methods in rbgrn's answer do, you can see them here:
http://developer.apple.com/library/IOS/samplecode/AccelerometerGraph/Listings/AccelerometerGraph_AccelerometerFilter_m.html
double norm(double x, double y, double z) {
    return Math.sqrt(x * x + y * y + z * z);
}

double clamp(double v, double min, double max) {
    if (v > max)
        return max;
    else if (v < min)
        return min;
    else
        return v;
}
I seem to remember this being done in Apple's sample code for the iPhone. Look for AccelerometerFilter.h / .m on Google (or grab Apple's AccelerometerGraph sample), and see this link: http://en.wikipedia.org/wiki/High-pass_filter (that's what Apple's code is based on).
There is some pseudo-code on the wiki page, too, and the math is fairly simple to translate to code.
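For reference, that pseudo-code translates to Java roughly like this. This is a sketch, not Apple's code: the class name, the zero-output initialization, and the sample constants are my own choices; alpha = RC / (RC + dt), as on the wiki page.

```java
class HighPass {
    private final float alpha;   // RC / (RC + dt)
    private float lastInput;
    private float lastOutput;
    private boolean initialized = false;

    // dt = seconds between samples; rc = 1 / (2 * PI * cutoffHz)
    HighPass(float dt, float rc) {
        this.alpha = rc / (rc + dt);
    }

    float filter(float input) {
        if (!initialized) {
            lastInput = input;
            lastOutput = 0f;   // a constant input has no high-frequency content
            initialized = true;
            return lastOutput;
        }
        // y[i] = alpha * (y[i-1] + x[i] - x[i-1])
        lastOutput = alpha * (lastOutput + input - lastInput);
        lastInput = input;
        return lastOutput;
    }
}
```

Feed it one axis per instance; the output spikes on sudden changes and decays back toward zero for steady input, which is exactly the gravity-removal behavior you want.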
IMO, designing a Kalman filter as your first attempt is over-complicating what's probably a fairly simple problem. I'd start with a simple FIR filter, and only try something more complex when/if you've tested that and found with reasonable certainty that it can't provide what you want. My guess, however, is that it will be able to do everything you need, and do it much more easily and efficiently.
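As a concrete starting point, one of the simplest FIR-style high-pass approaches subtracts a short moving average (the low-frequency estimate) from each raw sample. A rough sketch; the class name and window size are illustrative:

```java
class MovingAverageHighPass {
    private final float[] window;
    private int index = 0;
    private int count = 0;
    private float sum = 0f;

    MovingAverageHighPass(int windowSize) {
        window = new float[windowSize];
    }

    // Returns the sample minus its recent moving average
    // (i.e. the high-frequency residue).
    float filter(float sample) {
        sum -= window[index];          // drop the oldest sample from the running sum
        window[index] = sample;
        sum += sample;
        index = (index + 1) % window.length;
        if (count < window.length) count++;
        return sample - sum / count;
    }
}
```

A longer window pushes the cutoff lower; a window of 1 passes nothing through, since each sample equals its own average.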
I'm trying to perform a circle gesture on a mobile device screen using Appium. I tried swipe(), press("args").moveTo("args"), and the JavaScript executor method, but I am not able to perform the circle gesture on iOS.
I need to perform this circle gesture without losing the touch while moving from the first point to the last.
Is there any tool like AutoIt or Sikuli that can perform this gesture on mobile devices and be driven from Appium scripts using Java on a Mac?
For those looking for a quick solution, here is my implementation based on the other comments in this thread:
public void SwipeArc(double centerX, double centerY, double radius,
                     double startDegree, double degrees, int steps)
{
    // interpolate along the circumference of the circle
    double anglePerStep = degrees / steps;
    double prevX = centerX + radius * Math.Cos(startDegree * Math.PI / 180.0);
    double prevY = centerY + radius * Math.Sin(startDegree * Math.PI / 180.0);
    TouchAction circleTouch = new TouchAction(_Driver); // your Appium driver object here
    circleTouch.Press(prevX, prevY);
    for (int i = 1; i <= steps; ++i)
    {
        double newX = centerX + radius * Math.Cos((startDegree + anglePerStep * i) * Math.PI / 180.0);
        double newY = centerY + radius * Math.Sin((startDegree + anglePerStep * i) * Math.PI / 180.0);
        // MoveTo takes offsets relative to the previous position here
        circleTouch.MoveTo(newX - prevX, newY - prevY);
        prevX = newX;
        prevY = newY;
    }
    circleTouch.Release();
    circleTouch.Perform();
}
This solution assumes the Appium server expects relative coordinates for each step; I'm not sure whether that is the case for all Appium server versions.
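The interpolation itself is plain trigonometry, so it can be separated from Appium entirely. A hypothetical Java helper that generates the absolute waypoints (angles in degrees from the positive x-axis; all names here are my own), which you can then feed to whichever press/moveTo variant your client version expects:

```java
import java.util.ArrayList;
import java.util.List;

class ArcPoints {
    // Generates absolute waypoints along a circular arc.
    // sweepDegrees may be negative to travel in the other direction.
    static List<double[]> arc(double centerX, double centerY, double radius,
                              double startDegree, double sweepDegrees, int steps) {
        List<double[]> points = new ArrayList<>();
        for (int i = 0; i <= steps; i++) {
            double a = Math.toRadians(startDegree + sweepDegrees * i / steps);
            points.add(new double[] {
                centerX + radius * Math.cos(a),
                centerY + radius * Math.sin(a)
            });
        }
        return points;
    }
}
```

If your server wants relative offsets, subtract each waypoint from the previous one before passing it to moveTo.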
Use touch actions!
I tried this on real iOS and Android devices and it works fine, but you may need to play around a bit to get the right sets of coordinates and move parameters.
I've been assigned to make a simple Android game in which the user controls a ball on the screen by tilting the phone. The app uses a custom view to draw all the objects and a Runnable object to animate the ball. Note that x and y represent the ball's position, vx and vy its velocity, and fx and fy the forces being applied to it. ix and iy represent the phone's tilt; I set these two to dummy values so I could test the app without setting up the sensor manager for the time being:
private Runnable animator = new Runnable() {
    @Override
    public void run() {
        long now = AnimationUtils.currentAnimationTimeMillis();
        float dt = Math.min(now - lastTime, 50) / 1000f;
        fx = -alpha * vx + beta * ix;
        fy = -alpha * vy + beta * iy;
        vx = fx * dt;
        vy = fy * dt;
        if (x < xMin || x > xMax - ball_radius || y < yMin || y > yMax - ball_radius)
            bounce();
        x = x + vx * dt;
        y = y + vy * dt;
        lastTime = now;
        postDelayed(this, 20);
        invalidate();
    }
};
I've set up the animator like so, but it won't run. I tried calling post(animator) in the view's initialization, but that didn't work. How do I fix this?
Also, how do I set the ix and iy variables from the phone's tilt? From what I understand, the SensorManager is meant to be set up from an Activity class.
Actually, the animation does play; it was just playing very slowly.
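A likely cause of the slow motion is that the velocity is overwritten each frame (vx = fx * dt) instead of accumulated. A minimal Euler-integration sketch; the alpha and beta constants are placeholder values, not taken from the question:

```java
class BallPhysics {
    float x, y, vx, vy;
    final float alpha = 0.5f; // friction coefficient (placeholder value)
    final float beta = 10f;   // tilt-to-force gain (placeholder value)

    // One Euler integration step; ix/iy are the tilt inputs, dt is in seconds.
    void step(float ix, float iy, float dt) {
        float fx = -alpha * vx + beta * ix;
        float fy = -alpha * vy + beta * iy;
        vx += fx * dt;   // accumulate velocity, don't overwrite it
        vy += fy * dt;
        x += vx * dt;
        y += vy * dt;
    }
}
```

With accumulation, a constant tilt makes the ball speed up until friction balances the tilt force, instead of crawling at fx * dt per frame.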
I have been trying to animate an image of a fly which moves along a path like the following image (which I have added for a clearer idea), on Android 2.2.
Well, this can be done in a very simple manner on the iPhone, as it has a property for setting this auto-rotation after the path is drawn:
animation.rotationMode = kCAAnimationRotateAuto;
which I believe rotates the object based on the path.
I am able to animate my fly along this path with the nineoldandroids library, using the methods
path.moveTo(float x, float y);
path.lineTo(float x, float y);
path.curveTo(float c0X, float c0Y, float c1X, float c1Y, float x, float y);
such that the curves are drawn as cubic Bézier curves.
Now I have been trying to implement something that would let my fly rotate itself along the path, and I just can't seem to get anywhere.
Please help me out with some ideas!
You have to download the demo and the library of nineoldandroids, plus these 4 Java files, if you want to use my solution.
It was easy; I modified the evaluator in the nineoldandroids demo. It's too much to post everything here, so just to give the idea:
I extended PathPoint with an angle field, then wrote all calculated points into a stack (a simple float[][]). After the first calculation, the angle can be computed with the arctangent of the last two points in the stack.
If you don't want to use a stack, you can modify the time parameter, look ahead to where the next point will be drawn, and calculate the angle from those.
Think of it this way: do you first look where you are walking and then walk, or do you walk and then choose the angle toward the destination? With today's high display densities, looking ahead isn't necessary; the angle calculated from the previous point is accurate enough.
Here's the PathEvaluator
public class PathEvaluatorAngle implements TypeEvaluator<PathPointAngle> {
    private static final int POINT_COUNT = 5000;
    private float[][] stack = new float[POINT_COUNT][2];
    private int stackC = 0;

    @Override
    public PathPointAngle evaluate(float t, PathPointAngle startValue, PathPointAngle endValue) {
        float x, y;
        if (endValue.mOperation == PathPointAngle.CURVE) {
            // cubic Bezier interpolation between startValue and endValue
            float oneMinusT = 1 - t;
            x = oneMinusT * oneMinusT * oneMinusT * startValue.mX +
                3 * oneMinusT * oneMinusT * t * endValue.mControl0X +
                3 * oneMinusT * t * t * endValue.mControl1X +
                t * t * t * endValue.mX;
            y = oneMinusT * oneMinusT * oneMinusT * startValue.mY +
                3 * oneMinusT * oneMinusT * t * endValue.mControl0Y +
                3 * oneMinusT * t * t * endValue.mControl1Y +
                t * t * t * endValue.mY;
        } else if (endValue.mOperation == PathPointAngle.LINE) {
            x = startValue.mX + t * (endValue.mX - startValue.mX);
            y = startValue.mY + t * (endValue.mY - startValue.mY);
        } else {
            x = endValue.mX;
            y = endValue.mY;
        }

        // check the bounds before writing, not after
        if (stackC >= POINT_COUNT) {
            throw new IllegalStateException("set the stack POINT_COUNT higher!");
        }
        stack[stackC][0] = x;
        stack[stackC][1] = y;

        double angle;
        if (stackC == 0) {
            angle = 0;
        } else {
            // atan2 handles all quadrants and avoids division by zero when dx == 0
            angle = Math.atan2(
                    stack[stackC][1] - stack[stackC - 1][1],
                    stack[stackC][0] - stack[stackC - 1][0]
            ) * 180d / Math.PI;
        }
        stackC++;
        return PathPointAngle.moveTo(x, y, angle);
    }
}
Please check the link below. Hope it helps:
https://github.com/JakeWharton/NineOldAndroids
I started working on a concept that requires me to move a rectangle toward a given point at a given speed. I'm developing for Android, so this is relatively speed-critical (it will be calculated every frame, potentially for hundreds of objects).
The solutions I could think of are as follows:
float diff_x = x2 - x1;
float diff_y = y2 - y1;
float length = sqrt((diff_x * diff_x) + (diff_y * diff_y));
float dir_x = diff_x / length;
float dir_y = diff_y / length;
float move_x = dir_x * MOVE_SPEED;
float move_y = dir_y * MOVE_SPEED;
As you can see, this way requires a square root, which I know to be quite costly. I thought of an alternative that uses trigonometry, but it's costly as well.
float diff_x = x2 - x1;
float diff_y = y2 - y1;
float angle = atan2(diff_y, diff_x);
float move_x = cos(angle) * MOVE_SPEED;
float move_y = sin(angle) * MOVE_SPEED;
Are there any other ways? If not, which of my solutions would be faster? Thanks for any help.
A very common trick is to work with squared lengths wherever you can. Instead of using sqrt, just compute
length_sq = (diff_x * diff_x) + (diff_y * diff_y);
Whenever you only need to compare distances (e.g. "is this object within range?" or "which target is closest?"), comparing the squared values gives the same answer with no square root at all. You still need one sqrt to normalize the movement vector itself, but only one per moving object per frame.
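To make the squared-length trick concrete: squaring preserves ordering for non-negative values, so distance comparisons never need the root. A small sketch with illustrative names:

```java
class DistanceCompare {
    // Squared distance: enough for "which is closer?" questions, no sqrt needed.
    static float distSq(float x1, float y1, float x2, float y2) {
        float dx = x2 - x1, dy = y2 - y1;
        return dx * dx + dy * dy;
    }

    // True when (ax, ay) is closer to the target than (bx, by).
    static boolean closer(float ax, float ay, float bx, float by,
                          float tx, float ty) {
        return distSq(ax, ay, tx, ty) < distSq(bx, by, tx, ty);
    }
}
```

A range check works the same way: test distSq(...) < range * range rather than sqrt(distSq(...)) < range.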
I am using an adapted version of Android's getRotationMatrix in a C++ program that reads the phone's sensor data over the network and calculates the device's orientation matrix.
The function works and calculates the device's orientation. Unfortunately, Ogre3D uses a different axis system than the device. Holding the device level and pointing north gives the identity matrix, and when I pitch, the rotation is correct. But roll and yaw are swapped: rolling the device yaws in Ogre3D and vice versa.
(Ogre3d)                    (Device)
^ +y-axis ^ +z-axis
* *
* *
* * ^ +y-axis
* * *
* * *
* * *
************> + x-axis ************> +x-axis
*
*
v +z-axis
A quick look at the two axis systems suggests that Ogre's system (on the left) is essentially the device's system rotated 90 degrees counter-clockwise about the x-axis.
I tried experimenting with various combinations when I first assign the sensor values, before the matrix is calculated, but no combination seems to work correctly. How do I make sure that the rotation matrix getRotationMatrix() produces displays correctly in Ogre3D?
For reference, here is the function that calculates the matrix:
bool getRotationMatrix() {
    // Sensor data arriving over the network is stored in the accel
    // (accelerometer) and mag (geomagnetic) variables, which this
    // function has access to.
    float Ax = accel[0]; float Ay = accel[1]; float Az = accel[2];
    float Ex = mag[0];   float Ey = mag[1];   float Ez = mag[2];

    // H = E x A
    float Hx = Ey * Az - Ez * Ay;
    float Hy = Ez * Ax - Ex * Az;
    float Hz = Ex * Ay - Ey * Ax;
    float normH = (float) Math::Sqrt(Hx * Hx + Hy * Hy + Hz * Hz);
    if (normH < 0.1f) {
        // device is close to free fall (or in space?), or close to
        // the magnetic north pole. Typical values are > 100.
        return false;
    }
    float invH = 1.0f / normH;
    Hx *= invH; Hy *= invH; Hz *= invH;

    float invA = 1.0f / (float) Math::Sqrt(Ax * Ax + Ay * Ay + Az * Az);
    Ax *= invA; Ay *= invA; Az *= invA;

    // M = A x H
    float Mx = Ay * Hz - Az * Hy;
    float My = Az * Hx - Ax * Hz;
    float Mz = Ax * Hy - Ay * Hx;

    // Ogre3D's Matrix3 is column-major whereas getRotationMatrix produces
    // a row-major matrix, so it is transposed here.
    orientation[0][0] = Hx; orientation[0][1] = Mx; orientation[0][2] = Ax;
    orientation[1][0] = Hy; orientation[1][1] = My; orientation[1][2] = Ay;
    orientation[2][0] = Hz; orientation[2][1] = Mz; orientation[2][2] = Az;
    return true;
}
Why not just apply the one additional rotation you've already identified before you use the matrix in Ogre?
I found the problem. In my function, I was putting the unit vectors calculated from the cross products into columns, when I should have been putting them into rows of the Matrix3, as usual. Something about row-major versus column-major confused me, even though I was referring to the elements with 2D [][] indexing.
Multiplying the outcome of the matrix calculation function by this matrix:
1  0  0
0  0  1
0 -1  0
and then pitching the whole result by another π/2 about the x-axis solved the remap problem, but I fear my geometry is inverted.
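To sanity-check that remap matrix: applying it to the device-space basis vectors shows device z mapping to Ogre y and device y mapping to Ogre -z, i.e. the 90-degree rotation about the x-axis identified above. A small sketch (class and method names are mine):

```java
class AxisRemap {
    // Remap matrix from the answer: x stays the same, device z becomes
    // Ogre y, and device y becomes Ogre -z.
    static final float[][] REMAP = {
        { 1,  0,  0 },
        { 0,  0,  1 },
        { 0, -1,  0 }
    };

    // Plain 3x3 matrix times 3-vector.
    static float[] apply(float[][] m, float[] v) {
        float[] out = new float[3];
        for (int r = 0; r < 3; r++)
            for (int c = 0; c < 3; c++)
                out[r] += m[r][c] * v[c];
        return out;
    }
}
```

The same multiplication applied to each column of the orientation matrix converts the whole rotation from device axes to Ogre axes.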
I don't know much about matrix rotation, but if the systems are related the way you are showing, I think you should do the following:
The X axis stays the same, so:
float Ax = accel[0];
float Ex = mag[0];
The Y axis in Ogre3d is the Z axis on the device, so:
float Ay = accel[2];
float Ey = mag[2];
The Z axis in Ogre3d is the opposite of the Y axis on the device, so:
float Az = accel[1] * (-1);
float Ez = mag[1] * (-1);
Try that.