Android - How to "shake to erase"? - android

This may be an easy question, but I'm stuck. I am trying to implement the "shake to erase" feature in a drawing program (simple paint app). I can't get it to work. Here's my code:
private final SensorEventListener mSensorListener = new SensorEventListener() {
    public void onSensorChanged(SensorEvent se) {
        float x = se.values[0];
        float y = se.values[1];
        float z = se.values[2];
        mAccelLast = mAccelCurrent;
        mAccelCurrent = (float) Math.sqrt((double) (x*x + y*y + z*z));
        float delta = mAccelCurrent - mAccelLast;
        mAccel = mAccel * 0.9f + delta; // perform low-cut filter
        if (mAccel > 2) {
            mView.onDraw(mCanvas);
            mCanvas.drawBitmap(cache, 0, 0, new Paint(Paint.DITHER_FLAG));
        }
    }

    public void onAccuracyChanged(Sensor sensor, int accuracy) {
    }
};
The SensorEventListener is based on this example. Execution does reach the if statement, but the canvas won't reset until after I've touched the screen (a new touch event).
I'd like the canvas to reset/erase during the shake event, without any further prompts from the user necessary.
Any help would be wonderful, thank you!

You might have to call invalidate() on the View to get it to redraw.
Hope that helps!
http://developer.android.com/guide/topics/graphics/index.html
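As a sketch of that suggestion (not from the original answer), the shake branch could erase the bitmap behind the canvas and then ask the view to redraw itself, instead of calling onDraw() directly. The names mCanvas and mView are taken from the question, assuming mCanvas draws into the bitmap that mView renders; Color and PorterDuff come from android.graphics.

// Inside the shake branch: wipe the backing bitmap, then schedule a redraw.
if (mAccel > 2) {
    mCanvas.drawColor(Color.TRANSPARENT, PorterDuff.Mode.CLEAR); // erase every pixel
    mView.postInvalidate(); // safe from any thread; onDraw() runs on the UI thread
}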

Related

How to move the Windows cursor smoothly with the phone accelerometer

I am trying to move the Windows cursor with an Android phone's accelerometer, but I don't know how to use the data properly to move the cursor smoothly.
I use WiFi to transfer the data from the phone to the laptop.
Here is some code (without the WiFi and socket transfer code); please give me some ideas or correct my code.
Android:
onCreate:
sensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
accelerometer = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
I am using only X and Y coordinates:
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
        x = event.values[0];
        y = event.values[1];
Then I use this function on Windows; it moves the cursor from one position to another in steps:
Params: x1,y1 - from position, x2,y2 - to position, t - time, n - steps
public void mouseGlide(int x1, int y1, int x2, int y2, int t, int n) {
    try {
        Robot r = new Robot();
        double dx = (x2 - x1) / ((double) n);
        double dy = (y2 - y1) / ((double) n);
        double dt = t / ((double) n);
        for (int step = 1; step <= n; step++) {
            Thread.sleep((int) dt);
            r.mouseMove((int) (x1 + dx * step), (int) (y1 + dy * step));
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
}
Basically, I want to use the phone's tilt to move the cursor the same way as when I use a mouse, as smoothly as the photo sphere feature in camera apps moves.
I do not know how to do it.
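No answer was posted for this one, but a common approach is to low-pass filter the raw readings before turning them into cursor deltas, so sensor jitter does not become cursor jitter. A minimal sketch on the Android side, under the assumption that filteredX/filteredY are fields and sendToLaptop() stands in for the questioner's socket code (all three names are hypothetical):

// Exponential moving average over the accelerometer values; lower ALPHA
// means smoother but laggier cursor motion.
private static final float ALPHA = 0.15f;
private static final float DEAD_ZONE = 0.3f; // ignore tiny tilts
private static final float GAIN = 10f;       // tilt-to-pixels scale
private float filteredX, filteredY;

public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
        filteredX += ALPHA * (event.values[0] - filteredX);
        filteredY += ALPHA * (event.values[1] - filteredY);
        float dx = Math.abs(filteredX) > DEAD_ZONE ? -filteredX * GAIN : 0;
        float dy = Math.abs(filteredY) > DEAD_ZONE ? filteredY * GAIN : 0;
        sendToLaptop(dx, dy); // placeholder for the WiFi/socket transfer
    }
}

ALPHA, DEAD_ZONE and GAIN are tuning knobs, and the signs of dx and dy depend on how the phone is held, so they need verifying on hardware.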

Android detect phone lifting action

I want to perform some activity when the user lifts the phone from a flat surface. The method I am using right now is to detect shake motion using the phone's accelerometer, with the following code:
sensorMan = (SensorManager) getSystemService(SENSOR_SERVICE);
accelerometer = sensorMan.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
sensorMan.registerListener(this, accelerometer, SensorManager.SENSOR_STATUS_ACCURACY_HIGH);
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
        mGravity = event.values.clone();
        // Shake detection
        float x = mGravity[0];
        float y = mGravity[1];
        float z = mGravity[2];
        mAccelLast = mAccelCurrent;
        mAccelCurrent = FloatMath.sqrt(x * x + y * y + z * z);
        float delta = mAccelCurrent - mAccelLast;
        mAccel = mAccel * 0.9f + delta;
        if (mAccel > 0.9) {
            // Perform certain tasks.
        }
    }
}
The issue I am facing with this code is that the 0.9f threshold is sometimes reached even when the phone is still on the flat surface. I tried logging the mAccel value and found it ranging from 9.0 to 0.4 even when the phone was not touched. Is there any guaranteed way to detect the phone's lift movement?
Solved the issue. All I had to do was check the "Y" value mentioned in the question and see whether it was greater than 1.0.
Note that if the phone is held vertically, Y is always around 9.8, but in such cases you can check X instead. In my case the user had to lift the phone and would tilt it at some point, so I put in a check for if(y >= 1.0 && y <= 2.0);
EDIT : UPDATED CODE
@Override
public void onSensorChanged(SensorEvent event) {
    try {
        if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
            mGravity = event.values.clone();
            // Shake detection
            float x = mGravity[0];
            float y = mGravity[1];
            float z = mGravity[2];
            float yAbs = Math.abs(mGravity[1]);
            mAccelLast = mAccelCurrent;
            mAccelCurrent = FloatMath.sqrt(x * x + y * y + z * z);
            float delta = mAccelCurrent - mAccelLast;
            mAccel = mAccel * 0.9f + delta;
            if (yAbs > 2.0 && yAbs < 4.0 && !isAlerted() && !isCallActive()) {
                alert();
            }
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
}
I would add the gyroscope to the detection routine too. The phone gets accelerated AND rotates from x=0, y=0, z=0 to, let's say, y=120; that's the trigger. Look here for info on how to use it.
Another sensor for lift detection would be the proximity sensor: when the phone lies flat on the desk the distance would be 0, and if it is picked up that value would rise quickly.
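A minimal sketch of that proximity idea, assuming the sensor starts out covered; most proximity sensors report only two values (near and far), and the exact readings vary by device:

// Fire once when the reading jumps from "near" to "far", e.g. when the
// phone is picked up off a surface that covered the sensor.
SensorManager sm = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
Sensor prox = sm.getDefaultSensor(Sensor.TYPE_PROXIMITY);
SensorEventListener proxListener = new SensorEventListener() {
    private boolean wasNear = false;

    @Override
    public void onSensorChanged(SensorEvent event) {
        boolean near = event.values[0] < event.sensor.getMaximumRange();
        if (wasNear && !near) {
            // The sensor was just uncovered: treat this as a lift.
        }
        wasNear = near;
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
};
sm.registerListener(proxListener, prox, SensorManager.SENSOR_DELAY_NORMAL);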

Using Android gyroscope instead of accelerometer. I find lots of bits and pieces, but no complete code

The Sensor Fusion video looks great, but there's no code:
http://www.youtube.com/watch?v=C7JQ7Rpwn2k&feature=player_detailpage#t=1315s
Here is my code, which just uses the accelerometer and compass. I also use a Kalman filter on the 3 orientation values, but that's too much code to show here. Ultimately this works OK, but the result is either too jittery or too laggy depending on what I do with the results and how low I make the filtering factors.
/** Just accelerometer and magnetic sensors */
public abstract class SensorsListener2 implements SensorEventListener {
    /** The lower this is, the greater the preference which is given to previous values. (slows change) */
    private static final float accelFilteringFactor = 0.1f;
    private static final float magFilteringFactor = 0.01f;

    public abstract boolean getIsLandscape();

    @Override
    public void onSensorChanged(SensorEvent event) {
        Sensor sensor = event.sensor;
        int type = sensor.getType();
        switch (type) {
            case Sensor.TYPE_MAGNETIC_FIELD:
                mags[0] = event.values[0] * magFilteringFactor + mags[0] * (1.0f - magFilteringFactor);
                mags[1] = event.values[1] * magFilteringFactor + mags[1] * (1.0f - magFilteringFactor);
                mags[2] = event.values[2] * magFilteringFactor + mags[2] * (1.0f - magFilteringFactor);
                isReady = true;
                break;
            case Sensor.TYPE_ACCELEROMETER:
                accels[0] = event.values[0] * accelFilteringFactor + accels[0] * (1.0f - accelFilteringFactor);
                accels[1] = event.values[1] * accelFilteringFactor + accels[1] * (1.0f - accelFilteringFactor);
                accels[2] = event.values[2] * accelFilteringFactor + accels[2] * (1.0f - accelFilteringFactor);
                break;
            default:
                return;
        }

        if (mags != null && accels != null && isReady) {
            isReady = false;
            SensorManager.getRotationMatrix(rot, inclination, accels, mags);
            boolean isLandscape = getIsLandscape();
            if (isLandscape) {
                outR = rot;
            } else {
                // Remap the coordinates to work in portrait mode.
                SensorManager.remapCoordinateSystem(rot, SensorManager.AXIS_X, SensorManager.AXIS_Z, outR);
            }
            SensorManager.getOrientation(outR, values);
            double x180pi = 180.0 / Math.PI;
            float azimuth = (float) (values[0] * x180pi);
            float pitch = (float) (values[1] * x180pi);
            float roll = (float) (values[2] * x180pi);
            // In landscape mode swap pitch and roll and invert the pitch.
            if (isLandscape) {
                float tmp = pitch;
                pitch = -roll;
                roll = -tmp;
                azimuth = 180 - azimuth;
            } else {
                pitch = -pitch - 90;
                azimuth = 90 - azimuth;
            }
            onOrientationChanged(azimuth, pitch, roll);
        }
    }

    private float[] mags = new float[3];
    private float[] accels = new float[3];
    private boolean isReady;
    private float[] rot = new float[9];
    private float[] outR = new float[9];
    private float[] inclination = new float[9];
    private float[] values = new float[3];

    /**
     * Azimuth: angle between the magnetic north direction and the Y axis, around the Z axis (0 to 359). 0=North, 90=East, 180=South, 270=West
     * Pitch: rotation around X axis (-180 to 180), with positive values when the z-axis moves toward the y-axis.
     * Roll: rotation around Y axis (-90 to 90), with positive values when the x-axis moves toward the z-axis.
     */
    public abstract void onOrientationChanged(float azimuth, float pitch, float roll);
}
I tried to figure out how to add gyroscope data, but I am just not doing it right. The Google doc at http://developer.android.com/reference/android/hardware/SensorEvent.html shows some code to get a delta matrix from the gyroscope data. The idea seems to be that I'd crank down the filters for the accelerometer and magnetic sensors so that they were really stable. That would keep track of the long-term orientation.
Then, I'd keep a history of the most recent N delta matrices from the gyroscope. Each time I got a new one I'd drop off the oldest one and multiply them all together to get a final matrix which I would multiply against the stable matrix returned by the accelerometer and magnetic sensors.
This doesn't seem to work. Or, at least, my implementation of it does not work. The result is far more jittery than just the accelerometer. Increasing the size of the gyroscope history actually increases the jitter which makes me think that I'm not calculating the right values from the gyroscope.
public abstract class SensorsListener3 implements SensorEventListener {
    /** The lower this is, the greater the preference which is given to previous values. (slows change) */
    private static final float kFilteringFactor = 0.001f;
    private static final float magKFilteringFactor = 0.001f;

    public abstract boolean getIsLandscape();

    @Override
    public void onSensorChanged(SensorEvent event) {
        Sensor sensor = event.sensor;
        int type = sensor.getType();
        switch (type) {
            case Sensor.TYPE_MAGNETIC_FIELD:
                mags[0] = event.values[0] * magKFilteringFactor + mags[0] * (1.0f - magKFilteringFactor);
                mags[1] = event.values[1] * magKFilteringFactor + mags[1] * (1.0f - magKFilteringFactor);
                mags[2] = event.values[2] * magKFilteringFactor + mags[2] * (1.0f - magKFilteringFactor);
                isReady = true;
                break;
            case Sensor.TYPE_ACCELEROMETER:
                accels[0] = event.values[0] * kFilteringFactor + accels[0] * (1.0f - kFilteringFactor);
                accels[1] = event.values[1] * kFilteringFactor + accels[1] * (1.0f - kFilteringFactor);
                accels[2] = event.values[2] * kFilteringFactor + accels[2] * (1.0f - kFilteringFactor);
                break;
            case Sensor.TYPE_GYROSCOPE:
                gyroscopeSensorChanged(event);
                break;
            default:
                return;
        }

        if (mags != null && accels != null && isReady) {
            isReady = false;
            SensorManager.getRotationMatrix(rot, inclination, accels, mags);
            boolean isLandscape = getIsLandscape();
            if (isLandscape) {
                outR = rot;
            } else {
                // Remap the coordinates to work in portrait mode.
                SensorManager.remapCoordinateSystem(rot, SensorManager.AXIS_X, SensorManager.AXIS_Z, outR);
            }
            if (gyroUpdateTime != 0) {
                matrixHistory.mult(matrixTmp, matrixResult);
                outR = matrixResult;
            }
            SensorManager.getOrientation(outR, values);
            double x180pi = 180.0 / Math.PI;
            float azimuth = (float) (values[0] * x180pi);
            float pitch = (float) (values[1] * x180pi);
            float roll = (float) (values[2] * x180pi);
            // In landscape mode swap pitch and roll and invert the pitch.
            if (isLandscape) {
                float tmp = pitch;
                pitch = -roll;
                roll = -tmp;
                azimuth = 180 - azimuth;
            } else {
                pitch = -pitch - 90;
                azimuth = 90 - azimuth;
            }
            onOrientationChanged(azimuth, pitch, roll);
        }
    }

    private void gyroscopeSensorChanged(SensorEvent event) {
        // This timestep's delta rotation to be multiplied by the current rotation
        // after computing it from the gyro sample data.
        if (gyroUpdateTime != 0) {
            final float dT = (event.timestamp - gyroUpdateTime) * NS2S;
            // Axis of the rotation sample, not normalized yet.
            float axisX = event.values[0];
            float axisY = event.values[1];
            float axisZ = event.values[2];
            // Calculate the angular speed of the sample
            float omegaMagnitude = (float) Math.sqrt(axisX*axisX + axisY*axisY + axisZ*axisZ);
            // Normalize the rotation vector if it's big enough to get the axis
            if (omegaMagnitude > EPSILON) {
                axisX /= omegaMagnitude;
                axisY /= omegaMagnitude;
                axisZ /= omegaMagnitude;
            }
            // Integrate around this axis with the angular speed by the timestep
            // in order to get a delta rotation from this sample over the timestep.
            // We will convert this axis-angle representation of the delta rotation
            // into a quaternion before turning it into the rotation matrix.
            float thetaOverTwo = omegaMagnitude * dT / 2.0f;
            float sinThetaOverTwo = (float) Math.sin(thetaOverTwo);
            float cosThetaOverTwo = (float) Math.cos(thetaOverTwo);
            deltaRotationVector[0] = sinThetaOverTwo * axisX;
            deltaRotationVector[1] = sinThetaOverTwo * axisY;
            deltaRotationVector[2] = sinThetaOverTwo * axisZ;
            deltaRotationVector[3] = cosThetaOverTwo;
        }
        gyroUpdateTime = event.timestamp;
        SensorManager.getRotationMatrixFromVector(deltaRotationMatrix, deltaRotationVector);
        // User code should concatenate the delta rotation we computed with the current rotation
        // in order to get the updated rotation.
        // rotationCurrent = rotationCurrent * deltaRotationMatrix;
        matrixHistory.add(deltaRotationMatrix);
    }

    private float[] mags = new float[3];
    private float[] accels = new float[3];
    private boolean isReady;
    private float[] rot = new float[9];
    private float[] outR = new float[9];
    private float[] inclination = new float[9];
    private float[] values = new float[3];

    // gyroscope stuff
    private long gyroUpdateTime = 0;
    private static final float NS2S = 1.0f / 1000000000.0f;
    private float[] deltaRotationMatrix = new float[9];
    private final float[] deltaRotationVector = new float[4];
    // TODO: I have no idea how small this value should be.
    private static final float EPSILON = 0.000001f;
    private float[] matrixMult = new float[9];
    private MatrixHistory matrixHistory = new MatrixHistory(100);
    private float[] matrixTmp = new float[9];
    private float[] matrixResult = new float[9];

    /**
     * Azimuth: angle between the magnetic north direction and the Y axis, around the Z axis (0 to 359). 0=North, 90=East, 180=South, 270=West
     * Pitch: rotation around X axis (-180 to 180), with positive values when the z-axis moves toward the y-axis.
     * Roll: rotation around Y axis (-90 to 90), with positive values when the x-axis moves toward the z-axis.
     */
    public abstract void onOrientationChanged(float azimuth, float pitch, float roll);
}
public class MatrixHistory {
    public MatrixHistory(int size) {
        vals = new float[size][];
    }

    public void add(float[] val) {
        synchronized (vals) {
            vals[ix] = val;
            ix = (ix + 1) % vals.length;
            if (ix == 0)
                full = true;
        }
    }

    public void mult(float[] tmp, float[] output) {
        synchronized (vals) {
            int count = full ? vals.length : ix;
            if (count == 0)
                return;
            for (int i = 0; i < count; ++i) {
                if (i == 0) {
                    System.arraycopy(vals[i], 0, output, 0, vals[i].length);
                } else {
                    MathUtils.multiplyMatrix3x3(output, vals[i], tmp);
                    System.arraycopy(tmp, 0, output, 0, tmp.length);
                }
            }
        }
    }

    private int ix = 0;
    private boolean full = false;
    private float[][] vals;
}
The second block of code contains my changes from the first block of code which add the gyroscope to the mix.
Specifically, the filtering factor for accel is made smaller (making the value more stable). The MatrixHistory class keeps track of the last 100 gyroscope deltaRotationMatrix values which are calculated in the gyroscopeSensorChanged method.
I've seen many questions on this site on this topic. They've helped me get to this point, but I cannot figure out what to do next. I really wish the Sensor Fusion guy had just posted some code somewhere. He obviously had it all put together.
Well, +1 to you for even knowing what a Kalman filter is. If you'd like, I'll edit this post and give you the code I wrote a couple years ago to do what you're trying to do.
But first, I'll tell you why you don't need it.
Modern implementations of the Android sensor stack use Sensor Fusion, as Stan mentioned above. This just means that all of the available data -- accel, mag, gyro -- is collected together in one algorithm, and then all the outputs are read back out in the form of Android sensors.
Edit: I just stumbled on this superb Google Tech Talk on the subject: Sensor Fusion on Android Devices: A Revolution in Motion Processing. Well worth the 45 minutes to watch it if you're interested in the topic.
In essence, Sensor Fusion is a black box. I've looked into the source code of the Android implementation, and it's a big Kalman filter written in C++. Some pretty good code in there, far more sophisticated than any filter I ever wrote, and probably more sophisticated than what you're writing. Remember, these guys are doing this for a living.
I also know that at least one chipset manufacturer has their own sensor fusion implementation. The manufacturer of the device then chooses between the Android and the vendor implementation based on their own criteria.
Finally, as Stan mentioned above, Invensense has their own sensor fusion implementation at the chip level.
Anyway, what it all boils down to is that the built-in sensor fusion in your device is likely to be superior to anything you or I could cobble together. So what you really want to do is to access that.
In Android, there are both physical and virtual sensors. The virtual sensors are the ones that are synthesized from the available physical sensors. The best-known example is TYPE_ORIENTATION which takes accelerometer and magnetometer and creates roll/pitch/heading output. (By the way, you should not use this sensor; it has too many limitations.)
But the important thing is that newer versions of Android contain these two new virtual sensors:
TYPE_GRAVITY is the accelerometer input with the effect of motion filtered out
TYPE_LINEAR_ACCELERATION is the accelerometer with the gravity component filtered out.
These two virtual sensors are synthesized through a combination of accelerometer input and gyro input.
Another notable sensor is TYPE_ROTATION_VECTOR which is a Quaternion synthesized from accelerometer, magnetometer, and gyro. It represents the full 3-d orientation of the device with the effects of linear acceleration filtered out.
However, Quaternions are a little bit abstract for most people, and since you're likely working with 3-d transformations anyway, your best approach is to combine TYPE_GRAVITY and TYPE_MAGNETIC_FIELD via SensorManager.getRotationMatrix().
One more point: if you're working with a device running an older version of Android, you need to detect that you're not receiving TYPE_GRAVITY events and use TYPE_ACCELEROMETER instead. Theoretically, this would be a place to use your own Kalman filter, but if your device doesn't have sensor fusion built in, it probably doesn't have gyros either.
Anyway, here's some sample code to show how I do it.
// Requires 1.5 or above
class Foo extends Activity implements SensorEventListener {
    // TAG and DEG were not defined in the original snippet; reasonable definitions:
    private static final String TAG = "Foo";
    private static final float DEG = (float) (180.0 / Math.PI); // radians to degrees

    SensorManager sensorManager;
    float[] gData = new float[3]; // Gravity or accelerometer
    float[] mData = new float[3]; // Magnetometer
    float[] orientation = new float[3];
    float[] Rmat = new float[9];
    float[] R2 = new float[9];
    float[] Imat = new float[9];
    boolean haveGrav = false;
    boolean haveAccel = false;
    boolean haveMag = false;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Get the sensor manager from system services
        sensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
    }

    @Override
    protected void onResume() {
        super.onResume();
        // Register our listeners
        Sensor gsensor = sensorManager.getDefaultSensor(Sensor.TYPE_GRAVITY);
        Sensor asensor = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
        Sensor msensor = sensorManager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD);
        sensorManager.registerListener(this, gsensor, SensorManager.SENSOR_DELAY_GAME);
        sensorManager.registerListener(this, asensor, SensorManager.SENSOR_DELAY_GAME);
        sensorManager.registerListener(this, msensor, SensorManager.SENSOR_DELAY_GAME);
    }

    public void onSensorChanged(SensorEvent event) {
        switch (event.sensor.getType()) {
            case Sensor.TYPE_GRAVITY:
                gData[0] = event.values[0];
                gData[1] = event.values[1];
                gData[2] = event.values[2];
                haveGrav = true;
                break;
            case Sensor.TYPE_ACCELEROMETER:
                if (haveGrav) break; // don't need it, we have better
                gData[0] = event.values[0];
                gData[1] = event.values[1];
                gData[2] = event.values[2];
                haveAccel = true;
                break;
            case Sensor.TYPE_MAGNETIC_FIELD:
                mData[0] = event.values[0];
                mData[1] = event.values[1];
                mData[2] = event.values[2];
                haveMag = true;
                break;
            default:
                return;
        }
        if ((haveGrav || haveAccel) && haveMag) {
            SensorManager.getRotationMatrix(Rmat, Imat, gData, mData);
            SensorManager.remapCoordinateSystem(Rmat,
                    SensorManager.AXIS_Y, SensorManager.AXIS_MINUS_X, R2);
            // Orientation isn't as useful as a rotation matrix, but
            // we'll show it here anyway.
            SensorManager.getOrientation(R2, orientation);
            float incl = SensorManager.getInclination(Imat);
            Log.d(TAG, "mh: " + (int) (orientation[0] * DEG));
            Log.d(TAG, "pitch: " + (int) (orientation[1] * DEG));
            Log.d(TAG, "roll: " + (int) (orientation[2] * DEG));
            Log.d(TAG, "yaw: " + (int) (orientation[0] * DEG));
            Log.d(TAG, "inclination: " + (int) (incl * DEG));
        }
    }

    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}
Hmmm; if you happen to have a Quaternion library handy, it's probably simpler just to receive TYPE_ROTATION_VECTOR and convert that to an array.
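In fact you don't even need a quaternion library for the common case; SensorManager can expand the rotation vector into a matrix for you. A minimal sketch (listener registration for TYPE_ROTATION_VECTOR omitted):

// Turn the fused rotation vector into a rotation matrix, then read out
// azimuth/pitch/roll in radians. No quaternion math required.
private final float[] rotMat = new float[9];
private final float[] orient = new float[3];

public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR) {
        SensorManager.getRotationMatrixFromVector(rotMat, event.values);
        SensorManager.getOrientation(rotMat, orient); // azimuth, pitch, roll
    }
}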
To the question where to find complete code, here's a default implementation on Android jelly bean: https://android.googlesource.com/platform/frameworks/base/+/jb-release/services/sensorservice/
Start by checking the fusion.cpp/h.
It uses Modified Rodrigues Parameters (close to Euler angles) instead of quaternions. In addition to orientation the Kalman filter estimates gyro drift. For measurement updates it uses magnetometer and, a bit incorrectly, acceleration (specific force).
To make use of the code you should either be a wizard or know the basics of INS and KF. Many parameters have to be fine-tuned for the filter to work. As Edward adequately put it, these guys are doing this for a living.
At least on Google's Galaxy Nexus this default implementation is left unused and is overridden by Invensense's proprietary system.

Android Gyroscope versus Accelerometer

I am building an Android game and I want to figure out whether the user tilts the device to the left or the right (Similar to how Temple Run works when you move the man from side to side).
I have read many tutorials and examples and I made sample applications, but the amount of data I get back from both the gyroscope and the accelerometer is overwhelming. Would I need both sensors to work out whether the user tilts the device, and in which direction?
My current application is detecting every slight movement and that is obviously not correct.
public class Main extends Activity {
    private SensorManager mSensorManager;
    private float mAccel; // acceleration apart from gravity
    private float mAccelCurrent; // current acceleration including gravity
    private float mAccelLast; // last acceleration including gravity
    private RelativeLayout background;
    private Boolean isleft = true;

    /** Called when the activity is first created. */
    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.main);
        this.background = (RelativeLayout) findViewById(R.id.RelativeLayout1);
        mSensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
        mSensorManager.registerListener(mSensorListener, mSensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER), SensorManager.SENSOR_DELAY_NORMAL);
        mAccel = 0.00f;
        mAccelCurrent = SensorManager.GRAVITY_EARTH;
        mAccelLast = SensorManager.GRAVITY_EARTH;
        /* float x1, x2, y1, y2;
        String direction;
        switch (event.getAction()) {
            case MotionEvent.ACTION_DOWN:
                x1 = event.getX();
                y1 = event.getY();
                break;
            case MotionEvent.ACTION_UP:
                x2 = event.getX();
                y2 = event.getY();
                float dx = x2 - x1;
                float dy = y2 - y1;
                // Use dx and dy to determine the direction
                if (Math.abs(dx) > Math.abs(dy)) {
                    if (dx > 0) direction = "right";
                    else direction = "left";
                } else {
                    if (dy > 0) direction = "down";
                    else direction = "up";
                }
                break;
        } */
    }

    private final SensorEventListener mSensorListener = new SensorEventListener() {
        public void onSensorChanged(SensorEvent se) {
            float x = se.values[0];
            float y = se.values[1];
            float z = se.values[2];
            if ((mAccelLast < mAccelCurrent) && (isleft == true)) {
                background.setBackgroundResource(R.drawable.bg_right);
                isleft = false;
            }
            if ((mAccelLast > mAccelCurrent) && (isleft == false)) {
                background.setBackgroundResource(R.drawable.bg_left);
                isleft = true;
            }
            mAccelLast = mAccelCurrent;
            mAccelCurrent = (float) Math.sqrt((double) (x*x + y*y + z*z));
            float delta = mAccelCurrent - mAccelLast;
            Log.d("FB", "delta : " + delta);
            mAccel = mAccel * 0.9f + delta; // perform low-cut filter
            // Log.d("FB", "mAccel : " + mAccel);
        }

        public void onAccuracyChanged(Sensor sensor, int accuracy) {
        }
    };
}
Would I be better off using just the accelerometer, just the gyroscope or would I need both?
This post links to the differences between the two: Android accelerometer and gyroscope
http://diydrones.com/profiles/blogs/faq-whats-the-difference
http://answers.oreilly.com/topic/1751-mobile-accelerometers-and-gyroscopes-explained/
The documentation will also help: http://developer.android.com/guide/topics/sensors/sensors_motion.html
From my VERY limited experience, the gyro constantly measures the x, y, z rotation and keeps updating. It's useful for steering a car/plane/character in a game. The accelerometer is a little more like a Wii-mote, for swinging around or picking up a shake gesture.
From my experience with the accelerometer, using the gravity method you can get x, y and z rotation from it. Just type "vector method accelerometer" into Google. I have used this method with a compass to correct the coordinates for tilt.
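For the left/right tilt in the question, here is a sketch of that gravity-vector idea (not from the answer above): while the device is not being strongly accelerated, the accelerometer reads mostly gravity, so a roll angle falls out of atan2. The 15-degree threshold is a made-up tuning value:

// Estimate left/right tilt (roll) from the direction of gravity.
public void onSensorChanged(SensorEvent se) {
    float x = se.values[0];
    float z = se.values[2];
    // Roughly 0 when the device lies flat; grows as it tilts sideways.
    double rollDeg = Math.toDegrees(Math.atan2(x, z));
    if (rollDeg > 15) {
        // tilted one way (which way depends on how the device is held)
    } else if (rollDeg < -15) {
        // tilted the other way
    }
}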

"shake to do something" code explanation

I've found this code. Its function is to do something when the device is shaken strongly enough, but I haven't fully understood it. Could anyone please help me?
public class ShakeActivity extends Activity {
    /** Called when the activity is first created. */
    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.main);
        mSensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
        mSensorManager.registerListener(mSensorListener, mSensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER), SensorManager.SENSOR_DELAY_NORMAL);
        mAccel = 0.00f;
        mAccelCurrent = SensorManager.GRAVITY_EARTH;
        mAccelLast = SensorManager.GRAVITY_EARTH;
    }

    private SensorManager mSensorManager;
    private float mAccel; // acceleration apart from gravity
    private float mAccelCurrent; // current acceleration including gravity
    private float mAccelLast; // last acceleration including gravity

    private final SensorEventListener mSensorListener = new SensorEventListener() {
        public void onSensorChanged(SensorEvent se) {
            float x = se.values[0];
            float y = se.values[1];
            float z = se.values[2];
            mAccelLast = mAccelCurrent;
            mAccelCurrent = (float) Math.sqrt((double) (x*x + y*y + z*z));
            float delta = mAccelCurrent - mAccelLast;
            mAccel = mAccel * 0.9f + delta; // perform low-cut filter
        }

        public void onAccuracyChanged(Sensor sensor, int accuracy) {
        }
    };

    @Override
    protected void onResume() {
        super.onResume();
        mSensorManager.registerListener(mSensorListener, mSensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER), SensorManager.SENSOR_DELAY_NORMAL);
    }

    @Override
    protected void onStop() {
        mSensorManager.unregisterListener(mSensorListener);
        super.onStop();
    }
}
Please help me understand these two lines:
mAccelCurrent = (float) Math.sqrt((double) (x*x + y*y + z*z)); // I guess this is for computing the value of the acceleration
and this line, which I don't understand:
mAccel = mAccel * 0.9f + delta;
Thanks in advance.
The sensor will return three values, for acceleration along the three axis directions; these are placed in x, y and z in your code sample. Imagine three masses on springs, all at right angles to each other; as you move the device around, the springs stretch and contract, and x, y and z contain their lengths.
mAccelCurrent = (float) Math.sqrt((double) (x*x + y*y + z*z));
This is computing the magnitude of the acceleration. Imagine if, instead of the three masses on springs, you had just one, always pointing exactly in the direction the device is being accelerated in. It's actually possible to work out what that system would look like from the values we have, and that's what's being done here: mAccelCurrent is how much such a spring would get stretched.
mAccel = mAccel * 0.9f + delta;
This is a high-pass filter on the input. Here it has the effect of making sudden changes in acceleration give bigger values. It's not clear from just the code you've posted why this is being done; I am guessing it is to make the code elsewhere, which ultimately checks mAccel, more sensitive to the forces at the extremes of each shake while the device is being shaken.
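To see the filter's effect concretely, here is a tiny standalone demo (not from the original thread) that feeds a synthetic magnitude sequence through the same two lines:

// One sharp change in magnitude produces a spike in mAccel, which then
// decays by a factor of 0.9 per sample; steady readings stay near zero.
public class ShakeFilterDemo {
    public static void main(String[] args) {
        float mAccel = 0f;
        float mAccelLast;
        float mAccelCurrent = 9.8f; // resting magnitude (gravity only)
        float[] samples = {9.8f, 9.8f, 14.0f, 9.8f, 9.8f, 9.8f}; // one jolt
        for (float s : samples) {
            mAccelLast = mAccelCurrent;
            mAccelCurrent = s;
            float delta = mAccelCurrent - mAccelLast;
            mAccel = mAccel * 0.9f + delta;
            System.out.printf("sample=%.1f mAccel=%.2f%n", s, mAccel);
        }
        // mAccel jumps to 4.20 at the jolt, dips to -0.42 as the jolt ends,
        // then decays toward 0, so a threshold like "mAccel > 2" only fires
        // during abrupt changes.
    }
}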
