I've found this code. Its purpose is to do something when the device is shaken strongly enough, but I haven't fully understood it. Can anyone please help me?
public class ShakeActivity extends Activity {
/** Called when the activity is first created. */
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.main);
mSensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
mSensorManager.registerListener(mSensorListener, mSensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER), SensorManager.SENSOR_DELAY_NORMAL);
mAccel = 0.00f;
mAccelCurrent = SensorManager.GRAVITY_EARTH;
mAccelLast = SensorManager.GRAVITY_EARTH;
}
private SensorManager mSensorManager;
private float mAccel; // acceleration apart from gravity
private float mAccelCurrent; // current acceleration including gravity
private float mAccelLast; // last acceleration including gravity
private final SensorEventListener mSensorListener = new SensorEventListener() {
public void onSensorChanged(SensorEvent se) {
float x = se.values[0];
float y = se.values[1];
float z = se.values[2];
mAccelLast = mAccelCurrent;
mAccelCurrent = (float) Math.sqrt((double) (x*x + y*y + z*z));
float delta = mAccelCurrent - mAccelLast;
mAccel = mAccel * 0.9f + delta; // perform low-cut filter
}
public void onAccuracyChanged(Sensor sensor, int accuracy) {
}
};
@Override
protected void onResume() {
super.onResume();
mSensorManager.registerListener(mSensorListener, mSensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER), SensorManager.SENSOR_DELAY_NORMAL);
}
@Override
protected void onStop() {
mSensorManager.unregisterListener(mSensorListener);
super.onStop();
}
}
Please help me understand these two lines:
mAccelCurrent = (float) Math.sqrt((double) (x*x + y*y + z*z));//I guess this is for computing the value of the acceleration
and this line, which I don't understand:
mAccel = mAccel * 0.9f + delta;
Thanks in advance.
The sensor will return three values, for acceleration along the three axis directions; these are placed in x, y and z in your code sample. Imagine three masses on springs all at right angles to each other; as you move the device around and the springs stretch and contract, x, y and z contain their lengths.
mAccelCurrent = (float) Math.sqrt((double) (x*x + y*y + z*z));
This computes the magnitude of the acceleration. Imagine that, instead of three masses on springs, you had just one, always pointing exactly in the direction the device is being accelerated in. It's possible to work out what that system would look like from the values we have, and that's the calculation being performed here: mAccelCurrent is how much such a spring would get stretched.
mAccel = mAccel * 0.9f + delta;
This is a high pass filter on the input. Here it has the effect of making sudden changes in acceleration give bigger values. It's not clear from just the code you've posted why this is being done; I am guessing it is to make the code elsewhere that ultimately checks mAccel more sensitive to the forces at the extremes of each shake when the device is being shaken.
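To make that concrete, here is a minimal sketch of how mAccel is typically consumed afterwards. The filter lines mirror the code you posted; the threshold value and the reaction inside the if are assumptions, not something taken from your code:
private float mAccel = 0f;                                 // filtered "shakiness", decays towards 0
private float mAccelCurrent = SensorManager.GRAVITY_EARTH; // current magnitude including gravity
private float mAccelLast = SensorManager.GRAVITY_EARTH;    // previous magnitude
public void onSensorChanged(SensorEvent se) {
    float x = se.values[0];
    float y = se.values[1];
    float z = se.values[2];
    mAccelLast = mAccelCurrent;
    mAccelCurrent = (float) Math.sqrt(x * x + y * y + z * z); // magnitude of the acceleration vector
    float delta = mAccelCurrent - mAccelLast;
    mAccel = mAccel * 0.9f + delta; // spikes during a sudden movement, decays when the device is still
    if (mAccel > 2f) { // threshold is an assumption; raise it to require a harder shake
        // react to the shake here (vibrate, show a message, clear the screen, ...)
    }
}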
Related
I want to perform some action when the user lifts the phone from a flat surface. The method I am using right now is to detect shake motion using the phone's accelerometer, with the following code:
sensorMan = (SensorManager) getSystemService(SENSOR_SERVICE);
accelerometer = sensorMan.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
sensorMan.registerListener(this, accelerometer, SensorManager.SENSOR_STATUS_ACCURACY_HIGH);
public void onSensorChanged(SensorEvent event) {
if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
mGravity = event.values.clone();
// Shake detection
float x = mGravity[0];
float y = mGravity[1];
float z = mGravity[2];
mAccelLast = mAccelCurrent;
mAccelCurrent = FloatMath.sqrt(x * x + y * y + z * z);
float delta = mAccelCurrent - mAccelLast;
mAccel = mAccel * 0.9f + delta;
if (mAccel > 0.9) {
//Perform certain tasks.
}
}
}
The issue I am facing with this code is that the 0.9f threshold is sometimes reached even while the phone is still lying on the flat surface. I tried logging the mAccel value and found it ranging from 9.0 down to 0.4 even when the phone is not being touched at all. Is there any guaranteed way to detect the phone being lifted?
Solved the issue. All I needed to do was check the "Y" value mentioned in the question and test whether it was greater than 1.0.
Note that if the phone is held in a vertical position, Y is always around 9.8, but in that case you can check X instead. In my case the user had to lift the phone and would tilt it at some point, so I put in a check like if (y >= 1.0 && y <= 2.0).
EDIT : UPDATED CODE
@Override
public void onSensorChanged(SensorEvent event) {
try {
if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
mGravity = event.values.clone();
// Shake detection
float x = mGravity[0];
float y = mGravity[1];
float z = mGravity[2];
float yAbs = Math.abs(mGravity[1]);
mAccelLast = mAccelCurrent;
mAccelCurrent = FloatMath.sqrt(x * x + y * y + z * z);
float delta = mAccelCurrent - mAccelLast;
mAccel = mAccel * 0.9f + delta;
if (yAbs > 2.0 && yAbs < 4.0 && !isAlerted() && !isCallActive()) {
alert();
}
}
} catch (Exception e) {
e.printStackTrace();
}
}
I would add the gyroscope to the detection routine too.
The phone gets accelerated AND its orientation changes from roughly x=0, y=0, z=0 to, let's say, y=120; that combination is the trigger.
Look here for information on how to use it.
Another sensor for lift detection would be the proximity sensor: when the phone lies flat on the desk the distance reads 0, and if it's picked up that value rises quickly.
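A rough sketch of listening for that proximity transition (the field names, the "previously covered" flag and where you register the listener are illustrative assumptions):
private SensorManager mSensorManager;
private boolean mPreviouslyCovered;
private final SensorEventListener mProximityListener = new SensorEventListener() {
    public void onSensorChanged(SensorEvent se) {
        // Most proximity sensors report either ~0 (something near) or their maximum range.
        boolean covered = se.values[0] < se.sensor.getMaximumRange();
        if (mPreviouslyCovered && !covered) {
            // just went from "near" to "far" -> the phone may have been picked up
        }
        mPreviouslyCovered = covered;
    }
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
    }
};
// in onResume():
// mSensorManager.registerListener(mProximityListener,
//         mSensorManager.getDefaultSensor(Sensor.TYPE_PROXIMITY),
//         SensorManager.SENSOR_DELAY_NORMAL);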
I am trying to use the rotation vector sensor to detect when the user turns their device over, the way the Galaxy S3 does to stop music.
My code:
private SensorManager mSensorManager;
private final SensorEventListener mSensorListener = new SensorEventListener() {
public void onSensorChanged(SensorEvent se) {
float x = se.values[0];
float y = se.values[1];
float z = se.values[2];
System.out.println("X Vector : " + x + " / Y Vector : " + y + " / Z Vector : " + z);
if (/* condition */) {
//method1();
}
}
public void onAccuracyChanged(Sensor sensor, int accuracy) {
}
};
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
mSensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
mSensorManager.registerListener(mSensorListener, mSensorManager.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR), SensorManager.SENSOR_DELAY_NORMAL);
}
I use System.out.println to see how the x, y and z variables change while I turn the device over, but I don't understand these values. When I leave the device on the table and start the activity, x, y and z are not always 0. Then, when I turn it over, all of the values change but stay very close together (values between -1 and 1).
My question is: how can I find the right axis, and what value do I have to put in my condition to detect the turn-over?
EDIT: The code finally works fine using the Y axis, but I can't use those values if landscape orientation is allowed in the activity. The Y-axis values are only correct when the activity is locked to portrait. Any idea how to handle both orientations?
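One approach is to stop reading the raw rotation vector components and instead convert them to a rotation matrix; the matrix element at index 8 is the world-Z component of the device's screen normal, so it is close to +1 face up and -1 face down, independent of whether the activity is in portrait or landscape. A sketch, untested against the code above (the -0.9 threshold is an assumption):
private final float[] mRotationMatrix = new float[9];
public void onSensorChanged(SensorEvent se) {
    if (se.sensor.getType() != Sensor.TYPE_ROTATION_VECTOR) return;
    SensorManager.getRotationMatrixFromVector(mRotationMatrix, se.values);
    // mRotationMatrix maps device coordinates to world coordinates, so element [2][2]
    // (index 8) tells us where the device's Z axis (out of the screen) is pointing.
    boolean faceDown = mRotationMatrix[8] < -0.9f;
    if (faceDown) {
        // the device has been turned over -> e.g. stop the music
    }
}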
I'm trying to use the linear acceleration fusion sensor in my Android app. However, it cannot find the sensor for Sensor.TYPE_LINEAR_ACCELERATION.
I perform these calls:
mSensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
mAccelerometer = mSensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
mLinearAcceleration = mSensorManager.getDefaultSensor(Sensor.TYPE_LINEAR_ACCELERATION);
After the above, mAccelerometer is defined, but mLinearAcceleration is null. I know that TYPE_LINEAR_ACCELERATION was not added until API level 9, but I think I am running at API level 9 or higher. Here is a snippet of my manifest file:
<uses-sdk
android:minSdkVersion="9"
android:targetSdkVersion="17" />
So I believe I am working under the correct API level.
However, when I try listing all sensors that are available, I do not find all sensors that should be available for this API level.
When I call...
List<Sensor> l = mSensorManager.getSensorList(Sensor.TYPE_ALL);
for(Sensor s : l)
System.out.println(s.getName());
I get
Goldfish 3-axis Accelerometer
Goldfish 3-axis Magnetic field sensor
Goldfish Orientation sensor
Goldfish Temperature sensor
Goldfish Proximity sensor
None of these is linear acceleration, and one is Orientation, a deprecated sensor. It appears my sensors are behaving as though they predate API level 9, but I don't know why that would be. Is there some field in my project that could be forcing the app to behave as if it were built against an older API (other than the minimum SDK version in the AndroidManifest.xml file, which is already set to 9)?
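As a side note on why the sensor is missing: the "Goldfish" sensors in that list belong to the Android emulator, and your own listing shows that it simply does not expose a linear acceleration sensor, so getDefaultSensor() returns null regardless of the API level you build against. If you need a fallback on devices without it, you can approximate linear acceleration from the raw accelerometer with a low-pass gravity estimate. A minimal sketch (ALPHA and the field names are assumptions to tune, not a fixed recipe):
private static final float ALPHA = 0.8f;
private final float[] mGravityEstimate = new float[3];
private final float[] mLinearAccel = new float[3];
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() != Sensor.TYPE_ACCELEROMETER) return;
    for (int i = 0; i < 3; i++) {
        // the low-pass filter isolates the slowly changing gravity component...
        mGravityEstimate[i] = ALPHA * mGravityEstimate[i] + (1 - ALPHA) * event.values[i];
        // ...and subtracting it leaves the linear (user-caused) acceleration
        mLinearAccel[i] = event.values[i] - mGravityEstimate[i];
    }
}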
Put this in your manifest:
<uses-permission android:name="android.permission.SET_ORIENTATION" />
It is for access to your accelerometer.
try this:
/* put this into your activity class */
private SensorManager mSensorManager;
private float mAccel; // acceleration apart from gravity
private float mAccelCurrent; // current acceleration including gravity
private float mAccelLast; // last acceleration including gravity
private final SensorEventListener mSensorListener = new SensorEventListener() {
public void onSensorChanged(SensorEvent se) {
float x = se.values[0];
float y = se.values[1];
float z = se.values[2];
mAccelLast = mAccelCurrent;
mAccelCurrent = (float) Math.sqrt((double) (x*x + y*y + z*z));
float delta = mAccelCurrent - mAccelLast;
mAccel = mAccel * 0.9f + delta; // perform low-cut filter
}
public void onAccuracyChanged(Sensor sensor, int accuracy) {
}
};
@Override
protected void onResume() {
super.onResume();
mSensorManager.registerListener(mSensorListener, mSensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER), SensorManager.SENSOR_DELAY_NORMAL);
}
@Override
protected void onPause() {
mSensorManager.unregisterListener(mSensorListener);
super.onPause();
}
And this:
/* do this in onCreate */
mSensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
mSensorManager.registerListener(mSensorListener, mSensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER), SensorManager.SENSOR_DELAY_NORMAL);
mAccel = 0.00f;
mAccelCurrent = SensorManager.GRAVITY_EARTH;
mAccelLast = SensorManager.GRAVITY_EARTH;
I am building an Android game and I want to figure out whether the user tilts the device to the left or to the right (similar to how Temple Run works when you move the character from side to side).
I have read many tutorials and examples and built sample applications, but the amount of data I get back from both the gyroscope and the accelerometer is overwhelming. Would I need both sensors to work out whether the user tilts the device, and in which direction?
My current application detects every slight movement, which is obviously not what I want.
public class Main extends Activity {
private SensorManager mSensorManager;
private float mAccel; // acceleration apart from gravity
private float mAccelCurrent; // current acceleration including gravity
private float mAccelLast; // last acceleration including gravity
private RelativeLayout background;
private Boolean isleft = true;
/** Called when the activity is first created. */
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.main);
this.background = (RelativeLayout) findViewById(R.id.RelativeLayout1);
mSensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
mSensorManager.registerListener(mSensorListener, mSensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER), SensorManager.SENSOR_DELAY_NORMAL);
mAccel = 0.00f;
mAccelCurrent = SensorManager.GRAVITY_EARTH;
mAccelLast = SensorManager.GRAVITY_EARTH;
/* float x1, x2, y1, y2;
String direction;
switch(event.getAction()) {
case(MotionEvent.ACTION_DOWN):
x1 = event.getX();
y1 = event.getY();
break;
case(MotionEvent.ACTION_UP) {
x2 = event.getX();
y2 = event.getY();
float dx = x2-x1;
float dy = y2-y1;
// Use dx and dy to determine the direction
if(Math.abs(dx) > Math.abs(dy)) {
if(dx>0) directiion = "right";
else direction = "left";
} else {
if(dy>0) direction = "down";
else direction = "up";
}
}
}*/
}
private final SensorEventListener mSensorListener = new SensorEventListener() {
public void onSensorChanged(SensorEvent se) {
float x = se.values[0];
float y = se.values[1];
float z = se.values[2];
if((mAccelLast<mAccelCurrent)&&(isleft == true)){
background.setBackgroundResource(R.drawable.bg_right);
isleft = false;
}
if((mAccelLast>mAccelCurrent)&&(isleft == false)){
background.setBackgroundResource(R.drawable.bg_left);
isleft = true;
}
mAccelLast = mAccelCurrent;
mAccelCurrent = (float) Math.sqrt((double) (x*x + y*y + z*z));
float delta = mAccelCurrent - mAccelLast;
Log.d("FB", "delta : "+delta);
mAccel = mAccel * 0.9f + delta; // perform low-cut filter
// Log.d("FB", "mAccel : "+mAccel);
}
public void onAccuracyChanged(Sensor sensor, int accuracy) {
}
};
}
Would I be better off using just the accelerometer, just the gyroscope or would I need both?
This post links to the differences between the two: Android accelerometer and gyroscope
http://diydrones.com/profiles/blogs/faq-whats-the-difference
http://answers.oreilly.com/topic/1751-mobile-accelerometers-and-gyroscopes-explained/
The documentation will also help: http://developer.android.com/guide/topics/sensors/sensors_motion.html
From my VERY limited experience, the gyro constantly measures the x, y, z rotation and keeps updating. It is useful for steering a car/plane/character in a game. The accelerometer is a little more like a Wii remote, for swinging around or picking up a shake gesture.
From my experience with the accelerometer and the gravity method, you can use it for x, y and z rotation. Just search Google for "vector method accelerometer". I have used this method together with a compass to correct the coordinates for tilt.
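For a Temple Run style left/right tilt, the accelerometer alone is usually enough: with the device held upright, gravity leaks into the X axis as you tilt sideways, so you only need a dead zone to ignore slight movements. A minimal sketch (the 3 m/s^2 threshold and the sign convention are assumptions to tune on a real device):
private static final float TILT_THRESHOLD = 3.0f; // dead zone in m/s^2
public void onSensorChanged(SensorEvent se) {
    if (se.sensor.getType() != Sensor.TYPE_ACCELEROMETER) return;
    float x = se.values[0];
    if (x > TILT_THRESHOLD) {
        // tilted to the left (left edge dipped down) -> move the character left, for example
    } else if (x < -TILT_THRESHOLD) {
        // tilted to the right -> move the character right
    }
    // anything in between is treated as "held level", so slight movements are ignored
}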
This may be an easy question, but I'm stuck. I am trying to implement the "shake to erase" feature in a drawing program (simple paint app). I can't get it to work. Here's my code:
private final SensorEventListener mSensorListener = new SensorEventListener() {
public void onSensorChanged(SensorEvent se) {
float x = se.values[0];
float y = se.values[1];
float z = se.values[2];
mAccelLast = mAccelCurrent;
mAccelCurrent = (float) Math.sqrt((double) (x*x + y*y + z*z));
float delta = mAccelCurrent - mAccelLast;
mAccel = mAccel * 0.9f + delta; // perform low-cut filter
if (mAccel > 2) {
mView.onDraw(mCanvas);
mCanvas.drawBitmap(cache, 0, 0, new Paint(Paint.DITHER_FLAG));
}
}
public void onAccuracyChanged(Sensor sensor, int accuracy) {
}
};
The SensorEventListener is based on this example. I do get into the if statement, but the canvas won't reset until after I've touched the screen (a new touch event).
I'd like the canvas to reset/erase during the shake event, without any further prompts from the user necessary.
Any help would be wonderful, thank you!
You might have to call invalidate() on the view to get it to redraw.
Hope that helps!
http://developer.android.com/guide/topics/graphics/index.html
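If the drawing view keeps its strokes on a backing bitmap, the shake branch can clear that bitmap and then ask the view to redraw itself. A sketch based on the snippet above (mView, mCanvas and the white background colour are assumptions about your setup):
if (mAccel > 2) {
    mCanvas.drawColor(Color.WHITE); // wipe the backing bitmap; the background colour is an assumption
    mView.postInvalidate();         // schedules onDraw() on the UI thread, no touch event needed
}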