Android: detect phone lifting action

I want to perform some activity when the user lifts the phone from a flat surface. The method I am using right now is to detect a shake motion using the phone's accelerometer, with the following code:
sensorMan = (SensorManager) getSystemService(SENSOR_SERVICE);
accelerometer = sensorMan.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
// Note: the third argument should be a SENSOR_DELAY_* rate constant;
// SENSOR_STATUS_ACCURACY_HIGH is an accuracy status, not a rate.
sensorMan.registerListener(this, accelerometer, SensorManager.SENSOR_DELAY_NORMAL);
@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
        mGravity = event.values.clone();
        // Shake detection
        float x = mGravity[0];
        float y = mGravity[1];
        float z = mGravity[2];
        mAccelLast = mAccelCurrent;
        // FloatMath is deprecated; use Math.sqrt instead.
        mAccelCurrent = (float) Math.sqrt(x * x + y * y + z * z);
        float delta = mAccelCurrent - mAccelLast;
        mAccel = mAccel * 0.9f + delta;
        if (mAccel > 0.9) {
            // Perform certain tasks.
        }
    }
}
The issue I am facing with this code is that the 0.9f threshold is sometimes reached even while the phone is still on the flat surface. I logged the mAccel value and found it ranging from 9.0 to 0.4 even when the phone was not being touched. Is there any reliable way to detect the phone being lifted?

Solved the issue. All I needed to do was check the "Y" value shown in the question and test whether it was greater than 1.0.
Note that if the phone is held vertically, Y is always around 9.8, but in that case you can check X instead. In my case the user had to lift the phone and would tilt it at some point, so I put in a check for if (y >= 1.0 && y <= 2.0).
EDIT : UPDATED CODE
@Override
public void onSensorChanged(SensorEvent event) {
    try {
        if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
            mGravity = event.values.clone();
            // Shake detection
            float x = mGravity[0];
            float y = mGravity[1];
            float z = mGravity[2];
            float yAbs = Math.abs(mGravity[1]);
            mAccelLast = mAccelCurrent;
            mAccelCurrent = (float) Math.sqrt(x * x + y * y + z * z);
            float delta = mAccelCurrent - mAccelLast;
            mAccel = mAccel * 0.9f + delta;
            if (yAbs > 2.0 && yAbs < 4.0 && !isAlerted() && !isCallActive()) {
                alert();
            }
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
}

I would add the gyroscope to the detection routine too.
The phone gets accelerated AND rotated up from roughly x=0, y=0, z=0 to, let's say, y=120; that's the trigger.
See the Android sensor documentation for information on how to use it.
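A minimal sketch of that idea: rather than integrating the rotation up to a target angle, just look for a spike in rotation rate while the device was previously at rest. LIFT_RATE_THRESHOLD and onLifted() are made-up names, not Android APIs.
// Sketch only: assumes this class implements SensorEventListener and
// has registered Sensor.TYPE_GYROSCOPE.
private static final float LIFT_RATE_THRESHOLD = 1.0f; // rad/s, tune on hardware

@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_GYROSCOPE) {
        float wx = event.values[0]; // angular speed around x, in rad/s
        float wy = event.values[1];
        float wz = event.values[2];
        float omega = (float) Math.sqrt(wx * wx + wy * wy + wz * wz);
        // A phone resting on a table reports roughly 0 rad/s; picking it up
        // produces a clear spike in rotation rate.
        if (omega > LIFT_RATE_THRESHOLD) {
            onLifted();
        }
    }
}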
Another sensor for lift detection would be the proximity sensor: while the phone lies face down on the desk (or sits in a pocket) the reported distance is near 0, and it changes quickly once the phone is picked up.
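A hedged sketch of the proximity approach. Note that most proximity sensors are effectively binary, reporting either 0 (near) or their maximum range (far), so the check is a near-to-far transition rather than a continuous distance; onLifted() is again a hypothetical callback.
SensorManager sm = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
Sensor proximity = sm.getDefaultSensor(Sensor.TYPE_PROXIMITY);
sm.registerListener(new SensorEventListener() {
    private boolean wasNear = false;

    @Override
    public void onSensorChanged(SensorEvent event) {
        // Many devices report only 0 (near) or the sensor's maximum range (far).
        boolean near = event.values[0] < event.sensor.getMaximumRange();
        if (wasNear && !near) {
            onLifted(); // transition from covered to uncovered
        }
        wasNear = near;
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}, proximity, SensorManager.SENSOR_DELAY_NORMAL);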

Related

onSensorChanged: check if the mobile is in a horizontal position (not landscape)

I need to detect when the mobile moves to a horizontal position (not landscape; I mean laid flat, as when you place it on a table).
I know I have to use the SensorEventListener interface and the onSensorChanged event, but I cannot find an example of how to check the device's position.
@Override
public void onSensorChanged(SensorEvent event) {
    // ...
}
Try this:
@Override
public void onSensorChanged(SensorEvent event) {
    float[] values = event.values;
    float x = values[0];
    float y = values[1];
    float z = values[2];
    float norm = (float) Math.sqrt(x * x + y * y + z * z);
    // Normalize the accelerometer vector
    x = x / norm;
    y = y / norm;
    z = z / norm;
    int inclination = (int) Math.round(Math.toDegrees(Math.acos(z)));
    if (inclination < 25 || inclination > 155) {
        // device is horizontal (facing up or facing down)
        Toast.makeText(this, "device horizontal!", Toast.LENGTH_SHORT).show();
    }
}
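For completeness, a minimal sketch of the registration this snippet assumes (an Activity implementing SensorEventListener); remember to unregister in onPause() to save battery:
@Override
protected void onResume() {
    super.onResume();
    SensorManager sm = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
    sm.registerListener(this,
            sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),
            SensorManager.SENSOR_DELAY_NORMAL);
}

@Override
protected void onPause() {
    super.onPause();
    ((SensorManager) getSystemService(Context.SENSOR_SERVICE)).unregisterListener(this);
}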

How to shake up and down using Accelerometer in Android?

I tried the following approach, but it doesn't work reliably: even when the device is held still in the normal way, it fires the shake event.
@Override
public void onSensorChanged(SensorEvent event) {
    long curTime = System.currentTimeMillis();
    // only allow one update every 200 ms.
    if ((curTime - lastUpdate) > 200) {
        lastUpdate = curTime;
        // SensorManager.DATA_X/Y/Z are deprecated; use plain indices instead.
        x = event.values[0];
        y = event.values[1];
        z = event.values[2];
        Vibrator vibrate = (Vibrator) getSystemService(Context.VIBRATOR_SERVICE);
        if (Round(y, 4) > 22) {
            vibrate.vibrate(200);
            Log.d("sensor", "==== Up Detected ====");
        } else if (Round(y, 4) < -20) {
            vibrate.vibrate(200);
            Log.d("sensor", "==== Down Detected ====");
        }
        mLastX = x;
        mLastY = y;
        mLastZ = z;
    }
}

public static float Round(float Rval, int Rpl) {
    float p = (float) Math.pow(10, Rpl);
    Rval = Rval * p;
    float tmp = Math.round(Rval);
    return tmp / p;
}
Please help; any help would be highly appreciated. Thanks.
Sensor values are very raw and noisy. You need to add a layer of digital signal processing on top of them to get good results; using the raw values directly will lead to a lot of jitter. Your time delay is a basic start, but you need more filtering.
Also, you're vibrating in response to a shake. That vibration will itself show up on the accelerometer, leading to more false positives.
Apply a low-pass filter to your y-axis values.
The basic filter is as given in the documentation:
// gravity and linear_acceleration are assumed to be float[3] fields.
public void onSensorChanged(SensorEvent event) {
    // In this example, alpha is calculated as t / (t + dT),
    // where t is the low-pass filter's time-constant and
    // dT is the event delivery rate.
    final float alpha = 0.8f;
    // Isolate the force of gravity with the low-pass filter.
    gravity[0] = alpha * gravity[0] + (1 - alpha) * event.values[0];
    gravity[1] = alpha * gravity[1] + (1 - alpha) * event.values[1];
    gravity[2] = alpha * gravity[2] + (1 - alpha) * event.values[2];
    // Remove the gravity contribution with the high-pass filter.
    linear_acceleration[0] = event.values[0] - gravity[0];
    linear_acceleration[1] = event.values[1] - gravity[1];
    linear_acceleration[2] = event.values[2] - gravity[2];
}
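To tie this back to the question, a hedged sketch of how the filtered values could drive the up/down detection; SHAKE_THRESHOLD is an assumed starting value to tune on real hardware, not something from the Android docs.
// Uses the linear_acceleration array produced by the filter above.
private static final float SHAKE_THRESHOLD = 6.0f; // m/s^2

private void detectShake() {
    float yAccel = linear_acceleration[1];
    if (yAccel > SHAKE_THRESHOLD) {
        Log.d("sensor", "==== Up Detected ====");
    } else if (yAccel < -SHAKE_THRESHOLD) {
        Log.d("sensor", "==== Down Detected ====");
    }
}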

Using Android gyroscope instead of accelerometer. I find lots of bits and pieces, but no complete code

The Sensor Fusion video looks great, but there's no code:
http://www.youtube.com/watch?v=C7JQ7Rpwn2k&feature=player_detailpage#t=1315s
Here is my code which just uses accelerometer and compass. I also use a Kalman filter on the 3 orientation values, but that's too much code to show here. Ultimately, this works ok, but the result is either too jittery or too laggy depending on what I do with the results and how low I make the filtering factors.
/** Just accelerometer and magnetic sensors */
public abstract class SensorsListener2 implements SensorEventListener {
    /** The lower this is, the greater the preference which is given to previous values. (slows change) */
    private static final float accelFilteringFactor = 0.1f;
    private static final float magFilteringFactor = 0.01f;

    public abstract boolean getIsLandscape();

    @Override
    public void onSensorChanged(SensorEvent event) {
        Sensor sensor = event.sensor;
        int type = sensor.getType();
        switch (type) {
            case Sensor.TYPE_MAGNETIC_FIELD:
                mags[0] = event.values[0] * magFilteringFactor + mags[0] * (1.0f - magFilteringFactor);
                mags[1] = event.values[1] * magFilteringFactor + mags[1] * (1.0f - magFilteringFactor);
                mags[2] = event.values[2] * magFilteringFactor + mags[2] * (1.0f - magFilteringFactor);
                isReady = true;
                break;
            case Sensor.TYPE_ACCELEROMETER:
                accels[0] = event.values[0] * accelFilteringFactor + accels[0] * (1.0f - accelFilteringFactor);
                accels[1] = event.values[1] * accelFilteringFactor + accels[1] * (1.0f - accelFilteringFactor);
                accels[2] = event.values[2] * accelFilteringFactor + accels[2] * (1.0f - accelFilteringFactor);
                break;
            default:
                return;
        }
        if (mags != null && accels != null && isReady) {
            isReady = false;
            SensorManager.getRotationMatrix(rot, inclination, accels, mags);
            boolean isLandscape = getIsLandscape();
            if (isLandscape) {
                outR = rot;
            } else {
                // Remap the coordinates to work in portrait mode.
                SensorManager.remapCoordinateSystem(rot, SensorManager.AXIS_X, SensorManager.AXIS_Z, outR);
            }
            SensorManager.getOrientation(outR, values);
            double x180pi = 180.0 / Math.PI;
            float azimuth = (float) (values[0] * x180pi);
            float pitch = (float) (values[1] * x180pi);
            float roll = (float) (values[2] * x180pi);
            // In landscape mode swap pitch and roll and invert the pitch.
            if (isLandscape) {
                float tmp = pitch;
                pitch = -roll;
                roll = -tmp;
                azimuth = 180 - azimuth;
            } else {
                pitch = -pitch - 90;
                azimuth = 90 - azimuth;
            }
            onOrientationChanged(azimuth, pitch, roll);
        }
    }

    private float[] mags = new float[3];
    private float[] accels = new float[3];
    private boolean isReady;
    private float[] rot = new float[9];
    private float[] outR = new float[9];
    private float[] inclination = new float[9];
    private float[] values = new float[3];

    /**
     * Azimuth: angle between the magnetic north direction and the Y axis, around the Z axis (0 to 359). 0=North, 90=East, 180=South, 270=West
     * Pitch: rotation around X axis (-180 to 180), with positive values when the z-axis moves toward the y-axis.
     * Roll: rotation around Y axis (-90 to 90), with positive values when the x-axis moves toward the z-axis.
     */
    public abstract void onOrientationChanged(float azimuth, float pitch, float roll);
}
I tried to figure out how to add gyroscope data, but I am just not doing it right. The Google doc at http://developer.android.com/reference/android/hardware/SensorEvent.html shows some code to get a delta matrix from the gyroscope data. The idea seems to be that I'd crank down the filters for the accelerometer and magnetic sensors so that they were really stable. That would keep track of the long-term orientation.
Then, I'd keep a history of the most recent N delta matrices from the gyroscope. Each time I got a new one I'd drop off the oldest one and multiply them all together to get a final matrix which I would multiply against the stable matrix returned by the accelerometer and magnetic sensors.
This doesn't seem to work. Or, at least, my implementation of it does not work. The result is far more jittery than just the accelerometer. Increasing the size of the gyroscope history actually increases the jitter which makes me think that I'm not calculating the right values from the gyroscope.
public abstract class SensorsListener3 implements SensorEventListener {
    /** The lower this is, the greater the preference which is given to previous values. (slows change) */
    private static final float kFilteringFactor = 0.001f;
    private static final float magKFilteringFactor = 0.001f;

    public abstract boolean getIsLandscape();

    @Override
    public void onSensorChanged(SensorEvent event) {
        Sensor sensor = event.sensor;
        int type = sensor.getType();
        switch (type) {
            case Sensor.TYPE_MAGNETIC_FIELD:
                mags[0] = event.values[0] * magKFilteringFactor + mags[0] * (1.0f - magKFilteringFactor);
                mags[1] = event.values[1] * magKFilteringFactor + mags[1] * (1.0f - magKFilteringFactor);
                mags[2] = event.values[2] * magKFilteringFactor + mags[2] * (1.0f - magKFilteringFactor);
                isReady = true;
                break;
            case Sensor.TYPE_ACCELEROMETER:
                accels[0] = event.values[0] * kFilteringFactor + accels[0] * (1.0f - kFilteringFactor);
                accels[1] = event.values[1] * kFilteringFactor + accels[1] * (1.0f - kFilteringFactor);
                accels[2] = event.values[2] * kFilteringFactor + accels[2] * (1.0f - kFilteringFactor);
                break;
            case Sensor.TYPE_GYROSCOPE:
                gyroscopeSensorChanged(event);
                break;
            default:
                return;
        }
        if (mags != null && accels != null && isReady) {
            isReady = false;
            SensorManager.getRotationMatrix(rot, inclination, accels, mags);
            boolean isLandscape = getIsLandscape();
            if (isLandscape) {
                outR = rot;
            } else {
                // Remap the coordinates to work in portrait mode.
                SensorManager.remapCoordinateSystem(rot, SensorManager.AXIS_X, SensorManager.AXIS_Z, outR);
            }
            if (gyroUpdateTime != 0) {
                matrixHistory.mult(matrixTmp, matrixResult);
                outR = matrixResult;
            }
            SensorManager.getOrientation(outR, values);
            double x180pi = 180.0 / Math.PI;
            float azimuth = (float) (values[0] * x180pi);
            float pitch = (float) (values[1] * x180pi);
            float roll = (float) (values[2] * x180pi);
            // In landscape mode swap pitch and roll and invert the pitch.
            if (isLandscape) {
                float tmp = pitch;
                pitch = -roll;
                roll = -tmp;
                azimuth = 180 - azimuth;
            } else {
                pitch = -pitch - 90;
                azimuth = 90 - azimuth;
            }
            onOrientationChanged(azimuth, pitch, roll);
        }
    }

    private void gyroscopeSensorChanged(SensorEvent event) {
        // This timestep's delta rotation to be multiplied by the current rotation
        // after computing it from the gyro sample data.
        if (gyroUpdateTime != 0) {
            final float dT = (event.timestamp - gyroUpdateTime) * NS2S;
            // Axis of the rotation sample, not normalized yet.
            float axisX = event.values[0];
            float axisY = event.values[1];
            float axisZ = event.values[2];
            // Calculate the angular speed of the sample
            float omegaMagnitude = (float) Math.sqrt(axisX * axisX + axisY * axisY + axisZ * axisZ);
            // Normalize the rotation vector if it's big enough to get the axis
            if (omegaMagnitude > EPSILON) {
                axisX /= omegaMagnitude;
                axisY /= omegaMagnitude;
                axisZ /= omegaMagnitude;
            }
            // Integrate around this axis with the angular speed by the timestep
            // in order to get a delta rotation from this sample over the timestep.
            // We convert this axis-angle representation of the delta rotation
            // into a quaternion before turning it into the rotation matrix.
            float thetaOverTwo = omegaMagnitude * dT / 2.0f;
            float sinThetaOverTwo = (float) Math.sin(thetaOverTwo);
            float cosThetaOverTwo = (float) Math.cos(thetaOverTwo);
            deltaRotationVector[0] = sinThetaOverTwo * axisX;
            deltaRotationVector[1] = sinThetaOverTwo * axisY;
            deltaRotationVector[2] = sinThetaOverTwo * axisZ;
            deltaRotationVector[3] = cosThetaOverTwo;
        }
        gyroUpdateTime = event.timestamp;
        SensorManager.getRotationMatrixFromVector(deltaRotationMatrix, deltaRotationVector);
        // User code should concatenate the delta rotation we computed with the current rotation
        // in order to get the updated rotation.
        // rotationCurrent = rotationCurrent * deltaRotationMatrix;
        matrixHistory.add(deltaRotationMatrix);
    }

    private float[] mags = new float[3];
    private float[] accels = new float[3];
    private boolean isReady;
    private float[] rot = new float[9];
    private float[] outR = new float[9];
    private float[] inclination = new float[9];
    private float[] values = new float[3];

    // gyroscope stuff
    private long gyroUpdateTime = 0;
    private static final float NS2S = 1.0f / 1000000000.0f;
    private float[] deltaRotationMatrix = new float[9];
    private final float[] deltaRotationVector = new float[4];
    // TODO: I have no idea how small this value should be.
    private static final float EPSILON = 0.000001f;
    private float[] matrixMult = new float[9];
    private MatrixHistory matrixHistory = new MatrixHistory(100);
    private float[] matrixTmp = new float[9];
    private float[] matrixResult = new float[9];

    /**
     * Azimuth: angle between the magnetic north direction and the Y axis, around the Z axis (0 to 359). 0=North, 90=East, 180=South, 270=West
     * Pitch: rotation around X axis (-180 to 180), with positive values when the z-axis moves toward the y-axis.
     * Roll: rotation around Y axis (-90 to 90), with positive values when the x-axis moves toward the z-axis.
     */
    public abstract void onOrientationChanged(float azimuth, float pitch, float roll);
}
public class MatrixHistory {
    public MatrixHistory(int size) {
        vals = new float[size][];
    }

    public void add(float[] val) {
        synchronized (vals) {
            vals[ix] = val;
            ix = (ix + 1) % vals.length;
            if (ix == 0)
                full = true;
        }
    }

    public void mult(float[] tmp, float[] output) {
        synchronized (vals) {
            if (full) {
                for (int i = 0; i < vals.length; ++i) {
                    if (i == 0) {
                        System.arraycopy(vals[i], 0, output, 0, vals[i].length);
                    } else {
                        MathUtils.multiplyMatrix3x3(output, vals[i], tmp);
                        System.arraycopy(tmp, 0, output, 0, tmp.length);
                    }
                }
            } else {
                if (ix == 0)
                    return;
                for (int i = 0; i < ix; ++i) {
                    if (i == 0) {
                        System.arraycopy(vals[i], 0, output, 0, vals[i].length);
                    } else {
                        MathUtils.multiplyMatrix3x3(output, vals[i], tmp);
                        System.arraycopy(tmp, 0, output, 0, tmp.length);
                    }
                }
            }
        }
    }

    private int ix = 0;
    private boolean full = false;
    private float[][] vals;
}
The second block of code contains my changes from the first block of code which add the gyroscope to the mix.
Specifically, the filtering factor for accel is made smaller (making the value more stable). The MatrixHistory class keeps track of the last 100 gyroscope deltaRotationMatrix values which are calculated in the gyroscopeSensorChanged method.
I've seen many questions on this site on this topic. They've helped me get to this point, but I cannot figure out what to do next. I really wish the Sensor Fusion guy had just posted some code somewhere. He obviously had it all put together.
Well, +1 to you for even knowing what a Kalman filter is. If you'd like, I'll edit this post and give you the code I wrote a couple years ago to do what you're trying to do.
But first, I'll tell you why you don't need it.
Modern implementations of the Android sensor stack use Sensor Fusion, as Stan mentioned above. This just means that all of the available data -- accel, mag, gyro -- is collected together in one algorithm, and then all the outputs are read back out in the form of Android sensors.
Edit: I just stumbled on this superb Google Tech Talk on the subject: Sensor Fusion on Android Devices: A Revolution in Motion Processing. Well worth the 45 minutes to watch it if you're interested in the topic.
In essence, Sensor Fusion is a black box. I've looked into the source code of the Android implementation, and it's a big Kalman filter written in C++. Some pretty good code in there, far more sophisticated than any filter I ever wrote, and probably more sophisticated than what you're writing. Remember, these guys are doing this for a living.
I also know that at least one chipset manufacturer has their own sensor fusion implementation. The manufacturer of the device then chooses between the Android and the vendor implementation based on their own criteria.
Finally, as Stan mentioned above, Invensense has their own sensor fusion implementation at the chip level.
Anyway, what it all boils down to is that the built-in sensor fusion in your device is likely to be superior to anything you or I could cobble together. So what you really want to do is to access that.
In Android, there are both physical and virtual sensors. The virtual sensors are the ones that are synthesized from the available physical sensors. The best-known example is TYPE_ORIENTATION which takes accelerometer and magnetometer and creates roll/pitch/heading output. (By the way, you should not use this sensor; it has too many limitations.)
But the important thing is that newer versions of Android contain these two new virtual sensors:
TYPE_GRAVITY is the accelerometer input with the effect of motion filtered out
TYPE_LINEAR_ACCELERATION is the accelerometer with the gravity component filtered out.
These two virtual sensors are synthesized through a combination of accelerometer input and gyro input.
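As an aside, these fused sensors also make the shake-detection problems earlier on this page much simpler. A hedged sketch, with an assumed threshold to tune:
// Sketch only: shake detection via the fused linear-acceleration sensor.
private static final float SHAKE_THRESHOLD = 12.0f; // m/s^2, made-up starting value

public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_LINEAR_ACCELERATION) {
        float x = event.values[0];
        float y = event.values[1];
        float z = event.values[2];
        // Gravity is already removed, so the magnitude is ~0 at rest.
        if (Math.sqrt(x * x + y * y + z * z) > SHAKE_THRESHOLD) {
            // shake detected
        }
    }
}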
Another notable sensor is TYPE_ROTATION_VECTOR which is a Quaternion synthesized from accelerometer, magnetometer, and gyro. It represents the full 3-d orientation of the device with the effects of linear acceleration filtered out.
However, Quaternions are a little bit abstract for most people, and since you're likely working with 3-d transformations anyway, your best approach is to combine TYPE_GRAVITY and TYPE_MAGNETIC_FIELD via SensorManager.getRotationMatrix().
One more point: if you're working with a device running an older version of Android, you need to detect that you're not receiving TYPE_GRAVITY events and use TYPE_ACCELEROMETER instead. Theoretically, this would be a place to use your own Kalman filter, but if your device doesn't have sensor fusion built in, it probably doesn't have gyros either.
Anyway, here's some sample code to show how I do it.
// Requires 1.5 or above
class Foo extends Activity implements SensorEventListener {
    private static final String TAG = "Foo";
    private static final float DEG = (float) (180.0 / Math.PI); // radians to degrees

    SensorManager sensorManager;
    float[] gData = new float[3]; // Gravity or accelerometer
    float[] mData = new float[3]; // Magnetometer
    float[] orientation = new float[3];
    float[] Rmat = new float[9];
    float[] R2 = new float[9];
    float[] Imat = new float[9];
    boolean haveGrav = false;
    boolean haveAccel = false;
    boolean haveMag = false;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Get the sensor manager from system services
        sensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
    }

    @Override
    protected void onResume() {
        super.onResume();
        // Register our listeners
        Sensor gsensor = sensorManager.getDefaultSensor(Sensor.TYPE_GRAVITY);
        Sensor asensor = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
        Sensor msensor = sensorManager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD);
        sensorManager.registerListener(this, gsensor, SensorManager.SENSOR_DELAY_GAME);
        sensorManager.registerListener(this, asensor, SensorManager.SENSOR_DELAY_GAME);
        sensorManager.registerListener(this, msensor, SensorManager.SENSOR_DELAY_GAME);
    }

    public void onSensorChanged(SensorEvent event) {
        switch (event.sensor.getType()) {
            case Sensor.TYPE_GRAVITY:
                gData[0] = event.values[0];
                gData[1] = event.values[1];
                gData[2] = event.values[2];
                haveGrav = true;
                break;
            case Sensor.TYPE_ACCELEROMETER:
                if (haveGrav) break; // don't need it, we have better
                gData[0] = event.values[0];
                gData[1] = event.values[1];
                gData[2] = event.values[2];
                haveAccel = true;
                break;
            case Sensor.TYPE_MAGNETIC_FIELD:
                mData[0] = event.values[0];
                mData[1] = event.values[1];
                mData[2] = event.values[2];
                haveMag = true;
                break;
            default:
                return;
        }
        if ((haveGrav || haveAccel) && haveMag) {
            SensorManager.getRotationMatrix(Rmat, Imat, gData, mData);
            SensorManager.remapCoordinateSystem(Rmat,
                    SensorManager.AXIS_Y, SensorManager.AXIS_MINUS_X, R2);
            // Orientation isn't as useful as a rotation matrix, but
            // we'll show it here anyway.
            SensorManager.getOrientation(R2, orientation);
            float incl = SensorManager.getInclination(Imat);
            Log.d(TAG, "mh: " + (int) (orientation[0] * DEG));
            Log.d(TAG, "pitch: " + (int) (orientation[1] * DEG));
            Log.d(TAG, "roll: " + (int) (orientation[2] * DEG));
            Log.d(TAG, "yaw: " + (int) (orientation[0] * DEG));
            Log.d(TAG, "inclination: " + (int) (incl * DEG));
        }
    }

    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}
Hmmm; if you happen to have a Quaternion library handy, it's probably simpler just to receive TYPE_ROTATION_VECTOR and convert that to an array.
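A hedged sketch of that alternative; SensorManager.getRotationMatrixFromVector() converts the rotation-vector quaternion straight into a rotation matrix, so you don't strictly need a quaternion library at all:
// Sketch only: orientation from TYPE_ROTATION_VECTOR.
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR) {
        float[] rotMat = new float[9];
        float[] orientation = new float[3];
        SensorManager.getRotationMatrixFromVector(rotMat, event.values);
        // orientation = {azimuth, pitch, roll}, in radians
        SensorManager.getOrientation(rotMat, orientation);
    }
}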
To the question of where to find complete code, here's the default implementation in Android Jelly Bean: https://android.googlesource.com/platform/frameworks/base/+/jb-release/services/sensorservice/
Start by checking fusion.cpp/h.
It uses Modified Rodrigues Parameters (close to Euler angles) instead of quaternions. In addition to orientation, the Kalman filter estimates gyro drift. For measurement updates it uses the magnetometer and, a bit incorrectly, acceleration (specific force).
To make use of the code you should either be a wizard or know the basics of INS and KF. Many parameters have to be fine-tuned for the filter to work. As Edward adequately put it, these guys do this for a living.
At least in Google's Galaxy Nexus, this default implementation is left unused and is overridden by Invensense's proprietary system.

Android Gyroscope versus Accelerometer

I am building an Android game and I want to figure out whether the user tilts the device to the left or the right (Similar to how Temple Run works when you move the man from side to side).
I have read many tutorials and examples and built sample applications, but the amount of data I get back from both the gyroscope and the accelerometer is overwhelming. Do I need both sensors to work out whether the user tilts the device, and in which direction?
My current application is detecting every slight movement and that is obviously not correct.
public class Main extends Activity {
    private SensorManager mSensorManager;
    private float mAccel; // acceleration apart from gravity
    private float mAccelCurrent; // current acceleration including gravity
    private float mAccelLast; // last acceleration including gravity
    private RelativeLayout background;
    private Boolean isleft = true;

    /** Called when the activity is first created. */
    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.main);
        this.background = (RelativeLayout) findViewById(R.id.RelativeLayout1);
        mSensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
        mSensorManager.registerListener(mSensorListener, mSensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER), SensorManager.SENSOR_DELAY_NORMAL);
        mAccel = 0.00f;
        mAccelCurrent = SensorManager.GRAVITY_EARTH;
        mAccelLast = SensorManager.GRAVITY_EARTH;
        /* float x1, x2, y1, y2;
        String direction;
        switch (event.getAction()) {
            case MotionEvent.ACTION_DOWN:
                x1 = event.getX();
                y1 = event.getY();
                break;
            case MotionEvent.ACTION_UP:
                x2 = event.getX();
                y2 = event.getY();
                float dx = x2 - x1;
                float dy = y2 - y1;
                // Use dx and dy to determine the direction
                if (Math.abs(dx) > Math.abs(dy)) {
                    if (dx > 0) direction = "right";
                    else direction = "left";
                } else {
                    if (dy > 0) direction = "down";
                    else direction = "up";
                }
        } */
    }

    private final SensorEventListener mSensorListener = new SensorEventListener() {
        public void onSensorChanged(SensorEvent se) {
            float x = se.values[0];
            float y = se.values[1];
            float z = se.values[2];
            if ((mAccelLast < mAccelCurrent) && (isleft == true)) {
                background.setBackgroundResource(R.drawable.bg_right);
                isleft = false;
            }
            if ((mAccelLast > mAccelCurrent) && (isleft == false)) {
                background.setBackgroundResource(R.drawable.bg_left);
                isleft = true;
            }
            mAccelLast = mAccelCurrent;
            mAccelCurrent = (float) Math.sqrt((double) (x * x + y * y + z * z));
            float delta = mAccelCurrent - mAccelLast;
            Log.d("FB", "delta : " + delta);
            mAccel = mAccel * 0.9f + delta; // perform low-cut filter
            // Log.d("FB", "mAccel : " + mAccel);
        }

        public void onAccuracyChanged(Sensor sensor, int accuracy) { }
    };
}
Would I be better off using just the accelerometer, just the gyroscope or would I need both?
This post links to the differences between the two: Android accelerometer and gyroscope
http://diydrones.com/profiles/blogs/faq-whats-the-difference
http://answers.oreilly.com/topic/1751-mobile-accelerometers-and-gyroscopes-explained/
The documentation will also help: http://developer.android.com/guide/topics/sensors/sensors_motion.html
From my VERY limited experience, the gyro constantly measures rotation around the x, y and z axes and keeps updating; it is useful for steering a car/plane/character in a game. The accelerometer is a little more like a Wii remote, for swinging around or detecting a shake gesture.
From my experience with the accelerometer, you can use the gravity method for x, y and z rotation. Just type "vector method accelerometer" into Google. I have used this method with a compass for correcting the coordinates due to tilt.
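For the simple left/right tilt the question asks about, the gravity direction alone is usually enough. A hedged sketch; TILT_DEADZONE_DEG is a made-up value to tune for game feel, and the sign convention should be verified on hardware:
private static final double TILT_DEADZONE_DEG = 15.0;

public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
        float x = event.values[0];
        float y = event.values[1];
        // Roll around the long axis of a portrait phone:
        // ~0 when held upright, growing as the device tilts sideways.
        double rollDeg = Math.toDegrees(Math.atan2(x, y));
        if (rollDeg > TILT_DEADZONE_DEG) {
            // tilted one way (e.g. right edge down)
        } else if (rollDeg < -TILT_DEADZONE_DEG) {
            // tilted the other way
        }
    }
}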

Convert values from Sensor.TYPE_ORIENTATION to Euler angles?

I have to write a compass app in Android. The only thing the user sees on the screen is a cube with a red wall which has to point north. This is not important. What's important is that I need to rotate that cube according to the rotation of the device itself, so that the red wall keeps pointing north no matter how the phone is held. My code is simple and straightforward:
@Override
public void onSensorChanged(SensorEvent event) {
    synchronized (this) {
        switch (event.sensor.getType()) {
            case Sensor.TYPE_ACCELEROMETER:
                direction = event.values[2];
                break;
            case Sensor.TYPE_ORIENTATION:
                if (direction < 0) {
                    angleX = event.values[1];
                    angleY = -event.values[2];
                    angleZ = event.values[0];
                } else {
                    angleX = -event.values[1];
                    angleY = -event.values[2];
                    angleZ = event.values[0];
                }
                break;
        }
    }
}
I have added this extra direction variable that simply stores whether the phone's display is pointing downwards or upwards. I don't know if I need it, but it seems to fix some bugs. I am using the SensorSimulator for Android, but whenever my pitch slider goes into the [-90, 90] interval the other variables get mixed up. It's like they get a 180° offset. But I can't detect when I am in this interval, because the range of the pitch is from -90 to 90, so I can move that slider from left to right and I will always be in that interval.
This was all just to show you how far my code has advanced. I am not saying how this problem should be solved, because I would probably only steer myself into a dead end. You see, I have been trying to write this app for 3 days now, and you can imagine how pissed my boss is. I have read all sorts of tutorials and tried every formula I could find or think of. So please help me. All I have to do is know how to rotate my cube, the rotation angles of which are EULER ANGLES in degrees.
Here's some code I wrote to do something pretty similar, really only caring about the rotation of the device in the roll direction. Hope it helps! It just uses the accelerometer values to determine pitch and roll; there is no need to get the orientation of the view.
public void onSensorChanged(SensorEvent event) {
    float x = -1 * event.values[0] / SensorManager.GRAVITY_EARTH;
    float y = -1 * event.values[1] / SensorManager.GRAVITY_EARTH;
    float z = -1 * event.values[2] / SensorManager.GRAVITY_EARTH;
    float signedRawRoll = (float) (Math.atan2(x, y) * 180 / Math.PI);
    float unsignedRawRoll = Math.abs(signedRawRoll);
    float rollSign = signedRawRoll / unsignedRawRoll;
    float rawPitch = Math.abs(z * 180);
    // Use a basic low-pass filter to only keep the gravity in the accelerometer values for the X and Y axes.
    // Adjust the filter weight based on pitch, as roll is harder to define as pitch approaches 180.
    float filterWeight = rawPitch > 165 ? 0.85f : 0.7f;
    float newUnsignedRoll = filterWeight * Math.abs(this.roll) + (1 - filterWeight) * unsignedRawRoll;
    this.roll = rollSign * newUnsignedRoll;
    if (Float.isInfinite(this.roll) || Float.isNaN(this.roll)) {
        this.roll = 0;
    }
    this.pitch = filterWeight * this.pitch + (1 - filterWeight) * rawPitch;
    for (IAngleListener listener : listeners) {
        listener.deviceRollAndPitch(this.roll, this.pitch);
    }
}
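Since TYPE_ORIENTATION is deprecated, here is a hedged alternative sketch that derives the same Euler angles (in degrees) from the rotation matrix; accelValues and magValues are assumed fields filled in from the accelerometer and magnetometer events:
float[] rotMat = new float[9];
float[] angles = new float[3];
if (SensorManager.getRotationMatrix(rotMat, null, accelValues, magValues)) {
    SensorManager.getOrientation(rotMat, angles);
    float azimuthDeg = (float) Math.toDegrees(angles[0]); // rotation about Z
    float pitchDeg = (float) Math.toDegrees(angles[1]);   // rotation about X
    float rollDeg = (float) Math.toDegrees(angles[2]);    // rotation about Y
    // Feed these to the cube's rotation; axis signs may need the same
    // tweaks as in the question.
}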
