I am working on an Android app in which I want to scroll a large image horizontally. I used the accelerometer (Sensor.TYPE_ACCELEROMETER) and magnetic field (Sensor.TYPE_MAGNETIC_FIELD) data to get the angle of rotation. Because this data arrives very frequently and is riddled with noise, I am not able to implement a smooth motion effect.
@Override
public void onSensorChanged(SensorEvent event) {
    switch (event.sensor.getType()) {
        case Sensor.TYPE_MAGNETIC_FIELD:
            mags = event.values.clone();
            break;
        case Sensor.TYPE_ACCELEROMETER:
            accels = event.values.clone();
            break;
    }

    if (mags != null && accels != null) {
        gravity = new float[16];
        boolean success = SensorManager.getRotationMatrix(gravity, null, accels, mags);
        if (success) {
            float[] outGravity = new float[16];
            SensorManager.remapCoordinateSystem(gravity, SensorManager.AXIS_X, SensorManager.AXIS_Z, outGravity);
            SensorManager.getOrientation(outGravity, values);

            // Smooth the raw orientation with a rolling average before using it.
            rollingAverage[0] = roll(rollingAverage[0], values[0]);
            rollingAverage[1] = roll(rollingAverage[1], values[1]);
            rollingAverage[2] = roll(rollingAverage[2], values[2]);

            azimuth = Math.toDegrees(values[0]);
            pitch = Math.toDegrees(values[1]);
            roll = Math.toDegrees(values[2]);
            mags = null;
            accels = null;

            double diffRoll = lastRoll - roll;
            double diffPitch = lastPitch - pitch;
            long curTime = System.currentTimeMillis();

            // Pan only when the roll has changed by at least 2 degrees.
            if (Math.abs(diffRoll) >= 2) {
                if (diffRoll > 0)
                    imageView.panLeft();
                else
                    imageView.panRight();
                lastRoll = roll;
            }
        }
    }
}
Any ideas on achieving this using other methods?
You have to implement sensor fusion techniques based on a Kalman filter or other filters. You can use open-source libraries if needed; refer to this bitbucket repository. If you want to do it yourself, read the tutorial.
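Before (or in addition to) full sensor fusion, a simple exponential low-pass filter on the orientation angles already removes most of the jitter. A minimal sketch, assuming you keep the azimuth/pitch/roll from getOrientation() in a values array; ALPHA and the field names here are illustrative, not from the linked repository or tutorial:

private static final float ALPHA = 0.15f;   // smaller = smoother, but more lag
private float[] filtered = new float[3];    // smoothed azimuth/pitch/roll in radians

private float[] lowPass(float[] input, float[] output) {
    for (int i = 0; i < input.length; i++) {
        // Exponential moving average: keep most of the old value and
        // let only a fraction of the new, noisy reading through.
        output[i] = output[i] + ALPHA * (input[i] - output[i]);
    }
    return output;
}

// Usage inside onSensorChanged(), after SensorManager.getOrientation(outGravity, values):
// filtered = lowPass(values, filtered);
// roll = Math.toDegrees(filtered[2]);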
I am developing an application like Runtastic Pedometer using the algorithm below, but my results are nowhere near similar to Runtastic's.
My code is as follows:
public void onSensorChanged(SensorEvent event) {
    Sensor sensor = event.sensor;
    synchronized (this) {
        if (sensor.getType() == Sensor.TYPE_ORIENTATION) {
            // ignored
        } else {
            int j = (sensor.getType() == Sensor.TYPE_ACCELEROMETER) ? 1 : 0;
            if (j == 1) {
                // Average the three (scaled) axes into a single value.
                float vSum = 0;
                for (int i = 0; i < 3; i++) {
                    final float v = mYOffset + event.values[i] * mScale[j];
                    vSum += v;
                }
                int k = 0;
                float v = vSum / 3;
                //Log.e("data", "data" + v);

                float direction = (v > mLastValues[k] ? 1 : (v < mLastValues[k] ? -1 : 0));
                if (direction == -mLastDirections[k]) {
                    // Direction changed
                    int extType = (direction > 0 ? 0 : 1); // minimum or maximum?
                    mLastExtremes[extType][k] = mLastValues[k];
                    float diff = Math.abs(mLastExtremes[extType][k] - mLastExtremes[1 - extType][k]);
                    if (diff > mLimit) {
                        boolean isAlmostAsLargeAsPrevious = diff > (mLastDiff[k] * 2 / 3);
                        boolean isPreviousLargeEnough = mLastDiff[k] > (diff / 3);
                        boolean isNotContra = (mLastMatch != 1 - extType);
                        if (isAlmostAsLargeAsPrevious && isPreviousLargeEnough && isNotContra) {
                            for (StepListener stepListener : mStepListeners) {
                                stepListener.onStep();
                            }
                            mLastMatch = extType;
                        } else {
                            Log.i(TAG, "no step");
                            mLastMatch = -1;
                        }
                    }
                    mLastDiff[k] = diff;
                }
                mLastDirections[k] = direction;
                mLastValues[k] = v;
            }
        }
    }
}
For registering the sensor:
mSensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
mSensor = mSensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
mSensorManager.registerListener(mStepDetector, mSensor, SensorManager.SENSOR_DELAY_NORMAL);
In the algorithm I have different sensitivity levels, set via:
public void setSensitivity(float sensitivity) {
    mLimit = sensitivity; // 1.97  2.96  4.44  6.66  10.00  15.00  22.50  33.75  50.62
}
At various sensitivity levels my results are:

sensitivity   Runtastic Pedometer   my app
10.00         3870                  5500
11.00         3000                  4000
11.15         3765                  4576
13.00         2000                  890
11.30         754                   986
I am not seeing any consistent pattern that matches the requirement.
As per my analysis, this application is also using Sensor.TYPE_MAGNETIC_FIELD for step calculation. Please suggest an algorithm so that I can meet the requirement.
The first thing you need to do is decide on an algorithm. As far as I know there are, roughly speaking, three ways to detect steps using accelerometers described in the literature:
1. Use the Pythagorean theorem to calculate the magnitude of the acceleration vector of each sample from the accelerometer. Low-pass filter the magnitude signal to remove high-frequency noise and then look for peaks and valleys in the filtered signal. You may need to add additional requirements to remove false positives. This is by far the simplest way to detect steps; it is also how most, if not all, ordinary pedometers of the sort you can buy in a sports store work. (A rough sketch of this approach follows the list.)
2. Use Pythagoras like in (1), then run the signal through an FFT and compare the output of the FFT to known outputs of walking. This requires you to have access to a fairly large amount of training data.
3. Feed the accelerometer data into an algorithm that uses a suitable machine learning technique, for example a neural network or a discrete wavelet transform. You can of course include other sensors in this approach. This also requires a fairly large amount of training data.
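For approach (1), the per-sample processing inside a SensorEventListener could look roughly like the following sketch. ALPHA and STEP_THRESHOLD are illustrative values that you would tune on real recordings; they are not taken from any particular pedometer:

private static final float ALPHA = 0.1f;            // low-pass smoothing factor
private static final float STEP_THRESHOLD = 1.5f;   // m/s^2 above gravity, tune experimentally
private float smoothed = SensorManager.STANDARD_GRAVITY;
private boolean armed = true;                       // re-armed after the signal falls back
private int stepCount = 0;

public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() != Sensor.TYPE_ACCELEROMETER) return;
    float x = event.values[0], y = event.values[1], z = event.values[2];
    // Pythagoras: magnitude of the acceleration vector, independent of device orientation.
    float magnitude = (float) Math.sqrt(x * x + y * y + z * z);
    // Low-pass filter to suppress high-frequency noise.
    smoothed += ALPHA * (magnitude - smoothed);
    float delta = smoothed - SensorManager.STANDARD_GRAVITY;
    if (armed && delta > STEP_THRESHOLD) {
        stepCount++;                  // rising crossing of the threshold = one peak = one step
        armed = false;
    } else if (delta < STEP_THRESHOLD / 2) {
        armed = true;                 // hysteresis: wait for the signal to drop before re-arming
    }
}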
Once you have decided on an algorithm you will probably want to use something like Matlab or SciPy to test it on your computer using recordings that you have made on Android phones. Dump the accelerometer data to a CSV file on your phone, make a note of how many steps the file represents, copy the file to your computer, and run your algorithm on the data to see whether it gets the step count right. That way you can find problems with the algorithm and correct them.
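A minimal way to make such recordings is to append every accelerometer sample to a CSV file on the phone. The file handling below is only a sketch; you would open csvWriter somewhere suitable (e.g. on getExternalFilesDir(null)) and close it when the recording ends:

private java.io.FileWriter csvWriter;   // opened elsewhere before recording starts

public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() != Sensor.TYPE_ACCELEROMETER) return;
    try {
        // One line per sample: timestamp (ns) plus raw x/y/z.
        csvWriter.append(event.timestamp + ","
                + event.values[0] + "," + event.values[1] + "," + event.values[2] + "\n");
    } catch (java.io.IOException e) {
        Log.e("AccelLogger", "could not write sample", e);
    }
}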
If this sounds difficult, then the best way to get access to good step detection is probably to wait until more phones come with the built-in step counter that KitKat enables.
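On KitKat (API 19) devices that have the necessary hardware, that built-in counter is exposed as Sensor.TYPE_STEP_COUNTER. A minimal usage sketch:

// TYPE_STEP_COUNTER reports the cumulative step count since the last reboot;
// getDefaultSensor() returns null on devices without the hardware counter.
SensorManager sm = (SensorManager) getSystemService(SENSOR_SERVICE);
Sensor stepCounter = sm.getDefaultSensor(Sensor.TYPE_STEP_COUNTER);
if (stepCounter != null) {
    sm.registerListener(new SensorEventListener() {
        @Override
        public void onSensorChanged(SensorEvent event) {
            long stepsSinceBoot = (long) event.values[0];
            Log.d("StepCounter", "steps since boot: " + stepsSinceBoot);
        }

        @Override
        public void onAccuracyChanged(Sensor sensor, int accuracy) { }
    }, stepCounter, SensorManager.SENSOR_DELAY_NORMAL);
}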
https://github.com/bagilevi/android-pedometer
I hope this might be helpful.
I am using step detection in my walking instrument and I get nice step-detection results. I use achartengine to plot the accelerometer data. Take a look here.
What I do:
1. Analyze the magnitude vector of the accelerometer signal.
2. Set a changeable threshold level; when the accelerometer signal goes above it, I count it as a step.
3. Set a period of inactivity (for step detection) after the first crossing of the threshold.
Point 3 is calculated like this:
- arbitrarily set the maximum tempo of walking (e.g. 120 bpm)
- at 60 bpm there is one step per 1000 ms, so at 120 bpm there is one step per 500 ms
- the accelerometer delivers data at the requested rate (SENSOR_DELAY_NORMAL, SENSOR_DELAY_GAME, etc.); with SENSOR_DELAY_GAME, T ≈ 20 ms (this is mentioned in the Android documentation)
- n = number of samples to omit (after crossing the threshold)
- n = 500 ms / T
- n = 500 / 20 = 25 (plenty of them; you can adjust this value)
- after that, the threshold becomes active again.
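A rough sketch of that counting loop (the threshold and the constants are illustrative, not the exact values from my project):

private static final float THRESHOLD = 11.5f;   // magnitude threshold, tune for your device
private static final int SAMPLES_TO_SKIP = 25;  // ~500 ms at SENSOR_DELAY_GAME (T ~ 20 ms)
private int skipCounter = 0;
private int steps = 0;

private void onAccelSample(float x, float y, float z) {
    float magnitude = (float) Math.sqrt(x * x + y * y + z * z);
    if (skipCounter > 0) {
        skipCounter--;                   // still inside the inactive window, ignore this sample
        return;
    }
    if (magnitude > THRESHOLD) {
        steps++;                         // threshold crossed: count a step...
        skipCounter = SAMPLES_TO_SKIP;   // ...and stay inactive for the next n samples
    }
}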
Take a look at this picture:
This is my implementation. It was written about 1.5-2 years ago and I really don't remember all the details anymore, but it worked, and it worked well for my needs.
I know that this is a really big class (some methods are deleted), but maybe it will be helpful. If not, I'll just remove this answer...
public class StepDetector implements SensorEventListener
{
public static final int MAX_BUFFER_SIZE = 5;
private static final int Y_DATA_COUNT = 4;
private static final double MIN_GRAVITY = 2;
private static final double MAX_GRAVITY = 1200;
public void onSensorChanged(final SensorEvent sensorEvent)
{
final float[] values = sensorEvent.values;
final Sensor sensor = sensorEvent.sensor;
if (sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD)
{
magneticDetector(values, sensorEvent.timestamp / (500 * 1000000L)); // timestamp is in ns; convert to 500 ms ticks
}
if (sensor.getType() == Sensor.TYPE_ACCELEROMETER)
{
accelDetector(values, sensorEvent.timestamp / (500 * 1000000L)); // timestamp is in ns; convert to 500 ms ticks
}
}
private ArrayList<float[]> mAccelDataBuffer = new ArrayList<float[]>();
private ArrayList<Long> mMagneticFireData = new ArrayList<Long>();
private Long mLastStepTime = null;
private ArrayList<Pair> mAccelFireData = new ArrayList<Pair>();
private void accelDetector(float[] detectedValues, long timeStamp)
{
float[] currentValues = new float[3];
for (int i = 0; i < currentValues.length; ++i)
{
currentValues[i] = detectedValues[i];
}
mAccelDataBuffer.add(currentValues);
if (mAccelDataBuffer.size() > StepDetector.MAX_BUFFER_SIZE)
{
double avgGravity = 0;
for (float[] values : mAccelDataBuffer)
{
avgGravity += Math.abs(Math.sqrt(
values[0] * values[0] + values[1] * values[1] + values[2] * values[2]) - SensorManager.STANDARD_GRAVITY);
}
avgGravity /= mAccelDataBuffer.size();
if (avgGravity >= MIN_GRAVITY && avgGravity < MAX_GRAVITY)
{
mAccelFireData.add(new Pair(timeStamp, true));
}
else
{
mAccelFireData.add(new Pair(timeStamp, false));
}
if (mAccelFireData.size() >= Y_DATA_COUNT)
{
checkData(mAccelFireData, timeStamp);
mAccelFireData.remove(0);
}
mAccelDataBuffer.clear();
}
}
private void checkData(ArrayList<Pair> accelFireData, long timeStamp)
{
boolean stepAlreadyDetected = false;
Iterator<Pair> iterator = accelFireData.iterator();
while (iterator.hasNext() && !stepAlreadyDetected)
{
stepAlreadyDetected = iterator.next().first.equals(mLastStepTime);
}
if (!stepAlreadyDetected)
{
int firstPosition = Collections.binarySearch(mMagneticFireData, accelFireData.get(0).first);
int secondPosition = Collections
.binarySearch(mMagneticFireData, accelFireData.get(accelFireData.size() - 1).first - 1);
if (firstPosition > 0 || secondPosition > 0 || firstPosition != secondPosition)
{
if (firstPosition < 0)
{
firstPosition = -firstPosition - 1;
}
if (firstPosition < mMagneticFireData.size() && firstPosition > 0)
{
mMagneticFireData = new ArrayList<Long>(
mMagneticFireData.subList(firstPosition - 1, mMagneticFireData.size()));
}
iterator = accelFireData.iterator();
while (iterator.hasNext())
{
if (iterator.next().second)
{
mLastStepTime = timeStamp;
accelFireData.remove(accelFireData.size() - 1);
accelFireData.add(new Pair(timeStamp, false));
onStep();
break;
}
}
}
}
}
private float mLastDirections;
private float mLastValues;
private float mLastExtremes[] = new float[2];
private Integer mLastType;
private ArrayList<Float> mMagneticDataBuffer = new ArrayList<Float>();
private void magneticDetector(float[] values, long timeStamp)
{
mMagneticDataBuffer.add(values[2]);
if (mMagneticDataBuffer.size() > StepDetector.MAX_BUFFER_SIZE)
{
float avg = 0;
for (int i = 0; i < mMagneticDataBuffer.size(); ++i)
{
avg += mMagneticDataBuffer.get(i);
}
avg /= mMagneticDataBuffer.size();
float direction = (avg > mLastValues ? 1 : (avg < mLastValues ? -1 : 0));
if (direction == -mLastDirections)
{
// Direction changed
int extType = (direction > 0 ? 0 : 1); // minimum or maximum?
mLastExtremes[extType] = mLastValues;
float diff = Math.abs(mLastExtremes[extType] - mLastExtremes[1 - extType]);
if (diff > 8 && (null == mLastType || mLastType != extType))
{
mLastType = extType;
mMagneticFireData.add(timeStamp);
}
}
mLastDirections = direction;
mLastValues = avg;
mMagneticDataBuffer.clear();
}
}
public static class Pair implements Serializable
{
Long first;
boolean second;
public Pair(long first, boolean second)
{
this.first = first;
this.second = second;
}
@Override
public boolean equals(Object o)
{
if (o instanceof Pair)
{
return first.equals(((Pair) o).first);
}
return false;
}
}
}
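A hypothetical way to hook a detector like this up, since it listens to both sensor types (the delay rate and variable names are only an example, not from the original project):

SensorManager sm = (SensorManager) getSystemService(SENSOR_SERVICE);
StepDetector detector = new StepDetector();
// Register the same listener for both sensors it handles in onSensorChanged().
sm.registerListener(detector,
        sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),
        SensorManager.SENSOR_DELAY_GAME);
sm.registerListener(detector,
        sm.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD),
        SensorManager.SENSOR_DELAY_GAME);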
One main difference I spotted between your implementation and the code in the grepcode project is the way you register the listener.
Your code:
mSensorManager.registerListener(mStepDetector,
mSensor,
SensorManager.SENSOR_DELAY_NORMAL);
Their code:
mSensorManager.registerListener(mStepDetector,
mSensor,
SensorManager.SENSOR_DELAY_FASTEST);
This is a big difference. SENSOR_DELAY_NORMAL is intended for orientation changes and is therefore not that fast. (Ever noticed that it takes a moment between you rotating the device and the screen actually rotating? That is functionality that does not need to be super fast; instant rotation would probably even be annoying.) The rate at which you get updates is simply not that high.
On the other hand, SENSOR_DELAY_FASTEST is intended for things like pedometers: you want the sensor data as fast and often as possible, so your calculations of steps will be as accurate as possible.
Try to switch to the SENSOR_DELAY_FASTEST rate, and test again! It should make a big difference.
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
        float x = event.values[0];
        float y = event.values[1];
        float z = event.values[2];
        // Squared magnitude of the acceleration vector (no sqrt needed for thresholding).
        currentvectorSum = (x * x + y * y + z * z);

        if (currentvectorSum < 100 && inStep == false) {
            inStep = true;
        }
        if (currentvectorSum > 125 && inStep == true) {
            inStep = false;
            numSteps++;
            Log.d("TAG_ACCELEROMETER", "\t" + numSteps);
        }
    }
}
This is my code for detecting whether the phone is turned around or not:
private SensorManager sensorManager;
private int orientationLim = 165;

@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ORIENTATION) {
        // If turn-around-to-stop is enabled
        boolean turnAroundToStop = Utils.getBooleanFromProperties(this, Properties.SP_CB_TURN_AROUND_TO_STOP);
        if (turnAroundToStop) {
            float value = Math.abs(event.values[1]);
            if (value > orientationLim && !stopped) {
                // Down
                stopped = true;
            } else {
                // Up
                stopped = false;
            }
        }
    }
}
The problem is that the stopped variable is set to true even when the phone is not completely turned around, only tilted a little.
How can I modify this code so that it only triggers when the phone is really turned around?
float pitch = values[2];
if (pitch <= 45 && pitch >= -45) {
    // mostly vertical
} else if (pitch < -45) {
    // mostly right side up
} else if (pitch > 45) {
    // mostly left side up
}
Take a look here on this topic:
http://www.workingfromhere.com/blog/2009/03/30/orientation-sensor-tips-in-android/
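Another way to make the check stricter is hysteresis: use one limit for deciding the phone is really turned around and a clearly lower one for deciding it is back up, so small tilts around a single limit cannot toggle the flag. A sketch along the lines of the question's code (both limits are illustrative):

private static final float DOWN_LIMIT = 165f;   // consider the phone turned around above this
private static final float UP_LIMIT = 135f;     // consider it back up only below this

private void updateStopped(float orientationValue) {
    float value = Math.abs(orientationValue);
    if (!stopped && value > DOWN_LIMIT) {
        stopped = true;     // really turned around
    } else if (stopped && value < UP_LIMIT) {
        stopped = false;    // clearly back up again
    }
}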
My app is locked into portrait orientation only, however in one fragment I have a camera preview where I would like to rotate captured images based on the device orientation. I believe that because my app is portrait only, the following code always logs zero.
Display display = ((WindowManager)getActivity().getSystemService(Context.WINDOW_SERVICE)).getDefaultDisplay();
int rotation = display.getRotation();
Log.i(TAG, "Rotation: " + rotation );
Is it possible to get the actual orientation of the device while locking the app to portrait?
I am targeting Android 4.0+, so I'm not concerned if the solution won't work on older devices.
You could implement a SensorEventListener and then look at the roll in onSensorChanged:
// Buffers and latest sensor readings used below.
private float[] R = new float[16];
private float[] I = new float[16];
private float[] outR = new float[16];
private float[] orientation = new float[3];
private float[] mAccelerometerValues;
private float[] mGeomageneticValues;
private float mYaw, mPitch, mRoll;

@Override
public void onSensorChanged(SensorEvent event) {
    synchronized (this) {
        if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
            mAccelerometerValues = event.values;
        }
        if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) {
            mGeomageneticValues = event.values;
        }
        if ((mAccelerometerValues != null) && (mGeomageneticValues != null)) {
            boolean success = SensorManager.getRotationMatrix(R, I, mAccelerometerValues, mGeomageneticValues);
            if (success) {
                SensorManager.remapCoordinateSystem(R, SensorManager.AXIS_X, SensorManager.AXIS_Z, outR);
                SensorManager.getOrientation(outR, orientation);
                mYaw = orientation[0] * MathX.toDegreesF;
                mPitch = orientation[1] * MathX.toDegreesF;
                mRoll = orientation[2] * MathX.toDegreesF;
                String sText = String.format("a:%1.4f\np:%1.4f\nr:%1.4f", mYaw, mPitch, mRoll);
            }
        }
    }
}
I'm trying to set up my Android game to use the sensors for controlling the game.
When the phone is tilted forwards slightly, I want the thrusters to fire, and tilting the phone left or right should rotate it in the respective direction.
At the moment, I'm using TYPE_ORIENTATION as follows:
public void onSensorChanged(SensorEvent event) {
    if (mMode == STATE_RUNNING) {
        synchronized (mSurfaceHolder) {
            // rotation
            float pitch = event.values[2]; // pitch
            if (pitch <= 15 && pitch >= -15) {
                mRotating = 0;
            } else if (pitch < -15) {
                mRotating += 1;
            } else if (pitch > 15) {
                mRotating -= 1;
            }

            // thrust by tilting forward!
            if (event.values[0] > 0) {
                thrusterFiring = true;
            } else {
                thrusterFiring = false;
            }
        }
    }
}
But I'm not sure if this is quite right. Additionally, you have to tilt the phone quite a bit forward for the thrusters to fire; I'd like it so you only need to tilt a little!
Any help appreciated!