How do I detect a wiggle motion on Android?

I need an algorithm that can detect a 'wiggle' gesture on Android, and by that I mean where the phone is held in portrait mode and the wrist is twisted left and right, like the motion of opening a round door handle. So I need to capture rotation back and forth around the z axis for a period of time.
A (crude) picture to illustrate the motion I need to capture:
I've tried a number of different algorithms that capture shaking, such as this "Shoogle For Yer Photos" solution seen here on Github, which work quite well, but capture shaking on every axis using the accelerometer. How would I restrict an algorithm like this to only detect rotation back and forth on the Z-axis, 'wiggling' for a minimum of say 4 seconds?
Here is an extract from the 'Shoogle For Yer Photos' code posted above, which shows how they handle shake detection:
private static final long KEEP_DATA_POINTS_FOR = 1500;
private static final long MINIMUM_EACH_DIRECTION = 7;
private static final float POSITIVE_COUNTER_THRESHHOLD = (float) 2.0;
private static final float NEGATIVE_COUNTER_THRESHHOLD = (float) -2.0;

public void checkForShake() {
    long curTime = System.currentTimeMillis();
    long cutOffTime = curTime - KEEP_DATA_POINTS_FOR;
    while (dataPoints.size() > 0 && dataPoints.get(0).atTimeMilliseconds < cutOffTime) dataPoints.remove(0);

    int x_pos = 0, x_neg = 0, x_dir = 0, y_pos = 0, y_neg = 0, y_dir = 0, z_pos = 0, z_neg = 0, z_dir = 0;
    for (DataPoint dp : dataPoints) {
        if (dp.x > POSITIVE_COUNTER_THRESHHOLD && x_dir < 1) {
            ++x_pos;
            x_dir = 1;
        }
        if (dp.x < NEGATIVE_COUNTER_THRESHHOLD && x_dir > -1) {
            ++x_neg;
            x_dir = -1;
        }
        if (dp.y > POSITIVE_COUNTER_THRESHHOLD && y_dir < 1) {
            ++y_pos;
            y_dir = 1;
        }
        if (dp.y < NEGATIVE_COUNTER_THRESHHOLD && y_dir > -1) {
            ++y_neg;
            y_dir = -1;
        }
        if (dp.z > POSITIVE_COUNTER_THRESHHOLD && z_dir < 1) {
            ++z_pos;
            z_dir = 1;
        }
        if (dp.z < NEGATIVE_COUNTER_THRESHHOLD && z_dir > -1) {
            ++z_neg;
            z_dir = -1;
        }
    }

    if ((x_pos >= MINIMUM_EACH_DIRECTION && x_neg >= MINIMUM_EACH_DIRECTION) ||
        (y_pos >= MINIMUM_EACH_DIRECTION && y_neg >= MINIMUM_EACH_DIRECTION) ||
        (z_pos >= MINIMUM_EACH_DIRECTION && z_neg >= MINIMUM_EACH_DIRECTION)) {
        lastShake = System.currentTimeMillis();
        last_x = 0; last_y = 0; last_z = 0;
        dataPoints.clear();
        triggerShakeDetected();
        return;
    }
}
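To make the goal more concrete, here is the rough, untested direction I'm considering: feed gyroscope Z-axis readings (TYPE_GYROSCOPE) into a similar sliding window and require a run of alternating direction changes lasting at least 4 seconds. The window length, thresholds and minimum number of alternations below are guesses, and triggerWiggleDetected() is just a placeholder callback:
// Assumes this sits in a SensorEventListener registered for Sensor.TYPE_GYROSCOPE
// (e.g. with SENSOR_DELAY_GAME). Values below are guesses and would need tuning.
private static final long WINDOW_MS = 4000;        // wiggle must last ~4 seconds
private static final float RATE_THRESHOLD = 1.5f;  // rad/s around Z, guessed
private static final int MIN_ALTERNATIONS = 8;     // direction changes within the window, guessed

private final List<Long> changeTimes = new ArrayList<Long>();
private int lastSign = 0;

@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() != Sensor.TYPE_GYROSCOPE) return;
    float zRate = event.values[2]; // angular speed around the Z axis in rad/s
    int sign = zRate > RATE_THRESHOLD ? 1 : (zRate < -RATE_THRESHOLD ? -1 : 0);
    long now = System.currentTimeMillis();
    if (sign != 0 && sign != lastSign) {
        changeTimes.add(now); // rotation direction flipped
        lastSign = sign;
    }
    // forget direction changes older than the window
    while (!changeTimes.isEmpty() && changeTimes.get(0) < now - WINDOW_MS) {
        changeTimes.remove(0);
    }
    // enough alternations, spread over (almost) the whole window => call it a wiggle
    if (changeTimes.size() >= MIN_ALTERNATIONS
            && now - changeTimes.get(0) >= WINDOW_MS - 500) {
        changeTimes.clear();
        triggerWiggleDetected(); // placeholder, analogous to triggerShakeDetected()
    }
}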
Any help will be greatly appreciated, thanks.

Related

Android AR view: why is the speed of the AR view becoming slower?

I draw constellation lines in the AR view of an Android device using the following code. At first, the response speed of the objects in the AR view is fine. After I close or leave the AR view and then reopen or return to it, the response of the objects (constellations) becomes noticeably slower (up to about a second). How can I deal with this issue and speed it up? Here length1 = 352 and length2 = 384. How can I improve the performance?
Incidentally, after I uninstall the app and run it again, the response speed of the objects in the AR view returns to normal.
private void drawConstellationLine(Canvas canvas, int displayHeight, int displayWidth) {
    float ex = 0, ey = 0, bx = 0, by = 0;
    Paint pline = new Paint();
    pline.setColor(Color.RED);
    double time1 = System.currentTimeMillis();
    int i, j, length1, length2;
    for (i = 0, length1 = ShowMoon.starlinelist.size(); i < length1; i++) {
        for (j = 0, length2 = ShowMoon.starlist.size(); j < length2; j++) {
            //System.out.println("length2:" + length2);
            if (ShowMoon.starlinelist.get(i).beginhr == ShowMoon.starlist.get(j).hr) {
                bx = (float) ShowMoon.starlist.get(j).x;
                by = (float) ShowMoon.starlist.get(j).y;
            }
            if (ShowMoon.starlinelist.get(i).endhr == ShowMoon.starlist.get(j).hr) {
                ex = (float) ShowMoon.starlist.get(j).x;
                ey = (float) ShowMoon.starlist.get(j).y;
            }
        }
        if (((bx >= 0 && bx <= displayWidth) || (ex >= 0 && ex <= displayWidth))
                && ((ey >= 0 && ey <= displayHeight) || (by >= 0 && by <= displayHeight))) {
            canvas.drawLine(bx, by, ex, ey, pline);
            double time2 = System.currentTimeMillis();
            System.out.println("time dif:" + (time2 - time1));
        }
    }
}
These two comparisons:
if (ShowMoon.starlinelist.get(i).beginhr == ShowMoon.starlist.get(j).hr)
if (ShowMoon.starlinelist.get(i).endhr == ShowMoon.starlist.get(j).hr)
take a lot of time.
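For what it's worth, the usual way to remove the inner scan is to index the stars by their hr value once per draw pass, so each line needs only two map lookups instead of a full pass over the star list. A rough sketch (the Star element type and its fields are assumed from the snippet above; android.util.SparseArray would work similarly):
// Build the lookup table once per draw pass: hr -> screen position of the star.
Map<Integer, PointF> starsByHr = new HashMap<Integer, PointF>();
for (int j = 0, length2 = ShowMoon.starlist.size(); j < length2; j++) {
    Star s = ShowMoon.starlist.get(j); // element type assumed
    starsByHr.put(s.hr, new PointF((float) s.x, (float) s.y));
}
// Each line now needs two O(1) lookups instead of scanning the whole star list.
for (int i = 0, length1 = ShowMoon.starlinelist.size(); i < length1; i++) {
    PointF begin = starsByHr.get(ShowMoon.starlinelist.get(i).beginhr);
    PointF end = starsByHr.get(ShowMoon.starlinelist.get(i).endhr);
    if (begin != null && end != null) {
        canvas.drawLine(begin.x, begin.y, end.x, end.y, pline);
    }
}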

Selecting similar coloured pixels in ImageView similar to magic wand tool in Photoshop

I have tried the OpenCV color-blob-detection sample as well, but what I actually want is for this to work on a Bitmap, not on the camera view.
I have tried the code below:
public boolean onTouch(View v, MotionEvent event)
{
    int x = (int) event.getX();
    int y = (int) event.getY();
    // Check bounds before reading the pixel, otherwise getPixel() can throw.
    if ((x < 0) || (y < 0) || (x >= bimp.getWidth()) || (y >= bimp.getHeight()))
        return false;
    Log.e("Test", "Touch image coordinates:" + x + " , " + y);
    Log.e("Test", "color:" + bimp.getPixel(x, y));
    int xclear = bimp.getPixel(x, y);
    int xclear_red = Color.red(xclear);
    int xclear_blue = Color.blue(xclear);
    int xclear_green = Color.green(xclear);
    for (int x1 = 0; x1 < bimp.getWidth(); x1++)
    {
        for (int y1 = 0; y1 < bimp.getHeight(); y1++)
        {
            int px = bimp.getPixel(x1, y1);
            int px_red = Color.red(px);
            int px_blue = Color.blue(px);
            int px_green = Color.green(px);
            if ((px_red + 10 > xclear_red) && (px_red - 10 < xclear_red))
            {
                if ((px_blue + 10 > xclear_blue) && (px_blue - 10 < xclear_blue))
                {
                    if ((px_green + 10 > xclear_green) && (px_green - 10 < xclear_green))
                    {
                        bimp.setPixel(x1, y1, Color.TRANSPARENT);
                    }
                }
            }
        }
        if (x1 == bimp.getWidth() - 1)
            img.setImageBitmap(bimp);
    }
    return false; // don't need subsequent touch events
}
It makes every pixel whose colour is within +/- 10 of the colour of the pixel I touched transparent.
But what I actually want is shown in the image below:
that is, to select the similarly coloured pixels (shown with a red border), like the magic wand tool in Photoshop, so that I can make the selected portion transparent or crop it.
Please suggest some ideas. Thanks in advance.
I think what you need is the flood fill algorithm. It may be of some use to you.
http://en.wikipedia.org/wiki/Flood_fill
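To make that concrete, here is a minimal, unoptimised sketch of a queue-based flood fill over a mutable Bitmap, starting from the touched pixel and using a per-channel tolerance (the method names floodFill and isSimilar are made up for illustration):
// Breadth-first flood fill: makes every connected pixel within `tolerance` of the
// touched colour transparent. The Bitmap must be mutable for setPixel() to work.
private void floodFill(Bitmap bmp, int startX, int startY, int tolerance) {
    int target = bmp.getPixel(startX, startY);
    int w = bmp.getWidth(), h = bmp.getHeight();
    boolean[] visited = new boolean[w * h];
    Deque<int[]> queue = new ArrayDeque<int[]>();
    queue.add(new int[]{startX, startY});
    while (!queue.isEmpty()) {
        int[] p = queue.poll();
        int x = p[0], y = p[1];
        if (x < 0 || y < 0 || x >= w || y >= h || visited[y * w + x]) continue;
        visited[y * w + x] = true;
        if (!isSimilar(bmp.getPixel(x, y), target, tolerance)) continue;
        bmp.setPixel(x, y, Color.TRANSPARENT);
        // spread to the four neighbours
        queue.add(new int[]{x + 1, y});
        queue.add(new int[]{x - 1, y});
        queue.add(new int[]{x, y + 1});
        queue.add(new int[]{x, y - 1});
    }
}

private boolean isSimilar(int c1, int c2, int tolerance) {
    return Math.abs(Color.red(c1) - Color.red(c2)) <= tolerance
            && Math.abs(Color.green(c1) - Color.green(c2)) <= tolerance
            && Math.abs(Color.blue(c1) - Color.blue(c2)) <= tolerance;
}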

How to calculate an exact footstep count using the accelerometer in Android?

I am developing an application like Runtastic Pedometer using the algorithm below, but the results are not similar at all.
my code is as follows:
public void onSensorChanged(SensorEvent event)
{
    Sensor sensor = event.sensor;
    synchronized (this)
    {
        if (sensor.getType() == Sensor.TYPE_ORIENTATION) {}
        else {
            int j = (sensor.getType() == Sensor.TYPE_ACCELEROMETER) ? 1 : 0;
            if (j == 1) {
                float vSum = 0;
                for (int i = 0; i < 3; i++) {
                    final float v = mYOffset + event.values[i] * mScale[j];
                    vSum += v;
                }
                int k = 0;
                float v = vSum / 3;
                //Log.e("data", "data" + v);
                float direction = (v > mLastValues[k] ? 1 : (v < mLastValues[k] ? -1 : 0));
                if (direction == -mLastDirections[k]) {
                    // Direction changed
                    int extType = (direction > 0 ? 0 : 1); // minimum or maximum?
                    mLastExtremes[extType][k] = mLastValues[k];
                    float diff = Math.abs(mLastExtremes[extType][k] - mLastExtremes[1 - extType][k]);
                    if (diff > mLimit) {
                        boolean isAlmostAsLargeAsPrevious = diff > (mLastDiff[k] * 2 / 3);
                        boolean isPreviousLargeEnough = mLastDiff[k] > (diff / 3);
                        boolean isNotContra = (mLastMatch != 1 - extType);
                        if (isAlmostAsLargeAsPrevious && isPreviousLargeEnough && isNotContra) {
                            for (StepListener stepListener : mStepListeners) {
                                stepListener.onStep();
                            }
                            mLastMatch = extType;
                        }
                        else {
                            Log.i(TAG, "no step");
                            mLastMatch = -1;
                        }
                    }
                    mLastDiff[k] = diff;
                }
                mLastDirections[k] = direction;
                mLastValues[k] = v;
            }
        }
    }
}
for registering sensors:
mSensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
mSensor = mSensorManager.getDefaultSensor(
Sensor.TYPE_ACCELEROMETER);
mSensorManager.registerListener(mStepDetector,mSensor,SensorManager.SENSOR_DELAY_NORMAL);
In the algorithm I have different sensitivity levels:
public void setSensitivity(float sensitivity) {
    mLimit = sensitivity; // 1.97 2.96 4.44 6.66 10.00 15.00 22.50 33.75 50.62
}
On various sensitivity levels my results are:

sensitivity   Runtastic Pedometer   my app
10.00         3870                  5500
11.00         3000                  4000
11.15         3765                  4576
13.00         2000                  890
11.30         754                   986
I am not seeing any consistent pattern that matches the requirement.
As per my analysis, this application is using Sensor.TYPE_MAGNETIC_FIELD for step calculation. Please suggest an algorithm so that I can meet the requirement.
The first thing you need to do is decide on an algorithm. As far as I know there are roughly speaking three ways to detect steps using accelerometers that are described in the literature:
1. Use the Pythagorean theorem to calculate the magnitude of the acceleration vector of each sample from the accelerometer. Low-pass filter the magnitude signal to remove high-frequency noise and then look for peaks and valleys in the filtered signal. You may need to add additional requirements to remove false positives. This is by far the simplest way to detect steps, and it is also the way that most, if not all, ordinary pedometers of the sort you can buy in a sports store work.
2. Use Pythagoras like in (1), then run the signal through an FFT and compare the output of the FFT to known outputs of walking. This requires you to have access to a fairly large amount of training data.
3. Feed the accelerometer data into an algorithm that uses some suitable machine learning technique, for example a neural network or a digital wavelet transform. You can of course include other sensors in this approach. This also requires you to have access to a fairly large amount of training data.
Once you have decided on an algorithm you will probably want to use something like Matlab or SciPy to test your algorithm on your computer using recordings that you have made on Android phones. Dump accelerometer data to a csv file on your phone, make a record of how many steps the file represents, copy the file to your computer and run your algorithm on the data to see if it gets the step count right. That way you can detect problems with the algorithm and correct them.
If this sounds difficult, then the best way to get access to good step detection is probably to wait until more phones come with the built-in step counter that KitKat enables.
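As a rough illustration of approach (1), a heavily simplified sketch might smooth the magnitude with an exponential low-pass filter and count a step each time the smoothed signal rises through a threshold above gravity. The alpha, threshold and refractory period below are guesses you would tune against your own recordings:
// Sketch only: assumes this sits in a SensorEventListener registered for the accelerometer.
private float filtered = SensorManager.STANDARD_GRAVITY;
private boolean above = false;
private long lastStepMs = 0;
private int steps = 0;

public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() != Sensor.TYPE_ACCELEROMETER) return;
    float x = event.values[0], y = event.values[1], z = event.values[2];
    float magnitude = (float) Math.sqrt(x * x + y * y + z * z);
    final float alpha = 0.1f;                       // low-pass smoothing factor, guessed
    filtered = filtered + alpha * (magnitude - filtered);
    long now = System.currentTimeMillis();
    float threshold = SensorManager.STANDARD_GRAVITY + 1.0f; // m/s^2 above gravity, guessed
    if (!above && filtered > threshold && now - lastStepMs > 300) { // 300 ms refractory period
        steps++;
        lastStepMs = now;
        above = true;
    } else if (above && filtered < threshold) {
        above = false;                              // wait for the next rising crossing
    }
}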
https://github.com/bagilevi/android-pedometer
I hope this might be helpful.
I am using step detection in my walking instrument.
I get nice results of step detection.
I use achartengine to plot accelerometer data.
Take a look here.
What I do:
1. Analyse the magnitude vector from the accelerometer sensor.
2. Set a changeable threshold level; when the signal from the accelerometer rises above it, I count it as a step.
3. Set a period of inactivity (for step detection) after the first crossing of the threshold.
Point 3 is calculated as follows:
- Arbitrarily set the maximum tempo of walking (e.g. 120 bpm).
- If 60 bpm means 1000 ms per step, then 120 bpm means 500 ms per step.
- The accelerometer delivers data at a chosen rate (SENSOR_DELAY_NORMAL, SENSOR_DELAY_GAME, etc.). With SENSOR_DELAY_GAME, T ~= 20 ms (this is mentioned in the Android documentation).
- n = number of samples to omit (after crossing the threshold): n = 500 ms / T = 500 / 20 = 25 (plenty of them; you can adjust this value).
- After that, the threshold becomes active again.
Take a look at this picture:
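Put together, a minimal sketch of that threshold-plus-lockout scheme could look like this (the sample count comes from the calculation above; the threshold value itself is just a guess):
// Sketch only: assumes a SensorEventListener registered for the accelerometer at SENSOR_DELAY_GAME.
private static final float STEP_THRESHOLD = 12f;  // m/s^2, changeable threshold (guessed)
private static final int SAMPLES_TO_SKIP = 25;    // ~500 ms at T ~= 20 ms per sample
private int skipCountdown = 0;
private int stepCount = 0;

public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() != Sensor.TYPE_ACCELEROMETER) return;
    if (skipCountdown > 0) { skipCountdown--; return; } // inactive state after a detected step
    float x = event.values[0], y = event.values[1], z = event.values[2];
    float magnitude = (float) Math.sqrt(x * x + y * y + z * z);
    if (magnitude > STEP_THRESHOLD) {
        stepCount++;
        skipCountdown = SAMPLES_TO_SKIP; // threshold becomes active again after these samples
    }
}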
This is my implementation. It was written about 1.5-2 years ago and I really don't remember all of the details, but it worked, and it worked well for my needs.
I know that this is a really big class (some methods are deleted), but maybe it will be helpful. If not, I'll just remove this answer...
public class StepDetector implements SensorEventListener
{
    public static final int MAX_BUFFER_SIZE = 5;
    private static final int Y_DATA_COUNT = 4;
    private static final double MIN_GRAVITY = 2;
    private static final double MAX_GRAVITY = 1200;

    public void onSensorChanged(final SensorEvent sensorEvent)
    {
        final float[] values = sensorEvent.values;
        final Sensor sensor = sensorEvent.sensor;
        if (sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD)
        {
            magneticDetector(values, sensorEvent.timestamp / (500 * 10 ^ 6l));
        }
        if (sensor.getType() == Sensor.TYPE_ACCELEROMETER)
        {
            accelDetector(values, sensorEvent.timestamp / (500 * 10 ^ 6l));
        }
    }

    private ArrayList<float[]> mAccelDataBuffer = new ArrayList<float[]>();
    private ArrayList<Long> mMagneticFireData = new ArrayList<Long>();
    private Long mLastStepTime = null;
    private ArrayList<Pair> mAccelFireData = new ArrayList<Pair>();

    private void accelDetector(float[] detectedValues, long timeStamp)
    {
        float[] currentValues = new float[3];
        for (int i = 0; i < currentValues.length; ++i)
        {
            currentValues[i] = detectedValues[i];
        }
        mAccelDataBuffer.add(currentValues);
        if (mAccelDataBuffer.size() > StepDetector.MAX_BUFFER_SIZE)
        {
            double avgGravity = 0;
            for (float[] values : mAccelDataBuffer)
            {
                avgGravity += Math.abs(Math.sqrt(
                        values[0] * values[0] + values[1] * values[1] + values[2] * values[2]) - SensorManager.STANDARD_GRAVITY);
            }
            avgGravity /= mAccelDataBuffer.size();

            if (avgGravity >= MIN_GRAVITY && avgGravity < MAX_GRAVITY)
            {
                mAccelFireData.add(new Pair(timeStamp, true));
            }
            else
            {
                mAccelFireData.add(new Pair(timeStamp, false));
            }

            if (mAccelFireData.size() >= Y_DATA_COUNT)
            {
                checkData(mAccelFireData, timeStamp);
                mAccelFireData.remove(0);
            }
            mAccelDataBuffer.clear();
        }
    }

    private void checkData(ArrayList<Pair> accelFireData, long timeStamp)
    {
        boolean stepAlreadyDetected = false;

        Iterator<Pair> iterator = accelFireData.iterator();
        while (iterator.hasNext() && !stepAlreadyDetected)
        {
            stepAlreadyDetected = iterator.next().first.equals(mLastStepTime);
        }
        if (!stepAlreadyDetected)
        {
            int firstPosition = Collections.binarySearch(mMagneticFireData, accelFireData.get(0).first);
            int secondPosition = Collections
                    .binarySearch(mMagneticFireData, accelFireData.get(accelFireData.size() - 1).first - 1);

            if (firstPosition > 0 || secondPosition > 0 || firstPosition != secondPosition)
            {
                if (firstPosition < 0)
                {
                    firstPosition = -firstPosition - 1;
                }
                if (firstPosition < mMagneticFireData.size() && firstPosition > 0)
                {
                    mMagneticFireData = new ArrayList<Long>(
                            mMagneticFireData.subList(firstPosition - 1, mMagneticFireData.size()));
                }

                iterator = accelFireData.iterator();
                while (iterator.hasNext())
                {
                    if (iterator.next().second)
                    {
                        mLastStepTime = timeStamp;
                        accelFireData.remove(accelFireData.size() - 1);
                        accelFireData.add(new Pair(timeStamp, false));
                        onStep();
                        break;
                    }
                }
            }
        }
    }

    private float mLastDirections;
    private float mLastValues;
    private float mLastExtremes[] = new float[2];
    private Integer mLastType;
    private ArrayList<Float> mMagneticDataBuffer = new ArrayList<Float>();

    private void magneticDetector(float[] values, long timeStamp)
    {
        mMagneticDataBuffer.add(values[2]);

        if (mMagneticDataBuffer.size() > StepDetector.MAX_BUFFER_SIZE)
        {
            float avg = 0;
            for (int i = 0; i < mMagneticDataBuffer.size(); ++i)
            {
                avg += mMagneticDataBuffer.get(i);
            }
            avg /= mMagneticDataBuffer.size();

            float direction = (avg > mLastValues ? 1 : (avg < mLastValues ? -1 : 0));
            if (direction == -mLastDirections)
            {
                // Direction changed
                int extType = (direction > 0 ? 0 : 1); // minimum or maximum?
                mLastExtremes[extType] = mLastValues;
                float diff = Math.abs(mLastExtremes[extType] - mLastExtremes[1 - extType]);

                if (diff > 8 && (null == mLastType || mLastType != extType))
                {
                    mLastType = extType;
                    mMagneticFireData.add(timeStamp);
                }
            }
            mLastDirections = direction;
            mLastValues = avg;

            mMagneticDataBuffer.clear();
        }
    }

    public static class Pair implements Serializable
    {
        Long first;
        boolean second;

        public Pair(long first, boolean second)
        {
            this.first = first;
            this.second = second;
        }

        @Override
        public boolean equals(Object o)
        {
            if (o instanceof Pair)
            {
                return first.equals(((Pair) o).first);
            }
            return false;
        }
    }
}
One main difference I spotted between your implementation and the code in the grepcode project is the way you register the listener.
Your code:
mSensorManager.registerListener(mStepDetector,
mSensor,
SensorManager.SENSOR_DELAY_NORMAL);
Their code:
mSensorManager.registerListener(mStepDetector,
mSensor,
SensorManager.SENSOR_DELAY_FASTEST);
This is a big difference. SENSOR_DELAY_NORMAL is intended for orientation changes and is therefore not very fast (ever noticed that it takes a moment between you rotating the device and the screen actually rotating? That's because this functionality does not need to be super fast; instant rotation would probably even be annoying). The rate at which you get updates is simply not that high.
On the other hand, SENSOR_DELAY_FASTEST is intended for things like pedometers: you want the sensor data as fast and often as possible, so your calculations of steps will be as accurate as possible.
Try to switch to the SENSOR_DELAY_FASTEST rate, and test again! It should make a big difference.
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
        float x = event.values[0];
        float y = event.values[1];
        float z = event.values[2];
        currentvectorSum = (x * x + y * y + z * z);
        if (currentvectorSum < 100 && inStep == false) {
            inStep = true;
        }
        if (currentvectorSum > 125 && inStep == true) {
            inStep = false;
            numSteps++;
            Log.d("TAG_ACCELEROMETER", "\t" + numSteps);
        }
    }
}

Move animated movieClips using buttons instead of arrow keys

I'm trying to develop a game where I want my character to run when I click a button, and continue running if I hold the button. I'm new to ActionScript 3, so I'm a bit lost here.
I've found code that satisfies my requirements, but it uses the arrow keys, as below:
function moveRunKei() {
    if (Key.isDown(Key.RIGHT)) {
        dx = 15; //speed
        runKei._xscale = 50;
    } else if (Key.isDown(Key.LEFT)) {
        dx = -15;
        runKei._xscale = -50;
    } else {
        dx = 0;
    }
    runKei._x += dx;
    if (runKei._x < 100) runKei._x = 100; //30
    if (runKei._x > 550) runKei._x = 550;
    if (dx != 0 && runKei._currentframe == 1) {
        runKei.gotoAndPlay("run");
    } else if (dx == 0 && runKei._currentframe != 1) {
        runKei.gotoAndStop("stand");
    }
}
this.onEnterFrame = function() {
    moveRunKei();
};
I need to be able to do this using buttons.
////////////////////////////////////////////////////////////////////////////////////
import flash.events.Event;

var mouseDown:Boolean;
var speed:Number = 4;

addEventListener(MouseEvent.MOUSE_DOWN, onMouseDown);
addEventListener(MouseEvent.MOUSE_UP, onMouseUp);
addEventListener(Event.ENTER_FRAME, onEnterFrame);

function onMouseDown(event:MouseEvent):void
{
    mouseDown = true;
}

function onMouseUp(event:MouseEvent):void
{
    mouseDown = false;
}

function onEnterFrame(event:Event):void
{
    if (mouseDown)
    {
        runKei.gotoAndPlay("Run");
        runKei.x += speed;
    }
}
This code is able to make my character move continuously while I hold the button, but it doesn't animate while it moves (the character stays frozen until I release the button) - I'm not sure how to explain it better.
You'll need to add event listeners for mouse down and mouse up on each of the movement buttons, and then have booleans that keep track of whether or not each button is down.
It's worth mentioning that the code you've linked appears to be ActionScript 2, so I've changed it to work with AS3:
var leftDown:Boolean = false;
var rightDown:Boolean = false;

leftButton.addEventListener(MouseEvent.MOUSE_DOWN, onMouseDown);
rightButton.addEventListener(MouseEvent.MOUSE_DOWN, onMouseDown);
leftButton.addEventListener(MouseEvent.MOUSE_UP, onMouseUp);
rightButton.addEventListener(MouseEvent.MOUSE_UP, onMouseUp);
leftButton.addEventListener(MouseEvent.MOUSE_OUT, onMouseUp);
rightButton.addEventListener(MouseEvent.MOUSE_OUT, onMouseUp);

function onMouseDown(e:MouseEvent):void
{
    //since you can't click on two things at once, this is fine.
    rightDown = (e.target == rightButton);
    leftDown = (e.target == leftButton);
}

function onMouseUp(e:MouseEvent):void
{
    //releasing (or rolling off) either button stops the movement.
    rightDown = false;
    leftDown = false;
}
function moveRunKei()
{
    if (rightDown) {
        dx = 15; //speed
        runKei.scaleX = 0.5;
    } else if (leftDown) {
        dx = -15;
        runKei.scaleX = -0.5;
    } else {
        dx = 0;
    }
    runKei.x += dx;
    if (runKei.x < 100) runKei.x = 100; //30
    if (runKei.x > 550) runKei.x = 550;
    if (dx != 0 && runKei.currentFrame == 1)
    {
        runKei.gotoAndPlay("run");
    }
    else if (dx == 0 && runKei.currentFrame != 1)
    {
        runKei.gotoAndStop("stand");
    }
}

this.addEventListener(Event.ENTER_FRAME, onEnterFrame);
function onEnterFrame(e:Event):void
{
    moveRunKei();
}

Android Google Maps - how to set the zoom level (and center) of map to display all my geoobjects?

I use Google API 2.2. How should I set the zoom level (and maybe the center as well) to display all my geo-objects from an ArrayList or array? Should it depend on landscape/portrait orientation and maybe the screen resolution?
I guess that computing min and max for latitude/longitude will not be complicated in your array ;-)
m_Map.getController().zoomToSpan(
        maxLatitude() - minLatitude(),
        maxLongitude() - minLongitude());
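For completeness, a rough sketch of that idea with the MapView API, assuming your objects' positions are in a List<GeoPoint> called points (the variable names are made up):
// Compute the bounding box of all points (in microdegrees) and fit the map to it.
int minLat = Integer.MAX_VALUE, maxLat = Integer.MIN_VALUE;
int minLon = Integer.MAX_VALUE, maxLon = Integer.MIN_VALUE;
for (GeoPoint p : points) {
    minLat = Math.min(minLat, p.getLatitudeE6());
    maxLat = Math.max(maxLat, p.getLatitudeE6());
    minLon = Math.min(minLon, p.getLongitudeE6());
    maxLon = Math.max(maxLon, p.getLongitudeE6());
}
MapController controller = m_Map.getController();
controller.zoomToSpan(maxLat - minLat, maxLon - minLon);
// Center the map on the middle of the bounding box.
controller.animateTo(new GeoPoint((minLat + maxLat) / 2, (minLon + maxLon) / 2));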
Following is a simple approach I have followed:
1. Find the distance between the two Geo points. Use Loc1.distanceTo(Loc2) to get the distance between them.
2. Based on the distance, you can use the below code to set the zoom level. You may have to improvise on the below code to support all screen sizes.
if (fDistance > 0 && fDistance <= 0.5) {
    iZoomLevel = 18;
} else if (fDistance > 0.5 && fDistance <= 2) {
    iZoomLevel = 17;
} else if (fDistance > 2 && fDistance <= 3) {
    iZoomLevel = 15;
} else if (fDistance > 3 && fDistance <= 10) {
    iZoomLevel = 14;
} else if (fDistance > 10 && fDistance <= 50) {
    iZoomLevel = 11;
} else if (fDistance > 50 && fDistance <= 100) {
    iZoomLevel = 9;
} else if (fDistance > 100 && fDistance <= 300) {
    iZoomLevel = 8;
} else if (fDistance > 300 && fDistance <= 1000) {
    iZoomLevel = 7;
} else if (fDistance > 1000 && fDistance <= 3000) {
    iZoomLevel = 5;
} else if (fDistance > 3000 && fDistance <= 5000) {
    iZoomLevel = 4;
} else if (fDistance > 5000 && fDistance <= 10000) {
    iZoomLevel = 3;
}
