GestureDetector multi-touch - Android

I'm working with touch gestures in Android using the OnGestureListener interface and GestureDetector.
I made an app to test whether detecting two fingers works: in onFling(MotionEvent e1, MotionEvent e2, float velocityX, float velocityY) I print the IDs of the two MotionEvents, but the IDs are the same (apparently only one finger is detected).
Does GestureDetector support multi-touch events?

The Issue
Multi-touch gesture detection does not seem to be implemented by OnGestureListener out of the box.
The first thing you may have tried is reading event.pointerCount to get the number of fingers on the screen. However, inside these callbacks it will equal 1. This is because you will (quite likely) never manage to touch the screen with both fingers in exactly the same millisecond.
Fixing it
You have to buffer pointerCount (the number of fingers on screen). First add these variables somewhere in the context in which you intend to track gestures:
import java.util.Timer
import kotlin.concurrent.fixedRateTimer

// track how many fingers are used
var bufferedPointerCount = 1
var bufferTolerance = 500L // in ms (Long, as required by fixedRateTimer)
var pointerBufferTimer = Timer()
Then, in the onTouchEvent(event: MotionEvent) function, you add this:
// Buffer / Debounce the pointer count
if (event.pointerCount > bufferedPointerCount) {
    bufferedPointerCount = event.pointerCount
    pointerBufferTimer = fixedRateTimer("pointerBufferTimer", true, bufferTolerance, 1000L) {
        bufferedPointerCount = 1
        this.cancel() // a non-recurring timer: reset once, then stop
    }
}
Essentially this tracks the maximum number of fingers on the display and keeps that value valid for bufferTolerance milliseconds (here: 500).
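For example, you could then branch on the buffered count inside onFling; a minimal sketch, where handleOneFingerFling and handleTwoFingerFling are hypothetical handlers of your own:

override fun onFling(
    e1: MotionEvent?, e2: MotionEvent,
    velocityX: Float, velocityY: Float
): Boolean {
    // use the buffered count rather than e2.pointerCount,
    // which may already have dropped back to 1
    when (bufferedPointerCount) {
        1 -> handleOneFingerFling(velocityX, velocityY)
        2 -> handleTwoFingerFling(velocityX, velocityY)
    }
    return true
}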
I'm currently using this in a custom Android launcher I created (finnmglas/Launcher; see the related issue there).

Related

How to dispatch multi-touch gestures using AccessibilityService (dispatchGesture)

As we know, Android supports multi-finger gestures. I want to develop an app that dispatches complex gestures for the user. I am able to capture the motion events and dispatch gestures made with only one finger.
But if the user makes a gesture with multiple pointers (fingers), I can capture them, but how can I dispatch them using the AccessibilityService dispatchGesture() function?
Any help would be welcome. Thanks.
To dispatch multi-finger gestures from an accessibility service, you add one stroke per finger.
For example, to dispatch a two-finger gesture, add two gesture strokes to the GestureDescription and then dispatch it.
A two-finger swipe gesture as an example:
Point position = new Point(100, 10);
GestureDescription.Builder builder = new GestureDescription.Builder();
Path p = new Path();
Path q = new Path();
// Path for the first finger
p.moveTo(position.x, position.y);
p.lineTo(position.x, position.y + 300);
// Path for the second finger
q.moveTo(position.x, position.y);
q.lineTo(position.x + 50, position.y + 300);
// Two strokes for two fingers; each starts at 100 ms and lasts 50 ms
builder.addStroke(new GestureDescription.StrokeDescription(p, 100L, 50L));
builder.addStroke(new GestureDescription.StrokeDescription(q, 100L, 50L));
GestureDescription gesture = builder.build();
boolean isDispatched = dispatchGesture(gesture, gestureResultCallback, null);
The order in which individual pointers appear within a motion event is undefined. Thus the index of a pointer can change from one event to the next, but the pointer ID of a pointer is guaranteed to remain constant as long as the pointer remains active. Use the getPointerId() method to obtain a pointer's ID to track the pointer across all subsequent motion events in a gesture. Then for successive motion events, use the findPointerIndex() method to obtain the pointer index for a given pointer ID in that motion event. For example:
private var mActivePointerId: Int = 0

override fun onTouchEvent(event: MotionEvent): Boolean {
    ...
    // Get the pointer ID
    mActivePointerId = event.getPointerId(0)

    // ... Many touch events later...

    // Use the pointer ID to find the index of the active pointer
    // and fetch its position
    val (x: Float, y: Float) = event.findPointerIndex(mActivePointerId).let { pointerIndex ->
        // Get the pointer's current position
        event.getX(pointerIndex) to event.getY(pointerIndex)
    }
    ...
}
To support multiple touch pointers, you can cache all active pointers with their IDs at their individual ACTION_POINTER_DOWN and ACTION_DOWN event times, and remove the pointers from your cache at their ACTION_POINTER_UP and ACTION_UP events. Those cached IDs may be necessary to handle other action events correctly. For example, when processing an ACTION_MOVE event, you can find the index for each cached active pointer ID, retrieve the pointer's coordinates with the relevant functions (getX(), getY(), etc.), then compare them with your cached coordinates to discover which pointers actually moved. There can be multiple moved pointers in one ACTION_MOVE event, and the getActionIndex() function does not apply to ACTION_MOVE events.
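As a rough illustration of that caching strategy (a sketch, not code from the documentation; assume it lives inside a custom View):

// pointer ID -> last known position (android.graphics.PointF)
private val activePointers = mutableMapOf<Int, PointF>()

override fun onTouchEvent(event: MotionEvent): Boolean {
    when (event.actionMasked) {
        MotionEvent.ACTION_DOWN, MotionEvent.ACTION_POINTER_DOWN -> {
            // actionIndex is valid for DOWN / POINTER_DOWN events
            val i = event.actionIndex
            activePointers[event.getPointerId(i)] = PointF(event.getX(i), event.getY(i))
        }
        MotionEvent.ACTION_MOVE -> {
            // MOVE carries no action index; check every cached pointer
            for ((id, last) in activePointers) {
                val i = event.findPointerIndex(id)
                if (i == -1) continue
                if (event.getX(i) != last.x || event.getY(i) != last.y) {
                    // this pointer actually moved
                    last.set(event.getX(i), event.getY(i))
                }
            }
        }
        MotionEvent.ACTION_UP, MotionEvent.ACTION_POINTER_UP ->
            activePointers.remove(event.getPointerId(event.actionIndex))
    }
    return true
}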

Is it possible to make an Android WebView-based app less sensitive in interpreting taps as small drags?

I have a C# Xamarin Android app that hosts a reactjs app in a WebView.
When using this app on a touch-screen Android device, it appears that tapping the screen is occasionally ignored.
What appears to be going on is that the tap is interpreted as a mini drag event, because there was some small directional movement within the tap.
Looking at the Android logs for failed taps, I noticed output like the following:
adb -d logcat -s CustomFrequencyManagerService
06-19 13:35:49.225 2945 9989 D CustomFrequencyManagerService: acquireDVFSLockLocked : type : DVFS_MIN_LIMIT frequency : 839000 uid : 1000 pid : 2945 pkgName : GESTURE_DETECTED#CPU_MIN#49
06-19 13:35:49.781 2945 2945 D CustomFrequencyManagerService: releaseDVFSLockLocked : Getting Lock type frm List : DVFS_MIN_LIMIT frequency : 839000 uid : 1000 pid : 2945 tag : GESTURE_DETECTED#CPU_MIN#49
Note the GESTURE_DETECTED part of the log entry.
However for successful taps, CustomFrequencyManagerService has no output in the log.
Looking at this from the reactjs app perspective:
I noticed that the failed taps emit the following events:
touchstart
touchend
While the normal successful events are:
touchstart
touchend
mousedown
blur
mouseup
click
I could potentially change the reactjs app to respond directly to touch events instead of click events, but I was wondering whether there is a way (hopefully programmatic, via the Android app) to alter the sensitivity with regard to what is interpreted as a drag as opposed to a click.
By installing an IOnTouchListener on the Android.WebKit.WebView
_webView.SetOnTouchListener(new GestureIgnoreTouchListener());
I was able to see at what movement threshold a click turned into a drag.
public class GestureIgnoreTouchListener : Java.Lang.Object, Android.Views.View.IOnTouchListener
{
    float _x;
    float _y;

    public bool OnTouch(Android.Views.View v, MotionEvent e)
    {
        if (e.Action == MotionEventActions.Down)
        {
            _x = e.RawX;
            _y = e.RawY;
            return false;
        }
        if (e.Action == MotionEventActions.Up)
        {
            var diffX = e.RawX - _x;
            var diffY = e.RawY - _y;
            var distance = Math.Sqrt(Math.Pow(diffX, 2) + Math.Pow(diffY, 2));
            // observed:
            // if distance is 10 or less then this is interpreted as a click.
            // if distance is 12 or greater, click is not emitted.
            Console.WriteLine(distance);
            return false;
        }
        return false;
    }
}
Ideally, if the distance were between 10 and 50, I would like this to be considered a click rather than a drag. I could possibly create a synthetic click event in that case, but I'm hoping I can somehow influence whatever Android code is responsible for interpreting this as a drag.
There are two approaches I've seen people use for this situation. The first you already mentioned: tell React, when you're in a touch-screen environment, to use tap events instead of click events.
The second is to take into account what Android refers to as "touch slop":
From https://developer.android.com/training/gestures/movement.html:
Because finger-based touch isn't always the most precise form of interaction, detecting touch events is often based more on movement than on simple contact. To help apps distinguish between movement-based gestures (such as a swipe) and non-movement gestures (such as a single tap), Android includes the notion of "touch slop". Touch slop refers to the distance in pixels a user's touch can wander before the gesture is interpreted as a movement-based gesture. For more discussion of this topic, see Managing Touch Events in a ViewGroup.
Google provides an example of one way to deal with it in a different context here: https://developer.android.com/training/gestures/viewgroup#vc, which you could probably adapt to your situation.
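If you do end up synthesizing the click yourself, a minimal Kotlin sketch of the idea might look like this (the class name and the 50px threshold are hypothetical; the Xamarin bindings mirror the same APIs):

import android.content.Context
import android.view.MotionEvent
import android.view.View
import android.view.ViewConfiguration
import kotlin.math.hypot

// Treat an UP within a custom threshold as a click, even when the
// framework's touch slop would have classified the gesture as a drag.
class ForgivingClickListener(context: Context) : View.OnTouchListener {
    // the system's own slop, scaled for the device's density
    private val systemSlop = ViewConfiguration.get(context).scaledTouchSlop
    // hypothetical custom threshold, larger than the system slop
    private val clickThresholdPx = 50f
    private var downX = 0f
    private var downY = 0f

    override fun onTouch(v: View, e: MotionEvent): Boolean {
        when (e.actionMasked) {
            MotionEvent.ACTION_DOWN -> { downX = e.rawX; downY = e.rawY }
            MotionEvent.ACTION_UP -> {
                val distance = hypot(e.rawX - downX, e.rawY - downY)
                // moved past the system slop (so no click was emitted),
                // but still within our more forgiving threshold
                if (distance > systemSlop && distance <= clickThresholdPx) {
                    v.performClick()
                    return true
                }
            }
        }
        return false
    }
}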

Android: How to detect a swipe between two touch points on an ImageView

I have an ImageView with a number of touch points on it. It is basically an app that detects a swipe between two touch points and does not allow the user to swipe from any other point or in any other direction. It should constrain the user to swipe only between the two touch points.
Just take a look at the following picture:
Now the user should start swiping from point 1 to point 2. If the swipe does not start from point 1, the path between point 1 and point 2 should not be colored.
But once the user successfully swipes from point 1 to point 2, the swipe from point 2 to point 3 becomes enabled. Thus the user goes through point 1 to 2, point 2 to 3, point 3 to 4, and point 4 to point 5 to complete round 1.
Please tell me how to achieve this functionality. I know about gestures, gesture overlays, etc., but none of them fits my case, as they use general touch events and gesture directions.
Please suggest a way to achieve this, and keep in mind that I want this app to run on all types of devices, so I cannot simply hard-code x,y values.
Edit (on demand):
I am posting a link to an app on the Play Store that has the same functionality, but I do not know how they achieve it.
https://play.google.com/store/apps/details?id=al.trigonom.writeletters
If each touch point can be created as an individual view (e.g. an ImageView), then you can create an inViewInBounds() function.
I used code from here in an app where I needed to detect finger-press movement over multiple ImageViews:
Rect outRect = new Rect();
int[] location = new int[2];

// Determine if a touch movement is currently over a given view.
private boolean inViewInBounds(View view, int x, int y) {
    view.getDrawingRect(outRect);
    view.getLocationOnScreen(location);
    outRect.offset(location[0], location[1]);
    return outRect.contains(x, y);
}
To use this function, set the click and touch listeners on a parent view of all those child touch points:
// This might look redundant but is actually required: the empty
// OnClickListener keeps the OnTouchListener active until ACTION_UP.
parentView.setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View v) {}
});

// All the work gets done in this function:
parentView.setOnTouchListener(new View.OnTouchListener() {
    @Override
    public boolean onTouch(View v, MotionEvent event) {
        int x = (int) event.getRawX();
        int y = (int) event.getRawY();
        // ** myTouchPoint might be an array that you loop through here...
        if (inViewInBounds(myTouchPoint, x, y)) doLogic(myTouchPoint);
        return false;
    }
});
The code above only shows how to detect when one of your views is 'touched'.
If none are 'touched' but a view is 'active' (e.g. when a touch is detected, set a variable like viewLastTouched = myTouchPoint), you would then call something like a drawingLine(viewLastTouched, x, y) function to draw the line and/or detect boundaries, etc. A sketch of how the sequential-point logic might look follows below.
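A rough Kotlin sketch of that sequencing, assuming a Kotlin port of inViewInBounds() above and a hypothetical pointViews list holding the numbered point views in order:

// pointViews: List<View> -- hypothetical, the numbered points in order
var currentIndex = 0    // the point the next swipe must start from
var swipeArmed = false  // true once the user touched the correct start point

fun handleTouch(event: MotionEvent): Boolean {
    val x = event.rawX.toInt()
    val y = event.rawY.toInt()
    when (event.actionMasked) {
        MotionEvent.ACTION_DOWN -> {
            // only arm the swipe if it begins on the expected point
            swipeArmed = inViewInBounds(pointViews[currentIndex], x, y)
        }
        MotionEvent.ACTION_MOVE -> {
            if (swipeArmed && currentIndex + 1 < pointViews.size &&
                inViewInBounds(pointViews[currentIndex + 1], x, y)) {
                currentIndex++  // segment completed, unlock the next one
                // colorPath(currentIndex - 1, currentIndex)  // hypothetical drawing call
            }
        }
        MotionEvent.ACTION_UP -> swipeArmed = false
    }
    return true
}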
They are not using native Android Java code to build this app.
The app runs on this code:
import Runtime.MMFRuntime;
public class Main extends MMFRuntime {
}
This in turn is from https://github.com/ClickteamLLC/android/blob/master/docs/index.md
This is used to package apps/games written with http://www.clickteam.com/clickteam-fusion-2-5

Dragging objects using ACTION_MOVE

I am trying to make all my drawn sprites draggable for a little game. The user should be able to touch anywhere, and the sprites should move the same distance the finger moves.
With the following method they move on ACTION_MOVE events, but only very slowly, a shorter distance, and sometimes they don't move at all.
(addToX/Y simply adds the difference to the sprite's coordinates.)
@Override
public boolean onTouchEvent(MotionEvent evt) {
    switch (evt.getAction()) {
        case MotionEvent.ACTION_DOWN:
            break;
        case MotionEvent.ACTION_MOVE:
            if (evt.getHistorySize() > 0) {
                for (int i = 1, n = evt.getHistorySize(); i < n; i++) {
                    int calcX = (int) evt.getHistoricalX(i) - (int) evt.getHistoricalX(i - 1);
                    int calcY = (int) evt.getHistoricalY(i) - (int) evt.getHistoricalY(i - 1);
                    for (Sprite sprite : spriteList) {
                        sprite.addToX(calcX);
                        sprite.addToY(calcY);
                    }
                }
            }
            break;
    }
    return true;
}
Any ideas on this?
Assuming your Sprite class is a (potentially indirect) extension of android.view.View, you can use setOnDragListener() to define an onDrag() override for the sprites, then call startDrag(...) on them to begin the drag. This is typically triggered by a long-press gesture on the view to be dragged, but in your case you can trigger it from within onTouchEvent() on ACTION_MOVE (or even ACTION_DOWN). See here for more details on these methods.
Also, with respect to the code you posted, one issue that probably explains why it doesn't always work is that you are only using the historical points (which may or may not have accumulated on any particular call to onTouchEvent()). Whether or not getHistorySize() is greater than 0, you should still also use evt.getX() and evt.getY() on each call to onTouchEvent(), as in the sketch below. But of course, if you use the drag-listener approach I suggested instead, you won't need to worry about this.
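For illustration, a small Kotlin sketch of that point (spriteList and addToX/addToY are the names from the question): track the last seen position and always use the current sample.

private var lastX = 0f
private var lastY = 0f

override fun onTouchEvent(evt: MotionEvent): Boolean {
    when (evt.actionMasked) {
        MotionEvent.ACTION_DOWN -> { lastX = evt.x; lastY = evt.y }
        MotionEvent.ACTION_MOVE -> {
            // delta from the last event, using the current (non-historical) sample
            val dx = (evt.x - lastX).toInt()
            val dy = (evt.y - lastY).toInt()
            for (sprite in spriteList) {
                sprite.addToX(dx)
                sprite.addToY(dy)
            }
            lastX = evt.x
            lastY = evt.y
        }
    }
    return true
}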
Update per comment
If you want to move all of the sprites at once, you can put the sprites into a full-screen FrameLayout and attach a GestureDetector that uses a GestureDetector.SimpleOnGestureListener to capture onScroll() callbacks and then calls scrollTo() on the FrameLayout. When the parent FrameLayout scrolls, all of its child sprites will appear to move together.
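A minimal Kotlin sketch of that approach, where container is a hypothetical full-screen FrameLayout holding all the sprite views:

val gestureDetector = GestureDetector(context, object : GestureDetector.SimpleOnGestureListener() {
    override fun onDown(e: MotionEvent): Boolean = true // claim the gesture

    override fun onScroll(
        e1: MotionEvent?, e2: MotionEvent,
        distanceX: Float, distanceY: Float
    ): Boolean {
        // scrolling the parent shifts every child sprite at once
        container.scrollBy(distanceX.toInt(), distanceY.toInt())
        return true
    }
})

container.setOnTouchListener { _, event -> gestureDetector.onTouchEvent(event) }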

Fling implementation on an Android canvas

I have the usual gesture detector for detecting a fling; it is an instance attribute of a SurfaceView:
GestureDetector flingDetector = new GestureDetector(getContext(), new SimpleOnGestureListener() {
    @Override
    public boolean onFling(MotionEvent e1, MotionEvent e2, float velocityX, float velocityY) {
        // Fling implementation
        return true;
    }
});
I am drawing a lot of complex stuff on a canvas, and I have a translate(dx, dy) method that I use with onScroll.
So my question is: how do I implement the fling using the translate method?
There seem to be a lot of questions about detecting a fling; my question is about implementing one.
I am not sure this will answer your question, but I'll give it a try.
Check http://developer.android.com/reference/android/view/MotionEvent.html for MotionEvent.
You can use the two events received as e1 and e2 in the onFling method and calculate coordinate differences with e1.getX(), e2.getX(), e1.getY(), e2.getY(). With this you have the dx and dy to use with translate(dx, dy).
Since a fling is a more dynamic gesture, you could decide that a fling means an ampler movement and apply an amplification factor to dx and dy, so that when the user scrolls they get a precise movement, but on a fling the actual movement gets amplified.
If this factor depends on velocity, you have a custom response for every user input.
(A different thing would be animating the result, which I guess would depend on other things).
An example I might try if it were me:
User scrolls softly: the movement is dx, dy. Call translate(dx, dy).
User flings:
Real motion: dx = e2.getX() - e1.getX(); dy = e2.getY() - e1.getY().
Fling factor F: custom implementation.
Modified motion: dxModified = dx * velocityX * F; dyModified = dy * velocityY * F.
Finally: translate(dxModified, dyModified)
Hope this helps to some extent.
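For illustration, a Kotlin sketch of that idea, assuming the asker's own translate(dx, dy) method; note it uses the velocity magnitude (kotlin.math.abs) so that the sign of dx/dy keeps the direction, and F is a constant you would tune by hand:

override fun onFling(e1: MotionEvent?, e2: MotionEvent, velocityX: Float, velocityY: Float): Boolean {
    if (e1 == null) return false
    val f = 0.001f  // hypothetical amplification constant, tuned by experiment
    // raw gesture distance
    val dx = e2.x - e1.x
    val dy = e2.y - e1.y
    // scale the distance by the fling speed so faster flings travel farther
    translate(dx * kotlin.math.abs(velocityX) * f, dy * kotlin.math.abs(velocityY) * f)
    return true
}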
Edit: I did not realize this question was from 2012; hopefully it will help someone someday. It would be nice to know about the final implementation anyway!
