I have a bitmap that I can drag around in my app, but there is a small bug: if someone clicks the bitmap with one finger and starts moving it around, and then while moving it clicks with another finger, the bitmap "teleports" over there. How can I detect that the user switched fingers so the bitmap won't teleport? This is my ACTION_MOVE code:
case MotionEvent.ACTION_MOVE:
    if (dragging) {
        CurrentMobEntry.getKey().x = (int) x
                - CurrentMobEntry.getValue().getNormalbit().getWidth() / 2;
        CurrentMobEntry.getKey().y = (int) y
                - CurrentMobEntry.getValue().getNormalbit().getHeight() / 2;
        CurrentMobEntry.getValue().getDestroyedP().x = (int) x
                + CurrentMobEntry.getValue().getNormalbit().getWidth() / 2;
        CurrentMobEntry.getValue().getDestroyedP().y = (int) y
                + CurrentMobEntry.getValue().getNormalbit().getHeight() / 2;
    }
    break;
From the documentation:
Some devices can report multiple movement traces at the same time. Multi-touch screens emit one movement trace for each finger. The individual fingers or other objects that generate movement traces are referred to as pointers. Motion events contain information about all of the pointers that are currently active even if some of them have not moved since the last event was delivered.
The number of pointers only ever changes by one as individual pointers go up and down, except when the gesture is canceled.
So what you want to do is allow only one pointer (finger) to operate at any given time. I can't find a simpler built-in solution, so we are going to use a workaround:
Each pointer gets a unique, constant ID. On any given touch event, check whether the event belongs to the pointer that initiated the gesture; if it does, consume it, otherwise ignore it.
For a full example of this check the Android Developer's blog.
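Applied to the drag code above, that looks roughly like the sketch below. This is a minimal sketch rather than the full blog example; the field names (activePointerId, dragging) and the move logic are illustrative stand-ins for your own code.
// Track the finger that started the drag and ignore all others.
private int activePointerId = MotionEvent.INVALID_POINTER_ID;

@Override
public boolean onTouchEvent(MotionEvent event) {
    switch (event.getActionMasked()) {
        case MotionEvent.ACTION_DOWN:
            // The first finger down owns the drag.
            activePointerId = event.getPointerId(0);
            dragging = true;
            break;
        case MotionEvent.ACTION_MOVE:
            if (dragging) {
                int index = event.findPointerIndex(activePointerId);
                if (index != -1) {
                    // Read coordinates from the owning finger only, so a
                    // second finger can never teleport the bitmap.
                    float x = event.getX(index);
                    float y = event.getY(index);
                    // ... update the bitmap position from x and y ...
                }
            }
            break;
        case MotionEvent.ACTION_UP:
        case MotionEvent.ACTION_CANCEL:
            activePointerId = MotionEvent.INVALID_POINTER_ID;
            dragging = false;
            break;
    }
    return true;
}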
Related
As we know, the Android OS supports **multi-finger gestures**. I want to develop an app that dispatches complex gestures for the user. I am able to capture the motion events and dispatch gestures that are made with only one finger.
But if the user makes a gesture with multiple pointers (fingers), I can capture them, but how can I dispatch them using the Accessibility Service dispatchGesture() function?
Any help would be most welcome. Thanks
So, to dispatch multi-finger gestures using an accessibility service, we can use one stroke per finger.
For example, to dispatch a two-finger gesture, add two gesture strokes to the GestureDescription and then dispatch it.
A two-finger swipe-up gesture as an example:
Point position = new Point(100, 10);
GestureDescription.Builder builder = new GestureDescription.Builder();
Path p = new Path();
Path q = new Path();
// For the first finger
p.moveTo(position.x, position.y);
p.lineTo(position.x, position.y + 300);
// For the second finger (start offset so the two strokes don't coincide)
q.moveTo(position.x + 50, position.y);
q.lineTo(position.x + 50, position.y + 300);
// Two strokes for two fingers: each starts at 100 ms and lasts 50 ms
builder.addStroke(new GestureDescription.StrokeDescription(p, 100L, 50L));
builder.addStroke(new GestureDescription.StrokeDescription(q, 100L, 50L));
GestureDescription gesture = builder.build();
boolean isDispatched = dispatchGesture(gesture, gestureResultCallback, null);
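The gestureResultCallback referenced above is not shown in the answer; a minimal sketch of it, assuming this code runs inside your AccessibilityService subclass, might be:
// Hedged sketch: a callback to observe whether the dispatch succeeded.
AccessibilityService.GestureResultCallback gestureResultCallback =
        new AccessibilityService.GestureResultCallback() {
            @Override
            public void onCompleted(GestureDescription gestureDescription) {
                // The gesture was dispatched in full.
            }

            @Override
            public void onCancelled(GestureDescription gestureDescription) {
                // The gesture was interrupted, e.g. by a real touch on the screen.
            }
        };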
The order in which individual pointers appear within a motion event is undefined. Thus the index of a pointer can change from one event to the next, but the pointer ID of a pointer is guaranteed to remain constant as long as the pointer remains active. Use the getPointerId() method to obtain a pointer's ID to track the pointer across all subsequent motion events in a gesture. Then for successive motion events, use the findPointerIndex() method to obtain the pointer index for a given pointer ID in that motion event. For example:
private var mActivePointerId: Int = 0

override fun onTouchEvent(event: MotionEvent): Boolean {
    ...
    // Get the pointer ID
    mActivePointerId = event.getPointerId(0)

    // ... Many touch events later...

    // Use the pointer ID to find the index of the active pointer
    // and fetch its position
    val (x: Float, y: Float) = event.findPointerIndex(mActivePointerId).let { pointerIndex ->
        // Get the pointer's current position
        event.getX(pointerIndex) to event.getY(pointerIndex)
    }
    ...
}
To support multiple touch pointers, you can cache all active pointers with their IDs when their individual ACTION_DOWN and ACTION_POINTER_DOWN events arrive, and remove them from your cache at their ACTION_POINTER_UP and ACTION_UP events. Those cached IDs may be necessary to handle other action events correctly. For example, when processing an ACTION_MOVE event, you can find the index for each cached active pointer ID, retrieve the pointer's coordinates using the relevant functions (getX(), getY(), etc.), and then compare them with your cached coordinates to discover which pointers actually moved. There can be multiple moved pointers in one ACTION_MOVE event, and the getActionIndex() function does not apply to ACTION_MOVE.
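A sketch of that caching approach, assuming it lives inside a View subclass (imports: android.util.SparseArray, android.graphics.PointF, android.view.MotionEvent):
// Cache of active pointers, keyed by pointer ID rather than index.
private final SparseArray<PointF> activePointers = new SparseArray<>();

@Override
public boolean onTouchEvent(MotionEvent event) {
    int index = event.getActionIndex();
    int id = event.getPointerId(index);
    switch (event.getActionMasked()) {
        case MotionEvent.ACTION_DOWN:
        case MotionEvent.ACTION_POINTER_DOWN:
            activePointers.put(id, new PointF(event.getX(index), event.getY(index)));
            break;
        case MotionEvent.ACTION_MOVE:
            // Several pointers may move in one event: check every cached ID.
            for (int i = 0; i < activePointers.size(); i++) {
                int pointerId = activePointers.keyAt(i);
                int pointerIndex = event.findPointerIndex(pointerId);
                if (pointerIndex == -1) continue;
                PointF last = activePointers.valueAt(i);
                float x = event.getX(pointerIndex);
                float y = event.getY(pointerIndex);
                if (x != last.x || y != last.y) {
                    // This pointer actually moved; handle it, then refresh the cache.
                    last.set(x, y);
                }
            }
            break;
        case MotionEvent.ACTION_POINTER_UP:
        case MotionEvent.ACTION_UP:
        case MotionEvent.ACTION_CANCEL:
            activePointers.remove(id);
            break;
    }
    return true;
}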
I am trying to make all my drawn Sprites draggable for a little game. You should be able to touch anywhere, and the sprites should move the same distance the finger moves.
With the following method they move on an ACTION_MOVE event, but only very slowly, a shorter distance, and sometimes not at all:
addToX/Y just adds the given delta to the sprite's coordinates.
@Override
public boolean onTouchEvent(MotionEvent evt) {
    switch (evt.getAction()) {
        case MotionEvent.ACTION_DOWN:
            break;
        case MotionEvent.ACTION_MOVE:
            if (evt.getHistorySize() > 0) {
                for (int i = 1, n = evt.getHistorySize(); i < n; i++) {
                    int calcX = (int) evt.getHistoricalX(i) - (int) evt.getHistoricalX(i - 1);
                    int calcY = (int) evt.getHistoricalY(i) - (int) evt.getHistoricalY(i - 1);
                    for (Sprite sprite : spriteList) {
                        sprite.addToX(calcX);
                        sprite.addToY(calcY);
                    }
                }
            }
            break;
    }
    return true;
}
Any ideas on this?
Assuming your Sprite class is a (potentially indirect) extension of android.view.View, you can use setOnDragListener() to define an onDrag() override for it. Then use startDrag(...) on the sprite to begin the drag. This is typically triggered by a long-press gesture on the view to be dragged, but in your case you can trigger it from within onTouchEvent() on ACTION_MOVE (or even ACTION_DOWN). See here for more details on these methods; a sketch of this wiring is below.
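Something like the following, where sprite and parentLayout are assumed names standing in for your own views:
// Start a drag when the sprite is touched. startDragAndDrop() replaces
// startDrag() on API 24+.
sprite.setOnTouchListener((View v, MotionEvent event) -> {
    if (event.getActionMasked() == MotionEvent.ACTION_DOWN) {
        ClipData data = ClipData.newPlainText("sprite", "");
        v.startDrag(data, new View.DragShadowBuilder(v), v, 0);
        return true;
    }
    return false;
});

// Reposition the dragged view when it is dropped on the parent.
parentLayout.setOnDragListener((View v, DragEvent dragEvent) -> {
    if (dragEvent.getAction() == DragEvent.ACTION_DROP) {
        // The dragged view was passed as local state above.
        View dragged = (View) dragEvent.getLocalState();
        dragged.setX(dragEvent.getX() - dragged.getWidth() / 2f);
        dragged.setY(dragEvent.getY() - dragged.getHeight() / 2f);
    }
    return true;
});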
Also, with respect to the code you posted, one issue that probably explains why it doesn't always work is that you are only using the historical points (which may or may not have accumulated on any particular call to onTouchEvent()). Whether or not getHistorySize() is greater than 0, you should also use evt.getX() and evt.getY() on each call to onTouchEvent(); a corrected ACTION_MOVE sketch follows. But of course, if you use the drag-listener approach I suggested instead, you won't need to worry about this.
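For completeness, a sketch of that fix (lastX/lastY are assumed fields you cache on ACTION_DOWN; moveSprites() stands in for your addToX()/addToY() loop over spriteList):
// Consume the batched historical points first, then the current point.
private void handleMove(MotionEvent evt) {
    for (int h = 0; h < evt.getHistorySize(); h++) {
        moveSprites(evt.getHistoricalX(h) - lastX, evt.getHistoricalY(h) - lastY);
        lastX = evt.getHistoricalX(h);
        lastY = evt.getHistoricalY(h);
    }
    // Always apply the current (non-historical) point as well.
    moveSprites(evt.getX() - lastX, evt.getY() - lastY);
    lastX = evt.getX();
    lastY = evt.getY();
}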
Update per comment
If you want to move all of the sprites at once, you can put the sprites into a full-screen FrameLayout and attach a GestureDetector that uses a GestureDetector.SimpleOnGestureListener to capture onScroll() callbacks and then calls scrollTo() on the FrameLayout. When the parent FrameLayout scrolls, all of its child sprites will appear to move together, as in the sketch below.
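A sketch of that idea, where container is assumed to be the full-screen FrameLayout holding the sprites (this variant uses scrollBy(), which applies the per-callback delta directly):
final GestureDetector detector = new GestureDetector(context,
        new GestureDetector.SimpleOnGestureListener() {
            @Override
            public boolean onScroll(MotionEvent e1, MotionEvent e2,
                                    float distanceX, float distanceY) {
                // distanceX/Y are the deltas since the last onScroll call,
                // so the content follows the finger.
                container.scrollBy((int) distanceX, (int) distanceY);
                return true;
            }
        });
container.setOnTouchListener((v, event) -> detector.onTouchEvent(event));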
I am attempting to translate an object depending on the user's touch position.
The problem is that when I test it out, the object disappears as soon as I drag my finger on my phone screen. I am not entirely sure what's going on with it.
If somebody can guide me that would be great :)
Thanks
This is the code:
#pragma strict

function Update () {
    for (var touch : Touch in Input.touches)
    {
        if (touch.phase == TouchPhase.Moved) {
            transform.Translate(0, touch.position.y, 0);
        }
    }
}
The problem is that you're moving the object by touch.position.y. This isn't a point in the world, it's a point on the touch screen. What you'll probably want is Camera.main.ScreenToWorldPoint(touch.position).y, which gives you the in-world position of wherever you've touched.
Of course, Translate takes a vector indicating a distance, not a final destination, so simply sticking the above into it still won't work as you intend.
Instead maybe try this:
// ScreenToWorldPoint needs a z value giving the distance from the camera;
// with z left at 0 it just returns the camera's own position.
Vector3 endPos = Camera.main.ScreenToWorldPoint(
        new Vector3(touch.position.x, touch.position.y, 10f)); // 10f: object's depth
float speed = 1f;
transform.position = Vector3.Lerp(transform.position, endPos, speed * Time.deltaTime);
which should move the object towards your finger while at the same time keeping its movements smooth looking.
You'll want to ask this question at Unity's dedicated Questions/Answers site: http://answers.unity3d.com/index.html
There are very few people who come to Stack Overflow for Unity-specific questions, unless they relate to Android/iOS-specific features.
As for the cause of your problem, touch.position.y is defined in screen space (pixels), whereas transform.Translate expects world units (meters). You can convert between the two using the Camera.ScreenToWorldPoint() method, then create a vector from the camera position and the screen point's world position. With this vector you can then either intersect some geometry in the scene or simply use it as a point in front of the camera.
http://docs.unity3d.com/Documentation/ScriptReference/Camera.ScreenToWorldPoint.html
I'm currently developing an air-hockey simulation for Android. For the multiplayer mode I'm tracking two touch events on the screen, which works well as long as the touch points don't get too close.
When the two fingers get too close, Android only recognizes one touch event, in the middle of both points.
To make it even worse, Android sometimes messes up the IDs after the collision.
I've already thought about estimating the next touch points and assigning IDs manually; does anybody know a better way, or know of somebody who has already fixed this problem programmatically?
NOTE: I'm testing on a Samsung Galaxy S 3
Not necessarily a logical fix to the issue, but nevertheless a possible solution for the application:
If I'm not completely mistaken, air-hockey games shouldn't allow opponents to intrude on each other's half of the field. If we assume a thick border across the center of the screen (in portrait mode), then I wouldn't be allowed to do anything beyond that border, hence there is no point in tracking my finger after it reaches the border line.
Constraining your tracked touch events to valid physical regions as described might help you ignore invalid points (given that the physical regions don't intersect, that is).
You might also have to keep track of the direction of the touch vector: if the vector stretches from the center of the screen towards "your end", it might be the opponent's intruding finger or your own returning finger. In neither case should it affect the hockey puck (perhaps).
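In code, that filter could be as simple as the sketch below (screenHeight and the player flag are assumptions of ours, for a portrait layout):
// Ignore any pointer that has crossed the center line into the opponent's half.
boolean inPlayersHalf(float y, boolean isTopPlayer, float screenHeight) {
    float center = screenHeight / 2f;
    return isTopPlayer ? y < center : y > center;
}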
It may depend on the device you are using, but I'm using the code below on a Huawei X5 and it never mixes up fingers, even if they touch each other or I twist them over the screen.
private static final PointF[] touchScreenStartPtArr = new PointF[10];
private static final PointF[] touchScreenStopPtArr = new PointF[10];
private static final PointF[] touchScreenCurrPtArr = new PointF[10];

OnTouchListener onTouchListenerMulti = new OnTouchListener() {
    @Override
    public boolean onTouch(View v, MotionEvent event) {
        int action = event.getActionMasked();
        int pointerIndex = event.getActionIndex();
        int fingerId = event.getPointerId(pointerIndex);
        switch (action) {
            case MotionEvent.ACTION_DOWN:
            case MotionEvent.ACTION_POINTER_DOWN:
                touchScreenStartPtArr[fingerId] =
                        new PointF(event.getX(pointerIndex), event.getY(pointerIndex));
                break;
            case MotionEvent.ACTION_UP:
            case MotionEvent.ACTION_POINTER_UP:
            case MotionEvent.ACTION_CANCEL:
                touchScreenStopPtArr[fingerId] =
                        new PointF(event.getX(pointerIndex), event.getY(pointerIndex));
                break;
            case MotionEvent.ACTION_MOVE:
                // Several pointers can move in one event; look up each one's ID.
                int pointerCount = event.getPointerCount();
                for (int i = 0; i < pointerCount; i++) {
                    int id = event.getPointerId(i);
                    touchScreenCurrPtArr[id] = new PointF(event.getX(i), event.getY(i));
                }
                break;
        }
        return true;
    }
};
Note that I'm indexing the arrays by the pointer ID (fingerId) and not by the pointer index, as a pointer's index can change when another finger is released.
Hope it works for you.
Here's the way I see it.
The touchscreen hardware gives you a resolution below which two touches are the same as one. This is something you cannot change.
Now the question is what to do when two touches merge. (This is something that can be tested for programmatically, one would think; e.g. the pointer count drops from 2 to 1 AND the previous touch point 1 is close enough to the previous touch point 2; see the sketch below.) In your case, I would move both pucks along the merged touch gesture until they separate, then return individual control.
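A minimal sketch of that merge test, assuming you cache the previous event's pointer count and positions yourself (names and the distance threshold are our own):
// Returns true when two tracked touches have likely collapsed into one.
boolean touchesMerged(int prevCount, int currCount,
                      PointF prev1, PointF prev2, float mergeDistPx) {
    if (prevCount == 2 && currCount == 1) {
        float dx = prev1.x - prev2.x;
        float dy = prev1.y - prev2.y;
        return dx * dx + dy * dy < mergeDistPx * mergeDistPx;
    }
    return false;
}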
Of course, I see several problems with this, like which touch controls which puck after the split? Maybe one person lifted their finger during the merge.
You could have both players lose control of their puck if a merge occurs. This could simulate the shock to the wrist as your hand bashes into your opponent's :)
I also like @dbm's idea.
Hope this helps. Probably didn't :)
I'm writing an app that involves writing on the screen using one's finger, or eventually a stylus. I have that part working. On ACTION_DOWN, starts drawing; on ACTION_MOVE, adds line segments; on ACTION_UP, finishes line.
The problem is that after ACTION_DOWN, apparently the pointer needs to move more than 10 pixels away from where it started (basically a 20x20 box around the starting point) in order to begin sending ACTION_MOVE events. After leaving the box, the move events are all quite accurate. (I figured out the 10 pixel thing by testing it.) Since this is meant to be used for writing or drawing, 10 pixels is a fairly significant loss: depending on how small you're trying to write, you can lose the first letter or two. I haven't been able to find anything about it - only a couple posts on a forum or two, like http://android.modaco.com/topic/339694-touch-input-problem-not-detecting-very-small-movements/page_pid_1701028#entry1701028. It seems to be present on some devices or systems and not others. No ideas as to how to get rid of it when you have it, though.
I'm using a Galaxy Tab 10.1, with Android 3.1. I've tried several different things to try to get rid of it: I've tried setting the event's coords to something else to see if I could trick it into thinking the cursor was in a different place; I tried re-dispatching the event with the coords changed (my handler reacted to the new points, but still didn't respond to movements in the 10-pixel radius.) I've searched through source code for any references to the effect, and found none (though I think it's from a different version of Android - code for 3.1 isn't released yet, is it?) I've searched for methods of querying the current state of the pointers, so I could just have a timer catch the changes until the pointer crossed the threshold. Couldn't find any way of getting pointer coords without a corresponding movement event. Nothing worked. Does anybody know anything about this, or have any ideas or work-arounds? Thank you.
-- Update: Drag and drop events show the same threshold.
I agree in part with the post by @passsy but come to a different conclusion. Firstly, as mentioned, mTouchSlop is the value we are interested in, and it is exposed via ViewConfiguration.get(context).getScaledTouchSlop();
If you check the Android source for ViewConfiguration, the default value for TOUCH_SLOP is 8dip, but the comments mention that this value is a fallback only; the actual value is defined when the Android OS for that specific device is built. (It may be more or less than this value, which appears to hold true for the Galaxy Tab devices.)
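If you want to see the value your own device actually uses, a quick check is (the log tag is arbitrary):
// Read the device's effective, density-scaled touch slop in pixels.
int touchSlop = ViewConfiguration.get(context).getScaledTouchSlop();
Log.d("TouchSlop", "scaled touch slop = " + touchSlop + "px");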
More specific to the code sample: the mTouchSlop value is read from the ViewConfiguration when the View is initialised, but it is only accessed in the onTouchEvent method. If you extend View and override that method (without calling super), then the behaviour of mTouchSlop in the View class is no longer relevant.
More telling (to us) was that when we changed the Android developer settings to overlay touch events on the screen, a touch with a small drag did not register as a motion event, highlighted by the fact that the crosshairs drawn by the Android OS did not move. From this we conclude that the minimal drag distance is enforced at the OS level, and your application will never be aware of drag events smaller than the TOUCH_SLOP value. You should also be aware that TOUCH_SLOP should not be used directly; the API deprecates the getTouchSlop method and recommends getScaledTouchSlop, which takes the device's screen size and pixel density into account. A side effect of this is that the actual minimum stroke length as perceived on different devices may vary; e.g. on a Galaxy Tab 2.0 7.0" it feels like we can draw shorter minimum strokes with the same code base than on a Galaxy Tab 2.0 10.1".
You should also be aware that (if you find a way to alter this value) it determines how Android distinguishes between taps and strokes: if your finger moves slightly while performing a tap, the gesture is interpreted as a tap if it moved less than TOUCH_SLOP, but as a stroke if it moved more. Therefore setting TOUCH_SLOP to a smaller value increases the chance that a tap will be interpreted as a stroke.
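The same tap-vs-stroke distinction is easy to reproduce in your own touch handling; a sketch, where downX/downY are assumed to have been cached on ACTION_DOWN inside a View subclass:
// Classify the gesture using the platform's density-scaled slop.
float dx = event.getX() - downX;
float dy = event.getY() - downY;
int slop = ViewConfiguration.get(getContext()).getScaledTouchSlop();
if (dx * dx + dy * dy > (float) slop * slop) {
    // Moved beyond the slop: the system treats this as a stroke/drag, not a tap.
}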
Our own conclusion is that this minimum distance is not something that can be changed in practice and is something we need to live with.
The problem is on Line 6549 in class View https://github.com/android/platform_frameworks_base/blob/master/core/java/android/view/View.java
if (!pointInView(x, y, mTouchSlop)) { ... }

/**
 * Utility method to determine whether the given point, in local coordinates,
 * is inside the view, where the area of the view is expanded by the slop factor.
 * This method is called while processing touch-move events to determine if the event
 * is still within the view.
 */
private boolean pointInView(float localX, float localY, float slop) {
    return localX >= -slop && localY >= -slop && localX < ((mRight - mLeft) + slop) &&
            localY < ((mBottom - mTop) + slop);
}
mTouchSlop is set in the constructor:
mTouchSlop = ViewConfiguration.get(context).getScaledTouchSlop();
You can extend View and set mTouchSlop to zero. I don't see another way to set it; there is no function like getApplicationContext().setScaledTouchSlop(int n).
Extend the View class.
Override the pointInView method without the "@Override" annotation and set the slop to 0:
public boolean pointInView(float localX, float localY, float slop) {
    slop = 0;
    // Use getWidth()/getHeight(): the point is in local coordinates,
    // matching the (mRight - mLeft) and (mBottom - mTop) checks above.
    return localX >= -slop && localY >= -slop && localX < (getWidth() + slop) &&
            localY < (getHeight() + slop);
}