Will Android treat a fast single touch as a move?

I wrote a very simple Android application that lets me draw on the screen. Touch the screen with a finger and you will see a green ball; move your finger and you will see a red line.
But I found a very strange thing: if I touch the screen with two fingers one after another very fast, it draws a line between them! (Imagine pressing two keys quickly in alternation, like typing jkjkjkjkjkjjkjkjkjkjkjkj on a keyboard.)
The key code is pretty simple:
public boolean onTouch(View v, MotionEvent event) {
    int action = event.getAction();
    switch (action & MotionEvent.ACTION_MASK) {
        case MotionEvent.ACTION_DOWN:
            multiTouch = false;
            id = event.getPointerId(0);
            PointF p = getPoint(event, 0);
            path = new Path();
            path.moveTo(p.x, p.y);
            paths.add(path);
            points.add(copy(p));
            break;
        case MotionEvent.ACTION_POINTER_DOWN:
            multiTouch = true;
            for (int i = 0; i < event.getPointerCount(); i++) {
                int tId = event.getPointerId(i);
                if (tId != id) {
                    points.add(getPoint(event, i));
                }
            }
            break;
        case MotionEvent.ACTION_MOVE:
            if (!multiTouch) {
                p = getPoint(event, 0);
                path.lineTo(p.x, p.y);
            }
            break;
    }
    invalidate();
    return true;
}
The full source is here: https://github.com/freewind/TouchTest/blob/master/src/com/example/MyImageView.java
And here is a working demo: https://github.com/freewind/TouchTest
Or you can just download the signed APK to your Android device and test it yourself: https://github.com/freewind/TouchTest/blob/master/TouchTest.apk?raw=true
As you can see in my code, I check whether the gesture is multi-touch and disable drawing in that case.
My Android version is 4.0, and my build target is 2.3.3.
Here is a screenshot from my Android tablet:
You can see there are some lines, but there should not be; there should be a green ball to the left of each red line instead.
I'm not sure why Android treats a fast single touch as a move. I can think of three possible reasons:
Something is wrong with my code
Something is wrong with the Android SDK
Something is wrong with my Android tablet, e.g. it misses an ACTION_DOWN event
How can I find out the real reason?
UPDATE
One of my friends tested this app on his Android phone (Android 2.1) and found there were no red lines; another used Android 2.3.5 and did see red lines.
Please review my code: I detect multi-touch via ACTION_POINTER_DOWN, and I do nothing on ACTION_MOVE if there is more than one pointer, so the pointer id is not needed. (Actually, in my first version of this code I did use the id, but I had the same issue.)
And I don't think this is expected behavior, because it makes developing touch-based programs hard. I found this issue because in another application of mine (where the user can drag/zoom/rotate an image with their fingers), the image sometimes "jumps" on screen.
I even tried a popular game (Fruit Ninja) on my Android tablet and on an iPod touch, and found that the Android version has the issue but the iPod touch doesn't.
Now I'm sure something is wrong (an ACTION_UP event is missing when the first finger lifts), but I still don't know what causes it. My Android tablet? Or the Android SDK?

That is the way it works for multi-touch. When you press quickly, Android handles it as a gesture and you end up with two pressed pointers. To avoid it, try handling ACTION_UP, or use ACTION_POINTER_DOWN instead.
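A minimal sketch of that suggestion, extending the switch from the question (multiTouch and path are the fields from the question's code): the extra cases reset the state once fingers lift, so a quick second tap starts a new stroke instead of extending the old one.
case MotionEvent.ACTION_POINTER_UP:
    // a secondary finger lifted; stay in multi-touch mode until all fingers are up
    break;
case MotionEvent.ACTION_UP:
    // the last finger lifted: clear the flag so the next ACTION_DOWN starts a new Path
    multiTouch = false;
    break;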

You can check the id of the touch, so that you handle only the first touch.
Alternatively, you can monitor all touches and handle them together.
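For instance, a rough sketch of handling only the first finger by remembering its pointer id (activePointerId is a hypothetical field initialised to -1, not something from the question's code):
public boolean onTouch(View v, MotionEvent event) {
    switch (event.getActionMasked()) {
        case MotionEvent.ACTION_DOWN:
            // remember the first finger and start the stroke there
            activePointerId = event.getPointerId(0);
            break;
        case MotionEvent.ACTION_MOVE:
            int index = event.findPointerIndex(activePointerId);
            if (index != -1) {
                // extend the stroke only with the first finger's coordinates:
                // event.getX(index), event.getY(index)
            }
            break;
        case MotionEvent.ACTION_UP:
        case MotionEvent.ACTION_CANCEL:
            // forget the finger so the next ACTION_DOWN starts a new stroke
            activePointerId = -1;
            break;
    }
    return true;
}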

Related

Unity 2D collisions with prefabs not working on phone

I'm having a bit of a problem with OnTriggerEnter when I'm using my mobile as a test device.
I have some touch code that successfully lets me drag objects around the screen.
I am then having the objects collide with other objects on the screen.
This was working perfectly until I turned the objects into prefabs. (I need to do this because the objects are randomly generated at runtime.)
Now I can still move the objects around the screen, but they no longer collide with the other objects, which are also prefabs. It does, however, still work fine when running on my laptop in the Unity editor.
All my objects have colliders on them with trigger checked, and the moving objects have rigidbodies.
OnTriggerEnter code:
public void OnTriggerEnter(Collider other)
{
    Debug.Log("here");
    Debug.Log(this.gameObject.tag + "is this");
    Debug.Log(other.gameObject.tag + "is other");
    if (this.gameObject.tag == other.gameObject.tag)
    {
        Debug.Log("here2)");
        Reftomanager.miniGameScore++;
        Reftomanager.updateScore();
        Destroy(this.gameObject);
    }
}
Touch code:
if (Input.touchCount > 0)
{
    Touch touch = Input.GetTouch(0);
    switch (touch.phase)
    {
        case TouchPhase.Began:
            Ray ray = Camera.main.ScreenPointToRay(touch.position);
            if (Physics.Raycast(ray, out hit))
            {
                thisObject = hit.collider.gameObject;
                touchPos = Camera.main.ScreenToWorldPoint(touch.position);
                if (thisObject.name != "circle")
                {
                    draggingMode = true;
                }
            }
            break;
        case TouchPhase.Moved:
            if (draggingMode)
            {
                touchPos = Camera.main.ScreenToWorldPoint(touch.position);
                newCentre = touchPos;
                thisObject.transform.position = touchPos;
            }
            break;
        case TouchPhase.Ended:
            draggingMode = false;
            break;
    }
}
I'm completely stumped so any help would be amazing.
Thanks
Just got this same error recently. I suggest using
if (other.gameObject.CompareTag("YourTagName"))
Also, if you recently added a tag or edited any tags, I found that Unity has a bug where your tags will not register in your Android build unless you restart Unity.
GL.
Since you're using 3D colliders, is it possible that the positions you are assigning are different? Touch.position is a Vector2, which means ScreenToWorldPoint would use 0 for z. If you are using a Vector3 with a z value other than 0 to get the world point in the editor (standalone input), it could give you a different value even if x and y are the same.
Another possibility is that a platform-specific error is happening somewhere else in the code, upon object instantiation. Your movement code would still work fine if it isn't in the same MonoBehaviour.
If you have an Android device, you can use Android Monitor with the Unity tag to check for error messages.

Android How to detect swipe between two touch points on image view

I have an ImageView with a number of touch points on it. It is basically an app that detects a swipe between two touch points and does not allow the user to swipe to any other point or in any other direction. It should constrain the user to swiping only between the two touch points.
Just take a look at the following picture:
Now the user should start swiping from point 1 to point 2. If the swipe is not started from point 1, it should not color the path between point 1 and point 2.
But if the user successfully swipes between point 1 and point 2, the swipe between point 2 and point 3 should then be enabled. Thus the user should go through point 1 to 2, point 2 to 3, point 3 to 4, and point 4 to 5 to complete round 1.
Please tell me how to achieve this functionality. I know about gestures, gesture overlays, etc., but none of them fits my requirement, as they use general touch events and gesture directions.
Please suggest a way to achieve this, and keep in mind that I want this app to run on all types of devices, so I cannot simply hard-code the x,y values.
Edit (on demand):
I am posting a link to an app on the Play Store that has the same functionality, but I do not know how they achieve it.
https://play.google.com/store/apps/details?id=al.trigonom.writeletters
If each touch point can be created as an individual view (e.g. an ImageView), then you can create an inViewInBounds() function.
I used code from here to create code where I needed to detect finger-press movement over multiple ImageViews:
Rect outRect = new Rect();
int[] location = new int[2];

// Determine whether a touch movement is currently over a given view.
private boolean inViewInBounds(View view, int x, int y) {
    view.getDrawingRect(outRect);
    view.getLocationOnScreen(location);
    outRect.offset(location[0], location[1]);
    return outRect.contains(x, y);
}
To use this function, on a parent view to all those child touch points, set the Click and Touch Listeners:
// This might look redundant but is actually required: "The empty OnClickListener is required to keep OnTouchListener active until ACTION_UP"
parentView.setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View v) {}
});

// All the work gets done in this function:
parentView.setOnTouchListener(new View.OnTouchListener() {
    @Override
    public boolean onTouch(View v, MotionEvent event) {
        int x = (int) event.getRawX();
        int y = (int) event.getRawY();
        // ** myTouchPoint might be an array that you loop through here...
        if (inViewInBounds(myTouchPoint, x, y)) doLogic(myTouchPoint);
        return false;
    }
});
The code above only shows how to detect when one of your views is 'touched'.
If none are 'touched' but a view is 'active' (e.g. when a touch is detected, set a variable like viewLastTouched = myTouchPoint), then you would call something like a drawingLine(viewLastTouched, x, y) function to do whatever is needed to draw the line and/or detect boundaries, etc., as sketched below.
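A rough sketch of that idea, using hypothetical names (touchPointViews, viewLastTouched, drawingLine) that are not part of the code above:
parentView.setOnTouchListener(new View.OnTouchListener() {
    @Override
    public boolean onTouch(View v, MotionEvent event) {
        int x = (int) event.getRawX();
        int y = (int) event.getRawY();
        for (View touchPoint : touchPointViews) {
            if (inViewInBounds(touchPoint, x, y)) {
                // remember the point the finger is currently on
                viewLastTouched = touchPoint;
                return false;
            }
        }
        if (viewLastTouched != null) {
            // the finger is between points: draw/validate the line from the
            // last touched point towards the current finger position
            drawingLine(viewLastTouched, x, y);
        }
        return false;
    }
});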
They are not using native Android Java code to build this app.
The app is running with this code:
import Runtime.MMFRuntime;
public class Main extends MMFRuntime {
}
This in turn is from https://github.com/ClickteamLLC/android/blob/master/docs/index.md
This is used to package apps/games written with http://www.clickteam.com/clickteam-fusion-2-5

Unity3d - Touch Input issue

I am developing a game that needs the user to draw with a single touch to link multiple objects. I am currently using the mouse button as touch input with this code:
if (Input.GetButton("Fire1"))
{
    mouseIsDown = true;
    ... // other code
}
It works fine with a mouse, but it gives me problems when I compile for a multi-touch device. On a multi-touch device, if you have two fingers down, the middle point between them is taken as the input. So if I have already touched an object and then put a second finger down on the screen, it messes up my game input and wreaks havoc on the design.
What I want is to limit it to one touch only. If a second finger touches the screen, ignore it. How can I achieve that?
Below is all you need:
if ((Input.touchCount == 1) && (Input.GetTouch(0).phase == TouchPhase.Began)) {
    mouseIsDown = true;
    ... // other code
}
It will only fire when one finger is on the screen because of Input.touchCount == 1
You are using Input.GetButton. A button is not a touch. You probably want to look at Input.GetTouch
If you need to support multiple input-type devices, you may want to consider scripting your own manager to abstract this out somewhat.
I would go with a bit of polling!
So in your Update() I typically do something like this:
foreach (Touch touch in Input.touches)
{
    // Get the touch id of the current touch if you're doing multi-touch.
    int pointerID = touch.fingerId;
    if (touch.phase == TouchPhase.Began)
    {
        // Do something off of a touch.
    }
}
If you're looking for more info check this out:
TouchPhases in Unity

Is there an offset between onTouchEvent and onTouchListener?

I have developed a game that shoots when the player touches the screen, using an onTouchListener with my custom SurfaceView and Thread.
But now I want to change the approach: instead of the onTouchListener, I added onTouchEvent to the SurfaceView of my Activity.
The problem is that I get some kind of offset when I click on the emulator.
Everything is working great, except that offset keeps appearing and I don't understand why.
Also, let me mention that my app runs in landscape mode; maybe this is relevant.
I suspect that it isn't working properly because the onTouchListener was added to the view and depends on it, but onTouchEvent doesn't depend on the view.
Also, my view doesn't have any padding or margin properties; it is a full-screen view (fill_parent).
Does anyone have any ideas on this?
I have finished my application and everything works correctly now, but I still do not know what the problem was.
After a lot of debugging, onTouchEvent returned Y values that were always higher than the ones onTouchListener returned, and I am not sure why this is happening, since the view that receives the onTouchListener is a full-screen view.
So I figured out a way to get past this by doing some math.
The first method Android calls is onTouch, which gives the correct values, but only for one touch. I needed correct values on MotionEvent.ACTION_MOVE as well, and I noticed that MotionEvent.ACTION_MOVE actually behaves consistently relative to the first touch recognized by onTouchEvent.
So I take the Y coordinate from onTouch and the offset Y coordinate from onTouchEvent, calculate the difference, and in every onTouchEvent from then on, until the user lifts their finger, I subtract that difference, which gives me the correct value.
If anyone else has this problem and doesn't know how to fix it, here is my code; maybe it will be helpful.
@Override
public boolean onTouchEvent(MotionEvent arg1) {
    /* you can only touch when the thread is running */
    if (game.state() != STATE_PAUSE) {
        if (arg1.getAction() == MotionEvent.ACTION_DOWN) {
            coordinateX = arg1.getX();
            coordinateY = arg1.getY();
            // measure the constant offset between onTouchEvent and onTouch coordinates
            differenceY = Math.abs(coordinateY - touchedY);
            coordinateY = coordinateY - differenceY;
            shootingIsOkay = true;
            game.setDrawCircle(coordinateX, coordinateY);
        }
        if (arg1.getAction() == MotionEvent.ACTION_MOVE) {
            coordinateX = arg1.getX();
            coordinateY = arg1.getY();
            // compensate with the offset measured on ACTION_DOWN
            coordinateY = coordinateY - differenceY;
            shootingIsOkay = true;
            game.setDrawCircle(coordinateX, coordinateY);
        }
        if (arg1.getAction() == MotionEvent.ACTION_UP) {
            shootingIsOkay = false;
        }
    }
    return false;
}
And the onTouch method, which is called from the onTouchListener attached to the view, is simple:
@Override
public boolean onTouch(View arg0, MotionEvent arg1) {
    touchedY = arg1.getY();
    return false;
}
If anyone knows how to fix this problem, please post your answer; I am very curious why this is happening.

Android multi-touch interference

I'm currently developing an air-hockey simulation for Android. For the multiplayer mode I'm tracking two touch events on the screen, which works well as long as the touch points don't get too close.
When the two fingers get too close, Android only recognizes one touch event, in the middle of the two points.
To make it even worse, Android sometimes mixes up the IDs after the collision.
I have already thought about estimating the next touch points and assigning IDs manually. Does anybody know a better way, or know of somebody who has already solved this problem programmatically?
NOTE: I'm testing on a Samsung Galaxy S 3
Not necessarily a logical fix to the issue, nevertheless a possible solution to the application:
If I'm not completely mistaken, air-hockey games shouldn't allow opponents to intrude on each other's half of the field. If we assume a thick border across the center of the screen (in portrait mode), then I wouldn't be allowed to do anything beyond that border, hence there is no point in tracking my finger after it reaches the border line.
Constraining your tracked touch events to valid physical locations as described might help you ignore invalid points (given that the physical locations don't intersect, that is).
You might also have to keep track of the direction of the touch vector: if the vector stretches from the center of the screen towards "your end", it might be the opponent's intruding finger or your own returning finger. In neither case should it affect the hockey puck (perhaps).
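As an illustration of that idea, a minimal sketch (assuming a portrait screen split at half the view height; the method name is hypothetical) that rejects touches falling on the opponent's half:
// returns true if the touch y-coordinate lies in the given player's half of the screen
private boolean isValidForPlayer(float y, boolean topPlayer, int viewHeight) {
    float border = viewHeight / 2f;
    return topPlayer ? y < border : y > border;
}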
It may depend on the device you are using, but I'm using the code below on a Huawei X5 and it never mixes up fingers, even if they touch each other or I twist them over the screen.
// note: each array slot must be initialised with a new PointF() before use
private static PointF touchScreenStartPtArr[] = new PointF[10];
private static PointF touchScreenStopPtArr[] = new PointF[10];
private static PointF touchScreenCurrPtArr[] = new PointF[10];

OnTouchListener onTouchListenerMulti = new OnTouchListener() {
    @Override
    public boolean onTouch(View v, MotionEvent event) {
        int action = event.getAction() & MotionEvent.ACTION_MASK;
        int pointerIndex = (event.getAction() & MotionEvent.ACTION_POINTER_INDEX_MASK)
                >> MotionEvent.ACTION_POINTER_INDEX_SHIFT;
        int fingerId = event.getPointerId(pointerIndex);
        switch (action) {
            case MotionEvent.ACTION_DOWN:
            case MotionEvent.ACTION_POINTER_DOWN:
                touchScreenStartPtArr[fingerId].x = event.getX(pointerIndex);
                touchScreenStartPtArr[fingerId].y = event.getY(pointerIndex);
                break;
            case MotionEvent.ACTION_UP:
            case MotionEvent.ACTION_POINTER_UP:
            case MotionEvent.ACTION_CANCEL:
                touchScreenStopPtArr[fingerId].x = event.getX(pointerIndex);
                touchScreenStopPtArr[fingerId].y = event.getY(pointerIndex);
                break;
            case MotionEvent.ACTION_MOVE:
                // ACTION_MOVE reports all active pointers, so look up the id of each index
                int pointerCount = event.getPointerCount();
                for (int i = 0; i < pointerCount; i++) {
                    int id = event.getPointerId(i);
                    touchScreenCurrPtArr[id].x = event.getX(i);
                    touchScreenCurrPtArr[id].y = event.getY(i);
                }
                break;
        }
        return true;
    }
};
Note that I'm using fingerId (the pointer id) and not the pointer index to identify the correct finger, as the pointer index may change when one finger is released.
Hope it works for you.
Here's the way I see it.
The touchscreen hardware gives you a resolution below which two touches are the same as one. This is something you cannot change.
Now the question is: what to do when two touches merge? (This is something that can be tested for programmatically, one would think; e.g. two touch points become one touch point, AND the previous touch point 1 is close enough to the previous touch point 2.) In your case, I would move both pucks along the merged touch gesture until they separate, then return individual control.
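For example, a rough sketch of that merge test (prev0 and prev1 being the last known positions of the two pointers; MERGE_DISTANCE is an assumed threshold, not something defined above):
private boolean touchesMerged(MotionEvent event, PointF prev0, PointF prev1) {
    // two fingers were being tracked but the event now reports only one pointer...
    if (event.getPointerCount() != 1 || prev0 == null || prev1 == null) {
        return false;
    }
    // ...and their last known positions were close enough to have fused into one touch
    float dx = prev0.x - prev1.x;
    float dy = prev0.y - prev1.y;
    return Math.sqrt(dx * dx + dy * dy) < MERGE_DISTANCE;
}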
Of course, I see several problems with this, like which touch controls which puck after the split? Maybe one person lifted their finger during the merge.
You could have both players lose control of their puck if a merge occurs. This could simulate the shock to the wrist as your hand bashes into your opponent's :)
I also like @dbm's idea.
Hope this helps. Probably didn't :)
