Unity3d - Touch Input issue - android

I am developing a game that needs the user to draw with a single touch to link multiple objects. Currently I am using the mouse button as touch input with this code:
if (Input.GetButton("Fire1"))
{
    mouseIsDown = true;
    ... // other code
}
It works fine with a mouse, but it causes problems when I build for a multi-touch device. On a multi-touch device, if you have two fingers down, the midpoint between them is taken as the input position. So if I am already touching an object and then put a second finger on the screen, it messes up my game input.
What I want is to limit input to one touch only: if a second finger touches the screen, ignore it. How can I achieve that?

Below is all you need:
if ((Input.touchCount == 1) && (Input.GetTouch(0).phase == TouchPhase.Began))
{
    mouseIsDown = true;
    ... // other code
}
It will only fire while exactly one finger is on the screen, because of the Input.touchCount == 1 check.
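If you also need to detect when that single finger moves or is lifted (while still ignoring any extra fingers), a minimal sketch along the same lines could look like this; mouseIsDown is assumed to be the same flag as in the question:
void Update()
{
    if (Input.touchCount == 1)
    {
        Touch touch = Input.GetTouch(0);

        if (touch.phase == TouchPhase.Began)
        {
            mouseIsDown = true;  // the first (and only) finger went down
        }
        else if (touch.phase == TouchPhase.Ended || touch.phase == TouchPhase.Canceled)
        {
            mouseIsDown = false; // that finger was lifted
        }
        // TouchPhase.Moved / Stationary: keep drawing while mouseIsDown is true
    }
    // Input.touchCount > 1: extra fingers are simply ignored, as requested in the question
}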

You are using Input.GetButton, but a button is not a touch. You probably want to look at Input.GetTouch.
If you need to support multiple input-type devices, you may want to consider scripting your own manager to abstract this out somewhat.
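For example, a very small abstraction might look like the sketch below; the PointerInput class and its methods are illustrative names, not an existing Unity API:
using UnityEngine;

// Hypothetical helper that hides the mouse/touch difference from the rest of the game.
public static class PointerInput
{
    // True while the primary pointer (mouse button or first finger) is held down.
    public static bool IsDown()
    {
        if (Input.touchSupported && Input.touchCount > 0)
        {
            TouchPhase phase = Input.GetTouch(0).phase;
            return phase != TouchPhase.Ended && phase != TouchPhase.Canceled;
        }
        return Input.GetMouseButton(0);
    }

    // Current screen position of the primary pointer.
    public static Vector2 Position()
    {
        if (Input.touchSupported && Input.touchCount > 0)
            return Input.GetTouch(0).position;

        return (Vector2)Input.mousePosition;
    }
}
The rest of the game then only calls PointerInput.IsDown() and PointerInput.Position(), regardless of the device.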

I would roll with a bit of polling!
So in your Update() I typically do something like this:
foreach (Touch touch in Input.touches)
{
    // Get the finger id of the current touch if you're doing multi-touch.
    int pointerID = touch.fingerId;

    if (touch.phase == TouchPhase.Began)
    {
        // Do something off of a touch.
    }
}
If you're looking for more info check this out:
TouchPhases in Unity
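For reference, each of the phases can be handled inside that same loop, roughly like this:
void Update()
{
    foreach (Touch touch in Input.touches)
    {
        switch (touch.phase)
        {
            case TouchPhase.Began:      // finger touched the screen this frame
                break;
            case TouchPhase.Moved:      // finger moved since last frame
                break;
            case TouchPhase.Stationary: // finger is down but has not moved
                break;
            case TouchPhase.Ended:      // finger was lifted this frame
                break;
            case TouchPhase.Canceled:   // the system cancelled tracking of this touch
                break;
        }
    }
}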

Related

How can I make a custom Input Manager in Unity for Android

So I'm making a Pong game in Unity and I wanted to port it to Android, but the problem is I don't really know how to make it work on mobile. The idea was to just tap on a side of the screen for the paddle to move, but Unity's Input Manager doesn't really support things like that. I heard about custom input managers, but the tutorials just haven't helped, so I need to ask you. https://i.stack.imgur.com/uB9q0.png
You need to check for touches, get the touch, and update the transform of whatever object you want to move.
Relevant documentation:
Touch
Touch.phase
Transform
Camera.ScreenToWorldPoint
if (Input.touchCount > 0)
{
    // Get the first touch, since the touch count is greater than zero.
    Touch touch = Input.GetTouch(0);

    if (touch.phase == TouchPhase.Stationary || touch.phase == TouchPhase.Moved)
    {
        // Convert the screen-space touch position to a world-space point.
        Vector3 touchedPos = Camera.main.ScreenToWorldPoint(new Vector3(touch.position.x, touch.position.y, 10));

        // Lerp the position of the current object towards the touch, smoothly over time.
        transform.position = Vector3.Lerp(transform.position, touchedPos, Time.deltaTime);
    }
}
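For the tap-a-side-of-the-screen paddle movement described in the question, a minimal sketch could simply compare the touch position against the middle of the screen; the PaddleTouchControl name, the paddleSpeed value, and the movement axis are assumptions, not part of the asker's project:
using UnityEngine;

public class PaddleTouchControl : MonoBehaviour
{
    // Hypothetical speed value; tune for your paddle.
    public float paddleSpeed = 5f;

    void Update()
    {
        if (Input.touchCount > 0)
        {
            Touch touch = Input.GetTouch(0);

            // Right half of the screen moves the paddle one way, left half the other.
            float direction = touch.position.x > Screen.width / 2f ? 1f : -1f;
            transform.Translate(Vector3.up * direction * paddleSpeed * Time.deltaTime);
        }
    }
}
Attach it to the paddle and swap Vector3.up for whatever axis your paddle actually moves along.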

How to handle an onTouch event in Unity?

I added a UI Button object and attached a C# script with a public function.
To the button I added an Event Trigger component, set up the events (Pointer Click and Pointer Down), and pointed them to my function public void onClick().
On PC the code works, but when I build the game for Android and touch the object, the code does not work.
How do I handle an onTouch event?
I think OnMouseDown checks every frame whether there is mouse input (it behaves like Update), so on mobile you have to check for touches in Update instead. With touches you also get more control, e.g. TouchPhase to detect whether the touch began, was lifted, moved, etc.
You need to check if (Input.touchCount > 0):
void Update()
{
    if (Input.touchCount > 0)
    {
        print("a touch exists");

        if (Input.GetTouch(0).phase == TouchPhase.Began)
        {
            print("Touch began");
        }
        if (Input.GetTouch(0).phase == TouchPhase.Ended)
        {
            print("Touch ended");
        }
    }
}
Inside that touchCount check you can then check the touch phase.
Touches aren't Clicks
In order to handle touch input you need to check Input.touchCount and then query each touch with Input.GetTouch. Note that each Touch has an ID that is unique per finger and consistent across frames.
There are no easy OnClick-like methods for touches, as touches can be a lot more complex (tap, long tap, drag, etc.), so you will have to check inside Update() and handle the conversion from touch data to mouse analogs yourself.
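As a rough illustration of that conversion, the sketch below treats a touch that begins and ends over the same collider as a "click"; the TouchClickDetector class and the OnTouchClick method are assumed names, not a Unity API:
using UnityEngine;

public class TouchClickDetector : MonoBehaviour
{
    private int trackedFingerId = -1; // -1 while no candidate touch is active

    void Update()
    {
        for (int i = 0; i < Input.touchCount; i++)
        {
            Touch touch = Input.GetTouch(i);

            if (touch.phase == TouchPhase.Began && IsOverThisObject(touch.position))
            {
                trackedFingerId = touch.fingerId;
            }
            else if (touch.phase == TouchPhase.Ended && touch.fingerId == trackedFingerId)
            {
                trackedFingerId = -1;
                if (IsOverThisObject(touch.position))
                {
                    OnTouchClick(); // begin + end over the same object counts as a click
                }
            }
        }
    }

    bool IsOverThisObject(Vector2 screenPos)
    {
        Ray ray = Camera.main.ScreenPointToRay(screenPos);
        return Physics.Raycast(ray, out RaycastHit hit) && hit.transform == transform;
    }

    void OnTouchClick()
    {
        Debug.Log("Touched " + name);
    }
}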

Moving a game object in ARCore

I have been working with Google ARCore and got stuck on how to move the game object with the input coming from the Android device.
The canvas I created has four buttons that use the AxisTouchButton script from Cross Platform Input, covering the vertical and horizontal axes. I have tried Lean Touch to scale, translate and rotate, and that works perfectly. But when I apply force or velocity to the game object, it moves correctly the first time; then, when I use the axis buttons again, it starts to float in that direction until another button is pressed.
The code below handles movement of the game object, attached to the Andy prefab in the HelloAR example scene:
Vector3 offset = Vector3.zero;
offset.x = CrossPlatformInputManager.GetAxis("Horizontal");
offset.z = CrossPlatformInputManager.GetAxis("Vertical");
rb.velocity = offset * speed;
I'm not sure why your prefab is drifting with the code snippet you've provided.
Try resetting the velocity to zero once you are done moving the prefab:
rb.velocity = Vector3.zero;
Or it may be because you are moving the prefab too far away from its parent anchor, or away from the plane detected by ARCore.
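For the first suggestion, a minimal sketch is below; it only applies a velocity while one of the axis buttons is actually held, and zeroes it otherwise (the AndyMover class name is an assumption; rb and speed are the same fields as in the question):
using UnityEngine;
using UnityStandardAssets.CrossPlatformInput;

public class AndyMover : MonoBehaviour
{
    public Rigidbody rb;     // the prefab's rigidbody, as in the question
    public float speed = 1f; // hypothetical speed value, tune as needed

    void FixedUpdate()
    {
        Vector3 offset = Vector3.zero;
        offset.x = CrossPlatformInputManager.GetAxis("Horizontal");
        offset.z = CrossPlatformInputManager.GetAxis("Vertical");

        if (offset.sqrMagnitude > 0.0001f)
        {
            rb.velocity = offset * speed; // drive the prefab while a button is held
        }
        else
        {
            rb.velocity = Vector3.zero;   // stop drifting as soon as the input is released
        }
    }
}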
But I have another, tested way to move a prefab using touch input on the planes detected by ARCore. Since it only lets you move the prefab on the detected planes, you can easily reset its anchor after you are done repositioning the prefab.
I modified the HelloARController.cs script in the following way:
bool move = false; // toggled by some button in your UI

void Update()
{
    // Call MoveObject() while the move flag is set (e.g. by a button).
    if (move)
    {
        MoveObject();
    }
}

void MoveObject()
{
    if (Input.touchCount == 1)
    {
        Touch touch = Input.GetTouch(0);
        TrackableHit hit;
        TrackableHitFlags raycastFilter = TrackableHitFlags.PlaneWithinPolygon | TrackableHitFlags.FeaturePointWithSurfaceNormal;

        if (Frame.Raycast(touch.position.x, touch.position.y, raycastFilter, out hit))
        {
            if ((hit.Trackable is DetectedPlane) &&
                Vector3.Dot(firstPersonCamera.transform.position - hit.Pose.position, hit.Pose.rotation * Vector3.up) < 0)
            {
                Debug.Log("Hit at back of the current detected plane");
            }
            else
            {
                // KEY CODE SNIPPET: move selectedObject to the touch location on the detected plane.
                selectedObject.transform.position = hit.Pose.position;
            }
        }
        else
        {
            Debug.Log("Not moving");
        }
    }
}
Here selectedObject is your Andy prefab, or whatever you are instantiating.
Make sure you are instantiating only one prefab at a time and keep a reference to it in selectedObject.
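For example, inside the existing placement code in HelloARController you could keep that reference roughly like this (AndyPrefab and the single-instance check are assumptions, not part of the original script):
if (selectedObject == null)
{
    // Anchor the new instance to the trackable it was placed on.
    var anchor = hit.Trackable.CreateAnchor(hit.Pose);
    selectedObject = Instantiate(AndyPrefab, hit.Pose.position, hit.Pose.rotation);
    selectedObject.transform.parent = anchor.transform;
}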
Try out the new ARCore Manipulation System. It works like a charm (for newbies).
They forgot to add a collider on the prefab, so don't forget to add it before running the example.
ARCore Unity SDK v1.13.0

Unity3D touch event binding

I have created a button in my 2D game using Unity3D and added a Box Collider 2D named PADBASE to detect touch events this way:
if (Input.touchCount > 0)
{
    for (int i = 0; i < Input.touchCount; i++)
    {
        Vector3 mouseWorldPos3D = Camera.main.ScreenToWorldPoint(Input.GetTouch(i).position);
        Vector2 mousePos2D = new Vector2(mouseWorldPos3D.x, mouseWorldPos3D.y);
        Vector2 dir = Vector2.zero;
        RaycastHit2D hit = Physics2D.Raycast(mousePos2D, dir);
        Touch t = Input.GetTouch(i);

        if (hit.transform != null)
        {
            if (Physics2D.Raycast(hit.transform.position, hit.transform.forward))
            {
                GameObject recipient = hit.transform.gameObject;

                if (t.phase == TouchPhase.Began) // touch began
                {
                    // Button clicked.
                    // Change colour to red to visually show that the button is being pressed.
                }
                else if (t.phase == TouchPhase.Ended)
                {
                    // Change colour back to the default to show it is no longer pressed.
                }
            }
        }
    }
}
Let's assume I change the button colour to red when the player touches the button, and back to its default colour when the finger is released (for example).
Now this will obviously only work if the finger is still inside the bounds of the box collider when the player releases it. What I am trying to do is "bind" the touch so that I still catch the touch events (moved or ended) even if the player slides the finger outside the collider without releasing it (for example accidentally).
I am looking forward to some suggestions, thanks.
In my game I will have multiple buttons, so multi-touch is necessary.
Solved; it actually turns out to be really easy: you need to store and compare the touch finger ID.
Solution (a sketch follows below):
- On touch began: check whether the touch is within the game object's box collider, as in the code attached above, then get the touch finger ID and store it on your button (or keep a reference to it).
- On touch ended: compare the touch finger ID with your stored ID; if they are equal, set the stored finger ID back to -1 and run whatever code should execute on touch ended.
- On touch moved or stationary: do the same finger ID comparison as for touch ended and run whatever code should execute when the player slides or moves the finger, but do not modify your stored finger ID variable.
That's it; it works well with multiple buttons and multi-touch.
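A minimal sketch of that approach (the TouchButton class name, the colour comments, and the hit test are illustrative, not the asker's actual code):
using UnityEngine;

public class TouchButton : MonoBehaviour
{
    private int trackedFingerId = -1; // -1 means no finger currently owns this button

    void Update()
    {
        for (int i = 0; i < Input.touchCount; i++)
        {
            Touch t = Input.GetTouch(i);

            if (t.phase == TouchPhase.Began && trackedFingerId == -1 && TouchHitsThisButton(t.position))
            {
                trackedFingerId = t.fingerId; // remember which finger pressed this button
                // e.g. change the colour to red here
            }
            else if (t.fingerId == trackedFingerId &&
                     (t.phase == TouchPhase.Moved || t.phase == TouchPhase.Stationary))
            {
                // The finger still owns the button, even if it has slid off the collider.
            }
            else if (t.fingerId == trackedFingerId &&
                     (t.phase == TouchPhase.Ended || t.phase == TouchPhase.Canceled))
            {
                trackedFingerId = -1; // release ownership
                // e.g. change the colour back to the default here
            }
        }
    }

    bool TouchHitsThisButton(Vector2 screenPos)
    {
        Vector2 worldPos = Camera.main.ScreenToWorldPoint(screenPos);
        RaycastHit2D hit = Physics2D.Raycast(worldPos, Vector2.zero);
        return hit.transform == transform;
    }
}
Because each button only reacts to the finger ID it stored, several buttons can be pressed at once without interfering with each other.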

Android will treat fast single touch as move?

I wrote a very simple Android application that lets me draw on the pad. Touch the screen with a finger and you will see a green ball; move your finger and you will see a red line.
But I found something very strange: if I touch the screen with two fingers one after the other very quickly, it draws a line between them! (Imagine pressing two keys on the keyboard in rapid alternation: jkjkjkjkjkjk.)
The key code is pretty simple:
public boolean onTouch(View v, MotionEvent event) {
    int action = event.getAction();
    switch (action & MotionEvent.ACTION_MASK) {
        case MotionEvent.ACTION_DOWN:
            multiTouch = false;
            id = event.getPointerId(0);
            PointF p = getPoint(event, 0);
            path = new Path();
            path.moveTo(p.x, p.y);
            paths.add(path);
            points.add(copy(p));
            break;
        case MotionEvent.ACTION_POINTER_DOWN:
            multiTouch = true;
            for (int i = 0; i < event.getPointerCount(); i++) {
                int tId = event.getPointerId(i);
                if (tId != id) {
                    points.add(getPoint(event, i));
                }
            }
            break;
        case MotionEvent.ACTION_MOVE:
            if (!multiTouch) {
                p = getPoint(event, 0);
                path.lineTo(p.x, p.y);
            }
            break;
    }
    invalidate();
    return true;
}
The full source is here: https://github.com/freewind/TouchTest/blob/master/src/com/example/MyImageView.java
And it's a working demo: https://github.com/freewind/TouchTest
Or you can just download the signed apk on your android device, and test it yourself: https://github.com/freewind/TouchTest/blob/master/TouchTest.apk?raw=true
You can see in my code that I check whether it is a multi-touch gesture and disable drawing in that case.
My Android version is 4.0, and my code targets 2.3.3.
Here is a picture from my Android pad:
You can see there are some lines, but there should not be; there should be a green ball to the left of the red line instead.
I'm not sure why Android treats a fast single touch as moving. I considered three possible reasons:
My code has something wrong.
The Android SDK has something wrong.
My Android pad has something wrong, e.g. it misses an ACTION_DOWN event.
How can I find out the real reason?
UPDATE
One of my friends tested this app on his Android phone (Android 2.1) and found there is no red line; another used Android 2.3.5 and found there are red lines.
Please review my code: I detect multi-touch via ACTION_POINTER_DOWN and do nothing on ACTION_MOVE if there is more than one pointer, so the pointer id is not needed. (Actually, my first version of this code used the id and had the same issue.)
And I don't think this is expected behavior, because it makes developing touch programs hard. I found this issue because in another application of mine (where the user can drag/zoom/rotate an image with their fingers), the image sometimes "jumps" on screen.
I even tried a popular game (Fruit Ninja) on my Android pad and on an iPod touch, and found the Android version has the issue but the iPod touch does not.
Now I'm sure something is wrong (an ACTION_UP event is missing when the first finger lifts), but I still don't know what causes it. My Android pad, or the Android SDK?
That is the way it works for multi-touch. When you press quickly, Android handles it as a gesture and you end up with two pressed pointers. To avoid it, try handling ACTION_UP, or use ACTION_POINTER_DOWN instead.
You can check the id of the touch, so that you handle only the first touch.
Alternatively, you can monitor all touches and handle them together.
