I want to handle a press in a React Native app, so that:
When I touch the screen and move my finger across the screen,
the X and Y coordinates are shown in real time, so that as my finger moves, the coordinates change.
You can use react-native-gesture-handler and react-native-gesture-detector to detect custom gestures, which lets you create gestures on the fly. Yep, just plug it in, draw the gesture, and you will receive the coordinate data for your super-complex custom gesture. You can use those data points as a predefined gesture in your app, or even let your app users create their own custom gestures.
I want to be able to decompose a long gesture (the finger remains on the screen until the gesture ends) into multiple main points, in order to measure and store the pressure and velocity between each pair of them, as shown in the example gesture in Example.
I am also open to other approaches that would allow me to get a similar result.
I am trying to develop a game for Android using pygame.
It would be a platformer. To move the main character, I would make the game wait for mouse events (like pygame.MOUSEBUTTONDOWN).
In order to do that on mobile, I'd like to create a graphic representation of a joypad with arrow keys, shown in the bottom left corner of the screen.
Now, when the user touches one of the arrows, a MOUSEBUTTONDOWN event should be triggered and the character should move accordingly.
My question is: since the "joypad" object is a mere drawing, how can I link it to the event with pygame?
Is there a way to do so? Should I use the pixel coordinates of the arrow keys of the joypad or is there a better choice?
As far as I know this is not possible.
When handling input, mouse input and touch input have to be handled separately.
So, to answer the two questions you listed at the end:
As far as I know there is no way to implement this functionality.
You could use the pixel coordinates of the arrows. However, a better choice is to use Rects and test whether the position of the mouse/touch input is inside an arrow button's Rect with the collidepoint method.
You can achieve that as follows, where arrow_left is the pygame.Rect of the left arrow button:
arrow_left.collidepoint(mouse_x, mouse_y)
I hope this answer helped you!
I am wondering how to check whether the user is performing a gesture on a predefined area. My criterion is this: suppose I have a transparent graphic of the letter "B" on the screen; the user should fill it with color by gesture (with the touch of a finger), starting at the bottom corner of the B and ending the way we write a b with a pencil.
So how can I achieve this type of gesture, and how can I restrict the user to using the gesture only on that part of the screen? Please share any ideas or anything similar to it.
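One way to restrict the gesture to the glyph, sketched here under the assumption of an Android view (the question doesn't name the platform, and GlyphHitTester and its parameters are made-up names): build a Region from the letter's outline Path and test every incoming touch point against it.

import android.graphics.Path;
import android.graphics.Region;
import android.view.MotionEvent;

public class GlyphHitTester {
    private final Region glyphRegion = new Region();

    // glyphPath would be the outline of the letter "B";
    // viewWidth/viewHeight bound the clip region.
    public GlyphHitTester(Path glyphPath, int viewWidth, int viewHeight) {
        glyphRegion.setPath(glyphPath, new Region(0, 0, viewWidth, viewHeight));
    }

    // True only while the finger is inside the glyph.
    public boolean isInsideGlyph(MotionEvent event) {
        return glyphRegion.contains((int) event.getX(), (int) event.getY());
    }
}

Touch points for which isInsideGlyph() returns false can simply be ignored, so the fill only progresses while the finger stays inside the letter.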
The KitKat SDK supports a new type of scale gesture called quick scale. Instead of requiring two fingers to pinch zoom, the user can doubletap and swipe to scale a view. You can see this in action in the Chrome and Maps apps.
Both Chrome and Maps differentiate between a doubletap (which zooms into the relevant content area, as before) and a doubletap-swipe (which allows you to scale arbitrarily with one finger).
Under the hood, the ScaleGestureDetector uses a GestureDetector to detect doubletaps and start looking for the corresponding swipe.
My question is how to mimic Chrome and Maps, detecting both doubletaps and this doubletap-swipe gesture but not both at the same time. That is, I'd like to differentiate between a normal doubletap (no swipe) and a doubletap-swipe.
I have both a GestureDetector and a ScaleGestureDetector being fed all touch events on my view. Currently, both GestureListener.onDoubleTap() and ScaleGestureListener.onScaleBegin() fire when I do a doubletap-swipe. onDoubleTap() gets fired first, so there's no way to cancel handling events in the ScaleGestureListener.
I see two possible solutions, neither of which is very clean:
Copy the ScaleGestureDetector from the Android source and add a new callback to the ScaleGestureListener interface for something like onDoubleTapConfirmed() (doubletap, no swipe).
Add a small delay to onDoubleTap() so we handle the event X milliseconds after it gets triggered. If onScaleBegin() gets fired before the delay is up, cancel handling the onDoubleTap() event and start handling the scale instead.
What's the best way to handle this?
A bit late to this party, but quick scale is more like an unfinished double tap where the second tap turns into a drag, something like:
tap down, tap up, tap down, move
It seems your interpretation of quick scale is:
tap down, tap up, tap down, tap up, tap down, move
That makes your problem a lot easier: only register a plain double tap on the second action down. When the second action up happens, you just need to figure out whether a drag happened during the second tap cycle. In case the gesture recognizers don't work for you, you can do that manually by checking against getScaledTouchSlop, as in the sketch below.
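A minimal sketch of that manual check, assuming the listener is handed to a GestureDetector that receives the view's touch events; QuickScaleListener and its fields are names made up for this example, and it relies on onDoubleTapEvent(), which is called for the down, move, and up events of the second tap:

import android.content.Context;
import android.view.GestureDetector;
import android.view.MotionEvent;
import android.view.ViewConfiguration;

public class QuickScaleListener extends GestureDetector.SimpleOnGestureListener {
    private final int touchSlop;
    private float downX, downY;
    private boolean dragging;

    public QuickScaleListener(Context context) {
        touchSlop = ViewConfiguration.get(context).getScaledTouchSlop();
    }

    // Fires for every event of the second tap cycle: down, moves, up.
    @Override
    public boolean onDoubleTapEvent(MotionEvent e) {
        switch (e.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:
                downX = e.getX();
                downY = e.getY();
                dragging = false;
                break;
            case MotionEvent.ACTION_MOVE:
                if (!dragging && (Math.abs(e.getX() - downX) > touchSlop
                        || Math.abs(e.getY() - downY) > touchSlop)) {
                    dragging = true; // doubletap-swipe: hand off to the scale handling
                }
                break;
            case MotionEvent.ACTION_UP:
                if (!dragging) {
                    // plain doubletap (no swipe): zoom into the content area
                }
                break;
        }
        return true;
    }
}

Wire it up with new GestureDetector(context, new QuickScaleListener(context)) and forward the view's onTouchEvent() calls to the detector; since SimpleOnGestureListener implements OnDoubleTapListener, the detector registers it for double taps automatically.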
Also, copying the ScaleGestureDetector is probably not as bad as you think - GitHub is littered with "compat" projects that are basically backports of features in newer SDK versions.
I am trying to create an app that has you put your finger on a starting point (an area of 150dp x 150dp) and have it go through a specific pattern (like a maze).
How should I go about doing this? (I'm used to using the given buttons to do things)
Handle the onTouch event to get the coordinates of the touch.
The event handler receives an instance of the MotionEvent class.
The process to move the object (e.g. the current painting location) would be (sketched in code below):
b = get previous touch coordinates
a = get coordinates of current touch
move object from location b to a
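A minimal sketch of that process, assuming you attach it to whatever View you draw on; TouchTracker and moveObject() are hypothetical names:

import android.view.MotionEvent;
import android.view.View;

public class TouchTracker implements View.OnTouchListener {
    private float prevX, prevY; // b: previous touch coordinates

    @Override
    public boolean onTouch(View v, MotionEvent event) {
        float x = event.getX(); // a: current touch coordinates
        float y = event.getY();
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:
                prevX = x;
                prevY = y;
                return true; // claim the gesture so move events follow
            case MotionEvent.ACTION_MOVE:
                moveObject(prevX, prevY, x, y); // move object from b to a
                prevX = x;
                prevY = y;
                return true;
        }
        return false;
    }

    private void moveObject(float fromX, float fromY, float toX, float toY) {
        // hypothetical: update the current painting location here
    }
}

Attach it with myView.setOnTouchListener(new TouchTracker()); returning true from ACTION_DOWN is what keeps the follow-up ACTION_MOVE events coming.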
Alternatively, use the gestures API - http://developer.android.com/resources/articles/gestures.html