How do I get the current touch coordinates?
In my case, I'm setting a timer and want to see if there was any movement during the elapsed time. When receiving a MotionEvent, it's easy enough to cache the x and y coordinates from the event, and as long as more touch events arrive, I can keep caching the latest coordinates. Then, finally, I compare the most recent values to the originals.
But this technique fails because, sometimes, some other view consumes future touch events and my view no longer gets them (it gets a "cancel"). Thus, the most recent cached value is "stale".
The obvious solution is to somehow query the system for the current touch coordinates (or none, if the user has let go). So far, I haven't found any such method.
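For reference, the caching approach described above is roughly this (a minimal sketch; the field names are illustrative):

    private float startX, startY;   // coordinates when the timer was started
    private float lastX, lastY;     // most recent coordinates this view has seen

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:
                startX = lastX = event.getX();
                startY = lastY = event.getY();
                break;
            case MotionEvent.ACTION_MOVE:
            case MotionEvent.ACTION_UP:
                lastX = event.getX();
                lastY = event.getY();
                break;
            case MotionEvent.ACTION_CANCEL:
                // Another view has taken over the gesture; from here on lastX/lastY go stale.
                break;
        }
        return true;
    }

The ACTION_CANCEL branch is exactly where the values go stale, which is the problem.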
Related
When a user starts a touch in a particular view and then moves the finger outside the view (while still in contact with the surface), the MotionEvent returned clamps the x/y value into coordinates inside the box.
That is, I won't get negative Y values if I move the finger above the view, for example.
Is there a way to get this raw value? getRawY seems to clamp the value as well.
I can't do anything with activities, since this is for an IME project, and the IME basically accepts a View from you and takes care of rendering etc.
Set the FLAG_WATCH_OUTSIDE_TOUCH on the window associated with your view. That should make it work.
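For what it's worth, from an InputMethodService the flag would presumably be added to the soft-input window, something like this (a sketch, not tested; the exact plumbing may differ by platform version):

    // getWindow() on an InputMethodService returns the soft-input Dialog;
    // its Window is where the flag goes.
    Window imeWindow = getWindow().getWindow();
    imeWindow.addFlags(WindowManager.LayoutParams.FLAG_WATCH_OUTSIDE_TOUCH);
    // A touch that starts outside the window should then arrive as a single
    // MotionEvent.ACTION_OUTSIDE event.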
I do have another way to do it, but you really don't want to use it (it involves the ugly hack of making a partially transparent full-screen keyboard), so let's try this first.
Here's the horrible hack, the method you don't want to use:
1) Add a transparent view above your keyboard, taking up the rest of the screen.
2) In onComputeInsets, define your touchable insets to be a REGION, and pass it a region made up of only your keyboard.
3) In onTouchEvent, on an ACTION_DOWN, set a flag that says we're in touch mode, and make sure to invalidate your view (this will cause onComputeInsets to be called again).
4) In onComputeInsets, look at the value of that touch-mode flag. If it's false, use the insets set in step 2. If it's true, set the entire screen as touchable.
5) In onTouchEvent, on an ACTION_UP or ACTION_CANCEL, clear that flag and call invalidate to reset the insets.
What that will do is tell the framework that touches on that transparent view do not belong to the keyboard (so they'll be sent to the app), unless you're in touch mode, in which case we tell the framework we want all touches anywhere. It's hacky, but I've done this kind of code before and it does work. It just requires you to play with onComputeInsets, which is almost undocumented (they added minimal documentation a few versions ago) and has probably been used by about 20 people in the world. It's an ugly API.
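For illustration, steps 2 and 4 together might look roughly like this (inTouchMode and keyboardView are illustrative names; only InputMethodService.Insets and its constants come from the framework):

    private boolean inTouchMode;   // set on ACTION_DOWN, cleared on ACTION_UP/ACTION_CANCEL

    @Override
    public void onComputeInsets(InputMethodService.Insets outInsets) {
        super.onComputeInsets(outInsets);
        if (inTouchMode) {
            // While a touch is in progress, claim the whole window as touchable.
            outInsets.touchableInsets = InputMethodService.Insets.TOUCHABLE_INSETS_FRAME;
        } else {
            // Otherwise only the keyboard region is touchable; touches on the
            // transparent view above it fall through to the app.
            outInsets.touchableInsets = InputMethodService.Insets.TOUCHABLE_INSETS_REGION;
            outInsets.touchableRegion.set(keyboardView.getLeft(), keyboardView.getTop(),
                    keyboardView.getRight(), keyboardView.getBottom());
        }
    }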
I need to know how I can make my activity know when a user touches the screen, double-taps it, or holds a finger on it for a long press on a single view, without using buttons.
Could someone briefly describe what I should work with here, and what logic to use?
You need to have some View or layout item that encompasses the screen, so that all the touch events are sent to it. From there, you need to attach a listener of some kind (touch or gesture, probably).
I would recommend either a GestureDetector (this supports double tapping and other gestures), or a basic touch listener (uses MotionEvent, which doesn't have double tap, but you could implement this yourself).
If you use MotionEvent, you can detect how long the finger has been down by comparing getEventTime() with getDownTime(). For a double tap, you can record the time of the last press (using Calendar or similar), then check whether the next press arrives within some small window (maybe 500 ms?).
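A minimal sketch of that manual timing in an Activity's onTouchEvent (the 500 ms and 600 ms values are just assumptions to tune):

    private static final long DOUBLE_TAP_MS = 500;   // assumption, tune to taste
    private static final long LONG_PRESS_MS = 600;   // assumption, tune to taste
    private long lastDownTime;

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:
                if (event.getDownTime() - lastDownTime <= DOUBLE_TAP_MS) {
                    // double tap
                }
                lastDownTime = event.getDownTime();
                return true;
            case MotionEvent.ACTION_UP:
                if (event.getEventTime() - event.getDownTime() >= LONG_PRESS_MS) {
                    // long press (the finger stayed down before lifting)
                }
                return true;
        }
        return super.onTouchEvent(event);
    }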
If you use GestureDetector, you can implement these differently. Take a look at this answer for more details (other answers in that thread provide alternatives as well). It also supports MotionEvent objects, so that shouldn't be a problem.
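And the GestureDetector version, e.g. in onCreate, assuming a rootView that covers the screen:

    final GestureDetector detector = new GestureDetector(this,
            new GestureDetector.SimpleOnGestureListener() {
                @Override
                public boolean onSingleTapConfirmed(MotionEvent e) {
                    // single tap
                    return true;
                }

                @Override
                public boolean onDoubleTap(MotionEvent e) {
                    // double tap
                    return true;
                }

                @Override
                public void onLongPress(MotionEvent e) {
                    // long press
                }
            });

    rootView.setOnTouchListener(new View.OnTouchListener() {
        @Override
        public boolean onTouch(View v, MotionEvent event) {
            // Feed every touch event into the detector, which fires the callbacks above.
            return detector.onTouchEvent(event);
        }
    });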
I am testing this sample on Android 3.2 (API level 13), and I have an issue when there are multiple touches on the screen.
When I first touch, I get an ACTION_DOWN event. If I then make another, simultaneous touch, I don't get another ACTION_DOWN; the first touch stays active and I keep getting ACTION_MOVE events from it.
The problem is that when I release the first touch and move the second, a line is drawn to that second touch, because it generates another ACTION_MOVE event.
I tried using Euclidean distance, but it seems to slow down too much, and makes the lines incomplete.
I tried creating a producer/consumer model, but still got the same problem.
I also tried checking the time from the last touch, but this is very inefficient.
Does anyone have any suggestions?
I admit I'm a little confused about what your actual problem is, but it seems like it might be solved simply by handling more of the touch events. When the second finger comes down, you will get an ACTION_POINTER_DOWN event, indicating a second touch, and you can then modify how your ACTION_MOVE events are interpreted while the second finger is down, until an ACTION_POINTER_UP event comes through. Hope that answers your question, or at least helps.
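As a rough sketch of that idea in a drawing view's onTouchEvent (activePointerId and lineTo are illustrative names), tracking which pointer owns the line avoids the jump when the first finger lifts:

    private int activePointerId = -1;   // id of the finger that is drawing

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        int index = event.getActionIndex();
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:
                activePointerId = event.getPointerId(index);   // first finger starts the line
                break;
            case MotionEvent.ACTION_POINTER_DOWN:
                // A second finger went down; ignore it (or start a separate line keyed by its id).
                break;
            case MotionEvent.ACTION_MOVE:
                int activeIndex = event.findPointerIndex(activePointerId);
                if (activeIndex != -1) {
                    lineTo(event.getX(activeIndex), event.getY(activeIndex));
                }
                break;
            case MotionEvent.ACTION_POINTER_UP:
                if (event.getPointerId(index) == activePointerId) {
                    activePointerId = -1;   // the drawing finger lifted; stop extending the line
                }
                break;
            case MotionEvent.ACTION_UP:
            case MotionEvent.ACTION_CANCEL:
                activePointerId = -1;
                break;
        }
        return true;
    }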
I am trying to develop a game for Android where I need to move a tile in all directions. My question is: how can I get the direction of the finger's motion on the screen (right, left, up) using getX() and getY()?
Thanks.
Take a look at getHistorySize(), getHistoricalX(int) and getHistoricalY(int).
When the finger is touching the screen, Android records the move positions and saves them. You can call getHistorySize to get the number of motions recorded, and then call getHistoricalX and getHistoricalY with a parameter less than the history size to get the x/y at that history position.
So, for example, you can call getHistoricalY with a position that refers to the previous motion sample, and then compare it to the current value. If the current one is bigger, the finger is swiping down (screen Y coordinates grow downward).
Note: historical coordinates are recorded only for ACTION_MOVE events.
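A small sketch of that comparison, assuming single-finger input:

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        if (event.getActionMasked() == MotionEvent.ACTION_MOVE) {
            int history = event.getHistorySize();
            if (history > 0) {
                // Compare the current position with the most recent recorded sample.
                float dx = event.getX() - event.getHistoricalX(history - 1);
                float dy = event.getY() - event.getHistoricalY(history - 1);
                if (Math.abs(dx) > Math.abs(dy)) {
                    // dx > 0: finger moving right; dx < 0: moving left
                } else {
                    // dy > 0: finger moving down; dy < 0: moving up (screen Y grows downward)
                }
            }
        }
        return true;
    }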
I want to move an image around the screen according to the trackball movement. I am able to capture the movements using the onTrackballEvent method. But this is being called for very small float values (I believe for each rotation?) of the trackball.
Now, since the X and Y positions of views must be integers when specified via LayoutParams, it makes no sense to move the view on every rotation. Instead, I want to move the view only after the user stops rotating the trackball.
Is there any method by which we can get whether the user stopped using the trackball or not?
If there isn't (I don't know Android), the classic approach is to start a timer in each event and keep track of the deltas it reports (by adding them to a running sum, one for X and one for Y). On any subsequent event, before the timer has fired, just restart the same timer, pushing its trigger further into the future.
When the user stops generating events, the timer will eventually trigger, and then you can apply the sum of all the movements, and clear the sums.
A suitable timeout must be chosen so that the user interaction is not perceived as being too sluggish. Perhaps something on the order of 200-300 milliseconds, or more. This is easy to tweak until it feels "right".
I've also gotten a lot of value out of setting a threshold: if the total movement in a specific direction adds up to a given amount, trigger a navigational event.
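A sketch of that restartable timer on Android, using a Handler (moveImageBy and the 250 ms timeout are just assumptions):

    private final Handler handler = new Handler(Looper.getMainLooper());
    private float sumX, sumY;   // running sums of the trackball deltas

    private final Runnable applyMovement = new Runnable() {
        @Override
        public void run() {
            // No trackball events for a while: apply the accumulated movement once.
            moveImageBy(sumX, sumY);   // hypothetical helper that updates the LayoutParams
            sumX = 0;
            sumY = 0;
        }
    };

    @Override
    public boolean onTrackballEvent(MotionEvent event) {
        sumX += event.getX();   // trackball events report small relative deltas
        sumY += event.getY();
        handler.removeCallbacks(applyMovement);    // restart the timer on every event
        handler.postDelayed(applyMovement, 250);   // ~250 ms of silence counts as "stopped"
        return true;
    }

The same sums can also be checked against a movement threshold before anything is triggered.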