Sliding finger across screen activating buttons as if they were pressed - Android

In case my title wasn't clear enough, I'll explain it in detail:
Say we have a screen filled with multiple buttons (10+), and we press down on one, activating onTouch/onClick. If we now move the finger, without lifting it, I want it to activate any other button it slides over. In this particular case, I want sound to be played when you slide over a virtual piano.
I know about the onTouchListener solution where you register every ACTION_MOVE and define boundaries that trigger new events, but that's far from optimal if you have multiple buttons and want to allow smooth sliding without delay.
I also read a thread which suggested combining the View's touchListener with a GestureListener from the Activity, but once again, this does not feel at all optimal for my situation.
I have not tried combining the touchListener with a GestureListener yet, but I will go ahead and do so if someone tells me there is no other way of doing this.

In my opinion, the proper way of doing this is to forget about buttons, and create a custom view which draws the entire keyboard. In this view you can handle touch events the way you like. You do not even need the gesture detector, just analyze the action and coordinates of motion events, it's very easy.
I don't understand what you mean about ACTION_MOVE and delays. To minimize delay, react on ACTION_DOWN, and then on ACTION_MOVE when the finger hovers over other keys while still down. It can't be any faster than that. With buttons there is a noticeable delay because the onClick() event is triggered when the user lifts the finger, on ACTION_UP.
Buttons are just not meant to work the way you describe. The idea is that if a user taps a button and then moves his finger away, it does not trigger onClick events on other views around it. This prevents bogus clicks.

I actually took the "easy" way out and used buttons with an onTouch method handling ACTION_DOWN and ACTION_MOVE, with coordinate calculations which, combined with event.getX() and event.getY(), let me detect which button is currently hovered. It's lag-free with 13 buttons.
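The coordinate calculation above can be sketched as a plain hit-test helper (a hypothetical sketch; the class and method names are made up, and in a real custom View you would call it from onTouchEvent() on ACTION_DOWN and ACTION_MOVE, triggering a sound whenever the returned index changes):

```java
// Maps a touch x-coordinate to a piano key index, assuming equally
// wide keys laid out side by side across the view's width.
public class KeyboardHitTest {
    private final int keyCount;
    private final float keyWidth;

    public KeyboardHitTest(int keyCount, float viewWidth) {
        this.keyCount = keyCount;
        this.keyWidth = viewWidth / keyCount;
    }

    /** Returns the key under x, or -1 if x is outside the keyboard. */
    public int keyAt(float x) {
        if (x < 0 || x >= keyWidth * keyCount) return -1;
        return (int) (x / keyWidth);
    }
}
```

Tracking the last returned index and comparing it to the current one on each ACTION_MOVE is what makes sliding between keys delay-free: no click event is ever waited for.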

Related

Android ChatHead limits touch to itself only, but I need to support a second finger simultaneously touching the area outside the chathead

http://www.piwai.info/chatheads-basics/
By following this good guide, I can make a chathead and also detect the touch event.
However, if I touch the chathead with a first finger and try to touch an area outside the chathead with a second finger, the second touch is not possible.
(The area outside can be the home screen, or another app or activity.)
Similarly, if I first touch the outside area and try to use a second finger to touch the chathead, it is not possible.
I tried the same interaction with the Facebook Messenger chathead and it behaves the same way.
My question is: is it possible to support the second touch?
Maybe using dispatchTouchEvent()? But as far as I know, dispatch is only for activities.
The chathead uses a Service and a Window.
Any help would be deeply appreciated!
Yes, it's possible using the following workaround.
Have a transparent layout surrounding your chathead. This transparent layout will intercept the touch and you can do the necessary handling.
You can then pass the touch event up the hierarchy/other apps by returning false from onTouchEvent().
To let the other apps handle touch events, the transparent view should only be activated when the user is already touching your chathead. This way you can make sure that the user is planning to do some gesture with your chathead.
This isn't possible using layouts manually added to the WindowManager as a system overlay when the underlying view is from a completely different hierarchy.
Once you start a touch event on the first view, all subsequent touch events will be sent to the same view hierarchy until ALL MotionEvents are finished (i.e. ACTION_UP or ACTION_CANCEL has occurred).
Basically, once you are interacting with one view hierarchy, any outside touches are interpreted as touches outside the current hierarchy, and ignore any underlying view hierarchies which may or may not occupy the same screen position.
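The behavior described above can be modeled outside the framework as a simple target lock (a plain-Java sketch with made-up names, not Android's actual dispatch code): the first hierarchy to receive a DOWN owns the whole gesture, and everything is routed to it until the gesture ends.

```java
// Minimal model of touch-target locking: the hierarchy that receives
// the initial DOWN owns every subsequent event, even from new fingers
// landing elsewhere, until the gesture ends with UP or CANCEL.
public class TouchTargetLock {
    public enum Action { DOWN, MOVE, UP, CANCEL }

    private String lockedHierarchy; // null = no active gesture

    /** Returns the hierarchy that receives the event, or null if dropped. */
    public String dispatch(String hierarchyAt, Action action) {
        if (lockedHierarchy == null) {
            if (action != Action.DOWN) return null; // no gesture to join
            lockedHierarchy = hierarchyAt;          // first DOWN locks the target
        }
        // Even a finger landing on another hierarchy is routed to the owner.
        String target = lockedHierarchy;
        if (action == Action.UP || action == Action.CANCEL) {
            lockedHierarchy = null; // gesture finished, release the lock
        }
        return target;
    }
}
```

This is why the second finger outside the chathead goes nowhere: it is delivered to (and effectively swallowed by) the overlay's hierarchy, not the app underneath.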

How to detect a screen touch that already exists when activity is created?

I have an app that is used outdoors, in all conditions. These are on rooted B&N Nook tablets running Android 2.1. They have optical touch detection, not pressure, so a large raindrop on the screen can "disable" the device, because it's being detected as a press, and then all other presses are not detected.
Part one: in the activity being used, I manually detect long (10 second) screen presses, consistent with a raindrop beginning a blocking press. I use dispatchTouchEvent() for this and it's fine.
Part two: So then I open a new activity and actually circle the rain drop and tell the user "wipe up this rain drop". The new activity opens fine, and I can successfully draw circles anywhere I please.
The trouble is the new activity does not receive any touch events for that very first press... the long press that hasn't stopped yet. Not getting a "new" ACTION_DOWN is understandable... I already grabbed that. If I lift my finger, though, there's no ACTION_UP either. After lifting the initial press, everything works fine: I can tap the screen, instantly a circle is drawn around the spot, and it will move if I drag my finger, so no problems there.
How do I get the initial press, the one that brought me here, that still exists? It must be some sort of polling API, not an event, since I really want the current state and I know the event has already been consumed. To be clear, NO events come out of dispatchTouchEvent() until I first take my finger off the screen (even the lift-off does not create a detectable event).
(I could grab the coordinates from the prior activity and pass it... but the trouble there is the rain drop could slide around during the 10 second wait period. And I would rather the activity be self contained in doing its job.)
I'm not sure you can (although I've never tried). I'm pretty sure the touch events are canceled as soon as the new Activity is opened.
Instead, you may decide to either use the Fragments API or simply open a new View on top of the View that's being touched.
The View that's receiving touch events will continue to do so until one of these events happen:
The method onTouchEvent() returns false. If it returns false at any point, the View will stop receiving touch events altogether. Meaning, if you return false on an ACTION_MOVE action, you will not receive an ACTION_UP action.
You receive an ACTION_CANCEL, which denotes the gesture has ended. Usually it means the touch left the view bounds, but it can happen for a number of reasons.
You receive an ACTION_UP, which means the last touching finger lifted from the View.
Views in the back will always receive touch events as long as Views in the front return false for the actions, which most do by default. So if you just pop up a new View on top of the View that's recording the touches, keep recording and pass the draw coordinates to the top View.
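That last point can be modeled as a front-to-back dispatch loop (plain Java with hypothetical names; on Android the equivalent is the overlay View's onTouchEvent() returning false so the View below keeps receiving the gesture):

```java
import java.util.List;

// Dispatch walks the views from front to back; the first handler that
// returns true consumes the event, and views behind it never see it.
// A front view that always returns false is effectively transparent.
public class TouchDispatch {
    public interface Handler { boolean onTouch(String event); }

    /** Returns the index of the view that consumed the event, or -1. */
    public static int dispatch(List<Handler> frontToBack, String event) {
        for (int i = 0; i < frontToBack.size(); i++) {
            if (frontToBack.get(i).onTouch(event)) return i;
        }
        return -1; // nobody consumed it
    }
}
```

So the circle-drawing overlay would observe coordinates and return false, while the original activity's view underneath continues to own the still-in-progress press.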

blinking in custom drag/tap implementation

I have implemented custom view with some children. The view can be scrolled using standard drag gesture. Also every child can be clicked. The problem is, that when I start dragging the view, one of children gets 'down' event and it changes its state to 'pressed' for a second. I would prefer standard listview behavior - the child goes into pressed state when the user keeps pressing this child with his/her finger for like 50ms. It would reduce blinking caused by misread press event.
I know that I need at least two events to detect whether the user is tapping or dragging the view. For now I'm using a TimerTask to schedule the 'down' event. When I get a 'move' event before my 'down' event is executed, I know that the user is dragging and I can cancel the scheduled event.
I know it's quite hacky. I also tried GestureDetector to detect drag and tap events, but it needs some additional work to properly implement changing the view state from pressed to default when the user moves the finger and starts to drag the view.
My question is: how is this implemented in the Android ListView? I tried to copy their solution from the ListView implementation, but it's so huge I can't handle it. I simply don't see the code responsible for handling this situation.
I managed to understand the gesture detection logic in ListView and, in general, in Android views. I wrote my own gesture detector, which is somewhat better than the original one. It reports more gestures (multiple taps, dragging) and has some configurable options (timeouts, move epsilon). You can find it open-sourced here: Better Gesture Detector on code.google
The library uses the Handler class and the postDelayed()/removeCallbacks() method combination to detect, handle and cancel motion events and gestures. It's quite simple, and one should be able to get the idea by just reading the code.
The repository also contains a simple demo. Please note that this code is provided 'as is', contains some useless comments and logs, and should be cleaned up a bit.
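The postDelayed()/removeCallbacks() pattern can be sketched outside Android with a ScheduledExecutorService (a hedged, hypothetical analogue; on Android you would use Handler itself): the pressed state is committed only if no 'move' arrives within the timeout, which is exactly what stops the blinking.

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.ScheduledFuture;
import java.util.concurrent.TimeUnit;

// Pressed state is applied only after a short delay; a drag that starts
// before the delay elapses cancels it, so the child never flashes pressed.
public class DelayedPressState {
    private final ScheduledExecutorService exec =
            Executors.newSingleThreadScheduledExecutor(r -> {
                Thread t = new Thread(r);
                t.setDaemon(true); // don't keep the JVM alive for this
                return t;
            });
    private ScheduledFuture<?> pending;
    public volatile boolean pressed = false;

    /** ACTION_DOWN: schedule entering the pressed state after timeoutMs. */
    public synchronized void onDown(long timeoutMs) {
        pending = exec.schedule(() -> { pressed = true; },
                timeoutMs, TimeUnit.MILLISECONDS);
    }

    /** ACTION_MOVE beyond the touch slop: it's a drag, cancel the press. */
    public synchronized void onDrag() {
        if (pending != null) pending.cancel(false);
    }
}
```

The ~50 ms timeout mentioned in the question would be the value passed to onDown(); Android's ViewConfiguration exposes similar tap timeouts.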

process touch events across multiple scrollviews

I have two ScrollViews side by side, and I want the user to be able to drag list items back and forth between the left and right ScrollViews. However, I can't find a way to handle the touch events. I can't set a touch listener for each ScrollView separately, as the drag gesture gets dropped when passing from one to the other. I tried creating an absolute layout over the top of both, which works from the drag-and-drop perspective, but it stops me from being able to scroll the ScrollViews. Is there a simple solution to this? Can anyone help me out?
Generally, an onTouchListener returns a boolean that indicates whether the touch has been handled. It's up to you to decide whether the touch was handled or not. When the user touches a View, Android will call its touch listener. If the touch listener returns true, Android regards the touch as handled and moves on. If the touch listener returns false, it will go up one level to the parent view (in this case, whatever your ScrollView is). Then the parent view's touch listener is called and must decide how to handle the touch. It will keep cascading up the parent views until a true is returned or until the root is reached.
In your case, you may have to decide what the user has to do in order to drag & drop vs. scroll. Perhaps the user must long-press an item before he/she can drag it, or something similar.
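The cascading described above can be modeled as a walk up the parent chain (a plain-Java sketch with made-up names, not Android's actual dispatch code): starting at the touched view, each listener gets a chance until one returns true or the root is passed.

```java
// Each view has an optional parent and a touch listener returning
// true (handled) or false (pass to the parent). Dispatch starts at
// the touched view and bubbles upward.
public class TouchBubbling {
    public interface Listener { boolean onTouch(); }

    public static class View {
        final String name;
        final View parent;       // null for the root
        final Listener listener;
        public View(String name, View parent, Listener listener) {
            this.name = name; this.parent = parent; this.listener = listener;
        }
    }

    /** Returns the name of the view that handled the touch, or null. */
    public static String dispatch(View touched) {
        for (View v = touched; v != null; v = v.parent) {
            if (v.listener.onTouch()) return v.name;
        }
        return null; // reached past the root without anyone handling it
    }
}
```

In the two-ScrollView setup, a list item returning false during a drag lets a shared ancestor handle the gesture across both sides, while returning true keeps normal scrolling working.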

Gesture problems with a ViewFlipper that contains a ListView

I have a ViewFlipper where one of the views is a ListView. To move back and forth between the views, I have a GestureListener that detects left and right swipes. Sometimes the left & right swipes interfere with the ListView. That is, when I want to switch to the next view by swiping left/right, I may accidentally click on an item in my list.
Is there a good way to prevent this interference?
Have a look at http://android-journey.blogspot.com/2010/01/android-gestures.html.
The SimpleGestureListener from this page is a great solution to gesture detection. When run in dynamic mode (the default), it intercepts touch events that are determined to be gestures to prevent them from performing other actions. Other touch events are not interfered with.
If you are only interested in swipe gestures I recommend disabling the code for detecting tapping and only listening for swipes.
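The swipe-vs-tap decision such a listener makes can be sketched as a pure threshold test (the constants and names here are illustrative, not Android's; GestureDetector's real onFling() supplies the velocities in px/s along with the down and up MotionEvents):

```java
// Classifies a completed gesture from its total displacement and the
// horizontal fling velocity. Thresholds are made-up example values.
public class SwipeClassifier {
    static final float MIN_SWIPE_DISTANCE = 120f; // px of horizontal travel
    static final float MAX_OFF_AXIS = 250f;       // px of vertical drift allowed
    static final float MIN_VELOCITY = 200f;       // px per second

    public enum Gesture { TAP, SWIPE_LEFT, SWIPE_RIGHT, NONE }

    public static Gesture classify(float dx, float dy, float velocityX) {
        if (Math.abs(dx) < MIN_SWIPE_DISTANCE
                || Math.abs(dy) > MAX_OFF_AXIS
                || Math.abs(velocityX) < MIN_VELOCITY) {
            // Not a horizontal fling; only tiny movements count as a tap.
            return (Math.abs(dx) < 10f && Math.abs(dy) < 10f)
                    ? Gesture.TAP : Gesture.NONE;
        }
        return dx < 0 ? Gesture.SWIPE_LEFT : Gesture.SWIPE_RIGHT;
    }
}
```

A ViewFlipper host would act only on SWIPE_LEFT/SWIPE_RIGHT (calling showNext()/showPrevious()) and let everything else fall through to the ListView, which is what prevents a swipe from registering as an accidental item click.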
If you want something a little snazzier than a ViewFlipper (something more like the Android home screen) try out this new addition to the Android compatibility libs: http://android-developers.blogspot.com/2011/08/horizontal-view-swiping-with-viewpager.html?m=1
