I have a custom sliding drawer that has essentially the same traits as an options menu. I won't get into why I'm not using an options menu, as it's beyond the scope of this question (long story short - it won't work).
The drawer sits on top of a ViewPager, so users can swipe between fragments to interact with the various aspects of the application. I want to be able to detect when a user interacts/engages with anything OUTSIDE OF the SlidingDrawer. If that happens, I want to automatically close the sliding drawer.
I've tried listeners, event listeners, gestures, gesture listeners, etc., and I cannot seem to get this magic potion to work. Anyone have any ideas/tips/tricks?
Have you tried overriding
public boolean dispatchTouchEvent(MotionEvent ev)
This method is inherited from Activity, so you should have access to it.
You didn't post any source code, so I can't tell for sure that it'll work. Please try it and let us know.
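For example, a minimal sketch of that approach, assuming the drawer is an android.widget.SlidingDrawer with the (hypothetical) id R.id.drawer:

@Override
public boolean dispatchTouchEvent(MotionEvent ev) {
    SlidingDrawer drawer = (SlidingDrawer) findViewById(R.id.drawer);
    if (drawer != null && drawer.isOpened() && ev.getAction() == MotionEvent.ACTION_DOWN) {
        Rect bounds = new Rect();
        drawer.getGlobalVisibleRect(bounds);
        // Close the drawer when the touch starts outside its visible bounds.
        if (!bounds.contains((int) ev.getRawX(), (int) ev.getRawY())) {
            drawer.animateClose();
        }
    }
    // Let the event continue to the ViewPager and fragments as usual.
    return super.dispatchTouchEvent(ev);
}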
The goal is to obtain the views that can be interacted with right now (views that can be clicked at this moment and something would happen). If a view is visible and clickable in general but covered by another view/menu/side panel, it should be omitted.
Voice Access does that, and it seems to use the Accessibility API.
The perfect example is the bottom menu in Google Maps. When it expands, the "Search along the route" button underneath is still visible, but it's not highlighted by the app.
So what do we have?
There is a stream of AccessibilityEvent. The most useful is AccessibilityEvent.TYPE_WINDOW_CONTENT_CHANGED, so we can be notified when something is happening.
With getSource() we can get an instance of AccessibilityNodeInfo that triggered the event.
Or we can get a root of a window with AccessibilityService.getRootInActiveWindow(). And having that we are able to traverse the whole hierarchy within an app.
AccessibilityNodeInfo doesn't provide any information about z-order of views, so it's not possible to understand what is above and what is beneath.
The bottom menu is in the same window (it's not modal).
If you try to click "Search along the route" button while the bottom menu is expanded, the bottom menu collapses. So you can't actually click it, it's beneath the menu.
I've looked through all the parameters of AccessibilityNodeInfo, like isVisibleToUser(), isClickable(), isContextClickable(), isSelected(), isFocusable(), isFocused(), isAccessibilityFocused(), and the button has the same parameters whether the bottom menu is collapsed or expanded. It's visible to the user, focusable and clickable.
I've looked into hidden APIs and don't see anything that can be useful.
What am I missing?
The key point is that in an AccessibilityService.onAccessibilityEvent() the tree hierarchy is not final. To get views that are interactable at the moment, AccessibilityService.getRootInActiveWindow() should be called with a delay.
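A rough sketch of that delay, inside an AccessibilityService subclass (the 500 ms value is just an example, not a magic number):

private final Handler handler = new Handler(Looper.getMainLooper());

@Override
public void onAccessibilityEvent(AccessibilityEvent event) {
    if (event.getEventType() == AccessibilityEvent.TYPE_WINDOW_CONTENT_CHANGED) {
        // The hierarchy may still be changing; re-schedule the check.
        handler.removeCallbacksAndMessages(null);
        handler.postDelayed(new Runnable() {
            @Override
            public void run() {
                AccessibilityNodeInfo root = getRootInActiveWindow();
                if (root != null) {
                    // Traverse the (now settled) hierarchy here.
                    root.recycle();
                }
            }
        }, 500);
    }
}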
AccessibilityNodeInfo#getDrawingOrder() will probably help you. Note that you need to do tree traversal to determine what is on top of what.
There are still corner cases with transparent views that will give you trouble, but that should get you 95% of the way there. We're working on a better answer for that case.
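For illustration, a hedged sketch of a sibling check using getDrawingOrder() (API 24+); the helper name is made up, and a complete solution also has to walk up the tree and compare ancestors:

// Returns true if an overlapping sibling is drawn on top of this node.
private boolean isCoveredBySibling(AccessibilityNodeInfo node) {
    AccessibilityNodeInfo parent = node.getParent();
    if (parent == null) return false;
    Rect nodeBounds = new Rect();
    node.getBoundsInScreen(nodeBounds);
    for (int i = 0; i < parent.getChildCount(); i++) {
        AccessibilityNodeInfo sibling = parent.getChild(i);
        if (sibling == null || sibling.equals(node)) continue;
        Rect siblingBounds = new Rect();
        sibling.getBoundsInScreen(siblingBounds);
        // Among overlapping siblings, the higher drawing order is drawn later, i.e. on top.
        if (Rect.intersects(nodeBounds, siblingBounds)
                && sibling.getDrawingOrder() > node.getDrawingOrder()) {
            return true;
        }
    }
    return false;
}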
I want to implement a swipe-from-bottom gesture like a Windows 8 tablet in Android, but I can't catch any touch that begins on the Android navigation bar.
Here is my code in MainActivity.java:
@Override
public boolean dispatchTouchEvent(MotionEvent ev) {
    Log.d("touch", ev.toString());
    return super.dispatchTouchEvent(ev);
}
Swiping from the left or right works fine, but nothing happens when I swipe from the navigation bar, or from the bottom edge on devices without a navigation bar.
I tested this on an Android 4.4.4 Tablet and Genymotion.
First things first: Android: Difference between onInterceptTouchEvent and dispatchTouchEvent?.
As stated by the top answer there, dispatchTouchEvent is actually defined on Activity, View and ViewGroup. Think of it as a controller that decides how to route touch events. With that in mind, we know that a touch event which starts on the navigation bar (i.e. when you put your finger down on the navigation bar, hold it and swipe up) never reaches dispatchTouchEvent.
If that was not the answer you were looking for, then I imagine you want to implement a menu that shows up when you swipe from the bottom up. This library should help you if that's what you want.
Edit: I found one more example that covers the whole subject of swipe gestures and may help you. Follow this link.
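If you only need the gesture inside your own window (touches that begin on the navigation bar itself never reach your app), something along these lines is a possible sketch; the edge and distance thresholds are arbitrary example values:

private float downY = -1f;

@Override
public boolean dispatchTouchEvent(MotionEvent ev) {
    int windowHeight = getWindow().getDecorView().getHeight();
    switch (ev.getActionMasked()) {
        case MotionEvent.ACTION_DOWN:
            // Remember the start position only if the gesture began near the bottom edge.
            downY = (ev.getY() > windowHeight - 100) ? ev.getY() : -1f;
            break;
        case MotionEvent.ACTION_UP:
            if (downY > 0 && downY - ev.getY() > 200) {
                Log.d("touch", "swipe up from the bottom edge");
                // Show your bottom menu here.
            }
            downY = -1f;
            break;
    }
    return super.dispatchTouchEvent(ev);
}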
Recently I implemented the SlidingMenu library. It is working fine, but the problem is that, by default, the secondary menu (the layout on the right), if not open, slides in when it's clicked.
What I want is for the click event to be passed on to its children, so that the views inside the secondary menu can receive clicks. The user can use a swipe gesture to open the secondary menu.
All the answers I've looked at are meant to disable the swipe gesture on the secondary menu, not the click. So it would be helpful if someone could provide some suggestions :)
You need to modify the library to change the default behavior of SlidingMenu.
You would have to go into the CustomViewAbove class and change the onInterceptTouchEvent method.
Remove mQuickReturn = true; in the MotionEvent.ACTION_DOWN case.
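Roughly, that spot looks like the sketch below (the surrounding code depends on your copy of the library; only the commented-out line is the change):

@Override
public boolean onInterceptTouchEvent(MotionEvent ev) {
    switch (ev.getAction() & MotionEvent.ACTION_MASK) {
        case MotionEvent.ACTION_DOWN:
            // ... the library's existing ACTION_DOWN handling ...
            // mQuickReturn = true;  // <-- remove (or comment out) this line so a tap
            //                       //     on the closed secondary menu no longer opens it
            break;
        // ... other cases stay unchanged ...
    }
    // Keep the library's original return logic here.
    return false;
}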
I implemented a SlidingDrawer in my app and everything works fine so far.
Then I made an invisible handle, just like the one in the Android main window.
Now I want it to be possible to grab the handle (and make it visible) just by moving the finger over the handle - the same behaviour as in the Android main window.
But I don't know how to manage that.
(At the moment it only works when you tap on the handle, not when you tap somewhere else and move your finger over the handle.)
Sounds like a bad idea. There will be no way to distinguish between the system's drawer and your app's drawer.
You want to detect a swipe motion that starts on the top bezel.
You want to use MotionEvent's getEdgeFlags() to detect whether a swipe gesture starts on the bezel.
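A minimal sketch, assuming the check happens in the Activity's dispatchTouchEvent (note that the framework does not reliably set edge flags on every device, so treat this as a starting point):

@Override
public boolean dispatchTouchEvent(MotionEvent ev) {
    if (ev.getAction() == MotionEvent.ACTION_DOWN
            && (ev.getEdgeFlags() & MotionEvent.EDGE_TOP) != 0) {
        // The touch began on the top bezel; start revealing the drawer handle here.
    }
    return super.dispatchTouchEvent(ev);
}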
I'm implementing a slide-from-top drawer custom widget based on some source code (AFAIR, from sephiroth). It works well as is, but I need a small layout (i.e. 3 buttons) in the handle of the drawer, so I had to modify the code a little. I'm using dispatchTouchEvent in the main onTouchEvent of the widget to propagate the necessary events down to the children of the handle view.
It works well when the drawer is closed (the handle and its buttons are at the top of the screen), but when it is open all the way down, the handle buttons stop working. I do get a proper response to touch from the handle layout frame (its color changes and I can close the drawer), though.
By accident I've realized that the handle buttons can be triggered by touching their old locations near the top of the screen! But their images are properly shown at the bottom and react to setText() properly. What's happening, and how do I fix it?
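For what it's worth, forwarding the widget's touches to the handle's children typically looks something like the sketch below (mHandle is a hypothetical field holding the handle view). MotionEvent coordinates are relative to the view that received the event, so once the handle has moved, the event usually has to be shifted into the handle's current coordinate space before re-dispatching; forwarding unadjusted coordinates matches the "buttons respond at their old location" symptom described above.

@Override
public boolean onTouchEvent(MotionEvent event) {
    MotionEvent forwarded = MotionEvent.obtain(event);
    // Shift the coordinates by the handle's current position so its children
    // see coordinates that match where they are drawn right now.
    forwarded.offsetLocation(-mHandle.getLeft(), -mHandle.getTop());
    boolean handled = mHandle.dispatchTouchEvent(forwarded);
    forwarded.recycle();
    return handled || super.onTouchEvent(event);
}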