I want to implement a swipe-from-bottom gesture in Android, like on a Windows 8 tablet, but I can't catch any touch that begins on the Android navigation bar.
Here is my code in MainActivity.java:
@Override
public boolean dispatchTouchEvent(MotionEvent ev) {
    Log.d("touch", ev.toString());
    return super.dispatchTouchEvent(ev);
}
Swiping from the left or right edge works fine, but nothing happens when I swipe from the navigation bar, or from the bottom edge on devices without a navigation bar.
I tested this on an Android 4.4.4 tablet and on Genymotion.
First things first: Android: Difference between onInterceptTouchEvent and dispatchTouchEvent?.
As stated by the best answer there, dispatchTouchEvent is actually defined on Activity, View and ViewGroup. Think of it as a controller that decides how to route touch events. With that in mind, any touch event that starts on the navigation bar (i.e. when you put your finger down on the navigation bar, hold it and swipe up) will never be seen by dispatchTouchEvent: the navigation bar belongs to the system UI, not to your activity's window.
If that was not the answer you were looking for, then I imagine you want to implement a menu that shows up when you swipe up from the bottom. This library should help you if that is what you want.
Edit: I found one more example that covers the entire swipe-gesture subject and may help you. Follow this link.
Related
I'm trying to build an accessibility service whose purpose is to insert a "layer" between the user's touch on the screen and the tap that gets delivered.
For example: when I touch the screen, I want a double tap to be performed at the precise position I touched.
I think, though I'm really open to suggestions, that I will have to create an invisible layout covering the whole screen, where I can attach an onTouchListener to get the position, then use my accessibility service to create the gesture and pass the touch through the layout so it can click anywhere.
So far I have only found a solution for Android 4.1 or lower.
I also want to use a kind of cursor; the app Open Sesame does this well, and its cursor can go over the navigation bar and interact with it.
I also found the open-source project Eva facial mouse, but it doesn't perform complex gestures and doesn't go over the navigation bar.
So my big question is: am I on the right track in wanting to create an invisible layout that detects touches even on the navigation bar, and can someone point my search in the right direction?
I succeeded in putting an overlay layout over the status bar; you just need to add the right flags to your LayoutParams.
In my case I use: FLAG_FULLSCREEN, FLAG_LAYOUT_IN_SCREEN and FLAG_LAYOUT_NO_LIMITS.
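For reference, a minimal sketch of such LayoutParams. The window type here is an assumption (pick TYPE_ACCESSIBILITY_OVERLAY or TYPE_APPLICATION_OVERLAY depending on your service and API level), and overlayView is a placeholder for your own view:

```java
import android.graphics.PixelFormat;
import android.view.Gravity;
import android.view.WindowManager;

// Sketch: a full-screen overlay window whose layout is allowed to
// extend into the status/navigation bar areas via the three flags
// mentioned above. Window type is an assumption, not from the source.
WindowManager.LayoutParams params = new WindowManager.LayoutParams(
        WindowManager.LayoutParams.MATCH_PARENT,
        WindowManager.LayoutParams.MATCH_PARENT,
        WindowManager.LayoutParams.TYPE_ACCESSIBILITY_OVERLAY,
        WindowManager.LayoutParams.FLAG_FULLSCREEN
                | WindowManager.LayoutParams.FLAG_LAYOUT_IN_SCREEN
                | WindowManager.LayoutParams.FLAG_LAYOUT_NO_LIMITS,
        PixelFormat.TRANSLUCENT);
params.gravity = Gravity.TOP | Gravity.START;

WindowManager wm = (WindowManager) getSystemService(WINDOW_SERVICE);
wm.addView(overlayView, params);
```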
In an Android application, I need to do the following:
I have a view that acts like a menu; I can slide it up from the bottom.
However, I want it to have three states: closed, half-opened and opened.
Please refer to this pic:
The blue section is the handle; the user will use it to slide up/down or fling.
This is a bit similar to the Android 5 notification bar, where one drag down reveals half the view and a second drag down reveals the whole view.
I could start implementing this in the onTouch of the blue section, but that would require handling a lot of cases, especially distinguishing flings from slow drags.
Is there any easier way or library that can help?
Thank you very much
I implemented a SlidingDrawer in my app and everything works fine so far.
Then I made an invisible handle, just like the one in the Android main window.
Now I want it to be possible to grab the handle (and make it visible) just by moving a finger over it, the same behaviour as in the Android main window.
But I don't know how to manage that.
(At the moment it only works when you tap the handle itself, not when you tap somewhere else and move your finger over the handle.)
Sounds like a bad idea: there will be no way to distinguish between the system's drawer and your app's drawer.
You want to detect a swipe motion that starts on the top bezel.
You can use MotionEvent's getEdgeFlags() to detect whether a swipe gesture starts on the bezel.
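A minimal sketch of that check, assuming a View subclass (note that many devices never actually set these edge flags, so verify on your target hardware):

```java
// Sketch: detecting a swipe that starts on the top bezel.
// getEdgeFlags() is only meaningful on the initial ACTION_DOWN the
// system flags as starting at a screen edge, so check it there.

private boolean trackingBezelSwipe;  // illustrative state field

@Override
public boolean onTouchEvent(MotionEvent event) {
    if (event.getAction() == MotionEvent.ACTION_DOWN
            && (event.getEdgeFlags() & MotionEvent.EDGE_TOP) != 0) {
        // The gesture began on the top bezel; start tracking the swipe.
        trackingBezelSwipe = true;
        return true;
    }
    return super.onTouchEvent(event);
}
```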
I have a custom sliding drawer that basically has the same traits as an options menu. I won't get into why I'm not using an options menu, as it's beyond the scope of this question (long story short: it won't work).
The drawer sits on top of a ViewPager, so users can swipe between various fragments to interact with different aspects of the application. I want to detect when a user interacts with anything OUTSIDE OF the SlidingDrawer, and automatically close the drawer when that happens.
I've tried listeners, event listeners, gestures, gesture listeners, etc and I cannot seem to get this magic potion to work. Anyone have any ideas/tips/tricks?
Have you tried overriding the
public boolean dispatchTouchEvent(MotionEvent ev)
This method is inherited from Activity, so you should have access to it.
You didn't post any source code, so I can't tell for sure that it'll work. Please try it and let us know.
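A rough sketch of the idea, in the activity (mSlidingDrawer is a placeholder name for your drawer view):

```java
import android.graphics.Rect;
import android.view.MotionEvent;

// Sketch: close the drawer whenever a touch goes down outside its
// bounds. isOpened() and animateClose() are standard SlidingDrawer
// methods; the hit-test here is simplified and assumes the drawer's
// parent fills the window.
@Override
public boolean dispatchTouchEvent(MotionEvent ev) {
    if (mSlidingDrawer.isOpened()
            && ev.getAction() == MotionEvent.ACTION_DOWN) {
        Rect bounds = new Rect();
        mSlidingDrawer.getHitRect(bounds);
        if (!bounds.contains((int) ev.getX(), (int) ev.getY())) {
            mSlidingDrawer.animateClose();
        }
    }
    return super.dispatchTouchEvent(ev);
}
```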
I'm implementing a slide-from-top drawer custom widget from some source code (AFAIR, from sephiroth). It works well as is, but I need a small layout (i.e. 3 buttons) in the handle of the drawer, so I had to modify the code a little. I'm using dispatchTouchEvent inside the widget's main onTouchEvent to propagate the necessary events down to the children of the handle view.
It works well when the drawer is closed (the handle and its buttons are at the top of the screen), but when it is open all the way down, the handle buttons stop working. I do, however, get a proper touch response from the handle's layout frame (its color changes and I can close the drawer).
By accident I realized that the handle buttons can be triggered by touching their old locations near the top of the screen! Yet their images are properly drawn at the bottom and react to setText() properly. What's happening, and how do I fix it?