jquery-ui draggable interfering with touch scroll - android

It's my understanding that jQuery UI ignores touch events entirely, and that people wanting touch support have had to use touch-punch to map touch events to mouse events. However, newer versions of UI seem to be interfering with built-in touch behavior such as scrolling, even though they should be ignoring those events.
I have jQuery UI draggable elements that work fine. However, when using a touch device and attempting to scroll the window with that element below the touch point, the screen won't scroll. If jQuery UI is ignoring touch events, why is this happening?
I've found that older versions of UI don't have this issue. The version I'm using is 1.11.4.
Just to reiterate: I'm not trying to make the draggable elements draggable on a touch device. I just want it to ignore the touch events and allow regular scrolling to happen.

It seems that the library itself is indeed ignoring touch events, but its CSS sets touch-action: none on draggable elements, which is what causes the behavior I'm seeing.
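If that diagnosis is right, one workaround is to override the stylesheet rule so the browser is allowed to pan the page again. A minimal sketch, assuming the elements carry jQuery UI's `ui-draggable-handle` class and that your own rule loads after the library's CSS (the selector may need adjusting for your markup):

```css
/* Re-enable vertical touch scrolling on draggable elements.
   jQuery UI's stylesheet sets `touch-action: none` here, which tells the
   browser not to handle any touch gestures itself. `pan-y` lets the
   browser keep vertical scrolling while other gestures still go to JS. */
.ui-draggable-handle {
    -ms-touch-action: pan-y;
    touch-action: pan-y;
}
```

Using `pan-y` rather than `auto` keeps horizontal gestures available to script while restoring normal page scrolling; add `!important` if the library's rule wins on specificity.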

Related

Can we produce a pinch gesture in the Android core system programmatically, without touching the screen, from a background service?

I have a question regarding the pinch gesture in the Android system. I have a task involving pinch zoom in and out in an Android app without touching the screen. In Android we can detect a pinch gesture, but can we produce the gesture programmatically, without touching?
You could, but you wouldn't do it that way. In your touch code, you'd be detecting a gesture. When that gesture is detected, you do something. Take that something and turn it into a function. Then, when you want to produce that gesture programmatically, just call that function instead. There's no need to do something complicated and liable to fail like faking a gesture. Just architect your code properly and it's all avoided.
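The architecture suggested above can be sketched in plain Java. The names here (`ZoomController`, `applyZoom`, the 1.25 factor) are hypothetical; in a real Android app the detection side would sit in something like a `ScaleGestureDetector.OnScaleGestureListener` callback:

```java
// Sketch: the gesture-detection path and the programmatic path both
// funnel into the same applyZoom() method, so no fake touch events
// are ever needed.
public class ZoomController {
    private double zoom = 1.0;

    // The single "something" that a detected pinch triggers.
    public void applyZoom(double scaleFactor) {
        zoom *= scaleFactor;
    }

    // Called from your touch code when a pinch is actually detected
    // (on Android, e.g. from ScaleGestureDetector's onScale callback).
    public void onPinchDetected(double scaleFactor) {
        applyZoom(scaleFactor);
    }

    // Programmatic path: just call the same function directly.
    public void zoomInProgrammatically() {
        applyZoom(1.25);
    }

    public double getZoom() {
        return zoom;
    }
}
```

Whatever triggers the zoom, the effect goes through one function, so "producing the gesture" from a background task reduces to an ordinary method call.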

MRTK Unity Android Scrolling by touch input is too slow

I am trying to make Scrolling Object Collection from MRTK Unity work with Android.
The problem is that scrolling is just too slow. When I swipe across the whole screen, the objects in the list barely move.
If I look at the collection, touch the screen, and then move my phone in the direction I want to scroll, everything works fine.
But that is not how I want users to scroll.
Any ideas how to make the touch behave the same way/have the same impact as moving the phone?

Android overlay intercepting touch events

I would like to build an overlay that intercepts and registers all touch events, but is otherwise transparent, letting the touch events (and all other gestures) reach the underlying apps.
What I've tried so far are the solutions proposed here:
Android overlay to grab ALL touch, and pass them on?
Android : Multi touch and TYPE_SYSTEM_OVERLAY
but I'm only able to get either a "transparent" overlay that passes all touch events through to the underlying apps, or an overlay that intercepts all touch events without letting them pass down.
From what I've read and tried so far, it seems to be a security limitation introduced in Android 4.x, and maybe this cannot be done at all.
I'm open to any kind of solution, so maybe there is a way of making this work with root permissions, or using the NDK, or something else entirely, as I'm quite new to Android programming.

Ignore Immersive mode swipe

With the game running in Android 4.4's fullscreen Immersive Mode, the user swipes from the edge of the screen to show the notification/status bar and the menu button bar.
However, these swipes are also passed to the game underneath as downward swipes.
Is there an official/clean way to ignore those swipes, other than hard-coding a specific area of the screen from which to ignore down swipes?
On KitKat, there is no way to do this. The system receives motion events in parallel with your application, not before it. This avoids introducing latency and gives apps/games the ability to implement gestures using these events.
Bear in mind that even if a gesture starts near the edge, it may not meet the requirements for an edge swipe by the time it ends. You may want to look at the gesture definition in AOSP if you are trying to avoid it.
I want exactly the same thing. I am developing a drawing app and want to turn on immersive mode, but when swiping, the touch event is passed through to my app, which results in a wrong drawing curve. This is not acceptable. It would be great if the system gesture just ate the event instead of dispatching it to my app.
I'd suggest firing a CANCEL event to the app once the system gesture is detected and fired.

Capturing gestures with accessibility features on (such as explore-by-touch)

We have an application which captures gestures (currently using the onTouch event callback), and it works great. Sadly, when accessibility features such as explore-by-touch are turned on, only some of the fingers are recognized by our application. We of course have reason to believe this is not due to a bug in our code. The visually-impaired and blind populations are very important to us, and gestures are even more important for them.
How can gestures be captured when accessibility features are enabled?
I haven't done this myself (disclaimer), but from the "Handling custom touch events" section in the Accessibility docs, it looks like you'll need to implement a "virtual view hierarchy" by overriding getAccessibilityNodeProvider (assuming you have some custom views, or you're overriding onTouch in built-in views, which has a similar net effect).
There is a good deal of info in the docs on that, and it works back to Android 1.6 via the support library. I'd look into all that first, get very familiar with detecting when the accessibility features are enabled and when they aren't, and react accordingly when they are.