Android overlay intercepting touch events

I would like to build an overlay that intercepts and logs all touch events, yet remains essentially transparent, letting the touch events (and all other gestures) reach the underlying apps.
What I've tried so far are the solutions proposed here:
Android overlay to grab ALL touch, and pass them on?
Android : Multi touch and TYPE_SYSTEM_OVERLAY
but I only manage to get either a "transparent" overlay that passes all touch events through to the underlying apps, or an overlay that intercepts all touch events without letting them pass down.
From what I've read and tried so far, this seems to be a security limitation introduced in Android 4.x, and maybe it cannot be done at all.
I'm open to any kind of solution: maybe there is a way to make this work with root permissions, or with the NDK, or something else entirely, as I'm quite new to Android programming.
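For reference, the closest a plain (non-root) overlay seems to get is a pass-through window that is merely notified that a touch started outside it. Below is a minimal sketch, assuming a service that already holds the SYSTEM_ALERT_WINDOW permission; the service name is made up, and on recent Android versions the reported coordinates may be obscured for security:

    import android.app.Service;
    import android.content.Intent;
    import android.graphics.PixelFormat;
    import android.os.IBinder;
    import android.util.Log;
    import android.view.MotionEvent;
    import android.view.View;
    import android.view.WindowManager;

    // Hypothetical service illustrating a pass-through overlay.
    public class TouchWatcherService extends Service {

        @Override
        public void onCreate() {
            super.onCreate();
            WindowManager wm = (WindowManager) getSystemService(WINDOW_SERVICE);

            // FLAG_NOT_TOUCHABLE lets every touch fall through to the apps below;
            // FLAG_WATCH_OUTSIDE_TOUCH delivers a single ACTION_OUTSIDE event per
            // gesture, so you learn that a touch happened, but not its full path.
            WindowManager.LayoutParams params = new WindowManager.LayoutParams(
                    WindowManager.LayoutParams.MATCH_PARENT,
                    WindowManager.LayoutParams.MATCH_PARENT,
                    WindowManager.LayoutParams.TYPE_SYSTEM_OVERLAY, // TYPE_APPLICATION_OVERLAY on API 26+
                    WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE
                            | WindowManager.LayoutParams.FLAG_NOT_TOUCHABLE
                            | WindowManager.LayoutParams.FLAG_WATCH_OUTSIDE_TOUCH,
                    PixelFormat.TRANSLUCENT);

            View overlay = new View(this) {
                @Override
                public boolean onTouchEvent(MotionEvent event) {
                    if (event.getAction() == MotionEvent.ACTION_OUTSIDE) {
                        // On newer Android versions the coordinates of outside
                        // touches may be zeroed for security reasons.
                        Log.d("TouchWatcher", "outside touch at "
                                + event.getX() + "," + event.getY());
                    }
                    return false;
                }
            };

            wm.addView(overlay, params);
        }

        @Override
        public IBinder onBind(Intent intent) {
            return null;
        }
    }

This tells you that touches are happening, but, as the linked questions conclude, there is no supported way to record the full gesture and still pass it through.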

Related

Can we produce a pinch gesture programmatically in the Android core system, without touching the screen, from a background service?

I have a question regarding pinch gestures in the Android system. My task is to pinch zoom in and out in an Android app without touching the screen. In Android we can detect a pinch gesture, but can we produce the gesture programmatically, without touching?
You could, but you wouldn't do it that way. In your touch code you'd be detecting a gesture; when that gesture is detected, you do something. Take that something and turn it into a function. Then, when you want to produce the gesture programmatically, just call that function instead. There's no need for something complicated and failure-prone like faking a gesture: architect your code properly and it's all avoided.
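As a hedged illustration of that advice (all names here are invented for the example), both the real pinch and the programmatic path funnel into the same applyZoom() method:

    import android.content.Context;
    import android.view.MotionEvent;
    import android.view.ScaleGestureDetector;
    import android.widget.ImageView;

    // Hypothetical helper: the pinch detector and the programmatic caller
    // both end up in applyZoom(), so no fake touch events are needed.
    public class ZoomController {

        private final ImageView target;              // the view being zoomed (assumption)
        private final ScaleGestureDetector detector;
        private float scale = 1f;

        public ZoomController(Context context, ImageView target) {
            this.target = target;
            this.detector = new ScaleGestureDetector(context,
                    new ScaleGestureDetector.SimpleOnScaleGestureListener() {
                        @Override
                        public boolean onScale(ScaleGestureDetector d) {
                            // Touch path: a real pinch ends up here...
                            applyZoom(scale * d.getScaleFactor());
                            return true;
                        }
                    });
        }

        // Forward touch events from the view's onTouchEvent().
        public boolean onTouchEvent(MotionEvent event) {
            return detector.onTouchEvent(event);
        }

        // ...and the programmatic path simply calls the same method.
        public void applyZoom(float newScale) {
            scale = Math.max(0.5f, Math.min(newScale, 4f)); // clamp, arbitrary limits
            target.setScaleX(scale);
            target.setScaleY(scale);
        }
    }

Calling controller.applyZoom(2f) then zooms exactly as a two-finger pinch would.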

jquery-ui draggable interfering with touch scroll

It's my understanding that jQuery UI totally ignores touch events, and people wanting touch support have had to use touch-punch to map touch events onto mouse events. However, the newer versions of UI seem to be interfering with built-in touch behaviour such as scrolling, even though they should be ignoring those events.
I have jQuery UI draggable elements that work fine. However, when using a touch device and attempting to scroll the window with that element below the touch point, the screen won't scroll. If jQuery UI is ignoring touch events, why is this happening?
I've found that older versions of UI don't have this issue. The version I'm using is 1.11.4.
Just to re-iterate: I'm not trying to make the draggable elements draggable on a touch device. I just want it to ignore the touch events and allow regular scrolling to happen.
It turns out the library does ignore the touch events, but its CSS sets touch-action: none on draggable elements, and that rule is what causes the behaviour I'm seeing (overriding it, e.g. with touch-action: auto, restores normal scrolling).

How to record touch movements on Android keyboard?

I apologize if this is a stupid question. I have only taken an introductory Java class and am a beginner to Android.
I want to track the drawing made by a finger using Swype and record each stroke as an image or vector of some sort.
Is there a simple way to record a touch movement made in another application, perhaps with an application that runs in the background?
Thank you and I appreciate any guidance!
Touch events are consumed by the top-most view, and because Swype doesn't expose any kind of API you can hook into, you won't be able to get that information from outside. See Input Events | Android Developers.
There might be a way to create a skeleton keyboard application that captures the gestures people make over an image of the Swype keyboard and then passes each gesture on to the real Swype, but I'd personally be worried about copyright unless it was purely for personal use. A rough sketch of the capture half follows.
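If you did go the skeleton-keyboard route, the capture half might look roughly like the sketch below: an input method whose view records each stroke as a Path. The class names are hypothetical, and forwarding the stroke on to the real Swype is not shown, since Swype has no public API for that.

    import android.content.Context;
    import android.graphics.Canvas;
    import android.graphics.Paint;
    import android.graphics.Path;
    import android.inputmethodservice.InputMethodService;
    import android.view.MotionEvent;
    import android.view.View;

    // Hypothetical sketch of the capture half only.
    public class StrokeRecorderIme extends InputMethodService {

        @Override
        public View onCreateInputView() {
            return new StrokeView(this);
        }

        static class StrokeView extends View {
            private final Paint paint = new Paint();
            private final Path currentStroke = new Path();

            StrokeView(Context context) {
                super(context);
                paint.setStyle(Paint.Style.STROKE);
                paint.setStrokeWidth(6f);
            }

            @Override
            public boolean onTouchEvent(MotionEvent event) {
                switch (event.getAction()) {
                    case MotionEvent.ACTION_DOWN:
                        currentStroke.moveTo(event.getX(), event.getY());
                        return true;
                    case MotionEvent.ACTION_MOVE:
                        currentStroke.lineTo(event.getX(), event.getY());
                        invalidate();
                        return true;
                    case MotionEvent.ACTION_UP:
                        // The completed Path could now be rasterised to a Bitmap
                        // or serialised as a list of points.
                        return true;
                }
                return super.onTouchEvent(event);
            }

            @Override
            protected void onDraw(Canvas canvas) {
                canvas.drawPath(currentStroke, paint);
            }
        }
    }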

Capturing gestures with accessibility features on (such as explore-by-touch)

We have an application which captures gestures (currently using the onTouch event callback, which works great). Sadly, when accessibility features such as explore-by-touch are turned on, only some of the fingers are recognized by our application. We of course have reason to believe this is not due to a bug in our code. The visually-impaired and blind populations are very important to us, and the gestures are even more important for them.
How can gestures be captured when accessibility features are enabled?
I haven't done this myself (disclaimer), but from the "Handling custom touch events" section of the Accessibility docs it looks like you'll need to implement a "virtual view hierarchy" by overriding getAccessibilityNodeProvider (assuming you have some custom views, or you're overriding onTouch in built-in views, which has a similar net effect).
There is a good deal of info in the docs on that, and it works back to Android 1.6 via the support library. I'd look into all of that first, get very familiar with detecting when the accessibility features are enabled and when they aren't, and react accordingly when they are; a sketch of the detection part is below.
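As a small, hedged starting point for the detection part (the helper name is invented; isTouchExplorationEnabled() needs API 14 and the state-change listener API 19):

    import android.content.Context;
    import android.view.accessibility.AccessibilityManager;

    // Hypothetical helper: reports whether explore-by-touch is active,
    // now and whenever the user toggles it.
    public final class TouchExplorationWatcher {

        public interface Callback {
            void onTouchExplorationChanged(boolean enabled);
        }

        public static void watch(Context context, final Callback callback) {
            AccessibilityManager am = (AccessibilityManager)
                    context.getSystemService(Context.ACCESSIBILITY_SERVICE);

            // Report the current state immediately...
            callback.onTouchExplorationChanged(am.isTouchExplorationEnabled());

            // ...and again whenever explore-by-touch is toggled.
            am.addTouchExplorationStateChangeListener(
                    new AccessibilityManager.TouchExplorationStateChangeListener() {
                        @Override
                        public void onTouchExplorationStateChanged(boolean enabled) {
                            callback.onTouchExplorationChanged(enabled);
                        }
                    });
        }
    }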

android catch touches in background

I want to provide the user with a way to control my application while it is in the background. I implemented a special hidden area in a corner via an overlay, but that solution has lots of restrictions: only two possible actions per hidden area (one for a regular touch, the other for a long press), other applications use the same trick (the area can be overlapped), and so on.
While searching for alternatives I found the AGS application here. That application uses gestures and catches them anywhere on the screen.
AFAIK this is impossible to do with an overlay. Am I right? Any ideas how to implement such behavior?
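For concreteness, the corner-overlay approach described in the question looks roughly like the sketch below (names are made up, the SYSTEM_ALERT_WINDOW permission is assumed, and the small window consumes the touches it receives, which is exactly the limitation mentioned):

    import android.content.Context;
    import android.graphics.PixelFormat;
    import android.view.GestureDetector;
    import android.view.Gravity;
    import android.view.MotionEvent;
    import android.view.View;
    import android.view.WindowManager;

    // Hypothetical corner trigger: tap and long press are the only two actions.
    public final class CornerTrigger {

        public static void attach(final Context context) {
            WindowManager wm = (WindowManager)
                    context.getSystemService(Context.WINDOW_SERVICE);

            WindowManager.LayoutParams params = new WindowManager.LayoutParams(
                    100, 100,                                     // small hidden area, in pixels
                    WindowManager.LayoutParams.TYPE_SYSTEM_ALERT, // TYPE_APPLICATION_OVERLAY on API 26+
                    WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE,
                    PixelFormat.TRANSLUCENT);
            params.gravity = Gravity.TOP | Gravity.END;

            final GestureDetector detector = new GestureDetector(context,
                    new GestureDetector.SimpleOnGestureListener() {
                        @Override
                        public boolean onSingleTapUp(MotionEvent e) {
                            // Action 1: regular touch.
                            return true;
                        }

                        @Override
                        public void onLongPress(MotionEvent e) {
                            // Action 2: long press.
                        }
                    });

            View trigger = new View(context);
            trigger.setOnTouchListener(new View.OnTouchListener() {
                @Override
                public boolean onTouch(View v, MotionEvent event) {
                    return detector.onTouchEvent(event);
                }
            });

            wm.addView(trigger, params);
        }
    }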
