Capturing gestures with accessibility features enabled (such as explore-by-touch) - Android

We have an application which captures gestures (currently using the onTouch event callback, and it works great). Sadly, when accessibility features such as explore-by-touch are turned on, only some of the fingers are recognized by our application. We of course have reason to believe this is not due to a bug in our code. To us, the visually-impaired and blind populations are very important, and the gestures are even more important for them.
How can gestures be captured when accessibility features are enabled?

I haven't done this myself (disclaimer), but from the "Handling custom touch events" section in the Accessibility docs it looks like you'll need to implement a "virtual view hierarchy" by overriding getAccessibilityNodeProvider (assuming you have some custom views, or you're overriding onTouch in built-in views, which has a similar net effect).
There is a good deal of info in the docs on that, and it works back to Android 1.6 via the support library. I'd look into all of that first, get very familiar with detecting when accessibility features are enabled and when they aren't, and react accordingly when they are.
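A reasonable first step is detecting whether explore-by-touch is active so the app can switch behavior. A minimal sketch, assuming API 14+ for isTouchExplorationEnabled() (the helper class name here is mine):

```java
import android.content.Context;
import android.view.accessibility.AccessibilityManager;

// Hypothetical helper: ask the system whether explore-by-touch is active
// so the app can switch to its accessible gesture path.
public final class AccessibilityState {

    private AccessibilityState() {}

    public static boolean isTouchExplorationOn(Context context) {
        AccessibilityManager am = (AccessibilityManager)
                context.getSystemService(Context.ACCESSIBILITY_SERVICE);
        // isTouchExplorationEnabled() exists from API 14; on older releases
        // you would fall back to the general isEnabled() flag.
        return am != null
                && am.isEnabled()
                && am.isTouchExplorationEnabled();
    }
}
```

Note that when explore-by-touch is on, single-finger input reaches your view as hover events (onHoverEvent / dispatchHoverEvent) rather than onTouch, which is likely why some fingers seem to go missing.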

Related

Does the libvlcsharp MediaPlayerElement support non-touch devices?

I was messing around with LibVLCSharp on Xamarin and the (fairly) new MediaPlayerElement UI on Android. For devices such as Android TV there is no touch interface, so you have to use something like a remote control.
For this case, I ended up capturing keypresses in DispatchKeyEvent and sending them to my app via MessagingCenter. I was wondering whether MediaPlayerElement can support non-touch devices automatically, or if not, what the best approach would be to handling keypresses in the app. I would have to "highlight" various controls of the MediaPlayerElement and then be able to select them when "DpadCenter" is pressed.
My questions are:
Does MediaPlayerElement already support non-touch gestures? This site here seems to suggest it might, with the comment that you can turn them off.
If it doesn't support them (and you have to roll your own), is there a programmatic way to highlight (e.g. change the background color of) the individual controls/buttons (such as pause or play) and invoke them?
You can override functionality for any control, so you should be able to hook your DpadCenter event and get the behavior you expect from the player element.
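I haven't wired this up on Android TV myself, but for the key-handling half, the underlying platform callback looks roughly like this (shown in Java for brevity; in Xamarin.Android you override DispatchKeyEvent the same way):

```java
import android.app.Activity;
import android.view.KeyEvent;

// Sketch: trap the d-pad center key and route it to your own
// focus/"highlight" logic before the platform handles it.
public class PlayerActivity extends Activity {

    @Override
    public boolean dispatchKeyEvent(KeyEvent event) {
        if (event.getAction() == KeyEvent.ACTION_DOWN
                && event.getKeyCode() == KeyEvent.KEYCODE_DPAD_CENTER) {
            // Invoke whichever control is currently highlighted,
            // e.g. toggle play/pause on the media player.
            return true;  // consumed; stop further dispatch
        }
        return super.dispatchKeyEvent(event);
    }
}
```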
Providing better docs for this is on the roadmap: https://code.videolan.org/videolan/LibVLCSharp/-/issues/309
For customization of the control, a similar answer was created a while ago: How to create LibVLCSharp custom playback controls in Xamarin Forms?
Do share what you build with it :-) We don't have any Android TV sample for this.

Android overlay intercepting touch events

I would like to build an overlay that intercepts all touch events, registers them, but it is basically transparent and lets the touch events (and all other gestures) interact with the underlying apps.
What I've tried so far are the solutions proposed here:
Android overlay to grab ALL touch, and pass them on?
Android : Multi touch and TYPE_SYSTEM_OVERLAY
but I'm able either to have a "transparent" overlay that passes all touch events to the underlying apps, or an overlay that intercepts all touch events without letting them "pass down".
From what I've read and tried so far, it seems to be a security limitation introduced in Android 4.x, and maybe this cannot be done at all.
I'm open to any kind of solution; maybe there is a way of making this work with root permissions, or using the NDK, or something else, as I'm quite new to Android programming.
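For reference, the closest compromise I know of that stays within the platform's rules is a pass-through overlay with FLAG_WATCH_OUTSIDE_TOUCH, which only tells you *that* a touch happened elsewhere (the coordinates are zeroed since Android 4.2). A rough, untested sketch, assuming the app already holds the SYSTEM_ALERT_WINDOW / "draw over other apps" permission:

```java
import android.content.Context;
import android.graphics.PixelFormat;
import android.os.Build;
import android.view.Gravity;
import android.view.MotionEvent;
import android.view.View;
import android.view.WindowManager;

// Sketch: a tiny non-touchable overlay that is merely *notified* when a
// touch lands outside it. It never consumes events, so the apps below it
// stay fully usable -- matching the limitation described in the question.
public class TouchLoggingOverlay {

    public static void attach(Context context) {
        WindowManager wm = (WindowManager)
                context.getSystemService(Context.WINDOW_SERVICE);

        int overlayType = Build.VERSION.SDK_INT >= Build.VERSION_CODES.O
                ? WindowManager.LayoutParams.TYPE_APPLICATION_OVERLAY
                : WindowManager.LayoutParams.TYPE_SYSTEM_OVERLAY;

        WindowManager.LayoutParams params = new WindowManager.LayoutParams(
                1, 1,  // keep the window tiny; we only want the callbacks
                overlayType,
                WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE
                        | WindowManager.LayoutParams.FLAG_NOT_TOUCHABLE
                        | WindowManager.LayoutParams.FLAG_WATCH_OUTSIDE_TOUCH,
                PixelFormat.TRANSLUCENT);
        params.gravity = Gravity.TOP | Gravity.START;

        View overlay = new View(context);
        overlay.setOnTouchListener(new View.OnTouchListener() {
            @Override
            public boolean onTouch(View v, MotionEvent event) {
                if (event.getAction() == MotionEvent.ACTION_OUTSIDE) {
                    // A touch happened somewhere else; on 4.2+ the
                    // coordinates are (0, 0) by design, so you learn
                    // *that* it happened, not *where*.
                }
                return false;  // never consume
            }
        });
        wm.addView(overlay, params);
    }
}
```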

Phonegap: Is it possible to change orientation using javascript?

I was wondering if, within Phonegap, it is possible to disable orientation changes using JavaScript alone, i.e. without using an external plugin or modifying the config.xml. I could not find any definitive answer to this question. The reason is that we have an application that we package up and have customers download, which allows them to run their own mobile applications using a software product we offer. Since we want customers to have all options available for their applications, this packaged application has both orientations enabled (it was, obviously, built using Phonegap).
I was thinking perhaps there is an event I can listen for and disable its propagation. Another hack I was considering: if I can detect an orientation event, I could rotate the page in the opposite direction of the movement that triggered it, if it's not possible to disable its propagation or stop the rotation in the first place.

How to record touch movements on Android keyboard?

I apologize if this is a stupid question. I have only taken an introductory Java class and am a beginner to Android.
I want to track the drawing made by a finger using Swype and record each stroke as an image or vector of some sort.
Is there a simple way to record a touch movement in another application, maybe by making an application running in the background?
Thank you and I appreciate any guidance!
Touch events are consumed by the top-most view. Because Swype doesn't have any kind of API you can hook into, you won't be able to get that information. See Input Events | Android Developers.
There might be some way to create a skeleton keyboard application that captures the gestures people make over an image of the Swype keyboard and then passes each gesture on to the real Swype, but I'd personally be worried about copyright unless it was just for personal use.
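If recording gestures inside your own app is enough, a plain custom view can capture each stroke as a Path (a vector you can later rasterize to an image). An untested sketch (the class name is mine):

```java
import android.content.Context;
import android.graphics.Canvas;
import android.graphics.Paint;
import android.graphics.Path;
import android.view.MotionEvent;
import android.view.View;
import java.util.ArrayList;
import java.util.List;

// Sketch: each finger-down starts a new Path; the list of paths doubles
// as a vector record of the gesture and is drawn back for feedback.
public class StrokeRecorderView extends View {

    private final List<Path> strokes = new ArrayList<>();
    private final Paint paint = new Paint();
    private Path current;

    public StrokeRecorderView(Context context) {
        super(context);
        paint.setStyle(Paint.Style.STROKE);
        paint.setStrokeWidth(6f);
        paint.setAntiAlias(true);
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:
                current = new Path();
                current.moveTo(event.getX(), event.getY());
                strokes.add(current);
                break;
            case MotionEvent.ACTION_MOVE:
                if (current != null) current.lineTo(event.getX(), event.getY());
                break;
            case MotionEvent.ACTION_UP:
                current = null;
                break;
        }
        invalidate();
        return true;  // consume so we keep receiving the rest of the gesture
    }

    @Override
    protected void onDraw(Canvas canvas) {
        for (Path stroke : strokes) canvas.drawPath(stroke, paint);
    }
}
```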

What is gesture in Android

I would like to know what is meant by gestures on typical mobile devices, more specifically Android ones. Android supports gesture recognition.
1) Is a gesture when the user holds the device and shakes it (say upwards, downwards, or side to side)?
2) Is a gesture when a finger is placed on the screen and traced up, down, right, or left? If so, what is the difference between a touch screen and gestures?
I am confused between options 1) and 2).
What is a gesture exactly?
As I understand it, a gesture is any time a user touches the screen and performs a pre-defined motion that the system understands. I would venture to say that shaking the phone is not a gesture, but a function of detecting changes in the accelerometers.
From Android's reference web page, a gesture is a hand-drawn shape on a touch screen. It can have one or multiple strokes. Each stroke is a sequence of timed points. A user-defined gesture can be recognized by a GestureLibrary.
https://developer.android.com/reference/android/gesture/Gesture.html
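To make that concrete, here is a minimal sketch of the stroke-gesture API; R.layout.main, R.raw.gestures (a file saved by the SDK's Gesture Builder tool), and R.id.gesture_overlay are placeholders for your own resources:

```java
import android.app.Activity;
import android.gesture.Gesture;
import android.gesture.GestureLibraries;
import android.gesture.GestureLibrary;
import android.gesture.GestureOverlayView;
import android.gesture.Prediction;
import android.os.Bundle;

public class GestureDemoActivity extends Activity
        implements GestureOverlayView.OnGesturePerformedListener {

    private GestureLibrary library;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.main);  // layout containing a GestureOverlayView

        // Load gestures previously saved with the Gesture Builder tool.
        library = GestureLibraries.fromRawResource(this, R.raw.gestures);
        library.load();

        GestureOverlayView overlay =
                (GestureOverlayView) findViewById(R.id.gesture_overlay);
        overlay.addOnGesturePerformedListener(this);
    }

    @Override
    public void onGesturePerformed(GestureOverlayView overlay, Gesture gesture) {
        // recognize() returns predictions ordered by score (best first).
        for (Prediction prediction : library.recognize(gesture)) {
            if (prediction.score > 1.0) {
                // prediction.name is the label the gesture was saved under.
            }
        }
    }
}
```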
I see gestures as a type of input pattern that you expect from the user. For example, you can set up mouse gestures in web browsers to do things like going "Forward" or "Back" in the browser by performing a specific "gesture" (maybe a middle-mouse-button click while moving the mouse left goes "Back").
I'll give a brief answer to your bonus question: yes, it's quite possible to do character recognition from input gestures on Android. In fact, at least one major vendor has already ported an existing handwriting engine to that platform. Works beautifully, but there's a lot of legal and marketing cruft to take care of as well before it ends up on real devices :(
