Android Jelly Bean service that receives all touch events

I am an undergraduate research assistant working on an Android accessibility project. My task involves collecting as much data about the user experience as possible, including touch events and other view interactions. I need two services: an accessibility service to gather details about the current view interaction, and a TouchListener service that can intercept MotionEvents.
My problem is with the TouchListener service. Is there any known way to intercept all touch events and pass them on to the current view?
Essentially, it seems an invisible system-overlay view is needed to listen for touch events constantly, but such a view can either intercept all touch events and NOT pass them to the view behind it, or pass the event through and only register it as an ambiguous ACTION_OUTSIDE event, giving no details about the interaction.
My question is similar to this one, and the obstacle is discussed here. If anyone has found a work-around, please post!
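For reference, the overlay approach usually looks something like the sketch below. It is a minimal, hypothetical TouchListenerService (the class name and the 1x1 size are illustrative, and it assumes the SYSTEM_ALERT_WINDOW permission is declared), and it runs into exactly the obstacle described above: with FLAG_WATCH_OUTSIDE_TOUCH, touches that land outside the overlay arrive only as ACTION_OUTSIDE, and on recent Jelly Bean builds their coordinates are zeroed out for security.

    import android.app.Service;
    import android.content.Intent;
    import android.graphics.PixelFormat;
    import android.os.IBinder;
    import android.util.Log;
    import android.view.MotionEvent;
    import android.view.View;
    import android.view.WindowManager;

    public class TouchListenerService extends Service {

        private View overlay;

        @Override
        public void onCreate() {
            super.onCreate();
            overlay = new View(this);
            overlay.setOnTouchListener(new View.OnTouchListener() {
                @Override
                public boolean onTouch(View v, MotionEvent event) {
                    if (event.getAction() == MotionEvent.ACTION_OUTSIDE) {
                        // On recent Jelly Bean builds getX()/getY() are zeroed
                        // here, which is the obstacle described above.
                        Log.d("TouchListener", "outside touch: " + event);
                    }
                    return false; // never consume the event
                }
            });

            // A tiny, non-focusable overlay that only watches for touches
            // landing outside of it; everything else passes through.
            WindowManager.LayoutParams params = new WindowManager.LayoutParams(
                    1, 1,
                    WindowManager.LayoutParams.TYPE_SYSTEM_OVERLAY,
                    WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE
                            | WindowManager.LayoutParams.FLAG_NOT_TOUCH_MODAL
                            | WindowManager.LayoutParams.FLAG_WATCH_OUTSIDE_TOUCH,
                    PixelFormat.TRANSLUCENT);
            ((WindowManager) getSystemService(WINDOW_SERVICE)).addView(overlay, params);
        }

        @Override
        public void onDestroy() {
            ((WindowManager) getSystemService(WINDOW_SERVICE)).removeView(overlay);
            super.onDestroy();
        }

        @Override
        public IBinder onBind(Intent intent) {
            return null;
        }
    }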

Related

android event handlers trigger graph

Some user actions may trigger multiple events and handlers, like the well-known LongClick ~> Click.
I am looking for some kind of hierarchy graph that summarizes this potential trigger-order relation for Android events, but I have had no success so far.
Any pointers on this, please?
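In the absence of such a graph, the ordering can be probed empirically. Below is a minimal sketch (the View named button is a placeholder) that logs the delivery order to Logcat: onTouch fires first for every MotionEvent, and whether onClick fires after onLongClick depends on onLongClick's return value.

    // Hypothetical probe: attach all three listeners to one view and
    // watch Logcat to see the trigger order for a given gesture.
    button.setOnTouchListener(new View.OnTouchListener() {
        @Override
        public boolean onTouch(View v, MotionEvent event) {
            // actionToString requires API 19+; use getAction() on older builds.
            Log.d("Order", "onTouch: " + MotionEvent.actionToString(event.getActionMasked()));
            return false; // let the view's own gesture handling run
        }
    });
    button.setOnLongClickListener(new View.OnLongClickListener() {
        @Override
        public boolean onLongClick(View v) {
            Log.d("Order", "onLongClick");
            return false; // false: a Click may still follow on finger-up
        }
    });
    button.setOnClickListener(new View.OnClickListener() {
        @Override
        public void onClick(View v) {
            Log.d("Order", "onClick");
        }
    });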

How to take action for users using android accessibility service?

We have started looking into building an accessibility service for Android at https://developer.android.com/guide/topics/ui/accessibility/services.html. Based on this documentation, we can perform custom gestures on behalf of the user, as described under the section "Taking actions for users" at https://developer.android.com/guide/topics/ui/accessibility/services.html#act-for-users.
We have following questions based on this documentation.
1) As we understand it, there are gestures that the user performs and that our code listens to. Let's call these Listening Gestures. Then there are gestures that our code could perform for the user. Let's call these Performing Gestures. The question is where Performing Gestures take effect: above the touch-and-explore layer or underneath it? For context, touch-and-explore is a feature of the Android operating system that accessibility services can request.
2) Does a Performing Gesture trigger any AccessibilityEvent that gets notified to the accessibility service? If yes, there is possible recursion when a Listening Gesture and a Performing Gesture happen to be the same. That is, the Listening Gesture could be a swipe right which triggers some event, and the Performing Gesture could also be a swipe right, which would then trigger the same event handler again.
3) How do we determine whether a Performing Gesture executed successfully? This matters especially if the Performing Gesture happens underneath the touch-and-explore layer.
Any help would be greatly appreciated.
1) No, performing gestures on behalf of users through the AccessibilityService capabilities DOES NOT end up being caught as a Listening Gesture. The AccessibilityService sends the gesture directly to the part of the framework that processes screen touches, in exactly the same way the touch screen does, bypassing the screen entirely, so these events are invisible to the assistive technology. Though, if you hang on to a reference, you could of course invoke the AccessibilityService callbacks for these gestures yourself. So any gesture you perform will not trigger a touch-to-explore gesture; in fact, you could trigger performing a gesture as a result of a touch-to-explore gesture.
2) This, to me, is the same question as question 1. No, it does not, for all of the same reasons given in question 1.
3) There are two answers to this. The first is that dispatchGesture returns a boolean. This boolean is true when the operating system runs into no technical issues dispatching your gesture; a potential issue would be, for example, attempting to interact off screen. If this method returns true, your gesture was acceptable and was dispatched. At that point, you can be as sure the gesture was performed as if the user had performed it themselves; it goes through the exact same logic in the operating system (check the AOSP source yourself if you don't believe me).
3B) The only way to be completely sure things are working is to watch your gesture take effect on the screen.
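To make 3) concrete, here is a minimal sketch showing both success signals, assuming API 24+ and android:canPerformGestures="true" in the service declaration; the coordinates and timings are placeholders. Beyond the boolean return value discussed above, dispatchGesture also accepts a GestureResultCallback that reports whether the gesture ran to completion or was cancelled.

    import android.accessibilityservice.GestureDescription;
    import android.graphics.Path;
    import android.util.Log;

    // Inside your AccessibilityService subclass: dispatch a swipe-right
    // and observe both success signals.
    private void performSwipeRight() {
        Path path = new Path();
        path.moveTo(100f, 500f);  // placeholder start point
        path.lineTo(600f, 500f);  // placeholder end point

        GestureDescription gesture = new GestureDescription.Builder()
                .addStroke(new GestureDescription.StrokeDescription(path, 0, 300))
                .build();

        boolean dispatched = dispatchGesture(gesture, new GestureResultCallback() {
            @Override
            public void onCompleted(GestureDescription g) {
                Log.d("Gesture", "ran to completion");
            }

            @Override
            public void onCancelled(GestureDescription g) {
                Log.d("Gesture", "cancelled before completing");
            }
        }, null);
        Log.d("Gesture", "accepted for dispatch: " + dispatched);
    }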

Find out which View consumed touch event, if any

I was troubleshooting a view-related issue: a click listener that was not firing when it was supposed to. After a long session of trial and error, I found out that a parent view was disabled and was therefore discarding all events intended for its children.
Is there a way, in Android, to find out exactly what happens to a touch or click event once it is injected into the app? Like how it was dispatched, which views it traversed, who ignored it (and why), who discarded it (and why), and finally who consumed it.
Ideally it would be some kind of low-level dump on Logcat emitted for every click in the app.
As you can see in a similar post on the issue tracker, Dianne Hackborn writes:
There currently isn't a way to do this.
Meaning, there is no such API in the framework.
But you can use a custom root ViewGroup, listen for each and every touch event via onInterceptTouchEvent(), and dump the MotionEvent.
I believe that's the only possible way so far.
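A minimal sketch of that approach (the class name is a placeholder): wrap the activity's content in this layout and every gesture's MotionEvents get dumped to Logcat. If you also need to see events that no child consumes, overriding dispatchTouchEvent instead works the same way and is called unconditionally for every event.

    import android.content.Context;
    import android.util.AttributeSet;
    import android.util.Log;
    import android.view.MotionEvent;
    import android.widget.FrameLayout;

    public class TouchLoggingFrameLayout extends FrameLayout {

        public TouchLoggingFrameLayout(Context context, AttributeSet attrs) {
            super(context, attrs);
        }

        @Override
        public boolean onInterceptTouchEvent(MotionEvent ev) {
            Log.d("TouchDump", ev.toString()); // low-level per-event dump
            return false; // observe only; never steal the event from children
        }
    }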

Automatic touch on android device(sending touch impulses)

I want to know if it is possible to generate an automatic touch at regular intervals of time, say every 5 seconds, in another application.
For example, I want to develop an application that creates a touch response, just as if we had touched the screen, at a particular coordinate at a regular fixed interval.
Please help.
It's possible to create and dispatch touch events in your application, and it's pretty easy. Look at the View.dispatchTouchEvent method and its parameters. You have to override that method in the root view group so you can pass your events on to all the views in the activity.
It's not possible to inject events into other applications, though, for security reasons.
Edit: it seems dispatchTouchEvent is public, so there is no need to override it.
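A minimal sketch of this within your own app (rootView, the (200, 300) coordinates, and the 5-second period are placeholders): synthesize down/up events with MotionEvent.obtain() and feed them to the public dispatchTouchEvent().

    import android.os.Handler;
    import android.os.Looper;
    import android.os.SystemClock;
    import android.view.MotionEvent;
    import android.view.View;

    // Inside an Activity: simulate a tap at (200, 300) every 5 seconds.
    private final Handler handler = new Handler(Looper.getMainLooper());

    private void scheduleTap(final View rootView) {
        handler.postDelayed(new Runnable() {
            @Override
            public void run() {
                long now = SystemClock.uptimeMillis();
                MotionEvent down = MotionEvent.obtain(
                        now, now, MotionEvent.ACTION_DOWN, 200f, 300f, 0);
                MotionEvent up = MotionEvent.obtain(
                        now, now + 50, MotionEvent.ACTION_UP, 200f, 300f, 0);
                rootView.dispatchTouchEvent(down);
                rootView.dispatchTouchEvent(up);
                down.recycle();
                up.recycle();
                scheduleTap(rootView); // re-arm for the next interval
            }
        }, 5000);
    }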

How to prevent Android's native webview from firing a mousedown event after a touchstart

When registering for mousedown and touchstart events on a WebView, both are triggered at the very same time, leading to the odd situation of having two "mousedown-like" events to deal with. There is a known workaround, which consists of calling event.preventDefault()... but I cannot call it, because I need the default behavior anyway. And of course I cannot remove the regular mouse event listeners either, without losing compatibility with a regular computer.
There is another known workaround (found here: http://www.quirksmode.org/blog/archives/2010/02/do_we_need_touc.html), which consists of detecting whether the first event received is a touch event; if it is, we simply remove the mouse event listeners. But I find it ugly, and moreover I can't believe that Android's native browser keeps firing mouse events when touch listeners are registered, with no way to prevent this in the manifest or anywhere else.
Btw: this issue doesn't occur on Safari mobile.
Thank you very much!
