Is it possible to detect touch events from within another application, specifically swipes? I'd like to be able to detect if the user has swiped left or right (even with 2 fingers - but not required). Perhaps there is a service or broadcast I can listen to.
Or, failing that, is there some API I can poll, say, 10 times a second, to get the touch state so I can compute the rest? (Why, I remember writing a mouse driver strobing the COM1 port with IN/OUTs in 8086 assembler, coded in a TSR on an XT...)!
Anyway, any help appreciated. (I think it could be done by hijacking the primary Launcher and having a transparent click-through on-top activity, but that's seriously fraught with danger!)
Is it possible to detect touch events from within another application, specifically swipes?
Fortunately, no.
Or, failing that, is there some API I can poll, say, 10 times a second, to get the touch state so I can compute the rest?
Fortunately, no.
I think it could be done by hijacking the primary Launcher and having a transparent click-through on-top activity
Fortunately, no.
You are welcome to write your own home screen application, in which case you can track your own touch events on your own home screen. You are welcome to write an ordinary application and track your own touch events on your own activities.
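In either case, a GestureDetector is the usual way to turn your own activity's touch stream into swipe callbacks. A minimal sketch of the single-finger case; the class name, log tag, and the velocity comparison used to classify the fling are just example choices:

```java
import android.app.Activity;
import android.os.Bundle;
import android.util.Log;
import android.view.GestureDetector;
import android.view.MotionEvent;

public class SwipeActivity extends Activity {
    private GestureDetector detector;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        detector = new GestureDetector(this, new GestureDetector.SimpleOnGestureListener() {
            @Override
            public boolean onFling(MotionEvent e1, MotionEvent e2,
                                   float velocityX, float velocityY) {
                // Mostly-horizontal flings count as swipes; the sign gives the direction.
                if (Math.abs(velocityX) > Math.abs(velocityY)) {
                    Log.d("Swipe", velocityX < 0 ? "swiped left" : "swiped right");
                    return true;
                }
                return false;
            }
        });
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        // Only touches delivered to THIS activity ever reach here.
        return detector.onTouchEvent(event) || super.onTouchEvent(event);
    }
}
```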
Related
We have started looking into building an Accessibility Service for Android, following https://developer.android.com/guide/topics/ui/accessibility/services.html. Based on this documentation, we can perform custom gestures on behalf of the user, as described in the section "Taking actions for users" at https://developer.android.com/guide/topics/ui/accessibility/services.html#act-for-users.
We have the following questions based on this documentation.
1) As we understand it, there are gestures the user performs that our code listens for; let's call these Listening Gestures. Then there are gestures our code can perform on the user's behalf; let's call these Performing Gestures. The question is where Performing Gestures land: on top of the touch-and-explore layer, or underneath it? (For additional information: touch-and-explore is a feature of the Android operating system that Accessibility Services can request.)
2) Does a Performing Gesture trigger an AccessibilityEvent that is delivered to the Accessibility Service? If yes, there is a possible recursion if a Listening Gesture and a Performing Gesture happen to be the same. That is, the Listening Gesture could be a swipe right that triggers some event, and the Performing Gesture is also, let's say, a swipe right; the performed gesture would then re-trigger the same event handler.
3) How do we determine that a Performing Gesture executed successfully? This matters most if the Performing Gesture happens underneath the touch-and-explore layer.
Any help would be greatly appreciated.
1) No, gestures performed on the user's behalf via the AccessibilityService capabilities DO NOT end up being caught as Listening Gestures. The AccessibilityService sends the gesture to the same API that processes real screen touches, bypassing the touchscreen entirely, so these events are invisible to the assistive technology. Though, if you hang on to a reference, you could of course call the AccessibilityService callbacks for these gestures yourself. So, any gesture you perform will not trigger a touch-to-explore gesture; in fact, you could perform a gesture as a result of a touch-to-explore gesture.
2) To me this is really the same question as question 1. No, it does not, for all of the same reasons.
3) There are two answers to this. The first is that dispatchGesture returns a boolean, which is true when the operating system ran into no technical issue dispatching your gesture. A potential issue would be, for example, attempting to interact off screen (that would be silly of you! LOL). If the method returns true, your gesture was generally acceptable and was performed; at this point you can be as sure the gesture was performed as if the user had performed it themselves. It's the exact same logic in the operating system... check out the AOSP for yourself if you don't believe me :)
3B) The only way to be truly sure things are working is to watch your gesture take effect on the screen.
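To make the two success signals concrete, here is a minimal sketch (API 24+) of dispatching a swipe from inside an AccessibilityService; the class name, coordinates, duration, and log tags are arbitrary example values:

```java
import android.accessibilityservice.AccessibilityService;
import android.accessibilityservice.GestureDescription;
import android.graphics.Path;
import android.util.Log;
import android.view.accessibility.AccessibilityEvent;

public class SwipingService extends AccessibilityService {

    private void performSwipeRight() {
        Path path = new Path();
        path.moveTo(100f, 500f);  // stroke start, in screen pixels
        path.lineTo(600f, 500f);  // stroke end

        GestureDescription gesture = new GestureDescription.Builder()
                .addStroke(new GestureDescription.StrokeDescription(path, 0, 300))
                .build();

        // Returns false only if the gesture could not be dispatched at all,
        // e.g. because the coordinates are off screen.
        boolean dispatched = dispatchGesture(gesture, new GestureResultCallback() {
            @Override
            public void onCompleted(GestureDescription g) {
                Log.d("SwipingService", "gesture completed");
            }

            @Override
            public void onCancelled(GestureDescription g) {
                Log.d("SwipingService", "gesture cancelled");
            }
        }, null);

        Log.d("SwipingService", "dispatched: " + dispatched);
    }

    @Override
    public void onAccessibilityEvent(AccessibilityEvent event) { }

    @Override
    public void onInterrupt() { }
}
```

Note that the boolean only tells you the dispatch was accepted; the onCompleted()/onCancelled() callbacks report what actually happened to the stroke.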
I want to know if it is possible to generate an automatic touch at regular intervals of time, say every 5 seconds, in another application.
For example, I want to develop an application that will create a touch response, just as if we had touched the screen, at a particular coordinate at a regular fixed interval.
Please help.
It's possible to create and dispatch touch events within your own application, and it's pretty easy. Look at the View.dispatchTouchEvent method and its parameters. You have to override that method in the root view group so you can pass your events to all the views in the activity.
It's not possible to access other applications though (due to security reasons).
edit: it seems dispatchTouchEvent is public, so there's no need to override it; you can just call it directly.
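A minimal sketch of that approach, injecting a synthetic tap into your own activity every 5 seconds; the class name and the (x, y) coordinates are arbitrary example values:

```java
import android.app.Activity;
import android.os.Bundle;
import android.os.Handler;
import android.os.Looper;
import android.os.SystemClock;
import android.view.MotionEvent;
import android.view.View;

public class AutoTouchActivity extends Activity {
    private final Handler handler = new Handler(Looper.getMainLooper());

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // The window's content frame; normally you'd call setContentView(...) first.
        final View root = findViewById(android.R.id.content);
        handler.post(new Runnable() {
            @Override
            public void run() {
                long now = SystemClock.uptimeMillis();
                // Fake a complete tap: a DOWN immediately followed by an UP.
                MotionEvent down = MotionEvent.obtain(now, now,
                        MotionEvent.ACTION_DOWN, 200f, 400f, 0);
                MotionEvent up = MotionEvent.obtain(now, now + 50,
                        MotionEvent.ACTION_UP, 200f, 400f, 0);
                root.dispatchTouchEvent(down);
                root.dispatchTouchEvent(up);
                down.recycle();
                up.recycle();
                handler.postDelayed(this, 5000); // repeat every 5 seconds
            }
        });
    }
}
```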
I'm looking into making an app that prevents the default action on volume up/down key presses, and I'd like some input to see whether I'd be wasting my time trying.
Here's the setup:
I have a Samsung Galaxy S3. I like to listen to music. Often I unintentionally change the volume of what's playing when the phone is locked and in my pocket.
At first I thought it would be simple, having tested a simple override of the onKeyDown() method to catch the integer key codes for the volume keys. I quickly came to the realization that this would only work if I never locked my phone and kept ONLY that app open.
Next I found a few articles on Services; however, I believe this also falls short for my needs, as a Service is not a subclass of Activity and so doesn't implement onKeyDown(). And unless I'm mistaken, a wake lock actually wakes and/or unlocks the phone?
Should I give up now, or is this actually achievable?
(Actually it must be possible, as the whole point of this is that I don't have to pay £3 for an app with this one feature. lol)
Note: Running 4.2.1
New to Android, but not to Java.
Steve.
Edit: Just a thought, but if I extended Activity in my own class (and overrode onKeyDown), and instantiated it in a Service as a static instance, would that custom Activity persist while the phone is locked?
Edit 2: I found this SO post, which suggests using FLAG_SHOW_WHEN_LOCKED. I'll start looking at this when I get home, but I'm still open to suggestions and advice :D
Edit 3:
OK, so tell me if I start losing the plot here...
Using a broadcast receiver I will listen for the ACTION_SCREEN_OFF broadcast. When that fires, I create my custom Activity (with the onKeyDown() override) and set FLAG_SHOW_WHEN_LOCKED so it sits over my lock screen (I don't care, because the screen is off). The receiver also listens for ACTION_SCREEN_ON; when that fires it will destroy the Activity before the screen shows (or after, I'm not bothered if it flickers).
(Possibly even just pause it, and only kill it if an unlock is detected.) A rough sketch of the receiver side is below.
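A sketch of that receiver, registered from a running Service since ACTION_SCREEN_OFF/ON cannot be declared in the manifest. KeyGrabActivity is a hypothetical name for the lock-screen activity with the onKeyDown() override and FLAG_SHOW_WHEN_LOCKED set on its window:

```java
import android.app.Service;
import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.content.IntentFilter;
import android.os.IBinder;

public class ScreenWatchService extends Service {
    private final BroadcastReceiver screenReceiver = new BroadcastReceiver() {
        @Override
        public void onReceive(Context context, Intent intent) {
            if (Intent.ACTION_SCREEN_OFF.equals(intent.getAction())) {
                // Bring up the key-grabbing activity while the screen is dark.
                Intent i = new Intent(context, KeyGrabActivity.class);
                i.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
                context.startActivity(i);
            } else if (Intent.ACTION_SCREEN_ON.equals(intent.getAction())) {
                // Signal the activity to finish() before the lock screen
                // shows, e.g. via a broadcast it listens for (not shown).
            }
        }
    };

    @Override
    public void onCreate() {
        super.onCreate();
        IntentFilter filter = new IntentFilter(Intent.ACTION_SCREEN_OFF);
        filter.addAction(Intent.ACTION_SCREEN_ON);
        registerReceiver(screenReceiver, filter);
    }

    @Override
    public void onDestroy() {
        unregisterReceiver(screenReceiver);
        super.onDestroy();
    }

    @Override
    public IBinder onBind(Intent intent) {
        return null;
    }
}
```

In KeyGrabActivity itself, the flag would be set with getWindow().addFlags(WindowManager.LayoutParams.FLAG_SHOW_WHEN_LOCKED).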
Also a big thanks to #StoneBird for getting me this far; sometimes it helps to just hash it out with someone who knows what they're talking about (hopefully ^_-).
Try this?
Settings.System
You can start a service and set the system volume every millisecond or so to keep it at a steady level. No key checking should be needed, since the value is overwritten every time.
Also, there are free volume keepers on the market, so you don't have to pay for one.
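A minimal sketch of that idea. Note that I'm using AudioManager.setStreamVolume rather than Settings.System, since that is the usual call for changing stream volume, and resetting a few times per second rather than every millisecond, which would be wasteful; the 250 ms interval and class name are arbitrary choices:

```java
import android.app.Service;
import android.content.Intent;
import android.media.AudioManager;
import android.os.Handler;
import android.os.IBinder;
import android.os.Looper;

public class VolumeKeeperService extends Service {
    private final Handler handler = new Handler(Looper.getMainLooper());
    private AudioManager audio;
    private int lockedVolume;

    private final Runnable reset = new Runnable() {
        @Override
        public void run() {
            // Overwrite whatever the volume keys did since the last tick.
            audio.setStreamVolume(AudioManager.STREAM_MUSIC, lockedVolume, 0);
            handler.postDelayed(this, 250);
        }
    };

    @Override
    public void onCreate() {
        super.onCreate();
        audio = (AudioManager) getSystemService(AUDIO_SERVICE);
        // Pin the volume at whatever level it has when the service starts.
        lockedVolume = audio.getStreamVolume(AudioManager.STREAM_MUSIC);
        handler.post(reset);
    }

    @Override
    public void onDestroy() {
        handler.removeCallbacks(reset);
        super.onDestroy();
    }

    @Override
    public IBinder onBind(Intent intent) {
        return null;
    }
}
```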
I know that it is easy to measure touch pressure by using the getPressure() function provided by the Android API within an application's activity.
However, I would like to capture all motion events, not only those inside the application's activity but also those outside it.
How can I achieve this?
You can't, thankfully.
For security reasons, Android does not allow you to intercept touch events on anything outside of your own app's Activities. If this were allowed, a malicious app could take over and render the device useless by simply consuming all touch events.
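Within your own activity, though, reading the pressure is as simple as the question suggests. A minimal sketch (the class name and log tag are arbitrary choices):

```java
import android.app.Activity;
import android.util.Log;
import android.view.MotionEvent;

public class PressureActivity extends Activity {
    @Override
    public boolean onTouchEvent(MotionEvent event) {
        // getPressure() is normalized and device dependent,
        // typically falling between 0 and 1.
        Log.d("Pressure", "pressure=" + event.getPressure());
        return super.onTouchEvent(event);
    }
}
```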
SECURITY ISSUE: I don't know how it happens, but readers of this question come up with the idea that a solution to this problem would be a security threat. So please keep in mind: all the data I am interested in is a measure of the time of user in/activity. That's all. What the user did -- I am NOT interested in at all!
What I need is a very simple concept, but I cannot find a solution. I would like to run a process in the background and have a listener for user interaction.
So, the user touches the screen --> my method is triggered.
The user unlocks the phone --> my method is triggered.
The user types on the physical keyboard --> my method is triggered.
So, in short, my method is NOT triggered when the phone lies still on the table :-) Please note that the interaction is with the entire phone, not with my app. I am not interested that the user typed the letter "K"; I am not even interested that the user typed something at all. All I care about is that the user used the phone in some way.
Btw, the state when the user is walking and listening to music (phone in the pocket) is NOT interaction.
What I am looking for is a trigger -- something like INTERACTION_DETECTED -- or, conversely, a callback registered like reportInactiveUser(10*1000) that would be called if the user was inactive for 10 seconds.
QUESTION: how do I detect user interaction?
Purpose: to measure the time the phone is used manually.
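There is no public device-wide INTERACTION_DETECTED broadcast, but inside your own app the callback pattern described above is easy to build with a Handler. A minimal sketch, where onUserInactive is a hypothetical handler you supply; Activity.onUserInteraction() is a real hook that fires on any interaction with your own activities and is a natural place to call reportInteraction() from:

```java
import android.os.Handler;
import android.os.Looper;

public class InactivityTimer {
    private final Handler handler = new Handler(Looper.getMainLooper());
    private final Runnable onInactive;
    private final long timeoutMs;

    public InactivityTimer(long timeoutMs, Runnable onUserInactive) {
        this.timeoutMs = timeoutMs;
        this.onInactive = onUserInactive;
    }

    // Call this from every interaction hook you DO have, e.g. an
    // override of Activity.onUserInteraction() in your own activities.
    // Each call pushes the inactivity deadline back by timeoutMs.
    public void reportInteraction() {
        handler.removeCallbacks(onInactive);
        handler.postDelayed(onInactive, timeoutMs);
    }
}
```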
(lack of) PROGRESS
I found out that BatteryStatsImpl could hold the data I need. An instance of it is kept by BatteryStatsService, which however does not allow any access to it, only via IBatteryStats, which in turn has only one interesting method -- getStatistics -- and its output does not include the user activity counters (or they are all zeros; I turned all the permissions on). So, so far -- bad luck.
All calls have to be made by reflection, because those classes are not available anyway ;-).
You want to intercept physical interactions with the phone for that.
For screen touches, this package will come in handy; you can use it to listen for gestures on the screen.
For physical button presses, I suggest you have a look at this.
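As a coarse, well-supported approximation of "time the phone is used manually", you can also time the interval between unlock and screen-off with a broadcast receiver; this misses finer-grained activity, but every touch-based use of the phone happens inside such a window. A sketch, wrapped in a Service since ACTION_SCREEN_OFF is not delivered to manifest-declared receivers (class name and log tag are arbitrary choices):

```java
import android.app.Service;
import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.content.IntentFilter;
import android.os.IBinder;
import android.os.SystemClock;
import android.util.Log;

public class UsageTimerService extends Service {
    private long sessionStart;

    private final BroadcastReceiver receiver = new BroadcastReceiver() {
        @Override
        public void onReceive(Context context, Intent intent) {
            String action = intent.getAction();
            if (Intent.ACTION_USER_PRESENT.equals(action)) {
                // The user unlocked the phone: a usage session begins.
                sessionStart = SystemClock.elapsedRealtime();
            } else if (Intent.ACTION_SCREEN_OFF.equals(action) && sessionStart > 0) {
                long sessionMs = SystemClock.elapsedRealtime() - sessionStart;
                Log.d("UsageTimer", "phone used for " + sessionMs + " ms");
                sessionStart = 0;
            }
        }
    };

    @Override
    public void onCreate() {
        super.onCreate();
        IntentFilter filter = new IntentFilter(Intent.ACTION_USER_PRESENT);
        filter.addAction(Intent.ACTION_SCREEN_OFF);
        registerReceiver(receiver, filter);
    }

    @Override
    public void onDestroy() {
        unregisterReceiver(receiver);
        super.onDestroy();
    }

    @Override
    public IBinder onBind(Intent intent) {
        return null;
    }
}
```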