Is that possible?
I'm developing a service for disabled people. They can define voice commands, and the service detects those commands and executes them. For example, when the user says "scroll down", the service (which is in fact a background process) takes control of the screen and scrolls down (regardless of which application is in the foreground), or touches a specific position, and so forth. I wonder whether this is possible on an Android device. If not, what about a rooted device (i.e., a service that has root permissions)? I know that getting voice input and processing it is possible. My question is about performing actions such as a touch (ACTION_DOWN) or a scroll of the user interface on behalf of the user.
Note that I don't have access to whatever application is running! In fact, my service doesn't know which application is in the foreground. It might be a social media app, a messaging app, a game, or anything else. So my service must be capable of generating input events like touch, swipe, scroll, etc.
Thanks in advance!
Yes, that is possible.
For example, ListView has two methods for programmatic scrolling:
listView.setSelection(id); //For instant scroll
listView.smoothScrollToPosition(id); // For smooth scroll
For an example of how to use voice-triggered actions, check this answer out.
For an example of how to inject events programmatically, check this link out.
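To make the injection part concrete: on Android 7.0 (API 24) and above, an accessibility service can inject gestures into whatever app is in the foreground. The sketch below is a minimal, hypothetical example (the class name and the hard-coded coordinates are mine, not from the linked answer); the service must also be declared in the manifest with the BIND_ACCESSIBILITY_SERVICE permission and android:canPerformGestures="true" in its configuration.

import android.accessibilityservice.AccessibilityService;
import android.accessibilityservice.GestureDescription;
import android.graphics.Path;
import android.view.accessibility.AccessibilityEvent;

// Hypothetical sketch: injects a global "scroll down" swipe into whatever
// app is in the foreground. Requires API 24+ and an accessibility service
// configuration with android:canPerformGestures="true".
public class VoiceCommandService extends AccessibilityService {

    @Override
    public void onAccessibilityEvent(AccessibilityEvent event) {
        // Hook your voice-command recognition into this service as needed.
    }

    @Override
    public void onInterrupt() {
    }

    private void scrollDown() {
        Path path = new Path();
        path.moveTo(500, 1200); // example start point, in screen pixels
        path.lineTo(500, 400);  // example end point: an upward swipe scrolls content down
        GestureDescription.StrokeDescription stroke =
                new GestureDescription.StrokeDescription(path, 0, 300); // 300 ms swipe
        dispatchGesture(new GestureDescription.Builder().addStroke(stroke).build(),
                null, null);
    }
}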
Related
I created a background service on Android, and I have two buttons which appear on top of the screen all the time. I want to use these two buttons to scroll down and scroll up. These two buttons should work in any kind of application, such as Instagram, Facebook, Twitter and so on. In other words, they should work in all applications that use scrolling.
I have searched the internet for a week but could not find any solution.
This is not possible, sorry. Something like this would require your Service to have access to the Views of other applications, and that would be a huge security breach, because you could read values out of them and so on.
You could achieve this with a custom button-code broadcast (so basically your buttons would act like physical buttons on the device), but this would most probably require system-level permissions and some level of cooperation with the OEMs.
The Android Activity class has a method called dispatchKeyEvent(), which lets you simulate key input (with some limitations), but this method is not present in the Service class.
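For illustration, a dispatchKeyEvent() call from inside your own Activity looks roughly like this sketch. Note again: it only reaches your own window, never another app's.

import android.view.KeyEvent;

// Inside your own Activity: simulate a DPAD-down key press. This is delivered
// only to this Activity's view hierarchy, not to other applications.
private void simulateDpadDown() {
    dispatchKeyEvent(new KeyEvent(KeyEvent.ACTION_DOWN, KeyEvent.KEYCODE_DPAD_DOWN));
    dispatchKeyEvent(new KeyEvent(KeyEvent.ACTION_UP, KeyEvent.KEYCODE_DPAD_DOWN));
}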
Sadly, this is not something you can do in Android. Typically you should not be able to touch views from a background service; the point of a background service is to do some work in it (for example, upload files to your web server or fetch some data). You CAN send a signal from a service once it has finished its work, to tell an app that something needs to happen, but the app needs to be specifically coded to respond to that broadcast event.
If you wanted to do this with an app that you developed yourself, it can be achieved by using the onReceive() method of a BroadcastReceiver; however, you cannot define the behaviour of other apps this way, as that would represent a security breach in Android.
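As a rough sketch of that cooperation pattern (the action string and class names here are made up for illustration):

import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;

// In your Service, once the background work is done (hypothetical action string):
//     sendBroadcast(new Intent("com.example.ACTION_WORK_FINISHED"));

// In the cooperating app, a receiver decides for itself how to react:
public class WorkFinishedReceiver extends BroadcastReceiver {
    @Override
    public void onReceive(Context context, Intent intent) {
        if ("com.example.ACTION_WORK_FINISHED".equals(intent.getAction())) {
            // Only this app's own code defines what happens next.
        }
    }
}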
I have been trying to design an accessibility application for Android for disabled people, like the Switch Control point mode in iOS 7. The biggest problem I am facing is how to make an activity that appears over all applications, takes switch events, and sends touch/tap events accordingly to whatever app it is running over. I have searched and found that you can have a view that runs on top of other applications using the system-overlay mode, but that does not let me send touch events. Please point me in the right direction. Thanks.
You don't want an Activity. You want to develop an AccessibilityService.
http://developer.android.com/training/accessibility/service.html
Accessibility services receive callbacks from the accessibility APIs, and are allowed to interact with, draw views on top of, and send events to the applications running on the device. Once you register yourself as an accessibility service, you have much more power over the way the OS works than you do within the confines of an application activity. You can even override touch events and send your own, though advanced gesture control is lost, as the gestures you send are limited to what the accessibility framework supports.
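A bare-bones service could look something like the sketch below (the class name is mine; the service must also be declared in the manifest with the android.permission.BIND_ACCESSIBILITY_SERVICE permission). Unlike raw gesture injection, this asks a node in the foreground app to perform a semantic action:

import android.accessibilityservice.AccessibilityService;
import android.view.accessibility.AccessibilityEvent;
import android.view.accessibility.AccessibilityNodeInfo;

// Hypothetical sketch of a switch-control style accessibility service.
public class SwitchControlService extends AccessibilityService {

    @Override
    public void onAccessibilityEvent(AccessibilityEvent event) {
        AccessibilityNodeInfo source = event.getSource();
        if (source != null) {
            // Ask the node from the foreground app to scroll itself,
            // whatever application it belongs to.
            source.performAction(AccessibilityNodeInfo.ACTION_SCROLL_FORWARD);
            source.recycle();
        }
    }

    @Override
    public void onInterrupt() {
        // Called when the system wants your feedback to stop.
    }
}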
I am trying to write a service that recognizes a gesture (long-click, then move up) and associates an event with it.
However, this service also needs to work when another application is in the foreground.
At the moment I have a service that displays a transparent LinearLayout.
I can capture the gesture just fine, BUT I cannot use the application underneath.
It is not possible to click through the layout. However, I would like that to be possible, with my service detecting only this gesture.
For example: Android KitKat now has an immersive full-screen mode, where swiping down from the edge brings back the system bars without the application underneath being affected.
Do you have any ideas?
Thanks.
However, I would like that to be possible, with my service detecting only this gesture.
That is not possible from an SDK app on Android 4.0+. Either you receive the touch events or the underlying app receives the touch events, not both; otherwise this would open the door to a security flaw known as tapjacking.
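You can see the trade-off directly in the overlay's window flags. A sketch, using the pre-Oreo overlay type: with FLAG_NOT_TOUCHABLE every touch goes through to the app underneath and your overlay sees nothing; remove the flag and your overlay consumes the touches instead.

import android.graphics.PixelFormat;
import android.view.WindowManager;

// Sketch of overlay window parameters (TYPE_SYSTEM_OVERLAY is the pre-Oreo type).
// FLAG_NOT_TOUCHABLE: touches pass through, and the overlay receives none of them.
// Without it: the overlay receives the touches, and the underlying app receives none.
WindowManager.LayoutParams params = new WindowManager.LayoutParams(
        WindowManager.LayoutParams.MATCH_PARENT,
        WindowManager.LayoutParams.MATCH_PARENT,
        WindowManager.LayoutParams.TYPE_SYSTEM_OVERLAY,
        WindowManager.LayoutParams.FLAG_NOT_TOUCHABLE,
        PixelFormat.TRANSLUCENT);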
As the title says, in my research I need to log the user's touch-screen events, mainly gestures (such as tap, scroll, swipe, etc.), while the user is using another app. Currently I can't figure out a good way of doing this. It seems that on Android 4.x the touch-screen events cannot be read out? Does anyone know a good method to capture these touch events?
I also spent quite some time on R&D to get global touch events inside an Android app.
Our requirement was to capture touch coordinates on the phone from inside an Android app.
We tried several approaches:
1- First we tried to put a transparent layout (like a layer of glass above the phone screen) covering the whole screen, receive the touch events through it, dynamically remove the glass so the event could pass through to whatever was below, and then dynamically insert the glass again.
But the drawback was that this requires two taps, which is not feasible (the first tap gives the touch coordinate and only the second touch passes below the glass).
2- Then we tried a special flag in the window manager, FLAG_WATCH_OUTSIDE_TOUCH, making the transparent layout 1 x 1 in size, but we always got the touch coordinates as (0,0), because the Android framework imposes security so that no process can get the touch points belonging to another process.
3- Then we tried to run the command "adb shell getevent" through code, but it did not deliver touch events (although using "adb shell sendevent" we were able to send global touch events, i.e. by passing coordinates we could dispatch an event to a particular point on the device screen).
4- By now we were quite sure that without rooting, no app can get global touch events that happen in another app's process. Android security will not allow this; otherwise any app could capture passwords or other user input data.
But... we had to do this (without rooting)...
5- So we used a shell script: we executed the getevent command from the command prompt, redirected the touch events to a file, and parsed that file into readable coordinates (a parsing sketch follows below).
So we cannot capture global touch events inside an app through code, but we can capture them at the OS level by executing the getevent command from an adb shell.
We can then pass them to the Android app by storing them in a file on the phone's external storage.
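To give an idea of step 5, here is a rough sketch of parsing one raw getevent line into coordinates. Event codes 0035 and 0036 are ABS_MT_POSITION_X and ABS_MT_POSITION_Y; note the raw values are in device units and usually need to be scaled to screen pixels.

// Parses one raw line of getevent output, e.g.
//   /dev/input/event2: 0003 0035 000001e5
static int[] lastXY = new int[2];

static void parseGeteventLine(String line) {
    String[] parts = line.trim().split("\\s+");
    if (parts.length < 4) return;
    int code = Integer.parseInt(parts[2], 16);
    int value = Integer.parseInt(parts[3], 16);
    if (code == 0x35) {          // ABS_MT_POSITION_X
        lastXY[0] = value;
    } else if (code == 0x36) {   // ABS_MT_POSITION_Y
        lastXY[1] = value;
    }
}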
Hope this will help you all.
Thanks,
This is not possible due to security and privacy reasons (as Sir SC mentioned) unless you're rooted.
I have tried multiple things (I even asked a question on Stack Overflow: Android - Inactivity/Activity regardless of top app). I came to the conclusion that using an "Accessibility Service" is the closest we can come to knowing when a user has touched the screen. This isn't foolproof, however: you will not get an event for every screen touch (scrolling in Chrome didn't yield any events).
With that said, if your application can rely on a rooted device, then it's possible to listen to the incoming lines from getevent (https://source.android.com/devices/tech/input/getevent.html). These lines simply give details of touch (and other) events. But this requires root access, so it might not be an acceptable solution.
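For the rooted case, the reading side could be as simple as this sketch (it assumes a working su binary on the device):

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

// Root-only sketch: stream live input events through su and read them line by line.
static void streamTouchEvents() throws IOException {
    Process process = Runtime.getRuntime().exec(new String[]{"su", "-c", "getevent -l"});
    BufferedReader reader =
            new BufferedReader(new InputStreamReader(process.getInputStream()));
    String line;
    while ((line = reader.readLine()) != null) {
        // e.g. "/dev/input/event2: EV_ABS ABS_MT_POSITION_X 000001e5"
    }
}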
I think you may already have found this one, but it "may" be the only library that can help you here, since you can't interact between activities directly, for security and privacy reasons:
GitHub android-global-touchevent
Not sure whether to post this as a comment or an answer, but I suspect that this should not be possible due to security concerns. If you could monitor activity in another application like that, without it being obvious to the user, you could inappropriately monitor password entries into other applications. For that reason I wouldn't be surprised if that sort of activity is not supported.
As amarnathpatel said, it's almost impossible to get touch events without root. But there is a workaround through which we can get the events, though not their coordinates. It is done using FLAG_WATCH_OUTSIDE_TOUCH.
check this out.
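In code, the workaround looks roughly like this (overlayView is assumed to be a tiny view already added to the WindowManager with FLAG_WATCH_OUTSIDE_TOUCH):

import android.view.MotionEvent;
import android.view.View;

// Touches that land outside the overlay arrive as ACTION_OUTSIDE, but the
// framework strips the coordinates, so you learn THAT a touch happened,
// not WHERE it happened.
overlayView.setOnTouchListener(new View.OnTouchListener() {
    @Override
    public boolean onTouch(View v, MotionEvent event) {
        if (event.getAction() == MotionEvent.ACTION_OUTSIDE) {
            // A touch occurred somewhere else on the screen.
        }
        return false;
    }
});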
I'm trying to figure out the right way to add Chromecast buttons (pause, play, etc.) to an Android notification. I've set up a custom notification that sends PendingIntents to a ChromecastService. That service tries to interact with a class I built called ChromecastAdapter. The ChromecastAdapter implements MediaRouteAdapter and contains all the listeners and state that go along with casting. However, all this state is gone as soon as I exit the application, so my ChromecastService no longer has access to the Chromecast once my app is gone.
It seems to me that the only way to get this to work is to refactor all the Chromecast state into a Service that implements MediaRouteAdapter. I really don't want to do this, since I'm pretty happy with the way things are now.
Since these interactive notifications are required by Google, I feel like there has to be a standard way of interacting with a cast session from a notification. Am I on the right track here? Do I have to place all my Chromecast interactions behind a Service?
What the behavior should be depends on the type of app and its requirements. If your app is "gone" (in the sense that the Application instance is gone), the question to ask yourself is whether you want a notification mechanism to stay around at all. For some apps, when they are killed the receiver also closes and the user is sent back to the home screen on the Chromecast device; in that case there is no reason to keep a notification around.
On the other hand, there are apps where, based on their requirements, you would want the cast device to continue what it was doing (for example, keep playing the video) even after the mobile app is gone. In those cases, you may want a notification mechanism in place for "bringing up" the app. To achieve that, you need to maintain a certain amount of information/state/objects in a service: enough to establish a connection again and "join" the running app. In addition, your "service" needs to be aware of the status of the app on your receiver, so that if that app is killed (say, someone else starts casting a different app to the device), it can be notified and exit.
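As a rough sketch of that setup (reusing the question's ChromecastService name; the action string is made up), the Service that holds the cast state also owns the notification, so its PendingIntents survive the Activity:

import android.app.Notification;
import android.app.PendingIntent;
import android.content.Intent;

// Inside ChromecastService: the Service owns both the cast state and the
// notification, so the PendingIntents keep working after the Activity dies.
// "ACTION_PAUSE" is a hypothetical action string for illustration.
private void showCastNotification() {
    Intent pauseIntent = new Intent(this, ChromecastService.class).setAction("ACTION_PAUSE");
    PendingIntent pausePending =
            PendingIntent.getService(this, 0, pauseIntent, PendingIntent.FLAG_UPDATE_CURRENT);

    Notification notification = new Notification.Builder(this)
            .setSmallIcon(android.R.drawable.ic_media_pause)
            .setContentTitle("Casting")
            .addAction(android.R.drawable.ic_media_pause, "Pause", pausePending)
            .build();
    startForeground(1, notification); // keeps the Service, and the cast state it holds, alive
}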