Simulate user input - android

Is it possible to simulate user input in Android?
For example, could some service or background thread simulate user input, so that if we are on the home screen and a simulated click lands on some icon, that application starts, or the clock opens if the simulated click points to the clock area?
Or, if some EditText has focus and simulated keystrokes are sent, the EditText should be filled with that text.
I am sure this can be done in .NET or Java, but I do not know whether it is possible on Android.

From the documentation on building accessibility services:
Starting with Android 4.0 (API Level 14), accessibility services can act on behalf of users, including changing the input focus and selecting (activating) user interface elements. In Android 4.1 (API Level 16) the range of actions has been expanded to include scrolling lists and interacting with text fields. Accessibility services can also take global actions, such as navigating to the Home screen, pressing the Back button, opening the notifications screen and recent applications list.
(...)
In order to take actions on behalf of users, your accessibility service must register to receive events from a few or many applications and request permission to view the content of applications by setting the android:canRetrieveWindowContent to true in the service configuration file.
See the linked documentation for details.
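For a rough idea of what "acting on behalf of users" looks like in code, here is a minimal sketch of such a service. The class name is invented for the example; the service still has to be declared in the manifest with the BIND_ACCESSIBILITY_SERVICE permission and enabled by the user under Settings -> Accessibility.

import android.accessibilityservice.AccessibilityService;
import android.view.accessibility.AccessibilityEvent;
import android.view.accessibility.AccessibilityNodeInfo;

public class InputActionService extends AccessibilityService {

    @Override
    public void onAccessibilityEvent(AccessibilityEvent event) {
        // Example: activate (click) the node that raised the event, if it is clickable.
        AccessibilityNodeInfo source = event.getSource();
        if (source != null && source.isClickable()) {
            source.performAction(AccessibilityNodeInfo.ACTION_CLICK);
            source.recycle();
        }
    }

    @Override
    public void onInterrupt() {
        // Required override; nothing to clean up in this sketch.
    }

    // Global actions such as navigating to the Home screen (API 16+).
    private void goHome() {
        performGlobalAction(GLOBAL_ACTION_HOME);
    }
}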

Is it possible to simulate user input in Android?
Only from a unit test suite. In other words, not in the way that you are thinking, as it would be a massive security hole.
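If a test suite is enough for your use case, something like the following sketch works from an instrumentation test; note that it can only inject events into the app under test, not into other apps or the home screen.

import android.app.Instrumentation;
import android.view.KeyEvent;
import androidx.test.platform.app.InstrumentationRegistry;
import org.junit.Test;

public class SimulatedInputTest {

    @Test
    public void typeIntoFocusedField() {
        Instrumentation instrumentation = InstrumentationRegistry.getInstrumentation();
        // Injects real key events, but only into windows owned by the instrumented app.
        instrumentation.sendStringSync("hello");
        instrumentation.sendKeyDownUpSync(KeyEvent.KEYCODE_ENTER);
    }
}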

Related

Android Q background restrictions

I need to show an activity when a push is received, but I am getting a "Background activity start from package-name blocked" system toast.
This is an authentication activity where the user needs to perform some task. I do not manage phone or NFC interaction, so I don't need to start a 'special' foreground service, but showing a notification is not enough - I need that activity.
The SYSTEM_ALERT_WINDOW permission doesn't help.
So, should I re-implement all my flows to work only with notifications? Is there any way to start an activity when the application is closed (no activity in the back stack)?
Android Q places restrictions on when apps can start activities. This behavior change helps minimize interruptions for the user and keeps the user more in control of what's shown on their screen. You can see the full document here.
As of Android Q Beta 4, this change has the following properties:
Affects your app if you launch activities without user interaction
Mitigate by using notification-triggered activities (a sketch of this approach follows this list)
Disable restrictions by turning on the Allow background activity starts developer option
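A rough sketch of the notification-triggered approach, assuming a notification channel with this id already exists: instead of calling startActivity() from the background, post a high-priority notification with a full-screen intent. AuthActivity and the channel id are placeholders for this example, and on API 29+ the USE_FULL_SCREEN_INTENT permission is also required in the manifest.

import android.app.Notification;
import android.app.NotificationManager;
import android.app.PendingIntent;
import android.content.Context;
import android.content.Intent;
import androidx.core.app.NotificationCompat;

public final class AuthNotifier {

    public static void show(Context context) {
        // AuthActivity is a placeholder for the authentication activity you want to show.
        Intent intent = new Intent(context, AuthActivity.class);
        PendingIntent fullScreen = PendingIntent.getActivity(
                context, 0, intent, PendingIntent.FLAG_UPDATE_CURRENT);

        Notification notification = new NotificationCompat.Builder(context, "auth_channel")
                .setSmallIcon(android.R.drawable.ic_dialog_info)
                .setContentTitle("Authentication required")
                .setPriority(NotificationCompat.PRIORITY_HIGH)
                .setCategory(NotificationCompat.CATEGORY_CALL)
                // Shown immediately if the screen is off or the device is locked;
                // otherwise the user gets a heads-up notification to tap.
                .setFullScreenIntent(fullScreen, true)
                .build();

        NotificationManager nm =
                (NotificationManager) context.getSystemService(Context.NOTIFICATION_SERVICE);
        nm.notify(1001, notification);
    }
}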

Android: click or scroll on behalf of user?

Is that possible?
I'm developing a service for disabled people. They define voice commands, and the service detects those commands and executes them. For example, when the user says "scroll down", the service (which is in fact a background process) takes control of the screen and scrolls down (regardless of what application is in the foreground), or touches a specific position, and so forth. I wonder if this is possible on an Android device? If not, what about a rooted device (i.e. the service has root permissions)? I know that getting voice input and processing it is possible. My question is about performing actions like touch (ACTION_DOWN) or scrolling the user interface on behalf of the user.
Note that I don't have access to whatever application is running! In fact, my service doesn't know which application is in the foreground. It might be a social media app, a messaging app, a game, or anything else! So my service must be capable of injecting input events like touch, swipe, scroll, etc.
Thanks in advance!
Yes, that is possible.
For example, ListView has two methods for programmatic scroll:
listView.setSelection(id); //For instant scroll
listView.smoothScrollToPosition(id); // For smooth scroll
For an example of how to use voice-triggered actions, check this answer out.
For an example of how to inject events programmatically, check this link out.
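Since API 24, an accessibility service can also inject arbitrary touch gestures via dispatchGesture(), which covers the "touch a specific position regardless of the foreground app" case. A rough sketch (class name and coordinates invented; the service config needs android:canPerformGestures="true"):

import android.accessibilityservice.AccessibilityService;
import android.accessibilityservice.GestureDescription;
import android.graphics.Path;
import android.view.accessibility.AccessibilityEvent;

public class VoiceCommandService extends AccessibilityService {

    // Injects a 50 ms tap at the given screen coordinates into whatever app is in the foreground.
    private void tap(float x, float y) {
        Path path = new Path();
        path.moveTo(x, y);
        GestureDescription gesture = new GestureDescription.Builder()
                .addStroke(new GestureDescription.StrokeDescription(path, 0, 50))
                .build();
        dispatchGesture(gesture, null, null);
    }

    @Override public void onAccessibilityEvent(AccessibilityEvent event) { }
    @Override public void onInterrupt() { }
}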

Android: Activating my application when some string was entered in some other application

I am trying to create a behavior where entering some string in any application will open my application.
I've tried looking into how to listen for key presses or text changes, but I couldn't find the required behavior, and I don't want to create a custom keyboard for this.
If this is not possible, what would be a good implementation for launching my application as fast as possible while in the other application?
Answered before the "don't want a custom keyboard" requirement was added
The only viable way that I can think of is if the user was using a custom keyboard written by you. Custom keyboards can and do act as key-loggers and therefore could detect any key combination or written word, allowing you to execute your code when your conditions are met.
Refer to the Creating an input method docs.
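For illustration only, a very stripped-down sketch of where such a keyboard could watch committed text and launch your app. TriggerKeyboard, onTextCommitted() and the trigger string are all invented for this example, and a real IME needs much more (a keyboard view, key handling, manifest declaration, etc.):

import android.content.Intent;
import android.inputmethodservice.InputMethodService;

public class TriggerKeyboard extends InputMethodService {

    private final StringBuilder typed = new StringBuilder();

    // Call this from your keyboard's key-handling code whenever text is committed.
    private void onTextCommitted(CharSequence text) {
        typed.append(text);
        if (typed.toString().contains("open my app")) {
            // MyActivity is a placeholder for the activity you want to launch.
            Intent intent = new Intent(this, MyActivity.class);
            intent.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK); // required when starting from a service context
            startActivity(intent);
            typed.setLength(0);
        }
    }
}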
How to open my app as fast as possible from another application?
Press home, launch your app by clicking on the launcher icon
But assuming you mean without doing that, you'll still need to monitor some event, say the volume keys being pressed or the device being shaken, or have your app already running in the foreground, as Facebook Messenger does (or used to do, I don't know); a shake-detection sketch follows the links below.
Related questions:
What APIs in Android is Facebook using to create Chat Heads?
Listen to volume buttons in background service?
How to detect shake event with android?
Demo of bubbles
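A loose sketch of the "device being shaken" trigger, using the accelerometer; the helper class and the threshold value are made up for this example.

import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

public class ShakeListener implements SensorEventListener {

    private static final float SHAKE_THRESHOLD_G = 2.5f; // roughly 2.5x gravity, tune to taste
    private final Runnable onShake;

    public ShakeListener(Runnable onShake) {
        this.onShake = onShake;
    }

    public void register(Context context) {
        SensorManager sm = (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
        sm.registerListener(this,
                sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),
                SensorManager.SENSOR_DELAY_UI);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        // Normalize acceleration to units of g and fire when it exceeds the threshold.
        float gX = event.values[0] / SensorManager.GRAVITY_EARTH;
        float gY = event.values[1] / SensorManager.GRAVITY_EARTH;
        float gZ = event.values[2] / SensorManager.GRAVITY_EARTH;
        double gForce = Math.sqrt(gX * gX + gY * gY + gZ * gZ);
        if (gForce > SHAKE_THRESHOLD_G) {
            onShake.run();
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}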

Programmatically enter multi-window mode in Android N

Android N has a new feature - Multi-Window Mode. It enables two applications to be visible side by side (actually only one is active, the other is paused, but we can see both simultaneously).
I am looking for an API that I can call to have my application enter multi-window mode. I couldn't find much help in the Android N SDK docs. I am trying to have two activities of my app run side by side, but without the user having to perform manual steps.
MANUALLY ENTERING MULTI-WINDOW MODE
The user can switch into multi-window mode in the following ways:
If the user opens the Overview screen and performs a long press on an activity title, they can drag that activity to a highlighted portion of the screen to put the activity in multi-window mode.
If the user performs a long press on the Overview button, the device puts the current activity in multi-window mode, and opens the Overview screen to let the user choose another activity to share the screen.
The SDK for API 24 introduced a new constant to toggle split screen mode from an accessibility service:
https://developer.android.com/reference/android/accessibilityservice/AccessibilityService.html#GLOBAL_ACTION_TOGGLE_SPLIT_SCREEN
The constant can be passed to the following method:
https://developer.android.com/reference/android/accessibilityservice/AccessibilityService.html#performGlobalAction(int)
Google seems to have missed documenting the new constant in the performGlobalAction method. I still consider this an official API, since Google did not mark the constant as hidden.
You have to implement an accessibility service in your app and let the user manually enable the service in Settings -> Accessibility, so it might not be a viable option for all apps.
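Roughly, the service side looks like this (class name invented; as noted above, the user still has to enable the service under Settings -> Accessibility):

import android.accessibilityservice.AccessibilityService;
import android.view.accessibility.AccessibilityEvent;

public class SplitScreenService extends AccessibilityService {

    // Toggles split-screen mode, equivalent to the user's long press on the Overview button.
    public void enterSplitScreen() {
        performGlobalAction(GLOBAL_ACTION_TOGGLE_SPLIT_SCREEN); // constant added in API 24
    }

    @Override public void onAccessibilityEvent(AccessibilityEvent event) { }
    @Override public void onInterrupt() { }
}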
As of Android N, this is not supported.
The only supported way to enter multi-window mode is if the user manually triggers it.
I don't think that an API for what you want to do exists. You could try making your app show two fragments, one on each side of the screen, with a black bar in the middle, and make each fragment resize according to the X position of the bar. :)

Android user presses a key

Is there a way for an app running in the background to register a receiver for when the user presses a key? Something like "ACTION_USER_PRESENT", but for any key pressed on the screen.
MORE DETAIL: My app is running as a service in the background. The user opens the phone and presses keys, like they're searching for something online on their Droid. Can I capture those key presses in the background?
To detect whether a user is using the device, you could also use whether the screen is on or off as an approximation (assuming a screen timeout is set). This blog entry shows how to capture the screen on and off events (I haven't done it myself, though).
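A small sketch of that approximation; note the receiver has to be registered at runtime, since these actions are not delivered to manifest-declared receivers.

import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.content.IntentFilter;

public class ScreenStateReceiver extends BroadcastReceiver {

    public static void register(Context context) {
        IntentFilter filter = new IntentFilter(Intent.ACTION_SCREEN_ON);
        filter.addAction(Intent.ACTION_SCREEN_OFF);
        context.registerReceiver(new ScreenStateReceiver(), filter);
    }

    @Override
    public void onReceive(Context context, Intent intent) {
        if (Intent.ACTION_SCREEN_ON.equals(intent.getAction())) {
            // Screen turned on: the user is probably interacting with the device.
        } else if (Intent.ACTION_SCREEN_OFF.equals(intent.getAction())) {
            // Screen turned off: treat the device as idle.
        }
    }
}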
In the Android HCI Extractor ( http://code.google.com/p/android-hci-extractor/ ) we traverse the GUI and install some event filters (using listeners) on the app's top view.
Maybe if you can reach a top-level view from which to listen for events, you could listen to all the events for that view. Let's try ;-)
This tool is an open-source prototype. It is very easy to integrate and use; in the tutorials you can see that only a few lines of code are needed: http://www.catedrasaes.org/trac/wiki/MIM
I hope it helps you!
