Android: user presses a key

Is there a way to register a receiver for an app running in the background, for when a user presses a key? Kind of like "ACTION_USER_PRESENT", but fired whenever any key is pressed on the screen.
MORE DETAIL: My app is running as a service in the background. The user opens the phone and presses keys, as if they were searching for something online on their droid. Can I capture those key presses in the background?

To detect whether a user is using the device, you could also use whether the screen is on or off as an approximation (assuming that a screen timeout is set). This blog entry shows how to capture the screen-on and screen-off events (I haven't done it myself, though).
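Just for illustration, a minimal sketch of such a receiver; the class name is a placeholder, and note that ACTION_SCREEN_ON/ACTION_SCREEN_OFF must be registered at runtime from your service, not in the manifest:
import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.content.IntentFilter;
import android.util.Log;

// These actions are only delivered to receivers registered in code, not in the manifest.
public class ScreenReceiver extends BroadcastReceiver {
    @Override
    public void onReceive(Context context, Intent intent) {
        if (Intent.ACTION_SCREEN_ON.equals(intent.getAction())) {
            Log.d("ScreenReceiver", "screen on - user is probably active");
        } else if (Intent.ACTION_SCREEN_OFF.equals(intent.getAction())) {
            Log.d("ScreenReceiver", "screen off");
        }
    }
}

// In your Service.onCreate():
IntentFilter filter = new IntentFilter(Intent.ACTION_SCREEN_ON);
filter.addAction(Intent.ACTION_SCREEN_OFF);
registerReceiver(new ScreenReceiver(), filter);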

In the Android HCI Extractor ( http://code.google.com/p/android-hci-extractor/ ) we traverse the GUI and install event filters (using listeners) on the app's top view.
Maybe if you can reach a top-level view from which to listen for events, you could capture all the events that pass through that view. Let's try ;-)
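A rough sketch of that idea, assuming you control the activity (the class name is a placeholder); this only sees events for your own window, never for other apps:
import android.app.Activity;
import android.util.Log;
import android.view.MotionEvent;

public class TracedActivity extends Activity {
    @Override
    public boolean dispatchTouchEvent(MotionEvent ev) {
        // Every touch event for this window passes through here,
        // no matter which child view eventually consumes it.
        Log.d("TracedActivity", "touch at " + ev.getX() + ", " + ev.getY());
        return super.dispatchTouchEvent(ev);
    }
}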
This tool is an open-source prototype you can find here: http://code.google.com/p/android-hci-extractor/
It is very easy to integrate and use. In the tutorials you can see that only a few lines of code are needed: http://www.catedrasaes.org/trac/wiki/MIM
I hope it helps you!!

Related

Android: click or scroll on behalf of user?

Is that possible?
I'm developing a service for disabled people. They can define voice commands, and the service detects the commands and executes them. For example, when the user says "scroll down", the service (which is in fact a background process) takes control of the screen and scrolls down (regardless of what application is in the foreground), or touches a specific position, and so forth. I wonder if this is possible on an Android device? If not, what about a rooted device (i.e. the service has root permissions)? I know that getting voice input and processing it is possible. My question is about performing actions like a touch (ACTION_DOWN) or a scroll on the user interface on behalf of the user.
Note that I don't have access to whatever application is running! In fact, my service doesn't know which application is running in the foreground. It might be a social media app, a messaging app, a game, or anything else! So my service must be capable of generating input events like touch, swipe, scroll, etc.
Thanks in advance!
Yes, that is possible.
For example, ListView has two methods for programmatic scroll:
listView.setSelection(id); //For instant scroll
listView.smoothScrollToPosition(id); // For smooth scroll
For an example of how to use voice-triggered actions, check this answer out.
For an example of how to inject events programmatically, check this link out.
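For the injection part, a hedged sketch using Instrumentation: on a stock device this only works against your own windows, and injecting into other apps requires the signature-level INJECT_EVENTS permission or root. The coordinates are placeholders, and the calls must not run on the UI thread:
import android.app.Instrumentation;
import android.os.SystemClock;
import android.view.MotionEvent;

// Run this on a background thread, not the UI thread.
Instrumentation inst = new Instrumentation();
long now = SystemClock.uptimeMillis();
float x = 300f, y = 800f; // placeholder screen coordinates
inst.sendPointerSync(MotionEvent.obtain(now, now, MotionEvent.ACTION_DOWN, x, y, 0));
inst.sendPointerSync(MotionEvent.obtain(now, now, MotionEvent.ACTION_UP, x, y, 0));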

Android . Activating my application when some string was entered in some other application

I am trying to create a behavior where entering some string in any application will open my application.
I've tried looking around for how to listen for keyboard presses or text changes, but I couldn't find the behavior I need, and I don't want to create a custom keyboard for this.
If this is not possible, what would be a good way to launch my application as fast as possible while in the other application?
Answered before the "don't want a custom keyboard" requirement was added:
The only viable way that I can think of is if the user were using a custom keyboard written by you. Custom keyboards can and do act as key-loggers, and could therefore detect any key combination or written word and allow you to execute your code when your conditions are met.
Refer to the Creating an Input Method docs.
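To sketch the idea (not a complete keyboard): the input method itself commits every character, so it can keep a buffer and react to a trigger string. The trigger string and MyLauncherActivity below are placeholders:
import android.content.Intent;
import android.inputmethodservice.InputMethodService;

public class TriggerIme extends InputMethodService {
    private final StringBuilder typed = new StringBuilder();

    // Call this from your keyboard view whenever a character key is pressed.
    void onCharacterKey(char c) {
        getCurrentInputConnection().commitText(String.valueOf(c), 1);
        typed.append(c);
        if (typed.toString().endsWith("launch my app")) { // placeholder trigger string
            Intent i = new Intent(this, MyLauncherActivity.class); // placeholder activity
            i.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
            startActivity(i);
            typed.setLength(0);
        }
    }
}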
How to open my app as fast as possible from another application?
Press home, launch your app by clicking on the launcher icon
But assuming you mean without doing that, you'll still need to monitor some event, say the volume keys being pressed or the device being shaken (a rough shake-detection sketch follows the related questions below), or have your app already running in the foreground, such as what Facebook Messenger does (or used to do, I don't know).
Related questions:
What APIs in Android is Facebook using to create Chat Heads?
Listen to volume buttons in background service?
How to detect shake event with android?
Demo of bubbles
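For the shake option mentioned above, a rough sketch using the accelerometer from a background service; the threshold is a guess you would need to tune per device:
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

public class ShakeListener implements SensorEventListener {
    private static final float SHAKE_THRESHOLD = 15f; // acceleration beyond gravity, in m/s^2; tune as needed

    public void start(SensorManager sm) {
        sm.registerListener(this, sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),
                SensorManager.SENSOR_DELAY_NORMAL);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        float x = event.values[0], y = event.values[1], z = event.values[2];
        double magnitude = Math.sqrt(x * x + y * y + z * z) - SensorManager.GRAVITY_EARTH;
        if (magnitude > SHAKE_THRESHOLD) {
            // The device was shaken: launch your app here.
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}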

How to create an Android app that can capture the global touch screen events when user is playing with another app?

As the title says, in my research I need to log user touch-screen events, mainly gestures (such as tap, scroll, swipe, etc.), while the user is using another app. Currently I can't figure out a good way of doing it. It seems that on Android 4.x the touch-screen events cannot be read? Does anyone know any good methods to capture the touch events?
I was also doing R&D for quite some time on getting global touch events inside an Android app.
Actually, our requirement was to capture touch coordinates on the phone from inside an Android app.
We tried several approaches:
1- First we tried to put a transparent layout (like a layer of glass above the phone screen) that covers the phone screen, receive the touch events on it, dynamically remove the glass so the event could be passed below it, and then dynamically insert the glass again.
But the drawback was that it required two taps, which is not feasible (the first tap would give the touch coordinates and the second would be passed below the glass).
2- Then we tried the special window-manager flag FLAG_WATCH_OUTSIDE_TOUCH with the transparent layout sized 1 x 1, but we got the touch coordinates as (0,0), because the Android framework imposes the security rule that no process can get touch points that belong to another process.
3- Then we tried to run the command "adb shell getevent" through code, but it did not deliver touch events (although by using "adb shell sendevent" we were able to send global touch events, i.e. by passing a coordinate we could inject an event at a particular position on the device screen).
4- Now we were quite sure that without rooting, no app can get global touch events that happen in some other app's process. Android security will not allow this; otherwise any app could capture passwords or other user input data.
But.... we have to do this (without rooting) ...
5- Finally we used a shell script, executed the getevent command from the command prompt, redirected the touch events to a file, and parsed that file to get readable coordinates.
So we cannot capture global touch events inside an app through code, but we can capture them at the OS level by executing the getevent command from the adb shell.
We can then pass them to the Android app by storing them in external storage on the phone.
Hope this helps you all.
Thanks,
This is not possible due to security and privacy reasons (as Sir SC mentioned) unless you're rooted.
I have tried multiple things (I even asked a question on Stack Overflow: Android - Inactivity/Activity regardless of top app). I came to the conclusion that using an "Accessibility Service" is the closest we can come to knowing when a user has touched the screen. This isn't foolproof, however: you will not get an event for every screen touch (scrolling in Chrome didn't yield any events).
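A minimal sketch of that accessibility-service approach (the service also has to be declared in the manifest with the BIND_ACCESSIBILITY_SERVICE permission and enabled by the user under Settings > Accessibility; as noted above, not every touch produces an event):
import android.accessibilityservice.AccessibilityService;
import android.util.Log;
import android.view.accessibility.AccessibilityEvent;

public class TouchWatcherService extends AccessibilityService {
    @Override
    public void onAccessibilityEvent(AccessibilityEvent event) {
        int type = event.getEventType();
        if (type == AccessibilityEvent.TYPE_VIEW_CLICKED
                || type == AccessibilityEvent.TYPE_VIEW_SCROLLED) {
            Log.d("TouchWatcher", "user interaction in " + event.getPackageName());
        }
    }

    @Override
    public void onInterrupt() { }
}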
With that said, if your application can rely on a rooted solution then it's possible to listen to incoming lines from getevent (https://source.android.com/devices/tech/input/getevent.html). These lines simply give details of touch (and other) events. But this requires root access so it might not be an acceptable solution.
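If root is acceptable, a rough sketch of reading those lines from within the app; the input device node (/dev/input/event2 here) and the event codes differ from phone to phone, so treat this as illustrative only:
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

void dumpTouchEvents() throws IOException {
    // Requires root: stream raw kernel input events.
    Process p = Runtime.getRuntime().exec(new String[] { "su", "-c", "getevent -lt /dev/input/event2" });
    BufferedReader in = new BufferedReader(new InputStreamReader(p.getInputStream()));
    String line;
    while ((line = in.readLine()) != null) {
        // Lines look roughly like: [ 1234.567] EV_ABS ABS_MT_POSITION_X 000001f4
        if (line.contains("ABS_MT_POSITION_X") || line.contains("ABS_MT_POSITION_Y")) {
            String[] parts = line.trim().split("\\s+");
            int raw = Integer.parseInt(parts[parts.length - 1], 16); // raw coordinate; map to screen pixels yourself
        }
    }
}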
I think you may already have found this one, but it "may" be the only library that can help you with this, since you can't interact between activities directly because of security and privacy reasons:
GitHub android-global-touchevent
Not sure whether to post this as a comment or an answer, but I suspect that this should not be possible due to security concerns. If you could monitor activity in another application like that without it being obvious to the user, you could inappropriately monitor password entries in other applications. For that reason I wouldn't be surprised if that sort of activity is not supported.
As amarnathpatel said, it's almost impossible to get touch events without root. But there is a workaround through which we can get the events, though not the coordinates: it is done using FLAG_WATCH_OUTSIDE_TOUCH.
Check this out.
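A sketch of that workaround from inside a service: a 1 x 1 overlay with FLAG_WATCH_OUTSIDE_TOUCH receives an ACTION_OUTSIDE event whenever the user touches anywhere else, but (as explained above) the coordinates are reported as (0,0). This assumes the SYSTEM_ALERT_WINDOW ("draw over other apps") permission; on API 26+ the window type would have to be TYPE_APPLICATION_OVERLAY instead:
import android.graphics.PixelFormat;
import android.view.Gravity;
import android.view.MotionEvent;
import android.view.View;
import android.view.WindowManager;

// Inside a Service (requires the SYSTEM_ALERT_WINDOW permission):
WindowManager wm = (WindowManager) getSystemService(WINDOW_SERVICE);
View probe = new View(this);
WindowManager.LayoutParams lp = new WindowManager.LayoutParams(
        1, 1,                                           // 1 x 1 pixel window
        WindowManager.LayoutParams.TYPE_PHONE,          // TYPE_APPLICATION_OVERLAY on API 26+
        WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE
                | WindowManager.LayoutParams.FLAG_NOT_TOUCH_MODAL
                | WindowManager.LayoutParams.FLAG_WATCH_OUTSIDE_TOUCH,
        PixelFormat.TRANSLUCENT);
lp.gravity = Gravity.TOP | Gravity.LEFT;
probe.setOnTouchListener(new View.OnTouchListener() {
    @Override
    public boolean onTouch(View v, MotionEvent event) {
        if (event.getAction() == MotionEvent.ACTION_OUTSIDE) {
            // The user touched somewhere outside our window; the coordinates are withheld.
        }
        return false;
    }
});
wm.addView(probe, lp);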

Check volume button usage when screen is off

For this question I'm going to quote another user who got no response to their question:
I've written an Android app that uses the hardware volume buttons for another purpose.
It works fine if the app is running and visible, but when I turn the screen off or let it time out, the button clicks don't get into my handlers.
Does anyone know if there is a way to detect these button clicks when the screen is off?
Source: AV695's question
I'm working on an app myself that makes use of the volume buttons, but as this user also noted, the normal approach of checking buttons with onKeyDown stops working once the screen is off. This is because the Activity gets paused when the screen turns off.
Is there a way to keep the activity running while the screen is off, or to check for usage of the volume buttons when the screen is off? I tried using a Service for this before, but it's impossible to check for the volume keys like that, as noted by CommonsWare.
I doubt that this is supported (without resorting to a battery-draining wake lock) at the platform, kernel, or underlying radio firmware level without modifications to the last of these to bring volume presses during sleep to the attention of the kernel.
Within the realm of system-ROM modifications, a more reasonable approach might be to modify an existing open-source ROM for the device and insert some custom platform-level code into the handling of the power button (the one usually used to wake up the device before unlocking it); that, at least, we know does get the attention of the kernel. That code could then inform the user by sound or vibration if there are unacknowledged notifications.
You could optionally wait briefly, check device orientation, or look for another key press to avoid doing this in an annoying way when the user is holding the device outside their pocket and trying to unlock it.
Or you could skip the volume key entirely and just set a timer to wake up every 15 minutes and vibrate if there are unacknowledged notifications, avoiding the need to fumble in one's pockets.
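A rough sketch of that timer idea with AlarmManager; the receiver class is a placeholder, and deciding what counts as an unacknowledged notification is up to you:
import android.app.AlarmManager;
import android.app.PendingIntent;
import android.content.Context;
import android.content.Intent;
import android.os.SystemClock;

// "context" is your Service or Activity. Schedule a check roughly every 15 minutes (inexact, to save battery).
AlarmManager am = (AlarmManager) context.getSystemService(Context.ALARM_SERVICE);
PendingIntent pi = PendingIntent.getBroadcast(context, 0,
        new Intent(context, NotificationCheckReceiver.class), 0); // placeholder receiver
am.setInexactRepeating(AlarmManager.ELAPSED_REALTIME_WAKEUP,
        SystemClock.elapsedRealtime() + AlarmManager.INTERVAL_FIFTEEN_MINUTES,
        AlarmManager.INTERVAL_FIFTEEN_MINUTES, pi);
// In NotificationCheckReceiver.onReceive(): vibrate if there are unacknowledged notifications.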
You mention it's a custom request: if that implies it's one-off or low-volume, another option to consider is that a few vendors have "Bluetooth watches" out with an SDK that lets you push notifications from an Android device.
If you can capture the notification when it's generated, you could push it to the user's wrist, and then let the phone go back to sleep.
You cannot intercept the key while your application is in the background. But instead of listening for the key press itself, you can register a ContentObserver, as described in this question.
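A sketch of that ContentObserver idea from a background service: it observes the system settings (where stream volumes are stored) and fires when the user changes the volume with the hardware keys. Note it only detects an actual volume change, so a key press at minimum or maximum volume goes unnoticed:
import android.database.ContentObserver;
import android.media.AudioManager;
import android.os.Handler;
import android.provider.Settings;

// In your Service.onCreate():
final AudioManager audio = (AudioManager) getSystemService(AUDIO_SERVICE);
ContentObserver volumeObserver = new ContentObserver(new Handler()) {
    int lastVolume = audio.getStreamVolume(AudioManager.STREAM_MUSIC);

    @Override
    public void onChange(boolean selfChange) {
        int current = audio.getStreamVolume(AudioManager.STREAM_MUSIC);
        if (current != lastVolume) {
            lastVolume = current;
            // A volume key was (very likely) pressed.
        }
    }
};
getContentResolver().registerContentObserver(
        Settings.System.CONTENT_URI, true, volumeObserver);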
As Chris Stratton mentioned, the only way to keep your App alive is by using battery-draining wake locks.
However, since I found myself in the same situation, I came up with another solution. Unfortunately, you'll need a rooted device as well as the Xposed framework.
With Xposed, which replaces the zygote process so you can hook into any constructor or method of the system, you will be able to catch the raw KeyEvents before the system handles them.
This is done in PhoneWindowManager.interceptKeyBeforeQueueing(). By using an XC_MethodHook, you can use beforeHookedMethod() on the aforementioned method to catch every hardware button event, even if the device is in deep sleep.
After catching the events you are interested in, you can create a temporary wake lock to do your work, but don't forget to release the wake lock once you have finished.
A good example of how to accomplish this is the Xposed Torch Module.
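A heavily hedged sketch of that hook: the PhoneWindowManager package and the interceptKeyBeforeQueueing signature differ between Android versions, so the class name and parameter list below are only one possible variant and must be adjusted for the target ROM:
import android.view.KeyEvent;
import de.robv.android.xposed.IXposedHookLoadPackage;
import de.robv.android.xposed.XC_MethodHook;
import de.robv.android.xposed.XposedHelpers;
import de.robv.android.xposed.callbacks.XC_LoadPackage.LoadPackageParam;

public class KeyHook implements IXposedHookLoadPackage {
    @Override
    public void handleLoadPackage(LoadPackageParam lpparam) throws Throwable {
        if (!"android".equals(lpparam.packageName)) return; // hook the system process only
        XposedHelpers.findAndHookMethod(
                "com.android.internal.policy.impl.PhoneWindowManager", // package varies by Android version
                lpparam.classLoader,
                "interceptKeyBeforeQueueing",
                KeyEvent.class, int.class, boolean.class,              // signature varies by Android version
                new XC_MethodHook() {
                    @Override
                    protected void beforeHookedMethod(MethodHookParam param) throws Throwable {
                        KeyEvent event = (KeyEvent) param.args[0];
                        // Inspect the event here, even while the screen is off.
                    }
                });
    }
}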
If you, however, rely on a non-rooted system, the bad news is that it's simply not possible without draining the battery...
I was also trying to implement volume button press detection in my app and I left that part to be developed later once the core part is done. I was able to detect volume key press while screen is on even when phone is locked, from a background service.
Background Video Recorder 2 (BVR2) (and possibly BVR1 as well, I did not try) is one of the apps that can detect volume key presses even when the screen is off. While trying to implement volume key detection with the screen off in my app, I installed BVR2, hoping to find out how it works. To my surprise, it gave my app the ability to detect volume keys even when the screen is off. My app had a ContentObserver to monitor volume changes, but it was not working when the screen was off. When BVR2 is active, my app could also detect volume key presses with the screen off. Still digging.
But BVR2 has its own trigger action, namely recording video, an action you may not want to occur just for the sake of your application detecting volume key presses.
Another app is QuickClick. This app can give your app what it lacks: the power to detect volume key presses even when the screen is off, without extra unwanted actions. Just install QuickClick and do not configure any action. Create a ContentObserver to monitor stream volume changes and you are ready. Your app will now be able to detect volume key presses even when the screen is off.
Please note that my app runs as a background service.
Both of the apps mentioned above are meant for other uses, but they use volume key detection to perform their actions. I am in no way connected to either of the apps mentioned.
If these apps, and possibly dozens of others, can detect volume key presses, it can be done. I ask the experts to find out how, so that we can implement it in our own apps without relying on another app.
If you find this answer useful, please up-vote.
I am not sure if it is as simple as this, but check this Android blog post:
Allowing applications to play nice(r) with each other: Handling remote control buttons
It explains the usage of a broadcast receiver that receives the up/down volume controls and other music controls.
In summary, you should use registerMediaButtonEventReceiver.
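A sketch of that approach as described in the blog post (registerMediaButtonEventReceiver is deprecated on newer API levels in favour of MediaSession, but it matches the era of the post); the receiver class name is a placeholder:
import android.content.BroadcastReceiver;
import android.content.ComponentName;
import android.content.Context;
import android.content.Intent;
import android.media.AudioManager;
import android.view.KeyEvent;

// Manifest: declare this receiver with an intent-filter for android.intent.action.MEDIA_BUTTON.
public class MediaButtonReceiver extends BroadcastReceiver {
    @Override
    public void onReceive(Context context, Intent intent) {
        KeyEvent key = intent.getParcelableExtra(Intent.EXTRA_KEY_EVENT);
        // key.getKeyCode() will be e.g. KEYCODE_MEDIA_PLAY_PAUSE, KEYCODE_MEDIA_NEXT, ...
    }
}

// Somewhere in your app or service ("context" is your Context):
AudioManager am = (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);
am.registerMediaButtonEventReceiver(new ComponentName(context, MediaButtonReceiver.class));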

What is the proper way to reference a user interaction on Android?

I'm currently in the process of writing documentation for an app, and was curious about the proper way to refer to a user interaction on screen.
i.e.: To advance to the settings screen, tap/touch/click the settings icon.
Since Android is available on so many form-factors, including TV, is it 'tap' or 'touch' or 'click' or something else entirely that maybe encompasses everything? I've checked some other app docs and they all vary.
Thanks in advance.
The SDK documentation (agreed, this is for developers, not end-users) seems to use the word touch.
See, for example, the Handling UI Events section, in which you'll find (quoting):
This is called when the user either touches the item (when in touch mode), or focuses upon the item with the navigation-keys or trackball and presses the suitable "enter" key or presses down on the trackball.
Or:
For a touch-capable device, once the user touches the screen, the device will enter touch mode.
