Set mouse position in software - android

I am using an Android Stick (http://www.geekbuying.com/item/Uhost-2-Dual-Core-TV-Box-Mini-PC-Android-4-0-4-RK3066-Cortex-A9-1-6GHZ-1GB-RAM-4G-ROM-with-Bluetooth-WIFI-Skype-XBMC---Black-312467.html) for building an application. The application uses an attached USB webcam for some of its functionality. Additionally, I connect a mouse to this device, which the user can use to navigate through various pages in the application. A left/right movement of the mouse results in navigation to the previous/next page.
While the mouse works with the Android device, I additionally need to reset the mouse position to the center after every interaction with the user. Is it possible to set the mouse position in software on Android? I am using View.OnGenericMotionListener to capture the mouse movement.
Currently, I also need to perform a primary mouse button click to bring the mouse into focus inside the application. I want to remove this requirement, either by generating the primary mouse button click in software or by bringing the application into focus by some other means.
I have been unable to find any APIs to get the above to work. Any help would be greatly appreciated.
In case I need to write some sort of driver to get this working, any pointers in that direction would also be useful.
Any workarounds for this problem, while still using the mouse, would also be welcome.

Mouse events are managed by the system framework; you cannot control them from the Java side.
From an adb shell you can open the /dev/input/uevent device and write mouse events, including:
relative movement
click action
absolute position (you might want this)
However, you cannot do this from a normal application unless the device is rooted, or unless you use an adb shell to start a daemon service in the background that performs the event writing for your application.
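For illustration, a minimal Java sketch of that idea on a rooted device, shelling out to the stock sendevent tool. The device node (/dev/input/event2) and the use of su are assumptions; run getevent -p on your device first to find the correct node.

// Sketch (rooted device assumed): inject a relative mouse move via sendevent.
public final class MouseInjector {
    public static void injectRelativeMove(int dx, int dy) throws Exception {
        String dev = "/dev/input/event2";                        // assumed mouse node
        String cmd = "sendevent " + dev + " 2 0 " + dx + " && "  // EV_REL / REL_X
                   + "sendevent " + dev + " 2 1 " + dy + " && "  // EV_REL / REL_Y
                   + "sendevent " + dev + " 0 0 0";              // EV_SYN / SYN_REPORT
        Runtime.getRuntime().exec(new String[]{"su", "-c", cmd}).waitFor();
    }
}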

I additionally need to reset the mouse position to the center after every interaction with the user.
This is now possible with the pointer capture API in Android 8.0+ (released August 2017). Summary:
To request pointer capture, call the requestPointerCapture() method on the view.
Once the request to capture the pointer is successful, Android calls onPointerCaptureChange(true), and starts delivering the mouse events.
Your focused view can handle the events by performing one of the following tasks:
If you're using a custom view, override onCapturedPointerEvent(MotionEvent).
Otherwise, register an OnCapturedPointerListener.
The view in your app can release the pointer capture by calling releasePointerCapture().
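A minimal sketch of that flow, assuming a custom view that navigates pages from relative mouse movement (the class name and navigation logic here are made up):

import android.content.Context;
import android.view.MotionEvent;
import android.view.View;

public class MouseCaptureView extends View {
    public MouseCaptureView(Context context) {
        super(context);
        setFocusable(true);
    }

    @Override
    public void onWindowFocusChanged(boolean hasWindowFocus) {
        super.onWindowFocusChanged(hasWindowFocus);
        if (hasWindowFocus) {
            // Ask the system to route all mouse events to this view (API 26+).
            requestPointerCapture();
        }
    }

    @Override
    public boolean onCapturedPointerEvent(MotionEvent event) {
        // While captured there is no visible pointer, and getX()/getY()
        // report relative deltas rather than absolute screen coordinates,
        // so there is no cursor position that needs resetting.
        float dx = event.getX();
        if (dx < 0) {
            // navigate to the previous page
        } else if (dx > 0) {
            // navigate to the next page
        }
        return true;
    }
}

Call releasePointerCapture() on the view when the user is done, e.g. when leaving the screen.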

Related

Android: click or scroll on behalf of user?

Is that possible?
I'm developing a service for disabled people. They can define voice commands, and the service detects the commands and executes them. For example, when the user says "scroll down", the service (which is in fact a background process) takes control of the screen and scrolls down (regardless of what application is in the foreground), or touches a specific position, and so forth. I wonder if this is possible on an Android device? If not, what about a rooted device (i.e., where the service has root permissions)? I know that getting voice input and processing it is possible. My question is about performing actions like a touch (ACTION_DOWN) or scrolling the user interface on behalf of the user.
Note that I don't have access to whatever application is running! In fact, my service doesn't know which application is in the foreground. It might be a social media app, a messaging app, a game, or anything else! So my service must be capable of generating input events like touch, swipe, scroll, etc.
Thanks in advance!
Yes, that is possible.
For example, ListView has two methods for programmatic scroll:
listView.setSelection(id); // For instant scroll
listView.smoothScrollToPosition(id); // For smooth scroll
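As a rough illustration, a hypothetical voice-command handler could map recognized phrases onto those two calls (the command strings and the step size of 5 items are arbitrary):

// Hypothetical handler: maps a recognized voice command to a ListView scroll.
private void onVoiceCommand(String command, ListView listView) {
    int current = listView.getFirstVisiblePosition();
    switch (command) {
        case "scroll down":
            listView.smoothScrollToPosition(current + 5);   // smooth scroll forward
            break;
        case "scroll up":
            listView.smoothScrollToPosition(Math.max(0, current - 5));
            break;
        case "go to top":
            listView.setSelection(0);                        // instant jump
            break;
    }
}

Note that this only scrolls views inside your own app; acting on another app's UI is what the event-injection link below addresses.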
For an example of how to use voice triggered actions - check this answer out
For an example of how to inject events programmatically - check this link out

How to monitor mouse events when the mouse moves off the Android device screen?

My Android app needs to track external mouse movement (either a Bluetooth mouse or a USB mouse).
I wrote a customized View to monitor the mouse events, but the events stop being produced when the mouse reaches the edge of the screen. I understand that in most cases there is no need to track movement that occurs outside the device screen, but I do need to monitor that kind of mouse movement.
I guess it can be monitored through a low-level API, but can anyone point me in the right direction? Thanks.
Well... consider what would happen if you could still get key events while the user is typing in their bank password...
You may not get key events if your activity or service is not active, but you can hijack the events at a low level.
First, change the read/write permissions on /dev/input/eventX, since all input events arrive there.
Then write a C program (for example, compiled into a dynamic library) that uses select or epoll to wait for events.
Finally, implement a JNI API so your app can retrieve the events.
Here we are.
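The same idea can even be sketched in plain Java, skipping the JNI layer, provided the process has permission to read the device node (normally root only). The device path and the 24-byte struct input_event size (64-bit ABI; 16 bytes on 32-bit) are assumptions:

import java.io.FileInputStream;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

// Sketch: read raw input_event records from an event node (root required).
public final class RawInputReader {
    public static void main(String[] args) throws Exception {
        try (FileInputStream in = new FileInputStream("/dev/input/event2")) {
            byte[] buf = new byte[24];              // struct input_event, 64-bit ABI
            while (in.read(buf) == buf.length) {
                ByteBuffer b = ByteBuffer.wrap(buf).order(ByteOrder.LITTLE_ENDIAN);
                b.position(16);                     // skip struct timeval (2 x 8 bytes)
                short type = b.getShort();          // e.g. EV_KEY = 1, EV_REL = 2
                short code = b.getShort();          // e.g. REL_X = 0, REL_Y = 1
                int value = b.getInt();             // delta, key state, etc.
                System.out.println("type=" + type + " code=" + code + " value=" + value);
            }
        }
    }
}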

Make a service that recognizes a gesture and binds an event on top of all other applications (Android 4.x)

I am trying to write a service that recognizes a gesture (long-click, then move up) and associates an event with it.
However, I need this service to keep working while another application is in the foreground.
At the moment I have a service that displays a transparent LinearLayout.
I can capture the gesture fine, BUT I can no longer use the application underneath.
Clicks do not pass through the layout. I would like them to pass through, with my service only detecting this gesture.
For example, Android KitKat now has a full-screen (immersive) mode: swiping from the edge brings back the system bars, and the application underneath is not affected.
Do you have any ideas?
Thanks.
I would like them to pass through, with my service only detecting this gesture.
That is not possible from an SDK app on Android 4.0+. Either you receive the touch events or the underlying app receives the touch events, not both. Otherwise, this would represent a security flaw known as tapjacking.

How to create an Android app that can capture the global touch screen events when user is playing with another app?

As the title says, for my research I need to log the user's touch-screen events, mainly gestures (such as tap, scroll, swipe, etc.), while the user is using another app. Currently I can't figure out a good way of doing this. It seems that on Android 4.x the touch-screen events cannot be read out? Does anyone know a good method of capturing these touch events?
I also spent quite some time on R&D to get global touch events inside an Android app.
Our requirement was to capture touch coordinates on the phone from inside an Android app.
We tried several approaches:
1- First we tried to put up a transparent layout (like a sheet of glass over the phone screen) covering the whole screen, receive the touch events, dynamically remove the glass so that the event could pass through to whatever is below, and then dynamically insert the glass again.
The drawback was that it required two taps, which is not feasible (the first tap would give the touch coordinates and only the second touch would be passed below the glass).
2- Then we tried a special window manager flag, FLAG_WATCH_OUTSIDE_TOUCH, with a 1 x 1 transparent layout, but we always got the touch coordinates as (0,0), because the Android framework enforces the rule that no process can get the touch points that occur inside another process.
3- Then we tried to run the command "adb shell getevent" from code, but it did not report touch events (although with "adb shell sendevent" we were able to send global touch events, i.e., by passing coordinates we could inject an event at a particular position on the device screen).
4- By this point we were quite sure that without rooting, no app can get the global touch events that happen in another app's process. Android security will not allow this; otherwise any app could capture passwords or other user input.
But.... we had to do this (without rooting)...
5- Finally we used a shell script: we executed the getevent command from a command prompt, redirected the touch events to a file, and parsed them into readable coordinates.
So we cannot capture global touch events inside an app through code, but we can capture them at the OS level by executing the getevent command from an adb shell.
The output can then be passed to the Android app by storing it in external storage on the phone.
Hope this helps you all.
Thanks,
This is not possible due to security and privacy reasons (as Sir SC mentioned) unless you're rooted.
I have tried multiple things (I even asked a question on Stack Overflow: Android - Inactivity/Activity regardless of top app). I came to the conclusion that using an "Accessibility Service" is the closest we can come to knowing when a user has touched the screen. This isn't foolproof, however: you will not get an event for every screen touch (scrolling in Chrome didn't yield any events).
With that said, if your application can rely on a rooted solution then it's possible to listen to incoming lines from getevent (https://source.android.com/devices/tech/input/getevent.html). These lines simply give details of touch (and other) events. But this requires root access so it might not be an acceptable solution.
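As a rough sketch of that rooted approach, you could start getevent through su and read its output line by line (parsing the hex fields into coordinates is left out here; the -lt flags simply label and timestamp the events):

import android.util.Log;
import java.io.BufferedReader;
import java.io.InputStreamReader;

// Sketch: stream getevent output on a rooted device.
public final class GeteventMonitor {
    public static void start() {
        new Thread(() -> {
            try {
                Process p = Runtime.getRuntime()
                        .exec(new String[]{"su", "-c", "getevent -lt"});
                BufferedReader reader = new BufferedReader(
                        new InputStreamReader(p.getInputStream()));
                String line;
                while ((line = reader.readLine()) != null) {
                    // e.g. "[ 123.456] /dev/input/event2: EV_ABS ABS_MT_POSITION_X 000001f4"
                    Log.d("GlobalTouch", line);
                }
            } catch (Exception e) {
                Log.e("GlobalTouch", "getevent failed", e);
            }
        }).start();
    }
}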
I think you may already have found this one, but it "may" be the only library that can help you here, since you can't interact between activities directly, for security and privacy reasons:
GitHub android-global-touchevent
Not sure whether to post this as a comment or an answer, but I suspect that this should not be possible due to security concerns. If you could monitor activity in another application like that without it being obvious to the user, you could inappropriately monitor password entries in other applications. For that reason I wouldn't be surprised if that sort of activity is not supported.
As amarnathpatel said, it's almost impossible to get touch events without root. But there is a workaround through which we can get the events, though not the coordinates. It is done using FLAG_WATCH_OUTSIDE_TOUCH.
check this out.
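A minimal sketch of that workaround, assuming the app holds the SYSTEM_ALERT_WINDOW ("draw over other apps") permission: a tiny overlay window with FLAG_WATCH_OUTSIDE_TOUCH receives an ACTION_OUTSIDE event whenever the user touches somewhere else, but (as noted above) the reported coordinates are (0,0), so you only learn that a touch happened, not where.

import android.content.Context;
import android.graphics.PixelFormat;
import android.os.Build;
import android.view.MotionEvent;
import android.view.View;
import android.view.WindowManager;

// Sketch: a 1x1 overlay that reports ACTION_OUTSIDE touches (no coordinates).
public final class TouchProbe {
    public static void add(Context context) {
        WindowManager wm = (WindowManager) context.getSystemService(Context.WINDOW_SERVICE);
        int type = Build.VERSION.SDK_INT >= Build.VERSION_CODES.O
                ? WindowManager.LayoutParams.TYPE_APPLICATION_OVERLAY
                : WindowManager.LayoutParams.TYPE_PHONE;
        WindowManager.LayoutParams params = new WindowManager.LayoutParams(
                1, 1, type,
                WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE
                        | WindowManager.LayoutParams.FLAG_WATCH_OUTSIDE_TOUCH,
                PixelFormat.TRANSLUCENT);
        View probe = new View(context) {
            @Override
            public boolean onTouchEvent(MotionEvent event) {
                if (event.getAction() == MotionEvent.ACTION_OUTSIDE) {
                    // The user touched somewhere outside this 1x1 window.
                }
                return super.onTouchEvent(event);
            }
        };
        wm.addView(probe, params);
    }
}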

Android user presses a key

Is there a way to register a receiver, for an app running in the background, for when the user presses a key? Something like "ACTION_USER_PRESENT", but for any key pressed on the screen.
MORE DETAIL: My app is running as a service in the background. The user opens the phone and presses keys, for example while searching for something online on their Droid. Can I capture those key presses in the background?
To detect whether a user is using the device, you could also use whether the screen is on or off as an approximation (assuming a screen timeout is set). This blog entry shows how to capture the screen-on and screen-off events (I haven't done it myself, though).
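For reference, a minimal sketch of that screen-state approach (note that ACTION_SCREEN_ON and ACTION_SCREEN_OFF must be registered at runtime, e.g. from your background service, not in the manifest):

import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.content.IntentFilter;

// Sketch: approximate user activity from screen on/off broadcasts.
public class ScreenStateMonitor extends BroadcastReceiver {
    @Override
    public void onReceive(Context context, Intent intent) {
        if (Intent.ACTION_SCREEN_ON.equals(intent.getAction())) {
            // Screen turned on: the user is probably interacting with the device.
        } else if (Intent.ACTION_SCREEN_OFF.equals(intent.getAction())) {
            // Screen turned off: treat the user as inactive.
        }
    }

    public static ScreenStateMonitor register(Context context) {
        ScreenStateMonitor monitor = new ScreenStateMonitor();
        IntentFilter filter = new IntentFilter(Intent.ACTION_SCREEN_ON);
        filter.addAction(Intent.ACTION_SCREEN_OFF);
        context.registerReceiver(monitor, filter);
        return monitor;
    }
}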
In the Android HCI Extractor (http://code.google.com/p/android-hci-extractor/) we traverse the GUI and install some event filters (using listeners) on the app's top view.
Maybe if you can reach a top-level view, you could listen to all the events for that view. Give it a try ;-)
This tool is an open-source prototype you can find here: http://code.google.com/p/android-hci-extractor/
It is very easy to integrate and use. In the tutorials you can see that only a few lines of code are needed: http://www.catedrasaes.org/trac/wiki/MIM
I hope it helps you!!
