I'm trying to access my Android device from a web app. I send coordinates from the web app, and I want to perform a click or long-press action on the view at that position.
Is it possible to do that? I think it's possible to get the coordinates of a view, so maybe I can store the coordinates of a few views in an array or something similar, and if the coordinates from the web app match one of the views' positions in the array, I can perform some action on that view, like clicking a button.
So can I click a button, knowing its coordinates?
I know this is quite far-fetched, but I want to try. I'm able to view my Android device on the web app. Now I'm trying to access it. Sorry if this sounds stupid.
ANY opinion on this would be appreciated.
Have a look at the following two answers.
The first one is quite straightforward but requires a rooted Android phone: Simulating touch event using su
The second uses a MotionEvent. With this one, you first check whether the view received the event, and if it did, you can use the dispatchTouchEvent() method to send the event: https://stackoverflow.com/a/4692133/1375020
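To make the second approach concrete, here is a minimal sketch. It assumes targetView is the view whose stored bounds matched the incoming web-app coordinates, and that x and y are those coordinates (floats) in the view's own coordinate space; none of these names come from the linked answer.
// Synthesize a tap (down followed by up) at (x, y) and hand it to the view.
// x and y are assumed to be the coordinates received from the web app.
long downTime = SystemClock.uptimeMillis();
MotionEvent down = MotionEvent.obtain(downTime, downTime,
        MotionEvent.ACTION_DOWN, x, y, 0);
MotionEvent up = MotionEvent.obtain(downTime, downTime + 100,
        MotionEvent.ACTION_UP, x, y, 0);
targetView.dispatchTouchEvent(down);
targetView.dispatchTouchEvent(up);
down.recycle();
up.recycle();
A long press is trickier: the framework detects it from the real elapsed time after the ACTION_DOWN is dispatched, not from the event timestamps, so you would need to actually wait (e.g. postDelayed the ACTION_UP) for longer than ViewConfiguration.getLongPressTimeout() before sending the up event.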
I am developing an Android application and need to log my users' "journey" through the application.
The events I wish to log include all user interaction events, such as when the user clicks on a button, and also each Activity and/or Fragment visited.
I know I can litter my code with my logging logic, however there are a number of downsides to this, such as:
Using autoLink="web" and a MovementMethod to allow the user to click a web URL within a displayed TextView means I have nowhere to add my logging code unless I use Spannables or a custom TextView.
Developer human error will result in logging incorrect details or missing logging altogether.
What I would like is a single point within the Android framework where I could intercept all UI events and Activity transitions.
I do not wish to create custom widgets to add my logging code.
Is it possible to place my logging code in one Android "hook" to allow me identify which widget was clicked in which Activity/Fragment?
Is it possible to place my logging code in one Android "hook" to allow me identify which widget was clicked in which Activity/Fragment?
No, sorry. Besides, that would be woefully insufficient:
There are many more UI events than just "clicks" (long-press, swipe, other single-touch and multi-touch gestures, key events)
Knowing a widget alone is insufficient, as your TextView scenario illustrates
Big Picture:
User is on their device home screen with an overlay from my application in an arbitrary corner. The overlay is a small icon with no functionality other than its presence (it can register if it has been touched).
Is it possible to know when the user has clicked OUTSIDE the overlay? I can tell when the user touches the overlay itself, but I would like to know if the user has touched the screen but not the overlay.
It does not matter what is being touched, just whether the screen is being touched on their device.
Also, same scenario: is it possible to know that the user has clicked a button? For instance, the user clicks the contacts application (or camera, or any application); is there a way to read that action? I do not care which application/button is clicked, just that one was clicked.
Just trying to learn what is possible, so no need to write out code. Maybe just some pointers in the right direction. Thanks for any input.
For the first scenario, you can try chat heads (like Facebook Messenger's) by using an IntentService; a sketch of the overlay part follows below.
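Here is a minimal sketch of the overlay idea, assuming code running inside a Service that owns a small overlayView (the name is a placeholder). It adds the view with FLAG_WATCH_OUTSIDE_TOUCH and listens for ACTION_OUTSIDE. Note that the framework deliberately zeroes the coordinates of outside touches, so you learn that the screen was touched, not where.
// Inside the Service that shows the overlay; overlayView is an assumption.
WindowManager.LayoutParams params = new WindowManager.LayoutParams(
        WindowManager.LayoutParams.WRAP_CONTENT,
        WindowManager.LayoutParams.WRAP_CONTENT,
        WindowManager.LayoutParams.TYPE_PHONE, // TYPE_APPLICATION_OVERLAY on API 26+
        WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE
                | WindowManager.LayoutParams.FLAG_NOT_TOUCH_MODAL
                | WindowManager.LayoutParams.FLAG_WATCH_OUTSIDE_TOUCH,
        PixelFormat.TRANSLUCENT);

overlayView.setOnTouchListener(new View.OnTouchListener() {
    @Override
    public boolean onTouch(View v, MotionEvent event) {
        if (event.getAction() == MotionEvent.ACTION_OUTSIDE) {
            // The screen was touched somewhere outside the overlay;
            // the coordinates are zeroed by the framework for privacy.
            Log.d("Overlay", "touch outside the overlay");
            return true;
        }
        return false;
    }
});

WindowManager wm = (WindowManager) getSystemService(Context.WINDOW_SERVICE);
wm.addView(overlayView, params);
This also answers the second scenario indirectly: you can know that something outside was touched, but not which application or button received the touch.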
I want to start Chromecast routing automatically, not when the user presses the button. Does anyone know how I can simulate, in some way, that the user pressed the media route button? I have looked through the different classes and not found anything.
I am aware that this is not how Google intends developers to use it, and my application is only functioning as a proof of concept.
If anyone knows another way to achieve the same thing (The casting starts when the app starts, if the user has enabled it in the options menu) - let me know!
You can follow the same steps as usual (get hold of the MediaRouter instance, set a selector, register a callback, etc.), but then you need to keep a list of discovered routes in your application (as they are discovered by MediaRouter, you will get a callback via onRouteAdded()). You need to do the bookkeeping as well (via the onRouteRemoved() callback). Now that you have a list of routes, you can programmatically decide which one you want to use, and again do as usual (the same stuff you would do when you get a callback via onRouteSelected()), except that you need to call MediaRouter.selectRoute(your_selected_route) yourself to tell the framework about it. For the first part, you can take a look at this sample.
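A hedged sketch of that bookkeeping, using the support-library MediaRouter (android.support.v7.media) and CastMediaControlIntent from Google Play services. APP_ID and the pick-the-first-route policy are assumptions for illustration, not part of the answer above.
// Fields and a helper inside your Activity or Service:
private final List<MediaRouter.RouteInfo> mRoutes = new ArrayList<MediaRouter.RouteInfo>();
private MediaRouter mMediaRouter;

private final MediaRouter.Callback mCallback = new MediaRouter.Callback() {
    @Override
    public void onRouteAdded(MediaRouter router, MediaRouter.RouteInfo route) {
        mRoutes.add(route); // bookkeeping: remember every discovered route
        // Example policy (an assumption): select the first route we see.
        mMediaRouter.selectRoute(route);
    }

    @Override
    public void onRouteRemoved(MediaRouter router, MediaRouter.RouteInfo route) {
        mRoutes.remove(route); // bookkeeping: forget routes that disappear
    }
};

void startDiscovery(Context context) {
    mMediaRouter = MediaRouter.getInstance(context);
    MediaRouteSelector selector = new MediaRouteSelector.Builder()
            .addControlCategory(CastMediaControlIntent.categoryForCast(APP_ID))
            .build();
    mMediaRouter.addCallback(selector, mCallback,
            MediaRouter.CALLBACK_FLAG_REQUEST_DISCOVERY);
}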
So what I discovered was that I couldn't check for routes at the beginning of the program, because the MediaRouter hadn't discovered them yet (i.e. the call to getRoutes() returned only the default route). In my program, it was enough to start a thread that sleeps for three seconds and then selects any available route:
if (mMediaRouter.getRoutes().size() > 1) {
    // Index 0 is the default route, so pick the first discovered one.
    mMediaRouter.selectRoute(mMediaRouter.getRoutes().get(1));
}
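If you want the same three-second delay without a raw sleeping thread (MediaRouter is meant to be used from the main thread), a sketch of the equivalent with a Handler:
new Handler(Looper.getMainLooper()).postDelayed(new Runnable() {
    @Override
    public void run() {
        // Runs on the main thread three seconds after being posted.
        if (mMediaRouter.getRoutes().size() > 1) {
            mMediaRouter.selectRoute(mMediaRouter.getRoutes().get(1));
        }
    }
}, 3000);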
If I needed a more persistent solution, I'd do as Ali Naddaf suggested.
As the title says, in my research I need to log user touch-screen events, mainly gestures (such as tap, scroll, swipe, etc.), while the user is using another app. Currently I can't figure out a good way of doing it. It seems that on Android 4.x the touch-screen events cannot be read out? Does anyone know any good methods to capture the touch events?
I was also doing R&D for quite some time on how to get global touch events inside an Android app.
Our requirement was to capture touch coordinates on the phone from inside an Android app.
We tried several approaches:
1- First we tried to put a transparent layout (like a layer of glass above the phone screen) covering the whole screen, receiving the touch events there, dynamically removing the glass so that the event could be passed to what lay below, and then dynamically inserting the glass again.
The drawback was that it required two taps, which is not feasible (the first tap would give the touch coordinate and the second would be passed below the glass).
2- Then we tried to use a special flag in the window manager, FLAG_WATCH_OUTSIDE_TOUCH, making the transparent layout 1 x 1 in size, but we always got the touch coordinates as (0,0), because the Android framework imposes security so that no process can get touch points belonging to another process.
3- Then we tried to run the command "adb shell getevent" through code, but it was not receiving touch events (although by using "adb shell sendevent" we were able to send global touch events, i.e. by passing a coordinate we were able to inject an event at a particular coordinate on the device screen).
4- By now we were quite sure that without rooting, no app can get global touch events that happen in another app's process. Android security will not allow this; otherwise any app could capture passwords or other user input data.
But... we had to do this (without rooting)...
5- Finally we used a shell script: we executed the getevent command from the command prompt, redirected the touch events to a file, and parsed the events into readable coordinates.
So we cannot capture global touch events inside an app through code, but we can capture them at the OS level by executing the getevent command from an adb shell.
The output can then be passed to the Android app by storing it in the phone's external storage.
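For reference, a rough sketch of step 5 driven from inside the app on a rooted device. This is an assumption-laden illustration: the parsing is simplified, and the ABS_MT_POSITION labels assume "getevent -l" output from a typical multi-touch driver.
void logRawTouches() throws IOException {
    // Requires root: getevent needs permission to read /dev/input/*.
    Process p = Runtime.getRuntime().exec(new String[]{"su", "-c", "getevent -l"});
    BufferedReader in = new BufferedReader(new InputStreamReader(p.getInputStream()));
    String line;
    while ((line = in.readLine()) != null) {
        // Typical line: /dev/input/event2: EV_ABS ABS_MT_POSITION_X 000001f4
        if (line.contains("ABS_MT_POSITION_X") || line.contains("ABS_MT_POSITION_Y")) {
            String hex = line.substring(line.lastIndexOf(' ') + 1).trim();
            int value = Integer.parseInt(hex, 16);
            Log.d("TouchLogger", (line.contains("_X") ? "x=" : "y=") + value);
        }
    }
}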
Hope this will help you all.
Thanks,
This is not possible due to security and privacy reasons (as Sir SC mentioned) unless you're rooted.
I have tried multiple things (I even asked a question on Stack Overflow: Android - Inactivity/Activity regardless of top app). I came to the conclusion that using an "Accessibility Service" is the closest we can come to knowing when a user has touched the screen. This isn't foolproof, however: you will not get an event for every screen touch (scrolling in Chrome didn't yield any events).
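For completeness, a bare-bones sketch of the Accessibility Service approach. The class name is a placeholder; the service still has to be declared in the manifest with an accessibility-service config and enabled by the user in Settings, and, as noted above, it only sees accessibility events, not raw touches.
public class TouchLoggerService extends AccessibilityService {
    @Override
    public void onAccessibilityEvent(AccessibilityEvent event) {
        // Fires for clicks, focus changes, window changes, etc., but not
        // for every raw screen touch (e.g. some scrolling goes unreported).
        if (event.getEventType() == AccessibilityEvent.TYPE_VIEW_CLICKED) {
            Log.d("TouchLogger", "clicked in " + event.getPackageName());
        }
    }

    @Override
    public void onInterrupt() {
        // Required override; nothing to clean up in this sketch.
    }
}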
With that said, if your application can rely on a rooted solution then it's possible to listen to incoming lines from getevent (https://source.android.com/devices/tech/input/getevent.html). These lines simply give details of touch (and other) events. But this requires root access so it might not be an acceptable solution.
I think you may have already found this one, but it "may" be the only library which can help you here, since you can't interact between activities directly because of security and privacy reasons:
GitHub android-global-touchevent
Not sure whether to post this as a comment or an answer, but I suspect that this should not be possible due to security concerns. If you could monitor activity in another application like that without it being obvious to the user, you could inappropriately monitor password entries in other applications. For that reason I wouldn't be surprised if that sort of activity is not supported.
As amarnathpatel said, it's almost impossible to get touch events without root. But there is a workaround through which we can get the events, though not their coordinates. It is done using FLAG_WATCH_OUTSIDE_TOUCH.
Check this out.
I am using an Android stick (http://www.geekbuying.com/item/Uhost-2-Dual-Core-TV-Box-Mini-PC-Android-4-0-4-RK3066-Cortex-A9-1-6GHZ-1GB-RAM-4G-ROM-with-Bluetooth-WIFI-Skype-XBMC---Black-312467.html) for building an application. The application uses an attached USB webcam for some of its functionality. Additionally, I connect a mouse to this device, which the user can use to navigate through various pages in the application. A left/right movement of the mouse results in navigation to the previous/next page.
While the mouse works with the Android device, I additionally need to reset the position of the mouse to the center after every single interaction with the user. Is it possible to set the mouse position in software on Android? I am using View.OnGenericMotionListener to capture the mouse movement.
Currently, I also need to perform a primary mouse button click to bring the mouse into focus inside the application. I want to remove this requirement by either generating the primary mouse button click in software, or otherwise bringing the application into focus by some other means.
I have been unable to find any APIs to get the above to work. Any help on these would be greatly useful.
Just in case I need to write some sort of drivers to get this thing working, any help in this direction would also be useful.
Any workarounds around this problem, while still using the mouse, could also prove useful.
Mouse events are managed by the system framework; you cannot control them on the Java side.
From an adb shell you can open the /dev/input event device and write mouse events, including:
relative movement
click actions
absolute position (you might want this)
However, you cannot do this as a normal application unless you are on a rooted device, or you can use an adb shell to start a daemon service in the background to perform the event writing for your application.
I additionally need to reset the position of the mouse to the center after every single interaction with the user.
This is now possible with the pointer capture API in Android 8.0+ (released August 2017). Summary:
To request pointer capture, call the requestPointerCapture() method on the view.
Once the request to capture the pointer is successful, Android calls onPointerCaptureChange(true), and starts delivering the mouse events.
Your focused view can handle the events by performing one of the following tasks:
If you're using a custom view, override onCapturedPointerEvent(MotionEvent).
Otherwise, register an OnCapturedPointerListener.
The view in your app can release the pointer capture by calling releasePointerCapture().
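A minimal sketch of that flow (API 26+). The view id and the click-to-capture trigger are assumptions for illustration.
// e.g. in onCreate(), after setContentView()
final View view = findViewById(R.id.capture_target); // hypothetical id

view.setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View v) {
        v.requestPointerCapture(); // the window must have focus
    }
});

view.setOnCapturedPointerListener(new View.OnCapturedPointerListener() {
    @Override
    public boolean onCapturedPointer(View v, MotionEvent event) {
        // While captured, the pointer icon is hidden and events report
        // relative movement rather than absolute screen positions.
        float dx = event.getX();
        float dy = event.getY();
        return true;
    }
});

// Call view.releasePointerCapture() when the mouse should be freed.
Because captured events are relative, this sidesteps the original problem: there is no visible pointer to re-center, and no initial click on the view is needed once capture is requested programmatically.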