Update (2013.08.15): I managed to emulate touch events using 1. adb shell (slow), 2. the Monkey tool (fast but not satisfying) and 3. monkeyrunner (best, because I can couple it with Python).
I'm trying to create an external device (probably using a Raspberry Pi) which acts as an input device for Android. Specifically, I want the device to create touch events (tap, swipe, etc.). The touch events should be created not only when a certain app is active, but also in the background.
I have thought of several methods of doing it:
Create an app (probably a service) which receives data from the input device and emulates the touch events (probably with monkeyrunner?).
Connect the Android device to the external device and use adb to directly create the touch events (a rough sketch of this is at the end of this question).
Make the device mimic the behavior of a joystick (I heard that this method only works with a handful of game apps; is that true?)
Which is the most viable method? Or are any of these methods possible at all (probably on a rooted phone)?
PS. For solution 1, I saw several apps (remote-desktop-like apps) which create touch events in software. How does that work (using the Android API, adb, or some other method)?
Thank you for reading this question.
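For reference, here is a minimal sketch of method 2 (driving the device over adb from a host-side Python script). The coordinates are placeholders, and as the update above notes, this route works but is slow, roughly one event per second:

    import subprocess

    def adb_shell(*args):
        """Run a single 'adb shell ...' command and wait for it to finish."""
        subprocess.run(["adb", "shell", *args], check=True)

    # Tap at placeholder coordinates (300, 500).
    adb_shell("input", "tap", "300", "500")

    # Swipe from (100, 800) to (600, 800) over 300 ms.
    adb_shell("input", "swipe", "100", "800", "600", "800", "300")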
I have a problem sending touch events to Android (not to a specific application; system-wide touch events) over Bluetooth.
I know that there are similar questions, but I still could not find a solution. I would be really happy if somebody could help me with this issue.
You can find the details of my problem below:
Requirement:
To send remote touch events to Android (without root) over Bluetooth.
Possible options I have discovered so far:
1) Use of ADB commands - This is possible via a USB cable or over a network connection; however, it is not allowed over Bluetooth. The workaround I thought of:
Write an Android application with a background service which listens for Bluetooth input and, once it receives input, runs the ADB commands within the app itself. However, I think this requires a rooted Android device.
2) MonkeyRunner - I think this is also not possible over Bluetooth.
3) Accessibility Service - According to my research, writing an accessibility service may be a way to simulate touch events, but only when specific conditions are met (e.g. view clicks). I need to initiate the touch from an external Bluetooth device, without waiting for a specific condition.
4) Connecting the external Bluetooth device as an HID (e.g. a mouse) and simulating the touch on a mouse button click. In this case, however, the mouse pointer will always be visible to the user, and I could not find a way to hide it. There are some tutorials for customising the pointer (e.g. making it bigger), but as far as I can tell a transparent mouse pointer is not supported.
I found an application called AutoInput (a Tasker plugin) which simulates touch events, but I could not find any information about how it actually works or how it simulates the touch events.
Thanks in advance for your help.
Regards
I am trying to implement a very simple hack.
Suppose there are two Android phones connected via Wi-Fi, and I run an app on phone B which sends events such as a pinch, swipe or turn to phone A.
So conceptually I would have to write an app which opens a socket and, whenever there is an event, sends an ioctl to a kernel driver which injects these virtual events into the Android input subsystem.
I just don't know which part of Android I need to push the virtual events into. Any help would be welcome.
You need a rooted device as phone A; have a look at these answers:
How to simulate touch from background service with sendevent or other way?
How to simulate a touch event in Android?
and:
Programmatically Injecting Events on Android – Part 1
How to compile Android Application with system permissions
Instructions for compiling are here.
I'm trying to get my computer (Mac/Linux) to send touch commands to my Nexus 7 tablet via ADB. I've found that I can successfully send touch events via "adb shell input tap x y", but the noticeable delay is limiting what I can do. I need to be able to send several events per second, but this method manages only about one per second. In the end I'm hoping to control the inputs via a Python script on the host PC, if that's possible.
Thanks!
Have you taken a look at monkeyrunner?
Or, if you are looking for an automation test framework, you could try Robotium.
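For what it's worth, a minimal monkeyrunner script looks roughly like the following; run it with the monkeyrunner tool from the Android SDK, which executes Jython, so plain Python syntax applies, and treat the coordinates as placeholders. Because monkeyrunner keeps a single connection to the device open, it usually injects events noticeably faster than spawning "adb shell input" for every tap:

    # Run with: monkeyrunner touch_script.py (monkeyrunner ships with the Android SDK)
    from com.android.monkeyrunner import MonkeyRunner, MonkeyDevice

    # Wait for a device to be connected over adb.
    device = MonkeyRunner.waitForConnection()

    # Single tap at placeholder coordinates (300, 500).
    device.touch(300, 500, MonkeyDevice.DOWN_AND_UP)

    # Short pause, then a drag (swipe) from (100, 800) to (600, 800) in 0.3 s, 10 steps.
    MonkeyRunner.sleep(0.3)
    device.drag((100, 800), (600, 800), 0.3, 10)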
I know that I can listen to input devices through /dev/input/eventX on Android/Linux. If you are superuser, you can also send events to a device through those nodes.
I would like to send mouse events to my Android device as superuser. However, in order to do this, a mouse must be connected to the device via USB or Bluetooth. Without one, I get the error "Could not open /dev/input/event8, No such device" when I execute "sendevent /dev/input/event8 xxxx xxxx xxxxxxxx" over adb. In this case the node had been created beforehand with "mknod /dev/input/event8 c 13 71".
The problem disappears when I connect a Bluetooth or USB mouse to the device. The device is created automatically under /sys/devices/platform/tegra_uart.2/tty/ttyHS2/hci0 as hci0:11, and the input node /dev/input/event7 (major 13, minor 71) is created as well. After that I can send events to that node and control the Android mouse cursor. But I want to do this without connecting a mouse to the device.
Could anyone suggest how I can virtually create a mouse input device on my Android device (as if a mouse were connected)?
From what I see, you should create your own virtual device with your own driver. Fortunately, there is an easy way to do so using uinput.
There is an easy getting-started guide here, and this question can be a good guide for writing your own virtual driver.
I thought this could only be done with kernel access, by building your own driver (I don't think modifying the user's ROM is a good solution), but after reading this, it is clear that uinput can be driven from user space.
Note:
I agree with the recommendation to use touch events, as that solution is more common and makes sense; check the second suggestion in this answer.
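To make the uinput suggestion concrete, below is a minimal sketch of a virtual relative mouse. It uses the python-evdev bindings purely for illustration and assumes Python plus python-evdev are available on the device and that the process can open /dev/uinput (i.e. root); the same sequence of uinput operations can equally be done from a small C program pushed over adb:

    # Minimal uinput sketch: create a virtual mouse and move the pointer.
    # Assumes python-evdev is installed and /dev/uinput is accessible (root).
    import time
    from evdev import UInput, ecodes as e

    capabilities = {
        e.EV_KEY: [e.BTN_LEFT, e.BTN_RIGHT],   # mouse buttons
        e.EV_REL: [e.REL_X, e.REL_Y],          # relative pointer movement
    }

    ui = UInput(capabilities, name="virtual-mouse")

    # Move the pointer down and to the right in small steps.
    for _ in range(20):
        ui.write(e.EV_REL, e.REL_X, 5)
        ui.write(e.EV_REL, e.REL_Y, 5)
        ui.syn()
        time.sleep(0.02)

    # Click the left button (press, then release).
    ui.write(e.EV_KEY, e.BTN_LEFT, 1)
    ui.syn()
    ui.write(e.EV_KEY, e.BTN_LEFT, 0)
    ui.syn()

    ui.close()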
Is it possible to have a service (A) which will launch an activity (B) and then periodically capture B's screen?
Also, is it possible to send onTouch events from A to B?
Basically, I'd like to make a bot which would use an application so I don't have to.
I guess it's not possible, but I had to ask.
You can't do that across Activities, but you could create a view that hosts the Activity you want to automate. Then you can periodically grab the window's decor view, do some processing on it, and inject touch events.
Screen captures of apps which are not the current app are prevented on Android devices due to security considerations.
As far as I know, up to Android 4.3 you can only do this in these scenarios:
Your device is rooted
Your app is signed with the same signature as the system (like Google apps)
With adb (debug environment): your device is connected via USB to a PC which runs adb shell commands, or the USB cable is disconnected but you have restarted the native adb daemon on the device after each reboot. A host-side sketch of this route follows the list.
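For the adb scenario, the capture half of such a bot can be sketched from the host side roughly as follows. File names and coordinates are placeholders, and it assumes adb is on the PATH with the device connected:

    import subprocess

    def adb(*args):
        """Run a single adb command and raise if it fails."""
        subprocess.run(["adb", *args], check=True)

    def capture_screen(local_path="screen.png"):
        """Take a screenshot on the device and pull it to the host."""
        adb("shell", "screencap", "-p", "/sdcard/screen.png")
        adb("pull", "/sdcard/screen.png", local_path)

    # Capture the current screen; a bot could then analyse the image and react,
    # e.g. by injecting a tap with: adb("shell", "input", "tap", "300", "500")
    capture_screen()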
Some useful links:
How to programmatically take a screenshot in Android?
How to make a capture screen app on Android
http://code.google.com/p/android-screenshot-library/wiki/UserGuide