I want to send "CTRL + W" to Chrome for Android to close the active tab. I have tried lots of things, but I have had no success doing it from the terminal. (If I connect a USB keyboard via OTG, I can close the tab with CTRL+W.)
First of all, I do not want to write an application for this; I only want a shell command that I can use from Tasker.
I read somewhere that to achieve this (a CTRL+W keypress), I have to simulate the key presses like this:
Down CTRL
Down W
Up W
Up CTRL
To achieve this from the terminal, it seems I have to use "sendevent".
I can simulate all hardware key presses with "sendevent", but I cannot simulate normal keys with it.
For example, to press and release the POWER key:
sendevent /dev/input/event1 1 116 1
sendevent /dev/input/event1 0 0 0
sendevent /dev/input/event1 1 116 0
sendevent /dev/input/event1 0 0 0
I use these commands, but I cannot use them to send normal keys (for example a, b, c, etc.).
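For reference, this is the raw sequence I would expect for CTRL+W (a sketch only, using the standard Linux scan codes 29 for KEY_LEFTCTRL and 17 for KEY_W; /dev/input/eventX is a placeholder, and it can only work on a device whose driver actually accepts those codes):
# sketch only: 1 = EV_KEY, 29 = KEY_LEFTCTRL, 17 = KEY_W, "0 0 0" = sync report
sendevent /dev/input/eventX 1 29 1
sendevent /dev/input/eventX 0 0 0
sendevent /dev/input/eventX 1 17 1
sendevent /dev/input/eventX 0 0 0
sendevent /dev/input/eventX 1 17 0
sendevent /dev/input/eventX 0 0 0
sendevent /dev/input/eventX 1 29 0
sendevent /dev/input/eventX 0 0 0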
event1 is gpio-keys, so I'm using it. All the other input events are sensors, and one is the charging driver (max77693-muic).
The output of "getevent -p" is:
add device 1: /dev/input/event9
name: "compass_sensor"
events:
REL (0002): 0000 0001 0002 0003 0004 0005 0006 0007
0008 0009
input props:
<none>
add device 2: /dev/input/event6
name: "barometer_sensor"
events:
REL (0002): 0000 0001 0002
input props:
<none>
add device 3: /dev/input/event5
name: "light_sensor"
events:
REL (0002): 0000 0001 0002 0009
input props:
<none>
add device 4: /dev/input/event4
name: "proximity_sensor"
events:
ABS (0003): 0019 : value 1, min 0, max 1, fuzz 0, flat 0, resolution 0
input props:
<none>
add device 5: /dev/input/event3
name: "gyro_sensor"
events:
REL (0002): 0003 0004 0005
input props:
<none>
could not get driver version for /dev/input/mice, Not a typewriter
add device 6: /dev/input/event7
name: "Midas_WM1811 Midas Jack"
events:
KEY (0001): 0072 0073 00e2
SW (0005): 0002 0004
input props:
<none>
add device 7: /dev/input/event1
name: "gpio-keys"
events:
KEY (0001): 0072 0073 0074 00ac
input props:
<none>
add device 8: /dev/input/event0
name: "max77693-muic"
events:
KEY (0001): 0072 0073 00a3 00a4 00a5
input props:
<none>
add device 9: /dev/input/event8
name: "sec_touchkey"
events:
KEY (0001): 008b 009e
LED (0011): 0008
input props:
<none>
add device 10: /dev/input/event2
name: "sec_touchscreen"
events:
ABS (0003): 002f : value 0, min 0, max 9, fuzz 0, flat 0, resolution 0
0030 : value 0, min 0, max 255, fuzz 0, flat 0, resolution 0
0031 : value 0, min 0, max 255, fuzz 0, flat 0, resolution 0
0032 : value 0, min 0, max 30, fuzz 0, flat 0, resolution 0
0035 : value 0, min 0, max 719, fuzz 0, flat 0, resolution 0
0036 : value 0, min 0, max 1279, fuzz 0, flat 0, resolution 0
0039 : value 0, min 0, max 65535, fuzz 0, flat 0, resolution 0
003c : value 0, min -90, max 90, fuzz 0, flat 0, resolution 0
003d : value 0, min 0, max 1, fuzz 0, flat 0, resolution 0
input props:
INPUT_PROP_DIRECT
Also, my gpio-keys layout file "/system/usr/keylayout/gpio-keys.kl" looks like this:
key 115 VOLUME_UP WAKE
key 114 VOLUME_DOWN WAKE
key 116 POWER WAKE
key 172 HOME WAKE
I can send all normal keyevents with:
"input keyevent KEYCODE_X"
and to send more than one:
"input keyevent KEYCODE_X KEYCODE_Y"
You would think it could work like this:
"input keyevent KEYCODE_CTRL_LEFT KEYCODE_W"
but each keyevent goes down and up immediately, so I cannot use it to send the CTRL+W combination.
I know the answer should involve "sendevent", but I cannot find it.
I also tried adding some fake keys to the key layout file like this:
key 115 VOLUME_UP WAKE
key 114 VOLUME_DOWN WAKE
key 116 POWER WAKE
key 172 HOME WAKE
key 19 Q
I restarted the phone, then tried:
sendevent /dev/input/event1 1 19 1
sendevent /dev/input/event1 0 0 0
sendevent /dev/input/event1 1 19 0
sendevent /dev/input/event1 0 0 0
But it never writes "Q" into any textbox.
Please help, and thanks in advance.
Oh YES!! I don't know why, but whenever I feel stuck I come to Stack Overflow, and as soon as I start writing the question I somehow find the answer... xD
Anyway, I was able to do it with the following procedure:
Go to /system/usr/keylayout/
In my case there was no gpio-keys file, so open Generic.kl instead.
It has all the key codes you would need to simulate anything; for example, the key code for CTRL_RIGHT is 97 and for W it is 17.
That's all you need. Now open Tasker --> New Task --> add a Wait of 5 seconds --> Run Shell:
input keyevent 97
input keyevent 17
Now run the task and quickly open Chrome; voila, in 5 seconds you will see your tab disappear!
Hope that helps all the future Tasker pros ;)
Kudos...
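For reference, the whole thing can also go into a single Run Shell action (a sketch only, reusing the key codes from the answer above; adjust the wait to taste):
sleep 5
input keyevent 97
input keyevent 17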
The events section in the getevent -p output lists all accepted key codes:
add device 7: /dev/input/event1
name: "gpio-keys"
events:
KEY (0001): 0072 0073 0074 00ac
i.e. VOLUME_UP (0x73), VOLUME_DOWN (0x72), POWER (0x74) and HOME (0xAC) in the case of /dev/input/event1. Everything else gets filtered out by the Linux kernel input driver long before it reaches the Android framework (where the layout files you tried to modify are used).
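To see which key codes a device will actually pass through, you can ask getevent to print them with labels (a quick check against the gpio-keys device above; the exact output may vary):
getevent -lp /dev/input/event1
# prints something like: KEY (0001): KEY_VOLUMEDOWN KEY_VOLUMEUP KEY_POWER KEY_HOMEPAGE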
Related
I am struggling to send a key combination like "Ctrl" + "A" on my Samsung J7 with Android 8.1.0.
Searching the Internet, I found some commands that help send text via ADB, such as:
adb shell input keyevent 29 => a
adb shell input text "a" => a
However, when I used sendevent to send the text "a", it didn't work (the same goes for "Ctrl" + "A"). My commands are below:
adb shell sendevent /dev/input/event6 1 30 1 => Key down.
adb shell sendevent /dev/input/event6 0 0 0 => End report.
adb shell sendevent /dev/input/event6 1 30 0 => Key up.
adb shell sendevent /dev/input/event6 0 0 0 => End report.
Then I checked the key codes by following the link, and executed the command:
adb shell getevent -p
The output is:
add device 1: /dev/input/event2
name: "accelerometer_sensor"
events:
REL (0002): 0000 0001 0002 0007 0009
input props:
<none>
add device 2: /dev/input/event3
name: "proximity_sensor"
events:
ABS (0003): 0019 : value 1, min 0, max 1, fuzz 0, flat 0, resolution 0
input props:
<none>
add device 3: /dev/input/event6
name: "sec_touchscreen"
events:
KEY (0001): 0145* 014a
ABS (0003): 0000 : value 0, min 0, max 1079, fuzz 0, flat 0, resolution 0
0001 : value 0, min 0, max 1919, fuzz 0, flat 0, resolution 0
002f : value 0, min 0, max 9, fuzz 0, flat 0, resolution 0
0030 : value 0, min 0, max 255, fuzz 0, flat 0, resolution 0
0031 : value 0, min 0, max 255, fuzz 0, flat 0, resolution 0
0035 : value 0, min 0, max 1079, fuzz 0, flat 0, resolution 0
0036 : value 0, min 0, max 1919, fuzz 0, flat 0, resolution 0
0039 : value 0, min 0, max 65535, fuzz 0, flat 0, resolution 0
003e : value 0, min 0, max 1, fuzz 0, flat 0, resolution 0
input props:
INPUT_PROP_DIRECT
add device 4: /dev/input/event4
name: "Codec3026 Headset Events"
events:
KEY (0001): 0072 0073 00e2 0246
input props:
<none>
add device 5: /dev/input/event0
name: "meta_event"
events:
REL (0002): 0006 0007
input props:
<none>
could not get driver version for /dev/input/mice, Not a typewriter
add device 6: /dev/input/event1
name: "sec_touchkey"
events:
KEY (0001): 009e 00fe
LED (0011): 0008
input props:
<none>
add device 7: /dev/input/event5
name: "gpio_keys"
events:
KEY (0001): 0072 0073 0074 00ac
input props:
<none>
The output means the device /dev/input/event6 does not have key events for inputting text.
So my question is: how can I implement a key combination with separate key-down and key-up events?
Thanks so much for helping.
After doing a lot of research, I figured out that the root cause is related to the meta key code.
I tried to use a third-party application, but the original version couldn't send meta key codes correctly on Android 8+.
Therefore, I made some changes and opened a pull request. While waiting for it to be merged, I have published the modified version in my repo in case anyone needs it. Note that only ADBKeyboard.apk in that repo is updated, so that is the only file you need to download.
Now I can send Ctrl + A as below (4096 is META_CONTROL_ON, 29 is KEYCODE_A):
adb shell am broadcast -a ADB_INPUT_TEXT --es mcode '4096,29'
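A possible end-to-end sequence, as a sketch: it assumes the APK from the repo is installed and that your build reports the commonly used ADBKeyboard IME id shown below (verify the id with adb shell ime list -s):
adb install ADBKeyboard.apk
adb shell ime enable com.android.adbkeyboard/.AdbIME
adb shell ime set com.android.adbkeyboard/.AdbIME
# send Ctrl + A (4096 = META_CONTROL_ON, 29 = KEYCODE_A)
adb shell am broadcast -a ADB_INPUT_TEXT --es mcode '4096,29'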
I'm using the code below to simulate a 'swipe' using sendevent:
sendevent /dev/input/event0 3 53 300 ;First position X
sendevent /dev/input/event0 3 54 600 ;First position Y
sendevent /dev/input/event0 3 48 5
sendevent /dev/input/event0 3 58 50
sendevent /dev/input/event0 0 2 0
sendevent /dev/input/event0 0 0 0
sendevent /dev/input/event0 3 53 300 ;Second position X
sendevent /dev/input/event0 3 54 400 ;Second position Y
sendevent /dev/input/event0 0 2 0
sendevent /dev/input/event0 0 0 0
sendevent /dev/input/event0 0 2 0
sendevent /dev/input/event0 0 0 0
However, it swipes instantly, without any delay.
I'm trying to figure out how to specify the duration of the swipe, like you can with adb shell input:
input [touchscreen|touchpad|touchnavigation] swipe <x1> <y1> <x2> <y2> [duration(ms)]
shell input swipe 300 400 300 200 2000
This produces a swipe with a duration of 2 seconds.
I have tried adding a sleep 2 before the ;Second position lines, but that results in a pause before the swipe instead of a swipe with a 2-second duration.
By duration I mean the time spent slowly moving from position 1 to position 2.
The problem here is that sending events through sendevent takes some time. I made a Python script (you can take whatever you need from it) that interpolates points between the given ones. It also waits some time between points.
This is the linear interpolation code:
def lerp(p1: tuple, p2: tuple, points: int) -> list:
    # Linearly interpolate `points` steps between p1 and p2 (both endpoints included).
    output = []
    # Per-axis distance between the two points.
    header = [_p2 - _p1 for _p1, _p2 in zip(p1, p2)]
    for p in range(points + 1):
        percent = p / points
        output.append((p1[0] + percent * header[0], p1[1] + percent * header[1]))
    return output
The timing problem appears when using multiple points: a path with 10 interpolated points and no delay between them already takes 1.29 seconds, and a 100-point path takes 11.45 seconds.
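One way to reduce that overhead (a sketch, not part of the script above) is to write all the sendevent lines for the interpolated points into a single script, push it to the device and run it in a single adb shell invocation, so the per-command adb round trip is paid only once; points.sh is a hypothetical file generated from lerp()'s output:
# points.sh is hypothetical: one sendevent line per interpolated point
adb push points.sh /data/local/tmp/points.sh
adb shell sh /data/local/tmp/points.sh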
If you compare the source code of the sendevent and input commands you can clearly see their different goals: the former covers basic command-line input events, whereas the latter covers more flexible and complex input scenarios.
To get an insight into how the swipe duration is implemented (in the input command), you can look directly at the sendSwipe method: it sends multiple basic input events, leveraging the InputManager, over a timespan defined by the duration parameter:
final long endTime = down + duration;
The function injectMotionEvent used by sendSwipe doesn't have any concept of "duration".
That said, I think the command you're looking for does not exist as of today, and I believe you can still rely on the shell prompt, e.g.
input swipe 300 400 300 200 2000
that can be invoked after using
adb shell
One cannot set the duration at the low level, but one can record analog input and then play it back. This permits more flexible and complex scenarios, because the possible events are countless.
Run adb shell to open a shell, where:
getevent --help shows all available options.
getevent -p shows all recordable devices.
getevent -lp /dev/input/event1 shows BTN_TOUCH event data format.
getevent /dev/input/event1 logs input events for device focaltech_ts.
getevent -l /dev/input/event1 is human-readable (useless for automation).
To record:
cd /sdcard/Download
getevent /dev/input/event1 >> ./swipe.log
Download swipe.log with the Android Device Explorer.
In the log, 0003 is an absolute (coordinate) event, 0x35 is the X axis and 0x36 is the Y axis:
0003 0035 000001a8
0003 0036 000005cb
This log can then be played back with a shell script loop using sendevent.
sendevent --help shows the expected parameters: DEVICE TYPE CODE VALUE.
cat ./swipe.log | while read line
do
adb shell sendevent /dev/input/event1 $line
done
When delaying the execution with sleep, the lines reading 0000 0000 00000000 (the sync reports) are the best places to insert it.
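Note that getevent logs its fields in hexadecimal, while sendevent generally expects decimal values, so a conversion step may be needed; a minimal sketch, assuming the three-column log format shown above:
cat ./swipe.log | while read etype ecode evalue
do
  # convert each hex field to the decimal number sendevent expects
  adb shell sendevent /dev/input/event1 $(printf '%d' 0x$etype) $(printf '%d' 0x$ecode) $(printf '%d' 0x$evalue)
done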
In the same way one can also automate GPIO buttons, which may be quite device-specific. UiObject2.swipe() might also just generate linearly interpolated coordinates and play them back. It generally does not matter whether they're generated or recorded; the only difference is that the one movement is perfectly straight and the other one obviously isn't.
I am trying to get the (x, y) coordinates of a touch through the Pointer Location option in Developer Options, and I use these coordinates to tap on the screen using sendevent. Here is my script that performs the sendevent calls.
tap.sh
sendevent /dev/input/event0 3 57 2421
sendevent /dev/input/event0 3 58 232
sendevent /dev/input/event0 3 53 $1
sendevent /dev/input/event0 3 54 $2
sendevent /dev/input/event0 0 0 0
sendevent /dev/input/event0 3 57 4294967295
sendevent /dev/input/event0 0 0 0
I call the script with adb shell sh tap.sh <x> <y>, but it does not tap at the right coordinates; instead it taps at a different location.
Also, when I tap on the screen and check the result with adb shell getevent, I find that the coordinates shown by Pointer Location and by getevent are different.
Why are they different and how do I solve this issue?
PS: The devices I tried are Nexus 7, Nexus 10.
The X and Y coordinates obtained from getevent and the ones shown by Pointer Location in Developer Options are not the same. They are related by a formula:
displayX = (x - minX) * displayWidth / (maxX - minX + 1)
displayY = (y - minY) * displayHeight / (maxY - minY + 1)
Source: Touch Devices
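As a quick worked example with made-up numbers (a raw X range of 0..4095 and a 1080-pixel-wide display), a raw reading of 2048 lands in the middle of the screen:
# hypothetical numbers, only to illustrate the mapping formula above
awk 'BEGIN { x = 2048; minX = 0; maxX = 4095; displayWidth = 1080;
             printf "displayX = %d\n", (x - minX) * displayWidth / (maxX - minX + 1) }'   # -> 540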
Turn on Developer Options and enable Pointer Location; you can then see the x and y coordinates at the top of the screen when you tap. Use those coordinates to send tap events.
Are you aware that getevent (in my experience; this possibly varies between devices) shows base-16 values?
(side note: getevent -l is often easier to read as it prints a string representation of the event types)
i.e. if getevent -l says
/dev/input/event1: EV_ABS ABS_MT_POSITION_X 000001cb
/dev/input/event1: EV_ABS ABS_MT_POSITION_Y 00000376
then the position of the touch is actually (459, 886).
However, it appears that sendevent does not follow suit in requiring hex values (if your code works at all), since your 53 and 54 work where I would have used 0035 and 0036.
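For reference, the conversion can be done with a quick printf in the shell (the values are the ones from the getevent -l output above):
printf '%d %d\n' 0x01cb 0x0376   # -> 459 886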
Edit:
Having tried the original code on a Nexus 5 (with the correct device file substituted in), I found that no touch event is generated (nor when the hexadecimal equivalents are substituted, for experimental rigor), nor when reusing values captured (and converted) from getevent. Previously, I have had better results converting the events with a Python script based on the C one here, and writing the output directly to the device file.
Edit 2:
This question here suggests that the initial code should work.
I'm using a touchscreen (an Atmel maXTouch 1664S) with Android and am finding that the further to the right I go (as X gets larger), the larger the distance between where my finger is and the touch spot on the screen. Would this be a problem with the settings in the IDC file, the driver, or somewhere else? Another OS such as Ubuntu on the same screen doesn't seem to have this problem.
I've used this IDC file to try to correct the position, but the last line just turns the touchscreen into a trackpad:
touch.deviceType = touchScreen
touch.orientationAware = 1
output.x = (raw.x - raw.x.min) * (output.width / raw.width)
The kernel driver isn't detecting and reporting the possible range of the X input reports correctly.
If you use adb shell and run getevent -il, you should get something like:
add device 6: /dev/input/event2
bus: 0000
vendor 0000
product 0000
version 0000
name: "touch_dev"
location: ""
id: ""
version: 1.0.1
events:
ABS (0003): ABS_MT_SLOT : value 0, min 0, max 9, fuzz 0, flat 0, resolution 0
ABS_MT_TOUCH_MAJOR : value 0, min 0, max 15, fuzz 0, flat 0, resolution 0
ABS_MT_POSITION_X : value 0, min 0, max 1535, fuzz 0, flat 0, resolution 0
ABS_MT_POSITION_Y : value 0, min 0, max 2559, fuzz 0, flat 0, resolution 0
ABS_MT_TRACKING_ID : value 0, min 0, max 65535, fuzz 0, flat 0, resolution 0
ABS_MT_PRESSURE : value 0, min 0, max 255, fuzz 0, flat 0, resolution 0
input props:
INPUT_PROP_DIRECT
You can see on my device, the X value can range between 0 and 1535.
If you then run getevent -trl /dev/input/event2, move your finger around the screen, and look at the maximum possible X value, it should correspond:
[ 115960.226411] EV_ABS ABS_MT_POSITION_X 000005ee
0x5ee = 1518, so that's about right.
There are some parameters on the touch controller which adjust this scaling, and need to be in sync with what the kernel driver reports. The standard Linux mainline driver doesn't deal very well with those parameters being out of sync with the platform data. There are patches to address this which haven't gone upstream yet: https://github.com/atmel-maxtouch/linux/commit/002438d207
If the touch is still on screen when you move your finger to the far right, you could probably correct it by doing:
output.x = raw.x / scale
where scale is the ratio of the reported to the desired coordinates. You can't do it the other way round, because the lower input layers will throw away reports that fall outside the screen.
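As a made-up example of deriving that ratio: if the driver reports X values up to 2047 on a panel whose correct range would be 0..1535, the scale is roughly 2048/1536:
# hypothetical numbers, only to illustrate how the scale ratio is derived
awk 'BEGIN { printf "scale = %.3f\n", 2048 / 1536 }'   # -> 1.333, i.e. output.x = raw.x / 1.333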
A proper fix would be to fix the bug in the kernel driver, or adjust the range settings on the touch controller.
You don't say what particular device it is, so it's difficult to help further.
I have a custom Android (2.3) running on a BeagleBoard-xM. We are currently trying to replace a small resistive touchscreen with a bigger capacitive one. However, we can't seem to get touch working with our Android. (The resistive one always worked fine.)
The kernel modules are loaded and the driver is running. The driver outputs the correct touch coordinates when the screen is pressed.
/proc/bus/input/devices contains these entries:
I: Bus=0003 Vendor=0eef Product=7458 Version=0210
N: Name="eGalax Inc. eGalaxTouch EXC7200-7458"
P: Phys=usb-ehci-omap.0-2.3/input0
S: Sysfs=/devices/platform/ehci-omap.0/usb1/1-2/1-2.3/1-2.3:1.0/input/input2
U: Uniq=
H: Handlers=
B: EV=1b
B: KEY=421 0 30001 0 0 0 0 0 0 0 0
B: ABS=100 3f
B: MSC=10
I: Bus=0006 Vendor=0eef Product=0020 Version=0001
N: Name="eGalaxTouch Virtual Device for Multi"
P: Phys=
S: Sysfs=/devices/virtual/input/input5
U: Uniq=
H: Handlers=event2
B: EV=b
B: KEY=400 0 0 0 0 0 0 0 0 0 0
B: ABS=6608000 1000003
I: Bus=0006 Vendor=0eef Product=0010 Version=0001
N: Name="eGalaxTouch Virtual Device for Single"
P: Phys=
S: Sysfs=/devices/virtual/input/input6
U: Uniq=
H: Handlers=event3
B: EV=b
B: KEY=30000 0 0 0 0 0 0 0 0
B: ABS=3
which are correct, according to the guide for the device.
When I touch the screen, Android shows this in logcat:
V/EventHub(10978): /dev/input/event2 got: t0=1594, t1=229858, type=0, code=0, v=0
D/InputReader(10978): Input event: device=0x10002 type=0x0 scancode=0 keycode=0 value=0
V/EventHub(10978): /dev/input/touchscreen0 got: t0=1594, t1=235198, type=3, code=53, v=788
D/InputReader(10978): Input event: device=0x10001 type=0x3 scancode=53 keycode=53 value=788
V/EventHub(10978): /dev/input/event2 got: t0=1594, t1=235198, type=3, code=53, v=788
D/InputReader(10978): Input event: device=0x10002 type=0x3 scancode=53 keycode=53 value=788
V/EventHub(10978): /dev/input/touchscreen0 got: t0=1594, t1=235382, type=3, code=54, v=1512
D/InputReader(10978): Input event: device=0x10001 type=0x3 scancode=54 keycode=54 value=1512
V/EventHub(10978): /dev/input/event2 got: t0=1594, t1=235382, type=3, code=54, v=1512
D/InputReader(10978): Input event: device=0x10002 type=0x3 scancode=54 keycode=54 value=1512
V/EventHub(10978): /dev/input/touchscreen0 got: t0=1594, t1=235473, type=3, code=0, v=788
D/InputReader(10978): Input event: device=0x10001 type=0x3 scancode=0 keycode=0 value=788
This also seems to be correct; codes 53 and 54 are the X and Y coordinates.
However, the Android UI is not reacting to the touch input, and neither is our own app, nor the pointer painter from the development tools. Are the events not forwarded to the input dispatcher? The problem seems to be on the Android side, but I cannot figure it out.
I hope someone can help me there or at least tell me where I can find additional information.