I'm using a touchscreen (Atmel maXTouch 1664S) with Android and finding that the further to the right I go (as X gets larger), the larger the distance between where my finger is and the touch spot on the screen. Would this be a problem with settings in the IDC file, the driver, or somewhere else? Another OS such as Ubuntu on the same screen doesn't seem to have this problem.
I've used this IDC file to try to correct the position, but the last line just turns the touchscreen into a trackpad.
touch.deviceType = touchScreen
touch.orientationAware = 1
output.x = (raw.x - raw.x.min) * (output.width / raw.width)
The kernel driver isn't detecting and reporting the possible range of the X axis correctly.
If you use adb shell and run getevent -il, you should get something like:
add device 6: /dev/input/event2
bus: 0000
vendor 0000
product 0000
version 0000
name: "touch_dev"
location: ""
id: ""
version: 1.0.1
events:
ABS (0003): ABS_MT_SLOT : value 0, min 0, max 9, fuzz 0, flat 0, resolution 0
ABS_MT_TOUCH_MAJOR : value 0, min 0, max 15, fuzz 0, flat 0, resolution 0
ABS_MT_POSITION_X : value 0, min 0, max 1535, fuzz 0, flat 0, resolution 0
ABS_MT_POSITION_Y : value 0, min 0, max 2559, fuzz 0, flat 0, resolution 0
ABS_MT_TRACKING_ID : value 0, min 0, max 65535, fuzz 0, flat 0, resolution 0
ABS_MT_PRESSURE : value 0, min 0, max 255, fuzz 0, flat 0, resolution 0
input props:
INPUT_PROP_DIRECT
You can see that on my device the X value can range between 0 and 1535.
If you then run getevent -trl /dev/input/event2, move your finger around the screen, and look at the largest X value reported, it should correspond:
[ 115960.226411] EV_ABS ABS_MT_POSITION_X 000005ee
0x5ee = 1518, so that's about right.
There are some parameters on the touch controller which adjust this scaling, and need to be in sync with what the kernel driver reports. The standard Linux mainline driver doesn't deal very well with those parameters being out of sync with the platform data. There are patches to address this which haven't gone upstream yet: https://github.com/atmel-maxtouch/linux/commit/002438d207
If the touch is still on screen when you move your finger to the far right, you could probably correct it by doing
output.x = raw.x / scale
where scale is the ratio of the reported to the desired coordinates. You can't do it the other way round, because the lower input layers will throw away reports that fall outside the screen.
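For example, here is a minimal sketch of such an IDC file using the expression above. The numbers are purely illustrative assumptions, not measurements from this device: suppose touches at the physical right edge only report a raw X of about 1400 while the driver advertises a maximum of 1535, giving scale ≈ 1400 / 1535 ≈ 0.91:
# Vendor_XXXX_Product_XXXX.idc - illustrative values only
touch.deviceType = touchScreen
touch.orientationAware = 1
# scale = reported / desired = 1400 / 1535 ≈ 0.91
output.x = raw.x / 0.91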
A proper fix would be to fix the bug in the kernel driver, or adjust the range settings on the touch controller.
You don't say what particular device it is, so it's difficult to help further.
I am struggling with sending a key combination like "Ctrl" + "A" on my Samsung J7 with Android 8.1.0.
After searching the Internet, I found some commands that help with sending text via ADB, such as:
adb shell input keyevent 29 => a
adb shell input text "a" => a
However, when I used sendevent to send the text "a", it didn't work (the same goes for "Ctrl" + "A"). My commands are below:
adb shell sendevent /dev/input/event6 1 30 1 => Key down.
adb shell sendevent /dev/input/event6 0 0 0 => End report.
adb shell sendevent /dev/input/event6 1 30 0 => Key up.
adb shell sendevent /dev/input/event6 0 0 0 => End report.
Then I checked the key codes by following the link, and executed the command:
adb shell getevent -p
The output is:
add device 1: /dev/input/event2
name: "accelerometer_sensor"
events:
REL (0002): 0000 0001 0002 0007 0009
input props:
<none>
add device 2: /dev/input/event3
name: "proximity_sensor"
events:
ABS (0003): 0019 : value 1, min 0, max 1, fuzz 0, flat 0, resolution 0
input props:
<none>
add device 3: /dev/input/event6
name: "sec_touchscreen"
events:
KEY (0001): 0145* 014a
ABS (0003): 0000 : value 0, min 0, max 1079, fuzz 0, flat 0, resolution 0
0001 : value 0, min 0, max 1919, fuzz 0, flat 0, resolution 0
002f : value 0, min 0, max 9, fuzz 0, flat 0, resolution 0
0030 : value 0, min 0, max 255, fuzz 0, flat 0, resolution 0
0031 : value 0, min 0, max 255, fuzz 0, flat 0, resolution 0
0035 : value 0, min 0, max 1079, fuzz 0, flat 0, resolution 0
0036 : value 0, min 0, max 1919, fuzz 0, flat 0, resolution 0
0039 : value 0, min 0, max 65535, fuzz 0, flat 0, resolution 0
003e : value 0, min 0, max 1, fuzz 0, flat 0, resolution 0
input props:
INPUT_PROP_DIRECT
add device 4: /dev/input/event4
name: "Codec3026 Headset Events"
events:
KEY (0001): 0072 0073 00e2 0246
input props:
<none>
add device 5: /dev/input/event0
name: "meta_event"
events:
REL (0002): 0006 0007
input props:
<none>
could not get driver version for /dev/input/mice, Not a typewriter
add device 6: /dev/input/event1
name: "sec_touchkey"
events:
KEY (0001): 009e 00fe
LED (0011): 0008
input props:
<none>
add device 7: /dev/input/event5
name: "gpio_keys"
events:
KEY (0001): 0072 0073 0074 00ac
input props:
<none>
The output means the device /dev/input/event6 does not have key events for inputting text.
So my question is: how can I implement a key combination with separate key-down and key-up events?
Thanks so much for helping.
After doing a lot of research, I figured out that the root cause is the meta key code.
I tried to use a third-party application, but the original version couldn't send meta key codes correctly on Android 8+.
Therefore, I made some changes and opened a pull request. While waiting for it to be merged, I have published the modified version in my repo in case anyone needs it. Note that only ADBKeyboard.apk in that repo is updated, so download just that file.
Now I can send Ctrl + A as below (4096 is META_CTRL_ON, 29 is KEYCODE_A):
adb shell am broadcast -a ADB_INPUT_TEXT --es mcode '4096,29'
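For illustration, other combinations should follow the same pattern, assuming the same broadcast action and mcode extra from the modified ADBKeyboard. For instance, Ctrl + C would use 4096 (META_CTRL_ON) and 31 (KEYCODE_C):
adb shell am broadcast -a ADB_INPUT_TEXT --es mcode '4096,31'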
I have a KeDei 3.5 inch SPI TFT LCD display attached to a Raspberry Pi running the Android Things image. When I boot the RPi, nothing is shown, only a black screen. The display is getting powered up, and I can tell the difference between its on and off states. But when I try the remote display (following the steps from this post), I can see the display properly.
Logcat output which I think is related to the display:
mBaseDisplayInfo=DisplayInfo{"Built-in Screen", uniqueId "local:0", app 640 x 480, real 640 x 480, largest app 640 x 480, smallest app 640 x 480,mode 1, defaultMode 1, modes [{id=1, width=640, height=480,fps=60.000004}], colorMode 0, supportedColorModes [0], hdrCapabilities android.view.Display$HdrCapabilities#1d6308, rotation 0, density 240 (0.0 x 0.0) dpi, layerStack 0, appVsyncOff 1000000, presDeadline 16666666, type BUILT_IN, state ON, FLAG_SECURE, FLAG_SUPPORTS_PROTECTED_BUFFERS, removeMode 0}
I tried the latest Raspbian image and dev preview 0.4.1 as mentioned here, and tried the HDMI config given in the same link. Nothing works except the rpi_35_v6.3_ubuntu_mate_15_kedei image from the KeDei vendor.
Display Specs:
480x320 16bit/18bit
version 6.3 2016/11/1
Android Things:
OS: 0.6.1-devpreview
Build: OIM1.171126.016
Empty Bundle
According to the Hardware Platforms description, Raspberry Pi with Android Things supports a display only via the HDMI or DSI interfaces, not via SPI, and there are no built-in drivers for it, so your KeDei 3.5 inch SPI TFT LCD display will not work. You can use that display only from your application via SPI commands (something like a custom driver), but not as the Android Things system display.
I want to send "CTRL + W" to the Chrome for Android to close active tab. I tried lots of things but there is no success to achieve it from terminal. (If i connect a USB Keyboard with OTG, i can close the tab with CTRL+W)
Firstly i do not want to write a application for this, i only want a shell command to use it from Tasker.
I read somewhere that to achieve this (CTRL+W keypress), i have to simulate key presses like this:
Down CTRL
Down W
Up W
Up CTRL
And to achieve this from the terminal, it seems I have to use "sendevent".
I can simulate all hardware key presses with "sendevent" but cannot simulate the normal keys with it.
For example, to press and release the POWER key:
sendevent /dev/input/event1 1 116 1
sendevent /dev/input/event1 0 0 0
sendevent /dev/input/event1 1 116 0
sendevent /dev/input/event1 0 0 0
I use these commands, but I cannot use them to send normal keys (for example a, b, c, etc.).
event1 is the gpio-keys device, so I'm using it. All the other input events are sensors, and one is the charging driver (max77693-muic).
The output of "getevent -p" is:
add device 1: /dev/input/event9
name: "compass_sensor"
events:
REL (0002): 0000 0001 0002 0003 0004 0005 0006 0007
0008 0009
input props:
<none>
add device 2: /dev/input/event6
name: "barometer_sensor"
events:
REL (0002): 0000 0001 0002
input props:
<none>
add device 3: /dev/input/event5
name: "light_sensor"
events:
REL (0002): 0000 0001 0002 0009
input props:
<none>
add device 4: /dev/input/event4
name: "proximity_sensor"
events:
ABS (0003): 0019 : value 1, min 0, max 1, fuzz 0, flat 0, resolution 0
input props:
<none>
add device 5: /dev/input/event3
name: "gyro_sensor"
events:
REL (0002): 0003 0004 0005
input props:
<none>
could not get driver version for /dev/input/mice, Not a typewriter
add device 6: /dev/input/event7
name: "Midas_WM1811 Midas Jack"
events:
KEY (0001): 0072 0073 00e2
SW (0005): 0002 0004
input props:
<none>
add device 7: /dev/input/event1
name: "gpio-keys"
events:
KEY (0001): 0072 0073 0074 00ac
input props:
<none>
add device 8: /dev/input/event0
name: "max77693-muic"
events:
KEY (0001): 0072 0073 00a3 00a4 00a5
input props:
<none>
add device 9: /dev/input/event8
name: "sec_touchkey"
events:
KEY (0001): 008b 009e
LED (0011): 0008
input props:
<none>
add device 10: /dev/input/event2
name: "sec_touchscreen"
events:
ABS (0003): 002f : value 0, min 0, max 9, fuzz 0, flat 0, resolution 0
0030 : value 0, min 0, max 255, fuzz 0, flat 0, resolution 0
0031 : value 0, min 0, max 255, fuzz 0, flat 0, resolution 0
0032 : value 0, min 0, max 30, fuzz 0, flat 0, resolution 0
0035 : value 0, min 0, max 719, fuzz 0, flat 0, resolution 0
0036 : value 0, min 0, max 1279, fuzz 0, flat 0, resolution 0
0039 : value 0, min 0, max 65535, fuzz 0, flat 0, resolution 0
003c : value 0, min -90, max 90, fuzz 0, flat 0, resolution 0
003d : value 0, min 0, max 1, fuzz 0, flat 0, resolution 0
input props:
INPUT_PROP_DIRECT
Also, my gpio-keys layout file "/system/usr/keylayout/gpio-keys.kl" looks like this:
key 115 VOLUME_UP WAKE
key 114 VOLUME_DOWN WAKE
key 116 POWER WAKE
key 172 HOME WAKE
I can send all normal keyevents with:
"input keyevent KEYCODE_X"
and to send more than one:
"input keyevent KEYCODE_X KEYCODE_Y"
You would think it could work like this:
"input keyevent KEYCODE_CTRL_LEFT KEYCODE_W"
but each keyevent goes down and up immediately, so I cannot use it to send the Ctrl+W combination.
I know the answer should involve "sendevent", but I cannot find it.
I also tried adding some fake keys to the key layout file, like this:
key 115 VOLUME_UP WAKE
key 114 VOLUME_DOWN WAKE
key 116 POWER WAKE
key 172 HOME WAKE
key 19 Q
I restarted the phone, then tried:
sendevent /dev/input/event1 1 19 1
sendevent /dev/input/event1 0 0 0
sendevent /dev/input/event1 1 19 0
sendevent /dev/input/event1 0 0 0
But it never writes "Q" into the any textbox.
Please help, thanks for your helps.
Oh YES!! I dont know why but whenever I feel stuck I come to stackoverflow and as soon as I start writing the question somehow I find the answer... xD
Anyways, I was able to do it by following procedure:
Go to /system/usr/keylayout/.
In my case there was no gpio-keys file, so open Generic.kl.
It has all the key codes you would need to simulate anything... for example, the code for CTRL_RIGHT is 97 and for W it is 17.
That's all you need. Now open Tasker --> New Task --> add a Wait of 5 sec --> Run Shell:
input keyevent 97
input keyevent 17
Now run the task and quickly open Chrome. Voila! In 5 seconds you will see your tab disappear!
Hope that helps all the future tasker pros ;)
Kudos...
The events section in getevent -p output lists all accepted key codes:
add device 7: /dev/input/event1
name: "gpio-keys"
events:
KEY (0001): 0072 0073 0074 00ac
i.e. VOLUME_UP (0x73), VOLUME_DOWN (0x72), POWER (0x74) and HOME (0xAC) in the case of /dev/input/event1. Everything else gets filtered out by the Linux kernel input driver long before it reaches the Android framework (where the layout files you tried to modify are used).
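As a quick sanity check, getevent also accepts a -l flag that prints symbolic key labels instead of raw hex codes in the same listing, which should make it easier to see which keys a given device will accept:
adb shell getevent -lp /dev/input/event1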
I'm struggling with the calibration of a touchscreen on the Android platform.
It is a USB single-touch touchscreen from vendor 0dfc and product 0001, as checked with dmesg:
<6>[ 4118.091541] input: USB Touchscreen 0dfc:0001 as /devices/platform/usb20_host/usb2/2-1/2-1.3/2-1.3:1.0/input/input23
I'm pushing the Vendor_0dfc_Product_0001.idc file to /data/system/devices/idc/ (following the documentation from the Android source - IDC).
I got the touch device with all the requirements for single-touch events:
root@android:/ # getevent -il /dev/input/event3
add device 1: /dev/input/event3
bus: 0003
vendor 0dfc
product 0001
version 0202
name: "USB Touchscreen 0dfc:0001"
location: "usb-usb20_host-1.3/input0"
id: ""
version: 1.0.1
events:
KEY (0001): BTN_TOUCH
ABS (0003): ABS_X : value 540, min 0, max 32767, fuzz 0, flat 0, resolution 0
ABS_Y : value 289, min 0, max 32767, fuzz 0, flat 0, resolution 0
input props:
<none>
I also enabled the Pointer Location option from Developer options (Android settings) in order to debug this stage of calibration.
Setup 1
touch.deviceType = touchScreen
With this setup (1), all the gestures on the touchscreen take place at the top-left corner - just a few pixels left/right/up/down no matter the gesture (swipe). The whole touchscreen receives events. All the gestures are reversed - when I swipe left the pointer goes right; when I swipe up, the pointer goes down.
Setup 2
touch.deviceType = pointer
touch.gestureMode = pointer
With this setup (2), as expected, it shows a pointer placed at the position where the last pointer device (mouse) left it. All the gestures on the touchscreen (no matter the swipe size) keep behaving as in setup 1 - they move only a few pixels with each swipe event, and with reversed axes.
Setup 3
touch.deviceType = pointer
touch.gestureMode = spots
With this setup (3), the result is the same as with setup 2. I just did that to prove that the IDC file is being interpreted correctly.
At this stage, as you can see, I have a working IDC file (setup 1) that requires calibration for this touch device.
I tried a lot of combinations from other IDC files (internet samples) and from the Android source - IDC - but not a single other property took effect: raw.*, output.*, touch.size.*
Does anyone know how to properly calibrate a touchscreen in Android and could guide me through this process?
Same here, but my calibration app didn't do anything.
After a while, reading /system/etc/init.sh, I found the following:
mkdir -p /data/misc/tscal
touch /data/misc/tscal/pointercal
chown 1000.1000 /data/misc/tscal /data/misc/tscal/*
chmod 775 /data/misc/tscal
chmod 664 /data/misc/tscal/pointercal
Just run those commands manually, reboot, and start the calibration app.
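If you want a quick check before rebooting (just an ordinary directory listing, assuming the commands above succeeded), you can confirm that the file exists with the expected owner and permissions:
ls -l /data/misc/tscal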
I want to calculate and plot the power consumption of my app over time. The x axis is the time (hours) and the y axis is the power consumption in mW.
I have the discharge values (battery percentage) for my application (100, 93, 82, 78, 71, 64, 59, 49, 41), corresponding to the initial charge, 1 h, 2 h, and so on. The battery of the smartphone is 3.7 V and 1850 mAh. I calculated the power consumption this way:
cons (W) = voltage (V) * discharge amount (%) * capacity (mAh) / discharge time (h)
cons (W) = 3.7 V * 1.85 Ah * [100, 93, 82, 78, 71, 64, 59, 49, 41] / [0.1, 1, 2, 3, 4, 5, 6, 7, 8]
Is that correct? I know there is a way to directly obtain the values I need, but I want to compare several apps and I don't have time to compute the values again. So, based on the previous calculation, what am I doing wrong? I am obtaining values that are too large. Any suggestions?
Android and iOS can show power consumption on a per-app basis.
At least Android should support API calls to access these values.
(These calculations are more valid than just using battery drain, though still not perfect; i.e. they use processor time, readout of sensor values, ...)
possible duplicate: https://stackoverflow.com/questions/23428675/android-to-check-battery-stats-per-application
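For example, per-app battery statistics can also be dumped from a shell, which may be easier when comparing several apps (the package name below is just a placeholder):
adb shell dumpsys batterystats --charged com.example.myapp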