I successfully installed Android on a BeagleBoard C4 (after a week of work). As a display I use a Lilliput 669 with an eGalax USB touch controller. Everything seems OK except for the touchscreen, which has its X and Y axes reversed. Here is the output of cat /proc/bus/input/devices for the touch controller:
I: Bus=0003 Vendor=0eef Product=0001 Version=0210
N: Name="eGalax Inc. USB TouchController"
P: Phys=usb-ehci-omap.0-2.2/input0
S: Sysfs=/devices/platform/ehci-omap.0/usb1/1-2/1-2.2/1-2.2:1.0/input/input1
U: Uniq=
H: Handlers=event1
B: EV=9
B: ABS=600000 3

I: Bus=0003 Vendor=0eef Product=0001 Version=0210
N: Name="eGalax Inc. USB TouchController"
P: Phys=usb-ehci-omap.0-2.2/input0
S: Sysfs=/devices/platform/ehci-omap.0/usb1/1-2/1-2.2/1-2.2:1.0/input/input2
U: Uniq=
H: Handlers=mouse0 event2
B: EV=1b
B: KEY=400 0 0 0 0 0 0 0 0 0 0
B: ABS=600000 3
B: MSC=10
Do you know how to swap the X and Y axes for my touchscreen?
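For illustration, one app-level workaround would be to swap the coordinates inside a view's touch handler. This is only a sketch of that idea (the view class and the rescaling are my own assumptions), not a system-wide fix for the driver:

// Minimal sketch only: swaps X and Y inside a single view's touch handler.
// This is an app-level workaround, not a system-wide fix for the driver.
import android.content.Context;
import android.view.MotionEvent;
import android.view.View;

public class AxisSwapView extends View {

    public AxisSwapView(Context context) {
        super(context);
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        // Swap the reported axes and rescale, since width and height differ.
        float swappedX = event.getY() / getHeight() * getWidth();
        float swappedY = event.getX() / getWidth() * getHeight();
        // Use swappedX / swappedY instead of event.getX() / event.getY().
        handleTouch(swappedX, swappedY);
        return true;
    }

    private void handleTouch(float x, float y) {
        // Application-specific handling of the corrected coordinates.
    }
}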
I know that this post is old, but I ran into this problem recently when I swapped out the broken touchscreen in my Android tablet. The replacement was a poorly manufactured piece of junk... Anyway, after I plugged it in, the X and Y axes were swapped at first; then I turned the connector over and only the X axis was still reversed. I knew that flipping the glass over would fix it, but that would put the connector too far from the port on the motherboard. So I just plugged it in, went to the tablet's settings, and calibrated the touchscreen from the settings menu. Done.
That's it.
SOLUTION:
Just go to the Settings menu, find the "Touchscreen Calibration" option, and calibrate again. That should fix it :) Or just flip the connector around and try again :)
The reason I posted this is to help others who might have this problem :)
I use a 5-inch (800x480) TFT LCD touchscreen connected over HDMI to a Raspberry Pi 3, and I use this driver.
The touch driver works and emits touch points, but the mouse cursor is shown on the display. How do I disable the mouse cursor on the touch screen?
The picture below shows my problem.
OS Version:
1.0.12.5501548
My config.txt in the boot partition is:
max_usb_current=1
hdmi_group=2
hdmi_mode=1
hdmi_mode=87
hdmi_cvt 800 480 60 6 0 0 0
When I add the line below to config.txt, the mouse cursor disappears, but then the touch driver stops working as well.
dtoverlay=ads7846,cs=1,penirq=25,penirq_pull=2,speed=50000,keep_vref_on=0,swapxy=0,pmax=255,xohms=150,xmin=200,xmax=3900,ymin=200,ymax=3900
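Assuming the image is Android-based (API 24 or newer), one possible app-level workaround, rather than a config.txt setting, is to hide the pointer icon over your own window. A sketch under that assumption (the activity name is mine):

// Sketch only, assuming the image is Android-based (API 24+): hide the mouse
// pointer while it is over this app's view hierarchy. This does not change
// config.txt or the kernel driver.
import android.app.Activity;
import android.os.Bundle;
import android.view.PointerIcon;
import android.view.View;

public class NoCursorActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        View root = getWindow().getDecorView();
        // TYPE_NULL makes the pointer invisible over this view hierarchy.
        root.setPointerIcon(PointerIcon.getSystemIcon(this, PointerIcon.TYPE_NULL));
    }
}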
According to the Android documentation (https://developer.android.com/guide/topics/sensors/sensors_motion.html):
linear acceleration = acceleration - acceleration due to gravity.
This seems to work on my two phones and my LG Watch Urbane W150.
BUT, on my new Huawei Watch 2, the linear acceleration still contains the acceleration due to gravity. I'm running the same app on all devices, and only on the Huawei Watch 2 do I see this problem.
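For reference, the same documentation page shows how to derive linear acceleration yourself from the raw accelerometer by low-pass filtering out gravity. A minimal sketch of that approach (the class name is mine; the filter constant 0.8 follows the docs' example), useful for cross-checking the built-in sensor:

// Sketch: derive linear acceleration from TYPE_ACCELEROMETER by low-pass
// filtering out gravity, as in the Android motion sensors documentation.
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;

public class ManualLinearAcceleration implements SensorEventListener {
    private final float[] gravity = new float[3];

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() != Sensor.TYPE_ACCELEROMETER) return;

        final float alpha = 0.8f; // low-pass filter constant from the docs
        float[] linear = new float[3];
        for (int i = 0; i < 3; i++) {
            gravity[i] = alpha * gravity[i] + (1 - alpha) * event.values[i];
            linear[i] = event.values[i] - gravity[i];
        }
        // For a device at rest, linear[0..2] should settle close to zero.
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}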
The command adb shell dumpsys sensorservice on the LG Watch Urbane W150 gives this output while the watch is lying flat on a table:
Linear Acceleration: last 10 events
1 (ts=24353.048620089, wall=17:28:40.031) -0.92, -0.15, 0.23,
2 (ts=24353.115270480, wall=17:28:40.086) -0.79, -0.17, 0.33,
3 (ts=24353.181920870, wall=17:28:40.146) -0.75, -0.14, 0.21,
MPU6515 Accelerometer: last 50 events
1 (ts=93043.344428463, wall=12:33:30.392) -8.18, -2.35, 4.83,
2 (ts=93043.352240963, wall=12:33:30.392) -8.20, -2.35, 4.87,
3 (ts=93043.367865963, wall=12:33:30.392) -8.14, -2.35, 4.81,
As you can see, the x, y and z values of the LG watch's linear acceleration are close to zero, which is expected.
However, the same command on the Huawei Watch 2 gives:
huawei Linear Acceleration Sensor: last 10 events
1 (ts=31287.761652539, wall=17:31:07.258) -2.27, 6.58, 7.51,
2 (ts=31287.829730716, wall=17:31:07.325) -2.48, 6.41, 8.10,
3 (ts=31287.891682226, wall=17:31:07.387) -2.55, 6.60, 7.39,
BMI160 3-axis Accelerometer: last 50 events
1 (ts=100179.672482189, wall=12:39:19.258) 9.30, -3.46, 0.83,
2 (ts=100179.682469756, wall=12:39:19.258) 9.32, -3.46, 0.83,
3 (ts=100179.692457323, wall=12:39:19.258) 9.30, -3.47, 0.84,
And the output of the linear acceleration sensor is clearly wrong: the magnitude of those readings is sqrt(2.27^2 + 6.58^2 + 7.51^2) ≈ 10.2 m/s^2, roughly the acceleration due to gravity, when it should be close to zero for a watch at rest.
Is this a bug in the Huawei Watch 2?
All the Huawei devices I have tried have this problem; maybe it is an SoC design issue, or the driver's math is wrong.
I tested a Huawei P40, Mate 20 and P10, and all of them give a heavily biased result.
The power button on my Android device only works some of the time.
Originally, when I pressed the power button, the power menu showed up immediately, but now it mostly does not appear.
I ran this command in the emulator:
cat /proc/bus/input/devices
and it shows:
I: Bus=0000 Vendor=0000 Product=0000 Version=0000
N: Name="Android Power Button"
P: Phys=
S: Sysfs=/devices/virtual/input/input16
U: Uniq=
H: Handlers=kbd event13
B: PROP=0
B: EV=3
B: KEY=100000 0 0 0
So I can confirm that a power button device is connected.
Can I perhaps use this to handle the power button in my app? The power button hardware itself should certainly be working.
Thanks.
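For reference, apps normally cannot intercept KEYCODE_POWER directly (the system handles it before apps see it); a common indirect way to detect power-button presses is to listen for the screen turning off and on. A minimal sketch of that approach (the class name is mine; the receiver must be registered at runtime, not in the manifest):

// Sketch: detect power-button presses indirectly via screen on/off broadcasts.
// ACTION_SCREEN_ON / ACTION_SCREEN_OFF must be registered at runtime.
import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.content.IntentFilter;

public class PowerButtonWatcher extends BroadcastReceiver {

    public static void register(Context context) {
        IntentFilter filter = new IntentFilter(Intent.ACTION_SCREEN_OFF);
        filter.addAction(Intent.ACTION_SCREEN_ON);
        context.registerReceiver(new PowerButtonWatcher(), filter);
    }

    @Override
    public void onReceive(Context context, Intent intent) {
        if (Intent.ACTION_SCREEN_OFF.equals(intent.getAction())) {
            // The power button (or a screen timeout) turned the screen off.
        } else if (Intent.ACTION_SCREEN_ON.equals(intent.getAction())) {
            // The screen was turned back on.
        }
    }
}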
I'm struggling with the calibration of a touchscreen on the Android platform.
It is a USB single-touch touchscreen, vendor 0dfc and product 0001, as checked with dmesg:
<6>[ 4118.091541] input: USB Touchscreen 0dfc:0001 as /devices/platform/usb20_host/usb2/2-1/2-1.3/2-1.3:1.0/input/input23
I'm pushing the Vendor_0dfc_Product_0001.idc file to /data/system/devices/idc/ (following the documentation from the Android source - IDC).
The touch device meets all the requirements for single-touch events:
root@android:/ # getevent -il /dev/input/event3
add device 1: /dev/input/event3
bus: 0003
vendor 0dfc
product 0001
version 0202
name: "USB Touchscreen 0dfc:0001"
location: "usb-usb20_host-1.3/input0"
id: ""
version: 1.0.1
events:
KEY (0001): BTN_TOUCH
ABS (0003): ABS_X : value 540, min 0, max 32767, fuzz 0, flat 0, resolution 0
ABS_Y : value 289, min 0, max 32767, fuzz 0, flat 0, resolution 0
input props:
<none>
I also enabled the Pointer Location option from Developer options (Android settings) in order to debug this stage of calibration.
Setup 1
touch.deviceType = touchScreen
With this setup (1), all gestures on the touchscreen end up in the top-left corner: the pointer only moves a few pixels left/right/up/down no matter how long the swipe is. The whole touchscreen does generate events. All gestures are also reversed: when I swipe left the pointer goes right, and when I swipe up the pointer goes down.
Setup 2
touch.deviceType = pointer
touch.gestureMode = pointer
With this setup (2), as expected, it shows a pointer placed wherever the last pointer device (a mouse) left it. All gestures on the touchscreen (no matter the swipe size) keep behaving like setup 1: the pointer only moves a few pixels per swipe, and the axes are reversed.
Setup 3
touch.deviceType = pointer
touch.gestureMode = spots
With this setup (3), the result is the same as with setup 2. I only did this to prove that the IDC file is being read and interpreted correctly.
At this stage, as you can see, I have a working IDC file (setup 1) that still requires calibration for this touch device.
I tried a lot of combinations from other IDC files (internet samples) and from the Android source - IDC - but not a single other property took any effect: raw.*, output.*, touch.size.*
Does anyone know how to properly calibrate a touchscreen in Android, and could guide me through the process?
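For anyone debugging the same thing: besides the Pointer Location overlay, a small test view that logs what the framework actually delivers can help when comparing against the raw ABS_X/ABS_Y range seen in getevent. A sketch (class name is mine):

// Debugging sketch only: logs the coordinates Android delivers for each touch,
// to compare against the raw ABS_X/ABS_Y range (0..32767) seen in getevent.
import android.content.Context;
import android.util.Log;
import android.view.MotionEvent;
import android.view.View;

public class TouchDebugView extends View {
    private static final String TAG = "TouchDebug";

    public TouchDebugView(Context context) {
        super(context);
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        Log.d(TAG, "device=" + event.getDeviceId()
                + " x=" + event.getX() + " y=" + event.getY()
                + " rawX=" + event.getRawX() + " rawY=" + event.getRawY());
        return true;
    }
}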
Same here, but my calibration app didn't do anything.
After a while spent reading /system/etc/init.sh, I found the following:
mkdir -p /data/misc/tscal
touch /data/misc/tscal/pointercal
chown 1000.1000 /data/misc/tscal /data/misc/tscal/*
chmod 775 /data/misc/tscal
chmod 664 /data/misc/tscal/pointercal
Just run those commands manually, reboot, and start the calibration app.
I am currently working on a project that aims to evaluate whether an Android app crashes or not. The evaluation process is:
Collect the logs (which record the execution of an app).
Generate formulas to predict the result (the formulas are generated by GP).
Evaluate the logs with the formulas.
Now I can produce the formulas, but for the users' convenience I want to translate the formulas into natural language and tell users why a crash happened. (I think of it as a kind of "inverse natural language processing".)
To explain the idea more clearly, imagine you get a formula like this:
155 - count(onKeyDown) >= 148
It's obvious that if count(onKeyDown) > 7, the result of "155 - count(onKeyDown) >= 148" is false, so a log containing more than 7 onKeyDown events would be predicted as "Failed".
I want to show users that if the onKeyDown event appears more than 7 times (155 - 148 = 7), the app will crash.
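As a concrete illustration of just this simple case, here is a minimal sketch (class and method names are mine; it is written in Java here, though the same structure carries over to C++) that turns a constraint of the form "constant - count(event) >= bound" into a sentence:

// A minimal sketch that handles only the simple pattern from the example
// above: "constant - count(event) >= bound". It rewrites the constraint
// as a human-readable threshold sentence.
public class ThresholdExplainer {

    // The constraint holds only while count(eventName) <= constant - bound,
    // so exceeding that threshold is reported as the crash condition.
    static String explain(int constant, String eventName, int bound) {
        int threshold = constant - bound;  // e.g. 155 - 148 = 7
        return "If the " + eventName + " event appears more than "
                + threshold + " times, the app is predicted to crash.";
    }

    public static void main(String[] args) {
        // Prints: If the onKeyDown event appears more than 7 times, ...
        System.out.println(explain(155, "onKeyDown", 148));
    }
}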
However, the real formula is much more complicated, such as:
(< !( ( SUM( {Att[17]}, Event[5]) <= MAX( {Att[7]}, Att[0] >= Att[11]) OR SUM( {Att[17]}, Event[5]) > MIN( {Att[12]}, 734 > Att[19]) ) OR count(Event[5]) != 1 ) > (< count(Att[4] = Att[3]) >= count(702 != Att[8]) + 348 / SUM( {Att[13]}, 641 < Att[12]) mod 587 - SUM( {Att[13]}, Att[10] < Att[15]) mod MAX( {Att[13]}, Event[2]) + 384 > count(Event[10]) != 1))
I tried to implement this function in C++, but it's quite difficult; here's the snippet of code I am working on right now.
Does anyone know how to implement this function quickly (maybe with some tools or research findings)? Any idea is welcome :)
Thanks in advance.