Differentiate between finger touch and hand/palm rest - android

Is there any technique to differentiate between a finger touch and a palm resting on the surface while drawing on a touch-based OS such as iOS?
Edit: Added the android tag as the question is relevant to any touch-based OS.
Thanks

I can't speak for Android systems, but on iOS I do not believe there is any definitive distinction, as all touches, regardless of surface area, are resolved to individual points.
That said, there are ways you could determine whether a specific touch is a palm or a finger. If you are making a handwriting app, for instance, and you ask the user whether they are left- or right-handed, you could ignore all touches to the right/left of the touch that would logically be the finger/stylus. Your method for eliminating the "palm touches" will be specific to the app you are writing.
Something else I have noticed (although I hasten to add this is not particularly reliable!) is that a palm touch will often produce many small touches around the palm area rather than a single touch, due to the shape of the palm, small palm movement/rolling, etc. If you know you only ever need to receive and use one touch, this may be something you can exploit.
In the end, any technique you use to differentiate a palm touch from a finger touch will probably be logical rather than API/hardware-provided. Again, this is iOS-specific; I am not at all familiar with Android's touch system.
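The handedness heuristic described above can be sketched in a few lines. This is a hypothetical, platform-agnostic illustration in plain Python (no touch APIs involved); the assumption, which the answer itself makes, is that for a right-handed user the palm rests to the right of the pen tip, so the left-most contact is most likely the intended touch.

```python
def reject_palm(touches, right_handed=True):
    """touches: list of (x, y) contact points from one frame.
    Returns the single point most likely to be the pen/finger,
    or None if there are no contacts."""
    if not touches:
        return None
    # Right-handed: the palm sits to the right, so keep the
    # left-most contact; mirror the logic for left-handed users.
    pick = min if right_handed else max
    return pick(touches, key=lambda p: p[0])
```

A real app would combine this with the other hint from the answer (a palm tends to appear as a cluster of small, nearby touches) rather than rely on x-position alone.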

Related

How to receive touch events when touch area is very large (e.g. ball of the hand)? (Android/iOS)

I am working on a project for which I have to measure the touch surface area. This works on both Android and iOS as long as the touch area is small (e.g. using the thumb). However, when the touch area increases (e.g. using the ball of the hand), the touch events are no longer passed to the application.
On my iPhone X (software version 14.6), the events were no longer passed to the app when UITouch.majorRadius exceeded 170, and on my Android device (Redmi 9, Android version 10) when MotionEvent.getPressure exceeded 0.44.
I couldn't find any documentation on this behavior, but I assume it's there to protect against erroneous inputs.
I looked in the settings of both devices, but I did not find a way to turn this behavior off.
Is there any way to still receive touch events when the touch area is large?
Are there other Android or iOS devices that don't show this behavior?
I would appreciate any help.
So I've actually done some work on touch with unusual areas. I was focusing on multitouch, but it's somewhat comparable. The quick answer is no, because natively, to the hardware, there is no such thing as a "touch event".
You have capacitance changes being detected. That data is HEAVILY filtered by the drivers, which try to take capacitance differences and turn them into events. The OS does not deliver raw capacitance data to apps; it assumes you always want the filtered version. And if it did deliver it, the data would be very hardware-specific, and you'd have to reinterpret it into touch events yourself.
Here are a few things you're going to find out about touch:
1) Pressure on Android isn't what you should be looking at. Pressure is meant for things like styluses. You want getSize, which returns the normalized size. Pressure is more about how hard someone is pushing, which really doesn't apply to finger touches these days.
2) Your results will vary GREATLY by hardware. Every sensor will differ from every other.
3) The OS will confuse large touch areas and multitouch. Part of this is because when you make contact with a large area like the heel of your hand, the contact is not uniform throughout, which means the capacitances will differ, which will make it think it's seeing multiple fingers. When doing heavy multitouch, you'll see the reverse as well (several nearby fingers look like one large touch). This is because the difference between the two, at a physical level, is hard to tell.
4) We were writing an app that enabled 10-finger multitouch actions on keyboards. We found that we missed high-level multitouch from women (especially Asian women) more than others; hand size greatly affected this, as did how much they hovered versus pressed down. The idea that there were physical capacitance differences in the skin was considered; we believed it was more due to touching the device more lightly, but we can't rule out actual physical differences.
Some of that is just a dump, but I think you'll need to know to look out for it as you continue. I'm not sure exactly what you're trying to do, but best of luck.
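Point 1) above suggests classifying contacts by normalized size (what MotionEvent.getSize reports on Android) rather than pressure. Here is a minimal plain-Python sketch of that idea; the 0.25 threshold is an invented placeholder, since, as point 2) warns, real values vary greatly per sensor and would need per-device calibration.

```python
def classify_contact(size):
    """size: normalized touch area in [0.0, 1.0], as reported by
    something like MotionEvent.getSize on Android.
    Returns a rough label for the contact."""
    THRESHOLD = 0.25  # placeholder value; calibrate per device
    if size >= THRESHOLD:
        return "palm/large area"
    return "finger"
```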

Is it possible to take stylus input in Flutter?

I want to make a drawing app using Flutter. There is a widget called CustomPaint that gives you a Canvas you can easily draw on with your fingers.
Let's say I want to use a tablet with a dedicated stylus: will CustomPaint take pressure sensitivity into account automatically?
If not, what should I do for my app to support the stylus?
I've been looking around for example apps, and the only ones I found don't mention the possibility of pressure sensitivity or even plain stylus usage.
Example apps
https://github.com/vemarav/signature
https://github.com/psuzn/draw-it
For basic input handling you would use the GestureDetector widget.
For low-level input detection you can use the Listener widget, which has onPointerDown, onPointerMove, onPointerHover and onPointerUp event listeners (and more) that you can use to get information from your stylus.
The information you can get from these listeners is found on the corresponding PointerEvent passed to each event listener. One of the things a PointerEvent gives you is the pressure.
You can find a basic introduction to input detection under Taps, drags, and other gestures.
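A drawing app typically takes the pressure reading from the PointerEvent (which, in Flutter, comes with pressureMin and pressureMax bounds) and maps it onto a stroke width. Since Flutter itself is Dart, this is only a language-agnostic sketch of that mapping in Python; the width range is an arbitrary example.

```python
def stroke_width(pressure, p_min, p_max, w_min=1.0, w_max=8.0):
    """Linearly map a stylus pressure reading onto a stroke width.
    p_min/p_max are the device's reported pressure bounds;
    w_min/w_max are the app's chosen stroke-width range."""
    if p_max <= p_min:
        # Device reports no usable pressure range (e.g. a plain
        # finger touch): fall back to the thinnest stroke.
        return w_min
    t = (pressure - p_min) / (p_max - p_min)
    t = max(0.0, min(1.0, t))  # clamp out-of-range readings
    return w_min + t * (w_max - w_min)
```

In a Flutter Listener callback, the equivalent would read event.pressure, event.pressureMin and event.pressureMax and feed the result into the painter's stroke width.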

Detect pressure on screen without finger-like object

I'm trying to measure the pressure of something on an Android screen; however, it can't be detected using an onTouch method because it isn't anything like a finger (it's something like a bottle, book, etc.). Is there a way to bypass this, or can the touch screen not read these kinds of objects?
I don't need a specific pressure measurement, just either 0 or 1, but nothing I've found ever addresses this.
To give you a complete answer:
"A capacitive touch screen is a control display that uses the conductive touch of a human finger or a specialized device for input.
Unlike resistive and surface wave panels, which can sense input from either fingers or simple styluses, capacitive touch screen panels must be touched with a finger or a special capacitive pen or glove. The panel is coated with a material that can store electrical charges and the location of touch to the screen is signaled by the change in capacitance in that location."
Most devices these days use capacitive screens, so you probably won't be able to achieve what you want. (Quote taken from: https://whatis.techtarget.com/definition/capacitive-touch-screen)

How to do pinch-and-zoom with a touchpad on ChromeOS/Android?

I have an Android app which I'm running on a Chromebook. I have views which scale with pinch-and-zoom gestures when the user touches the device's screen, and these work fine on the Chromebook. I'm trying to get pinch-and-zoom working with the touchpad as well.
I can three-finger drag scrollable elements. I can two-finger drag and it drags around screen elements where dragging makes sense. I still get hover events and the events claim there are two pointers, so that's all good. However, as soon as the fingers start moving in opposing directions, the event stream stops.
Is there any way I can get the unfiltered everything input event stream so I can see what's going on? I feel like the emulation layer's best-effort attempt to make everything "just work" (and it's a really good effort!) is biting me here. I also notice that some events are coming in as generic motion events, and some are coming in as touch events. And some, like tap-to-click do some of each. If it matters, the input device data for ChromeOS Mouse claims it has the ( touchscreen mouse ) sources, which mostly makes sense. Except shouldn't it be touchpad instead since it's not directly attached to a display?
On this page, list item #5 implies that some kind of synthetic event might be created and used somehow. Is there any way to see if those are being generated? And if yes, how would I take advantage?
Help!
A little more detail: Single finger operation of the touchpad gives me ACTION_HOVER_MOVE generic events. Two-finger drag gives me ACTION_MOVE touch events so long as both fingers are moving together. As soon as they start heading in different directions, the event stream stops.
Pinch-to-zoom support for the touchpad is still a work in progress. Once it is there, it will work seamlessly with the standard gesture recognizer used for touchscreen zoom, so you should not have to do anything.
I can highly recommend targeting API level 24 if you want to target Chromebooks; there are also more details on input devices on Chromebooks to be found here: https://developer.android.com/topic/arc/input-compatibility.html
edit: The "touchpad" device type is very confusingly named. It is reserved for off-screen devices. The touchpad is treated as a mouse since it moves the mouse cursor on screen.

Rapid succession of taps interpreted as a single moving touch event? (tested on Nexus 10)

I'm developing an Android application in which a user may be rapidly tapping with multiple fingers. Initial testing on the Nexus 10 tablet reveals that a tap rapidly followed by another tap nearby is often interpreted as a single touch-and-slide motion, with both touch points attributed to a single pointer index. This appears to be a hardware issue, or at least an issue with some low-level touch processing in Android.
Similarly, I've seen tapping with two fingers placed together simultaneously being interpreted as a single touch from one large finger.
Has anyone experienced similar difficulty with rapid pairs of touches being mistakenly interpreted as a single touch? I'm hoping that someone has some clever workarounds, or ways to infer that an apparently moving touch may actually be a pair of separate taps.
To be clear, my observations are based on visualizing touches by turning on "Pointer Location" from the developer settings, and so are independent of my application.
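One possible workaround along the lines the question asks about (a sketch, not a tested solution) is to post-process the samples of a single tracked pointer: if two consecutive samples are farther apart than a finger could plausibly move in one frame, split the stream into two separate taps instead of treating it as one touch-and-slide. The max_jump threshold below is invented and would need tuning per device and sample rate.

```python
import math

def split_merged_taps(samples, max_jump=150.0):
    """samples: list of (x, y) positions for one pointer id, in
    arrival order. Returns a list of sample groups, split wherever
    the jump between consecutive samples exceeds max_jump pixels."""
    groups = [[samples[0]]] if samples else []
    for prev, cur in zip(samples, samples[1:]):
        if math.dist(prev, cur) > max_jump:
            groups.append([cur])      # implausible jump: new tap
        else:
            groups[-1].append(cur)    # plausible motion: same touch
    return groups
```

Each resulting group could then be re-interpreted as its own tap, at the cost of also splitting any genuinely fast swipes.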
