I want to make a drawing app using Flutter. There is this widget called CustomPaint that allows you to easily have a Canvas and draw on it with your fingers.
Let's say that I want to use a tablet with a dedicated stylus: will CustomPaint take pressure sensitivity into account automatically?
If not, what should I do for my app to support the stylus?
I've been looking around for example apps, and the only ones I found don't even mention the possibility of pressure sensitivity, or even plain usage with a stylus.
Example apps
https://github.com/vemarav/signature
https://github.com/psuzn/draw-it
For basic input handling you would use the GestureDetector widget.
For low-level input detection you can use the Listener widget, which has onPointerDown, onPointerMove, onPointerHover and onPointerUp event listeners (and many more) that you can use to get the information from your stylus.
The information you can get from these listeners is found on the corresponding PointerEvent passed to each event listener. One of the properties you can read from a PointerEvent is the pressure.
You can find a basic introduction to input detection under Taps, drags, and other gestures.
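For context: on Android, the pressure value that Flutter surfaces on PointerEvent ultimately comes from the platform's MotionEvent. A minimal native sketch (a plain Android View outside Flutter, shown only to illustrate the stylus data the platform reports; the class name and log tag are made up for illustration):

    import android.content.Context;
    import android.util.Log;
    import android.view.MotionEvent;
    import android.view.View;

    // Plain Android View that logs the stylus data the platform reports.
    // Flutter's PointerEvent.pressure is populated from this same source.
    public class StylusProbeView extends View {
        public StylusProbeView(Context context) {
            super(context);
        }

        @Override
        public boolean onTouchEvent(MotionEvent event) {
            int index = event.getActionIndex();
            boolean isStylus =
                    event.getToolType(index) == MotionEvent.TOOL_TYPE_STYLUS;
            float pressure = event.getPressure(index); // typically normalized to 0..1
            Log.d("StylusProbe", "stylus=" + isStylus + " pressure=" + pressure);
            return true; // keep receiving ACTION_MOVE / ACTION_UP
        }
    }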
Related
I am working on a project for which I have to measure the touch surface area. This works on both Android and iOS as long as the surface area is small (e.g. using the thumb). However, when the touch area increases (e.g. using the ball of the hand), the touch events are no longer passed to the application.
On my iPhone X (software version 14.6), the events were no longer passed to the app when UITouch.majorRadius exceeded 170, and on my Android device (Redmi 9, Android version 10) when MotionEvent.getPressure exceeded 0.44.
I couldn't find any documentation on this behavior, but I assume it's there to protect against erroneous input.
I looked in the settings of both devices, but I did not find a way to turn this behavior off.
Is there any way to still receive touch events when the touch area is large?
Are there other Android or iOS devices that don't show this behavior?
I would appreciate any help.
So I've actually done some work on touch with unusual areas. I was focusing on multitouch, but it's somewhat comparable. The quick answer is no, because to the hardware there is natively no such thing as a "touch event".
What you have are capacitance changes being detected. That is HEAVILY filtered by the drivers, which try to take capacitance differences and turn them into events. The OS does not deliver raw capacitance data to the apps; it assumes you always want the filtered versions. And if it did deliver that, it would be very hardware specific, and you'd have to reinterpret it into touch events yourself.
Here are a few things you're going to find out about touch:
1) Pressure on Android isn't what you should be looking at. Pressure is meant for things like styluses. You want getSize, which returns the normalized size; pressure is more about how hard someone is pushing, which really doesn't apply to finger touches these days (see the sketch after this answer).
2) Your results will vary GREATLY by hardware. Every different sensor will differ from the others.
3) The OS will confuse large touch areas and multitouch. Part of this is because when you make contact with a large area like the heel of your hand, the contact is not uniform throughout. That means the capacitances will differ, which will make it think it is seeing multiple fingers. When doing heavy multitouch you'll see the reverse as well (several nearby fingers look like one large touch). This is because the difference between the two, on a physical level, is hard to tell.
4) We were writing an app that enabled 10-finger multitouch actions on keyboards. We found that we missed high-level multitouch from women (especially Asian women) more than from others; the size of your hand greatly affected this, as does how much they hover versus press down. The idea that there were physical capacitance differences in the skin was considered; we believed it was more due to touching the device more lightly, but we can't rule out actual physical differences.
Some of that is just a brain dump, because I think you'll need to know what to look out for as you continue. I'm not sure exactly what you're trying to do, but best of luck.
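To make point 1 concrete, here is a minimal sketch that reads the normalized contact size alongside pressure in a plain Android View; the 0.5f threshold and the class name are only illustrative:

    import android.content.Context;
    import android.util.Log;
    import android.view.MotionEvent;
    import android.view.View;

    // Logs the normalized contact size (getSize) and pressure for every pointer.
    public class TouchSizeView extends View {
        public TouchSizeView(Context context) {
            super(context);
        }

        @Override
        public boolean onTouchEvent(MotionEvent event) {
            for (int i = 0; i < event.getPointerCount(); i++) {
                float size = event.getSize(i);         // normalized 0..1, hardware dependent
                float pressure = event.getPressure(i); // mostly meaningful for styluses
                Log.d("TouchSize", "id=" + event.getPointerId(i)
                        + " size=" + size + " pressure=" + pressure);
                if (size > 0.5f) {
                    // Unusually large contact, e.g. the heel of a hand (illustrative threshold).
                }
            }
            return true;
        }
    }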
I'm trying to measure the pressure of something on an Android screen; however, it can't be detected using an onTouch method because it isn't anything like a finger (something like a bottle, book, etc.). Is there a way to bypass this, or can the touch screen not read these kinds of objects?
I don't need a specific pressure measurement, just either 0 or 1, but nothing I've found ever addresses this.
To give you a complete answer:
"A capacitive touch screen is a control display that uses the conductive touch of a human finger or a specialized device for input.
Unlike resistive and surface wave panels, which can sense input from either fingers or simple styluses, capacitive touch screen panels must be touched with a finger or a special capacitive pen or glove. The panel is coated with a material that can store electrical charges and the location of touch to the screen is signaled by the change in capacitance in that location."
Most devices these days use capacitive screens, and therefore you probably won't be able to achieve what you want to achieve. Quote taken from: https://whatis.techtarget.com/definition/capacitive-touch-screen
In my application, I need to draw strokes to the view from touches. I have to save the previous coordinates on touch down and touch move (after the touch move has been processed). When I looked at the API, I found GetHistoricalX and GetHistoricalY.
1) How do these historical data work? Will they ever be removed?
2) Does it start keeping all the historical data once the touch starts moving?
Since I am using Xamarin.Forms, which also targets iOS: does iOS have the same thing?
Android:
On Android, GetHistoricalX|Y will contain the X/Y coordinates that have not yet been reported since the last ACTION_MOVE event (i.e. these are batched into a single touch event for efficiency).
The coordinates are "historical" only insofar as they are older than the current coordinates in the batch; however, they are still distinct from any other coordinates reported in prior motion events. To process all coordinates in the batch in time order, first consume the historical coordinates then consume the current coordinates.
Note: Since there is no standard input sampling rate defined for /dev/input/event0, the rate is determined by the hardware developer and how their digitizer grid driver is written/configured. Android will then collect the number of samples available and offer those to the developer within the historical data, along with the most current X/Y sample. If anyone knows how to get this frequency from the OS, I would love to know ;-)
You can use GetHistorySize to get the number of "points" available, process them first and then process the current X/Y, but remember these are only the locations since the last batched move event.
There is sample Java code under the Batching section at https://developer.android.com/reference/android/view/MotionEvent.html
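A minimal sketch of that batching pattern in plain Android Java; the Xamarin GetHistoricalX/GetHistorySize calls map onto the getHistoricalX/getHistorySize methods used here, and addStrokePoint is a hypothetical helper (one possible implementation is sketched under "Net Result" below):

    import android.content.Context;
    import android.view.MotionEvent;
    import android.view.View;

    public class StrokeView extends View {
        public StrokeView(Context context) {
            super(context);
        }

        @Override
        public boolean onTouchEvent(MotionEvent event) {
            if (event.getActionMasked() == MotionEvent.ACTION_MOVE) {
                final int historySize = event.getHistorySize();
                final int pointerCount = event.getPointerCount();
                // Batched "historical" samples first (oldest to newest)...
                for (int h = 0; h < historySize; h++) {
                    for (int p = 0; p < pointerCount; p++) {
                        addStrokePoint(event.getPointerId(p),
                                event.getHistoricalX(p, h),
                                event.getHistoricalY(p, h));
                    }
                }
                // ...then the current sample for each pointer.
                for (int p = 0; p < pointerCount; p++) {
                    addStrokePoint(event.getPointerId(p), event.getX(p), event.getY(p));
                }
            }
            return true;
        }

        // Hypothetical helper; one possible implementation is sketched under
        // "Net Result" below.
        private void addStrokePoint(int pointerId, float x, float y) {
        }
    }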
iOS:
On iOS, the number of touch events reported is based on a 60 Hz sampling rate of the digitizer. Some iDevices have a faster frequency (newer iPads at 120 Hz and the iPad Pro at 240 Hz). These "extra" points are reported within the coalescedTouchesForTouch method (Xamarin.iOS = GetCoalescedTouches).
Note: iOS even has predictedTouchesForTouch (Xamarin.iOS = GetPredictedTouches), which might be available within the UIEvent. These can be used to "pre-draw" where the user might be moving to; Apple has developer code samples of this for the Apple Pencil, to prevent a visual "lag" from the tip of the pencil.
Net Result:
In the end, if you need to preserve a history of X/Y touch locations in order to replay them, you will need to store these yourself, as neither iOS nor Android is going to buffer them outside of a touch event.
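A minimal sketch of such an app-side buffer, filling in the hypothetical addStrokePoint helper from the Android sketch above (these members would live in the same View subclass):

    import android.graphics.PointF;
    import java.util.ArrayList;
    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    // One list of points per pointer id, kept entirely by the app so the
    // strokes can be replayed later.
    private final Map<Integer, List<PointF>> strokes = new HashMap<>();

    private void addStrokePoint(int pointerId, float x, float y) {
        List<PointF> stroke = strokes.get(pointerId);
        if (stroke == null) {
            stroke = new ArrayList<>();
            strokes.put(pointerId, stroke);
        }
        stroke.add(new PointF(x, y));
    }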
Is there any technique to differentiate between a finger touch and a palm resting on the surface while drawing on a touch surface in iOS or another touch-based OS?
Edit: Added the android tag, as the question is relevant for any touch-based OS.
Thanks
I can't speak for Android systems, but on iOS I do not believe there is any definitive distinction, as all touches, regardless of surface area, are resolved to individual points.
That said, there are ways you could determine whether a specific touch is a palm or a finger. If you are making a handwriting app, for instance, you could ask the user whether they are left- or right-handed and then ignore all touches to the right/left of the touch that would logically be the finger/stylus. Your method for eliminating the "palm touches" would be specific to the app you are writing.
Something else I have noticed (although I hasten to add this is not particularly reliable!) is that a palm touch will often result in many small touches around the palm area rather than a single touch due to the shape of the palm, small palm movement/rolling, etc. If you know you only ever need to receive and use one touch, this may be something you can manipulate.
In the end, any technique you use to differentiate a palm touch from a finger touch will probably be logical rather than API/hardware-provided. Again, this is iOS-specific; I am not at all familiar with Android's touch system.
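On Android (the question now covers any touch-based OS), one rough way to express the same kind of logic is to prefer a stylus pointer when one is present and otherwise keep only small contacts. This is a heuristic sketch; the 0.3f size threshold and the helper name are purely illustrative:

    import android.view.MotionEvent;

    // Helper to drop into a View's onTouchEvent: returns the pointer index to
    // treat as the "pen", or -1 if everything looks like a palm.
    static int choosePointerIndex(MotionEvent event) {
        // 1. A real stylus reports its own tool type; trust that first.
        for (int i = 0; i < event.getPointerCount(); i++) {
            if (event.getToolType(i) == MotionEvent.TOOL_TYPE_STYLUS) {
                return i;
            }
        }
        // 2. Otherwise pick the smallest contact, and reject even that one if it
        //    is still large enough to look like a palm (0.3f is illustrative).
        int best = -1;
        float bestSize = Float.MAX_VALUE;
        for (int i = 0; i < event.getPointerCount(); i++) {
            if (event.getSize(i) < bestSize) {
                bestSize = event.getSize(i);
                best = i;
            }
        }
        return (bestSize < 0.3f) ? best : -1;
    }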
I am looking into developing a new input method on the Android platform that would emulate touch input on the screen. Is it possible to create a service that can interact directly or indirectly with the touch API to achieve this?
Specifics:
The interactions will come in the form of colour tracking from the camera, which is processed into x/y coordinates and touch 0/1 events. Is it possible to have these interact with the touchscreen just as if they were touches on the screen itself?
I understand that there may be permission problems with this approach of 'injecting' control or piggybacking.
Also, this is a technical exercise for an experimental report rather than a distributable app/piece of software, so root/modifications are not a problem.
I have searched to no avail on the subject (at least not for the Android platform), and I would like to find out the feasibility/difficulty of the project before undertaking it, so any input would be much appreciated!
I'm sort of guessing here, but MotionEvent has some setter functions like setLocation(float x, float y). There are also MotionEvent.PointerCoords and MotionEvent.PointerProperties to play with.
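Building on that guess: for in-process experiments, MotionEvent.obtain plus View#dispatchTouchEvent is enough to synthesize touches, while injecting into other apps needs Instrumentation#sendPointerSync together with the INJECT_EVENTS permission (or root, which the question says is acceptable). A minimal sketch of the in-process case; injectTap and target are names chosen here for illustration:

    import android.os.SystemClock;
    import android.view.MotionEvent;
    import android.view.View;

    // Synthesize a down/up pair at (x, y) and deliver it to a view in your own app.
    static void injectTap(View target, float x, float y) {
        long downTime = SystemClock.uptimeMillis();
        MotionEvent down = MotionEvent.obtain(
                downTime, downTime, MotionEvent.ACTION_DOWN, x, y, /* metaState */ 0);
        MotionEvent up = MotionEvent.obtain(
                downTime, downTime + 50, MotionEvent.ACTION_UP, x, y, 0);
        target.dispatchTouchEvent(down);
        target.dispatchTouchEvent(up);
        down.recycle();
        up.recycle();
    }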