I am an amateur (desktop) programmer, but I want to get into phones. I have some ideas for apps, but the touchscreen and its inputs confuse me.
I know that touchscreens can accept multiple points of touch. For instance, to zoom in you place two fingers on the screen and spread them apart, and to zoom out you do the opposite.
Here is my problem, though: I've never seen functionality in any phone app on any phone (I use Windows Phone and Android devices) where the input is multiple touch points that don't begin at the same time.
For the sake of illustration, here's an example. Suppose you have a mini browser on a phone with a vertical scroll bar and a horizontal one. What I want is to scroll down and, WHILE I am scrolling down, also move the horizontal bar so I can pan the page left or right. That is, a few seconds after I touch the screen and begin moving the vertical scrollbar up or down, I want to use a different finger to touch the horizontal scrollbar and move it as well, at the same time.
Is this even possible? Are there certain hardware or software limitations preventing something like this?
You are mixing up gestures and touches. Gestures are touch behaviors, such as:
Two fingers placed at the same time that grow apart from each other means zoom-in.
Tap and hold means context popup.
Tap and drag equals scroll.
You can cancel these gestures when your app doesn't follow these conventions. For example, Angry Birds doesn't scroll if you tap and drag on a bird, but it does if you do it elsewhere in the scene.
By default, gesture handling does not detect additional touches while you are performing a gesture. If you scroll and introduce a second finger to tap a button while still holding the scroll finger, nothing will happen. I'm not sure whether you can override this behavior (and I don't think it would be a good idea either).
Touches, on the other hand, allow a certain number of simultaneous touches, depending on the device. When a touch is not a gesture, you can start a second touch (or an nth one) after the first one starts.
You can try this out yourself at http://raphaeljs.com/touches.html.
Now, going back to your example: it depends on how it's implemented. If you are using the OS gestures (tap and drag anywhere), then no, you can't introduce a second finger to drag horizontally; you'd use the same finger you scroll vertically with (panning with a single finger). But if you have actual scrollbars (like those in mouse interfaces), then yes, you can implement the kind of interface you are describing.
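To make that second case concrete, here's a minimal sketch (Android/Java) of how independent pointers could drive the two bars; the pointer-ID fields and update*() helpers are hypothetical placeholders for your own logic:

@Override
public boolean onTouchEvent(MotionEvent event) {
    // Each finger keeps its own pointer ID for the life of the gesture,
    // so one ID can drive the vertical bar while another drives the
    // horizontal one. The IDs would be assigned to a bar on ACTION_DOWN /
    // ACTION_POINTER_DOWN depending on where each finger lands.
    for (int i = 0; i < event.getPointerCount(); i++) {
        int id = event.getPointerId(i);
        if (id == verticalBarPointerId) {
            updateVerticalScroll(event.getY(i));    // hypothetical helper
        } else if (id == horizontalBarPointerId) {
            updateHorizontalScroll(event.getX(i));  // hypothetical helper
        }
    }
    return true;
}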
Yes, it's possible, and it depends on the phone, but since most Android devices and all WP7 devices have multi-touch, it shouldn't be a problem.
Related
I'm working on an Android application on a tablet to draw every point touched by the finger or the hand. It is very important for the application to track not only the fingers but also the palm (even if it doesn't draw the exact touched area but only a thin path; as long as we can see the movement of the palm, it's fine). A good example of what I want is the "show pointer location" option in the Android developer menu: if the drawing could be exactly like that, it would be perfect.
I managed to code a multi-touch drawing app, but the problem I have is that every time the touched area is too wide (for example, when I touch the screen with my entire palm), the application stops drawing (even the finger drawings stop) and I get this error in the log: [ViewRootImpl] action cancel - 1, eccen:1.4225352 (the number after "eccen" changes depending on the size of my palm touch).
I'm quite new to Android. I've spent a lot of time searching for a way to prevent this ACTION_CANCEL, but I couldn't find anything that works. I tried to prevent the parent views from taking control of the onTouch events, but it didn't work. So if you have any idea how I can manage that, it would be great :)
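For reference, this is roughly what I tried in my drawing view (a sketch; drawTouchedPoints() stands in for my actual drawing code), and it did not stop the cancel:

@Override
public boolean onTouchEvent(MotionEvent event) {
    // Ask all ancestor views not to intercept the gesture; in my case
    // this did not prevent the framework's ACTION_CANCEL.
    getParent().requestDisallowInterceptTouchEvent(true);
    drawTouchedPoints(event);  // hypothetical drawing helper
    return true;
}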
Thanks!
(English is not my native language, so don't hesitate to ask me to reformulate if something is not clear.)
The KitKat SDK supports a new type of scale gesture called quick scale. Instead of requiring two fingers to pinch zoom, the user can doubletap and swipe to scale a view. You can see this in action in the Chrome and Maps apps.
Both Chrome and Maps differentiate between a doubletap (which zooms into the relevant content area, as before) and a doubletap-swipe (which allows you to scale arbitrarily with one finger).
Under the hood, the ScaleGestureDetector uses a GestureDetector to detect doubletaps and start looking for the corresponding swipe.
My question is how to mimic Chrome and Maps, detecting both doubletaps and this doubletap-swipe gesture but not both at the same time. That is, I'd like to differentiate between a normal doubletap (no swipe) and a doubletap-swipe.
I have both a GestureDetector and a ScaleGestureDetector being fed all touch events on my view. Currently, both GestureListener.onDoubleTap() and ScaleGestureListener.onScaleBegin() fire when I do a doubletap-swipe. onDoubleTap() gets fired first, so there's no way to cancel handling events in the ScaleGestureListener.
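For context, the wiring looks roughly like this (a sketch; the listener classes stand in for my real ones):

// Inside my custom View. MyGestureListener extends
// GestureDetector.SimpleOnGestureListener and MyScaleListener extends
// ScaleGestureDetector.SimpleOnScaleGestureListener.
private final GestureDetector gestureDetector =
        new GestureDetector(getContext(), new MyGestureListener());
private final ScaleGestureDetector scaleDetector =
        new ScaleGestureDetector(getContext(), new MyScaleListener());

@Override
public boolean onTouchEvent(MotionEvent event) {
    // Both detectors see every event; on a doubletap-swipe this is where
    // onDoubleTap() fires first and onScaleBegin() follows.
    boolean handled = scaleDetector.onTouchEvent(event);
    handled = gestureDetector.onTouchEvent(event) || handled;
    return handled || super.onTouchEvent(event);
}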
I see two possible solutions, neither of which is very clean:
Copy the ScaleGestureDetector from the Android source and add a new callback to the ScaleGestureListener interface for something like onDoubleTapConfirmed() (doubletap, no swipe).
Add a small delay to onDoubleTap() so we handle the event X milliseconds after it gets triggered. If onScaleBegin() gets fired before the delay is up, cancel handling the onDoubleTap() event and start handling the scale instead.
What's the best way to handle this?
A bit late to this party, but the quick scale is more like an unfinished double tap in which the second tap turns into a drag, something like:
tap down, tap up, tap down, move
It seems your interpretation of quick scale is:
tap down, tap up, tap down, tap up, tap down, move
That makes your problem a lot easier if you only register a plain double tap on the second action down. When the second action up happens, you just need to figure out whether a drag happened during the second tap cycle. In case the gesture recognizers don't work for you, you can do that manually by checking against getScaledTouchSlop.
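For instance, a rough sketch of that manual check (the class and its wiring are my own invention; you'd feed it the events of the second tap cycle):

import android.content.Context;
import android.view.MotionEvent;
import android.view.ViewConfiguration;

// Tracks whether the second tap of a double tap turned into a drag.
class QuickScaleTracker {
    private final int touchSlop;
    private float downX, downY;
    private boolean dragging;

    QuickScaleTracker(Context context) {
        touchSlop = ViewConfiguration.get(context).getScaledTouchSlop();
    }

    // Call on the second ACTION_DOWN of the double tap.
    void onSecondTapDown(MotionEvent e) {
        downX = e.getX();
        downY = e.getY();
        dragging = false;
    }

    // Call on each ACTION_MOVE while the second tap is held.
    void onMove(MotionEvent e) {
        float dx = e.getX() - downX;
        float dy = e.getY() - downY;
        // Once movement exceeds the slop, treat the gesture as a drag.
        if (dx * dx + dy * dy > (float) touchSlop * touchSlop) {
            dragging = true;
        }
    }

    // Call on the second ACTION_UP: true means a plain double tap.
    boolean wasPlainDoubleTap() {
        return !dragging;
    }
}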
Also, copying the ScaleGestureDetector is probably not as bad as you think; GitHub is littered with "compat" projects that are basically backports of features from newer SDK versions.
One thing I find many Android games and emulators get wrong is handling the user pressing multiple (on-screen) buttons simultaneously. I'm wondering how one could fix that.
Imagine a game like Super Mario World. You have two buttons on the right side (simplified): Y is for running and B is for jumping. Typically, you hold Y most of the time with the tip of your thumb, and when you want to jump you lay down your thumb and press B, too.
Situations like this understandably confuse Android. Instead of detecting two presses, it just moves the one from the Y button down a bit.
What I'd need to fix this is one of the following:
Raw touch data as a bitmap (but probably too computationally expensive, and doesn't leave the touchscreen anyway)
The detected touch points in more detail, e.g. as best-fit ellipses or polygons
The ability to define touch regions. If a finger overlaps such a region by a certain amount, the region fires (see the sketch after this list).
(The points are ordered from low-level to high-level; e.g., if I had the first, I could emulate the other ones.)
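For illustration, here's a rough sketch of how the third option might look, built on the contact-ellipse data Android already exposes (getTouchMajor()); the RectF button regions and press*() handlers are hypothetical, and whether the ellipse data is reliable enough is exactly what I'm unsure about:

// Approximate each contact as a circle derived from the reported touch
// ellipse, then fire every button region the circle overlaps.
void checkButtons(MotionEvent event) {
    for (int i = 0; i < event.getPointerCount(); i++) {
        float x = event.getX(i);
        float y = event.getY(i);
        // Half the major axis of the reported contact ellipse, in pixels.
        float radius = event.getTouchMajor(i) / 2f;
        RectF contact = new RectF(x - radius, y - radius,
                                  x + radius, y + radius);
        if (RectF.intersects(contact, yButton)) pressY();  // hypothetical
        if (RectF.intersects(contact, bButton)) pressB();  // hypothetical
    }
}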
Any ideas?
Is it possible to detect every pixel being touched? More specifically, when the user touches the screen, is it possible to track all the x-y coordinates of the cluster of points touched by the user? How can I tell the difference between when users are drawing with their thumb and when they are drawing with the tip of a finger? I would like to reflect the brush difference depending on how users touch the screen, and would also like to track x-y coordinates of all the pixels being touched over time. Thanks so much in advance for any help.
This would be very tricky primarily because every android phone is going to behave differently. There are some touch screen devices that are very, very sensitive and some that are basically "dull" by comparison.
It also sounds like you want to track pressure (how hard the user is pushing on the screen), which is actually supported on Android devices.
I think some of your answer may be found by monitoring all of the touch events. In practice, most applications ignore a great number of events or perform some kind of "smoothing" of them, since there is literally a deluge of touch events while the user is manipulating the screen. Doing this may negatively impact your application's performance, though.
I would recommend that you look into pressure sensitivity and calculate a circular region around the primary touch point based on pressure, then build your brush around that.
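A minimal sketch of that idea, assuming you draw into an offscreen Bitmap-backed Canvas (the 40px base radius is an arbitrary choice):

@Override
public boolean onTouchEvent(MotionEvent event) {
    // getPressure() is roughly 0..1 but varies by device, so clamp it.
    float pressure = Math.min(event.getPressure(), 1f);
    float radius = 40f * pressure;  // hypothetical base brush radius
    bitmapCanvas.drawCircle(event.getX(), event.getY(), radius, paint);
    invalidate();  // trigger onDraw() to display the updated bitmap
    return true;
}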
Another idea would be to incorporate more of a gesture approach to what you are trying to do. For example, visualize touching the screen with the tips of two fingers together (index and middle) and rolling the middle finger around the index finger, or simply moving the middle finger up and down in relation to the index finger, with both fingers moving together for painting. This could be used to manipulate drawing angle on the fly, to toggle between a set of pre-selected brushes, or to change brush size as you paint.
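A rough sketch of reading that relative position (assuming pointer 0 is the index finger and pointer 1 the middle finger, which you'd verify in practice by tracking pointer IDs):

if (event.getPointerCount() >= 2) {
    float dx = event.getX(1) - event.getX(0);
    float dy = event.getY(1) - event.getY(0);
    // Angle of the middle finger around the index finger, in degrees.
    float angle = (float) Math.toDegrees(Math.atan2(dy, dx));
    brush.setAngle(angle);  // hypothetical brush API; dy alone could
                            // drive brush size instead
}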
Some of the above ideas I would love to see implemented - let me know when you have your app ready.
Good luck!
Rodney
If you have a listener on your image, it will basically tell you that there was a touch within that bounding box.
So, to get what you want, you could create a box around every pixel, or small group of pixels, and listen for a touch on each, though I would never do this.
Wherever you get a touch, it may fire off an event, and then you can react accordingly.
I can't think of any other solution that will give you every pixel a person touched at one time.
You may want to read up on multitouch, though, as there are some suggestions here that may help you:
http://android-developers.blogspot.com/2010/06/making-sense-of-multitouch.html
If you're looking for a way to get your content view as a View after Activity#setContentView(int), then you can set an id on the outer-most element of your layout:
android:id="@+id/entire_view" and reference it in your onCreate() method after setContentView():
View view = findViewById(R.id.entire_view);
view.setOnTouchListener( ... );
I am trying to automatically scroll the browser using monkeyrunner. So far I can scroll with a "drag" event, but how can I scroll with a "flick"? I'd appreciate it if you could give me some hints or instructions.
Using drag:
from com.android.monkeyrunner import MonkeyRunner
device = MonkeyRunner.waitForConnection()  # connect to the device/emulator
for i in range(1, 40):
    device.drag((400, 700), (400, 300), 0.15, 1)
    MonkeyRunner.sleep(.7071)
Edit:
We cannot replicate pressure using monkeyrunner, so we cannot do a true flick. Dragging is the only option we have for now.
MonkeyDevice.java doesn't have any flick method in it, but you can adjust the duration parameter to drag, which appears to be the third argument. A fling is basically a very quick drag, so perhaps by reducing the duration to a very small number (0.01, maybe?) you can get the emulator or device to respond to a fling.
As a workaround, why not just "drag" it many times?
It may take a little bit of work, but you should be able to reproduce the flick effect by performing lots of little drags.
Sorry I can't provide much more than that.