I am trying to automatically scroll the browser using monkeyrunner. So far I can scroll with a "drag" event, but how can I scroll with a "flick"? I would appreciate it if you could give me some hints or instructions.
Using drag:
from com.android.monkeyrunner import MonkeyRunner
device = MonkeyRunner.waitForConnection()
for i in range(1, 40):
    device.drag((400, 700), (400, 300), 0.15, 1)  # start point, end point, duration (s), steps
    MonkeyRunner.sleep(0.7071)
Edit:
We cannot replicate touch pressure using monkeyrunner, so we cannot do a real flick. Dragging is the only way we have for now.
MonkeyDevice.java doesn't have any flick method in it, but you can adjust the duration parameter to drag, which appears to be the third argument. A fling is basically a very quick drag, so perhaps by reducing the duration to a very small number (0.01, maybe?) you can get the emulator or device to respond to a fling.
As a workaround, why not just 'drag' it many times?
It may take a little bit of work, but you should be able to reproduce the flick effect by performing lots of little drags.
Sorry I can't provide much more than that.
Sometimes when I tap on the map, the tap is recognized as onPanStart. I need to do something when the user swipes on the map and something different when they tap, but there is no onSwipe gesture in MapGesture.OnGestureListener. When using onPanStart, sometimes the wrong action is triggered. Is there a better way to handle this?
Differentiating between gestures is somewhat subjective and as a result the parameters are generally tweaked to provide a good UX for the given use case.
Without knowing more about your exact use case, if you want the actions to correspond directly to how the HERE SDK is interpreting the input, then using the callbacks onPanStart and onTapEvent would be the right thing to do. Note that even though a Pan is technically triggered, it could have such a small velocity that the Map doesn't move much. "Pan" is equivalent to a "Swipe" gesture.
If you do want to tweak the UX a bit, an option would be to write your own Android GestureDetector to get the feel you would like (potentially fusing the result with the output of OnGestureListener events as well). Alternatively, you could also check that the Map actually moves a certain amount after onPanStart is called before triggering your event, perhaps using Map#OnTransformListener but this could be tricky to get right.
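If you go the custom GestureDetector route, here is a minimal sketch of telling a tap apart from a swipe using only the standard Android GestureDetector; how you feed it MotionEvents from the map view and what handleTap/handleSwipe do are assumptions on my part, not HERE SDK API:

// Sketch only: distinguishes a tap from a swipe with a plain Android GestureDetector.
// Wiring it to the map view and the handleTap/handleSwipe callbacks are hypothetical glue code.
import android.content.Context;
import android.view.GestureDetector;
import android.view.MotionEvent;

public class TapOrSwipeDetector extends GestureDetector.SimpleOnGestureListener {
    private static final float SWIPE_THRESHOLD_PX = 50f; // tune for your UX

    private final GestureDetector detector;

    public TapOrSwipeDetector(Context context) {
        detector = new GestureDetector(context, this);
    }

    // Call this from the map view's onTouchEvent / touch interceptor.
    public boolean onTouchEvent(MotionEvent event) {
        return detector.onTouchEvent(event);
    }

    @Override
    public boolean onSingleTapUp(MotionEvent e) {
        handleTap(e.getX(), e.getY());
        return true;
    }

    @Override
    public boolean onScroll(MotionEvent down, MotionEvent move, float dx, float dy) {
        // Only treat the movement as a swipe once the finger has travelled far enough.
        if (Math.hypot(move.getX() - down.getX(), move.getY() - down.getY()) > SWIPE_THRESHOLD_PX) {
            handleSwipe();
        }
        return false; // let the map keep panning
    }

    private void handleTap(float x, float y) { /* app-specific */ }
    private void handleSwipe() { /* app-specific */ }
}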
I see that the latest version of Gmail has a slider: I can slide an entry to reveal what's behind it. I have been needing to do the same thing. I have tried to use onTouch to track displacement, etc., but my approach is very jittery and the actual scrolling lags quite a bit. Does anyone know how to accomplish what I'm after with a horizontal ScrollView or similar? Or better yet, how Gmail does theirs?
An important aspect for me is to have some snap action: if the user has scrolled further than X, I want to continue sliding to the left for them, for example, until the front view reaches the left edge.
Or could I use a navigation drawer to accomplish this? I don't think so as yet, but maybe someone has done it. I have been working on this for about a week now, and all my attempts are not quite there.
There are a couple of DevBytes videos that discuss how to implement this from scratch (see "DevBytes: Animating ListView Deletion" and "DevBytes: Animating ListView Deletion: Now on Gingerbread!"). Alternatively, you could look at SwipeListView.
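As a rough illustration of the snap behaviour the question asks for (not how Gmail actually implements it), here is a minimal sketch using a plain OnTouchListener and view translation; the SNAP_THRESHOLD_PX value and the idea of a separate front view sitting over a revealed back view are assumptions:

// Sketch: drag a row's front view horizontally and snap it open/closed.
// Not Gmail's implementation; the threshold and view layering are assumptions.
import android.view.MotionEvent;
import android.view.View;

public class SwipeToRevealListener implements View.OnTouchListener {
    private static final float SNAP_THRESHOLD_PX = 200f; // the "greater than X" from the question
    private float downX;

    @Override
    public boolean onTouch(View frontView, MotionEvent event) {
        switch (event.getAction()) {
            case MotionEvent.ACTION_DOWN:
                downX = event.getRawX();
                return true;
            case MotionEvent.ACTION_MOVE:
                // Follow the finger while it drags to the left.
                float dx = event.getRawX() - downX;
                frontView.setTranslationX(Math.min(0f, dx));
                return true;
            case MotionEvent.ACTION_UP:
            case MotionEvent.ACTION_CANCEL:
                // Snap: fully open if dragged past the threshold, otherwise spring back.
                float target = frontView.getTranslationX() < -SNAP_THRESHOLD_PX
                        ? -frontView.getWidth() : 0f;
                frontView.animate().translationX(target).setDuration(150).start();
                return true;
        }
        return false;
    }
}

You would attach it with frontView.setOnTouchListener(new SwipeToRevealListener()); in practice you also have to reconcile this with the ListView's own vertical scrolling, which is exactly what the DevBytes samples and SwipeListView take care of.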
I am an amateur (desktop) programmer, but I want to get into phones. I have some ideas for apps, but the touchscreen and its inputs confuse me.
I know that touchscreens can accept multiple points of touch. For instance, to zoom in you place two fingers and spread them apart, and to zoom out you pinch them together.
Here is my problem, though: I've never seen functionality in any phone app on any phone (I use Windows phones and Android phones) where the touch input involves multiple points that don't begin at the same time.
For the sake of illustration, here is an example. Suppose you have a mini browser on a phone with a vertical scroll bar and a horizontal one. What I want is to be able to scroll down, and WHILE I am scrolling down, also move the horizontal scrollbar so I can move the page left or right. So a few seconds after I touch the screen and begin moving the vertical scrollbar downwards or upwards, I want to use a different finger to touch the horizontal scrollbar and move it as well, at the same time.
Is this even possible? Are there certain hardware or software limitations preventing something like this?
You are mixing up gestures and touches. Gestures are touch behaviors, such as...
Two fingers placed at the same time that grow apart from each other means zoom-in.
Tap and hold means context popup.
Tap and drag equals scroll.
You can cancel these gestures when your app doesn't follow these conventions. For example, Angry Birds doesn't scroll if you tap and drag on a bird, but it does if you do it elsewhere in the scene.
The default state of gestures is to not detect additional touches while you are performing a gesture. If you scroll and introduce a second finger to click on a button while still holding the scroll finger, nothing will happen. I'm not sure if you can override this behavior (and I don't think it's a good idea either).
Touches, on the other hand, allow a certain number of simultaneous touch points depending on the device. When a touch is not part of a gesture, you can start a second (or nth) touch after the first one starts.
You can try this out yourself at http://raphaeljs.com/touches.html.
Now, going back to your example: it depends on how it's implemented. If you are using the OS gestures (tap and drag anywhere) then no, you can't introduce a second finger to drag horizontally, you'd use the same finger used to scroll vertically (panning with a single finger). But, if you have actual scrollbars (like those in mouse interfaces) then yes, you can implement the kind of interface you are describing.
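To make the touches-versus-gestures distinction concrete, here is a minimal sketch of my own (not from the answer above) that tracks pointers independently even when they go down at different times, which is essentially what the two-scrollbar example needs:

// Sketch: handle multiple pointers that go down at different times.
// Each pointer keeps its own id, so a second finger added mid-drag is
// tracked independently of the first.
import android.util.Log;
import android.view.MotionEvent;
import android.view.View;

public class MultiPointerTracker implements View.OnTouchListener {
    @Override
    public boolean onTouch(View v, MotionEvent event) {
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:          // first finger
            case MotionEvent.ACTION_POINTER_DOWN:  // any additional finger, possibly seconds later
            case MotionEvent.ACTION_MOVE:
                for (int i = 0; i < event.getPointerCount(); i++) {
                    int id = event.getPointerId(i);
                    float x = event.getX(i);
                    float y = event.getY(i);
                    // e.g. pointer 0 could drive the vertical scrollbar,
                    // a later pointer the horizontal one.
                    Log.d("MultiPointer", "pointer " + id + " at " + x + "," + y);
                }
                return true;
        }
        return false;
    }
}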
Yes, it's possible. It depends on the phone, but since most Android devices and all WP7 devices have multi-touch, it shouldn't be a problem.
The Android Gesture API is a real help and removes the need to reimplement a pretty complex wheel.
However, I need the API to be a bit more "responsive".
By way of example, let's say I define a gesture such as "Circle", in which a single-finger gesture event is defined to be, yep you guessed it, a single-finger circle.
If the user performs this gesture continuously, without lifting the finger, I would like onGesturePerformed() to be called repeatedly, i.e. as the user keeps performing the same gesture.
I would like a firing granularity of maybe 1/4 second. I have seen a similar question, but in that case the user wanted a longer delay, not a shorter one where the user never lifts the finger.
Many thanks in advance.
Paul
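The Gesture API has no built-in repeat firing, but one possible workaround, offered here purely as an assumption and not something the platform documents, is to poll the in-progress gesture on a timer and run recognition on it yourself:

// Sketch: re-recognize the in-progress gesture every 250 ms while the finger is down.
// Assumes a GestureLibrary loaded elsewhere and a GestureOverlayView the user draws on.
import android.gesture.Gesture;
import android.gesture.GestureLibrary;
import android.gesture.GestureOverlayView;
import android.gesture.Prediction;
import android.os.Handler;
import android.os.Looper;

public class RepeatingRecognizer {
    private static final long INTERVAL_MS = 250;   // the 1/4 second granularity
    private static final double MIN_SCORE = 2.0;   // tune to taste

    private final Handler handler = new Handler(Looper.getMainLooper());
    private final GestureOverlayView overlay;
    private final GestureLibrary library;

    public RepeatingRecognizer(GestureOverlayView overlay, GestureLibrary library) {
        this.overlay = overlay;
        this.library = library;
    }

    public void start() {
        handler.postDelayed(this::recognizeCurrentGesture, INTERVAL_MS);
    }

    public void stop() {
        handler.removeCallbacksAndMessages(null);
    }

    private void recognizeCurrentGesture() {
        Gesture current = overlay.getGesture();   // the strokes drawn so far
        if (current != null && current.getStrokesCount() > 0) {
            for (Prediction p : library.recognize(current)) {
                if (p.score > MIN_SCORE) {
                    // fire your own "gesture performed" callback here
                    break;
                }
            }
        }
        handler.postDelayed(this::recognizeCurrentGesture, INTERVAL_MS);
    }
}

start() and stop() could be driven from GestureOverlayView.OnGesturingListener's onGesturingStarted() and onGesturingEnded() callbacks; whether recognizing a partially drawn circle every 250 ms yields useful prediction scores is something you would need to verify.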
This is an open question about Android ListViews, Gestures and Animations.
I'm really not familiar with the gestures in Android, so I'm just looking for ideas and grey matter on this.
Here are two screenshots and a video example of the effect I'm trying to think through. Consider taking a look at the video; it's really worth it.
The screenshots are from an iOS open source project found here.
The question is, how would you implement a "listview opening" gesture like the one I see more and more often in iPhone/iPad apps, but for Android?
Edit 1, idea 1:
Okay, first idea: AFAIK the pinch gesture is somewhat like a dragging gesture, so I guess we can get the X and Y coordinates of the two fingers on the screen?
Next, the answer to this question may help; the basic idea is:
Get the index position of the first visible item in the list
Get the index position of the last visible item in the list
Iterate from the first index to the last with the getChildAt function
For each child, call the getLocationOnScreen method to get coordinates of the current iterated item
After that, some comparison between the pinch gesture coordinates and each item's coordinates might be done inside the loop to get the two items between which the new row must be inserted.
Performance considerations apart, I think it could work, but maybe there's a simpler way to get those two items(?).
Who's next? :)
Update:
Thanks for the tip #rhlnair. I take this occasion to tell everybody that I started to work on this in my spare time and you are more than welcome to help.
The project is at https://github.com/arnaudbos/Android-GestureListView. I started two different implementations on two different branches, and would enjoy anybody to create a new branch.
I have something really encouraging in branch "attemp-via-scale-gesture-detector", but there are some side effects from the ListView.
Come on folks!
Seems to be a challenging idea.
I think some of the effects in the Clear app, like dragging a selected row up/down, can be taken from
https://github.com/commonsguy/cwac-touchlist
When doing a pinch gesture you have two fingers on the screen, and therefore a point in the middle of those two touch points; finding that middle point is simple Euclidean arithmetic.
Then, as you say, find the element in the list that this point is over. You mention performance, and I do not think it will be a problem: you are iterating over a loop a few times and asking for coordinates. I have done much worse in touch events.
If the midpoint of the pinch is above the middle of the list item, you create the new item above it, and vice versa below.
See the section at the bottom where they use a scale listener:
http://android-developers.blogspot.com/2010/06/making-sense-of-multitouch.html
Using the scale listener you can use the scale factor to decide when to create a new element. If the scale factor goes above a threshold (the fingers are spreading apart), you create a new element and let the view repopulate from the list adapter.
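Putting the two ideas together, here is a minimal sketch of my own, assuming the ListView is backed by an ArrayAdapter<String>; the threshold, the placeholder row text, and the touch wiring are illustrative choices rather than a finished implementation:

// Sketch: insert a row where a "pinch open" happens over a ListView.
// Assumes an ArrayAdapter<String>-backed list; threshold and row text are placeholders.
import android.content.Context;
import android.view.ScaleGestureDetector;
import android.view.View;
import android.widget.ArrayAdapter;
import android.widget.ListView;

public class PinchToInsert extends ScaleGestureDetector.SimpleOnScaleGestureListener {
    private static final float OPEN_THRESHOLD = 1.5f; // fingers have spread to 1.5x their initial span

    private final ScaleGestureDetector detector;
    private final ListView listView;
    private final ArrayAdapter<String> adapter;
    private boolean inserted;

    public PinchToInsert(Context context, ListView listView, ArrayAdapter<String> adapter) {
        this.detector = new ScaleGestureDetector(context, this);
        this.listView = listView;
        this.adapter = adapter;
        // Feed every touch event on the list to the scale detector.
        listView.setOnTouchListener((v, event) -> detector.onTouchEvent(event));
    }

    @Override
    public boolean onScaleBegin(ScaleGestureDetector d) {
        inserted = false;
        return true;
    }

    @Override
    public boolean onScale(ScaleGestureDetector d) {
        if (!inserted && d.getScaleFactor() > OPEN_THRESHOLD) {
            inserted = true;
            adapter.insert("New item", insertionIndex(d.getFocusY()));
        }
        return false; // keep accumulating the scale factor from the start of the gesture
    }

    // Walk the visible children and find where the pinch midpoint falls.
    private int insertionIndex(float focusY) {
        for (int i = 0; i < listView.getChildCount(); i++) {
            View child = listView.getChildAt(i);
            if (focusY < child.getTop() + child.getHeight() / 2f) {
                return listView.getFirstVisiblePosition() + i; // insert above this row
            }
        }
        return listView.getFirstVisiblePosition() + listView.getChildCount();
    }
}

Note that returning the detector's result from the ListView's touch listener consumes every touch event, which breaks normal list scrolling; working around that (for instance, only intercepting while two pointers are down) is probably one of the ListView side effects mentioned in the question's update.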