I want to move an image around the screen according to the trackball movement. I am able to capture the movements using the onTrackballEvent method, but it is called with very small float deltas (I believe one per small rotation of the trackball).
Since a view's X and Y positions have to be given as integers in LayoutParams, it makes no sense to reposition the view on every one of these tiny rotations. Instead, I want to move the view only after the user stops rotating the trackball.
Is there any method by which we can get whether the user stopped using the trackball or not?
If there isn't (I don't know Android), the classic approach is to start a timer in each event, and keep track of the deltas reported in it (by adding them to a running sum, one for X and one for Y). On any subsequent event, before the timer has triggered, just re-start the same timer, pushing its triggering further into the future.
When the user stops generating events, the timer will eventually trigger, and then you can apply the sum of all the movements, and clear the sums.
A suitable timeout must be chosen so that the user interaction is not perceived as being too sluggish. Perhaps something on the order of 200-300 milliseconds, or more. This is easy to tweak until it feels "right".
I've also gotten a lot of value out of setting a threshold: once the total movement in a given direction adds up to a certain amount, trigger a navigation event.
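A minimal sketch of that timer approach, assuming the image sits in a FrameLayout and imageView is the view being moved; the field names, the scaling factor, and the 250 ms delay are illustrative, not anything Android prescribes:

// Sketch of the debounce idea described above.
private static final long SETTLE_DELAY_MS = 250;   // tune until it feels right
private static final float SCALE = 100f;           // trackball deltas are tiny floats

private final Handler handler = new Handler();
private float sumX, sumY;

private final Runnable applyMove = new Runnable() {
    @Override
    public void run() {
        // The user has stopped rotating: apply the accumulated movement once.
        FrameLayout.LayoutParams lp =
                (FrameLayout.LayoutParams) imageView.getLayoutParams();
        lp.leftMargin += Math.round(sumX * SCALE);
        lp.topMargin  += Math.round(sumY * SCALE);
        imageView.setLayoutParams(lp);
        sumX = 0;
        sumY = 0;
    }
};

@Override
public boolean onTrackballEvent(MotionEvent event) {
    // Add each tiny delta to the running sums.
    sumX += event.getX();
    sumY += event.getY();
    // Restart the timer; it only fires once no event arrives for SETTLE_DELAY_MS.
    handler.removeCallbacks(applyMove);
    handler.postDelayed(applyMove, SETTLE_DELAY_MS);
    return true;
}

You could also check the running sums against a threshold inside onTrackballEvent and fire immediately once they exceed it, as suggested above.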
When a user starts a touch in a particular view and then moves the finger outside the view (while still in contact with the surface), the MotionEvent returned clamps the x/y value into coordinates inside the box.
That is, I won't get negative Y values if I move the finger above the view, for example.
Is there a way to get this raw value? getRawY seems to clamp the value as well.
I can't do anything with activities, since this is for an IME project, and the IME basically accepts a View from you and takes care of rendering etc.
Set the FLAG_WATCH_OUTSIDE_TOUCH on the window associated with your view. That should make it work.
I do have another way to do it, but you really don't want to use it (it involves the ugly hack of making a partially transparent full-screen keyboard), so let's try this first.
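Setting the flag would look roughly like this from inside your InputMethodService subclass; this is a sketch, and you should verify on your target version what the delivered ACTION_OUTSIDE event actually carries:

// getWindow() on an InputMethodService returns the IME's soft-input Dialog;
// its Window is what carries the flag.
@Override
public void onCreate() {
    super.onCreate();
    getWindow().getWindow().addFlags(
            WindowManager.LayoutParams.FLAG_WATCH_OUTSIDE_TOUCH);
}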
Here's the horrible hack, in case you don't want to use that method:
1) Add a transparent view above your keyboard, taking up the rest of the screen.
2) In onComputeInsets, define your touchable insets to be a REGION, and pass it a region made up of only your keyboard.
3) In onTouchEvent, on an ACTION_DOWN, set a flag that says we're in touch mode, and make sure to invalidate your view (this will cause onComputeInsets to be called again).
4) In onComputeInsets, look at the value of that touch-mode flag. If false, use the insets set in step 2. If true, set the entire screen as touchable.
5) In onTouchEvent, on an ACTION_UP or ACTION_CANCEL, clear that flag and call invalidate to reset the insets.
What that will do is tell the framework that touches on that transparent view do not belong to the keyboard (so they'll be sent to the app) unless you're in touch mode, in which case we tell the framework we want all touches anywhere. It's hacky, but I've done this kind of code before and it does work. It just requires you to play with onComputeInsets, which is almost undocumented (they added minimal documentation a few versions ago) and has probably been used by about 20 people in the world. It's an ugly API.
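A rough sketch of steps 2-5, assuming keyboardView is your real keyboard view and inTouchMode is the flag toggled from onTouchEvent; the names are illustrative, and the region coordinates are assumed to already be in window coordinates:

private boolean inTouchMode;

@Override
public void onComputeInsets(InputMethodService.Insets outInsets) {
    super.onComputeInsets(outInsets);
    outInsets.touchableInsets = InputMethodService.Insets.TOUCHABLE_INSETS_REGION;
    if (inTouchMode) {
        // While a touch is in progress, claim the whole window as touchable.
        View decor = getWindow().getWindow().getDecorView();
        outInsets.touchableRegion.set(0, 0, decor.getWidth(), decor.getHeight());
    } else {
        // Otherwise only the keyboard itself is touchable; the transparent
        // view above it falls through to the app.
        outInsets.touchableRegion.set(
                keyboardView.getLeft(), keyboardView.getTop(),
                keyboardView.getRight(), keyboardView.getBottom());
    }
}

// In the view's onTouchEvent: set inTouchMode on ACTION_DOWN, clear it on
// ACTION_UP / ACTION_CANCEL, and call invalidate() so the insets are recomputed.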
How do I get the current touch coordinates?
In my case, I'm setting a timer and want to see if there was any movement during the elapsed time. When receiving a MotionEvent, it's easy enough to cache the x & y coordinates from the event. As long as more touch events come, it's easy to cache the latest touch coordinates. Then, finally, compare the most recent to the originals.
But this technique fails because, sometimes, some other view consumes future touch events and my view no longer gets them (it gets a "cancel"). Thus, the most recent cached value is "stale".
The obvious solution is to somehow query the system for the current touch coordinates (or none if the user has let go). So far, I haven't found any such method.
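For reference, this is roughly the caching setup described above (the field names are just illustrative); the stale-value problem appears as soon as ACTION_CANCEL arrives and the updates stop:

private float startX, startY, lastX, lastY;
private boolean tracking;

@Override
public boolean onTouchEvent(MotionEvent event) {
    switch (event.getAction()) {
        case MotionEvent.ACTION_DOWN:
            startX = lastX = event.getX();
            startY = lastY = event.getY();
            tracking = true;
            break;
        case MotionEvent.ACTION_MOVE:
            lastX = event.getX();
            lastY = event.getY();
            break;
        case MotionEvent.ACTION_UP:
        case MotionEvent.ACTION_CANCEL:
            tracking = false;   // no further updates arrive after this
            break;
    }
    return true;
}

// When the timer fires, compare (lastX, lastY) with (startX, startY); the
// comparison is only trustworthy while 'tracking' is still true.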
The Android Gesture API is a real help and removes the need to reimplement a pretty complex wheel.
I however need the API to be more "responsive".
By way of example, let's say I define a Gesture, such as "Circle", in which a single-finger gesture event is defined to be, yep you guessed it, a single-finger circle.
If the user performs this gesture continuously, without lifting the finger, I would like onGesturePerformed() to be called repeatedly, i.e. as the user keeps performing the same gesture.
I would like a firing granularity of maybe 1/4 second. I have seen a similar question but in that case the user wanted a longer delay and not a shorter delay where the user does not lift the finger.
Many thanks in advance.
Paul
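For context, this is roughly how onGesturePerformed() is normally wired up with the stock gesture classes; it fires once per completed stroke (after the finger lifts), which is exactly the limitation being described. R.raw.gestures is an assumed gesture library built with the Gestures Builder, and R.id.gestures an assumed GestureOverlayView in the layout:

// Inside an Activity's onCreate(), after setContentView().
final GestureLibrary library = GestureLibraries.fromRawResource(this, R.raw.gestures);
if (!library.load()) {
    finish();   // library could not be loaded
}

GestureOverlayView overlay = (GestureOverlayView) findViewById(R.id.gestures);
overlay.addOnGesturePerformedListener(new GestureOverlayView.OnGesturePerformedListener() {
    @Override
    public void onGesturePerformed(GestureOverlayView overlay, Gesture gesture) {
        ArrayList<Prediction> predictions = library.recognize(gesture);
        if (!predictions.isEmpty() && predictions.get(0).score > 1.0) {
            // React to the recognized gesture, e.g. "Circle".
        }
    }
});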
Is it possible to detect every pixel being touched? More specifically, when the user touches the screen, is it possible to track all the x-y coordinates of the cluster of points touched by the user? How can I tell the difference between when users are drawing with their thumb and when they are drawing with the tip of a finger? I would like to reflect the brush difference depending on how users touch the screen, and would also like to track x-y coordinates of all the pixels being touched over time. Thanks so much in advance for any help.
This would be very tricky primarily because every android phone is going to behave differently. There are some touch screen devices that are very, very sensitive and some that are basically "dull" by comparison.
It also sounds like what you really want to track is pressure (how hard the user is pushing on the screen), which is actually supported on Android devices.
I think some of your answer may be found by monitoring all of the touch events - in practice, most applications ignore a great number of events or perform some kind of "smoothing" of the events, since there is literally a deluge of touch events when the user is manipulating the screen. Doing this may negatively impact your application's performance, though.
I would recommend that you look into pressure sensitivity and calculate a circular region around the primary touch point based on pressure, then build your brush around that.
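Something along these lines, assuming the code lives in a custom drawing View; MAX_RADIUS, the blending of pressure and size, and drawBrush() are all placeholders for your own choices:

private static final float MAX_RADIUS = 48f;   // pixels, illustrative

@Override
public boolean onTouchEvent(MotionEvent event) {
    float pressure = event.getPressure();  // roughly 0..1, very device dependent
    float size = event.getSize();          // normalized touch area, 0..1
    float radius = MAX_RADIUS * Math.max(pressure, size);

    int action = event.getAction();
    if (action == MotionEvent.ACTION_DOWN || action == MotionEvent.ACTION_MOVE) {
        // Paint a circular stamp of the computed radius at the touch point.
        drawBrush(event.getX(), event.getY(), radius);   // your own painting method
    }
    invalidate();
    return true;
}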
Another idea would be to incorporate more of a gesture approach to what you are trying to do - for example, visualize touching the screen with the tip of two fingers together (index and middle) and rolling the middle finger around the index finger or simply moving the middle finger up and down in relation to the index finger. Both fingers would be moved together for painting. This could be used to manipulate drawing angle on the fly or perhaps even toggle between a set of pre-selected brushes or could change brush size on the fly as you are painting.
Some of the above ideas I would love to see implemented - let me know when you have your app ready.
Good luck!
Rodney
If you have a listener on your image, it will basically tell you that there was a touch somewhere within that view's bounding box.
So, to get what you want, you could (though I would never do this) create a box around every pixel, or small group of pixels, and listen for a touch on each one.
Wherever you get a touch, it may fire off an event, then you can react accordingly.
I can't think of any other solution that will give you each pixel that a person touched, at one time.
You may want to read up on multitouch though, as there are some suggestions in here that may help you:
http://android-developers.blogspot.com/2010/06/making-sense-of-multitouch.html
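As a minimal sketch of what that article covers: each MotionEvent can carry several pointers, so iterating over them gives you every active touch point at that moment:

@Override
public boolean onTouchEvent(MotionEvent event) {
    for (int i = 0; i < event.getPointerCount(); i++) {
        float x = event.getX(i);
        float y = event.getY(i);
        int id = event.getPointerId(i);   // stable id for this finger across events
        // record (x, y) for pointer 'id' here
    }
    return true;
}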
If you're looking for a way to get your content view as a View after Activity#setContentView(int), then you can set an id on the outer-most element of your layout:
android:id="@+id/entire_view" and reference it in your onCreate() method after setContentView:
View view = findViewById(R.id.entire_view);
view.setOnTouchListener( ... );
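Put together, it looks roughly like this; R.layout.main and entire_view are whatever names your project actually uses:

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.main);

    View view = findViewById(R.id.entire_view);
    view.setOnTouchListener(new View.OnTouchListener() {
        @Override
        public boolean onTouch(View v, MotionEvent event) {
            // event.getX() / event.getY() are relative to this view
            return true;
        }
    });
}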
I am trying to develop a game with Android where I need to move a tile in all directions. My question is how can I get the motion of the finger on the screen(right,left,top) using getX() and getY()?
Thanks.
Take a look at getHistorySize(), getHistoricalX(int) and getHistoricalY(int).
While the finger is touching the screen, Android records the intermediate move positions and batches them onto the event. You can call getHistorySize to get the number of motions recorded, and then call getHistoricalX and getHistoricalY with a parameter less than the history size to get the x/y at that history position.
So, for example, you can call getHistoricalY with a parameter that indicates a previous motion sample, and then compare it to the current value. If the current one is bigger, the finger is moving down the screen.
Note: this historical recording applies only to ACTION_MOVE events.
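A sketch of that comparison; remember that Y grows downward on screen, so a larger current value means the finger moved down:

@Override
public boolean onTouchEvent(MotionEvent event) {
    if (event.getAction() == MotionEvent.ACTION_MOVE && event.getHistorySize() > 0) {
        // Compare the oldest batched sample with the current position.
        float dx = event.getX() - event.getHistoricalX(0);
        float dy = event.getY() - event.getHistoricalY(0);

        if (Math.abs(dx) > Math.abs(dy)) {
            // moving left (dx < 0) or right (dx > 0)
        } else {
            // moving up (dy < 0) or down (dy > 0)
        }
    }
    return true;
}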