I'm trying to combine image zoom/panning with a Gallery view. The problem is controlling which touch events are processed by each. During a horizontal drag on the image, the ImageView needs to process the drag in order to pan, up until the edge of the image is reached; then the Gallery needs to process the touch events so that it can swipe over to the next photo.
It seems that if I return false from the ImageView's onTouchEvent, then I won't be notified of the subsequent move events for the panning drag.
What I tried was returning true from the ImageView's onTouchEvent until the edge of the image was hit, and then returning false. However, I believe this doesn't cause the Gallery to start processing those events, as it missed the initial touch-down event.
My next thought is to somehow turn the containing activity into the touch handler, stop the Gallery and ImageView from receiving touch events normally, and have the activity forward the touch events manually. I'm not sure if this is possible or if there is an example of it.
My last resort will be to simply not use android.widget.Gallery, which I want to avoid because I want the widget to "feel" the same as other places on the phone, and I like code reuse. Unfortunately I might have to do this, as none of the other apps I've seen with this zoom/pan capability seem to use the Gallery widget (Gallery3D etc.).
My question is: what's the best way to design for this situation, where motion events need to be split between two views? Any ideas would be greatly appreciated, thank you.
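For reference, here is roughly what my current attempt looks like, with requestDisallowInterceptTouchEvent added as something I've been experimenting with. This is a simplified sketch: the class name is arbitrary, canPanBy() is only a rough edge check, and once I return false I'm still not sure the Gallery actually picks up the rest of the gesture.

```java
import android.content.Context;
import android.graphics.Matrix;
import android.graphics.RectF;
import android.util.AttributeSet;
import android.view.MotionEvent;
import android.widget.ImageView;

public class PanZoomImageView extends ImageView {

    private float lastX;

    public PanZoomImageView(Context context, AttributeSet attrs) {
        super(context, attrs);
        setScaleType(ScaleType.MATRIX);
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        switch (event.getAction()) {
            case MotionEvent.ACTION_DOWN:
                lastX = event.getX();
                // Claim the gesture so the Gallery does not take it away yet.
                getParent().requestDisallowInterceptTouchEvent(true);
                return true;

            case MotionEvent.ACTION_MOVE:
                float dx = event.getX() - lastX;
                lastX = event.getX();
                if (canPanBy(dx)) {
                    // Still room to pan: translate the image and consume the event.
                    Matrix m = new Matrix(getImageMatrix());
                    m.postTranslate(dx, 0);
                    setImageMatrix(m);
                    invalidate();
                    return true;
                }
                // Edge reached: stop blocking the parent and give up the event.
                getParent().requestDisallowInterceptTouchEvent(false);
                return false;

            default:
                return super.onTouchEvent(event);
        }
    }

    // Rough edge check: can the drawable move by dx and still cover the view horizontally?
    private boolean canPanBy(float dx) {
        if (getDrawable() == null) return false;
        RectF bounds = new RectF(0, 0,
                getDrawable().getIntrinsicWidth(), getDrawable().getIntrinsicHeight());
        getImageMatrix().mapRect(bounds);
        bounds.offset(dx, 0);
        return bounds.left <= 0 && bounds.right >= getWidth();
    }
}
```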
Please follow the link here:
http://jmsliu.com/138/android-infinite-loop-gallery.html
I want to develop a game named "Balda".
I have a 2D grid of ImageViews (or buttons, maybe).
The user should be able to move their finger over the grid, and the app should know which images in the grid were touched during this move. So, the picture below shows what I'm trying to achieve: A is the start point where the user pressed on the screen, B is the end point where the finger leaves the screen, and I need to know which images were touched (they are blue in the picture).
I know that I can do something like this, but I think that is the wrong solution, because it contradicts the principle of assigning functionality by responsibility.
I think that it is the responsibility of the ImageView to know when the finger enters its borders and when it leaves them.
I thought that this would be in the Android API, and it does have MotionEvent actions like ACTION_HOVER_ENTER and ACTION_HOVER_EXIT, but they don't work with a finger: after the finger is pressed on some View, that View will receive all the other MotionEvents, if I understand it correctly.
I think this is wrong. What can I do to get this functionality? Maybe I could create some custom listeners and custom Views that support them?
I think your requirement is quite similar to a custom grid view.
You can try the steps below; a rough sketch follows the list.
1) Create a custom view.
2) Attach a touch listener to it.
3) Divide this view into a 4x3 matrix.
4) Map your images to this 4x3 matrix.
5) Write a function which gives the cell number corresponding to the touched coordinates.
6) After getting the cell number, get the image mapped to that cell number.
7) Put this image in an ArrayList.
8) When the user lifts their finger, you will have an ArrayList of the touched images (do whatever you want with it).
9) Remember to put this custom view in your activity.
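Here is the rough sketch of steps 1)–8); the class name, the hard-coded 4x3 grid and the onGestureFinished() hook are just placeholders to fill in:

```java
import android.content.Context;
import android.view.MotionEvent;
import android.view.View;
import java.util.ArrayList;
import java.util.List;

// Custom view that maps touch coordinates to cells of a 4x3 grid.
public class GridTouchView extends View {

    private static final int COLS = 4;
    private static final int ROWS = 3;

    // Cell indices (row * COLS + col) touched during the current gesture.
    private final List<Integer> touchedCells = new ArrayList<Integer>();

    public GridTouchView(Context context) {
        super(context);
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        switch (event.getAction()) {
            case MotionEvent.ACTION_DOWN:
                touchedCells.clear();
                addCell(event.getX(), event.getY());
                return true;
            case MotionEvent.ACTION_MOVE:
                addCell(event.getX(), event.getY());
                return true;
            case MotionEvent.ACTION_UP:
                // touchedCells now holds every cell the finger passed through, in order.
                onGestureFinished(touchedCells);
                return true;
        }
        return super.onTouchEvent(event);
    }

    private void addCell(float x, float y) {
        int col = (int) (x / (getWidth() / (float) COLS));
        int row = (int) (y / (getHeight() / (float) ROWS));
        if (col < 0 || col >= COLS || row < 0 || row >= ROWS) return;
        int cell = row * COLS + col;
        // Avoid adding the same cell twice in a row while the finger stays inside it.
        if (touchedCells.isEmpty() || touchedCells.get(touchedCells.size() - 1) != cell) {
            touchedCells.add(cell);
        }
    }

    // Placeholder: look up the images mapped to these cells and do whatever you want with them.
    protected void onGestureFinished(List<Integer> cells) {
    }
}
```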
Tell me if you have any doubts or concerns.
I want to create some kind of tower defense map using a background image which is four times as large as the screen of the device (twice as wide, twice as tall).
My question is: how can I best do this? I'm new to Android, have just picked up some Java basics, and want to try out some things.
I want the user to scroll over the entire map with their finger and to zoom in via a two-finger pinch, and of course the objects (towers, sprites) should stay where they are.
I've been searching for hours now and have only found answers like "use ScrollView". I just want some food for thought to point me in the right direction, maybe with some examples.
You can use an ImageView and set an appropriate OnTouchListener on it, where you detect the pinch-to-zoom gesture using a ScaleGestureDetector and change the image coordinates when the user drags a finger.
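A minimal sketch of that idea; the listener class name is a placeholder, and the ImageView has to use the MATRIX scale type for setImageMatrix() to have any effect:

```java
import android.content.Context;
import android.graphics.Matrix;
import android.view.MotionEvent;
import android.view.ScaleGestureDetector;
import android.view.View;
import android.widget.ImageView;

// Pinch to zoom, single-finger drag to pan, applied through the ImageView's matrix.
public class PanZoomListener implements View.OnTouchListener {

    private final Matrix matrix = new Matrix();
    private final ScaleGestureDetector scaleDetector;
    private float lastX, lastY;

    public PanZoomListener(Context context) {
        scaleDetector = new ScaleGestureDetector(context,
                new ScaleGestureDetector.SimpleOnScaleGestureListener() {
                    @Override
                    public boolean onScale(ScaleGestureDetector detector) {
                        // Zoom around the focal point of the pinch.
                        matrix.postScale(detector.getScaleFactor(), detector.getScaleFactor(),
                                detector.getFocusX(), detector.getFocusY());
                        return true;
                    }
                });
    }

    @Override
    public boolean onTouch(View v, MotionEvent event) {
        scaleDetector.onTouchEvent(event);

        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:
                lastX = event.getX();
                lastY = event.getY();
                break;
            case MotionEvent.ACTION_MOVE:
                // Only pan with a single finger; two fingers are handled by the scale detector.
                if (event.getPointerCount() == 1 && !scaleDetector.isInProgress()) {
                    matrix.postTranslate(event.getX() - lastX, event.getY() - lastY);
                }
                lastX = event.getX();
                lastY = event.getY();
                break;
        }

        ((ImageView) v).setImageMatrix(matrix);
        return true; // consume the whole gesture
    }
}
```

Attach it with something like imageView.setScaleType(ImageView.ScaleType.MATRIX); imageView.setOnTouchListener(new PanZoomListener(this));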
I want to create an interactive graphic for my app. It will essentially be a simple picture of a bus line where users can select two stops at a time (one for departure times and another for arrivals). I'm not sure how to create this image, though, or how to give it 20 or so different clickable points. Is there a framework I could use for this? Or is there a way to do this in pure Android?
Thanks for the help.
I would suggest writing an onClick listener and using a collection of Rect instances to manage the collision/position of the 'click'. Check out the onClick page and the Rect page.
One thing to keep in mind is the origin point of your clicks; I'd assume you'll want to use one corner of your image as the point (0, 0) and reference everything (clicks and Rects) from there.
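Something along these lines, as a rough sketch; the Rect values and the onStopClicked() hook are placeholders, and the rectangles are assumed to be in the same coordinate space as the touch events:

```java
import android.graphics.Rect;
import android.view.MotionEvent;
import android.view.View;
import java.util.ArrayList;
import java.util.List;

// Hit-test helper: one Rect per bus stop, measured from the image's top-left corner (0, 0).
public class StopHitTester implements View.OnTouchListener {

    private final List<Rect> stopRects = new ArrayList<Rect>();

    public StopHitTester() {
        stopRects.add(new Rect(10, 40, 60, 90));    // stop 0
        stopRects.add(new Rect(120, 40, 170, 90));  // stop 1
        // ... one Rect per stop, ~20 in total
    }

    @Override
    public boolean onTouch(View v, MotionEvent event) {
        if (event.getAction() == MotionEvent.ACTION_UP) {
            int x = (int) event.getX();
            int y = (int) event.getY();
            for (int i = 0; i < stopRects.size(); i++) {
                if (stopRects.get(i).contains(x, y)) {
                    onStopClicked(i); // placeholder: react to the selected stop
                    break;
                }
            }
        }
        return true; // keep receiving the rest of the gesture
    }

    protected void onStopClicked(int stopIndex) {
    }
}
```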
I would say try to create an ImageView to load your image and set a touch event listener or a click listener on that view. Hard-code all the places where you want your image to react to a click.
Checking for a click with a touch event listener requires you to handle both ACTION_DOWN and ACTION_UP in the MotionEvent object passed in. It makes it easier to grab the coordinate of where the user touched the page, so you only need one listener, but you have to do more work converting the coordinates passed in by the MotionEvent into coordinates on the image. This is a particularly big issue when your image can be sized larger than the screen.
Using a click listener would save you from this trouble. As @smitec said, you need to overlay rectangles on your image as "buttons", so you can react to user input based on which button they pressed. This way you need to bind listeners to all of them (I suppose) and hard-code their positions on your image, but, as mentioned earlier, it saves you from dealing with coordinates later on.
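If you do go the touch-listener route, that coordinate conversion can be done by inverting the ImageView's image matrix. A rough sketch, assuming the view has no extra padding or scrolling of its own:

```java
import android.graphics.Matrix;
import android.widget.ImageView;

// Converts a touch point in view coordinates to coordinates on the drawable itself.
public final class ImageCoords {

    private ImageCoords() {
    }

    public static float[] viewToImage(ImageView view, float viewX, float viewY) {
        // The image matrix maps drawable coordinates to view coordinates; invert it to go back.
        Matrix inverse = new Matrix();
        view.getImageMatrix().invert(inverse);

        float[] point = new float[] { viewX, viewY };
        inverse.mapPoints(point);
        return point; // point[0], point[1] are x, y in the drawable's own pixel space
    }
}
```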
I am working on an application that can read PDF documents, and I am trying to implement zooming and panning on a PDF page.
The page is loaded as a bitmap and displayed in an ImageView. Some other functionality is already implemented, such as navigating the pages of a document with a custom bottom navigation bar that can be scrolled.
Also, the bottom bar appears when the user taps on the page and disappears on the next tap, and when flinging, users should be able to navigate to the next/previous page in the document.
All functionality such as the scroll, tap and fling is handled by implementing OnGestureListener in the reader activity, and I am trying to do the zoom and pan with an OnTouchListener implementation that is set on the ImageView containing the page.
The code seems to work; however, it appears that some of the other events, especially the scroll, are interfering with it, which makes it slow.
My question is whether there is a better way to reconcile everything, since the OnGestureListener is used for the GestureDetector's handling of onFling, onSingleTapUp and onScroll, but I can't find a good way to add the zoom/pan code in one of the methods it supports.
So, if anyone has experience handling all these events for one view, or some good suggestions/tutorials on this, I would very much appreciate it.
I'm not sure what you mean by 'some of the other events, such as the scroll, make it slow'; could you elaborate?
You should be able to perform zoom and pan using onTouch fairly trivially, without scrolling being an issue, if you choose to use a Canvas. Full implementation code can be found at Touch and drag image in android for scrolling, and at http://www.zdnet.com/blog/burnette/how-to-use-multi-touch-in-android-2-part-6-implementing-the-pinch-zoom-gesture/1847 for zooming.
It's generally not common to see advanced interactive features applied solely to an ImageView, but that doesn't mean people haven't done it (see How can I get zoom functionality for images? for zoom examples, including support for multi-touch, or Adding Fling Gesture to an image view - Android).
Regardless of which way you do it, you'll probably want a GestureDetector in your onTouch function. You should be able to just cut and splice together sections of the tutorials I've linked to in order to get full gesture support plus zoom/pan. However, if I were doing it, I would probably just use onTouch with a Canvas (since Canvas will give better performance) and the core MotionEvents such as DOWN, MOVE and UP for greater control, or, if I were feeling lazy, use a GestureDetector for the trickier bits like flinging.
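In practice that means feeding every event to the detectors from a single onTouch. A minimal sketch; the class name is a placeholder, and I've used a ScaleGestureDetector for the zoom, which is my own choice rather than something from your code:

```java
import android.content.Context;
import android.view.GestureDetector;
import android.view.MotionEvent;
import android.view.ScaleGestureDetector;
import android.view.View;

// Routes a single touch stream to both a GestureDetector (fling/tap) and a ScaleGestureDetector (zoom).
public class PageTouchListener implements View.OnTouchListener {

    private final GestureDetector gestureDetector;
    private final ScaleGestureDetector scaleDetector;

    public PageTouchListener(Context context) {
        gestureDetector = new GestureDetector(context, new GestureDetector.SimpleOnGestureListener() {
            @Override
            public boolean onFling(MotionEvent e1, MotionEvent e2, float vx, float vy) {
                // go to the next/previous page here
                return true;
            }

            @Override
            public boolean onSingleTapUp(MotionEvent e) {
                // toggle the bottom navigation bar here
                return true;
            }
        });
        scaleDetector = new ScaleGestureDetector(context,
                new ScaleGestureDetector.SimpleOnScaleGestureListener() {
                    @Override
                    public boolean onScale(ScaleGestureDetector detector) {
                        // apply detector.getScaleFactor() to the page ImageView's matrix here
                        return true;
                    }
                });
    }

    @Override
    public boolean onTouch(View v, MotionEvent event) {
        // Give both detectors a chance at every event; panning can be driven
        // from ACTION_MOVE here when the scale detector is not in progress.
        scaleDetector.onTouchEvent(event);
        gestureDetector.onTouchEvent(event);
        return true;
    }
}
```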
I'm a bit stumped here. I'm trying to make it so that an image (already in the drawable folder) gets created every time you touch the screen and removed when the finger is lifted. I have the touch part coded already, overriding the onTouch method with a couple of switch cases that handle each type of action (down, up, move, etc.). But I can't for the life of me figure out how to make the image appear and then disappear at the coordinates of where the finger is.
Any help would be greatly appreciated!
You could use an object of the ImageView class to draw the image.
Get the coordinates of the touch event using the MotionEvent.getX(), MotionEvent.getY() or MotionEvent.getRawX(), MotionEvent.getRawY() functions (use the appropriate pair depending on the kind of layout you are using, linear/relative). Use these coordinates as your ImageView object's left and top margins, respectively.
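A rough sketch of the idea, assuming the listener sits on a RelativeLayout field called rootLayout and R.drawable.finger_marker is the drawable you want to show (both names are placeholders):

```java
// e.g. in onCreate(), after rootLayout has been assigned via findViewById()
rootLayout.setOnTouchListener(new View.OnTouchListener() {
    private ImageView marker;

    @Override
    public boolean onTouch(View v, MotionEvent event) {
        switch (event.getAction()) {
            case MotionEvent.ACTION_DOWN: {
                // Create the image where the finger went down.
                marker = new ImageView(v.getContext());
                marker.setImageResource(R.drawable.finger_marker);
                RelativeLayout.LayoutParams lp = new RelativeLayout.LayoutParams(
                        RelativeLayout.LayoutParams.WRAP_CONTENT,
                        RelativeLayout.LayoutParams.WRAP_CONTENT);
                lp.leftMargin = (int) event.getX();
                lp.topMargin = (int) event.getY();
                rootLayout.addView(marker, lp);
                return true;
            }
            case MotionEvent.ACTION_MOVE: {
                // Follow the finger by updating the margins.
                RelativeLayout.LayoutParams lp =
                        (RelativeLayout.LayoutParams) marker.getLayoutParams();
                lp.leftMargin = (int) event.getX();
                lp.topMargin = (int) event.getY();
                marker.setLayoutParams(lp);
                return true;
            }
            case MotionEvent.ACTION_UP: {
                // Remove the image when the finger is lifted.
                rootLayout.removeView(marker);
                marker = null;
                return true;
            }
        }
        return false;
    }
});
```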
Regards,
Anirudh.