I want to create some kind of tower defense map using a background image which is four times as large as the screen of the device (twice the width, twice the height).
My question is: how can I best do this? I'm new to Android, I just got some Java basics and want to try out some stuff.
I want the user to scroll over the entire map with their finger and zoom in via a two-finger pinch, and of course the objects (towers, sprites) should stay where they are on the map.
I've been searching for hours now and only found answers like "use ScrollView". I just want some food for thought to get on the right track, maybe with some examples.
You can use an ImageView (or a custom View) and set an appropriate OnTouchListener where you detect the pinch-to-zoom gesture using a ScaleGestureDetector and change the draw coordinates when the user drags a finger.
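A minimal sketch of that idea, assuming a custom View that owns the large map Bitmap and draws it with a pan offset and a zoom scale (class and field names here are illustrative, not a finished implementation):

import android.content.Context;
import android.graphics.Bitmap;
import android.graphics.Canvas;
import android.view.MotionEvent;
import android.view.ScaleGestureDetector;
import android.view.View;

// Minimal pan + pinch-to-zoom view for a background larger than the screen.
public class MapView extends View {
    private final Bitmap map;                  // e.g. twice the screen size in each direction
    private final ScaleGestureDetector scaleDetector;
    private float scale = 1f;
    private float offsetX = 0f, offsetY = 0f;  // current pan offset
    private float lastX, lastY;                // last finger position while dragging

    public MapView(Context context, Bitmap map) {
        super(context);
        this.map = map;
        scaleDetector = new ScaleGestureDetector(context,
                new ScaleGestureDetector.SimpleOnScaleGestureListener() {
                    @Override
                    public boolean onScale(ScaleGestureDetector detector) {
                        scale *= detector.getScaleFactor();
                        scale = Math.max(0.5f, Math.min(scale, 3f));  // clamp the zoom level
                        invalidate();
                        return true;
                    }
                });
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        scaleDetector.onTouchEvent(event);       // let the pinch detector see every event
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:
                lastX = event.getX();
                lastY = event.getY();
                break;
            case MotionEvent.ACTION_MOVE:
                if (!scaleDetector.isInProgress()) {   // don't pan while a pinch is active
                    offsetX += event.getX() - lastX;
                    offsetY += event.getY() - lastY;
                    lastX = event.getX();
                    lastY = event.getY();
                    invalidate();
                }
                break;
        }
        return true;
    }

    @Override
    protected void onDraw(Canvas canvas) {
        canvas.save();
        canvas.translate(offsetX, offsetY);
        canvas.scale(scale, scale);
        canvas.drawBitmap(map, 0, 0, null);
        // towers and sprites drawn here, in map coordinates, move with the map
        canvas.restore();
    }
}

Because the towers and sprites are drawn inside the same translate/scale block in map coordinates, they stay anchored to the map while the user pans and zooms.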
Related
I want to develop a game named "Balda".
I have a 2D grid of ImageViews (or buttons, maybe).
The user should be able to move their finger over the grid, and the app should know which images in the grid were touched during this move. The picture below shows what I'm trying to achieve: A is the start point where the user pressed the screen, B is the end point where the finger leaves the screen, and I need to know which images were touched along the way (they are blue in the picture).
I know that I can do something like this, but I think it is the wrong solution, because it contradicts the principle of assigning functionality by responsibility.
I think it is the responsibility of the ImageView to know when a finger enters its borders and when it leaves them.
I thought this would be in the Android API, and it does have MotionEvent actions like ACTION_HOVER_ENTER and ACTION_HOVER_EXIT, but they don't work with a finger. After the finger is pressed on some View, that View will receive all further MotionEvents, if I understand it correctly.
I think this is wrong. What can I do to get this functionality? Maybe I could create some custom listeners and custom Views that support them?
I think your requirement is somewhat similar to a custom GridView.
You can try the steps below:
1) Create a custom View.
2) Attach a TouchListener to it.
3) Divide this view into a 4x3 matrix.
4) Map your images to this 4x3 matrix.
5) Write a function which returns the cell number for the touched co-ordinates (see the sketch after these steps).
6) After getting the cell number, get the image mapped to that cell.
7) Put this image in an ArrayList.
8) When the user lifts their finger you will have an ArrayList of the touched images (do whatever you want with it).
9) Remember to add this custom view to your activity.
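A rough sketch of steps 5-8, assuming the logic lives inside the custom View from step 1 and that the view covers the whole grid area (ROWS, COLS, cellImages and handleTouchedImages are illustrative names, not existing APIs):

private static final int ROWS = 4, COLS = 3;
private final Bitmap[] cellImages = new Bitmap[ROWS * COLS];      // step 4: your image mapping
private final ArrayList<Bitmap> touched = new ArrayList<Bitmap>();

@Override
public boolean onTouchEvent(MotionEvent event) {
    switch (event.getActionMasked()) {
        case MotionEvent.ACTION_DOWN:
        case MotionEvent.ACTION_MOVE: {
            // step 5: convert the touch position into a cell number
            int col = (int) (event.getX() / (getWidth() / (float) COLS));
            int row = (int) (event.getY() / (getHeight() / (float) ROWS));
            col = Math.max(0, Math.min(col, COLS - 1));
            row = Math.max(0, Math.min(row, ROWS - 1));
            Bitmap image = cellImages[row * COLS + col];           // step 6
            if (!touched.contains(image)) {
                touched.add(image);                                // step 7
            }
            break;
        }
        case MotionEvent.ACTION_UP:
            // step 8: 'touched' now holds every image the finger passed over
            handleTouchedImages(touched);                          // your own callback
            touched.clear();
            break;
    }
    return true;
}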
Tell me if you have any doubts or concerns.
I want to create an interactive graphic for my app. It will essentially be a simple picture of a bus line where users can select two stops at a time (one for departure times and another for arrivals). I'm not sure how to create this image, though, and have it contain 20 or so different clickable points. Is there a framework I could use for this? Or is there a way to do this in pure Android?
Thanks for the help.
I would suggest writing an onClick listener and using a collection of Rect instances to manage the collision/position of the 'click'. Check out the onClick documentation and the Rect documentation.
One thing to keep in mind is the origin point of your clicks; I'd assume you'll want to use one corner of your image as the point (0,0) and reference everything (clicks and Rects) from there.
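A minimal sketch of that approach. Note that OnClickListener itself doesn't report coordinates, so this assumes an OnTouchListener on the bus-line view instead; the stop rectangles, busLineView and onStopSelected are made-up placeholders:

final List<Rect> stops = new ArrayList<Rect>();
stops.add(new Rect(10, 40, 50, 80));     // stop 0 - placeholder coordinates
stops.add(new Rect(70, 40, 110, 80));    // stop 1 - placeholder coordinates
// ... one Rect per stop, ~20 in total, all relative to the view's top-left corner

busLineView.setOnTouchListener(new View.OnTouchListener() {
    @Override
    public boolean onTouch(View v, MotionEvent event) {
        if (event.getAction() == MotionEvent.ACTION_UP) {   // treat finger-up as the "click"
            int x = (int) event.getX();
            int y = (int) event.getY();
            for (int i = 0; i < stops.size(); i++) {
                if (stops.get(i).contains(x, y)) {
                    onStopSelected(i);                       // hypothetical callback
                    break;
                }
            }
        }
        return true;
    }
});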
I would say try creating an ImageView to load your image and set a touch event listener or a click listener on that view. Hard-code all the places where you want your image to react to a click.
Checking clicks with a touch event listener requires you to handle both ACTION_DOWN and ACTION_UP in the MotionEvent object passed in. It makes it easy to grab the coordinates of where the user touched, so you only need one listener, but you have to put more work into converting the coordinates passed in by the MotionEvent into coordinates on the image. This is particularly an issue when your image can be larger than the screen.
Using a click listener would save you from this trouble. As @smitec said, you need to overlay rectangles on your image as "buttons", so you can react to user input based on which button was pressed. This way you need to bind listeners to all of them (I suppose) and hard-code their positions on your image, but, as mentioned earlier, it saves you from dealing with coordinates later on.
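For the coordinate conversion mentioned above, one common trick is to run the touch point through the inverse of the ImageView's image matrix. A sketch, assuming it runs inside the listener's onTouch(View, MotionEvent) and that imageView is the view showing your picture:

float[] point = { event.getX(), event.getY() };
Matrix inverse = new Matrix();
if (imageView.getImageMatrix().invert(inverse)) {
    inverse.mapPoints(point);   // point[0], point[1] are now coordinates on the image itself
}
int imageX = (int) point[0];
int imageY = (int) point[1];
// compare imageX/imageY against your hard-coded hotspot rectangles here

This handles the scaling and positioning for you as long as the ImageView's scale type actually goes through the image matrix (e.g. matrix, fitCenter, center).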
Is it possible to detect every pixel being touched? More specifically, when the user touches the screen, is it possible to track all the x-y coordinates of the cluster of points touched by the user? How can I tell the difference between when users are drawing with their thumb and when they are drawing with the tip of a finger? I would like to reflect the brush difference depending on how users touch the screen, and would also like to track x-y coordinates of all the pixels being touched over time. Thanks so much in advance for any help.
This would be very tricky primarily because every android phone is going to behave differently. There are some touch screen devices that are very, very sensitive and some that are basically "dull" by comparison.
It also sounds like you want to track pressure - how hard the user is pushing on the screen - which is actually supported on Android devices.
I think some of your answer may be found by monitoring all of the touch events. In practice, most applications ignore a great number of events or perform some kind of "smoothing" of the events, since there is literally a deluge of touch events while the user is manipulating the screen. Doing this may negatively impact your application's performance, though.
I would recommend that you look into pressure sensitivity and calculate a circular region around the primary touch point based on pressure, then build your brush around that.
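A small sketch of that idea, assuming it lives in the onTouchEvent of your drawing view. getPressure() and getSize() are standard MotionEvent calls, but their values are device dependent, so the mapping to a radius below is a guess you would tune (drawBrushStamp is a made-up method):

@Override
public boolean onTouchEvent(MotionEvent event) {
    float pressure = event.getPressure();   // how hard the finger presses, device dependent
    float size = event.getSize();           // approximate contact area, device dependent
    // arbitrary mapping from pressure/size to a brush radius - tune per device
    float radius = 10f + 40f * Math.max(pressure, size);
    drawBrushStamp(event.getX(), event.getY(), radius);   // your own drawing routine
    invalidate();
    return true;
}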
Another idea would be to incorporate more of a gesture approach to what you are trying to do - for example, visualize touching the screen with the tip of two fingers together (index and middle) and rolling the middle finger around the index finger or simply moving the middle finger up and down in relation to the index finger. Both fingers would be moved together for painting. This could be used to manipulate drawing angle on the fly or perhaps even toggle between a set of pre-selected brushes or could change brush size on the fly as you are painting.
Some of the above ideas I would love to see implemented - let me know when you have your app ready.
Good luck!
Rodney
If you have a listener on your image, it will basically respond that there was a touch somewhere within that bounding box.
So, to get what you want, you could create a box around every pixel, or every small group of pixels, and listen for a touch on each - but I would never do this.
Wherever you get a touch, it fires off an event, and then you can react accordingly.
I can't think of any other solution that will give you every pixel a person touched at one time.
You may want to read up on multitouch though, as there are some suggestions in here that may help you:
http://android-developers.blogspot.com/2010/06/making-sense-of-multitouch.html
If you're looking for a way to get your content view as a View after Activity#setContentView(int), you can set an id on the outer-most element of your layout:
android:id="@+id/entire_view" and reference it in your onCreate() method after setContentView():
View view = findViewById(R.id.entire_view);
view.setOnTouchListener( ... );
I am developing a puzzle game where the user has to arrange images in a grid. A screenshot is given below.
I want to be able to drag the image from one cell of the grid to another cell. I searched many sites and everywhere I found examples using the drag-and-drop API (i.e. OnDragListener etc.), which was introduced in Android 3.0, but my application needs to run on Android 2.2.
So please help me implement this using the touch API (i.e. OnTouchListener etc.).
One way of doing it would be to take the x & y location of the touch in relation to the grid.
E.g. a 10x10 grid on a 100x100 area.
If the touch was at 25,25, it would select square 2,2 (using an array). You could then save that location to a variable (so as to move whatever piece you are swapping it with) and, on drag, update the bitmap's x,y in relation to the touch.
Once you lift your finger at, say, 75,75, it would set the dragged puzzle piece at 7,7 and move the piece that was there to 2,2.
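A rough sketch of that idea using only the touch API (so it works before Android 3.0), assuming a custom View that draws the 10x10 grid itself; GRID, swapPieces and the drag fields are illustrative names:

private static final int GRID = 10;
private int fromRow = -1, fromCol = -1;    // cell picked up on ACTION_DOWN
private float dragX, dragY;                // finger position while dragging (used in onDraw)

@Override
public boolean onTouchEvent(MotionEvent event) {
    int cellW = getWidth() / GRID;
    int cellH = getHeight() / GRID;
    int col = Math.min((int) event.getX() / cellW, GRID - 1);
    int row = Math.min((int) event.getY() / cellH, GRID - 1);

    switch (event.getAction()) {
        case MotionEvent.ACTION_DOWN:
            fromRow = row;                 // e.g. a touch at 25,25 on a 100x100 view -> cell 2,2
            fromCol = col;
            break;
        case MotionEvent.ACTION_MOVE:
            dragX = event.getX();          // draw the picked-up bitmap at dragX,dragY in onDraw()
            dragY = event.getY();
            invalidate();
            break;
        case MotionEvent.ACTION_UP:
            if (fromRow >= 0) {
                swapPieces(fromRow, fromCol, row, col);   // your own model update, e.g. 7,7 <-> 2,2
                fromRow = fromCol = -1;
                invalidate();
            }
            break;
    }
    return true;
}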
I used something similar, minus the drag, in my Lazer Maze Lite game. Mine was basically moving mirrors and bombs on touch, though.
http://developer.android.com/guide/topics/ui/drag-drop.html
I have recently been getting into Android programming and want to make a simple game using 2D canvas drawing. I have checked out the Lunar Lander example and read up on some gestures, but it looks like I can only detect whether a gesture occurred. I am looking to do a little more complicated detection on a swipe:
I want to make a simple game where a user can drag their finger through one or more objects on the screen and I want to be able to detect the objects that they went over in their path. They may start going vertically, then horizontally, then vertically again, such that at the end of a contiguous swipe they have selected 4 elements.
1) Are there APIs that expose the functionality of getting the full path of a swipe like this?
2) Since I am drawing on a Canvas, I don't think I will be able to access things like "onMouseOver" for the items in my game. I will have to instead detect if the swipe was within the bounding box of my sprites. Am I thinking about this correctly?
If I have overlooked an obvious post, I apologize. Thank you in advance!
I decided to implement the
public boolean onTouchEvent(MotionEvent event)
handler in my code for my game. Instead of getting the full path, I do a check to see which tile the user is over each time onTouchEvent fires. I previously thought this event fired only once, on the initial touch, but it keeps firing as long as the finger is moving along the surface of the screen, even if it hasn't been lifted and placed down again.
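A condensed sketch of that approach, assuming it sits in a custom View that draws the game on a Canvas; TILE_SIZE and the Point-based bookkeeping are placeholders for however you lay out your sprites:

private static final int TILE_SIZE = 64;                   // placeholder tile size in pixels
private final Set<Point> selectedTiles = new HashSet<Point>();

@Override
public boolean onTouchEvent(MotionEvent event) {
    switch (event.getAction()) {
        case MotionEvent.ACTION_DOWN:
            selectedTiles.clear();
            // fall through - the first touch also selects a tile
        case MotionEvent.ACTION_MOVE: {
            int col = (int) event.getX() / TILE_SIZE;
            int row = (int) event.getY() / TILE_SIZE;
            selectedTiles.add(new Point(col, row));   // the Set ignores duplicates
            invalidate();
            break;
        }
        case MotionEvent.ACTION_UP:
            // selectedTiles now holds every tile crossed during the swipe
            break;
    }
    return true;
}

Because the Set ignores duplicates, each tile is recorded only once per swipe, and by ACTION_UP you have the whole contiguous selection without needing any onMouseOver-style callback.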