I have an Android game app that populates the screen with tiny images (circles). The game is to tap the circles on the screen as quickly as possible until they all disappear. The problem is that a player can easily circumvent the game by swiping/moving a finger across the screen (making every circle it touches disappear). In four swipes you're done with the game. I'm not even sure of the correct terminology.
Any ideas? onFling? onLongPress? I do not know.
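One common approach is to make the game view react only to the initial finger-down of each gesture, so the move events of a swipe never count as taps. Below is a minimal sketch assuming the circles are drawn by a custom View; GameView and removeCircleAt() are hypothetical names, not from the question:

```java
import android.content.Context;
import android.view.MotionEvent;
import android.view.View;

public class GameView extends View {

    public GameView(Context context) {
        super(context);
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        // Only the first finger-down of a gesture counts as a tap.
        // ACTION_MOVE events from a swipe are deliberately ignored, so dragging
        // a finger across the screen no longer pops every circle it crosses.
        if (event.getActionMasked() == MotionEvent.ACTION_DOWN) {
            removeCircleAt(event.getX(), event.getY());
            invalidate(); // redraw without the removed circle
        }
        return true; // consume the rest of the gesture so it goes nowhere else
    }

    private void removeCircleAt(float x, float y) {
        // look up and remove the circle under (x, y); game-specific, omitted here
    }
}
```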
Related
I am trying to make Scrolling Object Collection from MRTK Unity work with Android.
The problem is that scrolling is just too slow. When I swipe across the whole screen, the objects in the list barely move.
If I look at the collection, touch the screen, and then move my phone in the direction I want to scroll, everything works fine.
But that is not how I want users to scroll.
Any ideas on how to make touch input behave the same way/have the same impact as moving the phone?
So, here is the deal. I want to create a little game-like app (let me call it the game from now on). Before I begin, I'd like to show you the raw UI of the game.
So, as you can see, I would have:
a red ball in the middle
two walls, one on each side
Procedure:
The user moves the device to the right side so the ball hits the right wall (the gray ball on the right side is simulated)
Moving the device to the left makes the ball reach the left wall as well (like the gray ball on the left)
All I know regarding the programming side is the sensors, especially the accelerometer. Many people on Stack Overflow are of the opinion that working with sensors is not in a good state, that is, the data coming from them is very noisy and processing it can be hard. However, I think a lot of games use them to some extent...
Well, I tried moving an image by drawing it. I attempted to get data from the gyroscope, accelerometer, and linear acceleration sensor and used that data to draw my image. The problem is that if I move the device smoothly and slowly, I get almost no movement in the drawing. That makes for a horrible user experience.
The question is: how can I implement my game so that it meets all the needs above even if I move my device slowly? Any approaches are welcome.
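For what it's worth, here is a minimal sketch of one common approach, not a definitive solution: low-pass-filter the accelerometer reading and feed the filtered tilt into the ball position on every sensor update, so even slow, smooth tilts keep moving the ball. BallActivity, the filter constant, the scale factor, and the commented-out gameView call are all assumptions, not from the question:

```java
import android.app.Activity;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

public class BallActivity extends Activity implements SensorEventListener {
    private SensorManager sensorManager;
    private float ballX;                       // current ball position (pixels)
    private float filteredX = 0f;              // smoothed acceleration on the X axis
    private static final float ALPHA = 0.15f;  // filter factor: lower = smoother but laggier

    @Override
    protected void onResume() {
        super.onResume();
        sensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
        Sensor accel = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
        sensorManager.registerListener(this, accel, SensorManager.SENSOR_DELAY_GAME);
    }

    @Override
    protected void onPause() {
        super.onPause();
        sensorManager.unregisterListener(this);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        // Low-pass filter: keep most of the previous value, blend in a little of the new one.
        filteredX = filteredX + ALPHA * (event.values[0] - filteredX);
        // Treat the filtered tilt as a velocity, so slow, smooth tilts still move the ball.
        ballX -= filteredX * 5f;      // scale factor chosen by experiment
        // gameView.setBallX(ballX); // hypothetical: redraw the ball at its new position
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}
```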
I'm creating a live wallpaper for Android and ran into an issue with manual scrolling (without using the onOffsetsChanged method). The scrolling works like a charm except for one very unpleasant side effect: when a user opens the "AllApps" screen and this screen is semi-transparent (like on Xperia phones), the wallpaper keeps reacting to touch events while the user scrolls through the screens of his/her installed apps.
I understand that nothing is changed from the wallpaper's "point of view" if this screen is semi-transparent. It keeps on rendering and continues to process all the touch events.
So, the question is: is there a way to detect the moment when a user opens the "AllApps" screen and stop processing touch events until this screen is closed? Any ideas?
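For reference, a minimal sketch of the setup the question describes: a WallpaperService engine that enables touch events and scrolls from them instead of from onOffsetsChanged. MyWallpaperService and scrollBy() are hypothetical names; as noted above, these callbacks keep arriving while a translucent launcher screen sits on top, which is exactly the problem:

```java
import android.service.wallpaper.WallpaperService;
import android.view.MotionEvent;
import android.view.SurfaceHolder;

public class MyWallpaperService extends WallpaperService {

    @Override
    public Engine onCreateEngine() {
        return new MyEngine();
    }

    class MyEngine extends Engine {
        private float lastX;

        @Override
        public void onCreate(SurfaceHolder surfaceHolder) {
            super.onCreate(surfaceHolder);
            setTouchEventsEnabled(true); // manual scrolling depends on these events
        }

        @Override
        public void onTouchEvent(MotionEvent event) {
            // These events keep arriving while a translucent "AllApps" screen is shown,
            // because the wallpaper surface is still visible underneath it.
            switch (event.getActionMasked()) {
                case MotionEvent.ACTION_DOWN:
                    lastX = event.getX();
                    break;
                case MotionEvent.ACTION_MOVE:
                    scrollBy(event.getX() - lastX);
                    lastX = event.getX();
                    break;
            }
        }

        private void scrollBy(float dx) {
            // redraw the wallpaper shifted by dx (omitted)
        }
    }
}
```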
I have a very creative requirement. I am not sure if this is feasible, but it would certainly spice up my app if it is.
Premise: On Android phones, if the screen is covered by a hand (not touching, just close to the screen), or if the phone is placed against the ear during a call, the phone locks or basically blacks out. So there must be some technology that can recognize that my hand is near the screen.
Problem: I have an image in my app. If the user points at the image without touching the screen, just as an extension of the premise, I must be able to know that the user is pointing at the image and change the image. Is this possible?
UPDATE: An example use:
Say I want to build a fun app where touching the image leads somewhere else. For example, I have two doors, one to a car and one to a lion. Now, just when the user is about to touch door 1, the door should show a message asking "Are you sure?", and then actually touching it takes you to another place. A rather rudimentary example, but I hope you get the point.
The feature you are talking about is the proximity sensor. See Sensor and SensorEvent.values for Sensor.TYPE_PROXIMITY.
You could get the distance of the hand from the screen, but you won't really be sure where in the XY co-ordinate system the hand is. So you won't be able to figure out whether the user is pointing to the "car door" or to the "lion door".
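For completeness, a minimal sketch of reading Sensor.TYPE_PROXIMITY from an Activity; onHandNear() is a hypothetical callback, not part of the Android API:

```java
import android.app.Activity;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

public class ProximityActivity extends Activity implements SensorEventListener {
    private SensorManager sensorManager;
    private Sensor proximity;

    @Override
    protected void onResume() {
        super.onResume();
        sensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
        proximity = sensorManager.getDefaultSensor(Sensor.TYPE_PROXIMITY);
        if (proximity != null) {
            sensorManager.registerListener(this, proximity, SensorManager.SENSOR_DELAY_NORMAL);
        }
    }

    @Override
    protected void onPause() {
        super.onPause();
        sensorManager.unregisterListener(this);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        // Many devices only report "near" (often 0) or the sensor's maximum range,
        // and there is no X/Y position, just a coarse distance.
        if (event.values[0] < proximity.getMaximumRange()) {
            onHandNear();
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }

    private void onHandNear() {
        // react to the hand being close to the screen (e.g. swap the image)
    }
}
```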
You could make this work on a phone with a wide-angle front camera that can see the whole screen. You'd have to write the software for recognizing hand movements and translate these into screen actions.
Why not just use touch, if I may ask?
I am designing a game and have a large background. The background is a lot bigger than the phone display, so the user will only have a small "view" of the background. They will be able to move around by scrolling with their finger.
How do I go about this?
My application does this.
It is open-source so feel free to copy-and-paste:
http://code.google.com/p/androidbigimage
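For a rough idea of the technique (this is a generic sketch, not code taken from that project): keep the full-size bitmap in memory, draw it at an offset, and update the offset from touch deltas. ScrollingBackgroundView and R.drawable.big_background are placeholder names, and the bitmap is assumed to be larger than the view:

```java
import android.content.Context;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.graphics.Canvas;
import android.view.MotionEvent;
import android.view.View;

public class ScrollingBackgroundView extends View {
    private final Bitmap background;   // much larger than the screen
    private float offsetX, offsetY;    // top-left corner of the visible window
    private float lastX, lastY;        // previous touch position

    public ScrollingBackgroundView(Context context) {
        super(context);
        background = BitmapFactory.decodeResource(getResources(), R.drawable.big_background);
    }

    @Override
    protected void onDraw(Canvas canvas) {
        // Shift the big bitmap so the window at (offsetX, offsetY) fills the screen.
        canvas.drawBitmap(background, -offsetX, -offsetY, null);
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:
                lastX = event.getX();
                lastY = event.getY();
                return true;
            case MotionEvent.ACTION_MOVE:
                // Drag the content with the finger, clamped to the bitmap bounds.
                offsetX = clamp(offsetX - (event.getX() - lastX), 0, background.getWidth() - getWidth());
                offsetY = clamp(offsetY - (event.getY() - lastY), 0, background.getHeight() - getHeight());
                lastX = event.getX();
                lastY = event.getY();
                invalidate(); // redraw with the new offset
                return true;
        }
        return super.onTouchEvent(event);
    }

    private static float clamp(float value, float min, float max) {
        return Math.max(min, Math.min(max, value));
    }
}
```

For very large images you would normally not load the whole bitmap at once, but this shows the basic "viewport over a bigger world" idea the question asks about.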