I am currently building a game in Unity, targeting Android devices. Right now, for development purposes, I am coding against keyboard controls. What would be the simplest and most effective way of handling the actions when the screen is swiped? I want to access the events when the screen is swiped left, right, up, or down.
The easiest way to plug into the drag (and other) events is to use an EventTrigger component on a UI element (for example, an Image).
I am trying to automate user interactions on Android TV apps that don't have a standard Android view hierarchy (they are probably written with an OpenGL framework).
I can provide a view dump of any of these apps if needed.
I'm trying to fire a click event on a particular button, say 'ABC', which is present at the 'x y' coordinates on the screen.
For native Android TV apps, I can do that by firing an 'adb shell input tap x y' event or by calling UiDevice.click(x, y).
However, I'm not able to do so with the Netflix, Prime Video, YouTube, or Hulu apps for Android TV.
The click/tap is actually triggered on the screen, but the button doesn't respond to it.
Maybe that's because it's just part of a View (FrameLayout) and not actually a button in the OpenGL world.
I don't want to use the D-Pad Remote Control events for this.
(Maybe shifting focus to that coordinate and then pressing D-pad centre could work.)
Is there any way I can achieve that?
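For reference, a minimal UiAutomator sketch of the two approaches mentioned above; the coordinates, the focus-walk sequence, and the class name are hypothetical placeholders:

    import androidx.test.platform.app.InstrumentationRegistry;
    import androidx.test.uiautomator.UiDevice;

    public class AtvClickSketch {
        public void clickAbcButton() {
            UiDevice device = UiDevice.getInstance(
                    InstrumentationRegistry.getInstrumentation());

            // Approach 1: raw tap at (x, y), equivalent to
            // 'adb shell input tap 640 360'. Works for native views, but an
            // OpenGL-rendered app may simply ignore the synthetic touch.
            device.click(640, 360);

            // Approach 2 (the workaround hinted at in the question): walk
            // focus with the D-pad until the target is highlighted, then
            // press D-pad centre.
            device.pressDPadRight();
            device.pressDPadDown();
            device.pressDPadCenter();
        }
    }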
I want to add multiple pages to my Android app, similar to the home screen on my phone: I want to be able to swipe left and right to see multiple pages.
I'm developing my app in Adobe Flash CC 2014 using "AIR 16.0 for Android".
Anyone know how I can do this?
You can take different approaches to this problem. You could create some swipe gestures to detect it, or you could go the way Flash has gone since 1999: set up a MovieClip (or many), listen for onMouseDown (ontouchstart) events, and then call mc.startDrag(); (you want to limit the drag movement to the X axis). Then, onMouseUp (ontouchend), you can determine whether the current MovieClip is relatively centred and tween it into the middle of the screen, or whether the page is too far left/right and you should therefore page to the next page. There is also a touch-drag implementation out of the box with ontouchmove.
Basically, what you are looking for is some kind of cover flow for AS3, or something a lot less fancy. Please make yourself comfortable with startDrag and stopDrag and you will see how to get there by doing.
I'm trying to make my GUI, designed in Qt, look good on Android devices. Right now I'm using widgets, with the idea of making a universal GUI (for desktop and mobile).
If anybody has experience in this field, can you share some literature, materials, techniques, or something like that?
Thanks!
There are two routes I can see:
1. Design a separate UI for each of desktop (mouse + keyboard) and Android (touch).
With Qt and QML this is a very strong alternative. Keep the application logic in C++ and in separate JavaScript .js files, then write/design the UI .qml files from scratch for each platform. The downside, or perhaps an extra bonus depending on your point of view, is that you have to be pretty careful with the overall architecture so that you can share as much code as possible and really have just the GUI differ.
2. Limit yourself to common user interactions. For some UIs this is perfectly fine. There's no universal one-to-one mapping between touch and mouse/keyboard, but there are some common idioms:
tap / click for activation
long tap / right mouse button for context menu
flick / mouse wheel for scrolling
pinch zoom / zoom with ctrl+wheel
pan with two fingers / drag with mouse button down, or with right button down, or with alt/control keys down
The biggest hurdle for a common UI is perhaps selection, especially text selection. What is a simple drag, or clicks with Shift/Ctrl pressed, when using a mouse becomes a complex exercise when using touch, possibly requiring a separate icon to enter a selection mode, or a long tap followed by picking the right choice from a context menu. If selecting things is a central action in your app, you're probably better off going with alternative 1 above, so you can really optimize the touch UI for it, while giving desktop users the "standard" desktop way of selecting things.
I'm starting from scratch and I don't even know where to look, or even what to call this, so any help would be appreciated.
I am working with the Snake example that Google supplies, and I want it to work on Android devices without physical keys. How would I be able to use touch gestures (up/left/right/down) to make the snake turn and so on?
Again, I don't even know where to start looking; I'm still slowly learning my way through programming for Android.
Thanks!
You could implement View.OnTouchListener. I think this page is the one you're looking for.
I would modify it so that touching the top part of the screen sent the snake up, the left side sent it left, etc.
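A minimal sketch of that idea, implementing View.OnTouchListener and splitting the view along its diagonals so that each region (top/left/right/bottom) maps to a direction. The Direction enum and the Listener callback are hypothetical stand-ins for whatever the Snake example uses for its movement logic:

    import android.view.MotionEvent;
    import android.view.View;

    public class RegionTouchListener implements View.OnTouchListener {
        public enum Direction { UP, DOWN, LEFT, RIGHT }

        // Hypothetical callback; forward this to the snake's move logic.
        public interface Listener {
            void onDirection(Direction direction);
        }

        private final Listener listener;

        public RegionTouchListener(Listener listener) {
            this.listener = listener;
        }

        @Override
        public boolean onTouch(View v, MotionEvent event) {
            if (event.getAction() != MotionEvent.ACTION_DOWN) {
                return false;
            }
            // Offset from the view's centre decides which region was touched.
            float dx = event.getX() - v.getWidth() / 2f;
            float dy = event.getY() - v.getHeight() / 2f;

            if (Math.abs(dx) > Math.abs(dy)) {
                listener.onDirection(dx > 0 ? Direction.RIGHT : Direction.LEFT);
            } else {
                listener.onDirection(dy > 0 ? Direction.DOWN : Direction.UP);
            }
            return true;
        }
    }

Attach it to whatever view renders the game, e.g. snakeView.setOnTouchListener(new RegionTouchListener(direction -> { /* turn the snake */ }));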
I would like to know what is meant by gestures on typical mobile devices, more specifically Android ones. Android supports gesture recognition.
1. Is it termed a gesture when the user holds the device and shakes it (say upwards, downwards, or side to side)?
2. Is it termed a gesture when a finger is placed on the screen and traced up, down, right, or left? If so, what is the difference between a touch screen and gestures?
I am confused between options 1) and 2).
What is a gesture exactly?
As I understand it, a gesture is any time a user touches the screen and performs a pre-defined motion that the system understands. I would venture to say that shaking the phone is not a gesture, but a matter of detecting changes in the accelerometer readings.
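To illustrate the distinction, a minimal sketch of shake detection via the accelerometer rather than the gesture APIs; the threshold value and the onShake() hook are arbitrary assumptions:

    import android.content.Context;
    import android.hardware.Sensor;
    import android.hardware.SensorEvent;
    import android.hardware.SensorEventListener;
    import android.hardware.SensorManager;

    public class ShakeDetector implements SensorEventListener {
        private static final float SHAKE_THRESHOLD = 12f; // m/s^2, arbitrary

        private final SensorManager sensorManager;

        public ShakeDetector(Context context) {
            sensorManager =
                    (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
        }

        public void start() {
            Sensor accel = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
            sensorManager.registerListener(this, accel,
                    SensorManager.SENSOR_DELAY_NORMAL);
        }

        public void stop() {
            sensorManager.unregisterListener(this);
        }

        @Override
        public void onSensorChanged(SensorEvent event) {
            float x = event.values[0], y = event.values[1], z = event.values[2];
            // Acceleration magnitude minus gravity; a spike suggests a shake.
            double magnitude =
                    Math.sqrt(x * x + y * y + z * z) - SensorManager.GRAVITY_EARTH;
            if (magnitude > SHAKE_THRESHOLD) {
                onShake(); // hypothetical hook
            }
        }

        @Override
        public void onAccuracyChanged(Sensor sensor, int accuracy) { }

        protected void onShake() { /* react to the shake */ }
    }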
From Android's reference web page, a gesture is a hand-drawn shape on a touch screen. It can have one or multiple strokes. Each stroke is a sequence of timed points. A user-defined gesture can be recognized by a GestureLibrary.
https://developer.android.com/reference/android/gesture/Gesture.html
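For concreteness, a minimal sketch of that flow: load a GestureLibrary from a raw resource and recognize gestures drawn on a GestureOverlayView. The res/raw/gestures file (as produced by the Gesture Builder sample app), the layout, and the overlay id are assumptions:

    import java.util.ArrayList;

    import android.app.Activity;
    import android.gesture.Gesture;
    import android.gesture.GestureLibraries;
    import android.gesture.GestureLibrary;
    import android.gesture.GestureOverlayView;
    import android.gesture.Prediction;
    import android.os.Bundle;

    public class GestureDemoActivity extends Activity
            implements GestureOverlayView.OnGesturePerformedListener {

        private GestureLibrary library;

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            setContentView(R.layout.main); // assumed layout containing the overlay

            library = GestureLibraries.fromRawResource(this, R.raw.gestures);
            library.load();

            GestureOverlayView overlay =
                    (GestureOverlayView) findViewById(R.id.overlay);
            overlay.addOnGesturePerformedListener(this);
        }

        @Override
        public void onGesturePerformed(GestureOverlayView overlay, Gesture gesture) {
            // Each Prediction pairs a stored gesture's name with a match score.
            ArrayList<Prediction> predictions = library.recognize(gesture);
            if (!predictions.isEmpty() && predictions.get(0).score > 1.0) {
                String name = predictions.get(0).name; // e.g. "swipe_left"
                // React to the recognized gesture here.
            }
        }
    }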
I see gestures as a type of input pattern that you expect from the user. For example, you can set up mouse gestures in web browsers to do things like going "Forward" or "Back" in the browser by performing a specific "gesture" (maybe clicking the middle mouse button and moving the mouse left goes "Back").
I'll give a brief answer to your bonus question: yes, it's quite possible to do character recognition from input gestures on Android. In fact, at least one major vendor has already ported an existing handwriting engine to that platform. It works beautifully, but there's a lot of legal and marketing cruft to take care of as well before it ends up on real devices :(