So I followed Mathew Casperson's Making Games on Android tutorial and got a small game running a few days ago. Now I am trying to switch the controls from the D-pad to the touchscreen.
I am running into some problems and was wondering if anyone here could help me. Flixel doesn't have any built-in touchscreen functions, so I am overriding onTouchEvent(MotionEvent event) in my Activity (FlixelDemo.java in the tutorial) and, hopefully, getting the coordinates of the touch.
I then have a function in my Player.java that, given the touch coordinates, can tell me whether my player has been touched.
The problem I am having is figuring out how to reach and call that function (isCollision) from within the activity.
It seems that I can only override the onTouchEvent in FlixelDemo.java and that I can only use the isCollision function in GameState.java where I add the player.
How do I get the info from the overridden touch event to any of my other classes? Can anyone tell me what I am doing wrong or help me figure out a different way of implementing touch events?
Looking at the code, FlixelDemo is really just a container for org.flixel.FlxGameView (via the res/layout/main.xml file).
The onTouchEvent method can be applied to any View, so you can apply it to just the flixel viewport.
And in fact, that's probably what you want to do here: Add your handler directly to FlxGameView.java, and then let it call a method on the internal GameThread class.
It's already handling the other events this way. See FlxGameView.onKeyDown (and the associated FlxGameView.GameThread.doKeyDown) for a good example.
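A minimal sketch of what that could look like, assuming the FlxGameView/GameThread structure from the tutorial's Flixel port (the field name thread, the lock mSurfaceHolder, and the method name doTouchEvent are my own, mirroring the existing onKeyDown/doKeyDown pair):

@Override
public boolean onTouchEvent(MotionEvent event) {
    // hand the event straight to the game thread, like onKeyDown does
    return thread.doTouchEvent(event);
}

// ...and inside FlxGameView.GameThread:
public boolean doTouchEvent(MotionEvent event) {
    synchronized (mSurfaceHolder) {
        // from here you can forward event.getX()/event.getY() to the
        // current game state, which knows about the player
        return true;
    }
}

From the game state you can then call the player's isCollision function with the forwarded coordinates.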
I'm new to Android and trying to learn how intercepting touch events works using this documentation, but I do not understand that calculateDistanceX(MotionEvent) method at all, and the more I searched for it, the less I found. I'm completely confused.
Could anyone please explain it to me?
Thanks in advance.
It says that the method calculateDistanceX() is left as an exercise for the reader. One solution would be to cache the first MotionEvent; the distance is then mFirstMotionEvent.getX() - currentMotionEvent.getX().
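A minimal sketch of that idea, assuming you are inside the ViewGroup subclass from that documentation page (mFirstMotionEvent is a field you would add yourself):

private MotionEvent mFirstMotionEvent;

@Override
public boolean onInterceptTouchEvent(MotionEvent ev) {
    if (ev.getActionMasked() == MotionEvent.ACTION_DOWN) {
        // obtain() makes a copy, because the framework recycles MotionEvents
        mFirstMotionEvent = MotionEvent.obtain(ev);
    }
    // ... the rest of the documentation's intercept logic ...
    return false;
}

private float calculateDistanceX(MotionEvent ev) {
    // distance travelled along X since the gesture started
    return mFirstMotionEvent.getX() - ev.getX();
}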
I am trying to add a button which, when pressed, shares the user's high score to the chosen app (Facebook, Twitter, messaging, etc.).
I have been searching online and I just could not find a clear tutorial on how to do this. It may be because I am not so familiar with terms such as bindings, so I cannot understand these advanced tutorials. Can anyone explain clearly how I could do this?
Thanks!
P.S. It would be great for the share button to work on both Android and iOS.
For Facebook, check out gdx-facebook.
It's a cool project that integrates libgdx with Facebook's API. I discovered it recently, and so far so good.
Please note: there seem to be some Gradle issues with this project when using the Eclipse IDE. I've been using it with Android Studio, and it's working fine.
You'll need to have a binding on the user interacting with the button which uses the API of the corresponding service to make a post. It's sufficiently complicated, and there are so many ways of doing it, that I doubt you'll find any one answer to this problem.
The basic part of reacting to a tap or a click will look something like this:
ClickListener shareFacebook = new ClickListener() {
    @Override
    public boolean touchDown(InputEvent event, float x, float y, int pointer, int button) {
        // Interact with the Facebook API here
        return true; // returning true means this listener will also receive the touchUp event
    }
};
shareButton.addListener(shareFacebook);
In this example, the ClickListener is a listener-type object which will react to the following events: enter, exit, touchDown, touchDragged, and touchUp. The binding is the function we define inside the listener. So if we define a function for touchDown, then whenever someone touches or clicks the entity we attach this to, the function we defined there will run.
Similarly for the other events: enter fires on the mouse moving into the element's clickable region, exit on the mouse moving out of it, touchDragged on the mouse moving while being pressed, and touchUp on the mouse being released.
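As a side note (not part of the snippet above): ClickListener also offers a clicked() convenience callback, which only fires on a completed touchDown/touchUp pair on the same element, and that is often all a share button needs:

shareButton.addListener(new ClickListener() {
    @Override
    public void clicked(InputEvent event, float x, float y) {
        // share the high score here
    }
});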
I have a newbie question. I just started learning about libgdx and I'm a bit confused. I read the documentation/wiki, I followed some tutorials like the one from gamefromscratch and others and I still have a question.
What's the best way to check and do something for a touch/tap event?
I'm using Scenes and Actors, and so far I have found at least 4 ways of interacting with an Actor, let's say:
1) myActor.addListener(new ClickListener(){...});
2) myActor.setTouchable(Touchable.enabled); and putting the code in the act() method
3) verifying Gdx.input.isTouched() in the render() method
4) overriding touchDown, touchUp methods
Any help with some details and suggestions on when to use one over the other, or what the difference between them is, would be much appreciated.
Thanks.
I've always been using the first method and I think from an OOP viewpoint, it's the "best" way to do it.
The second approach will not work. Whether you set an Actor to be touchable or not, Actor.act(float) will still be called whenever you do stage.act(float). That means you would execute your code in every frame.
Gdx.input.isTouched() will only tell you that a touch event has happened anywhere on the screen. It would not be a good idea to try to find out which actor has been hit by that touch, as they are already able to determine that themselves (Actor.hit()).
I'm not sure where you'd override touchDown and touchUp. Actors don't have these methods, so I'm assuming you are talking about a standard InputProcessor. In that case you will have the same problem as with your 3rd approach.
So adding a ClickListener to the actors you want to monitor for these kinds of events is probably the best way to go.
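For reference, a minimal sketch of that first approach (stage and myActor are placeholders for your own objects):

myActor.addListener(new ClickListener() {
    @Override
    public void clicked(InputEvent event, float x, float y) {
        // the stage has already determined that myActor itself was hit
    }
});
Gdx.input.setInputProcessor(stage); // without this, the stage never receives touch events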
I have a watch face built upon the latest API (extending CanvasWatchFaceService.Engine). Now I'd like to receive touch events, so I can have a kind of active area in the watch face that can be touched to open settings, etc.
A CanvasWatchFaceService.Engine inherits from WallpaperService.Engine, which declares two important methods: setTouchEventsEnabled(boolean) and onTouchEvent(MotionEvent).
But even when I call setTouchEventsEnabled(true) in onCreate() method of the Engine I never receive a call to onTouchEvent(MotionEvent) in my engine implementation.
Am I doing something wrong, or is it simply not possible? I'm aware of some watch faces that offer active areas, but I'm not sure if these are built upon the latest API or upon the deprecated one (using layouts and GUI elements).
Use the Interactive Watch Face API in the new Android Wear 1.3:
http://developer.android.com/training/wearables/watch-faces/interacting.html
http://android-developers.blogspot.hu/2015/08/interactive-watch-faces-with-latest.html
UPDATE:
Add just one line to the init code in the onCreate method of your CanvasWatchFaceService.Engine-based class:
setWatchFaceStyle(new WatchFaceStyle.Builder(mService)
        .setAcceptsTapEvents(true)
        // other style customizations
        .build());
And manage the tap events under
@Override
public void onTapCommand(int tapType, int x, int y, long eventTime) { }
Notes:
you'll have to add your own code to find which touch targets (if any) were tapped (tapType == TAP_TYPE_TAP); see the sketch after these notes
you can only detect single, short taps, no swipes or long presses.
by detecting TAP_TYPE_TOUCH and TAP_TYPE_TOUCH_CANCEL you can kind of guess swipe gestures, though TAP_TYPE_TOUCH_CANCEL doesn't provide the exit coordinates.
For more control, your only bet is to add a SYSTEM_ALERT layer and toggle it on/off based on information you can gather from onVisibilityChanged and onAmbientModeChanged...
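For illustration, a minimal sketch of such tap handling (mSettingsBounds is a hypothetical Rect describing your active area):

@Override
public void onTapCommand(int tapType, int x, int y, long eventTime) {
    switch (tapType) {
        case TAP_TYPE_TAP: // the only type that represents a completed tap
            if (mSettingsBounds.contains(x, y)) {
                // open your settings screen here
            }
            break;
        case TAP_TYPE_TOUCH:        // finger went down
        case TAP_TYPE_TOUCH_CANCEL: // the gesture turned into a swipe/scroll
        default:
            break;
    }
}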
It is not possible. I believe I read this in a thread from the Android developers group on Google Plus.
The watch face is meant to be a "static" thing, just there, just showing the time. Touch events are never dispatched to it.
The suggested pattern for settings is to implement them in a companion app on the phone.
I want to develop a simple Android application: while the user holds down a button, an index is incremented, until the button is released. Which button events do I have to use?
Thanks
Use an OnTouchListener. And as the others said, you should accept answers to some of your previous questions.
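A minimal sketch of the press-and-hold pattern (mIndex and the 100 ms repeat rate are my own placeholders):

final Handler handler = new Handler(); // created on the main thread
final Runnable increment = new Runnable() {
    @Override
    public void run() {
        mIndex++;
        handler.postDelayed(this, 100); // keep incrementing while held
    }
};
button.setOnTouchListener(new View.OnTouchListener() {
    @Override
    public boolean onTouch(View v, MotionEvent event) {
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:
                handler.post(increment);
                return true;
            case MotionEvent.ACTION_UP:
            case MotionEvent.ACTION_CANCEL:
                handler.removeCallbacks(increment);
                return true;
        }
        return false;
    }
});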
This code implements an open-source prototype, the Android HCI Extractor, which tracks and monitors the user and system interaction events in multimodal applications (e.g. touches, key presses, scrolling, number of elements provided by the system, etc.).
Android HCI Extractor code: http://code.google.com/p/android-hci-extractor/
In the source code you can find the class /android.hci.extractor/src/org/mmi/android/instrumentation/filters/NavigationInputFilter.java, in which one method is aimed at handling the touch-press, touch-move and touch-release events. The method is called "public boolean onTouch (View v, MotionEvent event)", and the MotionEvent is the extra information that describes the action made with the finger (see the sketch below).
In this class you can also find other methods aimed at handling clicks and item clicks.
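The exact implementation in the project may differ, but in outline such an onTouch method tells the three gestures apart through the MotionEvent's action, something like:

@Override
public boolean onTouch(View v, MotionEvent event) {
    switch (event.getActionMasked()) {
        case MotionEvent.ACTION_DOWN: // touch-press
            break;
        case MotionEvent.ACTION_MOVE: // touch-move
            break;
        case MotionEvent.ACTION_UP:   // touch-release
            break;
    }
    return false; // observe only; do not consume the event
}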
Further information about the MIM project (including tutorials about the tool integration and usage): http://www.catedrasaes.org/trac/wiki/MIM
I hope it helps you!!