I have a watch face built upon the latest API (extending CanvasWatchFaceService.Engine). Now I'd like to receive touch events so I can have a kind of active area in the watch face that can be tapped to open settings, etc.
A CanvasWatchFaceService.Engine inherits from WallpaperService.Engine, which declares two important methods: setTouchEventsEnabled(boolean) and onTouchEvent(MotionEvent).
But even when I call setTouchEventsEnabled(true) in the onCreate() method of the Engine, I never receive a call to onTouchEvent(MotionEvent) in my engine implementation.
Am I doing something wrong, or is it simply not possible? I'm aware of some watch faces that offer active areas, but I'm not sure whether they are built upon the latest API or upon the deprecated one (using layouts and GUI elements).
Use the Interactive Watch Face API in the new Android Wear 1.3:
http://developer.android.com/training/wearables/watch-faces/interacting.html
http://android-developers.blogspot.hu/2015/08/interactive-watch-faces-with-latest.html
UPDATE:
Add just one line to the initialization in the onCreate() method of your CanvasWatchFaceService.Engine-based class:
setWatchFaceStyle(new WatchFaceStyle.Builder(mService)
        .setAcceptsTapEvents(true)
        // other style customizations
        .build());
And handle the tap events in
@Override
public void onTapCommand(int tapType, int x, int y, long eventTime) { }
Notes:
you'll have to add your own code to determine which touch target (if any) was tapped (tapType == TAP_TYPE_TAP)
you can only detect single, short taps, not swipes or long presses.
by detecting TAP_TYPE_TOUCH and TAP_TYPE_TOUCH_CANCEL you can roughly infer swipe gestures, though TAP_TYPE_TOUCH_CANCEL doesn't provide the exit coordinates.
For more control, your only option is to add a SYSTEM_ALERT window and toggle it on/off based on information you can gather from onVisibilityChanged and onAmbientModeChanged...
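The hit-testing mentioned in the first note can be sketched in plain Java. This is only a sketch: the target names and bounds below are made up for illustration; on-device you would call resolveTap(x, y) from onTapCommand() and would likely use android.graphics.Rect for the bounds.

```java
// Hypothetical sketch of resolving a TAP_TYPE_TAP coordinate to a touch
// target. Plain ints are used instead of android.graphics.Rect so the
// logic stands alone; the two areas are invented for illustration.
public class TapTargets {
    // Returns true when (x, y) falls inside the given bounds
    // (left/top inclusive, right/bottom exclusive, like Rect.contains()).
    static boolean hit(int x, int y, int left, int top, int right, int bottom) {
        return x >= left && x < right && y >= top && y < bottom;
    }

    // Maps a tap to a target name, or null when nothing was hit.
    static String resolveTap(int x, int y) {
        if (hit(x, y, 0, 0, 160, 160)) return "settings";   // hypothetical settings area
        if (hit(x, y, 160, 0, 320, 160)) return "weather";  // hypothetical weather area
        return null;
    }
}
```

From onTapCommand you would then launch the settings activity (or whatever the target does) when resolveTap returns the matching name.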
It is not possible. I believe I read this in a thread of the Android developers group on Google Plus.
The watch face is meant to be a "static" thing, just there, just showing the time. Touch events are never dispatched to it.
The suggested pattern for settings is to implement them in a companion app on the phone.
Related
I am trying to add a button which, when pressed, shares the user's high score to the chosen app (Facebook, Twitter, messaging, etc.).
I have been searching online and just could not find a clear tutorial on how to do this. It may be because I am not familiar with terms such as bindings, so I cannot follow the advanced tutorials. Can anyone explain clearly how I could do this?
Thanks!
P.S. It would be great for the share button to work on both Android and iOS.
For Facebook, check out gdx-facebook.
It's a cool project that integrates libgdx with Facebook's API. I discovered it recently, and so far so good.
Please note: there seem to be some Gradle issues with this project when using the Eclipse IDE. I've been using it with Android Studio, and it's working fine.
You'll need a binding on the user's interaction with the button which uses the API of the corresponding service to make a post. It's sufficiently complicated, and there are so many ways of doing it, that I don't expect you'll find any one answer to this problem.
The basic part of reacting on a tap or a click will look something like this:
ClickListener shareFacebook = new ClickListener() {
    @Override
    public boolean touchDown(InputEvent event, float x, float y, int pointer, int button) {
        // Interact with the Facebook API here
        return true; // consume the touch so touchUp is also delivered
    }
};
shareButton.addListener(shareFacebook);
In this example, the ClickListener is a listener-type object which reacts to the following events: enter, exit, touchDown, touchDragged, and touchUp. The binding is the function we define inside the listener. So if we define a function for touchDown, then whenever someone touches or clicks the entity we attach this to, the function we defined there will run.
Similarly for the other events: enter fires when the mouse moves into the element's clickable region, exit when it moves out, touchDragged when the mouse moves while pressed, and touchUp when the mouse is released.
So, I have an Android application written in Java. I want it to work with TalkBack, the accessibility option for those who are visually impaired. As the user swipes through the activity, TalkBack "focuses" on various components, and I want to know:
1) How can I tell if it's "focused" on a certain component (e.g., a TextView)?
2) When it's "focused" on that component, how can I interrupt it to play my own audio file, then let TalkBack take over again?
Thank you in advance!
---- edit
Just to be a bit clearer, in case you're not familiar with TalkBack...
Once the user enables TalkBack, it reads out everything on the phone screen. If the user wants to select an application, they keep swiping right, left, up, or down until that application's name is highlighted/focused and announced by TalkBack. So, I want to know when a specific TextView is highlighted.
Answer A:
Find the view's View.AccessibilityDelegate.
Then override the following method:
@Override
public boolean onRequestSendAccessibilityEvent(View child, AccessibilityEvent event) {
    if (event.getEventType() == AccessibilityEvent.TYPE_VIEW_ACCESSIBILITY_FOCUSED) {
        /* do your stuff here */
        return true; // let the event through after handling it
    }
    return super.onRequestSendAccessibilityEvent(child, event);
}
You want to look for events of the type TYPE_VIEW_ACCESSIBILITY_FOCUSED.
Answer B: Under WCAG 2.0 Guideline 3.2, it is generally considered poor accessibility to have things happen automatically on focus. Instead of figuring out how to make the accessibility framework do something that is generally considered inaccessible, just don't do it :).
My application has separate algorithms to fetch data on scroll change and on user location change. For location changes I'm using com.google.android.gms.location.LocationListener, which is working fine.
But for user scrolls I'm using mMap.setOnCameraChangeListener(new OnCameraChangeListener()). The issue is that the location updates also trigger the OnCameraChangeListener.
So how do I distinguish the two? Presently I'm using boolean flags to differentiate, but that's unreliable and dirty.
I had the same issue and tried all sorts of things: wrapping the view to intercept touches, wrapping every call that changes the viewport and listening for when it started and stopped changing so I could tell if something else changed it, matching the current viewport against what I last set it to... it's all rather fiddly. Then someone pointed me at this, and it looks promising. Update: it has now worked flawlessly for us for months without a single issue:
map.setOnCameraMoveStartedListener { reasonCode ->
if (reasonCode == GoogleMap.OnCameraMoveStartedListener.REASON_GESTURE) {
// I will no longer keep updating the camera location because
// the user interacted with it. This is my field I check before
// snapping the camera location to the latest value.
tracking = false
}
}
I doubt that there's an easy, reliable, and real-time way to do this. IOW, I suspect that it is going to be "dirty".
If I had to try, I would supply my own LocationSource, so that I knew when the map would be changing based upon position. I would then try to ignore the next call (calls?) to my OnCameraChangeListener, as being ones tied to the location change.
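That "ignore the next call" idea can be sketched as a small counter, independent of the Maps classes. All names here are my own, not part of the Maps API:

```java
// Hypothetical sketch of filtering out camera changes that we caused
// ourselves. Before every programmatic camera move (e.g. from a location
// fix), call expectProgrammaticChange(); in onCameraChange(), only fetch
// data when isUserChange() returns true.
public class CameraChangeFilter {
    private int pendingProgrammaticMoves = 0;

    // Call right before moving the camera yourself.
    void expectProgrammaticChange() { pendingProgrammaticMoves++; }

    // Call from the camera-change callback; true only for user-driven changes.
    boolean isUserChange() {
        if (pendingProgrammaticMoves > 0) {
            pendingProgrammaticMoves--; // this one was ours, swallow it
            return false;
        }
        return true;
    }
}
```

The caveat from the answer above still applies: a single programmatic move may produce more than one callback, so the one-to-one counting here is an assumption that would need checking on-device.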
(BTW, note that LocationListener is deprecated.)
Or, I would try to change my data-fetching algorithm to treat all camera changes equally. After all, if my objective is to make sure that I have data on all sides of the map from the current center, it doesn't matter how I got to that center, just so long as I am there.
So I followed Mathew Casperson's Making Games on Android Tutorial and got a small game running a few days ago, now I am trying to switch the controls to touchscreen instead of the D-pad.
I am running into some problems and was wondering if anyone here could help me. Flixel doesn't have any built-in touchscreen functions, so I am overriding onTouchEvent(MotionEvent event) in my Activity (FlixelDemo.java in the tutorial) and hopefully getting the coordinates of the touch.
I then have a function in my Player.java that, given the touch coordinates, can tell me whether my player has been touched.
The problem I am having is figuring out how to call that function (isCollision) from the activity.
It seems that I can only override onTouchEvent in FlixelDemo.java, and that I can only use the isCollision function in GameState.java, where I add the player.
How do I get the info from the overridden touch event to any of my other classes? Can anyone tell me what I am doing wrong or help me figure out a different way of implementing touch events?
Looking at the code, FlixelDemo is really just a container for org.flixel.FlxGameView (via the res/layout/main.xml file).
The onTouchEvent method can be applied to any View, so you can apply it to just the flixel viewport.
And in fact, that's probably what you want to do here: Add your handler directly to FlxGameView.java, and then let it call a method on the internal GameThread class.
It's already handling the other events this way. See FlxGameView.onKeyDown (and the associated FlxGameView.GameThread.doKeyDown) for a good example.
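One way to wire the overridden touch event through to game objects is a plain listener interface. This is only a sketch of the pattern; none of these names come from Flixel, and on-device the coordinates would come from event.getX()/event.getY() in onTouchEvent():

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch: the view layer forwards raw touch coordinates to
// registered game objects, and each object decides whether it was hit.
interface Touchable {
    boolean isCollision(float x, float y); // true if (x, y) is inside the object
    void onTouched();                      // called when the object was hit
}

class TouchDispatcher {
    private final List<Touchable> targets = new ArrayList<>();

    void register(Touchable t) { targets.add(t); }

    // Called from onTouchEvent(); returns true when some object
    // consumed the touch.
    boolean dispatch(float x, float y) {
        for (Touchable t : targets) {
            if (t.isCollision(x, y)) {
                t.onTouched();
                return true;
            }
        }
        return false;
    }
}
```

Your Player would implement Touchable (its isCollision already exists), GameState would register it, and the view's onTouchEvent would just call dispatch.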
I want to develop a simple Android application: when the user presses a button, an index is incremented until the button is released. Which button events do I have to use?
Thanks
Use an OnTouchListener. And as the others said, you should accept answers to some of your previous questions.
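The press-and-hold counting itself can be sketched independently of the Android classes. This is an assumption-laden sketch: on-device, press()/release() would be driven from MotionEvent.ACTION_DOWN / ACTION_UP inside the OnTouchListener, and tick() from a Handler posting itself periodically; all names below are my own.

```java
// Hypothetical sketch of a counter that runs while a button is held.
public class HoldCounter {
    private int index = 0;
    private boolean pressed = false;

    void press()   { pressed = true;  }  // wire to MotionEvent.ACTION_DOWN
    void release() { pressed = false; }  // wire to ACTION_UP / ACTION_CANCEL

    // Invoked on each timer tick; increments only while held.
    void tick() { if (pressed) index++; }

    int value() { return index; }
}
```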
This code implements an open-source prototype, the Android HCI Extractor, which tracks and monitors user and system interaction events in multimodal applications (e.g. touches, keypresses, scrolls, number of elements provided by the system, etc.).
Android HCI Extractor code: http://code.google.com/p/android-hci-extractor/
In the source code you can find the class /android.hci.extractor/src/org/mmi/android/instrumentation/filters/NavigationInputFilter.java, in which one method handles the touch-press, touch-move and touch-release events. The method is "public boolean onTouch(View v, MotionEvent event)", and the MotionEvent is the extra information that describes the action made with the finger.
In this class you can also find other methods aimed at handling clicks and item clicks.
Further information about the MIM project (including tutorials about the tool integration and usage): http://www.catedrasaes.org/trac/wiki/MIM
I hope it helps you!