Solution for OnPointerExit() Triggering on Touch Release On Android?

I have created some custom radial menu buttons for my Android game. The radial menu displays when the game object with the menu is touched. I'm using mouse events to activate the menu, which works in Unity and also when built to Android. When the menu is active, you can mouse or slide over a menu item to select it. If you then release on that menu item, it will pass the selection to the radial menu, which then takes the appropriate action.
The following works in Unity:
public void OnPointerEnter (PointerEventData eventData)
{
    myMenu.selected = this;
    Debug.Log ("Menu selection is now: " + this.action.ToString ());
    defaultColor = circle.color;
    circle.color = Color.white;
}

public void OnPointerExit (PointerEventData eventData)
{
    myMenu.selected = null;
    Debug.Log ("Menu selection has been nulled out.");
    circle.color = defaultColor;
}
However, it does not work correctly when built to Android. Via some debug testing, I've determined that in Unity, if I activate the menu, mouse over a menu item, and then release the mouse, myMenu.selected is correctly assigned. On Android, however, lifting my finger over the menu item processes a final OnPointerExit, which nulls it out, so the menu never gets a proper selection. The mouse does not behave this way: it doesn't treat the pointer as having exited when the mouse button is released.
I can "solve" this problem by switching everything to touch, but then I cannot test using the Unity player and my mouse. So far, everything via mouse also worked correctly via touch. This is the first issue I've run into. Is there any clean solution for this? Or is the best solution to use macros to determine if I'm in the editor and have separate mouse and touch code for each?

It depends on the behavior you're trying to implement.
Touch input does not have the notion of "hover", so you should probably be using the IPointerClickHandler interface and/or the IPointerDownHandler and IPointerUpHandler interfaces instead.
I recommend separating the hover logic from the touch/click logic by wrapping the hover code in an #if UNITY_STANDALONE preprocessor block.
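A minimal sketch of that split, assuming a menu-item component along the lines of the question's snippet (myMenu, circle, and defaultColor come from the question; the class name RadialMenuItem and the RadialMenu type are placeholders):

using UnityEngine;
using UnityEngine.EventSystems;
using UnityEngine.UI;

public class RadialMenuItem : MonoBehaviour,
    IPointerDownHandler, IPointerEnterHandler, IPointerExitHandler
{
    public RadialMenu myMenu;   // placeholder type for the parent radial menu
    public Image circle;
    private Color defaultColor;

    // A press (mouse or touch) selects the item on every platform.
    public void OnPointerDown (PointerEventData eventData)
    {
        myMenu.selected = this;
    }

    public void OnPointerEnter (PointerEventData eventData)
    {
        myMenu.selected = this;
        defaultColor = circle.color;
        circle.color = Color.white;
    }

    public void OnPointerExit (PointerEventData eventData)
    {
        circle.color = defaultColor;
#if UNITY_STANDALONE || UNITY_EDITOR
        // Only a real mouse should clear the selection on exit;
        // on Android the final touch release also raises OnPointerExit.
        myMenu.selected = null;
#endif
    }
}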

Related

How to detect occurrence of ContextMenu in OS?

I'm using Gluon to develop JavaFX applications for Android, iPhone (and desktop). When I export a test application to my Android phone (Marshmallow 6.0), I cannot hold down on text to access the menu from which you can copy text (the context menu).
(That is just an example of what you can do with a context menu; this is not a question about how to copy text on long hold specifically in Android.)
This was possible on an iPhone 6 when testing it there.
How can I detect whether the device/operating system has a default context menu or not in Java?
On desktop there is a default ContextMenu that is created and installed in TextFieldBehavior (private API). If you don't set your own custom context menu, that is the one used when a ContextMenuEvent is fired (with a right-click event, for instance).
On mobile, both Android and iOS have a ContextMenu as well.
On iOS, it uses a native text field (UITextField). When the long-press event happens, it triggers the default context menu (on my iPad I can see a small magnifying glass, and after that the context menu shows up).
On Android, the JavaFX TextField has a custom skin but shares the same private TextFieldBehavior as the desktop version. The problem in this case is the missing right-click event that would trigger the ContextMenuEvent.
That's why you have to fire a ContextMenuEvent manually, as described in this question.
Conclusion: so far, this is basically required only on Android:
TextField textField = new TextField();
addPressAndHoldHandler(textField, Duration.seconds(1), event -> {
    Bounds bounds = textField.localToScreen(textField.getBoundsInLocal());
    textField.fireEvent(new ContextMenuEvent(ContextMenuEvent.CONTEXT_MENU_REQUESTED,
            0, 0, bounds.getMinX() + 10, bounds.getMaxY() + 10, false, null));
});
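The addPressAndHoldHandler helper isn't shown above; a minimal sketch of one way to write it with a PauseTransition follows (the Wrapper holder class is a trivial assumption here, not necessarily what the linked question uses):

import javafx.animation.PauseTransition;
import javafx.event.EventHandler;
import javafx.scene.Node;
import javafx.scene.input.MouseEvent;
import javafx.util.Duration;

// Trivial mutable holder so the lambda can hand the press event to the timer callback.
class Wrapper<T> { T content; }

private void addPressAndHoldHandler(Node node, Duration holdTime,
                                    EventHandler<MouseEvent> handler) {
    Wrapper<MouseEvent> eventWrapper = new Wrapper<>();
    PauseTransition holdTimer = new PauseTransition(holdTime);
    holdTimer.setOnFinished(event -> handler.handle(eventWrapper.content));

    node.addEventHandler(MouseEvent.MOUSE_PRESSED, event -> {
        eventWrapper.content = event;
        holdTimer.playFromStart();
    });
    // Cancel the hold timer if the press ends or turns into a drag.
    node.addEventHandler(MouseEvent.MOUSE_RELEASED, event -> holdTimer.stop());
    node.addEventHandler(MouseEvent.DRAG_DETECTED, event -> holdTimer.stop());
}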

Simulate Touch Controls Through Code

I'm trying to make it possible to navigate through my Google Glass application by using head gestures. I'm able to recognize head gestures like looking to the right, left, and up. Each has its own method for what to do when the gesture is recognized.
Now I need to simulate the corresponding touch gestures inside each method. So it will think I'm swiping to the left or right which will allow me to navigate through the cards with the head gestures.
Does anyone have any idea on how to actually achieve this?
Edit
I created a quick hello world application to play with. I added my headgesture code and started trying to get the keys working.
I added the following to my onCreate()
Instrumentation instr = new Instrumentation();
Then I added the following lines to each respective headgesture method.
A head gesture upwards should correspond with tapping the touchpad:
instr.sendKeyDownUpSync(KeyEvent.KEYCODE_DPAD_CENTER);
A head gesture to the left should correspond with swiping left on the touchpad:
instr.sendKeyDownUpSync(KeyEvent.KEYCODE_DPAD_LEFT);
A head gesture to the right should correspond with swiping right on the touchpad:
instr.sendKeyDownUpSync(KeyEvent.KEYCODE_DPAD_RIGHT);
They are responding accordingly now, however I'm getting an exception saying:
java.lang.RuntimeException: This method can not be called from the main application thread
The Solution
In the end I went a different direction than the one I mentioned in my edit above.
I found out that it is possible to call touch controls in the shell by using
adb shell input keyevent <keycode here>
I then found a way to use this in Android. I have the following class, named issueKey:
public class issueKey {
    public void issueKey(int keyCode)
    {
        try {
            java.lang.Process p = java.lang.Runtime.getRuntime().exec("input keyevent " + Integer.toString(keyCode) + "\n");
        } catch (Exception e) {
            Log.wtf("IssueKeyError", e.getMessage());
        }
    }
}
Then I simply call the class and pass the keycode for the corresponding gesture
mIssueKey.issueKey(4); // functions as swipe down
Here is the list of keycodes that I tested for anyone that is interested.
Keys for each respective button/gesture
4: Swipe Down
21: Swipe Left
22: Swipe Right
23: Tap
24: Volume Up
25: Volume Down
26: Lock/Unlock Screen
27: Camera Button
However, what I'm wondering now is: which would be best practice, getting the solution I mentioned in my edit to work by using an AsyncTask, or the solution I'm currently using?
Using the Instrumentation class would work if you use a separate thread to call the sendKeyDownUpSync method from.
This can be done using the following steps:
Create and start a thread from your activity
In the run method, use the Looper class and create a Handler as explained here
Every time you want to call sendKeyDownUpSync, post a Runnable instance to the Handler which calls sendKeyDownUpSync in its run method (see the sketch below).
A similar code sample (not from me) is available here
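A minimal sketch of that approach, using a HandlerThread as a convenient way to get a background Looper and Handler (the class name KeySender and the thread name are placeholders):

import android.app.Instrumentation;
import android.os.Handler;
import android.os.HandlerThread;

public class KeySender {
    private final Instrumentation instrumentation = new Instrumentation();
    private final Handler handler;

    public KeySender() {
        // HandlerThread prepares its own Looper, so we can post work to it.
        HandlerThread thread = new HandlerThread("key-sender");
        thread.start();
        handler = new Handler(thread.getLooper());
    }

    public void send(final int keyCode) {
        handler.post(new Runnable() {
            @Override
            public void run() {
                // Safe here: this does not run on the main application thread.
                instrumentation.sendKeyDownUpSync(keyCode);
            }
        });
    }
}

Each head-gesture method would then call, for example, keySender.send(KeyEvent.KEYCODE_DPAD_LEFT) instead of calling Instrumentation directly.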

Android TV PlaybackControlsRow

So I'm building a new app specifically for the Android TV interface (Lollipop Leanback), and I'm using the PlaybackOverlayFragment provided by the framework, which has a PlaybackControlsRow with all the usual controls on it.
The problem is that the default behavior is for the user to have to click the "Play" button to start the video, and I want it to start automatically. That part is easy and I have it working, but then the Play/Pause icons on the provided control are out of sync (showing play when it should show pause) because playback was started outside of the events of clicking on that control.
Documentation is sparse on these framework elements and examining the class I can't find any public method that would allow me to put this control in the proper "mode" or tell it to display the play or pause icon myself.
Anyone with experience with these yet that would know how to do this?
In order to change the state of the button, even after adding your Actions to the Adapter, you'll need to notify the adapter that holds your Action of the change.
mPlayPauseAction.nextIndex(); // next index: if it was pause, it'll be play
notifyChanged(mPlayPauseAction);

// where notifyChanged(Action action) is:
private void notifyChanged(Action action) {
    ArrayObjectAdapter adapter = mPrimaryActionsAdapter; // reference to your adapter
    if (adapter.indexOf(action) >= 0) {
        adapter.notifyArrayItemRangeChanged(adapter.indexOf(action), 1);
        return;
    }
}
Well, I partially answered my own question.
If I know, before the PlaybackControlsRow is created, that I want to set it to the pause state (actually the playing state, but showing the pause button), then calling setIndex(PlayPauseAction.PAUSE) on the PlayPauseAction before adding it to the controls row works.
It doesn't appear that I can modify it myself after adding it but that may be something else I'm doing wrong.
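A minimal sketch of that working path inside the fragment, assuming the usual Leanback primary-actions setup (the variable names here are placeholders):

PlaybackControlsRow controlsRow = new PlaybackControlsRow();
ArrayObjectAdapter primaryActionsAdapter =
        new ArrayObjectAdapter(new ControlButtonPresenterSelector());
controlsRow.setPrimaryActionsAdapter(primaryActionsAdapter);

PlaybackControlsRow.PlayPauseAction playPauseAction =
        new PlaybackControlsRow.PlayPauseAction(getActivity());
// Playback starts automatically, so show the pause icon from the start.
playPauseAction.setIndex(PlaybackControlsRow.PlayPauseAction.PAUSE);
primaryActionsAdapter.add(playPauseAction);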

JQuery Mobile swipe event only firing every other swipe

I have set up a swipeleft event in my app to move between fields of a form. All of the fields are dynamically generated, so I'm not swapping between pages, I'm clearing and re-generating all the DOM elements. The problem is the swipe event only fires every other time I swipe on the page or if I touch or tap anything on the page.
Here's the code that sets up the events:
$(document).delegate("#scorePage", "pageshow", function() {
    $.event.special.swipe.scrollSupressionThreshold = 10;
    $.event.special.swipe.horizontalDistanceThreshold = 30;
    $.event.special.swipe.durationThreshold = 500;
    $.event.special.swipe.verticalDistanceThreshold = 75;
    $('#divFoo').on("swipeleft", swipeLeftHandler);
    $('#divFoo').on("swiperight", swipeRightHandler);
    tableCreate(traits[0].keyboardID);
});
For context, tableCreate is putting a dynamically generated table into divFoo that contains information a user can pick from. Here's the event code itself:
function swipeLeftHandler() {
    $("#divFoo").empty();
    traitIndex++;
    tableCreate(traits[traitIndex].keyboardID);
}
Why is my swipe event only firing every other time there is a swipe on the page?
Primarily testing on Android right now, if that makes a difference.
Edit: I'm using jQuery Mobile version 1.4.4.
I figured out a way around this problem by simply rolling my own implementation of these events. There's some sample code on how to do something similar here:
https://snipt.net/blackdynamo/swipe-up-and-down-support-for-jquery-mobile/
If anyone else uses this code to fix the same problem, be aware that the article implements swipeup and swipedown, so you will have to adapt it (a rough adaptation is sketched below). In the end, I'm not entirely sure about the differences between this code and the actual implementations of swipeleft and swiperight, but this works consistently, so I'm cutting my losses and going with it.
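For reference, a rough sketch of that kind of hand-rolled swipeleft/swiperight detection, delegated so it keeps working while #divFoo is emptied and rebuilt (the 30px threshold is arbitrary; swipeLeftHandler and swipeRightHandler are the handlers from the question):

var touchStartX = null;

$(document).on("touchstart", "#divFoo", function (e) {
    touchStartX = e.originalEvent.changedTouches[0].pageX;
});

$(document).on("touchend", "#divFoo", function (e) {
    if (touchStartX === null) { return; }
    var deltaX = e.originalEvent.changedTouches[0].pageX - touchStartX;
    touchStartX = null;
    // Treat a horizontal move of more than 30px as a swipe.
    if (deltaX < -30) {
        swipeLeftHandler();
    } else if (deltaX > 30) {
        swipeRightHandler();
    }
});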

What DOM events are available to WebKit on Android?

I'm building a mobile web app targeting Android users. I need to know what DOM events are available to me. I have been able to make the following work, but not terribly reliably:
click
mouseover
mousedown
mouseup
change
I have not been able to get the following to work:
keypress
keydown
keyup
Does anyone know the full list of what is supported and in what contexts (e.g., is onchange only available to form inputs?)? I can't find a reference for this on The Googles.
Thanks!
Update: I asked the same question on the Android developers list. I will be doing some more testing and will post my results both here and there.
OK, this is interesting. My use case is that I have a series of links (A tags) on a screen in a WebKit view. To test what events are available, using jQuery 1.3.1, I attached every event listed on this page (even ones that don't make sense) to the links, then used the up, down, and enter controls on the Android emulator and noted which events fired in which circumstances.
Here is the code I used to attach the events, with results to follow. Note, I'm using "live" event binding because for my application, the A tags are inserted dynamically.
$.each([
'blur',
'change',
'click',
'contextmenu',
'copy',
'cut',
'dblclick',
'error',
'focus',
'keydown',
'keypress',
'keyup',
'mousedown',
'mousemove',
'mouseout',
'mouseover',
'mouseup',
'mousewheel',
'paste',
'reset',
'resize',
'scroll',
'select',
'submit',
// W3C events
'DOMActivate',
'DOMAttrModified',
'DOMCharacterDataModified',
'DOMFocusIn',
'DOMFocusOut',
'DOMMouseScroll',
'DOMNodeInserted',
'DOMNodeRemoved',
'DOMSubtreeModified',
'textInput',
// Microsoft events
'activate',
'beforecopy',
'beforecut',
'beforepaste',
'deactivate',
'focusin',
'focusout',
'hashchange',
'mouseenter',
'mouseleave'
], function () {
    $('a').live(this, function (evt) {
        alert(evt.type);
    });
});
Here's how it shook out:
On first page load with nothing highlighted (no ugly orange selection box around any item), using down button to select the first item, the following events fired (in order): mouseover, mouseenter, mousemove, DOMFocusIn
With an item selected, moving to the next item using the down button, the following events fired (in order): mouseout, mouseover, mousemove, DOMFocusOut, DOMFocusIn
With an item selected, clicking the "enter" button, the following events fired (in order): mousemove, mousedown, DOMFocusOut, mouseup, click, DOMActivate
This strikes me as a bunch of random garbage. And, who's that cheeky IE-only event (mouseenter) making a cameo, then taking the rest of the day off? Oh well, at least now I know what events to watch for.
It would be great if others want to take my test code and do a more thorough run through, perhaps using form elements, images, etc.
Since this is the second most popular Android + JavaScript post on SO (which is just a sad commentary on the state of web development targeting the Android platform), I thought it may be worthwhile including a link to pkk's touch event test results at http://www.quirksmode.org/mobile/tableTouch.html and also http://www.quirksmode.org/mobile/ in general.
As of Android 1.5, the same touch(start|move|end|cancel) events that the iPhone supports work in Android as well.
One problem I found was that touchmove events get queued up. No workaround yet.
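For reference, a minimal sketch of listening for those touch events on an element (the element id target is a placeholder):

var el = document.getElementById('target');
var types = ['touchstart', 'touchmove', 'touchend', 'touchcancel'];
for (var i = 0; i < types.length; i++) {
    (function (type) {
        el.addEventListener(type, function (evt) {
            // evt.touches lists the currently active touch points.
            console.log(type + ': ' + evt.touches.length + ' active touch(es)');
        }, false);
    })(types[i]);
}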
