I'm looking at the Support4Demos from the Android SDK, and I'm not sure if this is working as intended. Specifically, I'm referring to accessibility focus when using the Widget > Explore by Touch Helper demo. If I tap on one of the blue regions to give it accessibility focus and then tap on the on-screen home button, focus switches to the home button. But if I then tap on the same blue region as before, nothing happens (tapping the other blue region, however, does transfer focus away from the home button). I'm testing this on a Kindle with on-screen navigation buttons.
Am I correct in assuming that focus should transfer back to the blue region? If so, what is the proper way to handle this? Presumably there is a way to know that the user tapped on something outside of the app, so that in response I could clear state in the ExploreByTouchHelper, but I'm not sure how I would go about that.
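One possible direction, purely as a hedged sketch: if window focus loss is a usable signal that the user tapped the system navigation bar, the hosting view could tell the helper that accessibility focus on the current virtual view was cleared, so the same region can take focus again later. This assumes your version of ExploreByTouchHelper exposes getAccessibilityFocusedVirtualViewId(); if it does not, you would have to track the focused virtual view id yourself. The androidx package name is used here, while the Support4Demos original lives in the support-v4 package.

    import android.content.Context;
    import android.view.View;
    import android.view.accessibility.AccessibilityEvent;

    import androidx.customview.widget.ExploreByTouchHelper;

    public class RegionsView extends View {

        private ExploreByTouchHelper touchHelper; // the demo's helper, attached elsewhere

        public RegionsView(Context context) {
            super(context);
        }

        @Override
        public void onWindowFocusChanged(boolean hasWindowFocus) {
            super.onWindowFocusChanged(hasWindowFocus);
            // Assumption: losing window focus means the user interacted with something
            // outside the app, e.g. the on-screen navigation buttons.
            if (!hasWindowFocus && touchHelper != null) {
                int focusedId = touchHelper.getAccessibilityFocusedVirtualViewId(); // assumed getter
                if (focusedId != ExploreByTouchHelper.INVALID_ID) {
                    // Report that the virtual view no longer holds accessibility focus,
                    // then invalidate it so the helper refreshes its node info.
                    touchHelper.sendEventForVirtualView(
                            focusedId, AccessibilityEvent.TYPE_VIEW_ACCESSIBILITY_FOCUS_CLEARED);
                    touchHelper.invalidateVirtualView(focusedId);
                }
            }
        }
    }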
I have a very simple UI with one Entry control for entering a phone number and a button. The Entry has a handler for removing the border around it. When the Entry gets focus, the keyboard pops up. But when I tap outside the Entry, such as on an empty area of the screen, the keyboard does not dismiss and the Entry does not lose focus. Also, since the button is at the bottom of the screen, the soft keyboard hides it and there is no way to tap it; the button can only be tapped after I press the Android device's back button.
At present, I have not checked this behavior on an iOS device.
This was not a problem in Xamarin.Forms, though. I searched a lot on the Internet and found that it is currently a bug in MAUI.
I tried attaching a tap gesture to the parent layout control and invoking platform-specific code to hide the keyboard, but it seems the Entry does not lose focus, and in turn the tap gesture event is never called.
However, the Entry should lose focus automatically when I tap outside it (anywhere on the screen), and the soft keyboard should dismiss automatically.
Please provide a workaround if there is one.
This is a known bug. Removing focus from the Entry sometimes helps; sometimes you need to disable and then re-enable the control in sequence. (I go with the second approach.)
If you want, you can read this for example:
https://github.com/dotnet/maui/issues/12002
(The most disturbing part, considering this has been a known bug for over half a year, is this comment from the issue:)
"We can leave the behavior how this is for now in NET7 and provide an API in NET8 that lets users toggle this behavior on/off for iOS and Android."
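For what it's worth, the platform-specific code mentioned in the question boils down to a couple of standard Android calls. The workaround in a MAUI app would be written in C#; the Java below is only a minimal sketch of the underlying Android mechanism (hide the IME, then clear focus), not a MAUI-specific fix.

    import android.content.Context;
    import android.view.View;
    import android.view.inputmethod.InputMethodManager;

    public final class KeyboardUtil {

        private KeyboardUtil() {}

        // Hides the soft keyboard and clears focus from the given view
        // (e.g. the EditText backing the MAUI Entry).
        public static void hideKeyboard(View focusedView) {
            InputMethodManager imm = (InputMethodManager)
                    focusedView.getContext().getSystemService(Context.INPUT_METHOD_SERVICE);
            if (imm != null) {
                imm.hideSoftInputFromWindow(focusedView.getWindowToken(), 0);
            }
            focusedView.clearFocus();
        }
    }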
Is there any way to enable onFocus, onBlur, etc. on Touchable elements on non-TV devices?
The Touchable elements' focus events are not triggered, but the default feedback is visible when I use the physical keyboard.
I need simple dialogs on a Zebra MC3300. This device has a physical keyboard, and when I show a dialog with a yes/no question, one button should have focus when the dialog becomes visible.
I'm not sure; maybe ref.current?.focus() works. When I press Enter, onPress is triggered, but I don't get any visible feedback about which element has focus. Only when I use the hardware buttons to move to the next element and back again does the button change to a different opacity.
Many thanks to you.
Recently I was able to create tablet software for my daughter, who has cerebral palsy, so she can "talk" to me, since she can't speak.
Well, a friend of mine has Amyotrophic Lateral Sclerosis (ALS). He can move just one finger, and he is willing to use my daughter's software to be able to "talk" again.
Since he can move just one finger, I created a version where each item "blinks" (in yellow) for a short time (one second), and if he presses a mouse button while an item is highlighted, that item is activated.
See below:
It works if you leave the mouse over the black portion of the screen. If the mouse is over one of the buttons, it won't work; it will just click the button the mouse is over.
Also, if he uses a keyboard and presses the ENTER key, it activates the first button; then, if he presses the left arrow and presses ENTER again, it activates the second button, and so on.
So, I wish I could create a generic event handler so that when he presses the ENTER key, only the currently selected (yellow) item is activated.
Any ideas?
Well, you could make two different modes: one where there are click listeners for each picture, and another where a click anywhere on the screen is recognized (e.g. find the top-level layout and set a listener on it). Then, in the second mode, use
http://developer.android.com/reference/android/view/KeyEvent.html
to detect KeyEvents such as Enter, and handle them depending on the highlighted View.
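As a minimal sketch of that second mode, assuming the existing scanning timer already tracks which button is highlighted (the field and list names below are hypothetical), the activity can intercept ENTER before it reaches whichever view happens to have input focus:

    import android.app.Activity;
    import android.os.Bundle;
    import android.view.KeyEvent;
    import android.widget.Button;

    import java.util.ArrayList;
    import java.util.List;

    public class ScanningActivity extends Activity {

        private final List<Button> scanButtons = new ArrayList<>(); // buttons the highlight cycles through
        private int highlightedIndex = 0;                           // updated by the existing "blink" timer

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            // ... set the content view, fill scanButtons, and start the highlight timer here ...
        }

        @Override
        public boolean dispatchKeyEvent(KeyEvent event) {
            // Intercept ENTER (and the d-pad center key) before it reaches the focused view,
            // so only the currently highlighted item gets activated.
            if (event.getAction() == KeyEvent.ACTION_DOWN
                    && (event.getKeyCode() == KeyEvent.KEYCODE_ENTER
                        || event.getKeyCode() == KeyEvent.KEYCODE_DPAD_CENTER)) {
                if (!scanButtons.isEmpty()) {
                    scanButtons.get(highlightedIndex).performClick();
                }
                return true; // consume the event
            }
            return super.dispatchKeyEvent(event);
        }
    }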
Is there a way to find out when a user taps the "Settings" option on the on-screen menu?
The OnKeyListener's onKeyDown events catch only the hardware buttons on the phone and not the keys of the software keyboard that shows up when a textbox/EditText gets focus and the soft keypad appears.
Is this even possible using the public Android SDK?
P.S.: I am only concerned with Android 2.2 and 2.3, so it's fine if this is not possible on 3.0 and above.
Thnx
EDIT
Here is an explanation of a scenario that will help in understanding the question better:
I have a full-screen activity with an EditText and a button. I want to intercept all the clicks a user makes and make some decisions based on them.
I am able to register a listener to intercept which physical keys are being pressed (HOME, MENU, VOLUME UP/DOWN, etc.). The problem is, when the user taps on the EditText, i.e. the text box gets focus, the software keypad shows up. Now I also want to intercept which keys (numbers, letters, special characters, or even custom functions on some Samsung Android phones like 'Go To Settings') are pressed and perform actions based on those clicks.
My question is: is this possible, and if yes, then how?
NOTE: Please don't ask me why I am doing this because it's bad user experience; I am very much aware of that. I am doing this in a particular context that needs this functionality. Thnx!
You need to use the KeyListener class and setKeyListener
http://developer.android.com/reference/android/widget/TextView.html
This only allows you to modify/filter input into the TextView.
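As a rough sketch of that approach (with the caveat that soft keyboards/IMEs are not guaranteed to deliver their taps as key events, although many did on 2.2/2.3), a KeyListener can wrap the default one and observe or filter keys before they reach the TextView; the class name here is just a placeholder:

    import android.text.Editable;
    import android.text.method.KeyListener;
    import android.text.method.TextKeyListener;
    import android.util.Log;
    import android.view.KeyEvent;
    import android.view.View;

    public class LoggingKeyListener implements KeyListener {

        // Delegate to the default text key listener so normal editing keeps working.
        private final KeyListener delegate =
                TextKeyListener.getInstance(false, TextKeyListener.Capitalize.NONE);

        @Override
        public int getInputType() {
            return delegate.getInputType();
        }

        @Override
        public boolean onKeyDown(View view, Editable text, int keyCode, KeyEvent event) {
            // Observe (or filter) the key before it is applied to the TextView's text.
            Log.d("LoggingKeyListener", "keyCode=" + keyCode);
            return delegate.onKeyDown(view, text, keyCode, event);
        }

        @Override
        public boolean onKeyUp(View view, Editable text, int keyCode, KeyEvent event) {
            return delegate.onKeyUp(view, text, keyCode, event);
        }

        @Override
        public boolean onKeyOther(View view, Editable text, KeyEvent event) {
            return delegate.onKeyOther(view, text, event);
        }

        @Override
        public void clearMetaKeyState(View view, Editable content, int states) {
            delegate.clearMetaKeyState(view, content, states);
        }
    }

It would be attached with editText.setKeyListener(new LoggingKeyListener());.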
I am trying to create a menu that behaves much like the Android on-screen keypad: when press-holding a key with accented characters, like "i", a popup menu of the accented characters is displayed. Sliding the finger onto the button of any accented character within the popup highlights/selects that button, and finally releasing the finger inputs the accented character into the target textbox. I have seen this behavior in Android v2.3, but not in older versions; I'm not sure if this is a new feature.
I would like to know how to handle the touch gesture so that the main button (e.g. "i") reacts to the press-and-hold touch events, and then, after the popup appears, the button for "ï" reacts to the slide and release touch events. The user never releases the finger in between, so the motion is a press-hold-slide-release across the two buttons.
I have tried to find the code used in the Android keypad but was not successful; I only got directed to CharacterPickerDialog, which does not implement the press-hold-slide-release between two buttons, since it requires the user to release the finger before selecting a button in the popup menu.
Hope anyone can provide some info and insights on this. Thanks in advance!
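One way to think about it, as a hedged sketch rather than the actual platform keyboard code: the view that consumes ACTION_DOWN keeps receiving the rest of the gesture (ACTION_MOVE and ACTION_UP) even after the finger leaves its bounds, so the key button's touch listener can show the popup after a long-press delay and then hit-test the later events against the popup's accent buttons. The AccentPopup abstraction below is hypothetical; it stands in for whatever popup window and commit logic you use.

    import android.os.Handler;
    import android.view.MotionEvent;
    import android.view.View;

    public class AccentKeyTouchListener implements View.OnTouchListener {

        // Hypothetical popup abstraction: shows the accent row ("ì í î ï") and maps
        // raw screen coordinates to its buttons.
        public interface AccentPopup {
            void show();
            void highlightAt(float rawX, float rawY);
            void commitAt(float rawX, float rawY);
            void dismiss();
        }

        private final Handler handler = new Handler();
        private final AccentPopup popup;
        private boolean popupShown = false;

        private final Runnable showPopup = new Runnable() {
            @Override
            public void run() {
                popup.show();
                popupShown = true;
            }
        };

        public AccentKeyTouchListener(AccentPopup popup) {
            this.popup = popup;
        }

        @Override
        public boolean onTouch(View key, MotionEvent event) {
            // Because the key consumed ACTION_DOWN, it keeps receiving ACTION_MOVE and
            // ACTION_UP for the whole gesture, even once the finger is over the popup.
            switch (event.getAction()) {
                case MotionEvent.ACTION_DOWN:
                    handler.postDelayed(showPopup, 500); // long-press threshold (ms)
                    return true;
                case MotionEvent.ACTION_MOVE:
                    if (popupShown) {
                        popup.highlightAt(event.getRawX(), event.getRawY());
                    }
                    return true;
                case MotionEvent.ACTION_UP:
                case MotionEvent.ACTION_CANCEL:
                    handler.removeCallbacks(showPopup);
                    if (popupShown) {
                        popup.commitAt(event.getRawX(), event.getRawY()); // insert "ï", etc.
                        popup.dismiss();
                        popupShown = false;
                    } else {
                        key.performClick(); // a plain tap inserts the base character
                    }
                    return true;
            }
            return false;
        }
    }

The key design point is that the popup itself never needs to handle touch events; the original key drives the highlight and the final commit for the whole gesture.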