In Android Studio, using Java, you can set accessibility focus on a certain view after, for example, pressing a button, with myview.sendAccessibilityEvent(AccessibilityEvent.WINDOWS_CHANGE_ACCESSIBILITY_FOCUSED). In Xamarin, using C#, the closest thing I found is myview.SendAccessibilityEvent(EventTypes.ViewAccessibilityFocused); but it doesn't work well: it only reads out the accessibility name of myview without actually moving the focus there; the focus remains on the button that was pressed. I also tried myview.SendAccessibilityEvent(EventTypes.ViewFocused); but that doesn't work either.
Another thing I found is myview.SendAccessibilityEventUnchecked(AccessibilityEvent.WindowsChangeAccessibilityFocused); but that produces the error Argument 1: cannot convert from 'Android.Views.Accessibility.WindowsChange' to 'Android.Views.Accessibility.AccessibilityEvent', even though SendAccessibilityEventUnchecked is documented to take an AccessibilityEvent as its argument.
Any idea how to solve this, or how to set accessibility focus on a certain view after some event?
(Note, if it matters: I am just testing this on native Android for now; later I will need to use it in a Forms project through a DependencyService.)
WINDOWS_CHANGE_ACCESSIBILITY_FOCUSED is a subtype of TYPE_WINDOWS_CHANGED, which should only be dispatched by the system. And accessibility focus won't change just because you send an accessibility event.
Actually, sending an accessibility event triggers the Accessibility Service, and you can change the accessibility focus in the service's OnAccessibilityEvent method:
public override void OnAccessibilityEvent(AccessibilityEvent e)
{
    // You can use the event type and source node to check whether the event was sent by you.
    AccessibilityNodeInfo sourceNode = e.Source;
    if (sourceNode == null)
        return;
    AccessibilityNodeInfo targetNode = sourceNode.FocusSearch(FocusSearchDirection.Down);
    targetNode?.PerformAction(Android.Views.Accessibility.Action.AccessibilityFocus);
}
See more information about Accessibility Focus and Input Focus here.
Problem solved: the right way to achieve this is myview.SendAccessibilityEvent(EventTypes.ViewHoverEnter), which has the same integer value as the Android constant WINDOWS_CHANGE_ACCESSIBILITY_FOCUSED.
Another way is myview.SendAccessibilityEvent((EventTypes)(int)WindowsChange.AccessibilityFocused); you need the double cast, first to int and then to EventTypes, because WindowsChange is a different enum from EventTypes.
I'll leave the GitHub issue I opened, where I got these answers: git issue.
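The double cast works because C# enum members are just named integer values, so two different enums can overlap on the same underlying constant. As a rough plain-Java illustration of that mapping (the member names and the 0x80 value here mirror the Android constants but are assumptions for this sketch, not taken from the Xamarin bindings):

```java
public class EnumValueDemo {
    // Hypothetical mirror of the two Xamarin enums; both members wrap 0x80.
    enum EventTypes {
        ViewHoverEnter(0x80);
        final int value;
        EventTypes(int value) { this.value = value; }
    }

    enum WindowsChange {
        AccessibilityFocused(0x80);
        final int value;
        WindowsChange(int value) { this.value = value; }
    }

    // Java has no direct enum-to-enum cast, so the lookup has to go through
    // the shared integer explicitly -- exactly what the C# double cast
    // (EventTypes)(int)WindowsChange.AccessibilityFocused does implicitly.
    static EventTypes fromInt(int raw) {
        for (EventTypes e : EventTypes.values()) {
            if (e.value == raw) return e;
        }
        throw new IllegalArgumentException("unknown event type: " + raw);
    }

    public static void main(String[] args) {
        System.out.println(fromInt(WindowsChange.AccessibilityFocused.value)); // ViewHoverEnter
    }
}
```

This is also why the compiler rejects passing WindowsChange where EventTypes is expected: the values coincide, but the types do not.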
Related
I am creating a custom keyboard for a Xamarin Android application. I have it working, and my listener receives the input. My issue is with the parameter Android.Views.Keycode. This enumeration does not cover all possible keys. I have found that some of them generate the correct keystroke if the KeyEvent is created with MetaKeyStates.ShiftOn, but even then I have not found the right combination for the { and } keys. There also doesn't appear to be a clear answer as to which key or key/meta-state combination maps to Backspace, Next, Done, etc.
I have not found documentation that shows which Keycode, in conjunction with which required meta state, generates which keystroke. Does such documentation exist? Does anyone have an exhaustive example showing which Keycode and which meta states create which characters?
Also, in the case of my keyboard, the ! and ? characters appear on screen with the normal text. Do I need a custom Keycode for them so I can tell them apart from the character that would have the same Keycode without a meta state?
Why are you using keycodes? Those are for hardware buttons. Software keyboards usually use InputConnection.commitText and skip keycodes entirely.
Next, Done, etc. are the editor action button; that's another call on InputConnection: performEditorAction.
Delete is generally done with InputConnection.deleteSurroundingText.
Shift is generally internal state and not connected to any keycode.
You're basically approaching everything the wrong way. Here's the Android reference; I assume Xamarin has its wrappers: https://developer.android.com/reference/android/view/inputmethod/InputConnection
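To make the commitText/deleteSurroundingText flow concrete, here is a minimal sketch. The FakeInputConnection class is a hand-rolled stand-in invented for illustration; the real InputConnection interface lives in android.view.inputmethod and has many more methods:

```java
public class SoftKeyboardDemo {
    // Hand-rolled stand-in for android.view.inputmethod.InputConnection,
    // modeling the text buffer a soft keyboard commits into.
    static class FakeInputConnection {
        final StringBuilder text = new StringBuilder();

        // A soft keyboard commits characters as text, not as keycodes.
        void commitText(CharSequence s, int newCursorPosition) {
            text.append(s);
        }

        // Backspace is modeled as deleting text before the cursor.
        void deleteSurroundingText(int before, int after) {
            int len = text.length();
            text.delete(Math.max(0, len - before), len);
        }
    }

    public static void main(String[] args) {
        FakeInputConnection ic = new FakeInputConnection();
        // '!' and '?' are committed as plain text -- no keycodes,
        // no meta states, no custom mappings needed.
        ic.commitText("a", 1);
        ic.commitText("!", 1);
        ic.commitText("?", 1);
        ic.deleteSurroundingText(1, 0); // backspace removes the '?'
        System.out.println(ic.text);    // a!
    }
}
```

Note how the punctuation question from above dissolves entirely in this model: since the keyboard commits characters rather than keycodes, there is nothing to disambiguate.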
So, I have an Android application written in Java. I want it to be usable with TalkBack, the accessibility option for those who are visually impaired. As the user swipes through the activity, TalkBack "focuses" on various components, and I want to know this:
1) How do I know when it's "focused" on a certain component (e.g., a TextView)?
2) When it's "focused" on that component, how do I interrupt it to play my own audio file and then let TalkBack take over again?
Thank you in advance!
---- edit
Just to be a bit clearer, in case you're not familiar with TalkBack...
Once the user enables TalkBack, it reads out everything on the phone screen. If the user wants to select an application, they keep swiping right, left, up, or down until that application's name is highlighted/focused and announced by TalkBack. So, I want to know when a specific TextView is highlighted.
Answer A:
Find the view's View.AccessibilityDelegate.
Then override the following method:
@Override
public boolean onRequestSendAccessibilityEvent(ViewGroup host, View child, AccessibilityEvent event) {
    if (event.getEventType() == AccessibilityEvent.TYPE_VIEW_ACCESSIBILITY_FOCUSED) {
        // do your stuff here
        return true;
    } else {
        return super.onRequestSendAccessibilityEvent(host, child, event);
    }
}
You want to look for events of type TYPE_VIEW_ACCESSIBILITY_FOCUSED.
Answer B: Under WCAG 2.0 Guideline 3.2, it is generally considered poor accessibility to have things happen automatically on focus. Instead of figuring out how to make the accessibility framework do something that is generally considered inaccessible, just don't do it :).
I have a watch face built on the latest API (extending CanvasWatchFaceService.Engine). Now I'd like to receive touch events so I can have a kind of active area in the watch face that can be touched to open settings, etc.
A CanvasWatchFaceService.Engine inherits from WallpaperService.Engine, which declares two important methods: setTouchEventsEnabled(boolean) and onTouchEvent(MotionEvent).
But even when I call setTouchEventsEnabled(true) in the onCreate() method of the Engine, I never receive a call to onTouchEvent(MotionEvent) in my engine implementation.
Am I doing something wrong, or is it simply not possible? I'm aware of some watch faces that offer active areas, but I'm not sure whether those are built on the latest API or on the deprecated one (using layouts and GUI elements).
Use the Interactive Watch Face API in the new Android Wear 1.3:
http://developer.android.com/training/wearables/watch-faces/interacting.html
http://android-developers.blogspot.hu/2015/08/interactive-watch-faces-with-latest.html
UPDATE:
Add just one line to the initialization code in the onCreate method of your CanvasWatchFaceService.Engine-based class:
setWatchFaceStyle(new WatchFaceStyle.Builder(mService)
        .setAcceptsTapEvents(true)
        // other style customizations
        .build());
And handle the tap events under
@Override
public void onTapCommand(int tapType, int x, int y, long eventTime) { }
Notes:
You'll have to add your own code to find which touch targets (if any) were tapped (tapType == TAP_TYPE_TAP).
You can only detect single, short taps; no swipes or long presses.
By detecting TAP_TYPE_TOUCH and TAP_TYPE_TOUCH_CANCEL you can roughly guess swipe gestures, though TAP_TYPE_TOUCH_CANCEL doesn't provide the exit coordinates.
For more control, your only bet is to add a SYSTEM_ALERT layer and toggle it on/off based on information you can gather from onVisibilityChanged and onAmbientModeChanged...
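Since onTapCommand only hands you raw coordinates, the hit testing mentioned in the notes has to be done by hand. A minimal sketch, where the SETTINGS_AREA rectangle and the method name are hypothetical, to be called from onTapCommand when tapType == TAP_TYPE_TAP:

```java
public class TapTargets {
    // Hypothetical "settings" hot spot: left, top, right, bottom in pixels.
    static final int[] SETTINGS_AREA = {200, 200, 280, 280};

    // Returns true when the tap coordinates fall inside the hot spot.
    static boolean isSettingsTap(int x, int y) {
        return x >= SETTINGS_AREA[0] && x < SETTINGS_AREA[2]
            && y >= SETTINGS_AREA[1] && y < SETTINGS_AREA[3];
    }

    public static void main(String[] args) {
        System.out.println(isSettingsTap(220, 230)); // true: inside the area
        System.out.println(isSettingsTap(10, 10));   // false: outside
    }
}
```

With several hot spots, you would loop over a list of such rectangles and dispatch on whichever one contains the tap.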
It is not possible; I believe I read that in a thread in the Android developers group on Google+.
The watch face is meant to be a "static" thing, just there, showing the time. Touch events are never dispatched to it.
The suggested pattern for settings is to implement them in a companion app on the phone.
I am writing an accessibility service. I've been trying to focus EditText fields in a way that pops up the software keyboard, and I have been unsuccessful. I can get the cursor into the EditText field, but the soft keyboard doesn't pop up. Here, editTextNode is an AccessibilityNodeInfo obtained from various accessibility events; isEditable, isFocusable, isVisibleToUser, and isClickable are all true, and all of the actions below return true upon completion.
editTextNode.performAction(AccessibilityNodeInfo.ACTION_CLICK);
In my mind the above should simply work, and the fact that it does not is a bug in the Accessibility API, if not for my entire Android version then at least on my device (Droid Ultra, Android 4.4.4). I have also tried:
editTextNode.performAction(AccessibilityNodeInfo.ACTION_FOCUS);
This puts focus into the field so I can see the input cursor, but no keyboard.
editTextNode.performAction(AccessibilityNodeInfo.ACTION_ACCESSIBILITY_FOCUS);
This doesn't really do anything unless TalkBack is enabled.
editTextNode.performAction(AccessibilityNodeInfo.ACTION_SET_SELECTION, someBundle);
Along with appropriate arguments this works, but only if there is already text in the EditText field. I need to pull the keyboard up on empty text fields as well. Yes, I tried passing 0,0 for the start and end selection arguments; the SET_SELECTION action only works when there is text in the field.
This has been frustrating me for a couple of days; hopefully you can help, though I believe I've stumbled onto a bug and will have to wait for an Android update, because the ACTION_CLICK action should really be all that is needed. But I could be missing something silly; the Accessibility API documentation is somewhat scant. I'm willing to try anything.
To restate @alanv's comment in answer form:
You are correct, it is a bug in accessibility services prior to API 21.
I have had a problem for a long time and have not found a way to solve it.
My app has a WebView that loads a URL where the user can fill in certain text inputs. The app also produces sounds in certain situations (to help/assist the users). The problem is that sometimes, when the user is typing in an input and a sound is triggered, the soft keyboard hides because the input loses its focus.
This is quite annoying for users, because they have to touch the input again to continue typing.
Any ideas about how to solve it?
I was confused by this same issue, but I learned that it is common practice to call search.blur() (assuming we are talking about a search text input field) when a window.resize event is triggered on desktop. Sometimes that same code causes issues on mobile, because the virtual keyboard triggers a resize event as well. If this is the case, you may have to use something like Modernizr to determine whether you are in a mobile device context, and if so, not call search.blur(). Search your code base for the blur() pattern to see if that is what is happening for you.
To fix, try something like this:
window.onresize = function () {
    // On mobile, showing the virtual keyboard also fires a resize,
    // so skip the blur there to keep the keyboard open.
    if (!isMobile) {
        search.blur();
    }
};