I currently retrieve the root node of the active window with getRootInActiveWindow(). Afterwards, I perform a breadth first search to get a list of all nodes.
My questions:
How can I traverse this list of nodes to get the focus order? Is this list ordered according to the nodes' focus order already? Is there a different approach of retrieving the focus order?
Thanks in advance.
Ultimately, no. You cannot, and the accessibility API documentation on this is very misleading.
One might think that the "focusSearch(int direction)" function of AccessibilityNodeInfo would be the way to go. But ultimately it is broken, and if you dig back into the innards of the Android Accessibility APIs you end up finding a function that's documented as such:
Searches for the nearest view in the specified direction that can take the input focus.
COOL, just what we want, right? HOLD ON A SECOND. This function calls a function:
AccessibilityInteractionClient.getInstance().focusSearch(mConnectionId, mWindowId, mSourceNodeId, direction);
And this function is documented as such:
Finds the accessibility focused {@link android.view.accessibility.AccessibilityNodeInfo}. The search is performed in the window whose id is specified and starts from the node whose accessibility id is specified.
Disappointingly, you will also find that this function does neither of those jobs (directional focus search, or directional accessibility focus search) well.
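For illustration, the naive approach would look roughly like this from inside an AccessibilityService - a minimal sketch, and the class name and log tag are just placeholders:

import android.accessibilityservice.AccessibilityService;
import android.util.Log;
import android.view.View;
import android.view.accessibility.AccessibilityEvent;
import android.view.accessibility.AccessibilityNodeInfo;

public class FocusOrderService extends AccessibilityService {
    @Override
    public void onAccessibilityEvent(AccessibilityEvent event) {
        AccessibilityNodeInfo root = getRootInActiveWindow();
        if (root == null) return;
        // The "obvious" call: ask the framework for the next input-focusable node.
        // As described above, in practice this does not behave reliably.
        AccessibilityNodeInfo next = root.focusSearch(View.FOCUS_FORWARD);
        if (next != null) {
            Log.d("FocusOrder", "focusSearch returned: " + next.getViewIdResourceName());
        }
    }

    @Override
    public void onInterrupt() {
    }
}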
Ultimately, when an element gains accessibility focus in TalkBack because it received input focus, TalkBack is responding to the focus event coming from the OS and NOT to any calculation it is doing on input focus ordering.
In order to do this, you would basically have to reimplement the OS's entire Tab-ordering logic yourself, with less information than the system itself has, because the system has real android.widget.View objects to deal with, not fake AccessibilityNodeInfo objects with limited information. Any way you come up with to do this is going to be exceptionally fragile, and not guaranteed to line up with what the system actually does.
Your inclination to try a breadth-first traversal of "input focusable" nodes (the focusable property of AccessibilityNodeInfo) is a pretty reasonable approach. The problem is that the real system Tab ordering is going to be a subset of that, ignoring things like elements existing on different windows and such.
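A minimal sketch of that breadth-first traversal, under the assumption that "input focusable and visible to the user" is a good enough filter (the class and method names are mine, not from the question):

import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.Deque;
import java.util.List;

import android.view.accessibility.AccessibilityNodeInfo;

public final class FocusCandidates {
    // Breadth-first walk of the node tree, collecting nodes that report
    // themselves as input-focusable. This is only an approximation of the
    // real system Tab order, for the reasons discussed below.
    public static List<AccessibilityNodeInfo> collect(AccessibilityNodeInfo root) {
        List<AccessibilityNodeInfo> result = new ArrayList<>();
        if (root == null) return result;
        Deque<AccessibilityNodeInfo> queue = new ArrayDeque<>();
        queue.add(root);
        while (!queue.isEmpty()) {
            AccessibilityNodeInfo node = queue.removeFirst();
            if (node.isFocusable() && node.isVisibleToUser()) {
                result.add(node);
            }
            for (int i = 0; i < node.getChildCount(); i++) {
                AccessibilityNodeInfo child = node.getChild(i);
                if (child != null) {
                    queue.addLast(child);
                }
            }
        }
        return result;
    }
}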
Seriously, open up the TalkBack settings screen, turn on TalkBack, connect a hardware keyboard, go crazy on the tab and shift tab keys, and marvel at how you can't put focus on the "Settings" button. Your traversal function would get this wrong, because the "Settings" button is focusable but somehow, you can't put focus there. The reason why digs into some serious AOSP nonsense that I will omit, and no, I'm not sure if the information that would keep this corner case from coming up in your theoretical algorithm is available to AccessibilityServices.
To summarize, the breadth-first traversal can theoretically work, but there are going to be a ton of corner cases, and no, I'm not sure if the information to deal with each corner case correctly is available to AccessibilityServices. Also, notably, because of this particular issue, if you'd like the solution to be correct across 4.4, 5.0 and 7.0+ devices, you will have to have different variations of this calculation for different OS versions. YEP, those corner cases are going to be different! (Cue evil laughter.)
My honest recommendation... give up.
So I have a requirement around accessibility wherein a button needs to be read out when it comes in focus and then another text to be read out after user clicks on it.
For example,
An "OK" button when focussed should read "OK" but when a user taps on it, it should read out some other text eg. "Navigating to the other page".
Is there a way in Android to implement this?
I have not been able to find anything around it.
You can use View#announceForAccessibility(CharSequence) to make a general announcement - so in your OnClickListener get a reference to some View (e.g. your Button, it doesn't matter what it is) and call that on it.
Like it says in the docs, this is a convenience function that creates a very general "something is getting announced for no specific reason" event - you might want to give more context, like creating a TYPE_VIEW_CLICKED event instead. This might be more helpful to the user (depending on how the accessibility service handles it) and could provide a better experience, since stuff that's read out is prioritised depending on what it is. I don't have time to get into it here, but it's something you can investigate if you want.
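As a rough sketch of the simple announcement approach from above (the button id and the announcement string are placeholders for your own):

import android.view.View;
import android.widget.Button;

// Somewhere in your Activity, after the layout is inflated:
Button okButton = (Button) findViewById(R.id.ok_button);
okButton.setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View v) {
        // Ask whatever accessibility service is running to read this out,
        // then carry on with the actual navigation.
        v.announceForAccessibility("Navigating to the other page");
    }
});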
Also I'm not sure if this is what you mean, but just in case - if the user focuses your button, it should say "[OK] button, double tap to [some description]". The bits in brackets you can customise, the rest is standard description for a Button in the UI. You shouldn't change this to just say "OK".
That predictable and consistent system is there for a reason, to help partially sighted and blind people understand exactly what's going on with the app they're using. It might sound clunky at times, but it's meant to be functional, not slick. So we shouldn't try to get around it and make it "sound better" by removing important info and context that some people really need. I don't know if that's what you meant, but it's always worth mentioning!
In my Android app I have a custom layout that is being used as a button - it consists of some TextViews and an ImageView, additionally it has some gradient background.
I'm now aligning my app to conform to the accessibility rules. In order to do so, I would need to convert this layout into a button, so that TalkBack can correctly indicate that this whole layout is clickable and serves as a button.
I know that on iOS there is a possibility to set the UIAccessibilityTraits to treat such view as a button - this kind of solution would save me a huge amount of work in terms of migration.
Is there any similar solution on Android for that? What approach should I follow in order to make this layout recognized correctly by TalkBack?
No, there's no concept of accessibility traits on Android - but you can still get a good accessibility experience without needing to specifically convert your layout into a Button.
Generally, it's most important that TalkBack (or whatever accessibility service is being used - remember, it's not just TalkBack) is able to detect that the widget is clickable and to be able to read a coherent description of what it does. The additional information that it's a button, specifically, isn't super useful, especially because there are so many different kinds of UI elements that it's often a very ambiguous question whether something even is a button.
You can test this by selecting it in TalkBack and confirming that it reads the content description properly, says something along the lines of "Double tap to activate," and performs the correct action when you double tap.
If it's not correct, make sure the content description, clickable flag, and click action are set correctly on the widget's AccessibilityNodeInfo.
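If you do need to adjust what the node reports, a sketch along these lines should cover it; the view id and description string are placeholders, and reporting the Button class name via an AccessibilityDelegate is optional:

import android.view.View;
import android.view.accessibility.AccessibilityNodeInfo;
import android.widget.Button;

View layout = findViewById(R.id.custom_button_layout);
layout.setClickable(true);
layout.setContentDescription("Open details");
layout.setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View v) {
        // Your existing click handling; having a listener is what gives the
        // node its click action.
    }
});
layout.setAccessibilityDelegate(new View.AccessibilityDelegate() {
    @Override
    public void onInitializeAccessibilityNodeInfo(View host, AccessibilityNodeInfo info) {
        super.onInitializeAccessibilityNodeInfo(host, info);
        // Only if you really want it announced as a button; clickable plus
        // a good content description is usually enough.
        info.setClassName(Button.class.getName());
    }
});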
There is an element I want TalkBack to skip when reading the screen out loud.
I can set its contentDescription to null.
I also read about ImportantForAccessibility, which indicates whether an element is visible to the Accessibility API or not.
Which other APIs are there?
Is it cleaner to use ImportantForAccessibility=false over contentDescription=null?
ImportantForAccessibility=false is used to hide any element from the accessibility tree, including buttons, content, etc.
contentDescription=null is only useful for things like images (there may be other items I can't think of) that you want to hide, as otherwise the Accessibility Tree will do its best to find a suitable name for the item.
A prime example would be an ImageButton - if you use contentDescription=null then it will still announce 'button' and the destination / button text. If you use ImportantForAccessibility=false it would hide that item completely from the accessibility tree.
The best analogy I can come up with (if you are familiar with Web Standards) is that contentDescription is like an alt attribute or aria-labelledby attribute and ImportantForAccessibility=false is similar to aria-hidden="true".
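In code, the two options look roughly like this (the view ids are placeholders; the same things can be set in the layout XML):

import android.view.View;
import android.widget.ImageView;

// Hide a purely decorative element from the accessibility tree entirely.
View divider = findViewById(R.id.divider);
divider.setImportantForAccessibility(View.IMPORTANT_FOR_ACCESSIBILITY_NO);

// Keep the element in the tree, but give it no label of its own, so the
// service does not try to invent a name for the image.
ImageView decorativeImage = (ImageView) findViewById(R.id.decorative_image);
decorativeImage.setContentDescription(null);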
One thing I would caution you on - other than decorative items you should not really be hiding items from the accessibility tree, just be careful that you are not giving a different experience to screen reader users (you didn't specify your use case, just wanted to hammer that point home).
Final thing - try it with TalkBack, testing it on a device is the quickest way to know if you got it right!
I have an Android app using PhoneGap 1.6 and Sencha Touch 1.1.1. One view has a text input field which behaves oddly in Android: it duplicates itself and it is very difficult to remove focus.
I have determined that it is a WebTextView coming up over top of the "real" input field. The problem I am having with this is that blurring the text field with screen taps is extremely difficult, and if you scroll the parent container, the WebTextView does not scroll with it, so you can see both at the same time.
The only way to remove focus on the element is to tap furiously all over parts of the screen, much like triggering the frustration detector from Mavis Beacon.
My actual question is: how can I turn off this functionality completely, or at least work around it? It is not reasonable to expect the user to do anything other than single-tap outside of the box, or press the Back button on the device to stop input in the text field. As it is, pressing Back simply stows the soft keyboard and does not give up focus.
These are browser bugs, triggered by certain CSS properties.
To explain the bug:
The device creates some kind of "screenshot" of the website's content. All transformations and transitions are performed on top of these "screenshots" of the actual page.
If you have input elements, some kind of proxy element is rendered on top of the "screenshot". Sometimes these proxy elements end up at the wrong position.
This happens if the website triggers hardware acceleration. You have to drop some CSS definitions:
transform(), translate(), transform3d(), translate3d().
The bad news:
You cannot solve this problem, because it is a bug within the browser.
I have several different Android devices; they all have different problems, and one fix will break another device.
I think the bug will never be fixed, because no one has cared about the embedded browser since Android 4.1 and Chrome.
If you can disable hardware acceleration, this may help.
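If you want to try that from the Android side rather than in CSS, one option (untested against PhoneGap 1.6 specifically, and the view id is a placeholder) is to force the WebView into software rendering:

import android.view.View;
import android.webkit.WebView;

// Software rendering avoids the "screenshot"/proxy-layer behaviour described
// above, at the cost of scrolling and animation performance.
WebView webView = (WebView) findViewById(R.id.webview);
webView.setLayerType(View.LAYER_TYPE_SOFTWARE, null);

Alternatively, android:hardwareAccelerated="false" on the activity or application element in the manifest disables hardware acceleration wholesale.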
The good news:
There are rumors about an embeddable Chrome WebView.
I started to write some blog posts about "the new IE":
http://christian-kuetbach.de/blog/post/14
I have a custom view for which I want the user to be able to enter characters from an app-defined set of characters. To do this, as I understand it, I need to write an input method service. The user not only needs to install it, but then needs to enable the IME in the Settings > Language & keyboard, and then select the custom IME for use in the view.
This seems really crazy. I want this IME to be used for just one view in one application. I don't want it to be available system-wide or force the user to make global setting changes.
The only alternative I can see is defining my own in-app custom view and simulate an IME (probably a full-screen one) when the view gains focus. Isn't there anything better?
I do not think IMEs are conceived for that kind of task. Their purpose is to allow user input in a standardized way that can be used across multiple applications from different vendors.
My strategy would be similar to what you are thinking:
prevent the soft keyboard from appearing,
intercept the menu button key press to show your own instead,
add a custom layout (probably a GridView or a TableView inside a RelativeLayout with bottom gravity)
use an OnItemClickListener
send the required KeyEvents to the root View. If the characters are invented, the KeyCodes do not even need to relate to ASCII characters. You just intercept the code and use it at will.
Sorry I can't give you the option you asked for, but this alternative does not seem to be much more work than creating a whole new IME.
Edit: upon reading the related question, it makes sense to use android.inputmethodservice.KeyboardView instead of reinventing the wheel with the GridView.
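To make that concrete, here is a rough sketch of the in-app approach - suppressing the system keyboard for the field and inserting characters from a simple grid of keys. It inserts text directly rather than synthesizing KeyEvents, needs API 21+ for setShowSoftInputOnFocus, sticks with the plain GridView variant for brevity (KeyboardView would replace the grid part), and every id, the character set and the class name are placeholders:

import android.app.Activity;
import android.os.Bundle;
import android.view.View;
import android.widget.AdapterView;
import android.widget.ArrayAdapter;
import android.widget.EditText;
import android.widget.GridView;

public class CustomInputActivity extends Activity {
    // The app-defined character set - placeholder values.
    private static final String[] SYMBOLS = {"\u03b1", "\u03b2", "\u03b3", "\u03b4"};

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_custom_input);

        final EditText field = (EditText) findViewById(R.id.custom_field);
        // Keep focus and cursor behaviour, but stop the system IME appearing.
        field.setShowSoftInputOnFocus(false);

        GridView keys = (GridView) findViewById(R.id.key_grid);
        keys.setAdapter(new ArrayAdapter<String>(this,
                android.R.layout.simple_list_item_1, SYMBOLS));
        keys.setOnItemClickListener(new AdapterView.OnItemClickListener() {
            @Override
            public void onItemClick(AdapterView<?> parent, View view, int position, long id) {
                // Insert the tapped symbol at the current cursor position.
                int cursor = Math.max(field.getSelectionStart(), 0);
                field.getText().insert(cursor, SYMBOLS[position]);
            }
        });
    }
}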