Android TalkBack element type announcement

I am trying to make my app more accessible, but I am having a hard time finding helpful material because there isn't much documentation (at least I could not find it).
In my app, TalkBack does not announce the element type for ImageViews. What I basically want is for TalkBack to announce my contentDescription for the ImageView and follow it up with ", Image".
The Android accessibility guide (https://developer.android.com/guide/topics/ui/accessibility/apps.html) states: "Many accessibility services, such as TalkBack and BrailleBack, automatically announce an element's type after announcing its label, so you shouldn't include element types in your labels. For example, "submit" is a good label for a Button object, but "submitButton" isn't a good label." However, it does not specify which element types it announces and which it does not.
Does anyone know whether TalkBack announces "Image" after the contentDescription for ImageViews?
When does TalkBack announce a link as a "Link"? Or is it the developer's responsibility to add it at the end of the contentDescription? Can I make TalkBack announce clickable text as a "Link"?
Any help/information/pointers are greatly appreciated. Thanks in advance.

A: Don't add stuff to the end of the content description. It is an accessibility violation and in almost ALL circumstances just makes things less accessible (I will explain more later).
B: A lot of contextual information is communicated to TalkBack users via earcons (bips, beeps, etc.), so you may just not be noticing it.
C: Yes, this is confusing and difficult to determine. No, images are not announced, though I think this is for good reason. For example, an image with a click listener may just be a fancy styled button. iOS has traits you can change for this; Android has omitted this highly useful feature, so we're stuck with odd workarounds. The ideal solution would be for the Accessibility APIs to allow the developer to communicate this information.
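One workaround along these lines, if you have androidx.core available, is to report a different class name through an accessibility delegate so that services treat, say, a clickable image as a button. A minimal sketch, under that assumption (the function name is made up):

    import android.view.View
    import android.widget.Button
    import androidx.core.view.AccessibilityDelegateCompat
    import androidx.core.view.ViewCompat
    import androidx.core.view.accessibility.AccessibilityNodeInfoCompat

    // Sketch: a clickable ImageView that is really acting as a button.
    // Reporting Button's class name lets services announce a button role;
    // this is a workaround, not an official "trait" API.
    fun announceAsButton(view: View) {
        ViewCompat.setAccessibilityDelegate(view, object : AccessibilityDelegateCompat() {
            override fun onInitializeAccessibilityNodeInfo(
                host: View,
                info: AccessibilityNodeInfoCompat
            ) {
                super.onInitializeAccessibilityNodeInfo(host, info)
                info.className = Button::class.java.name
            }
        })
    }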
As for links, inline links in text views are typically announced (basically anything Android detects and underlines automatically), but otherwise they are not. So, in practice, A LOT of links are missed.
Now, as for when you should and should not supply this information yourself: if unsure, just don't, and you'll obtain a reasonably high level of accessibility by following the above guidelines. In fact, the considerations below are really just fighting Android OS limitations, and those are Android's problem! However, the Android accessibility ecosystem is very weak, and wanting to provide a higher level of accessibility is understandable, though difficult. In your attempts you can actually end up working against yourself. Let me explain:
In accessibility there is a line between providing information and maintaining consistency. By adding contextual information to a content description you are walking that line. What if Google said, "We're not going to share contextual information; add it yourself!"?
You would have buttons in music players across different music playing apps that announce in TalkBack like this:
App1: "Play, Button"
App2: "Play, Actionable"
App3: "Play, Clickable"
Do we have consistency here? Now a final example!
App4: "Play, Double Tap to click if you're on TalkBack, Hit enter if you're a keyboard user, use your select key for SwitchAccess users....."
Notice how complicated App4's Play button is. This illustrates that the information TalkBack uses is NOT JUST FOR TALKBACK. The accessibility information from your app is consumed by a multitude of accessibility services. When you "hack" contextual information onto a content description, sure, it might sound better for a TalkBack user, but what have you done to BrailleBack users? To Switch Access users? In general, the content description of an element should describe the element, and leave contextual information for TalkBack to calculate and for users to figure out given proximity to other controls.
TO ANSWER YOUR TWO PARTICULAR ISSUES (Images and Links):
What I recommend for images is to make it obvious in the content description that the thing you're describing is an image.
Let's say you have a picture of a kitten.
BAD: A kitten, Image
GOOD: A picture of a kitten.
See, here, if TalkBack fails to announce this as an image (which it will fail to do), the user still gets the idea that it is a picture. You have added contextual information to the content description in a way that ACTUALLY BETTER DESCRIBES THE CONTROL. This is actually the most accessible solution! Go figure.
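In code, that's all it amounts to; a tiny sketch (the view name is made up):

    import android.widget.ImageView

    // Sketch: the kitten example from above; "kittenImage" is hypothetical.
    fun describeKitten(kittenImage: ImageView) {
        // BAD: bakes the role word into the label.
        // kittenImage.contentDescription = "A kitten, Image"
        // GOOD: the description itself conveys that it is a picture.
        kittenImage.contentDescription = "A picture of a kitten"
    }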
Links:
For links this gets a bit tricky. There is a great debate in Accessibility about what constitutes a link vs a button. In the web browser world, this is a good debate. However, in native mobile I think we have a very clear definition:
Any button that, when activated, takes you away from your current Application Context is a link.
The fact that the Context (Capital C Context!!!) is going to change significantly when a user interacts with a control is EXCEPTIONALLY important information.
TalkBack fails to recognize links A LOT, so for this very important piece of information, if you find that TalkBack is not sharing it, go ahead and append ", link" to the content description for that element. THIS IS THE ONLY EXCEPTION TO THIS RULE I HAVE FOUND, but I believe it is a good exception. Reason being: YES, it does add a violation or two for other assistive technologies, but it conveys important enough information to justify doing so. YOU CANNOT create a WCAG 2.0 compliant application with a reasonably complex user interface using the Android Accessibility APIs. They have too many limitations; you simply can't do everything you need to do without "hacks". So we have to make judgement calls sometimes.
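A sketch of that one exception (the view and the Play Store destination are made-up examples; the point is only that the control leaves the current app, so we append ", link" ourselves):

    import android.content.Intent
    import android.net.Uri
    import android.view.View

    // Sketch: a hypothetical "Rate this app" control that opens the Play
    // Store, i.e. takes the user out of the current application Context.
    // TalkBack will not announce it as a link, so we append ", link".
    fun setUpRateUsLink(rateUsView: View) {
        rateUsView.contentDescription = "Rate this app, link"
        rateUsView.setOnClickListener { v ->
            val uri = Uri.parse("market://details?id=" + v.context.packageName)
            v.context.startActivity(Intent(Intent.ACTION_VIEW, uri))
        }
    }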

Related

Trigger Android talkback programmatically

I've always looked to provide good accessibility support in my apps, but in my latest work I've needed to build parts of the UI within a native OpenGL view, which means that there are no Android widgets or components on which one can hang a contentDescription or similar.
In theory, it shouldn't be too hard to implement "talkback" functionality regardless: just check whether TalkBack is switched on when the user taps a button; if it's the first tap, speak the talkback text, and if it's a second consecutive tap, do what the button would normally do. But - somewhat to my surprise - there do not seem to be any APIs for this, or at least no documentation for them that I could find.
Has anyone tried to work with the Android TalkBack functionality directly (or tried alternative solutions)? I'd hate to abandon those users who benefit from TalkBack if there is a way to work around its seemingly limited implementation.
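For the "check whether TalkBack is switched on" part of that idea, the framework's AccessibilityManager can at least tell you whether a touch-exploration screen reader is running; a minimal sketch (the GL-side button handling is assumed to live elsewhere):

    import android.content.Context
    import android.view.accessibility.AccessibilityManager

    // Sketch: touch exploration being enabled is the usual signal that a
    // screen reader such as TalkBack is active.
    fun isScreenReaderActive(context: Context): Boolean {
        val am = context.getSystemService(Context.ACCESSIBILITY_SERVICE)
                as AccessibilityManager
        return am.isEnabled && am.isTouchExplorationEnabled
    }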

Android Accessibility : characters announced twice

Precondition: accessibility (TalkBack) is on.
Problem: while typing characters from the soft keyboard into the EditText, the characters are announced twice (I think once by the keyboard and once by the EditText).
Problem: Sighted users don't understand/empathize well with the types of information that blind users need. They just use a keyboard with TalkBack on, without actually closing their eyes and experiencing it the way a blind or partially sighted user would.
Solution: Close your eyes and actually use the keyboard the same way a TalkBack user would. You will discover that there are actually two relevant pieces of information here: the key that you're about to select and the key that you've actually selected.
Real solution: Let Android do its thing unless you REALLY know what you're doing. More often than not, well-intentioned developers just muck things up when they start trying too hard. Provide content descriptions for your non-text controls, hook up some nice data associations when necessary, and otherwise leave things alone.
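A tiny sketch of the kind of data association mentioned above, assuming a TextView label sitting next to an EditText (the view names are made up, and the EditText needs an id):

    import android.widget.EditText
    import android.widget.TextView

    // Sketch: with labelFor set, accessibility services can announce the
    // label text when the field gains focus, instead of the field alone.
    fun associateLabel(nameLabel: TextView, nameField: EditText) {
        nameLabel.labelFor = nameField.id
    }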

Implementing accessibility in viewpager android to have a "swipe next for more" kind of thing

I am implementing a ViewPager which shows multiple hints for a particular piece of functionality. When I activate the TalkBack feature, it reads the content of the current view. I also want TalkBack to say "swipe left for more hints". How can I do that?
Easy answer: Don't. This may be a sensible thing to say if the user is using TalkBack gestures. But what if they're using an attached USB keyboard in conjunction with TalkBack? Is this really the ONLY way to reach those items? Probably not. And if it is, your app may work great with TalkBack gestures but be broken for switch access and keyboard access. TalkBack users know how to use TalkBack, switch users know how to use switches, etc. Use good, common design patterns in your layouts, and let users figure things out for themselves.
Let's explain this in a different way. Let's say you have a button. You have marked this thing up so it looks like a button. Do you have a big sign next to your button that says "HEY, CLICK YOUR MOUSE HERE TO ACTIVATE THIS BUTTON"? No, of course not. Users know how to use buttons. Use reasonable design idioms, and users should understand how your UI works, independent of whether they are using TalkBack, switches, keyboards, etc. It is misleading, inappropriate, and actually LESS accessible to include TalkBack-gesture-specific instructions on how to perform actions.
See WCAG 2.0, Guideline 4.1.

Integration with TalkBack

I am a student programmer, and the topic of my degree work is to refine one of the input methods on touchscreen devices for visually impaired people (including the blind).
I want to make my application work correctly with TalkBack, but I don't know how to do it. I've found the accessibility package, but it's not clear to me how it integrates with TalkBack.
You can start with a simple layout containing an ImageView and add android:contentDescription="your string" as a parameter in the XML. Then turn on TalkBack and tap that image to see what happens.
Use android:contentDescription="Generic Image" in any View with custom content.
Note: when using a ViewGroup, be careful about how clicks and focus pass through to the child views.
Here is an example: https://github.com/dotrinhdev/AndroidTalkback
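Related to the ViewGroup note above, one common pattern (my reading of that caution, not the poster's words) is to describe the container as a single unit and hide its children from accessibility so TalkBack does not also land on each of them; a sketch with made-up names:

    import android.view.View
    import android.view.ViewGroup

    // Sketch: announce a row (e.g. icon + text) as one item.
    // "row" and the description are hypothetical.
    fun describeRowAsOneItem(row: ViewGroup, description: String) {
        row.contentDescription = description
        for (i in 0 until row.childCount) {
            row.getChildAt(i).importantForAccessibility =
                View.IMPORTANT_FOR_ACCESSIBILITY_NO
        }
    }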
As an application developer, you don't need to specifically integrate your app with TalkBack. Instead, you should focus on providing correct data to the accessibility framework. This will ensure that your application works not only with TalkBack, but also with Braille and switch-based accessibility services.
See the Android Developer guide on Making Applications Accessible for an overview of what steps you need to take to ensure your application works correctly with accessibility services.
You may also want to watch the Google I/O 2012 talk Making Android Apps Accessible, which covers basic application accessibility.

How to code Android for the visually impaired?

Does Android support visually impaired users in the same way as HTML alt tags are used to provide input for screen readers on the web?
If so, what is the best practice for coding Buttons, ImageViews, etc., so they can be read by a screen reader?
I'm not quite clear on your question. The internet browser, and any applications that rely heavily on HTML rendering, are not accessible. This quote is taken from this blog post:
"You can turn on the accessibility features by going to Settings --> Accessibility and checking the box "Accessibility". While the web browser and browser-based applications do not yet "talk" using these enhancements, we're working on them for upcoming releases."
Android does provide screen reader support for a lot of applications; see this wiki page for a list of applications known to work well with a free and open source screen reader on Android.
I can't find any general guidelines for creating accessible apps, but this LinkedIn group may be helpful. I don't have a LinkedIn account, though, so I don't know how active the group is.
The best resource I've found on the Android Accessibility API is this code walk through: https://sites.google.com/site/gdevelopercodelabs/android/accessibility
See http://developer.android.com/guide/practices/design/accessibility.html for plenty of details on writing an accessible app.
The nearest equivalent to HTML's ALT is the contentDescription property - set in code or in XML.
If you are creating your own custom control, you'll need to do a bit more work to specify other details too; more details at the link above.
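For the custom-control case, the extra work is mostly about overriding the accessibility hooks on your View; a rough sketch (the RatingView class and its "stars" state are invented for illustration):

    import android.content.Context
    import android.view.View
    import android.view.accessibility.AccessibilityNodeInfo

    // Sketch: a custom view reporting a class name and readable state to
    // accessibility services.
    class RatingView(context: Context) : View(context) {
        var stars: Int = 0

        override fun onInitializeAccessibilityNodeInfo(info: AccessibilityNodeInfo) {
            super.onInitializeAccessibilityNodeInfo(info)
            info.className = RatingView::class.java.name
            info.text = "$stars stars"
        }
    }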
Most important thing: when you're done, test with TalkBack, the free Android screen reader from Google. (It's pre-installed on some Android models, but you can download it from Android Market if you don't already have it.) You should be able to navigate to all the interactive elements in your app using the directional pad alone, and TalkBack should read out appropriate values for all elements as it does so. (It should pick up the contentDescription and read it out here.)
One thing to watch for: from what I remember, the screen reader only reads out things that you can navigate to, so if you have instructional text on the page, it may not be read out, and you may need to ensure that the contentDescription for other controls is suitably descriptive. To be sure, test with TalkBack, and see for yourself (er, hear for yourself!) whether what is read out makes sense.
(As noted in one of the other replies, although Android has an accessibility API, the Android browser doesn't actually support it (yet), so HTML pages - even properly marked-up HTML - aren't accessible on Android using the default browser. There are a couple of third-party browsers that add accessibility to HTML, though, such as the free IDEAL Web Reader app, which appears to wrap the Android HTML control and then add voicing on top of it. Hopefully Android will make their default browser fully accessible in some later release...)
