Integration with TalkBack - android

I am a student programmer, and the topic of my degree work is to improve one of the input methods for touchscreen devices used by visually impaired people (including the blind).
I want to make my application work correctly with TalkBack, but I don't know how to do it. I've found the accessibility package, but it's not clear to me how it integrates with TalkBack.

You can start with a simple layout containing an ImageView and add android:contentDescription="your string" as an attribute in the XML. Then turn on TalkBack and tap that image to see what happens.
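For example, a minimal layout along these lines should be enough to try it out (the drawable name and the description text here are just placeholders):

    <!-- Minimal sketch: an ImageView with a content description for TalkBack. -->
    <ImageView xmlns:android="http://schemas.android.com/apk/res/android"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:src="@drawable/ic_example"
        android:contentDescription="your string" />

With TalkBack enabled, tapping the image should speak "your string".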

Use android:contentDescription="Generic Image" on any View with custom content.
Note: when using a ViewGroup, be careful about how clicks pass through to the views inside it.
Here is an example: https://github.com/dotrinhdev/AndroidTalkback

As an application developer, you don't need to specifically integrate your app with TalkBack. Instead, you should focus on providing correct data to the accessibility framework. This will ensure that your application works not only with TalkBack, but also with Braille and switch-based accessibility services.
See the Android Developer guide on Making Applications Accessible for an overview of what steps you need to take to ensure your application works correctly with accessibility services.
You may also want to watch the Google I/O 2012 talk Making Android Apps Accessible, which covers basic application accessibility.

Related

Android - how can I make a minimizable app?

I want to make an app that I can minimize and move over other apps.
I saw this feature in the Twitch app and did some research, but I didn't find how they did it. Here is an example of this feature.
Thank you in advance for your answers.
The example you showed is Android's Picture-in-Picture feature: https://developer.android.com/guide/topics/ui/picture-in-picture
However, it's not exactly a minimized app. Picture-In-Picture mode is simply a way for your content to be shown when users switch to other apps.
Because of this, there are a number of restrictions and guidelines; for example, you're encouraged to hide all UI elements besides the displayed content while in PiP mode.
You're allowed to provide a list of Actions for the user to interact with, such as media controls, but note that there's a limit to how many actions can be provided.
So it's not exactly a minimized app in the sense of being a smaller version of your app. Instead, it's more like the main viewing content of your app has been plucked out of your app and is displayed over other apps.
If that's all your app needs, then Picture-In-Picture mode will work well.
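As a rough sketch of what entering that mode looks like (assuming minSdkVersion 26+ and android:supportsPictureInPicture="true" declared on the Activity in the manifest; the activity name and aspect ratio are just examples):

    import android.app.Activity;
    import android.app.PictureInPictureParams;
    import android.content.res.Configuration;
    import android.util.Rational;

    public class PlayerActivity extends Activity {

        // Call this when the user taps "minimize" or navigates away.
        private void minimizeToPip() {
            PictureInPictureParams params = new PictureInPictureParams.Builder()
                    .setAspectRatio(new Rational(16, 9)) // shape of the floating window
                    .build();
            enterPictureInPictureMode(params);
        }

        @Override
        public void onPictureInPictureModeChanged(boolean isInPictureInPictureMode,
                                                  Configuration newConfig) {
            super.onPictureInPictureModeChanged(isInPictureInPictureMode, newConfig);
            // Hide everything except the main content while in PiP, restore it afterwards.
        }
    }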
But if you want a real mini-version of your app, you're going to have to consider Floating Windows instead. Floating Apps create a UI through a Service instead of an Activity.
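A bare-bones sketch of that approach, assuming the SYSTEM_ALERT_WINDOW permission is declared and the user has granted the "draw over other apps" permission (the service name and the TextView content are placeholders):

    import android.app.Service;
    import android.content.Intent;
    import android.graphics.PixelFormat;
    import android.os.IBinder;
    import android.view.Gravity;
    import android.view.WindowManager;
    import android.widget.TextView;

    public class FloatingWindowService extends Service {
        private WindowManager windowManager;
        private TextView floatingView;

        @Override
        public void onCreate() {
            super.onCreate();
            windowManager = (WindowManager) getSystemService(WINDOW_SERVICE);

            floatingView = new TextView(this);
            floatingView.setText("Mini app");

            WindowManager.LayoutParams params = new WindowManager.LayoutParams(
                    WindowManager.LayoutParams.WRAP_CONTENT,
                    WindowManager.LayoutParams.WRAP_CONTENT,
                    // TYPE_APPLICATION_OVERLAY on API 26+; older versions used TYPE_PHONE
                    WindowManager.LayoutParams.TYPE_APPLICATION_OVERLAY,
                    WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE,
                    PixelFormat.TRANSLUCENT);
            params.gravity = Gravity.TOP | Gravity.START;

            // The window belongs to the Service, so it stays visible over other apps.
            windowManager.addView(floatingView, params);
        }

        @Override
        public void onDestroy() {
            if (floatingView != null) windowManager.removeView(floatingView);
            super.onDestroy();
        }

        @Override
        public IBinder onBind(Intent intent) {
            return null;
        }
    }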

Replacement of text in any EditText view of any app

There are apps like Texpand which are able to replace text in any EditText view, even views which are part of other apps. Looking at the app info, this happens without any requested permissions. I'm scratching my head over how this is done; my (rookie) understanding is that each app resides in its own separate sandbox, so it should not have direct access to other apps' views?
I looked for possible global events which could be provided by any central manager, but found nothing. More likely I would expect the replacement to be done passively (that means without the app being aware of the actual EditText), but checking for possible bindings or user dictionaries I found nothing promising either.
Looking at my Android system it seems the app is neither using permissions nor installing a keyboard. Additionally I don't see any entries in my user dictionary. Does anybody have an idea how the described functionality could actually be achieved?
Texpand's Google Play posting indicates that it uses Accessibility Services. Accessibility services are a set of APIs Android offers to help build tools to allow non-standard interactions with apps (such as audio descriptions/voice commands) to expand access to the platform to people with an impairment that might otherwise prevent them from using a touch-screen/smart-phone.
These include the ability to take action on the behalf of a user, such as filling in text fields.
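As a rough illustration of how such a service might do the replacement (the service name, the "/sig" abbreviation, and its expansion are made up; a real service also has to be declared in the manifest with the BIND_ACCESSIBILITY_SERVICE permission and explicitly enabled by the user under Settings > Accessibility):

    import android.accessibilityservice.AccessibilityService;
    import android.os.Bundle;
    import android.view.accessibility.AccessibilityEvent;
    import android.view.accessibility.AccessibilityNodeInfo;

    public class TextExpanderService extends AccessibilityService {

        @Override
        public void onAccessibilityEvent(AccessibilityEvent event) {
            if (event.getEventType() != AccessibilityEvent.TYPE_VIEW_TEXT_CHANGED) {
                return;
            }
            AccessibilityNodeInfo node = event.getSource();
            if (node == null || node.getText() == null) {
                return;
            }

            String text = node.getText().toString();
            if (text.endsWith("/sig")) {
                Bundle args = new Bundle();
                args.putCharSequence(
                        AccessibilityNodeInfo.ACTION_ARGUMENT_SET_TEXT_CHARSEQUENCE,
                        text.replace("/sig", "Kind regards,\nJane"));
                // The service acts on the user's behalf, so no permission is
                // required from the app that owns the EditText.
                node.performAction(AccessibilityNodeInfo.ACTION_SET_TEXT, args);
            }
        }

        @Override
        public void onInterrupt() {
        }
    }

This would also explain why the app lists no classic permissions: accessibility services gain their capabilities when the user enables them, not through the normal permission list.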

Android talkback element type announcement

I am trying to make my app more accessible. I am having a hard time finding helpful things because there isn't a lot of documentation (at least I could not find it).
In my app, Talkback does not announce the element type for ImageViews. What I basically want is for Talkback to announce my contentDescription for the ImageView and follow it up with ", Image".
This page (https://developer.android.com/guide/topics/ui/accessibility/apps.html) states: "Many accessibility services, such as TalkBack and BrailleBack, automatically announce an element's type after announcing its label, so you shouldn't include element types in your labels. For example, 'submit' is a good label for a Button object, but 'submitButton' isn't a good label." However, it does not specify which element types it announces and which it does not.
Does anyone have any idea if Talkback announces "Image" after the contentDescription for ImageViews?
When does Talkback announce a link as a "Link"? Or is it the developer's responsibility to add it at the end of the contentDescription? Can I make talkback announce clickable text as a "Link"?
Any help/information/pointers is greatly appreciated. Thanks in advance.
A: Don't add stuff to the end of the content description. It is an accessibility violation and in almost ALL circumstances just makes things less accessible (I'll explain more later).
B: A lot of contextual things are communicated to TalkBack users via earcons (bips, beeps, etc), you may just not be noticing.
C: Yes, this is confusing and difficult to determine. No, images are not announced, though I think this is for good reason. For example, an image with a click listener may just be a fancy styled button. In iOS there are traits you can change for this; Android has omitted this highly useful feature, so we're stuck with odd workarounds. The ideal solution would be for the Accessibility APIs to allow the developer to communicate this information.
As for links, typically inline links in text views are announced (basically anything android detects and underlines automatically), but otherwise are not. So, in practice A LOT of links are missed.
Now, as for when you should or should not supply this information yourself: if you're unsure, just don't, and you'll reach a reasonably high level of accessibility by following the above guidelines. In fact, the considerations below are really just fights against the Android OS's limitations, and those are its problem! However, the Android accessibility ecosystem is very weak, and if you want to provide a higher level of accessibility, that is understandable, though difficult. In your attempts you can actually end up working against yourself. Let me explain:
In accessibility there is a line between providing information and consistency. By adding contextual information to a content description you are walking along that line. What if Google said, "We're not going to share contextual information; add it yourself!"?
You would have buttons in music players across different music playing apps that announce in TalkBack like this:
App1: "Play, Button"
App2: "Play, Actionable"
App3: "Play, Clickable"
Do we have consistency here? Now a final example!
App4: "Play, Double Tap to click if you're on TalkBack, Hit enter if you're a keyboard user, use your select key for SwitchAccess users....."
Notice how complicated App4's Play button is. This illustrates that the information TalkBack uses is NOT JUST FOR TALKBACK. The accessibility information from your app is consumed by a multitude of accessibility services. When you "hack" contextual information onto a content description, sure, it might sound better for a TalkBack user, but what have you done to BrailleBack users? To SwitchAccess users? In general, the content description of an element should describe the element, leaving contextual information for TalkBack to calculate and for users to figure out from proximity to other controls.
TO ANSWER YOUR TWO PARTICULAR ISSUES (Images and Links):
What I recommend for images is to make it obvious in the content description that the thing you're describing is an image.
Let's say you have a picture of a kitten.
BAD: A kitten, Image
Good: A picture of a kitten.
See here: if TalkBack fails to announce this as an image (which it will), the user still gets the idea that it is a picture. You have added contextual information to the content description in a way that has ACTUALLY BETTER DESCRIBED THE CONTROL. This is actually the most accessible solution! Go figure.
Links:
For links this gets a bit tricky. There is a great debate in Accessibility about what constitutes a link vs a button. In the web browser world, this is a good debate. However, in native mobile I think we have a very clear definition:
Any button that when activated takes you away from your current Application Context, is a link.
The fact that the Context (Capital C Context!!!) is going to change significantly when a user interacts with a control is EXCEPTIONALLY important information.
TalkBack fails to recognize links A LOT, and as such, for this very important piece of information, if you find that TalkBack is not sharing it, go ahead and append ", link" to the content description of that element. THIS IS THE ONLY EXCEPTION TO THIS RULE I HAVE FOUND, but I believe it is a good exception. The reason is that, YES, it does add a violation or two for other assistive technologies, but it conveys important enough information to justify doing so. YOU CANNOT create a WCAG 2.0 compliant application with a reasonably complex user interface using the Android Accessibility APIs. They have too many limitations; you simply can't do everything you need to do to accomplish this without "hacks". So, we have to make judgement calls sometimes.
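To make those two recommendations concrete, here is a minimal sketch (the view IDs and strings are hypothetical):

    import android.view.View;
    import android.widget.ImageView;
    import android.widget.TextView;

    class DescriptionExamples {
        static void applyDescriptions(View root) {
            // Describe the image as a picture inside the description itself,
            // instead of appending ", Image".
            ImageView kitten = root.findViewById(R.id.kitten_image);
            kitten.setContentDescription("A picture of a kitten");

            // The one exception: append ", link" only where TalkBack fails to
            // announce that activating the control leaves the current app.
            TextView terms = root.findViewById(R.id.terms_of_service);
            terms.setContentDescription("Terms of service, link");
        }
    }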

Manipulate android apps that don't work with Talkback

I recently used TalkBack on an Android device. Some apps don't work well with the TalkBack screen reader because they're not developed according to the accessibility APIs. Is there any way that I can access the UI of these apps (at the OS level) and manipulate them to be compatible with TalkBack?
No. Such changes have to be made in the app's code. There is no way to modify them at the OS level.
Those changes have to be made in XML or Java; users cannot change these values from the system layer (security policy). You can refer to the Android Developer guidelines for accessibility. Without a proper contentDescription set, TalkBack is unable to retrieve any information about some views: images, image buttons, etc.

How to code Android for the visually impaired?

Does Android support visually impaired users in the same way as HTML alt tags are used to provide input for screen readers on the web?
If so, what is the best practice to code Buttons and ImageViews etc so they can be read by a screen reader?
I'm not quite clear on your question. The internet browser, and any applications that rely heavily on HTML rendering, are not accessible. The following quote is taken from this blog post:
"You can turn on the accessibility features by going to Settings --> Accessibility and checking the box "Accessibility". While the web browser and browser-based applications do not yet "talk" using these enhancements, we're working on them for upcoming releases."
Android does provide screen reader support for a lot of applications, see this wiki page for a list of applications known to work well with Android using a free and open source screen reader.
I can't find any general guidelines for creating accessible apps, but this LinkedIn group may be helpful. I don't have a LinkedIn account, though, so I don't know how active the group is.
The best resource I've found on the Android Accessibility API is this code walk through: https://sites.google.com/site/gdevelopercodelabs/android/accessibility
See http://developer.android.com/guide/practices/design/accessibility.html for plenty of details on writing an accessible app.
The nearest equivalent to HTML's ALT is the contentDescription property - set in code or in XML.
If you are creating your own custom control, you'll need to do a bit more work to specify other details too; more details are at the link above.
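As a rough sketch of that extra work for a hypothetical custom toggle control, one option is to report a familiar role and the current state through the accessibility node info:

    import android.content.Context;
    import android.view.View;
    import android.view.accessibility.AccessibilityNodeInfo;
    import android.widget.CheckBox;

    public class CustomToggleView extends View {
        private boolean checked;

        public CustomToggleView(Context context) {
            super(context);
        }

        @Override
        public void onInitializeAccessibilityNodeInfo(AccessibilityNodeInfo info) {
            super.onInitializeAccessibilityNodeInfo(info);
            // Tell screen readers to treat this view like a checkbox and
            // report whether it is currently checked.
            info.setClassName(CheckBox.class.getName());
            info.setCheckable(true);
            info.setChecked(checked);
        }
    }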
Most important thing: when you're done, test with TalkBack, the free Android screenreader from Google. (It's pre-installed on some Android models, but you can download from Android Market if you don't already have it.) You should be able to navigate to all the interactive elements in your app using the directional pad alone, and TalkBack should read out appropriate values for all elements as it does so. (It should pick up the contentDescription and read it out here.)
One thing to watch for: from what I remember, the screen reader only reads out things that you can navigate to, so instructional text on the page may not be read out; you may need to ensure that the contentDescription of other controls is suitably descriptive. To be sure, test with TalkBack, and see for yourself (er, hear for yourself!) whether what is read out makes sense.
(As noted in one of the other replies, although Android has an accessibility API, the Android browser doesn't actually support it (yet), so HTML pages, even properly marked-up HTML code, aren't accessible on Android using the default browser. There are a couple of 3rd-party browsers that add accessibility to HTML, though, such as the free IDEAL Web Reader app, which appears to wrap the Android HTML control and add voicing on top of it. Hopefully Android will make their default browser fully accessible in some later release...)
