There are apps like Texpand which are able to replace text in any EditText view, even in views that belong to other apps. Looking at the app info, this happens without any requested permissions. I'm scratching my head over how this is done; my (rookie) understanding is that each app resides in its own separate sandbox, so it should not have direct access to other apps' views?
I looked for possible global events that might be provided by some central manager, but found nothing. More likely, I would expect the replacement to be done passively (that is, without the app being aware of the actual EditText), but checking for possible bindings or user dictionaries, I found nothing promising either.
Looking at my Android system, it seems the app neither uses permissions nor installs a keyboard. Additionally, I don't see any entries in my user dictionary. Does anybody have an idea how the described functionality could actually be achieved?
Texpand's Google Play listing indicates that it uses Accessibility Services. Accessibility services are a set of APIs Android offers for building tools that enable non-standard interactions with apps (such as audio descriptions or voice commands), expanding access to the platform for people whose impairments might otherwise prevent them from using a touchscreen or smartphone.
These include the ability to take action on behalf of a user, such as filling in text fields.
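To illustrate, here is a rough, hypothetical sketch (not Texpand's actual code) of how such a service could rewrite text in an EditText that belongs to another app. It assumes one hard-coded shorthand ("omw" expanding to "on my way") and that the user has enabled the service under Settings > Accessibility:

import android.accessibilityservice.AccessibilityService;
import android.os.Bundle;
import android.view.accessibility.AccessibilityEvent;
import android.view.accessibility.AccessibilityNodeInfo;

public class TextExpanderService extends AccessibilityService {

    @Override
    public void onAccessibilityEvent(AccessibilityEvent event) {
        // Fires for text changes in any app's views, because an enabled
        // accessibility service sits outside the normal per-app sandbox.
        if (event.getEventType() != AccessibilityEvent.TYPE_VIEW_TEXT_CHANGED) {
            return;
        }
        AccessibilityNodeInfo node = event.getSource();
        if (node == null || !node.isEditable() || node.getText() == null) {
            return;
        }
        String text = node.getText().toString();
        // Hypothetical shorthand: expand "omw" when typed at the end of the field.
        if (text.endsWith("omw")) {
            Bundle args = new Bundle();
            args.putCharSequence(
                    AccessibilityNodeInfo.ACTION_ARGUMENT_SET_TEXT_CHARSEQUENCE,
                    text.substring(0, text.length() - 3) + "on my way");
            node.performAction(AccessibilityNodeInfo.ACTION_SET_TEXT, args);
        }
    }

    @Override
    public void onInterrupt() {
        // Nothing to clean up in this sketch.
    }
}

Such a service is declared in the manifest with the BIND_ACCESSIBILITY_SERVICE permission (held by the system, not requested by the app) and must be switched on by the user in the Accessibility settings, which is why it doesn't show up in the app's regular permission list.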
Related
I am trying to make my app more accessible. I am having a hard time finding helpful things because there isn't a lot of documentation (at least I could not find it).
In my app, Talkback does not announce the element type for ImageViews. What I basically want is for Talkback to announce my contentDescription for the ImageView and follow it up with ", Image".
This link states that "Many accessibility services, such as TalkBack and BrailleBack, automatically announce an element's type after announcing its label, so you shouldn't include element types in your labels. For example, "submit" is a good label for a Button object, but "submitButton" isn't a good label." But it does not specify which element types it announces and which it does not.
https://developer.android.com/guide/topics/ui/accessibility/apps.html
Does anyone have any idea if Talkback announces "Image" after the contentDescription for ImageViews?
When does Talkback announce a link as a "Link"? Or is it the developer's responsibility to add it at the end of the contentDescription? Can I make talkback announce clickable text as a "Link"?
Any help/information/pointers is greatly appreciated. Thanks in advance.
A: Don't add stuff to the end of the content description. It is an accessibility violation and in almost ALL circumstances just makes things less accessible (I'll explain more later).
B: A lot of contextual things are communicated to TalkBack users via earcons (bips, beeps, etc.); you may just not be noticing them.
C: Yes, this is confusing and difficult to determine. No, images are not announced, though I think this is for good reason. For example, an image with a click listener may just be a fancy styled button. iOS has traits you can set for this; Android has omitted this highly useful feature, so we're stuck with odd workarounds. The ideal solution would be for the Accessibility APIs to let the developer communicate this information.
As for links, inline links in text views are typically announced (basically anything Android detects and underlines automatically), but otherwise they are not. So, in practice, A LOT of links are missed.
Now, as for when you should or should not supply this information yourself: if unsure, just don't, and you'll reach a reasonably high level of accessibility by following the guidelines above. In fact, the considerations below are really just fighting the Android OS's limitations, and those are Android's problem! However, the Android accessibility ecosystem is very weak, and wanting to provide a higher level of accessibility is understandable, though difficult; in your attempts you can actually end up working against yourself. Let me explain:
In accessibility there is a line between providing information and consistency. By adding contextual information to a content description you are walking along this line. What if Google said, "We're not going to share contextual information; add it yourself!"?
You would have buttons in music players across different music playing apps that announce in TalkBack like this:
App1: "Play, Button"
App2: "Play, Actionable"
App3: "Play, Clickable"
Do we have consistency here? Now a final example!
App4: "Play, Double Tap to click if you're on TalkBack, Hit enter if you're a keyboard user, use your select key for SwitchAccess users....."
Notice how complicated App4's Play button is. This illustrates that the information TalkBack uses is NOT JUST FOR TALKBACK. The accessibility information from your app is consumed by a multitude of accessibility services. When you "hack" contextual information onto a content description, sure, it might sound better for a TalkBack user, but what have you done to BrailleBack users? To SwitchAccess users? In general, the content description of an element should describe the element, and leave contextual information for TalkBack to calculate and for users to figure out from proximity to other controls.
TO ANSWER YOUR TWO PARTICULAR ISSUES (Images and Links):
What I recommend for images is to make it obvious in the content description that the thing you're describing is an image.
Let's say you have a picture of a kitten.
BAD: A kitten, Image
Good: A picture of a kitten.
See here: even if TalkBack fails to announce this as an image (which it will), the user still gets the idea that it is a picture. You have added contextual information to the content description in a way that ACTUALLY DESCRIBES THE CONTROL BETTER. This is actually the most accessible solution! Go figure.
Links:
For links this gets a bit tricky. There is a great debate in Accessibility about what constitutes a link vs a button. In the web browser world, this is a good debate. However, in native mobile I think we have a very clear definition:
Any button that when activated takes you away from your current Application Context, is a link.
The fact that the Context (Capital C Context!!!) is going to change significantly when a user interacts with a control is EXCEPTIONALLY important information.
TalkBack fails to recognize links A LOT, and because this is such an important piece of information, if you find that TalkBack is not sharing it, go ahead and append ", link" to the content description of that element. THIS IS THE ONLY EXCEPTION TO THIS RULE I HAVE FOUND, but I believe it is a good exception. Yes, it does add a violation or two for other assistive technologies, but it conveys information important enough to justify doing so. You CANNOT create a WCAG 2.0 compliant application with a reasonably complex user interface using the Android Accessibility APIs; they have too many limitations, and you simply can't do everything you need to do without "hacks". So we have to make judgement calls sometimes.
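For what it's worth, a minimal sketch of that one exception might look like this (the SettingsActivity class, the R.layout.settings layout and the R.id.privacy_link id are placeholders I made up):

import android.app.Activity;
import android.content.Intent;
import android.net.Uri;
import android.os.Bundle;
import android.widget.TextView;

public class SettingsActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.settings); // hypothetical layout

        // This text opens an external page, so by the definition above it is a link,
        // but TalkBack will not reliably announce it as one.
        TextView privacyLink = (TextView) findViewById(R.id.privacy_link); // hypothetical id
        privacyLink.setContentDescription("Privacy policy, link");
        privacyLink.setOnClickListener(v -> startActivity(
                new Intent(Intent.ACTION_VIEW, Uri.parse("https://example.com/privacy"))));
    }
}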
I am a student programmer, and the topic of my degree work is to improve one of the input methods for touchscreen devices for visually impaired (including blind) people.
I want to make my application work correctly with TalkBack, but I really don't know how to do it. I've found the accessibility package, but it's not clear to me how it integrates with TalkBack.
You can start with a simple layout containing an ImageView and add android:contentDescription="your string" as an attribute in the XML. Then turn on TalkBack and tap that image to see what happens.
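If it helps to see the same thing in code rather than XML, here is a minimal sketch (the activity name and drawable are placeholders I made up):

import android.app.Activity;
import android.os.Bundle;
import android.widget.ImageView;

public class TalkbackDemoActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        ImageView image = new ImageView(this);
        image.setImageResource(R.drawable.kitten); // hypothetical drawable
        // Programmatic equivalent of android:contentDescription="your string";
        // TalkBack reads this aloud when the image gains accessibility focus.
        image.setContentDescription("your string");
        setContentView(image);
    }
}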
Use android:contentDescription="Generic Image" on any View with custom content.
Note: when using a ViewGroup, be careful about how clicks pass through to its child views.
Here is an example: https://github.com/dotrinhdev/AndroidTalkback
As an application developer, you don't need to specifically integrate your app with TalkBack. Instead, you should focus on providing correct data to the accessibility framework. This will ensure that your application works not only with TalkBack, but also with Braille and switch-based accessibility services.
See the Android Developer guide on Making Applications Accessible for an overview of what steps you need to take to ensure your application works correctly with accessibility services.
You may also want to watch the Google I/O 2012 talk Making Android Apps Accessible, which covers basic application accessibility.
I am trying to achieve the same thing as the "hierarchyviewer" tool, which dumps the tree of Views present at any given moment on the device or emulator screen.
But I want it to be an application running on an Android device. This app would keep running in the background like a service and dump the currently displayed Views to a text file.
Is it possible? Are there any code examples available?
Is it possible?
No.
The closest you can come is to implement an AccessibilityService. This would more closely mirror the uiautomatorviewer functionality, giving you a subset of what you see in Hierarchy View. This also requires a double-opt-in by the user: the user must install your app and activate it in Settings in the accessibility area.
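As a rough illustration (an untested sketch, assuming your service is declared with android:canRetrieveWindowContent="true" and has been enabled by the user; it logs to logcat, but writing to a text file would work the same way):

import android.accessibilityservice.AccessibilityService;
import android.util.Log;
import android.view.accessibility.AccessibilityEvent;
import android.view.accessibility.AccessibilityNodeInfo;

public class ViewTreeDumpService extends AccessibilityService {

    @Override
    public void onAccessibilityEvent(AccessibilityEvent event) {
        // Dump the active window's node tree whenever a new window appears.
        if (event.getEventType() == AccessibilityEvent.TYPE_WINDOW_STATE_CHANGED) {
            AccessibilityNodeInfo root = getRootInActiveWindow();
            if (root != null) {
                dump(root, 0);
            }
        }
    }

    // Recursively logs each node's class name and text, indented by depth.
    private void dump(AccessibilityNodeInfo node, int depth) {
        if (node == null) {
            return;
        }
        StringBuilder line = new StringBuilder();
        for (int i = 0; i < depth; i++) {
            line.append("  ");
        }
        line.append(node.getClassName()).append(" : ").append(node.getText());
        Log.d("ViewTreeDump", line.toString());
        for (int i = 0; i < node.getChildCount(); i++) {
            dump(node.getChild(i), depth + 1);
        }
    }

    @Override
    public void onInterrupt() {
        // Nothing to clean up in this sketch.
    }
}

Note that this exposes only the accessibility node tree (roughly what uiautomatorviewer shows), not the full View hierarchy that Hierarchy Viewer can display for your own debuggable process.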
As far as I know, you can't access other apps if they do not explicitly share that info with you through Intents (or unless you own those other apps).
So, based on this limitation, my bet is that you can't access another app's View tree by regular means. And if you could, I think you shouldn't, as this is somewhat "secret" to other apps, and you'd be recording information without permission. In fact, what hierarchyviewer uses is surely some sort of trick that goes directly through Android's internal private libraries, like taking a screenshot, which you can't do with the "default" APIs but can with these kinds of testing tools.
That being said, check this answer, which shows how to get the current foreground app. From there, getting the View tree should be impossible, but if you could somehow call getWindow() on that app's current Activity, it could be done.
So I am using the Home sample to build an application that creates a second home screen for the user. The idea is to have only one user account yet restrict access to chosen applications. I have managed to make all of the applications invisible in the XML, but I am struggling with how to change this to make certain apps visible.
Is it possible to write a whitelist of accepted apps (for instance, the preinstalled apps, or child-friendly apps for children who play games on the Android device) and then add a Java method to check against this whitelist? This is the only way I can think of to make it work.
If anyone knows the correct way can you please help.
Thanks.
Ok so I discovered how to do this.
In the Home sample they provide a for loop in the Home.java file that goes over all apps and displays them. It takes a simple if statement to restrict the apps that can be viewed:
// ... inside the Home sample's loop over the resolved apps ...
if (info.activityInfo.applicationInfo.packageName.contains("com.android")) {
    // ... the rest of the Home sample's loop body goes here ...
}
Still very basic but provides me with a good enough UI so that kids cannot see apps I don't want them to.
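If you'd rather not rely on the "com.android" prefix, a slightly more explicit variant (an untested sketch, with made-up package names) is to keep an actual whitelist and check it inside the same loop:

import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;

import android.content.pm.ResolveInfo;

// Inside Home.java: an explicit whitelist of the packages the kids may see.
private static final Set<String> ALLOWED_PACKAGES = new HashSet<String>(Arrays.asList(
        "com.android.calculator2",   // made-up examples; list the apps you allow
        "com.example.kidsgame"));

private boolean isAllowed(ResolveInfo info) {
    return ALLOWED_PACKAGES.contains(info.activityInfo.applicationInfo.packageName);
}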
Does Android support visually impaired users in the same way that HTML alt attributes are used to provide input for screen readers on the web?
If so, what is the best practice for coding Buttons, ImageViews, etc. so they can be read by a screen reader?
I'm not quite clear on your question. The internet browser, or any applications that rely heavily on HTML rendering, are not accessible. This quote is taken from this blog post:
"You can turn on the accessibility features by going to Settings --> Accessibility and checking the box "Accessibility". While the web browser and browser-based applications do not yet "talk" using these enhancements, we're working on them for upcoming releases."
Android does provide screen reader support for a lot of applications; see this wiki page for a list of applications known to work well with Android using a free and open source screen reader.
I can't find any general guidelines for creating accessible apps, but this LinkedIn group may be helpful. I don't have a LinkedIn account, though, so I don't know how active the group is.
The best resource I've found on the Android Accessibility API is this code walkthrough: https://sites.google.com/site/gdevelopercodelabs/android/accessibility
See http://developer.android.com/guide/practices/design/accessibility.html for plenty of details on writing an accessible app.
The nearest equivalent to HTML's alt attribute is the contentDescription property, set in code or in XML.
If you are creating your own custom control, you'll need to do a bit more work to specify other details too; more details at the link above.
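For a custom control, that extra work might look roughly like this on API 14 and above (the RatingBarView class and its rating field are made up for illustration):

import android.content.Context;
import android.util.AttributeSet;
import android.view.View;
import android.view.accessibility.AccessibilityNodeInfo;
import android.widget.SeekBar;

public class RatingBarView extends View {
    private int rating = 3; // current value, just for illustration

    public RatingBarView(Context context, AttributeSet attrs) {
        super(context, attrs);
    }

    @Override
    public void onInitializeAccessibilityNodeInfo(AccessibilityNodeInfo info) {
        super.onInitializeAccessibilityNodeInfo(info);
        // Tell accessibility services what this custom control is and what state
        // it is in, since they can't infer that from a bare View subclass.
        info.setClassName(SeekBar.class.getName());
        info.setContentDescription("Rating: " + rating + " of 5 stars");
    }
}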
Most important thing: when you're done, test with TalkBack, the free Android screenreader from Google. (It's pre-installed on some Android models, but you can download from Android Market if you don't already have it.) You should be able to navigate to all the interactive elements in your app using the directional pad alone, and TalkBack should read out appropriate values for all elements as it does so. (It should pick up the contentDescription and read it out here.)
One thing to watch for: from what I remember, the screen reader only reads out things that you can navigate to, so if you have instructional text on the page, it may not be read out, and you may need to ensure that the contentDescription for other controls is suitably descriptive. To be sure, test with TalkBack, and see for yourself (er, hear for yourself!) whether what is read out makes sense.
(As noted in one of the other replies, although Android has an accessibility API, the Android browser doesn't actually support it (yet), so HTML pages, even properly marked-up HTML code, aren't accessible on Android using the default browser. There are a couple of third-party browsers that add accessibility to HTML, though, such as the free IDEAL Web Reader app, which appears to wrap the Android HTML control and add voicing on top of it. Hopefully Android will make their default browser fully accessible in some later release...)