I'm developing an Android webapp (an app built with Android Studio mainly based on a single webview which loads remote content) that is meant to be used by visually impaired people.
Standard accessibility features, such as those provided by Android itself and TalkBack, are useless for my use case, because this app is meant to be used in a completely non-standard way: the headset button is the only input, and spoken text is the output.
The webapp is NOT meant to be used by people who are not visually impaired; we already have a completely different version of the webapp for them. However, I do want the spoken texts to be displayed on screen, mainly for testing/debugging purposes.
So I neither need nor want to take advantage of Google TalkBack, and my webapp, built from scratch around native TTS called from JavaScript (with TTS events, such as the end of an utterance, triggering JavaScript code), already works fine when TalkBack is disabled.
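For reference, the bridge between the page's JavaScript and the native TTS engine looks roughly like this (a simplified sketch; the TtsBridge class and the onUtteranceDone JS function are just placeholder names from my own code):

import java.util.HashMap;

import android.content.Context;
import android.speech.tts.TextToSpeech;
import android.webkit.JavascriptInterface;
import android.webkit.WebView;

public class TtsBridge implements TextToSpeech.OnInitListener {

    private final WebView webView;
    private final TextToSpeech tts;

    public TtsBridge(Context context, WebView webView) {
        this.webView = webView;
        this.tts = new TextToSpeech(context, this);
    }

    @Override
    public void onInit(int status) {
        // Pre-API-15 completion callback (fits my 2.x-4.x range);
        // newer APIs would use UtteranceProgressListener instead.
        tts.setOnUtteranceCompletedListener(new TextToSpeech.OnUtteranceCompletedListener() {
            @Override
            public void onUtteranceCompleted(String utteranceId) {
                // Tell the page the utterance finished; onUtteranceDone() is my own JS function.
                webView.post(new Runnable() {
                    @Override
                    public void run() {
                        webView.loadUrl("javascript:onUtteranceDone()");
                    }
                });
            }
        });
    }

    // Exposed to the page via webView.addJavascriptInterface(new TtsBridge(ctx, webView), "NativeTts");
    // the @JavascriptInterface annotation is required on API 17+ and ignored on older versions.
    @JavascriptInterface
    public void speak(String text) {
        HashMap<String, String> params = new HashMap<String, String>();
        // An utterance ID param is needed so the completion callback fires.
        params.put(TextToSpeech.Engine.KEY_PARAM_UTTERANCE_ID, "webAppUtterance");
        tts.speak(text, TextToSpeech.QUEUE_FLUSH, params);
    }
}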
However, my app is going to be used mostly on phones whose users DO have accessibility and TalkBack enabled (or perhaps other accessibility extensions - hopefully not, given the nearly nonexistent standardization of the framework for accessibility features).
So I need to prevent it from interfering with my own calls to the TTS engine.
Which is complicated as hell.
Also, I need it to be compatible with Android 2.x through 4.x.
On older Android versions the problem is relatively small, as TalkBack doesn't inject scripts into the webview, and the webview is almost opaque, its contents being "invisible" to TalkBack.
On newer versions, it's a nightmare.
Android doesn't provide a way to simply disable accessibility features from the app, which is ridiculous.
The AccessibilityManager.interrupt() method doesn't seem to do the trick, and it's practically undocumented, so I can hardly figure out in which ways I could try to use it.
There doesn't seem to be any documentation about how the injected scripts work and how I could try to interact with them.
And to make things worse, I'm reading that since 4.4.something, the scripts formerly injected by TalkBack have been replaced by something else, integrated into the web engine itself, or something like that. So any hack I figure out for the TalkBack-injected scripts may not even work with the most recent versions.
Is there some workaround/hack to completely disable TalkBack from my app, for my app, when my app is running?
Is there a way to prevent TalkBack from injecting JavaScript code (when script injection is enabled system-wide)? Obviously without disabling JavaScript altogether in my webview, since I use my own JavaScript.
Is there at least some documentation of those scripts available? Or their code? So that I could at least try to "fool" them?
Is there a way I can make some parts of the webview's content "invisible" to TalkBack?
Or any other suggestion that would let my app make its own calls to TTS without TalkBack interfering with them, and render text on screen without TalkBack interacting with it?
I've been working with TalkBack and Android for about a month now and share your frustrations - little to no documentation other than reading thousands of lines of source code. In 4.4 the WebView component changed to use the Chromium engine, which from what I can tell is the reason the injected scripts aren't exactly the same. Both still seem to be using ChromeVox. There are no public APIs for disabling TalkBack; it can be done with reflection if you're inclined to deal with the headaches that come with that. Your best bet is to disable ChromeVox (it's injected into your WebView automatically on 4.4+, and on lower 4.x versions when the web accessibility setting is enabled) from the JS side.
I can only test on 4.4 now, but this does the trick for me:
cvox.ChromeVox.host.activateOrDeactivateChromeVox(false);
https://code.google.com/p/google-axs-chrome/source/browse/trunk/chromevox/chromevox/injected/init_document.js?r=193
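If you'd rather not touch the page itself, you can fire that same call from the Android side once the page has loaded; something along these lines (same caveat, I've only tested on 4.4; the guard just makes it a no-op when ChromeVox wasn't injected):

webView.setWebViewClient(new WebViewClient() {
    @Override
    public void onPageFinished(WebView view, String url) {
        // Only attempt the call if ChromeVox is actually present in the page.
        view.loadUrl("javascript:if (window.cvox && cvox.ChromeVox && cvox.ChromeVox.host) {"
                + " cvox.ChromeVox.host.activateOrDeactivateChromeVox(false); }");
    }
});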
Related
I've always looked to provide good accessibility support in my apps, but in my latest work I've needed to build parts of the UI within a native OpenGL view, which means there are no Android widgets or components on which one can hang a contentDescription or similar.
In theory, it shouldn't be too hard to implement "talkback" functionality regardless: just check whether TalkBack is switched on when the user taps a button; if it's the first tap, speak the button's text, and if it's a second consecutive tap, do what the button would normally do. But - somewhat to my surprise - there don't seem to be any APIs for this, or at least no documentation for them that I could find.
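Concretely, inside my Activity's touch handling I was picturing something along these lines (just a sketch of the idea; GlButton, lastAnnounced and tts are placeholders from my own code, not Android APIs):

// Sketch of the "first tap announces, second tap activates" idea.
private void onButtonTapped(GlButton button) {
    AccessibilityManager am =
            (AccessibilityManager) getSystemService(Context.ACCESSIBILITY_SERVICE);
    boolean spokenFeedbackOn = am.isEnabled() && am.isTouchExplorationEnabled();

    if (spokenFeedbackOn && button != lastAnnounced) {
        // First tap on this control: just speak its label.
        tts.speak(button.getLabel(), TextToSpeech.QUEUE_FLUSH, null);
        lastAnnounced = button;
    } else {
        // Second consecutive tap (or spoken feedback off): perform the normal action.
        button.performClick();
        lastAnnounced = null;
    }
}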
Has anyone tried to work with the Android TalkBack functionality directly (or tried alternative solutions)? I'd hate to abandon those users who benefit from TalkBack if there is a way to work around its seemingly limited implementation.
I'm having some trouble getting TalkBack to work with a web view (testing with a Nexus 9 on Android 5.1). I read that TalkBack support for web views was added around the Android Jelly Bean release and is enabled by checking a preference titled "Enhance Web Accessibility." I can't for the life of me find this preference in the system settings.
Focusing on the web view in our application simply reads "WebView" and provides no other options.
Was TalkBack support for web views removed in KitKat? If not, what am I missing here?
I am using the Mobile Accessibility plugin to read my customized TalkBack output.
My app works fine on Android 4.4.4. I am using aria-hidden=true to stop the plugin's default TalkBack reading, but with that attribute Lollipop doesn't focus the element and doesn't read everything.
Your question is a little unclear. But I can guarantee one thing, this has nothing to do with the "Enhanced Web Accessibility" option. This was an experimental accessibility setting in Android 4.1 - 4.2 and has since been deprecated. This is why you cannot find this setting. What "Enhanced Web Accessibility" did was add some visual elements to help users spot things like Links and such with nice outlines. It has nothing to do with basic TalkBack support for WebViews, which has been supported since TalkBack was released, though it has certainly improved since then, and is still improving.
Now, this doesn't really answer your question; it only removes one of the avenues you seemed concerned about investigating. As for your actual problem, it's not clear from the question what's going wrong. It seems you have some sort of event firing in a custom WebView you have created, and this should cause TalkBack to read something out. I'll await further details to continue this answer, though I suspect that you have not set the WebView's accessibility delegate:
mWebView.setAccessibilityDelegate(new View.AccessibilityDelegate());
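For context, that line just goes wherever you set up the WebView, something like this (assuming your WebView is R.id.webview and your content lives under assets; both are placeholders):

WebView mWebView = (WebView) findViewById(R.id.webview);
mWebView.getSettings().setJavaScriptEnabled(true);
mWebView.setAccessibilityDelegate(new View.AccessibilityDelegate());
mWebView.loadUrl("file:///android_asset/www/index.html");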
It must be set with setJavaScriptEnabled(false); the web content otherwise appears to be analyzed as a plain View.
I doubt it is possible, but can a volume button press be detected with jQuery or any of the web languages?
I want to do some basic user testing to investigate which way is best to call a certain function: a gesture or a hardware button. It's not a problem to call a function based on a touch gesture, but calling a function based on pressing a hardware button is a bit more difficult.
As I have both an Android phone and an iPhone here, it doesn't matter if it only works on one of the two devices.
Since I can't write C++, that language isn't an option, even though it does support hardware button detection.
What I want to create
It's the most basic version of a test: I want to see which way people prefer to switch from a round to a squared display.
There is a simple round object in the center, and it should transform to a square based on any of the following gestures: a hardware button press (volume), pinch, pinch-out, 4-finger pinch-out, or 2-finger swipe down.
Short answer:
If you're running this in a mobile browser (e.g. Safari or Chrome), you don't have access to the volume buttons.
Explanation:
jQuery relies on JavaScript, which runs in a browser, a webview (for native apps), or Node.js (irrelevant for you). Webviews tend to behave like a browser but with fewer features. The main browsers out there do not provide access to the volume buttons, so jQuery isn't going to solve this for you.
Solution:
You're going to need a native app if you really want to use the volume buttons. On Android, apps are written in Java or Kotlin; on iOS, in Swift or Objective-C. If you know only one or none of those languages, there are cross-platform tools that'll let you write the app once and deploy to both platforms. Depending on the level of control you want, you could use a tool that provides a unified framework or go for a fancy drag-and-drop builder. There are TONS of options out there.
Extra:
Looking at your history, it seems like you're a "web" guy. If you just want to use jQuery/JavaScript for the convenience, you could create a simple native app that basically does just two things: 1) Load your webapp, and 2) Provide an API to the webapp for accessing the volume buttons. This topic will get you moving in that direction: Calling android native APIs from javascript functions of embedded WebView
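To give you an idea of the shape of that wrapper, a minimal sketch might look like this (the URL and the onVolumeKey() JS function are placeholders you'd replace with your own):

import android.app.Activity;
import android.os.Bundle;
import android.view.KeyEvent;
import android.webkit.WebView;

public class MainActivity extends Activity {

    private WebView webView;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        webView = new WebView(this);
        webView.getSettings().setJavaScriptEnabled(true);
        webView.loadUrl("https://example.com/round-to-square-test"); // placeholder URL
        setContentView(webView);
    }

    @Override
    public boolean onKeyDown(int keyCode, KeyEvent event) {
        if (keyCode == KeyEvent.KEYCODE_VOLUME_UP || keyCode == KeyEvent.KEYCODE_VOLUME_DOWN) {
            // Forward the press to a JS function defined in your page.
            webView.loadUrl("javascript:onVolumeKey()");
            return true; // swallow it so the system volume doesn't change
        }
        return super.onKeyDown(keyCode, event);
    }
}

Your pinch/swipe gestures stay in the web page; only the hardware button needs the native hop.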
I encourage other web folks to hack around a little on mobile platforms. You never know when a base level understanding could come in handy.
Does anyone have any experience with making an Android app accessible when utilizing PhoneGap? At the very minimum we need to make our apps Section 508-compliant.
I have tried implementing some standard accessibility features (labels for textboxes, adding title attributes to divs, etc.). However, when using TalkBack and Explore by Touch on Android, when my PhoneGap app is loaded it just says "WEB CONTENT" - and that's it. Nothing else about my app is spoken aloud.
When the same app runs in iOS with VoiceOver everything works quite well. It reads all of the "title" attributes just fine.
Yeah, I can reproduce that problem as well. It doesn't look like TalkBack can read things inside the WebView. You should raise an issue with them:
https://code.google.com/p/eyes-free/issues/list
I'm sure we'd be happy to work with them on it.
I am programming an Android keyboard. I've now reached the point where I have to rewrite the KeyboardView, and I'm asking myself why not switch to a webapp instead of a native Android app. That way I could later easily deploy the application to other platforms without needing to rewrite it.
My approach would be to use the Android app as a container for my web app, probably in combination with GWT. I am also considering using PhoneGap.
The problem is that so far I haven't found anything about people using webapps for an InputMethod. Now I'm wondering if it's reasonable or even possible to use webapps for this task or if they are only suitable for normal activities.
Has anyone tried doing this before, or does anyone have hints on why it is or isn't possible?
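For reference, the rough direction I had in mind (an untested sketch; MyKeyboardService and keyboard.html are just names I made up):

import android.inputmethodservice.InputMethodService;
import android.view.View;
import android.webkit.WebView;

public class MyKeyboardService extends InputMethodService {

    @Override
    public View onCreateInputView() {
        // Use a WebView as the keyboard's input view; the actual keyboard UI
        // would be the web page loaded here.
        WebView keyboardView = new WebView(this);
        keyboardView.getSettings().setJavaScriptEnabled(true);
        keyboardView.loadUrl("file:///android_asset/keyboard.html");
        return keyboardView;
    }

    // Key presses from the page would be forwarded back to Java (e.g. via
    // addJavascriptInterface) and committed with something like:
    // getCurrentInputConnection().commitText(text, 1);
}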