Preamble
TalkBack is an Accessibility Service for Android that helps blind and vision-impaired users interact with their devices. It's a screen reader that reads every user interface element aloud.
This is great: visually impaired people can use many different apps that have not been specially adapted.
But it does not work for all types of apps. I want a widget that handles all touch events itself, even while TalkBack is active.
VoiceOver is the counterpart of TalkBack on iOS. Here is the solution to my problem on iOS:
VoiceOver accessibility in a virtual musical instrument iPhone app?
Question
I implemented this solution in my iOS app and it works fine. Is there any equivalent for Android/TalkBack?
[ mySubView setAccessibilityTraits: UIAccessibilityTraitAllowsDirectInteraction ];
Apps that play sound directly in response to touch should be usable with TalkBack on: instrument apps, several games, and apps like mine that make a room discoverable with 3D sound.
Turning off TalkBack is not an option for a blind person. It's like turning off the screen for sighted people.
If you think there should be such a function, please upvote.
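As far as I can tell, Android has no public equivalent of UIAccessibilityTraitAllowsDirectInteraction. A workaround that is often suggested is to remap the hover events TalkBack delivers during touch exploration back into touch events inside your custom view. Below is a minimal sketch of that idea (my own, assuming a custom View subclass; test it against your TalkBack version):

import android.content.Context;
import android.view.MotionEvent;
import android.view.View;
import android.view.accessibility.AccessibilityManager;

// Sketch: while touch exploration is on, TalkBack sends hover events
// instead of touch events. Remapping them restores direct interaction
// for this one view, similar in spirit to the iOS trait above.
public class DirectInteractionView extends View {

    public DirectInteractionView(Context context) {
        super(context);
    }

    @Override
    public boolean dispatchHoverEvent(MotionEvent event) {
        AccessibilityManager am = (AccessibilityManager)
                getContext().getSystemService(Context.ACCESSIBILITY_SERVICE);
        if (am != null && am.isTouchExplorationEnabled()) {
            switch (event.getAction()) {
                case MotionEvent.ACTION_HOVER_ENTER:
                    event.setAction(MotionEvent.ACTION_DOWN);
                    break;
                case MotionEvent.ACTION_HOVER_MOVE:
                    event.setAction(MotionEvent.ACTION_MOVE);
                    break;
                case MotionEvent.ACTION_HOVER_EXIT:
                    event.setAction(MotionEvent.ACTION_UP);
                    break;
            }
            // Feed the remapped event through the normal touch pipeline.
            return onTouchEvent(event);
        }
        return super.dispatchHoverEvent(event);
    }
}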
Related
I am currently using a Pixel 4 (API 30) AVD in Android Studio.
I want to activate text-to-speech in the virtual device's settings, but there does not seem to be a direct way to do this. How can I activate it?
Here's a photo of the settings menu. As you can see, although you can test the text-to-speech output, I haven't found a way to use it.
Text-to-speech is different from TalkBack. There is a question about this here.
Text-to-speech is a speech-synthesis service that apps can use to convert text into spoken audio. TalkBack, one of the Google-provided apps, uses it as a screen reader for people who may find it difficult to read or see the screen.
I think what you want to activate is TalkBack as a screen reader. That means that to do it via the emulator, you would have to install TalkBack on your emulator. The answers here may be able to help, either by downloading the APK from the store or by opening Google Play from the emulator and installing it. I recommend you go through the tutorial before diving in, as TalkBack uses gestures to navigate. You might not have success, as swipes tend to have "levels" when it comes to accessibility (for some odd reason, a programmatic adb swipe is treated differently from a swipe with a finger physically on the screen).
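If you'd rather script this than tap through menus, TalkBack can also be enabled over adb once it is installed. The settings keys below are standard, but the service component name is an assumption that may vary between TalkBack builds:

adb shell settings put secure enabled_accessibility_services com.google.android.marvin.talkback/com.google.android.marvin.talkback.TalkBackService
adb shell settings put secure accessibility_enabled 1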
I have found it much easier to test accessibility on real devices, as adb controls are limited. I am trying to improve the situation with my own code, but it's slow going, and TalkBack is far from perfect.
I work on an app with iOS and Android versions and am looking to learn more about our users who have accessibility requirements.
There are two pieces of info I'd love to have:
How many of our users on iOS and Android already use accessibility features like VoiceOver, switch control, font scaling, or color and motion settings (amongst many other options)?
How many users in general use these features?
I'm having a very hard time using Google to answer these questions, so any advice would be well received.
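One partial answer, in case platform-wide numbers stay hard to find: you can measure this within your own app. Below is a hedged Android sketch using standard APIs; Analytics.log is a placeholder for whatever reporting pipeline you use, and touch exploration is only a proxy for a screen reader being active:

import android.content.Context;
import android.view.accessibility.AccessibilityManager;

// Sketch: sample a few accessibility signals in-app and report them.
public final class AccessibilityStats {

    public static void report(Context context) {
        AccessibilityManager am = (AccessibilityManager)
                context.getSystemService(Context.ACCESSIBILITY_SERVICE);

        // Touch exploration being on usually means TalkBack (or a
        // similar screen reader) is running.
        boolean screenReaderLikely = am != null
                && am.isEnabled()
                && am.isTouchExplorationEnabled();

        // fontScale > 1.0 means the user has enlarged system text.
        float fontScale = context.getResources().getConfiguration().fontScale;

        Analytics.log("screen_reader_likely", String.valueOf(screenReaderLikely)); // hypothetical
        Analytics.log("font_scale", String.valueOf(fontScale));                    // hypothetical
    }
}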
Use case: I have security cameras around my home (on ZoneMinder). When an event is triggered (someone walks by a camera), I want a screen near the door, and ideally a push notification to devices (hopefully Android tablets), to display the camera view on all the connected devices.
I'm happy to develop my own app if necessary (even better if such an app already exists). But does Android support functionality that allows an app to open based on an external event (some message to the app)? I've personally never seen it (an incoming phone call is the one exception). I'm sure listening for events is easy, but I can imagine there being a rule against letting apps 'force open' in this way.
If there is a way, I would appreciate a pointer to the class that supports it. Thanks!
(If you have any other solutions for my use case, I'd be happy to hear them. Unfortunately, I'd like these tablets to run other apps too, so keeping a 'camera app' open constantly isn't really a solution for me.)
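One pattern worth knowing about, sketched below under assumptions (CameraViewActivity is a hypothetical activity in your own app; on Android 10+ you need the USE_FULL_SCREEN_INTENT permission, and background activity starts are otherwise restricted): post a high-priority notification with a full-screen intent, which the system may launch directly over the lock screen. This is the same mechanism incoming-call apps use. Your event listener (e.g. a push message handler) would call this when ZoneMinder fires:

import android.app.Notification;
import android.app.NotificationChannel;
import android.app.NotificationManager;
import android.app.PendingIntent;
import android.content.Context;
import android.content.Intent;

// Sketch: show the camera view via a full-screen intent notification.
// CameraViewActivity is a hypothetical activity that renders the stream.
public final class AlertNotifier {

    private static final String CHANNEL_ID = "camera_events";

    public static void showCameraAlert(Context context) {
        NotificationManager nm =
                context.getSystemService(NotificationManager.class);

        // Full-screen intents need a high-importance channel (API 26+).
        nm.createNotificationChannel(new NotificationChannel(
                CHANNEL_ID, "Camera events",
                NotificationManager.IMPORTANCE_HIGH));

        PendingIntent fullScreen = PendingIntent.getActivity(
                context, 0,
                new Intent(context, CameraViewActivity.class),
                PendingIntent.FLAG_IMMUTABLE);

        Notification notification =
                new Notification.Builder(context, CHANNEL_ID)
                        .setSmallIcon(android.R.drawable.ic_dialog_alert)
                        .setContentTitle("Motion detected")
                        .setCategory(Notification.CATEGORY_CALL)
                        // The system may launch this immediately when
                        // the device is locked or the screen is off.
                        .setFullScreenIntent(fullScreen, true)
                        .build();

        nm.notify(1, notification);
    }
}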
To explain my question, a bit of info about my test setup might help. I have a Moto Z with a Moto Mod projector (my spoilt son's Christmas present). I've now added a gyroscopic Probox2 remote/gamepad, so he can theoretically use his phone while projecting, for films/games, without tapping the phone (which is behind him).
I've connected it and it works to an extent. It works in the core Android UI (home screen, app launcher, settings, etc.). However, it doesn't work at all in most apps. It works in Amazon Prime, for example, but not in Netflix.
I was expecting it to work pretty much seamlessly, as it would on Android TV boxes, even though I'm connecting it to a phone.
I've noticed it seems to identify itself to Android as a keyboard rather than a gamepad, which makes sense, since the gyroscopic "air mouse" functionality wouldn't necessarily make sense on a gamepad. The Gboard popup disappears when the remote is connected, even though the remote itself doesn't have an actual keyboard. The remote lets you switch between a sort of gamepad mode and a mouse mode, although in both cases it identifies as a keyboard.
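As a quick diagnostic (just a sketch to confirm the behaviour, not a fix), you can log the source flags each connected input device reports and see whether the remote really registers as a keyboard rather than a gamepad:

import android.util.Log;
import android.view.InputDevice;

// Diagnostic sketch: log whether each connected input device reports
// keyboard and/or gamepad source flags.
public final class InputDeviceProbe {

    public static void dumpDevices() {
        for (int id : InputDevice.getDeviceIds()) {
            InputDevice dev = InputDevice.getDevice(id);
            if (dev == null) continue;
            int sources = dev.getSources();
            boolean keyboard = (sources & InputDevice.SOURCE_KEYBOARD)
                    == InputDevice.SOURCE_KEYBOARD;
            boolean gamepad = (sources & InputDevice.SOURCE_GAMEPAD)
                    == InputDevice.SOURCE_GAMEPAD;
            Log.d("InputProbe", dev.getName()
                    + " keyboard=" + keyboard + " gamepad=" + gamepad);
        }
    }
}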
Because it doesn't work out of the box on Android, and somebody would surely have noticed on Android TV if it didn't work with Netflix, I'm assuming Android TV developers do something to force compatibility in apps that don't otherwise accept input from a "keyboard".
Possibly a service that detects "keyboard" presses and simultaneously triggers a "gamepad" press?
That's how I would probably approach it, and I assume that's how the non-root "button remapper" type apps approach it, because they can't interfere with the actual button mapping file... but it might not be the best/easiest way?
Any ideas?
Having looked into this further, I think I understand.
Apps for Android TV are maintained separately from their mobile counterparts (https://www.apkmirror.com/apk/netflix-inc/netflix-android-tv/netflix-android-tv-3-3-2-build-1530-release/netflix-android-tv-3-3-2-build-1530-android-apk-download/), and it's not possible to sideload them without getting hacky.
So that's basically the answer: the Android TV industry's approach to ensuring compatibility with keyboards, mice, media remotes, etc. is to create separate versions of apps for Android TV which support them. On mobile, developers are presumably mainly interested in ensuring physical keyboards work in text input areas, rather than in all areas in unusual cases like mine.
Which doesn't help me at all.
The not-so-good ways of doing it... One possible approach is to create a virtual gamepad mapped to the key presses of the physical "keyboard" (i.e. the remote). An example of an app which appears to do just this requires root in order to do so: https://play.google.com/store/apps/details?id=com.locnet.gamekeyboard2&hl=en_GB
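For reference, here is a bare-bones sketch of what that kind of app presumably does under the hood, assuming a rooted device with su available; 96 is KeyEvent.KEYCODE_BUTTON_A, and the shell "input" tool injects it system-wide:

import java.io.IOException;

// Root-only sketch: synthesize a gamepad button press through the
// shell "input" tool. Injecting events into other apps requires root
// (or a signature-level platform permission).
public final class RootKeyInjector {

    public static void pressButtonA() {
        try {
            // 96 == KeyEvent.KEYCODE_BUTTON_A
            Process p = Runtime.getRuntime()
                    .exec(new String[] {"su", "-c", "input keyevent 96"});
            p.waitFor();
        } catch (IOException | InterruptedException e) {
            e.printStackTrace();
        }
    }
}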
I have developed some applications for blind users that let them use basic mobile functions like making a call, adding a contact, and deleting a contact. But now I am in a dilemma: if a blind user is handed an Android phone, how can they even unlock it and navigate through the menus to reach a specific application? Does one need to customize the entire OS, making it interactive for blind users by responding to voice commands (e.g. "unlock mobile" should unlock it), to enable them to use the phone's functionality?
I have tried some voice recognition tutorials, but the voice recognition approach was not accurate. What would be a possible way to customize the phone for blind users, and is it possible to access the internals of the OS and customize it to respond to voice commands?
I am not sure how to code this. Please shed some light on this problem and how it could be made possible.
Android comes with accessibility services which can be used. It has the TalkBack service, which can be used to enable spoken feedback on touch.
Refer to the links below for more details:
http://eyes-free.googlecode.com/svn/trunk/documentation/android_access/index.html
http://eyes-free.googlecode.com/svn/trunk/documentation/android_access/services.html
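If what you need is spoken feedback inside your own app rather than the full TalkBack screen reader, the platform TextToSpeech API is a simpler starting point. A minimal sketch (the spoken string and utterance ID are placeholders; speak(CharSequence, int, Bundle, String) needs API 21):

import android.content.Context;
import android.speech.tts.TextToSpeech;
import java.util.Locale;

// Minimal sketch: speak a confirmation with the platform TTS engine.
public class Speaker {

    private final TextToSpeech tts;

    public Speaker(Context context) {
        tts = new TextToSpeech(context, status -> {
            // onInit runs once the engine has connected.
            if (status == TextToSpeech.SUCCESS) {
                tts.setLanguage(Locale.US);
                tts.speak("Contact added", TextToSpeech.QUEUE_FLUSH,
                        null, "confirm");
            }
        });
    }

    public void shutdown() {
        tts.shutdown();
    }
}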