I am currently using a Pixel 4 (API 30) AVD in Android Studio.
I want to activate text to speech in the virtual device's settings, but there does not seem to be a direct way to do this. How can I activate it?
Here's a photo of the settings menu. As you can see, although you can test the text to speech output, I haven't found a way to use it.
Text to speech is different from TalkBack. There is a question on this here
Text to speech is a speech-synthesis facility that apps can use to convert text into spoken audio. One of the Google-provided apps, TalkBack, uses it as a screen reader for those who may find it difficult to read or see the screen.
I think what you want to activate is TalkBack as a screen reader, which means that to use it on the emulator you would have to install TalkBack there. The answers here may help, either by downloading the APK from the store or by simply opening Google Play from the emulator and installing it. I recommend you go through the TalkBack tutorial before diving in, as TalkBack is navigated with gestures. Even then you might not have much success: when it comes to accessibility, swipes seem to have "levels" (for some odd reason, a programmatic adb swipe is handled differently from a physical finger swipe on the screen).
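To illustrate, here is how a programmatic swipe is issued over adb (the coordinates and duration are arbitrary), and how you would sideload a TalkBack build (the APK file name is a placeholder):

# sideload a TalkBack APK onto the emulator (file name is hypothetical)
adb install -r talkback.apk

# a 200 ms left-to-right swipe; TalkBack may not treat this
# like a physical finger gesture
adb shell input swipe 300 800 600 800 200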
I have found it much easier to test accessibility on real devices, as adb controls are limited. I am trying to improve the situation with my code, but it's slow going and TalkBack is far from perfect.
To explain my question, a bit of info about my test setup might help. I have a Moto Z with a Moto Mod projector (my spoilt son's Christmas present). I've now added a gyroscopic Probox2 remote/gamepad, so he can theoretically use his phone while projecting, for films/games, without tapping the phone (which is behind him).
I've connected it and it works to an extent. It works in the core Android UI (home screen, app launcher, settings, etc.). However, it doesn't work at all in most apps. It works in Amazon Prime, for example, but not in Netflix.
I was expecting it to work pretty much seamlessly, as it would on Android TV boxes, even though I'm connecting it to a phone.
I've noticed it seems to identify itself to Android as a keyboard rather than a gamepad, which is understandable, since the gyroscopic "air mouse" functionality wouldn't necessarily make sense coming from a gamepad. The Gboard popup disappears when the remote is connected, even though the remote itself doesn't have an actual keyboard. The remote lets you switch between a sort of gamepad mode and a mouse mode, although in both cases it identifies as a keyboard.
Since it doesn't work out of the box in Android, and somebody would surely have noticed if it didn't work with Netflix on an Android TV, I'm assuming Android TV developers do something to force compatibility in apps that don't accept input from a "keyboard".
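For what it's worth, apps can check which kind of device a key event came from, so a remote that registers only as a keyboard can legitimately be ignored by gamepad-aware code. A sketch of that kind of check (my guess at what an app might do, not any app's actual code):

import android.app.Activity;
import android.view.InputDevice;
import android.view.KeyEvent;

public class RemoteAwareActivity extends Activity {
    @Override
    public boolean onKeyDown(int keyCode, KeyEvent event) {
        // A device that identifies only as SOURCE_KEYBOARD fails both
        // tests below, so its presses may be silently dropped.
        int source = event.getSource();
        boolean gamepadLike =
                (source & InputDevice.SOURCE_GAMEPAD) == InputDevice.SOURCE_GAMEPAD
                || (source & InputDevice.SOURCE_DPAD) == InputDevice.SOURCE_DPAD;
        if (!gamepadLike) {
            return super.onKeyDown(keyCode, event); // plain keyboard handling
        }
        // ... gamepad/D-pad navigation would go here ...
        return true;
    }
}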
Possibly a service that detects "keyboard" presses and simultaneously triggers a "gamepad" press?
That's how I would probably approach it, and I assume that's how the non-root "button remapper" type apps approach it, because they can't interfere with the actual button-mapping file... but it might not be the best/easiest way?
Any ideas?
Having looked into this further, I think I understand.
Apps for Android TV are maintained separately from their mobile counterparts (https://www.apkmirror.com/apk/netflix-inc/netflix-android-tv/netflix-android-tv-3-3-2-build-1530-release/netflix-android-tv-3-3-2-build-1530-android-apk-download/) and it's not possible to side-load them without getting hacky.
So that's basically the answer: the Android TV industry's approach to ensuring compatibility with keyboards, mice, media remotes, etc. is to create separate versions of apps for Android TV which support them. On mobile, developers are presumably mainly interested in ensuring that physical keyboards work in text-input areas, rather than everywhere, which leaves unusual cases like mine unsupported.
Which doesn't help me at all.
The not-so-good ways of doing it... one possible approach is to create a virtual gamepad mapped to the key presses of the physical "keyboard" (i.e. the remote). An example of an app which appears to do just this requires root in order to do so: https://play.google.com/store/apps/details?id=com.locnet.gamekeyboard2&hl=en_GB
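As a rough, untested sketch of what such a remapper presumably does under the hood: listen for the remote's "keyboard" key codes and re-inject them as gamepad buttons. Instrumentation can only inject into your own app's windows unless you hold the system-level INJECT_EVENTS permission, which is why root is required.

import android.app.Instrumentation;
import android.os.SystemClock;
import android.view.InputDevice;
import android.view.KeyCharacterMap;
import android.view.KeyEvent;

public class GamepadInjector {
    // Re-issue a "keyboard" press as a gamepad button press.
    // Must run off the main thread; without root/INJECT_EVENTS it
    // only reaches windows belonging to this app.
    public static void injectButtonA() {
        Instrumentation inst = new Instrumentation();
        long now = SystemClock.uptimeMillis();
        KeyEvent down = new KeyEvent(now, now, KeyEvent.ACTION_DOWN,
                KeyEvent.KEYCODE_BUTTON_A, 0 /* repeat */, 0 /* metaState */,
                KeyCharacterMap.VIRTUAL_KEYBOARD, 0 /* scancode */,
                0 /* flags */, InputDevice.SOURCE_GAMEPAD);
        inst.sendKeySync(down);
        inst.sendKeySync(KeyEvent.changeAction(down, KeyEvent.ACTION_UP));
    }
}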
I want to develop an HTML5 app to control doors, lights and other things in my garage. The frontend will be displayed on a screen that is on 24/7 and really only shows the app (kiosk mode). So there will be only the app: no browser controls like back/forward buttons or a URL bar, and no notification bar or device back and home buttons. Just the app.
For this I considered 2 options:
Doing it with a Raspberry Pi and connecting a touch screen to it
Doing it on a dirt-cheap no-name Android tablet
Since the second option is much more elegant from the hardware point of view (everything I need is already built in), I decided to try it first and bought a 70 USD Prestigio tablet.
But since I'm no Android dev, I'm not sure how I can even modify the stock firmware that's currently installed on the device.
As far as I can see, the bootloader is unlocked and ready for flashing a modified firmware image.
And here comes the question:
How can I get the device's currently installed firmware image? Do I have to contact the manufacturer for this or can I extract it from the device directly?
If I get the firmware image, how will I be able to modify files in it?
Or do you think I've gone down the wrong route?
You need to find firmware sources for your specific device, and I bet the manufacturer won't provide them. For Nexus devices it is easier because AOSP (the Android Open Source Project) gives you the opportunity to build the firmware yourself. You can also check device-specific forums like 4pda.
But there is another way: set up a kiosk mode for your device (example). I'm not sure how that fits your requirements. You can also make your app a launcher app and live with that :)
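As a sketch of that kiosk/launcher route: a single WebView activity with screen pinning gets surprisingly close without touching the firmware. The URL below is a placeholder for wherever your HTML5 frontend is served; startLockTask() needs API 21+ and shows a confirmation prompt unless the app is provisioned as a device owner.

import android.app.Activity;
import android.os.Bundle;
import android.webkit.WebView;
import android.webkit.WebViewClient;

public class GarageKioskActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        WebView webView = new WebView(this);
        webView.getSettings().setJavaScriptEnabled(true);
        webView.setWebViewClient(new WebViewClient()); // keep navigation inside the app
        setContentView(webView);
        webView.loadUrl("http://192.168.0.10/garage"); // placeholder address
    }

    @Override
    protected void onResume() {
        super.onResume();
        startLockTask(); // screen pinning approximates kiosk mode
    }
}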
Preamble
TalkBack is an accessibility service for Android that helps blind and vision-impaired users interact with their devices. It's a screen reader that reads out every user-interface element.
This is great: visually impaired people can use many different apps that have not been specially adapted.
But it does not work for all types of apps. I want to use a widget that handles all touch events itself, even while TalkBack is active.
VoiceOver is TalkBack's counterpart on iOS, and there is a solution for my problem on iOS:
VoiceOver accessibility in a virtual musical instrument iPhone app?
Question
I implemented this solution in my iOS app and it works fine. Is there any equivalent for Android/TalkBack?
[ mySubView setAccessibilityTraits: UIAccessibilityTraitAllowsDirectInteraction ];
Apps that play sound directly in response to touch should be usable with TalkBack on: instrument apps, several games, and apps like mine, which makes a room discoverable through 3D sound.
For a blind person, turning off TalkBack is not an option. It's like turning off the screen for sighted people.
If you think there should be such a function, please upvote.
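One workaround I've seen suggested on Android (untested, and not an official equivalent of UIAccessibilityTraitAllowsDirectInteraction) is to convert the hover events TalkBack delivers during touch exploration back into touch events inside the widget:

import android.content.Context;
import android.util.AttributeSet;
import android.view.MotionEvent;
import android.view.View;

public class DirectTouchView extends View {
    public DirectTouchView(Context context, AttributeSet attrs) {
        super(context, attrs);
    }

    // Under TalkBack's touch exploration, touches arrive as hover
    // events; re-dispatch them as touches so the view reacts directly.
    @Override
    public boolean onHoverEvent(MotionEvent event) {
        switch (event.getAction()) {
            case MotionEvent.ACTION_HOVER_ENTER:
                event.setAction(MotionEvent.ACTION_DOWN);
                break;
            case MotionEvent.ACTION_HOVER_MOVE:
                event.setAction(MotionEvent.ACTION_MOVE);
                break;
            case MotionEvent.ACTION_HOVER_EXIT:
                event.setAction(MotionEvent.ACTION_UP);
                break;
            default:
                return super.onHoverEvent(event);
        }
        return dispatchTouchEvent(event);
    }
}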
I'm a fairly new Android developer working on an app that I would like to have speak text that the app supplies. I have found several examples of using the text-to-speech feature on Android phones, such as this one,
and I've been able to get them to work much of the time, but they don't work consistently on the Eclipse emulator or the HTC phone I use for testing. Problems include the speech working only sometimes, and the app sending the phone to Google Play to download "SpeechSynthesis Data Installer" even though the phone has already demonstrated the ability to do text to speech, and even though the store says the item is already installed on the phone. I won't go into every issue, but I came across several more within a few hours.
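The examples I've followed boil down to roughly this pattern (simplified; the class name and utterance text here are mine):

import java.util.Locale;
import android.app.Activity;
import android.content.Intent;
import android.os.Bundle;
import android.speech.tts.TextToSpeech;

public class SpeakActivity extends Activity implements TextToSpeech.OnInitListener {
    private static final int CHECK_TTS = 1;
    private TextToSpeech tts;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Ask the engine whether voice data is installed before using it.
        startActivityForResult(
                new Intent(TextToSpeech.Engine.ACTION_CHECK_TTS_DATA), CHECK_TTS);
    }

    @Override
    protected void onActivityResult(int requestCode, int resultCode, Intent data) {
        if (requestCode == CHECK_TTS) {
            if (resultCode == TextToSpeech.Engine.CHECK_VOICE_DATA_PASS) {
                tts = new TextToSpeech(this, this); // voice data present
            } else {
                // This is the branch that sends the phone to the store.
                startActivity(new Intent(TextToSpeech.Engine.ACTION_INSTALL_TTS_DATA));
            }
        }
    }

    @Override
    public void onInit(int status) {
        if (status == TextToSpeech.SUCCESS) {
            tts.setLanguage(Locale.US);
            tts.speak("Hello from text to speech", TextToSpeech.QUEUE_FLUSH, null);
        }
    }

    @Override
    protected void onDestroy() {
        if (tts != null) tts.shutdown(); // release the engine
        super.onDestroy();
    }
}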
My question: has anyone put the text-to-speech feature into an app in the marketplace or on multiple phones and found it to be relatively reliable, simple and straightforward, or is it an unreliable and inconsistent feature no matter how well it is implemented? Thanks.
There appear to be no accessibility features in the Android emulator. Ideally one would be able to have their computer read the contents of the emulated Android screen aloud. From what I've seen, the contents of the Android screen, and the buttons used to manipulate the emulated device, are all invisible to a screen reader.
Does anyone know of a workaround for this?
I found what looks like a promising resource here: a text-to-speech library for Android developed by T. V. Raman of Google. I'm still looking for more information from the community, though.
I'm updating my answer with my experiences. I bought a refurbished first-generation Nexus 7 to try to learn Android programming. Installing the Android SDK with the bundled Eclipse was completely accessible, and I was able to enable accessibility on my Nexus 7 with no sighted help. Enabling developer settings on the Nexus was also fully accessible.

I was able to create an Android project in Eclipse with no problems. I was unable to use the graphical layout editor to add widgets to a layout, although I was able to edit the XML to create a button with no issues. It looks like layouts are doable; you will just have to reference the docs for the proper XML a lot.

I created a method to be called when the button was clicked, with a for loop so I could test debugging. I debugged the application on my Nexus, set a breakpoint in the body of the for loop, and was able to use standard Eclipse functions to step line by line once the breakpoint was hit and to view variable values. So far Android accessibility is looking good for the standard Android SDK. I am planning on testing out Android Studio and will update my answer with the results.
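For reference, hand-writing a layout like the one described takes only a few lines of XML; the widget ID and label below are placeholders, not from my actual project:

<!-- res/layout/main.xml: a minimal hand-written layout with one button -->
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:orientation="vertical">

    <Button
        android:id="@+id/test_button"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:text="Run test loop"
        android:onClick="onTestButtonClick" />
</LinearLayout>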
A long thread on this can be found at
http://www.freelists.org/post/programmingblind/Is-Android-Programming-Accessible
What I've gathered from it is that accessibility can be enabled with little to no sighted help. When I tried enabling TalkBack it made the emulator unusably slow, although this was over a year ago, so maybe things have gotten better? I'm a blind programmer and know Eclipse is accessible with JAWS, so he should be able to program with either an IDE or the command line and a text editor. I haven't researched this, but if the emulator is slow, maybe another option would be to run an x86 build of Android in VMware Player? A screen reader written by Google employees can be found at
http://google-opensource.blogspot.com/2009/10/talkback-open-source-screenreader-for.html
and one written by someone else can be found at
http://spielproject.info/
One option might be to do debugging on a real phone with accessibility turned on. Debugging works essentially the same (and you don't have to deal with the slowness of the emulator - I much prefer this method because it's so much faster).
It's surely a more expensive option if your friend doesn't already have an android phone.
I think the better question might be "why are the accessibility features missing from the android emulator"? Maybe text-to-speech is too slow on the emulator?