I'm looking for some help on how to make an AIR application accessible to the blind. It's required that I use the screen reader functionality that comes built into Android, called TalkBack.
After researching I have failed to find anything useful. I did find out that you can check whether the device has accessibility aids using Capabilities.hasAccessibility(); however, that returns false even when the device does have aids and they are turned on (Accessibility.active is also false when TalkBack is active). I also found out that you can give a DisplayObject AccessibilityProperties, such as a name and a description, which I assumed the screen reader would use. However, it doesn't work. I have also called Accessibility.updateProperties() after adding the properties, and still no luck.
I also tried adding the READ_PHONE_STATE permission to the manifest (although I'm not really sure that's what it's meant for), but again no joy.
I'm using ActionScript 3 only, not Flex.
As best I can tell, Adobe has decided to ignore the disabled community when it comes to mobile apps built with AIR. There does not appear to be any way to implement accessibility in such applications.
I would love to be wrong about this, but as it stands now, I am pretty certain that this is the case.
From searching the past few days and running a bunch of tests, it seems that the accessibility features you can build into AIR applications only work in browser-based apps. No stand-alone style of distribution (.app/.dmg/.exe/.air) seems to work with any of the accessibility tools we have tested; we tried Window-Eyes and the built-in Narrator in Windows 7. I have been unable to find any Adobe documentation confirming this, but it seems like a huge oversight for them not to mention it in any documents.
Related
I recently started looking into automating accessibility testing on Android. There isn't much information out there on the web.
Has anyone explored this, or is anyone currently doing it? If so, can you share your ideas/approach?
It seems that Android's uiautomator relies on accessibility features working, but it doesn't support testing accessibility itself. If it relies on accessibility features, does that mean basic validation, like checking that an accessible label exists, can be done just by executing UI tests with uiautomator?
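To make that concrete, here is the sort of check I'm imagining: an untested sketch assuming the androidx.test.uiautomator UiObject2 API (the test class name is mine), which fails if any ImageView on the current screen has no accessible label:

```java
import static org.junit.Assert.assertNotNull;

import androidx.test.platform.app.InstrumentationRegistry;
import androidx.test.uiautomator.By;
import androidx.test.uiautomator.UiDevice;
import androidx.test.uiautomator.UiObject2;
import java.util.List;
import org.junit.Test;

public class LabelExistsTest {

    @Test
    public void everyImageViewHasAContentDescription() {
        UiDevice device = UiDevice.getInstance(
                InstrumentationRegistry.getInstrumentation());
        // uiautomator reads these values through the accessibility layer,
        // which is presumably why it needs accessibility to be working.
        List<UiObject2> images =
                device.findObjects(By.clazz("android.widget.ImageView"));
        for (UiObject2 image : images) {
            assertNotNull("ImageView is missing a contentDescription",
                    image.getContentDescription());
        }
    }
}
```

If uiautomator can read contentDescription values like this, then it is getting them through the accessibility layer, which would answer my basic-validation question.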
This is a new area for me so any information could be helpful.
Here's a great introduction to Accessibility testing in Android. It basically boils down to:
Manually test your app for visual issues with Accessibility Scanner
Turn on TalkBack and manually test your app to find issues that affect screen reader users
To find font scaling and layout issues, use Large Text
Definitely run lint checks, but make sure that 'Image without contentDescription' is set to Severity = Error (see the lint.xml sketch after this list)
For any accessibility issue you find, or that keeps recurring, write an Espresso test that fails if that issue is ever violated again (see the sketch after this list)
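For the lint item above: the relevant check id is ContentDescription, and pinning its severity in lint.xml keeps the whole team honest. A minimal sketch:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<lint>
    <!-- Fail the build when an ImageView has no contentDescription. -->
    <issue id="ContentDescription" severity="error" />
</lint>
```

For the Espresso item: Espresso ships an accessibility integration that runs the Accessibility Test Framework against every view a ViewAction touches. A hedged sketch, assuming the androidx espresso-accessibility artifact (R.id.submit is a placeholder for one of your own views):

```java
import static androidx.test.espresso.Espresso.onView;
import static androidx.test.espresso.action.ViewActions.click;
import static androidx.test.espresso.matcher.ViewMatchers.withId;

import androidx.test.espresso.accessibility.AccessibilityChecks;
import org.junit.BeforeClass;
import org.junit.Test;

public class AccessibilityRegressionTest {

    @BeforeClass
    public static void enableAccessibilityChecks() {
        // Every subsequent ViewAction now also validates accessibility rules
        // (missing labels, tiny touch targets, low contrast, ...).
        AccessibilityChecks.enable().setRunChecksFromRootView(true);
    }

    @Test
    public void submitButton_passesAccessibilityChecks() {
        // R.id.submit is a placeholder; any interaction you drive in a test
        // now doubles as an accessibility regression check.
        onView(withId(R.id.submit)).perform(click());
    }
}
```

Once enabled this way, a plain functional Espresso suite quietly becomes the "fail when the issue reappears" net described above.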
For automation, you'll also need to consider how to perform visual validation of certain screen artifacts, and audio analysis if hearing-impaired functionality is a requirement.
Also, I recommend watching this presentation from GTAC 2015 on accessibility testing for some great context on the topic.
For automated tests that check for accessibility, I'd very much recommend starting with issues that can be identified in elements that are shared across screens (menus, layouts, themes, custom controls). While they won't catch the one-off errors that will occasionally pop up, they'll address issues that happen everywhere in your app, a 'prioritize by volume' approach if you will.
Also, if your team uses Android Studio, then you definitely want to push for the ability to write Espresso tests that reside with the code. QA are part of the development process, period. Getting access to a subfolder where your tests reside shouldn't be a problem unless there's some legal baloney to deal with. For instance, split out the 'androidTest' folder as a submodule where you have pull/push rights as a tester, but only read rights to the rest of the app, so you can compile and run it yourself. If you're writing Appium tests, it may be harder to ask your dev team to run them as part of their own BVT/smoke test process during builds, but it's not unheard of.
As for visual analysis and audio injection/confirmation, these are advanced capabilities that you'll probably need to use some service or commercial tool for.
Best of luck!
I agree with Paul's answer in its entirety, and he links to some extremely helpful resources (so please have a look at them!). But if all you're looking for is basic accessibility test coverage, as you suggest (e.g. checking for accessible labels on all your components), your use case might be a good fit for something like Continuum for Mobile, specifically the Android variant. You can then do more manual passes once you've fixed the basic violations that automated tools can detect; as of right now, manual testing is always necessary for full compliance with accessibility standards, but something like this will get you closer.
I have an iOS app that I want to convert to Android. It mostly uses UITableView and other basic UI objects; nothing fancy beyond that.
There are several tools out there that claim to translate iOS apps to Android automatically.
I personally tried Apportable, StellaSDK, and also Intel App-Porter, but I didn't get any of them to work, even with the simplest possible "hello world" iOS app using a XIB.
Has anybody ever tried these, or any other tool that converts iOS apps to Android, successfully on a non-game iOS app?
Thanks
As of January 2014:
In any case it's never "automatic"; there is always some amount of work involved, as these porting kits lag behind Apple and their implementations are incomplete.
Apportable: promising, but despite what their website says, it's not quite there yet; it is still under development (see https://groups.google.com/forum/#!forum/apportable-discuss).
StellaSDK: trying it right now with no luck. The samples don't work or don't compile, and there is virtually no information on the net, even here on SO. The only thing the sample apps seem able to do is phone home (morningtec.cn) when you run them, and I don't think you can disable that easily.
App-Porter: never tried it, but I don't believe auto-translating Objective-C to HTML5 is such a great idea, on many levels.
Yeah, the picture is grim.
Long story short: How open is Android OS for developers?
In a little more depth:
For instance, if I'm willing to write my own text-input interface, would that be possible (as in, completely replacing the built-in one)? Or is that a core feature that cannot be changed?
And is there a difference for developers between buying a Google phone, an HTC, a Samsung, etc.?
P.S. If all that is possible, do such changes void the warranty?
Hope I've made myself clear and thanks in advance!
You can create a custom input method. For development, a Google phone is best, but in order to work on all hardware, especially keyboards, you need to test on all the devices you want to support (try to borrow them instead of buying all of them :P).
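To give a feel for how approachable this is, here is a bare-bones sketch using the standard InputMethodService API (the class name and layout resource are placeholders of mine):

```java
import android.inputmethodservice.InputMethodService;
import android.view.View;
import android.view.inputmethod.InputConnection;

// A minimal custom keyboard: the system shows the view returned by
// onCreateInputView() in place of the built-in keyboard.
public class MyKeyboardService extends InputMethodService {

    @Override
    public View onCreateInputView() {
        // R.layout.keyboard_view is a placeholder for your own key layout.
        return getLayoutInflater().inflate(R.layout.keyboard_view, null);
    }

    // Call this from your key-press handling to type into the focused field.
    private void typeText(String text) {
        InputConnection ic = getCurrentInputConnection();
        if (ic != null) {
            ic.commitText(text, 1);
        }
    }
}
```

The service is declared in AndroidManifest.xml with the android.permission.BIND_INPUT_METHOD permission and an intent filter for android.view.InputMethod; the user then picks it in the system's language and input settings. Since this is an ordinary app-level API, no warranty-voiding modifications are needed for this particular case.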
For Android you can always do that: you need to download the source code from the Google repository, make the changes you want to the component, build the code, and flash it to the device.
So you can always add your own customized component to open-source Android.
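For reference, the usual AOSP flow looks roughly like this (a hedged sketch: the branch and lunch target depend on your device, and flashing requires an unlocked bootloader):

```sh
# Fetch the AOSP source tree (pick the branch that matches your device).
repo init -u https://android.googlesource.com/platform/manifest -b <branch>
repo sync

# Set up the build environment and pick a device target.
source build/envsetup.sh
lunch <your_device_target>   # e.g. aosp_arm-eng

# Build, then flash the resulting images over USB.
make -j8
fastboot flashall
```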
I am developing a mobile app in Flash Builder 4.5 based on an ActionScript Mobile Project, i.e. no Flex, just pure ActionScript.
Is there a listView component that I can use from a pure ActionScript project? (I.e. a list of items where each item has a picture and some text, and you can flick the list up and down with your finger and select an item to proceed to the next screen.)
After much searching I just don't seem to be able to find one, but since this sort of thing must surely be available in a platform for developing mobile apps, I can only assume I have missed something obvious. I think there is something like this in a Flex library(?), but I am not sure how to access it, or whether I even can/should from a pure ActionScript project.
(After much searching I sat down and wrote one myself; it seems to work fine and replicates the 'real' thing quite nicely for my purposes. However, I assume someone else will have done it better, so I would like to find the real thing if I can.)
There are a few AS3 libraries available that you may want to check out:
MadComponents: http://madskool.wordpress.com/ & http://code.google.com/p/mad-components/
I'm testing the MadComponents library right now and it looks promising. Super easy to get up and running. It is lacking in documentation, but I suspect that will change soon, as I've been talking with the creator.
AS3Flobile: http://custardbelly.com/blog/category/as3flobile/ & https://github.com/bustardcelly/as3flobile
Looks really nice, and I've made a quick test with it. It has an additional dependency on the AS3 Signals library. It's a bit more involved to get the basic shell of a program up and running, and it has limited skinning ability from what I gather.
HTH.
There are no built-in controls if you are building an ActionScript mobile project. However, you have a large assortment of controls if you build a Flex-based AIR application.
What are the key differences between Android, iOS and BlackBerry OS in terms of the level of access they give application developers (i.e. access to the video input, sound input, phone functionality, and to what extent, etc.)?
PS: Assume latest version of each OS.
EDIT: Can someone turn this into a wiki so we can compile answers from people who don't necessarily have experience in all 3 platforms?
I'm not familiar with BlackBerry, but on Android and iOS you can access just about anything. Until recently iOS had some restrictions on camera access (see this), but I believe those have been resolved. Because Android is open source, you can theoretically go as deep as you want in accessing the hardware, but whether the standard Android API gets you any deeper than the iOS API does is another matter.
On Android, you can do a lot more to override default functionality. For example, you can create your own launcher (home screen) or phone application; the iOS approval process wouldn't allow those kinds of applications.
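For instance, a replacement launcher is just an app whose activity answers the HOME intent; a minimal manifest sketch (the activity name is a placeholder):

```xml
<activity android:name=".MyLauncherActivity">
    <intent-filter>
        <!-- Answering HOME lets the user pick this app as the device launcher. -->
        <action android:name="android.intent.action.MAIN" />
        <category android:name="android.intent.category.HOME" />
        <category android:name="android.intent.category.DEFAULT" />
    </intent-filter>
</activity>
```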
API-level hardware access really isn't an issue on either platform; the bigger concerns are overriding default software (almost never possible on iOS) and the types of applications iOS allows at all.
Each platform has its own nice and bad parts. I have been working on both Android and BB, and I wish I could take only the nice parts from both to create the platform of a dev's dream! :)
For instance, I could take these features from BB:
The greatest feature I like in BB is the simplicity of the application architecture: you can always count on your main UiApplication instance, because the OS never kills it.
I also like the simplicity the Dialog class provides: it is very easy to implement business logic around a user's choice, because while the Dialog screen is shown, code execution simply stops and waits for the user's input.
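As a sketch of that blocking style (from the legacy net.rim.device.api.ui.component.Dialog API; this fragment would run on the event thread):

```java
import net.rim.device.api.ui.component.Dialog;

// Dialog.ask() blocks until the user answers, so the business logic can
// read the result on the very next line.
int answer = Dialog.ask(Dialog.D_YES_NO, "Discard unsaved changes?");
if (answer == Dialog.YES) {
    // Only runs after the user has actually chosen Yes.
}
```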
From Android I'd take the following:
Network communication. On BB this is a real nightmare (BES, BIS, WIFI, Direct TCP without APN, Direct TCP with APN, WAP, WAP2, Unite - who's next? :)).
For file manipulation you just use the usual native Java API.
Nice-looking UI components are available right out of the box.
I should add that I'm not happy with the GPS-related stuff on either platform, though maybe that is due to GPS hardware limitations rather than the API creators.
Thanks!
BlackBerry is a pain. I once made a project for it (the JDE version was 4.7 back then) and it didn't have an ArrayList. WTF?