I am developing an accessibility-based application that lets users operate it hands-free with a Bluetooth keyboard. What I need now is for users to be able to zoom in and out of the application (and possibly the whole Android system) using a certain key combo (say, Ctrl+Alt+?).
The zoom should work something like the Magnification Gesture feature in Android's accessibility settings. That magnifier is activated by a triple-tap (which I want to trigger with a key combo instead) and panned with a two-finger swipe (which would probably map to the keyboard's direction keys).
The only thing I have found even remotely related to doing something like this is the Android Accessibility Service, but I don't think it lets me capture a keyboard key combo and then zoom the screen.
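For reference, this is roughly the kind of hook I was hoping exists — a rough, unverified sketch assuming an API 24+ accessibility service declared with android:canRequestFilterKeyEvents="true" and android:canControlMagnification="true" (the class name and the Ctrl+Alt+Z combo are just placeholders):

```java
import android.accessibilityservice.AccessibilityService;
import android.view.KeyEvent;
import android.view.accessibility.AccessibilityEvent;

public class KeyZoomService extends AccessibilityService {

    private boolean zoomed = false;

    @Override
    protected boolean onKeyEvent(KeyEvent event) {
        // Toggle magnification on Ctrl+Alt+Z (placeholder combo).
        if (event.getAction() == KeyEvent.ACTION_DOWN
                && event.isCtrlPressed()
                && event.isAltPressed()
                && event.getKeyCode() == KeyEvent.KEYCODE_Z) {
            zoomed = !zoomed;
            // Requires canControlMagnification; available from API 24.
            getMagnificationController().setScale(zoomed ? 3.0f : 1.0f, true);
            return true; // consume the combo so it doesn't reach the app
        }
        return super.onKeyEvent(event);
    }

    @Override
    public void onAccessibilityEvent(AccessibilityEvent event) { }

    @Override
    public void onInterrupt() { }
}
```

If something along these lines is actually supported, that would cover the zoom toggle; panning with the direction keys would presumably use setCenter() in the same way.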
Is there any way to do this on Android, or do I need to go to AOSP and build my own version of Android? Please help!
Related
Both Android and iPhone:
Can a screen overlay be used to activate parts of the screen underneath it, thereby simulating user actions on the underlying action areas, e.g. scrolling?
I found this floating window library that may do the trick on Android; wondering if anyone else has suggestions?
Can a screen overlay be used to activate parts of the screen underneath it, thereby simulating user actions on the underlying action areas, e.g. scrolling?
No, on Android. Faking input into other apps has significant security ramifications. On Android, accessibility services have some limited ability to do this sort of thing, which is why Google is starting to restrict their distribution. And, of course, on a rooted device you will have more options.
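For completeness, this is roughly what that limited accessibility-service route looks like — a sketch assuming API 24+ and a service declared with android:canPerformGestures="true" (the class name and coordinates are illustrative only):

```java
import android.accessibilityservice.AccessibilityService;
import android.accessibilityservice.GestureDescription;
import android.graphics.Path;
import android.view.accessibility.AccessibilityEvent;

public class ScrollInjectorService extends AccessibilityService {

    // Inject a swipe so the foreground app scrolls.
    public void swipeUp() {
        Path path = new Path();
        path.moveTo(500, 1500);   // start near the bottom of the screen
        path.lineTo(500, 500);    // drag toward the top
        GestureDescription gesture = new GestureDescription.Builder()
                .addStroke(new GestureDescription.StrokeDescription(path, 0, 300))
                .build();
        dispatchGesture(gesture, null, null);
    }

    @Override
    public void onAccessibilityEvent(AccessibilityEvent event) { }

    @Override
    public void onInterrupt() { }
}
```

The user still has to enable such a service explicitly in the accessibility settings, which is exactly the restriction point mentioned above.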
I moved from iPhone to Android (Samsung S7) and am missing one feature. On the iPhone I could set the clock and timer directly from the lock screen by pressing a button on the quick access panel (shown in the picture with the arrow). Is it possible to get similar quick access to the Clock app on Android from the lock screen, either via the quick access panel or in some other way such as a third-party app? The aim is to open the Clock app from the lock screen quickly, without unlocking. Same question for the Calculator app.
On Android you can do almost anything with apps.
I found some apps here that let you create widgets on your lock screen: you tap one, unlock the screen, and the app launches:
https://www.maketecheasier.com/launch-apps-lockscreen-android/
I own a OnePlus, so I know they have this option built in.
You can also use the Xposed modules "GravityBox" or "LockscreenMods".
(It is not possible to bypass the lock screen through a lock screen widget!)
I'm using ARC Welder to turn my Android app into a Chrome OS app. Most of it works perfectly, except that text selection with a mouse behaves like it would on a touch device, requiring long-clicking or double-clicking on words, and then dragging the ends. Is there a way around this?
As CommonsWare points out, this is the Android behavior.
Feel free to file a bug even for a feature request.
We are open to allowing the behavior to be changed, but it also seems like something that should be left up to the end user (how do they expect to interact with an Android app on a Chromebook?).
I would like to create an Android Accessibility Application/Service.
This accessibility app would be able to magnify any screen image produced by any application resident on the Android device.
For example, I would like to be able to magnify:
the home screen
the Settings menu and sub-menus
text, images, icons, etc.
I've googled and searched the Android dev docs for hints/tips/ideas.
Sadly I've hit a dead end.
Is this type of Accessibility application impossible to develop on Android?
Jelly Bean (Android 4.2) apparently has this functionality built in; see this release article detailing new features in Jelly Bean: "Accessibility: Enable screen magnification to easily zoom or pan the entire screen to get a closer look. Visually impaired users can now enter full-screen magnification with a triple-tap on the screen"
Typically on mobile operating systems these features are built into the OS and are not something a third party can write. That is partly for security reasons: a magnifier would have access to the graphical output of other apps, so it could in theory send screenshots containing sensitive information back to base on the sly. It is also partly because magnification is complex: it involves interfering with normal video output and with touch input (touch input has to be scaled inversely to the way the graphics are scaled, so that touching a magnified button goes to the right place).
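To make that last point concrete, here is a toy illustration of the inverse mapping a magnifier has to apply to touch coordinates (plain arithmetic, not an Android API; all names are made up):

```java
final class MagnifierMath {
    // A touch at screen position (tx, ty), while the display is magnified
    // by 'scale' around the fixed point (cx, cy), maps back to this point
    // in the unmagnified content.
    static float[] touchToContent(float tx, float ty,
                                  float scale, float cx, float cy) {
        return new float[] {
                cx + (tx - cx) / scale,
                cy + (ty - cy) / scale
        };
    }
}
```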
There may be a way of doing this if you are prepared to root your device and poke around at the OS/driver level, but that's not going to help much if you want an app you can put in the store.
We’re porting some interactive iOS apps used to teach young children with learning disabilities to Android. We have hit a major usability issue: we can't figure out how to disable the physical or on-screen navigation buttons (Home and Recent Apps).
Before anyone says “you don’t want to do that”, we fully understand why you would always want these buttons enabled for an able-bodied adult, but these children pose a unique set of accessibility issues. Specifically:
Their fine motor control may be poor - they may inadvertently touch a different area of the screen to the area they intend, or accidentally use more than one finger at once.
They may have weak muscle tone and poor physical strength – so e.g. the bottom of the palm of their hand may drop and touch the screen while trying to just use a finger.
They struggle to achieve and easily become disheartened or disruptive if they fail.
For instance, a typical 5 year old child with Down syndrome will accidentally drop out of the app they are using as a result of inadvertently touching the Home button: when this happens repeatedly, and the adult teacher or parent has to go back into the app for them repeatedly, the child loses interest and focus. Another typical scenario is a young child with Autism, who may freak out completely and need physically restraining if this happens while using their favourite app. Also, many disabled children will try to poke any other button they can find, in search of a response. In any of these situations, a potentially valuable educational session may have to be completely abandoned.
We're aware of SYSTEM_UI_FLAG_HIDE_NAVIGATION and SYSTEM_UI_FLAG_LOW_PROFILE, but these only reduce the visibility of the on-screen buttons until the child touches some other part of the screen, and then they re-appear in a way that’s more distracting than if they were visible all the time.
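For reference, this is roughly how we are applying those flags at the moment (a simplified sketch; LessonActivity is just a placeholder name):

```java
import android.app.Activity;
import android.view.View;

public class LessonActivity extends Activity {
    @Override
    protected void onResume() {
        super.onResume();
        // Dim/hide the on-screen buttons; they still reappear the moment
        // the child touches any other part of the screen.
        getWindow().getDecorView().setSystemUiVisibility(
                View.SYSTEM_UI_FLAG_LOW_PROFILE
                | View.SYSTEM_UI_FLAG_HIDE_NAVIGATION);
    }
}
```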
On iOS there is the “Guided Access” feature that solves this problem trivially. Can we emulate anything similar on Android?
On iOS there is the “Guided Access” feature that solves this problem trivially.
Guided access appears to be a device setting, not something that developers enable unilaterally themselves, thank heavens.
Can we emulate anything similar on Android?
There is no similar device setting in stock Android.
You can download the Android source code, modify it as you see fit, build the results into a ROM mod, and install that ROM mod on devices as you see fit.
Or, you can perhaps work with a device manufacturer creating tablets aimed at children to see if either they have already added this capability to their devices, or would be willing to work with you to add such a capability in a future iteration of their devices.