Recently, Android Wear introduced wrist gestures so the user can navigate between apps/notifications without touching the screen.
I already had a working app, but it kept being pushed off the screen by these gestures. I tried it with a sample made by Google itself (Googlesamples-JumpingJack), and it has the same problem.
The purpose of these apps is movement, but that movement triggers the gestures. How can I disable the gestures while my app is running, so that the Activity stays in the foreground?
As far as I understand, you can (only?) disable gestures in the watch's settings menu. I'm struggling with disabling one or two specific gestures myself, and so far I've only found that you can enable or disable them all at once. I haven't found anything on how to do this from code rather than from the settings.
Background
I work on an app that has a floating UI shown over other apps (using SAW permission, AKA "display over other apps").
It also handles touch events that might come from the sides.
The problem
It seems that on some devices, when gesture navigation is enabled, touches near the screen edges may be given to the system navigation instead of the app.
What I've tried
I want to change the UI so that, at the very least, it becomes larger and easier to touch, but only when the OS is currently configured for gesture navigation.
Sadly, all the gesture-related questions I've found assume you are already inside an Activity and offer ways to handle the protected areas. Nothing I've found covers simple detection, let alone without a View/Activity/Window.
The question
Using only basic classes outside of Activity/View, how can I detect whether the device is set to gesture navigation (including a callback for when that setting changes)?
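Not an official API, but one approach I've seen is to read the framework's internal config_navBarInteractionMode integer resource (0 = 3-button, 1 = 2-button, 2 = gestures, API 29+) and register a ContentObserver on the undocumented Settings.Secure key "navigation_mode" for change callbacks. Both need only a Context, no Activity/View/Window, but since both identifiers are internal they may be absent on some OEM builds. A rough sketch:

    import android.content.Context;
    import android.database.ContentObserver;
    import android.os.Handler;
    import android.os.Looper;
    import android.provider.Settings;

    public final class NavModeDetector {
        public static final int MODE_3_BUTTON = 0;
        public static final int MODE_2_BUTTON = 1;
        public static final int MODE_GESTURE = 2;

        // Reads the framework's internal "config_navBarInteractionMode"
        // integer resource; falls back to 3-button if it doesn't exist.
        public static int currentMode(Context context) {
            int id = context.getResources().getIdentifier(
                    "config_navBarInteractionMode", "integer", "android");
            return id > 0 ? context.getResources().getInteger(id) : MODE_3_BUTTON;
        }

        // Watches the (undocumented) Settings.Secure key "navigation_mode"
        // and runs the callback whenever the user switches modes.
        // Keep the returned observer so you can unregister it later.
        public static ContentObserver observe(Context context, Runnable callback) {
            ContentObserver observer =
                    new ContentObserver(new Handler(Looper.getMainLooper())) {
                @Override
                public void onChange(boolean selfChange) {
                    callback.run();
                }
            };
            context.getContentResolver().registerContentObserver(
                    Settings.Secure.getUriFor("navigation_mode"), false, observer);
            return observer;
        }
    }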
I am developing an Android application for deaf and blind users, built on the accessibility service.
I need to disable the touch screen, because the app is controlled through a third-party device.
I tried adding a view to the WindowManager and disabling everything there (FLAG_NOT_FOCUSABLE, FLAG_NOT_TOUCHABLE), but as far as I understand, that only disables touch within that view, not globally.
I can request any type of permission except root.
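For reference, here is roughly what that attempt looks like, with comments on why it can't block everything; the window type/flag choices are my reconstruction, not the original code:

    import android.content.Context;
    import android.graphics.PixelFormat;
    import android.os.Build;
    import android.view.View;
    import android.view.WindowManager;

    public final class OverlayAttempt {
        // Adds a full-screen overlay (needs the SYSTEM_ALERT_WINDOW permission).
        // With FLAG_NOT_TOUCHABLE the overlay passes every touch through to
        // whatever is underneath, so it blocks nothing; without it (as here),
        // the view swallows touches over its own area, but the status bar,
        // navigation bar, and system gestures stay interactive either way.
        public static View addBlockingView(Context context) {
            WindowManager wm =
                    (WindowManager) context.getSystemService(Context.WINDOW_SERVICE);
            WindowManager.LayoutParams lp = new WindowManager.LayoutParams(
                    WindowManager.LayoutParams.MATCH_PARENT,
                    WindowManager.LayoutParams.MATCH_PARENT,
                    Build.VERSION.SDK_INT >= Build.VERSION_CODES.O
                            ? WindowManager.LayoutParams.TYPE_APPLICATION_OVERLAY
                            : WindowManager.LayoutParams.TYPE_SYSTEM_ALERT,
                    WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE,
                    PixelFormat.TRANSLUCENT);
            View view = new View(context);
            view.setOnTouchListener((v, event) -> true); // consume touches on the view
            wm.addView(view, lp);
            return view;
        }
    }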
EDIT:
Due to Android's security model, disabling the touchscreen is not possible without root, but you can use the proximity sensor to turn the screen (and touch input) off until the power button is next pressed.
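A minimal sketch of that workaround, assuming the device supports the proximity wake-lock level (public since API 21; requires the WAKE_LOCK permission; the class and tag names are mine):

    import android.content.Context;
    import android.os.PowerManager;

    public final class TouchBlocker {
        private final PowerManager.WakeLock wakeLock;

        public TouchBlocker(Context context) {
            PowerManager pm =
                    (PowerManager) context.getSystemService(Context.POWER_SERVICE);
            // While this lock is held, the screen (and with it all touch
            // input) turns off whenever the proximity sensor is covered.
            wakeLock = pm.newWakeLock(
                    PowerManager.PROXIMITY_SCREEN_OFF_WAKE_LOCK, "myapp:proximity");
        }

        public void enable() {
            if (!wakeLock.isHeld()) wakeLock.acquire();
        }

        public void disable() {
            if (wakeLock.isHeld()) wakeLock.release();
        }
    }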
Congratulations and best wishes; you are working for a good cause.
I'm not sure exactly how to do this, but one way is to create a full-screen activity and hide the navigation bar and the soft navigation buttons as well; please refer to this documentation.
And 2: override onBackPressed() and never allow it to call super.onBackPressed(), or relaunch your app in the onPause() method. A rough sketch of both ideas follows.
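Here is a minimal sketch of both ideas (the activity name is illustrative; sticky immersive mode needs API 19+):

    import android.app.Activity;
    import android.os.Bundle;
    import android.view.View;

    public class KioskActivity extends Activity {
        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            // Hide the status bar and soft navigation buttons; "sticky
            // immersive" re-hides them after the user swipes them in.
            getWindow().getDecorView().setSystemUiVisibility(
                    View.SYSTEM_UI_FLAG_IMMERSIVE_STICKY
                            | View.SYSTEM_UI_FLAG_FULLSCREEN
                            | View.SYSTEM_UI_FLAG_HIDE_NAVIGATION
                            | View.SYSTEM_UI_FLAG_LAYOUT_STABLE
                            | View.SYSTEM_UI_FLAG_LAYOUT_FULLSCREEN
                            | View.SYSTEM_UI_FLAG_LAYOUT_HIDE_NAVIGATION);
        }

        @Override
        public void onBackPressed() {
            // Swallow back presses: deliberately do NOT call super.onBackPressed().
        }
    }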
Hope this helps somewhat.
I have an app that can emulate the back button via the accessibility service, by extending AccessibilityService and using the following:
performGlobalAction(AccessibilityService.GLOBAL_ACTION_BACK)
This works fine in all cases but one.
Problem
One of my users told me that it works on his Android P device as long as the software buttons are enabled, but if he uses the new system gesture navigation (as explained here: https://www.androidcentral.com/how-enable-android-p-gesture-navigation-system), this action no longer works.
Question
Any ideas why this happens, or how I can solve it? Even detecting whether gesture navigation is enabled would help a little...
I am developing an accessibility-based application that lets users operate the app hands-free with a Bluetooth keyboard. What I need now is for users to be able to zoom in and out of the application (and possibly the whole Android system) using a certain key combo (say Ctrl+Alt+?).
The zoom should work something like the Magnification Gestures feature in Android's accessibility settings. That magnification activates on a triple-tap (which I want to trigger with a key combo instead) and pans with a two-finger swipe (which would probably map to the keyboard's direction keys).
The only thing I have found even remotely related to this is the Android Accessibility Service, but I don't think it lets me capture a keyboard key combo and then zoom in on the screen.
Is there any way to do this in Android? Or do I need to go to AOSP and build my own version of the Android OS? Please help!
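For what it's worth, the accessibility APIs do seem to cover both halves of this since API 24: a service can filter hardware key events and drive the same full-screen magnifier that the triple-tap gesture uses. A rough sketch, assuming the service's XML config declares android:canRequestFilterKeyEvents="true" plus flagRequestFilterKeyEvents and android:canControlMagnification="true"; the key choices are just examples:

    import android.accessibilityservice.AccessibilityService;
    import android.view.KeyEvent;
    import android.view.accessibility.AccessibilityEvent;

    public class KeyboardZoomService extends AccessibilityService {
        @Override
        protected boolean onKeyEvent(KeyEvent event) {
            if (event.getAction() == KeyEvent.ACTION_DOWN
                    && event.isCtrlPressed() && event.isAltPressed()) {
                MagnificationController mc = getMagnificationController();
                if (event.getKeyCode() == KeyEvent.KEYCODE_EQUALS) {
                    // Zoom in, capped at 8x; panning could similarly be
                    // mapped to direction keys via mc.setCenter(...).
                    mc.setScale(Math.min(mc.getScale() * 1.5f, 8f), true);
                    return true; // consume the combo
                }
                if (event.getKeyCode() == KeyEvent.KEYCODE_MINUS) {
                    mc.reset(true); // back to 1x
                    return true;
                }
            }
            return false; // let all other keys through
        }

        @Override
        public void onAccessibilityEvent(AccessibilityEvent event) { }

        @Override
        public void onInterrupt() { }
    }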
I have a laptop running Windows 8.1 with a touch-enabled screen.
I am using ARC to run an APK on the laptop.
Touch works fine in ARC Welder itself, but after launching the APK, touch events do not work on the app's screen; only mouse events do.
I have tried multiple APKs and they all behave the same.
Do I have to enable something to make touch events work?
This is a bug that has been reported here:
https://code.google.com/p/chromium/issues/detail?id=480745
So a way of getting around this, at least for map applications, is to code the APK to handle the event explicitly, and you must set the ARC metadata tag
"enableSynthesizeTouchEventsOnWheel": false. This stops the runtime from trying to interpret your scroll wheel as a touch instead of the SOURCE_CLASS it actually is.
So for an input device of SOURCE_CLASS_POINTER, you have to handle ACTION_SCROLL so that a mouse scroll wheel lets you perform actions; a sketch is below. There are plenty of examples of how to do this all over Stack Overflow; just disable that metadata flag, and most of the code you normally use to make things work in Android will work here.
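A minimal sketch of that handling (the onWheel hook is hypothetical; wire it to whatever the app should do, e.g. map zoom):

    import android.view.InputDevice;
    import android.view.MotionEvent;

    public class ScrollWheelHandler {
        // Call this from Activity#onGenericMotionEvent or a View's
        // onGenericMotionEvent override; returns true if consumed.
        public boolean handle(MotionEvent event) {
            if (event.isFromSource(InputDevice.SOURCE_CLASS_POINTER)
                    && event.getAction() == MotionEvent.ACTION_SCROLL) {
                // Positive = wheel up, negative = wheel down.
                onWheel(event.getAxisValue(MotionEvent.AXIS_VSCROLL));
                return true;
            }
            return false;
        }

        protected void onWheel(float ticks) {
            // App-specific action, e.g. zoom a map by `ticks` steps.
        }
    }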
Note: I haven't tried "enableSynthesizeTouchEventsOnClick": false, or testing with a SOURCE_ that equals an external touchscreen, but I believe it would work.