I am currently developing an IME (Input Method Editor) app, so there is a subclassed InputMethodService that inflates a keyboard view. In this case it is just inflating a view whose size is similar to any other keyboard app's. The view logs the touch position of down, move, and up actions. I have minimised it to this because I am testing the touch position reported on move events. All positions are logged relative to the IME window, but when a move gesture goes beyond the view (above it) the y coordinate does not continue and remains at zero.
Is there a way to continue receiving the actual coordinates of the pointer even when the move action is outside of the IME window?
EDIT: The aim was to test whether an action-move event from a view in the IME window can be passed to a view in another window (i.e. to seamlessly continue the move gesture from the IME window to another window, and also to move back into the IME window). But because the move coordinates seem to be clamped to the IME window's bounds, this doesn't work once the touch event goes beyond the IME window.
For security reasons this seems to be a limiting characteristic of the IME window, and I have not found a sensible way around it. There is also no way to use an overlay to deal with this limitation; as stated in this answer: https://stackoverflow.com/a/33698077/3678942 , you cannot allow touch events to pass through a window.
One way to work around this is to use the fact that the y value is zero (i.e. the pointer has reached the top bound of the IME window) and continue to use the touch event and its x values (which are still reported correctly) to process whatever handling is needed.
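A minimal sketch of the bookkeeping behind this workaround. The class and method names here are my own, not an Android API; in a real IME view you would feed it the `getX()`/`getY()` of each ACTION_MOVE from `onTouchEvent(MotionEvent)`:

```java
// Tracks whether the pointer has been clamped to the top edge of the
// IME window (y reported as 0) during a move gesture, while the x
// coordinate keeps being reported correctly and stays usable.
// Hypothetical helper, not part of any Android API.
public class ImeTouchTracker {
    private boolean aboveIme = false;
    private float lastX = Float.NaN;

    /** Feed each ACTION_MOVE position; returns true while the pointer
     *  appears clamped to the IME window's top edge. */
    public boolean onMove(float x, float y) {
        lastX = x;            // x is still reported correctly
        aboveIme = y <= 0f;   // y sticks at 0 once the pointer leaves upward
        return aboveIme;
    }

    public boolean isAboveIme() { return aboveIme; }

    /** Last usable x coordinate, for x-only handling above the IME. */
    public float lastX() { return lastX; }
}
```

Once `onMove` reports true, the gesture handler can switch to x-only processing until the pointer re-enters the IME window.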
Having a candidates view also helps: the IME window increases its dimensions to accommodate it even if it is not shown, and any underlying app's content will only be measured against the visible input view.
Related
I have a very simple UI that has one Entry control to enter a phone number, and a button. The Entry control has a handler for removing the border around it. When the Entry control gets focus, the keyboard pops up. But when I tap outside the Entry control, such as on an empty area of the screen, the keyboard does not dismiss and the Entry control does not lose focus. Also, since the button is at the bottom of the screen, the soft keyboard hides it and there is no way to tap it. The button can only be tapped after I press the Android device's back button.
At present, I have not checked this behavior on an iOS device.
This was not a problem in Xamarin.Forms though. I searched a lot on the Internet and found that it is currently a bug in MAUI.
I tried attaching a tap gesture to the parent layout control and invoking platform-specific code to hide the keyboard, but it seems the entry does not lose focus, and in turn the tap gesture event is never called.
However, the entry control should lose focus automatically when I tap outside the entry control (such as on the screen) and the soft keyboard should automatically dismiss.
Please provide a workaround if there is any.
Known bug. Removing the focus sometimes helps. Sometimes you need to Disable/Enable in sequence. (I go with the second.)
If you want, you can read this for example:
https://github.com/dotnet/maui/issues/12002
(The most disturbing part, considering this has been a known bug for half a year+:)
We can leave the behavior how this is for now in NET7 and provide an
API in NET8 that lets users toggle this behavior on/off for iOS and
Android
Part of my Android app uses speech-to-text, and I want to disable touch events while it does that because otherwise the user can accidentally touch their phone and stop the conversation.
I'm trying to use a ViewGroup that can be the parent for that Activity and absorb the touch events, but am having trouble with it.
I have an Activity, and from there I pull up a DialogFragment where the user can enter information using speech-to-text.
When I tried using an overlay to absorb the touch events from the dialog, it only covered the DialogFragment and not the whole screen. And when I added it to the Activity, I couldn't access it from the DialogFragment.
In any case, it never stopped the touch events anyway, because my main problem is that when the speech-to-text dialog is up, it comes up on top of the overlay, and I don't know how to get a handle to it.
Anyone here done anything like this before? Thanks.
I have an activity themed like a dialog, and I have it set up so that it finishes when the user clicks outside it.
this.setFinishOnTouchOutside(true);
As expected, when the user clicks outside, it finishes. The activity is marked as a floating activity and is only shown at the top of the screen.
Now, if the user clicks on any other part of the screen, like the phone/contacts button on the home screen, the activity finishes, but the user has to click on the phone/contacts app icon again to open the phone/contacts app.
What I want is that if the user clicks outside my activity, the action is performed as if the activity were not present on screen at all. Something like a notification, which does not prevent the user from doing other tasks.
The only way you might be able to do this is by using a hidden WindowManager.LayoutParams flag, FLAG_SLIPPERY.
This allows touches starting on your View to continue to whatever View is below when the touch leaves your View but remains on the screen. However, I don't think this will work.
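Since FLAG_SLIPPERY is hidden (`@hide`) in `WindowManager.LayoutParams`, it is not in the public SDK; its value in the AOSP sources is `0x20000000`, and on a device you would OR it into `getWindow().getAttributes().flags` (or look it up via reflection). The constants below mirror the AOSP values so the flag arithmetic can be shown standalone; treat the hidden value as an assumption that may change between releases:

```java
// Models the window-flag combination a "slippery" floating window
// would typically use. FLAG_NOT_FOCUSABLE and FLAG_NOT_TOUCH_MODAL
// are public SDK constants; FLAG_SLIPPERY is copied from AOSP and is
// not part of the public API.
public class SlipperyFlags {
    public static final int FLAG_NOT_FOCUSABLE   = 0x00000008; // public SDK value
    public static final int FLAG_NOT_TOUCH_MODAL = 0x00000020; // public SDK value
    public static final int FLAG_SLIPPERY        = 0x20000000; // hidden, from AOSP

    /** Combine the flags for a floating window that lets touches slip through. */
    public static int slipperyWindowFlags() {
        return FLAG_NOT_FOCUSABLE | FLAG_NOT_TOUCH_MODAL | FLAG_SLIPPERY;
    }
}
```

On a device, roughly: get the window attributes, OR in `SlipperyFlags.FLAG_SLIPPERY`, and set them back with `getWindow().setAttributes(...)` — an untested sketch, given the caveats below.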
Android prevents you from touching "through" a touchable Window because it assumes that Window should be receiving the TouchEvent. Android also prevents you from programmatically "touching" the screen (without root or system access), most likely for security reasons.
I dug through AOSP for a while and found this.
Reading the comments, it's possible to infer that, while what you see doesn't take up the whole screen, the Activity's Window does. So, while nothing in your Activity is clicked, the Window is still overlaying everything, just with a transparent background, and is dealing with the touches that aren't passed to your Activity's UI. This brings us back to the "touching through" issue.
There are some applications that disable all touch input, including touch events on the navigation bar.
Examples are Touch Lock or UnTouch. How can one do that?
By analyzing the second linked app, it seems that there is a hidden layout that captures the touch events (like an empty OnClickListener).
Initially I tried to draw a transparent foreground using the SYSTEM_ALERT_WINDOW permission and assigning an empty touch listener. However, this way I cannot draw over the navigation bar, so the user can still touch the home and back buttons.
Another way I tried is to launch an Activity with a transparent background in fullscreen mode. This way I can capture all the events. It works, but obviously it causes other activities to go into the paused state.
So my question is: how can one reach this goal? Alternatively, is it possible to use some root/system commands?
Thanks!
Try this link. If you want to do it on just 1 view then edit out the iteration in the methods given.
I need to know when the user is using the screen over my activity, even when they are not strictly on my activity (for instance when they have pulled down the notification drawer, or when they are on Messenger). That is because I want my app to do something after a certain period of inaction by the user, and such cases mess with the timer, as the activity is not paused.
I tried dispatchTouchEvent() and onTouchEvent(), but they only handle events made on my activity.
So is there a way to detect touch events made on a layout that has been drawn by another app over my activity?
You can create a transparent overlay window via a system alert window. It is always on top.
Then you can handle the touch events there and take it from there.
Creating a system overlay window (always on top)
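A sketch of the timer logic behind this overlay approach: a small, non-touchable system-alert overlay created with FLAG_WATCH_OUTSIDE_TOUCH receives MotionEvent.ACTION_OUTSIDE for touches anywhere else on screen, and every such event should reset the inactivity timer. The class below models just that bookkeeping; its names are my own, not an Android API:

```java
// Inactivity timer fed by an overlay's touch callback. In the real
// app, the overlay's onTouch would call onAnyTouch() for every event
// it sees, including ACTION_OUTSIDE events delivered for touches on
// other apps' windows. Hypothetical helper class.
public class InactivityTimer {
    private final long timeoutMs;
    private long lastTouchMs;

    public InactivityTimer(long timeoutMs, long nowMs) {
        this.timeoutMs = timeoutMs;
        this.lastTouchMs = nowMs;
    }

    /** Call for every touch the overlay observes, anywhere on screen. */
    public void onAnyTouch(long nowMs) {
        lastTouchMs = nowMs;
    }

    /** True once no touch anywhere was seen for at least timeoutMs. */
    public boolean isIdle(long nowMs) {
        return nowMs - lastTouchMs >= timeoutMs;
    }
}
```

Note that ACTION_OUTSIDE reports only that a touch happened, not a usable stream of coordinates, which is enough for resetting a timer but not for tracking gestures.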