Button unpresses itself (possibly device specific) - android
I'm trying to troubleshoot a tricky, hard-to-reproduce (and therefore possibly hardware-related) problem in the open source Panic Button application I'm working on for Amnesty. There's an issue on GitHub about it.
The issue involves a LinearLayout containing Buttons. When this fragment is displayed and I press and hold a button, the button unpresses itself after exactly one second. Logging shows the onClick event firing on its own, even though I haven't lifted my finger from the screen.
This happens on the Cherry Gem phone, and I wasn't able to reproduce the problem on other phones.
I've been removing a lot of code and adding lots of logging statements to try to isolate the bug, and in the process I discovered the strangest thing. It's strange enough that I'm posting it on Stack Overflow in the hope that someone will recognise the pattern:
When I drag my finger on the screen, the bug goes away! More precisely: after confirming on screen that the button unpresses itself a few times, if I press, drag my finger around, and then release, I can subsequently hold buttons pressed without them unpressing themselves. Wat?
Please note that I'm not asking how to detect a long press (which is, of course, what this leads to), but trying to understand this unwanted unpressing before I move on to implement a workaround. For the same reason, I'd prefer not to move the events to onTouch listeners: I'm worried that would paper over the root cause, and I'd like to understand why this is happening first.
I'm not 100% sure whether there could be an unwanted interaction with the rest of the code; if no one recognises the pattern here, I'll try to reproduce it in a minimal project built from scratch with only that code.
I can also post a small video of the problem if that helps. I'm also happy to post code excerpts or logcat results.
Thanks for your time!
Jun
Update
I've looked at the adb shell getevent log, which confirms that after 0.9 seconds there is an EV_KEY BTN_TOUCH UP event. Does this confirm that, from the OS's standpoint, it's receiving a button-up event from the hardware? I guess it might also be triggered by software.
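The timestamps in getevent -lt output make that 0.9-second gap easy to measure. As a sketch (the log lines below follow the typical getevent -lt shape, but exact spacing and device paths vary by phone, so treat the format as an assumption), a small parser that computes the interval between BTN_TOUCH DOWN and the following UP:

```java
import java.util.Arrays;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class TouchGap {
    // Matches lines like: "[   100.100000] /dev/input/event2: EV_KEY BTN_TOUCH DOWN"
    private static final Pattern LINE = Pattern.compile(
            "\\[\\s*(\\d+\\.\\d+)\\]\\s+\\S+\\s+EV_KEY\\s+BTN_TOUCH\\s+(DOWN|UP)");

    /** Seconds between the first BTN_TOUCH DOWN and the next UP, or -1 if absent. */
    public static double downToUpSeconds(List<String> log) {
        double down = -1;
        for (String line : log) {
            Matcher m = LINE.matcher(line);
            if (!m.find()) continue;
            if (m.group(2).equals("DOWN")) {
                down = Double.parseDouble(m.group(1));
            } else if (down >= 0) {
                return Double.parseDouble(m.group(1)) - down;
            }
        }
        return -1;
    }

    public static void main(String[] args) {
        List<String> sample = Arrays.asList(
                "[   100.100000] /dev/input/event2: EV_KEY BTN_TOUCH DOWN",
                "[   101.000000] /dev/input/event2: EV_KEY BTN_TOUCH UP");
        System.out.println(downToUpSeconds(sample)); // ≈ 0.9
    }
}
```

If the gap is consistently just under a second regardless of how long the finger stays down, that points at something injecting the UP rather than the touch controller reporting a real release.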
This led me to look into software installed on the phone that could interfere with the input devices. I deactivated Google voice typing, and holding buttons worked again. Reactivating Google voice typing didn't make the bug reappear, but rebooting the phone does make it reappear.
I tried to look at whether only some applications were affected. The pre-installed calculator was also affected. When using the default virtual keyboard the bug didn't happen. But then when returning to my app the bug had gone away again. Rebooted again. Calculator still affected. Go back to using the virtual keyboard in the browser. Now the bug stays... Wat?
After a while, I removed the Google voice typing input device again and the bug disappeared once more.
I have tracked down a forum that seems to have a ROM for this phone; I'm trying to find out whether it's more recent than the version I have, and whether anyone else with this phone has experienced this problem as well.
(I also updated the relevant github issue)
Update 2
I noticed the following in the logcat:
04-01 12:05:30.484: V/PhoneWindow(2749): DecorView setVisiblity: visibility = 0
04-01 12:05:30.525: V/InputMethodManager(2749): onWindowFocus: null softInputMode=288 first=true flags=#1810100
04-01 12:05:30.528: V/InputMethodManager(2749): START INPUT: com.android.internal.policy.impl.PhoneWindow$DecorView{41aa5ef8 V.E..... R.....ID 0,0-480,800} ic=null tba=android.view.inputmethod.EditorInfo#41a3fef8 controlFlags=#104
04-01 12:05:30.530: V/InputMethodManager(2749): Starting input: Bind result=InputBindResult{com.android.internal.view.IInputMethodSession$Stub$Proxy#41a48da8 com.android.inputmethod.latin/.LatinIME #45}
04-01 12:05:30.608: I/InputMethodManager(2749): handleMessage: MSG_SET_ACTIVE true, was false
Could this be part of the problem?
For odd problems such as this, I think it's useful to override onTouchEvent and observe which MotionEvent actions are triggered (ACTION_DOWN, ACTION_MOVE, ACTION_UP). The root problem may be the device's touch screen driver, so it's worth exercising that driver through onTouchEvent.
There is a useful page in the Android source documentation, Touch Device Configuration. Search the text for "Driver reports signal strength as pressure". For troubleshooting, you could simply try changing the values of touch.pressure.calibration and/or touch.pressure.scale. These live in a file with the .idc extension, which should reside under the system directory.
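For illustration, a minimal sketch of what such an .idc override might look like. The property names come from the Android Touch Devices documentation, but the file name and values here are made up; the real file name must match the touch device's reported name, and sensible values depend on the hardware:

```
# Hypothetical input device configuration, e.g. /system/usr/idc/<device_name>.idc
# Property names from the Android "Touch Devices" docs; values illustrative only.
touch.deviceType = touchScreen
touch.pressure.calibration = amplitude
touch.pressure.scale = 0.005
```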
Here is sample code that overrides touch/swipe event handling:
private int mDownX;
private int mDownY;

@Override
public boolean onInterceptTouchEvent(MotionEvent ev) {
    switch (ev.getAction() & MotionEvent.ACTION_MASK) {
        case MotionEvent.ACTION_DOWN:
            // Observe the event ourselves...
            onTouchEvent(ev);
            // ...but return false so it is still dispatched to child views.
            return false;
        case MotionEvent.ACTION_MOVE:
            onTouchEvent(ev);
            return false;
        default:
            break;
    }
    return super.onInterceptTouchEvent(ev);
}

@Override
public boolean onTouchEvent(MotionEvent event) {
    switch (event.getAction() & MotionEvent.ACTION_MASK) {
        case MotionEvent.ACTION_DOWN:
            mDownX = (int) event.getX();
            mDownY = (int) event.getY();
            break;
        case MotionEvent.ACTION_MOVE:
            // ... log or handle the move here
            break;
    }
    return super.onTouchEvent(event);
}
Notes:
onInterceptTouchEvent is called before onTouchEvent, and it intercepts all touch screen motion events. This allows you to watch events as they are dispatched to your child views.
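A short aside on the `getAction() & MotionEvent.ACTION_MASK` idiom the code above relies on: for multi-touch events, the action int packs a pointer index into bits 8-15, and masking recovers the base action. The constants below are copied from android.view.MotionEvent so this sketch runs off-device:

```java
public class ActionMaskDemo {
    static final int ACTION_MASK = 0x00ff;               // MotionEvent.ACTION_MASK
    static final int ACTION_POINTER_INDEX_MASK = 0xff00; // MotionEvent.ACTION_POINTER_INDEX_MASK
    static final int ACTION_POINTER_INDEX_SHIFT = 8;     // MotionEvent.ACTION_POINTER_INDEX_SHIFT
    static final int ACTION_POINTER_DOWN = 5;            // MotionEvent.ACTION_POINTER_DOWN

    /** Strip the pointer-index bits, leaving the base action. */
    static int baseAction(int action) {
        return action & ACTION_MASK;
    }

    /** Extract which pointer (finger) the action refers to. */
    static int pointerIndex(int action) {
        return (action & ACTION_POINTER_INDEX_MASK) >> ACTION_POINTER_INDEX_SHIFT;
    }

    public static void main(String[] args) {
        // A second finger going down at pointer index 1:
        int packed = ACTION_POINTER_DOWN | (1 << ACTION_POINTER_INDEX_SHIFT);
        System.out.println(baseAction(packed));   // 5 (ACTION_POINTER_DOWN)
        System.out.println(pointerIndex(packed)); // 1
    }
}
```

Without the mask, a second finger's ACTION_POINTER_DOWN would not compare equal to any of the plain action constants, which is an easy way to misread a multi-touch log.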
Related
Android scrolling makes whole screen blue
I have not been able to replicate this, but is anyone aware of what might cause the entire screen on an Android device to go light blue? It seems related to selecting a radio button and then scrolling. It happens on a Nexus 5X with Android 8. Here is what it looks like: I have only heard of one other instance of this occurring. Could it be device specific? Strangely enough, once it happens it seems to stay this way, though the user says it is somewhat intermittent. Update: This only seems to happen on Android 8, if that helps anyone...
So, I eventually found the offending code. I verified this is only happening on Android 8 devices, maybe only Samsung? The offending code was:

mFormScrollView.setDescendantFocusability(ViewGroup.FOCUS_BEFORE_DESCENDANTS);
mFormScrollView.setFocusable(true);
mFormScrollView.setFocusableInTouchMode(true);
// Hide the keyboard when moving the screen up or down. This should only
// be an issue when on a text edit field. Also disable focus jump using
// "requestFocusFromTouch"
mFormScrollView.setOnTouchListener(new OnTouchListener() {
    public boolean onTouch(View view, MotionEvent event) {
        if (event.getAction() == MotionEvent.ACTION_UP
                || event.getAction() == MotionEvent.ACTION_DOWN) {
            Utilities.hideKeyBoard(FormActivity.this, view);
        }
        // Keeps screen from jumping to nearest EditText
        // view.requestFocusFromTouch();
        return false;
    }
});

The offending line is the one now commented out: the view.requestFocusFromTouch() call, which was meant to keep the screen from automatically jumping to the next text field when the keyboard was hidden and focus was lost. On Android 8 this is not happening, but I need to verify with older versions.
Prevent application with SYSTEM_ALERT_WINDOW from obscuring my application
Is there any way to ensure that my application's window is not obscured by another application's view that uses the SYSTEM_ALERT_WINDOW permission? If not, is there a better way to ensure my app is not obscured by such a window, apart from obtaining the same permission myself and refreshing/showing my own view (shown in an alert window, of course) every 100 ms or so to keep it visible? Any flickering when my application is obscured is actually a good thing, and an indicator to the user that something is wrong. EDIT: It seems that there is no way to do it, short of going through KNOX on Samsung or some other proprietary solution for trusted UI. The accepted answer is enough for my purpose, but it is not an answer to the question as asked.
Even if it's not exactly what you're asking, the closest replacement I know of is either:

Setting android:filterTouchesWhenObscured="true" in your layout (touch events will be filtered and will not reach your View if they pass through an overlay, regardless of whether it is transparent or opaque). See View#setFilterTouchesWhenObscured(boolean).

Or overriding View#onFilterTouchEventForSecurity(android.view.MotionEvent) and checking for FLAG_WINDOW_IS_OBSCURED.

The latter can be implemented like so:

override fun onFilterTouchEventForSecurity(event: MotionEvent): Boolean {
    if ((event.flags and MotionEvent.FLAG_WINDOW_IS_OBSCURED) == MotionEvent.FLAG_WINDOW_IS_OBSCURED) {
        Toast.makeText(context, "Screen overlay detected!", Toast.LENGTH_LONG).show()
        return false // touch event is cancelled
    }
    return super.onFilterTouchEventForSecurity(event)
}

See also the Security section of the View class documentation. Notice that this functionality is available from API 9+. A workaround for older APIs can be found in this SO question: Analogue of android:filterTouchesWhenObscured for API level below 9.
Trigger(Touch) not working anymore on Android Unity3D VR App
My Cardboard-like VR viewer has a button that works by touching the screen. I created an app in Unity3D, and this trigger mechanic at first worked like a charm. Now, all of a sudden (I think I had only added an explosion particle effect), the touch function has stopped working completely. I have tried things like removing the explosion from my scene again, but nothing seems to work. Another curious thing is that I can't close the app in the normal way anymore: normally in VR apps you have an X button in the top left of the screen, but clicking it doesn't do anything anymore either (it used to work!). The app still runs and doesn't crash, but no interaction is possible. I looked at the debug logs via adb - no errors there... The app works as it used to when I start it inside the Unity Editor. Did someone encounter a similar error, or have an idea what the problem is? I'm using Unity Daydream Preview 5.4.2f2. Edit: I forgot to mention I was using GvrViewer.Instance.Triggered to check if the screen was touched.
For anyone having the same problem: I worked around it by also checking whether a touch just ended. In my Player : MonoBehaviour I used:

void Update() {
    if (GvrViewer.Instance.Triggered
            || (Input.touchCount > 0 && Input.touches[0].phase == TouchPhase.Ended)) {
        // Do stuff.
    }
}
Simulate Touch Controls Through Code
I'm trying to make it possible to navigate through my Google Glass application using head gestures. I'm able to recognize head gestures like looking to the right, left, and up, and each has its own method for what to do when that gesture is recognized. Now I need to simulate the corresponding touch gestures inside each method, so the system will think I'm swiping to the left or right, which will allow me to navigate through the cards with head gestures. Does anyone have any idea how to actually achieve this?

Edit: I created a quick hello-world application to play with. I added my head-gesture code and started trying to get the keys working. I added the following to my onCreate():

Instrumentation instr = new Instrumentation();

Then I added the following lines to each respective head-gesture method:

// Head gesture upwards should correspond to tapping the touchpad
instr.sendKeyDownUpSync(KeyEvent.KEYCODE_DPAD_CENTER);
// Head gesture to the left should correspond to swiping left on the touchpad
instr.sendKeyDownUpSync(KeyEvent.KEYCODE_DPAD_LEFT);
// Head gesture to the right should correspond to swiping right on the touchpad
instr.sendKeyDownUpSync(KeyEvent.KEYCODE_DPAD_RIGHT);

They are responding accordingly now; however, I'm getting an exception:

java.lang.RuntimeException: This method can not be called from the main application thread
The Solution: In the end, I went in a different direction than the one I mentioned in my edit above. I found out that it is possible to trigger touch controls from the shell using:

adb shell input keyevent <keycode here>

I then found a way to use this from Android. I have the following class, named issueKey:

public class issueKey {
    public void issueKey(int keyCode) {
        try {
            java.lang.Process p = java.lang.Runtime.getRuntime()
                    .exec("input keyevent " + Integer.toString(keyCode) + "\n");
        } catch (Exception e) {
            Log.wtf("IssueKeyError", e.getMessage());
        }
    }
}

Then I simply call the class, passing the keycode for the corresponding gesture:

mIssueKey.issueKey(4); // functions as swipe down

Here is the list of keycodes that I tested, for anyone who is interested:

4: Swipe Down
21: Swipe Left
22: Swipe Right
23: Tap
24: Volume Up
25: Volume Down
26: Lock/Unlock Screen
27: Camera Button

However, what I'm wondering now is: which would be better practice, getting the solution I mentioned in my edit to work using an AsyncTask, or the solution I'm currently using?
Using the Instrumentation class would work if you call sendKeyDownUpSync from a separate thread. This can be done with the following steps: create and start a thread from your activity; in its run method, use the Looper class and create a Handler as explained here; and every time you want to call sendKeyDownUpSync, post a Runnable instance to that Handler, which calls sendKeyDownUpSync in its own run method. A similar code sample (not from me) is available here.
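The "separate thread" requirement can be sketched with plain Java. Instrumentation itself needs a device, so a Runnable stands in for the instr.sendKeyDownUpSync(keyCode) call here; the class and method names below are mine, not from the Glass APIs:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Sketch: funnel key injection through a single worker thread so the main
// thread never makes the blocking send call itself. The keyPress Runnable
// stands in for something like instr.sendKeyDownUpSync(keyCode).
public class KeyInjector {
    private final ExecutorService worker = Executors.newSingleThreadExecutor();

    /** Runs keyPress on the worker thread, not the caller's thread. */
    public Future<?> sendKeyAsync(Runnable keyPress) {
        return worker.submit(keyPress);
    }

    public void shutdown() {
        worker.shutdown();
    }

    public static void main(String[] args) throws Exception {
        KeyInjector injector = new KeyInjector();
        injector.sendKeyAsync(() ->
                System.out.println("key sent from " + Thread.currentThread().getName())).get();
        injector.shutdown();
    }
}
```

On Android you would use a HandlerThread and post to its Handler instead of an ExecutorService, but the shape of the solution is the same: the gesture-recognition callback only enqueues work, and the injection happens off the main thread.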
What DOM events are available to WebKit on Android?
I'm building a mobile web app targeting Android users. I need to know what DOM events are available to me. I have been able to make the following work, but not terribly reliably: click mouseover mousedown mouseup change I have not been able to get the following to work: keypress keydown keyup Does anyone know the full list of what is supported and in what contexts (e.g., is onchange only available to form inputs?)? I can't find a reference for this on The Googles. Thanks! Update: I asked the same question on the Android developers list. I will be doing some more testing and will post my results both here and there.
OK, this is interesting. My use case is that I have a series of links (A tags) on a screen in a WebKit view. To test which events are available, using jQuery 1.3.1 I attached every event listed on this page (even ones that don't make sense) to the links, then used the up, down, and enter controls on the Android emulator and noted which events fired in which circumstances. Here is the code I used to attach the events, with results to follow. Note that I'm using "live" event binding because, for my application, the A tags are inserted dynamically.

$.each([
    'blur', 'change', 'click', 'contextmenu', 'copy', 'cut', 'dblclick',
    'error', 'focus', 'keydown', 'keypress', 'keyup', 'mousedown',
    'mousemove', 'mouseout', 'mouseover', 'mouseup', 'mousewheel',
    'paste', 'reset', 'resize', 'scroll', 'select', 'submit',
    // W3C events
    'DOMActivate', 'DOMAttrModified', 'DOMCharacterDataModified',
    'DOMFocusIn', 'DOMFocusOut', 'DOMMouseScroll', 'DOMNodeInserted',
    'DOMNodeRemoved', 'DOMSubtreeModified', 'textInput',
    // Microsoft events
    'activate', 'beforecopy', 'beforecut', 'beforepaste', 'deactivate',
    'focusin', 'focusout', 'hashchange', 'mouseenter', 'mouseleave'
], function () {
    $('a').live(this, function (evt) { alert(evt.type); });
});

Here's how it shook out:

On first page load with nothing highlighted (no ugly orange selection box around any item), using the down button to select the first item, the following events fired (in order): mouseover, mouseenter, mousemove, DOMFocusIn.

With an item selected, moving to the next item using the down button, the following events fired (in order): mouseout, mouseover, mousemove, DOMFocusOut, DOMFocusIn.

With an item selected, clicking the "enter" button, the following events fired (in order): mousemove, mousedown, DOMFocusOut, mouseup, click, DOMActivate.

This strikes me as a bunch of random garbage. And who's that cheeky IE-only event (mouseenter) making a cameo, then taking the rest of the day off?
Oh well, at least now I know what events to watch for. It would be great if others want to take my test code and do a more thorough run through, perhaps using form elements, images, etc.
Since this is the second most popular Android + JavaScript post on SO (which is just a sad commentary on the state of web development targeting the Android platform), I thought it may be worthwhile including a link to pkk's touch event test results at http://www.quirksmode.org/mobile/tableTouch.html and also http://www.quirksmode.org/mobile/ in general.
As of Android 1.5, the same touch(start|move|end|cancel) events that the iPhone supports work in Android as well. One problem I found was that touchmove ends get queued up. No workaround yet.