I have an app that detects swipes on the edge of the screen using an overlay view. Now I want to add an action that is performed when the overlaid view is touched and held without moving. However, I observed some weird behaviour which, after hours of trying and searching, I suspect is caused by Android's accidental-palm-touch rejection feature. My observations:
If I touch and hold for some duration and then release without moving, no event is detected. (this is the problem)
If I quickly touch and release, both action down and up events are generated consecutively.
The above behave the same way with both MotionEvent and onClickListener.
If I touch and hold as long as I want, and then move without releasing, both down and move events are generated as soon as move starts.
If I remove the FLAG_NOT_FOCUSABLE and FLAG_NOT_TOUCH_MODAL flags and write 0 there, the whole screen becomes touchable and detects touches correctly, except on the left/right edges.
If I place the overlay away from the edge, towards the middle of the screen, everything works correctly.
On the Android emulator, everything works well with mouse clicks at the default overlay position.
So I attached a mouse to my physical phone with OTG; it also works correctly with mouse clicks.
In the main settings activity of the app, the same behaviour is observed on the preference buttons: no button animation when pressing and holding on the edge of the screen, but the animation and the click work if I quickly touch and release.
I tried all the flags of WindowManager.LayoutParams and tried to capture the view's dispatched events; none of it worked.
Is the problem the palm-rejection feature? Is there a way to bypass it and detect touch-and-hold (preferably with a custom hold duration) on the edges of the screen with an overlay?
Note: Tried on Pixel 2 XL, Android 11.
val overlayedButton = View(this)
val overlayedWidth = 30
val overlayedHeight = resources.displayMetrics.heightPixels / 2

val LAYOUT_FLAG: Int = if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) {
    WindowManager.LayoutParams.TYPE_APPLICATION_OVERLAY
} else {
    @Suppress("DEPRECATION")
    WindowManager.LayoutParams.TYPE_PHONE
}

val params = WindowManager.LayoutParams(
    overlayedWidth,
    overlayedHeight,
    LAYOUT_FLAG,
    WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE or WindowManager.LayoutParams.FLAG_NOT_TOUCH_MODAL,
    PixelFormat.TRANSLUCENT
)
params.gravity = Gravity.RIGHT
params.x = 0
params.y = 0

if (overlayedButton.parent == null) {
    wm!!.addView(overlayedButton, params)
} else {
    wm!!.updateViewLayout(overlayedButton, params)
}
overlayedButton.setOnTouchListener(this)
override fun onTouch(v: View, event: MotionEvent): Boolean {
    when (event.action) {
        MotionEvent.ACTION_DOWN -> {
            //...
        }
        MotionEvent.ACTION_MOVE -> {
            //...
        }
        MotionEvent.ACTION_UP, MotionEvent.ACTION_CANCEL -> {
            //...
        }
        //...
    }
    return true // consume the event
}
Maybe take a look at setSystemGestureExclusionRects. Also this article explains what is going on with the OS intercepting touches for high-level gestures.
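For reference, a minimal sketch of excluding the overlay's area from system gestures (API 29+). It assumes the overlayedButton view from the question; note that the system caps how much of each edge an app may exclude (roughly 200dp per side), and whether this also defeats the touch suppression you are seeing is not guaranteed.

// Re-apply on layout changes, because the exclusion rects are in view-local coordinates.
overlayedButton.addOnLayoutChangeListener { v, left, top, right, bottom, _, _, _, _ ->
    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.Q) {
        v.systemGestureExclusionRects =
            listOf(Rect(0, 0, right - left, bottom - top))
    }
}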
Related
In an Android photo viewing app, I want to allow users to:
Zoom into a photo, drag to see its details, unzoom.
Swipe to go to the next photo.
Implementation attempt using ZoomableDraweeView in Facebook's Fresco library:
private fun init(imageUri: Uri?) {
    val hierarchy = GenericDraweeHierarchyBuilder.newInstance(resources)
        .setActualImageScaleType(ScalingUtils.ScaleType.FIT_CENTER)
        .setProgressBarImage(ProgressBarDrawable())
        .setProgressBarImageScaleType(ScalingUtils.ScaleType.FIT_CENTER)
        .build()
    zoomableDraweeView!!.hierarchy = hierarchy
    zoomableDraweeView!!.setAllowTouchInterceptionWhileZoomed(false)
    zoomableDraweeView!!.setIsLongpressEnabled(false)
    zoomableDraweeView!!.setTapListener(DoubleTapGestureListener(zoomableDraweeView))
    val controller: DraweeController = Fresco.newDraweeControllerBuilder()
        .setUri(imageUri)
        .setControllerListener(loadingListener)
        .build()
    zoomableDraweeView!!.controller = controller
}
Problem: When I zoom in, lift the fingers, then try to unzoom, this gets misinterpreted as a swipe and I am randomly sent to the next picture.
What am I doing wrong? How to disable swipes when zoomed in (or any better UX solution)?
I specifically call setAllowTouchInterceptionWhileZoomed(false), whose javadoc says: "If this is set to true, parent views can intercept touch events while the view is zoomed. For example, this can be used to swipe between images in a view pager while zoomed."
I tried to execute the swipe action only when zoomableDraweeView.getZoomableController().isIdentity() is false, but that does not always prevent the unintended swipe. In particular, when I fully zoom out, an accidental swipe often happens, maybe because isIdentity() has already been updated by the time I release all fingers. getScaleFactor() has the same issue. Another issue is that this solution only lets me drag the zoomed picture with two fingers; dragging with one finger has no effect.
I thought about writing my own DraweeController and surface the zoom level (and ignore swipes when zoomed) but the base classes do not seem to contain any zoom level information.
By the way, here is how I detect swipes:
open class OnSwipeTouchListener(context: Context?) : View.OnTouchListener {

    private inner class GestureListener :
        GestureDetector.SimpleOnGestureListener() {

        override fun onFling(
            event1: MotionEvent,
            event2: MotionEvent,
            velocityX: Float,
            velocityY: Float
        ): Boolean {
            try {
                val diffX: Float = event2.x - event1.x
                val diffY: Float = event2.y - event1.y
                if (abs(diffX) > abs(diffY)) {
                    if (abs(diffX) > SWIPE_THRESHOLD &&
                        abs(velocityX) > SWIPE_VELOCITY_THRESHOLD) {
                        if (diffX > 0) {
                            goToTheNextPhoto() // Swipe detected.
                        }
                    }
                }
            } catch (exception: Exception) {
                exception.printStackTrace()
            }
            return true
        }
    }
    // ... onTouch() and the GestureDetector wiring are omitted here.
(full code on GitHub if needed)
The issue is not in the Fresco library or in the ZoomableDraweeView. The problem is in how the overall UX is orchestrated in your activity. It is not hard to fix, though, and you were on the right track with zoomableDraweeView.getZoomableController().isIdentity() and setAllowTouchInterceptionWhileZoomed.
There are several ways to overcome it, ranging from a custom ZoomableController to replacing your swipe detector with a proper ViewPager2 (RecyclerView based) with snapping and its own callbacks and animations - or even two of them, since you have vertical and horizontal swipes. The solution will be similar in all cases: use everything where (and when) it is supposed to be used. It may require a bit of refactoring of your overall approach.
The first way to fix it is via setAllowTouchInterceptionWhileZoomed, which is not working for you because you apply your swipe detector directly to the ZoomableDraweeView rather than to its parent. Your view's parent cannot handle swipes, because you told it not to via setAllowTouchInterceptionWhileZoomed, but your ZoomableDraweeView can, and it does. So making the ZoomableDraweeView's parent handle swipes rather than the view itself should do the trick, as sketched below.
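For illustration, a minimal, untested sketch of such a parent container (the class name and the onSwipe callback are hypothetical; the fling handling mirrors the OnSwipeTouchListener above):

// Hypothetical parent that owns swipe detection. While zoomed, a ZoomableDraweeView
// with setAllowTouchInterceptionWhileZoomed(false) asks its parent not to intercept,
// so this detector never sees the pinch gesture and cannot misread it as a swipe.
class SwipeInterceptingFrameLayout @JvmOverloads constructor(
    context: Context,
    attrs: AttributeSet? = null
) : FrameLayout(context, attrs) {

    // Called with the horizontal delta of a detected fling.
    var onSwipe: ((diffX: Float) -> Unit)? = null

    private val gestureDetector = GestureDetector(context,
        object : GestureDetector.SimpleOnGestureListener() {
            override fun onFling(
                e1: MotionEvent, e2: MotionEvent,
                velocityX: Float, velocityY: Float
            ): Boolean {
                onSwipe?.invoke(e2.x - e1.x)
                return true
            }
        })

    override fun onInterceptTouchEvent(ev: MotionEvent): Boolean {
        // Observe the event stream without stealing it from the child.
        gestureDetector.onTouchEvent(ev)
        return false
    }
}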
Secondly, the ZoomableController is an AnimatedZoomableController, so it performs an animation, and that animation has a duration (in this case it depends on the pinch-gesture velocity). When you were zooming out and your swipe detector changed the image, the animation was still ongoing, but the transformation matrix was already back to identity - hence the issue.
To fix this, you have to take the animation duration into account. I think even a simple 200 ms delay should fix it in most cases. Also, I would suggest overriding the onTransformChanged method of the ZoomableDraweeView (making a custom view), checking for the identity matrix there, and exposing the result to the activity in some way - that way you can be sure the transformation has already happened (the animation delay is still relevant in this case), and you can easily enable your own swipe detection. A sketch of such a custom view follows.
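A minimal sketch, assuming the Fresco zoomable sample's ZoomableDraweeView exposes a protected onTransformChanged(Matrix); the class name and the callback property are hypothetical:

// Hypothetical custom view: report whether the current transform is the identity,
// so the activity can ignore swipes while the image is zoomed in.
class NotifyingZoomableDraweeView @JvmOverloads constructor(
    context: Context,
    attrs: AttributeSet? = null
) : ZoomableDraweeView(context, attrs) {

    // The activity sets this and flips its own "swipes allowed" flag here.
    var onZoomStateChanged: ((isZoomedIn: Boolean) -> Unit)? = null

    override fun onTransformChanged(transform: Matrix) {
        super.onTransformChanged(transform)
        onZoomStateChanged?.invoke(!transform.isIdentity)
    }
}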
The first method is preferable and cleaner; the second one is hacky, but a minimal implementation of it should be very quick.
I haven't tried to refactor it so I assume some additional work may be required.
Hope it helps.
We have pointerInput for detecting tap, drag and pan events, and it also provides a handy awaitPointerEventScope (the pointer being the finger on mobile devices). Now, we do have awaitFirstDown() for detecting when the finger first makes contact with the screen, but I can't seem to find an 'up' equivalent of this method.
I have a little widget that I wish to detect taps on, but the app is made for a use case where the user might be in awkward positions while using it, so I want the required action to trigger on just a touch and lift of the finger. The paranoia is that the user might accidentally 'drag' their finger (even by a millimetre, Android still picks it up), and I do not want that to count as a drag. I could implement both a tap and a drag listener, but neither of them offers finger-lift detection, as far as I know.
What solution, if one exists as of now, suits this use case while adhering to and leveraging the declarative nature of Compose and keeping the codebase to a minimum?
The better way, and what the Android documentation suggests if you are not interoperating with existing View code, is Modifier.pointerInput(). The pointerInteropFilter documentation says:
A special PointerInputModifier that provides access to the underlying
MotionEvents originally dispatched to Compose. Prefer pointerInput and
use this only for interoperation with existing code that consumes
MotionEvents. While the main intent of this Modifier is to allow
arbitrary code to access the original MotionEvent dispatched to
Compose, for completeness, analogs are provided to allow arbitrary
code to interact with the system as if it were an Android View.
val pointerModifier = Modifier
    .pointerInput(Unit) {
        forEachGesture {
            awaitPointerEventScope {
                awaitFirstDown()
                // ACTION_DOWN here
                do {
                    // This PointerEvent contains details including
                    // event, id, position and more
                    val event: PointerEvent = awaitPointerEvent()
                    // ACTION_MOVE loop
                    // Consuming event prevents other gestures or scroll to intercept
                    event.changes.forEach { pointerInputChange: PointerInputChange ->
                        pointerInputChange.consumePositionChange()
                    }
                } while (event.changes.any { it.pressed })
                // ACTION_UP is here
            }
        }
    }
This answer explains in detail how it works, its internals, and the key points to consider when creating your own gestures.
Also, this is a gesture library you can check out for an onTouchEvent counterpart, and 2 for detectTransformGestures with an onGestureEnd callback, which returns the number of pointers down or the list of PointerInputChange in the onGesture event. It can be used as:
Modifier.pointerMotionEvents(
    onDown = {
        // When down is consumed
        it.consumeDownChange()
    },
    onMove = {
        // Consuming the move prevents scroll and other events from getting this move event
        it.consumePositionChange()
    },
    onUp = {},
    delayAfterDownInMillis = 20
)
Edit
As of 1.2.0-beta01, partial consumes such as
PointerInputChange.consumePositionChange() and
PointerInputChange.consumeDownChange(), and the one for consuming all changes, PointerInputChange.consumeAllChanges(), are deprecated.
PointerInputChange.consume()
is now the only function to use for preventing other gestures/events.
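A minimal sketch of the same gesture loop with the 1.2.0+ API (identical to the snippet above apart from the consume calls):

Modifier.pointerInput(Unit) {
    forEachGesture {
        awaitPointerEventScope {
            // ACTION_DOWN
            awaitFirstDown().consume()
            do {
                val event: PointerEvent = awaitPointerEvent()
                // consume() replaces consumePositionChange() and friends
                event.changes.forEach { it.consume() }
            } while (event.changes.any { it.pressed })
            // ACTION_UP is here
        }
    }
}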
pointerInteropFilter is the way to go
Item(
    Modifier.pointerInteropFilter {
        if (it.action == MotionEvent.ACTION_UP) {
            triggerAction()
        }
        true // Consume touch, return false if consumption is not required here
    }
)
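Note that pointerInteropFilter is still experimental, so the call site needs the opt-in annotation. A hedged sketch of a full call site (the composable name and triggerAction are placeholders):

// Sketch only: pointerInteropFilter requires opting in to ExperimentalComposeUiApi.
@OptIn(ExperimentalComposeUiApi::class)
@Composable
fun LiftToTrigger(triggerAction: () -> Unit) {
    Box(
        Modifier.pointerInteropFilter { motionEvent ->
            if (motionEvent.action == MotionEvent.ACTION_UP) {
                triggerAction() // fires on finger lift, even after a tiny accidental drag
            }
            true // consume the touch
        }
    )
}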
I'm working with touch gestures in Android using the OnGestureListener interface and GestureDetector.
I made an app to test whether detecting two fingers works. In onFling(MotionEvent e1, MotionEvent e2, float velocityX, float velocityY) I print the ids of the different MotionEvents, but these ids are the same (apparently only one finger is detected).
Does GestureDetector support multi-touch events?
The Issue
Detecting multitouch gestures with OnGestureListener does not seem to be supported by default.
The first thing you may have tried is reading event.pointerCount to get the count of fingers on the screen. However, this will be equal to 1. That is because you will (quite likely) never manage to touch the screen with both fingers in exactly the same millisecond.
Fixing it
You have to buffer pointerCount (the amount of fingers on screen). First add those variables somewhere in the context that you intend to track gestures in:
// track how many fingers are used
var bufferedPointerCount = 1
var bufferTolerance = 500L // in ms (Long, as fixedRateTimer expects)
var pointerBufferTimer = Timer()
Then, in the onTouchEvent(event: MotionEvent) function, you add this:
// Buffer / Debounce the pointer count
if (event.pointerCount > bufferedPointerCount) {
    bufferedPointerCount = event.pointerCount
    pointerBufferTimer = fixedRateTimer("pointerBufferTimer", true, bufferTolerance, 1000) {
        bufferedPointerCount = 1
        this.cancel() // a non-recurring timer
    }
}
Essentially this tracks the maximum amount of fingers on the display and keeps it valid for bufferTolerance milliseconds (here: 500).
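For illustration, a hedged sketch of how the buffered count might then be read inside the gesture listener (the two-finger branch is a placeholder):

override fun onFling(
    e1: MotionEvent, e2: MotionEvent,
    velocityX: Float, velocityY: Float
): Boolean {
    // Read the buffered value rather than e2.pointerCount, which has usually
    // dropped back to 1 by the time onFling fires.
    if (bufferedPointerCount >= 2) {
        // handle a two-finger fling here
    }
    return true
}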
I currently am implementing it in a custom Android Launcher I created (finnmglas/Launcher | see related issue)
Does anyone have any idea how a floating widget like in the following apps can be implemented?
Some examples:
https://play.google.com/store/apps/details?id=gpc.myweb.hinet.net.PopupVideo
https://play.google.com/store/apps/details?id=com.milone.floatwidget
Any idea? Thanks.
That is done using an overlay on top of the existing activity. You can find the source here on my GitHub; it demonstrates a proof of concept for a question on Android's stackexchange.com, where someone with a disability could not swipe the screen to scroll and was looking for a convenient way to do so. I cobbled together code that shows 'Page Up/Page Down' text which can be triggered with minimal movement, just by tapping on the text (really, it's a button) overlaid on top of an activity. But due to the way Android security works, the scrolling event could not be injected into the currently running activity.
The way it works is this: in the activity's onCreate there is this:
WindowManager.LayoutParams layOutParams = new WindowManager.LayoutParams(
        WindowManager.LayoutParams.TYPE_SYSTEM_OVERLAY,
        WindowManager.LayoutParams.FLAG_WATCH_OUTSIDE_TOUCH,
        PixelFormat.TRANSLUCENT);
What is needed is the WindowManager.LayoutParams.TYPE_SYSTEM_OVERLAY window type together with the WindowManager.LayoutParams.FLAG_WATCH_OUTSIDE_TOUCH flag.
The onus is on you to ensure the touch is handled, which means watching for when a touch event 'hits' a given area and acting accordingly, by "listening" in on the view's onTouch event:
LayoutInflater layOutInflater = (LayoutInflater) getSystemService(LAYOUT_INFLATER_SERVICE);
View myView = layOutInflater.inflate(R.layout.myview, null);
myView.setOnTouchListener(new OnTouchListener() {
    @Override
    public boolean onTouch(View v, MotionEvent event) {
        // ...
    }
});
For example, imagine a button within that myview layout; using the Button widget:
Button myButton = (Button)myView.findViewById(R.id.myButton);
Rect outRectHit = new Rect();
myButton.getHitRect(outRectHit);
Now we need to determine the 'collision' of the boundaries of the touch, which happens inside the onTouch handler:
float x = event.getX();
float y = event.getY();
if (x > outRectHit.left && x < outRectHit.right &&
        y > outRectHit.top && y < outRectHit.bottom) {
    // ... handle this accordingly
}
This brief summary explains how to do such a thing as an overlay on-top of any activity.
Just make sure that the code gets debugged and does not interfere in any shape or form with whatever activity is shown. For example, what happens if the activity is in fact an OpenGL game - what do you do?
That is to illustrate what needs to be watched out for.
You might want to have a look at Tooleap SDK.
It offers the simplest way of turning regular Android Activities into floating widgets.
E.g., let's say you have an Activity called MyActivity
Intent intent = new Intent(context, MyActivity.class);
TooleapPopOutMiniApp miniApp = new TooleapPopOutMiniApp(context, intent);
Tooleap tooleap = Tooleap.getInstance(context);
tooleap.addMiniApp(miniApp);
MyActivity will now become available as a floating widget on top of any other application.
I'm building a little game in HTML/JS on Android. I'm running into a problem on my HTC Desire (Android 2.2). When I touch the screen, all the images look pixelated and they get un-pixelated when the touch ends.
Here is a screenshot:
On the right it's when the screen is being touched. Can someone help me figure out what's causing this issue?
Notes:
No problems during the animations if the screen is not touched
I don't have this problem on my LG 540 Android 2.1
It seems images get a restricted number of colors while the screen is being touched
I'm using Phonegap
As far as I can tell, that "pixelated" behavior is an optimization made for scrolling (in Froyo and above). If the rendering is simplified, it makes things like the fling scroll animation require less processing.
If you need full browser functionality, I'm not sure you can help it much.
Since you've said you're making a game, however, I might have a workaround. I'm hoping your game doesn't need scroll (one full screen), so the scrolling optimization isn't necessary.
I've done a simple test with a WebView. On tap, as you mentioned, the rendering gets simplified, and things look a little off. Then once something is clicked (the WebView knows no more scrolling is taking place), things go back to normal.
I modified my layout by replacing the WebView with a FrameLayout. The FrameLayout contains the WebView and an invisible Button (on top). This Button grabs all the touch events. Then I selectively choose which types of events the WebView should receive and pass them to the WebView. If a touch down and touch up happen close together, with no movement in between, there's no reason for scrolling, so I haven't seen any of that "pixelated" behavior.
Because it was simplest for this example, I've chosen to detect the "MotionEvent.ACTION_UP" event, and when it's complete, I send a down first, so that it simulates a real click. You could certainly trigger on ACTION_DOWN, but you'll get more than one of those if the user swipes or something, and I wanted to keep the logic here simple. You can customize as you see fit, and probably with enough work, even enable scrolling in some cases. I hope the code below is enough to relay what I think works.
final WebView wv = new WebView(this);
View dummyView = new Button(this);
dummyView.setBackgroundColor(0x00000000);
dummyView.setOnTouchListener(new OnTouchListener() {
    @Override
    public boolean onTouch(View v, MotionEvent event) {
        if (event.getAction() == MotionEvent.ACTION_UP) {
            MotionEvent down = MotionEvent.obtain(100, 100,
                    MotionEvent.ACTION_DOWN, event.getX(),
                    event.getY(), 0);
            wv.onTouchEvent(down);
            wv.onTouchEvent(event);
        }
        return false;
    }
});
FrameLayout fl = new FrameLayout(this);
fl.addView(wv);
fl.addView(dummyView);
topLayout.addView(fl);
EDIT:
If you don't want to edit PhoneGap source, you might be able to do something like the following to change the PhoneGap layout... It's untested, but seems like it should work:
@Override
public void onCreate(Bundle arg0) {
    super.onCreate(arg0);
    super.loadUrl("file:///android_asset/www/index.html");

    // Get the "root" view from PhoneGap
    LinearLayout droidGapRoot = super.root;

    // Create a new "root" that we can use.
    final LinearLayout newRoot = new LinearLayout(this);
    // Move all views from PhoneGap's LinearLayout to ours. Remove from index 0
    // each time, because removing a child shifts the remaining indices.
    while (droidGapRoot.getChildCount() > 0) {
        View moveMe = droidGapRoot.getChildAt(0);
        droidGapRoot.removeView(moveMe);
        newRoot.addView(moveMe);
    }

    // Create an invisible button to overlay all other views, and pass
    // clicks through.
    View dummyView = new Button(this);
    dummyView.setBackgroundColor(0x00000000);
    dummyView.setOnTouchListener(new OnTouchListener() {
        @Override
        public boolean onTouch(View v, MotionEvent event) {
            // Only pass "UP" events to the specific view we care about, but
            // be sure to simulate a valid "DOWN" press first, so that the
            // click makes sense.
            if (event.getAction() == MotionEvent.ACTION_UP) {
                MotionEvent down = MotionEvent.obtain(100, 100,
                        MotionEvent.ACTION_DOWN, event.getX(),
                        event.getY(), 0);
                newRoot.onTouchEvent(down);
                newRoot.onTouchEvent(event);
            }
            return false;
        }
    });

    // Layer the views properly
    FrameLayout frameLayout = new FrameLayout(this);
    frameLayout.addView(newRoot);
    frameLayout.addView(dummyView);

    // Add our new customized layout back to the PhoneGap "root" view.
    droidGapRoot.addView(frameLayout);
}
I'm running into this same issue. One thing you could try is adding android:hardwareAccelerated="true" to the <application> tag in your app's manifest. This stopped the problem for me; however, the whole app now seems a little more pixelated overall on my device, so this might not be the best solution.
Check twice whether your images have the same size in CSS (in pixels) as the files themselves. It seems to be somehow related: if I take a large image and re-scale it with device-dependent generated CSS, I see the problem; otherwise it's either not present or not visible. I'm not sure whether the original problem has been fixed in the latest Android versions, but I still support 2.3, so I hope this helps.