I want to programmatically generate a touch event on the screen outside of my app. My app currently has a floating window. I am trying to make something like RepetiTouch.
I used the Hover library to create a floating window that displays some messages and buttons. I tried following this answer, but the Hover library doesn't provide me with any views to work on
[ when I try to use view.dispatchTouchEvent(motionEvent); ]
ClassName.this, getView(), and the like don't work.
Is there any other floating-window library that provides views after implementation so that I can generate touches via code?
Can I do this using a service? If so, what should I use?
Update:
Refer to the following link, which shows how to simulate a touch event in Android:
https://stackoverflow.com/a/23902985/9287163
(Hope you find it useful)
I found my solution: I used Instrumentation.
InstrumentationObj.sendPointerSync(MotionEvent.obtain(SystemClock.uptimeMillis(), SystemClock.uptimeMillis(), MotionEvent.ACTION_DOWN, x_coordinate, y_coordinate, 0));
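To expand on that one-liner, a tap generally needs a matched ACTION_DOWN/ACTION_UP pair. Here is a minimal sketch (InstrumentationObj, x_coordinate and y_coordinate as above; sendPointerSync must be called off the main thread, and injecting into windows owned by other apps additionally requires the system-level INJECT_EVENTS permission, so it typically only works from an instrumentation context or on rooted/system builds):

// Uses android.view.MotionEvent and android.os.SystemClock
long downTime = SystemClock.uptimeMillis();

// Finger goes down at the target coordinates
MotionEvent down = MotionEvent.obtain(downTime, downTime,
        MotionEvent.ACTION_DOWN, x_coordinate, y_coordinate, 0);
InstrumentationObj.sendPointerSync(down);
down.recycle();

// Finger lifts at the same spot, completing the tap
MotionEvent up = MotionEvent.obtain(downTime, SystemClock.uptimeMillis(),
        MotionEvent.ACTION_UP, x_coordinate, y_coordinate, 0);
InstrumentationObj.sendPointerSync(up);
up.recycle();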
Related
I'm trying to write UI automation tests for a custom soft keyboard using UI Automator and/or Espresso. I have tried different approaches, but I can't find a proper way to find the exact button on the opened keyboard and click it.
Problems:
UI Automator's UiDevice.findObject(By.text("Q")).click() doesn't find the Q button on the keyboard.
Espresso's onView(withText("Q")).perform(click()) doesn't find the button either.
For now, it looks like the only way to click a button is to compute XY coordinates from the screen height and keyboard height, but that approach is ugly and fragile.
typeText("text") and uiObject.text = "text" don't work since they bypass keyboard input.
Has anyone worked with custom keyboards? Please help.
Since you're building a custom soft keyboard, I expect you're using a KeyboardView. KeyboardView draws the keys on a canvas, so there are no resource IDs for the individual keys and therefore no way to find them through UiDevice's findObject method.
Considering that the KeyboardView class has been deprecated since API 29, a possible solution would be to reimplement your own keyboard view (as suggested here) and use the AccessibilityNodeInfo class to build virtual elements (one for each key) that are exposed alongside the view hierarchy, as sketched below.
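One way to publish such virtual nodes for a single View is androidx's ExploreByTouchHelper (androidx.customview.widget); UI Automator reads the accessibility node tree, so keys exposed this way should become findable by text. A rough sketch, assuming a hypothetical MyKeyboardView that exposes its keys (getKeys(), key.bounds, key.label, pressKey()):

class KeyboardTouchHelper extends ExploreByTouchHelper {
    private final MyKeyboardView view;

    KeyboardTouchHelper(MyKeyboardView view) {
        super(view);
        this.view = view;
    }

    @Override
    protected int getVirtualViewAt(float x, float y) {
        List<MyKey> keys = view.getKeys();
        for (int i = 0; i < keys.size(); i++) {
            if (keys.get(i).bounds.contains((int) x, (int) y)) return i;
        }
        return ExploreByTouchHelper.INVALID_ID;
    }

    @Override
    protected void getVisibleVirtualViews(List<Integer> virtualViewIds) {
        for (int i = 0; i < view.getKeys().size(); i++) virtualViewIds.add(i);
    }

    @Override
    protected void onPopulateNodeForVirtualView(int id, AccessibilityNodeInfoCompat node) {
        MyKey key = view.getKeys().get(id);
        node.setText(key.label);               // lets By.text("Q") match
        node.setBoundsInParent(key.bounds);    // required, must be non-empty
        node.addAction(AccessibilityNodeInfoCompat.ACTION_CLICK);
    }

    @Override
    protected boolean onPerformActionForVirtualView(int id, int action, Bundle args) {
        if (action == AccessibilityNodeInfoCompat.ACTION_CLICK) {
            view.pressKey(id);                 // hypothetical key-press handler
            return true;
        }
        return false;
    }
}

// In the keyboard view's constructor:
// ViewCompat.setAccessibilityDelegate(this, new KeyboardTouchHelper(this));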
The best solution in my opinion would be to create your own TCP server to solve this issue. Please refer to this link to find out how: https://ops.tips/blog/a-tcp-server-in-c/
Hello,
Is there any library that supports the swipe-to-delete feature as implemented in Gmail on Android, and that also shows the undo button? I saw this at Google I/O 2013, so I assumed it is natively supported by Android. Is that so?
Kind Regards
Yes, there are, but you need to modify them slightly based on your needs:
https://github.com/romannurik/Android-SwipeToDismiss
https://github.com/timroes/SwipeToDismissUndoList
https://github.com/47deg/android-swipelistview
There is no library, I guess, but you can implement it yourself: use a drag listener and set the view's visibility to GONE (make a custom view by extending it), and when the user taps undo, set the visibility back to VISIBLE.
You can also use an OnTouchListener, track the coordinates where the user drags, and then apply your logic; a rough sketch of that idea is below.
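A minimal sketch of the OnTouchListener idea, assuming a hypothetical list row rowView and an undoButton; the half-width swipe threshold is arbitrary:

rowView.setOnTouchListener(new View.OnTouchListener() {
    private float downX;

    @Override
    public boolean onTouch(View v, MotionEvent event) {
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:
                downX = event.getRawX();
                return true;
            case MotionEvent.ACTION_MOVE:
                // Follow the finger so the row visibly slides
                v.setTranslationX(event.getRawX() - downX);
                return true;
            case MotionEvent.ACTION_UP:
                if (Math.abs(event.getRawX() - downX) > v.getWidth() / 2f) {
                    v.setVisibility(View.GONE);          // "deleted"
                    undoButton.setVisibility(View.VISIBLE);
                } else {
                    v.setTranslationX(0);                // snap back
                }
                return true;
        }
        return false;
    }
});

// Undo simply restores the row:
undoButton.setOnClickListener(btn -> {
    rowView.setTranslationX(0);
    rowView.setVisibility(View.VISIBLE);
    btn.setVisibility(View.GONE);
});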
I need to implement a swipe gesture using two-finger touch input on my ListView, from right to left and vice versa. It should work exactly like the "History" section of an app named Clean Master, which uses a swipe to go from Cache to Residual files (check the image), except that that app uses single-touch input and I want to implement it with two-finger touch input. If you've used this app, can you please tell me how to implement this? I have no idea, being new to this concept. Please help me learn.
Is this what you need? It's a library called SwipeListView. It's easy to use and has customization options.
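If you'd rather detect the two-finger gesture yourself instead of using a library, here is a rough sketch with a plain OnTouchListener; the 200px swipe threshold and the onTwoFingerSwipeLeft/Right callbacks are placeholders:

listView.setOnTouchListener(new View.OnTouchListener() {
    private float startX;
    private boolean twoFingers;

    @Override
    public boolean onTouch(View v, MotionEvent event) {
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_POINTER_DOWN:
                if (event.getPointerCount() == 2) {
                    twoFingers = true;
                    // Track the midpoint of the two fingers
                    startX = (event.getX(0) + event.getX(1)) / 2f;
                }
                break;
            case MotionEvent.ACTION_MOVE:
                if (twoFingers && event.getPointerCount() == 2) {
                    float dx = (event.getX(0) + event.getX(1)) / 2f - startX;
                    if (dx < -200) {             // swipe right-to-left
                        onTwoFingerSwipeLeft();  // hypothetical callback
                        twoFingers = false;
                    } else if (dx > 200) {       // swipe left-to-right
                        onTwoFingerSwipeRight();
                        twoFingers = false;
                    }
                }
                break;
            case MotionEvent.ACTION_UP:
            case MotionEvent.ACTION_CANCEL:
                twoFingers = false;
                break;
        }
        return false; // let the ListView keep handling single-finger scrolling
    }
});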
I have a need to create a circular dial/rotary style component for use in an application. It's essentially a circular menu that allows users to select from the items that are ringed around it, and then they can click the button in the center to activate the selected item. However, I've never created a custom UIView of this type, and don't really know where to begin. Can anyone give me any pointers as to how I would draw the view and then rotate it as the user drags their finger? I obviously know how to intercept touch events, etc. but I'm not sure how to actually go about manipulating the UI appropriately. Any tips or pointers would be great!
I don't know if you've already found a solution to this, but here is a nice overview of how to get started:
http://shahabhameed.blogspot.com/2011/05/custom-views-in-android.html
For your case, I think you can extend an existing View, namely the SeekBar: take the standard SeekBar and draw it as a circle.
Finally, here is source code that implements this kind of rotation as a volume knob. It is its own project, though, so you will have to do some work to use it in your own app.
http://mindtherobot.com/blog/534/android-ui-making-an-analog-rotary-knob/
Good Luck!
I have a neat library to do this. It is extremely stable and well maintained. https://bitbucket.org/warwick/hgdialrepo
Here's a YouTube demo: https://youtu.be/h_7VxrZ2W-g
This library comes with a demo app (with source code), and the demo app actually uses a dial as a menu, so I think this should be a perfect fit for you.
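If you would rather build it yourself, here is a minimal sketch (not taken from any of the libraries mentioned above) of a custom View that rotates as the user drags a finger around its center, using atan2 to turn touch positions into a rotation angle; the drawing is just a placeholder ring where the menu items would go:

public class DialView extends View {
    private final Paint ringPaint = new Paint(Paint.ANTI_ALIAS_FLAG);
    private float angle;            // current rotation in degrees
    private float lastTouchAngle;

    public DialView(Context context, AttributeSet attrs) {
        super(context, attrs);
        ringPaint.setStyle(Paint.Style.STROKE);
        ringPaint.setStrokeWidth(8);
    }

    // Angle of the touch point relative to the view's center
    private float touchAngle(MotionEvent event) {
        float dx = event.getX() - getWidth() / 2f;
        float dy = event.getY() - getHeight() / 2f;
        return (float) Math.toDegrees(Math.atan2(dy, dx));
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:
                lastTouchAngle = touchAngle(event);
                return true;
            case MotionEvent.ACTION_MOVE:
                float current = touchAngle(event);
                angle += current - lastTouchAngle;   // rotate by the finger's delta
                lastTouchAngle = current;
                invalidate();
                return true;
        }
        return super.onTouchEvent(event);
    }

    @Override
    protected void onDraw(Canvas canvas) {
        canvas.save();
        canvas.rotate(angle, getWidth() / 2f, getHeight() / 2f);
        // Placeholder for the ring of menu items
        canvas.drawCircle(getWidth() / 2f, getHeight() / 2f, getWidth() / 3f, ringPaint);
        canvas.restore();
    }
}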
The TouchUtils class in the Android documentation has methods like drag():
https://developer.android.com/reference/android/test/TouchUtils.html#drag(android.test.InstrumentationTestCase,%20float,%20float,%20float,%20float,%20int)
but they do not support multi-touch gestures, such as a two-finger swipe.
Looking at the MotionEvent.obtain() methods, there does not seem to be any way of creating a "virtual" multi-touch event from a test case.
Has anyone got this working?
Apparently there is no other way than to use the private method MotionEvent.obtainNano() to mock multi-touch events. Hopefully this will change in future versions.
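For what it's worth, on API 14 and later there is a public MotionEvent.obtain overload that takes PointerProperties and PointerCoords arrays and can describe more than one pointer. A rough sketch of injecting the start of a two-finger gesture from an instrumentation test (the coordinates are placeholders):

long downTime = SystemClock.uptimeMillis();

MotionEvent.PointerProperties[] props = new MotionEvent.PointerProperties[2];
MotionEvent.PointerCoords[] coords = new MotionEvent.PointerCoords[2];
for (int i = 0; i < 2; i++) {
    props[i] = new MotionEvent.PointerProperties();
    props[i].id = i;
    props[i].toolType = MotionEvent.TOOL_TYPE_FINGER;
    coords[i] = new MotionEvent.PointerCoords();
    coords[i].pressure = 1;
    coords[i].size = 1;
}
coords[0].x = 100; coords[0].y = 500;   // placeholder start positions
coords[1].x = 200; coords[1].y = 500;

// First finger down
MotionEvent down = MotionEvent.obtain(downTime, downTime,
        MotionEvent.ACTION_DOWN, 1, props, coords,
        0, 0, 1f, 1f, 0, 0, InputDevice.SOURCE_TOUCHSCREEN, 0);
getInstrumentation().sendPointerSync(down);

// Second finger down (pointer index 1 encoded in the action)
int secondDown = MotionEvent.ACTION_POINTER_DOWN
        | (1 << MotionEvent.ACTION_POINTER_INDEX_SHIFT);
MotionEvent pointerDown = MotionEvent.obtain(downTime, SystemClock.uptimeMillis(),
        secondDown, 2, props, coords,
        0, 0, 1f, 1f, 0, 0, InputDevice.SOURCE_TOUCHSCREEN, 0);
getInstrumentation().sendPointerSync(pointerDown);

// ...then ACTION_MOVE events with updated coords, followed by
// ACTION_POINTER_UP and ACTION_UP to finish the gesture.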