I'm using uinput to control an Android phone from my PC's mouse and keyboard.
The problem I'm having is that I can't get the cursor position to match what's being sent from the PC, because Android is accelerating/smoothing the movements. Exact 1:1 positioning is an absolute requirement for this project.
I've tried relative mouse movements using EV_REL. I've also tried absolute positioning using EV_ABS and BTN_TOUCH to simulate a touchscreen, but, strangely, this too has acceleration.
Is there a way to either disable acceleration or send absolutely positioned click events?
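For what it's worth, Android generally treats devices that report relative axes as mice and applies pointer acceleration to them, while a device registered with absolute multitouch axes and the direct (touchscreen) input property is mapped 1:1 to the display. Below is a minimal sketch of the coordinate-scaling half of that approach; the commented-out device setup assumes the python-evdev bindings and would need /dev/uinput access on the machine driving the phone, so treat it as an outline rather than a tested recipe.

```python
# Sketch: map PC mouse coordinates onto an absolute-axis touch device.
# The scaling itself is plain arithmetic; the virtual-device setup below
# (commented out) assumes python-evdev and /dev/uinput access.

def scale_to_abs(x, y, src_w, src_h, abs_max=32767):
    """Map a PC screen coordinate to the uinput device's ABS_X/ABS_Y range."""
    ax = x * abs_max // (src_w - 1)
    ay = y * abs_max // (src_h - 1)
    return ax, ay

# With python-evdev, the touchscreen-style device would be registered
# roughly like this (requires /dev/uinput, so commented out here):
#
# from evdev import UInput, AbsInfo, ecodes as e
# cap = {
#     e.EV_KEY: [e.BTN_TOUCH],
#     e.EV_ABS: [
#         (e.ABS_X, AbsInfo(value=0, min=0, max=32767, fuzz=0, flat=0, resolution=0)),
#         (e.ABS_Y, AbsInfo(value=0, min=0, max=32767, fuzz=0, flat=0, resolution=0)),
#     ],
# }
# ui = UInput(cap, name="virtual-touchscreen")
# ax, ay = scale_to_abs(960, 540, 1920, 1080)
# ui.write(e.EV_ABS, e.ABS_X, ax)
# ui.write(e.EV_ABS, e.ABS_Y, ay)
# ui.write(e.EV_KEY, e.BTN_TOUCH, 1)
# ui.syn()

if __name__ == "__main__":
    print(scale_to_abs(0, 0, 1920, 1080))        # top-left corner -> (0, 0)
    print(scale_to_abs(1919, 1079, 1920, 1080))  # bottom-right -> (32767, 32767)
```

The key point is that the coordinates are absolute positions within the advertised axis range, so there is no delta for the OS to accelerate.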
I am trying to automate a user experience on Android TV apps that don't have a standard Android view hierarchy (probably written using an OpenGL framework).
I can provide the view-hierarchy dump of any of these apps if needed.
I'm trying to fire a click event for a particular button, say 'ABC', which is present at the x,y coordinates on the screen.
For native Android TV apps, I can do that by firing an `adb shell input tap x y` command or UiDevice.click(x, y).
However, I'm not able to do so with the Netflix, Prime Video, YouTube, or Hulu apps for ATV.
The click/tap is actually triggered on the screen, but the button doesn't respond to it.
Maybe that's because it's part of just a View (FrameLayout) and not actually a button in the OpenGL world.
I don't want to use the D-pad remote-control events for this.
(Maybe shifting focus to that coordinate and then pressing D-pad center could work.)
Is there any way I can achieve that?
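The fallback hinted at in the question (shift focus, then press D-pad center) can be scripted over adb with key events. A rough sketch, assuming the target can be reached by a known sequence of focus moves; the keycodes are the real Android KEYCODE_DPAD_* constants, but the move sequence itself is app-specific and made up here:

```python
# Sketch: since taps are ignored by these apps, walk focus with D-pad key
# events and finish with DPAD_CENTER. Keycodes are Android's standard
# KEYCODE_DPAD_* values; the move sequence per app is an assumption.
import subprocess

KEYCODES = {
    "up": 19, "down": 20, "left": 21, "right": 22, "center": 23,
}

def dpad_commands(moves):
    """Build the adb commands for a sequence of D-pad moves ending in a click."""
    cmds = [["adb", "shell", "input", "keyevent", str(KEYCODES[m])] for m in moves]
    cmds.append(["adb", "shell", "input", "keyevent", str(KEYCODES["center"])])
    return cmds

def run(moves):
    """Execute the sequence against a connected device (needs adb on PATH)."""
    for cmd in dpad_commands(moves):
        subprocess.run(cmd, check=True)

if __name__ == "__main__":
    for cmd in dpad_commands(["down", "down", "right"]):
        print(" ".join(cmd))
```

The downside, as the question notes, is that this depends on the app's focus order rather than raw coordinates.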
I have an Android phone with a small rectangular area of the touch screen not working (the red rectangle in the image).
faulty touch image
see video
The question is: is there a way to tap on the faulty area using the other, working area, since 80%+ of the touch screen is fine? I'm thinking of a virtual touchpad, but I couldn't find anything after extensive searching.
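The remapping itself is just a linear transform between two rectangles: treat a working region as a virtual touchpad and project taps from it into the dead area. A minimal sketch with made-up coordinates for both rectangles; actually re-injecting the translated tap on-device would still need root or an accessibility service:

```python
# Sketch: linearly remap a tap inside a working "virtual touchpad" rect
# onto a target (faulty) rect. All rectangle coordinates are examples.

def remap(x, y, pad, target):
    """Map a point inside the `pad` rect onto the `target` rect.
    Rects are (left, top, right, bottom)."""
    px = (x - pad[0]) / (pad[2] - pad[0])   # normalized position in pad
    py = (y - pad[1]) / (pad[3] - pad[1])
    tx = target[0] + px * (target[2] - target[0])
    ty = target[1] + py * (target[3] - target[1])
    return round(tx), round(ty)

if __name__ == "__main__":
    pad = (0, 1500, 1080, 1920)      # working strip at the bottom (example)
    faulty = (300, 200, 600, 400)    # dead rectangle (example)
    print(remap(540, 1710, pad, faulty))  # center of pad -> center of faulty
```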
Basically I'm looking for a way to manipulate the mouse pointer in Android. When you plug an external mouse into the USB port, a pointer comes up which can be used instead of touch input and gives you more precision/mouseover control.
I was looking for an app to control this mouse pointer programmatically, i.e. a virtual touchpad to control the same Android device. All the apps I have found control either a Windows/Linux PC or another Android device.
I am creating a project in Android.
It has a prerequisite: play a video by moving the Android device.
I have implemented the accelerometer sensor, but if I move the device up, down, left, or right on a flat surface, no event is fired. It only detects when the device rotates in some direction.
Is it possible to detect the device moving on a flat surface?
Thanks in advance.
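One thing worth noting: the raw accelerometer always reports gravity plus any linear acceleration, so sliding on a flat surface does change the readings, just subtly. The Android sensor documentation suggests estimating gravity with a low-pass filter and subtracting it to isolate linear acceleration (or using Sensor.TYPE_LINEAR_ACCELERATION, which does this for you). A sketch of that filter in plain Python, with a made-up smoothing constant:

```python
# Sketch: isolate linear acceleration from raw accelerometer samples.
# A low-pass filter tracks the gravity component; subtracting it leaves
# the linear (sliding) motion. ALPHA is a tunable constant, chosen here
# arbitrarily.

ALPHA = 0.8

def step(gravity, event):
    """One filter step: returns (new_gravity_estimate, linear_acceleration)."""
    g = [ALPHA * g0 + (1 - ALPHA) * a for g0, a in zip(gravity, event)]
    linear = [a - gi for a, gi in zip(event, g)]
    return g, linear

if __name__ == "__main__":
    gravity = [0.0, 0.0, 9.81]
    # Device sliding along +x on a flat table: z stays ~9.81, x spikes.
    g, lin = step(gravity, [2.0, 0.0, 9.81])
    print(lin)  # the x component is the linear (sliding) acceleration
```

In practice you would run this per SensorEvent and trigger your video-playback logic when the linear component crosses a threshold.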
How to capture click coordinates on Android? I mean the X Y coordinates that are visible when "Pointer location" is enabled in Settings -> Developer options. The coordinates should be captured all the time, independent of what is currently happening on the system.
The coordinates should be either written to a file, printed to logcat, sent through a TCP socket, or whatever.
Related question: Read /dev/input/event in android via Java programming language
Edit 4 Jun 2016: there is a tool for that, RERAN. Its bug-fixed code is available in this fork on GitHub.
There is a tool allowing for screenshot capture of avd / real device and for interactive monkeyRunner script generation with coordinates taken from mouse clicks on the screenshot: wGetAndroidSnapshot
One hackish thing I could think of is to capture the coordinates of mouse clicks in the underlying OS and map them to coordinates of the running AVD on which the mouse was clicked. Even more hackish would be to capture a screenshot of the coordinate readout on the AVD when "Pointer location" is enabled and OCR it :)
There are also commercial solutions.
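Along the lines of the related question above, touch coordinates can also be recovered by parsing `adb shell getevent -lt` output, where positions arrive as hex ABS_MT_POSITION_X/Y values. A minimal parser sketch; the sample lines mimic getevent's format but are invented:

```python
# Sketch: pull (x, y) touch positions out of `adb shell getevent -lt`
# output. getevent reports multitouch positions as hex values on
# ABS_MT_POSITION_X / ABS_MT_POSITION_Y lines; the sample input below
# imitates that format but is made up.

def parse_getevent(lines):
    """Yield (x, y) each time a Y coordinate completes a position pair."""
    x = None
    for line in lines:
        parts = line.split()
        if "ABS_MT_POSITION_X" in parts:
            x = int(parts[-1], 16)
        elif "ABS_MT_POSITION_Y" in parts and x is not None:
            yield x, int(parts[-1], 16)

if __name__ == "__main__":
    sample = [
        "[   1000.1] /dev/input/event2: EV_ABS ABS_MT_POSITION_X 0000021c",
        "[   1000.1] /dev/input/event2: EV_ABS ABS_MT_POSITION_Y 000003e8",
    ]
    print(list(parse_getevent(sample)))  # [(540, 1000)]
```

In a live setup you would feed this the stdout of a `subprocess.Popen(["adb", "shell", "getevent", "-lt"])` pipe and write the pairs wherever you need them (file, logcat, TCP socket).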
I would like to know what is meant by gestures on typical mobile devices, more specifically Android ones. Android supports gesture recognition.
1) Is a gesture when the user holds the device and shakes it (say upwards, downwards, or side to side)?
2) Is a gesture when a finger is placed on the screen and traced up, down, right, or left? If so, what is the difference between touch-screen input and gestures?
I am confused between options 1) and 2).
What is a gesture exactly?
As I understand it, a gesture is any time a user touches the screen and performs a pre-defined motion that the system understands. I would venture to say that shaking the phone is not a gesture, but a function of detecting changes in the accelerometer readings.
From Android's reference web page, a gesture is a hand-drawn shape on a touch screen. It can have one or multiple strokes. Each stroke is a sequence of timed points. A user-defined gesture can be recognized by a GestureLibrary.
https://developer.android.com/reference/android/gesture/Gesture.html
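To make the "sequence of timed points" idea concrete, here is a toy sketch of stroke matching: resample each stroke to a fixed number of points and compare the average point distance against stored templates. This is only an illustration of the concept; Android's GestureLibrary uses a far more sophisticated recognizer:

```python
# Toy sketch of stroke matching: a stroke is a sequence of points; to
# compare two strokes, resample both to N evenly spaced points and take
# the mean point-to-point distance. Not Android's actual algorithm.
import math

N = 16  # points per resampled stroke

def resample(points, n=N):
    """Resample a polyline to n evenly spaced points."""
    total = sum(math.dist(points[i], points[i + 1]) for i in range(len(points) - 1))
    seg = total / (n - 1)
    out = [points[0]]
    prev, i, d = points[0], 1, 0.0
    while len(out) < n and i < len(points):
        step_len = math.dist(prev, points[i])
        if d + step_len >= seg and step_len > 0:
            t = (seg - d) / step_len
            prev = (prev[0] + t * (points[i][0] - prev[0]),
                    prev[1] + t * (points[i][1] - prev[1]))
            out.append(prev)
            d = 0.0
        else:
            d += step_len
            prev = points[i]
            i += 1
    while len(out) < n:          # pad if float drift ends the walk early
        out.append(points[-1])
    return out

def distance(a, b):
    """Mean point-to-point distance between two resampled strokes."""
    return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)

if __name__ == "__main__":
    swipe_right = resample([(0, 0), (10, 0)])
    swipe_down = resample([(0, 0), (0, 10)])
    print(distance(swipe_right, swipe_down))  # large: different gestures
```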
I see gestures as a type of input pattern that you expect from the user. For example, you can set up mouse gestures in web browsers to do things like going "Forward" or "Back" in the browser by performing a specific gesture (maybe middle-clicking and moving the mouse left goes "Back").
I'll give a brief answer to your bonus question: yes, it's quite possible to do character recognition from input gestures on Android. In fact, at least one major vendor has already ported an existing handwriting engine to that platform. It works beautifully, but there's a lot of legal and marketing cruft to take care of before it ends up on real devices :(