How to capture click coordinates on Android?

How to capture click coordinates on Android? I mean the X Y coordinates that are visible when "Pointer location" is enabled in Settings -> Developer options. The coordinates should be captured all the time, independent of what is currently happening on the system.
The coordinates should be written out to a file, printed to logcat, sent through a TCP socket, or similar.
Related question: Read /dev/input/event in android via Java programming language

Edit 4 Jun 2016: there is a tool for that, RERAN. Its bug-fixed code is available in this fork on GitHub.
There is a tool that captures screenshots of an AVD or a real device and interactively generates monkeyRunner scripts, with coordinates taken from mouse clicks on the screenshot: wGetAndroidSnapshot
One hackish thing I could think of is to capture the coordinates of mouse clicks in the underlying OS and map them to coordinates on the running AVD that was clicked. Even more hackish would be to take a screenshot of the coordinate readout shown on the AVD when "Pointer location" is enabled and OCR it :)
There are also commercial solutions.
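If raw events are enough, a lower-tech route (closely related to the /dev/input question linked above) is to stream 'adb shell getevent' output on the host and pick out the multi-touch position axes. Below is a minimal host-side Java sketch of that idea; it assumes adb is on the PATH, and the values it prints are in the touchscreen's raw units (see 'adb shell getevent -lp' for the per-axis maximums), not screen pixels.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;

// Minimal host-side sketch (not the RERAN tool itself): stream labelled kernel
// input events from the device over adb and print the multi-touch position
// axes. Assumes adb is on the PATH and the device uses the standard ABS_MT_*
// protocol; values are raw touchscreen units, not screen pixels.
public class TouchCoordinateDump {
    public static void main(String[] args) throws Exception {
        Process getevent = new ProcessBuilder("adb", "shell", "getevent", "-lt")
                .redirectErrorStream(true)
                .start();
        BufferedReader in = new BufferedReader(
                new InputStreamReader(getevent.getInputStream()));
        String line;
        while ((line = in.readLine()) != null) {
            // Typical line: "[ 1234.567] /dev/input/event2: EV_ABS ABS_MT_POSITION_X 000001f4"
            if (line.contains("ABS_MT_POSITION_X") || line.contains("ABS_MT_POSITION_Y")) {
                String[] parts = line.trim().split("\\s+");
                String axis = parts[parts.length - 2];
                long raw = Long.parseLong(parts[parts.length - 1], 16);
                // Instead of printing, this could append to a file or write to a TCP socket.
                System.out.println(axis + " = " + raw);
            }
        }
    }
}
```

RERAN operates at roughly this level too: it records the getevent stream and replays it on the device with the original timing.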

Related

Clicking/Tapping Screen Coordinates on Netflix/Prime/Hulu App on AndroidTV device

I am trying to automate a user experience on Android TV apps which don't have a standard Android view hierarchy (probably written using an OpenGL framework).
I can provide the dump of the view of any of these apps if needed.
I'm trying to fire a click event for a particular button, say 'ABC', which is present at coordinates 'x y' on the screen.
For native Android ATV apps, I can do that by firing an 'adb shell input tap x y' event or UiDevice.click(x, y).
However, I'm not able to do so with the Netflix, Prime, YouTube or Hulu apps for ATV.
The click/tap is actually triggered on the screen, but the button doesn't respond to that.
Maybe that's because it's just part of a View (FrameLayout) and not actually a button in the OpenGL world.
I don't want to use the D-Pad Remote Control events for this.
(Maybe shifting focus to that coordinate and then pressing DPAD centre could work.)
Is there any way I can achieve that?
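For reference, here is a minimal UiAutomator sketch of the two approaches mentioned above: injecting a tap at a coordinate, then falling back to a DPAD confirm once focus is on the right element. The class name and coordinates are placeholders; as the question notes, GL-rendered TV apps may receive the injected tap but not react to it.

```java
import androidx.test.ext.junit.runners.AndroidJUnit4;
import androidx.test.platform.app.InstrumentationRegistry;
import androidx.test.uiautomator.UiDevice;
import org.junit.Test;
import org.junit.runner.RunWith;

// Illustrative instrumentation test; coordinates are placeholders.
@RunWith(AndroidJUnit4.class)
public class TvCoordinateTapTest {

    @Test
    public void tapThenConfirmWithDpad() {
        UiDevice device = UiDevice.getInstance(
                InstrumentationRegistry.getInstrumentation());

        // Same effect as 'adb shell input tap x y': injects a tap at the given
        // screen coordinates. GL-rendered TV apps may simply ignore it.
        device.click(960, 540);

        // Fallback along the lines suggested in the question: move focus to the
        // target first (pressDPadRight()/pressDPadDown() etc.), then confirm
        // with DPAD_CENTER, which these apps do handle.
        device.pressDPadCenter();
    }
}
```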

How to get all touched pixels on android

Is there a way to get every single touched pixel during a touch event? I understand that Android can provide an ellipse, but I would like to access every single pixel that was touched, for research purposes.
I understand this is the same question as How to get the all the raw touch points at a particular instant for a tap?
Has something changed since then that makes this data available now on Android devices, or possibly iOS?
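As far as I know, nothing has changed: the framework still only exposes an approximation of the contact patch per pointer (an ellipse plus pressure), not individual pixels. A small sketch of reading those values from a MotionEvent, in case that approximation is enough for the research:

```java
import android.util.Log;
import android.view.MotionEvent;
import android.view.View;

// Logs the contact "ellipse" Android reports per pointer. The framework does
// not expose individual touched pixels, only this size/orientation
// approximation plus pressure.
public class TouchShapeLogger implements View.OnTouchListener {
    @Override
    public boolean onTouch(View v, MotionEvent event) {
        for (int i = 0; i < event.getPointerCount(); i++) {
            Log.d("TouchShape", "x=" + event.getX(i)
                    + " y=" + event.getY(i)
                    + " major=" + event.getTouchMajor(i)      // long axis of the ellipse, px
                    + " minor=" + event.getTouchMinor(i)      // short axis, px
                    + " orient=" + event.getOrientation(i)    // radians, 0 = vertical
                    + " pressure=" + event.getPressure(i));
        }
        return true;
    }
}
```

Attach it with someView.setOnTouchListener(new TouchShapeLogger()). Anything finer-grained would need the touch controller's raw capacitance data, which stock Android does not expose as far as I know.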

faulty touch screen/android touchpad for same device

I have an Android phone with a small rectangular area of the touch screen not working (red rectangle in the image).
faulty touch image
see video
The question is: is there a way to tap on the faulty area using the other, working area, given that 80%+ of the touch screen is fine? I'm thinking of a virtual touchpad. I couldn't find anything after extensive searching.
Basically I'm looking for a way to manipulate the mouse pointer in Android. When you plug an external mouse into the USB port, a pointer comes up which can be used instead of touch input and gives you more precision/mouseover control.
I was looking for an app to control this mouse pointer programmatically, i.e. a virtual touchpad to control the same Android device. All the apps I have found control either a Windows/Linux PC or another Android device.
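If no existing app fits, one way to build such a virtual touchpad yourself is an accessibility service: since API 24 an AccessibilityService can inject taps at arbitrary coordinates via dispatchGesture(), including inside the dead zone. Below is a hedged sketch; the service name and the idea of remapping taps from the working area are mine, and the service must be declared with android:canPerformGestures="true" and enabled by the user.

```java
import android.accessibilityservice.AccessibilityService;
import android.accessibilityservice.GestureDescription;
import android.graphics.Path;
import android.view.accessibility.AccessibilityEvent;

// Hypothetical "virtual touchpad" service: the caller would translate a tap
// made in the working area into target coordinates and hand them to tapAt().
// Requires API 24+, BIND_ACCESSIBILITY_SERVICE in the manifest, and
// android:canPerformGestures="true" in the service configuration.
public class RemapTapService extends AccessibilityService {

    /** Injects a synthetic tap at (x, y), even inside the faulty region. */
    public void tapAt(float x, float y) {
        Path path = new Path();
        path.moveTo(x, y);
        GestureDescription gesture = new GestureDescription.Builder()
                .addStroke(new GestureDescription.StrokeDescription(path, 0, 50))
                .build();
        dispatchGesture(gesture, null, null);
    }

    @Override
    public void onAccessibilityEvent(AccessibilityEvent event) {
        // Not needed for this sketch.
    }

    @Override
    public void onInterrupt() {
        // Not needed for this sketch.
    }
}
```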

Get coordinates of touch event in Android emulator externally

I want to have a real-time record of the pixel xy coordinates of touches made on the Android emulator that is application-independent and is available to programs outside of the emulator (perhaps an updating text file or stream). I know about MotionEvent but that would not be application-independent (requires editing the code of any app that would be used).
Basically, I want the functionality of the Pointer Location option (Dev Tools / Development Settings) on the emulator, but in a form that records the latest touch event and can send that coordinate information outside of the emulator.
Is there any way I might accomplish this?
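One application-independent route is the same 'adb shell getevent' stream sketched under the first question above; the only extra step for the emulator is scaling the raw axis values to pixels. A minimal sketch of that conversion, assuming you have read the axis maximum from 'adb shell getevent -lp' and the display size from 'adb shell wm size' (the 32767 maximum below is just a typical emulator value, not a guarantee):

```java
// Converts a raw getevent axis value to a screen pixel. rawMax comes from
// 'adb shell getevent -lp' and displaySizePx from 'adb shell wm size'; the
// values used in main() are illustrative only.
public class RawToPixel {
    public static int toPixel(long rawValue, long rawMax, int displaySizePx) {
        return (int) (rawValue * displaySizePx / rawMax);
    }

    public static void main(String[] args) {
        // Example: raw X 16384 on a 0..32767 axis, 1080 px wide display -> ~540 px.
        System.out.println(toPixel(16384, 32767, 1080));
    }
}
```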

Change an image on hover without touching the screen in android

I have a very creative requirement - I am not sure if this is feasible - but it would certainly spice up my app if it could be done.
Premise: On Android phones, if the screen is covered by a hand (not touching, just close to the screen), or if the phone is placed over the ear during a call, the phone locks or basically blacks out. So there must be some technology that recognizes that my hand is near the screen.
Problem: I have an image in my app. If the user points to the image without touching the screen, just as an extension of the premise, I must be able to know that the user is pointing to the image, and change the image. Is this possible?
UPDATE: An example use:
Say I want to build a fun app where touching an image leads to some other place. For example, I have two doors, one to a car and one to a lion. Now, just when the user is about to touch door 1, the door should show a message asking "are you sure?", and then actually touching it takes you to another place. A rather rudimentary example, but I hope you get the point.
The feature you are talking about is the proximity sensor. See Sensor and SensorEvent.values for Sensor.TYPE_PROXIMITY.
You could get the distance of the hand from the screen, but you won't really be sure where in the XY coordinate system the hand is. So you won't be able to figure out whether the user is pointing at the "car door" or at the "lion door".
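For completeness, here is a minimal sketch of reading that sensor (the class name is mine). It also illustrates the limitation above: most proximity sensors report only a coarse distance, often binned to near/far, with no X/Y position of the hand.

```java
import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.util.Log;

// Minimal proximity-sensor reader. Typical hardware reports either "near"
// or the sensor's maximum range, never where over the screen the hand is.
public class ProximityWatcher implements SensorEventListener {
    private final SensorManager sensorManager;

    public ProximityWatcher(Context context) {
        sensorManager = (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
    }

    public void start() {
        Sensor proximity = sensorManager.getDefaultSensor(Sensor.TYPE_PROXIMITY);
        if (proximity != null) {
            sensorManager.registerListener(this, proximity,
                    SensorManager.SENSOR_DELAY_NORMAL);
        }
    }

    public void stop() {
        sensorManager.unregisterListener(this);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        float distanceCm = event.values[0]; // distance in cm, frequently just near/far
        Log.d("Proximity", "object at ~" + distanceCm + " cm from the screen");
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
        // Not needed for this sketch.
    }
}
```

Call new ProximityWatcher(context).start() from an Activity to see the values it produces on your device.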
You could make this work on a phone with a front camera with a wide angle so it can see the whole screen. You'd have to write the software for recognizing hand movements, and translate these to screen actions.
Why not just use touch, if I may ask?
