I want a real-time record of the pixel x/y coordinates of touches made on the Android emulator that is application-independent and available to programs outside the emulator (perhaps an updating text file or stream). I know about MotionEvent, but that would not be application-independent (it requires editing the code of every app that would be used).
Basically, I want the functionality of the Dev Tools/Development Settings/Pointer Location feature on the emulator, but something that records the latest touch event and can send that coordinate information outside the emulator.
Is there any way I might accomplish this?
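One commonly suggested route is to stream the emulator's kernel input events from the host with adb shell getevent -l and parse the multi-touch position axes. Below is a minimal host-side sketch in Java; the class name is just for illustration, it assumes adb is on the PATH, and the raw values are device units that may still need scaling to screen pixels (check the axis ranges with getevent -p):

    import java.io.BufferedReader;
    import java.io.InputStreamReader;

    // Illustrative sketch: stream touch events out of the emulator with
    // "adb shell getevent -l" and print an x,y pair on each SYN_REPORT.
    public class TouchLogger {
        public static void main(String[] args) throws Exception {
            Process p = new ProcessBuilder("adb", "shell", "getevent", "-l")
                    .redirectErrorStream(true).start();
            BufferedReader in = new BufferedReader(
                    new InputStreamReader(p.getInputStream()));
            int x = -1, y = -1;
            String line;
            while ((line = in.readLine()) != null) {
                String[] f = line.trim().split("\\s+");
                if (f.length < 4) continue;
                // getevent prints axis values in hexadecimal
                if (f[2].equals("ABS_MT_POSITION_X")) x = Integer.parseInt(f[3], 16);
                if (f[2].equals("ABS_MT_POSITION_Y")) y = Integer.parseInt(f[3], 16);
                if (f[1].equals("EV_SYN") && x >= 0 && y >= 0) {
                    System.out.println(x + "," + y); // or append to a file / socket
                }
            }
        }
    }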
Related
I am trying to automate a user experience on Android TV apps that don't have a standard Android view hierarchy (they are probably written using an OpenGL framework).
I can provide the dump of the view of any of these apps if needed.
I'm trying to fire a click event for a particular button, say 'ABC', which is present at the x,y coordinates on the screen.
For native Android TV apps, I can do that by firing an 'adb shell input tap x y' event or UiDevice.click(x, y).
However, I'm not able to do so with the Netflix, Prime Video, YouTube, or Hulu apps for ATV.
The click/tap is actually triggered on the screen, but the button doesn't respond to that.
Maybe that's because it's just part of a View (FrameLayout) and not actually a button in the OpenGL world.
I don't want to use the D-Pad Remote Control events for this.
(Maybe shifting focus to that coordinate and then pressing D-pad centre could work; see the sketch below.)
Is there any way I can achieve that?
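If raw taps keep being swallowed by the GL surface, here is a minimal UiAutomator sketch of the focus-then-centre idea from the parenthetical above. The class name and step counts are hypothetical; you would have to derive the counts from the app's focus grid:

    import androidx.test.platform.app.InstrumentationRegistry;
    import androidx.test.uiautomator.UiDevice;

    // Hypothetical helper: walk focus toward the target with D-pad presses,
    // then confirm with D-pad centre, which acts on the focused element.
    public class GlButtonClicker {
        public static void activate(int stepsRight, int stepsDown) {
            UiDevice device = UiDevice.getInstance(
                    InstrumentationRegistry.getInstrumentation());
            for (int i = 0; i < stepsRight; i++) device.pressDPadRight();
            for (int i = 0; i < stepsDown; i++) device.pressDPadDown();
            device.pressDPadCenter();
        }
    }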
Is there a way to get every single touched pixel during a touch event? I understand that Android can provide an ellipse, but I would like to access every single pixel that was touched, for research purposes.
I understand this is the same question as How to get all the raw touch points at a particular instant for a tap?
Did something change since then to provide this data now on Android devices or possibly iOS?
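As far as I know, nothing has changed: the public API still reports each contact as an ellipse approximation (centre, major/minor axis in pixels, orientation in radians) plus a normalized size, not a pixel mask. For reference, a minimal listener showing what MotionEvent exposes per pointer (the class name is just for illustration):

    import android.util.Log;
    import android.view.MotionEvent;
    import android.view.View;

    // Logs the contact ellipse for every active pointer on each event.
    public class TouchEllipseLogger implements View.OnTouchListener {
        @Override
        public boolean onTouch(View v, MotionEvent event) {
            for (int i = 0; i < event.getPointerCount(); i++) {
                Log.d("Touch", String.format(
                        "x=%.1f y=%.1f major=%.1f minor=%.1f orient=%.2f size=%.3f",
                        event.getX(i), event.getY(i),
                        event.getTouchMajor(i), event.getTouchMinor(i),
                        event.getOrientation(i), event.getSize(i)));
            }
            return true; // consume the event to keep receiving the stream
        }
    }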
I am stuck on how to get the capacitance image out of an Android phone. Is it necessary to hack the Android system to get the capacitance image?
The released Android API only lets me get which point a finger is touching on the screen. But since the touchscreen uses capacitive sensing, I am wondering how I can get the raw capacitance-sensing data from the system. I guess I may need to hack the system for further information. Has anyone done this before? Or can you give me some hints or directions on where I should start?
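For comparison, here is everything capacitance-derived that the unmodified SDK exposes: a pair of per-pointer scalars computed by the touch controller, not the sensor matrix. A minimal sketch (the class and method names are just for illustration):

    import android.view.MotionEvent;

    public class CapacitiveScalars {
        static void log(MotionEvent e) {
            for (int i = 0; i < e.getPointerCount(); i++) {
                // Normalized 0..1 summaries; the per-node capacitance matrix
                // never leaves the controller/driver without kernel changes.
                System.out.printf("pointer %d: pressure=%.3f size=%.3f%n",
                        i, e.getPressure(i), e.getSize(i));
            }
        }
    }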
I'm trying to develop an app that can recognize an active object (for example, a memory) that touches the smartphone display. Before I start developing, I need to know whether there are any objects that a touchscreen display can recognize. Which devices can be recognized by a smartphone display? I'm interested in this for both iPhone and Android phones.
I found this app, and you can see that a card can interact with the mobile device; now I'm asking whether anyone knows how to build this kind of app with an iPhone or an Android phone.
Does anyone know how to do that? Is there a library (iOS or Android) for recognizing objects placed on the display?
Volumique is the company that develops the Monopoly card technology you are talking about. However, I will suggest two things.
For Android devices you can use NFC. It's similar to what you are doing right now, except that you only need to bring your object close to the screen; there is no need to actually touch it.
For iOS, there is no NFC or RFID technology available. However, you can develop hardware that has active capacitors arranged in a pattern over it, so that when you bring it close to the iOS screen, the touch controller recognizes the pattern of the capacitors and reports it to the main controller, which can identify the object with the help of your code. Basically, the capacitive touchscreens used in iPhones are just an array of capacitors arranged in a grid. When you touch with your finger, you change the capacitance of one or two capacitors, and the controller works out the location of the change. If instead you change the capacitance of, say, 5-6 sensors at the same time in a particular arrangement, such as a pentagon, then you can write software that says: if the locations of the sensors whose capacitance was changed by this external object form the shape of a pentagon, report to the viewer that it is a $5 card (just an example). This is one way I can think of doing this.
Thanks
How do I capture click coordinates on Android? I mean the X/Y coordinates that are visible when "Pointer location" is enabled in Settings -> Developer options. The coordinates should be captured all the time, independently of what is currently happening on the system.
The coordinates could be written out to a file, printed to logcat, sent through a TCP socket, or whatever.
Related question: Read /dev/input/event in android via Java programming language
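For completeness, a minimal sketch of that approach: reading raw input_event structs directly from a device node. This needs root; the node name (event2), the class name, and the 16-byte struct size (32-bit ABI; it is 24 bytes on 64-bit) are assumptions to verify with getevent -p on your device:

    import java.io.DataInputStream;
    import java.io.FileInputStream;
    import java.nio.ByteBuffer;
    import java.nio.ByteOrder;

    public class RawEventReader {
        private static final int EV_ABS = 0x03;
        private static final int ABS_MT_POSITION_X = 0x35;
        private static final int ABS_MT_POSITION_Y = 0x36;

        public static void main(String[] args) throws Exception {
            DataInputStream in = new DataInputStream(
                    new FileInputStream("/dev/input/event2"));
            byte[] raw = new byte[16]; // one struct input_event (32-bit ABI)
            while (true) {
                in.readFully(raw);
                ByteBuffer b = ByteBuffer.wrap(raw).order(ByteOrder.LITTLE_ENDIAN);
                b.getInt(); b.getInt(); // skip struct timeval (sec, usec)
                int type = b.getShort() & 0xffff;
                int code = b.getShort() & 0xffff;
                int value = b.getInt();
                if (type == EV_ABS && (code == ABS_MT_POSITION_X
                                       || code == ABS_MT_POSITION_Y)) {
                    System.out.println(
                            (code == ABS_MT_POSITION_X ? "x=" : "y=") + value);
                }
            }
        }
    }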
Edit 4 Jun 2016: there is a tool for that, RERAN. Its bug-fixed code is available in this fork on GitHub.
There is a tool that allows screenshot capture of an AVD / real device and interactive monkeyrunner script generation, with coordinates taken from mouse clicks on the screenshot: wGetAndroidSnapshot
One hackish approach I can think of is to capture the coordinates of mouse clicks in the underlying OS and map them to the coordinates of the running AVD that was clicked. Even more hackish would be to capture a screenshot of the coordinate values shown on the AVD when "Pointer location" is enabled and OCR them :)
There are also commercial solutions.