Trigger a touch event at specific physical coordinates on the screen in Android

I'm performing a little experiment that requires two activities running side by side to interact. The interaction is very simple: pressing a button in one activity should trigger a touch event anywhere in the other.
I'm not even sure this is possible; it might not be for security reasons, but I thought I would ask here anyway.
Is there a way to perform a touch event at position X,Y relative to the physical screen (not relative to my current activity)?
Again, I'm doing this just for fun.

I have found this blog post, which explains three possible methods for performing a click on an external activity:
http://www.pocketmagic.net/injecting-events-programatically-on-android/
They require root, but for my use this is not an issue. If anyone finds something simpler, or something that doesn't require root, I will mark that answer as correct.
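For what it's worth, a minimal sketch of one simple root-based approach (not necessarily one of the three from the blog post): shelling out to the platform's input tool, which injects events globally at absolute screen coordinates. It assumes a rooted device whose su binary accepts su -c <command>; the class name is my own.

    import java.io.IOException;

    // Sketch: inject a tap at absolute screen coordinates on a rooted device
    // by running the platform "input" shell tool as root.
    public final class RootTapInjector {

        // Taps the screen at absolute (x, y). Blocks until the shell exits,
        // so call it from a background thread, not the UI thread.
        public static void tap(int x, int y) throws IOException, InterruptedException {
            Process p = Runtime.getRuntime()
                    .exec(new String[] {"su", "-c", "input tap " + x + " " + y});
            p.waitFor();
        }
    }

Because the event is injected by the system shell, the coordinates are physical screen coordinates, independent of which activity happens to be on top.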

Related

How to do pinch-and-zoom with a touchpad on ChromeOS/Android?

I have an Android app which I'm running on a Chromebook. I have views which scale with pinch-and-zoom gestures when the user touches the device's screen, and these work fine on the Chromebook. I'm trying to get pinch-and-zoom working with the touchpad as well.
I can three-finger drag scrollable elements. I can two-finger drag and it drags around screen elements where dragging makes sense. I still get hover events and the events claim there are two pointers, so that's all good. However, as soon as the fingers start moving in opposing directions, the event stream stops.
Is there any way I can get the raw, unfiltered input event stream so I can see what's going on? I feel like the emulation layer's best-effort attempt to make everything "just work" (and it's a really good effort!) is biting me here. I also notice that some events come in as generic motion events and some as touch events, and some, like tap-to-click, do some of each. If it matters, the input device data for the ChromeOS Mouse claims it has the (touchscreen mouse) sources, which mostly makes sense. Except shouldn't it be touchpad instead, since it's not directly attached to a display?
On this page, list item #5 implies that some kind of synthetic event might be created and used somehow. Is there any way to see if those are being generated? And if yes, how would I take advantage?
Help!
A little more detail: single-finger operation of the touchpad gives me ACTION_HOVER_MOVE generic events. A two-finger drag gives me ACTION_MOVE touch events as long as both fingers are moving together. As soon as they start heading in different directions, the event stream stops.
Pinch-to-zoom support for the touchpad is still a work in progress. Once it is there, it will work seamlessly with the standard gesture recognizer used for touchscreen zoom as well; you should not have to do anything.
I can highly recommend upgrading to API level 24 if you want to target Chromebooks. There are also more details on input devices on Chromebooks here: https://developer.android.com/topic/arc/input-compatibility.html
Edit: The "touchpad" device type is very confusingly named. It is reserved for off-screen devices. The touchpad is treated as a mouse, since it moves the mouse cursor on screen.

Is there a way to speed up touch/click detection?

I have been working on a game where speed is required to score well. The user clicks on objects and uses them with other objects held in a GridView that is driven by an ImageAdapter. I have noticed that when I click quite fast, some clicks don't register. I have cleaned up the code that runs when the user clicks, but that doesn't seem to be the problem: when I tested with clicks only highlighting objects and running no other code, it was just as slow. So is there a way to speed up click detection, or is this limited by the speed of the device itself? I have been testing on an HTC One M8.
Return as soon as possible from the handler, do any heavy work in the background, and push UI updates back with runOnUiThread().
Note that changing view state MUST be done on the UI thread or the Android runtime will throw an exception. You can run complex calculations in a background AsyncTask and call runOnUiThread() from within it whenever you want to update UI components.
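A minimal sketch of that pattern, assuming the handler lives in an Activity; computeScore() and the views are hypothetical placeholders:

    import android.app.Activity;
    import android.view.View;
    import android.widget.TextView;

    public class ScoreActivity extends Activity {

        // Sketch: the click handler returns immediately, the heavy work runs
        // on a background thread, and only the final view update is posted back.
        void bind(View button, TextView status) {
            button.setOnClickListener(v -> new Thread(() -> {
                String result = computeScore();              // expensive work, off the UI thread
                runOnUiThread(() -> status.setText(result)); // views may only be touched on the UI thread
            }).start());
        }

        private String computeScore() { return "done"; }     // placeholder
    }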
As far as I know, there is no way to do such a thing; it depends on the speed of your hardware. But what you can do is use an onTouch listener instead. That way you react to a single action (when the screen is pressed), whereas onClick registers two actions (when you press the button and when you release it), so you may be able to respond faster; see the sketch after the link below.
You can also try this:
http://developer.android.com/guide/topics/graphics/hardware-accel.html
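A minimal sketch of the onTouch idea from the previous answer, reacting on ACTION_DOWN instead of waiting for the full press-and-release of a click; R.id.tile and onTilePressed() are hypothetical:

    import android.app.Activity;
    import android.os.Bundle;
    import android.view.MotionEvent;
    import android.view.View;

    public class GameActivity extends Activity {
        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            View tile = findViewById(R.id.tile); // hypothetical view id
            tile.setOnTouchListener((v, event) -> {
                if (event.getActionMasked() == MotionEvent.ACTION_DOWN) {
                    onTilePressed(v); // fires on press, not on release
                    return true;      // consumed; no separate onClick needed
                }
                return false;
            });
        }

        private void onTilePressed(View v) { /* game logic */ }
    }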

Android touch data logs

For the research I am doing, I need some way to track user touches while people are using the phone on a normal daily basis. The users will be fully aware of what is being recorded. Any method to do this would be great.
What have I tried so far?
Method 1.
Create a service with a transparent overlay view.
Problem: Due to the obvious security flaws, this has been prevented since ICS. Input touches on the transparent view are not passed through to the background, so the user cannot interact with the phone normally. I tried various approaches, defining the overlay view as type phone or type system alert, and switching between them during program execution.
Method 2.
A view covering about 1% of the screen, created with the watch-touch-outside model.
Problem: Same as the previous method. The outside-touch callback only reports that a touch event happened outside the view, without even the initial x, y coordinates.
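For concreteness, a minimal sketch of Method 2, assuming a service holding the SYSTEM_ALERT_WINDOW permission; it shows exactly the limitation described, since ACTION_OUTSIDE arrives without the real touch position:

    import android.app.Service;
    import android.content.Context;
    import android.graphics.PixelFormat;
    import android.util.Log;
    import android.view.Gravity;
    import android.view.MotionEvent;
    import android.view.View;
    import android.view.WindowManager;

    // Sketch: a tiny overlay that asks to be notified of touches outside itself.
    public final class TouchProbeOverlay {

        public static void add(Service service) {
            WindowManager wm =
                    (WindowManager) service.getSystemService(Context.WINDOW_SERVICE);

            View probe = new View(service);
            probe.setOnTouchListener((v, event) -> {
                if (event.getActionMasked() == MotionEvent.ACTION_OUTSIDE) {
                    // The reported coordinates are not the real touch position.
                    Log.d("TouchProbe", "outside touch at x=" + event.getX()
                            + ", y=" + event.getY());
                }
                return false;
            });

            WindowManager.LayoutParams lp = new WindowManager.LayoutParams(
                    10, 10, // tiny window, roughly 1% of the screen
                    WindowManager.LayoutParams.TYPE_SYSTEM_ALERT,
                    WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE
                            | WindowManager.LayoutParams.FLAG_WATCH_OUTSIDE_TOUCH,
                    PixelFormat.TRANSLUCENT);
            lp.gravity = Gravity.TOP | Gravity.START;
            wm.addView(probe, lp);
        }
    }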
There are other methods I tried but those are highlighted. Currently, I am thinking about other options:
Option 1 - The pointer location option in developer options: In Settings there is a pointer location option that I can utilize. When it is on, all the info about touches is shown in a panel at the top of the screen. If I could access that data afterwards, that would be fine too, despite the fact that there will be drawings on the screen while the user is using the phone. I traced the ICS source code and found the core class that makes that tracking possible, and I recreated that class as a service. Logcat shows all the touch info when I run it from my app. The only problem is the same as problem 1: I cannot track touches outside the current app. So, if the system logs the tracking info while the pointer option is turned on, how will I be able to get that information later to use?
This option seems the easiest.
Option 2 - Android NDK: If the above method is not possible, can this be done using the NDK? A pointer in the right direction for this route would also be great.
Option 3 - Custom ROM: Do I really need to go for a custom ROM to do this? I am pretty sure that is a 100% sure way to do it, but it seems very impractical for this particular research.
I would appreciate any suggestion to the path that I can follow.
Thank you all in advance.
You can use rooted phones and RepetiTouch for your research. https://play.google.com/store/apps/details?id=com.cygery.repetitouch.free
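If root is acceptable (RepetiTouch requires it anyway), another option worth sketching is dumping the kernel input event stream with the platform getevent tool; the -l (label events) and -t (timestamps) flags are standard, though the output format still has to be parsed per device:

    import java.io.BufferedReader;
    import java.io.InputStreamReader;

    // Sketch: log raw touch events from all input devices on a rooted phone.
    public final class RawTouchLogger {

        public static void dump() throws Exception {
            Process p = Runtime.getRuntime()
                    .exec(new String[] {"su", "-c", "getevent -lt"});
            try (BufferedReader in = new BufferedReader(
                    new InputStreamReader(p.getInputStream()))) {
                String line;
                while ((line = in.readLine()) != null) {
                    System.out.println(line); // one raw event line per sample
                }
            }
        }
    }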

Actionscript 3 for Android - should I attach mouse event listeners to the stage or to individual sprites?

In AS3 on Android is it bad from a performance perspective to attach mouse event listeners to individual sprites rather than to the stage?
I am writing an app for an Android phone using AS3 in Flash Builder. The app has multiple screens that respond to user touch. The screens are arranged in a hierarchy and show list data so that when you click on an item in a list you are presented with a new screen with a new sub list on it.
I have been using an event listener to detect mouse/touch input, and based on something I read indicating that performance is much better if you keep the number of objects you are listening to to a minimum, I have attached the mouse listeners from each screen to the stage object.
This all works fine, but I am finding that as I move between screens (and they get popped off or pushed onto the display stack) I have to keep track of a lot of adding and removing of listeners on the stage object. If I don't, windows higher up the hierarchy than the current screen keep receiving mouse events.
If I instead used listeners attached to sprites in each window, then when a window was removed from the display, even though it is kept in memory (ready to be popped back when a child window is closed), it wouldn't receive any mouse events...
Performance doesn't seem to be impacted by using listeners directly on sprites on the HTC phone I test with, but I obviously don't know what it will be like on other phones. Does anyone have experience either way, or a view on the best approach?
I would recommend using listeners on specific sprites, as it will be easier to code and maintain; coordinate conversion can get cumbersome to manage with different screen/sprite sizes, and removing/adding stage listeners will not be easy to maintain as you add more screens.
As for performance, I don't believe the listeners will have any impact; a listener is just a function that gets called when the sprite is clicked. If you don't set a listener, the click is registered anyway and sent down to lower-level display objects until a listener for the event is eventually found, or the event is dropped.

Is it possible for Android to return the number of samples per some X time from the touch screen?

Basically, I am trying to write a benchmark application to test the responsiveness of different Android devices' touchscreens. I figured the best way to go about this is to write an application that counts the number of samples returned from the touchscreen whenever it is touched. Unfortunately, I have no idea whether this is possible, and so far I haven't found anything relevant to the idea.
I will most likely use the MotionEvent class to determine when the touch screen is pressed, but what classes are available for determining the samples returned?
This is how I imagine my app to function:
Start app
Brief description screen, then button to begin testing
User touches an area of the screen
I haven't really determined how to do the output yet; either part of the screen updates a real-time graph, or the user touches the screen for some time and, after releasing his finger, the app outputs a new graph in a different activity.
Any help would be useful.
Thanks!
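In case a starting point helps, a minimal sketch of the counting idea in a custom view. Android may batch several hardware samples into a single ACTION_MOVE, exposing the batched ones via getHistorySize(), so both the current and the historical samples should be counted:

    import android.content.Context;
    import android.util.Log;
    import android.view.MotionEvent;
    import android.view.View;

    // Sketch: count touchscreen samples per second while a finger is down.
    public class SampleCounterView extends View {

        private int samples;
        private long windowStart; // start of the current one-second window (ms)

        public SampleCounterView(Context context) {
            super(context);
        }

        @Override
        public boolean onTouchEvent(MotionEvent event) {
            long now = event.getEventTime();
            switch (event.getActionMasked()) {
                case MotionEvent.ACTION_DOWN:
                    samples = 0;
                    windowStart = now;
                    break;
                case MotionEvent.ACTION_MOVE:
                    samples += 1 + event.getHistorySize(); // current + batched samples
                    if (now - windowStart >= 1000) {
                        Log.d("TouchRate", samples + " samples/s");
                        samples = 0;
                        windowStart = now;
                    }
                    break;
            }
            return true; // keep receiving the rest of the gesture
        }
    }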
