For the research I am doing, I need some way to track user touches while people use the phone on a general daily basis. The users will be fully aware that their touches are being recorded. Any method to do this would be great.
What have I tried so far?
Method 1.
Create a service with a transparent overlay view.
Problem: For obvious security reasons, this is prevented starting with ICS. Touches on the transparent view are not passed through to the window behind it, so the user cannot interact with the phone normally. I tried various approaches, defining the overlay view as TYPE_PHONE or TYPE_SYSTEM_ALERT and switching between the two during program execution.
Method 2.
A view covering about 1% of the screen, created with the watch-outside-touch flag (FLAG_WATCH_OUTSIDE_TOUCH).
Problem: Same as the previous one. The outside-touch event only reports that a touch happened outside the view, without even the initial x, y coordinates.
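For reference, a minimal sketch of the Method 2 setup, assuming a service that adds a tiny overlay window with FLAG_WATCH_OUTSIDE_TOUCH (class and variable names are illustrative):

```java
import android.app.Service;
import android.content.Intent;
import android.graphics.PixelFormat;
import android.os.IBinder;
import android.view.Gravity;
import android.view.MotionEvent;
import android.view.View;
import android.view.WindowManager;

// Sketch of Method 2: a tiny overlay that watches for touches outside itself.
public class TouchWatchService extends Service {
    @Override
    public void onCreate() {
        super.onCreate();
        WindowManager wm = (WindowManager) getSystemService(WINDOW_SERVICE);
        WindowManager.LayoutParams params = new WindowManager.LayoutParams(
                1, 1,                                   // 1x1 px window
                WindowManager.LayoutParams.TYPE_SYSTEM_ALERT,
                WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE
                        | WindowManager.LayoutParams.FLAG_NOT_TOUCH_MODAL
                        | WindowManager.LayoutParams.FLAG_WATCH_OUTSIDE_TOUCH,
                PixelFormat.TRANSLUCENT);
        params.gravity = Gravity.TOP | Gravity.LEFT;

        View v = new View(this);
        v.setOnTouchListener((view, event) -> {
            if (event.getAction() == MotionEvent.ACTION_OUTSIDE) {
                // This is the problem described above: for ACTION_OUTSIDE the
                // framework strips the location, so getX()/getY() do not give
                // the real position of the outside touch.
            }
            return false;
        });
        wm.addView(v, params);
    }

    @Override
    public IBinder onBind(Intent intent) { return null; }
}
```

This also needs the SYSTEM_ALERT_WINDOW permission in the manifest.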
There are other methods I tried, but those above are the main ones. Currently, I am considering other options:
Option 1 - The pointer location option in developer options: In Settings there is a "Pointer location" option that I could utilize. When it is on, all the touch info is shown in a panel at the top of the screen. If I could get access to that data afterwards, that would be fine too, despite the fact that there will be drawings on the screen while the user is using the phone. I traced the ICS source code, found the core class that makes that tracking possible, and recreated it inside a service. Logcat shows all the touch info when I run it from my app. The only problem is the same as in Method 1: I cannot track touches outside the current app. So, if the system logs the tracking info while the pointer option is turned on, how will I be able to retrieve that information later for use?
This option seems the easiest.
Option 2 - Android NDK: If the above method is not possible, is it possible to do this using the NDK? A pointer in the right direction for this route would also be great.
Option 3 - Custom ROM: Do I really need to build a custom ROM for this? I am pretty sure that is a guaranteed way to do it, but it seems very impractical for this particular research.
I would appreciate any suggestion to the path that I can follow.
Thank you all in advance.
You can use rooted phones and RepetiTouch for your research. https://play.google.com/store/apps/details?id=com.cygery.repetitouch.free
Related
I'm performing a little experiment that requires two activities running side by side to interact. The interaction is very simple: pressing a button in one activity should result in a touch event being triggered anywhere in the other.
I'm not even sure this is possible, it might not be because of security reasons, but I thought I would ask here anyways.
Is there a way to perform a touch event at position X,Y relative to the physical screen (not relative to my current activity)?
Again, I'm doing this just for fun.
I found this blog post that explains three possible methods to perform a click on an external activity:
http://www.pocketmagic.net/injecting-events-programatically-on-android/
They require root, but for my use this is not an issue. If anyone finds something simpler or that doesn't require root, I will mark that answer as correct.
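Since root is acceptable here, another option is to shell out to the stock `input` tool (its `tap` subcommand exists on newer Android versions) instead of writing to the input devices directly. A small sketch, with illustrative helper names:

```java
import java.io.IOException;

// Sketch: injecting a tap on a rooted device by shelling out to the
// stock "input" command-line tool. Helper names are illustrative.
public class TapInjector {

    // Build the shell command for a tap at absolute screen coordinates (x, y).
    public static String tapCommand(int x, int y) {
        return "input tap " + x + " " + y;
    }

    // Run the command as root; requires a su binary on the device.
    public static void injectTap(int x, int y) throws IOException, InterruptedException {
        Process p = Runtime.getRuntime().exec(new String[]{"su", "-c", tapCommand(x, y)});
        p.waitFor();
    }

    public static void main(String[] args) {
        System.out.println(tapCommand(300, 500)); // -> input tap 300 500
    }
}
```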
Is it possible to determine if a device (non-rooted) is in use at the moment, even if my app is not in the foreground? Precisely, "in use" means the user made touch events in the last 5 seconds or the display is on.
If so, what specific permissions are required?
Thanks
AFAIK, the Android security model will not allow you to record touches if your app is not in the foreground.
There are some crude workarounds like overlaying a transparent screen to record touches. Not sure if these work now though.
"in use" means the user made touch events in the last 5 seconds
In Android, that's not practical, short of writing your own custom ROM.
or display is on
In Android, you can find out if the device is in an "interactive" mode or not. This does not strictly align with screen-on/screen-off, as the whole notion of screen-on/screen-off has pretty much fallen by the wayside.
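A minimal sketch of that interactive-mode check, assuming a helper with a Context in hand (PowerManager.isInteractive() exists from API 20; isScreenOn() is the older, now-deprecated equivalent):

```java
import android.content.Context;
import android.os.Build;
import android.os.PowerManager;

// Sketch: checking whether the device is currently in "interactive" mode.
public class InteractiveCheck {
    public static boolean isInteractive(Context context) {
        PowerManager pm = (PowerManager) context.getSystemService(Context.POWER_SERVICE);
        if (Build.VERSION.SDK_INT >= 20) {
            return pm.isInteractive();   // API 20+
        } else {
            return pm.isScreenOn();      // deprecated, pre-20 equivalent
        }
    }
}
```

No special permission is needed for this check.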
I have been working on a game where speed is required to score well. The user clicks on objects and uses them with other objects held in a GridView that is controlled by an ImageAdapter. I have noticed that when I click quite fast, some clicks don't register. I cleaned up the code that runs when the user clicks, but that doesn't seem to be the problem: when I tested with only highlighting objects and running no other code on click, it was just as slow. So is there a way to speed up click detection, or is this limited by the speed of the device itself? I have been testing on an HTC One M8.
Return from the click handler as soon as possible, do the heavy work in the background, and post UI updates back with 'runOnUiThread()'.
Note that changing view state MUST be done on the UI thread, or the Android runtime will throw an exception. You can run complex calculations in a background 'AsyncTask' and call 'runOnUiThread()' from within it whenever you need to update UI components.
As far as I know, there is no way to do such a thing; it depends on the speed of your hardware. What you can do is use an onTouch listener instead. That way you react to a single action (when the screen is pressed), whereas onClick needs two actions to register (when you press the button and when you release it). This may make it feel faster.
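A sketch of that suggestion, assuming a plain listener class attached with setOnTouchListener() (names are illustrative):

```java
import android.view.MotionEvent;
import android.view.View;

// Sketch: react on the DOWN event instead of waiting for the UP event
// that completes a click. Attach with someView.setOnTouchListener(...).
public class FastTouchListener implements View.OnTouchListener {
    @Override
    public boolean onTouch(View v, MotionEvent event) {
        if (event.getActionMasked() == MotionEvent.ACTION_DOWN) {
            // Handle the "click" immediately on finger-down.
            v.setPressed(true);
            return true; // consume the gesture; onClick will not fire
        }
        if (event.getActionMasked() == MotionEvent.ACTION_UP) {
            v.setPressed(false);
            return true;
        }
        return false;
    }
}
```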
You can also try this:
http://developer.android.com/guide/topics/graphics/hardware-accel.html
I don't have much experience in Android development.
I'm considering a big project and before getting deeply into it, I want to check whether my requirements are even possible:
My goal is to manipulate the system by changing the coordinates of a user's touch on the touchscreen. For example: If a user is touching the screen on point (X,Y), I want any opened application to act like the user touched (X+5,Y-3).
I have thought on a few levels that this may be possible to be defined in:
Touch-screen's driver level, OS level, application level (i.e. background application).
A big advantage would be to build it in a way that allows as much compatibility as possible.
What is the best/right way to do it?
I'm not looking for a full solution, only a hint regarding the best direction to start digging...
Thanks in advance.
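For what it's worth, the remapping itself is a trivial pure function at any of those levels; the hard part is intercepting and re-dispatching the event stream system-wide (at the application level that is only possible inside your own windows, e.g. in dispatchTouchEvent()). A sketch of the transform, using the (X+5, Y-3) offset from the example (class name is illustrative):

```java
// Sketch: the coordinate transform itself, independent of where it is hooked.
public class TouchRemapper {
    private final float dx;
    private final float dy;

    public TouchRemapper(float dx, float dy) {
        this.dx = dx;
        this.dy = dy;
    }

    // Map a raw touch position to the position the system should see.
    public float[] remap(float x, float y) {
        return new float[]{x + dx, y + dy};
    }
}
```

Inside your own app you could apply such an offset in Activity.dispatchTouchEvent() with MotionEvent.offsetLocation(dx, dy); affecting all other apps would have to happen in the driver or a custom ROM's input pipeline.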
Basically, I am trying to write a benchmark application to test the responsiveness of different Android devices' touchscreens. I figured the best way to go about this is to write an application that counts the number of samples returned from the touchscreen whenever it is touched. Unfortunately, I have no idea if this is possible, and so far I haven't found anything relevant to the idea.
I will most likely be using the MotionEvent class to determine when the touch screen is pressed, but what classes are available to determine the samples returned?
This is how I imagine my app to function:
Start app
Brief description screen, then button to begin testing
User touches an area of the screen
I haven't really determined how to do the output yet: either part of the screen updates a real-time graph, or the user touches for some time and, after lifting his finger from the screen, the app outputs a new graph in a different activity.
Any help would be useful.
Thanks!
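On the sampling question: each MotionEvent delivered to onTouchEvent() can carry several batched samples, exposed via getHistorySize() and getHistoricalEventTime(). Counting total samples per second over a gesture is then plain arithmetic. A sketch of the counting logic, written as pure Java (the MotionEvent values are passed in so it can be unit-tested off-device; the class name is illustrative):

```java
// Sketch: accumulate touchscreen samples and compute the sample rate.
// For each MotionEvent, feed in the batched-sample count
// (event.getHistorySize() + 1) and the event time in milliseconds.
public class SampleRateMeter {
    private long firstEventMs = -1;
    private long lastEventMs = -1;
    private long totalSamples = 0;

    public void onEvent(int samplesInEvent, long eventTimeMs) {
        if (firstEventMs < 0) firstEventMs = eventTimeMs;
        lastEventMs = eventTimeMs;
        totalSamples += samplesInEvent;
    }

    // Samples per second over the whole gesture; 0 if the span is too short.
    public double samplesPerSecond() {
        long spanMs = lastEventMs - firstEventMs;
        if (spanMs <= 0) return 0;
        return totalSamples * 1000.0 / spanMs;
    }
}
```

Feeding it three events of 1, 3, and 3 samples spanning 40 ms would report 175 samples/second, which is the kind of number the graph could plot.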