I'm writing a library for Android apps that will report all crashes to a server. I need to report the last user clicks or interactions with the app before the crash. So the question is: is there any way to track all clicks across a whole Android application? In my library I only have the Application instance.
I am not sure it's possible to catch all UI clicks, but you can make developers add code to specific actions that will track this: for example, provide your own implementations of OnClickListener and OnTouchListener that developers extend and use for their click and touch handling.
Other than that, I don't think it's possible, apart from adding a container view as the root view that intercepts all touch events.
For instance, you can provide extended basic components such as FrameLayout, RelativeLayout, etc. that override onInterceptTouchEvent and track the events there. Developers would then use your layouts as the root views for each of their screens.
A global touch handler that you simply plug into the Application class is not possible, I believe. Java does let you set a default uncaught-exception handler, which is what ACRA and the others use for this purpose.
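Whichever wrapper-listener approach you pick, the library still needs somewhere to keep the most recent interactions so the crash handler can attach them to the report. Here is a minimal sketch of such a breadcrumb buffer in plain Java — the class and method names are made up for illustration; a real implementation would be fed from the wrapped listeners and read by the uncaught-exception handler:

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.Deque;
import java.util.List;

// Keeps the last N user interactions so a crash handler can attach
// them to the report. Synchronized because listeners and the crash
// handler may run on different threads.
public class InteractionLog {
    private final int capacity;
    private final Deque<String> events = new ArrayDeque<>();

    public InteractionLog(int capacity) {
        this.capacity = capacity;
    }

    // Called by the wrapped click/touch listeners.
    public synchronized void record(String event) {
        if (events.size() == capacity) {
            events.removeFirst(); // drop the oldest breadcrumb
        }
        events.addLast(event);
    }

    // Called by the uncaught-exception handler when building the report.
    public synchronized List<String> snapshot() {
        return new ArrayList<>(events);
    }

    public static void main(String[] args) {
        InteractionLog log = new InteractionLog(3);
        log.record("click:loginButton");
        log.record("touch:mapView");
        log.record("click:sendButton");
        log.record("click:retryButton"); // evicts the oldest entry
        // prints [touch:mapView, click:sendButton, click:retryButton]
        System.out.println(log.snapshot());
    }
}
```

The fixed capacity keeps memory bounded no matter how long the session runs, which matters for a library that is always on.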
How can I use multiple ActivityViews correctly in Android 11?
I already created a working sample on P and Q. But on R, touch stopped working. The activities are displayed correctly, but only the topmost view reacts to touch; all other activities do not respond. If I reorder the views at runtime, the new topmost activity becomes interactable, and the other views again stop responding.
After a lot of testing I noticed that the ActivityViews which are not interactable do receive touch events if I add a corresponding listener inside the hosting app. Strangely, the view that works correctly does not get any events. I assume the touch is redirected through the virtual display to the hosted activity directly, but I have no proof.
How can I make all hosted activities interactable? Since I have system access, I am fine with using hidden APIs. Unfortunately I cannot find which change in R causes the issue, so even hints on where to search would be helpful, because I am running out of ideas.
I want to know if it is possible to generate automatic touches at regular intervals of time, say every 5 seconds, in another application.
For example, I want to develop an application that creates a touch response at a particular coordinate at a regular fixed interval, just as if the user had touched the screen.
Please help.
It's possible to create and dispatch touch events within your own application, and it's pretty easy. Look at the View.dispatchTouchEvent method and its parameters. Call it on the root view group so the event is passed down to all views in the activity.
It's not possible to inject events into other applications, though, for security reasons.
Edit: dispatchTouchEvent is public, so there is no need to override it.
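The "every 5 seconds" part is plain Java scheduling; only the dispatch itself is Android-specific. A sketch using ScheduledExecutorService — the injectTouch callback here is a placeholder for whatever builds the MotionEvent (via MotionEvent.obtain) and hands it to dispatchTouchEvent on the UI thread:

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.ScheduledFuture;
import java.util.concurrent.TimeUnit;

// Fires a callback at a fixed interval; in an app the callback would
// obtain a MotionEvent and dispatch it to the root view.
public class TouchScheduler {
    private final ScheduledExecutorService executor =
            Executors.newSingleThreadScheduledExecutor();

    // Runs injectTouch immediately and then once per period.
    public ScheduledFuture<?> start(Runnable injectTouch, long periodMillis) {
        return executor.scheduleAtFixedRate(
                injectTouch, 0, periodMillis, TimeUnit.MILLISECONDS);
    }

    public void stop() {
        executor.shutdownNow();
    }
}
```

On Android the actual dispatch must be posted back to the main thread (for example with a Handler), since views may only be touched from the UI thread.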
I know that it is easy to measure touch pressure using the getPressure() function provided by the Android API within an application's activity.
However, I would like to measure all motion events, not only those within the application's activity but also those outside it.
How can I achieve this?
You can't, thankfully.
For security reasons, Android does not allow you to intercept touch events on anything outside your own app's activities. If this were allowed, a malicious app could take over and render the device useless by simply consuming all touch events.
There are some parts of the framework that are not quite clear to me yet. I am well acquainted with the flow of an input event (Kernel -> EventHub -> InputReader -> InputDispatcher -> ...).
Situation
(Requirement: handle input keys without changing the Android framework.)
I want to handle key events coming from a device (keyboard/gamepad/controller/...), but there are some requirements. For one, I don't want to change the Android framework. This means I don't want to extend WindowManagerPolicy and its functions, like interceptKeyBeforeDispatching, where the Home key is handled. As a result, the key event will be dispatched into the application layer, which is fine. The downside is another tricky requirement:
Example: while I am playing Angry Birds and I press the GoToAlpha button on my connected input device, the Alpha application has to start. Angry Birds has no idea what the GoToAlpha button is, will not handle or recognize it, and will, for example, broadcast no intent to start my Alpha application.
Question
Is there a way to handle my (custom) key event after it has been dispatched, knowing that the application in the foreground cannot handle the key?
My (failed) solutions
Create a service that handles the key events. This is not possible because an application like Angry Birds will not bind to my service, so the key event will never reach it. If I am wrong, please provide more information. :)
Create an external library and let my applications' activities inherit from my own ActivityBase. All the key events and their default behavior can be handled there. Downside: existing applications will not support my custom key events because they don't use the library.
Extending the framework would be, in my eyes, the cleanest solution, but it would mean failing my first requirement.
Any help or useful information would be appreciated.
Extra
If the first question can be solved one way or another, I also want to customize the Intent behind the GoToAlpha button. This means that by default the Alpha application will be started, but after the user has customized it, the Beta application will be started from then on. Any thoughts?
Thanks
Thanks for the comment, Victor.
Using InputMethodService will not give me enough freedom and functionality to solve my problem.
My Solution / Compromise
Within the Android framework there is a PhoneWindowManager, which is responsible for handling InputEvents. The WindowManagerService, which is started by the SystemServer, owns this manager and creates an instance of it.
By creating my own custom window manager that inherits from Android's PhoneWindowManager, I don't lose any default functionality and can add my own implementation in this class. This amounts to adding one new file to the framework and changing only a single line of the existing Android framework: the WindowManagerService creates a CustomPhoneWindowManager (extends PhoneWindowManager) instead of a PhoneWindowManager.
If anyone sees a better solution or has specific thoughts about my compromise, don't hesitate to comment. :)
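The real PhoneWindowManager lives behind hidden framework APIs, but the pattern itself — inheriting all default behavior and overriding a single interception hook — can be sketched in plain Java. All class and method names below are simplified stand-ins, not the real framework signatures:

```java
// Simplified stand-in for the framework's window-manager policy.
class BasePolicy {
    // Returns true if the key was consumed before reaching the app.
    public boolean interceptKeyBeforeDispatching(int keyCode) {
        return false; // default: let every key through to the application
    }
}

// The one class the compromise adds: inherits all default behavior
// and only claims the custom key.
class CustomPolicy extends BasePolicy {
    static final int KEY_GO_TO_ALPHA = 1001; // made-up custom key code

    @Override
    public boolean interceptKeyBeforeDispatching(int keyCode) {
        if (keyCode == KEY_GO_TO_ALPHA) {
            // In the real framework, this is where the launch intent
            // for the Alpha application would be fired.
            return true; // consume the key; the foreground app never sees it
        }
        // Everything else keeps the stock behavior.
        return super.interceptKeyBeforeDispatching(keyCode);
    }
}
```

Because only the custom key is claimed, every other key still follows the unmodified dispatch path, which is what keeps the change to a single line in the framework.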
I doubt that it's possible with public APIs (Boy and Martijn pointed out the security concerns).
Most likely your best bets (if you don't want to customize Android) would be:
a) Try to use InputMethodService (http://developer.android.com/reference/android/inputmethodservice/InputMethodService.html).
It doesn't give the kind of control you wish for, but it could be enough for some needs.
b) Try to go through the whole stack (from kernel to application) and find some vulnerability to use.
This will definitely take a lot of time and isn't guaranteed to bear any fruit.
I need to make a service that captures all touch events, not just in a specific view or while a specific activity is open. The service is started when boot is completed (and I have an app to stop/start it whenever I want). It should then display something (in a Toast) whenever the user touches anywhere on the screen.
Can I do this? Or is it only possible in specific cases (with an OnTouchListener on specific views, for example)?
Sorry for my bad English.
Thanks.
Only if you override View itself and build your own ROM would you be able to do this kind of thing. So the straight answer is simply no. Sorry.