Android Live Wallpaper Touch Event Hierarchy

I have a Live Wallpaper that uses various screen touches to trigger certain events. At the moment it is working, but I seem to get all touch events. For example, when the user touches an icon to launch an app, I seem to get that event too.
Is it possible to determine whether the user has touched the background screen (i.e. the inter-icon gaps), so that I can take my actions only at that time and ignore the others?
Failing that (and assuming, possibly erroneously, that if I am first in the queue then there is no other application on top of me on the screen), can I determine where I am in the touch event queue so that I only take action when I am the first one in the queue?
Or any other suggestions please.
Thanks
Richard

Ran into the same problem and ended up looking at the source for the Nexus wallpaper to see how it's implemented there.
I don't think it's possible to determine whether the actual home screen has consumed the touch. However, the default Android home screen does send a command to the wallpaper when you tap on empty space. So in your Engine, you can write:
@Override
public Bundle onCommand(String action, int x, int y, int z, Bundle extras, boolean resultRequested) {
    if (action.equals(WallpaperManager.COMMAND_TAP)) {
        // do whatever you would have done on ACTION_UP
    }
    return null;
}

Related

How do I prevent auto-click cheats in an Android app?

I built an Android app where the user uses a stopwatch and tries to stop it on exactly 1 second. There is also a second game mode where the user tries to start and stop the stopwatch as fast as possible and get the lowest time. There are several auto-clicker apps you can install that will start and stop the stopwatch exactly 1 second apart, and also double-click the screen within milliseconds.
My question is what is the best way to prevent cheating by this method? Are there libraries designed for this? Thanks for any help!
This question has been answered here.
Basically, use an API to check for clicks using accessibility services, but you can't outright block artificial clicks because people who use accessibility features need them. Instead you can measure the time between clicks or the frequency of clicks and block the ones that are 'too perfect'.
E.g. if someone gets exactly 1 second between clicks 10 times in a row, they're probably cheating. If someone clicks however many times per second at exactly the same timing of 0.X ms between clicks, they're definitely cheating.
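For example, a minimal sketch of such a timing check (the class name ClickTimingChecker and the threshold values are illustrative assumptions, not from the linked answer):
import android.os.SystemClock;

// Tracks intervals between clicks and flags streaks of near-identical timing as "too perfect".
public class ClickTimingChecker {
    private static final long TOLERANCE_MS = 2;   // intervals this close together look machine-generated
    private static final int SUSPICIOUS_STREAK = 10;

    private long lastClickTime = 0;
    private long lastInterval = -1;
    private int streak = 0;

    // Call from the stopwatch button's click handler; returns true if the pattern looks automated.
    public boolean looksAutomated() {
        long now = SystemClock.elapsedRealtime();
        if (lastClickTime != 0) {
            long interval = now - lastClickTime;
            if (lastInterval >= 0 && Math.abs(interval - lastInterval) <= TOLERANCE_MS) {
                streak++;
            } else {
                streak = 0;
            }
            lastInterval = interval;
        }
        lastClickTime = now;
        return streak >= SUSPICIOUS_STREAK;
    }
}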
I was focusing on the tap or touch event so that I could differentiate between human behaviour and non-human behaviour. Here's what I have done:
@Override
public boolean dispatchTouchEvent(MotionEvent event) {
    if (event.getToolType(0) == MotionEvent.TOOL_TYPE_UNKNOWN) {
        return false;
    } else {
        return super.dispatchTouchEvent(event);
    }
}

How do I prevent device rotation from generating duplicate output?

After the user inputs parameters in MainActivity for my app (shown below), he taps Search, which calls MatchesActivity, which generates output on a new screen (shown further below) that is exited by tapping back.
But with MatchesActivity active, every time the device is rotated, Search is executed again because the Activity restarts. In the screenshot below, I rotated the device from vertical to horizontal to vertical to horizontal and back to vertical.
It looks silly.
The output is generated in MatchesActivity, which is invoked in onCreate in MainActivity like so:
Intent matchesIntent;
matchesIntent = new Intent(MainActivity.this, MatchesActivity.class);
startActivity(matchesIntent);
Here's the essence of onCreate for MatchesActivity:
@Override
protected void onCreate(Bundle savedInstanceState) {
    MainActivity.dbc.setDbProcesslistener(this); // to know txaMatches has been defined
    MainActivity.dbc.findDBMatches();            // generate output
}
I did research. I found some complicated ways of preventing an activity from restarting when the device is rotated. For example .
I'm hoping for a simpler solution. Any ideas?
As you have found, one option is to prevent the activity from being recreated on configuration changes altogether. This is not always the best option, as it will also prevent other things that depend on the configuration from being recreated/reloaded (e.g. resources overridden with the "-land" qualifier).
Another option is to cache the result of the DB search somehow. This could be done by adding a wrapper around your database that memoizes the term and results of the last search. Another way to cache the results would be to use a fragment and reuse that fragment across activity recreations. Whether a fragment is recreated along with its activity is controlled by this method:
http://developer.android.com/reference/android/app/Fragment.html#setRetainInstance(boolean).
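As an illustration of the fragment approach, here is a minimal sketch; the class name MatchesCacheFragment, the tag "matches_cache" and the List<String> payload are assumptions for illustration, not from the question:
import android.app.Fragment;
import android.os.Bundle;
import java.util.List;

// A retained fragment that survives activity recreation and can hold the last search results.
public class MatchesCacheFragment extends Fragment {
    private List<String> cachedMatches;

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setRetainInstance(true); // keep this instance across configuration changes
    }

    public List<String> getCachedMatches() { return cachedMatches; }
    public void setCachedMatches(List<String> matches) { cachedMatches = matches; }
}
In MatchesActivity.onCreate() you would then look the fragment up by tag and only run the search when nothing is cached yet:
MatchesCacheFragment cache =
        (MatchesCacheFragment) getFragmentManager().findFragmentByTag("matches_cache");
if (cache == null) {
    cache = new MatchesCacheFragment();
    getFragmentManager().beginTransaction().add(cache, "matches_cache").commit();
    // first creation: run findDBMatches() and store the results via setCachedMatches()
} else {
    // recreated after rotation: display cache.getCachedMatches() instead of searching again
}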
My solution was simple.
Introduce a boolean variable outputIsShowing: set it to true at the end of onCreate in MatchesActivity, set it to false when onCreate or onResume executes in MainActivity (i.e., when MatchesActivity terminates), and return immediately from onCreate in MatchesActivity if outputIsShowing is true.
So if MatchesActivity is active when the device is rotated, outputIsShowing will be true, and the search won't be executed again.
It may not be best practice, but I've extensively tested it under normal conditions and am happy enough so far. Not sure if anything is lurking out there as a "gotcha".
I plan to go back and study the suggestions made so far since the more general situation is definitely important. And I will have to do so if someone finds fault with what I've done.
@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    // usual details prior to asking for matches
    if (outputIsShowing)
        return;
    MainActivity.dbc.setDbProcesslistener(this); // to know matches was defined
    MainActivity.dbc.findDBMatches();
    outputIsShowing = true;
}
EDIT:
Strangely, after embedding TextView txaMatches in a ScrollView to accomplish smooth, accelerated scrolling, I had to remove the references to outputIsShowing in order to see output after two device orientation changes.
And now, maybe I'll submit another question to address the fact that, very infrequently after the screensaver forces waking the device, the output does NOT show if that is where the focus was when the screensaver became active. Tapping 'back' to get to the user input screen and then immediately tapping Search restores everything to normal until, about 100 (give or take) screensaver instances later, the output is again missing.
Such a bug makes me think I ought to follow the advice above.
If I do, or when I figure out the problem, I'll edit this again.

How do Android applications receive input from touch screen?

I want to know how applications handle input from a touch screen. For example, if the user touches the coordinates x,y, how does an open (active in the foreground) application know that the widget (a button, for example) at coordinates x,y must be clicked now?
Also, can I control the way apps handle touch input using another app? I mean, I want to build an app that uses services to control how other apps handle their inputs. Of course, this needs my app to have permission to access other apps' settings, but my question is: is it possible?
I have searched for how apps handle touch input and found these results, which are useful but not relevant to my case:
http://developer.android.com/training/gestures/index.html
How does Android device handle touch screen input?
Also, I know that any input hardware is controlled by the HAL (Hardware Abstraction Layer) in Android, and every input device has its own driver. But how do apps handle the inputs coming from these devices?
Thank you.
There are several ways to handle touches in Android.
First, with Buttons you can set an onClick() method that will automatically be triggered when the button is touched.
Another option is to attach an OnTouchListener to a view.
In this example, a custom view class called "Example" implements View.OnTouchListener and registers itself as its own touch listener:
public class Example extends View implements View.OnTouchListener {

    public Example(Context context) {
        super(context);
        setOnTouchListener(this); // register this view as its own touch listener
    }

    @Override
    public boolean onTouch(View v, MotionEvent event) {
        // Do something when touched.
        return true; // return true to consume the event
    }
}
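For the first option mentioned above, a minimal sketch might look like this (the id myButton is an assumption for illustration):
Button button = (Button) findViewById(R.id.myButton);
button.setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View v) {
        // Do something when the button is clicked.
    }
});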

Background app to listen to Drag gestures

I need to register a broadcast receiver that will tell me about any kind of drag event throughout the system. My app will run in the background and perform a task if any kind of drag event happens, even when another app is running in the foreground. Is it possible? Any idea on how I can do it?
Update: Do not think I'm going to make a keylogger. My app will be visible but will run in the background. And all I want is simply to detect drag events (drag to the left, drag to the right, drag up and drag down).
I'll accept any answer if you can tell me how to display 4 buttons that are permanent, on top of any other apps, because this can also serve what I want.
Your app can run without a "normal" UI by running as a Service, as per your link, but I think the code you linked may be slightly out of date.
Remember that your service must run in the foreground - the code is supplied there in your link, but not explicitly called. Without running in the foreground, the system could well stop your app rather than running it in the background.
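For reference, a minimal sketch of a service that calls startForeground() explicitly; the class name OverlayService and the notification text are illustrative assumptions, and on Android 8.0 and later a notification channel would also be required:
import android.app.Notification;
import android.app.Service;
import android.content.Intent;
import android.os.IBinder;

public class OverlayService extends Service {
    @Override
    public void onCreate() {
        super.onCreate();
        Notification notification = new Notification.Builder(this)
                .setContentTitle("Overlay active")
                .setSmallIcon(android.R.drawable.ic_dialog_info)
                .build();
        startForeground(1, notification); // without this, the system may stop the service
    }

    @Override
    public IBinder onBind(Intent intent) {
        return null; // not a bound service
    }
}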
When I created a task switcher using such overlays, I found it was necessary to use a TYPE_SYSTEM_ALERT rather than TYPE_SYSTEM_OVERLAY.
Android 4.x - System Overlay - Cannot capture touch events
I declare my window parameters without FLAG_WATCH_OUTSIDE_TOUCH.
WindowManager.LayoutParams params = new WindowManager.LayoutParams(
        width, height, x, y,
        WindowManager.LayoutParams.TYPE_SYSTEM_ALERT,
        WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE
                | WindowManager.LayoutParams.FLAG_NOT_TOUCH_MODAL,
        PixelFormat.TRANSLUCENT);
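To actually detect the drag directions on such an overlay, one possibility (a sketch only, assuming "overlay" is the View you add with these params inside the service) is to forward its touches to a GestureDetector:
// Illustrative sketch: classify flings on the overlay as left/right/up/down drags.
final GestureDetector detector = new GestureDetector(this,
        new GestureDetector.SimpleOnGestureListener() {
            @Override
            public boolean onFling(MotionEvent e1, MotionEvent e2, float velocityX, float velocityY) {
                if (Math.abs(velocityX) > Math.abs(velocityY)) {
                    // horizontal drag: velocityX > 0 means right, otherwise left
                } else {
                    // vertical drag: velocityY > 0 means down, otherwise up
                }
                return true;
            }
        });

overlay.setOnTouchListener(new View.OnTouchListener() {
    @Override
    public boolean onTouch(View v, MotionEvent event) {
        return detector.onTouchEvent(event); // forward overlay touches to the gesture detector
    }
});

((WindowManager) getSystemService(WINDOW_SERVICE)).addView(overlay, params);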
Your service should also be sure to properly unregister the overlay view from the WindowManager when it ends. Without this, your app will leak memory.
public void onDestroy() {
    super.onDestroy();
    if (overlay != null) {
        ((WindowManager) getSystemService(WINDOW_SERVICE)).removeView(overlay);
        overlay = null;
    }
}
I see that this is done in OverlayView.destory() (note the incorrect spelling of that method name - it would be a good idea to use the correct name for that method).
So your real requirement is to be able to pass some input to your app whilst allowing the screen to be dedicated to outputting video.
Have you thought about the following:
detecting tilt or orientation of the device to control direction
having 4 NFC tags, and detecting which of those you are over to change direction (may not give you a quick enough response)
It is also possible to have actions that are selectable directly from notifications, so you could have one or more notifications that offer actions to control the direction, and simply let the notification API handle the job of appearing in front of the video.
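A sketch of such a notification (the action string and icons are illustrative assumptions; a BroadcastReceiver registered for that action would perform the actual direction change):
Intent leftIntent = new Intent("com.example.ACTION_LEFT"); // illustrative action name
PendingIntent leftPending = PendingIntent.getBroadcast(context, 0, leftIntent, 0);

Notification notification = new Notification.Builder(context)
        .setContentTitle("Direction controls")
        .setSmallIcon(android.R.drawable.ic_media_play)
        .addAction(android.R.drawable.ic_media_rew, "Left", leftPending)
        .setOngoing(true) // keep it available while the video is in front
        .build();

NotificationManager nm =
        (NotificationManager) context.getSystemService(Context.NOTIFICATION_SERVICE);
nm.notify(1, notification);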
Of course there are apps that manage to overlay their UI in front of other apps. Such an example is Thrutu, though you seem to be pushed for time, and getting such a solution working is not straightforward - see How do I implement a slide-out drawer on the Android's call screen?
I found my answer here. I need to start a service.
The StandOut library allows you to create floating applications (applications that go over other applications). I used this for a couple of Android applications I put on the market a while back. It makes creating the floating windows pretty simple. Check it out at:
http://forum.xda-developers.com/showthread.php?t=1688531

How to scan user activity in order to reset timer in Android

I'm facing one issue and I cannot find the solution.
My application has a PIN code, and I want to un-verify it after 2 minutes without any action from the user. For instance, the user verifies the PIN and then leaves the phone on the table (with the application running), and I want to un-verify the PIN after 2 minutes.
The question is how to detect user activity (meaning a touch anywhere on the screen or on buttons). Of course I can detect touches on active components (buttons etc.), but I also want to detect a touch anywhere on the screen in order to reset the timer.
Do you have any idea how to do it?
Thanks a lot!
You can override the activity's main layout's onTouchEvent to detect any touch gesture within the whole viewable area (which should fill the whole screen). Just remember to return super.onTouchEvent to ensure that the other active components can still consume the event when they are touched separately.
Override the Activity method onUserInteraction():
@Override
public void onUserInteraction() {
    // reset your timer here
    super.onUserInteraction();
}
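Building on that, a minimal sketch of the 2-minute timeout inside the activity (the names inactivityHandler, unverifyPin and pinVerified are illustrative assumptions) might look like:
private static final long TIMEOUT_MS = 2 * 60 * 1000; // 2 minutes

private final android.os.Handler inactivityHandler = new android.os.Handler();
private final Runnable unverifyPin = new Runnable() {
    @Override
    public void run() {
        // mark the PIN as no longer verified, e.g. pinVerified = false;
    }
};

@Override
public void onUserInteraction() {
    super.onUserInteraction();
    inactivityHandler.removeCallbacks(unverifyPin); // any touch or key press resets the timer
    inactivityHandler.postDelayed(unverifyPin, TIMEOUT_MS);
}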
