How do Android applications receive input from the touch screen?

I want to know how applications handle input from a touch screen. For example, if the user touches the coordinates x,y, how does an application that is open (active in the foreground) know that the widget (a button, for example) at the coordinates x,y must be clicked now?
Also, can I control the way other apps handle touch input by using another app? I mean, I want to build an app that uses services to control how other apps handle their input. Of course this would need my app to have permission to access the other apps' settings, but my question is: is it possible?
I have searched for how apps handle touch input and found these results, which are useful but not relevant to my case:
http://developer.android.com/training/gestures/index.html
How does Android device handle touch screen input?
Also, I know that any input hardware is controlled through the HAL (Hardware Abstraction Layer) in Android, and that every input device has its own driver. But how do apps handle the input coming from these devices?
Thank you.

There are several ways to handle touches in Android.
First, with Buttons you can set an onClick() handler that is automatically triggered when the button is touched.
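For example, a minimal sketch of attaching a click listener in an Activity (the id myButton is made up for illustration):
Button myButton = (Button) findViewById(R.id.myButton);
myButton.setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View v) {
        // React to the click here.
    }
});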
Another option is to attach an OnTouchListener to a view.
In this example, a custom view class called "Example" implements View.OnTouchListener and registers itself as its own touch listener:
public class Example extends View implements View.OnTouchListener {
    public Example(Context context) {
        super(context);
        setOnTouchListener(this); // This is how you set up an OnTouchListener.
    }

    @Override
    public boolean onTouch(View v, MotionEvent event) {
        // Do something when touched.
        return true; // return true to consume the event
    }
}
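If the view is declared in a layout XML file with the id "exampleid", an Activity can attach the listener from the outside instead; a rough sketch, with the anonymous listener shown for illustration:
Example exampleView = (Example) findViewById(R.id.exampleid);
exampleView.setOnTouchListener(new View.OnTouchListener() {
    @Override
    public boolean onTouch(View v, MotionEvent event) {
        // Do something when touched.
        return true;
    }
});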

Related

Is android.view.View touched now?

Sure, the onTouchEvent callback will let me know whenever the user interacts with the view, so toggling a flag according to the getActionMasked() state of the MotionEvent should supply this information. But I have found this approach to fall short of the desired result at times.
Is there a more straightforward way to simply ask whether a user is currently touching a view?
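For reference, a minimal sketch of the flag-toggling approach mentioned above, inside a custom view (the field name isTouched is illustrative):
private boolean isTouched = false;

@Override
public boolean onTouchEvent(MotionEvent event) {
    switch (event.getActionMasked()) {
        case MotionEvent.ACTION_DOWN:
            isTouched = true;
            break;
        case MotionEvent.ACTION_UP:
        case MotionEvent.ACTION_CANCEL:
            isTouched = false;
            break;
    }
    return super.onTouchEvent(event);
}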
Might be worth checking out the documentation for onUserInteraction().
Something like this would allow you to know how recently the user has interacted with the screen:
@Override
public void onUserInteraction() {
    // do something
}
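For instance, a small sketch of recording the time of the last interaction in an Activity (the field name lastInteractionTime is illustrative):
private long lastInteractionTime = 0L;

@Override
public void onUserInteraction() {
    super.onUserInteraction();
    // Record when the user last touched the screen or pressed a key.
    lastInteractionTime = System.currentTimeMillis();
}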

How to scan user activity in order to reset timer in Android

I'm facing an issue and I cannot find a solution.
My application has a PIN code and I want to un-verify it after 2 minutes without any action from the user. For instance, the user verifies the PIN and then leaves the phone on the table (with the application running), and I want to un-verify the PIN after 2 minutes.
The question is how to detect user activity (meaning a touch anywhere on the screen or on the buttons). Of course I can detect touches on active components (buttons, etc.), but I also want to detect a touch anywhere on the screen in order to reset the timer.
Do you have any idea how to do it?
Thanks a lot!
You can override the activity's main layout's onTouchEvent to detect any touch gesture within the whole viewable area (which should fill the whole screen). Just remember to return super.onTouchEvent() so that the other active components can still consume the event when they are touched separately.
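A minimal sketch of that idea with a custom root layout; the class name TouchAwareLayout and the resetPinTimer() helper are made up for illustration:
import android.content.Context;
import android.util.AttributeSet;
import android.view.MotionEvent;
import android.widget.FrameLayout;

public class TouchAwareLayout extends FrameLayout {

    public TouchAwareLayout(Context context, AttributeSet attrs) {
        super(context, attrs);
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        // Any touch that reaches the layout restarts the inactivity countdown.
        resetPinTimer();
        return super.onTouchEvent(event);
    }

    private void resetPinTimer() {
        // Restart the 2-minute countdown here.
    }
}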
Override the Activity method onUserInteraction():
@Override
public void onUserInteraction() {
    // reset your timer here
    super.onUserInteraction();
}
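A rough sketch of how the pieces could fit together with a Handler-based timeout; the class name, the lockPin() helper, and the exact timer wiring are illustrative, not taken from the answers above:
import android.app.Activity;
import android.os.Handler;

public class PinProtectedActivity extends Activity {

    private static final long PIN_TIMEOUT_MS = 2 * 60 * 1000; // 2 minutes

    private final Handler handler = new Handler();
    private final Runnable lockRunnable = new Runnable() {
        @Override
        public void run() {
            lockPin();
        }
    };

    @Override
    public void onUserInteraction() {
        super.onUserInteraction();
        // Any touch or key press restarts the countdown.
        handler.removeCallbacks(lockRunnable);
        handler.postDelayed(lockRunnable, PIN_TIMEOUT_MS);
    }

    private void lockPin() {
        // Mark the PIN as unverified again here.
    }
}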

Android: how to combine Activity.onTouchEvent() and View.setOnLongClickListener() capabilities

In Android, I'm trying to capture user touches as well as User long-touches. I have an Activity and in it I override the onTouchEvent() method to handle a variety of screen touches.
I'm trying to incorporate "Long Presses" into my repertoire of User Interface choices.
I can't find an Activity.onLongTouchEvent() for me to override.
My application also has a SurfaceView and I see that I can do this:
sv.setOnLongClickListener(new View.OnLongClickListener()
{
    @Override
    public boolean onLongClick(View v)
    {
        SendAToast();
        return false;
    }
});
When I implement that code, it works exactly like it should.
However, now my onTouchEvent() code is never called even when I don't touch the screen long enough for it to be a "Long Press".
Is anybody aware of a way to get these two bits of code to work together?
Edit:
After I posted this, a co-worker showed me the "OnGestureListener" interface. Here's an example:
http://www.ceveni.com/2009/08/android-gestures-detection-sample-code.html
I use the interface to capture long presses, and it even provides the screen coordinates to work with (which the OnLongClickListener does not). So, it seems to do the trick.
Why isn't this functionality part of the Activity by default? It sure seems like core functionality to me.
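A rough sketch of the GestureDetector approach inside the Activity; the wiring shown here is illustrative, and SendAToast() stands in for whatever the long press should trigger:
private GestureDetector gestureDetector;

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    gestureDetector = new GestureDetector(this, new GestureDetector.SimpleOnGestureListener() {
        @Override
        public void onLongPress(MotionEvent e) {
            // Long press detected; e.getX() / e.getY() give the touch coordinates.
            SendAToast();
        }
    });
}

@Override
public boolean onTouchEvent(MotionEvent event) {
    // Feed every event to the detector, then run the existing short-touch handling.
    gestureDetector.onTouchEvent(event);
    // ... existing onTouchEvent logic ...
    return super.onTouchEvent(event);
}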
I would move the touch handling into the view's onTouchEvent() instead of the Activity's.

How to get onTouchEvent, long click and context menu working together?

In our application we have a custom view (that extends ImageView) and in it we handle the touch events to record data. I wanted to add context menu functionality to this view and followed the guidelines in the official Android documents.
The onTouchEvent code works fine by itself. The context menu code also works fine. However, if I add them both, the context menu code stops working. I found out that with both pieces of code added, onCreateContextMenu is never called, therefore context menu is never displayed.
According to my interpretation of the Android documentation, returning false from onTouchEvent indicates that the event is not consumed, so it should be passed on for further processing. For some reason, that is not happening here. I would appreciate it if anybody could tell me what I am missing. BTW, the target is a Nexus One running a 2.3.4 ROM.
Here's the code for the onTouchEvent in the custom view:
public boolean onTouchEvent(MotionEvent event)
{
    switch (event.getAction())
    {
        case MotionEvent.ACTION_DOWN:
            // Add event coordinates to an arraylist
            break;
    }
    return false;
}
Thank you in advance for any help.
Elaborating on hackbod's answer: you should probably have return super.onTouchEvent(event); as the last statement of the method.
I guess that if you don't process the event and you don't invoke the default View behavior, then nothing will happen.
The point of the return value might be, for example, to invoke some ancestor's default behavior and let the derived class know whether the ancestor processed the event or not.
After doing some searching on Android Developers, in the topic about overriding an existing callback method for the View, it says:
This allows you to define the default behavior for each event inside your custom View and determine whether the event should be passed on to some other child View.
Hence the main idea behind the return value is to let Android know whether the event should be passed down to child Views or not.
HTH
Edit:
Regarding the "directions" you mention in your comment, generally speaking (i.e. not only on Android) the UI event handling process goes on something like this:
At some point your derived custom control receives the event. In your event hook implementation, it's up to you whether to involve your ancestor's behavior or not. That's all you got regarding the class inheritance direction.
Then, there's the other direction, the one related to the UI controls hierarchy. Your custom control might be contained in one larger control container, and your control might as well contain other inner controls (textboxes, buttons, ...). Regarding this direction, if you declare not to process the event (returning false) then the UI event handling mechanism will pass the bucket to the containing (?) control (think the one on the background of yours).
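Putting the advice above together, a hedged sketch of how the custom view's onTouchEvent could end so that long-click and context-menu detection still run (only the final return differs from the code in the question):
@Override
public boolean onTouchEvent(MotionEvent event) {
    switch (event.getAction()) {
        case MotionEvent.ACTION_DOWN:
            // Add event coordinates to an arraylist
            break;
    }
    // Let View's default handling run so long-click / context-menu detection still fires.
    return super.onTouchEvent(event);
}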
You could call, from your long click listener,
openContextMenu(View view)
http://developer.android.com/reference/android/app/Activity.html#openContextMenu(android.view.View)
Do not register for the context menu in onCreate(); do it in onTouch() before returning true:
registerForContextMenu(v);
openContextMenu(v);
return true;
Returning false tells the parent that you didn't consume the event. The default implementation of View implements touch handling for that view; if you want that to execute, you must call super.onTouchEvent(event);
I encountered a similar problem recently. When I enabled long-clickable on a RecyclerView's child, the ACTION_DOWN event could not be received in the RecyclerView's onTouchEvent.
When I switched to the RecyclerView's dispatchTouchEvent, it worked: the ACTION_DOWN event could be received there.
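For illustration, a minimal sketch of that workaround in a RecyclerView subclass; the class name is made up, and the androidx package is assumed (older projects would use the support-library equivalent):
import android.content.Context;
import android.util.AttributeSet;
import android.view.MotionEvent;

import androidx.recyclerview.widget.RecyclerView;

public class TouchLoggingRecyclerView extends RecyclerView {

    public TouchLoggingRecyclerView(Context context, AttributeSet attrs) {
        super(context, attrs);
    }

    @Override
    public boolean dispatchTouchEvent(MotionEvent event) {
        if (event.getActionMasked() == MotionEvent.ACTION_DOWN) {
            // ACTION_DOWN arrives here even when a long-clickable child handles the gesture.
        }
        return super.dispatchTouchEvent(event);
    }
}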

Android Live Wallpaper Touch Event Hierarchy

I have a Live Wallpaper that uses various screen touches to trigger certain events. At the moment it is working but I seem to get all touch events. For example, when the user touches an icon to launch an app, I seem to get that event too.
Is it possible to determine whether the user has touched the background of the screen (i.e. the inter-icon gaps), so that I only take my actions at that time and ignore the others?
Failing that (and assuming, possibly erroneously, that if I am first in the queue then there is no other application on top of me on the screen), can I determine where I am in the touch event queue so that I only take actions when I am the first one in the queue?
Or any other suggestions please.
Thanks
Richard
Ran into the same problem and ended up looking at the source for the Nexus wallpaper to see how it's implemented there.
I don't think it's possible to determine whether the actual home screen has consumed the touch. However, the default Android home screen does send a command to the wallpaper when you tap on empty space. So in your Engine, you can write:
@Override
public Bundle onCommand(String action, int x, int y, int z, Bundle extras, boolean resultRequested) {
    if (action.equals(WallpaperManager.COMMAND_TAP)) {
        // do whatever you would have done on ACTION_UP
    }
    return null;
}
