I am using AndEngine to get touch events.
There is the onAreaTouched() callback,
but I need a TAP (click) event for the images I draw. I can get touches with onAreaTouched, but that fires as soon as the user simply touches the area; I want to react only when the user actually taps it. Suggestions, examples, or tutorials?
Basically, a click is a combination of an ACTION_DOWN, some ACTION_MOVEs and an ACTION_UP, all occurring within a small area. All you need to do is check whether your ACTION_UP happened near your ACTION_DOWN. If you need some extra accuracy, you can also check the time interval between those actions to be sure it was a click. Just store the position and time of the ACTION_DOWN and compare them to the ACTION_UP's position and time, and you'll be able to tell a click apart from a fling or anything else. Hope this helps.
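For illustration, here is a minimal sketch of that idea using plain Android MotionEvents (the same logic carries over to AndEngine's TouchEvent in onAreaTouched). The class name and the onTap() hook are my own; the distance and time thresholds are just the framework defaults from ViewConfiguration, but any small constants would do:

import android.view.MotionEvent;
import android.view.View;
import android.view.ViewConfiguration;

// Treats an ACTION_UP that lands close to the preceding ACTION_DOWN, within a
// short time, as a tap. Attach with view.setOnTouchListener(new TapListener(view)).
public class TapListener implements View.OnTouchListener {
    private final int touchSlop;   // maximum finger travel (pixels) still counted as a tap
    private float downX, downY;
    private long downTime;

    public TapListener(View view) {
        touchSlop = ViewConfiguration.get(view.getContext()).getScaledTouchSlop();
    }

    @Override
    public boolean onTouch(View v, MotionEvent event) {
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:
                downX = event.getX();
                downY = event.getY();
                downTime = event.getEventTime();
                return true;
            case MotionEvent.ACTION_UP:
                float dx = event.getX() - downX;
                float dy = event.getY() - downY;
                boolean close = dx * dx + dy * dy <= touchSlop * touchSlop;
                boolean quick = event.getEventTime() - downTime
                        < ViewConfiguration.getLongPressTimeout();
                if (close && quick) {
                    onTap(v);   // this down/up pair qualifies as a tap
                }
                return true;
            default:
                return false;
        }
    }

    // Hypothetical hook: override with whatever should happen on a tap.
    protected void onTap(View v) { }
}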
Related
I'm having a problem with the logic of how to go about this.
I require one button to do three things:
On click: play a music file once.
If held (I'm guessing onTouch): the music plays and loops, but stops when the thumb is released.
On long click: go to a new view for user input.
I have been able to implement 1 and 3.
My problem is that if I use an OnTouchListener it will eventually fire the onLongClick event, and I think it will also try to run the onClick event when I release the touch.
Any thoughts would be appreciated.
onTouch gives you the MotionEvent, so you can do a lot of fancy things, because it lets you separate the different states of the gesture. Just to name a few:
ACTION_UP
ACTION_DOWN
ACTION_MOVE
Those are the common actions we usually handle to get the desired result, such as dragging a view around the screen.
On the other hand, onClick doesn't give you much beyond which view the user interacted with. onClick is a complete event, comprising focusing, pressing and releasing.
For more information on how to handle single- and multi-touch events in Android, visit here.
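As a concrete, hedged illustration of branching on those actions for the press-and-hold case in the question, here is a rough sketch; the listener class name is mine, and the MediaPlayer is assumed to be created elsewhere (e.g. with MediaPlayer.create()):

import android.media.MediaPlayer;
import android.view.MotionEvent;
import android.view.View;

// Loops the sound while the finger is down and stops it on release.
public class HoldToLoopListener implements View.OnTouchListener {
    private final MediaPlayer player;   // assumed to be created and released by the caller

    public HoldToLoopListener(MediaPlayer player) {
        this.player = player;
    }

    @Override
    public boolean onTouch(View v, MotionEvent event) {
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:
                player.setLooping(true);
                player.start();          // keep looping while the finger stays down
                return true;
            case MotionEvent.ACTION_UP:
            case MotionEvent.ACTION_CANCEL:
                player.pause();          // stop as soon as the finger lifts
                player.seekTo(0);
                return true;
            default:
                return false;
        }
    }
}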
So, what you need to handle this situation is "MyGestureListener".
You can still use what you have implemented for onClick and onLongClick, and instead of onTouch I am pretty sure you are meant to use onDown / ON_DOWN.
Here is a list of all the methods you can use: https://developer.android.com/samples/BasicGestureDetect/src/com.example.android.basicgesturedetect/GestureListener.html
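To make that concrete, here is a rough sketch of how the detector could be wired up inside an Activity; playOnce() and openInputView() are placeholders for your own methods, and setUpButtonGestures is a name I made up:

import android.content.Context;
import android.view.GestureDetector;
import android.view.MotionEvent;
import android.view.View;

private void setUpButtonGestures(Context context, View button) {
    final GestureDetector detector = new GestureDetector(context,
            new GestureDetector.SimpleOnGestureListener() {
                @Override
                public boolean onDown(MotionEvent e) {
                    return true;              // must return true to receive the rest of the gesture
                }

                @Override
                public boolean onSingleTapUp(MotionEvent e) {
                    playOnce();               // case 1: a plain tap
                    return true;
                }

                @Override
                public void onLongPress(MotionEvent e) {
                    openInputView();          // case 3: a long press
                }
            });

    button.setOnTouchListener(new View.OnTouchListener() {
        @Override
        public boolean onTouch(View v, MotionEvent event) {
            return detector.onTouchEvent(event);   // hand every touch to the detector
        }
    });
}

private void playOnce() { /* play the sound once */ }
private void openInputView() { /* navigate to the input view */ }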
Hope it helped.
You'd better handle option 1 with a normal click; then, instead of onTouch, use onLongClick and write code for a menu to appear offering options 2 and 3.
Cheers! :)
Let's say
- I have a button that listens to a "tap" event and directs to a function that does something.
- I put an ImageRect that covers the button, one layer up.
When I click on the cover image, just above the area where the button lies behind it, the event function STILL executes.
How do I avoid this?
example:
local function hidebg()
    display.remove(logo3)
    logo3 = nil
end

local logo2 = display.newImage("logo.png")
logo2.x = display.contentCenterX
logo2.y = 280
logo2.width = 200
logo2.height = 74
logo2:addEventListener("tap", hidebg)

local cover = display.newImageRect("NEW GAME A.png", 480, 320)
cover.x = display.contentCenterX/2
cover.y = display.contentCenterY/2
The hidebg() function is still executed even though "logo2" is covered by the "cover" image.
I know I could make the button isVisible=false and solve the problem, but I have dozens of buttons in different groups in different layers, and I wonder how to do it in a smart way. Maybe somehow disable a whole group? I don't know.
There are 2 ways that you can disable that button in your project.
1) Just add a tap listener to cover that returns true, as follows:
function coverPressed()
    return true
end

cover:addEventListener("tap", coverPressed)
2) Check whether cover exists, and if it does, remove logo2's listener:
logo2:removeEventListener("tap", hidebg)
Keep Coding............ 😃
Control Touch Propagation
The reason that this problem can be solved by adding a touch event listener that returns true to the masking DisplayObject, as suggested in the accepted answer, is that this handles or halts the propagation of the touch. Having been handled by the masking object, the touch will never reach the listener on the button located further down in the display hierarchy (or further back, if you prefer).
This is explained in the Corona SDK documentation on tap/touch propagation:
When the user touches the screen, the event is dispatched to the display hierarchy. Only those display objects that intersect the touch location on the screen will receive the event.
Tap and touch events propagate through these objects in a particular order. By default, the first object to receive the event is the front-most display object in the display hierarchy that intersects the touch location. The next object to receive the event is the next object back in the hierarchy that intersects the touch location, and so on.
Tap and touch events propagate until they are "handled." This means that if you have multiple objects overlaying each other in the display hierarchy, and a tap or touch event listener has been applied to each, the event will propagate through all of these objects. However, you can stop propagation to the next underlying object by telling Corona that the event has been handled. This is as simple as returning true [emphasis mine] from the event listener — this stops the propagation cycle and prevents any underlying objects from responding to the hit event.
Change Widget Properties
If your button is from the widget.* library, you can achieve the same result more simply by disabling it and making it invisible:
button:setEnabled( false )
button.isVisible = false
By the way, the advantage of using isVisible (rather than changing alpha) is that you don't need to keep track of the alpha value before hiding the button. If you later do button.isVisible = true, the ButtonWidget will have the same alpha value as before.
Environment:
I have an Android ListView with rows that consist of TextViews containing some HTML with tappable links (URLSpans). On the ListView, I have set an OnItemLongClickListener to listen for long click events on individual rows.
Goal:
When I receive a long click event, I want to DISABLE taps on the links for the same touch event, but I'm not seeing how to do this. The long click fires correctly, but then as soon as I lift my finger, the link tap also fires.
What I've Tried Already:
I've tried returning true (and false) on the onItemLongClick method -- it doesn't seem to make a difference either way. I've tried to intercept the MotionEvent.ACTION_UP after a long click so that I can temporarily consume the link tap, but the ACTION_UP doesn't fire -- at least not on the ListView.
I did some more searching and found this answer to a similar question: Android TextView Linkify intercepts with parent View gestures
I used the concepts from this answer to solve my problem. I extended the TextView class, overrode onTouchEvent, and check on touch-down events whether I'm tapping on a link. If I am, I save that link and "click" it programmatically in my ListView's onItemClick, provided I didn't encounter a long click first. Yuck.
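For reference, a rough sketch of that workaround; the class name and the clickPendingLink() hook are my own, and only the Layout/Spanned calls are standard Android APIs:

import android.content.Context;
import android.text.Layout;
import android.text.Spanned;
import android.text.style.ClickableSpan;
import android.util.AttributeSet;
import android.view.MotionEvent;
import android.widget.TextView;

// Remembers the link under the last ACTION_DOWN instead of letting the framework fire it.
public class LinkAwareTextView extends TextView {
    private ClickableSpan pendingSpan;

    public LinkAwareTextView(Context context, AttributeSet attrs) {
        super(context, attrs);
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        if (event.getActionMasked() == MotionEvent.ACTION_DOWN && getText() instanceof Spanned) {
            Spanned spanned = (Spanned) getText();
            int x = (int) event.getX() - getTotalPaddingLeft() + getScrollX();
            int y = (int) event.getY() - getTotalPaddingTop() + getScrollY();
            Layout layout = getLayout();
            if (layout != null) {
                int offset = layout.getOffsetForHorizontal(layout.getLineForVertical(y), x);
                ClickableSpan[] links = spanned.getSpans(offset, offset, ClickableSpan.class);
                pendingSpan = links.length > 0 ? links[0] : null;
            }
        }
        // Don't consume the touch: the ListView row gets the click/long click
        // and decides whether the saved link should fire.
        return false;
    }

    // Call this from onItemClick when no long click preceded it.
    public void clickPendingLink() {
        if (pendingSpan != null) {
            pendingSpan.onClick(this);
            pendingSpan = null;
        }
    }
}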
If anybody else has a more elegant way to solve the problem, post it and I'll accept your answer if it works. If not, I'll accept my own answer in a few days.
Is there a way to detect a click from MotionEvents in Android? To be more specific, I need a way to distinguish between two kinds of events: movement and click (I'm more interested in the latter).
For example, this kind of behavior can be observed in the MapView component: if you drag the map just a little, it does not move (I would call this a click); however, if the movement distance is bigger, the map starts to move as well (I would call this movement). Is there a standard threshold (global parameter) or another method to distinguish these two actions?
You can implement an OnGestureListener. onSingleTapUp() will be called for a click type event and onScroll() will be called for a movement type event.
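A minimal sketch of that listener, to drop into your class; build a GestureDetector with it and forward the view's touch events to the detector, as in the earlier GestureDetector sketch:

import android.view.GestureDetector;
import android.view.MotionEvent;

GestureDetector.SimpleOnGestureListener listener =
        new GestureDetector.SimpleOnGestureListener() {
            @Override
            public boolean onSingleTapUp(MotionEvent e) {
                // Finger went down and came up without significant movement: a "click".
                return true;
            }

            @Override
            public boolean onScroll(MotionEvent e1, MotionEvent e2,
                                    float distanceX, float distanceY) {
                // Finger travelled far enough to count as movement: a "scroll"/drag.
                return true;
            }
        };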
I have a bit of doubt. I am using an image button (e.g. the Play icon in a media player). I want to know which listener I am supposed to use, OnClickListener or OnTouchListener. What is the difference between those two, and when should I use each?
The answer by #vishy1618 has the key insight of this thread (tried to leave this as a comment there, but too long).
Conceptually, onClick is just a 'wrapper' around a particular sequence of touch events - down, no drag, up. So comparing onTouch vs. onClick is just a low-level API (raw touch events) vs. a high-level API (a logical user 'click').
But, an important compatibility issue: in Android, onClick can also be fired by the KEYBOARD (or trackball, or whatever alternative input/hardware device is being used). But (afaict) there's no support for firing touch events via any other input device apart from the touch screen.
So, if you code your UI against touch events exclusively, you are implicitly requiring a touchscreen. Whereas if you stick to onClick, your app could theoretically work on a non-touch device.
Of course, all 'compliant' Android phones currently do have touch screens ... so this is effectively moot. But if you want your app to work on non-phone hardware, this might be worth considering.
There is some good discussion here:
How to determine if an Android device has a touchscreen?
https://groups.google.com/forum/?fromgroups=#!topic/android-beginners/cjOVcn0sqLg
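If you do want to check this at runtime, a small hedged sketch; the helper class and method names are mine, while FEATURE_TOUCHSCREEN is the standard PackageManager constant:

import android.content.Context;
import android.content.pm.PackageManager;

public final class InputCapabilities {
    // Returns true when the device reports any kind of touchscreen.
    public static boolean hasTouchscreen(Context context) {
        return context.getPackageManager()
                .hasSystemFeature(PackageManager.FEATURE_TOUCHSCREEN);
    }
}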
onClickListener is used whenever a click event for any view is raised, for example a click event for a Button or ImageButton.
onTouchListener is used whenever you want touch-level functionality, for example if you want the exact screen coordinates where you touched.
Update:
Just check the official doc for both: onClickListener and onTouchListener.
So from official doc, definition for both are:
onClickListener: Interface definition for a callback to be invoked when a view is clicked.
onTouchListener: Interface definition for a callback to be invoked when a touch event is dispatched to this view. The callback will be invoked before the touch event is given to the view.
The onClickListener handles events that can be triggered using either the keyboard or the touchscreen. They are performed on a specific view, and the entire view receives the event. In contrast, the onTouchListener is used only for touchscreen events, which cannot be triggered through the keyboard or any other input. Touch callbacks typically also receive the corresponding touch information, such as the x, y coordinates.
I think the onClickListener would be appropriate for your application, if you are not using more complex inputs, like gestures, etc.
I also had this question in mind: should I use a click or a touch listener?
This is my understanding:
When I need a View (Button/Image/etc.) to be clickable, meaning the user doesn't just brush that part of the screen but deliberately presses it so that the next action gets called, I use OnClickListener. Another point: with a Button we can dynamically set clickable to true/false as required, so in these situations OnClickListener is preferred.
new View.OnClickListener() {
    @Override
    public void onClick(View v) {
        // Handle the click here; this fires on a completed press-and-release.
    }
};
When developing screens where a simple touch from the user is taken as the action, which is more common in games or when working with images, where you want to capture where the user touched and also track the motion events (up/down/left/right) of the touch, I prefer to use OnTouchListener.
new View.OnTouchListener() {
    @Override
    public boolean onTouch(View v, MotionEvent event) {
        // Inspect event.getAction() and the touch coordinates here.
        return false;   // return true if this listener consumed the event
    }
};
And in your case I suggest using OnClickListener.
Extra information for anyone viewing this thread in the future:
When you set the focusableInTouchMode property of the view to true, onClick will not fire until the second click/touch on the view. I assume the OS treats the first touch as gaining focus only. onTouch, however, fires on the first touch.
When focusableInTouchMode is false, both fire on the first touch.
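For example (a trivial sketch; setFocusableInTouchMode is the standard View setter, and button is just some hypothetical view):

// First touch only gives the view focus; onClick fires on the second touch.
button.setFocusableInTouchMode(true);

// Default for most buttons: onClick fires on the first touch.
button.setFocusableInTouchMode(false);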