I need this for the public abstract boolean onTouch(View v, MotionEvent event) method.
Since I have no prior knowledge of Android development, I don't even know what a View is.
It seems that for small apps built with Compose I don't need them. (A simple off-topic question: do I need to learn how to use Views for Compose?)
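For reference, a minimal sketch of how such a listener is usually attached; a View is just any on-screen UI element (a button, a text field, a whole layout), and all names here are illustrative:

```java
import android.view.MotionEvent;
import android.view.View;

// Illustrative only: attach a touch listener to any View.
// someView stands in for a real View instance from your layout.
someView.setOnTouchListener(new View.OnTouchListener() {
    @Override
    public boolean onTouch(View v, MotionEvent event) {
        if (event.getAction() == MotionEvent.ACTION_DOWN) {
            // A finger touched v at (event.getX(), event.getY()).
        }
        return false; // false = don't consume; let other handlers see it
    }
});
```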
Related
I have a Service that can draw a Canvas on top of all applications using a SYSTEM_ALERT_WINDOW which contains a custom View.
My Service and my custom View both implement View.OnTouchListener.
The Service's onTouch(View v, MotionEvent ev) returns true, while the custom View's returns false. This allows me to display my custom View on top of all apps and still interact with the underlying Activity. This part works.
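For context, the overlay is created along these lines (a simplified sketch, not my exact code; OverlayService and the flag choices are illustrative, using the usual parameters for a touchable overlay):

```java
import android.app.Service;
import android.content.Context;
import android.content.Intent;
import android.graphics.PixelFormat;
import android.os.IBinder;
import android.view.MotionEvent;
import android.view.View;
import android.view.WindowManager;

// Sketch of a Service that puts a View over all other apps.
// Assumes the SYSTEM_ALERT_WINDOW permission is granted.
public class OverlayService extends Service implements View.OnTouchListener {
    private View overlay;

    @Override
    public void onCreate() {
        super.onCreate();
        WindowManager wm = (WindowManager) getSystemService(Context.WINDOW_SERVICE);
        WindowManager.LayoutParams params = new WindowManager.LayoutParams(
                WindowManager.LayoutParams.MATCH_PARENT,
                WindowManager.LayoutParams.MATCH_PARENT,
                WindowManager.LayoutParams.TYPE_SYSTEM_ALERT, // draw over other apps
                WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE
                        | WindowManager.LayoutParams.FLAG_NOT_TOUCH_MODAL,
                PixelFormat.TRANSLUCENT);
        overlay = new View(this); // stand-in for the custom View
        overlay.setOnTouchListener(this);
        wm.addView(overlay, params);
    }

    @Override
    public boolean onTouch(View v, MotionEvent ev) {
        return true; // the Service consumes the event, as described above
    }

    @Override
    public IBinder onBind(Intent intent) {
        return null;
    }
}
```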
I also want the Service to simulate a touch event at specified coordinates in the current Activity underneath the SYSTEM_ALERT_WINDOW.
I was hoping I could call View.dispatchTouchEvent(...) on the custom View and, because onTouch(...) returns false, the touch event would get passed on to the underlying Activity. It does not work.
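What I tried looks roughly like this (illustrative sketch; overlayView is my custom View, x and y are the target coordinates):

```java
// Inside the Service (imports: android.os.SystemClock,
// android.view.MotionEvent, android.view.View).
// Builds a synthetic DOWN event and hands it to the overlay View,
// hoping it falls through to the Activity underneath. It never
// leaves my own window.
private void injectDown(View overlayView, float x, float y) {
    long now = SystemClock.uptimeMillis();
    MotionEvent down = MotionEvent.obtain(
            now, now, MotionEvent.ACTION_DOWN, x, y, 0 /* metaState */);
    overlayView.dispatchTouchEvent(down);
    down.recycle();
}
```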
How can I simulate touch events in any Activity in this situation?
I have used code from the following sources...
Draw a canvas on top of all applications:
http://www.piwai.info/chatheads-basics/
Draw an Android canvas on top of all applications?
Passing touches to underlying app:
How can I interact with elements behind a translucent Android app?
Simulating touches:
How to simulate a touch event in Android?
AFAIK, you can't, for obvious security reasons, except probably on rooted devices. Even the accessibility APIs don't support this AFAICT. Allowing arbitrary apps to fake input to other apps would be a major security hole.
If your objective is to do this during testing, instrumentation testing allows you to simulate touch events in your own app.
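For example, something along these lines works from an instrumentation test against your own app's windows (a sketch; sendPointerSync() throws a SecurityException if the event would land in another app's window):

```java
import android.app.Instrumentation;
import android.os.SystemClock;
import android.view.MotionEvent;

// Synthesizes a tap at (x, y) in the app under test. Only works for
// windows owned by the instrumented process (or with the INJECT_EVENTS
// permission, which normal apps cannot hold).
public final class TapHelper {
    public static void tap(Instrumentation inst, float x, float y) {
        long down = SystemClock.uptimeMillis();
        inst.sendPointerSync(MotionEvent.obtain(
                down, down, MotionEvent.ACTION_DOWN, x, y, 0));
        inst.sendPointerSync(MotionEvent.obtain(
                down, SystemClock.uptimeMillis(), MotionEvent.ACTION_UP, x, y, 0));
    }
}
```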
I'm writing an API for Android, and I need to know when the developer using the API calls View.setOnClickListener() and View.setOnFocusChangeListener(). I don't want to override either, because that would mean extending View, and I don't want to force the developer to use my subclass (mainly because they wouldn't be able to use the Android GUI editor for Eclipse).
I tried overriding Activity.dispatchTouchEvent(), but then I cannot capture movements made with the keypad/virtual keyboard.
Any ideas or guidelines on how to do this?
I found that using both Activity.dispatchTouchEvent() and Activity.dispatchKeyEvent() is the only way to do this. You have to keep track of the view that previously had focus (so you can tell when onFocusChange() would be triggered), and onClick() can be detected using the MotionEvent action and the view ID.
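A rough sketch of that approach (onApiClick() and onApiFocusChange() are placeholder hooks, not framework methods; the hit-testing and reporting details are up to the API):

```java
import android.app.Activity;
import android.view.KeyEvent;
import android.view.MotionEvent;
import android.view.View;

// Intercepts both touch and key dispatch at the Activity level,
// tracking focus changes manually across both input paths.
public class TrackedActivity extends Activity {
    private View previouslyFocused;

    @Override
    public boolean dispatchTouchEvent(MotionEvent ev) {
        boolean handled = super.dispatchTouchEvent(ev);
        if (ev.getAction() == MotionEvent.ACTION_UP) {
            // A completed tap; resolve the clicked view from the raw
            // coordinates or the current focus, then report it.
            // onApiClick(ev.getRawX(), ev.getRawY());
        }
        checkFocusChange();
        return handled;
    }

    @Override
    public boolean dispatchKeyEvent(KeyEvent event) {
        boolean handled = super.dispatchKeyEvent(event);
        checkFocusChange(); // keypad navigation can move focus too
        return handled;
    }

    private void checkFocusChange() {
        View current = getCurrentFocus();
        if (current != previouslyFocused) {
            // onApiFocusChange(previouslyFocused, current);
            previouslyFocused = current;
        }
    }
}
```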
On Android we have onDraw(). What is the equivalent in iOS?
You probably want drawRect:, though depending on what you want in your view there might be other options of interest (subviews & Core Animation layers). See the View Programming Guide.
If you're writing a custom view, it's basically -drawRect:, which gets called on the view every time the system wants to redraw (e.g., every time the run loop turns and the -setNeedsDisplay flag is set).
You override the -drawRect: method of a UIView to draw directly to the screen. However, this isn't needed in many common cases. If you want, provide more detail about what you're trying to achieve; there may be a more iOS-idiomatic way.
I'm new to Android (and programming in general), and I was wondering why the OnClickListener interface is under View. For instance, I might set up a button so that when you click on it, I show a toast message. Why does the OnClickListener need to know anything about a View?
I understand what a callback is, and why you make OnClickListener an interface that the developer implements: it's reusable. But maybe I don't really understand encapsulation? Or maybe I'm totally missing the point of interfaces and callbacks?
This is more of a conceptual question, and I would highly appreciate any answers explaining the concept, ideally with simple/short example code.
Thanks!
The major views we use are subclasses of the View class.
Android defines many interfaces in the View class to handle events that are common to every view type (Button, TextView, etc.). This keeps the API simple and gives programmers an easier way to work; it is essentially the concept of inheritance in OOP.
If you have any doubts, refer to the Android Developers website; you will find complete details there.
View is the superclass of all widgets such as Button, TextView, EditText, etc.
On a click event we need to know which widget was clicked; that is why we receive a View as an argument.
If we have multiple views, we can tell them apart by their IDs, as in the sketch below.
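For example, inside an Activity's onCreate() (a sketch; R.id.button_one and R.id.button_two are hypothetical IDs from a made-up layout):

```java
import android.view.View;
import android.widget.Toast;

// One listener serving two buttons; the View argument tells us
// which widget was actually clicked.
View.OnClickListener listener = new View.OnClickListener() {
    @Override
    public void onClick(View v) {
        if (v.getId() == R.id.button_one) {        // hypothetical ID
            Toast.makeText(v.getContext(), "First button", Toast.LENGTH_SHORT).show();
        } else if (v.getId() == R.id.button_two) { // hypothetical ID
            Toast.makeText(v.getContext(), "Second button", Toast.LENGTH_SHORT).show();
        }
    }
};
findViewById(R.id.button_one).setOnClickListener(listener);
findViewById(R.id.button_two).setOnClickListener(listener);
```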
The TouchUtils class in the Android documentation has functions like drag():
https://developer.android.com/reference/android/test/TouchUtils.html#drag(android.test.InstrumentationTestCase,%20float,%20float,%20float,%20float,%20int)
but they do not support multi-touch gestures, like a two-finger swipe.
Looking at the MotionEvent.obtain() methods, there does not seem to be any way of injecting a "virtual" multi-touch event from a test case.
Has anyone got this working?
Apparently there is no other way than to use the private function MotionEvent.obtainNano() to mock multi-touch events. Hopefully this will change in future versions.
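Update: API level 14 later added a public MotionEvent.obtain(...) overload that takes PointerProperties[] and PointerCoords[] arrays. A minimal sketch of building a two-pointer event with it (a full gesture would still need the ACTION_DOWN / ACTION_POINTER_DOWN / ACTION_MOVE / ... sequence):

```java
import android.os.SystemClock;
import android.view.InputDevice;
import android.view.MotionEvent;
import android.view.MotionEvent.PointerCoords;
import android.view.MotionEvent.PointerProperties;

// Builds a single two-finger ACTION_MOVE event at the given coordinates.
public static MotionEvent twoFingerMove(float x1, float y1, float x2, float y2) {
    long now = SystemClock.uptimeMillis();

    PointerProperties[] props = new PointerProperties[2];
    PointerCoords[] coords = new PointerCoords[2];
    for (int i = 0; i < 2; i++) {
        props[i] = new PointerProperties();
        props[i].id = i;
        props[i].toolType = MotionEvent.TOOL_TYPE_FINGER;
        coords[i] = new PointerCoords();
        coords[i].pressure = 1f;
        coords[i].size = 1f;
    }
    coords[0].x = x1; coords[0].y = y1;
    coords[1].x = x2; coords[1].y = y2;

    return MotionEvent.obtain(now, now, MotionEvent.ACTION_MOVE,
            2 /* pointerCount */, props, coords,
            0 /* metaState */, 0 /* buttonState */,
            1f, 1f /* x/y precision */, 0 /* deviceId */, 0 /* edgeFlags */,
            InputDevice.SOURCE_TOUCHSCREEN, 0 /* flags */);
}
```

The resulting event can then be delivered from a test with Instrumentation.sendPointerSync(), subject to the same own-app restriction discussed above.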