How can I get array/bitmap data from an Android touch screen?
(the shape and area of the touch)
This is the same question as "Android: get raw bitmap data from touch events", but that one is very old.
The data is processed in the kernel; is there any way I can get it?
It's not possible, since your touchscreen only recognizes a few points. For example, try any kind of paint app: you will see that your whole finger doesn't register if you put it on the screen. Only a few dots will be painted.
Also note that some phones don't support multi-touch. I would recommend rethinking your app design. Maybe you could let the user draw something to achieve the result you want.
Yes, that's what I meant. Check the cursor location.
Related
I am trying to develop a game for Android using pygame.
It would be a platformer. To move the main character, I would make the game wait for mouse events (like pygame.MOUSEBUTTONDOWN).
In order to do that on mobile, I'd like to create a graphic representation of a joypad with arrow keys to be shown on the bottom left corner of the screen.
Now, when the user touches one of the arrows, a MOUSEBUTTONDOWN event should be triggered and the character should move accordingly.
My question is: since the "joypad" object is merely a drawing, how can I link it to the event with pygame?
Is there a way to do so? Should I use the pixel coordinates of the arrow keys of the joypad or is there a better choice?
As far as I know this is not possible.
When handling input, mouse input and touch input are to be handled separately.
So to answer the 2 questions you listed at the end:
As far as I know there is no way to implement this functionality.
You could use the pixel coordinates of the arrows. However, it is easier to use Rects for this and test whether the mouse/touch position is inside an arrow button's Rect with the collidepoint method.
You can achieve that as follows, where arrow_left is the pygame.Rect of the left arrow and (mouse_x, mouse_y) is the position from the MOUSEBUTTONDOWN event:

if arrow_left.collidepoint(mouse_x, mouse_y):
    # move the character left
I hope this answer helped you!
I have received a requirement like this: http://www.youtube.com/watch?v=7MYQicokwmY&feature=plcp. I am reviewing this requirement. As per the requirement, we have to build touch detection like in the video for Android-enabled tablets.
In that video, toys (with circular, star, or rectangular shapes) use conductive silicone sensors; with these, the system detects the touch on the screen, decides the shape of the external object (triangle, circle, or star), and processes the shape further.
I have to use the same touch detection on Android tablets. Can anybody help me find a way to implement this on the Android platform? Is there any API or framework for it?
If you look at the video around 1:13, they show what I am guessing are some prototypes: the circle has three points, the hexagon too...
My best guess is that the biggest part of each object is non-conductive and only a few points are conductive, so only those register as touch points on the screen. The key is that each object's point pattern is distinct enough that you can recognize it no matter what the orientation or position is (and, depending on your requirements, even when several of those objects are on the screen at the same time).
You can also vary the contact area of each conductive point; in your code, the touch information will reflect this, since you can read per-pointer size and pressure values from the MotionEvent, as in the sketch below.
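A minimal sketch of reading that per-pointer data, assuming a custom View; the shape-matching step is left as a comment, and the distance-ratio idea is just one possible approach:

@Override
public boolean onTouchEvent(MotionEvent event) {
    int count = event.getPointerCount();
    for (int i = 0; i < count; i++) {
        float x = event.getX(i);               // pointer position
        float y = event.getY(i);
        float size = event.getSize(i);         // normalized contact area
        float pressure = event.getPressure(i);
        // Collect these samples; distances between the points are
        // orientation-invariant, so their ratios can identify the shape.
    }
    return true;
}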
How you place the conductive points, and how many you put on each shape, is completely up to you and really depends on your requirements (recognizing an arbitrary shape is not an option...).
Most touch screens will reject the touch if the contact area is too large (that's palm rejection), so I don't think there are many other ways to do this...
Is it possible to detect every pixel being touched? More specifically, when the user touches the screen, is it possible to track all the x-y coordinates of the cluster of points touched by the user? How can I tell the difference between when users are drawing with their thumb and when they are drawing with the tip of a finger? I would like to reflect the brush difference depending on how users touch the screen, and would also like to track x-y coordinates of all the pixels being touched over time. Thanks so much in advance for any help.
This would be very tricky primarily because every android phone is going to behave differently. There are some touch screen devices that are very, very sensitive and some that are basically "dull" by comparison.
It also sounds more like you want to track pressure - how hard the user is pushing on the screen - which is actually supported on Android devices.
I think some of your answer may be found by monitoring all of the touch events. In practice, most applications ignore a great number of events, or perform some kind of "smoothing" of them, since there is literally a deluge of touch events while the user is manipulating the screen. Doing this may negatively impact your application's performance, though.
I would recommend that you look into pressure sensitivity: calculate a circular region around the primary touch point based on pressure, then build your brush around that (see the sketch below).
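A minimal sketch of that idea as a custom drawing View; MIN_BRUSH and MAX_BRUSH are made-up tuning constants, and each touch sample is stored as a "dab" and redrawn:

import java.util.ArrayList;
import java.util.List;
import android.content.Context;
import android.graphics.Canvas;
import android.graphics.Paint;
import android.view.MotionEvent;
import android.view.View;

public class PressureBrushView extends View {
    private static final float MIN_BRUSH = 4f;  // tuning constants (assumptions)
    private static final float MAX_BRUSH = 40f;
    private final Paint paint = new Paint(Paint.ANTI_ALIAS_FLAG);
    private final List<float[]> dabs = new ArrayList<>(); // {x, y, radius}

    public PressureBrushView(Context context) {
        super(context);
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        // getPressure() is typically 0..1 but varies by device, so clamp it.
        float p = Math.min(1f, event.getPressure());
        float radius = MIN_BRUSH + p * (MAX_BRUSH - MIN_BRUSH);
        dabs.add(new float[] { event.getX(), event.getY(), radius });
        invalidate(); // redraw with the new dab
        return true;
    }

    @Override
    protected void onDraw(Canvas canvas) {
        for (float[] d : dabs) {
            canvas.drawCircle(d[0], d[1], d[2], paint);
        }
    }
}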
Another idea would be to incorporate more of a gesture approach to what you are trying to do. For example, visualize touching the screen with the tips of two fingers together (index and middle) and rolling the middle finger around the index finger, or simply moving the middle finger up and down in relation to the index finger; both fingers would move together while painting. This could be used to manipulate the drawing angle on the fly, or perhaps even to toggle between a set of pre-selected brushes or change brush size as you paint (a sketch of the angle part follows).
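The angle part could be sketched like this, inside the onTouchEvent above; brushAngle is a hypothetical value you would feed into your paint logic:

if (event.getPointerCount() >= 2) {
    float dx = event.getX(1) - event.getX(0); // middle finger relative to index
    float dy = event.getY(1) - event.getY(0);
    // Angle between the two pointers, in degrees; use it to orient the brush.
    float brushAngle = (float) Math.toDegrees(Math.atan2(dy, dx));
}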
Some of the above ideas I would love to see implemented - let me know when you have your app ready.
Good luck!
Rodney
If you have a listener on your image, it will basically respond that there was a touch somewhere within that bounding box.
So, to get what you want, you could (though I would never do this) create a box around every pixel, or small group of pixels, and listen for a touch on each.
Wherever you get a touch, it fires off an event, and you can react accordingly.
I can't think of any other solution that will give you each pixel a person touched at one time.
You may want to read up on multitouch though, as there are some suggestions in here that may help you:
http://android-developers.blogspot.com/2010/06/making-sense-of-multitouch.html
If you're looking for a way to get your content view as a View after Activity#setContentView(int), you can set an id on the outer-most element of your layout:
android:id="@+id/entire_view", and reference it in your onCreate() method after setContentView():
View view = findViewById(R.id.entire_view);
view.setOnTouchListener( ... );
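For example, a sketch of a listener that logs every pointer position on each event (the "Touch" log tag is arbitrary):

view.setOnTouchListener(new View.OnTouchListener() {
    @Override
    public boolean onTouch(View v, MotionEvent event) {
        for (int i = 0; i < event.getPointerCount(); i++) {
            Log.d("Touch", "pointer " + event.getPointerId(i)
                    + " at (" + event.getX(i) + ", " + event.getY(i) + ")");
        }
        return true; // consume the event
    }
});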
Suppose I am writing the word "abc" in my GestureOverlayView, and while doing so I need to keep all those letters on the screen until I press a clear button. Can anyone tell me a good way to do this?
I will write "a", which is taken as a gesture (not as a stroke). One thing I thought of was to put an ImageView or SurfaceView underneath the GestureOverlayView: when I draw "a" on the GestureOverlayView, then in the onGesturePerformed event it would take the Gesture, get the strokes, convert them into Paths, and draw those Paths onto the underlying ImageView or SurfaceView. Can anyone suggest code or guide me? I have tried various combinations of this but couldn't solve it.
There's an app called GestureBuilder in the samples directory of the SDK. This app shows how to persist gestures drawn by the user.
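If you go the "redraw the strokes yourself" route described in the question, a minimal sketch of the listener could look like this; savedPaths and drawingView are hypothetical fields of your Activity:

// Called by the GestureOverlayView when a gesture completes.
@Override
public void onGesturePerformed(GestureOverlayView overlay, Gesture gesture) {
    savedPaths.add(gesture.toPath()); // Gesture.toPath() flattens the strokes
    drawingView.invalidate();         // drawingView repaints all savedPaths
}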
I realize this is an old question, but you can "cheat" by increasing the fadeOffset to some ridiculously high number. This can be done either in the XML:
android:fadeOffset="some very large number"
or programmatically:
yourGestureOverlayView.setFadeOffset(someVeryLargeNumberInMilliseconds);
I'm fairly new to the Android platform and was wondering if I could get some advice for my current head scratcher:
I'm making an app which in one view will need an image that can be scrolled on one axis, with a load of selectable points over the top of it. Each point needs to be positionable on the x and y axes (unlikely to change once the app is running, but I'll need to fine-tune the positions while I'm developing it).
I'd like to be able to let the user select each point and have a graphic drawn on the point the user has selected or just draw a graphic on one/more points without user intervention.
For the selectable points, I thought I could extend the checkbox with a custom image for the selected state - does that sound right, or is there a better way of doing this? Is there anything I can read up on for this? I can't seem to find anything on the net about replacing the default images.
I was going to use AbsoluteLayout, but I see that it's been deprecated and I can't find anything to replace it.
Can anyone give me some code or advice on where to read up on what I need to do?
Thank you in advance
This really feels like something you should be doing with the Canvas and 2D graphics, rather than trying to twist the widget framework to fit.
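Something along these lines, as a rough sketch; the point coordinates and hit radius are placeholders you would fine-tune:

import android.content.Context;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Paint;
import android.view.MotionEvent;
import android.view.View;

public class PointsView extends View {
    private static final float HIT_RADIUS = 48f; // touch slop, an assumption
    private final Paint paint = new Paint(Paint.ANTI_ALIAS_FLAG);
    private final float[][] points = { { 100, 200 }, { 300, 420 } }; // fine-tune
    private int selected = -1; // index of the currently selected point

    public PointsView(Context context) {
        super(context);
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        if (event.getAction() == MotionEvent.ACTION_DOWN) {
            for (int i = 0; i < points.length; i++) {
                float dx = event.getX() - points[i][0];
                float dy = event.getY() - points[i][1];
                if (dx * dx + dy * dy < HIT_RADIUS * HIT_RADIUS) {
                    selected = i; // remember the hit and redraw
                    invalidate();
                    return true;
                }
            }
        }
        return super.onTouchEvent(event);
    }

    @Override
    protected void onDraw(Canvas canvas) {
        for (int i = 0; i < points.length; i++) {
            paint.setColor(i == selected ? Color.RED : Color.GRAY);
            canvas.drawCircle(points[i][0], points[i][1], 20f, paint);
        }
    }
}

Wrapping that view in a HorizontalScrollView would give you the one-axis scrolling, though you would then need to distinguish taps from scroll drags.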