Detect if the user draws a specified shape - Android

I am creating an app where a shape (circle, square, triangle, ...) pops up on the screen. The user then has to trace the shape shown on the screen in order to move on. The only documentation I have found covers simple gestures, but nothing on complex shape-drawing gestures. Is anyone aware of any Android constructs that achieve this? Any help is very appreciated!
So far I have looked into the Android Canvas and the ability for the user to draw shapes with it. However, I cannot find a method that will match the drawn shape to a pattern. I also looked into gesture detection, but found that it is only used to detect basic gestures such as swipes.
-Kelton

Try reading Tracking Movement in the Android documentation. The approach should work if you store the data from the user's movement and then compare it to your actual target shape.
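A minimal sketch of that idea, assuming you record the stroke as a list of points from onTouchEvent (ACTION_DOWN/ACTION_MOVE) and call match() on ACTION_UP. ShapeMatcher is a made-up helper, not a platform class; the resample/normalize scheme is a simplified version of template recognizers such as the $1 recognizer, and the acceptance threshold is something you would tune empirically:

import android.graphics.PointF;
import java.util.ArrayList;
import java.util.List;

// Compares a drawn stroke against a stored template of the target shape.
public class ShapeMatcher {
    private static final int N = 64; // number of points after resampling

    // Returns a dissimilarity score; smaller means a closer match.
    // Both paths must contain at least two distinct points.
    public static double match(List<PointF> stroke, List<PointF> template) {
        List<PointF> a = normalize(resample(stroke, N));
        List<PointF> b = normalize(resample(template, N));
        double sum = 0;
        for (int i = 0; i < N; i++) {
            sum += Math.hypot(a.get(i).x - b.get(i).x, a.get(i).y - b.get(i).y);
        }
        return sum / N; // average point-to-point distance after normalization
    }

    // Resample to n evenly spaced points so drawing speed and sampling rate
    // don't affect the comparison.
    private static List<PointF> resample(List<PointF> path, int n) {
        path = new ArrayList<>(path); // work on a copy; points are inserted below
        double interval = pathLength(path) / (n - 1);
        List<PointF> out = new ArrayList<>();
        out.add(path.get(0));
        double acc = 0;
        for (int i = 1; i < path.size(); i++) {
            PointF p0 = path.get(i - 1), p1 = path.get(i);
            double d = Math.hypot(p1.x - p0.x, p1.y - p0.y);
            if (acc + d >= interval && d > 0) {
                float t = (float) ((interval - acc) / d);
                PointF q = new PointF(p0.x + t * (p1.x - p0.x),
                                      p0.y + t * (p1.y - p0.y));
                out.add(q);
                path.add(i, q); // continue walking from the inserted point
                acc = 0;
            } else {
                acc += d;
            }
        }
        while (out.size() < n) out.add(path.get(path.size() - 1)); // rounding slack
        return out.size() > n ? out.subList(0, n) : out;
    }

    // Translate to the bounding-box origin and scale to a unit box so the
    // position and size of the drawing don't affect the score.
    private static List<PointF> normalize(List<PointF> path) {
        float minX = Float.MAX_VALUE, minY = Float.MAX_VALUE;
        float maxX = -Float.MAX_VALUE, maxY = -Float.MAX_VALUE;
        for (PointF p : path) {
            minX = Math.min(minX, p.x); maxX = Math.max(maxX, p.x);
            minY = Math.min(minY, p.y); maxY = Math.max(maxY, p.y);
        }
        float w = Math.max(maxX - minX, 1e-6f), h = Math.max(maxY - minY, 1e-6f);
        List<PointF> out = new ArrayList<>();
        for (PointF p : path) {
            out.add(new PointF((p.x - minX) / w, (p.y - minY) / h));
        }
        return out;
    }

    private static double pathLength(List<PointF> path) {
        double len = 0;
        for (int i = 1; i < path.size(); i++) {
            len += Math.hypot(path.get(i).x - path.get(i - 1).x,
                              path.get(i).y - path.get(i - 1).y);
        }
        return len;
    }
}

Store one template per shape (circle, square, triangle), compute match() against each when the finger lifts, and accept the best-scoring shape if its score is below your threshold.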

Related

Hide/remove/change color of a canvas circle when touched in Android

This is my first question here, so maybe I'll make some mistakes.
I want to create an Android app for a touch test that draws a cross on the screen. I have succeeded in drawing the cross, but now I want to implement the test screen logic.
What I want is that when the user taps any circle, it should be removed/hidden or have its color changed to transparent.
Any kind of help would be appreciated.
Here is the link to my code:
https://github.com/akhlaqshah36/stackoverflow_questions/blob/master/drawCicles
I got the solution by using the distance formula and Pointer objects.
Check the solution here if anyone needs it.
Solution link: drawing cross in canvas
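The core of that solution is the distance formula: a tap at (x, y) is inside a circle when its distance to the circle's center is at most the radius. A minimal sketch inside a custom View's onTouchEvent (the Circle class and the circles list are assumptions standing in for whatever structures the linked code uses):

@Override
public boolean onTouchEvent(MotionEvent event) {
    if (event.getActionMasked() == MotionEvent.ACTION_DOWN) {
        float x = event.getX(), y = event.getY();
        for (Circle c : circles) { // hypothetical holder of cx, cy, radius, visible
            float dx = x - c.cx, dy = y - c.cy;
            // Inside the circle when dx^2 + dy^2 <= r^2 (no sqrt needed).
            if (dx * dx + dy * dy <= c.radius * c.radius) {
                c.visible = false; // or swap its paint color to transparent
                invalidate();      // redraw without this circle
                return true;
            }
        }
    }
    return super.onTouchEvent(event);
}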

Detect shapes and create histograms

I am working on an app that will compare histograms in hopes of matching faces.
The app allows the user to take a photo, select a few key points in the image and then the app draws circles around those points. I then detect the circles using the OpenCV Hough Circle Transform functions. Up to this point the app works great.
What I need to implement now is one of two options:
1. Detect the circles and create separate histograms for the area inside each circle.
2. Detect the circles, black out the area(s) around them, and create one histogram.
I'm leaning towards option 2, but I'm not sure how to mask/color/paint the area outside of the circles after they are detected. Any input would be appreciated. Thanks.
Instead of painting the area outside the circles in the original image, why not create a new image and copy the content of the circles to it?
Another point is that histograms are independent of translation, so it does not matter whether you copy the circles to their exact original locations in the new image.
Do clarify if I did not answer your question, or if you have other questions now.
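Note that for option 2, OpenCV's calcHist already takes a mask, so you don't have to paint the original image at all: draw filled white disks on a black single-channel Mat at the detected circle positions and pass that as the mask. A sketch with the OpenCV Java bindings (this assumes OpenCV 3.x, where circle() lives in Imgproc; in 2.4 it is Core.circle):

import org.opencv.core.*;
import org.opencv.imgproc.Imgproc;
import java.util.Arrays;

public class CircleHistogram {
    // Histogram of the grayscale values inside the detected circles only.
    // 'circles' is the 1 x N output Mat of Imgproc.HoughCircles: each entry
    // holds (center x, center y, radius).
    public static Mat histogramInsideCircles(Mat gray, Mat circles) {
        Mat mask = Mat.zeros(gray.size(), CvType.CV_8UC1); // all-black mask
        for (int i = 0; i < circles.cols(); i++) {
            double[] c = circles.get(0, i);
            Imgproc.circle(mask, new Point(c[0], c[1]), (int) Math.round(c[2]),
                    new Scalar(255), -1); // thickness -1 = filled disk
        }
        Mat hist = new Mat();
        Imgproc.calcHist(Arrays.asList(gray), new MatOfInt(0), mask, hist,
                new MatOfInt(256), new MatOfFloat(0f, 256f));
        return hist;
    }
}

For option 1 you would instead build one single-circle mask per detection and call calcHist once per mask.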

How to detect the shape that touches the touch screen

I have received a requirement like this: http://www.youtube.com/watch?v=7MYQicokwmY&feature=plcp and I am reviewing it. As per the requirement, we have to build touch detection like in the video for Android-enabled tablets.
In that video, toys (with circular, star, or rectangular shapes) use conductive silicone sensors; through those sensors the tablet detects the touch on the screen, decides the shape of the external object (triangle, circle, or star), and processes that shape further.
I have to build the same touch detection for Android tablets. Can anybody help me find a way to implement this on the Android platform? Is there any API or framework for it?
If you watch the video around 1:13, they show what I am guessing are some prototypes: the circle has three points, the hexagon too...
My best guess is that the biggest part of the object is non-conductive and only has a few points that are conductive and would actually register as touch points on the screen. The key is that each object's points are different enough that you can recognize the object no matter what its orientation or position is (and, depending on your requirements, whether several of those objects are on the screen at the same time).
You can also play with the area of each conductive point: in your code you will get the touch information and can read different pressure values from the MotionEvent.
How you place the conductive points, and how many you put on each shape, is completely up to you and really depends on your requirements (recognizing an arbitrary shape is not an option...).
Most touch screens will reject a touch if the contact area is too large (that's palm rejection), so I don't think there are many other ways to do this...
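If you want to experiment with that, everything needed is already on the MotionEvent: the number of contact points, their coordinates, and per-point size/pressure. One hedged sketch of reducing the contact points to an orientation- and position-independent signature (the sorted list of pairwise distances), which you could compare against a stored signature per toy with some tolerance:

import android.view.MotionEvent;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class TouchSignature {
    // Pairwise distances between all current touch points, sorted ascending.
    // Sorting makes the signature independent of rotation, translation, and
    // of the order in which the pointers landed on the screen.
    public static List<Float> signature(MotionEvent event) {
        int n = event.getPointerCount();
        List<Float> distances = new ArrayList<>();
        for (int i = 0; i < n; i++) {
            for (int j = i + 1; j < n; j++) {
                float dx = event.getX(i) - event.getX(j);
                float dy = event.getY(i) - event.getY(j);
                distances.add((float) Math.hypot(dx, dy));
            }
        }
        Collections.sort(distances);
        return distances;
    }
}

If you also vary the pad sizes between toys, event.getSize(i) and event.getPressure(i) give the per-point contact values mentioned above and can be added to the signature.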

Detecting touch area on Android

Is it possible to detect every pixel being touched? More specifically, when the user touches the screen, is it possible to track all the x-y coordinates of the cluster of points touched by the user? How can I tell the difference between when users are drawing with their thumb and when they are drawing with the tip of a finger? I would like to reflect the brush difference depending on how users touch the screen, and would also like to track x-y coordinates of all the pixels being touched over time. Thanks so much in advance for any help.
This would be very tricky, primarily because every Android phone is going to behave differently. There are some touch screen devices that are very, very sensitive and some that are basically "dull" by comparison.
It also sounds like you want to track pressure - how hard the user is pushing on the screen - which is actually supported on Android devices.
I think some of your answer may be found by monitoring all of the touch events - in practice, most applications ignore a great number of events or perform some kind of "smoothing" of them, since there is literally a deluge of touch events while the user is manipulating the screen. Doing this may negatively impact your application's performance, though.
I would recommend that you look into pressure sensitivity and calculate a circular region around the primary touch point based on pressure, then build your brush around that.
Another idea would be to incorporate more of a gesture approach to what you are trying to do - for example, imagine touching the screen with the tips of two fingers together (index and middle) and rolling the middle finger around the index finger, or simply moving the middle finger up and down relative to the index finger, while both fingers move together for painting. This could be used to manipulate the drawing angle on the fly, toggle between a set of pre-selected brushes, or even change the brush size as you paint.
Some of the above ideas I would love to see implemented - let me know when you have your app ready.
Good luck!
Rodney
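A minimal sketch of the pressure-driven brush idea inside a custom drawing View (BASE_RADIUS and PRESSURE_SCALE are made-up constants you would tune, and brushPath is an assumed Path field rendered in onDraw(); pressure and size reporting vary a lot between devices, so calibrate per device):

@Override
public boolean onTouchEvent(MotionEvent event) {
    // Samples batched between frames arrive as "historical" points; using
    // them avoids gaps compared to reading only the latest sample.
    for (int h = 0; h < event.getHistorySize(); h++) {
        stamp(event.getHistoricalX(h), event.getHistoricalY(h),
              event.getHistoricalPressure(h));
    }
    stamp(event.getX(), event.getY(), event.getPressure());
    invalidate();
    return true;
}

private void stamp(float x, float y, float pressure) {
    // getPressure() is roughly 0..1 but device-dependent.
    float radius = BASE_RADIUS + pressure * PRESSURE_SCALE;
    brushPath.addCircle(x, y, radius, Path.Direction.CW); // drawn in onDraw()
}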
If you have a listener on your image, it will respond that there was a touch within that bounding box, basically.
So, to get what you want, you could (though I would never do this) create a box around every pixel, or small group of pixels, and listen for a touch on each.
Wherever you get a touch, an event will fire, and you can react accordingly.
I can't think of any other solution that will give you every pixel a person touched at one time.
You may want to read up on multitouch though, as there are some suggestions in here that may help you:
http://android-developers.blogspot.com/2010/06/making-sense-of-multitouch.html
If you're looking for a way to get your content view as a View after Activity#setContentView(int), you can set an id on the outer-most element of your layout:
android:id="@+id/entire_view" and reference it in your onCreate() method after setContentView:
View view = findViewById(R.id.entire_view);
view.setOnTouchListener( ... );

Comparing gestures/images in Android

What would be the best way to compare a gesture made on an Android device's screen with a stored gesture? For example, in my application I want it so that if I draw a triangle with my finger, the screen turns blue, and if I draw a circle, the screen turns red. How could that be done? The only thing I have been able to think of so far is to somehow generate an image file and then compare it to an image of a triangle or circle and check for similarities. But that wouldn't really account for different-sized or offset shapes. Any ideas on how this could be implemented? Thanks!
There is no need to compare/match the shape of a gesture with an image. The better way is to mathematically determine which of the recognized shapes the user drew. http://developer.android.com/resources/articles/gestures.html provides a great reference for implementing gestures.
HTH,
Akshay
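Concretely, the article linked above describes the android.gesture package: you record templates (for example with the Gesture Builder sample app), ship them as a raw resource, and let GestureLibrary score the user's stroke against them. A minimal sketch, assuming a gesture file at res/raw/gestures with entries named "triangle" and "circle", a layout containing a GestureOverlayView with id gestures, and a score cut-off you would tune:

import android.app.Activity;
import android.gesture.Gesture;
import android.gesture.GestureLibraries;
import android.gesture.GestureLibrary;
import android.gesture.GestureOverlayView;
import android.gesture.Prediction;
import android.graphics.Color;
import android.os.Bundle;
import java.util.List;

public class ShapeColorActivity extends Activity
        implements GestureOverlayView.OnGesturePerformedListener {

    private GestureLibrary library;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.main);

        // Load the pre-recorded gesture templates from res/raw/gestures.
        library = GestureLibraries.fromRawResource(this, R.raw.gestures);
        if (!library.load()) {
            finish(); // templates could not be read
            return;
        }
        GestureOverlayView overlay = (GestureOverlayView) findViewById(R.id.gestures);
        overlay.addOnGesturePerformedListener(this);
    }

    @Override
    public void onGesturePerformed(GestureOverlayView overlay, Gesture gesture) {
        // Predictions come back sorted best-first with a confidence score.
        List<Prediction> predictions = library.recognize(gesture);
        if (!predictions.isEmpty() && predictions.get(0).score > 2.0) { // tune cut-off
            Prediction best = predictions.get(0);
            if ("triangle".equals(best.name)) {
                overlay.setBackgroundColor(Color.BLUE);
            } else if ("circle".equals(best.name)) {
                overlay.setBackgroundColor(Color.RED);
            }
        }
    }
}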
