Is there a way to get every single touched pixel during a touch event? I understand that Android can provide an ellipse approximating the contact area, but I would like to access every single pixel that was touched, for research purposes.
I understand this is the same question as How to get the all the raw touch points at a particular instant for a tap?
Did something change since then to provide this data now on Android devices or possibly iOS?
Related
How to programmatically get touch coordinates on the screen in Android, regardless of open applications or games? The display of these coordinates can be turned on in the Developer options menu (see the example of the required coordinates). But how do I get access to them, and which class is responsible for this? Maybe there is some library or some way to find it?
I tried to use an Accessibility service, but it doesn't have this data.
I'm stuck on how to get the capacitance image out of an Android phone. Is it necessary to hack the Android system to get the capacitance image?
The released Android API only lets me get which point a finger is touching on the screen. But since the touchscreen uses capacitive sensing, I am wondering how I can get the raw capacitance sensing data from the system. I guess I may need to hack the system for further information. Has anyone done this before, or can you give me some hints or directions on where I should start?
I have a very creative requirement - I am not sure if this is feasible - but it would certainly spice up my app if it could.
Premise: On Android phones, if the screen is covered by a hand (not touching, just close to the screen), or if the phone is placed over the ear during a call, the phone locks or basically blacks out. So there must be some technology that recognizes that my hand is near the screen.
Problem: I have an image in my app. If the user points at the image without touching the screen, just as an extension of the premise, I must be able to know that the user is pointing at the image and change the image. Is this possible?
UPDATE: An example use:
Say I want to build a fun app where touching the image leads to some other place. For example, I have two doors: one to a car and one to a lion. Now, just as the user is about to touch door 1, the door should show a message saying "are you sure?", and then actually touching it takes you to another place. Kinda rudimentary example, but I hope you get the point.
The feature you are talking about is the proximity sensor. See Sensor and SensorEvent.values for Sensor.TYPE_PROXIMITY.
You could get the distance of the hand from the screen, but you won't really know where the hand is in the XY coordinate system. So you won't be able to figure out whether the user is pointing to the "car door" or to the "lion door".
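As a rough sketch of how you might interpret the readings: on a real device you would register a `SensorEventListener` for `Sensor.TYPE_PROXIMITY` and feed `event.values[0]` into something like the helper below (the class and method names here are my own, not Android API):

```java
// Hypothetical helper that interprets raw proximity readings.
// On Android you would pass in event.values[0] from a
// SensorEventListener registered for Sensor.TYPE_PROXIMITY;
// the thresholding logic itself is plain Java.
public class ProximityGate {
    private final float nearThresholdCm;

    public ProximityGate(float nearThresholdCm) {
        this.nearThresholdCm = nearThresholdCm;
    }

    // Many proximity sensors are effectively binary and report
    // either ~0 or their maximum range, so a simple threshold
    // is usually all you can do with them.
    public boolean isNear(float distanceCm) {
        return distanceCm < nearThresholdCm;
    }
}
```

Note that this only tells you "something is near the sensor", which sits next to the earpiece, not near any particular part of the screen.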
You could make this work on a phone with a wide-angle front camera that can see the whole screen. You'd have to write the software for recognizing hand movements and translate those into screen actions.
Why not just use touch, if I may ask?
I need users to be able to draw and type on a vector image. The user should also be able to resize and move paths/texts, so the "elements" need to be able to receive click/touch events.
The problem is that I don't know how I should draw them on screen to be able to handle the events and save the result. Canvas doesn't seem to let me do that.
Anyone that can lead me to the right track?
The only thing I can think of is a WebView, developing the whole thing using JS and SVG. But that doesn't feel right...
I will use an API level of at least 11, but up to 13 is okay.
You could have one big AndroidGraphics stretched across the screen, catch all the touchDown events, which give you coordinates, and decide for yourself what to do with them.
Try the libGDX framework, which has already implemented most of the details for you (plus it runs on GL ES).
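To illustrate the "decide for yourself what to do with them" step, here is a minimal plain-Java sketch of routing a touchDown coordinate to an element; the class names are my own invention, not libGDX API, but the same hit-testing loop works inside a libGDX `InputProcessor.touchDown` callback:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical hit-testing sketch: each drawable element keeps its
// bounding rectangle, and one touch handler walks the list to find
// which element (if any) was hit.
public class TouchRouter {
    public static class Element {
        final String name;
        final float x, y, width, height;

        public Element(String name, float x, float y, float width, float height) {
            this.name = name;
            this.x = x; this.y = y; this.width = width; this.height = height;
        }

        boolean contains(float px, float py) {
            return px >= x && px < x + width && py >= y && py < y + height;
        }
    }

    private final List<Element> elements = new ArrayList<>();

    public void add(Element e) { elements.add(e); }

    // Returns the topmost element under the touch, or null.
    // Iterate back-to-front so the element drawn last wins.
    public Element touchDown(float px, float py) {
        for (int i = elements.size() - 1; i >= 0; i--) {
            Element e = elements.get(i);
            if (e.contains(px, py)) return e;
        }
        return null;
    }
}
```

Once you know which path/text was hit, resizing and moving it is just updating its stored geometry and redrawing.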
OK, I'm going to ask this because I really cannot find an answer, after scouring for hours.
I'm making an Android game, and I have a gameboard 'view'. My gameboard view has a few things: a background image, and then a grid of board pieces on top of it.
I want the user to be able to double tap to zoom in, and then pan around the board.
I've been able to apply a transform in the past to get this effect (on other platforms), but I'm not sure how to do this in Android (I'm still new). Does anyone know how to apply transforms or achieve the above effect, WITHOUT me having to calculate the position of everything in my view?
Please don't suggest Canvas - I want my game pieces to be objects, and I want to be able to listen for 'on touch' events (AKA I do not want to have to figure out what the user touched when they touch the screen). I also want this to work on Android 2.1+.
Thanks for any help ahead of time!!
Unfortunately, doing this with views does require recalculating sizes and seems very tedious. I'm sure you know that Canvas has various matrix transformations you can apply.
If this functionality is important to you, I would think that figuring out how to have your Canvas objects receive touch events is the easier of the two paths, given that most games using the Canvas will also be receiving touch events and correlating them with the underlying game models.