Problems with touch events - Android

Speaking of events, we know that onTouchEvent(MotionEvent event) lets you capture touch events on the screen, and that event.getX() and event.getY() give the touch coordinates as float values (i.e. values with a decimal point).
Watching logcat, I noticed that even when I hold my finger perfectly still on the screen, not only is MotionEvent.ACTION_DOWN reported but also MotionEvent.ACTION_MOVE, and the returned coordinates keep changing. I can understand this: the surface of the finger touches many points, and as the device reads more coordinates it believes the finger is moving even while I hold it still.
My problem is that I want to read the color of a single pixel of a color image while holding my finger still; in particular I need the pixel at the center of the area covered by the finger.
First, assuming I have a Bitmap object, I have to round the touch coordinates to integers, because getPixel(int x, int y) wants integer coordinates inside the Bitmap in order to return the pixel color.
Then, how do I get the central point of the touch? Is there a function that can help me? Keep in mind that after holding still and reading the value of a pixel of the image, I then start to move slowly, and on every move I still want to track the central pixel. I need this because on every move, if the pixel color of the image changes, I want the device to vibrate (or not).
While holding still I could store the coordinates of the different touched pixels in a dynamic array and look for their center of gravity, but while moving I don't know what to do. I also think that the center-of-gravity search, if not done very efficiently, could be slow during the move and then give wrong results. If not the center of gravity, any other point belonging to the touched area would be enough for me.
I hope you can help me, maybe with a few sketched lines of code.
Thanks in advance.

To get the touch position of the finger, just cast getX() and getY() to int and perform your desired task:
public boolean onTouchEvent(MotionEvent event) {
    if (event.getAction() == MotionEvent.ACTION_DOWN) {
        getPixelAt((int) event.getX(), (int) event.getY());
    }
    return true; // onTouchEvent must return a boolean
}
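Since the question explicitly asks for a few sketched lines of code: note that the X/Y a MotionEvent reports for a pointer is already the device's single representative point for that contact (roughly its center), and getTouchMajor()/getSize() describe the size of the contact area, so computing a centroid by hand is usually unnecessary. A rough sketch of the pixel-reading and vibration idea, assuming it lives in a custom View that draws an unscaled Bitmap (mBitmap and lastColor are hypothetical fields), might look like this:
private Bitmap mBitmap;    // hypothetical field: the color image being probed
private int lastColor;     // hypothetical field: last pixel color seen

@Override
public boolean onTouchEvent(MotionEvent event) {
    int action = event.getActionMasked();
    if (action == MotionEvent.ACTION_DOWN || action == MotionEvent.ACTION_MOVE) {
        // Round the float coordinates and clamp them to the bitmap bounds
        int x = Math.max(0, Math.min(mBitmap.getWidth() - 1, Math.round(event.getX())));
        int y = Math.max(0, Math.min(mBitmap.getHeight() - 1, Math.round(event.getY())));

        int color = mBitmap.getPixel(x, y);
        if (color != lastColor) {
            lastColor = color;
            // Vibrate briefly when the color under the finger changes
            Vibrator vibrator =
                    (Vibrator) getContext().getSystemService(Context.VIBRATOR_SERVICE);
            if (vibrator != null) {
                vibrator.vibrate(50); // simple one-shot vibration
            }
        }
        return true;
    }
    return super.onTouchEvent(event);
}
The VIBRATE permission must be declared in the manifest for the vibration call to work, and if the bitmap is drawn scaled, the view coordinates have to be mapped to bitmap coordinates before calling getPixel().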

You can put this in your main activity class (or whichever class should receive the touch event). Check whether you have something like the following in place; the return value is important as well.
@Override
public boolean onTouchEvent(MotionEvent event) {
    if (event.getAction() == MotionEvent.ACTION_DOWN) {
        mDistanceX = 0;
    }
    return super.onTouchEvent(event);
}

Related

Camera rotation not rotating correctly on mobile

I'm making an FPS and I want the player to rotate the camera. My code works on PC, but on mobile, if I'm rotating the camera while also touching the fire button (or anywhere else with my other finger), the camera rotates to the right (where my fire button is). I don't know if I can do something about it, or whether I need to cancel the Android and iOS release and publish my game only for PC.
Part of my code:
if (CanProcessInput())
{
    // Check if this look input is coming from the mouse
    bool isGamepad = Input.GetAxis(stickInputName) != 0f;
    float i = isGamepad ? Input.GetAxis(stickInputName) : Input.GetAxisRaw(mouseInputName);

    // handle inverting vertical input
    if (InvertYAxis)
        i *= -1f;

    // apply sensitivity multiplier
    i *= LookSensitivity;

    if (isGamepad)
    {
        // since mouse input is already deltaTime-dependant, only scale input with frame time if it's coming from sticks
        i *= Time.deltaTime;
    }
    else
    {
        // reduce mouse input amount to be equivalent to stick movement
        i *= 0.01f;
#if UNITY_WEBGL
        // Mouse tends to be even more sensitive in WebGL due to mouse acceleration, so reduce it even more
        i *= WebglLookSensitivityMultiplier;
#endif
    }

    return i;
}
Segment the touch input so you ignore values that are generated in the area of the fire button. You can do this by checking the Touch.position value:
Description
The position of the touch in screen space pixel coordinates.
Position returns the current position of a touch contact as it's dragged. If you need the original position of the touch see Touch.rawPosition.
The documentation hints at what you might be interested in, too - the original touch position. Your question body is asking about horizontal motion, but your code is referencing y values. The position coordinates are given in (width, height), so I just wanted to be clear up front that I'll use x values to answer your question about horizontal motion.
So if you know that the bottom-left corner of the screen is (0,0) and that you can use Screen.width to get the width in pixels, then you could reserve the right 20% of the screen for the fire button. Anything less than that would be an acceptable input for rotation, with the 80%-of-width pixel being the maximum allowable input:
private int maxAllowablePixel;

void Start()
{
    // cast needed since 0.8f * Screen.width is a float
    maxAllowablePixel = (int)(0.8f * Screen.width);
}
and then only process the touch as a rotation if the original touch was less than that value:
if (touch.rawPosition.x < maxAllowablePixel)
{
    DoRotation(touch.position);
}
Here you are still allowing the user to put their finger down in the middle of the screen, drag it over the fire button, and keep rotating; but any touch that originates in the fire button's "exclusion zone" is ignored for the purposes of rotation.
Also when you do the rotation, do it where the finger is now (Touch.position) and not where it started (Touch.rawPosition).
The way I always solve an issue like this is to not use the touch positions directly, but to have a "TouchReceiver" Object in the Canvas with a script that implements IDragHandler. So the class might look like this:
[RequireComponent(typeof(Image))]
public class TouchReceiver : MonoBehaviour, IDragHandler
{
    public void OnDrag(PointerEventData data)
    {
        Vector2 delta = data.delta;
        // do something with delta, like rotating the camera.
    }
}
Then set up your canvas like so:
Make sure to put the "TouchReceiver" behind the fire button, that way the OnDrag event will not occur when pressing the fire button, because the fire button blocks the "TouchReceiver" object itself. The "TouchReceiver" object needs to have an Image component, otherwise it won't be able to receive any touches. Of course the image should then be made totally transparent.
With this setup it is pretty easy to move the fire button around without changing anything else. This might also come in handy when adapting to different screen resolutions.

Android: Rotate image by dragging a specific part of it

I have this ImageView:
I need to be able to rotate the whole image (the circle and the two "30" circles) when I drag the outside circles around. It should rotate only if I start dragging from the outside "30" circles.
They should all rotate around the dot in the center.
I managed to make this image rotate when I drag it in any part of it using a View.OnTouchListener and by calculating the angle between the touch event coordinates and the pivot point, and then rotate the imageView.
How can I detect when the motion event is in the outer circles only, to prevent the image from rotating when dragging inside the big circle?
In the onTouchEvent method (or the OnTouchListener's onTouch), you can use the ACTION_DOWN event to check whether a touch starts inside the little outer circles. If it does, set a boolean to true. In the ACTION_MOVE event, you just have to check the boolean and only rotate if it is set to true. Then use ACTION_UP and ACTION_CANCEL to set the boolean back to false.
private boolean rightPosition = false; // field, so it persists across touch events

@Override
public boolean onTouchEvent(MotionEvent event) {
    int x = (int) event.getX();
    int y = (int) event.getY();
    switch (event.getAction()) {
        case MotionEvent.ACTION_DOWN:
            // When the event is inside the little circles: rightPosition = true
            break;
        case MotionEvent.ACTION_MOVE:
            if (rightPosition) yourRotationCode();
            break;
        case MotionEvent.ACTION_UP:
        case MotionEvent.ACTION_CANCEL:
            rightPosition = false;
            break;
    }
    return true; // consume the gesture so the follow-up MOVE/UP events keep arriving
}
How do you know, if the drag starts in the little circles?
Tl;dr: Keep track of the polar coordinates of the little circles and compare them with the ACTION_DOWN event's coordinates.
To determine whether the ACTION_DOWN event occurs inside the little circles, keep track of the angle your ImageView is rotated from its 'zero position'. In the beginning, the little circles are at 45° and 125°. Offset them by the current rotation of your ImageView, get the angle between 12 o'clock and the ACTION_DOWN event, and compare the two. Also take into account the distance between the point of rotation and the event. As you already calculate the needed rotation, I think you know how to determine these angles.
The right distance from the point of rotation should be calculated from the ImageView's height and width, so you remain density independent. This needs some pixel counting and calculating of the ratio, based on the original image resource, done by hand. That ratio can then be hard-coded as a constant.
As the position of a TouchEvent is precise to one pixel, you should add some tolerance around the calculated center of the little circles. I would recommend a square around each little circle. Its side length again has to be calculated relative to the ImageView's height and width; you can determine the ratio of this 'valid area' to the size of the ImageView by hand and provide it as a constant. Then you just have to check whether the starting point's coordinates are within +/- (side length / 2) of the center of the little circles.
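A minimal sketch of that check, where CIRCLE_ANGLE_DEG, RADIUS_RATIO, and HIT_RATIO are hypothetical constants describing where one little circle sits on the unrotated image and how large its valid touch square is (both expressed relative to the view size), might look like this:
// Hypothetical constants: base angle of one little circle on the unrotated image,
// its distance from the center, and the half-size of its valid touch square,
// the latter two expressed as ratios of the view width.
private static final float CIRCLE_ANGLE_DEG = 45f;
private static final float RADIUS_RATIO = 0.45f;
private static final float HIT_RATIO = 0.08f;

private boolean isInLittleCircle(View v, float touchX, float touchY, float currentRotationDeg) {
    float cx = v.getWidth() / 2f;   // pivot, assumed to be the view center
    float cy = v.getHeight() / 2f;

    // Current on-screen angle of the little circle, offset by the view's rotation
    // (0° at 12 o'clock, increasing clockwise)
    double angleRad = Math.toRadians(CIRCLE_ANGLE_DEG + currentRotationDeg);
    float radius = RADIUS_RATIO * v.getWidth();

    // Center of the little circle in view coordinates
    float circleX = cx + (float) (radius * Math.sin(angleRad));
    float circleY = cy - (float) (radius * Math.cos(angleRad));

    // Accept touches inside a square of half-side HIT_RATIO * width around that center
    float halfSide = HIT_RATIO * v.getWidth();
    return Math.abs(touchX - circleX) <= halfSide
            && Math.abs(touchY - circleY) <= halfSide;
}
In ACTION_DOWN you would call this once per little circle (with each circle's base angle) and set rightPosition accordingly.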

Dragging a sprite in Android

I'm simply trying to drag a sprite around using my finger. I'm doing this by detecting how far the finger touching the screen has moved and then moving the sprite by the same amount.
This is an OpenGL ES 2.0 app, so my rendering and logic updates take place on my GL rendering thread, and obviously touch events are captured on the UI thread.
So, my setup is something like this:
UI Thread
case MotionEvent.ACTION_MOVE: {
    movedByX = event.getX() - oldXPosition;
    movedByY = event.getY() - oldYPosition;
    oldXPosition = event.getX();
    oldYPosition = event.getY();
    break;
}
GL Rendering thread
Rendering
@Override
public void render() {
    drawSprite(testSprite); // Draws using the sprite's internal x and y coordinates
}
logic update
@Override
public void updateLogic() {
    testSprite.x += movedByX; // Update the sprite's X position
    testSprite.y += movedByY; // Update the sprite's Y position
}
The issue
If one drags the sprite around the screen for a while and then stops, the resting point of the finger relative to the sprite is not the same as it was when the finger initially went down. Take for example a circular sprite like so: the blue circle is the sprite and the red dot represents the finger/pointer.
So as you can see, it doesn't quite work as expected and I'm not sure why.
I had a similar question previously, in which I 'worked around' the problem by grabbing the X and Y in ACTION_MOVE (onTouchEvent / UI thread) as I am above, but then, in my updateLogic method, making a copy of it and working out the 'movedByY' amount there before applying it to my sprite's position.
Doing this effectively solved the problem of the finger 'wandering', but it makes the movement of the sprite very choppy, so I can't use this solution.
I think this choppiness may be because the rendering thread sometimes runs twice without the UI thread running; therefore, even though the finger has moved, the logic is still using the values it has, because onTouch hasn't been able to capture the actual, most up-to-date finger position. But I'm not 100% sure.
If I simply update my sprite's position in my UI thread (again, in ACTION_MOVE), I again get very choppy movement, but the pointer position does remain correct.
So, I need to keep the smooth movement that I get from the method outlined at the top of the question, but I need to know why the moveBy amount is causing the sprite to wander from the finger.
Other Notes
I need to move the sprite using the difference between the finger's old and current positions, and not simply draw the sprite at the finger's current position, because this will eventually become part of a 'scrollable' menu system.
All of my variables are declared as private volatile floats, and my onTouchEvent and updateLogic methods are synchronized.
I don't know if it's a typo:
@Override
public void updateLogic() {
    testSprite.x+=movedByX; //Update the sprite's X position
    testSprite.x+=movedByY; //Update the sprite's Y position
}
but you set testSprite.x both times.
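Separately from the possible typo, the 'make a copy in updateLogic' approach described in the question can be kept smooth by accumulating the deltas on the UI thread and consuming (zeroing) them in updateLogic, so a delta is neither lost when several touch events arrive between logic updates nor applied twice when several logic updates run between touch events. A rough sketch along those lines, reusing the question's field names where possible (testSprite, oldXPosition, oldYPosition are assumed fields):
private final Object touchLock = new Object();
private float pendingDx, pendingDy;      // deltas accumulated by the UI thread
private float oldXPosition, oldYPosition;

// UI thread
@Override
public boolean onTouchEvent(MotionEvent event) {
    switch (event.getAction()) {
        case MotionEvent.ACTION_DOWN:
            oldXPosition = event.getX();
            oldYPosition = event.getY();
            return true;
        case MotionEvent.ACTION_MOVE:
            synchronized (touchLock) {
                // Accumulate instead of overwrite, so nothing is lost if
                // several touch events arrive between two logic updates
                pendingDx += event.getX() - oldXPosition;
                pendingDy += event.getY() - oldYPosition;
            }
            oldXPosition = event.getX();
            oldYPosition = event.getY();
            return true;
    }
    return super.onTouchEvent(event);
}

// GL rendering thread
public void updateLogic() {
    float dx, dy;
    synchronized (touchLock) {
        dx = pendingDx;
        dy = pendingDy;
        pendingDx = 0f;   // consume, so the same delta is never applied twice
        pendingDy = 0f;
    }
    testSprite.x += dx;
    testSprite.y += dy;
}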

Circle in canvas taking entire parent space

My parent view is a RelativeLayout, and I added a triangle to it by extending the View class. I draw the triangle using Canvas and Paint. The problem I am facing is that when I touch the RelativeLayout, both touch listeners, the RelativeLayout's and the triangle's, are triggered. I just want the triangle to take exactly the space it requires. How can I limit my custom view (the triangle in this case) to take only its own space, rather than occupying the entire parent layout?
Actually my requirement is this: I have one RelativeLayout, and to that layout I am adding some custom views dynamically. Those have touch listeners for dragging points of the triangle, but they are taking up the full screen of my parent view; because of that I am unable to trigger touch listeners separately for the parent and child views.
I've spent countless hours developing solutions for this problem and I just couldn't get my head around it. Any help would be greatly appreciated.
From what I gather, you have a triangle drawn in a CustomView inside a RelativeLayout. The actual View size is FILL_PARENT for both width and height, so the square representing the touchable area of the triangle is the same as the square representing the touchable area of the parent RelativeLayout.
You would like touch events inside the triangle's drawn area to be received by the triangle View and anything outside to be received by the parent. It is not possible to define a custom touch area from what I understand.
That doesn't stop you manipulating drawing bounds and touch focus to accomplish what you want though.
How it can be done
Ensure that your CustomView with the triangle always receives the touch event. When it receives the touch event, it performs basic maths to figure out whether the touch was within your triangle area (you must have the triangle bounds you use to draw the triangle saved as member variables of the CustomView).
Then a simple conditional statement determines whether to act on and consume the event or pass it on (do you even want a TouchEvent on your RelativeLayout to do anything? If not, that makes this a lot easier).
Example code skeleton:
@Override
public boolean onTouchEvent(MotionEvent event) {
    boolean inTriangle = false;
    float x = event.getX();
    float y = event.getY();

    // This really depends on what behaviour you want: if a drag extends outside the
    // triangle bounds, do you want this to end the touch event? This code could be done
    // more efficiently depending on your choices / needs.

    // Do maths to figure out if the point is actually inside the triangle. Once again,
    // many solutions, and some much easier than others depending on the type of triangle.

    // This flow will only register events in the triangle, and do nothing when outside.
    if (event.getAction() == MotionEvent.ACTION_DOWN) {
        if (inTriangle) {
            // Do whatever it is you want to do
            return true; // to say you consumed the event if necessary
        } else {
            return false; // parent RelativeLayout should now get the touch event
        }
    } else if (event.getAction() == MotionEvent.ACTION_MOVE) {
        if (inTriangle) {
            // Do whatever it is you want
            return true;
        } else {
            // Do nothing
            return true; // so it is still consumed
        }
    }
    return true;
}
To be honest, the question is way too open-ended / unclear to give anything more specific, but this is how you would accomplish what you asked. Just apply it to your specific scenario.
Note: I have encountered issues with TouchEvent passing, so if you really do want to pass them to specific places, you may have to investigate toying with View.dispatchTouchEvent() if the order your Touches are being processed becomes an issue.
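For the 'do maths' part, one standard point-in-triangle test checks the sign of three cross products: the point is inside when it lies on the same side of all three edges. A minimal sketch, assuming the triangle's vertices are kept as fields (ax, ay ... cx, cy) in the same coordinate space as the touch event:
// Triangle vertices in view coordinates (assumed to be kept in sync
// with whatever you use to draw the triangle)
private float ax, ay, bx, by, cx, cy;

private float sign(float px, float py, float qx, float qy, float rx, float ry) {
    // z-component of the cross product (q - p) x (r - p)
    return (qx - px) * (ry - py) - (rx - px) * (qy - py);
}

private boolean isInTriangle(float x, float y) {
    float d1 = sign(x, y, ax, ay, bx, by);
    float d2 = sign(x, y, bx, by, cx, cy);
    float d3 = sign(x, y, cx, cy, ax, ay);

    boolean hasNeg = (d1 < 0) || (d2 < 0) || (d3 < 0);
    boolean hasPos = (d1 > 0) || (d2 > 0) || (d3 > 0);

    // Inside (or on an edge) when the three signs do not disagree
    return !(hasNeg && hasPos);
}
The inTriangle flag in the skeleton above could then be set with isInTriangle(x, y).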
You have an issue handling the touch events of the triangle because you are getting the touch on both views, the RelativeLayout and your custom view.
To solve this you have to make some changes in the custom view's onTouchEvent.
If you are touching the custom view and you want to handle the event, then onTouchEvent has to return true instead of false, so that your parent view will not get the touch events.

ScaleGestureDetector triggers onScaleEvent with single finger touch

I have code where onTouchEvent passes the touch event to a member ScaleGestureDetector, and if the detector triggers onScale on my scale listener, I scale up the displayed item. If onScale is not triggered, I treat the event as a potential drag and move the displayed item around. I have observed that on certain devices the ScaleGestureDetector triggers an onScale event even when there is just a single finger touching the screen. Is this expected? What is the recommended mechanism to deal with scales and drags at the same time?
If I'm correct, you're calling mScaleGestureDetector.onTouchEvent(mEvent) with a single pointer down (mEvent.getPointerCount() == 1). It is supposed to be called only when more than one pointer is down. Make sure you only forward the event when the pointer count is greater than 1, e.g.:
if (pointersCount > 1) {
    TOUCH_GESTURE_MODE = GESTURE_MODE_SCALE;
    mScaleGestureDetector.onTouchEvent(mEvent);
    return true;
}
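Following that suggestion, a rough sketch of one way to handle scale and drag together, feeding the detector only when more than one pointer is down and treating single-pointer moves as a drag, might look like this (mScaleGestureDetector, lastX, lastY, and moveDisplayedItemBy() are assumed to exist on the view):
@Override
public boolean onTouchEvent(MotionEvent event) {
    if (event.getPointerCount() > 1) {
        // Two or more fingers down: let the ScaleGestureDetector handle scaling
        mScaleGestureDetector.onTouchEvent(event);
        return true;
    }

    // Single finger: treat the gesture as a drag
    switch (event.getActionMasked()) {
        case MotionEvent.ACTION_DOWN:
            lastX = event.getX();
            lastY = event.getY();
            return true;
        case MotionEvent.ACTION_MOVE: {
            float dx = event.getX() - lastX;
            float dy = event.getY() - lastY;
            lastX = event.getX();
            lastY = event.getY();
            moveDisplayedItemBy(dx, dy); // hypothetical helper that moves the item
            return true;
        }
        default:
            return super.onTouchEvent(event);
    }
}
When the second finger lifts, the next single-pointer ACTION_MOVE would use stale lastX/lastY values, so in practice you may also want to reset them on ACTION_POINTER_UP.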
