I used a touch event to find the position touched by the user, with the following code:
@Override
public boolean onTouch(View v, MotionEvent event) {
    switch (event.getAction()) {
        case MotionEvent.ACTION_DOWN:
            float x = event.getX();
            float y = event.getY();
            Log.d("MULTIPLE", "X=" + x + " Y=" + y);
            break;
    }
    return true;
}
Depending on the position, certain methods are called.
When I tested on different devices, the x and y values changed. Is there any way to get a consistent value for x and y across all devices?
I solved this in the following way.
The midpoint of the screen can be found like this:
X = deviceWidth * 0.5
Y = deviceHeight * 0.5
Call 0.5 the widthFactor and heightFactor. Every other point on the screen has its own widthFactor and heightFactor.
We get X and Y from the onTouch event (event.getX() and event.getY()), so to find widthFactor and heightFactor:
widthFactor = X / deviceWidth
heightFactor = Y / deviceHeight
Thus we get a widthFactor and heightFactor for each point, which will be the same on every device.
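This normalization can be sketched in plain Java (the screen sizes and touch coordinates below are made-up example values, not part of the answer):

```java
public class TouchNormalizer {
    // Convert an absolute touch coordinate into a device-independent
    // factor in [0, 1] by dividing by the screen dimension.
    static float widthFactor(float touchX, float deviceWidth) {
        return touchX / deviceWidth;
    }

    static float heightFactor(float touchY, float deviceHeight) {
        return touchY / deviceHeight;
    }

    public static void main(String[] args) {
        // A touch at the center of a 1080x1920 screen...
        float wf = widthFactor(540f, 1080f);   // 0.5
        float hf = heightFactor(960f, 1920f);  // 0.5
        // ...maps back to the center of a different 720x1280 screen.
        System.out.println((wf * 720f) + ", " + (hf * 1280f)); // 360.0, 640.0
    }
}
```

Multiplying the factors by another device's dimensions recovers the equivalent point on that device.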
I'm not sure what you really mean by "in different devices the x and y values changed", because if a user touches the screen, the values will change even on the same device. You might have a look at the getRawX() and getRawY() methods. These values are not adjusted for the containing window and view.
The Android developer documentation provides a training guide on handling touch events; you can find it there.
Related
I'm trying to make a View's x value move along with a finger as it drags across the screen. Although the view's movement is smooth, it only moves about a third of the distance that the finger does. The View in my case happens to be a RecyclerView, but I think this is irrelevant to the problem. What am I doing wrong in the following implementation?
view.setOnTouchListener((v, event) -> {
    switch (event.getActionMasked()) {
        case MotionEvent.ACTION_UP:
            view.animate().translationX(0).setDuration(200);
            break;
        case MotionEvent.ACTION_MOVE:
            if (event.getHistorySize() < 1) break;
            final float latestX = event.getX(),
                    secondLatestX = event.getHistoricalX(event.getHistorySize() - 1),
                    firstX = event.getHistoricalX(0),
                    secondX = (event.getHistorySize() > 1) ?
                            event.getHistoricalX(1) : latestX;
            final float firstY = event.getHistoricalY(0),
                    secondY = (event.getHistorySize() > 1) ?
                            event.getHistoricalY(1) : event.getY();
            // if initial change x is greater than y
            if (Math.abs(secondX - firstX) > Math.abs(secondY - firstY)) {
                view.setX(messageList.getX() + (latestX - secondLatestX));
                return true;
            }
            break;
    }
    return false;
});
If the code needs some explanation:
firstX is the first x value that the finger touched on the screen
secondX is the second x value that the finger touched on the screen (as in, the next time onTouch was fired with ACTION_MOVE)
latestX is the most recent x value of the finger
secondLatestX is the second most recent x value of the finger
All y variables are the same as their x counterparts, but in the y direction.
Am I calculating the variables incorrectly? I cannot seem to find a logic issue anywhere.
Since I could not find any way to do this, I just decided to create my own library from the ground up. Here it is:
https://github.com/GregoryConrad/SlideDetector
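One well-known pitfall in this kind of drag code is that event.getX() is relative to the view itself, so moving the view changes the readings the next delta is based on; tracking deltas in screen (raw) coordinates avoids that feedback. Here is a minimal plain-Java sketch of the delta accumulation (DragTracker and the sample coordinates are illustrative assumptions, not code from the question or the library above):

```java
public class DragTracker {
    private float lastRawX;     // last screen-space x, unaffected by the view moving
    private float translation;  // accumulated drag distance

    void onDown(float rawX) {
        lastRawX = rawX;
        translation = 0f;
    }

    // Accumulate the screen-space delta and return the new translation
    // to apply to the view.
    float onMove(float rawX) {
        translation += rawX - lastRawX;
        lastRawX = rawX;
        return translation;
    }

    public static void main(String[] args) {
        DragTracker t = new DragTracker();
        t.onDown(100f);
        System.out.println(t.onMove(130f)); // 30.0
        System.out.println(t.onMove(170f)); // 70.0 -- tracks the finger 1:1
    }
}
```

In an Android listener, the raw values would come from event.getRawX() rather than event.getX().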
From your use case, what I understand is that you want to move the view based on your touch.
Check out my project ZoomIt: long-press and move the square in an infinite view.
Hey, I am using this code to imitate a knob view in my app:
- (void) setupGestureRecognizer
{
CGPoint midPoint = CGPointMake(image.frame.origin.x + image.frame.size.width / 2,
image.frame.origin.y + image.frame.size.height / 2);
CGFloat outRadius = image.frame.size.width / 2;
gestureRecognizer = [[OneFingerRotationGestureRecognizer alloc] initWithMidPoint: midPoint
innerRadius: outRadius / 10
outerRadius: outRadius *2
target: self];
[self.view addGestureRecognizer: gestureRecognizer];
}
Like this, the gestureRecognizer handles all the events that happen on or very close to the button. What I want is the following:
gestureRecognizer only gets triggered when user touches inside
the image
if finger leaves the image, gestureRecognizer should continue
listening (and calculating the angle)
On Android I am doing it like the following:
public boolean onScroll(MotionEvent e1, MotionEvent e2, float distanceX, float distanceY) {
float x = e2.getX() / ((float) getWidth());
float y = e2.getY() / ((float) getHeight());
float rotDegrees = cartesianToPolar(1 - x, 1 - y);
[...doing maths stuff here]
I have all the rotation logic working, but how do you make the gestureRecognizer behave the way touch events can be handled on Android? If I had no other choice, I would just use two different gesture recognizers: one handling the initial press and one following the finger everywhere, updating the corresponding knob according to a key value set by the first recognizer. But that looks like a massive pile of bad code to me, so I'd appreciate some advice on this one.
Cheers, Alex
You should be able to use the delegate method gestureRecognizer:shouldReceiveTouch: to examine the location of the touch and return YES only if the touch point is within the image view's bounds (using CGRectContainsPoint). If the gesture recognizer is added to the image view's superview, it should continue "listening" like you want.
-(BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch {
CGPoint touchPoint = [touch locationInView:self.view];
return CGRectContainsPoint(self.imageView.frame, touchPoint);
}
Also be sure to set the controller as the delegate of the gesture recognizer.
There are a lot of discussions of how MotionEvent.getX/.getY are "unreliable" (or other terms) and that we should use the Raw versions of these calls to get coordinates.
On my Nexus 7, I have discovered that .getX/.getY are reliably returning interleaved absolute and relative coordinates. In other words, say a given ACTION_MOVE event returns absolute coordinates when you call .getX and .getY. The next ACTION_MOVE event will then return relative coordinates on its .getX and .getY calls.
This cannot be accidental behavior. It also leads me to believe there must be a way to discern whether a given ACTION_MOVE will be returning absolute or relative coordinates.
Does anyone know how to check a given MotionEvent object to see if it is returning absolute vs. relative coordinates on its .getX and .getY calls?
EDIT: Per your request, here's the code. It's nothing special, just grab the coordinates and move the View object:
public boolean onTouch(View v, MotionEvent event) {
    boolean bExitValue = true;
    float fX;
    float fY;
    int iAction;
    iAction = event.getActionMasked();
    if (MotionEvent.ACTION_MOVE == iAction) {
        fX = event.getX();
        fY = event.getY();
        v.setX(fX);
        v.setY(fY);
        Log.d("", ("X: " + fX + ", Y: " + fY));
    }
    else if (MotionEvent.ACTION_DOWN != iAction) {
        bExitValue = false;
    }
    return (bExitValue);
}
The Log.d call and standalone floats aren't necessary to make the code work, but they do allow you to see the interleaving of values in the LogCat window.
I have found that on the Galaxy S4, getY and getRawY are both wrong, but they deviate in orthogonal ways, so you can get the right value with the following code:
rawY = event.getRawY() - spaceOverLayout;
normalY = event.getY();
y = 0F;
float prozentPosition = ((rawY + normalY) / 2) / height;
y = (normalY * (1 - prozentPosition)) + (rawY * prozentPosition);
Hopefully this helps.
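The blend above can be expressed as a small plain-Java helper; percentPosition mirrors the answer's prozentPosition, and the spaceOverLayout and height values in main are made-up examples:

```java
public class GalaxyYCorrection {
    // Blend the view-relative y and the raw y, weighting by how far down
    // the screen the touch is (the formula from the answer above).
    static float correctedY(float rawYOnScreen, float spaceOverLayout,
                            float normalY, float height) {
        float rawY = rawYOnScreen - spaceOverLayout;
        float percentPosition = ((rawY + normalY) / 2) / height;
        return (normalY * (1 - percentPosition)) + (rawY * percentPosition);
    }

    public static void main(String[] args) {
        // When both readings agree, the blend returns that same value.
        System.out.println(correctedY(500f, 0f, 500f, 1000f)); // 500.0
        // When they disagree, the result shifts toward rawY lower on the screen.
        System.out.println(correctedY(800f, 0f, 600f, 1000f)); // ~740
    }
}
```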
I'm creating a simple OpenGL 'app' to rotate a triangle. I wish, on the first touch, to save the angle the touch position corresponds to. Then, on motion, rotate the shape by the angle corresponding to current position minus angle of first touch.
It was my understanding that the first step should be done in MotionEvent.ACTION_DOWN, and the second in MotionEvent.ACTION_MOVE. However, it seems as if ACTION_DOWN is being called during the motion. That is, the below code causes the shape to rotate as a finger is dragged (and I understood that it would rotate only to the position of the initial touch):
private double mTheta;

@Override
public boolean onTouchEvent(MotionEvent e) {
    super.onTouchEvent(e);
    float x = e.getX();
    float y = e.getY();
    switch (e.getAction()) {
        case MotionEvent.ACTION_DOWN:
            x -= getWidth() / 2;
            y -= getHeight() / 2;
            mTheta = Math.atan2(-x, -y) * 180.0f / Math.PI;
            GL20Renderer.mAngle = (float) mTheta;
            requestRender();
    }
    return true;
}
Is my code wrong, or is this some weird behaviour of the emulator? (I don't currently have access to an android device.)
(Addendum: I originally attempted to implement the above fully, with a MotionEvent.ACTION_MOVE case for calculating the new angle and rendering; the ACTION_DOWN case only saved the starting offset angle. This didn't work, in that the shape didn't rotate, because the offset angle was being recalculated during movement, which is how I ended up at this point.)
It might be that you forgot to put a break statement in your switch/case, so once ACTION_MOVE is handled, ACTION_DOWN follows immediately after.
Needed to be using getActionMasked() rather than getAction(). See comment from WarrenFaith.
I'm currently working on my own basic drawing app. So far, my app is working well, but I noticed that my motion event doesn't seem to get all the X and Y points that are touched.
When I move my finger across the screen, there are noticeable spaces between the circles. Only when I move my finger slowly does it capture all the points. Is there a way I can grab all the points, or is there a way I can optimize it to handle all the points?
Here's how I'm doing it:
setOnTouchListener(new View.OnTouchListener() {
public boolean onTouch(View v, MotionEvent event) {
int x;
int y;
switch (event.getAction())
{
case MotionEvent.ACTION_DOWN:
{
x = (int) event.getX();
y = (int) event.getY();
//method draws circle at x and y coordinate
MyPainter mp = new MyPainter(x,y);
break;
}
case MotionEvent.ACTION_MOVE:
{
x = (int) event.getX();
y = (int) event.getY();
MyPainter mp = new MyPainter(x,y);
break;
}
}
return true;
}
});
Any suggestions or comments are appreciated. Thanks.
This is just the way the hardware is designed. The hardware samples the touchscreen N times per second, and spits out the coordinates every time it senses a finger.
If you move faster than the screen samples, you'll have gaps in the reported data. The only thing you can do is generate the missing points yourself by interpolating along the line between the two touches.
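A minimal plain-Java sketch of that interpolation (the point representation and step count are assumptions, not part of the answer):

```java
import java.util.ArrayList;
import java.util.List;

public class TouchInterpolator {
    // Generate evenly spaced points along the segment between two touch
    // samples, so gaps between fast-moving samples can be filled in.
    static List<float[]> interpolate(float x1, float y1,
                                     float x2, float y2, int steps) {
        List<float[]> points = new ArrayList<>();
        for (int i = 1; i <= steps; i++) {
            float t = (float) i / steps;
            points.add(new float[] { x1 + (x2 - x1) * t, y1 + (y2 - y1) * t });
        }
        return points;
    }

    public static void main(String[] args) {
        // Two consecutive ACTION_MOVE samples 40px apart, filled with 4 points.
        for (float[] p : interpolate(0f, 0f, 40f, 40f, 4)) {
            // In the drawing app, you would draw a circle at each point,
            // e.g. new MyPainter((int) p[0], (int) p[1]).
            System.out.println(p[0] + ", " + p[1]);
        }
    }
}
```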
The accepted answer isn't strictly correct: it is possible to access touch data from the gaps using the following methods:
event.getHistorySize(); // returns the number of batched samples as an int
event.getHistoricalX(int);
event.getHistoricalY(int);
Other related methods are available when there are multiple pointers (see the documentation).
A good example of how this can be used in practice can be found here.
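The usual pattern is to visit the historical (older) samples before the current coordinates. Here is a plain-Java sketch where the batched samples are simulated with arrays; handleMove and the sample data are illustrative assumptions, not the MotionEvent API:

```java
public class HistoricalSamples {
    // Visit every sample carried by one batched move event: all
    // historical samples first (oldest to newest), then the current
    // coordinates last. Returns the total number of samples visited.
    static int handleMove(float[] histX, float[] histY,
                          float curX, float curY, StringBuilder log) {
        int visited = 0;
        for (int i = 0; i < histX.length; i++) {
            log.append(histX[i]).append(",").append(histY[i]).append(";");
            visited++;
        }
        log.append(curX).append(",").append(curY).append(";");
        return visited + 1;
    }

    public static void main(String[] args) {
        StringBuilder log = new StringBuilder();
        // One move event carrying two batched samples plus the current one.
        int n = handleMove(new float[] {10f, 20f}, new float[] {5f, 15f},
                           30f, 25f, log);
        System.out.println(n + " samples: " + log);
        // 3 samples: 10.0,5.0;20.0,15.0;30.0,25.0;
    }
}
```

With a real MotionEvent, the loop bound would be event.getHistorySize() and the values would come from getHistoricalX(i)/getHistoricalY(i), with getX()/getY() as the final sample.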