Android getX/getY interleaves relative/absolute coordinates

There are many discussions of how MotionEvent.getX/.getY are "unreliable" (or similar terms) and of how we should use the raw versions of these calls (getRawX/getRawY) to get coordinates.
On my Nexus 7, I have discovered that .getX/.getY are reliably returning interleaved absolute and relative coordinates. In other words, say a given ACTION_MOVE event returns absolute coordinates when you call .getX and .getY. The next ACTION_MOVE event will then return relative coordinates on its .getX and .getY calls.
This cannot be accidental behavior. It also leads me to believe there must be a way to discern whether a given ACTION_MOVE will be returning absolute or relative coordinates.
Does anyone know how to check a given MotionEvent object to see if it is returning absolute vs. relative coordinates on its .getX and .getY calls?
EDIT: Per your request, here's the code. It's nothing special, just grab the coordinates and move the View object:
public boolean onTouch(View v, MotionEvent event) {
    boolean bExitValue = true;
    float fX;
    float fY;
    int iAction;

    iAction = event.getActionMasked();
    if (MotionEvent.ACTION_MOVE == iAction) {
        fX = event.getX();
        fY = event.getY();
        v.setX(fX);
        v.setY(fY);
        Log.d("", ("X: " + fX + ", Y: " + fY));
    }
    else if (MotionEvent.ACTION_DOWN != iAction) {
        bExitValue = false;
    }
    return (bExitValue);
}
The Log.d call and standalone floats aren't necessary to make the code work, but they do allow you to see the interleaving of values in the LogCat window.
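For reference, getX()/getY() are relative to the view receiving the event, so repositioning the view from those same values shifts the frame of reference on every event. If the goal is a stable drag, one widely used pattern drives the position from getRawX()/getRawY() (screen coordinates, unaffected by the view moving) minus the finger's grab offset inside the view, captured on ACTION_DOWN. A minimal sketch of that math follows; `DragMath` and its method names are made up for illustration, and the actual Android calls are shown in the comments:

```java
// Sketch of drag math that stays stable while the View itself moves.
public class DragMath {
    // Offset of the finger inside the view, recorded once on ACTION_DOWN:
    //   grabOffsetX = event.getRawX() - v.getX();
    public static float grabOffset(float rawDown, float viewPos) {
        return rawDown - viewPos;
    }

    // New view position on each ACTION_MOVE:
    //   v.setX(DragMath.newViewPos(event.getRawX(), grabOffsetX));
    public static float newViewPos(float raw, float grabOffset) {
        return raw - grabOffset;
    }

    public static void main(String[] args) {
        // finger goes down at raw x=120 while the view's left edge is at 100
        float off = grabOffset(120f, 100f);
        // finger moves to raw x=150 -> the view should follow to 130
        System.out.println(newViewPos(150f, off)); // prints 130.0
    }
}
```

Because raw coordinates never change meaning mid-gesture, this avoids the alternating values seen in the log above.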

I have found that on the Galaxy S 4, getY() and getRawY() are both wrong, but their errors run in opposite directions, so you can recover the correct value by blending the two:
rawY = event.getRawY() - spaceOverLayout;
normalY = event.getY();
y = 0F;
// prozentPosition ("percent position"): fractional position of the touch down the view
float prozentPosition = ((rawY + normalY) / 2) / height;
// weight the two readings by how far down the view the touch is
y = (normalY * (1 - prozentPosition)) + (rawY * prozentPosition);
Hopefully it will help.

Related

OpenCV Android Comparing Finger Touch Position to Moments Position

I have an Android application that uses Native OpenCV Library to track objects in my camera view. I find the position of the objects using moments:
Moments moment = moments((Mat) contours[i]);
double area = moment.m00;
object.setXPos(moment.m10 / area);
object.setYPos(moment.m01 / area);
What I am trying to implement is a way to see if my finger touch Point is within a distance threshold to the object position. However, Android calculates my finger position based on touch location on screen, whereas the object's position is calculated by moments, which I believe is causing wacky results when I calculate the distance from the touch event to the object location. Is there any way to remedy this, or am I going about this the wrong way? Thanks in advance for your help!
Other possibly useful info:
@Override
public boolean onTouchEvent(MotionEvent event) {
    double x = event.getX();
    double y = event.getY();

    switch (event.getAction()) {
        case MotionEvent.ACTION_UP: {
            // JNI function. Converts x and y to Point(x, y) and compares its
            // distance to the tracked object locations.
            GetTouchedPoint(x, y);
        }
    }
    return false;
}
// Native function: p1 is the finger touch location, p2 is the object location (found by moments)
int ObjectDetector::distance(Point p1, Point p2) {
    int dx = p1.x - p2.x;
    int dy = p1.y - p2.y;
    int distance = (int) sqrt(pow(dx, 2) + pow(dy, 2));
    return distance;
}
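One plausible cause (an assumption about the setup, since the preview layout isn't shown) is that the touch arrives in view coordinates while the moments are computed in the coordinate space of the processed Mat, and the two differ in size. If the preview fills the view without letterboxing, the touch can be rescaled before the comparison. `TouchToFrame` below is a hypothetical helper sketching that rescale:

```java
// Hypothetical mapping from a screen touch to the coordinate space of the
// processed frame (Mat), assuming the preview fills the view with no letterboxing.
public class TouchToFrame {
    public static double toFrameX(double touchX, double viewWidth, double frameCols) {
        return touchX * frameCols / viewWidth;
    }

    public static double toFrameY(double touchY, double viewHeight, double frameRows) {
        return touchY * frameRows / viewHeight;
    }

    public static void main(String[] args) {
        // a 1080-px-wide view showing a 720-column frame:
        // a touch at x=540 corresponds to column 360 of the Mat
        System.out.println(toFrameX(540, 1080, 720)); // prints 360.0
    }
}
```

The converted point would then be passed to GetTouchedPoint() so that both inputs to the native distance check live in the same coordinate space.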

Get Position from OnTouchEvent

I used an onTouch event to find the position touched by the user. I used the following code:
@Override
public boolean onTouch(View v, MotionEvent event) {
    switch (event.getAction()) {
        case MotionEvent.ACTION_DOWN:
            float x = event.getX();
            float y = event.getY();
            Log.d("MULTIPLE", "X=" + x + " Y=" + y);
    }
    return true;
}
Depending on the position, certain methods are called.
When I tested on different devices, the x and y values changed. Is there any way to get a consistent value for x and y across all devices?
I solved this in the following way.
The midpoint of the device can be found this way:
X = deviceWidth * 0.5
Y = deviceHeight * 0.5
Call 0.5 the heightFactor and widthFactor. Any other point on the device has its own heightFactor and widthFactor. We get the X and Y values from the onTouch event (event.getX() and event.getY()), so to find heightFactor and widthFactor:
heightFactor = Y / deviceHeight
widthFactor = X / deviceWidth
Thus we get a heightFactor and widthFactor for each point, and these will be the same on every device.
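The factors above can be sketched as a small helper (the class and method names are illustrative):

```java
// Illustrative helper: normalized touch position, comparable across screen sizes.
public class NormalizedTouch {
    // fraction of the screen width, 0..1, from event.getX()
    public static float widthFactor(float x, float deviceWidth) {
        return x / deviceWidth;
    }

    // fraction of the screen height, 0..1, from event.getY()
    public static float heightFactor(float y, float deviceHeight) {
        return y / deviceHeight;
    }

    public static void main(String[] args) {
        // centre of a 480x800 screen -> (0.5, 0.5), the same on every device
        System.out.println(widthFactor(240f, 480f) + ", " + heightFactor(400f, 800f));
    }
}
```
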
I'm not sure what you really mean by "on different devices the x and y values changed", because if a user touches the screen, the values will change even on the same device. You might have a look at the getRawX() and getRawY() methods. These values are not adjusted for the containing window and view.
developer.android.com offers a training guide on handling touch events.

Getting All x and y coordinates from a motion event in android

I'm currently working on my own basic drawing app. So far, my app is working well, but I noticed that my motion event doesn't seem to get all the X and Y points that are touched.
When I move my finger across the screen, there are noticeable spaces between the circles; only when I move my finger slowly does it capture all the points. Is there a way I can grab all the points, or a way I can optimize it to handle all of them?
Here's how I'm doing it:
setOnTouchListener(new View.OnTouchListener() {
    public boolean onTouch(View v, MotionEvent event) {
        int x;
        int y;
        switch (event.getAction()) {
            case MotionEvent.ACTION_DOWN: {
                x = (int) event.getX();
                y = (int) event.getY();
                // method draws circle at x and y coordinate
                MyPainter mp = new MyPainter(x, y);
                break;
            }
            case MotionEvent.ACTION_MOVE: {
                x = (int) event.getX();
                y = (int) event.getY();
                MyPainter mp = new MyPainter(x, y);
                break;
            }
        }
        return true;
    }
});
Any Suggestions or comments are appreciated. Thanks
This is just the way the hardware is designed. The hardware samples the touchscreen N times per second, and spits out the coordinates every time it senses a finger.
If you move faster than the screen samples, you'll have gaps in the reported data. The only thing you can do is generate the missing points yourself by interpolating along the line between the two touches.
The accepted answer isn't strictly correct; it is possible to access the touch data from the gaps using the following methods:
event.getHistorySize(); // returns the number of batched samples as an int
event.getHistoricalX(int);
event.getHistoricalY(int);
Other related methods are available when there are multiple pointers (see the documentation).
A good example of how this can be used in practice can be found here.
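Putting the two answers together: drain the batched samples with getHistoricalX()/getHistoricalY() inside ACTION_MOVE, and if visible gaps remain, interpolate between consecutive samples. The interpolation step is plain math and can be sketched like this (`GapFiller` is a made-up name):

```java
import java.util.ArrayList;
import java.util.List;

// Fills the gap between two consecutive touch samples with evenly spaced points.
public class GapFiller {
    // Points along the segment from (x0,y0) to (x1,y1), spaced at most `step` px apart.
    // The start point is excluded (it was already drawn); the end point is included.
    public static List<float[]> interpolate(float x0, float y0,
                                            float x1, float y1, float step) {
        List<float[]> pts = new ArrayList<>();
        float dx = x1 - x0, dy = y1 - y0;
        int n = (int) Math.ceil(Math.hypot(dx, dy) / step);
        for (int i = 1; i <= n; i++) {
            float t = (float) i / n;
            pts.add(new float[] { x0 + dx * t, y0 + dy * t });
        }
        return pts;
    }

    public static void main(String[] args) {
        // two samples 10 px apart, filled at 5 px spacing -> (5,0) and (10,0)
        for (float[] p : interpolate(0f, 0f, 10f, 0f, 5f)) {
            System.out.println(p[0] + "," + p[1]);
        }
    }
}
```

In the listener you would call this for each pair of consecutive samples (historical and current) and hand every returned point to MyPainter.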

Limit down scaling of a Bitmap in a SurfaceView, Android

I have a single Bitmap in a SurfaceView.
I am using multi-touch to handle gestures such as zoom and drag. The problem is that when I scale the bitmap, I don't want it to be scaled down so much that it no longer covers the whole display (SurfaceView). I have found no way to work around this, since I can't find a way to get the downscaled bitmap's current height or width.
Here's some of my code:
private void multiTouchBehavior(MotionEvent event)
{
    if (event.getAction() == MotionEvent.ACTION_POINTER_2_DOWN)
    {
        _oldDist = spacing(event);
        if (_oldDist > 10f) // guard against a bug in the API
        {
            _saved.set(_matrix);
            _mid = midPoint(event);
        }
    }
    else if (event.getAction() == MotionEvent.ACTION_MOVE)
    {
        _newDist = spacing(event);
        if (_newDist > 10f)
        {
            _matrix.set(_saved);
            float scale = _newDist / _oldDist;
            _matrix.postScale(scale, scale, _mid.x, _mid.y);
        }
    }
}
And here is what happens in method spacing(event) :
private float spacing(MotionEvent event) {
    float x = event.getX(0) - event.getX(1);
    float y = event.getY(0) - event.getY(1);
    return (float) Math.sqrt(x * x + y * y); // FloatMath.sqrt() is deprecated
}
My onDraw(canvas) just does canvas.drawBitmap(myBitmap, _matrix, null).
Does anyone know how I could solve this problem?
Just to make sure you get my problem right: think of it as a big map of the world that I want to be able to zoom in on, and of course zoom out of once I've zoomed in, but NOT allowing it to be zoomed out further than it is at the start.
Found my solution. There is probably a better way, but for now I use:
matrix.mapRadius(1);
which I can compare against a variable initialized to that method's value at the beginning.
Edit:
float[] f = new float[9];
matrix.getValues(f);
float xScale = f[Matrix.MSCALE_X]; // index 0
float yScale = f[Matrix.MSCALE_Y]; // index 4
works better and is easier, though.
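With the current scale read out of the matrix this way, the gesture can be clamped so the bitmap never shrinks below screen-covering size. A sketch of just the math, assuming the view and bitmap dimensions are known (`ScaleClamp` is a made-up name):

```java
// Clamp math for a pinch-zoom: never let the bitmap shrink below the
// scale at which it covers the whole surface.
public class ScaleClamp {
    // Smallest scale at which the bitmap still covers the entire view.
    public static float minScale(float viewW, float viewH, float bmpW, float bmpH) {
        return Math.max(viewW / bmpW, viewH / bmpH);
    }

    // Clamp the gesture's proposed cumulative scale before applying postScale().
    public static float clamp(float proposed, float min) {
        return Math.max(proposed, min);
    }

    public static void main(String[] args) {
        // 480x800 view, 750x750 bitmap: the height is the limiting dimension
        float min = minScale(480f, 800f, 750f, 750f);
        // a pinch trying to shrink to 0.5x gets clamped back up
        System.out.println(clamp(0.5f, min));
    }
}
```

You would compute minScale once, track the cumulative scale in the matrix as above, and refuse (or clamp) any postScale() call that would drop below it.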

Getting coordinates on touch event relative to scrollable map

Is there a way to get the coordinates of a touch event on a scrollable map?
Meaning when I touch at one point on the screen, getX() and getY() return the respective values and then when I scroll the map, the coordinates returned should be relative to the map, not the screen.
e.g. I have a 750x750 background map, and my device screen size is 480x800.
When I first touch, say the coordinates returned are (100, 200). Now when I scroll the map and touch somewhere else, I get the coordinates (200, 200).
I want to get the coordinates with respect to the map and not the screen.
I've been trying to figure this out for a long time and have scoured the net and other sites in vain.
Please help. Thanks in advance.
I need the coordinates because I'm developing a game in which I have a large map with objects placed on it. When I scroll the map, I need the objects to move along with it, staying in the same positions.
here is my code:
@Override
public boolean onTouchEvent(MotionEvent ev) {
    switch (ev.getAction()) {
        case MotionEvent.ACTION_DOWN:
            touchStart.set(ev.getX(), ev.getY());
            x = ev.getX();
            y = ev.getY();
            break;
        case MotionEvent.ACTION_MOVE:
            newX = ev.getX() - touchStart.x + prevPicStart.x;
            newY = ev.getY() - touchStart.y + prevPicStart.y;
            if ((newX <= 0 && newX > 0 - mBG.getWidth() + Craz.DISP_WIDTH)) {
                picStart.x = newX;
            }
            if ((newY <= 0 && newY > 0 - mBG.getHeight() + Craz.DISP_HEIGHT)) {
                picStart.y = newY;
            }
            invalidate();
            break;
        case MotionEvent.ACTION_UP:
            prevPicStart.x = picStart.x;
            prevPicStart.y = picStart.y;
            break;
    }
    return true;
}
@Override
protected void onDraw(Canvas canvas) {
    Paint paint = new Paint();
    canvas.drawBitmap(mBG, picStart.x, picStart.y, paint);
    canvas.translate(picStart.x, picStart.y);
    mBDoor.draw(canvas);
}
It's pretty easy. When you scroll your map, you have an offset. If you scroll to the right (the background moves to the left), your offset will be negative. Let's say you have an offset on x of -50 and you touch screen coordinate 100; you simply do the math:
mapCoordX = screenX - offsetX; // makes: 100 - (-50) = 150
That's just the X coordinate; for Y it is the same.
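That subtraction, applied to both axes, is small enough to sketch directly (`MapCoords` is a made-up name):

```java
// Screen -> map conversion: subtract the (possibly negative) scroll offset
// on each axis to get coordinates relative to the map, not the screen.
public class MapCoords {
    public static float toMapX(float screenX, float offsetX) {
        return screenX - offsetX;
    }

    public static float toMapY(float screenY, float offsetY) {
        return screenY - offsetY;
    }

    public static void main(String[] args) {
        // scrolled right by 50 px (offset -50), touch at screen x=100 -> map x=150
        System.out.println(toMapX(100f, -50f)); // prints 150.0
    }
}
```

In the code above, picStart.x and picStart.y play the role of the offsets, so a touch at (ev.getX(), ev.getY()) lands on map point (ev.getX() - picStart.x, ev.getY() - picStart.y).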
I have written a tutorial about a map of tiles and scrolling over it. Maybe you take a look at it, too.
Pseudocode!
for (int id = 0; id < mapSize; id++) {
    Tile tile = new Tile(id);
    startX = (id % rowLength) * tileWidth;
    startY = (id / rowLength) * tileHeight;
    tile.setBackground(createCroppedBitmap(background, startX, startY));
}
