Assume I have a large-screen device and a small-screen device. If I touch the large-screen device at some random (x, y), like this:
//on touch event
float x = event.getX();
float y = event.getY();
and send these coordinates to the small-screen device, how do I know where that (x, y) point should be on the small screen?
In other words, if it was at the top-right corner of the large-screen device, it must be at the top-right corner of the small-screen device.
The key here is to know the widths of both screens and scale the coordinate proportionally.
Suppose largeWidth = 400, smallWidth = 300, and your x on the large screen is 200.
To find the corresponding x on the small screen, simple math is all you need:
smallX = x * smallWidth / largeWidth
smallX = 200 * 300 / 400
smallX = 150
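The proportional mapping above can be sketched as a small helper. The widths here are the example values from the answer; on real devices you would read them from DisplayMetrics:

```java
public class CoordinateMapper {
    // Map a coordinate from a source screen to a target screen proportionally.
    static float map(float value, float sourceSize, float targetSize) {
        return value * targetSize / sourceSize;
    }

    public static void main(String[] args) {
        // Example from the answer: x = 200, largeWidth = 400, smallWidth = 300
        System.out.println(map(200f, 400f, 300f)); // 150.0
    }
}
```

The same helper works for the y coordinate by passing the two screen heights instead.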
I have a smartphone with a 2560 x 1440 px screen. Now I am using this function for my TextView:
int[] locationOnScreen = new int[2];
txtAp.GetLocationInWindow(locationOnScreen);
It is supposed to give me the total x and y coordinates in pixels.
My TextView is pretty much in the middle of the screen, so supposedly at
(1280,770). But the function returns [69, 1111].
How can that be? If this is not the right way to do it, what is?
I have a smartphone of 2560 x 1440 px... My TextView is pretty much in the middle of the screen, so supposedly at (1280,770).
That is not correct unless you mean the center of the TextView. A view is rendered on screen within its bounding rectangle, whose coordinates are given as [left, top, right, bottom].
The GetLocationInWindow() method returns the (left, top) coordinates of a view, so (x = 69, y = 1111) is a plausible top-left corner for a TextView sitting in the middle of the screen.
Important note: GetLocationInWindow() returns the coordinates relative to the root view, not the actual window, so you should also take the status bar height into consideration; see "Height of status bar in Android" for how to calculate it.
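To see why (69, 1111) can be consistent with a centered view, compute the center from the top-left corner plus half the view's size. The view dimensions below are assumed purely for illustration; in real code they would come from view.getWidth() and view.getHeight():

```java
public class ViewCenter {
    // getLocationInWindow() gives the view's top-left corner; the center
    // is that corner offset by half the view's measured size.
    static float[] center(int left, int top, int width, int height) {
        return new float[] { left + width / 2f, top + height / 2f };
    }

    public static void main(String[] args) {
        // With the reported top-left (69, 1111) and an assumed size of
        // 1302 x 338 px, the center lands at (720, 1280) -- exactly the
        // middle of a 1440 x 2560 portrait screen.
        float[] c = center(69, 1111, 1302, 338);
        System.out.println(c[0] + ", " + c[1]); // 720.0, 1280.0
    }
}
```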
I am creating a layout for finger selection, in which I am trying to attach a click event to each individual finger. The layout should look the same at any screen resolution.
My approach:
Inside a RelativeLayout, I place radio buttons (individual ones, not a RadioGroup) over each finger of a hand image using margins and padding, but they do not sit properly over the fingers; they shift slightly left or right.
The problem: the radio button positions change when the screen resolution changes.
I failed to find a library for this kind of click handling, and I couldn't find any related questions on SO either. Can someone point me to a library, an example, or a better approach?
Several years ago I worked on a similar task. Unfortunately, I don't remember the whole solution, but the idea was pretty simple. In my case it was an image of a map on which the user could select a district by tapping it. I knew the resolution of the original image that was displayed in the UI, and I encoded each district as a list of points describing its boundary. A touch listener was attached to the ImageView displaying the map, so every time the user tapped the map I took the tap position, multiplied it by a scale factor (calculated from the size of the original image and the size Android scaled it to), and then checked whether the resulting point lay inside any of the polygons.
So, to make it clearer:
Let width, height = size of the original image
x, y = the user's touch position (in displayed-image coordinates)
scaleWidth, scaleHeight = size of the image as displayed on the user's device
scaleX = width / scaleWidth, scaleY = height / scaleHeight
originalX = scaleX * x, originalY = scaleY * y
Then check whether originalX and originalY fall inside any of the polygons. In your case the polygons could simply be rectangles around each finger.
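The steps above can be sketched as a single hit-test helper. All the numbers in the example are assumed for illustration (a 1000x1000 original image displayed at 500x500, with a rectangle standing in for one finger):

```java
public class FingerHitTest {
    // Map a touch from displayed-image coordinates back to original-image
    // coordinates, then test it against a rectangle around one finger.
    static boolean hits(float touchX, float touchY,
                        float origW, float origH,
                        float scaledW, float scaledH,
                        float left, float top, float right, float bottom) {
        float originalX = touchX * origW / scaledW;
        float originalY = touchY * origH / scaledH;
        return originalX >= left && originalX <= right
            && originalY >= top && originalY <= bottom;
    }

    public static void main(String[] args) {
        // Original image 1000x1000 shown at 500x500: a touch at (100, 100)
        // maps to (200, 200), which lies inside the (150,150)-(250,250) box.
        System.out.println(hits(100f, 100f, 1000f, 1000f, 500f, 500f,
                                150f, 150f, 250f, 250f)); // true
    }
}
```

In a real layout you would run this test once per finger rectangle and react to the first one that matches.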
On my game screen I want a swipe to be detected only if it is longer than 100 px, because the user taps a lot on the game screen and taps tend to be detected as swipes, which switches back to the title screen. How can I make a swipe register only if it is longer than 100 px?
There are two ways to achieve this.
The first is to save the starting point of the touch and measure the distance when the touch ends, just like Paul mentioned.
The second is to enlarge the tap square size if you use libGDX's GestureDetector. It defaults to 40 px, which means that if your finger moves more than 20 px to any side, the event is no longer a tap but a pan. I'd recommend using the GestureListener/GestureDetector, as it gives you the basic mobile gestures out of the box rather than you recoding them.
On a side note: measuring the distance in pixels is error-prone because pixel density varies between devices, especially on Android. 100 px on one device may cover only half the physical distance it does on another. Take pixel density into account, or switch to a relative measurement such as one third of the screen size.
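A density-aware alternative to a hard-coded 100 px is to express the threshold in density-independent pixels and convert at runtime. A minimal sketch; the density value would come from the framework (Gdx.graphics.getDensity() in libGDX, or DisplayMetrics.density on Android):

```java
public class SwipeThreshold {
    // Convert a density-independent threshold (dp) into actual pixels
    // using the device's density scale factor.
    static float toPixels(float thresholdDp, float density) {
        return thresholdDp * density;
    }

    public static void main(String[] args) {
        // A 100 dp swipe threshold on a 2x-density device is 200 px.
        System.out.println(toPixels(100f, 2f)); // 200.0
    }
}
```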
Save the position on touch down and touch up:
private Vector2 downPos = new Vector2(), upPos = new Vector2();
private Vector3 tmp = new Vector3();

public void touchDown(float x, float y.....) {
    tmp.set(x, y, 0);
    camera.unproject(tmp);
    downPos.set(tmp.x, tmp.y);
}

public void touchUp(float x, float y.....) {
    tmp.set(x, y, 0);
    camera.unproject(tmp);
    upPos.set(tmp.x, tmp.y);
    float distance = downPos.dst(upPos); // the distance between those vectors
    if (distance > 100) {
        // there was a swipe of a distance longer than 100 pixels
    }
}
If you don't want to do this only on touch up, put the same code in the touchDragged method.
I have an image (ImageView) with certain areas marked on it. When a user taps the screen, I want to detect which area was selected.
I have identified the area boundaries on the original image, but the x and y of MotionEvent are off.
I tried dip-to-pixel conversion (TypedValue.applyDimension(TypedValue.COMPLEX_UNIT_DIP, dipValue, metrics)), but the values are still off. It certainly has to do with the screen size/density/etc., but how exactly do I get the pixel position of the touch event on the original image? (Or, vice versa, how do I convert the original image coordinates into something comparable with the x and y of the motion event?)
I managed to calculate it using the right and bottom coordinates of the ImageView (pic) compared to the real image size (568x1207):
float xCoef = 568f / pic.getRight();
float yCoef = 1207f / pic.getBottom();
float x = event.getX() * xCoef;
float y = event.getY() * yCoef;
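The coefficient approach above can be packaged into a helper. The 568x1207 image size comes from the answer; the view edge values in the example are hypothetical (they would come from pic.getRight() and pic.getBottom()), and the approach assumes the ImageView starts at the screen origin:

```java
public class TouchToImage {
    // Map a touch event on the ImageView back to the original 568x1207 image,
    // using the view's right/bottom edges as in the answer above.
    static float[] toOriginal(float eventX, float eventY,
                              float viewRight, float viewBottom) {
        float xCoef = 568f / viewRight;
        float yCoef = 1207f / viewBottom;
        return new float[] { eventX * xCoef, eventY * yCoef };
    }

    public static void main(String[] args) {
        // Hypothetical view edges at (284, 603.5): the image is shown at
        // half size, so a touch at (100, 100) maps to (200, 200).
        float[] p = toOriginal(100f, 100f, 284f, 603.5f);
        System.out.println(p[0] + ", " + p[1]); // 200.0, 200.0
    }
}
```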
I have an onTouchListener on an ImageView, and I use event.getX() and event.getY().
My goal is to display an image and launch a dialog (or similar) when the user touches a particular part of the image.
The problem is that on different screens, the x and y values change for the same part of my ImageView.
How can I get the real position of the event in pixels on every screen?
For instance, I would like to display an Android face and do something when the user clicks on its eyes...
Write your hit-detection code against a fixed resolution, e.g. 480x640. Then get the actual resolution of the device the app is running on using DisplayMetrics. Calculate xScale and yScale (e.g. xScale = DisplayMetrics.widthPixels / 480f) and multiply your x and y by these scale factors.
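The scaling described above might look like this. The 480x640 design size is from the answer; the device size in the example is assumed and would come from DisplayMetrics at runtime:

```java
public class HotspotScaler {
    // Scale a hotspot defined in the fixed 480x640 design resolution up to
    // the device's actual resolution.
    static float[] toDevice(float designX, float designY,
                            int deviceW, int deviceH) {
        float xScale = deviceW / 480f;
        float yScale = deviceH / 640f;
        return new float[] { designX * xScale, designY * yScale };
    }

    public static void main(String[] args) {
        // A hotspot at (240, 320) -- the center of the 480x640 design space --
        // lands at (540, 960) on an assumed 1080x1920 device.
        float[] p = toDevice(240f, 320f, 1080, 1920);
        System.out.println(p[0] + ", " + p[1]); // 540.0, 960.0
    }
}
```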