How to find the touch area for Android devices?

I'm trying to find the touch area on an Android screen, i.e. how much area is covered by a finger. I know about the event.getSize() method, but it always gives me 0, and pointerIndex is also 0. How can I find the touch area on all Android devices? Further on, I also need to calculate touch pressure.

Try this:
final View view = findViewById(R.id.view);
view.setOnTouchListener(new View.OnTouchListener() {
    @Override
    public boolean onTouch(View v, MotionEvent event) {
        Toast.makeText(context, "Touch coordinates : "
                + event.getX() + " x " + event.getY(), Toast.LENGTH_SHORT).show();
        return true;
    }
});

For your question regarding touch pressure: MotionEvent.getPressure(i) should return a value between 0 and 1 based on the "pressure" placed on the screen. In reality, for capacitive screens it is the size of the capacitive object rather than literal pressure, but the concept is almost the same for fingers (fingers are squishy). Values higher than 1 may be returned depending on the calibration of the touchscreen.
If your screen only ever returns 0 or 1, try testing on another device; perhaps your screen's driver simply does not report those values. The links below may be helpful:
https://developer.android.com/reference/android/view/MotionEvent.html#getPressure(int)
http://android-er.blogspot.com/2014/05/get-touch-pressure.html
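Since getSize() (and sometimes getPressure()) report 0 on some devices, it may also be worth trying MotionEvent.getTouchMajor(i) and MotionEvent.getTouchMinor(i), which report the major and minor diameters of the touch ellipse in pixels (support again varies by device and driver). A minimal sketch of the area calculation, written as plain Java with the Android calls only referenced in comments:

```java
// Sketch: approximate the contact patch as an ellipse. On Android, the
// two diameters would come from MotionEvent.getTouchMajor(i) and
// MotionEvent.getTouchMinor(i); like getSize(), their accuracy depends
// on the device's touchscreen driver.
public class TouchArea {

    /** Area of an ellipse given its major and minor diameters. */
    public static double ellipseArea(float majorDiameter, float minorDiameter) {
        // area = pi * a * b, where a and b are the semi-axes
        return Math.PI * (majorDiameter / 2.0) * (minorDiameter / 2.0);
    }

    public static void main(String[] args) {
        // e.g. a reported touch ellipse of 10 x 6 pixels
        System.out.println(ellipseArea(10f, 6f) + " px^2");
    }
}
```

Treating the contact patch as an ellipse is an assumption; the hardware only reports the two axes, not the true shape of the contact area.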

Related

Is there a offset between onTouchEvent and onTouchListener?

I have developed a game that shoots when the player touches the screen, using an onTouchListener on my custom SurfaceView and a Thread.
But now I want to change the approach: instead of the onTouchListener, I added onTouchEvent to the SurfaceView of my Activity.
The problem is that I get some kind of offset when I click on the emulator.
Everything works great, except that the offset keeps appearing and I don't understand why.
Also, let me mention that my app runs in landscape mode; maybe this is relevant.
I suspect it isn't working properly because the onTouchListener was added to the view and depended on it, but onTouchEvent doesn't depend on the view.
My view also doesn't have any padding or margin; it is a full-screen view (fill_parent).
Does anyone have any ideas on this?
I have finished my application and everything works correctly now, but I still don't know what the problem was.
After lots of debugging, onTouchEvent returned Y values that were always higher than the ones onTouchListener returned, and I am not sure why this is happening, since the view that receives the onTouchListener is a full-screen view.
So I figured out a way to get past this by doing some math.
The first method Android calls is onTouch, which gives the correct values, but only for the first touch. I needed correct values on MotionEvent.ACTION_MOVE as well, and I noticed that MotionEvent.ACTION_MOVE behaves consistently relative to the first touch recognized by onTouchEvent.
So I take the Y coordinate from onTouch and the offset Y coordinate from onTouchEvent, calculate the difference, and in every onTouchEvent from there on, until the user lifts their finger, I subtract that difference, which gives me the correct value.
If anyone else has this problem and doesn't know how to fix it, here is my code; maybe it will be helpful.
@Override
public boolean onTouchEvent(MotionEvent arg1) {
    /* you can only touch when the thread is running */
    if (game.state() != STATE_PAUSE) {
        if (arg1.getAction() == MotionEvent.ACTION_DOWN) {
            coordinateX = arg1.getX();
            coordinateY = arg1.getY();
            // the first ACTION_DOWN establishes the constant Y offset
            differenceY = Math.abs(coordinateY - touchedY);
            coordinateY = coordinateY - differenceY;
            shootingIsOkay = true;
            game.setDrawCircle(coordinateX, coordinateY);
        }
        if (arg1.getAction() == MotionEvent.ACTION_MOVE) {
            coordinateX = arg1.getX();
            coordinateY = arg1.getY();
            coordinateY = coordinateY - differenceY;
            shootingIsOkay = true;
            game.setDrawCircle(coordinateX, coordinateY);
        }
        if (arg1.getAction() == MotionEvent.ACTION_UP) {
            shootingIsOkay = false;
        }
    }
    return false;
}
And the onTouch method, called from the onTouchListener that depends on the view, is very simple:
@Override
public boolean onTouch(View arg0, MotionEvent arg1) {
    touchedY = arg1.getY();
    return false;
}
If anyone knows how to fix this problem, please post your answer; I am very curious why this is happening.

Determining where someone touches on an image

This is a basic question that leads into others down the line.
I am looking at expanding my app to show an image of a target (3 circles), and I want the user to be able to touch the target image where they hit; the app then determines where the user clicked.
I have not progressed down this line of development yet and do not know the best place to start or learn. Does anyone have tips / websites / examples that I can be pointed at to get the ball rolling?
Thanks
UPDATED
What I am trying to do (I have no knowledge of where to start):
Draw a target on a canvas: 3 circles
Draw a cross depending on where the user clicks on the target
Record a score depending on which circle the user clicked in
Thanks
You can use an OnTouchListener on your View. That will give you touch events and pass you the coordinates of the finger inside a MotionEvent.
Something like this ought to work:
img.setOnTouchListener(new OnTouchListener() {
    @Override
    public boolean onTouch(View v, MotionEvent me) {
        Log.i("TAG", "x: " + me.getX() + " y: " + me.getY());
        return true;
    }
});
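For the scoring step, one approach is to compare the touch's distance from the target centre against each ring radius. A sketch under assumed names (the centre coordinates and ring radii would come from however the target is drawn on the canvas; the touch x/y from MotionEvent.getX()/getY()):

```java
// Sketch: score a touch against concentric rings. The method name,
// centre, and radii values are illustrative, not from any Android API.
public class TargetScore {

    /**
     * Returns radii.length points for the innermost circle, down to 1
     * for the outermost ring, or 0 for a complete miss.
     * radii must be sorted in ascending order.
     */
    public static int ringScore(float x, float y, float cx, float cy, float[] radii) {
        double distance = Math.hypot(x - cx, y - cy);
        for (int i = 0; i < radii.length; i++) {
            if (distance <= radii[i]) {
                return radii.length - i;
            }
        }
        return 0; // outside every ring
    }

    public static void main(String[] args) {
        float[] radii = {50f, 100f, 150f};
        // a touch 120 px from the centre lands in the outer ring
        System.out.println(ringScore(120f, 0f, 0f, 0f, radii));
    }
}
```

Drawing the cross at the touch point is then just a matter of passing the same x/y to your Canvas drawing code.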

Android onTouch returning wrong color

I have a strange problem getting the color of the point that was touched. I created an image (.bmp) and filled it with the paint can. No gradients or other colors. Most of the time when I touch the screen, I get the color I am expecting, but sometimes I get a slightly different color. My code seems straightforward enough:
final Bitmap bm2 = BitmapFactory.decodeFile(image_overlay);
if (bm2 != null) {
    overlayimage.setImageBitmap(bm2);
    image.setOnTouchListener(new OnTouchListener() {
        @Override
        public boolean onTouch(View v, MotionEvent mev) {
            Log.d(MY_DEBUG_TAG, "onTouch()");
            DecodeActionDownEvent(v, mev, bm2);
            return false;
        }
    });
}

private void DecodeActionDownEvent(View v, MotionEvent ev, Bitmap bm2) {
    Log.d(MY_DEBUG_TAG, "DecodeActionDownEvent()");
    xCoord = (int) ev.getRawX();
    yCoord = (int) ev.getRawY();
    colorTouched = bm2.getPixel(xCoord, yCoord);
    Log.d(MY_DEBUG_TAG, "The coordinates touched were x: " + xCoord + "; y: " + yCoord);
    Log.d(MY_DEBUG_TAG, "The color touched was (hex) " + Integer.toHexString(colorTouched));
}
Recently I got a "miss", and checked the coordinates of the image by moving the eyedropper tool around until it was on the exact spot, and the pixel there is no different than the other pixels in the area.
Is it not "safe" to expect Android to return the exact color? If I paint a target with #ff424542, is it not safe to assume that if I hit that target, I would get a pixel color of #ff424542? In this case, Android was returning #ff4a454a. The attached image is my "image_overlay" file. The area I am targeting is the dark gray in the middle-right. Like I said, most of the time it works perfectly, but every once in a while I record a miss, even when I am clearly in the target zone. From my logs, recent misses were at x: 360, y: 399 and x: 368, y: 399. Successful hits were at x: 363, y: 393 and x: 365, y: 434.
I'm guessing you're using the eyedropper tool in an app on your PC, where you generated the image? You are then assuming that the image you are working with on Android is identical to the one you are working with on your PC. However, this is not a safe assumption. What are the properties of your source bitmap, e.g. resolution, DPI, etc.? Then take a look at BitmapFactory.decodeFile. The clue is in "decode". By default (hence the question about the bitmap properties), Android will not simply read the bitmap from your file specifier and load it into memory. For example (and I'm not sure without digging into the source code), perhaps decodeFile is dithering the resultant bitmap.
You should be using BitmapFactory.Options to control how the input bitmap is decoded:
http://developer.android.com/reference/android/graphics/BitmapFactory.Options.html
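For example, a decode that tries to keep the bitmap unaltered might look like this (a sketch; whether dithering or scaling would actually have been applied depends on the source image and platform version):

```java
// Ask BitmapFactory to decode the file with as little alteration as
// possible, so getPixel() matches the source image's colors.
BitmapFactory.Options opts = new BitmapFactory.Options();
opts.inScaled = false;                            // no density rescaling
opts.inDither = false;                            // no dithering of colors
opts.inPreferredConfig = Bitmap.Config.ARGB_8888; // full 32-bit color
final Bitmap bm2 = BitmapFactory.decodeFile(image_overlay, opts);
```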

Query the exact number of pointers supported by multitouch

Is there any way to programmatically get the maximum number of individual fingers that the touch screen can detect simultaneously?
I've only been able to find FEATURE_TOUCHSCREEN_MULTITOUCH, FEATURE_TOUCHSCREEN_MULTITOUCH_DISTINCT and FEATURE_TOUCHSCREEN_MULTITOUCH_JAZZHAND, which only tell me whether the hardware supports "2 or more" or "5 or more" fingers, respectively.
As far as I've seen, there's no way to get the exact number of fingers supported.
I've been able to find out that my Nexus S supports a maximum of 5 fingers with the following code:
public boolean onTouchEvent(MotionEvent event) {
    Log.d("multitouch", event.getPointerCount() + " fingers detected");
    return super.onTouchEvent(event);
}
But I'd like to be able to get this data from some sort of environment variable so my users won't have to go through a "detection screen" just to get this information.
You can do so by analyzing event.toString().

how to know the coordinates of image when touch

How do I get image coordinates at the mouse hover position?
Please let me know.
Thank you
Quoted from the "android-porting" mailing list (end of August 2010):
Android currently doesn't support mouse input, so has no concept of mouse hover.
You set an onTouchListener for the image, and in the onTouch event, you can pull the x,y coordinates out of the MotionEvent. getX and getY will get you the x and y coordinates in relation to the image, and getRawX and getRawY will get the x,y coordinates of the screen.
public boolean onTouch(View arg0, MotionEvent arg1) {
    System.out.println("X: " + arg1.getX());
    System.out.println("Y: " + arg1.getY());
    System.out.println("Raw X: " + arg1.getRawX());
    System.out.println("Raw Y: " + arg1.getRawY());
    return true;
}
As stakx said, Android doesn't support the mouse. However, if you are referring to the image that currently has focus, try this:
View focusedImg = findViewById(R.id.YourMainLayout).findFocus();
int[] relativeToParentPixels = { focusedImg.getLeft(), focusedImg.getTop(),
        focusedImg.getRight(), focusedImg.getBottom() };
Now you'll have the boundary positions of the image in an array.
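A touch point can then be tested against that boundary array. Note that getLeft()/getTop() etc. are relative to the parent view, while getRawX()/getRawY() are screen coordinates, so in practice you would first translate them into the same coordinate space (e.g. via View.getLocationOnScreen()). The containment test itself is simple; a sketch with illustrative names:

```java
// Sketch: hit-test a point against a view's bounds. The point and the
// bounds must already be in the same coordinate space; on Android the
// bounds would come from getLeft()/getTop()/getRight()/getBottom().
public class TouchBounds {

    /** True if (x, y) lies inside the rectangle [left, right) x [top, bottom). */
    public static boolean contains(float x, float y,
                                   int left, int top, int right, int bottom) {
        return x >= left && x < right && y >= top && y < bottom;
    }

    public static void main(String[] args) {
        // a touch at (5, 5) inside a 10 x 10 view at the origin
        System.out.println(contains(5f, 5f, 0, 0, 10, 10));
    }
}
```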
Try this:
http://developer.android.com/reference/android/view/View.OnHoverListener.html
Available from API 14.
