Android Swipe Finger

Please look at this picture.
Basically, you swipe your finger down in that area and it shows more. How can I do this?

With a ViewFlipper or a (horizontal) ListView, each page/item of the ViewFlipper/ListView will contain one of these views (I bet they are images). For swiping you can implement an OnTouchListener or a GestureDetector. Override the onFling() method and measure the distance between the initial X coordinate, where the user touches the screen, and the final X coordinate, where the user releases the touch, using event1.getX() and event2.getX(). Do not measure the motion on the y-axis, since for a horizontal (left-to-right) swipe you have motion only on the x-axis. Initialize some MINIMUM_DISTANCE to compare against the difference between the first touch and the release coordinates (event1.getX() - event2.getX()).
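A minimal sketch of that fling detection; the MINIMUM_DISTANCE value and the viewFlipper calls mentioned in the comments are illustrative assumptions:

import android.view.GestureDetector;
import android.view.MotionEvent;

public class SwipeListener extends GestureDetector.SimpleOnGestureListener {

    private static final int MINIMUM_DISTANCE = 100; // pixels; tune to taste

    @Override
    public boolean onDown(MotionEvent e) {
        return true; // must return true, or the rest of the gesture is discarded
    }

    @Override
    public boolean onFling(MotionEvent event1, MotionEvent event2,
                           float velocityX, float velocityY) {
        float deltaX = event1.getX() - event2.getX();
        if (deltaX > MINIMUM_DISTANCE) {
            // finger moved right-to-left: show the next page, e.g. viewFlipper.showNext()
            return true;
        } else if (deltaX < -MINIMUM_DISTANCE) {
            // finger moved left-to-right: show the previous page, e.g. viewFlipper.showPrevious()
            return true;
        }
        return false;
    }
}

You would attach it with something like new GestureDetector(context, new SwipeListener()) and forward the view's touch events to it from an OnTouchListener via gestureDetector.onTouchEvent(event).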

Related

Camera rotation not rotating correctly on mobile

I'm making an FPS and I want the player to rotate the camera. My code works on PC, but on mobile, if I'm rotating the camera while also touching the fire button (or anywhere else with my other finger), the camera rotates to the right (where my fire button is). I don't know if I can do something about it, or if I need to cancel the Android and iOS releases and publish my game only for PC.
Part of my code:
if (CanProcessInput())
{
    // Check whether this look input is coming from a gamepad stick or the mouse
    bool isGamepad = Input.GetAxis(stickInputName) != 0f;
    float i = isGamepad ? Input.GetAxis(stickInputName) : Input.GetAxisRaw(mouseInputName);

    // handle inverting vertical input
    if (InvertYAxis)
        i *= -1f;

    // apply sensitivity multiplier
    i *= LookSensitivity;

    if (isGamepad)
    {
        // since mouse input is already deltaTime-dependent, only scale input with frame time if it's coming from sticks
        i *= Time.deltaTime;
    }
    else
    {
        // reduce mouse input amount to be equivalent to stick movement
        i *= 0.01f;
#if UNITY_WEBGL
        // Mouse tends to be even more sensitive in WebGL due to mouse acceleration, so reduce it even more
        i *= WebglLookSensitivityMultiplier;
#endif
    }

    return i;
}
Segment the touch input so you ignore values that are generated in the area of the fire button. You can do this by checking the Touch.position value, documented as:
The position of the touch in screen space pixel coordinates.
Position returns the current position of a touch contact as it's dragged. If you need the original position of the touch, see Touch.rawPosition.
The documentation hints at what you might be interested in, too: the original touch position. Your question body is asking about horizontal motion, but your code is referencing y values. The position coordinates are given as (width, height), so I just want to be clear up front that I'll use x values to answer your question about horizontal motion.
So if you know that the bottom-left corner of the screen is (0, 0), and you can use Screen.width to get the width in pixels, then you could reserve the right 20% of the screen for the fire button. Anything less than the 80%-of-width pixel would be an acceptable input for rotation; that pixel is the maximum allowable input:
private int maxAllowablePixel;

void Start()
{
    // cast is needed because 0.8f * Screen.width is a float
    maxAllowablePixel = (int)(0.8f * Screen.width);
}
and then only process the touch as a rotation if the original touch position is less than that value:
if (touch.rawPosition.x < maxAllowablePixel)
{
    DoRotation(touch.position);
}
And again, this way you are still allowing the user to put their finger down in the middle of the screen, drag it over the fire button, and rotate that much; only touches that originate in the fire button "exclusion zone" are ignored for the purposes of doing rotations.
Also, when you do the rotation, do it where the finger is now (Touch.position) and not where it started (Touch.rawPosition).
The way I always solve an issue like this is to not use the touch positions directly, but to have a "TouchReceiver" Object in the Canvas with a script that implements IDragHandler. So the class might look like this:
using UnityEngine;
using UnityEngine.EventSystems;
using UnityEngine.UI;

[RequireComponent(typeof(Image))]
public class TouchReceiver : MonoBehaviour, IDragHandler
{
    public void OnDrag(PointerEventData data)
    {
        Vector2 delta = data.delta;
        // do something with delta, like rotating the camera.
    }
}
Then set up your canvas like so:
Make sure to put the "TouchReceiver" behind the fire button; that way the OnDrag event will not occur when pressing the fire button, because the fire button blocks the "TouchReceiver" object itself. The "TouchReceiver" object needs to have an Image component, otherwise it won't be able to receive any touches. Of course, the image should then be made totally transparent.
With this setup it is pretty easy to move the fire button around without changing anything else. This might also come in handy when adapting to different screen resolutions.

How to draw an arrow instead of a dragshadow?

I need the user to drag an arrow between two views, but all the info I can find online is about making a drag shadow.
Here's an example of the result I need: https://imgur.com/pfQxtSK
Thanks in advance.
What you want requires you to observe touch events inside your parent ViewGroup or custom view (whatever is drawing the light-blue grid in the image), using the onTouchEvent(MotionEvent) callback. Next, determine the direction of the drag using the MotionEvent's x and y coordinates and some basic trigonometry. This lets you compute the angle of the drag and therefore dynamically adjust the angle (rotation) of your arrow so that it accurately tracks the finger. Then use the same x and y coordinates from the MotionEvent to determine where on the canvas to draw the arrow. If the arrow is another view added as a child, you can set that view's x and y coordinates or translate the view using methods provided by the View class (you may need to play with offsets so the view doesn't get drawn off-screen after several drags/touches). If the arrow is a bitmap or drawable, you will need to adjust the draw coordinates in your view's onDraw(Canvas) method.
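A minimal sketch of the custom-view approach, assuming the arrow is drawn directly on the canvas; the class name, paint settings, and arrowhead size are illustrative assumptions:

import android.content.Context;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Paint;
import android.util.AttributeSet;
import android.view.MotionEvent;
import android.view.View;

public class ArrowOverlayView extends View {

    private final Paint paint = new Paint(Paint.ANTI_ALIAS_FLAG);
    private float startX, startY, endX, endY;
    private boolean dragging;

    public ArrowOverlayView(Context context, AttributeSet attrs) {
        super(context, attrs);
        paint.setColor(Color.RED);
        paint.setStrokeWidth(8f);
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:
                startX = endX = event.getX();
                startY = endY = event.getY();
                dragging = true;
                return true;
            case MotionEvent.ACTION_MOVE:
            case MotionEvent.ACTION_UP:
                endX = event.getX();
                endY = event.getY();
                invalidate(); // redraw with the new coordinates
                return true;
        }
        return super.onTouchEvent(event);
    }

    @Override
    protected void onDraw(Canvas canvas) {
        super.onDraw(canvas);
        if (!dragging) return;
        // shaft of the arrow, from the initial touch to the current finger position
        canvas.drawLine(startX, startY, endX, endY, paint);
        // angle of the drag, used to rotate the arrowhead so it tracks the finger
        double angle = Math.atan2(endY - startY, endX - startX);
        float headLen = 40f;
        canvas.drawLine(endX, endY,
                endX - headLen * (float) Math.cos(angle - Math.PI / 6),
                endY - headLen * (float) Math.sin(angle - Math.PI / 6), paint);
        canvas.drawLine(endX, endY,
                endX - headLen * (float) Math.cos(angle + Math.PI / 6),
                endY - headLen * (float) Math.sin(angle + Math.PI / 6), paint);
    }
}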

Android : Move arrow's end point and start point on user's touch event

I have a RelativeLayout with background @mipmap/arrow. I am adding this relative layout (on user touch) to another relative layout which has an ImageView (using imageRelativeLayout.addView(arrowRelativeLayout)).
Goal:
1) I want to add an arrow on the ImageView after onTouchEvent() at a particular (x, y) position. (This is working perfectly.)
2) After adding the arrow, I want to move the arrow's end point or start point as per the user's touch.
3) When the user touches the arrow's end point, it should move to the respective (x, y) position, but the start point should stay fixed.
4) When the user touches the arrow's start point, it should move to the respective (x, y) position, but the end point should stay fixed.
The arrow head is the start point and the arrow tail is the end point.
Please help me. Thanks in advance.
So add some invisible views and anchor them to the start and end locations of the arrow; when the user presses one, move it to the new location and then update the image view.
It depends on your graphics whether the arrow runs edge to edge or corner to corner within its rect. I would suggest corner to corner; then you just have to set the layout parameters with the correct top/left and bottom/right, and also rotate/flip the view depending on your start and end points, but you only need to rotate by 90, 180 or 270 degrees.
This method will mean that your arrow head gets squashed or stretched. If you want, you can include a separate arrow head which uses the generated angle and rotates to it, anchored to the end of the arrow line that you want. A sketch of the invisible-handle idea follows.
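A minimal sketch of those invisible handles, assuming the arrow layout sits corner to corner between the two points; the class and field names (arrowLayout, startHandle, endHandle) are illustrative assumptions:

import android.view.MotionEvent;
import android.view.View;
import android.widget.RelativeLayout;

public class ArrowEndpointController {

    private final RelativeLayout arrowLayout;
    private final View startHandle, endHandle;
    private float startX, startY, endX, endY;

    public ArrowEndpointController(RelativeLayout arrowLayout, View startHandle, View endHandle) {
        this.arrowLayout = arrowLayout;
        this.startHandle = startHandle;
        this.endHandle = endHandle;
        // dragging the end handle moves the end point; the start point stays fixed
        endHandle.setOnTouchListener((v, event) -> {
            if (event.getActionMasked() == MotionEvent.ACTION_MOVE) {
                endX = event.getRawX(); // raw (screen) coordinates; adjust for the parent's offset if needed
                endY = event.getRawY();
                updateArrow();
            }
            return true;
        });
        // dragging the start handle moves the start point; the end point stays fixed
        startHandle.setOnTouchListener((v, event) -> {
            if (event.getActionMasked() == MotionEvent.ACTION_MOVE) {
                startX = event.getRawX();
                startY = event.getRawY();
                updateArrow();
            }
            return true;
        });
    }

    private void updateArrow() {
        // position the arrow layout corner to corner between the two points,
        // then flip it so the head stays at the start point
        RelativeLayout.LayoutParams lp = (RelativeLayout.LayoutParams) arrowLayout.getLayoutParams();
        lp.leftMargin = (int) Math.min(startX, endX);
        lp.topMargin = (int) Math.min(startY, endY);
        lp.width = (int) Math.abs(endX - startX);
        lp.height = (int) Math.abs(endY - startY);
        arrowLayout.setLayoutParams(lp);
        arrowLayout.setScaleX(endX >= startX ? 1f : -1f); // flip horizontally if needed
        arrowLayout.setScaleY(endY >= startY ? 1f : -1f); // flip vertically if needed
        // keep the handles under the user's finger
        startHandle.setX(startX - startHandle.getWidth() / 2f);
        startHandle.setY(startY - startHandle.getHeight() / 2f);
        endHandle.setX(endX - endHandle.getWidth() / 2f);
        endHandle.setY(endY - endHandle.getHeight() / 2f);
    }
}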

How to let two overlapping views to both receive the user input events?

I would like to have two overlapping image views in my layout, but I need both of them to receive user input events such as touchDown, touchUp, pinch, etc. Could you give me some hints? Thanks.
Set an OnTouchListener on both of the views. Then get the bounds of both views (x and y location, width, height). When the onTouch method is fired, check which view was touched, then check whether the x and y location of the touch falls within the bounds of the other view. If it does, forward the touch event to the other view by doing:
otherView.dispatchTouchEvent(motionEvent);
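A minimal sketch of that forwarding, assuming both views share the same parent; topView and bottomView are illustrative names, and the event is translated into the other view's local coordinates before being dispatched:

import android.graphics.Rect;
import android.view.MotionEvent;
import android.view.View;

void linkOverlappingViews(View topView, View bottomView) {
    topView.setOnTouchListener((v, event) -> {
        v.onTouchEvent(event); // let the touched (top) view react first

        // bounds of the other view, in the shared parent's coordinates
        Rect bottomBounds = new Rect();
        bottomView.getHitRect(bottomBounds);

        // touch position translated into the parent's coordinates
        float parentX = v.getX() + event.getX();
        float parentY = v.getY() + event.getY();

        if (bottomBounds.contains((int) parentX, (int) parentY)) {
            // copy the event, shift it into the bottom view's local space, and forward it
            MotionEvent copy = MotionEvent.obtain(event);
            copy.offsetLocation(v.getX() - bottomView.getX(), v.getY() - bottomView.getY());
            bottomView.dispatchTouchEvent(copy);
            copy.recycle();
        }
        return true;
    });
}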

ScaleGestureDetector triggers onScaleEvent with single finger touch

I have code in which onTouchEvent passes the touch event to a member ScaleGestureDetector, and if the detector triggers an onScale event on my scale listener, I scale up the displayed item. If the onScale event is not triggered, I treat the event as a potential drag and move the displayed item around. I have observed that on certain devices the ScaleGestureDetector triggers an onScale event even when there is just a single finger touching the screen. Is this expected? What is the recommended mechanism for dealing with scales and drags at the same time?
If I'm correct, you're calling mScaleGestureDetector.onTouchEvent(mEvent) with a single pointer (mEvent.getPointerCount() == 1). It is only supposed to be fed events while more than one pointer is down. Make sure you're calling it with a pointer count > 1, e.g.:
int pointersCount = mEvent.getPointerCount();
if (pointersCount > 1) {
    TOUCH_GESTURE_MODE = GESTURE_MODE_SCALE;
    mScaleGestureDetector.onTouchEvent(mEvent);
    return true;
}
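For handling drags and scales together, an alternative, commonly used pattern is to feed every event to the ScaleGestureDetector and only treat moves as drags while no scale is in progress (via isInProgress()). A minimal sketch; the class name and the places where the scale/drag would be applied are illustrative assumptions:

import android.content.Context;
import android.view.MotionEvent;
import android.view.ScaleGestureDetector;
import android.view.View;

public class DragScaleTouchHandler implements View.OnTouchListener {

    private final ScaleGestureDetector scaleDetector;
    private float lastX, lastY;

    public DragScaleTouchHandler(Context context) {
        scaleDetector = new ScaleGestureDetector(context,
                new ScaleGestureDetector.SimpleOnScaleGestureListener() {
                    @Override
                    public boolean onScale(ScaleGestureDetector detector) {
                        // apply detector.getScaleFactor() to the displayed item here
                        return true;
                    }
                });
    }

    @Override
    public boolean onTouch(View v, MotionEvent event) {
        scaleDetector.onTouchEvent(event); // the detector sees every event
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:
                lastX = event.getX();
                lastY = event.getY();
                break;
            case MotionEvent.ACTION_MOVE:
                // treat it as a drag only while no pinch is in progress and one finger is down
                if (!scaleDetector.isInProgress() && event.getPointerCount() == 1) {
                    float dx = event.getX() - lastX;
                    float dy = event.getY() - lastY;
                    // move the displayed item by (dx, dy) here
                }
                lastX = event.getX();
                lastY = event.getY();
                break;
        }
        return true;
    }
}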
