In my program I draw a rectangle while the finger is down and moving, then erase it once the finger is lifted. This is to show the user the range of values he/she is using as a "guess" to find the root. However, the rectangle never shows up! But if I remove the line that clears the rectangle in the ACTION_UP case, the user can draw the rectangle.
Here's the code:
In the onDraw() function:
if(dataline>1)//if greater than 1, draw rectangle
{
myPaint.setColor(Color.CYAN);
canvas.drawRect(tX1,0, tX2,canvas.getHeight(),myPaint);
}
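A side note on the drawing code itself: Canvas.drawRect() draws nothing when the right edge is less than the left edge, so if the user drags to the left of the starting point the band silently disappears. A minimal, hedged sketch that normalizes the two x coordinates first:
if (dataline > 1) // if greater than 1, draw the selection band
{
    int left = Math.min(tX1, tX2);   // drawRect() needs left <= right
    int right = Math.max(tX1, tX2);
    myPaint.setColor(Color.CYAN);
    canvas.drawRect(left, 0, right, canvas.getHeight(), myPaint);
}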
In the onTouchEvent() function:
public boolean onTouchEvent(MotionEvent ev) {
final int action = ev.getAction();
switch (action) {
case MotionEvent.ACTION_DOWN: {
final float x = ev.getX();
final float y = ev.getY();
// Remember where we started
mLastTouchX = x;
mLastTouchY = y;
tX1=(int)ev.getX();
tX2=tX1;
x_1 = ev.getX();
x_1=(x_1-X1)/(zoom_x);
clicks= 1;
tX1=(int) ev.getX();//set first x coord
tX2=tX1;// make second x coord equal to the first
}
case MotionEvent.ACTION_MOVE: {
final float x = ev.getX();
final float y = ev.getY();
// Calculate the distance moved
final float dx = x - mLastTouchX;
final float dy = y - mLastTouchY;
mLastTouchX = x;
mLastTouchY = y;
dataline=2;//let onDraw() draw the rectangle while dragging finger
tX2+= (int)dx;// find new second coordinate
}
case MotionEvent.ACTION_UP: {
dataline=0;//if commented out, rectangle is drawn otherwise, it is never seen.
}
}
return true;
}
Problem solved! I have learned that you have to end each case with a break (or return) statement; otherwise the switch falls through and the later cases also run, so ACTION_UP immediately cleared the rectangle again.
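For reference, a minimal sketch of the same switch with each case ended explicitly (field names taken from the code above, and assuming the view is invalidated elsewhere so onDraw() runs):
switch (action) {
    case MotionEvent.ACTION_DOWN: {
        mLastTouchX = ev.getX();   // remember where the drag started
        mLastTouchY = ev.getY();
        tX1 = (int) ev.getX();     // first x coordinate of the band
        tX2 = tX1;                 // second coordinate starts equal to the first
        break;                     // without this, ACTION_MOVE and ACTION_UP also run
    }
    case MotionEvent.ACTION_MOVE: {
        dataline = 2;              // let onDraw() draw the rectangle while dragging
        tX2 += (int) (ev.getX() - mLastTouchX);
        mLastTouchX = ev.getX();
        mLastTouchY = ev.getY();
        break;
    }
    case MotionEvent.ACTION_UP: {
        dataline = 0;              // hide the rectangle once the finger lifts
        break;
    }
}
return true;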
Recently I have been trying to implement dragging and scaling on a picture that I place in a FrameLayout. What I want to achieve is simple: to be able to drag the picture around and zoom it. I went to the Android Developer website and followed the guide there.
Then following the code examples on that website I wrote MyCustomView:
public class MyCustomView extends ImageView {
private static final int INVALID_POINTER_ID = 0xDEADBEEF;
private ScaleGestureDetector mScaleDetector;
private float mScaleFactor = 1.f;
private float mLastTouchX, mLastTouchY;
private int mActivePointerId = INVALID_POINTER_ID;
private LayoutParams mLayoutParams;
private int mPosX, mPosY;
public MyCustomView(Context context) {
super(context);
mScaleDetector = new ScaleGestureDetector(context, new CustomScaleListener());
mLayoutParams = (LayoutParams) super.getLayoutParams();
if (mLayoutParams != null) {
mPosX = mLayoutParams.leftMargin;
mPosY = mLayoutParams.topMargin;
} else {
mLayoutParams = new LayoutParams(300, 300);
mLayoutParams.leftMargin = 0;
mLayoutParams.topMargin = 0;
}
}
@Override
public void onDraw(Canvas canvas) {
super.onDraw(canvas);
canvas.save();
canvas.scale(mScaleFactor, mScaleFactor);
canvas.restore();
}
@Override
public boolean onTouchEvent(MotionEvent ev) {
// Let the ScaleGestureDetector inspect all events
mScaleDetector.onTouchEvent(ev);
final int action = MotionEventCompat.getActionMasked(ev);
switch (action) {
case MotionEvent.ACTION_DOWN: {
final int pointerIndex = MotionEventCompat.getActionIndex(ev);
//final float x = MotionEventCompat.getX(ev, pointerIndex);
//final float y = MotionEventCompat.getY(ev, pointerIndex);
final float x = ev.getRawX();
final float y = ev.getRawY();
// Remember where we started (for dragging)
mLastTouchX = x;
mLastTouchY = y;
// Save the ID of this pointer (for dragging)
mActivePointerId = MotionEventCompat.getPointerId(ev, 0);
break;
}
case MotionEvent.ACTION_MOVE: {
// Find the index of the active pointer and fetch its position
final int pointerIndex = MotionEventCompat.findPointerIndex(ev, mActivePointerId);
//final float x = MotionEventCompat.getX(ev, pointerIndex);
//final float y = MotionEventCompat.getY(ev, pointerIndex);
final float x = ev.getRawX();
final float y = ev.getRawY();
final float dx = x - mLastTouchX;
final float dy = y - mLastTouchY;
//TODO: Update the location of this view
mPosX += dx;
mPosY += dy;
mLayoutParams.leftMargin += dx;
mLayoutParams.topMargin += dy;
super.setLayoutParams(mLayoutParams);
invalidate();
mLastTouchX = x;
mLastTouchY = y;
break;
}
case MotionEvent.ACTION_UP: {
mActivePointerId = INVALID_POINTER_ID;
break;
}
case MotionEvent.ACTION_CANCEL: {
mActivePointerId = INVALID_POINTER_ID;
break;
}
case MotionEvent.ACTION_POINTER_UP: {
final int pointerIndex = MotionEventCompat.getActionIndex(ev);
final int pointerID = MotionEventCompat.getPointerId(ev, pointerIndex);
if (pointerID == mActivePointerId) {
// This was our active pointer going up. Choose a new active pointer and
// adjust accordingly
final int newPointerIndex = pointerIndex == 0 ? 1 : 0;
//mLastTouchX = MotionEventCompat.getX(ev, newPointerIndex);
//mLastTouchY = MotionEventCompat.getY(ev, newPointerIndex);
mLastTouchX = ev.getRawX();
mLastTouchY = ev.getRawY();
mActivePointerId = MotionEventCompat.getPointerId(ev, newPointerIndex);
}
break;
}
}
return true;
}
private class CustomScaleListener extends ScaleGestureDetector.SimpleOnScaleGestureListener {
@Override
public boolean onScale(ScaleGestureDetector detector) {
mScaleFactor *= detector.getScaleFactor();
mScaleFactor = Math.max(0.1f, Math.min(mScaleFactor, 5.0f));
invalidate();
return true;
}
}
}
In MainActivity I simply instantiate a MyCustomView object and attach it to the root ViewGroup, which is a FrameLayout. The XML layout contains nothing but that FrameLayout.
public class MainActivity extends AppCompatActivity {
private ViewGroup layoutRoot;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
layoutRoot = (ViewGroup) findViewById(R.id.view_root);
final MyCustomView ivAndroid = new MyCustomView(this);
ivAndroid.setImageResource(R.mipmap.ic_launcher);
ivAndroid.setLayoutParams(new FrameLayout.LayoutParams(300, 300));
layoutRoot.addView(ivAndroid);
}
}
And here comes the problem that troubles me:
The Android Developer website uses this to obtain the coordinates of the finger that touches the picture:
final float x = MotionEventCompat.getX(ev, pointerIndex);
final float y = MotionEventCompat.getY(ev, pointerIndex);
But it works horribly! The picture moves, but it does not follow my finger exactly: it always moves LESS than my finger does, and, most importantly, it flashes.
That is why, as you can see in MyCustomView, I have commented out those lines and instead used this code:
final float x = ev.getRawX();
final float y = ev.getRawY();
While this time the picture moves smoothly in accordance with my finger, this change introduces a new problem. On the Android Developer website for dragging and scaling, there is a design principle that says:
In a drag (or scroll) operation, the app has to keep track of the original pointer (finger), even if additional fingers get placed on the screen. For example, imagine that while dragging the image around, the user places a second finger on the touch screen and lifts the first finger. If your app is just tracking individual pointers, it will regard the second pointer as the default and move the image to that location.
After I started using ev.getRawX() and ev.getRawY(), adding a second finger to the screen gives me exactly the problem stated above. But MotionEventCompat.getX(ev, pointerIndex) and MotionEventCompat.getY(ev, pointerIndex) do not.
Can somebody help me explain why it happens? I know that MotionEventCompat.getX(ev, pointerIndex) returns the coordinate after some sort of adjustment, and that ev.getRawX() returns the absolute coordinate. But I don't understand how exactly the adjustment works (Is there a formula or graphical explanation for it?). I also want to know why using MotionEventCompat.getX(...) would prevent the picture from jumping to the second finger on screen (after the first finger has been lifted).
Last but not least, the scaling code simply doesn't work AT ALL. If someone can teach me about that, it will also be greatly appreciated!
This question is long, so I will partition it into smaller bits. Also, English is not my native language, so I had some difficulties writing the answer. Comment if a part is not clear.
Can somebody help me explain why it happens?
getRawX() and getRawY() both give you the absolute pixel coordinates of the event. For the sake of backwards compatibility (from when most screens could only track one "region" at a time), they always treat the finger as the first (and only) finger interacting with the device.
Then came improvements that allowed tracking the finger ID: the MotionEventCompat.getX(ev, pointerIndex) and MotionEventCompat.getY(ev, pointerIndex) functions allow for further finesse when creating our onTouch() listeners.
Is there a formula or graphical explanation for it?
Basically, you need to take the device's screen density into consideration. For example:
float SCREEN_DENSITY = getResources().getDisplayMetrics().density;
protected void updateFrame(FrameLayout frameLayout, int h, int w, int x, int y) {
FrameLayout.LayoutParams params = new FrameLayout.LayoutParams(
(int) ((w * SCREEN_DENSITY) + 0.5),
(int) ((h * SCREEN_DENSITY) + 0.5)
);
params.leftMargin = (int) ((x * SCREEN_DENSITY) + 0.5);
params.topMargin = (int) ((y * SCREEN_DENSITY) + 0.5);
frameLayout.setLayoutParams(params);
}
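For illustration, a hypothetical call that places a 100x100 dp frame 16 dp from the top-left corner (the FrameLayout name is made up):
updateFrame(myFrameLayout, 100, 100, 16, 16); // h = 100 dp, w = 100 dp, x = 16 dp, y = 16 dp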
I also want to know why using MotionEventCompat.getX(...) would prevent the picture from jumping to the second finger on screen (after the first finger has been lifted)
If you take into consideration that the first "finger" was lifted, then the new one has a different "initial point" and a different "history". Because of that, it can report its event in relation to the movement made, not to the final position on screen. This way it won't "jump to where the finger is" but will move according to the amount of "x units" and "y units" traversed.
Last but not least, the scaling code simply doesn't work AT ALL. If someone and teach me on that it will also be greatly appreciated!
You are consuming the event (by returning true in your onTouch listener); because of that, no other listener can continue reading the event, so you cannot trigger more listeners.
If you desire, move both functions (move and resize) inside onTouch. My onTouch listener has over 1700 lines of code (because it does a lot of stuff, including programmatically creating Views and adding listeners to them), so I can't post it here, but basically (a sketch follows the list):
1 Finger = move the frame. Get raw values, and use the "updateFrame"
2 Fingers = resize the frame. Get raw values, and use the "updateFrame"
3+ Fingers = Drop first finger, suppose 2 Fingers.
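A minimal sketch of that dispatch, assuming hypothetical helper methods moveFrame() and resizeFrame() (placeholders, not from the original code) that internally call something like updateFrame():
@Override
public boolean onTouch(View v, MotionEvent ev) {
    switch (ev.getPointerCount()) {
        case 1:
            moveFrame(v, ev);    // 1 finger: drag the frame
            break;
        case 2:
            resizeFrame(v, ev);  // 2 fingers: resize the frame
            break;
        default:
            // 3+ fingers: ignore the extras and treat it as a 2-finger resize
            resizeFrame(v, ev);
            break;
    }
    return true;
}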
I am trying to make an Android paint application for finger painting and I am having trouble with moving the lines I draw.
What I tried to do was offset the path of the currently selected line by the difference between the initial finger press coordinates and the current coordinates in OnTouchEvent during ACTION_MOVE.
case MotionEvent.ACTION_MOVE:
selectline.getLine().offset(x - otherx, y - othery);
otherx and othery are set as the x and y coordinates during ACTION_MOVE and x and y are the current cursor coordinates. My lines are stored as a separate class containing the path, color, thickness and bounding box.
What I got was the shape flying off the screen in the direction of my finger without stopping at the slightest movement. I tried using a matrix to move the path, but the result was the same.
I also tried to insert a "do while" loop that would check whether the current coordinates match the center of the path's .computeBounds() rectangle, but the program crashes as soon as I move my finger.
Any help would be appreciated, thanks.
Most likely you did not use the right scale for the coordinates.
Source: Get Canvas coordinates after scaling up/down or dragging in android
float px = ev.getX() / mScaleFactor + rect.left;
float py = ev.getY() / mScaleFactor + rect.top;
// where mScaleFactor is the scale used on the canvas, and rect.left and rect.top are the coordinates of the canvas's left and top boundaries respectively
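For example, when dragging a path on a scaled canvas, the offset would be computed from these converted values rather than from the raw event coordinates. A hedged sketch, where lastPx/lastPy are hypothetical fields holding the previously converted position:
float px = ev.getX() / mScaleFactor + rect.left;
float py = ev.getY() / mScaleFactor + rect.top;
selectline.getLine().offset(px - lastPx, py - lastPy); // move by the canvas-space delta
lastPx = px; // remember the converted position for the next ACTION_MOVE
lastPy = py;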
It's a bit late, but it may solve others' problems. I solved this issue like this:
Get the initial X,Y position in onLongPress:
public void onLongPress(MotionEvent motionEvent) {
try {
shapeDrag = true;
SmEventX = getReletiveX(motionEvent);
SmEventY = getReletiveY(motionEvent);
} catch (Exception e) {
e.printStackTrace();
}
}
and then in onTouch(MotionEvent event):
case MotionEvent.ACTION_MOVE: {
actionMoveEvent(motionEvent);
try {
if (shapeDrag) {
StylePath sp = alStylePaths
.get(alStylePaths.size() - 1);
Path mpath = sp.getPath();
float tempX = getReletiveX(motionEvent) - SmEventX;
float tempY = getReletiveY(motionEvent) - SmEventY;
mpath.offset(tempX, tempY);
SmEventX = getReletiveX(motionEvent);
SmEventY = getReletiveY(motionEvent);
}
} catch (Exception e) {
e.printStackTrace();
}
break;
}
}
I faced the same trouble, and in my case it was a very naive mistake. Since the description of the "symptoms" matches exactly (shape flying off the screen in the direction of the finger at the slightest movement, shape moved correctly at the ACTION_UP event), I think the reason behind it might be the same.
Basically the problem is in the update of the touch position coordinates within the ACTION_MOVE event. If you don't update the last touch position, the calculated distance will be always between the current touch position and the first touch position stored at ACTION_DOWN event: if you apply this offset consecutively to the path, the translation will sum up and consequently the shape will "fly" rapidly off the screen.
The solution is then quite simple: just update the last touch position at the end of the ACTION_MOVE event:
float mLastTouchX, mLastTouchY;
@Override
public boolean onTouchEvent(MotionEvent ev) {
final int action = ev.getAction();
switch (action) {
case MotionEvent.ACTION_DOWN: {
// get touch position
final float x = ev.getX();
final float y = ev.getY();
// save the initial touch position
mLastTouchX = x;
mLastTouchY = y;
break;
}
case MotionEvent.ACTION_MOVE: {
// get touch position
final float x = ev.getX();
final float y = ev.getY();
// calculate the distance moved
final float dx = x - mLastTouchX;
final float dy = y - mLastTouchY;
// here apply translation to the path
// update touch position for the next move event
mLastTouchX = x;
mLastTouchY = y;
break;
}
}
return true;
}
Hope this helps.
I'm trying to do pinch-to-zoom on an ImageView. The problem is that when I zoom, it doesn't scale around where I pinched; it scales up toward the left corner. I'm not sure why it does this, and it seems there are a lot of people having the same problem, but I haven't found a solution to it yet.
public override bool OnTouchEvent(MotionEvent ev)
{
_scaleDetector.OnTouchEvent(ev);
MotionEventActions action = ev.Action & MotionEventActions.Mask;
int pointerIndex;
switch (action)
{
case MotionEventActions.Down:
_lastTouchX = ev.GetX();
_lastTouchY = ev.GetY();
_activePointerId = ev.GetPointerId(0);
break;
case MotionEventActions.Move:
pointerIndex = ev.FindPointerIndex(_activePointerId);
float x = ev.GetX(pointerIndex);
float y = ev.GetY(pointerIndex);
if (!_scaleDetector.IsInProgress)
{
// Only move if the ScaleGestureDetector isn't already processing a gesture.
float deltaX = x - _lastTouchX;
float deltaY = y - _lastTouchY;
_posX += deltaX;
_posY += deltaY;
Invalidate();
}
_lastTouchX = x;
_lastTouchY = y;
break;
case MotionEventActions.Up:
case MotionEventActions.Cancel:
// We no longer need to keep track of the active pointer.
_activePointerId = InvalidPointerId;
break;
case MotionEventActions.PointerUp:
// check to make sure that the pointer that went up is for the gesture we're tracking.
pointerIndex = (int) (ev.Action & MotionEventActions.PointerIndexMask) >> (int) MotionEventActions.PointerIndexShift;
int pointerId = ev.GetPointerId(pointerIndex);
if (pointerId == _activePointerId)
{
// This was our active pointer going up. Choose a new
// action pointer and adjust accordingly
int newPointerIndex = pointerIndex == 0 ? 1 : 0;
_lastTouchX = ev.GetX(newPointerIndex);
_lastTouchY = ev.GetY(newPointerIndex);
_activePointerId = ev.GetPointerId(newPointerIndex);
}
break;
}
return true;
}
protected override void OnDraw(Canvas canvas)
{
base.OnDraw(canvas);
canvas.Save();
canvas.Translate(_posX, _posY);
canvas.Scale(_scaleFactor, _scaleFactor, _lastTouchX, _lastTouchY);
_icon.Draw(canvas);
canvas.Restore();
}
I think it might have to do with this code at the beginning of the class, where the bounds of the image are set to 0. But if I delete that code, the image won't render.
public GestureRecognizer(Context context, ImageView imgview)
: base(context, null, 0)
{
//_icon = context.Resources.GetDrawable(Resource.Drawable.ic_launcher);
_icon = imgview.Drawable;
_icon.SetBounds(0, 0, _icon.IntrinsicWidth, _icon.IntrinsicHeight);
_scaleDetector = new ScaleGestureDetector(context, new MyScaleListener(this));
}
Have a look at the MultiTouchController library. It makes it much easier to write multitouch applications for Android, and pinch-to-zoom, rotate, and drag are nicely implemented. Here is the link.
I am trying to move an ImageView (not rotate). The movement is supposed to be on the edge of a circle. This circle is also an image view.
Based on the ACTION_MOVE event in onTouch, I am trying to move it.
Now the dilemma is that the user may not move the finger in a perfectly circular fashion, but I would like to make sure that the image still moves around the edge of this circle.
I am currently using the following inside ACTION_MOVE:
mCurrTempIndicator.setTranslationX(event.getX());
mCurrTempIndicator.setTranslationY(event.getY());
But this will not move in a perfect circle.
Could someone please help?
UPDATE: code
@Override
public boolean onTouch(View v, MotionEvent event) {
switch (event.getAction()) {
case MotionEvent.ACTION_DOWN:
mInitialX = event.getX();
mInitialY = event.getY();
break;
case MotionEvent.ACTION_MOVE:
mEndX = event.getX();
mEndY = event.getY();
float deltaX = mEndX - mInitialX;
float deltaY = mEndY - mInitialY;
double angleInDegrees = Math.atan(deltaY / deltaX) * 180 / Math.PI;
mInitialX = mEndX;
mInitialY = mEndY;
mCurrTempIndicator.setRotation((float)angleInDegrees);
mCurrTempIndicator.setTranslationX((float)(310*(Math.cos(angleInDegrees))));
mCurrTempIndicator.setTranslationY((float)(310*(Math.sin(angleInDegrees))));
break;
case MotionEvent.ACTION_UP:
allowRotating = true;
break;
}
return true;
}
Calculate the center point of the circle
Get the current touch point
Calculate the angle between the center and the new touch point
Calculate the point on the circle using angle and radius of circle (x = r * cos(angle), y = r * sin(angle)).
Reset the image position to the new point.
To get the angle use the below equation
deltaY = P2_y - P1_y
deltaX = P2_x - P1_x
angleInDegrees = arctan(deltaY / deltaX) * 180 / PI
//Code inside ACTION_MOVE case
mInitialX = event.getX();
mInitialY = event.getY();
float deltaX = mInitialX - circleCenter.x; // vector from the circle center to the touch point
float deltaY = mInitialY - circleCenter.y;
double angleInRadian = Math.atan2(deltaY, deltaX);
PointF pointOnCircle = new PointF();
pointOnCircle.x = circleCenter.x + ((float) (circleRadius * Math.cos(angleInRadian)));
pointOnCircle.y = circleCenter.y + ((float) (circleRadius * Math.sin(angleInRadian)));
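The computed point can then be applied to the dragged view. A hedged sketch, assuming mCurrTempIndicator from the question and that pointOnCircle is expressed in the coordinate space of the view's parent:
// Center the indicator on the computed point of the circle
mCurrTempIndicator.setX(pointOnCircle.x - mCurrTempIndicator.getWidth() / 2f);
mCurrTempIndicator.setY(pointOnCircle.y - mCurrTempIndicator.getHeight() / 2f);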
I've created an onTouchListener for dragging Views. Images drag smoothly if I use getRawX() and getRawY(). The problem with that is the image will jump to the second pointer when you place a second pointer down then lift the first pointer.
This onTouchListener attempts to fix that issue by keeping track of the pointerId. The problem with this onTouchListener is that while dragging an ImageView, the ImageView jumps around pretty crazily; the getX() and getY() values jump around.
I feel like I'm doing this correctly. I don't want to have to write a custom view for this because I've already implemented a scaleGestureDetector and written a custom rotateGestureDetector that work. Everything works fine but I need to fix the issue I get when using getRawX() and getRawY().
Does anybody know what I'm doing wrong here?
Here's my onTouchListener:
final View.OnTouchListener onTouchListener = new View.OnTouchListener()
{
@Override
public boolean onTouch(View v, MotionEvent event)
{
relativeLayoutParams = (RelativeLayout.LayoutParams) v.getLayoutParams();
final int action = event.getAction();
switch (action & MotionEvent.ACTION_MASK)
{
case MotionEvent.ACTION_DOWN:
{
final float x = event.getX();
final float y = event.getY();
// Where the user started the drag
lastX = x;
lastY = y;
activePointerId = event.getPointerId(0);
break;
}
case MotionEvent.ACTION_MOVE:
{
// Where the user's finger is during the drag
final int pointerIndex = event.findPointerIndex(activePointerId);
final float x = event.getX(pointerIndex);
final float y = event.getY(pointerIndex);
// Calculate change in x and change in y
final float dx = x - lastX;
final float dy = y - lastY;
// Update the margins to move the view
relativeLayoutParams.leftMargin += dx;
relativeLayoutParams.topMargin += dy;
v.setLayoutParams(relativeLayoutParams);
// Save where the user's finger was for the next ACTION_MOVE
lastX = x;
lastY = y;
v.invalidate();
break;
}
case MotionEvent.ACTION_UP:
{
activePointerId = INVALID_POINTER_ID;
break;
}
case MotionEvent.ACTION_CANCEL:
{
activePointerId = INVALID_POINTER_ID;
break;
}
case MotionEvent.ACTION_POINTER_UP:
{
// Extract the index of the pointer that left the touch sensor
final int pointerIndex = (action & MotionEvent.ACTION_POINTER_INDEX_MASK) >> MotionEvent.ACTION_POINTER_INDEX_SHIFT;
final int pointerId = event.getPointerId(pointerIndex);
if(pointerId == activePointerId)
{
// This was our active pointer going up. Choose a new
// active pointer and adjust accordingly
final int newPointerIndex = pointerIndex == 0 ? 1 : 0;
lastX = (int) event.getX(newPointerIndex);
lastY = (int) event.getY(newPointerIndex);
activePointerId = event.getPointerId(newPointerIndex);
}
break;
}
}
return true;
}
};
image1.setOnTouchListener(onTouchListener);
The issue was simple, but unexpected. getRawX/Y() returns absolute coordinates, while getX/Y() returns coordinates relative to the view. I would move the view and reset lastX/Y, but the view was no longer in the same spot, so the new values I got were off. In this case I only needed where I originally pressed the image (not the case when using getRawX/Y()).
So, the solution was to simply remove the following:
// Save where the user's finger was for the next ACTION_MOVE
lastX = x;
lastY = y;
I hope this will help somebody in the future, because I've seen others with this problem, and they had code similar to mine (resetting lastX/Y).
After a lot of research, I found this to be the issue: getRawX is absolute and getX is relative to the view. Hence, use this to transform one into the other:
// rawX = getX + view.getX
event.getRawX() == event.getX(event.findPointerIndex(ptrID1)) + view.getX()
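A hedged variant that also works for views nested deeper in the hierarchy: convert the raw screen coordinates into view-local ones with getLocationOnScreen() (variable names here are illustrative):
int[] loc = new int[2];
view.getLocationOnScreen(loc);            // absolute position of the view's top-left corner
float localX = event.getRawX() - loc[0];  // screen coordinate -> view-local coordinate
float localY = event.getRawY() - loc[1];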
I have one tip and I think it will help.
invalidate(): only redraws a view; it doesn't change the view's size/position.
What you should use is
requestLayout(): it does the measuring and layout process, and I think requestLayout() will be called when you call setLayoutParams().
So try removing v.invalidate(),
or try using the view.layout(left, top, right, bottom) method instead of setting LayoutParams.
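As a rough illustration, a move inside ACTION_MOVE could then look like this (dx/dy being the per-event deltas computed as in the listener above); note that positions set with layout() only last until the next layout pass:
// Shift the view by the drag delta without going through LayoutParams
v.layout(v.getLeft() + (int) dx,
         v.getTop() + (int) dy,
         v.getRight() + (int) dx,
         v.getBottom() + (int) dy);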