I am using the Multitouch Controller made by Luke Hutchison found here
https://code.google.com/p/android-multitouch-controller/
My problem is this: I can place an image onto the canvas, and I can select it, move it, rotate it and scale it perfectly fine. However, I also allow the user to move the canvas by translating it on drag. On screen the images appear in the correct location after the whole canvas has been translated; the problem is trying to select them afterwards.
I don't know whether the issue is that the touch point doesn't take the new canvas location/offset into account after the translation, or that the locations of the images haven't been updated with the canvas translation. Does anyone have experience with this, or can anyone point me in the right direction?
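To illustrate what I mean, I assume the touch point would have to be mapped back into canvas space before hit-testing, roughly like this (just a sketch; mOffsetX/mOffsetY stand for whatever drag translation is applied to the canvas, and selectImageAt() is a placeholder for the hit test):

    // Sketch only: mOffsetX/mOffsetY are the values passed to
    // canvas.translate(mOffsetX, mOffsetY) in onDraw(); selectImageAt() is a placeholder.
    @Override
    public boolean onTouchEvent(MotionEvent event) {
        // Undo the canvas translation so the touch is in the same space as the images.
        float canvasX = event.getX() - mOffsetX;
        float canvasY = event.getY() - mOffsetY;
        boolean hit = selectImageAt(canvasX, canvasY);
        return hit || super.onTouchEvent(event);
    }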
I'm struggling with something I would expect to be straightforward with libgdx.
In short this is a "finger paint" app where I want to draw a path of a certain width where the user touches the screen.
I've previously done this using a plain Android android.view.View. I had an android.graphics.Path in which I stored the coordinates of the user's current touch. In the view's onDraw() method I drew the path to the android.graphics.Canvas. Whenever the user lifted a finger I drew the path to an offline canvas/android.graphics.Bitmap, which was also drawn in onDraw(). Plain and simple.
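Roughly like this (a simplified outline, not my exact code):

    import android.content.Context;
    import android.graphics.Bitmap;
    import android.graphics.Canvas;
    import android.graphics.Paint;
    import android.graphics.Path;
    import android.view.MotionEvent;
    import android.view.View;

    public class PaintView extends View {
        private final Path mPath = new Path();          // stroke currently being drawn
        private final Paint mPaint = new Paint();
        private Bitmap mBitmap;                          // "offline" bitmap with finished strokes
        private Canvas mBitmapCanvas;

        public PaintView(Context context) {
            super(context);
            mPaint.setStyle(Paint.Style.STROKE);
            mPaint.setStrokeWidth(12f);
        }

        @Override
        protected void onSizeChanged(int w, int h, int oldw, int oldh) {
            mBitmap = Bitmap.createBitmap(w, h, Bitmap.Config.ARGB_8888);
            mBitmapCanvas = new Canvas(mBitmap);
        }

        @Override
        public boolean onTouchEvent(MotionEvent event) {
            switch (event.getActionMasked()) {
                case MotionEvent.ACTION_DOWN: mPath.moveTo(event.getX(), event.getY()); break;
                case MotionEvent.ACTION_MOVE: mPath.lineTo(event.getX(), event.getY()); break;
                case MotionEvent.ACTION_UP:
                    mBitmapCanvas.drawPath(mPath, mPaint);  // commit the stroke offline
                    mPath.reset();
                    break;
            }
            invalidate();
            return true;
        }

        @Override
        protected void onDraw(Canvas canvas) {
            canvas.drawBitmap(mBitmap, 0, 0, null);  // finished strokes
            canvas.drawPath(mPath, mPaint);          // stroke in progress
        }
    }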
How can that be done using libgdx?
I have tried using a com.badlogic.gdx.graphics.Pixmap that I draw a line to whenever the user moves a finger. This works well except that I'm unable to control the width of the line using Gdx.gl.glLineWidth(). I know I could draw a rectangle instead of a line to get the width, but Pixmap doesn't seem to have any means of rotating, so I don't see how this can be done.
I can use a com.badlogic.gdx.graphics.glutils.ShapeRenderer for drawing lines (or rectangles) in com.badlogic.gdx.Screen.render(). As far as I can see I would then need to store every single touch point of the current touch and draw all the lines on each render. Whenever the user releases a finger I guess I could store the screen as-is with something like com.badlogic.gdx.utils.ScreenUtils.getFrameBufferPixmap(). Hopefully there is an easier way to achieve what I want.
I ended up drawing circles on a pixmap where the line should be:
One circle on touchDown
Several circles from the last touch point to the next touch point reported to touchDragged
I'm not very happy with this solution, but it kinda works.
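Roughly like this (a simplified sketch; it assumes the pixmap matches the screen size and ignores the y-flip between screen and pixmap coordinates, which you may need to handle):

    import com.badlogic.gdx.InputAdapter;
    import com.badlogic.gdx.graphics.Color;
    import com.badlogic.gdx.graphics.Pixmap;

    public class StampInput extends InputAdapter {
        private final Pixmap pixmap;
        private final int radius;
        private int lastX, lastY;

        public StampInput(Pixmap pixmap, int radius) {
            this.pixmap = pixmap;
            this.radius = radius;
            pixmap.setColor(Color.BLACK);
        }

        @Override
        public boolean touchDown(int screenX, int screenY, int pointer, int button) {
            pixmap.fillCircle(screenX, screenY, radius);  // one circle on touchDown
            lastX = screenX;
            lastY = screenY;
            return true;
        }

        @Override
        public boolean touchDragged(int screenX, int screenY, int pointer) {
            // Fill the gap between the last and current touch point with circles,
            // otherwise fast drags leave holes in the line.
            float dx = screenX - lastX, dy = screenY - lastY;
            int steps = Math.max(1, (int) (Math.sqrt(dx * dx + dy * dy) / (radius / 2f)));
            for (int i = 1; i <= steps; i++) {
                pixmap.fillCircle((int) (lastX + dx * i / steps), (int) (lastY + dy * i / steps), radius);
            }
            lastX = screenX;
            lastY = screenY;
            return true;
        }
    }

The pixmap still has to be uploaded to the texture again after it changes (e.g. texture.draw(pixmap, 0, 0)) before rendering it.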
Maybe you can calculate the line's corner coordinates manually and use triangles to draw it on the pixmap? For example, use two triangles to form a (rotated) box?
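Something like this sketch, using Pixmap.fillTriangle() (coordinates are assumed to already be in pixmap space):

    // Draw a "thick line" from (x1,y1) to (x2,y2) as a rotated box built from two triangles.
    static void drawThickLine(Pixmap pixmap, float x1, float y1, float x2, float y2, float width) {
        float dx = x2 - x1, dy = y2 - y1;
        float len = (float) Math.sqrt(dx * dx + dy * dy);
        if (len == 0) return;
        // Unit normal of the segment, scaled to half the desired width.
        float nx = -dy / len * width / 2f;
        float ny =  dx / len * width / 2f;
        // The four corners of the rotated box.
        int ax = (int) (x1 + nx), ay = (int) (y1 + ny);
        int bx = (int) (x1 - nx), by = (int) (y1 - ny);
        int cx = (int) (x2 - nx), cy = (int) (y2 - ny);
        int ex = (int) (x2 + nx), ey = (int) (y2 + ny);
        pixmap.fillTriangle(ax, ay, bx, by, cx, cy);
        pixmap.fillTriangle(cx, cy, ex, ey, ax, ay);
    }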
Using an example found here:
http://www.vogella.com/tutorials/AndroidTouch/article.html I'm able to draw a path every time a finger touches the screen, and that works great. Is there any way I can capture that drawn path and reproduce it in another view at other coordinates? For example, capturing the full drawn image and reproducing it, scaled, in a segment of the other view?
Sorry for my bad English.
You can capture your drawing and save it as a Bitmap. Then you can scale it with a Matrix (or by scaling the Canvas) and redraw it wherever you want to draw it.
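For example (a sketch; the class and method names are just placeholders):

    import android.graphics.Bitmap;
    import android.graphics.Canvas;
    import android.graphics.Rect;
    import android.view.View;

    public final class ViewSnapshot {

        // Render whatever the given view currently draws into a Bitmap.
        public static Bitmap capture(View source) {
            Bitmap snapshot = Bitmap.createBitmap(source.getWidth(), source.getHeight(),
                    Bitmap.Config.ARGB_8888);
            source.draw(new Canvas(snapshot));
            return snapshot;
        }

        // Draw the captured bitmap scaled into an arbitrary rectangle, e.g. from the
        // other view's onDraw(). Passing null as the source rect uses the whole bitmap.
        public static void drawScaled(Canvas canvas, Bitmap snapshot, Rect dst) {
            canvas.drawBitmap(snapshot, null, dst, null);
        }
    }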
I am developing an application with drawing and so on. One problem I'm facing is that I want single-touch image scaling. I have implemented image scaling with multi-touch, but I'm stuck with single touch.
Example:
I have a Rectangle on the canvas. Now I want to scale and rotate that Rectangle from one of its corners with one finger. I've searched for this on Google but haven't found good material or a demo.
Please help me with this.
For zooming with a single touch, this will help you:
http://developer.sonymobile.com/wp/2010/05/18/android-one-finger-zoom-tutorial-part-1/
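If you end up implementing it yourself, the usual approach for a one-finger corner handle is to pivot around the rectangle's center and derive scale and rotation from the finger's distance and angle relative to that center. A rough sketch (not taken from the tutorial above):

    // Sketch of the "one-finger corner handle" math: the gesture pivots around the
    // rectangle's center; scale and rotation follow from how far and how much around
    // the finger has moved since it grabbed the corner.
    public class CornerHandleGesture {
        private final float centerX, centerY;   // rectangle center (pivot)
        private float startDist, startAngle;

        public CornerHandleGesture(float centerX, float centerY) {
            this.centerX = centerX;
            this.centerY = centerY;
        }

        public void onDown(float touchX, float touchY) {
            startDist = (float) Math.hypot(touchX - centerX, touchY - centerY);
            startAngle = angleTo(touchX, touchY);
        }

        // Scale factor relative to where the finger first grabbed the corner.
        public float currentScale(float touchX, float touchY) {
            return (float) Math.hypot(touchX - centerX, touchY - centerY) / startDist;
        }

        // Rotation delta in degrees since the finger grabbed the corner.
        public float currentRotation(float touchX, float touchY) {
            return angleTo(touchX, touchY) - startAngle;
        }

        private float angleTo(float x, float y) {
            return (float) Math.toDegrees(Math.atan2(y - centerY, x - centerX));
        }
    }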
I'm trying to build an Android app that does finger painting and must also be able to manage zooming. I have an image in the background and I draw over it, but when I zoom in and out I have to resize both the image and the drawn path, and I don't know how. I used the Zdnet code to implement the pinch-to-zoom gesture and it works great for the background image, so I thought the only remaining step for the "path resize" was to call mPath.transform(matrix) at the end of onTouch(), but that doesn't let me draw properly and doesn't resize the path.
How could I do it?
Note: the code to implement the finger paint is that of Google.
mPath.transform() should work; it's hard to tell what's wrong without seeing your code.
Alternatively, you could use Canvas.scale if you want to scale everything drawn by the same factor.
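Roughly like this (a sketch; mScaleFactor, mFocusX/mFocusY, mBackgroundBitmap, mPath and mPaint stand in for your own fields):

    @Override
    protected void onDraw(Canvas canvas) {
        canvas.save();
        // One scale applied to everything drawn afterwards: background and path zoom together.
        canvas.scale(mScaleFactor, mScaleFactor, mFocusX, mFocusY);
        canvas.drawBitmap(mBackgroundBitmap, 0, 0, null);
        canvas.drawPath(mPath, mPaint);
        canvas.restore();
    }

Just remember that the touch coordinates you feed into mPath then have to be mapped back into the unscaled coordinate space, otherwise new strokes will land in the wrong place while zoomed.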
I have two overlapping bitmaps. The top bitmap is transparent, and when the user touches the screen I copy pixels from the other bitmap into the top bitmap. My goal is to give users the feeling of erasing the image with their finger to reveal another image underneath. However, it doesn't work properly, especially when the user drags a finger across the screen too fast. I've run a few tests and I believe that drawing the bitmaps to the canvas every time is causing the lag, but I don't know how to fix it.
It looks like a Canvas or Bitmap isn't redrawn until you wipe it yourself. That means you just have to change the alpha value of the pixels in the top Canvas/Bitmap that are being "painted", rather than redrawing the entire canvas for every update.
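One common way to do that (a sketch; topBitmap, erasePath and event stand in for your own objects) is to punch transparent pixels into the top bitmap along the touch path with a PorterDuff.Mode.CLEAR paint, so onDraw() only has to layer the two bitmaps:

    // Set up once: an "eraser" paint that clears pixels (sets their alpha to 0).
    Paint eraser = new Paint();
    eraser.setStyle(Paint.Style.STROKE);
    eraser.setStrokeWidth(60f);
    eraser.setStrokeCap(Paint.Cap.ROUND);
    eraser.setXfermode(new PorterDuffXfermode(PorterDuff.Mode.CLEAR));

    Canvas topCanvas = new Canvas(topBitmap);  // topBitmap must be a mutable ARGB_8888 bitmap

    // On each touch move: extend the path and clear those pixels in the top bitmap only.
    erasePath.lineTo(event.getX(), event.getY());
    topCanvas.drawPath(erasePath, eraser);
    invalidate();

    // onDraw() then just draws the bottom bitmap first and the top bitmap over it;
    // the cleared areas of the top bitmap let the bottom image show through.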