I am developing an RDP application for Android 3.0. I am using a SurfaceView to display the images. Is there any way to implement pinch-zoom functionality for a SurfaceView? (I am trying to zoom the image.)
Subclass SurfaceView and implement onTouchEvent(). Use a GestureDetector and GestureListener to modify a Matrix representing the current state of manipulation, and draw your canvas transformed by this matrix.
It works, but it is quite a piece of software to figure out. It took me around six weeks to get everything set up correctly, especially if you want overlay items that also listen to touch events. I'd think about publishing that code, but it's company code...
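A minimal sketch of what that can look like, assuming a ScaleGestureDetector for the pinch (class and method names here are my own, not from the answer above):

import android.content.Context;
import android.graphics.Bitmap;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Matrix;
import android.view.MotionEvent;
import android.view.ScaleGestureDetector;
import android.view.SurfaceView;

public class ZoomSurfaceView extends SurfaceView {
    private final Matrix matrix = new Matrix();
    private final ScaleGestureDetector scaleDetector;

    public ZoomSurfaceView(Context context) {
        super(context);
        scaleDetector = new ScaleGestureDetector(context,
                new ScaleGestureDetector.SimpleOnScaleGestureListener() {
            @Override
            public boolean onScale(ScaleGestureDetector detector) {
                // Fold the pinch into the matrix, centered on the gesture focus
                matrix.postScale(detector.getScaleFactor(), detector.getScaleFactor(),
                        detector.getFocusX(), detector.getFocusY());
                return true;
            }
        });
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        return scaleDetector.onTouchEvent(event);
    }

    // Call from your render loop with the latest frame bitmap
    public void drawFrame(Bitmap frame) {
        Canvas canvas = getHolder().lockCanvas();
        if (canvas == null) return;
        try {
            canvas.drawColor(Color.BLACK);
            canvas.drawBitmap(frame, matrix, null); // drawing modified by the matrix
        } finally {
            getHolder().unlockCanvasAndPost(canvas);
        }
    }
}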
I'm building an Android game, and this is more of a best-practices and performance question.
For the bitmaps that the user interacts with, should I place each of them in an ImageView and set an OnTouchListener on each individually, or should I just draw them onto the canvas and use the custom view's onTouchEvent() to get the x and y of the touch and check whether it falls within the bounds of any of the bitmaps?
My custom view takes up the entire screen, and I don't know if it is even possible to draw an ImageView onto the screen using a canvas, which is why for now I just use the onTouch method.
Thanks for any insight.
Depending on how dynamic your bitmap rendering will be, you should go for either GLSurfaceView or SurfaceView. For something as simple as bitmaps, I would recommend using SurfaceView as the "renderer", since you can get the canvas from it, and of course you can draw on the whole screen if your SurfaceView matches the screen size.
Touch handling should be kept completely separate, in its own listener, promoting encapsulation and reuse of code in future apps. I've found it quite helpful in the last few games I've developed to assign a SurfaceView as the renderer and just create objects that take the canvas as a parameter and draw themselves onto it, as sketched below. The only thing you have to take into account is bitmap resource management, but if you are careful to create and release bitmaps at the proper times, you should have no problems...
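For illustration, a rough sketch of that pattern (the interface and class names are mine, not from any real library):

import android.graphics.Bitmap;
import android.graphics.Canvas;
import android.view.SurfaceHolder;
import java.util.List;

interface Drawable2D {
    void draw(Canvas canvas);
}

class Sprite implements Drawable2D {
    private final Bitmap bitmap;
    private final float x, y;

    Sprite(Bitmap bitmap, float x, float y) {
        this.bitmap = bitmap;
        this.x = x;
        this.y = y;
    }

    @Override
    public void draw(Canvas canvas) {
        // The object draws itself into whatever canvas it is handed
        canvas.drawBitmap(bitmap, x, y, null);
    }
}

class SceneRenderer {
    // Lock the SurfaceView's canvas once per frame and pass it to every object
    void renderFrame(SurfaceHolder holder, List<Drawable2D> scene) {
        Canvas canvas = holder.lockCanvas();
        if (canvas == null) return;
        try {
            for (Drawable2D d : scene) {
                d.draw(canvas);
            }
        } finally {
            holder.unlockCanvasAndPost(canvas);
        }
    }
}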
Regards!
I am implementing a simple paint application (finger paint for Android). I would like to use two fingers for both scrolling and zooming. Should I use different listeners and detectors for each? The problem is that when I try to scroll with two fingers, zooming kicks in and the screen just jumps. Can you please point me to some implementations?
Thanks
You can use ScaleGestureDetector to easily detect things like pinch-to-zoom. Go through this Android developers blog post for a detailed explanation.
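A hedged sketch of one common way to combine a ScaleGestureDetector with a GestureDetector so two-finger scrolling and pinching don't fight each other (the view and field names are mine): feed every event to both detectors, and skip panning while a scale gesture is in progress.

import android.content.Context;
import android.view.GestureDetector;
import android.view.MotionEvent;
import android.view.ScaleGestureDetector;
import android.view.View;

class PaintView extends View {
    private float scaleFactor = 1f, offsetX = 0f, offsetY = 0f;
    private final ScaleGestureDetector scaleDetector;
    private final GestureDetector scrollDetector;

    PaintView(Context context) {
        super(context);
        scaleDetector = new ScaleGestureDetector(context,
                new ScaleGestureDetector.SimpleOnScaleGestureListener() {
            @Override
            public boolean onScale(ScaleGestureDetector d) {
                scaleFactor *= d.getScaleFactor();
                invalidate();
                return true;
            }
        });
        scrollDetector = new GestureDetector(context,
                new GestureDetector.SimpleOnGestureListener() {
            @Override
            public boolean onScroll(MotionEvent e1, MotionEvent e2,
                                    float dx, float dy) {
                if (scaleDetector.isInProgress()) return false; // no pan mid-pinch
                offsetX -= dx;
                offsetY -= dy;
                invalidate();
                return true;
            }
        });
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        scaleDetector.onTouchEvent(event);
        scrollDetector.onTouchEvent(event);
        return true;
    }

    // In onDraw(), apply the transform before drawing your strokes:
    // canvas.translate(offsetX, offsetY); canvas.scale(scaleFactor, scaleFactor);
}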
I'm developing an Android game with the standard pattern of SurfaceView, SurfaceHolder, Canvas, etc. In this game I will be drawing multiple bitmaps on the canvas, and I would like to know when they've been touched. I am aware that I can override onTouchEvent() in my SurfaceView and linearly walk through all my items to determine whether the touch coordinates fall within the area of one of my bitmaps, but I was wondering if I could make things easier on myself.
Here's my question:
Can I have the classes that handle drawing my bitmaps on the screen descend from android.view.View, attach them to my SurfaceView, and register click listeners on them so I don't have to do the hit detection myself? Will the click listeners be accurate if I am drawing the views (bitmaps) to the canvas myself?
Furthermore, would I suffer a performance hit from making all of my bitmaps Android views, bogging down the view hierarchy and making Android do work I didn't want it to?
Thanks.
If you want to attach normal Android views to a SurfaceView, then what is the point of using a SurfaceView? You are given a SurfaceView so that you can draw on the surface yourself rather than having the Android view system do it for you. So "you" have to keep track of your touches yourself.
You can refer to the AOSP Gallery2 code, which is purely OpenGL based, and incorporate its design for handling touches.
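If you do end up walking the list yourself, it is less work than it sounds. A minimal sketch inside the SurfaceView subclass (the sprite type and its methods are hypothetical, just to show the shape of it):

@Override
public boolean onTouchEvent(MotionEvent event) {
    if (event.getAction() == MotionEvent.ACTION_DOWN) {
        int x = (int) event.getX();
        int y = (int) event.getY();
        // Walk front-to-back so the topmost bitmap wins the hit test
        for (GameSprite sprite : sprites) {
            if (sprite.getBounds().contains(x, y)) { // android.graphics.Rect
                sprite.onTapped();
                return true;
            }
        }
    }
    return super.onTouchEvent(event);
}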
I have created a guidance application which contains a map. The map is drawn dynamically using data from the database [rectangle coordinates]. To draw the map I have used a View class and overridden the onDraw method.
The problem is that I can't find a way to implement zoom functionality. I have already used the GestureDetector class to handle onLongPress and the onTouchEvent methods. I was thinking of pinch-to-zoom functionality but have no idea how to implement it.
Looking forward to some great ideas. Thank you!
Classes extended: View, GestureDetector
Can you give us some more information about how you are drawing this dynamic map? For now, I'll make an assumption and run with it. If you are drawing shapes on a canvas you can point the canvas at a Bitmap to draw onto:
Bitmap myDynamicMapBitmap = Bitmap.createBitmap(MAP_WIDTH, MAP_HEIGHT, Bitmap.Config.ARGB_8888);
Canvas mapCanvas = new Canvas();
mapCanvas.setBitmap(myDynamicMapBitmap);
// Draw your map on the canvas here
Now that you have a bitmap representation of your map, you could use this open source project, which is basically an adaptation of the built-in Android photo gallery app and allows users to pan and zoom images:
ImageViewZoom on GitHub
I've used that project before in an app, and it works really well. You may need to tweak it a bit to get your desired behavior (for example, disabling panning, since you didn't say you wanted that).
Then all you really have to do is:
imageViewZoomInstance.setImageBitmap(myDynamicMapBitmap);
That view includes built-in pinch zooming. You'll probably want to merge it with the custom View you've already created.
I'm using OpenGL ES on Android 2.3.3 at the minute to render a simple 3D game. I'd like to extend it using the built-in gestures library, but I can't find a way to recognise gestures from a GLSurfaceView as opposed to a regular Android view (I don't have an XML layout, is what I'm saying).
Is there any way to either implement an XML layout on top of what I already have, or preferably to implement the gestures library on top of the GLSurfaceView instead?
Thanks.
You can attach a normal OnTouchListener to a GLSurfaceView, as long as you have an instance of GLSurfaceView (which it sounds like you have). This is only really useful if you just want the raw x, y coordinates on screen where the user has pressed (e.g. to rotate around the y-axis as the user moves their finger left/right across the screen).
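Something like this, for instance (the rotationY field is purely illustrative):

glSurfaceView.setOnTouchListener(new View.OnTouchListener() {
    private float lastX;

    @Override
    public boolean onTouch(View v, MotionEvent event) {
        if (event.getAction() == MotionEvent.ACTION_MOVE) {
            float dx = event.getX() - lastX;
            rotationY += dx * 0.5f; // e.g. spin the scene as the finger moves
            // If the renderer needs the value on the GL thread, hand it over
            // with glSurfaceView.queueEvent(...) instead of a shared field.
        }
        lastX = event.getX();
        return true;
    }
});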
As for the gesture library, which I haven't used myself, you should be able to just place your GLSurfaceView inside a FrameLayout and then place another view (e.g. a LinearLayout set to match_parent) in the same FrameLayout so that it completely covers the GLSurfaceView and sits on top of it. Attach the gesture library to this view (and obviously make sure it is transparent so people can still see the GLSurfaceView below).
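A hedged sketch of that overlay built in code, using the framework's GestureOverlayView from the android.gesture package as the covering view (variable names are mine); something like this in the activity's onCreate():

// Inside onCreate(), after creating your GLSurfaceView instance
FrameLayout root = new FrameLayout(this);
root.addView(glSurfaceView); // fills the frame by default

GestureOverlayView overlay = new GestureOverlayView(this);
overlay.setGestureStrokeType(GestureOverlayView.GESTURE_STROKE_TYPE_MULTIPLE);
root.addView(overlay, new FrameLayout.LayoutParams(
        FrameLayout.LayoutParams.MATCH_PARENT,
        FrameLayout.LayoutParams.MATCH_PARENT));
// Register your OnGesturePerformedListener on the overlay here

setContentView(root);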
The gesture library has a way of 'stealing' events from the view. See here for details.
Here, here, and here are some examples that should make it clearer.