I'm making an Android game using AndEngine. I'm making some sprites as objects, and I can rotate a sprite. But when I touch a transparent part of the image, it still rotates the sprite.
To prevent this, I need to get the Bitmap from the Sprite. Can anyone provide a method for that, so that I can convert a Sprite to a Bitmap?
I looked into this link, but it uses GLES1, and I'm working with GLES2.
For more, follow this link: bitmap for sprite.
If you have not seen it, there is a GLES2 version of pixel-perfect collision. It's an AndEngine extension called "AndEngine Collisions Extension":
https://github.com/MakersF/AndEngineCollisionsExtension
It should handle the case you are describing.
NOTE: You have to modify your copy of the AndEngine source code to use the extension.
Related
I'm developing a game in Android and Java. On Android I'm using AndEngine for the sprite image, and I was able to rotate it in all directions:
int bikeFrame;
// bikeFrame++ or bikeFrame-- as the bike tilts
bikeSprite.setRotation(bikeFrame);
I want to make the game in J2ME as well. But in J2ME we only have four fixed mirror/rotation transforms
(TRANS_MIRROR, TRANS_MIRROR_ROT90, TRANS_MIRROR_ROT270, TRANS_MIRROR_ROT180).
If I take pre-rotated images as animation frames, I still don't get smooth animation.
How can I rotate a sprite image to any angle in J2ME?
See this thread; omarhassan123 posted a code snippet that should allow you to rotate the image by any angle you like.
There is a library called J2ME ARMY KNIFE that provides all sorts of image manipulation techniques; you can get it here.
Also, see this question:
Image rotation algorithm
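If you just want the basic technique, here is a minimal, untested sketch of rotating a J2ME Image by an arbitrary angle using per-pixel inverse mapping (this is not the exact snippet from that thread; it assumes CLDC 1.1 for floating point, and ImageRotator/rotate are names I made up):
import javax.microedition.lcdui.Image;

public final class ImageRotator {

    // Returns a new Image of the same size, rotated by the given angle (degrees)
    // around its centre. Pixels that fall outside the source stay transparent.
    public static Image rotate(Image src, double degrees) {
        int w = src.getWidth();
        int h = src.getHeight();
        int[] in = new int[w * h];
        src.getRGB(in, 0, w, 0, 0, w, h);

        double rad = degrees * Math.PI / 180.0;
        double cos = Math.cos(rad);
        double sin = Math.sin(rad);
        double cx = (w - 1) / 2.0;
        double cy = (h - 1) / 2.0;

        int[] out = new int[w * h];
        for (int dy = 0; dy < h; dy++) {
            for (int dx = 0; dx < w; dx++) {
                // Rotate the destination pixel back into the source image.
                double sx = cx + (dx - cx) * cos + (dy - cy) * sin;
                double sy = cy - (dx - cx) * sin + (dy - cy) * cos;
                int ix = (int) (sx + 0.5);
                int iy = (int) (sy + 0.5);
                if (ix >= 0 && ix < w && iy >= 0 && iy < h) {
                    out[dy * w + dx] = in[iy * w + ix];
                }
            }
        }
        return Image.createRGBImage(out, w, h, true);
    }
}
It is nearest-neighbour only, so edges will look a little jagged, but it runs on plain MIDP 2.0 / CLDC 1.1 without any extra library.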
Another idea: decompile a game called Flexis Extreme. They do a lot of image rotation in real time so you could try to find out how they did it.
If you can, try LWUIT's Image.rotate; there is a sample on this page: http://lwuit.blogspot.com.br/2008/11/round-round-infinite-progress-and.html
I'm using cocos2d.
Now I've added some images in the layer, and played around a bit.
I'm trying to save the whole screen as an image file.
How can I do this?
The only way to capture the content of a SurfaceView is if you are rendering into it using OpenGL. You can use glReadPixels() to grab the content of the surface. If you are drawing onto the SurfaceView using a Canvas, you can simply create a Bitmap, create a new Canvas for that Bitmap and execute your drawing code with the new Canvas.
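Here is a minimal sketch of the glReadPixels route, assuming you call it right after your scene has been drawn (for example at the end of onDrawFrame); grabPixels is just a name I chose:
import java.nio.IntBuffer;
import javax.microedition.khronos.opengles.GL10;
import android.graphics.Bitmap;

public final class ScreenGrabber {

    // Reads the current GL colour buffer and converts it into an Android Bitmap.
    // The channel shuffle below assumes a little-endian device (the normal case on Android).
    public static Bitmap grabPixels(GL10 gl, int width, int height) {
        int[] raw = new int[width * height];
        IntBuffer buffer = IntBuffer.wrap(raw);
        buffer.position(0);
        gl.glReadPixels(0, 0, width, height, GL10.GL_RGBA, GL10.GL_UNSIGNED_BYTE, buffer);

        // glReadPixels returns RGBA rows bottom-up; convert to ARGB and flip vertically.
        int[] argb = new int[width * height];
        for (int y = 0; y < height; y++) {
            for (int x = 0; x < width; x++) {
                int p = raw[y * width + x];
                int a = p & 0xff000000;
                int r = (p << 16) & 0x00ff0000;
                int g = p & 0x0000ff00;
                int b = (p >> 16) & 0x000000ff;
                argb[(height - 1 - y) * width + x] = a | r | g | b;
            }
        }
        return Bitmap.createBitmap(argb, width, height, Bitmap.Config.ARGB_8888);
    }
}
You can then write the Bitmap out to a file with Bitmap.compress(), preferably on a background thread.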
It is my understanding that cocos2d-android also has a CCRenderTexture class with the saveBuffer method. In that case have a look at my CCRenderTexture demo program and blog post for cocos2d-iphone which gives you an example for how to create a screenshot using CCRenderTexture and saveBuffer. The same principle should be applicable to cocos2d-android.
I want to make brushes like the ones shown in the image below for a drawing application. Which is the more suitable approach, OpenGL or Canvas, and how can I implement it?
I'd say Canvas, as you'll want to modify an image. OpenGL ES is good for displaying images, but does not (as far as I know) have methods for modifying its textures (unless you render to a texture that you then render to the screen with some modification, which is not always efficient).
Using Canvas you have methods for drawing your brush strokes onto the Bitmap you're painting on. In GL ES you would have to modify a texture (by using a Canvas) and upload it to the GPU again before it could be rendered, and the rendering would most likely just consist of drawing a quad with your texture on it (and since the fill rate of most mobile GPUs is quite poor, you don't want to draw the strokes separately).
What I'm trying to say is: the most convenient way to let the user draw on an OpenGL ES surface is to build the texture by drawing on a Canvas.
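As a rough illustration of that Canvas approach, here is a sketch that stamps a brush bitmap along a recorded stroke; the brush drawable, sizes and stroke points are my own assumptions, not something from the question:
import android.content.res.Resources;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.graphics.Canvas;
import android.graphics.Paint;
import android.graphics.PointF;
import java.util.List;

public final class BrushPainter {

    // Stamps a soft brush bitmap along the recorded touch positions into an
    // off-screen Bitmap, which you can draw to the screen or upload as a texture.
    // R.drawable.brush_soft is a placeholder for your own alpha-channel brush PNG.
    public static Bitmap paintStroke(Resources res, List<PointF> strokePoints, int width, int height) {
        Bitmap target = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
        Canvas canvas = new Canvas(target);

        Paint paint = new Paint(Paint.ANTI_ALIAS_FLAG | Paint.FILTER_BITMAP_FLAG);
        paint.setAlpha(160); // semi-transparent so overlapping stamps build up like a real brush

        Bitmap brush = BitmapFactory.decodeResource(res, R.drawable.brush_soft);

        // In practice you would interpolate between successive MotionEvent samples
        // so that fast strokes stay continuous instead of leaving gaps.
        for (int i = 0; i < strokePoints.size(); i++) {
            PointF p = strokePoints.get(i);
            canvas.drawBitmap(brush,
                    p.x - brush.getWidth() / 2f,
                    p.y - brush.getHeight() / 2f,
                    paint);
        }
        return target;
    }
}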
But there might still be some gain in using GL for the on-screen part, as the Canvas operations can be performed off-screen and you can push that data to a GL renderer to (possibly) speed up the on-screen drawing.
However, if you are developing for Android 3.x+ you should take a look at RenderScript (which I personally have never had a chance to use); it seems like it would be a good fit in this case.
Your best solution is going to be native code; that's how Sketchbook does it. You could probably figure out how by browsing the GIMP source code (http://www.gimp.org/source). Between Canvas and OpenGL, Canvas would be the way to go.
It depends. If you just want to edit the image statically, go with Canvas. But if you want the ability to edit, scale and rotate a stroke after brushing it onto the screen, it would be easier with OpenGL.
An example with OpenGL: store the motion the user makes with touches. Create a class that stores a motion and has fields for size, rotation and so on (see the sketch below). To draw it, just trace the selected brush image along the stored motion.
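A rough sketch of that idea; every name here is a placeholder of my own, not part of any particular library:
import java.util.ArrayList;
import java.util.List;

// One recorded stroke: the touch path plus the transform fields needed to
// re-draw, scale or rotate it later without touching the other strokes.
public class BrushStroke {
    public final List<float[]> points = new ArrayList<float[]>(); // touch positions as {x, y}
    public float size = 1f;      // brush scale applied when the stroke is drawn
    public float rotation = 0f;  // degrees, applied when the stroke is drawn
    public int brushTextureId;   // which brush image to stamp along the path

    public void addPoint(float x, float y) {
        points.add(new float[] { x, y });
    }
}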
I have some sprites in my app.
When I touch a sprite (in TouchEvent.isActionDown()), I need to change its image.
How can I do this?
I'm not familiar with AndEngine, but by the looks of it, the Sprite class does not provide the functionality to change its image, or better said, its texture. However, you might be able to accomplish your goal by using a TiledSprite or AnimatedSprite.
The latter is an extension of the former, so you should be able to use a TiledSprite. It has setCurrentTileIndex() and nextTile() methods that seem to allow you to swap one texture region for another. You may need to convert your images into a format suitable for AndEngine, though, and obviously you will need a handle to the touched sprite.
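Something like this rough, untested sketch might work, assuming a tiled texture region with the normal image in tile 0 and the touched image in tile 1 (mTiledTextureRegion, mScene and the VBO manager call are placeholders for your own setup; the relevant imports are org.andengine.entity.sprite.TiledSprite and org.andengine.input.touch.TouchEvent):
TiledSprite sprite = new TiledSprite(100, 100, mTiledTextureRegion,
        getVertexBufferObjectManager()) {
    @Override
    public boolean onAreaTouched(TouchEvent pSceneTouchEvent,
            float pTouchAreaLocalX, float pTouchAreaLocalY) {
        if (pSceneTouchEvent.isActionDown()) {
            setCurrentTileIndex(1); // swap to the second image on touch
        }
        return true;
    }
};
mScene.registerTouchArea(sprite); // without this, onAreaTouched is never called
mScene.attachChild(sprite);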
What I'm trying to do is have a background image; for the sake of simplicity, let's say it's a picture of the front of a house. Then I want to have a red ball move from window to window.
I want to have a background picture, and a picture on top of it.
I then want to be able to tell the top picture exactly where to go.
How can I do this?
I'm just beginning to learn about animations in Android and have not yet come across a way to do this.
There are two routes to animation in android: Canvas and OpenGL ES.
I would recommend OpenGL for anything requiring smoothness and speed, like a moving ball.
You should create a view using the helper class GLSurfaceView (http://android-developers.blogspot.com/2009/04/introducing-glsurfaceview.html) and implement a Renderer.
I assume you have the images saved in your res/drawable folders in a format like PNG, and that the ball image contains an alpha channel.
You can find many tutorials online, but basically you need to load your background image and your ball resource in onSurfaceCreated and store each in a texture using GLUtils.texImage2D.
In the onDrawFrame method, set up a 2D projection such as gluOrtho2D, then draw the background.
Then, just before you draw the ball texture, use glTranslatef(x, y, 0) to move the ball over the house. Use alpha blending for the ball:
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glEnable(GL_BLEND);
Unfortunately, writing OpenGL code isn't as straightforward as you might hope; everything is done with 3D coordinates, even though you only want a 2D image. But hopefully this gives you enough info to google for good examples, which are abundant!
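To tie the steps above together, here is a rough, untested sketch of such a Renderer; the drawable names, ball size and coordinates are placeholders for your own resources and game logic:
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;
import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;
import android.content.Context;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.opengl.GLSurfaceView;
import android.opengl.GLU;
import android.opengl.GLUtils;

public class BallRenderer implements GLSurfaceView.Renderer {
    private final Context context;
    private final int[] textures = new int[2]; // [0] = house background, [1] = ball
    private int surfaceW, surfaceH;
    private float ballX, ballY;                // update these from your animation logic
    private FloatBuffer quad, texCoords;

    public BallRenderer(Context context) { this.context = context; }

    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        gl.glEnable(GL10.GL_TEXTURE_2D);
        gl.glGenTextures(2, textures, 0);
        loadTexture(gl, textures[0], R.drawable.house); // placeholder drawable names
        loadTexture(gl, textures[1], R.drawable.ball);
        quad = toBuffer(new float[] {0,0,  1,0,  0,1,  1,1});      // unit quad, scaled per draw
        texCoords = toBuffer(new float[] {0,1,  1,1,  0,0,  1,0}); // V flipped so the bitmap is upright
    }

    public void onSurfaceChanged(GL10 gl, int w, int h) {
        surfaceW = w;
        surfaceH = h;
        gl.glViewport(0, 0, w, h);
        gl.glMatrixMode(GL10.GL_PROJECTION);
        gl.glLoadIdentity();
        GLU.gluOrtho2D(gl, 0, w, 0, h); // 2D projection in pixel coordinates
    }

    public void onDrawFrame(GL10 gl) {
        gl.glClear(GL10.GL_COLOR_BUFFER_BIT);
        gl.glMatrixMode(GL10.GL_MODELVIEW);
        gl.glEnableClientState(GL10.GL_VERTEX_ARRAY);
        gl.glEnableClientState(GL10.GL_TEXTURE_COORD_ARRAY);
        gl.glVertexPointer(2, GL10.GL_FLOAT, 0, quad);
        gl.glTexCoordPointer(2, GL10.GL_FLOAT, 0, texCoords);

        // Background: stretch the unit quad over the whole surface.
        gl.glLoadIdentity();
        gl.glScalef(surfaceW, surfaceH, 1);
        gl.glBindTexture(GL10.GL_TEXTURE_2D, textures[0]);
        gl.glDrawArrays(GL10.GL_TRIANGLE_STRIP, 0, 4);

        // Ball: alpha-blended and positioned with glTranslatef.
        gl.glEnable(GL10.GL_BLEND);
        gl.glBlendFunc(GL10.GL_SRC_ALPHA, GL10.GL_ONE_MINUS_SRC_ALPHA);
        gl.glLoadIdentity();
        gl.glTranslatef(ballX, ballY, 0);
        gl.glScalef(64, 64, 1); // ball size in pixels (arbitrary)
        gl.glBindTexture(GL10.GL_TEXTURE_2D, textures[1]);
        gl.glDrawArrays(GL10.GL_TRIANGLE_STRIP, 0, 4);
        gl.glDisable(GL10.GL_BLEND);
    }

    // Loads a drawable into the given texture id. Note that many GLES1 devices
    // require power-of-two image sizes for textures.
    private void loadTexture(GL10 gl, int id, int resId) {
        Bitmap bmp = BitmapFactory.decodeResource(context.getResources(), resId);
        gl.glBindTexture(GL10.GL_TEXTURE_2D, id);
        gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MIN_FILTER, GL10.GL_LINEAR);
        gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MAG_FILTER, GL10.GL_LINEAR);
        GLUtils.texImage2D(GL10.GL_TEXTURE_2D, 0, bmp, 0);
        bmp.recycle();
    }

    private static FloatBuffer toBuffer(float[] v) {
        FloatBuffer fb = ByteBuffer.allocateDirect(v.length * 4)
                .order(ByteOrder.nativeOrder()).asFloatBuffer();
        fb.put(v).position(0);
        return fb;
    }
}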