Attach a sprite to another sprite (AndEngine, Android)

I am working with AndEngine and I have two sprites: one is a plate and the other is an apple. My plate sprite moves from point 1 to point 2, and my apple sprite jumps up and down.
Now I want to make the apple jump on the plate. I tried attaching the apple as a child of the plate, but the apple is not placed on the plate; it appears below it. I used a z-index, but that is not working.
The real problem is moving the apple and the plate at the same time. Any help would be appreciated; I am stuck on why this is happening and what the solution would be. Here is my code:
plateDisplay = new Sprite( 250, 300, this.plate, this.getVertexBufferObjectManager());
appleDisplay = new Sprite( 250, 140, this.apple, this.getVertexBufferObjectManager());
plateDisplay.registerEntityModifier(new LoopEntityModifier(new PathModifier(20, path, EaseLinear.getInstance())));
appleDisplay.registerEntityModifier(new LoopEntityModifier(new ParallelEntityModifier(new MoveYModifier(1, appleDisplay.getY(),
(appleDisplay.getY()+70), EaseBounceInOut.getInstance()))));
this.appleDisplay.setZIndex(1);
plateDisplay.setZIndex(0);
plateDisplay.attachChild(this.appleDisplay);
scene.attachChild(plateDisplay);

The issue you are having is that each object has its own coordinate system. The plate sprite has its own X and Y in scene coordinates, but when you attach the apple to the plate you are now using the plate's local coordinates. So if the apple was at the scene's (50, 50), once you attach it to the plate it will sit at (50, 50) as measured from the plate's own origin.
AndEngine has convertLocalToSceneCoordinates and convertSceneToLocalCoordinates utilities to help you make this conversion. Underneath they are not very complex: they just apply the transforms of all the nested entities. Both utilities are part of the entity/sprite class, so you call them on the sprite in question. In your case, probably:
// Get the scene coordinates of the apple as an array.
float[] coordinates = new float[] { appleDisplay.getX(), appleDisplay.getY() };
// Convert the scene coordinates of the apple to the local coordinates of the plate.
float[] localCoordinates = plateDisplay.convertSceneToLocalCoordinates(coordinates);
// Attach the apple and set its position.
appleDisplay.setPosition(localCoordinates[0], localCoordinates[1]);
plateDisplay.attachChild(appleDisplay);
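For a parent that is neither rotated nor scaled, the conversion boils down to subtracting the parent's position. A minimal plain-Java sketch of that idea (hypothetical coordinate values, not using the AndEngine API):

```java
public class SceneToLocalDemo {

    /**
     * Convert scene coordinates into a parent's local space,
     * assuming the parent is neither rotated nor scaled.
     */
    static float[] sceneToLocal(float sceneX, float sceneY,
                                float parentX, float parentY) {
        return new float[] { sceneX - parentX, sceneY - parentY };
    }

    public static void main(String[] args) {
        // Apple at scene (250, 140), plate at scene (250, 300).
        float[] local = sceneToLocal(250, 140, 250, 300);
        // The apple ends up at (0, -160) in the plate's local space,
        // i.e. 160 units above the plate's origin (AndEngine's Y grows downward).
        System.out.println(local[0] + ", " + local[1]);
    }
}
```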

PDFTron : Drawing Ink Annotation programmatically

I am drawing an ink annotation from points stored in a database, where those points were extracted from a shape previously drawn over the PDF. I have referred to this example given by PDFTron, but I am not able to see the annotation drawn on the page in the proper manner.
(Screenshots omitted: the actual drawn shape vs. the shape drawn programmatically.)
Here is the code I have used for drawing the annotation:
for (Integer integer : uniqueShapeIds) {
    Config.debug("Shape Id's unique " + integer);
    pdftron.PDF.Annots.Ink ink = pdftron.PDF.Annots.Ink.create(
            mPDFViewCtrl.getDoc(),
            getAnnotationRect(pointsArray, integer));
    for (SaveAnnotationState annot : pointsArray) {
        Config.debug("Draw " + annot.getxCord() + " " + annot.getyCord() + " "
                + annot.getPathIndex() + " " + annot.getPointIndex());
        Point pt = new Point(annot.getxCord(), annot.getyCord());
        ink.setPoint(annot.getPathIndex(), annot.getPointIndex(), pt);
        ink.setColor(
                new ColorPt(annot.getR() / 255, annot.getG() / 255,
                        annot.getB() / 255), 3);
        ink.setOpacity(annot.getOpacity());
        BorderStyle border = ink.getBorderStyle();
        border.setWidth(annot.getThickness());
        ink.setBorderStyle(border);
    }
    ink.refreshAppearance();
    Page page = mPDFViewCtrl.getDoc().getPage(mPDFViewCtrl.getCurrentPage());
    Annot mAnnot = ink;
    page.annotPushBack(mAnnot);
    mPDFViewCtrl.update(mAnnot, mPDFViewCtrl.getCurrentPage());
}
Can anyone tell me what is going wrong here?
On a typical PDF page, the bottom-left corner of the page is coordinate (0, 0). For annotations, however, the origin is the bottom-left corner of the rectangle specified in the BBox entry. The BBox entry is the third parameter of your call to Ink.create, which is (unfortunately) called pos.
This means the Rect passed into Ink.create is supposed to be the minimum axis-aligned bounding box of all the points that make up the ink annotation.
I suspect that in your call to getAnnotationRect you start with Rect(), which is really Rect(0, 0, 0, 0), so when you union all the other points you end up with an inflated Rect.
What you should do is store the BBox in your database, obtained by calling Annot.getRect().
If this is not possible, or it is too late for that, then initialize the Rect with the first point in your database:
Rect rect = new Rect(pt.x, pt.y, pt.x, pt.y);
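To illustrate the difference, here is a minimal plain-Java sketch (hypothetical point data, not the PDFTron API) of building a tight bounding box by seeding it with the first point, rather than with (0, 0, 0, 0):

```java
import java.util.Arrays;
import java.util.List;

public class BBoxDemo {

    /**
     * Axis-aligned bounding box {x1, y1, x2, y2} of a list of {x, y} points,
     * seeded with the first point rather than the origin.
     */
    static double[] tightBBox(List<double[]> points) {
        double[] first = points.get(0);
        double x1 = first[0], y1 = first[1], x2 = first[0], y2 = first[1];
        for (double[] p : points) {
            x1 = Math.min(x1, p[0]);
            y1 = Math.min(y1, p[1]);
            x2 = Math.max(x2, p[0]);
            y2 = Math.max(y2, p[1]);
        }
        return new double[] { x1, y1, x2, y2 };
    }

    public static void main(String[] args) {
        // All points live far from the origin...
        List<double[]> pts = Arrays.asList(
                new double[] { 100, 200 },
                new double[] { 150, 260 },
                new double[] { 120, 240 });
        // ...so the tight box is (100, 200, 150, 260). Seeding with
        // Rect(0, 0, 0, 0) would instead inflate it to (0, 0, 150, 260).
        System.out.println(Arrays.toString(tightBBox(pts)));
    }
}
```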
API:
http://www.pdftron.com/pdfnet/mobile/docs/Android/pdftron/PDF/Annot.html#getRect%28%29
http://www.pdftron.com/pdfnet/mobile/docs/Android/pdftron/PDF/Annot.html#create%28pdftron.SDF.Doc,%20int,%20pdftron.PDF.Rect%29

Ball sticking to sidewalls in android unity 3d?

I am trying to run the Unity 3D Roll-a-Ball example on an Android device. The ball sticks to the side walls, and it also moves very slowly while it is in contact with them. Can anyone help me with this issue?
Here is my accelerometer code for moving the ball:
Screen.sleepTimeout = SleepTimeout.NeverSleep;
curAc = Vector3.Lerp(curAc, Input.acceleration-zeroAc, Time.deltaTime/smooth);
GetAxisV = Mathf.Clamp(curAc.y * sensV, -1, 2);
GetAxisH = Mathf.Clamp(curAc.x * sensH, -1, 2);
Vector3 movement = new Vector3 (GetAxisH, 0.0f, GetAxisV);
rigidbody.AddForce(movement * speedAc*2f);
Thanks in advance.
I had a similar problem when building a pinball game. I was not using the accelerometer, but the ball behavior was the very same.
Check the physic material of your objects: the ball, the walls, and the floor all need to be checked. As I don't know exactly what kind of game you are building, I recommend you experiment with every parameter; sticking to walls usually points at the friction settings, so a physic material with low or zero friction on the walls is a good first thing to try.

Android AndEngine two circles collision perfectly

I have two circles and a collision method for them. I want the collision to happen only when they actually touch each other; currently it happens when they are merely near each other.
I think this is because of the transparent free space in the .png file of each circle.
In the picture you can see that they currently collide from a distance; I want the collision only when they really touch.
My collision method:
if (circle1.collidesWith(circle)){
Score += 1;
}
I am almost sure you are right that the transparent areas in the PNG cause it. You are probably creating a box body; in your case you should use a circle body, like this:
Body circleBody = PhysicsFactory.createCircleBody(pWorld, pSprite, BodyType.StaticBody, fixtureDef);
If that doesn't help, there is a method overload where you can provide the position and size of the body. I can also recommend using DebugRenderer, which you only have to attach to the scene:
new DebugRenderer(physicsWorld, vbom)
When you use this you will see how helpful it can be :) Just remember that it may slow your phone down when you have a lot of bodies in the scene.
PS: You didn't give us a lot of information, but you should use a ContactListener to check collisions. There are plenty of tutorials on the internet for it.
PS2: If you don't use the Box2D extension yet, do it. It is a great feature of AndEngine and it's pointless to implement this yourself; it would be hard to detect a circle-shape collision between two objects without Box2D.
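For reference, the geometric test behind circle-vs-circle collision is simple: two circles touch exactly when the distance between their centers is no more than the sum of their radii. A minimal plain-Java sketch (hypothetical values, no AndEngine or Box2D API):

```java
public class CircleCollision {

    /** True when two circles overlap or just touch. */
    static boolean circlesCollide(float x1, float y1, float r1,
                                  float x2, float y2, float r2) {
        float dx = x2 - x1;
        float dy = y2 - y1;
        float radii = r1 + r2;
        // Compare squared distances to avoid a square root.
        return dx * dx + dy * dy <= radii * radii;
    }

    public static void main(String[] args) {
        // Centers 25 apart, radii sum 20: a 5-unit gap, no collision.
        System.out.println(circlesCollide(0, 0, 10, 25, 0, 10)); // false
        // Centers 20 apart, radii sum 20: exactly touching, collision.
        System.out.println(circlesCollide(0, 0, 10, 20, 0, 10)); // true
    }
}
```

A bounding-box check (what you effectively get from sprite-rectangle collision) reports an overlap well before the circles touch, which matches the behavior described in the question.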
If you are not using Box2D, you must use the pixel-perfect collision library. The default AndEngine library does not support pixel-perfect collision; to get this support, you need to import this library in Eclipse and add it to your project's used libraries.
Here is a demonstration of how to use this library. When you define the texture and atlas for your sprite, write it as below:
private BitmapTextureAtlas lifeAtlas;
public PixelPerfectTiledTextureRegion life_Texture;

PixelPerfectTextureRegionFactory.setAssetBasePath("gfx/game/");
lifeAtlas = new BitmapTextureAtlas(textureManager, 1280, 128,
        TextureOptions.BILINEAR);
life_Texture = PixelPerfectTextureRegionFactory.createTiledFromAsset(
        lifeAtlas, activity, "heart_tiled.png", 0, 0, 10, 1, 0);
lifeAtlas.load();
For your custom sprite class,
public class Plane extends PixelPerfectAnimatedSprite {

    public Plane(float pX, float pY,
            PixelPerfectTiledTextureRegion pTiledTextureRegion,
            VertexBufferObjectManager pVertexBufferObjectManager) {
        super(pX, pY, pTiledTextureRegion, pVertexBufferObjectManager);
        setColor(Color.GREEN);
    }
}
You also need to make some adjustments to your AndEngine library to use it. Follow this thread to get going.

rotate an Object, but translate the Object always in its own front Axis

I want to program a racing game for Android. My problem is that when I rotate the car and then translate its position, it does not translate in the new direction the car is facing, but always along the world's X axis.
Here is my (wrong) code. Thank you:
gl.glTranslatef(car.position.x, car.position.y, car.position.z);
gl.glRotatef(car.currentAngle, 0, 1, 0);
OpenGL builds its transformations out of matrices, and matrix multiplication is not commutative. Therefore, when you rotate an object and then translate it, the object ends up in a different position than if you had translated it first and then rotated it.
A solution is to keep the two steps separate: compute the car's new position yourself, along its current heading, and translate there, so that the rotation only affects the car's orientation and not the direction of the translation.
To see the effect of the non-commutative multiplication on your object, try this: rotate and translate your object about 8 times, alternating the two operations. You will notice that your object drifts away and eventually disappears from view, as opposed to "rotating in a circle while changing position".
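A minimal plain-Java sketch of why the order matters (2D for brevity, hypothetical values): applying the same rotation and the same translation in the two possible orders sends the same point to two very different places.

```java
public class MatrixOrderDemo {

    /** Rotate point (x, y) by angleDeg around the origin, then translate by (tx, ty). */
    static double[] rotateThenTranslate(double x, double y, double angleDeg,
                                        double tx, double ty) {
        double a = Math.toRadians(angleDeg);
        double rx = x * Math.cos(a) - y * Math.sin(a);
        double ry = x * Math.sin(a) + y * Math.cos(a);
        return new double[] { rx + tx, ry + ty };
    }

    /** Translate first, then rotate: the translation itself gets rotated too. */
    static double[] translateThenRotate(double x, double y, double angleDeg,
                                        double tx, double ty) {
        double a = Math.toRadians(angleDeg);
        double px = x + tx;
        double py = y + ty;
        return new double[] {
                px * Math.cos(a) - py * Math.sin(a),
                px * Math.sin(a) + py * Math.cos(a) };
    }

    public static void main(String[] args) {
        // Same point (1, 0), same 90-degree rotation, same translation (10, 0):
        double[] a = rotateThenTranslate(1, 0, 90, 10, 0); // about (10, 1)
        double[] b = translateThenRotate(1, 0, 90, 10, 0); // about (0, 11)
        System.out.printf("%.1f %.1f vs %.1f %.1f%n", a[0], a[1], b[0], b[1]);
    }
}
```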
OK, I have the solution. All I have to do is translate my car along the new direction vector, which changes with the car's new angle :)
if (accel < 0)
    position.add((float) Math.sin(currentAngle * Math.PI / 180) / 5, 0,
                 (float) Math.cos(currentAngle * Math.PI / 180) / 5);
if (accel > 0)
    position.sub((float) Math.sin(currentAngle * Math.PI / 180) / 5, 0,
                 (float) Math.cos(currentAngle * Math.PI / 180) / 5);
And in the rendering class:
gl.glTranslatef(car.position.x, car.position.y, car.position.z);
gl.glRotatef(car.currentAngle, 0, 1, 0);
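The direction vector used above is just (sin θ, 0, cos θ) for a heading angle θ around the Y axis. A small plain-Java sketch of how it tracks the car's angle (hypothetical angles, no OpenGL):

```java
public class HeadingDemo {

    /** Forward direction in the XZ plane for a heading of angleDeg around Y. */
    static float[] forward(float angleDeg) {
        double a = Math.toRadians(angleDeg);
        return new float[] { (float) Math.sin(a), 0f, (float) Math.cos(a) };
    }

    public static void main(String[] args) {
        // Heading 0 degrees: straight down +Z; heading 90 degrees: straight down +X.
        float[] f0 = forward(0);   // about (0, 0, 1)
        float[] f90 = forward(90); // about (1, 0, 0)
        System.out.println(f0[2] + " " + f90[0]);
    }
}
```

Adding a scaled copy of this vector to the position each frame (as the accepted fix does) moves the car along whatever direction it is currently facing, independent of the world axes.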

First Person Camera rotation in 3D

I have written a first-person camera class for Android.
The class is really simple: the camera object has its three axes (X, Y and Z), and there are functions to create the ModelView matrix (i.e. calculateModelViewMatrix()), to rotate the camera around its X and Y axes, and to translate the camera along its Z axis.
I think my ModelView matrix calculation is correct, and I can also translate the camera along the Z axis.
Rotation around the X axis seems to work, but around the Y axis it gives strange results. Another problem with the rotation is that instead of the camera rotating, my 3D model starts to rotate around its own axis.
I have written another implementation based on a look-at point, using OpenGL ES's GLU.gluLookAt() function to obtain the ModelView matrix, but it seems to suffer from exactly the same problems.
EDIT
First of all, thanks for your reply.
I have now made a second implementation of the camera class, this time using the rotation functions provided in the android.opengl.Matrix class, as you suggested. The code, provided below, is much simpler.
To my surprise, the results are exactly the same, which means my rotation functions and Android's rotation functions produce the same results.
I did a simple test and looked at my data: I rotated the look-at point one degree at a time around the Y axis and watched the coordinates. It seems that my look-at point lags behind the exact rotation angle, e.g. at 20 degrees it has only rotated 10 to 12 degrees, and after 45 degrees it starts reversing back.
There is a class android.opengl.Matrix, which is a collection of static methods that do everything you need on a float[16] you pass in. I highly recommend you use those functions instead of rolling your own. You would probably want setLookAtM, with the look-at point calculated from your camera angles (using sin and cos as you are doing in your code; I assume you know how to do this).
-- edit in response to the new answer --
(You should probably have edited your original question, by the way; your answer posted as another question confused me for a bit.)
OK, so here's one way of doing it. This is uncompiled and untested. I decided to build the matrix manually instead; perhaps that will give a bit more information about what's going on:
class TomCamera {
    // These are our inputs - eye position, and the orientation of the camera.
    public float mEyeX, mEyeY, mEyeZ; // position
    public float mYaw, mPitch, mRoll; // Euler angles

    // This is the output matrix to pass to OpenGL.
    public float mCameraMatrix[] = new float[16];

    // Convert inputs to outputs.
    public void createMatrix() {
        // Create a camera matrix (YXZ order is pretty standard).
        // You may want to negate some of these constant 1s to match expectations.
        Matrix.setRotateM(mCameraMatrix, 0, mYaw, 0, 1, 0);
        Matrix.rotateM(mCameraMatrix, 0, mPitch, 1, 0, 0);
        Matrix.rotateM(mCameraMatrix, 0, mRoll, 0, 0, 1);
        Matrix.translateM(mCameraMatrix, 0, -mEyeX, -mEyeY, -mEyeZ);
    }
}
