I would have posted this question in the AndEngine forum, but there are already several questions on this topic there; some have replies, but the ones I actually want answered have none.
I'm trying to simulate a player jump like in Super Mario Bros. First I used a simple contact listener to set a boolean flag when contact occurs, but the contact fires against walls, the ground, everything. So now I'm trying to attach a small second body to the bottom of the player as a foot sensor, using a WeldJoint, but I can't get it to work: the WeldJoint doesn't hold the bodies together at all. I tried creating the WeldJoint on the update thread, with no result. I also tried using setPosition to keep the sensor aligned with the player, but that just places the sensor below the ground.
Any suggestions would be appreciated. Here is how I tried to create the WeldJoint.
Player and sensor
mPlayer = new AnimatedSprite(100, 150, PlayerTextureRegion);
PlayerBody = PhysicsFactory.createBoxBody(this.mPhysicsWorld, mPlayer, BodyType.DynamicBody, PLAYER_FIXTURE_DEF);
this.mPhysicsWorld.registerPhysicsConnector(new PhysicsConnector(mPlayer, PlayerBody, true, true));
mScene.getLastChild().attachChild(mPlayer);

final Shape mSensor = new Rectangle(mPlayer.getX() + 4, mPlayer.getY() + mPlayer.getHeight(), 10, 4);
final Body SensorBody = PhysicsFactory.createBoxBody(this.mPhysicsWorld, mSensor, BodyType.DynamicBody, SENSOR_FIXTURE_DEF);
this.mPhysicsWorld.registerPhysicsConnector(new PhysicsConnector(mSensor, SensorBody, true, true));
mScene.getLastChild().attachChild(mSensor);
mScene.registerUpdateHandler(new IUpdateHandler() {
    @Override
    public void reset() { }

    @Override
    public void onUpdate(final float pSecondsElapsed) {
        this.createJoint(PlayerBody, SensorBody);
        .......
Joint Method
private void createJoint(Body Anchor, Body Sensor) {
    final WeldJointDef join = new WeldJointDef();
    join.initialize(Anchor, Sensor, Anchor.getWorldCenter());
    this.mPhysicsWorld.createJoint(join);
}
OK, instead of a WeldJoint I used a RevoluteJoint, without the motor configuration, and it works fine now. Just initialize the two bodies with a RevoluteJointDef and they stay stuck together as if welded. For the time being I'm going with the RevoluteJoint to make the two bodies behave as one.
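For reference, a minimal sketch of that RevoluteJoint setup, structured like the createJoint() method above (the body names are the same placeholders; as with the WeldJoint, create it on the update thread):

private void createRevoluteJoint(final Body pAnchor, final Body pSensor) {
    final RevoluteJointDef revoluteJointDef = new RevoluteJointDef();
    // Pin both bodies together at the anchor body's center.
    revoluteJointDef.initialize(pAnchor, pSensor, pAnchor.getWorldCenter());
    // No motor and no limits, so the sensor simply stays pinned to the player.
    revoluteJointDef.enableMotor = false;
    revoluteJointDef.enableLimit = false;
    this.mPhysicsWorld.createJoint(revoluteJointDef);
}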
Related
Hey guys, so I'm making my first app. I know this is a pretty loaded question, but I'm having a hard time following examples because the script code is different. I am making a 2D platform runner. To start, I created the platforms, the environment, and most if not all of the physics. The player at this point is just a circle (a placeholder). The circle can move left and right and jump. I have now created an actual player sprite and made animations for walking, jumping and idle. How do I apply the new sprite animations to the current circle placeholder and to the script? My next step is to go into the Animator and start making transitions, I guess. I'm just not sure how to add the animations to my current script. I knew this was going to be a challenge, and if there is any other info you need, please let me know. Thanks so much, guys.
This is my "Controls.cs" that is currently attached to my circle player/placeholder. My CheckGround is attached to him. Everything else should be in relation to the platforms he is jumping on and I dont think will change. Again, I have a sprite walking, jumping and idle that I would like to take place of the current circle/placeholder. I need the walking animation to take place when the left and right arrows are pressed, the pump animation to play when the jump button is pressed and idle animation to play when the player is just standing still otherwise. Again, thanks so much guys!
using UnityEngine;
using System.Collections;
public class Controls : MonoBehaviour
{
public Rigidbody2D rb;
public float movespeed;
public float jumpheight;
public bool moveright;
public bool moveleft;
public bool jump;
public Transform groundCheck;
public float groundCheckRadius;
public LayerMask whatIsGround;
private bool onGround;
// Use this for initialization
void Start()
{
rb = GetComponent<Rigidbody2D>();
}
void FixedUpdate()
{
onGround = Physics2D.OverlapCircle(groundCheck.position, groundCheckRadius, whatIsGround);
}
// Update is called once per frame
void Update()
{
if (Input.GetKey(KeyCode.LeftArrow))
{
rb.velocity = new Vector2(-movespeed, rb.velocity.y);
}
if (Input.GetKey(KeyCode.RightArrow))
{
rb.velocity = new Vector2(movespeed, rb.velocity.y);
}
if (Input.GetKey(KeyCode.Space))
{
if (onGround)
{
rb.velocity = new Vector2(rb.velocity.x, jumpheight);
}
}
if (jump)
{
if (onGround)
{
rb.velocity = new Vector2(rb.velocity.x, jumpheight);
}
jump = false;
}
if (moveright)
{
rb.velocity = new Vector2(movespeed, rb.velocity.y);
}
if (moveleft)
{
rb.velocity = new Vector2(-movespeed, rb.velocity.y);
}
}
}
Read about the Animator Controller. After you create it, you can grab the Animator from your GameObject with transform.GetComponent<Animator>().
Later you can blend your animations or just play them. Unity even lets you define parameters and conditions that drive the transitions, so to be honest you don't have to type much code.
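As a rough sketch (not a drop-in solution): put the Animator on the player, create parameters in the Animator Controller, for example "Speed" and "IsGrounded" (those names are just examples you define yourself), use them in the transitions between idle, walk and jump, and drive them from a small script like this:

using UnityEngine;

public class PlayerAnimation : MonoBehaviour
{
    private Animator anim;
    private Rigidbody2D rb;

    void Start()
    {
        anim = GetComponent<Animator>();
        rb = GetComponent<Rigidbody2D>();
    }

    void Update()
    {
        // Walk vs. idle: blend on horizontal speed.
        anim.SetFloat("Speed", Mathf.Abs(rb.velocity.x));
        // Jump: a crude airborne check based on vertical velocity; you could
        // instead reuse the OverlapCircle ground check from Controls.cs.
        anim.SetBool("IsGrounded", Mathf.Abs(rb.velocity.y) < 0.01f);
    }
}

The Animator then picks the right state from those parameters, so the movement code in Controls.cs doesn't need to change.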
Definitely use an Animation Controller.
Have a look at this video - I'm following this tutorial at the moment.
https://www.youtube.com/watch?v=I0IVZhHNarg
I am making a game in which my player is a UFO. When the player collides with other game objects, I need those objects to attach to it and float in the air below the player (the UFO), the way a real UFO would carry them. I tried attaching them as a child, but it didn't work.
I made a script as below:
if (coll) {
distance = Vector2.Distance (this.transform.position, player.transform.position);
if (distance < 2) {
this.transform.parent = encaixe.transform;
this.transform.localPosition = new Vector2 (0f, 1.2f);
this.transform.localRotation = Quaternion.identity;
encaixe.rigidbody2D.gravityScale=0;
}
}
With this script the game object does attach, but the player no longer moves as it originally did; the attached object forcefully drags it up or down.
Please suggest how to do this.
I know this thread is rather old, but maybe someone will come across this question and need an answer.
You can actively set the object's position and rotation to the UFO's on collision.
Something like the following (a sketch; the ufo and offset fields are example names you would assign yourself):
public Transform ufo;                                  // example field: assign the UFO/player in the Inspector
public Vector3 offset = new Vector3(0f, -1.2f, 0f);    // example offset: where the object hangs below the UFO

private bool hasCollided = false;

void OnCollisionEnter2D(Collision2D collision)
{
    hasCollided = true;
}

void LateUpdate()
{
    // LateUpdate runs after physics and movement, so the object tracks the UFO without lagging.
    if (hasCollided)
    {
        followPlayer();
    }
}

void followPlayer()
{
    // Update position and rotation to match the UFO every frame.
    transform.position = ufo.position + offset;
    transform.rotation = ufo.rotation;
}
I have two bodies (A and B) and I want B to follow A. I can move B to A's position with the setTransform() function, but I have to update B's position every frame, so I need some kind of per-frame hook rather than a contact listener. When I use a normal entity in AndEngine, it has the method below for this instead of a contact listener:
this.foot = new Rectangle(this.getX(), this.getY(), 8, 10, ResourcesManager.getInstance().vbom) {
    @Override
    protected void onManagedUpdate(float pSecondsElapsed)
    {
        // super.onManagedUpdate(pSecondsElapsed);
        this.setPosition(body.getPosition().x * PhysicsConstants.PIXEL_TO_METER_RATIO_DEFAULT + 1,
                body.getPosition().y * PhysicsConstants.PIXEL_TO_METER_RATIO_DEFAULT - 15);
    }
};
I can set up this kind of update hook when I create the entity. Is there a similar option for a Box2D body? I mean something like this:
this.footBody = PhysicsFactory.createBoxBody(this.mPhysicsWorld, this.foot, BodyType.DynamicBody, footFixtureDef) {
    @Override
    protected void onManagedUpdate(float pSecondsElapsed)
    {
        // super.onManagedUpdate(pSecondsElapsed);
        this.setPosition(body.getPosition().x * PhysicsConstants.PIXEL_TO_METER_RATIO_DEFAULT + 1,
                body.getPosition().y * PhysicsConstants.PIXEL_TO_METER_RATIO_DEFAULT - 15);
    }
};
The position updates of physics bodies are managed by the Box2D library, but you can register an additional update handler on your scene and tell the Box2D world what you want done during that update.
this.scene.registerUpdateHandler(new IUpdateHandler() {
    @Override
    public void onUpdate(float pSecondsElapsed) {
        // Your code to run here!
    }

    @Override
    public void reset() { }
});
So if you maintain a list of references to the bodies in your world, you should be able to apply forces or set their positions on each tick of the game.
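For example, making body B track body A on each tick could look roughly like this (bodyA and bodyB are whatever references you keep around, and the -0.5f offset is just an example value):

this.scene.registerUpdateHandler(new IUpdateHandler() {
    @Override
    public void onUpdate(final float pSecondsElapsed) {
        // Teleport B to A's current position (Box2D works in meters), keeping B's own angle.
        final Vector2 anchorPosition = bodyA.getPosition();
        bodyB.setTransform(new Vector2(anchorPosition.x, anchorPosition.y - 0.5f), bodyB.getAngle());
    }

    @Override
    public void reset() { }
});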
If you want to make sure you update the world before or after the Box2D world step, you could even call
mBox2dWorld.onUpdate(pSecondsElapsed);
from within your own update handler, instead of registering the world as an updateHandler itself.
Here are a couple of less practical suggestions as well:
Or, to do it the Box2D way, you could use a joint. It depends on what kind of following behavior you want.
Or don't even make object B part of the Box2D world, perhaps, and just manage the following behavior outside of Box2D. Does it need to collide with anything?
Hello everyone. I'm trying to develop a game like Jetpack, so I want gravity that normally pushes down and, when the user taps the screen, changes to push up. I have searched for a week for a tutorial that explains how to do this, but I didn't find what I was looking for. Can someone explain how to do it, or post a link to a tutorial?
Thank you !!
main = new Sprite(sX, sY, mainTextureRegion);
main.setScale(1);
main.setFlippedHorizontal(true);
scene.attachChild(main);
mPhysicsWorld = new PhysicsWorld(new Vector2(0, SensorManager.GRAVITY_EARTH), false);
final FixtureDef objectFixtureDef = PhysicsFactory.createFixtureDef(1, 0.5f, 0.5f);
final Body body = PhysicsFactory.createBoxBody(mPhysicsWorld, main, BodyType.DynamicBody, objectFixtureDef);
mPhysicsWorld.registerPhysicsConnector(new PhysicsConnector(main, body, true, true));
final Vector2 gravity = new Vector2(0, 5f);
mPhysicsWorld.setGravity(gravity);
scene.registerUpdateHandler(new IUpdateHandler() {
    @Override
    public void onUpdate(float pSecondsElapsed) {
    }

    @Override
    public void reset() {
        // TODO Auto-generated method stub
    }
});
When the user touches the screen, you shouldn't change the gravity; that is not how this is normally done.
When the user touches the screen, just apply a linear impulse (after checking that the body isn't already at maximum speed) or something like that to push it upward. The gravity shouldn't change, since it should always pull all of your physics objects down. If you change the gravity you change the behavior of all your objects, not only the player.
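A rough sketch of what that could look like with a scene touch listener (body is the player body created above; the impulse strength and the speed cap are just example values):

scene.setOnSceneTouchListener(new IOnSceneTouchListener() {
    @Override
    public boolean onSceneTouchEvent(final Scene pScene, final TouchEvent pSceneTouchEvent) {
        if (pSceneTouchEvent.isActionDown()) {
            // In AndEngine the y axis points down, so an upward push is a negative y impulse.
            // Only boost if we are not already moving up too fast (example cap).
            if (body.getLinearVelocity().y > -8f) {
                body.applyLinearImpulse(new Vector2(0f, -4f), body.getWorldCenter());
            }
            return true;
        }
        return false;
    }
});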
EDIT
For the physics engine to work, you have to step the PhysicsWorld on every frame. The catch with AndEngine is that you don't access the render() method directly, because AndEngine handles it entirely.
So what you have to do is register an IUpdateHandler on the Engine object that you return from your onLoadEngine() method.
That way the onUpdate() method of the interface is called on every frame, and there you can drive the physics step. The float parameter tells you how many seconds have elapsed since the last frame, but you don't have to worry about it; the method is called many times per second.
Hope it helps.
EDIT 2
I think this should work :
//on the Scene object or on the Engine one
scene.registerUpdateHandler(new IUpdateHandler() {
    @Override
    public void onUpdate(float pSecondsElapsed) {
        mPhysicsWorld.onUpdate(pSecondsElapsed);
    }

    @Override
    public void reset() { }
});
I have an application using a GLSurfaceView and a renderer. I have set it up so that when the user exits the application via the back button I call myActivity.finish().
This is fine, and I can see the activity getting calls to onStop() and onDestroy().
The app works fine the first time it is run; however, on subsequent runs I have a problem with my motion events.
I handle motion events by queuing them into a pool and having the renderer access the pool at the right time, like so:
try
{
    // Get the history first
    int hist = event.getHistorySize();
    if (hist > 0)
    {
        // Oldest is first in the list. (I think.)
        for (int i = 0; i < hist; i++)
        {
            InputObject input = inputObjectPool.take();
            input.useEventHistory(event, i);
            defRenderer.feedInput(input);
        }
    }

    // The current one still needs to be added
    InputObject input = inputObjectPool.take();
    input.useMotionEvent(event);
    defRenderer.feedInput(input);
}
And in the renderer:
synchronized (inputQueueMutex)
{
    ArrayBlockingQueue<InputObject> inputQueue = this.inputQueue;
    while (!inputQueue.isEmpty())
    {
        try
        {
            InputObject input = inputQueue.take();
            if (input.eventType == InputObject.EVENT_TYPE_TOUCH)
            {
                screenManager.processMotionEvent(input);
            }
            else if (input.eventType == InputObject.EVENT_TYPE_KEY)
            {
                screenManager.processKeyPress(input);
            }
            input.returnToPool();
        }
        catch (InterruptedException ie)
        {
            DLog.defError("Interrupted blocking on input queue.", ie);
        }
    }
}
As you can see in the code above, I hand these motion events to the ScreenManager, which is basically my way of having several "scenes" that I render. This works fine the first time I run the application, and at the moment the screen translates my touches into movement of a simple square.
However, the second time I run the application the square is drawn to the screen fine, but the motion events do nothing.
I have traced the motion events, and although they are given to the "new" renderer, it seems to hand them to an old screen, or rather to an old object on that screen. This is confusing because in my onCreate() method I do this:
//Set up the renderer and give it to the SurfaceView
defRenderer = new DefRenderer();
defView = new DefView(this);
defView.setRenderer(defRenderer);
//Set out content to the surface view.
setContentView(defView);
//Set up the input queue
createInputObjectPool();
onCreate() is called both the first and the second time my app is run (and the app was destroyed in between!), the screens are created fresh in defRenderer, and they are given to a new defView.
I am very confused about how data could remain in defRenderer to receive the motion events, since the app is completely remade.
Is there something obvious going on that I am missing here? I would have thought that when onDestroy() is called the app would be completely dereferenced, so no trace of it would remain. Is this not true? When I call new Renderer(), is it somehow referencing an old one?
I am at a loss as to what is going on, especially as this app is a basic copy of another I have written which works completely fine!
EDIT:
After a small amount of experimentation I have discovered that the motion events are actually going to an old ScrollPanel (an object I made) which is registered as a listener (and by listener I mean my own implementation) for MotionEvents. I have made my own event system for these, like so:
public interface TouchSource
{
    public static final int TYPE_TOUCHDOWN = 0;
    public static final int TYPE_TOUCHDRAG = 1;
    public static final int TYPE_TOUCHCLICK = 2;

    // Note: fields declared in a Java interface are implicitly public static final,
    // so this single Vector is shared by every class that implements TouchSource.
    public Vector<TouchListener> listeners = new Vector<TouchListener>();

    public void addTouchListener(TouchListener listener);
    public void removeTouchListener(TouchListener listener);
    public void touchOccured(int type, int xPos, int yPos);
}
And the listener interface:
public interface TouchListener
{
    public boolean touchDownOccured(int xPos, int yPos);
    public boolean touchDragOccured(int xPos, int yPos);
    public boolean touchClickOccured(int xPos, int yPos);
}
So the Screen implements TouchSource and therefore has a list of listeners. Now, despite the Screen being remade by Screen currentScreen = new Screen(); called in the onCreate() of the manager, this list of listeners is still populated with the old ScrollPanel?
How can that be? I'm clearly missing something obvious. Is the list of listeners somehow static, and therefore not dereferenced even though the app is completely remade?
I suspect the issue you're facing might have something to do with the fact that the original MotionEvents are recycled (returned to their pool) by the framework after the touch callback returns.
From the way you're using your InputObjects, I think you might be keeping a reference to the original MotionEvents in there rather than copying the event data.
As a quick test, use MotionEvent.obtain(event) wherever you use event now (this makes a copy) and see if the weird behaviour goes away. Naturally, if this works you will eventually have to recycle() those copies once you're done with them. Do not call recycle() on the original MotionEvents, though.
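Roughly, the change would look like this in the queueing code above (a sketch; where you store and recycle the copy depends on how your InputObject pool works, and the same applies to the history loop):

// When queueing: copy the event instead of holding on to the framework's instance.
MotionEvent eventCopy = MotionEvent.obtain(event);   // deep copy that we own
InputObject input = inputObjectPool.take();
input.useMotionEvent(eventCopy);
defRenderer.feedInput(input);

// Later, once the renderer is finished with it (for example inside returnToPool()):
eventCopy.recycle();   // recycle only the copy, never the original event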
Cheers, Aert.