I'm trying to rotate the camera around a cube with the LookAt() function, using the accelerometer on an Android device. It works well, but I want the rotation to stop at a certain value on the Y axis. Here is my code so far:
public Transform target; // The object to follow
public float topMargin = 0.2f; // Top rotation margin

// The position of the target
private Vector3 point;

void Start () {
    point = target.transform.position;
    transform.LookAt (point);
}

void Update () {
    // Freeze
    if (transform.rotation.y >= topMargin) {
        transform.RotateAround (point, new Vector3 (0, 1, 0), 0);
    }
    // Freeze
    else if (transform.rotation.y <= -topMargin) {
        transform.RotateAround (point, new Vector3 (0, 1, 0), 0);
    } else {
        transform.RotateAround (point, new Vector3 (0, 1, 0), Input.acceleration.x);
    }
}
The problem is that when the camera reaches the top margin, I can't start rotating again in the opposite direction. I've tried a flag variable but can't get the logic right (I tried different if/else combinations). Any suggestion on how to achieve this?
You could check the intended rotation direction. If the result would move the camera away from the boundary and back towards the allowed range, you can let the rotation through even while the margin check says you are at the limit.

Using the accelerometer value to determine the intended direction is probably the easiest and least error-prone approach (rather than checking the rotation itself).
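A minimal sketch of that idea, assuming a positive Input.acceleration.x increases transform.rotation.y in your setup (flip the comparisons if it rotates the other way):

void Update () {
    float input = Input.acceleration.x;
    bool atTop = transform.rotation.y >= topMargin;
    bool atBottom = transform.rotation.y <= -topMargin;

    // Only block input that would push further past the boundary;
    // rotation back towards the allowed range is always let through.
    if ((atTop && input > 0) || (atBottom && input < 0)) {
        return;
    }
    transform.RotateAround (point, Vector3.up, input);
}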
I am developing a LibGDX game but I am stuck on one part, which I will explain briefly:

I have 9 balls arranged in a 3*3 grid and I need to detect the ball that I'm touching, as shown in the image below:
[screenshot of the 3*3 numbered ball grid]
and I typed this code:
for (int i : listBalls) {
    touchPoint = new Vector3(Gdx.input.getX(), Gdx.input.getY(), 0);
    rectangle = new Rectangle(sprite[i].getX(), sprite[i].getY(), spriteSize, spriteSize);
    if (rectangle.contains(touchPoint.x, touchPoint.y)) {
        Gdx.app.log("Test", "Touched dragged " + String.valueOf(i));
        pointer = i;
    }
}
When I touch any ball in the top row, it detects the opposite ball in the bottom row. For example, in the image above, if I touch ball no. 2 at the top it reports ball no. 8, and the same happens when touching any of the bottom balls.

When touching any ball in the middle row, it gives the right one.

I hope I have explained my issue clearly. Please help.
As explained here (LibGDX input y-axis flipped), the coordinate system of the input is inverted on the y-axis (the origin is in the top-left corner), so try subtracting the input y from the height of your device screen or camera:
int y = screenHeight - Gdx.input.getY();
Keep in mind that using a Camera and unprojecting the input coordinates is the recommended way to detect input in LibGDX. Example:
@Override
public boolean touchDown(int screenX, int screenY, int pointer, int button) {
    Vector3 unprojected = camera.unproject(new Vector3(screenX, screenY, 0));
    // ... use unprojected.x / unprojected.y here ...
    return true; // return true if the event was handled
}
You don't even have to invert the y-coordinate; unproject() does this automatically for you. To use the correct x and y coordinates you can then use:
float x = unprojected.x;
float y = unprojected.y;
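Applied to the ball grid from the question, the hit test could look roughly like this. This is only a sketch: the listBalls, sprite[], spriteSize and pointer names are taken from the question, and a camera field is assumed to exist:

@Override
public boolean touchDown(int screenX, int screenY, int pointer, int button) {
    // unproject() converts screen coordinates to world coordinates and flips the y-axis
    Vector3 touchPoint = camera.unproject(new Vector3(screenX, screenY, 0));
    for (int i : listBalls) {
        Rectangle bounds = new Rectangle(sprite[i].getX(), sprite[i].getY(), spriteSize, spriteSize);
        if (bounds.contains(touchPoint.x, touchPoint.y)) {
            Gdx.app.log("Test", "Touched ball " + i);
            this.pointer = i; // the class field, not the method parameter
            return true;
        }
    }
    return false;
}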
I want to change the OSMdroid MapView orientation to face the direction which the user is going (calculated with Location.bearingTo between the previous and current user location on each onLocationChanged, and converted into normal degrees instead of the -180/180° East of True North degrees).
This direction is correct: I'm rotating an arrow image towards it, and the arrow points in the right direction without fail.

However, when I orientate the MapView to this userDirection using the setMapOrientation method (documented here), it isn't working as I want. When I orientate the map towards the user's direction, the arrow image should always end up pointing north, right? That is what I want to achieve: to make it seem like the arrow is always pointing forward (like a GPS tracker, where your position is always represented by an icon going forward; my arrow points in all kinds of directions because the map orientation is wrong).

I'm guessing the osmdroid MapView orientation expects a different kind of degree value, but I've tried converting back to East of True North degrees and that didn't work. Or my logic is completely wrong and it actually is working correctly.

How do I set the orientation of the MapView so that it always faces the user's current direction, so that the arrow is always pointing forward (and not backwards, right or left, ...)?
I think what you are referring to is "True North" orientation of the map using the compass. For this you need the device's compass (an orientation sensor listener) to get the heading; once you have the heading you set it on the MapView. Here is a snippet which is very helpful.
private void compassHeadingUp() {
    if (enableCompassHeadUp) {
        // Note: SensorListener and SENSOR_ORIENTATION are deprecated on modern Android,
        // but the same idea works with SensorEventListener (see the sketch below).
        mSensorManager.registerListener(mySensorEventListener,
                SensorManager.SENSOR_ORIENTATION,
                SensorManager.SENSOR_DELAY_FASTEST);
    } else {
        mSensorManager.unregisterListener(mySensorEventListener);
        mDirection = 0;
    }
}
public SensorListener mySensorEventListener = new SensorListener() {

    @Override
    public void onAccuracyChanged(int arg0, int arg1) {
    }

    @Override
    public void onSensorChanged(int sensor, float[] values) {
        synchronized (this) {
            float mHeading = values[0];
            if (Math.abs(mDirection - mHeading) > Constance.ROTATION_SENSITIVITY) {
                mMapView.setMapOrientation(-mHeading);
                mDirection = mHeading;
            }
            Matrix matrix = new Matrix();
            mCompassImageView.setScaleType(ScaleType.MATRIX);
            matrix.postRotate((float) -mHeading,
                    mCompassImageView.getDrawable().getBounds().width() / 2,
                    mCompassImageView.getDrawable().getBounds().height() / 2);
            // Set your arrow ImageView to the matrix
            mCompassImageView.setImageMatrix(matrix);
        }
    }
};
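Since SensorListener is deprecated, a rough modern equivalent (an untested sketch, reusing the mMapView and mDirection fields from the snippet above and a plain 5-degree threshold) would use SensorEventListener with the rotation vector sensor:

private final SensorEventListener headingListener = new SensorEventListener() {
    private final float[] rotationMatrix = new float[9];
    private final float[] orientation = new float[3];

    @Override
    public void onSensorChanged(SensorEvent event) {
        SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);
        SensorManager.getOrientation(rotationMatrix, orientation);
        float heading = (float) Math.toDegrees(orientation[0]); // azimuth in degrees
        if (Math.abs(mDirection - heading) > 5f) {
            mMapView.setMapOrientation(-heading);
            mDirection = heading;
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
    }
};

// Registered with:
// mSensorManager.registerListener(headingListener,
//         mSensorManager.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR),
//         SensorManager.SENSOR_DELAY_UI);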
I have solved this issue by inverting degrees like this:
float bearing = location.getBearing();
float t = (360 - bearing);
if (t < 0) {
    t += 360;
}
if (t > 360) {
    t -= 360;
}
// help smooth everything out: snap the heading to the nearest lower multiple of 5 degrees
t = (int) t;
t = t / 5;
t = (int) t;
t = t * 5;
mapOSM.setMapOrientation(t);
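For the bearing-between-fixes approach from the question, the same inversion can be applied in onLocationChanged. This is only a sketch: mapOSM and previousLocation are assumed fields, and the raw value from bearingTo() is used directly:

@Override
public void onLocationChanged(Location location) {
    if (previousLocation != null) {
        // bearingTo() returns degrees East of True North in the -180..180 range
        float bearing = previousLocation.bearingTo(location);
        if (bearing < 0) {
            bearing += 360; // normalize to 0..360
        }
        // Rotate the map against the travel direction so "forward" ends up pointing up
        mapOSM.setMapOrientation(360 - bearing);
    }
    previousLocation = location;
}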
I am trying to make it so that the CardboardMain in Unity will slowly drift in the direction that the center point of the VR is pointing. I have the script:
using UnityEngine;
using System.Collections;

public class Move : MonoBehaviour {

    public float balloon_speed = 0.0001f;

    // Use this for initialization
    void Start () {
    }

    // Update is called once per frame
    void Update () {
        //float rotLeftRight = Input.GetAxis ("Mouse X");
        //transform.Rotate (0, rotLeftRight, 0);
        Vector3 direction = new Vector3 (0, 0, balloon_speed);
        direction = transform.rotation * direction;
        transform.localPosition += direction;
    }
}
If the lines
//float rotLeftRight = Input.GetAxis ("Mouse X");
//transform.Rotate (0, rotLeftRight, 0);
are un-commented, the script works perfectly in the Unity editor. When I load it onto an Android device, the camera just drifts forwards and the direction doesn't change. I think the reason for this is that the VR coordinates are different from what transform.rotation returns. Any advice?
I'd try this:
void Update() {
    transform.localPosition += balloon_speed * Vector3.forward;
}
I think in your script you were adding a world-coordinate vector (rotation * direction) to the local-coordinate position.
I am using LibGDX and Box2d to build my first Android game. Yay!
But I am having some serious problems with Box2d.
I have a simple stage with a rectangular Box2d body at the bottom representing the ground, and two other rectangular Box2d bodies both at the left and right representing the walls.
A Screenshot
Another Screenshot
I also have a box. This box can be touched and it moves using applyLinearImpulse, like if it was kicked. It is a DynamicBody.
What happens is that in the draw() code of the Box object, its Box2D body is giving me a wrong value on the X axis. The value on the Y axis is fine.

Those blue "dots" in the screenshots are small textures that I draw at the box corners computed from body.getPosition(). Note how in one screenshot the dots are aligned with the actual DebugRenderer rectangle and in the other they are not.

This is what happens: when the box moves, the alignment is lost during the movement.

The collisions between the box, the ground and the walls occur precisely at the area that the DebugRenderer renders. But body.getPosition() and fixture.testPoint() consider the area inside those blue dots.
So, somehow, Box2d is "maintaining" these two areas for the same body.
I thought that this could be some kind of "loss of precision" between my conversions of pixels and meters (I am scaling by 100 times) but the Y axis uses the same technique and it's fine.
So, I thought that I might be missing something.
Edit 1
I am converting from Box coordinates to World coordinates. If you see the blue debug sprites in the screenshots, they form the box almost perfectly.
public static final float WORLD_TO_BOX = 0.01f;
public static final float BOX_TO_WORLD = 100f;
The box render code:
public void draw(Batch batch, float alpha) {
    x = (body.getPosition().x - width/2) * TheBox.BOX_TO_WORLD;
    y = (body.getPosition().y - height/2) * TheBox.BOX_TO_WORLD;
    float xend = (body.getPosition().x + width/2) * TheBox.BOX_TO_WORLD;
    float yend = (body.getPosition().y + height/2) * TheBox.BOX_TO_WORLD;

    batch.draw(texture, x, y);
    batch.draw(texture, x, yend);
    batch.draw(texture, xend, yend);
    batch.draw(texture, xend, y);
}
Edit 2
I am starting to suspect the camera. I have a DebugRenderer and a scene2d Stage. Here is the code:
My screen resolution (Nexus 5, and it's portrait):
public static final int SCREEN_WIDTH = 1080;
public static final int SCREEN_HEIGHT = 1920;
At the startup:
// ...
stage = new Stage(SCREEN_WIDTH, SCREEN_HEIGHT, true);
camera = new OrthographicCamera();
camera.setToOrtho(false, SCREEN_WIDTH, SCREEN_HEIGHT);
debugMatrix = camera.combined.cpy();
debugMatrix.scale(BOX_TO_WORLD, BOX_TO_WORLD, 1.0f);
debugRenderer = new Box2DDebugRenderer();
// ...
Now, the render() code:
public void render() {
    Gdx.gl.glClearColor(0, 0, 0, 1);
    Gdx.gl.glClear(GL10.GL_COLOR_BUFFER_BIT);

    camera.update();

    world.step(1/45f, 6, 6);
    world.clearForces();

    stage.act(Gdx.graphics.getDeltaTime());
    stage.draw();

    debugRenderer.render(world, debugMatrix);
}
Looks like the answer to that one was fairly simple:
stage.setCamera(camera);
I was not setting the OrthographicCamera to the stage, so the stage was using some kind of default camera that wasn't aligned with my stuff.
It had nothing to do with Box2D in the end. Box2D was returning healthy values, but these values corresponded to the wrong places on my screen because of the wrong stage resolution.
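Putting that fix in context, the startup from Edit 2 would become something like this (a sketch using the same old Stage API as in the question):

// ...
camera = new OrthographicCamera();
camera.setToOrtho(false, SCREEN_WIDTH, SCREEN_HEIGHT);

stage = new Stage(SCREEN_WIDTH, SCREEN_HEIGHT, true);
stage.setCamera(camera); // the missing line: stage and debug renderer now share one camera

debugMatrix = camera.combined.cpy();
debugMatrix.scale(BOX_TO_WORLD, BOX_TO_WORLD, 1.0f);
debugRenderer = new Box2DDebugRenderer();
// ...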
I'm trying to rotate my Bitmap using a readymade solution I found somewhere. The code is below:
public void onDraw(Canvas canvas) {
    float x = ship.Position.left;
    float y = ship.Position.top;
    canvas.drawBitmap(ship.ship, x, y, null);
    invalidate();
}
However, when I do it, the X and Y axes change their direction: if I increase Y the image goes towards the top of the screen, not towards the bottom. The same happens to X if I rotate by 90 degrees.

I need to rotate it but without changing the directions of the X and Y axes.

Even when rotated, I still want the Bitmap to go towards the bottom if I increase Y and to the right if I increase X.
public void update()
{
    if (!moving)
    {
        fall();
    }
    else // moving
    {
        move();
        faceDirection();
    }
    Position.top += Speed;
}

private void move() {
    if (Speed < MAXSPEED)
        Speed -= 0.5f;
}

private void fall() {
    if (Speed > MAXSPEED*-1)
        Speed += 0.2f;
}
private void faceDirection() {
    double OldDiretion = Direction;
    Direction = DirectionHelper.FaceObject(Position, ClickedDiretion);
    if (Direction != OldDiretion)
    {
        Matrix matrix = new Matrix();
        matrix.postRotate((float) Direction);
        ship = Bitmap.createBitmap(ship, 0, 0, ship.getWidth(), ship.getHeight(), matrix, false);
    }
}
I tried the code above, but it's still changing the Y direction: the ship goes towards the bottom of the Bitmap, not the bottom of the screen.
Here is the project: https://docs.google.com/file/d/0B8V9oTk0eiOKOUZJMWtsSmUtV3M/edit?usp=sharing
You should first rotate, then translate:
matrix.postTranslate(x, y);
matrix.postRotate(degree);
An alternative would be to try using preRotate() instead of postRotate().

I also strongly recommend translating/rotating the original while drawing, so your createBitmap() call shouldn't modify the orientation, at least not when it changes dynamically on user interaction. Otherwise you would create a lot of bitmaps over and over again to represent the rotations, which would hurt performance.
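A minimal sketch of rotating while drawing, reusing the ship.ship, ship.Position and Direction names from the question (not tested against the linked project):

public void onDraw(Canvas canvas) {
    Matrix matrix = new Matrix();
    // Rotate around the bitmap's own center...
    matrix.postRotate((float) Direction,
            ship.ship.getWidth() / 2f, ship.ship.getHeight() / 2f);
    // ...then move it to its screen position, so increasing x/y
    // still moves the ship right/down on the screen.
    matrix.postTranslate(ship.Position.left, ship.Position.top);
    canvas.drawBitmap(ship.ship, matrix, null);
}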
The problem is that you don't actually rotate the bitmap - you just draw it rotated. So, the next time you redraw it, you first push it towards the bottom or right by incrementing x/y and then rotate it.
You have to actually rotate the bitmap itself. You could use the following code:
ship.ship = Bitmap.createBitmap(ship.ship, 0, 0, ship.ship.getWidth(), ship.ship.getHeight(), matrix, false);
Here you create a new rotated bitmap and set your reference to point to it.
Note! You must do this only once! So you can't do it in the onDraw method, since then it will get rotated every time it's redrawn. You have to do it somewhere else and then draw it as usual in onDraw (without the matrix rotations).
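A minimal sketch of that split, reusing the names from the question (where exactly you perform the one-time rotation depends on your own update logic):

// Somewhere that runs only when the direction actually changes, NOT in onDraw:
Matrix matrix = new Matrix();
matrix.postRotate((float) Direction);
ship.ship = Bitmap.createBitmap(ship.ship, 0, 0,
        ship.ship.getWidth(), ship.ship.getHeight(), matrix, false);

// onDraw then stays a plain draw, so +x still moves right and +y still moves down:
public void onDraw(Canvas canvas) {
    canvas.drawBitmap(ship.ship, ship.Position.left, ship.Position.top, null);
    invalidate();
}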