Add a sprite on the circumference of a circle - Android

I want to randomly generate sprites on the circumference of a circle, but even after several hours of research I couldn't come up with a solution.
This is what I've managed so far. I've used this formula for it:
Sprite * pin = Sprite::create("pin.png");
pin->setPosition(Vec2((_circle->getContentSize().width/2)*(0.7/3), _circle->getContentSize().height*0.7));
Sprite * pin2 = Sprite::create("pin.png");
pin2->setPosition(Vec2((_circle->getContentSize().width/2)*(0.6/3), _circle->getContentSize().height*0.6));
Sprite * pin3 = Sprite::create("pin.png");
pin3->setPosition(Vec2((_circle->getContentSize().width/2)*(0.8/3), _circle->getContentSize().height*0.8));
Sprite * pin4 = Sprite::create("pin.png");
pin4->setPosition(Vec2((_circle->getContentSize().width/2)*(0.9/3), _circle->getContentSize().height*0.9));
Sprite * pin5 = Sprite::create("pin.png");
pin5->setPosition(Vec2((_circle->getContentSize().width/2)*(1/3), _circle->getContentSize().height));
_circle->addChild(pin);
_circle->addChild(pin2);
_circle->addChild(pin3);
_circle->addChild(pin4);
_circle->addChild(pin5);
But I want something like this (with the correct angles, which I couldn't get right in my sample image).
Please suggest a precise solution for this. Thanks for your time!

Basic trig stuff -- sin and cos are your friends.
Example:
const float circle_x = ...;      // center x of the circle
const float circle_y = ...;      // center y of the circle
const float circle_radius = ...; // radius of the circle
const float angle = ...;         // angle in radians
const float x = cos(angle)*circle_radius + circle_x;
const float y = sin(angle)*circle_radius + circle_y;
// Draw stuff at (x, y).

First, it's "circumference", not "circumstance" (that will help with your searches).
Second, you are using the size of the image, not the circle inside the image.
Third, you will need basic trigonometry for the solution. Determining points on a circle requires the sin and cos functions. Once you find the center of the circle and its radius, the positions should be easy to calculate with just a little research.
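A minimal sketch of that idea, shown here in plain Java since the math is the same in any language (the question's cocos2d-x code would just pass x and y to setPosition). The center and radius values below are assumptions for illustration:
// Sketch: place N pins at random angles on the circumference of a circle.
// centerX, centerY and radius are hypothetical values describing the circle inside your image.
java.util.Random rng = new java.util.Random();
float centerX = 160f, centerY = 160f, radius = 150f;
int pinCount = 5;
for (int i = 0; i < pinCount; i++) {
    double angle = rng.nextDouble() * 2.0 * Math.PI;          // random angle in radians
    float x = (float) (centerX + Math.cos(angle) * radius);
    float y = (float) (centerY + Math.sin(angle) * radius);
    // In cocos2d-x this would be: pin->setPosition(Vec2(x, y));
    System.out.println("pin " + i + " at (" + x + ", " + y + ")");
}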

Related

Can an ellipse be drawn on google maps in Android?

I want to draw an ellipse on Google Maps in my Android app. I have already drawn a circle, but now I want to draw an ellipse. Is there a way to do that?
This is how I am drawing the circle:
private void drawMarkerWithCircle(LatLng position) {
int strokeColor = 0xffff0000; // red outline
int shadeColor = 0x44ff0000;  // translucent red fill
CircleOptions circleOptions = new CircleOptions().center(position).radius(radiusInMeters)
        .fillColor(shadeColor).strokeColor(strokeColor).strokeWidth(8);
mCircle = mMap.addCircle(circleOptions);
}
How should I change this code to get an ellipse-shaped area instead?
This question is a little old, but I faced the same issue, so I will explain how I solved it. As far as I know there is no ellipse feature in Google Maps; the approach I use is to draw a polygon, calculating the vertices from the parametric equation of the ellipse.
If you have the center of the ellipse and the two radii (semi-major and semi-minor, here expressed in degrees; radians below is the degree-to-radian factor, Math.PI / 180):
for (var angle = 1; angle <= 360; angle++)//360 points
{
var py = center.Latitude + semiMajor * Math.Cos(radians * angle);
var px = center.Longitude + semiMinor * Math.Sin(radians * angle);
point = new LatLng(py, px);
polygon.Points.Add(point);
}
You can increase the angle variable by more than 1 per step in order to get fewer points. You can check this post for the entire example.
Hope it helps!
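For an Android (Java) project using the Google Maps API directly, the same idea could look roughly like the sketch below. It assumes the two radii are already expressed in degrees of latitude/longitude (converting from meters is a separate step) and reuses the stroke/fill colors from the question:
// Sketch: approximate an ellipse with a Polygon on Google Maps (Android, Java).
// Uses com.google.android.gms.maps.model.{LatLng, Polygon, PolygonOptions}.
// semiMajorDeg / semiMinorDeg are assumed to be the radii in degrees, not meters.
private Polygon drawEllipse(GoogleMap map, LatLng center,
                            double semiMajorDeg, double semiMinorDeg) {
    PolygonOptions options = new PolygonOptions()
            .strokeColor(0xffff0000)   // red outline
            .fillColor(0x44ff0000)     // translucent red fill
            .strokeWidth(8);
    for (int angle = 0; angle < 360; angle += 5) {            // 72 vertices
        double rad = Math.toRadians(angle);
        double lat = center.latitude + semiMajorDeg * Math.cos(rad);
        double lng = center.longitude + semiMinorDeg * Math.sin(rad);
        options.add(new LatLng(lat, lng));
    }
    return map.addPolygon(options);
}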

Libgdx - box2d polygonshape doesn't fit its sprite

I have a body with a PolygonShape created using setAsBox, but when I run my game the box is a bit bigger than my sprite.
I know setAsBox takes half-width and half-height, I used my scaling constant to convert meters to pixels, and I know the sprite has its origin at the bottom left as well. Despite that, I still get a box whose width is a bit larger than the sprite, and the gap stays the same no matter how I change the size of the box.
This is the code I use to create my box (160 is the constant to scale meters to pixels):
public Block(World w, float halfWidth, float halfHeight, Vector2 position, Texture tex){
world = w;
bodyd = new BodyDef();
bodyd.type = BodyDef.BodyType.KinematicBody;
bodyd.gravityScale = 0;
shape = new PolygonShape();
shape.setAsBox(halfWidth, halfHeight);
fixtured = new FixtureDef();
fixtured.shape = shape;
fixtured.density = DENS;
fixtured.friction = FRIC;
fixtured.restitution = REST;
bodyd.position.set(new Vector2(position.x, position.y));
body = world.createBody(bodyd);
fixture = body.createFixture(fixtured);
body.setUserData(this);
texture = tex;
sprite = new Sprite(texture);
sprite.setSize(halfWidth * 2 * 160, halfHeight*2*160);
sprite.setPosition((body.getPosition().x - halfWidth) * 160, (body.getPosition().y - halfHeight) * 160);
}
Can you try using Box2DSprite? It's very easy:
https://bitbucket.org/dermetfan/libgdx-utils/wiki/net.dermetfan.gdx.graphics.g2d.Box2DSprite
http://www.java-gaming.org/index.php?topic=29843.0
I don't see anything wrong with your code.
Did you consider that the size you set for your sprite is the size of the full image, not the size of the block drawn inside your sprite? I think this is why your brick sprite is smaller than your brick's physics box.
Unless your brick fills the whole sprite, that transparent padding is the likely cause; if it does fill the sprite, then the problem may be related to something else.
Hope that was helpful!
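If the transparent padding around the drawn block is indeed the cause, one option (a sketch, assuming you know or measure the visible block's size in pixels, and that the block is centered in the texture) is to derive the physics half-extents from the visible region instead of the full image:
// Sketch: size the Box2D box from the visible part of the texture, not the full image.
// visibleWidthPx / visibleHeightPx are hypothetical values measured in your image editor,
// and the opaque block is assumed to be centered inside the texture.
float PPM = 160f;                               // pixels per meter, as in the question
float visibleWidthPx = 120f;                    // hypothetical
float visibleHeightPx = 60f;                    // hypothetical

float halfWidth = (visibleWidthPx / PPM) / 2f;
float halfHeight = (visibleHeightPx / PPM) / 2f;

PolygonShape shape = new PolygonShape();
shape.setAsBox(halfWidth, halfHeight);          // physics box now matches the visible block

// Keep drawing the full texture centered on the body so the visible block lines up with it:
sprite.setSize(texture.getWidth(), texture.getHeight());
sprite.setPosition(body.getPosition().x * PPM - sprite.getWidth() / 2f,
                   body.getPosition().y * PPM - sprite.getHeight() / 2f);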

Imprecise Box2d coordinates using LibGDX

I am using LibGDX and Box2d to build my first Android game. Yay!
But I am having some serious problems with Box2d.
I have a simple stage with a rectangular Box2d body at the bottom representing the ground, and two other rectangular Box2d bodies both at the left and right representing the walls.
A Screenshot
Another Screenshot
I also have a box. This box can be touched and it moves using applyLinearImpulse, like if it was kicked. It is a DynamicBody.
What happens is that in my draw() code of the Box object, the Box2d body of the Box object is giving me a wrong value for the X axis. The value for the Y axis is fine.
Those blue "dots" on the screenshots are small textures that I printed on the box edges that body.getPosition() give me. Note how in one screenshot the dots are aligned with the actual DebugRenderer rectangle and in the other they are not.
This is what is happening: when the box moves, the alignment is lost in the movement.
The collision between the box, the ground and the walls occur precisely considering the area that the DebugRenderer renders. But body.getPosition() and fixture.testPoint() considers that area inside those blue dots.
So, somehow, Box2d is "maintaining" these two areas for the same body.
I thought that this could be some kind of "loss of precision" between my conversions of pixels and meters (I am scaling by 100 times) but the Y axis uses the same technique and it's fine.
So, I thought that I might be missing something.
Edit 1
I am converting from Box coordinates to World coordinates. If you see the blue debug sprites in the screenshots, they form the box almost perfectly.
public static final float WORLD_TO_BOX = 0.01f;
public static final float BOX_TO_WORLD = 100f;
The box render code:
public void draw(Batch batch, float alpha) {
x = (body.getPosition().x - width/2) * TheBox.BOX_TO_WORLD;
y = (body.getPosition().y - height/2) * TheBox.BOX_TO_WORLD;
float xend = (body.getPosition().x + width/2) * TheBox.BOX_TO_WORLD;
float yend = (body.getPosition().y + height/2) * TheBox.BOX_TO_WORLD;
batch.draw(texture, x, y);
batch.draw(texture, x, yend);
batch.draw(texture, xend, yend);
batch.draw(texture, xend, y);
}
Edit 2
I am starting to suspect the camera. I have a Box2DDebugRenderer and a scene2d Stage. Here is the code:
My screen resolution (Nexus 5, and it's portrait):
public static final int SCREEN_WIDTH = 1080;
public static final int SCREEN_HEIGHT = 1920;
At the startup:
// ...
stage = new Stage(SCREEN_WIDTH, SCREEN_HEIGHT, true);
camera = new OrthographicCamera();
camera.setToOrtho(false, SCREEN_WIDTH, SCREEN_HEIGHT);
debugMatrix = camera.combined.cpy();
debugMatrix.scale(BOX_TO_WORLD, BOX_TO_WORLD, 1.0f);
debugRenderer = new Box2DDebugRenderer();
// ...
Now, the render() code:
public void render() {
Gdx.gl.glClearColor(0, 0, 0, 1);
Gdx.gl.glClear(GL10.GL_COLOR_BUFFER_BIT);
camera.update();
world.step(1/45f, 6, 6);
world.clearForces();
stage.act(Gdx.graphics.getDeltaTime());
stage.draw();
debugRenderer.render(world, debugMatrix);
}
Looks like the answer to that one was fairly simple:
stage.setCamera(camera);
I was not setting the OrthographicCamera on the stage, so the stage was using some kind of default camera that wasn't aligned with my other objects.
It had nothing to do with Box2d in the end. Box2d was returning healthy values, but those values corresponded to the wrong places on my screen because of the wrong stage resolution.
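For reference, a sketch of the corrected startup order (this assumes the pre-1.0 LibGDX Stage API used in the question, where Stage still exposes setCamera):
// Corrected startup: the stage now shares the same camera as everything else.
camera = new OrthographicCamera();
camera.setToOrtho(false, SCREEN_WIDTH, SCREEN_HEIGHT);

stage = new Stage(SCREEN_WIDTH, SCREEN_HEIGHT, true);
stage.setCamera(camera);                              // the missing call

debugMatrix = camera.combined.cpy();
debugMatrix.scale(BOX_TO_WORLD, BOX_TO_WORLD, 1.0f);  // debug renderer works in meters
debugRenderer = new Box2DDebugRenderer();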

Augmented Reality + Bullet Physics - trouble with rayTest/Ray picking

I am trying to pick objects in the Bullet physics world, but all I ever seem to pick is the floor/ground plane! I am using the Vuforia SDK and have altered the ImageTargets demo code. I use the following code to project my touched screen points into the 3D world:
void projectTouchPointsForBullet(QCAR::Vec2F point, QCAR::Vec3F &lineStart, QCAR::Vec3F &lineEnd, QCAR::Matrix44F &modelViewMatrix)
{
QCAR::Vec4F normalisedVector((2 * point.data[0] / screenWidth - 1),
(2 * (screenHeight-point.data[1]) / screenHeight - 1),
-1,
1);
QCAR::Matrix44F modelViewProjection;
SampleUtils::multiplyMatrix(&projectionMatrix.data[0], &modelViewMatrix.data[0] , &modelViewProjection.data[0]);
QCAR::Matrix44F inversedMatrix = SampleMath::Matrix44FInverse(modelViewProjection);
QCAR::Vec4F near_point = SampleMath::Vec4FTransform( normalisedVector,inversedMatrix);
near_point.data[3] = 1.0/near_point.data[3];
near_point = QCAR::Vec4F(near_point.data[0]*near_point.data[3], near_point.data[1]*near_point.data[3], near_point.data[2]*near_point.data[3], 1);
normalisedVector.data[2] = 1.0;//z coordinate now 1
QCAR::Vec4F far_point = SampleMath::Vec4FTransform( normalisedVector, inversedMatrix);
far_point.data[3] = 1.0/far_point.data[3];
far_point = QCAR::Vec4F(far_point.data[0]*far_point.data[3], far_point.data[1]*far_point.data[3], far_point.data[2]*far_point.data[3], 1);
lineStart = QCAR::Vec3F(near_point.data[0],near_point.data[1],near_point.data[2]);
lineEnd = QCAR::Vec3F(far_point.data[0],far_point.data[1],far_point.data[2]);
}
When I try a ray test in my physics world I only seem to be hitting the ground plane! Here is the code for the ray test call:
QCAR::Vec3F intersection, lineStart, lineEnd;
projectTouchPointsForBullet(QCAR::Vec2F(touch1.tapX, touch1.tapY), lineStart, lineEnd,inverseProjMatrix, modelViewMatrix);
btVector3 btRayFrom = btVector3(lineEnd.data[0], lineEnd.data[1], lineEnd.data[2]);
btVector3 btRayTo = btVector3(lineStart.data[0], lineStart.data[1], lineStart.data[2]);
btCollisionWorld::ClosestRayResultCallback rayCallback(btRayFrom,btRayTo);
dynamicsWorld->rayTest(btRayFrom, btRayTo, rayCallback);
if(rayCallback.hasHit())
{
char* pPhysicsData = reinterpret_cast<char*>(rayCallback.m_collisionObject->getUserPointer());//my bodies have char* messages attached to them to determine what has been touched
btRigidBody* pBody = btRigidBody::upcast(rayCallback.m_collisionObject);
if (pBody && pPhysicsData)
{
LOG("handleTouches:: notifyOnTouchEvent from physics world!!!");
notifyOnTouchEvent(env, obj,0,0, pPhysicsData);
}
}
I know I am predominantly looking top-down so I am bound to hit the ground plane, I at least know my touch is being correctly projected into the world, but I have objects lying on the ground plane and I can't seem to be able to touch them! Any pointers would be greatly appreciated :)
I found out why I wasn't able to touch the objects: I am scaling the objects up when they are drawn, so I had to scale the view matrix by the same value before projecting my touch point into the 3D world. (EDIT: I also had the btRayFrom and btRayTo input coordinates reversed; that is now fixed.)
//top of code
const float kObjectScale = 100.0f;
....
...
//inside touch handler method
SampleUtils::scalePoseMatrix(kObjectScale, kObjectScale, kObjectScale,&modelViewMatrix.data[0]);
projectTouchPointsForBullet(QCAR::Vec2F(touch1.tapX, touch1.tapY), lineStart, lineEnd,inverseProjMatrix, modelViewMatrix);
btVector3 btRayFrom = btVector3(lineStart.data[0], lineStart.data[1], lineStart.data[2]);
btVector3 btRayTo = btVector3(lineEnd.data[0], lineEnd.data[1], lineEnd.data[2]);
My touches are projected correctly now :)

Move an object on on a Bézier curve path

I want to move my image along a Bézier curve path from top to bottom, but I can't figure out how to calculate the x/y points and the slope along this path. The path looks like the following image:
I have start points, end points and two control points.
Path path = new Path();
Point s = new Point(150, 5);
Point cp1 = new Point(140, 125);
Point cp2 = new Point(145, 150);
Point e = new Point(200, 250);
path.moveTo(s.x, s.y);
path.cubicTo(cp1.x, cp1.y, cp2.x, cp2.y, e.x, e.y);
Android gives you an API to accomplish what you want. Use the class called android.graphics.PathMeasure. There are two methods you will find useful: getLength(), to retrieve the total length in pixels of the path, and getPosTan(), to retrieve the X,Y position of a point on the curve at a specified distance (as well as the tangent at this location.)
For instance, if getLength() returns 200 and you want to know the X,Y position of the point in the middle of the curve, call getPosTan() with distance=100.
More info: http://developer.android.com/reference/android/graphics/PathMeasure.html
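A minimal sketch of that approach with the standard android.graphics classes; the atan2 call at the end is the usual way to turn the tangent into a rotation angle for the image:
// Sketch: sample the Path with PathMeasure and read position + tangent.
PathMeasure measure = new PathMeasure(path, false);
float length = measure.getLength();

float[] pos = new float[2];   // x, y of the point on the curve
float[] tan = new float[2];   // tangent vector at that point

float fraction = 0.5f;        // 0 = start of the path, 1 = end
measure.getPosTan(length * fraction, pos, tan);

float x = pos[0];
float y = pos[1];
float angleDegrees = (float) Math.toDegrees(Math.atan2(tan[1], tan[0])); // slope as a rotation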
This is a cubic Bézier curve, for which the formula is simply B(t) = (1 - t)^3 * P0 + 3 * (1 - t)^2 * t * P1 + 3 * (1 - t) * t^2 * P2 + t^3 * P3. With this you can solve for each point by evaluating the equation. In Java you could do it like this:
/* t is time (a value from 0.0f to 1.0f; 0 is the start, 1 is the end).
   Point is assumed to have float x/y fields (android.graphics.PointF would work). */
Point CalculateBezierPoint(float t, Point s, Point c1, Point c2, Point e)
{
float u = 1 - t;
float tt = t*t;
float uu = u*u;
float uuu = uu * u;
float ttt = tt * t;
Point p = new Point(s.x * uuu, s.y * uuu);
p.x += 3 * uu * t * c1.x;
p.y += 3 * uu * t * c1.y;
p.x += 3 * u * tt * c2.x;
p.y += 3 * u * tt * c2.y;
p.x += ttt * e.x;
p.y += ttt * e.y;
return p;
}
So if you wanted to move a sprite along the path, you would simply set the t value between 0 and 1 depending on how far along the path you want to be. Example:
int percentMovedPerFrame = 1;// Will complete path in 100 frames
int currentPercent = 0;
update() {
if (currentPercent < 100) {
this.pos = CalculateBezierPoint(currentPercent / 100.0f, this.path.s, this.path.c1, this.path.c2, this.path.e);
currentPercent += percentMovedPerFrame;
}
}
To find a point on a Bezier curve, you can use the De Casteljau algorithm.
See for example http://www.cs.mtu.edu/~shene/COURSES/cs3621/NOTES/spline/Bezier/de-casteljau.html or use Google to find some implementations.
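A small sketch of De Casteljau for the cubic case (plain Java; repeated linear interpolation between the control points, equivalent to the polynomial form shown above):
// Sketch: De Casteljau evaluation of a cubic Bezier via repeated linear interpolation.
static float lerp(float a, float b, float t) { return a + (b - a) * t; }

static float[] deCasteljau(float t, float[] p0, float[] p1, float[] p2, float[] p3) {
    float ax = lerp(p0[0], p1[0], t), ay = lerp(p0[1], p1[1], t);
    float bx = lerp(p1[0], p2[0], t), by = lerp(p1[1], p2[1], t);
    float cx = lerp(p2[0], p3[0], t), cy = lerp(p2[1], p3[1], t);
    float dx = lerp(ax, bx, t),       dy = lerp(ay, by, t);
    float ex = lerp(bx, cx, t),       ey = lerp(by, cy, t);
    return new float[] { lerp(dx, ex, t), lerp(dy, ey, t) };  // point on the curve at t
}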
If you only have 2 control points, a Bezier curve is a straight line.
If you have 3, you have a quadratic curve. 4 control points define a cubic curve.
Bezier curves are functions of a parameter, often called "time", that runs from 0.0 to 1.0. If you plug 0 into the equation, you get the value at the beginning of the curve; if you plug in 1.0, the value at the end.
Bezier curves interpolate the first and last control points, so those would be your starting and ending points. Look carefully at what package or library you are using to generate the curve.
To orient your image with the tangent vector of the curve, you have to differentiate the curve equation (you can look up the cubic bezier curve equation on wiki). That will give you the tangent vector to orient your image.
Note that changing the parameter in the parametric form of a cubic bezier does not produce linear results. In other words, setting t=0.5 does not give you a point that is halfway along the curve. Depending on the curvature (which is defined by control points) there will be non-linearities along the path.
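For the tangent/orientation step mentioned above, a sketch of the cubic Bezier derivative (standard calculus, using the same start/control/end ordering as the code earlier in this thread):
// Sketch: derivative of a cubic Bezier; B'(t) = 3(1-t)^2 (c1-s) + 6(1-t)t (c2-c1) + 3t^2 (e-c2).
static float[] cubicBezierTangent(float t, float[] s, float[] c1, float[] c2, float[] e) {
    float u = 1 - t;
    float dx = 3*u*u*(c1[0] - s[0]) + 6*u*t*(c2[0] - c1[0]) + 3*t*t*(e[0] - c2[0]);
    float dy = 3*u*u*(c1[1] - s[1]) + 6*u*t*(c2[1] - c1[1]) + 3*t*t*(e[1] - c2[1]);
    return new float[] { dx, dy };
}
// Rotation for the image, in degrees: (float) Math.toDegrees(Math.atan2(dy, dx))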
For anyone who needs to calculate static points of a Bezier curve, a Bezier curve calculator is a good resource, especially if you work in the fourth quadrant (i.e. between the X axis and the -Y axis). You can then map the result directly onto the Android coordinate system by taking the modulus (absolute value) of the negative coordinates.
