From the image you can see that the ball fired on the left does not match the calculated trajectory drawn behind it. I draw the ball trajectory using an equation from an SO question, modified to take the Box2D step of 30 frames per second into account. It calculates a valid-looking trajectory, but it does not match the actual trajectory of the ball; the ball follows a smaller arc. I apply a Box2D force to the ball, which also has a density and a shape set; the shape radius varies depending on the type of ball. I set the start velocity in the touchDown event.
public class ProjectileEquation {
    public float gravity;
    public Vector2 startVelocity = new Vector2();
    public Vector2 startPoint = new Vector2();
    public Vector2 gravityVec = new Vector2(0, -10f);

    public float getX(float n) {
        return startVelocity.x * (n * 1 / 30f) + startPoint.x;
    }

    public float getY(float n) {
        float t = 1 / 30f * n;
        return 0.5f * gravity * t * t + startVelocity.y * t + startPoint.y;
    }
}
@Override
public void draw(SpriteBatch batch, float parentAlpha) {
    float t = 0f;
    float width = this.getWidth();
    float height = this.getHeight();
    float timeSeparation = this.timeSeparation;

    for (int i = 0; i < trajectoryPointCount; i += timeSeparation) {
        //projectileEquation.getTrajectoryPoint(this.getX(), this.getY(), i);
        float x = this.getX() + projectileEquation.getX(i);
        float y = this.getY() + projectileEquation.getY(i);

        batch.setColor(this.getColor());
        if (trajectorySprite != null) batch.draw(trajectorySprite, x, y, width, height);
        // t += timeSeparation;
    }
}
public boolean touchDown(InputEvent event, float x, float y, int pointer, int button) {
    if (button == 1 || world.showingDialog) return false;

    touchPos.set(x, y);
    // Vector2.sub() mutates touchPos, so subtract once and reuse the result
    // (calling sub() again for the length would subtract the cannon position twice).
    touchPos.sub(playerCannon.position);
    float angle = touchPos.angle();
    if (angle > 270) {
        angle = 0;
    } else if (angle > 70) {
        angle = 70;
    }
    playerCannon.setAngle(angle);
    world.trajPath.controller.angle = angle;

    float radians = angle * MathUtils.degreesToRadians;
    float ballSpeed = touchPos.len() * 12;

    world.trajPath.projectileEquation.startVelocity.x = (float) (Math.cos(radians) * ballSpeed);
    world.trajPath.projectileEquation.startVelocity.y = (float) (Math.sin(radians) * ballSpeed);

    return true;
}
public CannonBall(float x, float y, float width, float height, float damage, World world, Cannon cannonOwner) {
    super(x, y, width, height, damage, world);
    active = false;

    shape = new CircleShape();
    shape.setRadius(width / 2);

    FixtureDef fd = new FixtureDef();
    fd.shape = shape;
    fd.density = 4.5f;
    if (cannonOwner.isEnemy) { // Enemy cannon balls cannot hit other enemy cannons, just the player
        fd.filter.groupIndex = -16;
    }

    bodyDef.type = BodyType.DynamicBody;
    bodyDef.position.set(this.position);

    body = world.createBody(bodyDef);
    body.createFixture(fd);
    body.setUserData(this);
    body.setBullet(true);

    this.cannonOwner = cannonOwner;
    this.hitByBall = null;
    this.particleEffect = null;
}
private CannonBall createCannonBall(float radians, float ballSpeed, float radius, float damage) {
    CannonBall cannonBall = new CannonBall(CannonEnd().x, CannonEnd().y, radius * ballSizeMultiplier, radius * ballSizeMultiplier, damage, this.world, this);
    cannonBall.velocity.x = (float) (Math.cos(radians) * ballSpeed);
    //cannonBall.velocity.x = (float) ((Math.sqrt(10) * Math.sqrt(29) *
    //        Math.sqrt((Math.tan(cannon.angle)*Math.tan(cannon.angle))+1)) / Math.sqrt(2 * Math.tan(cannon.angle) - (2 * 10 * 2)/29))* -1f;
    cannonBall.velocity.y = (float) (Math.sin(radians) * ballSpeed);
    cannonBall.active = true;
    //cannonBall.body.applyLinearImpulse(cannonBall.velocity, cannonBall.position);
    cannonBall.body.applyForce(cannonBall.velocity, cannonBall.position);
    return cannonBall;
}
trajPath = new TrajectoryActor(-10f);
trajPath.setX(playerCannon.CannonEnd().x);
trajPath.setY(playerCannon.CannonEnd().y);
trajPath.setWidth(10f);
trajPath.setHeight(10f);
stage.addActor(trajPath);
Here is code that I used for one of my other games, which proved to be very precise. The trick is to apply the impulse to the body and read back the initial velocity. With that, I calculate 10 positions that the body will pass through within 0.5 seconds. The language is Squirrel, which is Lua-based with C/C++-like syntax, so you should be able to grasp what is going on. What getTrajectoryPointsForObjectAtImpulse returns is an array of 10 positions through which the ball will pass within 0.5 seconds.
const TIMESTER_DIVIDOR = 60.0;

function getTrajectoryPoint( startingPosition, startingVelocity, n )
{
    local gravity = box2DWorld.GetGravity();
    local t = 1 / 60.0;

    local stepVelocity = b2Vec2.Create( t * startingVelocity.x, t * startingVelocity.y );
    local stepGravity = b2Vec2.Create( t * t * gravity.x, t * t * gravity.y );

    local result = b2Vec2.Create( 0, 0 );
    result.x = ( startingPosition.x + n * stepVelocity.x + 0.5 * ( n * n + n ) * stepGravity.x ) * MTP;
    result.y = ( startingPosition.y + n * stepVelocity.y + 0.5 * ( n * n + n ) * stepGravity.y ) * -MTP;

    return result;
}
function getTrajectoryPointsForObjectAtImpulse (object, impulse)
{
    if( !object || !impulse ) return [];

    local result = [];

    object.bBody.ApplyLinearImpulse( impulse, object.bBody.GetWorldCenter() );
    local initialVelocity = object.bBody.GetLinearVelocity();
    object.bBody.SetLinearVelocity( b2Vec2.Create(0, 0) );
    object.bBody.SetActive(false);

    for ( local i = 0.0 ; i < ( 0.5 * TIMESTER_DIVIDOR ) ; )
    {
        result.append( getTrajectoryPoint(object.bBody.GetPosition(), initialVelocity, i.tointeger() ) );
        i += ( (0.5 * TIMESTER_DIVIDOR) * 0.1 );
    }

    return result;
}
If you do not understand any part of the code, please let me know and I will try to explain.
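For anyone on Java, the closed-form step equation above ports directly. The sketch below is mine (the class and method names are not from libGDX or Box2D): it mirrors Box2D-style semi-implicit Euler integration, which is why the `0.5 * (n*n + n)` term appears instead of the continuous `0.5 * t * t`, and it includes a step-by-step reference simulation you can check the closed form against.

```java
// Closed-form position after n fixed steps of length dt, matching
// semi-implicit Euler integration: v += dt * g; p += dt * v.
class StepTrajectory {
    public static float positionAfterSteps(float p0, float v0, float g, float dt, int n) {
        float stepVelocity = dt * v0;     // velocity contribution per step
        float stepGravity = dt * dt * g;  // gravity contribution per step
        return p0 + n * stepVelocity + 0.5f * (n * n + n) * stepGravity;
    }

    // Reference: what the physics engine actually does, one step at a time.
    public static float simulate(float p0, float v0, float g, float dt, int n) {
        float p = p0, v = v0;
        for (int i = 0; i < n; i++) {
            v += dt * g;
            p += dt * v;
        }
        return p;
    }
}
```

Because the closed form sums the same series the stepper produces, the two agree to floating-point precision for any step count.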
I am trying to take a picture and save it to a specified path. I have attached the script to a RawImage. I initially tried Bart's answer, but the image had a different rotation and was flipped, so I added some code to correct the rotation and flipping. Even though the camera view now looks correct, the video feed from the camera seems too wide and is not clear.
Attaching screenshot and code.
private WebCamTexture camTexture;
// public RawImage Img;

// Start is called before the first frame update
void Start()
{
    camTexture = new WebCamTexture();
    WebCamDevice[] devices = WebCamTexture.devices;
    if (devices.Length > 0)
    {
        camTexture.Play();

        // Code below to adjust rotation
        float rotationangle = (360 - camTexture.videoRotationAngle);
        Quaternion rotQuaternion = new Quaternion();
        rotQuaternion.eulerAngles = new Vector3(0, 0, rotationangle);
        this.transform.rotation = rotQuaternion;
    }
}

// Update is called once per frame
void Update()
{
    GetComponent<RawImage>().texture = camTexture;

    // Code to flip
    float scaleY = camTexture.videoVerticallyMirrored ? -1f : 1f;
    this.GetComponent<RawImage>().rectTransform.localScale = new Vector3(1f, scaleY, 1f);
}

public void PicTake()
{
    TakePhoto();
}
How can I correct this?
I had similar trouble when I was testing on Android, iOS, Mac and PC devices. Below is the script I used to solve the scaling and rotation problems.
It uses a Unity Quad as a background plane and fills the screen with it.
void CalculateBackgroundQuad()
{
    Camera cam = Camera.main;
    ScreenRatio = (float)Screen.width / (float)Screen.height;

    BackgroundQuad.transform.SetParent(cam.transform);
    BackgroundQuad.transform.localPosition = new Vector3(0f, 0f, cam.farClipPlane / 2f);

    float videoRotationAngle = webCamTexture.videoRotationAngle;
    BackgroundQuad.transform.localRotation = baseRotation * Quaternion.AngleAxis(webCamTexture.videoRotationAngle, Vector3.forward);

    float distance = cam.farClipPlane / 2f;
    float frustumHeight = 2.0f * distance * Mathf.Tan(cam.fieldOfView * 0.5f * Mathf.Deg2Rad);

    BackgroundQuad.transform.localPosition = new Vector3(0f, 0f, distance);

    Vector3 QuadScale = new Vector3(1f, frustumHeight, 1f);

    // Adjust the scaling for portrait mode & landscape mode
    if (videoRotationAngle == 0 || videoRotationAngle == 180)
    {
        // Landscape mode
        TextureRatio = (float)(webCamTexture.width) / (float)(webCamTexture.height);
        if (ScreenRatio > TextureRatio)
        {
            float SH = ScreenRatio / TextureRatio;
            float TW = TextureRatio * frustumHeight * SH;
            float TH = frustumHeight * (webCamTexture.videoVerticallyMirrored ? -1 : 1) * SH;
            QuadScale = new Vector3(TW, TH, 1f);
        }
        else
        {
            float TW = TextureRatio * frustumHeight;
            QuadScale = new Vector3(TW, frustumHeight * (webCamTexture.videoVerticallyMirrored ? -1 : 1), 1f);
        }
    }
    else
    {
        // Portrait mode
        TextureRatio = (float)(webCamTexture.height) / (float)(webCamTexture.width);
        if (ScreenRatio > TextureRatio)
        {
            float SH = ScreenRatio / TextureRatio;
            float TW = frustumHeight * -1f * SH;
            float TH = TW * (webCamTexture.videoVerticallyMirrored ? 1 : -1) * SH;
            QuadScale = new Vector3(TW, TH, 1f);
        }
        else
        {
            float TW = TextureRatio * frustumHeight;
            QuadScale = new Vector3(frustumHeight * -1f, TW * (webCamTexture.videoVerticallyMirrored ? 1 : -1), 1f);
        }
    }

    BackgroundQuad.transform.localScale = QuadScale;
}
The above script should work on all devices; it is just a simple math solution.
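The core of the scaling is the frustum-height formula used above: at distance d from a perspective camera with vertical field of view fov, the visible height is 2 * d * tan(fov / 2). A minimal, hedged check of that relationship in Java (the class and method names are mine, purely illustrative):

```java
// Visible frustum height at a given distance for a perspective camera
// with the given vertical field of view, in degrees.
class Frustum {
    public static double frustumHeight(double distance, double fovDegrees) {
        return 2.0 * distance * Math.tan(Math.toRadians(fovDegrees) * 0.5);
    }
}
```

For example, a 90-degree vertical field of view sees a height of exactly 2 units at a distance of 1 unit, since tan(45 degrees) = 1; the quad is then stretched horizontally by the screen and texture aspect ratios.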
A while back I found this great color picker from Piotr Adams, which I can no longer find on GitHub, but it is still on this page: https://www.programcreek.com/java-api-examples/index.php?source_dir=Random-Penis-master/app/src/main/java/com/osacky/penis/picker/ColorPicker.java
The main reason I use this color picker in my app is that I want to be able to place a pointer on the RadialGradient based on a color. This library calculates the position for a given color, which makes placing the picker at the correct location fast and easy.
The problem is I don't quite understand how it works. I now want to generate a RadialGradient with different colors, but the logic it uses breaks when I do.
Here is the code that generates the RadialGradient:
private Bitmap createColorWheelBitmap(int width, int height) {
    Bitmap bitmap = Bitmap.createBitmap(width, height, Config.ARGB_8888);

    int colorCount = 12;
    int colorAngleStep = 360 / 12;
    int colors[] = new int[colorCount + 1];
    float hsv[] = new float[]{0f, 1f, 1f};
    for (int i = 0; i < colors.length; i++) {
        hsv[0] = (i * colorAngleStep + 180) % 360;
        colors[i] = Color.HSVToColor(hsv);
    }
    colors[colorCount] = colors[0];

    SweepGradient sweepGradient = new SweepGradient(width / 2, height / 2, colors, null);
    RadialGradient radialGradient = new RadialGradient(width / 2, height / 2, colorWheelRadius, 0xFFFFFFFF, 0x00FFFFFF, TileMode.CLAMP);
    ComposeShader composeShader = new ComposeShader(sweepGradient, radialGradient, PorterDuff.Mode.SRC_OVER);

    colorWheelPaint.setShader(composeShader);

    Canvas canvas = new Canvas(bitmap);
    canvas.drawCircle(width / 2, height / 2, colorWheelRadius, colorWheelPaint);

    return bitmap;
}
The code that listens for changes of the picker, i.e. calculates the color based on a position:
@Override
public boolean onTouchEvent(MotionEvent event) {
    int action = event.getAction();
    switch (action) {
        case MotionEvent.ACTION_DOWN:
        case MotionEvent.ACTION_MOVE:
            int x = (int) event.getX();
            int y = (int) event.getY();
            int cx = x - getWidth() / 2;
            int cy = y - getHeight() / 2;
            double d = Math.sqrt(cx * cx + cy * cy);
            if (d <= colorWheelRadius) {
                colorHSV[0] = (float) (Math.toDegrees(Math.atan2(cy, cx)) + 180f);
                colorHSV[1] = Math.max(0f, Math.min(1f, (float) (d / colorWheelRadius)));
                selectedPointer.setColor(Color.HSVToColor(colorHSV));
                notifyListeners();
                invalidate();
            }
            return true;
        case MotionEvent.ACTION_BUTTON_PRESS:
    }
    return super.onTouchEvent(event);
}
Finally the code that calculates the position based on a color:
// drawing color wheel pointer
float hueAngle = (float) Math.toRadians(colorHSV[0]);
int colorPointX = (int) (-Math.cos(hueAngle) * colorHSV[1] * colorWheelRadius) + centerX;
int colorPointY = (int) (-Math.sin(hueAngle) * colorHSV[1] * colorWheelRadius) + centerY;
float pointerRadius = 0.075f * colorWheelRadius;
int pointerX = (int) (colorPointX - pointerRadius / 2);
int pointerY = (int) (colorPointY - pointerRadius / 2);
colorPointerCoords.set(pointerX, pointerY, pointerX + pointerRadius, pointerY + pointerRadius);
canvas.drawOval(colorPointerCoords, colorPointerPaint);
So my question is: how can I, for example, change the RadialGradient to only include 2 colors without breaking the position/color calculations? Even an explanation of how this works would be great!
There is a great tutorial here: http://tekeye.uk/android/examples/ui/android-color-picker-tutorial (not mine). I don't know much about the theory behind it either, but you can use this code to calculate the color based on a position.
// Calculate one channel from the 2 surrounding colors and the fractional position p.
private int ave(int s, int d, float p) {
    return s + java.lang.Math.round(p * (d - s));
}

// Calculate the color from the drawn colors and the angle (as a unit value)
// derived from the x and y position.
private int interpColor(int colors[], float unit) {
    if (unit <= 0) {
        return colors[0];
    }
    if (unit >= 1) {
        return colors[colors.length - 1];
    }

    // Scale the angle (unit) by how many colors there are in the list.
    float p = unit * (colors.length - 1);
    // Get the starting color position in the array.
    int i = (int) p;
    p -= i;
    // Now p is just the fractional part [0...1) and i is the index.

    // Get the two surrounding colors for the calculation.
    int c0 = colors[i];
    int c1 = colors[i + 1];

    // Calculate the color channels.
    int a = ave(Color.alpha(c0), Color.alpha(c1), p);
    int r = ave(Color.red(c0), Color.red(c1), p);
    int g = ave(Color.green(c0), Color.green(c1), p);
    int b = ave(Color.blue(c0), Color.blue(c1), p);

    // And finally create the color from the channels.
    return Color.argb(a, r, g, b);
}
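To see what interpColor does without the Android Color class, here is a hedged, self-contained Java sketch of the same per-channel linear interpolation over packed ARGB ints (the class and helper names are mine, not from the library):

```java
// Linear interpolation between packed ARGB colors, one channel at a time,
// mirroring the ave()/interpColor() logic without android.graphics.Color.
class ColorLerp {
    private static int ave(int s, int d, float p) {
        return s + Math.round(p * (d - s));
    }

    public static int interp(int[] colors, float unit) {
        if (unit <= 0) return colors[0];
        if (unit >= 1) return colors[colors.length - 1];

        float p = unit * (colors.length - 1);
        int i = (int) p;
        p -= i;  // fractional part in [0, 1)

        int c0 = colors[i], c1 = colors[i + 1];
        // Unpack each 8-bit channel, interpolate, and repack.
        int a = ave((c0 >>> 24) & 0xFF, (c1 >>> 24) & 0xFF, p);
        int r = ave((c0 >>> 16) & 0xFF, (c1 >>> 16) & 0xFF, p);
        int g = ave((c0 >>> 8) & 0xFF, (c1 >>> 8) & 0xFF, p);
        int b = ave(c0 & 0xFF, c1 & 0xFF, p);
        return (a << 24) | (r << 16) | (g << 8) | b;
    }
}
```

This also shows why the color array's length doesn't matter: `unit * (colors.length - 1)` just picks which pair of neighbors to blend, so the wheel works with 2 colors as well as 12.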
You can call the interpolating function like this, for example:
@Override
public boolean onTouchEvent(MotionEvent event) {
    float x = event.getX() - CENTER_X;
    float y = event.getY() - CENTER_Y;

    switch (event.getAction()) {
        case MotionEvent.ACTION_DOWN:
        case MotionEvent.ACTION_MOVE:
            // Calculate the angle from the x and y positions clicked.
            float angle = (float) java.lang.Math.atan2(y, x);
            // Need to turn the angle [-PI ... PI] into a unit [0...1].
            float unit = angle / (2 * PI);
            if (unit < 0) {
                unit += 1;
            }
            // mColors is your list of colors, an int[].
            int color = interpColor(mColors, unit);
            break;
    }
    return true; // added so the snippet compiles; return whatever fits your view
}
I already tried it in my project and it works like a charm, so I hope it helps you too. :)
EDIT:
Oh, and my colors are set up like this:
mColors = intArrayOf(-0x10000, -0xff01, -0xffff01, -0xff0001, -0xff0100, -0x100, -0x10000)
So you can add/remove as many colors as you want, and since the interpolating function calculates based on the size of this array, it should still work.
How can I draw a Path with a fading (opacity or thickness) line? Something like this.
I know there is a LinearGradient shader for Paint, but it won't bend along the Path.
One possible solution might be to get points along the Path and draw the segments myself, but I couldn't find any method for that either.
I came up with the following code. The most important part is PathMeasure's getPosTan() method.
if (getGesturePath() != null) {
    final short steps = 150;
    final byte stepDistance = 5;
    final byte maxTrailRadius = 15;

    pathMeasure.setPath(getGesturePath(), false);
    final float pathLength = pathMeasure.getLength();

    for (short i = 1; i <= steps; i++) {
        final float distance = pathLength - i * stepDistance;
        if (distance >= 0) {
            final float trailRadius = maxTrailRadius * (1 - (float) i / steps);
            pathMeasure.getPosTan(distance, pathPos, null);

            final float x = pathPos[0] + RandomUtils.nextFloat(0, 2 * trailRadius) - trailRadius;
            final float y = pathPos[1] + RandomUtils.nextFloat(0, 2 * trailRadius) - trailRadius;

            paint.setShader(new RadialGradient(
                    x,
                    y,
                    trailRadius > 0 ? trailRadius : Float.MIN_VALUE,
                    ColorUtils.setAlphaComponent(Color.GREEN, random.nextInt(0xff)),
                    Color.TRANSPARENT,
                    Shader.TileMode.CLAMP
            ));
            canvas.drawCircle(x, y, trailRadius, paint);
        }
    }
}
I have a tough problem (I think). I am preparing a game with a part that is a race, and I want to make it similar to the old-school "Lotus" games: the player sees the car from behind, and the road should have turns from time to time. My idea is to make a Mesh (the road) whose vertices are updated based on a Bezier curve. The game should check every vertex on the left edge of the mesh and work out how far it should be moved left or right, and the same for the right edge. Below is my code, based on:
http://www.andengine.org/forums/post47301.html?hilit=%20mesh#p47301
The Bezier-curve calculations are taken from the QuadraticBezierCurveMoveModifier included in AndEngine.
private Mesh createRoad() {
    // Create triangles:
    //
    // E--F
    // |\ |
    // B--D
    // |\ |
    // A--C
    final int triangleCount = 10;
    int i = 0;
    final float pInitialX = 0;
    float pInintialY = 0;
    int vertexCount = Mesh.VERTEX_SIZE * triangleCount * 4;
    final float pColor = new Color(0f, 0f, 0f).getABGRPackedFloat();
    final float pBufferData[] = new float[vertexCount];

    for (int triangleIndex = 0; triangleIndex < triangleCount; triangleIndex++) {
        pBufferData[(i * Mesh.VERTEX_SIZE) + Mesh.VERTEX_INDEX_X] = pInitialX;
        pBufferData[(i * Mesh.VERTEX_SIZE) + Mesh.VERTEX_INDEX_Y] = pInintialY;
        pBufferData[(i * Mesh.VERTEX_SIZE) + Mesh.COLOR_INDEX] = pColor;

        pBufferData[((i + 1) * Mesh.VERTEX_SIZE) + Mesh.VERTEX_INDEX_X] = pInitialX;
        pBufferData[((i + 1) * Mesh.VERTEX_SIZE) + Mesh.VERTEX_INDEX_Y] = pInintialY + 30;
        pBufferData[((i + 1) * Mesh.VERTEX_SIZE) + Mesh.COLOR_INDEX] = pColor;

        pBufferData[((i + 2) * Mesh.VERTEX_SIZE) + Mesh.VERTEX_INDEX_X] = pInitialX + 200;
        pBufferData[((i + 2) * Mesh.VERTEX_SIZE) + Mesh.VERTEX_INDEX_Y] = pInintialY;
        pBufferData[((i + 2) * Mesh.VERTEX_SIZE) + Mesh.COLOR_INDEX] = pColor;

        pBufferData[((i + 3) * Mesh.VERTEX_SIZE) + Mesh.VERTEX_INDEX_X] = pInitialX + 200;
        pBufferData[((i + 3) * Mesh.VERTEX_SIZE) + Mesh.VERTEX_INDEX_Y] = pInintialY + 30;
        pBufferData[((i + 3) * Mesh.VERTEX_SIZE) + Mesh.COLOR_INDEX] = pColor;

        i = i + 4;
        pInintialY = pInintialY + 30;
    }

    final VertexBufferObjectManager vbom = engine.getVertexBufferObjectManager();
    final HighPerformanceMeshVertexBufferObject pMeshVBOM = new HighPerformanceMeshVertexBufferObject(vbom, pBufferData, pBufferData.length, DrawType.STREAM, true, Mesh.VERTEXBUFFEROBJECTATTRIBUTES_DEFAULT);

    road = new Mesh(300, 50, vertexCount, DrawMode.TRIANGLE_STRIP, pMeshVBOM) {
        @Override
        protected void onManagedUpdate(final float pSecondsElapsed) {
            super.onManagedUpdate(pSecondsElapsed);
            drawByBezier();
            onUpdateVertices();
            this.mMeshVertexBufferObject.setDirtyOnHardware();
        }

        void drawByBezier() {
            float pBuff[] = pMeshVBOM.getBufferData();
            int i = 0;
            for (int triangleIndex = 0; triangleIndex < triangleCount; triangleIndex++) {
                pBuff[(i * Mesh.VERTEX_SIZE) + Mesh.VERTEX_INDEX_X] = getBezierX(i, 100, 0, 0);
                pBuff[((i + 1) * Mesh.VERTEX_SIZE) + Mesh.VERTEX_INDEX_X] = getBezierX(i + 1, 0, 0, 0);
                pBuff[((i + 2) * Mesh.VERTEX_SIZE) + Mesh.VERTEX_INDEX_X] = getBezierX(i + 2, 0, 0, 0) + 200;
                pBuff[((i + 3) * Mesh.VERTEX_SIZE) + Mesh.VERTEX_INDEX_X] = getBezierX(i + 3, 0, 0, 0) + 200;
                i = i + 4;
            }
        }

        float bezierX;

        float getBezierX(int triangleIndex, float P1X, float PcX, float P2X) {
            float t = triangleIndex / (triangleCount * 4);
            float u = 1 - t;
            float tt = t * t;
            float uu = u * u;
            float ut2 = 2 * u * t;
            return bezierX = (uu * P1X) + (ut2 * PcX) + (tt * P2X);
        }
    };

    return road;
}
Results:
When I put (i, 0, 0, 0) in all getBezierX() calls (for all vertices):
https://www.dropbox.com/s/upuyjti66rtm44i/Screenshot_2015-05-07-15-02-17.png?dl=0
When I put (i, 100, 0, 0) in the first getBezierX(), the result is:
https://www.dropbox.com/s/w3231zxrwytot71/Screenshot_2015-05-07-15-03-41.png?dl=0
As you can see, all the first vertices are moved to the right by the same offset (100). It does not matter what the other parameters are; the vertex is always moved by the value of the first parameter. Why is that? Am I missing anything here?
EDIT
To clarify: I want to make this black block curved like a turn on the road, so its left and right edges should be curved. I want to use a Bezier curve to calculate the x position of every vertex of every triangle of the mesh; the y positions should stay as they are. A mesh diagram is commented at the beginning of the code. So I would like to position points A, B, E along one curve, and C, D, F along another curve moved to the right.
OK, I found the problem, but I don't really understand why it was a problem. So in:
float getBezierX(int triangleIndex, float P1X, float PcX, float P2X) {
    float t = triangleIndex / (triangleCount * 4);
    float u = 1 - t;
    float tt = t * t;
    float uu = u * u;
    float ut2 = 2 * u * t;
    return bezierX = (uu * P1X) + (ut2 * PcX) + (tt * P2X);
}
I changed float getBezierX(int triangleIndex, float P1X, float PcX, float P2X) to float getBezierX(float triangleIndex, float P1X, float PcX, float P2X) and it works now. But why was the int a problem? (With an int parameter, triangleIndex / (triangleCount * 4) is integer division, which truncates to 0 for every index smaller than triangleCount * 4, so t was always 0 and only the uu * P1X term survived.)
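The culprit is integer division: with an int parameter, `triangleIndex / (triangleCount * 4)` divides two ints and truncates toward zero, so t is 0 for every index below triangleCount * 4. A minimal, self-contained sketch of the difference (the class and method names are mine, not from the original code):

```java
// Integer division truncates toward zero; making either operand a float
// (as the fix does via the parameter type) gives true division.
class IntDivDemo {
    public static float tAsInt(int i, int count) {
        return i / (count * 4);   // int / int: truncates, then widens to float
    }

    public static float tAsFloat(float i, int count) {
        return i / (count * 4);   // float / int: true floating-point division
    }
}
```

With count = 10, every index from 0 to 39 yields t = 0 in the int version, which is exactly why only the P1X term (weighted by uu = 1) ever affected the vertices.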
I have a problem calculating the x coordinate for plotting on the iPhone screen. When points of interest are within a range of 300 meters, they all appear close together even though they are actually spread out. I have even changed the width of the viewport from 0.5 to 0.17 radians (in degrees, from 28.647889757 to 10.0). Can anyone suggest how to place every point of interest properly with respect to its actual position?
The standard way (Mixare, ARToolkit) of calculating points in AR is as follows.
Calculation using ARKit:
double pointAzimuth = coordinate.coordinateAzimuth;

// Our x numbers are left based.
double leftAzimuth = self.currentCoordinate.coordinateAzimuth - VIEWPORT_WIDTH_RADIANS / 2.0;
if (leftAzimuth < 0.0) {
    leftAzimuth = 2 * M_PI + leftAzimuth;
}

if (pointAzimuth < leftAzimuth) {
    // It's past the 0 point.
    point.x = ((2 * M_PI - leftAzimuth + pointAzimuth) / VIEWPORT_WIDTH_RADIANS) * 480.0;
} else {
    point.x = ((pointAzimuth - leftAzimuth) / VIEWPORT_WIDTH_RADIANS) * 480.0;
}
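The wraparound logic in the ARKit snippet above fits in a few lines. Here is a hedged, plain-Java sketch of the same mapping (the class and parameter names are mine, and the screen width is a parameter where the original hardcodes 480):

```java
// Map a point's azimuth to a screen x coordinate, given the device heading
// and the viewport's angular width. All angles are in radians.
class AzimuthToScreen {
    public static double screenX(double pointAzimuth, double heading,
                                 double viewportWidth, double screenWidth) {
        double left = heading - viewportWidth / 2.0; // azimuth of the left screen edge
        if (left < 0) left += 2 * Math.PI;           // wrap below 0
        double delta = pointAzimuth - left;
        if (delta < 0) delta += 2 * Math.PI;         // point is past the 0 / 2*pi seam
        return delta / viewportWidth * screenWidth;  // linear map into [0, screenWidth]
    }
}
```

A point exactly at the device heading lands at the horizontal center of the screen, and points near the north seam stay continuous instead of jumping across the display, which matches the two branches of the Objective-C code.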
In Mixare:
CGPoint point;
CGRect viewBounds = self.overlayView.bounds;
//NSLog(@"pointForCoordinate: viewBounds.size.width = %.3f, height = %.3f", viewBounds.size.width, viewBounds.size.height );

double currentAzimuth = self.currentCoordinate.coordinateAzimuth;
double pointAzimuth = coordinate.coordinateAzimuth;
//NSLog(@"pointForCoordinate: location = %@, pointAzimuth = %.3f, pointInclination = %.3f, currentAzimuth = %.3f", coordinate.coordinateTitle, point.x, point.y, radiansToDegrees(pointAzimuth), radiansToDegrees(currentAzimuth), radiansToDegrees(pointInclination) );

double deltaAzimuth = [self deltaAzimuthForCoordinate:coordinate];
BOOL isBetweenNorth = [self isNorthForCoordinate:coordinate];
//NSLog(@"pointForCoordinate: (1) currentAzimuth = %.3f, pointAzimuth = %.3f, isNorth = %d", radiansToDegrees(currentAzimuth), radiansToDegrees(pointAzimuth), isBetweenNorth );
//NSLog(@"pointForCoordinate: deltaAzimuth = %.3f", radiansToDegrees(deltaAzimuth));
//NSLog(@"pointForCoordinate: (2) currentAzimuth = %.3f, pointAzimuth = %.3f, isNorth = %d", radiansToDegrees(currentAzimuth), radiansToDegrees(pointAzimuth), isBetweenNorth );

if ((pointAzimuth > currentAzimuth && !isBetweenNorth) ||
    (currentAzimuth > degreesToRadians(360 - self.viewRange) &&
     pointAzimuth < degreesToRadians(self.viewRange))) {
// Right side of Azimuth
point.x = (viewBounds.size.width / 2) + ((deltaAzimuth / degreesToRadians(1)) * 12);
} else {
// Left side of Azimuth
point.x = (viewBounds.size.width / 2) - ((deltaAzimuth / degreesToRadians(1)) * 12);
}