I'm developing an app with the GearVR framework in which I must show a sphere and a series of interactive points on it around the camera (a 360 photo with some points on its border). I can create the sphere with a 360 photo and put the camera inside, but I can't figure out how to place the points onto the sphere, because they are given to me relative to the photo (the photo is 4608 x 2304 and my point is at 280 x 1115 on the image). How can I translate the x,y coordinates to the sphere?
I tried many formulas but none seem to work. Here's my code, thanks in advance:
private int wSphere = 4608;
private int hSphere = 2304;
@Override
public void onInit(GVRContext gvrContext) {
GVRScene scene = gvrContext.getNextMainScene();
GVRSphereSceneObject sphereObject = null;
Future<GVRTexture> texture = gvrContext.loadFutureTexture(new GVRAndroidResource(gvrContext, R.raw.office));
sphereObject = new GVRSphereSceneObject(gvrContext, false, texture);
sphereObject.getTransform().setScale(50f, 50f, 50f);
Future<GVRTexture> futureTexture = gvrContext.loadFutureTexture(new GVRAndroidResource(gvrContext, R.raw.texture));
GVRMaterial material = new GVRMaterial(gvrContext);
material.setMainTexture(futureTexture);
float normX = (280f);
float normY = (1115f);
HashMap<String, Float> positions = xyToXYZ(normX, normY, 50);
GVRCubeSceneObject cubeObject = new GVRCubeSceneObject(gvrContext, true, material);
cubeObject.getTransform().setScale(5.5f, 5.5f, 5.5f);
cubeObject.getTransform().setPosition(positions.get("x"), positions.get("z"), positions.get("y"));
cubeObject.getRenderData().setMaterial(material);
scene.addSceneObject(sphereObject);
scene.addSceneObject(cubeObject);
}
public HashMap<String, Float> xyToXYZ(float x, float y, float r) {
HashMap<String, Float> map = new HashMap<String, Float>();
float theta = (float) (-2* Math.atan((Math.sqrt(Math.pow(x,2) + Math.pow(y,2)))/(2*r)) + 90);
float phi = (float) Math.atan((x/(-y)));
float sinTheta = (float) Math.sin(theta);
float cosTheta = (float) Math.cos(theta);
float sinPhi = (float) Math.sin(phi);
float cosPhi = (float) Math.cos(phi);
float nX = (float) (cosTheta * sinPhi);
float nY = (float) cosPhi * cosTheta;
float nZ = (float) sinTheta;
map.put("x", nX);
map.put("y", nY);
map.put("z", nZ);
return map;
}
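For reference, the usual way to map a pixel on an equirectangular 360 photo onto the sphere is to convert it to longitude/latitude first and then to Cartesian coordinates. Here is a minimal sketch under that assumption (the axis signs and the longitude offset depend on how the framework wraps the texture around the sphere, so they may need flipping):
// Sketch: standard equirectangular mapping, assuming the photo covers
// 360 degrees horizontally and 180 degrees vertically and the camera
// sits at the sphere's center. wSphere/hSphere are the fields above.
public HashMap<String, Float> equirectToXYZ(float px, float py, float r) {
    HashMap<String, Float> map = new HashMap<String, Float>();
    float lon = (float) ((px / wSphere - 0.5f) * 2.0 * Math.PI); // [-PI, PI]
    float lat = (float) ((0.5f - py / hSphere) * Math.PI);       // [-PI/2, PI/2]
    map.put("x", (float) (r * Math.cos(lat) * Math.sin(lon)));
    map.put("y", (float) (r * Math.sin(lat)));
    map.put("z", (float) (-r * Math.cos(lat) * Math.cos(lon)));
    return map;
}
The result would then be passed to setPosition in x, y, z order, with r slightly smaller than the sphere scale (e.g. 45 for a sphere scaled to 50) so the marker sits in front of the photo.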
Trying to take a picture and save it to a specified path. I have attached the script to a RawImage. Initially I tried Bart's answer, but the image had a different rotation and was flipped, so I added some code to adjust the rotation and the flipping. Even though the camera view now looks correct, the video feed coming from the camera looks too wide and not clear.
Attaching screenshot and code.
private WebCamTexture camTexture;
// public RawImage Img;
// Start is called before the first frame update
void Start()
{
camTexture = new WebCamTexture();
WebCamDevice[] devices = WebCamTexture.devices;
if (devices.Length > 0)
{
camTexture.Play();
//Code below to adjust rotation
float rotationangle = (360 - camTexture.videoRotationAngle);
Quaternion rotQuaternion = new Quaternion();
rotQuaternion.eulerAngles = new Vector3(0, 0, rotationangle);
this.transform.rotation = rotQuaternion;
}
}
// Update is called once per frame
void Update()
{
GetComponent<RawImage>().texture = camTexture;
//CODE TO FLIP
float scaleY = camTexture.videoVerticallyMirrored ? -1f : 1f;
this.GetComponent<RawImage>().rectTransform.localScale = new Vector3(1f, scaleY, 1f);
}
public void PicTake()
{
TakePhoto();
}
How can I correct this?
I had similar troubles when I was testing with Android, iOS, Mac and PC devices. Below is the script I used to solve the scaling & rotation problem.
It uses a Unity Quad as the background plane and fills the screen.
void CalculateBackgroundQuad()
{
Camera cam = Camera.main;
ScreenRatio = (float)Screen.width / (float)Screen.height;
BackgroundQuad.transform.SetParent(cam.transform);
BackgroundQuad.transform.localPosition = new Vector3(0f, 0f, cam.farClipPlane / 2f);
float videoRotationAngle = webCamTexture.videoRotationAngle;
BackgroundQuad.transform.localRotation = baseRotation * Quaternion.AngleAxis(webCamTexture.videoRotationAngle, Vector3.forward);
float distance = cam.farClipPlane / 2f;
float frustumHeight = 2.0f * distance * Mathf.Tan(cam.fieldOfView * 0.5f * Mathf.Deg2Rad);
BackgroundQuad.transform.localPosition = new Vector3(0f, 0f, distance);
Vector3 QuadScale = new Vector3(1f, frustumHeight, 1f);
//adjust the scaling for portrait Mode & Landscape Mode
if (videoRotationAngle == 0 || videoRotationAngle == 180)
{
//landscape mode
TextureRatio = (float)(webCamTexture.width) / (float)(webCamTexture.height);
if (ScreenRatio > TextureRatio)
{
float SH = ScreenRatio / TextureRatio;
float TW = TextureRatio * frustumHeight * SH;
float TH = frustumHeight * (webCamTexture.videoVerticallyMirrored ? -1 : 1) * SH;
QuadScale = new Vector3(TW, TH, 1f);
}
else
{
float TW = TextureRatio * frustumHeight;
QuadScale = new Vector3(TW, frustumHeight * (webCamTexture.videoVerticallyMirrored ? -1 : 1), 1f);
}
}
else
{
//portrait mode
TextureRatio = (float)(webCamTexture.height) / (float)(webCamTexture.width);
if (ScreenRatio > TextureRatio)
{
float SH = ScreenRatio / TextureRatio;
float TW = frustumHeight * -1f * SH;
float TH = TW * (webCamTexture.videoVerticallyMirrored ? 1 : -1) * SH;
QuadScale = new Vector3(TW, TH, 1f);
}
else
{
float TW = TextureRatio * frustumHeight;
QuadScale = new Vector3(frustumHeight * -1f, TW * (webCamTexture.videoVerticallyMirrored ? 1 : -1), 1f);
}
}
BackgroundQuad.transform.localScale = QuadScale;
}
The above script should work on all devices; it is just a simple math solution.
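For context, the scaling rests on one relation: the height of the view frustum at distance d from a camera with vertical field of view θ is

h = 2 * d * tan(θ / 2)

That is exactly the frustumHeight line above, with d = farClipPlane / 2. A quad of height h placed at that distance fills the screen vertically, and its width is h multiplied by the appropriate texture/screen ratio; the -1 factors only mirror the quad for flipped or rotated feeds.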
I have been searching the internet for a while now. The problem is that the solution to my problem is mostly available in either Python or C++. I have tried to replicate the code, but no luck.
I have detected a card (rectangle) and I am able to crop it if the rectangle is straight, but if the rectangle is rotated at an angle I get an image that cuts off the card.
Image showing what I want to achieve...
My working code for a straight image:
Bitmap abc = null;
Point topleft, topright, bottomleft, bottomright;
float xRatio = (float) original.getWidth() / sourceImageView.getWidth();
float yRatio = (float) original.getHeight() / sourceImageView.getHeight();
float x1 = (points.get(0).x) * xRatio;
float x2 = (points.get(1).x) * xRatio;
float x3 = (points.get(2).x) * xRatio;
float x4 = (points.get(3).x) * xRatio;
float y1 = (points.get(0).y) * yRatio;
float y2 = (points.get(1).y) * yRatio;
float y3 = (points.get(2).y) * yRatio;
float y4 = (points.get(3).y) * yRatio;
Point p1 = new Point(x1, y1);
Point p2 = new Point(x2, y2);
Point p3 = new Point(x3, y3);
Point p4 = new Point(x4, y4);
List<Point> newpoints = new ArrayList<Point>();
newpoints.add(p1);
newpoints.add(p2);
newpoints.add(p3);
newpoints.add(p4);
Collections.sort(newpoints, new Comparator<Point>() {
public int compare(Point o1, Point o2) {
return Double.compare(o1.x, o2.x);
}
});
if (newpoints.get(0).y > newpoints.get(1).y) {
bottomleft = newpoints.get(0);
topleft = newpoints.get(1);
} else {
bottomleft = newpoints.get(1);
topleft = newpoints.get(0);
}
if (newpoints.get(2).y > newpoints.get(3).y) {
bottomright = newpoints.get(2);
topright = newpoints.get(3);
} else {
bottomright = newpoints.get(3);
topright = newpoints.get(2);
}
final Mat newimage = new Mat();
Bitmap bmp32 = original.copy(Bitmap.Config.ARGB_8888, true);
org.opencv.android.Utils.bitmapToMat(bmp32, newimage);
final float dd = getAngle(bottomleft, bottomright);
Mat finalMat = new Mat(newimage, new org.opencv.core.Rect(topleft, bottomright));
abc = RotateBitmap(createBitmapfromMat(finalMat), (-dd));
Current code when rectangle is straight:
Current code when rectangle is rotated:
Links to similar questions:
Link 1 Link 2
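For what it's worth, the usual way to handle the rotated case in OpenCV's Java bindings is a four-point perspective warp instead of an axis-aligned Rect. A minimal sketch, reusing the sorted corners and newimage from the code above (dstWidth/dstHeight are hypothetical target dimensions, e.g. the card's side lengths):
// Needs org.opencv.core.MatOfPoint2f, org.opencv.core.Size and
// org.opencv.imgproc.Imgproc in addition to the imports above.
MatOfPoint2f srcCorners = new MatOfPoint2f(topleft, topright, bottomright, bottomleft);
MatOfPoint2f dstCorners = new MatOfPoint2f(
    new Point(0, 0),
    new Point(dstWidth - 1, 0),
    new Point(dstWidth - 1, dstHeight - 1),
    new Point(0, dstHeight - 1));
Mat transform = Imgproc.getPerspectiveTransform(srcCorners, dstCorners);
Mat warped = new Mat();
Imgproc.warpPerspective(newimage, warped, transform, new Size(dstWidth, dstHeight));
Bitmap cropped = createBitmapfromMat(warped); // helper used in the code above
This maps the tilted card onto an upright rectangle, so no separate rotation step is needed.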
I am developing an Android app that visualizes the map of an environment. Currently I am using libgdx to draw the map, and, like any map application, the user should be able to zoom, rotate and move the map.
I have developed a GestureHandler class which implements the GestureListener interface and interacts with a PerspectiveCamera (since I will use 3D components in the future):
@Override
public boolean pan(float x, float y, float deltaX, float deltaY) {
float tempX = (mapView.getCamera().position.x - deltaX * 0.5f);
float tempY = (mapView.getCamera().position.y + deltaY * 0.5f);
mapView.getCamera().position.set(
MathUtils.lerp(mapView.getCamera().position.x, tempX, mapView.getCamera().fieldOfView / 100),
MathUtils.lerp(mapView.getCamera().position.y, tempY, mapView.getCamera().fieldOfView / 100),
mapView.getCamera().position.z);
mapView.getCamera().update();
return false;
}
float initialDistance = 0;
float initialAngle = 0;
float distance = 0;
private void zoom(Vector2 initialPointer1, Vector2 initialPointer2, Vector2 pointer1, Vector2 pointer2)
{
initialDistance = initialPointer1.dst(initialPointer2);
float iDeltaX = initialPointer2.x - initialPointer1.x;
float iDeltaY = initialPointer2.y - initialPointer1.y;
initialAngle = (float)Math.atan2((double)iDeltaY,(double)iDeltaX) * MathUtils.radiansToDegrees;
if(initialAngle < 0)
initialAngle = 360 - (-initialAngle);
distance = initialPointer1.dst(pointer2);
float deltaX = pointer2.x - initialPointer1.x;
float deltaY = pointer2.y - initialPointer1.y;
newAngle = (float)Math.atan2((double)deltaY,(double)deltaX) * MathUtils.radiansToDegrees;
if(newAngle < 0)
newAngle = 360 - (-newAngle);
//Log.e("test", distance + " " + initialDistance);
//Log.e("test", newAngle + " " + initialAngle);
float ratio = initialDistance/distance;
mapView.getCamera().fieldOfView = MathUtils.clamp(initialZoomScale * ratio, 1f, 100.0f);
Log.e("zoom", String.valueOf(mapView.getCamera().fieldOfView));
mapView.getCamera().update();
}
@Override
public boolean pinch(Vector2 initialPointer1, Vector2 initialPointer2, Vector2 pointer1, Vector2 pointer2) {
zoom(initialPointer1, initialPointer2, pointer1, pointer2);
float delta1X = pointer2.x - pointer1.x;
float delta1Y = pointer2.y - pointer1.y;
newAngle = (float)Math.atan2((double)delta1Y,(double)delta1X) * MathUtils.radiansToDegrees;
if(newAngle < 0)
newAngle = 360 - (-newAngle);
System.out.println("new "+newAngle);
if(newAngle - currentAngle >= 0.01000f)
{
System.out.println("Increasing");
mapView.getCamera().rotate(0.5f,0,0,1);
}
else if(newAngle - currentAngle <= -0.010000f) {
System.out.println("DEcreasing");
mapView.getCamera().rotate(-0.5f,0,0,1);
}
if(Math.abs(newAngle - currentAngle) >= 0.01000f)
{
currentAngle = newAngle;
}
return true;
}
Everything is fine as long as I don't rotate the camera; just like this unsolved similar question, after rotating the camera the movements are affected by the applied rotation. Any help, especially sample code?
Edit:
After lots of effort I finally solved it.
As Tenfour04 said in his answer, I had to use two separate matrices for translation and rotation, and finally set the result of their multiplication as the view matrix of the camera using:
camera.view.set(position).mul(orientation);
Also, the most important thing is to set the transform matrix of my batch to camera.view:
batch.setTransformMatrix(camera.view);
Instead of applying the gestures directly to the camera, apply them to a pair of Matrix4's that you use to store the orientation and position separately. Then in the render method, multiply the two matrices and apply them to your camera's view.
In the render() method:
camera.view.set(orientation).mul(position); //Might need to swap orientation/position--don't remember.
camera.update();
Your zoom method is fine because field of view affects the camera's projection matrix rather than its view matrix.
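To make pan respect the current rotation, the screen-space delta can be rotated back through the orientation matrix before it is applied to the position matrix. A minimal sketch, assuming the Matrix4 pair described above (whether you need the inverse or the matrix itself depends on which way you build the view matrix):
// Fields holding the camera pose, as suggested above
// (com.badlogic.gdx.math.Matrix4 / Vector3).
Matrix4 position = new Matrix4();
Matrix4 orientation = new Matrix4();

@Override
public boolean pan(float x, float y, float deltaX, float deltaY) {
    // Rotate the screen-space delta into the map's frame so panning
    // follows the finger no matter how the camera is rotated.
    Vector3 delta = new Vector3(-deltaX * 0.5f, deltaY * 0.5f, 0f);
    delta.rot(new Matrix4(orientation).inv());
    position.translate(delta);
    return true;
}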
From the image you can see that the ball fired on the left (with the fire behind it) does not match the calculated trajectory. I'm drawing the ball's trajectory using an equation from an SO question, modified to take into consideration the Box2D steps of 30 frames per second. This does calculate a valid trajectory, but it does not match the actual trajectory of the ball; the ball's trajectory is smaller. I am applying a Box2D force to the ball, which also has a density and a shape set; the shape radius varies depending on the type of ball. I'm setting the start velocity in the touchDown event.
public class ProjectileEquation {
public float gravity;
public Vector2 startVelocity = new Vector2();
public Vector2 startPoint = new Vector2();
public Vector2 gravityVec = new Vector2(0,-10f);
public float getX(float n) {
return startVelocity.x * (n * 1/30f) + startPoint.x;
}
public float getY(float n) {
float t = 1/30f * n;
return 0.5f * gravity * t * t + startVelocity.y * t + startPoint.y;
}
}
@Override
public void draw(SpriteBatch batch, float parentAlpha) {
float t = 0f;
float width = this.getWidth();
float height = this.getHeight();
float timeSeparation = this.timeSeparation;
for (int i = 0; i < trajectoryPointCount; i+=timeSeparation) {
//projectileEquation.getTrajectoryPoint(this.getX(), this.getY(), i);
float x = this.getX() + projectileEquation.getX(i);
float y = this.getY() + projectileEquation.getY(i);
batch.setColor(this.getColor());
if(trajectorySprite != null) batch.draw(trajectorySprite, x, y, width, height);
// t += timeSeparation;
}
}
public boolean touchDown (InputEvent event, float x, float y, int pointer, int button) {
if(button==1 || world.showingDialog)return false;
touchPos.set(x, y);
float angle = touchPos.sub(playerCannon.position).angle();
if(angle > 270 ) {
angle = 0;
}
else if(angle >70) {
angle = 70;
}
playerCannon.setAngle(angle);
world.trajPath.controller.angle = angle;
float radians = (float) angle * MathUtils.degreesToRadians;
float ballSpeed = touchPos.sub(playerCannon.position).len()*12;
world.trajPath.projectileEquation.startVelocity.x = (float) (Math.cos(radians) * ballSpeed);
world.trajPath.projectileEquation.startVelocity.y = (float) (Math.sin(radians) * ballSpeed);
return true;
}
public CannonBall(float x, float y, float width, float height, float damage, World world, Cannon cannonOwner) {
super(x, y, width, height, damage, world);
active = false;
shape = new CircleShape();
shape.setRadius(width/2);
FixtureDef fd = new FixtureDef();
fd.shape = shape;
fd.density = 4.5f;
if(cannonOwner.isEnemy) { //Enemy cannon balls cannot hit other enemy cannons just the player
fd.filter.groupIndex = -16;
}
bodyDef.type = BodyType.DynamicBody;
bodyDef.position.set(this.position);
body = world.createBody(bodyDef);
body.createFixture(fd);
body.setUserData(this);
body.setBullet(true);
this.cannonOwner = cannonOwner;
this.hitByBall = null;
this.particleEffect = null;
}
private CannonBall createCannonBall(float radians, float ballSpeed, float radius, float damage)
{
CannonBall cannonBall = new CannonBall(CannonEnd().x, CannonEnd().y, radius * ballSizeMultiplier, radius * ballSizeMultiplier, damage, this.world, this);
cannonBall.velocity.x = (float) (Math.cos(radians) * ballSpeed);
//cannonBall.velocity.x = (float) ((Math.sqrt(10) * Math.sqrt(29) *
// Math.sqrt((Math.tan(cannon.angle)*Math.tan(cannon.angle))+1)) / Math.sqrt(2 * Math.tan(cannon.angle) - (2 * 10 * 2)/29))* -1f;
cannonBall.velocity.y = (float) (Math.sin(radians) * ballSpeed);
cannonBall.active = true;
//cannonBall.body.applyLinearImpulse(cannonBall.velocity, cannonBall.position);
cannonBall.body.applyForce(cannonBall.velocity, cannonBall.position );
return cannonBall;
}
trajPath = new TrajectoryActor(-10f);
trajPath.setX(playerCannon.CannonEnd().x);
trajPath.setY(playerCannon.CannonEnd().y);
trajPath.setWidth(10f);
trajPath.setHeight(10f);
stage.addActor(trajPath);
Here is code that I used for one of my other games, which proved to be very precise. The trick is to apply the impulse to the body and read the initial velocity. Having that, I calculate 10 positions the body will reach within 0.5 seconds. The language is called Squirrel, which is Lua-based with C/C++-like syntax, so you should be able to grasp what is going on. getTrajectoryPointsForObjectAtImpulse returns an array of 10 positions through which the ball will pass within 0.5 seconds.
const TIMESTER_DIVIDOR = 60.0;
function getTrajectoryPoint( startingPosition, startingVelocity, n )
{
local gravity = box2DWorld.GetGravity();
local t = 1 / 60.0;
local stepVelocity = b2Vec2.Create( t * startingVelocity.x, t * startingVelocity.y );
local stepGravity = b2Vec2.Create( t * t * gravity.x, t * t * gravity.y );
local result = b2Vec2.Create( 0, 0 );
result.x = ( startingPosition.x + n * stepVelocity.x + 0.5 * ( n * n + n ) * stepGravity.x ) * MTP;
result.y = ( startingPosition.y + n * stepVelocity.y + 0.5 * ( n * n + n ) * stepGravity.y ) * -MTP;
return result;
}
function getTrajectoryPointsForObjectAtImpulse (object, impulse)
{
if( !object || !impulse ) return [];
local result = [];
object.bBody.ApplyLinearImpulse( impulse, object.bBody.GetWorldCenter() );
local initialVelocity = object.bBody.GetLinearVelocity();
object.bBody.SetLinearVelocity( b2Vec2.Create(0, 0) );
object.bBody.SetActive(false);
for ( local i = 0.0 ; i < ( 0.5 * TIMESTER_DIVIDOR ) ; )
{
result.append( getTrajectoryPoint(object.bBody.GetPosition(), initialVelocity, i.tointeger() ) );
i += ( (0.5 * TIMESTER_DIVIDOR) * 0.1 );
}
return result;
}
If you do not understand any part of the code, please let me know and I will try to explain.
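For a Java/libgdx port, the per-step formula translates almost one to one. A sketch, assuming the Box2D world is stepped at 1/60 s (the meters-to-pixels factor MTP is left out):
// n is the time-step index; world is the com.badlogic.gdx.physics.box2d.World.
Vector2 getTrajectoryPoint(Vector2 startPosition, Vector2 startVelocity, int n) {
    float t = 1 / 60f;
    Vector2 stepVelocity = new Vector2(startVelocity).scl(t);         // movement per step
    Vector2 stepGravity = new Vector2(world.getGravity()).scl(t * t); // gravity per step
    return new Vector2(
        startPosition.x + n * stepVelocity.x + 0.5f * (n * n + n) * stepGravity.x,
        startPosition.y + n * stepVelocity.y + 0.5f * (n * n + n) * stepGravity.y);
}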
Android uses the following code to calculate the rotation matrix:
float Ax = gravity[0];
float Ay = gravity[1];
float Az = gravity[2];
final float Ex = geomagnetic[0];
final float Ey = geomagnetic[1];
final float Ez = geomagnetic[2];
float Hx = Ey*Az - Ez*Ay;
float Hy = Ez*Ax - Ex*Az;
float Hz = Ex*Ay - Ey*Ax;
final float normH = (float)Math.sqrt(Hx*Hx + Hy*Hy + Hz*Hz);
if (normH < 0.1f) {
// device is close to free fall (or in space?), or close to
// magnetic north pole. Typical values are > 100.
return false;
}
final float invH = 1.0f / normH;
Hx *= invH;
Hy *= invH;
Hz *= invH;
final float invA = 1.0f / (float)Math.sqrt(Ax*Ax + Ay*Ay + Az*Az);
Ax *= invA;
Ay *= invA;
Az *= invA;
final float Mx = Ay*Hz - Az*Hy;
final float My = Az*Hx - Ax*Hz;
final float Mz = Ax*Hy - Ay*Hx;
if (R != null) {
if (R.length == 9) {
R[0] = Hx; R[1] = Hy; R[2] = Hz;
R[3] = Mx; R[4] = My; R[5] = Mz;
R[6] = Ax; R[7] = Ay; R[8] = Az;
} else if (R.length == 16) {
R[0] = Hx; R[1] = Hy; R[2] = Hz; R[3] = 0;
R[4] = Mx; R[5] = My; R[6] = Mz; R[7] = 0;
R[8] = Ax; R[9] = Ay; R[10] = Az; R[11] = 0;
R[12] = 0; R[13] = 0; R[14] = 0; R[15] = 1;
}
}
I would like to know the logic behind this. Also, what order of rotation is used? I want to correct the rotation using the rotation matrix, so the order of calculation used by Android is important.
Android assumes the gravity parameter is a vector lying on the world's sky axis. That is, if (w_1, w_2, w_3) is the world basis, where w_1 is a unit vector pointing east, w_2 is a unit vector pointing north and w_3 is a unit vector pointing toward the sky, then the gravity parameter is a multiple of w_3, and normalizing it yields w_3.
The code also assumes the geomagnetic field parameter is a vector lying in the plane spanned by w_2 and w_3. Thus the cross product of the normalized geomagnetic field parameter and the normalized gravity parameter is a unit vector orthogonal to that plane, which is just w_1.
Now the cross product of w_3 and w_1 is w_2. Thus you get the change of basis from device coordinates to world coordinates.
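In symbols, with A the gravity vector and E the geomagnetic vector, the code above computes

H = (E x A) / |E x A|   (east, w_1)
A' = A / |A|            (sky, w_3)
M = A' x H              (north, w_2)

and stores H, M, A' as the rows of R, so R takes a vector given in device coordinates to world (east-north-up) coordinates.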
I do not understand what you mean by "the order of rotation used", so I cannot answer that question.