Unity2D Android Touch misbehaving - android

I am attempting to translate an object based on the user's touch position.
The problem is that when I test it out, the object disappears as soon as I drag my finger across my phone screen. I am not entirely sure what's going on.
If somebody can guide me that would be great :)
Thanks
This is the Code:
#pragma strict

function Update () {
    for (var touch : Touch in Input.touches)
    {
        if (touch.phase == TouchPhase.Moved) {
            transform.Translate(0, touch.position.y, 0);
        }
    }
}

The problem is that you're moving the object by touch.position.y. That isn't a point in world space; it's a point on the touch screen, in pixels. What you'll probably want instead is Camera.main.ScreenToWorldPoint(touch.position).y, which will give you the world-space position of wherever you've touched.
Of course, Translate takes a vector indicating distance, not final destination, so simply sticking the above in it still won't work as you're intending.
Instead maybe try this:
Vector3 EndPos = Camera.main.ScreenToWorldPoint(touch.position);
float speed = 1f;
transform.position = Vector3.Lerp(transform.position, EndPos, speed * Time.deltaTime);
which should move the object towards your finger while at the same time keeping its movements smooth looking.
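Put together, a rough C# sketch of that idea (the class name, the 10-unit depth passed to ScreenToWorldPoint, and the default speed are my assumptions, not from the answer) could look like this:
using UnityEngine;

public class FollowTouch : MonoBehaviour
{
    // How quickly the object closes the gap to the touch point.
    public float speed = 1f;

    void Update()
    {
        foreach (Touch touch in Input.touches)
        {
            if (touch.phase == TouchPhase.Moved)
            {
                // Convert the screen-space touch (pixels) into a world-space point.
                // The z value is the distance from the camera; pick one that fits your scene.
                Vector3 endPos = Camera.main.ScreenToWorldPoint(
                    new Vector3(touch.position.x, touch.position.y, 10f));
                endPos.z = transform.position.z; // keep the object's own depth in a 2D scene

                // Ease towards the touch point instead of teleporting.
                transform.position = Vector3.Lerp(transform.position, endPos, speed * Time.deltaTime);
            }
        }
    }
}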

You'll want to ask this question at Unity's dedicated Questions/Answers site: http://answers.unity3d.com/index.html
There are very few people who come to Stack Overflow for Unity-specific questions, unless they relate to Android/iOS-specific features.
As for the cause of your problem, touch.position.y is defined in screen space (pixels), whereas transform.Translate expects world units (meters). You can convert between the two using the Camera.ScreenToWorldPoint() method, then create a vector out of the camera position and the screen point's world position. With that vector you can either intersect some geometry in the scene or simply use it as a point in front of the camera.
http://docs.unity3d.com/Documentation/ScriptReference/Camera.ScreenToWorldPoint.html
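As a rough illustration of that last idea (a sketch of my own, here using Camera.ScreenPointToRay, which wraps up the same camera-to-screen-point vector; the 10-unit fallback distance is an arbitrary assumption):
using UnityEngine;

public class TouchRayExample : MonoBehaviour
{
    void Update()
    {
        foreach (Touch touch in Input.touches)
        {
            if (touch.phase != TouchPhase.Moved)
                continue;

            // A ray from the camera through the touched pixel.
            Ray ray = Camera.main.ScreenPointToRay(touch.position);

            // Option A: intersect geometry in the scene.
            if (Physics.Raycast(ray, out RaycastHit hit))
            {
                transform.position = hit.point;
            }
            else
            {
                // Option B: just take a point a fixed distance in front of the camera.
                transform.position = ray.GetPoint(10f);
            }
        }
    }
}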

Related

Is it possible to retrieve the physical (device) camera height above the physical floor?

I'm developing an app using ARCore. In this app I need:
1) to place an object that always stays at the same pose in world space. Following the recommendations of the "Working with Anchors" article (https://developers.google.com/ar/develop/developer-guides/anchors), I'm attaching an anchor to the ARCore Session. That is, I'm not using Trackables at all.
2) as a secondary requirement, the object must be placed automatically, that is, without tapping on the screen.
I've managed to meet both requirements, and the object now "floats" in front of me, using this (very common) code:
private void onSceneUpdate(FrameTime frameTime) {
    ...
    if (_renderable != null && _anchorNode == null) {
        float[] position = {0f, 0f, -10f};
        float[] rotation = {0, 0, 0, 1};
        //
        Anchor anchor = _arFragment.getArSceneView().getSession().createAnchor(new Pose(position, rotation));
        //
        _anchorNode = new AnchorNode(anchor);
        _anchorNode.setRenderable(_renderable);
        _anchorNode.setParent(_arFragment.getArSceneView().getScene());
        _anchorNode.setLocalScale(new Vector3(0.01f, 0.01f, 0.01f)); // cm -> m
        ...
    }
}
As I want the object to be on the floor, I need to find out the height of my physical (device) camera above the floor, in order to subtract that value from the object's current Y coordinate:
float[] position = {0f,HERE_THE_VALUE_TO_SUBTRACT_FROM_CAMERA_HEIGHT,-10f};
Certainly, it would be an easy implementation if plane Trackables were used, but here I have the requirements named above.
I've tried different camera/device Pose retrieval APIs, namely frame.getAndroidSensorPose(), frame.getCamera().getPose() and frame.getCamera().getDisplayOrientedPose(), but none of them shows valid values.
Thanks for your advice.
EDIT after Michael Dougan's comments.
Well, I think we then have two ways to achieve the requirements:
1) leave the code without changes, keep using the Session Anchor, and ask the user to launch the app and then follow a "calibration process" with the device on the floor. As this is a professional-use app, and not a consumer one, we think that is perfectly suitable;
2) go back to the good old Trackables, using the floor plane as the usual anchor and including the pose of that anchor in the calculation of the 3D model's position.

Changing GameObject Position with Unity and ARToolkit

I'm building an Augmented Reality app with Unity and ARToolkit for Android. I have multiple GameObjects on screen that are children of my marker. Works well. I then created a very simple script to move one of the objects and I attached the script to the game object. It looks like:
void Update()
{
    Vector3 currentPos = transform.position;
    transform.position = new Vector3(currentPos.x + (.01f * xDirection * xSpeed),
                                     currentPos.y + (.01f * yDirection * ySpeed),
                                     currentPos.z);
}
The rest of the script does nothing other than alter the values of the direction and speed variables. It works and moves in the directions I expect, but the object visually shrinks in size. Possibly it's just lower on the z-axis so it appears smaller, or possibly the scaling is being affected. I think it may be related to moving the phone up and down while looking at the marker.
I suppose I have to move GameObjects in a different manner than normal when using ARToolkit. What's the proper way?
Thanks
I have no connection with ARToolkit, but try checking out their Coordinate System documentation.
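One thing worth checking (my own hedged guess, not part of the original answer): since the object is a child of the marker, writing transform.position moves it in world space while the tracker is updating the marker's pose every frame. Moving it in the marker's local space instead avoids fighting that update. A minimal C# sketch of the idea, with hypothetical speed/direction fields mirroring the ones the question mentions:
using UnityEngine;

public class MoveInMarkerSpace : MonoBehaviour
{
    // Hypothetical fields, standing in for the ones the question's script alters.
    public float xDirection = 1f;
    public float yDirection = 1f;
    public float xSpeed = 1f;
    public float ySpeed = 1f;

    void Update()
    {
        // localPosition is relative to the parent (the marker), so the motion
        // stays consistent however the marker itself moves in world space.
        Vector3 local = transform.localPosition;
        local.x += 0.01f * xDirection * xSpeed;
        local.y += 0.01f * yDirection * ySpeed;
        transform.localPosition = local;
    }
}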

Panning the view of a gameObject instead of the camera in Unity3d?

I'm having a hard time panning the view of a gameObject in Unity3d. I'm new to scripting and I'm trying to develop an AR (Augmented Reality) application for Android.
I need a gameObject (e.g. a model of a floor) to be rendered not in the normal top-down view but in a "pseudo" iso view, inclined at 45 degrees. As the gameObject is inclined, I need a panning function for its view, using four (4) buttons (left, right, forward (or up), backward (or down)).
The problem is that I cannot use any of the known panning script snippets from around the forum, as the AR camera has to stay static in the scene.
I need to mention that the panning function should be active only in the isometric view (which I already compute with another script), not in the top-down view. So there should be no problem with the inclination of the gameObject's axes, right?
Below are two mockup images of the states the gameObject (model floor) is rendered in, and the script code (from the Unity reference) that I'm currently using, which is not very functional for my needs.
Here is the code snippet for left movement of the gameObject. I use the same code with the -/+ speed values changed for the other movements, but I only get it to move up and down, not forward and backward:
#pragma strict

// The target gameObject.
var target : Transform;
// Speed in units per sec.
var speedLeft : float = -10;

private static var isPanLeft = false;

function FixedUpdate()
{
    if (isPanLeft == true)
    {
        // The step size is equal to speed times frame time.
        var step = speedLeft * Time.deltaTime;
        // Move model position a step closer to the target.
        transform.position = Vector3.MoveTowards(transform.position, target.position, step);
    }
}

static function doPanLeft()
{
    isPanLeft = !isPanLeft;
}
It would be great if someone would be kind enough to take a look at this post and suggest the easiest way to code this functionality, as I'm a newbie.
Furthermore, if sample code or a tutorial can be provided, it would be appreciated, as I can learn a lot from it. Thank you all in advance for your time and answers.
If I understand correctly, you have a camera with a fixed rotation and position, and an object you want to move up/down/left/right from the camera's perspective.
To rotate an object to a set of angles, you simply do:
transform.rotation = Quaternion.Euler(45, 45, 45);
Then, to move it, you use the camera's up/right/forward directions in world space. For example, to move it up and to the left:
transform.position += camera.transform.up;
transform.position -= camera.transform.right;
If you only have one camera in your scene, you can access its transform via Camera.main.transform.
An example of how to move it when someone presses the left arrow:
if (Input.GetKeyDown(KeyCode.LeftArrow))
{
    transform.position -= camera.transform.right;
}
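Putting the pieces together for the four pan buttons, a rough C# sketch (the class name, method names and step size are my assumptions, not from the answer) might look like this:
using UnityEngine;

// Camera-relative panning of the object, so the AR camera itself stays static.
public class PanObject : MonoBehaviour
{
    public float step = 0.1f;   // world units moved per button press (assumed value)

    // Hook these up to the four UI buttons.
    public void PanLeft()     { transform.position -= Camera.main.transform.right * step; }
    public void PanRight()    { transform.position += Camera.main.transform.right * step; }
    public void PanForward()  { transform.position += Camera.main.transform.up * step; }
    public void PanBackward() { transform.position -= Camera.main.transform.up * step; }
}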

Unity2D - How to rotate a 2D object on touch/click/press

First of all, I'll let you know that I'm new to Unity and to coding overall (I do know some very basic JavaScript). My question: how can I rotate a 2D object (prefab) 120 degrees around a certain axis (in my case the z-axis, so it rotates like a steering wheel you're looking at) every time I touch the screen? Right now I have it like this:
function TouchOnScreen ()
{
    if (Input.touchCount > 0)
    {
        var touch = Input.touches[0];
        if (touch.position.x < Screen.width/2)
        {
            transform.rotation = Quaternion.Euler(0,0,120);
            Debug.Log("RotateRight");
        }
        else if (touch.position.x > Screen.width/2)
        {
            transform.rotation = Quaternion.Euler(0,0,-120);
            Debug.Log("RotateLeft");
        }
    }
}
This code rotates the object whenever I press on a certain side of the screen, but not the way I want it to. I want to see the object rotating from A to B, not jumping (like it does now) from A to B in a single frame. Also, this code only lets me rotate once in each direction.
How can I make it so that whenever I press on a certain side of the screen, it adds to or subtracts from the previously rotated angle, so I can keep on rotating?
NOTE: Please use javascript, and if you know a simpler code, let me know!
Help is highly appreciated, thanks in advance!
Instead of
transform.rotation = Quaternion.Euler(0,0,-120);
You use:
var lSpeed = 10.0f; // Set this to a value you like
transform.rotation = Quaternion.Lerp(transform.rotation, Quaternion.Euler(0,0,-120), Time.deltaTime * lSpeed);
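To also address the "keep on rotating" part, one option (a C# sketch of my own, not from the answer; the asker wanted JavaScript, but the idea translates directly) is to accumulate a target angle on each new touch and ease towards it every frame:
using UnityEngine;

public class StepRotator : MonoBehaviour
{
    public float lerpSpeed = 10f;     // easing speed, assumed value
    private float targetAngle = 0f;   // accumulated z rotation in degrees

    void Update()
    {
        if (Input.touchCount > 0)
        {
            Touch touch = Input.GetTouch(0);
            // Only react once per touch, when the finger first lands.
            if (touch.phase == TouchPhase.Began)
            {
                if (touch.position.x < Screen.width / 2)
                    targetAngle += 120f;   // left half of the screen
                else
                    targetAngle -= 120f;   // right half of the screen
            }
        }

        // Ease the current rotation towards the accumulated target every frame.
        transform.rotation = Quaternion.Lerp(
            transform.rotation,
            Quaternion.Euler(0f, 0f, targetAngle),
            Time.deltaTime * lerpSpeed);
    }
}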

Moving object to touch position acting unexpectedly

Good evening, quick question.
I'm developing a top-down 2D platformer game in Unity3D. Here is a picture of the game.
I have pretty much everything worked out on desktop, but when attempting to set up the controls for mobile, I can't seem to get it to work the way it should. All I need is for the player to move in the direction of wherever the user touches the screen. With the current code I'm using, the player just rotates in 4 directions: up, down, left and right. He also moves a little, but never goes far from his spawn point.
Please take a look at my revised code:
public Camera camera;
public float movespeed = 0;

// Use this for initialization
void Start () {
    movespeed = 2.75F;
}

// Update is called once per frame
void Update () {
    if (Input.touchCount > 0) {
        // The screen has been touched so store the touch
        Touch touch = Input.GetTouch(0);
        if (touch.phase == TouchPhase.Stationary || touch.phase == TouchPhase.Moved) {
            // If the finger is on the screen, move the object smoothly to the touch position
            Vector3 touchPosition = camera.ScreenToWorldPoint(new Vector3(touch.position.x, touch.position.y, -13));
            Quaternion rot = Quaternion.LookRotation(transform.position - touchPosition, Vector3.back);
            transform.rotation = rot;
            transform.eulerAngles = new Vector3 (0, 0, transform.eulerAngles.z);
            rigidbody2D.angularVelocity = 0;
            //float input = Input.GetAxis ("Vertical");
            transform.position = Vector3.Lerp(transform.position, touchPosition, Time.deltaTime);
        }
    }
}
}
Any ideas on how I can get my player to move to the touched area of the screen? Any help would be much appreciated. Thanks in advance.
If I have understood correctly, you want your player game object to move towards the point on the screen that is being touched. I think it's probably best to describe the behavior of your code so that you can hopefully better understand what might be happening.
From the code posted, I can see one possible issue. Look again at this line:
Quaternion rot = Quaternion.LookRotation(transform.position - touchPosition, Vector3.back);
Here, you are asking Unity to calculate the unit quaternion that represents a rotation from the direction of Vector3.forward to the direction pointing from the touch position to the player game object. This probably isn't what you want. From the description of the problem, you want the game object to rotate to face the point on the screen being touched (rather than face the opposite direction). You can either change the order of the subtraction operands or, preferably, use the Transform.LookAt method instead.
After this, you update the transform's rotation:
transform.rotation = rot;
That's fine, but note that you wouldn't need to do this when using Transform.LookAt.
You then set the transform's rotation again in this line:
transform.eulerAngles = new Vector3 (0, 0, transform.eulerAngles.z);
I'm not entirely sure why you are doing this. If you only want one axis of rotation, you can use, for example:
transform.LookAt(new Vector3(touchPosition.x, touchPosition.y, transform.position.z))
This should rotate the player's transform around the z-axis to look in the direction of the point being touched.
Finally, you linearly interpolate the transform's position from its current position to the point being touched:
transform.position = Vector3.Lerp(transform.position, touchPosition, Time.deltaTime);
This isn't necessary. Instead, you should just move the player's transform forward. The player should be looking in the direction of the touched screen point. Hence, translating the player forward will move the player towards said screen point:
transform.position += transform.forward * speed * Time.deltaTime;
When the player is very close to the touched screen point it will overshoot and immediately rotate to look in the opposite direction. This will occur repeatedly. You should include some distance that specifies when the player is assumed to have reached the target point.
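Pulling those suggestions together, a rough C# sketch of the whole Update (it reuses the camera and movespeed fields from the question's script; the stopDistance threshold is an assumed value) might look like:
void Update()
{
    if (Input.touchCount == 0)
        return;

    Touch touch = Input.GetTouch(0);
    if (touch.phase != TouchPhase.Stationary && touch.phase != TouchPhase.Moved)
        return;

    // World-space point under the finger (same camera depth as in the question).
    Vector3 touchPosition = camera.ScreenToWorldPoint(
        new Vector3(touch.position.x, touch.position.y, -13));

    // Face the touch point, rotating only around the z-axis.
    transform.LookAt(new Vector3(touchPosition.x, touchPosition.y, transform.position.z));

    // Stop when close enough, so the player doesn't overshoot and flip around.
    float stopDistance = 0.1f;   // assumed threshold
    if (Vector3.Distance(transform.position, touchPosition) > stopDistance)
    {
        // Move forward, i.e. towards the point being looked at.
        transform.position += transform.forward * movespeed * Time.deltaTime;
    }
}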
