I'm using Unity 3D to develop a 3D defense game.
I made a monster that can draw aggro from intruders on the map, keeping them occupied while they are vulnerable to other attacks. I want the monster to become movable once it has been touched.
Process: touch the monster → touch the destination → the monster moves to the destination
I want my monster to be able to move across the entire map, but to draw the intruders' aggro only when they're on the road. I'm currently using a Raycast to make this happen, and I'm stuck.
It works just fine in the Unity editor, but in a build running on a phone the monster doesn't recognize the touch, or doesn't recognize the point where I touched.
if (IsClickMonster)
{
    // The monster was already selected by a previous touch; this touch picks the destination.
    if (Input.GetMouseButtonDown(0))
    {
        Ray MoveClick = cam.ScreenPointToRay(Input.mousePosition);
        // Only accept hits on the clickable layers, and ignore the monster itself.
        if (Physics.Raycast(MoveClick, out ClickedPosition, float.MaxValue, whatCanbeClickedOn) && ClickedPosition.collider.gameObject.CompareTag("monster") == false)
        {
            Debug.Log("Monster is Moving!");
            // Show the click effect slightly above the hit point.
            clickEffect.transform.position = new Vector3(ClickedPosition.point.x, ClickedPosition.point.y + 1.0f, ClickedPosition.point.z);
            StopCoroutine("MonsterMove");
            IsMoving = true;
            StartCoroutine("MonsterMove");
        }
        else
        {
            // The touch hit nothing valid: cancel the move and deselect the monster.
            clickEffect.SetActive(false);
            StopCoroutine("MonsterMove");
            IsClickMonster = false;
            return;
        }
    }
}
else
{
    return;
}
↑ Picture of moving the Skull Soldier. The white dot (clickEffect) is where the player touched (ClickedPosition).
Physics.Raycast(MoveClick, out ClickedPosition, float.MaxValue, whatCanbeClickedOn)
Even though I'm using the "whatCanbeClickedOn" layermask, I wonder what I can do to prevent this from happening in the mobile build.
In my experience with Android, when something doesn't work well on the device it's usually one of two things: either there is an error or warning at runtime that just happened not to trigger in the Unity editor, or the phone can't handle the game (it moves too many polygons or lights) and starts doing odd things. Check the warnings in Unity, and of course see whether any errors appear too. That's my best bet.
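Another thing worth trying (just a guess, not confirmed to be your problem): on the device the input comes from the touch screen, so you could read the touch position explicitly instead of relying on Input.mousePosition mouse emulation. A rough sketch reusing the field names from your snippet:

if (IsClickMonster && Input.touchCount > 0)
{
    Touch touch = Input.GetTouch(0);
    if (touch.phase == TouchPhase.Began)
    {
        // Raycast from the actual touch position rather than the emulated mouse position.
        Ray moveClick = cam.ScreenPointToRay(touch.position);
        RaycastHit hitInfo;
        if (Physics.Raycast(moveClick, out hitInfo, float.MaxValue, whatCanbeClickedOn))
        {
            Debug.Log("Touched world point: " + hitInfo.point);
        }
    }
}

If that logs the expected point on the phone, the raycast and layer mask are fine and the problem is in how the click position is being read.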
Related
I use Unity 3D with the Samsung Gear VR. I have a working OVRPlayerController in my scene, but I am having difficulty mapping the Oculus tap, swipe, and return button.
I have tried with something like:
if (Input.GetMouseButtonDown(0))
{
Debug.Log("input detected");
}
And with this I detect the tap encircled in red.
I have also tried something like:
if (OVRInput.Get(OVRInput.Button.PrimaryThumbstick))
{
Debug.Log("Input detected");
}
Or:
if (OVRInput.Get(OVRInput.Button.One))
{
Debug.Log("Input detected");
}
But nothing seems to work. Is there any documentation that explains how the input I circled in yellow is mapped on the Samsung Gear VR? Does anyone have experience with this, or can maybe guide me to some useful documentation on this matter?
Cheers
My project settings for input:
For swipes, you can get the current Vector2 on the touchpad (either the HMD and/or the incoming remote controller touchpad) by using:
CurrentVector = OVRInput.Get(OVRInput.Axis2D.PrimaryTouchpad);
As for the back button, it is accessible through OVRInput.Button.Back with Get, GetDown, or GetUp. But note that a two-second-long back press is reserved for the Universal Menu and should not have any implications inside your game or app.
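Putting the two together, a minimal sketch (it assumes the Oculus Utilities are imported and an OVRManager is present in the scene so OVRInput gets updated; the class name is illustrative):

using UnityEngine;

public class GearVrInputProbe : MonoBehaviour
{
    void Update()
    {
        // Current finger position on the touchpad as a Vector2, roughly in [-1, 1].
        Vector2 touchpad = OVRInput.Get(OVRInput.Axis2D.PrimaryTouchpad);

        // Tap on the touchpad.
        if (OVRInput.GetDown(OVRInput.Button.PrimaryTouchpad))
            Debug.Log("Touchpad tapped at " + touchpad);

        // Short back-button press; a ~2 second hold is reserved for the Universal Menu.
        if (OVRInput.GetUp(OVRInput.Button.Back))
            Debug.Log("Back button pressed");
    }
}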
I'm having a bit of a problem with OnTriggerEnter when I'm using my mobile as a test device.
I have some touch code that successfully lets me drag objects around the screen.
I am then having the objects collide with other objects on the screen.
This was working perfectly until I turned the objects into prefabs. (I need to do this because the objects are randomly generated at runtime.)
Now I can still move the objects around the screen, but they no longer collide with the other objects, which are also prefabs. It does, however, still work fine when running in the Unity editor on my laptop.
All my objects have colliders on them with trigger checked, and the moving objects have rigidbodies.
OnTriggerEnter code:
public void OnTriggerEnter(Collider other)
{
Debug.Log ("here");
Debug.Log(this.gameObject.tag +"is this");
Debug.Log(other.gameObject.tag + "is other");
if (this.gameObject.tag == other.gameObject.tag)
{
Debug.Log("here2");
Reftomanager.miniGameScore++;
Reftomanager.updateScore();
Destroy(this.gameObject);
}
}
Touch code:
if (Input.touchCount > 0)
{
Touch touch = Input.GetTouch(0);
switch(touch.phase)
{
case TouchPhase.Began:
Ray ray = Camera.main.ScreenPointToRay (touch.position);
if (Physics.Raycast(ray,out hit))
{
thisObject = hit.collider.gameObject;
touchPos = Camera.main.ScreenToWorldPoint (touch.position);
if(thisObject.name!="circle")
{
draggingMode = true;
}
}
break;
case TouchPhase.Moved:
if (draggingMode)
{
touchPos = Camera.main.ScreenToWorldPoint (touch.position);
newCentre = touchPos;
thisObject.transform.position = touchPos;
}
break;
case TouchPhase.Ended:
draggingMode = false;
break;
}
}
I'm completely stumped so any help would be amazing.
Thanks
I just got this same error recently. I suggest using:
if (other.gameObject.CompareTag("YourTagName"))
Also, if you recently added a tag or edited any tags, I found that Unity has a bug where your tags will not register in your Android build unless you restart Unity.
GL.
Since you're using 3D colliders, is it possible that the position you are assigning them is different? Touch.position is a Vector2, which means ScreenToWorldPoint would be using 0 for z. If you are using a Vector3 with a z value other than 0 to get the world point in the editor (standalone input), it could give you a different value even if x and y are the same.
Another possibility is that a platform-specific error is happening somewhere else in the code, when the object is instantiated. Your movement code would still work fine if it isn't in the same MonoBehaviour.
If you have an Android device, you can use Android Monitor with the Unity tag to check for error messages.
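For the first point, a small illustrative sketch (the drag-depth calculation assumes the camera looks along the z axis; it reuses touch, thisObject, and touchPos from your snippet):

// Supply an explicit z so the converted touch position lands on the plane the
// dragged object actually sits on, instead of at the camera's own depth.
float dragDepth = Mathf.Abs(Camera.main.transform.position.z - thisObject.transform.position.z);
Vector3 screenPoint = new Vector3(touch.position.x, touch.position.y, dragDepth);
touchPos = Camera.main.ScreenToWorldPoint(screenPoint);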
I am new to Unity and I am trying to learn the basics. I study physics at school (tenth grade).
What I have done so far: I added a ball to my project and applied gravity to it with a Rigidbody.
I want to make the ball suddenly jump in the air when there is touch input, like in Flappy Bird, for example.
My script is basic:
void Update()
{
if (Input.touchCount == 1)
{
GetComponent<Rigidbody>().AddForce(new Vector2(0, 10), ForceMode.Impulse);
}
}
With this script the ball falls (gravity), and when I touch the screen its Y coordinate changes, but it happens abruptly (no animation), changes by only about 1, and the ball keeps falling (I can't keep it on screen). Also, I can make it jump only once; if I press multiple times it still jumps only once, as you can see here:
https://vid.me/aRfk
Thank you for helping.
I created the same scene in the Unity editor and played a little with the same setup you have. Yes, I had similar problems adding force in Update, and also (but fewer) in FixedUpdate, but adding force in LateUpdate seems to work okay.
Here is my code:
public Rigidbody2D rb2dBall;
void LateUpdate()
{
if(Input.GetKeyUp(KeyCode.Space))
{
rb2dBall.AddForce (Vector2.up * 10f, ForceMode2D.Impulse);
}
}
And I also turned on interpolation:
I can't say why the physics behaves like this; it could be some bug in Unity 5.x, since it should work fine in FixedUpdate.
Firstly, when working with physics in Unity, it's highly recommended to use the FixedUpdate method instead of Update. You can check this link: http://docs.unity3d.com/ScriptReference/MonoBehaviour.FixedUpdate.html
The second thing is that maybe you are not applying enough force to the ball; the force needed to give a decent impulse depends on the mass of your ball's Rigidbody. So try something like this:
GetComponent<Rigidbody>().AddForce(Vector2.up * 100, ForceMode.Impulse);
Change the fixed value of 100 to suit your needs.
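A minimal sketch of that idea combined with touch input (the class and field names are illustrative, not from the question): record the tap in Update and apply the impulse in the next physics step.

using UnityEngine;

public class BallJump : MonoBehaviour
{
    public float jumpForce = 100f;   // tune this to your Rigidbody's mass
    private Rigidbody rb;
    private bool jumpRequested;

    void Start()
    {
        rb = GetComponent<Rigidbody>();
    }

    void Update()
    {
        // Only register the frame the finger goes down, not every frame it stays down.
        if (Input.touchCount == 1 && Input.GetTouch(0).phase == TouchPhase.Began)
            jumpRequested = true;
    }

    void FixedUpdate()
    {
        if (jumpRequested)
        {
            jumpRequested = false;
            rb.AddForce(Vector3.up * jumpForce, ForceMode.Impulse);
        }
    }
}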
I am an Android programmer. Because of the many objects in the scene, my game is lagging.
I have a theory for removing the lag: if I can control rendering in Unity, I can remove the lag.
using UnityEngine;
using System.Collections;

public class Enemy : MonoBehaviour {

    GameObject object2;

    void Start() {
        // Start hidden.
        GetComponent<Renderer>().enabled = false;
    }

    void Update() {
        object2 = GameObject.Find("TR");
        var distance = Vector3.Distance(gameObject.transform.position, object2.transform.position);
        print(distance);
        // Show the enemy once the object "TR" is close enough.
        if (distance <= 80) {
            GetComponent<Renderer>().enabled = true;
        }
    }
}
This doesn't work. How can I have a render boolean so that an object is rendered when there is a collision and hidden otherwise?
I want a zone such that every object inside the zone is rendered and everything outside it is not rendered.
void OnTriggerEnter(Collider collision)
{
    if (collision.gameObject.tag == "zone")
    {
        GetComponent<Renderer>().enabled = true;
    }
    else
    {
        GetComponent<Renderer>().enabled = false;
    }
}
This doesn't work either. I also tried:
void OnTriggerEnter(Collider collision)
{
    if (collision.gameObject.tag == "zone")
    {
        gameObject.SetActive(false);
    }
    else
    {
        gameObject.SetActive(true);
    }
}
This is either already built into Unity (frustum culling, plus occlusion culling if you set it up) or implementing it yourself is a bad idea, because raycasts are expensive and you would need a lot of them. Try to find the other problems that cause the lag in your game: disable feature after feature and write down how many frames you get; that will give you the best overview of what the problem is. Look up online which operations are expensive (Instantiate, Destroy, GameObject.Find or finding by tag, too many shaders, slow shaders, too many textures to load), and try merging the models you have.
Here you will find a great document about optimization. It's prepared for mobile devices, but I hope you will find what you need: Unity Optimization Guide for x86 Android.
I would recommend keeping your blue blobs in an object pool and disabling the ones that leave the screen.
You know your position and you know the positions of the objects in the pool, so you can compute the distance in one direction (for instance, behind you) and disable a blob after it falls x amount behind. Raycasts or collision checks are redundant for this.
In your terrain generation scripts, check for disabled pool objects; if one exists, move it ahead in the level and reposition it, or whatever logic you have there.
Don't instantiate and destroy unless you really need to; do it at level load instead of on the fly. (It's expensive.)
There are some really good tutorials on the Unity page; have a look there. They cover things like endless runners. A rough sketch of the pooling idea is below.
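A minimal sketch of such a pool (class and field names are illustrative, not from the question):

using System.Collections.Generic;
using UnityEngine;

public class BlobPool : MonoBehaviour
{
    public GameObject blobPrefab;
    public int poolSize = 20;
    private readonly List<GameObject> pool = new List<GameObject>();

    void Start()
    {
        // Pay the instantiation cost once, at load time.
        for (int i = 0; i < poolSize; i++)
        {
            GameObject blob = Instantiate(blobPrefab);
            blob.SetActive(false);
            pool.Add(blob);
        }
    }

    // Call this from the terrain generation script instead of Instantiate().
    public GameObject Spawn(Vector3 position)
    {
        foreach (GameObject blob in pool)
        {
            if (!blob.activeInHierarchy)
            {
                blob.transform.position = position;
                blob.SetActive(true);
                return blob;
            }
        }
        return null; // pool exhausted: grow it or skip this spawn
    }

    // Call this when a blob falls far enough behind the player.
    public void Despawn(GameObject blob)
    {
        blob.SetActive(false);
    }
}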
I am trying to make a slice effect when the user moves their finger across the screen on an Android device, like in Fruit Ninja.
I have a MovieClip named Particle which contains a circle.
I tried the following:
stage.addEventListener(MouseEvent.MOUSE_DOWN, startSlice);
stage.addEventListener(MouseEvent.MOUSE_UP, endSlice);

function startSlice(e:MouseEvent):void
{
    // Start spawning particles while the finger is down.
    stage.addEventListener(MouseEvent.MOUSE_MOVE, drawSlice);
}

function endSlice(e:MouseEvent):void
{
    // Stop spawning particles when the finger is lifted.
    stage.removeEventListener(MouseEvent.MOUSE_MOVE, drawSlice);
}

function drawSlice(e:MouseEvent):void
{
    var p:Particle = new Particle();
    addChild(p);
    p.x = mouseX;
    p.y = mouseY;
}
But when I run it, the slice is broken up; I want it to be seamless.
Adding e.updateAfterEvent() in the mouse move handler might improve performance a bit.
In general, your code will probably not work as well as the particle effects you see in games such as Fruit Ninja. For good results you will need to use a particle engine, such as:
StarDust: https://code.google.com/p/stardust-particle-engine/
Flint: http://flintparticles.org/
Partigen: http://www.desuade.com/partigen/engine
Starling Particle: https://github.com/PrimaryFeather/Starling-Extension-Particle-System
I know Starling works great on Android; not sure about the others.