I'm trying to make an app that responds to the Cardboard.SDK.Tilted flag in some Update() method.
When running in the Unity Player, pressing the Esc key sets Cardboard.SDK.Tilted to true, so everything works there.
But when I build the app for Android, Cardboard.SDK.Tilted stays false when I tilt the device. Other VR apps with tilt actions work fine on my phone. Is there any other option I have to enable before building for Android to make this work?
I'm using Unity v5.3.3f1 and Cardboard SDK v0.6; the devices I've tried are an Xperia Z2, a Samsung Galaxy S3 and an iPhone 6.
EDIT:
So, I've tried putting this code into both Update() and LateUpdate() methods:
if (Cardboard.SDK.Tilted) {
print("tilted, next scene");
NextScene ();
}
When the device is tilted, the next scene should be loaded.
But as I've said, it only works in the Unity Player when pressing the Esc key to simulate the tilt; on a real device nothing happens - the Cardboard.SDK.Tilted variable is never set to true.
I've seen on https://recordnotfound.com/cardboard-unity-googlesamples-6780/issues that there was an issue about the discontinuation of Tilt in v0.6. Is it possible that this is no longer supported? But it's strange that it works in the Unity Player and not on a real device.
I can verify that the Cardboard.SDK.Tilted flag does not appear to be working as it did in previous versions of the SDK. The Esc key triggers it in the editor, but the tilt action does not trigger it in device builds.
However, you can implement an equivalent quite simply with Input.acceleration:
float angle = 80.0f / 360.0f * 2.0f * Mathf.PI;
bool isTilted = Mathf.Abs(Input.acceleration.normalized.x) > Mathf.Sin(angle);
if (Cardboard.SDK.Tilted || isTilted)
{
//Action here
}
If the device's acceleration is entirely due to gravity, the angle float is the angle of the device from horizontal. Attempting to normalize a Vector3 that is too small sets it to zero, so small vectors shouldn't trip the conditional. Pre-calculate your sine to save a cycle.
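To show how that might fit together, here is a minimal sketch with the sine pre-calculated once in Start(), as suggested above. It assumes the Cardboard SDK v0.6 API from the question; the TiltCheck class name is made up, and the comment stands in for the question's NextScene() call:

using UnityEngine;

public class TiltCheck : MonoBehaviour
{
    private float sinThreshold;

    void Start()
    {
        // 80 degrees from horizontal, converted to radians; computed once.
        sinThreshold = Mathf.Sin(80.0f * Mathf.Deg2Rad);
    }

    void Update()
    {
        // |x| of the normalized acceleration exceeds sin(80 deg) only when the
        // device's x axis is close to vertical, i.e. the phone has been tilted.
        bool isTilted = Mathf.Abs(Input.acceleration.normalized.x) > sinThreshold;
        if (Cardboard.SDK.Tilted || isTilted)
        {
            // Placeholder for the question's NextScene() call.
        }
    }
}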
I use Unity 3D with a Samsung Gear VR. I have a working OVRPlayerController in my scene, but I am having difficulties mapping the Oculus tap, swipe and back button.
I have tried with something like:
if (Input.GetMouseButtonDown(0))
{
Debug.Log("input detected");
}
And with this I detect the tap (encircled in red in my screenshot).
I have tried also something like:
if (OVRInput.Get(OVRInput.Button.PrimaryThumbstick))
{
Debug.Log("Input detected");
}
Or :
if (OVRInput.Get(OVRInput.Button.One))
{
Debug.Log("Input detected");
}
But nothing seems to work. Is there any documentation that explains how the input I encircled in yellow is mapped on the Samsung Gear VR? Does anyone have experience with this, or can maybe guide me to some useful documentation on this matter?
Cheers
My project settings for input:
For swipes, you can get the current Vector2 on the touchpad (either the HMD touchpad and/or the remote controller's touchpad) by using:
CurrentVector = OVRInput.Get(OVRInput.Axis2D.PrimaryTouchpad);
As for the back button, it is accessible as OVRInput.Button.Back via Get, GetDown or GetUp. But be aware that a roughly two-second-long back press is reserved for the Universal Menu and should not have any effect inside your game or app.
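For illustration, here is a rough sketch of how both could be polled each frame. The class name and the 0.25 threshold are made up, and it assumes an OVRManager/OVRPlayerController is present in the scene so OVRInput gets updated:

using UnityEngine;

public class GearVrInputSample : MonoBehaviour
{
    void Update()
    {
        // Current finger position on the touchpad, roughly in the -1..1 range.
        Vector2 touch = OVRInput.Get(OVRInput.Axis2D.PrimaryTouchpad);
        if (touch.sqrMagnitude > 0.25f)
        {
            Debug.Log("Finger on touchpad at " + touch);
        }

        // Tap on the touchpad.
        if (OVRInput.GetDown(OVRInput.Button.PrimaryTouchpad))
        {
            Debug.Log("Touchpad tapped");
        }

        // Short back press; a ~2 second press is reserved for the Universal Menu.
        if (OVRInput.GetUp(OVRInput.Button.Back))
        {
            Debug.Log("Back button released");
        }
    }
}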
I am using an InputProcessor for touch input on a Flappy Bird-like game. This works fine on my Droid Turbo and a couple of other newer phones. But on my two older tablets, a Xoom and a Verizon tablet, touchDown occasionally doesn't fire. I should mention that the FPS is 60 throughout gameplay. Also, I use an InputMultiplexer which adds both the player input and the HUD/play stages. Could this just be a problem with older Android? Any fix? I am sure it's not my code, given that it works on newer phones.
EDIT
I tried using Gdx.input.isTouched like so:
if (Gdx.input.isTouched()) {
    if (!touched) {
        jump();
    }
    touched = true;
} else {
    touched = false;
}
But it gives me the same results as the InputProcessor :\
This is not a problem with the jump method; as of right now it just prints "touched" to the console.
It is an issue with the viewport; that is, different phones have different screen sizes. Please check the link below, which should help you:
http://stackoverflow.com/questions/39810169/libgdx-text-not-rendering-properly-on-larger-screens/39946652#39946652
I'm updating my application based on a huge user request. My app turns the screen ON after something happens, and now I'm integrating "pocket mode" functionality. So basically, if the user has the phone in his/her pocket, I would like to detect this via the proximity sensor and act on it. But I'm experiencing a lot of trouble.
So I'm registering the sensor and everything as usual. One thing I would like to point out is that I'm requesting a PROXIMITY_SCREEN_OFF_WAKE_LOCK from the PowerManager. That means the screen goes off automatically every time something near the sensor is detected.
powerManager.newWakeLock(PowerManager.PROXIMITY_SCREEN_OFF_WAKE_LOCK, "ProximityScreenOff");
Basically, when the device is on the table and I move my finger over the sensor, the screen turns off as expected.
The problem starts when my activity launches while I'm already holding a finger on the sensor (or the phone is already in a pocket - it's the same thing): the sensor doesn't report anything that is already near the phone. If I move the finger a little bit away, the screen turns ON again.
Is there anything I could do to get my desired behavior - that is, turning the screen OFF when the phone is already in a pocket?
You can manage this using a boolean variable.
That is, only act on the "far" event in onSensorChanged if a "near" event has been seen first.
private boolean isNearCalled = false;

@Override
public void onSensorChanged(SensorEvent event) {
    float distance = event.values[0];
    if (distance < event.sensor.getMaximumRange()) {
        // Near: something is covering the proximity sensor
        isNearCalled = true;
    } else if (isNearCalled) {
        // Far: only handled after a near event was seen first
        // Do stuff for far, like turning the screen off
        isNearCalled = false;
    }
}
I'm new to Unity and I am trying to build a solar system exploration app in Unity. I have the environment set up, and now all I need is the ability to look around smoothly by tilting and moving the phone itself (which is Android). I can already look around, but if I do a complete 180, the physical orientation of the phone seems to become inverted relative to the visual movement in game; e.g. after I have turned 180 degrees, tilting the phone down shifts my view in game to the right, and tilting it up shifts it to the left. Here is the code I have thus far:
#pragma strict

private var quatMult : Quaternion;
private var quatMap : Quaternion;

function Start () {
    Input.gyro.enabled = true;
}

function Update () {
    #if UNITY_ANDROID
    quatMap = Input.gyro.attitude;
    #endif
    transform.localRotation = Quaternion.Euler(90, 0, 0) * quatMap * Quaternion(0,0,1,0) /*quatMult*/;
}
Any help is greatly appreciated. Thanks.
This should be what you're looking for: https://gist.github.com/chanibal/baf46307c4fee3c699d5. Just drag it to the camera and it should work.
You might want to remove the reset-on-touch part (the Input.touchCount > 0 check in Update) and the debug information (the OnGUI method).
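For reference, the core of such a gyro camera script usually boils down to converting the right-handed gyro attitude into Unity's left-handed space and remapping the reference frame. A minimal C# sketch of that common pattern (not the gist itself, just the general idea) might look like this:

using UnityEngine;

public class GyroCamera : MonoBehaviour
{
    void Start()
    {
        Input.gyro.enabled = true;
    }

    void Update()
    {
        Quaternion attitude = Input.gyro.attitude;
        // Flip z and w to convert the right-handed sensor quaternion into
        // Unity's left-handed space, then rotate 90 degrees around X so the
        // device frame lines up with the camera frame.
        Quaternion corrected = new Quaternion(attitude.x, attitude.y, -attitude.z, -attitude.w);
        transform.localRotation = Quaternion.Euler(90f, 0f, 0f) * corrected;
    }
}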
I'm interested in AR applications of mobile devices and naturally I would like to make better use of the compass.
The issue I've been working against isn't how twitchy the compass is (angular smoothing seems to solve that just fine). My main issue is that when the device is held vertically, the compass values start freaking out, causing an on-screen compass to flip about all over the place. I don't have a lot of experience with mobile application development, so I'm not sure what's causing this - whether it's a Unity issue or just a limitation of the digital compass. I know other apps do seem to be able to use the compass fine in any orientation, but this is all very new to me.
I've definitely tried moving the phone in a figure of 8. The device I have to play around with is a Nexus 4.
using UnityEngine;
using System.Collections;

public class Compass : MonoBehaviour {

    // Use this for initialization
    void Start () {
        Input.location.Start ();
        Input.compass.enabled = true;
    }

    // Update is called once per frame
    void Update ()
    {
        var heading = Input.compass.trueHeading;
        transform.eulerAngles = new Vector3 (0, 0, heading);
    }
}
Preamble :)
First off, I'm not an expert (unfortunately) in the subjects I'm about to talk about. But still, I've decided to share my thoughts.
Theory
The problem can be generalized in the following way. You want some continuous function that takes a 3D vector (the device orientation, in your case) and returns another vector that is orthogonal to the original vector. Theory says (see the hairy ball theorem) that for some arguments such a function must return the zero vector. When that function is a compass, the zero vector shows up when the device is oriented vertically (and this feels quite natural if you have ever used an ordinary compass).
Practice
Sometimes you want your app to tell which direction the phone's back (rear camera) is pointing.
Or maybe you even want a combined approach:
If the phone is held flat, show what the phone's top is pointing to.
If the phone is held vertically, show what the phone's back is pointing to.
In both cases you need to use the gyroscope in addition to the compass; a rough sketch follows below.
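To make that concrete, here is a rough sketch of the decision half of such a combined approach in Unity C#. It only uses the gyro's gravity vector to decide whether the phone is flat or upright, and only trusts the raw compass heading in the flat case; the class name and the 0.5 threshold are made up, and the upright branch is deliberately left as a placeholder, since proper tilt compensation with the gyro attitude is more involved:

using UnityEngine;

public class OrientationAwareCompass : MonoBehaviour
{
    public float heading;          // degrees from north, only updated when flat
    public bool phoneIsUpright;    // true when the phone is held vertically

    void Start()
    {
        Input.location.Start();      // trueHeading needs location services
        Input.compass.enabled = true;
        Input.gyro.enabled = true;
    }

    void Update()
    {
        // Gravity in device coordinates tells us how the phone is held:
        // |z| close to 1 means the screen faces up or down, i.e. the phone is flat.
        Vector3 gravity = Input.gyro.gravity;
        phoneIsUpright = Mathf.Abs(gravity.z) < 0.5f;   // illustrative threshold

        if (!phoneIsUpright)
        {
            // Flat: the raw heading describes where the phone's top points.
            heading = Input.compass.trueHeading;
        }
        else
        {
            // Upright: the raw heading becomes unreliable (the problem above).
            // Here you would derive the back-of-phone heading from the gyro
            // attitude instead; that part is intentionally left out.
        }
    }
}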