I am building an app that will recognize paintings and display information about them using AR.
I need to call multiple image targets, but not simultaneously: an image target should only be activated when the AR camera detects it.
*I've tried creating many scenes, each with an image target in it, but I can't call different image targets; it keeps reverting to only one image target.
This is what you can see in the menu:
Main menu
Start AR camera (this part should have many image targets, but should not detect them simultaneously)
Help (how to use the app)
Exit
*I'm using Vuforia to create the AR.
Thanks in advance to those who will help me.
This is the image target and its database:
Run the multi target scene sample. There are three targets (stone, wood and road).
Each contains the TrackableBehaviour component.
Grab it and disable it in Start. If you do it in Awake, it will be set back to active, most likely in the Awake of the component itself or via some other manager.
using System.Collections.Generic;
using UnityEngine;
using Vuforia;

public class TrackerController : MonoBehaviour
{
    private IDictionary<string, TrackableBehaviour> trackers = null;

    private void Start()
    {
        this.trackers = new Dictionary<string, TrackableBehaviour>();
        // Collect every TrackableBehaviour in the scene and disable it.
        var found = FindObjectsOfType<TrackableBehaviour>();
        foreach (TrackableBehaviour tb in found)
        {
            this.trackers.Add(tb.TrackableName, tb);
            tb.enabled = false;
        }
    }

    public bool SetTracker(string name, bool value)
    {
        if (string.IsNullOrEmpty(name)) { return false; }
        if (this.trackers.ContainsKey(name) == false) { return false; }
        this.trackers[name].enabled = value;
        return true;
    }
}
The method finds all TrackableBehaviour components and places them in a dictionary for easy access. The setter method returns a boolean; you could change it to throw an exception or something else.
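As a minimal usage sketch (the menu component and method names here are assumed, the target names are taken from the multi target sample above), a menu script could activate one target at a time through TrackerController:
public class PaintingMenu : MonoBehaviour
{
    public TrackerController trackerController;

    // Hypothetical UI hook: wire this to a Button, passing the target's
    // database name, e.g. "stone", "wood" or "road" from the sample.
    public void ShowPainting(string targetName)
    {
        // Disable the known targets first so only one is active at a time.
        trackerController.SetTracker("stone", false);
        trackerController.SetTracker("wood", false);
        trackerController.SetTracker("road", false);
        trackerController.SetTracker(targetName, true);
    }
}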
How can I set the display to stereoscopic programmatically in Unity for an app deployed to an Android device?
I want a UI menu where the user can toggle between "VR mode" and normal mode. I do not want VR mode by default as it should be an option at run-time. I know there is a setting for "Virtual Reality Supported" in the build settings, but again, I do not want this enabled by default.
Include using UnityEngine.XR; at the top.
To disable VR in the Start function, call XRSettings.LoadDeviceByName("") with an empty string, followed by XRSettings.enabled = false;.
When you want to enable VR later on, call XRSettings.LoadDeviceByName("daydream") with the VR device name, followed by XRSettings.enabled = true;.
You should wait for a frame between each function call, which requires doing this in a coroutine.
Also, on some VR devices you must go to Edit->Project Settings->Player and make sure that the Virtual Reality Supported checkbox is checked (true) before this will work. Then you can disable it in the Start function and enable it whenever you want.
EDIT:
This is known to work on some VR devices, but not all of them. It should work on Daydream VR, though. Complete code sample:
IEnumerator LoadDevice(string newDevice, bool enable)
{
    XRSettings.LoadDeviceByName(newDevice);
    yield return null;
    XRSettings.enabled = enable;
}

void EnableVR()
{
    StartCoroutine(LoadDevice("daydream", true));
}

void DisableVR()
{
    StartCoroutine(LoadDevice("", false));
}
Call EnableVR() to enable VR and DisableVR() to disable it. If you are using anything other than Daydream, pass the name of that VR device to the LoadDevice function in EnableVR().
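For the run-time menu asked about above, a minimal sketch (the UI wiring is assumed) could simply forward a Toggle's state to those two methods:
// Hypothetical UI hook: bind this to a UI Toggle's onValueChanged event
// so the user can switch between VR mode and normal mode at run-time.
public void OnVRModeToggled(bool vrOn)
{
    if (vrOn)
        EnableVR();
    else
        DisableVR();
}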
For newer builds of Unity (e.g. 2019.4.0f1) you can use the XR Plugin Management package.
To enable, call:
XRGeneralSettings.Instance.Manager.InitializeLoader();
To disable, call:
XRGeneralSettings.Instance.Manager.DeinitializeLoader();
I'm using Unity 2021, but this probably works in earlier versions; I'm also using XR Plug-in Management.
Start:
XRGeneralSettings.Instance.Manager.StartSubsystems();
Stop:
XRGeneralSettings.Instance.Manager.StopSubsystems();
Full documentation at:
https://docs.unity3d.com/Packages/com.unity.xr.management@4.0/manual/EndUser.html
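Putting the two previous answers together, a minimal sketch (assuming XR Plug-in Management is installed and a loader is assigned in Project Settings): InitializeLoader is a coroutine, so wait for it before starting the subsystems.
using System.Collections;
using UnityEngine;
using UnityEngine.XR.Management;

// A minimal sketch of a full start/stop cycle with XR Plug-in Management.
public class XRModeSwitcher : MonoBehaviour
{
    public IEnumerator StartXR()
    {
        // InitializeLoader is a coroutine; wait until a loader is active.
        yield return XRGeneralSettings.Instance.Manager.InitializeLoader();

        if (XRGeneralSettings.Instance.Manager.activeLoader != null)
            XRGeneralSettings.Instance.Manager.StartSubsystems();
        else
            Debug.LogError("Failed to initialize an XR loader.");
    }

    public void StopXR()
    {
        XRGeneralSettings.Instance.Manager.StopSubsystems();
        XRGeneralSettings.Instance.Manager.DeinitializeLoader();
    }
}
Call StartCoroutine(StartXR()) to enter XR and StopXR() to leave it.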
2020.3.14f1
This doesn't work for me; I get this error when running my Android app:
Call to DeinitializeLoader without an initialized manager. Please make sure wait for initialization to complete before calling this API.
[RuntimeInitializeOnLoadMethod(RuntimeInitializeLoadType.AfterAssembliesLoaded)]
static void TryToDeinitializeOculusLoader()
{
    XRGeneralSettings.Instance.Manager.DeinitializeLoader();
}
More context:
I am trying to unload the Oculus loader before it manages to load the plugin.
I have an Android app, and the Oculus loader calls Application.Quit because the device is not an Oculus headset.
Waiting for XRGeneralSettings.Instance.Manager.isInitializationComplete takes too long.
I tried all RuntimeInitializeLoadType annotations.
OculusLoader.cs
#elif (UNITY_ANDROID && !UNITY_EDITOR)
[RuntimeInitializeOnLoadMethod(RuntimeInitializeLoadType.AfterAssembliesLoaded)]
static void RuntimeLoadOVRPlugin()
{
    var supported = IsDeviceSupported();

    if (supported == DeviceSupportedResult.ExitApplication)
    {
        Debug.LogError("\n\nExiting application:\n\nThis .apk was built with the Oculus XR Plugin loader enabled, but is attempting to run on a non-Oculus device.\nTo build for general Android devices, please disable the Oculus XR Plugin before building the Android player.\n\n\n");
        Application.Quit();
    }

    if (supported != DeviceSupportedResult.Supported)
        return;

    try
    {
        if (!NativeMethods.LoadOVRPlugin(""))
            Debug.LogError("Failed to load libOVRPlugin.so");
    }
    catch
    {
        // handle Android standalone build with Oculus XR Plugin installed but disabled in loader list.
    }
}
#endif
SOLUTION
I made my build class implement IPreprocessBuildWithReport:
using System.Collections.Generic;
using UnityEditor;
using UnityEditor.Build;
using UnityEditor.Build.Reporting;
using UnityEditor.XR.Management;
using UnityEngine.XR.Management;

// Example build class (name assumed); implements IPreprocessBuildWithReport.
public class DisableXRLoadersBuild : IPreprocessBuildWithReport
{
    public int callbackOrder => 0; // required by the interface

    public void OnPreprocessBuild(BuildReport report)
    {
        DisableXRLoaders(report);
    }

    /// https://docs.unity3d.com/Packages/com.unity.xr.management@3.2/manual/EndUser.html
    /// Do this as a setup step before you start a build, because the first thing that
    /// XR Plug-in Manager does at build time is to serialize the loader list to the build target.
    void DisableXRLoaders(BuildReport report)
    {
        XRGeneralSettingsPerBuildTarget buildTargetSettings;
        EditorBuildSettings.TryGetConfigObject(XRGeneralSettings.k_SettingsKey, out buildTargetSettings);
        if (buildTargetSettings == null)
        {
            return;
        }

        XRGeneralSettings settings = buildTargetSettings.SettingsForBuildTarget(report.summary.platformGroup);
        if (settings == null)
        {
            return;
        }

        XRManagerSettings loaderManager = settings.AssignedSettings;
        if (loaderManager == null)
        {
            return;
        }

        var loaders = loaderManager.activeLoaders;
        // If there are no loaders present in the current manager instance,
        // then the settings will not be included in the current build.
        if (loaders.Count == 0)
        {
            return;
        }

        var loadersForRemoval = new List<XRLoader>();
        loadersForRemoval.AddRange(loaders);
        foreach (var loader in loadersForRemoval)
        {
            loaderManager.TryRemoveLoader(loader);
        }
    }
}
public void Awake() {
    StartCoroutine(SwitchToVR(() => {
        Debug.Log("Switched to VR Mode");
    }));

    // To disable VR Mode instead:
    // XRSettings.enabled = false;
}
IEnumerator SwitchToVR(Action callback) {
    // Device names are lowercase, as returned by `XRSettings.supportedDevices`.
    // The original Google sample makes you specify a single device:
    // string desiredDevice = "daydream"; // Or "cardboard".
    // XRSettings.LoadDeviceByName(desiredDevice);

    // This is slightly better: the first supported device in the list is loaded.
    string[] devices = new string[] { "daydream", "cardboard" };
    XRSettings.LoadDeviceByName(devices);

    // Must wait one frame after calling `XRSettings.LoadDeviceByName()`.
    yield return null;

    // Now it's OK to enable VR mode.
    XRSettings.enabled = true;
    callback.Invoke();
}
I am working on a multiplayer third-person game. I am using Motion Controller for animations and Photon as the network manager. I have a problem: when I connect and join the room, the other players don't move on the other players' screens. They move only on their own devices. Here is what I deactivated:
using UnityEngine;
using com.ootii.Input;
using com.ootii.Actors;
using com.ootii.Actors.AnimationControllers;

public class netView : Photon.MonoBehaviour {
    public Camera cam;
    public UnityInputSource uis;
    public GameObject canvas;
    public ActorController ac;
    public MotionController mc;

    // Use this for initialization
    void Start () {
        if (photonView.isMine) {
            cam.enabled = true;
            uis._IsEnabled = true;
            canvas.SetActive (true);
            ac.enabled = true;
            mc.enabled = true;
        } else {
            cam.enabled = false;
            uis._IsEnabled = false;
            canvas.SetActive (false);
            ac.enabled = false;
            mc.enabled = false;
        }
    }
}
Here is a video: https://youtu.be/mOaAejsVX04 . In it I am playing in the editor and on my phone. On my device I move around, but the editor player does not move. Also, in the editor, the player from the device just stays there and doesn't move, while on the phone it is moving around.
For input I am using the CrossPlatformManager class. How can I fix it?
In your case, I think the problem is that you don't synchronize the transform to begin with. You need either a PhotonTransformView component attached to your network object, with a photonView observing that PhotonTransformView, or to manually write and read to that network object's stream inside your network behaviour.
I strongly encourage you to go through the basic tutorial, which will show you all of the above techniques step by step:
https://doc.photonengine.com/en-us/pun/current/demos-and-tutorials/pun-basics-tutorial/player-networking#trans_sync
https://doc.photonengine.com/en-us/pun/current/demos-and-tutorials/pun-basics-tutorial/player-networking#beams
It doesn't matter which input technique you use; what matters is the synchronization of the transform.
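As a rough sketch of the manual alternative (for classic PUN; the component name is assumed, and it must be added to the PhotonView's observed components):
using UnityEngine;

// A minimal sketch of manual transform synchronization for classic PUN.
public class NetTransformSync : Photon.MonoBehaviour
{
    private Vector3 networkPosition;
    private Quaternion networkRotation;

    void OnPhotonSerializeView(PhotonStream stream, PhotonMessageInfo info)
    {
        if (stream.isWriting)
        {
            // We own this object: send our state to the other clients.
            stream.SendNext(transform.position);
            stream.SendNext(transform.rotation);
        }
        else
        {
            // Remote copy: receive the owner's state.
            networkPosition = (Vector3)stream.ReceiveNext();
            networkRotation = (Quaternion)stream.ReceiveNext();
        }
    }

    void Update()
    {
        if (!photonView.isMine)
        {
            // Smooth toward the last received state instead of snapping.
            transform.position = Vector3.Lerp(transform.position, networkPosition, Time.deltaTime * 10f);
            transform.rotation = Quaternion.Slerp(transform.rotation, networkRotation, Time.deltaTime * 10f);
        }
    }
}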
I am trying to use a LookToWalk script in my Unity VR app that should run on my Daydream View. In "Game" mode, previewing the changes, everything works as expected (I configured the script to run forward once the user camera faces 30.0 degrees downwards or more).
However, when I build the Daydream app and install it on my Google Pixel, CharacterController.SimpleMove doesn't seem to work any more.
The logs show that the 30.0-degree trigger fires as expected, but no movement is seen on the Daydream.
Do you know why this could be happening? It seems really strange that it runs on the "emulator" but not the real device.
using UnityEngine;
using System.Collections;

public class GVRLookWalk : MonoBehaviour {
    public Transform vrCamera;
    public float toggleAngle = 30.0f;
    public float speed = 3.0f;
    private bool shouldWalk;
    private CharacterController cc;

    // Use this for initialization
    void Start () {
        cc = GetComponent<CharacterController>();
    }

    // Update is called once per frame
    void Update () {
        if (vrCamera.eulerAngles.x >= toggleAngle && vrCamera.eulerAngles.x < 90.0f) {
            shouldWalk = true;
        } else {
            shouldWalk = false;
        }
        if (shouldWalk) {
            Vector3 forward = vrCamera.TransformDirection (Vector3.forward);
            cc.SimpleMove (forward * speed);
        }
    }
}
Is the camera a child of another transform? You cannot move the camera directly in Unity: "the camera must be a child of another GameObject, and changes to the position and rotation must be applied to the parent's Transform."
https://unity3d.com/learn/tutorials/topics/virtual-reality/movement-vr
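As a minimal sketch of that fix (assuming the camera is made a child of a "rig" object that carries the CharacterController; the class name is assumed), the movement reads the camera's orientation but moves the parent:
using UnityEngine;

// A minimal sketch: this script and the CharacterController sit on a parent
// "rig" object, with the VR camera as a child rotated by the headset.
public class GVRRigWalk : MonoBehaviour {
    public Transform vrCamera;        // child camera (head tracking)
    public float toggleAngle = 30.0f;
    public float speed = 3.0f;
    private CharacterController cc;

    void Start () {
        cc = GetComponent<CharacterController>(); // controller on the rig
    }

    void Update () {
        float pitch = vrCamera.eulerAngles.x;
        if (pitch >= toggleAngle && pitch < 90.0f) {
            Vector3 forward = vrCamera.TransformDirection (Vector3.forward);
            cc.SimpleMove (forward * speed); // moves the rig, not the camera
        }
    }
}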
I am using SDL 2.0.3 along with NDK r10e. I'm attempting to make the return button switch the app to the background, so I tried to use the function SDL_MinimizeWindow(), but it does nothing! Is this a bug, or am I missing something?
Here is my code:
if (event.key.keysym.sym == SDLK_AC_BACK)
{
    SDL_MinimizeWindow(window);
    SDL_Log("window minimized!\n");
}
Everything works fine and I get the log message when the button is pressed, but the window is not minimized.
That doesn't appear to be supported on Android (there's not really anything corresponding to minimizing a "window" on Android, unless you count finishing an Activity).
The SDL_MinimizeWindow function looks like this:
void
SDL_MinimizeWindow(SDL_Window * window)
{
    CHECK_WINDOW_MAGIC(window, );

    if (window->flags & SDL_WINDOW_MINIMIZED) {
        return;
    }

    SDL_UpdateFullscreenMode(window, SDL_FALSE);

    if (_this->MinimizeWindow) {
        _this->MinimizeWindow(_this, window);
    }
}
Where _this is an SDL_VideoDevice *, which is set to point to an SDL_VideoDevice for the appropriate platform at runtime. The Android video driver only sets up the following three window-related functions:
device->CreateWindow = Android_CreateWindow;
device->SetWindowTitle = Android_SetWindowTitle;
device->DestroyWindow = Android_DestroyWindow;
Trying to perform any other operations on an SDL_Window on Android is likely to do nothing.
Some further information in the form of a couple of lines of code from SDL_androidwindow.c:
window->flags &= ~SDL_WINDOW_RESIZABLE; /* window is NEVER resizeable */
window->flags |= SDL_WINDOW_FULLSCREEN; /* window is always fullscreen */
I need to detect whether the phone has a front-facing camera, and if so, I need to calculate its megapixels. The same goes for a rear-facing camera.
I know how to get the megapixels of a Camera object, but I don't know how to check for the other things.
P.S.: It would also be nice if you know a way to check whether the camera has a flash, and other statistics about the camera.
I always try to create helpers.
Check if you have a front camera:
public static boolean checkCameraFront(Context context) {
    // The system feature tells us whether a front-facing camera exists.
    return context.getPackageManager().hasSystemFeature(PackageManager.FEATURE_CAMERA_FRONT);
}
Check if you have a camera in your device:
public static boolean checkCameraRear() {
    return Camera.getNumberOfCameras() > 0;
}
http://developer.android.com/reference/android/hardware/Camera.html#getNumberOfCameras() , introduced in API level 9, gets you the number of cameras.
http://developer.android.com/reference/android/hardware/Camera.CameraInfo.html contains information about a camera's facing direction.
http://developer.android.com/reference/android/hardware/Camera.Parameters.html#getPictureSize() gives the picture size, from which the megapixels can be calculated.
http://developer.android.com/reference/android/hardware/Camera.Parameters.html#getFlashMode() returns null if there is no flash.
Many other parameters can be read from the camera object too.
http://developer.android.com/reference/android/hardware/Camera.html has step-by-step instructions for using the camera. You can follow these instructions if you understand any object-oriented language.