Unity Google Cardboard reverts any camera changes made when built on Android

I am working with Google Cardboard through Unity on a virtual reality project in which the view slowly changes from stereoscopic VR (a slightly different image in each eye) to monoscopic VR (the same image in each eye).
I can edit the cameras in the Unity Editor, and the changes work as intended in the Game window within the Editor, but when I build the project onto an Android phone, all the user can see is the scene through the default Google Cardboard camera setup.
This seems to happen whenever the "Virtual Reality Supported" checkbox in Player Settings is ticked. Does anyone know why this happens and whether it can be overcome?

This happens because the user's well-being is our most precious resource in VR, so the developers of the SDKs (GVR and OVR alike) have made some decisions that make messing with the camera intentionally convoluted - to avoid people saying the tech is at fault when their whole audience feels sick because someone changed the FOV or something similar.
I believe you can still do what you want by scaling the camera parent.
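As a rough sketch of that idea (not from the Cardboard SDK itself, just plain Unity code with illustrative names): shrinking the scale of the rig's parent shrinks the distance between the two eye cameras relative to the world, so the stereo effect fades toward a monoscopic view.
using UnityEngine;

// Illustrative sketch only: attach to an empty parent of the Cardboard camera rig.
// The class and field names are made up for this example.
public class StereoToMonoFade : MonoBehaviour {
    public float fadeDuration = 10f;   // seconds for the full stereo-to-mono transition
    public float minScale = 0.001f;    // near-zero scale ~ near-zero eye separation

    private float elapsed;

    void Update() {
        elapsed += Time.deltaTime;
        float t = Mathf.Clamp01(elapsed / fadeDuration);
        // Scale 1 = full stereo separation; scale ~0 = effectively monoscopic.
        transform.localScale = Vector3.one * Mathf.Lerp(1f, minScale, t);
    }
}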

Related

ARCore: Emulator and Unity

I'd like to test ARCore using Unity/C# before buying an Android device. Can I use Unity with the ARCore emulator to put together an AR app without owning a device, using only a camera from my PC, and does that camera require a specific spec?
I read that the Android Studio Beta now supports ARCore in the emulator, so an app can be tested in a virtual environment right from the desktop, but I can't tell whether that update is integrated into Unity.
https://developers.googleblog.com/2018/02/announcing-arcore-10-and-new-updates-to.html
Any tips on how people might interact with the app using a PC camera would be really helpful.
Thank you for your help!
Sergio
ARCore uses a combination of the device's IMU and camera. The camera tracks feature points in space and uses a cluster of those points to create a plane for your models. The IMU generates 3D sensor data which is passed to ARCore to track the device's movements.
Judging from those requirements, a webcam just isn't going to work, since it lacks the IMU that ARCore needs. The camera alone cannot track the device's position, which would lead to objects drifting all over the place (if you managed to get it working at all). Google's own pages and Reddit threads indicate the same thing: it just won't work.

Wrong Camera Orientation with Android & Vuforia

We are developing our own Android-based hardware and we wish to use Vuforia (developed via Unity3D) for certain applications. However, we are having problems making Vuforia work well with our current camera orientation settings.
On our hardware, when the camera is placed horizontally - everything works fine. That is, when the camera is parallel to the placement of the display. However, we need to place the camera vertically, or in other words, with a 90 degree difference to the placement of the display. These are all hardware settings. Our kernel is programmed according to such settings and every other program that utilises the camera works compatibly with everything, including our IMU sensors. However, apps developed with Vuforia behave completely odd when the camera is placed vertically.
We assume the problem is related to how Vuforia's algorithms process the raw camera data, but we are not sure. Moreover, we do not know how to fix the situation. For further details, I can list:
-When "Enable Video Background" is on, the projected image is distorted and no video feed is available. The AR projection appears on a black background with distorted dimensions.
-When "Enable Video Background" is on and the device is rotated, the black background is replaced by flickering solid colors.
-When "Enable Video Background" is off, the AR projection has normal dimensions (no distortion) however it is tracked with wrong axis settings. For example, when the target moves left in real world, the projection moves up.
-When "Enable Video Background" is off and the device is rotated, the AR projection is larger compared to its appearance when the device is in it's default state.
I will be glad to provide any more information you need.
Thank you very much, have a nice day.
PS: We have found out that applications that use the camera as their main purpose (camera apps, barcode scanners, etc.) work fine, while apps for which camera usage is a secondary feature (such as some games) have the same problem as Vuforia. This makes me think that apps which access the camera directly work fine, whereas those that go through the Android API and classes fail for some reason.
First, understand that every platform deals with cameras differently, and beyond that, different Android phone manufacturers deal with them differently as well. In my testing WITHOUT Vuforia, I had to rotate the plane I cast the video feed onto to (0, -90, 90) for Android/iPhone and (-270, -90, 90) for the Windows Surface tablet. Beyond this, the iPhone rear camera was mirrored, as were the Android front camera and the Surface front camera. That is easy to account for, but a more annoying issue is that the Google Pixel and Samsung front cameras were mirrored across the y axis (as were ALL iOS back cameras), while the Nexus 6P was mirrored across the x axis. What I am getting at is that there are a LOT of devices to account for with Android, so try more than just that one device. Vuforia has so far dealt with my Pixel and four of my iOS devices just fine.
As for how to fix your problem:
Go into your Player Settings in Unity and look at the orientation. There are a few options here; my application only uses portrait, so I force portrait and it seems to work fine (none of the problems I had to account for in the scenario above). Vuforia previously did NOT support auto-rotation, so make sure you have the latest version, since it sounds like that is what you need. If auto-rotate is set and it is still not working right, you may have to account for that specific device (don't do this for all devices until after you test those devices). To account for a device, use an if statement (or construct a case statement if you hit this problem with multiple devices) and then reflect or translate as needed. Cross-platform development systems like Unity don't always get everything perfect, since there is basically no standard; in these cases you have to account for the differences directly by creating a method with a case statement inside it, so you can cleanly and modularly handle all the affected devices (see the sketch below). It is a pain, but it beats developing for every device separately.
One more thing: make sure you check the Vuforia configuration file, as it has settings such as camera mirroring and direction. These appear to be public settings, so you should also be able to script against them in your case statement, in case you need "Flip Horizontally" for one phone but not another.
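As a rough illustration of that case-statement approach (the device names and mirror axes below are examples only; the (0, -90, 90) and (-270, -90, 90) angles are the ones mentioned above, and you should measure every device you actually support):
using UnityEngine;

// Illustrative sketch: apply a per-device rotation/mirror correction to the quad
// that the camera feed is rendered onto. Names and values are examples only.
public class VideoFeedOrientationFix : MonoBehaviour {
    public Transform videoPlane;   // the plane/quad showing the camera texture

    void Start() {
        ApplyCorrection(SystemInfo.deviceModel);
    }

    void ApplyCorrection(string model) {
        switch (model) {
            case "Nexus 6P":
                // Example: this device was mirrored across the x axis in my testing.
                videoPlane.localRotation = Quaternion.Euler(0f, -90f, 90f);
                videoPlane.localScale = Vector3.Scale(videoPlane.localScale, new Vector3(1f, -1f, 1f));
                break;
            case "Microsoft Surface":
                videoPlane.localRotation = Quaternion.Euler(-270f, -90f, 90f);
                break;
            default:
                // Android/iPhone default rotation from my testing.
                videoPlane.localRotation = Quaternion.Euler(0f, -90f, 90f);
                break;
        }
    }
}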

Project Tango Unity app crashes iff there are two scene Cameras

Making an area-aware Unity app on Tango, and it works fine until I add another camera. It uses an ADF for location awareness, and the default AR Camera prefab. So it shows what the device's camera sees. Fine. Now I want to add a mini-map in the upper corner. Simply adding another camera will cause the app to crash. I tried:
- perspective vs. orthographic
- render to texture
- various 'clear' flags
- checked my layers
- no culling or anything fancy
It's simple. If I enable the camera, it crashes. If I disable the camera, it works.
Does anyone have any insights into this?
Thanks
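For reference, this is roughly the kind of secondary camera setup being described (plain Unity, nothing Tango-specific; all values are illustrative):
using UnityEngine;

// Illustrative mini-map camera configuration: a second camera drawn into a small
// viewport in the upper corner, on top of the Tango AR camera's output.
[RequireComponent(typeof(Camera))]
public class MiniMapCamera : MonoBehaviour {
    void Start() {
        Camera cam = GetComponent<Camera>();
        cam.orthographic = true;                        // top-down map view
        cam.orthographicSize = 10f;
        cam.clearFlags = CameraClearFlags.SolidColor;   // only clear this camera's viewport
        cam.backgroundColor = Color.black;
        cam.cullingMask = LayerMask.GetMask("Default"); // nothing fancy in the layers
        cam.depth = 1f;                                 // render after the AR camera
        cam.rect = new Rect(0.7f, 0.7f, 0.3f, 0.3f);    // upper corner of the screen
    }
}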

Google Cardboard for Unity disables AA on Android device

I'm developing a simple VR app for Android using the Google Cardboard SDK for Unity. I'm using Unity 5 (free) and the latest version of the Cardboard package for Unity (0.4.9), and I am testing on a Sony Xperia Z3 Compact (Lollipop) and a Samsung Galaxy Note 3 (KitKat).
The issue I'm having is that I can't turn on anti-aliasing; or rather, the Google SDK package seems to disable it. I don't mind a few jagged edges, but those edges flicker once they are far enough away. I tried moving the textures apart (since the flickering could be the result of overlapping), but the issue persists.
It can't be an issue with the Unity export for Android, since if I build the same project (the test example provided inside the Google Cardboard Unity package) using the stock camera provided by Unity instead of the Cardboard main GameObject, AA works. It can't get simpler than that: one cube and one camera... I have tried turning the 32-bit display buffer option in Player Settings on and off, forcing OpenGL ES 2.0, and various other tick/untick-a-checkbox tips found across the web, with no success.
So my question is: is anyone else having this same issue, and how do you fix it?
I hope my question and description of the problem are detailed enough.
Cheers
The Cardboard.cs class holds a RenderTexture which is what the cameras render to, and then it is rerendered to the phone screen with distortion correction for the lenses. This bypasses Unity's normal rendering to the screen, so the AA settings for the project won't have any effect.
To see what effect AA settings in Unity will have, you can do a couple of different things:
Turn off Cardboard.SDK.nativeDistortionCorrection, so that Unity is drawing directly to the screen, or
Edit Cardboard.CreateStereoScreen() and change the settings on the RenderTexture that is allocated there.
However, the native code rerender for distortion does not use anti-aliasing in the framebuffer, so I'm not sure how much effect you'll see in #2. And there will certainly be a performance penalty either way.
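For option 1, a minimal sketch using the field named above (Cardboard SDK ~0.5-era API; attach it to any object in the scene):
using UnityEngine;

public class DisableNativeDistortion : MonoBehaviour {
    void Start() {
        // Let Unity draw directly to the screen so the project's MSAA setting applies.
        // Note: you lose the lens distortion correction by doing this.
        Cardboard.SDK.nativeDistortionCorrection = false;
    }
}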
Turning off distortion correction fixed this issue. Then you can set the Unity project's anti-aliasing setting up to 4x or 8x. Looks great now - no more jaggies while viewing on an iPhone 6+.
Unity 5 running Google Cardboard SDK v0.5 --
To turn off Distortion Correction:
Go to Hierarchy | CardboardMain | in Inspector uncheck Distortion Correction
To enable 4x or 8x Anti Aliasing in Unity:
Go to Edit | Project Settings | Quality | in Inspector set Anti Aliasing to 4x or 8x Multi Sampling.
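If you'd rather do this from a script than through the Quality settings panel, Unity exposes the same value at runtime (a small sketch using the standard QualitySettings API):
using UnityEngine;

public class ForceMsaa : MonoBehaviour {
    void Awake() {
        // Same as setting Anti Aliasing to 4x Multi Sampling in the Quality settings.
        QualitySettings.antiAliasing = 4;
    }
}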
If you don't want to turn off distortion correction, you need to change the depth of the texture from 16 to 32 bits and the format of the texture to ARGB32. To do that, open the file BaseVRDevice.cs, find the CreateStereoScreen function, and change it to:
public virtual RenderTexture CreateStereoScreen() {
    Debug.Log("Creating new default cardboard screen texture.");
    return new RenderTexture(Screen.width, Screen.height, 32, RenderTextureFormat.ARGB32);
}
I followed serg hov's suggestion; however, AA is still not working with distortion on, even after applying that fix.
It doesn't make sense to turn off distortion, so I am wondering what the solution would be to have both distortion and AA working. It works for Gear VR, so there has to be a way to get it working for Cardboard VR.

Vuforia & Unity 1.5 not rendering object on the scene on Android

I am very frustrated with this problem, and the Unity3D community isn't being very helpful because no one there is answering my question. I have done a ton of searching to find what the problem could be, but I haven't succeeded. I installed Qualcomm Vuforia 1.5 and Unity3D 1.5.0.f and use the Unity extension. I imported their demo app called vuforia-imagetargets-android-1-5-10.unitypackage, put their wood chips image target in the scene along with their AR camera, and added a box object on top of the image target. Then I built it and deployed it to my Samsung Galaxy tablet. However, when I open the app on the tablet and point it at the image target, nothing shows up - the box isn't there, as if I hadn't added any objects to the scene. I just see what the device camera sees.
Has anybody experienced this before? Do you have any ideas what could be wrong? No one online seems to be complaining about it.
Thank you!
Make sure you have your ImageTarget dataset activated and loaded for your ARCamera (in the inspector, under Data Set Load Behaviour) and that the checkbox next to the Camera subheading in the inspector is ticked.
Also ensure that the cube (or other 3D object) is a child of the ImageTarget in the hierarchy, and that your ImageTarget object has its "Image Target Behaviour" set to the intended image.
You may also have to point your ARCamera in such a way that your scene is not visible to it.
Okay, I got the solution. I asked on Qualcomm's forum and one gentleman was nice enough to explain that I had missed setting up the Data Set Load Behaviour in Unity's AR camera. I had to activate the image target data set and load the corresponding image target. Once I set those two things up, built, and deployed, everything worked well.
Good luck!
