I imported the Vuforia sample for Gear VR into Unity and replaced the objects with Blender objects. When I put an object very close to the camera in my scene, it looks fine. But in AR mode, or when I look at the object from a distance, the edges of the object appear very pixelated and they shimmer slightly (blink). I have anti-aliasing at the highest setting, but that didn't change a thing. The blinking suggests to me that the depth buffer is confused, but since I am new to Unity I have no idea what to do about that. I also read about mipmapping and that the texture resolution might be wrong, but I use plain colored materials, so I can't imagine which settings to change. Please help! Any suggestion would be very welcome!
You should set your render scale from the default 50% to 100%. Do you have any screenshots? They would help.
How do I do that?
If you are using Oculus SDK 1.6+, you might want to check the OVRManager and make sure that "Use Recommended MSAA" is turned off. You can also try adjusting the "Min & Max Render Scale".
https://i.stack.imgur.com/yrya1.png
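Depending on your Unity and SDK versions, the render scale can also be set from a script rather than the inspector. A minimal sketch, assuming Unity 5.4+ with the built-in VR integration (the `UnityEngine.VR.VRSettings` API; older or newer versions expose this under different names):

```csharp
using UnityEngine;
using UnityEngine.VR; // built-in VR support, Unity 5.4+

public class RenderScaleSetter : MonoBehaviour
{
    void Start()
    {
        // 1.0 renders the eye textures at the full recommended resolution;
        // values below 1.0 trade sharpness for performance.
        VRSettings.renderScale = 1.0f;
    }
}
```

Attach this to any GameObject in the scene; it only needs to run once at startup.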
Related
I am trying to experiment with ARCore and Unity using Vuforia, and I am having trouble with an object not being lit. The whole object is dark, and I have tried a few different lighting settings.
Can you help?
Here are some screenshots:
Phone:
Unity:
UPDATE:
So I tried it with a different object downloaded from the Asset Store (a tree), and it works flawlessly with that. So the problem must be with the knight... but what is it?
You have to add a directional light GameObject to the scene (but I guess it works with any light object).
Right Click > Light > Directional Light
Here is my light object configuration:
Directional Light Configuration
It works flawlessly. Hope it works for you.
Let me know if this solved the problem.
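If you prefer to create the light from a script instead of the Hierarchy menu, a minimal sketch (the rotation values below are just a typical sun angle, not anything required):

```csharp
using UnityEngine;

public class AddSceneLight : MonoBehaviour
{
    void Start()
    {
        // Create a directional light similar to Unity's default scene light.
        var go = new GameObject("Directional Light");
        var sceneLight = go.AddComponent<Light>();
        sceneLight.type = LightType.Directional;
        sceneLight.intensity = 1.0f;
        // Angle the light downward so the top faces of the model are lit.
        go.transform.rotation = Quaternion.Euler(50f, -30f, 0f);
    }
}
```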
Check the material of the game object. If it's Standard, change it to Lit, Transparent, or Transparent Cutout, depending on your texture type.
Make sure the directional light is directed correctly towards the object
Just realized how old this is, but the answer was to turn up the Lightmap Encoding in the Player Settings when building the app: File > Build Settings > Player Settings > Player. Mine defaulted to Low, and I switched it to Normal.
You may also need to update the default quality level to at least Medium.
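The quality level can also be forced from a script. This is a sketch assuming the default Quality Settings list, where index 2 is "Medium"; the index refers to the rows under Edit > Project Settings > Quality, so adjust it if your project's list differs:

```csharp
using UnityEngine;

public class ForceQuality : MonoBehaviour
{
    void Awake()
    {
        // Index into the Quality Settings list; 2 is "Medium" in the
        // default project setup. The second argument also applies
        // expensive changes such as anti-aliasing.
        QualitySettings.SetQualityLevel(2, true);
    }
}
```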
I'm trying to build a simple AR scene with an NFT image that I've created with genTextData. The result works fairly well in the Unity editor, but once compiled and run on an Android device, the camera resolution is very bad and there's no focus at all.
My marker is rather small (a 3 cm picture), and the camera is so blurred that the AR cannot identify the marker from far away. I have to put the phone right in front of it (still very blurred), and it will show my object, but with a lot of flickering and jittering.
I tried playing with the filter fields (sample rate/cutoff, etc.), which helped a little with the flickering of the object, but it would never display it from far away; I always have to put my phone right in front of it. The result I want is to detect the small marker (sharp resolution and/or good focus) from a fair distance away, roughly the distance from your computer screen to your eyes.
The problem could be camera resolution and focus, or it could be something else. But I'm pretty sure the AR cannot identify the marker points because of the blurriness.
Any ideas or solutions for this problem?
You can have a look here:
http://augmentmy.world/augmented-reality-unity-games-artoolkit-video-resolution-autofocus
I compiled the Java part of the Unity plugin and set it to use your phone's highest camera resolution. Autofocus mode is also activated.
Tell me if that helps.
I'm developing a simple VR app for Android using the Google Cardboard SDK for Unity. I'm using Unity 5 (free) and the latest version of the Cardboard package for Unity (0.4.9), and I am testing on a Sony Xperia Z3 Compact (Lollipop) and on a Samsung Galaxy Note 3 (KitKat).
The issue I'm having is that I can't turn on anti-aliasing; rather, the Google SDK package seems to disable it. I don't mind slightly jagged corners, but those corners flicker when they are far enough away. I tried moving textures apart (since the flickering could be the result of overlapping), but the issue persists.
It can't be an issue with the Unity export for Android: if I build the same project (the test example provided inside the Google Cardboard Unity package) using the stock camera provided by Unity instead of the Cardboard main GameObject, AA works. It can't get simpler than that: one cube and one camera. I have tried turning the 32-bit display buffer option in Player Settings on and off, tried forcing OpenGL ES 2.0, and tried various other checkbox tips found across the web, with no success.
So my question is: is anyone else having this same issue, and how can it be fixed?
I hope my question and description of the problem are detailed enough.
Cheers
The Cardboard.cs class holds a RenderTexture which is what the cameras render to, and then it is rerendered to the phone screen with distortion correction for the lenses. This bypasses Unity's normal rendering to the screen, so the AA settings for the project won't have any effect.
To see what effect AA settings in Unity will have, you can do a couple of different things:
Turn off Cardboard.SDK.nativeDistortionCorrection, so that Unity is drawing directly to the screen, or
Edit Cardboard.CreateStereoScreen() and change the settings on the RenderTexture that is allocated there.
However, the native code rerender for distortion does not use anti-aliasing in the framebuffer, so I'm not sure how much effect you'll see in #2. And there will certainly be a performance penalty either way.
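For the first option, the flag mentioned above can be flipped from any script once the Cardboard prefab is in the scene. A sketch assuming the v0.4/0.5 SDK, where `Cardboard.SDK` is the singleton referenced in this answer:

```csharp
using UnityEngine;

public class DisableDistortion : MonoBehaviour
{
    void Start()
    {
        // Bypass the native rerender pass so Unity draws (and anti-aliases)
        // directly to the screen. Note: the lens distortion of the Cardboard
        // viewer will no longer be compensated for.
        Cardboard.SDK.nativeDistortionCorrection = false;
    }
}
```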
Turning off distortion correction fixed this issue. Then you can set the Unity project's anti-aliasing settings up to 4x or 8x. Looks great now: no more jaggies while viewing on an iPhone 6+.
Unity 5 running Google Cardboard SDK v0.5 --
To turn off Distortion Correction:
Go to Hierarchy | CardboardMain | in Inspector uncheck Distortion Correction
To enable 4x or 8x Anti Aliasing in Unity:
Go to Edit | Project Settings | Quality | in Inspector set Anti Aliasing to 4x or 8x Multi Sampling.
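The same setting can be applied from a script, which is handy if you want to force it regardless of the active quality level:

```csharp
using UnityEngine;

public class SetMsaa : MonoBehaviour
{
    void Awake()
    {
        // MSAA sample count: 0 disables it; valid values are 2, 4, and 8.
        QualitySettings.antiAliasing = 4;
    }
}
```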
If you don't want to turn off distortion correction, you need to change the depth of the texture from 16 to 32 bits and the texture format to ARGB32. To do that, open the file BaseVRDevice.cs, find the CreateStereoScreen function, and change it to:
public virtual RenderTexture CreateStereoScreen() {
    Debug.Log("Creating new default cardboard screen texture.");
    return new RenderTexture(Screen.width, Screen.height, 32, RenderTextureFormat.ARGB32);
}
I followed serg hov's suggestion; however, AA is not working with distortion on, even after applying that fix.
It doesn't make sense to turn off distortion, so I am wondering what the solution would be to have both distortion and AA working. It works for Gear VR, so there has to be a way to get it working for Cardboard VR.
I am very frustrated with this problem, and the Unity3D community isn't being very helpful; no one there is answering my question. I have done a ton of searching for what the problem could be, but I didn't succeed. I installed Qualcomm Vuforia 1.5 and Unity3D 1.5.0.f, and I am using the Unity extension. I imported their demo app called vuforia-imagetargets-android-1-5-10.unitypackage, put their wood chips image target in the scene along with their AR camera, and added a box object on top of the image target. Then I built it and deployed it to my Samsung Galaxy tablet. However, when I open the app on the tablet and point it at the image target, nothing shows up: the box isn't there, as if I hadn't added any objects to the scene. I just see what the device camera sees.
Has anybody experienced this before? Do you have any idea what could be wrong? No one online seems to be complaining about it.
Thank you!
Make sure you have your ImageTarget dataset activated and loaded for your ARCamera (in the Inspector, under Data Set Load Behaviour), and that the checkbox next to the Camera subheading in the Inspector is ticked.
Also ensure that the cube (or other 3D object) is a child of the ImageTarget in the hierarchy, and that your ImageTarget object has its "Image Target Behaviour" set to the intended image.
You may also have to point your ARCamera away so that your scene objects are not directly visible to it.
Okay, I got the solution. I asked on Qualcomm's forum, and one gentleman was nice enough to explain that I had missed setting up Data Set Load Behaviour on Unity's AR camera. I had to activate the image target data set and load the corresponding image target. Once I set these two things up, built, and deployed, everything worked well.
Good luck!
I am trying to display an .md2 model on top of the camera preview on my Android phone. I don't need to use the accelerometers or anything. If anyone could point me in the right direction as to how to set up an OpenGL overlay, that would be fantastic. If you are able to provide code that shows how to enable this, that would be even better! It would be greatly appreciated.
I'm not able to provide code until later this week, but you might want to check out a library called min3d, because I believe they already have a parser written for .md2 files. Then, if you use a GLSurfaceView, I believe its background can be set to transparent, and you can put a view of the camera behind it. Are you trying to get some kind of augmented-reality effect? There are Android-specific libraries for that too, but they're pretty laggy (at least on my Motorola Droid).