ARCore: Emulator and Unity - Android

I'd like to test ARCore with Unity/C# before buying an Android device. Can I put together an AR app using Unity and the ARCore emulator, without a device, just using my PC's camera? And does the camera need to meet a specific spec?
I read that Android Studio Beta now supports ARCore in the Emulator, so you can test an app in a virtual environment right from the desktop, but I can't tell whether that update is integrated into Unity.
https://developers.googleblog.com/2018/02/announcing-arcore-10-and-new-updates-to.html
Any tips on how people might interact with the app using a PC camera would be really helpful.
Thank you for your help!
Sergio

ARCore uses a combination of the device's IMU and camera. The camera tracks feature points in space and uses clusters of those points to create planes for your models. The IMU generates 3D sensor data that is passed to ARCore to track the device's movements.
Judging from these requirements, a webcam just isn't going to work, since it lacks the IMU that ARCore needs. The camera alone can't track the device's position, which leads to objects drifting all over the place (if you manage to get it working at all). Google's documentation and Reddit threads also indicate that it just won't work.
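For context, here's a minimal Unity sketch of the IMU inputs ARCore fuses with the camera feed; a PC webcam setup simply has no equivalent of these, which is why camera-only tracking drifts:

```csharp
using UnityEngine;

// The IMU inputs ARCore fuses with camera tracking. A PC webcam provides
// none of these, so there is no motion data to anchor the camera pose.
public class ImuProbe : MonoBehaviour
{
    void Start()
    {
        Input.gyro.enabled = true; // the gyroscope is off by default on mobile
    }

    void Update()
    {
        Vector3 accel = Input.acceleration;        // linear acceleration, in g
        Quaternion attitude = Input.gyro.attitude; // device orientation
        Vector3 rate = Input.gyro.rotationRate;    // angular velocity, rad/s

        Debug.Log("accel=" + accel + " attitude=" + attitude.eulerAngles + " rate=" + rate);
    }
}
```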

Related

ARCore alternative on Android

I'm developing an Android app with augmented reality in order to display points of interest at given locations. I don't need face, plane, or object recognition, only to place some points at specific locations (lat/long).
It seems ARCore only supports a few Android devices, and my customer requires broader device support, as the AR view is the core of the app.
I was wondering if there are alternatives to ARCore on Android that support placing points of interest at given coordinates and cover a large number of Android devices.
Thanks for any tips.
Well, there is this location-based AR framework for Android: https://github.com/bitstars/droidar
However, it hasn't been maintained for quite a long time. You can also look at Vuforia, though it's not free:
https://developer.vuforia.com/
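If all you need is placing points at lat/long (no plane detection), you can also do it without any AR SDK: take a GPS fix, compute the offset to the POI, and place the object relative to the camera. A rough sketch follows; the helper below is hypothetical, not part of any SDK, and the equirectangular approximation only holds over short distances:

```csharp
using UnityEngine;

// Hypothetical helper: convert a POI's lat/long into a local offset from
// the user's GPS fix, using an equirectangular approximation.
public static class GeoToLocal
{
    const double EarthRadius = 6371000.0; // metres

    // Returns an offset in metres: x = east, z = north (Unity's ground plane).
    public static Vector3 Offset(double userLat, double userLon,
                                 double poiLat, double poiLon)
    {
        double dLat = (poiLat - userLat) * Mathf.Deg2Rad;
        double dLon = (poiLon - userLon) * Mathf.Deg2Rad;

        double north = dLat * EarthRadius;
        double east  = dLon * EarthRadius * System.Math.Cos(userLat * Mathf.Deg2Rad);

        return new Vector3((float)east, 0f, (float)north);
    }
}
```

In practice you would also rotate the offset by the compass heading (Input.compass.trueHeading) so that +z lines up with true north, and smooth the noisy GPS fixes before placing anything.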

Area learning after Google Tango

Area learning was a key feature of Google Tango which allowed a Tango device to locate itself in a known environment and save/load a map file (ADF).
Since then, Google has announced that it's shutting down Tango and putting its effort into ARCore, but I don't see anything related to area learning in the ARCore documentation.
What is the future of area learning on Android? Is it possible to achieve it on a non-Tango, ARCore-enabled device?
Currently, Tango's area learning is not supported by ARCore, and ARCore's offerings are not nearly as functional. First, Tango was able to take precise measurements of the surroundings, whereas ARCore uses mathematical models to make approximations. ARCore's modeling is nowhere near competitive with Tango's measurement capabilities; at the moment it appears to model only certain flat surfaces. [1]
Second, area learning on Tango allowed a program to access previously captured ADF files, but ARCore does not currently support this, meaning the user has to hardcode the initial starting position. [2]
Google is working on a Visual Positioning Service that would live in the cloud and allow a client to compare local point maps with ground-truth point maps to determine indoor position [3]. I suspect that this functionality will only work reliably if the original point map is generated using a rig with a depth sensor (i.e. not in your own house with your smartphone), although mobile visual SLAM has had some success. This also seems like a perfect task for deep learning, so there might be robust solutions on the horizon. [4]
[1] ARCore official docs: https://developers.google.com/ar/discover/concepts#environmental_understanding
[2] ARCore, ARKit: Augmented Reality for everyone, everywhere! https://www.cologne-intelligence.de/blog/arcore-arkit-augmented-reality-for-everyone-everywhere/
[3] Google 'Visual Positioning Service' AR Tracking in Action: https://www.youtube.com/watch?v=L6-KF0HPbS8
[4] Announcing the Matterport3D Research Dataset: https://matterport.com/blog/2017/09/20/announcing-matterport3d-research-dataset/
There are now Google ARCore videos on the Google Developers channel on YouTube.
These videos teach users how to create shared AR experiences across Android and iOS devices and how to build apps using the new APIs revealed in the Google keynote: Cloud Anchors, Augmented Images, Augmented Faces, and Sceneform. You'll come away understanding how to implement them, how they work in each environment, and what opportunities they unlock for your users.
Hope this helps.
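For reference, hosting a Cloud Anchor in Unity looked roughly like the sketch below, using the GoogleARCore.CrossPlatform API from the (since-deprecated) ARCore SDK for Unity; treat the exact names as illustrative and check the current docs:

```csharp
using GoogleARCore;
using GoogleARCore.CrossPlatform;
using UnityEngine;

// Sketch of hosting a Cloud Anchor with the ARCore SDK for Unity.
// API names are from that SDK and may differ in newer AR Foundation releases.
public class CloudAnchorHost : MonoBehaviour
{
    public void Host(Anchor localAnchor)
    {
        XPSession.CreateCloudAnchor(localAnchor).ThenAction(result =>
        {
            if (result.Response == CloudServiceResponse.Success)
            {
                // Share this id with other devices, which resolve it via
                // XPSession.ResolveCloudAnchor(cloudId).
                Debug.Log("Hosted anchor, cloud id: " + result.Anchor.CloudId);
            }
            else
            {
                Debug.LogWarning("Hosting failed: " + result.Response);
            }
        });
    }
}
```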

Wrong Camera Orientation with Android & Vuforia

We are developing our own Android-based hardware and we wish to use Vuforia (developed via Unity3D) for certain applications. However, we are having problems making Vuforia work well with our current camera orientation settings.
On our hardware, everything works fine when the camera is placed horizontally, that is, parallel to the display. However, we need to place the camera vertically, in other words at a 90-degree angle to the display. These are all hardware settings: our kernel is programmed accordingly, and every other program that uses the camera works fine with everything, including our IMU sensors. However, apps developed with Vuforia behave completely oddly when the camera is placed vertically.
We assume the problem is related to Vuforia's processing of raw camera data, but we are not sure, and we do not know how to fix the situation. Further details:
-When "Enable Video Background" is on, the projected image is distorted and no video feed is available. The AR projection appears on a black background with distorted dimensions.
-When "Enable Video Background" is on and the device is rotated, the black background is replaced by flickering solid colors.
-When "Enable Video Background" is off, the AR projection has normal dimensions (no distortion) however it is tracked with wrong axis settings. For example, when the target moves left in real world, the projection moves up.
-When "Enable Video Background" is off and the device is rotated, the AR projection is larger compared to its appearance when the device is in it's default state.
I will be glad to provide any more information you need.
Thank you very much, have a nice day.
PS: We have found that applications whose main purpose is the camera (camera apps, barcode scanners, etc.) work fine, while apps for which camera usage is a secondary feature (such as some games) have the same problem as Vuforia. This makes me think that apps that access the camera directly work fine, whereas those that go through the Android API and classes fail for some reason.
First, understand that every platform deals with cameras differently, and beyond that, different Android phone manufacturers deal with them differently as well. In my testing WITHOUT Vuforia, I had to transform the plane I cast the video feed onto by (0, -90, 90) for Android/iPhone and (-270, -90, 90) for the Windows Surface tablet. Beyond this, the iPhone rear camera was mirrored, as were the Android front camera and the Surface front camera. That is easy to account for, but an annoying issue is that the Google Pixel and Samsung front cameras were mirrored across the y axis (as were ALL iOS back cameras), while the Nexus 6P was mirrored across the x axis. What I am getting at is that there are a LOT of devices to account for on Android, so try more than just that one device. Vuforia has so far dealt with my Pixel and four of my iOS devices just fine.
As for how to fix your problem:
Go into your Player Settings in Unity and look at the orientation. There are a few options here; my application only uses portrait, so I force portrait and it seems to work fine (none of the problems I had to account for in the scenario above). Vuforia previously did NOT support auto-rotation, so make sure you have the latest version, since it sounds like that is what you need. If auto-rotate is set and it is still not working right, you may have to account for that specific device (don't do this for all devices until after you test those devices): use an if statement (or construct a case statement if you hit this problem with multiple devices) and then reflect or translate as needed, as in the sketch below. Cross-platform development systems (like Unity) don't always get everything perfect, since there is basically no standard. In these cases you have to account for the devices directly, by creating a method with a case statement inside it so you can cleanly and modularly adjust all the devices that need it. It is a pain, but it beats developing for each device separately.
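Here is a sketch of that per-device branch; the device-model strings and correction values are illustrative guesses, not tested values:

```csharp
using UnityEngine;

// Per-device workaround: branch on the device model and mirror/rotate the
// video background plane as needed. Strings and values are illustrative.
public class VideoPlaneOrientationFix : MonoBehaviour
{
    void Start()
    {
        switch (SystemInfo.deviceModel)
        {
            case "Nexus 6P":
                MirrorX(); // this device was mirrored across x in my tests
                break;
            case "Google Pixel":
                MirrorY(); // front camera mirrored across y
                break;
            default:
                // Most devices behaved with the base rotation.
                transform.localEulerAngles = new Vector3(0f, -90f, 90f);
                break;
        }
    }

    void MirrorX()
    {
        Vector3 s = transform.localScale;
        s.x = -s.x; // mirror the plane horizontally
        transform.localScale = s;
    }

    void MirrorY()
    {
        Vector3 s = transform.localScale;
        s.y = -s.y; // mirror the plane vertically
        transform.localScale = s;
    }
}
```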
One more thing: check out the Vuforia configuration file, as it has settings such as camera mirroring and direction. These appear to be public settings, so you should also be able to script them in your case statement, in the event you need "Flip Horizontally" for one phone but not another.

Virtual Reality: How to move camera with user's physical motion in Unity3D

I am trying to move a camera in the virtual environment when I move while holding my mobile device. I have searched the internet and haven't found much help. I am using Unity 5.2.4.
Has anyone done similar work?
Thanks.
The problem of self-contained positional tracking for mobile VR is not easy to solve, which is why there has been no solution in the Google Cardboard and GearVR SDKs, at least for a while. Also keep in mind that there is no good approach that uses only IMU data.
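To see why IMU-only tracking fails, consider the naive approach of double-integrating acceleration, sketched below; the device-to-world frame conversion is hand-waved here, but the drift problem it illustrates is real:

```csharp
using UnityEngine;

// Naive IMU-only tracker: double-integrating acceleration. Every small
// bias or noise term in the accelerometer is integrated twice, so the
// position estimate drifts quadratically with time. This is why camera
// data is needed to correct the IMU.
public class NaiveImuTracker : MonoBehaviour
{
    Vector3 velocity;
    Vector3 position;

    void Start()
    {
        Input.gyro.enabled = true;
    }

    void Update()
    {
        // userAcceleration is gravity-removed, in g units; convert to m/s^2.
        Vector3 deviceAccel = Input.gyro.userAcceleration * 9.81f;

        // Hand-waved conversion from the device frame to the world frame.
        Vector3 worldAccel = Input.gyro.attitude * deviceAccel;

        // Double integration: accelerometer error -> velocity error -> position error.
        velocity += worldAccel * Time.deltaTime;
        position += velocity * Time.deltaTime;

        transform.position = position; // drifts within seconds in practice
    }
}
```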
The approaches investigated so far use a variety of computer vision techniques, combining sensor fusion with the mobile camera (like the one presented by Univrses) or using depth cameras (like the Structure Sensor).
John Carmack from Oculus is also working on a solution to achieve a good positional tracking for GearVR (https://youtu.be/dRPn_LK2Hkc).
More about the complexity involved in this problem here.

Desktop based Augmented Reality Application

I am developing an AR-based application which contains around 30-50 models. Is it possible to develop it on Android, or might there be memory problems on mobile devices? Is there any desktop-based AR API/SDK that can be used with 3D animation?
Yes, you can create an Android application for augmented reality. There are many such applications on the Android market, especially GPS-based ones. However, handling 50 models might cause a memory problem, though on high-end devices like the Samsung Galaxy S4 and Note 2 I don't think you would face memory issues. You can also place your models on a dedicated server from which your application fetches them; this reduces the chance of memory issues.
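Here is a sketch of that server-fetch idea using AssetBundles, so only the models you currently need stay in memory; it assumes a recent Unity version, and the URL, bundle, and asset names are placeholders:

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

// Download an AssetBundle from a server at runtime and instantiate a
// model from it, keeping only what is needed in memory.
public class RemoteModelLoader : MonoBehaviour
{
    public IEnumerator LoadModel(string url, string assetName)
    {
        using (UnityWebRequest req = UnityWebRequestAssetBundle.GetAssetBundle(url))
        {
            yield return req.SendWebRequest();

            if (req.result != UnityWebRequest.Result.Success)
            {
                Debug.LogWarning("Download failed: " + req.error);
                yield break;
            }

            AssetBundle bundle = DownloadHandlerAssetBundle.GetContent(req);
            GameObject prefab = bundle.LoadAsset<GameObject>(assetName);
            Instantiate(prefab);

            // Unload the bundle shell but keep the instantiated asset alive.
            bundle.Unload(false);
        }
    }
}
```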
Some basic examples of AR on Android are given here:
http://readwrite.com/2010/12/01/3-augmented-reality-tutorials#awesm=~ohLxX5jDGJLml9
I haven't worked on desktop AR applications myself, but I think this might help:
http://www.arlab.com/
Does "desktop application" include WebGL applications running in the web browser?
If so, then you might want to check out skarf.js, a framework I have written for handling JavaScript augmented-reality libraries in Three.js (a JavaScript 3D library that wraps WebGL). It currently integrates two JavaScript-based augmented-reality libraries: JSARToolKit and js-aruco.
The skarf.js framework takes care of a number of things for you, including automatic loading of models when the associated markers are detected (association is specified in a JSON file). There is also a GUI marker system which allows users to control settings using AR markers.
Integration with Three.js is just one line of code to create a Skarf instance and another line of code to update.
There are videos, live demos, source codes, examples and documentation available. Check out http://cg.skeelogy.com/skarfjs/ for more info.
