Rendering at any point in ARCore - Android

I have been working on and exploring ARCore for the past few days. I saw this video from Scope AR, and I noticed that they are freely rendering arrows wherever they touch (e.g., on the engine). From what I have understood, you can only render at points or planes identified by ARCore. My question is: how are they rendering the arrows without even knowing whether the point (where the person taps on the screen) has actually been identified by ARCore?

They are using a Microsoft HoloLens; it has nothing to do with ARCore. The HoloLens continuously builds a spatial mesh of the environment with its depth-sensing hardware, so content can be placed on any surface it has already mapped, not just on detected planes.
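For contrast, in ARCore itself a tap is resolved with a hit test against geometry ARCore has already identified (planes and feature points), which is why the premise in the question holds. A minimal Java sketch, assuming session and motionEvent come from your app (classes are from com.google.ar.core; error handling omitted):

// hitTest() only returns results against trackables ARCore already knows
// about (detected planes, feature points), so you cannot anchor content
// at an arbitrary, unrecognized point.
Frame frame = session.update();
for (HitResult hit : frame.hitTest(motionEvent)) {
    Trackable trackable = hit.getTrackable();
    if (trackable instanceof Plane
            && ((Plane) trackable).isPoseInPolygon(hit.getHitPose())) {
        Anchor anchor = hit.createAnchor(); // render the arrow at this anchor
        break;
    }
}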

Related

Can I change the pose of ARCore's virtual camera, or have multiple virtual cameras in a scene?

I want to apply offsets to both the translation and rotation of ARCore's virtual camera pose (displayOrientedCameraPose). Is there any way I can do that? ARCore's camera only lets me read the current pose, not edit/update it. Trying to create another virtual camera that holds the pose with offsets applied doesn't work, since a frame can have only one camera.
Unlike many others, I started working with ARCore in Unity first and am now moving to Android Studio. In Unity it was quite straightforward, since it supports multiple-camera rendering. I am wondering whether anything similar is possible with Android Studio?
At the moment ARCore allows you to use only one active ArSession, which contains only one ArCamera, i.e. the camera in your smartphone. Changing the ArCamera's pose would not be useful anyway, because 3D tracking heavily depends on it (every ArFrame stores the camera's position and rotation, along with all of the scene's ArAnchors and feature points).
Instead of repositioning and reorienting your ArCamera, you can move/rotate the whole scene's content, as sketched below.
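For example, in Sceneform you could parent all of your renderables under a single root node and apply the inverse of the desired camera offset to that node; a minimal sketch, assuming an ArSceneView setup (the node name and offset values are illustrative):

import com.google.ar.sceneform.Node;
import com.google.ar.sceneform.math.Quaternion;
import com.google.ar.sceneform.math.Vector3;

// One root node that every renderable is parented to.
Node worldRoot = new Node();
worldRoot.setParent(arSceneView.getScene());

// Applying the inverse of the desired camera offset to the content
// has the same visual effect as offsetting the virtual camera itself.
worldRoot.setLocalPosition(new Vector3(-0.1f, 0.0f, 0.0f));              // translation offset
worldRoot.setLocalRotation(Quaternion.axisAngle(Vector3.up(), -15.0f));  // rotation offset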
Hope this helps.

Is it possible to accurately place a circle relative to a real-life object using ARCore?

Using ARCore and/or Sceneform, would it be possible to place circles accurately on a real-life object? Let's say I had a real-world table and a known set of coordinates where small (10 mm) AR "stickers" need to be placed. They could be on the top/side/underside of the table and need to be placed accurately to the millimetre. I am currently solving this problem with a number of fixed-mounted lasers. Would this be possible to accomplish using ARCore on a mobile device, either a phone or AR/smart glasses? Accuracy is critical, so how accurate could a solution using ARCore be?
I think you may find that current AR on mobile devices would struggle to meet your requirements.
Partly because, in my experience, there is a certain amount of drift or movement with anchors, especially when you move the view quickly or leave and come back to a view. Given the technologies used to create and locate anchors, i.e. movement sensors, the camera, etc., it is natural that this will not give consistent millimetre accuracy.
Possibly a bigger issue for you at this time is occlusion: currently ARCore does not support it. This means that if you place your renderable behind a real object, it will still be drawn in front of, or on top of, that object as you move away or zoom out.
If you use multiple markers or AR "stickers", your solution will be fairly precise, since the locations of your circles will be calculated relative to those markers. Image- or marker-based tracking is quite impressive in any augmented-reality SDK. However, markers as small as 10 mm can cause detection problems. I would recommend creating these markers with an AugmentedImageDatabase, which lets you specify the real-world size of each image; this helps tracking. Then you can check whether ARCore can detect your images on the table. ARCore is not the fastest SDK when it comes to detecting images, but it can keep tracking even when markers are not in the frame. If you need fast marker detection, I would recommend the Vuforia SDK. A minimal sketch of the database setup follows.
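This is a sketch in Java, assuming the sticker image ships as an asset (the file name and image name are illustrative; classes are from com.google.ar.core; error handling omitted):

// Build a database of reference images. Specifying the physical width
// in meters helps ARCore estimate distance and track the image.
AugmentedImageDatabase db = new AugmentedImageDatabase(session);
Bitmap sticker = BitmapFactory.decodeStream(context.getAssets().open("sticker.png"));
db.addImage("sticker-01", sticker, 0.01f); // 10 mm wide; throws if image quality is too low

Config config = new Config(session);
config.setAugmentedImageDatabase(db);
session.configure(config);

// Per frame: place your circles relative to the detected marker's pose.
for (AugmentedImage image : frame.getUpdatedTrackables(AugmentedImage.class)) {
    if (image.getTrackingState() == TrackingState.TRACKING) {
        Pose markerPose = image.getCenterPose(); // anchor circles relative to this
    }
}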

Will the front-facing camera work with ARCore?

Is it possible to use the device's front facing camera with ARCore? I see no references in Google's docs.
I can't find any reference in the docs. It would also make the whole AR process much more complicated. You would have to invert the logic for moving the camera, etc., and it is much harder to recognize planes, as the user is always in the way. Right now ARCore only recognizes planes, so you can't detect feature points, e.g. on the face of a user.
The answer is: Yes.
With a front-facing camera (without a depth sensor) on any supported Android device, you have been able to track a user's face since the ARCore 1.7 SDK release. The API for working with the front-facing camera is called Augmented Faces. With it you can create a high-quality 468-point 3D mesh that your Android app can overlay on a user's face to add fun animated effects. A minimal sketch of the setup follows.
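In Java, assuming ARCore 1.7+ and a supported device (classes are from com.google.ar.core; error handling omitted):

// Request the front camera when creating the session (ARCore 1.7+).
Session session = new Session(context, EnumSet.of(Session.Feature.FRONT_CAMERA));
Config config = new Config(session);
config.setAugmentedFaceMode(Config.AugmentedFaceMode.MESH3D);
session.configure(config);

// Per frame: read the 468-vertex face mesh and region poses.
for (AugmentedFace face : session.getAllTrackables(AugmentedFace.class)) {
    if (face.getTrackingState() == TrackingState.TRACKING) {
        FloatBuffer vertices = face.getMeshVertices();  // 468 * (x, y, z)
        Pose noseTip = face.getRegionPose(AugmentedFace.RegionType.NOSE_TIP);
    }
}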

Making Virtual Object Interactive with Mobile's Touch Input - Augmented Reality

I want to develop an augmented reality application for Android that can use markers to generate 3D objects, and these 3D objects should be interactive via the mobile's touch input.
I have browsed through the available SDKs like Vuforia, Junaio, or Layar Player and found that they all support:
Marker detection with 3D virtual image overlay
Virtual buttons that become active when you cover them up in the camera view (Vuforia)
Interactive video playback.
However, what I am looking for is:
Virtual object in AR that can be made interactive using mobile's touch.
I am pretty sure it is possible, as there are virtual video overlays that start a video upon clicking/tapping (similar to an interactive virtual element).
Q. Could someone suggest a library/toolkit best suited for the functionality I'm looking for?
or
Q. Is there something I missed during my search, i.e. do the aforementioned toolkits already support the functionality I want?
According to your last description, what you need is supported by Vuforia, and there is a sample for pure Android (no Unity) as well.
You need to look at the Dominos sample, where they show how to drag an OpenGL domino object on the screen.
Look here for a quick description:
https://developer.vuforia.com/forum/faq/android-how-do-i-project-screen-touch-target
In case you run into problems while trying to implement it yourself, you can search the Vuforia forums for answers to common problems others have faced with this. But basically, it works well in their sample.
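The underlying technique in that FAQ, independent of Vuforia, is unprojecting the 2D touch point into the 3D scene. A minimal sketch using Android's GLU, assuming you already have the model-view and projection matrices from the SDK (the matrix and size names are illustrative):

import android.opengl.GLU;

// Unproject the touch at the near and far clip planes to get a pick ray.
float[] near = new float[4];
float[] far = new float[4];
int[] viewport = {0, 0, screenWidth, screenHeight};
// OpenGL's window origin is bottom-left; Android's touch origin is top-left.
float glY = screenHeight - touchY;
GLU.gluUnProject(touchX, glY, 0.0f, modelViewMatrix, 0, projectionMatrix, 0, viewport, 0, near, 0);
GLU.gluUnProject(touchX, glY, 1.0f, modelViewMatrix, 0, projectionMatrix, 0, viewport, 0, far, 0);
// Android's gluUnProject does not divide by w, so normalize the results:
for (int i = 0; i < 3; i++) { near[i] /= near[3]; far[i] /= far[3]; }
// Intersect the ray (near -> far) with your object's bounds to pick it.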
Well, this is for Unity 5.x
First, go through Vuforia's Documentation to know more about Image Targets and AR Camera.
Import your 3D models into the scene so that all interactive objects are children of the image target.
Read touch input on the mobile phone (I used Android for my project):
if(Input.touchCount > 0 && Input.GetTouch(0).phase == TouchPhase.Began)
Convert the touch point into a ray from the screen into the 3D world:
Ray ray = Camera.main.ScreenPointToRay(Input.GetTouch(0).position);
Create a plane in the scene (for the ray to hit):
Plane plane = new Plane(Vector3.up, Vector3.zero);
If the ray hits the plane, get the x, y, z position; pos will hold the world position:
float distance;
if (plane.Raycast(ray, out distance)) {
    Vector3 pos = ray.GetPoint(distance); // world-space hit position
}
Please modify the code according to your needs; this is a very basic example.

AR image tracker at a distance?

I am working on an augmented reality app that requires an image tracker placed at a distance. A target would be a billboard or a scoreboard at a basketball game. I have tried Qualcomm's Vuforia SDK, but it seems to work only when the marker is placed within 3 feet of the camera. When you move further away, I think it loses detail and the AR engine is no longer able to recognize the tracker.
In theory, if the marker is large and bright enough, with clearly defined details and border markings for tracking purposes, should it not work?
Also, is there any way for an AR app to recognize ANY flat surface, like a table or a hardwood floor with a variety of colors and textures, as long as it's a flat surface? Typical applications would be a virtual keyboard or a chessboard.
thanks,
Joe
Marker-based AR is about recognizing markers, not shapes. The AR engine's input is the image from the camera, and there is no way to determine shape from that alone, so the answer to your second question is: no.
PS: In my case (iOS), the default marker is detected from about 1.5 m and can be tracked to about 4 m. I think the camera's resolution is an important factor and can affect tracking efficiency.
In our experience, a marker about 20x20 cm in size is readable by the Vuforia SDK at a distance of about 5 meters. That seems to be the very limit.
