Using a Google Project Tango tablet, I want to take a selfie during an Area Learning session. That is, it has already learned an area, and now I want AR tracking to keep working while I take a selfie. I tried using Unity's WebcamTexture to get at device 2 (the front-facing camera), but logcat says:
Unable to initialize camera: Fail to connect to camera service
My guess is that Tango takes over all the cameras and prevents this from happening.
Is there a way around this? Can I temporarily suspend the AR camera(s), turn on the front camera for a while, save a frame from it, then stop the front camera and resume the AR camera(s)? And would I be able to use IMU data to keep some sense of orientation while the AR camera(s) are off? I'm using Unity.
In order to access the camera from a process other than the Tango Service, you have to disconnect the Tango Service.
However, you should be able to capture and store a camera image from the AR camera itself. See this post:
Unity plugin using OpenGL for Project Tango
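For the suspend/selfie/resume idea, here is a minimal Java-side sketch. It is written against the Tango Java client API rather than the Unity SDK; saveJpeg and the frontCameraId parameter are hypothetical, and pose tracking is simply interrupted while Tango is disconnected:

    import android.graphics.SurfaceTexture;
    import android.hardware.Camera;
    import com.google.atap.tangoservice.Tango;
    import com.google.atap.tangoservice.TangoConfig;

    // Sketch: suspend Tango, grab one JPEG from the front camera, then resume Tango.
    // With an area description loaded, relocalization should recover the pose after reconnect.
    void takeSelfieWhileTangoPaused(final Tango tango, final TangoConfig config,
                                    int frontCameraId) throws Exception {
        tango.disconnect();                               // releases the cameras held by the Tango Service

        final Camera front = Camera.open(frontCameraId);  // e.g. the id whose CameraInfo.facing is FRONT
        front.setPreviewTexture(new SurfaceTexture(0));   // dummy preview target; no UI needed
        front.startPreview();
        front.takePicture(null, null, (data, cam) -> {
            saveJpeg(data);                               // hypothetical helper: persist the JPEG byte[]
            cam.stopPreview();
            cam.release();                                // hand the camera back before reconnecting
            try {
                tango.connect(config);                    // resume motion tracking / area learning
            } catch (RuntimeException e) {
                // handle a failed reconnect here
            }
        });
    }

From Unity you would call something like this through an AndroidJavaObject plugin; while Tango is disconnected there is no pose stream, so the IMU-only orientation idea would have to be handled separately (for example with Unity's Input.gyro).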
I'd like to test ARCore using Unity/C# before buying an Android device. Can I use Unity and the ARCore emulator to put together an AR app without having a device, just using a camera from my PC? And does the camera require a specific spec?
I read that Android Studio Beta now supports ARCore in the Emulator, so an app can be tested in a virtual environment right from the desktop, but I can't tell whether that update is integrated into Unity.
https://developers.googleblog.com/2018/02/announcing-arcore-10-and-new-updates-to.html
Any tips on how people might interact with the app using a PC camera would be really helpful.
Thank you for your help!
Sergio
ARCore uses a combination of the device's IMU and camera. The camera tracks feature points in space and uses a cluster of those points to create a plane for your models. The IMU generates 3D sensor data which is passed to ARCore to track the device's movements.
Judging from the requirements above, a webcam just isn't going to work, since it lacks the IMU that ARCore needs. The camera alone can't track the device's position, which would lead to objects drifting all over the place (if you managed to get it working at all). Even Google's documentation and Reddit threads indicate that it just won't work.
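For what it's worth, on an actual phone the supported/unsupported decision can be made explicitly at runtime. A small sketch using ArCoreApk from the ARCore Android SDK (there is no equivalent path for a desktop webcam):

    import android.app.Activity;
    import com.google.ar.core.ArCoreApk;

    // Sketch: ask the ARCore runtime whether this device is supported at all.
    // A PC webcam never gets here, since ARCore only ships for supported phones.
    void checkArCoreSupport(Activity activity) {
        ArCoreApk.Availability availability = ArCoreApk.getInstance().checkAvailability(activity);
        if (availability.isSupported()) {
            // Safe to create a com.google.ar.core.Session and start AR here.
        } else {
            // Unsupported (or still being checked): hide the AR features.
        }
    }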
I'm developing an AR application using ARCore and got stuck on the following issue.
I want to get the Camera instance that the ARCore session initializes, in order to configure it myself (change preview resolution, bitrate, white balance, and so on).
Unfortunately, ARCore uses an object called Session (which is also part of the ARCore library), which in turn uses a Tango object (carried over from Project Tango). The Tango object handles the hardware camera through JNI calls.
ARCore doesn't have an API to configure the camera, and getting at it through reflection seems like a bad idea.
Anybody?
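To illustrate the point about Session owning the camera: even a minimal setup never hands you a Camera or CameraDevice object. A sketch against the ARCore Java API (exception handling omitted, no camera-tuning calls assumed):

    import android.app.Activity;
    import com.google.ar.core.Config;
    import com.google.ar.core.Frame;
    import com.google.ar.core.Session;

    // Sketch: typical ARCore setup. Nothing here exposes a Camera/CameraDevice handle;
    // the Session opens and drives the camera internally, which is exactly the limitation above.
    void startArCore(Activity activity) throws Exception {
        Session session = new Session(activity);
        Config config = new Config(session);
        session.configure(config);
        session.resume();                // ARCore opens and configures the camera itself
        Frame frame = session.update();  // you only see processed frames, never the raw camera
    }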
I'm writing a Unity app for Android which uses the Vuforia augmented reality framework.
I need to manually change the camera exposure to get better augmentation. The problem is that Vuforia hides all work with the camera and makes it inaccessible from outside.
Vuforia itself also uses JNI to work with the device's camera.
When I try the following in my Android plugin:
    Camera camera = Camera.open();  // tries to grab the default (back-facing) camera
I get a CAMERA_ERROR_EVICTED error: "Camera was disconnected due to use by higher priority user."
Is there any way to get the current Camera object using JNI or any other approach?
Thanks for any help.
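Not an answer to the exposure question, but the eviction can at least be detected instead of crashing the plugin. A sketch using the same android.hardware.Camera API as above (Camera.CAMERA_ERROR_EVICTED requires API 23):

    import android.hardware.Camera;

    // Sketch: detect, rather than crash on, the case where Vuforia (a higher-priority user)
    // owns the camera. This does not solve the exposure problem; it only fails gracefully.
    Camera tryOpenCamera() {
        try {
            Camera camera = Camera.open();
            camera.setErrorCallback((error, cam) -> {
                if (error == Camera.CAMERA_ERROR_EVICTED) {
                    cam.release();  // a higher-priority user (e.g. Vuforia) took the camera back
                }
            });
            return camera;
        } catch (RuntimeException e) {
            // "Fail to connect to camera service" lands here when the camera is already held
            return null;
        }
    }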
I am trying to build a data logger with machine-vision camera functionality. The most important part of the camera2 API for me is the ability to set the focus to infinity.
I recently got a Lenovo Phab 2 Pro and have been exploring Tango's motion tracking and depth map functions. I would like to record the pose estimates and depth maps from Tango alongside my original camera images. Although Tango seems to use only the fisheye camera and the time-of-flight camera for those tasks, I can't get any readings for pose or depth whenever I open the main back-facing camera using camera2. I have also been researching whether Tango allows manual focus control, but unfortunately I couldn't find anything useful.
My questions are:
Is there a way to get Tango to work while having the back camera controlled by camera2?
If not, is there a way to manually control the main camera's focus using the Tango API?
Thank you all very much!
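On the focus part specifically, this is how infinity focus is normally locked with camera2, independent of Tango; a minimal sketch assuming you already hold a CameraDevice and a preview Surface:

    import android.hardware.camera2.CameraDevice;
    import android.hardware.camera2.CaptureRequest;
    import android.view.Surface;

    // Sketch: lock the lens at infinity with camera2. A focus distance of 0.0 diopters
    // means infinity; results depend on the device's lens calibration.
    CaptureRequest buildInfinityFocusRequest(CameraDevice device, Surface target) throws Exception {
        CaptureRequest.Builder builder =
                device.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
        builder.addTarget(target);
        builder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_OFF); // disable autofocus
        builder.set(CaptureRequest.LENS_FOCUS_DISTANCE, 0.0f);                           // 0 diopters = infinity
        return builder.build();
    }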
I'd like to be able to use the video stream from the camera (or even a few frames per second) while another app is also using the camera. Is this possible?
No. Access to a camera is exclusive.
See the Camera documentation, which says:
If your application does not properly release the camera, all subsequent attempts to access the camera, including those by your own application, will fail and may cause your or other applications to be shut down.
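In practice that means the usual acquire-in-onResume / release-in-onPause pattern; a sketch with the old Camera API, which is what the quoted documentation describes:

    import android.app.Activity;
    import android.hardware.Camera;

    // Sketch: the release-on-pause pattern the documentation is describing. While this
    // activity holds the camera, every other process (and a second open() here) will fail.
    public class CameraHolderActivity extends Activity {
        private Camera camera;

        @Override
        protected void onResume() {
            super.onResume();
            camera = Camera.open();   // exclusive: fails if another app holds the camera
        }

        @Override
        protected void onPause() {
            super.onPause();
            if (camera != null) {
                camera.release();     // give the camera back so other apps can use it
                camera = null;
            }
        }
    }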
According to the Android dev guide, the answer is no. Usually, when you start another camera app, the previous camera app has to release the camera in onPause(); otherwise an exception will be thrown. But why don't you try to use a different camera? If one app is using the back-facing camera, another one can use the front-facing camera. On Samsung's Galaxy S4 there's a camera mode called dual-shot, so running different cameras simultaneously seems to be no problem.
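If you do go the two-camera route, each app has to open its camera by explicit id; a small sketch with the old Camera API (hypothetical helper, facing value passed in by the caller):

    import android.hardware.Camera;

    // Sketch: open a specific camera by facing, so one app can take the back camera
    // and another the front camera. Each id can still only be held by one process at a time.
    Camera openByFacing(int facing) {   // pass Camera.CameraInfo.CAMERA_FACING_FRONT or _BACK
        Camera.CameraInfo info = new Camera.CameraInfo();
        for (int id = 0; id < Camera.getNumberOfCameras(); id++) {
            Camera.getCameraInfo(id, info);
            if (info.facing == facing) {
                return Camera.open(id);
            }
        }
        return null;                    // no camera with that facing
    }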