I'm using the Android Emulator (version 28.0.23-5264690) to develop an app that uses QR codes. The Virtual Scene feature for the camera on the emulator has an option to add your own images to the scene:
When using the emulator with a camera app, you can import an image in PNG or JPEG format to be used within a virtual scene. To choose an image for use in a virtual scene, click Add image in the Camera -> Virtual scene images tab in the Extended controls dialog.
This feature can be used to import custom images such as QR codes for use with any camera-based app.
However, I've tried adding both JPGs and PNGs in the emulator settings, but they never show up in the virtual scene in either the stock camera app or my own app. I've tried restarting the emulator as well, but still no luck.
Any idea what could be wrong?
I just found this piece of text in the Android Emulator documentation for augmented reality:
To view these image locations in the scene, launch your emulator, then move the camera to the dining room area through the door behind the camera’s starting position.
So apparently the images show up in a different room than the one you start in. Using the movement commands described in that documentation (hold Alt and use the W, A, S and D keys, moving the mouse to look around), you have to walk into the "dining room" through the door behind the dog. There the images are displayed on the wall and on the table.
Sadly, the QR code scanner Flutter library I'm using doesn't respond to the QR code, so I'm stuck with using a physical device anyway.
Related
In a virtual Android setup you're supposed to be able to navigate through a virtual room when using the camera. In that room, you can change the wall and table art by uploading images. However, when using certain apps like Snapchat, Instagram and WhatsApp, you instead get a pixelated image of a green house that jitters around slightly.
The fact that this same image appears for both Snapchat and Instagram makes me think that this video is stored somewhere on the emulated device, and that we can therefore change and edit it. And if not, can I still go into that virtual room or upload any images to the camera?
In your AVD configuration, check that your back camera is set to "VirtualScene" instead of "Emulated". You can find it by clicking on "Advanced Settings".
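If the UI option doesn't stick, you can also verify it by hand: the camera backends are recorded in the AVD's config.ini. A minimal excerpt (the path below is the usual default and the front-camera line is just for illustration):

~/.android/avd/&lt;avd-name&gt;.avd/config.ini:

hw.camera.back = virtualscene
hw.camera.front = emulated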
I am working on a personal Android project and I would like to add images captured from webcams into the app, e.g. from here:
https://www.skylinewebcams.com/en/webcam/united-states/new-york/new-york/new-york-manhattan.html
My problem is that the webcams stream in Flash, and I just want to show a captured picture (.jpg or .png) in the app, so I am thinking:
1. Get a webcam screenshot using the webcam URL (called by a cron job)
2. Save the screenshot in a database
3. Show the screenshot in the app
I am trying tools like www.screenshotlayer.com and www.whoapi.com, but the problems I have are:
- The webcam has not loaded yet when the script takes the screenshot, so the screenshot only shows the loader.
- The screenshot does not contain the captured image (like the screenshotlayer.com example I attached).
Any idea how to solve this?
I've encountered the same problem and had to switch to http://browshot.com. They do support Flash elements.
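Whichever provider you use, the "only the loader is visible" symptom usually means the capture fired before the Flash player finished loading, so look for a delay option. A minimal Node/TypeScript sketch of step 1, assuming screenshotlayer's access_key and delay parameters (the parameter names are from their docs as I remember them, so double-check):

// Hypothetical cron-job step: fetch a delayed screenshot and save it locally.
import { writeFile } from "node:fs/promises";

const ACCESS_KEY = "YOUR_ACCESS_KEY"; // placeholder
const target = "https://www.skylinewebcams.com/en/webcam/united-states/new-york/new-york/new-york-manhattan.html";

async function captureScreenshot(url: string): Promise<void> {
  const api = new URL("http://api.screenshotlayer.com/api/capture");
  api.searchParams.set("access_key", ACCESS_KEY);
  api.searchParams.set("url", url);
  api.searchParams.set("delay", "10"); // give the Flash player time to load
  const res = await fetch(api.toString());
  if (!res.ok) throw new Error(`capture failed: ${res.status}`);
  await writeFile("webcam.png", Buffer.from(await res.arrayBuffer()));
}

captureScreenshot(target).catch(console.error); // step 2 would store the file in the database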
I'm making an area-aware Unity app on Tango, and it works fine until I add another camera. It uses an ADF for location awareness and the default AR Camera prefab, so it shows what the device's camera sees. Fine. Now I want to add a mini-map in the upper corner, but simply adding another camera causes the app to crash. I tried:
- perspective vs. orthographic
- render to texture
- various 'clear' flags
- checked my layers
- no culling or anything fancy
It's simple. If I enable the camera, it crashes. If I disable the camera, it works.
Does anyone have any insights into this?
Thanks
I developed a web app using ThreeJS and converted it into an Android APK using Cordova.
Here is the thing: I would like to have the tablet's or smartphone's camera preview in the background of my ThreeJS scene. To accomplish this, I'm using the Camera Preview plugin. However, the plugin can't be triggered by a load event. Moreover, when I finally activate the plugin with a button, either the camera isn't visible or the camera hides the ThreeJS scene.
Does anyone have a solution to get the camera preview in the ThreeJS scene's background?
Sorry about that, guys. I discovered that my app modifies the body's background-color, so instead of getting the video behind the Three.js scene, I got nothing. Indeed, to get this plugin working you need a transparent background color!
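For anyone hitting the same wall, here is a minimal sketch of the transparency setup. The startCamera options (in particular toBack, which draws the preview behind the webview) are from the cordova-plugin-camera-preview README as I remember it, so treat them as assumptions:

// Render the native camera feed behind a transparent WebGL canvas.
import * as THREE from "three";

// 1. Everything above the native preview must be see-through.
document.body.style.backgroundColor = "transparent";

// 2. A renderer with an alpha channel and a fully transparent clear color.
const renderer = new THREE.WebGLRenderer({ alpha: true });
renderer.setClearColor(0x000000, 0);
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

// 3. Start the preview behind the webview once Cordova is ready
//    (plugins are not available before the deviceready event fires).
document.addEventListener("deviceready", () => {
  (window as any).CameraPreview.startCamera({
    x: 0, y: 0,
    width: window.innerWidth, height: window.innerHeight,
    camera: "rear",
    toBack: true, // assumed option name: draw the preview behind the webview
  });
});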
I have implemented an augmented reality application for Android using Adobe AIR for Android, FLARManager and Away3DLite.
The program works fine in Flash. However, when I publish it to my mobile phone (an HTC Nexus One) or run it on the emulator, my camera doesn't activate and all I can see is my background colour and the framerate display.
I think the problem is the Camera3D I have used, which is the FLARCamera_Away3DLite from FLARManager.
This is how I set up my camera:
import com.transmote.flar.camera.FLARCamera_Away3DLite;
import flash.geom.Rectangle; // needed for the viewport rectangle below

private var camera3D:FLARCamera_Away3DLite;

// Wrap the FLARManager instance in an Away3DLite camera sized to the full stage.
this.camera3D = new FLARCamera_Away3DLite(this.flarManager, new Rectangle(0, 0, this.stage.stageWidth, this.stage.stageHeight));
I will really appreciate any advice I can get from you.
Thank you, George
I think you're misunderstanding the camera class. The camera class you use is the camera in your "virtual" 3D world, and it films that 3D world. The "film" it produces then goes to the View class, which projects your 3D world onto 2D. Your screen is a 2D surface and is not capable of showing 3D directly; the camera class in combination with the view converts your 3D scene into the 2D image shown on your screen.
But since you want to make an AR app, what you mean is the camera of the phone. You can't use the Away3D camera class for this. This tutorial shows how to use your Android phone's camera in Flash.
The steps you want to take are: get your phone camera's feed and paint it on the screen; then use FLARToolkit to determine the position of your marker; then adjust the 3D model to the marker's position; and, last but not least, show the 3D model on the screen (using the Away3D/Papervision camera and view). So basically you have two layers in your Flash app: one background layer, which is the feed of your phone camera, and the other layer (on top of it), which is your view from Away3D or Papervision; see the sketch below.
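A minimal ActionScript 3 sketch of that background layer, assuming a device camera is present (the null check you'd want in production is omitted):

// Layer 1: paint the phone camera feed across the whole stage.
import flash.media.Camera;
import flash.media.Video;

var deviceCamera:Camera = Camera.getCamera(); // null if no camera is available
var video:Video = new Video(stage.stageWidth, stage.stageHeight);
video.attachCamera(deviceCamera);
addChild(video);
// Layer 2: the Away3D/Papervision view would then be added on top with addChild().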
I think if you combine those tutorials you can make your application:
- Use your phone camera
- Augmented Reality with FLARManager
- AR basics