Unity3D: Using a different device for rendering and displaying - Android

Our project consists of very detailed models and we are supposed to show it on an Android device, using a VR cardboard (Google Cardboard).
The project has become quite large and it has started to lag on the phone, and we are still adding more models and effects.
We need a way to run the game on a PC while displaying it on the Android device, so that everything runs smoothly. The game needs access to the gyroscope and other sensors for VR.
Can it be done?

Yes. Unity Remote is there to help you with just that: while your project runs in Play mode in the Editor, it streams the Game view to the connected device and sends the device's touch, accelerometer and gyroscope input back to the Editor.

Related

How to debug when using ARCore?

I'm supposed to build a small AR app for Android, but since ARCore can't use the computer's camera and Unity Remote isn't compatible with ARCore, I have no way of running the app in the Unity Editor.
The only thing I can do right now is build the app, install it on my phone, and possibly create a canvas with some text on it to check that the app reaches the states I expect. I've probably spent half a day debugging trivial things because I can't get access to any console while the app is running.
I can run a normal Unity app with Unity Remote as long as ARCore isn't used; otherwise it's just a black screen.
This is for a professional project, so I can't just install plenty of other plugins to make it work that way; I'm limited to ARCore and AR Foundation.

ARCore with different scene / real camera

Some friends and I are trying out Google's ARCore, but since phone support is limited at the moment, none of us owns one of the supported phones.
So I googled and found out that you can run ARCore in the Android Emulator, which works nicely.
I have a question about the emulator's virtual scene, though: can we change it somehow, or use our laptops' webcams so we can test against our real surroundings?
Ultimately we want to build something with Unity + ARCore, but our current Unity app crashes when run on the emulator. So I'd also like to ask whether it is even possible, in the current state, to build an .apk with ARCore in Unity and run it on the emulator.
Sorry for asking so many questions at once.

Simultaneous Localisation and Mapping for Android

I recently started exploring the world of visual recognition and was particularly interested in SLAM. I have tried and tested different SLAM implementations from openslam.org on a laptop.
I'm still new to the field and still learning about it. I want to implement SLAM on an Android device. Can anyone point me to such an implementation, or suggest which implementation of SLAM would work best on Android? I have access to top-of-the-line Android devices such as the Galaxy S6 and the Nexus 5.
Before starting to work on my idea, I just want to know which implementation would work best in terms of efficiency and accuracy on an Android device.
Thank you.
These guys are quite good: the source is available under the GPLv3 licence (in user-friendly form), and a friend of mine was able to run it on Android in real time. It was developed for MAVs, and the authors state it should reach 50 FPS on current embedded computers.
https://github.com/uzh-rpg/rpg_svo
(Check out the video on the project page as well.)
ORB-SLAM is an open-source SLAM system based on sparse features.
On a PC, ORB-SLAM can reach about 35 FPS; on a smartphone it can't perform like it does on a PC, so you can't move the phone too fast. That is because ORB-SLAM's feature-extraction step takes so much time.
SVO is a good choice as far as speed goes; however, SVO is only a visual odometry system. It doesn't have a loop-closing module, so over a long run the drift can grow large.
I therefore recommend ORB-SLAM.
Many people have ported the code to Android; you can search on GitHub: https://github.com/search?q=orb-slam+android&ref=opensearch
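The drift argument above can be illustrated with a toy dead-reckoning example. This is only a sketch in Python, not on-device code; the square path, step count and bias value are arbitrary assumptions chosen to make the effect visible:

```python
import math

def integrate_odometry(steps, step_len=1.0, heading_bias=0.01):
    """Dead-reckon a square path. Each step's heading estimate carries a
    small systematic bias, mimicking per-frame error in a visual-odometry
    front end. Returns the estimated end position (x, y)."""
    x = y = theta = 0.0
    turns = {steps // 4, steps // 2, 3 * steps // 4}  # corners of the square
    for i in range(steps):
        if i in turns:
            theta += math.pi / 2      # ideal 90-degree turn at each corner
        theta += heading_bias         # small per-step heading error
        x += step_len * math.cos(theta)
        y += step_len * math.sin(theta)
    return x, y

# The true path returns to the origin, so any end-point offset is pure
# accumulated drift. A loop-closing system (like ORB-SLAM) can detect the
# revisit and correct the trajectory; a pure VO pipeline keeps the error.
drift = math.hypot(*integrate_odometry(steps=100))
print(f"end-point error after 100 steps: {drift:.1f} units")
```

With the bias set to zero the path closes exactly; with even a one-hundredth-of-a-radian bias per step, the end point lands many units away from the start, which is why a loop-closing module matters for long sessions.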

Augmented reality through the use of an external camera (Android development)

I have been working on a project of my own, and I hit my biggest roadblock when I realised that it is apparently very difficult to use an external camera, communicating over Wi-Fi with an Android device, in SDKs such as Vuforia, Metaio, Wikitude, DroidAR and NyARToolkit.
Right now I have two Android devices smoothly connected over Wi-Fi, and I can see the camera view from one device on the other. I would like to use that view on the receiving Android device in order to "create" the augmented-reality experience. Because this technique isn't widely used, the big AR SDKs haven't worked for me, and I cannot pay for them either.
Does anyone know what I can do, or can you point me to a tutorial that deals with this issue? I obtain the bitmap on the server device and send everything as IP/UDP packets to the other device.
Any help?
Thanks.
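Since the frames are already going over UDP, one common pattern is to split each encoded frame into datagram-sized chunks with a small index header and reassemble them on the receiver. Below is a language-neutral sketch in Python (the Android version would do the same with DatagramSocket); the chunk size, the 4-byte header layout and the helper names are my own assumptions, and real code would also drop incomplete frames when packets are lost:

```python
import socket

CHUNK = 1400  # stay under a typical MTU so datagrams aren't fragmented

def send_frame(sock, addr, frame: bytes):
    """Split one encoded frame (e.g. a JPEG) into numbered UDP chunks.
    Header: 2-byte chunk index + 2-byte total chunk count."""
    total = (len(frame) + CHUNK - 1) // CHUNK
    for i in range(total):
        payload = frame[i * CHUNK:(i + 1) * CHUNK]
        header = i.to_bytes(2, "big") + total.to_bytes(2, "big")
        sock.sendto(header + payload, addr)

def recv_frame(sock):
    """Collect chunks until one whole frame is present, tolerating
    reordering. (A production receiver would also time out and skip
    a frame whose chunks were lost.)"""
    parts, total = {}, None
    while total is None or len(parts) < total:
        data, _ = sock.recvfrom(CHUNK + 4)
        idx = int.from_bytes(data[:2], "big")
        total = int.from_bytes(data[2:4], "big")
        parts[idx] = data[4:]
    return b"".join(parts[i] for i in range(total))

if __name__ == "__main__":
    rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    rx.bind(("127.0.0.1", 0))
    tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    fake_jpeg = bytes(range(256)) * 20          # stand-in for camera data
    send_frame(tx, rx.getsockname(), fake_jpeg)
    assert recv_frame(rx) == fake_jpeg
    print("frame of", len(fake_jpeg), "bytes reassembled")
```

Once the receiver has a complete frame, it can decode it into a bitmap and hand it to whatever tracking/overlay code draws the AR content on top.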

Test Flash on Android device via USB

I am currently learning ActionScript 3.0, mainly to start developing mobile games for Android.
Device Central is really useful for emulating Flash content running on devices, but I thought there was a way to test directly on the device itself via USB. Am I mistaken here? I cannot for the life of me find any information on doing this.
I found several guides on how to set up the USB connection, but the actual testing/debugging itself seems to be done exclusively in Flash Builder.
I am using Flash CS5, and I want to test my AS3 projects directly on my Nexus S via USB.
The only guides I can find cover publishing Flash projects to Android, which is a fairly lengthy process. Surely there has to be a quicker way to preview content directly on your phone without having to go through the entire process of creating an APK for it?
This should help you out...
Getting Started with Adobe AIR for Android
