I have been working on a project of my own and hit a major roadblock when I realized that it is apparently very difficult to use an external camera, streaming over Wi-Fi to an Android device, with SDKs such as Vuforia, Metaio, Wikitude, DroidAR, and NyARToolkit.
Now, I have two Android devices connected over Wi-Fi. I can see the camera view from one device on the other. I would like to use that view on the receiving device in order to "create" the augmented reality experience. Because this technique is not widely used, the big AR SDKs have not worked for me, and I cannot pay for them either.
Does anyone know what I can do, or can you point me to a tutorial that deals with this issue? I obtain the bitmap on the server device and send everything through IP/UDP packets to the other device.
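The bitmap-over-UDP transport described above can be sketched in plain Java like this. It is only an illustration: the packet header layout, the class names, and the 1400-byte payload size are my own assumptions, not part of any AR SDK. The frame bytes would come from something like `Bitmap.compress()` on the sending device.

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.nio.ByteBuffer;
import java.util.Arrays;

// Sketch: split one compressed frame into UDP-sized chunks, each prefixed
// with a 12-byte header (frame id, chunk index, chunk count) so the
// receiving device can reassemble the bitmap bytes in order.
public class FrameSender {
    static final int PAYLOAD = 1400; // stay under a typical Wi-Fi MTU

    // Build the packets for one frame: [frameId int][index int][total int][data]
    static byte[][] chunk(int frameId, byte[] frame) {
        int total = (frame.length + PAYLOAD - 1) / PAYLOAD;
        byte[][] packets = new byte[total][];
        for (int i = 0; i < total; i++) {
            int off = i * PAYLOAD;
            int len = Math.min(PAYLOAD, frame.length - off);
            ByteBuffer buf = ByteBuffer.allocate(12 + len);
            buf.putInt(frameId).putInt(i).putInt(total);
            buf.put(frame, off, len);
            packets[i] = buf.array();
        }
        return packets;
    }

    // Receiver side: strip the 12-byte header and concatenate in index order.
    static byte[] reassemble(byte[][] packets) {
        int size = 0;
        for (byte[] p : packets) size += p.length - 12;
        byte[] frame = new byte[size];
        for (byte[] p : packets) {
            ByteBuffer buf = ByteBuffer.wrap(p);
            buf.getInt();                 // frameId (unused in this sketch)
            int index = buf.getInt();
            buf.getInt();                 // total (unused in this sketch)
            buf.get(frame, index * PAYLOAD, p.length - 12);
        }
        return frame;
    }

    static void send(DatagramSocket socket, InetAddress target, int port,
                     int frameId, byte[] frame) throws Exception {
        for (byte[] p : chunk(frameId, frame)) {
            socket.send(new DatagramPacket(p, p.length, target, port));
        }
    }
}
```

A real implementation would also have to tolerate UDP's lost and reordered packets, e.g. by dropping incomplete frames; the header above carries enough information to detect both cases.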
Any help?
Thanks.
I have an Android device running a custom ROM and self-developed apps.
Android version: 8.1 (API level 27)
There is a temperature and humidity sensor built into the device. Now I want to feed the sensor data into my smart-home system using Matter.
I'm not familiar with C++ or any languages apart from Java/Kotlin and Python. So is there any way to send the sensor data to my Amazon Echo smart-home hub from the Android software using Matter?
It would also be good if it worked independently of the smart-home ecosystem (Google, Apple, Samsung, etc.).
I have checked out the Matter GitHub repository. There are some example apps, and I tried to apply them to my issue, but I did not get any further.
Any help is much appreciated.
An Android app might not be the best path for this, as it cannot stay awake and listen for external communications all the time. You could look into running a local server on the Android system and implementing a virtual device that way. You can also download the Matter specification from the CSA website (https://csa-iot.org/all-solutions/matter/) and take a look at how Matter communication is implemented.
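The "local server" idea can be sketched in plain Java as below. To be clear about the assumptions: this is not Matter (a real Matter device would implement the protocol from the specification, or be bridged through one); it only shows keeping a listener alive that serves the latest sensor reading, which is the part an ordinary Android activity cannot do reliably. On Android this would have to run inside a foreground service to survive in the background.

```java
import java.io.OutputStream;
import java.net.ServerSocket;
import java.net.Socket;
import java.nio.charset.StandardCharsets;
import java.util.Locale;

// Illustrative sketch: a tiny TCP server that answers each connection with
// the latest temperature/humidity reading as JSON, then closes the socket.
// Class name, port handling, and JSON shape are my own choices.
public class SensorServer implements Runnable {
    private final ServerSocket server;
    private volatile double temperature = 21.5; // placeholder reading
    private volatile double humidity = 40.0;    // placeholder reading

    public SensorServer(int port) throws Exception {
        server = new ServerSocket(port); // port 0 = pick any free port
    }

    // Called from wherever the real sensor values arrive.
    public void update(double temp, double hum) {
        temperature = temp;
        humidity = hum;
    }

    public int port() { return server.getLocalPort(); }

    @Override
    public void run() {
        try {
            while (!server.isClosed()) {
                try (Socket client = server.accept()) {
                    String json = String.format(Locale.ROOT,
                        "{\"temperature\": %.1f, \"humidity\": %.1f}",
                        temperature, humidity);
                    OutputStream out = client.getOutputStream();
                    out.write(json.getBytes(StandardCharsets.UTF_8));
                    out.flush();
                }
            }
        } catch (Exception ignored) { /* socket closed: stop serving */ }
    }

    public void close() throws Exception { server.close(); }
}
```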
Our project consists of very detailed models, and we are supposed to show it on an Android device. We are using a VR cardboard headset.
The project has become quite large, and it is starting to lag on the mobile phone. We are still adding more models and effects.
We need a way to run the game on a PC while displaying it on the Android device, so that everything runs smoothly. The game needs access to the gyroscope and other sensors for VR.
Can it be done?
Yes. Unity Remote is there to help you with just that: it runs the game in the Unity editor on your PC, streams the Game view to the connected Android device, and sends the device's touch, gyroscope, and accelerometer input back to the editor.
I am creating an application that connects to the phone camera from native code.
This works great on my phone.
The problem is that when I try to do the same thing with the Android emulator, there is no "/dev/video" to connect to (I know the camera is connected because I am able to open it using the camera app).
Does anyone know if there is another way I can connect to the camera from native code?
/dev/video0 is, in Android terms, an implementation detail, and is not guaranteed to be present on any device or emulator.
Emulator support for the camera is very limited; see for example Android webcam enable in emulator.
There is no official native camera API on Android, so there is no guaranteed way of doing this.
For maximum compatibility, use the Java API and pass the image data to native code for processing, if necessary.
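As an illustration of that hand-off: the Java camera API (`Camera.PreviewCallback.onPreviewFrame`) delivers each frame as a `byte[]` in NV21 format, whose first width×height bytes are the luma (grayscale) plane. The plain-Java method below is a stand-in for the processing step; in the setup the answer describes, it would instead be declared `native` and implemented in C/C++ via JNI, receiving the same `byte[]`.

```java
// Stand-in for native-side processing of a camera preview frame.
// NV21 layout: width*height Y (luma) bytes, then interleaved VU chroma.
public class PreviewProcessor {
    // Returns the Y (grayscale) plane of an NV21 frame as 0..255 ints.
    static int[] extractLuma(byte[] nv21, int width, int height) {
        int[] luma = new int[width * height];
        for (int i = 0; i < luma.length; i++) {
            luma[i] = nv21[i] & 0xFF; // bytes are signed in Java; mask to 0..255
        }
        return luma;
    }
}
```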
I am developing an iPhone app and an Android app which will connect to the Sony NEX-5T camera via the Sony Camera Remote API and receive an image sent from the camera. I downloaded the Sony tutorial from https://developer.sony.com/2013/11/29/how-to-develop-an-app-using-the-camera-remote-api-2/ and tried using it.
I am not able to connect to the camera via the demo app; it does not locate my camera at all.
But if I use Sony's PlayMemories app, it locates the camera and receives the image in a fraction of a second. I even tried to reach the camera manually via 10.0.0.1/sony/camera, but it does not connect.
Any help will be greatly appreciated. A working demo from anyone would be highly helpful.
Thanks in advance.
I can suggest a few things to check in your setup:
1) Update to the latest firmware on the camera (esupport.sony.com).
2) Is the latest version of the Smart Remote Control app installed in the NEX-5T camera?
https://www.playmemoriescameraapps.com/portal/usbdetail.php?eid=IS9104-NPIA09014_00-F00002
3) Use the latest version of the iOS/Android sample applications from the Camera Remote API SDK.
4) Start the Smart Remote Control app on the NEX-5T before connecting.
Best regards,
Prem, Member of Developer World team at Sony
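Once the camera is discovered and Smart Remote Control is running, the Camera Remote API itself is a JSON-RPC-style HTTP POST. A minimal Java sketch follows; note the assumptions: the `10.0.0.1:10000` endpoint echoes the address from the question, but the real endpoint URL comes from the SSDP device-description XML, so treat it as a placeholder.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

// Sketch of a Camera Remote API call: POST a small JSON-RPC body to the
// camera's service endpoint and read back the JSON response.
public class SonyRpc {
    // Build the JSON-RPC request body by hand (no JSON library needed here).
    static String buildRequest(String method, int id) {
        return "{\"method\":\"" + method + "\",\"params\":[],"
             + "\"id\":" + id + ",\"version\":\"1.0\"}";
    }

    static String call(String endpoint, String method) throws Exception {
        HttpURLConnection conn =
            (HttpURLConnection) new URL(endpoint).openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        byte[] body = buildRequest(method, 1).getBytes(StandardCharsets.UTF_8);
        try (OutputStream out = conn.getOutputStream()) { out.write(body); }
        try (BufferedReader in = new BufferedReader(
                 new InputStreamReader(conn.getInputStream(),
                                       StandardCharsets.UTF_8))) {
            StringBuilder sb = new StringBuilder();
            String line;
            while ((line = in.readLine()) != null) sb.append(line);
            return sb.toString();
        }
    }

    public static void main(String[] args) throws Exception {
        // e.g. ask the camera what it supports once it is reachable
        // (endpoint URL is a placeholder; use the one from SSDP discovery):
        System.out.println(call("http://10.0.0.1:10000/sony/camera",
                                "getAvailableApiList"));
    }
}
```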
I am currently learning Actionscript 3.0, mainly to start developing mobile games for Android.
Device Central is really useful for emulating Flash content running on devices, but I thought there was a way to test directly on the device itself via USB. Am I mistaken? I cannot for the life of me find any information on doing this.
I found several guides on how to get the USB connection up, but the actual testing/debugging itself seems to be done exclusively in Flash Builder.
I am using Flash CS5, and I want to test my AS3 projects directly on my Nexus S via USB.
The only guides I can find detail publishing Flash projects to Android, which is a fairly lengthy process. Surely there has to be a quicker way to preview content directly on your phone without going through the entire process of creating an APK?
This should help you out...
Getting Started with Adobe AIR for Android