ZED camera with Android phone/tablet - android

On the website it says that even though there is no SDK provided for Android, it is possible to connect the camera to an Android device with a micro-USB adapter. Is it possible to extract images from both cameras of the ZED in this case? If so, are there any code samples to accomplish this task?

I have contacted the ZED Support Team and received the following reply:
You need to connect the camera to the USB port (but note that it will
behave in USB 2.0 mode). Then, to extract the side-by-side image of the
ZED, you need to do some JNI in C (Java/Android does not allow native
grabbing from USB). To do so, you can find on GitHub a libuvc wrapper
for Android that reads UVC cameras such as the ZED. (
https://github.com/saki4510t/UVCCamera )
Enjoy!
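Once libuvc (or the UVCCamera wrapper) hands you a decoded frame, splitting the side-by-side image the support team mentions is plain array slicing. Below is a minimal, library-agnostic sketch in Python; the function name and the flat row-major pixel layout are my assumptions, not part of the ZED API:

```python
def split_side_by_side(frame, width, height):
    """Split one side-by-side stereo frame into left and right images.

    `frame` is a flat, row-major sequence of pixels in which each row
    holds the left camera's pixels followed by the right camera's.
    """
    half = width // 2
    left, right = [], []
    for row in range(height):
        start = row * width
        left.extend(frame[start:start + half])            # left half of this row
        right.extend(frame[start + half:start + width])   # right half of this row
    return left, right
```

The same slicing applies unchanged whether a "pixel" is a grayscale byte or a packed RGBA value.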

Related

How to use a USB web camera without NDK?

I am trying to connect a USB web camera to my Android phone and capture images with it. I want to do this without using any library. As far as I can see, there is no Android API that supports USB web cameras, yet I see a lot of applications on the Play Store doing the job.

Access screen buffer on Android phones via USB using Python

I'm using Python to access an Android phone via USB.
The library I'm using is PyUSB.
Is there any internal structure I can find that would lead me to the screen buffer?
Or is Python too high-level, so that I need to switch to C/C++?
If so, is there a way to access the screen buffer on Android?
Please provide detailed instructions.
A VNC server accesses the frame buffer. In droidVncServer, which is open source, this is done in native C (JNI, the Java Native Interface), and I think it is only possible in low-level C. The source code is at https://github.com/oNaiPs/droidVncServer/tree/master/ ; see https://github.com/oNaiPs/droidVncServer/tree/master/jni/vnc and https://github.com/oNaiPs/droidVncServer/tree/master/jni/vnc/screenMethods for the JNI code.
See also: How Droid VNC works?
So with this you can access the frame buffer on the Android device. However, you then have to grab the image over USB using Python, which is not trivial. You can, for example, mount the Android device as mass storage, or as a PTP (Picture Transfer Protocol) or MTP (Media Transfer Protocol) device, or use adb; however, this is not easy.
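Of those options, adb is the most scriptable: `adb exec-out screencap -p` streams the current screen as PNG bytes over USB. A sketch driving it from Python via subprocess (the helper names are mine; it requires adb on the PATH and a connected device with USB debugging enabled):

```python
import subprocess

def screencap_cmd(serial=None):
    """Build the adb command that dumps the device screen as PNG bytes.

    `serial` (optional) targets a specific device when several are attached.
    """
    cmd = ["adb"]
    if serial:
        cmd += ["-s", serial]
    return cmd + ["exec-out", "screencap", "-p"]

def grab_screen(path, serial=None):
    """Run screencap and save the resulting PNG to `path`."""
    result = subprocess.run(screencap_cmd(serial), check=True,
                            capture_output=True)
    with open(path, "wb") as f:
        f.write(result.stdout)
```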
If you want an easy solution, install droidVncServer on your Android device and write a Python program that uses a library like https://pypi.python.org/pypi/vncdotool to access it.
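A minimal sketch of that easy route, assuming droidVncServer is already running on the phone and reachable over the network (the helper names are mine; vncdotool addresses use `host:display`, where display N means TCP port 5900 + N):

```python
def vnc_target(host, display=0):
    """Format a vncdotool server address: display N listens on port 5900 + N."""
    return "%s:%d" % (host, display)

def capture_screen(host, out_path, display=0):
    """Grab one screenshot from the phone's VNC server and save it to a file.

    Requires `pip install vncdotool` and a running droidVncServer.
    """
    from vncdotool import api  # imported lazily so the dependency stays optional
    client = api.connect(vnc_target(host, display))
    client.captureScreen(out_path)
    client.disconnect()
```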

Augmented reality through the use of an external camera - Android development

So I have been working on a project of my own, and I hit the biggest roadblock when I realized that it is apparently very difficult to use an external camera that communicates over Wi-Fi with an Android device in SDKs such as Vuforia, Metaio, Wikitude, DroidAR and NyARToolkit.
Now, I have two Android devices smoothly connected over Wi-Fi. I can see the camera view from one device on the other device. I would like to use that view on the receiving Android device to "create" the augmented reality experience. Because this technique is not widely used, the big AR SDKs have not worked for me, and I cannot pay for them either.
Does anyone know what I can do, or can you point me to a tutorial that deals with this issue? I obtain the bitmap on the server device and send everything through IP/UDP packets to the other device.
Any help?
Thanks.
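The bitmap-over-IP/UDP transport described in the question can be sketched as follows. The chunk header and function names are my own invention; a real implementation must also cope with dropped and reordered packets across frames, which UDP does not protect against:

```python
import socket
import struct

CHUNK = 1024  # payload bytes per datagram; keeps each packet under a typical MTU

def send_frame(sock, addr, frame_bytes):
    """Send one frame as UDP datagrams: 8-byte header (index, total) + payload."""
    chunks = [frame_bytes[i:i + CHUNK] for i in range(0, len(frame_bytes), CHUNK)]
    for idx, chunk in enumerate(chunks):
        sock.sendto(struct.pack("!II", idx, len(chunks)) + chunk, addr)

def recv_frame(sock):
    """Reassemble one frame; assumes every chunk of that frame arrives."""
    parts, total = {}, None
    while total is None or len(parts) < total:
        data, _ = sock.recvfrom(CHUNK + 8)
        idx, total = struct.unpack("!II", data[:8])
        parts[idx] = data[8:]
    return b"".join(parts[i] for i in range(total))
```

On Android, the receiving side would decode the reassembled bytes into a Bitmap for the AR overlay; the sketch above only shows the wire format.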

Android Emulator - /dev/video missing

I am creating an application that connects to the phone camera from native code.
This works great on my phone.
The problem is that when I try to do the same thing with the Android emulator, there is no "/dev/video" to connect to (I know the camera is connected, because I am able to open it using the camera app).
Does anyone know if there is another way I can connect to the camera from native code?
/dev/video0 is, in terms of Android, an implementation detail, and is not guaranteed to be present on any device or emulator.
Emulator support for the camera is very limited; see for example Android webcam enable in emulator.
There is no official native camera API on Android, so there is no guaranteed-to-work way of doing this.
For maximum compatibility, use the Java API and send the image data to native code for processing, if necessary.

Connecting Kinect to Android

So I'm trying to hook up a Kinect to an Android tablet by any means necessary. I would prefer to avoid a Windows machine or an Arduino board in the middle.
The method I've already tried is to have a C# program (the Kinect SDK uses C#) communicate with the Android device. I tried to figure out how to send a message through USB, and decided to do port forwarding. This worked, but was slower than I would like.
I guess the question is: can I connect it to Android as a USB device or accessory and communicate via JNI?
In theory you should be able to use OpenNI for ARM. I've seen Hirotaka's demo of OpenNI running on Linaro Android, but using an Asus Xtion Pro sensor and a PandaBoard.
Hirotaka also posted notes on his setup.
A quick YouTube search reveals examples with Kinect and Android tablets.
Side note: I don't understand why you're trying to use C#: you'll be writing Android applications in Java, and OpenNI has a Java wrapper.
