Android: API to external USB devices for robotic applications

Android devices have become incredibly cheap (especially those running Android 1.6). I'm considering using one as the brain of an autonomous robot. Unfortunately I haven't found any info on that.
I would like to connect two external USB webcams and a DIY USB ADC & output-port converter to steer the wheels and read analog distance sensors. If I choose a cheap netbook, it usually already has 3 USB ports. But if I'm forced to use a tablet, it will also require a USB hub.
Do Android devices support USB hubs?
Is there any API to grab still frames from external USB webcams (e.g. "vfa://0" & "vfa://1")?
Is there any API to read from a custom USB device? Let's assume it simulates a serial port, for simplicity.
Do I get all of this in Android 1.6, or only in a newer version?
As an update for your information: based on the answers, I assume an Android device would be too expensive relative to the effort. I will go for a cheap Atom netbook with standard Linux and an Arduino USB device for controls & sensors. At the cost of a device that is half a kilogram (one pound) heavier, I will save months of learning & development.

You need an Android device that either supports USB host mode out of the box (a few of the cheap tablets apparently do) or a phone that can do so with custom USB power wiring and perhaps a new kernel driver (as many phones can).
You will likely need root.
The API would be the normal Linux USB stack, including just about any C-coded, source-available device driver written for desktop Linux (except those that use bits of x86 binary Windows drivers run in a compatibility wrapper).
You could interact with that either from the NDK using the normal methods (device files, read/write/ioctl) or, with careful driver design so that things really look like files, you can probably get at some of it from Java, or at the very least from Java with some thin NDK wrappers around the device file operations.
Essentially, this isn't an "Android" question; it's a question about the capabilities of a particular Android device's hardware and how to get root on that device. After that, it becomes a standard embedded Linux question.
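For instance, if the kernel driver exposes the DIY converter as a character device, the plain-Java end of that approach could look roughly like the sketch below. This is only an illustration: the /dev/ttyACM0 node is an assumption, you typically need root (or relaxed permissions) to open it on a stock build, and anything beyond read/write, such as setting a baud rate via ioctl, still needs a thin NDK/C wrapper.
```java
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;

// Hypothetical sketch: treating a device node as a plain file from Java.
// Assumes the kernel driver has created /dev/ttyACM0 and the process may open it.
public class DeviceFileExample {
    public static void main(String[] args) throws IOException {
        FileOutputStream out = new FileOutputStream("/dev/ttyACM0");
        FileInputStream in = new FileInputStream("/dev/ttyACM0");
        try {
            out.write("PING\n".getBytes());   // send a command to the DIY board
            byte[] buf = new byte[64];
            int n = in.read(buf);             // blocking read of the reply
            if (n > 0) {
                System.out.println(new String(buf, 0, n));
            }
        } finally {
            in.close();
            out.close();
        }
    }
}
```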

As far as I know you won't get any of this with the default Java API. Lots of this stuff can, however, be achieved if you build a custom kernel and add the needed modules to it. Basically it all comes down to kernel hacking and isn't really Android-related.
I'm very interested in stuff like that myself so keep me updated please.

Hardware Access

I've always liked cheap smartphones (~$50) because with little money I can have a powerful system with lots of sensors and things like that. So I wondered whether it is possible to use the hardware without going through the very limited Android APIs, programming it at a low level instead, with root of course. In particular I wanted to see how the LTE module works and experiment with it under full control; the Android API does not allow much of that.
UPDATE: I'm using something called libhybris, a wrapper that permits the use of Android driver blobs in Linux.
The first layer of software on the phone is the bootloader. It tells the processor which partition to load into memory to execute the kernel. This is the level that is usually locked down by manufacturers, for greedy corporate reasons that are beyond the scope of this site.
The second layer of the phone is the Linux kernel. Rooting is the process of gaining root user access to this layer. Root is the main administrator account, which has permission to do anything on the device; accessing this layer is what most people mean by rooting. A large portion of the kernel is written in C, with other parts in C++, and what happens at this level is where all the magic is. For most phones this is where the code for the modem resides, and talking to it can usually be done via AT commands over serial. Sensors are also handled at this level and communicate via drivers. Root access is not normally needed to read sensor data; it's usually just a matter of permissions.
The next level is the Android operating system. The Java runtime instance runs on top of the kernel and in turn executes the Android framework. This is the portion that most users will see, and it is primarily written in Java. In reality you can run any kind of user interface at this level.
A very brief view of Android apps:
The Android API provides a way for Java developers to write "apps" that communicate with the kernel and access different parts of the phone's hardware. These apps can also be written using C++; only recently has Google integrated C++ into Android Studio, but the most common and still most effective method of doing so is the Qt framework.
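For the standard, no-root path that this API offers, here is a minimal sketch of reading the accelerometer through SensorManager (class and variable names are illustrative, not from the original post):
```java
import android.app.Activity;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.os.Bundle;

// Illustrative sketch: reading a sensor through the normal Android API, no root needed.
public class SensorDemoActivity extends Activity implements SensorEventListener {
    private SensorManager sensorManager;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        sensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
    }

    @Override
    protected void onResume() {
        super.onResume();
        Sensor accel = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
        sensorManager.registerListener(this, accel, SensorManager.SENSOR_DELAY_NORMAL);
    }

    @Override
    protected void onPause() {
        super.onPause();
        sensorManager.unregisterListener(this);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        // event.values[0..2] hold the x/y/z acceleration in m/s^2
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
        // not needed for this sketch
    }
}
```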
It's a bit problematic.
Hardware manufacturers do that actually.
Take into account that Android is Linux, much like other distributions.
The manufacturers develop hardware and then compile a version of Android that sits on top of it. Each Android build is specifically tailored to the hardware and equipped with drivers that give the main OS access to the different hardware capabilities.
For example, some tablets tweak the Android OS so that it does not support cellular communication, because the manufacturer decided to cut costs and ship the tablet without a cellular module.
From here you have two options:
Hack a specific piece of hardware and figure out how the OS communicates with it.
Find hardware manufacturers that release some or all of their Android OS code. This is a much simpler way, as you can both learn about and extend the Android OS for that specific device.
An example of the second option is Sony, whose AOSP program allows low-level access to some Sony devices.
Also, there is always the Android NDK, which gives you more low-level access to Android, but you are still constrained by its APIs, so I'm not sure it will help you.

Android USB host mode "soft-mode" drivers for standard class-compliant USB devices

Now that the Android APIs support working directly with USB devices (since 3.1), I am curious if there has been any work to create "soft-mode" drivers for some of the more popular class-compliant devices (such as audio or HID).
In other words, are there any open source projects that wrap up more useful communication with specific classes of devices into a Java class that can be added to an Android project?
For my purposes, I am specifically interested in USB audio, but it seems that a community-built set of classes derived from Linux kernel module sources could be beneficial to many projects. My hope is that others have thought of the same thing and have already begun work. Any pointers in this direction would be most appreciated.
A few more resources that I have stumbled on:
User mode USB isochronous transfer from device-to-host
Audio Evolution seems to have built their own userland driver somehow
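For reference, here is a rough sketch of what the core of such a Java class could look like with the android.hardware.usb host API mentioned in the question. Class and method names are made up; device selection, the permission request flow, and the class-specific protocol (audio, HID, ...) are omitted and would be the real work. Note that the Java API exposes only control, bulk, and interrupt transfers, which is why the isochronous-transfer question linked above matters for audio streaming.
```java
import android.content.Context;
import android.hardware.usb.UsbConstants;
import android.hardware.usb.UsbDevice;
import android.hardware.usb.UsbDeviceConnection;
import android.hardware.usb.UsbEndpoint;
import android.hardware.usb.UsbInterface;
import android.hardware.usb.UsbManager;

// Hypothetical sketch: claim the first interface of an attached device and read
// once from its first bulk IN endpoint using the API added in Android 3.1.
public class RawUsbReader {
    public static byte[] readOnce(Context context) {
        UsbManager manager = (UsbManager) context.getSystemService(Context.USB_SERVICE);
        for (UsbDevice device : manager.getDeviceList().values()) {
            if (!manager.hasPermission(device)) continue;   // permission flow omitted
            UsbInterface intf = device.getInterface(0);
            UsbDeviceConnection conn = manager.openDevice(device);
            if (conn == null) continue;
            if (!conn.claimInterface(intf, true)) {
                conn.close();
                continue;
            }
            try {
                for (int i = 0; i < intf.getEndpointCount(); i++) {
                    UsbEndpoint ep = intf.getEndpoint(i);
                    if (ep.getType() == UsbConstants.USB_ENDPOINT_XFER_BULK
                            && ep.getDirection() == UsbConstants.USB_DIR_IN) {
                        byte[] buffer = new byte[ep.getMaxPacketSize()];
                        int len = conn.bulkTransfer(ep, buffer, buffer.length, 1000);
                        if (len > 0) return java.util.Arrays.copyOf(buffer, len);
                    }
                }
            } finally {
                conn.releaseInterface(intf);
                conn.close();
            }
        }
        return new byte[0];
    }
}
```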
I have started work on an IrDA driver stack over USB in user space.
I am working out the basic plumbing, but as far as I can see, if I can create a user-space equivalent of the Linux kernel drivers such as the STIR4200 driver, then I "ought" to be able to port over existing IrDA protocol stacks such as JIR.
We shall see...

Serial port on an APAD - where to discuss this?

I've got an application that is written in Java and talks to a device over a virtual serial port (i.e. a USB CDC ACM device). Currently it runs on a PC (Windows/Linux/Mac OS X),
but it would be a perfect match to port this to a cheap tablet PC to create a stand-alone system.
I've been googling for hours now, and it seems quite a lot of people are interested in this sort of thing (no surprise there) and some have managed it, but I haven't found a good match for what I'm looking for or a good place to discuss this.
I'm looking at something like this:
http://www.prlog.org/10776061-101-inch-android-ipad-android-google-mid-tablet-pc.html
I would like to discuss the following:
This one says that it supports USB host, so it should be doable, eh?
Android is a kind of Linux, so I should be able to use a serial dongle there, right?
Does Android have drivers so that I could just plug in a serial port dongle and open it as /dev/tty?
Would the above-quoted APAD be usable as a development platform?
So where would be a best place to discuss this?
br Kusti
To keep this at least partly programming-related: if your application has a GUI, moving to Android is not going to be a load-it-and-go effort. Android has a very different application structure than you're used to, and doesn't have Swing (if that's what you're using) or any GUI toolkit other than its own.
On your USB problem: there is support for a few USB-to-RS232 adapters in the stock kernel, but there are a bunch of practical reasons not to use it, most of which involve limiting yourself to devices that support host mode and cabling and powering both the Android device and the serial adapter. You might be better off using a Bluetooth-to-RS232 adapter on your serial device, which would allow your app to run on a wider variety of devices and gets you the bonus feature of being wireless to put in your marketing material.
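As a sketch of that Bluetooth route: on the Android side, the standard SPP UUID plus an RFCOMM socket is usually all that is needed. The MAC address below is a placeholder; pairing and the BLUETOOTH permission are assumed to be in place.
```java
import android.bluetooth.BluetoothAdapter;
import android.bluetooth.BluetoothDevice;
import android.bluetooth.BluetoothSocket;
import java.io.InputStream;
import java.io.OutputStream;
import java.util.UUID;

// Illustrative sketch: talking to a Bluetooth-to-RS232 adapter over SPP/RFCOMM.
public class BluetoothSerialLink {
    private static final UUID SPP_UUID =
            UUID.fromString("00001101-0000-1000-8000-00805F9B34FB");

    public static void sendCommand(String mac, byte[] command) throws Exception {
        BluetoothAdapter adapter = BluetoothAdapter.getDefaultAdapter();
        BluetoothDevice device = adapter.getRemoteDevice(mac);  // e.g. "00:11:22:33:44:55" (placeholder)
        BluetoothSocket socket = device.createRfcommSocketToServiceRecord(SPP_UUID);
        try {
            socket.connect();                   // blocks until the RFCOMM link is up
            OutputStream out = socket.getOutputStream();
            InputStream in = socket.getInputStream();
            out.write(command);                 // bytes come out of the RS232 side
            int firstReplyByte = in.read();     // read one byte of the response
        } finally {
            socket.close();
        }
    }
}
```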
I just ported the RXTX library to Android. Unfortunately I had to fork it to accommodate the different layout of Android projects. More details are here: http://v-lad.org/projects/gnu.io.android/
You need a device that supports USB host mode. The kernel on the device also has to support the USB-to-serial converter, or you have to recompile the kernel yourself.
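Assuming the kernel does expose the dongle as, say, /dev/ttyACM0, using the ported gnu.io API could look roughly like this (the device node name and baud rate are assumptions):
```java
import gnu.io.CommPortIdentifier;
import gnu.io.SerialPort;
import java.io.InputStream;
import java.io.OutputStream;

// Illustrative sketch: opening a CDC ACM serial port via the gnu.io (RXTX) API.
public class SerialExample {
    public static void main(String[] args) throws Exception {
        CommPortIdentifier id = CommPortIdentifier.getPortIdentifier("/dev/ttyACM0");
        SerialPort port = (SerialPort) id.open("SerialExample", 2000); // 2 s open timeout
        try {
            port.setSerialPortParams(115200, SerialPort.DATABITS_8,
                    SerialPort.STOPBITS_1, SerialPort.PARITY_NONE);
            OutputStream out = port.getOutputStream();
            InputStream in = port.getInputStream();
            out.write("STATUS\r\n".getBytes());   // query the device
            int firstReplyByte = in.read();       // read the first byte of the reply
        } finally {
            port.close();
        }
    }
}
```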

Redirecting/duplicating the UI to an external output

Is it possible in the Android framework to duplicate what is displayed on the main display (UI)?
I have a situation where I need to demonstrate my app to many people, and it would be easier if I could duplicate the screen contents to an external monitor/TV. I am not married to the idea of using the HDMI port; I would be happy doing this through Wi-Fi or Bluetooth or USB if need be. What I am looking for is something similar to what Windows does by default when a second monitor is connected.
I have been through the developer documentation and haven't been able to find anything that would allow me to do this, but it would not be the first time I've missed something. Specifically, I need to do this with an HTC EVO.
Your options are limited, mostly by your choice of device. The HTC EVO's HDMI port will only play back apps via the built-in Gallery application (videos and still photos).
You will need to use a "software projector" like Droid#Screen -- attach your EVO to a Android SDK-equipped notebook that is connected to a projector. Droid#Screen will display the EVO's screen on the notebook (and, from there, on the projector). However, the frame rate is limited to about 5-6 fps, due to limitations in the SDK tools that Droid#Screen leverages.
Or, get your hands on an HTC Droid Incredible, which supports composite output to TVs of anything on the main display via a special cable. The Samsung Galaxy Tab also supports this for anything that does not involve a SurfaceView, based on my experimentation to date. Some versions of the Samsung Galaxy S also support this, at least to some extent.
Or, use a webcam.
Or, use an ELMO (basically a webcam designed for document or device projection).
You can write a UiCloningService in JNI that exposes a method to clone the display. Since Android is based on Linux, it usually uses the Linux framebuffer infrastructure to represent display devices as device nodes under /dev/fb* or /dev/graphics/fb*, where '*' can be 0, 1, 2, ... depending on the number of displays connected.
Since your device already has an HDMI port, it would be exposed via /dev/graphics/fb1, with fb0 being your default LCD display.
In the cloning service, you can then write to the device attribute files created for the HDMI port under sysfs. If the display driver of your device implements those features (which it most probably does; otherwise what would be the point of having an external HDMI port), those functions in the driver will be responsible for cloning the UI on your primary display to the secondary display.
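As a very rough illustration of the framebuffer idea (not the JNI service itself), the hypothetical Java sketch below copies one raw frame from fb0 to fb1. Real code would query the resolution and pixel format via ioctl (FBIOGET_VSCREENINFO) instead of hard-coding a frame size, and it needs root to open the nodes.
```java
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;

// Hypothetical sketch: read one raw frame from the LCD framebuffer node and
// write it to the HDMI framebuffer node. Frame size is an assumed placeholder.
public class FramebufferCloneOnce {
    // Assumed values for illustration only: 800x480 at 4 bytes per pixel.
    private static final int FRAME_BYTES = 800 * 480 * 4;

    public static void cloneOnce() throws IOException {
        FileInputStream lcd = new FileInputStream("/dev/graphics/fb0");
        FileOutputStream hdmi = new FileOutputStream("/dev/graphics/fb1");
        try {
            byte[] frame = new byte[FRAME_BYTES];
            int read = 0;
            while (read < FRAME_BYTES) {          // read exactly one frame
                int n = lcd.read(frame, read, FRAME_BYTES - read);
                if (n < 0) break;
                read += n;
            }
            hdmi.write(frame, 0, read);           // push it to the external display
        } finally {
            lcd.close();
            hdmi.close();
        }
    }
}
```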
But you would have to write the UI cloning service in JNI (usually device manufacturers provide such methods if they supply an SDK for development on that particular device).
For example, I have attached below a UiCloningService.cpp that has a cloning JNI function for Android Gingerbread on an OMAP3 platform:
UiCloningService.cpp
