Currently I'm looking for a solution to communicate via SPI in Android. I installed the NDK on my system and used native code to communicate with the GPIOs (/sys/class/gpio/gpioxx/value) on my Wandboard. It works fine, so the principle of using the NDK and C code in Android is clear to me. Unfortunately I can't find any SPI devices under /dev/… ; I only find I²C devices. Can somebody tell me whether Android basically offers SPI? And if it is possible, where can I find the device? Can I communicate with it the same way as in Linux?
Thanks
This is an answer to a very old question, but it could still be valuable for someone else.
Often /dev/spidev0.0 is not enabled by default. It must be enabled both in the kernel at build time and in the device tree at runtime.
The first step is to change the Linux kernel build configuration to enable spidev by adding
CONFIG_SPI_SPIDEV=y
And in the device tree (one of the .dts files), add a spidev child node to the SPI controller you wish to expose.
This is an example for the HiKey960 and spi3 on the high-speed expansion connector; other boards will be slightly different:
&spi3 {
    /* On High speed expansion */
    label = "HS-SPI1";
    status = "okay";
    spi-cpol = <0>;

    spidev@1 {
        reg = <1>;
        spi-max-frequency = <5000000>;
        compatible = "rohm,dh2228fv";
        pl022,com-mode = <2>;
    };
};
You will then be able to read and write the device /dev/spidev0.0 with a simple cat > /dev/spidev0.0 and cat < /dev/spidev0.0, but ioctl must be used for more complex operations, e.g. reconfiguring the speed, changing the polarity, or a full-duplex write/read.
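For illustration, a minimal user-space C sketch of such a full-duplex transfer via the standard spidev ioctl interface might look like this (the device path, 5 MHz speed, and mode 0 are assumptions; adjust them for your board):

#include <fcntl.h>
#include <stdint.h>
#include <stdio.h>
#include <string.h>
#include <sys/ioctl.h>
#include <unistd.h>
#include <linux/spi/spidev.h>

int main(void)
{
    int fd = open("/dev/spidev0.0", O_RDWR);
    if (fd < 0) { perror("open"); return 1; }

    uint8_t mode = SPI_MODE_0;        /* CPOL=0, CPHA=0 */
    uint32_t speed = 5000000;         /* 5 MHz, as in the DT example above */
    ioctl(fd, SPI_IOC_WR_MODE, &mode);
    ioctl(fd, SPI_IOC_WR_MAX_SPEED_HZ, &speed);

    uint8_t tx[4] = { 0x01, 0x02, 0x03, 0x04 };
    uint8_t rx[4] = { 0 };
    struct spi_ioc_transfer xfer;
    memset(&xfer, 0, sizeof(xfer));
    xfer.tx_buf = (unsigned long)tx;  /* spidev expects buffer addresses as u64 */
    xfer.rx_buf = (unsigned long)rx;
    xfer.len = sizeof(tx);
    xfer.speed_hz = speed;
    xfer.bits_per_word = 8;

    /* One full-duplex transfer: tx is clocked out while rx is clocked in */
    if (ioctl(fd, SPI_IOC_MESSAGE(1), &xfer) < 0)
        perror("SPI_IOC_MESSAGE");

    close(fd);
    return 0;
}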
I am currently building a device and its driver in an embedded Android.
The device is basically an embedded Linux behaving like a touchscreen.
The device and the embedded Android are connected with UART (I am using LeMaker Guitar eval boards). The communication is working: I receive the data sent on /dev/ttyS0 (using minicom or cat /dev/ttyS0).
Now, I need to create the driver that will receive this input (it will be touches, so coordinates; the protocol is already written).
I found this resource and its lab.
We can read that:
"The driver name must be “atmel_usart” to match the device definitions in arch/arm/machat91/"
So I looked for it and found that the device (i.e. the embedded Linux board) has to be declared either in the device tree or in a board file as a platform_device whose name matches the name of the platform_driver.
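For reference, my understanding is that the matching mechanism boils down to a skeleton like this (illustrative only; the "mytouch" name is made up):

#include <linux/module.h>
#include <linux/platform_device.h>

static int mytouch_probe(struct platform_device *pdev)
{
    dev_info(&pdev->dev, "bound\n");
    return 0;
}

static struct platform_driver mytouch_driver = {
    .probe = mytouch_probe,
    .driver = {
        /* Must match the platform_device name (legacy board files);
         * device-tree matching instead uses an of_match_table with
         * "compatible" strings. */
        .name = "mytouch",
    },
};
module_platform_driver(mytouch_driver);
MODULE_LICENSE("GPL");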
This name-matching requirement made me question my approach to this problem a lot.
I can read from /dev/ttyS0, but this is just a device node, not a device. Is that true?
When implementing the platform_device structure, we must allocate resources. Is there any way to let the system handle the resources?
I've also seen another subsystem that could be used, but could not find any real documentation: serio. Might it be a better path to resolve my issues?
Thanks for reading. I am open to suggestions on how to solve this issue: what shape should my driver have?
Thanks again, you might be saving my internship :) :)
EDIT:
My wording was not clear enough.
So I have two parts: one embedded Linux board (i.e. Ubuntu MATE) that will behave like a touchscreen. It will send the coordinates of the touches to an embedded Android.
The embedded Linux board is connected to the Android board via a UART serial link; this communication works. Now, I want to write a driver in order to perform the touches on the Android side.
Thanks again :)
This is my first question on Stack Overflow, even though I'm a long-time reader of this problem-solving source.
Anyway, this is the issue I'm facing:
I'm trying to connect two eval boards over an SPI bus:
The first one (the source of data) simulates a touchscreen and runs a Linux distro (for now: Raspbian).
The second one runs embedded Android.
I would like to connect the two over SPI and send the touch sequence from the Linux board to the Android one (according to the multi-touch protocol: https://www.kernel.org/doc/Documentation/input/multi-touch-protocol.txt).
spidev is enabled, but I have no idea how to "perform" the touches I will receive.
From what I see, I can't use Android input device configuration files (https://source.android.com/devices/input/input-device-configuration-files.html) because they can't rely on SPI communication.
Must I create a driver in the Linux kernel, then? What is the best practice in this particular situation?
Thanks in advance, you might be saving my internship :)
If your Android Linux kernel is set up to expose /dev/spidev (or you can set that up in the kernel), you do not have to create a Linux kernel module. You can access /dev/spidev from Android by writing an NDK wrapper in C/C++.
I have done that and it works. I would suggest that you start by writing a small C program that configures and opens a /dev/spidev SPI channel and sends/receives some test data. When that works, rewrite the C program into an NDK wrapper library you can access from an Android app.
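A minimal sketch of such a JNI wrapper might look like this (the com.example.spi.SpiNative class name and the convention of passing in an already-opened file descriptor are assumptions, not a fixed API):

#include <jni.h>
#include <string.h>
#include <sys/ioctl.h>
#include <linux/spi/spidev.h>

/* Hypothetical Java side:
 *   package com.example.spi;
 *   class SpiNative { static native int transfer(int fd, byte[] tx, byte[] rx); }
 */
JNIEXPORT jint JNICALL
Java_com_example_spi_SpiNative_transfer(JNIEnv *env, jclass cls,
                                        jint fd, jbyteArray tx, jbyteArray rx)
{
    jsize len = (*env)->GetArrayLength(env, tx);
    jbyte *txb = (*env)->GetByteArrayElements(env, tx, NULL);
    jbyte *rxb = (*env)->GetByteArrayElements(env, rx, NULL);

    struct spi_ioc_transfer xfer;
    memset(&xfer, 0, sizeof(xfer));
    xfer.tx_buf = (unsigned long)txb;   /* spidev wants buffer addresses as u64 */
    xfer.rx_buf = (unsigned long)rxb;
    xfer.len = (unsigned int)len;

    int ret = ioctl(fd, SPI_IOC_MESSAGE(1), &xfer);

    (*env)->ReleaseByteArrayElements(env, tx, txb, JNI_ABORT); /* tx unchanged */
    (*env)->ReleaseByteArrayElements(env, rx, rxb, 0);         /* copy rx back to Java */
    return ret;
}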
This assumes that the Android app is one you write yourself. If you want to make the touch events available to Android in general, I think you need to write a touch driver as a kernel module.
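Another possible route is the Linux uinput interface, which lets a user-space process create a virtual touch device and inject events without writing a full kernel module, provided it can open /dev/uinput (which usually means root on Android). A minimal single-touch sketch, with assumed coordinate ranges:

#include <fcntl.h>
#include <linux/uinput.h>
#include <string.h>
#include <sys/ioctl.h>
#include <unistd.h>

static void emit(int fd, int type, int code, int value)
{
    struct input_event ev;
    memset(&ev, 0, sizeof(ev));
    ev.type = type;
    ev.code = code;
    ev.value = value;
    write(fd, &ev, sizeof(ev));
}

int main(void)
{
    int fd = open("/dev/uinput", O_WRONLY | O_NONBLOCK);
    if (fd < 0)
        return 1;

    /* Declare a single-touch device for brevity; a real touchscreen
     * would declare the multi-touch (ABS_MT_*) axes instead. */
    ioctl(fd, UI_SET_EVBIT, EV_KEY);
    ioctl(fd, UI_SET_KEYBIT, BTN_TOUCH);
    ioctl(fd, UI_SET_EVBIT, EV_ABS);
    ioctl(fd, UI_SET_ABSBIT, ABS_X);
    ioctl(fd, UI_SET_ABSBIT, ABS_Y);

    struct uinput_user_dev dev;
    memset(&dev, 0, sizeof(dev));
    strcpy(dev.name, "virtual-touch");
    dev.absmax[ABS_X] = 1023;   /* assumed panel resolution */
    dev.absmax[ABS_Y] = 1023;
    write(fd, &dev, sizeof(dev));
    ioctl(fd, UI_DEV_CREATE);

    /* Inject one touch at (100, 200), e.g. parsed from the UART stream */
    emit(fd, EV_KEY, BTN_TOUCH, 1);
    emit(fd, EV_ABS, ABS_X, 100);
    emit(fd, EV_ABS, ABS_Y, 200);
    emit(fd, EV_SYN, SYN_REPORT, 0);

    ioctl(fd, UI_DEV_DESTROY);
    close(fd);
    return 0;
}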
A current product I work on supports standard modem AT commands over the serial link: commands such as ATD, ATH, ATV, ATQ, ATE, etc., to either get information from the device or set info on the device.
The device also supports a dial-up PPP connection, which is typically preceded by some AT commands when initiated from the PC (ATD, for example).
We are looking at creating a similar device using Android, and I searched whether Android or Linux supports Hayes modem AT commands over a serial interface such as RS232 or USB, and am unable to find such a layer or component.
On the Android RIL page I see it says: "Android provides a reference Vendor RIL, using the Hayes AT command set, that you can use as a quick start for telephony testing and a guide for commercial vendor RILs". But I believe this may be for interfacing with the baseband or telephony layer. Is this also typically used with the serial interface, where a DTE can issue AT commands and talk to the Android device?
I also researched Linux, and it does have a basic set of "at" commands, but they are not modem related and serve a different purpose: e.g. atq lists the user's pending jobs, and atrm deletes jobs identified by their job number.
So I am looking to understand whether Android or the Linux kernel has a default AT command parser supporting the Hayes AT command set, which can be accessed by a DTE connected to the Android device over a serial link.
I think this is a typical use case for phones, and others may have come across the same question or issue, but I have not found an answer searching the Android forums.
Adding more info to clarify my question, as it seems from the answers that it wasn't very clear what I was asking.
To clarify my question: I am not trying to issue AT commands from the Android phone. Rather, I am looking to issue AT commands from a PC to an Android phone over a serial link such as RS232/USB/Bluetooth.
I am trying to understand whether Android inherently supports AT commands and has an AT parser.
For example, to establish a PPP link (dial-up connection) from Windows to a device that supports PPP, Windows will first send some AT commands and finally the ATD (dial) command. The device responds with CONNECT, after which it switches to online mode; a PPP link is established and IP data can be sent between the PC and the device. Thus such devices (modems) are typically in AT mode by default. ATD is just one such command; there are several other AT commands supported by the device.
Now, to develop a similar device using Android, I'd like to do so without significantly altering the Linux kernel or the Android architecture, and am looking to understand whether an AT command parser recognizing the standard set of AT commands is inherently supported by Android.
To make my question clearer, consider the case where a PC connects to a device (phone) using the Bluetooth DUN (dial-up networking) profile. The DUN profile requires the phone to support, i.e. parse and recognize, certain AT commands such as AT&C, &D, &F, +GCAP, +GMI, +GMM, +GMR, ATA, D, E, H, L, M, etc.
How does Android address this? Does it have native support for recognizing and responding to such AT commands?
Thanks in advance!
If I understood you correctly, you'd like to send AT commands via serial communication from an Android device to your hardware, and get the responses back.
Basically one would implement serial communication either via Bluetooth, or, starting with API level 12, using USB.
I have done quite extensive serial communication in both ways and it works quite well. The main problem so far has been that while Bluetooth serial adapters are quite expensive, the newer USB serial communication has flaws on certain devices.
The last time I tested a Samsung Galaxy Tab, the USB drivers were not functional. On some other Android devices there were no USB drivers installed at all.
But once you have a working Android device, serial communication (with or without AT commands) works fine.
For more info you might like to check http://developer.android.com/guide/topics/usb/host.html
On Linux, there's the chat command, which was typically used by the pppd daemon. Then there's the classic Tcl expect, which allows scripting. It has been ported to many languages, e.g. expect for Java and expect for Python. I haven't used the latter two, but it looks as if you can use them as a library and don't have to call them as external programs.
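To make the expect/send idea concrete, a tiny expect-style exchange written directly in C might look like this (a sketch assuming an already-configured POSIX serial port; the device path and command strings are placeholders):

#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <unistd.h>

/* Send an AT command, then read until the expected token appears. */
static int at_expect(int fd, const char *cmd, const char *expect)
{
    char buf[256];
    ssize_t n, total = 0;

    write(fd, cmd, strlen(cmd));
    write(fd, "\r", 1);                 /* AT commands end with CR */

    while (total < (ssize_t)sizeof(buf) - 1) {
        n = read(fd, buf + total, sizeof(buf) - 1 - total);
        if (n <= 0)
            return -1;
        total += n;
        buf[total] = '\0';
        if (strstr(buf, expect))        /* e.g. "OK" or "CONNECT" */
            return 0;
    }
    return -1;
}

int main(void)
{
    int fd = open("/dev/ttyS0", O_RDWR | O_NOCTTY);
    if (fd < 0) { perror("open"); return 1; }

    if (at_expect(fd, "ATZ", "OK") == 0 &&
        at_expect(fd, "ATD*99#", "CONNECT") == 0)
        printf("modem connected; hand the line over to pppd\n");

    close(fd);
    return 0;
}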
It's a curious thing that the most advanced mobile phones still use the archaic Hayes AT command set instead of a proper API.
I know that the latest versions of Android (Honeycomb and ICS) have support for joysticks and gamepads.
Guitar Hero (and Garage Band) controllers are essentially USB HID devices, right?
So my question:
Is it possible to receive data (button clicks) from a Guitar Hero (or Rock Band) controller on an Android device?
Would Android understand it as gamepad input?
P.S. All I need is to detect in my game the input from those five buttons on the plastic guitar fret.
A good starting point would be to review the linux source code for Frets On Fire, which supports some of the Guitar Hero controllers.
Frets on Fire: SourceForge
SVN: https://fretsonfire.svn.sourceforge.net/svnroot/fretsonfire
It looks like it would be difficult to universally support all controllers from different platforms. Each console has its own protocol, but it does look like joystick-to-keyboard emulation is possible on the PC with the PS3 controller. There is a config file for the PS3 controller on the second link that may be helpful; it's for JoyToKey (which isn't open source), but some of the values in the config may help you.
Hey, this is a really cool idea. Start here:
http://developer.android.com/guide/topics/usb/host.html
Then, check out this sample:
http://developer.android.com/resources/samples/USB/MissileLauncher/index.html
in that sample, there is a method named setDevice(UsbDevice device)
If I were implementing this, I would start with a duplicate of the MissileLauncher project and modify this setDevice method. I would log everything I could possibly find about the UsbDevice device and try experimenting with these conditionals in the setDevice method:
if (ep.getType() != UsbConstants.USB_ENDPOINT_XFER_INT) {
...
if (intf.getEndpointCount() != 1) {
While the MissileLauncher uses this type and this endpoint count, it is very likely the Garage Band controller will have different values.
Also, check out the run method to see an example of back-and-forth communication.
DISCLAIMER: I have no idea if this will work. I've also seen blogs stating this cannot be done.
My guess is it should detect data. This is even possible on existing Android devices, but it is not Android Market friendly.
To accomplish this on pre-4.0 devices you must provide an alternative power source to the USB port. Usually this can be done with a cheap USB hub that leaks power. The device also must be rooted. Once this is complete, you need to create an interface to the device shell to launch native code written in something like C outside of your Dalvik VM. Use your C code to bind to the appropriate socket and you should be able to bus data back and forth. You may also need to compile a kernel module to give driver support.
Like I said, this is a complete hack for devices below 4.0. But it is possible.
I am currently trying to find a way to handle USB data transfer on an isochronous endpoint on my Android 3.2 tablet (host mode supported). After writing some prototype code, I noticed that the constants file states for USB_ENDPOINT_XFER_ISOC: "Isochronous endpoint type (currently not supported)".
Is this possible without rooting the device? If so, how would I go about doing this?
Ideally I was hoping to stay within the Java API, but if this is possible only via the NDK I would have to pursue that instead. I also understand that there might be some USB bandwidth issues, based on the following post: User mode USB isochronous transfer from device-to-host
I have written a Java class for USB isochronous data transfer under Android (or Linux): UsbIso
It uses JNA to access the usbfs API via ioctl calls.
You "can" do it without root, I believe.
You'll need to do it all using some native C code interfacing with the USB device using usbfs. The big issue comes from the lack of documentation of Linux's usbfs. Basically everything has to be done through ioctls. That said, you do open a device as you would normally from Java, then pass the file descriptor from the USBDeviceConnection to your native code.
Add to that, you will need to parse all the USB descriptors yourself. You can get at them, again, from the USBDeviceConnection. Jumping from descriptor to descriptor is simple; finding the documentation for what each descriptor means is a massive headache, but you can find most of it on www.usb.org.
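To give a flavor of what those ioctls look like, here is a rough sketch of submitting one isochronous URB through usbfs (the endpoint address and packet sizes are made-up values; the fd is the one from UsbDeviceConnection.getFileDescriptor()):

#include <linux/usbdevice_fs.h>
#include <stdlib.h>
#include <sys/ioctl.h>

#define EP_ISO_IN 0x81   /* assumed isochronous IN endpoint */
#define PKT_SIZE  192    /* assumed wMaxPacketSize */
#define NUM_PKTS  8

int submit_iso_urb(int fd, void *buffer)
{
    /* usbfs URBs carry a variable-length array of packet descriptors */
    struct usbdevfs_urb *urb;
    int i;

    urb = calloc(1, sizeof(*urb) +
                    NUM_PKTS * sizeof(struct usbdevfs_iso_packet_desc));
    if (!urb)
        return -1;

    urb->type = USBDEVFS_URB_TYPE_ISO;
    urb->endpoint = EP_ISO_IN;
    urb->flags = USBDEVFS_URB_ISO_ASAP;
    urb->buffer = buffer;
    urb->buffer_length = NUM_PKTS * PKT_SIZE;
    urb->number_of_packets = NUM_PKTS;
    for (i = 0; i < NUM_PKTS; i++)
        urb->iso_frame_desc[i].length = PKT_SIZE;

    /* The URB must stay allocated until USBDEVFS_REAPURB (or
     * REAPURBNDELAY) later returns it as completed. */
    return ioctl(fd, USBDEVFS_SUBMITURB, urb);
}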
I've written most of the code required to do the parsing for audio devices, and I got all the way up to trying to submit an isochronous transfer, at which point I started getting errors.
After switching to libusb I discovered that the problem, in my case, was that the audio device also had HID controllers, and the default driver was attaching to those and stealing all the bandwidth away from the isochronous transfer. Had I known this earlier I might have persevered with the non-root, non-libusb method. As it was, I did get isochronous transfers working through libusb, but it required a rooted device :(
At some point I'll go back to it.
In summary, I'm pretty sure it's possible, but it's not gonna be easy!
You can find a runnable solution of the 64-bit UsbIso on my GitHub repo:
https://github.com/Peter-St/Android-UVC-Camera/tree/master/app/src/main/java/humer/uvc_camera/UsbIso64
You need all 5 files of the UsbIso64 folder and can use USBIso like the following:
USBIso usbIso64 = new USBIso(camDeviceConnection.getFileDescriptor(),
        packetsPerRequest, maxPacketSize, (byte) camStreamingEndpoint.getAddress());
usbIso64.preallocateRequests(activeUrbs);
usbdevice_fs_util.setInterface(camDeviceConnection.getFileDescriptor(),
        camStreamingInterface.getId(), altSetting);
usbIso64.submitUrbs();
while (true) { // reap each completed request, process it, then resubmit it
    USBIso.Request req = usbIso64.reapRequest(true);
    req.initialize();
    req.submit();
}
You can use this version of UsbIso with 32-bit and 64-bit devices.
So far,
Peter