How can input received over SPI be interpreted by Android?

This is my first question on Stack Overflow, even though I'm a long-time reader of this problem-solving site.
Anyway, this is the issue I'm facing:
I'm trying to connect two eval boards over an SPI bus:
The first one (the source of data) simulates a touchscreen and runs a Linux distro (for now: Raspbian).
The second one runs embedded Android.
I would like to connect the two over SPI and send the touch sequence from the Linux board to the Android one (following the multi-touch protocol: https://www.kernel.org/doc/Documentation/input/multi-touch-protocol.txt).
spidev is enabled, but I have no idea how to "perform" the touches I will receive.
From what I can see, I can't use Android input device configuration files (https://source.android.com/devices/input/input-device-configuration-files.html) because they can't rely on SPI communication.
Must I create a driver in the Linux kernel, then? What is the best practice in this particular situation?
Thanks in advance, you might be saving my internship :)

If your Android Linux kernel is set up to expose /dev/spidev (or you can enable that in the kernel), you do not have to create a Linux kernel module. You can access /dev/spidev from Android by writing an NDK wrapper in C/C++.
I have done that and it works. I would suggest that you start by writing a small C program that configures and opens a /dev/spidev SPI channel and sends/receives some test data. When that works, rewrite the C program into an NDK wrapper library you can access from an Android program.
This assumes that the Android app is one you write yourself. If you want to make the touch-events available to Android in general, I think you need to write a touch-driver as a kernel module.
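To give an idea of what that first test program can look like, here is a rough sketch using the standard spidev ioctl interface. The node name /dev/spidev0.0, the mode and the speed are assumptions that depend on your board and kernel configuration:

/* Minimal spidev smoke test (sketch): configure the bus and do one
 * full-duplex transfer. /dev/spidev0.0, mode 0 and 500 kHz are only
 * example values; adjust them for your hardware. */
#include <fcntl.h>
#include <stdint.h>
#include <stdio.h>
#include <string.h>
#include <sys/ioctl.h>
#include <unistd.h>
#include <linux/spi/spidev.h>

int main(void)
{
    int fd = open("/dev/spidev0.0", O_RDWR);
    if (fd < 0) { perror("open"); return 1; }

    uint8_t mode = SPI_MODE_0;
    uint8_t bits = 8;
    uint32_t speed = 500000;            /* 500 kHz, conservative */

    if (ioctl(fd, SPI_IOC_WR_MODE, &mode) < 0 ||
        ioctl(fd, SPI_IOC_WR_BITS_PER_WORD, &bits) < 0 ||
        ioctl(fd, SPI_IOC_WR_MAX_SPEED_HZ, &speed) < 0) {
        perror("ioctl(configure)");
        close(fd);
        return 1;
    }

    uint8_t tx[4] = { 0xDE, 0xAD, 0xBE, 0xEF };
    uint8_t rx[4] = { 0 };
    struct spi_ioc_transfer tr;
    memset(&tr, 0, sizeof(tr));
    tr.tx_buf = (unsigned long)tx;
    tr.rx_buf = (unsigned long)rx;
    tr.len = sizeof(tx);
    tr.speed_hz = speed;
    tr.bits_per_word = bits;

    if (ioctl(fd, SPI_IOC_MESSAGE(1), &tr) < 0) {
        perror("ioctl(transfer)");
        close(fd);
        return 1;
    }

    printf("received: %02x %02x %02x %02x\n", rx[0], rx[1], rx[2], rx[3]);
    close(fd);
    return 0;
}

Once that works natively, the same open()/ioctl() calls can be wrapped in a small JNI library and called from your app.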

Related

How to access GPIO/SPI/I2C from NDK on Android (not Android Things)

I'm doing some prototyping with Android on a Raspberry Pi 3B+ with Lineage OS (https://konstakang.com/devices/rpi3/LineageOS16.0/), which I think is based on AOSP. I'm now able to run a Hello JNI app with the NDK. However, I need access to the GPIO/SPI/I2C interfaces from the NDK to control sensors. The C libraries that I used on Raspberry Pi OS, wiringPi and PIGPIO, do not support Android out of the box.
I kinda understand how to access GPIO after watching this course: https://www.coursera.org/learn/internet-of-things-sensing-actuation/home/week/2. As I understand it, it basically comes down to finding the correct sysfs file for the GPIO and writing a value to it to control the pin.
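Concretely, what I have in mind for plain GPIO is something like the sketch below (the pin number 18 is arbitrary, the legacy sysfs interface has to be enabled in the kernel, and on Android I probably also need root or relaxed SELinux permissions):

/* Sketch: drive a GPIO through the legacy sysfs interface.
 * GPIO 18 is an arbitrary example; the numbering is SoC-specific. */
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <unistd.h>

static int write_str(const char *path, const char *value)
{
    int fd = open(path, O_WRONLY);
    if (fd < 0) { perror(path); return -1; }
    ssize_t n = write(fd, value, strlen(value));
    close(fd);
    return n < 0 ? -1 : 0;
}

int main(void)
{
    /* Make the pin visible under /sys/class/gpio/gpio18/ */
    write_str("/sys/class/gpio/export", "18");
    /* Configure it as an output, set it high, then low */
    write_str("/sys/class/gpio/gpio18/direction", "out");
    write_str("/sys/class/gpio/gpio18/value", "1");
    sleep(1);
    write_str("/sys/class/gpio/gpio18/value", "0");
    return 0;
}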
However, even though I understand that SPI is built on top of GPIO pins, I couldn't figure out which file I need to access to control that interface, or how to deal with the four-pin setup for SPI instead of interacting with a single pin as for GPIO.
As a newbie to Android and embedded development, I assume that understanding how these interfaces work under the hood on Android would help as well, but I'm struggling to find documentation or code examples.
Any help on how GPIO/SPI/I2C work in Android, or on how to enable them on an RPi running Android, is appreciated!

Why are SBC codec capabilities in BlueZ 5.35 initialized in android/hal-audio-sbc.c and not in AVDTP.c?

I am updating BlueZ 4.97 to 5.35 on my embedded device.
For an A2DP connection, we have to share the SBC codec capabilities. In the ideal case the capability looks like figure 1. In the BlueZ 4.97 code, I get the SBC codec capability from the sbc_getcap_ind() function in the AVDTP layer. In sbc_getcap_ind(), both sbc_codec_cap and avdtp_media_codec_capability are initialized, so I can send this capability packet back to the phone.
In 5.35, the sbc_getcap_ind() function is not available. avdtp_media_codec_capability is set in the endpoint_getcap_ind() function in the AVDTP layer, which matches my expectation. But sbc_codec_cap is not initialized, so I am getting a packet like in figure 2.
BlueZ 5.35 introduces the new file android/hal-audio-sbc.c, in which the SBC codec capabilities are set.
My embedded device is RTOS-based and has nothing to do with Android, so I have the following questions:
1) Why is there a new android package in the BlueZ stack? What's the development idea behind it?
2) Why are the SBC capabilities initialized in android/hal-audio-sbc.c? How is a non-Android device supposed to access the SBC capabilities?
3) How can I use android/hal-audio-sbc.c in my embedded environment to get the SBC capabilities?
I think I am unable to resolve this issue because I am missing an understanding of the new 5.35 architecture, and there is not enough documentation on the BlueZ architecture. I hope that with answers to these questions I can understand the significance of the android folder in the BlueZ 5.35 package.
Before answering your questions, I would like to share a couple of URLs:
Porting guide
Management interface
Coming to your questions:
BlueZ now supports both the Android and Linux platforms. The "android" directory contains only sources related to the Android platform, which can't be used in a Linux environment. The idea behind this is to share the common code between Linux and Android and to keep the common functionality separate (mostly under the "src", "gdbus" and "profiles" directories).
As part of the BlueZ 4 to BlueZ 5 migration, all the audio-related implementation was moved out of BlueZ. It is now the responsibility of the audio application to implement the whole thing on its own and register with BlueZ (doc/profile-api.txt ==> RegisterProfile() method). BlueZ only acts as the mediator between your application and the devices. As far as Linux is concerned, there is no audio implementation inside BlueZ (I am not sure about the android directory under BlueZ), so non-Android platforms need to implement it on their own.
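To illustrate the registration call mentioned above, here is a rough GDBus sketch of org.bluez.ProfileManager1.RegisterProfile(). The object path is a placeholder and the UUID shown is the Serial Port Profile UUID, used purely as an example; A2DP audio endpoints actually go through BlueZ's separate media API (doc/media-api.txt), so take this only as the general shape of the D-Bus call:

/* Sketch: register an external profile with BlueZ 5 over D-Bus.
 * Build with: gcc $(pkg-config --cflags --libs gio-2.0) ...
 * A real application must also export an org.bluez.Profile1 object
 * at the chosen path to receive NewConnection() callbacks. */
#include <gio/gio.h>

int main(void)
{
    GError *err = NULL;
    GDBusConnection *bus = g_bus_get_sync(G_BUS_TYPE_SYSTEM, NULL, &err);
    if (!bus) {
        g_printerr("system bus: %s\n", err->message);
        return 1;
    }

    GVariantBuilder opts;
    g_variant_builder_init(&opts, G_VARIANT_TYPE("a{sv}"));
    g_variant_builder_add(&opts, "{sv}", "Name",
                          g_variant_new_string("My serial profile"));

    GVariant *ret = g_dbus_connection_call_sync(
        bus, "org.bluez", "/org/bluez", "org.bluez.ProfileManager1",
        "RegisterProfile",
        g_variant_new("(osa{sv})",
                      "/my/app/profile",                        /* placeholder path */
                      "00001101-0000-1000-8000-00805f9b34fb",   /* SPP UUID, example */
                      &opts),
        NULL, G_DBUS_CALL_FLAGS_NONE, -1, NULL, &err);
    if (!ret) {
        g_printerr("RegisterProfile: %s\n", err->message);
        return 1;
    }
    g_variant_unref(ret);
    g_print("profile registered\n");
    /* ... run a GMainLoop and serve the org.bluez.Profile1 callbacks ... */
    g_object_unref(bus);
    return 0;
}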
As mentioned, you need to implement your own audio-related profiles on top of BlueZ. One working piece of software that does this is PulseAudio: you can load the module-bluez-discover module in PulseAudio (pactl) and PulseAudio takes care of the audio.
There is also another open-source solution, bluealsa, which is currently under active development. After using it, I saw a lot of audio delay and lower quality. If you want a perfect solution, implement it on your own, or use PulseAudio (not very real-time).
In simple words, migrating an application from BlueZ 4.x to BlueZ 5.x is not easy!

How can I bind my own serial device to a driver?

I am currently building a device and its driver for an embedded Android system.
The device is basically an embedded Linux system behaving like a touchscreen.
The device and the embedded Android board are connected over UART (I am using Lemaker Guitar eval boards). The communication is working: I receive the data sent on /dev/ttyS0 (using minicom or cat /dev/ttyS0).
Now, I need to create the driver that will receive this input (touches, i.e. coordinates; the protocol is already written).
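To make the setup concrete, a plain userspace reader of the port looks roughly like this (a sketch; 115200 8N1 is only an assumed configuration and has to match the sender):

/* Sketch: open /dev/ttyS0 raw and dump incoming bytes.
 * 115200 8N1 is only an example; it must match the Linux side. */
#include <fcntl.h>
#include <stdio.h>
#include <termios.h>
#include <unistd.h>

int main(void)
{
    int fd = open("/dev/ttyS0", O_RDONLY | O_NOCTTY);
    if (fd < 0) { perror("open"); return 1; }

    struct termios tio;
    if (tcgetattr(fd, &tio) < 0) { perror("tcgetattr"); return 1; }
    cfmakeraw(&tio);                 /* no line editing, no translation */
    cfsetispeed(&tio, B115200);
    cfsetospeed(&tio, B115200);
    tio.c_cc[VMIN]  = 1;             /* block until at least 1 byte */
    tio.c_cc[VTIME] = 0;
    if (tcsetattr(fd, TCSANOW, &tio) < 0) { perror("tcsetattr"); return 1; }

    unsigned char buf[64];
    for (;;) {
        ssize_t n = read(fd, buf, sizeof(buf));
        if (n <= 0)
            break;
        for (ssize_t i = 0; i < n; i++)
            printf("%02x ", buf[i]);
        printf("\n");
    }
    close(fd);
    return 0;
}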
I found this resource and its lab.
We can read there that:
"The driver name must be “atmel_usart” to match the device definitions in arch/arm/mach-at91/"
So I looked into it and found that the device (i.e. the embedded Linux board) has to be declared in the device tree OR registered as a platform_device with a name matching the name of the platform_driver.
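If I understand that correctly, the matching boils down to something like this skeleton (the name "my_touch" is made up for illustration):

/* Skeleton of the platform_device / platform_driver name matching
 * described above. "my_touch" is a made-up name for illustration. */
#include <linux/module.h>
#include <linux/platform_device.h>

static int my_touch_probe(struct platform_device *pdev)
{
    dev_info(&pdev->dev, "probed\n");
    return 0;
}

static int my_touch_remove(struct platform_device *pdev)
{
    return 0;
}

static struct platform_driver my_touch_driver = {
    .probe  = my_touch_probe,
    .remove = my_touch_remove,
    .driver = {
        /* Must match the platform_device name or a device-tree
         * compatible entry, otherwise probe() is never called. */
        .name = "my_touch",
    },
};

module_platform_driver(my_touch_driver);

MODULE_LICENSE("GPL");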
That made me question my approach to this problem a lot.
I can read from /dev/ttyS0, but this is just a device node, not a device. Is that true?
When implementing the platform_device structure, we must allocate resources. Is there any way to let the system handle the resources?
I've also seen another kernel subsystem that could be used, but I could not find any real documentation on it: serio. Might it be a better path to resolve my issues?
Thanks for reading. I am open to suggestions for solving this issue: what shape should my driver have?
Thanks again, you might be saving my internship :) :)
EDIT :
My original wording was not clear enough.
So I have two parts: an embedded Linux system (i.e. Ubuntu Mate) that will behave like a touchscreen. It will send the coordinates of the touches to an embedded Android system.
The embedded Linux board is connected to the Android board via a UART serial link; this communication works. Now, I want to write a driver in order to perform the touches on the Android side.
Here is the block diagram:
Thanks again :)

Android and Guitar Hero controller

I know that the latest versions of Android (Honeycomb and ICS) have support for joysticks and gamepads.
Guitar Hero (and Garage Band) controllers are essentially USB HID devices, right?
So my question:
Is it possible to receive data (button clicks) from the Guitar Hero (or Rock Band) controllers on an Android device?
Would Android understand it as gamepad input?
P.S. All I need is to detect in my game the input from those five buttons on the plastic guitar fret.
A good starting point would be to review the Linux source code for Frets on Fire, which supports some of the Guitar Hero controllers.
Frets on Fire: SourceForge
SVN: https://fretsonfire.svn.sourceforge.net/svnroot/fretsonfire
It looks like it would be difficult to universally support all controllers from the different platforms. Each console has its own protocol, but it does look like joystick-to-keyboard emulation is possible on the PC with the PS3 controller. There is a config file for the PS3 controller at the second link that may be helpful; it's for JoyToKey (which isn't open source), but some of the values in the config may help you.
Hey, this is a really cool idea. Start here:
http://developer.android.com/guide/topics/usb/host.html
Then, check out this sample:
http://developer.android.com/resources/samples/USB/MissileLauncher/index.html
In that sample, there is a method named setDevice(UsbDevice device).
If I were implementing this, I would start with a duplicate of the MissileLauncher project and modify this setDevice method. I would log everything I could possibly find about the UsbDevice device and try experimenting with these conditionals in the setDevice method:
if (ep.getType() != UsbConstants.USB_ENDPOINT_XFER_INT) {
...
if (intf.getEndpointCount() != 1) {
While the MissileLauncher uses this type and this endpoint count, it is very likely that the Guitar Hero / Garage Band controller will have different values.
Also, check out the run method to see an example of back-and-forth communication.
DISCLAIMER: I have no idea if this will work. I've also seen blogs stating this cannot be done.
My guess is that it should detect the data. This is even possible on existing Android devices, but it is not Android Market friendly.
To accomplish this on pre-4.0 devices you must provide an alternative power source to the USB port. Usually this can be done with a cheap USB hub that leaks power. The device also must be rooted. Once this is done, you need to create an interface to the device shell to launch native code, e.g. C, outside of your Dalvik VM. Use your C code to bind to the appropriate socket and you should be able to bus data back and forth. You may also need to compile a kernel module to provide driver support.
Like I said, this is a complete hack for devices below 4.0. But it is possible.

Android phone as computer mouse

I created an Android app that serves the touch screen sensor data to a Java client listening on a Debian Lenny machine.
The client maps this data to locations on the screen just like a Wacom pad does. I would like to output the x_loc and y_loc to a file and have that file recognized as a device. (I foggily believe this is how it is supposed to work.)
I have experience with Linux but have not had to create a device before. How do I tell Linux that this file is a mouse? Do I have to create a driver?
There are many ways to do this, ranging from writing an actual device driver, through writing X clients that generate X events (using the XTest extension, for example), to using kernel interfaces to inject input-subsystem events.
I'd go with the last one and use the uinput subsystem. It's part of pretty much every recent kernel and provides /dev/uinput, which you can open normally and perform various ioctls on to create input devices from regular userspace.
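Here is a minimal sketch of that approach using the classic uinput_user_dev interface: it creates a virtual relative pointer, moves it a little and clicks once. The device name and values are illustrative only:

/* Sketch: create a virtual relative-pointer device via /dev/uinput
 * (classic uinput_user_dev interface), move it and click once. */
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/input.h>
#include <linux/uinput.h>

static void emit(int fd, int type, int code, int value)
{
    struct input_event ev;
    memset(&ev, 0, sizeof(ev));
    ev.type = type;
    ev.code = code;
    ev.value = value;
    write(fd, &ev, sizeof(ev));
}

int main(void)
{
    int fd = open("/dev/uinput", O_WRONLY | O_NONBLOCK);
    if (fd < 0)
        return 1;

    /* Declare which event types/codes the virtual device can send */
    ioctl(fd, UI_SET_EVBIT, EV_KEY);
    ioctl(fd, UI_SET_KEYBIT, BTN_LEFT);
    ioctl(fd, UI_SET_EVBIT, EV_REL);
    ioctl(fd, UI_SET_RELBIT, REL_X);
    ioctl(fd, UI_SET_RELBIT, REL_Y);

    struct uinput_user_dev uidev;
    memset(&uidev, 0, sizeof(uidev));
    snprintf(uidev.name, UINPUT_MAX_NAME_SIZE, "phone-touch-mouse");
    uidev.id.bustype = BUS_VIRTUAL;
    write(fd, &uidev, sizeof(uidev));
    ioctl(fd, UI_DEV_CREATE);
    sleep(1);                        /* give udev/X time to pick it up */

    /* Move 10 units right/down, click, and flush with EV_SYN */
    emit(fd, EV_REL, REL_X, 10);
    emit(fd, EV_REL, REL_Y, 10);
    emit(fd, EV_SYN, SYN_REPORT, 0);
    emit(fd, EV_KEY, BTN_LEFT, 1);
    emit(fd, EV_SYN, SYN_REPORT, 0);
    emit(fd, EV_KEY, BTN_LEFT, 0);
    emit(fd, EV_SYN, SYN_REPORT, 0);

    ioctl(fd, UI_DEV_DESTROY);
    close(fd);
    return 0;
}

Your client would feed the coordinates it receives from the phone into events like these, using EV_ABS instead of EV_REL if you want absolute, tablet-style positioning.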
Please also note that some mechanisms for this already exist. Bluetooth Human Interface Devices, which work just fine on Linux, are one example. rinputd, a daemon that listens for rinput clients and generates uinput events based on the data they send, is another. You might want to consider simply making your Android app act as an rinput client.
You can either write a Linux device driver to present your data as a genuine mouse, or you can convince the X server (or whatever else) to accept input from something else, such as a named pipe.
Actual device files are not files with content; they are merely references to a major and minor number used to talk to a driver in the kernel, which can perform vaguely file-like operations on some device. You create device files with mknod, but they won't work until they are backed by a kernel driver with matching numbers. I believe there are now some stub mechanisms so that the bulk of the actual driver can run in userspace.
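For instance, 13:63 is the input subsystem's "mice" multiplexer, so a node pointing at it can be recreated anywhere (as root); the sketch below only does something useful because the kernel driver behind those numbers exists:

/* Sketch: a device node is just (major, minor) metadata. 13:63 is the
 * kernel input subsystem's "mice" multiplexer; the node is useless
 * unless that driver is present to back it. */
#include <sys/stat.h>
#include <sys/sysmacros.h>   /* makedev() */

int main(void)
{
    return mknod("/tmp/my-mice", S_IFCHR | 0600, makedev(13, 63));
}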
