Editing Functionality of Host Card Emulation in Android

I'm currently developing a project for my university course in which I hope to modify the functionality of Android's HCE feature so that I can set my own UID when emulating a card.
I've downloaded the AOSP source, built a custom image with no code changes (including downloading and incorporating the vendor-specific hardware drivers), and installed it on my Nexus 7. Now I'm stuck on the next part.
I simply cannot find the code that governs Android's NFC features, and I'm unsure how to go about (a) locating it and (b) editing it.
Is the NFC code part of the base kernel? If so, how would I edit it before running "make" again and hoping it builds? Or is it elsewhere? I've noticed that the files in the vendor folder I downloaded and extracted are in a .ncd format, which I don't think is editable.
Any help I can get on this would be greatly appreciated.

OK! So I've found a solution to the problem I was having!
On the Nexus 7, when NFC is turned on, the stack reads its configuration from a file in /etc/ called "libnfc-brcm-20791b05.conf".
Inside this file there is a parameter called "NFA_DM_START_UP_CFG".
By default, it looks like this:
NFA_DM_START_UP_CFG={42:CB:01:01:A5:01:01:CA:14:00:00:00:00:0E:C0:D4:01:00:0F:00:00:00:00:C0:C6:2D:00:14:0A:B5:03:01:02:FF:80:01:01:C9:03:03:0F:AB:5B:01:00:B2:04:E8:03:00:00:CF:02:02:08:B1:06:00:20:00:00:00:12:C2:02:01:C8}
To edit the UID that is generated during emulation, you need to add some bytes to the end of this parameter.
The first byte you add is 0x33 (this means that you are going to set the UID manually).
The second byte you add is the length of the UID you wish to set (this can be 4, 7 or 10 bytes, so this second byte can be 0x04, 0x07 or 0x0A).
The next bytes are then the UID you wish to set! (NOTE: depending on how many bytes you add, you must also change the first byte of the array to reflect the new size of the array. It starts at 0x42, so if you were to add 6 bytes it should change to 0x48.)
For example, if you wished to set a 7-byte UID of 01 02 03 04 05 06 07, the new config line would look like this:
NFA_DM_START_UP_CFG={4B:CB:01:01:A5:01:01:CA:14:00:00:00:00:0E:C0:D4:01:00:0F:00:00:00:00:C0:C6:2D:00:14:0A:B5:03:01:02:FF:80:01:01:C9:03:03:0F:AB:5B:01:00:B2:04:E8:03:00:00:CF:02:02:08:B1:06:00:20:00:00:00:12:C2:02:01:C8:33:07:01:02:03:04:05:06:07}
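For anyone scripting this change, here is a minimal C sketch of the byte arithmetic described above. The function name and buffer handling are my own; it assumes, as the example implies, that the leading byte counts the bytes that follow it.

#include <stddef.h>
#include <stdint.h>

/* Appends a manually set UID entry to an NFA_DM_START_UP_CFG byte array.
 * cfg[0] holds the number of bytes that follow it; the caller must make
 * sure the buffer has room for the extra 2 + uid_len bytes.
 * Returns the new total size of the array, or 0 on error. */
size_t append_uid(uint8_t *cfg, size_t cfg_len, const uint8_t *uid, uint8_t uid_len)
{
    if (uid_len != 4 && uid_len != 7 && uid_len != 10)
        return 0;                       /* only 4-, 7- and 10-byte UIDs are valid */

    cfg[cfg_len]     = 0x33;            /* "set the UID manually" type byte */
    cfg[cfg_len + 1] = uid_len;         /* length of the UID that follows */
    for (uint8_t i = 0; i < uid_len; i++)
        cfg[cfg_len + 2 + i] = uid[i];

    cfg[0] += 2 + uid_len;              /* bump the leading length byte (0x42 -> 0x4B above) */
    return cfg_len + 2 + uid_len;
}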
You can then push this config file to your Nexus device using adb:
-> adb root
-> adb remount
-> adb push libnfc-brcm-20791b05.conf /etc/
-> adb reboot
This reboots the Nexus with the new config file in place, and when a card is emulated the UID will now be set to 01 02 03 04 05 06 07.
Hope this helps anyone reading my question!

Android's NFC stack is basically split into five parts:
The NFC interface device driver. This is part of the kernel. In a nutshell, this driver simply tunnels data frames (e.g. NCI protocol frames) between a character device file and the NFC controller hardware. You won't have to touch that part for your project.
The low-level interface library written in C (libnfc-nci, or libnfc-nxp for devices with NXP's PN544 NFC controller). This library provides a set of high-level functions to interact with the NFC controller. So it basically translates high-level functionality (e.g. "start polling for technologies X, Y and Z") into NCI commands that are sent to the NFC controller through the kernel driver. This is certainly a place where you will need to add modifications. As it's part of AOSP you can compile it using the normal AOSP build system.
The JNI interface library written in C++ (libnfc_nci_jni). This layer connects the libnfc-nci C library with high-level Java code. If you want to modify the emulated UID from Android apps, this is certainly a place where you will need to add modifications (a purely illustrative JNI sketch follows after this list). As it's part of AOSP you can compile it using the normal AOSP build system.
The Android NFC system service written in Java. This service takes control over the whole NFC stack and implements the high-level functionality based on the resources provided by libnfc-nci. If you want to modify the emulated UID from Android apps, this is certainly a place where you will need to add modifications. As it's part of AOSP you can compile it using the normal AOSP build system.
The Android core framework provides an API to the functionality of the NFC system service that can be accessed by Android apps.
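To give a rough idea of how a UID setting could travel through layer 3, here is the purely illustrative JNI sketch mentioned above. The Java class name, the native method name and the helper nfa_set_uid_config() are hypothetical and not taken from the AOSP sources; the real JNI layer calls into the NFA API exposed by libnfc-nci.

#include <jni.h>

/* Hypothetical helper that would forward the UID to libnfc-nci (layer 2),
 * which in turn would translate it into the corresponding NCI commands. */
extern int nfa_set_uid_config(const unsigned char *uid, unsigned char len);

JNIEXPORT jboolean JNICALL
Java_com_example_nfc_NativeNfcManager_doSetEmulatedUid(JNIEnv *env, jobject obj,
                                                       jbyteArray uidArray)
{
    jsize len = (*env)->GetArrayLength(env, uidArray);
    if (len != 4 && len != 7 && len != 10)
        return JNI_FALSE;               /* only 4-, 7- and 10-byte UIDs make sense */

    unsigned char uid[10];
    (*env)->GetByteArrayRegion(env, uidArray, 0, len, (jbyte *)uid);

    /* Hand the UID down to the C library. */
    return nfa_set_uid_config(uid, (unsigned char)len) == 0 ? JNI_TRUE : JNI_FALSE;
}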
With regard to setting/modifying the emulated UID you will certainly want to have a look at these projects that I recently published on GitHub (though they are still work in progress):
https://github.com/mobilesec/swr50-android-external_libnfc-nci
https://github.com/mobilesec/swr50-android-packages_apps_nfcwatch1

Related

How input received by SPI may be interpreted by Android

This is my first question on Stack Overflow, even though I'm a long-time reader of this problem-solving resource.
Anyway, this is the issue I'm facing:
I'm trying to connect two eval boards over an SPI bus:
The first one (the data source) simulates a touchscreen and runs a Linux distro (for now, Raspbian).
The second one is an embedded Android board.
I would like to connect the two over SPI and send the touch sequence from the Linux board to the Android one (following the multi-touch protocol: https://www.kernel.org/doc/Documentation/input/multi-touch-protocol.txt).
spidev is enabled, but I have no idea how to "perform" the touches I will receive.
From what I can see, I can't use Android input device configuration files (https://source.android.com/devices/input/input-device-configuration-files.html) because they can't rely on SPI communication.
Must I create a driver in the Linux kernel, then? What is the "best practice" in this particular situation?
Thanks in advance, you might be saving my internship :)
If your Android Linux kernel is set up to expose /dev/spidev (or you can set that up in the kernel), you do not have to create a Linux kernel module. You can access /dev/spidev from Android by writing an NDK wrapper in C/C++.
I have done that and it works. I would suggest that you start by writing a small C program that configures and opens a /dev/spidev SPI channel and sends/receives some test data. When that works, rewrite the C program into an NDK wrapper library you can access from an Android program.
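To illustrate, here is a minimal sketch of such a test program. The device node /dev/spidev0.0, SPI mode 0, 8-bit words and the 1 MHz clock are assumptions that must be adapted to your boards; once this works natively, the same calls can be moved into an NDK wrapper library.

/* Minimal spidev test sketch: configure the bus and do one full-duplex transfer. */
#include <fcntl.h>
#include <stdint.h>
#include <stdio.h>
#include <string.h>
#include <sys/ioctl.h>
#include <unistd.h>
#include <linux/spi/spidev.h>

int main(void)
{
    int fd = open("/dev/spidev0.0", O_RDWR);    /* device node is an assumption */
    if (fd < 0) { perror("open"); return 1; }

    uint8_t mode = SPI_MODE_0;
    uint8_t bits = 8;
    uint32_t speed = 1000000;                   /* 1 MHz, adjust to your hardware */
    ioctl(fd, SPI_IOC_WR_MODE, &mode);
    ioctl(fd, SPI_IOC_WR_BITS_PER_WORD, &bits);
    ioctl(fd, SPI_IOC_WR_MAX_SPEED_HZ, &speed);

    uint8_t tx[4] = { 0xDE, 0xAD, 0xBE, 0xEF }; /* arbitrary test data */
    uint8_t rx[4] = { 0 };
    struct spi_ioc_transfer tr;
    memset(&tr, 0, sizeof(tr));
    tr.tx_buf = (unsigned long)tx;
    tr.rx_buf = (unsigned long)rx;
    tr.len = sizeof(tx);
    tr.speed_hz = speed;
    tr.bits_per_word = bits;

    if (ioctl(fd, SPI_IOC_MESSAGE(1), &tr) < 0) { perror("SPI_IOC_MESSAGE"); return 1; }
    printf("received: %02x %02x %02x %02x\n", rx[0], rx[1], rx[2], rx[3]);

    close(fd);
    return 0;
}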
This assumes that the Android app is one you write yourself. If you want to make the touch-events available to Android in general, I think you need to write a touch-driver as a kernel module.

Linux or Android Driver for KR070PC7S LCD Display

I'm trying to build an Android Linux kernel for an Amlogic Meson3 processor. Specifically, the currently running Linux shows the display configuration as
CONFIG_AML_TCON_KR070PC7S
When searching on Google, it turns out to be a "60P LCD screen KR070PC7S". However, it seems Google does not search the source code repositories around the web to give an answer.
Thus the questions:
Where can you find the driver source code, or a pre-built object binary?
Where can you find the datasheet of the LCD if the driver is not available?
Which other devices (model, brand) use the same display?
Is 60P LCD a standard interface? Where is the spec?
Thank you in advance for any answers or hints.
Google does normally index CONFIG_xxx options that are part of the mainline Linux kernel OK. I suspect you're running a custom kernel where the changes haven't been merged back into the mainline and the source hasn't been placed in a publicly indexable area.
There is nothing wrong with the latter under the terms of the GPL as far as I know, but the original hardware vendor should make the modified kernel source available to customers upon request, assuming they have based the driver on GPL code. Otherwise they have no obligation to provide datasheets, code, etc.

Does Android (on ARM) have hardware performance counters?

In Linux on an Intel processor, we have a large number of hardware performance counters to access. Previously, using user-space software called perfmon2, I could get values such as the cache miss rate, the number of cycles the CPU stalled for a given reason (e.g., an L1 cache miss), and so on.
My question is: do we have that in Android? Since it runs on ARM, I don't expect the performance monitor counter support to be as strong as on x86, right?
ARM11 and Cortex-A/R do have hardware performance counters. You can check that on the official ARM website on this page: http://infocenter.arm.com/help/index.jsp?topic=/com.arm.doc.faqs/ka4237.html
Since Android is also running on top of Linux, you can access performance counters through user-space software via the "Perf Events" subsystem. However, you will need to write native C code, which includes "perf_event.h" and build it using the Android NDK. Before you start writing code, I suggest you take a look at this project: platform/external/linux-tools-perf (https://android.googlesource.com/platform/external/linux-tools-perf/), which may be exactly what you are looking for. I don't know about any alternatives which allow doing this directly from Java Code.
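To illustrate the native route, here is a minimal sketch that uses the raw perf_event_open syscall (there is no glibc wrapper) to count CPU cycles around a piece of code. Whether a given counter is available still depends on the kernel configuration and the ARM PMU support on your device.

/* Minimal perf_events sketch: count CPU cycles around a workload.
 * Requires a kernel built with CONFIG_PERF_EVENTS and access rights
 * to the perf interface (root on most Android builds). */
#include <linux/perf_event.h>
#include <stdint.h>
#include <stdio.h>
#include <string.h>
#include <sys/ioctl.h>
#include <sys/syscall.h>
#include <unistd.h>

static int perf_event_open(struct perf_event_attr *attr, pid_t pid,
                           int cpu, int group_fd, unsigned long flags)
{
    return syscall(__NR_perf_event_open, attr, pid, cpu, group_fd, flags);
}

int main(void)
{
    struct perf_event_attr attr;
    memset(&attr, 0, sizeof(attr));
    attr.type = PERF_TYPE_HARDWARE;
    attr.size = sizeof(attr);
    attr.config = PERF_COUNT_HW_CPU_CYCLES;   /* PERF_COUNT_HW_CACHE_MISSES etc. also work */
    attr.disabled = 1;
    attr.exclude_kernel = 1;

    int fd = perf_event_open(&attr, 0, -1, -1, 0);   /* this process, any CPU */
    if (fd < 0) { perror("perf_event_open"); return 1; }

    ioctl(fd, PERF_EVENT_IOC_RESET, 0);
    ioctl(fd, PERF_EVENT_IOC_ENABLE, 0);

    volatile long sum = 0;                    /* the workload being measured */
    for (long i = 0; i < 1000000; i++) sum += i;

    ioctl(fd, PERF_EVENT_IOC_DISABLE, 0);
    long long count = 0;
    read(fd, &count, sizeof(count));
    printf("CPU cycles: %lld\n", count);

    close(fd);
    return 0;
}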
In addition to Benny's answer, I'll add the following:
The linux-tools-perf project is integrated with the AOSP (I'm using 4.2). It is under $AOSP/external/linux-tools-perf. The ./linux-tools-perf/Documentation contains a lot of info on how to run everything. The perf tool is built in $PRODUCT_OUT/system/bin/perf as part of the AOSP but not packaged; you need to explicitly push it to the target.
Check that the Android (Linux) kernel supports performance monitoring, i.e. PERF_EVENTS. Google how to get the kernel's .config, but the easiest way is running "extract-ikconfig zImage".
One needs to have root access on the target and then run something like "./perf stat -- testprog1" to see results. There are a lot of arguments to the command to record/retrieve different PM counters.
Remember most ARM implementations have multiple cores and A LOT of pipelining. Cortex-A9 has out-of-order execution. So the numbers can be a little freaky - almost meaningless sometimes.
I use the ARM PMCCNTR register, the ARM PM Cycle Counter, occasionally to test code performance. PMUSERENR.EN and PMINTENCLR.C must be set at PL1+ level and then PMCCNTR can be managed at PL0 level. See the ARM ARM for what all this means and the perf subsystem for example usage!
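For reference, reading PMCCNTR from user space on ARMv7 (32-bit) looks roughly like the sketch below; it only works once PL1 code (e.g. a small kernel module) has set PMUSERENR.EN and enabled the cycle counter, otherwise the MRC instruction faults.

/* Sketch for ARMv7 (32-bit): read the PMU cycle counter from user space. */
#include <stdint.h>
#include <stdio.h>

static inline uint32_t read_pmccntr(void)
{
    uint32_t value;
    /* MRC p15, 0, <Rt>, c9, c13, 0 : read PMCCNTR */
    __asm__ volatile("mrc p15, 0, %0, c9, c13, 0" : "=r"(value));
    return value;
}

int main(void)
{
    uint32_t start = read_pmccntr();
    volatile long sum = 0;
    for (long i = 0; i < 100000; i++) sum += i;   /* code under test */
    uint32_t end = read_pmccntr();
    printf("cycle counter delta: %u\n", end - start);
    return 0;
}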
Note: All the shell vars are set up in $AOSP/build/envsetup.sh
I think you can try ARM® HWCPipe Exporter.
ARM® HWCPipe Exporter is a Prometheus exporter written in Java and C++ that retrieves metrics from Android devices running on ARM® hardware and exports them to the Prometheus monitoring system.
Disclaimer: I am the author of ARM HWCPipe Exporter.

Android USB isochronous data transfer

I am currently trying to find a way to handle USB data transfer on an isochronous endpoint on my Android 3.2 tablet (Host Mode supported). After writing some prototype code, I noticed that the constants file entry for USB_ENDPOINT_XFER_ISOC states "Isochronous endpoint type (currently not supported)".
Is this possible without rooting the device? If so how would I go about doing this?
Ideally I was hoping to stay within the java API, but if this is possible only via the NDK I would have to pursue that instead. I also understand that there might be some USB bandwidth issues based on the following post: User mode USB isochronous transfer from device-to-host
I have written a Java class for USB isochronous data transfer under Android (or Linux): UsbIso
It uses JNA to access the USBFS API via IOCTL calls.
You "can" do it without root, I believe.
You'll need to do it all using some native C code interfacing with the USB device through usbfs. The big issue comes from the lack of documentation of Linux's usbfs. Basically everything has to be done through ioctls. That said, you open the device as you normally would from Java, and then pass the file descriptor from the UsbDeviceConnection down to the native code.
Add to that, you will need to parse all the USB descriptors yourself. You can get at them, again, from the UsbDeviceConnection. Jumping from descriptor to descriptor is simple; finding the documentation for what each descriptor means is a MASSIVE headache, but you can find most of it on www.usb.org.
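To make the ioctl part more concrete, here is a rough sketch of submitting and reaping a single isochronous IN transfer through usbfs. It assumes you already have the file descriptor from UsbDeviceConnection and have read the endpoint address and maximum packet size from the descriptors; buffer cleanup and most error handling are left out.

/* Rough sketch: submit and reap one isochronous IN transfer via usbfs. */
#include <linux/usbdevice_fs.h>
#include <stdlib.h>
#include <sys/ioctl.h>

#define NUM_PACKETS 8

int submit_iso_in(int fd, unsigned char endpoint, unsigned int packet_size)
{
    size_t urb_size = sizeof(struct usbdevfs_urb) +
                      NUM_PACKETS * sizeof(struct usbdevfs_iso_packet_desc);
    struct usbdevfs_urb *urb = calloc(1, urb_size);
    if (!urb)
        return -1;

    urb->type = USBDEVFS_URB_TYPE_ISO;
    urb->endpoint = endpoint;                       /* e.g. 0x81 for EP1 IN */
    urb->flags = USBDEVFS_URB_ISO_ASAP;
    urb->buffer_length = NUM_PACKETS * packet_size;
    urb->buffer = malloc(urb->buffer_length);
    urb->number_of_packets = NUM_PACKETS;
    for (int i = 0; i < NUM_PACKETS; i++)
        urb->iso_frame_desc[i].length = packet_size;

    if (ioctl(fd, USBDEVFS_SUBMITURB, urb) < 0)
        return -1;

    /* Block until the kernel hands a completed URB back to us. */
    struct usbdevfs_urb *completed = NULL;
    if (ioctl(fd, USBDEVFS_REAPURB, &completed) < 0)
        return -1;

    /* completed->iso_frame_desc[i].actual_length now tells how much
     * data each packet delivered into completed->buffer. */
    return 0;
}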
I've written most of the code that is required to do the parsing for audio devices and I got all the way up to trying to submit an isochronous transfer and then started getting errors.
After switching to libusb I discovered that the problem, in my case, was that the audio device also had HID controllers and the default driver was attaching to those and stealing all the bandwidth away from the isochronous transfer. Had I known this earlier I might have persevered with the non-root, non-libusb method. As it was, I did get isochronous transfers working through libusb, but it required a rooted device :(
At some point I'll go back to it.
In summary, I'm pretty sure it's possible, but it's not gonna be easy!!
You can find a runnable solution of the 64-bit UsbIso on my GitHub repo:
https://github.com/Peter-St/Android-UVC-Camera/tree/master/app/src/main/java/humer/uvc_camera/UsbIso64
You need all 5 files in the UsbIso64 folder and can use USBIso as follows:
// Bind USBIso to the device's file descriptor and the isochronous streaming endpoint
USBIso usbIso64 = new USBIso(camDeviceConnection.getFileDescriptor(), packetsPerRequest, maxPacketSize, (byte) camStreamingEndpoint.getAddress());
usbIso64.preallocateRequests(activeUrbs);
// Select the alternate setting that enables streaming on the interface
usbdevice_fs_util.setInterface(camDeviceConnection.getFileDescriptor(), camStreamingInterface.getId(), altSetting);
usbIso64.submitUrbs();
// While loop: block until a request completes, process its packet data, then recycle it //
USBIso.Request req = usbIso64.reapRequest(true);
req.initialize();
req.submit();
You can use this version of UsbIso with 32 and 64 bit devices.
So far,
Peter

Android phone as computer mouse

I created an Android app that serves the touch screen sensor data to a Java client listening on a Debian Lenny machine.
The client maps this data to locations on the screen, just like a Wacom pad does. I would like to output the x_loc and y_loc to a file and have that file recognized as a device (I vaguely believe this is how it is supposed to work).
I have experience with Linux but have not had to create a device before. How do I tell Linux that this file is a mouse? Do I have to create a driver?
There are many ways to do this, ranging from writing an actual device driver, through writing X clients that generate X events (using the XTest extension, for example), to using kernel interfaces to inject input subsystem events.
I'd go with the last one and use the uinput subsystem. That's part of pretty much all recent kernels and provides /dev/uinput, which you can open regularly and do various ioctls on to create input devices from regular userspace.
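A minimal sketch of that approach is below: it creates a virtual relative-pointer device via /dev/uinput and injects a small mouse movement. The device name is arbitrary, and for absolute coordinates you would declare EV_ABS axes instead of EV_REL.

/* Minimal uinput sketch: create a virtual mouse and move it slightly.
 * Needs write access to /dev/uinput. */
#include <fcntl.h>
#include <linux/uinput.h>
#include <string.h>
#include <sys/ioctl.h>
#include <unistd.h>

static void emit(int fd, int type, int code, int value)
{
    struct input_event ev;
    memset(&ev, 0, sizeof(ev));
    ev.type = type;
    ev.code = code;
    ev.value = value;
    write(fd, &ev, sizeof(ev));
}

int main(void)
{
    int fd = open("/dev/uinput", O_WRONLY | O_NONBLOCK);
    if (fd < 0)
        return 1;

    /* Declare which events the virtual device can generate. */
    ioctl(fd, UI_SET_EVBIT, EV_KEY);
    ioctl(fd, UI_SET_KEYBIT, BTN_LEFT);
    ioctl(fd, UI_SET_EVBIT, EV_REL);
    ioctl(fd, UI_SET_RELBIT, REL_X);
    ioctl(fd, UI_SET_RELBIT, REL_Y);

    struct uinput_user_dev uidev;
    memset(&uidev, 0, sizeof(uidev));
    strncpy(uidev.name, "android-touch-bridge", UINPUT_MAX_NAME_SIZE);   /* name is arbitrary */
    uidev.id.bustype = BUS_VIRTUAL;
    write(fd, &uidev, sizeof(uidev));
    ioctl(fd, UI_DEV_CREATE, 0);

    /* Move the pointer 10 units right and 10 units down, then sync. */
    emit(fd, EV_REL, REL_X, 10);
    emit(fd, EV_REL, REL_Y, 10);
    emit(fd, EV_SYN, SYN_REPORT, 0);

    sleep(1);
    ioctl(fd, UI_DEV_DESTROY, 0);
    close(fd);
    return 0;
}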
Please also note that some mechanisms for this already exist. Bluetooth Human Interface Devices, which work just fine on Linux, are one example. rinputd, a daemon that listens to rinput clients and generates uinput events based on the data they send, is another. You might want to consider just making your Android app act as an rinput client.
You can either write a Linux device driver to interpret your data as a genuine mouse, or you can convince the X server (or whatever else) to accept input from something else, such as a named pipe.
Actual device files are not files with any content - they are merely references to a major and minor number used to talk to a driver in the kernel, which can perform vaguely file-like operations on some device. You create device files with mknod, but they won't work until backed by a kernel driver with matching numbers. I believe there are now some stub mechanisms so that the bulk of the actual driver can run in userspace.
