I'm using Python to access an Android phone via USB. The library I'm using is PyUSB.
Is there any internal structure I can find that would lead me to the screen buffer?
Or is Python too high-level, so that I need to switch to C/C++?
If so, is there a way to access the screen buffer on Android?
Please provide detailed instructions.
A VNC server accesses the frame buffer. In droid-VNC-server, which is open source, this is done in native C (via JNI, the Java Native Interface), and I think it is only possible in low-level C. The source code is at https://github.com/oNaiPs/droidVncServer/tree/master/ ; see https://github.com/oNaiPs/droidVncServer/tree/master/jni/vnc and https://github.com/oNaiPs/droidVncServer/tree/master/jni/vnc/screenMethods for the JNI code.
See also: How does droid VNC work?
So with this you can access the frame buffer on the Android device. However, you then have to grab the image over USB using Python, which is not trivial. You can, for example, mount the Android device as a mass-storage, PTP (Picture Transfer Protocol), or MTP (Media Transfer Protocol) device, or use adb, but none of this is easy.
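For reference, what that native code ultimately does on older devices is open the Linux fbdev node and mmap it. A minimal sketch, assuming a rooted device that still exposes /dev/graphics/fb0 (many newer devices do not):

    /* Sketch: mmap the Android frame buffer via the Linux fbdev API.
     * Assumes /dev/graphics/fb0 exists and we are running as root. */
    #include <fcntl.h>
    #include <stdio.h>
    #include <sys/ioctl.h>
    #include <sys/mman.h>
    #include <unistd.h>
    #include <linux/fb.h>

    int main(void) {
        int fd = open("/dev/graphics/fb0", O_RDONLY);
        if (fd < 0) { perror("open fb0"); return 1; }

        struct fb_var_screeninfo vinfo;
        if (ioctl(fd, FBIOGET_VSCREENINFO, &vinfo) < 0) {
            perror("FBIOGET_VSCREENINFO"); return 1;
        }

        /* Simple size estimate; the real stride is fb_fix_screeninfo.line_length. */
        size_t size = vinfo.yres * vinfo.xres * vinfo.bits_per_pixel / 8;
        unsigned char *fb = mmap(NULL, size, PROT_READ, MAP_SHARED, fd, 0);
        if (fb == MAP_FAILED) { perror("mmap"); return 1; }

        printf("%ux%u, %u bpp\n", vinfo.xres, vinfo.yres, vinfo.bits_per_pixel);
        /* fb now points at the raw pixels; copy them out or encode them here. */

        munmap(fb, size);
        close(fd);
        return 0;
    }

Getting those raw pixels off the device and over USB is then the hard part described above.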
If you want an easy solution, install droid-VNC-server on your Android device and write a Python program using a library like https://pypi.python.org/pypi/vncdotool to access it.
On the website it says that even though there is no SDK provided for Android, it is possible to connect the ZED camera to an Android device with a micro-USB adapter. Is it possible to extract images from both cameras of the ZED in this case? If so, are there any code samples to accomplish this task?
I contacted the ZED support team and received the following reply:
You need to connect the camera to the USB port (but note that it will behave in USB2 mode). Then, to extract the side-by-side image of the ZED, you need to do some JNI in C (Java/Android does not allow native grabbing from USB). To do so, you can find on GitHub a libuvc wrapper for Android that reads UVC cameras such as the ZED (https://github.com/saki4510t/UVCCamera).
Enjoy!
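For what it's worth, the UVCCamera project linked above wraps libuvc, so the native side of grabbing ZED frames looks roughly like the plain libuvc sketch below. The 1344x376 side-by-side mode, the YUYV format, and the wildcard device selection are assumptions; on Android the same calls would be driven through JNI:

    /* Sketch: grab frames from a UVC camera (such as the ZED) with libuvc. */
    #include <stdio.h>
    #include <unistd.h>
    #include "libuvc/libuvc.h"

    static void frame_cb(uvc_frame_t *frame, void *user) {
        /* For the ZED, each frame is the side-by-side stereo image;
         * split it into left/right halves yourself. */
        printf("frame %u: %zu bytes\n", frame->sequence, frame->data_bytes);
    }

    int main(void) {
        uvc_context_t *ctx;
        uvc_device_t *dev;
        uvc_device_handle_t *devh;
        uvc_stream_ctrl_t ctrl;

        if (uvc_init(&ctx, NULL) < 0) return 1;
        if (uvc_find_device(ctx, &dev, 0, 0, NULL) < 0) return 1; /* any UVC device */
        if (uvc_open(dev, &devh) < 0) return 1;

        /* Negotiate an uncompressed stream; adjust size/fps to the camera. */
        uvc_get_stream_ctrl_format_size(devh, &ctrl, UVC_FRAME_FORMAT_YUYV,
                                        1344, 376, 30);
        uvc_start_streaming(devh, &ctrl, frame_cb, NULL, 0);
        sleep(10);                      /* stream for a while */
        uvc_stop_streaming(devh);

        uvc_close(devh);
        uvc_unref_device(dev);
        uvc_exit(ctx);
        return 0;
    }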
This is my first question on Stack Overflow, even though I'm a long-time reader of this problem-solving resource.
Anyway, this is the issue I'm facing:
I'm trying to connect two eval boards over an SPI bus:
The first one (the source of the data) simulates a touchscreen and runs a Linux distro (for now, Raspbian).
The second one runs embedded Android.
I would like to connect the two over SPI and send the touch sequences from the Linux board to the Android one (following the multi-touch protocol: https://www.kernel.org/doc/Documentation/input/multi-touch-protocol.txt).
spidev is enabled, but I have no idea how to "perform" the touches I will receive.
From what I can see, I can't use Android input device configuration files (https://source.android.com/devices/input/input-device-configuration-files.html), because they can't rely on SPI communication.
Must I create a driver in the Linux kernel, then? What is the best practice in this particular situation?
Thanks in advance; you might be saving my internship :)
If your Android Linux kernel is set up to expose /dev/spidev (or you can enable that in the kernel), you do not have to create a Linux kernel module. You can access /dev/spidev from Android by writing an NDK wrapper in C/C++.
I have done that, and it works. I would suggest that you start by writing a small C program that configures and opens a /dev/spidev SPI channel and sends/receives some test data (see the sketch below). When that works, rewrite the C program into an NDK wrapper library that you can access from an Android app.
This assumes that the Android app is one you write yourself. If you want to make the touch events available to Android in general, I think you need to write a touch driver as a kernel module.
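For the initial test program, something along these lines should work. The spidev node name, SPI mode, and clock speed are assumptions; adjust them for your boards:

    /* Sketch: open /dev/spidev, configure it, do one full-duplex transfer. */
    #include <fcntl.h>
    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>
    #include <sys/ioctl.h>
    #include <unistd.h>
    #include <linux/spi/spidev.h>

    int main(void) {
        int fd = open("/dev/spidev0.0", O_RDWR);
        if (fd < 0) { perror("open"); return 1; }

        uint8_t mode = SPI_MODE_0;
        uint8_t bits = 8;
        uint32_t speed = 500000;                 /* 500 kHz, conservative */
        ioctl(fd, SPI_IOC_WR_MODE, &mode);
        ioctl(fd, SPI_IOC_WR_BITS_PER_WORD, &bits);
        ioctl(fd, SPI_IOC_WR_MAX_SPEED_HZ, &speed);

        uint8_t tx[4] = { 0xDE, 0xAD, 0xBE, 0xEF };
        uint8_t rx[4] = { 0 };
        struct spi_ioc_transfer tr;
        memset(&tr, 0, sizeof tr);
        tr.tx_buf = (unsigned long) tx;          /* kernel expects a u64 address */
        tr.rx_buf = (unsigned long) rx;
        tr.len = sizeof tx;
        tr.speed_hz = speed;
        tr.bits_per_word = bits;

        if (ioctl(fd, SPI_IOC_MESSAGE(1), &tr) < 0) { perror("transfer"); return 1; }
        printf("received: %02x %02x %02x %02x\n", rx[0], rx[1], rx[2], rx[3]);
        close(fd);
        return 0;
    }

Once this round-trips test data between the two boards, wrapping the same calls in an NDK library is mostly boilerplate.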
Background:
I'm trying to integrate a new sensor into an Android platform. For development purposes, I am using an Nvidia Jetson TK1 dev board and a Spark Core. The Spark Core communicates with the sensor and outputs the data serially over USB.
At a high level, my needs are:
To be able to read/write serial data to the Spark Core over USB
To handle the data with an Android Service written against the Android APIs
To accomplish all of this at high speed
In the future when I become more experienced at working with the HAL, I may eliminate the Spark Core completely and use the GPIO pins on the Jetson to control the sensor IC.
Onto the Details:
I can read the data with the command cat /dev/ttyACM0, but I'm looking for a lower-level approach. I want to use the HAL to communicate with the device. Specifically, I want the Spark Core to show up when I cat /proc/bus/input/devices.
Then I want to be able to read the data using getevent /dev/input/eventXX.
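For reference, getevent just reads the stream of fixed-size struct input_event records from the evdev node, roughly like the sketch below (the event number is a placeholder):

    /* Sketch of what getevent does under the hood: read input_event
     * records from an evdev node. Requires permission on the node. */
    #include <fcntl.h>
    #include <stdio.h>
    #include <unistd.h>
    #include <linux/input.h>

    int main(void) {
        int fd = open("/dev/input/event2", O_RDONLY);
        if (fd < 0) { perror("open"); return 1; }

        struct input_event ev;
        while (read(fd, &ev, sizeof ev) == sizeof ev) {
            /* type/code/value mirror the columns getevent prints. */
            printf("type %u code %u value %d\n", ev.type, ev.code, ev.value);
        }
        close(fd);
        return 0;
    }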
The Main Question:
Here is my approach:
Find or develop a USB device driver in native C code
Use JNI to compile the driver within Android source code
Create a HAL module (.so binary) with a HAL definition
Compile Android source code into the kernel
Flash onto the Jetson
Profit
Is this correct? Can someone point me in the direction of a first step? I'm mostly confused because I know Android is built on the Linux kernel, and the Linux kernel should have USB device drivers built into it (right?)
I am currently trying to find a way to handle USB data transfer on an isochronous endpoint on my Android 3.2 tablet (host mode supported). After writing some prototype code, I noticed that the constants file entry for USB_ENDPOINT_XFER_ISOC states "Isochronous endpoint type (currently not supported)".
Is this possible without rooting the device? If so, how would I go about doing this?
Ideally, I was hoping to stay within the Java API, but if this is only possible via the NDK, I will have to pursue that instead. I also understand that there might be some USB bandwidth issues, based on the following post: User mode USB isochronous transfer from device-to-host
I have written a Java class for USB isochronous data transfer under Android (or Linux): UsbIso
It uses JNA to access the USBFS API via IOCTL calls.
You "can" do it without root, I believe.
You'll need to do it all in native C code, interfacing with the USB device through usbfs. The big issue is the lack of documentation of Linux's usbfs; basically everything has to be done through ioctls. That said, you open the device as you normally would from Java, and then pass the file descriptor from the UsbDeviceConnection down to your native code.
On top of that, you will need to parse all the USB descriptors yourself. You can get at them, again, from the UsbDeviceConnection. Jumping from descriptor to descriptor is simple; finding the documentation for what each descriptor means is a massive headache, but you can find most of it on www.usb.org.
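To give a flavor of what that looks like, here is a rough sketch of the usbfs ioctls involved for an isochronous IN endpoint. The interface and alternate-setting numbers, endpoint address, and packet sizes are placeholders for whatever your descriptors say; error handling and cleanup are omitted:

    /* Sketch: submit and reap an isochronous URB through usbfs, using the
     * fd obtained from UsbDeviceConnection.getFileDescriptor() in Java. */
    #include <stdlib.h>
    #include <sys/ioctl.h>
    #include <linux/usbdevice_fs.h>

    #define NPACKETS  8
    #define PKT_SIZE  192   /* wMaxPacketSize of the iso endpoint */

    int submit_iso(int fd, unsigned char ep) {
        unsigned int iface = 1;

        /* Claim the interface and select the alternate setting that
         * actually allocates isochronous bandwidth. */
        ioctl(fd, USBDEVFS_CLAIMINTERFACE, &iface);
        struct usbdevfs_setinterface si = { .interface = iface, .altsetting = 1 };
        ioctl(fd, USBDEVFS_SETINTERFACE, &si);

        /* An iso URB carries a variable-length array of packet descriptors. */
        struct usbdevfs_urb *urb = calloc(1, sizeof *urb +
                NPACKETS * sizeof(struct usbdevfs_iso_packet_desc));
        urb->type = USBDEVFS_URB_TYPE_ISO;
        urb->endpoint = ep;                 /* e.g. 0x81 for an IN endpoint */
        urb->flags = USBDEVFS_URB_ISO_ASAP;
        urb->buffer = malloc(NPACKETS * PKT_SIZE);
        urb->buffer_length = NPACKETS * PKT_SIZE;
        urb->number_of_packets = NPACKETS;
        for (int i = 0; i < NPACKETS; i++)
            urb->iso_frame_desc[i].length = PKT_SIZE;

        if (ioctl(fd, USBDEVFS_SUBMITURB, urb) < 0) return -1;

        /* Later: block until a URB completes, then inspect each packet. */
        struct usbdevfs_urb *done;
        if (ioctl(fd, USBDEVFS_REAPURB, &done) < 0) return -1;
        for (int i = 0; i < done->number_of_packets; i++) {
            /* done->iso_frame_desc[i].actual_length bytes are valid. */
        }
        return 0;
    }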
I've written most of the code required to do the parsing for audio devices, and I got all the way to submitting an isochronous transfer, at which point I started getting errors.
After switching to libusb, I discovered that the problem, in my case, was that the audio device also had HID controllers, and the default driver was attaching to those and stealing all the bandwidth from the isochronous transfer. Had I known this earlier, I might have persevered with the non-root, non-libusb method. As it was, I did get isochronous transfers working through libusb, but it required a rooted device :(
At some point I'll go back to it.
In summary, I'm pretty sure it's possible, but it's not going to be easy!
You can find a runnable solution for UsbIso on 64-bit devices in my GitHub repo:
https://github.com/Peter-St/Android-UVC-Camera/tree/master/app/src/main/java/humer/uvc_camera/UsbIso64
You need all five files in the UsbIso64 folder and can use USBIso as follows:
    // Create the isochronous transfer handler on the camera's streaming endpoint,
    // reusing the file descriptor of the already-open UsbDeviceConnection.
    USBIso usbIso64 = new USBIso(camDeviceConnection.getFileDescriptor(),
            packetsPerRequest, maxPacketSize, (byte) camStreamingEndpoint.getAddress());
    usbIso64.preallocateRequests(activeUrbs);
    // Select the alternate setting of the streaming interface that carries the video.
    usbdevice_fs_util.setInterface(camDeviceConnection.getFileDescriptor(),
            camStreamingInterface.getId(), altSetting);
    usbIso64.submitUrbs();
    // Capture loop: reap a completed request, read its packet data,
    // then re-initialize and resubmit it so transfers keep flowing.
    USBIso.Request req = usbIso64.reapRequest(true);
    req.initialize();
    req.submit();
You can use this version of UsbIso on both 32-bit and 64-bit devices.
Regards,
Peter
I would like to write an Android app to pass commands to an external camera whose SDK is available from the manufacturer's website. The SDK is C-based, and the camera I/O is done through mini-USB or mini-HDMI ports. This is a two-part question:
Can Android-based smartphones send commands to external USB devices? I do know that they can access data from an external USB device.
Will C wrappers for Java (that work on laptops) work on an Android-based smartphone?
Q: Can Android-based smartphones send commands to external USB devices? I do know that they can access data from an external USB device.
A: Sure, as long as the device is connected via the Android USB port :)
Q: Will C ... work on an Android-based smartphone?
A: With much effort, you can interface Java and C on Android (analogous to JNI on other platforms). Look at the NDK docs for more details.
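As a minimal sketch of what the C side of such a bridge looks like (the package, class, and method names are made up for illustration):

    /* Sketch: a native method that would wrap a call into the vendor's C SDK. */
    #include <jni.h>

    /* Matches a hypothetical Java declaration:
     *   package com.example.camera;
     *   class CameraBridge { public static native int sendCommand(int cmd); } */
    JNIEXPORT jint JNICALL
    Java_com_example_camera_CameraBridge_sendCommand(JNIEnv *env, jclass clazz,
                                                     jint cmd) {
        /* Call into the vendor SDK here, e.g. vendor_send_command(cmd); */
        return 0;
    }

You build this with the NDK toolchain (ndk-build or CMake) and load it from Java with System.loadLibrary().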
Q: Can I interface a binary C-language library that's probably written for Windows, in i386 object code, from Java, on Android, running on an ARM CPU?
A: Nope. It's more than just a different language: you're also dealing with a different platform, different object formats, and a completely different CPU architecture.
Your best bet is if the vendor can provide either a USB-level or message-level interface to the device. You're pretty much out of luck with a binary .dll or .lib library interface.