I'm working on a simple project in which I need to control an Arduino robot (two servo motors) from an Android phone's gyroscope sensor, via the Internet.
As you can imagine, the 3-axis gyroscope coordinates change noticeably with even a small change in the phone's orientation in 3D space, so I want to keep the lag down to milliseconds.
Now, aside from the hardware, my first milestone is to send gyroscope coordinates from Android to a computer over the Internet as fast and as continuously as possible (like a real-time stream of numeric data). I know a traditional HTTP-based client-server mechanism will be quite slow, so I've looked into the following:
Google's Channel API
WebRTC
WebSockets
According to my research, the Channel API shows latency ranging from 10 ms up to several seconds. Also, GAE limits requests to 30 seconds. Here is a Channel API stress test:
http://channelapistresstest.appspot.com/
Try clicking the "send 5" button repeatedly, and you will see latency numbers climb to several seconds.
Now, WebRTC sounds the most promising and faster than WebSockets. I'd be really grateful if someone could guide me on a practical implementation of WebRTC in native Java and native Android (any good libraries? I want to send coordinates from Android and receive them via a Java SE client on the desktop). I'm not interested in hybrid-app solutions (like Crosswalk). I would also like to know how I should set up a signaling server. Summing it up, I have the following questions:
Which method should I use (Channel API, WebSockets, WebRTC, or something else) with native Java and Android support?
In case of WebRTC, how should I set up a signaling server (just a brief description)? Or a WebSocket server for WebSockets?
Can I make use of Google's Cloud Platform or something similar to reduce complexity on my end?
Any overall suggestion?
Thanks in advance.
You do not want to use WebRTC. WebRTC requires you to set up a separate signaling channel such as WebSockets anyway, so it is needlessly complex and very heavyweight for what you are trying to do.
If your requirement is simply to communicate a series of gyroscope values from one device to another, I recommend:
Start with a simple socket server
Connect your clients to this server via a socket
Relay messages from one client to the other
A simple server that prints socket input to standard output is just a few lines of Python, for example. This does require you to learn to use sockets, because your clients need to interpret the byte stream in the same way. You can also use WebSockets, but that may increase the complexity of your server significantly; Java EE is significantly more complex than Java SE, for example.
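Since you mentioned a Java SE client on the desktop, here is a minimal sketch of the same idea in plain Java: a server that accepts one client and prints each received line to standard output (a relay to a second client would follow the same accept-and-forward pattern). The port number and the comma-separated line format are assumptions for illustration.

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.ServerSocket;
    import java.net.Socket;
    import java.nio.charset.StandardCharsets;

    // Minimal TCP server: accepts a single client and prints each received line.
    // The port (5000) is an arbitrary choice for this sketch.
    public class GyroPrintServer {
        public static void main(String[] args) throws Exception {
            try (ServerSocket server = new ServerSocket(5000);
                 Socket client = server.accept();
                 BufferedReader in = new BufferedReader(
                         new InputStreamReader(client.getInputStream(), StandardCharsets.UTF_8))) {
                String line;
                while ((line = in.readLine()) != null) {
                    System.out.println(line); // e.g. "0.012,-0.004,0.981"
                }
            }
        }
    }

The Android side would simply open a Socket to the server's address and write one such comma-separated line per sensor event.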
If you want data integrity (I imagine you do), you should use TCP.
If you are really worried about latency, you should also skip going over the Internet. Run everything on a LAN. I imagine you have to see the device under control anyway, so there's no point in going out to the Internet.
I would ignore WebRTC for this project. It is more complicated to set up and requires a special server.
WebSocket should be just fine for this project. It is as fast as TCP.
If you would like to avoid even that (TCP-level) delay at all costs, and packet loss is not a problem, then I would go for plain UDP. For all of these, you don't need any library, just a few lines of code.
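As a rough illustration of the plain-UDP option in Java, with no library at all, assuming each gyroscope reading is sent as one small comma-separated datagram; the address and port below are placeholders:

    import java.net.DatagramPacket;
    import java.net.DatagramSocket;
    import java.net.InetAddress;
    import java.nio.charset.StandardCharsets;

    // Sender side (e.g. the Android app): fire-and-forget datagrams,
    // no handshake and no retransmission, so no TCP-style stalls.
    public class GyroUdpSender {
        public static void main(String[] args) throws Exception {
            try (DatagramSocket socket = new DatagramSocket()) {
                InetAddress target = InetAddress.getByName("192.168.1.10"); // placeholder LAN address
                byte[] payload = "0.012,-0.004,0.981".getBytes(StandardCharsets.UTF_8);
                socket.send(new DatagramPacket(payload, payload.length, target, 6000));
            }
        }
    }

    // Receiver side (e.g. the desktop client): blocks until a datagram arrives and prints it.
    class GyroUdpReceiver {
        public static void main(String[] args) throws Exception {
            try (DatagramSocket socket = new DatagramSocket(6000)) {
                byte[] buffer = new byte[512];
                while (true) {
                    DatagramPacket packet = new DatagramPacket(buffer, buffer.length);
                    socket.receive(packet);
                    System.out.println(new String(packet.getData(), 0, packet.getLength(), StandardCharsets.UTF_8));
                }
            }
        }
    }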
So I have been experimenting with multi-peer networks. Ultimately I am going to try to use different frameworks to build one that can connect devices of the same OS through Bluetooth and WiFi, and devices of different types through WiFi.
My first attempt was Apple's Multipeer Connectivity. Unfortunately, I had about 0.5 seconds of delay (I didn't actually measure this, it's just an estimate) before even one bit of information got to the other device. I suspect the framework is optimized for larger, encrypted data far more than it is for 1-32 bit jobs.
I was just wondering what you know about the latency of other frameworks out there, since it takes me a decent chunk of time to learn each new framework. Is a latency of about 0.5 seconds the best the industry has?
Honestly, I would be happy if there were a library optimized to send 1 bit to each connected device every 1/60th of a second. But I think most of these networks package up the data as if it were larger anyway.
I sort of wish mobile devices had NFC. Just look at systems like the 3DS, which can do multi-peer multiplayer (Smash Bros.) with really small latency and great accuracy.
Try changing the MCSessionSendDataMode to MCSessionSendDataUnreliable
MCSessionSendDataUnreliable
Messages to peers should be sent immediately without socket-level queueing. If a message cannot be sent immediately, it should be dropped. The order of messages is not guaranteed.
This message type should be used for data that ceases to be relevant if delayed, such as real-time gaming data.
But it depends on how reliable you really need the data to be; on a closed network, it should be fairly reliable anyway.
I'm currently working on an OpenCV project that does some video processing.
I have a C++ program that runs on a PC with some cameras connected and does the calculations, and an Android app that controls the C++ program (things like aperture settings, starting some special calculations, etc.) and shows a live stream of one camera.
The question is: how can these apps communicate?
I thought about two TCP sockets:
one for the live stream
one for the control channel
What do you think?
Will this work, or is there a better way to implement this?
Thank you very much.
You've got a good guess.
Sockets are a good solution for you.
But a TCP socket for a video stream is really bad practice: with even minor network issues you'll get annoying hangs, etc.
Use a UDP socket for the live stream. Just be ready for some packets to be lost if the WiFi signal is weak or similar.
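As a rough sketch of that two-socket split on the Android side, in Java (the host, ports, command string, and single-datagram frames are all assumptions for illustration; a real app would keep the control connection open on its own thread and reassemble frames from several packets):

    import java.io.PrintWriter;
    import java.net.DatagramPacket;
    import java.net.DatagramSocket;
    import java.net.Socket;

    // Control channel over TCP (reliable, ordered, low traffic),
    // live stream over UDP (latency matters more than completeness).
    public class CameraClientSketch {
        public static void main(String[] args) throws Exception {
            // Send one hypothetical control command to the C++ program.
            try (Socket control = new Socket("192.168.1.20", 7000);
                 PrintWriter out = new PrintWriter(control.getOutputStream(), true)) {
                out.println("SET_APERTURE 2.8"); // command format is made up for this sketch
            }

            // Receive stream packets; lost packets are simply never delivered.
            try (DatagramSocket stream = new DatagramSocket(7001)) {
                byte[] buffer = new byte[65507]; // max UDP payload; real frames are usually chunked
                while (true) {
                    DatagramPacket packet = new DatagramPacket(buffer, buffer.length);
                    stream.receive(packet);
                    // decode/display packet.getData() up to packet.getLength() here
                }
            }
        }
    }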
I am currently building a virtual classroom, and I need to stream the camera and microphone to many students.
Previously I used the Red5 server, but due to its lack of support for iPad, iPhone, and Android, I had to drop Red5. I have abandoned WebRTC since it is a peer-to-peer solution (due to bandwidth problems on the client side).
The solution I am looking for is web-based, and a non-peer-to-peer approach is preferred. Is there a way to accomplish my task? I'd like to do this as an open-source project.
Any help is greatly appreciated.
Your best option may be to re-examine the WebRTC route. You don't have to do peer-to-peer; you could in fact use your server to relay the streams. I can't name any servers that do this off the top of my head, but I am certain I've seen them while doing my research.
I want to hook a camera up to an Arduino, which will send images to an Android device over Bluetooth. I don't mind a delay in image transfer (as long as it's not too large). I will then process the images on the Android device (probably with the OpenCV library, for motion tracking). This is a pan/tilt camera setup; the Arduino will tell two motors how to behave based on the images. How can I send this data over Bluetooth or other wireless means? What type of camera is best for this situation?
Do you have to use an Arduino? I do not know if it will have the processing power needed for image processing. Have you looked into using a Raspberry Pi? You can install Java on it, and use the Pi4J library to access its GPIO. The people from Raspberry Pi recently created a camera module for easier integration with the main board.
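If you do go the Raspberry Pi route, a minimal Pi4J (1.x) sketch for driving a GPIO pin looks roughly like this; the pin choice and timing are arbitrary, and the exact API may differ between Pi4J versions:

    import com.pi4j.io.gpio.GpioController;
    import com.pi4j.io.gpio.GpioFactory;
    import com.pi4j.io.gpio.GpioPinDigitalOutput;
    import com.pi4j.io.gpio.PinState;
    import com.pi4j.io.gpio.RaspiPin;

    // Pulse one GPIO pin (e.g. a motor driver enable line) high for a second.
    public class GpioSketch {
        public static void main(String[] args) throws InterruptedException {
            GpioController gpio = GpioFactory.getInstance();
            GpioPinDigitalOutput pin =
                    gpio.provisionDigitalOutputPin(RaspiPin.GPIO_01, "MotorEnable", PinState.LOW);
            pin.high();
            Thread.sleep(1000);
            pin.low();
            gpio.shutdown(); // release GPIO resources before exiting
        }
    }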
Simply put, no, I don't think you can use an Arduino for that:
The problem you will encounter is that any image-capture library you find, added to the Bluetooth (or WiFi) library (and the whole network stack), will fill your Arduino up! Remember that you have only 32 KB of flash to put everything in; it's less than an Atari 2600. So you'll need a bigger Arduino (like the Arduino Mega), which is close to the price of a BeagleBone or a Raspberry Pi.
So to sum up, same conclusion as the others: just use a BeagleBone or a Raspberry Pi.
That said, here's one hack that could help you do what you want:
http://www.ladyada.net/make/IoTcamera/
It's a hack because the Arduino only copies the image over the Eye-Fi card, and the Eye-Fi does not need to be handled by the Arduino the way a Bluetooth/WiFi shield would.
Having a design discussion with some co-workers about our app. We're looking for the best way to transfer large data files on, say, a weekly basis from a phone to a remote server. The server will be in the DMZ, and the phone will be on either WiFi or GSM. Some of the files will be 100 MB and can even reach 400 MB. I'm just not sure of the best way to approach this in my Android code. I was looking at MTOM or even just plain FTP. Any advice would be appreciated.
I have investigated the use of MTOM under Android but found nothing. I don't know whether there's any implementation working with Android yet.
But this is something you can do over FTP, which I think would be a good choice. And you could check the integrity of the file using a checksum (calculated on both sides and then compared).
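For the integrity check, here is a sketch using the standard MessageDigest API, assuming both sides hash the same file bytes with SHA-256 and then compare the resulting hex strings:

    import java.io.FileInputStream;
    import java.io.InputStream;
    import java.security.MessageDigest;

    // Computes a SHA-256 checksum of a file as a hex string.
    // Run the same code on the phone and the server and compare the results.
    public class FileChecksum {
        public static String sha256Hex(String path) throws Exception {
            MessageDigest digest = MessageDigest.getInstance("SHA-256");
            try (InputStream in = new FileInputStream(path)) {
                byte[] buffer = new byte[8192];
                int read;
                while ((read = in.read(buffer)) != -1) {
                    digest.update(buffer, 0, read);
                }
            }
            StringBuilder hex = new StringBuilder();
            for (byte b : digest.digest()) {
                hex.append(String.format("%02x", b));
            }
            return hex.toString();
        }
    }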
Using 3G for huge files is likely to take a long time and be expensive, so to me the best option is to use WiFi. You can detect whether your device is connected through WiFi as described here.
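A sketch of that WiFi check using the classic ConnectivityManager API (newer Android versions prefer NetworkCapabilities, but this is the widely documented approach):

    import android.content.Context;
    import android.net.ConnectivityManager;
    import android.net.NetworkInfo;

    // Returns true when the active connection is WiFi.
    public class WifiCheck {
        public static boolean isOnWifi(Context context) {
            ConnectivityManager cm =
                    (ConnectivityManager) context.getSystemService(Context.CONNECTIVITY_SERVICE);
            NetworkInfo active = cm.getActiveNetworkInfo();
            return active != null
                    && active.isConnected()
                    && active.getType() == ConnectivityManager.TYPE_WIFI;
        }
    }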