I have a project going where several devices submit data to a central server on the local network using a REST service. I am trying to determine how complex a client I need on Android. Each device generates data quickly (i.e. sensor data), so its client will send frequent requests, but it can also bundle data into larger, less frequent ones. I'd estimate the throughput per client at about 50 KB per second.
Is a REST client an appropriate strategy? How much data would be too much? Any thoughts on this would be appreciated.
Even if it is advisable, there are several options to choose from, such as:
Roll your own using HttpUrlConnection or HttpClient
Volley (example) from Google, which uses a pool of four network threads by default
Ion from koush
Others such as https://github.com/darko1002001/android-rest-client
I do have my own implementation from a couple of years ago that uses HttpClient, but it's probably outdated.
It greatly depends on your exact needs, but
you could also just send a list of sensor readings every 10 minutes, for instance (numbers made up), as you mentioned before. Sending collections has the charm that the client can decide when a collection has to be sent: every 10 minutes, every 10 entries, before the device goes into standby/onPause, etc.
On the server side there must be a way of putting/posting the collections created by the client. Depending on your needs, the collection could then be separated into single resources on the server.
The client-side implementation should not be far from posting a single reading. It's just bundling and sending.
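The bundling logic itself is small; here is a minimal sketch in Python (the class name, thresholds and payload shape are all illustrative, and an Android client would follow the same structure in Java):

```python
import json
import time

class SensorBatcher:
    """Buffers sensor readings and flushes them as one bundled payload."""

    def __init__(self, send, max_entries=10, max_age_s=600):
        self.send = send              # callable that POSTs the bundle
        self.max_entries = max_entries
        self.max_age_s = max_age_s    # e.g. 10 minutes
        self.buffer = []
        self.started = None

    def add(self, reading):
        if self.started is None:
            self.started = time.monotonic()
        self.buffer.append(reading)
        # Flush when the bundle is full or old enough.
        if (len(self.buffer) >= self.max_entries
                or time.monotonic() - self.started >= self.max_age_s):
            self.flush()

    def flush(self):
        """Also call this when the app pauses / the device goes to standby."""
        if self.buffer:
            self.send(json.dumps(self.buffer))
            self.buffer = []
            self.started = None
```

The point is that the flush policy lives entirely on the client; the server only ever sees complete bundles.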
Socket communication on the other hand renders your RESTful Service pretty useless.
Because I got the impression that the process isn't that complex, I would go for my own implementation using HttpClient, and I would read about the different HttpClients beforehand. (There is an Apache one, and a fork of it ships with the Android SDK. I am currently unaware of whether this gets you in trouble later.)
Because you are constantly sending data, I would prefer to create a socket connection. With REST you would need to send a new POST for every chunk of data, paying the HTTP overhead each time. The socket connection is kept alive, so the overhead should be lower.
I have an Android client app that sends some data to a server in Python, where the Python server is supposed to run a long time-consuming operation/computation and return the results to the client.
To do so, I initially used Flask with Python on the server side and an asynchronous Android HTTP library on the client side to send the data via HTTP POST. However, I quickly noticed that this is not the way to go, because the computation on the server takes time, which causes problems such as the client getting timeout errors, etc.
Then, I started using Tornado's WebSockets on the server side and an Android library for WebSockets on the client side. However, the first main problem is that while the server is running the time-consuming operation for a given client, the other potential clients need to wait ... and it seems a bit of a pain to make Tornado work in a multi-threaded setting (as it is originally designed to be single-threaded). Another, minor problem is that if the client goes offline while the server is processing its request, the client might never get the result when it reconnects.
Therefore, I would like to ask whether you have any solutions or recommendations for this setting: an asynchronous, multi-threaded Python server that does heavy CPU computations on data from a client without making the other clients wait for their turn, and that ideally lets a client fetch its result when it reconnects.
First of all, if you're going to do CPU-heavy operations in your backend, you most probably need to run them in a separate process, not in a thread/coroutine/etc. The reason is that Python is limited to one thread executing bytecode at a time (you may read more about the GIL). Doing CPU-heavy work in multiple threads keeps your backend somewhat available, but hurts overall performance.
The simple, time-tested solution: run your backend in multiple processes (and threads, preferably), i.e. deploy your Flask app with gunicorn and give it multiple worker processes. This way you'll have a system capable of doing number_of_processes - 1 heavy computations while still being available to handle requests. The limit for processes is usually around cpu_cores * 2, depending on the CPU architecture.
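Concretely, assuming the Flask application object is named `app` in `app.py` (both names are illustrative), such a deployment is one command; `-w 8` follows the cpu_cores * 2 rule of thumb for a 4-core machine:

```shell
# 8 worker processes: ~7 can be busy with heavy computations while
# the remaining workers keep handling ordinary requests
gunicorn -w 8 -b 0.0.0.0:8000 app:app
```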
Slightly more complicated:
accept data
run heavy function in different process
gather result, return
A great interface for this is ProcessPoolExecutor. The drawback: it's harder to handle failures and hung processes.
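A minimal sketch of that accept / run-in-another-process / gather flow (the heavy function is a stand-in, and a real server would create one long-lived pool at startup rather than per request):

```python
import multiprocessing
from concurrent.futures import ProcessPoolExecutor

def heavy_computation(n):
    # Stand-in for the real CPU-bound work.
    return sum(i * i for i in range(n))

def handle_request(n):
    # Submit to a separate process so this process's GIL is not
    # held while the work runs. The "fork" context keeps the
    # sketch simple on Unix hosts.
    ctx = multiprocessing.get_context("fork")
    with ProcessPoolExecutor(max_workers=2, mp_context=ctx) as pool:
        return pool.submit(heavy_computation, n).result()
```

`future.result()` blocks here for brevity; under Tornado you would instead await the future so the event loop stays free.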
Another way around is a task queue + workers. One of the most widely used is Celery. The idea is to:
open WS connection
put task in queue
a worker (in a different process, or even on a different physical node) eventually picks up the task, computes it, and puts the result in some DB
the main process gets the result via a callback or by long-polling the result DB
the main process sends the result over the WS connection
This is better suited to really heavy, not-quite-real-time tasks, but it gives you out-of-the-box handling of failures, restarts, etc.
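Celery gives you this machinery out of the box; the underlying queue-plus-worker pattern, sketched with only the stdlib (the shared dict stands in for the result DB, and the task ids and payloads are made up):

```python
import multiprocessing as mp

def worker(tasks, results):
    # Worker process: pick up a task, compute, store the result
    # keyed by task id (a real setup would use Redis or a DB).
    for task_id, n in iter(tasks.get, None):  # None = shutdown signal
        results[task_id] = sum(i * i for i in range(n))

def run_demo():
    ctx = mp.get_context("fork")  # assumes a Unix host
    tasks = ctx.Queue()
    with ctx.Manager() as mgr:
        results = mgr.dict()
        w = ctx.Process(target=worker, args=(tasks, results))
        w.start()
        tasks.put(("job-1", 10))  # main process: enqueue and move on
        tasks.put(None)
        w.join()
        return dict(results)
```

The main process never blocks on the computation itself, which is exactly what lets it keep serving other clients, and the result store is what lets a client that reconnected later still fetch its answer.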
I am currently working on a small scale project to prove something works, I currently have a smart band device which has an Android SDK.
From this device I use the SDK to track a users heart rate in real time.
So my Android application receives updates to the heart rate in real time.
This was fairly easy to do; however, I now need to send this data in real time from the Android device to the server as efficiently as possible.
To start with battery drain is OK as initially this is just a proof of concept.
I have limited experience with sending large amounts of data to a server in real time, and I was wondering if anyone has any ideas on what might be the best approach on Android?
I have looked into Sync Adapters, but these seem to be more about keeping data aligned between the client and the server, which is something I am not concerned about. Another approach would be to see if the RequestQueue from Volley might work, but again I am unsure whether looking into this is even worthwhile.
Should I be looking into creating a Service and somehow using a socket to transfer the data?
EDIT: It looks like IntentService may be the best option for handling the task execution, but I am assuming HTTP requests would be too heavy for the client and I should look into something else for the transfer?
I am working on a similar kind of project, but the wristband I am dealing with is the Empatica E4. Please bear in mind that I am not an expert developer, so I am also looking forward to corrections to my design. I will try to justify my idea step by step as much as I can. I hope this will give you some hints for your application and help others as well.
So, my current architecture looks like this:
First of all, Empatica also provides an Android SDK to receive the data. SF stands for sampling frequency, and EDA, Temp, BVP and AccXYZ are the sensors in the wristband. Each sensor has a different sampling frequency; the maximum is 64 Hz, which gives you about 15 ms between samples. This interval is quite tight for performing all the operations, so I buffer the sensor data in a FIFO queue (a volatile LinkedBlockingQueue) so that I don't miss any sample. This all happens in my application's Service.
Now, I have a Runnable task, scheduled with a ScheduledExecutorService, that collects samples from the queue at a 250 ms interval (you can vary it to suit your needs; I chose 250 ms considering my needs, network latency and device performance) and puts them into a single JSON object. The number of samples this Runnable collects varies per sensor: BVP: 16 samples, AccXYZ: 8 samples, Temp: 1 sample and EDA: 1 sample. At the output, I have a JSON object with the data to be sent to my server.
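Those per-sensor batch sizes follow directly from the sampling frequencies: at a 250 ms collection interval, a sensor delivering SF hertz contributes SF × 0.25 samples per batch. A quick check (only BVP's 64 Hz is stated above; the other frequencies are inferred from the sample counts):

```python
INTERVAL_S = 0.25  # the 250 ms Runnable period

# Sampling frequency in Hz -> expected samples per 250 ms batch
sensor_sf = {"BVP": 64, "AccXYZ": 32, "Temp": 4, "EDA": 4}

batch_sizes = {name: round(sf * INTERVAL_S) for name, sf in sensor_sf.items()}
# matches the counts quoted above: BVP 16, AccXYZ 8, Temp 1, EDA 1
```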
For transferring the data to my server, I am using HTTP POST requests. The reasons: easy, fast, efficient and good for concurrency. I am using the Volley framework, which handles all the network-related issues by itself, so I just add the JSON object to the Volley RequestQueue and my client is done. As you mentioned, you could use a socket connection to achieve your goal, but I have to support multiple devices, so in my case sockets would make concurrency problematic. I also tried to do it manually using HttpURLConnection, but the code was becoming tedious and hard to maintain.
Finally, I have a REST API (in Python) on my server side which handles the POST requests, parses the data and inserts it into my MySQL database. As of now, I am still working on this REST API, but I have tested my application and I am successfully receiving data from my device on the server.
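The parse-and-insert step on the server side can be sketched like this (SQLite stands in for MySQL here, and the payload layout, table and column names are illustrative, not the author's actual schema):

```python
import json
import sqlite3

def store_payload(conn, body):
    """Parse one POSTed JSON payload and insert every sample it contains."""
    payload = json.loads(body)  # e.g. {"Temp": [36.5], "BVP": [0.1, 0.2]}
    with conn:  # one transaction per request
        conn.executemany(
            "INSERT INTO samples (sensor, value) VALUES (?, ?)",
            [(sensor, v) for sensor, values in payload.items() for v in values],
        )

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE samples (sensor TEXT, value REAL)")
store_payload(conn, '{"Temp": [36.5], "BVP": [0.1, 0.2]}')
```

In the real service this function would sit behind the POST route, with the request body passed in as `body`.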
Regarding your question "Should I be looking into creating a Service and somehow using a socket to transfer the data?": it's an excellent option if you are working with a single device. With more than one device, I think HTTP is the better option.
Regarding your second question, I don't think HTTP would be heavy for the client, and Volley takes all the pain on itself. You just have to build a request queue and voila! You can find plenty of good tutorials for Volley; I particularly followed this.
I hope my answer will help you a little.
PS: As I am still working on this and have not come up with the final product yet, I cannot say for sure what risks are involved, but I will keep you updated if something new comes up. Also, I am open to any suggestions and ideas that can help. Lastly, the picture above is not very detailed; I made it for you just to share how I am dealing with the same idea.
I am creating a simple android game that deals with multiplayer and connecting to server. This is what I want to achieve for my game.
Client sends keystroke movement to server.
Server accepts input, update the state and calculates and returns new position to client
Client gets new position and update the screen.
I already have some code written, and so one of my threads sends keystrokes to the server, waits for the new position, then updates the screen. The thing is, there is lag in the player's movement. I think it has to do with latency. I also tried two threads, one for sending/receiving data from the server and another for updating the screen, but it didn't solve my latency problem.
Can anyone suggest a good design for networking game?
Do I need to do early prediction?
Do I need separate thread for fetching data and rendering screen?
P.S. This is my first time creating a networked game, so I have no idea what I'm doing.
The structure I prefer is to have one thread updating graphics and logic on the client side, and one thread receiving data from and sending data to the server. If you have problems with latency, there is a magic option which might solve it if you want data to be sent continuously (at least this is the syntax if you are using Java):
mySocket.setTcpNoDelay(true);
This option solved my latency issues when I sent real-time coordinates over a network.
Edit: By default, TCP waits and bunches small writes together before sending them (Nagle's algorithm), which can be a problem if you want to send small updates fast.
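The Java line above disables Nagle's algorithm; the same switch exists in any socket API, e.g. in Python:

```python
import socket

sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
# Disable Nagle's algorithm: send small writes immediately instead of
# buffering them until an ACK arrives or a full segment accumulates.
sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)
```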
I need to make an app for school that runs on Android. Actually there are two apps, a client and a server. The server runs on a PC while the clients run on Android devices. I want to know what the best technology for this is. I know RMI and web services are not implemented in Android, so what are the alternatives (besides communicating with sockets in the traditional way)? One alternative that I have not looked into is REST, but I will need to be able to notify a client once another client has done something, similar to turn-based games where player A notifies player B that he made his move.
Like I said, sockets do the trick, but they are a little low-level compared to RMI and web services, and I only want to use them as a last resort.
Keep it simple. Use REST and have the clients poll for updates.
Also, if you get to a point down the road where you need to scale, this solution is fairly easy to scale because your servers do not need to maintain connections. Since there is no shared state between a particular server and the client (the shared state is between the application and the client), you can easily add more servers to handle the polling and put them behind a load balancer.
You can also add caching so that polling returns the exact same response without recomputing it. You would then have the back-end game-state servers update the caches when the game state changes. This lets you poll much more frequently and still have a very flexible, scalable architecture.
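The client-side poll loop is only a few lines. Here the transport is abstracted as a `fetch` callable so the sketch stays self-contained; in the real app it would be an HTTP GET, ideally sending `If-None-Match` with the last `ETag` so unchanged state is answered straight from cache:

```python
import time

def poll_for_update(fetch, last_version, interval_s=0, max_polls=10):
    """Poll until the server reports a version newer than last_version."""
    for _ in range(max_polls):
        version, state = fetch()
        if version != last_version:
            return version, state  # something changed: update the UI
        time.sleep(interval_s)     # unchanged: wait and ask again
    return last_version, None      # gave up for now; caller retries later
```

For a turn-based game a poll interval of a second or two is usually imperceptible, which is why plain polling is often good enough here.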
For a turn-based game you can take a look at XMPP (e.g. Smack), which is traditionally used for instant messaging. It would also be interesting to build a game using C2DM, which is used for push notifications.
You can also look into HTTP streaming, which is essentially an unending HTTP response into which player moves are fed.
Alternatively you can look into binary messaging systems that are more suited to real-time games but still applicable here, such as RabbitMQ (what's wrong with a smooth turn-based game?).
I have a mobile Android app that needs to send small amounts of data (an id plus lat/long coordinates) every 30 seconds, to be stored in a SQL Server database sitting on an Amazon EC2 instance. As an example usage, say this app has 500 current users, all sending data every 30 seconds. For proof of concept I created a Windows service in C# running on the database server which listens for connections on a specific TCP port, spawns a thread per connection, and writes the data to the database. This works for the 5 users I tested with, but I know there are better ways, and I especially do not want the insert statements to be done by a program running on the database server. So my question is: what is the correct way to handle repetitive data streams from a large user base in a way that scales manageably? I have read about implementing web services to do this, but I am not sure if that is the correct solution.
Thank you for any information.
Using a web service approach is definitely more scalable than using a Windows service. If your usage grows enough to justify it, it will also be easier to deploy your web service on a new EC2 instance (or on multiple load-balanced instances) instead of using a Windows service which shares resources with your DB instance. It is also much more manageable, especially on AWS, where you can scale your infrastructure with a few clicks.
Also, with a web service (or a web application, if you prefer), it is easier to set up access control and take other preventive security measures.