I am using http://loopj.com/android-async-http/ to get simple JSON data from Reddit (just a constant feed of links from the popular subreddit r/news). I was wondering: what is a reasonable request timeout, in milliseconds, for an Android app talking to a JSON backend endpoint?
A reasonable timeout will be in seconds, not milliseconds. How long depends entirely on the expected service time of the server. You might want to consider the mean service time plus two or three standard deviations, but you will need to assemble the statistics on that first.
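If it helps, with loopj's AsyncHttpClient the timeout is set in one place. A minimal sketch, assuming the library's setTimeout() (which, as far as I recall, takes milliseconds) and a 30-second budget picked purely as an example:

import com.loopj.android.http.AsyncHttpClient;
import com.loopj.android.http.JsonHttpResponseHandler;

AsyncHttpClient client = new AsyncHttpClient();
// setTimeout() expects milliseconds, so 30 seconds is 30 * 1000
client.setTimeout(30 * 1000);
client.get("https://www.reddit.com/r/news/.json", new JsonHttpResponseHandler());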
Related
I have an application with a list of data that I get from a server with an HTTP request. Now I want to show a notification when new data is available, and I assume that this can be achieved with a service.
Is that good practice? Are there any limitations on the number of requests made from a service?
What I want to achieve is something like gmail application. When I get a new email, notification is shown.
I want my app to be as up to date with data as possible, but I understand that making requests every 5 seconds might be too much.
I am open to alternatives and any ideas on how to do this.
I'm not sure you really need to poll the data every 5 seconds; that is certainly too much. You have two options:
Use GCM, as #duynt suggested in the comments. Follow Try cloud messaging for Android if you've never used it. This way you don't need to worry about managing a service locally; whenever new data is available you get a notification, so you can then place a request to fetch it and update your notification.
GCM needs an application server that you must implement in your environment. This application server sends data to a client app via the chosen GCM connection server, using the appropriate XMPP or HTTP protocol. Take a quick look at About GCM connection server.
If for any reason you would like to poll data from a local Android Service component, you can still do that, but a 5-second frequency is really high. Since the device is in sleep mode most of the time, you have to wake it up before sending each request, and waking the device every 5 seconds means battery drain on top of the data consumption.
If you still want to proceed with the local-service option, use a longer interval, and make sure you follow How to use http in sleep mode and implement it that way; otherwise it won't work in deep sleep mode.
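For the scheduling side, a minimal sketch using AlarmManager with a 15-minute interval (the interval is just an example, and PollReceiver is a hypothetical BroadcastReceiver of yours that grabs a wake lock and kicks off the HTTP request; a Context is assumed to be in scope):

import android.app.AlarmManager;
import android.app.PendingIntent;
import android.content.Context;
import android.content.Intent;

AlarmManager alarmManager = (AlarmManager) context.getSystemService(Context.ALARM_SERVICE);
PendingIntent pollIntent = PendingIntent.getBroadcast(
        context, 0, new Intent(context, PollReceiver.class), 0);

// RTC_WAKEUP wakes the device for each poll; an inexact repeating alarm lets the
// system batch alarms from different apps, which is easier on the battery.
alarmManager.setInexactRepeating(
        AlarmManager.RTC_WAKEUP,
        System.currentTimeMillis(),
        AlarmManager.INTERVAL_FIFTEEN_MINUTES,
        pollIntent);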
You have to make a decision now.
Our home-grown CMS creates calendar feeds which contain a youth sports team's schedule for a single season. We provide the URL for the end-user to consume the feed on their device (or in their software program), which tens of thousands of people do each year.
The problem is that for most programs and devices, the UI for adding calendar subscriptions is relatively good but for deleting those feeds when they're no longer needed - not so much.
For us, this created a situation where over 90% of calendar feed requests to our servers were from teams whose schedules were long since completed.
Initially we tried just returning an empty feed for non-current teams but that didn't do anything to prevent the devices from making the request in the first place (which ties up resources on our servers). It simply wasn't annoying enough (or at all) to motivate the end-user to do something.
So for the past couple years we've instead returned a single event that spans from the previous month to the following month, with a title of "Please Delete This Out-Dated Feed" and even provided some instructions for doing so. That helped a little bit but not enough.
Lately we've learned that it's possible to attach an alert to a calendar feed, so to try to provoke the user into taking action we've also decided to attach an alert to the single "out-dated" event; an alert that repeats twice at 3-hour intervals. This has been more effective but it's super-annoying for the end-user.
So, here is the question: is there anything we can do on the server so that when a device makes a request, the response tells the device that the feed is no longer available and that it should STOP requesting it? Some other things we've tried include:
Returning a variety of HTTP status codes (302, 301, 500, etc)
Setting a very long TTL on the feed
None of that seems to have any effect. It seems like there should be a way to return an empty response with an HTTP status code of 410 "Gone" and the device / program should know what to do. Any / all suggestions are appreciated!
No, there's no standard or non-standard way to do this. The only thing you can do is what you're already doing. I would recommend filing feature requests with the various clients, though; they might be more receptive to change than you think.
I have implemented a queue service in Android that will change states based on queue and wifi/data connectivity events.
I queue transactions to be posted to a remote url. If the device has a data or wifi connection, it will iterate the queue and post data to the url until the queue is empty, or there is a disconnect event.
I can log in to my app, enable airplane mode, generate data, turn airplane mode off, and the transactions are posted. There is no slowdown, even with thousands of transactions (I was trying to push it a bit).
Enter: low reception!
My app slows down enormously when 3G reception is low. (Yes, all uploading happens off the UI thread.) The slowdown seems to come from the post to the server taking a very long time, and sometimes just failing.
My question is, how can I solve this? Check for signal quality? Poll a known address? How do other apps, such as Gmail solve this? This must be a common scenario!
Well if you could potentially have thousands of tasks that all need to be executed, then surely they should be managed. Have you thought about implementing your own ThreadPoolExecutor? The documentation is very good and the class is easy to understand, but if you need examples try these sites:
http://www.javamex.com/tutorials/threads/ThreadPoolExecutor.shtml
http://javabeanz.wordpress.com/2010/02/19/threadpoolexecutor-basics/
The benefit of this is that you can limit the maximum number of threads you are spawning, so you shouldn't get a system-wide slowdown if you keep your thread count to a reasonable number (for Android I'd recommend no more than 20).
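For example, a bounded executor along these lines; the pool sizes and queue capacity here are just illustrative numbers, not a recommendation:

import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

// Core pool of 5 threads, hard cap of 20; up to 1000 pending uploads wait in the
// queue, and extra threads beyond the core 5 are only created once the queue is
// full. Idle extra threads are reclaimed after 60 seconds.
ThreadPoolExecutor uploadExecutor = new ThreadPoolExecutor(
        5, 20, 60, TimeUnit.SECONDS, new LinkedBlockingQueue<Runnable>(1000));

uploadExecutor.execute(new Runnable() {
    @Override
    public void run() {
        // post one queued transaction to the server here
    }
});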
Maybe do some fine-tuning of the socket and connection timeouts? That way, if your connection is slow or stalled, the timeout fires and the transmission fails quickly instead of hanging.
After the connection or send fails, you can retry the transmission later or do something else.
To adjust the timeouts you can use the following code:
HttpParams httpParameters = new BasicHttpParams();
// Give up if a connection cannot be established within 30 seconds
HttpConnectionParams.setConnectionTimeout(httpParameters, 30 * 1000);
// Give up if the socket goes silent for more than 15 seconds mid-transfer
HttpConnectionParams.setSoTimeout(httpParameters, 15 * 1000);
HttpClient client = new DefaultHttpClient(httpParameters);
// use client...
We have a similar situation in our application. We treat signal issues as a reality that can happen at any time. One rule we follow is to never remove content from the device based on the HTTP status code alone; we wait for a functional confirmation from the server. On a slow network, or when the signal drops suddenly, there were many cases where a post appeared to succeed but the data was only partially received. So we decided to have the device confirm with the server, via separate HTTP GET calls, that the content was actually received.
More than performance or checking the network (which is what you asked about), we needed this behavior for the robustness of our application.
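For illustration only, a minimal sketch of that confirmation round-trip; the /received endpoint, the JSON shape, contentId and deleteLocalCopy() are all hypothetical names, and error handling is omitted:

// Only delete local content once the server confirms it actually stored it.
HttpClient client = new DefaultHttpClient();
HttpGet confirm = new HttpGet("https://example.com/api/received?contentId=" + contentId);
HttpResponse response = client.execute(confirm);
String body = EntityUtils.toString(response.getEntity());
if (response.getStatusLine().getStatusCode() == 200 && body.contains("\"stored\":true")) {
    deleteLocalCopy(contentId);   // safe to remove from the device now
}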
You should check out using HTTP Range headers, for example like here.
The server should write the payload to disk while reading, and handle disconnects. The client cannot know how many bytes of the payload actually reached the server, so it needs to sync up with the server every time there has been a network error. Don't forget to handle battery and user issues too ;-)
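A rough sketch of that sync-up step; the /upload-status endpoint is an invented example (not part of any standard), fileId and fileLength are placeholders, and the Content-Range header is just one way a server might accept a resumed byte range:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

// 1. Ask the server how many bytes of this upload it already has (hypothetical endpoint).
HttpURLConnection status = (HttpURLConnection) new URL(
        "https://example.com/upload-status?file=" + fileId).openConnection();
long received = Long.parseLong(new BufferedReader(
        new InputStreamReader(status.getInputStream())).readLine().trim());
status.disconnect();

// 2. Resume the upload from that offset, telling the server which byte range follows.
HttpURLConnection upload = (HttpURLConnection) new URL(
        "https://example.com/upload?file=" + fileId).openConnection();
upload.setDoOutput(true);
upload.setRequestMethod("PUT");
upload.setRequestProperty("Content-Range",
        "bytes " + received + "-" + (fileLength - 1) + "/" + fileLength);
// ... write the file bytes starting at 'received' to upload.getOutputStream() ...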
If you want to wait for a better signal, perhaps the SignalStrength class, with its getCdmaDbm, getEvdoDbm, and getGsmSignalStrength methods, is what you are looking for.
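A minimal sketch of listening for those values, assuming a Context is in scope; what counts as a "good enough" signal is up to you, and the threshold below is arbitrary:

import android.content.Context;
import android.telephony.PhoneStateListener;
import android.telephony.SignalStrength;
import android.telephony.TelephonyManager;

TelephonyManager telephony =
        (TelephonyManager) context.getSystemService(Context.TELEPHONY_SERVICE);
telephony.listen(new PhoneStateListener() {
    @Override
    public void onSignalStrengthsChanged(SignalStrength signalStrength) {
        // GSM strength is reported in "asu" units (0-31, with 99 meaning unknown)
        int asu = signalStrength.getGsmSignalStrength();
        boolean goodSignal = (asu != 99 && asu >= 8);   // arbitrary cut-off
        // pause or resume the upload queue based on goodSignal
    }
}, PhoneStateListener.LISTEN_SIGNAL_STRENGTHS);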
Check out this:
http://www.youtube.com/watch?v=PwC1OlJo5VM#!
It covers advanced coding tips and tricks, bandwidth-saving techniques, implementation patterns, some of the lesser-known API features, and insight into how to minimize battery drain by ensuring your app is a good citizen on the carrier network.
I'm in the process of developing an Android game with a group of people. Our knowledge is very limited. As a part of the game it will be essential for us to have a continuous possibility for updates. To give some context: this will be a pirate-themed game whereby players can go around and dig for treasure on a map. Friends have to be updated when treasure is retrieved and when new treasure is buried.
Rather than setting a time period to send out a request for any changes on the server, we thought it might be good if the server could just push out changes if they occur.
Does this a) sound like a good idea for what we'll be updating? And if yes, b) what would be the simplest way of implementing it? If it is not simple, should we stick with the first idea of just sending requests every so often?
What I know about implementing server push is really about faking it. There are two ways of doing this. One is continuous polling, asking the server for any updates; but when the polling rate is high it generates a lot of network traffic and will also make your application slow. The other way is to have the server block the client's call: whenever a client makes a call and there is nothing to return, the server just holds that request (inside a while loop). Whenever the server has something to return, the loop breaks and the reply carries the update. But if your network connection is bad you can't do this easily.
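A very rough server-side sketch of that blocking approach, assuming a plain Java servlet; hasUpdates() and getUpdates() are placeholders for whatever your data store provides:

import java.io.IOException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class UpdatesServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        long deadline = System.currentTimeMillis() + 60 * 1000;   // hold the request for up to a minute
        while (System.currentTimeMillis() < deadline) {
            if (hasUpdates(req.getParameter("userId"))) {
                resp.getWriter().write(getUpdates(req.getParameter("userId")));
                return;
            }
            try {
                Thread.sleep(1000);   // re-check once a second instead of spinning
            } catch (InterruptedException e) {
                break;
            }
        }
        resp.setStatus(HttpServletResponse.SC_NO_CONTENT);   // nothing new this time around
    }

    // Placeholders: wire these up to your own treasure/event storage.
    private boolean hasUpdates(String userId) { return false; }
    private String getUpdates(String userId) { return ""; }
}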
Recently Google introduced a push-to-device service, but it's only available on Android 2.2 and up.
I need a similar system in my app, and I'm trying to get around limitations.
The issue is battery life. Since the user must be notified immediately about the changes on the server, I thought to implement a service that would live in the background (standard Android service) and query the server for updates.
Of course, querying the server, even each second, will cost a lot of bandwidth, as well as battery, so my question is this: does it make a difference if the server is holding the response for some period of time? (the idea behind Comet type ajax request)
Works like this:
Device sends request for data update
The server gets the request and goes into a loop for one minute, checking for updates on each iteration
If there are updates, server sends response back with updates
If not, the server goes on to the next iteration.
After a minute, it finally sends the response that no data is yet available
After response (no matter whether empty or with data) Android fires another such request.
It will definitely cost less bandwidth, but will it consume less (or even more) battery?
Holding a TCP socket (and consequently waiting for an HTTP response) as you suggest is probably going to be your best option. What you've described is actually already implemented via HTTP continuation requests. Have a look at the Bayeux protocol for HTTP push notifications. Also, check out the Android implementation here. For what it's worth, that's definitely what I would use. I haven't done any sort of analysis of it, but this allows you to minimize the amount of data transmitted over the line (which is directly proportional to the power consumption) by allowing the connection to hang for as long as possible.
In short, the way Bayeux works is very similar to what you've suggested. The client opens a request and the server waits on it. If it has something to send, it sends it otherwise it simply waits. Eventually, the request will timeout. At that point, the client makes another request. What you attain is near instantaneous push to the client from the server without constant polling and duplication of information like HTTP headers, etc.
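If you do end up rolling the client side of this pattern by hand rather than using a library, it is essentially a loop like the one below; keepPolling, lastUpdateId and handleUpdate() are placeholder names, and the URL and timeout values are only examples:

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

while (keepPolling) {
    HttpURLConnection conn = null;
    try {
        conn = (HttpURLConnection) new URL(
                "https://example.com/updates?since=" + lastUpdateId).openConnection();
        conn.setConnectTimeout(15 * 1000);
        // Give the server slightly longer than its one-minute hold before giving up.
        conn.setReadTimeout(70 * 1000);
        if (conn.getResponseCode() == 200) {
            BufferedReader in = new BufferedReader(new InputStreamReader(conn.getInputStream()));
            handleUpdate(in.readLine());   // placeholder for your own parsing
        }
        // A 204 (or a read timeout) just means "nothing new": loop around and reconnect.
    } catch (IOException e) {
        // back off briefly on network errors before reconnecting
    } finally {
        if (conn != null) conn.disconnect();
    }
}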
When the phone is actively using the network, its battery drains faster, that is, when it sends the request and when it receives the response. It will also use some battery just by listening for a response. The key question is whether the phone is repeatedly downloading data to check for a response, or simply staying open to receive one that the server pushes out when it's ready. If the phone is just open to receiving the response and doesn't actually use the network while it waits, it should use less battery.
Additionally, sending a query every minute instead of every second definitely uses less battery as far as the radio is concerned. It does depend on how you make the phone wait, though: if you tie it up with very complex logic while it waits, it may not help battery life. That's probably not the case here, and I would say that in all likelihood this will work out for you.
In closing, it should help the battery, but there are ways you could implement it that would not. It wouldn't hurt to write the program and then just flip a variable (for example, WAIT_TIME between 1 minute and 1 second) and compare the battery usage, would it?