Is it possible to set a download speed limit for an HttpUrlConnection in Android?
My app displays data. The data is retrieved from a web server.
There are two types of data that are loaded from the web server to display them in my app:
- small files (about 1 MB)
- big files (about 100 MB)
The problem is:
When I start to download a big file, which is about 100 MB and may take about 5 minutes,
my app is nearly unusable in the meantime.
A typical scenario is:
User clicks on a big file --> big file is downloaded in the background.
In the meantime the user wants to display another small file (1 MB, should take only a few seconds to load from the server). But the problem is that the first download (loading the big file) uses the whole bandwidth, and therefore the download of the small file takes about 2 minutes (instead of a few seconds).
So I would like to set a speed limit for big files (for example half of the bandwidth etc.) or to implement some priority queue for downloads...
How do I set the download limit?
What I would do is use the DownloadManager, as the previous commenter suggested, if you're developing for API level 9+. The trouble with this is that downloads are shown in the notification bar, and you might not want that.
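For reference, a minimal sketch of queueing a large file with DownloadManager could look like the following; the URL, title and file name are placeholders, and hiding the running notification needs API 11+:

```java
import android.app.DownloadManager;
import android.content.Context;
import android.net.Uri;
import android.os.Environment;

public class BigFileDownloader {

    /** Enqueues a big file with DownloadManager; URL and file name are placeholders. */
    public static long enqueueBigFile(Context context, String url) {
        DownloadManager dm = (DownloadManager) context.getSystemService(Context.DOWNLOAD_SERVICE);
        DownloadManager.Request request = new DownloadManager.Request(Uri.parse(url));
        request.setTitle("Big file");
        request.setDestinationInExternalFilesDir(context, Environment.DIRECTORY_DOWNLOADS, "big-file.zip");
        // Requires API 11+; on API 9-10 the running notification is always shown.
        request.setNotificationVisibility(DownloadManager.Request.VISIBILITY_VISIBLE_NOTIFY_COMPLETED);
        return dm.enqueue(request); // the returned ID can be used to query progress later
    }
}
```

The system handles the transfer in its own process, so your small downloads are no longer stuck behind the same connection pool.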
As far as I can see, there is no way to limit bandwidth on a specific download using the HttpClient that ships with Android. But I am guessing that you are downloading the files using an AsyncTask per file, and AsyncTasks are executed serially, which might explain why the second file doesn't start downloading.
I strongly suggest looking at RoboSpice, which is perfect for this type of background downloading. I'm pretty sure you will be able to download multiple files at once as well.
Related
I am working on a difference game (for android).
The APK size becomes large when I add too many levels (say 100+). I have optimized all the images, etc. However, the size becomes more than 10 MB.
Is there any way to allow users to incrementally download 10 image files (PNG) at a time?
Something like shipping only the first 10 images with the app install and then downloading 10 more images with each button click from within the app.
If I try to do it by downloading from a web server, what will I need to implement (is it an AsyncTask, or can I do it with something simpler)? In such cases, is HTTP allowed or do I need HTTPS?
I'd personally rather download those 10 MB once and for all instead of downloading the additional images every time I play the game; it could be expensive to download all that data. You could of course cache them, so I don't have to download them again. And you could forward-download an entire level of images at a time (level2.zip). But what if I suddenly beat my own high score and proceed to a new level ... and I don't have a very good internet connection at that particular moment? If you're ruining my gaming experience, I'll delete your app and give it a bad review immediately -- and so will everyone else =)
I'd definitely recommend including all the data and not depending on the user having a fast internet connection AND being willing to pay for the download.
If your app grows too big, you should look into expansion files, but I haven't used those myself, so I can't tell you much more than the link:
https://developer.android.com/google/play/expansion-files.html
My Android app needs to do a large (~150 MB) download in the background. I am using an IntentService for this.
The problem is that this big download clogs all other downloads.
(I'm using Volley + Glide for image downloads in the app and OkHttp for the large file download. I would use Volley for everything, but since it's not recommended for large files, I'm using OkHttp.)
Volley has a way of setting download priorities, but AFAIK that's only used to determine WHEN a download starts, not the percentage of bandwidth a download uses.
I was not able to find a way to set OkHttp to download at a very low priority.
Ideally, what I would like to do is just set the large OkHttp download to a very low priority, and it would let everything else download.
Create a class called ThrottledSource that extends Okio’s ForwardingSource. This class needs to limit the kilobytes per second. You can see examples of throttling a stream in MockWebServer.java.
When you're done you'll have a stream that downloads slower than it needs to. That'll free up bandwidth for other transfers.
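A minimal sketch of such a ThrottledSource, assuming you read the OkHttp response body through it; the bytesPerPeriod and periodMillis numbers are illustrative, not tuned values:

```java
import java.io.IOException;

import okio.Buffer;
import okio.ForwardingSource;
import okio.Source;

/**
 * Caps how many bytes are read from the delegate per period by sleeping
 * between reads. This slows the large download down and frees bandwidth
 * for the other transfers.
 */
public class ThrottledSource extends ForwardingSource {

    private final long bytesPerPeriod;
    private final long periodMillis;

    public ThrottledSource(Source delegate, long bytesPerPeriod, long periodMillis) {
        super(delegate);
        this.bytesPerPeriod = bytesPerPeriod;
        this.periodMillis = periodMillis;
    }

    @Override
    public long read(Buffer sink, long byteCount) throws IOException {
        // Never read more than one "period" worth of bytes at a time.
        long read = super.read(sink, Math.min(byteCount, bytesPerPeriod));
        if (read == -1) return -1; // end of stream

        try {
            // Pause so that, on average, we stay under bytesPerPeriod per periodMillis.
            Thread.sleep(periodMillis);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            throw new IOException("Interrupted while throttling", e);
        }
        return read;
    }
}
```

You would then read the file through something like `Okio.buffer(new ThrottledSource(response.body().source(), 64 * 1024, 1000))` instead of the raw body stream.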
- Split the file into smaller chunks, e.g. split ZIPs
- Download the files with the IntentService when the app is not running (use JobScheduler; see the sketch below)
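If you go that route, a rough sketch of wiring the chunk download to JobScheduler (API 21+) could look like this; ChunkDownloadJobService and JOB_ID are names made up for the example:

```java
import android.app.job.JobInfo;
import android.app.job.JobParameters;
import android.app.job.JobScheduler;
import android.app.job.JobService;
import android.content.ComponentName;
import android.content.Context;

// Hypothetical JobService that downloads the next chunk; it must be declared in the
// manifest with android:permission="android.permission.BIND_JOB_SERVICE".
public class ChunkDownloadJobService extends JobService {

    private static final int JOB_ID = 42; // arbitrary, app-defined ID

    /** Schedules the next chunk download, even if the app is not in the foreground. */
    public static void scheduleNextChunk(Context context) {
        JobInfo job = new JobInfo.Builder(JOB_ID, new ComponentName(context, ChunkDownloadJobService.class))
                .setRequiredNetworkType(JobInfo.NETWORK_TYPE_UNMETERED) // only on Wi-Fi / unmetered networks
                .setPersisted(true) // survive reboots; needs the RECEIVE_BOOT_COMPLETED permission
                .build();
        JobScheduler scheduler = (JobScheduler) context.getSystemService(Context.JOB_SCHEDULER_SERVICE);
        scheduler.schedule(job);
    }

    @Override
    public boolean onStartJob(JobParameters params) {
        // Kick off the chunk download on a background thread (e.g. your existing IntentService),
        // then call jobFinished(params, false) once the chunk is stored.
        return true; // true = work continues off the main thread
    }

    @Override
    public boolean onStopJob(JobParameters params) {
        return true; // true = reschedule if the job is interrupted
    }
}
```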
I'm currently working on an Android app that has to GET over 70 images, all from different URLs. I am currently using AsyncTask to GET all the images. Would my app load the images faster if I were to create many threads and divide the task of getting the 70 images between the threads, like 20 on one thread and 50 on another?
I suppose you want to load all 70 images at the same time? It might depend on the speed of the internet connection but basically I guess the time to completely load all images would be the same, whether you load them sequentially or in parallel. Sequentially, each image can be loaded using the full internet bandwidth, while loading the images in parallel could only use a fraction of the bandwidth, so in the end the total amount of time will be roughly the same.
Well, I can imagine that having multiple HTTP connections, each managed by a different thread, may be beneficial for performance. The best way to determine the number of threads to use (it could be > 2) is to benchmark your code for overall download time (i.e. from the start until all images have been loaded).
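For example, a sketch of downloading the images on a fixed-size thread pool that you could benchmark against; the pool size of 4 is only a first guess, and this should be called off the main thread:

```java
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;

import java.io.IOException;
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ParallelImageFetcher {

    /** Downloads all URLs on a small thread pool; call this off the main thread. */
    public static List<Bitmap> fetchAll(List<String> urls) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(4); // 4 is only a starting point to benchmark
        List<Future<Bitmap>> futures = new ArrayList<Future<Bitmap>>();
        for (final String url : urls) {
            futures.add(pool.submit(new Callable<Bitmap>() {
                @Override
                public Bitmap call() throws IOException {
                    return download(url);
                }
            }));
        }
        List<Bitmap> images = new ArrayList<Bitmap>();
        for (Future<Bitmap> future : futures) {
            images.add(future.get()); // blocks until that particular image has finished
        }
        pool.shutdown();
        return images;
    }

    private static Bitmap download(String url) throws IOException {
        HttpURLConnection connection = (HttpURLConnection) new URL(url).openConnection();
        try {
            InputStream in = connection.getInputStream();
            return BitmapFactory.decodeStream(in);
        } finally {
            connection.disconnect();
        }
    }
}
```

Time this against a sequential loop and against other pool sizes to see what your connection actually gains.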
For downloading multiple images asynchronously I would also recommend using the Picasso library.
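As a rough idea of how little code that usually takes, assuming the Picasso 2.x API and a placeholder URL:

```java
import com.squareup.picasso.Picasso;

// context is your Activity/Context and imageView is an ImageView from your layout.
// Picasso handles the background download, caching and decoding for you.
Picasso.with(context).load("http://example.com/image1.png").into(imageView);
```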
It is unlikely to get much faster, because the loading time is dominated by the connection speed.
The right way, I think, is to design your application better so that you avoid retrieving 70 images at once. I don't know your application, but do you really need to display 70 images at the same time? It would be better to load an image only when you want to display it, for two main reasons:
- Your activity will be displayed faster
- Your user may not have an unlimited data plan and may not be on Wi-Fi
"I am currently using AsyncTask to GET all the images. Would my app load the images faster if I were to create many threads and divide the task of getting the 70 images between the threads, like 20 on one thread and 50 on another?"
If you are using a single task to download all your (70) images, that is not the proper way to do it. You should use separate tasks to download the images if you want them downloaded faster (I assume there is no constraint that the images must be downloaded sequentially). If you keep using a single task to download 70 images, they are going to be downloaded one by one, and it will take a long time.
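For instance, on API 11+ you can run several tasks in parallel by using the thread-pool executor explicitly instead of the default serial executor; DownloadImageTask here is a made-up task that fetches a single image:

```java
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.os.AsyncTask;

import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.List;

// Hypothetical task that downloads one image; onPostExecute would hand the Bitmap to the UI.
class DownloadImageTask extends AsyncTask<String, Void, Bitmap> {

    @Override
    protected Bitmap doInBackground(String... urls) {
        try {
            HttpURLConnection connection = (HttpURLConnection) new URL(urls[0]).openConnection();
            try {
                InputStream in = connection.getInputStream();
                return BitmapFactory.decodeStream(in);
            } finally {
                connection.disconnect();
            }
        } catch (Exception e) {
            return null; // a real app should surface the error
        }
    }

    // On API 11+ the default executor is serial, so run each task on the thread pool instead.
    static void downloadAll(List<String> imageUrls) {
        for (String url : imageUrls) {
            new DownloadImageTask().executeOnExecutor(AsyncTask.THREAD_POOL_EXECUTOR, url);
        }
    }
}
```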
Use a library for image downloading. Loading remote images is a painful job, so I suggest you use a library for this. I'm currently using ShutterBug; try it.
I am developing an app (for both Android and iPhone). In my app I have to download a lot of videos in the background (using a background service, an IntentService), and I also have a screen showing the progress of the downloads (an Activity with a UI for showing download progress). But I don't know why the download speed is so much lower than the download speed of the same app on iPhone.
Also, after each video download, I mark that video as downloaded in the database. This happens for all videos.
Are the database calls the reason for the slow file downloads on Android?
The same functionality does not affect the download speed of the files on iPhone.
You could try to implement some sort of download accelerator, which downloads a single file by splitting it into segments and using several simultaneous connections to download those segments from the server. First you should check whether the server accepts byte ranges, and then request the byte ranges in a multi-threaded way. Finally, you merge the segments back into a complete file.
Look into these documents for more info:
http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html#sec14.5
Page 128:
http://www.ietf.org/rfc/rfc2068.txt
Also look up the Java API URLConnection.setRequestProperty.
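A sketch of requesting one segment with URLConnection.setRequestProperty might look like this; the caller would run several of these on different threads and merge the results, and this only works if the server answers 206 Partial Content:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class RangeDownload {

    /** Fetches the byte range [firstByte, lastByte] of the file at fileUrl. */
    public static byte[] downloadSegment(String fileUrl, long firstByte, long lastByte) throws IOException {
        HttpURLConnection connection = (HttpURLConnection) new URL(fileUrl).openConnection();
        connection.setRequestProperty("Range", "bytes=" + firstByte + "-" + lastByte);
        try {
            if (connection.getResponseCode() != HttpURLConnection.HTTP_PARTIAL) {
                throw new IOException("Server did not honour the Range header");
            }
            InputStream in = connection.getInputStream();
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            byte[] buffer = new byte[8 * 1024];
            int read;
            while ((read = in.read(buffer)) != -1) {
                out.write(buffer, 0, read);
            }
            return out.toByteArray();
        } finally {
            connection.disconnect();
        }
    }
}
```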
However, I think the slow download is mostly because of your network bandwidth and the server's upload speed.
Increase the buffer size of your input stream.
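For example, a copy loop with a larger buffer usually beats reading byte by byte; the 64 KB size here is only a starting point to benchmark, not a recommended value:

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

final class StreamCopier {

    /** Copies the download stream to disk with a 64 KB buffer. */
    static void copy(InputStream in, OutputStream out) throws IOException {
        byte[] buffer = new byte[64 * 1024]; // tune this value against your own measurements
        int read;
        while ((read = in.read(buffer)) != -1) {
            out.write(buffer, 0, read);
        }
        out.flush();
    }
}
```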
Update:
Take a look here
I need to parse a web page in my application, but I have one big problem: data usage. The page I want to parse is somewhere between 400-500 KB, depending on the time. I need to parse it a few times per day, depending on user requests etc., but the standard should be 10-20 times per day. However, I'm worried about data: if I parse it 10-20 times per day, that's 150-300 MB in one month (10 x 30 x 0.5 MB). Which is too much, as many people have a 100 MB limit, or even a 500 MB limit, and I can't eat half of it with my app.
I need only a very small part of the web page's data. Is there a way to download, for example, only a part of the page source, or only some specific tags, or to download it compressed, or any other kind of download that doesn't eat hundreds of MB per month?
Doing this would probably need some cooperation from the web server; if you are downloading the page from a server that isn't under your control, then this is probably not possible.
One thing to bear in mind is that modern web browsers and servers typically gzip text-based data, so the actual amount of data being transferred will be significantly less than the uncompressed size of the pages (to get a rough idea of how big the transfer will be, try using a zip utility to squash the raw HTML).
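If you fetch the page yourself, a rough sketch of explicitly asking for a gzipped response with HttpURLConnection could look like this; the URL is a placeholder, and note that recent Android versions already request gzip for you if you don't set the header:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.zip.GZIPInputStream;

public class CompressedPageFetcher {

    /** Downloads the page, asking for gzip; falls back to the raw stream if the server ignores it. */
    public static String fetch(String pageUrl) throws IOException {
        HttpURLConnection connection = (HttpURLConnection) new URL(pageUrl).openConnection();
        connection.setRequestProperty("Accept-Encoding", "gzip");
        try {
            InputStream in = connection.getInputStream();
            if ("gzip".equalsIgnoreCase(connection.getContentEncoding())) {
                in = new GZIPInputStream(in); // decompress only if the server actually gzipped it
            }
            StringBuilder page = new StringBuilder();
            BufferedReader reader = new BufferedReader(new InputStreamReader(in, "UTF-8"));
            String line;
            while ((line = reader.readLine()) != null) {
                page.append(line).append('\n');
            }
            reader.close();
            return page.toString();
        } finally {
            connection.disconnect();
        }
    }
}
```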
One further thing that might help is the HTTP Range header, which may or may not be supported by your server - this lets you request particular parts of a resource, specified by a byte range.
The best way I can think of doing it is to set up a proxy server, which will download the page periodically, and extract the data you need, exposing that to your app in a smaller, more suitable format.
You could for example use a command line tool like wget or curl on a linux server, then use a script (php/perl/python/ruby/bash) to parse the data, and re-format it. Then you would serve the content using a web server (apache/lighttpd).
Personally, I would do the whole thing in node.js, if you have the luxury of your own server to use for this task.