I need to download around 300 images to my app, but I couldn't find anything saying which method is the most performance- and connection-friendly. Should I download and unzip a single archive on the device, or download the images one by one?
On a mobile device, making fewer connections is preferable, as negotiating a network connection is expensive in the first place. A zip should be smaller than the individual files, and it will definitely require far less data to transfer, since you now make one call instead of 300 and pay the per-request overhead only once.
You should also be able to download the zip file faster than downloading the individual files, as there's less hand-shaking required to create and close connections.
My rule of thumb on mobile is to do as much as possible in a single call, so that you do not create and tear down multiple connections. I've found it more efficient to add an extra 1 KB or 2 KB of payload to a call than to make multiple calls to retrieve the same information.
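For illustration, here's a minimal sketch of the unpack step with java.util.zip. In the app, the stream would be the body of one HttpURLConnection request instead of 300 separate image requests; the class and method names are just placeholders:

```java
import java.io.*;
import java.nio.file.*;
import java.util.zip.*;

// Sketch: unpack one downloaded zip into a directory on the device.
class ZipFetch {
    static void unzip(InputStream in, Path destDir) throws IOException {
        Files.createDirectories(destDir);
        try (ZipInputStream zin = new ZipInputStream(in)) {
            ZipEntry entry;
            while ((entry = zin.getNextEntry()) != null) {
                Path out = destDir.resolve(entry.getName()).normalize();
                if (!out.startsWith(destDir)) continue; // guard against zip-slip entries
                if (entry.isDirectory()) {
                    Files.createDirectories(out);
                } else {
                    Files.createDirectories(out.getParent());
                    Files.copy(zin, out, StandardCopyOption.REPLACE_EXISTING);
                }
                zin.closeEntry();
            }
        }
    }
}
```

One connection, one stream, and the archive's own compression on top of it.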
Related
I am building an Android app which needs to ship >300 images and about 100 audio files, for an APK size of around 50 MB (after WebP compression and ProGuard).
That is not huge, and I could probably live with it. But since I am planning to add other features, the size will keep growing.
Since I'm quite new to Android development, I am still interested in whether there is a better way to store all these files, perhaps remotely, and access them when required.
When the app starts I would have to load all the images into a list, and once an element of the list is tapped I would need to open a separate activity and load the sound. So there is no upload from the app, just resource fetching.
I do not know if it is more efficient this way or to store all the files locally.
Either way I would like to know what my options are: what are the pros and cons of a server (and whether it would be a viable solution for me at all), and what is the advantage of storing the files locally instead?
Please keep in mind that I'm working by myself and haven't got money to spend on premium servers or anything like that.
File storage will be best for you. Performance depends on the type and amount of data you are using. You do not need much data manipulation, so go for file storage if privacy is not a concern, as the data will be available to other applications.
Use an SQLite database if the files need to be protected from other applications.
Use file storage (internal/external memory) if other applications may also access your files.
Avoid fetching data from a server via JSON parsing/HTTP requests; it will make your app rely on the internet all the time, unless you only use it to update your database or file storage.
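As a rough sketch of the file-storage option, using plain JDK I/O for illustration (on Android the base directory would come from context.getFilesDir() for app-private internal storage; the class name here is made up):

```java
import java.io.IOException;
import java.nio.file.*;

// Minimal save/load wrapper around a base directory.
class LocalStore {
    private final Path baseDir; // on Android: context.getFilesDir().toPath()

    LocalStore(Path baseDir) { this.baseDir = baseDir; }

    void save(String name, byte[] data) throws IOException {
        Files.createDirectories(baseDir);
        Files.write(baseDir.resolve(name), data);
    }

    byte[] load(String name) throws IOException {
        return Files.readAllBytes(baseDir.resolve(name));
    }
}
```

Internal storage keeps the files private to your app; external storage makes them visible to others, matching the trade-off described above.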
My Android app needs to do a large (~150MB) download in the background. I am using an intent service and doing this in the background.
The problem is that this big download clogs up all other downloads.
(I'm using Volley+Glide for image downloads in the app and OkHttp for the large file download. I would use Volley for everything, but since it's not recommended for large files, I'm using OkHttp.)
Volley has a way of setting download priorities, but as far as I know that only determines WHEN a download starts, not the share of bandwidth a download uses.
I was not able to find a way to make OkHttp download at a very low priority.
Ideally, what I would like to do is just set the large OkHttp download to a very low priority and let everything else download normally.
Create a class called ThrottledSource that extends Okio's ForwardingSource and limits throughput to a set number of kilobytes per second. You can see an example of throttling a stream in MockWebServer.java.
When you're done you'll have a stream that downloads slower than it needs to. That'll free up bandwidth for other transfers.
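I don't have the MockWebServer source at hand, but the same rate-limiting idea can be sketched with a plain java.io stream wrapper (the class name and the sleep-based pacing are my own, not Okio's):

```java
import java.io.*;

// Caps throughput of the wrapped stream to roughly bytesPerSecond by
// sleeping whenever the reader gets ahead of the allowed rate.
class ThrottledInputStream extends FilterInputStream {
    private final long bytesPerSecond;
    private final long startNanos = System.nanoTime();
    private long bytesRead;

    ThrottledInputStream(InputStream in, long bytesPerSecond) {
        super(in);
        this.bytesPerSecond = bytesPerSecond;
    }

    @Override
    public int read() throws IOException {
        byte[] one = new byte[1];
        int n = read(one, 0, 1);
        return n == -1 ? -1 : one[0] & 0xff;
    }

    @Override
    public int read(byte[] b, int off, int len) throws IOException {
        int n = super.read(b, off, len);
        if (n > 0) {
            bytesRead += n;
            // How long the transfer should have taken so far at the target rate.
            long expectedNanos = bytesRead * 1_000_000_000L / bytesPerSecond;
            long aheadNanos = expectedNanos - (System.nanoTime() - startNanos);
            if (aheadNanos > 0) {
                try {
                    Thread.sleep(aheadNanos / 1_000_000L, (int) (aheadNanos % 1_000_000L));
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            }
        }
        return n;
    }
}
```

Wrapping the big download's response stream in something like this leaves the rest of the bandwidth free for the image requests.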
Split the file into smaller chunks, e.g. split zips.
Download the files with the IntentService when the app is not running (use JobScheduler)
I need to download large audio files from a web server from within a Corona SDK app, and have been looking at using network.download() or possibly network.request() for this purpose.
Because much of the user base will be in areas that have poor or intermittent network coverage, I would like to improve robustness of the download process by allowing for resuming download from where it left off if the network drops out.
From the documentation, neither network.download() nor network.request() seems to support this directly, but is there a way I can use either of these functions to achieve what I'm looking for? If not, is there another technique I can use?
My app will eventually be for both iOS and Android, but for now I am developing the iOS version first. Therefore, I am ok with having to use a different solution for each platform if there is not an easy solution that covers both platforms. However, I would prefer not to have to use native code if possible as I don't currently have an Enterprise subscription for Corona.
I don't think you can do this using Corona without native plugins.
One way around this problem is to split the large files into smaller ones. I don't mean creating multiple audio files, just splitting each large audio file into smaller chunks.
Then you can download one chunk at a time, and when one fails (you get an error in the handler, or a reasonable timeout passes and the file is not present in the file system), you can start downloading it again.
After all chunks are downloaded, you can recreate the large file by using the ltn12 library.
You can read some more about the ltn12 library here; I think you need to take a close look at the pump functions.
On your server side, create a simple program that splits an audio file into multiple sub-files of at most the size you specify.
On your client side, create a function that combines multiple chunks into one single audio file.
For example, if you would like to limit your file size to 1MB, create a server-side program that splits any audio file above 1MB into chunks: a 4.5MB file would be split into part1 1MB, part2 1MB, part3 1MB, part4 1MB, part5 0.5MB.
Glue the chunks together into one single file in your Lua code once you have fetched them with network.request().
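The split/merge mechanics might look like this — sketched in Java rather than Lua, with hypothetical names, just to show the idea:

```java
import java.io.*;
import java.nio.file.*;
import java.util.*;

class Chunks {
    // Server side: split a file into numbered parts of at most chunkSize bytes.
    static List<Path> split(Path src, Path outDir, int chunkSize) throws IOException {
        Files.createDirectories(outDir);
        byte[] data = Files.readAllBytes(src);
        List<Path> parts = new ArrayList<>();
        for (int i = 0, part = 1; i < data.length; i += chunkSize, part++) {
            Path p = outDir.resolve(src.getFileName() + ".part" + part);
            Files.write(p, Arrays.copyOfRange(data, i, Math.min(i + chunkSize, data.length)));
            parts.add(p);
        }
        return parts;
    }

    // Client side: glue the downloaded parts back together, in order.
    static void merge(List<Path> parts, Path dest) throws IOException {
        try (OutputStream out = Files.newOutputStream(dest)) {
            for (Path p : parts) out.write(Files.readAllBytes(p));
        }
    }
}
```

Each part is small enough to retry cheaply when the network drops mid-transfer.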
I am developing an app (for both Android and iPhone). In my app I have to download a lot of videos in the background (using a background service, an IntentService), and I have a screen showing the progress of downloads (an Activity with UI for showing download progress). But I don't know why the download speed is much lower than for the same app on iPhone.
Also, after each download of video, I am marking that video as downloaded in the database. And this happens for all videos.
Are the database calls a problem, causing slow file downloads on Android? The same functionality does not affect the download speed of files on iPhone.
You could try to implement some sort of download accelerator, which downloads a single file by splitting it into segments and using several simultaneous connections to download these segments from the server. First you should check whether the server accepts byte ranges, and then request specific byte ranges in a multi-threaded way. At the end you just merge the segments back into a complete file.
Look into these documents for more info:
http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html#sec14.5
Page 128:
http://www.ietf.org/rfc/rfc2068.txt
Also look up the Java API URLConnection.setRequestProperty.
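A rough sketch of the segmented approach follows. The fetchRange callback stands in for the real HTTP call (with HttpURLConnection you would send connection.setRequestProperty("Range", "bytes=" + start + "-" + end)); all names here are made up for illustration:

```java
import java.io.*;
import java.util.*;
import java.util.concurrent.*;
import java.util.function.*;

class SegmentedDownload {
    // Fetch [0, totalLength) as `segments` parallel byte-range requests,
    // then merge the results back into one byte array.
    static byte[] download(long totalLength, int segments,
                           BiFunction<Long, Long, byte[]> fetchRange)
            throws InterruptedException, ExecutionException {
        ExecutorService pool = Executors.newFixedThreadPool(segments);
        try {
            long chunk = (totalLength + segments - 1) / segments; // ceiling division
            List<Future<byte[]>> futures = new ArrayList<>();
            for (int i = 0; i < segments; i++) {
                long start = i * chunk;
                if (start >= totalLength) break;
                long end = Math.min(start + chunk, totalLength) - 1; // inclusive, like the Range header
                futures.add(pool.submit(() -> fetchRange.apply(start, end)));
            }
            // Merge the segments in order.
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            for (Future<byte[]> f : futures) {
                byte[] seg = f.get();
                out.write(seg, 0, seg.length);
            }
            return out.toByteArray();
        } finally {
            pool.shutdown();
        }
    }
}
```

In a real client you would also verify the server returns 206 Partial Content before relying on ranges.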
However, I think the slow download is mostly because of your network bandwidth and the server's upload speed.
Increase the buffer size of your InputStream.
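For example, a generic copy loop with a larger read buffer (64 KB here is an arbitrary choice for illustration, not a figure from this thread):

```java
import java.io.*;

class BufferedRead {
    // Read with a large buffer instead of byte-by-byte: fewer read() calls
    // means less per-call overhead while draining the connection.
    static long copy(InputStream in, OutputStream out) throws IOException {
        byte[] buf = new byte[64 * 1024]; // vs. the common 1-4 KB default
        long total = 0;
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
            total += n;
        }
        return total;
    }
}
```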
Update:
Take a look here
I'm writing an application for Android which let users browse a list of files and download them.
For every download, I create a thread and download the file with an HttpURLConnection instance (by reading from the connection in a while loop).
This method works fine with one active download. But when the user starts more than one, download performance degrades dramatically. Most of the time these parallel downloads consume all the bandwidth, and the user is unable to browse files (which uses another HttpURLConnection to load the file list).
Any suggestions on refining the download system?
Thanks.
P.S.: The approach that popular browsers such as Google Chrome and Firefox take seems good. Does anyone know how they work?
Alas, I don't know of a way to throttle certain connections. However, a practical approach would be to implement a queue of downloads to control the number of simultaneous downloads. In your case, you would probably want to let only one thing download at a time. This can be implemented in a few different ways.
Here's a way to do it with Handlers and a Looper: http://mindtherobot.com/blog/159/android-guts-intro-to-loopers-and-handlers/
Edit 1:
See mice's comment. It may be smarter to have a max of 2 threads downloading at a time.
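A minimal sketch of such a queue with a plain ExecutorService, capped at two concurrent downloads per the comment above (the class name is made up):

```java
import java.util.concurrent.*;

class DownloadQueue {
    // At most two downloads run at once; the rest wait in the executor's
    // internal queue, leaving bandwidth for the file-list requests.
    private final ExecutorService pool = Executors.newFixedThreadPool(2);

    Future<?> enqueue(Runnable download) {
        return pool.submit(download);
    }

    void shutdown() throws InterruptedException {
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.MINUTES);
    }
}
```

Each download task would wrap the existing HttpURLConnection read loop; the pool replaces the one-thread-per-download scheme.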
You might want to check out the DownloadManager class in the Android SDK. It's only available at API level 9 (Android 2.3) and above, though.
http://developer.android.com/reference/android/app/DownloadManager.html
Some tutorials you might want to see..
http://jaxenter.com/downloading-files-in-android.1-35572.html
http://www.vogella.de/blog/2011/06/14/android-downloadmanager-example/