We've got an Android app and an iPhone app (same functionality) that use SQLite for local data storage. The apps initially ship with no data; on the first run they receive data from a remote server and store it in a SQLite database. The SQLite database is created by the server and the apps download it as one file, which is then used by the apps. The database file is not very large by today's standards, but not tiny either - about 5-6 MB.
Now, once in a while, the apps need to refresh the data from the server. There are a few approaches I can think of:
Download a new full database from the server and replace the existing one. This sounds like the simplest way to deal with the problem, were it not for the repeated 5-6 MB downloads. The apps do prompt the user whether they want to download the updates, so this may not be too much of a problem.
Download a delta database from the server, containing only the new/modified records and, in some form, information about which records to delete. This would lead to a much smaller download size, but the work on the client side is more complicated. I would need to read one database and, based on what is read, update another one. To the best of my knowledge, there's no way with SQLite to do something like insert into db1.table1 (select * from db2.table1) where db1 and db2 are two SQLite databases containing table1 with the same structure. (The full database contains about 10 tables, with the largest one containing about 500 records or so.)
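(For what it's worth, SQLite's ATTACH DATABASE does allow this kind of cross-database copy. A minimal Android/Java sketch, where the file paths, table names and the deleted_table1 bookkeeping table are purely illustrative, not part of the actual schema:)

```java
import android.database.sqlite.SQLiteDatabase;

public class DeltaMerge {
    // Copies rows from a downloaded delta database into the main database.
    public static void applyDelta(String mainDbPath, String deltaDbPath) {
        SQLiteDatabase db = SQLiteDatabase.openDatabase(
                mainDbPath, null, SQLiteDatabase.OPEN_READWRITE);
        try {
            // Make the delta database visible under the alias "delta".
            db.execSQL("ATTACH DATABASE '" + deltaDbPath + "' AS delta");
            // Insert new rows and overwrite modified ones.
            db.execSQL("INSERT OR REPLACE INTO table1 SELECT * FROM delta.table1");
            // Remove rows the server flagged for deletion (hypothetical bookkeeping table).
            db.execSQL("DELETE FROM table1 WHERE id IN (SELECT id FROM delta.deleted_table1)");
            db.execSQL("DETACH DATABASE delta");
        } finally {
            db.close();
        }
    }
}
```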
Download a delta of the data in some other format (JSON, XML, etc.) and use that info to update the database in the app. Same as before: not too much of a problem on the server side, a smaller download size than the full database, but quite a painful process to do the updates.
Which of the three approaches would you recommend? Or maybe there's yet another way that I missed?
Many thanks in advance.
After much consideration and trial and error, I went for a combination of options (2) and (3).
If no data is present at all, then the app downloads a full database file from the server.
If data is present and an update is required, the app downloads a database from the server and checks the value of a particular field in a particular table. That value states whether the new database is to replace the original or whether it contains deletions/updates/inserts.
This turned out to be the fastest way (performance-wise) and leaves all the heavy lifting (determining whether to put everything into one database or just an update) to the server. Further, with this approach, if I need to modify the algorithm to, say, always download the full database, it would only be a change on the server, without the need to re-compile and re-distribute the app.
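A rough sketch of that client-side check. The metadata table (sync_info), its mode column and the FULL/DELTA values are assumptions for illustration; the real names are whatever the server writes into the downloaded file:

```java
import android.database.Cursor;
import android.database.sqlite.SQLiteDatabase;
import java.io.File;

public class UpdateDispatcher {
    public static void applyDownloadedDb(File downloadedDb, File currentDb) {
        // Open the downloaded file just to read the marker value.
        SQLiteDatabase db = SQLiteDatabase.openDatabase(
                downloadedDb.getPath(), null, SQLiteDatabase.OPEN_READONLY);
        String mode = "FULL";
        Cursor c = db.rawQuery("SELECT mode FROM sync_info LIMIT 1", null);
        if (c.moveToFirst()) {
            mode = c.getString(0);
        }
        c.close();
        db.close();

        if ("FULL".equals(mode)) {
            // Whole database: simply replace the current file.
            downloadedDb.renameTo(currentDb);
        } else {
            // Delta: merge inserts/updates/deletes into the current database,
            // e.g. with the ATTACH DATABASE approach sketched earlier.
            mergeDelta(currentDb, downloadedDb);
        }
    }

    private static void mergeDelta(File currentDb, File deltaDb) {
        // merging logic omitted; see the earlier ATTACH sketch
    }
}
```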
Is there a way you can have a JSON field in each of the tables? For instance, if you have a table named users, add a column named "json" that stores the JSON for each user. In essence, it would contain the same information as the rest of the fields.
So when you download the delta in JSON, all you have to do is insert the JSON objects into the tables.
Of course, with this method you will need to do additional work parsing the JSON and creating the model/object from it, but it's just an extra 3-4 small steps.
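A sketch of that idea, assuming a table created as CREATE TABLE users (id INTEGER PRIMARY KEY, json TEXT) and a delta that arrives as a JSON array of user objects (the names are made up for illustration):

```java
import android.content.ContentValues;
import android.database.sqlite.SQLiteDatabase;
import org.json.JSONArray;
import org.json.JSONException;
import org.json.JSONObject;

public class JsonColumnSync {
    public static void applyUserDelta(SQLiteDatabase db, String deltaJson)
            throws JSONException {
        JSONArray users = new JSONArray(deltaJson);
        for (int i = 0; i < users.length(); i++) {
            JSONObject user = users.getJSONObject(i);
            ContentValues values = new ContentValues();
            values.put("id", user.getLong("id"));
            values.put("json", user.toString()); // keep the raw JSON next to the key
            // REPLACE semantics: new users are added, existing ones overwritten.
            db.replace("users", null, values);
        }
    }
}
```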
I would recommend approach 3, because the app will download the JSON file faster and the local DB can be updated more easily, avoiding the overhead of extra internet usage.
Just create an empty DB initially, matching the server DB, and then update it regularly by fetching JSON.
I am developing an Android application to collect store (grocery) information.
The application has modules to create a store and set its attributes like address, lat/lng, operating hours, manager details, building photos, etc.
Once a store is created, the user needs to list the assets of that store by taking photos and providing their details.
To store all these details, I have around 15 SQLite tables.
Now I want to implement a 'synchronization' feature: all these captured details need to be sent to the server whenever a connection is available; otherwise the details should be stored locally and moved to the server once a connection becomes available.
Also, please note that the number of tables may increase to around 40 as the application grows.
I searched Google for solutions/approaches to this, but most of the articles and examples only cover small-scale applications with small amounts of data.
I have also implemented a synchronization feature for a small data set (2 tables), where I checked the last-updated timestamp on the server and locally and, if they differed, synchronized the data. I don't think I should use this approach for such a large-scale app and large database.
I have one approach which doesn't depend on the number of tables.
I am planning to have a single table which stores the following data:
id
URL
request header
request body
Now let's say a connection isn't available while sending a request, so the request is stored in the table. Whenever a connection is available, the app starts reading the table and executing the requests; on success it removes the entry from the table. With this approach we need only one table in SQLite.
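A sketch of that single "pending requests" table and the replay loop. The table, column and helper names (sendToServer) are made up for illustration:

```java
import android.content.ContentValues;
import android.database.Cursor;
import android.database.sqlite.SQLiteDatabase;

public class PendingRequestQueue {
    public static final String CREATE_SQL =
            "CREATE TABLE IF NOT EXISTS pending_requests (" +
            "id INTEGER PRIMARY KEY AUTOINCREMENT, " +
            "url TEXT, request_header TEXT, request_body TEXT)";

    // Called when a request cannot be sent because there is no connection.
    public static void enqueue(SQLiteDatabase db, String url, String header, String body) {
        ContentValues v = new ContentValues();
        v.put("url", url);
        v.put("request_header", header);
        v.put("request_body", body);
        db.insert("pending_requests", null, v);
    }

    // Called whenever connectivity comes back (e.g. from a connectivity receiver).
    public static void drain(SQLiteDatabase db) {
        Cursor c = db.query("pending_requests", null, null, null, null, null, "id ASC");
        while (c.moveToNext()) {
            long id = c.getLong(c.getColumnIndexOrThrow("id"));
            String url = c.getString(c.getColumnIndexOrThrow("url"));
            String header = c.getString(c.getColumnIndexOrThrow("request_header"));
            String body = c.getString(c.getColumnIndexOrThrow("request_body"));
            if (sendToServer(url, header, body)) { // hypothetical HTTP call
                db.delete("pending_requests", "id = ?", new String[]{String.valueOf(id)});
            }
        }
        c.close();
    }

    private static boolean sendToServer(String url, String header, String body) {
        // actual HTTP code omitted
        return true;
    }
}
```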
The problem with this approach is: when we want to retrieve data offline, how can we do that? Do we need a local database schema that is the same as the server's?
Please guide.
Thanks
If you are syncing data with a server and removing the local data afterwards, that is, as far as I know, the wrong approach: your app will not work offline. Instead, when you sync data to the server, maintain a flag marking which data has already been synced. The next time, just check the flag status: if a record is already synced, skip it; otherwise sync it.
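A minimal sketch of that flag idea, assuming each data table gets an extra INTEGER column named synced (0 = not yet on the server, 1 = already synced); the table name, _id column and uploadRow helper are illustrative:

```java
import android.content.ContentValues;
import android.database.Cursor;
import android.database.sqlite.SQLiteDatabase;

public class FlagBasedSync {
    public static void syncStores(SQLiteDatabase db) {
        // Pick up only rows that have not been pushed yet.
        Cursor c = db.query("stores", null, "synced = 0", null, null, null, null);
        while (c.moveToNext()) {
            long id = c.getLong(c.getColumnIndexOrThrow("_id"));
            if (uploadRow(c)) {                  // hypothetical upload of one row
                ContentValues v = new ContentValues();
                v.put("synced", 1);              // keep the row locally, just mark it
                db.update("stores", v, "_id = ?", new String[]{String.valueOf(id)});
            }
        }
        c.close();
    }

    private static boolean uploadRow(Cursor row) {
        // HTTP upload omitted
        return true;
    }
}
```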
I hope this solves your problem.
I want to create a SQLite database file in a web service, so I don't have to read JSON on the Android device and wait for it to parse the JSON, convert it to objects and then insert them into the database.
When the JSON is huge, with a lot of data, that process takes too long to wait for on an Android device.
I would like to generate the SQLite database file in the web service so that, instead of returning the JSON, it returns the SQLite database, and on Android I just need to save the database so that it is ready to use.
That would save a lot of time!
SQLite has libraries for almost any kind of server-side language.
A SQLite DB is just a file, so after it is created you can compress it into a zip and use the Volley library to download the file over HTTP.
Decompress the zip and connect to it.
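A sketch of the download-and-unzip step. It uses plain HttpURLConnection and ZipInputStream rather than Volley (Volley has no built-in file request, so a custom Request subclass would be needed); the URL and target file are placeholders:

```java
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.zip.ZipInputStream;

public class DbDownloader {
    public static void downloadAndUnzip(String dbUrl, File targetDbFile) throws IOException {
        HttpURLConnection conn = (HttpURLConnection) new URL(dbUrl).openConnection();
        try (InputStream in = conn.getInputStream();
             ZipInputStream zip = new ZipInputStream(in)) {
            // Position the stream on the first (and only) entry: the db file.
            if (zip.getNextEntry() == null) {
                throw new IOException("Empty zip");
            }
            try (FileOutputStream out = new FileOutputStream(targetDbFile)) {
                byte[] buffer = new byte[8192];
                int read;
                while ((read = zip.read(buffer)) != -1) {
                    out.write(buffer, 0, read);
                }
            }
        } finally {
            conn.disconnect();
        }
    }
}
```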
I have no idea what kind of data, or how much of it, you need to transfer, but if the data is organised properly the processing shouldn't take that long. Also take into consideration that with JSON you can "ask" to receive only updates (a delta), which is not possible if you download the whole DB each time.
Update: for this kind of data I would go with a different approach. Following the Google Play publishing docs, upload the DB at regular intervals as an expansion pack for your app, so that most of the downloading happens even before the "install" on the device itself. When the app first runs, it will contact your server and get the latest updates since the DB was created (I suppose that even if that is a week, you are talking about fewer than a hundred rows)...
I will be developing an Android application with a lot of data (JSON files with a few rows, and CSV for graphics data with a lot of rows); this data changes every 5 minutes and replaces all (or most of) the previous data.
What are the best approaches to designing this? I have 2 options:
Save all the data in a SQLite DB and sync it with an IntentService.
Save the data in JSON and CSV files and replace them every 5 minutes.
Which approach will give the best performance?
This is considering the time to parse the files, sort the data, the download time, and the consistency of the data.
Any other ideas?
PS: I need a caching system too, in case I don't have internet and need the previously stored data.
Advantages of SQLite:
Changes are ACID
You can make complex requests faster (e.g. "give me only fields A,B from items with (C/D)>E")
I would bet it's more compact for big data (integers are stored as integers instead of as a string of individual digits)
Only one file
You can merge old data with new data easily
You can update current data, even while using it
Concurrency can be handled
Advantages of JSON/CSV:
Easier to debug (plain text)
Faster to make a whole update (copy the new file + delete the old one)
For the original question, the whole delete/replace of the data makes JSON/CSV the winner.
However, if the application were to retrieve partial data every 10 s and merge/update it with the previous data, SQLite would be the better option.
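If you go the JSON/CSV route, a simple way to make the whole-file update safe is to write the fresh data to a temporary file and rename it over the old one, so a reader never sees a half-written file. A minimal sketch (file names are placeholders):

```java
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;

public class FileSwap {
    public static void replaceDataFile(File dataDir, byte[] freshContent) throws IOException {
        File tmp = new File(dataDir, "data.json.tmp");
        File current = new File(dataDir, "data.json");
        try (FileOutputStream out = new FileOutputStream(tmp)) {
            out.write(freshContent);
            out.getFD().sync();               // make sure the bytes hit the disk first
        }
        if (!tmp.renameTo(current)) {         // atomic on the same filesystem
            throw new IOException("Could not swap in the new data file");
        }
    }
}
```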
SQLite is mostly used when you want data to be saved and used in the future. In your case the data changes every 5 minutes, so it's better to use JSON, because opening a database connection and storing and retrieving the data every 5 minutes will take some time.
UPDATE:
I also had an application in which the data was changing all the time. In that case I used a Map<K, V> and an ArrayList to maintain the values, because with the data changing every time, I don't think it's feasible to store it in SQLite each time. It takes too long to open the DB connection and store, retrieve and update the data in SQLite every time.
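A rough sketch of that in-memory approach, with the values refreshed at most every 5 minutes; the value type and the refresh logic are placeholders:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class InMemoryCache {
    private final Map<String, List<String>> values = new HashMap<>();
    private long lastRefreshMillis = 0;
    private static final long REFRESH_INTERVAL_MS = 5 * 60 * 1000; // 5 minutes

    public synchronized List<String> get(String key) {
        // Re-fetch only when the cached data is older than the refresh interval.
        if (System.currentTimeMillis() - lastRefreshMillis > REFRESH_INTERVAL_MS) {
            refresh();
        }
        List<String> list = values.get(key);
        return list != null ? list : new ArrayList<String>();
    }

    private void refresh() {
        values.clear();
        // fetch and parse the new data here, then fill the map
        lastRefreshMillis = System.currentTimeMillis();
    }
}
```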
I recommend using JSON or some type of object serialisation unless:
You need ACID compliance for write operations
You need to report against the data which may involve copying the data to an external RDBMS
or
You wish to join those complicit in the overuse / abuse of databases, as commonly seen nowadays
Ideally this should depend on whether you need the previous data, maybe for comparing it with the current data, and so on. As a rule of thumb, I use SQLite when the data needs to be stored and retrieved at a later stage. If the data is only for display, I would rather keep it in program memory. Mind you, this does not involve any file operations.
The purposes of JSON and SQLite are completely different from each other:
JSON is used to send and receive data between a server and a client.
SQLite is used to store data.
I want users to be able to get additional content from my website, which means I will insert the downloaded data into the device's SQLite database. I am wondering if I am approaching this the right way.
My current approach is to create a REST web service which returns data in JSON format, parse the JSON, and insert it row by row into the SQLite DB on the Android device.
Is this the right approach? Will it be too slow if there are many table rows to be inserted at once? Or is there a way to download another SQLite DB and merge it with the local one?
I welcome any suggestion, thank you in advance for your answer.
It works, but you absolutely need to paginate: set a limit on the number of elements sent by your REST service.
Another approach would be to download the complete SQLite database file at once, but that requires some tweaks. See http://www.reigndesign.com/blog/using-your-own-sqlite-database-in-android-applications/ (it is about embedding a database from the assets, but the preparation of the database is the same).
One last point: large numbers of inserts, as well as downloading data from a server, must be done in a separate thread or AsyncTask, from a service (not an activity, which can be interrupted), or even better from a SyncAdapter, which is called by the system itself.
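Also, doing the inserts inside a single transaction with a compiled statement is much faster than row-by-row inserts with auto-commit. A sketch of that, where the table and column names ("content", "id", "body") are assumptions about your schema:

```java
import android.database.sqlite.SQLiteDatabase;
import android.database.sqlite.SQLiteStatement;
import org.json.JSONArray;
import org.json.JSONException;
import org.json.JSONObject;

public class BulkInsert {
    public static void insertAll(SQLiteDatabase db, String jsonArray) throws JSONException {
        JSONArray rows = new JSONArray(jsonArray);
        SQLiteStatement stmt =
                db.compileStatement("INSERT OR REPLACE INTO content (id, body) VALUES (?, ?)");
        db.beginTransaction();
        try {
            for (int i = 0; i < rows.length(); i++) {
                JSONObject row = rows.getJSONObject(i);
                stmt.clearBindings();
                stmt.bindLong(1, row.getLong("id"));
                stmt.bindString(2, row.getString("body"));
                stmt.executeInsert();
            }
            db.setTransactionSuccessful();   // commit everything at once
        } finally {
            db.endTransaction();
            stmt.close();
        }
    }
}
```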
I have about twenty pages of information, stored in tables, that needs to be included in my Android application. Each column is a designated stop on a bus route, and the column is filled with the times the bus will be at that stop. There is also certain information that needs to be associated with some times, such as whether the bus is handicap accessible at a certain time.
Here is an example of one of the tables: Bus Times
I have thought about using SQLite, as it seems like it would be able to store these tables quite easily; but when I think of using SQL I think of dynamic data storage, and this data shouldn't change more than once a year.
Is SQL appropriate for this application? Is there a better way to do this?
Thanks,
Rob
I think a database really is the appropriate choice for this. Data in a database doesn't have to change regularly or very often; almost more important is the fact that you can relatively easily extract very specific information from a large data set. So if you need to store the data locally, I would use a database.
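For example, assuming a (made-up) schema like bus_times(stop TEXT, departure TEXT, handicap_accessible INTEGER), pulling out just the accessible departures from one stop is a single query:

```java
import android.database.Cursor;
import android.database.sqlite.SQLiteDatabase;
import java.util.ArrayList;
import java.util.List;

public class TimetableQueries {
    // All handicap-accessible departures from one stop, in chronological order.
    public static List<String> accessibleDepartures(SQLiteDatabase db, String stop) {
        List<String> result = new ArrayList<>();
        Cursor c = db.rawQuery(
                "SELECT departure FROM bus_times " +
                "WHERE stop = ? AND handicap_accessible = 1 ORDER BY departure",
                new String[]{stop});
        while (c.moveToNext()) {
            result.add(c.getString(0));
        }
        c.close();
        return result;
    }
}
```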
Just a hint at another approach: did you think about reading this data directly from the website? Judging from the style of the page, I don't think they offer a web service, but maybe you could parse it using HTTP GET? I don't know if the structure changes over time, but this solution would have the advantage that you don't need local storage, and if the data is updated you don't have to manually update your database.
Hope this helps.