I have a large local SQLite database in my app. I need to keep it up to date with a MySQL database. I was thinking of generating some kind of XML file with the changes and then having the client download that and use it to update the local database. Records are created and deleted with each update. Is there a better way to do this? How should I go about this? I am really open to taking any route at this point.
This is almost a year old, but this is how I did it in case anyone stumbles upon this.
-Have a MySQL database that holds all the changes
-When a change is added, it is stored with a Unix timestamp
-A boolean column marks a record as "deleted"
-A PHP page takes a timestamp as a URL variable
-It returns all changes after that timestamp as a .CSV
-Application downloads .CSV, parses it, and updates its local SQLite database
Simple as that, works great.
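Client-side, the download-parse-update step can be sketched roughly like this. This is only a sketch: the `entries` table, its columns, and the CSV layout are made up for illustration.

```python
import csv
import io
import sqlite3

# Hypothetical CSV layout from the PHP page: id,title,timestamp,deleted
SAMPLE_CSV = "1,First entry,1300000000,0\n2,Old entry,1300000100,1\n"

def apply_changes(conn, csv_text):
    """Apply a downloaded change-set CSV to the local SQLite database."""
    for rec_id, title, ts, deleted in csv.reader(io.StringIO(csv_text)):
        if deleted == "1":
            # Tombstone row: remove the record locally
            conn.execute("DELETE FROM entries WHERE id = ?", (int(rec_id),))
        else:
            # INSERT OR REPLACE covers both new and modified records
            conn.execute(
                "INSERT OR REPLACE INTO entries (id, title, ts) VALUES (?, ?, ?)",
                (int(rec_id), title, int(ts)),
            )
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE entries (id INTEGER PRIMARY KEY, title TEXT, ts INTEGER)")
conn.execute("INSERT INTO entries VALUES (2, 'Old entry', 1200000000)")
apply_changes(conn, SAMPLE_CSV)
```

The app would then remember the newest timestamp it has seen and pass it as the URL variable on the next request.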
I have a daily devotional reading application that I am working on. I have data structured in my SQLite database for each day over a period of one year. I want to update the UI from my database according to the current day. I have already inserted my data into the database, but I am stuck on how to update the UI according to the current date. I need some helpful ideas, please.
You need to write a SELECT query with a WHERE clause that fetches the record for today's date. Get the result cursor, read all the fields, and bind them to the UI.
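A minimal sketch of that query, assuming a hypothetical `devotionals` table keyed by an ISO date string:

```python
import sqlite3
from datetime import date

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE devotionals (day TEXT PRIMARY KEY, title TEXT, body TEXT)")

today = date.today().isoformat()
conn.execute("INSERT INTO devotionals VALUES (?, 'Sample title', 'Sample body')", (today,))

# WHERE clause restricted to today's date; read the cursor's fields
# and bind them to the UI widgets
row = conn.execute(
    "SELECT title, body FROM devotionals WHERE day = ?", (today,)
).fetchone()
```

On Android the equivalent would go through `SQLiteDatabase.query` and a `Cursor`, but the shape of the query is the same.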
In your case, the DB file holds static data. You can add it to the project as part of the assets folder, but the application cannot use it as a DB file directly from there. Please follow the link to copy the DB file from the assets folder to the application folder.
https://blog.reigndesign.com/blog/using-your-own-sqlite-database-in-android-applications/
I have an Android application that relies on a SQLite database, and I use OrmLite to access my DB.
Instead of OrmLite creating the tables, I rely on downloading the database from a central server, as the user will often want to "sync" things. Currently I don't have the fancy sync code written, so the app just replaces the db. The steps are:
1 Download the latest SQLite db file from the server, as a zip
2 Expand the file to produce a database.sqlite file in a temporary folder
3 Delete the contents of a data folder, which contains the live database.sqlite file
4 Move the database.sqlite file from the temporary folder to the data folder.
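The four steps can be sketched roughly like this (Python standing in for the Android code, and assuming the zip contains a single `database.sqlite`; the first block just simulates step 1's download):

```python
import os
import shutil
import sqlite3
import tempfile
import zipfile

def replace_database(zip_path, data_dir):
    """Steps 2-4: expand the zip, clear the data folder, move the new db in."""
    tmp_dir = tempfile.mkdtemp()
    with zipfile.ZipFile(zip_path) as zf:
        zf.extract("database.sqlite", tmp_dir)               # step 2
    shutil.rmtree(data_dir, ignore_errors=True)              # step 3
    os.makedirs(data_dir)
    shutil.move(os.path.join(tmp_dir, "database.sqlite"),
                os.path.join(data_dir, "database.sqlite"))   # step 4

# Simulate step 1: a freshly "downloaded" zip containing the new database
work = tempfile.mkdtemp()
new_db = os.path.join(work, "database.sqlite")
src = sqlite3.connect(new_db)
src.execute("CREATE TABLE meta (generated_at INTEGER)")
src.execute("INSERT INTO meta VALUES (1700000000)")
src.commit()
src.close()
zip_path = os.path.join(work, "db.zip")
with zipfile.ZipFile(zip_path, "w") as zf:
    zf.write(new_db, "database.sqlite")

data_dir = os.path.join(work, "data")
replace_database(zip_path, data_dir)
```

The file swap itself is straightforward; as the question goes on to describe, the tricky part is that any connections opened against the old file keep serving stale data until they are reopened.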
The problem is that the new database file seems to get ignored and DAO queries I run return old data. The only way to show data from the new version of the DB is to restart the application.
To test things I created a table with a timestamp that records when the database was generated; each time you request a new copy of the SQLite db from the server, this is updated. I have a fragment that displays this time so you know how fresh your data is. In the fragment's onResume method I make a call to the DAO to get the timestamp and put the value on screen. I've stepped through this and I see the call to the DAO, but the value that comes back is from the old, now deleted, db. Restart the app and the correct value is shown.
So my question is: if I replace the underlying SQLite db file that stores my database, how can I tell OrmLite to pick it up, refresh the connection, or whatever it has to do?
I tried calling clearObjectCache on the DAO, made no difference.
I have a local SQLite database in an Android app, running on multiple devices, that syncs with a remote database. Every time I fetch data to show in the UI, I read from the local database first, then query the remote database and insert the results into the local database, using the code below:
database.replaceOrThrow(TABLE_NAME, null, values);
This runs OK, except when someone deletes a row from one device. How do I know which rows in my local database need to be removed when the corresponding rows in the remote database are deleted?
There are two options, neither of which I like:
1) Clear the local data whenever I fetch from the remote.
2) Compare the local data with the remote data to find out which rows are missing remotely, and then delete them from the local database.
Is there a best practice for such a common situation? Thanks!
I'd go with option number 2. You can easily calculate the delta between your local DB and the remote. For instance, you can create two ArrayLists containing the projection over the primary key (or another key) of your records. Then you calculate the difference between the local list and the remote list with removeAll(Collection). At this point you have a List of all the keys of records which no longer exist in your remote DB and which therefore can be erased from your local DB.
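That delta calculation can be sketched like this (Python lists standing in for the ArrayLists; the `items` table and the remote id list are made up for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO items VALUES (?, ?)",
                 [(1, "a"), (2, "b"), (3, "c")])

# Projection over the primary key of the local records
local_ids = [row[0] for row in conn.execute("SELECT id FROM items")]

# Keys that came back from the remote DB (id 2 was deleted remotely)
remote_ids = [1, 3]

# local minus remote = keys of records to erase locally
# (the equivalent of localList.removeAll(remoteList))
stale_ids = [i for i in local_ids if i not in remote_ids]
for i in stale_ids:
    conn.execute("DELETE FROM items WHERE id = ?", (i,))
conn.commit()
```

For large tables you'd want sets rather than lists so the membership test isn't quadratic, but the idea is the same.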
Maybe you can try this out:
Store the time at which the modification was made in your remote DB.
In SharedPreferences, store the time of your table creation and last update, and keep it current.
Finally, compare: remote_table_time > table_time, and fetch those entries whose time is greater than your table's time.
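A rough sketch of that bookkeeping, with a one-row `prefs` table standing in for SharedPreferences (all names and sample rows are made up):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE prefs (last_sync INTEGER)")
conn.execute("INSERT INTO prefs VALUES (1000)")   # time of the last table update

# Rows as they might come back from the remote DB,
# each carrying its modification time
remote_rows = [(1, "unchanged", 900), (2, "modified", 1500), (3, "new", 1600)]

last_sync = conn.execute("SELECT last_sync FROM prefs").fetchone()[0]

# Fetch only the entries whose remote_table_time > table_time
fresh = [r for r in remote_rows if r[2] > last_sync]

# After applying them, bump the stored time accordingly
if fresh:
    newest = max(r[2] for r in fresh)
    conn.execute("UPDATE prefs SET last_sync = ?", (newest,))
conn.commit()
```

Note this only catches inserts and updates; deletions still need something like the missing-key comparison or an audit table.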
Applying any hand-rolled syncing logic will be complex, difficult to debug, and a nightmare to maintain. Your issue seems to assume that the remote DB is the source of truth, but if you allow users to make data changes while "offline", then you need to retrospectively sync local changes to the remote store as well.
I suggest you take a squiz at Android's SyncAdapter.
We've got an Android app and an iPhone app (same functionality) that use SQLite for local data storage. The apps initially come with no data; on the first run they receive data from a remote server and store it in a SQLite database. The SQLite database is created by the server and the apps download it as one file, which is then used by the apps. The database file is not very large by today's standards, but not a tiny one either: about 5-6 MB.
Now, once in a while, the apps need to refresh the data from the server. There are a few approaches I can think of:
Download a new full database from the server and replace the existing one. This sounds like the simplest way to deal with the problem, were it not for the repeated 5-6 MB downloads. The apps do prompt the user about whether they want to download the updates, so this may not be too much of a problem.
Download a delta database from the server, containing only the new/modified records and, in some form, information about which records to delete. This would lead to a much smaller download, but the work on the client side is more complicated. I would need to read one database and, based on what is read, update another one. To the best of my knowledge, there's no way with SQLite to do something like insert into db1.table1 (select * from db2.table1), where db1 and db2 are two SQLite databases containing a table1 of the same structure. (The full SQLite database contains about 10 tables, with the largest one containing about 500 records or so.)
Download a delta of the data in some other format (JSON, XML, etc.) and use this info to update the database in the app. Same as before: not too much of a problem on the server side, and a smaller download than the full database, but quite a painful process to do the updates.
Which of the three approaches would you recommend? Or maybe there's yet another way that I missed?
Many thanks in advance.
After much consideration and trial and error, I went for a combination of options (2) and (3).
If no data is present at all, then the app downloads a full database file from the server.
If data is present and an update is required, the app downloads a database from the server and checks the value of a particular field in a particular table. That value states whether the new database should replace the original, or whether it contains deletions/updates/inserts to apply.
This turns out to be the fastest way (performance-wise) and leaves all the heavy lifting (determining whether to put everything into one database or just an update) to the server. Further, with this approach, if I need to modify the algorithm to, say, always download the full database, it would only be a change on the server without the need to re-compile and re-distribute the app.
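The flag check and the application of a delta database can be sketched like this; `meta`, `mode`, and `table1` are made-up names. For what it's worth, SQLite's ATTACH DATABASE does make the cross-database insert ... select pattern from the question possible:

```python
import os
import sqlite3
import tempfile

work = tempfile.mkdtemp()
delta_path = os.path.join(work, "delta.sqlite")

# The "downloaded" database: a meta table holding the mode flag,
# plus the changed rows themselves
dl = sqlite3.connect(delta_path)
dl.execute("CREATE TABLE meta (mode TEXT)")
dl.execute("INSERT INTO meta VALUES ('delta')")   # or 'full' for a replacement
dl.execute("CREATE TABLE table1 (id INTEGER PRIMARY KEY, name TEXT)")
dl.execute("INSERT INTO table1 VALUES (2, 'updated')")
dl.commit()
dl.close()

local = sqlite3.connect(":memory:")
local.execute("CREATE TABLE table1 (id INTEGER PRIMARY KEY, name TEXT)")
local.execute("INSERT INTO table1 VALUES (1, 'old'), (2, 'stale')")
local.commit()  # ATTACH below must not run inside an open transaction

# Check the flag in the downloaded file to decide: replace or merge
mode = sqlite3.connect(delta_path).execute("SELECT mode FROM meta").fetchone()[0]
if mode == "delta":
    local.execute("ATTACH DATABASE ? AS remote", (delta_path,))
    local.execute("INSERT OR REPLACE INTO table1 SELECT * FROM remote.table1")
    local.commit()
    local.execute("DETACH DATABASE remote")
```

In the 'full' branch you would simply swap the database file on disk instead.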
Is there a way you can have a JSON field for each of the tables? For instance, if you have a table named users, add a column named "json" that stores the JSON for each of the users. In essence, it would contain the same information the rest of the fields hold.
So when you download the delta in JSON, all you have to do is insert the JSONs into the tables.
Of course, with this method you will need to do additional work in parsing the JSON and creating the model/object from it, but it's just an extra 3-4 small steps.
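A sketch of that scheme, using a hypothetical `users` table whose `json` column holds the serialized record:

```python
import json
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, json TEXT)")

# The delta as it might arrive from the server
delta = [{"id": 1, "name": "Alice"}, {"id": 2, "name": "Bob"}]

# Inserting the delta is just storing each JSON blob
for user in delta:
    conn.execute("INSERT OR REPLACE INTO users VALUES (?, ?)",
                 (user["id"], json.dumps(user)))
conn.commit()

# The model/object is recreated by parsing the stored JSON back out
alice = json.loads(conn.execute(
    "SELECT json FROM users WHERE id = 1").fetchone()[0])
```

The trade-off is that you can no longer query or index on the individual fields without duplicating them as real columns.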
I recommend approach 3, because the app will download the JSON file faster, and the local DB can be updated more easily, avoiding the overhead of extra data usage.
Just create an empty DB initially, matching the server DB, and then regularly update it by fetching the JSON.
I have an SQLite database for my Android app that stores a copy of some data from another database on a server. When the user opens the app, I want to sync the local copy to the external master. The user may have been on the related website and inserted/updated/deleted data.
If it were just insert/update, timestamps could be used, but as they can also delete data, I'm not sure how to go about checking for deleted rows.
So, what's the best way to tell what's changed and update the local copy?
I'd add a table to audit the deletes (containing key fields of the deleted records) and transfer that on sync, and after a successful sync clear the table down.
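On the server side, that audit table can even be filled automatically with a delete trigger. A sketch, with made-up table names (`records` for the data, `deleted_records` for the audit):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (id INTEGER PRIMARY KEY, data TEXT)")
conn.execute("CREATE TABLE deleted_records (id INTEGER, deleted_at INTEGER)")

# Every delete on records leaves a tombstone row behind
conn.execute("""
    CREATE TRIGGER audit_delete AFTER DELETE ON records
    BEGIN
        INSERT INTO deleted_records VALUES (OLD.id, strftime('%s', 'now'));
    END
""")

conn.execute("INSERT INTO records VALUES (1, 'a'), (2, 'b')")
conn.execute("DELETE FROM records WHERE id = 2")
conn.commit()

# On sync: send the audited keys to the client, then clear the table down
stale = [row[0] for row in conn.execute("SELECT id FROM deleted_records")]
conn.execute("DELETE FROM deleted_records")
conn.commit()
```

Clearing the audit table only after the client confirms the sync succeeded avoids losing tombstones to a failed transfer.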
Hm, we are working on an iOS project which syncs its database with the server when the server responds that it has a newer version.
Our server incrementally stores the SQL that was performed and, on request, compounds all those changes since a specific date and sends them gzipped to the application, where my Objective-C wrapper executes the SQL statements from the downloaded file.
Maybe the same approach would work well for you too.
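Replaying such a downloaded statement log is short work with SQLite's script execution. A sketch in Python rather than Objective-C, with a made-up `notes` table and a hand-built gzipped payload standing in for the server response:

```python
import gzip
import sqlite3

# What the server might send: the SQL performed since the client's
# version, concatenated and gzipped
log = b"INSERT INTO notes VALUES (1, 'first');\nDELETE FROM notes WHERE id = 0;\n"
payload = gzip.compress(log)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE notes (id INTEGER PRIMARY KEY, body TEXT)")
conn.execute("INSERT INTO notes VALUES (0, 'obsolete')")

# Decompress and execute the downloaded statements in order
conn.executescript(gzip.decompress(payload).decode("utf-8"))
```

One caveat: the log must be applied strictly in order and exactly once, so the client should track which version of the log it has already replayed.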