I have an app that allows users to sign in with a Google account. A user may be logged in with the same account on multiple Android devices at the same time. The app on each device maintains a local database, which needs to be periodically synced. These devices may add, remove, or update entries at the same time.
Data will be stored on the user's Google Drive.
Data that needs to be synced is simple (much like key-value pairs, with no relations).
In case of update conflicts, the most recent update prevails.
Incremental updates are preferable, although not compulsory.
Assumption - Each entry takes about 150 bytes, the max number of entries per device is 1000, and the max number of devices logged in per user is 3. Thus, the maximum amount of data synced is ~500 KB.
I don't want to host an actual server. Since the app is free and the number of users is very large, I cannot go for paid solutions (like Firebase, etc.).
Further, I am using SyncAdapter for synchronisation (if it helps).
My questions are:
What would be the logic/algorithm to implement it (one that deals with conflicts as well)?
Is there any better way other than storing files in Google Drive for sync purposes?
What I have thought so far:
I will save a JSON file in the user's Drive storage, representing the entire database.
In the local database, when entries are deleted by the user, instead of deleting them I set the deleted_at column to the current timestamp.
In the local database, when entries are added/updated/deleted, I set the updated_at column to the current timestamp.
Sync will be done as follows:
Fetch the file from Drive.
For each record: if the updated_at value from the Drive file is more recent, update the record in the local DB; if the updated_at value in the local DB is more recent, update the record in the JSON file (see the sketch after these steps).
Add all the records that exist in the file but not in the local DB, and vice versa.
Upload and overwrite the JSON file on Drive.
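In code, the per-record rule might look like the sketch below (Entry and its fields mirror the columns described above; it's only an illustration, not a full implementation):

```kotlin
// Last-write-wins merge of the local table with the snapshot parsed from
// the Drive JSON file. Entry and its field names are hypothetical.
data class Entry(
    val key: String,
    val value: String,
    val updatedAt: Long,        // epoch millis of the last add/update/delete
    val deletedAt: Long? = null // set instead of physically deleting the row
)

fun merge(local: Map<String, Entry>, remote: Map<String, Entry>): Map<String, Entry> =
    (local.keys + remote.keys).associateWith { key ->
        val l = local[key]
        val r = remote[key]
        when {
            l == null -> r!!                // exists only in the Drive file
            r == null -> l                  // exists only locally
            l.updatedAt >= r.updatedAt -> l // most recent update prevails
            else -> r
        }
    }
// The merged map is then written back to the local DB and re-uploaded
// to Drive as the new JSON file.
```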
Please suggest better ways, and what problems I will run into with this approach.
I want to develop a healthcare app for Android. The doctor will be authenticated for a specific time to access a patient's medical reports and download them to the application (the reports will be in a blockchain or a DB). When the session is over, all of the downloaded data (reports) should be permanently deleted from the doctor's mobile. What is the best approach to deleting this data?
Storing files in a DB is never advised. Rather, they should be stored as files themselves, and you can save their paths in the DB for searching and accessing the files.
Your point about session timeout is too broad. It could be triggered in several ways: logout, time limit expired, case closed from the patient's or doctor's end, etc.
You can try these steps if you find them suitable:
Once the doctor selects documents to be saved, download and save them in your app's internal storage. Concurrently, save their respective paths and download timestamps in a DB table for future reference.
If your files are confidential and shouldn't be readable outside your app, you can encrypt them with an encryption algorithm before saving them on the device. You can also save them under different extensions and with random names to make it harder for general users to extract them from the device. You will have to decrypt them at viewing time, though (a sketch follows these steps).
If you think the data in the file can be parsed and the raw (text) data extracted, you can instead implement a DB table and save that information in the DB itself. In that case, no files would be saved on the device.
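A minimal sketch of the encrypt-before-saving idea from the second step, using AES-GCM from javax.crypto (key management, e.g. via the Android Keystore, is left out; the function names are hypothetical):

```kotlin
import java.io.File
import javax.crypto.Cipher
import javax.crypto.SecretKey
import javax.crypto.spec.GCMParameterSpec

// Encrypt a downloaded report before writing it to internal storage.
fun encryptToFile(plain: ByteArray, outFile: File, key: SecretKey) {
    val cipher = Cipher.getInstance("AES/GCM/NoPadding")
    cipher.init(Cipher.ENCRYPT_MODE, key)
    // Prepend the randomly generated 12-byte IV so the file can be decrypted later.
    outFile.writeBytes(cipher.iv + cipher.doFinal(plain))
}

// Decrypt at viewing time.
fun decryptFromFile(inFile: File, key: SecretKey): ByteArray {
    val bytes = inFile.readBytes()
    val cipher = Cipher.getInstance("AES/GCM/NoPadding")
    cipher.init(Cipher.DECRYPT_MODE, key, GCMParameterSpec(128, bytes, 0, 12))
    return cipher.doFinal(bytes, 12, bytes.size - 12)
}
```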
Now you have your content (be it in the file system or the DB), and your next task is to delete it once the session is over.
For the logout case, simply delete all the available data (both from the file system and the DB), cleaning everything.
For the doctor deleting the case, you can remove all the files for the selected case from the device. This information can easily be maintained in a DB table.
For the case where a patient deletes/closes, you will have to implement a push notification service, wherein your server sends a delete command to the device. On receiving the notification, the app follows the same steps.
For the time limit expiring, the simplest logic is to check, either every day at a certain time or every time your app is opened, for all files whose timestamp is more than 7 days older than today's date. Note that the timestamp and file information are stored in the DB.
To check every day at a certain time, you will have to use AlarmManager, which will invoke a background service to carry out the task (sketched below).
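A sketch of the expiry check itself, assuming a hypothetical downloads(path, downloaded_at) table as described above (the AlarmManager/service wiring is omitted):

```kotlin
import android.database.sqlite.SQLiteDatabase
import java.io.File

// Delete every file downloaded more than 7 days ago, then prune its DB row.
fun purgeExpired(db: SQLiteDatabase) {
    val cutoff = System.currentTimeMillis() - 7L * 24 * 60 * 60 * 1000
    db.rawQuery(
        "SELECT path FROM downloads WHERE downloaded_at < ?",
        arrayOf(cutoff.toString())
    ).use { c ->
        while (c.moveToNext()) File(c.getString(0)).delete()
    }
    db.delete("downloads", "downloaded_at < ?", arrayOf(cutoff.toString()))
}
```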
Note: there could be more ways to do such a specific task; however, these are the simplest and most widely used approaches.
Well, when the doctor is authenticated you should start some type of timer (for however long he is authorized to use the patient's records) and save the paths to these files in a DB. When the timer runs out, a listener or observer should delete the files from his mobile (using the paths saved in the database). You can delete files using the File class.
Well, the best approach would be to create a cache directory with a unique name that distinguishes each patient's records, cache all the downloaded items into that directory, and delete that directory upon completion of the session.
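For what that could look like (a sketch; the per-session directory naming is hypothetical):

```kotlin
import android.content.Context
import java.io.File

// All of a session's downloads go under one per-session cache directory...
fun sessionDir(context: Context, sessionId: String): File =
    File(context.cacheDir, "session_$sessionId").apply { mkdirs() }

// ...so ending the session is a single recursive delete.
fun endSession(context: Context, sessionId: String) {
    File(context.cacheDir, "session_$sessionId").deleteRecursively()
}
```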
I'm working on a Mobile App, where the main feature has the user do a lot of CRUD (Create, Read, Update and Delete) tasks within it.
The main storage of data for the app is a local SQLite database, but the user has the option to register an account and use a cloud database to back up their data.
This app needs to be able to work both offline and online, and the user should be able to use multiple devices containing the same data.
Currently, all of my SQL tables have 3 extra columns that keep track of which entries in the database are synced: createdAt (datetime), updatedAt (datetime) and synced (boolean).
With this I am able to keep track of which entries are the most recent, and update either the local or the cloud database accordingly.
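For illustration, the bookkeeping around those three columns might look like this (the items table name is hypothetical; the columns are the ones described above):

```kotlin
import android.content.ContentValues
import android.database.sqlite.SQLiteDatabase

// Any local write stamps updatedAt and clears the synced flag.
fun markDirty(db: SQLiteDatabase, id: Long, values: ContentValues) {
    values.put("updatedAt", System.currentTimeMillis())
    values.put("synced", 0)
    db.update("items", values, "id = ?", arrayOf(id.toString()))
}

// Rows with synced = 0 are the ones that still need pushing to the cloud.
fun unsyncedRows(db: SQLiteDatabase) =
    db.rawQuery("SELECT * FROM items WHERE synced = 0", null)
```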
I'm using Cloud Firestore as the cloud, and using its offline capabilities is not a viable option in my case.
My question is: which solution would be best for keeping track of deleted entries until the app is able to sync with the cloud?
One idea for a solution is to have a table that contains the IDs of all deleted entries together with which table each belongs to; then, when the app is able to sync, remove these entries from both the local database (on all devices) and the cloud database.
The problem I have with this solution is that this 'deletion' table will quickly become huge, and removing entries from it is a problem: all of the user's devices need to be up to date before deletion, so if the user has abandoned one of their devices, that device would never sync, and the entries would never be removed from the 'deletion' table.
What would your suggestion be for a robust way of tracking deleted entries?
I don't think there's a solution that satisfies both these objectives:
Don't keep deleted items in the database forever
Make sure deletions are synchronized between all devices forever
So you will have to decide which one to give up on. Your idea satisfies 2 but not 1. A solution that satisfies 1 but not 2 is to delete the deletion records after a period of time, maybe six months. A variation on that: when a record is deleted, rather than actually deleting it, just mark it as deleted (along with the date it was deleted) and, if applicable, remove any large pieces of data from the record. After whatever grace period you decide on, the record can actually be deleted. The downside is that if a mothballed device is brought back out, it could restore previously deleted records.
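A sketch of that variation (soft delete plus a purge after the grace period; the column names are hypothetical):

```kotlin
import android.content.ContentValues
import android.database.sqlite.SQLiteDatabase

// Mark the row as deleted instead of removing it, recording when.
fun softDelete(db: SQLiteDatabase, table: String, id: Long) {
    val values = ContentValues().apply {
        put("deleted", 1)
        put("deletedAt", System.currentTimeMillis())
    }
    db.update(table, values, "id = ?", arrayOf(id.toString()))
}

// Once the grace period (six months here) has passed, purge the tombstone.
fun purgeTombstones(db: SQLiteDatabase, table: String) {
    val cutoff = System.currentTimeMillis() - 180L * 24 * 60 * 60 * 1000
    db.delete(table, "deleted = 1 AND deletedAt < ?", arrayOf(cutoff.toString()))
}
```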
I am developing an Android application to collect store (grocery) information.
The application has modules to create a store and set its attributes like address, lat/lng, operating hours, manager details, building photos, etc.
Once the store is created, the user needs to list the assets of that store by taking photos and providing their details.
To store all these details, I have around 15 SQLite tables.
Now I want to implement a 'synchronization' feature: all the captured details need to be sent to the server whenever a connection is available; otherwise, the details should be stored locally and moved to the server once a connection is available.
Also, please note that the number of tables may grow to around 40 as the application grows.
I searched for solutions/approaches to this on Google, but most articles and examples cover small-scale applications with small amounts of data.
I have also implemented a synchronization feature for a small dataset (2 tables), where I checked the last-updated timestamps on the server and locally, and synchronized the data if they differed. I don't think I should use this approach at such a scale, with such a large database.
I have one approach which doesn't depend on the number of tables.
I am planning to have a single table which stores the following data:
id
URL
request header
request body
Now let's say a connection isn't available while sending a request, so the request is stored in the table. Whenever a connection becomes available, the app starts reading the table and executing the requests, removing each entry on success. With this approach we need only one table in SQLite (see the sketch below).
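As a sketch, the queue table and its replay loop could look like this (the pending_requests name and the send callback are hypothetical):

```kotlin
import android.database.sqlite.SQLiteDatabase

// One row per request that could not be sent while offline.
const val CREATE_QUEUE = """
    CREATE TABLE pending_requests (
        id      INTEGER PRIMARY KEY AUTOINCREMENT,
        url     TEXT NOT NULL,
        headers TEXT,  -- serialized, e.g. as JSON
        body    TEXT
    )"""

// When connectivity returns, replay oldest-first; delete a row only
// after the server confirms success.
fun drainQueue(db: SQLiteDatabase, send: (url: String, headers: String?, body: String?) -> Boolean) {
    db.rawQuery("SELECT id, url, headers, body FROM pending_requests ORDER BY id", null)
        .use { c ->
            while (c.moveToNext()) {
                if (send(c.getString(1), c.getString(2), c.getString(3))) {
                    db.delete("pending_requests", "id = ?", arrayOf(c.getLong(0).toString()))
                }
            }
        }
}
```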
The problem with this approach is: when we want to retrieve data offline, how can we do that? Do we need a local database schema the same as the server's?
Please guide.
Thanks
If you are syncing data with a server and removing the locally stored data, which is incorrect as far as I know, then your app does not work offline. So instead, when you sync data to the server, maintain a flag recording which data has been synced. Then, next time, just check the flag's status: if the data is synced, do not sync it again; otherwise, sync it.
I hope this solves your problem.
I have an online (PostgreSQL) database and a local copy of the DB in an Android app. The data in these is synchronized, so the app can function offline, but can download new data and upload results when it can. Data is transferred via HTTP GET and POST requests.
I have no problem synchronizing the data for known tables; however, I would like to be able to create new tables or alter tables in the online database and have that change reflected in the Android DB automatically, i.e. without having to release a new version with updated synchronization code. Is there an obvious/standard way to do this that I haven't found? The Google searches I have performed just refer to database migrations between two instances of the same system, or for a known schema.
Is there an obvious/standard way to do this that I haven't found?
There's no "standard" way but there are ways to do it depending on your requirements and how your app works.
It's actually a broad question but solvable if you give it a bit of thought.
My app downloads data from a server on a daily basis. Part of the download process involves downloading a DB version file (just a plain text file) similar to the following...
db_version=12345
...the previous version (if any) is saved in SharedPreferences as an int value.
If the version in the file is greater than the one in SharedPreferences, the downloader pulls some text files which contain SQL commands for creating, dropping, altering tables etc. Only after the changes have been successfully made does the downloader pull the actual data files and update the DB data (not forgetting to update the latest version in SharedPreferences).
If the version in the file is the same as in SharedPreferences then obviously the download simply does the normal daily data download and DB data update.
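A sketch of that version gate (the preference and key names are hypothetical):

```kotlin
import android.content.Context

// Parse "db_version=12345" from the downloaded text file and compare it
// with the version stored in SharedPreferences.
fun needsSchemaUpdate(context: Context, versionFileContents: String): Boolean {
    val remote = versionFileContents.substringAfter("db_version=").trim().toInt()
    val prefs = context.getSharedPreferences("sync", Context.MODE_PRIVATE)
    return remote > prefs.getInt("db_version", 0)
}

// Called only after the schema changes and data update have succeeded.
fun rememberVersion(context: Context, version: Int) {
    context.getSharedPreferences("sync", Context.MODE_PRIVATE)
        .edit().putInt("db_version", version).apply()
}
```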
We've got an Android app and an iPhone app (same functionality) that use SQLite for local data storage. The apps initially come with no data; then, on first run, they receive data from a remote server and store it in an SQLite database. The SQLite database is created by the server and the apps download it as one file, which is then used by the apps. The database file is not very large by today's standards, but not a tiny one either - about 5-6 MB.
Now, once in a while, the apps need to refresh the data from the server. There are a few approaches I can think of:
Download a new full database from the server and replace the existing one. This sounds like the simplest way to deal with the problem, were it not for the repeated 5-6 MB downloads. The apps do prompt the user whether they want to download the updates, so this may not be too much of a problem.
Download a delta database from the server, containing only the new/modified records plus some information about which records to delete. This would lead to a much smaller download, but the work on the client side is more complicated: I would need to read one database and, based on what is read, update another. To the best of my knowledge, there's no way with SQLite to do something like insert into db1.table1 (select * from db2.table1), where db1 and db2 are two SQLite databases containing a table1 of the same structure (but see the sketch after this list). (The full SQLite database contains about 10 tables, with the largest one probably containing about 500 records or so.)
Download a delta of the data in some other format (JSON, XML, etc.) and use this info to update the database in the app. Same as before: not much of a problem on the server side, a smaller download size than the full database, but quite a painful process to do the updates.
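Regarding the doubt in option 2: SQLite can in fact copy between two database files via ATTACH DATABASE, which makes the delta-database route less painful than it may appear. A sketch (table and column names hypothetical):

```kotlin
import android.database.sqlite.SQLiteDatabase

// Merge a downloaded delta database into the main one.
fun applyDelta(db: SQLiteDatabase, deltaPath: String) {
    db.execSQL("ATTACH DATABASE ? AS delta", arrayOf(deltaPath))
    // INSERT OR REPLACE covers both new and modified records.
    db.execSQL("INSERT OR REPLACE INTO table1 SELECT * FROM delta.table1")
    // A deletions table shipped in the delta drives the removals.
    db.execSQL("DELETE FROM table1 WHERE id IN (SELECT id FROM delta.deleted_ids)")
    db.execSQL("DETACH DATABASE delta")
}
```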
Which of the three approaches do you recommend? Or maybe there's yet another way that I missed?
Many thanks in advance.
After much consideration and trial and error, I went for a combination of options (2) and (3).
If no data is present at all, then the app downloads a full database file from the server.
If data is present and an update is required, the app downloads a database from the server and checks the content of a particular value in a particular table. That value states whether the new database is to replace the original or whether it contains deletions/updates/inserts.
This turns out to be the fastest way (performance-wise) and leaves all the heavy lifting (deciding whether to put everything into one database or just send an update) to the server. Further, with this approach, if I need to modify the algorithm to, say, always download the full database, it would only be a change on the server, without the need to recompile and redistribute the app.
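A sketch of that decision step (the meta table, its key/value layout, and the file names are all hypothetical):

```kotlin
import android.content.Context
import android.database.sqlite.SQLiteDatabase
import java.io.File

// Open the downloaded DB, read the server's instruction, then either
// replace the app's database wholesale or merge the delta.
fun applyDownloaded(context: Context, downloaded: File) {
    val db = SQLiteDatabase.openDatabase(downloaded.path, null, SQLiteDatabase.OPEN_READONLY)
    val mode = db.rawQuery("SELECT value FROM meta WHERE key = 'mode'", null).use { c ->
        if (c.moveToFirst()) c.getString(0) else "full"
    }
    db.close()
    if (mode == "full") {
        downloaded.copyTo(context.getDatabasePath("app.db"), overwrite = true)
    } else {
        // merge deletions/updates/inserts, e.g. via ATTACH as sketched earlier
    }
}
```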
Is there a way you can have a JSON field for each of the tables? For instance, if you have a table named users, have a column named "json" that stores the JSON for each user. In essence, it would contain the information the rest of the fields hold.
So when you download the delta as JSON, all you have to do is insert the JSON into the tables.
Of course, with this method you will need to do additional work parsing the JSON and creating the model/object from it, but it's just an extra 3-4 small steps.
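A sketch of that upsert (the users(id, json) layout is hypothetical):

```kotlin
import android.database.sqlite.SQLiteDatabase
import org.json.JSONArray

// Each row stores the full JSON blob for one user, so applying a delta
// is just a batch of INSERT OR REPLACE statements.
fun applyUsersDelta(db: SQLiteDatabase, deltaJson: String) {
    val users = JSONArray(deltaJson)
    for (i in 0 until users.length()) {
        val user = users.getJSONObject(i)
        db.execSQL(
            "INSERT OR REPLACE INTO users (id, json) VALUES (?, ?)",
            arrayOf(user.getLong("id"), user.toString())
        )
    }
}
```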
I recommend approach 3, because the app will download the JSON file faster and the local DB will be updated more easily, avoiding the overhead of extra internet usage.
Just create an empty DB initially, matching the server DB, and then update it regularly by fetching the JSON.