I am developing an Android application to collect grocery store information.
The application has modules to create a store and set its attributes, such as address, lat/lng, operating hours, manager details, building photos, etc.
Once the store is created, the user needs to list the assets of that store by taking photos and providing their details.
To store all these details, I have around 15 SQLite tables.
Now I want to implement a 'Synchronization' feature: all the captured details need to be sent to the server whenever a connection is available; otherwise they should be stored locally and moved to the server once a connection becomes available.
Also, please note that the number of tables may grow to around 40 as the application grows.
I searched Google for solutions/approaches, but most articles and examples deal with small-scale applications holding little data.
I have also implemented a synchronization feature for a small data set (2 tables), where I checked the last-updated timestamp on the server and locally and synchronized the data if they differed. I don't think I should use this approach for such a large application and database.
I have one approach that doesn't depend on the number of tables.
I am planning to have a single table which stores the following data:
id
URL
request header
request body
Now let's say the connection isn't available while sending a request, so the request is stored in the table. Whenever the connection becomes available, the app starts reading the table and executing the requests; on success it removes the entry from the table. With this approach we need only one table in SQLite.
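A minimal sketch of that single queue table and the replay loop could look like this (the table/column names and the sendToServer() call are placeholders for whatever HTTP client you already use):

    import android.content.ContentValues;
    import android.database.Cursor;
    import android.database.sqlite.SQLiteDatabase;

    public class RequestQueue {
        public static final String CREATE_SQL =
            "CREATE TABLE pending_request (" +
            " id INTEGER PRIMARY KEY AUTOINCREMENT," +
            " url TEXT NOT NULL," +
            " headers TEXT," +   // e.g. JSON-encoded header map
            " body TEXT)";       // payload of the original request

        // Called when a request could not be sent (no connection).
        public static void enqueue(SQLiteDatabase db, String url, String headers, String body) {
            ContentValues cv = new ContentValues();
            cv.put("url", url);
            cv.put("headers", headers);
            cv.put("body", body);
            db.insert("pending_request", null, cv);
        }

        // Called when connectivity comes back: replay in insertion order.
        public static void drain(SQLiteDatabase db) {
            Cursor c = db.query("pending_request",
                    new String[] {"id", "url", "headers", "body"},
                    null, null, null, null, "id ASC");
            while (c.moveToNext()) {
                long id = c.getLong(0);
                boolean ok = sendToServer(c.getString(1), c.getString(2), c.getString(3));
                if (!ok) break;   // stop here and retry later, keeping the order
                db.delete("pending_request", "id = ?", new String[] {String.valueOf(id)});
            }
            c.close();
        }

        private static boolean sendToServer(String url, String headers, String body) {
            // placeholder: perform the HTTP call with your existing client
            return false;
        }
    }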
The problem with this approach is: when we want to retrieve data offline, how can we do that? Do we need the local database schema to be the same as the server's?
Please guide.
Thanks
If you are syncing data with a server and removing the locally stored data, which is incorrect as far as I know, your app will not work offline. So instead, when you sync data to the server, maintain a flag that marks which data has been synced. The next time, just check the flag: if a record is already synced, skip it; otherwise sync it.
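In SQLite that flag can simply be an extra column; a rough sketch, using a hypothetical asset table:

    import android.content.ContentValues;
    import android.database.Cursor;
    import android.database.sqlite.SQLiteDatabase;

    public class SyncFlagExample {
        // synced: 0 = not yet on the server, 1 = already synced
        public static final String CREATE_SQL =
            "CREATE TABLE asset (id INTEGER PRIMARY KEY, store_id INTEGER, " +
            "name TEXT, photo_path TEXT, synced INTEGER NOT NULL DEFAULT 0)";

        // Only these rows need to be uploaded on the next sync run.
        public static Cursor unsyncedRows(SQLiteDatabase db) {
            return db.query("asset", null, "synced = 0", null, null, null, null);
        }
    }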
I hope this solves your problem.
Related
I have an architecture question. If you have a web app that stores information on a DB server, theoretically I should be able to reuse the middle-tier logic for a mobile app. When the mobile app starts, it can connect and populate a local SQLite DB, or use JSON to store information within the mobile app. What if the mobile app also needs to work in offline mode? Do you have it sync the next time it is connected? Do you have the mobile app pull down and populate a complete DB so it is available offline? What are the best ways to architect a mobile app that has to go from online to offline?
The simplest solution would be to put a "LastEdited" column into every table in your database and then query all the data which has been updated since the last sync (and you can perform a check on the key to determine whether you need to update or insert into your own local cache).
The ability to delete rows should actually be limited to a boolean "IsDeleted" flag in this case, to keep the sync process nice and simple.
If you also have the ability to edit or create rows from your app, then you should keep a local table of changes to sync when you go online, and you may have to implement some form of "merge" logic.
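A rough sketch of applying one pulled row to the local cache, assuming the server can be asked for everything with LastEdited after a given value (the customer table and column names are just placeholders):

    import android.content.ContentValues;
    import android.database.sqlite.SQLiteDatabase;

    public class DeltaPull {
        // Apply one row received from the server, e.g. from
        // GET /customers?since=<lastSync>. Rows are never physically
        // deleted, only flagged, to keep the sync simple.
        public static void applyServerRow(SQLiteDatabase db, long id, String name,
                                          long lastEdited, boolean isDeleted) {
            ContentValues cv = new ContentValues();
            cv.put("id", id);
            cv.put("name", name);
            cv.put("last_edited", lastEdited);
            cv.put("is_deleted", isDeleted ? 1 : 0);
            // insert-or-update keyed on the primary key
            db.insertWithOnConflict("customer", null, cv, SQLiteDatabase.CONFLICT_REPLACE);
        }
    }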
There are several things you need to consider.
If your app is read-only, you should implement 'delta sync' logic in your local DB: keep a timestamp of the last sync and fetch updates from your server. Of course, you also need to make sure the local DB doesn't grow too large.
If your app is read/write while working offline, you need to consider two-way sync, especially when the same record can be updated on different devices or by different users.
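For the read-only delta-sync case, remembering the last sync time can be as small as this (the preference name and the updated_since query parameter are assumptions):

    import android.content.Context;
    import android.content.SharedPreferences;

    public class SyncClock {
        private static final String PREFS = "sync_prefs";
        private static final String KEY_LAST_SYNC = "last_sync";

        public static long lastSync(Context ctx) {
            SharedPreferences sp = ctx.getSharedPreferences(PREFS, Context.MODE_PRIVATE);
            return sp.getLong(KEY_LAST_SYNC, 0L);
        }

        // Ask the server only for rows changed after the stored timestamp,
        // e.g. GET /stores?updated_since=<lastSync>, then remember the new time.
        public static void markSyncDone(Context ctx, long serverTime) {
            ctx.getSharedPreferences(PREFS, Context.MODE_PRIVATE)
               .edit()
               .putLong(KEY_LAST_SYNC, serverTime)
               .apply();
        }
    }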
I am developing an Android application that communicates with a server all the time. Specifically, the Android interface has many fields where the user fills in data and sends it to the server (GlassFish with an Oracle backend). My concern is: what is the best way to store data when the connection is lost, so that when the app connects again I can send the data to the server?
Note 1: The data is all textual and can reach 1.5 MB in size. Also, there is a plan to save images too.
Note 2: I know about SQLite, but is it the best solution or is there something else?
Finally, I would like to thank all of you for your collaboration.
SQLite is a good solution.
Because your data size can reach up to 1.5 MB, you should store your data in a form that lets you easily retrieve it when the connection to the server is available again.
I have also used SQLite on Android, and I believe it will be the best solution for your problem.
For a comparison of the storage options, see http://developer.android.com/guide/topics/data/data-storage.html
Use SQLite to save your data offline, and clear the table once the data has been sent to the server. Best method.
Use SharedPreferences to save keys (sent successfully) and values (true/false), as in the small sketch after these steps.
- Ex: Save data in DB >> Send data to server >> Get acknowledgement (if it fails, resend until success) >> Update keys >> Delete data in DB >> Repeat cycle
Use cache/local directories to save images.
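The SharedPreferences part of that cycle could be as small as this sketch (the key names are made up):

    import android.content.Context;
    import android.content.SharedPreferences;

    public class SentFlags {
        private static final String PREFS = "sync_status";

        // Remember whether the record with this local id was sent successfully.
        public static void setSent(Context ctx, long recordId, boolean sent) {
            SharedPreferences sp = ctx.getSharedPreferences(PREFS, Context.MODE_PRIVATE);
            sp.edit().putBoolean("sent_" + recordId, sent).apply();
        }

        public static boolean isSent(Context ctx, long recordId) {
            return ctx.getSharedPreferences(PREFS, Context.MODE_PRIVATE)
                      .getBoolean("sent_" + recordId, false);
        }
    }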
You can use an SQLite database and have your rows include a Synced flag. If the sync fails, add a row to the database with Synced = false. When you later sync the data and get a successful return message, you can update the row in the database to Synced = true (if you plan to have offline cached data) or simply remove the row if you're using the table as a temporary store.
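That last step is just one of two calls, assuming a hypothetical my_data table with a synced column:

    import android.content.ContentValues;
    import android.database.sqlite.SQLiteDatabase;

    public class AfterSuccessfulSync {
        // Option A: keep the row as offline cache, just flip the flag.
        public static void markSynced(SQLiteDatabase db, long rowId) {
            ContentValues cv = new ContentValues();
            cv.put("synced", 1);
            db.update("my_data", cv, "_id = ?", new String[] {String.valueOf(rowId)});
        }

        // Option B: the table is only a temporary store, so drop the row.
        public static void removeRow(SQLiteDatabase db, long rowId) {
            db.delete("my_data", "_id = ?", new String[] {String.valueOf(rowId)});
        }
    }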
You do not want to use SharedPreferences in this instance.
If you're going to use the database to keep a persistent store synced with the server online, you may wish to also look at the following:
ORMLite
GreenDAO
I have a SQLite database on Android which I'm designing to be available offline, and it gets its data through a REST API on the server side. Is there any way I can fetch only the data that has changed since the last upload? For example, if there are 5 customers registered in the MySQL database and one of them changes their address, I only want to update the one with the new address, and not the other 4.
Using a timestamp field is one solution for downloading incremental changes from the server to the client. For uploading data from the client to the server, a "dirty" flag can be used to mark new or changed records for upload.
Have a look at this question: Client-server synchronization over REST, which addresses similar problems.
There is no built-in API for this.
However, you can create a timestamp column for every table which gets set on every update and insert.
When syncing, you can ask the server to give you only the rows with a timestamp greater than the last sync.
You might get into trouble when doing updates in both directions.
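On the SQLite side, that column plus a trigger looks roughly like this for a single hypothetical customer table (the server needs the equivalent on its side, since it is the one answering the "rows newer than X" request):

    import android.content.Context;
    import android.database.sqlite.SQLiteDatabase;
    import android.database.sqlite.SQLiteOpenHelper;

    public class TimestampedDbHelper extends SQLiteOpenHelper {
        public TimestampedDbHelper(Context ctx) {
            super(ctx, "app.db", null, 1);
        }

        @Override
        public void onCreate(SQLiteDatabase db) {
            // updated_at is filled on insert and refreshed by the trigger on update
            db.execSQL("CREATE TABLE customer (" +
                    " id INTEGER PRIMARY KEY," +
                    " name TEXT," +
                    " updated_at INTEGER NOT NULL DEFAULT (strftime('%s','now')))");
            db.execSQL("CREATE TRIGGER customer_touch AFTER UPDATE ON customer " +
                    "BEGIN UPDATE customer SET updated_at = strftime('%s','now') " +
                    "WHERE id = NEW.id; END");
        }

        @Override
        public void onUpgrade(SQLiteDatabase db, int oldVersion, int newVersion) {
            // handle schema migrations here
        }
    }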
I'd like to make a basic to-do app on Android to get my feet wet. I have a REST API and an online DB that handle basic CRUD when there is a connection present.
Most task apps I've used, however, allow creating tasks when there is no connection present.
What are the best practices for stuff like this?
Do apps usually store a copy of ALL of a user's data locally so there is access to it when a connection is not present?
It looks like the app I use (Astrid Tasks) has no problem accessing all my tasks/history regardless of connection.
If this is the case, how is syncing handled as far as the remote data's primary keys are concerned?
Pick some encoding, say one request per single data change so it can be applied atomically, encoded as XML or JSON. Make a base class that wraps the connection and use it to send data updates to the remote DB. If no connection is present, store the entire command in a file or in SQLite. You can create multiple files (if you go the file route) based on their size, date, etc. Define some rules for how the oldest record is chosen, if you need to update the DB in an ordered manner.
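For example, a single atomic change could be encoded like this before it is either sent or parked locally (all the field names here are made up):

    import org.json.JSONException;
    import org.json.JSONObject;

    public class ChangeCommand {
        // One data change, small enough to be applied atomically on the server.
        public static JSONObject encode(String table, String operation,
                                        long localId, JSONObject payload) throws JSONException {
            JSONObject cmd = new JSONObject();
            cmd.put("table", table);        // e.g. "tasks"
            cmd.put("op", operation);       // "insert", "update" or "delete"
            cmd.put("local_id", localId);   // lets the server map it back to the client row
            cmd.put("created_at", System.currentTimeMillis()); // for ordered replay
            cmd.put("data", payload);
            return cmd;
        }
    }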
One solution would be to have a local database in your application. When there's no internet connection, store the data in this database.
Now let your application listen for network changes. When the device is connected to the network, you can upload the cached data from the local database to the server without user interaction.
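One way to listen for network changes is a connectivity receiver like this sketch (on newer Android versions a ConnectivityManager.NetworkCallback or a scheduled job is the preferred route, so treat it as illustrative):

    import android.content.BroadcastReceiver;
    import android.content.Context;
    import android.content.Intent;
    import android.net.ConnectivityManager;
    import android.net.NetworkInfo;

    public class ConnectivityReceiver extends BroadcastReceiver {
        // Register for ConnectivityManager.CONNECTIVITY_ACTION in the manifest
        // or with registerReceiver().
        @Override
        public void onReceive(Context context, Intent intent) {
            ConnectivityManager cm =
                    (ConnectivityManager) context.getSystemService(Context.CONNECTIVITY_SERVICE);
            NetworkInfo info = cm.getActiveNetworkInfo();
            if (info != null && info.isConnected()) {
                // connection is back: push the locally cached rows to the server,
                // e.g. by starting a background service or kicking off the upload here
            }
        }
    }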
We've got an Android app and an iPhone app (same functionality) that use SQLite for local data storage. The apps initially come with no data; on the first run they receive data from a remote server and store it in a SQLite database. The SQLite database is created by the server and the apps download it as one file, which is then used by the apps. The database file is not very large by today's standards, but not tiny either: about 5-6 MB.
Now, once in a while, the apps need to refresh the data from the server. There are a few approaches I can think of:
Download a new full database from the server and replace the existing one. This sounds like the simplest way to deal with the problem, were it not for the repeated 5-6 MB downloads. The apps do prompt the user whether they want to download the updates, so this may not be too much of a problem.
Download a delta database from the server, containing only the new/modified records and, in some form, information about which records to delete. This would lead to a much smaller download size, but the work on the client side is more complicated: I would need to read one database and, based on what is read, update another one. To the best of my knowledge, there is no way with SQLite to do something like insert into db1.table1 (select * from db2.table1), where db1 and db2 are two SQLite databases containing a table1 of the same structure. (The full SQLite database contains about 10 tables, with the largest one probably containing about 500 records or so.)
Download a delta of the data in some other format (JSON, XML, etc.) and use this info to update the database in the app. Same as before: not much of a problem on the server side, and a smaller download size than the full database, but quite a painful process to do the updates.
Which of the three approaches would you recommend? Or maybe there's yet another way that I missed?
Many thanks in advance.
After much consideration and trial and error, I went for a combination of options (2) and (3).
If no data is present at all, then the app downloads a full database file from the server.
If data is present and an update is required, the app downloads some database from the server and checks the content of a particular value in a particular table. That value states whether the new database is to replace the original or whether it contains deletions/updates/inserts.
This turns out to be the fastest way (performance-wise) and leaves all the heavy lifting (deciding whether to send everything in one database or just an update) to the server. Furthermore, with this approach, if I need to change the algorithm to, say, always download the full database, it would only be a change on the server, without the need to re-compile and re-distribute the app.
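The client-side check stays quite small; a sketch of the idea, where the sync_info table, the "mode" value and the table names are only illustrative (SQLite's ATTACH DATABASE is what makes the cross-database copy in the merge branch possible):

    import android.database.Cursor;
    import android.database.sqlite.SQLiteDatabase;

    public class RefreshHandler {
        // downloadedPath = path of the database file just pulled from the server
        public static void applyDownload(SQLiteDatabase mainDb, String downloadedPath) {
            SQLiteDatabase dl = SQLiteDatabase.openDatabase(
                    downloadedPath, null, SQLiteDatabase.OPEN_READONLY);
            String mode = readMode(dl);   // e.g. "full" or "delta"
            dl.close();

            if ("full".equals(mode)) {
                // replace the app's database file with the downloaded one
            } else {
                // merge the delta into the existing database
                mainDb.execSQL("ATTACH DATABASE '" + downloadedPath + "' AS delta");
                mainDb.execSQL("INSERT OR REPLACE INTO table1 SELECT * FROM delta.table1");
                mainDb.execSQL("DELETE FROM table1 WHERE id IN (SELECT id FROM delta.deleted_rows)");
                mainDb.execSQL("DETACH DATABASE delta");
            }
        }

        private static String readMode(SQLiteDatabase dl) {
            Cursor c = dl.rawQuery("SELECT value FROM sync_info WHERE key = 'mode'", null);
            String mode = c.moveToFirst() ? c.getString(0) : "full";
            c.close();
            return mode;
        }
    }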
Is there a way you can have a JSON field for each of the tables? For instance, if you have a table named users, have a column named "json" that stores the JSON for each of the users. In essence, it would contain the information the rest of the fields have.
So when you download the delta in JSON, all you have to do is insert the JSON into the tables.
Of course, with this method you will need to do additional work parsing the JSON and creating the model/object from it, but it's just an extra 3-4 small steps.
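A small sketch of that insert step, assuming the delta arrives as a JSON array of user objects (the table and field names are just examples):

    import android.content.ContentValues;
    import android.database.sqlite.SQLiteDatabase;
    import org.json.JSONArray;
    import org.json.JSONException;
    import org.json.JSONObject;

    public class JsonColumnSync {
        public static final String CREATE_SQL =
            "CREATE TABLE users (id INTEGER PRIMARY KEY, json TEXT NOT NULL)";

        // delta is a JSON array of user objects downloaded from the server
        public static void applyDelta(SQLiteDatabase db, JSONArray delta) throws JSONException {
            for (int i = 0; i < delta.length(); i++) {
                JSONObject user = delta.getJSONObject(i);
                ContentValues cv = new ContentValues();
                cv.put("id", user.getLong("id"));
                cv.put("json", user.toString());   // keep the raw JSON alongside the key
                db.insertWithOnConflict("users", null, cv, SQLiteDatabase.CONFLICT_REPLACE);
            }
        }
        // The model/object is then built by parsing the json column when a row is read.
    }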
I would recommend approach 3, because the app will download the JSON file faster and the local DB can be updated more easily, avoiding the overhead of extra internet usage.
Just create an empty DB initially, matching the server DB's schema, and then update it regularly by fetching the JSON.