I am developing an Android app that stores data locally in an SQLite database and syncs it to a remote server (MS SQL Server). The data is sent through a REST API.
This is the way I would like it to work and my plan to handle it:
When the app stores data in the SQLite database, it checks whether an internet connection is available; if it is, the app makes an HTTP POST to send the data (I use an AsyncTask to handle this). Once the data is sent, I flag the row in the database as "synced" in the onPostExecute callback.
If no internet connection is available, the app simply continues.
I need the app to listen for the event when the internet connection becomes available; the app should then go through all rows that have not been synced and use an AsyncTask again to send the data to the remote server.
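Roughly, the flagging step I have in mind looks like this (a sketch only; the records table, synced column and postRecordToServer() helper are placeholders):

import android.content.ContentValues;
import android.database.sqlite.SQLiteDatabase;
import android.os.AsyncTask;

// Illustrative sketch only: posts one row, then flags it as synced.
// Assumes a "records" table with "_id" and INTEGER "synced" columns.
class UploadRecordTask extends AsyncTask<Void, Void, Boolean> {

    private final SQLiteDatabase db;
    private final long rowId;

    UploadRecordTask(SQLiteDatabase db, long rowId) {
        this.db = db;
        this.rowId = rowId;
    }

    @Override
    protected Boolean doInBackground(Void... params) {
        // Perform the HTTP POST here and return true on a 2xx response.
        return postRecordToServer(rowId);
    }

    @Override
    protected void onPostExecute(Boolean success) {
        if (success) {
            ContentValues values = new ContentValues();
            values.put("synced", 1); // mark the row as synced
            db.update("records", values, "_id = ?",
                    new String[] { String.valueOf(rowId) });
        }
        // On failure the row keeps synced = 0 and can be retried later,
        // e.g. by querying: SELECT * FROM records WHERE synced = 0
    }

    private boolean postRecordToServer(long id) {
        // Placeholder for the real HTTP POST (HttpURLConnection, Retrofit, ...).
        return true;
    }
}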
My questions are:
Is this achievable? And if so, is it best practice?
How do I listen for the event when the internet connection becomes available?
Thanks,
You could implement this manually, but I suggest you use a SyncAdapter instead.
Although you can design your own system for doing data transfers in your app, you should consider using Android's sync adapter framework. This framework helps manage and automate data transfers, and coordinates synchronization operations across different apps. When you use this framework, you can take advantage of several features that aren't available to data transfer schemes you design yourself.
If you want to implement this without using a SyncAdapter anyway, then to detect when the connection becomes available you need to register a BroadcastReceiver that listens for CONNECTIVITY_ACTION broadcasts, and then use a ConnectivityManager to query the current state.
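A minimal sketch of such a receiver; SyncService is a placeholder for whatever component re-sends the unsynced rows. Register it for the android.net.conn.CONNECTIVITY_CHANGE action (note that manifest registration of this broadcast was later restricted on Android 7.0+, so prefer runtime registration there):

import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.net.ConnectivityManager;
import android.net.NetworkInfo;

// Fires whenever connectivity changes; when a connection is available,
// kick off the sync of unsynced rows. SyncService is a hypothetical service.
public class ConnectivityReceiver extends BroadcastReceiver {
    @Override
    public void onReceive(Context context, Intent intent) {
        ConnectivityManager cm =
                (ConnectivityManager) context.getSystemService(Context.CONNECTIVITY_SERVICE);
        NetworkInfo activeNetwork = cm.getActiveNetworkInfo();
        boolean isConnected = activeNetwork != null && activeNetwork.isConnected();
        if (isConnected) {
            context.startService(new Intent(context, SyncService.class));
        }
    }
}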
Related
I have an app working offline. Assume that 1000+ records, each with images, are created during this period. Whenever connectivity is established, what should the approach be to send all 1000+ records to the server while also handling any interruption between the network calls or an API failure response?
I assume I have to send the records in batches, but how do I handle interruptions, maintain consistency, and prevent any kind of data loss?
I guess the best way here is to send each record separately (if they are not related to each other).
If you have media attachments, sending each record will take about 2 seconds on average when uploading over mobile internet at ~2 MB/s. If you send a large batch of records in each request, you need a stable connection for a long period.
You can send each record as a multipart request, where the parts are the record's body and its media attachments.
Also, you have no need to check for an internet connection or use a receiver to catch connection-state changes. You can simply use one of these libraries to trigger sync requests (see the JobScheduler sketch after the list):
JobScheduler
Firebase JobDispatcher
Evernote android-job
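For example, a minimal JobScheduler sketch (API 21+) that asks the system to run the sync only when a network is available; SyncJobService is a placeholder for your own JobService:

import android.app.job.JobInfo;
import android.app.job.JobScheduler;
import android.content.ComponentName;
import android.content.Context;

// Schedules a sync job that the system will only run while a network
// connection is available. SyncJobService is a hypothetical JobService.
public class SyncScheduler {
    private static final int SYNC_JOB_ID = 42; // arbitrary illustrative id

    public static void scheduleSync(Context context) {
        JobScheduler scheduler =
                (JobScheduler) context.getSystemService(Context.JOB_SCHEDULER_SERVICE);
        JobInfo job = new JobInfo.Builder(SYNC_JOB_ID,
                new ComponentName(context, SyncJobService.class))
                .setRequiredNetworkType(JobInfo.NETWORK_TYPE_ANY) // wait for connectivity
                .setPersisted(true) // survive reboots (needs RECEIVE_BOOT_COMPLETED)
                .build();
        scheduler.schedule(job);
    }
}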
I would suggest using the Firebase Realtime Database API.
It has nice offline/online sync support built in.
https://firebase.google.com/docs/database/
And it is possible to read/write the data using the Admin SDK from your Node.js server:
https://firebase.google.com/docs/admin/setup
You can use a divide-and-conquer approach: divide the task into small tasks and upload the data to the server in pieces.
1. Take a boolean flag "isFinishData", starting as false.
2. Start uploading the data to the server, records 0 to 100.
3. Next, send records 100 to 200.
4. Repeat this process until the last record (1000) has been sent.
5. On the last batch, set the boolean flag to true and exit the loop.
This logic works fine on both iOS and Android.
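A rough sketch of that batching loop; the Record class and uploadBatch() network call are placeholders, not a real API:

import java.util.List;

// Illustrative batching loop: uploads records in slices of 100 and stops
// at the first failure so the remaining records can be retried later.
public class BatchUploader {
    private static final int BATCH_SIZE = 100;

    public boolean uploadAll(List<Record> records) {
        boolean isFinishData = false;
        int from = 0;
        while (!isFinishData) {
            int to = Math.min(from + BATCH_SIZE, records.size());
            boolean ok = uploadBatch(records.subList(from, to)); // hypothetical network call
            if (!ok) {
                return false; // remaining records stay unsynced and are retried later
            }
            from = to;
            if (from >= records.size()) {
                isFinishData = true; // last batch sent, exit the loop
            }
        }
        return true;
    }

    private boolean uploadBatch(List<Record> batch) {
        // Placeholder for the real POST of one batch.
        return true;
    }

    public static class Record { /* fields omitted */ }
}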
Save your records in a local DB and use an ORM for it. Use Retrofit, which provides success and failure callbacks for web-service calls. To send data to the server at regular intervals you can use a SyncAdapter.
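For reference, a minimal Retrofit 2 sketch of those callbacks (in Retrofit 2 they are called onResponse and onFailure); the RecordApi interface, Record class and endpoint URL are illustrative placeholders, not a real API:

import retrofit2.Call;
import retrofit2.Callback;
import retrofit2.Response;
import retrofit2.Retrofit;
import retrofit2.converter.gson.GsonConverterFactory;
import retrofit2.http.Body;
import retrofit2.http.POST;

// Hypothetical API: POSTs one record and reports success/failure via callbacks.
interface RecordApi {
    @POST("records")
    Call<Void> uploadRecord(@Body Record record);
}

class Record { /* fields omitted */ }

class RecordUploader {
    private final RecordApi api = new Retrofit.Builder()
            .baseUrl("https://example.com/api/") // illustrative URL
            .addConverterFactory(GsonConverterFactory.create())
            .build()
            .create(RecordApi.class);

    void upload(final Record record) {
        api.uploadRecord(record).enqueue(new Callback<Void>() {
            @Override
            public void onResponse(Call<Void> call, Response<Void> response) {
                if (response.isSuccessful()) {
                    // mark the local row as synced here
                }
            }

            @Override
            public void onFailure(Call<Void> call, Throwable t) {
                // leave the row unsynced; the next sync pass retries it
            }
        });
    }
}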
First, I need to know: how did you save the image in the local DB?
You need to create a service to track the connection status. Each time the connection is established, you submit your record as a multipart request. You can use Retrofit or an AsyncTask.
Just submit one record per Retrofit call/AsyncTask; that makes it easy to handle the success/failure of each record.
You can run a single or multiple Retrofit calls/AsyncTasks to submit one or more records; it's up to you.
If your data includes images, on the server side you have to handle passing them from your server to a third server (the server that stores the images).
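A hedged sketch of submitting one record as a multipart request with Retrofit 2; the UploadApi interface, endpoint and part names are assumptions for illustration:

import java.io.File;

import okhttp3.MediaType;
import okhttp3.MultipartBody;
import okhttp3.RequestBody;
import retrofit2.Call;
import retrofit2.http.Multipart;
import retrofit2.http.POST;
import retrofit2.http.Part;

// One record per request: a JSON body part plus its image attachment.
// Interface name, endpoint and part names are illustrative only.
interface UploadApi {
    @Multipart
    @POST("records")
    Call<Void> uploadRecord(@Part("record") RequestBody recordJson,
                            @Part MultipartBody.Part image);
}

class MultipartHelper {
    static Call<Void> buildCall(UploadApi api, String recordJson, File imageFile) {
        RequestBody body = RequestBody.create(
                MediaType.parse("application/json"), recordJson);
        RequestBody imageBody = RequestBody.create(
                MediaType.parse("image/jpeg"), imageFile);
        MultipartBody.Part imagePart = MultipartBody.Part.createFormData(
                "image", imageFile.getName(), imageBody);
        return api.uploadRecord(body, imagePart);
    }
}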
This is a very broad question and it relates to Architecture, UI Experience, limitations, etc.
It seems to be a synchronization pattern where the user can interact with the data locally and offline but at some point, you'd need to synchronize the local data with server-side and vice-versa.
I believe the best place to start is with a background service (Android, not sure if there's a similar approach on iOS). Essentially, regardless of whether the Android app is running or not, the service must handle all the synchronization, interruption, and failure in the background.
If it's a local DB, then you'd need to manage opening and closing the database appropriately, and I'd suggest using a field to mark any synced records so that if some records do fail, you can retry them later.
Also, you can convert the records to a JSON array and then do a POST request, for example:
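For instance, a small sketch that builds a JSON array from unsynced rows and POSTs it in one request; the records table, its columns and the endpoint URL are illustrative assumptions:

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

import org.json.JSONArray;
import org.json.JSONObject;

import android.database.Cursor;
import android.database.sqlite.SQLiteDatabase;

// Builds a JSON array from unsynced rows and POSTs it in one request.
// Table/column names and the endpoint URL are illustrative only.
class JsonBatchSync {
    static int postUnsynced(SQLiteDatabase db) throws Exception {
        JSONArray payload = new JSONArray();
        Cursor cursor = db.query("records", new String[] {"_id", "payload"},
                "synced = 0", null, null, null, null);
        while (cursor.moveToNext()) {
            JSONObject row = new JSONObject();
            row.put("id", cursor.getLong(0));
            row.put("payload", cursor.getString(1));
            payload.put(row);
        }
        cursor.close();

        HttpURLConnection conn = (HttpURLConnection)
                new URL("https://example.com/api/records").openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "application/json");
        OutputStream out = conn.getOutputStream();
        out.write(payload.toString().getBytes(StandardCharsets.UTF_8));
        out.close();
        return conn.getResponseCode(); // 2xx means the batch can be flagged as synced
    }
}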
As for uploading images, they definitely need to be done in batches if there are a lot of them, while also making sure to keep track of which ones have been uploaded and which ones haven't.
The one problem that you will run into if you're supporting synchronization from different devices and platforms is that you'll have conflicting data being synchronized against the backend. You'll need to handle this case; otherwise, it could get very messy and most likely cause a lot of weird issues.
Hope this helps on a high level :)
To take a simple approach, have one flag in your data object [NSManagedObject] classes called sync. While creating a new object or modifying an existing one, set the sync flag to false.
Filter the data objects whose sync value is false:
let unsyncedFilter = NSPredicate(format: "sync == %@", NSNumber(value: false))
Now you will have an array of objects that you want to sync with the server, if you are sending objects one by one in requests.
On success, change the sync flag to true; otherwise, whenever your function executes again (on app launch or a reachability status update), it will filter out the unsynced data again and start the sync.
As others have mentioned this is a rather broad question. A lot depends on both the architecture of the server that will receive the data as well as the architecture of the app.
If you have any control over the implementation of your backend, I would recommend implementing a storage solution that allows for pausing and resuming transfers. Both Google Cloud Storage and Amazon S3 offer similar functionality.
The idea behind this approach is to be able to pick up the upload from where it stopped. In case of an app crash or internet connection issues, you don't have to restart the whole upload from the beginning.
In your case I would still start separate uploads for each one of the records and store their upload progress.
Here you can find an example of how to use the pause / resume approach using the mobile SDK with Amazon https://aws.amazon.com/blogs/mobile/pause-and-resume-amazon-s3-transfers-using-the-aws-mobile-sdk-for-android/.
Edit: adding a reference to the Amazon iOS SDK: http://docs.aws.amazon.com/mobile/sdkforios/developerguide/s3transfermanager.html
The best way is to break the files into chunks of 100 and upload them at intervals or when the app is idle.
This may be a duplicate question, but I still have doubts. I am a beginner in Android development and I have a couple of questions; my primary one is:
I have made an application that communicates with the server when the network is available, and it works as expected. When the network is not available, data is saved in SQLite; later, when the network is available again, I need to sync that data to the server. How can I achieve this?
Whenever a new update is made on the server, I need to get a notification. How can I do this?
Which would be the best approach for this: a SyncAdapter, a service, or an IntentService with a BroadcastReceiver? Which would be the optimized solution for the above requirement?
Those are all my doubts; I would be very glad if someone could help me!
If you want an Android app to be notified when something happens on a server you control (without having the app constantly poll the server to ask for changes), the usual solution is to use Google Cloud Messaging to allow the server to send a notification to the app telling it to refresh its data.
It is kind of complicated to implement, but is the best way to do what you want and is standard practice for mobile apps.
If you need to know when the network becomes available, to reach your server for synchronization, implement connectivity change listener, as discussed in this question.
This does not allow the server to push messages easily, but if the server messages are not of high urgency, maybe you can simply check for them periodically.
This would let you rely less on Google-specific infrastructure and make it easier to change cloud providers.
I am implementing a chat app in android. A vital part of this app is to sync with the server and local database. There are several methods to sync data between server and android device like AsyncTask, IntentService and SyncAdapter.
I prefer to use SyncAdapter, because it is more efficient and it handles most of the background tasks by itself.
When I read the developer page for SyncAdapter I found this,
Note: Sync adapters run asynchronously, so you should use them with the expectation that they transfer data regularly and efficiently, but not instantaneously. If you need to do real-time data transfer, you should do it in an AsyncTask or an IntentService.
Does that mean it is not good to use for something like a chat app?
Also, I need to mention a feature of SyncAdapter:
Automated execution
Allows you to automate data transfer based on a variety of criteria, including data changes, elapsed time, or time of day. In addition, the system adds transfers that are unable to run to a queue, and runs them when possible.
So if it starts to sync when data changes (since the new messages are stored in the SQLite database), I think SyncAdapter would be a good choice for a chat app.
Any Suggestions are appreciated.
Thanks.
Usually a mobile app depends on the backend implementation and the app's requirements, but generally you shouldn't use such methods for a chat application; they won't give you up-to-date data.
I'd say that when the app is in the background you should use GCM for new-message notifications, and when the app is in the foreground use something like RPC, XMPP, sockets, or whatever keeps your connection alive.
My architecture will use ActiveMQ on the server and have Android clients send and receive messages. The network situation will be very unreliable, possibly with hours of missing connection. Is there a framework that will allow me to queue up the messages on the Android client and deliver them reliably once the connection is back?
You can efficiently implement one yourself. I don't think anyone will provide you this service, and if they do they will certainly charge for it. Here is what I can suggest for an optimal solution.
Design a DB using SQLite to hold your messages. Once a message is ready for delivery from the Android client, you can do the following:
a. If the network is available, deliver the message directly to your web client.
b. If the network is not present, cache it directly in your local Android DB.
Design the sync logic. You can achieve it with a network listener: when the user's device comes back onto the network, query the database, post the records to your web client, and subsequently delete the local data upon successful posting to the server.
You can strengthen your logic by always caching the message into the local DB first, and then having the sync logic commit your local changes to the web server in bulk, improving processing time.
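A rough sketch of that sync pass, assuming a local messages table and a placeholder postToServer() call standing in for the real delivery:

import android.database.Cursor;
import android.database.sqlite.SQLiteDatabase;

// Illustrative sync pass: post each cached message, delete it locally
// only after the server confirms, so nothing is lost on interruption.
class MessageSyncer {
    static void syncPending(SQLiteDatabase db) {
        Cursor cursor = db.query("messages", new String[] {"_id", "body"},
                null, null, null, null, "_id ASC");
        while (cursor.moveToNext()) {
            long id = cursor.getLong(0);
            String body = cursor.getString(1);
            if (postToServer(body)) { // hypothetical network call
                db.delete("messages", "_id = ?",
                        new String[] { String.valueOf(id) });
            } else {
                break; // stop on failure; remaining rows stay cached for the next pass
            }
        }
        cursor.close();
    }

    private static boolean postToServer(String body) {
        // Placeholder for the real delivery to the server.
        return true;
    }
}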
Hope this answers your problem.
This is more of a conceptual question not necessarily bound to any specific technologies.
Let's say you have some database on a server, some REST/JSON API to access the content in that database, and some mobile client displaying data retrieved through the API.
It would be nice to have some caching mechanism on the client and also to be able to enable offline access to the data as long as the client is only reading (In my case it's fine to deny write access to offline clients to avoid having to manage all those nasty conflicts that might happen).
It appears that a nice way to solve this would be to have a subset of the server's database model present on the client and to synchronize data from the server to the client.
Access to the local database might then immediately return results but also trigger update requests to the server. In case the server returns modified data, the client then synchronizes its local database and notifies the display of data changes.
The goal in the end is, of course, that the user can browse the information regardless of the stability of their internet connection and is not annoyed by connection dialogs or similar as long as they don't modify any data.
Now from an implementation perspective... on one hand it seems like a bad idea to couple the server database directly to the client database, as they may be from different vendors. I guess at least there would need to be a vendor-independent model above both database implementations. On the other hand, transforming the data from the server database into some transport format and then putting it back into the client database seems like a lot of overhead.
Any suggestions how to solve that in an elegant and maintainable way?
I am working on an app that syncs small portions of a large database locally onto the handset. There is an initial preload that has to occur on the handset but after that the updates happen asynchronously in the background.
First of all, decoupling the server and handset using JSON or XML is highly advised. Locking into one technology always causes issues, as you are forced to use the same technology regardless of the platform. That is, if you plan on expanding to other platforms (web, iOS, etc.), you are forced to use the format dictated by the server. Choosing a generic format will make that simpler in the long run. In reality, with the number of public libraries available, reading/writing JSON is a trivial matter.
There are two ways that we use to sync the data:
1. AlarmManager
We schedule the AlarmManager to wake up a service on a regular schedule (let's say every 6 hours). The wakeup starts a background service that contacts the server, downloads the changes as JSON, and updates a local SQLite DB. If there is no connection, the update is skipped and picked up at the next wakeup. We add a ConnectivityChanged receiver to automatically restart the sync when the connection is restored.
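A condensed sketch of that scheduling; SyncService is a placeholder for the background sync service described above:

import android.app.AlarmManager;
import android.app.PendingIntent;
import android.content.Context;
import android.content.Intent;
import android.os.SystemClock;

// Schedules an inexact repeating wakeup (roughly every 6 hours) that starts
// the background sync service. SyncService is a hypothetical IntentService.
class SyncAlarm {
    static void schedule(Context context) {
        AlarmManager alarmManager =
                (AlarmManager) context.getSystemService(Context.ALARM_SERVICE);
        PendingIntent pendingIntent = PendingIntent.getService(
                context, 0, new Intent(context, SyncService.class), 0);
        alarmManager.setInexactRepeating(
                AlarmManager.ELAPSED_REALTIME_WAKEUP,
                SystemClock.elapsedRealtime() + AlarmManager.INTERVAL_FIFTEEN_MINUTES,
                6 * AlarmManager.INTERVAL_HOUR, // then roughly every 6 hours
                pendingIntent);
    }
}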
2. GCM
It's a little more work but saves a lot of battery and data usage if you only update the local database when there are changes. Google Cloud Messaging can send a wakeup message to the device and tell it to start the sync service. The sync service runs the same as the AlarmManager method above.
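And a hedged sketch of the GCM side, using the old com.google.android.gms.gcm listener; the message key and SyncService are assumptions for illustration:

import android.content.Intent;
import android.os.Bundle;

import com.google.android.gms.gcm.GcmListenerService;

// When the server sends a "sync" push, wake up and run the same sync
// service the AlarmManager path uses. The message key is illustrative.
public class SyncGcmListenerService extends GcmListenerService {
    @Override
    public void onMessageReceived(String from, Bundle data) {
        if ("sync".equals(data.getString("action"))) {
            startService(new Intent(this, SyncService.class));
        }
    }
}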
We do a combination of both of the methods above depending on how "fresh" you need the data and how often it changes. Something like an RSS feed should probably be updated every 30min whereas weather data may not need to be updated more than every 4 hours.
So to run the database sync we use:
Receivers -> listen for system events and trigger Service
Services -> connect to the server, download the JSON and update the SQLite providers
Providers -> insert the records into the database and broadcast content changes to ContentObservers
ContentObservers -> when the app is running, the ContentObservers update the UI with the new data
There are a lot of technical details in each of the components above, but that should provide you with a very robust architecture for syncing server data with a local DB.
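As a small illustration of the last two steps, a ContentObserver registered against a hypothetical provider URI so the UI reloads when the sync service inserts new rows:

import android.database.ContentObserver;
import android.net.Uri;
import android.os.Handler;

// Registered from an Activity/Fragment while it is visible; the provider
// URI is a hypothetical example of the app's own ContentProvider.
class RecordsObserver extends ContentObserver {
    static final Uri RECORDS_URI = Uri.parse("content://com.example.app.provider/records");

    RecordsObserver(Handler handler) {
        super(handler);
    }

    @Override
    public void onChange(boolean selfChange) {
        // Re-query the provider (e.g. restart a Loader) and refresh the UI.
    }
}

// In onStart():  getContentResolver().registerContentObserver(
//                        RecordsObserver.RECORDS_URI, true, observer);
// In onStop():   getContentResolver().unregisterContentObserver(observer);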
I'm working on a project that has similar requirements. We want to have a big, available database on a server somewhere and then mobile devices that get data from it. If the devices go offline it's ok because they have saved their own copies of the data locally.
We've decided to use BigCouch (a fork of Apache CouchDB that supports clustering) as the server technology and Couchbase Mobile on the mobile devices. (As a note, TouchDB for Android will replace Couchbase Mobile, but it's not stable yet.)
The reason we went with Couch* technologies is that Couch has good replication over HTTP. You can programmatically initiate a sync event on the mobile device and it will replicate all inserts, updates and deletes for you. It stores the information in its own embedded CouchDB on the mobile device, so it can be read offline.
If you didn't want to go down the Couch road, you could simply use something like SQLite to store the results of your REST/API calls. Then you would have to write your own replication logic for when a mobile device goes offline and then comes back. There are creative ways to do this, so maybe it's an option.