Large encrypted database in an Android app. A good practice? - android

I want to ship a 22 MB encrypted SQLite DB in the assets folder of my Android app. When the app runs, I want to copy this DB to the actual DB in /data/data/..., which is also supposed to be encrypted. Then I want to sync this DB from time to time with the DB on my server. I have no doubts about the implementation part, but can anyone please suggest whether it is a good practice and worth it?
Thanks.

When the app runs, I want to copy this DB to the actual DB
This is a one-time initialization? That's OK; even better if you delete the seed file after initialization.
actual DB in /data/data/... which is also supposed to be encrypted.
AFAIK doing this means you are using at least an extra 22 MB of RAM every time you do a database operation, and at least 22 MB worth of disk writes every time you commit. Please do this sensibly, e.g. use batching techniques, do the disk writes in a background thread, etc.
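One cheap way to tame the commit cost is batching many writes into a single transaction, so there is one commit instead of one per statement. A minimal sketch in plain Python with the stdlib sqlite3 module (the table name is made up; on Android the same idea maps to SQLiteDatabase.beginTransaction()/setTransactionSuccessful()/endTransaction()):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT)")

# Wrap many inserts in one transaction: one commit, one coherent flush,
# instead of a separate disk sync per statement.
rows = [(i, "item-%d" % i) for i in range(1000)]
with conn:  # opens a transaction and commits on success
    conn.executemany("INSERT INTO items (id, name) VALUES (?, ?)", rows)

count = conn.execute("SELECT COUNT(*) FROM items").fetchone()[0]
print(count)  # 1000
```

The difference is dramatic on real hardware: without the transaction, SQLite syncs the journal to disk after every single INSERT.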
Then I want to sync this DB time to time with the db on my server.
It depends on how you do the sync, and on the frequency. If you upload the full 22 MB every time, it's not OK. If you only upload what has changed, it's OK. Give users the option to sync only when on Wi-Fi.

Related

Best way to update a Sqlite DB from a server?

I have an Android app with an SQLite database (about 800 MB). Sometimes I need to insert, modify or delete database rows from an external server (via the internet) in order to update the database.
Is there a way to update the database from the server without having to download the entire database (800 MB)?
I was thinking of a homemade solution that consists of adding a new column to the server database that indicates whether a given row needs to be inserted, deleted or modified by the Android app, but I don't know if something like this is already implemented.
First question: does the database also change locally on the Android device? If so, you're basically into cache coherency. There's an old joke that the two hardest problems in CS are cache coherency and naming things. It's not totally wrong.
If you do need to keep local changes, especially if you need to sync local changes up, this needs to be a small book, so I'm going to assume not for the rest of the answer.
Honestly, if your DB needs to scale at all or you need to make changes frequently, downloading a new DB is the way to go. Doing any sort of diff against the DB is going to cost you a lot of DB processor time, which translates into bigger or faster DB servers, which equals money; or a big perf hit on any other use of the DB.
If you do decide you need to do this, you need two extra columns. One: an isDeleted flag. That way you can easily check for deleted rows (the only other way to do so is to download all rows and see what's missing, which is a very bad idea). Please note you'll need to change every DB query you make anywhere to add "and isDeleted = false" as a condition so you don't return deleted rows.
The second column isn't an "isModified" field, it's a "modifiedTime" timestamp. Why a timestamp? Because you can't be assured that a client downloading the DB was only 1 version behind. It could be 2. Or 10. You need to be able to get all the changes from all the previous versions as well, so an isModified flag isn't good enough. With a modifiedTime field, you can find the max modifiedTime in the local DB, then ask the server for all rows with a modifiedTime greater than yours. You'll then either need to change all your inserts and updates to also set modifiedTime, or use a trigger to do so.
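The isDeleted/modifiedTime scheme can be sketched end to end. The table, columns and trigger names below are hypothetical; the example uses Python's stdlib sqlite3 module, but the SQL itself is the same on Android:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE products (
        id           INTEGER PRIMARY KEY,
        name         TEXT,
        isDeleted    INTEGER NOT NULL DEFAULT 0,
        modifiedTime INTEGER NOT NULL DEFAULT 0  -- unix epoch seconds
    )
""")
# A trigger keeps modifiedTime current so ordinary UPDATEs need no extra code.
# (Recursive triggers are off by default, so the inner UPDATE won't re-fire it.)
conn.execute("""
    CREATE TRIGGER products_touch AFTER UPDATE ON products
    BEGIN
        UPDATE products SET modifiedTime = strftime('%s', 'now') WHERE id = NEW.id;
    END
""")

conn.executemany("INSERT INTO products (id, name, modifiedTime) VALUES (?, ?, ?)",
                 [(1, "old", 100), (2, "stale", 100), (3, "fresh", 200)])

# Soft-delete so clients can learn about removals during a delta sync.
conn.execute("UPDATE products SET isDeleted = 1 WHERE id = 2")

# Server side of a delta sync: return rows newer than the client's high-water mark.
local_max = 100
changed = conn.execute(
    "SELECT id, name, isDeleted FROM products WHERE modifiedTime > ?",
    (local_max,)).fetchall()

# Every normal read must filter out soft-deleted rows.
live = [r[0] for r in conn.execute(
    "SELECT id FROM products WHERE isDeleted = 0 ORDER BY id")]
print(sorted(r[0] for r in changed), live)  # [2, 3] [1, 3]
```

Note that the soft-deleted row (id 2) shows up in the delta result because the trigger bumped its modifiedTime, which is exactly how clients learn about deletions.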
There are a few other ways to do it: a migration-file approach (a file with the SQL commands to alter the data) can work if your changes are small. Really, though, just download the DB. It's so much simpler and less likely to break things. And if you're doing large updates, it may even be less bandwidth. Most importantly, if you just download the file you know the data is correct; if you try to do some kind of diff like the above, you have to worry about bugs or inconsistencies in the data for various reasons (did your app get killed while processing the changes? Do you have a bug? Did you do a query mid-change and get broken data, with only half the changes you need?). Downloading a new file and swapping the DBs when done fixes all of those things.
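The "download a new file and swap" step benefits from an atomic rename, so readers never see a half-written database. A minimal sketch with hypothetical file names, in Python (os.replace is atomic on the same filesystem; Android's java.io.File.renameTo plays a similar role):

```python
import os
import sqlite3
import tempfile

def swap_in_new_db(db_path, downloaded_path):
    """Atomically replace the live database with a downloaded copy."""
    # Sanity-check the download before committing to it.
    check = sqlite3.connect(downloaded_path)
    ok = check.execute("PRAGMA integrity_check").fetchone()[0]
    check.close()
    if ok != "ok":
        raise ValueError("downloaded database failed integrity check")
    # os.replace is atomic on the same filesystem: readers see either
    # the old file or the new one, never a half-written mix.
    os.replace(downloaded_path, db_path)

# Demo: a "live" v1 database and a freshly "downloaded" v2 database.
workdir = tempfile.mkdtemp()
live = os.path.join(workdir, "app.db")
fresh = os.path.join(workdir, "app.db.download")
for path, version in ((live, 1), (fresh, 2)):
    c = sqlite3.connect(path)
    c.execute("CREATE TABLE meta (version INTEGER)")
    c.execute("INSERT INTO meta VALUES (?)", (version,))
    c.commit()
    c.close()

swap_in_new_db(live, fresh)
version = sqlite3.connect(live).execute("SELECT version FROM meta").fetchone()[0]
print(version)  # 2
```

Make sure every open connection to the old file is closed before the swap, and download to the same directory as the live DB so the rename stays on one filesystem.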

Android DB in app-specific folder vs external folder

I have an Android app that writes data to a DB. Initially, we stored the DB in an app-specific folder (something like /storage/org.domain.app/db/db.sync and so on). Things worked with no issues. However, the issue was that this DB would get wiped out every time we reinstalled the app.
We therefore moved the DB to a different folder outside the app (now the path is something like /storage/data/db/db.sync). Now, if we reinstall the app, we still have the old DB entries. However, reading from and writing to the DB is now a lot slower.
I am not sure whether this slowness is because the amount of data has shot up, since the data is now persistent across multiple reinstalls.
I suspect it could also be because the DB is now in a folder external to the app. To rule this out, I set up the DB in an app-specific folder and intend to copy in the entire DB from some previous tests (which contains a lot more data, the amount that caused DB access to slow down). However, I am unable to do so.
Some questions:
1. I am not able to actually access the folder where the DB is set up. The folder is created in Context.MODE_PRIVATE mode, so only the app itself can access it. I tried changing it to Context.MODE_WORLD_READABLE; however, this is deprecated and throws an exception when used. Is there a way to make this folder easily editable?
I also tried accessing the DB from the Device File Explorer in Android Studio. I opened the existing DB (the db.sync file) and just copy-pasted the large DB I already have. However, it looks like this method does not allow copying or even displaying more than 400 KB of data. The DB I have is around 60-70 MB, so this approach did not work either.
Is there a way I can edit the db.sync file?
Does the DB being in a folder external to the app make it slower? Especially when there are many entries. In my case, certain search operations happen via a local cache and complete in a jiffy. However, we also need to log some data to the DB, and that takes forever. I figured this out by adding timer logs in the code.
Is there anything else I could do to identify whether the DB path is causing the issue?
Thanks,
Anand

How to Manage increasing size of sqlite in Mobile App?

My app sometimes needs to sync with web servers and pull data into the mobile SQLite database for offline usage, so the database size keeps growing quickly.
I want to know how professional apps like WhatsApp, Hike, Evernote etc. manage their offline SQLite databases.
Please suggest steps to solve this problem.
PS: I am asking about offline database management (i.e. the database growing in size after syncing); do not confuse this with syncing the database with web servers.
I do not know how large your data is. However, it should not be a problem to store reasonably large data in the internal memory of an application. The internal storage is shared among all applications, and hence the database can grow until the storage is full.
In my opinion, the main problem here is query time if you do not have proper indexes on your database tables. Otherwise, keeping the database in internal storage is completely fine, and I do not think you have to worry about the amount of data that can be stored in the internal storage of an application, as newer Android devices provide better storage capacity.
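To check whether a query is actually backed by an index, SQLite's EXPLAIN QUERY PLAN is handy. A small sketch with a hypothetical messages table, using Python's stdlib sqlite3 module (the same statements work in the sqlite3 shell or on Android):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE messages (id INTEGER PRIMARY KEY, chat_id INTEGER, body TEXT)")
conn.executemany("INSERT INTO messages (chat_id, body) VALUES (?, ?)",
                 [(i % 50, "hello") for i in range(5000)])

# Without an index this query scans all 5000 rows; with the index,
# SQLite seeks directly to the matching ones.
conn.execute("CREATE INDEX idx_messages_chat ON messages(chat_id)")

plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM messages WHERE chat_id = ?", (7,)
).fetchone()
print(plan[-1])  # e.g. SEARCH messages USING INDEX idx_messages_chat (chat_id=?)
```

If the plan says SCAN instead of SEARCH ... USING INDEX, the query is doing a full table scan and is a candidate for a new index.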
Hence, if your database is really big and does not fit into internal memory, you might consider keeping only the data that is used frequently and deleting the rest. This highly depends on the use case of your application.
In one of the applications that I developed, I stored some large databases in external memory and copied them into internal memory whenever necessary. Copying the database from external storage into internal storage took some time (a few seconds), though. However, once the database was copied I could run queries efficiently.
Let me know if you need any help or clarification on some points. I hope that helps.
For maximum-size databases: AFAIK you don't want to lose what's on the device and force a reload.
Ensure you don't drop the database with each new release of your app when a simple ALTER TABLE ADD COLUMN will work.
For whatever you do archive and remove from the device, give the user a way to load it back in the background.
There might be some apps/databases for which you can find documentation, but such cases are probably limited and the exception.
So to know exactly what's going on, you need to create some snapshots of the databases. You can start with one app only, or do several at once, but without analyzing them you won't get a reliable answer.
The reasons might even differ for each app, as databases and app features naturally differ too.
Faster growth in size than the amount of incoming content might be related to cache tables or to indexing for search, but perhaps other reasons exist too. Without verification and some important basic info, it's impossible to give you a detailed reason.
It's possible that the table names of a database already give some hints, but if the table names or fields are just meaningless strings, then you have to analyze the data inside, including the changes between snapshots.
The following link will help in understanding what exactly WhatsApp is using:
https://www.quora.com/How-is-the-Whatsapp-database-structured
I'm not really sure if you have to keep all the data stored on the device all the time, but if you have a choice you can always use cloud services (like FCM, AWS) to store or back up most of the data. If you need to keep all the data on the device, then perhaps one way is to use caching mechanisms in your app.
For example: use an LRU (Least Recently Used) policy to cache/store the data that you need on the device, while storing the rest in the cloud and deleting what's unneeded from the device. If needed you can always retrieve the data on demand (e.g. if the user pulls to refresh, or on a different action) and delete it whenever it's not being used.
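The LRU idea can be implemented directly in SQL with a lastAccessed column. A sketch with a hypothetical cache table, in Python's stdlib sqlite3 (the SQL carries over to Android unchanged):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE cache (
        key          TEXT PRIMARY KEY,
        value        BLOB,
        lastAccessed INTEGER NOT NULL
    )
""")

def touch(key, now):
    # Record every read so eviction can find the least recently used rows.
    conn.execute("UPDATE cache SET lastAccessed = ? WHERE key = ?", (now, key))

def evict(max_rows):
    # Keep the max_rows most recently used entries and delete the rest.
    conn.execute("""
        DELETE FROM cache WHERE key NOT IN (
            SELECT key FROM cache ORDER BY lastAccessed DESC LIMIT ?
        )
    """, (max_rows,))

conn.executemany("INSERT INTO cache VALUES (?, ?, ?)",
                 [("k%d" % i, b"data", i) for i in range(10)])
touch("k0", 100)  # k0 becomes the most recently used entry
evict(3)
remaining = [r[0] for r in conn.execute("SELECT key FROM cache ORDER BY key")]
print(remaining)  # ['k0', 'k8', 'k9']
```

Anything evicted locally can be re-fetched from the server on demand, which is exactly the "retrieve on demand" behavior described above.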

Load entire SQLite database into RAM to improve performance

I have a large SQLite database for my application (around 100 MB). This database is read-only and never written to. The performance of working with the database is decent, since I made sure that most queries are backed by indexes. However, I would like to speed it up further by caching the entire database in RAM to avoid disk access.
Is this possible and if so how can I do it?
A database file is a file, so you can just read it to put it into the OS's file cache. (And if the OS decides to throw the data out again, then it might have a reason for it …)
For an on-disk database, SQLite needs to check whether some other process has made changes. To avoid that, you can use PRAGMA locking_mode = EXCLUSIVE. However, the difference probably isn't measurable.
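Another option, if you control the code, is SQLite's online backup API, which copies an on-disk database into a :memory: connection in one call. A sketch using Python's stdlib sqlite3 (Connection.backup); the stock Android SQLite API does not expose the backup API, so there you would need a binding that does:

```python
import os
import sqlite3
import tempfile

# Build a small database on disk to stand in for the app's 100 MB one.
path = os.path.join(tempfile.mkdtemp(), "big.db")
disk = sqlite3.connect(path)
disk.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, v TEXT)")
disk.executemany("INSERT INTO t (v) VALUES (?)", [("x",)] * 1000)
disk.commit()

# Copy the whole database into an in-memory connection; queries made on
# `ram` afterwards never touch the disk.
ram = sqlite3.connect(":memory:")
disk.backup(ram)
disk.close()

count = ram.execute("SELECT COUNT(*) FROM t").fetchone()[0]
print(count)  # 1000
```

For a 100 MB read-only database this costs 100 MB of RAM up front, so it only makes sense on devices with memory to spare; otherwise the OS file cache mentioned above usually does a good enough job.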

android application with huge database

Let me explain how my application is supposed to work:
The application will ship with an SQLite database in its assets folder, which will later be copied into the databases folder. It has some content in it (categories, subcategories, products and news), all of which have images. After download, the user can update the content via the internet, and the application stores the new content in the database so it can work offline.
So my question is: after a while this content will increase in size. Is it going to cause my application to crash? Let's say I release the application with a 1 MB database and after 2 years of use the database size grows to around 120 MB. Is that going to make the application crash?
My other concern is that I'm currently storing the images in the database and loading them from there. Is that a good approach? I don't want the user to be able to clear the cache or delete the images, because on the next content update the app would have to download those deleted images again, which would consume traffic, IMO.
Please remember that the application should be able to load content offline.
No, applications don't just crash because they have a large database.
Part of the point of a Cursor is that it gives you a view into a large set of data, without having to load it all into memory at the same time.
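That row-at-a-time behavior is easy to see: iterating a cursor streams results instead of materializing them all. A tiny sketch in Python's stdlib sqlite3 (Android's Cursor gives you the same windowed access via moveToNext()):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE big (id INTEGER PRIMARY KEY, payload TEXT)")
conn.executemany("INSERT INTO big (payload) VALUES (?)", [("row",)] * 100000)

# Iterating the cursor pulls rows one at a time; fetchall() would
# materialize the whole result set in memory at once.
total = 0
for (payload,) in conn.execute("SELECT payload FROM big"):
    total += 1
print(total)  # 100000
```

So even a 120 MB table is fine to query, as long as you page through results rather than loading everything into a list at once.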
If you follow best practices I see no problem - you're using a database. Forget for a second that it's on Android - you should optimize your table structure, indexes, etc, as best you can.
Also, large database or not, don't make any queries to it on the main thread. Use the Loader API if you need to show the result of a query in your UI.
Last, potentially most importantly, rethink why you even need such a large database. Is it really that common that a user will need to access all data ever while offline? Or might it make more sense for you to only store data from the last week or month, etc, and tell them that they need to be online to access older data.
Regarding your second question: please, in the future, separate that into its own question. But no, storing binary blobs (images in this case) in an SQLite database is not a good approach. Also, if the user clears the app's data, everything is gone, so a database offers no protection there. I would suggest storing images in a folder named after your app in the device's external storage, and storing the image URIs/names in the database.
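The files-plus-paths approach can be sketched as follows. The folder and naming scheme are hypothetical; the example is plain Python with the stdlib sqlite3 module, but the structure maps directly to writing image files in external storage on Android and keeping their paths in a table:

```python
import os
import sqlite3
import tempfile

image_dir = tempfile.mkdtemp()  # stands in for the app's image folder
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE products (id INTEGER PRIMARY KEY, image_path TEXT)")

def save_image(product_id, image_bytes):
    # Write the binary data to its own file and store only the path in the DB.
    path = os.path.join(image_dir, "product_%d.jpg" % product_id)
    with open(path, "wb") as f:
        f.write(image_bytes)
    conn.execute("INSERT INTO products (id, image_path) VALUES (?, ?)",
                 (product_id, path))
    return path

path = save_image(1, b"\xff\xd8fake-jpeg-bytes")
stored = conn.execute(
    "SELECT image_path FROM products WHERE id = 1").fetchone()[0]
print(os.path.exists(stored))  # True
```

This keeps the database small and queryable while the bulky binary data lives in the filesystem, where image loaders can read it directly.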
Any problem with the database will cause an SQLiteException, which you can handle in your app to prevent abnormal termination.
That said, a database of 120 MB seems like a lot; are you sure your users will want all of that?
