How to reset a Room database - Android

I have tried many ways to reset it:
allowBackup and fullBackupOnly have been set to false,
.fallbackToDestructiveMigration(),
and deleting the database and cache files directly,
but nothing works.

The simplest way is to uninstall the app; this deletes the database file(s), so rerunning starts from a brand new database.
To use .fallbackToDestructiveMigration() you have to invoke a Migration by increasing the version number but NOT have a Migration for the particular path. You could argue that this doesn't reset the database, as the newly created database will have the higher version number.
Using clearAllTables doesn't entirely reset the database, as it will not delete the system tables. Most notably sqlite_sequence, which is a table that holds the value of the latest rowid on a per-table basis. That is, if you have autoGenerate = true in the @PrimaryKey annotation for a field/column that resolves to a column type affinity of INTEGER, then AUTOINCREMENT is coded, the sqlite_sequence table is created (if not already in existence) and it stores the latest (and therefore highest) value of the said primary key. Thus if you have inserted 100 rows (for example) into such a table, then after a clearAllTables the value 100 will still be stored in the sqlite_sequence table.
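If the stored sequence values matter, a minimal sketch (an assumption, not part of the question's code) is to follow clearAllTables with a manual clear of sqlite_sequence; note that sqlite_sequence only exists once an AUTOINCREMENT column has actually been used, and clearAllTables must not be called inside a transaction:
fun RoomDatabase.clearAllTablesAndSequences() {
    clearAllTables() // Room-generated; runs its own transaction, checkpoint and VACUUM.
    // Only attempt the delete if the sqlite_sequence table actually exists.
    openHelper.writableDatabase.query(
        "SELECT name FROM sqlite_master WHERE type='table' AND name='sqlite_sequence'"
    ).use { cursor ->
        if (cursor.count > 0) {
            openHelper.writableDatabase.execSQL("DELETE FROM sqlite_sequence")
        }
    }
}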
You could also delete the database file prior to building the database. Here's an example that allows it to be deleted when building :-
@Database(entities = [Customer::class], version = 1)
abstract class CustomerDatabase: RoomDatabase() {
    abstract fun customerDao(): CustomerDao

    companion object {
        private var instance: CustomerDatabase? = null

        fun getDatabase(context: Context, resetDatabase: Boolean): CustomerDatabase {
            if (resetDatabase && instance == null) {
                (context.getDatabasePath("thedatabase.db")).delete()
            }
            if (instance == null) {
                instance = Room.databaseBuilder(context, CustomerDatabase::class.java, "thedatabase.db")
                    .allowMainThreadQueries()
                    .build()
            }
            return instance as CustomerDatabase
        }

        fun getDatabase(context: Context): CustomerDatabase {
            if (instance == null) {
                instance = Room.databaseBuilder(context, CustomerDatabase::class.java, "thedatabase.db")
                    .allowMainThreadQueries()
                    .build()
            }
            return instance as CustomerDatabase
        }
    }
}
Note that in addition to requesting the reset, a check is also made to ensure that an instance of the database hasn't already been retrieved.
This would also be more efficient than clearAllTables, which still incurs processing of the underlying data and the ensuing VACUUM, which can be quite resource hungry.
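For example, hypothetical usage (the flag only has an effect if it is passed before any other code has obtained the instance):
// e.g. from a debug/settings action, before the singleton has been built:
val db = CustomerDatabase.getDatabase(applicationContext, resetDatabase = true)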

You can use clearAllTables for this.
This deletes all rows from all the tables that are registered to this database as Database.entities().

The Room database has a clearAllTables function that clears the entities you defined with the @Entity annotation. But there is a catch: it does not clear the system-generated tables such as sqlite_sequence, which stores the autoincrement values.
But there are more factors to consider. Since clearAllTables itself runs in a transaction, we cannot run a combination of clearAllTables and clearing sqlite_sequence in a single transaction. If you try to run clearAllTables inside a transaction, it will fail with an IllegalStateException.
The Android SQLite database library creates an additional table called android_metadata, which stores the database locale, and the Room database library creates another table called room_master_table, which keeps track of database integrity and helps database migrations. We should not delete or clear these two tables. Additionally, SQLite will create a sqlite_sequence table if you have defined autoincrement columns. Deleting this table is not allowed, but clearing it will reset the autoincrement values.
The Room compiler generates the clearAllTables function in the database class. Basically, it disables foreign key constraints, then starts a transaction and clears all rows in the tables you have given in the database class, and after the end of the transaction it re-enables foreign key constraints. See how this is done in the Room compiler source code: room / room-compiler / src / main / kotlin / androidx / room / writer / DatabaseWriter.kt / createClearAllTables. The generated function differs based on one factor: whether you have defined foreign key constraints or not.
Based on the compiler source code, I wrote an extension function to reset the database. It will clear all tables you defined and will reset the autoincrement values.
fun RoomDatabase.resetDatabase(tables: List<String>? = null): Boolean {
    val db = openHelper.writableDatabase
    val tableNames = db.getTableNames()
    val hasForeignKeys = db.hasForeignKeys(tables ?: tableNames.minus("sqlite_sequence"))
    val supportsDeferForeignKeys = db.supportsDeferForeignKeys()
    return try {
        if (hasForeignKeys && !supportsDeferForeignKeys) {
            // clear enforcement of foreign key constraints.
            db.execSQL("PRAGMA foreign_keys = FALSE")
        }
        db.beginTransaction()
        if (hasForeignKeys && supportsDeferForeignKeys) {
            // enforce foreign key constraints after outermost transaction is committed.
            db.execSQL("PRAGMA defer_foreign_keys = TRUE")
        }
        // clear all tables including sqlite_sequence table.
        // deleting sqlite_sequence table is required to reset autoincrement value.
        val tablesToClear = tables?.let {
            if (tableNames.contains("sqlite_sequence")) {
                it.plus("sqlite_sequence")
            } else {
                it
            }
        } ?: tableNames
        for (tableName in tablesToClear) {
            db.execSQL("DELETE FROM $tableName")
        }
        db.setTransactionSuccessful()
        true
    } catch (e: Exception) {
        false
    } finally {
        db.endTransaction()
        if (hasForeignKeys && !supportsDeferForeignKeys) {
            // restore enforcement of foreign key constraints.
            db.execSQL("PRAGMA foreign_keys = TRUE")
        }
        // blocks until there is no database writer and all are reading from the most recent database snapshot.
        db.query("PRAGMA wal_checkpoint(FULL)").close()
        if (!db.inTransaction()) {
            db.execSQL("VACUUM")
        }
    }
}

fun SupportSQLiteDatabase.getTableNames(
    exclude: List<String> = listOf("android_metadata", "room_master_table")
): List<String> {
    val cursor = query("SELECT DISTINCT tbl_name FROM sqlite_master WHERE type='table'")
    val tables = mutableListOf<String>()
    while (cursor.moveToNext()) {
        tables.add(cursor.getString(0))
    }
    cursor.close()
    tables.removeAll(exclude)
    return tables
}

fun SupportSQLiteDatabase.hasForeignKeys(tables: List<String>? = null): Boolean {
    val tableNames = tables ?: getTableNames(exclude = listOf("android_metadata", "room_master_table", "sqlite_sequence"))
    for (tableName in tableNames) {
        val cursor = query("PRAGMA foreign_key_list($tableName)")
        if (cursor.count > 0) {
            cursor.close()
            return true
        }
        cursor.close()
    }
    return false
}

fun SupportSQLiteDatabase.supportsDeferForeignKeys(): Boolean {
    // defer_foreign_keys is only supported on API 21+
    // Build.VERSION.SDK_INT >= Build.VERSION_CODES.LOLLIPOP
    val cursor = query("PRAGMA defer_foreign_keys")
    return cursor.use { it.count > 0 }
}
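A usage sketch (assuming db is your RoomDatabase instance and that this is called off the main thread; the table names are assumptions for illustration):
// Reset everything the app defined, including the autoincrement counters.
val fullyReset = db.resetDatabase()
// Or limit the clearing to specific tables.
val customersCleared = db.resetDatabase(listOf("Customer", "Order"))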

Related

Why is the primary key not reset in Android Room?

private fun savetosqlite(CoinListesi: List<CoinInfo>) {
    launch {
        val dao = CoinInfoDatabase(getApplication()).CoinDao()
        dao.deleteAll()
        val uuidList = dao.insertAll(*CoinListesi.toTypedArray())
    }
}
The table is cleared, but the primary key keeps increasing every time the function is called; it also doesn't start from 0 again. How do I solve this?
Dao
@Dao
interface CoinInfoDao {
    @Insert
    suspend fun insertAll(vararg CoinInfo: CoinInfo): List<Long>

    @Query("DELETE FROM CoinInfo")
    suspend fun deleteAll()
}
model
@Entity
data class CoinInfo (...){
    @PrimaryKey(autoGenerate = true)
    var uuid: Int = 0
}
Because autoGenerate/AUTOINCREMENT, as the SQLite docs say, is there to
prevent the reuse of ROWIDs over the lifetime of the database.
As deleting all the rows with "DELETE FROM CoinInfo" does not affect the lifetime of the table in the database, the numbers continue to increase.
You need to end the "life" of the table with the SQL "DROP TABLE CoinInfo" and then re-create it for it to start a new lifetime and reset the auto-generated Int.
Or you can directly reset the value where SQLite stores the last number used, with a query like "DELETE FROM sqlite_sequence WHERE name='CoinInfo'" (or set the value of this row to 0/1).
You would need to execute something like
CoinInfoDatabase.getDatabase(application)
.getOpenHelper()
.getWritableDatabase()
.execSQL("DELETE FROM sqlite_sequence WHERE name='CoinInfo'");
Or, more efficiently: do you really need AUTOINCREMENT at all? See the docs.

How to export Room Database to .CSV

How can I export my Room database to a .CSV file? I would like it to be saved to device storage. I have searched everywhere and no answer was suitable. I hope there is a way to do this.
You cannot just save a database as a CSV. However the database, if fully checkpointed, is just a file. If not fully checkpointed then (unless write-ahead logging has been disabled) it would be three files.
The database itself consists of various parts: a header (the first 100 bytes of the file) and then blocks of data for the various components. Most of these depend upon the schema (the tables); there are also system tables:
sqlite_master is a table that holds the schema
if autoGenerate = true is used for an integer-type primary key then there is also the sqlite_sequence table
Room itself has the room_master_table, in which Room stores a hash that is compared against a compiled hash based upon Room's expected schema.
To save all that data as a CSV would be complex (and needless, as you can just copy the database file(s)).
If what you want is a CSV of the app's data, then that would depend upon the tables. If you have a single table then extracting the data as a CSV would be relatively simple, but could be complicated if the data includes commas.
If there are multiple tables, then you would have to distinguish the data for the tables.
Again, the simplest way, if just securing the data, is to copy the file.
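A minimal sketch of that copy (the destination and backup name are assumptions; the usual android.content.Context, androidx.room.RoomDatabase and java.io.File imports are assumed; the checkpoint folds the -wal file into the main file so a single-file copy is complete):
fun backupDatabaseFile(context: Context, db: RoomDatabase, backupName: String = "backup.db"): File {
    // Block until the write-ahead log has been checkpointed into the main database file.
    db.openHelper.writableDatabase.query("PRAGMA wal_checkpoint(FULL)").close()
    val dbFile = context.getDatabasePath(db.openHelper.databaseName!!) // null only for in-memory databases
    return dbFile.copyTo(File(context.filesDir, backupName), overwrite = true)
}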
However as an example based upon :-
A database that has 3 tables (apart from the system tables)
PostDataLocal (see below for columns)
GroupDataLocal
AdminDataLocal
an existing answer has been adapted for the example
Then:-
The following in a @Dao annotated interface (namely AllDao) :-
@Query("SELECT postId||','||content FROM postDataLocal")
fun getPostDataLocalCSV(): List<String>

@Query("SELECT groupPostIdMap||','||groupId||','||groupName FROM groupDataLocal")
fun getGroupDataLocalCSV(): List<String>

@Query("SELECT adminGroupIdMap||','||userId||','||adminName||','||avatar FROM adminDataLocal")
fun getAdminDataLocalCSV(): List<String>
And the following function where dao is an AllDao instance previously instantiated :-
private fun createCSV() {
    val sb = StringBuilder()
    var afterFirst = false
    sb.append("{POSTDATALOCAL}")
    for (s in dao.getPostDataLocalCSV()) {
        if (afterFirst) sb.append(",")
        afterFirst = true
        sb.append(s)
    }
    afterFirst = false
    sb.append("{GROUPDATALOCAL}")
    for (s in dao.getGroupDataLocalCSV()) {
        if (afterFirst) sb.append(",")
        afterFirst = true
        sb.append(s)
    }
    afterFirst = false
    sb.append("{ADMINDATALOCAL}")
    for (s in dao.getAdminDataLocalCSV()) {
        if (afterFirst) sb.append(",")
        afterFirst = true
        sb.append(s)
    }
    Log.d("CSV_DATA", "CSV is :-\n\t$sb")
}
And then in an activity (where dao has been instantiated) the following:-
createCSV()
Then, when the database contains the following data (extracted via App Inspection) :-
PostDataLocal
GroupDataLocal
AdminDataLocal
The result written to the log (as could be written to a file rather than the log) is :-
D/CSV_DATA: CSV is :-
{POSTDATALOCAL}1,Post001,2,Post002,3,Post003{GROUPDATALOCAL}1,1,Group001 (Post001),1,2,Group002 (Post001),1,3,Group003 (Post001),2,4,Group004 (Post002),2,5,Group005 (Post002),3,6,Group006 (Post003){ADMINDATALOCAL}1,1,Admin001,admin001.gif,1,2,Admin002,admin002.gif,1,3,Admin003,admin003.gif,2,4,Admin004,admin004.gif,2,5,Admin005,admin005.gif,3,6,Admin006,admin006.gif,4,7,Admin007,admin007.gif,5,8,Admin008,admin008.gif,6,9,Admin009,admin009.gif,6,10,Admin010,admin010.gif
Note how headers have been included to distinguish between the tables.
Of course, no consideration has been given to the inclusion of commas in the data (the above is intended to just show that, in principle, you can generate a CSV representation of the data relatively easily).
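If, rather than logging, you wanted the CSV written to a file, the Log.d line in createCSV could be replaced with something like the following (the file name and the use of app-private filesDir are assumptions):
// Write the accumulated CSV to an app-private file instead of the log.
File(filesDir, "export.csv").writeText(sb.toString())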
Additional
Here's a more automated version in which you don't need to create the @Query annotated functions; rather, it interrogates sqlite_master to extract the tables and then uses the table_info pragma to ascertain the columns, building the respective SQL.
As such it should cater for any Room database.
It also allows for the replacement of commas in the data with an indicator of a comma that could then be replaced when processing the CSV.
The supportive (secondary/invoked by the primary) function being :-
private fun getTableColumnNames(tableName: String, suppDB: SupportSQLiteDatabase): List<String> {
    val rv = arrayListOf<String>()
    val csr = suppDB.query("SELECT name FROM pragma_table_info('${tableName}')", null)
    while (csr.moveToNext()) {
        rv.add(csr.getString(0))
    }
    csr.close()
    return rv.toList()
}
And the Primary function :-
private fun AutoCreateCSV(): String {
    val replaceCommaInData = "{COMMA}" /* commas in the data will be replaced by this */
    val rv = StringBuilder()
    val sql = StringBuilder()
    var afterFirstTable = false
    var afterFirstColumn = false
    var afterFirstRow = false
    val suppDb = db.getOpenHelper().writableDatabase
    var currentTableName: String = ""
    val csr = db.query("SELECT name FROM sqlite_master WHERE type='table' AND name NOT LIKE('sqlite_%') AND name NOT LIKE('room_%') AND name NOT LIKE('android_%')", null)
    while (csr.moveToNext()) {
        sql.clear()
        sql.append("SELECT ")
        currentTableName = csr.getString(0)
        if (afterFirstTable) rv.append(",")
        afterFirstTable = true
        afterFirstColumn = false
        rv.append("{$currentTableName},")
        for (columnName in getTableColumnNames(currentTableName, suppDb)) {
            if (afterFirstColumn) sql.append("||','||")
            afterFirstColumn = true
            sql.append("replace(`$columnName`,',','$replaceCommaInData')")
        }
        sql.append(" FROM `${currentTableName}`")
        val csr2 = db.query(sql.toString(), null)
        afterFirstRow = false
        while (csr2.moveToNext()) {
            if (afterFirstRow) rv.append(",")
            afterFirstRow = true
            rv.append(csr2.getString(0))
        }
        csr2.close()
    }
    csr.close()
    return rv.toString()
}
Using the same data, and as the primary function returns a String, the code Log.d("CSV_DATA2", AutoCreateCSV()) results in :-
D/CSV_DATA2: {PostDataLocal},1,Post001,2,Post002,3,Post003,{GroupDataLocal},1,1,Group001 (Post001),1,2,Group002 (Post001),1,3,Group003 (Post001),2,4,Group004 (Post002),2,5,Group005 (Post002),3,6,Group006 (Post003),{AdminDataLocal},1,1,Admin001,admin001.gif,1,2,Admin002,admin002.gif,1,3,Admin003,admin003.gif,2,4,Admin004,admin004.gif,2,5,Admin005,admin005.gif,3,6,Admin006,admin006.gif,4,7,Admin007,admin007.gif,5,8,Admin008,admin008.gif,6,9,Admin009,admin009.gif,6,10,Admin010,admin010.gif
And if the data includes a comma, e.g. Post001 is changed to the value Post001, <<note the comma in the data>>
Then :-
D/CSV_DATA2: {PostDataLocal},1,Post001{COMMA} <<note the comma in the data>>,2,Post002,3 ....
This additional solution also fixes a little bug in the first, where some separating commas were omitted between the header and the data.
Get all your data as a list from Room and use this library:
https://github.com/doyaaaaaken/kotlin-csv
It works well; here is my usage:
private fun exportDatabaseToCSVFile(context: Context, list: List<AppModel>) {
    val csvFile = generateFile(context, getFileName())
    if (csvFile != null) {
        exportDirectorsToCSVFile(csvFile, list)
    } else {
        //
    }
}

private fun generateFile(context: Context, fileName: String): File? {
    val csvFile = File(context.filesDir, fileName)
    csvFile.createNewFile()
    return if (csvFile.exists()) {
        csvFile
    } else {
        null
    }
}

private fun getFileName(): String = "temp.csv"

fun exportDirectorsToCSVFile(csvFile: File, list: List<AppModel>) {
    csvWriter().open(csvFile, append = false) {
        // Header
        writeRow(listOf("row1", "row2", "row3"))
        list.forEachIndexed { index, appModel ->
            writeRow(listOf(getRow1, getRow2, getRow3))
        }
        shareCsvFile(csvFile)
    }
}

private fun shareCsvFile(csvFile: File) {
    // share your file, don't forget adding provider in your Manifest
}
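A hedged sketch of what shareCsvFile could do (not part of the answer above; it assumes a FileProvider is declared in the manifest with the authority used below):
import android.content.Context
import android.content.Intent
import androidx.core.content.FileProvider
import java.io.File

private fun shareCsvFile(context: Context, csvFile: File) {
    // The authority string must match the provider declared in AndroidManifest.xml.
    val uri = FileProvider.getUriForFile(context, "${context.packageName}.provider", csvFile)
    val intent = Intent(Intent.ACTION_SEND).apply {
        type = "text/csv"
        putExtra(Intent.EXTRA_STREAM, uri)
        addFlags(Intent.FLAG_GRANT_READ_URI_PERMISSION)
    }
    context.startActivity(Intent.createChooser(intent, "Share CSV"))
}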

How to add new data to an Android Room database when updating the app?

I'm making an Android app with a Room database.
My plan is to prepopulate the database with some initial data when it is installed on a device,
and the user can edit it and insert new rows into each table.
New row ids created by users will start from, for example, 10000
(this is the point of my question),
and later I want to add more data in the rows up to 9999.
Can I do this when users update the app?
Or is there any other way?
Maybe should I try to import a csv file into the Room database?
Thanks!!
My code to prepopulate from an app asset :-
Room.databaseBuilder(application, AppDatabase::class.java, DB_NAME)
    .createFromAsset("database/appdatabase.db")
    .build()
To make it so that the user-added rows start at 10000: IF you have @PrimaryKey(autoGenerate = true), then when preparing the original pre-populated data you can easily set the next userId to be used.
For example, if the Entity is :-
@Entity
data class User(
    @PrimaryKey(autoGenerate = true)
    val userId: Long = 0,
    val userName: String,
)
i.e. userId and userName are the columns, and if when first running you want the first App-provided userId to be 10000, then you could use (as an example) the following in your SQLite tool :-
CREATE TABLE IF NOT EXISTS `User` (`userId` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `userName` TEXT);
INSERT INTO User (userName) VALUES('Fred'),('Mary'),('Sarah'); /* Add Users as required */
INSERT INTO User VALUES(10000 -1,'user to be dropped'); /* SETS the next userid value to be 10000 */
DELETE FROM user WHERE userid >= 10000 - 1; /* remove the row added */
1. Create the table according to the Entity (the SQL was copied from the generated java AppDatabase_Impl).
2. Load some users.
3. Add a user with a userId of 9999 (10000 - 1); this causes SQLite to record 9999 in the SQLite system table sqlite_sequence for the user table.
4. Remove the user that was added to set the sequence number.
The following, if used after the above, demonstrates the result of doing the above :-
/* JUST TO DEMONSTRATE WHAT THE ABOVE DOES */
/* SHOULD NOT BE RUN as the first App user is added */
SELECT * FROM sqlite_sequence;
INSERT INTO user (username) VALUES('TEST USER FOR DEMO DO NOT ADD ME WHEN PREPARING DATA');
SELECT * FROM user;
The first query :-
i.e. SQLite has stored the value 9999 in the sqlite_sequence table for the table that is named user
The second query shows what happens when the first user is added :-
To recap, running steps 1-4 prepares the pre-populated database so that the first App-added user will have a userId of 10000.
Adding new data
You really have to decide how you are going to add the new data. Do you want a csv? Do you want to provide an updated AppDatabase, with all the data or with just the new data? Do you need to preserve any existing User/App input data? What about new installs? The specifics will very likely matter.
Here's an example of how you could manage this. It uses an updated pre-populated database and assumes that existing data input by the App user is to be kept.
An important value is the 10000 demarcation between supplied userIds and those input via the App being used. As such, the User Entity that has been used is :-
@Entity
data class User(
    @PrimaryKey(autoGenerate = true)
    val userId: Long = 0,
    val userName: String,
) {
    companion object {
        const val USER_DEMARCATION = 10000
    }
}
Some DAOs (some that may be of use, others that are used) in the class UserDao :-
@Dao
abstract class UserDao {
    @Insert(onConflict = OnConflictStrategy.IGNORE)
    abstract fun insert(user: User): Long

    @Insert(onConflict = OnConflictStrategy.IGNORE)
    abstract fun insert(users: List<User>): LongArray

    @Query("SELECT * FROM user")
    abstract fun getAllUsers(): List<User>

    @Query("SELECT * FROM user WHERE userid < ${User.USER_DEMARCATION}")
    abstract fun getOnlySuppliedUsers(): List<User>

    @Query("SELECT * FROM user WHERE userid >= ${User.USER_DEMARCATION}")
    abstract fun getOnlyUserInputUsers(): List<User>

    @Query("SELECT count(*) > 0 AS count FROM user WHERE userid >= ${User.USER_DEMARCATION}")
    abstract fun isAnyInputUsers(): Long

    @Query("SELECT max(userid) + 1 FROM user WHERE userId < ${User.USER_DEMARCATION}")
    abstract fun getNextSuppliedUserid(): Long
}
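A hypothetical usage of those DAO functions (purely illustrative; db is the AppDatabase instance defined below):
val userDao = db.getUserDao()
val suppliedUsers = userDao.getOnlySuppliedUsers()   // rows shipped with the app (userId < 10000)
val appInputUsers = userDao.getOnlyUserInputUsers()  // rows the App user added (userId >= 10000)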
The @Database class AppDatabase :-
@Database(entities = [User::class], version = AppDatabase.DATABASE_VERSION, exportSchema = false)
abstract class AppDatabase: RoomDatabase() {
    abstract fun getUserDao(): UserDao

    companion object {
        const val DATABASE_NAME = "appdatabase.db"
        const val DATABASE_VERSION: Int = 2 /*<<<<<<<<<<*/
        private var instance: AppDatabase? = null
        private var contextPassed: Context? = null

        fun getInstance(context: Context): AppDatabase {
            contextPassed = context
            if (instance == null) {
                instance = Room.databaseBuilder(
                    context,
                    AppDatabase::class.java,
                    DATABASE_NAME
                )
                    .allowMainThreadQueries()
                    .addMigrations(migration1_2)
                    .createFromAsset(DATABASE_NAME)
                    .build()
            }
            return instance as AppDatabase
        }

        val migration1_2 = object : Migration(1, 2) {
            val assetFileName = "appdatabase.db" /* NOTE appdatabase.db not used to cater for testing */
            val tempDBName = "temp_" + assetFileName
            val bufferSize = 1024 * 4

            @SuppressLint("Range")
            override fun migrate(database: SupportSQLiteDatabase) {
                val asset = contextPassed?.assets?.open(assetFileName) /* Get the asset as an InputStream */
                val tempDBPath = contextPassed?.getDatabasePath(tempDBName) /* Deduce the file name to copy the database to */
                val os = tempDBPath?.outputStream() /* and get an OutputStream for the new version database */
                /* Copy the asset to the respective file (OutputStream) */
                val buffer = ByteArray(bufferSize)
                var length = asset!!.read(buffer, 0, bufferSize)
                while (length > 0) {
                    os!!.write(buffer, 0, length) /* only write the bytes actually read */
                    length = asset.read(buffer, 0, bufferSize)
                }
                /* Flush and close the newly created database file */
                os!!.flush()
                os.close()
                /* Close the asset inputStream */
                asset.close()
                /* Open the new database */
                val version2db = SQLiteDatabase.openDatabase(tempDBPath!!.path, null, SQLiteDatabase.OPEN_READONLY)
                /* Grab all of the supplied rows */
                val v2csr = version2db.rawQuery("SELECT * FROM user WHERE userId < ${User.USER_DEMARCATION}", null)
                /* Insert into the actual database ignoring duplicates (by userId) */
                while (v2csr.moveToNext()) {
                    database.execSQL("INSERT OR IGNORE INTO user VALUES(${v2csr.getLong(v2csr.getColumnIndex("userId"))},'${v2csr.getString(v2csr.getColumnIndex("userName"))}')")
                }
                /* close cursor and the newly created database */
                v2csr.close()
                version2db.close()
                tempDBPath.delete() /* Delete the temporary database file */
            }
        }
    }
}
Testing has been done on the main thread for convenience and brevity, hence .allowMainThreadQueries.
As can be seen, a Migration from 1 to 2 is used. This:-
takes the asset appdatabase.db 2nd version (another 3 "supplied" users have been added) using :-
CREATE TABLE IF NOT EXISTS `User` (`userId` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, `userName` TEXT NOT NULL);
INSERT INTO User (userName) VALUES('Fred'),('Mary'),('Sarah'); /* Add Users as required */
INSERT INTO User (userName) VALUES('Tom'),('Elaine'),('Jane'); /*+++++ Version 2 users +++++*/
INSERT INTO User VALUES(10000 -1,'user to be dropped'); /* SETS the next userid value to be 10000 */
DELETE FROM user WHERE userid >= 10000 - 1; /* remove the row added */
So at first the asset appdatabase.db contains the original data (3 supplied users) and with the sequence number set to 9999.
If the App has database version 1 then this pre-populated database is copied.
Users of the App may add their own users, and their userIds will be assigned 10000, 10001 ...
When the next version is released, the asset appdatabase.db is changed accordingly, maintaining the 9999 sequence number and ignoring any App-input userIds (they aren't known), and the database version is changed from 1 to 2.
The migration1_2 is invoked when the App is updated. If a new user installs the App then the database is created immediately from the asset by Room's createFromAsset.
Can I do this when users update the app? Or is there any other way?
As above, it can be done when the app is updated AND the database version is increased. It could be done in other ways, BUT detecting the changed data is what can get complicated.
Maybe should I try to import a csv file into the Room database?
A CSV does not have the advantage of dealing with new installs and the inherent version checking.
Can I use migration without changing the database schema?
Yes, as the above shows.

Is it acceptable to manage large DB manipulations inside Room migration?

When updating the DB, is it acceptable to run a large amount of code to align the DB to my requirements?
For example, I need to alter the table and change column names. Then I need to get all my data in the DB and check whether a file exists, then update the DB accordingly. I need it to happen only once, when the user updates the app to this Room version.
val MIGRATION_8_9 = object : Migration(8, 9) {
    override fun migrate(database: SupportSQLiteDatabase) {
        database.execSQL("ALTER TABLE RideEntity RENAME videoPresent TO videoState")
        GlobalScope.launch(Dispatchers.IO) {
            val rides = DataBaseHelper.getAllPartsFromDB() //get all data
            rides.forEach {
                val path = MyApp.appContext.getExternalFilesDir(null)!!.path + "/" + it.name + "/"
                val file = File(path + VIDEO_FILE).exists()
                if (file) {
                    it.videoState = 1
                    DataBaseHelper.updateData(it) //set the data
                }
            }
        }
    }
}
Where:
suspend fun getAllPartsFromDB() = withContext(Dispatchers.IO) {
    val parts = db.rideDao().getAllParts()
    parts
}
Function:
#Query("SELECT * FROM rideentity ORDER BY time DESC")
fun getAllParts(): List<Parts>
So my question: despite the fact that this works, is this way acceptable? And is the migrate function called only once, when the app DB is updated from version X to Y?
Is it acceptable to manage large DB manipulations inside Room migration?
Yes. However you may wish to put the update loop inside a transaction.
And if the migrate function called only once when the app DB updated from version X to Y
Yes, it is only called the one time. The Migration(8, 9) determines this; that is, the Migration will only be invoked when the version, as stored in the database header, is 8, and the version number is then set to 9.
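As a hedged sketch (not the poster's code; the table/column names, MyApp.appContext and VIDEO_FILE come from the question, the rest is an assumption), the update loop could be run synchronously inside migrate() and wrapped in a transaction like this:
val MIGRATION_8_9 = object : Migration(8, 9) {
    override fun migrate(database: SupportSQLiteDatabase) {
        database.execSQL("ALTER TABLE RideEntity RENAME videoPresent TO videoState")
        database.beginTransaction()
        try {
            // Walk each ride and flag it when its video file exists on disk.
            val cursor = database.query("SELECT name FROM RideEntity")
            while (cursor.moveToNext()) {
                val rideName = cursor.getString(0)
                val path = MyApp.appContext.getExternalFilesDir(null)!!.path + "/" + rideName + "/"
                if (File(path + VIDEO_FILE).exists()) {
                    database.execSQL("UPDATE RideEntity SET videoState = 1 WHERE name = ?", arrayOf(rideName))
                }
            }
            cursor.close()
            database.setTransactionSuccessful()
        } finally {
            database.endTransaction()
        }
    }
}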

How to ensure unique constraint over multiple columns in Room, even if one of them is null?

I have a Room database in my application with one table containing received and sent messages. Inside the table, the messages are just differentiated by the phone number: null for the backend server (since a server has no phone number) and the phone number of the user for the sent messages (entered on app installation, just like WhatsApp).
To sync the table with the backend, I introduced a new column containing the backend id of the messages on the server. Since the server separates sent and received messages (due to different information contained in the tables on the backend), the id of a sent message and the id of a received message can be equal, only distinguishable by the corresponding phone number (either null or the user's own).
So I created a unique constraint over both columns: backend_id and phone number.
@Entity(indices = [Index(value = ["backend_id", "senderNumber"], unique = true)])
data class Message(
    var senderNumber: String?,
    var message: String?,
    var backend_id: String? = null,
    var time: Date? = Date(),
    var status: Status = Status.PENDING
) : ListItem(time), Serializable {
    @PrimaryKey(autoGenerate = true) var id: Long? = null
}
But trying it out with some messages, I had to realize that the database gladly accepts equal backend_ids if the phone number is null. To make sure this was not an accident, I even wrote a unit test:
@RunWith(AndroidJUnit4::class)
class ExampleInstrumentedTest {
    lateinit var db: MyDatabase
    lateinit var dao: MessageDao

    @Before
    fun createDb() {
        val context = ApplicationProvider.getApplicationContext<Context>()
        db = Room.inMemoryDatabaseBuilder(
            context, MyDatabase::class.java).build()
        dao = db.messageDao()
    }

    @After
    @Throws(IOException::class)
    fun closeDb() {
        db.close()
    }

    @Test(expected = Exception::class)
    fun check_unique_constraint_is_violated() {
        // Context of the app under test.
        val message = Message(senderNumber = null, backend_id = "1", time = Date(), message = "Hello")
        dao.insertAll(message)
        dao.insertAll(message)
        val allMessages = dao.getAll()
        assertTrue(allMessages.size == 2)
        assertTrue(allMessages[0].backend_id == allMessages[1].backend_id)
    }
}
This test fails, since it doesn't throw any exception. Debugging it shows that the Room database also doesn't catch the exception silently, since both messages (being the same) are inserted successfully, resulting in 2 messages.
So my question is: how can I ensure that the result is unique over both columns, even if one of them is null? It seems a bit weird to me that you can bypass uniqueness just by inserting null for one of the columns. It worked when I only checked the backend_id in the index, throwing exceptions when a sent and a received message had the same id. (But I obviously don't want that.)
In case Database and Dao have any relevance to the solution:
Database:
@Database(entities = [Message::class], version = 1)
@TypeConverters(Converters::class)
abstract class MyDatabase : RoomDatabase() {
    override fun init(configuration: DatabaseConfiguration) {
        super.init(configuration)
        //Create and execute some trigger, limiting the entries on the latest 50
    }

    abstract fun messageDao(): MessageDao

    companion object {
        private var db: MyDatabase? = null

        private fun create(context: Context): MyDatabase {
            return Room.databaseBuilder(context, MyDatabase::class.java, "dbname").build()
        }

        fun getDB(context: Context): MyDatabase {
            synchronized(this) {
                if (db == null) {
                    db = create(context)
                }
                return db!!
            }
        }
    }
}
MessageDao:
@Dao
interface MessageDao {
    @Query("SELECT * FROM Message")
    fun getAll(): List<Message>

    @Insert
    fun insertAll(vararg messages: Message): List<Long>
}
In SQLite (and other databases that conform to SQL-92), a null is considered different from any other null, hence your issue.
As such you should not be using null. You can overcome this by setting the default value to a specific value that indicates no supplied value.
For example you could use:-
@NotNull
var backend_id: String = "0000000000"
0000000000 could be any value that suits your purpose.
"" could also be used.
Alternative
An alternative approach could be to handle the null in the index such as :-
#Entity(indices = [Index(value = ["coalesce(backend_id,'00000000')", "senderNumber"], unique = true)])
HOWEVER, Room will issue an error message because it doesn't determine that the backend_id column is the column being indexed and thus issues a compilation error e.g. :-
error: coalesce(backend_id,'00000000') referenced in the index does not exists in the Entity.
Therefore you would need to add the index outside of Room's creation of tables. You could do this via the onCreate or onOpen callback. Noting that onCreate is only called once when the database is first created and that onOpen is called every time the app is run.
The safest (data wise) but slightly less efficient is to use the onOpen callback.
Here's an example that creates the index (applying it to both columns, considering that both backend_id and senderNumber columns can be null).
This being done when building the database :-
....
.addCallback(object : RoomDatabase.Callback() {
    override fun onOpen(db: SupportSQLiteDatabase) {
        super.onOpen(db)
        db.execSQL(
            "CREATE UNIQUE INDEX IF NOT EXISTS message_unique_sn_beid " +
            "ON message (coalesce(backend_id,''),coalesce(senderNumber,''));")
    }

    override fun onCreate(db: SupportSQLiteDatabase) {
        super.onCreate(db)
    }
})
.build()
....
The index name would be message_unique_sn_beid
Results using the Alternative
Basing the Message Entity on yours (but with fewer columns) and an Insert Dao of :-
@Insert(onConflict = OnConflictStrategy.IGNORE)
fun insert(message: Message): Long
Using the following (with the index added via the onOpen callback), then when running :-
dao.insert(Message(null,"1234567890","blah"))
dao.insert(Message(null,"0123456789","blah","0123456789"))
dao.insert(Message(null,"1234567890","blah"))
dao.insert(Message(null,"1234567890","blah",null))
dao.insert(Message(null,null,message = "blah",backend_id = "9876543210"))
dao.insert(Message(null,null,message = "blah",backend_id = "9876543210"))
The 1st and 2nd rows will be added; the 3rd and 4th rows will be ignored due to a UNIQUE conflict; the 5th (the 3rd row in the table) will be added; the 6th will be ignored due to a UNIQUE conflict.
Using Android Studio's Database Inspector:-
1. The message table :-
2. Looking at the sqlite_master (the schema) at items starting with mess (i.e. running SQL SELECT * FROM sqlite_master WHERE name LIKE 'mess%';) :-
