While searching for this, I only came across people asking how to avoid inserting duplicate rows using Room DB. But my app has a feature where the user may tap a copy button and the list item gets inserted again into the db. I could have achieved this easily if my table didn't have a primary key set on one of its fields. While I found this solution for SQLite, I don't know how to achieve it in Room, because writing a custom insert query in Room would defeat the purpose of using Room in the first place.
Let's say you have some entity
@Entity(tableName = "foo_table")
data class Foo(
    @PrimaryKey(autoGenerate = true) var id: Int,
    // or without autogeneration
    // @PrimaryKey var id: Int = 0,
    var bar: String
)
and you have some Dao with insert:
@Insert(onConflict = OnConflictStrategy.REPLACE)
suspend fun insert(foo: Foo)
Then, to copy your existing value (copiedValue: Foo), you need to manage the primary key in one of two ways:
Scenario 1. Your primary key is autogenerated: set it to the default value so a new one is generated:
copiedValue.id = 0
yourDao.insert(copiedValue)
Scenario 2. Your primary key is not autogenerated: you have to set a new primary key manually:
copiedValue.id = ... // some code to set new unique id
yourDao.insert(copiedValue)
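Since Foo above is a Kotlin data class, scenario 1 can also be expressed with copy(). A minimal sketch, assuming the entity and Dao shown above (FooDao is just an assumed name for that Dao):
// Sketch only: FooDao is a hypothetical name for the Dao containing insert() above.
suspend fun copyRow(dao: FooDao, original: Foo) {
    // copy() clones the item; id = 0 lets SQLite assign a fresh autogenerated key
    dao.insert(original.copy(id = 0))
}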
Related
Here is my room entity object:
@Entity(tableName = "user_account", indices = [Index(value = ["user_name", "user_type"], unique = true)])
data class DataUserAccountEntity(
    @PrimaryKey(autoGenerate = true) @ColumnInfo(name = "auto_id") val autoId: Int,
    @NonNull @ColumnInfo(name = "user_name") val userName: String,
    @NonNull @ColumnInfo(name = "user_photo") val userPhoto: String,
    @NonNull @ColumnInfo(name = "user_type") val userType: Int,
)
Here is my Dao interface:
@Dao
interface DataUserAccountDao {
    @Query("SELECT * FROM user_account WHERE auto_id = :autoId LIMIT 1")
    fun getUserAccount(autoId: Int): DataUserAccountEntity

    @Query("SELECT * FROM user_account ORDER BY auto_id ASC")
    fun getAllUserAccounts(): List<DataUserAccountEntity>
}
Since auto_id is set to @PrimaryKey(autoGenerate = true), how would I query Room for the next value?
(i.e. I am looking for the auto_id that would be generated if I insert a new row into the local database right now)
Although I appreciate the response, this does not solve my problem. I need the number BEFORE insertion.
If autoGenerate=true is coded then you can use:-
#Query("SELECT seq+1 FROM sqlite_sequence WHERE name=:tableName")
fun getNextRowidFromTable(tableName: String): Long
HOWEVER, there is no guarantee that the next allocated value will be 1 greater than the last, and thus it may not match the value obtained from the query. As per:-
The behavior implemented by the AUTOINCREMENT keyword is subtly different from the default behavior. With AUTOINCREMENT, rows with automatically selected ROWIDs are guaranteed to have ROWIDs that have never been used before by the same table in the same database. And the automatically generated ROWIDs are guaranteed to be monotonically increasing.
and
Note that "monotonically increasing" does not imply that the ROWID always increases by exactly one. One is the usual increment. However, if an insert fails due to (for example) a uniqueness constraint, the ROWID of the failed insertion attempt might not be reused on subsequent inserts, resulting in gaps in the ROWID sequence. AUTOINCREMENT guarantees that automatically chosen ROWIDs will be increasing but not that they will be sequential.
What coding autoGenerate=true does is include the AUTOINCREMENT keyword. This doesn't actually cause auto generation; rather, for every table (using Room at least) a value is generated and placed into a hidden column named rowid. If a column is specified with a type of INTEGER and that column is the PRIMARY KEY (not part of a composite primary key) then the column is an alias of the rowid. If such a column has a value specified when inserting the row then that value (as long as it is unique) is assigned to the column (and therefore the rowid).
AUTOINCREMENT is a constraint (rule) that enforces the use of a value higher than any that have been assigned (even if such rows are deleted).
AUTOINCREMENT handles this subtle difference by using the sqlite_sequence table to store the highest assigned rowid (or alias thereof), obviously updating the value so that it is always the highest. The sqlite_sequence table will not exist if AUTOINCREMENT aka autoGenerate=true is not coded in any of the @Entity annotated classes (which are passed to the @Database annotated class via the entities parameter of the annotation).
You may wish to refer to https://www.sqlite.org/autoinc.html
For a solution that is less likely to result in missed sequence numbers you could instead not use AUTOINCREMENT aka autoGenerate = true. This does mean another subtle change to cater for the auto generation: making auto_id nullable with a default value of null.
e.g.
@Entity(tableName = "user_account", indices = [Index(value = ["user_name", "user_type"], unique = true)])
data class DataUserAccountEntity(
    @PrimaryKey/*(autoGenerate = true)*/ @ColumnInfo(name = "auto_id") val autoId: Int? = null /*<<<<< CHANGED*/,
    @NonNull @ColumnInfo(name = "user_name") val userName: String,
    @NonNull @ColumnInfo(name = "user_photo") val userPhoto: String,
    @NonNull @ColumnInfo(name = "user_type") val userType: Int,
)
As sqlite_sequence will not exist, or will not have a row for this table, you cannot use it to ascertain the next auto_id value.
So you could have:-
#Query("SELECT COALESCE(max(auto_id),0)+1 FROM user_account")
fun getNextAutoId(): Long
This would work even if there were no rows, due to the COALESCE function changing null into 0, and would then return 1.
Even so, there is still no guarantee that the value will be in sequence. However, it is more likely and more predictable than with AUTOINCREMENT, as the issue with AUTOINCREMENT is due to sqlite_sequence being updated but the row then not being inserted (rolled back).
However, IF the sequence number reaches the value of 9223372036854775807 then, instead of the SQLITE_FULL error that would happen with AUTOINCREMENT (it cannot break the rule and cannot use a larger value), SQLite will try to find an unused (and therefore lower) value (unless getting even deeper and using negative values).
You could mimic sqlite_sequence by defining a table with two columns (only one is strictly needed, but two columns, name and seq, would cater for other tables). You could complement this with a TRIGGER so that an INSERT automatically sets the new value (prone to misuse). Room doesn't support TRIGGERs in its annotations, but doesn't complain if you include them (e.g. via a callback).
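As a rough sketch of that callback idea (the my_sequence table, the trigger name and the TheDatabase class are made-up names, not a drop-in implementation):
import androidx.room.RoomDatabase
import androidx.sqlite.db.SupportSQLiteDatabase

// Sketch only: table, trigger and database names are assumptions.
val callback = object : RoomDatabase.Callback() {
    override fun onCreate(db: SupportSQLiteDatabase) {
        super.onCreate(db)
        // a two-column table mimicking sqlite_sequence
        db.execSQL("CREATE TABLE IF NOT EXISTS my_sequence (name TEXT PRIMARY KEY, seq INTEGER)")
        // record the auto_id of the most recent insert into user_account
        db.execSQL(
            "CREATE TRIGGER IF NOT EXISTS user_account_seq AFTER INSERT ON user_account " +
                "BEGIN INSERT OR REPLACE INTO my_sequence (name, seq) VALUES('user_account', new.auto_id); END"
        )
    }
}
// attached when building the database, e.g.
// Room.databaseBuilder(context, TheDatabase::class.java, "the_database.db").addCallback(callback).build()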
Saying all that, when it boils down to it, the intended purpose of the rowid (or an alias thereof) is the unique identification of a row. SQLite has been written with this in mind (for example, locating a row by rowid can be up to twice as fast, as the rowid can be considered a super/optimised index). Other uses of the rowid/alias thereof will always have some potential issues.
As such it is not recommended to use them for anything other than their intended use.
You can get the id of the last saved record in the Room database.
#Query("SELECT auto_id FROM user_account ORDER BY auto_id DESC LIMIT 1")
fun getLastUserAccount(autoId: Int): Long
This will return you last row id. Suppose you have 5 records, it will return 4.
Now you increment the returned id to get the new one.
And verify after inserting,
@Insert(onConflict = OnConflictStrategy.REPLACE)
suspend fun insertCountry(dataUserAccountEntity: DataUserAccountEntity): Long
The Long return value is the row id of the new record: if it's -1 the operation failed, otherwise it is the auto-generated ID.
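A hedged usage sketch of that check, assuming the insertCountry method above (the addAccount helper is just an illustration):
// Sketch only: addAccount is a made-up helper around the Dao method above.
suspend fun addAccount(dao: DataUserAccountDao, account: DataUserAccountEntity): Boolean {
    val newRowId = dao.insertCountry(account)
    // -1 means no row was inserted; otherwise newRowId is the generated auto_id
    return newRowId != -1L
}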
That is my current dao
@PrimaryKey(autoGenerate = true)
val id: Int,
val name: String,
val date: LocalDate,
val amount: Int,
val uri: String,
val tag: String,
val toList: Boolean,
val inUse: Boolean,
val listValue: Int
Now I have the problem that in a previous version of that dao there is a variable in that table that I now want to remove.
I found a 4 step guide:
1.) create a new table
2.) insert the data from the old table
3.) drop the old table
4.) alter the new table's name back to the old table name
That's fine, but my problem is that I have a LocalDate variable which uses a DateTypeConverter to function properly.
How do I insert that LocalDate into the new table? I only know of TEXT and INTEGER.
For step 2, use the SupportSQLiteDatabase's execSQL method to execute a query based upon the SQL:
INSERT INTO <the_table> SELECT <the_columns> FROM <the_old_table>;
Where:-
anything enclosed within <> needs to be altered accordingly as per:-
<the_table> should be replaced with the new table name.
<the_columns> should be replaced with the column names, separated by commas, LESS THE DROPPED COLUMN NAME
<the_old_table> should be replaced with the old/original table name.
Note that a column name will be the same as the variable name.
The above will copy the values, whatever they are, as stored in the database, from the old to the new table.
The TypeConverters are only used to convert the data to or from the respective object (LocalDate in your case) when storing or retrieving the stored data.
A type converter should consist of two functions:-
1. One to convert the object into a type that can be stored in an SQLite database (SQLite is a universal database that has no concept of a programming language's objects). The SQLite types are:
INTEGER (not necessarily a Kotlin Int; it could be a Long, a Byte, even a Boolean ....)
TEXT (a Kotlin String ....)
REAL (a Kotlin Double, Float ....)
BLOB (a Kotlin ByteArray ....)
NULL
2. One to convert the stored type back into the object when retrieving data from the database.
As such, it is no issue at all for the INSERT INTO table SELECT ....; to copy the existing data from one table to another, irrespective of Room's handling of the data when it stores and retrieves it. The result is that the data is stored in the database as one of the 5 types above.
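For reference, a DateTypeConverter for LocalDate typically looks something like the following sketch; here it is assumed the date is stored as an epoch-day INTEGER, but yours may store TEXT instead, and either way the copy above is unaffected:
import androidx.room.TypeConverter
import java.time.LocalDate

// Sketch only: your actual converter may store TEXT (e.g. ISO-8601) rather than an epoch-day Long.
class DateTypeConverter {
    @TypeConverter
    fun fromLocalDate(date: LocalDate?): Long? = date?.toEpochDay()

    @TypeConverter
    fun toLocalDate(epochDay: Long?): LocalDate? = epochDay?.let { LocalDate.ofEpochDay(it) }
}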
If the "current dao" (it is not a dao, it is an entity that should be annotated with #Entity, which equates to a table) is after the removal of the dropped variable then you would use:-
INSERT INTO <the_new_table> SELECT id,name,date,amount,uri,tag,toList,inUse,listValue FROM <the_old_table>;
You may wish to refer to the second form of the SQL INSERT statement, INSERT INTO table SELECT ...;
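Putting the 4 steps together, the migration might look something like the following sketch. The table name item, the new-table name item_new, the version numbers and the column types (particularly date, which depends on whether your DateTypeConverter stores TEXT or INTEGER) are assumptions; the CREATE statement must match exactly what Room expects for your entity, so check it against your exported schema JSON:
import androidx.room.migration.Migration
import androidx.sqlite.db.SupportSQLiteDatabase

// Sketch only: table name, version numbers and column types are assumptions.
val MIGRATION_1_2 = object : Migration(1, 2) {
    override fun migrate(database: SupportSQLiteDatabase) {
        // 1.) create the new table without the dropped column
        database.execSQL(
            "CREATE TABLE item_new (" +
                "id INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, " +
                "name TEXT NOT NULL, date INTEGER NOT NULL, amount INTEGER NOT NULL, " +
                "uri TEXT NOT NULL, tag TEXT NOT NULL, toList INTEGER NOT NULL, " +
                "inUse INTEGER NOT NULL, listValue INTEGER NOT NULL)"
        )
        // 2.) copy the stored values as-is (no TypeConverter is involved)
        database.execSQL("INSERT INTO item_new SELECT id,name,date,amount,uri,tag,toList,inUse,listValue FROM item")
        // 3.) drop the old table
        database.execSQL("DROP TABLE item")
        // 4.) rename the new table back to the old name
        database.execSQL("ALTER TABLE item_new RENAME TO item")
    }
}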
If you want to add data into the date column of the old table's rows, you can do it via the Dao:
fun updateData() {
    val list = dao.getAllData()
    list.forEach {
        //update data
    }
    dao.saveData(list)
}
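The getAllData and saveData Dao methods referenced above are not shown in the question; a sketch of what they might look like (the Item entity and the item table name are assumptions):
// Sketch only: Item and its table name are assumptions.
@Query("SELECT * FROM item")
fun getAllData(): List<Item>

@Update
fun saveData(items: List<Item>)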
I have an issue where autoGenerate is not working on an inherited field in my Entity class.
In my project I have created a base class which has an id field already added to it. This base class is then used by every Entity so I can work with generics and such. Everything seems to work perfectly until I add the autoGenerate to the id field of an Entity. (FYI: this was working in version 2.2.6, but in 2.3.0 this breaks and results in this issue.)
The BaseEntity class
interface BaseEntity {
val id: Any
}
The specific Entity class
@Entity(tableName = DBConstants.FOOD_ENTRY_TABLE_NAME)
data class FoodEntry(
    @PrimaryKey(autoGenerate = true)
    override val id: Int = 0,
    var amount: Float,
    var date: Long,
    var meal: Meal
) : BaseEntity
If I do something like this it works (but it's not what I need)
@Entity(tableName = DBConstants.FOOD_ENTRY_TABLE_NAME)
data class FoodEntry(
    override val id: Int = 0,
    @PrimaryKey(autoGenerate = true)
    var someOtherId: Int = 0,
    var amount: Float,
    var date: Long,
    var meal: Meal
) : BaseEntity
As far as I can see this is only a problem when you wish to autoGenerate an inherited field.
Anybody else have seen this issue before?
As far as I can see this is only a problem when you wish to autoGenerate an inherited field.
The same behaviour happens if you just have @PrimaryKey or if you define the id column as a primary key.
The issue is that Room interprets the Int as a Type of int in the underlying Java code (perhaps a bug; you may wish to raise an issue). If Room considers the Type to be int as opposed to Int, it treats the 0 default value differently.
In the case of Type Int with a 0, Room doesn't attempt to specify a value if it is for a primary key column, and thus allows SQLite to assign the value.
e.g. SQL along the lines of INSERT INTO the_table (amount,date,meal) VALUES(the_amount, the_date, the_meal);
If the Type is int then the value is always specified, so the 0's will result in UNIQUE constraint conflicts.
e.g. SQL along the lines of INSERT INTO the_table VALUES(0,the_amount, the_date, the_meal);
as the column list is omitted, ALL columns need a value.
Possible Fix
If you instead used
interface BaseEntity {
val id: Any?
}
with :-
@PrimaryKey
override val id: Int? = null,
generated java has :-
_db.execSQL("CREATE TABLE IF NOT EXISTS food (id INTEGER, amount REAL NOT NULL, date INTEGER NOT NULL, meal TEXT NOT NULL, PRIMARY KEY(id))");
or :-
@PrimaryKey(autoGenerate = true)
override val id: Int? = null,
generated java has :-
_db.execSQL("CREATE TABLE IF NOT EXISTS food (id INTEGER PRIMARY KEY AUTOINCREMENT, amount REAL NOT NULL, date INTEGER NOT NULL, meal TEXT NOT NULL)");
then no attempt is made to insert the id value and SQLite assigns the value.
Alternative Fix
If you use the original BaseEntity then you could insert using a Query where you exclude the id column and thus allow it to be generated.
e.g. you could have :-
#Query("INSERT INTO food (amount,date,meal) VALUES(:amount,:date,:meal)")
fun insertFE(amount: Float, date: Long, meal: String): Long
but that doesn't insert using a FoodEntry object, so you could then have (dependent upon the above) :-
fun insertFE(foodEntry: FoodEntry): Long {
    return insertFE(foodEntry.amount, foodEntry.date, your_type_converter(foodEntry.meal))
}
obviously your_type_converter would be changed accordingly
About autoGenerate (AUTOINCREMENT in SQLite)
autoGenerate = true does not noticeably affect the internally generated value until you reach the maximum allowed value (9223372036854775807). If autoGenerate = true is coded and the last value was 9223372036854775807, the next insert will result in an SQLITE_FULL error and an exception.
If autoGenerate = false is coded, or autoGenerate is not specified, then SQLite will, if 9223372036854775807 has been assigned (and the row still exists), attempt to allocate an unassigned value between 1 and 9223372036854775807, which would likely succeed as it's basically impossible to have 9223372036854775807 rows.
Note that if any row has been assigned a negative value then the range is extended.
Of course, specifying Int or int imposes a much lower restriction outside of SQLite; really, ids should be Long or long.
autoGenerate = true means that the AUTOINCREMENT keyword is included in the column definition. This is a constraint that says that, when the value is determined by SQLite, the value must be greater than any value that either exists or has been used.
To ascertain this, AUTOINCREMENT uses an internal table, namely sqlite_sequence, to store the last assigned/determined value. Having the additional table, and having to access and maintain it, has overheads.
SQLite (https://sqlite.org/autoinc.html) has as its first sentence:- The AUTOINCREMENT keyword imposes extra CPU, memory, disk space, and disk I/O overhead and should be avoided if not strictly needed. It is usually not needed.
So, in my application, when the user clicks add on something I should create an Entity A to carry the values which the user provides. This Entity A has an autoincremented primary key, and along the way of constructing Entity A there are other entities that carry the key of Entity A as a foreign key, as well as part of their composite key.
My problem is that Room prevents me from creating the other entities without providing the key of Entity A in their constructor, annotating it with @NonNull, as it's part of their composite key and it can't be null.
Now I don't know how to approach this problem:
Was it a mistake from the beginning to work with my entities as custom classes throughout my application, and should I separate entities from custom classes (though they would have the same fields)?
Whenever the user clicks the add option, should I just push/insert an empty entity/row/tuple to get an autogenerated key so I can create the entities along the way?
Please tell me your thoughts about this, as it's my first time working with a database embedded in an application, so I don't know what I should do regarding it.
This Entity A has an autoincremented primary key
AUTOINCREMENT, in Room autoGenerate = true as part of the #PrimaryKey annotation, does not actually result in auto generation. Rather it is a constraint rule that forces the next automatically generated rowid to be greater than any that exist or have existed (for that table).
Without AUTOINCREMENT if the column is INTEGER PRIMARY KEY (or implied via a table level definition of such a column as PRIMARY KEY) then the column is made an alias of the always existing rowid (except for the rarely used WITHOUT ROWID table (unable to do so in Room via entities, there is no annotation for such a table)).
The rowid is always unique and always automatically generated, and will typically be greater (typically 1 greater) anyway. It is only when the maximum (the 9223372036854775807th rowid) is reached that AUTOINCREMENT comes into play (unless purposefully manipulated). In that case, with AUTOINCREMENT you get an SQLITE_FULL exception; without it, SQLite will try to find a lower unused/free rowid.
Due to the unnecessary overheads, I personally never use autoGenerate = true.
What AUTOINCREMENT does is maintain a system table, sqlite_sequence, with a row per table that has AUTOINCREMENT, in which it stores the highest allocated rowid for the table. With AUTOINCREMENT it then uses the higher of the sqlite_sequence value and the highest existing rowid and adds 1 (without it, it just uses the highest rowid and adds 1).
Was it a mistake from the beginning to work with my entities as custom classes throughout my application, and should I separate entities from custom classes?
There should be no need to have separate classes; an Entity can be used as a stand-alone class, the Room annotations being ignored.
Whenever the user clicks the add option, should I just push/insert an empty entity/row/tuple to get an autogenerated key so I can create the entities along the way?
It is very easy to get the generated key: @Insert for a single insert returns the key (id) as a Long, so the @Dao's @Insert abstract fun insert(entityA: EntityA): Long (long in Java) returns the key, or -1 if the insert did not insert a row.
If you use the list/varargs form of @Insert then it returns an array of Longs, each element being the key (id) of the corresponding insert, or -1.
So, considering what I believe is your issue, consider the following 3 entities (note: if using Java then use Long rather than long for the key, as primitives can't be null).
@Entity
data class EntityA(
    @PrimaryKey
    var entityAKey: Long? = null,
    var otherAdata: String
)
No AUTOINCREMENT via autoGenerate = true.
No @NonNull annotations.
then :-
@Entity
data class EntityB(
    @PrimaryKey
    var entityBKey: Long? = null,
    var otherBdata: String
)
and :-
@Entity(
    primaryKeys = ["entityBRef","entityARef","otherPartOfPrimaryKey"]
)
data class EntityC(
    var entityBRef: Long,
    var entityARef: Long,
    var otherPartOfPrimaryKey: Long,
    var otherCData: String
)
add some Dao methods :-
@Insert
abstract fun insert(entityA: EntityA): Long
@Insert
abstract fun insert(entityB: EntityB): Long
@Insert
abstract fun insert(entityC: EntityC): Long
NOTE the Long return value (it must be Long; it doesn't compile if Int). Generated keys should always be Long anyway, as they can exceed what an Int can hold.
Finally consider :-
db = TheDatabase.getInstance(this)
dao = db.getDao()
var myfirstA = EntityA(otherAdata = "First")
var myfirstB = EntityB(otherBdata = "The first B")
var a1 = dao.insert(myfirstA)
var b1 = dao.insert(myfirstB)
dao.insert(EntityC(b1,a1,100L,"The first C using id's from the first A and the first B"))
run on the main thread via allowMainThreadQueries()
And the database then contains the rows inserted above.
You could even do :-
dao.insert(EntityC(
dao.insert(EntityB(otherBdata = "Second B")),
dao.insert(EntityA(otherAdata = "A's second")),
200,
"blah")
)
obviously this would likely be of limited use as you'd need to know the values up front.
And the result is as expected.
(Database snapshots can be obtained via Android Studio's App Inspector, formerly the Database Inspector.)
You could also do/use :-
var my3rdA = EntityA(otherAdata = "3rd")
my3rdA.entityAKey = dao.insert(my3rdA)
Of course, whenever you extract from the database, the object will include the key (id) (unless you purposefully choose not to).
My CertificateElementEntity entity had an embedded class, ImgData.
I have divided CertificateElementEntity and ImgData into separate tables.
But now I can't figure out how to make the migration.
open class CertificateElementEntity(
    @IgnoreJson
    @PrimaryKey
    @ColumnInfo(name = "local_id")
    var localId: Long? = null,
    var data: String? = null,
    var imageData: ImgData? = null)
Maybe someone has made similar migrations?
You can try this general migration scheme (honestly, I haven't done such a migration myself, so maybe there is an easier way):
Create temporary table [CertificateElementEntityTemp] with the same structure.
Copy all data from table [CertificateElementEntity] to [CertificateElementEntityTemp].
Drop table [CertificateElementEntity].
Create table [ImgData].
Create table [CertificateElementEntity] with the new structure (with just an imageId instead of all the fields from the embedded class). Create a Foreign Key for [imageId].
Copy needed data from [CertificateElementEntityTemp] to [ImgData].
Copy needed data from [CertificateElementEntityTemp] to [CertificateElementEntity].
Drop table [CertificateElementEntityTemp].
All of this, of course, you should write in the migration with the equivalent SQL statements.
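For illustration only, a sketch of those steps as a Room Migration. The ImgData column names (img_url, img_size), the imageId column, the reuse of local_id as the imageId, and the version numbers are all assumptions; the CREATE statements must match exactly what Room expects for your new entities:
import androidx.room.migration.Migration
import androidx.sqlite.db.SupportSQLiteDatabase

// Sketch only: column names, types and version numbers are assumptions.
val MIGRATION_2_3 = object : Migration(2, 3) {
    override fun migrate(database: SupportSQLiteDatabase) {
        // 1./2. create the temporary table and copy all data into it
        database.execSQL("CREATE TABLE CertificateElementEntityTemp AS SELECT * FROM CertificateElementEntity")
        // 3. drop the old table
        database.execSQL("DROP TABLE CertificateElementEntity")
        // 4. create the new ImgData table
        database.execSQL("CREATE TABLE ImgData (imageId INTEGER PRIMARY KEY, img_url TEXT, img_size INTEGER)")
        // 5. create the new CertificateElementEntity with a foreign key to ImgData
        database.execSQL(
            "CREATE TABLE CertificateElementEntity (local_id INTEGER PRIMARY KEY, data TEXT, imageId INTEGER, " +
                "FOREIGN KEY(imageId) REFERENCES ImgData(imageId))"
        )
        // 6. copy the image columns into ImgData (reusing local_id as the imageId for simplicity)
        database.execSQL("INSERT INTO ImgData (imageId, img_url, img_size) SELECT local_id, img_url, img_size FROM CertificateElementEntityTemp")
        // 7. copy the element data, pointing each row at its ImgData row
        database.execSQL("INSERT INTO CertificateElementEntity (local_id, data, imageId) SELECT local_id, data, local_id FROM CertificateElementEntityTemp")
        // 8. drop the temporary table
        database.execSQL("DROP TABLE CertificateElementEntityTemp")
    }
}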