Can I create multiple tables with one data class? - android

What I want to do is merge the following two data classes into one, while still keeping the two tables separate.
#Entity(tableName = "file")
data class File(
#PrimaryKey
#ColumnInfo(name = "path") var path: String,
#ColumnInfo(name = "date", index = true) var date: Long,
#ColumnInfo(name = "number") var num: Float = -1f
)
#Entity(tableName = "del_file")
data class delFile(
#PrimaryKey
#ColumnInfo(name = "path") var path: String,
#ColumnInfo(name = "date", index = true) var date: Long,
#ColumnInfo(name = "number") var num: Float = -1f
)
The reason I want to manage those two tables separately is that they are used in completely different situations.
'file' will be used in the app's file system, and 'del_file' will be managed only in the recycle bin.
I also thought about adding a column called "is_alive" to the "file" table and managing both cases there, but that seemed like a bad structure: the column would be meaningless for almost all entries, and every query used by the app's file system would have to filter on it.
Managing the two tables separately seems like the best approach to me, but I couldn't come up with anything better, so maybe I'm missing something.
Is there a way to keep the tables separate while using only one data class? Or is there a better approach altogether?
I would be very grateful if you could point me to a good way to do this.

Generally, having two separate tables for the same data model is not a good idea because it can lead to data duplication, which has several disadvantages: it costs storage space, risks data inconsistency, and so on.
There are two ways to deal with this situation. If you can distinguish file items by a single field (for example, is_alive), the simplest way is to use one table with that is_alive field. You can use this approach to solve your problem.
But if there is more than one distinguishing field, or there may be more in the future, the solution is to create another table (like del_file) that contains only a reference to the original table (the file table) plus those fields. In other words, to avoid data duplication, move those fields into a separate table with a reference to the original table, and use a JOIN when you want to retrieve them.
For more detail see this.
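A minimal sketch of that second approach for the recycle-bin case. The DeletedFile class, the deleted_at column and the DAO below are made up for illustration; only the path reference and the JOIN are the point:
import androidx.room.*

@Entity(
    tableName = "del_file",
    foreignKeys = [ForeignKey(
        entity = File::class,
        parentColumns = ["path"],
        childColumns = ["path"],
        onDelete = ForeignKey.CASCADE
    )]
)
data class DeletedFile(
    // Reference to the original row in the file table
    @PrimaryKey @ColumnInfo(name = "path") var path: String,
    // Extra field that only makes sense for deleted files
    @ColumnInfo(name = "deleted_at") var deletedAt: Long
)

@Dao
interface DeletedFileDao {
    // JOIN back to the file table when the shared columns are needed
    @Query("SELECT file.* FROM file INNER JOIN del_file ON file.path = del_file.path")
    fun getDeletedFiles(): List<File>
}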

You cannot have a single @Entity annotated class for multiple tables.
However, you could use a class as the basis of other classes, by embedding the class into the other classes.
e.g. you could have:-
#Entity(tableName = "file")
data class File(
#PrimaryKey
#ColumnInfo(name = "path") var path: String,
#ColumnInfo(name = "date", index = true) var date: Long,
#ColumnInfo(name = "number") var num: Float = -1f
)
and the second class as :-
#Entity(tableName = "del_file",primaryKeys = ["path"], indices = [Index("date")])
data class delFile(
#Embedded
val file: File
)
However, you need to be aware of what is and isn't carried over to the class that embeds another class.
You cannot use @ColumnInfo with an @Embedded field (it is not necessarily a single column).
The @PrimaryKey is dropped when embedding, which is why you need to define the primary key in the @Entity annotation of the embedding class (it is feasible to have multiple @Embedded fields, so Room cannot work out which primary key would be the correct one, and Room requires that a primary key is defined).
Likewise for the index on the date column, hence the indices being defined in the @Entity annotation of the delFile class that embeds the other class.
However, the column names given in the @ColumnInfo annotations of the first class are used in the second (it is safe to say these propagate).
As an example of the column names being different from the field/variable names, if, for the first class, you had:-
#Entity(tableName = "file")
data class File(
#PrimaryKey
#ColumnInfo(name = "mypath") var path: String,
#ColumnInfo(name = "mydate", index = true) var date: Long,
#ColumnInfo(name = "mynumber") var num: Float = -1f
)
Then, as the column names are different from the field/variable names, you would have to use:-
#Entity(tableName = "del_file",primaryKeys = ["mypath"], indices = [Index("mydate")])
data class delFile(
#Embedded
val file: File
)
You would also have to be aware that changes to the File class would also apply to the delFile class, which could be useful at times but potentially problematic at others.
Changes to the second (delFile) class would not be applied to the first (File) class, so you would have the freedom to augment the second, e.g.:-
#Entity(tableName = "del_file",primaryKeys = ["mypath"], indices = [Index("mydate")])
data class delFile(
#Embedded
val file: File,
val another_column: String
)
This would result in the del_file table having the additional another_column column.
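Purely as an illustration of how the two classes might then be used together (this DAO is not part of the answer above; the names are made up):
import androidx.room.*

@Dao
abstract class FileDao {
    @Insert
    abstract fun insertDeleted(deleted: delFile)

    @Delete
    abstract fun delete(file: File)

    // Copy the row into del_file and remove it from file in a single transaction.
    @Transaction
    open fun moveToRecycleBin(file: File) {
        insertDeleted(delFile(file, another_column = ""))
        delete(file)
    }
}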

Related

Saving Multiple Nested lists into room database

I'm working on a dictionary application in which the API response has many nested lists. I have tried to insert all the nested lists, but I get a different error each time. I would like to know the best way to save multiple nested lists: should I use Room relations or something else? Thank you in advance for the help, I have been stuck on this for a few days now.
This is a sample schema of how the lists are nested.
This is the parent list:
#Entity(tableName = "DICTIONNARYTABLE")
#TypeConverters(DictionnaryModelConverter::class)
class DictionnaryModel : ArrayList<DictionnaryModelItem>() {
#PrimaryKey(autoGenerate = true)
#NotNull
val wordId: Long = 0
}
The parent list has two lists as well
@Entity
data class DictionnaryModelItem(
    @PrimaryKey val dictionnaryModelId: Long = 0,
    @TypeConverters(DictionnaryMeaningConverter::class)
    val meanings: MutableList<Meaning>,
    @TypeConverters(DictionnaryPhoneticsConverter::class)
    val phonetics: MutableList<Phonetic>,
    val word: String
)
//---------------------------
@Entity
data class Meaning(
    @PrimaryKey val meaningId: Long = 0,
    @TypeConverters(DictionnaryDefinitionConverter::class)
    val definitions: List<Definition>,
    val partOfSpeech: String
)
///-------------------------------
@Entity
data class Phonetic(
    @PrimaryKey val phoneticId: Long = 0,
    val audio: String,
    val text: String
)
Inside Meaning I also have Definition, which is another model:
@Entity
data class Definition(
    @PrimaryKey val definitionId: Long = 0,
    val definition: String,
    val example: String,
    @TypeConverters(DictionnarySynonymsConverter::class)
    val synonyms: List<String>
)
You need to create a one-to-many relationship data model here. For instance, each dictionary word has many meanings and many phonetics. Here the dictionary entry is the parent entity, and Meaning and Phonetic are the child entities. Each child entity stores its parent entity's primary key in its own table. You will need another data class to define this relationship.
data class DictionaryWithMeanings(
    @Embedded val dictionary: Dictionary,
    @Relation(
        parentColumn = "dictionaryModelId",
        entityColumn = "dictionaryId"
    )
    val meanings: List<Meaning>
)
The Meaning table has to store dictionaryId as a foreign key in its own table, and the same has to be defined for phonetics. Meaning in turn has a similar relationship with Definition, and so on. A sketch of the child side and the query is shown below.
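A rough sketch of what that could look like, with simplified names (the dictionaryId column on Meaning and the Dictionary parent entity are assumptions, not taken from the question):
import androidx.room.*

@Entity
data class Meaning(
    @PrimaryKey(autoGenerate = true) val meaningId: Long = 0,
    // Parent key column; the name must match entityColumn in the @Relation above
    val dictionaryId: Long,
    val partOfSpeech: String
)

@Dao
interface DictionaryDao {
    // @Transaction lets Room run the extra query that fills in the meanings list
    @Transaction
    @Query("SELECT * FROM Dictionary")
    fun getDictionaryWithMeanings(): List<DictionaryWithMeanings>
}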

android room inner join one column from table

This is my table :
@Parcelize
@Entity(tableName = "profile")
data class Profile(
    @SerializedName("id") @PrimaryKey var id: Long,
    @SerializedName("name") var name: String?,
    @TypeConverters(UserConverter::class)
    @NotNull
    @SerializedName("users") var users: List<Long>?
) : Parcelable
and this is my second table :
@Parcelize
@Entity(tableName = "user")
data class User(
    @PrimaryKey
    @SerializedName("id") var id: Long,
    @SerializedName("name") var name: String
) : Parcelable
and I want to get this object :
data class ProfileWithUsersName(
val profile: Profile,
val usersName: List<String>?
)
to get this list of objects I do this :
fun getProfiles(): List<ProfileWithUsersName> {
    val arrayListTemp = arrayListOf<ProfileWithUsersName>()
    val profiles = profileDao.getProfiles()
    for (profile in profiles) {
        if (profile.users != null) {
            arrayListTemp.add(
                ProfileWithUsersName(
                    profile,
                    userDao.getUsersNameByIds(profile.users!!)
                )
            )
        } else {
            arrayListTemp.add(
                ProfileWithUsersName(
                    profile,
                    null
                )
            )
        }
    }
    return arrayListTemp.toList()
}
Is there any way to do this in one query?
It could be something like this:
@Dao
abstract class ProfileDao {
    companion object {
        const val QUERY = "SELECT * FROM profile " +
            "INNER JOIN user " +
            `your condition`
    }

    @Query(QUERY)
    abstract fun getCurrentStep(): List<ProfileWithUsersName>?
}
and your ProfileWithUsersName class should be something like this:
data class ProfileWithUsersName(
    @Embedded val profile: Profile,
    @Embedded val usersName: List<String>?
)
Maybe this won't be a full answer, but why not give a partial one?
Notes on your tables' structure
It's not optimal, and it would be better not to hold the information about which users have a given profile inside the Profile class (the better option is to move this into another table). But let it be.
With no changes to the tables' structure, I think your solution with nested queries could be improved if you called userDao.getUsersNameByIds(profile.users!!) just once (but for that you would have to change the logic a little). That said, I suspect your tables are not big enough to justify investing time in such optimisations.
As for "do this in one query"
I actually think there is some chance of succeeding, but honestly I don't think it is worth it, or that such a query would be faster or more elegant than the loops. But it could be a challenge for somebody :)
The main problem here is the users field in your Profile entity. In SQLite I guess you save it as a String (with the help of UserConverter). For example, if you have the list of ids [111, 222, 333], in SQLite it will be saved as the TEXT "111,222,333". So at the next step we must somehow JOIN a table with such a field to a table that has the INTEGERs 111|222|333.
That can only be done by casting the INTEGER to TEXT and then joining the tables on a condition like casted_value_from_second_table LIKE '%,' + value_from_first_table + ',%' in pseudocode (a rough sketch follows below). Maybe that requires some changes to your TypeConverter. Even if it works, joining tables with LIKE is not best practice.
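Purely to illustrate the idea (and assuming UserConverter stores the ids as plain comma-separated text), such a query might look roughly like this; getUserNamesForProfile is a made-up name:
@Dao
interface ProfileNameDao {
    // Matches each user id against the comma-separated users text stored on the profile row
    @Query(
        "SELECT user.name FROM profile " +
        "INNER JOIN user ON ',' || profile.users || ',' LIKE '%,' || CAST(user.id AS TEXT) || ',%' " +
        "WHERE profile.id = :profileId"
    )
    fun getUserNamesForProfile(profileId: Long): List<String>
}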
Conclusions:
Without changes to the table structures, and if your tables are not very big, I think the way you do it now is acceptable. Doing it in one query (again, with no changes to the db tables' structure) is not a regular task; there are several problems that would need to be solved experimentally, and I don't think it is worth it.
P.S. Maybe I'm missing some obvious solution, so this is just my personal opinion :)

How to store a list of integers in Room database so that it can be easily queried later?

I'm storing podcast data in a Room database, where each podcast has a List<Int> called genreIds. I'd like to be able to store this in such a way that I can easily query it later by doing something like SELECT * FROM podcasts WHERE genreIds CONTAINS :genre, or whatever the command would be.
So what is the best way to store that list of Ints so that it can be easily queried later, and how would I do that using Room? I've used TypeConverters before, but that converts it to a string, which is difficult to query, so I'd like to be able to link it to another table or something that can be easily queried, I'm just not sure how to do that.
Thanks in advance.
The data stored in the db by Room depends on the data class you use. If you specify a data class with an Int member, that will be an INTEGER in the db.
Example:
data class TrackPlayedDB(
    @PrimaryKey(autoGenerate = true)
    val _id: Int = 0,
    val timesPlayed: Int,
    val artist: String,
    val trackTitle: String,
    val playDateTime: LocalDateTime
)
Here timesPlayed will be an INTEGER in the DB (as will _id). You specify your data classes like the following; this will build the corresponding tables.
@Database(entities = [TrackPlayedDB::class], version = 1, exportSchema = false)
@TypeConverters(Converters::class)
abstract class MyRoomDatabase : BaseRoomDatabase() {
Edit: Following the author's comment, I stand corrected; I didn't get the question right.
The author actually asks how to store a List<Int> as a field on a table. There are two solutions: one, as the author suggests, is to store the list as a String and use the LIKE keyword to write queries with a clause like the following:
SELECT * FROM mytable
WHERE column1 LIKE '%word1%'
OR column1 LIKE '%word2%'
OR column1 LIKE '%word3%'
as a simple search on SO would have shown: SQL SELECT WHERE field contains words.
The author says he used TypeConverters, so I'll skip how to convert a List<Int> into a string.
The other solution is to recognise that this is a classic many-to-many relationship: in the case of podcasts and genres, relational database theory dictates that you build a table that links the ids of podcasts to the ids of genres, as explained here: https://dzone.com/articles/how-to-handle-a-many-to-many-relationship-in-datab
and in countless other books, videos and blogs.
This benefits the db with added clarity, performance and scalability.
Bottom line, the author's db design is wrong.
I found [this article on Medium][1] very helpful. What I'm trying to do is a many-to-many relationship, which in this case would be done something like the following:
Podcast class:
#Entity(tableName = "podcasts")
data class Podcast(
#PrimaryKey
#ColumnInfo(name = "podcast_id")
val id: String,
// other fields
}
Genre class:
#Entity(tableName = "genres")
data class Genre (
#PrimaryKey
#ColumnInfo(name = "genre_id")
val id: Int,
val name: String,
val parent_id: Int
)
PodcastDetails class:
data class PodcastDetails(
    @Embedded
    val podcast: Podcast,
    @Relation(
        parentColumn = "podcast_id",
        entityColumn = "genre_id",
        associateBy = Junction(PodcastGenreCrossRef::class)
    )
    val genres: List<Genre>
)
PodcastGenreCrossRef:
@Entity(primaryKeys = ["podcast_id", "genre_id"])
data class PodcastGenreCrossRef(
    val podcast_id: String,
    val genre_id: Int
)
And access it in the DAO like this:
@Transaction
@Query("SELECT * FROM podcasts")
fun getPodcastsWithGenre(): List<PodcastDetails>
[1]: https://medium.com/androiddevelopers/database-relations-with-room-544ab95e4542
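For completeness, a rough sketch (not from the article) of how the cross-ref rows might be written; the DAO and its method names are made up:
import androidx.room.*

@Dao
abstract class PodcastDao {
    @Insert(onConflict = OnConflictStrategy.REPLACE)
    abstract fun insertPodcast(podcast: Podcast)

    @Insert(onConflict = OnConflictStrategy.REPLACE)
    abstract fun insertGenres(genres: List<Genre>)

    @Insert(onConflict = OnConflictStrategy.REPLACE)
    abstract fun insertCrossRefs(refs: List<PodcastGenreCrossRef>)

    // Store the podcast, its genres, and one cross-ref row per genre
    @Transaction
    open fun insertPodcastWithGenres(podcast: Podcast, genres: List<Genre>) {
        insertPodcast(podcast)
        insertGenres(genres)
        insertCrossRefs(genres.map { PodcastGenreCrossRef(podcast.id, it.id) })
    }
}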

Room Livedata returns incorrect values

I have an audio recorder app, where I enable the user to mark certain points in his recordings with predefined markers. For that purpose I have a MarkerEntity, which is the type of Marker, and a MarkTimestamp, the point at which the user marks a given recording. These entities are connected via a Relation, called MarkAndTimestamp.
#Entity(tableName = "markerTable")
data class MarkerEntity(
#PrimaryKey(autoGenerate = true) val uid: Int,
#ColumnInfo(name = "markerName") val markerName: String
)
#Entity(tableName = "markerTimeTable")
data class MarkTimestamp(
#PrimaryKey(autoGenerate = true) #ColumnInfo(name = "mid") val mid: Int,
#ColumnInfo(name = "recordingId") val recordingId: Int,
#ColumnInfo(name = "markerId") val markerId: Int,
#ColumnInfo(name = "markTime") val markTime: String
)
data class MarkAndTimestamp(
#Embedded val marker: MarkerEntity,
#Relation(
parentColumn = "uid",
entityColumn = "markerId"
)
val markTimestamp: MarkTimestamp
)
The insertion of this data works flawlessly; I checked it via DB Browser for SQLite and Android Debug Database. The problem arises when I want to display all marks for a recording. I fetch the entries with the following SQL statement.
@Transaction
@Query("SELECT * FROM markerTimeTable INNER JOIN markerTable ON markerTimeTable.markerId = markerTable.uid WHERE markerTimeTable.recordingId = :key")
fun getMarksById(key: Int): LiveData<List<MarkAndTimestamp>>
What ends up happening is that if the user uses a Marker more than once, all marks created with that Marker have the same MarkTimestamp row attached to them, specifically the last row inserted with that Marker. The weird thing is that this only happens in the app using LiveData. Running the same query in DB Browser for SQLite returns the correct and desired data.
This is the stored data (correct): the MarkTimestamps and MarkerEntities tables (screenshots omitted).
And this is the LiveData returned at this point (incorrect):
[
MarkAndTimestamp(marker=MarkerEntity(uid=1, markerName=Mark), markTimestamp=MarkTimestamp(mid=6, recordingId=2, markerId=1, markTime=00:05)),
MarkAndTimestamp(marker=MarkerEntity(uid=2, markerName=zwei), markTimestamp=MarkTimestamp(mid=5, recordingId=2, markerId=2, markTime=00:03)),
MarkAndTimestamp(marker=MarkerEntity(uid=1, markerName=Mark), markTimestamp=MarkTimestamp(mid=6, recordingId=2, markerId=1, markTime=00:05))
]
I also get the following build warning
warning: The query returns some columns [mid, recordingId, markerId, markTime] which are not used by de.ur.mi.audidroid.models.MarkAndTimestamp. You can use @ColumnInfo annotation on the fields to specify the mapping. You can suppress this warning by annotating the method with @SuppressWarnings(RoomWarnings.CURSOR_MISMATCH). Columns returned by the query: mid, recordingId, markerId, markTime, uid, markerName. Fields in de.ur.mi.audidroid.models.MarkAndTimestamp: uid, markerName. - getMarksById(int) in de.ur.mi.audidroid.models.MarkerDao
Why does Room return the wrong data and how do I fix this?
So, I still don't know what caused the behaviour described in my original post. My guess is that the relation data class and the SQL query interfered in some way, producing the confusing and incorrect outcome.
I solved my problem nonetheless. I needed to change
data class MarkAndTimestamp(
    @Embedded val marker: MarkerEntity,
    @Relation(
        parentColumn = "uid",
        entityColumn = "markerId"
    )
    val markTimestamp: MarkTimestamp
)
to
data class MarkAndTimestamp(
    @Embedded val marker: MarkerEntity,
    @Embedded val markTimestamp: MarkTimestamp
)
This makes sure that all fields returned by the query are included in the data class.
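As a side note (not part of the original answer): if the two embedded entities ever ended up with clashing column names, the prefix parameter of @Embedded could keep them apart, at the cost of aliasing those columns in the query. A sketch of what that might look like:
import androidx.lifecycle.LiveData
import androidx.room.*

data class MarkAndTimestamp(
    @Embedded(prefix = "marker_") val marker: MarkerEntity,
    @Embedded val markTimestamp: MarkTimestamp
)

@Dao
interface MarkerDao {
    // The marker columns are aliased so they match the "marker_" prefix expected by @Embedded
    @Query(
        "SELECT markerTable.uid AS marker_uid, markerTable.markerName AS marker_markerName, markerTimeTable.* " +
        "FROM markerTimeTable INNER JOIN markerTable ON markerTimeTable.markerId = markerTable.uid " +
        "WHERE markerTimeTable.recordingId = :key"
    )
    fun getMarksById(key: Int): LiveData<List<MarkAndTimestamp>>
}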

Is it possible in Room to ignore a field on a basic update

I have the following entity:
@Entity
class Foo(
    @PrimaryKey
    @ColumnInfo(name = "id")
    val id: Long,
    @ColumnInfo(name = "thing1")
    val thing1: String,
    @ColumnInfo(name = "thing2")
    val thing2: String,
    @ColumnInfo(name = "thing3")
    val thing3: String,
    @ColumnInfo(name = "thing4")
    val thing4: String
) {
    @ColumnInfo(name = "local")
    var local: String? = null
}
Where local is information that is not stored on the server, only local to the phone.
Currently when I pull information from the server, GSON auto-fills the values, but since "local" does not come from the server it is not populated in that object.
Is there a way that, when I call update, I can have Room skip the "local" column without writing a custom update that lists every other column except "local"? The pain point is that I could have many columns, and each new column I add would also have to be added to the custom update statement.
I have also thought about a one-to-one mapping from the server entity to a new "local" entity, but then I have to deal with the pain of a join statement everywhere I fetch my entity, since I need the local information.
I was hoping that I could do something like this:
@Entity
class Foo(
    @PrimaryKey
    @ColumnInfo(name = "id")
    val id: Long,
    @ColumnInfo(name = "thing1")
    val thing1: String,
    @ColumnInfo(name = "thing2")
    val thing2: String,
    @ColumnInfo(name = "thing3")
    val thing3: String,
    @ColumnInfo(name = "thing4")
    val thing4: String
) {
    @Ignore
    var local: String? = null
}
using the @Ignore annotation to try to ignore the local string on a generic update, and then providing a custom update statement to save just the local info:
#Query("UPDATE foo SET local = :newLocal WHERE foo.id = :id")
fun updateLocal(id: Long, newLocal: String)
However, Room seems to be smart enough to notice that I used @Ignore on the local property, and it will not compile with that update statement.
Any ideas?
Partial updates were added to Room in 2.2.0.
In your Dao you do the following:
// Here you specify the target entity
@Update(entity = Foo::class)
fun update(partialFoo: PartialFoo)
And alongside your entity Foo, create a PartialFoo containing the primary key and the fields you want to update.
data class PartialFoo(
    @ColumnInfo(name = "id")
    val id: Long,
    @ColumnInfo(name = "thing1")
    val thing1: String
)
https://stackoverflow.com/a/59834309/1724097
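A quick usage sketch (the FooDao interface and the values here are made up for illustration):
@Dao
interface FooDao {
    // Only the columns present in PartialFoo (id, thing1) are written; local stays untouched
    @Update(entity = Foo::class)
    fun update(partialFoo: PartialFoo)
}

// Somewhere off the main thread:
fooDao.update(PartialFoo(id = 42L, thing1 = "updated value"))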
The simple answer is no: Room doesn't have conditional insertion or partial insertion built in.
You have to come up with your own insertion logic. The best one, I guess, is to call both the database and the server for data, and then overwrite the server response's local value with the database response's local value.
If you are comfortable with Rx, then you can do something like this:
localDb.getFoo("id")
.zipWith(
remoteServer.getFoo("id"),
BiFunction<Foo, Foo, Foo> { localFoo, remoteFoo ->
remoteFoo.local = localFoo.local
remoteFoo
}
)
Another possible way is to write a custom @Query that writes all the values except local, but that is not feasible if you have lots of fields; see the sketch below.
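For reference, such a query might look roughly like this (column names taken from the Foo entity above; updateWithoutLocal is a made-up name):
@Query(
    "UPDATE foo SET thing1 = :thing1, thing2 = :thing2, thing3 = :thing3, thing4 = :thing4 " +
    "WHERE id = :id"
)
fun updateWithoutLocal(id: Long, thing1: String, thing2: String, thing3: String, thing4: String)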
