I am fairly new to Android Room and SQLite in general, so sorry if this is a simple question.
I am getting data from an API that I'd like to insert into a database so it's accessible when the device is offline.
Depending on the endpoint of the API, some fields of my data objects may be null (think a summary with just the basic fields versus a fully detailed object with all fields).
To keep the database clean, I'd like to update the entries, but only the columns that are not null (i.e. the ones I have new values for), and keep the rest of the columns untouched.
Here are some example classes to clarify:
Person
@Entity(tableName = "person", indices = {
@Index(value = "id", unique = true)
})
public class Person {
@PrimaryKey
public int id;
public String name;
public String description;
}
Example:
// create db
RoomDB db = RoomDB.create(ctx);
// create some sample objects
final Person p2 = new Person(2, "Peter", null);
// insert them into the db
db.personDao().insert(p2);
// create an updated Peter that likes spiders
// but has no name (as an example)
final Person newPeter = new Person(2, null, "Peter likes spiders");
// and update him
db.personDao().updateNonNull(newPeter);
// now we read him back
final Person peter = db.personDao().getById(2);
In this example, the desired values of 'peter' would be:
id = 2
name = "Peter"
description = "Peter likes spiders"
However, using Room's @Update or @Insert I can only get this:
id = 2
name = null
description = "Peter likes spiders"
The only way I found to achieve this would be to manually get the object and supplement the values, like so:
@Transaction
public void updateNonNull(Person newPerson) {
final Person oldPerson = getById(newPerson.id);
if (oldPerson == null) {
insert(newPerson);
return;
}
if (newPerson.name == null)
newPerson.name = oldPerson.name;
if (newPerson.description == null)
newPerson.description = oldPerson.description;
update(newPerson);
}
However, that would result in quite a bit of code with bigger objects...
So my question: is there a better way to do this?
Edit:
After some testing with the SQL by @Priyansh Kedia, I found that those functions do indeed work as intended, and with better performance than the Java approach.
However, as a SQL statement would have required me to write huge queries, I decided to use a reflection-based solution, as can be seen below.
I only did so because the function isn't called regularly, so the lower performance won't matter too much.
/**
* merge two objects fields using reflection.
* replaces null value fields in newObj with the value of that field in oldObj
* <p>
* assuming the following values:
* oldObj: {name: null, desc: "bar"}
* newObj: {name: "foo", desc: null}
* <p>
* results in the "sum" of both objects: {name: "foo", desc: "bar"}
*
* @param type the type of the two objects to merge
* @param oldObj the old object
* @param newObj the new object. after the function, this is the merged object
* @param <T> the type
* @implNote This function uses reflection, and thus is quite slow.
* The fastest way of doing this would be to use SQL's ifnull or coalesce (about 35% faster), but that would involve manually writing an expression for EVERY field.
* That is a lot of extra code which I'm not willing to write...
* Besides, as long as this function isn't called too often, it doesn't really matter anyway
*/
public static <T> void merge(@NonNull Class<T> type, @NonNull T oldObj, @NonNull T newObj) {
// loop through each field that is accessible in the target type
for (Field f : type.getFields()) {
// get field modifiers
final int mod = f.getModifiers();
// check this field is not static and not final
if (!Modifier.isStatic(mod)
&& !Modifier.isFinal(mod)) {
// try to merge
// get values of both the old and new object
// if the new object has a null value, set the value of the new object to that of the old object
// otherwise, keep the new value
try {
final Object oldVal = f.get(oldObj);
final Object newVal = f.get(newObj);
if (newVal == null)
f.set(newObj, oldVal);
} catch (IllegalAccessException e) {
Log.e("Tenshi", "IllegalAccess in merge: " + e.toString());
e.printStackTrace();
}
}
}
}
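For completeness, this is roughly how the merge helper plugs into the DAO transaction shown earlier (just a sketch; ObjectUtil is a hypothetical holder class for the static merge() method above):

@Transaction
public void updateNonNull(Person newPerson) {
    final Person oldPerson = getById(newPerson.id);
    if (oldPerson == null) {
        // nothing to merge with, just insert the new row
        insert(newPerson);
        return;
    }
    // copy every field that is null in newPerson from oldPerson via reflection
    ObjectUtil.merge(Person.class, oldPerson, newPerson);
    update(newPerson);
}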
There is no built-in method in Room to do this.
What you can do is put a check in the query for your update method.
@Query("UPDATE person SET name = (CASE WHEN :name IS NOT NULL THEN :name ELSE name END), description = (CASE WHEN :description IS NOT NULL THEN :description ELSE description END) WHERE id = :id")
void update(int id, String name, String description);
This UPDATE query checks whether the passed-in values are null, and if they are, the previous values are retained.
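The same null check can also be written more compactly with COALESCE (the ifnull/coalesce route the question's edit mentions); a minimal sketch against the same person table:

// COALESCE(:name, name) keeps the existing column value whenever the parameter is null
@Query("UPDATE person SET name = COALESCE(:name, name), " +
        "description = COALESCE(:description, description) WHERE id = :id")
void updateNonNull(int id, String name, String description);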
Related
Matching multiple titles in a single query using the LIKE keyword
I am trying to get all records that match any of the given titles.
Below is the structure of the database (see the database screenshot).
When I pass a single LIKE query it returns data:
@Query("SELECT * FROM task WHERE task_tags LIKE '%\"title\":\"Priority\"%'")
When I try to generate the query dynamically to search for multiple matches, it returns no data:
val stringBuilder = StringBuilder()
for (i in 0 until tags.size) {
val firstQuery = "%\"title\":\"Priority\"%"
if (i == 0) {
stringBuilder.append(firstQuery)
} else stringBuilder.append(" OR '%\"title\":\"${tags[i].title}\"%'")
}
This is the function I have made:
@Query("SELECT * FROM task WHERE task_tags LIKE:tagQuery ")
fun getTaskByTag(stringBuilder.toString() : String): List<Task>
The single LIKE query is fine. However, you simply cannot use the second method.
First, you are omitting the space after LIKE.
Then you are omitting the full test, i.e. you have task_tags LIKE ? OR ? when it should be task_tags LIKE ? OR task_tags LIKE ? ....
And even then, due to the way that a parameter is handled by Room, the entire parameter is wrapped/encased as a single string, so the OR/OR LIKEs all become part of what is being searched for as a single test.
The correct solution, at least from a database perspective, would be to not have a single column with a JSON representation of the list of the tags, but to have a table for the tags and then, as you want a many-many relationship (a task can have many tags and a single tag could be used by many tasks), an associative table; you could then do the test using an IN clause, roughly as sketched below.
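A very rough sketch of that normalised design (the entity, column and method names below are made up for illustration; only the task table's tid column is taken from the code further down, and the query method would live in the existing Dao):

@Entity(tableName = "tag")
public class Tag {
    @PrimaryKey
    public long tagId;
    public String title;
}

// associative (junction) table for the many-to-many relationship
@Entity(tableName = "task_tag", primaryKeys = {"taskId", "tagId"})
public class TaskTagCrossRef {
    public long taskId;
    public long tagId;
}

// tasks that carry at least one of the given tag titles
@Query("SELECT * FROM task WHERE tid IN " +
        "(SELECT taskId FROM task_tag JOIN tag ON tag.tagId = task_tag.tagId " +
        "WHERE tag.title IN (:titles))")
List<Task> getTasksByTagTitles(List<String> titles);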
As a get around though, you could utilise a RawQuery where the SQL statement is built accordingly.
As an example:-
@RawQuery
fun rawQuery(qry: SimpleSQLiteQuery): Cursor
@SuppressLint("Range")
fun getTaskByManyTags(tags: List<String>): List<Task> {
val rv = ArrayList<Task>()
val sb=StringBuilder()
var afterFirst = false
for (tag in tags) {
if (afterFirst) {
sb.append(" OR task_tags ")
}
sb.append(" LIKE '%").append(tag).append("%'")
afterFirst = true
}
if (sb.isNotEmpty()) {
val csr: Cursor = rawQuery(SimpleSQLiteQuery("SELECT * FROM task WHERE task_tags $sb"))
while (csr.moveToNext()) {
rv.add(
Task(
csr.getLong(csr.getColumnIndex("tid")),
csr.getString(csr.getColumnIndex("task_title")),
csr.getString(csr.getColumnIndex("task_tags"))))
// other columns ....
}
csr.close()
}
return rv
}
Note that the complex string with the embedded double quotes is, in this example, passed rather than built into the function (a relatively simple change to incorporate), e.g. it could be called using:
val tasks1 = taskDao.getTaskByManyTags(listOf()) would return no tasks (how to handle no passed tags is something you would need to decide upon)
val tasks2 = taskDao.getTaskByManyTags(listOf("\"title\":\"Priority\""))
val tasks3 = taskDao.getTaskByManyTags(listOf("\"title\":\"Priority\"","\"title\":\"Priority\"","\"title\":\"Priority\"")) obviously the tags would change
Very limited testing has been undertaken (hence just the 3 columns), but running all 3 invocations above against a very limited database (basically the same row) produces the expected results (as per a breakpoint):-
the first returns the empty list as there are no search arguments.
the second and third both return all 4 rows as "title":"Priority" is in all 4 rows
the main reason for the 3 search args was to check the syntax of multiple args, rather than whether or not the correct selections were made.
The resultant query of the last (3 passed tags) being (as extracted from the getTaskByManyTags function):-
SELECT * FROM task WHERE task_tags LIKE '%"title":"Priority"%' OR task_tags LIKE '%"title":"Priority"%' OR task_tags LIKE '%"title":"Priority"%'
Currently, we have the following database table
@Entity(
tableName = "note"
)
public class Note {
@ColumnInfo(name = "body")
private String body;
public String getBody() {
return body;
}
public void setBody(String body) {
this.body = body;
}
}
The length of the body string can be from 0 to a very large number.
In certain circumstances, we need:
1. To load all the notes into memory.
2. A LiveData which is able to inform observers if there are any changes made in the SQLite note table.
3. Only the first 256 characters of body; we do not need the entire body. Loading the entire body string for all notes might cause an OutOfMemoryException.
We have the following Room Database Dao
@Dao
public abstract class NoteDao {
@Query("SELECT * FROM note")
public abstract LiveData<List<Note>> getAllNotes();
}
getAllNotes is able to fulfill requirements (1) and (2), but not (3).
The following getAllNotesWithShortBody is a failed solution.
@Dao
public abstract class NoteDao {
@Query("SELECT * FROM note")
public abstract LiveData<List<Note>> getAllNotes();
@Query("SELECT * FROM note")
public abstract List<Note> getAllNotesSync();
public LiveData<List<Note>> getAllNotesWithShortBody() {
MutableLiveData<List<Note>> notesLiveData = new MutableLiveData<>();
//
// Problem 1: Still can cause OutOfMemoryException by loading
// List of notes with complete body string.
//
List<Note> notes = getAllNotesSync();
for (Note note : notes) {
String body = note.getBody();
// Extract first 256 characters from body string.
body = body.substring(0, Math.min(body.length(), 256));
note.setBody(body);
}
notesLiveData.postValue(notes);
//
// Problem 2: The returned LiveData unable to inform observers,
// if there's any changes made in the SQLite `note` table.
//
return notesLiveData;
}
}
I was wondering, is there any way to tell the Room database Dao: before returning the List of Notes as LiveData, please perform a transformation on every Note's body column, trimming the string to a maximum of 256 characters?
Examining the source code generated by Room Dao
If we look at the source code generated by Room Dao
@Override
public LiveData<List<Note>> getAllNotes() {
final String _sql = "SELECT * FROM note";
final RoomSQLiteQuery _statement = RoomSQLiteQuery.acquire(_sql, 0);
...
...
final String _tmpBody;
_tmpBody = _cursor.getString(_cursorIndexOfBody);
_tmpPlainNote.setBody(_tmpBody);
It will be great, if there is a way to supply transformation function during runtime, so that we can have
final String _tmpBody;
_tmpBody = transform_function(_cursor.getString(_cursorIndexOfBody));
_tmpPlainNote.setBody(_tmpBody);
P.S. Please do not counter-recommend the Paging library at this moment, as some of our features require the entire List of Notes (with trimmed body String) in memory.
You can use SUBSTR, one of SQLite's built-in functions.
You need a primary key in your @Entity. Assuming that you call it id, you can write SQL like the query below.
@Query("SELECT id, SUBSTR(body, 0, 257) AS body FROM note")
public abstract LiveData<List<Note>> getAllNotes();
This will return the body trimmed to 256 chars.
With that being said, you should consider segmenting your rows. If you have too many rows, they will eventually use up your memory at some point. Using Paging is one way to do it. You can also use LIMIT and OFFSET to manually go through segments of rows.
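A sketch of the LIMIT/OFFSET variant, combined with the SUBSTR trick (this assumes the id primary key mentioned above; the method name and paging parameters are just for illustration):

// load one "page" of notes, each with body trimmed to its first 256 characters
@Query("SELECT id, SUBSTR(body, 1, 256) AS body FROM note LIMIT :limit OFFSET :offset")
public abstract List<Note> getNotesPage(int limit, int offset);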
Let's take this example: I have a form, which has several sections, each having questions. Alongside, I have answers that are mapped to questions, and they have another column that I want to filter on when querying.
So I have the following entities:
@Entity(tableName = "sections")
public class Section {
@PrimaryKey
public long id;
public String title;
}
@Entity(tableName = "questions")
public class Question {
@PrimaryKey
public long id;
public String title;
public long sectionId;
}
@Entity(tableName = "answers")
public class Answer {
@PrimaryKey
public long id;
public long questionId;
public int otherColumn;
}
In the section DAO I want to retrieve all of them.
Here's the POJO that I want filled by this query:
class SectionWithQuestions {
@Embedded
public Section section;
@Relation(parentColumn = "id", entityColumn = "sectionId", entity = Question.class)
public List<QuestionWithAnswer> questions;
public static class QuestionWithAnswer {
@Embedded
public Question question;
@Relation(parentColumn = "id", entityColumn = "questionId", entity = Answer.class)
List<Answer> answers;
}
}
In another application, the query would be:
SELECT s.*, q.*, a.*
FROM sections s
LEFT JOIN questions q ON q.sectionId = s.id
LEFT JOIN answers a ON a.questionId = q.id
WHERE s.id = :sectionId and a.otherColumn = :otherColumn
However, in Room I have found out that if you want an object and its relations (like a user and their pets in the documentation example), you only select the object, and the relations are queried in a second query. That would be:
@Query("SELECT * FROM sections WHERE id = :sectionId")
Then in the generated code there would be (pseudo code):
sql = "SELECT * FROM sections WHERE id = :sectionId" // what's inside @Query
cursor = query(sql)
int indexColumn1 = cursor.getColumnIndex(col1)
int indexColumn2
... etc
while (cursor.moveToNext) {
masterObject = new object()
masterObject.property1 = cursor.get(indexColumn1)
... etc
__fetchRelationshipXXXAsYYY(masterObject.relations) // fetch the child objects
}
and this __fetch XXX as YYY method is as follows:
sql = "SELECT field1, field2, ... FROM a WHERE foreignId IN (...)"
similar algo as previously: fetch column indices, and loop through the cursor
So basically it creates 2 queries: one for the master object and one for the relations. The 2nd query is automatically created and we have no control over it.
To get back to my problem where I want relations but also filter on the child column, I'm stuck:
in the 1st query I can't reference the otherColumn column because it doesn't exist
in the @Relation I can't either, because the only properties of this annotation are the join columns and the entity definition
Is this possible in Room or do I have to make the subqueries myself?
Bonus question: why don't they join tables in a single query but create 2 queries instead? Is this for performance reasons?
Edit to clarify what I expected:
That's what I expected to write:
@Query("SELECT s.*, q.*, a.* " +
"FROM sections s " +
"LEFT JOIN questions q ON q.sectionId = s.id " +
"LEFT JOIN answers a ON a.questionId = q.id " +
"WHERE s.id = :sectionId and a.otherColumn = :additionalIntegerFilter")
SectionWithQuestionsAndAnswers fetchFullSectionData(long sectionId, long additionalIntegerFilter);
static class SectionWithQuestionsAndAnswers {
@Embedded Section section;
@Relation(parentColumn = "id", entityColumn = "sectionId", entity = Question.class)
List<QuestionWithAnswers> questions;
}
static class QuestionWithAnswers {
@Embedded Question question;
@Relation(parentColumn = "id", entityColumn = "questionId", entity = Answer.class)
Answer answer; // I already know that @Relation expects List<> or Set<> which is
// not useful if I know I have zero or one relation (ensured
// through unique keys)
}
That's the pseudo code I imagined Room would implement as the generated code:
function fetchFullSectionData(long sectionId, long additionalIntegerFilter) {
query = prepare(sql); // from #Query
query.bindLong("sectionId", sectionId);
query.bindLong("additionalIntegerFilter", additionalIntegerFilter);
cursor = query.execute();
Section section = null;
long prevQuestionId = 0;
Question question = null;
while (cursor.hasNext()) {
if (section == null) {
section = new Section();
section.questions = new ArrayList<>();
section.field1 = cursor.get(col1); // etc for all fields
}
if (prevQuestionId != cursor.get(questionIdColId)) {
if (question != null) {
section.questions.add(question);
}
question = new Question();
question.field1 = cursor.get(col1); // etc for all fields
prevQuestionId = question.id;
}
if (cursor.get(answerIdColId) != null) { // has answer
Answer answer = new Answer();
answer.field1 = cursor.get(col1); // etc for all fields
question.answer = answer;
}
}
if (section !=null && question != null) {
section.questions.add(question);
}
return section;
}
That's one query, and all my objects fetched.
I find Room relations hard to work with: they are not very flexible, and much of the work is done under the hood in a way that makes it hard to be sure what is really going on.
In my projects, most of the time I just create presentation objects - objects dedicated to some UI presentation that can be filled with a custom select.
That way I have much more control over what I want to fetch from the DB (i.e. what I really need), and I fill that into the custom presentation object.
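For the schema in this question, such a presentation object could be as simple as a flat row class filled by one JOIN query (a sketch with made-up names; select only the columns the UI really needs):

public class QuestionAnswerRow {
    public long questionId;
    public String questionTitle;
    public long answerId;
    public int otherColumn;
}

@Query("SELECT q.id AS questionId, q.title AS questionTitle, " +
        "a.id AS answerId, a.otherColumn AS otherColumn " +
        "FROM questions q LEFT JOIN answers a ON a.questionId = q.id " +
        "WHERE q.sectionId = :sectionId AND a.otherColumn = :otherColumn")
List<QuestionAnswerRow> loadSectionRows(long sectionId, int otherColumn);

Room fills a plain POJO like this by matching the aliased column names to the field names, so no @Relation is involved.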
I'm just pasting the information provided on the feature request I posted (see my comment on my question):
Hi there - we have recently released a new feature where relational query methods can be defined with Multimap return types. With this new feature, you should be able to achieve the results discussed in this thread. For more info on this new feature, you can check out the following resources:
Define relationships between objects: https://developer.android.com/training/data-storage/room/relationships
Relational Query Methods in ADS 2021: https://youtu.be/i5coKoVy1g4?t=344
The new MapInfo annotation: https://developer.android.com/reference/androidx/room/MapInfo
I know link-only answers aren't great, but I didn't have the opportunity to test this. If someone has a better answer, I'll accept it.
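I haven't verified it myself, but based on those docs a multimap query for the entities in this question would look roughly like this (Room 2.4+; since both tables have id and title columns, the duplicated names may need aliasing or @MapInfo):

// Room builds the Section and Question objects from the joined columns
// and groups the questions per section
@Query("SELECT * FROM sections JOIN questions ON questions.sectionId = sections.id")
Map<Section, List<Question>> loadSectionsWithQuestions();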
I found a better solution for this. Instead of aliasing all columns, you can use the @RawQuery annotation.
First of all, add a prefix to the embedded table's annotation using the table name or its alias, like @Embedded(prefix = "P.") or @Embedded(prefix = "Post."):
public class UserPost {
@Embedded
private User user;
@Embedded(prefix = "P.")
private Post post;
}
Then in your Dao, create a function that runs a raw query, and another default function that builds the query and delegates to it:
@Dao
public interface UserDao {
String USER_POST_QUERY = "SELECT U.*, P.* FROM User as U " +
"INNER JOIN Post as P ON U.id = P.userId " +
"WHERE P.status = 1";
@RawQuery
LiveData<List<UserPost>> rawQuery(SimpleSQLiteQuery query);
default LiveData<List<UserPost>> getAlertViolationsAsync() {
return rawQuery(new SimpleSQLiteQuery(USER_POST_QUERY));
}
}
Is it possible to use an alias (AS) in a query for ORMLite in Android? I am trying to use it with the following code:
String query =
"SELECT *, (duration - elapsed) AS remaining FROM KitchenTimer ORDER BY remaining";
GenericRawResults<KitchenTimer> rawResults =
getHelper().getKitchenTimerDao().queryRaw(
query, getHelper().getKitchenTimerDao().getRawRowMapper());
But when this codes gets executed it gives the following error:
java.lang.IllegalArgumentException: Unknown column name 'remaining' in table kitchentimer
The raw-row-mapper associated with your KitchenTimerDao expects the results to correspond directly with the KitchenTimer entity columns. However, since you are adding your remaining column, it doesn't know where to put that result column, hence the exception. This is a raw query, so you will need to come up with your own results mapper -- you can't use the DAO's. See the docs on raw queries.
For instance, if you want to map the results into your own object Foo then you could do something like:
String query =
"SELECT *, (duration - elapsed) AS remaining FROM KitchenTimer ORDER BY remaining";
GenericRawResults<Foo> rawResults =
orderDao.queryRaw(query, new RawRowMapper<Foo>() {
public Foo mapRow(String[] columnNames, String[] resultColumns) {
// assuming 0th field is the * and 1st field is remaining
return new Foo(resultColumns[0], Integer.parseInt(resultColumns[1]));
}
});
// page through the results
for (Foo foo : rawResults) {
System.out.println("Name " + foo.name + " has " + foo.remaining + " remaining seconds");
}
rawResults.close();
I had the same problem. I wanted to get a list of objects, but with a new attribute added via an alias.
To continue using the object mapper from OrmLite, I used a RawRowMapper to receive the columns and results. But instead of converting all columns manually, I read the alias first and remove its reference from the column arrays. Then it is possible to use the OrmLite Dao mapper.
I wrote it in Kotlin:
val rawResults = dao.queryRaw<Foo>(sql, RawRowMapper { columnNames, resultColumns ->
// convert array to list
val listNames = columnNames.toMutableList()
val listResults = resultColumns.toMutableList()
// get the index of the column not included in dao
val index = listNames.indexOf(ALIAS)
if (index == -1) {
// There is an error in the request because Alias was not received
return@RawRowMapper Foo()
}
// save the result
val aliasValue = listResults[index]
// remove the name and column
listNames.removeAt(index)
listResults.removeAt(index)
// map row
val foo = dao.rawRowMapper.mapRow(
listNames.toTypedArray(),
listResults.toTypedArray()
) as Foo
// add alias value. In my case I save it in the same object
// but another way is to create outside of mapping a list and
// add this value in the list if you don't want value and object together
foo.aliasValue = aliasValue
// return the generated object
return@RawRowMapper foo
})
It is not the shortest solution, but for me it is very important to keep using the same mappers. It avoids errors when an attribute is added to a table and you forget to update the mapping.
I'm trying to join two tables and get all the columns from both. I did this:
QueryBuilder<A, Integer> aQb = aDao.queryBuilder();
QueryBuilder<B, Integer> bQb = bDao.queryBuilder();
aQb.join(bQb).prepare();
This equates to:
SELECT 'A'.* FROM A INNER JOIN B WHERE A.id = B.id;
But I want:
SELECT * FROM A INNER JOIN B WHERE A.id = B.id;
The other problem is when ordering by a field of B, like:
aQb.orderBy(B.COLUMN, true);
I get an error saying "no table column B".
When you are using the QueryBuilder, it is expecting to return A objects, and those cannot also contain all of the fields from B. It will not flesh out foreign sub-fields, if that is what you mean. That feature has not crossed the "lite" barrier for ORMLite.
Ordering on the joined table is also not supported. You can certainly add the bQb.orderBy(B.COLUMN, true) but I don't think that will do what you want.
You can certainly use raw-queries for this although it is not optimal.
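A raw-query version could look something like this (a sketch; B.someColumn stands in for whichever column of B you want to order by, and the row values come back as strings to map yourself):

// join both tables and order on a column of B; each row holds the columns
// of A followed by the columns of B
GenericRawResults<String[]> rawResults = aDao.queryRaw(
        "SELECT * FROM A INNER JOIN B ON A.id = B.id ORDER BY B.someColumn");
for (String[] row : rawResults) {
    // pick the values you need out of each row
}
rawResults.close();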
Actually, I managed to do it without writing my whole query as a raw query. This way, I didn't need to replace my query builder code (which is pretty complicated). To achieve that, I followed these steps:
(Assuming I have two tables, my_table and my_join_table, and their daos, and I want to order my query on my_table by the column order_column_1 of my_join_table)
1- Joined the two query builders & used the QueryBuilder.selectRaw(String... columns) method to include the original table's columns plus the columns I want to use in the foreign sort. Example:
QueryBuilder<MyJoinTable, MyJoinPK> myJoinQueryBuilder = myJoinDao.queryBuilder();
QueryBuilder<MyTable, MyPK> myQueryBuilder = myDao.queryBuilder().join(myJoinQueryBuilder).selectRaw("`my_table`.*", "`my_join_table`.`order_column` as `order_column_1`");
2- Included my order by clauses like this:
myQueryBuilder.orderByRaw("`order_column_1` ASC");
3- After setting all the select columns & order by clauses, it's time to prepare the statement:
String statement = myQueryBuilder.prepare().getStatement();
4- Get the table info from the dao:
TableInfo tableInfo = ((BaseDaoImpl) myDao).getTableInfo();
5- Created my custom column-to-object mapper which just ignores unknown column names. By doing this, we avoid the mapping error for our custom columns (order_column_1 in this case). Example:
RawRowMapper<MyTable> mapper = new UnknownColumnIgnoringGenericRowMapper<>(tableInfo);
6- Query the table for the results:
GenericRawResults<MyTable> results = myDao.queryRaw(statement, mapper);
7- Finally, convert the generic raw results to a list:
List<MyTable> myObjects = new ArrayList<>();
for (MyTable myObject : results) {
myObjects.add(myObject);
}
Here's the custom row mapper I created by modifying com.j256.ormlite.stmt.RawRowMapperImpl (I just swallowed the exception) to avoid the unknown-column mapping errors. You can copy & paste this into your project:
import com.j256.ormlite.dao.RawRowMapper;
import com.j256.ormlite.field.FieldType;
import com.j256.ormlite.table.TableInfo;
import java.sql.SQLException;
public class UnknownColumnIgnoringGenericRowMapper<T, ID> implements RawRowMapper<T> {
private final TableInfo<T, ID> tableInfo;
public UnknownColumnIgnoringGenericRowMapper(TableInfo<T, ID> tableInfo) {
this.tableInfo = tableInfo;
}
public T mapRow(String[] columnNames, String[] resultColumns) throws SQLException {
// create our object
T rowObj = tableInfo.createObject();
for (int i = 0; i < columnNames.length; i++) {
// sanity check, prolly will never happen but let's be careful out there
if (i >= resultColumns.length) {
continue;
}
try {
// run through and convert each field
FieldType fieldType = tableInfo.getFieldTypeByColumnName(columnNames[i]);
Object fieldObj = fieldType.convertStringToJavaField(resultColumns[i], i);
// assign it to the row object
fieldType.assignField(rowObj, fieldObj, false, null);
} catch (IllegalArgumentException e) {
// log this or do whatever you want
}
}
return rowObj;
}
}
It's pretty hacky & seems like overkill for this operation but I definitely needed it and this method worked well.