I have a case where I operate on an object I got from greenDAO and sometimes have to revert the changes. I only got this to work with IdentityScope.None - with an identity scope enabled I found no way to do it - even refresh(), which sounded promising, did not bring back the data from the database. Is there any way to do this with an IdentityScope?
The refresh(entity) method of a DAO does reload all entity values from the database. However, it operates on a single entity, not on a tree of entities.
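In practice that means you can throw away unsaved in-memory edits on one entity by calling refresh() on its DAO. A minimal sketch in Kotlin, with Note / NoteDao as placeholder names for your generated classes:
// Placeholder entity/DAO names; substitute your generated classes.
val note = noteDao.load(noteId)          // may come from the identity scope
note.title = "edited, but not yet saved" // in-memory change to discard
noteDao.refresh(note)                    // re-reads this single row from the DB,
                                         // overwriting the in-memory values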
I am converting my application to a Room database and trying to follow the Google architecture best practices based on "Room with a View".
I am having trouble understanding the repository in terms of clean architecture.
The Words database example contains only one table and one view using it, making it a simple HelloWorld example. But let's start with that.
There is a view which displays a list of words. Thus all words need to be read from the database and displayed.
So we have a MainActivity and a Database to connect.
Entity Word
WordDao to access DB
WordViewModel: To separate the activity lifecycle from the data lifecycle a ViewModel is used.
WordRepository: Since the data may be kept in a database, in the cloud, or elsewhere, the repository is introduced to handle the decision where the data comes from.
Activity with the View
It would be nice if the view is updated when the data changes, so LiveData is used.
This in turn means the repository provides the LiveData for the full table:
// LiveData gives us updated words when they change.
val allWords: LiveData<List<Word>>
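For reference, a minimal sketch of how this is usually wired up in the "Room with a View" style, with the DAO supplying the LiveData and the repository merely exposing it; the table and method names roughly follow the codelab and are assumptions:
import androidx.lifecycle.LiveData
import androidx.room.Dao
import androidx.room.Query

@Dao
interface WordDao {
    // Room re-runs this query and pushes a new value into the LiveData
    // whenever word_table changes.
    @Query("SELECT * FROM word_table ORDER BY word ASC")
    fun getAlphabetizedWords(): LiveData<List<Word>>
}

class WordRepository(private val wordDao: WordDao) {
    // LiveData gives us updated words when they change.
    val allWords: LiveData<List<Word>> = wordDao.getAlphabetizedWords()
}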
This is all fine for a single view.
Now to my questions on expanding this concept.
Let us assume the word table has two columns, "word" and "last_updated" (a time string).
For easier comparison the time string needs to be converted to milliseconds, so I have a function.
Question: Where to put the fun queryMaxServerDateMS() to get the max(last_updated)?
/**
 * @return Highest server date in the table in milliseconds, or 1 on empty/error.
 */
fun queryMaxServerDateMS(): Long {
    val maxDateTime = wordDao.queryMaxServerDate()
    var timeMS: Long = 0
    if (maxDateTime != null) {
        timeMS = parseDateToMillisOrZero_UTC(maxDateTime)
    }
    return if (timeMS <= 0) 1 else timeMS
}
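Inside the WordDao, the query this helper relies on could be declared roughly as follows; the table and column names are assumptions based on the description above:
// Newest last_updated value in the table, or null when the table is empty.
@Query("SELECT MAX(last_updated) FROM word_table")
fun queryMaxServerDate(): String?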
For me it would be natural to put this into the WordRepository.
Second requirement: Background job to update the word list in the database.
Suppose I now want a background job scheduled on a regular basis which checks the server for new entries and downloads them to the database. The app may not be open at that time.
This just relates back to the queryMaxServerDateMS question above.
The job will basically first check, by asking the server, whether an entry exists that is newer than the max known entry.
So I would need to create a new WordRepository, run my query, get max(last_updated) and ask the server.
BUT: I do not need the LiveData in the background job, and as soon as I create the repository (val repository = WordRepository(...)) the full table is read, which is needless and costs time, memory and battery.
I can also think of a number of different fragments that would require some data from the word table, but never the full data; think of a product detail screen which lists a single product.
So I could move it out into another repository, or a DbHelper, or whatever you want to call it.
But in the end I wonder, since LiveData couples the View, ViewModel and Repository closely together:
Question: Do I need a repository for every activity/fragment instead of having a repository for every table which would be much more logical?
Yes, with your current architecture you should put it in the Repository.
No, you don't need a repository for every activity/fragment. Preferably, one repository should be created per entity. You can have a UseCase for every ViewModel.
In Clean Architecture there is the concept of a UseCase / Interactor that can contain business logic. In Android it can act as an additional layer between the ViewModel and the Repository. You can create a UseCase class for your function queryMaxServerDateMS(), put it there, and call it from any ViewModel you need.
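A sketch of what such a UseCase could look like here, assuming the repository already exposes queryMaxServerDateMS() as shown in the question:
// Thin use case: ViewModels (or a background Worker) depend on this
// instead of on the whole repository surface.
class GetMaxServerDateUseCase(private val wordRepository: WordRepository) {
    operator fun invoke(): Long = wordRepository.queryMaxServerDateMS()
}

// Usage from a ViewModel or Worker:
// val maxServerDateMs = getMaxServerDateUseCase()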
Also, you can get your LiveData value synchronously by calling getValue().
You do not need a repository for each activity or fragment. To answer your question about getting the max server time: when you load words from the DB you pretty much have access to the entire table. That means you can either do that computation yourself to decide which is the latest word added, or you can delegate that work to Room by adding another query in the DAO and accessing it in your repository. I'd prefer the latter just for the simplicity of it.
To answer your question about using the repository across different activities or fragments: Room caches your computations so that they are available across different users of your repository (and eventually the DAO). This means that if you have already computed the max server time in one activity and used it there, other lifecycle owners can use that computed result as long as the table has not been altered (there might be other conditions as well).
To summarize, you're right about having a repository per table as opposed to per activity or fragment.
I use Android Architecture Components to build my app. There is the Paging Library to load items with a Room-generated DataSource, and a BoundaryCallback to get new data from the server and store it in the database. It works fine, everything is reactive, and changes in the database come into the PagedList.
But now these items need some additional data and some calculations before they come into the PagedList and RecyclerView. These calculations are not fast enough to execute on the main thread in the RecyclerView ViewHolder (actually I need to get additional data from the database or even from the server). So I supposed that I need to write my own custom DataSource, do the calculations there and then pass the processed items to the PagedList.
I created my ItemKeyedDataSource (I'm not sure this is correct, because I load data from the database and this data source type is designed for the network, but I don't think that is critical) and I make queries in the Dao that return a List of items. After I get a "page", I run the calculations on the items and then pass them to the callback. It works, the PagedList gets the processed items.
But unfortunately there is no reactivity with this approach. No changes in the database reach my PagedList. I tried to return LiveData<List> from the Dao and add an observeForever() listener in the DataSource, but it fails since you can't call it on a background thread.
I looked at the Room-generated DataSource.Factory and LimitOffsetDataSource, but it doesn't look good to me since you need to pass table names to observe changes, and there are other unclear things.
I suppose that I need to use invalidate(), but I have no idea where it should be called.
There are three main questions:
Is it right to process items in the DataSource before they come to the RecyclerView, or is there a better place for that?
Should I use PositionalDataSource instead of ItemKeyedDataSource?
How can I add Room reactivity to custom DataSource?
It seems that I've found a mistake in my DataSource.Factory. Instead of creating a DataSource object in the create() method, I just returned the object which was passed to the factory (I saw that in one popular article on Medium). Because of that I couldn't invalidate my DataSource. Now I create the DataSource inside that method and invalidation works.
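For comparison, a factory that really creates a new source on every create() call might look like the following sketch (Paging 2 API; ItemsDataSource, ItemDao and ProcessedItem are placeholder names):
import androidx.paging.DataSource

class ItemsDataSourceFactory(
    private val itemDao: ItemDao
) : DataSource.Factory<Long, ProcessedItem>() {

    // Called again after invalidate(): returning a fresh instance here is
    // what lets the new PagedList reload, instead of reusing the old source.
    override fun create(): DataSource<Long, ProcessedItem> = ItemsDataSource(itemDao)
}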
The only problem is understanding where and when to invalidate. For now I've found a workaround: make a query in the Dao that returns a LiveData of the last item, then observe it in my Activity to detect that the data was modified and call invalidate(). But I'm not sure this is a good solution. Maybe you know a better one.
You may add an InvalidationTracker observer in your DataSource:
dbRoom.getInvalidationTracker().addObserver(
    object : InvalidationTracker.Observer("your_table") {
        override fun onInvalidated(@NonNull tables: Set<String>) {
            invalidate()
        }
    })
Problem:
I am using the Room Persistence Library and so far everything is working fine, except that there is data from a select query which I need synchronously, because I am calling it from a periodic job (a WorkManager Worker). I have defined the return type to be LiveData since I also access it for display purposes in the UI, and observers are great for that, but now I also need the same data in the job.
Code Snippet
@Query("SELECT * from readings ORDER BY date, time ASC")
LiveData<List<Reading>> getAllReadings();
Tried
I have tried the getValue() method of LiveData, but it returns null because the data has not yet been loaded into the LiveData at the time the query is made.
readingDao().getAllReadings().getValue() // returns null
Possible Solution
There is only one solution I can think of, which is to duplicate the getAllReadings query with a different name and return type (without LiveData), but I don't think this is a clean approach as it duplicates code just to get a synchronous return type.
Please let me know if there is another solution, or perhaps some way to synchronously access data from a LiveData variable.
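For reference, the duplicated query described above is usually just a second DAO method with a plain return type; it can be called safely from a Worker because doWork() already runs on a background thread. A sketch in Kotlin, where the Sync method name is made up:
import androidx.lifecycle.LiveData
import androidx.room.Dao
import androidx.room.Query

@Dao
interface ReadingDao {
    // Reactive variant for the UI.
    @Query("SELECT * from readings ORDER BY date, time ASC")
    fun getAllReadings(): LiveData<List<Reading>>

    // Synchronous variant for the periodic Worker; fine off the main thread.
    @Query("SELECT * from readings ORDER BY date, time ASC")
    fun getAllReadingsSync(): List<Reading>
}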
You can allow main thread queries when you initialize the Room DB, but that is clearly not desirable. It will give you the synchronous behavior but will block the user interface. Is there a specific reason you want this to be synchronous?
The reason getValue() returns null is that Room queries the data asynchronously. You can attach an observer or a callback to get the result when the query is finished. From there you can display the result in the UI, chain another call for sequential operations, and so on.
I use RxJava to wrap my queries for asynchronous execution, but you can also use AsyncTask.
In my Android application I am using greenDAO as an ORM.
I have two tables: A and B. Table B has foreign key to table A.
A entities can execute getBList() method and B entities can execute getA() method.
When I started to remove some A entities together with their connected B entities from the database, I noticed strange behavior. Now some newly created A entities have connected B entities, although no connection is made in code:
A a = new A();
// setting some simple a fields, nothing with Bs
aDao.create(a);
a.getBList(); // not empty list
Does anybody know what can cause such behavior and how to fix it?
This is from the greenDao website:
Resolving and Updating To-Many Relations
To-many relations are resolved lazily on the first request. After that, the related entities are cached in the source entity inside a List object. Subsequent calls to the get method of the relation do not query the database.
Note that updating to-many relations requires some additional work. Because to-many lists are cached, they are not updated when related entities are added to the database. The following code illustrates the behavior:
List<Order> orders1 = customer.getOrders();
int size1 = orders1.size();
Order order = new Order();
order.setCustomerId(customer.getId());
daoSession.insert(order);
List<Order> orders2 = customer.getOrders();
// size1 == orders2.size(); // NOT updated
// orders1 == orders2; // SAME list object
Likewise, you can delete related entities:
List<Order> orders = customer.getOrders();
daoSession.delete(newOrder);
orders.remove(newOrder);
Sometimes, it may be cumbersome or even impossible to update all to-many relations manually after related entities were added or removed. To the rescue, greenDAO has reset methods to clear the cached list. If a to-many relation may have changed potentially, you can force greenDAO to reload the list of related entities:
customer.resetOrders();
List<Order> orders2 = customer.getOrders();
Try to reset your relations when you add or remove elements.
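Applied to the A/B case above, that would be something along these lines in Kotlin; the resetBList() name assumes greenDAO generated the usual reset method for the getBList() relation:
// After inserting or deleting B rows, clear A's cached relation list
// so the next call re-queries the database instead of the cache.
a.resetBList()
val freshBs = a.bList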
I'm using greenDAO and I am writing a core function that helps you update some values of an entity and, if the entity is not in the database, also insert it. The problem is that I always get the cached copy of the entity. I know that greenDAO manages a simple cache, and I would like the ability to bypass it. Does anyone know how I can query straight from the database?
This doesn't work
.Dao().queryBuilder().where(comDao.Properties.Id.eq(id)).build().listLazyUncached();
greenDAO indeed has an inner caching mechanism in its daoCore.jar sources.
You can disable the caching easily by searching for the code that put()s and get()s entities from the cache, which is a HashMap<? extends AbstractDao>.
Then generate MyDaoCore.jar and add it to your project.
Secondly, in order to update or insert an entity (without replacing it entirely) you need to implement the following pseudo code. I'm sorry that I'm not adding the actual code; I solved this a long time ago.
// Entity / EntityDao stand for your generated entity and DAO classes.
public void insertOrUpdate(EntityDao dao, List<Entity> entities) {
    List<Entity> toInsert = new ArrayList<>();
    List<Entity> toUpdate = new ArrayList<>();
    for (Entity e : entities) {
        // Existence check; e.g. dao.load(e.getId()) != null,
        // or one IN query for all ids at once (see Edit 1 below).
        if (dao.load(e.getId()) != null) {
            toUpdate.add(e);
        } else {
            toInsert.add(e);
        }
    }
    dao.updateInTx(toUpdate);  // update existing rows in one transaction
    dao.insertInTx(toInsert);  // insert new rows in one transaction
}
Edit 1:
You can use an IN statement to get all the ids of an entity in only one query, like this:
.where(Dao.Properties.Id.in(ids)).build().list();
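Combined with the insertOrUpdate sketch above, this lets you replace the per-entity existence check with a single query that returns the ids already present. A rough Kotlin sketch, with Entity / EntityDao again as placeholder names:
// Which of these ids are already stored? One query instead of N lookups.
fun findExistingIds(dao: EntityDao, ids: List<Long>): Set<Long> =
    dao.queryBuilder()
        .where(EntityDao.Properties.Id.`in`(ids))
        .build()
        .list()
        .map { it.id }
        .toSet()
Entities whose id is in that set go to the update list, the rest to the insert list.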