ORMLite OpenHelper DAO caching in DaoManager? - android

So I have a custom subclass of OrmLiteSqliteOpenHelper. I want to use the ObjectCache interface to make sure I have identity-mapping from DB rows to in-memory objects, so I override getDao(...) as:
@Override
public <D extends Dao<T, ?>, T> D getDao(Class<T> arg0) throws SQLException {
    D dao = super.getDao(arg0);
    if (dao.getObjectCache() == null && !UNCACHED_CLASSES.contains(arg0)) {
        dao.setObjectCache(InsightOpenHelperManager.sharedCache());
    }
    return dao;
}
My understanding is that super.getDao(Class<T> clazz) is basically doing a call to DaoManager.createDao(this.getConnectionSource(),clazz) behind the scenes, which should find a cached DAO if one exists. However...
final DatabaseHelper helpy = CustomOpenHelperManager.getHelper(StoreDatabaseHelper.class);
final CoreDao<Store, Integer> storeDao = helpy.getDao(Store.class);
DaoManager.registerDao(helpy.getConnectionSource(), storeDao);
final Dao<Store,Integer> testDao = DaoManager.createDao(helpy.getConnectionSource(), Store.class);
I would expect that (even without the registerDao(...) call) storeDao and testDao should be references to the same object. In the Eclipse debugger, however, I can see that they are two different DAO instances. Also, testDao's object cache is null.
Am I doing something wrong here? Is this a bug?
I do have a custom helper manager, but only because I needed to manage several databases. It's just a hashmap of Class<? extends DatabaseHelper> keys to instances.
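Roughly like this (a simplified sketch; the real class also handles releasing helpers, and the names are mine):
import java.util.HashMap;
import java.util.Map;
import android.content.Context;

public final class CustomOpenHelperManager {

    private static Context appContext; // set once from the Application class
    private static final Map<Class<? extends DatabaseHelper>, DatabaseHelper> HELPERS =
            new HashMap<Class<? extends DatabaseHelper>, DatabaseHelper>();

    public static void init(Context context) {
        appContext = context.getApplicationContext();
    }

    // One helper instance per database, keyed by its helper class.
    public static synchronized <H extends DatabaseHelper> H getHelper(Class<H> helperClass) {
        DatabaseHelper helper = HELPERS.get(helperClass);
        if (helper == null) {
            try {
                helper = helperClass.getConstructor(Context.class).newInstance(appContext);
            } catch (Exception e) {
                throw new RuntimeException(e);
            }
            HELPERS.put(helperClass, helper);
        }
        return helperClass.cast(helper);
    }
}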
The reason I need my DAO cached is that I have several foreign collections that are eager and are being loaded by internally-generated DAOs that are not using my global cache and thus are being re-created independently for each collection.
As I was writing this up, I thought I could just have my overridden helpy.getDao(...) call through to DaoManager.createDao(...), but that results in the same thing: I still get a different DAO on the second call to createDao(...). This seems to me to be totally against the docs for DaoManager.
At first, I thought registerDao(...) might be the culprit:
public static synchronized void registerDao(ConnectionSource connectionSource, Dao<?, ?> dao) {
    if (connectionSource == null) {
        throw new IllegalArgumentException("connectionSource argument cannot be null");
    }
    if (dao instanceof BaseDaoImpl) {
        DatabaseTableConfig<?> tableConfig = ((BaseDaoImpl<?, ?>) dao).getTableConfig();
        if (tableConfig != null) {
            tableMap.put(new TableConfigConnectionSource(connectionSource, tableConfig), dao);
            return;
        }
    }
    classMap.put(new ClassConnectionSource(connectionSource, dao.getDataClass()), dao);
}
That return on line 230 of the source for DaoManager prevents the classMap from being updated (since I'm using the pregenerated config files?). When my code hits the second create call, it looks at the classMap first, and somehow (against my better understanding) finds a different copy of the DAO living there. Which is super weird, because stepping through the first create, I watched the classMap be initialized.
But where would a second DAO possibly come from?
Looking forward to Gray's insight! :-)

As @Ben mentioned, there is some internal DAO creation which is screwing things up, but I think he may have uncovered a bug.
Under Android, ORMLite tries to use some magic reflection to build the DAOs, given the horrible reflection performance under all but the most recent Android OS versions. Whenever the user asks for the DAO for class Store (for example), the magic reflection fu is creating one DAO but internally it is using another one. I've created the following bug:
https://sourceforge.net/tracker/?func=detail&aid=3487674&group_id=297653&atid=1255989
I changed the way the DAOs get created to do a better job of using the reflection output. The changes were pushed out in version 4.34, which revamps (and simplifies) the internal DAO creation and caching. It should fix the issue.
http://ormlite.com/releases/

Just kidding. Looks like what may be happening is that my Store object DAO initialization is creating DAOs for its foreign fields (the ones I set to foreignAutoRefresh) and then recursively creating another DAO for itself (since the DAO creation that started all this has not completed, and thus has yet to be registered with the DaoManager).
Looks like this has to do with the recursion noted in BaseDaoImpl.initialize().
I'm getting Inception flashbacks just looking at this.
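To make the shape of the cycle concrete (Department and its fields are made up for illustration), the entities look something like this: building the Store DAO has to build a Department DAO for the eager collection, and the foreignAutoRefresh field on Department needs a Store DAO before the original Store DAO has been registered:
import com.j256.ormlite.dao.ForeignCollection;
import com.j256.ormlite.field.DatabaseField;
import com.j256.ormlite.field.ForeignCollectionField;
import com.j256.ormlite.table.DatabaseTable;

@DatabaseTable(tableName = "store")
class Store {
    @DatabaseField(generatedId = true)
    int id;

    // Eager collection: creating the Store DAO also creates a Department DAO internally.
    @ForeignCollectionField(eager = true)
    ForeignCollection<Department> departments;
}

@DatabaseTable(tableName = "department")
class Department {
    @DatabaseField(generatedId = true)
    int id;

    // foreignAutoRefresh back to Store: the Department DAO in turn needs a Store DAO,
    // but the first Store DAO is not yet registered with DaoManager, so a second one is built.
    @DatabaseField(foreign = true, foreignAutoRefresh = true)
    Store store;
}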

Related

Android - Couchbase lite - DAO - MyClass extends Document

I'm working on an Android application using Couchbase lite.
Should I have my classes extend com.couchbase.lite.Document?
Pros: the DAO is integrated into the class.
Cons: every object is tied to a document, so if we want a new object we must create a new document in Couchbase? Anything else?
For example:
public class UserProfile extends Document {
    public UserProfile(Database database, String documentId);
    public Map<String, Object> getProperties();
    public boolean isModified();
    public boolean update() throws CouchbaseLiteException {
        if (isModified()) {
            super.putProperties(getProperties());
            return true;
        } else {
            return false;
        }
    }
}
I would not recommend extending Document. Instead, either just use Maps, or use something like the Jackson JSON library to create POJOs. I usually create a simple helper class to wrap the database operations (including replication, if you're using that).
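As a rough sketch of what I mean (the class names are made up, and this assumes the Couchbase Lite 1.x Android API with Jackson on the classpath):
import java.util.Map;
import com.couchbase.lite.CouchbaseLiteException;
import com.couchbase.lite.Database;
import com.couchbase.lite.Document;
import com.fasterxml.jackson.databind.ObjectMapper;

// Plain POJO instead of a Document subclass.
class UserProfile {
    public String name;
    public String email;
}

// Thin helper that owns the Database and maps documents to/from POJOs.
class ProfileStore {
    private final Database database;
    private final ObjectMapper mapper = new ObjectMapper();

    ProfileStore(Database database) {
        this.database = database;
    }

    UserProfile load(String docId) {
        Document doc = database.getDocument(docId);
        Map<String, Object> props = doc.getProperties();
        return props == null ? null : mapper.convertValue(props, UserProfile.class);
    }

    void save(String docId, UserProfile profile) throws CouchbaseLiteException {
        Document doc = database.getDocument(docId);
        @SuppressWarnings("unchecked")
        Map<String, Object> props = mapper.convertValue(profile, Map.class);
        // Carry the current revision forward so the write is not rejected as a conflict.
        if (doc.getCurrentRevisionId() != null) {
            props.put("_rev", doc.getCurrentRevisionId());
        }
        doc.putProperties(props);
    }
}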
Off the top of my head, I wouldn't do it because subclassing doesn't fit well with some of the ways you retrieve documents, documents are somewhat heavy-weight objects, and the preferred way to update takes into account the possibility of conflicts, which would be much more difficult. (See this blog post for a discussion of that last point.)
I've never tried to work around these issues in a subclassing approach, but it seems pretty certain to be more pain than it's worth.

How to provide an object asynchronously with Dagger?

I'm using Dagger 1 and I have a list of Jokes. In my AwesomeJokeModule I provide a List of jokes, which is supplied by JokeDataLayer.getJokeCache(). The thing is, if the cache isn't built up yet, the getJokeCache() method hits the DB and loads a huge list of jokes. This can take a while, and since the jokes are a member of my Activity, injecting them can slow the Activity down. What's the best way to inject a member into something like an Activity asynchronously?
One thing I've thought of is to return an empty list right away if the cache isn't built yet, and then somehow communicate later that the cache has been updated, but that feels like I'm circumventing Dagger/DI. Any advice or ways to do this?
This is what Lazy<T> is for. Consider using lazy injection:
class GridingCoffeeMaker {
    @Inject Lazy<Grinder> lazyGrinder;

    public void brew() {
        while (needsGrinding()) {
            // Grinder created once on first call to .get() and cached.
            lazyGrinder.get().grind();
        }
    }
}
The underlying Grinder is only created the first time you call lazyGrinder.get(); after that, Lazy returns the cached instance.
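Applied to the question (the Joke type, JokesActivity, and showJokes() are placeholders, and this assumes the activity is already injected through its ObjectGraph), you could inject Lazy<List<Joke>> and call get() off the main thread so the expensive cache build never blocks injection:
import java.util.List;
import javax.inject.Inject;
import android.app.Activity;
import dagger.Lazy;

public class JokesActivity extends Activity {

    // Injection is now cheap; the joke list is not built until get() is called.
    @Inject Lazy<List<Joke>> lazyJokes;

    private void loadJokes() {
        new Thread(new Runnable() {
            @Override
            public void run() {
                // The first call may hit the DB and build the cache; Lazy then caches the result.
                final List<Joke> jokes = lazyJokes.get();
                runOnUiThread(new Runnable() {
                    @Override
                    public void run() {
                        showJokes(jokes); // placeholder UI update
                    }
                });
            }
        }).start();
    }

    private void showJokes(List<Joke> jokes) {
        // update the adapter, etc.
    }
}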

What is the correct way to initialize data in a lookup table using DBFlow?

I am trying to implement DBFlow for the first time and I think I might just not get it. I am not an advanced Android developer, but I have created a few apps. In the past, I would just create a "database" object that extends SQLiteOpenHelper, then override the callback methods.
In onCreate, once all of the tables have been created, I would populate any lookup data with a hard-coded SQL string: db.execSQL(Interface.INSERT_SQL_STRING);. Because I'm lazy, in onUpgrade() and onDowngrade(), I would just DROP the tables and call onCreate(db);.
I have read through the migrations documentation, which not only seems to be outdated syntactically because "database =" has been changed to "databaseName =" in the annotation, but also makes no mention of migrating from no database to version "initial". I found an issue that claims that migration 0 can be used for this purpose, but I cannot get any migrations to work at this point.
Any help would be greatly appreciated. The project is on GitHub.
The answer below is correct, but I believe that this answer and question will soon be "deprecated" along with most third-party ORMs. Google's new Room Persistence Library (Yigit's Talk) will be preferred in most cases. Although DBFlow will certainly carry on (thank you, Andrew) in many projects, this is a good place to redirect people to the newest "best practice", because this particular question was/is geared toward those new to DBFlow.
The correct way to initialize the database (akin to SQLiteOpenHelper's onCreate(db) callback) is to create a Migration class that extends BaseMigration with version = 0, then add the following to onCreate() in your Application class (or wherever you do the DBFlow initialization):
FlowManager.init(new FlowConfig.Builder(this).build());
FlowManager.getDatabase(BracketsDatabase.NAME).getWritableDatabase();
In the Migration class, you override migrate() and can then use the transaction manager to initialize lookup data or other initial database content.
Migration Class:
@Migration(version = 0, database = BracketsDatabase.class)
public class DatabaseInit extends BaseMigration {

    private static final String TAG = "classTag";

    @Override
    public void migrate(DatabaseWrapper database) {
        Log.d(TAG, "Init Data...");
        populateMethodOne();
        populateMethodTwo();
        populateMethodThree();
        Log.d(TAG, "Data Initialized");
    }
}
To populate the data, use your models to create the records and the transaction manager to save them:
FlowManager.getDatabase(AppDatabase.class).getTransactionManager().getSaveQueue().addAll(models);
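For example, one of those populate methods (inside the DatabaseInit class above) might look like this; the Category model and its values are made up for illustration:
private void populateMethodOne() {
    // Build the lookup rows using ordinary model objects (Category is illustrative).
    List<Category> categories = new ArrayList<Category>();
    categories.add(new Category("Single Elimination"));
    categories.add(new Category("Double Elimination"));
    // Queue them on DBFlow's save queue, as described above.
    FlowManager.getDatabase(BracketsDatabase.class).getTransactionManager()
            .getSaveQueue().addAll(categories);
}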
To initialize data in DBFlow all you have to do is create a class for your object models that extends BaseModel and use the @Table annotation for the class.
Then create some objects of that class and call .save() on them.
You can check the examples in the library's documentation.
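A minimal sketch of such a model and its use (the table and field names are made up):
@Table(database = BracketsDatabase.class)
public class Team extends BaseModel {

    @PrimaryKey(autoincrement = true)
    long id;

    @Column
    String name;
}

// Then, wherever you initialize the data:
Team team = new Team();
team.name = "Default Team";
team.save(); // persists the row through DBFlow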

OrmLite inside an Android Module

I'm trying to put all the database requests inside a module in Android, to centralize all database access in one place.
I'm wondering if I'm making any mistake by doing that. The app works correctly, but I'm concerned about whether this follows best practices.
I have a static class called DatabaseRequest that holds all the requests, for instance:
public static void insertUser(Context context, User user) {
    DataBaseHelper mDataBaseHelper = OpenHelperManager.getHelper(context, DataBaseHelper.class);
    try {
        Dao<User, Integer> dao = mDataBaseHelper.getUserDao();
        dao.createOrUpdate(user);
    } catch (SQLException e) {
        e.printStackTrace();
    } finally {
        if (mDataBaseHelper != null) {
            OpenHelperManager.releaseHelper();
        }
    }
}
The context param is the context of the activity that's making the request.
Is there any performance issue related with this code?
Thanks in advance ;)
No, as Gray (ORMlite creator) said in this post:
is it ok to create ORMLite database helper in Application class?
What is most important with your code is that it guarantees a single databaseHelper instance. Each instance has its own connection to the database, and problems happen when there is more than one connection opened to the database in a program. Sqlite handles multiple threads using the same connection at the same time, but it doesn't handle multiple connections well, and data inconsistencies may occur.
And in your case you may have multiple connections at one time.
I can present my approach to using ORMLite: I have one singleton class, public class DbHelper extends OrmLiteSqliteOpenHelper, which takes care of creating the database connection and holds all the Dao fields. You will also have database upgrade code and other things in there, so consider making facade classes. In my case each facade holds one Dao object for one model class, and that is where I keep the logic for complex item retrieval (for simple cases I just delegate to the Dao object).
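A stripped-down sketch of such a helper (the entity and field names here are illustrative, not from the question):
import java.sql.SQLException;
import android.content.Context;
import android.database.sqlite.SQLiteDatabase;
import com.j256.ormlite.android.apptools.OrmLiteSqliteOpenHelper;
import com.j256.ormlite.dao.Dao;
import com.j256.ormlite.support.ConnectionSource;
import com.j256.ormlite.table.TableUtils;

public class DbHelper extends OrmLiteSqliteOpenHelper {

    private static DbHelper instance;

    private Dao<User, Integer> userDao;

    private DbHelper(Context context) {
        super(context, "app.db", null, 1);
    }

    // Single instance, single connection to the database.
    public static synchronized DbHelper getInstance(Context context) {
        if (instance == null) {
            instance = new DbHelper(context.getApplicationContext());
        }
        return instance;
    }

    @Override
    public void onCreate(SQLiteDatabase db, ConnectionSource connectionSource) {
        try {
            TableUtils.createTable(connectionSource, User.class);
        } catch (SQLException e) {
            throw new RuntimeException(e);
        }
    }

    @Override
    public void onUpgrade(SQLiteDatabase db, ConnectionSource connectionSource, int oldVersion, int newVersion) {
        // database upgrade code goes here
    }

    public Dao<User, Integer> getUserDao() throws SQLException {
        if (userDao == null) {
            userDao = getDao(User.class);
        }
        return userDao;
    }
}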

Why cache Dao inside DatabaseHelper, if it's cached in DaoManager?

According to the ORMLite documentation, all created Dao objects are cached inside DaoManager. But in the ORMLite examples, I've seen Dao objects cached again inside the DatabaseHelper class. Do we really need that? For example:
public Dao<SimpleData, Integer> getDao() throws SQLException {
    if (simpleDao == null) {
        simpleDao = getDao(SimpleData.class);
    }
    return simpleDao;
}
My plan is to obtain the Dao object whenever I need it and not cache it inside my code base (in the DatabaseHelper class); I just want to let DaoManager cache the Dao.
This is what I'm planning to use:
DatabaseHelper databaseHelper = OpenHelperManager.getHelper(this, DatabaseHelper.class);
Dao<SimpleData, Integer> myDao = databaseHelper.getDao(SimpleData.class);
Is there any performance issue if I obtain the dao like this, instead of caching it inside the DatabaseHelper?
No, this is certainly fine. You are doing a HashMap.get(..) call each time, but that is a very small hit, especially when compared to any DAO operations or IO.
I would recommend not doing one of these for every call to the DAO:
databaseHelper.getDao(SimpleData.class).create(...);
databaseHelper.getDao(SimpleData.class).update(...);
But if you want to just get it at the start of the method and then perform a couple of operations then this should perform fine.
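In other words, fetching the DAO once at the top of a method and then reusing it performs fine (SimpleData and the variables are just the example class and objects from the question):
// One HashMap lookup, then reuse the DAO for the rest of the method.
Dao<SimpleData, Integer> dao = databaseHelper.getDao(SimpleData.class);
dao.create(newData);
dao.update(existingData);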

Categories

Resources