If you want to pre-populate a database (SQLite) in Android, it is not as easy as one might think.
So I found this tutorial which is often referenced here on Stack Overflow as well.
But I don't really like that way of pre-populating the database, since you take control away from the database handler and create the files yourself. I would prefer not to touch the file system and let the database handler do everything on its own.
So what I thought one could do is create the database in the database handler's onCreate() as usual but then load a file (.sql) from /assets which contains the statements to fill in the values:
INSERT INTO testTable (name, pet) VALUES ('Mike', 'Tiger');
INSERT INTO testTable (name, pet) VALUES ('Tom', 'Cat');
...
But calling execSQL() in the handler's onCreate() doesn't really work. It seems that a file in /assets must not be larger than about 1 MB, and execSQL() only executes the first statement (Mike/Tiger).
What would you do to pre-populate the database?
I suggest the following:
Wrap all of your INSERT logic into a transaction (BEGIN... COMMIT, or via the beginTransaction()... endTransaction() APIs)
As already suggested, utilize the bind APIs and recycle objects.
Don't create any indexes until after this bulk insert is complete.
Additionally, take a look at Faster bulk inserts in sqlite3? A minimal sketch of the points above follows.
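Something like this (a rough sketch only; it reuses the testTable from the question, and the index name, db and rows variables are placeholders to adapt to your schema):
db.beginTransaction();
try {
    // reuse one compiled statement instead of building SQL strings per row
    SQLiteStatement stmt =
            db.compileStatement("INSERT INTO testTable (name, pet) VALUES (?, ?)");
    for (String[] row : rows) {
        stmt.clearBindings();
        stmt.bindString(1, row[0]);
        stmt.bindString(2, row[1]);
        stmt.executeInsert();
    }
    db.setTransactionSuccessful();
} finally {
    db.endTransaction();
}
// the index is created once, after the bulk insert is done
db.execSQL("CREATE INDEX idx_testTable_name ON testTable (name)");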
Your question states that you want the fastest way, but you don't like the way it's done in the article: you don't want to manually replace the DB file (even though it may actually be faster than filling an empty DB with queries).
I had exactly the same thoughts, and I figured out that populating via SQL statements and shipping a pre-populated file can each be the best solution; it depends on the way you will use the DB.
In my application I need to have about 2600 rows (with 4 columns) in the DB at the very first run; it's the data for autocompletion and a few other things. It will be modified quite rarely (users can add custom records, but most of the time they don't need to) and it is quite big. Populating it from SQL statements takes not only significantly more time, but also more space in the APK (assuming I store the data inside it; alternatively I could download it from the internet).
This is a very simple case (the "big" insert can take place only once, and only at first startup), so I decided to go with copying a pre-populated DB file. Sure, it may not be the nicest way, but it's faster. I want my users to be able to use the app as quickly as possible and I treat speed as a priority, and they really like it. On the contrary, I doubt they would be glad if the app slowed down because I thought a slower but nicer solution was actually better.
If my table initially had ~50 rows instead of 2600, I would go with SQL statements, since the speed and size difference wouldn't be so big.
You have to decide which solution fits your case better. If you foresee any problems that may arise from using "prepopulated db" option - don't use it. If you are not sure about these problems - ask, providing more details on how you will use (and eventually, upgrade) contents of the DB. If you aren't quite sure which solution will be faster - benchmark it. And don't be afraid of that copying file method - it can work really well, if used wisely.
You can have your cake and eat it too. Here is a solution that can both respect the use of your db adapter and also use a simple (and much faster) copy process for a pre-populated database.
I'm using a db adapter based on one of Google's examples. It includes an internal class DbHelper() that extends Android's SQLiteOpenHelper() class. The trick is to override its onCreate() method. This method is only called when the helper can't find the DB you are referencing and has to create the DB for you. This should only happen the first time it is called on any given device installation, which is the only time you want to copy the DB. So override it like this -
@Override
public void onCreate(SQLiteDatabase db) {
    mNeedToCopyDb = true;
}
Of course make sure you have first declared and initialized this flag in the DbHelper -
private Boolean mNeedToCopyDb = false;
Now, in your dbAdapter's open() method you can test to see if you need to copy the DB. If you do, then close the helper, copy the DB, and finally open a new helper (see the code below). All future attempts to open the db using the db adapter will find your (copied) DB, and therefore the onCreate() method of the internal DbHelper class will not be called and the flag mNeedToCopyDb will remain false.
/**
 * Open the database using the adapter. If it cannot be opened, try to
 * create a new instance of the database. If it cannot be created,
 * throw an exception to signal the failure.
 *
 * @return this (self reference, allowing this to be chained in an
 *         initialization call)
 * @throws SQLException if the database could neither be opened nor created
 */
public MyDbAdapter open() throws SQLException {
    mDbHelper = new DatabaseHelper(mCtx);
    mDb = mDbHelper.getReadableDatabase();
    if (mDbHelper.mNeedToCopyDb) {
        mDbHelper.close();
        try {
            copyDatabase();
        } catch (IOException e) {
            e.printStackTrace();
        } finally {
            mDbHelper = new DatabaseHelper(mCtx);
            mDb = mDbHelper.getReadableDatabase();
        }
    }
    return this;
}
Just place some code to do your database copy inside of your db adapter in a method named copyDatabase() as used above. You can use the value of mDb that was updated by the first instance of DbHelper (when it created the stub DB) to get the path to use for your output stream when you do the copy.
Construct your input stream like this
dbInputStream = mCtx.getResources().openRawResource(R.raw.mydatabase);
[note: If your DB file is too large to copy in one gulp then just break it up into a few pieces.]
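For completeness, a copyDatabase() along these lines might look roughly like this (a sketch only; it assumes the mCtx, mDb and R.raw.mydatabase names used in this answer and needs the java.io imports):
private void copyDatabase() throws IOException {
    // mDb still knows the path of the stub DB the first helper instance created
    String dbPath = mDb.getPath();
    InputStream in = mCtx.getResources().openRawResource(R.raw.mydatabase);
    OutputStream out = new FileOutputStream(dbPath);
    try {
        byte[] buffer = new byte[4096];
        int length;
        while ((length = in.read(buffer)) > 0) {
            out.write(buffer, 0, length);
        }
        out.flush();
    } finally {
        out.close();
        in.close();
    }
}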
This works very fast and puts all of the db access code (including the copying of the DB if needed) into your db adapter.
I wrote a DbUtils class similar to the previous answer. It is part of the ORM tool greenDAO and is available on github. The difference is that it will try to find statement boundaries using a simple regular expression, not just line endings. If you have to rely on a SQL file, I doubt that there's a faster way.
But, if you can supply the data in another format, it should be significantly faster than using a SQL script. The trick is to use a compiled statement. For each data row, you bind the parsed values to the statement and execute the statement. And, of course, you need to do this inside a transaction. I would recommend a simple delimiter separated file format (for example CSV) because it can be parsed faster than XML or JSON.
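For example, a sketch of that approach could look like this (not greenDAO's actual code; the table, asset name and delimiter are made up, and the surrounding method is assumed to declare throws IOException):
SQLiteDatabase db = helper.getWritableDatabase();
SQLiteStatement stmt = db.compileStatement(
        "INSERT INTO testTable (name, pet) VALUES (?, ?)");
BufferedReader reader = new BufferedReader(
        new InputStreamReader(context.getAssets().open("data.csv")));
db.beginTransaction();
try {
    String line;
    while ((line = reader.readLine()) != null) {
        String[] cols = line.split(";");   // one data row per line
        stmt.clearBindings();
        stmt.bindString(1, cols[0]);
        stmt.bindString(2, cols[1]);
        stmt.executeInsert();
    }
    db.setTransactionSuccessful();
} finally {
    db.endTransaction();
    reader.close();
}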
We did some performance tests for greenDAO. For our test data, we had insert rates of about 5000 rows per second. And for some reason, the rate dropped to half with Android 4.0.
Yes, assets may have a size limit, so if your file is bigger than the limit you can split it into several files.
And execSQL() can run many SQL statements if you execute them one at a time; here is an example:
BufferedReader br = null;
try {
    br = new BufferedReader(new InputStreamReader(asManager.open(INIT_FILE)), 1024 * 4);
    db.beginTransaction();
    String line = null;
    while ((line = br.readLine()) != null) {
        db.execSQL(line);
    }
    db.setTransactionSuccessful();
} catch (IOException e) {
    FLog.e(LOG_TAG, "read database init file error");
} finally {
    // only end the transaction if it was actually started
    if (db.inTransaction()) {
        db.endTransaction();
    }
    if (br != null) {
        try {
            br.close();
        } catch (IOException e) {
            FLog.e(LOG_TAG, "buffer reader close error");
        }
    }
}
The example above requires that every line of INIT_FILE is a single SQL statement.
Also, if your SQL statements file is big, you can create the database outside of Android (SQLite is available for Windows and Linux, so you can create the database on your OS) and copy the database file to your assets folder; if it is big, you can zip it.
When your application runs, you can get the database file from assets and save it directly to your application's database folder (if you zipped it, unzip it into the application's database folder).
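A rough sketch of that unzip step (the file names here are made up, and this should run inside code that handles IOException):
File dbFile = context.getDatabasePath("mydatabase.db");
dbFile.getParentFile().mkdirs();
ZipInputStream zis = new ZipInputStream(context.getAssets().open("mydatabase.zip"));
try {
    if (zis.getNextEntry() != null) {   // the zip is assumed to hold a single .db entry
        OutputStream out = new FileOutputStream(dbFile);
        byte[] buffer = new byte[4096];
        int len;
        while ((len = zis.read(buffer)) > 0) {
            out.write(buffer, 0, len);
        }
        out.close();
    }
} finally {
    zis.close();
}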
Hope this helps :-)
I used this method. First create your SQLite database; there are a few programs you can use for this, I like SQLiteBrowser. Then copy your database file into your assets folder. Then you can use this code in the constructor of SQLiteOpenHelper.
final String outFileName = DB_PATH + NAME;
if (!new File(outFileName).exists()) {
    try {
        // creates an empty database (and the databases directory) if needed
        this.getWritableDatabase().close();
        // Open your local db from assets as the input stream
        final InputStream myInput = ctx.getAssets().open(NAME);
        // Open the empty db as the output stream
        final OutputStream myOutput = new FileOutputStream(outFileName);
        //final FileOutputStream myOutput = context.openFileOutput(outFileName, Context.MODE_PRIVATE);
        // Transfer bytes from the input file to the output file
        final byte[] buffer = new byte[1024];
        int length;
        while ((length = myInput.read(buffer)) > 0) {
            myOutput.write(buffer, 0, length);
        }
        // Close the streams
        myOutput.flush();
        ((FileOutputStream) myOutput).getFD().sync();
        myOutput.close();
        myInput.close();
    } catch (final Exception e) {
        // TODO: handle exception
    }
}
DB_PATH is something like /data/data/com.mypackage.myapp/databases/
NAME is whatever database name you choose, e.g. "mydatabase.db"
I know there are many possible improvements to this code, but it worked so well and is VERY FAST, so I left it alone. It might be even better in the onCreate() method. Also, checking if the file exists every time is probably not the best. Anyway, like I said, it works, it's fast and reliable.
If the data is not private then simply host it on your website then download it on first run. That way you can keep it up to date. So long as you remember to take app version into account when you upload it to your webserver.
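A very rough sketch of that first-run download (the URL and file name are placeholders; run this off the main thread, inside code that handles IOException, and add proper retries and version checks):
File dbFile = context.getDatabasePath("mydatabase.db");
if (!dbFile.exists()) {
    dbFile.getParentFile().mkdirs();
    HttpURLConnection conn = (HttpURLConnection)
            new URL("https://example.com/mydatabase.db").openConnection();
    InputStream in = conn.getInputStream();
    OutputStream out = new FileOutputStream(dbFile);
    byte[] buffer = new byte[4096];
    int len;
    while ((len = in.read(buffer)) > 0) {
        out.write(buffer, 0, len);
    }
    out.close();
    in.close();
    conn.disconnect();
}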
Related
I'm having trouble with a pre-populated database in Android. Not the usual problems, though. I've got the database working just fine.
My problem comes with adding new data after the app has been published.
I spent a lot of time with the onUpgrade() method, but then it dawned on me that my problem is elsewhere. Once I've added new rows to my database in my assets folder, how do I get these added to the database that was copied to my data/data folder?
My database is where I store my level information for a game, the last column in the table is a flag to mark the level completed so I can't lose this information.
You could add some SQL patch files and then read them to upgrade your database.
I used it simply with the FileHelper static class I copied from Danny Remington's post on SO, and then do:
try {
    InputStream in = mgr.open(assetFile);
    String[] statements = FileHelper.parseSqlFile(in);
    dao.mDb.execSQL("BEGIN TRANSACTION;");
    /* dao.mDb is a reference to the SQLiteDatabase from my dao */
    for (String statement : statements) {
        dao.mDb.execSQL(statement);
    }
    dao.mDb.execSQL("COMMIT;");
    in.close();
} catch (IOException e) {
    //
}
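To apply such patches on upgrade, the onUpgrade() override might look roughly like this (a sketch only; the patch_N.sql naming scheme and the context field are assumptions, and FileHelper is the class mentioned above):
@Override
public void onUpgrade(SQLiteDatabase db, int oldVersion, int newVersion) {
    AssetManager mgr = context.getAssets();
    // apply every patch file between the installed version and the new one
    for (int v = oldVersion + 1; v <= newVersion; v++) {
        try {
            InputStream in = mgr.open("patch_" + v + ".sql");
            String[] statements = FileHelper.parseSqlFile(in);
            db.beginTransaction();
            try {
                for (String statement : statements) {
                    db.execSQL(statement);
                }
                db.setTransactionSuccessful();
            } finally {
                db.endTransaction();
            }
            in.close();
        } catch (IOException e) {
            // no patch shipped for this version, or reading it failed
        }
    }
}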
I have a pre-populated DB that I've created offline and put in the assets/ directory. I'm trying to copy it when SQLiteOpenHelper.onCreate() is called, but I'm running into serious trouble.
public void onCreate(SQLiteDatabase db) {
    importAssets();
}
When I write my code like this, I consistently get a no such table error when subsequently trying to get data out of the database. I did some mucking around and started inserting SQL statements into onCreate() as well as the call to importAssets(), for example:
public void onCreate(SQLiteDatabase db) {
    importAssets();
    db.execSQL("CREATE TABLE main_table (some_col);");
}
Now when I run the app, I get a column not found error (the above schema doesn't match what the DB access code expects; it's missing a number of columns). On the second run of the app, android_metadata gets corrupted and then it all goes to hell. I've also tried putting the SQL code before the importAssets() call, with similarly disastrous results.
So I'm starting to think that the file copy I'm doing in importAssets() is essentially getting truncated by onCreate(). Am I right? Is the only way to create a DB in onCreate() by directly using SQL? Can I not use this function to copy a ready-made DB from assets/?
I have confirmed that importAssets() is running correctly by using Log() calls. Here is the complete function:
private void importAssets() throws IOException
{
    AssetManager am = mCtx.getAssets();
    InputStream in = am.open(DB_NAME);
    FileOutputStream out;
    byte[] buffer = new byte[1024];
    int len;
    File db = new File(DB_DIR + "/" + DB_NAME);

    // copy the DB from assets to the standard DB dir created above
    out = new FileOutputStream(db);
    while ((len = in.read(buffer)) > 0)
    {
        out.write(buffer, 0, len);
    }
    out.flush();
    out.close();
    in.close();

    if (!db.exists())
    {
        Log.e(TAG, DB_DIR + "/" + DB_NAME + " does not exist");
    }
}
I've seen elsewhere the suggestion that I do the file copy inside the SQLiteOpenHelper constructor, but that means I'd have to manually check the presence of the file in importAssets() since the constructor gets called every time I use the DB. Also, onCreate() would be effectively useless for what I want to do, though I don't see any way to avoid that.
I'm now convinced that my suspicion was right. The onCreate() function of SQLiteOpenHelper will create the DB file specified in the name parameter of the call to the super class constructor, and that file is already open by the time onCreate() runs. It then expects you to modify this DB file (add tables, data, alter tables, etc.) from within the onCreate() function using the various DB APIs. Copying a file from somewhere else will just get truncated, since it is overwritten underneath the already-open connection.
I don't know how or why it's doing this, but this makes onCreate() useless for 'creating' a DB by copying it intact from assets/ which is what I need. I've moved importAssets() to the constructor and added the appropriate check to not overwrite it if it's already there. With nothing in onCreate() at all it doesn't seem to disrupt what's going on elsewhere.
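A rough sketch of that constructor-based approach (DB_NAME and importAssets() are from the code above; DB_VERSION, TAG and the class name are placeholders):
public class MyOpenHelper extends SQLiteOpenHelper {
    private final Context mCtx;

    public MyOpenHelper(Context ctx) {
        super(ctx, DB_NAME, null, DB_VERSION);
        mCtx = ctx;
        File db = mCtx.getDatabasePath(DB_NAME);
        if (!db.exists()) {
            db.getParentFile().mkdirs();
            try {
                importAssets();   // plain file copy from assets/, as above
            } catch (IOException e) {
                Log.e(TAG, "could not copy database from assets", e);
            }
        }
    }

    @Override
    public void onCreate(SQLiteDatabase db) {
        // intentionally empty: the database is copied in the constructor
    }

    @Override
    public void onUpgrade(SQLiteDatabase db, int oldVersion, int newVersion) {
        // nothing to do yet
    }
}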
I think that SQLiteOpenHelper really needs the capability to run an SQLite script. This way I don't have to hard-code long chunks of SQL into my Java code, when all I really want to do is get a DB setup.
I have database updates like the two below throughout my code. Most of my updates open the database first, update the record, and then close the database after the record has been updated. I have noticed that omitting the statement mDb = Helper.getWritableDatabase(); before the update and the statement mDb.close(); after it will sometimes, but not always, cause a force close error. What is the proper way: should I use the open and close statements all the time, only when I have to, or always open and then close during each update? Here is the snippet with the open/close statements. Thanks in advance. Is the open statement necessary?
// Open connections to the database
mDb = Helper.getWritableDatabase();
// update 1
String strFilter7 = "_id=" + 7;
ContentValues args7 = new ContentValues();
args7.put(COL_VALUE, newB1ftgvalue);
mDb.update("VarData", args7, strFilter7, null);
// update 2
String strFilter11 = "_id=" + 11;
ContentValues args11 = new ContentValues();
args11.put(COL_VALUE, newB2ftgvalue);
mDb.update("VarData", args11, strFilter11, null);
// closes database
mDb.close();
It is good practice to always call close() after you are done with database updates. If you haven't closed the database you may see errors. Once the database is open, you may do multiple updates; it shouldn't be an issue. One thing to take care of: it is better not to keep the connection open for a long time, for a number of reasons. Here is a good discussion on this topic.
Do not close it, and only have one SQLite helper. It's basically a static open helper. There is no problem with never closing your database. The link below gives a good piece of code that works great. You will not have memory leaks with an open database. You will, however, have problems with open cursors, so make sure to close those.
http://www.touchlab.co/blog/single-sqlite-connection/
A good discussion is here:
What are the best practices for SQLite on Android?
I realized the link I posted has changed since I first viewed it. Change:
instance = new DatabaseHelper(context);
to
instance = new DatabaseHelper(context.getApplicationContext());
and
public class DatabaseHelper extends OrmLiteSqliteOpenHelper
to
public class DatabaseHelper extends SQLiteOpenHelper
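Putting the pieces together, a singleton open helper along the lines of the linked post might look like this (a sketch only; the database name and version are placeholders):
public class DatabaseHelper extends SQLiteOpenHelper {
    private static DatabaseHelper instance;

    public static synchronized DatabaseHelper getInstance(Context context) {
        if (instance == null) {
            // use the application context so no Activity context is leaked
            instance = new DatabaseHelper(context.getApplicationContext());
        }
        return instance;
    }

    private DatabaseHelper(Context context) {
        super(context, "mydatabase.db", null, 1);
    }

    @Override
    public void onCreate(SQLiteDatabase db) {
        // create your tables here
    }

    @Override
    public void onUpgrade(SQLiteDatabase db, int oldVersion, int newVersion) {
        // handle schema upgrades here
    }
}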
You need to open the db as a writable database in order to modify its data, so yes, you have to open it before updating records.
I have an app that uses a database with 3 tables in it. Those 3 tables have data read from and written to them by activities and services.
Having gotten a few "android.database.sqlite.SQLiteException: database is locked" crashes, I went into the database adapter class and wrapped every write, update, or delete function in a synchronized statement, like so:
public int deleteExpiredAlarms() {
    String whereClause = FIELD_EXPIRED + " = 1";
    int val = 0;
    synchronized (dbWriteLock) {
        val = db.delete(ALARM_DATABASE_TABLE, whereClause, null);
    }
    return val;
}
That seemed to make it better. But lately it's gotten bad again as I've added more services that read and write to different tables.
Do I need to synchronize ALL db access statements, including queries?
The exception is occurring on the attempt to open the writable database via the open helper...should I synchronize that act also?
I've heard that I should only be using one db helper so that there won't be issues with multiple threads accessing the db. How do I use only one db helper? Every example I've seen so far has the db helper as an instantiated value inside the db adapter, so wouldn't that be a separate db helper per db adapter instantiated (one in an activity, one in a running service, etc.)?
I've looked at using a content provider instead, as it's been claimed to solve problems like this, but it's really more work than I want to do if I should be able to have direct db access without locking issues. And I do not plan to make this db accessible to other apps.
Thanks for the help.
I want to bulk insert about 700 records into the Android database on my next upgrade. What's the most efficient way to do this? From various posts, I know that if I use Insert statements, I should wrap them in a transaction. There's also a post about using your own database, but I need this data to go into my app's standard Android database. Note that this would only be done once per device.
Some ideas:
Put a bunch of SQL statements in a file, read them in a line at a time, and exec the SQL.
Put the data in a CSV file, or JSON, or YAML, or XML, or whatever. Read a line at a time and do db.insert().
Figure out how to do an import and do a single import of the entire file.
Make a sqlite database containing all the records, copy that onto the Android device, and somehow merge the two databases.
[EDIT] Put all the SQL statements in a single file in res/values as one big string. Then read them a line at a time and exec the SQL.
What's the best way? Are there other ways to load data? Are 3 and 4 even possible?
Normally, each time db.insert() is used, SQLite creates a transaction (and resulting journal file in the filesystem), which slows things down.
If you use db.beginTransaction() and db.endTransaction() SQLite creates only a single journal file on the filesystem and then commits all the inserts at the same time, dramatically speeding things up.
Here is some pseudo code from: Batch insert to SQLite database on Android
try
{
    db.beginTransaction();
    for each record in the list
    {
        do_some_processing();
        if (line represent a valid entry)
        {
            db.insert(SOME_TABLE, null, SOME_VALUE);
        }
        some_other_processing();
    }
    db.setTransactionSuccessful();
}
catch (SQLException e) {}
finally
{
    db.endTransaction();
}
If you wish to abort a transaction due to an unexpected error or something, simply db.endTransaction() without first setting the transaction as successful (db.setTransactionSuccessful()).
Another useful method is to use db.inTransaction() (returns true or false) to determine if you are currently in the middle of a transaction.
Documentation here
I've found that for bulk insertions, the (apparently little-used) DatabaseUtils.InsertHelper class is several times faster than using SQLiteDatabase.insert.
Two other optimizations also helped with my app's performance, though they may not be appropriate in all cases:
Don't bind values that are empty or null.
If you can be certain that it's safe to do it, temporarily turning off the database's internal locking can also help performance.
I have a blog post with more details.
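For reference, InsertHelper usage looks roughly like this (a sketch with made-up table and column names; note that InsertHelper was deprecated in later API levels):
DatabaseUtils.InsertHelper ih = new DatabaseUtils.InsertHelper(db, "testTable");
final int nameCol = ih.getColumnIndex("name");
final int petCol = ih.getColumnIndex("pet");
db.beginTransaction();
try {
    for (String[] row : rows) {      // rows: your already-parsed data
        ih.prepareForInsert();
        ih.bind(nameCol, row[0]);
        ih.bind(petCol, row[1]);
        ih.execute();
    }
    db.setTransactionSuccessful();
} finally {
    db.endTransaction();
    ih.close();
}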
The example below will work nicely:
String sql = "INSERT INTO " + DatabaseHelper.TABLE_PRODUCT_LIST
        + " VALUES (?,?,?,?,?,?,?,?,?);";
SQLiteDatabase db = this.getWritableDatabase();
SQLiteStatement statement = db.compileStatement(sql);
db.beginTransaction();
try {
    for (int idx = 0; idx < Produc_List.size(); idx++) {
        statement.clearBindings();
        statement.bindLong(1, Produc_List.get(idx).getProduct_id());
        statement.bindLong(2, Produc_List.get(idx).getCategory_id());
        statement.bindString(3, Produc_List.get(idx).getName());
        // statement.bindString(4, Produc_List.get(idx).getBrand());
        statement.bindString(5, Produc_List.get(idx).getPrice());
        //statement.bindString(6, Produc_List.get(idx).getDiscPrice());
        statement.bindString(7, Produc_List.get(idx).getImage());
        statement.bindLong(8, Produc_List.get(idx).getLanguage_id());
        statement.bindLong(9, Produc_List.get(idx).getPl_rank());
        statement.execute();
    }
    db.setTransactionSuccessful();
} finally {
    db.endTransaction();
}
Well, my solution for this is kind of weird, but it works fine...
I compile a large amount of data and insert it in one go (bulk insert?).
I use the db.execSQL(Query) command and I build the "Query" with the following statement...
INSERT INTO yourtable SELECT * FROM (
SELECT 'data1','data2'.... UNION
SELECT 'data1','data2'.... UNION
SELECT 'data1','data2'.... UNION
.
.
.
SELECT 'data1','data2'....
)
The only problem is the building of the query which can be kind of messy.
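A rough sketch of building such a query in Java (hypothetical table and row data; values are escaped with DatabaseUtils.sqlEscapeString(), and keep in mind that SQLite caps the number of terms in a compound SELECT, 500 by default, so very large lists need to be chunked):
StringBuilder query = new StringBuilder("INSERT INTO yourtable SELECT * FROM (");
for (int i = 0; i < rows.size(); i++) {
    String[] row = rows.get(i);
    if (i > 0) {
        query.append(" UNION ");
    }
    // sqlEscapeString() quotes the value and escapes embedded single quotes
    query.append("SELECT ")
         .append(DatabaseUtils.sqlEscapeString(row[0]))
         .append(",")
         .append(DatabaseUtils.sqlEscapeString(row[1]));
}
query.append(")");
db.execSQL(query.toString());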
I hope it helps
I don't believe there is any feasible way to accomplish #3 or #4 on your list.
Of the other solutions, you list two that have the data file contain direct SQL, and one that has the data in a non-SQL format.
All three would work just fine, but the latter suggestion of grabbing the data from a formatted file and building the SQL yourself seems the cleanest. If true batch update capability is added at a later date your datafile is still usable, or at least easily processable into a usable form. Also, creation of the datafile is more straightforward and less error prone. Finally, having the "raw" data would allow import into other data-store formats.
In any case, you should (as you mentioned) wrap the groups of inserts into transactions to avoid the per-row transaction journal creation.