Android SQLite database: slow insertion

I need to parse a fairly large XML file (varying between about a hundred kilobytes and several hundred kilobytes), which I'm doing using Xml#parse(String, ContentHandler). I'm currently testing this with a 152KB file.
During parsing, I also insert the data into an SQLite database using calls similar to the following: getWritableDatabase().insert(TABLE_NAME, "_id", values). All of this together takes about 80 seconds for the 152KB test file (which comes down to inserting roughly 200 rows).
When I comment out all insert statements (but leave in everything else, such as creating ContentValues etc.) the same file takes only 23 seconds.
Is it normal for the database operations to have such a big overhead? Can I do anything about that?

You should do batch inserts.
Pseudocode:
db.beginTransaction();
try {
    for (entry : listOfEntries) {
        db.insert(entry);
    }
    db.setTransactionSuccessful();
} finally {
    db.endTransaction();
}
That sped up inserts in my apps dramatically.
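For reference, a minimal self-contained version of that pseudocode might look like the following sketch. Item, items, dbHelper, and the table/column names are placeholders, not from the question:

// Sketch of a transacted batch insert; Item, items, dbHelper, and the
// table/column names are placeholders for your own data.
SQLiteDatabase db = dbHelper.getWritableDatabase();
db.beginTransaction();
try {
    for (Item item : items) {
        ContentValues values = new ContentValues();
        values.put("name", item.getName());
        db.insert("item_table", null, values);
    }
    db.setTransactionSuccessful(); // commit only if every insert succeeded
} finally {
    db.endTransaction(); // rolls back automatically if not marked successful
}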
Update:
@Yuku provided a very interesting blog post: Android using inserthelper for faster insertions into sqlite database

Since the InsertHelper mentioned by Yuku and Brett is deprecated now (API level 17), it seems the right alternative recommended by Google is using SQLiteStatement.
I used the database insert method like this:
database.insert(table, null, values);
After I also experienced some serious performance issues, the following code sped up my 500 inserts from 14.5 sec to only 270 ms. Amazing!
Here is how I used SQLiteStatement:
private void insertTestData() {
    String sql = "insert into producttable (name, description, price, stock_available) values (?, ?, ?, ?);";
    SQLiteDatabase database = dbHandler.getWritableDatabase();
    SQLiteStatement stmt = database.compileStatement(sql);
    database.beginTransaction();
    try {
        for (int i = 0; i < NUMBER_OF_ROWS; i++) {
            //generate some values
            stmt.bindString(1, randomName);
            stmt.bindString(2, randomDescription);
            stmt.bindDouble(3, randomPrice);
            stmt.bindLong(4, randomNumber);
            long entryID = stmt.executeInsert();
            stmt.clearBindings();
        }
        database.setTransactionSuccessful();
    } finally {
        database.endTransaction();
    }
    dbHandler.close();
}

Compiling the SQL insert statement helps speed things up. It can also require more effort to validate everything and prevent possible injection, since that is now all on your shoulders.
Another approach which can also speed things up is the under-documented android.database.DatabaseUtils.InsertHelper class. My understanding is that it actually wraps compiled insert statements. Going from non-compiled transacted inserts to compiled transacted inserts was about a 3x gain in speed (2 ms per insert down to 0.6 ms) for my large (200K+ entries) but simple SQLite inserts.
Sample code:
SQLiteDatabase db = getWritableDatabase();
//use the db you would normally use for db.insert(), and the "table_name"
//is the same one you would use in db.insert()
InsertHelper iHelp = new InsertHelper(db, "table_name");
//get the indices you need to bind data to;
//similar to Cursor.getColumnIndex("col_name")
int first_index = iHelp.getColumnIndex("first");
int last_index = iHelp.getColumnIndex("last");
try
{
    db.beginTransaction();
    for (int i = 0; i < num_things; ++i)
    {
        //need to tell the helper you are inserting (rather than replacing)
        iHelp.prepareForInsert();
        //do the equivalent of ContentValues.put("field", "value") here
        iHelp.bind(first_index, thing_1);
        iHelp.bind(last_index, thing_2);
        //the db.insert() equivalent
        iHelp.execute();
    }
    db.setTransactionSuccessful();
}
finally
{
    db.endTransaction();
}
iHelp.close();
db.close();

If the table has an index on it, consider dropping it prior to inserting the records and then adding it back after you've committed your records.
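A minimal sketch of that idea, assuming a hypothetical index idx_mytable_name on a table mytable (adjust names to your schema):

// Hypothetical index/table names; adjust to your schema.
db.execSQL("DROP INDEX IF EXISTS idx_mytable_name;");
// ... perform the bulk insert inside a transaction here ...
db.execSQL("CREATE INDEX idx_mytable_name ON mytable(name);");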

If using a ContentProvider:
@Override
public int bulkInsert(Uri uri, ContentValues[] bulkinsertvalues) {
    int queryType = sUriMatcher.match(uri);
    int returnValue = 0;
    SQLiteDatabase db = mOpenHelper.getWritableDatabase();
    switch (queryType) {
        case SOME_URI_IM_LOOKING_FOR: //replace this with your real URI
            db.beginTransaction();
            try {
                for (int i = 0; i < bulkinsertvalues.length; i++) {
                    //get an individual entry from the array of ContentValues
                    ContentValues values = bulkinsertvalues[i];
                    //insert this record into the local SQLite database using a
                    //private helper you create, e.g. "insertIndividualRecord"
                    insertIndividualRecord(uri, values);
                    returnValue++;
                }
                db.setTransactionSuccessful();
            } finally {
                db.endTransaction();
            }
            break;
        default:
            throw new IllegalArgumentException("Unknown URI " + uri);
    }
    return returnValue;
}
Then the private function to perform the insert (still inside your content provider):
private Uri insertIndividualRecord(Uri uri, ContentValues values) {
    //see the content provider documentation if this is confusing
    if (sUriMatcher.match(uri) != THE_CONSTANT_IM_LOOKING_FOR) {
        throw new IllegalArgumentException("Unknown URI " + uri);
    }
    //example validation if you have a field called "name" in your database
    if (!values.containsKey(YOUR_CONSTANT_FOR_NAME)) {
        values.put(YOUR_CONSTANT_FOR_NAME, "");
    }
    //add all your other validations here
    //time to insert the record into your local SQLite database
    SQLiteDatabase db = mOpenHelper.getWritableDatabase();
    long rowId = db.insert(YOUR_TABLE_NAME, null, values);
    if (rowId > 0) {
        Uri myUri = ContentUris.withAppendedId(MY_INSERT_URI, rowId);
        getContext().getContentResolver().notifyChange(myUri, null);
        return myUri;
    }
    throw new SQLException("Failed to insert row into " + uri);
}
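On the client side, the whole array can then be handed to the provider in a single resolver call. A minimal sketch, where SOME_CONTENT_URI and buildContentValuesArray() are placeholders:

// Hypothetical URI constant; one resolver call hands all rows to bulkInsert().
ContentValues[] allValues = buildContentValuesArray(); // your own helper
int inserted = getContentResolver().bulkInsert(SOME_CONTENT_URI, allValues);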

Related

Sqlite/android: insert actually replaces rows

In Udacity's Developing Android Apps course, the app being built displays the weather forecast in a ListView after fetching the data from the OpenWeatherMap API. The data is saved into the db via the bulkInsert function shown below. Updates are triggered by the "Refresh" option in the menu.
No code handles the pruning of old data, and yet the number of rows in the db remains constant after each refresh. It turns out that the last 14 old rows vanish after the loop inserts the 14 new rows; the new rows get fresh row ids.
Example:
We start with:
_id=55
_id=56
_id=57
…
_id=721
_id=722
_id=723
_id=724
_id=725
_id=726
Following the bulk insert, the id list becomes:
_id=55
_id=56
_id=57
…
_id=735
_id=736
_id=737
_id=738
_id=739
_id=740
With the rows 713-726 missing/replaced. Code for bulkInsert:
@Override
public int bulkInsert(Uri uri, ContentValues[] values) {
    final SQLiteDatabase db = mOpenHelper.getWritableDatabase();
    final int match = sUriMatcher.match(uri);
    switch (match) {
        case WEATHER:
            db.beginTransaction();
            int returnCount = 0;
            try {
                for (ContentValues value : values) {
                    normalizeDate(value);
                    long _id = db.insert(WeatherContract.WeatherEntry.TABLE_NAME, null, value);
                    if (_id != -1) {
                        returnCount++;
                        Log.i("id:", " " + _id);
                    }
                }
                db.setTransactionSuccessful();
            } finally {
                db.endTransaction();
            }
            getContext().getContentResolver().notifyChange(uri, null);
            return returnCount;
        default:
            return super.bulkInsert(uri, values);
    }
}
insert is the vanilla SQLiteDatabase method, which is not supposed to mess with existing rows. Yet, when the line is commented out, the old rows remain in the db.
Any clue what's going on?

Huge performance difference on SELECT queries using compileStatement() vs query() in SQLite / Android

In short:
Performing 23770 SELECT queries using query() and retrieving the results with a Cursor takes 7 sec. I was able to reduce that to 1 sec by compiling the statement with compileStatement() and calling simpleQueryForString().
Is there a way to get similar performance without using compileStatement(), given that compileStatement() can only retrieve a result when the output is a 1×1 table?
More info:
I have an Android app which uses an SQLite database with a table having the following schema:
CREATE TABLE testtable(
    id number primary key,
    sentence text not null
);
The table is indexed on id.
Part of my app takes an array of ids as input and retrieves the corresponding sentences from the table testtable.
I started with the query() method, which took around 7 sec to retrieve the sentences for an array of 23770 ids (23770 queries in 7 seconds).
While trying to improve performance, I learned that SQLiteStatement compileStatement(String sql) can speed things up by compiling the statement beforehand. Since SQLiteStatement has a String simpleQueryForString() method to retrieve the result when the output is a 1×1 table (which satisfies my use case for now), I used it.
The improvement was massive. It could complete the same 23770 queries in 1 sec.
Although I can use this for now, the query may get more complicated in the future, and the output may contain more rows and columns, which will force me back to the query() method.
So my question is: Is there a way to optimize queries without using compileStatement() and get similar performance?
This is the code I am testing with (The code using compileStatement() is commented):
public class DBMan extends SQLiteAssetHelper {
    SQLiteDatabase db;

    public DBMan(Context context) {
        super(context, "my.db", null, 1);
        db = this.getReadableDatabase();
    }

    public String[] getSentences(Integer[] idList) {
        String[] result = new String[idList.length];
        Cursor cur = null;
        long timeStart = System.nanoTime();
        try {
            db.beginTransaction();
            /* SQLiteStatement selStmt = db.compileStatement("SELECT sentence FROM testtable WHERE id=?"); */
            for (int i = 0; i < idList.length; i++) {
                // Querying using compileStatement() and simpleQueryForString()
                /*
                selStmt.clearBindings();
                selStmt.bindLong(1, idList[i]);
                result[i] = selStmt.simpleQueryForString();
                */
                // Querying using query() and Cursor
                cur = db.query(
                        "testtable",
                        new String[]{"sentence"},
                        "id = ?",
                        new String[]{String.valueOf(idList[i])},
                        null, null, null
                );
                if (cur.moveToFirst()) {
                    result[i] = cur.getString(0);
                }
                cur.close();
            }
            db.setTransactionSuccessful();
        } finally {
            db.endTransaction();
        }
        long totalTime = System.nanoTime() - timeStart;
        Log.i("MYAPP", "DB total query time: " + totalTime / 1000000000.0 + " sec");
        return result;
    }
}
I'm using SQLiteAssetHelper, which is an extension of SQLiteOpenHelper. I'm using it to copy my database file from the assets folder on first run instead of creating it.
I used transactions even though I'm only doing SELECT queries, since this reduces the number of shared locks that are obtained and dropped (see here).
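One approach that keeps the Cursor-based query() API while cutting the per-query overhead is to batch the ids into a single IN (...) query. This is only a sketch, under the assumption that the caller can re-map the fetched rows to the original input order:

// Sketch: one query() call for a whole batch of ids instead of one per id.
// SQLite limits bound variables (999 by default), so chunk larger arrays.
String[] args = new String[idList.length];
StringBuilder placeholders = new StringBuilder();
for (int i = 0; i < idList.length; i++) {
    args[i] = String.valueOf(idList[i]);
    placeholders.append(i == 0 ? "?" : ",?");
}
Cursor c = db.query("testtable", new String[]{"id", "sentence"},
        "id IN (" + placeholders + ")", args, null, null, null);
Map<Integer, String> byId = new HashMap<>();
try {
    while (c.moveToNext()) {
        byId.put(c.getInt(0), c.getString(1)); // re-map to input order afterwards
    }
} finally {
    c.close();
}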

How to bind values to SQLiteStatement for insert query?

Insertion code using SQLiteStatement usually looks like this:
String sql = "INSERT INTO table_name (column_1, column_2, column_3) VALUES (?, ?, ?)";
SQLiteStatement statement = db.compileStatement(sql);
int intValue = 57;
String stringValue1 = "hello";
String stringValue2 = "world";
// corresponding to each question mark in the query
statement.bindLong(1, intValue);
statement.bindString(2, stringValue1);
statement.bindString(3, stringValue2);
long rowId = statement.executeInsert();
Now this works perfectly fine, but the issue is that I have to be very careful about binding the correct data to the corresponding indexes; a simple swap of indexes gives me an error.
Also, say column_2 gets dropped from the table in the future: I would then have to change the index of every column after it, or the statement won't work. This seems trivial with just 3 columns, but imagine a table with 10-12 (or even more) columns where column 2 gets dropped. The whole process is inefficient and error prone.
Is there an elegant way to handle all this?
Edit: Why would I want to use SQLiteStatement? Check this: Improve INSERT-per-second performance of SQLite?
Insertions can be done with ContentValues:
ContentValues cv = new ContentValues();
cv.put("column_1", 57);
cv.put("column_2", "hello");
cv.put("column_3", "world");
long rowId = db.insertOrThrow("table_name", null, cv);
But in the general case, the most correct way would be to use named parameters. However, these are not supported by the Android database API.
If you really want to use SQLiteStatement, write your own helper function that constructs it from a list of columns and takes care of matching it with the actual data. You also could write your own bindXxx() wrapper that maps previously-saved column names to parameter indexes.
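A minimal sketch of such a helper (all names here are illustrative, not a library API): it derives both the SQL text and the bind order from one column array, so dropping a column means editing a single list.

// Illustrative helper: builds "INSERT INTO t (a, b) VALUES (?, ?)" from a
// column list and binds values in the same order, so indexes never drift.
static long insertByColumns(SQLiteDatabase db, String table,
                            String[] columns, Object[] values) {
    StringBuilder sql = new StringBuilder("INSERT INTO ").append(table).append(" (");
    StringBuilder marks = new StringBuilder();
    for (int i = 0; i < columns.length; i++) {
        if (i > 0) { sql.append(", "); marks.append(", "); }
        sql.append(columns[i]);
        marks.append("?");
    }
    sql.append(") VALUES (").append(marks).append(")");
    SQLiteStatement stmt = db.compileStatement(sql.toString());
    for (int i = 0; i < values.length; i++) {
        Object v = values[i];
        if (v == null) stmt.bindNull(i + 1);
        else if (v instanceof Integer || v instanceof Long) stmt.bindLong(i + 1, ((Number) v).longValue());
        else if (v instanceof Float || v instanceof Double) stmt.bindDouble(i + 1, ((Number) v).doubleValue());
        else stmt.bindString(i + 1, v.toString());
    }
    return stmt.executeInsert();
}

Usage is then a single aligned pair of arrays:
long rowId = insertByColumns(db, "table_name",
        new String[]{"column_1", "column_2", "column_3"},
        new Object[]{57, "hello", "world"});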
You can also use ContentValues with beginTransaction(), which is quite easy and fast as well.
For this, create the ContentValues array beforehand, or create the ContentValues objects inside your loop, and pass them to the insert method. This solves both of your problems in one go.
mDatabase.beginTransaction();
int count = 0;
try {
    for (ContentValues cv : values) {
        long rowID = mDatabase.insert(table, null, cv);
        if (rowID <= 0) {
            throw new SQLException("Failed to insert row into " + table);
        }
    }
    mDatabase.setTransactionSuccessful();
    count = values.length;
} finally {
    mDatabase.endTransaction();
}

Android - How can I pass data related to two tables to the insert method of a Content Provider

I need to insert data related to an Order and its corresponding Detail.
Without a ContentProvider I would do something like this:
public boolean insertOrder(Order order, ArrayList<OrderDetail> items) {
    boolean wasSuccessful = false;
    ContentValues cvOrder = new ContentValues();
    cvOrder.put(ORDER_CUSTOMER_ID, order.getCustomerId());
    cvOrder.put(ORDER_CUSTOMER_NAME, order.getCustomerName());
    String insertQuery = "INSERT INTO " + ORDER_DETAIL_TABLE
            + " VALUES (?,?)";
    //...
    try {
        this.openWriteableDB();
        SQLiteStatement statement = db.compileStatement(insertQuery);
        db.beginTransaction();
        long idOrder = db.insertOrThrow(ORDER_TABLE, null, cvOrder);
        if (idOrder > 0) {
            for (int i = 0; i < items.size(); i++) {
                OrderDetail detail = items.get(i);
                statement.clearBindings();
                statement.bindString(1, detail.getDescription());
                statement.bindDouble(2, detail.getPrice());
                //...
                statement.execute();
            }
            db.setTransactionSuccessful();
            wasSuccessful = true;
        }
    } finally {
        db.endTransaction();
        this.closeDB();
    }
    return wasSuccessful;
}
The problem is that now I want to use a ContentProvider, and I don't know what to do in cases where data about two or more tables must be passed to a single CRUD operation, knowing that the insert operation only accepts two parameters:
@Override
public Uri insert(Uri uri, ContentValues values) {
}
What do you do in a ContentProvider when you have to insert relational data in a transaction?
Thanks in advance.
You should use ContentProviderOperation. Since it's your ContentProvider, you can ensure that applyBatch() executes all operations within a transaction. All standard content providers also ensure that this is the case.
See my blog post about ContentProviderOperation in general and my other post about how to use withBackReference() to access results of previous operations - which you need to access the orderId.
One important caveat: All ContentProviderOperations of one batch must use the same authority - but can use different URIs! In your case that should be no problem.
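A hedged sketch of what that batch might look like; the URIs, column constants, and AUTHORITY are placeholders, and the back-reference index 0 points at the order insert's result:

// Sketch: insert the order first, then reference its new id in each detail row.
ArrayList<ContentProviderOperation> ops = new ArrayList<>();
ops.add(ContentProviderOperation.newInsert(ORDER_URI) // placeholder URI
        .withValue(ORDER_CUSTOMER_ID, order.getCustomerId())
        .withValue(ORDER_CUSTOMER_NAME, order.getCustomerName())
        .build());
for (OrderDetail detail : items) {
    ops.add(ContentProviderOperation.newInsert(DETAIL_URI) // placeholder URI
            .withValue(DETAIL_DESCRIPTION, detail.getDescription())
            .withValue(DETAIL_PRICE, detail.getPrice())
            // take the order id from the result of operation 0 above
            .withValueBackReference(DETAIL_ORDER_ID, 0)
            .build());
}
try {
    getContentResolver().applyBatch(AUTHORITY, ops);
} catch (RemoteException | OperationApplicationException e) {
    // the whole batch failed; handle or log as needed
}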
You can put all the data into a single ContentValues and have the provider sort it out. You'll have to get a little creative with the order details.
In the pseudocode below I create keys like "DETAIL0", "DETAIL1" on the fly from an integer counter, then put the item under each key.
ContentValues values = new ContentValues();
values.put(ORDER_ID, orderid);
for (int i = 0; i < items.size(); i++) {
    values.put("DETAIL" + Integer.toString(i), items.get(i));
}
Uri uri = context.getContentResolver().insert(ORDER_URI, values);
Then in the content provider you sort it out:
@Override
public Uri insert(Uri uri, ContentValues values) {
    int uriType = sURIMatcher.match(uri);
    SQLiteDatabase sqlDB = database.getWritableDatabase();
    long id = 0;
    switch (uriType) {
        case ORDER:
            // trim name and description
            trimNameDescriptions(values);
            id = sqlDB.insertOrThrow(ORDERS_TABLE, null, values);
            int i = 0;
            while (values.containsKey("DETAIL" + Integer.toString(i))) {
                ContentValues v = new ContentValues();
                v.put("DETAIL", values.getAsString("DETAIL" + Integer.toString(i)));
                v.put("ORDERID", id);
                //actually call the insert method of the provider
                insert(DETAIL_URI, v);
                i += 1;
            }
            break;
    }
    return ContentUris.withAppendedId(uri, id);
}
You can design your content provider to mirror your SQLite tables and use the same insertOrder code as above.
Just use the content provider's insert for each table (URI) to perform operations similar to those in your insertOrder method.
Another option is to define a content provider URI that takes a combination of your Order and items, and implement the parsing yourself in the content provider before committing to the underlying data model.

