I made my own ContentProvider where I put a lot of data in at once with multiple inserts.
The app receives the data from an external source, and at the moment I receive about 30 items (therefore 30 separate inserts).
I noticed that this takes a lot of precious time (about 3 seconds, roughly 100ms per insert).
How can I improve the speed of the ContentProvider? I already tried bulkInsert to insert them all together, but then it takes up to 5 seconds.
Thanks in advance.
Wrap all the inserts in your bulkInsert implementation in a single transaction.
Example:
@Override
public int bulkInsert(Uri uri, ContentValues[] values) {
    // "table" is the table name matched from the Uri, mDB is your SQLiteOpenHelper
    SQLiteDatabase sqlDB = mDB.getWritableDatabase();
    int numInserted = 0;
    sqlDB.beginTransaction();
    try {
        for (ContentValues cv : values) {
            long newID = sqlDB.insertOrThrow(table, null, cv);
            if (newID <= 0) {
                throw new SQLException("Failed to insert row into " + uri);
            }
        }
        sqlDB.setTransactionSuccessful();
        getContext().getContentResolver().notifyChange(uri, null);
        numInserted = values.length;
    } finally {
        sqlDB.endTransaction();
    }
    return numInserted;
}
bulkInsert does not use transactions by default, since the default behavior just calls insert:
Override this to handle requests to insert a set of new rows, or the default implementation will iterate over the values and call insert(Uri, ContentValues) on each of them.
Doing the inserts in a transaction greatly improves the speed, since only one write to the actual database file takes place:
db.beginTransaction();
try {
    // do the inserts
    db.setTransactionSuccessful();
} finally {
    db.endTransaction();
}
I once experimented with improving the speed of about ~2000 writes, and using a transaction was the only big improvement I found.
Calling db.setLockingEnabled(false) gave maybe a 1% improvement, but then you must also make sure no other threads write to the db; a rough sketch of that is below. Removing redundant indexes can also give a minor boost if the table is huge.
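For reference, a rough sketch of the setLockingEnabled trick; this assumes the database is only ever touched from a single thread, and note that the method is deprecated on newer API levels:

db.setLockingEnabled(false); // safe only while no other thread uses this db
try {
    // ... perform the batch of writes (inside a transaction) ...
} finally {
    db.setLockingEnabled(true);
}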
I'm having some difficulty getting my database updated. Basically the user will input data into two separate places, so we get
Name | Letter | Marks
---------------------
Dave | Null   | 90
Dave | A      | Null
which should become
Dave | A      | 90
However, nothing is updating. The query works perfectly when I try it in SQLite Manager, so I must be using the Cursor incorrectly.
public void insertData(String name, int mark_column, String marks) {
    String[] columns = new String[] {COL_3, COL_4, COL_5};
    SQLiteDatabase db = this.getWritableDatabase();
    ContentValues contentValues = new ContentValues();
    contentValues.put(COL_2, name);
    contentValues.put(columns[mark_column], marks);
    db.replace(TABLE_NAME, null, contentValues);
    // The code above works as desired

    String sql = "SELECT NAME, GROUP_CONCAT(LETTER, ', ') AS LETTER, " +
            "GROUP_CONCAT(MARKS, ', ') AS MARKS FROM " + TABLE_NAME + " GROUP BY NAME";
    // This query works in SQLite Manager
    Cursor c = db.rawQuery(sql, null);
    c.moveToFirst();
    while (c.moveToNext());
    c.close();
}
I have tried various combinations of c.moveToLast(), not having c.moveToNext(), etc. This method is called from the onClick of an AlertDialog.
Any help is greatly appreciated.
Regarding the cursor:
I don't see anything wrong with your query. If you aren't "seeing" any results in your app, it's likely because you aren't actually doing anything with the results. They exist in memory in a Cursor object, but that's all; if you want to see anything you have to bind that data to some UI components, or dump it to logcat, or something.
Also note that as written, your while loop has an empty body, so nothing happens with the rows even when they are returned. And if you were to add code inside it, you would skip the first row of the cursor, because the moveToFirst() call is followed immediately by a moveToNext() call. This is how I iterate over a Cursor, and it always works:
if (cursor != null) {
    try {
        for (cursor.moveToFirst(); !cursor.isAfterLast(); cursor.moveToNext()) {
            // do something with the data in the current row
        }
    } finally {
        cursor.close();
    }
}
Regarding the update:
You actually aren't doing an update per se, you are doing an insert. SQLiteDatabase.replace() executes this command:
INSERT OR REPLACE INTO tableName(columns) VALUES (values);
This can work as an update only if you have a constraint on the table and the insertion of a new row with these values would violate that constraint (the exact handling for different constraint violations is described here). For the constraint types that I suspect you are expecting, this operation will delete the existing row and insert a new row with these values, but it will not carry over values from the deleted row into the new one. In other words, you need all the combined values in the ContentValues if you expect a replace to occur. It's not like an UPDATE where you can set the values of just certain columns.
You should probably try to do an actual update instead. Make sure to use a proper WHERE clause so you only update the rows that matter, as in the sketch below.
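A minimal sketch of that update, reusing the names from your insertData() method; the WHERE clause on COL_2 (the name column) is an assumption about how you identify the row:

// Set only the mark column for the row matching this name;
// COL_2, columns, mark_column, and marks are from the question's code.
ContentValues contentValues = new ContentValues();
contentValues.put(columns[mark_column], marks);
db.update(TABLE_NAME, contentValues, COL_2 + " = ?", new String[] { name });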
I may be misunderstanding your approach, but the description makes it seem like you are inserting two rows, then trying to update and/or combine them both later. This doesn't make sense to me, and I foresee bugs whereby you have leftover rows that are incomplete and need to be cleaned up. In my opinion, it's better to structure the code so there is one INSERT, and every operation thereafter is an UPDATE on the row of interest.
Can we insert 1000 records in one go into an SQLite db in Android? I know there is the approach of putting the insert in a loop and calling it 1000 times. Is there any other approach that performs better? And if the loop is the only approach, how much would my performance be impacted by it?
database.execSQL("PRAGMA synchronous=OFF");
will speed up database operations.
But it is risky: without synchronous writes, a crash or power loss mid-write can corrupt the database or lose data.
You should use transactions for such scenarios.
Pseudocode:
db.beginTransaction();
try {
    for (ContentValues entry : listOfEntries) {
        db.insert(TABLE_NAME, null, entry);
    }
    db.setTransactionSuccessful();
} finally {
    db.endTransaction();
}
Check this post.
It has an example of inserting values into a database with and without a transaction.
It also has time measurements for both approaches (in that post the difference is more than 100x when inserting 100 records).
Is there a maximum number of values that can be inserted in a single transaction?
Example from my Android application:
public void addArray(ArrayList<MyObject> arrData) {
    this.mDb.beginTransaction();
    try {
        for (MyObject obj : arrData) {
            ContentValues values = new ContentValues();
            values.put(KEY_ROW_ID, obj.getId());
            values.put(KEY_NAME, obj.getName());
            values.put(KEY_ICON, obj.getImage());
            //////// LIMIT HERE ?
            this.mDb.insert(TABLE, null, values);
        }
        this.mDb.setTransactionSuccessful();
    } finally {
        this.mDb.endTransaction();
    }
}
Thanks
There is no practical limit on the number of inserts per transaction.
You will probably hit disk-space issues before you see any problems with the number of rows in an unfinished transaction.
SQLite writes every pending insert into the journal or WAL file, and that file can easily grow into gigabytes.
I have personally inserted as many as 100k rows in one transaction, and it worked just fine.
SQLite author D. Richard Hipp even suggests opening a transaction when the program starts, doing whatever you need to do for as long as you need, and only committing when you quit or explicitly save state. This is basically a cheap way to implement an undo function: simply roll back the current transaction, as sketched below.
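A minimal sketch of that rollback-as-undo idea; userWantsToSave is a hypothetical flag for whether the accumulated changes should be kept:

db.beginTransaction();
// ... inserts/updates accumulate here while the program runs ...
if (userWantsToSave) {
    db.setTransactionSuccessful(); // will commit on endTransaction()
}
db.endTransaction(); // rolls everything back if not marked successful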
My Android application has a preprocessing step at startup which loads the data the app needs from the database. The SQLite database is about 40 MB, and the preprocessing takes so long that the user has to wait about a minute before using the application. Is there any way to improve the performance of my app? The DB operations are mostly SELECTs like this:
Cursor mCursor = myDataBase.rawQuery("SELECT SUM(TimeTaken) as _time from AssessmentAttempted where AssesmentId IN(" + assessmentIds + ")", null);
To improve the performance of the preprocessing phase, wrap all the operations made against the DB in a single SQL transaction. This will decrease the insert and update times in particular.
myDataBase.beginTransaction();
try {
    // make all the DB operations
    myDataBase.setTransactionSuccessful();
} catch (Exception e) {
    // error in the middle of the database transaction
} finally {
    myDataBase.endTransaction();
}
I'm doing both an update and an insert to a review_schedule table. Together they seem to take between 1 and 2 seconds, which is a bit slow. I've tried it with and without indexes on the id field and the date_review field. Note that I'm using a precompiled statement for the insert, but not for the update, because compiled statements for updates apparently aren't supported on Gingerbread. Here's the code:
public long insert(int id, String lastReviewDate, String nextReviewDate) {
    this.insertStmt.bindLong(1, id);
    this.insertStmt.bindString(2, lastReviewDate);
    this.insertStmt.bindString(3, nextReviewDate);
    return this.insertStmt.executeInsert();
}

public void updateSRSInfo(int id, String lastReviewDate,
        String nextReviewDate) {
    ContentValues contentValues = new ContentValues();
    contentValues.put("last_review_date", lastReviewDate);
    contentValues.put("next_review_date", nextReviewDate);
    this.myDataBase.update(DataBaseHelper.WORD_REVIEW_SCHEDULE,
            contentValues, "_id=?", new String[] { Integer.toString(id) });
}
Any suggestions appreciated.
If the length of the procedure is making the app unresponsive, you could move the operation onto a different thread, for example:
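A minimal sketch of running the database work off the UI thread; writeReviewSchedule() is a hypothetical method wrapping the update and insert calls from the question:

new Thread(new Runnable() {
    @Override
    public void run() {
        // writeReviewSchedule() stands in for the insert/update calls above
        writeReviewSchedule();
    }
}).start();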
At least two options:
1) Read uncommitted transactions. By default SQLite reads only committed data, which is slower, but you have to understand all the risks you might run into reading uncommitted records. Changing the transaction isolation level can be done this way:
database.execSQL("PRAGMA read_uncommitted = true;");
2) Use table indexes, for example:
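A sketch of what adding an index could look like; the table and column names are taken from the question above:

// Index the column used to look up rows for review
database.execSQL("CREATE INDEX IF NOT EXISTS idx_date_review "
        + "ON review_schedule(date_review)");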
As you already have the id, why not combine it with ContentUris.withAppendedId? Then you can access the record directly.
I'm not sure whether this runs faster or not, as I haven't tested it.
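For reference, a sketch of addressing a single row that way; CONTENT_URI is a hypothetical base Uri that a ContentProvider for this table would expose (the question's code talks to SQLiteDatabase directly, so this is an assumption):

// Build a Uri pointing at the single row with this id
Uri rowUri = ContentUris.withAppendedId(CONTENT_URI, id);
// Update just that row through the resolver
getContentResolver().update(rowUri, contentValues, null, null);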