SQLite - Increase speed of insertion - android

I have a method which reads data from a file line by line, takes the values between commas, and puts them into an INSERT query. The data in the file is stored like this:
–,08:10,–,20:20,08:15,08:16,20:26,20:27,08:20,08:21,20:31,20:32,08:30,08:31,20:40,20:41,08:37,08:38,20:46
20:47,08:48,08:50,20:56,20:57,09:00,09:01,21:07,21:08
08:53,–,17:43,09:01,09:03,09:13,09:15,18:02,18:04,–,–,09:19,09:25
Here is my actual code:
public void insertTime(SQLiteDatabase database, String table) throws FileNotFoundException {
    BufferedReader br = null;
    String line;
    try {
        int j = 0;
        br = new BufferedReader(new InputStreamReader(context.getAssets().open("time.txt")));
        database.beginTransaction();
        while ((line = br.readLine()) != null) {
            j++;
            String query = "INSERT INTO " + table + j + " (arrival, departure) VALUES (?,?)";
            SQLiteStatement statement = database.compileStatement(query);
            // use comma as separator
            String[] time = line.split(",");
            for (int i = 1; i < time.length; i += 2) {
                statement.bindString(1, time[i - 1]); // arrival
                statement.bindString(2, time[i]);     // departure
                statement.executeInsert();
                statement.clearBindings();
            }
        }
        database.setTransactionSuccessful();
        database.endTransaction();
    } catch (IOException e) {
        e.printStackTrace();
    } finally {
        if (br != null) {
            try {
                br.close();
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }
}
The problem is that the data inserts very slowly, even though I use SQLiteStatement and transactions. For example, inserting 69,000 rows takes about 65.929 seconds.
What do I have to change in my code to improve the insertion speed?
UPDATE
OK, I have simplified my code, got rid of the BufferedReader, and now it looks like this:
public void insertTime(SQLiteDatabase database) throws FileNotFoundException {
    database.beginTransaction();
    int r = 0;
    while (r < 122) {
        r++;
        String query = "INSERT INTO table_1 (arrival, departure) VALUES (?,?)";
        SQLiteStatement statement = database.compileStatement(query);
        for (int i = 1; i < 1100; i++) {
            statement.bindString(1, i + ""); // arrival
            statement.bindString(2, i + ""); // departure
            statement.executeInsert();
            statement.clearBindings();
        }
    }
    database.setTransactionSuccessful();
    database.endTransaction();
}
But it still takes too long to insert the data, more than 2 minutes. Do you have any ideas how to increase the speed of my second example?

Here is a very, very detailed post on every method of increasing SQLite insertion speed.

Move beginTransaction() and setTransactionSuccessful() outside of the while loop and it will be way faster.

A new transaction is started for each item in the while() loop.
It might go a bit faster if you have only one transaction for all your insertions.
Also, if your data is corrupt and String.split doesn't give you at least two items, your transaction will not be ended properly, because an exception is thrown before endTransaction() is reached.
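A minimal sketch of that pattern, reusing the table_1 example from the update: the statement is compiled once, the whole batch runs in one transaction, and endTransaction() sits in a finally block so a bad row cannot leave the transaction open.
public void insertTime(SQLiteDatabase database) {
    // Compile the statement once, outside the loop, and reuse it.
    SQLiteStatement statement = database.compileStatement(
            "INSERT INTO table_1 (arrival, departure) VALUES (?,?)");
    database.beginTransaction();
    try {
        for (int i = 1; i < 1100; i++) {
            statement.bindString(1, String.valueOf(i)); // arrival
            statement.bindString(2, String.valueOf(i)); // departure
            statement.executeInsert();
        }
        database.setTransactionSuccessful();
    } finally {
        // endTransaction() in finally so an exception cannot leave
        // the transaction open.
        database.endTransaction();
    }
}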

Every time you insert a row into a table with indexes, the indexes have to be adjusted. That operation can be costly: indexes are kept as B-trees, and if you hit a rebalance point you're bound to see a slowdown. One way to test this is to remove your indexes. You could also drop the indexes, insert, and then re-create the indexes.
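A sketch of the drop-and-recreate approach; the index name and column below are hypothetical, so substitute your own schema:
// Hypothetical index name and column; adjust to your schema.
db.execSQL("DROP INDEX IF EXISTS idx_arrival");
// ... run the bulk insert inside a single transaction ...
db.execSQL("CREATE INDEX idx_arrival ON table_1 (arrival)");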

For those using JDBC (Java): to be sure, do you first set autoCommit to false?
I guess so, because you work with explicit transactions.
The performance gain I got by explicitly turning autocommit off was over 1000x!
So:
Class.forName("org.sqlite.JDBC");
String urlInput = "jdbc:sqlite:" + databaseFile;
databaseConnection = DriverManager.getConnection(urlInput);
databaseConnection.setAutoCommit(false);
And:
String sql = "INSERT INTO " + TABLE_NAME + " (type, bi, ci, fvi, tvi, content_type) VALUES ('V',?,?,?,?,'rtf')";
PreparedStatement psi = databaseConnection.prepareStatement(sql);
for (Item item : items) {
    psi.setInt(1, item.property1);
    // ....
    count = psi.executeUpdate();
}
databaseConnection.commit();
databaseConnection.setAutoCommit(true);
So if somebody forgets this, it may have a huge effect.
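Beyond autocommit, JDBC batching can also help by queuing rows and sending them together. A minimal sketch; the table, columns, and Item fields here are placeholders, not from the original post:
// Placeholder table, columns, and Item fields, for illustration only.
databaseConnection.setAutoCommit(false);
PreparedStatement ps = databaseConnection.prepareStatement(
        "INSERT INTO items (a, b) VALUES (?, ?)");
for (Item item : items) {
    ps.setInt(1, item.a);
    ps.setInt(2, item.b);
    ps.addBatch();     // queue the row instead of executing it immediately
}
ps.executeBatch();     // execute all queued rows in one go
databaseConnection.commit();
databaseConnection.setAutoCommit(true);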

Related

Constantly retrieve data from database in an infinite loop

I created a database with a table named flagTable. This table has only two fields: an id (auto increment) and an integer field. In my program, I have a button that starts a thread. While running, the thread constantly retrieves data from the database and checks the value; if the value is equal to one, it triggers another new thread, something like this:
private class statusOfStrummingInAnotherDevice extends Thread {
    int value;
    public void run() {
        try {
            while (true) {
                try {
                    if (flagCursor == null) {
                        flagCursor = cdb1.getFlagAll();
                    }
                } catch (Exception e) {
                    break;
                }
                try {
                    Log.i("MAIN3ACTIVITY", "getting status");
                    int size = cdb1.getSize(flagCursor);
                    Log.i("MAIN3ACTIVITY", "SIZE is " + String.valueOf(size));
                    for (int i = 0; i < size; i++) {
                        flagCursor.moveToPosition(i);
                        Log.i("MAIN3ACTIVITY", "getting status");
                        value = cdb1.getFlag();
                        if (value == 1) {
                            Log.i("FLAGCURSOR=====>>>>", "Successful");
                            releasingNotes = new ReleasingNotes(IntendedChord);
                            releasingNotes.start();
                            //break;
                        }
                        cdb1.updateFlag(0);
                        Log.i("FLAGCURSOR=====>>>>", String.valueOf(value));
                    }
                    flagCursor = null;
                } catch (Exception e) {
                    break;
                }
                Log.i("MAIN3ACTIVITY", "thread is sleeping");
                try {
                    Thread.sleep(1000);
                } catch (InterruptedException e) {
                    break;
                }
            }
        } catch (Exception e) {
        }
    }
}
The data is retrieved from the database using this function:
public Cursor getFlagAll() {
    return getReadableDatabase().rawQuery(
            "SELECT _ID, flag FROM flagTable", null);
}
And the data is updated in the database through this method:
public int updateFlag(int i) {
    SQLiteDatabase db = this.getWritableDatabase();
    ContentValues contentValues = new ContentValues();
    contentValues.put("flag", i);
    return db.update("flagTable", contentValues, "_ID" + "= ?", new String[]{String.valueOf(1)});
}
Now, the above code gives no error; however, the value retrieved from the database is always 1, so it keeps triggering a new thread. In my code above, I stated that if the value is equal to 1, the current thread triggers a new thread to start. When that is finished, the program updates the value to 0, so that the next round of the infinite loop stops triggering new threads until the condition is met again. What is the problem here? Did my code really update the new value, or do I need to refresh the database every time I update a value?
Use listeners on your database: use SQLiteTransactionListener and do your work in onCommit().
Some detailed guides here:
https://developer.android.com/reference/android/database/sqlite/SQLiteTransactionListener.html and
http://www.programcreek.com/java-api-examples/index.php?api=android.database.sqlite.SQLiteTransactionListener
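A minimal sketch of the listener approach; db here stands for the writable SQLiteDatabase behind cdb1 (an assumption), and the refresh logic in onCommit() is a placeholder:
db.beginTransactionWithListener(new SQLiteTransactionListener() {
    @Override
    public void onBegin() {
    }

    @Override
    public void onCommit() {
        // The flag update was committed; this is the place to
        // re-query flagTable instead of polling it every second.
    }

    @Override
    public void onRollback() {
    }
});
try {
    cdb1.updateFlag(0); // the write being observed
    db.setTransactionSuccessful();
} finally {
    db.endTransaction();
}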

android loop through database with cursor speed

I have read several posts here on speed issues when looping through a cursor and have tried the answers given in those posts, e.g. do not call getColumnIndex inside the loop, call it once, etc.
However, with a database of around 2,400 records it takes around 3 to 5 minutes to finish.
The loop runs in an AsyncTask so that it does not hang the device, and the database is accessed via a database adapter.
The loop code is as follows:
while (!exportrec.isAfterLast()) {
    if (exportrec.moveToNext()) {
        fulldate = exportnumberformatter(exportrec.getInt(daye))
                + "/" + exportnumberformatter(exportrec.getInt(monthe)) + "/"
                + String.valueOf(exportrec.getInt(yeare));
        fulltime = exportnumberformatter(exportrec.getInt(houre)) + ":"
                + exportnumberformatter(exportrec.getInt(mine)) + ":"
                + exportnumberformatter(exportrec.getInt(sece));
        noiseid = exportrec.getInt(typee);
        exportedinfo += exporttypes[id] + "," + exportrec.getString(notee) + ","
                + fulldate + "," + fulltime + " \n";
    }
}
The exportnumberformatter does the following:
public String exportnumberformatter(int i) {
    String result = Integer.toString(i);
    if (result.length() > 1) {
        return result;
    }
    return "0" + result;
}
The cursor is created as follows before the loop to get the data:
exportrec = MD.GetAllLogs(2, "date_sort");
exportrec.moveToFirst();
MD is the database adapter, and here is the GetAllLogs method (this has been played with to try to speed things up, so the date_sort argument is effectively ignored here):
public Cursor GetAllLogs(Integer i, String sortfield) {
    String sorted = "";
    if (i == 1) {
        sorted = "DESC";
    } else if (i == 2) {
        sorted = "ASC";
    }
    return mDB.query(DB_TABLE, new String[] {COL_ID, COL_TYPE, COL_IMAGE, COL_INFO, COL_IMAGE,
            COL_HOUR, COL_SEC, COL_MIN, COL_DAY, COL_MON, COL_YEAR, COL_SORT_DATE},
            null, null, null, null, COL_ID + " " + sorted);
}
When I created the table in the database it had no indexes, so I created them later via the upgrade method. They did not error or appear to fail when I did this, but what I do not know is: A) does the database/table need rebuilding after an index is created, and B) how can I tell whether the indexes have actually been created? The two indexes were on the ID as the first, and on a field that holds the year, month, day, hour, minute, and second all in one long integer.
I am concerned that the loop appears to be taking this long to read through that many records.
Update:
rtsai2000's answer and the suggestion from CL have improved the speed from minutes to seconds.
Your exportedInfo String is growing and growing. Save the results in a list and stringify later (such as with a StringBuilder).
You are also not closing your cursor after reading the records.
List<String> exportedInfo = new ArrayList<String>();
Cursor exportrec = GetAllLogs();
try {
    while (exportrec.moveToNext()) {
        String info = String.format("%s, %s, %02d/%02d/%02d, %02d:%02d:%02d",
                exporttypes[id],
                exportrec.getString(notee),
                exportrec.getInt(daye),
                exportrec.getInt(monthe),
                exportrec.getInt(yeare),
                exportrec.getInt(houre),
                exportrec.getInt(mine),
                exportrec.getInt(sece));
        exportedInfo.add(info);
    }
} finally {
    exportrec.close();
}
return exportedInfo;
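If a single string is still needed at the end, it can be assembled once after the loop, for example:
// Join once at the end instead of growing a String inside the loop.
StringBuilder sb = new StringBuilder();
for (String line : exportedInfo) {
    sb.append(line).append('\n');
}
String report = sb.toString();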

ORMLite select some columns using predicates

I have an ORMLite database with some fields. I want to select titles from the table where the id equals an id that I get from a web service. I do it like this:
try {
    Dao<ProcessStatus, Integer> dao = db.getStatusDao();
    Log.i("status", dao.queryForAll().toString());
    QueryBuilder<ProcessStatus, Integer> query = dao.queryBuilder();
    Where where = query.where();
    String a = null;
    for (Order r : LoginActivity.orders) {
        // LoginActivity.orders - array of my objects which I get from the web service
        Log.i("database", query.selectRaw("select title from process_status").
                where().rawComparison(ProcessStatus.STATUS_ID, "=",
                r.getProcess_status().getProccessStatusId()).toString());
    }
    Log.i("sr", a);
} catch (SQLException e) {
    // TODO Auto-generated catch block
    e.printStackTrace();
}
I tried it like this, but I get only sets of my ids, not titles. I also tried:
Log.i("database", query.selectColumns(ProcessStatus.STATUS_TITLE).where().
eq(ProcessStatus.STATUS_ID, r.getProcess_status().getProccessStatusId())
.toString());
but I get the same result. How should I get the data from the database?
For selecting a specific field from the table, you could do something like this:
String result = "";
try {
GenericRawResults<String[]> rawResults = yourDAO.queryRaw("select " +
ProcessStatus.STATUS_TITLE +" from YourTable where "+
ProcessStatus.STATUS_ID + " = " +
r.getProcess_status().getProccessStatusId());
List<String[]> results = rawResults.getResults();
// This will select the first result (the first and maybe only row returned)
String[] resultArray = results.get(0);
//This will select the first field in the result which should be the ID
result = resultArray[0];
} catch (Exception e) {
e.printStackTrace();
}
Hope this helps.
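A variant of the same query using a bound argument instead of string concatenation, which avoids quoting and escaping problems (ORMLite's queryRaw accepts optional String arguments after the query):
GenericRawResults<String[]> rawResults = yourDAO.queryRaw(
        "select " + ProcessStatus.STATUS_TITLE + " from YourTable where "
                + ProcessStatus.STATUS_ID + " = ?",
        String.valueOf(r.getProcess_status().getProccessStatusId()));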
It's hard to answer this question properly without seeing all of the classes involved, such as the one holding the processStatusId field. However, I think you are using too many raw methods and may not be properly escaping your values.
I would recommend that you use the SQL IN statement instead of what you are doing in the loop. Something like:
List<String> ids = new ArrayList<String>();
for (Order r : LoginActivity.orders) {
    ids.add(r.getProcess_status().getProccessStatusId());
}
QueryBuilder<ProcessStatus, Integer> qb = dao.queryBuilder();
Where where = qb.where();
where.in(ProcessStatus.STATUS_ID, ids);
qb.selectColumns(ProcessStatus.STATUS_TITLE);
Now that you have built your query, you can either retrieve your ProcessStatus objects or get the titles themselves using dao.queryRaw(...):
List<ProcessStatus> results = qb.query();
// or use the prepareStatementString method to get raw results
GenericRawResults<String[]> rawResults = dao.queryRaw(qb.prepareStatementString());
// each raw result would have a String[] with 1 element for the title

Android sqlite inserting 500 rows does not work or preserve INSERT order

I cannot get SQLite to support my begin/end transaction surrounding multiple inserts.
Multiple INSERTs: 2500 ms
Using BEGIN and COMMIT: 90 ms
Using SELECT and UNION: 40 ms
So I looked at using BEGIN and COMMIT. What am I doing wrong?
// pseudocode:
ArrayList<Integer> iList = new ArrayList<Integer>();
for (int i = 1; i <= 500; i++) {
    iList.add(i);
}
Collections.shuffle(iList);

StringBuilder sb = new StringBuilder("begin transaction;");
for (Integer i : iList) {
    sb.append("insert into \"t_order\" (qid) values(");
    sb.append(i);
    sb.append(");");
}
sb.append(" end transaction;");
// from the docs: http://developer.android.com/reference/android/database/sqlite/SQLiteDatabase.html#execSQL(java.lang.String)
// "Execute a single SQL statement that is NOT a SELECT or any other SQL statement that returns data."
m_db.execSQL(sb.toString());
OK, I did a bit more research and it seems that "Multiple statements separated by semicolons are not supported." What can I do instead to insert the rows and preserve their insert order?
Start a transaction, execute each of the INSERTs in a separate execSQL() call, and then commit the transaction.
You don't need to cluster the INSERTs together in the same execSQL() call.
Use the SQLiteDatabase.beginTransaction() and SQLiteDatabase.endTransaction() methods and issue your execSQL call(s) between them. It would also be better style to use a ContentValues structure instead of doing your own string concatenation:
ContentValues cv = new ContentValues();
m_db.beginTransaction();
try {
    for (Integer i : iList) {
        cv.put("qid", i);
        m_db.insert("t_order", null, cv);
    }
    m_db.setTransactionSuccessful();
} finally {
    m_db.endTransaction();
}
Have a look at the official Android documentation on beginTransaction(). Replace the "..." portion with a loop doing a separate execSQL() call for each insert -- there is no need to concatenate the statements together in one buffer.
Also, it's often worth using a prepared statement. Prepare the statement, begin the transaction, loop over the items binding and executing the statement, and finally commit.
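A minimal sketch of that pattern for the t_order example above:
SQLiteStatement stmt = m_db.compileStatement(
        "INSERT INTO t_order (qid) VALUES (?)");
m_db.beginTransaction();
try {
    for (Integer i : iList) {
        stmt.bindLong(1, i);
        stmt.executeInsert();   // rows are inserted in iteration order
    }
    m_db.setTransactionSuccessful();
} finally {
    m_db.endTransaction();
}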
This may help; it batches rows into INSERT ... SELECT ... UNION statements, capped at 450 rows per statement because SQLite limits a compound SELECT to 500 terms by default:
public void putAll(Context context, LinkedList<HANDLEDOBJECT> objects) {
    if (objects.size() < 1 || objects.get(0) == null)
        return;
    Log.i("Database", "Starting to insert objects to " + getTableName());
    List<String> insertCommands = new ArrayList<String>();
    int t = 0;
    while (t < objects.size()) {
        int k = 0;
        StringBuilder sb = new StringBuilder();
        sb.append("INSERT OR REPLACE INTO ").append(getTableName())
                .append("('k', 'v') ");
        for (; t < objects.size() && k < 450; t++) {
            k++;
            if (t % 450 != 0)
                sb.append("UNION ");
            sb.append("SELECT " + objects.get(t).getId())
                    .append(" AS k, ")
                    .append("'"
                            + GsonSerializer.getInstance().toJson(
                                    objects.get(t), getHandledObjectType())
                            + "'").append(" AS v ");
        }
        insertCommands.add(sb.toString());
    }
    for (String insertCommand : insertCommands)
        SQLiteClient.getConnection(context).execSQL(insertCommand);
    Log.i("Database", "Successfully inserted " + t + " objects to "
            + getTableName() + "!!!");
    System.gc();
}

Android - Speed up inserting data in database

I currently have a CSV file that I parse, and I am trying to insert the data into the Android database. The problem is that it takes way too long to insert all of the data. It's a good amount of data, but I feel like it shouldn't take 20 minutes or so to complete.
Basically, I create my database, then begin the parsing. While parsing each individual CSV row, I grab the required data and insert it into the database. In total there are around 40,000 rows.
Is there any way I can speed up this process? I have tried batch inserts but they never really helped (unless I did it wrong).
Code down below.
Thanks.
DatabaseHelper (I have two insert commands based on the amount of data in each CSV row):
// add zipcode
public void add9Zipcode(String zip, String city, String state, String lat,
        String longi, String decom) {
    // get db and content values
    SQLiteDatabase db = this.getWritableDatabase();
    ContentValues values = new ContentValues();
    db.beginTransaction();
    try {
        // add the values
        values.put(KEY_ZIP, zip);
        values.put(KEY_STATE, state);
        values.put(KEY_CITY, city);
        values.put(KEY_LAT, lat);
        values.put(KEY_LONG, longi);
        values.put(KEY_DECOM, decom);
        // execute the statement
        db.insert(TABLE_NAME, null, values);
        db.setTransactionSuccessful();
    } finally {
        db.endTransaction();
    }
    db.close();
}

public void add12Zipcode(String zip, String city, String state, String lat,
        String longi, String decom, String tax, String pop, String wages) {
    // get db and content values
    SQLiteDatabase db = this.getWritableDatabase();
    ContentValues values = new ContentValues();
    db.beginTransaction();
    try {
        // add the values
        values.put(KEY_ZIP, zip);
        values.put(KEY_STATE, state);
        values.put(KEY_CITY, city);
        values.put(KEY_LAT, lat);
        values.put(KEY_LONG, longi);
        values.put(KEY_DECOM, decom);
        values.put(KEY_TAX, tax);
        values.put(KEY_POP, pop);
        values.put(KEY_WAGES, wages);
        // execute the statement
        db.insert(TABLE_NAME, null, values);
        db.setTransactionSuccessful();
    } finally {
        db.endTransaction();
    }
    db.close();
}
Parse File:
public void parse(ArrayList<String> theArray, DatabaseHandler db) {
    String[] data = null;
    // for loop to split each string in the array list of zipcodes
    for (int x = 0; x < theArray.size(); x++) {
        if (x == 10000 || x == 20000 || x == 30000 || x == 40000) {
            Log.d(TAG, "x is 10k, 20k, 30k, 40k");
        }
        // split the string into an array
        data = theArray.get(x).split(",");
        // separate based on the size of the array: 9 or 12
        if (data.length == 9) {
            db.add9Zipcode(data[0], data[2], data[3], data[5], data[6],
                    data[8]);
        } else if (data.length == 12) {
            db.add12Zipcode(data[0], data[2], data[3], data[5], data[6],
                    data[8], data[9], data[10], data[11]);
            /*
             * theZip.zip = data[0]; theZip.city = data[2]; theZip.state =
             * data[3]; theZip.lat = data[5]; theZip.longi = data[6];
             * theZip.decom = data[8]; theZip.tax = data[9]; theZip.population
             * = data[10]; theZip.wages = data[11];
             */
        }
    }
}
Refer to this answer I made previously: Inserting 1000000 rows in sqlite3 database
In short, use an InsertHelper and do more than one insert per transaction - unless you did something wonky, the speed increase should be noticeable.
Edit:
In short:
Your SQLiteOpenHelper should be a singleton used across your entire application.
Don't call close() on your SQLiteDatabase instance - it's cached in the SQLiteOpenHelper, and every time you close it you force the helper to reopen it.
Batch your inserts: start a transaction outside the calls to the addZipCode methods and mark it as successful after you've done all the inserts, then commit the transaction (see the sketch after this list).
Use an InsertHelper - it will format the insert properly as a prepared statement and is nice and reusable.
Be mindful of synchronizing access to the database - unless you intend to do all your database work on the UI thread (which is not recommended), you either need to enable locking or guard access to the database to avoid concurrent access.
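A sketch of what the reworked helper could look like, assuming the column keys from the methods above; the addAllZipcodes method is not in the original code, and parsing is left to the caller:
public void addAllZipcodes(List<String[]> rows) {
    // One handle for the whole batch; do not close it per row.
    SQLiteDatabase db = this.getWritableDatabase();
    db.beginTransaction();
    try {
        ContentValues values = new ContentValues();
        for (String[] data : rows) {
            values.clear();
            values.put(KEY_ZIP, data[0]);
            values.put(KEY_CITY, data[2]);
            values.put(KEY_STATE, data[3]);
            values.put(KEY_LAT, data[5]);
            values.put(KEY_LONG, data[6]);
            values.put(KEY_DECOM, data[8]);
            if (data.length == 12) {
                values.put(KEY_TAX, data[9]);
                values.put(KEY_POP, data[10]);
                values.put(KEY_WAGES, data[11]);
            }
            db.insert(TABLE_NAME, null, values);
        }
        // Mark success once, after all rows are inserted.
        db.setTransactionSuccessful();
    } finally {
        db.endTransaction(); // one commit for all ~40,000 rows
    }
}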
