Android + SQLite insert speed improvements?

I recently inherited a project where a SQLite db is stored on the user's SD card (tables and columns only, no content). For the initial install (and subsequent data updates), an XML file is parsed with a SAX parser, storing its contents to the db columns like so:
SAX handler:
@Override
public void endElement(String uri, String localName, String qName) throws SAXException {
    currentElement = false;
    if (localName.equals("StoreID")) {
        storeDetails.setStoreId(buffer.toString().trim());
    } else if (localName.equals("StoreName")) {
        storeDetails.setStoreName(buffer.toString().trim());
    ...
    } else if (localName.equals("StoreDescription")) {
        storeDetails.setStoreDescription(buffer.toString().trim());
        // when the final column is checked, call custom db helper method
        dBHelper.addtoStoreDetail(storeDetails);
    }
    buffer = new StringBuffer();
}

@Override
public void characters(char[] ch, int start, int length) throws SAXException {
    if (currentElement) {
        buffer.append(ch, start, length);
    }
}
DatabaseHelper:
// add to StoreDetails table
public void addtoStoreDetail(StoreDetails storeDetails) {
    SQLiteDatabase database = null;
    InsertHelper ih = null;
    try {
        database = getWritableDatabase();
        ih = new InsertHelper(database, "StoreDetails");
        // Get the numeric indexes for each of the columns that we're updating
        final int idColumn = ih.getColumnIndex("_id");
        final int nameColumn = ih.getColumnIndex("StoreName");
        ...
        final int descColumn = ih.getColumnIndex("StoreDescription");
        // Add the data for each column
        ih.bind(idColumn, storeDetails.getStoreId());
        ih.bind(nameColumn, storeDetails.getStoreName());
        ...
        ih.bind(descColumn, storeDetails.getStoreDescription());
        // Insert the row into the database.
        ih.execute();
    } finally {
        if (ih != null) ih.close();
        safeCloseDataBase(database);
    }
}
The loaded XML document is 6000+ lines long. When testing on the device, it stops inserting at around the halfway point (no errors), which takes about 4-5 minutes to reach. On the emulator, however, it runs rather quickly, writing all lines to the database in about 20 seconds. I have log statements that run when the db is opened, data added, then closed. The LogCat output is significantly slower when running on the device. Is there something I'm missing here? Why is my data taking so long to write? I thought the improved InsertHelper would help, but unfortunately it isn't even a little faster. Can someone point out my flaw(s) here?

I also counted on InsertHelper improving speed significantly, but the difference wasn't that drastic when I tested it.
Still, the strength of InsertHelper is in multiple inserts, because it compiles the query just once. The way you do it, you declare a new InsertHelper for every insert, which bypasses the one-time-compilation improvement. Try using the same instance for multiple inserts.
However, I do not think that 6000+ inserts will finish in less than a minute on a slow device.
EDIT: Also make sure you fetch the column indices only once; this will speed things up a bit more. Place these outside the loop for the batch insert:
// Get the numeric indexes for each of the columns that we're updating
final int idColumn = ih.getColumnIndex("_id");
final int nameColumn = ih.getColumnIndex("StoreName");
final int descColumn = ih.getColumnIndex("StoreDescription");
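Put together, the reuse pattern could look like the sketch below, built from the question's own helper (the List of StoreDetails collected up front and safeCloseDataBase() are assumed to exist as in the question):
// A minimal sketch: one InsertHelper and one set of column indices
// shared across the whole batch, so the INSERT is compiled only once.
public void addStoreDetails(List<StoreDetails> details) {
    SQLiteDatabase database = getWritableDatabase();
    InsertHelper ih = new InsertHelper(database, "StoreDetails");
    try {
        // Fetch the column indices once, outside the loop.
        final int idColumn = ih.getColumnIndex("_id");
        final int nameColumn = ih.getColumnIndex("StoreName");
        final int descColumn = ih.getColumnIndex("StoreDescription");
        for (StoreDetails sd : details) {
            ih.prepareForInsert();   // resets bindings, reuses the compiled statement
            ih.bind(idColumn, sd.getStoreId());
            ih.bind(nameColumn, sd.getStoreName());
            ih.bind(descColumn, sd.getStoreDescription());
            ih.execute();
        }
    } finally {
        ih.close();
        safeCloseDataBase(database);
    }
}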

When you're doing a batch insert like this, you might do better to set up a special action in your DB helper for it. Right now, you are opening and closing the connection to the SQLite DB every time you insert a row, which is going to slow you down significantly. For the batch process, set it up so that you can maintain the connection for the whole import job. I think the reason it is faster in the emulator is that, while running, the emulator exists entirely in memory, so although it intentionally slows down your CPU speed, file IO is a lot faster.
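Such a batch mode could be as simple as the sketch below; the method names beginImport() and endImport() are illustrative, not existing helper methods:
// Inside the DatabaseHelper: hold one open connection for the whole import.
private SQLiteDatabase batchDb;

public void beginImport() {
    batchDb = getWritableDatabase();     // open once, before parsing starts
}

public void addtoStoreDetail(StoreDetails storeDetails) {
    ContentValues values = new ContentValues();
    values.put("_id", storeDetails.getStoreId());
    values.put("StoreName", storeDetails.getStoreName());
    values.put("StoreDescription", storeDetails.getStoreDescription());
    batchDb.insert("StoreDetails", null, values);  // no open/close per row
}

public void endImport() {
    safeCloseDataBase(batchDb);          // close once, after the SAX endDocument()
    batchDb = null;
}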

In addition to connecting to the database just once, could the database connection be set not to commit the changes after each insert, but only once at the end of the batch? I tried to browse the Android dev docs but couldn't find exact instructions on how to do this (or whether it is already set up that way). On other platforms, setting the SQLite driver's AutoCommit to false and committing only at the end of the batch can improve insert speeds significantly.
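Android doesn't expose SQLite's autocommit switch directly, but a transaction achieves the same thing: nothing is committed until the end of the batch. A minimal sketch, assuming the single open connection suggested above (db, dbHelper, and allStoreDetails are assumed names):
db.beginTransaction();                 // suspends the per-insert commit
try {
    for (StoreDetails sd : allStoreDetails) {
        dbHelper.addtoStoreDetail(sd); // each insert stays inside the open transaction
    }
    db.setTransactionSuccessful();     // everything commits here, once
} finally {
    db.endTransaction();               // rolls back if not marked successful
}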

Related

Database insertion taking too much time- android sqlite

I am trying to insert around 2800 records into the SQLite database, and it is taking 150 seconds, which is way too much! Could anyone please tell me how to optimize this insertion?
public void createVariantEntry(ArrayList<ArrayList<String>> str) {
    InsertHelper ih = new InsertHelper(Database, VARIANT_TABLE_NAME);
    final int varid = ih.getColumnIndex(VARIANT_ID);
    final int varmakeid = ih.getColumnIndex(VARIANT_MAKE_ID);
    final int varmodid = ih.getColumnIndex(VARIANT_MODEL_ID);
    final int varname = ih.getColumnIndex(VARIANT_NAME);
    final int varposteddate = ih.getColumnIndex(VARIANT_POSTED_DATE);
    for (int i = 0; i < 1253; i++) {
        ih.prepareForInsert();
        ih.bind(varid, str.get(i).get(0));
        ih.bind(varmakeid, str.get(i).get(1));
        ih.bind(varmodid, str.get(i).get(2));
        ih.bind(varname, str.get(i).get(3));
        ih.bind(varposteddate, str.get(i).get(4));
        ih.execute();
    }
    for (int i = 1255; i < str.size(); i++) {
        ih.prepareForInsert();
        ih.bind(varid, str.get(i).get(0));
        ih.bind(varmakeid, str.get(i).get(1));
        ih.bind(varmodid, str.get(i).get(2));
        ih.bind(varname, str.get(i).get(3));
        ih.bind(varposteddate, str.get(i).get(4));
        ih.execute();
    }
    ih.close();
}
A great boost in performance is gained by using transactions:
// db is declared outside the try block so the finally clause can see it
SQLiteDatabase db = mySQLiteOpenHelper.getWritableDatabase();
ContentValues values = new ContentValues();
try {
    db.beginTransaction();
    while (more_data_to_insert) {
        // put the data in 'values'
        values.put("col_1", data_1);
        values.put("col_2", data_2);
        // ...
        values.put("col_n", data_n);
        // Insert the row into the database.
        db.insert("table_name", null, values);
    }
    db.setTransactionSuccessful();
} catch (SQLiteException e) {
    // handle your sqlite errors
} finally {
    db.endTransaction();
}
And don't use InsertHelper; it's deprecated now.
Here are some general tips that might help you:
You can bulkInsert or applyBatch using ContentProviders to do a bunch of operations in one go:
How to use bulkInsert() function in android?
You can use transactions to speed things up as well:
Android Database Transaction
In some cases DatabaseUtils.InsertHelper has been known to provide faster inserts than the normal sqlite insert:
http://www.outofwhatbox.com/blog/2010/12/android-using-databaseutils-inserthelper-for-faster-insertions-into-sqlite-database/
After this, you'll have to do some benchmarking and optimize for your specific situation, analyzing performance vs. data integrity tradeoffs, etc. Good luck.
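For instance, a bulkInsert() through a ContentResolver could look like the sketch below; StoreContract.CONTENT_URI is a hypothetical provider URI, and the speedup only materializes if the provider overrides bulkInsert() to wrap its loop in a transaction (the default implementation just calls insert() once per row):
// Build all the rows up front, then hand them over in one call.
ContentValues[] rows = new ContentValues[str.size()];
for (int i = 0; i < str.size(); i++) {
    ContentValues v = new ContentValues();
    v.put(VARIANT_ID, str.get(i).get(0));
    v.put(VARIANT_NAME, str.get(i).get(3));
    rows[i] = v;
}
// one provider call instead of one per row
int inserted = getContentResolver().bulkInsert(StoreContract.CONTENT_URI, rows);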

Ormlite Android bulk inserts

Can anyone explain why my inserts are taking so long with ORMLite? Doing 1,700 inserts in one SQLite transaction on the desktop takes less than a second. However, when using ORMLite for Android, it's taking about 70 seconds, and I can see each insert in the debugging messages.
When I try to wrap the inserts into one transaction, it goes at exactly the same speed. I understand that there is overhead both for Android and for ORMLite; however, I wouldn't expect it to be that great. My code is below:
this.db = new DatabaseHelper(getApplicationContext());
dao = db.getAddressDao();
final BufferedReader reader = new BufferedReader(
        new InputStreamReader(getResources().openRawResource(R.raw.poi)));
try {
    dao.callBatchTasks(new Callable<Void>() {
        public Void call() throws Exception {
            String line;
            while ((line = reader.readLine()) != null) {
                String[] columns = line.split(",");
                Address address = new Address();
                // setup Address
                dao.create(address);
            }
            return null;
        }
    });
} catch (SQLException e) {
    e.printStackTrace();
} catch (Exception e) {
    e.printStackTrace();
}
I've had the same problem, and found a reasonable workaround. This took insert time from 2 seconds to 150ms:
final OrmLiteSqliteOpenHelper myDbHelper = ...;
final SQLiteDatabase db = myDbHelper.getWritableDatabase();
db.beginTransaction();
try {
    // do ormlite stuff as usual, no callBatchTasks() needed
    db.setTransactionSuccessful();
} finally {
    db.endTransaction();
}
Update:
Just tested this on an Xperia M2 Aqua (Android 4.4/ARM) and callBatchTasks() is actually faster: 90 ms vs. 120 ms. So I think more details are in order.
We have 3 tables/classes/DAOs: Parent, ChildWrapper, Child.
Relations: Parent to ChildWrapper - 1 to n, ChildWrapper to Child - n to 1.
Code goes like this:
void saveData(xml) {
    for (parents in xml) {
        parentDao.createOrUpdate(parent);
        for (children in parentXml) {
            childDao.createOrUpdate(child);
            childWrapperDao.createOrUpdate(generateWrapper(parent, child));
        }
    }
}
I originally got the speed-up on a specific Android 4.2/MIPS set-top box (STB).
callBatchTasks() was the first option because that's what we use throughout all the code, and it works well.
parentDao.callBatchTasks(
// ...
saveData();
// ...
);
But inserts were slow, so we tried nesting callBatchTasks() for every DAO used, setting autocommit off, startThreadConnection(), and probably something else I don't remember at the moment. To no avail.
From my own experience and other similar posts, it seems the problem occurs when several tables/DAOs are involved, and it has something to do with implementation specifics of Android (or SQLite) on particular devices.
Unfortunately, this may be "expected". I get similar performance when I do that number of inserts under my emulator as well. The batch-tasks and turning off auto-commit don't seem to help.
If you are looking to load a large amount of data into a database, you might consider replaying a database dump instead. See here:
Android OrmLite pre-populate database
My guess would be that you are slowed somewhat because you are doing two IO tasks at one time (at least in the code shown above): you are reading from a file and writing to a database (which is also a file). Also, from what I understand, transactions should be a reasonable size; 1,700 seems like a very high number. I would start with 100, but play around with the size.
So essentially I suggest you "chunk" your reads and inserts:
read 100 lines into a temporary array, then insert those 100; then read the next 100, then insert, and so on. See the sketch below.
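A sketch of that chunking idea using the question's DAO; flushChunk() is an illustrative helper, and 100 is only a starting size to experiment with:
// Chunked import: read N lines, insert them as one batch, repeat.
private static final int CHUNK_SIZE = 100;

void importAddresses(BufferedReader reader, final Dao<Address, Integer> dao) throws Exception {
    final List<Address> chunk = new ArrayList<Address>(CHUNK_SIZE);
    String line;
    while ((line = reader.readLine()) != null) {
        Address address = new Address();
        // ... populate address from the CSV line ...
        chunk.add(address);
        if (chunk.size() == CHUNK_SIZE) {
            flushChunk(dao, chunk);
        }
    }
    flushChunk(dao, chunk);  // insert whatever is left over
}

private void flushChunk(final Dao<Address, Integer> dao, final List<Address> chunk) throws Exception {
    if (chunk.isEmpty()) return;
    dao.callBatchTasks(new Callable<Void>() {  // one batch/transaction per chunk
        public Void call() throws Exception {
            for (Address a : chunk) {
                dao.create(a);
            }
            return null;
        }
    });
    chunk.clear();
}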

SQLite Android Database Cursor window allocation of 2048 kb failed

I have a routine that runs different queries against an SQLite database many times per second. After a while I would get the error
"android.database.CursorWindowAllocationException: - Cursor window allocation of 2048 kb failed. # Open Cursors = " in LogCat.
I had the app log memory usage, and indeed when usage reaches a certain limit I get this error, implying it runs out. My intuition tells me that the database engine is creating a NEW buffer (CursorWindow) every time I run a query, and even though I .close() the cursors, neither the garbage collector nor SQLiteDatabase.releaseMemory() is quick enough at freeing memory. I think the solution may lie in "forcing" the database to always write into the same buffer and not create new ones, but I have been unable to find a way to do this. I have tried instantiating my own CursorWindow and tried setting SQLiteCursor to it, but to no avail.
Any ideas?
EDIT: example code, re the request from @GrahamBorland:
public static CursorWindow cursorWindow = new CursorWindow("cursorWindow");
public static SQLiteCursor sqlCursor;

public static void getItemsVisibleArea(GeoPoint mapCenter, int latSpan, int lonSpan) {
    String query = "SELECT * FROM Items"; // would be more complex in real code
    sqlCursor = (SQLiteCursor) db.rawQuery(query, null);
    sqlCursor.setWindow(cursorWindow);
}
Ideally I would like to be able to call .setWindow() before issuing a new query, and have the data put into the same CursorWindow every time I get new data.
Most often the cause of this error is a cursor that is never closed. Make sure you close all cursors after using them (even in the case of an error):
Cursor cursor = null;
try {
    cursor = db.query(...
    // do some work with the cursor here.
} finally {
    // this gets called even if there is an exception somewhere above
    if (cursor != null)
        cursor.close();
}
To make your app crash when you are not closing a cursor, you can enable StrictMode with detectLeakedSqlLiteObjects in your Application's onCreate():
StrictMode.VmPolicy policy = new StrictMode.VmPolicy.Builder()
        .detectLeakedClosableObjects()
        .detectLeakedSqlLiteObjects()
        .penaltyDeath()
        .penaltyLog()
        .build();
StrictMode.setVmPolicy(policy);
Obviously you would only enable this for debug builds.
If you're having to dig through a significant amount of SQL code, you may be able to speed up your debugging by putting the following snippet in your MainActivity to enable StrictMode. If leaked database objects are detected, your app will crash with log info highlighting exactly where the leak is. This helped me locate a rogue cursor in a matter of minutes.
@Override
protected void onCreate(Bundle savedInstanceState) {
    if (BuildConfig.DEBUG) {
        StrictMode.setVmPolicy(new StrictMode.VmPolicy.Builder()
                .detectLeakedSqlLiteObjects()
                .detectLeakedClosableObjects()
                .penaltyLog()
                .penaltyDeath()
                .build());
    }
    super.onCreate(savedInstanceState);
    ...
I have just experienced this issue, and the suggested answer of closing the cursor, while valid, was not how I fixed it. My issue was closing the database while SQLite was trying to repopulate its cursor. I would open the database, query it to get a cursor to a data set, close the database, and then iterate over the cursor. I noticed that whenever I hit a certain record in that cursor, my app would crash with the same error as in the OP.
I assume that for the cursor to access certain records, it needs to re-query the database, and if the database is closed, it throws this error. I fixed it by not closing the database until I had completed all the work I needed.
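In other words, the database has to outlive every cursor read from it. A minimal sketch of the safe ordering (helper and the Items table are assumed names):
SQLiteDatabase db = helper.getReadableDatabase();
Cursor cursor = null;
try {
    cursor = db.query("Items", null, null, null, null, null, null);
    while (cursor.moveToNext()) {
        // consume each row; the cursor may lazily re-query the db here,
        // so the database must still be open
    }
} finally {
    if (cursor != null) cursor.close(); // close the cursor first...
    db.close();                         // ...and the database only after all cursor work
}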
There is indeed a maximum size that an Android SQLite cursor window can take, and that is 2 MB; anything larger results in the above error. Mostly, this error is caused either by a large image byte array stored as a blob in the SQL database or by overly long strings. Here is how I fixed it.
Create a Java class, e.g. FixCursorWindow, and put the code below in it:
public static void fix() {
    try {
        Field field = CursorWindow.class.getDeclaredField("sCursorWindowSize");
        field.setAccessible(true);
        field.set(null, 102400 * 1024); // the 102400 is the new size added
    } catch (Exception e) {
        e.printStackTrace();
    }
}
Now go to your Application class (create one if you don't have it already) and make a call to FixCursorWindow, like this:
public class App extends Application {
    @Override
    public void onCreate() {
        super.onCreate();
        FixCursorWindow.fix();
    }
}
Finally, ensure you include your Application class in your manifest's application tag, like this:
android:name=".App">
That's all; it should work now.
If you're running Android P or later, you can create your own cursor window like this:
if (cursor instanceof SQLiteCursor && Build.VERSION.SDK_INT >= Build.VERSION_CODES.P) {
    ((SQLiteCursor) cursor).setWindow(new CursorWindow(null, 1024 * 1024 * 10));
}
This allows you to modify the cursor window size for a specific cursor without resorting to reflection.
Here is @whlk's answer with Java 7 automatic resource management (try-with-resources) instead of a try-finally block:
try (Cursor cursor = db.query(...)) {
    // do some work with the cursor here.
}
This is a common exception, especially when using an external SQLite database. You can resolve it by closing the Cursor object, like so:
if (myCursor != null)
    myCursor.close();
The point is: if the cursor is open and holding memory, close it when you are done with it, so the memory is freed, the application stays faster, and database-related functionality is improved.

Android SQLite query crashing when it takes too long?

I have an SQLite query in my Android app that seems to crash when it takes too long to execute. It crashes with a NullPointerException and tells me the line number...
When I put breakpoints around that line and can see that the variable always gets filled, the app does not crash and does what it is supposed to.
So aside from having a phantom null pointer, it appears the problem is that the breakpoints actually slow things down, giving the query time to complete. Without breakpoints it always crashes, without fail.
Others here seem to have a similar problem, and I've read some things about SQLite taking an erratic amount of time to complete tasks, but this table should only ever have a few entries in it (the one I'm testing has only three entries, 4 columns).
Any suggestions on how to make it not crash? Perhaps put a thread wait inside the method that makes the query?
public void fetchItemsToRemove() throws SQLException {
    Cursor mCursor = mapDb.query(myMain_TABLE,
            new String[] {myOtherId, myCustomID, myDATE},
            null, null, null, null, null);
    if (mCursor.moveToFirst()) {
        do {
            /* taking "dates" that were stored as plain text strings, and converting them
             * to Date objects in a particular format for comparison */
            String DateCompareOld = mCursor.getString(mCursor.getColumnIndex(myDATE));
            String DateCompareCurrent = "";
            Date newDate = new Date();
            DateCompareCurrent = newDate.toString();
            try {
                DateCompareOld = (String) DateCompareOld.subSequence(0, 10);
                DateCompareCurrent = (String) DateCompareCurrent.subSequence(0, 10);
                SimpleDateFormat dateType = new SimpleDateFormat("EEE MMM dd");
                Date convertDate = dateType.parse(DateCompareOld);
                newDate = dateType.parse(DateCompareCurrent);
                if (convertDate.compareTo(newDate) < 0) {
                    // remove unlim id
                    mapDb.delete(myMain_TABLE,
                            myDATE + "=" + mCursor.getString(mCursor.getColumnIndex(myDATE)),
                            null);
                }
            } catch (ParseException e) {
                e.printStackTrace();
            }
        } while (mCursor.moveToNext());
        mCursor.close();
    } else {
        mCursor.close();
    }
}
Now "line 342" where it crashes with NullPointerException is DateCompareOld = (String)DateCompareOld.subSequence(0, 10); where it gets a subsequence of the string. If it gets here and is null, this means the string was never filled at String DateCompareOld = mCursor.getString(mCursor.getColumnIndex(myDATE));
as if the query just got skipped because it took too long. Do note this is in a while loop, and I have done tests to make sure that the mCursor never goes out of bounds.
You're deleting things from a DB table whilst iterating over the results of a query from that table. Sounds a bit dangerous.
Try building a list, inside the loop, of things to be deleted, and then delete them in a single go after the loop finishes.
Also, wrap the entire thing in a DB transaction. When you're modifying the DB in a loop, that can make a huge difference to performance.
EDIT: a quick explanation of transactions:
A transaction allows you to combine a bunch of DB queries/modifications into a single atomic operation which either succeeds or fails. It's primarily a safety mechanism so your DB isn't stuck in an inconsistent state if something goes wrong half way through, but it also means that any modifications are committed to the DB's file storage in a single shot rather than one at a time, which is much faster.
You start the transaction at the start of your function:
public void fetchItemsToRemove() throws SQLException {
    db.beginTransaction();
    Cursor mCursor = ....
You set it as successful if the whole function completes without errors. This probably means you want to remove the inner try/catch and have an outer try/catch enclosing the loop. Then at the end of the try{ }, you can assume nothing's gone wrong, so you call:
db.setTransactionSuccessful();
Then, in a finally clause, to make sure you always close the transaction whether it's successful or otherwise:
db.endTransaction();
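Putting the pieces together, the reworked method might look like the sketch below; isExpired() is a hypothetical helper standing in for the question's date-parsing comparison, and binding the date as a ? parameter also avoids the string-quoting problem in the original delete call:
public void fetchItemsToRemove() throws SQLException {
    mapDb.beginTransaction();
    Cursor mCursor = null;
    try {
        mCursor = mapDb.query(myMain_TABLE,
                new String[] {myOtherId, myCustomID, myDATE},
                null, null, null, null, null);
        List<String> datesToDelete = new ArrayList<String>();
        while (mCursor.moveToNext()) {
            String storedDate = mCursor.getString(mCursor.getColumnIndex(myDATE));
            if (isExpired(storedDate)) {       // the question's date comparison goes here
                datesToDelete.add(storedDate);
            }
        }
        // delete only after the cursor iteration is finished
        for (String d : datesToDelete) {
            mapDb.delete(myMain_TABLE, myDATE + " = ?", new String[] {d});
        }
        mapDb.setTransactionSuccessful();
    } finally {
        if (mCursor != null) mCursor.close();
        mapDb.endTransaction();
    }
}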

How to efficiently manage search suggestion using Android QSB?

I am trying to make a dictionary using the Quick Search Box in Android. As shown in the SearchableDictionary tutorial, it loads all the data (999 definitions) and matches the input text against it to produce search suggestions. In my case, I have 26,963 rows of data that need to be suggested while the user inputs a word in the QSB. Therefore, I want to grab the character data one by one from the QSB, so that only the necessary suggestions are loaded. How can I do this?
Here's the code I use:
bringit(200);
if (Intent.ACTION_VIEW.equals(intent.getAction())) {
    // from click on search results
    //Dictionary.getInstance().ensureLoaded(getResources());
    String word = intent.getDataString();
    //if (word.length() > 3) { bringit(10); }
    Dictionary.Word theWord = Dictionary.getMatches(word).get(0);
    launchWord(theWord);
    finish();
} else if (Intent.ACTION_SEARCH.equals(intent.getAction())) {
    String query = intent.getStringExtra(SearchManager.QUERY);
    mTextView.setText(getString(R.string.search_results, query));
    WordAdapter wordAdapter = new WordAdapter(Dictionary.getMatches(query));
    //letsCount(query);
    mList.setAdapter(wordAdapter);
    mList.setOnItemClickListener(wordAdapter);
}
Log.d("dict", intent.toString());
if (intent.getExtras() != null) {
    Log.d("dict", intent.getExtras().keySet().toString());
}
}

private void letsCount(String query) {
    for (int i = 0; i < query.length(); i++) {
        definite[i] = query.charAt(i);
    }
}

public void bringit(int sum) {
    String[] ss = new String[10];
    Log.d("dict", "loading words");
    for (int i = 1; i <= sum; i++) {
        KamusDbAdapter a = new KamusDbAdapter(getApplicationContext());
        a.open();
        Cursor x = a.quick(String.valueOf(i));
        startManagingCursor(x);
        if (x.moveToFirst()) {
            ss[0] = x.getString(1);
            ss[1] = x.getString(2);
        }
        Dictionary.addWord(ss[0].trim(), ss[1].trim());
        Log.v("Debug", ss[0] + " " + ss[1]);
        //onStop();
    }
}
I use SQLite to collect the data, and the other code is the same as in the tutorial.
Retrieving a cursor is generally slow. You only want to retrieve one cursor that contains all the matching results.
You should perform the searching using SQL rather than fetching everything. A full-text search is usually fastest for text matching; it is slightly more complicated to implement than a simple LIKE, but I highly recommend you give it a try.
So you want to execute an SQL statement like:
SELECT * FROM my_table WHERE subject_column MATCH 'something'
See the SQLite FTS extension for more information. You can also use wildcards to match part of a word.
In terms of search suggestions, there is really no point returning more than ~100 results, since generally no user ever bothers to scroll down that far, so you can speed things up further by adding LIMIT 0, 100 to the end of your SQL statement.
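A sketch of such a query from Java; the words table and its word column are assumed names for an FTS3 virtual table (created once with something like CREATE VIRTUAL TABLE words USING fts3(word, definition)):
// prefix match ("foo*") against the FTS table, capped at 100 suggestions
Cursor suggestions = db.rawQuery(
        "SELECT * FROM words WHERE word MATCH ? LIMIT 0, 100",
        new String[] { query + "*" });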
If possible, only start fetching cursors once the user has entered more than X characters (usually 3, though in your case this may not be appropriate). That way you're not performing searches that could potentially match thousands of items.
You also seem to be leaving lots of cursors open until the application closes them, even though you don't actually need them anymore: instead of calling startManagingCursor, just make sure to call x.close() after your if (x.moveToFirst()) { ... } block; this will free up memory faster.
On an unrelated note: please don't name your variables and methods things like ss or bringit(), as it makes code hard to read. What is ss, and what does bringit() bring, exactly?
You could have a look at the full-text search extension in SQLite. The idea is to have a SQL query that fetches only the matching results, rather than fetching all the results and then filtering.
There is also a sample for the Android SDK: com/example/android/searchabledict/DictionaryDatabase
