I have an SQLite query in my Android app that seems to crash when it takes too long to execute. It crashes with a NullPointerException and tells me the line number...
When I put breakpoints around that line and step through, the variable always gets filled, the app does not crash, and it does what it is supposed to.
So aside from having a phantom null pointer, it appears the problem is that the breakpoints slow things down enough to give the query time to complete. Without breakpoints it crashes without fail.
Others here seem to have a similar problem, and I've read some things about SQLite taking an erratic amount of time to complete tasks, but this table should only ever have a few entries in it (the one I'm testing should only have three entries, 4 columns).
Suggestions on how to make it not crash? Perhaps put a thread wait inside the method that makes the query?
public void fetchItemsToRemove() throws SQLException {
    Cursor mCursor =
            mapDb.query(myMain_TABLE, new String[] {myOtherId, myCustomID, myDATE}, null, null, null, null, null);
    if (mCursor.moveToFirst())
    {
        do
        {
            /* taking "dates" that were stored as plain text strings, and converting them to
             * Date objects in a particular format for comparison */
            String DateCompareOld = mCursor.getString(mCursor.getColumnIndex(myDATE));
            String DateCompareCurrent = "";
            Date newDate = new Date();
            DateCompareCurrent = newDate.toString();
            try {
                DateCompareOld = (String)DateCompareOld.subSequence(0, 10);
                DateCompareCurrent = (String)DateCompareCurrent.subSequence(0, 10);
                SimpleDateFormat dateType = new SimpleDateFormat("EEE MMM dd");
                Date convertDate = dateType.parse(DateCompareOld);
                newDate = dateType.parse(DateCompareCurrent);
                if (convertDate.compareTo(newDate) < 0)
                {
                    // remove unlim id
                    mapDb.delete(myMain_TABLE, myDATE + "=" + mCursor.getString(mCursor.getColumnIndex(myDATE)), null);
                }
            } catch (ParseException e) {
                // TODO Auto-generated catch block
                e.printStackTrace();
            }
        } while (mCursor.moveToNext());
        mCursor.close();
    }
    else
    {
        mCursor.close();
    }
}
Now "line 342" where it crashes with NullPointerException is DateCompareOld = (String)DateCompareOld.subSequence(0, 10); where it gets a subsequence of the string. If it gets here and is null, this means the string was never filled at String DateCompareOld = mCursor.getString(mCursor.getColumnIndex(myDATE));
as if the query just got skipped because it took too long. Do note this is in a while loop, and I have done tests to make sure that the mCursor never goes out of bounds.
You're deleting things from a DB table whilst iterating over the results of a query from that table. Sounds a bit dangerous.
Try building a list, inside the loop, of things to be deleted, and then delete them in a single go after the loop finishes.
Also, wrap the entire thing in a DB transaction. When you're modifying the DB in a loop, that can make a huge difference to performance.
EDIT: a quick explanation of transactions:
A transaction allows you to combine a bunch of DB queries/modifications into a single atomic operation which either succeeds or fails. It's primarily a safety mechanism so your DB isn't stuck in an inconsistent state if something goes wrong half way through, but it also means that any modifications are committed to the DB's file storage in a single shot rather than one at a time, which is much faster.
You start the transaction at the start of your function:
public void fetchItemsToRemove() throws SQLException{
db.beginTransaction();
Cursor mCursor = ....
You set it as successful if the whole function completes without errors. This probably means you want to remove the inner try/catch and have an outer try/catch enclosing the loop. Then at the end of the try{ }, you can assume nothing's gone wrong, so you call:
db.setTransactionSuccessful();
Then, in a finally clause, to make sure you always close the transaction whether it's successful or otherwise:
db.endTransaction();
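Putting both suggestions together, here is a rough sketch (not a drop-in replacement) of how fetchItemsToRemove() could be restructured, assuming mapDb is a SQLiteDatabase and the same column constants as in the question: collect the dates to delete while iterating, then delete them inside the transaction once the cursor is exhausted.
public void fetchItemsToRemove() throws SQLException {
    List<String> datesToDelete = new ArrayList<String>();
    SimpleDateFormat dateType = new SimpleDateFormat("EEE MMM dd");

    mapDb.beginTransaction();
    Cursor cursor = null;
    try {
        cursor = mapDb.query(myMain_TABLE,
                new String[] {myOtherId, myCustomID, myDATE},
                null, null, null, null, null);

        Date now = dateType.parse(new Date().toString().substring(0, 10));

        while (cursor.moveToNext()) {
            String stored = cursor.getString(cursor.getColumnIndex(myDATE));
            if (stored == null || stored.length() < 10) {
                continue; // skip malformed rows instead of crashing
            }
            Date storedDate = dateType.parse(stored.substring(0, 10));
            if (storedDate.compareTo(now) < 0) {
                datesToDelete.add(stored); // remember it, delete after the loop
            }
        }

        // Delete after iteration, using a parameterised WHERE clause.
        for (String d : datesToDelete) {
            mapDb.delete(myMain_TABLE, myDATE + "=?", new String[] {d});
        }
        mapDb.setTransactionSuccessful();
    } catch (ParseException e) {
        e.printStackTrace();
    } finally {
        if (cursor != null) cursor.close();
        mapDb.endTransaction();
    }
}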
Related
This is my first application with a database, and I hope that someone can help me understand this problem. I have this insert method:
public long insertData(String name, int password) {
    ....
    contentValues.put(KEY_NAME, name);
    contentValues.put(KEY_PASSWORD, password);
    return db.insert(DBHelper.TABle_NAME, null, contentValues);
}
I can insert a few rows with this method, but what if I have thousands of rows? How can I insert all of that data into the database? Where should I write all of this data, in an extra class or what?
As others have said, you'll need to do some sort of iteration.
Efficiency can be gained by performing a bulk transaction. Here's an example:
public int bulkInsert(@NonNull ContentValues[] values) {
    int insertCount = 0;
    SQLiteDatabase db = mSqlHelper.getWritableDatabase();
    try {
        db.beginTransaction();
        for (ContentValues value : values) {
            if (db.insertOrThrow(tableName, null, value) == -1) {
                throw new Exception("Unknown error while inserting entry in database.");
            }
            insertCount++;
        }
        db.setTransactionSuccessful();
    } catch (Exception e) {
        Log.e(LOG_TAG, "An error occurred while bulk-inserting database entries.\n" + e.getMessage(), e);
    } finally {
        db.endTransaction();
    }
    return insertCount;
}
There is no 'bulk load' facility that I'm aware of.
You'd just have to spin through the list, and insert the items.
You might want to think about why you're potentially trying to insert thousands of items into a database on a hardware-limited device like a phone or a tablet.
Might it be better to put the data on a server, and create an API that you can use to load data (for display) by pages?
You can do it the same way you do with a small amount of data; you just need to loop over the thousands of rows and insert them into your database using your method. You can use an AsyncTask or a service to do that.
You can use the same method to insert any number of records, whether it's 1 or 1,000. Use a loop to call your insert method and add your records to your database. Consider putting your database executions in an AsyncTask to prevent your UI thread from hanging.
Your data can come from anywhere, as long as it's formatted to fit your method's parameters (String, int).
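A minimal sketch of that idea, assuming the insertData(String, int) method from the question and a hypothetical in-memory list called users:
// Runs the inserts off the UI thread. 'User', 'users' and the getters are
// hypothetical stand-ins for wherever your data actually comes from.
new AsyncTask<Void, Void, Void>() {
    @Override
    protected Void doInBackground(Void... params) {
        for (User u : users) {
            insertData(u.getName(), u.getPassword());
        }
        return null;
    }
}.execute();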
I am new to Android and maybe it's a silly question, but I am not getting it. I am designing a game in which we give scores to some persons, so I want to store the names of the persons in a database at installation time, with their scores initially set to 0, to be updated according to what the users select. What I can't figure out is how to enter the data, as it will be around 100 names and their scores. Using INSERT INTO statements would take something like 100 statements. So is there any shorter method, like doing it through strings or something? Just guessing, though. Any help would be appreciated.
You don't hard-code names or scores into your SQL statements. Instead, you use parameters.
var command = new SQLiteCommand();
command.CommandText = "INSERT INTO Scores (name, score) VALUES (@name, @score)";
command.CommandType = CommandType.Text;
foreach (var item in data)
{
    command.Parameters.Clear(); // reset the parameters for each row
    command.Parameters.Add(new SQLiteParameter("@name", item.Name));
    command.Parameters.Add(new SQLiteParameter("@score", item.Score));
    command.ExecuteNonQuery();
}
and then just loop through all of the names and scores.
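The snippet above is desktop-style C# (System.Data.SQLite); on Android the same parameterised-insert idea looks roughly like this with a compiled SQLiteStatement (the table and column names, the Player class and the players list are just placeholders):
// Sketch: parameterised bulk insert on Android, assuming a SQLiteDatabase 'db'.
SQLiteStatement stmt = db.compileStatement(
        "INSERT INTO Scores (name, score) VALUES (?, ?)");
db.beginTransaction();
try {
    for (Player p : players) {
        stmt.clearBindings();
        stmt.bindString(1, p.getName());
        stmt.bindLong(2, p.getScore());
        stmt.executeInsert();
    }
    db.setTransactionSuccessful();
} finally {
    db.endTransaction();
}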
I recommend using a transaction.
You can achieve this by starting a transaction with beginTransaction(), doing all the inserts in makeAllInserts() with a loop and, if everything works, calling setTransactionSuccessful() so it all commits as one batch operation. If something goes wrong, the finally section will call endTransaction() without the success flag having been set, which executes a rollback.
db.beginTransaction();
try {
    makeAllInserts();
    db.setTransactionSuccessful();
} catch (Exception e) {
    // error somewhere in the middle of the database transaction
} finally {
    db.endTransaction();
}
For the makeAllInserts function, something like this could work out:
public void makeAllInserts() {
    myDataBase = openDatabase(); // open the database once, not once per row
    for (int i = 0; i < myData.size(); i++) {
        ContentValues values = new ContentValues();
        values.put("name", myData.get(i).getName());
        values.put("score", myData.get(i).getScore());
        myDataBase.insert("MYTABLE", nullColumnHack, values);
    }
}
If you also want to know about the nullColumnHack, here is a good link -> https://stackoverflow.com/a/2663620/709671
Hope it helps.
I've been searching for hours now for a solution to this problem:
At the very beginning of my Android app, a layout with buttons is shown to the user. If he clicks on the button "Tasks", a ListView should pop up (another activity and layout) to show him all available tasks, and with a click on one he can do even more things, but they're not necessary for my problem. The point is, the app won't get any data out of the database, but when I Step Into or Step Over the lines which call a method for all the DB stuff, it works.
Here are the necessary lines:
if (connection1.OpenDatabase(1, getDataBaseName()))
{
    CTask = connection1.DBQueryTable(getDataBaseName(), "Tasks", TempFieldT);
    CEquipment = connection1.DBQueryTable(getDataBaseName(), "Equipment", TempFieldE);
    connection1.CloseDatabase();
}
So the app will run over those lines and execute the lines beneath, but won't give any data back when I'm not supervising it with breakpoints and stepping through; when I do, everything works the way it should.
This is the database code the app runs through at this point:
public Android.Database.ICursor DBQueryTable(string DataBaseName, string TableName, string[] Fields)
{
    FindDBPath(DataBaseName);
    Android.Database.ICursor c;
    string TempF = "";
    foreach (string n in Fields)
    {
        TempF += n + ",";
    }
    SQLQuery = "SELECT " + TempF.TrimEnd(',') + " FROM " + TableName;
    c = sqldTemp.RawQuery(SQLQuery, null);
    return c;
}
So why does the app/compiler/debugger behave like this? Is there a mistake I made that I can't figure out right now?
PS: Yes, I know there is a query function, but that's not the point here unless it would provide a solution to my problem.
Your DBQueryTable method returns a cursor. That will become invalidated as soon as you close the connection in the following line:
connection1.CloseDatabase();
You should keep the connection open for as long as you need the cursor. For example, you could fetch all the data from the cursor and then close the connection.
Can anyone explain why my inserts are taking so long in ORMLite? Doing 1,700 inserts in one SQLite transaction on the desktop takes less than a second. However, when using ORMLite for Android, it's taking about 70 seconds, and I can see each insert in the debugging messages.
When I try and wrap the inserts into one transaction it goes at exactly the same speed. I understand that there is overhead both for Android and for Ormlite, however, I wouldn't expect it to be that great. My code is below:
this.db = new DatabaseHelper(getApplicationContext());
dao = db.getAddressDao();
final BufferedReader reader = new BufferedReader(
        new InputStreamReader(getResources().openRawResource(R.raw.poi)));
try {
    dao.callBatchTasks(new Callable<Void>() {
        public Void call() throws Exception {
            String line;
            while ((line = reader.readLine()) != null) {
                String[] columns = line.split(",");
                Address address = new Address();
                // setup Address
                dao.create(address);
            }
            return null;
        }
    });
} catch (SQLException e) {
    e.printStackTrace();
} catch (Exception e) {
    e.printStackTrace();
}
I've had the same problem, and found a reasonable workaround. This took insert time from 2 seconds to 150ms:
final OrmLiteSqliteOpenHelper myDbHelper = ...;
final SQLiteDatabase db = myDbHelper.getWritableDatabase();
db.beginTransaction();
try {
    // do ormlite stuff as usual, no callBatchTasks() needed
    db.setTransactionSuccessful();
} finally {
    db.endTransaction();
}
Update:
Just tested this on Xperia M2 Aqua (Android4.4/ARM) and callBatchTasks() is actually faster. 90ms vs 120ms. So I think more details are in order.
We have 3 tables/classes/DAOs: Parent, ChildWrapper, Child.
Relations: Parent to ChildWrapper - 1 to n, ChildWrapper to Child - n to 1.
Code goes like this:
void saveData(xml){
for (parents in xml){
parentDao.createOrUpdate(parent);
for (children in parentXml){
childDao.createOrUpdate(child);
childWrapperDao.createOrUpdate(generateWrapper(parent, child));
}
}
}
I originally got the speed-up on a specific Android 4.2/MIPS set-top box (STB).
callBatchTasks() was the first option because that's what we use throughout all the code and it works well.
parentDao.callBatchTasks(
// ...
saveData();
// ...
);
But inserts were slow, so we tried nesting callBatchTasks() for every DAO used, setting autocommit off, startThreadConnection, and probably something else I don't remember at the moment. To no avail.
From my own experience and other similar posts, it seems the problem occurs when several tables/DAOs are involved, and it has something to do with implementation specifics of Android (or SQLite) on concrete devices.
Unfortunately, this may be "expected". I get similar performance when I do that number of inserts under my emulator as well. The batch-tasks and turning off auto-commit don't seem to help.
If you are looking to load a large amount of data into a database, you might consider replaying a database dump instead. See here:
Android OrmLite pre-populate database
My guess would be that you are slowing down somewhat because you are doing two IO tasks at the same time (at least in the code shown above): you are reading from a file and writing to a database (which is also a file). Also, from what I understand, transactions should be a reasonable size; 1,700 seems like a very high number. I would start with 100 but play around with the size.
So essentially I suggest you "chunk" your reads and inserts.
Read 100 lines into a temporary array, then insert those 100. Then read the next 100, insert again, and so on; see the sketch below.
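A rough sketch of that chunking idea, reusing the dao, Address and reader setup from the question (CHUNK_SIZE and the two helper method names are just illustrative, to tune and adapt):
// Read the poi file in chunks and insert each chunk as one batch task,
// which ORMLite runs inside a single transaction.
private static final int CHUNK_SIZE = 100;

private void importInChunks(BufferedReader reader) throws Exception {
    List<Address> chunk = new ArrayList<Address>(CHUNK_SIZE);
    String line;
    while ((line = reader.readLine()) != null) {
        String[] columns = line.split(",");
        Address address = new Address();
        // setup Address from columns, as before
        chunk.add(address);
        if (chunk.size() == CHUNK_SIZE) {
            insertChunk(chunk);
            chunk.clear();
        }
    }
    if (!chunk.isEmpty()) {
        insertChunk(chunk); // flush the remainder
    }
}

private void insertChunk(List<Address> chunk) throws Exception {
    final List<Address> batch = new ArrayList<Address>(chunk);
    dao.callBatchTasks(new Callable<Void>() {
        public Void call() throws Exception {
            for (Address address : batch) {
                dao.create(address);
            }
            return null;
        }
    });
}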
I have a routine that runs different queries against an SQLite database many times per second. After a while I would get the error
"android.database.CursorWindowAllocationException: - Cursor window allocation of 2048 kb failed. # Open Cursors = " appear in LogCat.
I had the app log memory usage, and indeed when usage reaches a certain limit I get this error, implying it runs out. My intuition tells me that the database engine is creating a NEW buffer (CursorWindow) every time I run a query, and even though I .close() the cursors, neither the garbage collector nor SQLiteDatabase.releaseMemory() is quick enough at freeing memory. I think the solution may lie in "forcing" the database to always write into the same buffer and not create new ones, but I have been unable to find a way to do this. I have tried instantiating my own CursorWindow and tried setting the SQLiteCursor to it, but to no avail.
Any ideas?
EDIT: re example code request from @GrahamBorland:
public static CursorWindow cursorWindow = new CursorWindow("cursorWindow");
public static SQLiteCursor sqlCursor;

public static void getItemsVisibleArea(GeoPoint mapCenter, int latSpan, int lonSpan) {
    query = "SELECT * FROM Items"; // would be more complex in real code
    sqlCursor = (SQLiteCursor) db.rawQuery(query, null);
    sqlCursor.setWindow(cursorWindow);
}
Ideally I would like to be able to .setWindow() before issuing a new query, and have the data put into the same CursorWindow every time I get new data.
Most often the cause of this error is non-closed cursors. Make sure you close all cursors after using them (even in the case of an error).
Cursor cursor = null;
try {
    cursor = db.query(...
    // do some work with the cursor here.
} finally {
    // this gets called even if there is an exception somewhere above
    if (cursor != null)
        cursor.close();
}
To make your app crash when you are not closing a cursor, you can enable StrictMode with detectLeakedSqlLiteObjects in your Application's onCreate:
StrictMode.VmPolicy policy = new StrictMode.VmPolicy.Builder()
        .detectLeakedClosableObjects()
        .detectLeakedSqlLiteObjects()
        .penaltyDeath()
        .penaltyLog()
        .build();
StrictMode.setVmPolicy(policy);
Obviously you would only enable this for debug builds.
If you're having to dig through a significant amount of SQL code you may be able to speed up your debugging by putting the following code snippet in your MainActivity to enable StrictMode. If leaked database objects are detected then your app will now crash with log info highlighting exactly where your leak is. This helped me locate a rogue cursor in a matter of minutes.
@Override
protected void onCreate(Bundle savedInstanceState) {
    if (BuildConfig.DEBUG) {
        StrictMode.setVmPolicy(new StrictMode.VmPolicy.Builder()
                .detectLeakedSqlLiteObjects()
                .detectLeakedClosableObjects()
                .penaltyLog()
                .penaltyDeath()
                .build());
    }
    super.onCreate(savedInstanceState);
    ...
    ...
I have just experienced this issue, and the suggested answer of not closing the cursor, while valid, was not how I fixed it. My issue was closing the database while SQLite was trying to repopulate its cursor. I would open the database, query it to get a cursor to a data set, close the database, and then iterate over the cursor. I noticed that whenever I hit a certain record in that cursor, my app would crash with the same error as in the OP.
I assume that for the cursor to access certain records it needs to re-query the database, and if it is closed, it will throw this error. I fixed it by not closing the database until I had completed all the work I needed.
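In other words, don't close the database between the query and the iteration. A minimal sketch of the order that worked (the helper, table name and query are illustrative):
// Keep the database open until all cursor work is finished.
SQLiteDatabase db = dbHelper.getReadableDatabase(); // 'dbHelper' is your SQLiteOpenHelper
Cursor cursor = db.query("items", null, null, null, null, null, null);
try {
    while (cursor.moveToNext()) {
        // read whatever you need from the cursor here
    }
} finally {
    cursor.close(); // close the cursor first...
    db.close();     // ...then the database, once everything is done
}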
There is indeed a maximum size an Android SQLite cursor window can take, and that is 2 MB; anything more than this results in the above error. Mostly, this error is caused either by a large image byte array stored as a blob in the SQL database or by overly long strings. Here is how I fixed it.
Create a Java class, e.g. CursorWindowFixer, and put the code below in it.
public static void fix() {
    try {
        Field field = CursorWindow.class.getDeclaredField("sCursorWindowSize");
        field.setAccessible(true);
        field.set(null, 102400 * 1024); // raise the window size to 102400 KB (100 MB)
    } catch (Exception e) {
        e.printStackTrace();
    }
}
Now go to your application class (create one if you don't have one already) and make a call to CursorWindowFixer.fix() like this:
public class App extends Application {
    @Override
    public void onCreate() {
        super.onCreate();
        CursorWindowFixer.fix();
    }
}
Finally, make sure you register your application class in your manifest's <application> tag like this:
android:name=".App">
That's all, it should work perfectly now.
If you're running Android P, you can create your own cursor window like this:
if (cursor instanceof SQLiteCursor && Build.VERSION.SDK_INT >= Build.VERSION_CODES.P) {
    ((SQLiteCursor) cursor).setWindow(new CursorWindow(null, 1024 * 1024 * 10));
}
This allows you to modify the cursor window size for a specific cursor without resorting to reflections.
Here is @whlk's answer with Java 7 automatic resource management in a try-with-resources block:
try (Cursor cursor = db.query(...)) {
// do some work with the cursor here.
}
This is a normal exception, especially when using an external SQLite database. You can resolve it by closing the Cursor object as follows:
if (myCursor != null)
    myCursor.close();
What this means is: if the cursor is holding memory and is still open, close it. The application will be faster, the methods will use less memory, and the database-related functionality will also improve.
public class CursorWindowFixer {
    public static void fix() {
        try {
            Field field = CursorWindow.class.getDeclaredField("sCursorWindowSize");
            field.setAccessible(true);
            field.set(null, 102400 * 1024); // 102400 KB = 100 MB
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}