I'm doing both an update and an insert to a review_schedule table. Together they take between 1 and 2 seconds, which is a bit slow. I've tried it with and without indexes on the id and date_review fields. Note that I'm using a precompiled statement for the insert but not the update, because compiled UPDATE statements apparently aren't supported on Gingerbread. Here's the code:
public long insert(int id, String lastReviewDate, String nextReviewDate) {
    this.insertStmt.bindLong(1, id);
    this.insertStmt.bindString(2, lastReviewDate);
    this.insertStmt.bindString(3, nextReviewDate);
    return this.insertStmt.executeInsert();
}

public void updateSRSInfo(int id, String lastReviewDate,
        String nextReviewDate) {
    ContentValues contentValues = new ContentValues();
    contentValues.put("last_review_date", lastReviewDate);
    contentValues.put("next_review_date", nextReviewDate);
    this.myDataBase.update(DataBaseHelper.WORD_REVIEW_SCHEDULE,
            contentValues, "_id=?", new String[] { Integer.toString(id) });
}
Any suggestions appreciated.
If the length of the operation is making the app unresponsive, you can move the work onto a different thread.
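Moving the work off the main thread can be sketched with a single-threaded executor. This is plain Java, with a placeholder writeToDb standing in for the real insertStmt.executeInsert()/update() calls; on Android you could equally use a HandlerThread or AsyncTask:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class DbWriter {
    // A single background thread keeps writes serialized and off the UI thread.
    private static final ExecutorService DB_EXECUTOR = Executors.newSingleThreadExecutor();
    static final AtomicInteger completedWrites = new AtomicInteger();

    // Placeholder for the real work; on Android this is where
    // insertStmt.executeInsert() / myDataBase.update(...) would run.
    static void writeToDb(int id) {
        completedWrites.incrementAndGet();
    }

    // Called from the UI thread; returns immediately.
    public static void submitWrite(int id) {
        DB_EXECUTOR.execute(() -> writeToDb(id));
    }

    // Waits for all queued writes to finish (e.g. at shutdown).
    public static void awaitAll() throws InterruptedException {
        DB_EXECUTOR.shutdown();
        DB_EXECUTOR.awaitTermination(5, TimeUnit.SECONDS);
    }
}
```

A single-threaded executor also avoids concurrent writes contending for the database lock.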
At least two options:
1) Read uncommitted transactions: by default SQLite reads only committed data, which is slower. But you have to understand all the risks of reading uncommitted records. Changing the transaction isolation level can be done this way:
database.execSQL("PRAGMA read_uncommitted = true;");
2) Use table indexing
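For option 2, what matters is an index on the column your WHERE clauses filter by. Assuming the table and column names from the question (review_schedule, _id, date_review), the DDL would look like this; note that an INTEGER PRIMARY KEY _id is already indexed by SQLite, and every extra index slightly slows inserts:

```sql
-- Speeds up UPDATE ... WHERE _id = ? (redundant if _id is the
-- INTEGER PRIMARY KEY, which SQLite indexes automatically).
CREATE INDEX IF NOT EXISTS idx_review_schedule_id
    ON review_schedule(_id);

-- Helps queries that filter or sort by review date.
CREATE INDEX IF NOT EXISTS idx_review_schedule_date
    ON review_schedule(date_review);
```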
Since you already have an id, why not combine it with ContentUris.withAppendedId so you can access the record directly?
I'm not sure whether this runs faster or not, as I haven't tested it.
I saw in a tutorial a code updating data from a SQlite database using execSQL:
String update = "UPDATE FRUIT SET COLOR=? WHERE ID=?";
myDatabase.execSQL(update, new Object[] {"RED", 7});

Cursor cursor = myDatabase.rawQuery("SELECT * FROM FRUIT;", null);
if (cursor.moveToFirst()) {
    do {
        String name = cursor.getString(cursor.getColumnIndex("NAME"));
        String color = cursor.getString(cursor.getColumnIndex("COLOR"));
        Log.i(TAG, "onCreate: Name: " + name + ", color: " + color);
    } while (cursor.moveToNext());
}
but I read this in the official Android documentation:
The code using execSQL worked, but is it better practice to use update, or can I keep using execSQL since it works? And since this tutorial is from a trustworthy source, why are they using execSQL?
The issue/warning regarding execSQL may exist for a few reasons, one being that you do not get anything returned when using execSQL, whereas the convenience methods return potentially useful values:
insert returns a long containing the id of the inserted row, or -1 on failure.
update and delete return the number of affected rows.
Using the convenience methods also reduces the chance of typing errors, since they build the underlying SQL for you, adding the keywords and enclosing strings. Importantly for some, they also offer a level of protection against SQL injection (execSQL's bindArgs offers the same protection, as does rawQuery's).
However, there are limitations and sometimes the use of execSQL/rawQuery becomes necessary for some more complex situations.
For inserting into SQLite, presently I have to follow these steps:
Create ContentValues, i.e. ContentValues contentValues = new ContentValues();
Put each column name and value
Lastly, call sqLiteDatabase.insert(DATABASE_NAME, null, contentValues)
The problem is only in step 2: we have to put the column name and value manually n times, assuming I have n columns to persist.
So, I wrote the following method thinking I can reuse it:
public void insert(Map tableMap) {
    ContentValues contentValues = new ContentValues();
    Iterator tableMapIterator = tableMap.entrySet().iterator();
    while (tableMapIterator.hasNext()) {
        Map.Entry mapEntry = (Map.Entry) tableMapIterator.next();
        contentValues.put((String) mapEntry.getKey(), mapEntry.getValue());
    }
    sqLiteDatabase.insert(DATABASE_NAME, null, contentValues);
}
But the problem is that mapEntry.getValue() returns Object, for which contentValues.put is not defined.
So, can anyone suggest a workaround so that I can use the above approach to do the insertion efficiently?
NOTE: I want to write the method so that I can use it for all SQLite data types.
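One direct workaround is to dispatch on the runtime type yourself before calling put. The sketch below is plain Java, using a Map<String, Object> as a stand-in for ContentValues so it runs outside Android; the comment on each branch names the ContentValues overload it corresponds to:

```java
import java.util.Map;

public class TypedPut {
    // Dispatches on runtime type, mirroring the ContentValues.put overloads.
    // Returns the name of the overload chosen so the routing is visible.
    static String putTyped(Map<String, Object> values, String key, Object value) {
        if (value == null) {
            values.put(key, null);        // contentValues.putNull(key)
            return "putNull";
        } else if (value instanceof String) {
            values.put(key, value);       // put(String, String)
            return "String";
        } else if (value instanceof Integer) {
            values.put(key, value);       // put(String, Integer)
            return "Integer";
        } else if (value instanceof Long) {
            values.put(key, value);       // put(String, Long)
            return "Long";
        } else if (value instanceof Double) {
            values.put(key, value);       // put(String, Double)
            return "Double";
        } else if (value instanceof byte[]) {
            values.put(key, value);       // put(String, byte[])
            return "byte[]";
        }
        // Fall back to the String representation, as ContentValues
        // itself ultimately does for unrecognized types.
        values.put(key, value.toString());
        return "toString";
    }
}
```

Inside your while loop you would call this helper instead of contentValues.put directly.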
The objects you put in your map will be type-checked by DatabaseUtils.getTypeOfObject().
Therefore, if you put anything in your ContentValues that is not one of the expected types, it will be assumed to be a String, and in bindArguments(), toString() will be called on it.
Now, assuming that all your objects are either recognized valid types or have a sufficient String representation (for instance, a File object would give its path, which is sufficient to recreate it when you extract it from the database), there are ways to put an arbitrary Map into a ContentValues.
The trivial way is to use reflection to access the internal map, which is consistently named mValues across all versions of Android.
Another way, shorter (but slower) and clearer I find, is to use the Parcel mechanism. Indeed, ContentValues.writeToParcel only writes the internal map.
The entire code is here:
Parcel parcel = Parcel.obtain();
parcel.writeMap(map);
parcel.setDataPosition(0);
ContentValues values = ContentValues.CREATOR.createFromParcel(parcel);
Detailed explanation on my blog : http://njzk2.wordpress.com/2013/05/31/map-to-contentvalues-abusing-parcelable/
I am not exactly sure I get your question, but I will try my best.
1) You can use an already-written ORM; it can automatically detect field types.
2) You can write your own simple ORM to handle situations like this. When I want to automatically add an object to the DB, I inherit the table object from GenericTableObject, which has methods like getFieldsValues, getFieldsTypes, etc. With the help of these methods, it is fully automated.
You will probably spend a few hours writing this generic table object, but it is useful. It is all about Java reflection.
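The reflection idea in option 2 can be sketched like this; GenericTableObject and the Fruit example are illustrative, not a real library, and on Android the returned map would be copied into ContentValues before calling insert():

```java
import java.lang.reflect.Field;
import java.util.LinkedHashMap;
import java.util.Map;

public class GenericTableObject {
    // Reads all declared fields of the concrete subclass via reflection,
    // producing a column-name -> value map.
    public Map<String, Object> getFieldsValues() {
        Map<String, Object> values = new LinkedHashMap<>();
        for (Field field : getClass().getDeclaredFields()) {
            field.setAccessible(true);
            try {
                values.put(field.getName(), field.get(this));
            } catch (IllegalAccessException e) {
                throw new RuntimeException(e);
            }
        }
        return values;
    }
}

// Hypothetical table object: field names double as column names.
class Fruit extends GenericTableObject {
    String name = "apple";
    int price = 3;
}
```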
Try this one:
public void addEntity(EntityClass e) {
    SQLiteDatabase db = this.getWritableDatabase();
    ContentValues values = new ContentValues();
    values.put("Name", e.getmProductName());
    values.put("Price", e.getmProductPrice());
    values.put("Date", e.getmPurchaseDate());
    // Inserting Row
    db.insert(TABLE_Accounts, null, values);
    Log.d("insert", "success");
    Toast.makeText(mContext, "Info added Successfully", Toast.LENGTH_SHORT).show();
    db.close(); // Closing database connection
}
I use this function to insert data into the SQLite Android database:
public long insertAccount(String code, String name, int s3, int s4, String s5, String s6, int s7,
        int s8, int s9, int s10, int s11, String s12, String s13, int s14, int s15, int s16) {
    // container for the information you want inserted, updated, etc.
    ContentValues initialValues = new ContentValues();
    initialValues.put(Code, code);
    initialValues.put(Name, name);
    initialValues.put(Type, s3);
    initialValues.put(Level1, s4);
    initialValues.put(Father, s5);
    initialValues.put(ACCCurr, s6);
    initialValues.put(AccNat, s7);
    initialValues.put(LowLevel, s8);
    initialValues.put(DefNum, s9);
    initialValues.put(AccClass, s10);
    initialValues.put(SubClass, s11);
    initialValues.put(SSClass1, s12);
    initialValues.put(SSClass2, s13);
    initialValues.put(Stype1, s14);
    initialValues.put(Stype2, s15);
    initialValues.put(Stype3, s16);
    return db.insert(DATABASE_TABLE, null, initialValues);
}
But this takes a long time when inserting about 70,000+ rows! How can I speed up the insertion, and after the insert is done, how can I apply an update to it?
Some options:
Prepopulate your database. See "Ship an application with a database"
Use transactions to reduce the time waiting for I/O. See e.g. "Android SQLite database: slow insertion". Likely you cannot wrap all 70k rows in a single transaction but something like 100..1000 inserts per transaction should be doable, cutting the cumulative I/O wait time by orders of magnitude.
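The chunking itself is plain list slicing. The sketch below computes the batches in pure Java; the comment marks where the Android beginTransaction/endTransaction calls and per-row inserts would go (a batch size of 500 is an arbitrary pick from the 100..1000 range above):

```java
import java.util.ArrayList;
import java.util.List;

public class BatchInsert {
    // Splits rows into fixed-size batches. On Android, each batch would be
    // wrapped in db.beginTransaction() ... db.setTransactionSuccessful()
    // ... db.endTransaction(), with one db.insert(...) per row inside.
    static <T> List<List<T>> batches(List<T> rows, int batchSize) {
        List<List<T>> result = new ArrayList<>();
        for (int i = 0; i < rows.size(); i += batchSize) {
            int end = Math.min(i + batchSize, rows.size());
            result.add(new ArrayList<>(rows.subList(i, end)));
        }
        return result;
    }
}
```

With 70,000 rows and batches of 500, that is 140 commits instead of 70,000, which is where the I/O savings come from.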
Inserting into SQLite on Android using PHP? How is it possible to use PHP on an Android phone? I'm sorry, I didn't get this part.
Anyway, I believe you have written the Java code above and you have 70k+ records that you want to insert into your DB.
The style of inserting a bulk of records into any DB is called "bulk inserts". The idea is to create as few transactions as possible and instead do all the inserts in one shot. In relational DBs like SQL Server and Oracle this is done with specific APIs, but in SQLite the plain old approach is to make a single transaction containing the whole batch.
Check out this article, which uses the same technique and also explains it quite well: http://www.techrepublic.com/blog/software-engineer/turbocharge-your-sqlite-inserts-on-android/
You have to use a transaction to do the insertion in one shot. You can use this:
// before insertion
db.beginTransaction();
// ... do the inserts ...
// after insertion
db.setTransactionSuccessful();
db.endTransaction();
Is there a maximum number of values that can be inserted in one insert statement using a transaction?
Example of my Android aplication function:
public void addArray(ArrayList<MyObject> arrData) {
    this.mDb.beginTransaction();
    try {
        for (MyObject obj : arrData) {
            ContentValues values = new ContentValues();
            values.put(KEY_ROW_ID, obj.getId());
            values.put(KEY_NAME, obj.getName());
            values.put(KEY_ICON, obj.getImage());
            //////// LIMIT HERE ?
            this.mDb.insert(TABLE, null, values);
        }
        this.mDb.setTransactionSuccessful();
    } finally {
        this.mDb.endTransaction();
    }
}
Thanks
There is no practical limit on number of inserts per transaction.
You will probably hit disk space filling issues before you will see any problems with number of rows in unfinished transaction.
SQLite writes every pending insert into the journal or WAL file, and that file can easily grow to gigabytes.
I have personally tried to insert as many as 100k rows in one transaction, and it worked just fine.
SQLite author D. Richard Hipp even suggests opening a transaction at program start, doing whatever you need for as long as you need, and only committing when you quit or explicitly save state. This is basically a cheap way to implement an undo function: simply roll back the current transaction.
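That undo idea maps directly onto the SQL transaction commands; a minimal sketch (the table and columns are made up):

```sql
BEGIN;                 -- open a long-lived transaction at program start
INSERT INTO fruit (name, color) VALUES ('apple', 'red');
UPDATE fruit SET color = 'green' WHERE name = 'apple';
-- "undo": discard everything since BEGIN
ROLLBACK;
-- ...or, instead of ROLLBACK, "save": COMMIT; persists all changes at once
```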
I made my own ContentProvider into which I put a lot of data at once with multiple inserts.
The app receives the data from an external source, and at the moment I get about 30 items (therefore 30 separate inserts).
Now I've noticed that this takes a lot of precious time (about 3 seconds, 100 ms per insert).
How can I improve the speed of the ContentProvider? I already tried to bulkInsert them all together, but then it takes up to 5 seconds.
Thanks in advance.
Wrap everything in bulkInsert in a transaction.
Example:
SQLiteDatabase sqlDB = mDB.getWritableDatabase();
sqlDB.beginTransaction();
try {
    for (ContentValues cv : values) {
        long newID = sqlDB.insertOrThrow(table, null, cv);
        if (newID <= 0) {
            throw new SQLException("Failed to insert row into " + uri);
        }
    }
    sqlDB.setTransactionSuccessful();
    getContext().getContentResolver().notifyChange(uri, null);
    numInserted = values.length;
} finally {
    sqlDB.endTransaction();
}
bulkInsert does not use transactions by default; the default implementation just calls insert for each row:
Override this to handle requests to insert a set of new rows, or the default implementation will iterate over the values and call insert(Uri, ContentValues) on each of them.
Doing the inserts in a transaction greatly improves the speed, since only one write to the actual database takes place:
db.beginTransaction();
try {
    // do the inserts
    db.setTransactionSuccessful();
} finally {
    db.endTransaction();
}
I once experimented with improving the speed of about ~2000 writes, and this was the only big improvement I found.
db.setLockingEnabled(false) gave about a 1% improvement, I think, but then you must also make sure no other threads write to the DB. Removing redundant indexes can also give a minor boost if the table is huge.