I'm trying to do a batch insert of about 700 floats. The method I'm using is below, along with the content provider's bulkInsert. The issue is that when I put all the floating point values into the ContentValues, nothing happens. What's a better way to insert those floating point values into the ContentValues object?
private void saveToDatabase( float[] tempValues )
{
ContentValues values = new ContentValues();
// WM: TODO: add patient id and sensor type
for (float tempVal : tempValues){
values.put( DataTable.COLUMN_DATA, tempVal );
}
ContentValues[] cvArray = new ContentValues[1];
cvArray[0] = values;
ContentResolver resolver = getContentResolver();
resolver.bulkInsert( HealthDevContentProvider.CONTENT_URI_DATA, cvArray);
}
public int bulkInsert(Uri uri, ContentValues[] values){
int numInserted = 0;
String table = null;
int uriType = sURIMatcher.match(uri);
switch (uriType) {
case RAWINPUT_TABLE:
table = RAWINPUT_TABLE_PATH;
break;
}
db.beginTransaction();
try {
for (ContentValues cv : values) {
long newID = db.insertOrThrow(table, null, cv);
if (newID <= 0) {
throw new SQLException("Failed to insert row into " + uri);
}
}
db.setTransactionSuccessful();
getContext().getContentResolver().notifyChange(uri, null);
numInserted = values.length;
} finally {
db.endTransaction();
}
return numInserted;
}
If you want each float to have its own record in your database, you need an instance of ContentValues for each new record. Right now you have one instance of ContentValues and you are writing the same key to it (meaning you are writing over the value) 700 times.
private void saveToDatabase( float[] tempValues ) {
final int count = tempValues.length;
ContentValues[] cvArray = new ContentValues[count];
for (int i = 0; i < count; i++) {
float tempVal = tempValues[i];
ContentValues values = new ContentValues();
values.put( DataTable.COLUMN_DATA, tempVal );
cvArray[i] = values;
}
/* all the rest */
}
I know this will sound rude, but just throw away this code. Providers have primary methods to deal with most SQLite operations, and you tried to blend three of them (insert(), bulkInsert(), and applyBatch()) into some kind of Frankenstein. Here are the main mistakes:
1) The line values.put(DataTable.COLUMN_DATA, tempVal) is not inserting a new entry at each iteration; it is overwriting the previous one. After all iterations, values contains only the 700th float value of your array.
2) As @Karakuri noted, there is only one ContentValues instance inside cvArray. The bulkInsert() documentation states about its second parameter:
An array of sets of column_name/value pairs to add to the database. This must not be null.
So cvArray must contain a ContentValues instance (a set) for every entry you want to insert into the database.
3) Not exactly an error, but something to watch out for: there is no guarantee that table will have been assigned (your switch has no default case), and trying to run an operation without specifying a table will throw a SQLException.
4) These three lines are basically useless:
if (newID <= 0) {
throw new SQLException("Failed to insert row into " + uri);
}
insertOrThrow() already throws an exception if some error happens during the insert operation. If you want to check manually for an error, try insert() or insertWithOnConflict() (or add a catch to your try block and deal with the exception there).
5) And finally, there is the problem with numInserted that @petey pointed out (no need to repeat it here).
One last piece of advice: forget that bulkInsert() exists. I know this will require more lines of code, but with applyBatch() you can achieve better results (and more easily, since you do not have to implement it yourself). Wolfram Rittmeyer wrote a series of excellent articles about transactions; check them if you have any doubts.
Last but not least (yes, I'm in a good mood today), this is how I would do a basic implementation of your code:
@Override
public Uri insert(Uri uri, ContentValues values) {
final SQLiteDatabase db; // TODO: retrieve writable database
final int match = matcher.match(uri);
switch(match) {
case RAWINPUT_TABLE:
long id = db.insert(RAWINPUT_TABLE, null, values); // TODO: add catch block to deal.
getContext().getContentResolver().notifyChange(uri, null, false);
return ContentUris.withAppendedId(uri, id);
default:
throw new UnsupportedOperationException("Unknown uri: " + uri);
}
}
private void saveToDatabase( float[] tempValues ) {
ArrayList<ContentProviderOperation> operations = new ArrayList<ContentProviderOperation>();
for (float tempVal : tempValues){
operations.add(ContentProviderOperation
.newInsert(HealthDevContentProvider.CONTENT_URI_DATA)
.withValue(DataTable.COLUMN_DATA, tempVal)
.withValue() // TODO: add patient id
.withValue() // TODO: add sensor type
.build());
}
// WARNING!! Provider operations (except query if you are using loaders) happen by default in the main thread!!
try {
getContentResolver().applyBatch(HealthDevContentProvider.AUTHORITY, operations); // applyBatch() needs the provider authority; AUTHORITY is assumed to be declared in your contract
} catch (RemoteException e) {
// TODO: handle the failed batch
} catch (OperationApplicationException e) {
// TODO: handle the failed batch
}
}
I use batch inserts. I'm not sure what the difference between bulk and batch is, but all I do is this:
ArrayList<ContentProviderOperation> operations = new ArrayList<ContentProviderOperation>();
for(int j=0;j<locationAry.length;j++){
ContentValues values2 = new ContentValues();
values2.put(MapPoints.ELEMENT_ECM2ID, ecm2id);
values2.put(MapPoints.ELEMENT_ID, newElementId);
values2.put(MapPoints.LATITUDE, locationAry[j+1]);
values2.put(MapPoints.LONGITUDE, locationAry[j]);
values2.put(MapPoints.LAYER_ID, layerID);
operations.add(ContentProviderOperation.newInsert(MapPoints.CONTENT_URI).withValues(values2).build());
}
getContentResolver().applyBatch(MapElements.AUTHORITY, operations);
Did you override the bulkInsert method in your ContentProvider?
If one insert fails, your whole transaction fails. Without seeing your table create statement for unique keys, try a replace after your insert fails. Also, your numInserted will always be the same as values.length no matter which insert/replace fails; that doesn't seem correct either.
...
db.beginTransaction();
int numInserted = 0;
try {
for (ContentValues cv : values) {
long newID;
try {
newID = db.insertOrThrow(table, null, cv);
} catch (SQLException ignore) {
newID = db.replace(table, null, cv);
}
if (newID <= 0) {
Log.e("TAG, "Failed to insert or replace row into " + uri);
} else {
// you are good...increment numInserted
numInserted++;
}
}
db.setTransactionSuccessful();
getContext().getContentResolver().notifyChange(uri, null);
} finally {
db.endTransaction();
}
return numInserted;
Related
I need to insert data related to an Order and its corresponding Detail.
Without a ContentProvider I would do something like this:
public boolean insertOrder(Order order, ArrayList<OrderDetail> items) {
boolean wasSuccessful = false;
ContentValues cvOrder = new ContentValues();
cvOrder.put(ORDER_CUSTOMER_ID, order.getCustomerId());
cvOrder.put(ORDER_CUSTOMER_NAME, order.getCustomerName());
String insertQuery = "INSERT INTO " + ORDER_DETAIL_TABLE
+ " VALUES (?,?)";
//...
try {
this.openWriteableDB();
SQLiteStatement statement = db.compileStatement(insertQuery);
db.beginTransaction();
long idOrder = db.insertOrThrow(ORDER_TABLE, null, cvOrder);
if (idOrder > 0) {
for (int i = 0; i < items.size(); i++) {
OrderDetail detail=items.get(i);
statement.clearBindings();
statement.bindString(1, detail.getDescription());
statement.bindDouble(2, detail.getPrice());
//...
statement.execute();
}
db.setTransactionSuccessful();
wasSuccessful = true;
}
} finally {
db.endTransaction();
this.closeDB();
}
return wasSuccessful;
}
The problem is that now I want to use a ContentProvider, and I don't know what to do in cases where data for two or more tables must be passed to a single CRUD operation, given that an insert operation only accepts two parameters:
@Override
public Uri insert(Uri uri, ContentValues values) {
}
What do you do in a ContentProvider when you have to insert relational data in a transaction?
Thanks in advance.
You should use ContentProviderOperation. Since it's your ContentProvider, you can ensure that applyBatch() will execute all operations within a transaction. All standard content providers also ensure that this is the case.
See my blog post about ContentProviderOperation in general and my other post about how to use withBackReference() to access results of previous operations - which you need to access the orderId.
One important caveat: All ContentProviderOperations of one batch must use the same authority - but can use different URIs! In your case that should be no problem.
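To make that concrete, the batch for the Order/Detail case could look roughly like the sketch below. AUTHORITY, ORDER_URI, DETAIL_URI and the DETAIL_* column constants are hypothetical placeholders for your own contract, not names from the question; the order insert is operation 0, and each detail row pulls the generated order id from it via withValueBackReference().

// Sketch only: AUTHORITY, ORDER_URI, DETAIL_URI and the DETAIL_* constants are placeholders.
ArrayList<ContentProviderOperation> ops = new ArrayList<ContentProviderOperation>();

// Operation 0: the order row itself.
ops.add(ContentProviderOperation.newInsert(ORDER_URI)
        .withValue(ORDER_CUSTOMER_ID, order.getCustomerId())
        .withValue(ORDER_CUSTOMER_NAME, order.getCustomerName())
        .build());

// One operation per detail; the back reference copies the row id produced by
// operation 0 into the detail's order-id column.
for (OrderDetail detail : items) {
    ops.add(ContentProviderOperation.newInsert(DETAIL_URI)
            .withValueBackReference(DETAIL_ORDER_ID, 0)
            .withValue(DETAIL_DESCRIPTION, detail.getDescription())
            .withValue(DETAIL_PRICE, detail.getPrice())
            .build());
}

try {
    getContentResolver().applyBatch(AUTHORITY, ops);
} catch (RemoteException e) {
    // provider process died; nothing was saved
} catch (OperationApplicationException e) {
    // an operation failed; with a transactional applyBatch() nothing was committed
}

If the order insert fails, the whole batch aborts, so you never end up with detail rows pointing at a missing order.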
You can put all the data into a ContentValues and have a provider. You'll have to get a little creative with the order details.
In the pseudocode below, I create a key ("DETAIL" plus an integer) on the fly, then put the item under it.
ContentValues values = new ContentValues();
values.put(ORDER_ID,orderid);
for (int i = 0; i < items.size(); i++) {
values.put("DETAIL" + Integer.ToString(i),items.get(i));
}
Uri uri = context.getContentResolver().insert(
ORDER_URI, values);
Then in the content provider you sort it out.
@Override
public Uri insert(Uri uri, ContentValues values) {
int uriType = sURIMatcher.match(uri);
SQLiteDatabase sqlDB = database.getWritableDatabase();
long id = 0;
switch (uriType) {
case ORDER:
// trim name and description
trimNameDescriptions(values);
try {
id = sqlDB.insertOrThrow(ORDERS_TABLE,
null, values);
Integer i = 0;
while (values.containsKey("DETAIL" + i.toString())) {
ContentValues v = new ContentValues();
v.put("DETAIL", values.getAsString("DETAIL" + i.toString()));
v.put("ORDERID", id);
//ACTUALLY CALL THE INSERT METHOD OF THE PROVIDER
insert(DETAIL_URI, v);
i += 1;
}
} catch (SQLException e) {
// the order insert failed; handle/log it here
}
return ContentUris.withAppendedId(uri, id);
default:
throw new IllegalArgumentException("Unknown URI " + uri);
}
}
You can design the content provider to mirror your SQLite tables and reuse insertOrder code much like the above.
Just use the content provider's insert for each table (uri) to perform operations similar to those in your insertOrder method.
Another option is to define your content provider URI to take a combination of your Order and items, and implement the parsing yourself in the content provider before committing to the underlying data model.
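Whichever of these options you choose, the piece that actually gives you a single transaction is on the provider side. The default ContentProvider.applyBatch() just applies each operation in turn, so a common pattern (sketched below, with "database" standing in for whatever SQLiteOpenHelper your provider holds) is to override it and wrap the whole batch in one SQLite transaction:

// Sketch: "database" is assumed to be the provider's SQLiteOpenHelper field.
@Override
public ContentProviderResult[] applyBatch(ArrayList<ContentProviderOperation> operations)
        throws OperationApplicationException {
    SQLiteDatabase db = database.getWritableDatabase();
    db.beginTransaction();
    try {
        // super.applyBatch() dispatches each operation to this provider's
        // insert()/update()/delete(), which now all run inside the transaction.
        ContentProviderResult[] results = super.applyBatch(operations);
        db.setTransactionSuccessful();
        return results;
    } finally {
        db.endTransaction();
    }
}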
I need to parse a fairly large XML file (varying between about a hundred kilobytes and several hundred kilobytes), which I'm doing using Xml#parse(String, ContentHandler). I'm currently testing this with a 152KB file.
During parsing, I also insert the data in an SQLite database using calls similar to the following: getWritableDatabase().insert(TABLE_NAME, "_id", values). All of this together takes about 80 seconds for the 152KB test file (which comes down to inserting roughly 200 rows).
When I comment out all insert statements (but leave in everything else, such as creating ContentValues etc.) the same file takes only 23 seconds.
Is it normal for the database operations to have such a big overhead? Can I do anything about that?
You should do batch inserts.
Pseudocode:
db.beginTransaction();
for (entry : listOfEntries) {
db.insert(entry);
}
db.setTransactionSuccessful();
db.endTransaction();
That dramatically increased the speed of inserts in my apps.
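One thing the pseudocode glosses over: endTransaction() should sit in a finally block, so a failed insert rolls everything back instead of leaving the transaction open. A minimal concrete version, with "my_table" as a placeholder table name and listOfEntries as your parsed ContentValues:

db.beginTransaction();
try {
    for (ContentValues entry : listOfEntries) {
        db.insert("my_table", null, entry);  // one row per parsed entry
    }
    db.setTransactionSuccessful();  // mark the transaction for commit
} finally {
    db.endTransaction();            // commits, or rolls back if not marked successful
}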
Update:
@Yuku provided a very interesting blog post: Android using inserthelper for faster insertions into sqlite database
Since the InsertHelper mentioned by Yuku and Brett is deprecated now (API level 17), it seems the right alternative recommended by Google is using SQLiteStatement.
I used the database insert method like this:
database.insert(table, null, values);
After I also experienced some serious performance issues, the following code sped my 500 inserts up from 14.5 sec to only 270 ms. Amazing!
Here is how I used SQLiteStatement:
private void insertTestData() {
String sql = "insert into producttable (name, description, price, stock_available) values (?, ?, ?, ?);";
SQLiteDatabase database = dbHandler.getWritableDatabase();
database.beginTransaction();
SQLiteStatement stmt = database.compileStatement(sql);
for (int i = 0; i < NUMBER_OF_ROWS; i++) {
//generate some values
stmt.bindString(1, randomName);
stmt.bindString(2, randomDescription);
stmt.bindDouble(3, randomPrice);
stmt.bindLong(4, randomNumber);
long entryID = stmt.executeInsert();
stmt.clearBindings();
}
database.setTransactionSuccessful();
database.endTransaction();
dbHandler.close();
}
Compiling the sql insert statement helps speed things up. It can also require more effort to shore everything up and prevent possible injection since it's now all on your shoulders.
Another approach which can also speed things up is the under-documented android.database.DatabaseUtils.InsertHelper class. My understanding is that it actually wraps compiled insert statements. Going from non-compiled transacted inserts to compiled transacted inserts was about a 3x gain in speed (2ms per insert to .6ms per insert) for my large (200K+ entries) but simple SQLite inserts.
Sample code:
SQLiteDatabase db = getWritableDatabase();
//use the db you would normally use for db.insert, and the "table_name"
//is the same one you would use in db.insert()
InsertHelper iHelp = new InsertHelper(db, "table_name");
//Get the indices you need to bind data to
//Similar to Cursor.getColumnIndex("col_name");
int first_index = iHelp.getColumnIndex("first");
int last_index = iHelp.getColumnIndex("last");
try
{
db.beginTransaction();
for(int i=0 ; i<num_things ; ++i)
{
//need to tell the helper you are inserting (rather than replacing)
iHelp.prepareForInsert();
//do the equivalent of ContentValues.put("field","value") here
iHelp.bind(first_index, thing_1);
iHelp.bind(last_index, thing_2);
//the db.insert() equivalent
iHelp.execute();
}
db.setTransactionSuccessful();
}
finally
{
db.endTransaction();
}
db.close();
If the table has an index on it, consider dropping it prior to inserting the records and then adding it back after you've committed your records.
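For example, something along these lines (the index, table and column names are made up for illustration, and "rows" stands for your collection of ContentValues):

// Hypothetical schema: table "items" with an index "idx_items_name" on column "name".
db.execSQL("DROP INDEX IF EXISTS idx_items_name");
db.beginTransaction();
try {
    for (ContentValues values : rows) {
        db.insert("items", null, values);
    }
    db.setTransactionSuccessful();
} finally {
    db.endTransaction();
}
// Rebuild the index once, after all rows are in.
db.execSQL("CREATE INDEX idx_items_name ON items(name)");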
If using a ContentProvider:
@Override
public int bulkInsert(Uri uri, ContentValues[] bulkinsertvalues) {
int QueryType = sUriMatcher.match(uri);
int returnValue=0;
SQLiteDatabase db = mOpenHelper.getWritableDatabase();
switch (QueryType) {
case SOME_URI_IM_LOOKING_FOR: //replace this with your real URI
db.beginTransaction();
try {
for (int i = 0; i < bulkinsertvalues.length; i++) {
//get an individual result from the array of ContentValues
ContentValues values = bulkinsertvalues[i];
//insert this record into the local SQLite database using a private function you create, "insertIndividualRecord" (replace with a better function name)
insertIndividualRecord(uri, values);
}
db.setTransactionSuccessful();
returnValue = bulkinsertvalues.length;
} finally {
db.endTransaction();
}
break;
default:
throw new IllegalArgumentException("Unknown URI " + uri);
}
return returnValue;
}
Then the private function to perform the insert (still inside your content provider):
private Uri insertIndividualRecord(Uri uri, ContentValues values){
//see content provider documentation if this is confusing
if (sUriMatcher.match(uri) != THE_CONSTANT_IM_LOOKING_FOR) {
throw new IllegalArgumentException("Unknown URI " + uri);
}
//example validation if you have a field called "name" in your database
if (values.containsKey(YOUR_CONSTANT_FOR_NAME) == false) {
values.put(YOUR_CONSTANT_FOR_NAME, "");
}
//******add all your other validations
//**********
//time to insert records into your local SQLite database
SQLiteDatabase db = mOpenHelper.getWritableDatabase();
long rowId = db.insert(YOUR_TABLE_NAME, null, values);
if (rowId > 0) {
Uri myUri = ContentUris.withAppendedId(MY_INSERT_URI, rowId);
getContext().getContentResolver().notifyChange(myUri, null);
return myUri;
}
throw new SQLException("Failed to insert row into " + uri);
}
I'm creating a ContentProvider, and I wish to be able to send it multiple DB records (ContentValues) to be inserted or updated into a single table using a single batch operation.
How do I do that?
bulkInsert is intended only for inserting, but wouldn't that mean that inserting something that already exists won't do anything?
Also, is there a way for the update operation to use a special constraint? For example, I need to ignore the primary key and update based on 2 other fields that together are unique.
"batchInsert is intended only for inserting" : this is true BUT you can override it in your ContentProvider to perform an UPSERT (insert/update) depending on the URI passed to batchInsert.
The following is some working code that I currently use to perform bulk inserts on time-series data (admittedly, I just delete anything that gets in the way instead of updating, but you could easily change this to your own ends.).
Also note the use of the sql transaction; this speeds up the process immensely.
@Override
public int bulkInsert(Uri uri, ContentValues[] values) {
SQLiteDatabase sqlDB = database.getWritableDatabase();
switch (match(uri)) {
case ONEPROGRAMME:
String cid = uri.getLastPathSegment();
int insertCount = 0;
int len = values.length;
if (len > 0) {
long start = values[0].getAsLong(Programme.COLUMN_START);
long end = values[len - 1].getAsLong(Programme.COLUMN_END);
String where = Programme.COLUMN_CHANNEL + "=? AND " + Programme.COLUMN_START + ">=? AND "
+ Programme.COLUMN_END + "<=?";
String[] args = { cid, Long.toString(start), Long.toString(end) };
//TODO use a compiled statement ?
//SQLiteStatement stmt = sqlDB.compileStatement(INSERT)
sqlDB.beginTransaction();
try {
sqlDB.delete(tableName(PROGRAMME_TABLE), where, args);
for (ContentValues row : values) {
if (sqlDB.insert(tableName(PROGRAMME_TABLE), null, row) != -1L) {
insertCount++;
}
}
sqlDB.setTransactionSuccessful();
} finally {
sqlDB.endTransaction();
}
}
if (insertCount > 0)
getContext().getContentResolver().notifyChange(Resolver.PROGRAMME.uri, null);
return insertCount;
default:
throw new UnsupportedOperationException("Unsupported URI: " + uri);
}
}
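As for the second part of the question (updating based on two non-primary-key fields that are unique together): one possible approach, sketched below, is to declare those two columns UNIQUE in the table definition and call insertWithOnConflict() with CONFLICT_REPLACE inside the bulkInsert loop, so a conflicting row gets replaced instead of causing a failure. The table and column names are placeholders; also note that REPLACE deletes the old row and inserts a new one, so its _id changes.

// Assumes a schema along the lines of:
//   CREATE TABLE items (_id INTEGER PRIMARY KEY, field_a TEXT, field_b TEXT,
//                       value REAL, UNIQUE(field_a, field_b));
for (ContentValues row : values) {
    long id = sqlDB.insertWithOnConflict("items", null, row,
            SQLiteDatabase.CONFLICT_REPLACE);
    if (id != -1L) {
        insertCount++;
    }
}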
I am new to Android programming and am trying to understand the best practices.
I want to do multiple inserts into two different database tables, but as one transaction (as the tables shared a foreign key). I want my function to return a result so that I can display a Toast or something to say that an error occurred, otherwise I want to return the row ID of the first insert.
I believe one way of doing this is sort of as follows (Disclaimer: pseudo-ish code, probably won't compile!):
long result = -1;
myDatabase.beginTransaction();
try {
// Insert into first table
ContentValues someValues = new ContentValues();
someValues.put("dbfield1", 1);
result = myDatabase.insert(DATABASE_TABLE_1, null, someValues);
if (-1 != result ) {
// Insert into second table
someValues.clear();
someValues.put("dbfield2", 2);
if( myDatabase.insert(DATABASE_TABLE_2, null, someValues) < 0 ) {
result = -1;
}
}
myDatabase.setTransactionSuccessful();
} catch(Exception e) {
// An error occurred
result = -1;
} finally {
myDatabase.endTransaction();
}
Is there a simpler/better way of doing this?
You can override bulkInsert inside your ContentProvider.
Your code looks fine. The method should return the number of inserted rows, but you can customize that so you return only the first ID.
public int bulkInsert(Uri uri, ContentValues[] values) {
Log.e("BULK", "Bulk insert started for URI" + uri.toString());
bulkSqlDB = database.getWritableDatabase();
int numInserted;
bulkSqlDB.beginTransaction();
try {
for (ContentValues cv : values) {
insert(uri, cv);
}
bulkSqlDB.setTransactionSuccessful();
numInserted = values.length;
} finally {
getContext().getContentResolver().notifyChange(uri, null);
bulkSqlDB.endTransaction();
}
return numInserted;
}