Most Efficient Way to Insert 5000+ Android Contacts

I realize this has been somewhat touched upon in various places including here on Stack Overflow, but I'm looking for any other solutions that people might have used. So with that in mind...
I'm developing an application where a user can initially sync all his contacts with a desktop application OTA. This is done through a web service call that grabs a set of 100 contacts from the server, downloads and parses the information, inserts the contacts into the Android contacts DB, acknowledges receipt of these contacts, and then repeats the previous steps with the next set of 100 contacts until the sync is complete. This process works well when a user has contacts on the order of 1000-2000, but a typical user of this application can easily have 5000-6000 contacts (with power users having upwards of 10000+), in which case things take far longer than I'd like. For example, a sample set of approximately 5300 contacts takes about 13.5 minutes to complete. Not bad, but I'd like it to be at least as efficient as iOS, which runs about 8 minutes for the same data set, if possible.
I've logged the time it takes for each step and, unsurprisingly, the bottleneck appears to be inserting the data into the Android contacts DB. After scouring the web I've found little help with regard to inserting thousands of contacts, but what I have found seems to fall into these three groups:
1) ContentProviderOperation -- The Google-recommended way, which gave me my baseline of 13.5 minutes for 5300 contacts.
2) Bulk inserts -- I read that bulkInsert tends to be more efficient than applyBatch, but when I tried to implement this myself it actually took 25 minutes for the same 5300 contacts. I have a feeling a lot of this is due to the fact that I need to insert the RawContact information and then save the resulting URI for use in creating the ContactsContract.Data rows for the bulkInsert, which comes more naturally via withValueBackReference in the ContentProviderOperation. Additionally, I looked at the Android source code and I don't get the feeling that bulkInsert is terribly efficient.
3) Creating an optimized bulk insert using DatabaseUtils.InsertHelper and transactions -- Unfortunately, this seems geared toward people who created their own content provider, because you need access to the underlying DB as an instance variable, and I've yet to see how that could be done with the native contacts DB.
Does anyone have experience with inserting 5000+ contacts, or any other ideas I could look into to help reduce my time? Or should ContentProviderOperation be considered as optimized as it's going to get?

Unfortunately, I believe option 1 is the best. I suspect most of your overhead compared to the iPhone is in the cross-process IPC inherent to the content provider design.
Your analysis of option 3 is correct.
There are options on rooted devices to bypass the content provider, but I doubt that is what you are looking for.

Hi, I insert a huge number of contacts within minutes. My code is:
fun insertContacts(contactList: List<Request.ContactBean>) {
    val queueSize = 300 // 400
    val contactQueue = contactList.size / queueSize
    if (contactQueue > 0) {
        totalQueue = contactQueue + 1 + smsQueue
        for (i in 0..contactQueue) {
            val startIndex = i * queueSize
            var endIndex = startIndex + queueSize
            endIndex = if (endIndex < contactList.size) endIndex else contactList.size
            val tempList = contactList.subList(startIndex, endIndex)
            Log.d(Constant.TAG_RESTORE, "In loop totalQueue: " + contactQueue + " i: " + i
                    + " startIndex: " + startIndex + " endIndex: " + endIndex + " Queuesize: " + tempList.size)
            restoreContact(tempList)
        }
    } else {
        totalQueue = 1 + smsQueue
        restoreContact(contactList)
    }
}
private fun restoreContact(contactList: List<Request.ContactBean>) {
    // insertContact() below performs the actual batch insertion
    Observable.fromCallable { insertContact(contactList) }
            .subscribeOn(Schedulers.io())
            .observeOn(AndroidSchedulers.mainThread())
            .subscribe {
                totalCompleteOperation++
                if (totalCompleteOperation == totalQueue) {
                    Log.d(Constant.TAG_RESTORE, " in subscribe restoreContact " +
                            "totalCompleteOperation: " + totalCompleteOperation + " totalQueue " + totalQueue)
                    hideDialog()
                    completeRestore(true)
                }
            }
}
public void insertContact(List<Request.ContactBean> contacts) throws RemoteException, OperationApplicationException {
    final int MAX_OPERATIONS_FOR_INSERTION = 100; //100
    int size = contacts.size();
    ArrayList<ContentProviderOperation> ops = new ArrayList<>();
    for (int i = 0; i < size; i++) {
        createOperations(ops, contacts.get(i));
        if (ops.size() >= MAX_OPERATIONS_FOR_INSERTION) {
            mContext.getContentResolver().applyBatch(ContactsContract.AUTHORITY, ops);
            ops.clear();
        }
    }
    if (ops.size() > 0)
        mContext.getContentResolver().applyBatch(ContactsContract.AUTHORITY, ops);
}
private void createOperations(ArrayList<ContentProviderOperation> ops,
                              Request.ContactBean contact) {
    int backReference = ops.size();
    // Insert the raw contact itself; the Data rows below point back at it.
    ops.add(ContentProviderOperation.newInsert(ContactsContract.RawContacts.CONTENT_URI)
            .withValue(ContactsContract.RawContacts.ACCOUNT_NAME, null)
            .withValue(ContactsContract.RawContacts.ACCOUNT_TYPE, null)
            .withValue(ContactsContract.RawContacts.AGGREGATION_MODE, ContactsContract.RawContacts.AGGREGATION_MODE_DISABLED)
            .build());
    ops.add(ContentProviderOperation.newInsert(ContactsContract.Data.CONTENT_URI)
            // .withYieldAllowed(true)
            .withValueBackReference(ContactsContract.Data.RAW_CONTACT_ID, backReference)
            .withValue(ContactsContract.Data.MIMETYPE, ContactsContract.CommonDataKinds.StructuredName.CONTENT_ITEM_TYPE)
            .withValue(ContactsContract.CommonDataKinds.StructuredName.GIVEN_NAME, contact.getName())
            .build());
    if (contact.getNumbers() != null && contact.getNumbers().size() > 0) {
        // Insert the mobile number into the ContactsContract.Data table
        ops.add(ContentProviderOperation.newInsert(ContactsContract.Data.CONTENT_URI)
                // .withYieldAllowed(true)
                .withValueBackReference(ContactsContract.Data.RAW_CONTACT_ID, backReference)
                .withValue(ContactsContract.Data.MIMETYPE, Phone.CONTENT_ITEM_TYPE)
                .withValue(Phone.NUMBER, contact.getNumbers().get(0).getNumber())
                .withValue(Phone.TYPE, Phone.TYPE_MOBILE)
                .build());
        if (contact.getNumbers().size() > 1) {
            // Insert the home phone number into the ContactsContract.Data table
            ops.add(ContentProviderOperation.newInsert(ContactsContract.Data.CONTENT_URI)
                    // .withYieldAllowed(true)
                    .withValueBackReference(ContactsContract.Data.RAW_CONTACT_ID, backReference)
                    .withValue(ContactsContract.Data.MIMETYPE, Phone.CONTENT_ITEM_TYPE)
                    .withValue(Phone.NUMBER, contact.getNumbers().get(1).getNumber())
                    .withValue(Phone.TYPE, Phone.TYPE_HOME)
                    .build());
        }
    }
    if (contact.getEmails() != null && contact.getEmails().size() > 0) {
        // Insert the work email into the ContactsContract.Data table
        ops.add(ContentProviderOperation.newInsert(ContactsContract.Data.CONTENT_URI)
                // .withYieldAllowed(true)
                .withValueBackReference(ContactsContract.Data.RAW_CONTACT_ID, backReference)
                .withValue(ContactsContract.Data.MIMETYPE, Email.CONTENT_ITEM_TYPE)
                .withValue(Email.ADDRESS, contact.getEmails().get(0).getAddress())
                .withValue(Email.TYPE, Email.TYPE_WORK)
                .build());
        // Note: this check must stay inside the null guard above, otherwise
        // getEmails() could throw a NullPointerException for contacts without emails.
        if (contact.getEmails().size() > 1) {
            // Insert the home email into the ContactsContract.Data table
            ops.add(ContentProviderOperation.newInsert(ContactsContract.Data.CONTENT_URI)
                    // .withYieldAllowed(true)
                    .withValueBackReference(ContactsContract.Data.RAW_CONTACT_ID, backReference)
                    .withValue(ContactsContract.Data.MIMETYPE, Email.CONTENT_ITEM_TYPE)
                    .withValue(Email.ADDRESS, contact.getEmails().get(1).getAddress())
                    .withValue(Email.TYPE, Email.TYPE_HOME)
                    .build());
        }
    }
}
This code inserts a huge contact list in very little time.

Related

Android / SQLite - Fastest insert?

Long story short:
I've got a CSV file with something like 8,000 records (and 4 fields).
I have to download it, then process it and insert each record into an SQLite table.
So I do it with a transaction:
SQLiteDatabase db = this.getWritableDatabase();
db.beginTransaction();
try
{
    String line;
    int i = 0;
    do {
        line = buffreader.readLine();
        i++;
        if (i == 1)
            continue; // Header of the CSV
        if (line != null)
        {
            String[] values = line.split(";");
            if (values.length != 4)
                continue;
            sql = String.format("INSERT INTO TABLE (FIELD_1, FIELD_2, FIELD_3, FIELD_4) VALUES (%s, %s, %s, %s)",
                    values[0],
                    values[1],
                    values[2],
                    values[3]);
            db.execSQL(sql);
        }
    }
    while (line != null);
    db.setTransactionSuccessful();
}
catch (SQLiteException ex)
{
    Log.d(TAG, "Well.. : " + ex.getMessage());
    throw ex;
}
finally
{
    db.endTransaction();
}
Everything works fine; it takes about 8-9 seconds on my cellphone and other cellphones.
Sadly, on the Android device where this app has to run (a white-label device with a dual-core processor) it takes 6-7 MINUTES!!!
Of course my boss is not happy about it. He does agree that on a "regular" cellphone with a quad-core processor everything is faster, but we have to make it work on this dual-core device, and 6-7 minutes looks like a problem. Any idea how to solve it?
1) Separate your processes (file read and DB inserts); this keeps memory consumption lower.
2) Insert multiple records per statement: INSERT INTO ... VALUES (1,2,3,4),(5,6,7,8),(9,10,11,12). This reduces I/O.
3) Use query parameters (see the sketch below).
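A minimal sketch of point 3 combined with the transaction you already use, assuming the same placeholder table and fields as your code: compile the statement once and bind the four values per row instead of formatting SQL strings.
SQLiteDatabase db = this.getWritableDatabase();
SQLiteStatement stmt = db.compileStatement(
        "INSERT INTO TABLE (FIELD_1, FIELD_2, FIELD_3, FIELD_4) VALUES (?, ?, ?, ?)");
db.beginTransaction();
try {
    String line;
    while ((line = buffreader.readLine()) != null) {
        String[] values = line.split(";");
        if (values.length != 4)
            continue;
        stmt.clearBindings();
        for (int c = 0; c < 4; c++) {
            stmt.bindString(c + 1, values[c]); // bind indexes are 1-based
        }
        stmt.executeInsert();
    }
    db.setTransactionSuccessful();
} finally {
    db.endTransaction();
}
Binding parameters skips the per-row String.format work and lets SQLite reuse the compiled statement, which is often where most of the time goes.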
So, here we are with some experiments.
I removed the "split CSV" part.
The "record_list" variable holds 27,358 records.
I commented out the DB operations because, as suggested, I tried to determine where the time is spent. I added two Date variables so I can see how long it really takes.
Well, it takes 159 seconds to build the SQL queries on the white-label device. If I uncomment the DB operations it takes about the same amount of time (165 seconds). So the problem is in the String creation, and I think that it's already optimized at its best.
Here is the code:
String[] record_list = Split_CSV();
Date StartDate = new Date();
//SQLiteDatabase db = this.getWritableDatabase();
//db.beginTransaction();
try
{
    StringBuilder sql = new StringBuilder();
    int i = 0;
    for (String line : record_list)
    {
        String[] values = line.split(";");
        if (i == 0)
        {
            sql.append("INSERT INTO TABLE (FIELD_1, FIELD_2, FIELD_3, FIELD_4) VALUES ");
        }
        i = i + 1;
        sql.append(String.format("(%s,%s,%s,%s), ",
                values[0],
                values[1],
                values[2],
                values[3]));
        if (i == 500)
        {
            i = 0;
            //db.execSQL(sql.substring(0,sql.length()-2));
            sql.setLength(0);
        }
    }
    if (sql.length() != 0) {
        //db.execSQL(sql.substring(0, sql.length() - 2));
        sql.setLength(0);
    }
    //db.setTransactionSuccessful();
}
catch (SQLiteException ex)
{
    Log.d(TAG, "addAnagraficheClienti : " + ex.getMessage());
    throw ex;
}
finally
{
    //db.endTransaction();
}
Date EndDate = new Date();
In case anyone doesn't know:
the INSERT is split every 500 records because of this comment:
Is it possible to insert multiple rows at a time in an SQLite database?
I was reading that:
As a further note, sqlite only seems to support upto 500 such union selects per query so if you are trying to throw in more data than that you will need to break it up into 500 element blocks

I need to separate the text from a string based on column names

I am working on an OCR-based Android app, getting this text as a string from the attached image dynamically (reading the text horizontally from the image).
Text from Image:
"Part Name Part Cost Engine Oil and Oil Filter Replacement Rs 10K Alf Filter Rs 4500 Cabin AC Micro Filter Rs 4000 Pollen Filter Rs 1200 - 1500 AC Disinfectant Rs 3000 Fuel Filter Rs 6000 - 8000 Spark Plug Set Replacement (Applicable in TFSI / Petrol Car Range) Rs 10K Body Wash, Basic Clean 8. Engine Degrease Rs 3000 Body Wax Polish Detailed Rs 7000 - 8000 Car interior Dry Clean with Genn Clean Rs 8000 - 10000 Wheel Alignment \u0026 Balancing Rs 6000 - 7000 Brake Pads Replacernent (Pair) Rs 30K - 32K Brake Disc Replacernent (Pair) Rs 30K - 35K ..........".
I need to separate the Part Name and Part Cost (just 2 columns, i.e. Part Name and Part Cost), ignoring all extra text around the column headings, separate the values out of the string, and store them in an SQLite database on Android. I am stuck on how to get the values and separate them.
The text returned from the OCR isn't ideal. The first thing you should do is check whether whatever OCR solution you're using can be configured to provide better output. Ideally, you want the lines to be separated by newline characters and the space between the columns to be interpreted as something more useful, such as a tab character.
If you have no way of changing the text you get, you'll have to find some way of parsing it. You may want to look into using a parser, such as ANTLR to make this easier.
The following observations may help you to come up with a parsing strategy:
Column 2 items all start with "Rs" or "Upto Rs".
Column 2 items end with:
A number (where a number is allowed to be a string of digits [0-9.], optionally followed by a "K")
"Lakh"
Column 1 items don't begin with a number or "Lakh"
So a basic algorithm could be:
List<String> column1 = new ArrayList<String>();
List<String> column2 = new ArrayList<String>();
String[] tokens = ocrString.split(" ");
List<String> column = column1;
String item = "";
for (int i = 0; i < tokens.length; i++) {
    String token = tokens[i];
    String nextToken = i == tokens.length - 1 ? "" : tokens[i + 1];
    if (column == column1) {
        if (token.equals("Rs") || (token.equals("Upto") && nextToken.equals("Rs"))) {
            column.add(item.trim()); // finish the part-name item first
            item = "";
            column = column2;
            i--; continue; // reprocess this token as the start of the cost
        }
        item += " " + token;
    } else {
        item += " " + token;
        // a cost ends with a number (digits, optionally followed by "K") or "Lakh",
        // provided the next token doesn't continue the number (e.g. a "-" range)
        boolean tokenIsNumber = token.matches("[0-9.]+K?|Lakh");
        boolean nextContinues = nextToken.matches("[0-9.]+K?|Lakh|-");
        if (tokenIsNumber && !nextContinues) {
            column.add(item.trim());
            item = "";
            column = column1;
        }
    }
}
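For example, on the fragment "Engine Oil and Oil Filter Replacement Rs 10K Alf Filter Rs 4500" from the sample above, the loop should leave column1 as ["Engine Oil and Oil Filter Replacement", "Alf Filter"] and column2 as ["Rs 10K", "Rs 4500"]. Remember to flush anything still left in item once the loop finishes.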

Bulk update of more than 500 contacts

I am developing an app that needs to update many contacts and I am getting the following error.
android.content.OperationApplicationException: Too many content provider operations between yield points. The maximum number of operations per yield point is 500
I tried breaking the contacts up into smaller chunks to update, but I still get the same error. The good thing is that now some contacts are updated (previously 0 contacts were updated). Any suggestions that can help me are greatly appreciated.
Uri uri = ContactsContract.Data.CONTENT_URI;
String selectionUpdate = ContactsContract.CommonDataKinds.Phone._ID + " = ? AND " + ContactsContract.Contacts.Data.MIMETYPE + " = ? ";
int i = 0;
int numRowsUpdated = 0;
int batchsize = 100;
for (EntityPhone ep : eps) {
    if (ep.isUpdateNumber()) {
        // update only when checkbox is ticked
        ops.add(ContentProviderOperation.newUpdate(uri)
                .withSelection(selectionUpdate, new String[]{ep.getPhoneId(), ContactsContract.CommonDataKinds.Phone.CONTENT_ITEM_TYPE})
                .withValue(ContactsContract.CommonDataKinds.Phone.NUMBER, ep.getPhoneNumberNew())
                .build());
        i++;
        if (i % batchsize == 0) {
            i = 0;
            ContentProviderResult[] count = contentResolver.applyBatch(ContactsContract.AUTHORITY, ops);
            if (count != null) {
                numRowsUpdated += count.length;
                Log.i(TAG, "batch update success" + count.length);
            } else {
                Log.w(TAG, "batch update failed");
            }
        }
    }
}
if (i != 0) {
    ContentProviderResult[] count = contentResolver.applyBatch(ContactsContract.AUTHORITY, ops);
}
I have looked at the past questions, but they are mostly related to inserts, not updates.
Insertion of thousands of contact entries using applyBatch is slow
Whats the fastest way to create large numbers of contacts?
The reason I want to update so many records at once is that my application is a "contact number formatter" that lets the user standardize all the phone numbers on the phone easily. I do not have control over how many records the users want to update in a single batch. (https://play.google.com/store/apps/details?id=angel.phoneformat)
You're not creating a new object for ops. During subsequent calls to applyBatch, you're passing the previously applied operations back in as well. The first time ops contains 100 elements, then 200, and eventually it fails when it reaches 500. Change it to:
if (i % batchsize == 0) {
    contentResolver.applyBatch(ContactsContract.AUTHORITY, ops);
    ops = new ArrayList<ContentProviderOperation>(100);
}
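Equivalently, you can keep a single list and call ops.clear() after each applyBatch, as the insertion examples elsewhere in this thread do; the point is that already-applied operations must not be submitted again.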

Android: Need Advice on SQLite, searching slow

I have to search a database that has 26,024 entries and counting. It used to be fast with fewer records, but now it takes around 10 seconds and slows down the app. I was wondering if I could get advice on how to speed up the process, or whether I'm doing anything wrong. Here is the code.
while (cursor.moveToNext()) {
    String word = cursor.getString(0);
    if (word.equals(input)) {
        String nikus = cursor.getString(1);
        String def = cursor.getString(2);
        ret.append(" " + nikus + "\n" + def + "\n");
        g = null;
    }
}
EDIT:
In my database I have a definitions table with 3 fields: the first is the word to be compared against, the second is the full word, and the third is the definition itself. Hopefully that helps you guys a little more.
CREATE TABLE [definitions] (
[word] TEXT,
[fullword] TEXT,
[definition] TEXT);
EDIT: here is the error im getting
01-04 00:47:54.678: E/CursorWindow(4722): need to grow: mSize = 1048576, size = 17, freeSpace() = 13, numRows = 15340
laalto's comment above is correct. You should be running a SELECT with a WHERE clause that only pulls back the rows where word equals input (don't forget about case sensitivity). An index on the word column will help the query go even faster.
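A minimal sketch of that approach, assuming the definitions schema above and that db is your SQLiteDatabase (the index name is arbitrary):
// One-time, e.g. when creating/upgrading the database: index the filtered column.
db.execSQL("CREATE INDEX IF NOT EXISTS idx_definitions_word ON definitions (word)");
// Let SQLite find the matching rows instead of scanning all 26,024 in Java.
Cursor cursor = db.query("definitions",
        new String[] {"word", "fullword", "definition"},
        "word = ?",            // selection; consider COLLATE NOCASE for case-insensitive matching
        new String[] {input},  // selectionArgs
        null, null, null);
try {
    while (cursor.moveToNext()) {
        ret.append(" " + cursor.getString(1) + "\n" + cursor.getString(2) + "\n");
    }
} finally {
    cursor.close();
}
Since only matching rows are loaded, this should also avoid the CursorWindow "need to grow" error shown above.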

Insertion of thousands of contact entries using applyBatch is slow

I'm developing an application where I need to insert lots of Contact entries. At the moment, that's approx. 600 contacts with a total of 6000 phone numbers. The biggest contact has 1800 phone numbers.
Status as of today is that I have created a custom Account to hold the Contacts, so the user can choose to see the contacts in the Contacts view.
But the insertion of the contacts is painfully slow. I insert the contacts using ContentResolver.applyBatch. I've tried different sizes for the ContentProviderOperation list (100, 200, 400), but the total running time is approximately the same. Inserting all the contacts and numbers takes about 30 minutes!
Most issues I've found regarding slow insertion in SQLite bring up transactions. But since I use the ContentResolver.applyBatch method I don't control this, and I would assume that the ContentResolver takes care of transaction management for me.
So, to my question: Am I doing something wrong, or is there anything I can do to speed this up?
Anders
Edit:
@jcwenger:
Oh, I see. Good explanation!
So then I will have to first insert into the raw_contacts table, and then into the data table with the names and numbers. What I'll lose is the back reference to the raw_id which I use in applyBatch.
So I'll have to get all the IDs of the newly inserted raw_contacts rows to use as foreign keys in the data table?
Use ContentResolver.bulkInsert(Uri url, ContentValues[] values) instead of applyBatch().
applyBatch (1) uses transactions and (2) locks the ContentProvider once for the whole batch instead of locking/unlocking once per operation. Because of this, it is slightly faster than doing the operations one at a time (non-batched).
However, since each Operation in the batch can have a different URI and so on, there's a huge amount of overhead. "Oh, a new operation! I wonder what table it goes in... Here, I'll insert a single row... Oh, a new operation! I wonder what table it goes in..." ad infinitum. Since most of the work of turning URIs into tables involves lots of string comparisons, it's obviously very slow.
By contrast, bulkInsert applies a whole pile of values to the same table. It goes, "Bulk insert... find the table, okay, insert! insert! insert! insert! insert!" Much faster.
It will, of course, require your ContentProvider to implement bulkInsert efficiently. Most do, unless you wrote it yourself, in which case it will take a bit of coding.
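For the caller side, the shape is roughly this (a sketch, not the asker's actual code; contactCount, accountName and accountType are placeholders for the custom account setup):
// Sketch: one bulkInsert call per table, e.g. all raw-contact rows at once.
ContentValues[] rawRows = new ContentValues[contactCount];
for (int i = 0; i < contactCount; i++) {
    ContentValues cv = new ContentValues();
    cv.put(ContactsContract.RawContacts.ACCOUNT_NAME, accountName);
    cv.put(ContactsContract.RawContacts.ACCOUNT_TYPE, accountType);
    rawRows[i] = cv;
}
int inserted = getContentResolver().bulkInsert(
        ContactsContract.RawContacts.CONTENT_URI, rawRows);
As the edit above notes, the trade-off is losing withValueBackReference: you then have to query the new raw-contact IDs yourself before bulk-inserting the corresponding Data rows.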
bulkInsert: For those interested, here is the code that I was able to experiment with. Pay attention to how we can avoid some allocations for ints/longs/floats :) This could save more time.
private int doBulkInsertOptimised(Uri uri, ContentValues values[]) {
    long startTime = System.currentTimeMillis();
    long endTime = 0;
    //TimingInfo timingInfo = new TimingInfo(startTime);
    SQLiteDatabase db = mOpenHelper.getWritableDatabase();
    DatabaseUtils.InsertHelper inserter =
            new DatabaseUtils.InsertHelper(db, Tables.GUYS);
    // Get the numeric indexes for each of the columns that we're updating
    final int guyStrColumn = inserter.getColumnIndex(Guys.STRINGCOLUMNTYPE);
    final int guyDoubleColumn = inserter.getColumnIndex(Guys.DOUBLECOLUMNTYPE);
    //...
    final int guyIntColumn = inserter.getColumnIndex(Guys.INTEGERCOLUMUNTYPE);
    db.beginTransaction();
    int numInserted = 0;
    try {
        int len = values.length;
        for (int i = 0; i < len; i++) {
            inserter.prepareForInsert();
            String guyID = (String) (values[i].get(Guys.GUY_ID));
            inserter.bind(guyStrColumn, guyID);
            // convert to double ourselves to save an allocation
            double d = ((Number) (values[i].get(Guys.DOUBLECOLUMNTYPE))).doubleValue();
            inserter.bind(guyDoubleColumn, d);
            // getting the raw Object and converting it to an int ourselves saves
            // an allocation (the alternative is ContentValues.getAsInteger, which
            // returns an Integer object)
            int status = ((Number) values[i].get(Guys.INTEGERCOLUMUNTYPE)).intValue();
            inserter.bind(guyIntColumn, status);
            inserter.execute();
        }
        numInserted = len;
        db.setTransactionSuccessful();
    } finally {
        db.endTransaction();
        inserter.close();
        endTime = System.currentTimeMillis();
        if (LOGV) {
            long timeTaken = (endTime - startTime);
            Log.v(TAG, "Time taken to insert " + values.length + " records was " + timeTaken +
                    " milliseconds" + " or " + (timeTaken / 1000) + " seconds");
        }
    }
    getContext().getContentResolver().notifyChange(uri, null);
    return numInserted;
}
An example of how to override bulkInsert() in order to speed up multiple inserts can be found here.
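The general shape of such an override is roughly this (a hedged sketch for a custom provider; mOpenHelper and tableFor() are placeholders for your own open helper and URI-to-table mapping):
@Override
public int bulkInsert(Uri uri, ContentValues[] values) {
    SQLiteDatabase db = mOpenHelper.getWritableDatabase();
    int inserted = 0;
    db.beginTransaction(); // one transaction for the whole array
    try {
        for (ContentValues cv : values) {
            if (db.insert(tableFor(uri), null, cv) != -1) {
                inserted++;
            }
        }
        db.setTransactionSuccessful();
    } finally {
        db.endTransaction();
    }
    getContext().getContentResolver().notifyChange(uri, null);
    return inserted;
}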
@jcwenger At first, after reading your post, I thought that was the reason
bulkInsert is quicker than applyBatch, but after reading the code of the Contacts Provider, I don't think so.
1. You said applyBatch uses transactions; yes, but bulkInsert also uses transactions. Here is its code:
public int bulkInsert(Uri uri, ContentValues[] values) {
    int numValues = values.length;
    mDb = mOpenHelper.getWritableDatabase();
    mDb.beginTransactionWithListener(this);
    try {
        for (int i = 0; i < numValues; i++) {
            Uri result = insertInTransaction(uri, values[i]);
            if (result != null) {
                mNotifyChange = true;
            }
            mDb.yieldIfContendedSafely();
        }
        mDb.setTransactionSuccessful();
    } finally {
        mDb.endTransaction();
    }
    onEndTransaction();
    return numValues;
}
That is to say, bulkInsert also uses transactions, so I don't think that's the reason.
2. You said bulkInsert applies a whole pile of values to the same table. I'm sorry, but I can't find the related code in the Froyo source. How did you find that? Could you tell me?
The reason, I think, is this:
bulkInsert uses mDb.yieldIfContendedSafely() while applyBatch uses
mDb.yieldIfContendedSafely(SLEEP_AFTER_YIELD_DELAY) /*SLEEP_AFTER_YIELD_DELAY = 4000*/
After reading the code of SQLiteDatabase.java, I found that if you set a time in yieldIfContendedSafely it will sleep, but if you don't set the time it will not sleep. You can refer to the code below, which is a piece of SQLiteDatabase.java:
private boolean yieldIfContendedHelper(boolean checkFullyYielded, long sleepAfterYieldDelay) {
    if (mLock.getQueueLength() == 0) {
        // Reset the lock acquire time since we know that the thread was willing to yield
        // the lock at this time.
        mLockAcquiredWallTime = SystemClock.elapsedRealtime();
        mLockAcquiredThreadTime = Debug.threadCpuTimeNanos();
        return false;
    }
    setTransactionSuccessful();
    SQLiteTransactionListener transactionListener = mTransactionListener;
    endTransaction();
    if (checkFullyYielded) {
        if (this.isDbLockedByCurrentThread()) {
            throw new IllegalStateException(
                    "Db locked more than once. yielfIfContended cannot yield");
        }
    }
    if (sleepAfterYieldDelay > 0) {
        // Sleep for up to sleepAfterYieldDelay milliseconds, waking up periodically to
        // check if anyone is using the database. If the database is not contended,
        // retake the lock and return.
        long remainingDelay = sleepAfterYieldDelay;
        while (remainingDelay > 0) {
            try {
                Thread.sleep(remainingDelay < SLEEP_AFTER_YIELD_QUANTUM ?
                        remainingDelay : SLEEP_AFTER_YIELD_QUANTUM);
            } catch (InterruptedException e) {
                Thread.interrupted();
            }
            remainingDelay -= SLEEP_AFTER_YIELD_QUANTUM;
            if (mLock.getQueueLength() == 0) {
                break;
            }
        }
    }
    beginTransactionWithListener(transactionListener);
    return true;
}
I think that's the reason bulkInsert is quicker than applyBatch.
If you have any questions, please contact me.
I've got a basic solution for you:
use "yield points" in the batch operation.
The flip side of using batched operations is that a large batch may lock up the database for a long time preventing other applications from accessing data and potentially causing ANRs ("Application Not Responding" dialogs.)
To avoid such lockups of the database, make sure to insert "yield points" in the batch. A yield point indicates to the content provider that before executing the next operation it can commit the changes that have already been made, yield to other requests, open another transaction and continue processing operations.
A yield point will not automatically commit the transaction, but only if there is another request waiting on the database. Normally a sync adapter should insert a yield point at the beginning of each raw contact operation sequence in the batch. See withYieldAllowed(boolean).
I hope it may be useful for you.
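Concretely, that just means marking the first operation of each raw contact's operation sequence; the commented-out withYieldAllowed(true) lines in the batch code earlier in this thread do exactly this. A sketch:
ops.add(ContentProviderOperation.newInsert(ContactsContract.RawContacts.CONTENT_URI)
        .withYieldAllowed(true) // yield point at the start of this contact's sequence
        .withValue(ContactsContract.RawContacts.ACCOUNT_NAME, null)
        .withValue(ContactsContract.RawContacts.ACCOUNT_TYPE, null)
        .build());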
Here is an example that inserts the same amount of data within 30 seconds.
public void testBatchInsertion() throws RemoteException, OperationApplicationException {
    final SimpleDateFormat FORMATTER = new SimpleDateFormat("mm:ss.SSS");
    long startTime = System.currentTimeMillis();
    Log.d("BatchInsertionTest", "Starting batch insertion on: " + new Date(startTime));
    final int MAX_OPERATIONS_FOR_INSERTION = 200;
    ArrayList<ContentProviderOperation> ops = new ArrayList<>();
    for (int i = 0; i < 600; i++) {
        generateSampleProviderOperation(ops);
        if (ops.size() >= MAX_OPERATIONS_FOR_INSERTION) {
            getContext().getContentResolver().applyBatch(ContactsContract.AUTHORITY, ops);
            ops.clear();
        }
    }
    if (ops.size() > 0)
        getContext().getContentResolver().applyBatch(ContactsContract.AUTHORITY, ops);
    Log.d("BatchInsertionTest", "End of batch insertion, elapsed: " + FORMATTER.format(new Date(System.currentTimeMillis() - startTime)));
}
private void generateSampleProviderOperation(ArrayList<ContentProviderOperation> ops) {
    int backReference = ops.size();
    ops.add(ContentProviderOperation.newInsert(ContactsContract.RawContacts.CONTENT_URI)
            .withValue(ContactsContract.RawContacts.ACCOUNT_NAME, null)
            .withValue(ContactsContract.RawContacts.ACCOUNT_TYPE, null)
            .withValue(ContactsContract.RawContacts.AGGREGATION_MODE, ContactsContract.RawContacts.AGGREGATION_MODE_DISABLED)
            .build()
    );
    ops.add(ContentProviderOperation.newInsert(ContactsContract.Data.CONTENT_URI)
            .withValueBackReference(ContactsContract.Data.RAW_CONTACT_ID, backReference)
            .withValue(ContactsContract.Data.MIMETYPE, ContactsContract.CommonDataKinds.StructuredName.CONTENT_ITEM_TYPE)
            .withValue(ContactsContract.CommonDataKinds.StructuredName.GIVEN_NAME, "GIVEN_NAME " + (backReference + 1))
            .withValue(ContactsContract.CommonDataKinds.StructuredName.FAMILY_NAME, "FAMILY_NAME")
            .build()
    );
    for (int i = 0; i < 10; i++)
        ops.add(ContentProviderOperation.newInsert(ContactsContract.Data.CONTENT_URI)
                .withValueBackReference(ContactsContract.Data.RAW_CONTACT_ID, backReference)
                .withValue(ContactsContract.Data.MIMETYPE, ContactsContract.CommonDataKinds.Phone.CONTENT_ITEM_TYPE)
                .withValue(ContactsContract.CommonDataKinds.Phone.TYPE, ContactsContract.CommonDataKinds.Phone.TYPE_MAIN)
                .withValue(ContactsContract.CommonDataKinds.Phone.NUMBER, Integer.toString((backReference + 1) * 10 + i))
                .build()
        );
}
The log:
02-17 12:48:45.496 2073-2090/com.vayosoft.mlab D/BatchInsertionTest﹕ Starting batch insertion on: Wed Feb 17 12:48:45 GMT+02:00 2016
02-17 12:49:16.446 2073-2090/com.vayosoft.mlab D/BatchInsertionTest﹕ End of batch insertion, elapsed: 00:30.951
Just for the information of readers of this thread:
I was facing a performance issue even when using applyBatch().
In my case there were database triggers on one of the tables.
I deleted the triggers of the table and boom!
Now my app inserts rows blazingly fast.
