Converting server data to a SQLite database - Android

I am developing an Android app that has to pull a large amount of data from the server, roughly 10,000 records with 10 fields each, and store it in the local database. My current approach is to download the data as JSON, parse it, and insert the records into the database one by one. Downloading the data is fast, but inserting it takes much longer, because the insert loop runs once per record. I have looked for alternatives but could not find a better way, so I would appreciate suggestions and code snippets for speeding this up.
Thank you.

Use a transaction to wrap multiple inserts into one operation; that is a lot faster. See also: Improve INSERT-per-second performance of SQLite?
List<Item> list = getDataFromJson();
SQLiteDatabase db = getDatabase();

db.beginTransaction();
try {
    // all inserts run inside a single transaction
    for (Item item : list) {
        db.insert(table, null, item.getContentValues());
    }
    // mark the transaction as successful so it gets committed
    db.setTransactionSuccessful();
} finally {
    // commits if setTransactionSuccessful() was called, otherwise rolls back
    db.endTransaction();
}

Related

The best database for storing a large JSON

I have a large JSON file that I parse with Jackson, and I must store the result in a database. Storing it takes several minutes, so my questions are:
1. Is there any way to speed up the storage?
2. Is there another database I should use?
3. Would using an object database help me?
I currently use SQLite. I have heard about Realm but I am not sure whether to use it.
If you're storing the raw JSON directly inside SQLite, you're probably doing something "wrong".
A database is useful when you need relationships between objects, and so you have tables, indexes, keys and so on.
In any case, it's really strange that it takes minutes; I suggest you rethink your architecture and maybe just write the JSON to a file.
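If the JSON is only ever read back as a whole, a minimal sketch of the "just write it to a file" suggestion, assuming an app Context and the parsed JSON as a String (the file name is made up for this example), could be:
import android.content.Context;
import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;

public class JsonCache {
    // writes the raw JSON to app-private internal storage instead of a database
    public static void save(Context context, String json) throws IOException {
        try (FileOutputStream out = context.openFileOutput("payload.json", Context.MODE_PRIVATE)) {
            out.write(json.getBytes(StandardCharsets.UTF_8));
        }
    }
}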
Changes in SQLite are ACID (atomic, consistent, isolated, durable). This means that every update, insert and delete operation runs as its own transaction by default. Unfortunately this adds overhead in the database processing, so you should wrap updates to the SQLite database in a transaction and commit that transaction after several operations. This can significantly improve performance.
The following code demonstrates that performance optimization.
db.beginTransaction();
try {
    for (int i = 0; i < values.length; i++) {
        // TODO prepare ContentValues object values
        db.insert(your_table, null, values);
        // in case you do larger updates, give waiting threads a chance
        db.yieldIfContendedSafely();
    }
    db.setTransactionSuccessful();
} finally {
    db.endTransaction();
}
Ref: http://www.vogella.com/tutorials/AndroidSQLite/article.html

Accelerate insertion into the SQLite database on Android?

I use this function to insert data into the Android SQLite database:
public long insertAccount(String code, String name, int s3, int s4, String s5, String s6, int s7,
        int s8, int s9, int s10, int s11, String s12, String s13, int s14, int s15, int s16) {
    // container for the column values you want inserted, updated, etc.
    ContentValues initialValues = new ContentValues();
    initialValues.put(Code, code);
    initialValues.put(Name, name);
    initialValues.put(Type, s3);
    initialValues.put(Level1, s4);
    initialValues.put(Father, s5);
    initialValues.put(ACCCurr, s6);
    initialValues.put(AccNat, s7);
    initialValues.put(LowLevel, s8);
    initialValues.put(DefNum, s9);
    initialValues.put(AccClass, s10);
    initialValues.put(SubClass, s11);
    initialValues.put(SSClass1, s12);
    initialValues.put(SSClass2, s13);
    initialValues.put(Stype1, s14);
    initialValues.put(Stype2, s15);
    initialValues.put(Stype3, s16);
    return db.insert(DATABASE_TABLE, null, initialValues);
}
But this takes a long time when inserting 70,000+ rows. How can I speed up the insertion, and once the insert is done, how can I apply updates to the data?
Some options:
Prepopulate your database. See "Ship an application with a database"
Use transactions to reduce the time waiting for I/O. See e.g. "Android SQLite database: slow insertion". Likely you cannot wrap all 70k rows in a single transaction but something like 100..1000 inserts per transaction should be doable, cutting the cumulative I/O wait time by orders of magnitude.
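As a rough sketch of that second option, here is one way to batch the inserts into fixed-size transactions; the Account model, the toContentValues() mapper and the CHUNK size are assumptions made up for this example, while DATABASE_TABLE is the constant from the question:
static final int CHUNK = 500; // rows per transaction; tune for your data

void insertInChunks(SQLiteDatabase db, List<Account> rows) {
    for (int start = 0; start < rows.size(); start += CHUNK) {
        int end = Math.min(start + CHUNK, rows.size());
        db.beginTransaction();
        try {
            for (int i = start; i < end; i++) {
                // toContentValues() is a hypothetical mapper from the model to ContentValues
                db.insert(DATABASE_TABLE, null, toContentValues(rows.get(i)));
            }
            db.setTransactionSuccessful();
        } finally {
            db.endTransaction();
        }
    }
}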
Inserting into SQLite on Android using PHP? I'm sorry, I didn't get how PHP would run on an Android phone.
Anyway, I believe you have written the Java code above and you have around 70k+ records that you want to insert into your db.
Inserting a large batch of records into any db is called a "bulk insert". The idea is to create as few transactions as possible and do all the inserts in one shot. Relational databases like SQL Server and Oracle have dedicated APIs for this; in SQLite the plain old approach is to make a single transaction with a bunch of data.
Check out this article, which uses the same technique and explains it quite well: http://www.techrepublic.com/blog/software-engineer/turbocharge-your-sqlite-inserts-on-android/
You have to use a transaction to commit the insertions in one go. You can use this:
// before insertion
db.beginTransaction();
try {
    // ... do the insertions here
    db.setTransactionSuccessful();
} finally {
    // after insertion
    db.endTransaction();
}

Android sqlite db load data to server and truncate

I have written an app that saves some data in a SQLite db. Periodically I want to send this data to a server and then truncate the tables in SQLite (so that the app does not fill up the space on the device).
I am using a singleton object of SQLiteOpenHelper (which I have read is thread-safe).
So my question is: does the following code look OK?
SQLiteOpenHelper openHelper = MyDBOpenHelper.getInstance();
SQLiteDatabase database = openHelper.getWritableDatabase();
database.beginTransaction();
try {
    Cursor cursor = database.rawQuery(SELECT_ALL_TABLE1_STMT, null);
    while (cursor.moveToNext()) {
        // save result in tmp list/buffer
    }
    cursor.close();
    database.execSQL(DELETE_ALL_TABLE1_STMT);
    database.setTransactionSuccessful();
} finally {
    database.endTransaction();
}
// send data to server
// and repeat the process for the rest of the tables.
If there is another thread trying to write to the same table (which I am reading from and will later truncate), does the above code look OK to handle that scenario?
thanks!
You need to read more on this; start here:
Synchronized Methods
Android Multi-threading
Plus AsyncTask, of course.
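To make those pointers a bit more concrete, here is a hedged sketch of running the whole snapshot-truncate-upload cycle off the UI thread with an AsyncTask; Row, snapshotAndTruncate() and uploadToServer() are hypothetical helpers standing in for the question's own read/delete and upload logic:
private class SyncTableTask extends AsyncTask<Void, Void, Boolean> {
    @Override
    protected Boolean doInBackground(Void... params) {
        SQLiteDatabase database = MyDBOpenHelper.getInstance().getWritableDatabase();
        // read and truncate inside one transaction (as in the question's code above)
        List<Row> buffer = snapshotAndTruncate(database);
        // push the buffered rows to the server; keep or retry the buffer if this fails
        return uploadToServer(buffer);
    }

    @Override
    protected void onPostExecute(Boolean uploaded) {
        // runs on the UI thread: update status, schedule a retry, etc.
    }
}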

Best way to structure a sync between a tablet and a database?

Just curious about the best practice for syncing data from a database to an Android tablet.
Tables:
- Part1
- Part2
- Part3
- Part4
- Part5
Whenever I open the app on the tablet I grab the latest lists from the database, truncate the table, and re-add the records. Each table consists of 400 records, and it takes around 60.45 seconds per table to grab the data from the server and add it. Since I have 5 tables, it takes around 5 minutes. Is there a better way to achieve efficient syncing for what I am doing? After I grab the data from the server, instead of truncating the table I've tried checking whether each record exists first before adding it, but that didn't help with the time.
What I am currently doing: I get the JSON list from the API server, truncate the table, and add the rows back. Pretty time-consuming with 5 tables of 500 records each.
libraryApp = (LibraryApp) act.getApplication();
List<Pair> technicians = getJsonData("get_technicians");
if (technicians.size() > 0) {
    libraryApp.getDataManager().emptyTechnicianTable(); // truncate current table
    // add technicians back to the database
    for (Pair p : technicians) {
        libraryApp.getDataManager().saveTechnician(new Technician((Integer) p.key(), (String) p.value()));
    }
}
Given the limited information provided I would try the following:
(1) Have your server keep a record of when the table you are updating was last "put" on the server. I have no idea what backend language you are using so I cannot make a suggestion, but it should be really easy to keep a lastUpdated timestamp.
With this timestamp you will be able to tell if the version of the table on your server is more recent than the version on your mobile device.
(2) Using an AsyncTask, download the data you need. I am not sure if all 5 tables are in the same activity, in separate activities, in fragments, or something else. However, the general idea is as follows:
private class GetTableData extends AsyncTask<Void, Void, Void> {
    @Override
    protected Void doInBackground(Void... params) {
        // get data from your server
        return null;
    }

    @Override
    protected void onPostExecute(Void result) {
        // update the table view if the version on the server is newer
    }
}
You will place all your I/O methods, that is, those that connect to your server and download data, within doInBackground. You will place all methods that update the table view within onPostExecute. This separation is necessary because network I/O is not allowed on the main thread on newer API levels, while views must be updated from the UI thread.
(3) Check the timestamp of what you downloaded. If what you downloaded is newer, update your table. You can accomplish this by simply adding a conditional statement to your onPostExecute function such that
if (lastDownloadTime < lastUpdatedOnServerTime) {
    // update view
}
Depending on how big the table files are, you may want to add a function to your server code that just returns the time the table was last updated. That way you can check the time it was last updated against the time you last downloaded the table. If the table was updated on the server after you downloaded it, you can proceed to download the new information.
That's the basic idea. You can adapt it to your own setup.
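Following that last refinement (ask the server for the timestamp first, download only if newer), a hedged sketch of GetTableData's doInBackground might look like this; fetchServerLastUpdated(), downloadTable() and the lastDownloadTime field are hypothetical helpers made up for illustration, not an existing API:
@Override
protected Void doInBackground(Void... params) {
    // 1. cheap call that only asks the server for the table's lastUpdated timestamp
    long lastUpdatedOnServerTime = fetchServerLastUpdated("technicians");
    // 2. download the full table only when the server copy is newer
    if (lastDownloadTime < lastUpdatedOnServerTime) {
        downloadTable("technicians");
        lastDownloadTime = lastUpdatedOnServerTime;
    }
    return null;
}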

Bulk Insertion on Android device

I want to bulk insert about 700 records into the Android database on my next upgrade. What's the most efficient way to do this? From various posts, I know that if I use Insert statements, I should wrap them in a transaction. There's also a post about using your own database, but I need this data to go into my app's standard Android database. Note that this would only be done once per device.
Some ideas:
1. Put a bunch of SQL statements in a file, read them in a line at a time, and exec the SQL.
2. Put the data in a CSV file, or JSON, or YAML, or XML, or whatever. Read a line at a time and do db.insert().
3. Figure out how to do an import and do a single import of the entire file.
4. Make a SQLite database containing all the records, copy that onto the Android device, and somehow merge the two databases.
5. [EDIT] Put all the SQL statements in a single file in res/values as one big string. Then read them a line at a time and exec the SQL.
What's the best way? Are there other ways to load data? Are 3 and 4 even possible?
Normally, each time db.insert() is used, SQLite creates a transaction (and resulting journal file in the filesystem), which slows things down.
If you use db.beginTransaction() and db.endTransaction(), SQLite creates only a single journal file on the filesystem and then commits all the inserts at the same time, dramatically speeding things up.
Here is some pseudo code from: Batch insert to SQLite database on Android
try {
    db.beginTransaction();
    for each record in the list {
        do_some_processing();
        if (line represents a valid entry) {
            db.insert(SOME_TABLE, null, SOME_VALUE);
        }
        some_other_processing();
    }
    db.setTransactionSuccessful();
} catch (SQLException e) {
    // handle/log the error; without setTransactionSuccessful() the transaction rolls back
} finally {
    db.endTransaction();
}
If you wish to abort a transaction due to an unexpected error or something, simply call db.endTransaction() without first marking the transaction as successful (db.setTransactionSuccessful()).
Another useful method is db.inTransaction() (returns true or false), which tells you whether you are currently in the middle of a transaction.
Documentation here
I've found that for bulk insertions, the (apparently little-used) DatabaseUtils.InsertHelper class is several times faster than using SQLiteDatabase.insert.
Two other optimizations also helped with my app's performance, though they may not be appropriate in all cases:
Don't bind values that are empty or null.
If you can be certain that it's safe to do it, temporarily turning off the database's internal locking can also help performance.
I have a blog post with more details.
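For reference, a minimal sketch of that InsertHelper pattern could look roughly like this; the table and column names and the Supplier model are assumptions (note that InsertHelper was deprecated in later API levels in favour of SQLiteStatement):
DatabaseUtils.InsertHelper ih = new DatabaseUtils.InsertHelper(db, "suppliers");
final int nameCol = ih.getColumnIndex("name");
final int cityCol = ih.getColumnIndex("city");
try {
    for (Supplier s : suppliers) {        // Supplier is a hypothetical model class
        ih.prepareForInsert();            // resets and reuses a pre-compiled INSERT statement
        ih.bind(nameCol, s.getName());
        ih.bind(cityCol, s.getCity());
        ih.execute();                     // runs the insert and returns the new row id
    }
} finally {
    ih.close();
}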
The example below will work perfectly:
String sql = "INSERT INTO " + DatabaseHelper.TABLE_PRODUCT_LIST
        + " VALUES (?,?,?,?,?,?,?,?,?);";
SQLiteDatabase db = this.getWritableDatabase();
SQLiteStatement statement = db.compileStatement(sql);
db.beginTransaction();
try {
    for (int idx = 0; idx < Produc_List.size(); idx++) {
        statement.clearBindings();
        statement.bindLong(1, Produc_List.get(idx).getProduct_id());
        statement.bindLong(2, Produc_List.get(idx).getCategory_id());
        statement.bindString(3, Produc_List.get(idx).getName());
        // statement.bindString(4, Produc_List.get(idx).getBrand());
        statement.bindString(5, Produc_List.get(idx).getPrice());
        // statement.bindString(6, Produc_List.get(idx).getDiscPrice());
        statement.bindString(7, Produc_List.get(idx).getImage());
        statement.bindLong(8, Produc_List.get(idx).getLanguage_id());
        statement.bindLong(9, Produc_List.get(idx).getPl_rank());
        statement.execute(); // unbound parameters (4 and 6) are inserted as NULL
    }
    db.setTransactionSuccessful();
} finally {
    db.endTransaction();
}
Well, my solution for this is kind of weird but works fine...
I compile a large amount of data and insert it in one go (a bulk insert?).
I use the db.execSQL(query) command and build the query with the following statement...
INSERT INTO yourtable SELECT * FROM (
SELECT 'data1','data2'.... UNION
SELECT 'data1','data2'.... UNION
SELECT 'data1','data2'.... UNION
.
.
.
SELECT 'data1','data2'....
)
The only problem is building the query, which can be kind of messy.
I hope it helps.
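If it helps, here is a hedged sketch of how that query string could be assembled from a list of rows; the two-column Row model and the target table are assumptions made up for this example:
StringBuilder sb = new StringBuilder("INSERT INTO yourtable SELECT * FROM (");
for (int i = 0; i < rows.size(); i++) {
    Row r = rows.get(i);                                // Row with two text fields is an assumption
    if (i > 0) sb.append(" UNION ");
    sb.append("SELECT ")
      .append(DatabaseUtils.sqlEscapeString(r.data1))   // quotes and escapes the literal
      .append(",")
      .append(DatabaseUtils.sqlEscapeString(r.data2));
}
sb.append(")");
db.execSQL(sb.toString());
Using UNION ALL instead of UNION would also keep duplicate rows and skip the de-duplication work, and very large batches may need to be split to stay under SQLite's compound SELECT limit.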
I don't believe there is any feasible way to accomplish #3 or #4 on your list.
Of your remaining options, two have the data file contain direct SQL, and the other keeps the data in a non-SQL format.
All three would work just fine, but the latter suggestion, reading the data from a formatted file and building the SQL yourself, seems the cleanest. If true batch update capability is added at a later date, your data file is still usable, or at least easily processable into a usable form. Also, creating the data file is more straightforward and less error-prone. Finally, having the "raw" data would allow import into other data-store formats.
In any case, you should (as you mentioned) wrap the groups of inserts in transactions to avoid the per-row transaction journal creation.
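As a rough illustration of that "formatted file" approach, a hedged sketch might read a CSV shipped in res/raw and insert each line inside one transaction; the resource name, table and column names are assumptions for this example:
// a minimal sketch, assuming a CSV in res/raw/seed_data.csv with "code,name" per line
db.beginTransaction();
try (BufferedReader reader = new BufferedReader(
        new InputStreamReader(context.getResources().openRawResource(R.raw.seed_data)))) {
    String line;
    while ((line = reader.readLine()) != null) {
        String[] fields = line.split(",");    // naive CSV split, fine for simple seed data
        ContentValues cv = new ContentValues();
        cv.put("code", fields[0]);
        cv.put("name", fields[1]);
        db.insert("accounts", null, cv);      // table and columns are assumptions
    }
    db.setTransactionSuccessful();
} catch (IOException e) {
    // leaving the transaction unsuccessful makes endTransaction() roll everything back
} finally {
    db.endTransaction();
}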
