Is using a transaction helpful in my Android scenario? - android

I am performing two operations in my application:
i> saving data (an insertion of data which is quite large)
ii> deleting data
I am using SQLite, creating a single database object, and reusing that object to perform the different database operations. When the application closes, I close the database object.
Saving and deleting the data takes approximately 4-5 seconds.
I have the following questions:
i> Is opening and closing the database multiple times to handle the database queries a better approach, or opening and closing the database only once?
ii> Will using a transaction (which I read about in some articles) while inserting and deleting the data help minimize the processing time?
If so, kindly provide some sample code/hints for transactions.
Thanks in advance.
Warm Regards,
CB

1) It depends on what you are trying to achieve. If you need to do database updates and update the UI accordingly, and that's an ongoing process for your application, it is better to leave the db connection open. If you just do some query to populate data once, or do some inserts, it is better to close the database connection as soon as you're finished.
2) Transactions greatly improve speed. Sample code:
db.beginTransaction();
try {
    // do your db operations here
    ...
    db.setTransactionSuccessful();
} finally {
    db.endTransaction();
}

Related

How do SQLite transactions on Android work?

My understanding of SQLite transactions on Android is based largely on this article. Its gist is the suggestion that
if you do not wrap calls to SQLite in an explicit transaction it will
create an implicit transaction for you. A consequence of these
implicit transactions is a loss of speed.
That observation is correct - I started using transactions to fix just that issue: speed. In my own Android app I use a number of rather complex SQLite tables to store JSON data, which I manipulate via the SQLite JSON1 extension - I use SQLCipher, which has JSON1 built in.
At any given time I have to manipulate - insert, update or delete - rows in several tables. Given the complexity of the JSON, I do this with the help of temporary tables I create for each table manipulation. The manipulation begins with SQL along the lines of
DROP TABLE IF EXISTS h1;
CREATE TEMP TABLE h1(v1 TEXT,v2 TEXT,v3 TEXT,v4 TEXT,v5 TEXT);
Some tables require just one table - which I usually call h1 - others need two in which case I call them h1 and h2.
The entire sequence of operations in any single set of manipulations takes the form
begin transaction
    manipulate Table 1, which
        creates its own temp tables, h1[h2],
        then extracts relevant existing JSON from Table 1 into the temps,
        manipulates h1[h2],
        performs inserts, updates, deletes in Table 1
    on to the next table, Table 2, where the same sequence is repeated
    continue with a variable list of such tables - never more than 5
end transaction
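A minimal sketch of that sequence using Python's sqlite3 (table names, temp-table columns and JSON contents are invented for illustration; on Android the same statements would run via execSQL inside one beginTransaction/endTransaction pair):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t1 (doc TEXT)")
conn.execute("CREATE TABLE t2 (doc TEXT)")
conn.execute("INSERT INTO t1 VALUES ('{\"a\": 1}')")
conn.execute("INSERT INTO t2 VALUES ('{\"b\": 2}')")

with conn:  # one transaction around the whole multi-table sequence
    for table in ("t1", "t2"):
        # fresh temp table per step, as in the DROP/CREATE pattern above
        conn.execute("DROP TABLE IF EXISTS h1")
        conn.execute("CREATE TEMP TABLE h1(v1 TEXT)")
        # extract the relevant existing rows into the temp table ...
        conn.execute("INSERT INTO h1 SELECT doc FROM " + table)
        # ... manipulate h1 here, then write the result back
        conn.execute("DELETE FROM " + table)
        conn.execute("INSERT INTO " + table + " SELECT v1 FROM h1")

rows = conn.execute("SELECT COUNT(*) FROM t1").fetchone()[0]
```

Because h1 is dropped and recreated at the top of each iteration, data left over from Table(n-1) can never leak into the step for Table(n).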
My questions
does this sound like an efficient way to do things or would it be better to wrap each individual table operation in its own transaction?
it is not clear to me what happens to my DROP TABLE/CREATE TEMP TABLE calls. If I end up with h1[h2] temp tables that are pre-populated with data from manipulating Table(n - 1) when working with Table(n) then the updates on Table(n) will go totally wrong. I am assuming that the DROP TABLE bit I have is taking care of this issue. Am I right in assuming this?
I have to admit to not being an expert with SQL, even less so with SQLite and quite a newbie when it comes to using transactions. The SQLite JSON extension is very powerful but introduces a whole new level of complexity when manipulating data.
The main reason to use transactions is to reduce the overhead of writing to the disk.
So if you don't wrap multiple changes (inserts, deletes and updates) in a transaction, then each will result in the database being written to disk, with the overhead that involves.
If you wrap them in a transaction, the in-memory version will be written only when the transaction is completed (note that if you use the SQLiteDatabase beginTransaction/endTransaction methods, as you should, then as part of ending the transaction you must call setTransactionSuccessful before calling endTransaction).
That is, the SQLiteDatabase methods differ from doing this via pure SQL, where you would begin the transaction and then end/commit it (i.e. without setTransactionSuccessful, the SQLiteDatabase methods will automatically roll back the transaction).
Saying that the statement :-
if you do not wrap calls to SQLite in an explicit transaction it will
create an implicit transaction for you. A consequence of these
implicit transactions is a loss of speed
basically reiterates :-
Any command that changes the database (basically, any SQL command
other than SELECT) will automatically start a transaction if one is
not already in effect. Automatically started transactions are
committed when the last query finishes.
SQL As Understood By SQLite - BEGIN TRANSACTION - i.e. it's not Android specific.
does this sound like an efficient way to do things or would it be
better to wrap each individual table operation in its own transaction?
Doing all the operations in a single transaction will be more efficient as there is just the single write to disk operation.
it is not clear to me what happens to my DROP TABLE/CREATE TEMP TABLE
calls. If I end up with h1[h2] temp tables that are pre-populated with
data from manipulating Table(n - 1) when working with Table(n) then
the updates on Table(n) will go totally wrong. I am assuming that the
DROP TABLE bit I have is taking care of this issue. Am I right in
assuming this?
Dropping the tables will ensure data integrity (i.e. you should, by the sound of it, do this), you could also use :-
CREATE TEMP TABLE IF NOT EXISTS h1(v1 TEXT,v2 TEXT,v3 TEXT,v4 TEXT,v5 TEXT);
DELETE FROM h1;
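The effect of that alternative can be checked quickly with Python's sqlite3 (names invented for illustration): DELETE FROM clears the rows while keeping the table, so a step never starts with stale data from the previous one:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

def reset_temp(conn):
    # the suggested alternative to DROP TABLE IF EXISTS + CREATE TEMP TABLE
    conn.execute("CREATE TEMP TABLE IF NOT EXISTS h1(v1 TEXT)")
    conn.execute("DELETE FROM h1")

reset_temp(conn)
conn.execute("INSERT INTO h1 VALUES ('left over from Table(n-1)')")
reset_temp(conn)  # the next table's step starts from an empty h1
leftover = conn.execute("SELECT COUNT(*) FROM h1").fetchone()[0]
print(leftover)  # → 0
```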

Best way to structure a sync between a tablet and a database?

Just curious on the best practice on syncing data from a database to an android tablet.
Tables:
- Part1
- Part2
- Part3
- Part4
- Part5
Whenever I open the app on the tablet I grab the latest lists from the database, truncate the table, and re-add the records. Each table consists of 400 records, so it takes around 60.45 seconds per table to grab the data from the server and add it. Since I have 5 tables, it takes around 5 minutes. Is there a better way to achieve efficient syncing for what I am doing? After I grab the data from the server, instead of truncating the table I've tried checking whether each record exists first before adding it, but that didn't help with the time.
What I am currently doing: I get the JSON list from the API server and truncate the table and add the rows back. Pretty time consuming with 5 tables of 500 records each.
libraryApp = (LibraryApp) act.getApplication();
List<Pair> technicians = getJsonData("get_technicians");
if (technicians.size() > 0) {
    libraryApp.getDataManager().emptyTechnicianTable(); // truncate current table
    // add technicians back to database
    for (Pair p : technicians) {
        libraryApp.getDataManager().saveTechnician(new Technician((Integer) p.key(), (String) p.value()));
    }
}
Given the limited information provided I would try the following:
(1) Have your server keep a record of when the table you are updating was last "put" on the server. I have no idea what backend language you are using, so I cannot make a suggestion, but it should be really easy to keep a lastupdated timestamp.
With this timestamp you will be able to tell if the version of the table on your server is more recent than the version on your mobile device.
(2) Using an AsyncTask, download the data you need. I am not sure if all 5 tables are in the same activity, in separate activities, in fragments, or something else. However, the general idea is as follows:
private class GetTableData extends AsyncTask<Void, Void, Void> {
    @Override
    protected Void doInBackground(Void... params) {
        // get data from your server
        return null;
    }

    @Override
    protected void onPostExecute(Void result) {
        // update the table view if version on server is newer
    }
}
You will place all your I/O methods - those that connect to your server and download data - within doInBackground. You will place all methods that update the table view within onPostExecute. This separation is necessary because while network I/O must run in the background (it has been disallowed on the main thread since Honeycomb), views must be updated from the UI thread.
(3) Check the timestamp of what you downloaded. If what you downloaded is newer, update your table. You can accomplish this by simply adding a conditional statement to your onPostExecute function, such that
if (lastDownloadTime < lastUpdatedOnServerTime) {
    // update view
}
Depending on how big the table files are, you may want to add a function on your server that just returns the time the table was last updated. That way you can check the time it was last updated against the time you last downloaded the table. If the table was updated on the server after you downloaded it, you can proceed to download the new information.
That's the basic idea. You can adapt it to your own set up.
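Independently of the timestamp check, the truncate-and-re-add loop itself gets much cheaper when it runs as a single transaction with a batched insert. A sketch with Python's sqlite3 (the technician table/columns are invented to mirror the question; on Android the equivalent is wrapping the loop in beginTransaction/endTransaction):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE technicians (id INTEGER PRIMARY KEY, name TEXT)")

# stand-in for the (id, name) pairs parsed from the JSON API response
technicians = [(i, "tech" + str(i)) for i in range(400)]

with conn:  # one commit for the truncate and all 400 inserts
    conn.execute("DELETE FROM technicians")  # truncate current table
    conn.executemany("INSERT INTO technicians VALUES (?, ?)", technicians)

count = conn.execute("SELECT COUNT(*) FROM technicians").fetchone()[0]
print(count)  # → 400
```

Without the transaction, each of the 400 inserts pays its own journal commit, which is likely where most of the 60 seconds per table goes.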

How to save large data quickly into sqlite database in android

I created a Calendar application for Android. When I store data (daily plans) for one month in a single operation, the emulator shows a not-responding alert, or it takes about 5 min to store the data. How can I store the data to the database quickly?
It is always recommended to use SQLite transactions when storing a large amount of data. A transaction uses a single journal file for the whole SQLite manipulation, allowing the entire process to complete quickly.
A simple Transaction would look like:
db.beginTransaction();
try {
    // perform sql add/edit/delete here
    db.setTransactionSuccessful();
} catch (Exception e) {
    // error in between database transaction
} finally {
    db.endTransaction();
}
P.S. If it's taking 5 min without transactions, then I would expect it to complete within ±10 seconds when using them.
Also, if the app does not respond to UI gestures, you are doing database operations on the main thread; move them to a background thread, preferably by using AsyncTask.

Android SQLite - Update table only if different

I currently successfully use a SQLite database which is populated with data from the web. I create an array of values and add these as a row to the database.
Currently to update the database, on starting the activity I clear the database and repopulate it using the data from the web.
Is there an easy method to do one of the following?
A: Only update a row in the table if the data has changed (I'm not sure how I could do this unless there was a consistent primary key - otherwise a new row would be added with the changed data, and there would be no way to know which of the old rows to remove)
B: get all of the rows of data from the web, then empty and fill the database in one go rather than after getting each row
I hope this makes sense. I can provide my code but I don't think it's especially useful for this example.
Context:
On starting the activity, the database is scanned to retrieve values for a different task. However, this takes longer than it needs to because the database is emptied and refilled slowly. Therefore the task can only complete when the database is fully repopulated.
In an ideal world, the database would be scanned and values used for the task, and that database would only be replaced when the complete set of updated data is available.
Your main concern with approach (b) - clearing out all data and slowly repopulating - seems to be that any query between emptying the table and completing the refill would see missing data.
You could simply put the empty/repopulate process in a transaction. Thereby the database will always have data to offer for reading.
Alternatively, if that's not a viable solution, how about appending the newer results to the existing ones, but inserted with an 'active' flag set to 0. Then, once the process of adding entries is complete, use a transaction to find and remove the currently active entries, and (in the same transaction) update the inactive entries to active.
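A sketch of that active-flag swap with Python's sqlite3 (table and column names invented; on Android each statement would be an execSQL call, with the final two inside beginTransaction/endTransaction):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE data (value TEXT, active INTEGER)")
conn.execute("INSERT INTO data VALUES ('old row', 1)")

# append the fresh web results as inactive rows; readers still see the old set
conn.executemany("INSERT INTO data VALUES (?, 0)",
                 [("new row 1",), ("new row 2",)])

with conn:  # swap old for new atomically
    conn.execute("DELETE FROM data WHERE active = 1")
    conn.execute("UPDATE data SET active = 1 WHERE active = 0")

rows = [r[0] for r in conn.execute("SELECT value FROM data WHERE active = 1")]
print(rows)  # → ['new row 1', 'new row 2']
```

Queries that filter on active = 1 therefore always see a complete data set, before and after the swap.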

Android sqlite database setlocale is too slow while opening db. How to solve?

I use the following method for reading/writing db:
Database is located at /data/data/{packagename}/databases/Database.db
Since the database is greater than 3Mb we found a specific solution to have it copied there and to have it populated with appropriate data.
Following is the class implementing the task to get the opened database. This class is a singleton.
public class DatabaseHelper extends SQLiteOpenHelper
To open the database we use the following method:
SQLiteDatabase db = DatabaseHelper.getInstance().getReadableDatabase();
Then rawQuery is used for querying the db:
Cursor cursor = db.rawQuery(query, null);
Then, as best fits our purposes, we fetch the database data into memory in ResultSet instances:
while (cursor.moveToNext()) {
    ResultSet rs = new ResultSet();
    rs.setThis(cursor.getInt(0));
    rs.setThat(cursor.getString(1));
    // and so on.. this is just an example
    ResultList.add(rs);
}
Finally:
cursor.close();
db.close();
Let me mention that, if necessary, a transaction is used too, but using a transaction didn't lead to a speed-up.
For every single query the pattern above is followed (quite closely). But unfortunately this solution seems very slow. So some method profiling was done, and it became clear that sqlite setLocale is always run by getReadableDatabase() (on a database which already exists - don't forget), and that that method takes most of the time - namely around 40% alone.
Please advise how to solve this problem, or please suggest another pattern that satisfies our needs!
Thanks in advance
Hi!
Funniest thing is, native_setLocale (which is causing the slow DB open) apparently doesn't even work: http://code.google.com/p/android/issues/detail?id=2625
Regarding the final step:
cursor.close();
db.close();
It's not possible to keep the database open between queries?
As with the question posed here (SQLCipher for Android getReadableDatabase() Overherad) the performance issue you are seeing is likely due to SQLCipher key derivation. Performance for opening a database is deliberately slow due to key derivation. You should cache the database connection so that it can be used multiple times without having to open and key the database repeatedly. If this is possible, opening the database once during startup is the preferred course of action. Subsequent access on the same database handle will not trigger key derivation, so performance will be much faster.
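The caching itself is language-neutral; in Python terms (the helper name is invented), open the database once and hand out the same handle thereafter, so the expensive open/key-derivation cost is paid only on the first call:

```python
import sqlite3

_cached_conn = None

def get_database():
    """Open the (expensively keyed) database once and reuse the handle."""
    global _cached_conn
    if _cached_conn is None:
        # the slow open/key derivation happens here, and only here
        _cached_conn = sqlite3.connect(":memory:")
    return _cached_conn

# every later call reuses the already-open connection
assert get_database() is get_database()
```

On Android the same idea is usually realized by holding the opened SQLiteDatabase (or the helper) in a singleton for the lifetime of the app, as the question's DatabaseHelper already does for the helper class - the missing piece is reusing the opened database handle rather than reopening it per query.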
