I created a Calendar application for Android. When I store data (daily plans) for one month in a single operation, the emulator shows a "not responding" alert, or it takes about 5 minutes to store the data. How can I store the data to the database quickly?
It is always recommended to use SQLite transactions when storing a large amount of data. A transaction uses a single journal file for all of the SQLite operations it wraps, which makes the whole process complete much more quickly.
A simple Transaction would look like:
db.beginTransaction();
try {
    // perform your SQL inserts/updates/deletes here
    db.setTransactionSuccessful();
} catch (SQLException e) {
    // error occurred during the database transaction
} finally {
    db.endTransaction();
}
P.S. If it's taking 5 minutes without transactions, it should complete in roughly 10 seconds when using them.
Also, if the app does not respond to UI gestures, you are doing database operations on the main thread. Move the database operations to a background thread, preferably by using an AsyncTask.
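A minimal sketch of that approach, assuming a helper object (here called planStore, with a storePlans() method, both illustrative and not from the question) that performs the transactional insert shown above:

// Names like SavePlansTask, planStore and storePlans() are illustrative placeholders.
private class SavePlansTask extends AsyncTask<Void, Void, Void> {
    @Override
    protected Void doInBackground(Void... params) {
        // runs on a background thread, so the UI stays responsive
        planStore.storePlans(); // wraps its inserts in a transaction as shown above
        return null;
    }

    @Override
    protected void onPostExecute(Void result) {
        // runs back on the UI thread; safe to update views or show a confirmation here
    }
}

// started from the UI thread, e.g. in a click handler:
new SavePlansTask().execute();

execute() is called from the UI thread; doInBackground() then runs off the main thread, so the "not responding" dialog no longer appears.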
My understanding of SQLite transactions on Android is based largely on this article. Its gist is that

if you do not wrap calls to SQLite in an explicit transaction it will create an implicit transaction for you. A consequence of these implicit transactions is a loss of speed.
That observation is correct - I started using transactions to fix just that issue: speed. In my own Android app I use a number of rather complex SQLite tables to store JSON data, which I manipulate via the SQLite JSON1 extension - I use SQLCipher, which has JSON1 built in.
At any given time I have to manipulate - insert, update or delete - rows in several tables. Given the complexity of the JSON, I do this with the help of temporary tables I create for each table manipulation. Each manipulation begins with SQL along the lines of
DROP TABLE IF EXISTS h1;
CREATE TEMP TABLE h1(v1 TEXT,v2 TEXT,v3 TEXT,v4 TEXT,v5 TEXT);
Some manipulations require just one temp table - which I usually call h1 - while others need two, in which case I call them h1 and h2.
The entire sequence of operations in any single set of manipulations takes the form
begin transaction
manipulate Table 1, which
creates its own temp tables, h1[h2],
then extracts relevant existing JSON from Table 1 into the temps
manipulates h1[h2]
performs inserts, updates, deletes in Table 1
on to the next table, Table 2 where the same sequence is repeated
continue with a variable list of such tables - never more than 5
end transaction
My questions
does this sound like an efficient way to do things or would it be better to wrap each individual table operation in its own transaction?
it is not clear to me what happens to my DROP TABLE/CREATE TEMP TABLE calls. If I end up with h1[h2] temp tables that are pre-populated with data from manipulating Table(n - 1) when working with Table(n) then the updates on Table(n) will go totally wrong. I am assuming that the DROP TABLE bit I have is taking care of this issue. Am I right in assuming this?
I have to admit to not being an expert with SQL, even less so with SQLite and quite a newbie when it comes to using transactions. The SQLite JSON extension is very powerful but introduces a whole new level of complexity when manipulating data.
The main reason to use transactions is to reduce the overheads of writing to the disk.
So if you don't wrap multiple changes (inserts, deletes and updates) in a transaction, each one results in its own write to disk, with the overhead that involves.
If you wrap them in a transaction, the changes are committed to disk only when the transaction is completed (note that when using the SQLiteDatabase beginTransaction/endTransaction methods, as you should, you need to call the setTransactionSuccessful method before calling the endTransaction method as part of ending the transaction).
That is, the SQLiteDatabase methods are different from doing this via pure SQL, where you would begin the transaction and then end/commit it yourself (i.e. the SQLiteDatabase methods otherwise automatically roll back the transaction).
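To make that rollback behaviour concrete, here is a small sketch (the table name, values and mightThrow() are placeholders, not code from the question or answer):

db.beginTransaction();
try {
    db.insert("example", null, values);
    mightThrow();                  // if this throws, setTransactionSuccessful()
    db.setTransactionSuccessful(); // is never reached...
} finally {
    db.endTransaction();           // ...and endTransaction() rolls the insert back
}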
That said, the statement :-
if you do not wrap calls to SQLite in an explicit transaction it will
create an implicit transaction for you. A consequence of these
implicit transactions is a loss of speed
basically reiterates :-
Any command that changes the database (basically, any SQL command
other than SELECT) will automatically start a transaction if one is
not already in effect. Automatically started transactions are
committed when the last query finishes.
(from SQL As Understood By SQLite - BEGIN TRANSACTION), i.e. it's not Android specific.
does this sound like an efficient way to do things or would it be
better to wrap each individual table operation in its own transaction?
Doing all the operations in a single transaction will be more efficient as there is just the single write to disk operation.
it is not clear to me what happens to my DROP TABLE/CREATE TEMP TABLE
calls. If I end up with h1[h2] temp tables that are pre-populated with
data from manipulating Table(n - 1) when working with Table(n) then
the updates on Table(n) will go totally wrong. I am assuming that the
DROP TABLE bit I have is taking care of this issue. Am I right in
assuming this?
Dropping the tables will ensure data integrity (i.e. you should, by the sound of it, do this). You could also use :-
CREATE TEMP TABLE IF NOT EXISTS h1(v1 TEXT,v2 TEXT,v3 TEXT,v4 TEXT,v5 TEXT);
DELETE FROM h1;
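Combined with the point above about a single transaction, the whole sequence might look something like this (a sketch only; the h1 definition matches the example above, and the per-table JSON manipulation is elided):

db.beginTransaction();
try {
    // Table 1
    db.execSQL("DROP TABLE IF EXISTS h1");
    db.execSQL("CREATE TEMP TABLE h1(v1 TEXT,v2 TEXT,v3 TEXT,v4 TEXT,v5 TEXT)");
    // ... extract JSON from Table 1 into h1, manipulate it, write back to Table 1 ...

    // Table 2: dropping and recreating h1 guarantees it starts empty again
    db.execSQL("DROP TABLE IF EXISTS h1");
    db.execSQL("CREATE TEMP TABLE h1(v1 TEXT,v2 TEXT,v3 TEXT,v4 TEXT,v5 TEXT)");
    // ... same pattern for the remaining tables ...

    db.setTransactionSuccessful();
} finally {
    db.endTransaction();
}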
I am developing an Android application in which I need to download a JSON string and save it in an SQLite database in a specific format (from my perspective, I have no other option for data storage). This is my table structure:
problem_table(pid INTEGER PRIMARY KEY,
num TEXT, title TEXT,
dacu INTEGER,
verdict_series TEXT)
At launch I need almost 4200 rows to be inserted into the database table. I am working on the emulator, and when I launch the app it works correctly, but it seems to freeze for a while once the database manipulation begins. Eventually the app manages to insert all the rows, but it takes quite a long time. At one point it even looks like this:
So how can I reduce the time and memory cost, do this in a more optimized way, or at least avoid this temporary freeze?
N.B.: I haven't checked it on any real device yet, as I don't have access to one. My emulator is using 512 MB of RAM and a 48 MB heap size.
Don't do your database manipulations on the UI thread; do them in an AsyncTask, Thread, Service or similar - anywhere but the UI thread.
I solved it using @Jakobud's answer given here.
Answer:
Normally, each time db.insert() is used, SQLite creates a transaction (and a resulting journal file in the filesystem). If you use db.beginTransaction() and db.endTransaction(), SQLite commits all the inserts at the same time, dramatically speeding things up.
Here is some pseudo code from: Batch insert to SQLite database on Android
try
{
    db.beginTransaction();

    for each record in the list
    {
        do_some_processing();

        if (line represents a valid entry)
        {
            db.insert(SOME_TABLE, null, SOME_VALUE);
        }

        some_other_processing();
    }

    db.setTransactionSuccessful();
}
catch (SQLException e)
{
    // nothing to do here; the transaction is rolled back in endTransaction()
}
finally
{
    db.endTransaction();
}
Just curious about the best practice for syncing data from a database to an Android tablet.
Tables:
- Part1
- Part2
- Part3
- Part4
- Part5
Whenever I open the app on the tablet I grab the latest lists from the database, truncate the table, and re-add the records. Each table consists of 400 records, so it takes around 60.45 seconds per table to grab the data from the server and add it. Since I have 5 tables it takes around 5 minutes. Is there a better way to achieve efficient syncing for what I am doing? After I grab the data from the server, instead of truncating the table I've tried checking whether each record exists first before adding it, but that didn't help with the time.
What I am currently doing: I get the JSON list from the API server, then truncate the table and add the rows back. Pretty time-consuming with 5 tables of 500 records each.
libraryApp = (LibraryApp) act.getApplication();
List<Pair> technicians = getJsonData("get_technicians");
if (technicians.size() > 0) {
    libraryApp.getDataManager().emptyTechnicianTable(); // truncate current table
    // add technicians back to the database
    for (Pair p : technicians) {
        libraryApp.getDataManager().saveTechnician(new Technician((Integer) p.key(), (String) p.value()));
    }
}
Given the limited information provided I would try the following:
(1) Have your server keep a record of when the table you are updating was last "put" on the server. I have no idea what backend language you are using, so I cannot make a suggestion, but it should be really easy to keep a lastUpdated timestamp.
With this timestamp you will be able to tell if the version of the table on your server is more recent than the version on your mobile device.
(2) Using an AsyncTask, download the data you need. I am not sure if all 5 tables are in the same activity, in separate activities, in fragments or something else. However, the general idea is as follows:
private class GetTableData extends AsyncTask<Void, Void, Void> {

    @Override
    protected Void doInBackground(Void... params) {
        // get data from your server
        return null;
    }

    @Override
    protected void onPostExecute(Void result) {
        // update the table view if the version on the server is newer
    }
}
You will place all your I/O methods, that is, those that connect to your server and download data, within doInBackground. You will place all methods that update the table view within onPostExecute. This separation is necessary because, while I/O must run in the background since Jelly Bean, views must be updated from the UI thread.
(3) Check the timestamp of what you downloaded. If what you downloaded is newer, update your table. You can accomplish this by simply adding a conditional statement to your onPostExecute function, such that
if (lastDownloadTime < lastUpdatedOnServerTime) {
    // update view
}
Depending on how big the table files are, you may want to add a function to your server code that just returns the time the table was last updated. That way you can check the time it was last updated against the time you last downloaded the table. If the table was updated on the server after you downloaded it, you can proceed to download the new information.
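A rough sketch of that check inside the doInBackground method above (all four helper methods here are hypothetical and would be your own code, not part of any Android API):

@Override
protected Void doInBackground(Void... params) {
    long serverLastUpdated   = getServerLastUpdated("technicians");   // cheap call that returns only a timestamp
    long localLastDownloaded = getLocalLastDownloaded("technicians");

    if (serverLastUpdated > localLastDownloaded) {
        downloadAndStoreTable("technicians");                         // only now pull the full table
        saveLocalLastDownloaded("technicians", serverLastUpdated);
    }
    return null;
}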
That's the basic idea. You can adapt it to your own set up.
I am developing an Android app, and I have a huge amount of data on the server: approximately 10,000 records with 10 fields each. I need to fetch this data and store it in my local database. I implemented this by downloading the data as JSON, parsing it, and inserting it into the database one row at a time. Downloading the data takes little time, but inserting it into the database takes much longer. After a while I realised that I am inserting into the database one row at a time, so the insert operation loops once for every record that was fetched. I looked for alternatives but could not find a way to do this, so I would appreciate suggestions and snippets to help me achieve it.
Thank you.
Use transactions to wrap multiple inserts into one operation; that's a lot faster: Improve INSERT-per-second performance of SQLite?
List<Item> list = getDataFromJson();
SQLiteDatabase db = getDatabase();

db.beginTransaction();
try {
    // perform all the inserts inside the transaction
    for (Item item : list) {
        db.insert(table, null, item.getContentValues());
    }
    // nothing has been committed to disk yet
    db.setTransactionSuccessful();
} finally {
    // the commit happens here (only if the transaction was marked successful)
    db.endTransaction();
}
I am performing two operations in my application:
i> Saving data (insertion of a fairly large amount of data)
ii> Deleting data
I am using SQLite and creating a single database object, which I reference to perform the different database operations. When the application is closing, I close the database object.
While saving and deleting the data, it takes approx 4-5 seconds.
I have the following questions:
i> Is opening and closing the database multiple times to handle the database queries a better approach, or is opening and closing the database only once better?
ii> Will using transactions (which I read about in some articles) while inserting and deleting the data help me minimize the processing time taken?
If so, kindly provide some sample code/hints for transactions.
Thanks in advance.
Warm Regards,
CB
1) It depends on what you are trying to achieve. If you need to do database updates and update the UI accordingly, and that's an ongoing process for your application, it's better to leave the db connection open. If you just do some query to populate data once, or do some inserts, it's better to close the database connection as soon as you're finished.
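For the "leave it open" case, a common pattern is a singleton SQLiteOpenHelper so the whole app shares one connection; a minimal sketch (the class name, database name and version here are illustrative):

import android.content.Context;
import android.database.sqlite.SQLiteDatabase;
import android.database.sqlite.SQLiteOpenHelper;

public class DbHelper extends SQLiteOpenHelper {

    private static DbHelper instance;

    // use the application context so no Activity is leaked
    public static synchronized DbHelper getInstance(Context context) {
        if (instance == null) {
            instance = new DbHelper(context.getApplicationContext());
        }
        return instance;
    }

    private DbHelper(Context context) {
        super(context, "app.db", null, 1);
    }

    @Override
    public void onCreate(SQLiteDatabase db) {
        // create your tables here
    }

    @Override
    public void onUpgrade(SQLiteDatabase db, int oldVersion, int newVersion) {
        // handle schema upgrades here
    }
}

You would then call DbHelper.getInstance(context).getWritableDatabase() wherever a handle is needed, instead of opening and closing the database for every query.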
2) Transactions greatly improve speed. Sample code:
db.beginTransaction();
try {
// do your db operations here
...
db.setTransactionSuccessful();
} finally {
db.endTransaction();
}
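Applied to the question's two operations, the deletions and insertions can share one transaction so there is only a single commit. A sketch with illustrative table and column names:

db.beginTransaction();
try {
    // delete the old data and insert the new batch within the same commit
    db.delete("plans", "expired = ?", new String[] { "1" });
    for (ContentValues row : newRows) {
        db.insert("plans", null, row);
    }
    db.setTransactionSuccessful();
} finally {
    db.endTransaction();
}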