Handling an exception within Room - android

I migrated from ORMLite to Android Room.
I use a custom query that can fail, but I can't find out how to handle the exception:
try {
    appDatabase.query(new SimpleSQLiteQuery(sql));
} catch (SQLException e) {
    // custom code
}
How can I achieve this with Android Room?
This type of query fails on some versions of SQLite:
INSERT INTO country(id,continent,name)
SELECT 1,'Asia','Afghanistan' UNION
SELECT 2,'Africa','Egypt'
Of course there are many more lines; it was written this way for performance reasons. If it fails, I run a fallback SQL batch instead.
This type of query is supported from a certain SQLite version on, BUT there are devices that don't support it even though they ship the required version (so I'd like to know when there is an error).
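For what it's worth, Android's SQLite exceptions are unchecked, so a plain try/catch still works with Room, provided the statement actually executes inside the try. A minimal sketch, assuming appDatabase is a RoomDatabase; runFallbackBatch() is a hypothetical stand-in for the row-by-row fallback:
try {
    // execSQL runs the statement immediately; note that Room's query()
    // returns a Cursor, and with framework SQLite the SQL may not run
    // until the cursor is first moved
    appDatabase.getOpenHelper().getWritableDatabase().execSQL(sql);
} catch (android.database.sqlite.SQLiteException e) {
    runFallbackBatch(); // hypothetical row-by-row fallback
}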

Related

SQLiteConstraintException error message for insert duplicate primary key

try {
db.insertOrThrow("savedreports", null, cv);
} catch (SQLException e) {
Log.e(TAG, e.getMessage(), e);
}
If the code above is executed while trying to insert a duplicate record (same primary key, and the same field values as well), what error message will the exception give?
"PRIMARY KEY must be unique"
or
"UNIQUE constraint failed"
or
???
My program grabs some records from a server and then informs the server to delete those records. However, sometimes the HTTP request to delete the records fails due to a bad connection, so I get the same records again when I re-request the latest records.
I currently insert these records using insertOrThrow, so when duplicate records occur I would like to know that the exception thrown is due exactly to duplicate records (and not to a null column constraint, the database connection being closed, etc.).
These constraints are designed to detect programming errors, not runtime errors. Therefore, there is no guaranteed error message.
If you want to check that the key is not a duplicate, you have to do the check yourself.
But if this is the only constraint that can fail, you can INSERT OR IGNORE instead (insertWithOnConflict() in Android).
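A minimal sketch of the INSERT OR IGNORE route: insertWithOnConflict() returns the new row's ID, or -1 when the insert was ignored:
long rowId = db.insertWithOnConflict("savedreports", null, cv,
        SQLiteDatabase.CONFLICT_IGNORE);
if (rowId == -1) {
    // nothing was inserted, most likely because the row already exists
}
Alternatively, catching SQLiteConstraintException instead of the broader SQLException narrows the catch to constraint failures, though it still won't distinguish a duplicate key from, say, a NOT NULL violation.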

Android Pre Populated Database - Adding New Pre Populated Rows After Publication

I'm having trouble with a pre-populated database in Android. Not the usual problems, though. I've got the database working just fine.
My problem comes with adding new data after the app has been published.
I spent a lot of time with the onUpgrade() method, but then it dawned on me that my problem is elsewhere. Once I've added new rows to the database in my assets folder, how do I get them added to the database that was copied to my data/data folder?
My database is where I store the level information for a game. The last column in the table is a flag marking a level as completed, so I can't lose this information.
You could add some SQL patch files and then read them to upgrade your database.
I did this simply with the FileHelper static class I copied from Danny Remington's post on SO, and then:
InputStream in = null;
try {
    in = mgr.open(assetFile);
    String[] statements = FileHelper.parseSqlFile(in);
    // dao.mDb is a reference to the SQLiteDatabase from my DAO
    dao.mDb.beginTransaction();
    try {
        for (String statement : statements) {
            dao.mDb.execSQL(statement);
        }
        dao.mDb.setTransactionSuccessful();
    } finally {
        dao.mDb.endTransaction(); // rolls back if anything above threw
    }
} catch (IOException e) {
    // the asset could not be read; handle or log
} finally {
    if (in != null) {
        try { in.close(); } catch (IOException ignored) { }
    }
}
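One way to tie this into onUpgrade() is to name each patch file after the schema version it brings the database up to. A hedged sketch; the naming scheme and the applyPatch() helper are assumptions, not part of the answer above:
@Override
public void onUpgrade(SQLiteDatabase db, int oldVersion, int newVersion) {
    // apply every missed patch in order: assets/patch_2.sql, patch_3.sql, ...
    for (int version = oldVersion + 1; version <= newVersion; version++) {
        applyPatch(db, "patch_" + version + ".sql"); // hypothetical helper wrapping the code above
    }
}
Because the patches only insert new rows, the completed-level flags already stored in the copied database are untouched.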

How to avoid app failure against large database manipulation

I am developing an Android application in which I need to download a JSON string and save it in an SQLite database in a specific format (from my perspective, I have no other data-storage option to choose). This is my table structure:
problem_table(pid INTEGER PRIMARY KEY,
num TEXT, title TEXT,
dacu INTEGER,
verdict_series TEXT)
At launch I need almost 4200 rows to be inserted into the database table. I am working on the emulator, and when I launch the app it works perfectly, but the app seems to freeze for a while once the database manipulation begins. Eventually the app manages to insert all the rows, but it takes quite a long time, and at one point the app even appears completely frozen.
So how can I reduce the time and memory cost, do this in a more optimized way, and avoid this temporary failure?
N.B.: I haven't checked it on a real device yet, for lack of access to one. My emulator uses 512 MB of RAM and a 48 MB heap.
Don't do your database manipulations on the UI thread; use an AsyncTask, a Thread, a Service, or whatever, but not the UI thread.
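A minimal sketch of the AsyncTask route; insertAllRows() is a hypothetical method wrapping the insert loop:
new AsyncTask<Void, Void, Void>() {
    @Override
    protected Void doInBackground(Void... params) {
        insertAllRows(); // runs on a background thread
        return null;
    }

    @Override
    protected void onPostExecute(Void result) {
        // back on the UI thread: hide the progress indicator, refresh views
    }
}.execute();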
I solved it with @Jakobud's answer, given here:
Answer:
Normally, each time db.insert() is used, SQLite creates a transaction (and resulting journal file in the filesystem). If you use db.beginTransaction() and db.endTransaction() SQLite commits all the inserts at the same time, dramatically speeding things up.
Here is some pseudo code from: Batch insert to SQLite database on Android
try {
    db.beginTransaction();
    for each record in the list {
        do_some_processing();
        if (line represents a valid entry) {
            db.insert(SOME_TABLE, null, SOME_VALUE);
        }
        some_other_processing();
    }
    db.setTransactionSuccessful();
} catch (SQLException e) {
    // at minimum, log the failure instead of swallowing it
} finally {
    db.endTransaction();
}
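As a concrete (non-pseudo) version, assuming the JSON has already been parsed into a List<ContentValues> named rows:
db.beginTransaction();
try {
    for (ContentValues row : rows) {
        db.insert("problem_table", null, row);
    }
    db.setTransactionSuccessful(); // without this, endTransaction() rolls back
} finally {
    db.endTransaction();
}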

Check if query was successful in sqlite for Android

In order to insert data into an SQLite table named mytable on Android, I use this query:
db.execSQL("INSERT INTO MYTABLE VALUES('','"+name+"',NOW());");
I want to check whether this query was successfully inserted into the table or not.
In PHP (with MySQL) you could easily do:
if($result)
echo 'success';
else
echo 'not inserted';
But what is the equivalent code in Android (with SQLite) to check whether the query was executed successfully, without any error?
According to the documentation, execSQL throws an SQLException, so I think your best bet is to surround your execSQL call with a try/catch; you can then deal with whatever error occurs in the catch block. So something like this is what you want (note that SQLite has no NOW() function, so the timestamp is written with SQLite's datetime('now'), and the name value is bound as an argument to avoid SQL injection):
try {
    // datetime('now') is SQLite's equivalent of MySQL's NOW(); binding
    // name as an argument avoids SQL injection
    db.execSQL("INSERT INTO MYTABLE VALUES ('', ?, datetime('now'));",
            new Object[] { name });
} catch (SQLException e) {
    // ...deal with the error
}
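If you would rather mirror the PHP-style if/else than catch an exception, note that insert() returns the new row's ID, or -1 if an error occurred (it logs the exception instead of throwing):
ContentValues cv = new ContentValues();
cv.put("name", name); // assumes the column is called name
long rowId = db.insert("MYTABLE", null, cv);
if (rowId != -1) {
    Log.d(TAG, "success");
} else {
    Log.d(TAG, "not inserted");
}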

Bulk Insertion on Android device

I want to bulk insert about 700 records into the Android database on my next upgrade. What's the most efficient way to do this? From various posts, I know that if I use INSERT statements, I should wrap them in a transaction. There's also a post about using your own database, but I need this data to go into my app's standard Android database. Note that this would only be done once per device.
Some ideas:
1. Put a bunch of SQL statements in a file, read them in a line at a time, and exec the SQL.
2. Put the data in a CSV file, or JSON, or YAML, or XML, or whatever. Read a line at a time and do db.insert().
3. Figure out how to do an import and do a single import of the entire file.
4. Make a SQLite database containing all the records, copy that onto the Android device, and somehow merge the two databases.
5. [EDIT] Put all the SQL statements in a single file in res/values as one big string. Then read them a line at a time and exec the SQL.
What's the best way? Are there other ways to load data? Are 3 and 4 even possible?
Normally, each time db.insert() is used, SQLite creates a transaction (and resulting journal file in the filesystem), which slows things down.
If you use db.beginTransaction() and db.endTransaction() SQLite creates only a single journal file on the filesystem and then commits all the inserts at the same time, dramatically speeding things up.
Here is some pseudo code from: Batch insert to SQLite database on Android
try {
    db.beginTransaction();
    for each record in the list {
        do_some_processing();
        if (line represents a valid entry) {
            db.insert(SOME_TABLE, null, SOME_VALUE);
        }
        some_other_processing();
    }
    db.setTransactionSuccessful();
} catch (SQLException e) {
    // at minimum, log the failure instead of swallowing it
} finally {
    db.endTransaction();
}
If you wish to abort a transaction due to an unexpected error or similar, simply call db.endTransaction() without first setting the transaction as successful via db.setTransactionSuccessful().
Another useful method is db.inTransaction() (returns true or false), which tells you whether you are currently in the middle of a transaction.
The SQLiteDatabase documentation covers these methods.
I've found that for bulk insertions, the (apparently little-used, and since deprecated in favor of SQLiteStatement) DatabaseUtils.InsertHelper class is several times faster than using SQLiteDatabase.insert.
Two other optimizations also helped with my app's performance, though they may not be appropriate in all cases:
Don't bind values that are empty or null.
If you can be certain that it's safe to do it, temporarily turning off the database's internal locking can also help performance.
I have a blog post with more details.
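For reference, a hedged sketch of the InsertHelper pattern; the table and column names here are made up:
DatabaseUtils.InsertHelper ih = new DatabaseUtils.InsertHelper(db, "SOME_TABLE");
final int nameColumn = ih.getColumnIndex("name");
try {
    for (String n : names) {
        ih.prepareForInsert(); // reuses a single compiled statement per row
        ih.bind(nameColumn, n);
        ih.execute();
    }
} finally {
    ih.close();
}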
The example below works well:
String sql = "INSERT INTO " + DatabaseHelper.TABLE_PRODUCT_LIST
        + " VALUES (?,?,?,?,?,?,?,?,?);";
SQLiteDatabase db = this.getWritableDatabase();
SQLiteStatement statement = db.compileStatement(sql);
db.beginTransaction();
try {
    for (int idx = 0; idx < Produc_List.size(); idx++) {
        statement.clearBindings();
        statement.bindLong(1, Produc_List.get(idx).getProduct_id());
        statement.bindLong(2, Produc_List.get(idx).getCategory_id());
        statement.bindString(3, Produc_List.get(idx).getName());
        // parameters 4 and 6 are left unbound and therefore insert as NULL
        // statement.bindString(4, Produc_List.get(idx).getBrand());
        statement.bindString(5, Produc_List.get(idx).getPrice());
        // statement.bindString(6, Produc_List.get(idx).getDiscPrice());
        statement.bindString(7, Produc_List.get(idx).getImage());
        statement.bindLong(8, Produc_List.get(idx).getLanguage_id());
        statement.bindLong(9, Produc_List.get(idx).getPl_rank());
        statement.execute();
    }
    db.setTransactionSuccessful();
} finally {
    db.endTransaction(); // runs even if a bind or execute throws
}
Well, my solution for this is kind of weird, but it works fine...
I compile a large amount of data and insert it in one go (bulk insert?).
I use the db.execSQL(Query) command, and I build the Query with the following statement:
INSERT INTO yourtable SELECT * FROM (
SELECT 'data1','data2'.... UNION
SELECT 'data1','data2'.... UNION
SELECT 'data1','data2'.... UNION
.
.
.
SELECT 'data1','data2'....
)
The only problem is building the query, which can get kind of messy.
I hope it helps; a sketch of the query-building loop follows.
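A sketch of building that compound query in code; values is a hypothetical list of already-quoted-and-escaped column pairs. One caveat: SQLite's SQLITE_MAX_COMPOUND_SELECT limit (500 by default) caps how many SELECTs a single statement may contain, which may also explain the device-specific failures mentioned in the Room question at the top:
StringBuilder sb = new StringBuilder("INSERT INTO yourtable SELECT * FROM (");
for (int i = 0; i < values.size(); i++) {
    if (i > 0) {
        sb.append(" UNION ");
    }
    String[] row = values.get(i);
    sb.append("SELECT ").append(row[0]).append(",").append(row[1]);
}
sb.append(")");
db.execSQL(sb.toString());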
I don't believe there is any feasible way to accomplish #3 or #4 on your list.
Of the other solutions, you list two that have the data file contain direct SQL (ideas 1 and 5), and one that has the data in a non-SQL format (idea 2).
All three would work just fine, but the suggestion of grabbing the data from a formatted file and building the SQL yourself seems the cleanest. If true batch-update capability is added at a later date, your data file is still usable, or at least easily processable into a usable form. Also, creating the data file is more straightforward and less error-prone. Finally, having the "raw" data would allow import into other data-store formats.
In any case, you should (as you mentioned) wrap the groups of inserts into transactions to avoid the per-row transaction journal creation.
