I currently successfully use a SQLite database which is populated with data from the web. I create an array of values and add these as a row to the database.
Currently, to update the database, I clear it when the activity starts and repopulate it with the data from the web.
Is there an easy method to do one of the following?
A: Only update a row in the table if its data has changed. (I'm not sure how I could do this without a consistent primary key: a new row would be added with the changed data, but there would be no way to know which of the old rows to remove.)
B: Get all of the rows of data from the web first, then empty and refill the database in one go rather than row by row.
I hope this makes sense. I can provide my code but I don't think it's especially useful for this example.
Context:
On starting the activity, the database is scanned to retrieve values for a separate task. However, this takes longer than it needs to because the database is emptied and refilled slowly, so the task can only complete once the database is fully repopulated.
In an ideal world, the existing database would be scanned and its values used for the task, and it would only be replaced once the complete set of updated data is available.
Your main concern with approach (B) - clearing out all data and slowly repopulating - seems to be that any query between emptying the table and completing the refill would have to be refused.
You could simply wrap the empty/repopulate process in a transaction. That way the database always has data to offer readers.
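A minimal sketch of that, assuming the rows have already been downloaded into a list of ContentValues and the table is called events (both are placeholder names):
SQLiteDatabase db = dbHelper.getWritableDatabase();
db.beginTransaction();
try {
    db.delete("events", null, null);             // empty the table
    for (ContentValues row : downloadedRows) {   // refill from the data fetched from the web
        db.insert("events", null, row);
    }
    db.setTransactionSuccessful();               // only now does the swap become visible
} finally {
    db.endTransaction();
}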
Alternatively, if that's not a viable solution, how about appending the newer results to the existing ones, but inserted with an 'active' flag set to 0. Then, once the process of adding entries is complete, use a transaction to find and remove the currently active entries, and (in the same transaction) update the inactive entries to active.
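A sketch of that alternative, again with placeholder table and column names - new rows are inserted with active = 0 while the download runs, then the two sets are swapped in a single transaction:
SQLiteDatabase db = dbHelper.getWritableDatabase();
db.beginTransaction();
try {
    db.delete("events", "active = 1", null);          // remove the currently active entries
    ContentValues promote = new ContentValues();
    promote.put("active", 1);
    db.update("events", promote, "active = 0", null); // promote the freshly inserted entries
    db.setTransactionSuccessful();
} finally {
    db.endTransaction();
}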
Suppose that in my app I have an SQLite table that can contain at most 20 rows. Each row has two columns (id, name), and I frequently need to search by id to get the name. For this frequent lookup I have two solutions:
Solution 1: Load the rows into an ArrayList<Model> and then find the name in the array.
Solution 2: Query the SQLite table every time.
Now please give your opinion: which one is better?
Remember, I need this lookup for my RecyclerView items, so it is called very frequently.
Thanks
I don't really get what your real intent is, but if your task is to search by id often, I would use
LongSparseArray<String> idsToNames; // or LongSparseArray<Model>
which will map primitive longs to Strings in a more memory-efficient way than a Map, and will perform better than an ArrayList when searching.
The advantage over querying SQLite here is that you can do it in a blocking manner instead of having to query database on a background thread every time the lookup runs.
The disadvantage is that whenever the data changes in SQLite, you will have to rebuild your idsToNames map. Also, if the number of entries in SQLite eventually grows, you will end up with a large collection. So I would recommend this approach only if the database will not be updated during the session, and if the data size is predictable or fixed.
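A rough sketch of how that could look, assuming a table items with id and name columns (placeholder names), a db obtained from your open helper, and data that fits comfortably in memory:
LongSparseArray<String> idsToNames = new LongSparseArray<>();
Cursor c = db.query("items", new String[]{"id", "name"}, null, null, null, null, null);
try {
    while (c.moveToNext()) {
        idsToNames.put(c.getLong(0), c.getString(1));   // build the lookup once
    }
} finally {
    c.close();
}
// In onBindViewHolder: constant-time lookup, no database access on the UI path.
String name = idsToNames.get(itemId);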
I need to synchronize the data in my application. I make the request to the server, bind the response, and use copyToRealmOrUpdate(Iterable<E> objects) to add or update this data in the database.
But my local data can become invalid, and I need a way to delete everything that is not present in the data returned by the request. I don't want to truncate or do a manual delete for this, because performance matters.
IDEA 1
@beeender
What do you think about use the PRIMARY_KEY of the table to delete the data that I don't want (or I don't need)?
Looks like:
1º: If the database is already populated, get all the primary keys and add them to a HashMap (or anything that does the same job).
2º: When an item is updated or added, remove it from the HashMap (using its primary key).
3º: Remove from the Realm all the items left in the HashMap.
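A rough sketch of IDEA 1, assuming the older Realm Java API used in the answer below (allObjects/clear) and a model MyModel with a long id primary key; all names are placeholders, and a HashSet is used instead of a HashMap since only the keys are needed:
Set<Long> staleIds = new HashSet<>();
for (MyModel existing : realm.allObjects(MyModel.class)) {
    staleIds.add(existing.getId());                  // 1º: remember every stored key
}
realm.beginTransaction();
realm.copyToRealmOrUpdate(downloadedObjects);        // add or update from the request
for (MyModel fresh : downloadedObjects) {
    staleIds.remove(fresh.getId());                  // 2º: drop the keys we just received
}
for (Long id : staleIds) {                           // 3º: whatever is left is stale
    realm.where(MyModel.class).equalTo("id", id).findAll().clear();
}
realm.commitTransaction();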
Maybe an in-memory Realm would be a good choice for you in this situation. You can find the related documentation here.
By using the in-memory Realm:
The db will be empty when you start a new app process
After you close all the instances of the Realm, the data will be cleared as well.
----------------------------------- Update for deleting data for normal case -----------------------------------------
For deleting, there are some options you can use
Remove all data for a specific model, see doc
realm.allObjects(MyModel.class).clear();
Remove entire data from a given Realm via the Realm API (https://realm.io/docs/java/latest/api/io/realm/Realm.html#deleteRealm(io.realm.RealmConfiguration)) - close all instances first:
Realm.deleteRealm(realmConfig);
Or just remove the Realm file through the normal Java file API.
If you really care about performance, you could consider keeping that data in a separate Realm and using option 2 or 3 to remove it. See the doc here for using different Realms through RealmConfiguration.
----------------------------------- Update for delete by Date field ------------------------------------------------------
For your use case, this would be a good choice:
Add a Date field to your model, and annotate it with @Index to make queries on it faster.
When updating/adding rows, set the modified date to the current time.
Delete the objects whose modifiedDate is before the current date:
realm.where(MyModel.class).lessThan("modifiedDate", currentDate).findAll().clear();
NOTE: "The dates are truncated with a precision of one second. In order to maintain compatibility between 32 bits and 64 bits devices, it is not possible to store dates before 1900-12-13 and after 2038-01-19." See current limitations. If you could modified the table in a very short time which the accuracy doesn't fit, consider to use a int field instead. You can get the column's max value by RealmResult.max()
I already have an SQLite database set up, which I am using as a cache for the Android application. The application makes an HTTP request and gets back a list of objects which I can insert into the db. After the first request, if I make any more requests, how do I do all of the following in a better way:
1) insert all new objects from the list
2) update all objects that were already in the db
3) delete all rows that were not there in the latest list of objects.
I know that options 1 and 2 can be done using an "INSERT OR REPLACE" query. How can I manage the 3rd option efficiently?
Right now my approach is to delete everything from the table and then insert everything, but that isn't very efficient. Any ideas how to improve it?
For that you can use the ids of the rows: first retrieve all the rows you want to delete with a SELECT query and add their ids to a temporary ArrayList, then loop over that ArrayList and delete each of those rows with a DELETE query.
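A rough sketch of that, where items, its id column, and freshIds (the ids present in the latest response) are placeholders:
List<Long> toDelete = new ArrayList<>();
Cursor c = db.query("items", new String[]{"id"}, null, null, null, null, null);
try {
    while (c.moveToNext()) {
        long id = c.getLong(0);
        if (!freshIds.contains(id)) {
            toDelete.add(id);                        // row no longer exists on the server
        }
    }
} finally {
    c.close();
}
for (long id : toDelete) {
    db.delete("items", "id = ?", new String[]{String.valueOf(id)});
}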
You should do your operations using the applyBatch() method of the ContentProvider (http://developer.android.com/reference/android/content/ContentProvider.html#applyBatch(java.util.ArrayList)).
You can run this method on a separate thread asynchronously so that you do not block anything else. You will have to create a list of ContentProviderOperations: you only need to specify the ones to insert or update in the ArrayList, and implement applyBatch() so that it also deletes the remaining entries in the database.
To answer your question about how to delete the entries that are not in the latest list, the straightforward approach is to walk through your data and delete the ones that should no longer exist.
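A sketch of such a batch, where AUTHORITY, CONTENT_URI, the column names, Item, and joinedFreshIds (the new ids joined as "1,2,3") are placeholders, and the provider is assumed to treat the inserts as insert-or-update:
ArrayList<ContentProviderOperation> ops = new ArrayList<>();
for (Item item : latestItems) {
    ops.add(ContentProviderOperation.newInsert(CONTENT_URI)
            .withValue("id", item.getId())
            .withValue("name", item.getName())
            .build());
}
// Delete everything the latest response no longer contains.
ops.add(ContentProviderOperation.newDelete(CONTENT_URI)
        .withSelection("id NOT IN (" + joinedFreshIds + ")", null)
        .build());
try {
    getContentResolver().applyBatch(AUTHORITY, ops);
} catch (RemoteException | OperationApplicationException e) {
    // the whole batch failed; log or retry
}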
I guess the intention is to refresh the HTTP request result set saved in the database. So I think the most efficient way is to do a transaction or batch operation that deletes all rows from the table first and then inserts the new rows. A transaction might be better, so that the result rows are either all new or all old, but never mixed.
I have two tables, SyncedComments and QueuedComments; the latter holds local comments until they are synced with a webserver. When they are synced successfully they get placed in the synced table, and my application should be indifferent to each type. I load in the comments through a CursorLoader, and they may be moved to the synced table while users are reading them. Let's say the user can also edit comments, perhaps while they are being moved, so the application should know where a comment is, regardless of its table.
To support this, I've thought of having a table with 3 columns: local_id, synced_id and queued_id. The local_id is persistent and simply serves as a reference to one of the two other ids. When a comment is created, a new row is inserted with its synced_id set to NULL and the queued_id it has been given; when a comment is moved, the queued_id is set to NULL and the synced_id is set. This way my application only ever needs to reference the local id.
How does this solution look? Any flaws? Could it be done smarter?
I would in the first place put all the comments in one table, with a flag for whether the comment is synchronized (actually it would probably be the comment's ID on the server, set to NULL until it is synchronized and filled with the value obtained from the server afterwards). That takes you down to 1 table instead of 3, makes it easier to show all comments (because you won't need a union), and above all avoids problems when a comment is synchronized while being shown, because the comment will not be moving anywhere. It also does fewer writes to the database file, so it causes less fragmentation and fewer writes to the flash device.
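A sketch of that single-table layout; the table and column names are only suggestions:
db.execSQL("CREATE TABLE comments ("
        + "local_id INTEGER PRIMARY KEY AUTOINCREMENT, "
        + "server_id INTEGER, "                      // NULL until the comment is synchronized
        + "body TEXT NOT NULL)");
// A comment is 'queued' while server_id IS NULL and 'synced' once it is set:
db.execSQL("UPDATE comments SET server_id = ? WHERE local_id = ?",
        new Object[]{serverId, localId});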
I have an application that uses a pre-populated database to list events. The app allows people to save these events to their favorites by setting the column isFavorite to 1. The user can then view a list of only the 'favorited' events, which searches for all rows that have isFavorite = 1.
If any changes happen to the events, or I need to add more to the list, I have to make those changes and then push the update to the app, which completely writes over the table, clearing out their favorites.
Is there any way that I can, on upgrade, save a list of ids of all the events the user has favorited, and then, after the new database has been loaded, set all the ids in that list back to 1 so the user doesn't lose their favorites data?
If there are any other better solutions to this problem I would really appreciate it, this has been the biggest hurdle for me so far.
I guess you have an SQLiteOpenHelper class? This class must be extended and then provides two methods: onCreate (which is called when the database is opened but doesn't exist yet - normally this creates your database in the first place) and onUpgrade (which is called when the database structure should be updated to a new version).
The onUpgrade method has an SQLiteDatabase parameter, which is your database. You can query the information you need, save it, and then create the new database tables. After that, you can insert your saved data back into the database.
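A rough sketch of that inside the helper, assuming an events table with an _id primary key and that onCreate() recreates and repopulates the table; all names are placeholders:
@Override
public void onUpgrade(SQLiteDatabase db, int oldVersion, int newVersion) {
    // 1. Remember which events were favorited in the old table.
    List<Long> favoriteIds = new ArrayList<>();
    Cursor c = db.rawQuery("SELECT _id FROM events WHERE isFavorite = 1", null);
    try {
        while (c.moveToNext()) {
            favoriteIds.add(c.getLong(0));
        }
    } finally {
        c.close();
    }
    // 2. Replace the table with the new pre-populated data.
    db.execSQL("DROP TABLE IF EXISTS events");
    onCreate(db);
    // 3. Restore the flag for ids that still exist in the new data.
    for (long id : favoriteIds) {
        db.execSQL("UPDATE events SET isFavorite = 1 WHERE _id = ?", new Object[]{id});
    }
}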
Can't you cope with this in your DB design? Have a user favourites table that holds the ids. So long as those ids don't change, an upgrade surely won't affect it?
One possible solution is backing up part or all of your database to restore at a later time. I found this guide quite handy http://www.screaming-penguin.com/node/7749
Alternatively, you may save the favorites in SharedPreferences. See http://developer.android.com/reference/android/content/SharedPreferences.html for more information on that.