DBFlow - update multiple items efficiently - android

ArrayList<MyObject> list = ...; // get the data from the database
// make some changes to the values of the objects in the list
for (MyObject object : list) {
    object.save();
}
Instead of this for loop, is there a more efficient way to save multiple items than calling save() over and over?

Finally found it:
FlowManager.getModelAdapter(MyTable.class).saveAll(myObjectsList);
It's faster than calling save() multiple times. In some quick testing with 15 items, the repeated save() calls took an average of 120 milliseconds while saveAll() took about 100 milliseconds. Percentage-wise, that's quite a big difference.

Calling save() multiple times is fine. If there are a thousand entries, you can do it in an asynchronous transaction to avoid blocking the UI.
If you really want to save only once, create a new table (model) named ListObject, then use a one-to-many relationship to store it as a List<Object>. That way you only have to save once.
https://agrosner.gitbooks.io/dbflow/content/Relationships.html
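As a rough illustration of keeping the UI thread free (this is a plain worker thread, not DBFlow's own asynchronous transaction API, whose exact builder calls vary between versions), the bulk save from the accepted answer can simply be pushed off the main thread. MyTable and myObjectsList are the names used above.

// Minimal sketch: run the bulk save on a worker thread so the UI is never blocked.
new Thread(new Runnable() {
    @Override
    public void run() {
        FlowManager.getModelAdapter(MyTable.class).saveAll(myObjectsList);
    }
}).start();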

Related

Android/java multi threaded fetch and store images in arraylist in order

I needed to fetch a series of images in sequence.
At the moment I have one thread fetching these images in sequence and storing them in an ArrayList.
Another thread reads and displays the images from the ArrayList. If playback gets ahead, it sleeps for 3 seconds (buffering), waiting for the ArrayList to fill up.
It works relatively well, but there is too much buffering.
The issue is that the ArrayList does not fill up fast enough.
I think I need to fetch several images at once, and the problem is how to manage the order in the ArrayList and how to make sure there are no "holes" in it (frame 2 loaded before frame 1).
OK, I got it: I use a ThreadPoolExecutor for multi-threaded downloading of the images and a HashMap to preserve the order.
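A minimal sketch of that idea, assuming a hypothetical downloadImage() helper: each download task is submitted together with its frame index, results land in a ConcurrentHashMap keyed by that index, and the playback side only consumes frame n once it is present, so out-of-order completion cannot create holes.

import android.graphics.Bitmap;
import java.util.List;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public abstract class FrameFetcher {
    // Fixed pool of worker threads downloading several frames in parallel.
    private final ExecutorService pool = Executors.newFixedThreadPool(4);
    // Frame index -> bitmap; the key preserves the order even if downloads finish out of order.
    private final ConcurrentHashMap<Integer, Bitmap> frames = new ConcurrentHashMap<>();

    public void fetchAll(List<String> urls) {
        for (int i = 0; i < urls.size(); i++) {
            final int index = i;
            final String url = urls.get(i);
            pool.submit(new Runnable() {
                @Override
                public void run() {
                    frames.put(index, downloadImage(url));
                }
            });
        }
    }

    // Playback side: returns frame n if it has arrived, or null if we still need to buffer.
    public Bitmap getFrame(int n) {
        return frames.get(n);
    }

    // Hypothetical helper; implement with your existing HTTP/bitmap code (HttpURLConnection, OkHttp, ...).
    // It must return a non-null Bitmap, since ConcurrentHashMap does not accept null values.
    protected abstract Bitmap downloadImage(String url);
}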

How to find same values in arraylist and sort them in sections?

I have an unsorted ArrayList with 70,000 string values. I want to collect identical values into separate lists.
i.e.
if a sample of the unsorted list looks like this:
Arraylist[0]->"NewYork"
Arraylist[1]->"DC"
....
....
....
Arraylist[401]->"NewYork"
Arraylist[402]->"Seoul"
Arraylist[403]->"DC"
If two or more identical values are found in the unsorted list, I want to add them to a separate list or a multimap that can store duplicate values, because I want to create a section per value. The result would look like this:
Section1:
Arraylist1.add("NewYork");
Arraylist1.add("NewYork");
Section2:
Arraylist2.add("DC");
Arraylist2.add("DC");
From my perspective, since the unsorted list can be random, creating multiple ArrayLists is a bad approach; instead I have used something like a multimap for the sections.
I don't need the code, because I have already implemented the scenario above (finding the strings and sorting them into sections), but my algorithm is very slow: it takes about 30 to 40 seconds. My question is what the fastest way to do this is, so it finishes in minimal time.
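For this kind of grouping, sorting is not actually needed: a single pass over the list that drops each value into a HashMap<String, List<String>> runs in roughly linear time, which for 70,000 strings should finish in a fraction of a second rather than tens of seconds. A minimal sketch (SectionGrouper and groupIntoSections are made-up names):

import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class SectionGrouper {
    // Single pass: each value is appended to the list stored under its own key,
    // so the 70,000 entries are grouped in O(n) with no sorting and no repeated scans.
    public static Map<String, List<String>> groupIntoSections(List<String> values) {
        Map<String, List<String>> sections = new HashMap<>();
        for (String value : values) {
            List<String> section = sections.get(value);
            if (section == null) {
                section = new ArrayList<>();
                sections.put(value, section);
            }
            section.add(value);
        }
        return sections;
    }
}

Values that occur only once end up in a one-element list, so sections with fewer than two entries can simply be skipped when building the UI.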

storing large data in android

I am pulling a large amount of JSON from a RESTful server. I use Google's GSON library to traverse this JSON and it works great. Now I want to save all of the JSON objects in my SQLite db, and I want to use a transaction to add all of them at once. This is difficult if you don't have all the objects ready in one data structure. Since I am traversing the JSON one object at a time, I would have to store everything in a data structure such as an ArrayList or HashMap and then afterwards use a database transaction to do the inserts quickly. However, storing a large amount of data (around 200,000 JSON objects) in an in-memory structure takes a lot of memory and will probably run out as well. What would be the best way to get all of those JSON objects into my SQLite db without using a lot of memory, in other words storing and inserting in a way that allows for a lot of recycling?
If you want to add a large amount of data all at once, it will take a lot of memory anyway. 200,000 large JSON objects take a certain amount of memory and you will not be able to change that.
You can keep this behavior, but I think it's not a great solution, because it creates a huge memory load on both the Android device and the server. It would be better to receive the data part by part and add it that way, but you need control over the server code.
If you are absolutely forced to keep this behavior, maybe you should receive all the data at once, parse it into a huge JSON object, then make multiple transactions. Check that every transaction executed correctly and put your database back into a good state if not. It's a really bad way to do it, IMHO... but I don't know all your constraints.
To finish: avoid receiving a large amount of data in a single request. It is better to make multiple requests for partial data sets. It makes your app less network-dependent: if you lose the network for 2 seconds, maybe only one request will fail, so you only have to retry that one request and receive a small part of the data again. With a single huge request, if you lose the network you have to retry the entire request.
I know this is not the best way to handle large JSON input on Android, but it certainly works great.
So my solution is pretty simple:
While parsing the JSON, use prepared statements and a db transaction to execute the inserts.
In short, as soon as a JSON object is parsed, take that info, insert it into the db using the prepared statement inside the transaction, then move on to the next object.
I have just pulled 210,000 rows with 23 fields each (that's about 4.8 million values) from a remote server via JSON and GSON, inserting all those values into my SQLite3 db on the device at the same time, in less than 5 minutes and without wasting any memory. Each iteration of parsing/inserting uses the same amount of memory and is cleaned up on the next iteration.
So yeah, there's the solution. Obviously this is not the best solution for commercial applications or tables with 1,000,000+ records, but it works great for my situation.
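A minimal sketch of that approach, assuming the payload is a JSON array and using a hypothetical items table with just two columns (the real table has 23): GSON's streaming JsonReader parses one object at a time, and a compiled SQLiteStatement inside a single transaction handles the inserts, so only the current object is ever held in memory.

import android.database.sqlite.SQLiteDatabase;
import android.database.sqlite.SQLiteStatement;
import com.google.gson.Gson;
import com.google.gson.stream.JsonReader;
import java.io.InputStream;
import java.io.InputStreamReader;

public class StreamingImporter {

    // Hypothetical row object with two fields; the real one has 23.
    static class Item {
        String name;
        long value;
    }

    public static void importJson(SQLiteDatabase db, InputStream in) throws Exception {
        Gson gson = new Gson();
        JsonReader reader = new JsonReader(new InputStreamReader(in, "UTF-8"));
        SQLiteStatement insert =
                db.compileStatement("INSERT INTO items (name, value) VALUES (?, ?)");
        db.beginTransaction();
        try {
            reader.beginArray();
            while (reader.hasNext()) {
                // Parse exactly one object, bind it, insert it, then let it be garbage collected.
                Item item = gson.fromJson(reader, Item.class);
                insert.bindString(1, item.name);
                insert.bindLong(2, item.value);
                insert.executeInsert();
                insert.clearBindings();
            }
            reader.endArray();
            db.setTransactionSuccessful();
        } finally {
            db.endTransaction();
            reader.close();
        }
    }
}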
Have you ever tried to add a lot of data (and I really mean a lot, in my case 2600 rows) into the Android-internal database (SQLite)?
If so, you probably went down the same road as I did.
I tried a normal insert statement, which was way too slow (10 seconds in my emulator). Then I tried prepared statements. The time was better, but still unacceptable (6 seconds). After some frustrating hours of writing code and then throwing it away, I finally found a good solution.
The Android OS provides InsertHelper as a fast way to do bulk inserts.
To give you an overview of the performance (measured with an emulator on a crap computer, inserting 2600 rows of data):
Insert statement: 10 seconds
Prepared statements: 6 seconds
InsertHelper: 320 ms
You can speed up the insertion even more by temporarily disabling thread locks. This gains about 30% more performance. However, it is important to make sure that only one thread at a time uses the database while inserting, because it is no longer thread-safe.
public void fillDatabase(HashMap<String, int[]> localData){
    // The InsertHelper needs the db instance plus the name of the table you want to add the data to
    InsertHelper ih = new InsertHelper(this.db, TABLE_NAME);
    Iterator<Entry<String, int[]>> it = localData.entrySet().iterator();
    final int firstExampleColumn = ih.getColumnIndex("firstExampleColumn");
    final int secondExampleColumn = ih.getColumnIndex("secondExampleColumn");
    final int thirdExampleColumn = ih.getColumnIndex("thirdExampleColumn");
    try {
        this.db.setLockingEnabled(false);
        while (it.hasNext()) {
            Entry<String, int[]> entry = it.next();
            int[] values = entry.getValue();
            ih.prepareForInsert();
            ih.bind(firstExampleColumn, entry.getKey());
            ih.bind(secondExampleColumn, values[0]);
            ih.bind(thirdExampleColumn, values[1]);
            ih.execute();
        }
    } catch (Exception e) {
        e.printStackTrace();
    } finally {
        if (ih != null) {
            ih.close();
        }
        this.db.setLockingEnabled(true);
    }
}

Android: Best technique to keep data, among Activities, consistent?

The user of my application will be able to change (add, edit, delete) the following data:
name
number1
number2
number3
All of those values are parallel. For example:
George 200 100 50
Andreas 450 205 190
John 800 230 180
The user will be able to change the first three (name, number1, number2) from one Activity, and the last one (number3) from another Activity.
Upon any change to this data, a service needs to know about the changes immediately.
The solutions I am aware of are:
Storing the arrays using SharedPreferences (solution), but this would be too time-consuming for my application, as I may have up to 200 data entries (so 200 x 4 values to store).
Keeping the data consistent by storing it in a database and, on every change, sending a broadcast to my service with the new data (putExtra). The service then updates its local arrays. The problem with this solution is that every time I want to change the data I also have to update my database, which is time-consuming.
Are there any other solutions?
Is any of the above a good solution for my problem?
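For reference, the broadcast half of the second approach above is only a few lines. The action string and extra keys below are made-up names; the service would register a BroadcastReceiver for the same action and read the extras there.

import android.content.Context;
import android.content.Intent;

public class DataChangeNotifier {
    // Hypothetical helper called from an Activity after the user edits one row.
    public static void notifyRowChanged(Context context, String name,
                                        int number1, int number2, int number3) {
        Intent intent = new Intent("com.example.ACTION_DATA_CHANGED");
        intent.putExtra("name", name);
        intent.putExtra("number1", number1);
        intent.putExtra("number2", number2);
        intent.putExtra("number3", number3);
        context.sendBroadcast(intent);
    }
}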

How to store xml data into sqlite database

I have XML-format data that came from the server. Now I want to store it in a database, and it should load on a button click. How should I do this?
<qst_code> 7 </qst_code>
<qst_prg_code> 1 </qst_prg_code>
<qst_mod_code> 2 </qst_mod_code>
<qst_Question>What is not true about left dominant cardiology circulation? </qst_Question>
<qst_opt1>It is seen in 20% of the population</qst_opt1>
<qst_opt2>Left circumflex artery supplies the Posterior descending artery</qst_opt2>
<qst_opt3>Left circumflex artery terminates as obtuse marginal branch</qst_opt3>
<qst_opt4>Left circumflex artery may originate from right coronary sinus</qst_opt4>
<qst_opt01>1</qst_opt01>
<qst_opt02>1</qst_opt02>
<qst_opt03>1</qst_opt03>
<qst_opt04>1</qst_opt04>
<qst_CorctOpt>1</qst_CorctOpt>
<qst_Marks>10</qst_Marks>
<qst_company_code>1</qst_company_code>
<user_code>1</user_code>
One option is to store it as a string if the data is not too large; otherwise, break it into a schema that maps to SQLite and recreate it while loading.
If your XML data is large, I would rather change the data-exchange format to JSON. XML parsing followed by inserts is a very expensive, time-consuming operation.
Some issues you will face with XML parsing and inserting:
a. XML parsing is memory-intensive, so your heap size will grow; you need to keep an eye on this, as it might cause a crash.
b. Inserts into the SQLite DB take around ~100 ms per tuple (row), so you can calculate the time it will take to pump in thousands of rows of data.
If your data is not too large, don't bother using SQLite.
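If you do break it into a schema that maps to SQLite (the first option above), a rough sketch of the parse-and-insert step could look like the following. It assumes the fragment above arrives wrapped in a single root element and that a questions table exists whose column names mirror the tags; both are assumptions, not part of the original question.

import android.content.ContentValues;
import android.database.sqlite.SQLiteDatabase;
import android.util.Xml;
import org.xmlpull.v1.XmlPullParser;
import java.io.StringReader;

public class QuestionImporter {
    // Parses the <qst_*> elements shown above and stores one row per question.
    public static void insertQuestion(SQLiteDatabase db, String xml) throws Exception {
        XmlPullParser parser = Xml.newPullParser();
        parser.setInput(new StringReader(xml));
        ContentValues row = new ContentValues();
        int event = parser.getEventType();
        while (event != XmlPullParser.END_DOCUMENT) {
            if (event == XmlPullParser.START_TAG) {
                String tag = parser.getName();
                if (tag.startsWith("qst_") || tag.equals("user_code")) {
                    // nextText() returns the text content of the current leaf element.
                    row.put(tag, parser.nextText().trim());
                }
            }
            event = parser.next();
        }
        db.insert("questions", null, row);
    }
}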
