Android: Best technique to keep data consistent among Activities?

The user of my application will be able to change (add, edit, delete) the following data:
name
number1
number2
number3
All of this data is parallel. For example:
George 200 100 50
Andreas 450 205 190
John 800 230 180
The user can change the first three values (name, number1, number2) from one Activity, and the last one (number3) from another Activity.
Whenever this data changes, a Service needs to know about the change immediately.
The solutions I am aware of are:
Storing the arrays using SharedPreferences (Solution), but this would be too time-consuming for my application, as I may have up to 200 entries (so 200x4 values to store).
Keeping the data consistent by storing it in a database and, on every change, sending a broadcast to my service with the new data (putExtra); the service then updates its local arrays. The problem with this solution is that every time I want to change the data I have to update my database too, which is time-consuming.
Are there any other solutions?
Is any of the above a good solution for my problem?
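Independent of the Android-specific delivery mechanism (broadcast, bound service, etc.), the "notify the service on every change" idea from the second solution can be sketched as a plain in-memory repository with listeners. All class and method names below are invented for the example; in a real app the repository could live in a singleton and the Service would register a listener on it.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Shared repository: Activities write through it, and every registered
// listener (e.g. the service's local copy) is updated immediately.
public class ScoreRepository {
    public interface Listener {
        void onChanged(String name, int[] numbers);
    }

    private final Map<String, int[]> data = new HashMap<>();
    private final List<Listener> listeners = new ArrayList<>();

    public void addListener(Listener l) {
        listeners.add(l);
    }

    // Add or edit an entry; all listeners see the change at once.
    public void put(String name, int n1, int n2, int n3) {
        int[] numbers = {n1, n2, n3};
        data.put(name, numbers);
        for (Listener l : listeners) {
            l.onChanged(name, numbers);
        }
    }

    public int[] get(String name) {
        return data.get(name);
    }
}
```

With this shape, neither Activity needs to know about the service; both just call put(), and persistence (database or SharedPreferences) can be added inside the repository later.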

Related

DBFlow - update multiple items efficiently

List<MyObject> list = ...; // get data from database
// do some changes to the values of the objects in the list
for (MyObject object : list) {
    object.save();
}
Instead of this for loop, is there a way to save multiple items that is more efficient than calling save() a bunch of times?
Finally found it:
FlowManager.getModelAdapter(MyTable.class).saveAll(myObjectsList);
and it's faster than calling save() multiple times. In some quick testing with 15 items, multiple save() calls took an average of 120 ms, while saveAll() took about 100 ms. Percentage-wise, that's quite a big difference.
Calling save() multiple times is OK. If there are a thousand entries, you can do it in an asynchronous transaction to avoid blocking the UI.
If you really want to save in a single call, create a new table (model) named ListObject, then use a one-to-many relationship to store it as a List&lt;Object&gt;. That way you only have to save once.
https://agrosner.gitbooks.io/dbflow/content/Relationships.html

Storing large data in Android

I am pulling a large amount of JSON from a RESTful server. I use Google's GSON library to traverse this JSON, and it works great. Now I want to save all of the JSON objects in my SQLite db, and I want to use a transaction to add all of them at once. This is difficult if you don't have all the objects ready in one data structure. Since I am traversing the JSON one object at a time, I guess I would have to store them in a data structure such as an ArrayList or HashMap, and afterwards use a database transaction to do the inserts fast.
However, storing a large amount of data (200,000 JSON objects) in an in-memory structure can take up a lot of memory, and it will probably run out as well. What would be the best way to get all of those JSON objects into my SQLite db while not using up a lot of memory? In other words, storing and inserting in a way that allows for a lot of recycling.
If you want to add a large amount of data at a single moment, it will take a lot of memory no matter what: 200,000 large JSON objects occupy a certain amount of memory, and you cannot change that.
You can keep this behavior, but I don't think it's a great solution, because it creates huge memory consumption on both the Android device and the server. It would be better to receive the data part by part and add it that way, but for this you need control over the server code.
If you are absolutely forced to keep this behavior, maybe you should receive all the data at once, parse it into a huge JSON object, then make multiple transactions, checking that every transaction executed correctly and putting the database back into a good state if one did not. It's a really bad way to do it, IMHO, but I don't know all your constraints.
To finish: avoid receiving a large amount of data in one go. It is better to make multiple requests for partial data sets. It also makes your app less network-dependent: if you lose the network for 2 seconds, maybe only one request fails, so you retry just that request and re-receive a small part of the data. With a single huge request, if you lose the network you have to retry the whole thing.
I know this is not the best implementation for handling large JSON input on Android, but it certainly works great.
So my solution is pretty simple:
While parsing the JSON, use prepared statements and a db transaction to execute the inserts.
In short: as soon as a JSON object is parsed, take its data, insert it into the db using the prepared statement inside the transaction, then move on to the next object.
I have just pulled 210,000 rows with 23 fields each (about 4.8 million values) from a remote server via JSON and GSON, inserting all of those values into my SQLite3 db on the device at the same time, in less than 5 minutes and without wasting any memory. Each iteration of parsing/inserting uses the same amount of memory and is cleaned up on the next iteration.
So yeah, there's the solution. Obviously it's not the best solution for commercial applications or tables with 1,000,000+ records, but it works great for my situation.
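The parse-one-insert-one loop described above can be sketched in plain Java, with the GSON reader and the SQLite prepared statement replaced by stand-ins (a list plays the role of the database table, and the class and method names are invented). The point of the pattern is that only the rows of the current transaction are ever buffered, regardless of the total record count.

```java
import java.util.ArrayList;
import java.util.List;

// Stand-in for "parse a record, bind it to a prepared statement, execute,
// move on": each record is handled as soon as it is parsed, and a batch is
// flushed to the "database" once it reaches batchSize rows.
public class StreamingInserter {
    private final List<String> committed = new ArrayList<>(); // stands in for the db table
    private final List<String> pending = new ArrayList<>();   // current transaction
    private final int batchSize;

    public StreamingInserter(int batchSize) {
        this.batchSize = batchSize;
    }

    // Called once per parsed record; never accumulates the whole data set.
    public void insert(String row) {
        pending.add(row); // in Android: bind + execute the SQLiteStatement
        if (pending.size() >= batchSize) {
            commit();
        }
    }

    // In Android: setTransactionSuccessful() + endTransaction().
    public void commit() {
        committed.addAll(pending);
        pending.clear();
    }

    public int rowCount() {
        return committed.size();
    }
}
```

On Android, insert() would bind values to a compiled SQLiteStatement and commit() would close one transaction and open the next; the memory profile stays flat because pending never exceeds batchSize.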
Have you ever tried to add a lot of data (and I really mean a lot, in my case 2,600 rows of data) to the internal Android database (SQLite)?
If so, you probably went down the same road as I did.
I tried a normal insert statement, which was way too slow (10 sec. in my emulator). Then I tried prepared statements. The time was better, but still unacceptable (6 sec.). After some frustrating hours of writing code and then throwing it away, I finally found a good solution:
The Android OS provides InsertHelper as a fast way to do bulk inserts.
To give you an overview of the performance (measured with an emulator on a crap computer, trying to insert 2600 rows of data):
Insert statement: 10 seconds
Prepared statements: 6 seconds
InsertHelper: 320 ms
You can speed up insertion even more by temporarily disabling thread locks, which gains about 30% more performance. However, it is important to make sure that only one thread at a time uses the database while inserting, since it is no longer thread-safe.
public void fillDatabase(HashMap<String, int[]> localData) {
    // The InsertHelper needs the db instance plus the name of the table to insert into
    InsertHelper ih = new InsertHelper(this.db, TABLE_NAME);
    Iterator<Entry<String, int[]>> it = localData.entrySet().iterator();
    final int firstExampleColumn = ih.getColumnIndex("firstExampleColumn");
    final int secondExampleColumn = ih.getColumnIndex("secondExampleColumn");
    final int thirdExampleColumn = ih.getColumnIndex("thirdExampleColumn");
    try {
        this.db.setLockingEnabled(false);
        while (it.hasNext()) {
            Entry<String, int[]> entry = it.next();
            int[] values = entry.getValue();
            ih.prepareForInsert();
            ih.bind(firstExampleColumn, entry.getKey());
            ih.bind(secondExampleColumn, values[0]);
            ih.bind(thirdExampleColumn, values[1]);
            ih.execute();
        }
    } catch (Exception e) {
        e.printStackTrace();
    } finally {
        if (ih != null) {
            ih.close();
        }
        this.db.setLockingEnabled(true);
    }
}

Number of Data in a row checking

I am building a Monopoly game and I'm almost done, but I want the game to end after 30 dice rolls. The way I want to do it is a bit odd: I need a way to store data and check whether it has reached 30 yet, that is, check the amount of data in a row. I've been looking at whether SQLite or SharedPreferences would do it, but can't find anything. Any idea would be welcome, and if you could help review my code too, I wouldn't mind. Thank you.
If you really want to persist it in the database, you want a key-value table current_game_progress with an item dice_rolls, and to update it on each dice roll:
UPDATE current_game_progress SET content = content + 1 WHERE keyname = "dice_rolls"
But that is a bit of overhead. Maybe you want to keep it in a local variable and only transfer it to the database on a "save" action.
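The "persist a counter in a key-value store" idea can be sketched in plain Java, with a java.util.Properties file standing in for SharedPreferences or the key-value table. The class name, file name, and the dice_rolls key are taken from the answer above or invented for the sketch.

```java
import java.io.File;
import java.io.FileReader;
import java.io.FileWriter;
import java.io.IOException;
import java.util.Properties;

// Increments a persisted dice-roll counter; the game ends at MAX_ROLLS.
public class DiceRollCounter {
    public static final int MAX_ROLLS = 30;
    private final File store;

    public DiceRollCounter(File store) {
        this.store = store;
    }

    // Load the stored count, add one, persist it, and return the new total.
    public int increment() {
        try {
            Properties p = new Properties();
            if (store.exists()) {
                try (FileReader r = new FileReader(store)) {
                    p.load(r);
                }
            }
            int rolls = Integer.parseInt(p.getProperty("dice_rolls", "0")) + 1;
            p.setProperty("dice_rolls", Integer.toString(rolls));
            try (FileWriter w = new FileWriter(store)) {
                p.store(w, null);
            }
            return rolls;
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }
}
```

In a real Android game, SharedPreferences.Editor would replace the Properties file, and the game-over check is simply increment() >= MAX_ROLLS.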

Regarding temp data storage in android

I am creating an Android project with 4 screens that contain seekbars. From each screen I will get 2 values, so there will be 8 values in total after the 4 screens, and I will store these 8 values in my database.
My problem is how to send the 8 values to the database all at once.
My idea was to store the values coming from each screen and then send them all to the database at once, but I could not get it to work.
Can anyone help with an easy way to do this?
My database looks like
name   sc1a  sc1b  sc2a  sc2b  sc3a  sc3b  sc4a  sc4b  timestamp (current)
stud1  10    30    40    50    60    70    80    90    2-8-2013 14:00:00
The values sc1a and sc1b come from screen 1, sc2a and sc2b from screen 2, and sc3a and sc3b from screen 3.
Finally, on screen 4 I will get the values sc4a and sc4b, and I then want to write all the values from the 4 screens (sc1a, sc1b, sc2a, sc2b, sc3a, sc3b, sc4a, sc4b) to the database table in one go, together with the current timestamp.
Store each screen's values in SharedPreferences. When you reach the last screen, read the stored values, build your query with all of them, and execute it, storing everything in the DB.
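The aggregation step can be sketched in plain Java: each screen contributes its pair of values, and the last screen builds one INSERT covering all columns. The column names follow the question's table; the class name, table name "scores", and method names are invented for the sketch, and in a real app the map would be backed by SharedPreferences.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Collects the two seekbar values from each screen, then builds a single
// INSERT statement on the final screen.
public class SurveySession {
    // LinkedHashMap keeps the columns in screen order (sc1a, sc1b, sc2a, ...).
    private final Map<String, Integer> values = new LinkedHashMap<>();

    public void putScreenValues(int screen, int a, int b) {
        values.put("sc" + screen + "a", a);
        values.put("sc" + screen + "b", b);
    }

    // Builds one INSERT over every stored column for the given student name.
    public String buildInsert(String name) {
        StringBuilder cols = new StringBuilder("name");
        StringBuilder vals = new StringBuilder("'" + name + "'");
        for (Map.Entry<String, Integer> e : values.entrySet()) {
            cols.append(", ").append(e.getKey());
            vals.append(", ").append(e.getValue());
        }
        return "INSERT INTO scores (" + cols + ") VALUES (" + vals + ")";
    }
}
```

On Android you would pass the same values to a ContentValues object and SQLiteDatabase.insert() rather than concatenating SQL, which also avoids injection issues; the string version here just makes the "all eight values in one statement" idea visible.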

How to store xml data into sqlite database

I have XML data that came from a server. Now I want to store it in a database, and it should be loaded on a button click. How should I do this?
<qst_code> 7 </qst_code>
<qst_prg_code> 1 </qst_prg_code>
<qst_mod_code> 2 </qst_mod_code>
<qst_Question>What is not true about left dominant cardiology circulation? </qst_Question>
<qst_opt1>It is seen in 20% of the population</qst_opt1>
<qst_opt2>Left circumflex artery supplies the Posterior descending artery</qst_opt2>
<qst_opt3>Left circumflex artery terminates as obtuse marginal branch</qst_opt3>
<qst_opt4>Left circumflex artery may originate from right coronary sinus</qst_opt4>
<qst_opt01>1</qst_opt01>
<qst_opt02>1</qst_opt02>
<qst_opt03>1</qst_opt03>
<qst_opt04>1</qst_opt04>
<qst_CorctOpt>1</qst_CorctOpt>
<qst_Marks>10</qst_Marks>
<qst_company_code>1</qst_company_code>
<user_code>1</user_code>
One option is to store it as a string if the data is not too large; otherwise, break it into a schema that maps to SQLite and recreate the XML while loading.
If your XML data is large, I would rather change the data-exchange format to JSON. XML parsing followed by inserts is a very expensive, time-consuming operation.
Some issues you will face with XML parsing and inserting:
a. XML parsing is memory-intensive, so your heap will grow; keep an eye on this, as it might cause a crash.
b. Inserts into a SQLite DB take roughly ~100 ms per tuple (row), so you can calculate the time required to pump in thousands of rows.
If your data is not too large, don't bother with SQLite.
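For the "break it into a schema" option, a first step is pulling each field out of the XML with the JDK's built-in DOM parser so the values can be bound to columns of a questions table. The snippet in the question has no root element, so one is wrapped around it here; the class name is invented for the sketch.

```java
import java.io.StringReader;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.xml.sax.InputSource;

// Extracts one field (e.g. qst_code, qst_Marks) from the server's XML
// fragment so it can be inserted into the corresponding table column.
public class QuestionXml {
    public static String field(String xml, String tag) {
        try {
            // Wrap the fragment in a root element so it parses as a document.
            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder()
                    .parse(new InputSource(new StringReader("<root>" + xml + "</root>")));
            return doc.getElementsByTagName(tag).item(0).getTextContent().trim();
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }
}
```

Each extracted value would then be bound to the matching column of the questions table; on Android the same DOM classes are available, though a streaming parser (XmlPullParser) is lighter on memory for large documents.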
