GreenDao - determine database size at runtime - android

I would like to know how to get the size of my database (which is of course a *.sqlite file) in bytes.
My current way of doing it (which isn't working) is:
new File(DataManager.getInstance().db.getPath()).length()
but I just get the same number every time, ~53,676, regardless of the database's content; I get this number even when it's empty.
Thank you.

OK, the solution is pretty simple: my original way of checking the database file was correct.
But I didn't take into account that greenDAO adds about 53 KB of overhead to the database. So an empty DB is roughly 53 KB, and after some insertions it grows from there.
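For reference, a minimal sketch of that check (SQLiteDatabase.getPath() and File.length() are the actual APIs involved; the DataManager wrapper from the question is omitted):

import java.io.File;
import android.database.sqlite.SQLiteDatabase;

// Minimal sketch: on-disk size of the SQLite file backing the greenDAO session.
// Remember the ~53 KB greenDAO baseline when interpreting the result.
public static long databaseSizeInBytes(SQLiteDatabase db) {
    return new File(db.getPath()).length();
}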

Related

Storing large data in Android

I am pulling a large amount of JSON from a RESTful server. I use Google's GSON library to traverse this JSON and it works great. Now I want to save all of the JSON objects in my SQLite db; however, I want to make use of a transaction to add all of them at once. This is difficult if you don't have all the objects ready in one data structure. Since I am traversing the JSON one object at a time, I guess I would have to store them in a data structure such as an ArrayList or HashMap and then afterwards use a database transaction to do the inserts quickly. However, storing a large amount of data (200,000 JSON objects) in a structure in memory can take up a lot of memory and will probably run out as well. What would be the best way to get all of those JSON objects into my SQLite db while not using up a lot of memory, in other words storing and inserting in a way that allows for a lot of recycling?
If you want to add a large amount of data at a single moment, it will take a lot of memory anyway. 200,000 large JSON objects take a certain amount of memory and you will not be able to change that.
You can keep this behavior, but I think it's not a great solution, because it creates huge memory consumption on both the Android device and the server. It would be better to receive the data part by part and add it that way, but you need control over the server code.
If you are absolutely forced to keep this behavior, maybe you should receive all the data at once, parse it into a huge JSON object, then make multiple transactions. Check that every transaction executed correctly and put your database back into a good state if not. It's a really bad way to do it, IMHO... but I don't know all your constraints.
To finish: avoid receiving a large amount of data in one go. It is better to make multiple requests for partial data sets. It makes your app less network-dependent: if you lose the network for 2 seconds, maybe only one request will fail, so you only have to retry that one request and receive a small part of the data again. With one huge request, if you lose the network you have to retry the entire request...
I know this is not the best implementation of handling large JSON input in Android, but it certainly works great.
So my solution is pretty simple:
While parsing the JSON, use a prepared statement and a db transaction to execute the inserts.
In short: as soon as a JSON object is parsed, take that info, insert it into the db using the prepared statement inside the transaction, then move on to the next object.
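A minimal sketch of what that loop could look like, using GSON's streaming JsonReader together with a compiled SQLiteStatement (the "items" table and its "name"/"score" fields are made up for illustration):

import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import android.database.sqlite.SQLiteDatabase;
import android.database.sqlite.SQLiteStatement;
import com.google.gson.stream.JsonReader;

public void streamJsonIntoDb(SQLiteDatabase db, InputStream in) throws IOException {
    JsonReader reader = new JsonReader(new InputStreamReader(in, "UTF-8"));
    // Compile the insert once and rebind it for every row.
    SQLiteStatement insert =
            db.compileStatement("INSERT INTO items (name, score) VALUES (?, ?)");
    db.beginTransaction();
    try {
        reader.beginArray();
        while (reader.hasNext()) {           // one object at a time: constant memory
            reader.beginObject();
            String name = "";
            long score = 0;
            while (reader.hasNext()) {
                String field = reader.nextName();
                if (field.equals("name")) name = reader.nextString();
                else if (field.equals("score")) score = reader.nextLong();
                else reader.skipValue();     // ignore fields we don't map
            }
            reader.endObject();
            insert.bindString(1, name);
            insert.bindLong(2, score);
            insert.executeInsert();
        }
        reader.endArray();
        db.setTransactionSuccessful();       // commit all rows in one transaction
    } finally {
        db.endTransaction();
        reader.close();
    }
}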
I have just pulled 210,000 rows with 23 fields each (that's about 4.8 million values) from a remote server via JSON and GSON while inserting all those values into my SQLite3 db on the device, in less than 5 minutes and without wasting any memory. Each iteration of parsing/inserting uses the same amount of memory, which is reclaimed on the next iteration.
So yeah, there's the solution. Obviously this is not the best solution for commercial applications or tables with 1,000,000+ records, but it works great for my situation.
Have you ever tried to add a lot of data (and I really mean a lot, in my case 2,600 rows) into the Android-internal database (SQLite)?
If so, you probably went down the same road as I did.
I tried a normal insert statement, which was way too slow (10 sec. in my emulator). Then I tried prepared statements. The time was better but still unacceptable (6 sec.). After some frustrating hours of writing code and then throwing it away, I finally found a good solution.
The Android OS provides the InsertHelper (android.database.DatabaseUtils.InsertHelper) as a fast way to do bulk inserts.
To give you an overview of the performance (measured with an emulator on a crap computer, trying to insert 2,600 rows of data):

Insert statement: 10 seconds
Prepared statements: 6 seconds
InsertHelper: 320 ms
You can speed up the insertion even more by temporarily disabling thread locks. This gains about 30% more performance. However, it's important to make sure that only one thread at a time uses the database while inserting, because the database is no longer thread-safe.
public void fillDatabase(HashMap<String, int[]> localData) {
    // The InsertHelper needs the db instance plus the name of the table
    // where you want to add the data
    InsertHelper ih = new InsertHelper(this.db, TABLE_NAME);
    Iterator<Entry<String, int[]>> it = localData.entrySet().iterator();

    // Resolve the column indices once, outside the loop
    final int firstExampleColumn = ih.getColumnIndex("firstExampleColumn");
    final int secondExampleColumn = ih.getColumnIndex("secondExampleColumn");
    final int thirdExampleColumn = ih.getColumnIndex("thirdExampleColumn");

    try {
        this.db.setLockingEnabled(false); // disable thread locks for extra speed
        while (it.hasNext()) {
            Entry<String, int[]> entry = it.next();
            int[] values = entry.getValue();
            ih.prepareForInsert();
            ih.bind(firstExampleColumn, entry.getKey());
            ih.bind(secondExampleColumn, values[0]);
            ih.bind(thirdExampleColumn, values[1]);
            ih.execute();
        }
    } catch (Exception e) {
        e.printStackTrace();
    } finally {
        if (ih != null) {
            ih.close();
        }
        this.db.setLockingEnabled(true); // re-enable locking afterwards
    }
}

Checking the number of rows of data

I am building a Monopoly game and I'm almost done, but I want the game to end after 30 dice rolls. The way I want to do it is a bit odd: I need a way to store data and check whether the count has reached 30 yet, i.e. check the number of rows of data. I've been looking at whether SQLite or shared preferences would do, but can't find anything. Any idea would be welcomed, and if you can help review my code too, I wouldn't mind. Thank you.
If you really want to persist it in the database, you want a key-value table current_game_progress with an item dice_rolls, and update it on each dice roll:
UPDATE current_game_progress SET content = content + 1 WHERE keyname = "dice_rolls"
But that's a bit of overhead. Maybe you want to keep it in a local variable and transfer it to the database on a "save" action.
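A minimal sketch of that counter, assuming the table and its single row are created once when a new game starts:

import android.database.Cursor;
import android.database.sqlite.SQLiteDatabase;

// Assumes a one-row table created at game start:
//   CREATE TABLE current_game_progress (keyname TEXT PRIMARY KEY, content INTEGER);
//   INSERT INTO current_game_progress VALUES ('dice_rolls', 0);
public boolean rollAndCheckGameOver(SQLiteDatabase db) {
    db.execSQL("UPDATE current_game_progress SET content = content + 1 "
            + "WHERE keyname = 'dice_rolls'");
    Cursor c = db.rawQuery(
            "SELECT content FROM current_game_progress WHERE keyname = 'dice_rolls'", null);
    try {
        return c.moveToFirst() && c.getInt(0) >= 30;  // game over after 30 rolls
    } finally {
        c.close();
    }
}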

I am working with static data, does my application need to create a database?

I am working on a greeting card application in which everything is static and the user just has to select a greeting.
Suggest how I should work with this; there are 3 options I have:
1) put a text file in the assets folder containing all the data
2) database
3) string-array
The database would include:
id
name
image
title
greeting
textstyle
textsize
frames
-- but this will take too much memory allocation, because there are 20 greetings.
Any kind of suggestions are valuable to me.
Don't try to insert images as BLOBs into the database, because that requires too much memory; keep the image path/name instead, retrieve it from the database, and load the image from there.
Keep the greeting images in the application's data directory.
This is the suggestion for avoiding an OutOfMemory exception: load them one by one.
And before loading an image into memory, always keep one thing in mind: the image should not be too large. If it is large, compress (downsample) it and then load it.
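A minimal sketch of such path-based loading with downsampling, using the standard two-pass BitmapFactory.Options pattern (the target width/height are whatever your layout needs):

import android.graphics.Bitmap;
import android.graphics.BitmapFactory;

// Decode the image dimensions first, then reload at a reduced sample size so a
// large greeting image never has to fit into memory at full resolution.
public Bitmap loadScaledBitmap(String path, int reqWidth, int reqHeight) {
    BitmapFactory.Options opts = new BitmapFactory.Options();
    opts.inJustDecodeBounds = true;            // first pass: dimensions only
    BitmapFactory.decodeFile(path, opts);
    int sample = 1;
    while (opts.outWidth / (sample * 2) >= reqWidth
            && opts.outHeight / (sample * 2) >= reqHeight) {
        sample *= 2;                           // halve until it fits the target
    }
    opts.inSampleSize = sample;
    opts.inJustDecodeBounds = false;           // second pass: actual pixels
    return BitmapFactory.decodeFile(path, opts);
}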

SQLite3 table with a limited maximum number of rows (by choice): efficiency

I am storing data on an interval basis (GPS location) and I don't want the DB to swell up, so I defined a MAX number of rows it can grow to; then I simply delete the oldest row every time I insert a new one.
Now a database expert looked at my code and says it's quite inefficient, because deleting a row from the database is the most time/memory/processing-consuming action and I should avoid it at all costs.
He says I should instead overwrite (update) the oldest row after I reach MAX
(so it cycles top to bottom every time).
That means I need a separate "header" table to save the current pointer to the oldest row and update it on every insert (I don't want to lose it if the app crashes).
Is that really more efficient? Are there other ways to do this more efficiently?
Turning your database table into a ring buffer is silly.
If you really want to use this approach...
Don't use a database; just use a data file and I/O.
In your data file each record will be a fixed size:
[Time Stamp][Latitude][Longitude]
You can use string-formatted or binary representations of your data; it doesn't matter as long as they are fixed size.
----------gps.dat----------
[Ring Pointer]
[Time Stamp][Latitude][Longitude]
[Time Stamp][Latitude][Longitude]
...
[Time Stamp][Latitude][Longitude]
Ring Pointer is the binary representation of a long integer.
When you first create the file, you'll set its size to LONG_INTEGER_SIZE + (MAX_RECORDS * RECORD_SIZE).
When you want to add a record:
1) Read [Ring Pointer] from the beginning of the file
2) Write [Ring Pointer] + 1 back to the beginning of the file (so people don't get confused: keep the [Ring Pointer] variable the same, just write the new value back to the file)
3) Seek to location LONG_INTEGER_SIZE + (([Ring Pointer] % MAX_RECORDS) * RECORD_SIZE)
4) Write your new record at that location
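A minimal sketch of those steps in Java with RandomAccessFile (the record layout and the MAX_RECORDS value are illustrative assumptions):

import java.io.File;
import java.io.IOException;
import java.io.RandomAccessFile;

static final int LONG_SIZE = 8;
static final int RECORD_SIZE = 8 + 8 + 8;   // long timestamp + double lat + double lon
static final long MAX_RECORDS = 10000;

public void appendFix(File f, long timestamp, double lat, double lon) throws IOException {
    RandomAccessFile raf = new RandomAccessFile(f, "rw");
    try {
        if (raf.length() == 0) {
            raf.setLength(LONG_SIZE + MAX_RECORDS * RECORD_SIZE); // pre-size once
        }
        raf.seek(0);
        long ringPointer = raf.readLong();  // read the slot counter
        raf.seek(0);
        raf.writeLong(ringPointer + 1);     // bump it for the next insert
        // Overwrite the oldest slot instead of deleting any rows.
        raf.seek(LONG_SIZE + (ringPointer % MAX_RECORDS) * RECORD_SIZE);
        raf.writeLong(timestamp);
        raf.writeDouble(lat);
        raf.writeDouble(lon);
    } finally {
        raf.close();
    }
}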

How to store XML data in an SQLite database

I have XML data which came from the server. Now I want to store it in the database, and it should load on a button click. How should I do this? The data looks like this:
<qst_code> 7 </qst_code>
<qst_prg_code> 1 </qst_prg_code>
<qst_mod_code> 2 </qst_mod_code>
<qst_Question>What is not true about left dominant cardiology circulation? </qst_Question>
<qst_opt1>It is seen in 20% of the population</qst_opt1>
<qst_opt2>Left circumflex artery supplies the Posterior descending artery</qst_opt2>
<qst_opt3>Left circumflex artery terminates as obtuse marginal branch</qst_opt3>
<qst_opt4>Left circumflex artery may originate from right coronary sinus</qst_opt4>
<qst_opt01>1</qst_opt01>
<qst_opt02>1</qst_opt02>
<qst_opt03>1</qst_opt03>
<qst_opt04>1</qst_opt04>
<qst_CorctOpt>1</qst_CorctOpt>
<qst_Marks>10</qst_Marks>
<qst_company_code>1</qst_company_code>
<user_code>1</user_code>
One option is to store it as a string if the data is not too large; otherwise, break it into a schema that maps to SQLite and recreate it while loading.
If your XML data is large, I would rather change the data exchange format to JSON. XML parsing followed by inserts is a very expensive, time-consuming operation.
Some issues you will face with XML parsing and inserting:
a. XML parsing is memory-intensive, so your heap size will grow; you need to keep an eye on this, as it might cause a crash.
b. Inserts into an SQLite DB take around ~100 ms per tuple (row), so you can calculate the time required to pump in thousands of rows of data.
If your data is not too large, don't bother using SQLite.
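If you do go the schema route, a minimal sketch of a mitigation for point b: parse with XmlPullParser and batch all inserts into a single transaction with one compiled statement. This assumes each question is wrapped in a <question> element containing all its tags, and that a matching "questions" table exists; only three of the columns are shown.

import java.io.InputStream;
import android.database.sqlite.SQLiteDatabase;
import android.database.sqlite.SQLiteStatement;
import android.util.Xml;
import org.xmlpull.v1.XmlPullParser;

public void storeQuestions(SQLiteDatabase db, InputStream xml) throws Exception {
    XmlPullParser parser = Xml.newPullParser();
    parser.setInput(xml, "UTF-8");
    SQLiteStatement insert = db.compileStatement(
            "INSERT INTO questions (qst_code, qst_Question, qst_CorctOpt) VALUES (?, ?, ?)");
    db.beginTransaction();
    try {
        String code = "", question = "", correct = "";
        int event = parser.getEventType();
        while (event != XmlPullParser.END_DOCUMENT) {
            if (event == XmlPullParser.START_TAG) {
                String tag = parser.getName();
                if (tag.equals("qst_code")) code = parser.nextText().trim();
                else if (tag.equals("qst_Question")) question = parser.nextText().trim();
                else if (tag.equals("qst_CorctOpt")) correct = parser.nextText().trim();
            } else if (event == XmlPullParser.END_TAG
                    && parser.getName().equals("question")) {
                insert.bindString(1, code);     // one row per </question>
                insert.bindString(2, question);
                insert.bindString(3, correct);
                insert.executeInsert();
            }
            event = parser.next();
        }
        db.setTransactionSuccessful();          // one commit for all rows
    } finally {
        db.endTransaction();
    }
}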
