sqlite3 table with limited max lines (by choice) efficiency - android

I am storing data (GPS locations) at a regular interval, and I don't want the DB to swell up, so I defined a MAX number of rows it can grow to; once it hits that limit, I simply delete the oldest row every time I insert a new one.
Now a database expert looked at my code and says it is very inefficient, because deleting a row is one of the most expensive operations (time, memory, and processing) and I should avoid it at all costs.
He says I should instead overwrite (update) the oldest row once I reach MAX, so the writes cycle from top to bottom every time.
That means I need a separate "header" table to hold my current pointer to the oldest row and update it on every insert (I don't want to lose the pointer if the app crashes).
Is that really more efficient? Are there other ways to do this more efficiently?
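Roughly, what I do now looks something like this (table and column names are just placeholders for illustration):

import android.content.ContentValues;
import android.database.sqlite.SQLiteDatabase;

class LocationLog {
    private static final int MAX_ROWS = 1000; // placeholder limit

    static void insertLocation(SQLiteDatabase db, long time, double lat, double lon) {
        ContentValues values = new ContentValues();
        values.put("timestamp", time);
        values.put("latitude", lat);
        values.put("longitude", lon);
        db.insert("locations", null, values);

        // Trim: keep only the newest MAX_ROWS rows, deleting the oldest ones.
        db.execSQL("DELETE FROM locations WHERE _id NOT IN "
                + "(SELECT _id FROM locations ORDER BY _id DESC LIMIT " + MAX_ROWS + ")");
    }
}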

Turning your database table into a ring buffer is silly.
If you really want to use this approach...
Don't use a database; just use a data file and I/O.
In your data file, each record will be a fixed size:
[Time Stamp][Latitude][Longitude]
You can use string-formatted or binary representations of your data; it doesn't matter, as long as they are a fixed size.
----------gps.dat----------
[Ring Pointer]
[Time Stamp][Latitude][Longitude]
[Time Stamp][Latitude][Longitude]
...
[Time Stamp][Latitude][Longitude]
[Ring Pointer] is the binary representation of a long integer.
When you first create the file, you'll set its size to LONG_INTEGER_SIZE + (MAX_RECORDS * RECORD_SIZE).
When you want to add a record:
Read [Ring Pointer] from the beginning of the file
Write [Ring Pointer] + 1 back to the beginning of the file (to avoid confusion, leave the [Ring Pointer] variable unchanged and just write the incremented value back to the file)
Go to location LONG_INTEGER_SIZE + (([Ring Pointer] % MAX_RECORDS) * RECORD_SIZE)
Write your new record at that location
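A minimal sketch of that scheme using RandomAccessFile (the class name, capacity, and record layout are illustrative: here a record is three 8-byte values, so RECORD_SIZE is 24, and the ring pointer is an 8-byte long at offset 0):

import java.io.IOException;
import java.io.RandomAccessFile;

class GpsRingFile {
    private static final int MAX_RECORDS = 10000;      // illustrative capacity
    private static final int RECORD_SIZE = 8 + 8 + 8;  // timestamp + latitude + longitude
    private static final int POINTER_SIZE = 8;         // the long [Ring Pointer] at offset 0

    static void createIfNeeded(RandomAccessFile file) throws IOException {
        long expected = POINTER_SIZE + (long) MAX_RECORDS * RECORD_SIZE;
        if (file.length() < expected) {
            file.setLength(expected); // new bytes are zero, so the pointer starts at 0
        }
    }

    static void addRecord(RandomAccessFile file, long timestamp, double lat, double lon)
            throws IOException {
        // Read [Ring Pointer] from the beginning of the file.
        file.seek(0);
        long ringPointer = file.readLong();

        // Write [Ring Pointer] + 1 back to the beginning of the file.
        file.seek(0);
        file.writeLong(ringPointer + 1);

        // Go to LONG_INTEGER_SIZE + (([Ring Pointer] % MAX_RECORDS) * RECORD_SIZE)
        // and overwrite whatever record is there.
        file.seek(POINTER_SIZE + (ringPointer % MAX_RECORDS) * RECORD_SIZE);
        file.writeLong(timestamp);
        file.writeDouble(lat);
        file.writeDouble(lon);
    }
}

Open the file with something like new RandomAccessFile(new File(context.getFilesDir(), "gps.dat"), "rw"), call createIfNeeded once, then call addRecord for every fix.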

Related

Maintain 3 days log - Android

I have a location service in my app. For debugging purposes, I want to log (to a text file) some events like "new coordinate", "service onDestroy", "service onStartCommand", "coordinate sent to backend", and so on.
But I'm facing a problem: the log file gains 350+ new lines each day, so in 3 days I have a file with over 1,000 lines.
My idea is to keep only the last 3 (or, in general, N) days and delete the content that was written 3+ days ago.
But:
I don't want to check on every write whether there are old lines to be removed
I don't want to set an alarm that fires every 3 days to erase the old data.
Can you please tell me if you know another efficient way to handle this situation?
Without a way to index into the text file, there's really no way to solve it without reading each line of the file (up to a certain point), parsing it and finding the date.
Don't keep a file. Keep a database. Have one of the columns be "created time". Then you can easily delete the rows with a created time older than some threshold.
getContentResolver().delete(
        yourUri,
        "created_time < ?",
        new String[] { Long.toString(System.currentTimeMillis() - TimeUnit.DAYS.toMillis(3)) }
);
(Or something, please test your own SQL ...)
As a side note, a 1,000 line text file is nothing. Reading and parsing it is not significant unless you are doing it often. If you read and trimmed the file 1x per day, no problem.
Another solution would be to rotate the files (log --> log.1, log.1 --> log.2, ..., log.n-1 --> log.n). This of course doesn't set a hard limit on the size of any particular file.
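If you go the rotation route, a rough sketch (the base name and rotation count are illustrative):

import java.io.File;

class LogRotator {
    // Shift log -> log.1, log.1 -> log.2, ..., dropping the oldest file.
    static void rotate(File dir, String baseName, int maxFiles) {
        File oldest = new File(dir, baseName + "." + maxFiles);
        if (oldest.exists()) {
            oldest.delete();
        }
        for (int i = maxFiles - 1; i >= 1; i--) {
            File from = new File(dir, baseName + "." + i);
            if (from.exists()) {
                from.renameTo(new File(dir, baseName + "." + (i + 1)));
            }
        }
        File current = new File(dir, baseName);
        if (current.exists()) {
            current.renameTo(new File(dir, baseName + ".1"));
        }
    }
}

Call it once a day (or whenever the current file gets large) before opening a fresh log file.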

Q: SQLite calculation on android

I am currently working on an Android application that tracks the income/outcome of a budget. One of my database tables, named Transaction, consists of:
Amount
Type (income/outcome)
Date
Since this is my first application, I'm not really confident with my code, especially the database part. Now I'm making a trigger that calculates the percentage of outcome in a month. Can you check whether my code is right, and whether it is efficient? Here is a tiny part of my trigger code.
In my code I want to calculate the percentage of total outcome from the first of the month until the current date.
"CREATE TRIGGER Calc"+
"AFTER INSERT"+
"ON" +transaction+
"FOR EACH ROW" +
"WHEN (SELET * FROM amount.transaction
WHERE (strftime ('%d','now') - strftime('%d','start of month'))
HAVING SUM(amount.transaction WHERE type.transaction IS "outcome") /SUM(amount.transaction WHERE type.transaction IS "income") * 0.01 )"
Is the SELECT * FROM amount.transaction even needed? Could the code be much simpler?
(The 0.01 there is meant to represent 100%.)
Sorry it's so messed up; I'm just starting out with Android and I'm not very good with databases. If you have any suggestions, please tell me.
Thanks in advance.
I agree with the commenters: I don't think you need a trigger here instead of an on-the-fly SELECT statement.
You do need to know that the quotation marks (") are literal, and that when concatenating strings you have to add the spaces yourself ("CREATE TRIGGER Calc" + "AFTER INSERT" produces CREATE TRIGGER CalcAFTER INSERT).
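For example, instead of the trigger, a query along these lines computes the outcome-to-income ratio for the current month on demand. It is only a sketch: the table and column names come from the question, and it assumes the Date column is stored in a format SQLite's date() can compare (e.g. YYYY-MM-DD).

import android.database.Cursor;
import android.database.sqlite.SQLiteDatabase;

class BudgetStats {
    // Outcome as a fraction of income for the current month, or 0 if there is no income yet.
    static double outcomeRatioThisMonth(SQLiteDatabase db) {
        String sql =
                "SELECT "
              + "  SUM(CASE WHEN type = 'outcome' THEN amount ELSE 0 END) * 1.0 / "
              + "  NULLIF(SUM(CASE WHEN type = 'income' THEN amount ELSE 0 END), 0) "
              + "FROM \"transaction\" "
              + "WHERE date >= date('now', 'start of month')";
        Cursor c = db.rawQuery(sql, null);
        try {
            return (c.moveToFirst() && !c.isNull(0)) ? c.getDouble(0) : 0;
        } finally {
            c.close();
        }
    }
}

Note the explicit trailing space inside each string piece, which is exactly the concatenation issue mentioned above; TRANSACTION is also a SQL keyword, hence the quotes around the table name.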

GreenDao - determine database size at runtime

I would like to know how to get the size of my database (which is of course a *.sqlite file) in bytes.
My current way of doing it (which isn't working) is:
new File(DataManager.getInstance().db.getPath()).length()
but I'm getting the same number (~53,676) every time, regardless of the database's content; I get this number even when the database is empty.
Thank you.
OK, the solution is pretty simple: my original way of checking the database file was actually fine.
I just didn't take into account that greenDAO adds about another 53 KB to the database. So an empty DB is around 53 KB, and after some insertions it gets bigger and bigger.
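In code the check stays the same; just keep the empty-database baseline in mind (the ~53 KB figure is what I observed, not a documented constant):

import java.io.File;

class DbSize {
    // Approximate size of an empty greenDAO-created database: observed, not guaranteed.
    private static final long EMPTY_DB_BYTES = 53676;

    static long totalBytes(String dbPath) {
        return new File(dbPath).length();
    }

    static long approxDataBytes(String dbPath) {
        return Math.max(0, totalBytes(dbPath) - EMPTY_DB_BYTES);
    }
}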

How to store xml data into sqlite database

I have XML data which came from the server. Now I want to store it in a database, and it should be loaded on a button click. How should I do this?
<qst_code> 7 </qst_code>
<qst_prg_code> 1 </qst_prg_code>
<qst_mod_code> 2 </qst_mod_code>
<qst_Question>What is not true about left dominant cardiology circulation? </qst_Question>
<qst_opt1>It is seen in 20% of the population</qst_opt1>
<qst_opt2>Left circumflex artery supplies the Posterior descending artery</qst_opt2>
<qst_opt3>Left circumflex artery terminates as obtuse marginal branch</qst_opt3>
<qst_opt4>Left circumflex artery may originate from right coronary sinus</qst_opt4>
<qst_opt01>1</qst_opt01>
<qst_opt02>1</qst_opt02>
<qst_opt03>1</qst_opt03>
<qst_opt04>1</qst_opt04>
<qst_CorctOpt>1</qst_CorctOpt>
<qst_Marks>10</qst_Marks>
<qst_company_code>1</qst_company_code>
<user_code>1</user_code>
One option is to store it as a string if the data is not too large, else break it into a schema that maps to sqlite and recreate it while loading.
If your XML data is large, I would rather change the data exchange format to JSON. Parsing XML and then inserting it is an expensive, time-consuming operation.
Some issues you will face with XML parsing and inserting:
a. XML parsing is memory intensive, so your heap usage will grow; keep an eye on this, as it might cause a crash.
b. Inserts into a SQLite DB take around ~100 ms per tuple (row), so you can calculate the time required to pump in thousands of rows of data.
If your data is not too large, don't bother with SQLite.
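If you do go the SQLite route, here is a rough sketch of mapping the fields above to one table row. It assumes the fragment is wrapped in a single root element (e.g. <question>...</question>) and that there is a questions table whose column names match the tags; both are assumptions, not part of the original data.

import java.io.IOException;
import java.io.StringReader;

import org.xmlpull.v1.XmlPullParser;
import org.xmlpull.v1.XmlPullParserException;

import android.content.ContentValues;
import android.database.sqlite.SQLiteDatabase;
import android.util.Xml;

class QuestionImporter {
    // Parses one question element and inserts its child tag values as a single row.
    static void insertQuestion(SQLiteDatabase db, String xml)
            throws XmlPullParserException, IOException {
        XmlPullParser parser = Xml.newPullParser();
        parser.setInput(new StringReader(xml));

        ContentValues row = new ContentValues();
        String currentTag = null;
        int event = parser.getEventType();
        while (event != XmlPullParser.END_DOCUMENT) {
            if (event == XmlPullParser.START_TAG) {
                currentTag = parser.getName();
            } else if (event == XmlPullParser.TEXT && currentTag != null) {
                String text = parser.getText().trim();
                if (!text.isEmpty()) {
                    row.put(currentTag, text); // column name = tag name, e.g. qst_Question
                }
            } else if (event == XmlPullParser.END_TAG) {
                currentTag = null;
            }
            event = parser.next();
        }

        db.insertOrThrow("questions", null, row);
    }
}

If you import many questions at once, doing the inserts inside one transaction (db.beginTransaction() / db.setTransactionSuccessful() / db.endTransaction()) avoids paying the per-row commit cost mentioned in point b.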

Android Loading Strings into Array

I have a list of 1000 words. I need to load an array with n randomly chosen words from that list (no repeats allowed). What is the best way of going about doing that?
My ideas:
1) Load the words from an R.array resource into a String array, use Collections.shuffle to shuffle it, then pull the first n entries. Right now I am having memory issues loading the initial array of 1,000 words with this method.
2) Put the words in a text file, read each word into a String array, and use the same shuffle approach to get the first n entries.
3) Hard-code the words into a String array (I'd use a script to generate that output, of course), and use the same approach to get the first n entries.
Is there a better way?
If you're mainly worried about memory usage and you're willing to give up computation speed, here's an algorithm that will get you there.
Keep your words in a text file, one word per line, with a fixed number of characters per word, padding each word with trailing spaces so every record has a fixed size in characters; call that size s (if each line ends with a newline character, include it in s).
1) Create an array of maximum size n, call it w.
2) Open a stream reader to the file containing the 1000 words.
3) Get a random number between 0 and 999, call it k.
4) Seek to position k*s in the file stream and read the next s characters.
5) Add the word to w if it is not already in the array.
6) If the w array is full (i.e. size = n), you're done; otherwise go back to step 3.
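A sketch of that algorithm, assuming the padded word list is a plain file on internal storage; WORD_CHARS and the file layout are illustrative:

import java.io.IOException;
import java.io.RandomAccessFile;
import java.util.LinkedHashSet;
import java.util.Random;
import java.util.Set;

class RandomWordPicker {
    private static final int WORD_CHARS = 16;              // fixed padded word length (illustrative)
    private static final int RECORD_SIZE = WORD_CHARS + 1; // +1 for the trailing '\n', i.e. s
    private static final int TOTAL_WORDS = 1000;

    // Picks n distinct words; n must be <= TOTAL_WORDS or this will loop forever.
    static String[] pick(RandomAccessFile file, int n) throws IOException {
        Set<String> chosen = new LinkedHashSet<>();
        Random random = new Random();
        byte[] buffer = new byte[WORD_CHARS];

        while (chosen.size() < n) {
            int k = random.nextInt(TOTAL_WORDS);      // 0..999
            file.seek((long) k * RECORD_SIZE);        // jump straight to record k
            file.readFully(buffer);
            String word = new String(buffer, "US-ASCII").trim(); // drop the padding
            chosen.add(word);                         // the Set silently ignores repeats
        }
        return chosen.toArray(new String[0]);
    }
}

Only n words are ever held in memory, at the cost of re-reading whenever a duplicate pick occurs.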
Let us know how it goes. Happy coding!
