When should I use Android SharedPreferences and when SQLite? [duplicate]

For persistent storage of data, is there any distinct advantage of using an SQLite database over SharedPreferences, or vice versa? Currently my application data is only a couple of kilobytes in size, though it could conceivably grow to ten times that size in the future. I can't find anywhere that states how much storage is available using SharedPreferences, but I would imagine this would be one limitation of using it? Is there any difference in speed between the two methods? I'm looking to weigh up the pros and cons of these two storage methods.

Off the top of my head:
SharedPreferences:
Pro:
Lightweight
Quick and easy to use
Easy to debug
Config file can be edited by hand if need be
Con:
Slow when dealing with lots of data
Not helpful when the data is more than a simple key/value affair
Entire file needs to be read and parsed to access data
Takes up more space: each entry has a considerable amount of ASCII data around it, and the data itself is ASCII too.
SQLite:
Pro:
Scales nicely
Changes don't require rewriting the entire data file from scratch
Powerful queries
Con:
More code to write
More heavyweight (code and memory), overkill when dealing with a little bit of data
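To make the lists above concrete, here is a minimal sketch of both APIs. The preference file name, keys and the "notes" table are invented for illustration:

import android.content.ContentValues;
import android.content.Context;
import android.content.SharedPreferences;
import android.database.sqlite.SQLiteDatabase;
import android.database.sqlite.SQLiteOpenHelper;

public class StorageSketch {

    // SharedPreferences: a handful of key/value settings, persisted to one XML file.
    static void saveScore(Context context, int score) {
        SharedPreferences prefs =
                context.getSharedPreferences("settings", Context.MODE_PRIVATE);
        prefs.edit().putInt("high_score", score).apply(); // rewrites the whole file
    }

    static int loadScore(Context context) {
        return context.getSharedPreferences("settings", Context.MODE_PRIVATE)
                .getInt("high_score", 0); // default when the key is absent
    }

    // SQLite: structured rows you can query, index and update individually.
    static class NotesDb extends SQLiteOpenHelper {
        NotesDb(Context context) { super(context, "notes.db", null, 1); }

        @Override public void onCreate(SQLiteDatabase db) {
            db.execSQL("CREATE TABLE notes (_id INTEGER PRIMARY KEY, title TEXT, body TEXT)");
        }

        @Override public void onUpgrade(SQLiteDatabase db, int oldVersion, int newVersion) {
            // schema migrations go here
        }
    }

    static long addNote(Context context, String title, String body) {
        SQLiteDatabase db = new NotesDb(context).getWritableDatabase();
        ContentValues values = new ContentValues();
        values.put("title", title);
        values.put("body", body);
        return db.insert("notes", null, values); // returns the new row id, or -1 on error
    }
}

The SharedPreferences half needs no schema at all; the SQLite half needs a helper class and a CREATE TABLE, but in return you get rows you can query, index and update individually.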

Related

Best Possible way to Store Data (Lot of Strings with Images) in android application

I am trying to build an application which will be a kind of ebook (a lot of theory and diagrams).
Now what I want to know is: since there are many ways of storing the data, which one will be the best?
Storing in a database
XML
Or simple text files
I am very concerned about the security of the data as well. Since this will be a paid app, I want the data to be secure, and also fast and convenient to access.
Also, I thought of converting the doc files (the data) into EPUB format and then using EPUB APIs to access the data and show it on the Android app screen. Would that be a good idea compared to the above ways?
Which one will be more secure, fast, flexible and easy?
It depends on how you will access this data. If you store it in XML you will have to read the whole file from the start to get to a chapter (or load it all into memory, for example). That is not a good idea if you are storing a lot of data.
Storing in SQL is faster: you can get at any chapter directly, without reading all the data as you would with XML.
A simple text file has the same problem as XML (XML is a text file).
The only way to secure your data is to encrypt it. If a user gets root on their device, they will gain access to your files and databases no matter where you store them.
Depends on what is more important to you - speed or security.
Speed
Definitely SQLite: it isn't exactly the cleanest, but it is definitely the fastest way.
Security
Custom files which are encrypted. It will take a while to read the whole file and then decrypt it in order to display it, but you can be sure that an attacker will only ever get at the encrypted files; without knowledge of the encryption, that data is useless to him.
EPUB
If you're concerned about security then don't, unless you know how to apply DRM... and honestly that is not the way to go.
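If you go the encrypted-custom-files route, a minimal sketch with the standard javax.crypto API (AES-GCM) could look like this. The class name is made up, and key management (for example via the Android Keystore) is deliberately left out, because that is the genuinely hard part:

import java.security.SecureRandom;
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.GCMParameterSpec;

public class ContentCrypter {

    // Encrypts a chapter's bytes with AES-GCM. Where the key comes from
    // (Android Keystore, a license check, ...) is not shown here.
    static byte[] encrypt(SecretKey key, byte[] plaintext) throws Exception {
        byte[] iv = new byte[12];                      // 96-bit nonce, standard for GCM
        new SecureRandom().nextBytes(iv);
        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
        cipher.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(128, iv));
        byte[] ciphertext = cipher.doFinal(plaintext);
        byte[] out = new byte[iv.length + ciphertext.length];
        System.arraycopy(iv, 0, out, 0, iv.length);    // prepend the nonce to the blob
        System.arraycopy(ciphertext, 0, out, iv.length, ciphertext.length);
        return out;
    }

    static byte[] decrypt(SecretKey key, byte[] blob) throws Exception {
        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
        cipher.init(Cipher.DECRYPT_MODE, key,
                new GCMParameterSpec(128, blob, 0, 12)); // nonce is the first 12 bytes
        return cipher.doFinal(blob, 12, blob.length - 12);
    }

    static SecretKey newKey() throws Exception {
        KeyGenerator gen = KeyGenerator.getInstance("AES");
        gen.init(256);
        return gen.generateKey();
    }
}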
I think that the best way to store a large amount of data is a database; in Android that means an SQLite database. I recommend you put all your text data into an SQLite database. You can structure it in an easy and clean way. Then put your images into the assets folder and store the paths to the images in the database.
Advantages of the database solution:
Always well-structured data
An easy way to update data with a version control system.
You can store, and get fast access to, a really large amount of data.
You can use encryption to protect your data.
Disadvantages
It is more complicated to write good code for a database solution than for an XML or JSON one.
P.S. If you decide to use XML, I recommend you switch to JSON instead. It is faster and easier to use.
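A rough sketch of that layout (chapter text in SQLite, images under assets/, only the asset path stored in the table). Every table, column and file name here is invented for illustration:

import android.content.ContentValues;
import android.content.Context;
import android.database.Cursor;
import android.database.sqlite.SQLiteDatabase;
import android.database.sqlite.SQLiteOpenHelper;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import java.io.IOException;
import java.io.InputStream;

public class BookDb extends SQLiteOpenHelper {

    public BookDb(Context context) { super(context, "book.db", null, 1); }

    @Override public void onCreate(SQLiteDatabase db) {
        db.execSQL("CREATE TABLE chapters (" +
                "_id INTEGER PRIMARY KEY, title TEXT, body TEXT, image_path TEXT)");
    }

    @Override public void onUpgrade(SQLiteDatabase db, int oldVersion, int newVersion) { }

    public void addChapter(String title, String body, String assetPath) {
        ContentValues values = new ContentValues();
        values.put("title", title);
        values.put("body", body);
        values.put("image_path", assetPath);        // e.g. "diagrams/figure_01.png"
        getWritableDatabase().insert("chapters", null, values);
    }

    // Loads one chapter's diagram directly from assets using the stored path.
    public Bitmap loadImage(Context context, long chapterId) throws IOException {
        Cursor c = getReadableDatabase().query("chapters", new String[]{"image_path"},
                "_id = ?", new String[]{String.valueOf(chapterId)}, null, null, null);
        try {
            if (!c.moveToFirst()) return null;
            try (InputStream in = context.getAssets().open(c.getString(0))) {
                return BitmapFactory.decodeStream(in);
            }
        } finally {
            c.close();
        }
    }
}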
Which one will be more secure, fast, flexible and easy?
Secure: it mainly depends on the encryption system.
Fast: SQLite; you can read about some advantages of SQLite here: Android Performance : Flat file vs SQLite
Flexible and easy: storing encrypted files in internal storage is a flexible and easy way, and I think it is secure enough. Here you can find some Android security tips about storing data: http://developer.android.com/training/articles/security-tips.html#StoringData
For saving a small amount of data you can use XML for strings, but you lose the fast-loading factor.
SQLite is good for almost every purpose except security.

Android - Storing fair amounts of data locally, xml/json or SQLite?

Long story short: I'm working on refactoring an old Android project of mine. Previously, it was using serialization, which was painfully slow and, from what I'm reading, a pretty lousy idea in general for Android apps. I'm looking for another way to persist both user-specific data as well and read-only data for the application.
There is going to be a good deal of data on both sides and I'm not sure if there's a "good" way to store it. Basically, the app is a small RPG. There are a number of "maps" that are represented as 2D arrays of Tiles. Each Tile will have a number of attributes, some simple primitives or enums, others additional objects, such as Events, which will also potentially contain various objects, etc. With 400 Tiles in a 20x20 map alone, there's a LOT of data to store. In addition to storing that data, it would need to store a lot of user-specific data, such as which Tiles have been visited, which Events have been successfully run, etc.
I've been examining methods of saving this data out and I just can't seem to settle on something. I guess it boils down to XML or JSON vs SQLite. XML or JSON would be more flexible in terms of future changes, which is good as I want flexibility in the data, ie, adding new attributes to existing objects, adding new objects as the need arises, etc. SQLite isn't as easily malleable as you have to change up the schema, perhaps adjust queries and indexes, etc, but I haven't really used SQLite in the past, so maybe there are some features that help to simplify that process. However, I would also like fast random access to data to avoid loading everything into memory at once if it can be helped. For example, when moving from one map to another, I'd much rather load the next map only when needed rather than having everything held in memory, which is where SQLite would shine as I'd be able to directly query the data rather than traversing a JSON/XML file to find potentially scattered data, ie, we load the map, but Events and objects contained in the Events may not be unique to that map and could easily lie elsewhere in the file or in another file entirely. However, normalizing the data in SQLite would mean a lot of tables and quite a bit of deconstructing/reconstructing of objects.
Writing user data would only occur when the user manually saves the game, so write performance is not a big issue.
I sometimes have a tendency to overanalyze and get hung up on stuff like this. Maybe neither case is necessarily "wrong" and I'm worrying about things that are infinitesimal. Maybe there are other cases that I haven't considered. I've used Hibernate and have considered something like ORMLite to handle a lot of the database nitty-gritty, but that would require a lot of retrofitting, likely much more than what I would need to do for other options.
I'd suggest you use SQLite. I think it makes the most sense considering the amount of data you're trying to store.
As far as your concerns with it not being as flexible, I would argue that point. Just use a ContentProvider. ContentProviders make it pretty easy to update the db schema and queries without affecting your existing functionality. If you use a ContentProvider, you could even swap out persistent data strategies in the future as well as use different ones simultaneously.
http://developer.android.com/guide/topics/providers/content-providers.html
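For reference, a bare-bones ContentProvider over a single hypothetical "tiles" table might look like the sketch below. The authority, table and columns are invented; a real implementation would also declare the provider in the manifest and typically use a UriMatcher once there is more than one table.

import android.content.ContentProvider;
import android.content.ContentValues;
import android.database.Cursor;
import android.database.sqlite.SQLiteDatabase;
import android.database.sqlite.SQLiteOpenHelper;
import android.net.Uri;

public class TileProvider extends ContentProvider {

    public static final Uri CONTENT_URI = Uri.parse("content://com.example.rpg.tiles/tiles");

    private SQLiteOpenHelper helper;

    @Override public boolean onCreate() {
        helper = new SQLiteOpenHelper(getContext(), "game.db", null, 1) {
            @Override public void onCreate(SQLiteDatabase db) {
                db.execSQL("CREATE TABLE tiles (_id INTEGER PRIMARY KEY, " +
                        "map_id INTEGER, x INTEGER, y INTEGER, visited INTEGER)");
            }
            @Override public void onUpgrade(SQLiteDatabase db, int oldV, int newV) {
                // schema changes live in one place, behind the provider
            }
        };
        return true;
    }

    @Override public Cursor query(Uri uri, String[] projection, String selection,
                                  String[] selectionArgs, String sortOrder) {
        return helper.getReadableDatabase()
                .query("tiles", projection, selection, selectionArgs, null, null, sortOrder);
    }

    @Override public Uri insert(Uri uri, ContentValues values) {
        long id = helper.getWritableDatabase().insert("tiles", null, values);
        return Uri.withAppendedPath(CONTENT_URI, String.valueOf(id));
    }

    @Override public int update(Uri uri, ContentValues values, String selection,
                                String[] selectionArgs) {
        return helper.getWritableDatabase().update("tiles", values, selection, selectionArgs);
    }

    @Override public int delete(Uri uri, String selection, String[] selectionArgs) {
        return helper.getWritableDatabase().delete("tiles", selection, selectionArgs);
    }

    @Override public String getType(Uri uri) {
        return "vnd.android.cursor.dir/vnd.com.example.rpg.tiles";
    }
}

Callers then go through a ContentResolver rather than the database class directly, which is what makes it possible to change the schema or the storage backend later without touching the rest of the app.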

Is it bad to store "JSON.stringify" data in SharedPreferences?

I can't choose between SQLite and SharedPreferences.
I can use
JSON.parse(SharedPreferences.getString("data","qweqwe"));
and
s.putString(key,JSON.stringify(JSONObject));
Or create a new, big class to store my (text) data in SQLite. (PS: JSON.* is my own class)
What will be faster, better?
I know that SharedPreferences is for "key-value" data and SQLite is for large amounts of structured data. But in my case, storing JSON-formatted data in SP and accessing it by key would be easier. Main question: will it be slower or faster? Pros and cons?
On the one hand this is a slightly subjective question (and not the best fit for stackoverflow). On the other hand, taking your question title literally, the objective answer is "No, it's not bad".
The reasoning is, however, slightly subjective as it 'depends' on the situation.
The SharedPreferences class is effectively a wrapper / helper for a file stored in an app's private (internal) storage - as I understand it, it's an XML file. Based on that fact, ask yourself again..."Is it bad to save a JSON-formatted string in an XML file"?
As you mention in the comment on your question, using an SQLite database will mean writing extra code, whereas the advantage of SharedPreferences is that a given preference file is accessible by name from any Android class which extends Context, including Application, Activity and Service.
I thought about this and had some practice.
So, using SQLite (in my case) is better than using a JSON-formatted string in SharedPreferences, because I can update just one or two rows in a table. With SharedPreferences I have to:
use new JSONObject(sharedPreferences.getString("json_string","qweqwe"));
do some manipulations with the object
edit my SharedPreferences.
season to taste
put my JSONObject.toString() back into SharedPreferences. That's all.
IMHO, that's more work for the device. (And it cannot be seasoned.)
If I didn't need to update individual parts of the data, I'd rather use SharedPreferences, because for static data that never needs updating it is faster.
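For comparison, here is roughly what that dance looks like in code, minus the seasoning (the preference file name, key and JSON field are invented):

import android.content.Context;
import android.content.SharedPreferences;
import org.json.JSONException;
import org.json.JSONObject;

public class JsonPrefs {

    // Updating a single field when the whole data set lives in one JSON string.
    static void setUnreadCount(Context context, int unread) throws JSONException {
        SharedPreferences prefs =
                context.getSharedPreferences("app_data", Context.MODE_PRIVATE);

        // 1. read the whole blob, 2. parse it, 3. change one field,
        // 4. serialise everything again, 5. rewrite the whole preference file.
        JSONObject json = new JSONObject(prefs.getString("json_string", "{}"));
        json.put("unread_count", unread);
        prefs.edit().putString("json_string", json.toString()).apply();
    }

    // The SQLite equivalent touches only the rows that changed, e.g.:
    // db.execSQL("UPDATE items SET unread = 0 WHERE feed_id = ?", new Object[]{feedId});
}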
I have used this approach in several projects without any issues so far. But there are certainly several advantages to using an SQLite database; particularly, the versioning/upgrade features of SQLite, and the powerful SQL querying language. If you ever need to migrate data to a new structure of storage, the upgrade and onUpgrade callbacks in the SQLite framework can be immensely helpful.
If you are keeping it simple, JSON in preferences can be very quick and extremely easy to implement. In terms of security, preferences are slightly more "exposed" than a database in that they are simply stored in xml, but ultimately the database file for an SQLite database is stored the same way and can be read during an intrusion as well.
I haven't had any performance issues using JSON/SharedPreferences yet, but I also haven't done any profiling to test this. My mindset has been to keep the code simple and not optimize prematurely - if performance issues arise, do the work of profiling it at that point.
Ultimately, I'd say that there is nothing inherently wrong with using SharedPreferences in this manner.
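As a rough sketch of the versioning/onUpgrade feature mentioned above (the database name, table and migration are invented for illustration):

import android.content.Context;
import android.database.sqlite.SQLiteDatabase;
import android.database.sqlite.SQLiteOpenHelper;

public class AppDb extends SQLiteOpenHelper {

    private static final int VERSION = 2;   // was 1 in the first release

    public AppDb(Context context) { super(context, "app.db", null, VERSION); }

    @Override public void onCreate(SQLiteDatabase db) {
        // Fresh installs get the current schema straight away.
        db.execSQL("CREATE TABLE profiles (_id INTEGER PRIMARY KEY, name TEXT, avatar TEXT)");
    }

    @Override public void onUpgrade(SQLiteDatabase db, int oldVersion, int newVersion) {
        if (oldVersion < 2) {
            // v1 -> v2: a field that used to live inside a JSON blob gets its own column.
            db.execSQL("ALTER TABLE profiles ADD COLUMN avatar TEXT");
        }
    }
}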

Which is better? Database or xmlfile? [duplicate]

I really like XML for saving data, but when does SQLite/a database become the better option? E.g., when the XML has more than x items or is greater than y MB?
I am coding an RSS reader and I believe I made the wrong choice in using XML over an SQLite database to store a cache of all the feeds' items. Some feeds have an XML file of ~1 MB after a month, another has over 700 items, while most only have ~30 items and are ~50 KB in size after several months.
I currently have no plans to implement a cap because I like to be able to search through everything.
So, my questions are:
When is the overhead of SQLite/a database justified over using XML?
Are the few large XML files justification enough for the database when there are a lot of small ones, even though the small ones will also grow over time? (a long, long time)
Update (more info)
Every time a feed is selected in the GUI I reload all the items from that feed's XML file.
I also need to modify the read/unread status, which seems really hacky: I loop through all nodes in the XML to find the item and then set it to read/unread.
Man do I have experience with this. I work on a project where we originally stored all of our data using XML, then moved to SQLite. There are many pros and cons to each technology, but it was performance that caused the switchover. Here is what we observed.
For small databases (a few meg or smaller), XML was much faster, and easier to deal with. Our data was naturally in a tree format, which made XML much more attractive, and XPath allowed us to do many queries in one simple line rather than having to walk down an ancestry tree.
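For readers who haven't used XPath, this is the kind of one-line query meant here, written with the standard Java XPath API (the file name and element names are invented):

import java.io.File;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPath;
import javax.xml.xpath.XPathConstants;
import javax.xml.xpath.XPathFactory;
import org.w3c.dom.Document;
import org.w3c.dom.NodeList;

public class XPathExample {
    public static void main(String[] args) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new File("feeds.xml"));

        XPath xpath = XPathFactory.newInstance().newXPath();
        // "every unread item, anywhere in the tree" in one expression
        NodeList unread = (NodeList) xpath.evaluate(
                "//item[@read='false']", doc, XPathConstants.NODESET);

        System.out.println(unread.getLength() + " unread items");
    }
}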
We were programming in a Win32 environment, and used the standard Microsoft DOM library. We would load all the data into memory, parse it into a DOM tree and search, add, modify on the in memory copy. We would periodically save the data, and needed to rotate copies in case the machine crashed in the middle of a write.
We also needed to build up some "indexes" by hand using C++ tree maps. This, of course would be trivial to do with SQL.
Note that the size of the data on the filesystem was a factor of 2-4 smaller than the "in memory" DOM tree.
By the time the data got to 10M-100M in size, we started to have real problems. Interestingly enough, at all data sizes the XML processing was much faster than SQLite turned out to be (because it was in memory, not on the hard drive)! The problem was actually twofold: first, load-up time really started to get long. We would need to wait a minute or so before the data was in memory and the maps were built. Of course, once loaded the program was very fast. The second problem was that all of this memory was tied up all the time. Systems with only a few hundred megs of RAM would become unresponsive in other apps even though we ran very fast.
We actually looked into using a filesystem-based XML database. There are a couple of open-source XML databases, and we tried them. I have never tried to use a commercial XML database, so I can't comment on them. Unfortunately, we could never get the XML databases to work well at all. Even the act of populating the database with hundreds of megs of XML took hours... Perhaps we were using it incorrectly. Another problem was that these databases were pretty heavyweight. They required Java and had a full client-server architecture. We gave up on this idea.
We found SQLite then. It solved our problems, but at a price. When we initially plugged SQLite in, the memory and load time problems were gone. Unfortunately, since all processing was now done on the harddrive, the background processing load went way up. While earlier we never even noticed the CPU load, now the processor usage was way up. We needed to optimize the code, and still needed to keep some data in memory. We also needed to rewrite many simple XPath queries as complicated multiquery algorithms.
So here is a summary of what we learned.
For tree data, XML is much easier to query and modify using XPath.
For small datasets (less than 10M), XML blew away SQLite in performance.
For large datasets (greater than 10M-100M), XML load time and memory usage became a big problem, to the point that some computers became unusable.
We couldn't get any opensource XML database to fix the problems associated with large datasets.
SQLite doesn't have the memory problems of XML DOM, but it is generally slower at processing the data (it is on the hard drive, not in memory). (Note: SQLite tables can be stored in memory, and perhaps this would make it as fast; we didn't try it because we wanted to get the data out of memory. A sketch of the in-memory option follows this list.)
Storing and querying tree data in a table is not enjoyable. However, managing transactions and indexing partially makes up for it.
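Here is the in-memory sketch referred to in that note, using Android's SQLite wrapper (the project described above was a desktop one, so this is only the nearest Android equivalent, with an invented table):

import android.database.Cursor;
import android.database.sqlite.SQLiteDatabase;

public class InMemoryDbSketch {
    static void demo() {
        SQLiteDatabase db = SQLiteDatabase.create(null);  // memory-backed, no file on disk
        db.execSQL("CREATE TABLE t (k TEXT, v TEXT)");
        db.execSQL("INSERT INTO t VALUES ('answer', '42')");
        Cursor c = db.rawQuery("SELECT v FROM t WHERE k = ?", new String[]{"answer"});
        if (c.moveToFirst()) {
            System.out.println(c.getString(0));
        }
        c.close();
        db.close();                                        // contents are discarded on close
    }
}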
I basically agree with Mitchel that this can be highly specific depending on what you are going to do with XML and SQLite. For your case (a cache), it seems to me that using SQLite (or another embedded database) makes more sense.
First, I don't really think that SQLite will need more overhead than XML, and I mean both development-time overhead and runtime overhead. The only problem is that you have a dependency on the SQLite library. But since you would need some library for XML anyway, it doesn't matter (I assume the project is in C/C++).
Advantages of SQLite over XML:
everything in one file,
performance degrades more slowly than with XML as the cache gets bigger,
you can keep feed metadata separate from the cache itself (in another table), but accessible in the same way,
SQL is probably easier to work with than XPath for most people.
Disadvantages of SQLite:
it can be problematic with multiple processes accessing the same database (probably not your case),
you should know at least basic SQL. Unless there will be hundreds of thousands of items in the cache, I don't think you will need to optimize it much,
maybe in some way it can be more dangerous from a security standpoint (SQL injection). On the other hand, you are not coding a web app, so this should not happen.
Other things are on par for both solutions probably.
To sum it up, answers to your questions respectively:
You will not know, unless you test your specific application with both back ends. Otherwise it's always just a guess. Basic support for both caches should not be a problem to code. Then benchmark and compare.
Because of the way XML files are organized, SQLite searches should always be faster (barring some corner cases where it doesn't matter anyway because it's blazingly fast). Speeding up searches in XML would require an index database anyway; in your case that would mean having a cache for the cache, which is not a particularly good idea. But with SQLite you can have indexing as part of the database.
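To make that last point concrete: "indexing as part of the database" is one statement at schema-creation time, and searches then use it automatically. Sketched below with the Android SQLite API used elsewhere on this page (the table layout is invented; the SQL itself is the portable part):

import android.database.sqlite.SQLiteDatabase;

public class CacheSchema {
    // Invented feed-cache layout: one row per item, with an index on the
    // columns the reader actually filters and sorts on.
    static void createSchema(SQLiteDatabase db) {
        db.execSQL("CREATE TABLE items (" +
                "_id INTEGER PRIMARY KEY, feed_id INTEGER, title TEXT, " +
                "pub_date INTEGER, read INTEGER DEFAULT 0)");
        db.execSQL("CREATE INDEX idx_items_feed_date ON items (feed_id, pub_date)");
    }
}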
Don't forget that you have a great database at your fingertips: the filesystem!
Lots of programmers forget what a decent directory/file structure offers:
It's fast as hell
It's portable
It has a tiny runtime footprint
People are talking about splitting up XML files into multiple XML files... I would consider splitting your XML into multiple directories and multiple plaintext files.
Give it a go. It's refreshingly fast.
Use XML for data that the application should know - configuration, logging and what not.
Use databases (Oracle, SQL Server, etc.) for data that the user interacts with directly or indirectly - real data.
Use SQLite if the user data is more of a serialized collection - like a huge list of files and their content, or a collection of email items, etc. SQLite is good at that.
Depends on the kind and the size of the data.
I wouldn't use XML for storing RSS items. A feed reader makes constant updates as it receives data.
With XML, you need to load the data from the file first, parse it, then store it for easy search/retrieval/update. Sounds like a database...
Also, what happens if your application crashes? if you use XML, what state is the data in the XML file versus the data in memory. At least with SQLite you get atomicity, so you are assured that your application will start with the same state as when the last database write was made.
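A short sketch of the atomicity point, using the Android SQLite transaction API (the "items" table is invented): either every write in the block lands, or none of them do, even if the app dies midway through.

import android.content.ContentValues;
import android.database.sqlite.SQLiteDatabase;

public class FeedUpdater {
    static void storeFetchedItems(SQLiteDatabase db, ContentValues[] items) {
        db.beginTransaction();
        try {
            for (ContentValues item : items) {
                db.insert("items", null, item);
            }
            db.setTransactionSuccessful();   // commit point
        } finally {
            db.endTransaction();             // rolls back if success was never marked
        }
    }
}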
XML is best used as an interchange format when you need to move data from your application to somewhere else or share information between applications. A database should be the preferred method of storage for almost any size application.
When should XML be used for data persistence instead of a database? Almost never. XML is a data transport language. It is slow to parse and awkward to query. Parse the XML (don't shred it!) and convert the resulting data into domain objects. Then persist the domain objects. A major advantage of a database for persistence is SQL, which means ad hoc queries and access to common tools and optimization techniques.
I have made the switch to SQLite and I feel much better knowing it's in a database.
There are a lot of other benefits from this:
Adding new items is really simple
Sorting by multiple columns
Removing duplicates with a unique index
I've created two views, one for unread items and one for all items; not sure if this is the best use of views, but I really wanted to try using them.
I also benchmarked XML vs SQLite using the Stopwatch class, and SQLite is faster, although it could just be that my way of parsing XML files wasn't the fastest method.
Small number of items (25 items, 30 KB): ~1.5 ms SQLite, ~8.0 ms XML
Large number of items (700 items, 350 KB): ~20 ms SQLite, ~25 ms XML
Large file size (850 items, 1,024 KB): ~45 ms SQLite, ~60 ms XML
To me it really depends on what you are doing with them, how many users/processes need access to them at the same time etc.
I work with large XML files all the time, but they are single-process, import-style jobs where multi-user access and performance are not really requirements.
So really it is a balance.
If there is any chance you will need to scale, use a database.
XML is good for storing data which is not completely structured and which you typically want to exchange with another application. I prefer to use an SQL database for data. XML is error-prone, as you can cause subtle errors due to typos or omissions in the data itself. Some open-source application frameworks use too many XML files for configuration, data, etc. I prefer to have it in SQL.
Since you ask for a rule of thumb, I would say: use XML for application data, configuration, etc. if you are going to set it up once and not access or search it much. For active searching and updating, it's best to go with SQL.
For example, a web server stores application data in an XML file, and you don't really need to perform complex searches or updates on the file. The web server starts, reads the XML file and that's that. So XML is perfect here. Suppose you use a framework like Struts: you need to use XML, and the action configurations don't change much once the application is developed and deployed. So again, an XML file is a good way. Now if your Struts-developed application allows extensive searches, updates and deletions, then SQL is the optimal way.
Of course, you will surely meet one or two developers in your organisation who chant 'XML only' or 'SQL only' and proclaim it as the only way to go. Beware of such folks and do what feels right for your application. Don't just follow a 'technology religion'.
Think of things like how often you need to update the data and how often you need to search it. Then you will have your answer on what to use: XML or SQL.
I agree with @Bradley.
XML is very slow and not particularly useful as a storage format. Why bother? Will you be editing the data by hand using a text editor? If so, XML still isn't a very convenient format compared to something like YAML. With something like SQLite, queries are easier to write, and there's a well-defined API for getting your data in and out.
XML is fine if you need to send data around between programs. But in the name of efficiency, you should probably produce the XML at sending time, and parse it into "real data" at receive time.
All the above means that your question about "when the overhead of a database is justified" is kind of moot. XML has a way higher overhead, all the time, than SQLite does. (Full-on databases like MSSQL are heavier, especially in administrative overhead, but that's a totally different question.)
XML can be stored as text and as a binary file format.
If your primary goal is to let a computer read/write a file format efficiently, you should work with a binary file format.
Databases are an easy-to-use way of storing and maintaining data.
They are not the fastest way to store data; that would be a binary file format.
What can speed things up is using an in memory database / database type. Sqlite has this option.
And this sounds like the best way to do it for you.
My opinion is that you should use SQLite (or another appropriate embedded database) anytime you don't need a pure-text file format. Note, this is a pretty big exception. There are a lot of scenarios that require, or are benefited by, pure-text file formats.
As far as overhead goes, SQLite compiles to something like 250 k with normal flags. Many XML parsing libraries are larger than SQLite. You get no concurrency gains using XML. The SQLite binary file format is going to support much more efficient writes (largely because you can't append to the end of a well-formatted XML file). And even reading data, most of which I assume is fairly random access, is going to be faster using SQLite.
And to top it all off, you get access to the benefits of SQL like transactions and indexes.
Edit: Forgot to mention. One benefit of SQLite (as opposed to many databases) is that it allows any type in any row in any column. Basically, with SQLite you get the same freedom you have with XML in terms of datatypes. This also means that you don't have to worry about putting limits on text columns.
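A tiny illustration of the "any type in any column" point, relying on SQLite's type affinity and its built-in typeof() function (the table is invented):

import android.database.Cursor;
import android.database.sqlite.SQLiteDatabase;

public class TypeAffinityDemo {
    static void demo(SQLiteDatabase db) {
        db.execSQL("CREATE TABLE mixed (value INTEGER)");
        db.execSQL("INSERT INTO mixed VALUES (42)");
        db.execSQL("INSERT INTO mixed VALUES ('forty-two')"); // accepted despite the declared type

        Cursor c = db.rawQuery("SELECT value, typeof(value) FROM mixed", null);
        while (c.moveToNext()) {
            System.out.println(c.getString(0) + " is stored as " + c.getString(1));
        }
        c.close();
    }
}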
You should note that many large relational DBs (Oracle and SQL Server) have XML datatypes to store data within a database and use XPath within the SQL statement to gain access to that data.
Also, there are native XML databases which work very much like SQLite in the sense that they are one binary file holding a collection of documents (which could roughly be a table); you can then run XPath/XQuery on a single document or on the whole collection. So with an XML database you can do things like store each day's data as a separate XML document in the collection... then you just need to use that one document when you're dealing with the data for today, but you can write an XQuery to figure out historical data across the collection of documents for that person. Slick.
I've used Berkeley XMLDB (now backed by Oracle). There are others if you search google for "Native XML Database". I've not seen a performance problem with storing/retrieving data in this manner.
XQuery is a different beast (but well worth learning), however you may be able to just use the XPaths you currently use with slight modifications.
A database is great as part of your program, if querying the data is part of your business logic.
XML is best as a file format, especially if your data format is:
1. Hierarchical
2. Likely to change in the future in ways you can't guess
3. Going to live longer than the program
I say it's not a matter of data size, but of data type. If your data is structured, use a relational database. If your data is semi-structured, use XML or - if the data amounts really grow too large - an XML database.
If you're searching, go with a DB. You could split the XML files up into directories to ease seeking, but the management overhead easily gets quite heavy. You also get a lot more than just performance with an SQL DB...

Is using Android shared preferences for storing large amounts of data a good idea?

So I inherited this Android project from someone else. The code currently seems to be storing huge amounts of data (that really belongs in an SQLite database) in the shared preferences. I'm very uncomfortable with that part of the code and want to start using the SQLite database. But I am still unable to justify to myself the time it would take, especially if it comes with no immediate benefits.
Of course I'm eventually going to move it to SQLite, but since I'm kind of on a tight deadline I was wondering whether this is something worth doing now or later.
Any thoughts and comments on storing large amounts of data in shared preferences would be very much appreciated.
Thanks
If it works now then you can definitely leave it. You are correct that the large amounts of data should go into the database. If nothing else, you'll have an easier time of querying for data.
Further research has found this post suggesting that you won't have any major problems with a large amount of data in your shared prefs. You could, however, have performance issues, since the single shared prefs XML file has to be read to get any pref, while with a database you only grab what you need as you need it.
TL;DR: Don't use shared prefs for large storage; use a DB instead (but if it works for now and you're in a rush, do it later).
I wouldn't personally recommend it, since the system keeps an in-memory copy of all shared prefs for your app, so if you throw a lot of data in there your memory usage will be affected. That said, it depends on the data format and on how often you have to access it. If you can use a key to find a value directly, it might be faster to look up than a file or a database, since it's already in memory; but if you store one huge JSON object that you then have to parse, or if you have to read all shared prefs and do a linear search for the one you really need, there's little benefit either way. SharedPreferences also has the advantage of being thread-safe (an SQL DB would be as well, since databases get locked when accessed), as opposed to a plain file, where you would have to deal with that on your own.
Hope this helps.
