My question is not "how?" but "should I?".
I have an app that accesses the database very often. It's basically a money manager. It uses the data to make calculations, draw charts and so on.
I was wondering whether hitting the database every time I need some data is efficient, especially as the database grows larger.
Is it a good idea to make a class that reads all the database tables into multidimensional arrays for the rest of the application to use? I would read them in a thread, with a progress bar if necessary, and then I would have a much more efficient way of accessing the data from the arrays, right?
Please be gentle with me, as I could be totally wrong here.
I don't think that is a good idea. If your data grows large, you can easily run out of memory, unless you are sure you will never have that much data, or you need to read it all into memory anyway. How much data are we talking about? And how much of it do you actually need on the screen?
I recently read in the documentation for the Realm database that all of its queries are lazy, and I am not sure I correctly understand the implications of this.
Let me explain how I understand it, and please feel free to correct me if I'm wrong. The way I see it, when I run a command such as mRealm.where(Customer.class).equalTo(Customer.ID, "someId").findFirst(); I do not get a Java object with all the data that the database holds for this customer. Instead, queries are made whenever I try to access some field of this "fetched" object. So I am wondering whether that is faster than OrmLite if I want to access all of the fields of a given class.
Another related question: is it a good idea to use Realm for items displayed in a ListView or RecyclerView? If queries are constantly made while the list is scrolling, wouldn't that have a serious impact on performance?
I would be really glad, if someone could explain this to me in more detail.
RealmResults, which are created by our queries, are auto-updating views into the underlying data. You can think of them as type-safe cursors and they have some of the same trade-offs, but unlike Cursors we don't copy data into a CursorWindow so there is no pagination effect. You can access the entire object graph without being worried about reaching a CursorWindow limit.
Most ORMs copy all data into Java heap memory. This can be a very costly operation, both time- and memory-wise, and you might load a lot of additional data you don't even need. The upside is that you only do it once; after that, accessing the data is really fast.
Realm, on the other hand, only loads the data you actually need, down to individual fields. So if you have a ListView showing 1 field from 10 items, we will only load those 10 fields, just like a CursorAdapter. Realm does have to load that data from native memory, though, which is more expensive than reading it from the Java heap.
So to answer your question: yes, Realm works perfectly fine with a ListView.
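To make the lazy behaviour concrete, here is a minimal sketch in the spirit of the query from the question. It assumes Realm.init(context) has already been called and that Customer is a RealmObject as in the question; the name field and getName() getter are hypothetical examples.

```java
import io.realm.Realm;
import io.realm.RealmObject;
import io.realm.RealmResults;

// Hypothetical model matching the question's Customer; field names are examples.
class Customer extends RealmObject {
    public static final String ID = "id";
    private String id;
    private String name;

    public String getName() { return name; }
}

class LazyQueryExample {
    void readOneField() {
        Realm realm = Realm.getDefaultInstance();
        try {
            // No row data is copied to the Java heap here; the result is a live, lazy view.
            RealmResults<Customer> all = realm.where(Customer.class).findAll();

            Customer customer = realm.where(Customer.class)
                    .equalTo(Customer.ID, "someId")
                    .findFirst();

            // Only when a getter is called is that single field read from native memory.
            if (customer != null) {
                String name = customer.getName();
            }
        } finally {
            realm.close();
        }
    }
}
```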
If you want to know which is faster, try it ;). Probably Realm will be faster, because its proprietary storage engine is more optimized than SQLite, even a fine-tuned SQLite. But that's just my supposition.
No, it shouldn't have an impact on performance, because we are assuming Realm is smart enough to fetch the real data when appropriate. It's the same with other ORMs: data is fetched when pagination occurs.
Long story short: I'm working on refactoring an old Android project of mine. Previously it used serialization, which was painfully slow and, from what I'm reading, a pretty lousy idea in general for Android apps. I'm looking for another way to persist both user-specific data and read-only data for the application.
There is going to be a good deal of data on both sides and I'm not sure if there's a "good" way to store it. Basically, the app is a small RPG. There are a number of "maps" that are represented as 2D arrays of Tiles. Each Tile will have a number of attributes, some simple primitives or enums, others additional objects, such as Events, which will also potentially contain various objects, etc. With 400 Tiles in a 20x20 map alone, there's a LOT of data to store. In addition to storing that data, it would need to store a lot of user-specific data, such as which Tiles have been visited, which Events have been successfully run, etc.
I've been examining ways of saving this data and I just can't seem to settle on something. I guess it boils down to XML or JSON vs SQLite.
XML or JSON would be more flexible in terms of future changes, which is good as I want flexibility in the data, i.e., adding new attributes to existing objects, adding new objects as the need arises, etc. SQLite isn't as easily malleable, since you have to change the schema and perhaps adjust queries and indexes, but I haven't really used SQLite in the past, so maybe there are features that simplify that process.
However, I would also like fast random access to the data, to avoid loading everything into memory at once if it can be helped. For example, when moving from one map to another, I'd much rather load the next map only when needed than hold everything in memory. This is where SQLite would shine: I'd be able to query the data directly rather than traverse a JSON/XML file to find potentially scattered data, i.e., we load the map, but Events and objects contained in the Events may not be unique to that map and could easily lie elsewhere in the file, or in another file entirely. On the other hand, normalizing the data in SQLite would mean a lot of tables and quite a bit of deconstructing/reconstructing of objects.
Writing user data would only occur when the user manually saves the game, so write performance is not a big issue.
I sometimes have a tendency to overanalyze and get hung up on stuff like this. Maybe neither case is necessarily "wrong" and I'm worrying about things that are infinitesimal. Maybe there are other cases that I haven't considered. I've used Hibernate and have considered something like ORMLite to handle a lot of the database nitty-gritty, but that would require a lot of retrofitting, likely much more than what I would need to do for other options.
I'd suggest you use SQLite. I think it makes the most sense considering the amount of data you're trying to store.
As for your concern that it isn't as flexible, I would argue that point. Just use a ContentProvider. ContentProviders make it pretty easy to update the db schema and queries without affecting your existing functionality. If you use a ContentProvider, you could even swap out persistence strategies in the future, or use different ones simultaneously.
http://developer.android.com/guide/topics/providers/content-providers.html
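As a rough illustration of how contained a schema change can be, here is a sketch of the SQLiteOpenHelper a ContentProvider would typically sit on top of. The table and column names are hypothetical, not taken from the question.

```java
import android.content.Context;
import android.database.sqlite.SQLiteDatabase;
import android.database.sqlite.SQLiteOpenHelper;

// Sketch of the helper a ContentProvider would wrap; names are made up.
public class GameDbHelper extends SQLiteOpenHelper {
    private static final int DB_VERSION = 2;

    public GameDbHelper(Context context) {
        super(context, "game.db", null, DB_VERSION);
    }

    @Override
    public void onCreate(SQLiteDatabase db) {
        db.execSQL("CREATE TABLE tiles (_id INTEGER PRIMARY KEY, map_id INTEGER, "
                + "x INTEGER, y INTEGER, visited INTEGER DEFAULT 0)");
    }

    @Override
    public void onUpgrade(SQLiteDatabase db, int oldVersion, int newVersion) {
        // Adding an attribute later is just another migration step; the provider's
        // URIs and the rest of the app don't have to change.
        if (oldVersion < 2) {
            db.execSQL("ALTER TABLE tiles ADD COLUMN event_id INTEGER");
        }
    }
}
```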
My app needs to store data on the phone, but I'm not sure what the more efficient method is. I don't need to search through the data or anything like that. I just need to be able to save the app's current state when it closes and restore it when it's back up. There is between 1MB and 10MB worth of data that will need saving.
There are basically a bunch of custom classes with data in them, and right now I have them as Serializable, and just save each class to a file. Is there any reason for me to change that to store it in SQLite?
If you were to use SQLite you could save as you go, and know that what's in the DB is pretty much up to date if the app/activity holding the data is suddenly killed by the OS. Other than that I can't see an obvious reason to use SQLite for your use case.
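Just to illustrate the "save as you go" point, a tiny sketch of writing a single state change the moment it happens; the table, column and class names are hypothetical.

```java
import android.content.ContentValues;
import android.database.sqlite.SQLiteDatabase;

// "Save as you go": each change is written immediately, so a sudden process
// kill loses at most the last change. Names here are made-up examples.
public class StateWriter {
    private final SQLiteDatabase db;

    public StateWriter(SQLiteDatabase db) {
        this.db = db;
    }

    public void markItemCompleted(long itemId) {
        ContentValues values = new ContentValues();
        values.put("completed", 1);
        db.update("items", values, "_id = ?", new String[] { String.valueOf(itemId) });
    }
}
```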
Also, with the SQL approach you have a clear-cut way to change the structure of your domain objects later and to migrate the data from an old to a new version of your database. This can be done with serialized objects as well, but then the objects need to exist in duplicate, the new and the old version at the same time. That sounds very messy to me, especially considering that you might need to go from version x to y, so you might end up with some pretty tricky problems if you ever need to update the domain objects.
And I honestly cannot see any benefits to the flat-file/serialized approach.
You mention in your question that the data is only meant to save the state of the app, so my initial response would be to keep it on the device, especially since you mention that the file size would not be much more than 10MB, which is quite reasonable.
So my answer to you would be to keep it as is on the device. If your usage of the information changes in the future, you should then reconsider this approach, but for now it's totally logical.
If you're only saving serialized classes, you could use an ORM mapper as discussed in this thread. This saves you the inconvenience of writing your own mapper and is easily extendable to new classes. Also, if your requirements change, you COULD look up data.
The only reasons for changing your system to SQLite would be more convenience and maybe a more foolproof system. E.g., right now you have to check whether the file exists, parse the contents, etc., whereas with SQLite you don't have to verify the integrity of the data yourself, and Android also helps you a little. And you could use the data for other purposes, like displaying it in a ListView.
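For context, this is roughly the boilerplate the current file-based approach involves; AppState stands in for one of the asker's Serializable classes and the file name is made up.

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;

import android.content.Context;

// Placeholder for one of the asker's Serializable classes.
class AppState implements java.io.Serializable {}

public class StateStore {
    private static final String FILE_NAME = "app_state.ser";

    public static void save(Context context, AppState state) throws IOException {
        try (ObjectOutputStream out =
                new ObjectOutputStream(context.openFileOutput(FILE_NAME, Context.MODE_PRIVATE))) {
            out.writeObject(state);
        }
    }

    public static AppState load(Context context) {
        File file = new File(context.getFilesDir(), FILE_NAME);
        if (!file.exists()) {
            return null; // nothing saved yet
        }
        try (ObjectInputStream in = new ObjectInputStream(new FileInputStream(file))) {
            return (AppState) in.readObject();
        } catch (IOException | ClassNotFoundException e) {
            return null; // corrupt or incompatible data has to be handled by hand
        }
    }
}
```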
I'm new but learning. I just need to know which is more memory efficient: a string[] in XML, or an SQLite db? I can do either, and can pre-populate the db. I'm talking about at most 1000 strings, with more possible in updates.
Thanks for your answers.
PS: I have learned so much from Stack Overflow. This is the first place I turn to when I hit a snag. Thank you.
It depends on what the strings are and what you need them for. If they vary each time the app runs, keeping them in memory as a string array is probably best. If they are persistent between app runs, the SQLite DB will probably be better in the long run, since you don't need to "reload" the data between app runs.
Likewise, do you really need all 1000 strings in memory at all times while the app runs? If so, again the array might be a good idea. If not, the database is a better bet.
Ultimately, you need to run it on a variety of android devices and see which implementation is sufficiently responsive for whatever the app is designed to do.
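If the strings turn out to be static, the XML route is a one-liner at runtime; a small sketch, where R.array.phrases is a hypothetical <string-array> resource:

```java
import android.content.Context;

public class StringsExample {
    // Loads a <string-array> defined in res/values/arrays.xml in one call.
    // R.array.phrases is a made-up resource name for illustration.
    static String[] loadAll(Context context) {
        return context.getResources().getStringArray(R.array.phrases);
    }
}
```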
I would say string[] is much better. Here is a good answer from SO itself.
"Unless you want to store the data persistently I'd say you should probably just use an Array. Databases are more for persistent storage (i.e. stuff you'll need over multiple runs of your app). That said, if you arrays start getting reeeeeeeeeealy* big, then yea you're going to want to move them onto disk (in which case they won't take up any memory). And probably the simplest way to do that is with a database.
*On the order of magnitude of hundreds of thousands of entrys, maybe even more."
Source: #Kurtis Nusbaum
https://stackoverflow.com/a/7906472/847954
We're designing an Android app that has a lot of data ("customers", "products", "orders"...), and we don't want to query SQLite every time we need some record. We want to avoid querying the database as much as we can, so we decided to keep certain data in memory at all times.
Our initial idea is to create two simple classes:
"MemoryRecord": a class that will contain basically an array of objects (string, int, double, datetime, etc...), that are the data from a table record, and all methods to get those data in/out from this array.
"MemoryTable": a class that will contain basically a Map of [Key,MemoryRecord] and all methods to manipulate this Map and insert/update/delete record into/from database.
Those classes will be subclassed for every kind of table we have in the database. Of course there are other useful methods not listed above, but they are not important at this point.
So, when starting the app, we will load those tables from the SQLite database into memory using those classes, and every time we need to change some data, we will change it in memory and post it to the database right after.
But, we want some help/advice from you. Can you suggest something more simple or efficient to implement such a thing? Or maybe some existing classes that already do it for us?
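To make the idea concrete, here is a minimal sketch of the write-through pattern described above, using a plain Map of id to name instead of a generic MemoryRecord; the table and column names are only examples.

```java
import java.util.HashMap;
import java.util.Map;

import android.content.ContentValues;
import android.database.Cursor;
import android.database.sqlite.SQLiteDatabase;

// Write-through cache sketch: read once into memory, serve reads from memory,
// push every change to the database right after updating memory.
public class CustomerCache {
    private final SQLiteDatabase db;
    private final Map<Long, String> names = new HashMap<>();

    public CustomerCache(SQLiteDatabase db) {
        this.db = db;
        Cursor c = db.query("customers", new String[] { "_id", "name" },
                null, null, null, null, null);
        try {
            while (c.moveToNext()) {
                names.put(c.getLong(0), c.getString(1));
            }
        } finally {
            c.close();
        }
    }

    public String getName(long id) {
        return names.get(id); // served from memory
    }

    public void rename(long id, String newName) {
        names.put(id, newName); // change in memory first...
        ContentValues values = new ContentValues();
        values.put("name", newName);
        // ...then post the change to the database right after, as described.
        db.update("customers", values, "_id = ?", new String[] { String.valueOf(id) });
    }
}
```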
I understand what you guys are trying to show me, and I thank you for that.
But let's say we have a table with 2000 records, and I need to list those records. For each one, I have to query 30 other tables (some of them with 1000 records, others with 10) to add additional information to the list, and this happens while the list is "flying" (and as you know, we must be very fast at that moment).
Now you're going to say: "just build your main query with all those 'joins', and bring everything you need in one step. SQLite can be very fast, if your database is well designed, etc...".
OK, but this query will become very complicated, and even though SQLite is very fast, it will be "too" slow (2 to 4 seconds, as I confirmed, and that isn't an acceptable time for us).
Another complication is that, depending on user interaction, we need to re-query all the records, because the tables involved are not the same and we have to re-join with another set of tables.
So an alternative is to bring in only the main records (these never change, no matter what the user does or wants) with no joins (this is very fast!) and query the other tables every time we need some data. Note that for the table with only 10 records, we would fetch the same records over and over again. In that case it is a waste of time, because no matter how fast SQLite is, it will always be more expensive to query, open a cursor, fetch, etc. than to just grab the record from a kind of "memory cache". I want to make it clear that we don't plan to keep all the data in memory at all times, just some tables we query very often.
And so we come back to the original question: what is the best way to "cache" those records? I would really like to focus the discussion on that, and not on "why do you need to cache data?"
The vast majority of the apps on the platform (contacts, Email, Gmail, calendar, etc.) do not do this. Some of these have extremely complicated database schemas with potentially a large amount of data and do not need to do this. What you are proposing to do is going to cause huge pain for you, with no clear gain.
You should first focus on designing your database and schema to be able to do efficient queries. There are two main reasons I can think of for database access to be slow:
You have really complicated data schemas.
You have a very large amount of data.
If you are going to have a lot of data, you can't afford to keep it all in memory anyway, so this is a dead end. If you have complicated structures, you would benefit in either case with optimizing them to improve performance. In both cases, your database schema is going to be key to good performance.
Actually optimizing the schema can be a bit of a black art (and I am no expert on it), but some things to look out for are correctly creating indices on the columns you will query, designing joins so they will take efficient paths, etc. I am sure there are lots of people who can help you with this area.
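For example, a single index can turn a repeated lookup or join from a full table scan into an indexed seek; a small sketch with hypothetical table and column names:

```java
import android.database.sqlite.SQLiteDatabase;

public class Indexing {
    // Speeds up queries like: SELECT ... FROM orders WHERE customer_id = ?
    // and joins from customers to orders on customer_id.
    static void createIndexes(SQLiteDatabase db) {
        db.execSQL("CREATE INDEX IF NOT EXISTS idx_orders_customer_id ON orders(customer_id)");
    }
}
```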
You could also try looking at the source of some of the platform's databases to get some ideas of how to design for good performance. For example the Contacts database (especially starting with 2.0) is extremely complicated and has a lot of optimizations to provide good performance on relatively large data and extensible data sets with lots of different kinds of queries.
Update:
Here's a good illustration of how important database optimization is. In Android's media provider database, a newer version of the platform changed the schema significantly to add some new features. The upgrade code to modify an existing media database to the new schema could take 8 minutes or more to execute.
An engineer made an optimization that reduced the upgrade time of a real test database from 8 minutes to 8 seconds. A 60x performance improvement.
What was this optimization?
It was to create a temporary index, at the point of upgrade, on an important column used in the upgrade operations. (And then delete it when done.) So this 60x performance improvement comes even though it also includes the time needed to build an index on one of the columns used during upgrading.
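The shape of that optimization looks roughly like the sketch below. This is not the actual media provider code, just an illustration with made-up table and column names.

```java
import android.database.sqlite.SQLiteDatabase;

public class UpgradeExample {
    static void upgrade(SQLiteDatabase db) {
        db.beginTransaction();
        try {
            // Build the index once so each statement below can find its rows quickly...
            db.execSQL("CREATE INDEX tmp_idx_files_path ON files(path)");

            // ...run the upgrade statements that repeatedly look rows up by path...
            db.execSQL("UPDATE files SET bucket_id = 1 WHERE path LIKE '/sdcard/DCIM/%'");

            // ...then drop the index so it does not linger after the upgrade.
            db.execSQL("DROP INDEX tmp_idx_files_path");
            db.setTransactionSuccessful();
        } finally {
            db.endTransaction();
        }
    }
}
```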
SQLite is one of those things where if you know what you are doing it can be remarkably efficient. And if you don't take care in how you use it, you can end up with wretched performance. It is a safe bet, though, if you are having performance issues with it that you can fix them by improving how you are using SQLite.
The problem with a memory cache is of course that you need to keep it in sync with the database. I've found that querying the database is actually quite fast, and you may be pre-optimizing here. I've done a lot of tests on queries with different data sets and they never take more than 10-20 ms.
It all depends on how you're using the data, of course. ListViews are quite well optimized to handle large numbers of rows (I've tested into the 5000 range with no real issues).
If you are going to stay with the memory cache, you may want to have the database notify the cache when its contents change, and then you can update the cache. That way anyone can update the database without knowing about the caching. Also, if you build a ContentProvider over your database, you can use the ContentResolver to notify you of changes if you register using registerContentObserver.
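A minimal sketch of that last point, assuming a made-up content Uri and with invalidate standing in for whatever refreshes your cache:

```java
import android.content.Context;
import android.database.ContentObserver;
import android.net.Uri;
import android.os.Handler;
import android.os.Looper;

public class CacheInvalidator {
    // Hypothetical Uri exposed by your ContentProvider.
    private static final Uri CONTENT_URI = Uri.parse("content://com.example.app/customers");

    public static void watch(Context context, final Runnable invalidate) {
        ContentObserver observer = new ContentObserver(new Handler(Looper.getMainLooper())) {
            @Override
            public void onChange(boolean selfChange) {
                // Any writer that calls getContentResolver().notifyChange(CONTENT_URI, null)
                // ends up here, so the cache can reload without the writer knowing about it.
                invalidate.run();
            }
        };
        context.getContentResolver().registerContentObserver(CONTENT_URI, true, observer);
    }
}
```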