I need to search through text files (around 40 MB) in my app with regular expressions. As you can imagine, it normally takes a minute or so to finish, and I have to do it repeatedly.
I wonder if I can keep these files in RAM after the first search. Is that possible? I mean, is there a way to explicitly say "keep this in RAM for some time"?
Consider putting your search results in a WeakHashMap, with keys that only exist for as long as you need the values to exist, such as the scope of an Activity. Watch out for memory issues, though: on some devices, your application's process may have a heap size as low as 16 MB.
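A minimal sketch of that idea, assuming the Activity itself is a suitable scope key; the class and method names are illustrative, not from the answer:

import android.app.Activity;
import java.util.List;
import java.util.Map;
import java.util.WeakHashMap;
import java.util.regex.MatchResult;

// Entries live only as long as something else (the framework) still references
// the Activity used as the key; after that they become eligible for collection.
public final class SearchResultCache {
    private static final Map<Activity, List<MatchResult>> RESULTS = new WeakHashMap<>();

    private SearchResultCache() {}

    public static synchronized void put(Activity scope, List<MatchResult> results) {
        RESULTS.put(scope, results);
    }

    public static synchronized List<MatchResult> get(Activity scope) {
        return RESULTS.get(scope); // null once the scope has been collected
    }
}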
Keep the results in a custom object that will save the search result. This will keep it in RAM (as long as you keep a reference to it).
Also keep in mind that allocating 40 MiB of RAM on Android devices is not a very good idea, since RAM is quite limited on a lot of low-end devices. It can make your application a very tasty target when the system looks for memory to free.
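With that caveat in mind, here is a minimal sketch of the load-once-and-reuse approach from the answer above. The class name is illustrative, and Files.readAllBytes needs API 26 on Android (read the stream manually on older devices):

import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// The String field is the "reference you keep": as long as this object is
// reachable, the ~40 MB of text stays in RAM and repeated searches skip disk I/O.
public final class CorpusSearcher {
    private final String text;

    public CorpusSearcher(String path) throws IOException {
        this.text = new String(Files.readAllBytes(Paths.get(path)), StandardCharsets.UTF_8);
    }

    public int countMatches(String regex) {
        Matcher m = Pattern.compile(regex).matcher(text);
        int count = 0;
        while (m.find()) {
            count++;
        }
        return count;
    }
}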
Related
I have an application with a large number of classes and many included libraries. I am setting android:largeHeap="true" because I am running into memory issues. My manifest code is attached:
<application
    android:name=".MyApplication"
    android:allowBackup="true"
    android:icon="@drawable/ic_launcher"
    android:label="My Huge Application"
    android:largeHeap="true"
    android:logo="@drawable/logo"
    android:screenOrientation="portrait"
    android:theme="@style/AppTheme" >
</application>
I have to ask: is this a good practice?
Kindly suggest the advantages and disadvantages (pros and cons) of using it.
Way too late to the party here, but I will offer my $0.02 anyway.
It's not a good idea to use android:largeHeap="true". Here is the extract from the Google documentation that explains why:
However, the ability to request a large heap is intended only for a small set of apps that can justify the need to consume more RAM (such as a large photo editing app). Never request a large heap simply because you've run out of memory and you need a quick fix—you should use it only when you know exactly where all your memory is being allocated and why it must be retained. Yet, even when you're confident your app can justify the large heap, you should avoid requesting it to whatever extent possible. Using the extra memory will increasingly be to the detriment of the overall user experience because garbage collection will take longer and system performance may be slower when task switching or performing other common operations.
Here is the complete documentation: https://developer.android.com/training/articles/memory.html
UPDATE
After working excruciatingly through out-of-memory errors, I would say that adding this to the manifest to avoid the OOM issue is not a sin. Also, as @Milad points out below, it does not affect the normal working of the app.
UPDATE 2
Here are a few tips to deal with out-of-memory errors:
1) Use the callbacks that Android gives you, onLowMemory() and onTrimMemory(int), to clear the caches of your image library (Picasso, Glide, Fresco, ...); you can read more about them here and here. A sketch of this is shown after the list.
2) Compress your files (images, PDFs).
3) Read about how to handle bitmaps more efficiently here.
4) Use lint regularly before production pushes to ensure the code is lean and not bulky.
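To illustrate tip 1, here is a rough sketch of what those callbacks could look like in a custom Application class (matching the .MyApplication declared in the manifest above). The Glide calls are that library's standard cache hooks; substitute whatever your image loader provides:

import android.app.Application;
import com.bumptech.glide.Glide;

public class MyApplication extends Application {

    @Override
    public void onLowMemory() {
        super.onLowMemory();
        // Older platform versions only deliver this callback.
        Glide.get(this).clearMemory();
    }

    @Override
    public void onTrimMemory(int level) {
        super.onTrimMemory(level);
        if (level >= TRIM_MEMORY_MODERATE) {
            // The system is under real pressure: drop the in-memory image cache.
            Glide.get(this).clearMemory();
        }
        // Let the library scale its caches to the reported pressure level.
        Glide.get(this).trimMemory(level);
    }
}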
I think this is a very good question, and let me add some details about the advantages and disadvantages of using this option.
What You Get:
Obviously, you get a larger heap, which means a lower risk of OutOfMemoryError.
What You Lose:
You may lose some frames, which can cause visible hitching. A larger heap makes garbage collections take longer, because the garbage collector basically has to traverse your entire live set of objects. A garbage collection pause is usually about 5 ms, and you may think a few milliseconds are not a big deal, but every millisecond counts. An Android device has to update its screen every 16 ms, and a longer GC pause might push your frame processing time over that 16 ms barrier, which can cause visible hitching.
Switching apps will also become slower. The Android system may kill processes in the LRU cache, beginning with the process least recently used, but it also gives some consideration to which processes are most memory intensive. So if you use a larger heap, your process is more likely to be killed when it is backgrounded, which means it may take longer when users switch from another app back to yours. Other backgrounded processes are also more likely to be kicked out while your process is in the foreground, because your app requires more memory, so switching from your app to other apps takes longer as well.
Conclusion:
Avoid using the largeHeap option as much as possible. It may cost you hard-to-notice performance drops and a bad user experience.
If you must use (and retain) a large amount of memory, then yes, you can and should use android:largeHeap="true". But if you do use it, you should be prepared to have your app flushed from memory whenever other apps are in the foreground.
By "be prepared," I mean that you should design for that likelihood, so that your onStop() and onResume() methods are written as efficiently as possible, while ensuring that all pertinent state is saved and restored in a manner that presents a seamless appearance to the user.
There are three methods that relate to this parameter: maxMemory(), getMemoryClass(), and getLargeMemoryClass().
For most devices, maxMemory() will represent a similar value to getMemoryClass() by default, although the latter is expressed in megabytes, while the former is expressed in bytes.
When you use the largeHeap parameter, maxMemory() will be increased to a device-specific higher level, while getMemoryClass() will remain the same.
getMemoryClass() does not constrain your heap size, but it tells you the amount of heap you should use if you want your app to function comfortably and compatibly within the limits of the particular device on which you are running.
maxMemory(), by contrast, does constrain your heap size, and so you do gain access to additional heap through increasing its value, and largeHeap does increase that value. However, the increased amount of heap is still limited, and that limit will be device-specific, which means that the amount of heap available to your app will vary, depending on the resources of the device on which your app is running. So, using largeHeap is not an invitation for your app to abandon all caution and oink its way through the all-you-can-eat buffet.
Your app can discover exactly how much memory would be made available on a particular device through using the largeHeap parameter by invoking the method getLargeMemoryClass(). The value returned is in megabytes.
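As a quick illustration, here is a small helper of my own (not from the answer above) that logs all three values so you can compare them on a given device:

import android.app.ActivityManager;
import android.content.Context;
import android.util.Log;

public final class HeapInfo {
    private HeapInfo() {}

    public static void log(Context context) {
        ActivityManager am =
                (ActivityManager) context.getSystemService(Context.ACTIVITY_SERVICE);
        long maxHeapMb = Runtime.getRuntime().maxMemory() / (1024 * 1024); // bytes -> MB
        int memoryClassMb = am.getMemoryClass();           // "comfortable" budget, in MB
        int largeMemoryClassMb = am.getLargeMemoryClass(); // ceiling with largeHeap, in MB
        Log.d("HeapInfo", "maxMemory=" + maxHeapMb + " MB"
                + ", memoryClass=" + memoryClassMb + " MB"
                + ", largeMemoryClass=" + largeMemoryClassMb + " MB");
    }
}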
This earlier post includes a discussion of the largeHeap parameter, as well as a number of examples of what amounts of heap are made available with and without its usage, on several specific Android devices:
Detect application heap size in Android
I have not deployed any of my own apps with this parameter set to true. However, I have some memory-intensive code in one of my apps for compiling a set of optimization-related parameters, that runs only during development. I add the largeHeap parameter only during development, in order to avoid out of memory errors while running this code. But I remove the parameter (and the code) prior to deploying the app.
Actually, android:largeHeap is the instrument for increasing the memory allocated to your app.
There is no clear-cut rule for when you need this flag. If you need more memory, Android provides a tool to increase it, but whether you actually need it is something you have to decide yourself.
I have an App with almost 50 classes
I don't think this causes much of a problem. The reason you get an OutOfMemoryError is usually loading too many images in your app, or something like that. If you would rather not use a large heap, you must find a way to optimize your memory usage.
You can also use image loading libraries such as Picasso, UIL, or Glide. All of them can cache images in memory and/or on disk.
Whether your application's processes should be created with a large Dalvik heap. This applies to all processes created for the application. It only applies to the first application loaded into a process; if you're using a shared user ID to allow multiple applications to use a process, they all must use this option consistently or they will have unpredictable results.
Most apps should not need this and should instead focus on reducing their overall memory usage for improved performance. Enabling this also does not guarantee a fixed increase in available memory, because some devices are constrained by their total available memory.
I am developing an application in which I have a database with 5000 rows and 4 columns:
problem_id (int)
problem_no (string)
problem_title (string)
dacu (int)
I need to frequently query single items at a large scale, for example 1000 queries to fetch problem_no based on problem_id, and sometimes only one item.
So I decided to query all the database rows and map them into a HashMap at runtime. I know that HashMap insertion/lookup takes only O(1), or sometimes a little more, so I think I only need about 5000 operations. But how much space will the HashMap take in this case? Would Android's Dalvik VM be able to allocate it without any trouble?
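For reference, here is a minimal sketch of what that load step could look like. The table name "problems" and the SQLiteDatabase plumbing are my assumptions; only the column names come from the question above:

import android.database.Cursor;
import android.database.sqlite.SQLiteDatabase;
import java.util.HashMap;
import java.util.Map;

public final class ProblemCache {
    private ProblemCache() {}

    // Load problem_id -> problem_no once, then answer lookups from RAM in O(1).
    public static Map<Integer, String> load(SQLiteDatabase db) {
        Map<Integer, String> cache = new HashMap<>(5000);
        Cursor c = db.rawQuery("SELECT problem_id, problem_no FROM problems", null);
        try {
            while (c.moveToNext()) {
                cache.put(c.getInt(0), c.getString(1));
            }
        } finally {
            c.close();
        }
        return cache;
    }
}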
How much space will hashmap take?
It's an implementation detail that can vary between versions, devices, etc. As long as we understand that and are only looking for an estimate, you can actually measure it very easily. The Android SDK includes a powerful suite of memory analysis tools; check out Eclipse MAT (the best one in my eyes). Take a heap snapshot when your HashMap is fully loaded, then use MAT to see how many bytes it takes. Make sure you sum up the map itself, the keys, and the values (if I remember correctly, MAT can do the math for you too; it handles the core collections very well).
Will Dalvik be able to allocate it?
For the sake of discussion, let's say your HashMap takes 1 MB of memory. To get a feeling for whether that's a lot, we need to understand the constraints of the system we live in. Dalvik limits the maximum size of your heap, and the limit varies per device. The minimum on very old devices is 16 MB; devices like the Samsung Galaxy 2 have about 32-48 MB, and newer devices like the Galaxy 3 and 4 have more than 100 MB.
The biggest memory hog in apps is usually bitmaps. Since every pixel can take as much as 4 bytes, a full screen bitmap can easily eat up a few MB of memory.
With this in mind, a toll of 1 MB doesn't sound bad; it's comparable to using a nice background image :) If your overall memory usage is low, you can distribute it as you see fit. The memory analysis tools (MAT or DDMS) tell you exactly how much memory your app is currently using, so you can easily estimate your total consumption.
Other thoughts:
Caching things in memory to improve performance is usually a good idea. So your approach is a good one in my eyes (as long as you understand the memory implications).
Since your in-memory HashMap is only an optimization, you can be extra careful and only build it when you have memory to spare. You can easily measure the amount of available heap (there is an API for that) and make your decision accordingly; see the sketch after these notes. You can listen for low-memory notification events (google those), and you can even catch OutOfMemoryError exceptions from failed allocations and change your memory strategy at runtime.
You are playing in a field where exact measurements are difficult, so be sure to QA on several devices and several versions of Android. To simulate low-memory conditions, try to use the oldest devices you can find.
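A rough sketch of that "only cache when there is headroom" idea; the helper is mine, and the safety factor is an arbitrary illustration:

public final class CacheGate {
    private CacheGate() {}

    // Returns true when the estimated cache comfortably fits in the remaining heap.
    public static boolean hasHeadroomFor(long estimatedCacheBytes) {
        Runtime rt = Runtime.getRuntime();
        long usedBytes = rt.totalMemory() - rt.freeMemory();
        long headroomBytes = rt.maxMemory() - usedBytes;
        // Keep a generous safety margin: only cache if 4x the estimate still fits.
        return headroomBytes > estimatedCacheBytes * 4;
    }
}

For example, call hasHeadroomFor(1L * 1024 * 1024) for the ~1 MB estimate above, and fall back to querying the database directly if it returns false.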
I personally think that you will have absolutely no issues handling what you want in memory. Especially if those things are just primitives (no bitmaps).
I have used queries for up to 10k rows for caching in memory and had absolutely no issues in terms of memory for them.
The issue might come when you need to process everything: for example, how fast it is to get to a specific item, get all related items, and so on.
One issue I have come across was UI related. I tried to fill an adapter and show all 10k rows in a list, which took about 7 seconds to complete. It was a long time ago and I don't recall exactly why that happened, but my point is that in your case I would pay more attention to keeping processing off the UI thread, and managing that as much as possible, rather than to memory.
On android.com they say that if you're working in Java, the maximum memory you can use is 16 MB; at least, that's what devices are supposed to support. If you have an older phone, you'll notice that you can't get more: you get an OutOfMemoryError instead. That doesn't seem to be the case when doing the same thing using the NDK. In one of my applications I am requesting 50 MB and more, and so far Android has been fine with that.
I haven't found anything related to this on android.com.
Is there any limit like in Java, too?
If yes: what's the limit?
If no: What is a good value for that?
The problem is that I have to design my code around that size.
[Edit:]
I tried what Seva Alekseyev was suggesting:
root@android:/ # ulimit -a
ulimit -a
time(cpu-seconds) unlimited
file(blocks) unlimited
coredump(blocks) 0
data(KiB) unlimited
stack(KiB) 8192
lockedmem(KiB) 64
nofiles(descriptors) 1024
processes 7806
flocks unlimited
sigpending 7806
msgqueue(bytes) 819200
maxnice 40
maxrtprio 0
resident-set(KiB) unlimited
address-space(KiB) unlimited
root@android:/ # ulimit -v
ulimit -v
unlimited
root@android:/ #
The memory I am requesting (by using "alloc" or "new") is virtual memory (ulimit -v). So there's no chance to figure out how much I can gain?!
You're subject to three types of memory limits:
1) Artificial limits put in place to keep the system responsive when multitasking -- the VM heap limitation is the main example of this. ulimit is a potential mechanism for the OS to place further limitations on you, but I have not seen it used restrictively on Android devices.
2) Physical limits based on available real memory. You should have a baseline device you're developing/testing on, and you should be pretty aggressive in assuming that other processes (background services, other apps) need memory too. Also remember that the memory in use by the OS varies with the OS version (and tends to increase over time). Stock Android doesn't swap, so if you go too far you're dead. One potential scenario is a Nexus One (512 MB RAM) with an audio player and the phone app running in the background, and a "balloon" service eating another 100 MB of physical memory to give some leeway; in this configuration you'll still find more than 100 MB available.
3) Virtual memory limits based on address space. Stock Android allows overcommitment of memory, so it won't blink if you ask for a 1 GB virtual allocation (via mmap, etc.) on a device with 512 MB of RAM, and this is often a very useful thing to do. However, when you then touch the memory, it needs to be brought into physical memory. Read-only pages in physical memory can be ejected, but soon enough you're going to run out, and without swap you're dead. (The combination of overcommit and no swap leads directly to process death in out-of-memory situations, rather than recoverable errors like malloc returning null.)
Finally, it's worth noting that whether calloc/malloc/new require physical allocation is allocator-dependent, but it's safer to assume yes, especially for allocations less than a large number of pages. So: If you're dealing with < 100 MB of standard, well behaved allocations, you're probably in the clear -- but test! If you're dealing with large amounts of data that you'd like memory mapped, mmap is your friend, when used carefully, and is your best friend when used with PROT_READ only. And if you're dealing with > 100 MB of physical memory allocations, expect to run quite nicely on modern devices, but you'll have to define a baseline carefully and test, test, test, since detecting out-of-memory situations on the fly is not generally possible.
One more note: APP_CMD_LOW_MEMORY exists, and is a great place to purge caches, but there's no guarantee it's called in time to save your life. It doesn't change the overall picture at all.
I've spent the last few days trying to remove memory leaks in my game, which were resulting in many out-of-memory errors. I'm on the verge of adding a significant amount of graphics that, while not hugely complicated, will add significantly to the processing requirements of my game, and I'm a bit worried about my memory usage. I was hoping someone might have some tips for me. I don't want to go below Android 2.1, so please tailor any answers to that end.
First of all, my game consists of:
2 activities and 13 XML files (some relating to a small part of a layout, some to dialogs, and 2 directly related to activities).
A number of drawables, made in Adobe Illustrator and converted to PNG. These are probably large, but not unusually large, and for the most part, only small amounts of them are in memory at any given time.
Quite a few dialogs.
Targeted towards Android 1.6 and above.
I used the newest Admob, and as a result, I have to build against 3.2.
My default heap size for my emulators is around 24 MB.
A few sample images from my game:
What I have learned:
Despite my total app size being only around 500 KB, I am somehow taking up 24 MB, as calculated by adb shell procrank.
I have done considerable optimization, but am not seeing large improvements in memory.
Tools for inspecting the heap typically show only around 7 MB available, with around 3 MB being used. Sometimes, when opening new dialogs and the like, I see an increase, but I can't say it's all that large...
MAT shows that none of my classes are using an unusually large amount of memory.
So, given all of this, my questions.
Is 24 MB an actual requirement to develop to (1.6+ Android)?
Assuming it is, how can I both allow for nicer graphics for systems which can handle it, but not crash and burn for older systems?
What are some common gotchas I should watch out for to improve my memory usage?
Without seeing your actual code, I can't say if the following will be relevant to you or not. However, it is worth a shot.
If you are not already doing so, you can consider using something called an LruCache. http://developer.android.com/reference/android/util/LruCache.html
Using this tool, you can determine at what point your cached objects (such as Bitmaps) become eligible for garbage collection. So, if you size it at 4 MB (for example), older entries will be evicted should the cache try to grow beyond that. (See the docs for implementation details and a good example.)
The only downside is that that little gem only came along with 3.2, so you would have to make that your min SDK in the AndroidManifest, or check the API level programmatically at run time to determine whether you can use it. Pre-3.2 I would say you need to call recycle() on any Bitmaps you are no longer using, but if you have already optimized, chances are good you are doing this already.
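Here is a minimal sketch of that idea, sized to the 4 MB example above (the class name and String keys are illustrative):

import android.graphics.Bitmap;
import android.util.LruCache;

public final class BitmapCache {
    private static final int CACHE_SIZE_BYTES = 4 * 1024 * 1024; // the ~4 MB example above

    private final LruCache<String, Bitmap> cache =
            new LruCache<String, Bitmap>(CACHE_SIZE_BYTES) {
                @Override
                protected int sizeOf(String key, Bitmap value) {
                    // Account for entries by byte size so the 4 MB budget is honoured.
                    return value.getByteCount();
                }
            };

    public Bitmap get(String key) {
        return cache.get(key);
    }

    public void put(String key, Bitmap bitmap) {
        if (get(key) == null) {
            cache.put(key, bitmap);
        }
    }
}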
This is a nice little snippet about the difference between the heap and native memory. http://code-gotcha.blogspot.com/2011/09/android-bitmap-heap.html It may (depending on what you are doing) help you to understand why you are not seeing the drop in memory you are expecting.
And finally this post from SO should help when dealing with heap size as well:
Detect application heap size in Android
Hope that helps.
Will it affect performance if I store about 1.5 MB worth of 100+ string_array data inside strings.xml?
Is there any other, better method to store them?
I don't think there would be any noticeable performance problems with a 1.5 MB strings.xml, except that your app would take up a little more RAM for the time it runs. It's fine as long as it doesn't grow beyond, say, 2-3 MB, which might leave less RAM for other parts of your app.
Android will throttle the RAM anyway, so if you don't see any OutOfMemory errors during your testing, you are good to go.
But then, there might be other approaches, depending on what exactly your requirement is.