I am new to Android development and I am trying to understand how garbage collection works, but I need a clear explanation from someone firsthand.
My app is doing some large transactions back and forth with a server. When I am switching from one activity to another, I constantly get the following message in my console:
GC_MINOR: (Nursery full) pause 2.77ms, total 2.95ms, bridge 11.82ms promoted 128K major 2640K los 4441K
and of course, the ms timing is different every time but it happens A LOT!
I read up on it here and created an environment.txt file in my project with the following lines:
MONO_GC_PARAMS=nursery-size=1024m
MONO_GC_PARAMS=soft-heap-limit=64m
I was just testing different values for nursery-size and soft-heap-limit, but it didn't help at all.
Right now, the app runs REALLY slowly when I go from one activity to another.
Can someone please explain in detail and present me with some options?
Thank you.
Garbage collection works on different portions of the heap:
Nursery
Tenured
The exact layout differs between JVMs (HotSpot, IBM, etc.).
Generally the nursery is much smaller than the tenured space (Nursery << Tenured).
For example, in 2 GB of heap space the nursery can range from 128-512 MB and the remainder will be tenured.
The nursery is always well managed by the JVM. It is used most of the time for new object creation and allocation; because it is smaller, GC operations on it (compaction, collection) are fast and well tuned.
The tenured space is used when objects in the nursery grow in size or stay alive for longer than a specific time limit (long-living objects); such objects are maintained in tenured. This is a bigger chunk of memory, so GC operations on it are slower.
Nursery pauses are generally small and shouldn't have much impact; when you are facing them continuously, that's a sign of a problem. When resizing the nursery, keep in mind that it should not be larger than the tenured space, and that its size is proportional to the time GC operations take.
In your case you should look at:
The existing nursery size and the object allocation pattern. If larger objects are being created, try increasing the nursery in multiples of 2.
Parallel threads for GC operations, which can improve performance drastically.
The JVM's GC policies, i.e. throughput policy, CMS policy (dependent on the JVM).
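As a side note on the environment.txt from the question: as far as I know, Mono/Xamarin.Android expects all GC options combined into a single MONO_GC_PARAMS line, separated by commas; with two separate MONO_GC_PARAMS lines only one assignment typically takes effect. Something like the following (the sizes are placeholders to experiment with, not recommendations):

    MONO_GC_PARAMS=nursery-size=16m,soft-heap-limit=128m

Also note that a 1024m nursery is far larger than the 64m soft heap limit in the question, which defeats the purpose of a small, fast nursery.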
One of the touted features of the ART runtime in Android 5.0+ is heap compaction, to reduce heap fragmentation. A fragmented heap can get OutOfMemoryErrors a lot easier, as there may not be a single contiguous free block of memory big enough for your needs, even if the heap overall has enough free space.
I understand that this occurs when the app moves to the background, based on Google conference presentations and the like. However, the only statement that I can find on it in the documentation says:
Homogeneous space compaction is free-list space to free-list space compaction which usually occurs when an app is moved to a pause imperceptible process state. The main reasons for doing this are reducing RAM usage and defragmenting the heap.
It's unclear exactly what a "pause imperceptible process state" means, technically.
Suppose an app does not have any foreground activities at the moment. Is there anything that the developer might have done that might prevent heap compaction for that app's process? For example, does having a foreground service block heap compaction?
Putting the pieces of the puzzle together.
From what I can determine, ART will compact anything that has been paused for 2-3 seconds, and by "paused" it means not currently running in the background (so activities, but not running services). It will also compact on the fly, i.e. concurrently, while the app is in the foreground.
Currently, the event that triggers heap compaction is ActivityManager process-state changes. When an app goes to background, it notifies ART that the process state is no longer jank "perceptible." This enables ART to do things that cause long application thread pauses, such as compaction and monitor deflation.
Chet Haase states:
Garbage Collection
ART brought improved garbage collection dynamics. For one thing, ART is a moving collector; it is able to compact the heap when a long pause in the application won’t impact user experience (for example, when the app is in the background and is not playing audio). Also, there is a separate heap for large objects like bitmaps, making it faster to find memory for these large objects without wading through the potentially fragmented regular heap. Pauses in ART are regularly in the realm of 2–3ms.
From what I can see any pause in the app is fair game for the ART GC.
I suspect the app needs to be completely paused, with no services running, etc., for the compaction to occur, since it is reallocating the memory addresses of the heap and for that the heap cannot be changing. The larger compaction taken during the app pause, rather than on the fly, is a dynamic rearrangement of the heap; the only changes that can be made in the smaller pauses are to re-route some addresses for processes no longer being used.
Though this is an educated guess, not definitive, and I will endeavour to get more info.
The source code here should have the answer. It uses naming like InJankPerceptibleProcessState(), and I'm trying to wade through it, as you probably already have yourself.
I'm reading it and will update this answer when/if I find the definitive answer.
Homogeneous space compaction is free-list space to free-list space compaction which usually occurs when an app is moved to a pause imperceptible process state. The main reasons for doing this are reducing RAM usage and defragmenting the heap.
Source: https://developer.android.com/studio/profile/investigate-ram.html#LogMessages
Actually, you can measure the idle time of an app yourself: start an idle timer and reset it whenever an event is captured in a TextWatcher/OnKeyListener. If your app is in the background and none of these events are firing, it is a good candidate to be collected by the GC.
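A rough sketch of such an idle timer, assuming you forward your TextWatcher/OnKeyListener callbacks to it; the timeout value and the onAppIdle() hook are hypothetical:

    import android.os.Handler;

    public class IdleTimer {
        private static final long IDLE_TIMEOUT_MS = 5000; // arbitrary timeout

        // Create this on the main thread so the Handler uses the main Looper.
        private final Handler handler = new Handler();
        private final Runnable onIdle = new Runnable() {
            @Override
            public void run() {
                onAppIdle(); // no text/key events for IDLE_TIMEOUT_MS
            }
        };

        // Call from TextWatcher.afterTextChanged() / View.OnKeyListener.onKey().
        public void onUserEvent() {
            handler.removeCallbacks(onIdle);
            handler.postDelayed(onIdle, IDLE_TIMEOUT_MS);
        }

        // Hypothetical hook: the app has been idle long enough.
        protected void onAppIdle() {
        }
    }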
Also, this heap compaction is event based and priority based. E.g. if there is never a scenario where more memory is needed, the OS will not even do it.
As far as priority is concerned, for garbage collection it looks first at background apps with no background service, then at apps with a background service, and last at the foreground apps.
I'm working on an app and I have memory issues.
I started to study this and came across Eclipse's debugging tools.
I use DDMS's Heap tester to see how much memory my app has allocated.
I saw it's about 90%.
Now I made a simple new project, a blank empty activity without any functions or variables. Just a splendid new project.
I ran this heap tester and I saw the results:
Heap size: 10.629 MB
Allocated: 9.189 MB
Free: 1.440 MB
Used: 86.45 %
Objects: 44,565
Well, is it normal?
I have a very simple blank activity, and nothing else, and this app already uses 86% of its heap?
Allocated 9 MB of 10? Really? Is that normal? How does this work?
Please instruct me about this, because I would like to know how these memory allocations work.
Dalvik will initially allocate a certain heap size to your app. In your case, this is around 10 MB. As your app needs more memory, Dalvik will increase the heap size up to the maximum configured size (which is different for different devices). If your app still needs more memory after the maximum is reached, it will cause an OutOfMemoryError.
To learn more about analyzing memory allocations in Android, check out this excellent article from the Android developers blog:
http://android-developers.blogspot.in/2011/03/memory-analysis-for-android.html
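A minimal sketch of watching this happen at runtime, using the standard java.lang.Runtime calls (the allocation loop is only there to force the heap to grow):

    import android.util.Log;

    import java.util.ArrayList;
    import java.util.List;

    public class HeapDemo {
        // Logs the maximum heap size, then watches totalMemory() grow
        // as allocations are made.
        public static void logHeapGrowth() {
            Runtime rt = Runtime.getRuntime();
            Log.d("HeapDemo", "max heap: " + (rt.maxMemory() / 1024) + " KB");

            List<byte[]> hold = new ArrayList<byte[]>();
            for (int i = 0; i < 5; i++) {
                hold.add(new byte[1024 * 1024]); // allocate 1 MB per pass
                Log.d("HeapDemo", "total: " + (rt.totalMemory() / 1024) + " KB"
                        + ", free: " + (rt.freeMemory() / 1024) + " KB");
            }
        }
    }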
Examining heap usage is somewhat tricky, but quite manageable once you know the tools. Let's find out how.
So consider a small application. You have Android's debugging tools to determine the heap usage and to examine it.
You can check memory-analysis-for-android, which has more details on how to analyze an application effectively in Android.
Let's have a short description here too:
There are two ways to start DDMS-
1) Using Eclipse: click Window > Open Perspective > Other... > DDMS
2) or from the command line: run ddms (or ./ddms on Mac/Linux) in the tools/ directory
Then select your application process from Devices and click "Update Heap".
Now switch to the Heap tab in DDMS.
To see the first update, click the Cause GC button.
You will see something like this:
We can see that our set (the Allocated column) is a little over 20MB. If you flip around in the app a bit, that number can go up. In small applications, the amount of memory we leak is bounded. In some ways, this can be the worst kind of leak to have, because we never get an OutOfMemoryError indicating that we are leaking.
You can use Heap Dump to identify the problem. Click the Dump HPROF file button in the DDMS toolbar and save the file wherever you want. Then run hprof-conv on it.
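For reference, the conversion is a single command; the file names here are just placeholders:

    hprof-conv dump.hprof converted.hprof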
Using MAT, which is a powerful Memory Analyzer tool:
You can install MAT, a stand-alone Memory Analyzer tool, and analyze the heap dumps using it.
NOTE:
If you're running ADT (which includes a plug-in version of DDMS) and have MAT installed in Eclipse as well, clicking the "dump HPROF" button will automatically do the conversion (using hprof-conv) and open the converted hprof file into Eclipse (which will be opened by MAT).
Start the MAT and load the converted HPROF file. Navigate to the Histogram view which shows a list of classes sortable by the number of instances, the shallow heap (total amount of memory used by all instances), or the retained heap (total amount of memory kept alive by all instances, including other objects that they have references to).
If we sort by shallow heap, we can see that instances of byte[] are at the top.
Next, Right-click on the byte[] class and select List Objects > with incoming references. This produces a list of all byte arrays in the heap, which we can sort based on Shallow Heap usage.
Pick one of the big objects, and drill down on it. This will show you the path from the root set to the object - the chain of references that keeps this object alive. Lo and behold, there's our bitmap cache!
MAT can't tell us for sure that this is a leak, because it doesn't know whether these objects are needed or not -- only the programmer can do that. However, looking at the stats, it is reasonable to conclude that the cache is using a large amount of memory relative to the rest of the application, so we might consider limiting the size of the cache.
Work through all the big objects this way and you will find a tremendous amount of room for performance optimization.
What you see here is the currently allocated memory, not the maximum memory that can be allocated; the maximum depends on the Android version and varies from device to device.
In this case, your app does not have any high memory requirement; all the files, system pieces and objects being used to run the app are very small, hence Android initially allocated your app a common initial heap. This space keeps growing as the demand from the app increases, until the demand is met or it exceeds the maximum heap size Android defines per app, in which scenario your app will crash, stating running out of memory as the reason.
To read more about memory allocation in Android, go through the developer link below:
http://developer.android.com/training/articles/memory.html
I'm working on an app to stream music from the internet... My app does many things and it's structured in this way: I have a tab view, and every view is kept allocated in memory, so every time I navigate through tabs I find the previous state again (every tab can also open a WebView to find information about songs, news, etc. on the internet). All that grows memory usage but makes the app very user friendly... After having paid attention to avoid memory leaks following the Android guide, I looked at the heap occupation and found that my app allocates at most 3.5 MB of memory and the allocated heap size is 4.5 - 4.6 MB... I'm working on the emulator. That is not much, I think, but sometimes my app is restarted and I find in LogCat a strange message like
Grow heap ( frag case ) to 3.373 for 19764-byte allocation
What is it? An emulator issue, or something else? Am I using too much memory?
Thank you in advance for any help :)
The maximum heap size depends on the device (you can get that value by calling Runtime.getRuntime().maxMemory()), but it's probably around 32MB. In order to save memory, Android doesn't allocate the maximum memory to every app automatically. Instead it waits until the app needs more memory and then gives it more heap space as needed, until it has reached the max. I believe that's the Grow heap message you see.
If you do a lot of memory allocation and freeing, you may run into fragmentation problems. Wikipedia has a decent description here, but basically it means that you might have the required memory available, just not all in one chunk. Hence the need to grow the heap.
So to answer your questions, it's probably not an emulator issue, it's just the nature of your program, which sounds a little memory heavy. However this isn't a bad thing. I don't think using 3-5MB for multiple tabs with webviews is too much.
Please note I do NOT have a memory leak. My question is about a subtler issue.
I recently wrote an android app which does image processing. The image is loaded as a Bitmap, then copied out in pixels, processed in a way that uses lots of memory (think Fourier transforms in floating point representations and stuff), then converted back into a bitmap and saved.
The problem is, through at least android OS 2.3, the total memory limitation (typically 16MB) is combined java and (externally stored) Bitmaps, and the java high water mark doesn't go down (that I can discern) even when the memory is free (successfully GC'd), which means when I go to allocate the final Bitmap, I am often "out of memory" even though by that point I have freed (and GC'd) most of the space. I.e., I never need the full 16MB at once, but the space left for Bitmaps appears to be 16MB minus the MAX historical java heap usage (as opposed to current usage).
I watched a tech talk by one of the android developers about memory issues and he implied this problem has been fixed in subsequent versions of the OS (they moved Bitmap memory into the java heap space), but in the meantime most of the people wanting to use my app are running 2.2 or 2.3.
Long story short, I am wondering if the java heap is ever compacted (de-fragmented, in effect) so that the high-water mark shrinks (and if so, how to make it happen)?
If not, then does anybody have another suggestion how to deal with this problem?
Long story short, I am wondering if the java heap is ever compacted (de-fragmented, in effect) so that the high-water mark shrinks (and if so, how to make it happen)?
Whatever its behavior is, it most certainly is not under your control.
If not, then does anybody have another suggestion how to deal with this problem?
Ideally, reuse your own Bitmaps. You don't indicate what "processed in a way that uses lots of memory" really is. However, if it does not change the dimensions or bit depth of the image, copy the data back out to the original Bitmap rather than allocating a fresh one, if you can.
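A minimal sketch of that idea, assuming the processing keeps the same dimensions and works on an int[] copy of the pixels; everything except the android.graphics.Bitmap calls is hypothetical:

    import android.graphics.Bitmap;

    public class InPlaceFilter {
        // Reuses the source Bitmap instead of allocating a second one of the
        // same size. The Bitmap must be mutable and keep its dimensions.
        public static void applyInPlace(Bitmap bitmap) {
            int w = bitmap.getWidth();
            int h = bitmap.getHeight();
            int[] pixels = new int[w * h];

            bitmap.getPixels(pixels, 0, w, 0, 0, w, h); // copy pixel data out
            process(pixels, w, h);                      // memory-hungry processing
            bitmap.setPixels(pixels, 0, w, 0, 0, w, h); // write the result back
        }

        // Placeholder for the actual processing step (FFTs, filters, etc.).
        private static void process(int[] pixels, int w, int h) {
        }
    }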
Image processing on Android 2.x is one of the few places where I can see justifying using multiple processes. You will add overhead for schlepping the image data between processes, but the other process has its own heap (Java and native), so this may give you more "elbow room".
So far, no indication that there is any way to compact the memory.
Here is my workaround, which is suboptimal but much better than the behavior before:
I now intentionally hold on to the original Bitmap while I am doing my processing, and then recycle() it, null it, and request a GC, but not until just before allocating my output Bitmap.
What this does is reserve the external (Bitmap) space and cause my application to run out of java heap (during processing, before calling recycle()), which I can at least catch and handle by retrying on a smaller image. (Before, everything seemed to be fine until I tried to save, but by then it was too late and there was no way to recover.)
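A rough sketch of that ordering, under the assumption that the processing works on an int[] copy of the pixels; the heavyProcessing() helper is hypothetical:

    import android.graphics.Bitmap;

    public class LateRecycleWorkaround {
        // Holds the source Bitmap (and its external allocation) through the
        // processing step, and only releases it right before the output
        // Bitmap is created.
        public static Bitmap run(Bitmap source) {
            int w = source.getWidth();
            int h = source.getHeight();

            int[] pixels = new int[w * h];
            source.getPixels(pixels, 0, w, 0, 0, w, h);

            heavyProcessing(pixels, w, h); // a java-heap OOM here can be caught and retried

            // Release the source only now, just before allocating the output.
            source.recycle();
            source = null;
            System.gc();

            Bitmap out = Bitmap.createBitmap(w, h, Bitmap.Config.ARGB_8888);
            out.setPixels(pixels, 0, w, 0, 0, w, h);
            return out;
        }

        // Placeholder for the memory-hungry transform.
        private static void heavyProcessing(int[] pixels, int w, int h) {
        }
    }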
Technically this limits my max image size to less than I should be able to do with the allotted memory (because I need to reserve space in the heap and external at the same time when in truth I never need both together), but at least I can still handle a reasonable image size.
What was happening before is I would free and recycle the Bitmap early which allowed the high water mark on the java heap to use up essentially all of my memory allotment, meaning from that point forward I couldn't open or create any more Bitmaps at all (other than tiny thumbnail sizes sometimes).
Imo, this is a major bug in the way android handles Bitmap memory, but I believe it is fixed in newer versions of the OS so hopefully I can disable this workaround conditional on the OS release.
I'm assuming that you already call Bitmap.recycle(), but it's the only thing I could think of that you didn't talk about.
I see that the Heap Size is automatically increased as the app needs it, up to whatever the phone's Max Heap Size is. I also see that the Max Heap Size is different depending on the device.
So my first question is, what are the typical max heap sizes on Android devices? I have tested memory allocation on one phone that was able to use a heap over 40MB, while another gave OutOfMemory errors in the 20s of MB. What are the lowest that are in common use and what are the highest that are on common devices? Is there a standard or average?
The second question, and more important one, is how to ensure you are able to use the resources available per device but avoid using too much? I know there are methods such as onLowMemory() but those seem to be only for the entire system memory, not just the heap for your specific application.
Is there a way to detect the max heap size for the device and also detect when the available heap memory is reaching a low point for your application?
For example, if the device only allowed a max heap of 24mb and the app was nearing that limit in allocation, then it could detect and scale back. However, if the device could comfortably handle more, it would be able to take advantage of what is available.
Thanks
Early devices had a per-app cap of 16MB. Later devices increased that to 24MB. Future devices will likely have even more available.
The value is a reflection of the physical memory available on the device and the properties of the display device (because a larger screen capable of displaying more colors will usually require larger bitmaps).
Edit: Additional musings...
I read an article not too long ago that pointed out that garbage-collecting allocators are essentially modeling a machine with infinite memory. You can allocate as much as you want and it'll take care of the details. Android mostly works this way; you keep hard references to the stuff you need, soft/weak references to stuff you might not, and discard references to the stuff you'll never need again. The GC sorts it all out.
In your particular case, you'd use soft references to keep around the things that you don't need to have in memory, but would like to keep if there's enough room.
This starts to fall apart with bitmaps, largely because of some early design decisions that resulted in the "external allocation" mechanism. Further, the soft reference mechanism needs some tuning -- the initial version tended to either keep everything or discard everything.
The Dalvik heap is under active development (see e.g. the notes on Android 2.3 "Gingerbread", which introduces a concurrent GC), so hopefully these issues will be addressed in a future release.
Edit: Update...
The "external allocation" mechanism went away in 4.0 (Ice Cream Sandwich). The pixel data for Bitmaps is now stored on the Dalvik heap, avoiding the earlier annoyances.
Recent devices (e.g. the Nexus 4) cap the heap size at 96MB or more.
A general sense of the app's memory limits can be obtained as the "memory class", from ActivityManager.getMemoryClass(). A more specific value can be had from the java.lang.Runtime function maxMemory().
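For example (standard APIs, the log tag is arbitrary):

    import android.app.ActivityManager;
    import android.content.Context;
    import android.util.Log;

    public class HeapLimits {
        // Logs the "memory class" (the normal per-app limit, in MB) and the
        // hard per-process limit reported by the runtime.
        public static void log(Context context) {
            ActivityManager am =
                    (ActivityManager) context.getSystemService(Context.ACTIVITY_SERVICE);
            int memoryClassMb = am.getMemoryClass();              // e.g. 16, 24, 32...
            long maxHeapBytes = Runtime.getRuntime().maxMemory(); // absolute limit, in bytes

            Log.d("HeapLimits", "memory class: " + memoryClassMb + " MB"
                    + ", maxMemory: " + (maxHeapBytes / (1024 * 1024)) + " MB");
        }
    }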
Here are the "normal" (see below) heap sizes for some specific devices:
G1: 16MB
Moto Droid: 24MB
Nexus One: 32MB
Viewsonic GTab: 32MB
Novo7 Paladin: 60MB
I say "normal" because some versions of Android (e.g., CyanogenMod) will allow a user to manually adjust the heap limit. The result can be larger or smaller than the "normal" values.
See this answer for additional information, including how to find out what the heap size actually is programmatically, and also how to distinguish between the absolute heap size limit on the one hand and the heap limit that you should ideally respect, on the other:
Detect application heap size in Android
To detect what your present heap utilization is, you could try using the Runtime class' totalMemory() method. However, I've read reports that different versions/implementations of the Android OS may have different policies regarding whether native memory (from which the backing memory for bitmaps is allocated) is counted against the heap's maximum or not. And, since version 3.0, the native memory is directly taken from the application's own heap.
The iffiness of this calculation makes me think that it is a mistake to monitor your app's usage of memory at runtime, constantly comparing it to the amount available. Also, if you are in the middle of an involved computation, and find that you're running out of memory, it is not always convenient or reasonable to cancel that computation, and it may create a bad experience for your users if you do.
Instead, you might try preemptively defining certain modes, or constraints, upon your app's functional behavior that will ensure that it comes in under whatever your current device's relevant heap limits are (as detected during your app's initialization).
For example, if you have an app that uses a large list of words that must be loaded into memory all at once, then you might constrain your app so that for smaller heap limits only a smaller list of the more common words can be loaded, while for larger heap limits a full list containing many more words can be loaded.
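A sketch of that kind of gating, where the 32 MB threshold and the loader methods are hypothetical placeholders:

    import android.app.ActivityManager;
    import android.content.Context;

    public class WordListLoader {
        // Chooses how much data to load based on the device's memory class.
        public static String[] loadWordList(Context context) {
            ActivityManager am =
                    (ActivityManager) context.getSystemService(Context.ACTIVITY_SERVICE);
            if (am.getMemoryClass() >= 32) {
                return loadFullWordList();     // larger heaps: full dictionary
            } else {
                return loadCommonWordsOnly();  // smaller heaps: common words only
            }
        }

        // Hypothetical loaders; in a real app these would read from assets/storage.
        private static String[] loadFullWordList() { return new String[0]; }
        private static String[] loadCommonWordsOnly() { return new String[0]; }
    }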
There are also Java programming techniques that allow you to declare certain memory to be reclaimable by the garbage collector on demand, even if it has existing "soft" (rather than hard) references. If you have data that you would like to keep in memory, but which can be re-loaded from non-volatile storage if required (i.e., a cache), then you might consider using soft references to have such memory automatically freed when your app starts bumping against the upper limits of your heap. See this page for info on soft references in Android:
http://developer.android.com/reference/java/lang/ref/SoftReference.html
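A minimal sketch of such a soft-reference cache, assuming the values can always be re-loaded from non-volatile storage when the GC has cleared them; loadFromStorage() is a hypothetical hook:

    import java.lang.ref.SoftReference;
    import java.util.HashMap;
    import java.util.Map;

    public abstract class SoftCache<K, V> {
        private final Map<K, SoftReference<V>> cache = new HashMap<K, SoftReference<V>>();

        // Returns the cached value, re-loading it if the GC has reclaimed it.
        public V get(K key) {
            SoftReference<V> ref = cache.get(key);
            V value = (ref != null) ? ref.get() : null;
            if (value == null) {
                value = loadFromStorage(key);
                cache.put(key, new SoftReference<V>(value));
            }
            return value;
        }

        // Hypothetical hook: re-load the value from non-volatile storage.
        protected abstract V loadFromStorage(K key);
    }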