RAM increase - Android service

I made some changes to my Android service: I declared a HashMap of Bitmaps as a member variable and populated it dynamically.
I am concerned that the bitmaps might take up considerable RAM.
How can I confirm how much the RAM usage has increased after my changes? Is there any tool to check this?
If anybody has worked in this area before, please help!

There are Eclipse Android SDK tools for checking the current heap allocation size, but there is a much simpler way to see the current heap size: just search logcat for the words "Grow heap". Every time your application's heap grows, this line is printed to logcat, and it also shows the new heap size.
It looks something like this:
02-22 11:20:57.040: I/dalvikvm-heap(5313): Grow heap (frag case) to 213.551MB for 3850256-byte allocation
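If you want to confirm the increase from inside the app instead of watching logcat, here is a minimal sketch (class and tag names are just examples) that logs the current heap figures; call it before and after the HashMap of bitmaps is populated to get a rough number for the difference:
import android.util.Log;

public class HeapLogger {
    private static final String TAG = "HeapCheck"; // example tag

    // Logs how much of the Dalvik heap is currently used, reserved, and allowed at most.
    public static void logHeap(String label) {
        long maxHeap   = Runtime.getRuntime().maxMemory();    // hard limit for this app's heap
        long totalHeap = Runtime.getRuntime().totalMemory();  // heap currently reserved by the VM
        long freeHeap  = Runtime.getRuntime().freeMemory();   // unused part of the reserved heap
        Log.d(TAG, label + ": used=" + (totalHeap - freeHeap) / 1024 + " kB"
                + ", reserved=" + totalHeap / 1024 + " kB"
                + ", max=" + maxHeap / 1024 + " kB");
    }
}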
In general, to avoid allocating too many bitmaps, and to have bitmaps decoded at the sample size actually required for display, you should use a library such as Volley to handle the caching and remote-server fetching for you.
Volley was made by Google and is used by the Google Play app to handle all of its requests to the server. It also provides a smart ImageLoader that automatically takes care of downloading, decoding at the correct sample size, displaying, memory caching, and disk caching (if specified).
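As a rough sketch of how that wiring usually looks (not taken from the question's code; the class name, the cache sizing rule, and the use of android.util.LruCache, available on API 12+ or via the support library, are assumptions), an ImageLoader backed by an in-memory cache could be set up like this:
import android.content.Context;
import android.graphics.Bitmap;
import android.util.LruCache;
import com.android.volley.RequestQueue;
import com.android.volley.toolbox.ImageLoader;
import com.android.volley.toolbox.Volley;

public class VolleyImageLoaderFactory {

    // Builds an ImageLoader backed by an in-memory LruCache sized to a fraction of the heap.
    public static ImageLoader create(Context context) {
        RequestQueue queue = Volley.newRequestQueue(context);

        // Use about 1/8 of the available heap for the bitmap cache (a common rule of thumb).
        final int cacheKb = (int) (Runtime.getRuntime().maxMemory() / 1024 / 8);
        final LruCache<String, Bitmap> cache = new LruCache<String, Bitmap>(cacheKb) {
            @Override
            protected int sizeOf(String key, Bitmap bitmap) {
                // Measure entries in kilobytes so the cache limit above is honoured.
                return bitmap.getByteCount() / 1024;
            }
        };

        return new ImageLoader(queue, new ImageLoader.ImageCache() {
            @Override
            public Bitmap getBitmap(String url) {
                return cache.get(url);
            }

            @Override
            public void putBitmap(String url, Bitmap bitmap) {
                cache.put(url, bitmap);
            }
        });
    }
}
A com.android.volley.toolbox.NetworkImageView in a layout can then display a remote image with setImageUrl(url, imageLoader), and the loader takes care of the downloading, decoding and caching.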

Related

What is a safe heap size?

I have created a simple game with libgdx. It currently shows a heap size of about 50 MB. I believe everything is being disposed, and I try not to create objects in the render loop. What is a safe heap size?
From developer.android.com
...each Android-powered device has a different amount of RAM available to the system and thus provides a different heap limit for each app. You can call getMemoryClass() to get an estimate of your app's available heap in megabytes. If your app tries to allocate more memory than is available here, it will receive an OutOfMemoryError.
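As a concrete illustration of the getMemoryClass() call mentioned in the quote (the class name here is just an example), you can ask the device for its per-app heap limit and compare the 50 MB against it:
import android.app.ActivityManager;
import android.content.Context;

public class HeapLimit {
    // Returns the per-app heap limit, in megabytes, that this device grants by default.
    public static int getHeapLimitMb(Context context) {
        ActivityManager am =
                (ActivityManager) context.getSystemService(Context.ACTIVITY_SERVICE);
        return am.getMemoryClass();
    }
}
If this returns 64 or more on your target devices, a 50 MB heap leaves some headroom; on 24 or 32 MB devices it will not fit at all.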

How heap memory allocation works

I'm working on an app and I have memory issues.
I started to look into this and came across Eclipse's debugging tools.
I use DDMS's Heap tool to see how much memory my app has allocated.
I saw it's about 90%.
Then I made a simple new project: a blank, empty activity without any functions or variables. Just a brand-new project.
I ran the heap tool and saw these results:
Heap size: 10.629 MB
Allocated: 9.189 MB
Free: 1.440 MB
Used: 86.45 %
Objects: 44,565
Well, is this normal?
I have a very simple blank activity, and nothing else, and this app uses 86% of its heap?
9 MB of 10 allocated? Really? Is that normal? How does this work?
Please explain this to me, because I would like to know how these memory allocations work.
Dalvik will initially allocate a certain heap size to your app; in your case, this is around 10 MB. As your app needs more memory, Dalvik will grow the heap up to the maximum configured size (which is different for different devices). If your app still needs more memory after the maximum is reached, it will get an OutOfMemoryError.
To learn more about analyzing memory allocations in Android, check out this excellent article from the Android developers blog:
http://android-developers.blogspot.in/2011/03/memory-analysis-for-android.html
Examining heap usage looks tricky but is actually quite easy. Let's find out how.
Consider a small application. Android's debugging tools let you determine and examine its heap usage.
You can check memory-analysis-for-android, which has more details on how to analyze an application effectively on Android.
Here is a short walkthrough:
There are two ways to start DDMS:
1) Using Eclipse: click Window > Open Perspective > Other... > DDMS
2) or from the command line: run ddms (or ./ddms on Mac/Linux) in the tools/ directory
Then select your application process from Devices and click "Update Heap".
Now switch to the Heap tab in DDMS.
To see the first update, click the Cause GC button.
The Heap tab shows basic statistics about the heap, updated after every GC.
We can see that our allocated set (the Allocated column) is a little over 20 MB. If you do a little flip-flop (rotating the device back and forth, for example), that number can keep going up. In a small application the amount of memory we leak is bounded. In some ways this can be the worst kind of leak to have, because we never get an OutOfMemoryError indicating that we are leaking.
You can use a heap dump to identify the problem. Click the Dump HPROF file button in the DDMS toolbar, save the file wherever you want, and then run hprof-conv on it.
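If you would rather capture the dump at an exact point in your code instead of clicking the button, here is a minimal sketch (path handling and class name are up to you) using android.os.Debug:
import android.os.Debug;
import java.io.IOException;

public class HeapDumper {
    // Writes an HPROF snapshot to the given path; convert it with hprof-conv and open it in MAT.
    public static void dumpHeap(String absolutePath) {
        try {
            Debug.dumpHprofData(absolutePath);
        } catch (IOException e) {
            // The dump could not be written (bad path, no space, missing permission, ...).
            e.printStackTrace();
        }
    }
}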
Using MAT, a powerful memory analyzer:
You can install MAT (the Eclipse Memory Analyzer), a stand-alone memory analysis tool, and analyze the heap dumps with it.
NOTE:
If you're running ADT (which includes a plug-in version of DDMS) and also have MAT installed in Eclipse, clicking the "dump HPROF" button will automatically do the conversion (using hprof-conv) and open the converted hprof file in Eclipse (where MAT will display it).
Start the MAT and load the converted HPROF file. Navigate to the Histogram view which shows a list of classes sortable by the number of instances, the shallow heap (total amount of memory used by all instances), or the retained heap (total amount of memory kept alive by all instances, including other objects that they have references to).
If we sort by shallow heap, we can see that instances of byte[] are at the top.
Next, right-click on the byte[] class and select List Objects > with incoming references. This produces a list of all byte arrays in the heap, which we can sort by shallow heap usage.
Pick one of the big objects, and drill down on it. This will show you the path from the root set to the object - the chain of references that keeps this object alive. Lo and behold, there's our bitmap cache!
MAT can't tell us for sure that this is a leak, because it doesn't know whether these objects are needed or not; only the programmer can decide that. However, the statistics show that the cache is using a large amount of memory relative to the rest of the application, so we might consider limiting the size of the cache.
Work through your application this way and you will find plenty of opportunities for optimization.
What you see here is the currently allocated memory, not the maximum memory that can be allocated; the maximum depends on the Android version and varies from device to device.
In this case your app does not have any high memory requirement: the files, system components and objects used to run it are very small, so Android initially allocates your app a common starting heap. This heap then grows as the app demands more memory, until the demand is met or the app exceeds the maximum heap size Android defines per app, in which case it crashes with running out of memory as the reason.
To read more about memory allocation in Android, go through the developer guide below:
http://developer.android.com/training/articles/memory.html
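That guide also covers releasing memory when the system asks for it. Here is a minimal sketch of reacting to those callbacks, assuming API 14+ and a hypothetical bitmapCache held by the Application:
import android.app.Application;
import android.content.ComponentCallbacks2;

public class MyApp extends Application {
    // The system calls this with increasingly urgent levels as memory gets tight (API 14+).
    @Override
    public void onTrimMemory(int level) {
        super.onTrimMemory(level);
        if (level >= ComponentCallbacks2.TRIM_MEMORY_UI_HIDDEN) {
            // The UI is no longer visible: a good moment to drop caches such as a bitmap map.
            // bitmapCache.evictAll();  // hypothetical cache field
        }
    }
}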

Maximum Heap size for Nexus?

I am using a Galaxy Nexus (i9250) for development and testing. I noticed a strange fact: sometimes, when the total heap size is 64 MB and the allocated heap is around 56-60 MB, the app crashes. But sometimes, even when the memory shoots up to 80 MB, the app doesn't crash.
Initially I thought the maximum heap size for devices in the Nexus range would be 64 MB (now I realize that is wrong). So my question is: what is the maximum heap size a device can use? If it varies from device to device, what factors does the heap size depend on? I know this is a common question. Could anyone guide me to the right answer? Thanks in advance!
NOTE: I did not enable the large heap (android:largeHeap="true") in my app.
Assuming your app crashes every time with an OutOfMemoryError:
The DVM starts by allocating a small heap. After every garbage collection cycle it checks the free heap, compares it with the total heap, and adds more memory if the difference is too small. So the heap can be grown or shrunk at runtime by the OS, according to the Dalvik VM's requirements.
So, when there is enough memory available in the system, your app does not crash even when its memory shoots up to 80 MB (assuming 64 MB is not the hard limit of the heap size).
Yes, there is a hard limit on the heap size of every device. It can be raised with android:largeHeap="true", but this can be costly in terms of app performance, because garbage-collection time grows with heap size (the GC now has to traverse more objects inside a bigger heap). So it is a big "no" unless you know exactly what you need the larger heap for and why.
On what factors heap size depends:
Heap size mainly depends on the resolution of the device you are using, since higher resolutions need larger images/bitmaps to fill the screen (more pixels means more memory).
Although I haven't found any written proof of this, my understanding is that heap size also depends on the RAM size, since a bigger RAM allows more multitasking flexibility and reduces the chances of getting an OutOfMemoryError.
If you need to know the exact amount of heap memory you can use, call ActivityManager.getMemoryClass(). This method returns an integer indicating the number of megabytes available for your app's heap; ActivityManager.getLargeMemoryClass() returns the corresponding limit for the large heap.
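A minimal sketch (class and tag names are just examples) that logs both values next to what the VM itself reports, so you can see the actual limits on your i9250:
import android.app.ActivityManager;
import android.content.Context;
import android.util.Log;

public class HeapLimits {
    // Logs the normal and large per-app heap limits next to the limit the running VM reports.
    public static void logLimits(Context context) {
        ActivityManager am =
                (ActivityManager) context.getSystemService(Context.ACTIVITY_SERVICE);
        Log.d("HeapLimits", "memoryClass=" + am.getMemoryClass() + " MB"
                + ", largeMemoryClass=" + am.getLargeMemoryClass() + " MB"
                + ", runtime max=" + Runtime.getRuntime().maxMemory() / (1024 * 1024) + " MB");
    }
}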
It is not clear which device you are talking about, or how you measured your heap memory.
I recommend calculating your heap memory and available memory using this link.
Note that if your app uses native memory, those limits do not apply to it.
I would only use the DDMS values as a guide for finding memory leaks and memory allocation problems, rather than as some specific number you can target. Any Android application is expected to run on a variety of devices, so while you may have tuned your app for a Galaxy Nexus, you will also want it to run on older devices, and you should test accordingly. See @dongshengcn's comment.
In addition to the links from @minhaz, I would also read: Understanding Heap Size.
If you are trying to get a better understanding of memory management on Android, you should read Android framework engineer @hackbod's answer to: How to Discover Memory Usage of My Application in Android.

Android grow heap frag case

I'm working on an app that streams music from the internet. My app does many things and is structured this way: I have a tab view, and every view is kept in memory, so every time I navigate through the tabs I find the previous state again (every tab can also open a WebView to look up information about songs, news, etc. on the internet). All of that grows memory usage but makes the app very user friendly. After taking care to avoid memory leaks by following the Android guide, I looked at the heap usage and found that my app allocates at most 3.5 MB and the allocated heap size is 4.5-4.6 MB. I'm working on the emulator. That doesn't seem like much to me, but sometimes my app is restarted and I find a strange message in LogCat like:
Grow heap ( frag case ) to 3.373 for 19764-byte allocation
What is it? An emulator issue? Or something else? Am I using too much memory?
Thank you in advance for any help :)
The maximum heap size depends on the device (you can get that value by calling Runtime.getRuntime().maxMemory()), but it's probably around 32 MB. In order to save memory, Android doesn't give every app its maximum heap automatically. Instead it waits until the app needs more memory and then grows the heap as needed until it reaches the max. I believe that's the Grow heap message you see.
If you do a lot of memory allocation and freeing, you may run into fragmentation problems. Wikipedia has a decent description here, but it basically means that you might have the required memory available, just not all in one contiguous chunk; hence the need to grow the heap.
So, to answer your questions: it's probably not an emulator issue, it's just the nature of your program, which sounds a little memory-heavy. However, this isn't a bad thing; I don't think using 3-5 MB for multiple tabs with WebViews is too much.

BitmapFactory OOM driving me nuts

I've been doing a lot of searching and I know a lot of other people are experiencing the same OOM memory problems with BitmapFactory. My app only shows a total memory available of 4 MB using Runtime.getRuntime().totalMemory(). If the limit is 16 MB, then why doesn't the total memory grow to make room for the bitmap? Instead it throws an error.
I also don't understand why, if I have 1.6 MB of free memory according to Runtime.getRuntime().freeMemory(), I get an error saying "VM won't let us allocate 614400 bytes". Seems to me I have plenty of available memory.
My app is complete except for this problem, which goes away when I reboot the phone so that my app is the only thing running. I'm using an HTC Hero for device testing (Android 1.5).
At this point I'm thinking the only way around this is to somehow avoid using BitmapFactory.
Anyone have any ideas on this, or an explanation as to why the VM won't allocate 614 KB when there's 1.6 MB of free memory?
[Note that (as CommonsWare points out below) the whole approach in this answer only applies up to and including 2.3.x (Gingerbread). As of Honeycomb Bitmap data is allocated in the VM heap.]
Bitmap data is not allocated in the VM heap. There is a reference to it in the VM heap (which is small), but the actual data is allocated in the Native heap by the underlying Skia graphics library.
Unfortunately, while the definition of BitmapFactory.decode...() says that it returns null if the image data could not be decoded, the Skia implementation (or rather the JNI glue between the Java code and Skia) logs the message you’re seeing ("VM won't let us allocate xxxx bytes") and then throws an OutOfMemory exception with the misleading message "bitmap size exceeds VM budget".
The issue is not in the VM heap but rather in the Native heap. The Native heap is shared between running applications, so the amount of free space depends on what other applications are running and their bitmap usage. But, given that BitmapFactory will not simply return null, you need a way to figure out whether the call is going to succeed before you make it.
There are routines to monitor the size of the Native heap (see the Debug class getNative methods). However, I have found that getNativeHeapFreeSize() and getNativeHeapSize() are not reliable. So in one of my applications that dynamically creates a large number of bitmaps I do the following.
The Native heap size varies by platform. So at startup, we check the maximum allowed VM heap size to determine the maximum allowed Native heap size. [The magic numbers were determined by testing on 2.1 and 2.2, and may be different on other API levels.]
// The maximum VM heap size for this app, in kilobytes.
long mMaxVmHeap = Runtime.getRuntime().maxMemory() / 1024;
// Assume the Native heap limit tracks the VM heap limit (magic numbers from testing on 2.1/2.2).
long mMaxNativeHeap = 16 * 1024;
if (mMaxVmHeap == 16 * 1024) {
    mMaxNativeHeap = 16 * 1024;
} else if (mMaxVmHeap == 24 * 1024) {
    mMaxNativeHeap = 24 * 1024;
} else {
    Log.w(TAG, "Unrecognized VM heap size = " + mMaxVmHeap);
}
Then, each time we need to call BitmapFactory, we precede the call with a check of the form:
// Estimated size of the bitmap we are about to decode, in bytes.
long sizeReqd = bitmapWidth * bitmapHeight * targetBpp / 8;
// How much of the Native heap is already allocated.
long allocNativeHeap = Debug.getNativeHeapAllocatedSize();
if ((sizeReqd + allocNativeHeap + heapPad) >= mMaxNativeHeap) {
    // Do not call BitmapFactory…
}
Note that heapPad is a magic number that allows for the fact that (a) the reported Native heap size is "soft" and (b) we want to leave some space in the Native heap for other applications. We are currently running with a 3*1024*1024 (i.e. 3 MB) pad.
1.6 MB of memory seems like a lot, but it could be that the memory is so badly fragmented that a block of that size can't be allocated in one go (though this still sounds very strange).
One common cause of OOM while using image resources is decompressing JPG, PNG, or GIF images with really high resolutions. You need to bear in mind that all these formats are well compressed and take up very little space on disk, but once you load an image into memory, it uses something like width * height * 4 bytes. Also, when decompression kicks in, a few other auxiliary data structures need to be allocated for the decoding step.
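One common way to keep that width * height * 4 cost under control is to decode at roughly the display size instead of the full resolution. A minimal sketch using inJustDecodeBounds and inSampleSize (the class and method names are illustrative):
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;

public class ScaledDecoder {
    // Decodes a file at (roughly) the requested size instead of full resolution,
    // which keeps the width * height * 4 memory cost under control.
    public static Bitmap decodeScaled(String path, int reqWidth, int reqHeight) {
        // First pass: read only the image bounds; no pixel data is allocated.
        BitmapFactory.Options options = new BitmapFactory.Options();
        options.inJustDecodeBounds = true;
        BitmapFactory.decodeFile(path, options);

        // Pick a power-of-two sample size that brings the image down towards the target size.
        int sampleSize = 1;
        while (options.outWidth / (sampleSize * 2) >= reqWidth
                && options.outHeight / (sampleSize * 2) >= reqHeight) {
            sampleSize *= 2;
        }

        // Second pass: decode the subsampled bitmap.
        options.inJustDecodeBounds = false;
        options.inSampleSize = sampleSize;
        return BitmapFactory.decodeFile(path, options);
    }
}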
It seems like the issues described in Torid's answer have been resolved in more recent versions of Android.
However, if you are using an image cache (a specialized one or even just a regular HashMap), it is pretty easy to get this error by creating a memory leak.
In my experience, if you inadvertently hold on to your Bitmap references and create a memory leak, the OP's error (one referring to BitmapFactory and native methods) is the one that will crash your app (up to ICS - 14 and +?).
To avoid this, make sure you "let go" of your Bitmaps. This means using SoftReferences for the final tier of your cache, so that the Bitmaps in it can be garbage collected. This should work, but if you are still getting crashes, you can try to explicitly mark certain Bitmaps for collection by calling bitmap.recycle(); just remember never to return a bitmap for use in your app if bitmap.isRecycled().
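A minimal sketch of that soft-reference tier (the class name is illustrative; a real cache would usually sit behind a hard-reference LRU tier):
import android.graphics.Bitmap;
import java.lang.ref.SoftReference;
import java.util.HashMap;
import java.util.Map;

public class SoftBitmapCache {
    // Bitmaps are held through SoftReferences, so the GC may reclaim them under memory pressure.
    private final Map<String, SoftReference<Bitmap>> cache =
            new HashMap<String, SoftReference<Bitmap>>();

    public void put(String key, Bitmap bitmap) {
        cache.put(key, new SoftReference<Bitmap>(bitmap));
    }

    public Bitmap get(String key) {
        SoftReference<Bitmap> ref = cache.get(key);
        if (ref == null) {
            return null;
        }
        Bitmap bitmap = ref.get();
        // Never hand back a bitmap that was collected or explicitly recycled.
        if (bitmap == null || bitmap.isRecycled()) {
            cache.remove(key);
            return null;
        }
        return bitmap;
    }
}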
As an aside, LinkedHashMaps are a great tool for easily implementing pretty good cache structures, especially if you combine hard and soft references like in this example (starting line 308)... but using hard references is also how you can get yourself into memory-leak situations if you mess up.
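For reference, a minimal sketch of the LinkedHashMap trick (evicting by entry count for simplicity; real bitmap caches usually limit by bytes instead):
import android.graphics.Bitmap;
import java.util.LinkedHashMap;
import java.util.Map;

public class LruBitmapCache extends LinkedHashMap<String, Bitmap> {
    private final int maxEntries;

    public LruBitmapCache(int maxEntries) {
        // accessOrder = true makes the map track usage order, which is what an LRU cache needs.
        super(16, 0.75f, true);
        this.maxEntries = maxEntries;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<String, Bitmap> eldest) {
        // Evict the least recently used entry once the cache grows past its limit.
        return size() > maxEntries;
    }
}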
Although it usually doesn't make sense to catch an Error, because Errors are normally thrown only by the VM, in this particular case the Error is thrown by the JNI glue code. That makes it very simple to handle cases where the image could not be loaded: just catch the OutOfMemoryError.
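A minimal sketch of that approach (what to do after a failed decode, retry with a larger sample size or skip the image, is up to the caller):
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;

public class SafeDecode {
    // Returns null instead of crashing when the decode runs out of memory.
    public static Bitmap decodeOrNull(String path) {
        try {
            return BitmapFactory.decodeFile(path);
        } catch (OutOfMemoryError e) {
            // Decode failed; the caller can retry with a larger inSampleSize or skip the image.
            return null;
        }
    }
}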
Although this is a fairly high-level answer, the problem for me turned out to be that hardware acceleration was enabled on all of my views. Most of my views do custom Bitmap manipulation, which I assumed was the source of the large native heap size, but in fact when I disabled hardware acceleration the native heap usage was cut down by a factor of four.
It seems that hardware acceleration does all kinds of caching on your views, creating bitmaps of its own, and since all bitmaps share the native heap, the allocation size can grow quite dramatically.
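If you only want to turn it off for the custom views that do heavy Bitmap work, rather than for the whole application in the manifest, a minimal sketch (API 11+):
import android.view.View;

public class LayerHelper {
    // Forces software rendering for a single view, so it stops allocating hardware-layer bitmaps.
    public static void useSoftwareLayer(View view) {
        view.setLayerType(View.LAYER_TYPE_SOFTWARE, null);
    }
}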
