To start, thanks for taking the time to look over my question. I am currently having problems with memory spiking in the application I am developing.
My intent is to download and process large amounts of HTML data; currently the spikes are caused by large base64-encoded images nested in the HTML, which I understand is not ideal on a mobile platform. For the record, I am currently testing on a Samsung Galaxy S. The problem does not occur on the Galaxy Nexus, which allows more memory per application.
My problem is that while processing a chunk of HTML data of approximately 2.8 MB, the heap grows to around 27-29 MB, although the allocated memory never passes 18-19 MB. When the HTML has been processed, saved and displayed, the allocated memory returns to around 3-4 MB. If I then download and process the same HTML again, the process repeats with the same memory use, except the heap seems to grow even further (which doesn't seem necessary to me), and at that point I receive an OutOfMemory error.
When I do receive this error it is normally while downloading the HTML using HttpGet, or while extracting the data from disk using a StringBuffer. Occasionally it is caused by a Bitmap during XML inflation.
Any help would be greatly appreciated.
There is little you can do if you really need that amount of memory. Phones have limited memory.
Deallocating memory is not instantaneous. It might take several GC iterations to free all the memory (and each iteration might run a few seconds apart).
It's common to have problems with too much memory being used by images/drawables. Sometimes it's a memory leak; other times it's not possible to say exactly what is causing it.
I've also had problems parsing large XML files. My solution was splitting those files into smaller ones. Another possibility is weighing the advantages and disadvantages of the different XML parsers (first Google result: SAX parser vs XML pull parser), or perhaps using a third-party implementation developed specifically with memory usage in mind. A third option is using a server to convert the XML file to a more efficient format.
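For what it's worth, here is a minimal sketch of the pull-parser approach mentioned above (the tag-counting task and class name are made up purely for illustration); the point is that only the current parse event is held in memory, rather than a full document tree:

import org.xmlpull.v1.XmlPullParser;
import org.xmlpull.v1.XmlPullParserFactory;

import java.io.InputStream;

public class StreamingXmlExample {

    // Walks the document one event at a time; only the current element is held in
    // memory, unlike a DOM parse which materializes the whole tree at once.
    public static int countElements(InputStream in, String tagName) throws Exception {
        XmlPullParser parser = XmlPullParserFactory.newInstance().newPullParser();
        parser.setInput(in, null);   // null lets the parser detect the encoding

        int count = 0;
        for (int event = parser.getEventType();
                event != XmlPullParser.END_DOCUMENT;
                event = parser.next()) {
            if (event == XmlPullParser.START_TAG && tagName.equals(parser.getName())) {
                count++;
            }
        }
        return count;
    }
}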
The best practice is not to allocate lots of memory in the first place. Process the data in-stream as you're reading it from the network, or stream it to disk and then read it back from there. The android:largeHeap option is available on all devices running Android 3.0 or above, but it only increases the amount you can allocate; it does not remove the limit altogether.
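As a rough sketch of the streaming approach (using HttpURLConnection and hypothetical URL/path parameters rather than the asker's HttpGet code), the response can be copied to disk in small chunks so the whole document never sits in memory at once:

import java.io.BufferedInputStream;
import java.io.FileOutputStream;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class HtmlDownloader {

    // Streams the response body straight to a file using a small fixed buffer,
    // so memory use stays flat no matter how large the HTML document is.
    public static void downloadToFile(String urlString, String destinationPath) throws Exception {
        HttpURLConnection connection = (HttpURLConnection) new URL(urlString).openConnection();
        InputStream in = null;
        OutputStream out = null;
        try {
            in = new BufferedInputStream(connection.getInputStream());
            out = new FileOutputStream(destinationPath);
            byte[] buffer = new byte[8192];   // 8KB chunks instead of the whole 2.8MB document
            int read;
            while ((read = in.read(buffer)) != -1) {
                out.write(buffer, 0, read);
            }
        } finally {
            if (in != null) in.close();
            if (out != null) out.close();
            connection.disconnect();
        }
    }
}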
Related
I have an Android native shared library (.so) that is used in an Android application. This library was originally written for Windows/desktop and then ported to mobile platforms. It contains "algorithmic" code, which works on large data sets. Within the library I don't use the standard heap (malloc and friends); instead, memory pages are allocated via mmap with the flags MAP_PRIVATE | MAP_ANONYMOUS and then partitioned appropriately.
Now, the problem is that at some point mmap fails with error code 12, which is out-of-memory. This happens when the total allocated memory reaches about 650MB. I know this value is pretty large, and way above what a typical Android application needs. But in my specific case this is legitimate IMHO, since this is indeed what this library/application does, and the user is aware of it and approves it.
Specifically I try to run this on Samsung tablet SM-T800, which has 3GB RAM, a 32-bit armeabi-v7a architecture, and more than 7GB of free storage space (flash memory). So that technically there should be no problem.
Also, this is not a problem of virtual memory fragmentation: the mmap fails even when I ask to allocate an additional memory chunk of just 16MB. So most probably some artificial limitation is imposed by the system on how many memory pages can be allocated for the process.
So my question is if and how this limitation can be removed. From what I found in the online documentation there is no mention of such a limitation, but I'm pretty sure it exists. I've also read in some forums that starting from Android 5.0 some applications fail to allocate as much memory as they could on older systems.
In case this limitation can't be removed, would it help to work with file mapping? Currently I store the data in source files in a compressed form, then read it and build a complex data structure in memory. Instead, I could store the whole data structure in a file (meaning the file would be larger) and just map it into memory via mmap. The total size of the virtual address space would be the same, but if the limitation is not on its size, but on how many pages are allocated without being backed by a file, this could work.
As a last resort I can abandon the idea of having all the data in (virtual) memory, and manually read and lock only the data portions I currently need, discarding those that have not been used recently. But that way I would effectively be duplicating the work of the memory manager, since the paging mechanism does exactly this.
Thanks in advance.
Guess what, another Android-Bitmap-OOM question!
Background
Whilst stress testing our application it has been noted that it is possible to max out the app's process memory allocation after sustained, heavy usage (monkeyrunner-like), with OutOfMemory exceptions being recorded in the ensuing stack trace. The app downloads images (around 3 at a time) when a page under a ViewPager is selected. There can be 280+ images available for download when the length and breadth of the app is exercised. The application uses Picasso by Square for its image downloading abstraction. Notably, at no point in our application's code do we manipulate Bitmaps directly... we trust that the very talented Square Inc. employees are doing it better than we can.
Here is a picture
The plot below shows the heap allocations over time, recorded from the dalvikvm-heap log messages. The red dots indicate a user bringing a fresh set of articles into the application in order to increase the amount of outstanding work and stress the app...
DALVIKVM heap allocations http://snag.gy/FgsiN.jpg
Figure 1: Nexus One heap allocations; OOMs occur at 80MB+
Investigation to-date
Against a Nexus S, Nexus 4, Wildfire, HTC Incredible and a myriad of further test devices, anecdotal testing has shown the memory management to be sufficient, with the DVM GC 'keeping up' with the heavy lifting being done by the app. However, on high-end devices such as the Galaxy S II, III, IV and the HTC One, the OOMs are prevalent. In fact, given enough work to do, I would imagine all of our devices would eventually exhibit the failure.
The question
There is clearly a relationship between screen density (our requested image sizes are based on the size of the ImageView), the process memory allocation, and the number of images at a given size that will cause the app to exceed its heap limit. I am about to embark on quantifying this relationship, but would like the SO community to cast their eyes over this problem and (a) agree or disagree that the relationship is worth establishing, and (b) provide literature indicating how best to draw up this relationship.
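To make the relationship concrete with back-of-the-envelope numbers (hypothetical, not measured from this app): a bitmap resized to fill a 720x1280 ImageView and decoded as ARGB_8888 costs roughly 720 * 1280 * 4 ≈ 3.7 MB of heap, so a few dozen such images retained at once is already enough to exhaust a typical 48-64 MB per-process limit, whereas the same images sized for a 320x480 screen cost roughly a sixth of that each.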
It is important to note that if we degrade the image quality our OOMs all disappear, but alas the UX is poorer, which is why we want to be dicing with the most effective use of the available heap.
Side note: here is the portion of code responsible for loading these images into the views that have been laid out:
picassoInstance.load(entry.getKey())
        .resize(imageView.getMeasuredWidth(), imageView.getMeasuredHeight())
        .centerCrop()
        .into(imageView);
The 'degrading of image quality' mentioned above is simply dividing the imageView.getMeasured... values by a number like '4', as shown in the sketch below.
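For illustration only, that reduction would look roughly like this (a sketch, not the app's actual code):

picassoInstance.load(entry.getKey())
        // Request a quarter-size bitmap: roughly 1/16th of the pixel memory, but noticeably softer.
        .resize(imageView.getMeasuredWidth() / 4, imageView.getMeasuredHeight() / 4)
        .centerCrop()
        .into(imageView);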
First you need to manage memory allocation; it's a big issue on Android, as bitmaps take a lot of memory. Memory allocation can be reduced in the following ways:
Put images that are huge in size into the assets folder instead of the drawable folder, because drawable resources take extra memory for caching; if you load from the assets folder the image will not be cached and will take less memory.
Study LruCache, which is designed for efficient memory management (a combined sketch of these first two points follows this list).
Put resources in compressed, tiny formats; for that, check TinyPNG.
If your images are very large in resolution, try using SVG files and load the SVG file instead of a bitmap image. Check this: SVG FOR ANDROID.
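As a rough, hedged sketch of the first two points (the asset name and cache size below are made up for illustration), decoding from assets and keeping the result in an LruCache might look like this:

import android.content.Context;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.util.LruCache;

import java.io.IOException;
import java.io.InputStream;

public class AssetBitmapCache {

    private final Context context;
    // Cache sized to one eighth of the max heap (in KB); a common rule of thumb, not a fixed rule.
    private final LruCache<String, Bitmap> cache;

    public AssetBitmapCache(Context context) {
        this.context = context.getApplicationContext();
        int maxKb = (int) (Runtime.getRuntime().maxMemory() / 1024);
        cache = new LruCache<String, Bitmap>(maxKb / 8) {
            @Override
            protected int sizeOf(String key, Bitmap bitmap) {
                // Measure entries in kilobytes so they match the cache size above.
                return bitmap.getByteCount() / 1024;
            }
        };
    }

    // Loads e.g. "large_photo.png" from assets/ (not res/drawable), caching the decoded bitmap.
    public Bitmap get(String assetName) throws IOException {
        Bitmap cached = cache.get(assetName);
        if (cached != null) {
            return cached;
        }
        InputStream in = context.getAssets().open(assetName);
        try {
            Bitmap decoded = BitmapFactory.decodeStream(in);
            if (decoded != null) {
                cache.put(assetName, decoded);
            }
            return decoded;
        } finally {
            in.close();
        }
    }
}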
Finally, I am not very good at English; I hope this helps you.
This post is a little old but I also had this issue recently. Maybe this will help someone else.
General overview of this massive thread / what helped me:
-Make sure you are using a Singleton Instance of Picasso
-Use fit()
-For large images, many images, or when used in a FragmentPager/StatePager, you should probably use skipMemoryCache() and/or the largeHeap declaration (see the sketch below)
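Putting those tips together, a minimal sketch (assuming Picasso 2.x, where skipMemoryCache() is still available, and a hypothetical imageUrl/imageView) might look like this; largeHeap is a separate android:largeHeap="true" attribute on the manifest's application element:

import android.content.Context;
import android.widget.ImageView;

import com.squareup.picasso.Picasso;

public class ImageLoading {

    // Picasso.with() already returns a process-wide singleton; avoid building a new
    // Picasso instance per request, since each one carries its own memory cache.
    public static void loadLarge(Context context, String imageUrl, ImageView imageView) {
        Picasso.with(context)
                .load(imageUrl)
                .fit()               // defer until the view is measured, then resize to fit it
                .centerCrop()
                .skipMemoryCache()   // don't keep an extra full-size copy in the LRU memory cache
                .into(imageView);
    }
}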
Read the thread for more tips. At the time this question was posted, nobody had reported this issue on Picasso's GitHub.
https://github.com/square/picasso/issues/305
My activity has a ListView and (apart from all the other stuff) loads images from the web and displays them in the ListView. I have access to 5 Android devices: 2 HTC Desires, an LG P-350, one more phone and a tablet. Normally everything works fine, but when launched on one of the HTC Desires the app tends to crash with a NullPointerException, which I believe is due to an out of memory error; this is the output:
05-03 14:41:23.818: E/dalvikvm(843): Out of memory: Heap Size=7367KB, Allocated=4991KB, Bitmap Size=16979KB
Later, logcat outputs the stack trace of the NullPointerException, where one of my static variables suddenly becomes null (the variable is initialized in the app's root activity, is used across the app and is definitely never set to null in code). I suppose it is nulled by the system due to the lack of memory.
As far as I understand, the system tries to allocate a bitmap as large as 17MB - I'm sure the loaded images can't be that big. They are 100*70 JPEGs and each of them weighs far less than 1MB.
Another thing I don't understand is why I get this error only on one device - the other devices work fine.
To my mind this looks very strange and I can find no clue; I need advice.
The reason is simple: the memory is not holding your JPG data per se, but rather its decompressed equivalent, which, needless to say, takes a lot more RAM than the source files... Note that this 17MB figure is for all your loaded bitmaps at once, not necessarily a single one.
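To put rough (illustrative) numbers on that: a decompressed bitmap costs about width * height * 4 bytes in ARGB_8888, so a single 100*70 image takes roughly 100 * 70 * 4 ≈ 27 KB no matter how small the JPEG file is, and several hundred of them held in memory at once - or a handful decoded at a much larger resolution than expected - quickly adds up to the 17MB reported in the log.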
I had to fight similar problems in one of my programs (a custom tile loader for a MapQuest Android API MapView object), and I ended up having to call the recycle() method on my bitmaps whenever possible, as well as manually asking the system to garbage collect at strategic locations using System.gc()...
Sorry to not be the bearer of the best news...
You might solve your problems using the same strategy I did: I essentially cache the loaded bitmaps in hard storage, such as the external SD card, and reload them on the fly when needed, instead of attempting to hold everything in RAM.
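A minimal sketch of that idea (the directory and file naming are made up for illustration, and error handling is trimmed): write each downloaded bitmap to app-private external storage, recycle the in-memory copy once it is off-screen, and decode it again when it is needed.

import android.content.Context;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;

import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;

public class BitmapDiskCache {

    private final File cacheDir;

    public BitmapDiskCache(Context context) {
        // App-private directory on external storage; removed automatically when the app is uninstalled.
        cacheDir = context.getExternalFilesDir("bitmaps");
    }

    // Persist the bitmap so the heap copy can be recycled while its row is off-screen.
    public void save(String key, Bitmap bitmap) throws IOException {
        FileOutputStream out = new FileOutputStream(new File(cacheDir, key + ".png"));
        try {
            bitmap.compress(Bitmap.CompressFormat.PNG, 100, out);
        } finally {
            out.close();
        }
        bitmap.recycle();   // free the pixel data right away instead of waiting for the GC
    }

    // Decode the cached copy again when the row scrolls back into view.
    public Bitmap load(String key) {
        File file = new File(cacheDir, key + ".png");
        return file.exists() ? BitmapFactory.decodeFile(file.getAbsolutePath()) : null;
    }
}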
I want to load a text file in a word-processing app, but I get an out of memory error when the files are too big. I determined that I can load files of up to 1MB, but sometimes I get out of memory even for 1MB files. And I have to tell, before loading, whether I can load the file or not.
I tried to find the maximum possible available memory, going beyond just freeMemory(),
that is, freeMemory() + (maxMemory() - totalMemory()), which should give the total possible available memory for the application (say around 18MB to 20MB). But I only get an OutOfMemory error after completely utilizing the heap, say for example 24MB.
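For reference, that calculation with the Runtime API looks roughly like the sketch below (the check is only an estimate, since an allocation can still fail if it does not fit contiguously or other allocations happen in the meantime; the 4x multiplier is a guess, not a measured figure):

public class MemoryEstimate {

    // Rough "headroom" estimate: unused space inside the current heap plus the amount
    // the heap is still allowed to grow before hitting the per-process limit.
    public static long availableHeapBytes() {
        Runtime runtime = Runtime.getRuntime();
        long free = runtime.freeMemory();                          // unused space in the current heap
        long growth = runtime.maxMemory() - runtime.totalMemory(); // how much the heap can still grow
        return free + growth;
    }

    // Example pre-flight check before loading a file of a known size; the 4x multiplier is
    // a guess to cover char[] expansion and temporary buffers, not a measured figure.
    public static boolean probablyFits(long fileSizeBytes) {
        return fileSizeBytes * 4 < availableHeapBytes();
    }
}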
My real question is why that 18MB to 20MB of "maximum possible available memory" gets used up when loading a 1MB file.
How much memory should be available to load a 1MB file?
How can I roughly compute it?
Is there any way using PSS or PrivateDirty? I couldn't understand much about PSS, and I couldn't get much information about summing these values up from the article "How to discover memory usage of my application in Android".
Thanks
Remember, the way you store the file contents in variables matters quite a lot. Using a char array manually is one of the most memory-efficient ways, but you still need to account for every character taking 16 bits, or 2 bytes. So if you have a text file in some 8-bit encoding and you load it into a char array, it takes twice as much space.
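As a concrete (illustrative) example, a 1MB ASCII or Latin-1 file becomes roughly 2MB once it sits in a char[], and while loading you typically also hold the raw byte buffer plus any intermediate String or StringBuilder, so the peak cost of "loading 1MB" can easily be several times the file size. A minimal sketch, assuming a single-byte encoding such as ISO-8859-1 (the class and method names are made up):

import java.io.FileInputStream;
import java.io.InputStreamReader;
import java.io.Reader;

public class TextLoader {

    // Reads a file into a char[] sized from the file length; with a single-byte encoding
    // the array ends up roughly twice the file size, plus temporary decoder buffers while reading.
    public static char[] loadChars(String path, int fileSizeBytes) throws Exception {
        char[] chars = new char[fileSizeBytes];   // upper bound for single-byte encodings
        int filled = 0;
        Reader reader = new InputStreamReader(new FileInputStream(path), "ISO-8859-1");
        try {
            int read;
            while (filled < chars.length
                    && (read = reader.read(chars, filled, chars.length - filled)) != -1) {
                filled += read;
            }
        } finally {
            reader.close();
        }
        return chars;   // 'filled' characters are valid, each taking 2 bytes in memory
    }
}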
I have an app which does some basic operations like downloading files, installing files, querying the phone, using threads, HttpClient connections, etc. (nothing too complicated).
It also runs a permanent service inside (a kind of listener).
The weird thing is: when I first install it on my device, its size in memory is around 150KB, but after a while (it could be a couple of days of activity) the size grows unexpectedly; last time I checked it had reached 664KB.
What could be the reason? Is this memory measure reliable?
What should I check, or how should I solve this, in order to keep the app small while it is resident in memory?
Thanks,
ray.
It's usually due to fragmentation of the heap. There are plenty of resources on this for regular Java VMs; you will have to check the Dalvik documentation for tools to assist with Android.
How fragmented is my Java heap?
Tips and tricks for dealing with a fragmented Java
Dalvik heap fragmentation and recovery