Can't allocate enough memory in JNI - android

I have an Android native shared library (.so) that is used in an Android application. The library was originally written for Windows/desktop and then ported to mobile platforms. It contains "algorithmic" code that operates on large data sets. Within the library I don't use the standard heap (malloc and friends); instead, memory pages are allocated via mmap with the flags MAP_PRIVATE | MAP_ANONYMOUS and then partitioned appropriately.
Now, the problem is that at some point mmap fails with error code 12, which is out-of-memory (ENOMEM). This happens when the total allocated memory reaches about 650MB. I know this value is pretty large, and way above what a typical Android application needs. But in my specific case this is legitimate IMHO, since this is indeed what the library/application does, and the user is aware of this and accepts it.
Specifically, I'm trying to run this on a Samsung tablet SM-T800, which has 3GB of RAM, a 32-bit armeabi-v7a architecture, and more than 7GB of free storage space (flash memory). So technically there should be no problem.
Also, this is not a problem of virtual address space fragmentation: mmap fails when I ask for an additional chunk of just 16MB, and with only ~650MB of a multi-gigabyte address space in use, there should be plenty of contiguous room left. So most probably the system imposes some artificial limit on how many memory pages can be allocated to the process.
So, my question is whether and how this limitation can be removed. The online documentation I found makes no mention of such a limit, but I'm pretty sure it exists. I've also read on some forums that starting from Android 5.0, some applications fail to allocate as much memory as they could on older systems.
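One diagnostic worth running (a sketch, Linux/Android-specific, and only a guess at the cause): the kernel caps the number of distinct mappings a process may hold via /proc/sys/vm/max_map_count, so counting the entries in /proc/self/maps shows whether a mapping-count limit, rather than a total-size limit, is being approached:

```c
#include <assert.h>
#include <stdio.h>

/* Count the entries in /proc/self/maps (one mapping per line).
 * Linux/Android only; compare the result against
 * /proc/sys/vm/max_map_count. Returns -1 on error. */
int count_mappings(void) {
    FILE *f = fopen("/proc/self/maps", "r");
    if (!f)
        return -1;
    int count = 0, c;
    while ((c = fgetc(f)) != EOF)
        if (c == '\n')
            count++;
    fclose(f);
    return count;
}
```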
In case this limitation can't be removed, would it help to work with file mapping? Currently I store the data in source files in compressed form, then read it and build a complex data structure in memory. Instead, I could store the whole data structure in a file (meaning the file would be larger) and simply map it into memory via mmap. The total size of the virtual address space used would be the same, but if the limitation is not on its size but on how many pages are allocated without being backed by a file, this could work.
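A minimal sketch of that file-mapping approach (the function name is made up for illustration): the mapping is read-only and file-backed, so the kernel can drop and re-read its pages under memory pressure instead of treating them as anonymous memory:

```c
#include <assert.h>
#include <fcntl.h>
#include <string.h>
#include <sys/mman.h>
#include <unistd.h>

/* Map an existing data file read-only. The pages are file-backed,
 * so the kernel can evict and later re-read them on demand. */
void *map_data_file(const char *path, size_t *out_len) {
    int fd = open(path, O_RDONLY);
    if (fd < 0)
        return NULL;
    off_t len = lseek(fd, 0, SEEK_END);      /* file size */
    void *p = mmap(NULL, (size_t)len, PROT_READ, MAP_PRIVATE, fd, 0);
    close(fd);                               /* mapping survives close */
    if (p == MAP_FAILED)
        return NULL;
    *out_len = (size_t)len;
    return p;
}
```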
As a last resort, I can abandon the idea of having all the data in (virtual) memory, and manually read and lock only the data portions I currently need, discarding those that haven't been used recently. But that way I would just be duplicating the work of the memory manager, since the paging mechanism does exactly this.
Thanks in advance.

Related

Why are hundreds of bitmaps in memory for a basic android app?

When building my first Android app I noticed that memory usage was already approaching 20MB right when the app started. I downloaded the Eclipse MAT and viewed the contents to find hundreds, if not thousands, of bitmaps in memory, many of which are not used directly in my project (at least not that I'm aware of).
Largest Image is 9MB:
https://imagebin.ca/v/2eCK4JYLO2f2
Others are much smaller:
https://imagebin.ca/v/2eCMmbFuIWsz
Android seems to have gone to great lengths to reduce memory usage by using the zygote... so why are there so many unused bitmaps already in memory?
(The basic project is a blank activity created in Android Studio, with a fragment made up of a text view that simply says "Hello World!")
I have found that the bitmaps come from the Zygote's memory footprint. Since all Java-based apps are started via the zygote, the minimal memory usage is determined by the size of the memory in use by the Zygote, which in turn depends on the display density of the device.
Answers found:
Find the class that wastes memory
Locate & remedy cause of large heap size

Memory usage analysis using Android Studio

I am trying to understand where my app is using memory, and where I can make it more efficient in this respect.
In the Android Monitor part of Android Studio, I have dumped the Java Heap, and am looking at the generated hprof.
And I see a lot of memory categorized under FinalizerReference.
What is this? How can I understand better what is causing it, and how can I keep it down? Looking into the "Instance" panel doesn't help me much; it doesn't make much sense to me.
I have tried looking at this but it's all slightly over my head at the moment.
Also, at the moment the memory monitor is reporting (in the live chart section) an allocated memory of 10.58 MB. But on my device, in Application Manager > Running Processes, my app shows a memory usage of 44MB. Why the discrepancy? If it's the ~33MB difference I want to reduce, I apparently can't even see it in Android Studio, so there's no real hope of identifying what it is.
There may not be much you can do about FinalizerReference memory usage. See this question for more details - basically some objects implement finalize() and these are handled a little differently, such that they can end up sticking around longer. I haven't looked into it too closely, but I suspect that some android sdk objects do this and there's little you can do about it except for maybe tuning up your object caching/recycling to reduce it.
I'm not sure if this would help with FinalizerReference, but one thing I like to do to track down memory leaks is to find suspicious objects' connections to the GC root.
If you're using the Eclipse hprof analyzer (independent of the actual Eclipse IDE; works with hprofs generated by android studio), this is one way to access this:
Overview
Histogram
Right-click, "List Objects"
Right-click an object you suspect is leaking, "Path to GC Roots"
Now you should see a list of nested references leading back down from the gc root to your object.
I'm not exactly sure what accounts for the discrepancy - here is a similar question on that. Apparently the memory monitor tool may only report heap allocations made by Java code, whereas the device reports the entire process's memory usage.
The Retained Size reported by the Memory Profiler for FinalizerReference is currently a meaningless number, as I argued in my answer to my own similar question.
To summarize: Treating FinalizerReference like any other class when profiling (as Memory Profiler does), leads to repeated counting of the same memory when calculating its Retained Size.
I view this as a bug in Android Studio's Memory Profiler, and have filed this issue.

iOS equivalent to increasing heap size

iOS apps are (for the most part) written in Objective-C, which is a superset of C and therefore, unlike Android/Java, not a memory-managed language.
In Android, you have the ability to increase heap size by simply adding this one line in the XML android manifest:
<application android:largeHeap="true"/>
Is there an iOS version to doing something like this?
Well, in iOS you don't have any control over the memory.
It is all managed by the kernel, so you cannot increase the heap size.
As pointed out in the comments, memory management works differently in iOS.
You get as much memory as is available, but if the app uses too much memory, it will be killed by the system.
Now that you've explained your goal: you shouldn't download large files into memory, as this will cause trouble. Instead, save the data directly to disk as you receive the response.
Take a look at Apple's "Memory Usage Performance Guidelines" for an explanation of how iOS doesn't manage swap space.
Although OS X supports a backing store, iOS does not. In iPhone applications, read-only data that is already on the disk (such as code pages) is simply removed from memory and reloaded from disk as needed. Writable data is never removed from memory by the operating system. Instead, if the amount of free memory drops below a certain threshold, the system asks the running applications to free up memory voluntarily to make room for new data. Applications that fail to free up enough memory are terminated.
iOS attempts to provide each application with as much of the device's memory as the OS can spare. However, each application is limited to the device's physical memory. There is no option to allocate larger blocks and expect them to be swapped to disk as needed.
Manipulating the heap size in iOS is therefore not a meaningful concept. Each app already has the largest heap the OS can provide. Instead, apps must attempt to minimize their memory footprint to remain within the available space on the host device. This means purging in-memory caches in response to memory warnings, streaming access to resources on disk (as @CouchDeveloper suggested in a comment), and minimizing the amount of memory used overall.
As an additional complication iOS attempts to keep memory in use. Unused memory is wasted capacity and users may be better served by the OS keeping more applications suspended and in memory rather than terminated. As a result attempting to measure available free memory does not give a meaningful result. As the device runs low on free memory other applications will reduce their use in response to memory warnings or by being terminated completely.

Android memory spiking

To start, thanks for taking the time to look over my question. I am currently having problems with memory spiking in the application I am developing.
My intent is to be able to download and process large amounts of HTML data; the current cause of trouble is large base64-encoded images nested in the HTML, which I understand is not ideal on a mobile platform. For the record, I am currently testing on a Samsung Galaxy S. This problem does not occur on the Galaxy Nexus, because that device allocates more memory per application.
My problem is that while processing a large chunk of HTML data of approximately 2.8MB, the memory heap increases to around 27-29MB but the allocated memory never passes 18-19MB. When the HTML has been processed, saved, and displayed, the allocated memory returns to around 3-4MB. If I then download and process the same HTML again, the process repeats, except the heap grows even further (which doesn't seem necessary to me), and at that point I receive an out-of-memory error.
When I do receive this error it is normally while downloading the HTML using HttpGet or while extracting the data from disk using a StringBuffer. Occasionally it is caused by a Bitmap during an XML inflation.
Any help would be greatly appreciated.
There is little you can do if you really need that amount of memory. Phones have limited memory.
Deallocating memory is not instantaneous. It might take several iterations to free all the memory (and each iteration might be executed a few seconds apart).
It's common to have problems with too much memory being used by images/drawables. Sometimes it's a memory leak; other times it's not possible to say what is causing it.
I've also had problems parsing large XML files. My solution was splitting those files into smaller ones. Another possibility is weighing the advantages and disadvantages of different XML parsers (first Google result: SAX parser vs XML pull parser), or perhaps using some third-party implementation developed specifically with memory usage in mind. A third option is using a server to convert the XML file to a more efficient format.
The best practice is not to allocate lots of memory in the first place. Process the data in-stream as you're reading it from the network, or stream it to disk and then read it from there. The android:largeHeap option is available on devices running Android 3.0 or above, but it only increases the amount you can allocate; it does not remove the limit altogether.
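To illustrate the in-stream idea (sketched in C, the language of the thread's original question; the same principle applies to Java's InputStream), process fixed-size chunks as you read them instead of accumulating the whole payload in memory:

```c
#include <assert.h>
#include <stdio.h>

/* Stream a file in small chunks rather than loading it all at once.
 * Here the "processing" is just summing the bytes; peak memory stays
 * at the size of one buffer regardless of the file's size. */
long stream_checksum(const char *path) {
    FILE *f = fopen(path, "rb");
    if (!f)
        return -1;
    unsigned char buf[4096];
    size_t n;
    long sum = 0;
    while ((n = fread(buf, 1, sizeof buf, f)) > 0)
        for (size_t i = 0; i < n; i++)
            sum += buf[i];
    fclose(f);
    return sum;
}
```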

How can I gracefully degrade my performance, given limited memory?

I've spent the last few days trying to remove memory leaks in my game, which were resulting in many out-of-memory errors. I'm on the verge of adding a significant amount of graphics that, while not hugely complicated, will add significantly to the processing requirements of my system, and I'm a bit worried about my memory usage. I was hoping someone might have some tips for me. I don't want to go below Android 2.1, so please tailor any answers to that end.
First of all, my game consists of:
2 activities and 13 XML files (some relating to a small part of a layout, some to dialogs, and 2 directly related to activities).
A number of drawables, made in Adobe Illustrator and converted to PNG. These are probably large, but not unusually so, and for the most part only a small number of them are in memory at any given time.
Quite a few dialogs.
Targeted towards Android 1.6 and above.
I use the newest AdMob, and as a result I have to build against 3.2.
My default heap size for my emulators is around 24 MB.
A few sample images from my game:
What I have learned:
Despite my total app size being only around 500K, I somehow am taking up 24 Megs, as calculated by adb shell procrank.
I have done considerable optimization, but am not seeing large reductions in memory use.
Tools that inspect the heap typically show only around 7 MB available, with around 3 MB being used. Sometimes, when opening new dialogs and the like, I see an increase, but I can't say it's all that large...
MAT shows that none of my classes are using an unusually large amount of memory.
So, given all of this, my questions.
Is 24 MB an actual requirement to develop to (Android 1.6+)?
Assuming it is, how can I allow for nicer graphics on systems that can handle them, without crashing and burning on older systems?
What are some common gotchas I should look out for to improve my memory usage?
Without seeing your actual code, I can't say if the following will be relevant to you or not. However, it is worth a shot.
If you are not already doing so, you can consider using something called an LruCache. http://developer.android.com/reference/android/util/LruCache.html
Using this tool, you can determine at what point your cached objects (such as Bitmaps) become eligible for garbage collection. So if you set the maximum size at 4MB (for example), the cache will evict older entries should it try to grow beyond that. (See the docs for implementation details and a good example.)
The only downside is that that little gem only came along with 3.2, so you would have to make that your minSdkVersion in the AndroidManifest, or check the API level programmatically at run time to determine whether you can use it. Pre-3.2, I would say you need to call recycle() on any Bitmaps you are no longer using, but if you have already optimized, the chances are good you are doing this.
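To make the least-recently-used policy concrete, here is a toy version sketched in C (the native language of the thread's original question); android.util.LruCache applies the same policy, evicting the least recently accessed entries once the configured maximum size is exceeded:

```c
#include <assert.h>

/* Toy fixed-capacity LRU cache with a linear scan. This illustrates
 * the eviction policy only, not LruCache's actual implementation. */
#define LRU_CAP 3

typedef struct {
    int keys[LRU_CAP];
    int vals[LRU_CAP];
    long stamp[LRU_CAP];   /* last-access time per slot */
    int count;
    long tick;             /* monotonically increasing clock */
} Lru;

void lru_init(Lru *c) { c->count = 0; c->tick = 0; }

/* Returns 1 and writes *val on a hit (marking the entry as
 * recently used), 0 on a miss. */
int lru_get(Lru *c, int key, int *val) {
    for (int i = 0; i < c->count; i++)
        if (c->keys[i] == key) {
            c->stamp[i] = ++c->tick;
            *val = c->vals[i];
            return 1;
        }
    return 0;
}

void lru_put(Lru *c, int key, int val) {
    for (int i = 0; i < c->count; i++)
        if (c->keys[i] == key) {       /* update in place */
            c->vals[i] = val;
            c->stamp[i] = ++c->tick;
            return;
        }
    int slot;
    if (c->count < LRU_CAP) {
        slot = c->count++;             /* free slot available */
    } else {
        slot = 0;                      /* evict least recently used */
        for (int i = 1; i < LRU_CAP; i++)
            if (c->stamp[i] < c->stamp[slot])
                slot = i;
    }
    c->keys[slot] = key;
    c->vals[slot] = val;
    c->stamp[slot] = ++c->tick;
}
```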
This is a nice little snippet about the difference between the heap and native memory: http://code-gotcha.blogspot.com/2011/09/android-bitmap-heap.html It may (depending on what you are doing) help you understand why you are not seeing the drop in memory you are expecting.
And finally this post from SO should help when dealing with heap size as well:
Detect application heap size in Android
Hope that helps.
