On android.com they say that if you're working in Java, the maximum memory you can use is 16 MB; at least that's what devices are supposed to support. If you have an older phone, you'll notice that you can't get more: you get an OutOfMemoryError instead. That's not the case if you're doing the same thing using the NDK. In one of my applications I am requesting 50 MB and more, and so far Android has been fine with that.
I haven't found anything related to that on android.com.
Is there any limit like in Java, too?
If yes: what's the limit?
If no: What is a good value for that?
The problem is that I have to build my code around that size.
[Edit:]
I tried what Seva Alekseyev was suggesting.
root@android:/ # ulimit -a
ulimit -a
time(cpu-seconds) unlimited
file(blocks) unlimited
coredump(blocks) 0
data(KiB) unlimited
stack(KiB) 8192
lockedmem(KiB) 64
nofiles(descriptors) 1024
processes 7806
flocks unlimited
sigpending 7806
msgqueue(bytes) 819200
maxnice 40
maxrtprio 0
resident-set(KiB) unlimited
address-space(KiB) unlimited
root@android:/ # ulimit -v
ulimit -v
unlimited
root@android:/ #
The memory I am requesting (by using malloc or new) is virtual memory (ulimit -v). So there's no way to figure out how much I can get?!
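For reference, here is a small Java sketch I could use to watch the native heap while experimenting; these android.os.Debug calls only report current usage, not any limit:

import android.os.Debug;
import android.util.Log;

// Sketch: log the current native heap statistics. These values describe usage,
// not a hard cap, so they can't answer "how much can I get" directly.
public class NativeHeapLogger {
    public static void log() {
        Log.d("NativeHeap", "size=" + Debug.getNativeHeapSize()
                + " allocated=" + Debug.getNativeHeapAllocatedSize()
                + " free=" + Debug.getNativeHeapFreeSize());
    }
}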
You're subject to three types of memory limits:
1) Artificial limits put in place to keep the system responsive when multitasking -- the VM heap limitation is the main example of this. ulimit is a potential mechanism for the OS to place further limitations on you, but I have not seen it being used restrictively on Android devices.
2) Physical limits based on available real memory. You should have a baseline device you're developing/testing on, and should be pretty aggressive in assuming that other processes (background services, other apps) need memory too. Also remember that the memory in use by the OS varies with OS version (and will tend to increase over time). Stock Android doesn't swap, so if you go too far you're dead. One potential scenario is a Nexus One (512MB RAM) with an audio player and the phone app running in the background, and a "balloon" service eating another 100MB of physical memory to give some leeway; in this configuration you'll still find more than 100MB available.
3) Virtual memory limits based on address space. Stock Android allows overcommitment of memory, so it won't blink if you ask for a 1GB virtual allocation (via mmap, etc.) on a device with 512MB of RAM, and this is often a very useful thing to do. However, when you then touch the memory, it needs to be brought into physical memory. If there are read-only pages in physical memory they can be ejected, but soon enough you're going to run out, and without swap -- dead. (The combination of overcommit and no swap leads directly to process death in out-of-memory situations, rather than recoverable errors like malloc returning null.)
Finally, it's worth noting that whether calloc/malloc/new require physical allocation is allocator-dependent, but it's safer to assume yes, especially for allocations smaller than a large number of pages. So: if you're dealing with < 100 MB of standard, well-behaved allocations, you're probably in the clear -- but test! If you're dealing with large amounts of data that you'd like memory-mapped, mmap is your friend when used carefully, and is your best friend when used with PROT_READ only. And if you're dealing with > 100 MB of physical memory allocations, expect to run quite nicely on modern devices, but you'll have to define a baseline carefully and test, test, test, since detecting out-of-memory situations on the fly is not generally possible.
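To illustrate the read-only-mapping advice from the Java side (this is only a sketch of the same idea, not code from the answer; the file path is a placeholder): FileChannel.map() gives you an mmap-backed, read-only view whose clean pages the kernel can evict and re-read under pressure.

import java.io.IOException;
import java.io.RandomAccessFile;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;

// Sketch: map a large data file read-only. Pages are faulted in on first access
// and, because they are clean, can be dropped and reloaded from the file, which
// is the PROT_READ-only behaviour described above.
public class ReadOnlyMapping {
    public static MappedByteBuffer mapReadOnly(String path) throws IOException {
        try (RandomAccessFile file = new RandomAccessFile(path, "r");
             FileChannel channel = file.getChannel()) {
            return channel.map(FileChannel.MapMode.READ_ONLY, 0, channel.size());
        }
    }
}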
One more note: APP_CMD_LOW_MEMORY exists, and is a great place to purge caches, but there's no guarantee it's called in time to save your life. It doesn't change the overall picture at all.
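On the SDK side, the analogous hook is onTrimMemory()/onLowMemory(); a minimal sketch (the cache field is just a placeholder for whatever the app keeps around), with the same caveat that the callback may not arrive in time:

import android.app.Application;
import android.content.ComponentCallbacks2;
import java.util.HashMap;
import java.util.Map;

// Sketch: drop reclaimable cached data when the system asks for memory back.
public class MyApp extends Application {
    // Placeholder cache standing in for whatever the app keeps around.
    private final Map<String, byte[]> cache = new HashMap<>();

    @Override
    public void onTrimMemory(int level) {
        super.onTrimMemory(level);
        if (level >= ComponentCallbacks2.TRIM_MEMORY_RUNNING_LOW) {
            cache.clear(); // purge everything that can be rebuilt later
        }
    }
}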
Related
From what I've researched, Android allocates a limited amount of memory for each process, maybe ranging from 16 MB to 24 MB per process. Here is a reference.
Nevertheless, when I view the memory usage of an application in Settings, I often see a normal application using a hundred megabytes of memory (in one process). There is a contradiction here that I cannot understand.
Thanks :)
NDK code can use more system RAM than can a single Dalvik/ART process. Also, the app might be using more than one process, or it might be using android:largeHeap to request an above-normal heap size.
iOS apps are (for the most part) written in Objective-C, which is a superset of C and therefore not a managed language, unlike Java on Android.
In Android, you have the ability to increase heap size by simply adding this one line in the XML android manifest:
<application android:largeHeap="true"/>
Is there an iOS way of doing something like this?
Well, in iOS you don't have any control over the memory.
It is all managed by the kernel, so you cannot increase the heap size.
As pointed out in the comments, memory management follows a different model on iOS.
You get as much memory as is available, but if the app uses too much memory it will be killed by the system.
Now that you have explained your goal: you shouldn't download large files into memory; this will cause trouble. Instead, you should save the data directly to disk as you receive the response.
Take a look at Apple's "Memory Usage Performance Guidelines" for an explanation of how iOS doesn't manage swap space.
Although OS X supports a backing store, iOS does not. In iPhone applications, read-only data that is already on the disk (such as code pages) is simply removed from memory and reloaded from disk as needed. Writable data is never removed from memory by the operating system. Instead, if the amount of free memory drops below a certain threshold, the system asks the running applications to free up memory voluntarily to make room for new data. Applications that fail to free up enough memory are terminated.
iOS attempts to provide each application with as much of the device's memory as the OS can spare. However, each application is limited to the device's physical memory. There is no option to allocate larger blocks and expect them to be swapped to disk as needed.
Manipulating the heap size in iOS is therefore not a meaningful concept. Each app already has the largest heap the OS can provide. Instead, apps must attempt to minimize their memory footprint to remain within the available space on the host device. This means purging in-memory caches in response to memory warnings, streaming access to resources on disk (as @CouchDeveloper suggested in a comment), and minimizing the amount of memory used overall.
As an additional complication, iOS attempts to keep memory in use. Unused memory is wasted capacity, and users may be better served by the OS keeping more applications suspended and in memory rather than terminated. As a result, attempting to measure available free memory does not give a meaningful result. As the device runs low on free memory, other applications will reduce their use in response to memory warnings or by being terminated completely.
I need to search through text files (around 40 MB) in my app with regular expressions; as you can imagine, it normally takes a minute or so to get it done. And I have to do it repeatedly.
I wonder if I can keep these files in RAM after the first search. Can I possibly do that? I mean, is there a way to explicitly say "keep this in RAM for some time"?
Consider putting your search results in a WeakHashMap, with keys that only exist for the duration that you need the values to exist, like the scope of an Activity. Watch out for memory issues, though. On some devices, your application's process may only have a heap size as low as 16 MB.
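A minimal sketch of that suggestion (the class name and value type are placeholders; the point is that entries disappear once nothing else holds the key):

import java.util.Map;
import java.util.WeakHashMap;

// Sketch: cache search results keyed by an object whose lifetime matches the
// scope you care about (e.g. the Activity). When the key becomes unreachable,
// the entry becomes eligible for garbage collection.
public class SearchResultCache {
    private final Map<Object, String> results = new WeakHashMap<>();

    public void put(Object scopeKey, String result) {
        results.put(scopeKey, result);
    }

    public String get(Object scopeKey) {
        return results.get(scopeKey);
    }
}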
Keep the results in a custom object that will save the search result. This will keep it in RAM (as long as you keep a reference to it).
Also keep in mind that allocating 40 MiB of RAM on Android devices is not a very good idea, since RAM is quite limited on a lot of low-end devices. This can make your application a very tasty target for Android when it looks to free memory.
The size of the VM heap cannot exceed 16 MB, 24 MB, or 32 MB, depending on the phone.
But what is the maximum size of the native heap? How much native memory can be allocated to the app when it is in the foreground?
Thanks.
Technically there's no restriction in the NDK. Someone asked this a while back and was referred to this android-ndk Groups thread. A relevant quote:
"Also given that this is the NDK list, the limit is actually
not imposed on you, because it is only on the Java heap. There is no limit on
allocations in the native heap..."
Dianne Hackborn
She does go on to say that this shouldn't be abused, and that if it is, applications could be killed.
There's no simple answer to this; you can use as much memory as the device has, minus what it's using for other programs. When Android thinks it's low on memory, it'll start killing background tasks, so it's a soft limit. Most devices do not have swap space. You can get some statistics about the device's memory from inside Dalvik with android.app.ActivityManager.MemoryInfo (I assume there's an NDK equivalent).
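For instance, from Java the system-wide numbers mentioned above can be read like this (a sketch; availMem and lowMemory describe the device as a whole, not a per-app limit):

import android.app.ActivityManager;
import android.content.Context;

// Sketch: query system-wide memory statistics via ActivityManager.MemoryInfo.
public class MemInfoHelper {
    public static ActivityManager.MemoryInfo query(Context context) {
        ActivityManager am =
                (ActivityManager) context.getSystemService(Context.ACTIVITY_SERVICE);
        ActivityManager.MemoryInfo info = new ActivityManager.MemoryInfo();
        am.getMemoryInfo(info);
        return info; // info.availMem, info.lowMemory, info.threshold
    }
}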
adb shell dumpsys meminfo PACKAGENAME will give you the native and Dalvik memory usage of your app.
I see that the Heap Size is automatically increased as the app needs it, up to whatever the phone's Max Heap Size is. I also see that the Max Heap Size is different depending on the device.
So my first question is: what are the typical Max Heap Sizes on Android devices? I have tested memory allocation on one phone that was able to use a heap of over 40 MB, while another threw OutOfMemory errors in the 20 MB range. What are the lowest in common use and what are the highest on common devices? Is there a standard or average?
The second question, and more important one, is how to ensure you are able to use the resources available per device but avoid using too much? I know there are methods such as onLowMemory() but those seem to be only for the entire system memory, not just the heap for your specific application.
Is there a way to detect the max heap size for the device and also detect when the available heap memory is reaching a low point for your application?
For example, if the device only allowed a max heap of 24 MB and the app was nearing that limit in its allocations, it could detect that and scale back. However, if the device could comfortably handle more, it would be able to take advantage of what is available.
Thanks
Early devices had a per-app cap of 16MB. Later devices increased that to 24MB. Future devices will likely have even more available.
The value is a reflection of the physical memory available on the device and the properties of the display device (because a larger screen capable of displaying more colors will usually require larger bitmaps).
Edit: Additional musings...
I read an article not too long ago that pointed out that garbage-collecting allocators are essentially modeling a machine with infinite memory. You can allocate as much as you want and it'll take care of the details. Android mostly works this way; you keep hard references to the stuff you need, soft/weak references to stuff you might not, and discard references to the stuff you'll never need again. The GC sorts it all out.
In your particular case, you'd use soft references to keep around the things that you don't need to have in memory, but would like to keep if there's enough room.
This starts to fall apart with bitmaps, largely because of some early design decisions that resulted in the "external allocation" mechanism. Further, the soft reference mechanism needs some tuning -- the initial version tended to either keep everything or discard everything.
The Dalvik heap is under active development (see e.g. the notes on Android 2.3 "Gingerbread", which introduces a concurrent GC), so hopefully these issues will be addressed in a future release.
Edit: Update...
The "external allocation" mechanism went away in 4.0 (Ice Cream Sandwich). The pixel data for Bitmaps is now stored on the Dalvik heap, avoiding the earlier annoyances.
Recent devices (e.g. the Nexus 4) cap the heap size at 96MB or more.
A general sense of the app's memory limits can be obtained as the "memory class", from ActivityManager.getMemoryClass(). A more specific value can be had from the java.lang.Runtime function maxMemory().
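A quick sketch of reading both values (the class and method names here are just illustrative wrappers around the two calls above):

import android.app.ActivityManager;
import android.content.Context;

// Sketch: memoryClass is reported in megabytes; maxMemory() in bytes for this VM.
public class HeapLimits {
    public static int memoryClassMb(Context context) {
        ActivityManager am =
                (ActivityManager) context.getSystemService(Context.ACTIVITY_SERVICE);
        return am.getMemoryClass();
    }

    public static long maxHeapBytes() {
        return Runtime.getRuntime().maxMemory();
    }
}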
Here are the "normal" (see below) heap sizes for some specific devices:
G1: 16MB
Moto Droid: 24MB
Nexus One: 32MB
Viewsonic GTab: 32MB
Novo7 Paladin: 60MB
I say "normal" because some versions of Android (e.g., CyanogenMod) will allow a user to manually adjust the heap limit. The result can be larger or smaller than the "normal" values.
See this answer for additional information, including how to find out what the heap size actually is programmatically, and also how to distinguish between the absolute heap size limit on the one hand and the heap limit that you should ideally respect, on the other:
Detect application heap size in Android
To detect what your present heap utilization is, you could try using the Runtime class' totalMemory() method. However, I've read reports that different versions/implementations of the Android OS may have different policies regarding whether native memory (from which the backing memory for bitmaps is allocated) is counted against the heap's maximum or not. And, since version 3.0, the native memory is directly taken from the application's own heap.
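For concreteness, the calculation in question looks roughly like this (with all the caveats just mentioned about what is or isn't counted):

// Sketch: current heap footprint versus the VM's hard limit.
public class HeapUsage {
    public static long usedHeapBytes() {
        Runtime rt = Runtime.getRuntime();
        return rt.totalMemory() - rt.freeMemory();
    }

    public static long remainingHeapBytes() {
        Runtime rt = Runtime.getRuntime();
        return rt.maxMemory() - usedHeapBytes();
    }
}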
The iffiness of this calculation makes me think that it is a mistake to monitor your app's usage of memory at runtime, constantly comparing it to the amount available. Also, if you are in the middle of an involved computation, and find that you're running out of memory, it is not always convenient or reasonable to cancel that computation, and it may create a bad experience for your users if you do.
Instead, you might try preemptively defining certain modes, or constraints, upon your app's functional behavior that will ensure that it comes in under whatever your current device's relevant heap limits are (as detected during your app's initialization).
For example, if you have an app that uses a large list of words that must be loaded into memory all at once, then you might constrain your app so that for smaller heap limits only a smaller list of the more common words can be loaded, while for larger heap limits a full list containing many more words can be loaded.
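As an illustration only (the thresholds and word counts are made up), such a constraint could be chosen once at startup from the memory class:

import android.app.ActivityManager;
import android.content.Context;

// Hypothetical sketch: pick a data-set size up front based on the memory class,
// instead of monitoring heap usage while the app runs.
public class WordListConfig {
    public static int maxWords(Context context) {
        ActivityManager am =
                (ActivityManager) context.getSystemService(Context.ACTIVITY_SERVICE);
        int memoryClassMb = am.getMemoryClass();
        if (memoryClassMb <= 24) {
            return 20000;    // smaller list of the most common words
        } else if (memoryClassMb <= 48) {
            return 80000;
        } else {
            return 200000;   // full list
        }
    }
}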
There are also Java programming techniques that allow you to declare certain memory to be reclaimable by the garbage collector on demand, even if it has existing "soft" (rather than hard) references. If you have data that you would like to keep in memory, but which can be re-loaded from non-volatile storage if required (i.e., a cache), then you might consider using soft references to have such memory automatically freed when your app starts bumping against the upper limits of your heap. See this page for info on soft references in Android:
http://developer.android.com/reference/java/lang/ref/SoftReference.html
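A bare-bones sketch of such a soft-reference cache (the value type and the reload step are up to the app; here get() simply returns null when the entry has been reclaimed):

import java.lang.ref.SoftReference;
import java.util.HashMap;
import java.util.Map;

// Sketch: entries may be reclaimed by the GC under memory pressure; callers
// reload the data from non-volatile storage when get() returns null.
public class SoftCache<K, V> {
    private final Map<K, SoftReference<V>> map = new HashMap<>();

    public V get(K key) {
        SoftReference<V> ref = map.get(key);
        return (ref != null) ? ref.get() : null;
    }

    public void put(K key, V value) {
        map.put(key, new SoftReference<V>(value));
    }
}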