Android memory issue - app uses a lot of RAM

I have a problem with the memory usage of my Android app and don't know what causes it. When I start the app, it uses up to 40 MB of RAM (according to DDMS), and when I open another app, mine gets killed immediately.
I have read a lot about memory leaks; I'm unbinding drawables, running the GC and so on, but my app still uses a lot of memory.
I have about 3 MB of resources in my app, but AFAIK they are loaded into RAM on demand. Am I wrong? Could this cause the 40 MB of RAM usage?
EDIT: I don't think I have memory leaks, because I can switch the orientation of each activity as often as I want and the app does not crash from low memory. So it can't be a memory leak, can it?

You need to do memory management in your Android application: free resources that are no longer used. Override the onPause(), onStop() and onDestroy() methods of Activity, which follow the activity lifecycle.
In onDestroy(), release all the resources you are holding, so that other apps can use that memory again.
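As a minimal sketch of that idea (the field name here is illustrative, not from the original question):

import android.app.Activity;
import android.graphics.Bitmap;

public class ExampleActivity extends Activity {
    // Illustrative heavy resource; your app's resources will differ.
    private Bitmap backgroundBitmap;

    @Override
    protected void onDestroy() {
        super.onDestroy();
        // Release large objects so the garbage collector can reclaim them.
        if (backgroundBitmap != null) {
            backgroundBitmap.recycle();
            backgroundBitmap = null;
        }
    }
}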

What data structures are you using? Very large data structures (long Lists, big graphs, big maps, etc) can quickly use up RAM.
It could also be that you're leaking the Context on orientation change in your app.
It could also be that your layouts are badly designed (for example, deeply nested) on top of some heavy data structures.
It's difficult to tell unless you describe a bit more about what your app tries to do.
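Regarding the Context-leak possibility above: a typical orientation-change leak looks something like this (the drawable resource name is made up):

import android.app.Activity;
import android.graphics.drawable.Drawable;
import android.os.Bundle;
import android.widget.TextView;

public class LeakyActivity extends Activity {
    // A static reference like this survives orientation changes; the drawable
    // keeps a callback to the old Activity, so the whole Context stays in memory.
    private static Drawable sBackground;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        TextView label = new TextView(this);
        if (sBackground == null) {
            sBackground = getResources().getDrawable(R.drawable.large_background); // made-up resource
        }
        label.setBackgroundDrawable(sBackground);
        setContentView(label);
    }
}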

Free RAM in Android

I'd like some simple code that frees memory that is no longer needed, similar to what many memory-freeing apps do.
Yes, I'm aware that this shouldn't be necessary because Android manages memory on its own, but it looks like what is causing the undesired behaviour in my app is having a lot of open apps occupying memory, so I think it's worth trying this and checking whether the error still happens.
Could anyone hand me such code? I'm not able to find any.
What I gather from the article is that you don't need to do anything to reclaim memory, but you can make garbage collection happen sooner and at specific times. What this means to me is that any arrays, Lists, large objects, etc. should be set to null when you are done with them. Granted, this should happen automatically when you leave a method or a View, but if you are in a long-running loop or staying on a page with lots of data hanging around, you can clean it up a little faster.
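A toy sketch of that idea (the sizes are made up):

import java.util.ArrayList;
import java.util.List;

class EarlyCleanupExample {
    void handleData() {
        List<byte[]> chunks = new ArrayList<byte[]>();
        for (int i = 0; i < 10; i++) {
            chunks.add(new byte[1024 * 1024]); // simulate ~10 MB of data
        }
        // ... work with chunks here ...
        chunks = null; // drop the reference early so the GC can reclaim the memory
        // ... long-running work continues without holding the data ...
    }
}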
The Android Runtime (ART) and Dalvik virtual machine use paging and memory-mapping (mmapping) to manage memory. This means that any memory an app modifies—whether by allocating new objects or touching mmapped pages—remains resident in RAM and cannot be paged out. The only way to release memory from an app is to release object references that the app holds, making the memory available to the garbage collector. That is with one exception: any files mmapped in without modification, such as code, can be paged out of RAM if the system wants to use that memory elsewhere.
https://developer.android.com/topic/performance/memory-overview
You can also check your memory usage to see if that's really the problem. This is linked in the article above, but I thought I'd pop it out so it's easier to notice.
https://developer.android.com/reference/android/app/ActivityManager.html#getMemoryClass()
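For example, a minimal check of the per-app heap limit might look like this (a sketch; the log tag is arbitrary):

import android.app.Activity;
import android.app.ActivityManager;
import android.content.Context;
import android.os.Bundle;
import android.util.Log;

public class MemoryCheckActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        ActivityManager am = (ActivityManager) getSystemService(Context.ACTIVITY_SERVICE);
        // Approximate heap limit (in megabytes) the system grants this app.
        int memoryClassMb = am.getMemoryClass();
        Log.d("MemoryCheck", "Memory class: " + memoryClassMb + " MB");
    }
}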

Android clear the back stack if out of memory

When running on a Huawei G300 with Gingerbread, my app crashes after 5 minutes or so of usage, during setContentView(), as it runs out of memory.
Each individual page doesn't use much memory, but from some research it seems the memory accumulates in the back stack.
Following advice here, I've replaced all my calls to startActivity with a utility function that also calls finish().
Android: Clear the back stack
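Such a helper might look roughly like this (a sketch, not the asker's actual code):

import android.app.Activity;
import android.content.Intent;

public final class Navigation {
    private Navigation() {}

    // Starts the next Activity and finishes the current one so it does not
    // stay on the back stack holding memory.
    public static void startAndFinish(Activity current, Class<? extends Activity> next) {
        current.startActivity(new Intent(current, next));
        current.finish();
    }
}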
This works, but there is no back stack any more - the back button immediately quits the app, which isn't what I wanted.
Is there a way to only finish() activities when I actually do run out of memory, and is that a reasonable approach to take?
You should search for memory leaks. A good tool for that is MAT if you use Eclipse. MAT is not that hard to use, and you can quickly get some very valuable information.
One of the most common mistakes I have seen on Android is keeping a reference to a Context that is dead, for instance a singleton holding a reference to one of the activities you created. There is no real reason for a well-coded app to run out of memory.
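As an illustration of that mistake and one common fix, a long-lived singleton should hold the application Context rather than an Activity (class and field names are made up):

import android.content.Context;

public final class Session {
    private static Session instance;
    private final Context appContext;

    private Session(Context context) {
        // Keep the application Context, not an Activity, so a finished
        // Activity is not kept alive by this long-lived object.
        this.appContext = context.getApplicationContext();
    }

    public static synchronized Session get(Context context) {
        if (instance == null) {
            instance = new Session(context);
        }
        return instance;
    }
}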
The Android Activity Manager was designed to manage this exact problem. The OS is designed to kill activities in the background and then restore them using onSaveInstanceState and onRestoreInstanceState.
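A small sketch of that mechanism (the state key and field are illustrative):

import android.app.Activity;
import android.os.Bundle;

public class DetailActivity extends Activity {
    private int selectedItem;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        if (savedInstanceState != null) {
            // Restore lightweight UI state after the system killed and recreated the Activity.
            selectedItem = savedInstanceState.getInt("selected_item", 0);
        }
    }

    @Override
    protected void onSaveInstanceState(Bundle outState) {
        super.onSaveInstanceState(outState);
        // Persist lightweight UI state so the Activity can be killed
        // under memory pressure and rebuilt later.
        outState.putInt("selected_item", selectedItem);
    }
}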
The fact that your app is accumulating memory usage over time suggests that you have a Context leak somewhere (a static reference to a view, adapter, etc. that holds a reference to a Context), or a caching mechanism that doesn't adjust to your memory heap, or some other situation causing the out-of-memory condition.
I highly doubt that it's the activities in the back stack causing the out-of-memory error.
Here's a great guide on tracking down memory leaks on Android:
http://android-developers.blogspot.com/2011/03/memory-analysis-for-android.html
Memory is a very tricky subject in Android.
Every app gets a heap memory limit depending on the device. This heap memory is the Dalvik memory plus the native memory, and you can see it as the total column in the dumpsys meminfo results. The Dalvik heap deals with everything except bitmaps, which are allocated in the native memory (this is true for Android versions before Honeycomb).
Having said that I can only answer to some of your questions:
As far as I know, Android will always allocate memory for Bitmaps, even if they are the same. Therefore, in your case, every activity allocates memory for your background.
I don't know if it's better to work with themes; you'll have to try that.
On one hand, activities are not reclaimed while the device has enough memory to deal with the next activity. Every activity is pushed onto a stack from which it is recovered when you press the back button. If Android needs more memory, it removes an activity from the stack and deallocates its memory (going back to question one, maybe this is the reason the memory is not shared). On the other hand, you can set the activity's launchMode to change this behaviour (have a look here).
I think MAT doesn't show native memory data. Use the native column of dumpsys meminfo to see how much memory is allocated for your bitmaps.
I have had a hard time dealing with OutOfMemory problems myself. Now I have a much clearer idea of how it works and I am able to work with large files without running out of memory. I would highly recommend these two resources that helped me a lot:

Android Heap Fragmentation Strategy?

I have an OpenGL Android app that uses a considerable amount of memory to set up a complex scene and this clearly causes significant heap fragmentation. Even though there are no memory leaks it is impossible to destroy and create the app without it running out of memory due to fragmentation. (Fragmentation is definitely the problem, not leaks)
This causes a major problem since Android has a habit of destroying and creating activities on the same VM/heap which obviously causes the activity to crash. As a strategy to counter this I have used the following technique:
@Override
protected void onStop() {
    super.onStop();
    if (isFinishing()) {
        System.runFinalizersOnExit(true);
        System.exit(0);
    }
}
This ensures that when the activity is finishing it causes a complete VM shutdown and therefore next time the activity is started it gets a fresh unfragmented heap.
Note: I realise that this is not the "Android way" but given that the garbage collector is non-compacting it is impossible to continuously re-use the heap.
This technique does actually work in general; however, it doesn't work when the activity is destroyed in a non-finishing mode and then re-created.
Has anyone got any good suggestions about how to handle the degradation of the heap?
Further note: reducing memory consumption is not really an option either. The activity doesn't actually use that much memory, but the heap (and native heap) seem to get fragmented easily, probably due to some largish memory chunks.
Fragmentation is almost always a consequence of an ill-conditioned allocation pattern: large objects are frequently created and destroyed while smaller objects persist (or at least have a different lifetime), so holes are created in the heap.
The only working prevention of fragmentation in such scenarios is to avoid that specific allocation pattern. This can often be done by pooling the large objects. If successful, the application will usually reward this with much better execution speed as well.
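A minimal sketch of such a pool (the buffer size is an arbitrary assumption):

import java.util.ArrayDeque;
import java.util.Deque;

// Reuses fixed-size buffers instead of repeatedly allocating and freeing
// large objects, which is what fragments the heap.
public final class BufferPool {
    private static final int BUFFER_SIZE = 1024 * 1024; // 1 MB, illustrative
    private final Deque<byte[]> free = new ArrayDeque<byte[]>();

    public synchronized byte[] acquire() {
        byte[] buffer = free.poll();
        return buffer != null ? buffer : new byte[BUFFER_SIZE];
    }

    public synchronized void release(byte[] buffer) {
        if (buffer != null && buffer.length == BUFFER_SIZE) {
            free.push(buffer);
        }
    }
}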
Edit, more specific to your question: if the heap is not empty after a restart of the application, what is left on it? You confirmed that it's not a memory leak, but that is what it looks like. Since you are using OpenGL, could it be that some native wrappers have survived because the OpenGL resources have not been disposed of properly?

How to implement an in-memory image cache in Android?

Until now I have been using a SoftReference cache for images in Android. This cache is used for images shown in ListViews and should help keep images of items that are not currently on screen in memory, if there is enough memory left.
The problem is that SoftReferences are garbage collected almost the moment the last hard reference is released. The result is that an image removed from the screen is collected right away, and if the user scrolls back to that entry in the ListView the image is reloaded from internal phone storage through a complex lazy-loading process, resulting in frequent redraws of the list and generally bad performance.
The bug report for the SoftReference behaviour states that this is intended and that you should use an LRU cache for this kind of caching.
Now to my question: the LRU cache will only take as much memory as I allow it, but if the app needs a lot of memory it will not free any. How should I determine how much memory I can allow the cache to use, and is there a way to reduce the size of the cache if the memory situation of the phone becomes tight?
At the moment the image cache is kept inside the application as a kind of global image store for all activities. This means my app would constantly use all the memory of the image cache, even when my activities are in the background or destroyed.
Keeping background processes alive is an OS-level optimization that makes switching back to your process fast. Your process will stay alive only as long as the OS can afford the memory; when that memory is needed for another application your process will be killed and its resources will be released.
If you free your cache each time your process is backgrounded, switching back to your application would no longer be fast because it would see cache-misses. This defeats Android's keep-background-processes-alive optimization!
You should just use LruCache, which is also included in the Android Support Package for releases prior to Android 3.0 (Honeycomb).
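A hedged sketch of sizing such a cache from the device's memory class (the one-eighth fraction and the String key type are common conventions, not requirements; use the support-library LruCache on pre-Honeycomb):

import android.app.ActivityManager;
import android.content.Context;
import android.graphics.Bitmap;
import android.util.LruCache;

public final class BitmapCache {
    private final LruCache<String, Bitmap> cache;

    public BitmapCache(Context context) {
        ActivityManager am = (ActivityManager) context.getSystemService(Context.ACTIVITY_SERVICE);
        // Use roughly one eighth of the per-app heap limit for the cache.
        int cacheSizeBytes = am.getMemoryClass() * 1024 * 1024 / 8;
        cache = new LruCache<String, Bitmap>(cacheSizeBytes) {
            @Override
            protected int sizeOf(String key, Bitmap value) {
                // Measure entries in bytes so maxSize is meaningful.
                return value.getRowBytes() * value.getHeight();
            }
        };
    }

    public Bitmap get(String key) {
        return cache.get(key);
    }

    public void put(String key, Bitmap bitmap) {
        cache.put(key, bitmap);
    }
}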

Android Service Memory Usage

I have an Android app with a running service.
When I look in the "Running Apps" menu in Android settings, I see that my app's memory usage is between 9 and 16 MB.
I used the DDMS Allocation Tracker to see where this is coming from, but all of the objects were less than 500 bytes.
Does it make sense? Any other ways to track my app's memory usage?
Also, I have an SQLite database open as long as the service is running. Does that have an impact on memory as well?
Thanks.
Does it make sense?
It neither makes sense nor doesn't make sense. You can get to "9-16MB" by increments of 500 as easily as you can get there by increments of 5000. Also, AFAIK that allocation tracker does not track everything (e.g., bitmaps on pre-3.0 environments).
Any other ways to track my app's memory usage?
Dump your heap (e.g., using the Dump HPROF File toolbar button in DDMS) and examine the results with the MAT plugin for Eclipse. There was a presentation on this at the 2011 Google I/O conference; the YouTube video is online. You can use this to track down memory leaks.
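If you prefer to trigger the dump from code, something along these lines should also work (the output path is an illustrative assumption; convert the file with hprof-conv before opening it in MAT):

import android.os.Debug;
import java.io.IOException;

final class HeapDumper {
    // Writes an HPROF snapshot of the current process heap to the given path.
    static void dumpHeap(String outputPath) {
        try {
            Debug.dumpHprofData(outputPath); // e.g. a file under getExternalFilesDir(null)
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}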
Does that have an impact on memory as well?
Some, I'm sure.
Another issue is the service itself. Your objective should be to have that service in memory as little as possible, and only while it is actively delivering value to the user. Ideally, your service should be destroyed (not running) ~99% of the time.
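For example, a service that does one unit of work and then stops itself (the work method is a placeholder; real work should be moved off the main thread):

import android.app.Service;
import android.content.Intent;
import android.os.IBinder;

public class OneShotService extends Service {
    @Override
    public int onStartCommand(Intent intent, int flags, int startId) {
        doWork();   // placeholder for the actual, short-lived task
        stopSelf(); // leave memory as soon as the work is done
        return START_NOT_STICKY;
    }

    private void doWork() {
        // real work goes here (ideally on a background thread)
    }

    @Override
    public IBinder onBind(Intent intent) {
        return null; // not a bound service
    }
}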
