I have a bunch of little files in my assets which need to be copied to the SD card on the first start of my app. The copy code I got from here, placed in an IntentService, works like a charm. However, when I start copying many little files, the whole app becomes incredibly slow (I'm not really sure why, by the way), which is a really bad experience for the user on first start.
Since I noticed that other apps run normally during that time, I tried to start a child process for the service, which didn't work, as (as far as I understood) I can't access my assets from another process. Does anybody out there have an idea how
a) to copy the files without blocking my app
b) to get through to my assets from a private process (process=":myOtherProcess" in Manifest)
or
c) to solve the problem in a completely different way?
Edit:
To make this clearer: the copying already takes place in a separate thread (started automatically by the IntentService). The problem is not separating out the task of copying, but that copying in a dedicated thread somehow affects the rest of the app (e.g. by blocking too many app-specific resources?) while not affecting other apps (so it's not hogging the whole CPU or something).
Edit2:
Problem solved, it turns out, there wasn't really a problem. See my answer below.
I suggest you create a separate thread to do the work. Or, more simply, an AsyncTask!
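A minimal sketch of that, assuming the files live in an assets subfolder named "myfiles", the target folder on external storage has the same name, and the task is an inner class of an Activity (all of these are illustrative assumptions, not details from the question):

    // Sketch: copy asset files off the UI thread with an AsyncTask.
    private class CopyAssetsTask extends AsyncTask<Void, Void, Void> {
        @Override
        protected Void doInBackground(Void... params) {
            AssetManager assets = getAssets();   // Context method of the enclosing Activity
            File targetDir = new File(Environment.getExternalStorageDirectory(), "myfiles"); // assumed target
            targetDir.mkdirs();
            try {
                for (String name : assets.list("myfiles")) {       // "myfiles" is an assumed assets subfolder
                    try (InputStream in = assets.open("myfiles/" + name);
                         OutputStream out = new FileOutputStream(new File(targetDir, name))) {
                        byte[] buffer = new byte[8192];
                        int len;
                        while ((len = in.read(buffer)) > 0) {
                            out.write(buffer, 0, len);
                        }
                    }
                }
            } catch (IOException e) {
                Log.e("CopyAssets", "copy failed", e);
            }
            return null;
        }
    }

Started with new CopyAssetsTask().execute(), the copy then runs on AsyncTask's background thread.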
Sorry for this, it turns out you actually can use the assets in a child process. I've got no idea why it didn't work the first time I tried it. So the answer to my question is actually (b): create a child process for the IntentService, access the assets through getApplicationContext().getAssets(), and there you go. It now runs satisfyingly fast. Thanks for trying to help.
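For reference, a minimal sketch of that setup (the service name and the copy details are placeholders; the manifest attribute is the one from the question):

    // Declared in the manifest with: <service android:name=".AssetCopyService" android:process=":myOtherProcess" />
    public class AssetCopyService extends IntentService {
        public AssetCopyService() {
            super("AssetCopyService");
        }

        @Override
        protected void onHandleIntent(Intent intent) {
            AssetManager assets = getApplicationContext().getAssets(); // works from the child process
            // ... copy the asset files to the SD card here, as in the original copy code
        }
    }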
Related
I'm creating an Android app that needs to download a set of images and audio files in order to work properly. Those files will be updated on the server from time to time. On startup, the app will check for updated files and download them.
My concern: The updates/downloads of those files must be completed in an atomic fashion, meaning that the update is successful only if all files have been downloaded. If one file failed to download (reason being poor internet connection, insufficient storage space on the phone, etc), the update should be rolled back.
I feel that implementing something like that from scratch could be a pretty big task, so I first wanted to ask if there's already a library/module, or at least a best practice for implementing something like this.
I feel that implementing something like that from scratch could be a pretty big task.
I disagree:
Your app can create a temporary private folder (e.g. -tmp-update-20190321_122800) and download all files into this folder.
After all downloads have finished without error, the app can replace the original files.
So instead of searching for a transactional multi-file download lib, you can use any Android download lib; you just have to know how to copy/move/delete files.
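A rough sketch of that flow (the download and delete helpers, the file list, and the folder names are all hypothetical; the code is assumed to run in a Context subclass):

    // Sketch: download everything into a temp folder, then swap it in only if every download succeeded.
    File updateDir = new File(getFilesDir(), "tmp-update-" + System.currentTimeMillis());
    updateDir.mkdirs();
    boolean allOk = true;
    for (String url : fileUrls) {                                      // fileUrls: your list of files to fetch
        if (!downloadTo(url, new File(updateDir, fileNameFor(url)))) { // hypothetical helpers
            allOk = false;
            break;
        }
    }
    if (allOk) {
        File liveDir = new File(getFilesDir(), "content");             // hypothetical live folder
        deleteRecursively(liveDir);                                    // hypothetical helper
        updateDir.renameTo(liveDir);                                   // cheap move on the same filesystem
    } else {
        deleteRecursively(updateDir);                                  // roll back: discard the temp folder
    }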
I have not created any code yet on this topic, so I will just ask a question.
I am making an application in which the user will insert multiple paths, and in those paths/folders I will be searching for the N biggest files in every folder separately. I was thinking that I should use a new Thread for every path, but I am not sure if this is good practice or a good idea in general.
One little subquestion: should I use a TreeSet (red-black tree) to keep the files in some kind of order, or would a B-tree be better?
EDIT: By a new Thread I mean using multiple threads, with AsyncTask or something like that. A rough sketch of what I have in mind is below.
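To illustrate (a sketch only: N, selectedDirs, and publishResult are placeholders, and the pool size is an arbitrary example), each folder is scanned on a worker thread and only its N largest files are kept in a bounded min-heap:

    // Sketch: scan each folder on a worker thread, keeping its N largest files in a PriorityQueue (min-heap).
    ExecutorService pool = Executors.newFixedThreadPool(4);
    for (final File dir : selectedDirs) {                              // selectedDirs: the user-supplied paths
        pool.submit(() -> {
            PriorityQueue<File> biggest =
                    new PriorityQueue<>(N, Comparator.comparingLong(File::length)); // smallest kept file on top
            File[] children = dir.listFiles();
            if (children == null) return;
            for (File f : children) {
                if (!f.isFile()) continue;
                biggest.offer(f);
                if (biggest.size() > N) biggest.poll();                // drop the smallest once we exceed N
            }
            publishResult(dir, new ArrayList<>(biggest));              // hypothetical callback to the UI thread
        });
    }
    pool.shutdown();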
I have been jumping around the various posts on this site related to this question, but none of them really answer exactly what I am looking for. So if you find or know of a post that I may have missed, then I am sorry ahead of time; please post the link.
I am trying to take .png files (around 1000+ of them) and somehow install them with the app. I want to access them on the fly, and I thought of using SQLite to achieve this. I just do not really understand how I would go about "packaging" them with the app or where to store them in the app's structure.
Main Points...
access files on the fly (view them at run time like a slideshow)
add them to SQLiteDatabase for ease of access
the images will not be edited and if they are then I will add them to a separate database
all the images will be added once at installation or when onCreate() is first called, and will not be changed, rearranged, added to, or anything else after this initial .put()
this is all in Java/xml
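For concreteness, here is a rough sketch of the kind of one-time import I have in mind (the table name, column names, and assets subfolder are made up, and it would have to run off the UI thread):

    // Rough sketch: read each bundled .png from assets and store it as a BLOB row in SQLite.
    void importImages(Context context, SQLiteDatabase db) throws IOException {
        AssetManager assets = context.getAssets();
        for (String name : assets.list("images")) {                   // "images" is an assumed assets subfolder
            try (InputStream in = assets.open("images/" + name);
                 ByteArrayOutputStream buf = new ByteArrayOutputStream()) {
                byte[] chunk = new byte[8192];
                int len;
                while ((len = in.read(chunk)) > 0) {
                    buf.write(chunk, 0, len);
                }
                ContentValues row = new ContentValues();
                row.put("name", name);
                row.put("data", buf.toByteArray());                   // stored as a BLOB
                db.insert("images", null, row);                       // "images" table is hypothetical
            }
        }
    }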
Thank you ahead of time and sorry for my ignorance on this matter.
I am trying to process multiple file-writing commands (to separate files) without hanging the UI.
To put my question in context, imagine the following
My main application looks like a file manager. It currently sees 10 files each of about 5MB in size. (Don't worry about how this list works etc.)
When I select one file item, I want it to immediately start copying/duplicating the file onto another location on the SD card. Typically this should take a few seconds.
I want to be able to select a second, or a third, etc. file immediately after the first. At the end of everything, all the files which I selected will have been duplicated. So I could, for example, click 5 files within 5 seconds, while all the copying takes a minute.
At this moment two options have come to my mind:
The first is to simply put the file-writing commands for each file in a separate thread. In rough code it would look like this:
    // inside onClick: spawn a worker thread for the actual file write
    new Thread(() -> writeFile(selectedFile)).start();   // writeFile/selectedFile stand in for the real copy code
If this works, there can be scenarios where I have 10 threads running simultaneously, writing to 10 separate files. I would like to know if anyone has done this before, and what I should be looking out for.
The second option, of course, is that there is already some data structure or known method that addresses this problem: some sort of pending-queue system that grows as I add requests and shrinks as the system writes the data away.
I'm not absolutely sure how the SD card works, but I can tell you that trying to write to a single hard disk in parallel is a bad idea, because it will actually slow performance down compared to a sequential write.
You may want to try using an ExecutorService and measuring the performance with different thread counts but I'm afraid you will end up having to implement a queue with a single thread taking the queued files and writing them one by one.
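A minimal sketch of that single-writer queue (the copy helper and the method it lives in are hypothetical):

    // Sketch: one worker thread writes files strictly one at a time; selecting a file just enqueues work.
    private final ExecutorService writeQueue = Executors.newSingleThreadExecutor();

    void onFileSelected(final File source, final File destination) {
        writeQueue.submit(() -> copyFile(source, destination));   // copyFile is a hypothetical helper
    }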
I would create an AsyncTask class that simply copies a given file over. Each time a file is selected, create a new instance of that class for the selected file. The thread management built into Android for AsyncTask is well balanced and should handle this use case nicely. It will be easy to provide feedback for progress and completion using the built-in AsyncTask methods.
I think the classes in java.util.concurrent are what you need; specifically the Executors class to create a ThreadPoolExecutor. It has the benefit of accepting as many tasks as the user clicks but limiting the number of threads to some limit you specify. (Spawning threads without limit can be a problem, with the threads slowing down not only each other but the UI as well.)
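A sketch of that (the pool size and the copy helper are illustrative); tasks beyond the thread limit simply wait in the executor's internal queue:

    // Sketch: bound the number of concurrent writers instead of spawning a thread per click.
    private final ExecutorService writers = Executors.newFixedThreadPool(2);   // 2 is an arbitrary example limit

    void onFileSelected(final File source, final File destination) {
        writers.submit(() -> copyFile(source, destination));                   // copyFile is a hypothetical helper
    }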
I've got an app that is heavily based on remote images. They are usually displayed alongside some data in a ListView. A lot of these images are new, and a lot of the old ones will never be seen again.
I'm currently storing all of these images on the SD card in a custom cache directory (à la evancharlton's magnatune app).
I noticed that after about 10 days, the directory totals ~30MB. This is quite a bit more than I expected, and it leads me to believe that I need to come up with a good solution for cleaning out old files... and I just can't think of a great one. Maybe you can help. These are the ideas that I've had:
Delete old files. When the app starts, start a background thread and delete all files older than X days. This seems to pose a problem, though: if the user actively uses the app and there are hundreds of files to delete, this could make the device sluggish.
After creating the files on the SD card, call new File("/path/to/file").deleteOnExit(). This will cause all files to be deleted when the VM exits (I don't even know if this method works on Android). This is acceptable because, even though the files need to be cached for the session, they don't need to be cached for the next session. It seems like this will also slow the device down if there are a lot of files to be deleted when the VM exits.
Delete old files, up to a max number of files. Same as #1, but only delete N files at a time. I don't really like this idea, and if the user is very active, it may never be able to catch up and keep the cache directory clean.
That's about all I've got. Any suggestions would be appreciated.
Don't delete them all at once. Delete one every few seconds or something, and the user may not notice.
The VM does not exit normally on Android, so deleteOnExit() will not be reliable.
See #1 above.
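A sketch of the paced deletion suggested for #1, assuming a hypothetical cacheDir field and an arbitrary 10-day age threshold:

    // Sketch: delete at most one stale cache file every few seconds instead of all at once.
    HandlerThread cleanupThread = new HandlerThread("cache-cleanup");
    cleanupThread.start();
    final Handler handler = new Handler(cleanupThread.getLooper());
    final long maxAgeMs = TimeUnit.DAYS.toMillis(10);                  // "older than X days", example value

    handler.post(new Runnable() {
        @Override
        public void run() {
            File[] files = cacheDir.listFiles();                       // cacheDir: your custom cache directory
            if (files == null) return;
            for (File f : files) {
                if (System.currentTimeMillis() - f.lastModified() > maxAgeMs) {
                    f.delete();
                    handler.postDelayed(this, 5000);                   // come back for the next one in 5 seconds
                    return;
                }
            }
        }
    });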
You might also consider using AlarmManager to schedule deletion work for the wee hours of the morning. This has a side benefit of a capped CPU hit -- anything that runs truly in the background is capped to ~10% of CPU, so this work will not impact the user even if the user is actually using the device at that hour. You will need to use a WakeLock to keep the device awake while you are deleting things. One possibility is to use my WakefulIntentService for this, as it solves the problem of keeping the device awake and having it do the deletion work off the main application thread.
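For the scheduling part, a sketch with a plain AlarmManager (the service name is a placeholder; if you use WakefulIntentService, the wake lock and dispatching the work off the main thread are handled for you, as described above):

    // Sketch: fire a cleanup service around 3 AM every day.
    Calendar threeAm = Calendar.getInstance();
    threeAm.set(Calendar.HOUR_OF_DAY, 3);
    threeAm.set(Calendar.MINUTE, 0);
    threeAm.set(Calendar.SECOND, 0);
    if (threeAm.getTimeInMillis() <= System.currentTimeMillis()) {
        threeAm.add(Calendar.DAY_OF_YEAR, 1);                          // already past 3 AM today, start tomorrow
    }

    AlarmManager alarms = (AlarmManager) context.getSystemService(Context.ALARM_SERVICE);
    PendingIntent cleanup = PendingIntent.getService(context, 0,
            new Intent(context, CacheCleanupService.class),            // CacheCleanupService is a placeholder
            PendingIntent.FLAG_UPDATE_CURRENT);
    alarms.setInexactRepeating(AlarmManager.RTC_WAKEUP, threeAm.getTimeInMillis(),
            AlarmManager.INTERVAL_DAY, cleanup);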