I have a custom file format in res/raw, which means it will get compressed in APK.
Assume my application is installed on the device. When I open the file via openRawResource(), does it completely uncompress the file in memory?
E.g. if it's a 3 MB file, will the whole 3 MB be uncompressed in memory? If I only need 10 KB of data from a certain offset (and I reach it via BufferedInputStream.skip()), will 3 MB still be consumed when openRawResource() is called?
UPDATE: unless I'm overlooking something, it does not seem to be uncompressing the whole file. Consider the following test.
I added a 30MB file of random data to res/raw. Its extension is not ogg, mp3, png or similar, so it will be compressed in the APK. In the code below, I seek to around the 19 MB mark and read about 117 KB from the file. And it worked, even though the uncompressed length is 30 MB. (As far as I've read, the 1 MB limit applies only to assets, not to raw resources.)
InputStream is = context.getResources().openRawResource(R.raw.test); // test is a 30MB test.txt
DataInputStream dis = new DataInputStream(new BufferedInputStream(is, 8192));
dis.skip(19555125); // note: skip() may skip fewer bytes than requested; check its return value in real code
byte[] testArr = new byte[117412];
dis.readFully(testArr); // read ~117KB starting at the skipped offset
// log only the first few hundred bytes of what was read
for (int i = 0; i < testArr.length - 117000; i++) {
    Log.w("LOG_TAG", "" + testArr[i]);
}
dis.close();
Yes, all "compressible" files are compressed. "Uncompressible" files
are certain image files, zip files, etc, that are already in a
compressed form and wouldn't benefit from compression.
Keep in mind the compression can be a problem, if you have a file larger
than 1M. The file will compress fine and the Android toolkit will
happily package it up, but on the phone it'll give an error when you
try to retrieve it, since the phone refuses to uncompress a file
larger than 1M. The workaround for this is to give the file an
extension of jpg or some such, so it doesn't get compressed
In the meantime, on the Android mailing list, Dianne Hackborn answered this question.
Quoting Dianne:
Prior to, I think, Gingerbread, it would entirely uncompress in memory, with a limit on the amount of memory it would use (I think 1 MB) above which the open would fail. On current versions of the platform the uncompression is streamed as you read it, with no limit on size.
My tests confirm this. I've just tested it on a Gingerbread device and on an ICS device, and it works fine. In the coming days I'll test it on a Froyo (Android 2.2) device as well and will update this post.
Of course, Dianne is certainly right, but since she says she thinks Gingerbread is the first version where this is supported, it won't hurt to test it on Froyo as well.
P.S.: thanks to CommonsWare for spotting the answer of Dianne.
Related
I am using File.length() to get the file size, and I have hardcoded the file sizes in an array in the app. If the downloaded file is not the same size as the recorded size, I delete it and ask the user to download it again.
Is this correct? If not, what is the best approach to validate a downloaded file?
Can the file length be different per device / Android version?
A downloaded file's length will not differ per device or Android version, unless you have written compression code or otherwise modify the file.
Now, as for the approach: yes, the approach is correct, but it's all about the underlying algorithm used. Algorithms vary, and some are more precise than others. You can read about File.length() on the Android Developers page.
Then decide whether that's what you want to use, or some other logic written by you or someone else.
You should verify the file's checksum; two files can have the same length but contain different data.
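For example, here is a minimal sketch of that kind of validation in plain Java, assuming you know the expected length and SHA-256 hash in advance (the class name, expectedLength and expectedSha256 are made-up names for illustration):

import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class DownloadValidator {
    // Returns true if the file has the expected length and the expected SHA-256 hash (hex-encoded).
    public static boolean isValid(File file, long expectedLength, String expectedSha256)
            throws IOException, NoSuchAlgorithmException {
        if (file.length() != expectedLength) {
            return false; // cheap size check first
        }
        MessageDigest digest = MessageDigest.getInstance("SHA-256");
        byte[] buffer = new byte[8192];
        FileInputStream in = new FileInputStream(file);
        try {
            int read;
            while ((read = in.read(buffer)) != -1) {
                digest.update(buffer, 0, read);
            }
        } finally {
            in.close();
        }
        StringBuilder hex = new StringBuilder();
        for (byte b : digest.digest()) {
            hex.append(String.format("%02x", b));
        }
        return hex.toString().equalsIgnoreCase(expectedSha256);
    }
}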
Can the file length be different per device / Android version?
If you are saving it as a binary file: no, it's plain data.
Background
Some of an app's files can only be stored in the res/raw or assets folders.
The two folders work in a very similar way. The res/raw folder lets you access files more easily, with all the other benefits of resource files, while the assets folder lets you access files regardless of their names and structure (including folders and subfolders).
The main idea of loading files is about the same for both. You simply have a choice of ease of use, depending on your needs.
The problem
I remember that a very long time ago I found some special behavior in both of these folders:
1. Each folder within the assets folder had a maximum number of files. I think it was about 500, but I'm not sure. I noticed this behavior a very long time ago.
2. Some said that files in the assets folder have a maximum file size (example here). I never saw such a restriction, not even on Android 2.3 at the time.
3. Some said (example here), and it's still believed even today (example here), that if you load a file from res/raw, it could take much more memory than if you loaded it from the assets folder.
What I've tried
For #1, I never needed that many files again after the project I worked on, and at the time we simply split the files into more folders.
For #2, as I wrote, I never noticed it anyway, even though I used much larger file sizes.
For #3, I tried to make a sample project that compares the memory usage of the two methods, and I didn't notice any difference (in memory usage or time to load) between them, certainly not a major one. Sadly I have only one device (Nexus 5X), and it runs quite a new Android version (8.1). It might be that starting from a specific Android version there is no difference between the two methods. Another reason is that it's harder to measure memory usage in Java because of the GC, and I've already noticed that on Android 8.x memory works a bit differently than before (written about it here).
I tried to read about the differences and restrictions above, but all I found were very old articles, so I think things might have changed since then.
The questions
Actually it's just one question, but I'd like to split it in case the answer is complex:
Are there any major or unique limitations or differences between using res/raw and assets folders?
Does reading a file from the assets folder (by creating an input stream from it) really take less memory than using res/raw? So much so that even one of the most appreciated developers (here) chooses it, even nowadays?
Did the above restrictions exist only up to a specific Android version, after which the two became identical, with no restrictions whatsoever (except of course the file naming for res/raw, but that's just how it works)?
If so, from which Android version do they work about the same?
Are there any major or unique limitations or differences between using res/raw and assets folders?
Currently, Android does not impose any maximum size limit on files in assets or in res/raw.
Android Documentation:
Arbitrary files to save in their raw form. To open these resources with a raw InputStream, call Resources.openRawResource() with the resource ID, which is R.raw.filename.
However, if you need access to original file names and file hierarchy, you might consider saving some resources in the assets/ directory (instead of res/raw/). Files in assets/ aren't given a resource ID, so you can read them only using AssetManager.
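For reference, a minimal sketch of how each one is opened (the resource name R.raw.mydata and the asset path "data/mydata.bin" are placeholders):

import android.content.Context;
import android.content.res.AssetManager;
import java.io.IOException;
import java.io.InputStream;

public class ResourceAccess {
    // res/raw: addressed via a generated resource ID (R.raw.mydata is a placeholder name).
    public static InputStream openFromRaw(Context context) {
        return context.getResources().openRawResource(R.raw.mydata);
    }

    // assets: addressed by path, so folder hierarchy and original file names are preserved.
    public static InputStream openFromAssets(Context context) throws IOException {
        AssetManager assets = context.getAssets();
        return assets.open("data/mydata.bin");
    }
}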
Does reading a file from the assets folder (by creating an input stream from it) really take less memory than using res/raw? So much so that even one of the most appreciated developers (here) chooses it, even nowadays?
No, I have not found any difference in memory usage. It is one of the biggest messes in Android right now, and there is no official documentation about memory limitations either.
Did the above restrictions exist only up to a specific Android version, after which the two became identical, with no restrictions whatsoever (except of course the file naming for res/raw, but that's just how it works)?
Before Android 2.3 there was a memory restriction of 1 MB for the assets folder. Please refer to the link.
If so, from which Android version do they work about the same?
From Android 2.3, which launched in December 2010, there is no memory-related restriction.
Well, I've been digging through an Android game's files trying to get the sprites and the like. I came across a folder called "raw", and inside were jpg files like imagelocal2.jpg (along with imagelocal2.list) and such. These files aren't valid images and can't be viewed normally, but they're big enough to contain many images inside them.
What I'm wondering is: is there some unknown JPG compression-like method where they manage to squish a bunch of files into one? I opened the files with a hex editor, but I couldn't make heads or tails of them (the fact that I have no experience with hex editors doesn't really help), so if anyone knows anything about how these files are compressed, please help.
There is no standard multi-image JPEG format; the file could be anything, regardless of its JPG extension. No competently written decoder would rely on the extension anyway.
You could take a look at the first few bytes and try to match the file signature.
http://en.wikipedia.org/wiki/List_of_file_signatures
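For example, a small sketch of such a check in Java, comparing the first bytes of a file against a few well-known signatures (the signature list here is deliberately minimal, not exhaustive):

import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.util.Arrays;

public class SignatureSniffer {
    private static final byte[] JPEG = { (byte) 0xFF, (byte) 0xD8, (byte) 0xFF };
    private static final byte[] PNG  = { (byte) 0x89, 'P', 'N', 'G' };
    private static final byte[] ZIP  = { 'P', 'K', 0x03, 0x04 };

    // Reads the first few bytes of a file and reports a likely type, or "unknown".
    public static String sniff(File file) throws IOException {
        byte[] header = new byte[8];
        FileInputStream in = new FileInputStream(file);
        try {
            if (in.read(header) < 4) {
                return "too short";
            }
        } finally {
            in.close();
        }
        if (startsWith(header, JPEG)) return "jpeg";
        if (startsWith(header, PNG))  return "png";
        if (startsWith(header, ZIP))  return "zip (or apk/jar)";
        return "unknown";
    }

    private static boolean startsWith(byte[] data, byte[] prefix) {
        return Arrays.equals(Arrays.copyOf(data, prefix.length), prefix);
    }
}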
I am working on an app, and one of the features I am working on is downloading binary files. Some of them are really big (more than several megabytes). Downloads complete fine as long as the file size is less than 2 GB.
I got stuck on a file that is 3.2 GB: I get progress updates (I am polling the DownloadManager for them), but when the download completes, the file is not present at the target file path. Interrogating the DownloadManager for that download id, I get STATUS_FAILED with reason ERROR_UNKNOWN, the most helpful error detail one could ever wish for!
What is weird is that this appears on most devices, but on some (like the Samsung SG 4 Active on OS 4.2.2 and the LG Nexus 5 on OS 4.4.2) it doesn't.
Doing some extra investigation, I found out that this seems to be a bug in the Android DownloadManager implementation. It seems the Android implementation stores the downloaded byte count as an int, so when that count goes above Integer.MAX_VALUE the download ends as failed.
I am thinking of replacing the DownloadManager with a foreground service, but I don't want to give up just yet...
Did you face this, and if so, how did you fix it?
Is there any workaround to use the DownloadManager on pre-4.2.2 devices so I can download more than 2.1 GB per file?
To download such large files, you need to download them in chunks. You can use any library that supports HTTP range requests, which lets you pull down a single file in multiple pieces, supports resuming, etc.
Or you can split your large file on the server and publish a text file with the MD5 hash of each piece. When you start the download, fetch the MD5 file first; once each piece finishes, check that its hash matches. If it does not, delete that piece and add it back to the queue of items to download.
Once all pieces are downloaded and the MD5 hashes match, you can put the pieces back together as a single file.
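As an illustration of the range-based approach, a minimal sketch using HttpURLConnection, assuming the server honours the Range header (the URL and offsets are placeholders):

import java.io.IOException;
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class RangeDownload {
    // Downloads bytes [start, end] of the resource; works only if the server supports range requests.
    public static byte[] downloadRange(String fileUrl, long start, long end) throws IOException {
        HttpURLConnection conn = (HttpURLConnection) new URL(fileUrl).openConnection();
        conn.setRequestProperty("Range", "bytes=" + start + "-" + end);
        try {
            // 206 Partial Content means the server honoured the Range header.
            if (conn.getResponseCode() != HttpURLConnection.HTTP_PARTIAL) {
                throw new IOException("Server did not return partial content");
            }
            byte[] chunk = new byte[(int) (end - start + 1)];
            InputStream in = conn.getInputStream();
            int offset = 0;
            while (offset < chunk.length) {
                int read = in.read(chunk, offset, chunk.length - offset);
                if (read == -1) break;
                offset += read;
            }
            return chunk;
        } finally {
            conn.disconnect();
        }
    }
}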
If you are planning to download the file to the SD card, note that FAT32 is the default file system there, and it has a 4 GB per-file limit.
From looking at the Android source code, it appears that this issue was resolved in JB-MR2.
It seems that the only way to work around this on older platform versions would be to modify the server so that it uses chunked transfer encoding [1] for these large resources. In that case, the DownloadManager will not attempt to parse the Content-Length header.
[1] http://en.wikipedia.org/wiki/Chunked_transfer_encoding
There is one clear outcome of this:
You cannot fix the DownloadManager; if there is a bug in it, the bug stays. Therefore, in short: no, you cannot work around this issue using the DownloadManager itself. You could, however, work around it with a server-side approach, as described in the other answers.
So I think your simplest solution would be to raise the minimum SDK level to JB-MR2, since, as #ksasq mentioned, this issue has been resolved there.
If that is not plausible or possible in your case, you can find the best file-download library out there and create an interface similar to DownloadManager's for it. Of course, this interface should be implemented to use the default DownloadManager on versions that do not have this bug, and the custom library on versions that do (and, if possible, only for files that trigger the issue).
Unfortunately, a Google search turned up yingyixu's android-download-manager, last updated in 2012.
Another unfortunate note on this topic by CommonsWare simply confirms that there is no DownloadManager in Google's support libraries. Worse, he gave up on implementing his own port because it was way too complicated. You can only hope that yingyixu's library, or some other library you find, is good enough.
You can avoid this issue by splitting the file into smaller zip files and then joining them on the target device; I've found ->this<- which might help you. If you do not compress the file (split-only option), performance should be good. Another issue is that you will need twice as much storage space; you can mitigate this by downloading smaller pieces of about 100 MB, writing each one to the joined file, and then removing it from the file system, which avoids wasting space.
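For the join step, a simple sketch that appends the downloaded parts to one target file and deletes each part as soon as it has been appended, to limit the extra storage needed (class and file names are placeholders):

import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.util.List;

public class PartJoiner {
    // Appends each part to the target file in order, deleting parts as it goes to save space.
    public static void join(List<File> parts, File target) throws IOException {
        FileOutputStream out = new FileOutputStream(target, true); // append mode
        try {
            byte[] buffer = new byte[8192];
            for (File part : parts) {
                FileInputStream in = new FileInputStream(part);
                try {
                    int read;
                    while ((read = in.read(buffer)) != -1) {
                        out.write(buffer, 0, read);
                    }
                } finally {
                    in.close();
                }
                part.delete(); // free the space as soon as this part has been appended
            }
        } finally {
            out.close();
        }
    }
}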
You could also take the fixed version of DownloadManager, change the package to your own package structure, and use this version instead of the system one. You may need to import some classes from the original android.app package. Then register your implementation as a service.
I'm developing for Android 2.2.
Files placed in res and assets are compressed in the APK by default, unless their extension indicates that they are already compressed (mp3, png). Moreover, before Android 2.3, a compressed asset larger than 1 MB could not be read.
Question 1:
If I put a 1.5 MB binary file into res/raw and my program refers to it by its standard Android ID (R.raw.....), will the system pull the whole file into memory? Since the 1.5 MB is stored compressed, I suppose it must. This is unpleasant, because the program may only need 1 KB of data loaded from a given file offset, and this can have a serious impact on app performance/speed.
I see two solutions:
1. (Hack) Use an mp3 or png extension; but I am not sure this allows memory-efficient access after all (i.e. InputStream.skip(), DataInputStream.skipBytes(), etc.).
2. After installation, at the first start of the app, copy the files to the app's own writable working folder (on the SD card), and from that point on always access the files from there instead of the R.raw... way. Sequential reads will then work for sure, i.e. memory usage will be no more than the data actually read from the specified file offset (apart from the temporary read buffers used by the stream's skip implementation, which I suppose are well optimized, i.e. small). A sketch of this copy step follows below.
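A minimal sketch of that copy step, assuming a raw resource ID and a target file name chosen by you (the names are placeholders; I use the internal files dir here for brevity, but getExternalFilesDir() would be the SD-card equivalent):

import android.content.Context;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class RawCopier {
    // Copies a raw resource to the app's own storage once, then returns the local file.
    public static File ensureLocalCopy(Context context, int rawResId, String fileName)
            throws IOException {
        File target = new File(context.getFilesDir(), fileName);
        if (target.exists()) {
            return target; // already copied on a previous start
        }
        InputStream in = context.getResources().openRawResource(rawResId);
        FileOutputStream out = new FileOutputStream(target);
        try {
            byte[] buffer = new byte[8192];
            int read;
            while ((read = in.read(buffer)) != -1) {
                out.write(buffer, 0, read);
            }
        } finally {
            in.close();
            out.close();
        }
        return target;
    }
}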
Question 2:
What is the best way to read binary data memory-efficiently from big files? I don't want to split my big files into many small ones unless that's the only way.
I'd go with solution #2 and would then use RandomAccessFile to avoid linear access.
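A sketch of that kind of access with RandomAccessFile, assuming the file already sits in local storage as in solution #2 (path, offset and length are placeholders):

import java.io.IOException;
import java.io.RandomAccessFile;

public class RandomRead {
    // Reads `length` bytes starting at `offset` without streaming through the preceding data.
    public static byte[] readAt(String path, long offset, int length) throws IOException {
        RandomAccessFile raf = new RandomAccessFile(path, "r");
        try {
            byte[] data = new byte[length];
            raf.seek(offset);
            raf.readFully(data);
            return data;
        } finally {
            raf.close();
        }
    }
}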
I would also opt for solution #2, but instead of using a RandomAccessFile I would use java.nio.MappedByteBuffer; this way you get fast random access with byte-buffer semantics.
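And a sketch of the memory-mapped variant, again assuming the file has already been copied to local storage (names are placeholders):

import java.io.IOException;
import java.io.RandomAccessFile;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;

public class MappedRead {
    // Maps only the requested region of the file and copies it out via the buffer.
    public static byte[] readMapped(String path, long offset, int length) throws IOException {
        RandomAccessFile raf = new RandomAccessFile(path, "r");
        try {
            FileChannel channel = raf.getChannel();
            MappedByteBuffer buffer = channel.map(FileChannel.MapMode.READ_ONLY, offset, length);
            byte[] data = new byte[length];
            buffer.get(data);
            return data;
        } finally {
            raf.close();
        }
    }
}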