I am moving files (around 100-2000 files, each 100-200 KB in size) on a background thread from one folder to another. All goes well, but on some Samsung and LG devices with an SD card, some or all of the files suddenly go missing after copying.
This does not happen every time, but roughly once in every 20 runs.
I have tried 3 techniques so far:
public void copyMethodA(File src, File dst) {
    if (!dst.exists()) {
        src.renameTo(dst); // note: the return value is not checked here
    }
}
copyMethodA() resulted in loss of files most of the time.
public void copyMethodB(File sourceFile, File destFile) throws IOException {
    if (!destFile.exists()) {
        destFile.createNewFile();
    }
    // declared outside the try block so they are in scope in finally
    FileChannel source = null;
    FileChannel destination = null;
    try {
        source = new FileInputStream(sourceFile).getChannel();
        destination = new FileOutputStream(destFile).getChannel();
        destination.transferFrom(source, 0, source.size());
    } finally {
        if (source != null) source.close();
        if (destination != null) destination.close();
    }
}
copyMethodB() resulted in loss of files, though less often than A.
public void copyMethodC(File src, File dst) throws IOException {
    InputStream in = new FileInputStream(src);
    OutputStream out = new FileOutputStream(dst);
    byte[] buf = new byte[10240];
    int len;
    while ((len = in.read(buf)) > 0) {
        out.write(buf, 0, len);
    }
    in.close();
    out.close();
}
copyMethodC() very rarely resulted in loss of files, hence it is the one I am currently using.
All 3 methods worked fine, without a single lost file, on the Xperia C and Nexus 5 (both using internal storage).
But loss of files was observed on the LG Optimus One (using an SD card) and some Samsung devices (using internal storage or an SD card).
Info about devices on which I have tested:
Nexus 5 - Android 4.4.2
Xperia C - Android 4.2.2
LG Optimus One - Android 2.3.3
Samsung devices - Android 4.0 and above
(I guess this problem isn't related to the version of Android used.)
I am avoiding huge third-party file I/O APIs because my Android app is only about 300 KB. Using a library like Apache Commons IO would bloat it to around 2.5 MB.
Is there any other safe, secure and better way to copy files?
Thanks.
I think you should do something like this:
You should check the return value of renameTo because, as the Javadoc states, there are many reasons for it to fail.
If the renameTo call fails, use the third way, with try / catch blocks to catch IOException when reading / writing the streams, and make sure you only delete the source file if the copy was successful. You can then inspect the exception to understand why the copy failed and possibly retry it later.
I'm doing this (with a 1024-byte buffer, like in @beni's answer below) to copy a few hundred files and I've never seen any loss.
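A minimal sketch of that approach, reusing copyMethodC from the question (the method name moveFile and the error handling here are only illustrative):

// Rename first; fall back to copy + delete only if the rename fails.
public boolean moveFile(File src, File dst) {
    if (src.renameTo(dst)) {
        return true; // fast path: atomic rename within the same filesystem
    }
    // renameTo can fail for many reasons (e.g. crossing onto the sdcard),
    // so fall back to a stream copy and delete the source only on success.
    try {
        copyMethodC(src, dst);
        return src.delete();
    } catch (IOException e) {
        dst.delete(); // remove a possibly half-written destination
        return false; // inspect e, and possibly retry later
    }
}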
I'm using this function and it always works. I tested this code on a Nexus 5. The in.read(buf) method returns -1 when it reaches the end of the file; otherwise it returns the number of bytes read. So, try this:
public static void copyFile(InputStream in, OutputStream out) throws IOException {
    byte[] buffer = new byte[1024];
    int read;
    while ((read = in.read(buffer)) != -1) {
        out.write(buffer, 0, read);
    }
}
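For completeness, a sketch of a call site; the helper deliberately leaves opening and closing the streams to the caller, and src / dst are the same placeholders as in the question:

InputStream in = null;
OutputStream out = null;
try {
    in = new FileInputStream(src);
    out = new FileOutputStream(dst);
    copyFile(in, out);
    out.flush(); // only matters if out is buffered, but harmless otherwise
} finally {
    if (in != null) in.close();
    if (out != null) out.close();
}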
In my app, I'm sending a file from a client using sockets. On the other side, another client receives the file using an InputStream, and then a BufferedOutputStream saves the file to the file system.
I don't know why, but the file isn't completely transmitted. I think this is because of network overload; in any case, I don't know how to solve it.
The transmitter is:
Log.d(TAG,"Reading...");
bufferedInputStream.read(byteArrayFile, 0, byteArrayFile.length);
Log.d(TAG, "Sending...");
bufferedOutputStream.write(byteArrayFile,0,byteArrayFile.length);
bufferedOutputStream.flush();
Receiver is:
bufferedOutputStream = new BufferedOutputStream(new FileOutputStream(file));
byteArray = new byte[fileSize];
int currentOffset = 0;
bytesReaded = bufferedInputStream.read(byteArray, 0, byteArray.length);
currentOffset = bytesReaded;
do {
    bytesReaded = bufferedInputStream.read(byteArray, currentOffset, (byteArray.length - currentOffset));
    if (bytesReaded >= 0) {
        currentOffset += bytesReaded;
    }
} while (bytesReaded > -1 && currentOffset != fileSize);
bufferedOutputStream.write(byteArray, 0, currentOffset);
You don't state where fileSize came from, but there are numerous problems with this code, too many to mention. Throw it all away and use DataInputStream.readFully(). Or use the following copy loop, which doesn't require a buffer the size of the file; that technique does not scale, assumes the file size fits into an int, and adds latency:
byte[] buffer = new byte[8192];
int count;
while ((count = in.read(buffer)) > 0)
{
out.write(buffer, 0, count);
}
Use this at both ends. If you're sending multiple files via the same connection it gets more complex, but you haven't stated that.
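If you do want the whole file in one array anyway, here is a sketch of the readFully() route. The int length prefix is an assumption about your protocol, since the receiver has to learn the size somehow; socket and data are placeholders:

// Sender: write the payload length, then the payload itself.
DataOutputStream dout = new DataOutputStream(socket.getOutputStream());
dout.writeInt(data.length);
dout.write(data);
dout.flush();

// Receiver: read the length, then block until every byte has arrived.
DataInputStream din = new DataInputStream(socket.getInputStream());
byte[] data = new byte[din.readInt()];
din.readFully(data); // loops internally until the array is full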
I must be doing something really wrong. Running the following code on Android produces a truncated file (_items_) without any exceptions or problems in the log. Running the same code on OpenJDK 7 decompresses the file correctly.
try {
    final InputStream fis = new GZIPInputStream(new FileInputStream("/storage/sdcard/_items"));
    try {
        final FileOutputStream fos = new FileOutputStream("/storage/sdcard/_items_");
        try {
            final byte[] buffer = new byte[1024];
            int n;
            while ((n = fis.read(buffer)) != -1) {
                fos.write(buffer, 0, n);
            }
        } finally {
            fos.close();
        }
    } finally {
        fis.close();
    }
} catch (final IOException e) {
    e.printStackTrace();
    throw new RuntimeException(e);
}
I've tried this with the Android emulator (API 18) and on a Desire HD (Android 2.3.5), with the same buggy result.
Input file (_items): https://drive.google.com/file/d/0B6M72P2gzYmwaHg4SzRTYnRMOVk/edit?usp=sharing
Android truncated output file (_items_): https://drive.google.com/file/d/0B6M72P2gzYmwMUZIZ2FEaHNZUFk/edit?usp=sharing
The AOSP bug has been updated with analysis by an engineer from the Dalvik team.
Summary: the gzip stream has multiple members concatenated together, and the decompressor is expected to process them all, but Dalvik's implementation is stopping after the first.
Unless there's a way to convince the data source to compress its streams differently, you will need to find a replacement for GZIPInputStream.
A workaround is to use the GZIPInputStream from JZlib (currently only in the concatenated_gzip_streams branch). See https://github.com/ymnk/jzlib/issues/12 for more details.
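Assuming that branch keeps the same class name and a compatible constructor (worth verifying before relying on it), the fix would be little more than an import swap:

import com.jcraft.jzlib.GZIPInputStream; // instead of java.util.zip.GZIPInputStream

final InputStream fis = new GZIPInputStream(new FileInputStream("/storage/sdcard/_items"));
// ...the copy loop from the question stays unchanged.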
I'm trying to write a very large file to another very large file. I'm getting this error on the FileChannel write line and I'm unsure why. I thought it was because I was going beyond the limits of the long data type, but long can go up to 9,223,372,036,854,775,807 and I'm only going up to 5,372,896,745 at most. Any ideas why this is occurring? Is there some limit that MappedByteBuffer has? This doesn't occur for smaller files, and I haven't run into any issues using the same code in a Java desktop application. (It only happens on Android.)
File f1 = new File(filename1);
FileChannel fic, foc;
long fsize;
MappedByteBuffer mBUf;

FileOutputStream out = new FileOutputStream(f1, true);
foc = out.getChannel();

File f2 = new File(filename2);
FileInputStream in = new FileInputStream(f2);
fic = in.getChannel();
fsize = fic.size();

for (long b = 0; b < fsize; b += 65536) {
    if (fsize - b < Resource.MEMORY_ALLOC_SIZE)
        mBUf = fic.map(FileChannel.MapMode.READ_ONLY, b, fsize - b);
    else
        mBUf = fic.map(FileChannel.MapMode.READ_ONLY, b, Resource.MEMORY_ALLOC_SIZE);

    foc.write(mBUf); // ERROR HERE!
}

fic.close();
in.close();
foc.close();
out.close();
Any ideas/feedback is appreciated!
Is there some limit that MappedByteBuffer has?
Of course there is. It is limited by the available virtual memory for a start, and after that by the virtual address space.
You should be using transferTo() for this task rather than MappedByteBuffers, as there is no agreed means of disposing of the virtual address space occupied by the latter.
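A minimal sketch of the transferTo() variant, reusing the question's fic and foc; note that transferTo() may move fewer bytes than requested in one call, hence the loop:

// Copy fic into foc without mapping anything into the address space.
long size = fic.size();
long position = 0;
while (position < size) {
    // transferTo returns how many bytes were actually transferred this call.
    position += fic.transferTo(position, size - position, foc);
}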
Unfortunately a long does not go that high on a 32-bit system (which I believe Android is, since it doesn't have over 4 GB of RAM). Therefore the maximum value of an unsigned long on Android would be 4,294,967,295, which means you are exceeding its limit.
I have encryption and decryption code which I use to encrypt and decrypt video files (mp4). I'm trying to speed up the decryption process, as the encryption one is not that relevant for my case. This is the code that I have for the decryption process:
private static void decryptFile() throws IOException, ShortBufferException, IllegalBlockSizeException, BadPaddingException
{
    int blockSize = cipher.getBlockSize();
    int outputSize = cipher.getOutputSize(blockSize);
    System.out.println("outputsize: " + outputSize);
    byte[] inBytes = new byte[blockSize];
    byte[] outBytes = new byte[outputSize];
    in = new FileInputStream(inputFile);
    out = new FileOutputStream(outputFile);
    BufferedInputStream inStream = new BufferedInputStream(in);
    int inLength = 0;
    boolean more = true;
    while (more)
    {
        inLength = inStream.read(inBytes);
        if (inLength == blockSize)
        {
            int outLength = cipher.update(inBytes, 0, blockSize, outBytes);
            out.write(outBytes, 0, outLength);
        }
        else more = false;
    }
    if (inLength > 0)
        outBytes = cipher.doFinal(inBytes, 0, inLength);
    else
        outBytes = cipher.doFinal();
    out.write(outBytes);
}
My question is how to speed up the decryption process in this code. I've tried decrypting a 10 MB mp4 file and it decrypts in 6-7 seconds; however, I'm aiming for under 1 second. Another thing I would like to know is whether my writing to the FileOutputStream out is actually what slows the process down, rather than the decryption itself. Any suggestions on how to go about speeding things up here?
I'm using AES for encryption/decryption.
Until I find a solution, I will be using a ProgressDialog which tells the user to wait until the video has been decrypted (obviously, I'm not going to use the word: decrypted).
Why are you decrypting data only in blockSize increments? You do not show what type of object cipher is, but I am guessing it is a javax.crypto.Cipher instance. It can handle update() calls over arrays of arbitrary length, and you will have much less overhead if you use longer arrays. You should process data in blocks of, say, 8192 bytes (that's the traditional length for a buffer; it interacts reasonably well with CPU caches).
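A sketch of that change, reusing cipher, inputFile and outputFile from the question; put it in a method declaring the same exceptions as the original decryptFile():

// Feed the cipher 8192 bytes at a time instead of one block at a time.
byte[] inBuf = new byte[8192];
byte[] outBuf = new byte[cipher.getOutputSize(8192)];
InputStream in = new BufferedInputStream(new FileInputStream(inputFile));
OutputStream out = new FileOutputStream(outputFile);
int n;
while ((n = in.read(inBuf)) != -1) {
    int outLen = cipher.update(inBuf, 0, n, outBuf);
    out.write(outBuf, 0, outLen);
}
out.write(cipher.doFinal()); // emit the final, possibly padded, block
in.close();
out.close();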
bytebiscuit, your question gave me the solution I had been trying to find for the past 6 days. I just modified your code a little bit, and my 52 MB video file gets decrypted in just 4 seconds. The previous decryption technique took 45 seconds, using different logic (not yours). That's a massive difference: 45 seconds to 4 seconds. Wherever I have made a modification I have put a //modified comment. I am sure if your video is a 10 MB video, it will get decrypted in 1 second for sure. Try applying this; it should work out.
private static void decryptFile() throws IOException, ShortBufferException, IllegalBlockSizeException, BadPaddingException
{
    int blockSize = cipher.getBlockSize();
    int outputSize = cipher.getOutputSize(blockSize);
    System.out.println("outputsize: " + outputSize);
    byte[] inBytes = new byte[blockSize * 1024]; // modified
    byte[] outBytes = new byte[outputSize * 1024]; // modified
    in = new FileInputStream(inputFile);
    out = new FileOutputStream(outputFile);
    BufferedInputStream inStream = new BufferedInputStream(in);
    int inLength = 0;
    boolean more = true;
    while (more)
    {
        inLength = inStream.read(inBytes);
        if (inLength / 1024 == blockSize) // modified
        {
            int outLength = cipher.update(inBytes, 0, blockSize * 1024, outBytes); // modified
            out.write(outBytes, 0, outLength);
        }
        else more = false;
    }
    if (inLength > 0)
        outBytes = cipher.doFinal(inBytes, 0, inLength);
    else
        outBytes = cipher.doFinal();
    out.write(outBytes);
}
I suggest you use the profiling tool provided in the Android SDK. It will tell you where you spend the most time (i.e., file writing or decoding).
See http://developer.android.com/guide/developing/debugging/debugging-tracing.html
This works on the emulator as well as on an actual device.
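For reference, a minimal tracing sketch around the suspect call; android.os.Debug writes the trace file to external storage, and the Traceview tool displays it:

android.os.Debug.startMethodTracing("decrypt"); // produces decrypt.trace on the sdcard
decryptFile(); // the code under investigation
android.os.Debug.stopMethodTracing();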
Consider using the NDK. On devices before Froyo (and even Froyo itself), it would be really slow due to the lack of a JIT (or a very simple one in Froyo). Even with the JIT, native architecture-optimized crypto code will always outrun Dalvik.
See also this question.
As an aside, if you're using AES directly, you're probably doing something wrong. If this is part of an effort to do DRM, make sure you realize the full extent of the fact that decompiling an Android app is trivial. Your key will not be secure, which by definition defeats the encryption.
Instead of spending effort improving an inadequate architecture, you should consider a streaming solution: it has the great advantage of spreading out the computation time for the decryption so that it is no longer noticeable. I mean: do not produce another file from your video source, but rather a stream, served by a local HTTP server. Unfortunately there is no such component in the SDK; you have to write your own implementation or search for an existing one.
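There is no ready-made server in the SDK, but a rough sketch of the idea fits in one class. Everything below is illustrative only: the class name, the missing Content-Length and range handling (which real players often require), and the assumption that cipher and encryptedFile come from the question.

import java.io.*;
import java.net.*;
import javax.crypto.Cipher;
import javax.crypto.CipherInputStream;

// Loopback server: the player reads http://127.0.0.1:<port>/ while we
// decrypt on the fly, so no second file is ever written to disk.
class LocalDecryptingServer extends Thread {
    private final ServerSocket server;
    private final File encryptedFile;
    private final Cipher cipher;

    LocalDecryptingServer(File encryptedFile, Cipher cipher) throws IOException {
        this.server = new ServerSocket(0, 1, InetAddress.getByName("127.0.0.1"));
        this.encryptedFile = encryptedFile;
        this.cipher = cipher;
    }

    int port() { return server.getLocalPort(); }

    @Override
    public void run() {
        try {
            Socket client = server.accept();
            // Discard the player's request line and headers.
            BufferedReader request = new BufferedReader(new InputStreamReader(client.getInputStream()));
            String line;
            while ((line = request.readLine()) != null && !line.isEmpty()) { }

            OutputStream out = client.getOutputStream();
            out.write("HTTP/1.0 200 OK\r\nContent-Type: video/mp4\r\n\r\n".getBytes("US-ASCII"));

            // CipherInputStream decrypts lazily, as the player consumes bytes.
            InputStream in = new CipherInputStream(new BufferedInputStream(new FileInputStream(encryptedFile)), cipher);
            byte[] buffer = new byte[8192];
            int n;
            while ((n = in.read(buffer)) != -1) {
                out.write(buffer, 0, n);
            }
            in.close();
            client.close();
        } catch (IOException ignored) {
            // A real implementation would log and clean up properly.
        }
    }
}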
Well, I've been diving into the murky waters of low-level Android programming (native C/C++ using the CodeSourcery toolchain). I tried the executable on an emulator and it worked. I'd like to try it on a real device, so I plugged in my Nexus and pushed the files onto the filesystem. Then I tried to execute the binary, and I got a permission error. It really doesn't matter how I mount it or where I send it; I'm not root, and it's not letting me execute it. Is there any way to run a program like this on a non-rooted phone?
Update: note that apps targeting API level 29 (Android 10) and above will not be able to use the trick below, as the OS restricts the execute permission. See Behavior changes: apps targeting API 29+.
After using the toolchain included in the Android NDK to compile your binaries, it is possible to package them with a typical Android app and have them spawn as subprocesses.
You'll have to include all the necessary files within the assets folder of your application. In order to run them, you have to have the program copy them from the assets folder to a runnable location like: /data/data/com.yourdomain.yourapp/nativeFolder
You can do this like so:
private static void copyFile(String assetPath, String localPath, Context context) {
    try {
        InputStream in = context.getAssets().open(assetPath);
        FileOutputStream out = new FileOutputStream(localPath);
        int read;
        byte[] buffer = new byte[4096];
        while ((read = in.read(buffer)) > 0) {
            out.write(buffer, 0, read);
        }
        out.close();
        in.close();
    } catch (IOException e) {
        throw new RuntimeException(e);
    }
}
Keep in mind that the assetPath is not absolute but relative to assets/.
I.e., "assets/nativeFolder" is just "nativeFolder".
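One step the snippet above leaves implicit: the copied binary also needs its execute bit set before it can be spawned. A sketch, using the same illustrative path as below:

// File.setExecutable is available since API 9; chmod is the older fallback.
File binary = new File("/data/data/com.yourdomain.yourapp/nativeFolder/application");
if (!binary.setExecutable(true, false)) {
    try {
        Runtime.getRuntime().exec("chmod 755 " + binary.getAbsolutePath()).waitFor();
    } catch (InterruptedException e) {
        Thread.currentThread().interrupt();
    } catch (IOException e) {
        throw new RuntimeException(e);
    }
}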
To then run your application and read its output you could do something like this:
Process nativeApp = Runtime.getRuntime().exec("/data/data/com.yourdomain.yourapp/nativeFolder/application");
BufferedReader reader = new BufferedReader(new InputStreamReader(nativeApp.getInputStream()));
int read;
char[] buffer = new char[4096];
StringBuffer output = new StringBuffer();
while ((read = reader.read(buffer)) > 0) {
output.append(buffer, 0, read);
}
reader.close();
// Waits for the command to finish.
nativeApp.waitFor();
String nativeOutput = output.toString();
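One caveat: if the native program writes a lot to stderr, reading only stdout can deadlock both processes once the stderr pipe fills up. A hedged variant that merges the two streams before reading:

// ProcessBuilder can redirect stderr into stdout, so one reader drains both.
ProcessBuilder pb = new ProcessBuilder("/data/data/com.yourdomain.yourapp/nativeFolder/application");
pb.redirectErrorStream(true);
Process nativeApp = pb.start();
// ...then read nativeApp.getInputStream() exactly as above.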