In my application the user can choose a file using the chooser Intent, which is then "imported" into the application and saved in internal storage for security reasons. This all worked fine and still does on some devices, but on the Google Pixel running Android 7.1.1, for example, it only functions normally for the first 4-6 files and afterwards acts very oddly.
Performance was dropping drastically, so I checked my storage usage and found that it was growing continuously, although the file I was supposed to be saving was less than 1 MB. Importing a single file would cause the storage taken by my app to rise past 500 MB and upward. I can't seem to find the cause of this.
The method I use to save the files, called from an async background task:
BufferedInputStream bis = null;
BufferedOutputStream bos = null;
OutputStream fos = new FileOutputStream(file);
int size = 0;
InputStream fis = getContentResolver().openInputStream(uri);
try {
    bis = new BufferedInputStream(fis);
    bos = new BufferedOutputStream(fos);
    byte[] buf = new byte[1024];
    int len = 1024;
    while ((len = bis.read(buf, 0, len)) != -1) {
        bos.write(buf, 0, len);
        size = size + 1024;
        Log.v("Bytes written", "" + size);
    }
} catch (IOException e) {
    e.printStackTrace();
} finally {
    try {
        if (bis != null) bis.close();
        if (bos != null) bos.close();
        if (fis != null) fis.close();
        if (fos != null) fos.close();
    } catch (IOException e) {
        e.printStackTrace();
    }
}
return Uri.fromFile(file);
The Uri which this function returns is then saved in an SQLite Database to be used later.
I appreciate all kinds of tips as to where this storage usage could be coming from.
By the way, this did not result from an update on the phone, nor from any changes in my code: it was working the last time I tested it and I haven't changed anything since.
I see a couple of things to correct:
1) The signature of the write method doesn't seem correct; if you write from a buffer you should use write(buf, offset, length).
2) You read into the buffer once, so it should be enough to write out the buffer once too.
3) If you need to read into the buffer more than once, and write out the values more than once, use a while, not a do-while. You have no guarantee that the read operation was successful.
Reference: the write method in the Android Developer documentation.
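Putting those points together, the copy loop could be written roughly like this (a sketch reusing the question's bis/bos streams and size counter; the buffer size is an arbitrary choice):
byte[] buf = new byte[8192];
int len;
while ((len = bis.read(buf)) != -1) {
    // write exactly the number of bytes read in this iteration
    bos.write(buf, 0, len);
    size += len; // track the real byte count, not a fixed 1024
}
bos.flush();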
I had an additional method which would append an index to a file added multiple times, e.g. "file.pdf", "file1.pdf", "file2.pdf". This method wasn't correct, leading to an endless loop of creating a new file and appending an index. I managed to fix the problem by changing this method to avoid the looping.
In retrospect I should have included that in my question.
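For illustration, a minimal sketch of an index-appending method that cannot loop forever (the name and signature are assumptions, not the actual method from the app):
// Returns a File that does not exist yet: "file.pdf", "file1.pdf", "file2.pdf", ...
private static File uniqueFile(File dir, String baseName, String extension) {
    File candidate = new File(dir, baseName + extension);
    int index = 1;
    while (candidate.exists()) {
        candidate = new File(dir, baseName + index + extension);
        index++;
    }
    return candidate;
}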
Related
I am facing a problem downloading PNG images from my server to my Android app. The problem is specific to PNG images (JPG works fine): the downloaded files are corrupt images. I will explain in more detail below.
Scenario:
I need to download JPG and PNG images from my server, and display them to the user of the Android app.
Issue:
The JPG images download without issue, but the downloaded PNG files are corrupt. I have double-checked the source images on my server, and they are fine. It's only the downloaded PNG files that are corrupt, so the problem probably lies in the way I am downloading them in Android.
Code Sample:
URL imageURL;
File imageFile = null;
InputStream is = null;
FileOutputStream fos = null;
byte[] b = new byte[1024];
try {
    imageURL = new URL(image.getServerPath());
    imageFile = new File(context.getExternalFilesDir(null), image.getLocalPath());
    fos = new FileOutputStream(imageFile);
    // get the input stream and pass it to the file output stream
    is = imageURL.openConnection().getInputStream();
    // also tried, but gave the same results:
    // is = imageURL.openStream();
    while (is.read(b) != -1)
        fos.write(b);
} catch (FileNotFoundException e) {
} catch (MalformedURLException e) {
} catch (IOException e) {
} finally {
    // close the streams
    try {
        if (fos != null)
            fos.close();
        if (is != null)
            is.close();
    } catch (IOException e) {
    }
}
Any pointers on how I can work on this will be much appreciated.
Note:
Since this is happening in a service, there is no problem doing this inside an AsyncTask.
The problem is here
while (is.read(b) != -1)
    fos.write(b);
This is wrong, because in each iteration it writes the full buffer (1024 bytes) to the file. But the previous read could have read fewer bytes than that (almost surely on the last loop, unless the image length happens to be an exact multiple of 1024). You should check how many bytes were read each time, and write that number of bytes.
int bytesRead;
while ((bytesRead = is.read(b)) != -1)
    fos.write(b, 0, bytesRead);
Your error makes you always write files with sizes that are a multiple of 1024 - which of course is not the case in general. Now, what happens when an image is saved with extra trailing bytes depends on the format and on the image reader. In some cases it might work. Still, it's wrong.
BTW: never swallow exceptions - even if that's not the problem today, it might be tomorrow and you might spend hours finding the problem.
Using Base64 from Apache Commons:
public byte[] encode(File file) throws FileNotFoundException, IOException {
    byte[] encoded;
    try (FileInputStream fin = new FileInputStream(file)) {
        byte fileContent[] = new byte[(int) file.length()];
        fin.read(fileContent);
        encoded = Base64.encodeBase64(fileContent);
    }
    return encoded;
}
Exception in thread "AWT-EventQueue-0" java.lang.OutOfMemoryError: Java heap space
at org.apache.commons.codec.binary.BaseNCodec.encode(BaseNCodec.java:342)
at org.apache.commons.codec.binary.Base64.encodeBase64(Base64.java:657)
at org.apache.commons.codec.binary.Base64.encodeBase64(Base64.java:622)
at org.apache.commons.codec.binary.Base64.encodeBase64(Base64.java:604)
I'm making a small app for a mobile device.
You cannot just load the whole file into memory, like here:
byte fileContent[] = new byte[(int) file.length()];
fin.read(fileContent);
Instead, load the file chunk by chunk and encode it in parts. Base64 is a simple encoding; it is enough to load 3 bytes and encode them at a time (this will produce 4 bytes after encoding). For performance reasons, consider loading multiples of 3 bytes, e.g. 3000 bytes - that should be just fine. Also consider buffering the input file.
An example:
byte fileContent[] = new byte[3000];
try (FileInputStream fin = new FileInputStream(file)) {
    while (fin.read(fileContent) >= 0) {
        Base64.encodeBase64(fileContent);
    }
}
Note that you cannot simply append the results of Base64.encodeBase64() to an encoded byte array. Actually, it is not loading the file but encoding it to Base64 that causes the out-of-memory problem. This is understandable, because the Base64 version is bigger (and you already have a file occupying a lot of memory).
Consider changing your method to:
public void encode(File file, OutputStream base64OutputStream)
and sending the Base64-encoded data directly to base64OutputStream rather than returning it.
UPDATE: Thanks to @StephenC I developed a much easier version:
public void encode(File file, OutputStream base64OutputStream) throws IOException {
    InputStream is = new FileInputStream(file);
    OutputStream out = new Base64OutputStream(base64OutputStream);
    IOUtils.copy(is, out);
    is.close();
    out.close();
}
It uses Base64OutputStream, which translates the input to Base64 on the fly, and the IOUtils class from Apache Commons IO.
Note: you must close the FileInputStream and Base64OutputStream explicitly, so that the trailing = padding gets written if required; buffering is handled by IOUtils.copy().
Either the file is too big, or your heap is too small, or you've got a memory leak.
If this only happens with really big files, put something into your code to check the file size and reject files that are unreasonably big (a sketch of such a check follows at the end of this answer).
If this happens with small files, increase your heap size by using the -Xmx command line option when you launch the JVM. (If this is in a web container or some other framework, check the documentation on how to do it.)
If the problem recurs, especially with small files, the chances are that you've got a memory leak.
The other point that should be made is that your current approach entails holding two complete copies of the file in memory. You should be able to reduce the memory usage, though you'll typically need a stream-based Base64 encoder to do this. (It depends on which flavor of the base64 encoding you are using ...)
This page describes a stream-based Base64 encoder/decoder library, and includes links to some alternatives.
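As a rough sketch of the size check suggested above (the threshold and method name are arbitrary assumptions):
// Reject files whose in-memory Base64 encoding clearly cannot fit in the heap
private static final long MAX_ENCODABLE_BYTES = 10L * 1024 * 1024; // 10 MB, arbitrary

static void checkEncodable(File file) throws IOException {
    if (file.length() > MAX_ENCODABLE_BYTES) {
        throw new IOException("File too large to Base64-encode in memory: "
                + file.length() + " bytes");
    }
}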
Well, do not do it for the whole file at once.
Base64 works on 3 bytes at a time, so you can read your file in batches of "multiples of 3" bytes, encode them, and repeat until you finish the file:
// the Base64 encoding - an acceptable estimate of the encoded size
StringBuilder sb = new StringBuilder((int) (file.length() / 3 * 4));
FileInputStream fin = null;
try {
    fin = new FileInputStream("some.file");
    // Max size of buffer - a multiple of 3, so each chunk encodes cleanly
    int bSize = 3 * 512;
    // Buffer
    byte[] buf = new byte[bSize];
    // Actual number of bytes read
    int len = 0;
    while ((len = fin.read(buf)) != -1) {
        // Encode only the bytes actually read (Arrays.copyOf trims the last chunk)
        byte[] encoded = Base64.encodeBase64(Arrays.copyOf(buf, len));
        // Although you might want to write the encoded bytes to another
        // stream, otherwise you'll run into the same problem again.
        sb.append(new String(encoded));
    }
} finally {
    if (null != fin) {
        fin.close();
    }
}
String base64EncodedFile = sb.toString();
You are not reading the whole file, just the first few KB. The read method returns how many bytes were actually read. You should call read in a loop until it returns -1, to be sure that you have read everything.
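A minimal sketch of such a loop, accumulating into a ByteArrayOutputStream (one possible sink among many; fin is the stream from the question):
ByteArrayOutputStream baos = new ByteArrayOutputStream();
byte[] chunk = new byte[8192];
int n;
// read() returns -1 only when the stream is exhausted
while ((n = fin.read(chunk)) != -1) {
    baos.write(chunk, 0, n);
}
byte[] fileContent = baos.toByteArray();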
The file is too big for both it and its base64 encoding to fit in memory. Either
process the file in smaller pieces or
increase the memory available to the JVM with the -Xmx switch, e.g.
java -Xmx1024M YourProgram
This is the best code to upload a larger image:
bitmap = Bitmap.createScaledBitmap(bitmap, 100, 100, true);
ByteArrayOutputStream stream = new ByteArrayOutputStream();
bitmap.compress(Bitmap.CompressFormat.PNG, 100, stream); // compress to which format you want
byte[] byte_arr = stream.toByteArray();
String image_str = Base64.encodeBytes(byte_arr);
Well, it looks like your file is too large to keep the multiple copies necessary for an in-memory Base64 encoding in the available heap at the same time. Given that this is for a mobile device, it's probably not possible to increase the heap, so you have two options:
make the file smaller (much smaller)
do it in a stream-based way, reading from an InputStream one small part of the file at a time, encoding it, and writing it to an OutputStream, without ever keeping the entire file in memory (sketched below).
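A minimal sketch of the second option on Android, assuming android.util.Base64OutputStream (available since API 8); file and sink are placeholder names:
InputStream in = new FileInputStream(file);
// encoded output streams straight into 'sink'; only one small buffer is in memory at a time
OutputStream out = new android.util.Base64OutputStream(sink, android.util.Base64.DEFAULT);
byte[] buf = new byte[4096];
int len;
while ((len = in.read(buf)) != -1) {
    out.write(buf, 0, len);
}
out.close(); // also flushes the trailing Base64 padding
in.close();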
In the manifest, inside the application tag, write the following:
android:largeHeap="true"
It worked for me
Java 8 added Base64 methods, so Apache Commons is no longer needed to encode large files.
public static void encodeFileToBase64(String inputFile, String outputFile) {
    try (OutputStream out = Base64.getEncoder().wrap(new FileOutputStream(outputFile))) {
        Files.copy(Paths.get(inputFile), out);
    } catch (IOException e) {
        throw new UncheckedIOException(e);
    }
}
First I will say I've researched various solutions on StackOverflow and elsewhere and none have solved my issue. I'll admit that my knowledge in this area is somewhat lacking, so sorry for being all noobish.
I want to do what I assume to be a simple thing. My app creates a file on start-up if it does not exist yet, and populates it with 30 "0.0" values, each on its own line (this is done successfully).
Then, when you beat a level, the intent is to read the file, assign each of the 0.0's to an element of an array, assign the time remaining (for example, 2.5) to the appropriate spot ([0] for the first level, [1] for the second level, etc.), and write the array back to the file to keep track of how much time was remaining when the player beat each level.
The problem is that, though it does write the score into the appropriate spot in the file (I'm testing level 1, so spot [0]), all of the other 0.0's become "null". So I know there is a mistake somewhere that I must not be seeing, and after several hours over several days of trying various solutions, I finally decided to ask for help here. This is the code below:
try {
    String[] scoreArray = new String[30];
    File sdCard = Environment.getExternalStorageDirectory();
    File myFile = new File(sdCard.getAbsolutePath());
    // myFile.createNewFile();
    File file = new File(myFile, "RLGLscores.txt");
    FileOutputStream fOut = new FileOutputStream(file);
    String newLine = System.getProperty("line.separator");
    OutputStreamWriter myOutWriter =
            new OutputStreamWriter(fOut);
    FileInputStream fis = new FileInputStream(file);
    BufferedReader myReader = new BufferedReader(
            new InputStreamReader(fis));
    String data = "";
    int i = 0;
    while ((data = myReader.readLine()) != null) {
        scoreArray[i] = data;
        i++;
    }
    myReader.close();
    scoreArray[0] = timerStringFormat;
    for (int j = 0; j < 30; j++) {
        myOutWriter.write(scoreArray[j] + newLine);
    }
    myOutWriter.close();
    fOut.close();
} catch (IOException e) {
    e.printStackTrace();
}
timerStringFormat is the String that holds the time remaining. Again, I'm rather new to this. My game is nearly complete; I'm amazed I got as far as I have. But any help would be greatly appreciated.
Opening the file for writing truncates it to zero length. If you later try to read from it you just see an empty file.
To fix that, first read the contents, close the reader, and then open the output stream and write the modified contents back.
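Applied to the code above, the order of operations would look roughly like this (a sketch reusing the question's variable names):
// 1. Read the existing scores BEFORE opening the file for writing
BufferedReader myReader = new BufferedReader(
        new InputStreamReader(new FileInputStream(file)));
String data;
int i = 0;
while ((data = myReader.readLine()) != null) {
    scoreArray[i++] = data;
}
myReader.close();

// 2. Only now open the output stream (which truncates the file) and write back
scoreArray[0] = timerStringFormat;
OutputStreamWriter myOutWriter = new OutputStreamWriter(new FileOutputStream(file));
for (int j = 0; j < 30; j++) {
    myOutWriter.write(scoreArray[j] + newLine);
}
myOutWriter.close();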
I'm using the Spring Framework for Android to get my input streams.
@Override
public InputStream getImageStream(String url) {
    InputStream is = new ByteArrayInputStream(template.getForObject(url, byte[].class));
    return is;
}
For the first few input streams it goes OK - no problems at all - but then I think it tries to get a very big input stream, and then I get the OutOfMemory error.
I see a lot of posts using something like the following code:
public byte[] readBytes(InputStream inputStream) throws IOException {
    ByteArrayOutputStream byteBuffer = new ByteArrayOutputStream();
    int bufferSize = 1024;
    byte[] buffer = new byte[bufferSize];
    int len = 0;
    while ((len = inputStream.read(buffer)) != -1) {
        byteBuffer.write(buffer, 0, len);
    }
    byte[] byteArray = byteBuffer.toByteArray();
    return byteArray;
}
The idea of this code is to read the input stream in chunks, right?
But the OutOfMemory error I'm getting occurs before I can even start the readBytes method. I tried putting resets everywhere... I thought maybe I should clear the memory somewhere after readBytes or something, but I do not know how, and I don't know if that is the right way.
I think I'm getting the basics wrong? I'm very new to Android and Java... Is there another way of getting the InputStream? I also read something about BufferedInputStream, but I just can't think of a way to fit it in.
My goal is to store a blob of the image in the database, and my input is the image URL via OAuth.
I can also request lower-quality versions of the image through another URL, and then everything works...
But I wanted to try it with the original image URL, because later I may want the ability to download the original for printing the photo.
You are getting the OutOfMemory error because you read the whole payload into memory when using ByteArrayInputStream. Also notice that when you write into a ByteArrayOutputStream it keeps the data in memory too (it is just a simple wrapper around a byte array). You should probably use a FileOutputStream as a cache instead.
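A rough sketch of that idea: copy the incoming stream straight to a cache file in small chunks (names are assumptions; note this only helps if the InputStream actually streams from the network, rather than wrapping an already-downloaded byte array the way getForObject(url, byte[].class) does):
// copies an InputStream to a file without ever holding the full payload in heap
void cacheToFile(InputStream in, File target) throws IOException {
    FileOutputStream out = new FileOutputStream(target);
    byte[] buffer = new byte[4096];
    int len;
    while ((len = in.read(buffer)) != -1) {
        out.write(buffer, 0, len);
    }
    out.close();
    in.close();
}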
*Edit
I have been successful with the client-server part. Now I am doing a file transfer between 2 emulators. The file did transfer between the emulators, but I noticed that the received file's size is not the same as the original's. For example, A.jpg is 900 KB, but the received file is less than 900 KB. I checked the transferred sizes and found that some data (bytes) was lost during the transfer. How is this happening?
Here's the code:
Client (Send File)
File myFile = new File("/mnt/sdcard/Pictures/A.jpg");
FileInputStream fis = new FileInputStream(myFile);
OutputStream os = socket.getOutputStream();
int filesize = (int) myFile.length();
byte[] buffer = new byte[filesize];
int bytesRead = 0;
while ((bytesRead = fis.read(buffer)) > 0) {
    os.write(buffer, 0, bytesRead);
    // the log displays exactly the file size
    System.out.println("SO sendFile" + bytesRead);
}
os.flush();
os.close();
fis.close();
Log.d("Client", "Client sent message");
socket.close();
Server (Receive File)
FileOutputStream fos = new FileOutputStream("/mnt/sdcard/Pictures/B.jpg");
@SuppressWarnings("resource")
BufferedOutputStream bos = new BufferedOutputStream(fos);
InputStream is = clientSocket.getInputStream();
byte[] aByte = new byte[1024];
int bytesRead;
while ((bytesRead = is.read(aByte)) != -1) {
    bos.write(aByte, 0, bytesRead);
    // the log shows several reads of less than 1024 bytes; totalled up,
    // the missing bytes are why the received file is incomplete
    System.out.println("SO sendFile" + bytesRead);
}
clientSocket.close();
*Edit 2
While searching around Google, I found that .read(buffer) does not guarantee reading the full requested number of bytes. Hence, the received file kept losing some bytes. To solve this, send the file size first to inform the receiver, and only then start transferring the file.
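A minimal sketch of that length-prefix idea, using a DataOutputStream/DataInputStream pair (one possible framing among many; socket, myFile, clientSocket and bos are the names from the code above):
// Sender: announce the exact file size before sending the bytes
DataOutputStream dos = new DataOutputStream(socket.getOutputStream());
dos.writeLong(myFile.length());
// ... write the file bytes as before ...
dos.flush();

// Receiver: read exactly that many bytes, then stop
DataInputStream dis = new DataInputStream(clientSocket.getInputStream());
long remaining = dis.readLong();
byte[] chunk = new byte[1024];
while (remaining > 0) {
    int n = dis.read(chunk, 0, (int) Math.min(chunk.length, remaining));
    if (n == -1) break; // stream ended unexpectedly
    bos.write(chunk, 0, n);
    remaining -= n;
}
bos.flush(); // make sure any buffered bytes actually reach the file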
The NetworkOnMainThreadException occurs because you have to use an AsyncTask.
The NullPointerException occurs because you are trying to use a PrintWriter built from the Socket; since you got nothing from the Socket, you get this error.
The NetworkOnMainThreadException tells you what you are doing wrong.
You need to put the network stuff into a separate Thread (or AsyncTask or similar).
You cannot call any server operation on the main thread in Android.
On Android OS 4.0 and above this will directly cause a NetworkOnMainThreadException. You have 2 choices:
1) Use an AsyncTask for every server operation.
2) Use a user-defined Thread for any type of server operation.
I was also struggling with this exception, and only on devices with OS versions above 4.0, so you cannot ignore these small requirements of Android.
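For illustration, the usual shape of such an AsyncTask (a sketch; MyServerTask and doServerCall are hypothetical names):
// Runs the blocking network call off the main thread;
// onPostExecute runs back on the main (UI) thread.
private class MyServerTask extends AsyncTask<String, Void, String> {
    @Override
    protected String doInBackground(String... params) {
        return doServerCall(params[0]); // hypothetical blocking server operation
    }

    @Override
    protected void onPostExecute(String result) {
        // safe to update the UI here
    }
}

// Usage, e.g. from an Activity:
new MyServerTask().execute("http://example.com/api");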