OutOfMemory Exception while encoding Base64 [duplicate] - android

Using Base64 from Apache Commons:
public byte[] encode(File file) throws FileNotFoundException, IOException {
    byte[] encoded;
    try (FileInputStream fin = new FileInputStream(file)) {
        byte fileContent[] = new byte[(int) file.length()];
        fin.read(fileContent);
        encoded = Base64.encodeBase64(fileContent);
    }
    return encoded;
}
Exception in thread "AWT-EventQueue-0" java.lang.OutOfMemoryError: Java heap space
    at org.apache.commons.codec.binary.BaseNCodec.encode(BaseNCodec.java:342)
    at org.apache.commons.codec.binary.Base64.encodeBase64(Base64.java:657)
    at org.apache.commons.codec.binary.Base64.encodeBase64(Base64.java:622)
    at org.apache.commons.codec.binary.Base64.encodeBase64(Base64.java:604)
I'm making a small app for a mobile device.

You cannot just load the whole file into memory, like here:
byte fileContent[] = new byte[(int) file.length()];
fin.read(fileContent);
Instead, load the file chunk by chunk and encode it in parts. Base64 is a simple encoding: it is enough to load 3 bytes and encode them at a time (this produces 4 bytes after encoding). For performance, load a multiple of 3 bytes at a time, e.g. 3000 bytes per chunk should be fine. Also consider buffering the input file.
An example:
byte fileContent[] = new byte[3000];
try (FileInputStream fin = new FileInputStream(file)) {
    int len;
    while ((len = fin.read(fileContent)) >= 0) {
        // encode only the bytes actually read in this chunk (java.util.Arrays)
        Base64.encodeBase64(Arrays.copyOf(fileContent, len));
    }
}
Note that you cannot simply append the results of Base64.encodeBase64() to the encoded byte array. Actually, it is not loading the file but encoding it to Base64 that causes the out-of-memory problem. This is understandable because the Base64 version is bigger (and you already have the file contents occupying a lot of memory).
Consider changing your method to:
public void encode(File file, OutputStream base64OutputStream)
and sending Base64-encoded data directly to the base64OutputStream rather than returning it.
UPDATE: Thanks to @StephenC, I developed a much easier version:
public void encode(File file, OutputStream base64OutputStream) throws IOException {
    InputStream is = new FileInputStream(file);
    OutputStream out = new Base64OutputStream(base64OutputStream);
    IOUtils.copy(is, out);
    is.close();
    out.close();
}
It uses Base64OutputStream, which translates input to Base64 on the fly, and the IOUtils class from Apache Commons IO.
Note: you must close the FileInputStream and Base64OutputStream explicitly so that the trailing = padding is written if required, but buffering is handled by IOUtils.copy().
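The same method can also be written with try-with-resources (Java 7+). This is only a small sketch assuming the same commons-codec and commons-io classes as above; closing the streams is what flushes the trailing padding:
public void encode(File file, OutputStream base64OutputStream) throws IOException {
    // Sketch: IOUtils.copy streams in buffered chunks; closing 'out' writes any trailing '=' padding
    try (InputStream is = new FileInputStream(file);
         OutputStream out = new Base64OutputStream(base64OutputStream)) {
        IOUtils.copy(is, out);
    }
}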

Either the file is too big, or your heap is too small, or you've got a memory leak.
If this only happens with really big files, put something into your code to check the file size and reject files that are unreasonably big.
If this happens with small files, increase your heap size by using the -Xmx command line option when you launch the JVM. (If this is in a web container or some other framework, check the documentation on how to do it.)
If the problem recurs, especially with small files, the chances are that you've got a memory leak.
The other point that should be made is that your current approach entails holding two complete copies of the file in memory. You should be able to reduce the memory usage, though you'll typically need a stream-based Base64 encoder to do this. (It depends on which flavor of the base64 encoding you are using ...)
This page describes a stream-based Base64 encoder / decoder library, and includes links to some alternatives.
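For illustration only, a minimal sketch of the streaming approach using java.util.Base64 (Java 8+; the file names are hypothetical). Only one small buffer is ever held in memory:
try (InputStream in = new FileInputStream("input.bin");
     OutputStream out = Base64.getEncoder().wrap(new FileOutputStream("encoded.txt"))) {
    byte[] buf = new byte[8192];
    int len;
    while ((len = in.read(buf)) != -1) {
        out.write(buf, 0, len);  // encoded on the fly, the whole file is never in memory
    }
}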

Well, do not do it for the whole file at once.
Base64 works on 3 bytes at a time, so you can read your file in batches of "multiple of 3" bytes, encode them and repeat until you finish the file:
// Pre-size the builder - an acceptable estimate of the Base64-encoded size
StringBuilder sb = new StringBuilder((int) (file.length() / 3 * 4));
FileInputStream fin = null;
try {
    fin = new FileInputStream("some.file");
    // Buffer size: a multiple of 3, so each chunk encodes without padding
    int bSize = 3 * 512;
    // Buffer
    byte[] buf = new byte[bSize];
    // Actual number of bytes read into the buffer
    int len = 0;
    while ((len = fin.read(buf)) != -1) {
        // Encode only the bytes actually read in this chunk
        byte[] encoded = Base64.encodeBase64(Arrays.copyOf(buf, len));
        // Although you might want to write the encoded bytes to another
        // stream, otherwise you'll run into the same problem again.
        sb.append(new String(encoded));
    }
} finally {
    if (null != fin) {
        fin.close();
    }
}
String base64EncodedFile = sb.toString();

You are not reading the whole file, just the first few KB. The read method returns how many bytes were actually read. You should call read in a loop until it returns -1 to be sure that you have read everything, as in the sketch below.
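A minimal sketch of that read loop (it still buffers the whole file in memory, so it only illustrates reading completely, not the memory problem itself):
ByteArrayOutputStream bos = new ByteArrayOutputStream();
try (FileInputStream fin = new FileInputStream(file)) {
    byte[] buf = new byte[8192];
    int len;
    while ((len = fin.read(buf)) != -1) {  // keep reading until end of stream
        bos.write(buf, 0, len);
    }
}
byte[] fileContent = bos.toByteArray();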
The file is too big for both it and its base64 encoding to fit in memory. Either
process the file in smaller pieces or
increase the memory available to the JVM with the -Xmx switch, e.g.
java -Xmx1024M YourProgram

This is the best code to upload an image of larger size:
bitmap = Bitmap.createScaledBitmap(bitmap, 100, 100, true);
ByteArrayOutputStream stream = new ByteArrayOutputStream();
bitmap.compress(Bitmap.CompressFormat.PNG, 100, stream); // compress to which format you want
byte[] byte_arr = stream.toByteArray();
String image_str = Base64.encodeBytes(byte_arr);

Well, it looks like your file is too large to keep the multiple copies necessary for an in-memory Base64 encoding in the available heap memory at the same time. Given that this is for a mobile device, it's probably not possible to increase the heap, so you have two options:
Make the file smaller (much smaller), or
do it in a stream-based way, so that you're reading from an InputStream one small part of the file at a time, encoding it and writing it to an OutputStream, without ever keeping the entire file in memory (see the sketch below).
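A minimal sketch of the stream-based option, assuming Android's android.util.Base64OutputStream (API 8+; try-with-resources needs API 19+ or desugaring):
// The file is encoded chunk by chunk and written straight to 'destination',
// so the entire file is never held in memory.
void encodeStreaming(File file, OutputStream destination) throws IOException {
    try (InputStream in = new FileInputStream(file);
         OutputStream b64 = new Base64OutputStream(destination, Base64.NO_WRAP)) {
        byte[] buf = new byte[8192];
        int len;
        while ((len = in.read(buf)) != -1) {
            b64.write(buf, 0, len);
        }
    }
}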

In the Manifest, in the application tag, write the following:
android:largeHeap="true"
It worked for me.

Java 8 added Base64 methods, so Apache Commons is no longer needed to encode large files.
public static void encodeFileToBase64(String inputFile, String outputFile) {
    try (OutputStream out = Base64.getEncoder().wrap(new FileOutputStream(outputFile))) {
        Files.copy(Paths.get(inputFile), out);
    } catch (IOException e) {
        throw new UncheckedIOException(e);
    }
}

Related

Saving a file uses unexplainably large amounts of storage

In my application the user can choose a file using the chooser Intent, which will then be "imported" into the application and saved in internal storage for security reasons. This all worked fine and still does on some devices, but for example on the Google Pixel on Android 7.1.1 it only functions normally for the first 4-6 files and afterwards it acts very oddly.
Performance was dropping drastically, so I checked my storage usage and found that it was continuously growing, although the file I was supposed to be saving was less than 1 MB. Importing a file would cause the amount of storage taken by my app to rise past 500 MB and upward. I can't seem to find the cause for this.
The method I am using to save the files, which is called in an async background task:
BufferedInputStream bis = null;
BufferedOutputStream bos = null;
OutputStream fos = new FileOutputStream(file);
int size = 0;
InputStream fis = getContentResolver().openInputStream(uri);
try {
    bis = new BufferedInputStream(fis);
    bos = new BufferedOutputStream(fos);
    byte[] buf = new byte[1024];
    int len = 1024;
    while ((len = bis.read(buf, 0, len)) != -1) {
        bos.write(buf, 0, len);
        size = size + 1024;
        Log.v("Bytes written", "" + size);
    }
} catch (IOException e) {
    e.printStackTrace();
} finally {
    try {
        if (bis != null) bis.close();
        if (bos != null) bos.close();
        if (fis != null) fis.close();
        if (fos != null) fos.close();
    } catch (IOException e) {
        e.printStackTrace();
    }
}
return Uri.fromFile(file);
The Uri which this function returns is then saved in an SQLite Database to be used later.
I appreciate all kinds of tips as to where this memory usage could be coming from.
Btw, this did not result from an update on the phone nor from any changes in my code, as it was working the last time I tested it and I haven't changed anything since.
I see a couple of things to correct:
1) The signature of the write method doesn't seem correct; if you write from a buffer you should use write(buff, offset, length).
2) You read into the buffer once, so it should be enough to write out the buffer once too.
3) If you need to read into the buffer more than once, and write out the values more than once, use a while, not a do-while. You have no guarantee that the read operation was successful.
See the reference for the write method in the Android Developer documentation.
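For reference, a minimal copy loop along the lines of points 1-3 (a sketch, not the asker's exact code):
byte[] buf = new byte[8192];
int len;
// read up to buf.length bytes, then write exactly what was read; a plain while
// loop means nothing is written if the very first read returns -1
while ((len = bis.read(buf, 0, buf.length)) != -1) {
    bos.write(buf, 0, len);
}
bos.flush();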
I had an additional method which would append an index to a file added multiple times, e.g. "file.pdf", "file1.pdf", "file2.pdf". This method wasn't correct, leading to an endless loop of creating a new file and appending an index. I managed to fix the problem by changing this method to avoid the looping.
In retrospect I should have included that in my question.

Sending video file in API

I need to make an API in JSON to send a video. I don't want to send the path of the video. What is the best way to send the video in JSON so it can be used by the Android and iPhone guys? If I use Base64 or byte[], I get the memory exception error.
File file = new File("video.mp4");
FileInputStream fis = new FileInputStream(file);
ByteArrayOutputStream bos = new ByteArrayOutputStream();
byte[] buf = new byte[1024];
try {
    for (int readNum; (readNum = fis.read(buf)) != -1; ) {
        bos.write(buf, 0, readNum); // no doubt here is 0
        System.out.println("read " + readNum + " bytes,");
    }
} catch (IOException ex) {
    Logger.getLogger(genJpeg.class.getName()).log(Level.SEVERE, null, ex);
}
byte[] bytes = bos.toByteArray();
This is how you read a video byte by byte into a byte array. You then send the byte array as a JSONObject as follows...
byte[] data; //array holding the video
String base64Encoded = DatatypeConverter.printBase64Binary(data); //You have encoded the array into String
Now send that to the server (I am guessing you know how to).
This is how you will decode your JSON back to a byte array:
byte[] base64Decoded = DatatypeConverter.parseBase64Binary(base64Encoded);
The typical way to send binary in JSON is to Base64-encode it. Java provides different ways to Base64 encode and decode a byte[]. One of these is DatatypeConverter.
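If javax.xml.bind's DatatypeConverter is not available (for example on newer Android/Java versions), java.util.Base64 offers the same round trip; a hedged alternative sketch:
// java.util.Base64 (Java 8+ / Android API 26+), assumed here as an alternative encoder
String base64Encoded = java.util.Base64.getEncoder().encodeToString(data);
byte[] base64Decoded = java.util.Base64.getDecoder().decode(base64Encoded);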
I hope it helps.
Cheers!
Edited:
You are getting the OutOfMemoryException because the heap is 2 MB in size and your video is 2 MB, so when inserting it into a String it goes out of memory. Even if you put it into object instances, you will either have to re-initialize the heap or find some other way. I will try to write an answer tomorrow. (Writing this half asleep, might be the other way around.)

Converting byte[] to Base64 String throws OutOfMemoryError

I'm trying to convert files to a Base64 string. For small files it works perfectly, but for larger files, around 500 MB, it throws an OutOfMemoryError. I have to convert the file to a Base64-encoded string because it is a server-side requirement to upload files as Base64-encoded strings. Is it possible to convert and send a 500 MB file through this method? Thanks in advance.
byte[] bytes = null;
InputStream inputStream;
try {
    inputStream = new FileInputStream(mFilepath);
    byte[] buffer = new byte[1024];
    int bytesRead;
    ByteArrayOutputStream output = new ByteArrayOutputStream();
    try {
        while ((bytesRead = inputStream.read(buffer)) != -1) {
            output.write(buffer, 0, bytesRead);
        }
        inputStream.close();
        output.flush();
        output.close();
    } catch (IOException e) {
        e.printStackTrace();
    }
    bytes = output.toByteArray();
} catch (FileNotFoundException e1) {
    e1.printStackTrace();
}
// Here it throws OutOfMemoryError
String encodedString = Base64.encodeToString(bytes, Base64.DEFAULT);
Then I'm passing encodedString to server using HttpURLConnection.
for files larger like 500mb it throws OutOfMemoryError
Of course. The heap limit of a Java-based Android app is going to be a lot smaller than 500MB, let alone two copies of the data, one in an expanded (Base64) format. There are hundreds of millions of devices that do not even have that much RAM for the whole device, let alone for use by your app.
is it possible to convert and send a 500mb file through this method ?
Only if you can somehow stream up the converted bytes. Convert a handful of bytes, write them to the socket, convert the next handful of bytes, write them to the socket, and so forth. You have no practical way of converting the entire 500MB file into Base64 in memory and transferring it as a string.
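A rough sketch of what that streaming upload could look like with HttpURLConnection and java.util.Base64 (the URL is hypothetical; on older Android versions android.util.Base64OutputStream can play the same role):
// Streams the file through a Base64 encoder straight onto the connection,
// so neither the raw bytes nor the encoded string are held in memory at once.
HttpURLConnection conn = (HttpURLConnection) new URL("https://example.com/upload").openConnection();
conn.setDoOutput(true);
conn.setChunkedStreamingMode(0);  // don't buffer the whole request body in memory
try (InputStream in = new FileInputStream(mFilepath);
     OutputStream out = Base64.getEncoder().wrap(conn.getOutputStream())) {
    byte[] buf = new byte[8192];
    int len;
    while ((len = in.read(buf)) != -1) {
        out.write(buf, 0, len);
    }
}
int status = conn.getResponseCode();  // completes the request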

Android-Compressing and Decompressing Video

I am trying to compress a video on Android before uploading it. I am following code from Stack Overflow. When I try it, a compressed file is created and the file size is reduced as expected (though I'm not sure it has actually compressed correctly), but I can't open the file because its content is gibberish, so I decompress the video to verify that the compression really worked.
My problem is that the original file size and the decompressed file size are the SAME, but the file does not open; it says "Sorry, the video cannot be played".
CODE :
Compression :
public static void compressData(byte[] data) throws Exception {
    OutputStream out = new FileOutputStream(
            new File("/storage/emulated/0/DCIM/Camera/compressed_video.mp4"));
    Log.e("Original byte length: ", String.valueOf(data.length));
    Deflater d = new Deflater();
    DeflaterOutputStream dout = new DeflaterOutputStream(out, d);
    dout.write(data);
    dout.close();
    Log.i("The Compressed Byte array is ", "" + data.length);
    Log.e("Compressed byte length: ",
            String.valueOf(dout.toString().getBytes().length));
}
Decompression :
public static void decompress() throws Exception {
    InputStream in = new FileInputStream("/storage/emulated/0/DCIM/Camera/compressed_video.mp4");
    InflaterInputStream ini = new InflaterInputStream(in);
    ByteArrayOutputStream bout = new ByteArrayOutputStream(1024);
    int b;
    while ((b = ini.read()) != -1) {
        bout.write(b);
    }
    ini.close();
    bout.close();
    String s = new String(bout.toByteArray());
    System.out.println(s);
    File decompressed_file = new File("/storage/emulated/0/DCIM/Camera/decompressed_video.mp4");
    FileOutputStream out_file = new FileOutputStream(decompressed_file);
    out_file.write(bout.toByteArray());
    out_file.close();
    Log.i("The Decompressed Byte array is ", "" + bout.toByteArray().length);
    Log.e("De-compressed byte length: ",
            String.valueOf(bout.toByteArray().length));
}
From the above code, the original byte length and the decompressed byte length are the same, but I am not sure why the byte array does not get written to the file correctly. I can see that the two files, compressed_video and decompressed_video, are created, but I can't play either. Not being able to play compressed_video.mp4 is expected, but I should be able to play decompressed_video.mp4, which is unplayable. I have been sitting on this for more than 2 days, so any help would be insanely appreciated. Thanks in advance, guys.

Issues with inputstream outofmemory errors

I'm using the Spring Framework for Android to get my input streams.
@Override
public InputStream getImageStream(String url) {
    InputStream is = new ByteArrayInputStream(template.getForObject(url, byte[].class));
    return is;
}
For the first few input streams it goes OK, no problems at all, but then I think it tries to get a very big input stream and I get the out-of-memory error.
I see a lot of posts using something like the following code:
public byte[] readBytes(InputStream inputStream) throws IOException {
    ByteArrayOutputStream byteBuffer = new ByteArrayOutputStream();
    int bufferSize = 1024;
    byte[] buffer = new byte[bufferSize];
    int len = 0;
    while ((len = inputStream.read(buffer)) != -1) {
        byteBuffer.write(buffer, 0, len);
    }
    byte[] byteArray = byteBuffer.toByteArray();
    return byteArray;
}
The idea of this code is to read the input stream in chunks, right?
But the out-of-memory error I'm getting happens before I can even start the readBytes method. I tried putting resets everywhere... I thought maybe I should clear the memory somewhere after readBytes or something, but I do not know how, and I don't know if that is the right way.
I think I'm getting the basics wrong? I'm very new to Android and Java... Is there another way of getting the InputStream? I also read something about BufferedInputStream, but I just can't think of a way to fit it in.
My goal is to store a blob of the image in the database, and my input is the image URL via OAuth.
I can also request lower-quality versions of the input stream through another URL, and then everything works...
But I wanted to try it with the original image url, because maybe later I want to have the ability to download an original for printing the photo.
You are getting the OutOfMemory error because you read the whole data into memory when you use ByteArrayInputStream. Also notice that when you write into a ByteArrayOutputStream it keeps the data in memory too (it is just a simple wrapper around a byte array). You should probably use a FileOutputStream as a cache instead.
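A small sketch of that idea (the cache file is hypothetical): copy the stream to disk in small chunks instead of building a byte[], so the image never has to fit in the heap:
// Copies the InputStream to a cache file chunk by chunk instead of into memory
public void cacheToFile(InputStream inputStream, File cacheFile) throws IOException {
    try (OutputStream out = new FileOutputStream(cacheFile)) {
        byte[] buffer = new byte[8192];
        int len;
        while ((len = inputStream.read(buffer)) != -1) {
            out.write(buffer, 0, len);
        }
    } finally {
        inputStream.close();
    }
}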
