Images in SQLite huge - Android

I have 5 MB worth of images that I want to put into an SQLite db, in blob fields.
After the insertion, the db is around 50 MB.
This is how I get the byte[]:
private static byte[] getByteArrayFromFileName(String filename) {
    int id = context.getResources().getIdentifier(filename, "raw", context.getPackageName());
    ByteArrayOutputStream blob = new ByteArrayOutputStream();
    // Decodes the resource into a full Bitmap, then re-encodes it as PNG
    BitmapFactory.decodeResource(context.getResources(), id).compress(CompressFormat.PNG, 0, blob);
    return blob.toByteArray();
}
This is how I insert them into the db:
public void createImage(SQLiteDatabase database, Image image) throws DbException {
    ContentValues values = new ContentValues();
    values.put(ImageTable.DATA, image.data);
    values.put(ImageTable.TYPE_ID, image.type.getValue());
    values.put(ImageTable.LEVELID, image.levelId);
    values.put(ImageTable.ID, image.id);
    if (database.replace(ImageTable.TABLE_NAME, null, values) == -1) {
        throw new DbException("createImage insertion error");
    }
}
What am I screwing up? :)
edit: the problem was that I should not have been putting decoded bitmaps into the database, but the raw (JPEG-compressed) files themselves. So for reference, here is a correct way of getting a byte[] from an image resource so that it stays small:
private static byte[] getByteArrayFromRawByFileName(Context context, String filename) throws IOException {
    int id = context.getResources().getIdentifier(filename, "raw", context.getPackageName());
    InputStream is = context.getResources().openRawResource(id);
    ByteArrayOutputStream bos = new ByteArrayOutputStream();
    try {
        // Copy the raw, already-compressed resource bytes without decoding them
        byte[] b = new byte[1024];
        int bytesRead;
        while ((bytesRead = is.read(b)) != -1) {
            bos.write(b, 0, bytesRead);
        }
    } finally {
        is.close();
    }
    return bos.toByteArray();
}

Probably the problem is that you are using JPEG images as the source, but before insertion you decode them and re-encode them as PNG; since PNG is lossless, that easily explains the 10x growth, I believe. Your getByteArrayFromFileName() is also flawed: since you already have a file, you can read it into a byte array directly, without involving BitmapFactory at all (as the corrected method in the edit above does).

Storing images as a BLOB in your database may not be such a good idea. A couple of reasons:
On Android, your database files are stored in the phone's internal storage, not on the SD card. Since the phone's internal storage may be quite limited, it's always a good idea to store large files on the SD card. (This argument may not apply to newer devices that don't allow expandable storage.)
If the user decides to uninstall your app, the images go with it.
I would suggest saving the images to the SD card and storing only the file path in the database, along the lines of the sketch below.
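A minimal sketch of that approach, assuming the raw JPEG bytes are already in imageBytes and that the table has a path column (the file name, column name, and table name are placeholders, not the asker's code):
// Write the compressed image to app-specific external storage and keep
// only its path in the database. Names here are hypothetical.
File dir = context.getExternalFilesDir(Environment.DIRECTORY_PICTURES);
File imageFile = new File(dir, "level_" + levelId + ".jpg");
FileOutputStream fos = new FileOutputStream(imageFile);
try {
    fos.write(imageBytes);
} finally {
    fos.close();
}
ContentValues values = new ContentValues();
values.put("path", imageFile.getAbsolutePath()); // store the path, not the bytes
database.insert("images", null, values);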

Related

Saving a file uses inexplicably large amounts of storage

In my application the user can choose a file using the chooser Intent, which will then be "imported" into the application and saved in internal storage for security reasons. This all worked fine and still does on some devices, but for example on the Google Pixel on Android 7.1.1 it only functions normally for the first 4-6 files; afterwards it acts very odd.
Performance was dropping drastically, so I checked my storage usage and found that it was continuously growing, although the file I was supposed to be saving was less than 1 MB. Importing a file would cause the amount of storage taken by my app to rise past 500 MB and upward. I can't seem to find the cause of this.
The method I am using to save the files, which is called in an async background task:
BufferedInputStream bis = null;
BufferedOutputStream bos = null;
OutputStream fos = new FileOutputStream(file);
int size = 0;
InputStream fis = getContentResolver().openInputStream(uri);
try {
    bis = new BufferedInputStream(fis);
    bos = new BufferedOutputStream(fos);
    byte[] buf = new byte[1024];
    int len = 1024;
    while ((len = bis.read(buf, 0, len)) != -1) {
        bos.write(buf, 0, len);
        size = size + 1024;
        Log.v("Bytes written", "" + size);
    }
} catch (IOException e) {
    e.printStackTrace();
} finally {
    try {
        if (bis != null) bis.close();
        if (bos != null) bos.close();
        if (fis != null) fis.close();
        if (fos != null) fos.close();
    } catch (IOException e) {
        e.printStackTrace();
    }
}
return Uri.fromFile(file);
The Uri which this function returns is then saved in an SQLite Database to be used later.
I appreciate all kinds of tips as to where this memory usage could be coming from.
Btw, this did not result from an update on the phone nor from any changes in my code, as it was working the last time I tested it and I haven't changed anything since.
I see a couple of things to correct:
1) The signature of the write method doesn't seem correct; if you write from a buffer you should use write(buf, offset, length).
2) You read into the buffer once, so it should be enough to write out the buffer once too.
3) If you need to read into the buffer more than once, and write out the values more than once, use a while, not a do-while. You have no guarantee that the read operation was successful.
Reference: the write method in the Android Developer documentation.
I had an additional method which would append an index to a file added multiple times, e.g. "file.pdf", "file1.pdf", "file2.pdf". This method wasn't correct, leading to an endless loop of creating a new file and appending an index. I managed to fix the problem by changing the method so it cannot loop forever; a sketch of the idea follows below.
In retrospect I should have included that in my question.
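For illustration only, a bounded way of picking a unique name might look like this (a hypothetical helper, not the asker's actual code):
// Try "file.pdf", then "file1.pdf", "file2.pdf", ...; the index strictly
// increases on every pass, so the loop cannot run forever.
private static File uniqueFile(File dir, String baseName, String extension) {
    File candidate = new File(dir, baseName + extension);
    int index = 1;
    while (candidate.exists()) {
        candidate = new File(dir, baseName + index + extension);
        index++;
    }
    return candidate;
}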

OutOfMemory Exception while encoding Base64 [duplicate]

Using Base64 from Apache Commons:
public byte[] encode(File file) throws FileNotFoundException, IOException {
    byte[] encoded;
    try (FileInputStream fin = new FileInputStream(file)) {
        byte[] fileContent = new byte[(int) file.length()];
        fin.read(fileContent);
        encoded = Base64.encodeBase64(fileContent);
    }
    return encoded;
}
Exception in thread "AWT-EventQueue-0" java.lang.OutOfMemoryError: Java heap space
    at org.apache.commons.codec.binary.BaseNCodec.encode(BaseNCodec.java:342)
    at org.apache.commons.codec.binary.Base64.encodeBase64(Base64.java:657)
    at org.apache.commons.codec.binary.Base64.encodeBase64(Base64.java:622)
    at org.apache.commons.codec.binary.Base64.encodeBase64(Base64.java:604)
I'm making a small app for a mobile device.
You cannot just load the whole file into memory, like here:
byte fileContent[] = new byte[(int) file.length()];
fin.read(fileContent);
Instead, load the file chunk by chunk and encode it in parts. Base64 is a simple encoding; it is enough to load 3 bytes and encode them at a time (this produces 4 bytes after encoding). For performance reasons, consider loading multiples of 3 bytes, e.g. 3000 bytes; that should work just fine. Also consider buffering the input file.
An example:
byte[] fileContent = new byte[3000];
try (FileInputStream fin = new FileInputStream(file)) {
    int len;
    while ((len = fin.read(fileContent)) >= 0) {
        // Encode only the bytes actually read in this chunk
        Base64.encodeBase64(Arrays.copyOf(fileContent, len));
    }
}
Note that you cannot simply concatenate the results of Base64.encodeBase64() into one encoded byte array. Actually, it is not loading the file but encoding it to Base64 that causes the out-of-memory problem. This is understandable, because the Base64 version is bigger (and you already have the file contents occupying a lot of memory).
Consider changing your method to:
public void encode(File file, OutputStream base64OutputStream)
and sending Base64-encoded data directly to the base64OutputStream rather than returning it.
UPDATE: Thanks to @StephenC, I developed a much simpler version:
public void encode(File file, OutputStream base64OutputStream) throws IOException {
    try (InputStream is = new FileInputStream(file);
         OutputStream out = new Base64OutputStream(base64OutputStream)) {
        IOUtils.copy(is, out);
    }
}
It uses Base64OutputStream that translates input to Base64 on-the-fly and IOUtils class from Apache Commons IO.
Note: the Base64OutputStream must be closed (here via try-with-resources) so that it can write the trailing = padding if required; buffering is handled by IOUtils.copy().
Either the file is too big, or your heap is too small, or you've got a memory leak.
If this only happens with really big files, put something into your code to check the file size and reject files that are unreasonably big.
If this happens with small files, increase your heap size by using the -Xmx command line option when you launch the JVM. (If this is in a web container or some other framework, check the documentation on how to do it.)
If the problem recurs, especially with small files, the chances are that you've got a memory leak.
The other point that should be made is that your current approach entails holding two complete copies of the file in memory. You should be able to reduce the memory usage, though you'll typically need a stream-based Base64 encoder to do this. (It depends on which flavor of the base64 encoding you are using...)
This page describes a stream-based Base64 encoder / decoder library, and includes links to some alternatives.
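For the file-size check suggested above, a guard might look like this (the limit is an assumed constant, not something from the original code):
// Reject unreasonably large files up front instead of risking an OOM later
private static final long MAX_ENCODABLE_BYTES = 10L * 1024 * 1024; // assumed limit

static void checkSize(File file) throws IOException {
    if (file.length() > MAX_ENCODABLE_BYTES) {
        throw new IOException("File too large to encode in memory: " + file.length() + " bytes");
    }
}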
Well, do not do it for the whole file at once.
Base64 works on 3 bytes at a time, so you can read your file in batches of "multiple of 3" bytes, encode them and repeat until you finish the file:
// The Base64 output is 4/3 the size of the input - a reasonable initial capacity
StringBuilder sb = new StringBuilder((int) (file.length() / 3 * 4));
try (FileInputStream fin = new FileInputStream(file)) {
    // Read in multiples of 3 bytes so every full chunk encodes cleanly
    byte[] buf = new byte[3 * 512];
    int len;
    while ((len = fin.read(buf)) != -1) {
        byte[] encoded = Base64.encodeBase64(Arrays.copyOf(buf, len));
        // Appending to a StringBuilder still keeps the whole result in memory;
        // write the encoded bytes to another stream instead to avoid that.
        sb.append(new String(encoded, StandardCharsets.US_ASCII));
    }
}
String base64EncodedFile = sb.toString();
You are not reading the whole file, just the first few KB. The read method returns how many bytes were actually read; you should call read in a loop until it returns -1 to be sure that you have read everything.
The file is too big for both it and its base64 encoding to fit in memory. Either
process the file in smaller pieces or
increase the memory available to the JVM with the -Xmx switch, e.g.
java -Xmx1024M YourProgram
This code works for uploading larger images, because it scales the bitmap down before encoding it:
bitmap = Bitmap.createScaledBitmap(bitmap, 100, 100, true);
ByteArrayOutputStream stream = new ByteArrayOutputStream();
bitmap.compress(Bitmap.CompressFormat.PNG, 100, stream); // compress to whichever format you want
byte[] byte_arr = stream.toByteArray();
String image_str = Base64.encodeBytes(byte_arr);
Well, it looks like your file is too large to keep the multiple copies necessary for an in-memory Base64 encoding in the available heap memory at the same time. Given that this is for a mobile device, it's probably not possible to increase the heap, so you have two options:
make the file smaller (much smaller), or
do it in a stream-based way, so that you're reading from an InputStream one small part of the file at a time, encoding it, and writing it to an OutputStream, without ever keeping the entire file in memory. A sketch of that follows below.
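On Android, a minimal sketch of that streaming approach could use the framework's android.util.Base64OutputStream (the inputFile and outputFile variables are placeholders):
// Stream the file through a Base64 encoder; only `buf` is in memory at once.
InputStream in = new FileInputStream(inputFile);
OutputStream out = new Base64OutputStream(new FileOutputStream(outputFile), Base64.DEFAULT);
try {
    byte[] buf = new byte[4096];
    int len;
    while ((len = in.read(buf)) != -1) {
        out.write(buf, 0, len);
    }
} finally {
    in.close();
    out.close(); // close() also flushes the trailing = padding
}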
In the manifest, in the application tag, add the following:
android:largeHeap="true"
It worked for me
Java 8 added Base64 methods, so Apache Commons is no longer needed to encode large files.
public static void encodeFileToBase64(String inputFile, String outputFile) {
    try (OutputStream out = Base64.getEncoder().wrap(new FileOutputStream(outputFile))) {
        Files.copy(Paths.get(inputFile), out);
    } catch (IOException e) {
        throw new UncheckedIOException(e);
    }
}

Bitmap - compare contact picture with picture on sd card

I want to check if the current user picture of a contact is the same as the one on the SD card...
I set the user picture like following:
byte[] photo = ImageFunctions.convertImageToByteArray(bitmap);
ContentValues values = new ContentValues();
....
values.put(ContactsContract.CommonDataKinds.Photo.PHOTO, photo);
...
And I read the image like following:
InputStream inputStream = ContactsContract.Contacts.openContactPhotoInputStream(
        contentResolver,
        ContentUris.withAppendedId(ContactsContract.Contacts.CONTENT_URI, mId),
        fullsize);
if (inputStream != null)
    return BitmapFactory.decodeStream(inputStream);
And my convert function is following:
public static byte[] convertImageToByteArray(Bitmap bitmap) {
    ByteArrayOutputStream streamy = new ByteArrayOutputStream();
    bitmap.compress(CompressFormat.JPEG, 100, streamy);
    return streamy.toByteArray();
}
And I use an MD5 hash on the byte array of the bitmap to find changes... but in practice the byte arrays are not exactly the same.
What can I do so that the hash codes match? It seems like the compression round trip does not produce identical bytes, so the MD5 check fails...
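For reference, a minimal sketch of the hash comparison described above, using the standard java.security.MessageDigest API (this illustrates the check; it does not fix the lossy round trip):
public static String md5Of(byte[] data) throws NoSuchAlgorithmException {
    // Two images only hash equal if the encoder produced byte-identical output
    byte[] digest = MessageDigest.getInstance("MD5").digest(data);
    StringBuilder hex = new StringBuilder();
    for (byte b : digest) {
        hex.append(String.format("%02x", b));
    }
    return hex.toString();
}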

SQLite DB Blob field Vs Android Filesystem

Here's a thing that's causing a bit of head-scratching; maybe someone can shed some light on the situation. I'm using the Camera intent to snap a picture (well, any number of pictures really), like so:
ImageView imgPhoto = (ImageView) findViewById(R.id.imgButtonPhoto);
imgPhoto.setBackgroundColor(Color.rgb(71, 117, 255));
imgPhoto.setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View v) {
        ++snapNumber;
        Intent intent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
        lastPicSaved = String.valueOf(gd.getDeliveryId()) + "_" + String.valueOf(snapNumber) + ".jpg";
        Uri imageUri = Uri.fromFile(new File(Environment.getExternalStorageDirectory(), lastPicSaved));
        intent.putExtra(MediaStore.EXTRA_OUTPUT, imageUri);
        startActivityForResult(intent, GooseConsts.IMAGE_CAPTURE_INTENT);
    }
});
Once the activity has finished I snag the result like so:
case GooseConsts.IMAGE_CAPTURE_INTENT:
    try {
        String newCompressedImage = Environment.getExternalStorageDirectory() + "/" + lastPicSaved;
        BitmapFactory.Options options = new BitmapFactory.Options();
        options.inSampleSize = 4;
        //options.inDensity = DisplayMetrics.DENSITY_MEDIUM;
        Bitmap b = BitmapFactory.decodeFile(newCompressedImage, options);
        FileOutputStream fos = new FileOutputStream(newCompressedImage);
        b.compress(CompressFormat.JPEG, 60, fos);
        fos.flush();
        fos.close();
        Image i = new Image();
        i.setReported(0);
        i.setReportedFull(0);
        i.setImage(newCompressedImage);
        //i.setImageData(b);
        dbHelper.insertImageReference(i, gd.getDeliveryId());
    }
Simple stuff really. As you can see, I'm using options.inSampleSize and reducing the quality upon compression to reduce the size of the final image, so as to keep the capture small enough to send back to HQ via an XMPP packet.
Here comes the fun part!
On the filesystem the resulting image is around 50 KB, possibly a bit more but never more than 60 KB. That's fine; it sends via XMPP and I can process and display it in a custom connected client I've also written.
I figured it'd probably be best to keep the images, just in case sending fails for whatever reason, but I didn't want them getting lost in the file system, so I added a BLOB field to my local device database. I wondered if I could just send them directly from said DB and do away with the file system completely, so I tried it, and suddenly NO images were being sent/received by my client bot. Strange! After a bit of digging, I noticed that the images saved into the DB BLOB are now (amazingly) 3x the size of the original, with the same dimensions (486x684) and the same quality (I've adb pulled a few to test against the ones stored on the SD card).
Can anyone tell me why this is the case? I've been using BLOB fields for years and have never seen such a dramatic increase in file size before. A couple of KB here and there, sure, but not jumping from 50(ish) KB to over 160 KB?!
Many thanks.
After you compress the image, convert it to a byte array from the compression stream and insert that, rather than letting anything re-encode the bitmap on the way into the blob:
Bitmap b = BitmapFactory.decodeFile(newCompressedImage, options);
ByteArrayOutputStream stream = new ByteArrayOutputStream();
// Use JPEG here: PNG is lossless, so it ignores the quality parameter
b.compress(Bitmap.CompressFormat.JPEG, 60, stream);
byte[] byteArray = stream.toByteArray();
This should keep the stored size to a minimum. You will have to convert the byte array back to a bitmap for display, of course, but that is straightforward (see the sketch below).
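A minimal sketch of the display side (the imageView name is a placeholder):
// Decode the stored bytes straight back into a Bitmap for display
Bitmap restored = BitmapFactory.decodeByteArray(byteArray, 0, byteArray.length);
imageView.setImageBitmap(restored);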
Keep in mind that the whole byte array has to fit in the heap at once, so this approach only stays practical while the compressed images remain small.

Issues with InputStream OutOfMemory errors

I'm using the Spring Framework for Android to get my input streams.
@Override
public InputStream getImageStream(String url) {
    InputStream is = new ByteArrayInputStream(template.getForObject(url, byte[].class));
    return is;
}
For the first few input streams it goes OK, no problems at all, but then I think it tries to get a very big input stream, and I get the OutOfMemory error.
I see a lot of posts using something like the following code:
public byte[] readBytes(InputStream inputStream) throws IOException {
    ByteArrayOutputStream byteBuffer = new ByteArrayOutputStream();
    int bufferSize = 1024;
    byte[] buffer = new byte[bufferSize];
    int len = 0;
    while ((len = inputStream.read(buffer)) != -1) {
        byteBuffer.write(buffer, 0, len);
    }
    byte[] byteArray = byteBuffer.toByteArray();
    return byteArray;
}
The idea of this code is to read the input stream in chunks, right?
But I get the OutOfMemory error before I can even start the readBytes method. I tried putting resets everywhere... I thought maybe I should clear the memory somewhere after readBytes or something, but I don't know how, and I don't know if that is the right way.
I think I'm getting the basics wrong? I'm very new to Android and Java... Is there another way of getting the InputStream? I also read something about BufferedInputStream, but I just can't think of a way to fit it in.
My goal is to store a blob of the image in the database, and my input is the image URL via OAuth.
I can also request lower-quality versions of the image through another URL, and then everything works...
But I wanted to try it with the original image URL, because later I may want the ability to download the original for printing the photo.
You are getting the OutOfMemory error because you read the whole body into memory when using ByteArrayInputStream. Also notice that when you write into a ByteArrayOutputStream it keeps the data in memory too (it is just a simple wrapper around a byte array). You should probably stream to a FileOutputStream instead and use the file as a cache, along the lines of the sketch below.
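A minimal sketch of that idea using a plain URLConnection instead of the Spring template (context and the cache file name are placeholders):
// Stream the image straight to a cache file; only `buf` is held in memory.
URL imageUrl = new URL(url);
InputStream in = imageUrl.openConnection().getInputStream();
File cacheFile = new File(context.getCacheDir(), "image.tmp"); // placeholder name
OutputStream out = new FileOutputStream(cacheFile);
try {
    byte[] buf = new byte[8192];
    int len;
    while ((len = in.read(buf)) != -1) {
        out.write(buf, 0, len);
    }
} finally {
    in.close();
    out.close();
}
// Store cacheFile.getAbsolutePath() in the database, or read the file back
// in chunks, instead of keeping the full image bytes on the heap.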
