In my Android app I capture an image with the camera, compress it to JPEG, send it to the server, and save it on disk. There it takes, e.g., 48.9 KB. I send it back as a Base64 string and decode it on the Android side like this:
byte[] img = Base64.decode(base64, Base64.DEFAULT);
ByteArrayInputStream in = new ByteArrayInputStream(img);
BitmapFactory.Options options = new BitmapFactory.Options();
options.inSampleSize = 3; // decode at 1/3 of the original width and height
Bitmap bmp = BitmapFactory.decodeStream(in, null, options);
return bmp;
Values bigger than 3 for options.inSampleSize make the image look ugly. But if I now look at the size of bmp, it is 156 KB. Why does the size increase? How can I decode it so that it keeps its original size and doesn't look ugly (not downsampled too hard)?
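The likely explanation: the 48.9 KB figure is the compressed JPEG on disk, while the 156 KB is the decoded, uncompressed Bitmap in memory, which costs roughly width × height × 4 bytes with the default ARGB_8888 config, regardless of how well the JPEG was compressed. A minimal sketch of the arithmetic, using made-up 400x300 dimensions:

```java
// A compressed JPEG and the decoded Bitmap are different things: once
// decoded, every pixel sits uncompressed in memory. With the default
// ARGB_8888 config that is 4 bytes per pixel, so the in-memory size
// depends only on the pixel dimensions, not on the file size.
public class BitmapMemory {

    // Same arithmetic Bitmap.getByteCount() effectively performs.
    static long decodedSizeBytes(int width, int height, int bytesPerPixel) {
        return (long) width * height * bytesPerPixel;
    }

    public static void main(String[] args) {
        // Hypothetical 400x300 photo: 480,000 bytes decoded,
        // no matter how small the JPEG file was.
        System.out.println(decodedSizeBytes(400, 300, 4)); // 480000
        // inSampleSize = 3 divides both dimensions by 3 -> roughly 1/9 the memory.
        System.out.println(decodedSizeBytes(400 / 3, 300 / 3, 4)); // 53200
    }
}
```

To see the real number on-device, Bitmap.getByteCount() reports essentially this product for the decoded bitmap.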
Related
When I execute bitmap.getHeight() I get a NullPointerException. This is how I try to get my bitmap:
In my JSON: "pic":"iVBORw0KGgoAAAANSUhEUgAAAgAAAAGACAIAAABUQk3......."
I retrieve the following from the JSON:
byte[] decode = Base64.decode(jsonObj.getString("pic"), Base64.DEFAULT);
Log.i("size",decode.length+""); //65535
Bitmap pic = getImage(decode);
public static Bitmap getImage(byte[] image) {
BitmapFactory.Options options = new BitmapFactory.Options();
options.inPreferredConfig = Bitmap.Config.ARGB_8888;
return BitmapFactory.decodeByteArray(image, 0, image.length,options);
}
But I can't display the image.
Any ideas?
The error must be inside BitmapFactory.decodeByteArray, and most probably the source of the problem is an unsupported image format. Check that you are passing a Base64-encoded JPEG, PNG, GIF, or BMP.
Also, getting exactly 65535 as decode's length is very suspicious. Maybe the data is being trimmed in the database (if you have a 65535-byte limit on blob size) or somewhere else.
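To rule the format in or out before blaming truncation, the first few decoded bytes can be checked against the usual file signatures. (Worth noting: the "pic" value above starts with iVBORw0KGgo, which is what the PNG signature looks like in Base64, so the format itself is probably fine and the suspiciously round 65535 length points at truncation.) A small sketch, with names of my own choosing:

```java
public class ImageFormatSniffer {

    // Best guess at the container format from the leading magic bytes,
    // or null if no known signature matches.
    static String sniff(byte[] data) {
        if (data == null || data.length < 4) return null;
        if ((data[0] & 0xFF) == 0xFF && (data[1] & 0xFF) == 0xD8) return "jpeg";
        if ((data[0] & 0xFF) == 0x89 && data[1] == 'P' && data[2] == 'N' && data[3] == 'G') return "png";
        if (data[0] == 'G' && data[1] == 'I' && data[2] == 'F') return "gif";
        if (data[0] == 'B' && data[1] == 'M') return "bmp";
        return null;
    }

    public static void main(String[] args) {
        // The start of the JSON value from the question decodes to the PNG signature.
        byte[] decoded = java.util.Base64.getDecoder().decode("iVBORw0KGgoAAAANSUhEUg==");
        System.out.println(sniff(decoded)); // png
    }
}
```

The sketch uses java.util.Base64 so it runs anywhere; on Android, android.util.Base64.decode as in the question produces the same bytes for this check.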
I have a very large image on my device (not taken with the device camera), and I need to resize it before sending it to the server. Say it is 14 MB originally and I want to reduce it to 2 MB. I want to resize the image without losing any quality, because the server will allow zooming into the photo. So I am thinking inDensity is important, except I don't understand how inDensity works in this regard. Can someone explain how I can resize the photo to 2 MB but keep such a high density that the image can be zoomed with high quality? Or is that not possible?
I already know how to resize images:
public static Bitmap resizeImage(String file, int reqHeight, int reqWidth) {
    BitmapFactory.Options options = new BitmapFactory.Options();
    // First pass: read only the dimensions, no pixel allocation
    options.inJustDecodeBounds = true;
    BitmapFactory.decodeFile(file, options);
    options.inSampleSize = calculateInSampleSize(options, reqHeight, reqWidth);
    // Second pass: decode the subsampled pixels
    options.inJustDecodeBounds = false;
    return BitmapFactory.decodeFile(file, options);
}
Try this in your code:
Bitmap bmp = BitmapFactory.decodeFile(myImage);
ByteArrayOutputStream bos = new ByteArrayOutputStream();
bmp.compress(Bitmap.CompressFormat.JPEG, 70, bos); // re-encode as JPEG at quality 70
InputStream in = new ByteArrayInputStream(bos.toByteArray());
ContentBody image = new InputStreamBody(in, "image/jpeg", "filename");
It worked for me!
Here's what I'm trying to do:
Myapp calls the camera app, takes a picture, and sends the picture's path back to Myapp, where it is displayed in an ImageView and then shared to Instagram. I want the displayed bitmap to have the same dimensions as what Instagram uses, so no unexpected cropping happens when going from Myapp to Instagram.
Here's what I've tried so far when returning from the camera:
(note: INSTAGRAM_FORMAT_W == INSTAGRAM_FORMAT_H == 1080)
BitmapFactory.Options bmOptions = new BitmapFactory.Options();
// Note: outHeight/outWidth are populated BY the decoder;
// setting them here has no effect on the decode.
bmOptions.outHeight = INSTAGRAM_FORMAT_H;
bmOptions.outWidth = INSTAGRAM_FORMAT_W;
bmOptions.inMutable = true;
Bitmap photo = BitmapFactory.decodeFile(path, bmOptions);
photo = Bitmap.createScaledBitmap(photo, INSTAGRAM_FORMAT_W, INSTAGRAM_FORMAT_H, false);
ByteArrayOutputStream bytes = new ByteArrayOutputStream();
photo.compress(Bitmap.CompressFormat.JPEG, 40, bytes);
This distorts the image to fit the square format; not optimal
BitmapFactory.Options bmOptions = new BitmapFactory.Options();
bmOptions.outHeight = INSTAGRAM_FORMAT_H;
bmOptions.outWidth = INSTAGRAM_FORMAT_W;
bmOptions.inMutable = true;
Bitmap photo = (Bitmap) BitmapFactory.decodeFile(path, bmOptions);
int baseX = (photo.getWidth() - INSTAGRAM_FORMAT_W)/2;
int baseY = (photo.getHeight() - INSTAGRAM_FORMAT_H)/2;
Bitmap photoResized = Bitmap.createBitmap(photo,baseX,baseY,INSTAGRAM_FORMAT_W,INSTAGRAM_FORMAT_H);
ByteArrayOutputStream bytes = new ByteArrayOutputStream();
photoResized.compress(Bitmap.CompressFormat.JPEG, 40, bytes);
This crops too much off what the user sees through the camera app; it is also bigger than the Instagram size, which results in additional cropping when going to Instagram. Not optimal.
I was digging through BitmapFactory.Options and the possible parameters for Bitmap.createBitmap, but I'm pretty lost in terms of what best practice is for when/where the formatting should occur, how to deal with the variable pixel density of the screen (if needed), and the variable camera resolution (if applicable).
I could use a helping hand, folks. Thanks!
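One common approach for a square target: scale so the shorter side matches 1080, then crop the longer side symmetrically. That avoids both the distortion of the first attempt and the oversized crop of the second. The geometry is plain arithmetic; the helper below is a sketch (the names are mine), and its results would feed Bitmap.createScaledBitmap and Bitmap.createBitmap:

```java
public class CenterCropPlan {

    // Returns {scaledWidth, scaledHeight, cropX, cropY}: scale the shorter
    // side to `target`, then crop the excess of the longer side evenly
    // from both ends.
    static int[] plan(int srcW, int srcH, int target) {
        float scale = (float) target / Math.min(srcW, srcH);
        int scaledW = Math.round(srcW * scale);
        int scaledH = Math.round(srcH * scale);
        return new int[] { scaledW, scaledH, (scaledW - target) / 2, (scaledH - target) / 2 };
    }

    public static void main(String[] args) {
        // A 4000x3000 photo: scale to 1440x1080, then crop 180 px off each side.
        int[] p = plan(4000, 3000, 1080);
        System.out.println(p[0] + "x" + p[1] + " crop at (" + p[2] + "," + p[3] + ")");
    }
}
```

On the Android side this would be roughly: int[] p = CenterCropPlan.plan(photo.getWidth(), photo.getHeight(), 1080); Bitmap scaled = Bitmap.createScaledBitmap(photo, p[0], p[1], true); Bitmap square = Bitmap.createBitmap(scaled, p[2], p[3], 1080, 1080); Screen density shouldn't matter here, since everything is computed in source pixels.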
First of all a question: can I set a byte[] as the image in an ImageView without converting it back to a Bitmap?
The OOM error is thrown when I decode the byte[] to a bitmap, and I am seeing this happen even for images of 300 KB. I have opted to use BitmapFactory.Options (inSampleSize) to scale the image down and avoid the exception.
But this alters the dimensions (the width especially) of my image, which looks bad in my application. Is there any way to fetch the original image from the DB without scaling or altering it (and, of course, without the risk of an OOM error)?
Any help is appreciated!
Thanks in advance.
The code that causes trouble:
ByteArrayInputStream imageStream = new ByteArrayInputStream(imageByteArray);
BitmapFactory.Options options = new BitmapFactory.Options();
// Bounds-only pass: returns null but fills options.outWidth/outHeight
options.inJustDecodeBounds = true;
BitmapFactory.decodeStream(imageStream, null, options);
options.inJustDecodeBounds = false;
// If the image is too large, subsample it to avoid an OutOfMemoryError
// in the real decode below.
if (showfullImage(options.outWidth) && !isExternal) {
    options.inSampleSize = 2;
}
// The stream was consumed by the bounds pass, so recreate it
imageStream.close();
imageStream = new ByteArrayInputStream(imageByteArray);
Bitmap imageBmp = BitmapFactory.decodeStream(imageStream, null, options);
return imageBmp;
Here is my code:
File file = new File(path); // path to a JPEG file of about 700 KB
InputStream in = null;
try {
    in = new BufferedInputStream(new FileInputStream(file));
} catch (Exception e) {
    // TODO: handle exception
}
bitmap = BitmapFactory.decodeStream(in);
bitmap = bitmap.copy(Bitmap.Config.ARGB_8888, true);
Please help: I get an error on the copy() line. I want to make it an ARGB_8888 image.
You need to reduce the memory usage.
In your code you first decode the stream into one bitmap and then copy it, which means you create two large bitmap objects.
You don't need to decode and then copy; you can try:
BitmapFactory.Options options = new BitmapFactory.Options();
options.inPreferredConfig = Bitmap.Config.ARGB_8888;
// If set to a value > 1, the decoder subsamples the original image,
// returning a smaller bitmap to save memory. You can try values larger than 1.
options.inSampleSize = 2;
// Decode the bitmap in a single pass
bitmap = BitmapFactory.decodeStream(in, null, options);
This way only one bitmap is created, and you can raise inSampleSize to reduce the loaded bitmap's size further.