I'm downloading a bitmap from a URL with the following code. If I do this cyclically (for example, when streaming images from a camera), the bitmap is reallocated again and again. So I wonder whether there is a way to write the newly downloaded byte array into the bitmap that is already allocated in memory.
public static Bitmap downloadBitmap(String url) {
try {
URL newUrl = new URL(url);
return BitmapFactory.decodeStream(newUrl.openConnection()
.getInputStream());
} catch (MalformedURLException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
return null;
}
In the bitmap memory management guide, the section entitled 'Manage Memory on Android 3.0 and Higher' describes how to reuse an existing bitmap's memory so that the Bitmap itself does not have to be re-allocated. If you are decoding frames from a camera stream, this works back to Honeycomb because every frame has the same dimensions; otherwise the more flexible reuse rules only help from 4.4 KitKat onward.
Alternatively, you could keep a WeakReference to the bitmap (if you want it to be collected in case of memory issues) alongside downloadBitmap, decode into that reused bitmap, and return it instead of creating a new bitmap on every call.
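A minimal sketch of that idea, assuming API 19+ reuse rules (the field and method names are mine, not from the question, and the fallback when reuse fails is omitted for brevity):

// Keep a weakly-referenced Bitmap around and offer it to BitmapFactory for reuse.
private static WeakReference<Bitmap> reusableBitmapRef = new WeakReference<>(null);

public static Bitmap downloadBitmapReusing(String url) {
    try {
        InputStream in = new URL(url).openConnection().getInputStream();
        BitmapFactory.Options options = new BitmapFactory.Options();
        options.inMutable = true;
        options.inBitmap = reusableBitmapRef.get(); // may be null or already collected
        // Note: if the old bitmap cannot be reused, decodeStream throws
        // IllegalArgumentException; a retry without inBitmap is omitted here.
        Bitmap bmp = BitmapFactory.decodeStream(in, null, options);
        if (bmp != null) {
            reusableBitmapRef = new WeakReference<>(bmp);
        }
        in.close();
        return bmp;
    } catch (IOException e) {
        e.printStackTrace();
        return null;
    }
}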
The app is slowed down because it allocates and de-allocates memory in each cycle. There are three ways to avoid that.
The first version works without OpenCV but still allocates some memory in each cycle. The amount is much smaller, though, so it is at least twice as fast. How? By re-using an existing, already allocated buffer (byte[]). I'm using it with a pre-allocated StreamInfo buffer of 1,000,000 bytes (about double the size I'm expecting).
By the way - reading the input stream in chunks and using BitmapFactory.decodeByteArray is much faster than putting the URL's input stream directly into BitmapFactory.decodeStream.
public static class StreamInfo {
public byte[] buffer;
public int length;
public StreamInfo(int length) {
buffer = new byte[length];
}
}
public static StreamInfo imageByte(StreamInfo buffer, String url) {
try {
URL newUrl = new URL(url);
InputStream is = (InputStream) newUrl.getContent();
byte[] tempBuffer = new byte[8192];
int bytesRead;
int position = 0;
if (buffer != null) {
// re-using existing buffer
while ((bytesRead = is.read(tempBuffer)) != -1) {
System.arraycopy(tempBuffer, 0, buffer.buffer, position,
bytesRead);
position += bytesRead;
}
buffer.length = position;
return buffer;
} else {
// allocating new buffer
ByteArrayOutputStream output = new ByteArrayOutputStream();
while ((bytesRead = is.read(tempBuffer)) != -1) {
output.write(tempBuffer, 0, bytesRead);
position += bytesRead;
}
byte[] result = output.toByteArray();
buffer = new StreamInfo(result.length * 2);
buffer.length = position;
System.arraycopy(result, 0, buffer.buffer, 0, result.length);
return buffer;
}
} catch (MalformedURLException e) {
e.printStackTrace();
return null;
} catch (IOException e) {
e.printStackTrace();
return null;
}
}
The second version uses an OpenCV Mat and a pre-allocated Bitmap. Receiving the stream is done as in version one, so no further memory allocation is needed (for details check out this link). This version works fine but is a bit slower because it converts between an OpenCV Mat and a Bitmap.
private NetworkCameraFrame frame;
private HttpUtils.StreamInfo buffer = new HttpUtils.StreamInfo(1000000);
private MatOfByte matForConversion;
private NetworkCameraFrame receive() {
buffer = HttpUtils.imageByte(buffer, uri);
if (buffer == null || buffer.length == 0)
return null;
Log.d(TAG, "Received image with byte-array of length: "
+ buffer.length / 1024 + "kb");
if (frame == null) {
// Decode only the bounds so the frame can be sized without allocating pixel data
final BitmapFactory.Options options = new BitmapFactory.Options();
options.inJustDecodeBounds = true;
BitmapFactory.decodeByteArray(buffer.buffer, 0, buffer.length, options);
frame = new NetworkCameraFrame(options.outWidth, options.outHeight);
Log.d(TAG, "NetworkCameraFrame created");
}
if (matForConversion == null)
matForConversion = new MatOfByte(buffer.buffer);
else
matForConversion.fromArray(buffer.buffer);
Mat newImage = Highgui.imdecode(matForConversion,
Highgui.IMREAD_UNCHANGED);
frame.put(newImage);
return frame;
}
private class NetworkCameraFrame implements CameraFrame {
Mat mat;
private int mWidth;
private int mHeight;
private Bitmap mCachedBitmap;
private boolean mBitmapConverted;
public NetworkCameraFrame(int width, int height) {
this.mWidth = width;
this.mHeight = height;
this.mat = new Mat(new Size(width, height), CvType.CV_8U);
this.mCachedBitmap = Bitmap.createBitmap(width, height,
Bitmap.Config.ARGB_8888);
}
@Override
public Mat gray() {
return mat.submat(0, mHeight, 0, mWidth);
}
@Override
public Mat rgba() {
return mat;
}
// @Override
// public Mat yuv() {
// return mYuvFrameData;
// }
@Override
public synchronized Bitmap toBitmap() {
if (mBitmapConverted)
return mCachedBitmap;
Mat rgba = this.rgba();
Utils.matToBitmap(rgba, mCachedBitmap);
mBitmapConverted = true;
return mCachedBitmap;
}
public synchronized void put(Mat frame) {
mat = frame;
invalidate();
}
public void release() {
mat.release();
mCachedBitmap.recycle();
}
public void invalidate() {
mBitmapConverted = false;
}
};
The third version follows the "Usage of BitmapFactory" instructions on BitmapFactory.Options together with a mutable Bitmap that is re-used while decoding. It even worked for me on Android Jelly Bean. Make sure you use the correct BitmapFactory.Options when creating the very first Bitmap.
BitmapFactory.Options options = new BitmapFactory.Options();
options.inBitmap = bmp; // the previously decoded Bitmap that should be reused
options.inMutable = true;
options.inSampleSize = 1;
bmp = BitmapFactory.decodeByteArray(buffer, 0, buffer.length, options);
options.inBitmap = bmp; // reuse the decoded Bitmap for the next frame
This turned out to be the fastest streaming approach of the three.
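For completeness, a minimal sketch of how such a decode loop might be driven (field and method names are placeholders, not the exact code above):

// Decode each downloaded JPEG frame into the same mutable Bitmap.
private Bitmap reusableBmp;
private final BitmapFactory.Options reuseOptions = new BitmapFactory.Options();

private Bitmap decodeFrame(byte[] data, int length) {
    reuseOptions.inMutable = true;
    reuseOptions.inSampleSize = 1;
    reuseOptions.inBitmap = reusableBmp; // null on the first frame, so a fresh Bitmap is allocated
    reusableBmp = BitmapFactory.decodeByteArray(data, 0, length, reuseOptions);
    return reusableBmp;
}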
ARCore camera doesn't seem to support takePicture.
https://developers.google.com/ar/reference/java/com/google/ar/core/Camera
Anyone know how I can take pictures with ARCore?
I am assuming you mean a picture of what the camera is seeing and the AR objects. At a high level you need to get permission to write to external storage to save the picture, copy the frame from OpenGL and then save it as a png (for example). Here are the specifics:
Add the WRITE_EXTERNAL_STORAGE permission to the AndroidManifest.xml
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
Then change CameraPermissionHelper to iterate over both the CAMERA and WRITE_EXTERNAL_STORAGE permissions to make sure they are granted
private static final String REQUIRED_PERMISSIONS[] = {
Manifest.permission.WRITE_EXTERNAL_STORAGE,
Manifest.permission.CAMERA
};
/**
* Check to see we have the necessary permissions for this app.
*/
public static boolean hasCameraPermission(Activity activity) {
for (String p : REQUIRED_PERMISSIONS) {
if (ContextCompat.checkSelfPermission(activity, p) !=
PackageManager.PERMISSION_GRANTED) {
return false;
}
}
return true;
}
/**
* Check to see we have the necessary permissions for this app,
* and ask for them if we don't.
*/
public static void requestCameraPermission(Activity activity) {
ActivityCompat.requestPermissions(activity, REQUIRED_PERMISSIONS,
CAMERA_PERMISSION_CODE);
}
/**
* Check to see if we need to show the rationale for this permission.
*/
public static boolean shouldShowRequestPermissionRationale(Activity activity) {
for (String p : REQUIRED_PERMISSIONS) {
if (ActivityCompat.shouldShowRequestPermissionRationale(activity, p)) {
return true;
}
}
return false;
}
Next, add a couple fields to HelloARActivity to keep track of the dimensions of the frame and boolean to indicate when to save the picture.
private int mWidth;
private int mHeight;
private boolean capturePicture = false;
Set the width and height in onSurfaceChanged()
public void onSurfaceChanged(GL10 gl, int width, int height) {
mDisplayRotationHelper.onSurfaceChanged(width, height);
GLES20.glViewport(0, 0, width, height);
mWidth = width;
mHeight = height;
}
At the bottom of onDrawFrame(), add a check for the capture flag. This should be done after all the other drawing happens.
if (capturePicture) {
capturePicture = false;
try {
SavePicture();
} catch (IOException e) {
e.printStackTrace();
}
}
Then add the onClick method for a button to take the picture, and the actual code to save the image:
public void onSavePicture(View view) {
// Here we just set a flag so we can copy the image
// from the onDrawFrame() method. This is required
// because OpenGL calls must happen on the rendering thread.
this.capturePicture = true;
}
/**
* Call from the GLThread to save a picture of the current frame.
*/
public void SavePicture() throws IOException {
int pixelData[] = new int[mWidth * mHeight];
// Read the pixels from the current GL frame.
IntBuffer buf = IntBuffer.wrap(pixelData);
buf.position(0);
GLES20.glReadPixels(0, 0, mWidth, mHeight,
GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, buf);
// Create a file in the Pictures/HelloAR album.
final File out = new File(Environment.getExternalStoragePublicDirectory(
Environment.DIRECTORY_PICTURES) + "/HelloAR", "Img" +
Long.toHexString(System.currentTimeMillis()) + ".png");
// Make sure the directory exists
if (!out.getParentFile().exists()) {
out.getParentFile().mkdirs();
}
// Convert the pixel data from RGBA to what Android wants, ARGB.
int bitmapData[] = new int[pixelData.length];
for (int i = 0; i < mHeight; i++) {
for (int j = 0; j < mWidth; j++) {
int p = pixelData[i * mWidth + j];
int b = (p & 0x00ff0000) >> 16;
int r = (p & 0x000000ff) << 16;
int ga = p & 0xff00ff00;
bitmapData[(mHeight - i - 1) * mWidth + j] = ga | r | b;
}
}
// Create a bitmap.
Bitmap bmp = Bitmap.createBitmap(bitmapData,
mWidth, mHeight, Bitmap.Config.ARGB_8888);
// Write it to disk.
FileOutputStream fos = new FileOutputStream(out);
bmp.compress(Bitmap.CompressFormat.PNG, 100, fos);
fos.flush();
fos.close();
runOnUiThread(new Runnable() {
@Override
public void run() {
showSnackbarMessage("Wrote " + out.getName(), false);
}
});
}
Last step is to add the button to the end of activity_main.xml layout
<Button
android:id="@+id/fboRecord_button"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_alignStart="@+id/surfaceview"
android:layout_alignTop="@+id/surfaceview"
android:onClick="onSavePicture"
android:text="Snap"
tools:ignore="OnClick"/>
Acquiring the image buffer
In the latest ARCore SDK we get access to the image buffer via the Frame class. Below is sample code that accesses the camera image buffer.
private void onSceneUpdate(FrameTime frameTime) {
try {
Frame currentFrame = sceneView.getArFrame();
Image currentImage = currentFrame.acquireCameraImage();
int imageFormat = currentImage.getFormat();
if (imageFormat == ImageFormat.YUV_420_888) {
Log.d("ImageFormat", "Image format is YUV_420_888");
}
currentImage.close();
} catch (NotYetAvailableException e) {
Log.e("onSceneUpdate", "Camera image not yet available", e);
}
}
onSceneUpdate() will be called on every frame once you have registered it as the scene's update-listener callback. The image will be in YUV_420_888 format, but it will have the full field of view of the native high-resolution camera.
Also, do not forget to release the received image by calling currentImage.close(); otherwise you will receive a ResourceExhaustedException on the next run of onSceneUpdate().
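As a rough sketch of the registration step (assuming a Sceneform ArSceneView stored in a field named sceneView; the exact listener method name can vary between Sceneform releases):

// Register onSceneUpdate() so it runs once per rendered frame.
sceneView.getScene().addOnUpdateListener(this::onSceneUpdate);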
Writing the acquired image buffer to a file
The following implementation converts the YUV buffer to a compressed JPEG byte array:
private static byte[] NV21toJPEG(byte[] nv21, int width, int height) {
ByteArrayOutputStream out = new ByteArrayOutputStream();
YuvImage yuv = new YuvImage(nv21, ImageFormat.NV21, width, height, null);
yuv.compressToJpeg(new Rect(0, 0, width, height), 100, out);
return out.toByteArray();
}
public static void WriteImageInformation(Image image, String path) throws IOException {
byte[] data = NV21toJPEG(YUV_420_888toNV21(image),
image.getWidth(), image.getHeight());
// try-with-resources closes the stream even if write() fails
try (BufferedOutputStream bos = new BufferedOutputStream(new FileOutputStream(path))) {
bos.write(data);
bos.flush();
}
}
private static byte[] YUV_420_888toNV21(Image image) {
byte[] nv21;
ByteBuffer yBuffer = image.getPlanes()[0].getBuffer();
ByteBuffer uBuffer = image.getPlanes()[1].getBuffer();
ByteBuffer vBuffer = image.getPlanes()[2].getBuffer();
int ySize = yBuffer.remaining();
int uSize = uBuffer.remaining();
int vSize = vBuffer.remaining();
nv21 = new byte[ySize + uSize + vSize];
//U and V are swapped
yBuffer.get(nv21, 0, ySize);
vBuffer.get(nv21, ySize, vSize);
uBuffer.get(nv21, ySize + vSize, uSize);
return nv21;
}
Sorry for the late answer. You can use the following code to take a picture in ARCore:
private String generateFilename() {
String date =
new SimpleDateFormat("yyyyMMddHHmmss", java.util.Locale.getDefault()).format(new Date());
return Environment.getExternalStoragePublicDirectory(
Environment.DIRECTORY_PICTURES) + File.separator + "Sceneform/" + date + "_screenshot.jpg";
}
private void saveBitmapToDisk(Bitmap bitmap, String filename) throws IOException {
File out = new File(filename);
if (!out.getParentFile().exists()) {
out.getParentFile().mkdirs();
}
try (FileOutputStream outputStream = new FileOutputStream(filename);
ByteArrayOutputStream outputData = new ByteArrayOutputStream()) {
bitmap.compress(Bitmap.CompressFormat.JPEG, 100, outputData); // JPEG to match the .jpg filename
outputData.writeTo(outputStream);
outputStream.flush();
outputStream.close();
} catch (IOException ex) {
throw new IOException("Failed to save bitmap to disk", ex);
}
}
private void takePhoto() {
final String filename = generateFilename();
/*ArSceneView view = fragment.getArSceneView();*/
mSurfaceView = findViewById(R.id.surfaceview);
// Create a bitmap the size of the scene view.
final Bitmap bitmap = Bitmap.createBitmap(mSurfaceView.getWidth(), mSurfaceView.getHeight(),
Bitmap.Config.ARGB_8888);
// Create a handler thread to offload the processing of the image.
final HandlerThread handlerThread = new HandlerThread("PixelCopier");
handlerThread.start();
// Make the request to copy.
PixelCopy.request(mSurfaceView, bitmap, (copyResult) -> {
if (copyResult == PixelCopy.SUCCESS) {
try {
saveBitmapToDisk(bitmap, filename);
} catch (IOException e) {
Toast toast = Toast.makeText(DrawAR.this, e.toString(),
Toast.LENGTH_LONG);
toast.show();
return;
}
Snackbar snackbar = Snackbar.make(findViewById(android.R.id.content),
"Photo saved", Snackbar.LENGTH_LONG);
snackbar.setAction("Open in Photos", v -> {
File photoFile = new File(filename);
Uri photoURI = FileProvider.getUriForFile(DrawAR.this,
DrawAR.this.getPackageName() + ".ar.codelab.name.provider",
photoFile);
Intent intent = new Intent(Intent.ACTION_VIEW, photoURI);
intent.setDataAndType(photoURI, "image/*");
intent.addFlags(Intent.FLAG_GRANT_READ_URI_PERMISSION);
startActivity(intent);
});
snackbar.show();
} else {
Log.d("DrawAR", "Failed to copyPixels: " + copyResult);
Toast toast = Toast.makeText(DrawAR.this,
"Failed to copyPixels: " + copyResult, Toast.LENGTH_LONG);
toast.show();
}
handlerThread.quitSafely();
}, new Handler(handlerThread.getLooper()));
}
In my app I have to upload selected images to parse.com so that prints can be made from them. I have to maintain image quality, so I cannot resize the images.
I only need to upload the images to parse.com; I do not need to show them on the device screen (the images come from the image gallery, a Facebook album, or the SD card), and I cannot scale them down because of the print requirement.
I am getting an OutOfMemory error on BitmapFactory.decodeFile(). How can I solve this?
Would using android:largeHeap="true" solve my issue?
I am getting this crash on a Samsung SM-G900T, but not on the emulator.
I tried to put
BitmapFactory.Options options = new BitmapFactory.Options();
options.inJustDecodeBounds = false;
options.inPreferredConfig = Config.RGB_565;
But it is not working.
Below is my AsyncTask class for uploading images to Parse.com
class UploadFileFromURL extends AsyncTask<String, String, String> {
ProgressDialog dialog;
String albumId = "";
@Override
protected void onPreExecute() {
super.onPreExecute();
}
@Override
protected String doInBackground(String... f_url) {
try {
for (int i = 0; i < arrListImgBean.size(); i++) {
if (!isUploading || objAsyncUpload.isCancelled()) {
break;
}
try {
if (arrListImgBean.get(i).imageStatus == 1)
continue;
else if (arrListImgBean.get(i).imageStatus == 2) {
isPhotodeleted = true;
publishProgress("" + countUploaded);
deletePhoto(i);
}
else {
isPhotodeleted = false;
try {
Bitmap b = null;
InputStream is = null;
BitmapFactory.Options options = new BitmapFactory.Options();
options.inJustDecodeBounds = false;
options.inPreferredConfig = Config.RGB_565; // to
// reduce
// the
// memory
options.inDither = true;
if (arrListImgBean.get(i).imgURL
.startsWith("http")) {
try {
URL url = new URL(
arrListImgBean.get(i).imgURL);
is = url.openConnection()
.getInputStream();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
b = BitmapFactory.decodeStream(is, null,
options);
} else {
b = BitmapFactory.decodeFile(
arrListImgBean.get(i).imgURL,
options);
}
// Convert it to byte
ByteArrayOutputStream stream = new ByteArrayOutputStream();
// Bitmap out = Bitmap.createScaledBitmap(b,
// 1500, 2100, false);
b.compress(Bitmap.CompressFormat.PNG, 100,
stream);
byte[] image = stream.toByteArray();
ParseFile file = new ParseFile("Android.png",
image);
file.save();
String uploadedUrl = file.getUrl();
if (uploadedUrl != null) {
ParseObject imgupload = new ParseObject(
"Photo");
imgupload.put("userName", ParseUser
.getCurrentUser().getEmail());
imgupload.put("photoURL", file);
imgupload.put("photoID",
arrListImgBean.get(i).imageId);
imgupload.put("count", 1);
imgupload.put("albumName", albumId);
imgupload.save();
String objId = imgupload.getObjectId();
if (objId != null && !objId.isEmpty()) {
countUploaded++;
publishProgress("" + countUploaded);
database.updateImageStatus(
arrListImgBean.get(i).imageId,
Constants.STATUS_UPLOADED,
objId, uploadedUrl);
}
}
} catch (Exception e) {
}
}
} catch (Exception e) {
isUploading = false;
e.printStackTrace();
}
}
} catch (Exception e) {
Log.e("Error: ", e.getMessage());
}
return null;
}
@Override
protected void onPostExecute(String file_url) {
// dismissDialog(progress_bar_type);
isUploading = false;
btnUploadImages.setBackgroundResource(R.drawable.upload_photo);
vprogress.setCompoundDrawables(null, null, null, null);
// stopLoading();
setProgressMsg();
}
}
android:largeHeap="true"
This line can mitigate your problem, but it is only a temporary fix; the crash may occur again if the number or size of the images increases. It is better to use the Picasso library to deal with images.
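A minimal sketch of what that looks like for displaying an image (imageView and the URL are placeholders; Picasso handles downsampling and caching for you):

Picasso.with(context)
        .load("https://example.com/photo.jpg")
        .fit()
        .centerInside()
        .into(imageView);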
Consider an image of 1024x1024 dp on a device whose screen is 512x512 dp (both figures are just for illustration). Loading the full-resolution image on the smaller device is a waste of memory. What you can do is scale the image down so that it fits the device screen; that way you not only save a lot of memory but still get a proper, clear and sharp image.
I am adding code for scaling the image which I am using currently in my project.
final BitmapFactory.Options ops = new BitmapFactory.Options();
ops.inJustDecodeBounds = true;
// Decode only the bounds first so that ops.outWidth/outHeight are populated
FileInputStream streamIn = new FileInputStream(file);
BitmapFactory.decodeStream(streamIn, null, ops);
streamIn.close();
// Find the correct scale value. It should be a power of 2.
final int REQUIRED_SIZE = 300;
int width_tmp = ops.outWidth, height_tmp = ops.outHeight;
int scale = 1;
while (width_tmp / 2 >= REQUIRED_SIZE && height_tmp / 2 >= REQUIRED_SIZE) {
width_tmp /= 2;
height_tmp /= 2;
scale *= 2;
}
ops.inJustDecodeBounds = false;
ops.inSampleSize = scale;
// Re-open the stream; the first one was consumed by the bounds decode
streamIn = new FileInputStream(file);
bitmap = BitmapFactory.decodeStream(streamIn, null, ops); //This gets the image
streamIn.close();
Choose a REQUIRED_SIZE value depending on the device's screen display size.
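For example, a rough way to derive it from the screen (a sketch; the sizing policy of using the smaller screen dimension is my assumption):

// Use the smaller screen dimension as the target decode size.
DisplayMetrics metrics = context.getResources().getDisplayMetrics();
int requiredSize = Math.min(metrics.widthPixels, metrics.heightPixels);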
try {
image = readInFile(path);
}
catch(Exception e) {
e.printStackTrace();
}
// Create the ParseFile
ParseFile file = new ParseFile("picturePath", image);
// Upload the image into Parse Cloud
file.saveInBackground();
// Create a New Class called "ImageUpload" in Parse
ParseObject imgupload = new ParseObject("Image");
// Create a column named "ImageName" and set the string
imgupload.put("Image", "picturePath");
// Create a column named "ImageFile" and insert the image
imgupload.put("ImageFile", file);
// Create the class and the columns
imgupload.saveInBackground();
// Show a simple toast message
Toast.makeText(LoadImg.this, "Image Saved, Upload another one ",Toast.LENGTH_SHORT).show();
private byte[] readInFile(String path) throws IOException {
// TODO Auto-generated method stub
byte[] data = null;
File file = new File(path);
InputStream input_stream = new BufferedInputStream(new FileInputStream(
file));
ByteArrayOutputStream buffer = new ByteArrayOutputStream();
data = new byte[16384]; // 16K
int bytes_read;
while ((bytes_read = input_stream.read(data, 0, data.length)) != -1) {
buffer.write(data, 0, bytes_read);
}
input_stream.close();
return buffer.toByteArray();
}
I'm using the Volley library in my project, but I have a problem with an OutOfMemory exception. In my application I'm downloading thumbnails and full-size images from a server via NetworkImageView using the setImageUrl method. I'm using this BitmapLruCache:
public class BitmapLruCache extends LruCache<String, Bitmap> implements ImageLoader.ImageCache {
public static int getDefaultLruCacheSize() {
final int maxMemory = (int) (Runtime.getRuntime().maxMemory() / 1024);
final int cacheSize = maxMemory / 8;
return cacheSize;
}
public BitmapLruCache() {
this(getDefaultLruCacheSize());
}
public BitmapLruCache(int sizeInKiloBytes) {
super(sizeInKiloBytes);
}
@Override
protected int sizeOf(String key, Bitmap value) {
return value.getRowBytes() * value.getHeight() / 1024;
}
@Override
public Bitmap getBitmap(String url) {
return get(url);
}
@Override
public void putBitmap(String url, Bitmap bitmap) {
put(url, bitmap);
}
}
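For context, a cache like this is typically plugged into Volley roughly as follows (a sketch; the queue and view variables are mine):

// Wire the LRU cache into Volley's ImageLoader and use it from a NetworkImageView.
RequestQueue queue = Volley.newRequestQueue(context);
ImageLoader imageLoader = new ImageLoader(queue, new BitmapLruCache());
networkImageView.setImageUrl(imageUrl, imageLoader);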
I'm getting an OutOfMemoryException on an HTC Desire (Android 2.2.2). How can I deal with this exception? Am I doing something wrong?
Edit
I got this exception during a monkey test:
java.lang.OutOfMemoryError
at com.android.volley.toolbox.ByteArrayPool.getBuf(ByteArrayPool.java:101)
at com.android.volley.toolbox.PoolingByteArrayOutputStream.expand(PoolingByteArrayOutputStream.java:76)
at com.android.volley.toolbox.PoolingByteArrayOutputStream.write(PoolingByteArrayOutputStream.java:84)
at com.android.volley.toolbox.BasicNetwork.entityToBytes(BasicNetwork.java:213)
at com.android.volley.toolbox.BasicNetwork.performRequest(BasicNetwork.java:104)
at com.android.volley.NetworkDispatcher.run(NetworkDispatcher.java:105)
@Sipka - it doesn't solve my problem.
@Muhammad Babar - the Volley library handles all network/bitmap/cache operations itself, so I need a solution to the OutOfMemory exception caused by Volley.
Use this code to create the Bitmap on a background thread; it should help:
Bitmap bitmap = null;
HttpResponse response = null;
InputStream instream = null;
try {
File file = new File(Environment.getExternalStorageDirectory()
.toString(), floderName);
String s = file.getAbsolutePath();
f = new File(s);
if (!f.exists()) {
HttpClient client = new DefaultHttpClient();
HttpGet request = new HttpGet(new URL(url[0]).toURI());
response = client.execute(request);
if (response.getStatusLine().getStatusCode() != 200) {
return null;
}
// BufferedHttpEntity bufHttpEntity = new BufferedHttpEntity(
// response.getEntity());
instream = response.getEntity().getContent();
OutputStream os = new FileOutputStream(f);
Globals.CopyStream(instream, os);
os.close();
instream.close();
}
FileInputStream fs = null;
try {
fs = new FileInputStream(f);
} catch (FileNotFoundException e) {
// TODO do something intelligent
e.printStackTrace();
}
BitmapFactory.Options o2 = new BitmapFactory.Options();
o2.inDither = false; // Disable Dithering mode
o2.inPurgeable = true; // Tell to gc that whether it needs free
// memory, the Bitmap can be cleared
o2.inInputShareable = true; // Which kind of reference will be used
// to recover the Bitmap data after
// being clear, when it will be used in
// the future
o2.inTempStorage = new byte[32 * 1024];
o2.inSampleSize = 1;
bitmap = BitmapFactory.decodeFileDescriptor(fs.getFD(), null, o2);
bit = bitmap;
// bit.compress(Bitmap.CompressFormat.JPEG, 90, null);
newsFeed.setBitmap(bit);
// Data.globelCoverIcon = bit;
// OutputStream os = new FileOutputStream(f);
} catch (Exception ex) {
ex.printStackTrace();
}
public class Globals {
private static final int JPEG_EOI_1 = 0xFF;
private static final int JPEG_EOI_2 = 0xD9;
public static void CopyStream(InputStream is, OutputStream os) {
final int buffer_size = 1024;
try {
byte[] bytes = new byte[buffer_size];
for (;;) {
int count = is.read(bytes, 0, buffer_size);
if (count == -1)
break;
os.write(bytes, 0, count);
}
} catch (Exception ex) {
Log.e("App", ex.getMessage(), ex);
}
}
}
I'm new to android and I'm confused about how to deal with Bitmaps.
I want to download a Bitmap, it could be quite large, and save it to a temporary internal file. I'm then going to draw this Bitmap to a Canvas later.
My current method is to:
1. Download the input stream
2. Copy the stream
3. Use one copy to work out the bounds via BitmapFactory.Options
4. Use the other copy to decode the full bitmap with the computed sample size
However, I need landscape and portrait versions, so now I will have to do this twice and save two images.
Alternatively, I have seen people use bm.compress(Bitmap.CompressFormat.JPEG, 50, bos); to save a file instead. This bypasses the sample-size decoding because the file is saved directly from the stream; I guess I would then use a matrix to scale when I draw to my Canvas.
Basically, I am confused about the best approach for this task. Which method is less likely to run out of memory, and which is more commonly used?
Cheers
byte[] imagesByte = getLogoImage(yourUrl);
Set it on the ImageView:
imgView.setImageBitmap(BitmapFactory.decodeByteArray(imagesByte, 0, imagesByte.length));
The download method:
public static byte[] getLogoImage(String url){
try {
URL imageUrl = new URL(url);
URLConnection ucon = imageUrl.openConnection();
InputStream is = ucon.getInputStream();
BufferedInputStream bis = new BufferedInputStream(is);
ByteArrayBuffer baf = new ByteArrayBuffer(500);
int current = 0;
while ((current = bis.read()) != -1) {
baf.append((byte) current);
}
return baf.toByteArray();
} catch (Exception e) {
Log.d("ImageManager", "Error: " + e.toString());
}
return null;
}
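Reading one byte at a time is slow for large images. A chunked variant (a sketch under the same assumptions as the method above) avoids that:

public static byte[] getLogoImageChunked(String url) {
    try {
        InputStream is = new URL(url).openConnection().getInputStream();
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] chunk = new byte[8192];
        int read;
        // Copy the stream in 8 KB chunks instead of byte by byte
        while ((read = is.read(chunk)) != -1) {
            out.write(chunk, 0, read);
        }
        is.close();
        return out.toByteArray();
    } catch (Exception e) {
        Log.d("ImageManager", "Error: " + e.toString());
    }
    return null;
}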
In Android you have to be aware of the limited memory: large images won't fit in memory and you will get OutOfMemory exceptions.
The key is, after saving the image to internal storage, to load it at the display resolution.
First download the image; this should be done outside the UI thread. Let _url be a URL instance with the image address and _file the String containing the destination file:
URLConnection conn = _url.openConnection();
conn.connect();
InputStream is = conn.getInputStream();
boolean success = false; //track successful operation
if( _file != null)
{
try
{
FileOutputStream fos = new FileOutputStream(_file);
byte data[] = new byte[4086]; //use 4086 bytes buffer
int count = 0;
while ((count = is.read(data)) != -1)
{
fos.write(data, 0, count);//write the data
}
is.close();
fos.flush();
fos.close();
int len = conn.getContentLength();
File f = new File( _file);//check the file length is correct
if( len== f.length())
{
success = true;
}
else
{
//error downloading, delete the file
File tmp = new File( _file);
if( tmp.exists())
{
tmp.delete();
}
}
}catch (Exception e )
{
try
{
e.printStackTrace();
//delete file with errors
File tmp = new File( _file);
if( tmp.exists())
{
tmp.delete();
}
}
catch (Exception ex)
{
ex.printStackTrace();
}
}
finally
{
try { is.close(); } catch (IOException e) { e.printStackTrace(); } //cleanup
}
}
Then, when you need to load the image at the desired resolution, the key is to use BitmapFactory to read the bitmap info first and then decode a scaled bitmap:
public static Bitmap bitmapFromFile(int width, int height, String file)
{
Bitmap bitmap = null;
final BitmapFactory.Options options = new BitmapFactory.Options();
if( height >0 && width > 0 ) {
options.inJustDecodeBounds = true;//only read bitmap metadata
BitmapFactory.decodeFile(file,options);
// Calculate inSampleSize
options.inSampleSize = calculateInSampleSize(options, width, height);
// Decode bitmap with inSampleSize set
options.inJustDecodeBounds = false;
}
try
{
bitmap = BitmapFactory.decodeFile(file, options);//decode scaled bitmap
}catch (Throwable t)
{
if( bitmap != null)
{
bitmap.recycle();//cleanup memory, very important!
}
return null;
}
return bitmap;
}
The final step is to calculate the scale factor:
public static int calculateInSampleSize(
BitmapFactory.Options options, int reqWidth, int reqHeight) {
// Raw height and width of image
final int height = options.outHeight;
final int width = options.outWidth;
int inSampleSize = 1;
if (height > reqHeight || width > reqWidth) {
final int halfHeight = height;
final int halfWidth = width;
// Calculate the largest inSampleSize value that is a power of 2 and keeps both
// height and width larger than the requested height and width.
while ((couldShrink(halfWidth, reqWidth, inSampleSize)&&
couldShrink(halfHeight,reqHeight, inSampleSize))
//&&(halfHeight*halfWidth)/inSampleSize > maxsize)
)
{
inSampleSize *= 2;
}
}
return inSampleSize;
}
private static boolean couldShrink ( int dimension, int req_dimension, int divider)
{
int actual = dimension / divider;
int next = dimension / (divider*2);
int next_error = Math.abs(next - req_dimension);
int actual_error = Math.abs(actual-req_dimension);
return next > req_dimension ||
(actual > req_dimension && (next_error < actual_error) )
;
}
That is how to do it by hand. Instead, I recommend using Picasso, which handles downloading, disk caching and memory caching of your image.
To load into an ImageView called image, showing a background (R.drawable.img_bg) while downloading:
Picasso.with(image.getContext())
.load(url).placeholder(R.drawable.img_bg).fit()
.into(image, new Callback.EmptyCallback()
{
@Override
public void onSuccess()
{
holder.progress.setVisibility(View.GONE); //hide progress bar
}
@Override
public void onError()
{
holder.progress.setVisibility(View.GONE); //hide progress bar
//do whatever you design to show error
}
});
To handle the bitmap yourself:
//first declare a target
_target = new Target()
{
@Override
public void onBitmapLoaded(Bitmap bitmap, Picasso.LoadedFrom from)
{
//handle your bitmap (store it and use it on your canvas)
}
@Override
public void onBitmapFailed(Drawable errorDrawable)
{
//handle your fail state
}
@Override
public void onPrepareLoad(Drawable placeHolderDrawable)
{
//for example, draw a placeholder while downloading
}
};
Now you just have to load and resize your image:
Picasso.with(context).load(url).resize(192, 192).centerCrop().into(_target);
Hope that helps.
I am building an app that shows images in a GridView, and I am fetching 20 images from a server. The resolution of each image is 720x540. I used JSON parsing to fetch the URLs and the code below to convert them into Bitmaps.
public static Bitmap loadImageFromUrl(String url) {
InputStream inputStream;
Bitmap b;
try {
inputStream = (InputStream) new URL(url).getContent();
BitmapFactory.Options bpo= new BitmapFactory.Options();
if(bpo.outWidth>500) {
bpo.inSampleSize=8;
b=BitmapFactory.decodeStream(inputStream, null,bpo );
} else {
bpo.inSampleSize=2;
b=BitmapFactory.decodeStream(inputStream, null,bpo );
}
return b;
} catch (IOException e) {
throw new RuntimeException(e);
}
}
My app works, but it takes too much time to load the images, which makes the whole app slow. Should I decrease the resolution of the images?
How can I get around this issue?
If you are doing a grid view to load 20 images of such resolution, I would suggest the following:
Definitely reduce the size of the images. Unless you are targeting a tablet, you will be fine as most smartphones cannot achieve that resolution with 20 images.
Cache images if you can.
Download the images on a different thread. A HashMap makes this easy: put all the ImageViews in it with the image file names (or other IDs) as keys, send a message to your Handler when an image has been downloaded, and update the view after it is decoded; you can then retrieve the views directly. Just remember to check that they are still in the window. This way the images show up one after another quickly. I don't think multithreading the decoding itself will help much; just make sure another thread "pushes" the images while the main UI thread updates the views. The user experience will be greatly improved.
Hope this helps.
---some implementations, I don't have the complete code with me right now---
Have a data structure to match the views with the data that comes in; it is very handy here.
private HashMap<String,ImageView> pictures;
When you get the list of image urls, iterate through them:
pictures.put(id,view);
try{
FileInputStream in = openFileInput(id);
Bitmap bitmap = null;
bitmap = BitmapFactory.decodeStream(in, null, null);
view.setImageBitmap(bitmap);
}catch(Exception e){
new Thread(new PictureGetter(this,mHandler,id)).start();
}
(Here the PictureGetter simply fetches the image if it is not cached already and caches it.)
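A hedged sketch of what such a PictureGetter might look like (the original implementation is not shown; the class shape, the message code, and the resolveUrl() helper below are my assumptions):

class PictureGetter implements Runnable {
    private final Context context;
    private final Handler handler;
    private final String id;

    PictureGetter(Context context, Handler handler, String id) {
        this.context = context;
        this.handler = handler;
        this.id = id;
    }

    @Override
    public void run() {
        try {
            // Download the image and cache it under "id" in internal storage.
            InputStream in = new URL(resolveUrl(id)).openStream();
            FileOutputStream out = context.openFileOutput(id, Context.MODE_PRIVATE);
            byte[] buf = new byte[8192];
            int read;
            while ((read = in.read(buf)) != -1) {
                out.write(buf, 0, read);
            }
            out.close();
            in.close();
            // Tell the UI thread which image is ready so it can update the view.
            handler.obtainMessage(0, id).sendToTarget();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    private String resolveUrl(String imageId) {
        // Placeholder: map the cached id back to its download URL in your own way.
        return "https://example.com/images/" + imageId;
    }
}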
Code to update the image view:
if (id != null) {
ImageView iv = pictures.get(id);
if (iv != null) {
try {
FileInputStream in = openFileInput(id);
Bitmap bitmap = BitmapFactory.decodeStream(in, null, null);
iv.setImageBitmap(bitmap);
} catch (Exception e) {
}
}
}
Picasso library
The solution: instead of decoding the bitmap yourself, use the excellent Picasso library to load the image directly. It is very fast, and you will really like it.
Add the Picasso jar file to your project (download the Picasso jar file from its site), then use Picasso to load the image like this:
Picasso.with(context).load(new File(title)).centerCrop()
.resize(150, 150).error(R.drawable.ic_launcher).into(image);
where title is the path of the image you want to load; centerCrop(), resize() and error() are optional.
I'm guessing that most of the loading time is because of the large amount of images combined with the size of the images.
There are 2 possible solutions:
Resize the images, or lower their quality, so that the file size is below roughly 75 KB.
Use multi-threading to retrieve several images at once. This might not help if the user's connection is really slow, but combined with a small enough file size it should help enough. You might want to determine the device's current bandwidth and base the number of threads you run on that.
For instance: 20 images of 75KB each and an available connection of 200 KB/s = 3 or 4 concurrent threads.
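A rough sketch of that heuristic (all numbers are illustrative, and the pool-sizing rule is my assumption):

// Size a small fixed thread pool from expected file size and available bandwidth.
int imageSizeKb = 75;
int bandwidthKbPerSec = 200;
int threads = Math.max(1, Math.round(bandwidthKbPerSec / (float) imageSizeKb));
ExecutorService downloadPool = Executors.newFixedThreadPool(threads);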
Hope this helps.
I had the same problem in my Android app. When you decode a bitmap from a large image and set it on an ImageView with setImageBitmap, your application will probably slow down, and after a few tries you'll get an "out of memory" exception.
Two possible ways to handle this problem:
1. Reduce the bitmap size when you decode it from the file.
2. Use an image library.
I preferred the second way and used Universal Image Loader: https://github.com/nostra13/Android-Universal-Image-Loader
String url = "file://" + your_file_path
com.nostra13.universalimageloader.core.ImageLoader.getInstance().displayImage(url, ivPicture, options);
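The snippet above assumes the loader has already been initialised and that an options object exists; a minimal sketch of that setup (method names follow Universal Image Loader's API, the configuration choices are mine):

// One-time initialisation, e.g. in Application.onCreate()
ImageLoader.getInstance().init(ImageLoaderConfiguration.createDefault(context));
DisplayImageOptions options = new DisplayImageOptions.Builder()
        .cacheInMemory(true)
        .cacheOnDisk(true)
        .build();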
public class clothImageLoader {
// the simplest in-memory cache implementation. This should be replaced with
// something like SoftReference or BitmapOptions.inPurgeable(since 1.6)
// public static HashMap<String, Bitmap> cache = new HashMap<String,
// Bitmap>();
private static File cacheDir;
public clothImageLoader(Context context) {
// Make the background thread low priority. This way it will not affect
// the UI performance
photoLoaderThread.setPriority(Thread.NORM_PRIORITY - 1);
// Find the dir to save cached images
if (android.os.Environment.getExternalStorageState().equals(android.os.Environment.MEDIA_MOUNTED))
// cacheDir=new
// File(android.os.Environment.getExternalStorageDirectory(),"LazyList");
cacheDir = new File(ConstValue.MY_ClothBitmap_DIR);
else
cacheDir = context.getCacheDir();
if (!cacheDir.exists())
cacheDir.mkdirs();
}
final int stub_id = R.drawable.icon;
public void DisplayImage(String url, Activity activity, ImageView imageView) {
if (ConstValue.ClothRoomcache.containsKey(url))
imageView.setImageBitmap(ConstValue.ClothRoomcache.get(url));
else {
queuePhoto(url, activity, imageView);
imageView.setImageResource(stub_id);
}
}
private void queuePhoto(String url, Activity activity, ImageView imageView) {
// This ImageView may be used for other images before. So there may be
// some old tasks in the queue. We need to discard them.
photosQueue.Clean(imageView);
PhotoToLoad p = new PhotoToLoad(url, imageView);
synchronized (photosQueue.photosToLoad) {
photosQueue.photosToLoad.push(p);
photosQueue.photosToLoad.notifyAll();
}
// start thread if it's not started yet
if (photoLoaderThread.getState() == Thread.State.NEW)
photoLoaderThread.start();
}
private Bitmap getBitmap(String url) {
// I identify images by hashcode. Not a perfect solution, good for the
// demo.
String filename = String.valueOf(url.hashCode());
File f = new File(cacheDir, filename);
// from SD cache
Bitmap b = decodeFile(f);
if (b != null)
return b;
// from web
try {
Bitmap bitmap = null;
/*
* InputStream is=new URL(url).openStream(); OutputStream os = new
* FileOutputStream(f); Utils.CopyStream(is, os); os.close();
*/
URL url1 = new URL(url);
bitmap = decodeFile(f);
/* Open a connection to that URL. */
URLConnection ucon = url1.openConnection();
/*
* Define InputStreams to read from the URLConnection.
*/
InputStream is = ucon.getInputStream();
// FlushedInputStream a = new FlushedInputStream(is);
BufferedInputStream bis = new BufferedInputStream(is);
/*
* Read bytes to the Buffer until there is nothing more to read(-1).
*/
ByteArrayBuffer baf = new ByteArrayBuffer(5000);
int current = 0;
while ((current = bis.read()) != -1) {
baf.append((byte) current);
}
/* Convert the Bytes read to a String. */
FileOutputStream fos = new FileOutputStream(f);
fos.write(baf.toByteArray());
fos.flush();
fos.close();
bitmap = decodeFile(f);
return bitmap;
} catch (Exception ex) {
ex.printStackTrace();
return null;
}
}
// decodes image and scales it to reduce memory consumption
private Bitmap decodeFile(File f) {
try {
// decode image size
BitmapFactory.Options o = new BitmapFactory.Options();
o.inJustDecodeBounds = true;
BitmapFactory.decodeStream(new FileInputStream(f), null, o);
// Find the correct scale value. It should be the power of 2.
final int REQUIRED_SIZE = ConstValue.bmpSize;
int width_tmp = o.outWidth, height_tmp = o.outHeight;
int scale = 1;
while (true) {
if (width_tmp / 2 < REQUIRED_SIZE || height_tmp / 2 < REQUIRED_SIZE)
break;
width_tmp /= 2;
height_tmp /= 2;
scale *= 2;
}
// decode with inSampleSize
BitmapFactory.Options o2 = new BitmapFactory.Options();
o2.inSampleSize = scale;
return BitmapFactory.decodeStream(new FileInputStream(f), null, o2);
} catch (FileNotFoundException e) {
}
return null;
}
// Task for the queue
private class PhotoToLoad {
public String url;
public ImageView imageView;
public PhotoToLoad(String u, ImageView i) {
url = u;
imageView = i;
}
}
PhotosQueue photosQueue = new PhotosQueue();
public void stopThread() {
photoLoaderThread.interrupt();
}
// stores list of photos to download
class PhotosQueue {
private Stack<PhotoToLoad> photosToLoad = new Stack<PhotoToLoad>();
// removes all instances of this ImageView
public void Clean(ImageView image) {
for (int j = 0; j < photosToLoad.size();) {
if (photosToLoad.get(j).imageView == image)
photosToLoad.remove(j);
else
++j;
}
}
}
class PhotosLoader extends Thread {
public void run() {
try {
while (true) {
// thread waits until there are any images to load in the
// queue
if (photosQueue.photosToLoad.size() == 0)
synchronized (photosQueue.photosToLoad) {
photosQueue.photosToLoad.wait();
}
if (photosQueue.photosToLoad.size() != 0) {
PhotoToLoad photoToLoad;
synchronized (photosQueue.photosToLoad) {
photoToLoad = photosQueue.photosToLoad.pop();
// photoToLoad=photosQueue.photosToLoad.get(0);
// photosQueue.photosToLoad.remove(photoToLoad);
}
Bitmap bmp = getBitmap(photoToLoad.url);
ConstValue.ClothRoomcache.put(photoToLoad.url, bmp);
if (((String) photoToLoad.imageView.getTag()).equals(photoToLoad.url)) {
BitmapDisplayer bd = new BitmapDisplayer(bmp, photoToLoad.imageView);
Activity a = (Activity) photoToLoad.imageView.getContext();
a.runOnUiThread(bd);
}
}
if (Thread.interrupted())
break;
}
} catch (InterruptedException e) {
// allow thread to exit
}
}
}
PhotosLoader photoLoaderThread = new PhotosLoader();
// Used to display bitmap in the UI thread
class BitmapDisplayer implements Runnable {
Bitmap bitmap;
ImageView imageView;
public BitmapDisplayer(Bitmap b, ImageView i) {
bitmap = b;
imageView = i;
}
public void run() {
if (bitmap != null)
imageView.setImageBitmap(bitmap);
else
imageView.setImageResource(stub_id);
}
}
public static void clearCache() {
// clear memory cache
ConstValue.ClothRoomcache.clear();
// clear SD cache
File[] files = cacheDir.listFiles();
for (File f : files)
f.delete();
}
public class FlushedInputStream extends FilterInputStream {
public FlushedInputStream(InputStream inputStream) {
super(inputStream);
}
@Override
public long skip(long n) throws IOException {
long totalBytesSkipped = 0L;
while (totalBytesSkipped < n) {
long bytesSkipped = in.skip(n - totalBytesSkipped);
if (bytesSkipped == 0L) {
int a = read();
if (a < 0) {
break; // we reached EOF
} else {
bytesSkipped = 1; // we read one byte
}
}
totalBytesSkipped += bytesSkipped;
}
return totalBytesSkipped;
}
}
}
When you call the method in the GridView's getView():
holder.image.setTag(ChoseInfo.get(position).getLink());
imageLoader.DisplayImage(ChoseInfo.get(position).getLink(), activity, holder.image);
Here getLink() returns the image's internet URL.