I'm trying to calculate the number of photos that can still be taken with my custom camera and show that count to the user. I tried the following code:
private void numberOfPhotosAvailable() {
    StatFs stat = new StatFs(Environment.getExternalStorageDirectory().getPath());
    long resolution = getResolution();
    long bytesPerPhoto = resolution / 1048576;
    long bytesAvailable = stat.getAvailableBlocksLong() * stat.getBlockSizeLong();
    long megAvailable = bytesAvailable / 1048576;
    System.out.println("Megs :" + megAvailable);
    long photosAvailable = megAvailable / bytesPerPhoto;
    tvAvailablePhotos.setText("" + photosAvailable);
}
Method for getting the resolution:
public long getResolution() {
    Camera.Parameters params = mCamera.getParameters();
    List<Camera.Size> sizes = params.getSupportedPictureSizes();
    Camera.Size size = sizes.get(0); // assumes the first entry is the largest supported size
    int width = size.width;
    int height = size.height;
    return (long) width * height;   // cast avoids int overflow for very large sensors
}
PROBLEM:
There is a big difference between the count shown by the phone's stock camera app and the count shown in my app.
So what is the proper way of doing this?
NOTE: I will only be capturing images at the highest available quality, so I am calculating the count for that one resolution only.
It is impossible to know the exact size of the output image in advance with JPEG/PNG compression. The compression algorithms are optimized to use as little space as possible while preserving the image content (although JPEG is slightly lossy).
However, you can estimate the number of images by taking multiple sample photos and calculating the average compression ratio.
From Wikipedia:
JPEG typically achieves 10:1 compression with little perceptible loss in image quality.
So estimated storage size can be calculated as:
int bytes = width * height * 2 / compressionRatio;
It is multiplied by 2 here because in the RGB_565 config it takes 2 bytes to store 1 pixel.
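Putting that together, a minimal sketch of the estimate (the method name estimatePhotosLeft is mine, and the ratio of 10 is the Wikipedia average quoted above, not a measured value):
// Estimate remaining photos from free space, resolution, and an assumed
// average JPEG compression ratio. The real ratio varies with image content.
private long estimatePhotosLeft(int width, int height) {
    StatFs stat = new StatFs(Environment.getExternalStorageDirectory().getPath());
    long bytesAvailable = stat.getAvailableBlocksLong() * stat.getBlockSizeLong();
    int compressionRatio = 10;                                         // assumed, ~10:1 for JPEG
    long bytesPerPhoto = (long) width * height * 2 / compressionRatio; // RGB_565: 2 bytes/pixel
    return bytesAvailable / bytesPerPhoto;
}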
I think there is no ultimate solution for "images left", because there are too many different devices and cameras, and it also depends heavily on image content.
I found one good explanation for this here.
As suggested before, you can predict the image size and calculate the number of images left based on the device's free space. To get that prediction, the best approach is to try it on your device first.
After the user starts using the app, you can include their last 10 photo sizes in the calculation.
If it is not a key feature, you can present it as a prediction based on usage, not as a binding fact.
P.S. I am using a Samsung Galaxy S7 edge, and there is no images-left count in its camera at all (or I am just unable to find it).
After a lot of researching and googling, I came across this site.
According to that site, these are the steps to get the file size (a code sketch follows the list):
1. Multiply the detector's number of horizontal pixels by its number of vertical pixels to get the total number of pixels.
2. Multiply the total number of pixels by the detector's bit depth (16-bit, 14-bit, etc.) to get the total number of bits of data.
3. Divide the total number of bits by 8 to get the file size in bytes.
4. Divide the number of bytes by 1024 to get the file size in kilobytes; divide by 1024 again to get megabytes.
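A minimal sketch of those steps in Java (horizontalPixels, verticalPixels, and bitDepth are placeholders; the bit depth depends on your sensor):
long totalPixels = (long) horizontalPixels * verticalPixels; // step 1
long totalBits = totalPixels * bitDepth;                     // step 2 (e.g. 16)
long totalBytes = totalBits / 8;                             // step 3
double megabytes = totalBytes / 1024.0 / 1024.0;             // step 4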
So I followed the above steps: my detector's resolution is 5376x3024. Working through the steps, I got about 39 MB as the image size.
But the image taken by the camera was around 8-10 MB, which was nowhere near the result above.
My phone (HTC Desire 10 pro) has a pro-mode setting in which photos are captured as raw images. When I checked the size of a captured raw image, I was amazed: the raw file was indeed around 39 MB, which indicates that the above steps are correct for calculating an image's uncompressed size.
CONCLUSION
With the above steps I came to the conclusion that the phone's software does use compression algorithms to make the image size smaller. What I was comparing against were compressed images, hence the difference in counts.
PROBABLE SOLUTION
The approach I am now taking is to get the last image clicked with my camera, read its file size, and compute the count from that file size. This will also be an approximate result, but I don't think there is any way to get an exact count.
This is the code I am using to implement the above solution:
private void numberOfPhotosAvailable() {
    StatFs stat = new StatFs(Environment.getExternalStorageDirectory().getPath());
    File lastFile = utils.getLatestFilefromDir(prefManager.getString(PrefrenceConstants.STORAGE_PATH));
    if (lastFile != null) {
        // Use floating-point division; integer division would truncate files
        // under 1 MB to 0 and produce a bogus count below.
        double fileSize = lastFile.length() / (1024.0 * 1024.0);
        long bytesAvailable = stat.getAvailableBlocksLong() * stat.getBlockSizeLong();
        long megAvailable = bytesAvailable / 1048576;
        System.out.println("Megs :" + megAvailable);
        long photosAvailable = (long) (megAvailable / fileSize);
        tvAvailablePhotos.setText("" + photosAvailable);
    } else {
        tvAvailablePhotos.setVisibility(View.INVISIBLE);
    }
}
I think you can check the DCIM directory (the default camera directory) for the number of files, calculate the total size of all the files, and divide by the number of files to get the average size of the images the camera is capturing.
Do the above steps in an AsyncTask.
You have already calculated the remaining space in bytes; now divide that remaining space by the average size, and you get the approximate number of images you can capture. A sketch is below.
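A rough sketch of that idea (the "Camera" subfolder name is an assumption; it varies by device):
// Average size of the photos already taken, then divide free space by it.
// Run this off the main thread (e.g. inside an AsyncTask), as noted above.
File dcim = Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_DCIM);
File[] photos = new File(dcim, "Camera").listFiles();
if (photos != null && photos.length > 0) {
    long totalBytes = 0;
    for (File f : photos) {
        totalBytes += f.length();
    }
    long avgBytesPerPhoto = totalBytes / photos.length;
    StatFs stat = new StatFs(Environment.getExternalStorageDirectory().getPath());
    long bytesAvailable = stat.getAvailableBlocksLong() * stat.getBlockSizeLong();
    long photosLeft = bytesAvailable / avgBytesPerPhoto;
}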
Related
I'm trying to implement resolution-based image capture, and my device supports the resolutions below:
4640x3472, 4624x3472, 4624x2600, 4624x2136, 3840x2160, 3472x3472, 2560x1920,
1920x1440, 2340x1080, 1920x1080, 1440x1080, 1080x1080, 1440x720, 1280x720,
960x720, 720x480, 640x480, 352x288, 320x240, 176x144, 9248x6936
Now, I also need to show an approximate size for an image captured at each resolution, like
640x480 (~200-500kb)
1920x1440 (~1.5-3Mb)
...
With some research I came across this blog on how to calculate the file size of a digital image:
1 Byte = 8 Bits; 1 Kilobyte = 1,024 Bytes; 1 Megabyte = 1,048,576 Bytes; 1 Gigabyte = 1,073,741,824 Bytes
2,949,120 pixels x 16 bits = 47,185,920 bits; 47,185,920 bits ÷ 8 = 5,898,240 Bytes; 5,898,240 Bytes ÷ 1024 = 5,760 Kilobytes; 5,760 Kilobytes ÷ 1024 = 5.625 Megabytes
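In Java, the same worked example:
// The blog's example: 2,949,120 pixels at 16 bits per pixel.
long pixels = 2949120L;
long bits = pixels * 16;               // 47,185,920 bits
long bytes = bits / 8;                 // 5,898,240 bytes
double kilobytes = bytes / 1024.0;     // 5,760 KB
double megabytes = kilobytes / 1024.0; // 5.625 MB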
But my camera takes images of varying file sizes, and they do not match these numbers.
How do I calculate the approximate size range of an image, based on its resolution, before capturing it?
What I have developed thus far is the capability to write out various devices' raw information using the standard DngCreator scheme, as per below.
On one device that I am encountering, however (the HTC 10), the Image class contains planar information whose row stride is larger than the width. I understand so far that this can happen with images, but I can't find out how to correct for it with the SDK available to us.
ByteBuffer byteBuffer = ByteBuffer.wrap(cameraImageF.getRawBytes());
byteBuffer.rewind();
dngCreator.writeByteBuffer(new FileOutputStream(rawLoggerFileF),
new Size(cameraImageF.getRawImageSize().getWidth(), cameraImageF.getRawImageSize().getHeight()),
byteBuffer, 0);
I hold onto the bytes from the original Image class and do some substantial calculations on them between receiving them and writing them out (this is the point of the application), so I need to let go of the Image in order to keep getting additional frames from the camera.
Now, this approach works fine for various devices (Samsung S7, Nexus 5, Nexus 6P, etc.). However, on the HTC 10 the stride is 16 bytes longer per row, and it seems I have no way of letting the DngCreator know that.
Underneath, in the source code, writeByteBuffer defaults to an internal rowStride = width * pixelStride; there is no parameter for passing in a different stride, and the actual rowStride does not equal that default.
dngCreator.saveImage(OutputStream, Image) does use the Image's internal stride when it writes out to a buffer, but I can't hold on to an Image from the camera, because it needs to be released and it is not a cloneable object.
I am a bit lost trying to understand how to write out a valid .dng for a photograph whose rowStride > width.
You'll have to remove the extra bytes manually; that is, copy the raw image to a new ByteBuffer, dropping the extra bytes at the end of each row. Something like:
byte[] rawBytes = cameraImageF.getRawBytes();
// 2 bytes per pixel for 16-bit raw data
ByteBuffer dst = ByteBuffer.allocate(
        cameraImageF.getRawImageSize().getWidth() * cameraImageF.getRawImageSize().getHeight() * 2);
for (int row = 0; row < cameraImageF.getRawImageSize().getHeight(); row++) {
    // Copy only width * 2 bytes per row, skipping the stride padding at the end
    dst.put(rawBytes,
            row * cameraImageF.getRawImageRowStride(),
            cameraImageF.getRawImageSize().getWidth() * 2);
}
dst.rewind();
dngCreator.writeByteBuffer(new FileOutputStream(rawLoggerFileF),
        new Size(cameraImageF.getRawImageSize().getWidth(),
                cameraImageF.getRawImageSize().getHeight()),
        dst, 0);
That's of course not lovely for performance, but since DngCreator won't let you specify a row stride with the ByteBuffer interface, it's your only option.
Is there a reason you can't just increase your RAW ImageReader's maxImages to a higher value, so that you can hold on to the Image until you're done processing it?
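For reference, a sketch of that (the size and maxImages count here are placeholders, not values from the question):
// A larger maxImages lets you hold an Image for processing while the
// camera keeps delivering new frames into the remaining slots.
ImageReader rawReader = ImageReader.newInstance(
        4032, 3024, ImageFormat.RAW_SENSOR, /* maxImages= */ 4);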
I'm implementing the Camera2 API with the YUV_420_888 format on a Nexus 9. I checked the output sizes and wanted to use the largest (8 MP, 3280 x 2460) size to save. However, the result just appears as static lines, similar to how old TVs looked without a signal. I would like to stick with YUV_420_888, since my end goal is to save grayscale data (the Y component).
I originally thought it was a camera bandwidth issue, but the same thing happened at some of the small sizes (320 x 240). None of the problems went away even when I increased frame duration and decreased the size of the preview to save on bandwidth. Some of the other sizes DID work (2048 x 1536, 1280 x 720) but I did not check all of them.
I'm starting to think getOutputSizes() may not necessarily be accurate. It gave me the same results for all other formats except RAW_SENSOR (JPEG, YUV_420_888, YV12). Has anyone encountered this or determined a solution?
Figured out the issue: I was not taking into account the row stride of the returned pixels. So I had to run a for-loop to extract the non-padded data before saving it:
int myRowStride = mImage.getPlanes()[0].getRowStride();
int iSkippedBytes = 0;
for (int i = 0; i < mStillSize.getWidth() * mStillSize.getHeight(); i++) {
    // At the start of each new row, skip the padding bytes of the previous row
    if (i % mStillSize.getWidth() == 0 && i != 0)
        iSkippedBytes += myRowStride - mStillSize.getWidth();
    imageBytes[i] = bytes[i + iSkippedBytes];
}
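A faster equivalent copies whole rows at once; the Y plane of YUV_420_888 is guaranteed to have a pixel stride of 1, so each row is width contiguous bytes followed by padding:
// Row-wise copy of the Y plane, skipping the per-row stride padding.
ByteBuffer yBuffer = mImage.getPlanes()[0].getBuffer();
int rowStride = mImage.getPlanes()[0].getRowStride();
int width = mStillSize.getWidth();
int height = mStillSize.getHeight();
byte[] imageBytes = new byte[width * height];
for (int row = 0; row < height; row++) {
    yBuffer.position(row * rowStride);           // start of this row in the padded buffer
    yBuffer.get(imageBytes, row * width, width); // copy the row, drop the padding
}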
After googling a lot, I have not yet found a way to resize an image while preserving quality.
I have my image, stored by the camera at full resolution, at
String filePath = Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_PICTURES) + "/my_directory/my_file_name.jpg";
Now I need to resize it preserving the aspect ratio and then save it to another path.
What's the best way to do this without hitting the error "Out of memory on a xxxxxxx-byte allocation"?
I keep getting this error on Samsung devices; I have tried everything, even the Picasso library.
Thanks!
First things first: depending on the device and bitmap size, no matter what magic code you write, it will crash! Especially on cheap Samsung phones, which often give no more than 16 MB of heap to the VM.
You can use the code from How to get current memory usage in android? to check the amount of memory available and deal with it properly.
When doing those calculations, remember that bitmaps are uncompressed images; even though the JPG might be 100 kB, the decoded Bitmap can take several MB. For example, a 4000x3000 photo decoded as ARGB_8888 needs 4000 x 3000 x 4 bytes, about 45.8 MB.
You'll use the code shown at https://developer.android.com/training/displaying-bitmaps/load-bitmap.html to read the bitmap bounds, then do an approximate scale-down as close as possible to the size you actually need, or at least enough that the device doesn't crash. That's why it's important to measure the memory properly.
That first decode takes virtually no RAM because it scales while reading from disk, simply skipping pixels of the image. That's also why it's approximate: it only scales in powers of 2.
Then you'll use the standard API to scale down to the exact size you need: https://developer.android.com/reference/android/graphics/Bitmap.html#createScaledBitmap(android.graphics.Bitmap, int, int, boolean)
So the pseudo code for it will be:
try {
    Info info = getImageInfo(file);
    int power2scale = calculateScale(info, w, h);
    Bitmap smaller = preScaleFromDisk(file, power2scale);
    Bitmap bitmap = Bitmap.createScaledBitmap(smaller, w, h, f);
} catch (OutOfMemoryError ooe) {
    // call GC
    // sleep to let GC run
    // try again with a higher power2scale
}
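A concrete sketch of that pseudo code using BitmapFactory (w and h are your target dimensions, filePath the source image):
BitmapFactory.Options opts = new BitmapFactory.Options();
opts.inJustDecodeBounds = true;              // read dimensions only, no pixel allocation
BitmapFactory.decodeFile(filePath, opts);

int inSampleSize = 1;                        // power-of-2 pre-scale factor
while (opts.outWidth / (inSampleSize * 2) >= w && opts.outHeight / (inSampleSize * 2) >= h) {
    inSampleSize *= 2;
}

opts.inJustDecodeBounds = false;
opts.inSampleSize = inSampleSize;
Bitmap smaller = BitmapFactory.decodeFile(filePath, opts);       // approximate pre-scale from disk
Bitmap bitmap = Bitmap.createScaledBitmap(smaller, w, h, true);  // exact target size, filtered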
So I'm getting really confused here. The designer I work with wants high-quality images (png files) for Android tablets, but the game also has smaller images for less-powerful devices. I figured that the amount of memory on the heap would be the metric to determine which set of images to use, by using Runtime.getRuntime().maxMemory() - Runtime.getRuntime().totalMemory(). That doesn't seem to be the case though. On BlueStacks it can load the high-quality images just fine, and it has around 40,000,000 bytes. The designer's Galaxy Nexus has black boxes for some of the larger images (which I understand is due to a lack of memory for loading the image), but his Galaxy Nexus has about 50,000,000 available bytes, which is even more than BlueStacks.
So what is the limiting factor? And on a related matter, how is it that there are mobile games that have impressive quality visuals, yet I can't manage to load a few images? What am I doing wrong?
To note, I am using AndEngine, and below is an example of how I'm loading the images.
BuildableBitmapTextureAtlas resetTA = new BuildableBitmapTextureAtlas(
        this.getTextureManager(), 310 / d, 190 / d, TextureOptions.BILINEAR);
resetTR = BitmapTextureAtlasTextureRegionFactory.createTiledFromAsset(
        resetTA, this, "gfx/" + lowres + "reset.png", 1, 1);
try {
    resetTA.build(new BlackPawnTextureAtlasBuilder<IBitmapTextureAtlasSource, BitmapTextureAtlas>(0, 0, 0));
    resetTA.load();
} catch (TextureAtlasBuilderException e) {
    Debug.e(e);
}
One of the images that isn't loading on the Galaxy Nexus is a 2320x464 sprite-sheet PNG.
There are two limiting factors here. The first is heap memory. You can find the available heap for your app with the Runtime methods, but that tells you the maximum memory your app can use before it completely crashes. The limit your app should respect on Android can be found this way:
ActivityManager am = (ActivityManager) getSystemService(ACTIVITY_SERVICE);
int memoryClass = am.getMemoryClass();
Log.d("MyTag", "Heap: " + Integer.toString(memoryClass));
The second is the GL_MAX_TEXTURE_SIZE value, which limits the maximum dimension of a square texture on a given device. This varies, but the minimum these days seems to be 2048, so your textures can be up to 2048x2048 pixels. The only guarantee is that it is at least as large as the screen dimensions; the real value is up to the phone's manufacturer.
I think you can use the following code to find out the size (it must run on the GL thread, with a current GL context):
int[] maxTextureSize = new int[1];
GLES20.glGetIntegerv(GLES20.GL_MAX_TEXTURE_SIZE, maxTextureSize, 0);
Log.d("MyTag", "GL_MAX_TEXTURE_SIZE: " + Integer.toString(maxTextureSize[0]));
Games can have impressive graphics because they split big textures into smaller pieces, load only what is needed, and reuse as much as possible. I've made a game where some levels have 50,000 px wide ground assembled from 256x256 pieces, making the game sharp even on full-HD tablets. The pieces were distributed over several 2048x2048 textures.