Mechanism for apps to display their image content in native 4K - Android

As I understand it, UHD video content is currently streamed to 4K TVs mostly using the HEVC codec.
I want to understand how apps that have UHD image content can display it in native 4K.
What I am specifically looking for is rendering 4K (3840x2160) JPEG images. My display supports 4K rendering and the SoC can even output 4K. I am looking for modifications in the framework, so that all apps which have 4K images can render them on my device without downscaling.
I am actually trying to come up with an API set which others can use. But my main confusion is: for the JPEG image I create a 4K surface, but there are other surfaces as well (buttons etc.). They are rendered by SurfaceFlinger, which composes at 1280x720.
Now what is the best way to compose my 4K surface with these other surfaces? Where should I upscale those surfaces, and where should all of them be composited?

An important thing to keep clear is what kind of development you have in mind. If you simply wish to show a video stream inside your app, the user interface for the main activity can simply consist of an instance of the VideoView class. The VideoView class has a wide range of methods that may be called in order to manage the playback of video.
Configure the VideoView with the path of the video to be played and then start the playback. Select the VideoPlayerActivity.java file and modify the onCreate() method as outlined in the following listing:
package com.example.videoplayer;

import android.os.Bundle;
import android.app.Activity;
import android.widget.VideoView;

public class VideoPlayerActivity extends Activity {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_video_player);

        final VideoView videoView =
                (VideoView) findViewById(R.id.videoView1);

        videoView.setVideoPath(
                "http://www.ebookfrenzy.com/android_book/movie.mp4");
        videoView.start();
    }
    .
    .
    .
}
Essentially, the Android SDK handles the app's user interface, which means you have several choices for how to actually render the video stream underneath the UI layer.
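For instance, here is a minimal sketch of one such alternative: driving a SurfaceView with MediaPlayer directly instead of using VideoView. The layout id is hypothetical; the URL is the same sample stream as above.

SurfaceView surfaceView = (SurfaceView) findViewById(R.id.surfaceView1); // hypothetical id
final MediaPlayer player = new MediaPlayer();
surfaceView.getHolder().addCallback(new SurfaceHolder.Callback() {
    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        try {
            player.setDisplay(holder);
            player.setDataSource("http://www.ebookfrenzy.com/android_book/movie.mp4");
            player.setOnPreparedListener(MediaPlayer::start);
            player.prepareAsync(); // prepare off the UI thread, start when ready
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) { }
    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        player.release();
    }
});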
Migrating an already existing app from a tablet or phone to a smart TV is also something that can be achieved quite smoothly. A few things will have to be adjusted - e.g. a touch screen is usually not an option on a smart TV.
Instead, you should consider onKeyDown as a more reliable input method for your app:
@Override
public boolean onKeyDown(int keyCode, KeyEvent event) {
    switch (keyCode) {
        case KeyEvent.KEYCODE_MEDIA_PLAY: {
            if (!mPlaying) {
                startSlideShow();
            }
            mPlaying = true;
            break;
        }
        case KeyEvent.KEYCODE_MEDIA_PAUSE: {
            mPlaying = false;
            showStatusToast(R.string.slideshow_paused);
            break;
        }
    }
    return super.onKeyDown(keyCode, event);
}
Using the Google TV APIs, you can also make the necessary adjustments for larger screen resolutions:
// Get the source image's dimensions
BitmapFactory.Options options = new BitmapFactory.Options();
options.inJustDecodeBounds = true; // decode only the header, not the pixel data
BitmapFactory.decodeFile(IMAGE_FILE_URL, options);

int srcWidth = options.outWidth;   // actual width of the image
int srcHeight = options.outHeight; // actual height of the image

// Only scale if the source is big enough. This code is just trying to fit an image into a certain width.
if (desiredWidth > srcWidth)
    desiredWidth = srcWidth;

// Calculate the correct inSampleSize/scale value. This helps reduce memory use. It should be a power of 2.
int inSampleSize = 1;
while (srcWidth / 2 > desiredWidth) {
    srcWidth /= 2;
    srcHeight /= 2;
    inSampleSize *= 2;
}

float desiredScale = (float) desiredWidth / srcWidth;

// Decode with inSampleSize
options.inJustDecodeBounds = false; // now decode the full pixel data
options.inDither = false;
options.inSampleSize = inSampleSize;
options.inScaled = false;
options.inPreferredConfig = Bitmap.Config.ARGB_8888; // keep the image as a 32-bit ARGB_8888 image;
                                                     // this preserves image quality
Bitmap sampledSrcBitmap = BitmapFactory.decodeFile(IMAGE_FILE_URL, options);

// Resize
Matrix matrix = new Matrix();
matrix.postScale(desiredScale, desiredScale);
Bitmap scaledBitmap = Bitmap.createBitmap(sampledSrcBitmap, 0, 0,
        sampledSrcBitmap.getWidth(), sampledSrcBitmap.getHeight(), matrix, true);
sampledSrcBitmap = null;

// Save
FileOutputStream out = new FileOutputStream(LOCAL_PATH_TO_STORE_IMAGE);
scaledBitmap.compress(Bitmap.CompressFormat.JPEG, 100, out);
out.close();
scaledBitmap = null;
You can also easily convert an LCD TV into a smart TV and start playing with and testing how your app would behave on Google TV; for that you only need to get your hands on an adapter kit.
In fact, Lenovo is releasing a 28-inch 4K monitor (the ThinkVision 28) that also runs Android, allowing you to run all of the usual media streaming apps, and Kogan is doing the same.
Hacking a little further with gadgets, you can even hook up your phone to a TV over MHL HDMI, or use it as a computer.
So 4K video output from Android over MHL 3.0 is already a reality, and good developers can already make their apps adapt resolution and input handling to the device in use.
If your main concern is performance and optimizing the video stream, you may consider the following options:
NDK: you take an existing library, or implement your own code in C/C++, to optimize the video pipeline.
RenderScript: it uses C99 syntax with new APIs that are ultimately compiled to native code. While this syntax is well known, there's a learning curve to using this system because the APIs are not.
OpenCL: a framework for parallel compute on GPUs and CPUs, well suited to high-performance image and video processing.
In fact, OpenCL code resembles C/C++; the fragment below is the inner loop of a bilateral filter kernel:
for (int yy = -filterWidth; yy <= filterWidth; ++yy)
{
    for (int xx = -filterWidth; xx <= filterWidth; ++xx)
    {
        int thisIndex = (y + yy) * width + (x + xx);
        float4 currentPixel = oneover255 * convert_float4(srcBuffer[thisIndex]);

        // spatial (domain) weight: Gaussian on pixel distance
        float domainDistance = fast_distance((float)(xx), (float)(yy));
        float domainWeight = exp(-0.5f * pow((domainDistance / sigmaDomain), 2.0f));

        // range weight: Gaussian on color distance
        float rangeDistance = fast_distance(currentPixel.xyz, centerPixel.xyz);
        float rangeWeight = exp(-0.5f * pow((rangeDistance / sigmaRange), 2.0f));

        float totalWeight = domainWeight * rangeWeight;
        normalizeCoeff += totalWeight;
        sum4 += totalWeight * currentPixel;
    }
}
In terms of ARM microprocessor capabilities, it is worth mentioning the Sigma Designs SMP8756 ARM SoC for Android set-top boxes, which is meant to provide full High Efficiency Video Coding (HEVC) support.
Bilinear Filtering
To resize your image/video, what you need is to apply bilinear filtering. Bilinear interpolation computes each output pixel as a weighted average of the four nearest source pixels, interpolating first along one axis and then the other; this produces much smoother results than nearest-neighbor sampling when generating a full-size target image.
To have your image adjusted properly for the sizes you need, there are plenty of good algorithms that can be used for that purpose, such as some of the OpenCL features for image scaling; likewise, other options are available in native C++.
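On Android, one simple way to get bilinear-filtered scaling without writing a kernel is Bitmap.createScaledBitmap, whose filter parameter enables bilinear filtering. A minimal sketch - the file path is a placeholder and the target is 4K UHD:

Bitmap src = BitmapFactory.decodeFile("/sdcard/photo.jpg"); // placeholder path
// filter=true selects bilinear filtering instead of nearest-neighbor
Bitmap scaled = Bitmap.createScaledBitmap(src, 3840, 2160, true);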

Related

AndroidCamera2 for face detection and distance measurement

I'm building a camera app that needs to detect the user's face/eyes and measure the distance to the eyes.
I found this project: https://github.com/IvanLudvig/Screen-to-face-distance. It works great, but it doesn't use a preview of the front camera (really - I tested it on at least 10 people, and all measurements were very close or perfect).
My app already had a selfie-camera part I wrote myself, but using the old camera API, and I couldn't find a way to have both the camera preview and the face-distance measurement work together with it; I would always get an error that the camera was already in use.
I decided to move to camera2 to use more than one camera stream, and I'm still learning this process of having two streams at the same time for different things. By the way, documentation on this seems so scarce that I'm really lost.
Now, am I on the right path with this?
Also, in his project, Ivan uses this:
Camera camera = frontCam();
Camera.Parameters campar = camera.getParameters();
F = campar.getFocalLength();
angleX = campar.getHorizontalViewAngle();
angleY = campar.getVerticalViewAngle();
sensorX = (float) (Math.tan(Math.toRadians(angleX / 2)) * 2 * F);
sensorY = (float) (Math.tan(Math.toRadians(angleY / 2)) * 2 * F);
This is the old camera API; how can I do the same with the new one?
Judging from this answer: Android camera2 API get focus distance in AF mode
Do I need to get the min and max focal lengths?
For the horizontal and vertical angles I found this one: What is the Android Camera2 API equivalent of Camera.Parameters.getHorizontalViewAngle() and Camera.Parameters.getVerticalViewAngle()?
The rest, I believe, is done by Google's Cloud Vision API.
EDIT:
I got it to work on camera2, using GMS's own example, CameraSourcePreview and GraphicOverlay, to display whatever I want together with the preview and to detect faces.
Now to get the camera characteristics:
CameraManager manager = (CameraManager) this.getSystemService(Context.CAMERA_SERVICE);
try {
    character = manager.getCameraCharacteristics(String.valueOf(1));
} catch (CameraAccessException e) {
    Log.e(TAG, "CamAcc1Error.", e);
}

angleX = character.get(CameraCharacteristics.SENSOR_INFO_PHYSICAL_SIZE).getWidth();
angleY = character.get(CameraCharacteristics.SENSOR_INFO_PHYSICAL_SIZE).getHeight();
sensorX = (float) (Math.tan(Math.toRadians(angleX / 2)) * 2 * F);
sensorY = (float) (Math.tan(Math.toRadians(angleY / 2)) * 2 * F);
This pretty much gives me mm accuracy to face distance, which is exactly what I needed.
Now what is left is getting a picture from this preview with GMS's CameraSourcePreview, so that I can use it later.
Final Edit here:
I solved the picture issue, but I forgot to edit here. The thing is, all the examples using camera2 to take a picture are really complicated (rightly so - it's a better API than camera and has a lot of options), but it can be really simplified to what I did here:
mCameraSource.takePicture(null, bytes -> {
Bitmap bitmap;
bitmap = BitmapFactory.decodeByteArray(bytes, 0, bytes.length);
if (bitmap != null) {
Matrix matrix = new Matrix();
matrix.postRotate(180);
matrix.postScale(1, -1);
rotateBmp = Bitmap.createBitmap(bitmap, 0, 0, bitmap.getWidth(),
bitmap.getHeight(), matrix, false);
saveBmp2SD(STORAGE_PATH, rotateBmp);
rotateBmp.recycle();
bitmap.recycle();
}
});
That's all I needed to take a picture and save it to a location I specified. Don't mind the recycling here - it's not right, and I'm working on it.
It looks like that bit of math is calculating the physical dimensions of the image sensor via the angle-of-view equation: sensorSize = 2 * focalLength * tan(FOV / 2).
The camera2 API has the sensor dimensions as part of the camera characteristics directly: SENSOR_INFO_PHYSICAL_SIZE.
In fact, if you want to get the field of view in camera2, you have to use the same equation in the other direction, since FOVs are not part of the camera characteristics.
Beyond that, it looks like the example you linked just uses the old camera API to fetch that FOV information, and then closes the camera and uses the Vision API to actually drive the camera. So you'd have to look at the Vision API docs to see how you can give it camera input instead of having it drive everything. Or you could use the camera API's built-in face detector, which on many devices gives you eye locations as well.
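For reference, a minimal sketch of that reverse computation in camera2 - assuming manager is a CameraManager, cameraId is valid, and taking the first reported focal length:

// getCameraCharacteristics throws CameraAccessException; handle it in real code
CameraCharacteristics ch = manager.getCameraCharacteristics(cameraId);
SizeF sensor = ch.get(CameraCharacteristics.SENSOR_INFO_PHYSICAL_SIZE); // mm
float f = ch.get(CameraCharacteristics.LENS_INFO_AVAILABLE_FOCAL_LENGTHS)[0]; // mm

// FOV = 2 * atan(sensorSize / (2 * focalLength)), the equation above in reverse
double fovX = Math.toDegrees(2 * Math.atan(sensor.getWidth() / (2 * f)));
double fovY = Math.toDegrees(2 * Math.atan(sensor.getHeight() / (2 * f)));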

Upload a picture taken by the camera to a server with limited size

The title maybe sounds a bit like a "noob question", but I know quite well how to program for Android; I just need to figure out the best way to achieve what I want.
My use case is: the user takes a photo and sends it to our server, which has a file-size limit (which could mean that we have to resize the photo directly on the device).
Seems easy, right? My problems are the following:
1) Is it better to use intents, which could crash because some camera apps are badly coded, or to build a basic "take photo and confirm" view with one of the mediocre camera libs? (I did both; I prefer intents, but I'd like an opinion on that.)
2) How do you handle the file size limit? Getting the size of the photo is quite easy with File.length() (even if the returned value is not perfectly exact), but if you go over the limit, how can you tell how big the resized picture will be? You need to convert it to a bitmap to resize it, which brings a lot of OOMException problems, and you cannot calculate the final on-disk size of a bitmap in advance: you have to compress it, write it to disk, and analyze the newly created file afterwards.
Thanks for the help :D
I did the same thing before.
1. I use an intent to call the other camera app, and inside onActivityResult I get back the URI and process it as I need.
2. We do resize the pic, but I also keep the original ratio and rotate it based on the EXIF data.
Hopefully this resizing code block can give you some hints:
public static Bitmap decodeImage(String path, int resolution) {
    BitmapFactory.Options opts = new BitmapFactory.Options();
    opts.inJustDecodeBounds = true;
    BitmapFactory.decodeFile(path, opts);

    opts.inSampleSize = computeSampleSize(opts, -1, resolution);
    opts.inJustDecodeBounds = false;
    return BitmapFactory.decodeFile(path, opts);
}

public static int computeSampleSize(BitmapFactory.Options options,
        int minSideLength, int maxNumOfPixels) {
    int initialSize = computeInitialSampleSize(options, minSideLength,
            maxNumOfPixels);

    int roundedSize;
    if (initialSize <= 8) {
        roundedSize = 1;
        while (roundedSize < initialSize) {
            roundedSize <<= 1;
        }
    } else {
        roundedSize = (initialSize + 7) / 8 * 8;
    }
    return roundedSize;
}

private static int computeInitialSampleSize(BitmapFactory.Options options,
        int minSideLength, int maxNumOfPixels) {
    double w = options.outWidth;
    double h = options.outHeight;

    int lowerBound = (maxNumOfPixels == -1) ? 1 :
            (int) Math.ceil(Math.sqrt(w * h / maxNumOfPixels));
    int upperBound = (minSideLength == -1) ? 128 :
            (int) Math.min(Math.floor(w / minSideLength),
                    Math.floor(h / minSideLength));

    if (upperBound < lowerBound) {
        // return the larger one when there is no overlapping zone
        return lowerBound;
    }

    if ((maxNumOfPixels == -1) && (minSideLength == -1)) {
        return 1;
    } else if (minSideLength == -1) {
        return lowerBound;
    } else {
        return upperBound;
    }
}
The solution is not fancy, but it is what I did in the project, and so far we have had no problems with it since release.
There are lots of questions on this topic, and I hope you searched for them.
Q) Is it better to use intents, which could crash because some camera apps are badly coded, or to build a basic "take photo and confirm" view with one of the mediocre camera libs?
A) Intents are best because they are built in. Some device manufacturers provide intents for cropping and re-sizing images; I faced the problem of the same manufacturer not providing them on an older device. A stable/reliable third-party app would suffice for basic operations.
Q) How do you handle the file size limit? Getting the size of the photo is quite easy with File.length(), but if you go over the limit, how can you tell how big the resized picture will be?
A)
How do you handle the file size limit?
The size limit depends on the camera: a photo from a 10MP camera will be correspondingly larger than one from a 5MP camera (hope you get the point).
"You need to convert it to a bitmap to resize it, and then there are a lot of problems with OOMException, and you cannot calculate the final on-disk size of a bitmap." You can calculate the image size, sample it, crop it, or resize it, as long as you keep Android best practices intact: recycle each bitmap as soon as you are done with its operations. I have an app which handles hundreds of images, most of them sent to a server; it sometimes throws OOM, and I handle it accordingly.
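Since the final JPEG size can't be known before encoding, one workable pattern is to compress into memory at decreasing quality until the payload fits. A sketch - bitmap and maxBytes are assumed to come from your decode step and your server limit:

ByteArrayOutputStream buffer = new ByteArrayOutputStream();
int quality = 95;
do {
    buffer.reset();
    bitmap.compress(Bitmap.CompressFormat.JPEG, quality, buffer);
    quality -= 5;
} while (buffer.size() > maxBytes && quality > 10);
// buffer.toByteArray() is now small enough to upload; if even low quality
// doesn't fit, downscale the bitmap itself and repeat.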
You can go through the link: Loading Large Bitmaps.
They explain bitmap scaling really well there.
Please let me know if any point in the link is unclear.
Thanks

Image data from Android camera2 API flipped & squished on Galaxy S5

I am implementing an app that uses real-time image processing on live images from the camera. It was working, with limitations, using the now-deprecated android.hardware.Camera; for improved flexibility and performance I'd like to use the new android.hardware.camera2 API. I'm having trouble getting the raw image data for processing, however. This is on a Samsung Galaxy S5. (Unfortunately, I don't have another Lollipop device handy to test on other hardware.)
I got the overall framework (with inspiration from the 'HdrViewFinder' and 'Camera2Basic' samples) working, and the live image is drawn on the screen via a SurfaceTexture and a GLSurfaceView. However, I also need to access the image data (grayscale only is fine, at least for now) for custom image processing. According to the documentation for StreamConfigurationMap.isOutputSupportedFor(class), the recommended surface to obtain image data directly would be ImageReader (correct?).
So I've set up my capture requests as:
mSurfaceTexture.setDefaultBufferSize(640, 480);
mSurface = new Surface(mSurfaceTexture);
...
mImageReader = ImageReader.newInstance(640, 480, format, 2);
...
List<Surface> surfaces = new ArrayList<Surface>();
surfaces.add(mSurface);
surfaces.add(mImageReader.getSurface());
...
mCameraDevice.createCaptureSession(surfaces, mCameraSessionListener, mCameraHandler);
and in the onImageAvailable callback for the ImageReader, I'm accessing the data as follows:
Image img = reader.acquireLatestImage();
ByteBuffer grayscalePixelsDirectByteBuffer = img.getPlanes()[0].getBuffer();
...but while (as said) the live image preview is working, there's something wrong with the data I get here (or with the way I get it). According to
mCameraInfo.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP).getOutputFormats();
...the following ImageFormats should be supported: NV21, JPEG, YV12, YUV_420_888. I've tried them all (plugged in for 'format' above); all of them support the set resolution according to getOutputSizes(format), but none of them gives the desired result:
NV21: ImageReader.newInstance throws java.lang.IllegalArgumentException: NV21 format is not supported
JPEG: This does work, but it doesn't seem to make sense for a real-time application to go through JPEG encode and decode for each frame...
YV12 and YUV_420_888: this is the weirdest result - I can get the grayscale image, but it is flipped vertically (yes, flipped, not rotated!) and significantly squished (scaled down significantly horizontally, but not vertically).
What am I missing here? What causes the image to be flipped and squished? How can I get a geometrically correct grayscale buffer? Should I be using a different type of surface (instead of ImageReader)?
Any hints appreciated.
I found an explanation (though not necessarily a satisfactory solution): it turns out that the sensor array's aspect ratio is 16:9 (found via mCameraInfo.get(CameraCharacteristics.SENSOR_INFO_ACTIVE_ARRAY_SIZE)).
At least when requesting YV12/YUV_420_888, the streamer appears not to crop the image in any way, but instead to scale it non-uniformly to reach the requested frame size. The images have the correct proportions when requesting a 16:9 size (of which there are unfortunately only two higher-res ones). This seems a bit odd to me - it doesn't appear to happen when requesting JPEG, with the equivalent old camera API functions, or for stills - and I'm not sure what the non-uniformly scaled frames would be good for.
I feel that it's not a really satisfactory solution, because it means that you can't rely on the list of output formats, but instead have to find the sensor size first, find formats with the same aspect ratio, then downsample the image yourself (as needed)...
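A sketch of that workaround - matching the requested size to the sensor's aspect ratio before creating the ImageReader, assuming characteristics has already been retrieved:

Rect active = characteristics.get(CameraCharacteristics.SENSOR_INFO_ACTIVE_ARRAY_SIZE);
float sensorAspect = (float) active.width() / active.height();

StreamConfigurationMap map =
        characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
Size chosen = null;
for (Size s : map.getOutputSizes(ImageFormat.YUV_420_888)) {
    // keep only sizes whose aspect ratio matches the sensor's
    if (Math.abs((float) s.getWidth() / s.getHeight() - sensorAspect) < 0.01f) {
        if (chosen == null || s.getWidth() > chosen.getWidth()) {
            chosen = s; // largest matching size; downsample in software if needed
        }
    }
}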
I don't know if this is the expected outcome here or a 'feature' of the S5. Comments or suggestions still welcome.
I had the same problem and found a solution.
The first part of the problem is setting the size of the surface buffer:
// We configure the size of default buffer to be the size of camera preview we want.
//texture.setDefaultBufferSize(width, height);
This is where the image gets skewed, not in the camera. You should comment it out, and then apply upscaling when displaying the image:
int[] rgba = new int[width * height];
//getImage(rgba);
nativeLoader.convertImage(width, height, data, rgba);

Bitmap bmp = mBitmap;
bmp.setPixels(rgba, 0, width, 0, 0, width, height);

Canvas canvas = mTextureView.lockCanvas();
if (canvas != null) {
    //canvas.drawBitmap(bmp, 0, 0, null); //configureTransform(width, height), null);
    //canvas.drawBitmap(bmp, configureTransform(width, height), null);
    canvas.drawBitmap(bmp, new Rect(0, 0, 320, 240),
            new Rect(0, 0, 640 * 2, 480 * 2), null);
    //canvas.drawBitmap(bmp, (canvas.getWidth() - 320) / 2, (canvas.getHeight() - 240) / 2, null);
    mTextureView.unlockCanvasAndPost(canvas);
}
image.close();
You can play around with the values to fine tune the solution for your problem.

How to solve java.lang.OutOfMemoryError trouble in Android

Although I have only very small images in the drawable folder, I am getting this error from users. And I am not using any bitmap function in the code - at least not intentionally :)
java.lang.OutOfMemoryError
at android.graphics.BitmapFactory.nativeDecodeAsset(Native Method)
at android.graphics.BitmapFactory.decodeStream(BitmapFactory.java:683)
at android.graphics.BitmapFactory.decodeResourceStream(BitmapFactory.java:513)
at android.graphics.drawable.Drawable.createFromResourceStream(Drawable.java:889)
at android.content.res.Resources.loadDrawable(Resources.java:3436)
at android.content.res.Resources.getDrawable(Resources.java:1909)
at android.view.View.setBackgroundResource(View.java:16251)
at com.autkusoytas.bilbakalim.SoruEkrani.cevapSecimi(SoruEkrani.java:666)
at com.autkusoytas.bilbakalim.SoruEkrani$9$1.run(SoruEkrani.java:862)
at android.os.Handler.handleCallback(Handler.java:733)
at android.os.Handler.dispatchMessage(Handler.java:95)
at android.os.Looper.loop(Looper.java:146)
at android.app.ActivityThread.main(ActivityThread.java:5602)
at java.lang.reflect.Method.invokeNative(Native Method)
at java.lang.reflect.Method.invoke(Method.java:515)
at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:1283)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:1099)
at dalvik.system.NativeStart.main(Native Method)
According to this stack trace, I'm getting the error at this line ('tv' is a TextView):
tv.setBackgroundResource(R.drawable.yanlis);
What is the problem? If you need some other information about the code, I can add it.
Thanks!
You can't increase the heap size dynamically, but you can request permission to use more by adding
android:largeHeap="true"
in the manifest.xml. You can add these lines in your manifest; it works for some situations:
<application
    android:allowBackup="true"
    android:icon="@mipmap/ic_launcher"
    android:label="@string/app_name"
    android:largeHeap="true"
    android:supportsRtl="true"
    android:theme="@style/AppTheme">
Whether your application's processes should be created with a large Dalvik heap. This applies to all processes created for the application. It only applies to the first application loaded into a process; if you're using a shared user ID to allow multiple applications to use a process, they all must use this option consistently or they will have unpredictable results.
Most apps should not need this and should instead focus on reducing their overall memory usage for improved performance. Enabling this also does not guarantee a fixed increase in available memory, because some devices are constrained by their total available memory.
To query the available memory size at runtime, use the methods getMemoryClass() or getLargeMemoryClass().
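A quick sketch of that query - both are ActivityManager methods, and the returned values are per-app heap limits in megabytes:

ActivityManager am = (ActivityManager) context.getSystemService(Context.ACTIVITY_SERVICE);
int memClassMb = am.getMemoryClass();           // normal heap limit, in MB
int largeMemClassMb = am.getLargeMemoryClass(); // limit with largeHeap="true"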
If you are still facing the problem, then this should also work:
BitmapFactory.Options options = new BitmapFactory.Options();
options.inSampleSize = 8;
mBitmapInsurance = BitmapFactory.decodeFile(mCurrentPhotoPath,options);
If set to a value > 1, this requests the decoder to subsample the original image, returning a smaller image to save memory.
This is the optimal use of BitmapFactory.Options.inSampleSize with regard to the speed of displaying the image.
The documentation mentions using values that are a power of 2, so I work with 2, 4, 8, 16, etc.
Let's get deeper into image sampling:
For example, it’s not worth loading a 1024x768 pixel image into memory if it will eventually be displayed in a 128x128 pixel thumbnail in an ImageView.
To tell the decoder to subsample the image, loading a smaller version into memory, set inSampleSize in your BitmapFactory.Options object. For example, an image with resolution 2100x1500 pixels that is decoded with an inSampleSize of 4 produces a bitmap of approximately 512x384. Loading this into memory uses 0.75MB rather than 12MB for the full image (assuming a bitmap configuration of ARGB_8888). Here's a method to calculate a sample size value that is a power of two based on a target width and height:
public static int calculateInSampleSize(
BitmapFactory.Options options, int reqWidth, int reqHeight) {
// Raw height and width of image
final int height = options.outHeight;
final int width = options.outWidth;
int inSampleSize = 1;
if (height > reqHeight || width > reqWidth) {
final int halfHeight = height / 2;
final int halfWidth = width / 2;
// Calculate the largest inSampleSize value that is a power of 2 and keeps both
// height and width larger than the requested height and width.
while ((halfHeight / inSampleSize) > reqHeight
&& (halfWidth / inSampleSize) > reqWidth) {
inSampleSize *= 2;
}
}
return inSampleSize;
}
Note: A power of two value is calculated because the decoder uses a
final value by rounding down to the nearest power of two, as per the
inSampleSize documentation.
To use this method, first decode with inJustDecodeBounds set to true, pass the options through and then decode again using the new inSampleSize value and inJustDecodeBounds set to false:
public static Bitmap decodeSampledBitmapFromResource(Resources res, int resId,
int reqWidth, int reqHeight) {
// First decode with inJustDecodeBounds=true to check dimensions
final BitmapFactory.Options options = new BitmapFactory.Options();
options.inJustDecodeBounds = true;
BitmapFactory.decodeResource(res, resId, options);
// Calculate inSampleSize
options.inSampleSize = calculateInSampleSize(options, reqWidth, reqHeight);
// Decode bitmap with inSampleSize set
options.inJustDecodeBounds = false;
return BitmapFactory.decodeResource(res, resId, options);
}
This method makes it easy to load a bitmap of arbitrarily large size into an ImageView that displays a 100x100 pixel thumbnail, as shown in the following example code:
mImageView.setImageBitmap(decodeSampledBitmapFromResource(getResources(), R.id.myimage, 100, 100));
You can follow a similar process to decode bitmaps from other sources, by substituting the appropriate BitmapFactory.decode* method as needed.
I also found this code interesting:
private Bitmap getBitmap(String path) {
    Uri uri = getImageUri(path);
    InputStream in = null;
    try {
        final int IMAGE_MAX_SIZE = 1200000; // 1.2MP
        in = mContentResolver.openInputStream(uri);

        // Decode image size
        BitmapFactory.Options o = new BitmapFactory.Options();
        o.inJustDecodeBounds = true;
        BitmapFactory.decodeStream(in, null, o);
        in.close();

        int scale = 1;
        while ((o.outWidth * o.outHeight) * (1 / Math.pow(scale, 2)) > IMAGE_MAX_SIZE) {
            scale++;
        }
        Log.d(TAG, "scale = " + scale + ", orig-width: " + o.outWidth
                + ", orig-height: " + o.outHeight);

        Bitmap bitmap = null;
        in = mContentResolver.openInputStream(uri);
        if (scale > 1) {
            scale--;
            // scale to max possible inSampleSize that still yields an image
            // larger than target
            o = new BitmapFactory.Options();
            o.inSampleSize = scale;
            bitmap = BitmapFactory.decodeStream(in, null, o);

            // resize to desired dimensions
            int height = bitmap.getHeight();
            int width = bitmap.getWidth();
            Log.d(TAG, "1st scale operation dimensions - width: " + width
                    + ", height: " + height);

            double y = Math.sqrt(IMAGE_MAX_SIZE / (((double) width) / height));
            double x = (y / height) * width;

            Bitmap scaledBitmap = Bitmap.createScaledBitmap(bitmap, (int) x, (int) y, true);
            bitmap.recycle();
            bitmap = scaledBitmap;

            System.gc();
        } else {
            bitmap = BitmapFactory.decodeStream(in);
        }
        in.close();

        Log.d(TAG, "bitmap size - width: " + bitmap.getWidth()
                + ", height: " + bitmap.getHeight());
        return bitmap;
    } catch (IOException e) {
        Log.e(TAG, e.getMessage(), e);
        return null;
    }
}
How to Manage Your App's Memory: link
It's not a good idea to use android:largeHeap="true". Here's the extract from Google that explains why:
However, the ability to request a large heap is intended only for a
small set of apps that can justify the need to consume more RAM (such
as a large photo editing app). Never request a large heap simply
because you've run out of memory and you need a quick fix—you should
use it only when you know exactly where all your memory is being
allocated and why it must be retained. Yet, even when you're confident
your app can justify the large heap, you should avoid requesting it to
whatever extent possible. Using the extra memory will increasingly be
to the detriment of the overall user experience because garbage
collection will take longer and system performance may be slower when
task switching or performing other common operations.
After working excruciatingly through out-of-memory errors, I would say that adding this to the manifest to avoid the OOM issue is not a sin.
Verifying App Behavior on the Android Runtime (ART)
The Android runtime (ART) is the default runtime for devices running Android 5.0 (API level 21) and higher. This runtime offers a number of features that improve performance and smoothness of the Android platform and apps. You can find more information about ART's new features in Introducing ART.
However, some techniques that work on Dalvik do not work on ART. This document lets you know about things to watch for when migrating an existing app to be compatible with ART. Most apps should just work when running with ART.
Addressing Garbage Collection (GC) Issues
Under Dalvik, apps frequently find it useful to explicitly call System.gc() to prompt garbage collection (GC). This should be far less necessary with ART, particularly if you're invoking garbage collection to prevent GC_FOR_ALLOC-type occurrences or to reduce fragmentation. You can verify which runtime is in use by calling System.getProperty("java.vm.version"). If ART is in use, the property's value is "2.0.0" or higher.
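A small sketch of that runtime check - the prefix test is an assumption; the docs only guarantee that ART reports "2.0.0" or higher:

String vmVersion = System.getProperty("java.vm.version");
// Dalvik reports a 1.x version; ART reports 2.0.0 or higher.
boolean isArt = vmVersion != null && !vmVersion.startsWith("1.");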
Furthermore, a compacting garbage collector is under development in the Android Open-Source Project (AOSP) to improve memory management. Because of this, you should avoid using techniques that are incompatible with compacting GC (such as saving pointers to object instance data). This is particularly important for apps that make use of the Java Native Interface (JNI). For more information, see Preventing JNI Issues.
Preventing JNI Issues
ART's JNI is somewhat stricter than Dalvik's. It is an especially good idea to use CheckJNI mode to catch common problems. If your app makes use of C/C++ code, you should review Android's JNI documentation.
Also, you can use native memory (NDK & JNI), so you actually bypass the heap size limitation.
Here are some posts made about it:
How to cache bitmaps into native memory
https://stackoverflow.com/a/9428660/1761003
JNI bitmap operations, for helping to avoid OOM when using large images
and here's a library made for it:
https://github.com/AndroidDeveloperLB/AndroidJniBitmapOperations
I see only two options:
You have memory leaks in your application.
Devices do not have enough memory when running your application.
If you are getting java.lang.OutOfMemoryError, this is the most common problem that occurs in Android. This error is thrown by the Java Virtual Machine (JVM) when an object cannot be allocated due to lack of memory space.
Try adding android:hardwareAccelerated="false" and android:largeHeap="true" in your manifest.xml file under application, like this:
<application
    android:name=".MyApplication"
    android:allowBackup="true"
    android:icon="@mipmap/ic_launcher"
    android:label="@string/app_name"
    android:theme="@style/AppTheme"
    android:hardwareAccelerated="false"
    android:largeHeap="true" />
You should implement an LRU cache manager when dealing with bitmaps:
http://developer.android.com/reference/android/util/LruCache.html
http://developer.android.com/training/displaying-bitmaps/cache-bitmap.html
When should I recycle a bitmap using LRUCache?
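A minimal sketch of such a cache, sized from the app's heap limit (one-eighth of available memory is the rule of thumb used in the Android docs):

// Size the cache in kilobytes, using 1/8 of the available heap.
final int maxMemoryKb = (int) (Runtime.getRuntime().maxMemory() / 1024);
final int cacheSizeKb = maxMemoryKb / 8;

LruCache<String, Bitmap> bitmapCache = new LruCache<String, Bitmap>(cacheSizeKb) {
    @Override
    protected int sizeOf(String key, Bitmap bitmap) {
        // Measure entries by byte count (in KB), not by number of items.
        return bitmap.getByteCount() / 1024;
    }
};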
OR
Use a third-party library like Universal Image Loader:
https://github.com/nostra13/Android-Universal-Image-Loader
EDIT :
Now when dealing with images, and most of the time with bitmaps, I use Glide, which lets you configure a Glide module and an LRU cache:
https://github.com/bumptech/glide
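Typical Glide usage then becomes a one-liner, with memory/disk caching and downsampling handled by the library (the URL is a placeholder):

Glide.with(context)
     .load("https://example.com/photo.jpg") // placeholder URL
     .into(imageView);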
A few hints to handle such errors/exceptions in Android apps (see the sketch after this list):
Activities and Application have methods like:
onLowMemory
onTrimMemory
Handle these methods to watch memory usage.
The <application> tag in the manifest can have the attribute largeHeap set to true, which requests more heap for the app sandbox.
Manage in-memory caching and disk caching:
Images and other data cached in memory while the app is running (locally in activities/fragments and globally) should be managed or released.
Use WeakReference or SoftReference for Java instances, specifically for files.
If there are many images, use a proper library/data structure which can manage memory, sample the loaded images, and handle disk caching.
Handle the OutOfMemoryError exception.
Follow best practices for coding:
Don't leak memory (don't hold everything with a strong reference).
Minimize the activity stack, i.e. the number of activities in the stack (don't hold everything on a context/activity).
Keep context in mind: data/instances not required outside a scope (activity or fragment) should be held in the appropriate context instead of a global reference.
Minimize the use of statics, and even more so singletons.
Take care of basic OS memory fundamentals:
Memory fragmentation issues.
Invoke System.gc() manually sometimes, when you are sure in-memory caches are no longer needed.
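As referenced above, a sketch of the onTrimMemory hook inside an Activity - bitmapCache is a hypothetical LruCache like the one shown earlier:

@Override
public void onTrimMemory(int level) {
    super.onTrimMemory(level);
    if (level >= ComponentCallbacks2.TRIM_MEMORY_UI_HIDDEN) {
        // The UI is hidden or memory is tight: drop the in-memory cache.
        bitmapCache.evictAll();
    }
}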
android:largeHeap="true" didn't fix the error for me.
In my case, I got this error after I added an icon/image to the drawable folder by converting an SVG to a vector drawable. Simply go to the icon's XML file and set small numbers for the width and height:
android:width="24dp"
android:height="24dp"
android:viewportWidth="3033"
android:viewportHeight="3033"
Check the image size.
I was loading a ~350kB image into an ImageView directly via XML (app:srcCompat), which resulted in an OOM error and crashed the application.
To solve it, I loaded the exact same image into the same ImageView using Glide, and it worked!
Lesson: reduce the image size / defer loading of the image.

Setting resolution to 720p or 1080p within GoogleTV?

How can I restrict my Google TV app to display in a specific resolution, i.e. 720p, 1080p, etc.? For some reason my app is stuck displaying at a resolution around 960x540, even though my Google TV and monitor can handle 1080p.
I'm not sure if it's just my Google TV that displays at 960x540 or if other Google TVs see the same thing. In any case, I want to make sure that my app can only be viewed at the resolutions 960x540 or 1280x720.
Each display on a Google TV is different (with the exception of built-in TVs like the LG). There is a step when the Google TV is first turned on where you establish the resolution of the TV.
Most current Google TVs (ARM based) are 1080p. Scaling down to 720p is accomplished by your TV.
All that said, 960x540 is the size in Android density-independent pixels for either 720p or 1080p.
So, to summarize, you can't / shouldn't do what you're asking.
I figured it out. I had to set the initial scale of the WebView, which allows me to view the WebView in 720p even though the screen resolution is 1080p.
This is the code I used:
boolean _bReloaded = false;
boolean _bLoaded = false;
int _nWebScale = 100;

DisplayMetrics displaymetrics = new DisplayMetrics();
getActivity().getWindowManager().getDefaultDisplay().getMetrics(displaymetrics);
int nScreenHeight = displaymetrics.heightPixels;

if (nScreenHeight >= 1000) {
    _nWebScale = (int) Math.round((nScreenHeight / 720.0) * 100.0);
    _bReloaded = false;
    mWebView.setVisibility(View.INVISIBLE);
}
mWebView.setInitialScale(_nWebScale);
But I wasn't done there.
At this point, the WebView is displayed at the proper resolution; however, there is a noticeable resizing of the WebView window when it loads, which is why the code above hides the WebView. Another problem: in my particular case, my HTML wasn't listening for a window resize event, so it didn't readjust its UI after the WebView visibly changed size. To fix this, I could change my HTML to react to a JavaScript window resize event, or do what I ultimately went with: reload the WebView after it loads with the incorrectly sized UI. With this solution, I didn't have to react to a window resize event:
mWebView.setWebChromeClient(new WebChromeClient() {
    public void onProgressChanged(WebView view, int progress) {
        if (progress == 100) {
            if (_nWebScale != 100 && (mWebView.getVisibility() != View.VISIBLE) && !_bReloaded) {
                _bReloaded = true;
                mWebView.reload();
            } else {
                _bLoaded = true;
                displayLoadingSpinner(false);
                mWebView.setVisibility(View.VISIBLE);
            }
        } else if (mProgressBar.getVisibility() != View.VISIBLE) {
            displayLoadingSpinner(true);
            _bLoaded = false;
        }
    }
});
Another piece of info that isn't directly related to the original question, but may be useful: within my code, I had to resize and reposition a VideoView based on parameters sent from the WebView. But because the WebView may be at a different resolution than the actual display, I had to adjust the parameters to put them in terms of the screen resolution.
For example:
// This parameter is equal to the height of the webview screen: 720
nHeightFromWebView;
// This is the x position of where the VideoView should be placed.
nXFromWebView;

float nScale = (float) _nWebScale / 100;
int nAdjustedHeight = (int) Math.round((float) nHeightFromWebView * nScale);
// The adjusted height computes to 1080 - the actual resolution of the screen.
int nNewX = (int) Math.round((float) nXFromWebView * nScale);
