Scale images inside a TextView using ImageGetter - Android

I'm trying to scale the images shown in the TextView, but I just can't get it to work.
I'm using the code below, but no matter what, the image is either shown cropped inside the container or not shown at all.
int width, height;
DisplayMetrics metrics = new DisplayMetrics();
metrics = Resources.getSystem().getDisplayMetrics();
int originalWidthScaled = (int) (result.getIntrinsicWidth() * metrics.density);
int originalHeightScaled = (int) (result.getIntrinsicHeight() * metrics.density);
if (originalWidthScaled > metrics.widthPixels) {
height = result.getIntrinsicHeight() * metrics.widthPixels
/ result.getIntrinsicWidth();
width = metrics.widthPixels;
} else {
height = originalHeightScaled;
width = originalWidthScaled;
}
urlDrawable.drawable = result;
urlDrawable.setBounds(0, 0, 0+width, 0+height);
// change the reference of the current drawable to the result
// from the HTTP call
// redraw the image by invalidating the container
container.invalidate();
// For ICS
container.setHeight(
container.getHeight() +
result.getIntrinsicHeight());
// Pre ICS
container.setEllipsize(null);

Answering my own question: I changed
if (originalWidthScaled > metrics.widthPixels) {
height = result.getIntrinsicHeight() * metrics.widthPixels
/ result.getIntrinsicWidth();
width = metrics.widthPixels;
}
to
if (originalWidthScaled > (metrics.widthPixels * 70) / 100) {
width = (metrics.widthPixels * 70) / 100;
height = result.getIntrinsicHeight() * width
/ result.getIntrinsicWidth();
}
Now the image occupies 70% of the screen width, which is exactly the maximum size of the container.
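For reference, here is a hedged Kotlin sketch of how an ImageGetter like this is usually wired into the TextView; the getter class name, HTML string, and view reference are placeholders, not from the code above:
import android.widget.TextView
import androidx.core.text.HtmlCompat

fun bindHtml(textView: TextView, html: String) {
// The third argument is the Html.ImageGetter that loads and sizes the drawables.
textView.text = HtmlCompat.fromHtml(
html,
HtmlCompat.FROM_HTML_MODE_LEGACY,
MyImageGetter(textView), // placeholder for the getter shown above
null
)
}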

For anyone still looking for an answer using newer APIs, this custom ImageGetter implementation scales the image up to occupy the device display width, scales it down if the image is larger than the display width, or keeps its original dimensions if it is smaller.
/**
* Custom ImageGetter for [HtmlCompat.fromHtml] which accepts both URL and Base64 sources from img tags.
*/
class HtmlImageGetter(
private val scope: LifecycleCoroutineScope,
private val res: Resources,
private val glide: RequestManager,
private val htmlTextView: AppCompatTextView,
@DrawableRes
private val errorImage: Int = 0,
private val matchParent: Boolean = true
) : ImageGetter {
override fun getDrawable(source: String): Drawable {
val holder = BitmapDrawablePlaceHolder(res, null)
scope.launch(Dispatchers.IO) {
runCatching {
glide
.asBitmap()
.load(
if (source.matches(Regex("data:image.*base64.*")))
Base64.decode(
source.replace("data:image.*base64".toRegex(), ""),
Base64.DEFAULT
) // Image tag used Base64
else
source // Image tag used URL
)
.submit()
.get()
}
.onSuccess { setDrawable(holder, it) }
.onFailure {
if (errorImage != 0)
BitmapFactory.decodeResource(res, errorImage)?.let {
setDrawable(holder, it)
}
}
}
return holder
}
private suspend fun setDrawable(holder: BitmapDrawablePlaceHolder, bitmap: Bitmap) {
val drawable = BitmapDrawable(res, bitmap)
val width: Int
val height: Int
val metrics = res.displayMetrics
val displayWidth = metrics.widthPixels - (htmlTextView.paddingStart + htmlTextView.paddingEnd + htmlTextView.marginStart + htmlTextView.marginEnd) * 100 / 100
val imageWidthScaled = (drawable.intrinsicWidth * metrics.density)
val imageHeightScaled = (drawable.intrinsicHeight * metrics.density)
// Scale up to the display width when matchParent is true,
// scale down when the image is wider than the display
if (matchParent || imageWidthScaled > displayWidth) {
width = displayWidth
height = (drawable.intrinsicHeight * width / drawable.intrinsicWidth)
}
else {
height = imageHeightScaled.roundToInt()
width = imageWidthScaled.roundToInt()
}
drawable.setBounds(0, 0, width, height)
holder.setDrawable(drawable)
holder.setBounds(0, 0, width, height)
withContext(Dispatchers.Main) { htmlTextView.text = htmlTextView.text }
}
internal class BitmapDrawablePlaceHolder(res: Resources, bitmap: Bitmap?) :
BitmapDrawable(res, bitmap) {
private var drawable: Drawable? = null
override fun draw(canvas: Canvas) {
drawable?.run { draw(canvas) }
}
fun setDrawable(drawable: Drawable) {
this.drawable = drawable
}
}
}
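A usage sketch for the class above (the view names, HTML string, and fallback drawable are placeholders; it assumes you are in an Activity or Fragment that provides lifecycleScope and Glide):
val imageGetter = HtmlImageGetter(
scope = lifecycleScope,
res = resources,
glide = Glide.with(this),
htmlTextView = htmlTextView,
errorImage = R.drawable.ic_broken_image, // optional fallback drawable
matchParent = true
)
htmlTextView.text = HtmlCompat.fromHtml(htmlString, HtmlCompat.FROM_HTML_MODE_LEGACY, imageGetter, null)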

Related

Is it possible to get the actual size of the Camera Device and import it as a preview size?

I am making a project with the Camera2 API; by default it detects the display size and crops/cuts the preview using MeasureSpec. My problem is that I don't know how to get the full size of what the camera hardware actually sees. Here is part of the code of what I am trying to do:
val cameraRes = Camera2BasicFragment().configureTransform(width, height) as Size
val cameraWidth = cameraRes.width
val cameraHeight = cameraRes.height
My aim is to get the actual resolution of the camera hardware on any device and use it as the preview size, in order to do the crop/cut calculations. In Landscape the TextureView is not stretched, but in Portrait it is squeezed or stretched completely, and I would like to avoid that. Here is the entire code:
class AutoFitTextureView @JvmOverloads constructor(
context: Context,
attrs: AttributeSet? = null,
defStyle: Int = 0
) : TextureView(context, attrs, defStyle) {
//private val SENSOR_INFO_PHYSICAL_SIZE: CameraCharacteristics.Key<SizeF>? = null
private val TAG: String = "Camera2APIPreviewResolution"
private var ratioWidth = 0 //Width for preview
private var ratioHeight = 0 //Height for preview
fun nodAspectRatio(width: Int, height: Int) {
if (width < 0 || height < 0) {
throw IllegalArgumentException("Size cannot be negative.")
}
val cameraRes = Camera2BasicFragment().configureTransform(width, height) as Size
val cameraWidth = cameraRes.width
val cameraHeight = cameraRes.height
val size = Point(width, height)
val windowManager: WindowManager = context.getSystemService(Service.WINDOW_SERVICE) as WindowManager
val display: Display = windowManager.defaultDisplay
display.getRealSize(size)
val displayWidth = size.x
val displayHeight = size.y
ratioWidth = width
ratioHeight = height
if (cameraWidth < cameraHeight) {
height != width
}
val coefWidth = displayWidth
val coefHeight = displayHeight
if (coefWidth < coefHeight) {
ratioWidth = displayWidth
ratioHeight = displayHeight * coefWidth
if (ratioHeight > displayHeight) {
ratioHeight - displayHeight
}
} else {
ratioWidth = displayWidth
ratioHeight = displayHeight
if (ratioWidth > displayWidth) {
ratioWidth - displayWidth
}
}
Log.d("Camera2APICamera", "width:$cameraWidth, height:$cameraHeight")
Log.d("Camera2APIDisplay", "width:$displayWidth, height:$displayHeight")
Log.d("Camera2APICoef", "width:$coefWidth, height:$coefHeight")
Log.d("Camera2APIResolution", "width:$ratioWidth, height:$ratioHeight")
requestLayout()
}
@SuppressLint("DrawAllocation")
override fun onMeasure(widthMeasureSpec: Int, heightMeasureSpec: Int) {
super.onMeasure(widthMeasureSpec, heightMeasureSpec)
val width = MeasureSpec.getSize(widthMeasureSpec)
val height = MeasureSpec.getSize(heightMeasureSpec)
if (ratioWidth == 0 || ratioHeight == 0) {
setMeasuredDimension(width, height)
/**It's forced stretched when either width or height are 0*/
} else {
/** This is made for calculation formula if width is less than the height as calculated and another if statement provided the Portrait and Landscape*/
if (width < height * ratioWidth / ratioHeight) {
setMeasuredDimension(width, width * ratioHeight / ratioWidth)
} else {
setMeasuredDimension(height * ratioWidth / ratioHeight, height)
}
}
}
}
And this is the crash message from the debugger:
E/AndroidRuntime: FATAL EXCEPTION: main
Process: com.example.camera2apikotlin4, PID: 18559
java.lang.ClassCastException: kotlin.Unit cannot be cast to android.util.Size
at com.example.camera2apikotlin4.AutoFitTextureView.nodAspectRatio(AutoFitTextureView.kt:35)
at com.example.camera2apikotlin4.Camera2BasicFragment.setUpCameraOutputs(Camera2BasicFragment.kt:364)
at com.example.camera2apikotlin4.Camera2BasicFragment.openCamera(Camera2BasicFragment.kt:422)
at com.example.camera2apikotlin4.Camera2BasicFragment.access$openCamera(Camera2BasicFragment.kt:33)
at com.example.camera2apikotlin4.Camera2BasicFragment$surfaceTextureListener$1.onSurfaceTextureAvailable(Camera2BasicFragment.kt:44)

Android image capture, how to get the lowest quality picture to take less space

I am using Android's default camera intent to capture images. The images that come out are of really good quality, and I cannot seem to find a way to lower their quality.
Is that even possible without implementing a custom camera?
Is it possible to set a size limit of, say, 2 MB maximum?
Or just take the image in the lowest possible quality, as the images in my application do not need to be of good quality.
public class ImageCaptureIntent {
public interface ImageCaptureResultListener {
void onImageCaptured(File image);
void onImageCaptureError(Exception exception);
}
static final int IMAGE_CAPTURE_REQUEST = 1;
private enum BundleKeys {
IMAGE_FILE
}
private File imageFile;
public void onSaveInstanceState(@NonNull Bundle outState) {
if (imageFile != null) {
outState.putString(BundleKeys.IMAGE_FILE.name(), imageFile.getAbsolutePath());
}
}
public void onRestoreInstanceState(@NonNull Bundle savedInstanceState) {
if (savedInstanceState.containsKey(BundleKeys.IMAGE_FILE.name())) {
imageFile = new File(savedInstanceState.getString(BundleKeys.IMAGE_FILE.name()));
}
}
private static File createTempFile(File directory) throws IOException {
String timestamp = new SimpleDateFormat("yyyyMMdd_HHmmss").format(new Date());
String filePrefix = "IMG_" + timestamp + "_";
File file = File.createTempFile(filePrefix,".jpg", directory);
if (file == null) {
throw new IOException("Could not create a temp file");
}
return file;
}
public boolean initiateImageCapture(ImageCaptureResultListener listener, Activity activity, File directory) {
if (listener == null) {
return false;
}
Intent captureIntent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
if (captureIntent.resolveActivity(activity.getPackageManager()) == null) {
listener.onImageCaptureError(new ActivityNotFoundException("No app for ACTION_IMAGE_CAPTURE"));
return false;
}
try {
this.imageFile = createTempFile(directory);
} catch (IOException e) {
listener.onImageCaptureError(e);
return false;
}
Uri imageUri = FileProvider.getUriForFile(activity,activity.getPackageName() + ".fileprovider", this.imageFile);
captureIntent.putExtra(MediaStore.EXTRA_OUTPUT, imageUri);
activity.startActivityForResult(captureIntent, IMAGE_CAPTURE_REQUEST);
return true;
}
public boolean parseActivityResult(ImageCaptureResultListener listener, int requestCode, int resultCode, Intent data) {
if (requestCode != IMAGE_CAPTURE_REQUEST) {
return false;
}
if (listener == null) {
return false;
}
if (resultCode == Activity.RESULT_OK) {
listener.onImageCaptured(imageFile);
} else {
listener.onImageCaptureError(new RuntimeException("Image capturing was cancelled"));
}
return true;
}
}
EDIT
I am not using Bitmaps in my application. I am taking images and then sending them to the backend. In a perfect scenario I would capture low-quality images and save them to the phone if possible. If that is not possible, then I would like to at least send compressed images to the backend.
When you get the path from the intent, use it like this:
CompressBitMap().execute(Uri.fromFile(File(mImagePath)))
inner class CompressBitMap : AsyncTask<Uri, Int, File>() {
override fun doInBackground(vararg p0: Uri?): File? {
val bitmap: Bitmap?
val filename = "${Date().time}profile.png"
val fileDir = File(Environment.getExternalStorageDirectory(), getString(R.string.app_name))
if (!fileDir.exists()) {
fileDir.mkdir()
}
val destPath = File(fileDir, filename)
val outPutStream = FileOutputStream(destPath)
try {
bitmap = ScaledPicture(p0[0], activity.contentResolver).getBitmap(400, 400)
bitmap.compress(Bitmap.CompressFormat.PNG, 100, outPutStream)
outPutStream.flush()
outPutStream.close()
} catch (e: Exception) {
e.printStackTrace()
}
return destPath
}
override fun onPostExecute(result: File?) {
super.onPostExecute(result)
result?.let {
mImagePath = result.absolutePath
setProfileImage(mImagePath, image_circle, null)
}
}
}
ScaledPicture and ImageScalingUtils are two important classes for reducing the size of the image.
ScaledPicture:
import android.content.ContentResolver
import android.graphics.Bitmap
import android.graphics.BitmapFactory
import android.graphics.Matrix
import android.graphics.RectF
import android.media.ExifInterface
import android.net.Uri
import com.silverskysoft.skysalon.imageUtils.ImageScalingUtils
import java.io.FileNotFoundException
import java.io.IOException
import java.io.InvalidObjectException
class ScaledPicture(private var uri: Uri?, private var resolver: ContentResolver) {
private var path: String? = null
private var orientation: Matrix? = null
private var storedHeight: Int = 0
private var storedWidth: Int = 0
@Throws(IOException::class)
private fun getInformation(): Boolean {
/*if (getInformationFromMediaDatabase())
return true;*/
return getInformationFromFileSystem()
}
/* Support for file managers and dropbox */
@Throws(IOException::class)
private fun getInformationFromFileSystem(): Boolean {
path = uri?.path
if (path == null)
return false
val exif = ExifInterface(path.toString())
val orientation = exif.getAttributeInt(ExifInterface.TAG_ORIENTATION,
ExifInterface.ORIENTATION_NORMAL)
this.orientation = Matrix()
when (orientation) {
ExifInterface.ORIENTATION_NORMAL -> {
}
ExifInterface.ORIENTATION_FLIP_HORIZONTAL -> this.orientation?.setScale(-1f, 1f)
ExifInterface.ORIENTATION_ROTATE_180 -> this.orientation?.setRotate(180f)
ExifInterface.ORIENTATION_FLIP_VERTICAL -> this.orientation?.setScale(1f, -1f)
ExifInterface.ORIENTATION_TRANSPOSE -> {
this.orientation?.setRotate(90f)
this.orientation?.postScale(-1f, 1f)
}
ExifInterface.ORIENTATION_ROTATE_90 -> this.orientation?.setRotate(90f)
ExifInterface.ORIENTATION_TRANSVERSE -> {
this.orientation?.setRotate(-90f)
this.orientation?.postScale(-1f, 1f)
}
ExifInterface.ORIENTATION_ROTATE_270 -> this.orientation?.setRotate(-90f)
}/* Identity matrix */
return true
}
@Throws(IOException::class)
private fun getStoredDimensions(): Boolean {
val input = resolver.openInputStream(uri)
val options = BitmapFactory.Options()
options.inJustDecodeBounds = true
BitmapFactory.decodeStream(resolver.openInputStream(uri), null, options)
/* The input stream could be reset instead of closed and reopened if it were possible
to reliably wrap the input stream on a buffered stream, but it's not possible because
decodeStream() places an upper read limit of 1024 bytes for a reset to be made (it calls
mark(1024) on the stream). */
input?.close()
if (options.outHeight <= 0 || options.outWidth <= 0)
return false
storedHeight = options.outHeight
storedWidth = options.outWidth
return true
}
@Throws(IOException::class)
fun getBitmap(reqWidth: Int, reqHeight: Int): Bitmap {
val heightWidth = 1000
if (!getInformation())
throw FileNotFoundException()
if (!getStoredDimensions())
throw InvalidObjectException(null)
val rect = RectF(0f, 0f, storedWidth.toFloat(), storedHeight.toFloat())
orientation?.mapRect(rect)
var width = rect.width().toInt()
var height = rect.height().toInt()
var subSample = 1
while (width > heightWidth || height > heightWidth) {
width /= 2
height /= 2
subSample *= 2
}
if (width == 0 || height == 0)
throw InvalidObjectException(null)
val options = BitmapFactory.Options()
options.inSampleSize = subSample
val subSampled = BitmapFactory.decodeStream(resolver.openInputStream(uri), null, options)
val picture: Bitmap
if (orientation?.isIdentity == false) {
picture = Bitmap.createBitmap(subSampled, 0, 0, options.outWidth, options.outHeight,
orientation, false)
subSampled.recycle()
} else
picture = subSampled
return ImageScalingUtils.decodeBitmap(picture, reqWidth, reqHeight, ImageScalingUtils.ScalingLogic.CROP)
}
}
ImageScalingUtils:=>
import android.graphics.Bitmap
import android.graphics.BitmapFactory
import java.io.ByteArrayOutputStream
/**
* Created by Avinash on 7/8/19.
* ImageScalingUtils responsible for compressing the bitmap efficiently
*/
object ImageScalingUtils {
/**
* Utility function for decoding an image resource. The decoded bitmap will
* be optimized for further scaling to the requested destination dimensions
* and scaling logic.
*
* @param dstWidth Width of destination area
* @param dstHeight Height of destination area
* @param scalingLogic Logic to use to avoid image stretching
* @return Decoded bitmap
*/
fun decodeBitmap(bm: Bitmap, dstWidth: Int, dstHeight: Int,
scalingLogic: ScalingLogic): Bitmap {
val stream = ByteArrayOutputStream()
bm.compress(Bitmap.CompressFormat.PNG, 100, stream)
val byteArray = stream.toByteArray()
val options = BitmapFactory.Options()
options.inJustDecodeBounds = true
BitmapFactory.decodeByteArray(byteArray, 0, byteArray.size, options)
options.inJustDecodeBounds = false
options.inSampleSize = calculateSampleSize(options.outWidth, options.outHeight, dstWidth,
dstHeight, scalingLogic)
return BitmapFactory.decodeByteArray(byteArray, 0, byteArray.size, options)
}
/**
* ScalingLogic defines how scaling should be carried out if source and
* destination image has different aspect ratio.
*
* CROP: Scales the image the minimum amount while making sure that at least
* one of the two dimensions fit inside the requested destination area.
* Parts of the source image will be cropped to realize this.
*
* FIT: Scales the image the minimum amount while making sure both
* dimensions fit inside the requested destination area. The resulting
* destination dimensions might be adjusted to a smaller size than
* requested.
*/
enum class ScalingLogic {
CROP, FIT
}
/**
* Calculate optimal down-sampling factor given the dimensions of a source
* image, the dimensions of a destination area and a scaling logic.
*
* @param srcWidth Width of source image
* @param srcHeight Height of source image
* @param dstWidth Width of destination area
* @param dstHeight Height of destination area
* @param scalingLogic Logic to use to avoid image stretching
* @return Optimal down scaling sample size for decoding
*/
private fun calculateSampleSize(srcWidth: Int, srcHeight: Int, dstWidth: Int, dstHeight: Int,
scalingLogic: ScalingLogic): Int {
if (scalingLogic == ScalingLogic.FIT) {
val srcAspect = srcWidth.toFloat() / srcHeight.toFloat()
val dstAspect = dstWidth.toFloat() / dstHeight.toFloat()
return if (srcAspect > dstAspect) {
srcWidth / dstWidth
} else {
srcHeight / dstHeight
}
} else {
val srcAspect = srcWidth.toFloat() / srcHeight.toFloat()
val dstAspect = dstWidth.toFloat() / dstHeight.toFloat()
return if (srcAspect > dstAspect) {
srcHeight / dstHeight
} else {
srcWidth / dstWidth
}
}
}
}
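If the goal is a smaller file for upload, note that writing the scaled bitmap as JPEG at a reduced quality usually produces a much smaller file than PNG at quality 100 (PNG is lossless, so its quality parameter barely affects the size). A minimal sketch, with an illustrative helper name and quality value:
import android.graphics.Bitmap
import java.io.File
import java.io.FileOutputStream

fun writeAsJpeg(bitmap: Bitmap, destination: File, quality: Int = 60) {
// Lossy JPEG at moderate quality keeps photos visually acceptable while shrinking the file.
FileOutputStream(destination).use { out ->
bitmap.compress(Bitmap.CompressFormat.JPEG, quality, out)
}
}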
You should not use a file and FileProvider; leave them all empty.
Then you will get a thumbnail Bitmap in onActivityResult:
Bitmap bitmap = (Bitmap) data.getExtras().get("data");
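A hedged Kotlin sketch of handling that thumbnail (the request code constant matches whatever you passed to startActivityForResult; the JPEG quality value is arbitrary):
// Inside the Activity that started the capture intent without EXTRA_OUTPUT:
override fun onActivityResult(requestCode: Int, resultCode: Int, data: Intent?) {
super.onActivityResult(requestCode, resultCode, data)
if (requestCode == IMAGE_CAPTURE_REQUEST && resultCode == Activity.RESULT_OK) {
// The camera app returns only a small thumbnail in the "data" extra.
val thumbnail = data?.extras?.get("data") as? Bitmap ?: return
ByteArrayOutputStream().use { stream ->
// Re-compress to JPEG at a low quality to keep the payload small.
thumbnail.compress(Bitmap.CompressFormat.JPEG, 50, stream)
val bytes = stream.toByteArray()
// bytes can now be saved to a file or sent to the backend.
}
}
}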

Crop bitmap image into a rectangle

I want to crop an image without using any library.
I am taking reference from https://stackoverflow.com/a/6909144 and trying to change the values, but I can't figure out the solution.
Bitmap bitmap = BitmapUtil.getBitmap(path);
Log.d(TAG,"bitmap width : "+bitmap.getWidth()+" height: "+bitmap.getHeight());
if (bitmap.getWidth() >= bitmap.getHeight()){
Toast.makeText(this,"Height Greater",Toast.LENGTH_SHORT).show();
Log.d(TAG,"Greater : Height");
textView.setText("Height Greater");
bitmap = Bitmap.createBitmap(
bitmap,
bitmap.getWidth()/2 - bitmap.getHeight()/2,
0,
bitmap.getHeight(),
bitmap.getHeight()
);
}else{
Toast.makeText(this,"Width Greater",Toast.LENGTH_SHORT).show();
Log.d(TAG,"Greater : Width");
textView.setText("Width Greater");
bitmap = Bitmap.createBitmap(
bitmap,
0,
bitmap.getHeight()/2 - bitmap.getWidth()/2,
bitmap.getWidth(),
bitmap.getWidth()
);
}
I want to crop the bitmap image within the rectangle.
For efficiently creating bitmaps, try this:
import android.graphics.Bitmap
import android.graphics.BitmapFactory
import java.io.ByteArrayOutputStream
/**
*
* ImageScalingUtils responsible for compressing the bitmap efficiently
*/
object ImageScalingUtils {
/**
* Utility function for decoding an image resource. The decoded bitmap will
* be optimized for further scaling to the requested destination dimensions
* and scaling logic.
*
* @param dstWidth Width of destination area
* @param dstHeight Height of destination area
* @param scalingLogic Logic to use to avoid image stretching
* @return Decoded bitmap
*/
fun decodeBitmap(bm: Bitmap, dstWidth: Int, dstHeight: Int,
scalingLogic: ScalingLogic): Bitmap {
val stream = ByteArrayOutputStream()
bm.compress(Bitmap.CompressFormat.PNG, 100, stream)
val byteArray = stream.toByteArray()
val options = BitmapFactory.Options()
options.inJustDecodeBounds = true
BitmapFactory.decodeByteArray(byteArray, 0, byteArray.size, options)
options.inJustDecodeBounds = false
options.inSampleSize = calculateSampleSize(options.outWidth, options.outHeight, dstWidth,
dstHeight, scalingLogic)
return BitmapFactory.decodeByteArray(byteArray, 0, byteArray.size, options)
}
/**
* ScalingLogic defines how scaling should be carried out if source and
* destination image has different aspect ratio.
*
* CROP: Scales the image the minimum amount while making sure that at least
* one of the two dimensions fit inside the requested destination area.
* Parts of the source image will be cropped to realize this.
*
* FIT: Scales the image the minimum amount while making sure both
* dimensions fit inside the requested destination area. The resulting
* destination dimensions might be adjusted to a smaller size than
* requested.
*/
enum class ScalingLogic {
CROP, FIT
}
/**
* Calculate optimal down-sampling factor given the dimensions of a source
* image, the dimensions of a destination area and a scaling logic.
*
* @param srcWidth Width of source image
* @param srcHeight Height of source image
* @param dstWidth Width of destination area
* @param dstHeight Height of destination area
* @param scalingLogic Logic to use to avoid image stretching
* @return Optimal down scaling sample size for decoding
*/
private fun calculateSampleSize(srcWidth: Int, srcHeight: Int, dstWidth: Int, dstHeight: Int,
scalingLogic: ScalingLogic): Int {
if (scalingLogic == ScalingLogic.FIT) {
val srcAspect = srcWidth.toFloat() / srcHeight.toFloat()
val dstAspect = dstWidth.toFloat() / dstHeight.toFloat()
return if (srcAspect > dstAspect) {
srcWidth / dstWidth
} else {
srcHeight / dstHeight
}
} else {
val srcAspect = srcWidth.toFloat() / srcHeight.toFloat()
val dstAspect = dstWidth.toFloat() / dstHeight.toFloat()
return if (srcAspect > dstAspect) {
srcHeight / dstHeight
} else {
srcWidth / dstWidth
}
}
}
}
ScaledPicture class to scale your picture:
import android.content.ContentResolver
import android.graphics.Bitmap
import android.graphics.BitmapFactory
import android.graphics.Matrix
import android.graphics.RectF
import android.net.Uri
import androidx.exifinterface.media.ExifInterface
import java.io.FileNotFoundException
import java.io.IOException
import java.io.InvalidObjectException
/**
*
* ScaledPicture responsible for compressing the bitmap efficiently
*/
class ScaledPicture(private var uri: Uri?, private var resolver: ContentResolver) {
private var path: String? = null
private var orientation: Matrix? = null
private var storedHeight: Int = 0
private var storedWidth: Int = 0
@Throws(IOException::class)
private fun getInformation(): Boolean {
/*if (getInformationFromMediaDatabase())
return true;*/
return getInformationFromFileSystem()
}
/* Support for file managers and dropbox */
@Throws(IOException::class)
private fun getInformationFromFileSystem(): Boolean {
path = uri?.path
if (path == null)
return false
val exif = ExifInterface(path.toString())
val orientation = exif.getAttributeInt(ExifInterface.TAG_ORIENTATION,
ExifInterface.ORIENTATION_NORMAL)
this.orientation = Matrix()
when (orientation) {
ExifInterface.ORIENTATION_NORMAL -> {
}
ExifInterface.ORIENTATION_FLIP_HORIZONTAL -> this.orientation?.setScale(-1f, 1f)
ExifInterface.ORIENTATION_ROTATE_180 -> this.orientation?.setRotate(180f)
ExifInterface.ORIENTATION_FLIP_VERTICAL -> this.orientation?.setScale(1f, -1f)
ExifInterface.ORIENTATION_TRANSPOSE -> {
this.orientation?.setRotate(90f)
this.orientation?.postScale(-1f, 1f)
}
ExifInterface.ORIENTATION_ROTATE_90 -> this.orientation?.setRotate(90f)
ExifInterface.ORIENTATION_TRANSVERSE -> {
this.orientation?.setRotate(-90f)
this.orientation?.postScale(-1f, 1f)
}
ExifInterface.ORIENTATION_ROTATE_270 -> this.orientation?.setRotate(-90f)
}/* Identity matrix */
return true
}
@Throws(IOException::class)
private fun getStoredDimensions(): Boolean {
val input = resolver.openInputStream(uri)
val options = BitmapFactory.Options()
options.inJustDecodeBounds = true
BitmapFactory.decodeStream(resolver.openInputStream(uri), null, options)
/* The input stream could be reset instead of closed and reopened if it were possible
to reliably wrap the input stream on a buffered stream, but it's not possible because
decodeStream() places an upper read limit of 1024 bytes for a reset to be made (it calls
mark(1024) on the stream). */
input?.close()
if (options.outHeight <= 0 || options.outWidth <= 0)
return false
storedHeight = options.outHeight
storedWidth = options.outWidth
return true
}
@Throws(IOException::class)
fun getBitmap(reqWidth: Int, reqHeight: Int): Bitmap {
val heightWidth = 1000
if (!getInformation())
throw FileNotFoundException()
if (!getStoredDimensions())
throw InvalidObjectException(null)
val rect = RectF(0f, 0f, storedWidth.toFloat(), storedHeight.toFloat())
orientation?.mapRect(rect)
var width = rect.width().toInt()
var height = rect.height().toInt()
var subSample = 1
while (width > heightWidth || height > heightWidth) {
width /= 2
height /= 2
subSample *= 2
}
if (width == 0 || height == 0)
throw InvalidObjectException(null)
val options = BitmapFactory.Options()
options.inSampleSize = subSample
val subSampled = BitmapFactory.decodeStream(resolver.openInputStream(uri), null, options)
val picture: Bitmap
if (orientation?.isIdentity == false) {
picture = Bitmap.createBitmap(subSampled, 0, 0, options.outWidth, options.outHeight,
orientation, false)
subSampled.recycle()
} else
picture = subSampled
return ImageScalingUtils.decodeBitmap(picture, reqWidth, reqHeight, ImageScalingUtils.ScalingLogic.CROP)
}
}
Copy the above classes into your project, and use them like this:
var bitmap = ScaledPicture(mSelectedUri, contentResolver).getBitmap(800, 800)
Pass your desired width and height.
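Separately, if all you need is the plain square crop from the original question, a minimal Kotlin sketch could look like this (the helper name is illustrative):
import android.graphics.Bitmap

fun centerSquareCrop(source: Bitmap): Bitmap {
val side = minOf(source.width, source.height)
val x = (source.width - side) / 2
val y = (source.height - side) / 2
// Returns a new bitmap backed by the centered square region of the source.
return Bitmap.createBitmap(source, x, y, side, side)
}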

How to have a similar mechanism to center-crop on ExoPlayer's PlayerView, but not on the center?

Background
We record a video of the user's face, and usually the face is located at the upper half of the video.
Later we wish to view the video, but the aspect ratio of the PlayerView might be different than the one of the video, so there needs to be some scaling and cropping.
The problem
The only way I've found to scale the PlayerView so that it fills the entire space it has while keeping the aspect ratio (which will result in cropping when needed, of course) is by using app:resize_mode="zoom". Here's a sample of how it works with center-crop: http://s000.tinyupload.com/?file_id=00574047057406286563 . The more similar the aspect ratios of the Views showing the content are to the content itself, the less cropping is needed.
But this crops only around the center, meaning it takes the point at 0.5x0.5 of the video and scale-crops from there. This causes many cases where the important content of the video is lost.
For example, if we have a video that was taken in portrait, and we have a square PlayerView and want to show the top area, this is the part that will be visible:
Of course, if the content itself is square, and the views are also square, it should show the entire content, without cropping.
What I've tried
I've tried searching over the Internet, StackOverflow (here) and on Github, but I couldn't find how to do it. The only clue I've found is about AspectRatioFrameLayout and AspectRatioTextureView, but I didn't find how to use them for this task, if it's even possible.
I was told (here) that I should use a normal TextureView, provide it directly to SimpleExoPlayer using SimpleExoPlayer.setVideoTextureView, and set a special transformation on it using TextureView.setTransform.
After a lot of experimenting with what's best to use (and looking at the video-crop repository, the SuperImageView repository, and the JCropImageView repository, which have examples of scaling/cropping ImageView and video), I've published a working sample that seems to show the video correctly, but I'm still not sure about it, as I also use an ImageView that's shown on top of it before it starts playing (to have a nicer transition instead of black content).
Here's the current code:
class MainActivity : AppCompatActivity() {
private val imageResId = R.drawable.test
private val videoResId = R.raw.test
private val percentageY = 0.2f
private var player: SimpleExoPlayer? = null
override fun onCreate(savedInstanceState: Bundle?) {
window.setBackgroundDrawable(ColorDrawable(0xff000000.toInt()))
super.onCreate(savedInstanceState)
if (cache == null) {
cache = SimpleCache(File(cacheDir, "media"), LeastRecentlyUsedCacheEvictor(MAX_PREVIEW_CACHE_SIZE_IN_BYTES))
}
setContentView(R.layout.activity_main)
// imageView.visibility = View.INVISIBLE
imageView.setImageResource(imageResId)
imageView.doOnPreDraw {
imageView.imageMatrix = prepareMatrixForImageView(imageView, imageView.drawable.intrinsicWidth.toFloat(), imageView.drawable.intrinsicHeight.toFloat())
// imageView.imageMatrix = prepareMatrix(imageView, imageView.drawable.intrinsicWidth.toFloat(), imageView.drawable.intrinsicHeight.toFloat())
// imageView.visibility = View.VISIBLE
}
}
override fun onStart() {
super.onStart()
playVideo()
}
private fun prepareMatrix(view: View, contentWidth: Float, contentHeight: Float): Matrix {
var scaleX = 1.0f
var scaleY = 1.0f
val viewWidth = view.measuredWidth.toFloat()
val viewHeight = view.measuredHeight.toFloat()
Log.d("AppLog", "viewWidth $viewWidth viewHeight $viewHeight contentWidth:$contentWidth contentHeight:$contentHeight")
if (contentWidth > viewWidth && contentHeight > viewHeight) {
scaleX = contentWidth / viewWidth
scaleY = contentHeight / viewHeight
} else if (contentWidth < viewWidth && contentHeight < viewHeight) {
scaleY = viewWidth / contentWidth
scaleX = viewHeight / contentHeight
} else if (viewWidth > contentWidth)
scaleY = viewWidth / contentWidth / (viewHeight / contentHeight)
else if (viewHeight > contentHeight)
scaleX = viewHeight / contentHeight / (viewWidth / contentWidth)
val matrix = Matrix()
val pivotPercentageX = 0.5f
val pivotPercentageY = percentageY
matrix.setScale(scaleX, scaleY, viewWidth * pivotPercentageX, viewHeight * pivotPercentageY)
return matrix
}
private fun prepareMatrixForVideo(view: View, contentWidth: Float, contentHeight: Float): Matrix {
val msWidth = view.measuredWidth
val msHeight = view.measuredHeight
val matrix = Matrix()
matrix.setScale(1f, (contentHeight / contentWidth) * (msWidth.toFloat() / msHeight), msWidth / 2f, percentageY * msHeight) /*,msWidth/2f,msHeight/2f*/
return matrix
}
private fun prepareMatrixForImageView(view: View, contentWidth: Float, contentHeight: Float): Matrix {
val dw = contentWidth
val dh = contentHeight
val msWidth = view.measuredWidth
val msHeight = view.measuredHeight
// Log.d("AppLog", "viewWidth $msWidth viewHeight $msHeight contentWidth:$contentWidth contentHeight:$contentHeight")
val scalew = msWidth.toFloat() / dw
val theoryh = (dh * scalew).toInt()
val scaleh = msHeight.toFloat() / dh
val theoryw = (dw * scaleh).toInt()
val scale: Float
var dx = 0
var dy = 0
if (scalew > scaleh) { // fit width
scale = scalew
// dy = ((msHeight - theoryh) * 0.0f + 0.5f).toInt() // + 0.5f for rounding
} else {
scale = scaleh
dx = ((msWidth - theoryw) * 0.5f + 0.5f).toInt() // + 0.5f for rounding
}
dy = ((msHeight - theoryh) * percentageY + 0.5f).toInt() // + 0.5f for rounding
val matrix = Matrix()
// Log.d("AppLog", "scale:$scale dx:$dx dy:$dy")
matrix.setScale(scale, scale)
matrix.postTranslate(dx.toFloat(), dy.toFloat())
return matrix
}
private fun playVideo() {
player = ExoPlayerFactory.newSimpleInstance(this@MainActivity, DefaultTrackSelector())
player!!.setVideoTextureView(textureView)
player!!.addVideoListener(object : VideoListener {
override fun onVideoSizeChanged(width: Int, height: Int, unappliedRotationDegrees: Int, pixelWidthHeightRatio: Float) {
super.onVideoSizeChanged(width, height, unappliedRotationDegrees, pixelWidthHeightRatio)
Log.d("AppLog", "onVideoSizeChanged: $width $height")
val videoWidth = if (unappliedRotationDegrees % 180 == 0) width else height
val videoHeight = if (unappliedRotationDegrees % 180 == 0) height else width
val matrix = prepareMatrixForVideo(textureView, videoWidth.toFloat(), videoHeight.toFloat())
textureView.setTransform(matrix)
}
override fun onRenderedFirstFrame() {
Log.d("AppLog", "onRenderedFirstFrame")
player!!.removeVideoListener(this)
// imageView.animate().alpha(0f).setDuration(5000).start()
imageView.visibility = View.INVISIBLE
}
})
player!!.volume = 0f
player!!.repeatMode = Player.REPEAT_MODE_ALL
player!!.playRawVideo(this, videoResId)
player!!.playWhenReady = true
// player!!.playVideoFromUrl(this, "https://sample-videos.com/video123/mkv/240/big_buck_bunny_240p_20mb.mkv", cache!!)
// player!!.playVideoFromUrl(this, "https://sample-videos.com/video123/mkv/720/big_buck_bunny_720p_1mb.mkv", cache!!)
// player!!.playVideoFromUrl(this@MainActivity, "https://sample-videos.com/video123/mkv/720/big_buck_bunny_720p_1mb.mkv")
}
override fun onStop() {
super.onStop()
player!!.setVideoTextureView(null)
// playerView.player = null
player!!.release()
player = null
}
companion object {
const val MAX_PREVIEW_CACHE_SIZE_IN_BYTES = 20L * 1024L * 1024L
var cache: com.google.android.exoplayer2.upstream.cache.Cache? = null
@JvmStatic
fun getUserAgent(context: Context): String {
val packageManager = context.packageManager
val info = packageManager.getPackageInfo(context.packageName, 0)
val appName = info.applicationInfo.loadLabel(packageManager).toString()
return Util.getUserAgent(context, appName)
}
}
fun SimpleExoPlayer.playRawVideo(context: Context, @RawRes rawVideoRes: Int) {
val dataSpec = DataSpec(RawResourceDataSource.buildRawResourceUri(rawVideoRes))
val rawResourceDataSource = RawResourceDataSource(context)
rawResourceDataSource.open(dataSpec)
val factory: DataSource.Factory = DataSource.Factory { rawResourceDataSource }
prepare(LoopingMediaSource(ExtractorMediaSource.Factory(factory).createMediaSource(rawResourceDataSource.uri)))
}
fun SimpleExoPlayer.playVideoFromUrl(context: Context, url: String, cache: Cache? = null) = playVideoFromUri(context, Uri.parse(url), cache)
fun SimpleExoPlayer.playVideoFile(context: Context, file: File) = playVideoFromUri(context, Uri.fromFile(file))
fun SimpleExoPlayer.playVideoFromUri(context: Context, uri: Uri, cache: Cache? = null) {
val factory = if (cache != null)
CacheDataSourceFactory(cache, DefaultHttpDataSourceFactory(getUserAgent(context)))
else
DefaultDataSourceFactory(context, MainActivity.getUserAgent(context))
val mediaSource = ExtractorMediaSource.Factory(factory).createMediaSource(uri)
prepare(mediaSource)
}
}
I had various issues while trying this until I got to the current situation, and I've updated this question multiple times accordingly. Now it even works with the percentageY I talked about, so I could set it to be 20% from the top of the video if I wish. However, I still think there is a good chance that something is wrong, because when I tried to set it to 50%, I noticed that the content might not fit the entire View.
I even looked at the source code of ImageView (here), to see how center-crop is used. When applied to the ImageView, it still worked as center-crop, but when I used the same technique on the video, it gave me a very wrong result.
The questions
My goal here was to show both ImageView and the video so that it will smoothly transition from a static image to a video. All that while having both have the top-scale-crop of 20% from the top (for example). I've published a sample project here to try it out and share people of what I've found.
So now my questions are about why this doesn't seem to work well for the ImageView and/or the video:
As it turns out, none of the matrix creations that I've tried work well for either the ImageView or the video. What's wrong with them exactly? How can I change them so that they look the same? To scale-crop from the top 20%, for example?
I tried to use the exact same matrix for both, but it seems each needs a different one, even though both have the exact same size and content size. Why would I need a different matrix for each?
EDIT: after this question was answered, I've decided to make a small sample of how to use it (GitHub repository available here):
import android.content.Context
import android.graphics.Matrix
import android.graphics.PointF
import android.net.Uri
import android.os.Bundle
import android.view.TextureView
import android.view.View
import androidx.annotation.RawRes
import androidx.appcompat.app.AppCompatActivity
import androidx.core.view.doOnPreDraw
import com.google.android.exoplayer2.ExoPlayerFactory
import com.google.android.exoplayer2.Player
import com.google.android.exoplayer2.SimpleExoPlayer
import com.google.android.exoplayer2.source.ExtractorMediaSource
import com.google.android.exoplayer2.source.LoopingMediaSource
import com.google.android.exoplayer2.trackselection.DefaultTrackSelector
import com.google.android.exoplayer2.upstream.*
import com.google.android.exoplayer2.upstream.cache.Cache
import com.google.android.exoplayer2.upstream.cache.CacheDataSourceFactory
import com.google.android.exoplayer2.upstream.cache.LeastRecentlyUsedCacheEvictor
import com.google.android.exoplayer2.upstream.cache.SimpleCache
import com.google.android.exoplayer2.util.Util
import com.google.android.exoplayer2.video.VideoListener
import kotlinx.android.synthetic.main.activity_main.*
import java.io.File
// https://stackoverflow.com/questions/54216273/how-to-have-similar-mechanism-of-center-crop-on-exoplayers-playerview-but-not
class MainActivity : AppCompatActivity() {
companion object {
private val FOCAL_POINT = PointF(0.5f, 0.2f)
private const val IMAGE_RES_ID = R.drawable.test
private const val VIDEO_RES_ID = R.raw.test
private var cache: Cache? = null
private const val MAX_PREVIEW_CACHE_SIZE_IN_BYTES = 20L * 1024L * 1024L
@JvmStatic
fun getUserAgent(context: Context): String {
val packageManager = context.packageManager
val info = packageManager.getPackageInfo(context.packageName, 0)
val appName = info.applicationInfo.loadLabel(packageManager).toString()
return Util.getUserAgent(context, appName)
}
}
private var player: SimpleExoPlayer? = null
override fun onCreate(savedInstanceState: Bundle?) {
super.onCreate(savedInstanceState)
setContentView(R.layout.activity_main)
if (cache == null)
cache = SimpleCache(File(cacheDir, "media"), LeastRecentlyUsedCacheEvictor(MAX_PREVIEW_CACHE_SIZE_IN_BYTES))
// imageView.visibility = View.INVISIBLE
imageView.setImageResource(IMAGE_RES_ID)
}
private fun prepareMatrix(view: View, mediaWidth: Float, mediaHeight: Float, focalPoint: PointF): Matrix? {
if (view.visibility == View.GONE)
return null
val viewHeight = (view.height - view.paddingTop - view.paddingBottom).toFloat()
val viewWidth = (view.width - view.paddingStart - view.paddingEnd).toFloat()
if (viewWidth <= 0 || viewHeight <= 0)
return null
val matrix = Matrix()
if (view is TextureView)
// Restore true media size for further manipulation.
matrix.setScale(mediaWidth / viewWidth, mediaHeight / viewHeight)
val scaleFactorY = viewHeight / mediaHeight
val scaleFactor: Float
var px = 0f
var py = 0f
if (mediaWidth * scaleFactorY >= viewWidth) {
// Fit height
scaleFactor = scaleFactorY
px = -(mediaWidth * scaleFactor - viewWidth) * focalPoint.x / (1 - scaleFactor)
} else {
// Fit width
scaleFactor = viewWidth / mediaWidth
py = -(mediaHeight * scaleFactor - viewHeight) * focalPoint.y / (1 - scaleFactor)
}
matrix.postScale(scaleFactor, scaleFactor, px, py)
return matrix
}
private fun playVideo() {
player = ExoPlayerFactory.newSimpleInstance(this@MainActivity, DefaultTrackSelector())
player!!.setVideoTextureView(textureView)
player!!.addVideoListener(object : VideoListener {
override fun onVideoSizeChanged(videoWidth: Int, videoHeight: Int, unappliedRotationDegrees: Int, pixelWidthHeightRatio: Float) {
super.onVideoSizeChanged(videoWidth, videoHeight, unappliedRotationDegrees, pixelWidthHeightRatio)
textureView.setTransform(prepareMatrix(textureView, videoWidth.toFloat(), videoHeight.toFloat(), FOCAL_POINT))
}
override fun onRenderedFirstFrame() {
// Log.d("AppLog", "onRenderedFirstFrame")
player!!.removeVideoListener(this)
imageView.animate().alpha(0f).setDuration(2000).start()
// imageView.visibility = View.INVISIBLE
}
})
player!!.volume = 0f
player!!.repeatMode = Player.REPEAT_MODE_ALL
player!!.playRawVideo(this, VIDEO_RES_ID)
player!!.playWhenReady = true
// player!!.playVideoFromUrl(this, "https://sample-videos.com/video123/mkv/240/big_buck_bunny_240p_20mb.mkv", cache!!)
// player!!.playVideoFromUrl(this, "https://sample-videos.com/video123/mkv/720/big_buck_bunny_720p_1mb.mkv", cache!!)
// player!!.playVideoFromUrl(this@MainActivity, "https://sample-videos.com/video123/mkv/720/big_buck_bunny_720p_1mb.mkv")
}
override fun onStart() {
super.onStart()
imageView.doOnPreDraw {
val imageWidth: Float = imageView.drawable.intrinsicWidth.toFloat()
val imageHeight: Float = imageView.drawable.intrinsicHeight.toFloat()
imageView.imageMatrix = prepareMatrix(imageView, imageWidth, imageHeight, FOCAL_POINT)
}
playVideo()
}
override fun onStop() {
super.onStop()
if (player != null) {
player!!.setVideoTextureView(null)
// playerView.player = null
player!!.release()
player = null
}
}
override fun onDestroy() {
super.onDestroy()
if (!isChangingConfigurations)
cache?.release()
}
fun SimpleExoPlayer.playRawVideo(context: Context, @RawRes rawVideoRes: Int) {
val dataSpec = DataSpec(RawResourceDataSource.buildRawResourceUri(rawVideoRes))
val rawResourceDataSource = RawResourceDataSource(context)
rawResourceDataSource.open(dataSpec)
val factory: DataSource.Factory = DataSource.Factory { rawResourceDataSource }
prepare(LoopingMediaSource(ExtractorMediaSource.Factory(factory).createMediaSource(rawResourceDataSource.uri)))
}
fun SimpleExoPlayer.playVideoFromUrl(context: Context, url: String, cache: Cache? = null) = playVideoFromUri(context, Uri.parse(url), cache)
fun SimpleExoPlayer.playVideoFile(context: Context, file: File) = playVideoFromUri(context, Uri.fromFile(file))
fun SimpleExoPlayer.playVideoFromUri(context: Context, uri: Uri, cache: Cache? = null) {
val factory = if (cache != null)
CacheDataSourceFactory(cache, DefaultHttpDataSourceFactory(getUserAgent(context)))
else
DefaultDataSourceFactory(context, MainActivity.getUserAgent(context))
val mediaSource = ExtractorMediaSource.Factory(factory).createMediaSource(uri)
prepare(mediaSource)
}
}
Here's a solution for ImageView alone, if needed:
class ScaleCropImageView(context: Context, attrs: AttributeSet?) : AppCompatImageView(context, attrs) {
var focalPoint = PointF(0.5f, 0.5f)
set(value) {
field = value
updateMatrix()
}
private val viewWidth: Float
get() = (width - paddingLeft - paddingRight).toFloat()
private val viewHeight: Float
get() = (height - paddingTop - paddingBottom).toFloat()
init {
scaleType = ScaleType.MATRIX
}
override fun onSizeChanged(w: Int, h: Int, oldw: Int, oldh: Int) {
super.onSizeChanged(w, h, oldw, oldh)
updateMatrix()
}
override fun setImageDrawable(drawable: Drawable?) {
super.setImageDrawable(drawable)
updateMatrix()
}
@Suppress("MemberVisibilityCanBePrivate")
fun updateMatrix() {
if (scaleType != ImageView.ScaleType.MATRIX)
return
val dr = drawable ?: return
imageMatrix = prepareMatrix(
viewWidth, viewHeight,
dr.intrinsicWidth.toFloat(), dr.intrinsicHeight.toFloat(), focalPoint, Matrix()
)
}
private fun prepareMatrix(
viewWidth: Float, viewHeight: Float, mediaWidth: Float, mediaHeight: Float,
focalPoint: PointF, matrix: Matrix
): Matrix? {
if (viewWidth <= 0 || viewHeight <= 0)
return null
var scaleFactor = viewHeight / mediaHeight
if (mediaWidth * scaleFactor >= viewWidth) {
// Fit height
matrix.postScale(scaleFactor, scaleFactor, -(mediaWidth * scaleFactor - viewWidth) * focalPoint.x / (1 - scaleFactor), 0f)
} else {
// Fit width
scaleFactor = viewWidth / mediaWidth
matrix.postScale(scaleFactor, scaleFactor, 0f, -(mediaHeight * scaleFactor - viewHeight) * focalPoint.y / (1 - scaleFactor))
}
return matrix
}
}
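A usage sketch for the view above (the view id and drawable are placeholders):
// Declare ScaleCropImageView in the layout instead of a plain ImageView, then:
val scaleCropImageView = findViewById<ScaleCropImageView>(R.id.scaleCropImageView)
scaleCropImageView.setImageResource(R.drawable.test)
// Pin the point 20% from the top (horizontally centered) instead of the center.
scaleCropImageView.focalPoint = PointF(0.5f, 0.2f)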
The question is how to manipulate an image like ImageView.ScaleType.CENTER_CROP but to shift the focus from the center to another location that is 20% from the top of the image. First, let's look at what CENTER_CROP does:
From the documentation:
CENTER_CROP
Scale the image uniformly (maintain the image's aspect ratio) so that both dimensions (width and height) of the image will be equal to or larger than the corresponding dimension of the view (minus padding). The image is then centered in the view. From XML, use this syntax: android:scaleType="centerCrop".
In other words, scale the image without distortion such that either the width or height of the image (or both width and height) fit within the view so that the view is completely filled with the image (no gaps.)
Another way to think of this is that the center of the image is "pinned" to the center of the view. The image is then scaled to meet the criteria above.
In the following video, the white lines mark the center of the image; the red lines mark the center of the view. The scale type is CENTER_CROP. Notice how the center points of the image and the view coincide. As the view changes size, these two points continue to overlap and always appear at the center of the view regardless of the view size.
So, what does it mean to have center crop-like behavior at a different location such as 20% from the top? Like center crop, we can specify that the point that is 20% from the top of the image and the point that is 20% from the top of the view will be "pinned", just like the 50% point is "pinned" in center crop. The horizontal location of this point remains at 50% of the image and view. The image can now be scaled to satisfy the other conditions of center crop, which specify that the width and/or height of the image will fit the view with no gaps. (The size of the view is understood to be the view size less padding.)
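A small worked example of that pinning, with illustrative numbers (not taken from the sample project):
// 1080x1920 portrait content in a 1080x1080 view, pinned 20% from the top.
val viewWidth = 1080f; val viewHeight = 1080f
val mediaWidth = 1080f; val mediaHeight = 1920f
val focalY = 0.2f
val scale = viewWidth / mediaWidth              // = 1.0, fit width
val overflow = mediaHeight * scale - viewHeight // = 840 px of content taller than the view
val dy = -overflow * focalY                     // = -168 px: shift the content up
// The visible band starts 168 px from the top of the content,
// instead of overflow / 2 = 420 px as a plain center crop would.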
Here is a short video of this 20% crop behavior. In this video, the white lines show the middle of the image, the red lines show the pinned point in the view, and the blue line behind the horizontal red line marks 20% from the top of the image. (Demo project is on GitHub.)
Here is the result, showing the full image that was supplied and the video in a square frame that transitions from the still image.
MainActivity.kt
prepareMatrix() is the method that does the work of determining how to scale/crop the image. There is some additional work to be done for the video, since it appears that the video is made to fit the TextureView as if with scale type "FIT_XY" when it is assigned to the TextureView. Because of this scaling, the media size must be restored before prepareMatrix() is called for the video.
class MainActivity : AppCompatActivity() {
private val imageResId = R.drawable.test
private val videoResId = R.raw.test
private var player: SimpleExoPlayer? = null
private val mFocalPoint = PointF(0.5f, 0.2f)
override fun onCreate(savedInstanceState: Bundle?) {
window.setBackgroundDrawable(ColorDrawable(0xff000000.toInt()))
super.onCreate(savedInstanceState)
if (cache == null) {
cache = SimpleCache(File(cacheDir, "media"), LeastRecentlyUsedCacheEvictor(MAX_PREVIEW_CACHE_SIZE_IN_BYTES))
}
setContentView(R.layout.activity_main)
// imageView.visibility = View.INVISIBLE
imageView.setImageResource(imageResId)
imageView.doOnPreDraw {
imageView.scaleType = ImageView.ScaleType.MATRIX
val imageWidth: Float = ContextCompat.getDrawable(this, imageResId)!!.intrinsicWidth.toFloat()
val imageHeight: Float = ContextCompat.getDrawable(this, imageResId)!!.intrinsicHeight.toFloat()
imageView.imageMatrix = prepareMatrix(imageView, imageWidth, imageHeight, mFocalPoint, Matrix())
val b = BitmapFactory.decodeResource(resources, imageResId)
val d = BitmapDrawable(resources, b.copy(Bitmap.Config.ARGB_8888, true))
val c = Canvas(d.bitmap)
val p = Paint()
p.color = resources.getColor(android.R.color.holo_red_dark)
p.style = Paint.Style.STROKE
val strokeWidth = 10
p.strokeWidth = strokeWidth.toFloat()
// Horizontal line
c.drawLine(0f, imageHeight * mFocalPoint.y, imageWidth, imageHeight * mFocalPoint.y, p)
// Vertical line
c.drawLine(imageWidth * mFocalPoint.x, 0f, imageWidth * mFocalPoint.x, imageHeight, p)
// Line in horizontal and vertical center
p.color = resources.getColor(android.R.color.white)
c.drawLine(imageWidth / 2, 0f, imageWidth / 2, imageHeight, p)
c.drawLine(0f, imageHeight / 2, imageWidth, imageHeight / 2, p)
imageView.setImageBitmap(d.bitmap)
imageViewFull.setImageBitmap(d.bitmap)
}
}
fun startPlay(view: View) {
playVideo()
}
private fun getViewWidth(view: View): Float {
return (view.width - view.paddingStart - view.paddingEnd).toFloat()
}
private fun getViewHeight(view: View): Float {
return (view.height - view.paddingTop - view.paddingBottom).toFloat()
}
private fun prepareMatrix(targetView: View, mediaWidth: Float, mediaHeight: Float,
focalPoint: PointF, matrix: Matrix): Matrix {
if (targetView.visibility != View.VISIBLE) {
return matrix
}
val viewHeight = getViewHeight(targetView)
val viewWidth = getViewWidth(targetView)
val scaleFactorY = viewHeight / mediaHeight
val scaleFactor: Float
val px: Float
val py: Float
if (mediaWidth * scaleFactorY >= viewWidth) {
// Fit height
scaleFactor = scaleFactorY
px = -(mediaWidth * scaleFactor - viewWidth) * focalPoint.x / (1 - scaleFactor)
py = 0f
} else {
// Fit width
scaleFactor = viewWidth / mediaWidth
px = 0f
py = -(mediaHeight * scaleFactor - viewHeight) * focalPoint.y / (1 - scaleFactor)
}
matrix.postScale(scaleFactor, scaleFactor, px, py)
return matrix
}
private fun playVideo() {
player = ExoPlayerFactory.newSimpleInstance(this@MainActivity, DefaultTrackSelector())
player!!.setVideoTextureView(textureView)
player!!.addVideoListener(object : VideoListener {
override fun onVideoSizeChanged(width: Int, height: Int, unappliedRotationDegrees: Int, pixelWidthHeightRatio: Float) {
super.onVideoSizeChanged(width, height, unappliedRotationDegrees, pixelWidthHeightRatio)
val matrix = Matrix()
// Restore true media size for further manipulation.
matrix.setScale(width / getViewWidth(textureView), height / getViewHeight(textureView))
textureView.setTransform(prepareMatrix(textureView, width.toFloat(), height.toFloat(), mFocalPoint, matrix))
}
override fun onRenderedFirstFrame() {
Log.d("AppLog", "onRenderedFirstFrame")
player!!.removeVideoListener(this)
imageView.animate().alpha(0f).setDuration(2000).start()
imageView.visibility = View.INVISIBLE
}
})
player!!.volume = 0f
player!!.repeatMode = Player.REPEAT_MODE_ALL
player!!.playRawVideo(this, videoResId)
player!!.playWhenReady = true
// player!!.playVideoFromUrl(this, "https://sample-videos.com/video123/mkv/240/big_buck_bunny_240p_20mb.mkv", cache!!)
// player!!.playVideoFromUrl(this, "https://sample-videos.com/video123/mkv/720/big_buck_bunny_720p_1mb.mkv", cache!!)
// player!!.playVideoFromUrl(this@MainActivity, "https://sample-videos.com/video123/mkv/720/big_buck_bunny_720p_1mb.mkv")
}
override fun onStop() {
super.onStop()
if (player != null) {
player!!.setVideoTextureView(null)
// playerView.player = null
player!!.release()
player = null
}
}
companion object {
const val MAX_PREVIEW_CACHE_SIZE_IN_BYTES = 20L * 1024L * 1024L
var cache: com.google.android.exoplayer2.upstream.cache.Cache? = null
@JvmStatic
fun getUserAgent(context: Context): String {
val packageManager = context.packageManager
val info = packageManager.getPackageInfo(context.packageName, 0)
val appName = info.applicationInfo.loadLabel(packageManager).toString()
return Util.getUserAgent(context, appName)
}
}
fun SimpleExoPlayer.playRawVideo(context: Context, @RawRes rawVideoRes: Int) {
val dataSpec = DataSpec(RawResourceDataSource.buildRawResourceUri(rawVideoRes))
val rawResourceDataSource = RawResourceDataSource(context)
rawResourceDataSource.open(dataSpec)
val factory: DataSource.Factory = DataSource.Factory { rawResourceDataSource }
prepare(LoopingMediaSource(ExtractorMediaSource.Factory(factory).createMediaSource(rawResourceDataSource.uri)))
}
fun SimpleExoPlayer.playVideoFromUrl(context: Context, url: String, cache: Cache? = null) = playVideoFromUri(context, Uri.parse(url), cache)
fun SimpleExoPlayer.playVideoFile(context: Context, file: File) = playVideoFromUri(context, Uri.fromFile(file))
fun SimpleExoPlayer.playVideoFromUri(context: Context, uri: Uri, cache: Cache? = null) {
val factory = if (cache != null)
CacheDataSourceFactory(cache, DefaultHttpDataSourceFactory(getUserAgent(context)))
else
DefaultDataSourceFactory(context, MainActivity.getUserAgent(context))
val mediaSource = ExtractorMediaSource.Factory(factory).createMediaSource(uri)
prepare(mediaSource)
}
}
You can use app:resize_mode="zoom" on com.google.android.exoplayer2.ui.PlayerView.
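Programmatically, the same resize mode can be set on the PlayerView (a minimal sketch; playerView is a placeholder for your view reference):
import com.google.android.exoplayer2.ui.AspectRatioFrameLayout
import com.google.android.exoplayer2.ui.PlayerView

fun applyZoomResizeMode(playerView: PlayerView) {
// Equivalent to app:resize_mode="zoom": scale keeping the aspect ratio and crop the overflow around the center.
playerView.setResizeMode(AspectRatioFrameLayout.RESIZE_MODE_ZOOM)
}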
I had a similar problem and solved it by applying transformations on the TextureView whose Surface is used by ExoPlayer:
player.addVideoListener(object : VideoListener {
override fun onVideoSizeChanged(
videoWidth: Int,
videoHeight: Int,
unappliedRotationDegrees: Int,
pixelWidthHeightRatio: Float,
) {
removeVideoListener(this)
val viewWidth: Int = textureView.width - textureView.paddingStart - textureView.paddingEnd
val viewHeight: Int = textureView.height - textureView.paddingTop - textureView.paddingBottom
if (videoWidth == viewWidth && videoHeight == viewHeight) {
return
}
val matrix = Matrix().apply {
// TextureView makes a best effort in fitting the video inside the View. The first transformation we apply is for reverting the fitting.
setScale(
videoWidth.toFloat() / viewWidth,
videoHeight.toFloat() / viewHeight,
)
}
// This algorithm is from ImageView's CENTER_CROP transformation
val offset = 0.5f // the center in CENTER_CROP but you probably want a different value here
val scale: Float
val dx: Float
val dy: Float
if (videoWidth * viewHeight > viewWidth * videoHeight) {
scale = viewHeight.toFloat() / videoHeight
dx = (viewWidth - videoWidth * scale) * offset
dy = 0f
} else {
scale = viewWidth.toFloat() / videoWidth
dx = 0f
dy = (viewHeight - videoHeight * scale) * offset
}
setTransform(matrix.apply {
postScale(scale, scale)
postTranslate(dx, dy)
})
}
})
player.setVideoTextureView(textureView)
player.prepare(createMediaSource())
Note that unless you're using DefaultRenderersFactory, you need to make sure that your video Renderer actually calls onVideoSizeChanged, for instance by creating the factory like so:
val renderersFactory = RenderersFactory { handler, videoListener, _, _, _, _ ->
// Allows other renderers to be removed by R8
arrayOf(
MediaCodecVideoRenderer(
context,
MediaCodecSelector.DEFAULT,
DefaultRenderersFactory.DEFAULT_ALLOWED_VIDEO_JOINING_TIME_MS,
handler,
videoListener,
-1,
),
MediaCodecAudioRenderer(context, MediaCodecSelector.DEFAULT),
)
}
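The factory would then be passed when creating the player; a sketch, assuming the same ExoPlayer 2.x era API used above, where context is a placeholder for your Context:
val player = ExoPlayerFactory.newSimpleInstance(context, renderersFactory, DefaultTrackSelector())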

Get screen width and height in Android

How can I get the screen width and height and use this value in:
@Override protected void onMeasure(int widthSpecId, int heightSpecId) {
Log.e(TAG, "onMeasure" + widthSpecId);
setMeasuredDimension(SCREEN_WIDTH, SCREEN_HEIGHT -
game.findViewById(R.id.flag).getHeight());
}
Using this code, you can get the display's width and height at runtime:
DisplayMetrics displayMetrics = new DisplayMetrics();
getWindowManager().getDefaultDisplay().getMetrics(displayMetrics);
int height = displayMetrics.heightPixels;
int width = displayMetrics.widthPixels;
In a view you need to do something like this:
((Activity) getContext()).getWindowManager()
.getDefaultDisplay()
.getMetrics(displayMetrics);
In some scenarios, where devices have a navigation bar, you have to check at runtime:
public boolean showNavigationBar(Resources resources)
{
int id = resources.getIdentifier("config_showNavigationBar", "bool", "android");
return id > 0 && resources.getBoolean(id);
}
If the device has a navigation bar, then count its height:
private int getNavigationBarHeight() {
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.JELLY_BEAN_MR1) {
DisplayMetrics metrics = new DisplayMetrics();
getWindowManager().getDefaultDisplay().getMetrics(metrics);
int usableHeight = metrics.heightPixels;
getWindowManager().getDefaultDisplay().getRealMetrics(metrics);
int realHeight = metrics.heightPixels;
if (realHeight > usableHeight)
return realHeight - usableHeight;
else
return 0;
}
return 0;
}
So the final height of the device is:
int height = displayMetrics.heightPixels + getNavigationBarHeight();
There is a very simple answer that doesn't require passing a Context:
public static int getScreenWidth() {
return Resources.getSystem().getDisplayMetrics().widthPixels;
}
public static int getScreenHeight() {
return Resources.getSystem().getDisplayMetrics().heightPixels;
}
Note: if you want the height to include the navigation bar, use the method below:
WindowManager windowManager =
(WindowManager) BaseApplication.getApplication().getSystemService(Context.WINDOW_SERVICE);
final Display display = windowManager.getDefaultDisplay();
Point outPoint = new Point();
if (Build.VERSION.SDK_INT >= 19) {
// include navigation bar
display.getRealSize(outPoint);
} else {
// exclude navigation bar
display.getSize(outPoint);
}
if (outPoint.y > outPoint.x) {
mRealSizeHeight = outPoint.y;
mRealSizeWidth = outPoint.x;
} else {
mRealSizeHeight = outPoint.x;
mRealSizeWidth = outPoint.y;
}
Just to update the answers by parag and SpK to align with the current SDK, replacing the deprecated methods:
int Measuredwidth = 0;
int Measuredheight = 0;
Point size = new Point();
WindowManager w = getWindowManager();
if(Build.VERSION.SDK_INT >= Build.VERSION_CODES.HONEYCOMB_MR2) { // Display#getSize(Point) was added in API 13
w.getDefaultDisplay().getSize(size);
Measuredwidth = size.x;
Measuredheight = size.y;
}else{
Display d = w.getDefaultDisplay();
Measuredwidth = d.getWidth();
Measuredheight = d.getHeight();
}
It’s very easy to get in Android:
int width = Resources.getSystem().getDisplayMetrics().widthPixels;
int height = Resources.getSystem().getDisplayMetrics().heightPixels;
Why not
DisplayMetrics displayMetrics = getResources().getDisplayMetrics();
Then use
displayMetrics.widthPixels
and
displayMetrics.heightPixels
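As a minimal Kotlin sketch of the same idea inside an Activity's onCreate (the log tag is just an illustrative name):
override fun onCreate(savedInstanceState: Bundle?) {
    super.onCreate(savedInstanceState)
    // Metrics of the display this Activity is currently shown on
    val displayMetrics = resources.displayMetrics
    Log.d("ScreenSize", "width=${displayMetrics.widthPixels}px height=${displayMetrics.heightPixels}px")
}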
• Kotlin Version via Extension Property
If you want to know the size of the screen in pixels as well as dp, using these extension properties really helps:
DimensionUtils.kt
import android.content.Context
import android.content.res.Resources
import android.graphics.Rect
import android.graphics.RectF
import android.os.Build
import android.util.DisplayMetrics
import android.view.WindowManager
import kotlin.math.roundToInt
/**
* @author aminography
*/
private val displayMetrics: DisplayMetrics by lazy { Resources.getSystem().displayMetrics }
/**
* Returns boundary of the screen in pixels (px).
*/
val screenRectPx: Rect
get() = displayMetrics.run { Rect(0, 0, widthPixels, heightPixels) }
/**
* Returns boundary of the screen in density independent pixels (dp).
*/
val screenRectDp: RectF
get() = screenRectPx.run { RectF(0f, 0f, right.px2dp, bottom.px2dp) }
/**
* Returns boundary of the physical screen including system decor elements (if any) like navigation
* bar in pixels (px).
*/
val Context.physicalScreenRectPx: Rect
get() = if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.JELLY_BEAN_MR1) {
(applicationContext.getSystemService(Context.WINDOW_SERVICE) as WindowManager)
.run { DisplayMetrics().also { defaultDisplay.getRealMetrics(it) } }
.run { Rect(0, 0, widthPixels, heightPixels) }
} else screenRectPx
/**
* Returns boundary of the physical screen including system decor elements (if any) like navigation
* bar in density independent pixels (dp).
*/
val Context.physicalScreenRectDp: RectF
get() = physicalScreenRectPx.run { RectF(0f, 0f, right.px2dp, bottom.px2dp) }
/**
* Converts any given number from pixels (px) into density independent pixels (dp).
*/
val Number.px2dp: Float
get() = this.toFloat() / displayMetrics.density
/**
* Converts any given number from density independent pixels (dp) into pixels (px).
*/
val Number.dp2px: Int
get() = (this.toFloat() * displayMetrics.density).roundToInt()
Usage:
class MainActivity : AppCompatActivity() {
override fun onCreate(savedInstanceState: Bundle?) {
super.onCreate(savedInstanceState)
setContentView(R.layout.activity_main)
val widthPx = screenRectPx.width()
val heightPx = screenRectPx.height()
println("[PX] screen width: $widthPx , height: $heightPx")
val widthDp = screenRectDp.width()
val heightDp = screenRectDp.height()
println("[DP] screen width: $widthDp , height: $heightDp")
println()
val physicalWidthPx = physicalScreenRectPx.width()
val physicalHeightPx = physicalScreenRectPx.height()
println("[PX] physical screen width: $physicalWidthPx , height: $physicalHeightPx")
val physicalWidthDp = physicalScreenRectDp.width()
val physicalHeightDp = physicalScreenRectDp.height()
println("[DP] physical screen width: $physicalWidthDp , height: $physicalHeightDp")
}
}
Result:
When the device is in portrait orientation:
[PX] screen width: 1440 , height: 2392
[DP] screen width: 360.0 , height: 598.0
[PX] physical screen width: 1440 , height: 2560
[DP] physical screen width: 360.0 , height: 640.0
When the device is in landscape orientation:
[PX] screen width: 2392 , height: 1440
[DP] screen width: 598.0 , height: 360.0
[PX] physical screen width: 2560 , height: 1440
[DP] physical screen width: 640.0 , height: 360.0
You can get width and height from context
java:
int width= context.getResources().getDisplayMetrics().widthPixels;
int height= context.getResources().getDisplayMetrics().heightPixels;
kotlin
val width: Int = context.resources.displayMetrics.widthPixels
val height: Int = context.resources.displayMetrics.heightPixels
Try the code below:
1.
Display display = getWindowManager().getDefaultDisplay();
Point size = new Point();
display.getSize(size);
int width = size.x;
int height = size.y;
2.
Display display = getWindowManager().getDefaultDisplay();
int width = display.getWidth(); // deprecated
int height = display.getHeight(); // deprecated
or
int width = getWindowManager().getDefaultDisplay().getWidth();
int height = getWindowManager().getDefaultDisplay().getHeight();
3.
DisplayMetrics metrics = new DisplayMetrics();
getWindowManager().getDefaultDisplay().getMetrics(metrics);
metrics.heightPixels;
metrics.widthPixels;
Some methods, applicable for retrieving screen size, are deprecated in API Level 31, including Display.getRealMetrics() and Display.getRealSize(). Starting from API Level 30 we can use WindowManager#getCurrentWindowMetrics(). The clean way to get screen size is to create some Compat class, e.g.:
object ScreenMetricsCompat {
private val api: Api =
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.R) ApiLevel30()
else Api()
/**
* Returns screen size in pixels.
*/
fun getScreenSize(context: Context): Size = api.getScreenSize(context)
@Suppress("DEPRECATION")
private open class Api {
open fun getScreenSize(context: Context): Size {
val display = context.getSystemService(WindowManager::class.java).defaultDisplay
val metrics = if (display != null) {
DisplayMetrics().also { display.getRealMetrics(it) }
} else {
Resources.getSystem().displayMetrics
}
return Size(metrics.widthPixels, metrics.heightPixels)
}
}
@RequiresApi(Build.VERSION_CODES.R)
private class ApiLevel30 : Api() {
override fun getScreenSize(context: Context): Size {
val metrics: WindowMetrics = context.getSystemService(WindowManager::class.java).currentWindowMetrics
return Size(metrics.bounds.width(), metrics.bounds.height())
}
}
}
Calling ScreenMetricsCompat.getScreenSize(this).height in an Activity gives us the screen height.
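For example, a minimal sketch of reading both dimensions from an Activity (the log tag is just an illustrative name):
val size = ScreenMetricsCompat.getScreenSize(this)
Log.d("ScreenMetrics", "width=${size.width}px height=${size.height}px")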
I suggest you create extension properties.
/**
* Return the width and height of the screen
*/
val Context.screenWidth: Int
get() = resources.displayMetrics.widthPixels
val Context.screenHeight: Int
get() = resources.displayMetrics.heightPixels
/**
* Pixel and Dp Conversion
*/
val Float.toPx get() = this * Resources.getSystem().displayMetrics.density
val Float.toDp get() = this / Resources.getSystem().displayMetrics.density
val Int.toPx get() = (this * Resources.getSystem().displayMetrics.density).toInt()
val Int.toDp get() = (this / Resources.getSystem().displayMetrics.density).toInt()
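A small usage sketch, assuming the extensions above are in scope and a Context is available; the 16 dp value is just an example:
val widthPx = context.screenWidth        // screen width in pixels
val heightPx = context.screenHeight      // screen height in pixels
val paddingPx = 16.toPx                  // 16 dp converted to pixels
val widthDp = widthPx.toDp               // screen width converted to dp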
DisplayMetrics lDisplayMetrics = getResources().getDisplayMetrics();
int widthPixels = lDisplayMetrics.widthPixels;
int heightPixels = lDisplayMetrics.heightPixels;
For Kotlin users:
fun Activity.displayMetrics(): DisplayMetrics {
val displayMetrics = DisplayMetrics()
windowManager.defaultDisplay.getMetrics(displayMetrics)
return displayMetrics
}
And in an Activity you could use it like this:
resources.displayMetrics.let { displayMetrics ->
val height = displayMetrics.heightPixels
val width = displayMetrics.widthPixels
}
Or in a Fragment:
activity?.displayMetrics()?.run {
val height = heightPixels
val width = widthPixels
}
As getMetrics and getRealMetrics are deprecated, Google recommends determining the screen width and height as follows:
WindowMetrics windowMetrics = getActivity().getWindowManager().getMaximumWindowMetrics();
Rect bounds = windowMetrics.getBounds();
int widthPixels = bounds.width();
int heightPixels = bounds.height();
However, I've figured out another method that gives me the same results:
Configuration configuration = mContext.getResources().getConfiguration();
Display display = mContext.getDisplay(); // Context#getDisplay() requires API level 30
Display.Mode mode = display.getMode();
int widthPixels = mode.getPhysicalWidth();
int heightPixels = mode.getPhysicalHeight();
None of the answers here work correctly for Chrome OS multiple displays, or soon-to-come Foldables.
When looking for the current configuration, always use the configuration from your current activity in getResources().getConfiguration(). Do not use the configuration from your background activity or the one from the system resource. The background activity does not have a size, and the system's configuration may contain multiple windows with conflicting sizes and orientations, so no usable data can be extracted.
So the answer is
val config = context.getResources().getConfiguration()
val (screenWidthPx, screenHeightPx) = config.screenWidthDp.dp to config.screenHeightDp.dp
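The .dp call above is assumed to be a dp-to-pixel conversion helper that the answer doesn't show; here is a minimal sketch of one such extension (hypothetical, not part of the answer), using the density from the same Context:
import kotlin.math.roundToInt
// Hypothetical helper: converts a dp value to pixels using this Context's density.
fun Context.dpToPx(dp: Int): Int = (dp * resources.displayMetrics.density).roundToInt()
// Rewritten with the helper:
val config = context.resources.configuration
val screenWidthPx = context.dpToPx(config.screenWidthDp)
val screenHeightPx = context.dpToPx(config.screenHeightDp)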
DisplayMetrics dimension = new DisplayMetrics();
getWindowManager().getDefaultDisplay().getMetrics(dimension);
int width = dimension.widthPixels;
int height = dimension.heightPixels;
Get the value of screen width and height.
Display display = getWindowManager().getDefaultDisplay();
Point size = new Point();
display.getSize(size);
width = size.x;
height = size.y;
As the official Android documentation says, use Context#getDisplay() for the default display, because the following method was deprecated in API level 30:
getWindowManager().getDefaultDisplay().getMetrics(displayMetrics);
The Kotlin code given below is written according to the latest version of Android and helps you determine the width and height:
fun getWidth(context: Context): Int {
var width:Int = 0
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.R) {
val displayMetrics = DisplayMetrics()
val display: Display? = context.getDisplay()
display!!.getRealMetrics(displayMetrics)
return displayMetrics.widthPixels
}else{
val displayMetrics = DisplayMetrics()
(context.getSystemService(Context.WINDOW_SERVICE) as WindowManager).defaultDisplay.getMetrics(displayMetrics)
width = displayMetrics.widthPixels
return width
}
}
fun getHeight(context: Context): Int {
var height: Int = 0
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.R) {
val displayMetrics = DisplayMetrics()
val display = context.display
display!!.getRealMetrics(displayMetrics)
return displayMetrics.heightPixels
}else {
val displayMetrics = DisplayMetrics()
(context.getSystemService(Context.WINDOW_SERVICE) as WindowManager).defaultDisplay.getMetrics(displayMetrics)
height = displayMetrics.heightPixels
return height
}
}
fun Activity.getRealScreenSize(): Pair<Int, Int> { //<width, height>
return if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.R) {
val size = Point()
display?.getRealSize(size)
Pair(size.x, size.y)
} else {
val size = Point()
windowManager.defaultDisplay.getRealSize(size)
Pair(size.x, size.y)
}
}
This is an extension function and you can use it in your Activity in this way:
override fun onCreate(savedInstanceState: Bundle?) {
super.onCreate(savedInstanceState)
val pair = getRealScreenSize()
pair.first //to get width
pair.second //to get height
}
I use the following code to get the screen dimensions
getWindow().getDecorView().getWidth()
getWindow().getDecorView().getHeight()
The full way to do it, which returns the true resolution (including when the user has changed the resolution), is to use getRealSize.
I've noticed that all other available functions, including the ones the docs say to use instead of this, have cases where the result is smaller.
Here's the code to do it:
WindowManager wm = (WindowManager) context.getSystemService(Context.WINDOW_SERVICE);
Point size = new Point();
wm.getDefaultDisplay().getRealSize(size);
final int width = size.x, height = size.y;
And since this can change on different orientation, here's a solution (in Kotlin), to get it right no matter the orientation:
/**
* returns the natural orientation of the device: Configuration.ORIENTATION_LANDSCAPE or Configuration.ORIENTATION_PORTRAIT.
* The result should be consistent no matter the orientation of the device
*/
@JvmStatic
fun getScreenNaturalOrientation(context: Context): Int {
//based on : http://stackoverflow.com/a/9888357/878126
val windowManager = context.getSystemService(Context.WINDOW_SERVICE) as WindowManager
val config = context.resources.configuration
val rotation = windowManager.defaultDisplay.rotation
return if ((rotation == Surface.ROTATION_0 || rotation == Surface.ROTATION_180) && config.orientation == Configuration.ORIENTATION_LANDSCAPE || (rotation == Surface.ROTATION_90 || rotation == Surface.ROTATION_270) && config.orientation == Configuration.ORIENTATION_PORTRAIT)
Configuration.ORIENTATION_LANDSCAPE
else
Configuration.ORIENTATION_PORTRAIT
}
/**
* returns the natural screen size (in pixels). The result should be consistent no matter the orientation of the device
*/
@JvmStatic
fun getScreenNaturalSize(context: Context): Point {
val screenNaturalOrientation = getScreenNaturalOrientation(context)
val wm = context.getSystemService(Context.WINDOW_SERVICE) as WindowManager
val point = Point()
wm.defaultDisplay.getRealSize(point)
val currentOrientation = context.resources.configuration.orientation
if (currentOrientation == screenNaturalOrientation)
return point
else return Point(point.y, point.x)
}
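A short usage sketch of the helpers above; ScreenUtils is just a hypothetical name for whatever object contains them:
// ScreenUtils is a placeholder name for the object holding the two functions above.
val naturalSize = ScreenUtils.getScreenNaturalSize(context)
Log.d("ScreenSize", "natural width=${naturalSize.x}px height=${naturalSize.y}px")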
Display display = ((WindowManager) this.getSystemService(Context.WINDOW_SERVICE)).getDefaultDisplay();
int mWidthScreen = display.getWidth();
int mHeightScreen = display.getHeight();
public class DisplayInfo {
int screen_height=0, screen_width=0;
WindowManager wm;
DisplayMetrics displaymetrics;
DisplayInfo(Context context) {
getdisplayheightWidth(context);
}
void getdisplayheightWidth(Context context) {
wm = (WindowManager) context.getSystemService(Context.WINDOW_SERVICE);
displaymetrics = new DisplayMetrics();
wm.getDefaultDisplay().getMetrics(displaymetrics);
screen_height = displaymetrics.heightPixels;
screen_width = displaymetrics.widthPixels;
}
public int getScreen_height() {
return screen_height;
}
public int getScreen_width() {
return screen_width;
}
}
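A brief Kotlin usage sketch of the DisplayInfo class above:
val displayInfo = DisplayInfo(context)
val screenWidth = displayInfo.getScreen_width()    // width in pixels
val screenHeight = displayInfo.getScreen_height()  // height in pixels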
It seems like all these answers fail for my Galaxy M51 with Android 11. After doing some research, I found this code:
WindowMetrics windowMetrics = getWindowManager().getCurrentWindowMetrics(); // inside an Activity; requires API level 30
Rect rect = windowMetrics.getBounds();
int width = rect.right;
int height =rect.bottom;
This shows my true device resolution of 1080x2400; the rest only return 810x1800.
Methods shown here are deprecated/outdated, but this is still working. Requires API 13.
check it out
Display disp= getWindowManager().getDefaultDisplay();
Point size = new Point();
disp.getSize(size);
int width = size.x;
int height = size.y;
As the official Android documentation says, use Context#getDisplay() for the default display, because the following method was deprecated in API level 30:
getWindowManager().getDefaultDisplay().getMetrics(displayMetrics);
This block of code helps to determine the width:
public static int getWidth(Context context) {
DisplayMetrics displayMetrics = new DisplayMetrics();
Display display = context.getDisplay();
if (display != null) {
display.getRealMetrics(displayMetrics);
return displayMetrics.widthPixels;
}
return -1;
}
For the Height:
public static int getHeight(Context context) {
DisplayMetrics displayMetrics = new DisplayMetrics();
Display display = context.getDisplay();
if (display != null) {
display.getRealMetrics(displayMetrics);
return displayMetrics.heightPixels;
}
return -1;
}
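Keep in mind that Context#getDisplay() is only available from API level 30 and should be called on a visual context such as an Activity. A minimal Kotlin usage sketch, where ScreenHelper is just a hypothetical name for the class holding the two static methods above:
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.R) {
    // 'this' is an Activity, so it is associated with a display
    val width = ScreenHelper.getWidth(this)
    val height = ScreenHelper.getHeight(this)
    Log.d("ScreenSize", "width=${width}px height=${height}px")
}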
Try this code for Kotlin
val display = windowManager.defaultDisplay
val size = Point()
display.getSize(size)
var DEVICE_WIDTH = size.x
var DEVICE_HEIGHT = size.y
Just use the function below, which returns the width and height of the screen as an array of integers:
private int[] getScreenSize(){
DisplayMetrics displaymetrics = new DisplayMetrics();
getWindowManager().getDefaultDisplay().getMetrics(displaymetrics);
int h = displaymetrics.heightPixels;
int w = displaymetrics.widthPixels;
int[] size={w,h};
return size;
}
In your onCreate function or a button click handler, add the following code to output the screen sizes as shown below:
int[] screenSize = getScreenSize();
int width=screenSize[0];
int height=screenSize[1];
screenSizes.setText("Phone Screen sizes \n\n width = "+width+" \n Height = "+height);
I updated the answer for Kotlin!
For Kotlin: you should call the WindowManager and get the metrics. After that it's easy:
val displayMetrics = DisplayMetrics()
windowManager.defaultDisplay.getMetrics(displayMetrics)
var width = displayMetrics.widthPixels
var height = displayMetrics.heightPixels
How can we use it effectively in an Activity-independent way in Kotlin?
Here, I created a method in a general Kotlin class. You can use it from all activities.
private val T_GET_SCREEN_WIDTH:String = "screen_width"
private val T_GET_SCREEN_HEIGHT:String = "screen_height"
private fun getDeviceSizes(activity:Activity, whichSize:String):Int{
val displayMetrics = DisplayMetrics()
activity.windowManager.defaultDisplay.getMetrics(displayMetrics)
return when (whichSize){
T_GET_SCREEN_WIDTH -> displayMetrics.widthPixels
T_GET_SCREEN_HEIGHT -> displayMetrics.heightPixels
else -> 0 // Error
}
}
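A usage sketch from inside an Activity, assuming the members above are accessible there (for example, placed in the same class or made internal):
val screenWidth = getDeviceSizes(this, T_GET_SCREEN_WIDTH)
val screenHeight = getDeviceSizes(this, T_GET_SCREEN_HEIGHT)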
I found weigan's answer to be the best one on this page; here is how you can use it in Xamarin.Android:
public int GetScreenWidth()
{
return Resources.System.DisplayMetrics.WidthPixels;
}
public int GetScreenHeight()
{
return Resources.System.DisplayMetrics.HeightPixels;
}
Screen resolution is the total number of pixels on the screen. The following method extracts the screen resolution of the device and returns it as a Point; the width and height values are in pixels.
public static Point getScreenResolution(Context context) {
// get window managers
WindowManager manager = (WindowManager)context.getSystemService(Context.WINDOW_SERVICE);
Display display = manager.getDefaultDisplay();
Point point = new Point();
display.getSize(point);
// get width and height
int width = point.x;
int height = point.y;
return point;
}
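And a short Kotlin usage sketch of the method above (the log tag is just an illustrative name):
val resolution = getScreenResolution(context)
Log.d("ScreenSize", "width=${resolution.x}px height=${resolution.y}px")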
