I have an image of a face (250px x 250px) inside an absolute layout element. I currently get the user's touch coordinates and, using some maths, calculate what has been touched (e.g. the nose), then do something accordingly.
My question is how to scale this to fit the available screen width. If I set the image (in the XML) to fill_parent, the coordinates are way out. Can this be remedied by converting the touch coordinates to dips (if so, how), or will I need to get the screen width (again converted into dips) and sort out the coordinate problem with more maths?
Any and all help appreciated.
pixels = dps * (density / 160)
The (density / 160) factor is known as the density scale factor, and it can be retrieved in Java from the DisplayMetrics object. What you should do is store the position of the nose etc. in terms of dips (which are the same as pixels on a screen with density 160), and then convert the dips to pixels depending on which screen you are running on:
final static int NOSE_POSITION_DP = 10;
final float scale = getContext().getResources().getDisplayMetrics().density;
final int nosePositionPixels = (int) (NOSE_POSITION_DP * scale + 0.5f);
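As a rough worked example of what that conversion does on a few common densities (the 10dp value is just the illustrative constant from above):
// NOSE_POSITION_DP = 10, as in the snippet above.
int onMdpi  = (int) (10 * 1.0f + 0.5f); // density 1.0 (mdpi)  -> 10 px
int onHdpi  = (int) (10 * 1.5f + 0.5f); // density 1.5 (hdpi)  -> 15 px
int onXhdpi = (int) (10 * 2.0f + 0.5f); // density 2.0 (xhdpi) -> 20 px
So the same stored dp value lands on the same spot on the face, whatever the screen density.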
I have three useful functions in my library...
Get the screen density:
public static float getDensity(Context context){
    float scale = context.getResources().getDisplayMetrics().density;
    return scale;
}
Convert dips to pixels:
public static int convertDiptoPix(Context context, int dip){
    float scale = getDensity(context);
    return (int) (dip * scale + 0.5f);
}
Convert pixels to dips:
public static int convertPixtoDip(Context context, int pixel){
    float scale = getDensity(context);
    return (int) ((pixel - 0.5f) / scale);
}
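With the Context parameter in place, usage looks something like this (the 250 is just the face image size from the first question, and `this` stands in for any Activity or other Context):
int faceWidthPx = convertDiptoPix(this, 250);  // 250dp -> pixels on this screen's density
int touchXInDip = convertPixtoDip(this, 375);  // e.g. a raw 375px touch coordinate -> dips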
A very simple way of converting a dp value (here, 250dp) to pixels:
int value = (int) TypedValue.applyDimension(TypedValue.COMPLEX_UNIT_DIP, 250, mContext.getResources().getDisplayMetrics());
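Wrapped as a small reusable helper (the name dpToPx is just illustrative):
// Illustrative helper: lets the framework do the dp -> px conversion,
// so there is no hand-rolled rounding or density arithmetic.
public static int dpToPx(Context context, float dp) {
    return (int) TypedValue.applyDimension(
            TypedValue.COMPLEX_UNIT_DIP, dp, context.getResources().getDisplayMetrics());
}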
// Converts a dip value into pixels on the current screen.
public int dipToPixels(int dip)
{
    float scale = getBaseContext().getResources().getDisplayMetrics().density;
    return (int) (dip * scale + 0.5f);
}
Related
I read here about converting dp units to pixel units, but I can't understand the 0.5f. Where does this number come from and what is it used for?
// The gesture threshold expressed in dp
private static final float GESTURE_THRESHOLD_DP = 16.0f;
// Get the screen's density scale
final float scale = getResources().getDisplayMetrics().density;
// Convert the dps to pixels, based on density scale
mGestureThreshold = (int) (GESTURE_THRESHOLD_DP * scale + 0.5f);
// Use mGestureThreshold as a distance in pixels...
Casting a floating point number to an integer truncates it (for positive values, that is the same as flooring).
That 0.5f is there to round the number:
x = (int) 3.9
print x // 3
x = (int) (3.9 + 0.5f)
print x // 4
It's to round things. The scale may be a decimal (like 1.5), which means the product may not be a whole number. Adding 0.5 and then converting to int ensures that the number rounds up if it is more than halfway between two integers, and down if it is less than halfway.
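A quick sketch of the difference (Math.round does the same thing for positive values):
float scale = 1.5f;                      // e.g. an hdpi screen
float product = 13 * scale;              // 19.5

int truncated = (int) product;           // 19 -- the cast alone drops the fraction
int rounded   = (int) (product + 0.5f);  // 20 -- adding 0.5f rounds to the nearest integer
int viaRound  = Math.round(product);     // 20 -- equivalent for positive values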
I am having a hard time getting this right.
Basically I am creating an ImageView and applying a LayoutParameter to it.
LayoutParams lp = new LayoutParams(width, height);
lp.gravity = Gravity.CENTER;
I know that the width and height parameters take pixel values, so I am passing them in dp and converting them to absolute pixels using:
public int convertToPixels(float dpSize){
    final float density = getResources().getDisplayMetrics().density;
    return ((int) (dpSize * density + 0.5f));
}
As far as I know, this should make a drawable fill exactly the same area on different screens, right? Unfortunately, that is not happening at all.
Is there something wrong with these methods I am using?
The two emulators below show the same image with the same number of dp. The left emulator has a density of 1.0 and the right one 2.0. Why does it still look so different? I don't understand.
Use:
import android.content.res.Resources;
import android.util.DisplayMetrics;

public class Convert {
    public static float convertDpToPixel(float dp) {
        DisplayMetrics metrics = Resources.getSystem().getDisplayMetrics();
        float px = dp * (metrics.densityDpi / 160f);
        return Math.round(px);
    }
}
Just call it statically:
float requiredPixel = Convert.convertDpToPixel(16.0f);
For more info: https://developer.android.com/guide/practices/screens_support.html
I have an app which uses co-ordinates on the image to pin a marker.
final GestureDetector gestureDetector = new GestureDetector(this, new GestureDetector.SimpleOnGestureListener() {
    @Override
    public boolean onSingleTapConfirmed(MotionEvent e) {
        if (imageView.isReady()) {
            sCoord = imageView.viewToSourceCoord(e.getX(), e.getY());
            imageView.setPin(new PointF(sCoord.x, sCoord.y));
            Toast.makeText(getApplicationContext(), "Single tap: " + ((int) convertPixelsToDp(sCoord.x)) + ", " + ((int) convertPixelsToDp(sCoord.y)), Toast.LENGTH_SHORT).show();
        } else {
            Toast.makeText(getApplicationContext(), "Something is not right!", Toast.LENGTH_SHORT).show();
        }
        return true;
    }
});
Now I am doing some calculations on the points and changing the position of the marker. It works fine on the device I am using, but when I use another device it shows a random location (which is obvious).
I tried the following to change the co-ordinates (pixels) to dp units and then pin the marker at that position, but it still doesn't work.
/**
 * This method converts a dp unit to the equivalent pixels, depending on device density.
 *
 * @param dp A value in dp (density independent pixels) unit, which we need to convert into pixels
 * @param context Context to get resources and device specific display metrics
 * @return A float value to represent the px equivalent of dp, depending on device density
 */
public static float convertDpToPixel(float dp, Context context){
    Resources resources = context.getResources();
    DisplayMetrics metrics = resources.getDisplayMetrics();
    float px = dp * ((float) metrics.densityDpi / DisplayMetrics.DENSITY_DEFAULT);
    return px;
}
/**
 * This method converts device specific pixels to density independent pixels.
 *
 * @param px A value in px (pixels) unit, which we need to convert into dp
 * @param context Context to get resources and device specific display metrics
 * @return A float value to represent the dp equivalent of the px value
 */
public static float convertPixelsToDp(float px, Context context){
    Resources resources = context.getResources();
    DisplayMetrics metrics = resources.getDisplayMetrics();
    float dp = px / ((float) metrics.densityDpi / DisplayMetrics.DENSITY_DEFAULT);
    return dp;
}
Let's say I have a floorplan of a store and I want to pin a marker near the door. I take the co-ordinate and convert it to dp using the above function, then I pass that dp location to another device, where I convert the dp back to pixels and use that location to pin the marker, but it's not working. It shows random locations.
// trying
float x = convertDpToPixel(sCoord.x, this);
float y = convertDpToPixel(sCoord.y, this);
imageView.setPin(new PointF(x, y));
Any suggestion is welcome. Thanks.
The way to solve this problem is to use a percentage-based approach. To elaborate on the given answer, here is some simple math.
The computation requires the following values:
imageWidth = the width of the image from left to right.
imageHeight = the height of the image from top to bottom.
xOccupation = the occupied width up to the x-coordinate, from left to right.
yOccupation = the occupied height up to the y-coordinate, from top to bottom.
To compute the xPercentage and yPercentage we use these formulas:
xPercentage = xOccupation / imageWidth
yPercentage = yOccupation / imageHeight
After getting those two values, you are ready to send them to your server and use the same values on all kinds of devices, regardless of screen size or density.
The client device then recomputes the x and y coordinates with the reverse formulas, as in the sketch below:
new-x-coordinate = newImageWidth * xPercentage
new-y-coordinate = newImageHeight * yPercentage
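A minimal Java sketch of both directions, assuming the marker position and the displayed image size are already known in pixels (the class and method names here are illustrative, not from the original code):
import android.graphics.PointF;

// Store a marker as fractions of the image size so it can be restored on any
// device, regardless of screen density or how large the image is displayed.
public class MarkerPercent {
    public final float xPercentage;
    public final float yPercentage;

    public MarkerPercent(float xOccupation, float yOccupation,
                         float imageWidth, float imageHeight) {
        this.xPercentage = xOccupation / imageWidth;
        this.yPercentage = yOccupation / imageHeight;
    }

    // Reverse formulas on the receiving device.
    public PointF toPixels(float newImageWidth, float newImageHeight) {
        return new PointF(newImageWidth * xPercentage, newImageHeight * yPercentage);
    }
}
On the sending side you would build this from sCoord and the source image size; on the receiving side you would call toPixels with the local image size before calling setPin.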
When drawing a rectangle on a Canvas with RectF, do the top and left values need to be set in dp or px?
int padding = 10;
int width = 100; // Is this dp or px?
int height = 50;
RectF position = new RectF();
position.top = 0 + padding;
position.bottom = position.top + height;
position.left = 0 + padding;
position.right = position.left + width;
http://developer.android.com/intl/es/reference/android/graphics/RectF.html
It does not indicate whether the values are in px or dp.
As has already been pointed out, Canvas and RectF use px and not dp.
As for the documentation, the pages for Canvas
(http://developer.android.com/intl/es/reference/android/graphics/Canvas.html)
and RectF only mention pixels.
Since it is not explicitly stated that the values are dp, and both classes derive directly from java.lang.Object (so they have no connection to the View or resource system that would interpret dp), one can only conclude that they must be "normal" pixels.
If, for some reason, you need to convert from dp to px and vice versa, have a look at this document:
http://developer.android.com/intl/es/guide/practices/screens_support.html
It uses pixels, not density independent pixels.
It uses pixels, but if you want to work in dp you must convert the values first:
private int convertDpToPx(int dp){
    return Math.round(dp * ((float) getResources().getDisplayMetrics().densityDpi / DisplayMetrics.DENSITY_DEFAULT));
}
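For example, if the 10/100/50 values from the question are meant as dp, a sketch of converting them before building the RectF (names as in the question, since RectF coordinates are always raw pixels):
int paddingPx = convertDpToPx(10);
int widthPx = convertDpToPx(100);
int heightPx = convertDpToPx(50);

RectF position = new RectF();
position.top = paddingPx;
position.bottom = position.top + heightPx;
position.left = paddingPx;
position.right = position.left + widthPx;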
I hope it helps.
I would like to get the dp value when the user taps the ImageView in my activity.
At the moment I have set an OnTouchListener, but with this listener I only get the event.getX()/getY() pixel values.
Is there any way to get the dp value? Maybe something like a px -> dp converter?
EDIT:
// Converting dips to pixels
float dips = 20.0f;
float scale = getContext().getResources().getDisplayMetrics().density;
int pixels = Math.round(dips * scale);

// Converting pixels to dips
int rawPixels = 15;
float convertedDips = rawPixels / scale;
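Applied to the touch listener from the question, a minimal sketch (imageView stands for the tapped view):
// Convert the raw touch coordinates (pixels) into dips using the display density.
imageView.setOnTouchListener(new View.OnTouchListener() {
    @Override
    public boolean onTouch(View v, MotionEvent event) {
        float scale = v.getResources().getDisplayMetrics().density;
        float xDips = event.getX() / scale;
        float yDips = event.getY() / scale;
        // xDips / yDips are the tap position in dp
        return true;
    }
});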