Converting Focal Length in millimeters to pixels - Android

In Android, I am currently accessing the camera's focal length by using getFocalLength() in Camera1. Camera2 is not an option.
I am trying to evaluate the following calculation: focal_length_pix = focal_length_m * 1 (pix) / pixel_width_m.
Basically this converts the focal length from mm -> px. I know the focal_length_m variable, but I am trying to figure out pixel_width_m, which is the width of a pixel (on the sensor) in meters.
I am struggling to find a way to calculate the width of a pixel on the sensor. Any suggestions or ideas would be much appreciated.

You can calculate the focal length in pixels as follows:
double focal_length_pix = (size.width * 0.5) / Math.tan(horizontalAngleView * 0.5 * Math.PI / 180);
where size comes from getPreviewSize()
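The formula can be checked without a device. A minimal sketch (class and parameter names are mine), where widthPx stands in for getPreviewSize().width and horizontalAngleDeg for the value returned by getHorizontalViewAngle():

```java
public class FocalLengthPx {
    // f_px = (w / 2) / tan(theta / 2), with theta in radians
    public static double focalLengthPixels(double widthPx, double horizontalAngleDeg) {
        return (widthPx * 0.5) / Math.tan(Math.toRadians(horizontalAngleDeg) * 0.5);
    }
}
```

For a 1000 px wide preview with a 90° horizontal view angle this gives 500 px, as expected from the half-angle geometry.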

Related

Android Development Training Advance | problem with mRadius = (float) (Math.min(mWidth, mHeight) / 2 * 0.8)

I was studying the advanced android course, more specifically in this codelab https://developer.android.com/codelabs/advanced-android-training-custom-view-from-scratch?index=..%2F..advanced-android-training#2
I don't understand why the radius is calculated this way. Why is there a 0.8 in this formula:
mRadius = (float) (Math.min(mWidth, mHeight) / 2 * 0.8)
Thanks a lot!
The 0.8 in this case defines how much of the available space of the device the circle should cover.
I think this is easiest explained using an example. Say our canvas width is 400 pixels and the height is 640 pixels. Because 400 is smaller, the calculation would be:
mRadius = (float) (400/2)
this will result in a radius of 200, and the circle would therefore cover all available width (because there is an imaginary * 1 in the formula).
In this tutorial, however, the circle should have a margin. Setting (brackets are for clarity only)
mRadius = (float) ((400/2) * 0.8)
will result in a radius (and a circle) that covers only 80% of the available space.
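The calculation above can be factored into a small helper for testing (method and parameter names are mine; the codelab computes this inline in onSizeChanged):

```java
public class CircleRadius {
    // mRadius = (float) (Math.min(mWidth, mHeight) / 2 * fraction)
    // fraction is the coverage factor, 0.8 in the codelab
    public static float circleRadius(int width, int height, double fraction) {
        return (float) (Math.min(width, height) / 2 * fraction);
    }
}
```

With the example values, circleRadius(400, 640, 1.0) gives 200 and circleRadius(400, 640, 0.8) gives 160, i.e. 80% of the available half-width.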

Find the Dimensions (Height ,width) of an Object using Camera

I want to find a way to get the dimensions of an object using the camera. It sounds like a duplicate of
How to measure height, width and distance of object using camera?
but the solution there doesn't help me. From the above link I got some idea of how to find the distance (Measure Distance).
Can somebody suggest how I am supposed to get the width as well as the height of an object? Simple math or any idea would be really helpful.
Is there any possibility to achieve this using OpenCV?
Measure Height and Width
What I have tried so far:
Suppose we assume a fixed distance; we can calculate the angle of elevation:
tan(α/2) = (l/2)/d,
hence
α = 2*atan(l/(2d))
But still we don't know the value of l (the length of the object).
Another way to find the View angle:
double thetaV = Math.toRadians(camera.getParameters().getVerticalViewAngle());
double thetaH = Math.toRadians(camera.getParameters().getHorizontalViewAngle());
Seems not to work!
The actual physics of a lens are explained for example on this website of Georgia State University.
See this illustration which explains how you can use either the linear magnification or focal length relations to find out object size from image size:
In particular, -i / h' = o / h, and this relation o / h holds true for all similar triangles (that is, an object of size 2h at distance 2o has the same size h' on the picture). So as you can see, even in the case of the full equation, you can't know both the distance o and the size h of an object -- however one will give you the other.
On the other hand, two objects at the same distance o will see their sizes h1' and h2' on the image be proportional to their sizes in real life h1 and h2, since h1' / h1 = M = h2' / h2.
Hence if you know both o and h for one object, you know M; thus, knowing an object's size on film, you can deduce its size from its distance and vice versa.
The -i / h' value is naturally expressed for the maximal h'. If the size of an object fills the image exactly, it fills the field of view, then the ratio of its distance to its size is tan(α/2) = (l / 2) / d (note that in the conventions of the image below, d = o and l = 2 * h).
This α is what you name theta in your example. Now, from the image size you can get under what angle you see the image -- that is, what size l would the image be if it were at distance d. From there, you can deduce the size of the object from its distance and vice versa.
Algorithm steps:
get ratio r = size of object in image (in px) / total size of image (in px).
Do this along the axis for which you know or plan to get the real object size, of course.
get the corresponding view angle and multiply r by the tangent of half that angle (converted to radians):
r *= Math.tan(Math.toRadians(camera.getParameters().getXXXXViewAngle()) / 2)
r is now the tangent of the half-angle under which you see the object, hence the following relations hold: r = (l / 2) / d = h / o (with the respective drawing's notations).
If you know the distance d to the object, its size is l = 2 * r * d
If you know the size l of the object, it is at distance d = l / (2 * r)
This works for objects that are actually pointed at by the camera; if they aren't centred, the maths may be off.
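The steps above can be sketched as two small methods (class, method, and parameter names are mine; viewAngleDeg would come from getHorizontalViewAngle() or getVerticalViewAngle() on a real device, depending on the axis measured):

```java
public class ObjectSize {
    // r = tangent of the half-angle under which the object is seen
    private static double halfAngleTan(double objectPx, double imagePx, double viewAngleDeg) {
        return (objectPx / imagePx) * Math.tan(Math.toRadians(viewAngleDeg) / 2);
    }

    // l = 2 * r * d : real size from a known distance
    public static double sizeFromDistance(double objectPx, double imagePx,
                                          double viewAngleDeg, double distance) {
        return 2 * halfAngleTan(objectPx, imagePx, viewAngleDeg) * distance;
    }

    // d = l / (2 * r) : distance from a known real size
    public static double distanceFromSize(double objectPx, double imagePx,
                                          double viewAngleDeg, double size) {
        return size / (2 * halfAngleTan(objectPx, imagePx, viewAngleDeg));
    }
}
```

Sanity check: an object filling the whole image under a 90° view angle gives r = tan(45°) = 1, so at 3 m its size is 6 m, and the two methods invert each other.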

camera: image projection

I'd like to project images on a wall using camera. Images, essentially, must scale regarding the distance between camera and the wall.
Firstly, I made distance calculations using right-triangle trigonometry (visionHeight * Math.tan(a)). It's not 100% exact, but close to the real values.
Secondly, knowing the distance, we can try to figure out the full panorama height using the isosceles-triangle trigonometry formula: c = a * tan(A);
A = mCamera.getParameters().getVerticalViewAngle();
The results are about 30% greater than the actual object height and it's kinda weird.
double panoramaHeight = (distance * Math.tan( mCamera.getParameters().getVerticalViewAngle() / 2 * 0.0174532925)) * 2;
I've also tried figuring out those angles using the same isosceles-triangle formula, but now knowing the distance and the height. I got angles of 28 and 48 degrees.
Does this mean that the Android camera doesn't render everything it shoots? And what other solutions can you suggest?
Web search shows that the values returned by getVerticalViewAngle() cannot be blindly trusted on all devices; also note that you should take into account the zoom level and aspect ratio, see Determine angle of view of smartphone camera

Android - Google Maps - Projection - toPixels() - Is it device independent pixels?

I am using the Map View Projection to obtain the screen pixels like
currentPixelLocation = businessMapMv.getProjection().toPixels(tappedLocation, null);
Then I am using this to do some manipulation on the screen like centering a balloon tip.
So what I do is
currentPixelLocation.y = currentPixelLocation.y - 100
This works fine. Are the pixel locations returned by the toPixels method device independent ?
Will my manipulation in the above code work for all screen resolutions?
I think there is a misunderstanding of device independent pixels on your side. If you declare your layout in device independent pixels (dip), the framework itself calculates the correct View size, based on the device's display density, for the device it is running on at runtime. After the framework calculates the View dimension, the View has its dimension set in pixels.
So therefore, getProjection().toPixels() gives you the position relative to the underlying MapView in pixels. Those pixels are device independent.
What seems wrong to me is your calculation currentPixelLocation.y = currentPixelLocation.y - 100. What does the 100 stand for? Those 100 are definitely device dependent. If you want to subtract 100 density-independent pixels, use currentPixelLocation.y = currentPixelLocation.y - (int) (100 * getResources().getDisplayMetrics().density + 0.5f). This ensures that the 100-pixel offset is device *in*dependent.
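The dp-to-px conversion from that answer can be isolated into a helper (class and method names are mine; density would be getResources().getDisplayMetrics().density in a real app):

```java
public class DpToPx {
    // Converts a dp value to physical pixels, rounding to the nearest pixel.
    public static int dpToPx(float dp, float density) {
        return (int) (dp * density + 0.5f);
    }
}
```

On an mdpi device (density 1.0) 100 dp is 100 px; on an hdpi device (density 1.5) the same 100 dp becomes 150 px, so the balloon offset stays visually constant.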
They are device independent, but they depend on the tile size that is defined for the tile layer. On regular maps the default tile size is 256 x 256 pixels.

Determine angle of view of smartphone camera

I'm trying to determine the degree size of the field-of-view of a Droid Incredible smartphone's camera. I need to know this value for an application that I'm developing. Does anyone know how I can find out/calculate it programmatically?
The Camera.Parameters getHorizontalViewAngle() and getVerticalViewAngle() functions provide you with the base view angles. I say "base", because these apply only to the Camera itself in an unzoomed state, and the values returned by these functions do not change even when the view angle itself does.
Camera.Parameters p = camera.getParameters();
double thetaV = Math.toRadians(p.getVerticalViewAngle());
double thetaH = Math.toRadians(p.getHorizontalViewAngle());
Two things cause your "effective" view angle to change: zoom, and using a preview aspect ratio that does not match the camera aspect ratio.
Basic Math
The trigonometry of field-of-view (Θ) is fairly simple:
tan(Θ/2) = x / 2z
x = 2z tan(Θ/2)
x is the linear distance viewable at distance z; i.e., if you held up a ruler at distance z=1 meter, you would be able to see x meters of that ruler.
For instance, on my camera the horizontal field of view is 52.68° while the vertical field of view is 40.74°. Convert these to radians and plug them into the formula with an arbitrary z value of 100 m, and you get x values of 99.0 m (horizontal) and 74.2 m (vertical). This is a 4:3 aspect ratio.
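Those numbers can be reproduced directly from the formula (class and method names are mine):

```java
public class ViewableExtent {
    // x = 2 * z * tan(theta / 2): linear distance viewable at distance z
    public static double viewableExtent(double z, double fovDegrees) {
        return 2 * z * Math.tan(Math.toRadians(fovDegrees) / 2);
    }
}
```

viewableExtent(100, 52.68) is about 99.0 and viewableExtent(100, 40.74) about 74.2, matching the worked example above.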
Zoom
Applying this math to zoom levels is only slightly harder. Now, x remains constant and it is z that changes in a known ratio; we must determine Θ.
tan (Θ/2) = x / (2z)
tan (Θ'/2) = x / (2z')
Θ' = 2 atan((z / z') tan(Θ/2))
Where z is the base zoom level (100), z' is the current zoom level (from CameraParameters.getZoomRatios), Θ is the base horizontal/vertical field of view, and Θ' is the effective field of view. Adding on degree->radian conversions makes this rather verbose.
private static double zoomAngle(double degrees, int zoom) {
    double theta = Math.toRadians(degrees);
    return 2d * Math.atan(100d * Math.tan(theta / 2d) / zoom);
}
Camera.Parameters p = camera.getParameters();
int zoom = p.getZoomRatios().get(p.getZoom()).intValue();
double thetaH = zoomAngle(p.getHorizontalViewAngle(), zoom);
double thetaV = zoomAngle(p.getVerticalViewAngle(), zoom);
Aspect Ratio
While the typical camera is a 4:3 aspect ratio, the preview may also be available in 5:3 and 16:9 ratios and this seems to be accomplished by actually extending the horizontal field of view. This appears to be undocumented, hence unreliable, but by assuming that's how it works we can calculate the field of view.
The math is similar to the zoom calculations; however, in this case z remains constant and it is x that changes. By assuming that the vertical view angle remains unchanged while the horizontal view angle is varied as the aspect ratio changes, it's possible to calculate the new effective horizontal view angle.
tan(Θ/2) = v / (2z)
tan(Θ'/2) = h / (2z)
2z = v / tan(Θ/2)
Θ' = 2 atan((h/v) tan(Θ/2))
Here h/v is the aspect ratio and Θ is the base vertical field of view, while Θ' is the effective horizontal field of view.
Camera.Parameters p = camera.getParameters();
int zoom = p.getZoomRatios().get(p.getZoom()).intValue();
Camera.Size sz = p.getPreviewSize();
double aspect = (double) sz.width / (double) sz.height;
double thetaV = Math.toRadians(p.getVerticalViewAngle());
double thetaH = 2d * Math.atan(aspect * Math.tan(thetaV / 2));
thetaV = 2d * Math.atan(100d * Math.tan(thetaV / 2d) / zoom);
thetaH = 2d * Math.atan(100d * Math.tan(thetaH / 2d) / zoom);
As I said above, since this appears to be undocumented, it is simply a guess that it will apply to all devices; it should be considered a hack. The correct solution would be splitting off a new set of functions getCurrentHorizontalViewAngle and getCurrentVerticalViewAngle.
Unless there's some API call for that (I'm not an Android programmer, I wouldn't know), I would just snap a picture of a ruler from a known distance away, see how much of the ruler is shown in the picture, then use trigonometry to find the angle like this:
Now you have the two distances l and d from the figure. With some simple trigonometry, one can obtain:
tan(α/2) = (l/2)/d,
hence
α = 2*atan(l/(2d))
So with this formula you can calculate the horizontal field-of-view of your camera. Of course measuring the vertical f.o.v. goes exactly the same way except that you then need to view the object in its vertical position.
Then you can hard-code it as a constant in your program. (A named constant, of course, so it'd be easy to change :-p)
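The ruler measurement converts to an angle with a one-liner (class and method names are mine; l is the visible ruler length and d its distance, in the same unit):

```java
public class RulerFov {
    // alpha = 2 * atan((l / 2) / d), returned in degrees
    public static double fieldOfViewDegrees(double l, double d) {
        return Math.toDegrees(2 * Math.atan((l / 2) / d));
    }
}
```

For example, seeing 2 m of ruler at 1 m distance means a 90° field of view.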
I have a Droid Incredible as well. Android 2.2 introduced the functions you are looking for. In my code, I have:
public double getHVA() {
return camera.getParameters().getHorizontalViewAngle();
}
public double getVVA() {
return camera.getParameters().getVerticalViewAngle();
}
However, these require that you have the camera open. I'd be interested to know if there is a "best practices" way to not have to open the camera each time to get those values.
@David Zaslavsky - how? What is the mathematical relationship between the zoom levels? I can't find it anywhere (I asked in this question: What do the Android camera zoom numbers mathematically represent?)
