Is it possible to measure distance to object with camera? - android

Is it possible to measure the distance to an object with a phone camera?
I mean: in my application I start the camera, point it at an object (let's say a house), press a button, and it calculates the distance and shows it on the screen.
If it's possible, where can I find a tutorial or some information about it?

I accept the question has been answered adequately (with the obvious caveats of requiring level ground and possible accuracy problems), but for those who don't believe it can be done, or who think it needs a video camera, let me explain the low-level math needed to do it...
The picture above shows me standing outside my house. The horizontal (d) is the distance I want to measure and the vertical (h) is the height above the ground at which I'm holding the camera. In this case 'h' is a known value when I'm holding the android camera at eye-level (approx 67 inches or 1.7 metres). When I tilt the camera to aim it directly at the point my house meets the ground, all the software needs to do is work out the angle (a) relative to vertical and it can calculate 'd' using...
d = h * tan(a)
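In code, that relationship is trivial. Here is a minimal sketch (plain Java; all names are illustrative), assuming the tilt angle from vertical has already been obtained from the device's sensors:

    // Minimal sketch of the tan-based range finder described above.
    // Assumes level ground; 'angleFromVerticalRad' is the camera's tilt
    // away from pointing straight down, obtained elsewhere (e.g. from
    // the accelerometer).
    public final class GroundDistance {

        // d = h * tan(a)
        public static double distanceMeters(double cameraHeightMeters,
                                            double angleFromVerticalRad) {
            return cameraHeightMeters * Math.tan(angleFromVerticalRad);
        }

        public static void main(String[] args) {
            // Eye level ~1.7 m, camera tilted 60 degrees from vertical:
            System.out.println(distanceMeters(1.7, Math.toRadians(60))); // ~2.94 m
        }
    }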

Well, you should read how ithinkdiff.com "measures" the distance:
Uses the angle of the iPhone to estimate the distance to a point on the ground.
Hold the iPhone in front of you, align the point in the camera and get a direct
reading of the distance. The distance can then be used in the speed tool.
So basically it takes the height at which you hold the phone (eye level), then you must point the camera at the point where the object touches the ground. The phone measures the inclination, and with simple trigonometry it calculates the distance.
This is of course not very accurate. It gets less accurate the farther away the object is. It also assumes that the ground is level.

Nope. The camera can only give you image data, and an image alone doesn't contain enough information to recover depth. If you had multiple images with location information for each, or video, you could process them to triangulate the distance, but a single image alone is not enough.

You can use the technique our eyes use to get a sense of depth and distance.
1) Get 2 images of the same object from two different camera positions.
2) The displacement (in pixels) of the object between the 2 images is inversely proportional to the distance between the camera and the object (see the sketch after the links below).
The implementation is available at https://github.com/agnelvishal/Distance-between-camera-and-object
Here is the research paper http://dsc.ijs.si/files/papers/S101%20Mrovlje.pdf
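For reference, a minimal sketch of the standard pinhole stereo relation (the linked paper derives an equivalent form in terms of the camera's field of view); the baseline and focal length here are assumptions you must supply:

    // Hedged sketch of the two-image approach: Z = f * B / d.
    // 'focalPixels' is the focal length expressed in pixels (focal
    // length in mm divided by pixel size in mm); 'disparityPixels' is
    // the horizontal pixel shift of the object between the two images.
    public final class StereoDistance {
        public static double distanceMeters(double baselineMeters,
                                            double focalPixels,
                                            double disparityPixels) {
            // Smaller disparity means a more distant object, which is
            // the inverse proportionality stated in point 2 above.
            return focalPixels * baselineMeters / disparityPixels;
        }
    }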

You can get the angle from the phone's accelerometer. If you calculate the tangent of this angle and multiply it by the height of the camera lens, you get the distance.
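One way to obtain that angle on Android (a sketch, not production code: it only works while the phone is held still, so that the accelerometer reads gravity alone):

    import android.hardware.Sensor;
    import android.hardware.SensorEvent;
    import android.hardware.SensorEventListener;

    // While the device is stationary, TYPE_ACCELEROMETER measures
    // gravity, so the tilt of the device's z axis (the rear camera
    // looks along -z) away from vertical falls out of the gravity vector.
    public class TiltListener implements SensorEventListener {
        private volatile double angleFromVerticalRad;

        @Override
        public void onSensorChanged(SensorEvent event) {
            if (event.sensor.getType() != Sensor.TYPE_ACCELEROMETER) return;
            float gx = event.values[0], gy = event.values[1], gz = event.values[2];
            double norm = Math.sqrt(gx * gx + gy * gy + gz * gz);
            if (norm == 0) return; // no usable reading yet
            // 0 when the camera points straight down, pi/2 when horizontal.
            angleFromVerticalRad = Math.acos(gz / norm);
        }

        @Override
        public void onAccuracyChanged(Sensor sensor, int accuracy) { }

        public double getAngleFromVerticalRad() { return angleFromVerticalRad; }
    }

Feeding this angle and the camera height into the d = h * tan(a) formula above gives the distance.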

I think this app uses the approach MisterSquonk mentioned (it's free). Check out the "Trigonometry" technique.

I think you can calculate the distance between the camera and the object by using FastCV. With this approach you don't need to know the angle or the position of the camera above ground level. Take a look at this question here.

One way to achieve this is by using the DPI values of your device. You can take a picture and calculate the height, but you'll need another object of known size as a reference. The problem with this method could be the perspective between the objects.

I think it could be possible using the phone camera. Modern phones use lenses to focus on an object. If it is possible to know their focal length and their position (displacement) when focused on the chosen object, it is also possible to determine the distance.

No. Only with two cameras in stereo mode, like the Xbox 360 Kinect. It takes at least 3 points to triangulate distance.

Related

How to measure distance using camera pixels?

How can I know the distance between the phone and an object that appears in the camera? Is there a way to measure it using the camera's pixels? And if yes, approximately what accuracy would I get?
Are you talking about a camera inside the phone? If so, there is no number of pixels between the phone and the object.
In "laboratory conditions" you could measure the size of the known object in the image and use some empirical values to interpolate the distance between phone and object. Maybe openCV for Android helps to get acceptable results outside laboratory conditions.
I think there is a solution to your problem:
Focus on the object and use the focus distance. The results for distant objects will be useless, but results for nearer objects should be acceptable.
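On the old android.hardware.Camera API (which matches the era of this question), the focus distance can be queried directly; treat this as a sketch, since many devices only report rough or infinite values:

    import android.hardware.Camera;

    // Reads the camera's best estimate of the focus distance, in meters.
    // Call after focusing on the object; many devices return Infinity
    // or coarse constants, so accuracy varies a lot by hardware.
    public final class FocusDistanceProbe {
        public static float approximateDistanceMeters(Camera camera) {
            float[] distances = new float[3]; // near, optimal, far
            camera.getParameters().getFocusDistances(distances);
            return distances[Camera.Parameters.FOCUS_DISTANCE_OPTIMAL_INDEX];
        }
    }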

How to measure the real size of an object by using another object (square marker) of known real size, in Android? [duplicate]

I have to make a mobile app that calculates the real-life size of an object in an image.
I have done some research on it and found a helpful question: How would you find the height of objects given an image?
The relation between the distance to the camera and the real-life size of the object isn't actually that complex: the ratio of the size of the object on the sensor to the size of the object in real life is the same as the ratio between the focal length and the distance to the object.
distance to object (mm) = focal length (mm) * real height of object (mm) * image height (pixels) / (object height (pixels) * sensor height (mm))
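In code form that relation is a one-liner; a minimal sketch (names illustrative, not from any particular library):

    // Pinhole-camera relation from the formula above.
    public final class PinholeDistance {
        public static double distanceMm(double focalLengthMm,
                                        double realHeightMm,
                                        double imageHeightPx,
                                        double objectHeightPx,
                                        double sensorHeightMm) {
            return (focalLengthMm * realHeightMm * imageHeightPx)
                    / (objectHeightPx * sensorHeightMm);
        }
    }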
But how do you get the real height of the object if the distance is not known?
Do the tools that create 3D models from images have real-life dimensions?
The simple answer is you can't.
Incidentally, this is why humans have two eyes. If you want to judge size without a known distance, you'll need at least two reference points. This allows you to triangulate the position of the object, get a distance to it, and use your known focal distance to calculate the size.
The more complex answer is there are ways around this for example:
Cheat by using a known reference:
For example, if you have an object of known size, you can infer the distance. This is similar to what NASA does to calibrate its cameras, for example.
You can make safe assumptions if you're dealing with common objects, such as the height of one storey when analysing the image of a building.
Move your camera around:
This allows you to get more than one reference point with the same camera.
I suppose you could use the accelerometer to accurately measure the positional relation between the image captured at point T1 in time and the image captured at point T2. This would give you two images of the same subject with a known distance between them. This then allows you to triangulate as if you had two eyes.
Whether normal hand-held camera jitters will be sufficient for triangulation, or whether the accelerometer will be accurate enough to inertially position the phone, I don't know.
Assume a distance:
If your app is designed to compare something on the scale of a human hand (or other bit of human anatomy), you can probably safely assume a distance based on what people will naturally do. The focus limits of the camera itself will also give an upper and lower range on how far an object can be and still be in focus. This will probably be within a tolerable margin of error.
As you mention in your question, there is an entire subfield dedicated to this question, and it is an active research area.

How to Calculate Height of Android Phone from ground

I am working on a project in which I have to calculate my device's height from the ground. I have searched all over the internet but could not find any solution.
Please, can anyone tell me what to do?
Take it with a grain of salt, a bit of humor and a sense of philosophy. Replace the barometer with your smartphone.
http://naturelovesmath-en.blogspot.ca/2011/06/niels-bohr-barometer-question-myth.html
First it has to be clarified whether "height from ground" means altitude in the sense of "height above sea level", or how far the phone is away from the floor when you have it in your hands.
For the second case:
Like SonicWind states, you could do the trick using the camera.
It would require calibration of the camera and a standard object.
Take a picture of the standard object, which has to be positioned on the ground, at a standard zoom.
Recognize the object's size (or select it in the picture) and calculate the distance to the object
-> you have the distance to the ground.
The object might also be your shoes, etc. So if the application is intended for multiple users, you might allow them to enter their shoe sizes ;)
This is an odd one... but OK, I like a challenge. The only way to realistically do this is to hook a sonar sensor up to the phone (easily done with an Arduino). Other than that, all you can do is set up the code to read the accelerometers to guesstimate the distance (put the phone on the ground and pick it up to the height you want). It appears to be impossible to do otherwise (maybe some clever use of the camera...).

Transform Latitude,Longitude-Position on screen in augmented reality app

This is my first post on this forum and I'm very new to programming. I want to build an application where I can see exactly where some GPS positions are on my phone's screen. I know a lot of applications, like junaio, mixare and others, but they only show the direction to the objects and they are not very accurate (they don't aim to project them at the exact position on screen), so I want to build it myself. I program on Android, but I think it would be the same on iPhone.
I followed the steps suggested by dabhaid:
There are three steps.
1) Determine your position and orientation using sensors.
2) Convert from GPS coordinate space to a planar coordinate space by determining the relative position and bearing of known GPS coordinates, using e.g. great-circle distance and bearing. (Your device stays at the origin of the coordinate space with this scheme.)
3) Do a perspective projection http://en.wikipedia.org/wiki/3D_projection#Perspective_projection to figure out where on the plane that is your display (ok, your camera sensor) the objects should appear, so you can augment them.
Step 1: easy, I have the GPS position and all orientations from my mobile device (x,y,z). For further refinement, I can use some algorithm to smooth these values (average, low-pass filter, whatever).
Step 2: I don't know what is meant exactly by planar coordinate space. I have some different approaches to convert my GPS coordinates. One of them is ECEF (earth-centered), where 0,0,0 is the center of the earth. Somehow, this doesn't look good to me, because every little change along ONE axis results in changes along the other two axes. So if I change the altitude, all 3 axes will change. I don't know if I can follow step 3 with this coordinate system.
In step 2, using the haversine formula is mentioned - this would give me the distance to the point, but I don't get x,y,z from it. Do I have to calculate x,y using trigonometry (bearing (alpha) + distance (hypotenuse))?
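For what it's worth, that trigonometry is just two lines; a minimal sketch assuming the distance d and bearing alpha are already known (on Android, Location.distanceBetween() can provide both):

    // Converts a great-circle distance and bearing (radians, clockwise
    // from north) into planar offsets, with the device at the origin.
    // Names are illustrative.
    public final class GpsToPlane {
        /** @return {east (x), north (y)} offsets in meters. */
        public static double[] toLocalXY(double distanceMeters, double bearingRad) {
            double east  = distanceMeters * Math.sin(bearingRad);
            double north = distanceMeters * Math.cos(bearingRad);
            return new double[] { east, north };
        }
    }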
Step 3: This method looks really cool! If I have my coordinate space from step 2, I can calculate d_x,d_y,d_z using the formula on Wikipedia. But after this step I'm not finished yet, because I only have 3D coordinates, and to project a point onto my screen I need two coordinates. The Wikipedia text continues by calculating b_x,b_y. It uses e_x,e_y,e_z, which is the viewer's position relative to the display surface -> how can I get these values from my mobile device (Android/iOS)? Another approach suggested on Wikipedia is to calculate b_x,b_y using the formula that involves s_x,s_y, the screen size, and r_x,r_y, the recording-surface size. Again, how can I get the recording-surface size from my mobile device?
I can't find anything about it on the internet. It seems that nobody on Android/iOS has ever implemented a perspective projection before...
Thank you very much for all of your answers! Also, links to useful sites would help!
I think you can find many answers in this other thread: Transform GPS-Points to Screen-Points with Perspective Projection in Android.
Hope it helped, bye!
Here's a simple solution I came up with for this issue:
A: Mapping GPS locations on the camera preview in Android
Hope it helped. :D
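For completeness, here is a minimal sketch of the perspective-projection step (step 3) the question asks about, under simple assumptions: the point is already in camera coordinates (x right, y up, z forward) and the intrinsics are reduced to a horizontal field of view, which on the old Camera API can be read from Camera.Parameters.getHorizontalViewAngle(). All names are illustrative:

    // Projects a 3D point in camera space onto the screen plane.
    public final class SimpleProjector {
        /** @return {screenX, screenY} in pixels, or null if behind the camera. */
        public static float[] project(double x, double y, double z,
                                      int screenWidthPx, int screenHeightPx,
                                      double horizontalFovRad) {
            if (z <= 0) return null; // point is behind the viewer
            // Focal length in pixels, derived from the field of view; this
            // plays the role of the e_z / recording-surface terms on Wikipedia.
            double fPx = (screenWidthPx / 2.0) / Math.tan(horizontalFovRad / 2.0);
            float sx = (float) (screenWidthPx / 2.0 + fPx * x / z);
            float sy = (float) (screenHeightPx / 2.0 - fPx * y / z); // screen y grows downward
            return new float[] { sx, sy };
        }
    }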

How to calculate the phone's movement in the vertical direction from rest?

I am developing an app using the Android OS for which I need to know how to calculate the movement of the device upward in the vertical direction.
For example, the device is at rest (point A), the user picks it up in his hand (point B); now there is a height change between point A and point B. How would I calculate that?
I have already gone through the articles about sensors and accelerometers, but I couldn't really find anything to help me with that. Does anyone have any ideas?
If you integrate the acceleration twice you get position, but the error is horrible. It is useless in practice. Here is an explanation why (Google Tech Talk, at 23:20). I highly recommend this video.
Now, if you do not need anything accurate, that is a different story. The linear acceleration is available after sensor fusion, as described in the video. See Sensor.TYPE_LINEAR_ACCELERATION at SensorEvent. I would first try a high-pass filter to detect a sudden increase in the linear acceleration along the vertical axis (see the sketch below).
I have no idea whether it is good for your application.
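A rough sketch of that high-pass idea (the thresholds are made up; this only detects the lift event, it does not measure the height):

    // Keeps a slowly-updated running mean of the vertical linear
    // acceleration and flags samples that jump well above it.
    public final class LiftDetector {
        private double mean;                          // low-pass estimate
        private static final double ALPHA = 0.05;     // smoothing factor (illustrative)
        private static final double THRESHOLD = 1.5;  // m/s^2 (illustrative)

        /** @param az vertical component of TYPE_LINEAR_ACCELERATION */
        public boolean onSample(double az) {
            double highPassed = az - mean; // remove the slow-moving part
            mean += ALPHA * (az - mean);
            return highPassed > THRESHOLD; // sudden upward acceleration
        }
    }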
You can actually establish (only) the vertical position without measuring acceleration over time. This is accomplished by measuring the angle between the direction to the center of the earth and the direction to the magnetic north pole.
This only changes (significantly) when the altitude (height) of the phone changes. What you do is use the accelerometer and magnetometer to get two float[3] arrays, treat these as vectors, make them unit vectors, and then the angle between any two unit vectors is arccos(A . M).
Note that's the dot product, i.e. Math.acos(A[0]*M[0] + A[1]*M[1] + A[2]*M[2]). Any change in this angle corresponds to a change in height. Also note that this will have to be calibrated to real units, and the ratio of change in angle to height will be different at various longitudes. But this is a method of getting an absolute value for height, though of course the angle also becomes skewed when undergoing acceleration, or when there are magnets nearby :)
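A direct transcription of that computation (a sketch; A and M are raw float[3] readings from TYPE_ACCELEROMETER and TYPE_MAGNETIC_FIELD):

    // Angle between the gravity vector and the magnetic field vector.
    public final class InclinationAngle {
        public static double angleRad(float[] a, float[] m) {
            double na = Math.sqrt(a[0]*a[0] + a[1]*a[1] + a[2]*a[2]);
            double nm = Math.sqrt(m[0]*m[0] + m[1]*m[1] + m[2]*m[2]);
            double dot = (a[0]*m[0] + a[1]*m[1] + a[2]*m[2]) / (na * nm);
            // Clamp so sensor noise cannot push acos out of its domain.
            return Math.acos(Math.max(-1.0, Math.min(1.0, dot)));
        }
    }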
You can correlate it to the magnetic field sensor readings (in microtesla).
You can use: dist = the integral of the integral of acceleration (in practice, a double summation over the sampled values) = the integral of (speed + constant).
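In code, that double integration is straightforward, which is exactly why its drift problem (see the Google Tech Talk answer above) is so easy to underestimate; a naive sketch:

    // Naive 1-D dead reckoning: integrate acceleration to speed, then
    // speed to position. Drift makes this useless after a second or two,
    // as noted earlier in the thread. Illustrative only.
    public final class DeadReckoning1D {
        private double velocity; // m/s
        private double position; // m

        /** Feed vertical linear-acceleration samples taken dtSeconds apart. */
        public void addSample(double accelMs2, double dtSeconds) {
            velocity += accelMs2 * dtSeconds; // first integral: speed (+ constant)
            position += velocity * dtSeconds; // second integral: distance
        }

        public double getPosition() { return position; }
    }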
