Is it possible to measure the distance between an Android phone's screen and the user's face?
I want to change the zoom ratio of my Android app's screen based on the distance between the user's face and the phone screen.
Please help me.
Thanks.
Yes, it is possible, as long as your device has a front-facing camera. This was more or less the topic of my bachelor thesis.
Here is the code:
https://github.com/philiiiiiipp/Android-Screen-to-Face-Distance-Measurement
a video of the result: https://www.youtube.com/watch?v=-6_pGkPKAL4
and the paper: A new context - Screen to Face distance 1 1.pdf
I described how to calculate the screen-to-face distance in a post on my blog. The algorithm is based on the distance between the eyes: the further away the face is, the smaller the distance between the eyes appears on the camera.
https://ivanludvig.github.io/blog/2019/07/20/calculating-screen-to-face-distance-android
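For illustration, here is a minimal Java sketch of that idea, assuming a pinhole camera model; this is not the blog post's exact code, and the average-eye-distance constant and focal length are assumptions that need per-device calibration:

```java
/**
 * Estimates screen-to-face distance from the pixel distance between the eyes.
 * Sketch of the pinhole-camera idea only; the focal length (in pixels) and
 * AVERAGE_EYE_DISTANCE_MM are assumed values that must be calibrated per device.
 */
public class FaceDistanceEstimator {

    // Average human interpupillary distance, roughly 63 mm (assumption).
    private static final float AVERAGE_EYE_DISTANCE_MM = 63f;

    // Front-camera focal length expressed in pixels (device-specific; derive it
    // once by measuring the eye separation in pixels at a known distance).
    private final float focalLengthPx;

    public FaceDistanceEstimator(float focalLengthPx) {
        this.focalLengthPx = focalLengthPx;
    }

    /** Pinhole model: distance = f * realEyeSeparation / eyeSeparationInPixels. */
    public float distanceMm(float eyeSeparationPx) {
        return focalLengthPx * AVERAGE_EYE_DISTANCE_MM / eyeSeparationPx;
    }

    /** Derives the focal-length constant from one calibration measurement. */
    public static float calibrateFocalLength(float knownDistanceMm, float eyeSeparationPx) {
        return knownDistanceMm * eyeSeparationPx / AVERAGE_EYE_DISTANCE_MM;
    }
}
```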
I don't believe it's possible.
Not all Android devices are equipped with a proximity sensor.
These sensors are only meant to detect whether or not you have your cheek next to the phone.
EDIT following OP's comment:
Technically, you could indeed use an image recognition algorithm to compute the screen-to-face distance based on the input from the front camera.
But:
Not all Android devices have a front-facing camera.
Implementing this from scratch would be extremely complex if you have no background in image processing.
I am working on a face recognition app where a picture is taken and sent to a server for recognition.
I have to add a validation that the user captures a picture of a real person and not of another picture. I tried an eye-blink feature, where the camera waits for an eye blink and captures as soon as the eyes blink, but that isn't working out, because shaking the phone during capture is also detected as an eye blink.
I would like to ask for help here: is there any way we can detect whether the user is capturing a picture of another picture? Any ideas would help.
I am using React Native to build both the Android and iOS apps.
Thanks in advance.
Thanks for the support.
I resolved it with the eye-blink trick after all. Here is the little algorithm I used:
Open the camera and press the capture button:
The camera detects whether any face is in view and waits for an eye blink.
If the eye-blink probability is at least 90% for both eyes, wait 200 milliseconds, then detect the face again with an eye-open probability above 90% to verify that the face is still there, and capture the picture at the end.
That's a cheap trick, but it has worked out so far.
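For reference, here is a minimal Java sketch of that flow, assuming ML Kit's face detector is used for the eye-open probabilities (the original post doesn't name a detection library, and the threshold values are assumptions):

```java
// Blink-then-verify capture check using Google's ML Kit face detector
// (an assumption -- the post above doesn't name a library).
import com.google.mlkit.vision.common.InputImage;
import com.google.mlkit.vision.face.Face;
import com.google.mlkit.vision.face.FaceDetection;
import com.google.mlkit.vision.face.FaceDetector;
import com.google.mlkit.vision.face.FaceDetectorOptions;

import java.util.List;

public class BlinkLivenessCheck {

    private static final float BLINK_THRESHOLD = 0.10f; // eye-open prob. below this = blink
    private static final float OPEN_THRESHOLD  = 0.90f; // eye-open prob. above this = open

    private final FaceDetector detector;
    private boolean blinkSeen = false;

    public BlinkLivenessCheck() {
        // Classification mode is required to get eye-open probabilities.
        FaceDetectorOptions options = new FaceDetectorOptions.Builder()
                .setClassificationMode(FaceDetectorOptions.CLASSIFICATION_MODE_ALL)
                .build();
        detector = FaceDetection.getClient(options);
    }

    /** Call for every camera frame; runs capture once a blink-then-open sequence is seen. */
    public void processFrame(InputImage frame, Runnable capture) {
        detector.process(frame).addOnSuccessListener((List<Face> faces) -> {
            if (faces.isEmpty()) { blinkSeen = false; return; }
            Face face = faces.get(0);
            Float left = face.getLeftEyeOpenProbability();
            Float right = face.getRightEyeOpenProbability();
            if (left == null || right == null) return;

            if (!blinkSeen && left < BLINK_THRESHOLD && right < BLINK_THRESHOLD) {
                blinkSeen = true; // both eyes closed: blink detected
            } else if (blinkSeen && left > OPEN_THRESHOLD && right > OPEN_THRESHOLD) {
                // Eyes open again on a later frame (the ~200 ms wait in the post
                // corresponds to waiting for this follow-up frame): capture now.
                blinkSeen = false;
                capture.run();
            }
        });
    }
}
```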
Regards
On some iPhones (iOS 11.1 upwards), there's a so-called TrueDepth camera that's used for Face ID. With it (or the rear-facing dual-camera system) you can capture images along with depth maps. You could exploit that feature to see whether the face is flat (captured from an image) or has normal facial contours. See here...
One would have to come up with a 3D face model to fool that.
It's limited to only a few iPhone models though, and I don't know about Android.
I'm trying to build a simple AR scene with an NFT image that I've created with genTextData. The result works fairly well in the Unity editor, but once compiled and run on an Android device, the camera resolution is very bad and there's no focus at all.
My marker is rather small (a 3 cm picture), and the camera is so blurred that the AR cannot identify the marker from far away. I have to put the phone right in front of it (still very blurred), and it will show my object, but with a lot of flickering and jittering.
I tried playing with the filter fields (sample rate/cutoff...). It helped a little with the flickering of the object, but it would never display it from far away; I always have to put my phone right in front of the marker. The result I want is to detect the small marker (with sharp resolution and/or good focus) from a fair distance away, just like the distance from your computer screen to your eyes.
The problem could be camera resolution and focus, or it could be something else. But I'm pretty sure the AR cannot identify the marker points because of the blurriness.
Any ideas or solutions for this problem?
You can have a look here:
http://augmentmy.world/augmented-reality-unity-games-artoolkit-video-resolution-autofocus
I compiled the Unity plugin's Java part and set it to use the highest resolution available on your phone. The autofocus mode is also activated.
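For anyone who can't follow the link, the gist of those two changes looks roughly like this with the legacy android.hardware.Camera API that the plugin was built on (a sketch, not the actual plugin code):

```java
// Pick the highest-resolution preview size and enable continuous autofocus,
// the two changes described above (illustrative sketch only).
import android.hardware.Camera;

public final class CameraTuning {

    private CameraTuning() {}

    public static void useBestResolutionAndAutofocus(Camera camera) {
        Camera.Parameters params = camera.getParameters();

        // Pick the supported preview size with the largest pixel count.
        Camera.Size best = null;
        for (Camera.Size size : params.getSupportedPreviewSizes()) {
            if (best == null || (long) size.width * size.height
                    > (long) best.width * best.height) {
                best = size;
            }
        }
        if (best != null) {
            params.setPreviewSize(best.width, best.height);
        }

        // Enable continuous autofocus if the device supports it, so small
        // markers stay sharp without tap-to-focus.
        if (params.getSupportedFocusModes()
                .contains(Camera.Parameters.FOCUS_MODE_CONTINUOUS_VIDEO)) {
            params.setFocusMode(Camera.Parameters.FOCUS_MODE_CONTINUOUS_VIDEO);
        }

        camera.setParameters(params);
    }
}
```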
Tell me if that helps.
I am working on an application that detects objects or faces and measures the distance from the camera to that object or face. I have completed the face detection part; now, is there any way to measure the distance between the detected face and the point where the camera is located?
Please provide any link or source code. I have searched a lot, but all in vain.
Essentially, by tracking the distance between the user's eyes, and how this changes as the face moves closer to or further from the camera, a fairly accurate estimate of the distance from the camera can be obtained.
Android has a built-in face detector class that will handle determining where the face is, and will even calculate the eye separation for you.
A guy did this for his thesis and posted the code on GitHub, along with some nice images outlining what it does, a demo video, and a link to the paper he wrote.
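As a minimal sketch of that approach, using the built-in android.media.FaceDetector mentioned above and the pinhole-camera conversion (the focal-length and eye-distance constants are assumptions that must be calibrated):

```java
// Estimate camera-to-face distance from eyesDistance(); the constants below
// are placeholders, not real device values.
import android.graphics.Bitmap;
import android.media.FaceDetector;

public class EyeDistanceMeter {

    private static final float AVERAGE_EYE_DISTANCE_MM = 63f; // assumed average
    private static final float FOCAL_LENGTH_PX = 700f;        // placeholder; calibrate!

    /** Returns the estimated camera-to-face distance in mm, or -1 if no face is found. */
    public static float estimateDistanceMm(Bitmap frame) {
        // FaceDetector requires an RGB_565 bitmap with an even width.
        FaceDetector detector =
                new FaceDetector(frame.getWidth(), frame.getHeight(), 1);
        FaceDetector.Face[] faces = new FaceDetector.Face[1];

        if (detector.findFaces(frame, faces) == 0 || faces[0] == null) {
            return -1f;
        }

        // eyesDistance() is the separation between the eyes in pixels.
        float eyesPx = faces[0].eyesDistance();
        return FOCAL_LENGTH_PX * AVERAGE_EYE_DISTANCE_MM / eyesPx;
    }
}
```

A one-time calibration at a known distance (for example, holding the phone 500 mm from the face and recording eyesDistance()) pins down the FOCAL_LENGTH_PX constant for a given device.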
I want to develop an app where the face is detected using the front camera, but no image is taken. The front camera should only detect the face and check whether it is within the correct dimensions. These dimensions will then help me detect the distance between the face and the front camera. I also want to check whether the phone is held at a distance of about 20 inches or not. If this is possible, please help me with it. The app is basically for testing vision; I want to add the above feature to it.
You can achieve this without Unity, OpenCV, or any other library. Please refer to this and this (links). They detect faces with the help of the android.hardware.Camera.Face class. You will also need to implement the listener android.hardware.Camera.FaceDetectionListener to catch the face-detection event. On detection, you will get a Face[] array that gives you all the information related to the faces that were detected. Hope this is what you wanted.
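A minimal sketch of that listener setup with the legacy android.hardware.Camera API; the size thresholds are made-up placeholders to be tuned for the 20-inch target:

```java
// Watch the front camera for faces and gate on the face size, which grows as
// the face gets closer to the camera (threshold values are assumptions).
import android.hardware.Camera;

public class FrontFaceWatcher {

    public static void watch(final Camera frontCamera) {
        frontCamera.setFaceDetectionListener(new Camera.FaceDetectionListener() {
            @Override
            public void onFaceDetection(Camera.Face[] faces, Camera camera) {
                if (faces.length == 0) return;

                // face.rect is in the camera's -1000..1000 coordinate space;
                // its width grows as the face gets closer, so a calibrated
                // threshold pair can gate "too close / too far".
                Camera.Face face = faces[0];
                int faceWidth = face.rect.width();
                boolean withinTargetRange = faceWidth > 300 && faceWidth < 600; // assumed bounds
                // ...react to withinTargetRange, e.g. prompt the user to move.
            }
        });
        frontCamera.startFaceDetection(); // must be called after startPreview()
    }
}
```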
Is there a way I could show what the rear camera captures on the full screen, such that it creates the illusion of the screen being see-through? It doesn't need to be perfect, just convincing enough; a little lag won't make any difference.
Is it possible to create such an effect using the phone camera? If yes, how can the effect be achieved (as in, what transformations to apply, etc.)?
(I already know how to create a simple camera preview.)
Edit: I now know it has been done, http://gizmodo.com/5587749/the-samsung-galaxy-s-goes-see+through, but I still have no clue how to do this properly. Trial and error is one way; the other is calculating what part of the scene the user would be seeing if the phone weren't there.
I think there would be some factors involved, like:
viewing distance,
viewing angle,
camera zoom range,
camera focus,
camera quality,
phone orientation,
camera position (where is camera located on phone) etc.
So, I don't feel this problem has a simple enough solution; if that is not so, please clarify with an answer.
Thanks for the help,
Shobhit
You can use standard 3D projection math to project a portion of the rear camera image onto the display. You can manage this by assuming everything the camera sees is at a particular depth from the rear camera, and by assuming a particular viewpoint for the observer.
You can improve on this by looking for faces/eyes using the front camera. You can get a rough estimate of the viewing distance from the eye spacing, and assume a viewer position midway between the eyes. Of course, this only works for one viewer at a time (e.g., if your face tracker finds multiple faces, you can select one of them).
Also, you can improve the illusion by calibrating the camera and screen so you can match the color and brightness from one to the other.
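As a rough sketch of the fixed-depth, similar-triangles version of that projection (every constant here is an assumption; a real implementation would use the device's actual camera field of view and screen geometry):

```java
// Given an assumed scene depth and viewer distance, compute which crop of the
// rear camera frame should fill the screen so the phone appears transparent.
// Assumes a pinhole camera located at the screen center, square pixels, and a
// viewer centered in front of the screen -- all simplifications.
public final class SeeThroughCrop {

    private SeeThroughCrop() {}

    /**
     * Returns {left, top, width, height} of the crop region in camera-frame
     * pixels. frameW/frameH: camera frame size; fovXDeg: camera horizontal
     * field of view; screenWmm/screenHmm: physical screen size; viewerMm:
     * assumed viewer-to-screen distance (e.g. from eye spacing); sceneMm:
     * assumed depth of the scene behind the phone.
     */
    public static float[] cropRegion(int frameW, int frameH, float fovXDeg,
                                     float screenWmm, float screenHmm,
                                     float viewerMm, float sceneMm) {
        // Scene region the viewer would see through the screen "window",
        // by similar triangles from the eye through the screen edges.
        float visibleWmm = screenWmm * (viewerMm + sceneMm) / viewerMm;
        float visibleHmm = screenHmm * (viewerMm + sceneMm) / viewerMm;

        // Scene region the camera actually captures at that depth.
        float capturedWmm = 2f * sceneMm
                * (float) Math.tan(Math.toRadians(fovXDeg / 2f));
        float capturedHmm = capturedWmm * frameH / frameW;

        // Fraction of the camera frame to keep, centered on the frame.
        float cropW = frameW * visibleWmm / capturedWmm;
        float cropH = frameH * visibleHmm / capturedHmm;
        return new float[] {
                (frameW - cropW) / 2f, (frameH - cropH) / 2f, cropW, cropH
        };
    }
}
```

Scaling that crop up to fill the screen gives the basic illusion; feeding viewerMm from the front-camera eye-spacing estimate, as suggested above, lets the crop update as the viewer moves.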