Android detection of user eye focus coordinates on screen

I have this issue: I have to make an application that can detect the x,y position on the screen of where the user is looking. By that I don't mean detecting the user's eye coordinates with face detection, like with OpenCV; I need the exact position on the screen the user is watching. I have looked everywhere and seem to find nothing on it. A little help on where I can research more about it would be nice. Right now all I can find are OpenCV answers about showing the coordinates of both eyes when doing facial recognition on the camera preview, but that's not what I need.
Thanks in advance!

Related

Face Detection Android

I've got a problem with Google's ML Kit face detector: it returns a face even when the face is half-covered by something, which makes the face recognition model I use think it's a new face. I would like to know a solution to this problem, maybe a different face detector or another approach using the ML Kit face detector.
Thanks in advance.
ML Kit works on data: the more data you provide to your model, the more accurate your results will be. If it accepts half-covered images, that will also benefit your trained model. You can train the model by providing many images of the same person, for example with eyes closed, eyes open, face turned left, face turned right, looking down, looking up, zoomed in, or with half the face covered. Once your model has enough data, it will recognize a person even if they are wearing a face mask or have their eyes closed.
In my opinion, ML Kit is more than enough to implement face detection in your app, and it is being improved continuously. Happy coding :)
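For reference, a minimal sketch of wiring up the ML Kit face detector in Kotlin; the option values (accurate mode, minimum face size) are assumptions you would tune for your own app, and the cropping/recognition step is left to your own model:

```kotlin
import android.graphics.Bitmap
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.face.FaceDetection
import com.google.mlkit.vision.face.FaceDetectorOptions

// Configure the detector: accurate mode plus landmarks/classification so you
// can inspect how much of each face (eyes, smile probability, etc.) was found.
val options = FaceDetectorOptions.Builder()
    .setPerformanceMode(FaceDetectorOptions.PERFORMANCE_MODE_ACCURATE)
    .setLandmarkMode(FaceDetectorOptions.LANDMARK_MODE_ALL)
    .setClassificationMode(FaceDetectorOptions.CLASSIFICATION_MODE_ALL)
    .setMinFaceSize(0.2f) // assumed threshold: ignore very small faces
    .build()

val detector = FaceDetection.getClient(options)

fun detectFaces(bitmap: Bitmap) {
    val image = InputImage.fromBitmap(bitmap, /* rotationDegrees = */ 0)
    detector.process(image)
        .addOnSuccessListener { faces ->
            for (face in faces) {
                // Bounding box and eye-open probabilities for each detected face.
                val box = face.boundingBox
                val leftOpen = face.leftEyeOpenProbability   // null if unavailable
                val rightOpen = face.rightEyeOpenProbability
                // Feed the cropped box to your own recognition model here.
            }
        }
        .addOnFailureListener { e -> e.printStackTrace() }
}
```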

Detect if picture of picture taken in mobile app

I am working on a face recognition app where the picture is taken and sent to server for recognition.
I have to add a validation that the user captures a picture of a real person rather than a picture of another picture. I have tried an eye-blink feature, in which the camera waits for an eye blink and captures as soon as the eyes blink, but that is not working out because a blink is falsely detected if the mobile is shaken during capture.
I would like to ask for help here: is there any way to detect whether the user is capturing a picture of another picture? Any ideas would help.
I am using react native to build both Android and iOS apps.
Thanks in advance.
Thanks for the support.
I resolved it with the eye-blink trick after all. Here is the little algorithm I used:
Open the camera; on tapping the capture button:
The camera detects whether any face is in view and waits for an eye blink.
If the blink probability is above 90% for both eyes, wait 200 milliseconds, then detect the face again with an eye-open probability above 90% to verify the face is still there, and capture the picture at the end.
That's a cheap trick, but it's working out so far; a rough sketch of the check is below.
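Expressed against ML Kit's Face eye-open probabilities (a sketch only; my actual app is React Native, "blink probability 90%" is read here as eye-open probability below 0.1, and capturePicture is a placeholder for whatever camera capture call you use):

```kotlin
import android.os.SystemClock
import com.google.mlkit.vision.face.Face

// Sketch of the blink-then-reopen liveness check described above.
class BlinkLivenessCheck(private val capturePicture: () -> Unit) {

    private var blinkSeenAt = 0L // elapsedRealtime of the detected blink, 0 = none

    // Call this for every face ML Kit reports on the preview stream.
    fun onFace(face: Face) {
        val leftOpen = face.leftEyeOpenProbability ?: return
        val rightOpen = face.rightEyeOpenProbability ?: return
        val now = SystemClock.elapsedRealtime()

        if (leftOpen < 0.1f && rightOpen < 0.1f) {
            // Both eyes look closed -> remember the blink.
            blinkSeenAt = now
        } else if (blinkSeenAt != 0L && now - blinkSeenAt >= 200 &&
            leftOpen > 0.9f && rightOpen > 0.9f
        ) {
            // At least 200 ms after the blink the face is still there with
            // both eyes open again -> treat it as a live face and capture.
            blinkSeenAt = 0L
            capturePicture()
        }
    }
}
```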
Regards
On some iPhones (iOS 11.1 upwards), there's a so-called TrueDepth camera that's used for Face ID. With it (or the back-facing dual-camera system) you can capture images along with depth maps. You could exploit that feature to see if the face is flat (captured from an image) or has normal facial contours. See here...
One would have to come up with a 3D face model to fool that.
It's limited to only a few iPhone models, though, and I don't know about Android.

Face focus using Front camera

I want to develop an app where the face is detected using the front camera; however, no image is taken. The front camera should only detect the face and check whether it is within the correct dimensions. These dimensions will then help me detect the distance between the face and the front camera. I also want to check whether or not the phone is held at a distance of about 20 inches (roughly 1.7 feet). If this is possible, please help me with it. The app is basically for testing vision, and I want to add the above feature to it.
You can achieve this without Unity, OpenCV, or any other library. Please refer to this and this link. They detect faces with the help of the android.hardware.Camera.Face class. Also, you will need to implement the android.hardware.Camera.FaceDetectionListener listener to grab the face-detection event. When a face is detected, you will get a Face[] array with all the information related to the detected face. Hope this is what you wanted.
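For example, a minimal sketch using the (deprecated) android.hardware.Camera API described in those links; the distance check itself is an assumption on my part, since the face rectangle is reported in the camera's -1000..1000 coordinate space and mapping its width to an actual 20-inch distance needs calibration on your device:

```kotlin
import android.hardware.Camera

// Sketch: front-camera face detection without OpenCV or Unity, using the
// deprecated android.hardware.Camera face-detection callbacks.
@Suppress("DEPRECATION")
fun startFrontFaceDetection(frontCameraId: Int): Camera {
    val camera = Camera.open(frontCameraId)

    camera.setFaceDetectionListener { faces, _ ->
        for (face in faces) {
            // face.rect is in a normalized space from -1000 to 1000,
            // independent of the preview resolution.
            val faceWidth = face.rect.width()
            // Assumption: compare faceWidth against a threshold you calibrate
            // once at a known 20-inch distance to decide "too close / too far".
        }
    }

    // The preview must be running (with a SurfaceTexture/SurfaceHolder set)
    // before face detection can start.
    camera.startPreview()
    if (camera.parameters.maxNumDetectedFaces > 0) {
        camera.startFaceDetection()
    }
    return camera
}
```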

How to make Android screen see-through using backside camera?

Is there a way I could show what the rear camera captures full-screen, so that it creates the illusion of the screen being see-through? It doesn't need to be perfect, just convincing enough; a little lag won't make any difference.
Is it possible to create such an effect using the phone camera? If yes, how can the effect be achieved (as in, what transformations to apply, etc.)?
(I already know how to create a simple Camera Preview)
Edit: I now also know it has been done (http://gizmodo.com/5587749/the-samsung-galaxy-s-goes-see+through), but I still have no clue how to do this properly. I know trial and error is one way; the other is calculating what part of the scene the user would be seeing if the phone weren't there.
I think there would be some factors involved, like:
viewing distance,
viewing angle,
camera zoom range,
camera focus,
camera quality,
phone orientation,
camera position (where the camera is located on the phone), etc.
So, I don't feel this problem has a simple solution; if that's not the case, please clarify with an answer.
Thanks for the help,
Shobhit
You can use standard 3D projection math to project a portion of the backside camera image onto the display; you can manage this by assuming everything the camera sees is at a particular depth from the backside camera, and by assuming a particular viewpoint for the observer.
You can improve on this by looking for faces/eyes using the frontside camera. You can get a rough estimate of the viewing distance from the eye spacing, and assume a viewer position midway between the eyes. Of course, this only works for one viewer at a time (e.g., if your face tracker finds multiple faces, you can select one of them).
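For instance, a small sketch of that distance estimate; the pinhole-camera model and the ~63 mm average interpupillary distance are assumptions of mine:

```kotlin
// Rough viewer-distance estimate from the pixel spacing between the two eyes
// reported by a face tracker on the front camera. Pinhole-camera model:
//   distance ≈ focalLengthPixels * realEyeSpacing / eyeSpacingPixels
// Assumes an average interpupillary distance of about 63 mm.
fun estimateViewerDistanceMeters(
    eyeSpacingPixels: Float,     // distance between left/right eye centers in the image
    focalLengthPixels: Float,    // camera focal length expressed in pixels
    realEyeSpacingMeters: Float = 0.063f
): Float = focalLengthPixels * realEyeSpacingMeters / eyeSpacingPixels

// The focal length in pixels can be derived from the camera's horizontal
// field of view and the image width.
fun focalLengthPixels(imageWidthPixels: Int, horizontalFovDegrees: Float): Float {
    val halfFovRadians = Math.toRadians(horizontalFovDegrees / 2.0)
    return (imageWidthPixels / 2.0 / Math.tan(halfFovRadians)).toFloat()
}
```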
Also, you can improve the illusion by calibrating the camera and screen so you can match the color and brightness from one to the other.

Move OpenGL ES objects with accelerometer

I'm developing an Android application. I want to do the following:
I will have a black screen with an object in its center, for example, a vase.
With this app, I want to offer a 360-degree view of the vase. Let me explain: imagine the vase is the center of an imaginary circle. I want to make the user follow this circle, to see the vase from any point of view. I don't know if I'm explaining it well.
In real life, you can move around a vase and see it from the front, from behind, and from the other sides. I want to simulate this.
My problem is that I'm not sure if I can simulate this using accelerometer.
How can I know if the user is describing a circle with the mobile phone?
If you don't understand me or you need more details, please tell me.
You should combine the accelerometer with the compass; the compass gives you direction.
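As a sketch (the sensor fusion here is the simplest possible one, and names like OrientationTracker are just for illustration), you can derive the device's heading from the accelerometer plus the magnetic field sensor via SensorManager, and then place the OpenGL camera on a circle around the vase at that angle:

```kotlin
import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager
import kotlin.math.cos
import kotlin.math.sin

// Combines accelerometer + magnetic field readings into a heading (azimuth).
class OrientationTracker(context: Context) : SensorEventListener {
    private val sensorManager =
        context.getSystemService(Context.SENSOR_SERVICE) as SensorManager
    private val gravity = FloatArray(3)
    private val geomagnetic = FloatArray(3)

    @Volatile var azimuthRadians = 0f   // rotation around the vertical axis
        private set

    fun start() {
        sensorManager.registerListener(this,
            sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),
            SensorManager.SENSOR_DELAY_GAME)
        sensorManager.registerListener(this,
            sensorManager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD),
            SensorManager.SENSOR_DELAY_GAME)
    }

    fun stop() = sensorManager.unregisterListener(this)

    override fun onSensorChanged(event: SensorEvent) {
        when (event.sensor.type) {
            Sensor.TYPE_ACCELEROMETER -> System.arraycopy(event.values, 0, gravity, 0, 3)
            Sensor.TYPE_MAGNETIC_FIELD -> System.arraycopy(event.values, 0, geomagnetic, 0, 3)
        }
        val rotation = FloatArray(9)
        val inclination = FloatArray(9)
        if (SensorManager.getRotationMatrix(rotation, inclination, gravity, geomagnetic)) {
            val orientation = FloatArray(3)
            SensorManager.getOrientation(rotation, orientation)
            azimuthRadians = orientation[0]
        }
    }

    override fun onAccuracyChanged(sensor: Sensor?, accuracy: Int) {}
}

// In your GLSurfaceView.Renderer, orbit the eye point around the vase at the
// current heading (the radius here is arbitrary) and feed it to setLookAtM.
fun eyePositionOnCircle(azimuthRadians: Float, radius: Float = 3f): FloatArray =
    floatArrayOf(radius * sin(azimuthRadians), 0f, radius * cos(azimuthRadians))
```

Note that the azimuth only captures rotation around the vertical axis, so this simulates walking around the vase on a flat circle; to orbit above or below it you would also need the pitch (orientation[1]).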
