I have a requirement in an Android application where I want to turn off the screen when no motion has been detected (motion detection is done using the camera).
The camera should keep running in the background, taking pictures while the screen is off, and as soon as motion is detected the screen should be turned back on.
How can I do that in Android?
I have searched a lot and implemented a few things, but could not achieve the desired result.
Please help me out.
I don't think this is possible. To take a picture, the Camera class needs a SurfaceHolder, and it has to draw the preview frames on the screen for the capture to work.
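For reference, a minimal sketch of the standard legacy capture path; the binding to a visible surface is exactly the constraint here:

    import android.hardware.Camera;
    import android.view.SurfaceHolder;
    import java.io.IOException;

    // Sketch: the legacy capture path binds the preview to a visible surface.
    class LegacyCapture {
        void capture(SurfaceHolder holder) throws IOException {
            final Camera camera = Camera.open();
            camera.setPreviewDisplay(holder); // needs a visible SurfaceHolder
            camera.startPreview();            // preview must run before takePicture()
            camera.takePicture(null, null, new Camera.PictureCallback() {
                @Override
                public void onPictureTaken(byte[] jpeg, Camera cam) {
                    // handle the JPEG bytes; restart the preview for further shots
                }
            });
        }
    }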
I am working on a face recognition app where a picture is taken and sent to a server for recognition.
I have to add a validation that the user captures a picture of a real person, not a picture of another picture. I tried an eye-blink feature, where the camera waits for an eye blink and captures as soon as the eyes blink, but that is not working out because it registers a blink if the phone is shaken during capture.
I would like to ask for help here: is there any way to detect whether the user is capturing a picture of another picture? Any ideas would help.
I am using React Native to build both the Android and iOS apps.
Thanks in advance.
Thanks for the support.
I resolved it with the eye-blink trick after all. Here is the little algorithm I used:
1. Open the camera; the user taps the capture button.
2. The camera detects whether any face is in view and waits for an eye blink.
3. If the blink probability is above 90% for both eyes, wait 200 milliseconds, then detect the face again with an eye-open probability above 90% to verify the face is still there, and capture the picture at the end.
It's a cheap trick, but it has worked out so far.
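To make the state machine concrete, here is a rough Java sketch of the per-frame decision. ML Kit's face detector is just one library that exposes per-eye open probabilities (the library choice and class name are illustrative, not necessarily what I shipped); the thresholds match the steps above:

    import com.google.mlkit.vision.face.Face;

    // Illustrative blink gate matching the steps above (ML Kit face detector
    // assumed, with classification mode enabled so eye probabilities are set).
    class BlinkGate {
        private boolean blinkSeen = false;
        private long blinkAtMs = 0L;

        // Call once per detected face; returns true when it is time to capture.
        boolean shouldCapture(Face face) {
            Float left = face.getLeftEyeOpenProbability();
            Float right = face.getRightEyeOpenProbability();
            if (left == null || right == null) return false;

            if (!blinkSeen && left < 0.1f && right < 0.1f) {
                blinkSeen = true;                     // both eyes closed: blink seen
                blinkAtMs = System.currentTimeMillis();
                return false;
            }
            if (blinkSeen
                    && System.currentTimeMillis() - blinkAtMs >= 200
                    && left > 0.9f && right > 0.9f) { // face still there, eyes reopened
                blinkSeen = false;
                return true;
            }
            return false;
        }
    }

The 200 ms re-check is what filters the shake case: a shaken frame tends to lose the face or the high eye-open probability, so no capture fires.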
Regards
On some iPhones (iOS 11.1 upwards), there's a so-called TrueDepth camera that's used for Face ID. With it (or the rear-facing dual-camera system) you can capture images along with depth maps. You could exploit that feature to check whether the face is flat (captured from an image) or has normal facial contours. See here...
One would have to come up with a 3D face model to fool that.
It's limited to only a few iPhone models, though, and I don't know about Android.
I am trying to take 6 photos, rotating my phone by 60 degrees for each one, using the camera2 API.
For the first photo I need to autofocus with the focus area at the center of the screen.
Then I have to keep that focus for the next 5 photos; the camera should not try to refocus.
I have been trying to get this working for a week, but the results are not as expected. I tried many approaches and ran into a few problems: images are blurry, the camera tries to find a new focus for each photo, and so on.
It would be great if someone could show me the direction :)
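To be concrete, this is the behaviour I am after, sketched with the camera2 request builders (session setup and the center MeteringRectangle are assumed): in CONTROL_AF_MODE_AUTO the lens only moves on an explicit trigger, so triggering once and never again should keep the focus.

    import android.hardware.camera2.CameraAccessException;
    import android.hardware.camera2.CameraDevice;
    import android.hardware.camera2.CameraMetadata;
    import android.hardware.camera2.CaptureRequest;
    import android.hardware.camera2.params.MeteringRectangle;

    // Sketch: focus once at the center, then reuse the locked focus for shots 2-6.
    class FocusOnce {
        CaptureRequest firstShot(CameraDevice device, MeteringRectangle center)
                throws CameraAccessException {
            CaptureRequest.Builder b =
                    device.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
            b.set(CaptureRequest.CONTROL_AF_MODE, CameraMetadata.CONTROL_AF_MODE_AUTO);
            b.set(CaptureRequest.CONTROL_AF_REGIONS, new MeteringRectangle[] { center });
            // Start one AF scan; wait for CaptureResult.CONTROL_AF_STATE ==
            // CONTROL_AF_STATE_FOCUSED_LOCKED before actually capturing.
            b.set(CaptureRequest.CONTROL_AF_TRIGGER, CameraMetadata.CONTROL_AF_TRIGGER_START);
            return b.build();
        }

        CaptureRequest nextShot(CameraDevice device, MeteringRectangle center)
                throws CameraAccessException {
            CaptureRequest.Builder b =
                    device.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
            b.set(CaptureRequest.CONTROL_AF_MODE, CameraMetadata.CONTROL_AF_MODE_AUTO);
            b.set(CaptureRequest.CONTROL_AF_REGIONS, new MeteringRectangle[] { center });
            // No trigger here: in AF_MODE_AUTO the lens holds its last position,
            // so the camera will not try to refocus for the remaining photos.
            b.set(CaptureRequest.CONTROL_AF_TRIGGER, CameraMetadata.CONTROL_AF_TRIGGER_IDLE);
            return b.build();
        }
    }

If even AF_MODE_AUTO drifts on a given device, the heavier alternative would be CONTROL_AF_MODE_OFF with a manual LENS_FOCUS_DISTANCE read back from the first capture result, where the hardware supports it.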
I don't know the exact answer to your question, but I'll try to tell you something related to it.
How to use flash without stopping camera feed?
Above is a link to my answer; check that.
I think in your app you have to put in a condition for that.
Set your flash mode according to rotation, or whatever condition you want: whenever the rotation is X, set your flash to Y.
Like [On, Auto, Flash].
Hope this will help you :)
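For instance, with the legacy Camera API the conditional flash change looks roughly like this (the rotation check is just a placeholder for whatever condition you use; updating parameters does not require stopping the preview on most devices):

    import android.hardware.Camera;

    // Sketch: pick a flash mode from a condition (here, device rotation)
    // without stopping the camera feed; works on most devices.
    class ConditionalFlash {
        void applyFlashForRotation(Camera camera, int rotationDegrees) {
            Camera.Parameters params = camera.getParameters();
            if (rotationDegrees == 0) {
                params.setFlashMode(Camera.Parameters.FLASH_MODE_ON);
            } else if (rotationDegrees == 90) {
                params.setFlashMode(Camera.Parameters.FLASH_MODE_AUTO);
            } else {
                params.setFlashMode(Camera.Parameters.FLASH_MODE_OFF);
            }
            camera.setParameters(params); // takes effect while the preview keeps running
        }
    }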
I am doing basic object detection on the camera preview in Android (3.2 and above). For devices that do not support processing the preview directly, I buffer the preview frames, process them, and clear the buffer. This part is working as desired.
What I now want is for this app to run in the background while any other app is running in the foreground. I am using an Android service and am able to run a small test app in the background. However, my concern is with the camera preview part.
I don't want to display the preview but still want to use the preview data for processing. This might be too much to ask, but I wanted to know if it is even possible. I came across this link, which shows some hope. Basically, I want to process the video (preview) stream without displaying it on the screen. If this is doable, then I can think of putting this app in the background while some other app is in the foreground.
Unfortunately I won't be able to share the code; however, it is the standard logic of creating a surface view and starting the preview.
I would really appreciate any insight into this.
Check the comments here. Basically, he opens the camera hardware, sets a preview callback, and calls startPreview() without setting the previewDisplay (this might not work on every device). You can try this from your background service. All of this will only work as long as the foreground app doesn't access the camera. Please update this if it works; I am interested to know.
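A minimal sketch of that approach (legacy android.hardware.Camera API; as noted, some devices refuse to deliver frames without a real surface):

    import android.hardware.Camera;

    // Sketch: receive preview frames without ever showing them on screen.
    class HiddenPreview {
        void start() {
            final Camera camera = Camera.open();
            camera.setPreviewCallback(new Camera.PreviewCallback() {
                @Override
                public void onPreviewFrame(byte[] frame, Camera cam) {
                    // frame is NV21 by default; run the object detection here
                }
            });
            // Deliberately no setPreviewDisplay(); some devices instead need a
            // dummy SurfaceTexture passed to setPreviewTexture() to start streaming.
            camera.startPreview();
        }
    }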
Is there a way I could show what the rear camera captures full-screen, such that it creates the illusion of the screen being see-through? It doesn't need to be perfect, just convincing enough; a little lag won't make any difference.
Is it possible to create such an effect using the phone camera? If yes, how can the effect be achieved (as in, what transformations to apply, etc.)?
(I already know how to create a simple camera preview.)
Edit: I now know it has been done, http://gizmodo.com/5587749/the-samsung-galaxy-s-goes-see+through, but I still have no clue how to do this properly. I know trial and error is one way; the other is calculating what part of the scene the user would be seeing if the phone weren't there.
I think there would be several factors involved, like:
viewing distance,
viewing angle,
camera zoom range,
camera focus,
camera quality,
phone orientation,
camera position (where the camera is located on the phone), etc.
So I don't feel this problem has a simple enough solution; if that is not so, please clarify with an answer.
Thanks for the help,
Shobhit
You can use standard 3D projection math to project a portion of the rear camera image onto the display. You can manage this by assuming everything the camera sees is at a particular depth from the rear camera, and by assuming a particular viewpoint for the observer.
You can improve on this by looking for faces/eyes using the front camera. You can get a rough estimate of the viewing distance from the eye spacing, and assume a viewer position midway between the eyes. Of course, this only works for one viewer at a time (e.g., if your face tracker finds multiple faces, you can select one of them).
Also, you can improve the illusion by calibrating the camera and screen so you can match the color and brightness from one to the other.
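As a concrete example of the viewer-distance estimate, here is a hypothetical helper using the pinhole-camera relation; the focal length in pixels and the ~63 mm average interpupillary distance are assumptions you would calibrate per device:

    // Hypothetical helper: estimate viewer distance from the detected eye spacing.
    // Pinhole relation: distance = focalLengthPx * realEyeSpacing / eyeSpacingPx.
    class ViewerDistance {
        static final double EYE_SPACING_M = 0.063; // average interpupillary distance (assumed)

        static double estimateMeters(double eyeSpacingPx, double focalLengthPx) {
            return focalLengthPx * EYE_SPACING_M / eyeSpacingPx;
        }
    }

With that distance and an assumed scene depth, the crop and scale of the rear image follow from similar triangles.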
I have a TabHost with three activities; one of them is meant for taking pictures and uses the Android camera. When I click on this tab for the first time, a blank screen appears for 1-2 seconds before the tabs reappear and the camera preview starts. Is there a way to reduce the delay, or at least to let the tabs stay visible while the camera is opened? I tried initialising the camera in a separate thread, but it gave me a runtime exception.
As I understand it, you are starting the camera intent from your application, or are you just displaying a preview of the camera feed right in your app? Providing code with your question will help you get better answers.
Are you having this problem on the emulator? If so, it's normal behavior and you shouldn't be worried. If this is on an actual device, I don't think you can really do anything about it. Try loading the camera application by itself and compare times with your application; it shouldn't differ by much.
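If you still want to experiment, one common pattern is to open the camera on a HandlerThread so the tab's UI keeps rendering while Camera.open() blocks; the runtime exception you saw usually comes from touching the SurfaceView off the main thread, so post that part back. A sketch, with bindPreview() standing in for your own surface-binding code:

    import android.app.Activity;
    import android.hardware.Camera;
    import android.os.Handler;
    import android.os.HandlerThread;

    // Sketch: keep the slow Camera.open() call off the UI thread.
    class AsyncCameraOpener {
        void open(final Activity activity) {
            HandlerThread thread = new HandlerThread("camera-open");
            thread.start();
            new Handler(thread.getLooper()).post(new Runnable() {
                @Override
                public void run() {
                    final Camera camera = Camera.open(); // can block for 1-2 seconds
                    activity.runOnUiThread(new Runnable() {
                        @Override
                        public void run() {
                            bindPreview(camera); // hypothetical: attach the SurfaceHolder here
                        }
                    });
                }
            });
        }

        void bindPreview(Camera camera) {
            // setPreviewDisplay(...) + startPreview() on the main thread
        }
    }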
Ryan