I am trying to take six photos with the camera2 API, rotating my phone by 60 degrees between each shot.
For the first photo I need to autofocus, with the focus area at the center of the screen.
That focus then has to be kept for the next five photos; the camera should not try to refocus.
I have been trying to make this work for a week, but the results are not as expected. I have tried many approaches and run into several problems: the images are blurry, the camera tries to find a new focus for each photo, and so on.
It would be great if someone could point me in the right direction :)
I don't know the exact answer to your question, but I will try to tell you a few things about it.
How to use flash without stopping camera feed?
The link above is one of my answers; check it out.
I think you will have to put some condition for that in your app.
Set your flash mode according to the rotation, or whatever criterion you want: whenever the rotation is such-and-such, use such-and-such a flash mode.
For example [On, Auto, Flash].
Hope this will help you :)
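For the locked-focus part of the original question, a minimal camera2 sketch could look like the following. All the names it uses (cameraDevice, captureSession, previewSurface, centerRegion, afCallback, handler) are placeholders for objects you would already have set up, and afCallback is assumed to be a CaptureCallback that waits for CONTROL_AF_STATE_FOCUSED_LOCKED; this is an outline of the approach, not a drop-in solution.

    import android.hardware.camera2.CameraAccessException;
    import android.hardware.camera2.CameraDevice;
    import android.hardware.camera2.CaptureRequest;
    import android.hardware.camera2.params.MeteringRectangle;

    // 1. Focus once on the center region by firing a single AF trigger.
    private void focusOnCenterOnce() throws CameraAccessException {
        CaptureRequest.Builder builder =
                cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
        builder.addTarget(previewSurface);
        builder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_AUTO);
        builder.set(CaptureRequest.CONTROL_AF_REGIONS, new MeteringRectangle[] { centerRegion });
        builder.set(CaptureRequest.CONTROL_AF_TRIGGER, CaptureRequest.CONTROL_AF_TRIGGER_START);
        // afCallback should see CONTROL_AF_STATE_FOCUSED_LOCKED before the first photo is taken.
        captureSession.capture(builder.build(), afCallback, handler);
    }

    // 2. For the next five photos, keep AF mode AUTO and never send another
    //    CONTROL_AF_TRIGGER_START; the lens then stays where the first focus run left it.
    private void keepExistingFocus() throws CameraAccessException {
        CaptureRequest.Builder builder =
                cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
        builder.addTarget(previewSurface);
        builder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_AUTO);
        builder.set(CaptureRequest.CONTROL_AF_TRIGGER, CaptureRequest.CONTROL_AF_TRIGGER_IDLE);
        captureSession.setRepeatingRequest(builder.build(), null, handler);
    }

When you actually capture each still, build the request the same way (for example from TEMPLATE_STILL_CAPTURE) and again send no AF trigger, so the existing focus is reused.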
I'm trying to build a simple AR scene with an NFT image that I created with genTextData. The result works fairly well in the Unity editor, but once compiled and run on an Android device, the camera resolution is very poor and there is no focus at all.
My marker is rather small (a 3 cm picture), and the camera image is so blurred that the AR cannot identify the marker from far away. I have to put the phone right in front of it (still very blurred), and it will then show my object, but with a lot of flickering and jittering.
I tried playing with the filter fields (sample rate / cutoff, etc.); that helped a little with the flickering of the object, but it still never detects the marker from far away, and I always have to hold the phone right in front of it. The result I want is to detect the small marker (sharp resolution and/or good focus) from a fair distance, roughly the distance between your eyes and your computer screen.
The problem could be camera resolution and focus, or it could be something else, but I'm fairly sure the AR cannot identify the marker points because of the blurriness.
Any ideas or solutions for this problem?
You can have a look here:
http://augmentmy.world/augmented-reality-unity-games-artoolkit-video-resolution-autofocus
I compiled the Java part of the Unity plugin and set it to use the highest resolution your phone offers; the autofocus mode is also activated.
Tell me if that helps.
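If you would rather make the same change yourself than use my prebuilt plugin, the idea boils down to something like the sketch below against the old android.hardware.Camera API. The camera variable is a placeholder, and where exactly this goes inside the ARToolKit plugin's Java source depends on the plugin version.

    import android.hardware.Camera;

    // Use the largest preview size the device supports and continuous autofocus,
    // so a small marker stays sharp without tap-to-focus.
    Camera.Parameters params = camera.getParameters();
    Camera.Size best = params.getSupportedPreviewSizes().get(0);
    for (Camera.Size size : params.getSupportedPreviewSizes()) {
        if (size.width * size.height > best.width * best.height) {
            best = size;
        }
    }
    params.setPreviewSize(best.width, best.height);
    if (params.getSupportedFocusModes()
            .contains(Camera.Parameters.FOCUS_MODE_CONTINUOUS_VIDEO)) {
        params.setFocusMode(Camera.Parameters.FOCUS_MODE_CONTINUOUS_VIDEO);
    }
    camera.setParameters(params);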
I have a requirement in an Android application where I want to turn off the screen when no motion has been detected (motion detection is done using the camera).
The camera should keep running in the background and taking pictures while the screen is off, and as soon as motion is detected the screen should be turned on.
How can I do that on Android?
I have searched a lot and implemented a few things, but could not achieve the desired result.
Please help me out.
I don't think this is possible. To take a picture, the Camera class needs a SurfaceHolder, and it has to draw the preview frames on the screen for the capture to work.
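For context, this is the preview plumbing the old Camera class expects before it will capture anything; a rough sketch, where surfaceHolder has to come from a visible SurfaceView and jpegCallback is a placeholder:

    import java.io.IOException;
    import android.hardware.Camera;

    Camera camera = Camera.open();
    try {
        // The preview has to be attached to an on-screen surface on older API levels,
        // which is why capturing with the screen off is problematic.
        camera.setPreviewDisplay(surfaceHolder);
        camera.startPreview();
        camera.takePicture(null, null, jpegCallback);
    } catch (IOException e) {
        e.printStackTrace();
    }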
Is there a way I could show what the rear camera captures full-screen, so that it creates the illusion of the screen being see-through? It doesn't need to be perfect, just convincing enough; a little lag won't make any difference.
Is it possible to create such an effect using the phone camera? If yes, how can the effect be achieved (as in, what transformations to apply, etc.)?
(I already know how to create a simple camera preview.)
Edit: I now also know it has been done, http://gizmodo.com/5587749/the-samsung-galaxy-s-goes-see+through, but I still have no clue how to do this properly. I know trial and error is one way; the other is calculating what part of the scene the user would be seeing if the phone weren't there.
I think there would be some factors involved, such as:
viewing distance,
viewing angle,
camera zoom range,
camera focus,
camera quality,
phone orientation,
camera position (where is camera located on phone) etc.
So I don't feel this problem has a simple solution; if it does, please clarify with an answer.
Thanks for the help,
Shobhit
You can use standard 3D projection math to project a portion of the rear-camera image onto the display. You can manage this by assuming everything the camera sees is at a particular depth from the rear camera, and by assuming a particular viewpoint for the observer.
You can improve on this by looking for faces/eyes using the front camera. You can get a rough estimate of the viewing distance from the eye spacing, and assume a viewer position midway between the eyes. Of course, this only works for one viewer at a time (e.g., if your face tracker finds multiple faces, you can just pick one of them).
Also, you can improve the illusion by calibrating the camera and the screen so you can match the color and brightness from one to the other.
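As a back-of-the-envelope illustration of the fixed-depth idea above, the sketch below estimates which centered fraction of the rear-camera frame should be blown up to full screen. Every number in it is an assumption you would replace with calibrated values, and viewerDistM is exactly what the front-camera eye spacing could refine:

    // Simple pinhole geometry: assume the whole scene sits at one depth behind the phone.
    double screenWidthM  = 0.065;                 // physical screen width in metres (assumed)
    double viewerDistM   = 0.35;                  // assumed eye-to-screen distance
    double sceneDepthM   = 2.0;                   // assumed depth of the scene behind the phone
    double cameraHfovRad = Math.toRadians(66.0);  // assumed horizontal field of view of the camera

    // Width of the slice of the scene that the phone hides from the viewer, at the scene plane.
    double occludedWidthM = screenWidthM * (viewerDistM + sceneDepthM) / viewerDistM;
    // Width of the slice of the scene the camera captures at that same depth.
    double cameraWidthM = 2.0 * sceneDepthM * Math.tan(cameraHfovRad / 2.0);
    // Centered crop fraction of the camera frame to scale up to full screen.
    double cropFraction = Math.min(1.0, occludedWidthM / cameraWidthM);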
I want to develop a camera demo app using the following web page example.
http://marakana.com/forums/android/examples/39.html
After taking a photo, I want to display the captured image in the main activity. How can I do this? Thanks.
Best to use another example; I can't get this one to work, and judging by the other camera questions on SO, I think lots of people have the same problem. But before you give up on Marakana's example completely, try moving the elements OUTSIDE the element, and include the one Marko forgot.
By doing that I got past the RuntimeException, and now I get a completely different error, one I think is specific to my phone: it is trying to create a preview with an 'invalid preview size' of 480x604. But Marko's code sets the size from the size of the preview surface that was created, so the phone itself is defaulting to an invalid size for the preview surface.
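A common workaround for that kind of 'invalid preview size' error is to pick the closest size the camera actually supports rather than passing the surface size straight through; a sketch, with surfaceWidth and surfaceHeight being whatever surfaceChanged() reported:

    import android.hardware.Camera;

    // Choose the supported preview size closest to the surface dimensions.
    Camera.Parameters params = camera.getParameters();
    Camera.Size best = null;
    int bestDiff = Integer.MAX_VALUE;
    for (Camera.Size size : params.getSupportedPreviewSizes()) {
        int diff = Math.abs(size.width - surfaceWidth) + Math.abs(size.height - surfaceHeight);
        if (diff < bestDiff) {
            best = size;
            bestDiff = diff;
        }
    }
    params.setPreviewSize(best.width, best.height);
    camera.setParameters(params);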
I developed a small video camera application. It all works fine, except for focus.
I understand I need to call camera.autoFocus(), but I don't really know where the right place to put that call is.
Has anyone ever succeeded in autofocusing a video camera on Android?
Thanks,
Eli
It's probably a matter of preference based on how you think users will use your app.
I think a common convention is to do an auto-focus when the user touches the scene in the preview. Most OEM camera apps seem to do this.
Doing auto-focus after zooming is also a common thing.
Finally, you might want to have a look at the zxing project (the barcode scanner app), which has a nifty continuous autofocus approach that might be of use; since you're capturing video, though, it might not be ideal, as the focus transitions might be too noticeable.
http://code.google.com/p/zxing/source/browse/trunk/android/src/com/google/zxing/client/android/camera/AutoFocusCallback.java?r=1698
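For the tap-to-focus convention mentioned above, a minimal sketch against the old Camera API might look like this, where previewView and camera are placeholders for your preview SurfaceView and open Camera instance:

    import android.hardware.Camera;
    import android.view.MotionEvent;
    import android.view.View;

    // Trigger a single autofocus run whenever the user taps the preview.
    previewView.setOnTouchListener(new View.OnTouchListener() {
        @Override
        public boolean onTouch(View v, MotionEvent event) {
            if (event.getAction() == MotionEvent.ACTION_DOWN) {
                camera.autoFocus(new Camera.AutoFocusCallback() {
                    @Override
                    public void onAutoFocus(boolean success, Camera cam) {
                        // Focus run finished; for video, just leave the lens where it is.
                    }
                });
            }
            return true;
        }
    });

If you also support zoom, calling camera.autoFocus() once right after the zoom level changes covers the second convention mentioned above.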