I am not sure exactly how to express this, but you must have noticed that the Android camera automatically adjusts the 'look' of the camera preview based on what it is pointed at. For example, if you point the camera directly at a light, it darkens the area around the light so the light itself shows up without blowing out. I have fiddled with many of the settings in the camera app but couldn't find any way to stop this automatic adjustment.
So what is this adjustment actually called? And can I turn this feature off/on from code?
I'm looking at the docs at http://developer.android.com/reference/android/hardware/Camera.Parameters.html
and I'm not seeing anything that would let the user tap on a specific point in the camera preview and have that become the point on which the camera tries to focus.
Is this simply missing, or am I overlooking how it can be done?
In Android 4.x this is possible with setFocusAreas. You'll have to check getMaxNumFocusAreas first to see whether this feature is supported on your device and how many areas you can use.
Then you'll need to convert the user's touch coordinates to the coordinate system used by the Camera.Area object (described here), and call setFocusAreas with the result. From then on, calls to autoFocus will use that region when selecting focus.
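A minimal sketch of that conversion, assuming the preview fills the view and ignoring display rotation (both of which a real app must handle); the helper name focusOnTouch and the 200x200 area size are my own choices, not from the answer above:

    import android.graphics.Rect;
    import android.hardware.Camera;
    import android.view.MotionEvent;
    import android.view.View;

    import java.util.ArrayList;
    import java.util.List;

    public class TapToFocusHelper {

        // Maps a touch on the preview view to a Camera.Area (-1000..1000 in both
        // axes, relative to the current field of view) and triggers autofocus.
        public static void focusOnTouch(Camera camera, View previewView, MotionEvent event) {
            Camera.Parameters params = camera.getParameters();
            if (params.getMaxNumFocusAreas() <= 0) {
                return; // focus areas not supported on this device
            }

            // Scale view coordinates (0..width, 0..height) to -1000..1000.
            int centerX = (int) (event.getX() / previewView.getWidth() * 2000) - 1000;
            int centerY = (int) (event.getY() / previewView.getHeight() * 2000) - 1000;

            // Build a small rectangle around the touch point, clamped to the valid range.
            int half = 100;
            Rect area = new Rect(
                    clamp(centerX - half), clamp(centerY - half),
                    clamp(centerX + half), clamp(centerY + half));

            List<Camera.Area> focusAreas = new ArrayList<Camera.Area>();
            focusAreas.add(new Camera.Area(area, 1000)); // weight may be 1..1000
            params.setFocusAreas(focusAreas);
            camera.setParameters(params);

            camera.autoFocus(new Camera.AutoFocusCallback() {
                @Override
                public void onAutoFocus(boolean success, Camera camera) {
                    // success tells you whether focus was achieved on that area
                }
            });
        }

        private static int clamp(int value) {
            return Math.max(-1000, Math.min(1000, value));
        }
    }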
I am using a custom camera application with a preview, based on the SDK example, on a Nexus S (2.3).
Everything works fine, including taking sharp pictures, but the preview does not seem to adjust its level (intensity) the way the built-in camera does:
when I preview dark objects, the built-in camera compensates and increases the intensity, while the custom preview stays at the default intensity, making the preview pretty dark. The images themselves come out at the correct intensity.
It is not related to white balance, nor to camera exposure.
I do not want a full preview processing chain - I just want to enable automatic luminance level control. Is that possible using the standard API?
Thanks
If you use autofocus on the preview, I believe you will get the results you expect.
See Camera.autoFocus and Camera.AutoFocusCallback.
I believe you get one autofocus pass per call; that is, it's not continuous. You can either implement a way to call it repeatedly using handlers, or simply put a touch listener on your SurfaceView and trigger autofocus when the user taps the preview.
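A rough sketch of the touch-listener variant, assuming camera and surfaceView are fields that are already initialized elsewhere (the names are mine):

    surfaceView.setOnTouchListener(new View.OnTouchListener() {
        @Override
        public boolean onTouch(View v, MotionEvent event) {
            if (event.getAction() == MotionEvent.ACTION_DOWN) {
                // One-shot focus pass; the callback fires once focus completes.
                camera.autoFocus(new Camera.AutoFocusCallback() {
                    @Override
                    public void onAutoFocus(boolean success, Camera camera) {
                        // optionally give the user some feedback here
                    }
                });
            }
            return true;
        }
    });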
I'm trying to find out whether there is an infrared source (namely an infrared LED) in view of the camera on an Android device.
Since the camera captures infrared light (I can see the LEDs light up in the preview/pictures), I thought it should somehow be possible to detect whether the camera is currently capturing infrared signals. But because the IR 'color' is translated into a visible color (purple-ish), it's apparently not as easy as just checking whether there is any purple in the picture, since that might be real purple rather than infrared.
The Android reference tells me I can get the picture in different image formats (YCbCr, YUV, ...), but none of these formats seem to be of much help.
Now my idea is to somehow get the "original" data from the camera, which still includes the information about what is infrared and what is not, or to basically revert the infrared-to-visible-light conversion that apparently happens automatically in the background. Any idea how I might achieve that?
Good question. If I take the remote control for a hi-fi or TV and press volume up/down, I can see the IR light source with the Nexus One camera: it shows up as a light purple flashing. So the Nexus One camera does pick up IR :)
Digital cameras use CCD and other similar sensors to capture infrared images. Although all digital cameras available on the market are sensitive to infrared light, they are equipped with infrared-blocking filters. The main reason for this is that consumer cameras are designed to capture visible light. But sometimes these filters are used together, giving very interesting in-camera effects like false color, wood effects etc. To start with infrared photography, all you need to have is:
A digital camera that is sensitive to infrared light.
A visible-light blocking filter (e.g. a Wratten 89B filter).
Image-editing software, such as Photoshop.
http://www.smashingmagazine.com/2009/01/11/40-incredible-near-infrared-photos/
I wrote a very simple online radio player that makes an asynchronous call to MediaPlayer. Some devices play it, some don't, and in different ways. From my sad experience with the simple MediaPlayer, you will have to write something like five versions to get it working on just a few devices. Each manufacturer (Samsung, Sony Ericsson, Motorola) has added or removed audio/video features, and you end up wondering why your code doesn't work on THAT device when it's supposed to be platform-independent Java...
I think it would be possible to get the original data from the camera via the NDK, but I haven't tested it.
It provides headers and libraries that allow you to build activities, handle user input, use hardware sensors, access application resources, and more, when programming in C or C++.
http://developer.android.com/sdk/ndk/index.html
If somebody has found a solution using the SDK that works on phones from two different manufacturers, please let me know!
My friend, the only way to find an infrared source on Android is not to rely on color values,
because color spaces are designed only for visible light, so they are of no use for invisible light that the sensor happens to register as an RGB value.
Logically, color space values are only meaningful for visible light,
and they were never created to represent wavelengths outside our visible range.
There is no correct way to interpret visible color space values such as RGB or YUV as infrared unless you know the exact camera sensor details from its white papers and have tested the sensor with multiple highly accurate filters for each specific wavelength, which is complete overkill for an app.
To keep things simple, use an external infrared filter for the specific wavelengths you care about,
and remove the filter inside the phone or tablet camera if you can physically modify it.
The other way around: use a known model of webcam, remove the filter on top of its sensor, and use external filters to protect the sensor from UV light and so on.
Is there a way I could show what the rear camera captures full-screen, so that it creates the illusion of the screen being see-through? It doesn't need to be perfect, just convincing enough; a little lag won't make any difference.
Is it possible to create such an effect using the phone camera? If yes, how can the effect be achieved (as in, what transformations to apply, etc.)?
(I already know how to create a simple camera preview.)
Edit: I now know it has been done, http://gizmodo.com/5587749/the-samsung-galaxy-s-goes-see+through, but I still have no clue how to do this properly. I know trial and error is one way; the other is calculating what part of the scene the user should be seeing if the phone weren't there.
I think there would be some factors involved like -
viewing distance,
viewing angle,
camera zoom range,
camera focus,
camera quality,
phone orientation,
camera position (where the camera is located on the phone), etc.
So I don't feel this problem has a simple enough solution; if that is not so, please clarify with an answer.
Thanks for help,
Shobhit,
You can use standard 3D projection math to project a portion of the rear camera image onto the display; you can manage this by assuming everything the camera sees is at a particular depth behind the rear camera, and by assuming a particular viewpoint for the observer.
You can improve on this by looking for faces/eyes using the front camera. You can get a rough estimate of the viewing distance from the eye spacing, and assume a viewer position midway between the eyes. Of course, this only works for one viewer at a time (e.g., if your face tracker finds multiple faces, you can select one of them).
Also, you can improve the illusion by calibrating the camera and screen so you can match the color and brightness from one to the other.
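To make the flat-scene approximation above concrete, here is a rough, hypothetical sketch (the names and the single-axis simplification are my own, not from the answer): assuming the viewer is centered at a known distance in front of the screen and everything the rear camera sees lies on a plane at a fixed depth behind the phone, similar triangles give the fraction of the camera frame to crop and stretch to fill the screen.

    public class SeeThroughMath {

        // screenWidthM:  physical width of the display, in meters
        // viewerDistM:   assumed distance from the viewer's eye to the screen
        // sceneDepthM:   assumed distance from the phone to the (flat) scene
        // cameraFovDeg:  horizontal field of view of the rear camera
        // Returns the fraction of the camera frame's width to keep so that the
        // crop roughly matches what the phone occludes from the viewer's viewpoint.
        public static double visibleWidthFraction(double screenWidthM,
                                                  double viewerDistM,
                                                  double sceneDepthM,
                                                  double cameraFovDeg) {
            // Width of the scene plane hidden by the screen, as seen from the eye
            // (similar triangles through the screen edges).
            double occludedWidthM = screenWidthM * (viewerDistM + sceneDepthM) / viewerDistM;

            // Width of the scene plane covered by the camera at that depth.
            double cameraWidthM = 2.0 * sceneDepthM * Math.tan(Math.toRadians(cameraFovDeg) / 2.0);

            // Clamp to the full frame; the same idea applies to the height,
            // and the front-camera eye estimate above can supply viewerDistM.
            return Math.min(1.0, occludedWidthM / cameraWidthM);
        }
    }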
I developed a small video camera application. It all works fine except focus.
I understand I need to call Camera.autoFocus, but I don't really know where the right place is to put the call.
Has anyone ever succeeded in autofocusing a video camera on Android?
Thanks
Eli
It's probably a matter of preference based on how you think users will use your app.
I think a common convention is to do an auto-focus when the user touches the scene in the preview. Most OEM camera apps seem to do this.
Doing auto-focus after zooming is also a common thing.
Finally, you might want to have a look at the zxing project (the barcode scanner app), which has a nifty continuous autofocus approach that might be of use, though since you're capturing video, it might not be ideal, as the focus transitions might be too noticeable.
http://code.google.com/p/zxing/source/browse/trunk/android/src/com/google/zxing/client/android/camera/AutoFocusCallback.java?r=1698
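A minimal sketch of that kind of handler-driven continuous autofocus (this is my own simplification, not the zxing code; the class name and the 2-second delay are arbitrary):

    import android.hardware.Camera;
    import android.os.Handler;

    public class ContinuousAutoFocus implements Camera.AutoFocusCallback {
        private static final long DELAY_MS = 2000;

        private final Handler handler = new Handler();
        private final Camera camera;
        private boolean active;

        public ContinuousAutoFocus(Camera camera) {
            this.camera = camera;
        }

        public void start() {
            active = true;
            camera.autoFocus(this); // first one-shot focus pass
        }

        public void stop() {
            active = false;
            handler.removeCallbacksAndMessages(null);
            camera.cancelAutoFocus();
        }

        @Override
        public void onAutoFocus(boolean success, final Camera camera) {
            if (!active) {
                return;
            }
            // Schedule the next one-shot focus pass after a short pause; the
            // visible refocusing is why this may not suit video capture.
            handler.postDelayed(new Runnable() {
                @Override
                public void run() {
                    if (active) {
                        camera.autoFocus(ContinuousAutoFocus.this);
                    }
                }
            }, DELAY_MS);
        }
    }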