Android - Reducing Glare on Camera (Photo/Video) [closed]

I am attempting to reduce the glare produced when taking a photo or video of a reflective surface: a window, glasses, or another mobile device's screen.
I have done some research on the subject, and it seems that some algorithms exist, but I have yet to find a coded implementation.
The reason I need this is that I am building an application that reads different colours from another device's screen using your device's camera.
If there is a lot of glare, the colours won't be read correctly. The app needs to be able to distinguish between 16 colours.
Are there any existing implementations, and if so, how would I integrate them into an Android app?

I recommend applying a threshold on top of a Gaussian blur to identify bright spots in your image, then removing them. OpenCV is the industry standard and your best bet for image manipulation. I suggest experimenting on a computer first to get your processing pipeline right before moving it onto a phone (a rough sketch follows the links below). Also, stay away from anything too novel or complicated.
[1] How to detect Hotspots in an image
[2] http://opencv.org/platforms/android.html
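For illustration only, here is a minimal desktop sketch of that blur-then-threshold idea using OpenCV's Java bindings. The file names, the threshold value (220) and the kernel size (11) are assumptions you will need to tune for your own images, and the inpainting call is just one possible way to remove the detected hotspots.

    // Desktop-first experiment, as suggested above: detect glare hotspots
    // with a Gaussian blur + threshold, then inpaint them away.
    import org.opencv.core.Core;
    import org.opencv.core.Mat;
    import org.opencv.core.Size;
    import org.opencv.imgcodecs.Imgcodecs;
    import org.opencv.imgproc.Imgproc;
    import org.opencv.photo.Photo;

    public class GlareRemoval {
        public static void main(String[] args) {
            System.loadLibrary(Core.NATIVE_LIBRARY_NAME);

            Mat src = Imgcodecs.imread("screen_photo.jpg");   // hypothetical input file
            Mat gray = new Mat();
            Imgproc.cvtColor(src, gray, Imgproc.COLOR_BGR2GRAY);

            // Blur first so the threshold picks up broad bright regions (glare)
            // rather than isolated bright pixels.
            Mat blurred = new Mat();
            Imgproc.GaussianBlur(gray, blurred, new Size(11, 11), 0);

            // Anything above ~220 is treated as a glare hotspot (tune this).
            Mat mask = new Mat();
            Imgproc.threshold(blurred, mask, 220, 255, Imgproc.THRESH_BINARY);

            // Fill the masked hotspots from their surroundings.
            Mat result = new Mat();
            Photo.inpaint(src, mask, result, 5, Photo.INPAINT_TELEA);

            Imgcodecs.imwrite("screen_photo_deglared.jpg", result);
        }
    }

On Android the same calls work once the OpenCV library has been initialised; you would feed camera frames in as Mat objects instead of reading files.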

Related

Detect physical heartbeat using the front facing camera [closed]

I want to detect the user's physical heartbeat using the front-facing camera of the phone.
Some apps detect the heartbeat from a finger placed over the camera by tracking micro-changes in the red colour intensity of the finger while the LED flash is on.
I'm interested in determining the user's heartbeat based on the red colour intensity of their face.
The solution can assume that the lighting is ideal and that only the red colour intensity of the user's face is changing.
Does anyone know of a solution for this?
Is there a ready made algorithm for that?
Did anyone write an implementation for that?
Thank you!
Yes, indeed it's possible. Take a look at this TED talk, which demonstrates exactly that. In the video the presenter provides a link to the source code, as well as pre-built tools that do this sort of processing on existing video:
http://www.ted.com/talks/michael_rubinstein_see_invisible_motion_hear_silent_sounds_cool_creepy_we_can_t_decide?language=en
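If you only want a rough heart-rate number rather than the full Eulerian video magnification shown in the talk, a much simpler (and much less robust) sketch is to average the red channel over the detected face region in each frame and count peaks in that signal. Everything below is an illustrative assumption, not an established algorithm:

    // Naive sketch: estimate BPM from a per-frame mean-red-intensity signal.
    // meanRedPerFrame and frameRate are assumed to come from your camera pipeline.
    public class HeartRateEstimator {

        /** Estimate beats per minute from the mean red intensity of each frame. */
        public static double estimateBpm(double[] meanRedPerFrame, double frameRate) {
            int n = meanRedPerFrame.length;

            // Remove the slow lighting trend with a moving average (~1 s window).
            int window = (int) Math.max(1, frameRate);
            double[] detrended = new double[n];
            for (int i = 0; i < n; i++) {
                double sum = 0;
                int count = 0;
                for (int j = Math.max(0, i - window); j <= Math.min(n - 1, i + window); j++) {
                    sum += meanRedPerFrame[j];
                    count++;
                }
                detrended[i] = meanRedPerFrame[i] - sum / count;
            }

            // Count local maxima above the trend line as heartbeats.
            int peaks = 0;
            for (int i = 1; i < n - 1; i++) {
                if (detrended[i] > 0
                        && detrended[i] > detrended[i - 1]
                        && detrended[i] >= detrended[i + 1]) {
                    peaks++;
                }
            }

            double durationSeconds = n / frameRate;
            return peaks * 60.0 / durationSeconds;
        }
    }

In practice you would band-pass filter the signal to the plausible heart-rate range (roughly 0.7 to 4 Hz) before counting peaks, which is essentially what the more sophisticated approaches do.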

Human body shape detection on image from Android device camera [closed]

I must implement an application that draws a heart on a person seen through an Android device's camera (augmented reality stuff).
So I must detect the shape/contour of the upper part of the human body (head, neck, shoulders) in order to know where the heart should be placed. This must work in real time.
I've searched for this but haven't found anything useful. Does anybody know of tutorials or examples related to this? Thanks.
I would recommend using OpenCV if memory isn't a problem (your app would be around 50 MB). There are plenty of tutorials on how to use OpenCV. Try the Cascade Classifier.
For face detection specifically, as far as I know, Android has a built-in API for this: Camera.Face
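As a rough illustration of the Cascade Classifier suggestion, here is a minimal sketch using OpenCV's Java bindings and the upper-body Haar cascade that ships with OpenCV (haarcascade_upperbody.xml). The file paths and the heart-position offsets are assumptions; in a real app you would run this on camera preview frames and draw the overlay yourself.

    // Detect upper bodies in a frame and pick a rough anchor point for the heart.
    import org.opencv.core.Core;
    import org.opencv.core.Mat;
    import org.opencv.core.MatOfRect;
    import org.opencv.core.Point;
    import org.opencv.core.Rect;
    import org.opencv.imgcodecs.Imgcodecs;
    import org.opencv.imgproc.Imgproc;
    import org.opencv.objdetect.CascadeClassifier;

    public class UpperBodyDetector {
        public static void main(String[] args) {
            System.loadLibrary(Core.NATIVE_LIBRARY_NAME);

            CascadeClassifier upperBody =
                    new CascadeClassifier("haarcascade_upperbody.xml"); // assumed path
            Mat frame = Imgcodecs.imread("camera_frame.jpg");           // assumed input
            Mat gray = new Mat();
            Imgproc.cvtColor(frame, gray, Imgproc.COLOR_BGR2GRAY);

            MatOfRect detections = new MatOfRect();
            upperBody.detectMultiScale(gray, detections);

            for (Rect body : detections.toArray()) {
                // Rough guess: the heart sits slightly left of centre in the chest area.
                Point heart = new Point(body.x + body.width * 0.45,
                                        body.y + body.height * 0.75);
                System.out.println("Draw heart overlay at " + heart);
            }
        }
    }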

Wrapper Library for Android Camera API [closed]

Is there a wrapper library for the Android Camera API that covers all the pitfalls across the different API versions (e.g. checks whether a front camera, auto-focus or flash is available and provides UI controls for them) and works around the hardware bugs in different handsets?
For anybody who stumbles in here searching for a camera library: there is the CWAC-Camera library, which helps tame the somewhat unwieldy Android Camera API while providing better support across various devices:
https://github.com/commonsguy/cwac-camera
Hope this helps!
Edit: the CWAC-Camera library is now deprecated; use its successor instead:
https://github.com/commonsguy/cwac-cam2
If you aren't so ambitious, Square Camera is a very simple approach to the task and seems to work nicely: https://github.com/boxme/SquareCamera
No video recording and no options, but it focuses well, and the square approach avoids issues with resolution and portrait/landscape modes. It seems to work on devices where CWAC2 doesn't.
I'm facing the same problem right now. I just can't get my head around the fact that it is so hard to do such a simple thing on Android: take a picture and save it to the device.
Anyway, I just stumbled upon this library. I haven't tried it yet; maybe it helps you: https://github.com/girishnair12345/Girish-Camera-Library-Project
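For reference, a bare-bones sketch of "take a picture and save it" with the classic android.hardware.Camera API (deprecated in newer API levels) looks roughly like this. The throwaway SurfaceTexture trick and the output path are assumptions, and runtime permissions and error handling are left out for brevity:

    // Minimal capture-and-save with the old Camera API (no UI preview shown).
    import android.graphics.SurfaceTexture;
    import android.hardware.Camera;

    import java.io.File;
    import java.io.FileOutputStream;
    import java.io.IOException;

    public class SimplePictureTaker {

        public static void takeAndSave(final File outputFile) throws IOException {
            final Camera camera = Camera.open();

            // Many devices refuse to capture without an active preview,
            // so attach a throwaway SurfaceTexture.
            camera.setPreviewTexture(new SurfaceTexture(0));
            camera.startPreview();

            camera.takePicture(null, null, new Camera.PictureCallback() {
                @Override
                public void onPictureTaken(byte[] data, Camera cam) {
                    try (FileOutputStream out = new FileOutputStream(outputFile)) {
                        out.write(data);   // data is already JPEG-encoded
                    } catch (IOException e) {
                        e.printStackTrace();
                    } finally {
                        cam.stopPreview();
                        cam.release();
                    }
                }
            });
        }
    }

Wrapper libraries like the ones above exist precisely because this minimal version still needs per-device tweaks (rotation, focus, preview sizes) to work reliably everywhere.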

Loading and showing high resolution images on mobile devices [closed]

I'm looking for a framework or a solution for loading and showing high-resolution images on mobile devices, where viewing them at their native resolution is compulsory. Scaling is optional.
I'm talking about resolutions exceeding 2000x1000 pixels.
Thanks for your kind help.
Without going into much detail, you could probably use a TextureRegion with libgdx to display specific parts of your image on the screen.
I even think there's an example on the wiki you could use.
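If you go the libgdx route, a minimal sketch of the TextureRegion idea might look like the following. The file name and region size are assumptions, and the source image still has to fit within the device's maximum texture size (truly huge images would need tiling):

    // Show a native-resolution window into a large image with libgdx.
    import com.badlogic.gdx.ApplicationAdapter;
    import com.badlogic.gdx.Gdx;
    import com.badlogic.gdx.graphics.GL20;
    import com.badlogic.gdx.graphics.Texture;
    import com.badlogic.gdx.graphics.g2d.SpriteBatch;
    import com.badlogic.gdx.graphics.g2d.TextureRegion;

    public class RegionViewer extends ApplicationAdapter {
        private SpriteBatch batch;
        private Texture texture;
        private TextureRegion region;

        @Override
        public void create() {
            batch = new SpriteBatch();
            texture = new Texture(Gdx.files.internal("huge_image.png")); // assumed asset
            // A 1080x1920 window into the image at its native resolution.
            region = new TextureRegion(texture, 0, 0, 1080, 1920);
        }

        @Override
        public void render() {
            Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
            batch.begin();
            batch.draw(region, 0, 0);
            batch.end();
        }

        @Override
        public void dispose() {
            batch.dispose();
            texture.dispose();
        }
    }

Panning is then just a matter of changing the region's offset each frame.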
Check out Apple's PhotoScroller example. I have successfully used it with images over 100 megapixels.

Video making software for Android Tablet [closed]

I want to make a video demo of my application in order to present it. How can I record a video of my application while it is running on an Android tablet?
I've tried to come up with ways to do this, to no avail. The best I've ever been able to achieve is sticking the device in front of a nice camera.
However, http://www.bluestacks.com/ seems like it might be promising for this purpose once it is released. If you can run your app on a Windows machine (and it is much better than the emulator), then you could record that section of your screen with something like Fraps.
Edit: BlueStacks has since opened up for beta. For me it runs similarly to the emulator, so it does not provide a whole lot of benefit for recording your application. If you don't already have a development environment with an emulator set up, though, BlueStacks will get you recording sooner.
