I want to develop a camera demo app using the following web page example.
http://marakana.com/forums/android/examples/39.html
After taking a photo, I want to display the captured image in the main activity. How can I do this? Thanks.
Best to use another example. I can't get this one to work. Judging by the other Camera questions on SO, I think lots of people have the same problem. But before you give up on Marakana's example completely, try moving the elements OUTSIDE the element. And include the Marko forgot.
By doing that, I got past the RuntimeException, and now get a completely different error, one I think is specific to my phone: it is trying to create a preview with an 'invalid preview size' of 480x604. But Marko's code takes the size from the preview surface that gets created, so the phone itself must be defaulting to an invalid size for the preview surface.
I am making an application where the user has to be suggested about the quality of the picture he is about to take.
Like "Brightness is low" and "Your environment is dim".
To achieve this, we need to put an overlay message on the camera preview while the user is viewing the subject.
Does anyone have an idea of how we could get the image brightness from the camera preview, using sensors or some other way, in live mode, even before the picture is taken?
Any help would be appreciated.
Using the camera parameters as pointed out by #appleskin, would either of these methods help: setWhiteBalance() or setExposureCompensation()? If you want to adjust brightness at run time, you can use a native NDK brightness effect; you can easily get the .so files from GitHub and use them according to your requirements.
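If the goal is just to estimate how bright the scene is before the shot, one rough approach (a sketch, separate from the parameter methods above; it assumes an open android.hardware.Camera instance called camera and the default NV21 preview format) is to average the luminance plane of each preview frame:

camera.setPreviewCallback(new Camera.PreviewCallback() {
    @Override
    public void onPreviewFrame(byte[] data, Camera cam) {
        Camera.Size size = cam.getParameters().getPreviewSize();
        int pixels = size.width * size.height;   // in NV21 the first width*height bytes are luminance
        long sum = 0;
        for (int i = 0; i < pixels; i++) {
            sum += data[i] & 0xFF;
        }
        int brightness = (int) (sum / pixels);   // roughly 0 = black, 255 = very bright
        // e.g. if (brightness < 60) show the "Your environment is dim" overlay
    }
});

The threshold of 60 is only an illustrative value; you would tune it by experiment.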
I am currently working on an application in Android Studio that messes with the colors of the live camera feed from a phone's camera. For example, I may want to filter out all the reds, or maybe I want to make the displayed camera image black-and-white.
However, I haven't really found much on how to do this. I've found tutorials on using both the deprecated Camera class and the android.hardware.camera2 package. My preferred sample code was for camera2, found directly here (takes you directly to the Java class files, not the whole project).
So does anyone know how to use camera2 to do what I want? Do I need to use the deprecated Camera class instead? My idea is that I need an activity whose main job is displaying images, while behind the scenes the phone camera is running, sending the image (in whatever format, e.g. a Bitmap) to have its colors messed with (by some code I will write), which then sends the image to be displayed in the main activity.
So that is three main pieces: (1) Camera to Bitmap, to get what is currently seen by the phone Camera and store it in code; (2) mess with the colors of the Bitmap to distort the current view in my desired way; and (3) then a way of taking the resulting distorted view and displaying that on the screen. Of course, as mentioned, it's the first and last of the three just mentioned that I really need help with.
Please let me know what other details will be helpful to know about.
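For what it's worth, here is a minimal sketch of all three pieces using the deprecated Camera class, since that is the simplest way to get each preview frame as a Bitmap (the same idea works with camera2 and an ImageReader). previewImageView is a hypothetical ImageView in your layout, and compressing every frame to JPEG is slow, so treat this as a proof of concept rather than a finished pipeline:

camera.setPreviewCallback(new Camera.PreviewCallback() {
    @Override
    public void onPreviewFrame(byte[] data, Camera cam) {
        Camera.Size size = cam.getParameters().getPreviewSize();

        // (1) Camera to Bitmap: preview frames arrive as NV21, so convert via YuvImage.
        YuvImage yuv = new YuvImage(data, ImageFormat.NV21, size.width, size.height, null);
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        yuv.compressToJpeg(new Rect(0, 0, size.width, size.height), 90, out);
        byte[] jpeg = out.toByteArray();
        Bitmap frame = BitmapFactory.decodeByteArray(jpeg, 0, jpeg.length);

        // (2) Mess with the colors: here, drop saturation to 0 for black-and-white.
        ColorMatrix cm = new ColorMatrix();
        cm.setSaturation(0f);
        Paint paint = new Paint();
        paint.setColorFilter(new ColorMatrixColorFilter(cm));
        Bitmap filtered = Bitmap.createBitmap(frame.getWidth(), frame.getHeight(), Bitmap.Config.ARGB_8888);
        new Canvas(filtered).drawBitmap(frame, 0, 0, paint);

        // (3) Display the distorted view (previewImageView is a placeholder ImageView).
        previewImageView.setImageBitmap(filtered);
    }
});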
I'm trying to develop an app for Android, and I would need to get uncompressed pictures with a resolution as high as possible from the camera. I tried takePicture's rawCallback and postviewCallback, but they are not working.
Right now I'm trying with OpenCV (version 2.4) using VideoCapture, but I'm stuck in the default 960x720, which is poor for what I need; and my phone, a Samsung Galaxy S3, is able to provide, theoretically, up to 8Mpx (3,264×2,448 for pictures, and 1,920×1,080 for video, according to Wikipedia). VideoCapture.set(Highgui.CV_CAP_PROP_FRAME_WIDTH/HEIGHT, some number) makes the camera return a black image as far as I've found.
Is there any way to obtain a higher resolution, either through OpenCV or with the Android API, without compressing?
I'm really sorry if this has been asked before; I have been looking for days and I have found nothing.
Thank you for your time!
EDIT: Although it is not exactly what I was asking, I found that there is a way to do something very similar: if you set a PreviewCallback for the Camera, using setPreviewCallback, you do get the raw picture data from the camera (at least on the S3 I'm working with). I leave it here in case somebody finds it useful in the future.
EDIT: A partial solution is explained in an answer below. To sum up,
vc.set(Highgui.CV_CAP_PROP_FRAME_WIDTH, desiredFrameWidth);
vc.set(Highgui.CV_CAP_PROP_FRAME_HEIGHT, desiredFrameHeight);
works under some conditions; please see below for further detail.
You have to get the supported camera preview resolutions by calling getSupportedPreviewSizes.
After this you can set any of those resolutions with the setPreviewSize method. And don't forget to call setParameters at the end. Actually, many OpenCV Android examples contain this information (look at sample3).
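In plain Camera API terms, the steps described above look roughly like this (a sketch; camera is an already opened android.hardware.Camera):

Camera.Parameters params = camera.getParameters();
List<Camera.Size> sizes = params.getSupportedPreviewSizes();
Camera.Size best = sizes.get(0);
for (Camera.Size s : sizes) {              // pick the largest supported preview size
    if (s.width * s.height > best.width * best.height) best = s;
}
params.setPreviewSize(best.width, best.height);
camera.setParameters(params);              // don't forget to apply the parameters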
In case anybody ever finds this useful, I found a (partial) solution: If your VideoCapture variable is called vc, this should work:
vc.set(Highgui.CV_CAP_PROP_FRAME_WIDTH, desiredFrameWidth);
vc.set(Highgui.CV_CAP_PROP_FRAME_HEIGHT, desiredFrameHeight);
Mind that the combination of width and height must be one of the supported picture sizes for your camera, otherwise you will just get a black image. You can get those through Camera.Parameters.getSupportedPictureSizes().
However, setting a high resolution appears to exceed the YUV conversion buffer's capacity, so I'm still struggling with that. I'm going to make a new separate question for that, to keep everything clearer: new thread
setPreviewSize does not set picture resolution. setPictureSize does.
So, the way I read the documentation, using EXTRA_OUTPUT tells the camera to save the file in a specific location. That's great, but it also means you get a full-size image. That's not so great.
How can I get just a small image but still specify the filename?
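For context, the two behaviours being contrasted are roughly these (a sketch; outputFile and the request codes are placeholders, and on newer Android versions you would hand out the Uri through a FileProvider instead of Uri.fromFile):

// Without EXTRA_OUTPUT: only a small thumbnail comes back in onActivityResult,
// as Bitmap thumb = (Bitmap) data.getExtras().get("data");
Intent small = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
startActivityForResult(small, REQUEST_THUMBNAIL);

// With EXTRA_OUTPUT: the full-size image is written to the file you name,
// and the result Intent typically comes back without the thumbnail.
Intent full = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
full.putExtra(MediaStore.EXTRA_OUTPUT, Uri.fromFile(outputFile));
startActivityForResult(full, REQUEST_FULL_SIZE);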
After trying to work with the built-in Camera activity for some time now I can advise you not to expect anything good from it because:
The built-in activity differs from version to version. For example, in the 2.2 emulator it even crashes when you try to take a (dummy) picture.
The Camera activity on real devices like the Samsung Galaxy S is different, i.e. it doesn't just look different, it has different code and a different set of bugs.
The original built-in Camera activity has a CROP feature, but it is not part of the public API and thus it is not a good idea to use it.
So far I found that to be safe when working with the camera I need to:
- create my own custom camera activity that lacks the fancy stuff like filters, etc. but is more configurable (I don't have it yet). I've tried to find a third-party camera app, but every one of them seems to be targeted at normal users, not developers, i.e. has many "cool" features but is slow / bloated / buggy / has a bad UI.
- create thumbnail images by myself outside of the Camera activity (for more control).
I really hope that I am missing something here and someone will correct me in the comments with appropriate solution...
I ended up just dealing with the large images by always scaling on the read. It would have been nice not to have to do that as I read in more than one place, but ...oh well...
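For reference, "scaling on the read" amounts to something like this (a sketch; photoPath and targetWidth are your own values):

// Decode just the bounds first, then decode the full-size file with an
// inSampleSize big enough to bring it down to roughly the target width.
BitmapFactory.Options bounds = new BitmapFactory.Options();
bounds.inJustDecodeBounds = true;
BitmapFactory.decodeFile(photoPath, bounds);

BitmapFactory.Options opts = new BitmapFactory.Options();
opts.inSampleSize = Math.max(1, bounds.outWidth / targetWidth);   // powers of 2 work best
Bitmap small = BitmapFactory.decodeFile(photoPath, opts);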
problem solved, although far from elegant.
I want to create a simple game for Android which is supposed to use the camera. The thing is that I don't really care about the resolution of the picture; what matters to me is speed. I need the picture to be taken the moment a key is pressed. The ideal solution would be to use a frame from the camera preview. I need this picture as a table of data for analysis, I don't need a file.
Is it possible to get a single frame from the camera/video camera preview?
Or maybe it's easier to get a low resolution picture?
Can you tell me which function would be useful here, and maybe give a short example?
How can I get access to picture data using such a function?
Thanks in advance!
Bye
You may find the ZXing barcode scanner's source code helpful.
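The basic trick, roughly what a scanner app does (sketched here from memory, not copied from the ZXing source), is to ask the camera for a single preview frame on demand with setOneShotPreviewCallback; the frame arrives as a raw byte array you can analyse directly, no file involved:

camera.setOneShotPreviewCallback(new Camera.PreviewCallback() {
    @Override
    public void onPreviewFrame(byte[] data, Camera cam) {
        Camera.Size size = cam.getParameters().getPreviewSize();
        // data is an NV21 byte array of size.width x size.height pixels; the first
        // width*height bytes are the luminance plane, handy for quick analysis.
    }
});

Call it when your key is pressed; because it reuses the frame the preview is already producing, it is much faster than takePicture().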