Android Camera Intent Image Quality

I'm making an Android application that uses the built-in camera through an intent. I'm doing some analysis on the photo on a pixel-by-pixel basis. My Wildfire returns 79000 pixels in a normal photo, and this is making the analysis a little slow.
Is there any way to set the quality/resolution of the camera or of the image returned when using intents? I've had a look around on Google and in the API docs and haven't come up with much. Does anyone know if this is even possible, or know of some other way to reduce the image resolution?
Any help greatly appreciated.

Why use the built-in application through the intent rather than writing your own capture code against the API? I have seen various problems when using the camera this way; after all, if you want full control, you shouldn't be using the intent. There are lots of working examples of taking photos through the API.
So, when using the camera through the API, you can use the setParameters() method of the Camera class to pass a Camera.Parameters object to your camera object and change various parameters.
The Camera.Parameters class contains a setPictureSize(int, int) method which you can use to change the size of your picture. You can use the getSupportedPictureSizes() method of Camera.Parameters to find out which picture sizes your device supports, and pick the one that fits you:
List<Camera.Size> sizes = mCamera.getParameters().getSupportedPictureSizes();
for (Camera.Size size : sizes) {
    // Do something with each supported size, e.g. offer it in a menu:
    menu.add(0, 1, 1, size.width + "x" + size.height);
}
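Once you've picked a size, write it back through setParameters(). A minimal sketch, assuming mCamera is your open Camera instance and that you want the smallest supported size to speed up the per-pixel analysis:
Camera.Parameters params = mCamera.getParameters();
// Pick the smallest supported picture size.
Camera.Size smallest = params.getSupportedPictureSizes().get(0);
for (Camera.Size s : params.getSupportedPictureSizes()) {
    if (s.width * s.height < smallest.width * smallest.height) {
        smallest = s;
    }
}
params.setPictureSize(smallest.width, smallest.height);
mCamera.setParameters(params);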
Also, you may be able to set your picture format to NV21 (via setPictureFormat(), on devices that support it). This is a raw YUV format, so you can easily subsample the pixels of the returned image yourself: take a single pixel as the average of every 4 pixels, and you reduce both your image width and height by a factor of two!
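If you go that route, here is a minimal subsampling sketch. It assumes the analysis only needs luminance (in NV21 the first width*height bytes are the Y plane), and subsampleLuma is just an illustrative name:
// Average each 2x2 block of Y samples, halving width and height.
static byte[] subsampleLuma(byte[] nv21, int width, int height) {
    int outW = width / 2, outH = height / 2;
    byte[] out = new byte[outW * outH];
    for (int y = 0; y < outH; y++) {
        for (int x = 0; x < outW; x++) {
            int srcX = x * 2, srcY = y * 2;
            int sum = (nv21[srcY * width + srcX] & 0xFF)
                    + (nv21[srcY * width + srcX + 1] & 0xFF)
                    + (nv21[(srcY + 1) * width + srcX] & 0xFF)
                    + (nv21[(srcY + 1) * width + srcX + 1] & 0xFF);
            out[y * outW + x] = (byte) (sum / 4);
        }
    }
    return out;
}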

Related

ARCore: Get Camera's resolution

Is there any way to get the camera's resolution (short of going down to the raw Android camera API)?
I did not find any getter, only a setter, session.setDisplayGeometry(), which is not really what I expect.
My goal is to know the camera picture's aspect ratio so I can crop it for my display, since the screen and the camera do not use the same size.
Thanks.
There isn't a way to access the camera's resolution through the ARCore API in the developer preview. I asked about the resolution in the comments of a separate question, and it looks like the camera resolution in the developer preview will always be 1920x1080.
There is a way to get the resolution of the camera:
Frame frame = session.update();
Camera camera = frame.getCamera();
int[] dims = camera.getImageIntrinsics().getImageDimensions();
Log.d("ARCORE", "Camera dimensions: " + dims[0] + " x " + dims[1]);
But it only gives me 640x480, and I don't think there's any way to change it right now, though it may report different values on different hardware.
(ARCore 1.3)
I know it's an old question, but the answer seems to have changed since then.
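For completeness: more recent ARCore releases also expose a CameraConfig API (Session.getSupportedCameraConfigs() / Session.setCameraConfig()) that lets you both inspect and choose the CPU image resolution. A sketch, assuming a recent ARCore version and that the session is paused when the config is applied:
// Pick the supported camera config with the largest CPU-accessible image.
CameraConfig best = null;
for (CameraConfig config : session.getSupportedCameraConfigs()) {
    android.util.Size size = config.getImageSize();
    if (best == null || size.getWidth() * size.getHeight()
            > best.getImageSize().getWidth() * best.getImageSize().getHeight()) {
        best = config;
    }
}
session.setCameraConfig(best); // must be called while the session is paused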

Is there any quality difference between an image captured using camera.takePicture(shutter callback, jpeg callback) and one from setOneShotPreviewCallback?

I'm an intermediate Android programmer. I have a simple application created for learning the camera. My app is using the camera.takePicture() method to register a JPEG callback and eventually capture the picture.
But I feel that it may also be possible to capture the image using setOneShotPreviewCallback() and providing a callback there.
My questions are:
Will there be any difference in image quality between the two approaches?
Are there any additional things to take care of when constructing an image from setOneShotPreviewCallback()?
Thanks in advance.
takePicture() uses (potentially) the camera's full resolution. The preview gives you the image shown on screen, which is usually closer to the screen's resolution. So the picture will in general be higher resolution and higher quality. Note also that you get JPEG-encoded data from the picture callback, but raw image buffer data (NV21 by default) in the preview callback.
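A minimal sketch of the two paths side by side, assuming mCamera is an open android.hardware.Camera with a running preview:
// Path 1: full-resolution still capture; the callback delivers JPEG bytes.
mCamera.takePicture(null, null, new Camera.PictureCallback() {
    @Override
    public void onPictureTaken(byte[] jpegData, Camera camera) {
        Bitmap still = BitmapFactory.decodeByteArray(jpegData, 0, jpegData.length);
        // ... use the full-quality still ...
    }
});

// Path 2: one preview frame; the callback delivers raw NV21 bytes, which
// must be converted (e.g. via YuvImage) before they can be viewed as JPEG.
mCamera.setOneShotPreviewCallback(new Camera.PreviewCallback() {
    @Override
    public void onPreviewFrame(byte[] nv21, Camera camera) {
        Camera.Size s = camera.getParameters().getPreviewSize();
        YuvImage yuv = new YuvImage(nv21, ImageFormat.NV21, s.width, s.height, null);
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        yuv.compressToJpeg(new Rect(0, 0, s.width, s.height), 90, out);
        // out.toByteArray() is a preview-resolution JPEG, lossier than path 1.
    }
});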
If the person uses the SurfaceView preview and captures the image with the camera, its quality will be good.

Android 'Smart Display' and issues with AIR/Camera Interactions

I'm in the process of attempting to use the camera and some motion-tracking AS3 classes to detect movement in front of a ViewSonic Smart Display, for the sake of a demo. I've gotten the app and the detection to work on other Android devices, but the Smart Display is presenting me with some odd issues.
It's a long shot that someone has encountered this, but here is the very simple camera setup code I reduced the issue down to:
var camera:Camera = Camera.getCamera();
camera.setMode(stage.stageWidth, stage.stageHeight, 30, true);
var video:Video = new Video(stage.stageWidth, stage.stageHeight);
video.attachCamera(camera);
My problem lies at the point of video.attachCamera(camera).
For some reason, this device takes that call as "display the video in a tiny window in the upper right-hand corner": the rest of the screen is blank black, with only a tiny (maybe 40x20 px) rectangle of video stream.
Image of it occurring...
Any help is much appreciated, thanks
The problem might be the values that you are passing to the camera with the setMode() method. You are trying to set the camera to capture at the width/height of the stage.
The camera likely does not have such a capture resolution, and as the documentation for setMode() states, it will try to find something that is close to what you have specified:
Sets the camera capture mode to the native mode that best meets the specified requirements. If the camera does not have a native mode that matches all the parameters you pass, the runtime selects a capture mode that most closely synthesizes the requested mode. This manipulation may involve cropping the image and dropping frames.
Now, granted, you would expect Flash to pick a resolution bigger than what is shown in your screenshot. But given the myriad of camera devices/drivers out there, it's possible this is simply not working well in your case.
You might start off by experimenting with more typical capture resolutions: 480x320, 640x480, 800x600, or at most 1024x768. Most applications on the web probably use the first or second of these.
So change:
camera.setMode(stage.stageWidth, stage.stageHeight, 30, true);
To:
camera.setMode(640, 480, 30, true);
Note that you can display the video at any size you want, but the capture resolutions you can use depend on your camera hardware/drivers/OS/etc. Typical capture resolutions have a 4:3 aspect ratio and are relatively small (not the full dimensions of the screen/stage). The capture resolution you use affects the quality of the video and the amount of network bandwidth needed to stream it. Generally (for streaming) you don't want a big capture resolution, but maybe that's less important in your motion-capture use case.

Android high res photo using camera preview

Is it possible to programmatically take a picture in full/high resolution? I use the camera preview and a surface with some custom overlay content. The problem is that the takePicture() call returns data only in low, preview-size resolutions.
Even if I check getSupportedPictureSizes(), the resolutions that are listed are far from the 5 Mpix that is the maximum resolution supported by the system camera. So the question is: can I take a photo at max resolution while using a custom camera preview, or do I have to call the system camera intent to get a full-res photo?
Yes you can; you should call setPictureSize(). See https://github.com/alexcohn/JBcamera for an example.
PS: Thanks, Benjamin, for drawing my attention back to this question. If I understand correctly, the author was upset with the resolution of the data array returned from the IMAGE_CAPTURE intent. But that is only the thumbnail; the actual hi-res image is written to a file. You can find this JPEG file and load it into your app in onActivityResult().
By default I think it is set to a low resolution. To change this, pass MediaStore.EXTRA_OUTPUT as an extra in your intent. See here for more detail:
http://developer.android.com/reference/android/provider/MediaStore.html#ACTION_IMAGE_CAPTURE
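A minimal sketch of that approach, inside your Activity; REQUEST_IMAGE_CAPTURE and photoFile are illustrative names, not part of the API:
private static final int REQUEST_IMAGE_CAPTURE = 1;
private File photoFile;

private void launchCamera() {
    photoFile = new File(getExternalFilesDir(null), "capture.jpg");
    Intent intent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
    // EXTRA_OUTPUT tells the camera app where to save the full-size JPEG.
    intent.putExtra(MediaStore.EXTRA_OUTPUT, Uri.fromFile(photoFile));
    startActivityForResult(intent, REQUEST_IMAGE_CAPTURE);
}

@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    super.onActivityResult(requestCode, resultCode, data);
    if (requestCode == REQUEST_IMAGE_CAPTURE && resultCode == RESULT_OK) {
        // The intent result only carries a thumbnail; the full image is in the file.
        Bitmap fullRes = BitmapFactory.decodeFile(photoFile.getAbsolutePath());
        // ... use fullRes ...
    }
}
(On Android 7.0 and later a plain file:// Uri is rejected and you would hand out a FileProvider content Uri instead, but Uri.fromFile() was the usual approach at the time of this question.)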

How is the camera preview connected with the final image output?

I've always been under the impression that the preview and the final output are not connected in any way; meaning that I can set the preview to some arbitrary dimensions and the final JPG will be whatever specific resolution I set in the params. But I just ran into a very odd situation where the image data coming back in the byte[] of the JPG callback differs depending on what dimensions I set my preview to.
Can someone enlighten me on what actual relationship the preview has to the final JPG (or point me to documentation on said relationship)?
TIA
[Edit]
As per ravi's answer, this was my assumption as well; however, based on the evidence, I see no alternative but to surmise that they are, in fact, directly connected. I'll post code if necessary (though there's a lot of it), but here's what I'm doing.
I have a preview screen where the user takes a photo of themselves. I then display the captured picture (from the JPG callback bitmap data) in a subsequent draw view and allow them to trace a shape over their photo. I then pass the points of their polygon into a class that cuts that shape out of the original image and gives back the cut image.
All of this works, BUT depending on how I present the PREVIEW, the polygon-cutting class crashes with an array index out of bounds as it tries to access pixels on the final image that simply don't exist. This effect is produced EXCLUSIVELY by altering the shape of the preview View's dimensions. I'm not altering ANYTHING else in the code, and yet, just by mis-shaping my preview view, I can reproduce this error 100% of the time.
I can't see any explanation other than that the preview and the final image are directly connected somehow, since I never operate on the preview's data; I only display it in a SurfaceView and then deal exclusively with the data from the JPG callback after the user has taken their photo.
There is no relation between the preview resolution and the final image that is captured.
They are completely independent (at least for still image capture). The preview resolution and aspect ratio are not interrelated with the final image resolution and aspect ratio in any way.
In the camera application that I have written, the preview is always VGA but the image I capture varies from 5M to VGA (depending on the device capability)
Perhaps if you can explain the situation it would be more helpful.
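To illustrate the intended independence, here's a minimal sketch (assuming an open Camera whose supported sizes include these values) that pairs a VGA preview with the largest supported picture size:
Camera.Parameters params = mCamera.getParameters();
params.setPreviewSize(640, 480); // what the SurfaceView shows
// Pick the largest supported picture size for takePicture().
Camera.Size largest = params.getSupportedPictureSizes().get(0);
for (Camera.Size s : params.getSupportedPictureSizes()) {
    if (s.width * s.height > largest.width * largest.height) {
        largest = s;
    }
}
params.setPictureSize(largest.width, largest.height);
mCamera.setParameters(params);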
We are currently developing a camera application and face very similar problems. In our case, we want to display a 16:9 preview while capturing a 4:3 picture. On most devices this works without any problems, but on some (e.g. Galaxy Nexus, LG Optimus 3D) the output picture depends on the preview you've chosen. On those devices, the resulting pictures are distorted when the preview ratio differs from the picture ratio.
We tried to fix this by switching the preview to a better-matching resolution just before capturing the image. But this does not work on some devices: errors occur when the preview is restarted after capture has finished.
We also tried to fix it by enlarging the SurfaceView to fullscreen width and "over fullscreen" height, to make a 16:9 preview out of a 4:3 one. But this does not work either, because a SurfaceView cannot be taller than the screen height.
So there IS a connection on SOME devices, and we really want to know how to fix or work around it.
