From my research so far, the Android camera intent cannot specify a picture size.
Since my app needs to be fast, I do not want to save a large picture to the SD card and then load it into a small Bitmap; that takes time. Also, I need a grayscale image rather than a color Bitmap. I know how to convert between them, but again, that takes time.
I plan to take a picture at a specified size and directly process the Y plane (grayscale) of the YUV data in memory.
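For example, with the default NV21 preview format I believe the grayscale data is just the leading Y plane, so the extraction I have in mind would be roughly this (just a sketch, names are only illustrative):
// Sketch (assumes the default NV21 preview format): the luminance (Y) plane
// is simply the first width * height bytes of the buffer, one byte per pixel.
private byte[] extractGray(byte[] yuvData, int width, int height) {
    byte[] gray = new byte[width * height];
    System.arraycopy(yuvData, 0, gray, 0, width * height);
    return gray;
}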
So does that mean I have to write my own camera app using camera API?
Are there any good examples?
Many of the examples I have checked so far do not handle autofocus.
I added the features to my manifest file:
<uses-feature android:name="android.hardware.camera" />
<uses-feature android:name="android.hardware.camera.autofocus" />
I set the autofocus mode on the camera parameters:
parameters.setFocusMode(Camera.Parameters.FOCUS_MODE_AUTO);
But it does not work.
So I call autoFocus() right before takePicture() when the camera button is pressed:
preview.camera.autoFocus(myAutoFocusCallback);
preview.camera.takePicture(shutterCallback, rawCallback, jpegCallback);
But autofocus takes some time, and takePicture() runs before the preview has become sharp.
I also want the camera to autofocus even when I do not press the camera button.
How can I integrate autofocus into this properly?
Are there any good examples?
Thanks.
The Android Developers Guide has a tutorial on making a camera app: http://developer.android.com/guide/topics/media/camera.html#custom-camera
You don't need to declare both the camera and camera.autofocus features in the manifest; the latter implies the former, though having both is not really a problem.
FOCUS_MODE_AUTO does not mean the camera will focus continuously, only that it will use autofocus (rather than a fixed focus) when you trigger it, with the result delivered via a callback. You'll need FOCUS_MODE_CONTINUOUS_PICTURE if you want the camera to keep focusing by itself. This is explained in the documentation.
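A continuous-focus setup might look roughly like this (a sketch only; FOCUS_MODE_CONTINUOUS_PICTURE requires API 14+, so check getSupportedFocusModes() first):
Camera.Parameters parameters = camera.getParameters();
// Only enable continuous focus if this device actually supports it.
if (parameters.getSupportedFocusModes()
        .contains(Camera.Parameters.FOCUS_MODE_CONTINUOUS_PICTURE)) {
    parameters.setFocusMode(Camera.Parameters.FOCUS_MODE_CONTINUOUS_PICTURE);
    camera.setParameters(parameters);
}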
As for taking pictures before the camera is focused: try calling takePicture() from inside your autoFocusCallback like this:
private AutoFocusCallback myAutoFocusCallback = new AutoFocusCallback() {
    @Override
    public void onAutoFocus(boolean success, Camera camera) {
        if (success) {
            camera.takePicture(shutterCallback, rawCallback, jpegCallback);
        }
    }
};
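With that in place, the button handler only needs to start autofocus and let the callback take the picture, along these lines (a sketch; captureButton is a hypothetical button, while preview.camera and myAutoFocusCallback are the fields from your code):
captureButton.setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View v) {
        // Start autofocus; the picture is taken in onAutoFocus() once focusing succeeds.
        preview.camera.autoFocus(myAutoFocusCallback);
    }
});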
I'm using Android + OpenCV (new to OpenCV) and I'm currently working on real-time object detection (the object stays really close to the device camera). I noticed that the Android camera's autofocus keeps modifying my frames (a kind of 'zoom in'/'zoom out' effect), which makes it harder for me to keep tracking the object.
I need to turn autofocus off because in my case the more blurred the input image is, the better, and I also need to turn auto white balance off, or perhaps set it to a different value.
I would like to know how to do this through my OpenCV CameraBridgeViewBase so I can modify the camera's focus/white-balance settings.
I've been trying to find a way to solve this, and I noticed that many people face the same problems.
Stack Overflow seems like a great place to find someone who has worked with this and found a good way to overcome these problems.
Create your own subclass of JavaCameraView:
public class MyJavaCameraView extends JavaCameraView {
There you have access to mCamera, so you can add whatever camera-access methods you are interested in. For example:
// Set the flash mode on the camera held by the base class
public void setFlashMode(boolean flashLightON) {
    Camera camera = mCamera;
    if (camera != null) {
        Camera.Parameters params = camera.getParameters();
        params.setFlashMode(flashLightON
                ? Camera.Parameters.FLASH_MODE_TORCH
                : Camera.Parameters.FLASH_MODE_OFF);
        camera.setParameters(params);
    }
}
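Along the same lines, a method that switches autofocus off and picks a fixed white balance (which is what the question asks for) could look roughly like this; it is only a sketch, since the available focus and white-balance modes vary per device:
// Sketch: disable autofocus and force a specific white balance, if the device supports it.
public void disableAutoFocusAndWhiteBalance() {
    Camera camera = mCamera;
    if (camera != null) {
        Camera.Parameters params = camera.getParameters();
        if (params.getSupportedFocusModes().contains(Camera.Parameters.FOCUS_MODE_FIXED)) {
            params.setFocusMode(Camera.Parameters.FOCUS_MODE_FIXED);
        } else if (params.getSupportedFocusModes().contains(Camera.Parameters.FOCUS_MODE_INFINITY)) {
            params.setFocusMode(Camera.Parameters.FOCUS_MODE_INFINITY);
        }
        if (params.getSupportedWhiteBalance() != null
                && params.getSupportedWhiteBalance().contains(Camera.Parameters.WHITE_BALANCE_DAYLIGHT)) {
            params.setWhiteBalance(Camera.Parameters.WHITE_BALANCE_DAYLIGHT);
        }
        camera.setParameters(params);
    }
}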
Then use this new class in your main activity:
//force java camera
mOpenCvCameraView = (MyJavaCameraView) findViewById(R.id.activity_surface_view);
mOpenCvCameraView.setVisibility(SurfaceView.VISIBLE);
mOpenCvCameraView.setCvCameraViewListener(this);
mOpenCvCameraView.enableView();
In my Android app I am trying to do some image recognition, and I want to open the camera and see the images it generates before even taking a picture, like the normal camera preview mode.
Basically, while previewing, I want to continuously grab the current frame, analyze it, and generate some object that updates some text on the screen. Something like this:
// This gets called every 5 seconds while in camera preview mode;
// bitmap is the current camera image.
@Override
public void getScreen(Bitmap bitmap) {
    MyData data = myalgorithm(bitmap);
    displayCountOnScreen(data);
}
I saw this app https://play.google.com/store/apps/details?id=com.fingersoft.cartooncamera&hl=en
and in camera preview mode they overlay other GUI elements on the screen. I want to do that too.
Anyone know how I can do this?
Thanks
If all you want to do is put some GUI elements on the screen, then there is no need to fetch all the preview frames as Bitmaps (though you could do that as well, if you want):
Create a layout with a SurfaceView for where you want the video data to appear, and then put other views on top.
In onCreate, you can get it like this:
surfaceView = (SurfaceView)findViewById(R.id.cameraSurfaceView);
surfaceHolder = surfaceView.getHolder();
surfaceHolder.addCallback(this); // For when you need to know when it changes.
surfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
When you create a camera, you need to pass it a Surface to display the preview on, see:
http://developer.android.com/reference/android/hardware/Camera.html#setPreviewDisplay(android.view.SurfaceHolder):
Camera camera = ...
...
camera.setPreviewDisplay(surfaceHolder);
camera.startPreview();
If you want to do image recognition, what you want are the raw image bytes to work with. In this case, register a PreviewCallback, see here:
http://developer.android.com/reference/android/hardware/Camera.html#setPreviewCallbackWithBuffer(android.hardware.Camera.PreviewCallback)
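A minimal sketch of that approach, assuming the default NV21 preview format (12 bits per pixel):
// Hand the camera one reusable buffer sized for the preview format (NV21 = 12 bits/pixel).
Camera.Size previewSize = camera.getParameters().getPreviewSize();
camera.addCallbackBuffer(new byte[previewSize.width * previewSize.height * 3 / 2]);

camera.setPreviewCallbackWithBuffer(new Camera.PreviewCallback() {
    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        // data is the raw NV21 frame; run the image recognition on it here
        // (preferably on a background thread), then return the buffer for reuse.
        camera.addCallbackBuffer(data);
    }
});
camera.startPreview();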
In my Activity, I want to take a picture with android.hardware.Camera.
The code (see below) works fine in my AVD, but it doesn't work on my Android phone; all I get is "Error -1".
AVD:
Target: Android 2.3.3
SD Card: 64 MB
WVGA800
Phone:
Samsung Galaxy S2 with Android 2.3.6
Code:
android.hardware.Camera camera = Camera.open();
camera.takePicture(null, null, mPictureCallback);
camera.stopPreview();
camera.setPreviewCallback(null);
camera.release();
camera = null;
Manifest:
<uses-feature android:name="android.hardware.camera" />
<uses-permission android:name="android.permission.CAMERA" />
I don't think the PictureCallback matters: even when I comment out everything in the onPictureTaken method, the same error is returned.
I have also restarted my phone, tried setting a few Camera parameters, etc., but nothing helps.
I can't find any documentation of this specific error code for the Camera either.
Thanks in advance!
Without a correctly assigned SurfaceView it will not work.
The SurfaceView must also have a minimum size.
Your code is missing an important part: you need to call startPreview() before calling takePicture(). Another important point is that the photo can take a moment to be delivered, and the Java garbage collector can collect your camera variable before you have the result, so release the camera inside the PictureCallback. You also don't need the explicit setPreviewCallback(null); you can remove it from your code.
It is also important to avoid executing startPreview() twice before the picture has been taken: disable the button in your interface and re-enable it afterwards, inside the callback method.
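Putting both answers together, a corrected sequence might look roughly like this (only a sketch; surfaceHolder is assumed to come from a SurfaceView in your layout):
Camera camera = Camera.open();
try {
    camera.setPreviewDisplay(surfaceHolder); // a visible preview surface is required
} catch (IOException e) {
    camera.release();
    return;
}
camera.startPreview(); // must be running before takePicture()
camera.takePicture(null, null, new Camera.PictureCallback() {
    @Override
    public void onPictureTaken(byte[] data, Camera camera) {
        // Handle the JPEG data here, then release the camera as suggested above.
        camera.release();
    }
});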
I am doing some image processing on the preview frames captured by the camera. It is a CPU-consuming task, and I have to stop the preview to make it faster: before a new frame is processed I call Camera.stopPreview(), and afterwards Camera.startPreview().
However, I would like the last captured frame to stay displayed on the SurfaceView after stopping the preview. It works out of the box on 2.3 devices, but the SurfaceView goes black after calling Camera.stopPreview() on older SDK versions. Does anyone know what changed and what to do?
Yes, this behaviour was an improvement introduced in 2.3.
I had this problem on 2.2 as well: there was no way to work on a preview image, despite the fact that this was theoretically possible according to the API. To solve it I had to actually take a picture using Camera.takePicture(null, null, Camera.PictureCallback myCallback) (see the Camera documentation) and then implement a callback to handle the taken picture. The instance of the class that implements this callback is the parameter to pass to Camera.takePicture(), and the callback method itself looks like this:
public void onPictureTaken(byte[] JPEGData, Camera camera) {
    final Bitmap bitmap = createBitmapFromView(JPEGData);
    // do something with the Bitmap
}
Doing it this way prevents the picture from being saved to external storage alongside the regular pictures taken with the camera application. Should you need to persist the Bitmap, you'll have to do it explicitly. It does not, however, prevent the camera's shutter sound from being emitted.
Camera.takePicture() has to be called while the preview is running. stopPreview() can be called right after.
One thing to be careful with /!\: Camera.takePicture() is not reentrant (at all). The callback must have returned before any subsequent call to Camera.takePicture(). This was freezing my phone; I had to shut it down and restart it before it was usable again. Since in my case the action was triggered by a button, I had to guard it with a boolean:
if (!mPictureTaken) {
    mPictureTaken = true; // absolutely NOT reentrant: any double click freezes the phone otherwise
    mCameraView.takePicture(callback);
}
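The flag then needs to be cleared once the callback has finished, otherwise only one picture can ever be taken; for example (a sketch, assuming callback is the same mPictureTaken-guarded Camera.PictureCallback):
private final Camera.PictureCallback callback = new Camera.PictureCallback() {
    @Override
    public void onPictureTaken(byte[] data, Camera camera) {
        // ... handle the JPEG data ...
        mPictureTaken = false; // safe to allow the next takePicture() again
    }
};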
We are using an LG Optimus Speed and are trying to obtain an image from the camera in our own activity. The code we are using to do so is:
GetImage(new PictureCallback() {
    @Override
    public void onPictureTaken(byte[] data, Camera camera) {
        camera.startPreview();
        bmp = BitmapConversion.convertBmp(data);
    }
});
...
public static void GetImage(final PictureCallback jpgCallback) {
    GetCamera().autoFocus(new AutoFocusCallback() {
        @Override
        public void onAutoFocus(boolean success, Camera camera) {
            if (success) {
                GetCamera().takePicture(null, null, jpgCallback);
            } else {
                GetImage(jpgCallback);
            }
        }
    });
}
The images have considerably worse quality than the images obtained with the native Android camera app. Here are two example pictures, both taken at a resolution of 640x480 and magnified. As you can see, the left picture, taken with the native app, looks "cleaner" than the right one, taken with our own application.
Any ideas?
You don't know what the native app is doing in terms of configuring the camera before taking the image and post-processing after taking the image.
There are many settings available on the camera which are well documented and should be investigated.
You should also be aware that the same method can produce vastly different results with even slight variations in light and focus.
Try looking into the autofocus settings and perhaps do something in the autofocus callback.
When comparing the two methods make sure your camera is balanced on something rather than handheld and ensure that the distance and light levels are identical.
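As a starting point, the kind of parameter tuning worth investigating might look roughly like this (a sketch only; the exact values and supported modes depend on the device, so query the supported lists before setting anything):
Camera.Parameters params = camera.getParameters();
// Ask for maximum JPEG quality and make sure no digital zoom is applied.
params.setJpegQuality(100);
if (params.isZoomSupported()) {
    params.setZoom(0);
}
// Pick the picture size explicitly instead of relying on the driver default.
params.setPictureSize(640, 480);
// Scene mode and white balance also affect the in-driver post-processing.
if (params.getSupportedSceneModes() != null
        && params.getSupportedSceneModes().contains(Camera.Parameters.SCENE_MODE_AUTO)) {
    params.setSceneMode(Camera.Parameters.SCENE_MODE_AUTO);
}
camera.setParameters(params);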