Get and show partial area of camera view in Android

I use following codes to take pictures from the camera view.
captureButton.setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View v) {
        mCamera.autoFocus(new AutoFocusCallback() {
            @Override
            public void onAutoFocus(boolean success, Camera camera) {
                mCamera.takePicture(null, null, mPicture);
            }
        });
    }
});
Instead of asking the user to shoot a picture, I now need to continuously show a smaller part (e.g., 50% of the width) of the camera image directly in a view (e.g., an ImageView).
How can I get the current frame from the camera and extract a part of it? It needs to be fast enough to behave like a video stream (ideally something I can set up directly via the SurfaceHolder preview configuration).

I don't know if I understand you correctly. You could use a custom view extending SurfaceView to get and show the camera preview (you will probably have to play around with some parameters), and then place that view in your layout.
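If you also need the pixel data of only part of each frame (e.g., the left 50%), you can crop it yourself in onPreviewFrame. The default preview format is NV21, whose first width * height bytes are the luminance (Y) plane, so cropping that plane is plain array arithmetic. A minimal sketch, assuming NV21 input; the class and method names are made up, and only the grayscale plane is handled:

```java
// Sketch: crop the left half of the Y (luminance) plane of an NV21 preview frame.
// NV21 stores the Y plane first: one byte per pixel, row-major, width * height bytes.
public class Nv21Crop {
    public static byte[] cropLeftHalfY(byte[] nv21, int width, int height) {
        int cropWidth = width / 2;
        byte[] out = new byte[cropWidth * height];
        for (int row = 0; row < height; row++) {
            // Copy the first cropWidth bytes of each row of the Y plane.
            System.arraycopy(nv21, row * width, out, row * cropWidth, cropWidth);
        }
        return out;
    }

    public static void main(String[] args) {
        // 4x2 frame: the Y plane is the first 8 bytes; the rest is chroma.
        byte[] frame = {1, 2, 3, 4, 5, 6, 7, 8, 0, 0, 0, 0};
        byte[] half = cropLeftHalfY(frame, 4, 2);
        System.out.println(java.util.Arrays.toString(half)); // [1, 2, 5, 6]
    }
}
```

To show the cropped bytes in an ImageView you would still need a YUV-to-RGB conversion (e.g., via YuvImage.compressToJpeg on Android), but cropping first keeps that conversion cheap.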

Related

How to get camera screen in android?

In my Android app, I am trying to do some image recognition. I want to open the camera and see the images it produces before even taking a picture, like the camera's browse mode.
Basically, as you browse I want to continuously grab the current frame, run it through an algorithm, and update some text on the screen. Something like this:
// This gets called every 5 seconds while in camera browse mode;
// bitmap is the camera's current image.
@Override
public void getScreen(Bitmap bitmap) {
    MyData data = myalgorithm(bitmap);
    displayCountOnScreen(data);
}
I saw this app https://play.google.com/store/apps/details?id=com.fingersoft.cartooncamera&hl=en
and in camera browse mode, they change the screen and put some other GUI stuff on the screen. I want to do that too.
Anyone know how I can do this?
Thanks
If all you want to do is put some GUI elements on the screen, then there is no need to fetch all the preview frames as Bitmaps (though you could do that as well, if you want):
Create a layout with a SurfaceView for where you want the video data to appear, and then put other views on top.
In onCreate, you can get it like this:
surfaceView = (SurfaceView) findViewById(R.id.cameraSurfaceView);
surfaceHolder = surfaceView.getHolder();
surfaceHolder.addCallback(this); // For when you need to know when it changes.
// Deprecated (and a no-op) since API 11, but required on older devices:
surfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
When you create a camera, you need to pass it a Surface to display the preview on, see:
http://developer.android.com/reference/android/hardware/Camera.html#setPreviewDisplay(android.view.SurfaceHolder):
Camera camera = ...
...
try {
    camera.setPreviewDisplay(surfaceHolder);
} catch (IOException e) {
    // Handle the error (e.g., release the camera).
}
camera.startPreview();
If you want to do image recognition, what you want are the raw image bytes to work with. In this case, register a PreviewCallback, see here:
http://developer.android.com/reference/android/hardware/Camera.html#setPreviewCallbackWithBuffer(android.hardware.Camera.PreviewCallback)
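With setPreviewCallbackWithBuffer you must hand the camera pre-allocated buffers via addCallbackBuffer, each sized for one preview frame. For the default NV21 format that is 12 bits per pixel, i.e., width * height * 3 / 2 bytes. A sketch of the size computation (pure arithmetic, so it runs off-device; on Android you would use ImageFormat.getBitsPerPixel instead of the hard-coded 12):

```java
// Sketch: preview buffer size for a planar YUV 4:2:0 format such as NV21,
// which uses 12 bits per pixel (8 bits luma + 4 bits shared chroma).
public class PreviewBuffer {
    public static int nv21BufferSize(int width, int height) {
        int bitsPerPixel = 12; // ImageFormat.getBitsPerPixel(ImageFormat.NV21) on Android
        return width * height * bitsPerPixel / 8;
    }

    public static void main(String[] args) {
        System.out.println(nv21BufferSize(640, 480)); // 460800
    }
}
```

On-device you would then call camera.addCallbackBuffer(new byte[nv21BufferSize(w, h)]) once or twice before startPreview, and again from inside onPreviewFrame to recycle the buffer.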

Android: Add Color Overlay Effects on Video during Capture of Video

How can I add a color overlay effect to video during capture? Is there a tutorial available, and which library can I use for this?
I have a simple task in this app: on a button click, apply a color effect to some frames.
My code
mCamera = getCameraInstance();
mCamera.setPreviewCallback(new PreviewCallback() {
    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        Log.d("HFI", "Length: " + data.length);
    }
});
You will have to manipulate each frame of the video you are capturing or have captured. Have a look at related questions on frame-by-frame video processing; they should help with your problem.
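One cheap way to tint frames without any library is to overwrite the chroma planes of the NV21 data you already receive in onPreviewFrame. In NV21 the interleaved V/U bytes start at offset width * height; setting them to constants gives a uniform color cast (128/128 would be grayscale). A sketch with hypothetical cast values; only the chroma bytes are touched:

```java
// Sketch: apply a uniform color cast to an NV21 frame by overwriting its
// interleaved V/U chroma bytes, which start right after the Y plane.
public class Nv21Tint {
    public static void tint(byte[] nv21, int width, int height, byte v, byte u) {
        int chromaStart = width * height; // Y plane occupies the first w*h bytes
        for (int i = chromaStart; i < nv21.length; i += 2) {
            nv21[i] = v;     // V sample
            nv21[i + 1] = u; // U sample
        }
    }

    public static void main(String[] args) {
        byte[] frame = new byte[4 * 2 * 3 / 2];    // one 4x2 NV21 frame
        tint(frame, 4, 2, (byte) 100, (byte) 120); // hypothetical warm cast
        System.out.println(frame[8] + " " + frame[9]); // 100 120
    }
}
```

To apply the effect only after the button click, flip a boolean that onPreviewFrame checks. Note that this changes what you see and what you process, but not what MediaRecorder writes; recording the effect would require feeding the modified frames to an encoder such as MediaCodec yourself.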

Android camera control how long captured image stays in preview after shutter

I am taking a picture in Android (2.3 and greater) that takes a picture using camera.takePicture(). It works great, but I get weird behavior on different devices. On my Nexus One, the image I captured stays frozen for a few seconds before reverting to the preview. On my Transformer, it reverts to the preview almost immediately.
For now, a workaround would be to call camera.stopPreview() in the onShutter() event, but that's still a bit weird, since it's not showing the photo you took, it's showing what the preview saw a split second after you took the picture. On the Transformer, you can even see it "freeze-move-freeze" as it freezes for a split second after taking the picture, starts moving again, then gets to onShutter and freezes when I call stopPreview().
Does anyone know of a setting somewhere, or some code I could call, that would tell the camera how long to hold onto that image before restarting the preview? Or better yet, have it not automatically release the preview at all, and wait until I call startPreview?
On my devices I have to restart the preview manually; it doesn't start up by itself after taking a picture. I use a picture callback like
camera.takePicture(null, null, takePictureCallback);
and the callback is
private Camera.PictureCallback takePictureCallback = new Camera.PictureCallback() {
    @Override
    public void onPictureTaken(byte[] data, Camera cam) {
        new Handler().postDelayed(new Runnable() {
            @Override
            public void run() {
                if (camera != null) {
                    camera.startPreview();
                }
                canTakePicture = true;
            }
        }, PHOTO_PREVIEW_LENGTH);
        new PhotoProcessor(activity, targetUri).execute(data);
    }
};
PhotoProcessor saves the image; PHOTO_PREVIEW_LENGTH is how long, in milliseconds, the captured image is shown.

Android Autofocus Callback in the Dark

I have some trouble with the onAutoFocus callback of the Android Camera API. In the constructor of my Preview class I set the focus mode to FOCUS_MODE_AUTO and the flash mode to FLASH_MODE_AUTO. The button I present to the user to take a picture has a custom animation attached to it. When the user presses the button, the animation starts and so does the auto focus:
public void onAnimationStart(Animation animation) {
    isAutoFocusing = true;
    AutoFocusCallBackImpl autoFocusCallBack = new AutoFocusCallBackImpl();
    camera.autoFocus(autoFocusCallBack);
}
Then in the onAutoFocus method I take the picture:
public void onAutoFocus(boolean success, Camera camera) {
    if (camera != null) {
        try {
            camera.takePicture(shutterCallback, rawCallback, jpegCallback);
        } catch (Exception e) {
            // If something went wrong, we return
            // the user to the dashboard.
            setResult(Constants.PICTURE_CAMERA_ERROR);
            finish();
        }
    }
}
This works perfectly when there is enough light (so without the flash). In the dark however, the flash goes off and the picture is taken, but it appears that the camera did not focus properly. I know that the onAutoFocus callback is called immediately if auto focus isn't supported by the camera, but that clearly isn't the case here. Is auto focus impossible in the dark (even with the flash)?
This is essentially a hardware limitation: to focus automatically, the camera needs contrast in the image. No usable image (in the dark) means no autofocus; the camera has no way to tell whether the image is sharp.
That's why dedicated cameras (and some Android devices) have a small autofocus-assist light that switches on while focusing. But most phones don't have this focus light :/
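As a partial workaround you can light the scene yourself during focusing: switch the flash to FLASH_MODE_TORCH before calling autoFocus, then restore the previous mode in the callback. Flash modes must be checked against getSupportedFlashModes() first, since many devices lack some of them. The selection logic is plain list handling; a sketch with a made-up helper name (the mode strings match the Camera.Parameters constants, e.g., FLASH_MODE_TORCH is "torch"):

```java
import java.util.Arrays;
import java.util.List;

// Sketch: pick the first mode we prefer that the device actually supports,
// mirroring a check against Camera.Parameters.getSupportedFlashModes().
public class ModePicker {
    public static String pick(List<String> supported, String... preferred) {
        for (String mode : preferred) {
            if (supported != null && supported.contains(mode)) {
                return mode;
            }
        }
        return null; // nothing usable; leave the camera's current mode alone
    }

    public static void main(String[] args) {
        List<String> supported = Arrays.asList("off", "on", "torch");
        // Prefer torch as a focus light, fall back to plain "on".
        System.out.println(pick(supported, "torch", "on")); // torch
        System.out.println(pick(Arrays.asList("off"), "torch", "on")); // null
    }
}
```

On-device, remember to set the flash mode back in onAutoFocus, or the torch stays on after the picture.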

Bad image Quality when using own Camera Activity

We are using an LG Optimus Speed and are trying to obtain an image from the camera with our own activity. The code we are using is:
GetImage(new PictureCallback() {
    @Override
    public void onPictureTaken(byte[] data, Camera camera) {
        camera.startPreview();
        bmp = BitmapConversion.convertBmp(data);
    }
});
...
public static void GetImage(final PictureCallback jpgCallback) {
    GetCamera().autoFocus(new AutoFocusCallback() {
        @Override
        public void onAutoFocus(boolean success, Camera camera) {
            if (success) {
                GetCamera().takePicture(null, null, jpgCallback);
            } else {
                // Note: retries indefinitely until autofocus succeeds.
                GetImage(jpgCallback);
            }
        }
    });
}
The images are of considerably worse quality than the images obtained with the native Android camera app. Here are two example pictures, both taken at a resolution of 640x480 and magnified. As you can see, the left picture, taken with the native app, looks "cleaner" than the right one, taken with our own application.
Any ideas?
You don't know what the native app is doing in terms of configuring the camera before taking the image, or what post-processing it applies afterwards.
There are many settings available on the camera; they are well documented and worth investigating.
Also be aware that the same method can produce vastly different results with even slight variations in light and focus.
Try looking into the autofocus settings, and perhaps do some work in the autofocus callback.
When comparing the two methods, make sure the camera is resting on something rather than handheld, and ensure that the distance and light levels are identical.
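Concretely, the stock camera app typically queries the driver for its supported picture sizes, picks the largest, and sets a high JPEG quality, whereas takePicture with default parameters may give you a smaller size and heavier compression. Choosing the largest size is a simple area comparison; a sketch, with sizes modeled as {width, height} pairs since Camera.Size instances only exist on-device:

```java
import java.util.Arrays;
import java.util.List;

// Sketch: pick the largest supported picture size by pixel area,
// mirroring a pass over Camera.Parameters.getSupportedPictureSizes().
public class BestPictureSize {
    public static int[] largest(List<int[]> sizes) {
        int[] best = null;
        for (int[] s : sizes) {
            if (best == null || (long) s[0] * s[1] > (long) best[0] * best[1]) {
                best = s;
            }
        }
        return best;
    }

    public static void main(String[] args) {
        List<int[]> sizes = Arrays.asList(
                new int[]{640, 480}, new int[]{2592, 1944}, new int[]{1280, 960});
        System.out.println(Arrays.toString(largest(sizes))); // [2592, 1944]
    }
}
```

On-device you would apply the result before takePicture: params.setPictureSize(w, h); params.setJpegQuality(100); camera.setParameters(params).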
