I am trying to implement functionality that takes pictures while recording video. That is why I concluded I should use the screenshot approach on the SurfaceView.
However, when I try to take a screenshot of the SurfaceView, I always get a blank image.
Here is the code I am using to take the snapshot:
View tempView = (View)MY_SURFACE_VIEW;
tempView.setDrawingCacheEnabled(true);
Bitmap tempBmp = Bitmap.createBitmap(tempView.getDrawingCache());
tempView.setDrawingCacheEnabled(false);
//Saving this Bitmap to a File....
In case you think this is a duplicate question, let me assure you that I tried the following solutions posted on SO for the same problem before asking this one:
https://stackoverflow.com/questions/24134964/issue-with-camera-picture-taken-snapshot-using-surfaceview-in-android
Facing issue to take a screenshot while recording a video
Take camera screenshot while recording - Like in Galaxy S3?
Taking screen shot of a SurfaceView in android
Get screenshot of surfaceView in Android (this is the closest answer, but it is only partially answered; I have already asked @sajar to explain it)
Other resources on the internet:
1. http://www.coderanch.com/t/622613/Android/Mobile/capture-screenshot-simple-animation-project
2. http://www.phonesdevelopers.com/1795894/
None of these has worked for me so far. I also know that we need to create some thread that interacts with the SurfaceHolder and gets the bitmap from it, but I am not sure how to implement that.
Any help is highly appreciated.
Here's another one: Take screenshot of SurfaceView.
SurfaceViews have a "surface" part and a "view" part; your code tries to capture the "view" part. The "surface" part is a separate layer, and there's no trivial "grab all pixels" method. The basic difficulty is that your app is on the "producer" side of the surface, rather than the "consumer" side, so reading pixels back out is problematic. Note that the underlying buffers are in whatever format is most convenient for the data producer, so for camera preview it'll be a YUV buffer.
The easiest and most efficient way to "capture" the surface pixels is to draw them twice, once for the screen and once for capture. If you do this with OpenGL ES, the YUV to RGB conversion will likely be done by a hardware module, which will be much faster than receiving camera frames in YUV buffers and doing your own conversion.
Grafika's "texture from camera" activity demonstrates manipulation of incoming video data with GLES. After rendering you can get the pixels with glReadPixels(). The performance of glReadPixels() can vary significantly between devices and different use cases. EglSurfaceBase#saveFrame() shows how to capture to a Bitmap and save as PNG.
More information about the Android graphics architecture, notably the producer-consumer nature of SurfaceView surfaces, can be found in this document.
public class AndroidSurfaceviewExample extends Activity implements SurfaceHolder.Callback {
static Camera camera;
SurfaceView surfaceView;
SurfaceHolder surfaceHolder;
static boolean boo;
static Thread x;
GLSurfaceView glSurfaceView;
public static Bitmap mBitmap;
public static Camera.Parameters param;
public static Camera.Size mPreviewSize;
public static byte[] byteArray;
PictureCallback jpegCallback;
private Bitmap inputBMP = null, bmp, bmp1;
public static ImageView imgScreen;
/** Called when the activity is first created. */
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.camera);
surfaceView = (SurfaceView) findViewById(R.id.surfaceView);
surfaceHolder = surfaceView.getHolder();
Button btnTakeScreen = (Button)findViewById(R.id.btnTakeScreen);
imgScreen = (ImageView)findViewById(R.id.imgScreen);
btnTakeScreen.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
Bitmap screen = getBitmap();
if (screen != null) {
    imgScreen.setImageBitmap(screen);
}
}
});
// Install a SurfaceHolder.Callback so we get notified when the
// underlying surface is created and destroyed.
surfaceHolder.addCallback(this);
// deprecated setting, but required on Android versions prior to 3.0
surfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
jpegCallback = new PictureCallback() {
@SuppressLint("WrongConstant")
public void onPictureTaken(byte[] data, Camera camera) {
FileOutputStream outStream = null;
try {
outStream = new FileOutputStream(String.format("/sdcard/%d.jpg", System.currentTimeMillis()));
outStream.write(data);
outStream.close();
Log.d("Log", "onPictureTaken - wrote bytes: " + data.length);
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
} finally {
}
Toast.makeText(getApplicationContext(), "Picture Saved", Toast.LENGTH_SHORT).show();
refreshCamera();
}
};
}
public void refreshCamera() {
if (surfaceHolder.getSurface() == null) {
// preview surface does not exist
return;
}
// stop preview before making changes
try {
camera.stopPreview();
} catch (Exception e) {
// ignore: tried to stop a non-existent preview
}
// set preview size and make any resize, rotate or
// reformatting changes here
// start preview with new settings
try {
camera.setPreviewDisplay(surfaceHolder);
camera.startPreview();
} catch (Exception e) {
    e.printStackTrace();
}
}
public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
// Now that the size is known, set up the camera parameters and begin
// the preview.
refreshCamera();
}
public void surfaceCreated(SurfaceHolder holder) {
if (camera == null) {
try {
camera = Camera.open();
} catch (RuntimeException ignored) {
}
}
try {
if (camera != null) {
WindowManager winManager = (WindowManager) getApplicationContext().getSystemService(Context.WINDOW_SERVICE);
camera.setPreviewDisplay(surfaceHolder);
}
} catch (Exception e) {
if (camera != null)
camera.release();
camera = null;
}
if (camera == null) {
return;
} else {
camera.setPreviewCallback(new Camera.PreviewCallback() {
@Override
public void onPreviewFrame(byte[] bytes, Camera camera) {
if (param == null) {
return;
}
byteArray = bytes;
}
});
}
param = camera.getParameters();
// use the preview size actually in effect, so it matches the YUV buffer in getBitmap()
mPreviewSize = param.getPreviewSize();
param.setColorEffect(Camera.Parameters.EFFECT_NONE);
//set antibanding to none
if (param.getAntibanding() != null) {
param.setAntibanding(Camera.Parameters.ANTIBANDING_OFF);
}
// set white ballance
if (param.getWhiteBalance() != null) {
param.setWhiteBalance(Camera.Parameters.WHITE_BALANCE_CLOUDY_DAYLIGHT);
}
//set flash
if (param.getFlashMode() != null) {
param.setFlashMode(Camera.Parameters.FLASH_MODE_OFF);
}
//set zoom
if (param.isZoomSupported()) {
param.setZoom(0);
}
//set focus mode
param.setFocusMode(Camera.Parameters.FOCUS_MODE_INFINITY);
// modify parameter
camera.setParameters(param);
try {
// The Surface has been created, now tell the camera where to draw
// the preview.
camera.setPreviewDisplay(surfaceHolder);
camera.startPreview();
} catch (Exception e) {
// check for exceptions
System.err.println(e);
return;
}
}
public void surfaceDestroyed(SurfaceHolder holder) {
// stop preview and release camera
camera.stopPreview();
camera.release();
camera = null;
}
public Bitmap getBitmap() {
try {
if (param == null || mPreviewSize == null || byteArray == null)
    return null;
int format = param.getPreviewFormat();
YuvImage yuvImage = new YuvImage(byteArray, format, mPreviewSize.width, mPreviewSize.height, null);
ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream();
Log.i("myLog","array: "+byteArray.toString());
Rect rect = new Rect(0, 0, mPreviewSize.width, mPreviewSize.height);
yuvImage.compressToJpeg(rect, 75, byteArrayOutputStream);
BitmapFactory.Options options = new BitmapFactory.Options();
options.inPurgeable = true;
options.inInputShareable = true;
mBitmap = BitmapFactory.decodeByteArray(byteArrayOutputStream.toByteArray(), 0, byteArrayOutputStream.size(), options);
byteArrayOutputStream.flush();
byteArrayOutputStream.close();
} catch (IOException ioe) {
ioe.printStackTrace();
}
return mBitmap;
    }
}
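The question also wants to save the capture to a file. A small illustrative helper (the directory, file name, and JPEG quality are assumptions, not part of the original code):
// Illustrative only: compress the Bitmap returned by getBitmap() to a JPEG file.
public static void saveBitmap(Bitmap bitmap, File dir) {
    if (bitmap == null) return;
    File out = new File(dir, "snapshot_" + System.currentTimeMillis() + ".jpg");
    FileOutputStream fos = null;
    try {
        fos = new FileOutputStream(out);
        bitmap.compress(Bitmap.CompressFormat.JPEG, 90, fos);
    } catch (IOException e) {
        e.printStackTrace();
    } finally {
        if (fos != null) {
            try { fos.close(); } catch (IOException ignored) {}
        }
    }
}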
Related
Is it possible to use the camera in a fragment-like view, so that it doesn't open a separate app and leave my own app? I want something like a SurfaceView with the camera.
Yes, it is. Check this link.
Basically you extend SurfaceView and integrate the camera picture callback.
Example code:
/* Surface on which the camera projects its capture results. */
class CameraPreview extends SurfaceView implements SurfaceHolder.Callback {
SurfaceHolder mHolder;
Camera mCamera;
public CameraPreview(Context context, Camera camera) {
super(context);
mCamera = camera;
// Install a SurfaceHolder.Callback so we get notified when the
// underlying surface is created and destroyed.
mHolder = getHolder();
mHolder.addCallback(this);
// deprecated setting, but required on Android versions prior to 3.0
mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
}
public void surfaceCreated(SurfaceHolder holder) {
// The Surface has been created, now tell the camera where to draw the preview.
try {
mCamera.setPreviewDisplay(holder);
mCamera.startPreview();
} catch (IOException e) {
e.printStackTrace();
}
}
public void surfaceDestroyed(SurfaceHolder holder) {
// empty. Take care of releasing the Camera preview in your activity.
}
public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
// If your preview can change or rotate, take care of those events here.
// Make sure to stop the preview before resizing or reformatting it.
if (mHolder.getSurface() == null){
// preview surface does not exist
return;
}
// stop preview before making changes
try {
mCamera.stopPreview();
} catch (Exception e){
// ignore: tried to stop a non-existent preview
}
// set preview size and make any resize, rotate or
// reformatting changes here
// start preview with new settings
try {
mCamera.setPreviewDisplay(mHolder);
mCamera.startPreview();
} catch (Exception e){
e.printStackTrace();
}
}
}
with a camera picture callback like:
private Camera.PictureCallback mPicture = new Camera.PictureCallback() {
@Override
public void onPictureTaken(byte[] data, Camera camera) {
File pictureFile = getOutputMediaFile();
if (pictureFile == null){
Toast.makeText(getActivity(), "Image retrieval failed.", Toast.LENGTH_SHORT)
.show();
return;
}
try {
FileOutputStream fos = new FileOutputStream(pictureFile);
fos.write(data);
fos.close();
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
}
};
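For completeness, the callback above only runs when a capture is requested. A typical trigger looks like this (captureButton and mCamera are assumed names, and the preview must be running when takePicture() is called):
// Assumed wiring: request a JPEG and hand the bytes to mPicture above.
captureButton.setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View v) {
        mCamera.takePicture(null, null, mPicture);
    }
});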
Alternatively, a ready-made native implementation is much less work, for example the CameraView library below.
XML
<com.google.android.cameraview.CameraView
android:id="#+id/camera"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:keepScreenOn="true"
android:adjustViewBounds="true"
app:autoFocus="true"
app:aspectRatio="4:3"
app:facing="back"
app:flash="auto"/>
Inside Activity/Fragment
Start camera
mCameraView.start();
Stop camera
mCameraView.stop();
Open source: Google
Requires API Level 9. The library uses Camera 1 API on API Level 9-20 and Camera2 on 21 and above.
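A minimal lifecycle sketch (mCameraView is an assumed field bound to the @+id/camera view from the XML above; the start()/stop() calls are the ones shown):
// Assumed activity wiring for com.google.android.cameraview.CameraView.
@Override
protected void onResume() {
    super.onResume();
    mCameraView.start();   // open the camera and begin the preview
}

@Override
protected void onPause() {
    mCameraView.stop();    // release the camera when leaving the screen
    super.onPause();
}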
I'm developing for API 7 (2.1). I implemented a camera view like this:
public class CameraView extends SurfaceView implements SurfaceHolder.Callback {
SurfaceHolder mHolder;
int width;
int height;
Camera mCamera;
public CameraView(Context context, AttributeSet attrs) {
super(context, attrs);
initHolder();
}
public CameraView(Context context) {
super(context);
initHolder();
}
private void initHolder() {
// Install a SurfaceHolder.Callback so we get notified when the
// underlying surface is created and destroyed.
mHolder = getHolder();
mHolder.addCallback(this);
mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
}
public void surfaceCreated(SurfaceHolder holder) {
// The Surface has been created, acquire the camera and tell it where to
// draw.
mCamera = Camera.open();
// Parameters params = mCamera.getParameters();
// // If we aren't landscape (the default), tell the camera we want
// // portrait mode
// if (this.getResources().getConfiguration().orientation != Configuration.ORIENTATION_LANDSCAPE) {
// params.set("orientation", "portrait"); // "landscape"
// // And Rotate the final picture if possible
// // This works on 2.0 and higher only
// // params.setRotation(90);
// // Use reflection to see if it exists and to call it so you can
// // support older versions
// try {
// Method rotateSet = Camera.Parameters.class.getMethod("setRotation", new Class[] { Integer.TYPE });
// Object arguments[] = new Object[] { new Integer(90) };
// rotateSet.invoke(params, arguments);
// } catch (NoSuchMethodException nsme) {
// // Older Device
// Log.v("CAMERAVIEW", "No Set Rotation");
// } catch (IllegalArgumentException e) {
// Log.v("CAMERAVIEW", "Exception IllegalArgument");
// } catch (IllegalAccessException e) {
// Log.v("CAMERAVIEW", "Illegal Access Exception");
// } catch (InvocationTargetException e) {
// Log.v("CAMERAVIEW", "Invocation Target Exception");
// }
// }
// mCamera.setParameters(params);
setDisplayOrientation(mCamera, 90);
try {
mCamera.setPreviewDisplay(holder);
} catch (IOException exception) {
mCamera.release();
mCamera = null;
}
}
public void surfaceDestroyed(SurfaceHolder holder) {
// Surface will be destroyed when we return, so stop the preview.
// Because the CameraDevice object is not a shared resource, it's very
// important to release it when the activity is paused.
mCamera.stopPreview();
mCamera.release();
mCamera = null;
}
public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
width = w;
height = h;
// Now that the size is known, set up the camera parameters and begin the preview.
Camera.Parameters parameters = mCamera.getParameters();
//parameters.setPreviewSize(w, h);
mCamera.setParameters(parameters);
mCamera.startPreview();
}
public void takePicture(Camera.ShutterCallback shutter, Camera.PictureCallback raw, Camera.PictureCallback jpeg) {
mCamera.takePicture(shutter, raw, jpeg);
}
protected void setDisplayOrientation(Camera camera, int angle){
Method downPolymorphic;
try {
downPolymorphic = camera.getClass().getMethod("setDisplayOrientation", new Class[] { int.class });
if (downPolymorphic != null) {
downPolymorphic.invoke(camera, new Object[] { angle });
}
} catch (Exception e1) {}
}
}
Only the approach using reflection (taken from Williew's answer in Android camera rotate) works on my device to show the preview with the correct rotation; otherwise the preview is always rotated -90°.
So far so good, but now I have another problem. When I get the bitmap in the activity's callback:
public void onPictureTaken(byte[] data, Camera camera) {
Bitmap b = BitmapFactory.decodeByteArray(data, 0, data.length);
Log.d("test", "---width: " + b.getWidth() + " height: " + b.getHeight());
}
It's always in landscape mode (in this case 2560 x 1920). So if the user took a picture holding the device in portrait mode, I get a 2560 x 1920 bitmap which, for some reason, still looks exactly like the portrait picture I took when I put it in an ImageView. The problem comes when the user takes the picture holding the device in landscape mode: I would like to rotate it so I can show the result in portrait mode (scaled down), or do some other special handling for landscape pictures.
But I can't differentiate them from the portrait pictures, because the bitmap's dimensions are the same.
How do I recognize portrait / landscape pictures?
Any ideas? I'm new to the camera API and kind of lost...
Thanks in advance.
Edit
OK, I think there's actually no problem with the dimensions always being the same, because the picture really does have the same dimensions no matter how I was holding the device. The only thing I don't understand is why I always get width > height when the preview and the pictures are clearly in portrait mode.
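One common way to tell the two cases apart (a sketch under assumptions, not from the original code) is to track the device orientation yourself with an OrientationEventListener and remember it at capture time. The lastOrientation field and the 90-degree back-camera offset below are assumptions:
// Hedged sketch: remember the device orientation so onPictureTaken() can tell
// portrait (0/180) from landscape (90/270) shots. Assumes the back camera with
// the typical 90-degree sensor mounting.
private int lastOrientation = 0;

private void watchOrientation(Context context) {
    OrientationEventListener listener = new OrientationEventListener(context) {
        @Override
        public void onOrientationChanged(int orientation) {
            if (orientation == ORIENTATION_UNKNOWN) return;
            // Snap to the nearest 90 degrees.
            lastOrientation = ((orientation + 45) / 90 * 90) % 360;
        }
    };
    listener.enable();
}

// Optionally ask the camera to tag/rotate the JPEG before takePicture():
// Camera.Parameters p = mCamera.getParameters();
// p.setRotation((90 + lastOrientation) % 360);
// mCamera.setParameters(p);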
I have built an application which takes photos when you touch the preview.
I can take many photos, but sometimes when I touch the preview to take a photo there is no shutter sound and the whole application freezes. Moreover, after that, if I try to launch the built-in camera application, I get a message that the camera can't be used.
I don't know the reason for this behavior; it happens randomly, and when it does I must restart the device (Samsung Galaxy S) to be able to use the camera again.
In DDMS, after the crash, I can see the following line: keyDispatchingTimedOut
Here is the relevant code:
CameraActivity Class:
public class CameraActivity extends Activity {
private static final String TAG = "CameraDemo";
Preview preview;
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.main);
preview = new Preview(this);
((FrameLayout) findViewById(R.id.preview)).addView(preview);
((FrameLayout) findViewById(R.id.preview)).setOnTouchListener(preview);
Log.d(TAG, "Camera Activity Created.");
}
}
Preview Class:
class Preview extends SurfaceView implements SurfaceHolder.Callback, OnTouchListener {
private static final String TAG = "Preview";
SurfaceHolder mHolder;
public Camera camera;
Context ctx;
boolean previewing = false;
Preview(Context context) {
super(context);
ctx = context;
// Install a SurfaceHolder.Callback so we get notified when the
// underlying surface is created and destroyed.
mHolder = getHolder();
mHolder.addCallback(this);
mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
}
// Called once the holder is ready
public void surfaceCreated(SurfaceHolder holder) {
// The Surface has been created, acquire the camera and tell it where
// to draw.
camera = Camera.open();
}
// Called when the holder is destroyed
public void surfaceDestroyed(SurfaceHolder holder) {
if (camera != null) {
camera.setPreviewCallback(null);
camera.stopPreview();
camera.release();
camera = null;
}
previewing = false;
}
// Called when holder has changed
public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
if(previewing){
camera.stopPreview();
previewing = false;
}
if (camera != null){
try {
camera.setDisplayOrientation(90);
camera.setPreviewDisplay(holder);
camera.setPreviewCallback(new PreviewCallback() {
// Called for each frame previewed
public void onPreviewFrame(byte[] data, Camera camera) {
Log.d(TAG, "onPreviewFrame called at: " + System.currentTimeMillis());
Preview.this.invalidate();
}
});
camera.startPreview();
previewing = true;
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
}
public boolean onTouch(View v, MotionEvent event) {
camera.takePicture(shutterCallback, rawCallback, jpegCallback);
return false;
}
// Called when shutter is opened
ShutterCallback shutterCallback = new ShutterCallback() {
public void onShutter() {
Log.d(TAG, "onShutter'd");
}
};
// Handles data for raw picture
PictureCallback rawCallback = new PictureCallback() {
public void onPictureTaken(byte[] data, Camera camera) {
Log.d(TAG, "onPictureTaken - raw");
}
};
// Handles data for jpeg picture
PictureCallback jpegCallback = new PictureCallback() {
public void onPictureTaken(byte[] data, Camera camera) {
FileOutputStream outStream = null;
try {
// Write to SD Card
outStream = new FileOutputStream(String.format("/sdcard/TVguide/Detection/detected.jpg", System.currentTimeMillis())); // <9>
outStream.write(data);
outStream.close();
Log.d(TAG, "onPictureTaken - wrote bytes: " + data.length);
} catch (FileNotFoundException e) { // <10>
//Toast.makeText(ctx, "Exception #2", Toast.LENGTH_LONG).show();
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
} finally {}
Log.d(TAG, "onPictureTaken - jpeg");
Toast.makeText(ctx, "SAVED", Toast.LENGTH_SHORT).show();
camera.startPreview();
}
};
}
Please help, I have been trying for a few days to understand where the problem is, with no success.
Eyal
I just ran into this issue when testing my application on a Samsung Galaxy SII. You just have to remove the preview callback before taking the picture:
mCamera.setPreviewCallback(null);
mCamera.takePicture(null, null, mPictureCallback);
I don't know what causes that bug; it would really help if you posted the logcat output from the time this error happened.
But I can make some guesses. It looks like the camera is locked (the built-in camera does not work). If your app force closed, the camera lock might be caused by erroneous error handling in the Samsung camera HAL. Especially on older phones, like the Galaxy S, they did not do the best job of handling wrong or non-standard API calls.
Here are some suggestions of what may have caused this behaviour:
You should add a guard for picture taking. Right now, if you touch the screen to take a picture, you can touch the screen again before the first picture finishes, so camera.takePicture() will be called twice and the second call will fail. This is my best guess.
Add a boolean isTakingPicture = false field and then:
public boolean onTouch(View v, MotionEvent event) {
if (!isTakingPicture) {
camera.takePicture(shutterCallback, rawCallback, jpegCallback);
isTakingPicture = true;
}
return false;
}
...
public void onPictureTaken(byte[] data, Camera camera) {
isTakingPicture = false;
...
What do you use the previewCallback for? It doesn't do anything useful here. Preview callbacks can sometimes cause pain, although your code looks fine to me. You can always try removing it and checking whether that helps.
I experienced a similar issue to the one reported here. On an LG P705 and a Samsung Galaxy Trend, after taking a photo the preview froze and the camera was no longer usable until the phone was restarted. On a Galaxy S3, however, the preview continues to display properly even after multiple photo snaps.
While debugging, I noticed that the relevant listener class was receiving more than one call when the camera button was pressed to take a picture. I am unsure why it was invoked twice even though the button was only clicked once. In any case, thanks to Tomasz's suggestion of using a boolean variable, the second call skips taking a photo while the first attempt is in progress. And thanks to Eyal for the question too. :)
Hello friends. I am creating an Android application which uses the camera. I am using Android API 2.1. I installed my application on a Dell phone and the camera orientation is correct. However, when I installed it on other Android phones, such as the Samsung Galaxy S2, the camera orientation is wrong: even in portrait mode the camera image is rotated 90 degrees. Can anyone suggest which camera parameters need to be changed so that the camera orientation works well on all devices? Here is my code:
//class CameraDemo
public class CameraDemo extends Activity {
private static final String TAG = "CameraDemo";
Camera camera;
Preview preview;
Button buttonClick;
Button next;
/** Called when the activity is first created. */
@Override
public void onCreate(Bundle savedInstanceState) {
try
{
super.onCreate(savedInstanceState);
setContentView(R.layout.camdemo);
preview = new Preview(this);
((FrameLayout) findViewById(R.id.preview)).addView(preview);
buttonClick = (Button) findViewById(R.id.buttonClick);
next = (Button) findViewById(R.id.next);
buttonClick.setOnClickListener( new OnClickListener() {
public void onClick(View v) {
preview.camera.takePicture(shutterCallback, rawCallback, jpegCallback);
try{
Constants.takepic=1;
buttonClick.setEnabled(false); //once click is clicked, its disable
}
catch(Exception e)
{
e.printStackTrace();
}
}
});
next.setOnClickListener( new OnClickListener() {
public void onClick(View v) {
//preview.camera.takePicture(shutterCallback, rawCallback, jpegCallback);
if(Constants.takepic==1)
{
Intent intent_new=new Intent(CameraDemo.this,NotesAndUpload.class);
startActivity(intent_new);
}
else
{
Toast.makeText(CameraDemo.this, "Take a picture", Toast.LENGTH_SHORT).show();
}
}
});
Log.d(TAG, "onCreate'd");
}
catch(Exception e ){}
}
ShutterCallback shutterCallback = new ShutterCallback() {
public void onShutter() {
Log.d(TAG, "onShutter'd");
}
};
/** Handles data for raw picture */
PictureCallback rawCallback = new PictureCallback() {
public void onPictureTaken(byte[] data, Camera camera) {
Log.d(TAG, "onPictureTaken - raw");
}
};
/** Handles data for jpeg picture */
PictureCallback jpegCallback = new PictureCallback() {
public void onPictureTaken(byte[] data, Camera camera) {
//taking time and date
Constants.dateValue=dateAndTime("yyyy-MM-dd");
Constants.timeValue=dateAndTime("HH:mm:ss");
FileOutputStream outStream = null;
try {
// write to local sandbox file system instead:
// outStream = CameraDemo.this.openFileOutput(
//         String.format("%d.jpg", System.currentTimeMillis()), 0);
// Or write to sdcard
long imageNameLong=System.currentTimeMillis();
Constants.imageName=new Long(imageNameLong).toString();
outStream = new FileOutputStream(String.format("/sdcard/%d.jpg", imageNameLong));
outStream.write(data);
Constants.takepic=1;
outStream.close();
Log.d(TAG, "onPictureTaken - wrote bytes: " + data.length);
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
} finally {
}
Log.d(TAG, "onPictureTaken - jpeg");
}
};
}
//Preview Class
class Preview extends SurfaceView implements SurfaceHolder.Callback {
private static final String TAG = "Preview";
SurfaceHolder mHolder;
public Camera camera;
Preview(Context context) {
super(context);
// Install a SurfaceHolder.Callback so we get notified when the
// underlying surface is created and destroyed.
mHolder = getHolder();
mHolder.addCallback(this);
mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
}
public void surfaceCreated(SurfaceHolder holder) {
// The Surface has been created, acquire the camera and tell it where
// to draw.
camera = Camera.open();
try {
camera.setPreviewDisplay(holder);
camera.setPreviewCallback(new PreviewCallback() {
public void onPreviewFrame(byte[] data, Camera arg1) {
/*FileOutputStream outStream = null;
try {
outStream = new FileOutputStream(String.format("/sdcard/%d.jpg",
System.currentTimeMillis()));
outStream.write(data);
outStream.close();
Log.d(TAG, "onPreviewFrame - wrote bytes: " + data.length);
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
} finally {
}*/
Preview.this.invalidate();
}
});
} catch (IOException e) {
e.printStackTrace();
}
}
public void surfaceDestroyed(SurfaceHolder holder) {
// Surface will be destroyed when we return, so stop the preview.
// Because the CameraDevice object is not a shared resource, it's very
// important to release it when the activity is paused.
camera.stopPreview();
camera = null;
}
public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
// Now that the size is known, set up the camera parameters and begin
// the preview.
//CAMERA PARAMETERS NEED TO BE CHANGED :-
Camera.Parameters parameters = camera.getParameters();
parameters.setPreviewSize(w, h);
if (getResources().getConfiguration().orientation == Configuration.ORIENTATION_PORTRAIT) {
    parameters.set("rotation", 90);
    parameters.set("orientation", "portrait");
}
if (getResources().getConfiguration().orientation == Configuration.ORIENTATION_LANDSCAPE) {
    parameters.set("rotation", 90);
    parameters.set("orientation", "landscape");
}
parameters.set("jpeg-quality", 50);
camera.setParameters(parameters);
camera.startPreview();
}
@Override
public void draw(Canvas canvas) {
super.draw(canvas);
Paint p = new Paint();
p.setColor(Color.RED);
Log.d(TAG,"draw");
canvas.drawText("PREVIEW", canvas.getWidth()/2, canvas.getHeight()/2, p );
}
}
If it gives you any error, please post it, or use Log.d to show what Parameters your Dell is returning, and use SendLog to see what the Samsung Galaxy is returning too. I think it's very important to see that, because I had some problems with other parameters and I solved them like that.
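A quick, hedged way to do that comparison is to dump the whole parameter set on each device and diff the logcat output; Camera.Parameters.flatten() returns every key=value pair as one semicolon-separated string (the method name logCameraParameters below is just illustrative):
// Illustrative helper: log the full parameter string so the Dell and the
// Samsung Galaxy output can be compared side by side.
private void logCameraParameters(Camera camera) {
    Camera.Parameters params = camera.getParameters();
    Log.d("CameraDemo", "Supported parameters: " + params.flatten());
}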