I'm developing for API 7 (2.1). I implemented a camera view like this:
public class CameraView extends SurfaceView implements SurfaceHolder.Callback {
SurfaceHolder mHolder;
int width;
int height;
Camera mCamera;
public CameraView(Context context, AttributeSet attrs) {
super(context, attrs);
initHolder();
}
public CameraView(Context context) {
super(context);
initHolder();
}
private void initHolder() {
// Install a SurfaceHolder.Callback so we get notified when the
// underlying surface is created and destroyed.
mHolder = getHolder();
mHolder.addCallback(this);
mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
}
public void surfaceCreated(SurfaceHolder holder) {
// The Surface has been created, acquire the camera and tell it where to
// draw.
mCamera = Camera.open();
// Parameters params = mCamera.getParameters();
// // If we aren't landscape (the default), tell the camera we want
// // portrait mode
// if (this.getResources().getConfiguration().orientation != Configuration.ORIENTATION_LANDSCAPE) {
// params.set("orientation", "portrait"); // "landscape"
// // And Rotate the final picture if possible
// // This works on 2.0 and higher only
// // params.setRotation(90);
// // Use reflection to see if it exists and to call it so you can
// // support older versions
// try {
// Method rotateSet = Camera.Parameters.class.getMethod("setRotation", new Class[] { Integer.TYPE });
// Object arguments[] = new Object[] { new Integer(90) };
// rotateSet.invoke(params, arguments);
// } catch (NoSuchMethodException nsme) {
// // Older Device
// Log.v("CAMERAVIEW", "No Set Rotation");
// } catch (IllegalArgumentException e) {
// Log.v("CAMERAVIEW", "Exception IllegalArgument");
// } catch (IllegalAccessException e) {
// Log.v("CAMERAVIEW", "Illegal Access Exception");
// } catch (InvocationTargetException e) {
// Log.v("CAMERAVIEW", "Invocation Target Exception");
// }
// }
// mCamera.setParameters(params);
setDisplayOrientation(mCamera, 90);
try {
mCamera.setPreviewDisplay(holder);
} catch (IOException exception) {
mCamera.release();
mCamera = null;
}
}
public void surfaceDestroyed(SurfaceHolder holder) {
// Surface will be destroyed when we return, so stop the preview.
// Because the CameraDevice object is not a shared resource, it's very
// important to release it when the activity is paused.
mCamera.stopPreview();
mCamera.release();
mCamera = null;
}
public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
width = w;
height = h;
// Now that the size is known, set up the camera parameters and begin the preview.
Camera.Parameters parameters = mCamera.getParameters();
//parameters.setPreviewSize(w, h);
mCamera.setParameters(parameters);
mCamera.startPreview();
}
public void takePicture(Camera.ShutterCallback shutter, Camera.PictureCallback raw, Camera.PictureCallback jpeg) {
mCamera.takePicture(shutter, raw, jpeg);
}
protected void setDisplayOrientation(Camera camera, int angle){
Method downPolymorphic;
try {
downPolymorphic = camera.getClass().getMethod("setDisplayOrientation", new Class[] { int.class });
if (downPolymorphic != null) {
downPolymorphic.invoke(camera, new Object[] { angle });
}
} catch (Exception e1) {}
}
}
Only the reflection-based approach (taken from Williew's answer to Android camera rotate) shows the preview with the correct rotation on my device; otherwise the preview is always rotated -90°.
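For reference, setDisplayOrientation(int) only exists from API level 8, which is why reflection is needed on an API 7 project. If the project is compiled against a newer SDK while keeping minSdkVersion at 7, a hedged alternative sketch is a plain runtime version check (the wrapper name below is illustrative; it uses android.os.Build and android.hardware.Camera):
protected void setDisplayOrientationCompat(Camera camera, int angle) {
    if (Build.VERSION.SDK_INT >= 8) {          // Build.VERSION_CODES.FROYO
        camera.setDisplayOrientation(angle);   // direct call, no reflection needed
    }
    // On API 7 devices there is no public equivalent; the preview keeps the
    // sensor's native (landscape) orientation there.
}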
So far so good, but now I have another problem. When I get the bitmap in the activity's callback:
public void onPictureTaken(byte[] data, Camera camera) {
Bitmap b = BitmapFactory.decodeByteArray(data, 0, data.length);
Log.d("test", "---width: " + b.getWidth() + " height: " + b.getHeight());
}
It's always in landscape orientation (in this case 2560 x 1920). So even if the user took the picture holding the device in portrait mode, I get a 2560 x 1920 bitmap, which, for some reason, still looks exactly like the portrait picture I took when I put it in an ImageView. The problem comes when the user takes the picture holding the device in landscape mode: I would like to rotate it so I can show the result in portrait (scaled down), or do some other special handling for landscape pictures.
But I can't tell them apart from the portrait pictures, because the bitmaps have the same dimensions :/
How do I recognize portrait / landscape pictures?
Any idea...? I'm new to the camera and kind of lost...
Thanks in advance.
Edit
OK, I think the dimensions always being the same is actually not a problem, because the picture really does have the same dimensions no matter how I was holding the device. The only thing I don't understand is why I always get width > height when the preview and the pictures are clearly in portrait mode.
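For what it's worth: since the Camera always delivers the frame in the sensor's native landscape orientation, one common way to know how the device was held at capture time is to track orientation with an OrientationEventListener and pass it to Camera.Parameters.setRotation() before takePicture(). The sketch below is a minimal, hedged illustration (the mCamera field is assumed, and the full formula in the Camera docs also adds the sensor's own orientation from CameraInfo, which needs API 9+); depending on the device, setRotation() either rotates the JPEG pixels or only sets the EXIF orientation tag.
// Hedged sketch: uses android.view.OrientationEventListener, android.content.Context,
// android.hardware.Camera. Assumes a back camera mounted in the usual landscape position.
private OrientationEventListener mOrientationListener;

private void startOrientationListener(Context context) {
    mOrientationListener = new OrientationEventListener(context) {
        @Override
        public void onOrientationChanged(int orientation) {
            if (orientation == ORIENTATION_UNKNOWN || mCamera == null) {
                return;
            }
            // round to the nearest multiple of 90, as setRotation() expects
            int rotation = ((orientation + 45) / 90 * 90) % 360;
            Camera.Parameters params = mCamera.getParameters();
            params.setRotation(rotation);   // may rotate the JPEG or only write EXIF
            mCamera.setParameters(params);
        }
    };
    mOrientationListener.enable();
}
Call mOrientationListener.disable() when the preview goes away; a landscape shot can then be told apart from a portrait one either by the rotation value you last applied or by the JPEG's EXIF orientation tag.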
Related
How to solve the streching of image in my camera preview?
Camera Preview Class
public class CameraPreview extends SurfaceView implements SurfaceHolder.Callback {
private SurfaceHolder mHolder;
private Camera mCamera;
public CameraPreview(Context context, Camera camera) {
super(context);
mCamera = camera;
mHolder = getHolder();
mHolder.addCallback(this);
// deprecated setting, but required on Android versions prior to 3.0
mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
}
public void surfaceCreated(SurfaceHolder holder) {
try {
// the surface has been created; start the camera preview if we already have a camera
if (mCamera != null) {
mCamera.setPreviewDisplay(holder);
mCamera.startPreview();
}
} catch (IOException e) {
Log.d(VIEW_LOG_TAG,
"Error setting camera preview: " + e.getMessage());
}
}
public void refreshCamera(Camera camera) {
if (mHolder.getSurface() == null) {
// preview surface does not exist
return;
}
// stop preview before making changes
try {
mCamera.stopPreview();
} catch (Exception e) {
// ignore: tried to stop a non-existent preview
}
// set preview size and make any resize, rotate or
// reformatting changes here
// start preview with new settings
setCamera(camera);
try {
mCamera.setPreviewDisplay(mHolder);
mCamera.startPreview();
} catch (Exception e) {
Log.d(VIEW_LOG_TAG,
"Error starting camera preview: " + e.getMessage());
}
}
public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
// If your preview can change or rotate, take care of those events here.
// Make sure to stop the preview before resizing or reformatting it.
refreshCamera(mCamera);
}
public void setCamera(Camera camera) {
// method to set a camera instance
mCamera = camera;
}
@Override
public void surfaceDestroyed(SurfaceHolder holder) {
// TODO Auto-generated method stub
// mCamera.release();
}
}
Maybe your CameraView's aspect ratio does not match the aspect ratio of the camera preview size.
You should calculate your CameraView's aspect ratio and use it to find the best size to pass to Camera.Parameters.setPreviewSize().
Try this
private Camera.Size getBestPreviewSize(Camera.Parameters parameters, int width, int height) {
Camera.Size result = null;
double ratio = (double) width / height; // cast to avoid integer division
double w = 0, h = 0;
for (Camera.Size size : parameters.getSupportedPreviewSizes()) {
w = size.width;
h = size.height;
if (Math.abs(w / h - ratio) < 0.05) { // small tolerance instead of exact double equality
if (result != null) {
if (w > result.width) {
result = size;
}
} else {
result = size;
}
}
}
return (result);
}
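A hedged usage sketch (illustrative only, reusing the mCamera field from the CameraPreview classes above): call it from surfaceChanged() before restarting the preview.
// Pick a preview size whose aspect ratio matches the view, then apply it.
Camera.Parameters params = mCamera.getParameters();
Camera.Size best = getBestPreviewSize(params, w, h);
if (best != null) {
    params.setPreviewSize(best.width, best.height);
    mCamera.setParameters(params);
}
mCamera.startPreview();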
I have the following class in an APK that does face detection and draws a custom view on top of the camera preview. Face detection works fine on my Samsung S3, but I tested it on a couple of other Android phones and face detection never starts there. Why is that, and how can I get it to work? (Please don't mind the indentation.)
public class CameraPreview extends SurfaceView implements SurfaceHolder.Callback {
private SurfaceHolder mHolder;
private Camera mCamera;
private MyDrawing md;
public CameraPreview(Context context, Camera camera) {
super(context);
mCamera = camera;
mHolder = getHolder();
mHolder.addCallback(this);
// deprecated setting, but required on Android versions prior to 3.0
mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
}
public void refreshCamera(Camera camera) {
if (mHolder.getSurface() == null) {
// preview surface does not exist
return;
}
// stop preview before making changes
try {
mCamera.stopPreview();
} catch (Exception e) {
// ignore: tried to stop a non-existent preview
}
// set preview size and make any resize, rotate or
// reformatting changes here
// start preview with new settings
setCamera(camera);
try {
mCamera.setPreviewDisplay(mHolder);
mCamera.startPreview();
//startFaceDetection();
} catch (Exception e) {
Log.d(VIEW_LOG_TAG, "Error starting camera preview: " + e.getMessage());
}
}
public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
// If your preview can change or rotate, take care of those events here.
// Make sure to stop the preview before resizing or reformatting it.
refreshCamera(mCamera);
}
public void setCamera(Camera camera) {
//method to set a camera instance
mCamera = camera;
mCamera.setFaceDetectionListener(faceDetectionListener);
startFaceDetection();
}
@Override
public void surfaceDestroyed(SurfaceHolder holder) {
// TODO Auto-generated method stub
mCamera.release();
}
private Camera.FaceDetectionListener faceDetectionListener = new Camera.FaceDetectionListener() {
@Override
public void onFaceDetection(Camera.Face[] faces, Camera c) {
if (faces.length > 0) {
Log.d("FaceDetection", "face detected X and Y are as: " + faces.length +
" Face 1 Location X: " + faces[0].rect.centerX() +
"Y: " + faces[0].rect.centerY() +" LIES IN "+(MyDrawing.w-MyDrawing.radius) +"--"+(MyDrawing.w+MyDrawing.radius));
if(faces[0].rect.centerX()>=0 && faces[0].rect.centerX()<115 )
{
Log.d("ALERT = ", "Detection Started" );
AndroidVideoCaptureExample.capture.setText("Recording/ stopNsave ");
AndroidVideoCaptureExample.faceDetect();
}
} else {
Log.d("FaceDetection", "circle cordinates are as: " + (MyDrawing.w-MyDrawing.radius) +"cX"+ MyDrawing.radius+"cY");
}
}
};
public void startFaceDetection(){
// Try starting Face Detection
Camera.Parameters params = mCamera.getParameters();
// start face detection only *after* preview has started
if (params.getMaxNumDetectedFaces() > 0){
// camera supports face detection, so can start it:
mCamera.startFaceDetection();
}
}
}
The minimum supported SDK is set to 10.
The problem was the targetSdkVersion in the AndroidManifest file. I had set it to the latest Android M, whereas before it was working with 4.4.4 (KitKat) as the target. It works fine now. Bingo.
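Independent of the targetSdkVersion fix above, note that the Camera documentation requires startFaceDetection() to be called after startPreview() each time the preview is (re)started, and only if the device reports support, which the commented-out call in refreshCamera() hints at. A minimal hedged sketch of that ordering (assuming the same mCamera/mHolder fields as the class above):
try {
    mCamera.setPreviewDisplay(mHolder);
    mCamera.startPreview();
    // (re)start face detection only once the preview is actually running
    if (mCamera.getParameters().getMaxNumDetectedFaces() > 0) {
        mCamera.startFaceDetection();
    }
} catch (Exception e) {
    Log.d(VIEW_LOG_TAG, "Error starting camera preview: " + e.getMessage());
}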
I am trying to implement a feature that involves taking pictures while recording video. That's why I concluded I should use the screenshot approach on the SurfaceView.
However, when I try to take a screenshot of the SurfaceView, I always get a blank image.
Here is the code I am using to take the snapshot:
View tempView = (View)MY_SURFACE_VIEW;
tempView.setDrawingCacheEnabled(true);
Bitmap tempBmp = Bitmap.createBitmap(tempView.getDrawingCache());
tempView.setDrawingCacheEnabled(false);
//Saving this Bitmap to a File....
In case you think this is a duplicate question, let me assure you that I tried the following solutions on SO for the same problem before asking this one.
https://stackoverflow.com/questions/24134964/issue-with-camera-picture-taken-snapshot-using-surfaceview-in-android
Facing issue to take a screenshot while recording a video
Take camera screenshot while recording - Like in Galaxy S3?
Taking screen shot of a SurfaceView in android
Get screenshot of surfaceView in Android (this is the correct answer, but only partially answered; I have already asked @sajar to explain the answer)
Other resources on the internet:
1. http://www.coderanch.com/t/622613/Android/Mobile/capture-screenshot-simple-animation-project
2. http://www.phonesdevelopers.com/1795894/
None of these has worked for me so far. I also gather that we need some thread that interacts with the SurfaceHolder and gets the bitmap from it, but I am not sure how to implement that.
Any help is highly appreciated.
Here's another one: Take screenshot of SurfaceView.
SurfaceViews have a "surface" part and a "view" part; your code tries to capture the "view" part. The "surface" part is a separate layer, and there's no trivial "grab all pixels" method. The basic difficulty is that your app is on the "producer" side of the surface, rather than the "consumer" side, so reading pixels back out is problematic. Note that the underlying buffers are in whatever format is most convenient for the data producer, so for camera preview it'll be a YUV buffer.
The easiest and most efficient way to "capture" the surface pixels is to draw them twice, once for the screen and once for capture. If you do this with OpenGL ES, the YUV to RGB conversion will likely be done by a hardware module, which will be much faster than receiving camera frames in YUV buffers and doing your own conversion.
Grafika's "texture from camera" activity demonstrates manipulation of incoming video data with GLES. After rendering you can get the pixels with glReadPixels(). The performance of glReadPixels() can vary significantly between devices and different use cases. EglSurfaceBase#saveFrame() shows how to capture to a Bitmap and save as PNG.
More information about the Android graphics architecture, notably the producer-consumer nature of SurfaceView surfaces, can be found in this document.
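For completeness, here is a minimal hedged sketch (not the poster's code) of reading the just-rendered GLES frame back into a Bitmap, in the spirit of Grafika's EglSurfaceBase#saveFrame(); glReadPixels() returns rows bottom-up, so the result usually needs a vertical flip.
// Sketch: call on the GL thread, after drawing the frame and before swapping buffers.
// Uses android.opengl.GLES20, android.graphics.Bitmap, android.graphics.Matrix,
// java.nio.ByteBuffer, java.nio.ByteOrder. width/height are the EGL surface dimensions.
public static Bitmap readCurrentFrame(int width, int height) {
    ByteBuffer buf = ByteBuffer.allocateDirect(width * height * 4);
    buf.order(ByteOrder.LITTLE_ENDIAN);
    GLES20.glReadPixels(0, 0, width, height,
            GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, buf);
    buf.rewind();

    Bitmap bmp = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
    bmp.copyPixelsFromBuffer(buf);

    // OpenGL's origin is bottom-left, so flip vertically for a normal Bitmap.
    Matrix flip = new Matrix();
    flip.postScale(1f, -1f);
    Bitmap result = Bitmap.createBitmap(bmp, 0, 0, width, height, flip, false);
    bmp.recycle();
    return result;
}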
public class AndroidSurfaceviewExample extends Activity implements SurfaceHolder.Callback {
static Camera camera;
SurfaceView surfaceView;
SurfaceHolder surfaceHolder;
static boolean boo;
static Thread x;
GLSurfaceView glSurfaceView;
public static Bitmap mBitmap;
public static Camera.Parameters param;
public static Camera.Size mPreviewSize;
public static byte[] byteArray;
PictureCallback jpegCallback;
private Bitmap inputBMP = null, bmp, bmp1;
public static ImageView imgScreen;
/** Called when the activity is first created. */
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.camera);
surfaceView = (SurfaceView) findViewById(R.id.surfaceView);
surfaceHolder = surfaceView.getHolder();
Button btnTakeScreen = (Button)findViewById(R.id.btnTakeScreen);
imgScreen = (ImageView)findViewById(R.id.imgScreen);
btnTakeScreen.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
Bitmap screen = Bitmap.createBitmap(getBitmap());
imgScreen.setImageBitmap(screen);
}
});
// Install a SurfaceHolder.Callback so we get notified when the
// underlying surface is created and destroyed.
surfaceHolder.addCallback(this);
// deprecated setting, but required on Android versions prior to 3.0
surfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
jpegCallback = new PictureCallback() {
@SuppressLint("WrongConstant")
public void onPictureTaken(byte[] data, Camera camera) {
FileOutputStream outStream = null;
try {
outStream = new FileOutputStream(String.format("/sdcard/%d.jpg", System.currentTimeMillis()));
outStream.write(data);
outStream.close();
Log.d("Log", "onPictureTaken - wrote bytes: " + data.length);
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
} finally {
}
Toast.makeText(getApplicationContext(), "Picture Saved", 2000).show();
refreshCamera();
}
};
}
public void refreshCamera() {
if (surfaceHolder.getSurface() == null) {
// preview surface does not exist
return;
}
// stop preview before making changes
try {
camera.stopPreview();
} catch (Exception e) {
// ignore: tried to stop a non-existent preview
}
// set preview size and make any resize, rotate or
// reformatting changes here
// start preview with new settings
try {
camera.setPreviewDisplay(surfaceHolder);
camera.startPreview();
} catch (Exception e) {
}
}
public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
// Now that the size is known, set up the camera parameters and begin
// the preview.
refreshCamera();
}
public void surfaceCreated(SurfaceHolder holder) {
if (camera == null) {
try {
camera = Camera.open();
} catch (RuntimeException ignored) {
}
}
try {
if (camera != null) {
WindowManager winManager = (WindowManager) getApplicationContext().getSystemService(Context.WINDOW_SERVICE);
camera.setPreviewDisplay(surfaceHolder);
}
} catch (Exception e) {
if (camera != null)
camera.release();
camera = null;
}
if (camera == null) {
return;
} else {
camera.setPreviewCallback(new Camera.PreviewCallback() {
@Override
public void onPreviewFrame(byte[] bytes, Camera camera) {
if (param == null) {
return;
}
byteArray = bytes;
}
});
}
param = camera.getParameters();
mPreviewSize = param.getSupportedPreviewSizes().get(0);
param.setColorEffect(Camera.Parameters.EFFECT_NONE);
//set antibanding to none
if (param.getAntibanding() != null) {
param.setAntibanding(Camera.Parameters.ANTIBANDING_OFF);
}
// set white ballance
if (param.getWhiteBalance() != null) {
param.setWhiteBalance(Camera.Parameters.WHITE_BALANCE_CLOUDY_DAYLIGHT);
}
//set flash
if (param.getFlashMode() != null) {
param.setFlashMode(Camera.Parameters.FLASH_MODE_OFF);
}
//set zoom
if (param.isZoomSupported()) {
param.setZoom(0);
}
//set focus mode
param.setFocusMode(Camera.Parameters.FOCUS_MODE_INFINITY);
// modify parameter
camera.setParameters(param);
try {
// The Surface has been created, now tell the camera where to draw
// the preview.
camera.setPreviewDisplay(surfaceHolder);
camera.startPreview();
} catch (Exception e) {
// check for exceptions
System.err.println(e);
return;
}
}
public void surfaceDestroyed(SurfaceHolder holder) {
// stop preview and release camera
camera.stopPreview();
camera.release();
camera = null;
}
public Bitmap getBitmap() {
try {
if (param == null)
return null;
if (mPreviewSize == null)
return null;
int format = param.getPreviewFormat();
YuvImage yuvImage = new YuvImage(byteArray, format, mPreviewSize.width, mPreviewSize.height, null);
ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream();
Log.i("myLog","array: "+byteArray.toString());
Rect rect = new Rect(0, 0, mPreviewSize.width, mPreviewSize.height);
yuvImage.compressToJpeg(rect, 75, byteArrayOutputStream);
BitmapFactory.Options options = new BitmapFactory.Options();
options.inPurgeable = true;
options.inInputShareable = true;
mBitmap = BitmapFactory.decodeByteArray(byteArrayOutputStream.toByteArray(), 0, byteArrayOutputStream.size(), options);
byteArrayOutputStream.flush();
byteArrayOutputStream.close();
} catch (IOException ioe) {
ioe.printStackTrace();
}
return mBitmap;
}
}
I am creating a camera app, but I have problems with startPreview; it gives me:
java.lang.RuntimeException: startPreview failed
Here is my camera Activity:
public class CameraActivity extends Activity {
private Camera mCamera;
private CameraPreview mPreview;
private Target_Frame targetFrame;
@Override
protected void onCreate(Bundle savedInstanceState) {
// TODO Auto-generated method stub
super.onCreate(savedInstanceState);
setContentView(R.layout.camera_layout);
mCamera=getCameraInstance();
mPreview=new CameraPreview(this, mCamera);
FrameLayout preview=(FrameLayout)findViewById(R.id.camera_preview);
preview.addView(mPreview);
}
/** Check if this device has a camera only if not specified in the manifest */
public boolean checkCameraHardware(Context context) {
if (context.getPackageManager().hasSystemFeature(PackageManager.FEATURE_CAMERA)){
// this device has a camera
return true;
} else {
// no camera on this device
return false;
}
}
/** A safe way to get an instance of the Camera object. */
public static Camera getCameraInstance(){
Camera c = null;
try {
c = Camera.open(); // attempt to get a Camera instance
}
catch (Exception e){
// Camera is not available (in use or does not exist)
}
return c; // returns null if camera is unavailable
}
/**Check if the device has flash*/
public boolean checkFlash(Context context){
if(context.getPackageManager().hasSystemFeature(PackageManager.FEATURE_CAMERA_FLASH)){
//the device has flash
return true;
}else{
//no flash
return false;
}
}
@Override
protected void onDestroy() {
// TODO Auto-generated method stub
super.onDestroy();
releaseCamera();
}
@Override
protected void onPause() {
// TODO Auto-generated method stub
super.onPause();
releaseCamera();
}
@Override
protected void onResume() {
// TODO Auto-generated method stub
super.onResume();
//Test if i have to put all this code like in onCreate
if(mCamera!=null){
return;
}
mCamera=getCameraInstance();
if(mPreview!=null){
return;
}
mPreview=new CameraPreview(this, mCamera);
FrameLayout preview=(FrameLayout)findViewById(R.id.camera_preview);
preview.addView(mPreview);
}
private void releaseCamera(){
if (mCamera != null){
mCamera.release(); // release the camera for other applications
mCamera = null;
}
}}
And here is my SurfaceView code:
public class CameraPreview extends SurfaceView implements SurfaceHolder.Callback {
private static final String TAG = "CameraPreview";
private SurfaceHolder mHolder;
private Camera mCamera;
public CameraPreview(Context context, Camera camera) {
super(context);
mCamera = camera;
// Install a SurfaceHolder.Callback so we get notified when the
// underlying surface is created and destroyed.
mHolder = getHolder();
mHolder.addCallback(this);
// deprecated setting, but required on Android versions prior to 3.0
mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
}
public void surfaceCreated(SurfaceHolder holder) {
// The Surface has been created, now tell the camera where to draw the preview.
try {
mCamera.setPreviewDisplay(holder);
mCamera.startPreview();
} catch (IOException e) {
Log.d(TAG, "Error setting camera preview: " + e.getMessage());
}
}
public void surfaceDestroyed(SurfaceHolder holder) {
// empty. Take care of releasing the Camera preview in your activity.
}
public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
// If your preview can change or rotate, take care of those events here.
// Make sure to stop the preview before resizing or reformatting it.
if (mHolder.getSurface() == null){
// preview surface does not exist
return;
}
// stop preview before making changes
try {
mCamera.stopPreview();
} catch (Exception e){
// ignore: tried to stop a non-existent preview
}
// set preview size and make any resize, rotate or
// reformatting changes here
Parameters parameters= mCamera.getParameters();
parameters.setPreviewSize(w, h);
mCamera.setParameters(parameters);
// start preview with new settings
try {
mCamera.setPreviewDisplay(mHolder);
mCamera.startPreview();
} catch (Exception e){
Log.d(TAG, "Error starting camera preview: " + e.getMessage());
}
}
}
And here is my error log:
12-01 13:17:01.135: E/AndroidRuntime(1161): FATAL EXCEPTION: main
12-01 13:17:01.135: E/AndroidRuntime(1161): java.lang.RuntimeException: startPreview
12-01 13:17:01.135: E/AndroidRuntime(1161): at com.example.prueba.CameraPreview.surfaceCreated(CameraPreview.java:36)
I solved it by deleting some lines in surfaceChanged:
public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
// If your preview can change or rotate, take care of those events here.
// Make sure to stop the preview before resizing or reformatting it.
Log.d("Function", "surfaceChanged iniciado");
if (mHolder.getSurface() == null){
// preview surface does not exist
return;
}
// stop preview before making changes
try {
mCamera.stopPreview();
} catch (Exception e){
// ignore: tried to stop a non-existent preview
}
// set preview size and make any resize, rotate or
// reformatting changes here
// start preview with new settings
try {
mCamera.setPreviewDisplay(mHolder);
mCamera.startPreview();
} catch (Exception e){
Log.d(TAG, "Error starting camera preview: " + e.getMessage());
}
}
So the error must be in one of these lines:
Parameters parameters= mCamera.getParameters();
parameters.setPreviewSize(w, h);
mCamera.setParameters(parameters);
Could someone explain to me what was wrong with those lines?
Is (w, h) a valid preview size for your camera?
You can use mCamera.getParameters().getSupportedPreviewSizes() to get all of the valid preview sizes.
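A hedged sketch of checking this (names are illustrative; uses java.util.List): log what the driver actually supports, then pass one of those sizes to setPreviewSize() instead of the raw surface w/h.
// Dump every preview size the camera reports so you can see whether (w, h) is valid.
List<Camera.Size> sizes = mCamera.getParameters().getSupportedPreviewSizes();
for (Camera.Size s : sizes) {
    Log.d("CameraPreview", "supported preview size: " + s.width + "x" + s.height);
}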
It's late, but in case someone is looking for the answer:
The variables w and h are not necessarily valid preview sizes. You can get an optimal preview size using the following function:
public static Camera.Size getOptimalPreviewSize(List<Camera.Size> sizes, int w, int h) {
final double ASPECT_TOLERANCE = 0.1;
double targetRatio=(double)h / w;
if (sizes == null) return null;
Camera.Size optimalSize = null;
double minDiff = Double.MAX_VALUE;
int targetHeight = h;
for (Camera.Size size : sizes) {
double ratio = (double) size.width / size.height;
if (Math.abs(ratio - targetRatio) > ASPECT_TOLERANCE) continue;
if (Math.abs(size.height - targetHeight) < minDiff) {
optimalSize = size;
minDiff = Math.abs(size.height - targetHeight);
}
}
if (optimalSize == null) {
minDiff = Double.MAX_VALUE;
for (Camera.Size size : sizes) {
if (Math.abs(size.height - targetHeight) < minDiff) {
optimalSize = size;
minDiff = Math.abs(size.height - targetHeight);
}
}
}
return optimalSize;
}
and you can call the function like this:
public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
..
size=getOptimalPreviewSize(parameters.getSupportedPreviewSizes(), w, h);
parameters.setPreviewSize(size.width, size.height); // Camera.Size exposes public width/height fields
..
}
public void surfaceCreated(SurfaceHolder holder) {
// The Surface has been created, now tell the camera where to draw the preview.
try {
mCamera.setPreviewDisplay(holder);
mCamera.getParameters().getSupportedPreviewSizes();
mCamera.startPreview();
Log.d(TAG, "Camera preview started.");
} catch (IOException e) {
Log.d(TAG, "Error setting camera preview: " + e.getMessage());
}
}
I had this error, and it was because I didn't call releaseCamera() in onPause() the first time I started my app.
After a reboot, everything worked fine ^^