I am trying to implement functionality that includes taking pictures while recording video. That's why I concluded I should use the screenshot approach with SurfaceView.
However, when I try to take a screenshot of the SurfaceView, I always get a blank image.
Here is the code I am using to take the snapshot:
View tempView = (View)MY_SURFACE_VIEW;
tempView.setDrawingCacheEnabled(true);
Bitmap tempBmp = Bitmap.createBitmap(tempView.getDrawingCache());
tempView.setDrawingCacheEnabled(false);
//Saving this Bitmap to a File....
In case you think this is a duplicate question, let me assure you that I have tried the following solutions posted on SO for the same problem before asking this one.
https://stackoverflow.com/questions/24134964/issue-with-camera-picture-taken-snapshot-using-surfaceview-in-android
Facing issue to take a screenshot while recording a video
Take camera screenshot while recording - Like in Galaxy S3?
Taking screen shot of a SurfaceView in android
Get screenshot of surfaceView in Android (this is the correct answer, but only partially answered; I have already asked @sajar to explain the answer)
Other resources on the internet:
1. http://www.coderanch.com/t/622613/Android/Mobile/capture-screenshot-simple-animation-project
2. http://www.phonesdevelopers.com/1795894/
None of these has worked for me so far. I also know that we need to create some thread that interacts with the SurfaceHolder and gets the bitmap from it, but I am not sure how to implement that.
Any help is highly appreciated.
Here's another one: Take screenshot of SurfaceView.
SurfaceViews have a "surface" part and a "view" part; your code tries to capture the "view" part. The "surface" part is a separate layer, and there's no trivial "grab all pixels" method. The basic difficulty is that your app is on the "producer" side of the surface, rather than the "consumer" side, so reading pixels back out is problematic. Note that the underlying buffers are in whatever format is most convenient for the data producer, so for camera preview it'll be a YUV buffer.
The easiest and most efficient way to "capture" the surface pixels is to draw them twice, once for the screen and once for capture. If you do this with OpenGL ES, the YUV to RGB conversion will likely be done by a hardware module, which will be much faster than receiving camera frames in YUV buffers and doing your own conversion.
Grafika's "texture from camera" activity demonstrates manipulation of incoming video data with GLES. After rendering you can get the pixels with glReadPixels(). The performance of glReadPixels() can vary significantly between devices and different use cases. EglSurfaceBase#saveFrame() shows how to capture to a Bitmap and save as PNG.
More information about the Android graphics architecture, notably the producer-consumer nature of SurfaceView surfaces, can be found in this document.
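For reference, here is a minimal sketch of the readback step, modeled on Grafika's EglSurfaceBase#saveFrame(); the class name and the assumption that it runs on the GL thread right after drawing the frame you want to capture are mine:
import android.graphics.Bitmap;
import android.opengl.GLES20;
import java.io.BufferedOutputStream;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class FrameCapture {
    // Reads back the pixels of the currently bound framebuffer / EGL surface
    // and writes them out as a PNG. Call this right after rendering the frame
    // you want, on the thread that owns the EGL context.
    public static void saveFrame(File file, int width, int height) throws IOException {
        ByteBuffer buf = ByteBuffer.allocateDirect(width * height * 4);
        buf.order(ByteOrder.LITTLE_ENDIAN);
        GLES20.glReadPixels(0, 0, width, height,
                GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, buf);
        buf.rewind();

        BufferedOutputStream bos = null;
        try {
            bos = new BufferedOutputStream(new FileOutputStream(file));
            Bitmap bmp = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
            bmp.copyPixelsFromBuffer(buf);
            // GL's origin is bottom-left, so the saved image is vertically
            // flipped relative to the screen; flip it afterwards if needed.
            bmp.compress(Bitmap.CompressFormat.PNG, 90, bos);
            bmp.recycle();
        } finally {
            if (bos != null) bos.close();
        }
    }
}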
public class AndroidSurfaceviewExample extends Activity implements SurfaceHolder.Callback {
static Camera camera;
SurfaceView surfaceView;
SurfaceHolder surfaceHolder;
static boolean boo;
static Thread x;
GLSurfaceView glSurfaceView;
public static Bitmap mBitmap;
public static Camera.Parameters param;
public static Camera.Size mPreviewSize;
public static byte[] byteArray;
PictureCallback jpegCallback;
private Bitmap inputBMP = null, bmp, bmp1;
public static ImageView imgScreen;
/** Called when the activity is first created. */
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.camera);
surfaceView = (SurfaceView) findViewById(R.id.surfaceView);
surfaceHolder = surfaceView.getHolder();
Button btnTakeScreen = (Button)findViewById(R.id.btnTakeScreen);
imgScreen = (ImageView)findViewById(R.id.imgScreen);
btnTakeScreen.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
Bitmap screen = Bitmap.createBitmap(getBitmap());
imgScreen.setImageBitmap(screen);
}
});
// Install a SurfaceHolder.Callback so we get notified when the
// underlying surface is created and destroyed.
surfaceHolder.addCallback(this);
// deprecated setting, but required on Android versions prior to 3.0
surfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
jpegCallback = new PictureCallback() {
@SuppressLint("WrongConstant")
public void onPictureTaken(byte[] data, Camera camera) {
FileOutputStream outStream = null;
try {
outStream = new FileOutputStream(String.format("/sdcard/%d.jpg", System.currentTimeMillis()));
outStream.write(data);
outStream.close();
Log.d("Log", "onPictureTaken - wrote bytes: " + data.length);
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
} finally {
}
Toast.makeText(getApplicationContext(), "Picture Saved", Toast.LENGTH_SHORT).show();
refreshCamera();
}
};
}
public void refreshCamera() {
if (surfaceHolder.getSurface() == null) {
// preview surface does not exist
return;
}
// stop preview before making changes
try {
camera.stopPreview();
} catch (Exception e) {
// ignore: tried to stop a non-existent preview
}
// set preview size and make any resize, rotate or
// reformatting changes here
// start preview with new settings
try {
camera.setPreviewDisplay(surfaceHolder);
camera.startPreview();
} catch (Exception e) {
// failed to restart the preview
e.printStackTrace();
}
}
public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
// Now that the size is known, set up the camera parameters and begin
// the preview.
refreshCamera();
}
public void surfaceCreated(SurfaceHolder holder) {
if (camera == null) {
try {
camera = Camera.open();
} catch (RuntimeException ignored) {
}
}
try {
if (camera != null) {
WindowManager winManager = (WindowManager) getApplicationContext().getSystemService(Context.WINDOW_SERVICE);
camera.setPreviewDisplay(surfaceHolder);
}
} catch (Exception e) {
if (camera != null)
camera.release();
camera = null;
}
if (camera == null) {
return;
} else {
camera.setPreviewCallback(new Camera.PreviewCallback() {
@Override
public void onPreviewFrame(byte[] bytes, Camera camera) {
if (param == null) {
return;
}
byteArray = bytes;
}
});
}
param = camera.getParameters();
mPreviewSize = param.getSupportedPreviewSizes().get(0);
// make sure the preview really uses this size, so the YuvImage dimensions
// in getBitmap() match the frames delivered to onPreviewFrame()
param.setPreviewSize(mPreviewSize.width, mPreviewSize.height);
param.setColorEffect(Camera.Parameters.EFFECT_NONE);
//set antibanding to none
if (param.getAntibanding() != null) {
param.setAntibanding(Camera.Parameters.ANTIBANDING_OFF);
}
// set white ballance
if (param.getWhiteBalance() != null) {
param.setWhiteBalance(Camera.Parameters.WHITE_BALANCE_CLOUDY_DAYLIGHT);
}
//set flash
if (param.getFlashMode() != null) {
param.setFlashMode(Camera.Parameters.FLASH_MODE_OFF);
}
//set zoom
if (param.isZoomSupported()) {
param.setZoom(0);
}
//set focus mode
param.setFocusMode(Camera.Parameters.FOCUS_MODE_INFINITY);
// modify parameter
camera.setParameters(param);
try {
// The Surface has been created, now tell the camera where to draw
// the preview.
camera.setPreviewDisplay(surfaceHolder);
camera.startPreview();
} catch (Exception e) {
// check for exceptions
System.err.println(e);
return;
}
}
public void surfaceDestroyed(SurfaceHolder holder) {
// stop preview and release camera
camera.stopPreview();
camera.release();
camera = null;
}
public Bitmap getBitmap() {
try {
if (param == null)
return null;
if (mPreviewSize == null)
return null;
int format = param.getPreviewFormat();
YuvImage yuvImage = new YuvImage(byteArray, format, mPreviewSize.width, mPreviewSize.height, null);
ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream();
Log.i("myLog","array: "+byteArray.toString());
Rect rect = new Rect(0, 0, mPreviewSize.width, mPreviewSize.height);
yuvImage.compressToJpeg(rect, 75, byteArrayOutputStream);
BitmapFactory.Options options = new BitmapFactory.Options();
options.inPurgeable = true;
options.inInputShareable = true;
mBitmap = BitmapFactory.decodeByteArray(byteArrayOutputStream.toByteArray(), 0, byteArrayOutputStream.size(), options);
byteArrayOutputStream.flush();
byteArrayOutputStream.close();
} catch (IOException ioe) {
ioe.printStackTrace();
}
return mBitmap;
}
Related
Is it possible to use the camera in a fragment as a view, so that it doesn't open another app and leave my app?
I want something like a SurfaceView with the camera.
Yes it is, check this link.
Basically you override SurfaceView and integrate the camera picture callback.
Example code:
/* Surface on which the camera projects its capture results.
*/
class CameraPreview extends SurfaceView implements SurfaceHolder.Callback {
SurfaceHolder mHolder;
Camera mCamera;
public CameraPreview(Context context, Camera camera) {
super(context);
mCamera = camera;
// Install a SurfaceHolder.Callback so we get notified when the
// underlying surface is created and destroyed.
mHolder = getHolder();
mHolder.addCallback(this);
// deprecated setting, but required on Android versions prior to 3.0
mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
}
public void surfaceCreated(SurfaceHolder holder) {
// The Surface has been created, now tell the camera where to draw the preview.
try {
mCamera.setPreviewDisplay(holder);
mCamera.startPreview();
} catch (IOException e) {
e.printStackTrace();
}
}
public void surfaceDestroyed(SurfaceHolder holder) {
// empty. Take care of releasing the Camera preview in your activity.
}
public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
// If your preview can change or rotate, take care of those events here.
// Make sure to stop the preview before resizing or reformatting it.
if (mHolder.getSurface() == null){
// preview surface does not exist
return;
}
// stop preview before making changes
try {
mCamera.stopPreview();
} catch (Exception e){
// ignore: tried to stop a non-existent preview
}
// set preview size and make any resize, rotate or
// reformatting changes here
// start preview with new settings
try {
mCamera.setPreviewDisplay(mHolder);
mCamera.startPreview();
} catch (Exception e){
e.printStackTrace();
}
}
}
with a camera picture callback like:
private Camera.PictureCallback mPicture = new Camera.PictureCallback() {
@Override
public void onPictureTaken(byte[] data, Camera camera) {
File pictureFile = getOutputMediaFile();
if (pictureFile == null){
Toast.makeText(getActivity(), "Image retrieval failed.", Toast.LENGTH_SHORT)
.show();
return;
}
try {
FileOutputStream fos = new FileOutputStream(pictureFile);
fos.write(data);
fos.close();
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
}
};
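Note that getOutputMediaFile() above is not a framework method; you have to supply it yourself. A minimal sketch, assuming you want a timestamped JPEG in the public Pictures directory (the "MyCameraApp" folder name is just a placeholder):
// Requires: android.os.Environment, java.io.File, java.text.SimpleDateFormat,
// java.util.Date, java.util.Locale
private static File getOutputMediaFile() {
    // Bail out if external storage is not mounted.
    if (!Environment.MEDIA_MOUNTED.equals(Environment.getExternalStorageState())) {
        return null;
    }
    File mediaStorageDir = new File(
            Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_PICTURES),
            "MyCameraApp");
    if (!mediaStorageDir.exists() && !mediaStorageDir.mkdirs()) {
        return null;
    }
    String timeStamp = new SimpleDateFormat("yyyyMMdd_HHmmss", Locale.US).format(new Date());
    return new File(mediaStorageDir, "IMG_" + timeStamp + ".jpg");
}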
Native implementation is way better.
XML
<com.google.android.cameraview.CameraView
android:id="@+id/camera"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:keepScreenOn="true"
android:adjustViewBounds="true"
app:autoFocus="true"
app:aspectRatio="4:3"
app:facing="back"
app:flash="auto"/>
Inside Activity/Fragment
Start camera
mCameraView.start();
Stop camera
mCameraView.stop();
Open source: Google
Requires API Level 9. The library uses Camera 1 API on API Level 9-20 and Camera2 on 21 and above.
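For completeness, capturing a still with that library goes through CameraView.Callback. The method names below are taken from the project's README (the library is archived, so double-check them against the version you pull in), and the layout name is a placeholder:
// Inside the Activity hosting the CameraView declared in the XML above.
private CameraView mCameraView;

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_camera); // placeholder layout name
    mCameraView = (CameraView) findViewById(R.id.camera);
    mCameraView.addCallback(new CameraView.Callback() {
        @Override
        public void onPictureTaken(CameraView cameraView, byte[] data) {
            // 'data' is JPEG data; decode it or write it to a file here.
        }
    });
}

@Override
protected void onResume() {
    super.onResume();
    mCameraView.start();
}

@Override
protected void onPause() {
    mCameraView.stop();
    super.onPause();
}

// Call mCameraView.takePicture() (e.g. from a button click) to trigger
// onPictureTaken() above.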
This question already has answers here:
How do I open the "front camera" on the Android platform?
(10 answers)
Closed 5 years ago.
I need some help. I am working on an app which can automatically take a picture in an activity with the main camera, but I want to use the front camera instead of the main camera.
Can you please tell me how to use the front camera?
Here is my code:
public class TakePicture extends Activity implements SurfaceHolder.Callback
{
//a variable to store a reference to the Image View at the main.xml file
private ImageView iv_image;
//a variable to store a reference to the Surface View at the main.xml file
private SurfaceView sv;
//a bitmap to display the captured image
private Bitmap bmp;
//Camera variables
//a surface holder
private SurfaceHolder sHolder;
//a variable to control the camera
private Camera mCamera;
//the camera parameters
private Parameters parameters;
/** Called when the activity is first created. */
@Override
public void onCreate(Bundle savedInstanceState)
{
super.onCreate(savedInstanceState);
setContentView(R.layout.main);
//get the Image View at the main.xml file
iv_image = (ImageView) findViewById(R.id.imageView);
//get the Surface View at the main.xml file
sv = (SurfaceView) findViewById(R.id.surfaceView);
//Get a surface
sHolder = sv.getHolder();
//add the callback interface methods defined below as the Surface View callbacks
sHolder.addCallback(this);
//tells Android that this surface will have its data constantly replaced
sHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
}
@Override
public void surfaceChanged(SurfaceHolder arg0, int arg1, int arg2, int arg3)
{
//get camera parameters
parameters = mCamera.getParameters();
//set camera parameters
mCamera.setParameters(parameters);
mCamera.startPreview();
//sets what code should be executed after the picture is taken
Camera.PictureCallback mCall = new Camera.PictureCallback()
{
@Override
public void onPictureTaken(byte[] data, Camera camera)
{
//decode the data obtained by the camera into a Bitmap
bmp = BitmapFactory.decodeByteArray(data, 0, data.length);
//set the iv_image
iv_image.setImageBitmap(bmp);
TakePicture t=new TakePicture();
t.SaveBitmap(bmp);
// SendEmail(bmp);
}
};
mCamera.takePicture(null, null, mCall);
}
@Override
public void surfaceCreated(SurfaceHolder holder)
{
// The Surface has been created, acquire the camera and tell it where
// to draw the preview.
mCamera = Camera.open();
try {
mCamera.setPreviewDisplay(holder);
} catch (IOException exception) {
mCamera.release();
mCamera = null;
}
}
@Override
public void surfaceDestroyed(SurfaceHolder holder)
{
//stop the preview
mCamera.stopPreview();
//release the camera
mCamera.release();
//unbind the camera from this object
mCamera = null;
}
public void openFrontFacingCamera() {
numberOfCamera = Camera.getNumberOfCameras();
if(camId == Camera.CameraInfo.CAMERA_FACING_BACK){
camId = Camera.CameraInfo.CAMERA_FACING_FRONT;
Toast.makeText(getApplicationContext(), "BACK TO FRONT" ,
1000).show();
try {
camera.release();
camera = Camera.open(camId);
camera.setPreviewDisplay(surfaceHolder);
camera.startPreview();
previewing = true;
} catch (RuntimeException e) {
} catch (IOException e) {}
}else if(camId == Camera.CameraInfo.CAMERA_FACING_FRONT){
camId = Camera.CameraInfo.CAMERA_FACING_BACK;
Toast.makeText(getApplicationContext(), "FRONT TO BACK" ,
1000).show();
try {
camera.release();
camera = Camera.open(camId);
camera.setPreviewDisplay(surfaceHolder);
camera.startPreview();
} catch (RuntimeException e) {
} catch (IOException e) {}
}
}
Hope this function will help you.
Change
mCamera = Camera.open();
to
mCamera = Camera.open(Camera.CameraInfo.CAMERA_FACING_FRONT);
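This works on most devices because Camera.CameraInfo.CAMERA_FACING_FRONT happens to equal 1, but Camera.open(int) really expects a camera ID, not a facing constant. A safer sketch looks the ID up via Camera.getCameraInfo():
// Find and open the front-facing camera by its ID; returns null if there is
// no front camera or it cannot be opened.
private static Camera openFrontCamera() {
    Camera.CameraInfo info = new Camera.CameraInfo();
    for (int id = 0; id < Camera.getNumberOfCameras(); id++) {
        Camera.getCameraInfo(id, info);
        if (info.facing == Camera.CameraInfo.CAMERA_FACING_FRONT) {
            try {
                return Camera.open(id);
            } catch (RuntimeException e) {
                return null; // camera is in use or otherwise unavailable
            }
        }
    }
    return null;
}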
I'm developing for API 7 (2.1). I implemented a camera view like this:
public class CameraView extends SurfaceView implements SurfaceHolder.Callback {
SurfaceHolder mHolder;
int width;
int height;
Camera mCamera;
public CameraView(Context context, AttributeSet attrs) {
super(context, attrs);
initHolder();
}
public CameraView(Context context) {
super(context);
initHolder();
}
private void initHolder() {
// Install a SurfaceHolder.Callback so we get notified when the
// underlying surface is created and destroyed.
mHolder = getHolder();
mHolder.addCallback(this);
mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
}
public void surfaceCreated(SurfaceHolder holder) {
// The Surface has been created, acquire the camera and tell it where to
// draw.
mCamera = Camera.open();
// Parameters params = mCamera.getParameters();
// // If we aren't landscape (the default), tell the camera we want
// // portrait mode
// if (this.getResources().getConfiguration().orientation != Configuration.ORIENTATION_LANDSCAPE) {
// params.set("orientation", "portrait"); // "landscape"
// // And Rotate the final picture if possible
// // This works on 2.0 and higher only
// // params.setRotation(90);
// // Use reflection to see if it exists and to call it so you can
// // support older versions
// try {
// Method rotateSet = Camera.Parameters.class.getMethod("setRotation", new Class[] { Integer.TYPE });
// Object arguments[] = new Object[] { new Integer(90) };
// rotateSet.invoke(params, arguments);
// } catch (NoSuchMethodException nsme) {
// // Older Device
// Log.v("CAMERAVIEW", "No Set Rotation");
// } catch (IllegalArgumentException e) {
// Log.v("CAMERAVIEW", "Exception IllegalArgument");
// } catch (IllegalAccessException e) {
// Log.v("CAMERAVIEW", "Illegal Access Exception");
// } catch (InvocationTargetException e) {
// Log.v("CAMERAVIEW", "Invocation Target Exception");
// }
// }
// mCamera.setParameters(params);
setDisplayOrientation(mCamera, 90);
try {
mCamera.setPreviewDisplay(holder);
} catch (IOException exception) {
mCamera.release();
mCamera = null;
}
}
public void surfaceDestroyed(SurfaceHolder holder) {
// Surface will be destroyed when we return, so stop the preview.
// Because the CameraDevice object is not a shared resource, it's very
// important to release it when the activity is paused.
mCamera.stopPreview();
mCamera.release();
mCamera = null;
}
public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
width = w;
height = h;
// Now that the size is known, set up the camera parameters and begin the preview.
Camera.Parameters parameters = mCamera.getParameters();
//parameters.setPreviewSize(w, h);
mCamera.setParameters(parameters);
mCamera.startPreview();
}
public void takePicture(Camera.ShutterCallback shutter, Camera.PictureCallback raw, Camera.PictureCallback jpeg) {
mCamera.takePicture(shutter, raw, jpeg);
}
protected void setDisplayOrientation(Camera camera, int angle){
Method downPolymorphic;
try {
downPolymorphic = camera.getClass().getMethod("setDisplayOrientation", new Class[] { int.class });
if (downPolymorphic != null) {
downPolymorphic.invoke(camera, new Object[] { angle });
}
} catch (Exception e1) {}
}
}
Only the approach using reflection (taken from Williew's answer in Android camera rotate) works on my device to show the preview with the correct rotation; otherwise the preview is always rotated -90°.
So far so good, but now I have another problem. When I get the bitmap in the activity's callback:
public void onPictureTaken(byte[] data, Camera camera) {
Bitmap b = BitmapFactory.decodeByteArray(data, 0, data.length);
Log.d("test", "---width: " + b.getWidth() + " height: " + b.getHeight());
}
It's always in landscape mode (in this case 2560 x 1920). So if the user took a picture holding the device in portrait mode, I get a 2560 x 1920 bitmap, which for some reason still looks exactly like the portrait picture I took when I put it in an image view. The problem comes when the user takes the picture holding the device in landscape mode: I would like to rotate it in order to show the result in portrait mode (scaled down), or do some other special handling for landscape pictures.
But I can't differentiate them from the portrait pics because the bitmap's dimensions are the same :/
How do I recognize portrait / landscape pictures?
Any idea...? I'm new to the camera and kind of lost...
Thanks in advance.
Edit
OK, I think there's no problem with the dimensions always being the same, because the picture actually always has the same dimensions, no matter how I was holding the device. The only thing I don't understand is why I always get width > height when the preview and the pictures are clearly in portrait mode.
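One way to tell the two cases apart is to record the device orientation yourself at the moment the picture is taken, since the sensor always delivers its native (landscape) dimensions. A rough sketch with OrientationEventListener; the field names and the rounding to 90-degree steps are my own, and the listener should be disabled again in onPause():
// Remember the device orientation so landscape shots can be distinguished
// from portrait shots that have identical bitmap dimensions.
private int mLastOrientation = 0; // snapped to 0 / 90 / 180 / 270 degrees
private OrientationEventListener mOrientationListener;

// Create this in onCreate() and enable() it; disable() it again in onPause().
mOrientationListener = new OrientationEventListener(this,
        SensorManager.SENSOR_DELAY_NORMAL) {
    @Override
    public void onOrientationChanged(int orientation) {
        if (orientation == ORIENTATION_UNKNOWN) return;
        mLastOrientation = ((orientation + 45) / 90 * 90) % 360;
    }
};
mOrientationListener.enable();

// In onPictureTaken(): mLastOrientation of 90 or 270 means the device was held
// in landscape when the shot was taken, so rotate or handle the bitmap there.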
It's difficult to tell what is being asked here. This question is ambiguous, vague, incomplete, overly broad, or rhetorical and cannot be reasonably answered in its current form. For help clarifying this question so that it can be reopened, visit the help center.
Closed 11 years ago.
Hello friends. I am creating an Android application which uses camera code, targeting Android API 2.1. I have installed my application on a Dell phone and the camera orientation is correct. However, when I installed my application on other Android phones, such as the Samsung Galaxy S2, there is a problem with the camera orientation: even in portrait mode the camera is rotated 90 degrees. Can anyone suggest which camera parameters need to be changed so that the camera orientation works well on all phones? Here is my code:
//class CameraDemo
public class CameraDemo extends Activity {
private static final String TAG = "CameraDemo";
Camera camera;
Preview preview;
Button buttonClick;
Button next;
/** Called when the activity is first created. */
@Override
public void onCreate(Bundle savedInstanceState) {
try
{
super.onCreate(savedInstanceState);
setContentView(R.layout.camdemo);
preview = new Preview(this);
((FrameLayout) findViewById(R.id.preview)).addView(preview);
buttonClick = (Button) findViewById(R.id.buttonClick);
next = (Button) findViewById(R.id.next);
buttonClick.setOnClickListener( new OnClickListener() {
public void onClick(View v) {
preview.camera.takePicture(shutterCallback, rawCallback, jpegCallback);
try{
Constants.takepic=1;
buttonClick.setEnabled(false); //once click is clicked, its disable
}
catch(Exception e)
{
e.printStackTrace();
}
}
});
next.setOnClickListener( new OnClickListener() {
public void onClick(View v) {
//preview.camera.takePicture(shutterCallback, rawCallback, jpegCallback);
if(Constants.takepic==1)
{
Intent intent_new=new Intent(CameraDemo.this,NotesAndUpload.class);
startActivity(intent_new);
}
else
{
Toast.makeText(CameraDemo.this, "Take a picture", Toast.LENGTH_SHORT).show();
}
}
});
Log.d(TAG, "onCreate'd");
}
catch(Exception e ){}
}
ShutterCallback shutterCallback = new ShutterCallback() {
public void onShutter() {
Log.d(TAG, "onShutter'd");
}
};
/** Handles data for raw picture */
PictureCallback rawCallback = new PictureCallback() {
public void onPictureTaken(byte[] data, Camera camera) {
Log.d(TAG, "onPictureTaken - raw");
}
};
/** Handles data for jpeg picture */
PictureCallback jpegCallback = new PictureCallback() {
public void onPictureTaken(byte[] data, Camera camera) {
//taking time and date
Constants.dateValue=dateAndTime("yyyy-MM-dd");
Constants.timeValue=dateAndTime("HH:mm:ss");
FileOutputStream outStream = null;
try {
// write to local sandbox file system
// outStream = CameraDemo.this.openFileOutput(
//         String.format("%d.jpg", System.currentTimeMillis()), 0);
// Or write to sdcard
long imageNameLong=System.currentTimeMillis();
Constants.imageName=new Long(imageNameLong).toString();
outStream = new
FileOutputStream(String.format("/sdcard/%d.jpg",imageNameLong));
outStream.write(data);
Constants.takepic=1;
outStream.close();
Log.d(TAG, "onPictureTaken - wrote bytes: " + data.length);
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
} finally {
}
Log.d(TAG, "onPictureTaken - jpeg");
}
};
}
//Preview Class
class Preview extends SurfaceView implements SurfaceHolder.Callback {
private static final String TAG = "Preview";
SurfaceHolder mHolder;
public Camera camera;
Preview(Context context) {
super(context);
// Install a SurfaceHolder.Callback so we get notified when the
// underlying surface is created and destroyed.
mHolder = getHolder();
mHolder.addCallback(this);
mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
}
public void surfaceCreated(SurfaceHolder holder) {
// The Surface has been created, acquire the camera and tell it where
// to draw.
camera = Camera.open();
try {
camera.setPreviewDisplay(holder);
camera.setPreviewCallback(new PreviewCallback() {
public void onPreviewFrame(byte[] data, Camera arg1) {
/*FileOutputStream outStream = null;
try {
outStream = new FileOutputStream(String.format("/sdcard/%d.jpg",
System.currentTimeMillis()));
outStream.write(data);
outStream.close();
Log.d(TAG, "onPreviewFrame - wrote bytes: " + data.length);
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
} finally {
}*/
Preview.this.invalidate();
}
});
} catch (IOException e) {
e.printStackTrace();
}
}
public void surfaceDestroyed(SurfaceHolder holder) {
// Surface will be destroyed when we return, so stop the preview.
// Because the CameraDevice object is not a shared resource, it's very
// important to release it when the activity is paused.
camera.stopPreview();
camera = null;
}
public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
// Now that the size is known, set up the camera parameters and begin
// the preview.
//CAMERA PARAMETERS NEED TO BE CHANGED :-
Camera.Parameters parameters = camera.getParameters();
parameters.setPreviewSize(w, h);
if(getResources().getConfiguration().orientation ==
Configuration.ORIENTATION_PORTRAIT)
{
parameters.set("rotation",90);
parameters.set("orientation", "portrait");
}
if (getResources().getConfiguration().orientation ==
Configuration.ORIENTATION_LANDSCAPE)
{
parameters.set("rotation", 90);
parameters.set("orientation", "landscape");
}
parameters.set("jpeg-quality", 50);
camera.setParameters(parameters);
camera.startPreview();
}
@Override
public void draw(Canvas canvas) {
super.draw(canvas);
Paint p= new Paint(Color.RED);
Log.d(TAG,"draw");
canvas.drawText("PREVIEW", canvas.getWidth()/2, canvas.getHeight()/2, p );
}
}
If it gives you any error, please post it, or use Log.d to show what parameters your Dell is returning, and use SendLog to see what the Samsung Galaxy is returning too. I think it's very important to see that, because I had some problems with other parameters and I solved them like that.
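If it helps, all of Camera.Parameters can be dumped at once with flatten(), which returns the key=value pairs as a single semicolon-separated string, so the output from both devices can be compared line by line (the log tag is arbitrary):
// Log every camera parameter the device reports.
Camera.Parameters params = camera.getParameters();
for (String pair : params.flatten().split(";")) {
    Log.d("CameraParams", pair);
}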