I have made a camera demo application in Android. It was working fine on a button click event, but now I want to open the camera when the activity starts, and it is not working. It fails with this error:
Failed to connect to camera service
Please have a look at my code and save me. My code is below:
surfaceView = (SurfaceView) findViewById(R.id.surfaceView1);
surfaceHolder = surfaceView.getHolder();
surfaceHolder.addCallback(this);
surfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
rawCallback = new PictureCallback() {
public void onPictureTaken(byte[] data, Camera camera) {
Log.d("Log", "onPictureTaken - raw");
}
};
/** Called when the shutter is opened */
shutterCallback = new ShutterCallback() {
public void onShutter() {
Log.i("Log", "onShutter'd");
}
};
jpegCallback = new PictureCallback() {
@SuppressLint("NewApi")
public void onPictureTaken(byte[] data, Camera camera) {
FileOutputStream outStream = null;
try {
outStream = new FileOutputStream(String.format(
"/sdcard/%d.jpg", System.currentTimeMillis()));
outStream.write(data);
outStream.close();
Log.d("Log", "onPictureTaken - wrote bytes: " + data.length);
BitmapFactory.Options options = new BitmapFactory.Options();
options.inMutable = true;
bmp = BitmapFactory.decodeByteArray(data, 0, data.length,
options);
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
} finally {
}
Log.d("Log", "onPictureTaken - jpeg");
}
};
start_camera();
private void start_camera() {
try {
releaseCameraAndPreview();
camera = Camera.open();
camera.setDisplayOrientation(90);
} catch (RuntimeException e) {
Log.e(tag, "init_camera: " + e);
return;
}
param = camera.getParameters();
param.set("jpeg-quality", 100);
camera.setParameters(param);
// modify parameter
try {
camera.setPreviewDisplay(surfaceHolder);
camera.startPreview();
// camera.takePicture(shutter, raw, jpeg)
} catch (Exception e) {
Log.e(tag, "init_camera: " + e);
return;
}
}
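For comparison, here is a minimal sketch (an assumption about the fix, not part of the code above) of the pattern that usually avoids "Failed to connect to camera service": open the camera only once the surface exists, and release it in onPause() so no earlier instance is still holding the camera service when the activity starts again:
@Override
public void surfaceCreated(SurfaceHolder holder) {
    start_camera();                 // open the camera only after the surface is ready
}
@Override
protected void onPause() {
    super.onPause();
    releaseCameraAndPreview();      // free the camera so the next open() can connect
}
private void releaseCameraAndPreview() {
    if (camera != null) {
        camera.stopPreview();
        camera.release();
        camera = null;
    }
}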
Related
I am working on a QR Codes project. I have used the ZXING library for generating QR codes, and now I want to scan QR codes in my app, but for that I am using my own custom camera. In my camera I capture an image and create a bitmap from the captured image. Is it possible to use that bitmap for scanning QR codes, by calling the decode functions of the ZXING library and passing that bitmap or byte[] to them? I will be thankful to you for helping me. Here is my implementation:
public void onCreate(Bundle savedInstanceState) {
// TODO OnCreate Method
super.onCreate(savedInstanceState);
setContentView(R.layout.camera_layout);
cameraId = Camera.CameraInfo.CAMERA_FACING_BACK;
activity = this;
filepath = Environment.getExternalStorageDirectory();
if (checkCameraHardware(this)) {
// Create an instance of Camera
mCamera = getCameraInstance();
setCameraDisplayOrientation(this, cameraId, mCamera);
try {
// Get Camera Parameters
Camera.Parameters params = mCamera.getParameters();
// Set the Focus Mode
params.setFocusMode(Camera.Parameters.FOCUS_MODE_AUTO);
mCamera.setParameters(params);
Toast.makeText(getApplicationContext(), "Camera Available",
Toast.LENGTH_LONG).show();
mPreview = new CameraPreview(this, mCamera);
FrameLayout preview = (FrameLayout) findViewById(R.id.cameraPreview);
preview.addView(mPreview);
} catch (Exception e) {
Toast.makeText(getApplicationContext(),
"Error: " + e.getMessage(), Toast.LENGTH_LONG).show();
}
} else {
Toast.makeText(getApplicationContext(), "Camera Not Available",
Toast.LENGTH_LONG).show();
}
Button captureButton = (Button) findViewById(R.id.button_capture);
captureButton.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
// get an image from the camera
mCamera.takePicture(null, null, mPicture);
}
});
}
@Override
protected void onPause() {
// TODO OnPause Method
super.onPause();
releaseCamera();
}
// TODO Detecting Camera Hardware
private boolean checkCameraHardware(Context context) {
if (context.getPackageManager().hasSystemFeature(
PackageManager.FEATURE_CAMERA)) {
// This device has camera
return true;
} else {
// No Camera on this Device
return false;
}
}
// TODO Accessing Camera
public static Camera getCameraInstance() {
Camera c = null;
try {
c = Camera.open();
} catch (Exception e) {
// Camera is not available (in use or does not exist)
}
return c; // returns null if camera is unavailable
}
private PictureCallback mPicture = new PictureCallback() {
@SuppressLint("InlinedApi")
@Override
public void onPictureTaken(byte[] data, Camera camera) {
// TODO Takes the picture and write to file
File pictureFile = getOutputMediaFile(FileColumns.MEDIA_TYPE_IMAGE);
if (pictureFile == null) {
Log.d("PICFILE",
"Error creating media file, check storage permissions");
return;
}
try {
Bitmap bmp = BitmapFactory
.decodeByteArray(data, 0, data.length);
FileOutputStream fos = new FileOutputStream(pictureFile);
fos.write(data);
fos.close();
updateGallery();
// Bitmap bitmap = BitmapFactory.decodeByteArray(data, 0,
// data.length);
Intent i = new Intent(getApplicationContext(),
TestActivity.class);
i.putExtra("Image", data);
startActivity(i);
} catch (Exception e) {
Log.d("IOEXCEPTION", "Error accessing file: " + e.getMessage());
}
}
};
Yes, you can use the decode method. Check the code below for the implementation.
String detectBarCode(Bitmap bitmap) {
int[] intArray = new int[bitmap.getWidth() * bitmap.getHeight()];
bitmap.getPixels(intArray, 0, bitmap.getWidth(), 0, 0, bitmap.getWidth(), bitmap.getHeight());
LuminanceSource source = new RGBLuminanceSource(bitmap.getWidth(), bitmap.getHeight(), intArray);
Reader reader = new QRCodeReader();
try {
Result result = reader.decode(new BinaryBitmap(new HybridBinarizer(source)));
return result.getText();
} catch (NotFoundException e) {
e.printStackTrace();
return null;
} catch (ChecksumException e) {
e.printStackTrace();
return null;
} catch (FormatException e) {
e.printStackTrace();
return null;
}
}
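For example, the JPEG byte[] delivered to onPictureTaken in the code above can be decoded into a Bitmap and handed straight to detectBarCode (a minimal sketch):
// inside onPictureTaken(byte[] data, Camera camera)
Bitmap bmp = BitmapFactory.decodeByteArray(data, 0, data.length);
String contents = detectBarCode(bmp);   // returns null if no QR code was found
if (contents != null) {
    Log.d("QR", "Decoded: " + contents);
}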
I'm a newbie in Android and I've been trying to make a camera application that is supposed to capture 5 images when the camera button is clicked, but the code I'm working on doesn't seem to work. Can anyone help, please? So far, this is the code:
public class CameraDemo extends Activity {
private static final String TAG = "CameraDemo";
Camera camera;
Preview preview;
Button buttonClick;
/** Called when the activity is first created. */
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.main);
preview = new Preview(this);
((FrameLayout) findViewById(R.id.preview)).addView(preview);
buttonClick = (Button) findViewById(R.id.buttonClick);
buttonClick.setOnClickListener( new OnClickListener() {
public void onClick(View v) {
int count = 0;
while(count < 5){
preview.camera.takePicture(shutterCallback, rawCallback, jpegCallback);
count++;
try {
wait(1000);
} catch (InterruptedException exception) {
exception.printStackTrace();
}
}
}
});
Log.d(TAG, "onCreate'd");
}
ShutterCallback shutterCallback = new ShutterCallback() {
public void onShutter() {
Log.d(TAG, "onShutter'd");
}
};
/** Handles data for raw picture */
PictureCallback rawCallback = new PictureCallback() {
public void onPictureTaken(byte[] data, Camera camera) {
Log.d(TAG, "onPictureTaken - raw");
}
};
/** Handles data for jpeg picture */
PictureCallback jpegCallback = new PictureCallback() {
public void onPictureTaken(byte[] data, Camera camera) {
FileOutputStream outStream = null;
try {
// write to local sandbox file system
// outStream = CameraDemo.this.openFileOutput(String.format("%d.jpg", System.currentTimeMillis()), 0);
// Or write to sdcard
outStream = new FileOutputStream(String.format("/sdcard/%d.jpg", System.currentTimeMillis()));
outStream.write(data);
outStream.close();
Log.d(TAG, "onPictureTaken - wrote bytes: " + data.length);
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
} finally {
}
Log.d(TAG, "onPictureTaken - jpeg");
}
};
}
And this is the other class named Preview:
class Preview extends SurfaceView implements SurfaceHolder.Callback {
private static final String TAG = "Preview";
SurfaceHolder mHolder;
public Camera camera;
Preview(Context context) {
super(context);
// Install a SurfaceHolder.Callback so we get notified when the
// underlying surface is created and destroyed.
mHolder = getHolder();
mHolder.addCallback(this);
mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
}
public void surfaceCreated(SurfaceHolder holder) {
// The Surface has been created, acquire the camera and tell it where
// to draw.
camera = Camera.open();
try {
camera.setPreviewDisplay(holder);
camera.setPreviewCallback(new PreviewCallback() {
public void onPreviewFrame(byte[] data, Camera arg1) {
FileOutputStream outStream = null;
try {
outStream = new FileOutputStream(String.format("/sdcard/%d.jpg", System.currentTimeMillis()));
outStream.write(data);
outStream.close();
Log.d(TAG, "onPreviewFrame - wrote bytes: " + data.length);
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
} finally {
}
Preview.this.invalidate();
}
});
} catch (IOException e) {
e.printStackTrace();
}
}
public void surfaceDestroyed(SurfaceHolder holder) {
// Surface will be destroyed when we return, so stop the preview.
// Because the CameraDevice object is not a shared resource, it's very
// important to release it when the activity is paused.
camera.stopPreview();
camera = null;
}
public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
// Now that the size is known, set up the camera parameters and begin
// the preview.
Camera.Parameters parameters = camera.getParameters();
parameters.setPreviewSize(w, h);
camera.setParameters(parameters);
camera.startPreview();
}
@Override
public void draw(Canvas canvas) {
super.draw(canvas);
Paint p = new Paint();
p.setColor(Color.RED);
Log.d(TAG,"draw");
canvas.drawText("PREVIEW", canvas.getWidth()/2, canvas.getHeight()/2, p );
}
}
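For reference, here is a minimal sketch (hypothetical field and method names, not the original code) that takes five pictures by restarting the preview in the JPEG callback and scheduling the next shot with a Handler, instead of blocking the UI thread with wait():
private int shotsTaken = 0;
private final Handler handler = new Handler();   // android.os.Handler

// called from the button's onClick to start the burst
private void startBurst() {
    shotsTaken = 0;
    preview.camera.takePicture(shutterCallback, rawCallback, jpegCallback);
}

// called at the end of jpegCallback.onPictureTaken(), after the file has been written
private void scheduleNextShot(final Camera camera) {
    camera.startPreview();                       // takePicture() stops the preview
    if (++shotsTaken < 5) {
        handler.postDelayed(new Runnable() {
            public void run() {
                camera.takePicture(shutterCallback, rawCallback, jpegCallback);
            }
        }, 1000);                                // space the shots without blocking the UI
    }
}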
I'm a beginner and I need help. How do I take photos with the camera and save them or send them to the next activity?
I've tried a couple of options, i.e. takePicture with a picture callback and a SurfaceView, and taking the picture with an intent. However, neither works properly on Android 2.3.3. Could someone figure out the issues with my code below?
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.sf_foto);
mCamera = getCameraInstance();
mCameraPreview = new SF_CameraPreview(this, mCamera);
FrameLayout preview = (FrameLayout) findViewById(R.id.camera_preview);
preview.addView(mCameraPreview);
ImageButton captureButton = (ImageButton) findViewById(R.id.button_capture);
captureButton.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
mCamera.takePicture(myShutterCallback, mPicture_RAW, mPicture);
}
});
}
private Camera getCameraInstance() {
Camera camera = null;
try {
camera = Camera.open(0);
camera.setDisplayOrientation(90);
} catch (Exception e) {
// cannot get camera or does not exist
}
return camera;
}
ShutterCallback myShutterCallback = new ShutterCallback(){
@Override
public void onShutter() {}
};
PictureCallback mPicture_RAW = new PictureCallback(){
@Override
public void onPictureTaken(byte[] arg0, Camera arg1) {}
};
PictureCallback mPicture = new PictureCallback() {
@Override
public void onPictureTaken(byte[] data, Camera camera) {
File pictureFile = getOutputMediaFile();
if (pictureFile == null) {
return;
}
try {
FileOutputStream fos = new FileOutputStream(pictureFile);
fos.write(data);
fos.flush();
fos.close();
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
Intent i = new Intent(StyloveFoto.this, Filter.class);
startActivity(i);
}
};
protected File getOutputMediaFile() {
File mediaStorageDir = new File(Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_DCIM),"KWAlbum");
if (!mediaStorageDir.exists()) {
if (!mediaStorageDir.mkdirs()) {
Log.d("KWAlbum", "failed to create directory");
return null;
}
}
// Create a media file name
File mediaFile = new File(mediaStorageDir.getPath() + File.separator + "KW" + ".jpg");
return mediaFile;
}
my surface view:
public class SF_CameraPreview extends SurfaceView implements SurfaceHolder.Callback{
private SurfaceHolder mSurfaceHolder;
private Camera mCamera;
public SF_CameraPreview(Context context, Camera camera) {
super(context);
this.mCamera = camera;
this.mSurfaceHolder = this.getHolder();
this.mSurfaceHolder.addCallback(this);
this.mSurfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
}
@Override
public void surfaceCreated(SurfaceHolder surfaceHolder) {
try {
mCamera.setPreviewDisplay(surfaceHolder);
mCamera.startPreview();
} catch (IOException e) {
// left blank for now
}
}
@Override
public void surfaceDestroyed(SurfaceHolder surfaceHolder) {
//mCamera.stopPreview();
mCamera.release();
}
@Override
public void surfaceChanged(SurfaceHolder surfaceHolder, int format,
int width, int height) {
// start preview with new settings
try {
mCamera.setPreviewDisplay(surfaceHolder);
Camera.Parameters parameters = mCamera.getParameters();
parameters.set("orientation", "portrait");
mCamera.setParameters(parameters);
mCamera.startPreview();
} catch (Exception e) {
// intentionally left blank for a test
}
}
}
Instead of this line: fos.write(data);
write: fos.write(data, 0, data.length);
If you want to pass it to the next activity:
Intent i = new Intent(StyloveFoto.this, Filter.class);
i.putExtra("myImage",data);
startActivity(i);
and then in the Filter class, in the onCreate method:
byte[] myImage = getIntent().getByteArrayExtra("myImage");
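The byte array can then be decoded back into a Bitmap (a minimal sketch; note that a full-size JPEG may be too large to pass through an Intent extra, which is why other answers here save the file and pass its path instead):
Bitmap bmp = BitmapFactory.decodeByteArray(myImage, 0, myImage.length);
// e.g. show it in a (hypothetical) ImageView: imageView.setImageBitmap(bmp);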
I got the same problem and I just solved it.
The problem is that mPicture is an object of PictureCallback. You can't create an Intent directing to Filter.class with StyloveFoto.this from there, because it is inside the PictureCallback interface. Try this one:
Intent i = new Intent(getBaseContext() , Filter.class);
startActivity(i);
Such a big trap in Java... hope it helps :)
Either you save the picture like this and pass the path to another activity:
final File file = new File(Environment.getExternalStorageDirectory() + "/" + System.currentTimeMillis() + "_pic.jpg");
OutputStream output = null;
try {
output = new FileOutputStream(file);
output.write(data);
} catch (IOException e) {
e.printStackTrace();
} finally {
if (null != output) {
try {
output.close();
} catch (IOException e) {
e.printStackTrace();
}
}
}
OR
Pass the data from the camera activity:
Intent intent = new Intent(getApplicationContext(), your_class.class);
intent.putExtra("path", data); // data is the JPEG byte[] from onPictureTaken
intent.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
startActivity(intent);
Receiving Activity
byte[] data = getIntent().getByteArrayExtra("path");
Bitmap bitmap = BitmapFactory.decodeByteArray(data, 0, data.length);
image.setImageBitmap(scaleDownBitmapImage(bitmap, 300, 200));
I am working on the camera code in Android to take a picture and save it on the phone. It takes the picture with the phone camera and saves it on the memory card. The only problem is that the camera preview does not restart after taking the picture.
I cannot figure out the solution. The code is as follows. Suggestions are needed.
There are two classes in my project:
CAMERAACTIVITY CLASS
public class CameraActivity extends Activity
{
private static final String TAG = "CameraDemo";
Preview preview;
Button buttonClick;
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.main);
preview = new Preview(this);
((FrameLayout) findViewById(R.id.preview)).addView(preview);
buttonClick = (Button) findViewById(R.id.buttonClick);
buttonClick.setOnClickListener(new OnClickListener() {
public void onClick(View v) {
preview.camera.takePicture(shutterCallback, rawCallback, jpegCallback);
}
});
Log.d(TAG, "onCreate'd");
}
// Called when shutter is opened
ShutterCallback shutterCallback = new ShutterCallback() {
public void onShutter() {
Log.d(TAG, "onShutter'd");
}
};
// Handles data for raw picture
PictureCallback rawCallback = new PictureCallback() {
public void onPictureTaken(byte[] data, Camera camera) {
Log.d(TAG, "onPictureTaken - raw");
}
};
// Handles data for jpeg picture
PictureCallback jpegCallback = new PictureCallback() {
public void onPictureTaken(byte[] data, Camera camera) {
FileOutputStream outStream = null;
try {
// Write to SD Card
outStream = new FileOutputStream(String.format("/sdcard/DCIM/queries.jpg",
System.currentTimeMillis()));
outStream.write(data);
outStream.close();
Log.d(TAG, "onPictureTaken - wrote bytes: " + data.length);
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
} finally {
}
Log.d(TAG, "onPictureTaken - jpeg");
}
};
}
Preview Class
class Preview extends SurfaceView implements SurfaceHolder.Callback{
private static final String TAG = "Preview";
SurfaceHolder mHolder; // <2>
public Camera camera; // <3>
Preview(Context context) {
super(context);
// Install a SurfaceHolder.Callback so we get notified when the
// underlying surface is created and destroyed.
mHolder = getHolder(); // <4>
mHolder.addCallback(this); // <5>
mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS); // <6>
}
// Called once the holder is ready
public void surfaceCreated(SurfaceHolder holder) { // <7>
// The Surface has been created, acquire the camera and tell it where
// to draw.
camera = Camera.open(); // <8>
try {
Camera.Parameters parameters = camera.getParameters();
parameters.set("orientation", "landscape");
camera.setParameters(parameters);
camera.setPreviewDisplay(holder);
camera.setPreviewCallback(new PreviewCallback() {
// Called for each frame previewed
public void onPreviewFrame(byte[] data, Camera camera) {
Log.d(TAG, "onPreviewFrame called at: " + System.currentTimeMillis());
Preview.this.invalidate();
}
});
} catch (IOException e) {
e.printStackTrace();
}
}
// Called when the holder is destroyed
public void surfaceDestroyed(SurfaceHolder holder) {
//Log.d(TAG,"Stopping preview in SurfaceDestroyed().");
camera.setPreviewCallback(null);
camera.stopPreview();
camera.release();
camera = null;
}
// Called when holder has changed
public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
camera.startPreview();
}
}
The Camera article in the Android API Guides also suffers from the same problem. I could get the preview back after taking a picture by adding two more things, as follows:
1) Add external storage permission in the AndroidManifest.xml file:
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
2) Start preview again by calling camera.startPreview(). In your code:
...
Log.d(TAG, "onPictureTaken - jpeg");
camera.startPreview();
....
I'm sure you'll be able to get yours to work in the same way.
Start the CAMERAACTIVITY again in onPictureTaken like this:
// Handles data for jpeg picture
PictureCallback jpegCallback = new PictureCallback() {
public void onPictureTaken(byte[] data, Camera camera) {
FileOutputStream outStream = null;
try {
// Write to SD Card
outStream = new FileOutputStream(String.format("/sdcard/DCIM/queries.jpg",
System.currentTimeMillis()));
outStream.write(data);
outStream.close();
Log.d(TAG, "onPictureTaken - wrote bytes: " + data.length);
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
} finally {
camera.release();
camera = null;
startActivity(new Intent(CAMERAACTIVITY.this, CAMERAACTIVITY.class));
finish();
}
Log.d(TAG, "onPictureTaken - jpeg");
}
};
If you want, you can also use Thread.sleep(2000) so that the captured image is shown for 2 seconds before the camera activity starts again.
Add
try { Thread.sleep(2000); } catch (InterruptedException e) { }
camera.startPreview();
so that the camera will show the captured image for 2 seconds and then restart the preview.
I'm using the Camera API to take a picture and save it to the SD card, but I'm not able to view the picture. The screen is blank.
Here is my code:
public class CameraDemo extends Activity {
private static final String TAG = "CameraDemo";
Camera camera;
CameraPreview preview;
Button buttonClick;
/** Called when the activity is first created. */
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.main);
preview = new CameraPreview(this);
((FrameLayout) findViewById(R.id.preview)).addView(preview);
buttonClick = (Button) findViewById(R.id.buttonClick);
buttonClick.setOnClickListener(new OnClickListener() {
public void onClick(View v) {
preview.camera.takePicture(shutterCallback, rawCallback,
jpegCallback);
}
});
Log.d(TAG, "onCreate'd");
}
ShutterCallback shutterCallback = new ShutterCallback() {
public void onShutter() {
Log.d(TAG, "onShutter'd");
}
};
/** Handles data for raw picture */
PictureCallback rawCallback = new PictureCallback() {
public void onPictureTaken(byte[] data, Camera camera) {
Log.d(TAG, "onPictureTaken - raw");
}
};
/** Handles data for jpeg picture */
PictureCallback jpegCallback = new PictureCallback() {
public void onPictureTaken(byte[] data, Camera camera) {
FileOutputStream outStream = null;
try {
// write to local sandbox file system
// outStream =
// CameraDemo.this.openFileOutput(String.format("%d.jpg",
// System.currentTimeMillis()), 0);
// Or write to sdcard
outStream = new FileOutputStream(String.format(
"/sdcard/%d.jpg", System.currentTimeMillis()));
outStream.write(data);
outStream.close();
Log.d(TAG, "onPictureTaken - wrote bytes: " + data.length);
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
} finally {
}
Log.d(TAG, "onPictureTaken - jpeg");
}
};
}
And the CameraPreview class
CameraPreview(Context context) {
super(context);
// Install a SurfaceHolder.Callback so we get notified when the
// underlying surface is created and destroyed.
mHolder = getHolder();
mHolder.addCallback(this);
mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
}
public void surfaceCreated(SurfaceHolder holder) {
// The Surface has been created, acquire the camera and tell it where
// to draw.
camera = Camera.open();
try {
camera.setPreviewDisplay(holder);
camera.setPreviewCallback(new PreviewCallback() {
public void onPreviewFrame(byte[] data, Camera arg1) {
FileOutputStream outStream = null;
try {
outStream = new FileOutputStream(String.format(
"/sdcard/%d.jpg", System.currentTimeMillis()));
outStream.write(data);
outStream.close();
Log.d(TAG, "onPreviewFrame - wrote bytes: "
+ data.length);
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
} finally {
}
CameraPreview.this.invalidate();
}
});
} catch (IOException e) {
e.printStackTrace();
}
}
public void surfaceDestroyed(SurfaceHolder holder) {
// Surface will be destroyed when we return, so stop the preview.
// Because the CameraDevice object is not a shared resource, it's very
// important to release it when the activity is paused.
camera.stopPreview();
camera = null;
}
public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
// Now that the size is known, set up the camera parameters and begin
// the preview.
Camera.Parameters parameters = camera.getParameters();
parameters.setPreviewSize(w, h);
camera.setParameters(parameters);
camera.startPreview();
}
@Override
public void draw(Canvas canvas) {
super.draw(canvas);
Paint p = new Paint();
p.setColor(Color.RED);
Log.d(TAG, "draw");
canvas.drawText("PREVIEW", canvas.getWidth() / 2,
canvas.getHeight() / 2, p);
}
}
Help me!
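One thing to check when the saved picture cannot be viewed: a JPEG written straight to /sdcard will not show up in the Gallery until the media scanner indexes it. A minimal sketch (assuming "path" is the file written in jpegCallback) that scans the new file and opens it in the default viewer:
MediaScannerConnection.scanFile(CameraDemo.this, new String[] { path }, null,
        new MediaScannerConnection.OnScanCompletedListener() {
            public void onScanCompleted(String scannedPath, Uri uri) {
                Intent view = new Intent(Intent.ACTION_VIEW);
                view.setDataAndType(uri, "image/jpeg");
                view.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
                startActivity(view);
            }
        });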