This is my code for taking pictures in Android, but it always returns a blank image. What might be the error? I have seen a few issues related to the flash, exposure and focus modes, which I do set in my code, but the camera still returns a blank image even though the photo is taken (at least I hear the shutter sound).
Camera.Parameters p = camera.getParameters();
List<Size> sizes = p.getSupportedPictureSizes();
// Choose any one you want among sizes
Size size = sizes.get(0);
p.setPictureSize(size.width, size.height);
p.set("flash-mode","off");
p.set("focus-mode","auto");
p.setExposureCompensation(100);
p.setFocusMode("auto");
camera.setParameters(p);
camera.startPreview();
camera.takePicture(shutterCallback, rawCallback,
jpegCallback);
ShutterCallback shutterCallback = new ShutterCallback() {
public void onShutter() {
Log.d(TAG, "onShutter'd");
}
};
/** Handles data for raw picture */
PictureCallback rawCallback = new PictureCallback() {
public void onPictureTaken(byte[] data, Camera camera) {
Log.d(TAG, "onPictureTaken - raw");
}
};
/** Handles data for jpeg picture */
PictureCallback jpegCallback = new PictureCallback() {
public void onPictureTaken(byte[] data, Camera camera) {
FileOutputStream outStream = null;
try {
// write to local sandbox file system
// outStream =
// CameraDemo.this.openFileOutput(String.format("%d.jpg",
// System.currentTimeMillis()), 0);
// Or write to sdcard
outStream = new FileOutputStream(String.format(
"/sdcard/%d.jpg", System.currentTimeMillis()));
outStream.write(data);
outStream.close();
Log.d(TAG, "onPictureTaken - wrote bytes: " + data.length);
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
} finally {
}
Log.d(TAG, "onPictureTaken - jpeg");
}
};
@Override
public void draw(Canvas canvas) {
super.draw(canvas);
Paint p = new Paint(Color.RED);
Log.d(TAG, "draw");
canvas.drawText("PREVIEW", canvas.getWidth() / 2,
canvas.getHeight() / 2, p);
}
Camera.java
public class Camera extends Activity
{
private static final int CAMERA_REQUEST = 1888;
private String selectedImagePath;
@Override
public void onCreate(Bundle savedInstanceState)
{
super.onCreate(savedInstanceState);
Intent cameraIntent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
startActivityForResult(cameraIntent, CAMERA_REQUEST);
}
@Override
public void onActivityResult(int requestCode, int resultCode, Intent data)
{
if (resultCode == RESULT_OK) {
if (requestCode == CAMERA_REQUEST)
{
Bitmap photo = (Bitmap) data.getExtras().get("data");
ByteArrayOutputStream bytes = new ByteArrayOutputStream();
photo.compress(Bitmap.CompressFormat.JPEG, 40, bytes);
Random randomGenerator = new Random();
String newimagename = randomGenerator.nextInt(Integer.MAX_VALUE) + ".jpg";
File f = new File(Environment.getExternalStorageDirectory()
+ File.separator + newimagename);
try {
f.createNewFile();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
// write the bytes to the file
FileOutputStream fo = null;
try {
fo = new FileOutputStream(f.getAbsoluteFile());
} catch (FileNotFoundException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
try {
fo.write(bytes.toByteArray());
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
String uri = f.getAbsolutePath();
// this is the path where the image was saved
}
}
}
}
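One guess, offered as a possibility rather than a confirmed fix: setExposureCompensation(100) is very likely outside the range the device reports, and the raw p.set("flash-mode", ...) / p.set("focus-mode", ...) strings bypass the typed setters. A short sketch of checking the supported values before applying them, using the same camera object as above:
Camera.Parameters p = camera.getParameters();
// Exposure compensation is an index into a small, device-specific range
// (often around -12..12), so a value like 100 is normally rejected or clamped.
int minExposure = p.getMinExposureCompensation();
int maxExposure = p.getMaxExposureCompensation();
p.setExposureCompensation(Math.max(minExposure, Math.min(0, maxExposure)));
// Only apply modes the device actually reports as supported
List<String> flashModes = p.getSupportedFlashModes();
if (flashModes != null && flashModes.contains(Camera.Parameters.FLASH_MODE_OFF)) {
    p.setFlashMode(Camera.Parameters.FLASH_MODE_OFF);
}
List<String> focusModes = p.getSupportedFocusModes();
if (focusModes != null && focusModes.contains(Camera.Parameters.FOCUS_MODE_AUTO)) {
    p.setFocusMode(Camera.Parameters.FOCUS_MODE_AUTO);
}
camera.setParameters(p);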
Related
I have made a camera demo application in Android. It was working fine on a button click event, but now I want to open the camera when the activity starts and it is not working. It fails with the error:
Failed to connect to camera service
Please have a look at my code below:
surfaceView = (SurfaceView) findViewById(R.id.surfaceView1);
surfaceHolder = surfaceView.getHolder();
surfaceHolder.addCallback(this);
surfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
rawCallback = new PictureCallback() {
public void onPictureTaken(byte[] data, Camera camera) {
Log.d("Log", "onPictureTaken - raw");
}
};
/** Handles data for jpeg picture */
shutterCallback = new ShutterCallback() {
public void onShutter() {
Log.i("Log", "onShutter'd");
}
};
jpegCallback = new PictureCallback() {
@SuppressLint("NewApi")
public void onPictureTaken(byte[] data, Camera camera) {
FileOutputStream outStream = null;
try {
outStream = new FileOutputStream(String.format(
"/sdcard/%d.jpg", System.currentTimeMillis()));
outStream.write(data);
outStream.close();
Log.d("Log", "onPictureTaken - wrote bytes: " + data.length);
BitmapFactory.Options options = new BitmapFactory.Options();
options.inMutable = true;
bmp = BitmapFactory.decodeByteArray(data, 0, data.length,
options);
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
} finally {
}
Log.d("Log", "onPictureTaken - jpeg");
}
};
start_camera();
private void start_camera() {
try {
releaseCameraAndPreview();
camera = Camera.open();
camera.setDisplayOrientation(90);
} catch (RuntimeException e) {
Log.e(tag, "init_camera: " + e);
return;
}
param = camera.getParameters();
param.set("jpeg-quality", 100);
camera.setParameters(param);
// modify parameter
try {
camera.setPreviewDisplay(surfaceHolder);
camera.startPreview();
// camera.takePicture(shutter, raw, jpeg)
} catch (Exception e) {
Log.e(tag, "init_camera: " + e);
return;
}
}
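"Failed to connect to camera service" usually means the camera is still held (by another app, or by a previous run of this activity that never released it), or that the CAMERA permission is missing from the manifest. The releaseCameraAndPreview() helper referenced above is not shown in the post; a minimal sketch of what such a helper typically does, assuming the same camera field:
private void releaseCameraAndPreview() {
    if (camera != null) {
        camera.stopPreview();
        try {
            // detach the preview surface before releasing
            camera.setPreviewDisplay(null);
        } catch (IOException e) {
            e.printStackTrace();
        }
        camera.release();
        camera = null;
    }
}
Calling the same helper from onPause() as well tends to avoid the error the next time the activity starts.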
I am working on a QR-code project and have used the ZXing library for generating QR codes. Now I want to scan QR codes in my app, but using my own custom camera. With my camera I capture an image and create a bitmap from it. Is it possible to scan that bitmap for a QR code by calling the ZXing decode functions and passing the bitmap or byte[] to them? Here is my implementation:
public void onCreate(Bundle savedInstanceState) {
// TODO OnCreate Method
super.onCreate(savedInstanceState);
setContentView(R.layout.camera_layout);
cameraId = Camera.CameraInfo.CAMERA_FACING_BACK;
activity = this;
filepath = Environment.getExternalStorageDirectory();
if (checkCameraHardware(this)) {
// Create an instance of Camera
mCamera = getCameraInstance();
setCameraDisplayOrientation(this, cameraId, mCamera);
try {
// Get Camera Parameters
Camera.Parameters params = mCamera.getParameters();
// Set the Focus Mode
params.setFocusMode(Camera.Parameters.FOCUS_MODE_AUTO);
mCamera.setParameters(params);
Toast.makeText(getApplicationContext(), "Camera Available",
Toast.LENGTH_LONG).show();
mPreview = new CameraPreview(this, mCamera);
FrameLayout preview = (FrameLayout) findViewById(R.id.cameraPreview);
preview.addView(mPreview);
} catch (Exception e) {
Toast.makeText(getApplicationContext(),
"Error: " + e.getMessage(), Toast.LENGTH_LONG).show();
}
} else {
Toast.makeText(getApplicationContext(), "Camera Not Available",
Toast.LENGTH_LONG).show();
}
Button captureButton = (Button) findViewById(R.id.button_capture);
captureButton.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
// get an image from the camera
mCamera.takePicture(null, null, mPicture);
}
});
}
@Override
protected void onPause() {
// TODO OnPause Method
super.onPause();
releaseCamera();
}
// TODO Detecting Camera Hardware
private boolean checkCameraHardware(Context context) {
if (context.getPackageManager().hasSystemFeature(
PackageManager.FEATURE_CAMERA)) {
// This device has camera
return true;
} else {
// No Camera on this Device
return false;
}
}
// TODO Accessing Camera
public static Camera getCameraInstance() {
Camera c = null;
try {
c = Camera.open();
} catch (Exception e) {
// Camera is not available (in use or does not exist)
}
return c; // returns null if camera is unavailable
}
private PictureCallback mPicture = new PictureCallback() {
@SuppressLint("InlinedApi")
@Override
public void onPictureTaken(byte[] data, Camera camera) {
// TODO Takes the picture and write to file
File pictureFile = getOutputMediaFile(FileColumns.MEDIA_TYPE_IMAGE);
if (pictureFile == null) {
Log.d("PICFILE",
"Error creating media file, check storage permissions");
return;
}
try {
Bitmap bmp = BitmapFactory
.decodeByteArray(data, 0, data.length);
FileOutputStream fos = new FileOutputStream(pictureFile);
fos.write(data);
fos.close();
updateGallery();
// Bitmap bitmap = BitmapFactory.decodeByteArray(data, 0,
// data.length);
Intent i = new Intent(getApplicationContext(),
TestActivity.class);
i.putExtra("Image", data);
startActivity(i);
} catch (Exception e) {
Log.d("IOEXCEPTION", "Error accessing file: " + e.getMessage());
}
}
};
Yes, you can use the decode method. Check the code below for an implementation.
String detectBarCode(Bitmap bitmap) {
int[] intArray = new int[bitmap.getWidth() * bitmap.getHeight()];
bitmap.getPixels(intArray, 0, bitmap.getWidth(), 0, 0, bitmap.getWidth(), bitmap.getHeight());
LuminanceSource source = new RGBLuminanceSource(bitmap.getWidth(), bitmap.getHeight(), intArray);
Reader reader = new QRCodeReader();
try {
Result result = reader.decode(new BinaryBitmap(new HybridBinarizer(source)));
return result.getText();
} catch (NotFoundException e) {
e.printStackTrace();
return null;
} catch (ChecksumException e) {
e.printStackTrace();
return null;
} catch (FormatException e) {
e.printStackTrace();
return null;
}
}
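A minimal sketch (untested) of how the captured byte[] from the question's onPictureTaken() could be fed into detectBarCode():
// Inside onPictureTaken(byte[] data, Camera camera):
Bitmap bmp = BitmapFactory.decodeByteArray(data, 0, data.length);
String qrText = detectBarCode(bmp);
if (qrText != null) {
    Log.d("QR", "Decoded text: " + qrText);
} else {
    Log.d("QR", "No QR code found in the captured image");
}
camera.startPreview(); // the preview stops after takePicture(), so restart it
Full-resolution bitmaps can be slow to process; scaling the bitmap down before calling detectBarCode() usually still decodes fine and saves memory.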
I'm a beginner and I need help. How do I take photos with the camera and save them or send them to the next activity?
I've tried a couple of options, i.e. takePicture() with a picture callback and a SurfaceView, and capturing with an intent, but neither works properly on Android 2.3.3. Could someone point out the issues in my code below?
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.sf_foto);
mCamera = getCameraInstance();
mCameraPreview = new SF_CameraPreview(this, mCamera);
FrameLayout preview = (FrameLayout) findViewById(R.id.camera_preview);
preview.addView(mCameraPreview);
ImageButton captureButton = (ImageButton) findViewById(R.id.button_capture);
captureButton.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
mCamera.takePicture(myShutterCallback, mPicture_RAW, mPicture);
}
});
}
private Camera getCameraInstance() {
Camera camera = null;
try {
camera = Camera.open(0);
camera.setDisplayOrientation(90);
} catch (Exception e) {
// cannot get camera or does not exist
}
return camera;
}
ShutterCallback myShutterCallback = new ShutterCallback(){
@Override
public void onShutter() {}
};
PictureCallback mPicture_RAW = new PictureCallback(){
@Override
public void onPictureTaken(byte[] arg0, Camera arg1) {}
};
PictureCallback mPicture = new PictureCallback() {
@Override
public void onPictureTaken(byte[] data, Camera camera) {
File pictureFile = getOutputMediaFile();
if (pictureFile == null) {
return;
}
try {
FileOutputStream fos = new FileOutputStream(pictureFile);
fos.write(data);
fos.flush();
fos.close();
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
Intent i = new Intent(StyloveFoto.this, Filter.class);
startActivity(i);
}
};
protected File getOutputMediaFile() {
File mediaStorageDir = new File(Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_DCIM),"KWAlbum");
if (!mediaStorageDir.exists()) {
if (!mediaStorageDir.mkdirs()) {
Log.d("KWAlbum", "failed to create directory");
return null;
}
}
// Create a media file name
File mediaFile = new File(mediaStorageDir.getPath() + File.separator + "KW" + ".jpg");
return mediaFile;
}
my surface view:
public class SF_CameraPreview extends SurfaceView implements SurfaceHolder.Callback{
private SurfaceHolder mSurfaceHolder;
private Camera mCamera;
public SF_CameraPreview(Context context, Camera camera) {
super(context);
this.mCamera = camera;
this.mSurfaceHolder = this.getHolder();
this.mSurfaceHolder.addCallback(this);
this.mSurfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
}
@Override
public void surfaceCreated(SurfaceHolder surfaceHolder) {
try {
mCamera.setPreviewDisplay(surfaceHolder);
mCamera.startPreview();
} catch (IOException e) {
// left blank for now
}
}
@Override
public void surfaceDestroyed(SurfaceHolder surfaceHolder) {
//mCamera.stopPreview();
mCamera.release();
}
@Override
public void surfaceChanged(SurfaceHolder surfaceHolder, int format,
int width, int height) {
// start preview with new settings
try {
mCamera.setPreviewDisplay(surfaceHolder);
Camera.Parameters parameters = mCamera.getParameters();
parameters.set("orientation", "portrait");
mCamera.setParameters(parameters);
mCamera.startPreview();
} catch (Exception e) {
// intentionally left blank for a test
}
}
}
Instead of this line: fos.write(data);
write: fos.write(data, 0, data.length);
If you want to pass it to the next activity:
Intent i = new Intent(StyloveFoto.this, Filter.class);
i.putExtra("myImage",data);
startActivity(i);
and then in the Filter class, in the onCreate() method:
byte[] myImage = getIntent().getByteArrayExtra("myImage");
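To show it there (just an illustration; the ImageView id is an assumption):
Bitmap bmp = BitmapFactory.decodeByteArray(myImage, 0, myImage.length);
ImageView imageView = (ImageView) findViewById(R.id.imageView); // hypothetical id
imageView.setImageBitmap(bmp);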
I had the same problem and just solved it.
The problem is that mPicture is an object of PictureCallback; you can't create the Intent for Filter.class with StyloveFoto.this there, because you are inside the PictureCallback interface. Try this one:
Intent i = new Intent(getBaseContext() , Filter.class);
startActivity(i);
Such a big trap in Java... hope it helps :)
Either you save the picture like this and pass its path to the next activity:
final File file = new File(Environment.getExternalStorageDirectory() + "/" + System.currentTimeMillis() + "_pic.jpg");
OutputStream output = null;
try {
output = new FileOutputStream(file);
output.write(data);
} catch (IOException e) {
e.printStackTrace();
} finally {
if (null != output) {
try {
output.close();
} catch (IOException e) {
e.printStackTrace();
}
}
}
OR
Pass the JPEG byte[] from the camera callback directly (here data is the byte[] from onPictureTaken()):
Intent intent = new Intent(getApplicationContext(), your_class.class);
intent.putExtra("picture_data", data);
intent.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
startActivity(intent);
Receiving activity:
byte[] data = getIntent().getByteArrayExtra("picture_data");
Bitmap bitmap = BitmapFactory.decodeByteArray(data, 0, data.length);
image.setImageBitmap(scaleDownBitmapImage(bitmap, 300, 200));
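For the first option, the receiving side is not shown; a rough sketch, reusing the file variable from the saving snippet and the scaleDownBitmapImage() helper from the byte[] variant (your_class and image are placeholders). Passing the path is also safer than passing a full-resolution JPEG byte[], which can exceed the Intent extra size limit:
// Camera activity, after the file has been written (first option above):
Intent intent = new Intent(getApplicationContext(), your_class.class);
intent.putExtra("path", file.getAbsolutePath());
startActivity(intent);
// Receiving activity:
String path = getIntent().getStringExtra("path");
Bitmap bitmap = BitmapFactory.decodeFile(path);
image.setImageBitmap(scaleDownBitmapImage(bitmap, 300, 200));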
I want to take a picture with my phone via an application and save the image on my phone.
I've tried many of the solutions proposed in Stack Overflow questions but nothing worked, so I wrote a method that saves the file with the right name... but the file is empty (0 KB)!
Here is my code:
public class GameActivity extends Activity implements SurfaceHolder.Callback/*,Camera.PictureCallback*/ {
private Camera camera;
private SurfaceView surfaceCamera;
public Handler handler = new Handler();
private boolean isPreview=false;
private SurfaceHolder holder;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.gameactivity);
surfaceCamera = (SurfaceView) findViewById(R.id.surfaceViewCamera);
getWindow().setFormat(PixelFormat.UNKNOWN);
holder = surfaceCamera.getHolder();
holder.addCallback(this);
holder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
ImageView image = (ImageView) findViewById(R.id.imageView3);
image.setOnTouchListener(new OnTouchListener(){
@Override
public boolean onTouch(View arg0, MotionEvent arg1) {
camera.takePicture(myShutterCallback, myPictureCallback_RAW, myPictureCallback_JPG);
return false;
}
Camera.ShutterCallback myShutterCallback = new Camera.ShutterCallback() {
public void onShutter() {
// TODO bl
}
};
PictureCallback myPictureCallback_RAW = new PictureCallback() {
public void onPictureTaken(byte[] arg0, Camera arg1) {
// TODO Auto-generated method stub
}
};
PictureCallback myPictureCallback_JPG = new PictureCallback(){
@Override
public void onPictureTaken(byte[] data, Camera camera) {
File imagesFolder = new File(Environment.getExternalStorageDirectory(), "/KersplattFolder");
imagesFolder.mkdirs();
String fileName = "image.jpg";
File output = new File(imagesFolder, fileName);
try {
FileOutputStream fos = new FileOutputStream(output);
fos.write(data[0]);
fos.close();
} catch (FileNotFoundException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
Toast.makeText(GameActivity.this,
"Image saved: ",
Toast.LENGTH_LONG).show();
camera.startPreview();
}
};
@Override
public void surfaceChanged(SurfaceHolder holder, int format, int width,
int height) {
if (isPreview) {
camera.stopPreview();
}
Camera.Parameters p = camera.getParameters();
p.setPreviewSize(width,height);
camera.setParameters(p);
try {
camera.setPreviewDisplay(holder);
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
camera.startPreview();
isPreview=true;
}
@Override
public void surfaceCreated(SurfaceHolder holder) {
camera = Camera.open();
}
@Override
public void surfaceDestroyed(SurfaceHolder holder) {
camera.stopPreview();
isPreview=false;
camera.release();
}
EDIT 1: when I put data instead of data[0] I save a black image as well, but the file has a non-zero size, so I guess the real image is in there somewhere...
EDIT 2: I have added the following code:
File output = new File(imagesFolder, fileName);
Uri outputFileUri = Uri.fromFile(output);
String filePath = outputFileUri.getPath();
File file= new File(filePath);
try {
FileOutputStream fos = new FileOutputStream(file,true);
fos.write(data);
fos.close();
}
I still get the black image.
Have you defined the required permissions in your Android manifest, as described here?
<manifest ... >
<uses-feature android:name="android.hardware.camera" />
...
</manifest>
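The permission elements themselves are not shown above; assuming the app also writes the JPEG to external storage, they would typically be:
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />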
UPDATE:
OK, remove the index after data: write data instead of data[0]. Then it should work.
Your code fos.write(data[0]); is wrong: it writes only the first byte of the JPEG data, which is why the file stays essentially empty.
Here is the code I used to save the byte array after taking a picture. "data" is the byte array.
String filePath = outputFileUri.getPath(); // .getEncodedPath();
File file = new File(filePath);
FileOutputStream os;
try {
os = new FileOutputStream(file, true);
os.write(data);
os.close();
} catch (FileNotFoundException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
I'm using Camera API to take a picture and save the picture in SD card.
But I'm not able to view the picture. The screen is blank.
Here is my code:
public class CameraDemo extends Activity {
private static final String TAG = "CameraDemo";
Camera camera;
CameraPreview preview;
Button buttonClick;
/** Called when the activity is first created. */
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.main);
preview = new CameraPreview(this);
((FrameLayout) findViewById(R.id.preview)).addView(preview);
buttonClick = (Button) findViewById(R.id.buttonClick);
buttonClick.setOnClickListener(new OnClickListener() {
public void onClick(View v) {
preview.camera.takePicture(shutterCallback, rawCallback,
jpegCallback);
}
});
Log.d(TAG, "onCreate'd");
}
ShutterCallback shutterCallback = new ShutterCallback() {
public void onShutter() {
Log.d(TAG, "onShutter'd");
}
};
/** Handles data for raw picture */
PictureCallback rawCallback = new PictureCallback() {
public void onPictureTaken(byte[] data, Camera camera) {
Log.d(TAG, "onPictureTaken - raw");
}
};
/** Handles data for jpeg picture */
PictureCallback jpegCallback = new PictureCallback() {
public void onPictureTaken(byte[] data, Camera camera) {
FileOutputStream outStream = null;
try {
// write to local sandbox file system
// outStream =
// CameraDemo.this.openFileOutput(String.format("%d.jpg",
// System.currentTimeMillis()), 0);
// Or write to sdcard
outStream = new FileOutputStream(String.format(
"/sdcard/%d.jpg", System.currentTimeMillis()));
outStream.write(data);
outStream.close();
Log.d(TAG, "onPictureTaken - wrote bytes: " + data.length);
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
} finally {
}
Log.d(TAG, "onPictureTaken - jpeg");
}
};
}
And the CameraPreview class
public class CameraPreview extends SurfaceView implements SurfaceHolder.Callback {
SurfaceHolder mHolder;
Camera camera;
CameraPreview(Context context) {
super(context);
// Install a SurfaceHolder.Callback so we get notified when the
// underlying surface is created and destroyed.
mHolder = getHolder();
mHolder.addCallback(this);
mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
}
public void surfaceCreated(SurfaceHolder holder) {
// The Surface has been created, acquire the camera and tell it where
// to draw.
camera = Camera.open();
try {
camera.setPreviewDisplay(holder);
camera.setPreviewCallback(new PreviewCallback() {
public void onPreviewFrame(byte[] data, Camera arg1) {
FileOutputStream outStream = null;
try {
outStream = new FileOutputStream(String.format(
"/sdcard/%d.jpg", System.currentTimeMillis()));
outStream.write(data);
outStream.close();
Log.d(TAG, "onPreviewFrame - wrote bytes: "
+ data.length);
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
} finally {
}
CameraPreview.this.invalidate();
}
});
} catch (IOException e) {
e.printStackTrace();
}
}
public void surfaceDestroyed(SurfaceHolder holder) {
// Surface will be destroyed when we return, so stop the preview.
// Because the CameraDevice object is not a shared resource, it's very
// important to release it when the activity is paused.
camera.stopPreview();
camera = null;
}
public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
// Now that the size is known, set up the camera parameters and begin
// the preview.
Camera.Parameters parameters = camera.getParameters();
parameters.setPreviewSize(w, h);
camera.setParameters(parameters);
camera.startPreview();
}
@Override
public void draw(Canvas canvas) {
super.draw(canvas);
Paint p = new Paint(Color.RED);
Log.d(TAG, "draw");
canvas.drawText("PREVIEW", canvas.getWidth() / 2,
canvas.getHeight() / 2, p);
}
}
Help me!
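One hedged observation, not from the original post: the bytes delivered to onPreviewFrame() are raw NV21 preview data, not JPEG, so writing them straight into a .jpg file produces files no viewer can open. That does not necessarily explain the blank preview, but it does explain why the saved /sdcard/*.jpg files look empty. If preview frames really need to be saved, they can be converted with YuvImage first; takePicture()'s jpegCallback, by contrast, already delivers JPEG bytes. A rough sketch inside onPreviewFrame():
try {
    Camera.Parameters params = arg1.getParameters();
    Camera.Size previewSize = params.getPreviewSize();
    // Preview frames default to NV21; wrap them and compress to a real JPEG
    YuvImage yuv = new YuvImage(data, ImageFormat.NV21,
            previewSize.width, previewSize.height, null);
    FileOutputStream outStream = new FileOutputStream(
            String.format("/sdcard/%d.jpg", System.currentTimeMillis()));
    yuv.compressToJpeg(new Rect(0, 0, previewSize.width, previewSize.height), 90, outStream);
    outStream.close();
} catch (IOException e) {
    e.printStackTrace();
}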