How to wait till the camera captures the image in Android?

I am working on an Android application that programmatically captures an image with both the front and back cameras and saves it to a folder in internal storage.
Once the images are saved, the application sends the contents of the folder by email. How can I make the application wait until the images have been captured completely? Otherwise a blank email is sent.
class MainActivity {
    // other code

    public void buttonClick(View v) {
        CameraService.startCamera(0, true);
        sendEmail();
    }

    public void sendEmail() {
        // get the contents of the folder and send them using the JavaMail API
    }
}
This is the class that captures an image with both the front and back cameras.
class CameraService {

    private static Camera mCamera;

    public static void startCamera(int cameraID, final boolean isFirstTime) {
        mCamera = Camera.open(cameraID);
        try {
            mCamera.setPreviewTexture(new SurfaceTexture(10));
        } catch (IOException e1) {
            e1.printStackTrace();
        }
        Parameters params = mCamera.getParameters();
        params.setFlashMode(Camera.Parameters.FLASH_MODE_OFF);
        params.setPictureFormat(ImageFormat.JPEG);
        params.setJpegQuality(100);
        mCamera.setParameters(params);
        mCamera.startPreview();
        mCamera.takePicture(null, null, null, new PictureCallback() {
            @Override
            public void onPictureTaken(byte[] data, Camera camera) {
                Log.i("hello", "picture-taken");
                if (data != null) {
                    mCamera.stopPreview();
                    mCamera.release();
                    try {
                        BitmapFactory.Options opts = new BitmapFactory.Options();
                        Bitmap bitmap = BitmapFactory.decodeByteArray(data, 0, data.length, opts);
                        storeImage(bitmap); // stores the image in the local folder
                        if (isFirstTime) {
                            // capture with the front camera
                            CameraService.startCamera(1, false);
                        }
                    } catch (Exception e) {
                        e.printStackTrace();
                    }
                }
            }
        });
    }
}
When I click the button, startCamera() runs, but sendEmail() executes before the second image has been captured. As a result only one image is sent by email.

What you are looking for is a basic programming concept called callbacks (which you are already using with the Camera.takePicture() method call).
Please read my answer here, as it applies to your case as well:
Handle data returned by an Async task (Firebase)
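To make that concrete, here is a minimal sketch of one way to chain the email step onto the capture callbacks instead of calling sendEmail() right after startCamera(). The extra Runnable parameter on startCamera() is a hypothetical addition for illustration; it is not part of the original CameraService.
// In MainActivity: pass a completion callback instead of calling sendEmail() immediately.
// Assumes startCamera() is given an extra "Runnable onFinished" parameter (hypothetical).
public void buttonClick(View v) {
    CameraService.startCamera(0, true, new Runnable() {
        @Override
        public void run() {
            // runs only after the second (front camera) image has been stored
            sendEmail();
        }
    });
}

// In CameraService.onPictureTaken(), after storeImage(bitmap):
//     if (isFirstTime) {
//         startCamera(1, false, onFinished);  // capture the front image next
//     } else {
//         onFinished.run();                   // both images saved; safe to email now
//     }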

Related

Capture Image programmatically in android

How can I use the camera to capture a picture programmatically, without touching the capture button, in Android? For example, like a selfie stick that triggers the shutter with a Bluetooth signal.
Try something like this:
public void takePictureNoPreview(Context context) {
    // open the back-facing camera by default
    Camera myCamera = Camera.open();
    if (myCamera != null) {
        try {
            // set camera parameters if you want to
            // ...
            // a dummy, unused surface view and holder
            SurfaceView dummy = new SurfaceView(context);
            myCamera.setPreviewDisplay(dummy.getHolder());
            myCamera.startPreview();
            myCamera.takePicture(null, null, getJpegCallback());
        } catch (IOException e) {
            e.printStackTrace();
        } finally {
            // note: releasing here immediately can prevent the JPEG callback from
            // ever firing; in practice release the camera from inside the callback
            myCamera.release();
        }
    } else {
        // booo, failed!
    }
}

private PictureCallback getJpegCallback() {
    return new PictureCallback() {
        @Override
        public void onPictureTaken(byte[] data, Camera camera) {
            FileOutputStream fos;
            try {
                fos = new FileOutputStream("test.jpeg");
                fos.write(data);
                fos.close();
            } catch (IOException e) {
                // do something about it
            }
        }
    };
}
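As a rough usage sketch (illustrative only: it assumes the method above lives in an Activity, that the CAMERA permission is declared in the manifest, and a hypothetical button id), the call can simply be wired to a button click:
// Requires <uses-permission android:name="android.permission.CAMERA" /> in the manifest.
Button shoot = (Button) findViewById(R.id.btnShoot); // hypothetical button id
shoot.setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View v) {
        takePictureNoPreview(getApplicationContext());
    }
});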
Or try some solutions from this post: Taking pictures with camera on Android programmatically

Method Camera.takePicture() crashes on Nexus 5, but Xperia mini is OK

Please excuse my bad English.
I have a problem with taking a photo programmatically. I wrote an app that takes a collection of photos based on a countdown timer; afterwards the photos are processed by C++ code.
I'm using a dummy SurfaceView because I don't need a preview in the UI. The code below works on my Xperia mini (API 15), so the permissions and code should be correct, but on a borrowed school Nexus 5 (API 21) there is a problem with the preview:
takePicture: camera 0: Cannot take picture without preview enabled
I found a solution that uses setPreviewTexture (commented out below) instead of setPreviewDisplay. It works for the first photo, which is saved normally, but I get the same error on the second call of takePicture().
Thanks for any advice, LS
Camera camera;

@Override
protected void onResume() {
    super.onResume();
    // is there a camera on the device?
    if (!checkCameraHardware()) return;
    releaseCamera();
    try {
        camera.stopPreview();
    } catch (Exception e) {
        Log.d(TAG, "No preview before.");
    }
    SurfaceView dummy = new SurfaceView(this);
    camera = Camera.open();
    Camera.Parameters params = camera.getParameters();
    params.setFocusMode(Camera.Parameters.FOCUS_MODE_AUTO);
    camera.setParameters(params);
    try {
        //camera.setPreviewTexture(new SurfaceTexture(10));
        camera.setPreviewDisplay(dummy.getHolder());
    } catch (IOException e) {
        e.printStackTrace();
    }
    camera.startPreview();
}
SOLUTION:
I needed to refresh the preview. The code below works on both the Xperia and the Nexus.
The question remains why I have to use setPreviewTexture, because setPreviewDisplay always returns the error on the Nexus.
camera.takePicture(null, null, new PictureCallback() {
    @Override
    public void onPictureTaken(final byte[] data, Camera camera) {
        // save picture
        refreshPreview();
    }
});
public void refreshPreview() {
    try {
        camera.stopPreview();
    } catch (Exception e) {}
    try {
        camera.startPreview();
    } catch (Exception e) {}
}
and in the onResume() function:
try {
    camera.setPreviewTexture(new SurfaceTexture(10));
} catch (IOException e) {}
Just add a callback for starting the preview on your camera instance. The thing is that after startPreview() is called, the camera needs some time before it is able to take a picture. Try this:
camera.startPreview();
camera.setOneShotPreviewCallback(new Camera.PreviewCallback() {
    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        camera.takePicture(null, null, new Camera.PictureCallback() {
            @Override
            public void onPictureTaken(byte[] data, Camera camera) {
                // do whatever you want with your picture, then stop the preview
                camera.stopPreview();
            }
        });
    }
});
Once the picture is taken, refresh your SurfaceView, stop the preview, release the camera and restart the process again.
try {
    camera.takePicture(null, null, new PictureCallback() {
        public void onPictureTaken(final byte[] data, Camera camera) {
            // once your logic is done
            refreshCamera();
        }
    });
} catch (Exception e2) {
    // Toast.makeText(getApplicationContext(), "Picture not taken", Toast.LENGTH_SHORT).show();
    e2.printStackTrace();
}

public void refreshCamera() {
    if (dummy.getHolder().getSurface() == null) {
        return;
    }
    try {
        camera.stopPreview();
    } catch (Exception e) {
        // ignore: tried to stop a non-existent preview
    }
    try {
        camera.setPreviewDisplay(dummy.getHolder());
        camera.startPreview();
    } catch (Exception e) {
        // ignore
    }
}
Hope this solution may help you.

Taking a ScreenShot of SurfaceView with Camera Preview in it

I am trying to implement functionality that includes taking pictures while recording video. That's the reason I concluded I should use the screenshot approach on the SurfaceView.
However, when I try to take a screenshot of the SurfaceView, I always get a blank image.
Here is the code I am using to take the snapshot:
View tempView = (View)MY_SURFACE_VIEW;
tempView.setDrawingCacheEnabled(true);
Bitmap tempBmp = Bitmap.createBitmap(tempView.getDrawingCache());
tempView.setDrawingCacheEnabled(false);
//Saving this Bitmap to a File....
In case you think this is a duplicate question, let me assure you that I tried the following solutions provided on SO for the same problem before asking this one.
https://stackoverflow.com/questions/24134964/issue-with-camera-picture-taken-snapshot-using-surfaceview-in-android
Facing issue to take a screenshot while recording a video
Take camera screenshot while recording - Like in Galaxy S3?
Taking screen shot of a SurfaceView in android
Get screenshot of surfaceView in Android (This is the correct answer, but only partially answered. I have already asked @sajar to explain the answer)
Other Resources on Internet:
1. http://www.coderanch.com/t/622613/Android/Mobile/capture-screenshot-simple-animation-project
2. http://www.phonesdevelopers.com/1795894/
None of these has worked for me so far. I also know that we need to create some thread that interacts with the SurfaceHolder and gets the bitmap from it, but I am not sure how to implement that.
Any help is highly appreciated.
Here's another one: Take screenshot of SurfaceView.
SurfaceViews have a "surface" part and a "view" part; your code tries to capture the "view" part. The "surface" part is a separate layer, and there's no trivial "grab all pixels" method. The basic difficulty is that your app is on the "producer" side of the surface, rather than the "consumer" side, so reading pixels back out is problematic. Note that the underlying buffers are in whatever format is most convenient for the data producer, so for camera preview it'll be a YUV buffer.
The easiest and most efficient way to "capture" the surface pixels is to draw them twice, once for the screen and once for capture. If you do this with OpenGL ES, the YUV to RGB conversion will likely be done by a hardware module, which will be much faster than receiving camera frames in YUV buffers and doing your own conversion.
Grafika's "texture from camera" activity demonstrates manipulation of incoming video data with GLES. After rendering you can get the pixels with glReadPixels(). The performance of glReadPixels() can vary significantly between devices and different use cases. EglSurfaceBase#saveFrame() shows how to capture to a Bitmap and save as PNG.
More information about the Android graphics architecture, notably the producer-consumer nature of SurfaceView surfaces, can be found in this document.
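As a rough illustration of the glReadPixels() route described above, here is a minimal sketch loosely modeled on Grafika's EglSurfaceBase#saveFrame(). It assumes a current EGL context and that the camera frame has just been rendered at the given size; it is not the Grafika code itself.
// Needs android.opengl.GLES20, android.graphics.Bitmap, android.graphics.Matrix,
// java.nio.ByteBuffer and java.nio.ByteOrder.
public static Bitmap captureRenderedFrame(int width, int height) {
    // read back the RGBA pixels of the frame that was just rendered
    ByteBuffer buf = ByteBuffer.allocateDirect(width * height * 4);
    buf.order(ByteOrder.LITTLE_ENDIAN);
    GLES20.glReadPixels(0, 0, width, height,
            GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, buf);
    buf.rewind();

    Bitmap bmp = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
    bmp.copyPixelsFromBuffer(buf);

    // glReadPixels() returns rows bottom-up, so flip vertically
    Matrix flip = new Matrix();
    flip.preScale(1.0f, -1.0f);
    Bitmap flipped = Bitmap.createBitmap(bmp, 0, 0, width, height, flip, false);
    bmp.recycle();
    return flipped;
}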
public class AndroidSurfaceviewExample extends Activity implements SurfaceHolder.Callback {
static Camera camera;
SurfaceView surfaceView;
SurfaceHolder surfaceHolder;
static boolean boo;
static Thread x;
GLSurfaceView glSurfaceView;
public static Bitmap mBitmap;
public static Camera.Parameters param;
public static Camera.Size mPreviewSize;
public static byte[] byteArray;
PictureCallback jpegCallback;
private Bitmap inputBMP = null, bmp, bmp1;
public static ImageView imgScreen;
/** Called when the activity is first created. */
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.camera);
surfaceView = (SurfaceView) findViewById(R.id.surfaceView);
surfaceHolder = surfaceView.getHolder();
Button btnTakeScreen = (Button)findViewById(R.id.btnTakeScreen);
imgScreen = (ImageView)findViewById(R.id.imgScreen);
btnTakeScreen.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
Bitmap screen = Bitmap.createBitmap(getBitmap());
imgScreen.setImageBitmap(screen);
}
});
// Install a SurfaceHolder.Callback so we get notified when the
// underlying surface is created and destroyed.
surfaceHolder.addCallback(this);
// deprecated setting, but required on Android versions prior to 3.0
surfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
jpegCallback = new PictureCallback() {
@Override
public void onPictureTaken(byte[] data, Camera camera) {
FileOutputStream outStream = null;
try {
outStream = new FileOutputStream(String.format("/sdcard/%d.jpg", System.currentTimeMillis()));
outStream.write(data);
outStream.close();
Log.d("Log", "onPictureTaken - wrote bytes: " + data.length);
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
} finally {
}
Toast.makeText(getApplicationContext(), "Picture Saved", Toast.LENGTH_SHORT).show();
refreshCamera();
}
};
}
public void refreshCamera() {
if (surfaceHolder.getSurface() == null) {
// preview surface does not exist
return;
}
// stop preview before making changes
try {
camera.stopPreview();
} catch (Exception e) {
// ignore: tried to stop a non-existent preview
}
// set preview size and make any resize, rotate or
// reformatting changes here
// start preview with new settings
try {
camera.setPreviewDisplay(surfaceHolder);
camera.startPreview();
} catch (Exception e) {
}
}
public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
// Now that the size is known, set up the camera parameters and begin
// the preview.
refreshCamera();
}
public void surfaceCreated(SurfaceHolder holder) {
if (camera == null) {
try {
camera = Camera.open();
} catch (RuntimeException ignored) {
}
}
try {
if (camera != null) {
WindowManager winManager = (WindowManager) getApplicationContext().getSystemService(Context.WINDOW_SERVICE);
camera.setPreviewDisplay(surfaceHolder);
}
} catch (Exception e) {
if (camera != null)
camera.release();
camera = null;
}
if (camera == null) {
return;
} else {
camera.setPreviewCallback(new Camera.PreviewCallback() {
@Override
public void onPreviewFrame(byte[] bytes, Camera camera) {
if (param == null) {
return;
}
byteArray = bytes;
}
});
}
param = camera.getParameters();
mPreviewSize = param.getSupportedPreviewSizes().get(0);
param.setColorEffect(Camera.Parameters.EFFECT_NONE);
//set antibanding to none
if (param.getAntibanding() != null) {
param.setAntibanding(Camera.Parameters.ANTIBANDING_OFF);
}
// set white balance
if (param.getWhiteBalance() != null) {
param.setWhiteBalance(Camera.Parameters.WHITE_BALANCE_CLOUDY_DAYLIGHT);
}
//set flash
if (param.getFlashMode() != null) {
param.setFlashMode(Camera.Parameters.FLASH_MODE_OFF);
}
//set zoom
if (param.isZoomSupported()) {
param.setZoom(0);
}
//set focus mode
param.setFocusMode(Camera.Parameters.FOCUS_MODE_INFINITY);
// modify parameter
camera.setParameters(param);
try {
// The Surface has been created, now tell the camera where to draw
// the preview.
camera.setPreviewDisplay(surfaceHolder);
camera.startPreview();
} catch (Exception e) {
// check for exceptions
System.err.println(e);
return;
}
}
public void surfaceDestroyed(SurfaceHolder holder) {
// stop preview and release camera
camera.stopPreview();
camera.release();
camera = null;
}
public Bitmap getBitmap() {
try {
if (param == null)
return null;
if (mPreviewSize == null)
return null;
int format = param.getPreviewFormat();
YuvImage yuvImage = new YuvImage(byteArray, format, mPreviewSize.width, mPreviewSize.height, null);
ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream();
Log.i("myLog","array: "+byteArray.toString());
Rect rect = new Rect(0, 0, mPreviewSize.width, mPreviewSize.height);
yuvImage.compressToJpeg(rect, 75, byteArrayOutputStream);
BitmapFactory.Options options = new BitmapFactory.Options();
options.inPurgeable = true;
options.inInputShareable = true;
mBitmap = BitmapFactory.decodeByteArray(byteArrayOutputStream.toByteArray(), 0, byteArrayOutputStream.size(), options);
byteArrayOutputStream.flush();
byteArrayOutputStream.close();
} catch (IOException ioe) {
ioe.printStackTrace();
}
return mBitmap;
}
}

Go from application to camera take photo and store in a column in database

I have a requirement where I go from my application to the camera, and the photo taken should be stored in a column in the database; then, from the database, I want to show that column's data in a ListView along with other data. I have an idea about the ListView, using an ImageView etc. I just don't know where to start with accessing the camera and storing the image in a database column. Any tips, or even a tutorial I could look at, would be helpful.
public class CameraAPI extends Activity implements SurfaceHolder.Callback{
public Camera camera;
MediaRecorder mediaRecorder;
public void onCreate (Bundle savedInstanceState){
super.onCreate(savedInstanceState);
setContentView(R.layout.camera);
SurfaceView surface = (SurfaceView)findViewById(R.id.acccam);
SurfaceHolder holder = surface.getHolder();
holder.addCallback(this);
holder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
}
public void tackPhoto(View view){
takePicture();
}
public void surfaceChanged(SurfaceHolder holder, int format, int width,
int height) {
// TODO Auto-generated method stub
}
public void surfaceCreated(SurfaceHolder holder) {
// TODO Auto-generated method stub
if (mediaRecorder == null){
try{
camera = Camera.open();
camera.setPreviewDisplay(holder);
camera.startPreview();
}catch (IOException e){
Log.d("CAMERA", e.getMessage());
}
}
}
public void surfaceDestroyed(SurfaceHolder holder) {
// TODO Auto-generated method stub
camera.stopPreview();
camera.release();
}
public void takePicture(){
camera.takePicture(shutterCallback, rawCallback, jpegCallback);
}
ShutterCallback shutterCallback= new ShutterCallback()
{
public void onShutter(){
}
};
PictureCallback rawCallback = new PictureCallback(){
public void onPictureTaken(byte[] data, Camera camera){
}
};
PictureCallback jpegCallback = new PictureCallback(){
public void onPictureTaken(byte[] data, Camera camera){
FileOutputStream outStream = null;
try{
outStream = new FileOutputStream("/sdcard/Image.jpg");
outStream.write(data);
outStream.close();
} catch (FileNotFoundException e){
Log.d("CAMERA", e.getMessage());
} catch (IOException e){
Log.d("CAMERA", e.getMessage());
}
}
};
}
This is my camera API code. Is there any way to tweak it to add the image to a database instead of the SD card?
Thanks,
Stefan
Start here for reference to using the camera, then read this page to learn about using an SQLite database.
Go through the following link. It's a good example of capturing an image from your application:
Capture image from your app using camera
Once you have the path of captured image, you can save it in your database.
First save the taken picture in the device's cache directory, convert the saved image to a byte array and then save it in the database. Do this in a background thread so that the UI thread is not blocked.
Do not forget to delete that image from the cache once it has been saved in your database.
Or try directly saving the byte array passed to onPictureTaken() to your database:
values.put(MyBaseColumn.MyTable.ImageColumn, data);
and insert these values into your database.
Hope this will work.
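As a minimal sketch of that idea (the "photos" table and its columns are hypothetical, and the insert should run off the UI thread):
// Store the JPEG bytes from onPictureTaken() as a BLOB.
public void savePictureToDb(SQLiteDatabase db, byte[] jpegData) {
    ContentValues values = new ContentValues();
    values.put("image", jpegData);                  // BLOB column
    values.put("taken_at", System.currentTimeMillis());
    db.insert("photos", null, values);
}

// Reading it back for the ListView's ImageView:
// byte[] blob = cursor.getBlob(cursor.getColumnIndexOrThrow("image"));
// Bitmap bmp = BitmapFactory.decodeByteArray(blob, 0, blob.length);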
Go through the link, which will help you save the image in the database.
