I am trying to capture an image and then display it in an ImageView in my fragment. I searched the web but still cannot get it to work.
Below is what I am doing.
btnCaptureImg.setOnClickListener(new OnClickListener() {
@Override
public void onClick(View v) {
Intent cameraIntent = new Intent(android.provider.MediaStore.ACTION_IMAGE_CAPTURE);
startActivityForResult(cameraIntent, 1888);
}
});
@Override
public void onActivityResult(int requestCode, int resultCode, Intent data) {
Log.e("LOG", ""+requestCode);
if (requestCode == 1888 && resultCode == Activity.RESULT_OK) {
Bitmap photo = (Bitmap) data.getExtras().get("data");
imageView.setImageBitmap(photo);
}
super.onActivityResult(requestCode, resultCode, data);
}
Logcat only prints 1888, but the ImageView does not load the image.
I also tried this:
btnCaptureImg.setOnClickListener(new OnClickListener() {
@Override
public void onClick(View v) {
File folder = new File(Environment.getExternalStorageDirectory().toString()+"/ImagesFolder/");
folder.mkdirs();
Intent cameraIntent = new Intent(android.provider.MediaStore.ACTION_IMAGE_CAPTURE);
resultingFile = new File(folder.toString() + "/image.jpg");
Uri uriSavedImage=Uri.fromFile(resultingFile);
cameraIntent.putExtra(MediaStore.EXTRA_OUTPUT, uriSavedImage);
startActivityForResult(cameraIntent, 1888);
}
});
But it throws this exception:
E/AndroidRuntime(4824): java.lang.RuntimeException: Unable to resume activity {com.myapp.android/com.myapp.android.MyHomeActivity}: java.lang.RuntimeException: Failure delivering result ResultInfo{who=null, request=264032, result=-1, data=null} to activity {com.myapp.android/com.myapp.android.MyHomeActivity}: java.lang.NullPointerException
E/AndroidRuntime(4824): at com.myapp.android.MyFragment.onActivityResult(MyFragment.java:300)
Try putting
cameraIntent.putExtra(MediaStore.EXTRA_OUTPUT, your_image_uri);
before this
startActivityForResult(cameraIntent, 1888);
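When EXTRA_OUTPUT is set, the camera writes the full image to that URI and the returned Intent may be null, so the matching onActivityResult should read the saved file rather than data.getExtras(). A minimal sketch, assuming a resultingFile field in the fragment and an existing imageView (the names are only for illustration):
private File resultingFile; // field holding the capture target

private void launchCamera() {
    File folder = new File(Environment.getExternalStorageDirectory(), "ImagesFolder");
    folder.mkdirs();
    resultingFile = new File(folder, "image.jpg");
    Intent cameraIntent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
    // Set the output target BEFORE starting the activity.
    cameraIntent.putExtra(MediaStore.EXTRA_OUTPUT, Uri.fromFile(resultingFile));
    startActivityForResult(cameraIntent, 1888);
}

@Override
public void onActivityResult(int requestCode, int resultCode, Intent data) {
    super.onActivityResult(requestCode, resultCode, data);
    if (requestCode == 1888 && resultCode == Activity.RESULT_OK) {
        // data may be null here; decode the file the camera was asked to write.
        Bitmap photo = BitmapFactory.decodeFile(resultingFile.getAbsolutePath());
        imageView.setImageBitmap(photo);
    }
}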
I'm sorry mate, but I can't seem to understand why it's giving that exception. Try this approach; it works for me.
Define the file globally, and inside onActivityResult:
if (requestCode == 1888 && resultCode == Activity.RESULT_OK) {
Bitmap photo = Media.getBitmap(getActivity().getContentResolver(), Uri.fromFile(resultingFile));
//Bitmap photo = (Bitmap) data.getExtras().get("data");
imageView.setImageBitmap(photo);
}
Check if your onActivityResult is being called. If not, check:
onActivityResult is not being called in Fragment
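If the fragment is hosted by a support-library FragmentActivity that overrides onActivityResult itself, make sure it calls through to super, otherwise the result never reaches the fragment. A minimal sketch (assuming a FragmentActivity host):
// In the host FragmentActivity: calling super dispatches the result to attached fragments.
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    super.onActivityResult(requestCode, resultCode, data);
}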
First, in the fragment, start the intent to capture the image:
private void startIntentImageCapture() {
Intent cameraIntent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
try {
File storageDir = Environment.getExternalStoragePublicDirectory(
Environment.DIRECTORY_PICTURES);
File image = File.createTempFile("imageName", ".jpg", storageDir);
path = Uri.fromFile(image);
cameraIntent.putExtra(MediaStore.EXTRA_OUTPUT, path);
getActivity().startActivityForResult(cameraIntent, CAPTURE_IMAGE_REQUEST_CODE);
} catch (IOException e) {
e.printStackTrace();
}
}
The path variable must be a public static Uri field, and you also need a public static Integer for the request code: public static final Integer CAPTURE_IMAGE_REQUEST_CODE = 100;
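For clarity, the declarations in the fragment could look like this (the names match the snippet above):
// In YourFragment: shared with the host Activity via static access.
public static Uri path;
public static final Integer CAPTURE_IMAGE_REQUEST_CODE = 100;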
Then in the Activity, read this path and decode the image:
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
super.onActivityResult(requestCode, resultCode, data);
if (requestCode == YourFragment.CAPTURE_IMAGE_REQUEST_CODE) {
if (resultCode == RESULT_OK) {
Bitmap imageBitmap = decodeFile(YourFragment.path);
//do something with your photo
}
}
}
I'm sure this is not the best or cleanest method, but it works for me. Hope it helps!
EDIT
I forgot the code to decode the image.
public Bitmap decodeFile(Uri uriFile) throws OutOfMemoryError {
String filePath = uriFile.getPath();
BitmapFactory.Options bmOptions;
Bitmap imageBitmap;
try {
imageBitmap = BitmapFactory.decodeFile(filePath);
} catch (OutOfMemoryError e) {
bmOptions = new BitmapFactory.Options();
bmOptions.inSampleSize = 4;
bmOptions.inPurgeable = true;
imageBitmap = BitmapFactory.decodeFile(filePath, bmOptions);
}
return imageBitmap;
}
The bmOptions downsample the image so that an OutOfMemoryError is far less likely to occur.
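If a fixed inSampleSize of 4 is too aggressive (or not enough), a common refinement, not part of the answer above, is to compute the sample size from the image bounds and a target size:
// Decode only the bounds first, then pick a power-of-two sample size so the
// decoded bitmap ends up no larger than roughly reqWidth x reqHeight.
public static int calculateInSampleSize(String filePath, int reqWidth, int reqHeight) {
    BitmapFactory.Options options = new BitmapFactory.Options();
    options.inJustDecodeBounds = true; // reads dimensions without allocating pixels
    BitmapFactory.decodeFile(filePath, options);
    int inSampleSize = 1;
    while (options.outWidth / (inSampleSize * 2) >= reqWidth
            && options.outHeight / (inSampleSize * 2) >= reqHeight) {
        inSampleSize *= 2;
    }
    return inSampleSize;
}
Then set bmOptions.inSampleSize = calculateInSampleSize(filePath, 1024, 1024); before the full decode (the target size here is just an example).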
I have tried many code samples from Stack Overflow and the internet, but I am not able to work out how to pick an image from the camera. I used the following code, but data.getData() always returns null. I don't know how to solve this. Please help.
Intent takePictureIntent = new Intent(android.provider.MediaStore.ACTION_IMAGE_CAPTURE);
if (takePictureIntent.resolveActivity(getPackageManager()) != null) {
startActivityForResult(takePictureIntent, PICK_IMAGE_CAMERA1);
}
protected void onActivityResult(int requestCode, int resultCode, Intent data)
{
if (requestCode == 30 && resultCode == Activity.RESULT_OK && data != null)
{
Bundle extras = data.getExtras();
Bitmap imageBitmap = (Bitmap) extras.get("data");
imageView.setImageBitmap(imageBitmap);
}
}
Try this code @Thrishool,
private static final int CAMERA = 1;
//choosing the image from camera
Intent intent = new Intent(android.provider.MediaStore.ACTION_IMAGE_CAPTURE);
startActivityForResult(intent, CAMERA);
//now get the data from the onActivity result
@Override
public void onActivityResult(int requestCode, int resultCode, Intent data) {
super.onActivityResult(requestCode, resultCode, data);
if (requestCode == CAMERA && resultCode == RESULT_OK && data != null) {
Bitmap thumbnail = (Bitmap) data.getExtras().get("data");
imageview.setImageBitmap(thumbnail);
ByteArrayOutputStream baos=new ByteArrayOutputStream();
thumbnail.compress(Bitmap.CompressFormat.PNG,100, baos);
byte [] b=baos.toByteArray();
String temp = Base64.encodeToString(b, Base64.DEFAULT);
Log.e("savedImage",temp);
Toast.makeText(ProfileActivity.this, "Image Saved!", Toast.LENGTH_SHORT).show();
}
}
Try this and let me know @Thrishool
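If you ever need the saved Base64 string temp back as a bitmap, the reverse of the encoding above is just:
// Decode the Base64 string produced above back into a Bitmap.
byte[] decoded = Base64.decode(temp, Base64.DEFAULT);
Bitmap restored = BitmapFactory.decodeByteArray(decoded, 0, decoded.length);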
I have a problem when I capture an image using an Intent that launches the Android camera. The image I get back through the Intent data is only a resized (thumbnail) Bitmap. Maybe I have a wrong understanding, but please suggest what I can do to correct it. The ImageView displays the same image I captured, but a very blurred one.
Here is the underlying code:
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
InputStream stream= null;
super.onActivityResult(requestCode, resultCode, data);
if(this.requestCode == requestCode && resultCode == RESULT_OK){
try{
//stream= getContentResolver().openInputStream(data.getData());
Bundle extras= data.getExtras();
Bitmap bitmap = (Bitmap) extras.get("data");
imageHolder.setImageBitmap(bitmap);
}
catch (Exception ex){
ex.printStackTrace();
}
}
}
try{
//stream= getContentResolver().openInputStream(data.getData());
Bundle extras= data.getExtras();
Bitmap bitmap = (Bitmap) extras.get("data");
//set value Width=200, Height=200
Bitmap resizedimg = Bitmap.createScaledBitmap(bitmap, Width, Height, true);
imageHolder.setImageBitmap(resizedimg);
}
catch (Exception ex){
ex.printStackTrace();
}
Maybe you should pass a URI to the camera and load the image from it in onActivityResult, instead of using the returned bitmap.
Try this:
public class MainActivity extends Activity {
static int CAMERA_TAKE_PIC =1;
Uri outPutfileUri;
ImageView mImageView;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
mImageView = (ImageView) findViewById(R.id.image);
}
public void CameraClick(View v) {
Intent intent= new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
File file = new File(Environment.getExternalStorageDirectory(),
"Photo.jpg");
outPutfileUri = Uri.fromFile(file);
intent.putExtra(MediaStore.EXTRA_OUTPUT, outPutfileUri);
startActivityForResult(intent, CAMERA_TAKE_PIC);
}
Bitmap bitmap = null;
@Override
protected void onActivityResult(int requestCode, int resultCode,Intent data)
{
if (requestCode == CAMERA_TAKE_PIC && resultCode==RESULT_OK) {
String uri = outPutfileUri.toString();
Log.e("uri-:", uri);
Toast.makeText(this, outPutfileUri.toString(),Toast.LENGTH_LONG).show();
//Bitmap myBitmap = BitmapFactory.decodeFile(uri);
// mImageView.setImageURI(Uri.parse(uri)); OR the drawable approach can make the image stretchable, so also try the code below
try {
bitmap = MediaStore.Images.Media.getBitmap(this.getContentResolver(), outPutfileUri);
Drawable d = new BitmapDrawable(getResources(), bitmap);
mImageView.setImageDrawable(d);
} catch (IOException e) {
e.printStackTrace();
}
}
}
}
I think the image is blurred because the intent data contains only a thumbnail of the image. You can save the image to internal/external storage and then load it from there.
The image is probably blurred because the intent data contains only a thumbnail of the image. You must save the image to internal/external storage and then load it from there. Also, your question should really be how to get a full-size image from the camera.
Go through this doc https://developer.android.com/training/camera/photobasics.html.
This is how it should be done
Intent i=new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
File dir=
Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_DCIM);
output=new File(dir, "CameraContentDemo.jpeg");
i.putExtra(MediaStore.EXTRA_OUTPUT, Uri.fromFile(output));
startActivityForResult(i, CONTENT_REQUEST);
A reference can be found in this link:
https://stackoverflow.com/a/32121829/3111083
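The snippet above only shows the launch side. A minimal sketch of the matching result handler, assuming output is kept as a File field, CONTENT_REQUEST is your request code constant, and an imageView exists in your layout:
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    super.onActivityResult(requestCode, resultCode, data);
    if (requestCode == CONTENT_REQUEST && resultCode == RESULT_OK) {
        // The full-size photo was written to the file passed via EXTRA_OUTPUT.
        Bitmap fullSize = BitmapFactory.decodeFile(output.getAbsolutePath());
        imageView.setImageBitmap(fullSize);
    }
}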
So... I'm trying to choose a picture using both the camera and the gallery, then pass it to an external library to crop the image.
The problem lies with saving the image after taking it with the camera. I've managed to get the camera running and take an image with it; it then displays the image and the default Android OK and back buttons. The OK button responds to touch (the touch effect can be seen), but it doesn't do anything.
Here's the code for getting the file ready to save
date = calendar.getTime();
SimpleDateFormat simpleDateFormat = new SimpleDateFormat("ddMMyyyyHHmmss");
dateString = simpleDateFormat.format(date);
dateBuilder = new StringBuilder().append(dateString).append(".jpg");
SAMPLE_CROPPED_IMAGE_NAME = dateBuilder.toString();
final String cameraDir = Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_PICTURES) + "/SamplePics/";
newDir = new File(cameraDir);
newDir.mkdirs();
This is the code for the camera function:
RelativeLayout.OnClickListener photoCameraWrapperHandler = new RelativeLayout.OnClickListener(){
@Override
public void onClick(View v) {
//Intent intent = new Intent(SignupStepThreeActivity.this, SignupStepFourActivity.class);
//startActivity(intent);
String cameraFile = SAMPLE_CROPPED_IMAGE_NAME;
newFile = new File(cameraFile);
try {
newFile.createNewFile();
}
catch (IOException e)
{
}
Uri outputFileUri = Uri.fromFile(newFile);
Intent cameraIntent = new Intent(android.provider.MediaStore.ACTION_IMAGE_CAPTURE);
cameraIntent.putExtra(MediaStore.EXTRA_OUTPUT, outputFileUri);
startActivityForResult(cameraIntent, CAMERA_REQUEST);
}
};
And here's the onActivityResult :
private static final int CAMERA_REQUEST = 1888;
private static final int REQUEST_SELECT_PICTURE = 0x01;
@Override
public void onActivityResult(int requestCode, int resultCode, Intent data) {
if (resultCode == RESULT_OK) {
if (requestCode == REQUEST_SELECT_PICTURE || requestCode == CAMERA_REQUEST) {
final Uri selectedUri = data.getData();
if (selectedUri != null) {
startCropActivity(data.getData());
} else {
Toast.makeText(SignupStepThreeActivity.this, R.string.toast_cannot_retrieve_selected_image, Toast.LENGTH_SHORT).show();
}
} else if (requestCode == UCrop.REQUEST_CROP) {
handleCropResult(data);
}
}
if (resultCode == UCrop.RESULT_ERROR) {
handleCropError(data);
}
}
Several updates ago it crashed when I clicked the camera button; I suspected that was because I take the URI of the image from storage but hadn't created the folder yet. Now I've finally managed to get the camera running, but not saving. The create-folder part works, though.
public String getCamerPath(Context context) {
SharedPreferences prefs = context.getSharedPreferences("setCamerPath", 0);
String value = prefs.getString("getCamerPath", "");
return value;
}
public void setCamerPath(Context context, String value)
{
SharedPreferences prefs = context.getSharedPreferences("setCamerPath", 0);
SharedPreferences.Editor editor = prefs.edit();
editor.putString("getCamerPath", value);
editor.commit();
}
Uri outputFileUri = Uri.fromFile(newFile);
setCamerPath(this, outputFileUri.getPath());
Intent cameraIntent = new Intent(android.provider.MediaStore.ACTION_IMAGE_CAPTURE);
cameraIntent.putExtra(MediaStore.EXTRA_OUTPUT, outputFileUri);
startActivityForResult(cameraIntent, CAMERA_REQUEST);
@Override
public void onActivityResult(int requestCode, int resultCode, Intent data) {
if (resultCode == RESULT_OK) {
if (requestCode == REQUEST_SELECT_PICTURE || requestCode == CAMERA_REQUEST) {
startCropActivity(getCamerPath(this));
} else if (requestCode == UCrop.REQUEST_CROP) {
handleCropResult(data);
}
}
if (resultCode == UCrop.RESULT_ERROR) {
handleCropError(data);
}
}
In the camera OnClickListener, I commented out some things, like this:
Intent cameraIntent = new Intent(android.provider.MediaStore.ACTION_IMAGE_CAPTURE);
//Uri outputFileUri = Uri.fromFile(newFile);
//cameraIntent.putExtra(MediaStore.EXTRA_OUTPUT, outputFileUri);
//setCamerPath(SignupStepThreeActivity.this, outputFileUri.getPath());
startActivityForResult(cameraIntent, CAMERA_REQUEST);
In my onActivityResult, I modified part of the CAMERA_REQUEST handling:
else if (requestCode == CAMERA_REQUEST) {
Uri capturedImageUri = data.getData();
Bitmap bitmap;
if (capturedImageUri != null) {
startCropActivity(capturedImageUri);
} else {
Toast.makeText(SignupStepThreeActivity.this, R.string.toast_cannot_retrieve_selected_image, Toast.LENGTH_SHORT).show();
Bundle extras = data.getExtras();
bitmap = (Bitmap) extras.get("data");
Uri imageUri = getImageUri(SignupStepThreeActivity.this, bitmap);
startCropActivity(imageUri);
}
}
So basically, as I've gathered from reading around for the past 4 hours: data is an Intent. It will always be non-null, BUT it will not always contain the Uri of the image itself. Because I need the Uri for a library I'm using, I use a method called getImageUri from another Stack Overflow answer.
public static Uri getImageUri(Context inContext, Bitmap inImage)
{
ByteArrayOutputStream bytes = new ByteArrayOutputStream();
inImage.compress(Bitmap.CompressFormat.JPEG, 100, bytes);
String path = MediaStore.Images.Media.insertImage(inContext.getContentResolver(), inImage, "Title", null);
return Uri.parse(path);
}
Now the problem lies with the bitmap image quality after taking the picture; I'm going to tinker around with it and come back with an update.
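If the quality from that getImageUri round trip is still too low, one possible fix (just a sketch, assuming you re-enable the EXTRA_OUTPUT lines that were commented out and that startCropActivity accepts a Uri, as in the first snippet) is to crop the full-resolution file the camera writes, using the path saved with setCamerPath:
if (requestCode == CAMERA_REQUEST) {
    // Reuse the full-resolution file written to EXTRA_OUTPUT instead of
    // re-inserting the low-resolution thumbnail into MediaStore.
    Uri fullSizeUri = Uri.fromFile(new File(getCamerPath(this)));
    startCropActivity(fullSizeUri);
}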
This can also resolve the problem
@Override
protected void onSaveInstanceState(Bundle outState) {
super.onSaveInstanceState(outState);
Log.d("camera", "onSaveInstance");
// save file url in bundle as it will be null on screen orientation
// changes
outState.putParcelable("file_uri", fileUri);
}
@Override
protected void onRestoreInstanceState(Bundle savedInstanceState) {
super.onRestoreInstanceState(savedInstanceState);
Log.d("camera", "onRestoreInstance");
// get the file url
fileUri = savedInstanceState.getParcelable("file_uri");
}
In my Fragment I try to take a picture with the camera, but onActivityResult of my Fragment is not called. After taking the photo, this Fragment is not shown and the app switches back to my first Fragment. Is there any other way to capture photos in a Fragment, or what am I doing wrong?
Here is my current code:
public void takePhoto() {
Intent intent = new Intent("android.media.action.IMAGE_CAPTURE");
File photo = new File(Environment.getExternalStorageDirectory(), "Pic.jpg");
intent.putExtra(MediaStore.EXTRA_OUTPUT,
Uri.fromFile(photo));
imageUri = Uri.fromFile(photo);
PhotosListFragment.this.startActivityForResult(intent, 100);
}
@Override
public void onActivityResult(int requestCode, int resultCode, Intent data) {
super.onActivityResult(requestCode, resultCode, data);
switch (requestCode) {
case 100:
if (resultCode == Activity.RESULT_OK) {
Uri selectedImage = imageUri;
getActivity().getContentResolver().notifyChange(selectedImage, null);
ContentResolver cr = getActivity().getContentResolver();
Bitmap bitmap;
try {
bitmap = android.provider.MediaStore.Images.Media
.getBitmap(cr, selectedImage);
viewHolder.imageView.setImageBitmap(bitmap);
Toast.makeText(getActivity(), selectedImage.toString(),
Toast.LENGTH_LONG).show();
} catch (Exception e) {
Toast.makeText(getActivity(), "Failed to load", Toast.LENGTH_SHORT)
.show();
Log.e("Camera", e.toString());
}
}
}
}
Hope this will help you:
public class CameraImage extends Fragment {
private static final int CAPTURE_IMAGE_ACTIVITY_REQUEST_CODE = 1888;
Button button;
ImageView imageView;
@Override
public View onCreateView(LayoutInflater inflater, ViewGroup container,
Bundle savedInstanceState) {
final View rootView = inflater.inflate(R.layout.camera_image,
container, false);
button = (Button) rootView.findViewById(R.id.button);
imageView = (ImageView) rootView.findViewById(R.id.imageview);
button.setOnClickListener(new OnClickListener() {
@Override
public void onClick(View view) {
Intent intent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
startActivityForResult(intent,
CAPTURE_IMAGE_ACTIVITY_REQUEST_CODE);
}
});
return rootView;
}
@Override
public void onActivityResult(int requestCode, int resultCode, Intent data) {
if (requestCode == CAPTURE_IMAGE_ACTIVITY_REQUEST_CODE) {
if (resultCode == Activity.RESULT_OK) {
Bitmap bmp = (Bitmap) data.getExtras().get("data");
ByteArrayOutputStream stream = new ByteArrayOutputStream();
bmp.compress(Bitmap.CompressFormat.PNG, 100, stream);
byte[] byteArray = stream.toByteArray();
// convert byte array to Bitmap
Bitmap bitmap = BitmapFactory.decodeByteArray(byteArray, 0,
byteArray.length);
imageView.setImageBitmap(bitmap);
}
}
}
}
This is one of the most popular issues. You can find lots of threads about it, but none of them were useful for ME.
So I solved this problem using the solution below.
Let's first understand why this is happening.
We can call startActivityForResult directly from a Fragment, but the mechanics behind it are all handled by the Activity.
Once you call startActivityForResult from a Fragment, the requestCode is changed to attach the Fragment's identity to the code. That lets the Activity track back which Fragment sent the request once the result is received.
Once you navigate back, the result is delivered to the Activity's onActivityResult with the modified requestCode, which is decoded into the original requestCode plus the Fragment's identity. After that, the Activity passes the result on to that Fragment through onActivityResult. And that's it.
The problem is:
The Activity can deliver the result only to a Fragment that is attached directly to the Activity, not to a nested one. That's why onActivityResult of a nested fragment is never called, no matter what.
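Before the solution below: if you really do need to keep a nested (child) fragment, one workaround, sketched here as an assumption rather than part of this answer, is to forward the result manually from the parent fragment using the support-library child FragmentManager:
// In the PARENT fragment: pass the result down to any nested child fragments.
@Override
public void onActivityResult(int requestCode, int resultCode, Intent data) {
    super.onActivityResult(requestCode, resultCode, data);
    List<Fragment> children = getChildFragmentManager().getFragments();
    if (children != null) {
        for (Fragment child : children) {
            if (child != null) {
                child.onActivityResult(requestCode, resultCode, data);
            }
        }
    }
}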
Solution:
1) Start the camera Intent in your Fragment with the code below:
Intent intent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
Fragment frag = this;
/** Pass your fragment reference **/
frag.startActivityForResult(intent, REQUEST_IMAGE_CAPTURE); // REQUEST_IMAGE_CAPTURE = 12345
2) Now, in your parent Activity, override onActivityResult():
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
super.onActivityResult(requestCode, resultCode, data);
}
You have to call this in the parent Activity to make it work.
3) In your fragment call:
@Override
public void onActivityResult(int requestCode, int resultCode, Intent data) {
super.onActivityResult(requestCode, resultCode, data);
if (resultCode == Activity.RESULT_OK) {
if (requestCode == REQUEST_IMAGE_CAPTURE) {
// Do something with imagePath
Bitmap photo = (Bitmap) data.getExtras().get("data");
imageview.setImageBitmap(photo);
// CALL THIS METHOD TO GET THE URI FROM THE BITMAP
Uri selectedImage = getImageUri(getActivity(), photo);
String realPath=getRealPathFromURI(selectedImage);
selectedImage = Uri.parse(realPath);
}
}
}
4) Reference methods for getting URI:
-> Method for getting Uri from the Bitmap
public Uri getImageUri(Context inContext, Bitmap inImage) {
ByteArrayOutputStream bytes = new ByteArrayOutputStream();
inImage.compress(Bitmap.CompressFormat.JPEG, 100, bytes);
String path = MediaStore.Images.Media.insertImage(inContext.getContentResolver(), inImage, "Title", null);
return Uri.parse(path);
}
-> Method for getting File path from the Uri
public String getRealPathFromURI(Uri contentUri) {
Cursor cursor = null;
try {
String[] proj = { MediaStore.Images.Media.DATA };
cursor = getActivity().getContentResolver().query(contentUri, proj, null, null, null);
int column_index = cursor.getColumnIndexOrThrow(MediaStore.Images.Media.DATA);
cursor.moveToFirst();
return cursor.getString(column_index);
} finally {
if (cursor != null) {
cursor.close();
}
}
}
That's it.
This solution can be applied to any fragment, whether it is nested or not. And yes, it covers all the cases! Moreover, the code is nice and clean.
I tried your code and it's working fine, dude. I changed
PhotosListFragment.this.startActivityForResult(intent, 100);
to
getActivity().startActivityForResult(intent, 100);
which, after taking the picture, returns back to the same activity.
I think both of your fragments are in the same activity.
If that is the situation, I suggest you create a new activity and put the new fragment there.
For a Fragment, this is the simplest solution:
cameraIamgeView.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
Intent cameraIntent=new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
getActivity().startActivityFromFragment(PlaceOrderFragment.this, cameraIntent, CAPTURE_IMAGE_ACTIVITY_REQUEST_CODE);
}
});
@Override
public void onActivityResult(int requestCode, int resultCode, Intent data){
// super.onActivityResult(requestCode, resultCode, data);
try {
if (requestCode == CAPTURE_IMAGE_ACTIVITY_REQUEST_CODE) {
if (resultCode == Activity.RESULT_OK && data != null) {
Bitmap bmp = (Bitmap) data.getExtras().get("data");
ByteArrayOutputStream stream = new ByteArrayOutputStream();
/*
bmp.compress(Bitmap.CompressFormat.PNG, 100, stream);
byte[] byteArray = stream.toByteArray();
// convert byte array to Bitmap
Bitmap bitmap = BitmapFactory.decodeByteArray(byteArray, 0,
byteArray.length);
*/
cameraIamgeView.setImageBitmap(bmp);
}
}
}catch(Exception e){
Toast.makeText(this.getActivity(), e+"Something went wrong", Toast.LENGTH_LONG).show();
}
}
I am new to Android development.
I want to develop a simple application that can take a picture using the phone's camera and show it on the screen.
Is there a simple example that I can use, or some code that can help me learn how to do it?
Thanks for any help.
To start the camera you use:
Intent cameraIntent = new Intent(android.provider.MediaStore.ACTION_IMAGE_CAPTURE);
startActivityForResult(cameraIntent, 0);
and here you have the handling:
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
if (requestCode == 0 && resultCode == RESULT_OK) {
Bitmap photo = (Bitmap) data.getExtras().get("data");
imageView.setImageBitmap(photo);
}
}
Try this. Use the code below in onCreate:
Intent intent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
Uri mUri = Uri.fromFile(new File(Environment.getExternalStorageDirectory(),
"pic_" + String.valueOf(System.currentTimeMillis()) + ".jpg"));
intent.putExtra(android.provider.MediaStore.EXTRA_OUTPUT, mUri);
try {
intent.putExtra("return-data", true);
startActivityForResult(intent, CAMERA_RESULT);
} catch (ActivityNotFoundException e) {
e.printStackTrace();
}
Then in onActivityResult:
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
super.onActivityResult(requestCode, resultCode, data);
if (resultCode == RESULT_OK) {
//Here you will get the path of the image stored on the sdcard, then pass it to the next activity as desired.
Bundle extras = data.getExtras();
mImagePath = extras.getString("image-path");
mSaveUri = getImageUri(mImagePath);
Bitmap mBitmap = getBitmap(mImagePath);
// here mBitmap is assigned to any imageview and you can use it in for display
}
}
private Uri getImageUri(String path) {
return Uri.fromFile(new File(path));
}
private Bitmap getBitmap(String path) {
Uri uri = getImageUri(path);
InputStream in = null;
try {
in = getContentResolver().openInputStream(uri);
return BitmapFactory.decodeStream(in).copy(Config.ARGB_8888, true);
} catch (FileNotFoundException e) {
//Log.e(TAG, "file " + path + " not found");
}
return null;
}