I have an activity where I open an image picker intent, and when I get the selected Uri, I create a copy of the file in the cache directory and then pass the location of the copy to Picasso to load the image. I am doing this because some apps, like Google Photos, do not allow their actual Uris to be passed to other activities for security reasons.
Here is my code:
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_external_photo_share);
ButterKnife.bind(this);
LogUtil.i(TAG, "onCreate called");
tinyDB = new TinyDB(this);
if (tinyDB.getBoolean(AppConstants.LOGIN_STATE, false)) {
imageUri = (Uri) getIntent().getExtras().get(Intent.EXTRA_STREAM);
if (imageUri == null || imageUri.toString().isEmpty()) {
// finish() does not stop execution, so return to avoid using a null Uri below
ExternalPhotoShareActivity.this.finish();
return;
}
String newActualPath = getActualPathFromUri(imageUri);
Uri newUri = null;
if (newActualPath != null) {
newUri = Uri.fromFile(new File(newActualPath));
}
Intent intent = new Intent(ExternalPhotoShareActivity.this, AddCaptionActivity.class);
intent.putExtra("externalImageUri", newUri);
intent.putExtra("externalImagePath", newActualPath);
startActivity(intent);
ExternalPhotoShareActivity.this.finish();
} else {
final Dialog dialog = new Dialog(ExternalPhotoShareActivity.this);
dialog.requestWindowFeature(Window.FEATURE_NO_TITLE);
dialog.setContentView(R.layout.dialog_external_share_error);
dialog.setCanceledOnTouchOutside(false);
dialog.show();
TextView goBack = (TextView) dialog.findViewById(R.id.textView86);
goBack.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
try {
if(dialog != null) {
if(dialog.isShowing()) {
dialog.dismiss();
}
}
} catch (final IllegalArgumentException e) {
// Handle or log or ignore
} catch (final Exception e) {
// Handle or log or ignore
}
ExternalPhotoShareActivity.this.finish();
}
});
}
}
private String getActualPathFromUri(Uri selectedImageUri) {
Bitmap bitmap = null;
try {
bitmap = getBitmapFromUri(selectedImageUri);
} catch (IOException e) {
e.printStackTrace();
}
if (bitmap == null) {
return null;
}
File imageFileFolder = new File(getCacheDir(), "galleri5");
if (!imageFileFolder.exists()) {
imageFileFolder.mkdir();
}
FileOutputStream out = null;
File imageFileName = new File(imageFileFolder, "galleri5-" + System.currentTimeMillis() + ".jpg");
try {
out = new FileOutputStream(imageFileName);
bitmap.compress(Bitmap.CompressFormat.JPEG, 100, out);
out.flush();
} catch (IOException e) {
e.printStackTrace();
} finally {
if (out != null) {
try {
out.close();
} catch (IOException e) {
e.printStackTrace();
}
}
}
return imageFileName.getAbsolutePath();
}
private Bitmap getBitmapFromUri(Uri uri) throws IOException {
ParcelFileDescriptor parcelFileDescriptor =
getContentResolver().openFileDescriptor(uri, "r");
assert parcelFileDescriptor != null;
FileDescriptor fileDescriptor = parcelFileDescriptor.getFileDescriptor();
Bitmap image = BitmapFactory.decodeFileDescriptor(fileDescriptor);
parcelFileDescriptor.close();
return image;
}
I call getActualPathFromUri() in my activity's onCreate() method. Sometimes, when the images from the gallery are large, it takes a few seconds to load the image on the screen. So I thought about executing these two methods on a background thread so that I could show the UI while the background work is done.
I recently started using Android Support Annotations and tried to annotate getActualPathFromUri() with @WorkerThread. But in my onCreate() method, the call is marked red with the message that this method should be called from a worker thread; the currently inferred thread is main.
What is the proper way of doing this? Should I even do it in background thread or not? Thanks.
The annotations @WorkerThread and @UiThread are only used for flagging a method. Android Studio will raise an error when your method is called from a thread that does not match your annotated constraint. See this documentation.
For advice on threading with AsyncTask see this android developers blog post.
In this case, with the annotation @WorkerThread you're saying that getActualPathFromUri() should only be called from a worker thread. As you're calling it from onCreate(), which runs on the UI thread, lint detects the mismatch and flags it.
The fact that you annotate the method like that does not mean you're running the method on a worker thread. The annotation in this case is just a way to flag to the developer (who might be someone other than yourself if you're working in a team) that a particular method is meant to be called on a particular thread.
If you want to actually run that method on a worker thread, call it inside an AsyncTask#doInBackground method.
EDIT: if you don't want to use AsyncTask, you can use any other Android threading mechanism, like IntentService, GcmTaskService, and so on. But you should not run it on the UI thread, because it may cause jank.
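As a plain-Java illustration of the offloading idea (every name here is made up for the sketch; on Android you would deliver the result back via AsyncTask#onPostExecute or a main-thread Handler rather than a bare callback):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

// Sketch: run the slow cache-copy off the calling thread and deliver the
// resulting path through a callback. slowCopy() is a stand-in for the
// question's getActualPathFromUri(); it is illustrative, not real code.
public class BackgroundCopy {

    public interface Callback {
        void onPathReady(String path);
    }

    // Placeholder for the real copy-to-cache work.
    static String slowCopy(String uriString) {
        return "/cache/galleri5/galleri5-" + uriString.hashCode() + ".jpg";
    }

    public static void copyAsync(final String uriString, final Callback cb) {
        ExecutorService executor = Executors.newSingleThreadExecutor();
        executor.submit(new Runnable() {
            @Override
            public void run() {
                cb.onPathReady(slowCopy(uriString));
            }
        });
        executor.shutdown(); // finish the queued work, then let the thread exit
    }
}
```

The point is only that the heavy work happens off the caller's thread; the `@WorkerThread` annotation on the real method documents that contract but does not enforce it at runtime.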
I am writing tests using Espresso. My app launches the phone's camera intent, where I press the shutter button manually, and then it moves to the next screen. I am not able to automate that button press in test code. How can I access the camera from code so that I can do the same?
Thanks.
You should not open the camera intent, or you'll have no way of getting any resulting image back from it (without pressing the shutter button manually).
Have a look at the Stubbing out the Camera section of this website:
https://guides.codepath.com/android/UI-Testing-with-Espresso#stubbing-out-the-camera
This way you test your activity by simulating an actual image "returned" to your app from the Camera.
Update
And this is the method I use to get a bitmap to test:
public static Bitmap getTestBitmap(Context context, String resourceName) {
Resources resources = context.getResources();
Bitmap ret = null;
int imageResource = resources.getIdentifier(
resourceName, "drawable", context.getPackageName());
Uri pictureUri = Uri.parse(ContentResolver.SCHEME_ANDROID_RESOURCE + "://"
+ resources.getResourcePackageName(imageResource) + '/'
+ resources.getResourceTypeName(imageResource) + '/'
+ resources.getResourceEntryName(imageResource));
try {
ret = MediaStore.Images.Media.getBitmap(context.getContentResolver(), pictureUri);
} catch (Exception e) {
e.printStackTrace();
}
return ret;
}
And then I save the bitmap in internal storage and get the uri:
public static Uri saveToInternalStorage(Context context, Bitmap bitmapImage, String fileName) {
ContextWrapper cw = new ContextWrapper(context);
// path to /data/data/yourapp/app_data/pictures
File directory = cw.getDir("pictures", Context.MODE_PRIVATE);
// Create imageDir
File mypath = new File(directory, fileName);
FileOutputStream fos = null;
try {
fos = new FileOutputStream(mypath);
// Use the compress method on the BitMap object to write image to the OutputStream
bitmapImage.compress(Bitmap.CompressFormat.PNG, 100, fos);
} catch (Exception e) {
e.printStackTrace();
} finally {
if (fos != null) {
try {
fos.close();
} catch (IOException e) {
e.printStackTrace();
}
}
}
return Uri.fromFile(mypath);
}
I know this is late, but it's something I struggled with myself, and I would like to post an answer to help someone else. Here is how you click the camera button from a chooser (after you set it up): you use UIAutomator, as suggested by PunitD in the comments on the original post. This picks up from the point where the test is showing a chooser on the screen.
public static final int waitTimeNativeApi = 6000;
public static void await(int time) {
try {
Thread.sleep(time);
} catch (InterruptedException e) {
Log.e(TAG, "Interrupted while sleeping");
}
}
private void takePhoto() {
boolean usePixels = true;
UiDevice device = UiDevice.getInstance(InstrumentationRegistry.getInstrumentation());
UiObject titleTextUI = device.findObject(new UiSelector()
.className("android.widget.TextView")
.text("Camera")
);
try {
titleTextUI.clickTopLeft();
if (usePixels) {
takePhotoForPixels(device);
} else {
takePhotoForSamsung(device);
}
} catch (UiObjectNotFoundException unofe) {
unofe.printStackTrace();
}
}
private void takePhotoForPixels(UiDevice device) {
// close the app selector to go back to our app so we can carry on with Espresso
await(waitTimeNativeApi);
//TAP on the camera icon
device.click(device.getDisplayWidth() / 2, device.getDisplayHeight() - 100);
await(waitTimeNativeApi);
//Get the OK button
device.click(device.getDisplayWidth() / 2, device.getDisplayHeight() - 100);
await(waitTimeNativeApi);
}
private void takePhotoForSamsung(UiDevice device) throws UiObjectNotFoundException {
// close the app selector to go back to our app so we can carry on with Espresso
UiObject titleTextUI = device.findObject(new UiSelector()
.className("android.widget.TextView")
.text("Camera")
);
titleTextUI.clickTopLeft();
await(waitTimeNativeApi);
//TAP on the camera icon
device.click(device.getDisplayWidth() / 2, device.getDisplayHeight() - 50);
//Get the OK button
UiObject cameraOkUi = device.findObject(new UiSelector()
.className("android.widget.TextView")
.text("OK")
);
cameraOkUi.click();
await(waitTimeNativeApi);
}
In this way, you will take an actual photo and get the results back in onActivityResult.
I want to start the camera intent within my app to take a picture and save it to internal storage. I'm using the code from the Google Developers page Capturing images or video.
In the processPictureWhenReady method I've implemented the following code to save the picture:
private void processPictureWhenReady(final String picturePath) {
Log.v("path processPictureWhenReady ", " " + picturePath);
final File pictureFile = new File(picturePath);
if (pictureFile.exists()) {
// The picture is ready; process it.
try {
Bitmap imageBitmap = BitmapFactory.decodeFile(picturePath);
int w = imageBitmap.getWidth();
int h = imageBitmap.getHeight();
Bitmap bm2 = Bitmap.createScaledBitmap(imageBitmap, w / 2,
h / 2, true);
imageBitmap = bm2.copy(bm2.getConfig(), true);
MediaStore.Images.Media.insertImage(getContentResolver(),
imageBitmap, "test", "Test");
} catch (Exception e) {
Log.e("Exc", e.getMessage());
}
}
}
The camera intent starts, and then I "tap to accept" to take a picture. But then nothing happens. I have a log message in my onActivityResult method and noticed that the method is not being called.
This is a known issue. I have the same problem. I'm following the case here in the meantime
I've seen people try implementing the preview mode with SurfaceView (I haven't personally gotten it to work but it's worth a shot). Also check here for a similar problem.
I used this method and it worked very well for me:
private void processPictureWhenReady(final String picturePath) {
final File pictureFile = new File(picturePath);
if (pictureFile.exists()) {
// The picture is ready; process it here.
} else {
final File parentDirectory = pictureFile.getParentFile();
FileObserver observer = new FileObserver(parentDirectory.getPath()) {
private boolean isFileWritten;
@Override
public void onEvent(int event, String path) {
if (!isFileWritten) {
// For safety, make sure that the file that was created in
// the directory is actually the one that we're expecting.
File affectedFile = new File(parentDirectory, path);
isFileWritten = (event == FileObserver.CLOSE_WRITE && affectedFile.equals(pictureFile));
if (isFileWritten) {
stopWatching();
// Now that the file is ready, recursively call
// processPictureWhenReady again (on the UI thread).
runOnUiThread(new Runnable() {
@Override
public void run() {
processPictureWhenReady(picturePath);
}
});
}
}
}
};
observer.startWatching();
}
}
I'm new to Android. I'm implementing an image capture function using:
Intent intent = new Intent(android.provider.MediaStore.ACTION_IMAGE_CAPTURE );
The problem is that after I capture my photo, the newly captured photo does not show up on the image display page.
Does anyone know of code that can help me refresh the gallery, or any step I need to take, so that my image display page is updated once I capture a new photo?
Any help would be very much appreciated. Many thanks in advance.
Updated answer:
I use this, may be this can help others:
mScanner = new MediaScannerConnection(
Camera.this,
new MediaScannerConnection.MediaScannerConnectionClient() {
public void onMediaScannerConnected() {
mScanner.scanFile(outputFileUri.getPath(), null /* mimeType */);
}
public void onScanCompleted(String path, Uri uri) {
//we can use the uri, to get the newly added image, but it will return path to full sized image
//e.g. content://media/external/images/media/7
//we can also update this path by replacing media by thumbnail to get the thumbnail
//because thumbnail path would be like content://media/external/images/thumbnail/7
//But the thumbnail is created after some delay by Android OS
//So you may not get the thumbnail. This is why I started new UI thread
//and it'll only run after the current thread completed.
if (path.equals(outputFileUri.getPath())) {
mScanner.disconnect();
//we need to post to the UI thread because we can't update our main thread from here
//Both will run one after the other; see the Android documentation
Camera.this
.runOnUiThread(new Runnable() {
public void run() {
}
});
}
}
});
mScanner.connect();
Sally, did you mean that after you take a photo you don't see it in the gallery or file manager when you look in the directory you know the file to be in?
If so, you need to run the media-scanner like this:
sendBroadcast(new Intent(Intent.ACTION_MEDIA_SCANNER_SCAN_FILE, uri));
... where uri is the Uri of the photo, since you already know it, although you can use the Uri of the directory instead if that's easier (though it is slower, potentially very slow if the directory contains many files or nested directories).
You should take the photo using the code below:
Calendar cal = Calendar.getInstance();
File file = new File(Environment.getExternalStorageDirectory(),(cal.getTimeInMillis()+".jpg"));
if(!file.exists()){
try {
file.createNewFile();
} catch (IOException e) {
e.printStackTrace();
}
}
selectedImageUri = Uri.fromFile(file);
Intent i = new Intent(android.provider.MediaStore.ACTION_IMAGE_CAPTURE);
i.putExtra(MediaStore.EXTRA_OUTPUT, selectedImageUri);
startActivityForResult(i, CAMERA_RESULT);
And in onActivityResult you can use this code:
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
switch (requestCode) {
case CAMERA_RESULT:
if (resultCode == RESULT_OK) {
try {
Bitmap bitmap = MediaStore.Images.Media.getBitmap(getContext().getContentResolver(), selectedImageUri);
imageView.setImageBitmap(bitmap);
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
}
break;
}
}
You can refresh your activity with :
Intent myIntent = getIntent();
finish();
startActivity(myIntent);
I need to get a frame from a video file (it may be on the sdcard, in the cache dir, or in the app dir). I use the MediaMetadataRetriever class from the android.media package. To get the first frame into a bitmap, I use this code:
public static Bitmap getVideoFrame(Context context, Uri videoUri) {
MediaMetadataRetriever retriever = new MediaMetadataRetriever();
try {
retriever.setMode(MediaMetadataRetriever.MODE_CAPTURE_FRAME_ONLY);
retriever.setDataSource(context, videoUri);
return retriever.captureFrame();
} catch (IllegalArgumentException ex) {
throw new RuntimeException(ex); // preserve the cause instead of discarding it
} catch (RuntimeException ex) {
throw new RuntimeException(ex);
} finally {
retriever.release();
}
}
But this is not working. It throws an exception (java.lang.RuntimeException: setDataSource failed: status = 0x80000000) when I set the data source. Do you know how to make this code work? Or do you have any similar (simple) solution without using ffmpeg or other external libraries? videoUri is a valid Uri (the media player can play video from that Uri).
The following works for me:
public static Bitmap getVideoFrame(FileDescriptor FD) {
MediaMetadataRetriever retriever = new MediaMetadataRetriever();
try {
retriever.setDataSource(FD);
return retriever.getFrameAtTime();
} catch (IllegalArgumentException ex) {
ex.printStackTrace();
} catch (RuntimeException ex) {
ex.printStackTrace();
} finally {
try {
retriever.release();
} catch (RuntimeException ex) {
// Ignore failures while releasing the retriever.
}
}
return null;
}
It also works if you use a path instead of a file descriptor.
Try this; I've used it and it's working:
public static Bitmap getVideoFrame(Context context, Uri uri) {
MediaMetadataRetriever retriever = new MediaMetadataRetriever();
try {
retriever.setDataSource(uri.toString(),new HashMap<String, String>());
return retriever.getFrameAtTime();
} catch (IllegalArgumentException ex) {
ex.printStackTrace();
} catch (RuntimeException ex) {
ex.printStackTrace();
} finally {
try {
retriever.release();
} catch (RuntimeException ex) {
// Ignore failures while releasing the retriever.
}
}
return null;
}
In place of the Uri, you can directly pass your URL.
I used this code and it works for me; you can try it:
if (Build.VERSION.SDK_INT >= 14) {
ffmpegMetaDataRetriever.setDataSource(
videoFile.getAbsolutePath(),
new HashMap<String, String>());
} else {
ffmpegMetaDataRetriever.setDataSource(videoFile
.getAbsolutePath());
}
I have the same mistake in my application. I saw on this site that:
this is an unofficial way of doing it and it will only work in Cupcake (and maybe later versions). The Android team does not guarantee that libmedia_jni.so, which the Java file uses, will be included or have the same interface in future versions.
http://osdir.com/ml/AndroidDevelopers/2009-06/msg02442.html
I have updated my phone to Gingerbread and it doesn't work anymore.
Uris are not very specific. Sometimes they refer to something in a bundle. They often need to be translated into an absolute-path form. The other place where you used the Uri was probably smart enough to check what kind of Uri it was; the case you have shown appears not to be looking very hard.
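A small sketch of that translation step (hypothetical helper, not the poster's code): a file:// Uri carries an absolute path directly, while a content:// Uri does not and has to be resolved through a ContentResolver query, which is Android-only and not shown here.

```java
import java.net.URI;

// Hedged sketch: only file:// Uris can be reduced to an absolute path
// without help. For content:// Uris this returns null, signalling that a
// ContentResolver lookup would be needed instead.
public class UriPaths {
    public static String toPathIfFileUri(String uriString) {
        URI uri = URI.create(uriString);
        return "file".equals(uri.getScheme()) ? uri.getPath() : null;
    }
}
```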
I was getting the same error using the ThumbnailUtils class http://developer.android.com/reference/android/media/ThumbnailUtils.html
It uses MediaMetadataRetriever under the hood and most of the time you can send it a filepath using this method with no problem:
public static Bitmap createVideoThumbnail (String filePath, int kind)
However, on Android 4.0.4, I kept getting the same error @gabi was seeing. Using a file descriptor instead solved the problem, and it still works on non-4.0.4 devices. I actually ended up subclassing ThumbnailUtils. Here is my subclass method:
public static Bitmap createVideoThumbnail(FileDescriptor fDescriptor, int kind)
{
Bitmap bitmap = null;
MediaMetadataRetriever retriever = new MediaMetadataRetriever();
try {
retriever.setDataSource(fDescriptor);
bitmap = retriever.getFrameAtTime(-1);
}
catch (IllegalArgumentException ex) {
// Assume this is a corrupt video file
Log.e(LOG_TAG, "Failed to create video thumbnail for file description: " + fDescriptor.toString());
}
catch (RuntimeException ex) {
// Assume this is a corrupt video file.
Log.e(LOG_TAG, "Failed to create video thumbnail for file description: " + fDescriptor.toString());
} finally {
try {
retriever.release();
} catch (RuntimeException ex) {
// Ignore failures while cleaning up.
}
}
if (bitmap == null) return null;
if (kind == Images.Thumbnails.MINI_KIND) {
// Scale down the bitmap if it's too large.
int width = bitmap.getWidth();
int height = bitmap.getHeight();
int max = Math.max(width, height);
if (max > 512) {
float scale = 512f / max;
int w = Math.round(scale * width);
int h = Math.round(scale * height);
bitmap = Bitmap.createScaledBitmap(bitmap, w, h, true);
}
} else if (kind == Images.Thumbnails.MICRO_KIND) {
bitmap = extractThumbnail(bitmap,
TARGET_SIZE_MICRO_THUMBNAIL,
TARGET_SIZE_MICRO_THUMBNAIL,
OPTIONS_RECYCLE_INPUT);
}
return bitmap;
}
The exception is also thrown when the file doesn't exist. So before calling setDataSource(), you'd better check that new File(path).exists().
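That guard can be as small as this (a sketch; the class and method names are made up):

```java
import java.io.File;
import java.io.FileNotFoundException;

// Sketch: fail fast with a clear message instead of letting
// MediaMetadataRetriever.setDataSource() throw the opaque
// "status = 0x80000000" RuntimeException when the file is missing.
public class SourceGuard {
    public static void requireExists(String path) throws FileNotFoundException {
        if (!new File(path).exists()) {
            throw new FileNotFoundException("video not found: " + path);
        }
    }
}
```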
So is there a specific way to get the frame from a video such as:
File sdcard = Environment.getExternalStorageDirectory();
File file = new File(sdcard, "myvideo.mp4");
I've got an image selector/cropper with code taken from this site.
They create the image in the phone's external storage, but I want to store it in my app's internal storage, a process documented here.
This is what my function to retrieve the temp file looks like. However, when I try to use the file returned from this function, the image does not change. In fact, looking at logcat, it seems resolveUri failed on bad bitmap uri for the file I generated. The error occurs when I try to set the image URI, leading me to believe the file was not saved properly. This is odd to me, considering the original code from the site just creates a file on the SD card, and that works fine for reading and writing. So I wonder where the problem arises.
@Override
public void onActivityResult(int requestCode, int resultCode, Intent data) {
switch (requestCode) {
case PHOTO_PICKED:
if (resultCode == RESULT_OK) {
if (data != null) {
Bundle extras = data.getExtras();
if (extras != null) {
ImageView callerImage = (ImageView)findViewById(R.id.contact_image);
callerImage.setImageURI(Uri.fromFile(getTempFile()));
}
}
}
break;
}
}
private File getTempFile() {
try {
FileOutputStream fos = openFileOutput(TEMP_PHOTO_FILE, Context.MODE_WORLD_WRITEABLE);
fos.close();
File f = getFileStreamPath(TEMP_PHOTO_FILE);
return f;
} catch (FileNotFoundException e) {
// To be logged later
return null;
} catch (IOException e) {
// To be logged later
return null;
}
}
Never mind, I am so silly. Each time I called getTempFile(), it recreated (and truncated) the file, which was the mistake. It should only create the file on the initial call and simply open it the rest of the time.
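A sketch of that fix, using plain java.io names rather than the Context helpers from the question: File.createNewFile() creates the file only when it is missing and, unlike opening a FileOutputStream, never truncates an existing one.

```java
import java.io.File;
import java.io.IOException;

// Sketch of the fix: create the temp file only on the first call; later
// calls return the existing file untouched instead of truncating it.
public class TempFiles {
    public static File getOrCreateTempFile(File dir, String name) throws IOException {
        File f = new File(dir, name);
        f.createNewFile(); // no-op if the file already exists
        return f;
    }
}
```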