I need to get a frame from a video file (it may be on the SD card, in the cache dir, or in the app dir). I have the android.media package in my application, and inside it the class MediaMetadataRetriever. To get the first frame into a bitmap, I use this code:
public static Bitmap getVideoFrame(Context context, Uri videoUri) {
    MediaMetadataRetriever retriever = new MediaMetadataRetriever();
    try {
        retriever.setMode(MediaMetadataRetriever.MODE_CAPTURE_FRAME_ONLY);
        retriever.setDataSource(context, videoUri);
        return retriever.captureFrame();
    } catch (IllegalArgumentException ex) {
        throw new RuntimeException(ex);
    } catch (RuntimeException ex) {
        throw new RuntimeException(ex);
    } finally {
        retriever.release();
    }
}
But this isn't working: it throws an exception (java.lang.RuntimeException: setDataSource failed: status = 0x80000000) when I set the data source. Do you know how to make this code work? Or do you have any similar (simple) solution that doesn't use ffmpeg or other external libraries? videoUri is a valid Uri (the media player can play the video from that Uri).
The following works for me:
public static Bitmap getVideoFrame(FileDescriptor fd) {
    MediaMetadataRetriever retriever = new MediaMetadataRetriever();
    try {
        retriever.setDataSource(fd);
        return retriever.getFrameAtTime();
    } catch (IllegalArgumentException ex) {
        ex.printStackTrace();
    } catch (RuntimeException ex) {
        ex.printStackTrace();
    } finally {
        try {
            retriever.release();
        } catch (RuntimeException ex) {
            // ignore failures while releasing
        }
    }
    return null;
}
It also works if you use a path instead of a file descriptor.
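For example, a path-based variant of the same helper might look like this (just a sketch following the same pattern; the path is assumed to be an absolute path to the video file):

public static Bitmap getVideoFrame(String path) {
    MediaMetadataRetriever retriever = new MediaMetadataRetriever();
    try {
        // setDataSource(String) accepts a local file path
        retriever.setDataSource(path);
        return retriever.getFrameAtTime();
    } catch (IllegalArgumentException ex) {
        ex.printStackTrace();
    } catch (RuntimeException ex) {
        ex.printStackTrace();
    } finally {
        try {
            retriever.release();
        } catch (RuntimeException ex) {
            // ignore failures while releasing
        }
    }
    return null;
}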
Try this; I've used it and it works:
public static Bitmap getVideoFrame(Context context, Uri uri) {
    MediaMetadataRetriever retriever = new MediaMetadataRetriever();
    try {
        retriever.setDataSource(uri.toString(), new HashMap<String, String>());
        return retriever.getFrameAtTime();
    } catch (IllegalArgumentException ex) {
        ex.printStackTrace();
    } catch (RuntimeException ex) {
        ex.printStackTrace();
    } finally {
        try {
            retriever.release();
        } catch (RuntimeException ex) {
        }
    }
    return null;
}
In place of the uri you can pass your URL directly.
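For example (the URL below is only a placeholder, not a real video):

// Illustrative call only; whether a remote URL works depends on the device and API level
Bitmap frame = getVideoFrame(context, Uri.parse("https://example.com/sample.mp4"));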
I used this code and it works for me; you can try it:
if (Build.VERSION.SDK_INT >= 14) {
    ffmpegMetaDataRetriever.setDataSource(
            videoFile.getAbsolutePath(),
            new HashMap<String, String>());
} else {
    ffmpegMetaDataRetriever.setDataSource(videoFile.getAbsolutePath());
}
I got the same error in my application. I saw on this site that "this is an unofficial way of doing it and it will only work in Cupcake (and maybe later versions). The Android team does not guarantee that libmedia_jni.so, which the Java file uses, will be included or have the same interface in future versions."
http://osdir.com/ml/AndroidDevelopers/2009-06/msg02442.html
I have updated my phone to Gingerbread and it doesn't work anymore.
Uris are not very specific; sometimes they refer to something inside a bundle, and they often need to be translated to an absolute path. The other place where you used the Uri was probably smart enough to check what kind of Uri it was, whereas the code shown here doesn't appear to look very hard.
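As a rough illustration (not part of the original answer), a content:// Uri can often be resolved to an absolute path via the MediaStore DATA column before handing it to MediaMetadataRetriever; the helper below is only a sketch and assumes the Uri actually maps to a MediaStore row:

// Sketch: resolve a content:// Uri to an absolute file path via MediaStore.
// Requires android.content.Context, android.database.Cursor,
// android.net.Uri and android.provider.MediaStore.
private static String resolveToFilePath(Context context, Uri uri) {
    if ("content".equals(uri.getScheme())) {
        Cursor cursor = context.getContentResolver().query(uri,
                new String[] { MediaStore.Video.Media.DATA }, null, null, null);
        if (cursor != null) {
            try {
                if (cursor.moveToFirst()) {
                    return cursor.getString(0);
                }
            } finally {
                cursor.close();
            }
        }
        return null;
    }
    // file:// Uris (and plain paths) can be used directly
    return uri.getPath();
}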
I was getting the same error using the ThumbnailUtils class http://developer.android.com/reference/android/media/ThumbnailUtils.html
It uses MediaMetadataRetriever under the hood and most of the time you can send it a filepath using this method with no problem:
public static Bitmap createVideoThumbnail (String filePath, int kind)
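A typical call looks something like this (the path here is just a hypothetical example):

Bitmap thumb = ThumbnailUtils.createVideoThumbnail(
        "/sdcard/DCIM/Camera/myvideo.mp4",
        MediaStore.Images.Thumbnails.MINI_KIND);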
However, on Android 4.0.4 I kept getting the same error @gabi was seeing. Using a file descriptor instead solved the problem, and it still works for non-4.0.4 devices. I actually ended up subclassing ThumbnailUtils. Here is my subclass method:
public static Bitmap createVideoThumbnail(FileDescriptor fDescriptor, int kind) {
    Bitmap bitmap = null;
    MediaMetadataRetriever retriever = new MediaMetadataRetriever();
    try {
        retriever.setDataSource(fDescriptor);
        bitmap = retriever.getFrameAtTime(-1);
    } catch (IllegalArgumentException ex) {
        // Assume this is a corrupt video file
        Log.e(LOG_TAG, "Failed to create video thumbnail for file descriptor: " + fDescriptor.toString());
    } catch (RuntimeException ex) {
        // Assume this is a corrupt video file.
        Log.e(LOG_TAG, "Failed to create video thumbnail for file descriptor: " + fDescriptor.toString());
    } finally {
        try {
            retriever.release();
        } catch (RuntimeException ex) {
            // Ignore failures while cleaning up.
        }
    }
    if (bitmap == null) return null;
    if (kind == Images.Thumbnails.MINI_KIND) {
        // Scale down the bitmap if it's too large.
        int width = bitmap.getWidth();
        int height = bitmap.getHeight();
        int max = Math.max(width, height);
        if (max > 512) {
            float scale = 512f / max;
            int w = Math.round(scale * width);
            int h = Math.round(scale * height);
            bitmap = Bitmap.createScaledBitmap(bitmap, w, h, true);
        }
    } else if (kind == Images.Thumbnails.MICRO_KIND) {
        bitmap = extractThumbnail(bitmap,
                TARGET_SIZE_MICRO_THUMBNAIL,
                TARGET_SIZE_MICRO_THUMBNAIL,
                OPTIONS_RECYCLE_INPUT);
    }
    return bitmap;
}
The exception is also thrown when the file doesn't exist, so before calling setDataSource() you'd better check that new File(url).exists().
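Something along these lines (just a sketch; the path variable and the early return are placeholders for whatever fits your code):

// Guard against missing files before handing the path to the retriever
File videoFile = new File(path);
if (!videoFile.exists()) {
    return null; // or handle the missing file however suits your app
}
retriever.setDataSource(videoFile.getAbsolutePath());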
So is there a specific way to get a frame from a video obtained like this?
File sdcard = Environment.getExternalStorageDirectory();
File file = new File(sdcard, "myvideo.mp4");
Related
I want to extract frames from a video file stored on the device. Every solution I found uses FFmpegMediaMetadataRetriever or MediaMetadataRetriever, but as I wrote here it's not working for me. Is there any other way to extract frames from a video?
I admit I haven't used this method for a while, but if it still works with the current Android API, it should do the trick.
Please let me know if it still works. If not - I will delete this answer.
public static Bitmap getVideoFrame(Context context, Uri uri, long time) {
    Bitmap bitmap = null;
    MediaMetadataRetriever retriever = new MediaMetadataRetriever();
    try {
        retriever.setDataSource(context, uri);
        bitmap = retriever.getFrameAtTime(time);
    } catch (RuntimeException ex) {
        ex.printStackTrace();
    } finally {
        try {
            retriever.release();
        } catch (RuntimeException ex) {
            ex.printStackTrace();
        }
    }
    return bitmap;
}
Hope this helps.
I have an activity where I open an image picker intent, and when I get the selected Uri I create a copy of the file in the cache directory and then pass the location of the image to Picasso to load it. I am doing this because some apps, like Google Photos, do not allow the actual Uris to be passed to different activities for security reasons.
Here is my code:
@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_external_photo_share);
    ButterKnife.bind(this);
    LogUtil.i(TAG, "onCreate called");
    tinyDB = new TinyDB(this);
    if (tinyDB.getBoolean(AppConstants.LOGIN_STATE, false)) {
        imageUri = (Uri) getIntent().getExtras().get(Intent.EXTRA_STREAM);
        if (imageUri == null || imageUri.toString().isEmpty()) {
            ExternalPhotoShareActivity.this.finish();
        }
        String newActualPath = getActualPathFromUri(imageUri);
        Uri newUri = null;
        if (newActualPath != null) {
            newUri = Uri.fromFile(new File(newActualPath));
        }
        Intent intent = new Intent(ExternalPhotoShareActivity.this, AddCaptionActivity.class);
        intent.putExtra("externalImageUri", newUri);
        intent.putExtra("externalImagePath", newActualPath);
        startActivity(intent);
        ExternalPhotoShareActivity.this.finish();
    } else {
        final Dialog dialog = new Dialog(ExternalPhotoShareActivity.this);
        dialog.requestWindowFeature(Window.FEATURE_NO_TITLE);
        dialog.setContentView(R.layout.dialog_external_share_error);
        dialog.setCanceledOnTouchOutside(false);
        dialog.show();
        TextView goBack = (TextView) dialog.findViewById(R.id.textView86);
        goBack.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View v) {
                try {
                    if (dialog != null) {
                        if (dialog.isShowing()) {
                            dialog.dismiss();
                        }
                    }
                } catch (final IllegalArgumentException e) {
                    // Handle or log or ignore
                } catch (final Exception e) {
                    // Handle or log or ignore
                }
                ExternalPhotoShareActivity.this.finish();
            }
        });
    }
}
private String getActualPathFromUri(Uri selectedImageUri) {
    Bitmap bitmap = null;
    try {
        bitmap = getBitmapFromUri(selectedImageUri);
    } catch (IOException e) {
        e.printStackTrace();
    }
    if (bitmap == null) {
        return null;
    }
    File imageFileFolder = new File(getCacheDir(), "galleri5");
    if (!imageFileFolder.exists()) {
        imageFileFolder.mkdir();
    }
    FileOutputStream out = null;
    File imageFileName = new File(imageFileFolder, "galleri5-" + System.currentTimeMillis() + ".jpg");
    try {
        out = new FileOutputStream(imageFileName);
        bitmap.compress(Bitmap.CompressFormat.JPEG, 100, out);
        out.flush();
        out.close();
    } catch (IOException e) {
        // Ignore write failures; the stream is closed again in finally
    } finally {
        if (out != null) {
            try {
                out.close();
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }
    return imageFileName.getAbsolutePath();
}
private Bitmap getBitmapFromUri(Uri uri) throws IOException {
    ParcelFileDescriptor parcelFileDescriptor =
            getContentResolver().openFileDescriptor(uri, "r");
    assert parcelFileDescriptor != null;
    FileDescriptor fileDescriptor = parcelFileDescriptor.getFileDescriptor();
    Bitmap image = BitmapFactory.decodeFileDescriptor(fileDescriptor);
    parcelFileDescriptor.close();
    return image;
}
I call getActualPathFromUri() in my activity's onCreate() method. Sometimes, when the images from the gallery are large, it takes a few seconds to load the image on the screen. So I thought about executing these two methods on a background thread so that I could show the UI while the background work is done.
I recently started using Android Support Annotations and tried to annotate getActualPathFromUri() with @WorkerThread. But in my onCreate() method it marks the call in red and says that this method should be called from a worker thread, since the currently inferred thread is main.
What is the proper way of doing this? Should I even do it on a background thread or not? Thanks.
The annotations @WorkerThread and @UiThread are only used for flagging a method. Android Studio will raise an error when your method is called from a thread that does not match your annotated constraint. See this documentation.
For advice on threading with AsyncTask see this android developers blog post.
In this case, with the @WorkerThread annotation you're saying that getActualPathFromUri() should only be called from a worker thread. As you're calling it from onCreate(), which runs on the UI thread, lint detects the issue and flags it.
Annotating the method like that does not mean it runs on a worker thread. The annotation is just a way to flag to the developer (who might be someone other than yourself when working in a team) that a particular method is meant to be called on a particular thread.
If you want to actually run that method in an actual worker thread, just call it inside an AsyncTask#doInBackground method.
EDIT: if you don't want to use AsyncTask you can use any other Android threading mechanism, like IntentService, GcmTaskService and so on, but you should not run it on the UI thread because it may cause jank.
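For example, going back to the AsyncTask suggestion, a minimal sketch might look like this (not from the original code; startAddCaption() stands in for whatever you do with the path afterwards):

new AsyncTask<Uri, Void, String>() {
    @Override
    protected String doInBackground(Uri... uris) {
        // Runs on a worker thread, so the @WorkerThread constraint is satisfied here
        return getActualPathFromUri(uris[0]);
    }

    @Override
    protected void onPostExecute(String actualPath) {
        // Back on the UI thread: safe to start the next activity or touch views
        startAddCaption(actualPath); // hypothetical follow-up method
    }
}.execute(imageUri);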
In Android, when you get the Uri from the gallery, its value will start with content://blahblahblah.blahblah.format, but if you get the Uri from your phone's camera, it will start with file:///.
Below is what I want to do:
private Bitmap uriToBitmap(Uri uri, int maxSize) throws FileNotFoundException {
    InputStream imageStream = null;
    try {
        imageStream = getContentResolver().openInputStream(uri);
    } catch (FileNotFoundException e) {
        e.printStackTrace();
    }
    Bitmap claimBitmap = BitmapFactory.decodeStream(imageStream);
    return claimBitmap;
}
In this method I would like to pass a file-type Uri and use the getContentResolver() function, but unfortunately claimBitmap is null. Does that mean getContentResolver() doesn't accept file-type Uris? Please help.
No, the file scheme is supported by ContentResolver. If the Uri's scheme is not supported, this method throws a FileNotFoundException. Please look at the source of openInputStream():
public final InputStream openInputStream(Uri uri)
        throws FileNotFoundException {
    String scheme = uri.getScheme();
    if (SCHEME_ANDROID_RESOURCE.equals(scheme)) {
        // Note: left here to avoid breaking compatibility. May be removed
        // with sufficient testing.
        OpenResourceIdResult r = getResourceId(uri);
        try {
            InputStream stream = r.r.openRawResource(r.id);
            return stream;
        } catch (Resources.NotFoundException ex) {
            throw new FileNotFoundException("Resource does not exist: " + uri);
        }
    } else if (SCHEME_FILE.equals(scheme)) {
        // Note: left here to avoid breaking compatibility. May be removed
        // with sufficient testing.
        return new FileInputStream(uri.getPath());
    } else {
        AssetFileDescriptor fd = openAssetFileDescriptor(uri, "r", null);
        try {
            return fd != null ? fd.createInputStream() : null;
        } catch (IOException e) {
            throw new FileNotFoundException("Unable to create stream");
        }
    }
}
This is the source code of ContentResolver; I hope it helps.
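For example, a file:// Uri built with Uri.fromFile() takes the SCHEME_FILE branch above (illustrative only; the path is hypothetical):

try {
    Uri fileUri = Uri.fromFile(new File("/sdcard/Pictures/photo.jpg"));
    InputStream in = getContentResolver().openInputStream(fileUri);
    if (in != null) {
        Bitmap bmp = BitmapFactory.decodeStream(in);
        in.close();
    }
} catch (IOException e) {
    e.printStackTrace();
}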
Follow this URL; it will help you. If you stored it on the local drive:
String filePath = null;
Uri _uri = data.getData();
Log.d("", "URI = " + _uri);
if (_uri != null && "content".equals(_uri.getScheme())) {
    Cursor cursor = this.getContentResolver().query(_uri,
            new String[] { android.provider.MediaStore.Images.ImageColumns.DATA },
            null, null, null);
    cursor.moveToFirst();
    filePath = cursor.getString(0);
    cursor.close();
} else {
    filePath = _uri.getPath();
}
I am developing a module in which I want to show all of the user's videos from the SD card in a GridView. I have grabbed the video file paths in the usual way (checking whether each entry is a file or a directory and saving it if it's a file) into an ArrayList, and grabbed each bitmap thumbnail with the following code:
Bitmap bmThumbnail = ThumbnailUtils.createVideoThumbnail(VideoValues.get(position).getAbsolutePath(),
Thumbnails.MINI_KIND);
Obviously this code runs in a background thread, but the GridView still freezes a lot while scrolling. As far as I can tell the main problem is extracting the bitmap from the video, which takes a lot of time. Can anyone suggest a different way to get a bitmap from a video and show it in a grid? I have seen smooth behavior in other apps like Facebook, etc., but I cannot figure out how that is done.
Please use the method below to retrieve a video thumbnail from a video:
@SuppressLint("NewApi")
public static Bitmap retriveVideoFrameFromVideo(String videoPath)
        throws Throwable {
    Bitmap bitmap = null;
    MediaMetadataRetriever mediaMetadataRetriever = null;
    try {
        mediaMetadataRetriever = new MediaMetadataRetriever();
        mediaMetadataRetriever.setDataSource(videoPath);
        bitmap = mediaMetadataRetriever.getFrameAtTime();
    } catch (Exception e) {
        throw new Throwable(
                "Exception in retriveVideoFrameFromVideo(String videoPath)"
                        + e.getMessage());
    } finally {
        if (mediaMetadataRetriever != null) {
            mediaMetadataRetriever.release();
        }
    }
    return bitmap;
}
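A possible usage sketch (names like adapter.setThumbnail() are illustrative, not from the answer): call it off the UI thread so the grid doesn't freeze, then post the result back to the main thread:

// Illustrative only: extract the frame on a background thread, then hand the
// bitmap back to the main thread. Assumes this runs inside an Activity and
// that VideoValues, position and adapter exist in your code.
new Thread(new Runnable() {
    @Override
    public void run() {
        try {
            final Bitmap frame = retriveVideoFrameFromVideo(
                    VideoValues.get(position).getAbsolutePath());
            runOnUiThread(new Runnable() {
                @Override
                public void run() {
                    adapter.setThumbnail(position, frame); // hypothetical adapter method
                }
            });
        } catch (Throwable t) {
            t.printStackTrace();
        }
    }
}).start();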
I already got it to create a thumbnail from my video.
The code looks like this:
videoGalleryThumbnails.add(ThumbnailUtils.extractThumbnail(ThumbnailUtils.createVideoThumbnail(
videoFile.getAbsolutePath(), MediaStore.Images.Thumbnails.MINI_KIND), 500, 200));
But the thumbnail is created at a really bad time: exactly when the video is black, and I have no use for a completely black thumbnail.
How can I take a thumbnail of my video at a specific time, e.g. at 00:31 or at 01:44?
Or is that not possible?
I also tried to use MediaMetadataRetriever, but I only get a white image. The code looks like this:
File tempVideoList[] = (Environment.getExternalStoragePublicDirectory(PATH_VIDEO_GALLERY))
        .listFiles();
MediaMetadataRetriever retriever = new MediaMetadataRetriever();
Bitmap myBitmap = null;
for (File videoFile : tempVideoList) {
    if (videoFile.isFile()) {
        // from here
        try {
            retriever.setDataSource(videoFile.getAbsolutePath());
            myBitmap = retriever.getFrameAtTime(11); // at 11th second
        } catch (Exception ex) {
            Log.i("MyDebugCode", "MediaMetadataRetriever got exception:" + ex);
        }
        videoGalleryThumbnails.add(myBitmap);
        // to here
    }
}
If I replace the code between the "from here" and "to here" markers with the first code block at the top, it works.
I also tried MediaMetadataRetriever.OPTION_CLOSEST and OPTION_CLOSEST_SYNC and OPTION_NEXT_SYNC.
OK, I got it.
MediaMetadataRetriever was absolutely the right way to go. The problem is that it counts the time in microseconds, not seconds. The solution looks like this:
try {
    retriever.setDataSource(videoFile.getAbsolutePath());
    int timeInSeconds = 30;
    myBitmap = retriever.getFrameAtTime(timeInSeconds * 1000000,
            MediaMetadataRetriever.OPTION_CLOSEST_SYNC);
} catch (Exception ex) {
    Log.i("MyDebugCode", "MediaMetadataRetriever got exception:" + ex);
}
videoGalleryThumbnails.add(myBitmap);
I don't know whether OPTION_CLOSEST_SYNC is actually needed, but it seems like the better way to do it.
Thanks to William Riley for pointing me in the right direction.
This code needs a few small changes:
try {
    MediaMetadataRetriever retriever = new MediaMetadataRetriever();
    // have to check the version for setDataSource
    if (Build.VERSION.SDK_INT >= 14)
        retriever.setDataSource(video_path, new HashMap<String, String>());
    else
        retriever.setDataSource(video_path);
    int timeInSeconds = 5;
    Bitmap thumb = retriever.getFrameAtTime(timeInSeconds * 1000000,
            MediaMetadataRetriever.OPTION_CLOSEST_SYNC);
    imageViewThumb.setImageBitmap(thumb);
} catch (Exception ex) {
    ex.printStackTrace();
}
If we don't check the version for setDataSource() we will get exceptions; for me it was not working until I added the version check.