After compressing the video, its quality gets dull in Android

I have implemented video compression using ffmpeg in Android, and I am having a problem with it.
I captured a video of exactly one minute, which is about 123 MB on my Nexus 5. I compressed that video from 123 MB down to roughly 1.30 MB, which took about 2 minutes and completed successfully.
The problem is that when I play the compressed video from my SD card, its quality is very poor. Below is my code using ffmpeg:
String[] complexCommand = {"ffmpeg", "-i", videoPath, "-strict","experimental","-s", "160x120","-r","25", "-vcodec", "mpeg4", "-b", "150k", "-ab","48000", "-ac", "2", "-ar", "22050", demoVideoFolder + "Compressed_Video.mp4"};
LoadJNI vk = new LoadJNI();
try {
    vk.run(complexCommand, workFolder, getApplicationContext(), false);
    GeneralUtils.copyFileToFolder(vkLogPath, demoVideoFolder);
} catch (Throwable e) {
    Log.e(Prefs.TAG, "vk run exception.", e);
} finally {
    if (wakeLock.isHeld()) {
        wakeLock.release();
    } else {
        Log.i(Prefs.TAG, "Wake lock is already released, doing nothing");
    }
}
Log.i(Prefs.TAG, "doInBackground finished");
Here videoPath is my input file path and demoVideoFolder is my output folder. I have attached a snapshot; please have a look at it.
Please tell me what I should do. Thanks in advance; your efforts will be highly appreciated.

"Dull" is very subjective, so I don't really know what to make of that. If there are specific artifacts you want to discuss, please post screenshots. I can make some general comments on your command line that may or may not be helpful:
-s 160x120 - are we back in 1995? This is what we used to refer to when we said "stamp-sized video" in the mid-90s. In case you didn't know, this resizes the video to a resolution of 160x120, which destroys quality.
-r 25 - you're dropping and adding frames at random here. You most likely want to use the fps filter, or remove this option altogether.
-vcodec mpeg4 - people use H.264 nowadays (-vcodec libx264), if not HEVC/VP9 (-vcodec libx265/libvpx-vp9).
-b 150k - this is a very low bitrate. If you don't like the video quality, please increase the bitrate.
-strict experimental - don't use this unless you know what you're doing.
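Putting those comments together, a revised command might look like the sketch below. This is an assumption-laden example, not the one true command: it assumes your ffmpeg build includes libx264 and an AAC encoder, and the resolution, CRF value, and file names are illustrative placeholders to tune.

```java
// Sketch of a revised compression command (hypothetical values).
// Assumes an ffmpeg build with libx264; scale/crf are placeholders to tune.
public class CompressCommand {
    public static String[] build(String inputPath, String outputPath) {
        return new String[] {
            "ffmpeg", "-i", inputPath,
            "-vf", "scale=-2:480",       // keep aspect ratio, 480p height
            "-vcodec", "libx264",        // H.264 instead of legacy MPEG-4
            "-crf", "23",                // constant-quality mode; lower = better
            "-preset", "medium",
            "-acodec", "aac", "-b:a", "96k",
            outputPath
        };
    }

    public static void main(String[] args) {
        System.out.println(String.join(" ", build("in.mp4", "out.mp4")));
    }
}
```

Using CRF instead of a fixed 150k bitrate lets the encoder spend bits where the picture needs them, which is usually what you want when "quality is dull" is the complaint.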


FFmpeg adding image watermark to video process is very slow

I am adding an image watermark to a video with the help of FFmpeg, but FFmpeg takes an inordinate amount of time with the below command:
String[] cmd = {"-i",videoPath, "-i", waterMark.toString(),"-filter_complex","overlay=5:5","-codec:a", "copy", outputPath};
So I tried another command which was a little bit faster, but it increases the output file size (which I do not want):
String[] cmd = {"-y","-i", videoPath, "-i", waterMark.toString(), "-filter_complex", "overlay=5:5", "-c:v","libx264","-preset", "ultrafast", outputPath};
Can someone please explain how to increase the speed of FFmpeg's watermarking without increasing the output size?
Thanks.
You mentioned that a 7 MB video takes between 30 and 60 seconds.
There is always a trade-off when choosing between speed and quality.
I tested on my phone using a 7 MB file and it took 13 seconds; still slow, but we can't expect much better than that.
Ways to increase speed:
Lowering the frame rate, using the -r option
Changing the bitrate, using the -b:v and -b:a options
Changing the Constant Rate Factor, using -crf. The default value is 23.
The range of the quantizer scale is 0-51: where 0 is lossless, 23 is default, and 51 is worst possible. A lower value is a higher quality and a subjectively sane range is 18-28. Consider 18 to be visually lossless or nearly so: it should look the same or nearly the same as the input but it isn't technically lossless.
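As an aside on picking a bitrate: if you are targeting a specific output file size rather than a quality level, the video bitrate can be estimated from the desired size and duration. A minimal sketch (the 8192 factor converts MB to kilobits; the audio bitrate argument is an assumed value you would substitute for your own stream):

```java
// Estimate the video bitrate (kbit/s) needed to hit a target file size.
// Illustrative helper, not part of ffmpeg: total kbit = MB * 8192.
public class BitrateEstimator {
    public static int videoBitrateKbps(int targetSizeMb, int durationSeconds,
                                       int audioBitrateKbps) {
        int totalKbps = (targetSizeMb * 8192) / durationSeconds;
        return totalKbps - audioBitrateKbps; // what's left for video
    }

    public static void main(String[] args) {
        // e.g. a 10 MB target for a 60 s clip with 48 kbit/s audio
        System.out.println(videoBitrateKbps(10, 60, 48));
    }
}
```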
This is what I have found works the best on most android devices:
String[] s = {"-i", VideoPath, "-i", ImagePath, "-filter_complex", "[0:v]pad=iw:if(lte(ih\\,iw)\\,ih\\,2*trunc(iw*16/9/2)):(ow-iw)/2:(oh-ih)/2[v0];[1:v][v0]scale2ref[v1][v0];[v0][v1]overlay=x=(W-w)/2:y=(H-h)/2[v]", "-map", "[v]", "-map", "0:a", "-c:v", "libx264", "-preset", "ultrafast", "-r", myFrameRate, directoryToStore[0] + "/" + SavedVideoName};
I reduced my frame rate slightly; you can experiment with what works best for you. I'm using mp4parser to retrieve the frame rate.
I have to give credit to @Gyan, who provided me with a way to perfectly scale images being placed on top of a video; you can look at the question I asked here.
If you are unsure about the frame rate, you can remove it from the command and first test whether your processing time is reduced.
Try it, if you have any questions, please ask.
OP opted to go with the following command:
String[] cmd = {"-y","-i", videoPath, "-i", waterMark.toString(), "-filter_complex", "overlay=(main_w-overlay_w-10):5", "-map", "0:a","-c:v", "libx264", "-crf", "28","-preset", "ultrafast" ,outputPath};
Edit
Just to add on to the command I mentioned and provide a detailed explanation of how to use it:
String[] cmd = {"-i", videoPath, "-i", waterMark.toString(), "-filter_complex", "[0:v]pad=iw:if(lte(ih\\,iw)\\,ih\\,2*trunc(iw*16/9/2)):(ow-iw)/2:(oh-ih)/2[v0];[1:v][v0]scale2ref[v1][v0];[v0][v1]overlay=x=(W-w)/2:y=(H-h)/2[v]", "-map", "[v]", "-map", "0:a", "-c:v", "libx264", "-preset", "ultrafast", "-r", myFrameRate, outputPath};
This is for devices that have a display aspect ratio of 16:9. If you want this filter to work on all devices, you will have to get the aspect ratio of the device and change the 16/9/2 part of the filter accordingly.
You can get the device aspect ratio by creating these methods:
int gcd(int p, int q) {
    if (q == 0) return p;
    else return gcd(q, p % q);
}

void ratio(int a, int b) {
    final int gcd = gcd(a, b);
    if (a > b) {
        setAspectRatio(a / gcd, b / gcd);
    } else {
        setAspectRatio(b / gcd, a / gcd);
    }
}

void setAspectRatio(int a, int b) {
    System.out.println("aspect ratio = " + a + " " + b);
    // This is the string that will be used in the filter (instead of hardcoding 16/9/2)
    filterAspectRatio = a + "/" + b + "/" + "2";
}
Now you have the correct aspect ratio and you can change the filter accordingly.
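The methods above can also be collapsed into a single self-contained helper; here is a sketch (the filterAspectRatio field is replaced by a return value, and the class/method names are illustrative):

```java
// Self-contained version of the aspect-ratio helper above:
// reduces width/height by their gcd and formats the filter fragment.
public class AspectRatio {
    static int gcd(int p, int q) {
        return q == 0 ? p : gcd(q, p % q);
    }

    // Returns e.g. "16/9/2" for a 1920x1080 display.
    public static String filterFragment(int a, int b) {
        int g = gcd(a, b);
        int hi = Math.max(a, b) / g;
        int lo = Math.min(a, b) / g;
        return hi + "/" + lo + "/2";
    }

    public static void main(String[] args) {
        System.out.println(filterFragment(1920, 1080)); // 16/9/2
    }
}
```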
Next, create a watermark and add it to a view, make that view the size of the device (match_parent) and scale/place the watermark where you would like it to be. You can then get the bitmap by calling:
Bitmap waterMarkBitmap = watermarkView.getDrawingCache();
and create a file from the Bitmap, like this:
String outputFilename = "myCoolWatermark.png"; // provide a name for your saved watermark
File path = Environment.getExternalStorageDirectory(); // this can be changed to where you want to store the bitmap
File waterMark = new File(path, outputFilename);
try (FileOutputStream out = new FileOutputStream(waterMark)) {
    waterMarkBitmap.compress(Bitmap.CompressFormat.PNG, 100, out); // PNG is a lossless format; the compression factor (100) is ignored
} catch (IOException e) {
    e.printStackTrace();
}
The watermark is created and can be reused, or you can delete it when you are done with it.
Now you can call the command mentioned above.
This is a very common question here. The simple answer is that you can't dramatically increase the encoding speed of ffmpeg on Android. You're encoding on a phone, so you can't expect desktop/server performance from a software encoder with no hardware acceleration support.
There are a few things users can do:
Stream copy the audio with -c:a copy (you're already doing that).
Use -preset ultrafast to give up encoding efficiency for encoding speed (you're also already doing that).
Make the output width x height smaller with the scale filter (probably not an acceptable option for you).
Make sure your x264 was not compiled with --disable-asm so you can take advantage of the various ARM and NEON optimizations in x264 for a significant increase in encoding speed. However, I don't know which Android devices support that, but it's something to look into. For a quick check to see if you are using any optimizations refer to the console output from ffmpeg and search for using cpu capabilities. If none! then it is not using any optimizations, otherwise it may say ARMv7 NEON or something like that.
Offload the encoding to a server. Saves your users' battery life too.
All this for an annoying watermark? Avoid re-encoding and use a player to overlay the watermark.
Apparently FFmpeg has MediaCodec decoding support on Android, but encoding is the bottleneck here. However maybe it will save a few fps.
Send a patch to FFmpeg that enables MediaCodec encoding support or wait a few years for someone else to do so.
Forget ffmpeg and use MediaCodec directly. I am clueless about this and too lazy to look it up, but I assume it uses hardware to encode and I'll guess you can use it to make an overlay. Someone correct me if I am wrong.

Reverse video in android

I have recorded a video from the camera in my app and saved it in device storage. Now I want to reverse the video so that it plays backwards, i.e. if the video is 10 seconds long, then the last frame at the 10th second becomes the first frame and it plays from there back to the first frame at the 1st second. I want to save the reversed video to a file. How should I proceed?
If you are prepared to use ffmpeg you can use this approach - it essentially breaks the video into frames and then builds it again in reverse order:
https://stackoverflow.com/a/8137637/334402
There are several ways to use ffmpeg in Android, but the 'wrapper' approach is one which I have found a reasonable blend of performance and ease of use. Some example Android ffmpeg wrappers:
http://hiteshsondhi88.github.io/ffmpeg-android-java/
https://github.com/guardianproject/android-ffmpeg
It's worth being aware that this will be time-consuming on a mobile device - if you have the luxury of being able to upload to a server and do the reversal there, it might be quicker.
Thanks to Mick for giving me the idea to use ffmpeg for reversing video.
I have posted code on GitHub for reversing a video, along with other video editing operations using ffmpeg, and a complete tutorial in my blog post here.
As written in my blog post:
To reverse a video, first we need to divide it into segments with a duration of 10 seconds or less, because the reverse command in ffmpeg will not work for long videos unless your device has 32 GB of RAM.
Hence, to reverse a video:
1. Divide the video into segments with a duration of 10 seconds or less.
2. Reverse the segmented videos.
3. Concatenate the reversed segments in reverse order.
For dividing a video into segments with a duration of 6 seconds we can use the command below:
String[] complexCommand = {"-i", inputFileAbsolutePath, "-c:v", "libx264", "-crf", "22", "-map", "0", "-segment_time", "6", "-g", "9", "-sc_threshold", "0", "-force_key_frames", "expr:gte(t,n_forced*6)", "-f", "segment", outputFileAbsolutePath};
Here,
-c:v libx264
encodes all video streams with libx264
-crf
sets the quality for constant quality mode
-segment_time
sets the duration of each segment
-g
sets the GOP size
-sc_threshold
sets the scene change threshold
-force_key_frames expr:gte(t,n_forced*n)
forces a keyframe every n seconds
After segmenting the video, we need to reverse the segmented videos. For that we run a loop in which each segmented video file is reversed.
To reverse a video with audio (without removing its audio) we can use the command below:
String command[] = {"-i", inputFileAbsolutePath, "-vf", "reverse", "-af", "areverse", outputFileAbsolutePath};
To reverse a video while removing its audio we can use the command below:
String command[] = {"-i", inputFileAbsolutePath, "-an", "-vf", "reverse", outputFileAbsolutePath};
To reverse a video without audio we can use the command below:
String command[] = {"-i", inputFileAbsolutePath, "-vf", "reverse", outputFileAbsolutePath};
After reversing the segmented videos, we need to concatenate them in reverse order. For that we sort the files by last-modified time using Arrays.sort(files, LastModifiedFileComparator.LASTMODIFIED_REVERSE).
Then, to concatenate the reversed segments (with audio) we can use the command below:
String command[] = {"-i", inputFile1AbsolutePath, "-i", inputFile2AbsolutePath, ..., "-i", inputFileNAbsolutePath, "-filter_complex", "[0:v0] [0:a0] [1:v1] [1:a1] ... [N:vN] concat=n=N:v=1:a=1 [v] [a]", "-map", "[v]", "-map", "[a]", outputFileAbsolutePath};
To concatenate the reversed segments (without audio) we can use the command below:
String command[] = {"-i", inputFile1AbsolutePath, "-i", inputFile2AbsolutePath, ..., "-i", inputFileNAbsolutePath, "-filter_complex", "[0:0] [1:0] [2:0] ... [N:0] concat=n=N:v=1:a=0", outputFileAbsolutePath};
Here,
-filter_complex [0:v0] [0:a0] [1:v1] [1:a1] ... [N:vN] tells ffmpeg which streams to send to the concat filter. In the above case, video stream 0 [0:v0] and audio stream 0 [0:a0] from input 0, video stream 1 [1:v1] and audio stream 1 [1:a1] from input 1, and so on.
The concat filter concatenates audio and video streams, joining them one after the other. The filter accepts the following options:
n
Set the number of segments. Default is 2.
v
Set the number of output video streams, which is also the number of video streams in each segment. Default is 1.
a
Set the number of output audio streams, which is also the number of audio streams in each segment. Default is 0.
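Because the number of segments varies, the concatenation command has to be generated at runtime. A sketch of building it in Java (class, method, and output-file names are illustrative, not from the post; the stream labels use the plain [i:v]/[i:a] form):

```java
import java.util.ArrayList;
import java.util.List;

// Builds the ffmpeg concat command for N reversed segments, mirroring the
// pattern above: one -i per input, then a concat filter over all streams.
public class ConcatCommand {
    public static String[] build(List<String> inputs, boolean withAudio) {
        List<String> cmd = new ArrayList<>();
        StringBuilder filter = new StringBuilder();
        for (int i = 0; i < inputs.size(); i++) {
            cmd.add("-i");
            cmd.add(inputs.get(i));
            filter.append("[").append(i).append(":v]");
            if (withAudio) filter.append("[").append(i).append(":a]");
        }
        filter.append("concat=n=").append(inputs.size())
              .append(":v=1:a=").append(withAudio ? 1 : 0);
        if (withAudio) filter.append("[v][a]");
        cmd.add("-filter_complex");
        cmd.add(filter.toString());
        if (withAudio) {
            cmd.add("-map"); cmd.add("[v]");
            cmd.add("-map"); cmd.add("[a]");
        }
        cmd.add("output.mp4"); // placeholder output path
        return cmd.toArray(new String[0]);
    }
}
```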

Xamarin Android Player - Can't play this video

I'm using Android's VideoView to play an embedded video in my app. It works fine on my device but I keep getting a "Can't play this video" message and a black screen in the Xamarin Android Player.
The corresponding error log looks like this:
Unable to play video
[MediaPlayer] Error (1,-38)
[VideoView] Error: 1,-38
I found a few posts regarding this error, but none of them helped me solve the issue, and I'm not able to find a proper description of this status code.
My C# code looks like this:
videoView = new VideoView (Context);
base.SetNativeControl (videoView);
videoView.SetOnErrorListener (new ErrorListener ());
string fileName = e.NewElement.FileSource;
fileName = fileName.ToLower ().Substring (0, fileName.LastIndexOf ("."));
int resourceID = Context.Resources.GetIdentifier (fileName, "raw", Context.PackageName);
var fullPath = String.Format ("android.resource://{0}/{1}", Context.PackageName, resourceID);
videoView.SetVideoPath (fullPath);
videoView.RequestFocus ();
videoView.Start ();
This seems to be an issue with the type of encoding that the emulator supports. Install ffmpeg (if you're on a Mac, by running these commands):
ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"
brew install ffmpeg
then process your video file with:
ffmpeg -i big_buck_bunny_720p_1mb.mp4 -c:v libx264 -profile:v baseline -c:a aac -strict -2 -b:a 128k output.mp4
and try to play the output; it won't show that error, but it will be a blank video (just a black screen). So I think the issue is just getting the right encoding; I have tried some different encodings but all seem to just show a black screen.
I will do some more digging, but for the time being it seems the emulator just does not support your encoding.
EDIT
OK, so I got the video playing; I processed the video with:
ffmpeg -i SampleVideo_1080x720_1mb.mp4 -codec:v libx264 -profile:v baseline -preset slow -b:v 250k -maxrate 250k -bufsize 500k -vf scale=-1:360 -threads 0 -codec:a aac -strict -2 -b:a 96k output.mp4
Check this site for the ffmpeg parameters.
I setup my VideoView like so:
public class Activity1 : Activity
{
    VideoView videoView;

    protected override void OnCreate (Bundle bundle)
    {
        base.OnCreate (bundle);
        // Set our view from the "main" layout resource
        SetContentView (Resource.Layout.Main);
        videoView = FindViewById<VideoView> (Resource.Id.SampleVideoView);
        videoView.SetMediaController (new MediaController (this));
        videoView.SetVideoPath ($"android.resource://{PackageName}/{Resource.Raw.output}");
        videoView.RequestFocus ();
        videoView.Start ();
    }
}
This seems to work on the Xamarin Android Player, but only for API versions 16 (Jelly Bean) and 19 (KitKat); 21 (Lollipop) just doesn't load the video.
Then I downloaded the GenyMotion emulator (you need to create an account, but it's free for personal use) to check whether it was the Xamarin Player or not. It works on all versions (16, 17, 18, 19, 20 and 22) apart from 21 (Lollipop). It looks like something is wrong with the emulators for API 21; I did all my testing on Nexus 4 emulators. So if you want to test video playback, I would avoid emulators with API 21.
Different Android OS versions support different combinations of audio and video encodings within the video container. So it depends what version your Android Player is emulating. For the table see http://developer.android.com/guide/appendix/media-formats.html#core

extract all video frames in Android

I recorded a video for a limited time. Now I want to fetch all frames of the video. I am using the code below, and I am able to get frames, but I am not getting all of them: 3 to 4 frames are repeated, then I get a different frame. As we all know, a video displays 25-30 frames per second to look smooth. How do I get all the frames?
for (int i = 0; i < 30; i++) {
    Bitmap bArray = mediaMetadataRetriever.getFrameAtTime(
            1000000 * i, MediaMetadataRetriever.OPTION_CLOSEST);
    savebitmap(bArray, 33333 * i);
}
I don't want to use the NDK. I found the link below, but I don't know what the value for "argb8888" should be, and I am getting an error there. Can anyone explain how to do it?
Getting frames from Video Image in Android
I faced the same problem before, and Android's MediaMetadataRetriever seems inappropriate for this task since it doesn't have good precision.
I used a library called "FFmpegMediaMetadataRetriever" in Android Studio:
Add this line in your build.gradle under the app module:
compile 'com.github.wseemann:FFmpegMediaMetadataRetriever:1.0.14'
Rebuild your project.
Use the FFmpegMediaMetadataRetriever class to grab frames with higher precision:
FFmpegMediaMetadataRetriever med = new FFmpegMediaMetadataRetriever();
med.setDataSource("your data source");
and in your loop you can grab a frame using:
Bitmap bmp = med.getFrameAtTime(i*1000000, FFmpegMediaMetadataRetriever.OPTION_CLOSEST);
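The timestamps passed to getFrameAtTime are in microseconds, so the loop's step should come from the actual frame rate rather than a hardcoded second. A pure-Java sketch of generating per-frame timestamps (the frame rate and duration are assumed inputs; class and method names are illustrative):

```java
import java.util.ArrayList;
import java.util.List;

// Generates the microsecond timestamp of every frame, given the video's
// frame rate and duration; these are what you'd pass to getFrameAtTime.
public class FrameTimestamps {
    public static List<Long> forVideo(double fps, long durationUs) {
        List<Long> stamps = new ArrayList<>();
        long frameCount = Math.round(fps * durationUs / 1_000_000.0);
        long stepUs = (long) (1_000_000 / fps); // microseconds per frame
        for (long i = 0; i < frameCount; i++) {
            stamps.add(i * stepUs);
        }
        return stamps;
    }
}
```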
To get image frames from a video we can use ffmpeg. For integrating FFmpeg in Android we can use precompiled libraries like ffmpeg-android.
To extract image frames from a video we can use the command below:
String[] complexCommand = {"-y", "-i", inputFileAbsolutePath, "-an", "-r", "1/2", "-ss", "" + startMs / 1000, "-t", "" + (endMs - startMs) / 1000, outputFileAbsolutePath};
Here,
-y
overwrites output files
-i
specifies an input file (ffmpeg reads from an arbitrary number of input "files" given with -i)
-an
disables audio recording
-r
sets the frame rate
-ss
seeks to a position
-t
limits the duration of data read from the input file
Here, in place of inputFileAbsolutePath, you have to specify the absolute path of the video file from which you want to extract images.
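As a sketch of assembling that command in Java (the class and method names, paths, and the 1/2 extraction rate are placeholders; -r 1/2 yields one frame every two seconds):

```java
// Builds the frame-extraction command above from millisecond timestamps.
// The output path should contain a pattern like "frame_%03d.jpg".
public class ExtractFramesCommand {
    public static String[] build(String inputPath, String outputPattern,
                                 long startMs, long endMs) {
        return new String[] {
            "-y", "-i", inputPath, "-an",
            "-r", "1/2",                            // one frame every two seconds
            "-ss", String.valueOf(startMs / 1000),
            "-t", String.valueOf((endMs - startMs) / 1000),
            outputPattern
        };
    }
}
```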
For the complete code, check out my repository. Inside the extractImagesVideo() method I am running the command for extracting images from a video.
For a complete tutorial on integrating the ffmpeg library and using ffmpeg commands to edit videos, check out this post on my blog.
You need to:
Decode the video.
Present the decoded images at least as fast as 24 images/second (I suppose you can skip this step).
Save the decoded images.
It appears that decoding the video will be the most challenging step. People and companies have spent years developing codecs (encoders/decoders) for various video formats.
Use this library JMF for FFMPEG.

Android -- Can't play any videos (mp4/mov/3gp/etc.)?

I'm having great difficulty getting my Android application to play videos from the SD card. It doesn't matter what size, bitrate, video format, or any other setting I can think of, neither the emulator nor my G1 will play anything I try to encode. I've also tried a number of videos from the web (various video formats, bitrates, with and without audio tracks, etc.), and none of those work either.
All I keep getting is a dialog box that says:
"Cannot play video"
"Sorry, this video cannot be played."
There are errors reported in LogCat, but I don't understand them and I've tried searching the Internet for further explanations without any luck. See below:
03-30 05:34:26.807: ERROR/QCOmxcore(51): OMXCORE API : Free Handle 390d4
03-30 05:34:26.817: ERROR/QCOmxcore(51): Unloading the dynamic library for OMX.qcom.video.decoder.avc
03-30 05:34:26.817: ERROR/PlayerDriver(51): Command PLAYER_PREPARE completed with an error or info PVMFErrNoResources
03-30 05:34:26.857: ERROR/MediaPlayer(14744): error (1, -15)03-30 05:34:26.867: ERROR/MediaPlayer(14744): Error (1,-15)
Sometimes I also get this:
03-30 05:49:49.267: ERROR/PlayerDriver(51): Command PLAYER_INIT completed with an error or info PVMFErrResource
03-30 05:49:49.267: ERROR/MediaPlayer(19049): error (1, -17)
03-30 05:49:49.347: ERROR/MediaPlayer(19049): Error (1,-17)
Here is the code I'm using (in my onCreate() method):
this.setContentView(R.layout.main);
//just a simple VideoView loading files from the SD card
VideoView myIntroView = (VideoView) this.findViewById(R.id.VideoView01);
MediaController mc = new MediaController(this);
myIntroView.setMediaController(mc);
myIntroView.setVideoPath("/sdcard/test.mp4");
myIntroView.requestFocus();
myIntroView.start();
Please help!
Okay, here goes. The video I've been working on in Adobe Premiere is supposed to be 480x800 (WxH), but I have the Adobe Media Encoder output the sequence as an "Uncompressed Microsoft AVI" using the "UYVY" video codec, 24fps frame rate, progressive, square pixels, and dimensions: 720x800 (WxH). This outputs a rather large file with 120px black borders on either side of the video content. I then take the video into Handbrake 0.9.4 and use the following settings (I started with the Regular->Normal preset):
Container: MP4 File
Large file size: [un-Checked]
Web-optimized: [un-Checked]
iPod 5G support: [un-Checked]
Width: 320 (this is key, any higher than 320 and it won’t work)
Height: 528
Keep Aspect Ratio: [Checked]
Anamorphic: None
Crop Left: 120
Crop Right: 120
Everything under the "Video Filter" tab set to "Off"
Video Codec: H.264(x264)
Framerate: same as source
2-Pass Encoding: [Checked]
Turbo first pass: [un-Checked]
Avg Bitrate: 384
Create chapter markers: [un-Checked]
Reference Frames: 2
Mixed References: [un-Checked]
B-Frames: 0
Motion Estimation Method: Uneven Multi-Hexagon
Sub-pixel Motion Estimation: 9
Motion Estimation Range: 16
Analysis: Default
8x8 DCT: [un-Checked]
CABAC Entropy Coding: [un-Checked]
No Fast-P-Skip: [un-Checked]
No DCT-Decimate: [un-Checked]
Deblocking: Default, Default
Psychovisual Rate Distortion: [MAX]
My main problem was that I was trying to output an mp4 file with 480x800 (WxH) dimensions. After changing the width to 320 (higher values didn't work), yet keeping the proper aspect ratio, the output video now plays without errors. I hope this helps someone else with a similar problem.
Side note: I wish the Android video restrictions were better documented.
I have had quite a bit of trouble getting many different videos to play on my phone (HTC hero). Standard 512K mp4's play (example: http://www.archive.org/details/more_animation), check with them first to make sure it's not your code.
Here's my code, from onCreate() in a sub-activity which only plays the video file:
protected VideoView mine;
protected boolean done = false;

/** Called when the activity is first created. */
@Override
public void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.videoshow);
    mine = (VideoView) findViewById(R.id.video); // Save the VideoView for touch event processing
    try {
        String myURI = "/sdcard/" + path + "/v/"
                + currentItem.getFile()
                + "." + currentItem.getFileType();
        Uri video = Uri.parse(myURI);
        mine.setVideoURI(video);
        mine.start();
        mine.setOnCompletionListener(new OnCompletionListener() {
            public void onCompletion(MediaPlayer mp) {
                result.putExtra("com.ejf.convincer01.Finished", true);
                done = true;
            }
        });
    } catch (Exception ex) {
        Log.d(DEBUG_TAG, "Video failed: '" + ex + "'");
    }
}
I was facing this problem until I figured out that the problem was the directory of my video. I was saving my videos to a directory that is unreachable to the VideoView, so every time I tried to play a video it gave me an error message saying "Can't open video" or something like that.
Try saving your video to this directory, which will also be visible in the phone gallery:
String path = Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_PICTURES) + "/" + "your app name";
