I am trying to make a video from selected images using FFmpeg from the command line in Android.
I am using this project as my source.
This is the command I am using to create the video:
String[] ffmpegCommand = {ffmpegBin,
"-y",
"-qscale",
"1",
"-r", "" + frameRate,
"-i", image1.getAbsolutePath(),
"-t", "" + (((4) * 30) + 4), //"-s",heightwidth,
"-i", image2.getAbsolutePath(),
"-t", "" + (((4) * 30) + 4), //"-s",heightwidth,
"-i", image3.getAbsolutePath(),
"-t", "" + (((4) * 30) + 4), //"-s",heightwidth,
"-i", image4.getAbsolutePath(),
"-t", "" + (((4) * 30) + 4), //"-s",heightwidth,
"-vcodec",
"libx264",
"-s",
"640x480",
outputFile.getAbsolutePath()};
But the created video shows only the first image, and the video is less than a second long.
What is the problem with this command, and why is only one image shown in the video?
Sorry about my bad English.
Here I am creating a 12-second video from 4 images, each shown for 3 seconds, with fade-in and fade-out effects.
Run the command below and make sure that all the images have the same width and height.
String strCommand = "ffmpeg -loop 1 -t 3 -i " + /sdcard/videokit/1.jpg + " -loop 1 -t 3 -i " + /sdcard/videokit/2.jpg + " -loop 1 -t 3 -i " + /sdcard/videokit/3.jpg + " -loop 1 -t 3 -i " + /sdcard/videokit/4.jpg + " -filter_complex [0:v]trim=duration=3,fade=t=out:st=2.5:d=0.5[v0];[1:v]trim=duration=3,fade=t=in:st=0:d=0.5,fade=t=out:st=2.5:d=0.5[v1];[2:v]trim=duration=3,fade=t=in:st=0:d=0.5,fade=t=out:st=2.5:d=0.5[v2];[3:v]trim=duration=3,fade=t=in:st=0:d=0.5,fade=t=out:st=2.5:d=0.5[v3];[v0][v1][v2][v3]concat=n=4:v=1:a=0,format=yuv420p[v] -map [v] -preset ultrafast " + /sdcard/videolit/output.mp4;
This is the ffmpeg command that you should adapt into your string array:
ffmpeg -framerate 25 -t 124 -loop 1 -i image1
-framerate 25 -t 124 -loop 1 -i image2
-framerate 25 -t 124 -loop 1 -i image3
-framerate 25 -t 124 -loop 1 -i image4
-filter_complex "[0][1][2][3]concat=n=4"
-c:v libx264 -s 640x480 outputfile
The rule is that input options (-framerate, -t, etc.) go before the input they apply to.
The concat filter joins the image streams together. If they are different sizes, you should resize to make them the same.
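For reference, a rough sketch of that command mapped onto the String[] style from your question. This is an assumption about how your wrapper expects arguments; ffmpegBin, image1..image4 and outputFile are the variables from your code, and the fixed 25 fps comes from the command above:
// Sketch only: the answer's command in the question's String[] style.
String[] ffmpegCommand = {ffmpegBin,
        "-y",
        "-framerate", "25", "-t", "124", "-loop", "1", "-i", image1.getAbsolutePath(),
        "-framerate", "25", "-t", "124", "-loop", "1", "-i", image2.getAbsolutePath(),
        "-framerate", "25", "-t", "124", "-loop", "1", "-i", image3.getAbsolutePath(),
        "-framerate", "25", "-t", "124", "-loop", "1", "-i", image4.getAbsolutePath(),
        "-filter_complex", "[0][1][2][3]concat=n=4",
        "-c:v", "libx264", "-s", "640x480",
        outputFile.getAbsolutePath()};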
By using the command below we can merge a video and an image together. I am not an expert at building FFmpeg commands.
String cmd = "-t 5 -i " + videoPath +
" -loop 1 -t 5 -i " + imagePath +
" -f lavfi -t 5 -i anullsrc" +
" -filter_complex [1][0]scale2ref[2nd][ref];[ref][0:a][2nd][2:a]concat=n=2:v=1:a=1[v][a]" +
" -c:v libx264 -c:a aac -strict -2 -map [v] -map [a] -preset veryfast " + outputPath;
Does anyone have an idea how to add a transition effect between the video and the image?
Please help me.
Thanks in advance.
I want to turn a single sample image into a 30-second video, but I cannot do it because I'm not good enough with FFmpeg.
My code:
String command[]={
"-y",
"-r",
"1/5",
"-i",
src.getAbsolutePath(), // only one image file path
"-c:v",
"libx264",
"-vf",
"fps=25",
"-pix_fmt",
"yuv420p",
imagesvideoOutput
};
A simple command for a 30-second video from a single image:
ffmpeg -loop 1 -i image.png -vf format=yuv420p -t 30 output.mp4
Faster, but somewhat more complicated method:
ffmpeg -loop 1 -framerate 1 -i image.png -vf fps=25,format=yuv420p -t 30 output.mp4
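A minimal sketch of the simple variant in the String[] style of your code. src and imagesvideoOutput are your existing variables; whether -y is needed depends on how you run FFmpeg:
// Sketch only: 30-second video from the single input image.
String command[] = {
        "-y",
        "-loop", "1",
        "-i", src.getAbsolutePath(),   // the single image
        "-vf", "format=yuv420p",
        "-t", "30",                    // output duration in seconds
        "-c:v", "libx264",
        imagesvideoOutput
};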
I am using FFmpeg for video compression. I have used an FFmpeg library.
I use the command "ffmpeg -i " + filein.trim() + " -vcodec h264 -acodec mp2 " + fileout.trim(), but it doesn't respond with anything.
This is the first time I have used an NDK library.
Use this command; it will work for you. It works for me:
String cmd = "-y -i " + currentInputVideoPath + " -strict -2 -vcodec libx264 -preset ultrafast " + "-crf 24 -acodec aac -ar 44100 -ac 2 -b:a 96k -s 320x240 -aspect 4:3 " + currentOutputVideoPath;
I'm working with 2 videos and I want to perform 3 different operations in the same
FFmpeg execution. Each command works separately, but it would be much more efficient to run them all in a single execution.
So for example I have:
Video 1
Video 2
First I want to cut both videos from starting point to end point:
cmd = -i video1Path -ss 00:00:30.0 -c copy -t 00:00:10.0 video1Output
cmd = -i video2Path -ss 00:00:30.0 -c copy -t 00:00:10.0 video2Output
Then resizing Video 1:
cmd = "-i " + video1Output+ " -vf scale=240:360" + resizedVideo1;
Now overlaying the resizedVideo1 on top of Video 2:
cmd = "-i " + video2Output + " -i " + resizedVideo1 + " -filter_complex [0:v][1:v]" + overlayCmd + " " + finalVideoPath;
I'm wondering if it's possible to achieve all these actions in the same FFmpeg execution using one filter_complex...
The combined command will be
ffmpeg -ss 30 -t 10 -i video2 -ss 30 -t 10 -i video1
-filter_complex
"[1]scale=240:360[v1];[0][v1]overlay"
output.mp4
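A rough sketch of the same thing in the cmd-string style used above. video1Path, video2Path and finalVideoPath are the variables from the question, the seek/trim values are the ones from your separate commands, and overlayCmd is assumed to be your existing overlay expression (e.g. overlay=...):
// Sketch only: cut, scale and overlay in a single execution.
String cmd = "-ss 30 -t 10 -i " + video2Path +            // main video, trimmed
        " -ss 30 -t 10 -i " + video1Path +                 // video to shrink and overlay, trimmed
        " -filter_complex [1]scale=240:360[v1];[0][v1]" + overlayCmd +
        " " + finalVideoPath;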
I'm trying to make an mp4 from an app on Android; to do so I added the FFmpeg binary.
To create the mp4 I use this command:
ffmpegBinaryPath + " -r 24 -i " + inputImgPath + " -c:v libx264 -crf 23 -pix_fmt yuv420p -s 640x480 " + outputVideoPath;
But I want to add sound to this mp4, so I use:
ffmpegBinaryPath + " -r 24 -i " + inputImgPath + " -i " + mp3Path + " -c:v libx264 -crf 23 -pix_fmt yuv420p -s 640x480 " + outputVideoPath;
But it does not work: I get no error message and FFmpeg stops at the beginning.
I think I need an mp3 codec or something like that, but I can't find a way to make it work.
So any help will be appreciated.
Thank you.