OK, so I have successfully compiled FFmpeg for Android using the Guardian Project source code. The resulting binary comes out at around 10 MB, but since it has to go onto a phone, I want it to be as small as possible.
The Guardian Project code has a file, configure_ffmpeg.sh (see the link), which it says to edit to add or remove functionality, but I have no clue what does what. Could someone help me sort this out? Ideally I would be able to disable a few things.
Thanks,
Digvijay
Regarding the size of the project, I don't know what you could do.
Regarding the conversion of images to a video, you could do something like:
public static String convert(File file) {
    FFmpegFrameGrabber frameGrabber =
            new FFmpegFrameGrabber(file.getAbsolutePath());
    String newName = file.getName();
    newName = newName.replace("." + FileManager.getExtension(file), "_RECODED.mp4");
    String newFilePath = file.getParent() + "/" + newName;

    FFmpegFrameRecorder recorder = new FFmpegFrameRecorder(newFilePath, 250, 250);
    recorder.setFrameRate(20);
    recorder.setFormat("mp4");
    recorder.setAudioChannels(2);
    recorder.setPreset("veryfast");

    try {
        long startTime = System.currentTimeMillis();
        Log.d("Tempo", "" + startTime);
        frameGrabber.start();
        recorder.start();
        Frame frame;
        while (true) {
            try {
                frame = frameGrabber.grabFrame();
                if (frame == null) {
                    System.out.println("!!! Failed cvQueryFrame");
                    break;
                }
                recorder.record(frame);
            } catch (Exception e) {
                Log.e("converter", e.getMessage());
            }
        }
        frameGrabber.stop();
        frameGrabber.release();
        recorder.stop();
        recorder.release();
        long stopTime = System.currentTimeMillis();
        Log.d("Tempo", "" + stopTime);
    } catch (Exception e) {
        e.printStackTrace();
    }
    return newFilePath;
}
This grabs every frame from an input video and records it into a new .mp4 video. You could rewrite it to record a frame for every image in your folder (or something along those lines).
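The output-name derivation above (the `replace()` call) depends on the project-specific `FileManager.getExtension` helper, which is not a library call. It can be isolated into a small pure-Java helper; a sketch, assuming the goal is simply to swap the extension for `_RECODED.mp4`:

```java
public class RecodedName {
    // Derives "<base>_RECODED.mp4" from an input file name, mirroring the
    // replace() call in convert() without needing FileManager.
    static String recodedName(String name) {
        int dot = name.lastIndexOf('.');
        String base = (dot >= 0) ? name.substring(0, dot) : name;
        return base + "_RECODED.mp4";
    }

    public static void main(String[] args) {
        System.out.println(recodedName("clip.avi")); // clip_RECODED.mp4
    }
}
```

One small advantage over `String.replace`: for a name like `my.video.mov`, only the final extension is swapped, whereas `replace` would substitute the first matching occurrence.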
I am trying to merge an audio file and a video file. Both are 14 seconds long, but after the merge the audio in the output video plays for only 3–4 seconds. Please help me understand why the audio does not play completely in the output file.
String OutputPath = path + outputVideo;
try {
    FrameGrabber grabber1 = new FFmpegFrameGrabber(videoFile);
    FrameGrabber grabber2 = new FFmpegFrameGrabber(audioFile);
    grabber1.start();
    grabber2.start();
    FrameRecorder recorder = new FFmpegFrameRecorder(OutputPath,
            grabber1.getImageWidth(), grabber1.getImageHeight(), 2);
    recorder.setFormat(grabber1.getFormat());
    recorder.setVideoQuality(1);
    recorder.setFrameRate(grabber1.getFrameRate());
    recorder.setSampleRate(grabber2.getSampleRate());
    recorder.start();
    Frame frame1, frame2;
    while ((frame1 = grabber1.grabFrame()) != null &&
           (frame2 = grabber2.grabFrame()) != null) {
        recorder.record(frame1);
        recorder.record(frame2);
    }
    recorder.stop();
    grabber1.stop();
    grabber2.stop();
} catch (FrameGrabber.Exception e) {
    e.printStackTrace();
} catch (FrameRecorder.Exception e) {
    e.printStackTrace();
}
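A likely cause of the truncated audio is the loop condition: `&&` ends the loop as soon as *either* grabber returns null, so whichever stream runs out of frames first silently cuts off the other. A fix worth trying is to drain each grabber fully (record from `grabber1` until it returns null, then from `grabber2`). The effect of the `&&` condition can be seen with plain lists standing in for the grabbers (a pure-Java sketch, not the JavaCV API):

```java
import java.util.Arrays;
import java.util.Iterator;
import java.util.List;

public class DrainBoth {
    // Mirrors the question's `&&` loop: recording stops as soon as either
    // stream is exhausted, so the tail of the longer stream is dropped.
    static int recordLockstep(List<String> video, List<String> audio) {
        int recorded = 0;
        Iterator<String> v = video.iterator(), a = audio.iterator();
        while (v.hasNext() && a.hasNext()) {
            v.next();
            a.next();
            recorded += 2;
        }
        return recorded;
    }

    // Drains each stream fully: every frame of both tracks gets recorded.
    static int recordAll(List<String> video, List<String> audio) {
        int recorded = 0;
        for (String f : video) recorded++; // record each video frame
        for (String f : audio) recorded++; // then each audio frame
        return recorded;
    }

    public static void main(String[] args) {
        List<String> video = Arrays.asList("v1", "v2", "v3", "v4");
        List<String> audio = Arrays.asList("a1", "a2");
        System.out.println(recordLockstep(video, audio)); // 4: two video frames lost
        System.out.println(recordAll(video, audio));      // 6: nothing lost
    }
}
```

In the JavaCV code this corresponds to two separate `while ((frame = grabber.grabFrame()) != null)` loops, one per grabber, instead of the combined condition.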
I have created functionality to record video in my app. When I play a song, that song is recorded along with the video and a video file is created, similar to the Dubsmash application.
The problem I am facing is that other sounds nearby also get recorded. The song is played on the video-recording screen; I start it when the video recording activity launches.
How can I have my application record only the song with the video?
mediaRecorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
mediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
mediaRecorder.setProfile(CamcorderProfile.get(CamcorderProfile.QUALITY_HIGH));
Is there any way to set the audio source to the speaker, since the song's sound goes out through the speaker? If there is another possible way, please let me know.
You can record the video without audio and merge the audio in later using mp4parser, like this:
/**
 * @param videoFile path to the video file
 * @param audioFile path to the audio file
 */
public String mux(String videoFile, String audioFile) {
    Movie video = null;
    try {
        video = new MovieCreator().build(videoFile);
    } catch (RuntimeException e) {
        e.printStackTrace();
        return null;
    } catch (IOException e) {
        e.printStackTrace();
        return null;
    }
    Movie audio = null;
    try {
        audio = new MovieCreator().build(audioFile);
    } catch (IOException e) {
        e.printStackTrace();
        return null;
    } catch (NullPointerException e) {
        e.printStackTrace();
        return null;
    }
    int size = audio.getTracks().size();
    Track audioTrack = audio.getTracks().get(size - 1);
    video.addTrack(audioTrack);
    Container out = new DefaultMp4Builder().build(video);
    File myDirectory = new File(Environment.getExternalStorageDirectory(), "/Folder Name");
    if (!myDirectory.exists()) {
        myDirectory.mkdirs();
    }
    String filePath = myDirectory + "/video" + System.currentTimeMillis() + ".mp4";
    try {
        RandomAccessFile ram = new RandomAccessFile(filePath, "rw");
        FileChannel fc = ram.getChannel();
        out.writeContainer(fc);
        ram.close();
    } catch (IOException e) {
        e.printStackTrace();
        return null;
    }
    return filePath;
}
In build.gradle, add the following dependency:
compile 'com.googlecode.mp4parser:isoparser:1.0.5.4'
If you want to work with video, you have to use the FFmpeg library; with it you can manipulate video.
I have already given an answer to "How to use ffmpeg in android studio?" — see this LINK. Follow it step by step and import it into your project.
You can use a MediaRecorder without calling setAudio* on it.
Remove this line:
mediaRecorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
See this link.
There is currently no way to directly record Android audio output without "background noise".
Note that restricting access to other apps' audio output is a security concern, so it is very unlikely that this could be achieved directly.
See this answer
I am creating a video from images via FFmpeg and I am able to get the video from the images. I am also making use of JavaCV to merge two videos, and I can join videos with JavaCV without any issues provided both videos were taken via the camera, i.e., videos actually recorded with the mobile camera.
The issue I'm facing:
I am not able to merge the video generated from images via FFmpeg with the video the user has chosen, which will mostly be a video that was not generated this way and not taken via the mobile camera.
Code:
Code to generate a video from images:
FFmpegFrameRecorder recorder = new FFmpegFrameRecorder(path + "/" + "dec16.mp4", 800, 400);
try {
    recorder.setVideoCodec(avcodec.AV_CODEC_ID_MPEG4);
    //recorder.setVideoCodec(avcodec.AV_CODEC_ID_H264);
    recorder.setVideoCodecName("H264");
    recorder.setVideoOption("preset", "ultrafast");
    recorder.setFormat("mp4");
    recorder.setFrameRate(frameRate);
    recorder.setVideoBitrate(60);
    recorder.setPixelFormat(avutil.AV_PIX_FMT_YUV420P);
    startTime = System.currentTimeMillis();
    recorder.start();
    for (int j = 0; j < MomentsGetStarted.chosen_images_uri.size(); j++) {
        if (MomentsGetStarted.chosen_images_uri.get(j).video_id == 0) {
            chosen_images.add(MomentsGetStarted.chosen_images_uri.get(j));
        }
    }
    for (int i = 0; i < chosen_images.size(); i++) {
        opencv_core.IplImage image = cvLoadImage(chosen_images.get(i).sdcardPath);
        long t = 3000 * (System.currentTimeMillis() - startTime);
        if (t > recorder.getTimestamp()) {
            recorder.setTimestamp(t);
            recorder.record(image);
        }
    }
    recorder.stop();
} catch (Exception e) {
    e.printStackTrace();
}
Code to merge videos:
int count = file_path.size();
System.out.println("final_joined_list size " + file_path.size());
if (file_path.size() != 1) {
    try {
        Movie[] inMovies = new Movie[count];
        mediaStorageDir = new File(Environment.getExternalStorageDirectory() + "/Pictures");
        for (int i = 0; i < count; i++) {
            File file = new File(file_path.get(i));
            System.out.println("file: " + file);
            FileInputStream fis = new FileInputStream(file);
            FileChannel fc = fis.getChannel();
            inMovies[i] = MovieCreator.build(fc);
            fis.close();
            fc.close();
        }
        List<Track> videoTracks = new LinkedList<Track>();
        List<Track> audioTracks = new LinkedList<Track>();
        Log.d("Movies length", "is " + inMovies.length);
        if (inMovies.length != 0) {
            for (Movie m : inMovies) {
                for (Track t : m.getTracks()) {
                    if (t.getHandler().equals("soun")) {
                        audioTracks.add(t);
                    }
                    if (t.getHandler().equals("vide")) {
                        videoTracks.add(t);
                    }
                }
            }
        }
        Movie result = new Movie();
        System.out.println("audio and video tracks: "
                + audioTracks.size() + " , " + videoTracks.size());
        if (audioTracks.size() > 0) {
            result.addTrack(new AppendTrack(audioTracks
                    .toArray(new Track[audioTracks.size()])));
        }
        if (videoTracks.size() > 0) {
            result.addTrack(new AppendTrack(videoTracks
                    .toArray(new Track[videoTracks.size()])));
        }
        IsoFile out = null;
        try {
            out = (IsoFile) new DefaultMp4Builder().build(result);
        } catch (Exception e) {
            e.printStackTrace();
        }
        long timestamp = new Date().getTime();
        String timestampS = "" + timestamp;
        File storagePath = new File(mediaStorageDir + File.separator);
        storagePath.mkdirs();
        File myMovie = new File(storagePath, String.format("%s.mp4", timestampS));
        FileOutputStream fos = new FileOutputStream(myMovie);
        FileChannel fco = fos.getChannel();
        fco.position(0);
        out.getBox(fco);
        fco.close();
        fos.close();
    } catch (FileNotFoundException e) {
        e.printStackTrace();
    } catch (IOException e) {
        e.printStackTrace();
    }
    String mFileName = Environment.getExternalStorageDirectory().getAbsolutePath();
    // mFileName += "/output.mp4";
    File sdCardRoot = Environment.getExternalStorageDirectory();
    File yourDir = new File(mediaStorageDir + File.separator);
    for (File f : yourDir.listFiles()) {
        if (f.isFile())
            name = f.getName();
        // do something with the name
    }
    mFileName = mediaStorageDir.getPath() + File.separator + "output-%s.mp4";
    System.out.println("final filename: " + mFileName + " names of files: " + name);
    single_video = false;
    return name;
} else {
    single_video = true;
    name = file_path.get(0);
    return name;
}
Error:
The error I get while trying to merge the video generated from images with a normal video is:
12-15 12:26:06.155 26022-26111/? W/System.err﹕ java.io.IOException: Cannot append com.googlecode.mp4parser.authoring.Mp4TrackImpl#45417c38 to com.googlecode.mp4parser.authoring.Mp4TrackImpl#44ffac60 since their Sample Description Boxes differ
12-15 12:26:06.155 26022-26111/? W/System.err﹕ at com.googlecode.mp4parser.authoring.tracks.AppendTrack.<init>(AppendTrack.java:48)
Fix that I tried:
Googling advised me to change the codec in JavaCV from avcodec.AV_CODEC_ID_MPEG4 to avcodec.AV_CODEC_ID_H264. But when I did that, I could no longer generate the video from images; it throws the following error:
12-15 12:26:05.840 26022-26089/? W/linker﹕ libavcodec.so has text relocations. This is wasting memory and is a security risk. Please fix.
12-15 12:26:05.975 26022-26089/? W/System.err﹕ com.googlecode.javacv.FrameRecorder$Exception: avcodec_open2() error -1: Could not open video codec.
12-15 12:26:05.975 26022-26089/? W/System.err﹕ at com.googlecode.javacv.FFmpegFrameRecorder.startUnsafe(FFmpegFrameRecorder.java:492)
12-15 12:26:05.975 26022-26089/? W/System.err﹕ at com.googlecode.javacv.FFmpegFrameRecorder.start(FFmpegFrameRecorder.java:267)
What I need:
Creating a video from images is unavoidable, and that video will definitely be merged with other videos, which might have any codec format. So I need a way to merge any kind of videos irrespective of their codecs or other parameters. I am trying to keep things simple by just using the JARs and .so files, and I don't want to drive myself crazy with a full-scale build of the FFmpeg library. That said, I am ready to look into that library if there is no other way to achieve what I want, but a solid resource with an almost-working code sample would be much appreciated. Cheers.
Update:
I looked through the issues on the OpenCV GitHub but didn't find anything solid there.
OpenCV Issues
You are using an MP4 parser, which means it won't touch the encoding of the videos. This will only work if the two videos share the same encoder settings, such as resolution and frame rate, to list two of the most obvious ones.
If you need to merge videos with different codecs or parameters, you must re-encode them to target a common format and set of parameters, and in that case a simple MP4 parser won't do.
You can achieve this directly with ffmpeg.
I am trying to apply a grayscale effect to a video file using JavaCV on Android. Everything works fine, but there is no audio in the output file. Below is the code. Please help.
File file = new File(Environment.getExternalStorageDirectory() + File.separator + "test3.mp4");
FFmpegFrameGrabber frameGrabber = new FFmpegFrameGrabber(file.getAbsolutePath());
FrameRecorder recorder = null;
Log.d("bharat", " Audio channels = " + frameGrabber.getAudioChannels()); // THIS IS RETURNING 0 and not 2
recorder = new FFmpegFrameRecorder("/mnt/sdcard/streaml_t.mp4", 270, 480, frameGrabber.getAudioChannels());
recorder.setVideoCodec(AV_CODEC_ID_H264);
recorder.setFormat("mp4");
recorder.setFrameRate(frameGrabber.getFrameRate());
recorder.setSampleFormat(frameGrabber.getSampleFormat());
recorder.setSampleRate(frameGrabber.getSampleRate());
try {
    recorder.start();
    frameGrabber.start();
    int count = 0;
    while (true) {
        try {
            Frame grabFrame = frameGrabber.grabFrame();
            Log.d("bharat:", "frame " + count++);
            if (grabFrame == null) {
                System.out.println("!!! Failed cvQueryFrame");
                break;
            }
            IplImage frame_copy = cvCreateImage(cvSize(grabFrame.image.width(), grabFrame.image.height()), IPL_DEPTH_8U, 1);
            cvCvtColor(grabFrame.image, frame_copy, CV_RGB2GRAY);
            grabFrame.image = frame_copy;
            recorder.setTimestamp(frameGrabber.getTimestamp());
            recorder.record(grabFrame);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
    Log.d("bharat:", "frame done");
    recorder.stop();
    recorder.release();
} catch (Exception e) {
    e.printStackTrace();
}
}
P.S.: I found that frameGrabber.getAudioChannels() returns 0 and not 2 (in case that helps).
I once ran into a problem caused by the lack of documentation in JavaCV. The API looks like:
/** Grab next videoframe */
public IplImage grab()
/** Grab next video or audio frame */
public Frame grabFrame()
So grabFrame() returns either an audio or a video frame, and you always try to process it as an image. You should check whether it is an audio or a video frame first, and never call cvCvtColor() on an audio frame.
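In other words, the loop should branch on the frame type before doing any image work. In newer JavaCV versions a `Frame` exposes `image` and `samples` fields, so the check is a null test; the sketch below uses a minimal stand-in class with those field names rather than the real `org.bytedeco.javacv.Frame`:

```java
public class FrameDispatch {
    // Minimal stand-in for JavaCV's Frame: a video frame carries image
    // data, an audio frame carries sample data (field names mirror Frame).
    static class Frame {
        Object image;   // non-null for video frames
        Object samples; // non-null for audio frames
    }

    // Decides what the recording loop should do with a grabbed frame:
    // convert video frames to grayscale first, pass audio through untouched.
    static String handle(Frame f) {
        if (f.image != null) {
            return "convert-then-record"; // safe to call cvCvtColor here
        } else if (f.samples != null) {
            return "record-as-is";        // never run image ops on audio
        }
        return "skip";                    // empty frame: ignore
    }

    public static void main(String[] args) {
        Frame video = new Frame();
        video.image = new int[0];
        Frame audio = new Frame();
        audio.samples = new short[0];
        System.out.println(handle(video)); // convert-then-record
        System.out.println(handle(audio)); // record-as-is
    }
}
```

With the real API, the same test (`grabFrame.image != null`) decides whether cvCvtColor() may run; audio frames go straight to recorder.record(), which should also restore the missing audio in the grayscale output.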