Unable to merge videos in Android using JavaCV ("Sample Description" Error) - android

I am creating a video from images via FFmpeg, and I am able to get the video from the images. I am also using JavaCV to merge two videos, and I can join them without any issues provided both videos were actually recorded with the mobile camera.
The issue I'm facing:
I am not able to merge the video generated from images via FFmpeg with the video the user has chosen, which will usually not have been recorded with the mobile camera.
Code to generate a video from images:
FFmpegFrameRecorder recorder = new FFmpegFrameRecorder(path + "/" + "dec16.mp4", 800, 400);
try {
    recorder.setVideoCodec(avcodec.AV_CODEC_ID_MPEG4);
    //recorder.setVideoCodec(avcodec.AV_CODEC_ID_H264);
    recorder.setVideoCodecName("H264");
    recorder.setVideoOption("preset", "ultrafast");
    recorder.setFormat("mp4");
    recorder.setFrameRate(frameRate);
    recorder.setVideoBitrate(60);
    recorder.setPixelFormat(avutil.AV_PIX_FMT_YUV420P);
    startTime = System.currentTimeMillis();
    recorder.start();
    for (int j = 0; j < MomentsGetStarted.chosen_images_uri.size(); j++) {
        if (MomentsGetStarted.chosen_images_uri.get(j).video_id == 0) {
            chosen_images.add(MomentsGetStarted.chosen_images_uri.get(j));
        }
    }
    for (int i = 0; i < chosen_images.size(); i++) {
        opencv_core.IplImage image = cvLoadImage(chosen_images.get(i).sdcardPath);
        long t = 3000 * (System.currentTimeMillis() - startTime);
        if (t > recorder.getTimestamp()) {
            recorder.setTimestamp(t);
            recorder.record(image);
        }
    }
    recorder.stop();
} catch (Exception e) {
    e.printStackTrace();
}
Code to merge videos:
int count = file_path.size();
System.out.println("final_joined_list size " + file_path.size());
if (file_path.size() != 1) {
    try {
        Movie[] inMovies = new Movie[count];
        mediaStorageDir = new File(Environment.getExternalStorageDirectory() + "/Pictures");
        for (int i = 0; i < count; i++) {
            File file = new File(file_path.get(i));
            System.out.println("input file: " + file);
            FileInputStream fis = new FileInputStream(file);
            FileChannel fc = fis.getChannel();
            inMovies[i] = MovieCreator.build(fc);
            fis.close();
            fc.close();
        }
        List<Track> videoTracks = new LinkedList<Track>();
        List<Track> audioTracks = new LinkedList<Track>();
        Log.d("Movies length", "is " + inMovies.length);
        if (inMovies.length != 0) {
            for (Movie m : inMovies) {
                for (Track t : m.getTracks()) {
                    if (t.getHandler().equals("soun")) {
                        audioTracks.add(t);
                    }
                    if (t.getHandler().equals("vide")) {
                        videoTracks.add(t);
                    }
                }
            }
        }
        Movie result = new Movie();
        System.out.println("audio and video tracks: "
                + audioTracks.size() + " , " + videoTracks.size());
        if (audioTracks.size() > 0) {
            result.addTrack(new AppendTrack(audioTracks.toArray(new Track[audioTracks.size()])));
        }
        if (videoTracks.size() > 0) {
            result.addTrack(new AppendTrack(videoTracks.toArray(new Track[videoTracks.size()])));
        }
        IsoFile out = null;
        try {
            out = (IsoFile) new DefaultMp4Builder().build(result);
        } catch (Exception e) {
            e.printStackTrace();
        }
        long timestamp = new Date().getTime();
        String timestampS = "" + timestamp;
        File storagePath = new File(mediaStorageDir + File.separator);
        storagePath.mkdirs();
        File myMovie = new File(storagePath, String.format("%s.mp4", timestampS));
        FileOutputStream fos = new FileOutputStream(myMovie);
        FileChannel fco = fos.getChannel();
        fco.position(0);
        out.getBox(fco);
        fco.close();
        fos.close();
    } catch (FileNotFoundException e) {
        e.printStackTrace();
    } catch (IOException e) {
        e.printStackTrace();
    }
    String mFileName = Environment.getExternalStorageDirectory().getAbsolutePath();
    // mFileName += "/output.mp4";
    File sdCardRoot = Environment.getExternalStorageDirectory();
    File yourDir = new File(mediaStorageDir + File.separator);
    for (File f : yourDir.listFiles()) {
        if (f.isFile())
            name = f.getName();
    }
    mFileName = mediaStorageDir.getPath() + File.separator + "output-%s.mp4";
    System.out.println("final filename: " + mFileName + " names of files: " + name);
    single_video = false;
    return name;
} else {
    single_video = true;
    name = file_path.get(0);
    return name;
}
Error :
The error I get while trying to merge the video generated from images with a normal video is:
12-15 12:26:06.155 26022-26111/? W/System.err﹕ java.io.IOException: Cannot append com.googlecode.mp4parser.authoring.Mp4TrackImpl@45417c38 to com.googlecode.mp4parser.authoring.Mp4TrackImpl@44ffac60 since their Sample Description Boxes differ
12-15 12:26:06.155 26022-26111/? W/System.err﹕ at com.googlecode.mp4parser.authoring.tracks.AppendTrack.<init>(AppendTrack.java:48)
Fix that I tried :
Searching around suggested changing the codec in JavaCV from avcodec.AV_CODEC_ID_MPEG4 to avcodec.AV_CODEC_ID_H264. But when I did that, I could no longer generate the video from the images; it throws the following error:
12-15 12:26:05.840 26022-26089/? W/linker﹕ libavcodec.so has text relocations. This is wasting memory and is a security risk. Please fix.
12-15 12:26:05.975 26022-26089/? W/System.err﹕ com.googlecode.javacv.FrameRecorder$Exception: avcodec_open2() error -1: Could not open video codec.
12-15 12:26:05.975 26022-26089/? W/System.err﹕ at com.googlecode.javacv.FFmpegFrameRecorder.startUnsafe(FFmpegFrameRecorder.java:492)
12-15 12:26:05.975 26022-26089/? W/System.err﹕ at com.googlecode.javacv.FFmpegFrameRecorder.start(FFmpegFrameRecorder.java:267)
What I need :
Creating the video from images is unavoidable, and that video will definitely be merged with other videos, which might use any codec. So I need to find a way to merge any kind of videos irrespective of their codecs or other parameters. I am trying to keep it simple by just using the JARs and .so files, and I don't want to drive myself crazy with a full-scale implementation of the FFmpeg library. That said, I am also ready to look into that library if I have no other way to achieve what I want, but a solid resource with an almost-working example would be much appreciated. Cheers.
Update :
I looked through the issues on the OpenCV GitHub tracker, but didn't find anything solid there.
OpenCV Issues

You are using an MP4 parser, which means it won't touch the encoding of the videos. This will only work if the two videos share the same encoder settings, such as resolution and framerate, to list some of the most obvious ones.
If you need to merge videos with different codecs or parameters, then you must re-encode them to a common target format and set of parameters, and in that case a simple MP4 parser won't do.
You can achieve this directly with ffmpeg.
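As a quick way to see why AppendTrack refuses such a merge, you can check the sample-entry fourcc in each file's `stsd` (Sample Description) box before attempting the join; two files reporting different fourccs (e.g. `mp4v` vs `avc1`) cannot be appended without re-encoding. The helper below is a hypothetical sketch, not part of mp4parser: it walks an in-memory MP4 box tree in plain Java and returns the first sample-entry fourcc it finds.

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;
import java.util.Arrays;
import java.util.List;

// Hypothetical helper: walks the MP4 box tree in a byte array and returns
// the first sample-entry fourcc found inside an 'stsd' box.
public class SampleDescriptionProbe {
    // Container boxes whose payload is a list of child boxes.
    private static final List<String> CONTAINERS =
            Arrays.asList("moov", "trak", "mdia", "minf", "stbl");

    public static String firstSampleEntry(byte[] data) {
        return scan(ByteBuffer.wrap(data), 0, data.length);
    }

    private static String scan(ByteBuffer buf, int start, int end) {
        int pos = start;
        while (pos + 8 <= end) {
            int size = buf.getInt(pos);               // box size incl. 8-byte header
            String type = fourcc(buf, pos + 4);
            if (size < 8 || pos + size > end) break;  // malformed box, stop
            if (type.equals("stsd") && pos + 24 <= end) {
                // full box: 4 bytes version/flags + 4 bytes entry count,
                // then the first sample entry's own 8-byte header follows;
                // its type field is the codec fourcc
                return fourcc(buf, pos + 8 + 4 + 4 + 4);
            }
            if (CONTAINERS.contains(type)) {
                String found = scan(buf, pos + 8, pos + size);
                if (found != null) return found;
            }
            pos += size;
        }
        return null;
    }

    private static String fourcc(ByteBuffer buf, int at) {
        byte[] b = new byte[4];
        for (int i = 0; i < 4; i++) b[i] = buf.get(at + i);
        return new String(b, StandardCharsets.US_ASCII);
    }
}
```

Running this over both inputs before calling AppendTrack lets you fail fast with a clear message, or decide to re-encode, instead of hitting the "Sample Description Boxes differ" IOException.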

Related

Merging audio file in android not working

I'm trying to merge/join two audio files, but the merged file contains only the first file's audio. I don't know what the issue is; I think the problem is with the headers, but I don't know how to fix it.
e.g.
f1 = 4 kb
f2 = 3 kb
finalFile = 7 kb
The size shows the merge was done, but I don't know why the audio of the second file is missing.
Here is my code.
public static void meargeAudio(List<File> filesToMearge) {
    while (filesToMearge.size() != 1) {
        try {
            // first source file, e.g. /storage/emulated/0/Audio Notes/1455194356500.mp3
            FileInputStream fistream1 = new FileInputStream(filesToMearge.get(0).getPath());
            // second source file
            FileInputStream fistream2 = new FileInputStream(filesToMearge.get(1).getPath());
            SequenceInputStream sistream = new SequenceInputStream(fistream1, fistream2);
            FileOutputStream fostream = new FileOutputStream(AppConstrants.APP_FOLDER_PATH + "sss.mp3", true);
            int temp;
            while ((temp = sistream.read()) != -1) {
                fostream.write(temp); // write to file
            }
            fostream.close();
            sistream.close();
            fistream1.close();
            fistream2.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
Help me if you can.
Thanks in advance.
You are right about the header problem. The reason your new audio file is recognized only as the first audio file is that you directly concatenated both files. When MediaPlayer reads the merged file's header (bit rate, length of the audio data, etc.), it sees only the first file's header, because that is what it finds first. To properly join two audio files, you need to read both headers, decode their audio data, recalculate the length of the merged audio data, concatenate the uncompressed audio, re-compress it (as MP3, for example), and then write it to a file.
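To make the header point concrete, here is a sketch using uncompressed PCM WAV instead of MP3 (WAV is simpler because no re-encoding step is needed when both files share the same sample rate, channel count, and bit depth). The class name and the canonical-44-byte-header assumption are mine, not from the question: the key step is rewriting the two length fields in the RIFF header so players don't stop after the first file's data.

```java
// Joins two PCM WAV files by concatenating their data chunks and
// rewriting the RIFF length fields for the new total size.
// Assumes both inputs are canonical WAVs (44-byte header) with
// identical sample rate, channels, and bit depth.
public class WavJoiner {
    private static final int HEADER = 44;

    public static byte[] join(byte[] a, byte[] b) {
        int dataLen = (a.length - HEADER) + (b.length - HEADER);
        byte[] out = new byte[HEADER + dataLen];
        System.arraycopy(a, 0, out, 0, HEADER);                      // reuse first header
        System.arraycopy(a, HEADER, out, HEADER, a.length - HEADER); // first data chunk
        System.arraycopy(b, HEADER, out, a.length, b.length - HEADER); // second data chunk
        writeIntLE(out, 4, 36 + dataLen);   // RIFF chunk size = 36 + data bytes
        writeIntLE(out, 40, dataLen);       // 'data' chunk size
        return out;
    }

    // WAV header fields are little-endian.
    private static void writeIntLE(byte[] buf, int at, int v) {
        buf[at] = (byte) v;
        buf[at + 1] = (byte) (v >>> 8);
        buf[at + 2] = (byte) (v >>> 16);
        buf[at + 3] = (byte) (v >>> 24);
    }
}
```

A naive byte-level concat (as in the question's SequenceInputStream code) skips exactly these two length updates, which is why the second file's audio is "lost" even though the file size looks right. For MP3, the same principle applies but the frames must be decoded and re-encoded rather than patched in place.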
Solution:
mp4parser can be used to append audio files:
https://github.com/sannies/mp4parser/issues/155
use:
aspectjrt-1.7.3.jar
My Working Code:
public static void mergAudio(List<File> filesToMearge) {
    try {
        while (filesToMearge.size() != 1) {
            String audio1 = filesToMearge.get(0).getPath();
            String audio2 = filesToMearge.get(1).getPath();
            String outputVideo = filesToMearge.get(0).getPath(); // destination file
            Movie[] inMovies = new Movie[]{
                    MovieCreator.build(audio1),
                    MovieCreator.build(audio2),
            };
            List<Track> audioTracks = new LinkedList<Track>();
            for (Movie m : inMovies) {
                for (Track t : m.getTracks()) {
                    if (t.getHandler().equals("soun")) {
                        audioTracks.add(t);
                    }
                }
            }
            File file1 = new File(filesToMearge.get(0).getPath());
            boolean deleted = file1.delete();
            File file2 = new File(filesToMearge.get(1).getPath());
            boolean deleted1 = file2.delete();
            Movie result = new Movie();
            if (audioTracks.size() > 0) {
                result.addTrack(new AppendTrack(audioTracks.toArray(new Track[audioTracks.size()])));
            }
            Container out = new DefaultMp4Builder().build(result);
            out.writeContainer(new FileOutputStream(outputVideo).getChannel());
            filesToMearge.add(0, new File(filesToMearge.get(0).getPath()));
            filesToMearge.remove(1);
            filesToMearge.remove(1);
        }
    } catch (IOException e) {
        e.printStackTrace();
    }
}

WordtoPdf /PdftoWord in Android

Does anyone know the code for converting Word to PDF / PDF to Word in Android?
If you know, please share it with me.
Already tried:
Jars:
docx4j-3.0.0.jar
Code:
try {
    long start = System.currentTimeMillis();
    InputStream is = new FileInputStream(new File("file1"));
    Toast.makeText(getApplicationContext(), "is", Toast.LENGTH_SHORT).show();
    WordprocessingMLPackage wordMLPackage = WordprocessingMLPackage.load(is);
    List sections = wordMLPackage.getDocumentModel().getSections();
    for (int i = 0; i < sections.size(); i++) {
        System.out.println("sections size " + sections.size());
        /* wordMLPackage.getDocumentModel().getSections().get(i)
                .getPageDimensions().setHeaderExtent(3000); */
    }
    Mapper fontMapper = new IdentityPlusMapper();
    PhysicalFont font = (PhysicalFont) PhysicalFonts.getPhysicalFonts().get("Comic Sans MS");
    fontMapper.getFontMappings().put("Algerian", font);
    wordMLPackage.setFontMapper(fontMapper);
    PdfSettings pdfSettings = new PdfSettings();
    org.docx4j.convert.out.pdf.PdfConversion conversion =
            new org.docx4j.convert.out.pdf.viaXSLFO.Conversion(wordMLPackage);
    OutputStream out = new FileOutputStream(new File("file1/sampleeee.pdf"));
    conversion.output(out, pdfSettings);
    System.err.println("Time taken to generate pdf: "
            + (System.currentTimeMillis() - start) + "ms");
} catch (Exception e) {
    e.printStackTrace();
}
But I can't get the output...
One reason why you "can't get the output" is that you are providing an incorrect FileOutputStream. You need to be using either:
internal storage (openFileOutput(), getFilesDir(), etc.), or
external storage (getExternalFilesDir(), etc.)

No sound is captured in video created using FFmpegFrameGrabber, FrameRecorder JavaCv in android

I am trying to apply a grayscale effect to a video file using JavaCV in Android. Everything is working fine, but there is no audio present in the output file. Below is the code. Please help.
File file = new File(Environment.getExternalStorageDirectory() + File.separator + "test3.mp4");
FFmpegFrameGrabber frameGrabber = new FFmpegFrameGrabber(file.getAbsolutePath());
FrameRecorder recorder = null;
Log.d("bharat", " Audio channels = " + frameGrabber.getAudioChannels()); // THIS IS RETURNING 0 and not 2
recorder = new FFmpegFrameRecorder("/mnt/sdcard/streaml_t.mp4", 270, 480, frameGrabber.getAudioChannels());
recorder.setVideoCodec(AV_CODEC_ID_H264);
recorder.setFormat("mp4");
recorder.setFrameRate(frameGrabber.getFrameRate());
recorder.setSampleFormat(frameGrabber.getSampleFormat());
recorder.setSampleRate(frameGrabber.getSampleRate());
try {
    recorder.start();
    frameGrabber.start();
    int count = 0;
    while (true) {
        try {
            Frame grabFrame = frameGrabber.grabFrame();
            Log.d("bharat:", "frame " + count++);
            if (grabFrame == null) {
                System.out.println("!!! Failed cvQueryFrame");
                break;
            }
            IplImage frame_copy = cvCreateImage(cvSize(grabFrame.image.width(), grabFrame.image.height()), IPL_DEPTH_8U, 1);
            cvCvtColor(grabFrame.image, frame_copy, CV_RGB2GRAY);
            grabFrame.image = frame_copy;
            recorder.setTimestamp(frameGrabber.getTimestamp());
            recorder.record(grabFrame);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
    Log.d("bharat:", "frame done");
    recorder.stop();
    recorder.release();
} catch (Exception e) {
    e.printStackTrace();
}
P.S.: I found that frameGrabber.getAudioChannels() returns 0 and not 2 (in case it helps).
Once I ran into a problem caused by the lack of docs in JavaCV, like:
/** Grab the next video frame */
public IplImage grab()
/** Grab the next video or audio frame */
public Frame grabFrame()
So you get either an audio or a video frame, and then you always try to process it as an image. You should first check whether it is an audio or a video frame, and not call cvCvtColor() on an audio frame.
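In outline, the fix is a dispatch on what the frame actually carries. The sketch below uses a stand-in Frame class (assumed here to mirror JavaCV's Frame, which carries either an image or audio samples, never both) so the logic is self-contained; in the real loop you would run the grayscale conversion only on the "video" branch and pass "audio" frames straight to the recorder.

```java
// Stand-in demonstration of audio/video frame dispatch. The nested Frame
// class is a placeholder for com.googlecode.javacv.Frame, which similarly
// exposes an image field for video frames and samples for audio frames.
public class FrameDispatch {
    static class Frame {
        Object image;   // non-null only for video frames
        Object samples; // non-null only for audio frames
    }

    /** Returns "video", "audio", or "empty" depending on the frame's payload. */
    public static String classify(Frame f) {
        if (f.image != null) return "video";   // safe to run cvCvtColor() etc.
        if (f.samples != null) return "audio"; // record as-is, no image ops
        return "empty";                        // nothing grabbed, skip
    }
}
```

Applied to the question's loop, this means calling cvCvtColor() and swapping in the grayscale copy only when classify(grabFrame) is "video", and calling recorder.record(grabFrame) unchanged for "audio", which is what preserves the sound track in the output.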

how to merge audio file with new video file?is it possible in android?

I successfully created a video from a sequence of images using JavaCV in Android. Now my problem is how to merge audio into that newly created video. Is it possible in Android or with the JavaCV integration?
Here is my code:
String path = "/mnt/sdcard/Video_images";
File folder = new File(path);
File[] listOfFiles = folder.listFiles();
if (listOfFiles.length > 0) {
    iplimage = new opencv_core.IplImage[listOfFiles.length];
    for (int j = 0; j < listOfFiles.length; j++) {
        String files = "";
        if (listOfFiles[j].isFile()) {
            files = listOfFiles[j].getName();
            System.out.println(" j " + j + listOfFiles[j]);
        }
        String[] tokens = files.split("\\.(?=[^\\.]+$)");
        String name = tokens[0];
        Toast.makeText(getBaseContext(), "size" + listOfFiles.length, Toast.LENGTH_SHORT).show();
        iplimage[j] = cvLoadImage("/mnt/sdcard/Video_images/" + name + ".jpg");
    }
}

FFmpegFrameRecorder recorder = new FFmpegFrameRecorder(
        "/mnt/sdcard/Video_images/output" + System.currentTimeMillis() + ".mp4", 200, 150);
try {
    recorder.setVideoCodec(a); // CODEC_ID_MPEG4 // CODEC_ID_MPEG1VIDEO
    recorder.setFrameRate(24);
    recorder.setPixelFormat(PIX_FMT_YUV420P); // PIX_FMT_YUV420P
    recorder.start();
    for (int i = 0; i < iplimage.length; i++) {
        recorder.record(iplimage[i]);
    }
    recorder.stop();
} catch (Exception e) {
    e.printStackTrace();
}
In this code, how do I merge my audio file?
There is nothing in Android itself for merging video files. You will need to find a Java JAR that can handle this for you if you need to do it on the device.
It should be possible with the newest version of JavaCV, which can be downloaded from here.
Here is an idea of what your code would look like if you wanted to merge the audio with the video while you create your mp4:
FFmpegFrameGrabber grabber1 = new FFmpegFrameGrabber("song.mp3");
grabber1.start();
FFmpegFrameRecorder recorder = new FFmpegFrameRecorder(
        "/mnt/sdcard/Video_images/output" + System.currentTimeMillis() + ".mp4",
        200, 150, grabber1.getAudioChannels());
try {
    recorder.setVideoCodec(a); // CODEC_ID_MPEG4 // CODEC_ID_MPEG1VIDEO
    recorder.setFrameRate(24);
    recorder.setPixelFormat(PIX_FMT_YUV420P); // PIX_FMT_YUV420P
    recorder.start();
    com.googlecode.javacv.Frame frame1 = new com.googlecode.javacv.Frame();
    for (int i = 0; i < iplimage.length; i++) {
        frame1 = grabber1.grabFrame();
        recorder.record(frame1);
        recorder.record(iplimage[i]);
    }
    recorder.stop();
    grabber1.stop();
} catch (Exception e) {
    e.printStackTrace();
}
This may not be 100% correct, but it should be a good starting place. Of course, you want to make sure that frame1 isn't null before you try to record it.

Compiling ffmpeg only for converting images to video(w/ sound)

OK, so I have successfully compiled ffmpeg for Android using the Guardian Project source code. The binary comes out at around 10 MB, but since it has to go onto a phone, I want it to be as small as possible.
The Guardian Project code has a file configure_ffmpeg.sh (go to the link to see it) which it says to change to add or remove functionality, but I have no clue what does what. Could someone help me sort this out? Ideally I would be able to disable a few things.
Thanks,
Digvijay
Regarding the size of the project, I don't know what you could do.
Regarding the conversion of images to a video, you could do something like:
public static String convert(File file) {
    FFmpegFrameGrabber frameGrabber = new FFmpegFrameGrabber(file.getAbsolutePath());
    String newName = file.getName();
    newName = newName.replace("." + FileManager.getExtension(file), "_RECODED.mp4");
    String newFilePath = file.getParent() + "/" + newName;
    FFmpegFrameRecorder recorder;
    Frame frame = new Frame();
    recorder = new FFmpegFrameRecorder(newFilePath, 250, 250);
    recorder.setFrameRate(20);
    recorder.setFormat("mp4");
    recorder.setAudioChannels(2);
    recorder.setPreset("veryfast");
    try {
        long startTime = System.currentTimeMillis();
        Log.d("Tempo", "" + startTime);
        frameGrabber.start();
        recorder.start();
        while (true) {
            try {
                frame = frameGrabber.grabFrame();
                if (frame == null) {
                    System.out.println("!!! Failed cvQueryFrame");
                    break;
                }
                recorder.record(frame);
            } catch (Exception e) {
                Log.e("converter", e.getMessage());
            }
        }
        frameGrabber.stop();
        frameGrabber.release();
        recorder.stop();
        recorder.release();
        long stopTime = System.currentTimeMillis();
        System.out.println("" + stopTime);
    } catch (Exception e) {
        e.printStackTrace();
    }
    return newFilePath;
}
This actually gets every frame from an input video and creates another .mp4 video. You could rewrite it to record every frame in your folder instead (or anything like that).
