I'm trying to merge/join two audio files, but the merged file contains only the first file's audio. I don't know what the issue is; I think the problem is with the headers, but I don't know how to fix it.
E.g.:
f1 = 4 kb
f2 = 3 kb
finalFile = 7 kb
The size suggests the merge worked, but I don't know why the second file's audio is missing.
Here is my code.
public static void meargeAudio(List<File> filesToMearge)
{
    while (filesToMearge.size() != 1) {
        try {
            // first source file (e.g. /storage/emulated/0/Audio Notes/1455194356500.mp3)
            FileInputStream fistream1 = new FileInputStream(filesToMearge.get(0).getPath());
            // second source file
            FileInputStream fistream2 = new FileInputStream(filesToMearge.get(1).getPath());
            SequenceInputStream sistream = new SequenceInputStream(fistream1, fistream2);
            FileOutputStream fostream = new FileOutputStream(AppConstrants.APP_FOLDER_PATH + "sss.mp3", true);
            int temp;
            while ((temp = sistream.read()) != -1) {
                fostream.write(temp); // write byte-by-byte to the output file
            }
            fostream.close();
            sistream.close();
            fistream1.close();
            fistream2.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
Help me if you can.
Thanks in advance.
You are right about the header problem. The reason your new file is recognized only as the first audio file is that you concatenated both files directly. When MediaPlayer reads the merged file's header (bit rate, length of the audio data, etc.), it recognizes only the first audio file, because that is what it finds first. To properly join two audio files, you need to read both headers, decode both sets of audio data, concatenate the uncompressed audio, recalculate the length for the merged data, re-compress it (as MP3, for example), and then write it to a file.
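To see why the header matters, a contrast with the simplest possible container helps: for uncompressed PCM WAV (not the MP3 in the question) a byte-level merge only works if you also rewrite the RIFF size fields. Below is a minimal sketch of that idea, assuming canonical 44-byte headers and identical sample formats (java.nio.file needs API 26+ on Android); for compressed formats like MP3 you need the full decode/re-encode route described above:
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;

// Sketch only: assumes two PCM WAV files with identical sample format
// and plain 44-byte headers (no extra chunks such as LIST or fact).
public class WavMergeSketch {
    static void mergeWav(File a, File b, File out) throws IOException {
        byte[] d1 = Files.readAllBytes(a.toPath());
        byte[] d2 = Files.readAllBytes(b.toPath());
        int data1 = d1.length - 44, data2 = d2.length - 44;
        byte[] merged = new byte[44 + data1 + data2];
        System.arraycopy(d1, 0, merged, 0, 44);              // reuse the first header
        System.arraycopy(d1, 44, merged, 44, data1);         // samples of file 1
        System.arraycopy(d2, 44, merged, 44 + data1, data2); // samples of file 2
        writeIntLE(merged, 4, 36 + data1 + data2);           // RIFF chunk size
        writeIntLE(merged, 40, data1 + data2);               // data chunk size
        Files.write(out.toPath(), merged);
    }

    static void writeIntLE(byte[] buf, int pos, int value) {
        for (int i = 0; i < 4; i++) {
            buf[pos + i] = (byte) (value >>> (8 * i));
        }
    }
}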
Solution:
mp4parser can be used to append audio files; see
https://github.com/sannies/mp4parser/issues/155
It also requires:
aspectjrt-1.7.3.jar
My Working Code:
public static void mergAudio(List<File> filesToMearge) {
    try {
        while (filesToMearge.size() != 1) {
            String audio1 = filesToMearge.get(0).getPath();
            String audio2 = filesToMearge.get(1).getPath();
            String outputVideo = filesToMearge.get(0).getPath(); // destination file
            Movie[] inMovies = new Movie[]{
                    MovieCreator.build(audio1),
                    MovieCreator.build(audio2),
            };
            List<Track> audioTracks = new LinkedList<Track>();
            for (Movie m : inMovies) {
                for (Track t : m.getTracks()) {
                    if (t.getHandler().equals("soun")) {
                        audioTracks.add(t);
                    }
                }
            }
            // the sources are already parsed into memory, so the files can go
            File file1 = new File(filesToMearge.get(0).getPath());
            boolean deleted = file1.delete();
            File file2 = new File(filesToMearge.get(1).getPath());
            boolean deleted1 = file2.delete();
            Movie result = new Movie();
            if (audioTracks.size() > 0) {
                result.addTrack(new AppendTrack(audioTracks.toArray(new Track[audioTracks.size()])));
            }
            Container out = new DefaultMp4Builder().build(result);
            FileOutputStream fos = new FileOutputStream(outputVideo);
            out.writeContainer(fos.getChannel());
            fos.close();
            // replace the two source entries with the merged file and loop again
            filesToMearge.add(0, new File(filesToMearge.get(0).getPath()));
            filesToMearge.remove(1);
            filesToMearge.remove(1);
        }
    } catch (IOException e) {
        e.printStackTrace();
    }
}
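For completeness, a hypothetical call site (the paths are made up, and note that MovieCreator parses MP4 containers, so the clips should really be MP4/M4A files even if they carry an .mp3 extension):
List<File> clips = new ArrayList<>(Arrays.asList(
        new File("/storage/emulated/0/Audio Notes/clip1.mp4"),
        new File("/storage/emulated/0/Audio Notes/clip2.mp4"),
        new File("/storage/emulated/0/Audio Notes/clip3.mp4")));
mergAudio(clips); // keeps folding pairs until one merged file remains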
In my application I'd like to play videos from a URL or from files stored locally on the Android device, using the Gluon Mobile VideoService.
This works fine for URLs, but (in my environment) it does not work for files stored on the device,
e.g. /sdcard/DCIM/tmp/1834.mp4.
The LogCat shows
V/MediaPlayerService(2431): Create new media retriever from pid 10868
W/AndroidVideoService(10868): Invalid video file: /sdcard/DCIM/tmp/1834.mp4
V/MediaPlayer(10868): resetDrmState: mDrmInfo=null mDrmProvisioningThread=null mPrepareDrmInProgress=false mActiveDrmScheme=false
I can play the file in that location with the standalone Android Video-Player.
I also tried to copy the file programmatically to the directories delivered by
StorageService.getPublicStorage("Movies") (->
/storage/emulated/0/Movies) or
StorageService.getPrivateStorage() (-> /data/user/0/mypackage/files)
plus "/tmp/" + "1834.mp4"
and play it via the application from there, but the LogCat message again shows
W/AndroidVideoService(...): Invalid video file ...
The javadoc of VideoService.getPlaylist() says
The media files (video and audio) can either be a valid URL or they
can be provided in the resources folder.
So is it not possible to play media files stored locally on an Android device?
Here are the relevant parts of my build.gradle file:
buildscript {
    // ...
    dependencies {
        classpath 'org.javafxports:jfxmobile-plugin:1.3.16'
    }
}
dependencies {
    compile 'com.gluonhq:charm:5.0.1'
    // ...
}
// ...
jfxmobile {
    downConfig {
        version = '3.8.0'
        // Do not edit the line below. Use Gluon Mobile Settings in your project context menu instead
        plugins 'display', 'lifecycle', 'statusbar', 'storage', 'video'
    }
    // ...
}
My phone has Android 8.1.
Added source code for testing purposes:
int i = 0;
try {
    File privateAppStorage = Services.get(StorageService.class)
            .flatMap(StorageService::getPrivateStorage)
            .orElseThrow(() -> new FileNotFoundException("Could not access private storage."));
    String outputfilename = privateAppStorage + "/1834.mp4";
    if (i == 0) { // to skip copying, set i != 0 on the 2nd run
        // copy the video into private storage
        File input = new File("/sdcard/DCIM/tmp/1834.mp4");
        int li = (int) input.length();
        byte[] bFile = new byte[li];
        FileInputStream fis = new FileInputStream(input);
        fis.read(bFile);
        fis.close();
        File output = new File(outputfilename);
        FileOutputStream fos = new FileOutputStream(output);
        fos.write(bFile);
        fos.flush();
        fos.close();
        li = (int) output.length();
        /* test copying routine
        File testoutput = new File("/sdcard/DCIM/tmp/1834_2.mp4");
        FileOutputStream tfos = new FileOutputStream(testoutput);
        tfos.write(bFile);
        tfos.flush();
        tfos.close();
        li = (int) testoutput.length();
        */ // end test copying routine
    }
    // play the video
    Optional<VideoService> service = Services.get(VideoService.class);
    if (service.isPresent()) {
        VideoService videoService = service.get();
        videoService.setControlsVisible(true);
        videoService.setFullScreen(true);
        Status status = videoService.statusProperty().get();
        ObservableList<String> sl = videoService.getPlaylist();
        if (sl.size() > 0)
            sl.set(0, outputfilename);
        else
            videoService.getPlaylist().add(outputfilename);
        videoService.show();
    }
} catch (IOException e) {
    e.printStackTrace();
}
Edit 2019-06-05
More debug information regarding the DefaultVideoService internal copying process:
I/DefaultVideoService(10544): Copying video file: /data/user/0/mypackage/files/1834.mp4, from resources to /data/user/0/mypackage/files/assets/_data_user_0_com.hp_files_1834.mp4
I/DefaultVideoService(10544): Copying video file /data/user/0/mypackage/files/1834.mp4 finished with result: failed
I/AndroidVideoService(10544): Creating new MediaPlayer for /data/user/0/mypackage/files/1834.mp4
Debugging into DefaultVideoService.copyFile(...) I found that the statement DefaultVideoService.class.getResourceAsStream(pathIni) returns null, and thus the DefaultVideoService internal copying fails.
Why it returns null I do not know, since I do not have the matching java.lang.Class source.
Here is a workaround that I found for my environment.
Let's say the file to be played is
String filename = "/sdcard/DCIM/tmp/1834.mp4";
Before calling
VideoService.getPlaylist().add(filename);
copy the file with the statements:
File privateAppStorage = Services.get(StorageService.class)
        .flatMap(StorageService::getPrivateStorage)
        .orElseThrow(() -> new FileNotFoundException("Could not access private storage."));
File assets = new File(privateAppStorage.getAbsolutePath() + "/assets");
boolean assets_exists = assets.exists();
if (!assets.exists()) {
    assets_exists = assets.mkdir();
}
if (assets_exists && assets.canWrite()) {
    File input = new File(filename);
    int li = (int) input.length();
    byte[] bFile = new byte[li];
    FileInputStream fis = new FileInputStream(input);
    fis.read(bFile);
    fis.close();
    // mirror DefaultVideoService's naming: path separators become underscores
    File copiedToAssets = new File(assets.getAbsolutePath() + "/" + filename.replaceAll("/", "_"));
    FileOutputStream fos = new FileOutputStream(copiedToAssets);
    fos.write(bFile);
    fos.flush();
    fos.close();
}
This bypasses DefaultVideoService.class.getResourceAsStream(pathIni) mentioned in my question and the video can be played.
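Wrapped up as a reusable method, the workaround might look like this hypothetical helper (the method name is mine, it uses the same Gluon APIs as above, and it assumes the service keeps mapping a playlist entry to the private-storage assets copy with '/' replaced by '_'):
// Hypothetical helper: pre-copies a local video into the private "assets"
// folder so AndroidVideoService can find it, then returns the playlist entry.
private static String prepareLocalVideo(String filename) throws IOException {
    File privateAppStorage = Services.get(StorageService.class)
            .flatMap(StorageService::getPrivateStorage)
            .orElseThrow(() -> new FileNotFoundException("Could not access private storage."));
    File assets = new File(privateAppStorage, "assets");
    if (!assets.exists() && !assets.mkdir()) {
        throw new IOException("Could not create " + assets.getAbsolutePath());
    }
    File copied = new File(assets, filename.replaceAll("/", "_"));
    try (FileInputStream in = new FileInputStream(filename);
         FileOutputStream out = new FileOutputStream(copied)) {
        byte[] buf = new byte[8192];
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n); // stream copy avoids holding the whole file in memory
        }
    }
    return filename; // the playlist still gets the original path
}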
I am trying to merge an audio and a video file. The audio and video files are the same length, but after the merge completes, audio plays in the output video for only 3-4 seconds, while both input files are 14 seconds long. Please help me understand why the audio does not play completely in the output file.
String OutputPath = path + outputVideo;
try {
    FrameGrabber grabber1 = new FFmpegFrameGrabber(videoFile);
    FrameGrabber grabber2 = new FFmpegFrameGrabber(audioFile);
    grabber1.start();
    grabber2.start();
    FrameRecorder recorder = new FFmpegFrameRecorder(OutputPath,
            grabber1.getImageWidth(), grabber1.getImageHeight(), 2);
    recorder.setFormat(grabber1.getFormat());
    recorder.setVideoQuality(1);
    recorder.setFrameRate(grabber1.getFrameRate());
    recorder.setSampleRate(grabber2.getSampleRate());
    recorder.start();
    Frame frame1, frame2;
    while ((frame1 = grabber1.grabFrame()) != null &&
            (frame2 = grabber2.grabFrame()) != null) {
        recorder.record(frame1);
        recorder.record(frame2);
    }
    recorder.stop();
    grabber1.stop();
    grabber2.stop();
} catch (FrameGrabber.Exception e) {
    e.printStackTrace();
} catch (FrameRecorder.Exception e) {
    e.printStackTrace();
}
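One observation, offered as a hedged suggestion rather than a confirmed fix: the combined while condition stops as soon as either grabber runs dry, and pairing exactly one audio frame with each video frame does not keep streams of different frame rates in sync. A sketch that drains each stream separately and lets the muxer interleave (untested):
Frame frame;
// record every video frame first...
while ((frame = grabber1.grabFrame()) != null) {
    recorder.record(frame);
}
// ...then every audio frame; neither stream gets cut short
while ((frame = grabber2.grabFrame()) != null) {
    recorder.record(frame);
}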
I am creating a video from images via FFMPEG and I am able to get the video from the images. I am also making use of JavaCV to merge two videos, and I am able to join videos using JavaCV without any issues, provided both videos were taken via camera, i.e., videos actually recorded with the mobile camera.
Issue that I'm facing:
I am not able to merge the video generated from FFMPEG using the images with the video the user has chosen, which will usually be a video that was not generated from images but recorded with the mobile camera.
Code:
Code to generate the video from images:
FFmpegFrameRecorder recorder = new FFmpegFrameRecorder(path + "/" + "dec16.mp4", 800, 400);
try {
    recorder.setVideoCodec(avcodec.AV_CODEC_ID_MPEG4);
    //recorder.setVideoCodec(avcodec.AV_CODEC_ID_H264);
    recorder.setVideoCodecName("H264");
    recorder.setVideoOption("preset", "ultrafast");
    recorder.setFormat("mp4");
    recorder.setFrameRate(frameRate);
    recorder.setVideoBitrate(60);
    recorder.setPixelFormat(avutil.AV_PIX_FMT_YUV420P);
    startTime = System.currentTimeMillis();
    recorder.start();
    for (int j = 0; j < MomentsGetStarted.chosen_images_uri.size(); j++) {
        if (MomentsGetStarted.chosen_images_uri.get(j).video_id == 0) {
            chosen_images.add(MomentsGetStarted.chosen_images_uri.get(j));
        }
    }
    for (int i = 0; i < chosen_images.size(); i++) {
        opencv_core.IplImage image = cvLoadImage(chosen_images.get(i).sdcardPath);
        long t = 3000 * (System.currentTimeMillis() - startTime);
        if (t > recorder.getTimestamp()) {
            recorder.setTimestamp(t);
            recorder.record(image);
        }
    }
    recorder.stop();
} catch (Exception e) {
    e.printStackTrace();
}
Code to merge videos:
int count = file_path.size();
System.out.println("final_joined_list size " + file_path.size());
if (file_path.size() != 1) {
    try {
        Movie[] inMovies = new Movie[count];
        mediaStorageDir = new File(Environment.getExternalStorageDirectory() + "/Pictures");
        for (int i = 0; i < count; i++) {
            File file = new File(file_path.get(i));
            System.out.println("fileeeeeeeeeeeeeeee " + file);
            System.out.println("file exists!!!!!!!!!!");
            FileInputStream fis = new FileInputStream(file);
            FileChannel fc = fis.getChannel();
            inMovies[i] = MovieCreator.build(fc);
            fis.close();
            fc.close();
        }
        List<Track> videoTracks = new LinkedList<Track>();
        List<Track> audioTracks = new LinkedList<Track>();
        Log.d("Movies length", "isss " + inMovies.length);
        if (inMovies.length != 0) {
            for (Movie m : inMovies) {
                for (Track t : m.getTracks()) {
                    if (t.getHandler().equals("soun")) {
                        audioTracks.add(t);
                    }
                    if (t.getHandler().equals("vide")) {
                        videoTracks.add(t);
                    }
                }
            }
        }
        Movie result = new Movie();
        System.out.println("audio and video tracks : "
                + audioTracks.size() + " , " + videoTracks.size());
        if (audioTracks.size() > 0) {
            result.addTrack(new AppendTrack(audioTracks.toArray(new Track[audioTracks.size()])));
        }
        if (videoTracks.size() > 0) {
            result.addTrack(new AppendTrack(videoTracks.toArray(new Track[videoTracks.size()])));
        }
        IsoFile out = null;
        try {
            out = (IsoFile) new DefaultMp4Builder().build(result);
        } catch (Exception e) {
            e.printStackTrace();
        }
        long timestamp = new Date().getTime();
        String timestampS = "" + timestamp;
        File storagePath = new File(mediaStorageDir + File.separator);
        storagePath.mkdirs();
        File myMovie = new File(storagePath, String.format("%s.mp4", timestampS));
        FileOutputStream fos = new FileOutputStream(myMovie);
        FileChannel fco = fos.getChannel();
        fco.position(0);
        out.getBox(fco);
        fco.close();
        fos.close();
    } catch (FileNotFoundException e) {
        e.printStackTrace();
    } catch (IOException e) {
        e.printStackTrace();
    }
    String mFileName = Environment.getExternalStorageDirectory().getAbsolutePath();
    // mFileName += "/output.mp4";
    File sdCardRoot = Environment.getExternalStorageDirectory();
    File yourDir = new File(mediaStorageDir + File.separator);
    for (File f : yourDir.listFiles()) {
        if (f.isFile())
            name = f.getName();
        // make something with the name
    }
    mFileName = mediaStorageDir.getPath() + File.separator + "output-%s.mp4";
    System.out.println("final filename : " + mediaStorageDir.getPath() + File.separator
            + "output-%s.mp4" + " names of files : " + name);
    single_video = false;
    return name;
} else {
    single_video = true;
    name = file_path.get(0);
    return name;
}
Error:
The error I am facing while trying to merge the video generated from images with a normal video is
12-15 12:26:06.155 26022-26111/? W/System.err: java.io.IOException: Cannot append com.googlecode.mp4parser.authoring.Mp4TrackImpl#45417c38 to com.googlecode.mp4parser.authoring.Mp4TrackImpl#44ffac60 since their Sample Description Boxes differ
12-15 12:26:06.155 26022-26111/? W/System.err: at com.googlecode.mp4parser.authoring.tracks.AppendTrack.<init>(AppendTrack.java:48)
Fix that I tried:
Googling advised me to change the codec in JavaCV from avcodec.AV_CODEC_ID_MPEG4 to avcodec.AV_CODEC_ID_H264. But when I did that, I was no longer able to create the video from images at all; it throws the following error:
12-15 12:26:05.840 26022-26089/? W/linker: libavcodec.so has text relocations. This is wasting memory and is a security risk. Please fix.
12-15 12:26:05.975 26022-26089/? W/System.err: com.googlecode.javacv.FrameRecorder$Exception: avcodec_open2() error -1: Could not open video codec.
12-15 12:26:05.975 26022-26089/? W/System.err: at com.googlecode.javacv.FFmpegFrameRecorder.startUnsafe(FFmpegFrameRecorder.java:492)
12-15 12:26:05.975 26022-26089/? W/System.err: at com.googlecode.javacv.FFmpegFrameRecorder.start(FFmpegFrameRecorder.java:267)
What I need:
Creating the video from images is inevitable, and that video will definitely be merged with other videos, which might have any codec format. So I need a way to merge any kind of videos irrespective of their codecs or other parameters. I am trying to keep it simple by just using the JARs and SO files, and I don't want to drive myself crazy with a full-scale implementation of the FFMPEG library. That said, I am ready to look into that library if I have no other way to achieve what I want, but a solid resource with an ALMOST working code sample would be much appreciated. Cheers.
Update:
I looked through the issues mentioned on the OpenCV GitHub but didn't find anything solid there.
OpenCV Issues
You are using an MP4 parser, which means it won't touch the encoding of the videos. This will only work if the two videos share the same encoder settings, like resolution and frame rate, to list two of the most obvious ones.
If you need to merge videos with different codecs or parameters, then you must re-encode them to target a common format and set of parameters, and in that case a simple MP4 parser won't do.
You can achieve this directly with ffmpeg.
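Since the answer stops at that advice, here is a hedged sketch of one way to do the re-encode with the JavaCV classes already used above: decode every input frame and feed it to a single recorder with fixed parameters, so all output shares one codec, size, and rate. The file names and settings below are placeholders, and this is untested:
// Sketch: re-encode several inputs to one common format by decoding and
// re-recording every frame (placeholder paths and parameters).
String[] inputs = {"images_video.mp4", "camera_video.mp4"};
FFmpegFrameRecorder recorder = new FFmpegFrameRecorder("merged.mp4", 800, 400, 2);
recorder.setVideoCodec(avcodec.AV_CODEC_ID_MPEG4); // one codec for everything
recorder.setFormat("mp4");
recorder.setFrameRate(30);
recorder.setSampleRate(44100);
recorder.start();
for (String in : inputs) {
    FFmpegFrameGrabber grabber = new FFmpegFrameGrabber(in);
    grabber.start();
    Frame frame;
    while ((frame = grabber.grabFrame()) != null) {
        recorder.record(frame); // decoded, then re-encoded with the settings above
    }
    grabber.stop();
}
recorder.stop();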
I am trying to write metadata using mp4parser, but in my code the userDataBox is empty for video captured by Android, while for other videos (I have tested with a downloaded video) it is not empty and I added the metadata successfully. My problem is that video captured by Android has an empty userDataBox. Can anybody help me?
moov.getBoxes(UserDataBox.class).size()
My code is here:
File mediaStorageDir = new File(
        Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_DCIM),
        "MYFOLDER");
File f = new File(mediaStorageDir, "VID.mp4");
if (f.exists()) {
    Toast.makeText(MainActivity.this, " file found", Toast.LENGTH_SHORT).show();
}
try {
    fc = new FileInputStream(f).getChannel();
    isoFile = new IsoFile(fc);
    String str = f.getAbsolutePath();
    MovieBox moov = isoFile.getMovieBox();
    // for (Box box : moov.getBoxes()) {
    //     System.out.println("box" + box);
    // }
    if (moov.getBoxes(UserDataBox.class).size() > 0) {
        UserDataBox udta = moov.getBoxes(UserDataBox.class).get(0);
    } else {
        // empty for videos captured by Android
    }
} catch (IOException e) {
    e.printStackTrace();
}
If the udta (User Data Box) is not there, you can create it. You might want to have a look at the ChangeMetadata example on GitHub.
UserDataBox userDataBox;
long sizeBefore;
if ((userDataBox = Path.getPath(tempIsoFile, "/moov/udta")) == null) {
    sizeBefore = 0;
    userDataBox = new UserDataBox();
    tempIsoFile.getMovieBox().addBox(userDataBox);
} else {
    sizeBefore = userDataBox.getSize();
}
MetaBox metaBox;
if ((metaBox = Path.getPath(userDataBox, "meta")) == null) {
    metaBox = new MetaBox();
    userDataBox.addBox(metaBox);
}
XmlBox xmlBox = new XmlBox();
xmlBox.setXml(text);
metaBox.addBox(xmlBox);
Now you have added the boxes. Unfortunately the file contains other boxes that reference the actual video/audio samples. These references are absolute to the beginning of the file and must be adjusted, as you might have inserted data between the start of the file and the actual audio/video samples.
The needsOffsetCorrection(...) method checks whether data was really inserted between the file start and the samples. correctChunkOffsets(...) then does the actual correction of the offsets stored in the stco (ChunkOffsetBox); a sketch of what that correction might look like follows the code below.
long sizeAfter = userDataBox.getSize();
if (needsOffsetCorrection(tempIsoFile)) {
    correctChunkOffsets(tempIsoFile, sizeAfter - sizeBefore);
}
videoFileOutputStream = new FileOutputStream(videoFilePath + "_mod.mp4");
tempIsoFile.getBox(videoFileOutputStream.getChannel());
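For orientation, a rough sketch of what correctChunkOffsets(...) might do internally, modeled on mp4parser's ChangeMetadata example; the path expression and API details here are assumptions, so check them against the real example:
import com.coremedia.iso.IsoFile;
import com.coremedia.iso.boxes.StaticChunkOffsetBox;
import com.googlecode.mp4parser.util.Path;
import java.util.List;

// Sketch: shift every chunk offset in each track's stco box by the number
// of bytes inserted before the media data (assumed API, modeled on the example).
private static void correctChunkOffsets(IsoFile isoFile, long correction) {
    List<StaticChunkOffsetBox> stcoBoxes =
            Path.getPaths(isoFile, "/moov[0]/trak/mdia[0]/minf[0]/stbl[0]/stco[0]");
    for (StaticChunkOffsetBox stco : stcoBoxes) {
        long[] offsets = stco.getChunkOffsets();
        for (int i = 0; i < offsets.length; i++) {
            offsets[i] += correction; // samples moved down by 'correction' bytes
        }
    }
}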
I hope that helps you understand a bit of how MP4 and metadata in MP4 work. Good luck!
I successfully created a video from a sequence of images using JavaCV on Android. Now the problem is how to merge audio into that newly created video. Is this possible on Android with the JavaCV integration?
Here is my code:
String path = "/mnt/sdcard/Video_images";
File folder = new File(path);
File[] listOfFiles = folder.listFiles();
if (listOfFiles.length > 0) {
    iplimage = new opencv_core.IplImage[listOfFiles.length];
    for (int j = 0; j < listOfFiles.length; j++) {
        String files = "";
        if (listOfFiles[j].isFile()) {
            files = listOfFiles[j].getName();
            System.out.println(" j " + j + listOfFiles[j]);
        }
        String[] tokens = files.split("\\.(?=[^\\.]+$)");
        String name = tokens[0];
        Toast.makeText(getBaseContext(), "size" + listOfFiles.length, Toast.LENGTH_SHORT).show();
        iplimage[j] = cvLoadImage("/mnt/sdcard/Video_images/" + name + ".jpg");
    }
}

FFmpegFrameRecorder recorder = new FFmpegFrameRecorder(
        "/mnt/sdcard/Video_images/output" + System.currentTimeMillis() + ".mp4", 200, 150);
try {
    recorder.setVideoCodec(a); // CODEC_ID_MPEG4 / CODEC_ID_MPEG1VIDEO
    recorder.setFrameRate(24);
    recorder.setPixelFormat(PIX_FMT_YUV420P); // PIX_FMT_YUV420P
    recorder.start();
    for (int i = 0; i < iplimage.length; i++) {
        recorder.record(iplimage[i]);
    }
    recorder.stop();
} catch (Exception e) {
    e.printStackTrace();
}
In this code, how do I merge in my audio file?
There is nothing in Android for merging video files. You will need to find some Java JAR that can handle this for you, if you need to do it on the device.
It should be possible with the newest version of javacv, which can be downloaded from here.
Here is an idea of what your code would look like if you wanted to merge the audio with the video while you create your mp4:
FFmpegFrameGrabber grabber1 = new FFmpegFrameGrabber("song.mp3");
grabber1.start();
FFmpegFrameRecorder recorder = new FFmpegFrameRecorder(
        "/mnt/sdcard/Video_images/output" + System.currentTimeMillis() + ".mp4",
        200, 150, grabber1.getAudioChannels());
try {
    recorder.setVideoCodec(a); // CODEC_ID_MPEG4 / CODEC_ID_MPEG1VIDEO
    recorder.setFrameRate(24);
    recorder.setPixelFormat(PIX_FMT_YUV420P); // PIX_FMT_YUV420P
    recorder.start();
    com.googlecode.javacv.Frame frame1 = new com.googlecode.javacv.Frame();
    for (int i = 0; i < iplimage.length; i++) {
        frame1 = grabber1.grabFrame();   // next chunk of audio samples
        recorder.record(frame1);         // write the audio
        recorder.record(iplimage[i]);    // write the video frame
    }
    recorder.stop();
    grabber1.stop();
} catch (Exception e) {
    e.printStackTrace();
}
This may not be 100% correct, but should be a good starting place. Of course you want to make sure that your frame1 isn't null before you try to record it.
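For example, the record loop could guard against the audio running out before the images do (a small sketch under the same assumptions as the code above):
for (int i = 0; i < iplimage.length; i++) {
    com.googlecode.javacv.Frame audioFrame = grabber1.grabFrame();
    if (audioFrame != null) {
        recorder.record(audioFrame); // skip once the audio is exhausted
    }
    recorder.record(iplimage[i]);
}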