I have a problem with video processing on Android.
To merge multiple videos, I'm currently using the FFmpeg C++ library through JavaCV.
Here is my code:
protected Void doInBackground(String... params) {
    String firstVideo = params[0];
    String secondVideo = params[1];
    String outPutVideo = params[2];
    try {
        FrameGrabber grabber1 = new FFmpegFrameGrabber(firstVideo);
        grabber1.start();
        FrameGrabber grabber2 = new FFmpegFrameGrabber(secondVideo);
        grabber2.start();
        FrameRecorder recorder2 = new FFmpegFrameRecorder(outPutVideo, grabber2.getImageWidth(),
                grabber2.getImageHeight(), grabber1.getAudioChannels());
        recorder2.setVideoCodec(grabber2.getVideoCodec());
        recorder2.setFrameRate(grabber2.getFrameRate());
        recorder2.setSampleFormat(grabber2.getSampleFormat());
        recorder2.setSampleRate(grabber2.getSampleRate());
        recorder2.setAudioChannels(2);
        recorder2.start();
        Frame frame;
        int j = 0;
        while ((frame = grabber1.grabFrame()) != null) {
            j++;
            recorder2.record(frame);
        }
        while ((frame = grabber2.grabFrame()) != null) {
            recorder2.record(frame);
        }
        recorder2.stop();
        grabber2.stop();
        grabber1.stop();
    } catch (Exception e) {
        e.printStackTrace();
        success = false;
    }
    return null;
}
The first video has no sound, and the second video has an audio track. In the output, the second video's audio starts at the very beginning of the result video instead of where the second video begins.
I have tried and searched for many hours, but cannot find a solution. Please advise if you have experience with this!
Casting video to Chromecast with a queue works fine. My requirement is to play video continuously for hours on the screen. For that, I get a batch of 5 to 10 video URLs from the server; when 2 videos remain, I get a new batch and append it to the queue. The videos are around 40 to 50 seconds long.
Playback continues for about 45 to 60 minutes, not more; then it stops.
I want it to play for hours.
Can anyone help me resolve this issue? Any help would be useful.
Here is my code to play the queue:
public void queuePlay(ArrayList<CastModel> data) {
    ArrayList<MediaQueueItem> queueList = new ArrayList<>();
    for (int i = 0; i < data.size(); i++) {
        MediaMetadata mediaMetadata = new MediaMetadata(MediaMetadata.MEDIA_TYPE_MOVIE);
        mediaMetadata.putString(MediaMetadata.KEY_TITLE, data.get(i).vTitle);
        mediaMetadata.putString(MediaMetadata.KEY_SUBTITLE, data.get(i).vName);
        mediaMetadata.addImage(new WebImage(Uri.parse(data.get(i).vImage)));
        JSONObject extraData = null;
        try {
            extraData = getJsonOfObject(data.get(i));
            if (extraData == null)
                extraData = new JSONObject();
        } catch (Exception e) {
            Log.i(TAG, "queuePlay: exception " + e.toString());
        }
        MediaInfo mediaInfo = new MediaInfo.Builder(data.get(i).vVideo)
                .setStreamType(MediaInfo.STREAM_TYPE_BUFFERED)
                .setContentType("videos/mp4")
                .setMetadata(mediaMetadata)
                .setCustomData(extraData)
                .setStreamDuration(30 * 1000)
                .build();
        MediaQueueItem item = new MediaQueueItem.Builder(mediaInfo).build();
        queueList.add(item);
    }
    MediaQueueItem[] queueArray = new MediaQueueItem[queueList.size()];
    queueArray = queueList.toArray(queueArray);
    remoteMediaClient = sessionManager.getCurrentCastSession().getRemoteMediaClient();
    remoteMediaClient.queueLoad(queueArray, 0, REPEAT_MODE_REPEAT_OFF, null);
    remoteMediaClient.addListener(new RemoteMediaClient.Listener() {
        @Override
        public void onStatusUpdated() {
            try {
                Thread.sleep(1000); // Hold for a while
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
            MediaStatus mMediaStatus = remoteMediaClient.getMediaStatus();
            if (mMediaStatus != null && mMediaStatus.getQueueItems() != null) {
                if (queueItemPlayedPosition < mMediaStatus.getCurrentItemId()) {
                    Log.w(TAG, "onStatusUpdated: Delete video " + queueItemPlayedPosition);
                    updateCastList(false);
                    queueItemPlayedPosition++;
                }
                Log.e(TAG, "onStatusUpdated getCurrentItemId " + remoteMediaClient.getMediaStatus().getCurrentItemId() + " *** onStatusUpdated: getQueueItemCount *** " + mMediaStatus.getQueueItemCount());
            }
        }

        @Override
        public void onMetadataUpdated() {
        }

        @Override
        public void onQueueStatusUpdated() {
        }

        @Override
        public void onPreloadStatusUpdated() {
        }

        @Override
        public void onSendingRemoteMediaRequest() {
        }
    });
}
I haven't played with the Cast SDK a lot, but I found the Autoplay & Queueing APIs, which might provide what you're looking for, as they describe ways to play videos continuously using autoplay.
MediaRecorder has no pause/resume below API level 24.
So one way to do this is:
On the pause event, stop the recorder, which finalizes the recorded file.
On resume, start recording again into another file, and keep doing so until the user presses stop.
Finally, merge all the files.
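The segment bookkeeping described above can be sketched as plain file-name management, independent of MediaRecorder itself (the class and method names here are hypothetical, not part of any Android API):

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical helper: one file per record/resume cycle, so the
// segments can be handed to a merge step after the user presses stop.
class SegmentList {
    private final List<String> segments = new ArrayList<>();
    private int counter = 0;

    // Called on record/resume: point a fresh MediaRecorder at this path.
    public String nextSegmentPath(String baseDir) {
        String path = baseDir + "/segment_" + (counter++) + ".mp4";
        segments.add(path);
        return path;
    }

    // Called after stop: the files to merge, in recording order.
    public String[] segmentsToMerge() {
        return segments.toArray(new String[0]);
    }
}
```

The merge itself is the hard part, which the answers below address.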
Many people have asked this question on SO, but I couldn't find a way to solve it. People talk about creating multiple media files by stopping recording on the pause action and restarting on resume. So my question is: how can we merge/join all those media files programmatically?
Note: in my case the container is MPEG-4 (m4a for audio, mp4 for video).
I tried using SequenceInputStream to merge the InputStreams of the generated recordings, but the result always plays as the first file only.
Code Snippet:
Enumeration<InputStream> enu = Collections.enumeration(inputStreams);
SequenceInputStream sqStream = new SequenceInputStream(enu);
while ((oneByte = sqStream.read(buffer)) != -1) {
    fileOutputStream.write(buffer, 0, oneByte);
}
sqStream.close();
while (enu.hasMoreElements()) {
    InputStream element = enu.nextElement();
    element.close();
}
fileOutputStream.flush();
fileOutputStream.close();
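Note that SequenceInputStream itself does concatenate the bytes correctly; the snag is that MP4/M4A is a structured container, so a player reads only the first file's header and ignores the appended bytes. Byte-level concatenation is only a valid merge for headerless formats such as raw PCM. A self-contained check of the byte-level behavior (class name is illustrative):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.SequenceInputStream;

class ConcatCheck {
    // Byte-concatenate two streams. All input bytes end up in the output,
    // which is why the resulting MP4 file has the right size but still
    // plays as only the first clip: the container metadata isn't merged.
    public static byte[] concat(byte[] a, byte[] b) {
        try {
            SequenceInputStream in = new SequenceInputStream(
                    new ByteArrayInputStream(a), new ByteArrayInputStream(b));
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            byte[] buf = new byte[4096];
            int n;
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n);
            }
            in.close();
            return out.toByteArray();
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }
}
```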
I was able to solve this problem using the mp4parser library. Many thanks to the author of that library :)
Add the dependency below to your Gradle file:
compile 'com.googlecode.mp4parser:isoparser:1.0.2'
The solution is to stop the recorder when the user pauses and start it again on resume, as already mentioned in many other answers on Stack Overflow. Store all the generated audio/video files in an array and use the method below to merge them. The example is taken from the mp4parser library and modified slightly for my needs.
public static boolean mergeMediaFiles(boolean isAudio, String[] sourceFiles, String targetFile) {
    try {
        String mediaKey = isAudio ? "soun" : "vide";
        List<Movie> listMovies = new ArrayList<>();
        for (String filename : sourceFiles) {
            listMovies.add(MovieCreator.build(filename));
        }
        List<Track> listTracks = new LinkedList<>();
        for (Movie movie : listMovies) {
            for (Track track : movie.getTracks()) {
                if (track.getHandler().equals(mediaKey)) {
                    listTracks.add(track);
                }
            }
        }
        Movie outputMovie = new Movie();
        if (!listTracks.isEmpty()) {
            outputMovie.addTrack(new AppendTrack(listTracks.toArray(new Track[listTracks.size()])));
        }
        Container container = new DefaultMp4Builder().build(outputMovie);
        FileChannel fileChannel = new RandomAccessFile(targetFile, "rw").getChannel();
        container.writeContainer(fileChannel);
        fileChannel.close();
        return true;
    } catch (IOException e) {
        Log.e(LOG_TAG, "Error merging media files. exception: " + e.getMessage());
        return false;
    }
}
Pass isAudio as true for audio files and false for video files.
Another solution is to merge with FFmpeg.
Add this line to your app's build.gradle:
implementation 'com.writingminds:FFmpegAndroid:0.3.2'
And use the code below to merge the videos.
String textFile = "";
try {
    textFile = getTextFile().getAbsolutePath();
} catch (IOException e) {
    e.printStackTrace();
}
String[] cmd = new String[]{
        "-y",
        "-f", "concat",
        "-safe", "0",
        "-i", textFile,
        "-c", "copy",
        "-preset", "ultrafast",
        getVideoFilePath()
};
mergeVideos(cmd);
getTextFile()
private File getTextFile() throws IOException {
    videoFiles = new String[]{firstPath, secondPath, thirdPatch};
    File file = new File(getActivity().getExternalFilesDir(null), System.currentTimeMillis() + "inputFiles.txt");
    FileOutputStream out = new FileOutputStream(file, false);
    PrintWriter writer = new PrintWriter(out);
    StringBuilder builder = new StringBuilder();
    for (String path : videoFiles) {
        if (path != null) {
            builder.append("file ");
            builder.append("'");
            builder.append(path);
            builder.append("'\n");
        }
    }
    builder.deleteCharAt(builder.length() - 1);
    String text = builder.toString();
    writer.print(text);
    writer.close();
    out.close();
    return file;
}
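The list file written by getTextFile() follows FFmpeg's concat demuxer format: one file '<path>' line per clip. The formatting step can be isolated as a standalone sketch (class name and paths are hypothetical):

```java
class ConcatListFormat {
    // Build the concat-demuxer list content: one "file '<path>'" line
    // per clip, skipping nulls, with no trailing newline.
    public static String buildList(String[] paths) {
        StringBuilder builder = new StringBuilder();
        for (String path : paths) {
            if (path != null) {
                builder.append("file '").append(path).append("'\n");
            }
        }
        if (builder.length() > 0) {
            builder.deleteCharAt(builder.length() - 1); // drop trailing newline
        }
        return builder.toString();
    }
}
```

Because the command above uses -safe 0, absolute paths like these are accepted by the demuxer.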
getVideoFilePath()
private String getVideoFilePath() {
    final File dir = getActivity().getExternalFilesDir(null);
    return (dir == null ? "" : (dir.getAbsolutePath() + "/"))
            + System.currentTimeMillis() + ".mp4";
}
mergeVideos()
private void mergeVideos(String[] cmd) {
    FFmpeg ffmpeg = FFmpeg.getInstance(getActivity());
    try {
        ffmpeg.execute(cmd, new ExecuteBinaryResponseHandler() {
            @Override
            public void onStart() {
                startTime = System.currentTimeMillis();
            }

            @Override
            public void onProgress(String message) {
            }

            @Override
            public void onFailure(String message) {
                Toast.makeText(getActivity(), "Failed " + message, Toast.LENGTH_SHORT).show();
            }

            @Override
            public void onSuccess(String message) {
            }

            @Override
            public void onFinish() {
                Toast.makeText(getActivity(), "Videos are merged", Toast.LENGTH_SHORT).show();
            }
        });
    } catch (FFmpegCommandAlreadyRunningException e) {
        // Handle the case where FFmpeg is already running
    }
}
Run this code before merging
private void checkFfmpegSupport() {
    FFmpeg ffmpeg = FFmpeg.getInstance(this);
    try {
        ffmpeg.loadBinary(new LoadBinaryResponseHandler() {
            @Override
            public void onStart() {
            }

            @Override
            public void onFailure() {
                Toast.makeText(VouchActivity.this, "FFmpeg not supported on this device :(", Toast.LENGTH_SHORT).show();
            }

            @Override
            public void onSuccess() {
            }

            @Override
            public void onFinish() {
            }
        });
    } catch (FFmpegNotSupportedException e) {
        // Handle the case where FFmpeg is not supported by the device
    }
}
This code makes the Android device act as a USB host for the hardware module. It also reads data from the hardware correctly in MainActivity. However, as soon as I moved it to another activity, everything still works, but the data being read is incorrect.
For instance, I'm trying to write the data that is read into a file. The first activity only takes a filename as input and has a button that sends it to the second activity. The code below is the second activity.
public class Temp extends Activity {
    private FileOutputStream outputStream;
    public static D2xxManager ftD2xx = null;
    Handler mHandler = new Handler();
    FT_Device ftDev = null;
    int devCount = 0;
    UsbDevice device = null;
    TextView Text = null;
    String temp = null;
    _4DPoint P = null;
    int rd = 0;
    byte[] byt = null;
    byte[] Fdata = null;
    String outp = "";
    String From_Serial = "";
    int Min = -1;
    String fileName;
    Context c;

    final Runnable updateResults = new Runnable() {
        @Override
        public void run() {
            Text.setText("" + Min + '\n' + temp);
        }
    };

    public void getData() {
        try {
            outputStream = openFileOutput(fileName, Context.MODE_PRIVATE);
            byt = new byte[256];
            Toast.makeText(getBaseContext(), "start " + fileName, Toast.LENGTH_LONG).show();
            Text = (TextView) findViewById(R.id.test2);
            device = (UsbDevice) getIntent().getParcelableExtra("USB");
            ftD2xx = D2xxManager.getInstance(c);
            ftD2xx.addUsbDevice(device);
            devCount = ftD2xx.createDeviceInfoList(c);
            if (devCount > 0) {
                ftDev = ftD2xx.openByUsbDevice(c, device);
            }
            if (ftDev.isOpen()) {
                ftDev.setBitMode((byte) 0, D2xxManager.FT_BITMODE_RESET);
                ftDev.setBaudRate(38400);
                ftDev.setDataCharacteristics(D2xxManager.FT_DATA_BITS_8, D2xxManager.FT_STOP_BITS_1, D2xxManager.FT_PARITY_NONE);
                ftDev.setFlowControl(D2xxManager.FT_FLOW_NONE, (byte) 0x0b, (byte) 0x0d);
                Thread t = new Thread() {
                    public void run() {
                        int i;
                        while (true) {
                            rd = 0;
                            while (rd == 0) {
                                rd = ftDev.read(byt, 14);
                            }
                            for (i = 0; i < rd; i++)
                                outp += (char) byt[i];
                            From_Serial = new String(outp);
                            P = new _4DPoint(From_Serial);
                            temp = String.format("%s: %f %f %f %f %d\n", From_Serial, P.R, P.G, P.B, P.L, P.camera);
                            try {
                                outputStream.write(temp.getBytes());
                            } catch (IOException e) {
                                e.printStackTrace();
                            }
                            outp = "";
                            mHandler.post(updateResults);
                        }
                    }
                };
                t.start();
            }
        } catch (FileNotFoundException e1) {
            e1.printStackTrace();
        } catch (D2xxException e1) {
            e1.printStackTrace();
        }
    }

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_color);
        // Show the Up button in the action bar.
        setupActionBar();
        Intent intent = getIntent();
        fileName = intent.getStringExtra("File Name");
        c = this;
        getData();
    }
The setup should be fine, since it does read data from the hardware, but the data read is incorrect.
Also, I'm wondering why we need to create a new thread for reading data. I tried not creating a new thread and it didn't work well, but I still have no idea why. I tried to contact the person who wrote the reading code, but got no reply.
Any help would be really appreciated :)
You state that you do receive data; therefore I think you should look at your ftDev settings. For example, try ftDev.setBaudRate(115200) (this worked for me), or play with your other ftDev settings a little.
The settings I use in my program are:
int baudRate = 115200;
byte stopBit = 1; /*1:1stop bits, 2:2 stop bits*/
byte dataBit = 8; /*8:8bit, 7: 7bit*/
byte parity = 0; /* 0: none, 1: odd, 2: even, 3: mark, 4: space*/
byte flowControl = 1; /*0:none, 1: flow control(CTS,RTS)*/
If this doesn't work, it is wise to first check the data communication with a computer program, or to analyse the incoming 'wrong' data.
Requirement:
I want to develop an app that records video with a pause and resume feature.
What I have tried:
I have developed the app up to the point of recording video using a SurfaceView.
Already researched:
I have searched many sites, but so far I can't find a solution. I know there is no default option in Android to pause and resume video recording, and that it can be achieved by merging the videos.
What I need:
Please point me to any external plugin available for this, guide me on how to achieve it if you already have, and share any resource on how to merge videos. I have searched, but have not seen a proper resource; please share anything you find.
Finally I found the answer :)
I researched ffmpeg; it seems quite involved, and after some more days digging into it I couldn't find proper resources, so I tried the mp4parser library and successfully completed my requirement.
Code for merging multiple videos:
public class MergeVide extends AsyncTask<String, Integer, String> {
    @Override
    protected void onPreExecute() {
        progressDialog = ProgressDialog.show(Video.this,
                "Preparing for upload", "Please wait...", true);
        // do initialization of required objects here
    }

    @Override
    protected String doInBackground(String... params) {
        try {
            String[] paths = new String[count];
            Movie[] inMovies = new Movie[count];
            for (int i = 0; i < count; i++) {
                paths[i] = path + filename + String.valueOf(i + 1) + ".mp4";
                inMovies[i] = MovieCreator.build(new FileInputStream(paths[i]).getChannel());
            }
            List<Track> videoTracks = new LinkedList<Track>();
            List<Track> audioTracks = new LinkedList<Track>();
            for (Movie m : inMovies) {
                for (Track t : m.getTracks()) {
                    if (t.getHandler().equals("soun")) {
                        audioTracks.add(t);
                    }
                    if (t.getHandler().equals("vide")) {
                        videoTracks.add(t);
                    }
                }
            }
            Movie result = new Movie();
            if (audioTracks.size() > 0) {
                result.addTrack(new AppendTrack(audioTracks.toArray(new Track[audioTracks.size()])));
            }
            if (videoTracks.size() > 0) {
                result.addTrack(new AppendTrack(videoTracks.toArray(new Track[videoTracks.size()])));
            }
            BasicContainer out = (BasicContainer) new DefaultMp4Builder().build(result);
            @SuppressWarnings("resource")
            FileChannel fc = new RandomAccessFile(String.format(Environment
                    .getExternalStorageDirectory() + "/wishbyvideo.mp4"), "rw").getChannel();
            out.writeContainer(fc);
            fc.close();
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        }
        String mFileName = Environment.getExternalStorageDirectory().getAbsolutePath();
        mFileName += "/wishbyvideo.mp4";
        filename = mFileName;
        return mFileName;
    }

    @Override
    protected void onPostExecute(String value) {
        super.onPostExecute(value);
        progressDialog.dismiss();
        Intent i = new Intent(Video.this, VideoUpload.class);
        i.putExtra("videopath", value);
        i.putExtra("id", id);
        i.putExtra("name", name);
        i.putExtra("photo", photo);
        startActivity(i);
        finish();
    }
}
Here, count is simply the number of video files.
The code above merges the videos and sends the final path to another activity, where I chose to preview the result.
Before using the code above, make sure you include the mp4parser library.
File pcmFile = new File(mediaPath, TEMP_PCM_FILE_NAME);
if (pcmFile.exists())
    pcmFile.delete();
int total = 0;
mAudioRecordInstance.startRecording();
try {
    DataOutputStream pcmDataOutputStream = new DataOutputStream(
            new BufferedOutputStream(new FileOutputStream(pcmFile)));
    while (isRecording) {
        mAudioRecordInstance.read(mBuffer, 0, mBufferSize);
        for (int i = 0; i < mBuffer.length; i++) {
            Log.d("Capture", "PCM Write:[" + i + "]:" + mBuffer[i]);
            pcmDataOutputStream.writeShort(mBuffer[i]);
            total++;
        }
    }
    pcmDataOutputStream.close();
} catch (IOException e) {
    runOnUiThread(new Runnable() {
        @Override
        public void run() {
            DialogCodes e = DialogCodes.ERROR_CREATING_FILE;
            showDialog(e.getValue());
            actionButton.performClick();
        }
    });
    return;
} catch (OutOfMemoryError om) {
    runOnUiThread(new Runnable() {
        @Override
        public void run() {
            DialogCodes e = DialogCodes.OUT_OF_MEMORY;
            showDialog(e.getValue());
            System.gc();
            actionButton.performClick();
        }
    });
}
Log.d("Capture", "Stopping recording!!!");
mAudioRecordInstance.stop();
Log.d("Capture", "Processing starts");
short[] shortBuffer = new short[total];
try {
    DataInputStream pcmDataInputStream = new DataInputStream(
            new BufferedInputStream(new FileInputStream(pcmFile)));
    for (int j = 0; pcmDataInputStream.available() > 0; j++) {
        shortBuffer[j] = pcmDataInputStream.readShort();
        Log.d("Capture", "PCM Read:[" + j + "]:" + shortBuffer[j]);
    }
    outStream.write(Utilities.shortToBytes(shortBuffer));
    pcmDataInputStream.close();
} catch (IOException e) {
    runOnUiThread(new Runnable() {
        @Override
        public void run() {
            DialogCodes e = DialogCodes.ERROR_CREATING_FILE;
            showDialog(e.getValue());
            outFile = null;
            actionButton.performClick();
        }
    });
    return;
} catch (OutOfMemoryError om) {
    runOnUiThread(new Runnable() {
        @Override
        public void run() {
            DialogCodes e = DialogCodes.OUT_OF_MEMORY;
            showDialog(e.getValue());
            System.gc();
            actionButton.performClick();
        }
    });
}
I am writing the PCM data to a temp file so that I can process it later without losing anything recordable. Initially I tried processing inside the recording loop, but the recorded duration didn't match the actual duration. Now what I want is to read shorts from the PCM file and write them to a WAV file with a header (and later process the short data, once this issue is fixed). If I open the resulting file in Audacity, it comes out empty. If I write directly to the WAV file instead of going through the temp PCM file, it works fine.
The other issue is that I use a Handler to run a thread that updates the recording duration and the VU meter view. The VU meter view displays the mBuffer data and is invalidated every second. No synchronization is used on the data, but it still affects the recorded duration; sometimes it comes out to be three times the original duration.
My questions are: (1) Why does reading and writing the PCM data via a temp file result in an empty WAV file? (2) Why does reading the unsynchronized short buffer (a member variable) from a Handler-managed thread add duration to the WAV data, as happens when I write the recorded buffer to the WAV file directly?
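One detail worth checking in the code above is byte order when converting the recovered short[] into the WAV body: Java's DataOutputStream.writeShort is big-endian, while WAV stores 16-bit PCM little-endian. Utilities.shortToBytes is the question's own helper; the sketch below is only an assumed little-endian equivalent, not its actual implementation:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

class PcmBytes {
    // Convert 16-bit PCM samples to the little-endian byte layout
    // that the WAV "data" chunk expects.
    public static byte[] shortsToLittleEndian(short[] samples) {
        ByteBuffer buf = ByteBuffer.allocate(samples.length * 2)
                .order(ByteOrder.LITTLE_ENDIAN);
        for (short s : samples) {
            buf.putShort(s); // low byte first under LITTLE_ENDIAN
        }
        return buf.array();
    }
}
```

The temp-file round trip itself is symmetric (writeShort/readShort are both big-endian), so the byte order only matters at the final conversion step.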
It was all in the header:
http://gitorious.org/android-eeepc/base/blobs/48276ab989a4d775961ce30a43635a317052672a/core/java/android/speech/srec/WaveHeader.java
Once I fixed that, everything was fine.
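For reference, the standard 44-byte RIFF/WAVE header for 16-bit PCM has the layout sketched below. This is a hedged standalone example of the format, not the linked WaveHeader class's API:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

class WavHeader {
    // Build a 44-byte PCM WAV header; dataLength is the PCM payload in bytes.
    // All multi-byte fields are little-endian.
    public static byte[] build(int sampleRate, short channels, int dataLength) {
        short bitsPerSample = 16;
        int byteRate = sampleRate * channels * bitsPerSample / 8;
        short blockAlign = (short) (channels * bitsPerSample / 8);
        ByteBuffer b = ByteBuffer.allocate(44).order(ByteOrder.LITTLE_ENDIAN);
        b.put("RIFF".getBytes());   // chunk id
        b.putInt(36 + dataLength);  // chunk size = rest of file after this field
        b.put("WAVE".getBytes());   // format
        b.put("fmt ".getBytes());   // subchunk 1 id (note trailing space)
        b.putInt(16);               // subchunk 1 size for PCM
        b.putShort((short) 1);      // audio format: 1 = linear PCM
        b.putShort(channels);
        b.putInt(sampleRate);
        b.putInt(byteRate);
        b.putShort(blockAlign);
        b.putShort(bitsPerSample);
        b.put("data".getBytes());   // subchunk 2 id
        b.putInt(dataLength);       // PCM payload size
        return b.array();
    }
}
```

Writing these 44 bytes before the little-endian PCM body is what makes players like Audacity recognize the file.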