How can I pause voice recording in Android?

My aim is to pause and resume a voice recording into a single file.
The Android developer documentation shows that MediaRecorder has no pause option.
Java can merge two audio files programmatically, but that approach does not work on Android: Join two WAV files from Java?
I also tried launching the device's default audio recorder app, which is available on all devices, but a few Samsung devices do not return the recording path:
Intent intent = new Intent(MediaStore.Audio.Media.RECORD_SOUND_ACTION);
startActivityForResult(intent,REQUESTCODE_RECORDING);
Can anyone help with voice recording that supports pause?

http://developer.android.com/reference/android/media/MediaRecorder.html
MediaRecorder does not have pause and resume methods (they were only added in API level 24). You need to use the stop and start methods instead.
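One common way to manage that stop/start workaround is a small bookkeeping helper: each "resume" records into a fresh segment file, and the collected paths are merged once the user finally stops. A minimal, hypothetical sketch in plain Java (the class and method names are made up, and the actual MediaRecorder calls are elided as comments):

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical helper: simulates "pause" by recording each take into its
// own segment file and remembering the paths for a final merge.
class SegmentedRecording {
    private final List<String> segments = new ArrayList<>();
    private int counter = 0;
    private boolean recording = false;

    // On start/resume: begin a fresh segment
    // (configure and call MediaRecorder.start() here).
    String resume(String baseDir) {
        String path = baseDir + "/segment-" + (counter++) + ".mp4";
        segments.add(path);
        recording = true;
        return path;
    }

    // On pause: stop the current recorder (MediaRecorder.stop() here).
    void pause() {
        recording = false;
    }

    // On final stop: the caller merges all collected segments into one file.
    List<String> stop() {
        recording = false;
        return segments;
    }

    boolean isRecording() {
        return recording;
    }
}
```

The merge step itself can then be done with a library such as mp4parser, as described in the answers below.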

I had this requirement in one of my projects. What we did was create a raw file for the recorded data at the start of recording, using AudioRecord; then, on each resume, we append the new data to the same file, like this:
FileOutputStream fos= new FileOutputStream(filename, true);
Here filename is the name of the raw file, and opening the stream in append mode (the second constructor argument) adds the new recording data to it.
When the user stops the recording, we convert the entire raw file to .wav (or another format). Sorry that I can't post the entire code; I hope this gives you a direction to work in.
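The raw-to-.wav conversion mentioned above boils down to prepending a standard 44-byte RIFF/WAVE header to the accumulated PCM data. A sketch, assuming 16-bit PCM and that you know the sample rate and channel count AudioRecord was configured with (the class name WavHeader is made up for illustration):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

// Builds the standard 44-byte WAV header that precedes raw PCM data.
// All multi-byte fields in a WAV header are little-endian.
class WavHeader {
    static byte[] build(int pcmLength, int sampleRate, int channels, int bitsPerSample) {
        int byteRate = sampleRate * channels * bitsPerSample / 8;
        ByteBuffer b = ByteBuffer.allocate(44).order(ByteOrder.LITTLE_ENDIAN);
        b.put("RIFF".getBytes());
        b.putInt(36 + pcmLength);                           // total chunk size
        b.put("WAVE".getBytes());
        b.put("fmt ".getBytes());
        b.putInt(16);                                       // PCM fmt chunk size
        b.putShort((short) 1);                              // audio format 1 = PCM
        b.putShort((short) channels);
        b.putInt(sampleRate);
        b.putInt(byteRate);
        b.putShort((short) (channels * bitsPerSample / 8)); // block align
        b.putShort((short) bitsPerSample);
        b.put("data".getBytes());
        b.putInt(pcmLength);                                // PCM data size
        return b.array();
    }
}
```

Writing this header followed by the contents of the raw file yields a playable .wav.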

You can refer to my answer here if you still have this issue. For API level >= 24, pause and resume methods are available in the Android MediaRecorder class.
For API level < 24
Add below dependency in your gradle file:
compile 'com.googlecode.mp4parser:isoparser:1.0.2'
The solution is to stop the recorder when the user pauses and start again on resume, as already mentioned in many other answers on Stack Overflow. Store all the audio/video files generated in an array and use the method below to merge them. The example is taken from the mp4parser library and modified a little for my needs.
public static boolean mergeMediaFiles(boolean isAudio, String sourceFiles[], String targetFile) {
    try {
        String mediaKey = isAudio ? "soun" : "vide";
        List<Movie> listMovies = new ArrayList<>();
        for (String filename : sourceFiles) {
            listMovies.add(MovieCreator.build(filename));
        }
        List<Track> listTracks = new LinkedList<>();
        for (Movie movie : listMovies) {
            for (Track track : movie.getTracks()) {
                if (track.getHandler().equals(mediaKey)) {
                    listTracks.add(track);
                }
            }
        }
        Movie outputMovie = new Movie();
        if (!listTracks.isEmpty()) {
            outputMovie.addTrack(new AppendTrack(listTracks.toArray(new Track[listTracks.size()])));
        }
        Container container = new DefaultMp4Builder().build(outputMovie);
        FileChannel fileChannel = new RandomAccessFile(targetFile, "rw").getChannel();
        container.writeContainer(fileChannel);
        fileChannel.close();
        return true;
    } catch (IOException e) {
        Log.e(LOG_TAG, "Error merging media files. exception: " + e.getMessage());
        return false;
    }
}
Pass the isAudio flag as true for audio files and false for video files.

You can't do it using the Android API, but you can save several mp4 files and merge them using mp4parser, a powerful library written in Java. Also see my simple recorder with a "pause": https://github.com/lassana/continuous-audiorecorder.

Related

Save video at every 5 second interval while video recording is ON (Android OS)

I want to save the video every 5 seconds while video recording is ON.
I have tried many solutions, but I am facing a glitch: the last saved frame remains in the preview for around 300 ms.
I think the reason is in the MediaRecorder class: "Once a recorder has been stopped, it will need to be completely reconfigured and prepared before being restarted."
Thanks
I think it's impossible to do that with MediaRecorder. A better approach could be encoding video using MediaCodec and storing the encoded content using MediaMuxer.
Grafika is a project on Google's GitHub account which is a dumping ground for Android graphics & media hacks. In this project you can find good examples of using both the MediaCodec and MediaMuxer classes.
I forked the Grafika project and made some modifications to support sequential segmented recording. You can find it here. When you run the application, select the Show + capture camera item from the list, set Output Segment Duration to, for example, 5, and then press the Start recording button.
Please look at the VideoEncoderCore and CameraCaptureActivity source code to see how it works, and in particular how it segments the live camera feed into different files.
"I think the reason is in the MediaRecorder class: 'Once a recorder has been stopped, it will need to be completely reconfigured and prepared before being restarted.'"
You can use multiple mediaMuxer's to encode separate files.
The camera should send data to fill a MediaMuxer object (which itself produces an .mp4 file).
When needed, you can start writing the Camera data to a second (different) MediaMuxer thus automatically creating a second new .mp4 file (on begin usage of the muxer).
The first MediaMuxer can then close and save its file. Your first segment is ready...
If needed, try to study this code for a guide on using Camera with mediaMuxer:
https://bigflake.com/mediacodec/CameraToMpegTest.java.txt
So you have a function that handles things when the 5-second interval has passed? In that function, you could cycle the recording between two muxers, giving one a chance to close its file while the other records the next segment (and then vice versa).
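The alternation just described can be reduced to a tiny bookkeeping rule: segment N is written by muxer slot N % 2. A hypothetical sketch of only that rotation logic in plain Java (the real MediaMuxer start/stop/release calls would hang off each slot change; MuxerRotation is a made-up name):

```java
// Hypothetical rotation bookkeeping for two alternating MediaMuxer slots:
// while one slot finalizes its .mp4, the other receives the camera data.
class MuxerRotation {
    private int segment = 0;

    // Which of the two muxer slots handles the current segment.
    int activeSlot() {
        return segment % 2;
    }

    // Called at each 5-second boundary: the previously active muxer can now
    // stop() and release() its file while the other slot takes over.
    int nextSegment() {
        segment++;
        return activeSlot();
    }

    // Segment counter also drives the output filename, e.g. "Segment-3.mp4".
    String currentFileName() {
        return "Segment-" + segment + ".mp4";
    }
}
```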
Instead of something like below (using MediaRecorder.OutputFormat.MPEG_4):
this.mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
You will instead create a new muxer (with MUXER_OUTPUT_MPEG_4):
//# create a new File to save into
File outputFile = new File(OUTPUT_FILENAME_DIR, "/yourFolder/Segment" + "-" + mySegmentNum + ".mp4");
String outputPath = outputFile.toString();
int format = MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4;
try { mMuxer = new MediaMuxer(outputPath, format); }
catch (IOException e) { Log.e(TAG, e.getLocalizedMessage()); }
And you stop a muxer with:
mMuxer1.stop(); mMuxer1.release();
PS:
Another option is to use Threads to run multiple MediaRecorders. It might help your situation. See the Android Background Process guide.

Fastest way to check if a video file has the following metadata?

There are a number of ways to check a video file's metadata: using FFmpegMediaMetadataRetriever (slow but reliable) or the native MediaMetadataRetriever (slow and not reliable).
A number of questions on SO have been answered for the same purpose of getting metadata using FFmpeg or the native media APIs, like Q1, Q2, Q3, but they do not solve my problem.
My Problem:
Get the following metadata from a video file in an Android directory:
Does the video have sound/audio or not?
Creation date and time
Thumbnail of the video file
Kindly let me know if you have any suggestions; code samples would be a big help.
When I wanted to check whether a video has audio or not, I created this method. The method returns true if the video has audio, otherwise false.
You just pass the Context and the Uri of your video file:
private boolean setHasAudioOrNot(JoinVideoActivity activity, Uri uri) {
    MediaMetadataRetriever retriever = new MediaMetadataRetriever();
    retriever.setDataSource(activity.getApplicationContext(), uri);
    // Key 16 is MediaMetadataRetriever.METADATA_KEY_HAS_AUDIO
    String hasAudio = retriever.extractMetadata(MediaMetadataRetriever.METADATA_KEY_HAS_AUDIO);
    Log.e("Command", "audiohas?? " + hasAudio);
    retriever.release();
    return hasAudio != null;
}
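For the creation date and time part of the question, MediaMetadataRetriever's METADATA_KEY_DATE typically returns a compact UTC string such as "20180212T100000.000Z", though the exact format can vary by device. A sketch of parsing that string in plain Java (VideoDateParser is a made-up name, and the pattern is an assumption to verify against your devices):

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.Locale;
import java.util.TimeZone;

// Parses the compact UTC date string commonly returned by
// MediaMetadataRetriever.METADATA_KEY_DATE, e.g. "20180212T100000.000Z".
class VideoDateParser {
    static Date parse(String raw) throws ParseException {
        SimpleDateFormat fmt =
                new SimpleDateFormat("yyyyMMdd'T'HHmmss.SSS'Z'", Locale.US);
        fmt.setTimeZone(TimeZone.getTimeZone("UTC"));
        return fmt.parse(raw);
    }
}
```

For the thumbnail, retriever.getFrameAtTime() on the same MediaMetadataRetriever instance returns a Bitmap of a representative frame.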

How To: Android OpenCV VideoCapture with File

I am trying to pass a video to the OpenCV VideoCapture class. However when I call the VideoCapture.isOpened() method it always returns false. I have tried two methods:
Saving the video file to internal memory, context.getFilesDir() => /data/data/package_name/files/VideoToAnalyze/Recording.mp4,
and also one to Environment.getExternalStorageDirectory() => sdcard/appName/Recording.mp4.
Nothing seems to work here. My question is how do I pass a video file (or what is the correct file path) to a VideoCapture OpenCV object? I've posted some code below as an example. Note that I don't get an error. The file is always found/exists, but when I call isOpened() I always get false.
UPDATE:
So it looks like everyone on the web is saying that OpenCV (I'm using 3.10) lacks an FFmpeg backend and thus cannot process videos. Does anyone know this for a fact? And is there a workaround? Almost all other alternatives for processing videos frame by frame are deathly slow.
String x = getApplicationContext().getFilesDir().getAbsolutePath();
File dir = new File(x + "/VideoToAnalyze");
File videoFile = null;
if (dir.isDirectory()) {
    videoFile = new File(dir.getAbsolutePath() + "/Recording1.mp4");
} else {
    // handle error
}
if (videoFile.exists()) {
    String absPath = videoFile.getAbsolutePath();
    VideoCapture vc = new VideoCapture();
    try {
        vc.open(absPath);
    } catch (Exception e) {
        // handle error
    }
    if (!vc.isOpened()) {
        // this code is always hit
        Log.v("VideoCapture", "failed");
    } else {
        Log.v("VideoCapture", "opened");
        .....
It's an old question, but nevertheless I had the same issue.
OpenCV for Android only supports the MJPEG codec in an AVI container. See this.
Just to close this: I downloaded JavaCV, included the .so files in the Android project, and then used FFmpegFrameGrabber.

Xamarin Forms Shared Play Mp3 file on Android

I have asked this question on the Xamarin forums but nobody has answered; I'm hoping this forum can provide an answer for me.
My problem is how to play an mp3 file from a shared project on Android.
I'm making a Xamarin Forms project and use interfaces for platform-specific code that plays the mp3 file.
I have embedded 360 mp3 files in my shared project and pass the file to the interface to make the mp3 play.
The AudioHelper class returns a stream by getting the correct mp3 file based on its name:
var assembly = typeof(StaveApp_v2.App).GetTypeInfo().Assembly;
var stream = assembly.GetManifestResourceStream
(resourcePrefix + $"Audio.{name}.mp3");
return stream;
Then I pass the stream to the interface:
var audioStream = AudioHelper.AudioFile("aften");
DependencyService.Get<IPlayWord>().Play(audioStream);
On Windows the file plays as expected, but I'm having trouble getting the file to play on Android. I'm using AudioTrack to play the sound (the sound should play 2 times). The app plays a scratchy sound 2 times with a duration matching the file length, but I can't get it to play correctly.
This is the code I'm using:
var memoryStream = new MemoryStream();
audio.CopyTo(memoryStream);
var audioBytes = memoryStream.ToArray();
var byteLength = audioBytes.Length * sizeof(short);
var audioTrack = new AudioTrack(Android.Media.Stream.Music,
    16000,
    ChannelOut.Mono,
    Encoding.Pcm16bit,
    byteLength,
    AudioTrackMode.Static);
for (int i = 0; i < 2; i++)
{
try
{
audioTrack.Write(audioBytes, 0, audioBytes.Length);
audioTrack.Play();
}
catch (IllegalStateException illEx)
{
Log.Debug("StaveApp", $"Unable to initialize audio exception {illEx.Message}");
}
await Task.Delay(2000);
audioTrack.Stop();
audioTrack.ReloadStaticData();
}
audioTrack.Release();
audioTrack.Dispose();
I'm getting no errors in logcat.

Change a video's soundtrack to an audio file of the same length

I have two variables, Uri audioUri and Uri videoUri which point to the location of an audio file (any format the user has) and a video file (mp4) respectively. The video and audio are the same length.
I would like to create a video file that has the same video/frames as the video file, but uses the audio file as the soundtrack.
I ended up using the mp4parser library. The MediaExtractor and MediaMuxer classes were introduced in API levels 16 and 18 respectively, so they are too new for my project.
The caveat of this method is that, at the time of writing, the audio source must be an AAC or MP3 file and the video file must be an MP4 file.
Using Android Studio and Gradle, you can install and use it like this:
Open Gradle Scripts -> build.gradle (Module: app) and add to the end of the dependencies block
compile 'com.googlecode.mp4parser:isoparser:1.0.+'
Click the "Sync Now" button in the yellow banner that appears after you make this change.
Now in your Java file write:
try
{
H264TrackImpl h264Track = new H264TrackImpl(new FileDataSourceImpl(videoFile));
MP3TrackImpl mp3Track = new MP3TrackImpl(new FileDataSourceImpl(audioFile));
Movie movie = new Movie();
movie.addTrack(h264Track);
movie.addTrack(mp3Track);
Container mp4file = new DefaultMp4Builder().build(movie);
FileChannel fileChannel = new FileOutputStream(new File(outputFile)).getChannel();
mp4file.writeContainer(fileChannel);
fileChannel.close();
}
catch (Exception e)
{
Toast.makeText(this, "An error occurred: " + e.getMessage(), Toast.LENGTH_SHORT).show();
}
Use the Alt+Enter tool to have all the classes imported. There were multiple choices for the Movie class for me, so make sure to choose the one starting with com.googlecode.mp4parser.
It is left to you to handle exceptions and to define the self-explanatory Strings outputFile, audioFile and videoFile.
