I'm trying to mix an MP3 audio file into an MP4 video.
After hours of searching, I've concluded that I need to convert the MP3 file to the AAC format, which fits in an MP4 container.
Add audio to video in Android
But I can't find any documentation on how to convert MP3 files into the AAC format.
So do you have any advice on how I could convert MP3 audio to AAC audio?
Also, I need to insert several audio clips at specific times in the final video.
You can try mp4parser, which does muxing to MP4; some people (as seen in the "Issues" section of the GitHub repo) are using it to mux MP3 and MP4.
Another option is FFmpeg, either compiled yourself or via something premade; I've used the latter.
The library itself is a bit bulky, and you might need to play around with the FFmpeg commands to get optimal quality and mux speed.
It looks something like this:
try {
    FFmpeg ffmpeg = FFmpeg.getInstance(this);
    String cmd = "-i " + videoFilePath + " -i " + audioFilePath
            + " -shortest -threads 0 -preset ultrafast -strict -2 " + outputFilePath;
    ffmpeg.execute(cmd, mergeListener);
} catch (FFmpegCommandAlreadyRunningException e) {
    e.printStackTrace();
}
And a listener:
ExecuteBinaryResponseHandler mergeListener = new ExecuteBinaryResponseHandler() {
    @Override
    public void onStart() {
        // started
    }

    @Override
    public void onFailure(String message) {
        // failed
    }

    @Override
    public void onFinish() {
        File output = new File(outputFilePath);
        // Do whatever you need with the muxed file
    }
};
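For the second requirement (inserting several audio clips at specific times), FFmpeg's `adelay` and `amix` filters can shift each clip before mixing them under the video. Below is a minimal sketch, in plain Java, of building such a command string. The class name, file paths, and offsets are my own placeholders (not part of the wrapper library above), and it assumes an FFmpeg build whose `adelay` filter supports the `all` option:

```java
import java.util.List;
import java.util.Locale;

class DelayedMixCommand {
    /**
     * Builds an FFmpeg argument string that overlays audio clips on a video at
     * given millisecond offsets. Input 0 is the video; each audio clip i
     * becomes input i+1, is delayed with adelay, and all delayed clips are
     * mixed into one track with amix.
     */
    static String build(String videoPath, List<String> audioPaths,
                        List<Long> offsetsMs, String outputPath) {
        StringBuilder inputs = new StringBuilder("-i ").append(videoPath);
        StringBuilder filter = new StringBuilder();
        StringBuilder mixPads = new StringBuilder();
        for (int i = 0; i < audioPaths.size(); i++) {
            inputs.append(" -i ").append(audioPaths.get(i));
            // [i+1:a] selects the audio of input i+1; all=1 delays every channel
            filter.append(String.format(Locale.US,
                    "[%d:a]adelay=%d:all=1[a%d];", i + 1, offsetsMs.get(i), i));
            mixPads.append(String.format(Locale.US, "[a%d]", i));
        }
        filter.append(mixPads)
              .append("amix=inputs=").append(audioPaths.size()).append("[aout]");
        return inputs + " -filter_complex \"" + filter
                + "\" -map 0:v -map \"[aout]\" -c:v copy -c:a aac " + outputPath;
    }
}
```

Passing the resulting string to `ffmpeg.execute(...)` as in the snippet above should then mux the delayed clips; you may still need to experiment with `amix` normalization options depending on how many clips overlap.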
The following Code A is from the sample project.
It uses Code B to record voice and save it to a file.
Code B
setOutputFormat(MediaRecorder.OutputFormat.MPEG_4)
setAudioEncoder(MediaRecorder.AudioEncoder.AAC)
As you can see, getExtensionText() returns one of two audio formats, mp3 or m4a, depending on the user's settings; these are different audio formats.
I don't know how the author can use only Code B to generate files in two different audio formats.
Can MediaRecorder automatically save voice as an mp3 or m4a audio file based on the file name extension?
Code A
// mp4 output format with aac encoding should produce good enough m4a files according to https://stackoverflow.com/a/33054794/1967672
private fun startRecording() {
    val baseFolder = if (isQPlus()) {
        cacheDir
    } else {
        val defaultFolder = File(config.saveRecordingsFolder)
        if (!defaultFolder.exists()) {
            defaultFolder.mkdir()
        }
        defaultFolder.absolutePath
    }

    currFilePath = "$baseFolder/${getCurrentFormattedDateTime()}.${config.getExtensionText()}"
    recorder = MediaRecorder().apply {
        setAudioSource(MediaRecorder.AudioSource.CAMCORDER)
        setOutputFormat(MediaRecorder.OutputFormat.MPEG_4)
        setAudioEncoder(MediaRecorder.AudioEncoder.AAC)
        setAudioEncodingBitRate(128000)
        setAudioSamplingRate(44100)
        ...
}
class Config(context: Context) : BaseConfig(context) {
    ...
    fun getExtensionText() = context.getString(when (extension) {
        EXTENSION_M4A -> R.string.m4a
        else -> R.string.mp3
    })
}
They already defined the file extension in SettingsActivity.kt, but the extension is just part of the file name: the actual format written is determined by setOutputFormat() and setAudioEncoder(), not by the extension. In this code, you can name the file with any extension of the supported media formats.
mediaRecorder = MediaRecorder()
output = Environment.getExternalStorageDirectory().absolutePath + "/recording.mp3"
mediaRecorder.setAudioSource(MediaRecorder.AudioSource.MIC)
mediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4)
mediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC)
mediaRecorder.setOutputFile(output)
For the full code, see this or here.
Searching through Stack Overflow, I got confused. So, here is my problem:
I'm using AudioRecord class to record some audio, here's the code:
AudioRecord record = new AudioRecord(AudioSource.VOICE_RECOGNITION,
        SAMPLING_RATE,
        AudioFormat.CHANNEL_IN_MONO,
        AudioFormat.ENCODING_PCM_16BIT,
        mBufferSize);
record.startRecording();

int read = 0;
while (mIsRecording) {
    read = record.read(mAudioBuffer, 0, mBufferSize);
    if ((read == AudioRecord.ERROR_INVALID_OPERATION) ||
            (read == AudioRecord.ERROR_BAD_VALUE) ||
            (read <= 0)) {
        continue;
    }
    proceed();
    write(out);
}
After recording is completed, I convert the raw .pcm data from AudioRecord to .wav:
private void convertRawToWav() {
    File file_raw = new File(mFileNameRaw);
    if (!file_raw.exists()) { return; }
    File file_wav = new File(mFileNameWav);
    try {
        PcmAudioHelper.convertRawToWav(WavAudioFormat.mono16Bit(SAMPLING_RATE), file_raw, file_wav);
        if (handler != null) {
            handler.onRecordSuccess();
        }
    } catch (IOException e) {
        e.printStackTrace();
        if (handler != null) {
            handler.onRecordSaveError();
        }
    }
}
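For reference, the conversion PcmAudioHelper performs here amounts to prepending a 44-byte RIFF header to the raw PCM. A small sketch of the canonical mono 16-bit layout in plain Java (the class and method names are mine, not from that library):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.charset.StandardCharsets;

class PcmToWav {
    /** Wraps raw mono 16-bit PCM bytes in a canonical 44-byte WAV header. */
    static byte[] wrap(byte[] pcm, int sampleRate) {
        ByteBuffer b = ByteBuffer.allocate(44 + pcm.length).order(ByteOrder.LITTLE_ENDIAN);
        b.put("RIFF".getBytes(StandardCharsets.US_ASCII));
        b.putInt(36 + pcm.length);                 // RIFF chunk size
        b.put("WAVE".getBytes(StandardCharsets.US_ASCII));
        b.put("fmt ".getBytes(StandardCharsets.US_ASCII));
        b.putInt(16);                              // fmt sub-chunk size for PCM
        b.putShort((short) 1);                     // audio format: 1 = PCM
        b.putShort((short) 1);                     // channels: mono
        b.putInt(sampleRate);
        b.putInt(sampleRate * 2);                  // byte rate: 2 bytes per frame
        b.putShort((short) 2);                     // block align
        b.putShort((short) 16);                    // bits per sample
        b.put("data".getBytes(StandardCharsets.US_ASCII));
        b.putInt(pcm.length);                      // data sub-chunk size
        b.put(pcm);                                // the samples themselves
        return b.array();
    }
}
```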
I need the .wav format later, because my application has a trim function copied from Ringdroid, which doesn't support the OGG format. Therefore: PLEASE DO NOT RECOMMEND RECORDING THE AUDIO AS OGG ON THE FLY.
MAIN ISSUE:
The WAV format is too heavyweight, and I need to convert it to a smaller one, either MP3 or OGG. MP3 is patented, so it's not an option. What I need is:
To convert the .wav file to a .ogg file so that it is much smaller
What I found:
This library, but it only converts .pcm data to .ogg while recording, and I need to convert the whole file after trimming it as .wav.
Take a look at this LAME wrapper project.
A WAV file is just a big header plus PCM. All you need to do is strip the 44 bytes from the front of the WAV to get the PCM, then use the code you shared with us to convert it to OGG.
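As a sketch of that idea, assuming a canonical PCM WAV with no extra chunks (real files can carry LIST/INFO chunks, in which case you must scan for the "data" chunk instead of assuming a fixed offset):

```java
import java.util.Arrays;

class WavToPcm {
    /**
     * Strips the canonical 44-byte WAV header and returns the raw PCM bytes.
     * Assumes a plain PCM WAV with no extra metadata chunks.
     */
    static byte[] stripHeader(byte[] wav) {
        if (wav.length < 44) {
            throw new IllegalArgumentException("shorter than a canonical WAV header");
        }
        return Arrays.copyOfRange(wav, 44, wav.length);
    }
}
```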
I managed to play a live stream from a URL such as this:
rtsp://192.168.0.18:554/user=admin&password=&channel=1&stream=0.sdp?
But I want to download this stream into a temporary file and then play it locally, so that the buffering time seems short (around a 2-4 second delay, maybe).
Is it possible to do this with RTSP, or do I have to use HTTP? This URL only works over the RTSP protocol.
If so, an example would help me a lot.
An example of my code:
cA.mPlayer1 = new MediaPlayer();
try {
    cA.mPlayer1.setDataSource("rtsp://192.168.0.18:554/user=admin&password=&channel=1&stream=0.sdp?");
    cA.mPlayer1.prepareAsync();
    cA.mPlayer1.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
        @Override
        public void onPrepared(MediaPlayer mediaPlayer) {
            cA.mPlayer1.start();
            Toast.makeText(getBaseContext(), "Connecting...", Toast.LENGTH_LONG).show();
        }
    });
} catch (IOException e) {
    e.printStackTrace();
}
cA.mCallback1 = new SurfaceHolder.Callback() {
    @Override
    public void surfaceCreated(SurfaceHolder surfaceHolder) {
        cA.mPlayer1.setDisplay(surfaceHolder);
    }

    @Override
    public void surfaceChanged(SurfaceHolder surfaceHolder, int i, int i2, int i3) {
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder surfaceHolder) {
    }
};
final SurfaceView surfaceView1 =
        (SurfaceView) findViewById(R.id.surfaceView1);

// Configure the Surface View.
surfaceView1.setKeepScreenOn(true);

// Configure the Surface Holder and register the callback.
SurfaceHolder holder1 = surfaceView1.getHolder();
holder1.addCallback(cA.mCallback1);
holder1.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
You cannot use MediaPlayer to save a raw stream to a file. You can use one of these instead:
Use VLC to save the stream, like this:
https://wiki.videolan.org/Documentation:Streaming_HowTo/Receive_and_Save_a_Stream/
Or use the VLC library for Android, which can be downloaded from:
https://github.com/mrmaffen/vlc-android-sdk
Or use the FFmpeg library to record the RTSP stream locally to the SD card:
1. Capture or decode the raw frames from the live stream, pass them to FFmpeg, and save them to the SD card in .h264 format.
2. Then pick up the raw .h264 file again, decode it with FFmpeg, and save it as a .mp4 file on the SD card.
3. Delete the .h264 file programmatically, keeping only the .mp4 (or whichever format you want).
Then try .mp4 playback.
https://stackoverflow.com/a/24586256/6502368
I am trying to find out whether my device records audio correctly (the volume of the recorded audio is not too low, and the recorded file actually contains sound). The way I tried doing it is:
start recording --> play sound --> stop recording --> get file recorded max volume
The code I use to record the sound:
public void playSound() {
    File myDataPath = new File(getActivity().getFilesDir().getAbsolutePath()
            + File.separator + ".CheckAudio");
    if (!myDataPath.exists())
        myDataPath.mkdirs();

    // Note: with MPEG_4/AAC settings below, the content is really M4A
    // even though the file is named .mp3.
    recordFile = myDataPath + File.separator + "Recording_"
            + new SimpleDateFormat("yyyyMMdd_HHmmss", Locale.getDefault()).format(new Date()) + ".mp3";

    am.setStreamVolume(AudioManager.STREAM_RING, am.getStreamMaxVolume(AudioManager.STREAM_RING), 0);
    am.setStreamVolume(AudioManager.STREAM_NOTIFICATION, am.getStreamMaxVolume(AudioManager.STREAM_NOTIFICATION), 0);
    Uri defaultRingtoneUri = RingtoneManager.getDefaultUri(RingtoneManager.TYPE_NOTIFICATION);

    try {
        md = new MediaRecorder();
        md.setAudioSource(MediaRecorder.AudioSource.MIC);
        md.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
        md.setOutputFile(recordFile);
        md.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
        md.prepare();
        md.start();
    } catch (IllegalStateException | IOException e) {
        recording = false;
        removeItem("Unable to record audio, please try again."); // (Show toast)
        return;
    }

    mediaPlayer = new MediaPlayer();
    try {
        mediaPlayer.setDataSource(getActivity(), defaultRingtoneUri);
        mediaPlayer.setAudioStreamType(AudioManager.STREAM_NOTIFICATION);
        mediaPlayer.prepare();
        mediaPlayer.setOnCompletionListener(new MediaPlayer.OnCompletionListener() {
            @Override
            public void onCompletion(MediaPlayer mp) {
                md.stop();
                md.release();
                mediaPlayer.release();
                mediaPlayer = null;
                // get recordFile volume
            }
        });
        mediaPlayer.start();
    } catch (IOException e) {
        e.printStackTrace();
        removeItem("Unable to play audio");
        sound = false;
    }
}
However, I can't figure out how to analyze the recorded file and check whether it actually contains sound. Is there a library or another way to do this?
I hope you understand what I am trying to achieve, as my English is pretty bad. Thanks.
EDIT (some more explanation):
If you play a sound (a ringtone or something similar) while recording from the microphone, the recorded level should be around 90 decibels, meaning both the playback and the microphone are working. If the recorded level is around 30 decibels, only the microphone is working and the played sound is not being picked up; if it is around zero, the microphone is not working.
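One way to do the "get file recorded max volume" step is to decode the recording to 16-bit PCM (for example with MediaExtractor/MediaCodec) and measure the peak level of the samples. Below is a sketch of just the measurement, in plain Java; the class name, the decode step, and any particular silence threshold (e.g. around -50 dBFS) are my assumptions:

```java
class PcmLevel {
    /**
     * Computes the peak level of 16-bit PCM samples in dBFS (0 dBFS = full
     * scale; more negative = quieter). Note the 90/30 dB figures above are
     * sound-pressure levels; from a PCM buffer we can only measure level
     * relative to full scale, so the silence threshold must be picked
     * empirically for the device.
     */
    static double peakDbfs(short[] samples) {
        int peak = 1;                      // avoid log10(0) for pure silence
        for (short s : samples) {
            int a = Math.abs((int) s);     // widen first so abs(-32768) is safe
            if (a > peak) peak = a;
        }
        return 20.0 * Math.log10(peak / 32768.0);
    }
}
```

A recording where the peak stays far below the chosen threshold for its whole duration can then be treated as "no sound captured".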
You can use a visualizer to see in real time whether the recorded sound is getting too low or too loud.
I have built a project which visualizes the recording sound strength via a bar graph: the higher the bar, the louder the recorded sound; a low bar means low decibels.
The project also has an in-app player which allows the user to play back all recordings; the built-in player visualizes the playback audio data as well.
I am suggesting this because I think it is what you are trying to achieve with
start recording --> play sound --> stop recording --> get file recorded max volume.
Instead of getting the max volume each time, you can rely on the visualizer to keep an eye on whether the file is being recorded above acceptable decibel levels.
You can find the source code on GitHub:
https://github.com/hiteshsahu/Android-Audio-Recorder-Visualization-Master
I am trying to create an Android app which merges a video and an audio file using mp4parser. I succeeded when merging two MP4 files into a single file that shows the video of the first one and plays the audio of the second.
But I couldn't use an MP3 file as the audio source.
The code below throws an exception when I try to create a Movie object from an MP3 file. The same code works fine with M4A and MP4 files.
Movie audio;
try {
    String audioFileName = Environment.getExternalStorageDirectory().toString() + "/music.mp3";
    audio = new MovieCreator().build(audioFileName);
} catch (IOException e) {
    e.printStackTrace();
    return false;
} catch (NullPointerException e) {
    e.printStackTrace();
    return false;
}
Is it possible to create a Movie object from an MP3 file?
Can anyone help me with this?
You can use the MP3TrackImpl class from mp4parser for .mp3 files, or the AACTrackImpl class for .aac files. Get the track from the MP3TrackImpl object and add it to the Movie object.