It's been two weeks that I've been trying to record audio in Xamarin with a RIFF header.
I tried the Audio Recorder Plugin, AudioRecord, and MediaRecorder. I asked many questions but got no answer.
The easiest way was the Audio Recorder Plugin, but its output has no RIFF header.
The output of MediaRecorder was .3gp, which I couldn't convert to .wav,
and the output of AudioRecord was .pcm, which I also couldn't convert to .wav.
Here is the last code I tried:
#region Properties
int SAMPLING_RATE_IN_HZ = 44100;
ChannelIn CHANNEL_CONFIG = ChannelIn.Mono;
Android.Media.Encoding AUDIO_FORMAT = Android.Media.Encoding.Pcm16bit;
int BUFFER_SIZE_FACTOR = 2;
int BUFFER_SIZE;
bool RecordingInProgress = false;
private AudioRecord recorder = null;
#endregion

public void Record()
{
    BUFFER_SIZE = AudioRecord.GetMinBufferSize(SAMPLING_RATE_IN_HZ,
        CHANNEL_CONFIG, AUDIO_FORMAT) * BUFFER_SIZE_FACTOR;
    recorder = new AudioRecord(AudioSource.Mic, SAMPLING_RATE_IN_HZ,
        CHANNEL_CONFIG, AUDIO_FORMAT, BUFFER_SIZE);
    recorder.StartRecording();
    RecordingInProgress = true;
    RecordingTask();
}

public Task RecordingTask()
{
    return Task.Run(() =>
    {
        string path = "appdir/demo.pcm";
        MemoryStream buffer = new MemoryStream(BUFFER_SIZE);
        FileOutputStream outStream = new FileOutputStream(path);
        var demo2 = RecordingInProgress;
        while (RecordingInProgress)
        {
            int result = recorder.Read(buffer.GetBuffer(), 0, BUFFER_SIZE);
            if (result < 0)
            {
                throw new Exception("Reading of audio buffer failed: " + result);
            }
            outStream.Write(buffer.GetBuffer(), 0, BUFFER_SIZE);
        }
    });
}

public void Stop()
{
    if (null == recorder)
    {
        return;
    }
    RecordingInProgress = false;
    recorder.Stop();
    recorder.Release();
    recorder = null;
}
}
This code produces a .pcm file that can't be converted to anything, even with cloud converters.
I also tried this:
NWaves.Audio.WaveFile waveFile = new NWaves.Audio.WaveFile(buffer.GetBuffer());
waveFile.SaveTo(new FileStream("appdir/demo.wav", FileMode.Create));
instead of outStream.Write(buffer.GetBuffer(), 0, BUFFER_SIZE); at the bottom of the while block,
but it says: No RIFF found.
There are four or five ways to record audio, but a package like NWaves can't work with any of them.
The last thing I want to try is adding a RIFF header to the recorded audio buffer (bytes) programmatically, or converting .3gp or .pcm to .wav programmatically.
Summary: please help me record audio in Xamarin in a format NWaves can work with.
Thanks.
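The last idea above, adding a RIFF header to the recorded bytes programmatically, is quite feasible: a canonical WAV file is just a fixed 44-byte header followed by the raw PCM. Here is a minimal Java sketch (the class and method names are hypothetical; it assumes plain 16-bit PCM and a canonical header with no extra chunks):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.charset.StandardCharsets;

public class WavHeader {
    /** Prepends the canonical 44-byte RIFF/WAVE header to raw PCM bytes. */
    public static byte[] pcmToWav(byte[] pcm, int sampleRate, int channels, int bitsPerSample) {
        int byteRate = sampleRate * channels * bitsPerSample / 8;
        int blockAlign = channels * bitsPerSample / 8;
        ByteBuffer b = ByteBuffer.allocate(44 + pcm.length).order(ByteOrder.LITTLE_ENDIAN);
        b.put("RIFF".getBytes(StandardCharsets.US_ASCII));
        b.putInt(36 + pcm.length);                    // RIFF chunk size = total size - 8
        b.put("WAVE".getBytes(StandardCharsets.US_ASCII));
        b.put("fmt ".getBytes(StandardCharsets.US_ASCII));
        b.putInt(16);                                 // fmt sub-chunk size for plain PCM
        b.putShort((short) 1);                        // audio format: 1 = uncompressed PCM
        b.putShort((short) channels);
        b.putInt(sampleRate);
        b.putInt(byteRate);
        b.putShort((short) blockAlign);
        b.putShort((short) bitsPerSample);
        b.put("data".getBytes(StandardCharsets.US_ASCII));
        b.putInt(pcm.length);                         // data sub-chunk size
        b.put(pcm);
        return b.array();
    }
}
```

Writing the returned array to demo.wav should give NWaves (or any player) the RIFF header it expects.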
Related
I want to make a streaming audio recorder in Android. I am not sure how to fetch the audio stream or how to set the buffer size of the audio stream.
Below is my media recorder class:
public class MyMediaRecorder {
    final MediaRecorder recorder = new MediaRecorder();
    final File path;

    /**
     * Creates a new audio recording at the given path (relative to the root
     * of the SD card).
     */
    public MyMediaRecorder(File path) {
        this.path = path;
    }

    /**
     * Starts a new recording.
     */
    public void start() throws IOException {
        String state = android.os.Environment.getExternalStorageState();
        if (!state.equals(android.os.Environment.MEDIA_MOUNTED)) {
            throw new IOException("SD Card is not mounted. It is " + state + ".");
        }
        // make sure the directory we plan to store the recording in exists
        recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
        recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
        recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
        recorder.setOutputFile(path.getAbsolutePath());
        recorder.prepare();
        recorder.start();
    }

    /**
     * Stops a recording that has been previously started.
     */
    public void stop() throws IOException {
        recorder.stop();
        recorder.release();
    }
}
On starting the recording, I need to fetch a buffer and send it to the server while recording audio in parallel. Should I use AudioRecord and AudioTrack instead of MediaRecorder? Please suggest what I should do.
You should use AudioRecord.
Here is simple code that shows how to use the AudioRecord class:
final int sampleRate = 48000;
final int channelConfig = AudioFormat.CHANNEL_IN_MONO;
final int audioFormat = AudioFormat.ENCODING_PCM_16BIT;
int minBufferSize = AudioRecord.getMinBufferSize(sampleRate, channelConfig, audioFormat);
AudioRecord microphone = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleRate, channelConfig, audioFormat, minBufferSize * 10);
microphone.startRecording();
// Since the audio format is 16-bit, we need a 16-bit (short) buffer
short[] buffer = new short[1024];
while (!stopped) {   // `stopped` is a flag you toggle elsewhere (e.g. from the UI thread)
    int readSize = microphone.read(buffer, 0, buffer.length);
    sendDataToServer(buffer, readSize);
}
microphone.stop();
microphone.release();
If you need a real-time audio recorder and streamer, you should be careful about performance: read all the data as fast as possible and hand it off for sending to the server. There are lots of projects on GitHub that you can reuse and learn from. I share some of them (especially their recorder classes) here, but you can search and find more.
1. AudioRecorder
2. FullDuplexNetworkAudioCallService
3. MicRecorder
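The read-fast, send-separately pattern described above can be sketched with a bounded queue that decouples the time-critical read loop from network I/O. A hypothetical Java sketch, with the AudioRecord read and the server send replaced by stubs so it runs anywhere:

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.atomic.AtomicInteger;

public class StreamingRecorder {
    private static final short[] POISON = new short[0]; // end-of-stream marker
    private final BlockingQueue<short[]> queue = new ArrayBlockingQueue<>(64);
    public final AtomicInteger sentBuffers = new AtomicInteger();

    /** Reader loop: on Android this would call microphone.read(buffer, 0, buffer.length).
        Here a stub produces `count` silent buffers instead. */
    public Thread startReader(int count) {
        Thread t = new Thread(() -> {
            try {
                for (int i = 0; i < count; i++) {
                    short[] buffer = new short[1024]; // fill from AudioRecord in real code
                    queue.put(buffer);                // blocks only if the sender falls behind
                }
                queue.put(POISON);
            } catch (InterruptedException e) { Thread.currentThread().interrupt(); }
        });
        t.start();
        return t;
    }

    /** Sender loop: drains the queue; sendDataToServer(buf) would go here. */
    public Thread startSender() {
        Thread t = new Thread(() -> {
            try {
                while (true) {
                    short[] buf = queue.take();
                    if (buf == POISON) break;
                    sentBuffers.incrementAndGet();    // stand-in for sendDataToServer(buf)
                }
            } catch (InterruptedException e) { Thread.currentThread().interrupt(); }
        });
        t.start();
        return t;
    }
}
```

The point of the bounded queue is that a slow network stalls the sender thread, not the microphone read loop, so no audio is dropped until the queue itself fills.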
Searching through Stack Overflow, I got confused. So, here is my problem:
I'm using the AudioRecord class to record some audio. Here's the code:
AudioRecord record = new AudioRecord(AudioSource.VOICE_RECOGNITION,
        SAMPLING_RATE,
        AudioFormat.CHANNEL_IN_MONO,
        AudioFormat.ENCODING_PCM_16BIT,
        mBufferSize);
record.startRecording();
int read = 0;
while (mIsRecording) {
    read = record.read(mAudioBuffer, 0, mBufferSize);
    if ((read == AudioRecord.ERROR_INVALID_OPERATION) ||
        (read == AudioRecord.ERROR_BAD_VALUE) ||
        (read <= 0)) {
        continue;
    }
    proceed();
    write(out);
}
After recording is completed, I convert the raw .pcm data from AudioRecord to .wav:
private void convertRawToWav() {
    File file_raw = new File(mFileNameRaw);
    if (!file_raw.exists()) { return; }
    File file_wav = new File(mFileNameWav);
    try {
        PcmAudioHelper.convertRawToWav(WavAudioFormat.mono16Bit(SAMPLING_RATE), file_raw, file_wav);
        if (handler != null) {
            handler.onRecordSuccess();
        }
    } catch (IOException e) {
        e.printStackTrace();
        if (handler != null) {
            handler.onRecordSaveError();
        }
    }
}
I need the .wav format later because my application has a trim function copied from Ringdroid, which doesn't support the OGG format, therefore: PLEASE DO NOT RECOMMEND RECORDING AUDIO AS OGG ON THE FLY.
MAIN ISSUE:
The WAV format is too heavyweight, and I need to convert it to a smaller one, either MP3 or OGG. MP3 is patented, so it's not an option. What I need is:
to convert the .wav file to a .ogg file so that it is much smaller.
What I found:
this library, but it only converts .pcm data to .ogg while recording, and I need to convert the whole file after trimming it as .wav.
Take a look at this LAME wrapper project.
A WAV file is just a big header plus PCM. All you need to do is remove the 44 bytes at the front of the WAV to get the PCM, then use the code you shared with us to convert it to OGG.
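To illustrate the point above, stripping the canonical header is essentially a one-liner. A hypothetical helper, assuming the minimal 44-byte header layout (real-world files may carry extra chunks such as LIST before "data", so a robust version should walk the chunks instead of hard-coding 44):

```java
import java.util.Arrays;

public class WavStrip {
    /** Returns the raw PCM payload of a canonical WAV file by dropping the
        leading 44-byte header. */
    public static byte[] wavToPcm(byte[] wav) {
        if (wav.length < 44) {
            throw new IllegalArgumentException("shorter than a canonical WAV header");
        }
        return Arrays.copyOfRange(wav, 44, wav.length);
    }
}
```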
I created a WAV file based on https://developer.xamarin.com/samples/monodroid/Example_WorkingWithAudio/ where the relevant code is as follows:
private const int RECORDER_SAMPLERATE = 16000;
private const ChannelIn RECORDER_CHANNELS = ChannelIn.Mono;
private const Android.Media.Encoding RECORDER_AUDIO_ENCODING = Android.Media.Encoding.Pcm16bit;
...
var bufferSize = AudioRecord.GetMinBufferSize(RECORDER_SAMPLERATE, RECORDER_CHANNELS, RECORDER_AUDIO_ENCODING);
audioBuffer = new byte[bufferSize];
audioRecord = new AudioRecord(
    // Hardware source of recording.
    AudioSource.Mic,
    // Frequency
    RECORDER_SAMPLERATE,
    // Mono or stereo
    RECORDER_CHANNELS,
    // Audio encoding
    RECORDER_AUDIO_ENCODING,
    // Length of the audio clip.
    audioBuffer.Length
);
audioRecord.StartRecording();
...
using (var fileStream = new FileStream(filePath, FileMode.Create, FileAccess.Write))
{
    while (true)
    {
        if (endRecording)
        {
            endRecording = false;
            break;
        }
        try
        {
            // Keep reading the buffer while there is audio input.
            int numBytes = await audioRecord.ReadAsync(audioBuffer, 0, audioBuffer.Length);
            await fileStream.WriteAsync(audioBuffer, 0, numBytes);
            // Do something with the audio input.
        }
        catch (Exception ex)
        {
            Console.Out.WriteLine(ex.Message);
            break;
        }
    }
    fileStream.Close();
}
audioRecord.Stop();
audioRecord.Release();
My question is: how can I play this .WAV file after copying it to Windows (or another OS)? Here are my observations:
The .WAV file can be played using Android's AudioTrack class, as shown in the https://developer.xamarin.com/samples/monodroid/Example_WorkingWithAudio/ code sample.
The code sample will create a testAudio.wav file in the /data/Example_WorkingWithAudio/files/ directory.
Copy this file to your Windows PC, then try playing the .WAV file with an audio player that supports .WAV: it won't be able to play the file.
To answer my own question: I found out that the created track needs to be converted to .WAV first. Here are two posts which helped me do it:
http://www.edumobile.org/android/audio-recording-in-wav-format-in-android-programming/
Recorded audio Using AndroidRecord API fails to play
I'm trying to merge/join two audio files, but the merged file contains only the first file's audio, and I don't know what the issue is. I think the problem is with the headers, but I don't know how to fix it.
e.g.
f1 = 4 kB
f2 = 3 kB
finalFile = 7 kB
The size shows the merge completed, but I don't know why the second file's audio is missing.
Here is my code:
public static void meargeAudio(List<File> filesToMearge)
{
    while (filesToMearge.size() != 1) {
        try {
            FileInputStream fistream1 = new FileInputStream(filesToMearge.get(0).getPath()); // (/storage/emulated/0/Audio Notes/1455194356500.mp3) first source file
            FileInputStream fistream2 = new FileInputStream(filesToMearge.get(1).getPath()); // second source file
            SequenceInputStream sistream = new SequenceInputStream(fistream1, fistream2);
            FileOutputStream fostream = new FileOutputStream(AppConstrants.APP_FOLDER_PATH + "sss.mp3", true);
            int temp;
            while ((temp = sistream.read()) != -1) {
                // System.out.print((char) temp); // to print at DOS prompt
                fostream.write(temp); // to write to file
            }
            fostream.close();
            sistream.close();
            fistream1.close();
            fistream2.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
Help me if you can.
Thanks in advance.
You are right that it's a header problem. Your new audio file is recognized as only the first audio file because you directly concatenated the two files. When MediaPlayer reads your merged file's header (bit rate, audio data length, etc.), it recognizes only your first audio file, because that is what it finds first. To properly join two audio files, you need to read both headers, decode the audio data, recalculate the length of the merged audio, concatenate the uncompressed audio data, recompress it (as MP3, for example), and then write it to a file.
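For uncompressed WAV input specifically, the recipe above simplifies: no decoding or recompression is needed, only concatenating the data chunks and rewriting the two size fields. A hypothetical Java sketch, assuming both files share a canonical 44-byte header and identical sample formats:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class WavMerge {
    /** Joins two canonical-header WAV files with identical formats:
        keep the first file's header, concatenate both data chunks,
        then fix the RIFF chunk size and data chunk size fields. */
    public static byte[] merge(byte[] a, byte[] b) {
        int dataLen = (a.length - 44) + (b.length - 44);
        ByteBuffer out = ByteBuffer.allocate(44 + dataLen).order(ByteOrder.LITTLE_ENDIAN);
        out.put(a, 0, 44);               // reuse the first file's header
        out.put(a, 44, a.length - 44);   // first file's PCM
        out.put(b, 44, b.length - 44);   // second file's PCM
        out.putInt(4, 36 + dataLen);     // RIFF chunk size, at byte offset 4
        out.putInt(40, dataLen);         // data chunk size, at byte offset 40
        return out.array();
    }
}
```

Compressed formats like MP3 or 3GP cannot be merged this way; for those, a container-aware library (such as mp4parser, below) is the right tool.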
Solution:
mp4parser can be used to append audio files:
https://github.com/sannies/mp4parser/issues/155
Use:
aspectjrt-1.7.3.jar
My working code:
public static void mergAudio(List<File> filesToMearge) {
    try {
        while (filesToMearge.size() != 1) {
            String audio1 = filesToMearge.get(0).getPath();
            String audio2 = filesToMearge.get(1).getPath();
            // FileOutputStream fostream = new FileOutputStream(filesToMearge.get(0).getPath(), true); // destination file
            String outputVideo = filesToMearge.get(0).getPath();
            Movie[] inMovies = new Movie[]{
                    MovieCreator.build(audio1),
                    MovieCreator.build(audio2),
            };
            List<Track> audioTracks = new LinkedList<Track>();
            for (Movie m : inMovies) {
                for (Track t : m.getTracks()) {
                    if (t.getHandler().equals("soun")) {
                        audioTracks.add(t);
                    }
                }
            }
            File file1 = new File(filesToMearge.get(0).getPath());
            boolean deleted = file1.delete();
            File file2 = new File(filesToMearge.get(1).getPath());
            boolean deleted1 = file2.delete();
            Movie result = new Movie();
            if (audioTracks.size() > 0) {
                result.addTrack(new AppendTrack(audioTracks.toArray(new Track[audioTracks.size()])));
            }
            Container out = new DefaultMp4Builder().build(result);
            out.writeContainer(new FileOutputStream(outputVideo).getChannel());
            filesToMearge.add(0, new File(filesToMearge.get(0).getPath()));
            filesToMearge.remove(1);
            filesToMearge.remove(1);
        }
    } catch (IOException e) {
        e.printStackTrace();
    }
}
I want to compress a locally saved video file to a smaller size in order to upload it to a server.
Since I used MediaCodec, I found some tips for compressing video. Here are the steps I followed:
1) Extracted the media file using MediaExtractor and decoded it.
2) Created the encoder with the required file format.
3) Created a muxer to save the file to local storage (not complete).
Question: but I don't know how to encode the already decoded stream and save it to local storage using MediaMuxer.
public class CompressMedia {
    private static final String SAMPLE = Environment
            .getExternalStorageDirectory() + "/DCIM/Camera/20140506_174959.mp4";
    private static final String OUTPUT_PATH = Environment
            .getExternalStorageDirectory()
            + "/DCIM/Camera/20140506_174959_REC.mp4";
    private MediaExtractor extractor;
    private MediaCodec decoder;
    private MediaCodec encoder;
    String mime;
    private static final String MIME_TYPE = "video/avc";

    public void extractMediaFile() {
        // work plan
        // locate media file
        // extract media file using MediaExtractor
        // retrieve decoded frames
        extractor = new MediaExtractor();
        try {
            extractor.setDataSource(SAMPLE);
        } catch (IOException e) {
            // file not found
            e.printStackTrace();
        }
        // select the video track and create a decoder for it
        for (int i = 0; i < extractor.getTrackCount(); i++) {
            MediaFormat format = extractor.getTrackFormat(i);
            mime = format.getString(MediaFormat.KEY_MIME);
            if (mime.startsWith("video/")) {
                extractor.selectTrack(i);
                decoder = MediaCodec.createDecoderByType(mime);
                decoder.configure(format, null, null, 0);
                break;
            }
        }
        if (decoder == null) {
            Log.e("DecodeActivity", "Can't find video info!");
            return;
        }
        // - start decoder -
        decoder.start();
        extractor.selectTrack(0);
        // - decoded frames can be obtained here -
    }

    private void createsEncoder() {
        // creates the media encoder and sets its format
        // (note: this must be createEncoderByType, not createDecoderByType)
        encoder = MediaCodec.createEncoderByType(MIME_TYPE);
        // init media format
        MediaFormat mediaFormat = MediaFormat.createVideoFormat(MIME_TYPE, /* 640 */ 320, /* 480 */ 240);
        mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 400000);
        mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 25);
        mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar);
        mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);
        encoder.configure(mediaFormat, null, null,
                MediaCodec.CONFIGURE_FLAG_ENCODE);
        encoder.start();
        // - the encoded data format is available here -
    }

    private void createMuxer() {
        // creates the media muxer - it will be used to write the final
        // stream into the desired file :)
        try {
            MediaMuxer muxer = new MediaMuxer(OUTPUT_PATH,
                    OutputFormat.MUXER_OUTPUT_MPEG_4);
            int videoTrackIndex = muxer.addTrack(encoder.getOutputFormat());
            //muxer.writeSampleData(videoTrackIndex, inputBuffers, bufferInfo);
            muxer.start();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
Here are the links that I followed:
Android MediaCodec: Reduce mp4 video size and
Video compression on android using new MediaCodec Library
You can try Intel INDE at https://software.intel.com/en-us/intel-inde and the Media Pack for Android, which is part of INDE; tutorials are at https://software.intel.com/en-us/articles/intel-inde-media-pack-for-android-tutorials. It has a sample that shows how to use the Media Pack to transcode (recompress) video files. You can set a smaller resolution and/or bitrate for the output to get a smaller file.
In ComposerTranscodeCoreActivity.java:
protected void setTranscodeParameters(MediaComposer mediaComposer) throws IOException {
    mediaComposer.addSourceFile(mediaUri1);
    mediaComposer.setTargetFile(dstMediaPath);
    configureVideoEncoder(mediaComposer, videoWidthOut, videoHeightOut);
    configureAudioEncoder(mediaComposer);
}

protected void transcode() throws Exception {
    factory = new AndroidMediaObjectFactory(getApplicationContext());
    mediaComposer = new MediaComposer(factory, progressListener);
    setTranscodeParameters(mediaComposer);
    mediaComposer.start();
}