I have a program that records sound from the microphone and saves it in a .pcm file.
I have a separate function in the program that converts the .pcm file to AAC using MediaCodec and MediaMuxer.
Is it possible to modify my program so that it converts the PCM data to AAC directly using MediaCodec? How can I do that? Is there a guide or a code sample for this?
Here is a code snippet from my program that records sound from the
microphone:
recorder = new AudioRecord(
        MediaRecorder.AudioSource.MIC,
        RECORDER_SAMPLERATE,
        RECORDER_CHANNELS,
        RECORDER_AUDIO_ENCODING,
        bufferSize);
recorder.startRecording();
isRecording = true;

FileOutputStream os = null;
try {
    os = new FileOutputStream(filePathAudioRec);
} catch (FileNotFoundException e) {
    e.printStackTrace();
}

while (isRecording) {
    // gets the voice output from microphone to byte format
    recorder.read(sData, 0, BufferElements2Rec);
    try {
        // writes the data to file from buffer
        // stores the voice buffer
        byte bData[] = short2byte(sData);
        os.write(bData, 0, BufferElements2Rec * BytesPerElement); // do I need to write it to a file ??
    } catch (IOException e) {
        e.printStackTrace();
    }
    // inside while loop: add code to convert to .mp4 while the data is collected from the MIC
}
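For context, feeding the PCM buffers straight into a MediaCodec AAC encoder (and a MediaMuxer) inside that loop might look roughly like the sketch below. It reuses recorder, sData, BufferElements2Rec, short2byte and RECORDER_SAMPLERATE from the snippet above; the mono channel count, 64 kbps bit rate, dequeue timeout and the output path /sdcard/recording.m4a are assumptions, and error handling plus the final end-of-stream drain are omitted.
// Sketch: encode PCM to AAC on the fly instead of writing a .pcm file first.
MediaFormat format = MediaFormat.createAudioFormat("audio/mp4a-latm", RECORDER_SAMPLERATE, 1); // assumes mono
format.setInteger(MediaFormat.KEY_AAC_PROFILE, MediaCodecInfo.CodecProfileLevel.AACObjectLC);
format.setInteger(MediaFormat.KEY_BIT_RATE, 64000);

MediaCodec encoder = MediaCodec.createEncoderByType("audio/mp4a-latm");
encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
encoder.start();

MediaMuxer muxer = new MediaMuxer("/sdcard/recording.m4a", MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
int trackIndex = -1;
boolean muxerStarted = false;
long presentationTimeUs = 0;

while (isRecording) {
    recorder.read(sData, 0, BufferElements2Rec);
    byte[] bData = short2byte(sData);

    // hand the raw PCM chunk to the encoder instead of the FileOutputStream
    int inIndex = encoder.dequeueInputBuffer(10000);
    if (inIndex >= 0) {
        ByteBuffer inBuf = encoder.getInputBuffers()[inIndex];
        inBuf.clear();
        inBuf.put(bData);
        encoder.queueInputBuffer(inIndex, 0, bData.length, presentationTimeUs, 0);
        presentationTimeUs += 1000000L * BufferElements2Rec / RECORDER_SAMPLERATE;
    }

    // drain whatever AAC the encoder has produced and pass it to the muxer
    int outIndex = encoder.dequeueOutputBuffer(info, 0);
    while (outIndex != MediaCodec.INFO_TRY_AGAIN_LATER) {
        if (outIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
            trackIndex = muxer.addTrack(encoder.getOutputFormat());
            muxer.start();
            muxerStarted = true;
        } else if (outIndex >= 0) {
            ByteBuffer outBuf = encoder.getOutputBuffers()[outIndex];
            if (muxerStarted && info.size > 0
                    && (info.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) == 0) {
                muxer.writeSampleData(trackIndex, outBuf, info);
            }
            encoder.releaseOutputBuffer(outIndex, false);
        }
        outIndex = encoder.dequeueOutputBuffer(info, 0);
    }
}
// after the loop: queue an end-of-stream input buffer, drain the remaining output,
// then encoder.stop()/release() and muxer.stop()/release()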
I want to record and save audio in .mp3 format using the AudioRecord class. Currently it records in .pcm format. I have used the code below to record:
private void writeAudioDataToFile() {
    // Write the output audio in byte
    String filePath = "/sdcard/voice8K16bitmono.pcm";
    short sData[] = new short[BufferElements2Rec];

    FileOutputStream os = null;
    try {
        os = new FileOutputStream(filePath);
    } catch (FileNotFoundException e) {
        e.printStackTrace();
    }

    while (isRecording) {
        // gets the voice output from microphone to byte format
        m_audioRecord.read(sData, 0, BufferElements2Rec);
        System.out.println("Short writing to file" + sData.toString());
        try {
            // writes the data to file from buffer
            // stores the voice buffer
            byte bData[] = short2byte(sData);
            os.write(bData, 0, BufferElements2Rec * BytesPerElement);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    try {
        os.close();
    } catch (IOException e) {
        e.printStackTrace();
    }
}
private byte[] short2byte(short[] sData) {
    int shortArrsize = sData.length;
    byte[] bytes = new byte[shortArrsize * 2];
    for (int i = 0; i < shortArrsize; i++) {
        bytes[i * 2] = (byte) (sData[i] & 0x00FF);
        bytes[(i * 2) + 1] = (byte) (sData[i] >> 8);
        sData[i] = 0;
    }
    return bytes;
}
The two methods above record the file in .pcm format. Is it possible to save audio in .mp3 format using the AudioRecord class?
There is no MP3 encoder available in Android, so you cannot record to MP3 using the native functionality alone. You can use a third-party library to achieve this.
Check this post, which may help you.
I'm working on an audio recorder app using AudioRecord, not MediaRecorder.
I'm using this code to record:
private void startRecord() {
    File file = new File(Environment.getExternalStorageDirectory(), "test.pcm");
    try {
        file.createNewFile();
        OutputStream outputStream = new FileOutputStream(file);
        BufferedOutputStream bufferedOutputStream = new BufferedOutputStream(outputStream);
        DataOutputStream dataOutputStream = new DataOutputStream(bufferedOutputStream);

        int minBufferSize = AudioRecord.getMinBufferSize(8000,
                AudioFormat.CHANNEL_CONFIGURATION_MONO,
                AudioFormat.ENCODING_PCM_16BIT);
        short[] audioData = new short[minBufferSize];

        AudioRecord audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC,
                8000,
                AudioFormat.CHANNEL_CONFIGURATION_MONO,
                AudioFormat.ENCODING_PCM_16BIT,
                minBufferSize);

        audioRecord.startRecording();
        while (recording) {
            int numberOfShort = audioRecord.read(audioData, 0, minBufferSize);
            for (int i = 0; i < numberOfShort; i++) {
                dataOutputStream.writeShort(audioData[i]);
            }
        }
        audioRecord.stop();
        dataOutputStream.close();
    } catch (IOException e) {
        e.printStackTrace();
    }
}
It's working fine, but the file size is large: I recorded about 1 minute and got a file of about 1.2 MB.
I also tried MediaRecorder with this code:
mMediaRecorder = new MediaRecorder();
mMediaRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
mMediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
mMediaRecorder.setOutputFile(fileName);
mMediaRecorder.prepare();
mMediaRecorder.start();
That works well: it recorded 5 minutes with a file size of about 500 KB or less. But I have to use AudioRecord because I need to apply some processing to the audio byte by byte.
Can I get the same file size as MediaRecorder when I use AudioRecord?
Thank you very much.
Actually, with the AudioRecord class you get raw data from the sound source, without any compression, into a buffer you work with directly, while the MediaRecorder class provides only basic functionality for recording media from the available sources, without direct access to the data buffers.
I assume you should use AudioRecord to capture the audio, apply your byte-by-byte task to the data in the AudioRecord buffer, and then compress the modified data as you write it from the buffer to a file. As far as I remember, there is no ready-made audio compression functionality in the Android API, so you should use a third-party library (for example LAME) or write the compression yourself. You can check this source for recording MP3 with LAME: https://github.com/yhirano/Mp3VoiceRecorderSampleForAndroid
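As a rough illustration of that flow (the 44.1 kHz mono parameters and the isRecording flag are just example values, and myEncoder with its encodeChunk()/finish() methods is a made-up placeholder for whatever third-party encoder you end up using, not a real API):
// Sketch only: capture with AudioRecord, run the byte-by-byte task, then hand
// each buffer to a third-party encoder (e.g. a LAME wrapper).
int bufferSize = AudioRecord.getMinBufferSize(44100,
        AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
AudioRecord recorder = new AudioRecord(MediaRecorder.AudioSource.MIC, 44100,
        AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);
short[] pcm = new short[bufferSize / 2];

recorder.startRecording();
while (isRecording) {
    int read = recorder.read(pcm, 0, pcm.length);
    // ...apply your per-sample processing to pcm[0..read) here...
    myEncoder.encodeChunk(pcm, read);   // hypothetical call into the MP3 library
}
recorder.stop();
recorder.release();
myEncoder.finish();                      // hypothetical: flush the encoder and close the file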
I wrote code to record audio from my phone in raw 16-bit PCM format, but when I play it back the output sounds weird. What could be wrong?
My code segments are below.
short[] audioBuffer = new short[bufferSize / 2];
AudioRecord record = new AudioRecord(MediaRecorder.AudioSource.MIC,
        SAMPLE_RATE,
        AudioFormat.CHANNEL_IN_STEREO,
        AudioFormat.ENCODING_PCM_16BIT,
        bufferSize);
The sample rate is 44100.
int numberOfShort1 = record.read(audioData, 0, audioData.length);
shortsRead += numberOfShort;
readtimes++;
try {
    os.write(audioData, 0, audioData.length);
} catch (IOException e) {
    Log.e(LOG_TAG, "Error saving recording ", e);
    //runonce=2;
    return;
}
Could someone tell me what's wrong with my code? I am able to play other raw audio files fine.
This is how it sounds: Link to audio
My voice sounds like some weird alien growl.
I am trying to develop an application where the playback speed of a music file (mp3) can be set to 1x, 1.5x, 2x, 2.5x, and so on, but MediaPlayer does not support this feature below API 23. How can I use AudioTrack to play this mp3 file and also seek to a position? The code below gives me a "zzzzzzz" sound.
public void playAudio() {
    int minBufferSize = AudioTrack.getMinBufferSize(8000, AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT);
    int bufferSize = 512;
    audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC, 8000, AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT, minBufferSize, AudioTrack.MODE_STREAM);
    String filepath = Environment.getExternalStorageDirectory().getAbsolutePath();
    int i = 0;
    byte[] s = new byte[bufferSize];
    try {
        final String path = Environment.getExternalStorageDirectory().getAbsolutePath() + "/folioreader/audio" + ".mp3";
        FileInputStream fin = new FileInputStream(path);
        DataInputStream dis = new DataInputStream(fin);

        audioTrack.play();
        while ((i = dis.read(s, 0, bufferSize)) > -1) {
            audioTrack.write(s, 0, i);
        }
        audioTrack.stop();
        audioTrack.release();
        dis.close();
        fin.close();
    } catch (FileNotFoundException e) {
        // TODO
        e.printStackTrace();
    } catch (IOException e) {
        // TODO
        e.printStackTrace();
    }
}
AudioTrack can play uncompressed audio (WAV), not compressed formats (mp3, AAC, etc.). You need to use MediaCodec to decode the audio and then use AudioTrack to play it. Refer to these links: https://developer.android.com/reference/android/media/AudioTrack.html and https://developer.android.com/reference/android/media/MediaCodec.html.
For faster playback, give the AudioTrack a sample rate proportional to the speed you require. For example, to play 8 kHz audio at a 2x rate, tell the AudioTrack it is 16 kHz, and so on. This is a crude way.
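As a concrete illustration of that trick (assuming the decoded audio is 8 kHz mono 16-bit PCM):
// The PCM is really 8 kHz, but the AudioTrack is told 16 kHz, so the samples
// are consumed twice as fast (the pitch doubles too with this crude approach).
int minBufferSize = AudioTrack.getMinBufferSize(16000,
        AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);
AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, 16000,
        AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
        minBufferSize, AudioTrack.MODE_STREAM);
track.play();
// ...write the decoded PCM (not the raw mp3 bytes) with track.write(...) as before...
Alternatively, AudioTrack.setPlaybackRate(int) changes the rate on an existing track. Either way, the data you write must already be decoded PCM; writing the raw mp3 bytes, as in the loop above, is what produces the "zzzzzzz" noise.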
I have recorded voice with Android's AudioRecord and I would like to convert it to Ogg Vorbis, as it is patent free. I have tried the vorbis-java beta, but it does not seem to work, or I am making some mistake.
Here is my code:
int frequency = 44100;
int channel = AudioFormat.CHANNEL_IN_STEREO;
int mAudioSource = MediaRecorder.AudioSource.MIC;
int mAudioEncoder = AudioFormat.ENCODING_PCM_16BIT;
try {
    final File outputFile = new File(mOutputPath);
    DataOutputStream dos = new DataOutputStream(new BufferedOutputStream(new FileOutputStream(outputFile)));
    int bufferSize = AudioRecord.getMinBufferSize(frequency, channel, mAudioEncoder);
    AudioRecord audioRecord = new AudioRecord(mAudioSource, frequency, channel, mAudioEncoder, bufferSize);
    short[] buffer = new short[bufferSize];

    audioRecord.startRecording();
    while (isRecordStart) {
        int bufferReadResult = audioRecord.read(buffer, 0, bufferSize);
        for (int i = 0; i < bufferReadResult; i++) {
            dos.writeShort(buffer[i]);
        }
    }
    audioRecord.stop();
    dos.close();
} catch (FileNotFoundException e) {
    e.printStackTrace();
} catch (IOException e) {
    e.printStackTrace();
}
I save it to a file with a .wav extension and use the vorbis-java example to encode it, but the output is only "zzz...".
How can I encode this to Ogg Vorbis on Android?
I think I read this question a few weeks ago and was also super frustrated. I ended up writing the needed NDK wrapper to use Xiph.org's code. The only catch is that, in order to make it run well, I had to enable floating-point instructions. Emulators don't have floating point, so it will crash the emulator. Run it on pretty much any phone, though, and you'll be good to go. It's designed to emulate a FileInputStream and FileOutputStream for interfacing with the Vorbis files.
https://github.com/nwertzberger/libogg-vorbis-android
You seem to be writing raw audio data into a file rather than WAV format. A WAV file has a header, not just the audio data.
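If you just need the encoder's example code to accept the file, a minimal 44-byte WAV header like the sketch below (16-bit PCM; pass in the sample rate and channel count you actually recorded with, and the PCM payload size in bytes) can be written before the data. Note also that DataOutputStream.writeShort writes big-endian samples, while WAV expects little-endian, so the sample bytes may need swapping as well.
// Minimal 44-byte WAV header for 16-bit PCM (uses java.nio.ByteBuffer/ByteOrder).
private static byte[] wavHeader(int dataLen, int sampleRate, int channels) {
    int byteRate = sampleRate * channels * 2;          // bytes per second for 16-bit samples
    ByteBuffer h = ByteBuffer.allocate(44).order(ByteOrder.LITTLE_ENDIAN);
    h.put("RIFF".getBytes());
    h.putInt(36 + dataLen);                            // total chunk size
    h.put("WAVE".getBytes());
    h.put("fmt ".getBytes());
    h.putInt(16);                                      // fmt chunk size
    h.putShort((short) 1);                             // audio format: PCM
    h.putShort((short) channels);
    h.putInt(sampleRate);
    h.putInt(byteRate);
    h.putShort((short) (channels * 2));                // block align
    h.putShort((short) 16);                            // bits per sample
    h.put("data".getBytes());
    h.putInt(dataLen);                                 // size of the PCM payload
    return h.array();
}
For the 44.1 kHz stereo recording above that would be wavHeader(pcmLengthInBytes, 44100, 2), written to the output stream before the samples.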
Note: don't use vorbis-java; instead, compile libogg and libvorbis from the sources at http://www.xiph.org/downloads/
Use the Android NDK to compile them for embedding in your APK file.
Then you can call the native code from your app to encode the audio data.
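The Java side of such a binding might look roughly like this; the library name and method signatures are invented purely for illustration, not an existing API:
// Hypothetical JNI surface over an NDK build of libogg/libvorbis.
public class VorbisEncoder {
    static {
        System.loadLibrary("vorbisenc_jni");   // the .so you build with the NDK
    }
    public native long init(int sampleRate, int channels, float quality);
    public native int encode(long handle, short[] pcm, int samples);
    public native void close(long handle);
}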