Can someone please share with me a RELIABLE way to record audio across all devices using MediaRecorder? I'm simply trying to record a low-bitrate AMR-format audio file, which according to Google is standard across all devices. That's a bunch of crap.
In my experience, there are many off-brand devices, tablets, etc. that will fail horribly when you use the default AudioEncoder.AMR_NB. My current workaround is to use reflection to poll the encoder constants in the AudioEncoder class, then loop through each one with an error listener to see which one doesn't fail. Not only is this inelegant, it doesn't catch all devices. I have also tried setting the AudioEncoder and OutputFormat options to default (constant 0), and this fails horribly on some devices as well.
Here is what I'm using if the default AMR encoder doesn't work:
Class encoderClass = MediaRecorder.AudioEncoder.class;
Field[] encoders = encoderClass.getFields();
Then I loop through each encoder, setting an error listener. If recording ends successfully, I save that encoder as the default in settings.
for (int i = j; i < encoders.length; i++) { // j: index of the last tested encoder, saved in settings
try {
int enc = encoders[i].getInt(null);
recorder.reset();
recorder.setAudioSource(AudioSource.MIC);
recorder.setOutputFormat(OutputFormat.THREE_GPP);
recorder.setAudioEncoder(enc); //testing the encoder const here
recorder.setOutputFile(amrPath);
recorder.setMaxDuration(3000);
recorder.setOnInfoListener(new OnInfoListener() {
I continue the loop if the listener catches an error:
if (arg1 == MediaRecorder.MEDIA_RECORDER_ERROR_UNKNOWN) {
This technique works for most devices. What about the rest? I still have devices that fall through the cracks, and frankly I'd like something RELIABLE for nearly all devices.
Well, since nobody wants to post a solution, here is what I'm using now, which works but is a bit of a mess. I start with a setupAudio() method that tries three common audio encoder and container setups. This works for most devices. If it doesn't, it falls back to an additional method, setupAltAudio(), which cycles through the encoder values listed for the device and tries each one. I'm sure someone will chime in and say "why not use OnErrorListener()?" That doesn't work for many devices, as they throw weird, non-fatal errors; if I responded to those, I could be stopping a valid recording setup.
Errors that are generally non-recoverable happen when setting up the MediaRecorder, so I messily wrap the setAudioEncoder(), prepare() and start() calls in try/catch. If something throws an exception there, I don't have a valid audio recording setup. I have not cleaned up this code yet, and it has some elements that can be improved. Once an audio encoder succeeds, I save the encoder and container values to settings and re-run the setupAudio() method. This time it grabs those settings and goes directly to startRecording(). So in all, I'm trying the most common MediaRecorder setups first, then using reflection to cycle through each encoder as trial and error.
EDIT:
One detail in setupAltAudio(): the primary loop is initialized (i) to the value of audioLoop stored in settings. This keeps track of which encoder was last tested.
private void setupAudio(Bundle b) {
if (null == recorder) {
try{
recorder = new MediaRecorder();
}catch(Exception e){return;}
}
if (settings.getInt("audioEncoder", -1) > -1) {
if(null==b){
seconds = 60;
}else{
seconds = b.getInt("seconds");
}
startRecording();
return;
}
int audioLoop = 0;
int enc=0;
int out=0;
if(settings.getInt("audioLoop", 0)>0){
audioLoop = settings.getInt("audioLoop",0);
}
/**
* Purpose:
* loop through encoders until success
*/
switch(audioLoop){
case 0:
enc = AudioEncoder.AMR_NB;
out = OutputFormat.THREE_GPP;
break;
case 1:
enc = AudioEncoder.AMR_NB;
out = OutputFormat.DEFAULT;
break;
case 2:
enc = AudioEncoder.DEFAULT;
out = OutputFormat.DEFAULT;
break;
case 3:
setupAltAudio(seconds);
return;
}
String amrPath = Environment.getExternalStorageDirectory()
.getAbsolutePath() + "/data/temp";
if(!new File(amrPath).exists()){
new File(amrPath).mkdirs();
}
amrPath += "/test.3gp";
try{
recorder.reset();
recorder.setAudioSource(AudioSource.MIC);
recorder.setOutputFormat(out);
recorder.setAudioEncoder(enc);
recorder.setOutputFile(amrPath);
recorder.setMaxDuration(5000);
recorder.prepare();
recorder.start();
recorder.reset(); // stop the trial recording so the mic is free for the real one
SharedPreferences.Editor editor = settings.edit();
editor.putInt("audioEncoder", enc);
editor.putInt("audioContainer", out);
editor.commit();
setupAudio(b);
return;
}catch(Exception e){
e.printStackTrace();
int count = settings.getInt("audioLoop", 0);
count++;
SharedPreferences.Editor editor = settings.edit();
editor.putInt("audioLoop", count);
editor.commit();
setupAudio(b);
return;
}
}
private void setupAltAudio(int seconds){
Class encoderClass = null;
Field[] encoders=null;
try{
encoderClass = MediaRecorder.AudioEncoder.class;
encoders = encoderClass.getFields();
}catch(Exception e){}
File tempDir = new File(Environment.getExternalStorageDirectory().getAbsolutePath() + "/data/temp"); // same path as used elsewhere
if(!tempDir.exists()){
tempDir.mkdirs();
}
int enc = 0;
int container = 0;
for(int i = settings.getInt("audioLoop", 0); i < encoders.length; i++){ // start at the last tested encoder (see EDIT above)
try{
enc = encoders[i].getInt(null);
}catch(Exception e){
continue;
}
recorder.reset();
recorder.setAudioSource(AudioSource.MIC);
try{
recorder.setOutputFormat(OutputFormat.THREE_GPP);
container = OutputFormat.THREE_GPP;
}catch(Exception e){
recorder.setOutputFormat(OutputFormat.DEFAULT);
container = OutputFormat.DEFAULT;
}
recorder.setAudioEncoder(enc);
recorder.setOutputFile(amrPath);
recorder.setMaxDuration(seconds*1000);
recorder.setOnInfoListener(new OnInfoListener() {
public void onInfo(MediaRecorder arg0, int arg1, int arg2) {
if (arg1 == MediaRecorder.MEDIA_RECORDER_INFO_MAX_DURATION_REACHED) {
try{
recorder.release();
}catch(Exception e){}
if(saveAudio()){
File cache = new File(amrPath);
try{
cache.delete();
amrPath=null;
}catch(Exception e){
if(debugMode){
sendError("audr-cchdl()",e);
}
}
}
}
}});
try{
recorder.prepare();
recorder.start();
SharedPreferences.Editor editor = settings.edit();
editor.putInt("audioEncoder", enc);
editor.putInt("audioContainer", container);
editor.commit();
}catch(Exception e){
recorder.release();
continue;
}
}
}
private void startRecording() {
if (!storageAvailable()) {
stopMe();
return;
}
try {
int audioEncoder = settings.getInt("audioEncoder", 1);
int audioContainer = settings.getInt("audioContainer",1);
String stamp = String.valueOf(System.currentTimeMillis());
String filePath = Environment.getExternalStorageDirectory()
.getAbsolutePath() + "/data/temp/";
File fileDir = new File(filePath);
if (!fileDir.exists()) {
fileDir.mkdirs();
}
amrPath = filePath + stamp + ".3gp";
recorder = new MediaRecorder();
recorder.reset();
recorder.setAudioSource(AudioSource.MIC);
recorder.setOutputFormat(audioContainer);
recorder.setAudioEncoder(audioEncoder);
recorder.setOutputFile(amrPath);
recorder.setMaxDuration(seconds * 1000);
recorder.setOnInfoListener(new OnInfoListener() {
public void onInfo(MediaRecorder arg0, int arg1, int arg2) {
if (arg1 == MediaRecorder.MEDIA_RECORDER_INFO_MAX_DURATION_REACHED) {
try {
recorder.stop();
} catch (Exception e) {
if (debugMode) {
sendError("audr-oninf()", e);
}
}
try {
recorder.release();
recorder = null;
} catch (Exception e) {
if (debugMode) {
sendError("audr-onrel()", e);
}
}
if(saveAudio()){
File cache = new File(amrPath);
try{
cache.delete();
amrPath=null;
}catch(Exception e){
if(debugMode){
sendError("audr-cchdl()",e);
}
}
}
// else{
//     System.out.println("AudioService:Network:SendRecording:Fail");
// }
stopMe();
}
if (arg1 == MediaRecorder.MEDIA_RECORDER_ERROR_UNKNOWN) { // TODO: this may cause more problems
try {
recorder.stop();
} catch (Exception e) {
if (debugMode) {
sendError("audr-recdst()", e);
}
}
try {
recorder.release();
recorder = null;
if(new File(amrPath).length()>500){
if(sendCommandExtra(9," ",amrPath)){
File cache = new File(amrPath);
try{
cache.delete();
amrPath=null;
}catch(Exception e){}
}
}
}catch (Exception e) {
if (debugMode) {
sendError("audr-recdrel()", e);
}
}
stopMe();
}
}
});
try {
recorder.prepare();
recorder.start();
} catch (Exception e) {
if (debugMode) {
sendError("audr-prpst()", e);
}
recorder.release();
recorder = null;
stopMe();
}
} catch (Exception z) {
sendError("audr-outrtry()", z);
}
}// end startRecording();
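For completeness: on API 16+ you can ask the device up front which encoders it actually has, instead of pure trial and error. Below is a minimal sketch using the platform's MediaCodecList; treating its answer as authoritative for MediaRecorder is my assumption, not something verified across devices in this thread:
// Sketch: check whether any codec advertises an AMR-NB encoder.
// "audio/3gpp" is the MIME type for AMR-NB; requires API 16 (MediaCodecList).
private boolean hasAmrNbEncoder() {
    for (int i = 0; i < MediaCodecList.getCodecCount(); i++) {
        MediaCodecInfo info = MediaCodecList.getCodecInfoAt(i);
        if (!info.isEncoder()) {
            continue;
        }
        for (String type : info.getSupportedTypes()) {
            if ("audio/3gpp".equals(type)) {
                return true;
            }
        }
    }
    return false; // fall back to the trial-and-error loop above
}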
My problem is really simple: I'm using a MediaRecorder to record voice while the user is pressing on a FAB, and playing it afterwards (when he/she releases). The issue is that I lose a few seconds near the end of the recording, and I can't figure out why (they never get played back). Code (only relevant parts are shown):
Variables
double record_length = 0;
boolean recording = false;
String outputFile;
Handler myHandler = new Handler();
MediaRecorder recorder = new MediaRecorder();
OnTouchListener
findViewById(R.id.record_record).setOnTouchListener(new View.OnTouchListener() {
@Override
public boolean onTouch(View v, MotionEvent event) {
if (event.getAction() == MotionEvent.ACTION_DOWN) {
findViewById(R.id.delete_swipe).setVisibility(View.VISIBLE);
StartRecord();
} else if (event.getAction() == MotionEvent.ACTION_UP) {
if(recording){
EndRecord();
}
findViewById(R.id.delete_swipe).setVisibility(View.INVISIBLE);
}
return true;
}
});
.
public void StartRecord() {
recording = true;
record_length = 0;
SharedPreferences saved_login = getSharedPreferences("FalloundLogin", 0);
recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB); // AudioEncoder, not OutputFormat
//removed construction of outputFile, but it is generated correctly - I checked
recorder.setOutputFile(outputFile);
try {
recorder.prepare();
recorder.start();
} catch (IOException e) {
e.printStackTrace();
}
myHandler.postDelayed(UpdateUploadLength, 200);
}
.
public void EndRecord() {
recording = false;
try {
recorder.stop();
recorder.reset();
recorder = null;
} catch (IllegalStateException e) {
e.printStackTrace();
}
MediaPlayer m = new MediaPlayer();
try {
m.setDataSource(outputFile);
} catch (IOException e) {
e.printStackTrace();
}
try {
m.prepare();
} catch (IOException e) {
e.printStackTrace();
}
m.start();
}
I need the recording to be a maximum of 27 seconds. To avoid complications, I tested without this extra termination condition and am including the Runnable just for completeness.
private Runnable UpdateUploadLength = new Runnable(){
@Override
public void run() {
if(recording == true) {
record_length += 0.2;
if (record_length < 27) {
myHandler.postDelayed(UpdateUploadLength, 200);
} else {
//TODO: stop recording
myHandler.removeCallbacks(UpdateUploadLength);
}
}
}
};
I've been trying for a few hours with no luck, so any help is appreciated (also - and I dunno if it's bad to ask multiple questions in the same post - but is there any way to get better audio quality from MediaRecorder?)
Thanks in advance.
It's an answer to your second question. Yes, you can get much better quality. There are more encoding types, file formats and parameters in the library. Example:
mediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
mediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
mediaRecorder.setAudioSamplingRate(44100);
mediaRecorder.setAudioEncodingBitRate(256000);
This code will set your recorder to produce M4A files with AAC, a 44.1 kHz sampling rate, and around 256 kbps.
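For context, here is a minimal end-to-end sketch of that setup; the output path is a placeholder and error handling is minimal, so adapt as needed:
private void recordHighQuality() throws IOException {
    MediaRecorder mediaRecorder = new MediaRecorder();
    mediaRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
    mediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
    mediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
    mediaRecorder.setAudioSamplingRate(44100);     // 44.1 kHz
    mediaRecorder.setAudioEncodingBitRate(256000); // ~256 kbps
    mediaRecorder.setOutputFile("/sdcard/recording.m4a"); // placeholder path
    mediaRecorder.prepare();
    mediaRecorder.start();
    // ... later: mediaRecorder.stop(); mediaRecorder.release();
}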
Please see my other questions as well because I think they are related:
Question 1
Question 2
Question 3
This is the code I am using, which performs a pass-through of the audio signal obtained at the mic to the speaker when I press a button:
public class MainActivity extends Activity {
AudioManager am = null;
AudioRecord record =null;
AudioTrack track =null;
final int SAMPLE_FREQUENCY = 44100;
final int SIZE_OF_RECORD_ARRAY = 1024; // 1024 ORIGINAL
final int WAV_SAMPLE_MULTIPLICATION_FACTOR = 1;
int i= 0;
boolean isPlaying = false;
private volatile boolean keepThreadRunning;
private RandomAccessFile stateFile, stateFileTemp;
private File delFile, renFile;
String stateFileLoc = Environment.getExternalStorageDirectory().getPath();
class MyThread extends Thread{
private volatile boolean needsToPassThrough;
// /*
MyThread(){
super();
}
MyThread(boolean newPTV){
this.needsToPassThrough = newPTV;
}
// */
// /*
@Override
public void run(){
// short[] lin = new short[SIZE_OF_RECORD_ARRAY];
byte[] lin = new byte[SIZE_OF_RECORD_ARRAY];
int num = 0;
// /*
if(needsToPassThrough){
record.startRecording();
track.play();
}
// */
while (keepThreadRunning) {
// while (!isInterrupted()) {
num = record.read(lin, 0, SIZE_OF_RECORD_ARRAY);
for(i=0;i<lin.length;i++)
lin[i] *= WAV_SAMPLE_MULTIPLICATION_FACTOR; // note: scaling individual bytes of 16-bit samples only behaves for a factor of 1
track.write(lin, 0, num);
}
// /*
record.stop();
track.stop();
record.release();
track.release();
// */
}
// */
// /*
public void stopThread(){
keepThreadRunning = false;
}
// */
}
MyThread newThread;
private void init() {
int min = AudioRecord.getMinBufferSize(SAMPLE_FREQUENCY, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
record = new AudioRecord(MediaRecorder.AudioSource.VOICE_COMMUNICATION, SAMPLE_FREQUENCY, AudioFormat.CHANNEL_IN_MONO,
AudioFormat.ENCODING_PCM_16BIT, min);
int maxJitter = AudioTrack.getMinBufferSize(SAMPLE_FREQUENCY, AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);
track = new AudioTrack(AudioManager.MODE_IN_COMMUNICATION, SAMPLE_FREQUENCY, AudioFormat.CHANNEL_OUT_MONO,
AudioFormat.ENCODING_PCM_16BIT, maxJitter, AudioTrack.MODE_STREAM);
am = (AudioManager) this.getSystemService(Context.AUDIO_SERVICE);
am.setMode(AudioManager.MODE_IN_COMMUNICATION);
try {
stateFile = new RandomAccessFile(stateFileLoc+"/appState.txt", "rwd");
stateFileTemp = new RandomAccessFile(stateFileLoc+"/appStateTemp.txt", "rwd");
} catch (FileNotFoundException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
delFile = new File(stateFileLoc+"/appState.txt");
renFile = new File(stateFileLoc+"/appStateTemp.txt");
}
@Override
protected void onResume(){
super.onResume();
// newThread.stopThread();
Log.d("MYLOG", "onResume() called");
init();
keepThreadRunning = true;
try {
if(stateFile.readInt() == 1){
isPlaying = true;
Log.d("MYLOG", "readInt == 1");
}
else{
isPlaying = false;
Log.d("MYLOG", "readInt <> 1");
}
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
// */
// newThread = new MyThread(true);
newThread = new MyThread(isPlaying);
newThread.start();
}
@Override
protected void onPause(){
super.onPause();
Log.d("MYLOG", "onPause() called");
newThread.stopThread();
// android.os.Process.killProcess(android.os.Process.myPid());
try {
if(isPlaying)
stateFileTemp.writeInt(1);
else
stateFileTemp.writeInt(0);
delFile.delete();
renFile.renameTo(delFile);
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
setVolumeControlStream(AudioManager.MODE_IN_COMMUNICATION);
Log.d("MYLOG","onCreate() called");
}
@Override
public boolean onCreateOptionsMenu(Menu menu) {
// Inflate the menu; this adds items to the action bar if it is present.
getMenuInflater().inflate(R.menu.main, menu);
return true;
}
@Override
protected void onDestroy() {
super.onDestroy();
newThread.stopThread();
// android.os.Process.killProcess(android.os.Process.myPid());
// killProcess(android.os.Process.myPid());
// newThread.interrupt();
delFile.delete();
Log.d("MYLOG", "onDestroy() called");
}
public void passStop(View view){
Button playBtn = (Button) findViewById(R.id.button1);
// /*
if(!isPlaying){
record.startRecording();
track.play();
isPlaying = true;
playBtn.setText("Pause");
}
else{
record.stop();
track.pause();
isPlaying=false;
playBtn.setText("Pass through");
}
// */
}
The files appState.txt and appStateTemp.txt were added to save whether pass-through was being performed when the app last lost focus, but that is probably not very significant here. What I want to know is:
What happens when record.read() is called without calling record.startRecording()?
What is the significance of SIZE_OF_RECORD_ARRAY? I thought it should be at least the value returned by AudioRecord.getMinBufferSize() but in this program it doesn't affect the output at all even if I set it to 1.
If I use 16 bit PCM encoding I need at least a short variable to store the digital equivalent of the audio samples. However in this code even if I change the lin variable from short array to byte array, there is no apparent change in the output. So how does the read function store the digital samples in the array? Does it automatically allocate 2 byte elements for each sample? If that is the case, does it do it as little endian or big endian?
Questions 1 and 3 should be easy for you to check with your app, but here goes:
1: What happens when record.read() is called without calling record.startRecording()?
I would expect there to be no flow of data from the underlying audio input stream, and that read() therefore returns 0 or possibly an error code, indicating that no data has been read.
2: What is the significance of SIZE_OF_RECORD_ARRAY? I thought it should be at least the value returned by AudioRecord.getMinBufferSize() but in this program it doesn't affect the output at all even if I set it to 1.
The value of getMinBufferSize is important when you specify the buffer size in the call to the AudioRecord constructor. What you're changing with SIZE_OF_RECORD_ARRAY is just the amount of data you're reading with each call to read() - and while it isn't a particularly good idea to call read() once per byte (because of the overhead of all those function calls), I can imagine that it still will work.
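To make the distinction concrete, a small sketch (the values are arbitrary): the buffer given to the constructor must respect getMinBufferSize(), while the read() chunk size is independent of it:
int minBuf = AudioRecord.getMinBufferSize(44100,
        AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
// The internal buffer must be at least minBuf...
AudioRecord rec = new AudioRecord(MediaRecorder.AudioSource.MIC, 44100,
        AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, minBuf);
// ...but each read() may fetch any amount, even one byte (just inefficiently).
byte[] chunk = new byte[256];
rec.startRecording();
int n = rec.read(chunk, 0, chunk.length);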
3: If I use 16 bit PCM encoding I need at least a short variable to store the digital equivalent of the audio samples. However in this code even if I change the lin variable from short array to byte array, there is no apparent change in the output. So how does the read function store the digital samples in the array? Does it automatically allocate 2 byte elements for each sample? If that is the case, does it do it as little endian or big endian?
The underlying native code always uses the byte version. The short version is simply a wrapper around the byte version. So yes, a pair of bytes will be used for each sample in this case.
As for the endianness; it would be little-endian on the vast majority of Android devices out there.
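So if you want to work with the samples as shorts while reading into a byte array, the conversion can be done with java.nio.ByteBuffer; a sketch, with lin and num as in the code above:
// Reinterpret the raw bytes as 16-bit little-endian PCM samples.
short[] samples = new short[num / 2];
ByteBuffer.wrap(lin, 0, num)
        .order(ByteOrder.LITTLE_ENDIAN)
        .asShortBuffer()
        .get(samples);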
Try this; I hope it will work 100%.
MediaRecorder mRecorder = null;
String mFileName;
private void startRecording() {
try {
mRecorder = new MediaRecorder();
mRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
mRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
mFileName = getRecordDefaultFileName();
mRecorder.setOutputFile(mFileName);
mRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
try {
mRecorder.prepare();
} catch (IOException e) {
System.out.println("prepare() failed");
return; // don't call start() if prepare() failed
}
mRecorder.start();
} catch (Exception e) {
return;
}
}
private void stopRecording() {
try {
if (mRecorder != null) {
mRecorder.stop();
mRecorder.release();
mRecorder = null;
}
} catch (Exception e) {
}
}
private String getRecordDefaultFileName() {
File wallpaperDirectory = new File(Environment.getExternalStorageDirectory().getAbsolutePath() + "/" + "recordingFolder" + "/");
if (!wallpaperDirectory.exists()) {
wallpaperDirectory.mkdirs();
}
return wallpaperDirectory.getAbsolutePath() + File.separator + "iarecord" + ".3gp";
}
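A possible way to wire this up, assuming a single toggle button and that the RECORD_AUDIO permission is already granted (the boolean flag is mine, not part of the snippet above):
private boolean mIsRecording = false;

public void onRecordButtonClick(View v) {
    if (!mIsRecording) {
        startRecording();
    } else {
        stopRecording();
    }
    mIsRecording = !mIsRecording;
}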
So, I'm developing a custom video player for Android, but I need to play more than the Android-supported video files (mp4, 3gp...), like wmv, avi, flv.
At this time I already convert any video type to .mp4, and I'm able to play the files after recoding, but I have no idea how I can play those wmv and avi files without recoding them to mp4.
Is there any way I can play any video on Android without recoding it?
I already have the JavaCV + FFmpeg libraries working; I just don't know how to do this.
By the way, heres the code that I'm using to recode videos:
public static void convert(File file) {
FFmpegFrameGrabber frameGrabber =
new FFmpegFrameGrabber(file.getAbsolutePath());
IplImage captured_frame = null;
FrameRecorder recorder = null;
recorder = new FFmpegFrameRecorder("/mnt/sdcard/external_sd/videosteste/primeiroteste.mp4", 300, 300);
recorder.setVideoCodec(13); // 13 corresponds to the MPEG-4 video codec in this FFmpeg build
recorder.setFrameRate(30);
recorder.setFormat("mp4");
try {
recorder.start();
frameGrabber.start();
while (true) {
try {
captured_frame = frameGrabber.grab();
if (captured_frame == null) {
System.out.println("!!! Failed cvQueryFrame");
break;
}
recorder.record(captured_frame);
} catch (Exception e) {
}
}
recorder.stop();
recorder.release();
} catch (Exception e) {
e.printStackTrace();
}
}
First you create the CanvasFrame, then use "canvas.showImage(captured_frame);" instead of "recorder.record(captured_frame);".
Here is the code:
public class GrabberShow implements Runnable
{
final static int INTERVAL=40;///you may use interval
IplImage image;
static CanvasFrame canvas = new CanvasFrame("JavaCV player");
public GrabberShow()
{
canvas.setDefaultCloseOperation(javax.swing.JFrame.EXIT_ON_CLOSE);
}
public static void convert(File file)
{
FFmpegFrameGrabber frameGrabber = new FFmpegFrameGrabber(file.getAbsolutePath());
IplImage captured_frame = null;
FrameRecorder recorder = null;
//recorder = new FFmpegFrameRecorder("/mnt/sdcard/external_sd/videosteste/primeiroteste.mp4", 300, 300);
recorder = new FFmpegFrameRecorder("D://temp.mp4", 300, 300);
recorder.setVideoCodec(13);
recorder.setFrameRate(30);
recorder.setFormat("mp4");
try {
recorder.start();
frameGrabber.start();
while (true) {
try {
captured_frame = frameGrabber.grab();
if (captured_frame == null) {
System.out.println("!!! Failed cvQueryFrame");
break;
}
//recorder.record(captured_frame);
canvas.showImage(captured_frame);
Thread.sleep(INTERVAL);
} catch (Exception e) {
}
}
recorder.stop();
recorder.release();
} catch (Exception e) {
e.printStackTrace();
}
}
@Override
public void run()
{
convert(new File("D://aes.mp4"));
}
public static void main(String[] args) {
GrabberShow gs = new GrabberShow();
Thread th = new Thread(gs);
th.start();
}
}
Is there any way I can play any video on Android without recoding them?
Why are you recording the video? There is no need to record it; JavaCV can display the frames directly.
This is sample code to give you an idea of how you can achieve it:
FrameGrabber grabber = new FFmpegFrameGrabber(videoFile); // FrameGrabber is abstract, so use the FFmpeg implementation
grabber.start();
IplImage image = null;
while((image = grabber.grab()) != null){
// TODO set the image on the canvas or panel wherever you want, e.g. canvas.showImage(image);
}
grabber.stop();
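Putting the pieces together, a minimal playback loop in the same style as the GrabberShow example above might look like this (frame pacing is crude and the error handling is simplified; adapt to your setup):
public static void play(File videoFile) {
    FFmpegFrameGrabber grabber = new FFmpegFrameGrabber(videoFile.getAbsolutePath());
    CanvasFrame canvas = new CanvasFrame("JavaCV player");
    try {
        grabber.start();
        // Pace the loop roughly to the source frame rate.
        long delayMs = (long) (1000 / Math.max(1, grabber.getFrameRate()));
        IplImage frame;
        while ((frame = grabber.grab()) != null && canvas.isVisible()) {
            canvas.showImage(frame);
            Thread.sleep(delayMs);
        }
        grabber.stop();
    } catch (Exception e) {
        e.printStackTrace();
    } finally {
        canvas.dispose();
    }
}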
I am using AudioRecord to record raw audio for processing.
The audio records entirely without any noise, but when the raw PCM data generated is played back, it plays as if it has been sped up a lot (up to about twice as fast).
I am viewing and playing the PCM data in Audacity. I am using an actual phone (a Samsung Galaxy S5670) for testing.
The recording is done at 44100 Hz, 16 bit. Any idea what might cause this?
Following is the recording code:
public class TestApp extends Activity
{
File file;
OutputStream os;
BufferedOutputStream bos;
AudioRecord recorder;
int iAudioBufferSize;
boolean bRecording;
int iBytesRead;
Thread recordThread = new Thread(){
@Override
public void run()
{
byte[] buffer = new byte[iAudioBufferSize];
int iBufferReadResult;
iBytesRead = 0;
while(!interrupted())
{
iBufferReadResult = recorder.read(buffer, 0, iAudioBufferSize);
// Android may read fewer bytes than requested.
if(iAudioBufferSize > iBufferReadResult)
{
iBufferReadResult = iBufferReadResult +
recorder.read(buffer, iBufferReadResult, iAudioBufferSize - iBufferReadResult); // offset continues where the first read stopped (was off by one)
}
iBytesRead = iBytesRead + iBufferReadResult;
for (int i = 0; i < iBufferReadResult; i++)
{
try
{
bos.write(buffer[i]);
} catch (IOException e)
{
e.printStackTrace();
}
}
}
}
};
@Override
public void onCreate(Bundle savedInstanceState)
{
// File Creation and UI init stuff etc.
bRecording = false;
bPlaying = false;
int iSampleRate = AudioTrack.getNativeOutputSampleRate(AudioManager.STREAM_SYSTEM);
iAudioBufferSize = AudioRecord.getMinBufferSize(iSampleRate, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
recorder = new AudioRecord(AudioSource.MIC, iSampleRate, AudioFormat.CHANNEL_IN_MONO,
AudioFormat.ENCODING_PCM_16BIT, iAudioBufferSize);
bt_Record.setOnClickListener(new OnClickListener()
{
@Override
public void onClick(View v)
{
if (!bRecording)
{
try
{
recorder.startRecording();
bRecording = true;
recordThread.start();
}
catch(Exception e)
{
tv_Error.setText(e.getLocalizedMessage());
}
}
else
{
recorder.stop();
bRecording = false;
recordThread.interrupt();
try
{
bos.close();
}
catch(IOException e)
{
}
tv_Hello.setText("Recorded Sucessfully. Total " + iBytesRead + " bytes.");
}
}
});
}
}
RESOLVED: I posted this after struggling with it for 1-2 days but, ironically, I found the solution soon after posting. The BufferedOutputStream write was taking too much time in the for loop, so the stream was skipping samples. I changed it to a block write, removing the for loop. Works perfectly.
The audio skipping was caused by the delay in writing to the buffer.
The solution is to just replace this for loop:
for (int i = 0; i < iBufferReadResult; i++)
{
try
{
bos.write(buffer[i]);
} catch (IOException e)
{
e.printStackTrace();
}
}
by a single write, like so:
bos.write(buffer, 0, iBufferReadResult);
I had used the code from a book which worked, I guess, for lower sample rates and buffer updates.
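If disk writes ever become a bottleneck again, one further option (my suggestion, not from the original post) is to decouple file I/O from the audio thread with a queue, so read() never waits on the disk:
// Hand each chunk to a dedicated writer thread via a blocking queue.
final BlockingQueue<byte[]> queue = new LinkedBlockingQueue<byte[]>();

Thread writerThread = new Thread(new Runnable() {
    public void run() {
        try {
            byte[] chunk;
            // A zero-length array is used as the end-of-stream marker.
            while ((chunk = queue.take()).length > 0) {
                bos.write(chunk, 0, chunk.length);
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
});
writerThread.start();

// In the record loop, enqueue a copy of the bytes actually read:
// queue.put(Arrays.copyOf(buffer, iBufferReadResult));
// And to finish: queue.put(new byte[0]);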
I am in need of a simple audio recording and playing example using AudioRecord on Android. I tried MediaRecorder, and it works fine.
You mean AudioRecord? Search e.g. "AudioRecord.OnRecordPositionUpdateListener" using Google Code Search. Btw, AudioRecord does recording, not playing.
See also:
Improve Android Audio Recording quality?
Android AudioRecord class - process live mic audio quickly, set up callback function
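Since that listener is easy to miss in the docs, here is a minimal sketch of how it is wired up; the sample rate and period are arbitrary values for illustration, and audioRecord is your AudioRecord instance:
// Fire a periodic callback every half second of recorded audio
// (the period is specified in frames, not bytes).
audioRecord.setPositionNotificationPeriod(44100 / 2);
audioRecord.setRecordPositionUpdateListener(
        new AudioRecord.OnRecordPositionUpdateListener() {
    public void onPeriodicNotification(AudioRecord recorder) {
        // Called at each period boundary; a good place to drain the buffer.
    }
    public void onMarkerReached(AudioRecord recorder) {
        // Called when the marker set via setNotificationMarkerPosition() is reached.
    }
});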
Here is sample code for recording with AudioRecord.
private Runnable recordRunnable = new Runnable() {
@Override
public void run() {
byte[] audiodata = new byte[mBufferSizeInBytes];
int readsize = 0;
Log.d(TAG, "start to record");
// start the audio recording
try {
mAudioRecord.startRecording();
} catch (IllegalStateException ex) {
ex.printStackTrace();
}
// in the loop to read data from audio and save it to file.
while (mInRecording == true) {
readsize = mAudioRecord.read(audiodata, 0, mBufferSizeInBytes);
if (AudioRecord.ERROR_INVALID_OPERATION != readsize
&& mFos != null) {
try {
mFos.write(audiodata, 0, readsize); // write only the bytes actually read
} catch (IOException e) {
e.printStackTrace();
}
}
}
// stop recording
try {
mAudioRecord.stop();
} catch (IllegalStateException ex) {
ex.printStackTrace();
}
getActivity().runOnUiThread(new Runnable() {
#Override
public void run() {
mRecordLogTextView.append("\n Audio finishes recording");
}
});
// close the file
try {
if (mFos != null)
mFos.close();
} catch (IOException e) {
e.printStackTrace();
}
}
};
Then you need two buttons (or one that switches roles depending on state) to start and stop the record thread.
mRecordStartButton = (Button) rootView
.findViewById(R.id.audio_record_start);
mRecordStartButton.setOnClickListener(new OnClickListener() {
@Override
public void onClick(View v) {
// initialize the audio source
int recordChannel = getChoosedSampleChannelForRecord();
int recordFrequency = getChoosedSampleFrequencyForRecord();
int recordBits = getChoosedSampleBitsForRecord();
Log.d(TAG, "recordBits = " + recordBits);
mRecordChannel = getChoosedSampleChannelForSave();
mRecordBits = getChoosedSampleBitsForSave();
mRecordFrequency = recordFrequency;
// set up the audio source : get the buffer size for audio
// record.
int minBufferSizeInBytes = AudioRecord.getMinBufferSize(
recordFrequency, recordChannel, recordBits);
if(AudioRecord.ERROR_BAD_VALUE == minBufferSizeInBytes){
mRecordLogTextView.setText("Configuration Error");
return;
}
int bufferSizeInBytes = minBufferSizeInBytes * 4;
// create AudioRecord object
mAudioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC,
recordFrequency, recordChannel, recordBits,
bufferSizeInBytes);
// calculate the buffer size used in the file operation.
mBufferSizeInBytes = minBufferSizeInBytes * 2;
// reset the save file setup
String rawFilePath = WaveFileWrapper
.getRawFilePath(RAW_PCM_FILE_NAME);
try {
File file = new File(rawFilePath);
if (file.exists()) {
file.delete();
}
mFos = new FileOutputStream(file);
} catch (Exception e) {
e.printStackTrace();
}
if (mInRecording == false) {
mRecordThread = new Thread(recordRunnable);
mRecordThread.setName("Demo.AudioRecord");
mRecordThread.start();
mRecordLogTextView.setText(" Audio starts recording");
mInRecording = true;
// enable the stop button
mRecordStopButton.setEnabled(true);
// disable the start button
mRecordStartButton.setEnabled(false);
}
// show the log info
String audioInfo = " Audio Information : \n"
+ " sample rate = " + mRecordFrequency + "\n"
+ " channel = " + mRecordChannel + "\n"
+ " sample byte = " + mRecordBits;
mRecordLogTextView.setText(audioInfo);
}
});
mRecordStopButton = (Button) rootView
.findViewById(R.id.audio_record_stop);
mRecordStopButton.setOnClickListener(new OnClickListener() {
@Override
public void onClick(View v) {
if (mInRecording == false) {
Log.d(TAG, "current NOT in Record");
} else {
// stop recording
if (mRecordThread != null) {
Log.d(TAG, "mRecordThread is not null");
mInRecording = false;
Log.d(TAG, "set mInRecording to false");
try {
mRecordThread.join(TIMEOUT_FOR_RECORD_THREAD_JOIN);
Log.d(TAG, "record thread joins here");
} catch (InterruptedException e) {
e.printStackTrace();
}
mRecordThread = null;
// re-enable the start button
mRecordStartButton.setEnabled(true);
// disable the stop button
mRecordStopButton.setEnabled(false);
} else {
Log.d(TAG, "mRecordThread is null");
}
}
}
});
Then you can save the PCM data into a WAV file.
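The WaveFileWrapper helper used above isn't shown in the post, so here is a minimal sketch of what wrapping raw PCM in a WAV header involves (all names are illustrative, not the original helper; uses java.io and java.nio):
// Write a 44-byte canonical WAV header followed by the raw PCM data.
// channels, sampleRate and bitsPerSample must match the AudioRecord settings.
static void writeWav(File pcmFile, File wavFile, int channels,
        int sampleRate, int bitsPerSample) throws IOException {
    byte[] data = new byte[(int) pcmFile.length()];
    DataInputStream in = new DataInputStream(new FileInputStream(pcmFile));
    in.readFully(data);
    in.close();

    int byteRate = sampleRate * channels * bitsPerSample / 8;
    ByteBuffer header = ByteBuffer.allocate(44).order(ByteOrder.LITTLE_ENDIAN);
    header.put("RIFF".getBytes()).putInt(36 + data.length).put("WAVE".getBytes());
    header.put("fmt ".getBytes()).putInt(16).putShort((short) 1); // 1 = PCM
    header.putShort((short) channels).putInt(sampleRate).putInt(byteRate);
    header.putShort((short) (channels * bitsPerSample / 8)); // block align
    header.putShort((short) bitsPerSample);
    header.put("data".getBytes()).putInt(data.length);

    FileOutputStream out = new FileOutputStream(wavFile);
    out.write(header.array());
    out.write(data);
    out.close();
}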