I want to play an MP3 with mono and stereo effects. Currently I am working on the stereo effect. I have read all the documentation, but I am getting only white noise.
my code is:
public class AudioTest extends Activity {
    byte[] b;

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        AndroidAudioDevice device = new AndroidAudioDevice();
        File f = new File("/sdcard/lepord.mp3");
        try {
            FileInputStream in = new FileInputStream(f);
            int size = in.available();
            b = new byte[size];
            in.read(b); // reads the raw, still MP3-encoded bytes of the file
            in.close();
        } catch (Exception e) {
            e.printStackTrace();
        }
        while (true) {
            device.writeSamples(b); // writes those encoded bytes straight to the track
            try {
                Thread.sleep(2000);
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        }
    }
}
and my AndroidAudioDevice class is:
public class AndroidAudioDevice {
    AudioTrack track;
    byte[] buffer = new byte[158616];

    @SuppressWarnings("deprecation")
    public AndroidAudioDevice() {
        int minSize = AudioTrack.getMinBufferSize(44100, AudioFormat.CHANNEL_OUT_STEREO,
                AudioFormat.ENCODING_PCM_16BIT);
        track = new AudioTrack(AudioManager.STREAM_MUSIC, 44100,
                AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_16BIT,
                158616, AudioTrack.MODE_STREAM);
        Log.e("", "size we are using for track buffer is 158616");
        track.setStereoVolume(.6f, .6f);
        track.play();
    }

    public void writeSamples(byte[] b) {
        Log.e("", "bytes to be written to track: " + b.length);
        fillBuffer(b);
        track.write(buffer, 0, b.length);
    }

    private void fillBuffer(byte[] samples) {
        Log.e("", "track buffer length=" + buffer.length + " sample length=" + samples.length);
        if (buffer.length < samples.length)
            buffer = new byte[samples.length];
        for (int i = 0; i < samples.length; i++)
            buffer[i] = (byte) (samples[i] * Byte.MAX_VALUE); // scales every byte (see below)
    }
}
First, I am not getting any sound at all, only white noise. I just want to play a sound through AudioTrack, and then I will work on the mono and stereo effects.
Please help me.
Thanks in advance.
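For reference: AudioTrack consumes raw PCM samples, so writing the still-encoded bytes of an MP3 file to it can only produce noise, and fillBuffer() additionally multiplies every byte by Byte.MAX_VALUE, which would garble even valid PCM data. Below is a minimal sketch of playing a raw 16-bit PCM file with AudioTrack; the path and format are assumptions, not the poster's data.
// Minimal sketch: stream a raw 16-bit little-endian PCM file (44.1 kHz stereo).
void playRawPcm() throws IOException {
    int minSize = AudioTrack.getMinBufferSize(44100,
            AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_16BIT);
    AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, 44100,
            AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_16BIT,
            minSize, AudioTrack.MODE_STREAM);
    track.play();
    FileInputStream in = new FileInputStream("/sdcard/sample.pcm"); // assumed raw PCM, not MP3
    byte[] chunk = new byte[minSize];
    int n;
    while ((n = in.read(chunk)) > 0) {
        track.write(chunk, 0, n); // write the samples unmodified
    }
    in.close();
    track.stop();
    track.release();
}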
I am trying to write a short[] to a WAV file using a FileOutputStream, but the file contains only scratchy noise.
The reason I am using short[] rather than byte[] is that I am using an external library that provides Voice Activity Detection. I added the WAV header from "Android Audio Record to wav" and tried to convert the short[] to byte[] as in "Converting Short array from Audio Record to Byte array without degrading audio quality?", but none of those links were able to help me.
Here is my code:
private class ProcessVoice implements Runnable {

    @Override
    public void run() {
        File fl = new File(filePath, AUDIO_RECORDING_FILE_NAME);
        try {
            os = new BufferedOutputStream(new FileOutputStream(fl));
        } catch (FileNotFoundException e) {
            Log.w(TAG, "File not found for recording ");
        }
        android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_AUDIO);
        while (!Thread.interrupted() && isListening && audioRecord != null) {
            short[] buffer = new short[vad.getConfig().getFrameSize().getValue() * getNumberOfChannels() * 2];
            audioRecord.read(buffer, 0, buffer.length);
            isSpeechDetected(buffer);
        }
    }

    private void isSpeechDetected(final short[] buffer) {
        vad.isContinuousSpeech(buffer, new VadListener() {
            @Override
            public void onSpeechDetected() {
                callback.onSpeechDetected();
                bytes2 = new byte[buffer.length * 2];
                ByteBuffer.wrap(bytes2).order(ByteOrder.LITTLE_ENDIAN).asShortBuffer().put(buffer);
                //Log.w(TAG, String.valueOf(buffer));
                try {
                    // Writes the 44-byte header, then the voice buffer. Note that this
                    // runs on every detection, so headers end up interleaved with audio.
                    os.write(header, 0, 44);
                    working = true;
                    os.write(bytes2, 0, bytes2.length);
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }

            @Override
            public void onNoiseDetected() {
                callback.onNoiseDetected();
                if (working) {
                    working = false;
                    try {
                        doneRec();
                    } catch (IOException e) {
                        e.printStackTrace();
                    }
                }
                //Log.w(TAG, String.valueOf(bytes2));
            }
        });
    }
}
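A WAV header should be written once, before any samples, and its two size fields patched when recording ends. A sketch of that patch step, modeled on the RandomAccessFile approach used further down this page (the method name is an assumption; payload is the count of audio bytes written):
// Sketch: rewrite the RIFF chunk size (offset 4) and the data chunk size
// (offset 40) of a canonical 44-byte PCM WAV header after recording ends.
void patchWavSizes(RandomAccessFile wav, int payload) throws IOException {
    wav.seek(4);
    wav.writeInt(Integer.reverseBytes(36 + payload)); // RIFF chunk size
    wav.seek(40);
    wav.writeInt(Integer.reverseBytes(payload));      // data chunk size
    wav.close();
}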
I'm building, inside my existing app, a player using the AudioTrack class in MODE_STATIC, because I want to implement the time-stretch and loop-points features.
The code is OK for start() and stop(), but when paused, if I try to resume by calling play() again, the status bar remains fixed and no audio is played.
Now, from the docs:
public void pause(): Pauses the playback of the audio data. Data that has not been played back will not be discarded. Subsequent calls to play() will play this data back. See flush() to discard this data.
It seems easy enough to understand, but something escapes me.
Can someone help me?
Is it necessary to create boolean variables like start, play, pause, stopAudio, etc.?
If so, what is the use of the methods inherited from the AudioTrack class?
In MODE_STREAM I completed the project using such boolean variables, but I need MODE_STATIC. A comparison sketch follows.
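For comparison, a minimal pause/resume sketch for a MODE_STATIC track; the track parameter is an assumption, and reloadStaticData() should only be needed after stop(), not after pause():
// Sketch: toggle playback on a MODE_STATIC AudioTrack.
void togglePlayback(AudioTrack track) {
    switch (track.getPlayState()) {
        case AudioTrack.PLAYSTATE_PLAYING:
            track.pause();            // keeps the playback head position
            break;
        case AudioTrack.PLAYSTATE_PAUSED:
            track.play();             // resumes from the paused position
            break;
        case AudioTrack.PLAYSTATE_STOPPED:
            track.reloadStaticData(); // rewind the static buffer before replaying
            track.play();
            break;
    }
}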
This is the code, thanks:
Button playpause, stop;
SeekBar posBar;
int sliderval=0;
int headerOffset = 0x2C;
File file =new File(Environment.getExternalStorageDirectory(), "raw.pcm");
int fileSize = (int) file.length();
int dataSize = fileSize-headerOffset ;
byte[] dataArray = new byte[dataSize];
int posValue;
int dataBytesRead = initializeTrack();
AudioTrack audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC, 44100,
AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT, dataBytesRead , AudioTrack.MODE_STATIC);
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.main);
playpause= (Button)(findViewById(R.id.playpause));
stop= (Button)(findViewById(R.id.stop));
posBar=(SeekBar)findViewById(R.id.posBar);
// create a listener for the slider bar;
OnSeekBarChangeListener listener = new OnSeekBarChangeListener() {
public void onStopTrackingTouch(SeekBar seekBar) { }
public void onStartTrackingTouch(SeekBar seekBar) { }
public void onProgressChanged(SeekBar seekBar, int progress, boolean fromUser) {
if (fromUser) { sliderval = progress;}
}
};
// set the listener on the slider
posBar.setOnSeekBarChangeListener(listener); }
public void toggleButtonSound(View button)
{
switch (button.getId())
{
case R.id.playpause:
play();
break;
case R.id.stop:
stop();
break;
}
}
private void stop() {
    // Note: getState() reports initialization state; playback states come from getPlayState().
    if (audioTrack.getPlayState() == AudioTrack.PLAYSTATE_PLAYING
            || audioTrack.getPlayState() == AudioTrack.PLAYSTATE_PAUSED
            || audioTrack.getPlayState() == AudioTrack.PLAYSTATE_STOPPED) {
        audioTrack.stop();
        resetPlayer();
    }
}
Context context;
private double actualPos=0;
public void pause() {}
public void play()
{
if (audioTrack.getPlayState()==AudioTrack.PLAYSTATE_PLAYING)
{ //Log.i("", "Play pressed in state "+audioTrack.getPlayState());
audioTrack.pause();
}
else if (audioTrack.getPlayState()==AudioTrack.PLAYSTATE_PAUSED)
{ //Log.i("", "Play pressed in state "+audioTrack.getPlayState());
audioTrack.play();
}
else if (audioTrack.getPlayState()==AudioTrack.PLAYSTATE_STOPPED)
{ //Log.i("", "Play pressed in state "+audioTrack.getPlayState());
audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC, 44100, AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT, dataSize, AudioTrack.MODE_STATIC);
audioTrack.write(dataArray, 0, dataBytesRead);
audioTrack.play();
}
posBar.setMax((int) (dataBytesRead / 2)); // set the seek bar's maximum to the number of 16-bit samples
audioTrack.setNotificationMarkerPosition((int) (dataSize/2));
audioTrack.setPositionNotificationPeriod(1000);
audioTrack.setPlaybackPositionUpdateListener(new OnPlaybackPositionUpdateListener() {
@Override
public void onPeriodicNotification(AudioTrack track) {
posBar.setProgress(audioTrack.getPlaybackHeadPosition());
Log.i("", " " + audioTrack.getPlaybackHeadPosition() + " " + dataBytesRead/2);
}
@Override
public void onMarkerReached(AudioTrack track) {
Log.i("", " End reached ");
audioTrack.pause();
audioTrack.flush();
audioTrack.release();
posBar.setProgress(0);
resetPlayer();}
});
}
private int initializeTrack() {
InputStream is;
BufferedInputStream bis;
DataInputStream dis;
int temp = 0;
try {
is = new FileInputStream(file);
bis = new BufferedInputStream(is);
dis = new DataInputStream(bis);
temp = dis.read(dataArray, 0, dataSize);
dis.close();
bis.close();
is.close();
} catch (FileNotFoundException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
return temp;
}
public void resetPlayer() {
audioTrack.flush();
audioTrack.release();
posBar.setProgress(0);
sliderval=0;
}
You see, you implemented AudioTrack so that even when it is paused, the contents of the file still upload to the AudioTrack.
I don't know how yours manages that, but in my case I also pause the data upload to the AudioTrack, like:
while (byteOffset < fileLengh) {
    if (isPaused)
        continue; // spins (busy-waits) until unpaused; nothing is read or written meanwhile
    ret = in.read(byteData, 0, byteCount);
    if (ret != -1) { // write the byte array to the track
        audioTrack.write(byteData, 0, ret);
        byteOffset += ret;
    } else {
        break;
    }
}
So when I unpause the AudioTrack, the file-upload loop resumes too. I guess that's it. I should also mention that even while the AudioTrack is playing, the checks
if (audioTrack.getPlayState() == AudioTrack.PLAYSTATE_PLAYING)
and
if (audioTrack.getPlayState() == AudioTrack.PLAYSTATE_PAUSED)
don't work for me: getPlayState() always returns 1 (AudioTrack.PLAYSTATE_STOPPED) no matter whether the track is playing or has been paused.
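A gentler variant of the same idea, sketched with wait/notify instead of spinning on the flag (all names here are assumptions):
// Sketch: pause a streaming loop without burning CPU while paused.
private final Object pauseLock = new Object();
private boolean isPaused = false;

void streamLoop(InputStream in, AudioTrack audioTrack, byte[] byteData)
        throws IOException, InterruptedException {
    int ret;
    while ((ret = in.read(byteData)) != -1) {
        synchronized (pauseLock) {
            while (isPaused) {
                pauseLock.wait(); // blocks until setPaused(false) notifies
            }
        }
        audioTrack.write(byteData, 0, ret);
    }
}

void setPaused(boolean paused) {
    synchronized (pauseLock) {
        isPaused = paused;
        pauseLock.notifyAll();
    }
}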
I am using AudioRecord to read microphone data and RandomAccessFile to write it to a WAV file. This is the code:
public class MainActivity extends Activity {
AudioManager am = null;
AudioRecord record =null;
// AudioTrack track =null;
final int SAMPLE_FREQUENCY = 44100;
final int SIZE_OF_RECORD_ARRAY = 1024; // 1024 ORIGINAL
final int WAV_SAMPLE_MULTIPLICATION_FACTOR = 1;
int i= 0;
boolean isPlaying = false;
private volatile boolean keepThreadRunning;
// private RandomAccessFile stateFile, stateFileTemp, savToDisk;
private RandomAccessFile savToDisk;
private FileDescriptor fd = new FileDescriptor();
private File delFile, renFile;
String stateFileLoc = Environment.getExternalStorageDirectory().getPath();
// To keep writeHeader() happy
private short nChannels = 1;
private int sRate = SAMPLE_FREQUENCY;
private short mBitsPersample = 16; // represents 16 bits of one PCM sample
private int payload;
class MyThread extends Thread{
private volatile boolean needsToPassThrough;
// /*
MyThread(){
super();
}
MyThread(boolean newPTV){
this.needsToPassThrough = newPTV;
}
// */
// /*
@Override
public void run(){
short[] lin = new short[SIZE_OF_RECORD_ARRAY];
// byte[] lin = new byte[SIZE_OF_RECORD_ARRAY];
int num = 0;
// /*
if(needsToPassThrough){
record.startRecording();
// track.play();
}
// */
while (keepThreadRunning) {
// while (!isInterrupted()) {
// num = record.read(lin, 0, SIZE_OF_RECORD_ARRAY);
num = record.read(lin, 0, lin.length);
try {
// savToDisk.write(lin); // use only this line if lin is a byte array
// use the for loop block below if lin is an array of short
for(i=0;i <lin.length; i++)
savToDisk.writeShort(Short.reverseBytes(lin[i]));
// payload += lin.length; // use this line if lin is an array of byte
payload = payload + (lin.length)*2;
fd.sync();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
/*
catch (SyncFailedException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
*/
}
// /*
record.stop();
// track.stop();
record.release();
// track.release();
// */
}
// */
// /*
public void stopThread(){
keepThreadRunning = false;
}
// */
}
MyThread newThread;
private void init() {
int min = AudioRecord.getMinBufferSize(SAMPLE_FREQUENCY, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
// Toast.makeText(getApplicationContext(), Integer.toString(min), Toast.LENGTH_SHORT).show(); // Shows 4096
record = new AudioRecord(MediaRecorder.AudioSource.VOICE_COMMUNICATION, SAMPLE_FREQUENCY, AudioFormat.CHANNEL_IN_MONO,
AudioFormat.ENCODING_PCM_16BIT, min);
// int maxJitter = AudioTrack.getMinBufferSize(SAMPLE_FREQUENCY, AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);
// track = new AudioTrack(AudioManager.MODE_IN_COMMUNICATION, SAMPLE_FREQUENCY, AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT, maxJitter, AudioTrack.MODE_STREAM);
am = (AudioManager) this.getSystemService(Context.AUDIO_SERVICE);
am.setMode(AudioManager.MODE_IN_COMMUNICATION);
try {
savToDisk = new RandomAccessFile(stateFileLoc+"/audSampData.wav", "rw");
} catch (FileNotFoundException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
try {
fd = savToDisk.getFD();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
private void writeHeader(){
try {
savToDisk.setLength(0); // Set file length to 0, to prevent unexpected behavior in case the file already existed
savToDisk.writeBytes("RIFF");
savToDisk.writeInt(0); // Final file size not known yet, write 0
savToDisk.writeBytes("WAVE");
savToDisk.writeBytes("fmt ");
savToDisk.writeInt(Integer.reverseBytes(16)); // Sub-chunk size, 16 for PCM
savToDisk.writeShort(Short.reverseBytes((short) 1)); // AudioFormat, 1 for PCM
savToDisk.writeShort(Short.reverseBytes(nChannels));// Number of channels, 1 for mono, 2 for stereo
savToDisk.writeInt(Integer.reverseBytes(sRate)); // Sample rate
savToDisk.writeInt(Integer.reverseBytes(sRate*nChannels*mBitsPersample/8)); // Byte rate, SampleRate*NumberOfChannels*mBitsPersample/8
savToDisk.writeShort(Short.reverseBytes((short)(nChannels*mBitsPersample/8))); // Block align, NumberOfChannels*mBitsPersample/8
savToDisk.writeShort(Short.reverseBytes(mBitsPersample)); // Bits per sample
savToDisk.writeBytes("data");
savToDisk.writeInt(0); // Data chunk size not known yet, write 0
fd.sync();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
@Override
protected void onResume(){
super.onResume();
// newThread.stopThread();
Log.d("MYLOG", "onResume() called");
init();
writeHeader();
keepThreadRunning = true;
// */
// newThread = new MyThread(true);
newThread = new MyThread(isPlaying);
newThread.start();
}
@Override
protected void onPause(){
super.onPause();
Log.d("MYLOG", "onPause() called");
newThread.stopThread();
// android.os.Process.killProcess(android.os.Process.myPid());
try {
savToDisk.seek(4);
savToDisk.writeInt(Integer.reverseBytes(36+payload));
savToDisk.seek(40);
savToDisk.writeInt(Integer.reverseBytes(payload));
savToDisk.close();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
setVolumeControlStream(AudioManager.MODE_IN_COMMUNICATION);
payload = 0;
Log.d("MYLOG","onCreate() called");
}
@Override
public boolean onCreateOptionsMenu(Menu menu) {
// Inflate the menu; this adds items to the action bar if it is present.
getMenuInflater().inflate(R.menu.main, menu);
return true;
}
@Override
protected void onDestroy() {
super.onDestroy();
newThread.stopThread();
// android.os.Process.killProcess(android.os.Process.myPid());
// killProcess(android.os.Process.myPid());
// newThread.interrupt();
// delFile.delete();
Log.d("MYLOG", "onDestroy() called");
/*
try {
savToDisk.close();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
*/
}
public void passStop(View view){
Button playBtn = (Button) findViewById(R.id.button1);
// /*
if(!isPlaying){
record.startRecording();
// track.play();
isPlaying = true;
playBtn.setText("Pause");
}
else{
record.stop();
// track.pause();
isPlaying=false;
playBtn.setText("Pass through");
}
// */
}
}
When I play the WAV file in an audio player, it sounds sped up and also seems to skip frames. What could be the reasons for this? I believe the frame skipping is probably because I used writeShort() to write out each element of the short array separately; if that is the case, please suggest a workaround that still writes the data as shorts (and not via the write(byte[]) function, because I need to use parts of this code in my main project, which obtains audio samples in a short array). Also, why is it sped up?
Take a look at this question; it put me on the right track:
Android : recording audio using audiorecord class play as fast forwarded
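One workaround consistent with that link: convert the short[] to a little-endian byte[] in one step and write it with a single call. A sketch, with assumed names:
// Sketch: write a short[] of PCM samples as little-endian bytes in one call,
// instead of one writeShort() per sample.
void writeSamples(OutputStream os, short[] samples) throws IOException {
    ByteBuffer bb = ByteBuffer.allocate(samples.length * 2)
            .order(ByteOrder.LITTLE_ENDIAN);
    bb.asShortBuffer().put(samples);
    os.write(bb.array()); // one bulk write per buffer of samples
}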
The following code should record audio and store it on the SD card in PCM format.
The code works for me, but the PCM file doesn't play!
I got this code from this link: Android : recording audio using audiorecord class play as fast forwarded
How can I play the PCM file?
public class Audio_Record extends Activity {
private static final int RECORDER_SAMPLERATE = 8000;
private static final int RECORDER_CHANNELS = AudioFormat.CHANNEL_IN_MONO;
private static final int RECORDER_AUDIO_ENCODING = AudioFormat.ENCODING_PCM_16BIT;
private AudioRecord recorder = null;
private Thread recordingThread = null;
private boolean isRecording = false;
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.main);
setButtonHandlers();
enableButtons(false);
int bufferSize = AudioRecord.getMinBufferSize(RECORDER_SAMPLERATE,
RECORDER_CHANNELS, RECORDER_AUDIO_ENCODING);
System.out.println("BUFFER SIZE VALUE IS " + bufferSize);
}
private void setButtonHandlers() {
((Button) findViewById(R.id.btnStart)).setOnClickListener(btnClick);
((Button) findViewById(R.id.btnStop)).setOnClickListener(btnClick);
}
private void enableButton(int id, boolean isEnable) {
((Button) findViewById(id)).setEnabled(isEnable);
}
private void enableButtons(boolean isRecording) {
enableButton(R.id.btnStart, !isRecording);
enableButton(R.id.btnStop, isRecording);
}
int BufferElements2Rec = 1024; // want 2048 (2K) bytes; since each element is 2 bytes, use 1024 elements
int BytesPerElement = 2; // 2 bytes per element in 16-bit format
private void startRecording() {
recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
RECORDER_SAMPLERATE, RECORDER_CHANNELS,
RECORDER_AUDIO_ENCODING, BufferElements2Rec * BytesPerElement);
recorder.startRecording();
isRecording = true;
recordingThread = new Thread(new Runnable() {
public void run() {
writeAudioDataToFile();
}
}, "AudioRecorder Thread");
recordingThread.start();
}
private byte[] short2byte(short[] sData) {
int shortArrsize = sData.length;
byte[] bytes = new byte[shortArrsize * 2];
for (int i = 0; i < shortArrsize; i++) {
bytes[i * 2] = (byte) (sData[i] & 0x00FF);
bytes[(i * 2) + 1] = (byte) (sData[i] >> 8);
sData[i] = 0;
}
return bytes;
}
private void writeAudioDataToFile() {
// Write the output audio in byte
String filePath = "/sdcard/voice8K16bitmono.pcm";
short sData[] = new short[BufferElements2Rec];
FileOutputStream os = null;
try {
os = new FileOutputStream(filePath);
} catch (FileNotFoundException e) {
e.printStackTrace();
}
while (isRecording) {
// gets the voice output from microphone to byte format
recorder.read(sData, 0, BufferElements2Rec);
System.out.println("Short wirting to file" + sData.toString());
try {
// // writes the data to file from buffer
// // stores the voice buffer
byte bData[] = short2byte(sData);
os.write(bData, 0, BufferElements2Rec * BytesPerElement);
} catch (IOException e) {
e.printStackTrace();
}
}
try {
os.close();
} catch (IOException e) {
e.printStackTrace();
}
}
private void stopRecording() {
// stops the recording activity
if (null != recorder) {
isRecording = false;
recorder.stop();
recorder.release();
recorder = null;
recordingThread = null;
}
}
private View.OnClickListener btnClick = new View.OnClickListener() {
public void onClick(View v) {
switch (v.getId()) {
case R.id.btnStart: {
enableButtons(true);
startRecording();
break;
}
case R.id.btnStop: {
enableButtons(false);
stopRecording();
break;
}
}
}
};
@Override
public boolean onKeyDown(int keyCode, KeyEvent event) {
if (keyCode == KeyEvent.KEYCODE_BACK) {
finish();
}
return super.onKeyDown(keyCode, event);
}
}
Android's media player doesn't play PCM files by default. Either:
copy the file from your SD card to your computer and play it there,
write your own player using AudioTrack, or
install an app that plays PCM.
Here's a tutorial on how to play PCM using the AudioTrack class: http://jongladwin.blogspot.co.uk/2010/03/android-play-pcmwav-audio-buffer-using.html
Windows Media Player should be able to play PCM; some alternatives are mentioned here: http://www.makeuseof.com/answers/play-pcm-file-pc/
I guess most of the big music player apps on Android support PCM.
I also used your code, but my voice recording sounded like a "zzzzz" noise. So I changed the code a little, and now I can listen to the recording without problems or distortion, both on the smartphone and on the PC (in the latter case with Audacity).
This is my code:
public class VoiceActivity extends Activity {
private static final String TAG = "VoiceRecord";
private static final int RECORDER_SAMPLERATE = 8000;
private static final int RECORDER_CHANNELS_IN = AudioFormat.CHANNEL_IN_MONO;
private static final int RECORDER_CHANNELS_OUT = AudioFormat.CHANNEL_OUT_MONO;
private static final int RECORDER_AUDIO_ENCODING = AudioFormat.ENCODING_PCM_16BIT;
private static final int AUDIO_SOURCE = MediaRecorder.AudioSource.MIC;
// Initialize minimum buffer size in bytes.
private int bufferSize = AudioRecord.getMinBufferSize(RECORDER_SAMPLERATE, RECORDER_CHANNELS_IN, RECORDER_AUDIO_ENCODING);
private AudioRecord recorder = null;
private Thread recordingThread = null;
private boolean isRecording = false;
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_voice);
((Button) findViewById(R.id.start_button)).setOnClickListener(btnClick);
((Button) findViewById(R.id.stop_button)).setOnClickListener(btnClick);
enableButtons(false);
}
private void enableButton(int id, boolean isEnable) {
((Button) findViewById(id)).setEnabled(isEnable);
}
private void enableButtons(boolean isRecording) {
enableButton(R.id.start_button, !isRecording);
enableButton(R.id.stop_button, isRecording);
}
private void startRecording() {
if( bufferSize == AudioRecord.ERROR_BAD_VALUE)
Log.e( TAG, "Bad Value for \"bufferSize\", recording parameters are not supported by the hardware");
if( bufferSize == AudioRecord.ERROR )
Log.e( TAG, "Bad Value for \"bufferSize\", implementation was unable to query the hardware for its output properties");
Log.e( TAG, "\"bufferSize\"="+bufferSize);
// Initialize Audio Recorder.
recorder = new AudioRecord(AUDIO_SOURCE, RECORDER_SAMPLERATE, RECORDER_CHANNELS_IN, RECORDER_AUDIO_ENCODING, bufferSize);
// Starts recording from the AudioRecord instance.
recorder.startRecording();
isRecording = true;
recordingThread = new Thread(new Runnable() {
public void run() {
writeAudioDataToFile();
}
}, "AudioRecorder Thread");
recordingThread.start();
}
private void writeAudioDataToFile() {
//Write the output audio in byte
String filePath = "/sdcard/8k16bitMono.pcm";
byte saudioBuffer[] = new byte[bufferSize];
FileOutputStream os = null;
try {
os = new FileOutputStream(filePath);
} catch (FileNotFoundException e) {
e.printStackTrace();
}
while (isRecording) {
// gets the voice output from microphone to byte format
recorder.read(saudioBuffer, 0, bufferSize);
try {
// writes the data to file from buffer stores the voice buffer
os.write(saudioBuffer, 0, bufferSize);
} catch (IOException e) {
e.printStackTrace();
}
}
try {
os.close();
} catch (IOException e) {
e.printStackTrace();
}
}
private void stopRecording() throws IOException {
// stops the recording activity
if (null != recorder) {
isRecording = false;
recorder.stop();
recorder.release();
recorder = null;
recordingThread = null;
PlayShortAudioFileViaAudioTrack("/sdcard/8k16bitMono.pcm");
}
}
private void PlayShortAudioFileViaAudioTrack(String filePath) throws IOException{
// We keep temporarily filePath globally as we have only two sample sounds now..
if (filePath==null)
return;
//Reading the file..
File file = new File(filePath); // for ex. path= "/sdcard/samplesound.pcm" or "/sdcard/samplesound.wav"
byte[] byteData = new byte[(int) file.length()];
Log.d(TAG, (int) file.length()+"");
FileInputStream in = null;
try {
in = new FileInputStream( file );
in.read( byteData );
in.close();
} catch (FileNotFoundException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
// Set and push to audio track..
int intSize = android.media.AudioTrack.getMinBufferSize(RECORDER_SAMPLERATE, RECORDER_CHANNELS_OUT, RECORDER_AUDIO_ENCODING);
Log.d(TAG, intSize+"");
AudioTrack at = new AudioTrack(AudioManager.STREAM_MUSIC, RECORDER_SAMPLERATE, RECORDER_CHANNELS_OUT, RECORDER_AUDIO_ENCODING, intSize, AudioTrack.MODE_STREAM);
if (at!=null) {
at.play();
// Write the byte array to the track
at.write(byteData, 0, byteData.length);
at.stop();
at.release();
}
else
Log.d(TAG, "audio track is not initialised ");
}
private View.OnClickListener btnClick = new View.OnClickListener() {
public void onClick(View v) {
switch (v.getId()) {
case R.id.start_button: {
enableButtons(true);
startRecording();
break;
}
case R.id.stop_button: {
enableButtons(false);
try {
stopRecording();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
break;
}
}
}
};
// onClick of backbutton finishes the activity.
@Override
public boolean onKeyDown(int keyCode, KeyEvent event) {
if (keyCode == KeyEvent.KEYCODE_BACK) {
finish();
}
return super.onKeyDown(keyCode, event);
}
}
Here is my solution:
public class AudioTrackPlayer {
private String pathAudio;
private AudioTrack audioPlayer;
private Thread mThread;
private int bytesread = 0, ret = 0;
private int size;
private FileInputStream in = null;
private byte[] byteData = null;
private int count = 512 * 1024; // 512 kb
private boolean isPlay = true;
private boolean isLooping = false;
private static Handler mHandler;
public AudioTrackPlayer() {
}
public void prepare(String pathAudio){
this.pathAudio = pathAudio;
mHandler = new Handler();
}
public void play(){
stop();
isPlay = true;
bytesread = 0;
ret = 0;
if (pathAudio == null)
return;
audioPlayer = createAudioPlayer();
if (audioPlayer == null) return;
audioPlayer.play();
mThread = new Thread(new PlayerProcess());
mThread.start();
}
private final Runnable mLoopingRunnable = new Runnable() {
    @Override
    public void run() {
        play();
    }
};
private AudioTrack createAudioPlayer(){
int intSize = android.media.AudioTrack.getMinBufferSize(16000, AudioFormat.CHANNEL_CONFIGURATION_MONO,
AudioFormat.ENCODING_PCM_16BIT);
AudioTrack audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC, 16000, AudioFormat.CHANNEL_CONFIGURATION_MONO,
AudioFormat.ENCODING_PCM_16BIT, intSize, AudioTrack.MODE_STREAM);
if (audioTrack == null) {
Log.d("TCAudio", "audio track is not initialised ");
return null;
}
File file = null;
file = new File(pathAudio);
byteData = new byte[(int) count];
try {
in = new FileInputStream(file);
} catch (FileNotFoundException e) {
e.printStackTrace();
}
size = (int) file.length();
return audioTrack;
}
private class PlayerProcess implements Runnable{
@Override
public void run() {
while (bytesread < size && isPlay) {
if (Thread.currentThread().isInterrupted()) {
break;
}
try {
ret = in.read(byteData, 0, count);
} catch (IOException e) {
e.printStackTrace();
}
if (ret != -1) { // Write the byte array to the track
audioPlayer.write(byteData,0, ret);
bytesread += ret;
} else break;
}
try {
in.close();
} catch (IOException e) {
e.printStackTrace();
}
if (audioPlayer!=null){
if (audioPlayer.getPlayState() != AudioTrack.PLAYSTATE_STOPPED) {
audioPlayer.stop();
audioPlayer.release();
mThread = null;
}
}
if (isLooping && isPlay ) mHandler.postDelayed(mLopingRunnable,100);
}
}
public void setLooping(){
isLooping = !isLooping;
}
public void pause(){
}
public void stop(){
isPlay = false;
if (mThread != null) {
mThread.interrupt();
mThread = null;
}
if (audioPlayer != null) {
audioPlayer.stop();
audioPlayer.release();
audioPlayer = null;
}
}
public void reset(){
}
}
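A usage sketch for the class above (the path is an assumption; createAudioPlayer() expects raw 16 kHz mono 16-bit PCM, and prepare() must run on a thread with a Looper since it creates a Handler):
// Sketch: drive the AudioTrackPlayer above.
AudioTrackPlayer player = new AudioTrackPlayer();
player.prepare("/sdcard/voice16k.pcm"); // assumed raw PCM path
player.play();                          // spawns the background write thread
// ... later ...
player.stop();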
private void startRecording() {
if( bufferSize == AudioRecord.ERROR_BAD_VALUE)
Log.e( TAG, "Bad Value for \"bufferSize\", recording parameters are not supported by the hardware");
if( bufferSize == AudioRecord.ERROR )
Log.e( TAG, "Bad Value for \"bufferSize\", implementation was unable to query the hardware for its output properties");
Log.e( TAG, "\"bufferSize\"="+bufferSize);
// Initialize Audio Recorder.
recorder = new AudioRecord(AUDIO_SOURCE, RECORDER_SAMPLERATE, AudioFormat.CHANNEL_CONFIGURATION_MONO, RECORDER_AUDIO_ENCODING, bufferSize);
// Starts recording from the AudioRecord instance.
recorder.startRecording();
isRecording = true;
recordingThread = new Thread(new Runnable() {
public void run() {
writeAudioDataToFile();
}
}, "AudioRecorder Thread");
recordingThread.start();
}
Replace your recording code with the above.
I am using AudioRecord to record raw audio for processing.
The audio records without any noise, but when the raw PCM data is played back, it plays as if it has been sped up a lot (up to about twice as fast).
I am viewing and playing the PCM data in Audacity. I am using an actual phone (a Samsung Galaxy S5670) for testing.
The recording is done at 44100 Hz, 16-bit. Any idea what might cause this?
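A sanity check consistent with the resolution below: at 44100 Hz, 16-bit mono, the file should grow by about 88,200 bytes per second of recording; if it grows more slowly, samples are being dropped, and playing the file at the nominal rate sounds sped up. A sketch of the arithmetic (names are assumptions):
// Sketch: expected PCM file size for a given recording duration.
long expectedBytes(long seconds) {
    return seconds * 44100L * 2L; // sampleRate * bytesPerSample (16-bit mono)
}
// e.g. 10 s of audio should be ~882,000 bytes; a file half that size means
// roughly half the samples were written, which plays back about 2x too fast.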
Following is the recording code:
public class TestApp extends Activity
{
File file;
OutputStream os;
BufferedOutputStream bos;
AudioRecord recorder;
int iAudioBufferSize;
boolean bRecording;
int iBytesRead;
Thread recordThread = new Thread(){
@Override
public void run()
{
byte[] buffer = new byte[iAudioBufferSize];
int iBufferReadResult;
iBytesRead = 0;
while(!interrupted())
{
iBufferReadResult = recorder.read(buffer, 0, iAudioBufferSize);
// Android may read fewer bytes than requested.
if(iAudioBufferSize > iBufferReadResult)
{
// (Note: the read offset here should be iBufferReadResult, not iBufferReadResult - 1.)
iBufferReadResult = iBufferReadResult +
        recorder.read(buffer, iBufferReadResult - 1, iAudioBufferSize - iBufferReadResult);
}
iBytesRead = iBytesRead + iBufferReadResult;
for (int i = 0; i < iBufferReadResult; i++)
{
try
{
bos.write(buffer[i]);
} catch (IOException e)
{
e.printStackTrace();
}
}
}
}
};
@Override
public void onCreate(Bundle savedInstanceState)
{
// File Creation and UI init stuff etc.
bRecording = false;
bPlaying = false;
int iSampleRate = AudioTrack.getNativeOutputSampleRate(AudioManager.STREAM_SYSTEM);
iAudioBufferSize = AudioRecord.getMinBufferSize(iSampleRate, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
recorder = new AudioRecord(AudioSource.MIC, iSampleRate, AudioFormat.CHANNEL_IN_MONO,
AudioFormat.ENCODING_PCM_16BIT, iAudioBufferSize);
bt_Record.setOnClickListener(new OnClickListener()
{
@Override
public void onClick(View v)
{
if (!bRecording)
{
try
{
recorder.startRecording();
bRecording = true;
recordThread.start();
}
catch(Exception e)
{
tv_Error.setText(e.getLocalizedMessage());
}
}
else
{
recorder.stop();
bRecording = false;
recordThread.interrupt();
try
{
bos.close();
}
catch(IOException e)
{
}
tv_Hello.setText("Recorded Sucessfully. Total " + iBytesRead + " bytes.");
}
}
});
}
}
RESOLVED: I posted this after struggling with it for one or two days but, ironically, found the solution soon after posting. The buffered-output-stream write inside the for loop was taking too much time, so the stream was skipping samples. I changed it to a block write, removing the for loop, and it works perfectly.
The audio skipping was caused by the delay of writing to the stream one byte at a time.
The solution is to replace this for loop:
for (int i = 0; i < iBufferReadResult; i++) {
    try {
        bos.write(buffer[i]);
    } catch (IOException e) {
        e.printStackTrace();
    }
}
with a single block write, like so:
bos.write(buffer, 0, iBufferReadResult);
I had used the code from a book, which I guess worked only for lower sample rates and smaller buffer updates.
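For completeness, a sketch of the corrected read/write loop under the same assumptions (recorder, bos, and iAudioBufferSize as declared in the question):
// Sketch: read from AudioRecord and flush each chunk with one block write.
byte[] buffer = new byte[iAudioBufferSize];
while (!Thread.interrupted()) {
    int n = recorder.read(buffer, 0, iAudioBufferSize);
    try {
        if (n > 0) {
            bos.write(buffer, 0, n); // single block write per chunk, no per-byte loop
        }
    } catch (IOException e) {
        e.printStackTrace();
    }
}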