Android: How to set a loop on AudioTrack?

I'm playing .wav files using AudioTrack, and I have a problem.
I call setLoopPoints to loop my .wav files, but it doesn't work.
This is my sample code:
public class PleaseActivity extends Activity implements Runnable{
AudioTrack audioTrack;
public static final String MEDIA_PATH = Environment.getExternalStorageDirectory().getAbsolutePath()+"/TEST";
/** Called when the activity is first created. */
Button play_button, stop_button;
File file = null;
byte[] byteData = null;
Boolean playing = false;
int bufSize;
AudioTrack myAT = null;
Thread play_thread = null;
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.main);
play_button = (Button) findViewById(R.id.btn1);
stop_button = (Button) findViewById(R.id.btn2);
file = new File(MEDIA_PATH+"/untitled1.wav");
byteData = new byte[(int) file.length()];
FileInputStream in = null;
try {
in = new FileInputStream(file);
in.read(byteData);
in.close();
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
initialize();
play_button.setOnClickListener(new OnClickListener() {
public void onClick(View v) {
play_thread.start();
}
});
//
stop_button.setOnClickListener(new OnClickListener() {
public void onClick(View v) {
//
if (myAT.getPlayState() == AudioTrack.PLAYSTATE_PLAYING) {
myAT.stop();
play_thread = null;
initialize();
}
}
});
}
void initialize() {
bufSize = android.media.AudioTrack.getMinBufferSize(44100,
AudioFormat.CHANNEL_CONFIGURATION_STEREO,
AudioFormat.ENCODING_PCM_16BIT);
myAT = new AudioTrack(AudioManager.STREAM_MUSIC,
44100, AudioFormat.CHANNEL_CONFIGURATION_STEREO,
AudioFormat.ENCODING_PCM_16BIT, bufSize,
AudioTrack.MODE_STREAM);
play_thread = new Thread(this);
}
public void run() {
if (myAT != null) {
myAT.play();
myAT.setLoopPoints(0, byteData.length, 2);
myAT.write(byteData, 0, byteData.length);
}
}
}
I can play my wave files fine, but setLoopPoints doesn't work!
Can anybody help me?
I solved this looping problem like this.
Now I have another problem:
whenever I write data into the AudioTrack,
that is, whenever the AudioTrack repeats, a noise like a "tick" is added at the beginning.
I don't know how to eliminate this noise.
Does anybody know how to solve it?
class DLThread extends Thread
{
public void run()
{
while(!DLThread.interrupted())
{
if (myAT != null) {
//
myAT.play();
myAT.flush();
myAT.write(byteData, 0, byteData.length);
}
}
}
}

public int setLoopPoints (int startInFrames, int endInFrames, int loopCount)
Sets the loop points and the loop count. The loop can be infinite. Similarly to setPlaybackHeadPosition, the track must be stopped or paused for the position to be changed, and *must use the MODE_STATIC mode*.
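In other words, with MODE_STREAM the loop points are ignored; the track has to be created in MODE_STATIC, the whole clip written before play(), and the loop points given in frames rather than bytes. A minimal sketch, assuming 16-bit stereo PCM already loaded into a byte array (the variable names are illustrative, not the asker's):
int bytesPerFrame = 2 /* bytes per 16-bit sample */ * 2 /* stereo channels */;
int totalFrames = byteData.length / bytesPerFrame;
AudioTrack staticTrack = new AudioTrack(AudioManager.STREAM_MUSIC,
        44100, AudioFormat.CHANNEL_OUT_STEREO,
        AudioFormat.ENCODING_PCM_16BIT,
        byteData.length,                      // MODE_STATIC: buffer must hold the whole clip
        AudioTrack.MODE_STATIC);
staticTrack.write(byteData, 0, byteData.length); // load the clip first
staticTrack.setLoopPoints(0, totalFrames, -1);   // -1 loops forever; track is still stopped here
staticTrack.play();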

The tick is probably the wav file header. Try offsetting the playback by 44 bytes.
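For example, when loading byteData in the question's onCreate(), the header bytes can be skipped so that only PCM samples are written to the track. This sketch assumes a canonical 44-byte PCM WAV header (files with extra chunks need a real header parser); exception handling is left as in the original code:
int headerSize = 44;                          // assumed canonical PCM WAV header
byte[] pcm = new byte[(int) file.length() - headerSize];
FileInputStream in = new FileInputStream(file);
in.skip(headerSize);                          // do not feed the header bytes to the track
in.read(pcm);
in.close();
// then write pcm, not the raw file contents, to the AudioTrack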

Related

How to record a part of the voice while it is playing

I have a hard issue with audio recording on Android. I use AudioTrack to play my sound back while I speak into my phone. I want to record a part of my voice only when I press the record button. For example, I may speak for 10 seconds, but I only want to record my voice while the record button is pressed, say from the 3rd second to the 8th second, while my phone keeps playing my sound for the whole time I am speaking (from the 1st to the 10th second).
Currently, I use a thread to play my sound, as in the following code. I created a flag in the recording thread to decide when to record. When I press the record button, the flag is set to true. When I click stop, it is set to false and the data is written to a file.
public class AudioSoundThread extends Thread {
private short[] audioBuffer;
private boolean isRecording = false;
private boolean isSound = true;
private AudioRecordingHandler handler = null;
private AudioRecord record;
private AudioTrack track;
public AudioSoundThread(AudioTrack mtrack,AudioRecord mrecord,short[] maudioBuffer, AudioRecordingHandler handler) {
this.handler = handler;
this.audioBuffer = maudioBuffer;
this.record=mrecord;
this.track=mtrack;
}
@Override
public void run() {
record.startRecording();
DataOutputStream output =null;
if(isRecording){
output = prepareWriting();
if (output == null) { return; }
}
track.play();
///////////Play during recording
int readSize =0;
while (isSound) {
readSize=record.read(audioBuffer, 0, audioBuffer.length);
if ((readSize == AudioRecord.ERROR_INVALID_OPERATION) ||
(readSize == AudioRecord.ERROR_BAD_VALUE) ||
(readSize <= 0)) {
continue;
}
if(AudioRecord.ERROR_INVALID_OPERATION != readSize){
track.write(audioBuffer, 0, readSize);
}
if(isRecording)
write(output,readSize);
//Finished to write
if(!isRecording&&output!=null)
{
finishWriting(output);
File waveFile = getFile("wav");
try {
rawToWave(mRecording, waveFile);
deleteTempFile(mRecording);
} catch (IOException e) {
Log.e("Error writing file : ", e.getMessage());
}
}
}
record.stop();
record.release();
}
public synchronized void stopSound() {
isSound = false;
}
public synchronized void startSound() {
isSound = true;
}
public synchronized void startRecordingFlag() {
isRecording = true;
}
public synchronized void stopRecording() {
isRecording = false;
}
private void finishWriting(DataOutputStream out) {
try {
out.close();
} catch (IOException e) {
e.printStackTrace();
if (handler != null) {
handler.onRecordingError();
}
}
}
private DataOutputStream prepareWriting() {
if (mRecording.exists()) { mRecording.delete(); }
DataOutputStream out = null;
try {
out = new DataOutputStream(new BufferedOutputStream(new FileOutputStream(mRecording)));
} catch (FileNotFoundException e) {
e.printStackTrace();
if (handler != null) {
handler.onRecordingError();
}
}
return out;
}
In my main activity, I have buttons to play my sound and to control recording:
private AudioSoundThread recordingThread;
//Play button
btnPlay = (Button) findViewById(R.id.btnPlay);
btnPlay.setOnClickListener(new OnClickListener() {
@Override
public void onClick(View v) {
startSounding();
}
});
//Record button
btnRecord = (Button) findViewById(R.id.btnRecord);
btnRecord.setOnClickListener(new OnClickListener() {
@Override
public void onClick(View v) {
//record();
recordingThread.startRecordingFlag();
}
});
//Stop Record button
btnStopRecord = (Button) findViewById(R.id.btnStopRecord);
btnStopRecord .setOnClickListener(new OnClickListener() {
@Override
public void onClick(View v) {
//record();
recordingThread.stopRecording();
}
});
private void startSounding() {
soundingThread = new AudioSoundThread(track,mRecorder,mBuffer,new AudioRecordingHandler() {
});
soundingThread.start();
}
However, my scheme does not work. I think the flag does not reach the thread. Could you look at my code and give me a solution?
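One thing worth checking, shown as a rough sketch rather than a confirmed fix: flags toggled from the UI thread and read from the audio thread should be declared volatile, and the output stream has to be opened when the flag flips, not only once before the loop. The field and method names below mirror the question's code but are otherwise assumptions:
private volatile boolean isRecording = false; // volatile so the audio thread sees UI-thread updates
private volatile boolean isSound = true;

public void run() {
    record.startRecording();
    track.play();
    DataOutputStream output = null;
    while (isSound) {
        int readSize = record.read(audioBuffer, 0, audioBuffer.length);
        if (readSize <= 0) continue;
        track.write(audioBuffer, 0, readSize);            // keep playing regardless of the flag
        if (isRecording) {
            if (output == null) output = prepareWriting(); // open the file when recording starts
            write(output, readSize);
        } else if (output != null) {
            finishWriting(output);                         // close the file when recording stops
            output = null;
        }
    }
    record.stop();
    record.release();
}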

Logical error in my app

I'm developing an app, but it doesn't work very well:
it never goes inside the loop.
What's wrong with my code?
Code:
private int sampleRate =44100;//Integer.parseInt(audioManager.getProperty(AudioManager.PROPERTY_OUTPUT_SAMPLE_RATE));
private int channelConfig = AudioFormat.CHANNEL_OUT_MONO;
private int audioFormat = AudioFormat.ENCODING_PCM_16BIT;
AudioRecord recorder;
private boolean status = true;
@SuppressLint("NewApi") @TargetApi(Build.VERSION_CODES.GINGERBREAD) @Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
if (android.os.Build.VERSION.SDK_INT > 9) {
StrictMode.ThreadPolicy policy = new StrictMode.ThreadPolicy.Builder().permitAll().build();
StrictMode.setThreadPolicy(policy);
}
receiveButton = (Button) findViewById (R.id.receive_button);
stopButton = (Button) findViewById (R.id.stop_button);
receive_label= (TextView) findViewById(R.id.receive_label);
receiveButton.setOnClickListener(receiveListener);
stopButton.setOnClickListener(stopListener);
port=(EditText) findViewById(R.id.editText1);
button= (Button) findViewById(R.id.button1);
button.setOnClickListener(close);
//AudioManager audioManager = (AudioManager) this.getSystemService(Context.AUDIO_SERVICE);
// sampleRate =Integer.parseInt( audioManager.getProperty(AudioManager.PROPERTY_OUTPUT_SAMPLE_RATE));
}
private final OnClickListener close = new OnClickListener() {
@Override
public void onClick(View arg0) {
System.exit(0);
}
};
private final OnClickListener stopListener = new OnClickListener() {
@Override
public void onClick(View v) {
status = false;
speaker.release();
Log.d("VR","Speaker released.s");
}
};
private final OnClickListener receiveListener = new OnClickListener() {
@Override
public void onClick(View arg0) {
status = true;
receive_label.setText("socket...2");
startReceiving();
}
};
public void startReceiving() {
Thread receiveThread = new Thread (new Runnable() {
@Override
public void run() {
try {
int minBufSize =460;//recorder.getMinBufferSize(sampleRate,channelConfig,audioFormat);
Log.d("VR", ""+channelConfig+" "+audioFormat+sampleRate);
DatagramSocket socket = new DatagramSocket(50005);
Log.d("VR", "Socket Created.s");
byte[] buffer = new byte[minBufSize+=4096];
// for (int sampleRate : new int[] {44100,8000, 11025, 16000 }) { // add the rates you wish to check against
Log.d("bufersize", "bufer size :"+minBufSize);
Log.d("bufersize", "bufer size :"+sampleRate);
if (minBufSize != AudioRecord.ERROR_BAD_VALUE) {
speaker = new AudioTrack(AudioManager.STREAM_MUSIC,sampleRate,channelConfig,audioFormat,minBufSize,AudioTrack.MODE_STREAM);
speaker.play();
Log.d("VR", "spekaer playing...");
}
// }
//minimum buffer size. need to be careful. might cause problems. try setting manually if any problems faced
// int minBufSize = AudioRecord.getMinBufferSize(sampleRate, channelConfig, audioFormat);
Log.d("VR", ""+status);
while(status == true) {
DatagramPacket packet = new DatagramPacket(buffer,buffer.length);
socket.receive(packet);
Log.d("VR", "Packet Received.s");
//reading content from packet
buffer=packet.getData();
Log.d("VR", "Packet data read into buffer.s");
//sending data to the Audiotrack obj i.e. speaker
speaker.write(buffer, 0, minBufSize);
Log.d("VR", "Writing buffer content to speaker.s");
}
} catch (SocketException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
});
receiveThread.start();
}}
I see all the logs I have put in the code except those inside the loop.
Please give me your best help.
Thanks in advance.
Add receiveThread.run(); to startReceiving()

Audio recording using AudioRecord

I've decided to record audio with AudioRecord rather than MediaRecorder, in order to achieve maximum quality. Problem is, the app won't work for some reason. There are 2 buttons: record and play, record is used to start and stop the recording (using a new thread) and play is supposed to play the file using MediaPlayer.
Code:
public class MyActivity extends Activity {
AudioRecord recorder = null;
int SAMPLE_RATE = 44100;
int ENCODING = AudioFormat.ENCODING_PCM_16BIT;
int SOURCE = MediaRecorder.AudioSource.MIC;
int CONFIG = AudioFormat.CHANNEL_IN_MONO;
int BUFFER_SIZE;
boolean isRecording = false;
boolean isPlaying = false;
String currentFileDir;
byte[] b;
File file;
OutputStream FOS;
int count =0;
MediaPlayer mediaPlayer;
Thread recordThread;
private Button recordButton;
private Button playButton;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_my);
currentFileDir = getFilesDir().getAbsolutePath() + "Record.pcm";
recordButton = (Button)findViewById(R.id.RecordButton);
recordButton.setOnClickListener(new MyOCL());
playButton = (Button)findViewById(R.id.PlayButton);
playButton.setOnClickListener(new MyOCL());
}
protected void record(){
file = new File(currentFileDir);
BUFFER_SIZE = AudioRecord.getMinBufferSize(SAMPLE_RATE, CONFIG, ENCODING);
recorder = new AudioRecord(SOURCE, SAMPLE_RATE, CONFIG, ENCODING, BUFFER_SIZE);
isRecording = true;
b = new byte[BUFFER_SIZE];
try{
FOS = new FileOutputStream(file);
}
catch (Exception e){Log.e("Open FOS", "new failed");}
while (isRecording){
recorder.read(b, 0, BUFFER_SIZE);
try{
FOS.write(b, count * BUFFER_SIZE, BUFFER_SIZE);
count++;
}
catch (Exception e){Log.e("write FOS", "write failed");}
}
try {
FOS.close();
}
catch (Exception e){Log.e("close FOS", "close failed");}
}
private class MyOCL implements View.OnClickListener{
@Override
public void onClick(View view){
switch(view.getId()){
case R.id.PlayButton:
if(isPlaying == false){
playButton.setText("Stop Playing");
setPlaying();
mediaPlayer.start();
}
else {
playButton.setText("Start Playing");
mediaPlayer.stop();
mediaPlayer.release();
mediaPlayer.reset();
}
break;
case R.id.RecordButton:
if(isRecording == false) {
recordThread = new Thread(new Runnable() {
@Override
public void run() {
record();
}
});
recordThread.start();
recordButton.setText("Stop Recording");
}
else{
recordButton.setText("Start recording");
isRecording = false;
}
break;
}
}
}
protected void setPlaying(){
try{
mediaPlayer = new MediaPlayer();
mediaPlayer.reset();
mediaPlayer.setDataSource(currentFileDir);
mediaPlayer.prepare();
mediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
//mediaPlayer.start();
}
catch (Exception e){
Log.e("Play initialize", "Can't call prepare function" + e.getMessage());
}
}
@Override
public boolean onCreateOptionsMenu(Menu menu) {
// Inflate the menu; this adds items to the action bar if it is present.
getMenuInflater().inflate(R.menu.my, menu);
return true;
}
@Override
public boolean onOptionsItemSelected(MenuItem item) {
// Handle action bar item clicks here. The action bar will
// automatically handle clicks on the Home/Up button, so long
// as you specify a parent activity in AndroidManifest.xml.
int id = item.getItemId();
if (id == R.id.action_settings) {
return true;
}
return super.onOptionsItemSelected(item);
}
}
Here is my code, which works for me:
public class MainActivity extends Activity
{
AudioRecord record = null;
AudioTrack track = null;
boolean isRecording;
int sampleRate = 44100;
Button startRecord, stopRecord, playRecord = null;
@Override
protected void onCreate(Bundle savedInstanceState)
{
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
setVolumeControlStream(AudioManager.MODE_IN_COMMUNICATION);
startRecord = (Button) findViewById(R.id.start_recording);
stopRecord = (Button) findViewById(R.id.stop_recording);
playRecord = (Button) findViewById(R.id.play_recording);
startRecord.setOnClickListener(new StartRecordListener());
stopRecord.setOnClickListener(new StopRecordListener());
playRecord.setOnClickListener(new PlayRecordListener());
stopRecord.setEnabled(false);
}
private void startRecord()
{
File recordFile = new File(Environment.getExternalStorageDirectory(), "Record.pcm");
try
{
recordFile.createNewFile();
OutputStream outputStream = new FileOutputStream(recordFile);
BufferedOutputStream bufferedOutputStream = new BufferedOutputStream(outputStream);
DataOutputStream dataOutputStream = new DataOutputStream(bufferedOutputStream);
int minBufferSize = AudioRecord.getMinBufferSize(sampleRate, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
short[] audioData = new short[minBufferSize];
record = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleRate, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT,
minBufferSize);
record.startRecording();
while (isRecording)
{
int numberOfShort = record.read(audioData, 0, minBufferSize);
for (int i = 0; i < numberOfShort; i++)
{
dataOutputStream.writeShort(audioData[i]);
}
}
record.stop();
dataOutputStream.close();
}
catch (IOException e)
{
e.printStackTrace();
}
}
public void playRecord()
{
File recordFile = new File(Environment.getExternalStorageDirectory(), "Record.pcm");
int shortSizeInBytes = Short.SIZE / Byte.SIZE;
int bufferSizeInBytes = (int) (recordFile.length() / shortSizeInBytes);
short[] audioData = new short[bufferSizeInBytes];
try
{
InputStream inputStream = new FileInputStream(recordFile);
BufferedInputStream bufferedInputStream = new BufferedInputStream(inputStream);
DataInputStream dataInputStream = new DataInputStream(bufferedInputStream);
int i = 0;
while (dataInputStream.available() > 0)
{
audioData[i] = dataInputStream.readShort();
i++;
}
dataInputStream.close();
track = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate, AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
bufferSizeInBytes, AudioTrack.MODE_STREAM);
track.play();
track.write(audioData, 0, bufferSizeInBytes);
}
catch (FileNotFoundException e)
{
e.printStackTrace();
}
catch (IOException e)
{
e.printStackTrace();
}
}
public class StartRecordListener implements View.OnClickListener
{
@Override
public void onClick(View v)
{
Thread recordThread = new Thread(new Runnable()
{
@Override
public void run()
{
isRecording = true;
MainActivity.this.startRecord();
}
});
recordThread.start();
startRecord.setEnabled(false);
stopRecord.setEnabled(true);
}
}
public class StopRecordListener implements View.OnClickListener
{
@Override
public void onClick(View v)
{
isRecording = false;
startRecord.setEnabled(true);
stopRecord.setEnabled(false);
}
}
public class PlayRecordListener implements View.OnClickListener
{
@Override
public void onClick(View v)
{
MainActivity.this.playRecord();
}
}
}
The XML layout contains 3 buttons with the following ids: start_recording, stop_recording, play_recording.
And add the following permissions:
<uses-permission android:name="android.permission.RECORD_AUDIO"/>
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE"/>
Good luck, and I hope it's okay that I'm using 3 buttons in the code above.

Media Player using AudioTrack class does not resume after pause()

I'm building, inside my existing app, a player using the AudioTrack class in MODE_STATIC, because I want to implement the time-stretch and loop-points features.
The code works for start() and stop(), but when the track is paused and I call play() again to resume, the position bar stays fixed and no audio is played.
Now, from the docs:
public void pause(): Pauses the playback of the audio data. Data that has not been played back will not be discarded. Subsequent calls to play() will play this data back. See flush() to discard this data.
It seems easy to understand, but there is something that escapes me.
Can someone help me?
Is it necessary to create boolean variables like start, play, pause, stopAudio, etc.?
If so, what is the point of the methods inherited from the AudioTrack class?
In MODE_STREAM I have completed the project using the above boolean variables, but I need MODE_STATIC.
This is the code, thanks:
Button playpause, stop;
SeekBar posBar;
int sliderval=0;
int headerOffset = 0x2C;
File file =new File(Environment.getExternalStorageDirectory(), "raw.pcm");
int fileSize = (int) file.length();
int dataSize = fileSize-headerOffset ;
byte[] dataArray = new byte[dataSize];
int posValue;
int dataBytesRead = initializeTrack();
AudioTrack audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC, 44100,
AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT, dataBytesRead , AudioTrack.MODE_STATIC);
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.main);
playpause= (Button)(findViewById(R.id.playpause));
stop= (Button)(findViewById(R.id.stop));
posBar=(SeekBar)findViewById(R.id.posBar);
// create a listener for the slider bar;
OnSeekBarChangeListener listener = new OnSeekBarChangeListener() {
public void onStopTrackingTouch(SeekBar seekBar) { }
public void onStartTrackingTouch(SeekBar seekBar) { }
public void onProgressChanged(SeekBar seekBar, int progress, boolean fromUser) {
if (fromUser) { sliderval = progress;}
}
};
// set the listener on the slider
posBar.setOnSeekBarChangeListener(listener); }
public void toggleButtonSound(View button)
{
switch (button.getId())
{
case R.id.playpause:
play();
break;
case R.id.stop:
stop();
break;
}
}
private void stop() {
if(audioTrack.getState()==AudioTrack.PLAYSTATE_PLAYING ||
audioTrack.getState()==AudioTrack.PLAYSTATE_PAUSED || audioTrack.getState()==AudioTrack.PLAYSTATE_STOPPED)
{ audioTrack.stop();
resetPlayer();}
}
Context context;
private double actualPos=0;
public void pause() {}
public void play()
{
if (audioTrack.getPlayState()==AudioTrack.PLAYSTATE_PLAYING)
{ //Log.i("", "Play pressed in state "+audioTrack.getPlayState());
audioTrack.pause();
}
else if (audioTrack.getPlayState()==AudioTrack.PLAYSTATE_PAUSED)
{ //Log.i("", "Play pressed in state "+audioTrack.getPlayState());
audioTrack.play();
}
else if (audioTrack.getPlayState()==AudioTrack.PLAYSTATE_STOPPED)
{ //Log.i("", "Play pressed in state "+audioTrack.getPlayState());
audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC, 44100, AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT, dataSize, AudioTrack.MODE_STATIC);
audioTrack.write(dataArray, 0, dataBytesRead);
audioTrack.play();
}
posBar.setMax((int) (dataBytesRead/2)); // Set the Maximum range of the
audioTrack.setNotificationMarkerPosition((int) (dataSize/2));
audioTrack.setPositionNotificationPeriod(1000);
audioTrack.setPlaybackPositionUpdateListener(new OnPlaybackPositionUpdateListener() {
@Override
public void onPeriodicNotification(AudioTrack track) {
posBar.setProgress(audioTrack.getPlaybackHeadPosition());
Log.i("", " " + audioTrack.getPlaybackHeadPosition() + " " + dataBytesRead/2);
}
@Override
public void onMarkerReached(AudioTrack track) {
Log.i("", " End reached ");
audioTrack.pause();
audioTrack.flush();
audioTrack.release();
posBar.setProgress(0);
resetPlayer();}
});
}
private int initializeTrack() {
InputStream is;
BufferedInputStream bis;
DataInputStream dis;
int temp = 0;
try {
is = new FileInputStream(file);
bis = new BufferedInputStream(is);
dis = new DataInputStream(bis);
temp = dis.read(dataArray, 0, dataSize);
dis.close();
bis.close();
is.close();
} catch (FileNotFoundException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
return temp;
}
public void resetPlayer() {
audioTrack.flush();
audioTrack.release();
posBar.setProgress(0);
sliderval=0;
}
You see, you implemented AudioTrack so that even when it is paused the contents of the file are still uploaded to the AudioTrack.
I don't know how it manages that, but in my case I also pause the data upload to the AudioTrack, like this:
while (byteOffset < fileLength) {
if(isPaused)
continue;
ret = in.read(byteData, 0, byteCount);
if (ret != -1) { // Write the byte array to the track
audioTrack.write(byteData, 0, ret);
byteOffset += ret;
} else
break;
}
So when I unpause the AudioTrack, the file-uploading while loop resumes too. I guess that's it. Also, I have to mention that even when the AudioTrack is playing, the following:
if (audioTrack.getPlayState()==AudioTrack.PLAYSTATE_PLAYING)
and
if (audioTrack.getPlayState()==AudioTrack.PLAYSTATE_PAUSED)
don't work for me; getPlayState() always returns 1 (AudioTrack.PLAYSTATE_STOPPED), no matter whether it is playing or has been paused.

How exactly does the AudioRecord class work?

Please see my other questions as well because I think they are related:
Question 1
Question 2
Question 3
This is the code I am using, which performs a pass-through of the audio signal obtained at the mic to the speaker when I press a button:
public class MainActivity extends Activity {
AudioManager am = null;
AudioRecord record =null;
AudioTrack track =null;
final int SAMPLE_FREQUENCY = 44100;
final int SIZE_OF_RECORD_ARRAY = 1024; // 1024 ORIGINAL
final int WAV_SAMPLE_MULTIPLICATION_FACTOR = 1;
int i= 0;
boolean isPlaying = false;
private volatile boolean keepThreadRunning;
private RandomAccessFile stateFile, stateFileTemp;
private File delFile, renFile;
String stateFileLoc = Environment.getExternalStorageDirectory().getPath();
class MyThread extends Thread{
private volatile boolean needsToPassThrough;
// /*
MyThread(){
super();
}
MyThread(boolean newPTV){
this.needsToPassThrough = newPTV;
}
// */
// /*
@Override
public void run(){
// short[] lin = new short[SIZE_OF_RECORD_ARRAY];
byte[] lin = new byte[SIZE_OF_RECORD_ARRAY];
int num = 0;
// /*
if(needsToPassThrough){
record.startRecording();
track.play();
}
// */
while (keepThreadRunning) {
// while (!isInterrupted()) {
num = record.read(lin, 0, SIZE_OF_RECORD_ARRAY);
for(i=0;i<lin.length;i++)
lin[i] *= WAV_SAMPLE_MULTIPLICATION_FACTOR;
track.write(lin, 0, num);
}
// /*
record.stop();
track.stop();
record.release();
track.release();
// */
}
// */
// /*
public void stopThread(){
keepThreadRunning = false;
}
// */
}
MyThread newThread;
private void init() {
int min = AudioRecord.getMinBufferSize(SAMPLE_FREQUENCY, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
record = new AudioRecord(MediaRecorder.AudioSource.VOICE_COMMUNICATION, SAMPLE_FREQUENCY, AudioFormat.CHANNEL_IN_MONO,
AudioFormat.ENCODING_PCM_16BIT, min);
int maxJitter = AudioTrack.getMinBufferSize(SAMPLE_FREQUENCY, AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);
track = new AudioTrack(AudioManager.MODE_IN_COMMUNICATION, SAMPLE_FREQUENCY, AudioFormat.CHANNEL_OUT_MONO,
AudioFormat.ENCODING_PCM_16BIT, maxJitter, AudioTrack.MODE_STREAM);
am = (AudioManager) this.getSystemService(Context.AUDIO_SERVICE);
am.setMode(AudioManager.MODE_IN_COMMUNICATION);
try {
stateFile = new RandomAccessFile(stateFileLoc+"/appState.txt", "rwd");
stateFileTemp = new RandomAccessFile(stateFileLoc+"/appStateTemp.txt", "rwd");
} catch (FileNotFoundException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
delFile = new File(stateFileLoc+"/appState.txt");
renFile = new File(stateFileLoc+"/appStateTemp.txt");
}
@Override
protected void onResume(){
super.onResume();
// newThread.stopThread();
Log.d("MYLOG", "onResume() called");
init();
keepThreadRunning = true;
try {
if(stateFile.readInt() == 1){
isPlaying = true;
Log.d("MYLOG", "readInt == 1");
}
else{
isPlaying = false;
Log.d("MYLOG", "readInt <> 1");
}
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
// */
// newThread = new MyThread(true);
newThread = new MyThread(isPlaying);
newThread.start();
}
@Override
protected void onPause(){
super.onPause();
Log.d("MYLOG", "onPause() called");
newThread.stopThread();
// android.os.Process.killProcess(android.os.Process.myPid());
try {
if(isPlaying)
stateFileTemp.writeInt(1);
else
stateFileTemp.writeInt(0);
delFile.delete();
renFile.renameTo(delFile);
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
setVolumeControlStream(AudioManager.MODE_IN_COMMUNICATION);
Log.d("MYLOG","onCreate() called");
}
@Override
public boolean onCreateOptionsMenu(Menu menu) {
// Inflate the menu; this adds items to the action bar if it is present.
getMenuInflater().inflate(R.menu.main, menu);
return true;
}
@Override
protected void onDestroy() {
super.onDestroy();
newThread.stopThread();
// android.os.Process.killProcess(android.os.Process.myPid());
// killProcess(android.os.Process.myPid());
// newThread.interrupt();
delFile.delete();
Log.d("MYLOG", "onDestroy() called");
}
public void passStop(View view){
Button playBtn = (Button) findViewById(R.id.button1);
// /*
if(!isPlaying){
record.startRecording();
track.play();
isPlaying = true;
playBtn.setText("Pause");
}
else{
record.stop();
track.pause();
isPlaying=false;
playBtn.setText("Pass through");
}
// */
}
The files appState.txt and appStateTemp.txt were added to save whether pass-through was being performed when the app last lost focus, but that is probably not very significant here. What I want to know is:
1. What happens when record.read() is called without calling record.startRecording()?
2. What is the significance of SIZE_OF_RECORD_ARRAY? I thought it should be at least the value returned by AudioRecord.getMinBufferSize(), but in this program it doesn't affect the output at all, even if I set it to 1.
3. If I use 16-bit PCM encoding, I need at least a short variable to store the digital equivalent of each audio sample. However, in this code, even if I change the lin variable from a short array to a byte array, there is no apparent change in the output. So how does the read function store the digital samples in the array? Does it automatically allocate 2 bytes for each sample? If so, is it little-endian or big-endian?
Questions 1 and 3 should be easy for you to check with your app, but here goes:
1: What happens when record.read() is called without calling record.startrecording() ?
I would expect there to be no flow of data from the underlying audio input stream, and that read() therefore returns 0 or possibly an error code, indicating that no data has been read.
2: What is the significance of SIZE_OF_RECORD_ARRAY? I thought it should be at least the value returned by AudioRecord.getMinBufferSize() but in this program it doesn't affect the output at all even if I set it to 1.
The value of getMinBufferSize is important when you specify the buffer size in the call to the AudioRecord constructor. What you're changing with SIZE_OF_RECORD_ARRAY is just the amount of data you're reading with each call to read() - and while it isn't a particularly good idea to call read() once per byte (because of the overhead of all those function calls), I can imagine that it still will work.
3: If I use 16 bit PCM encoding I need at least a short variable to store the digital equivalent of the audio samples. However in this code even if I change the lin variable from short array to byte array, there is no apparent change in the output. So how does the read function store the digital samples in the array? Does it automatically allocate 2 byte elements for each sample? If that is the case, does it do it as little endian or big endian?
The underlying native code always uses the byte version. The short version is simply a wrapper around the byte version. So yes, a pair of bytes will be used for each sample in this case.
As for the endianness; it would be little-endian on the vast majority of Android devices out there.
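As an illustration, a byte[] filled by read() can be reinterpreted as 16-bit samples with a little-endian ByteBuffer. The names lin and num come from the question's code, and the byte order here follows the answer above rather than anything guaranteed by the API:
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

// lin is the byte[] passed to record.read(); num is the number of bytes it returned
short[] samples = new short[num / 2];
ByteBuffer.wrap(lin, 0, num)
        .order(ByteOrder.LITTLE_ENDIAN)   // PCM-16 little-endian, per the answer above
        .asShortBuffer()
        .get(samples);
// samples[i] now holds the i-th 16-bit PCM sample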
Try this; I hope it will work:
MediaRecorder mRecorder = null;
String mFileName;
private void startRecording() {
try {
mRecorder = new MediaRecorder();
mRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
mRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
mFileName = getRecordDefaultFileName();
mRecorder.setOutputFile(mFileName);
mRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
try {
mRecorder.prepare();
} catch (IOException e) {
System.out.println("prepare() failed");
}
mRecorder.start();
} catch (Exception e) {
return;
}
}
private void stopRecording() {
try {
if (mRecorder != null) {
mRecorder.stop();
mRecorder.release();
mRecorder = null;
}
} catch (Exception e) {
}
}
private String getRecordDefaultFileName() {
File wallpaperDirectory = new File(Environment.getExternalStorageDirectory().getAbsolutePath() + "/" + "recordingFolder" + "/");
if (!wallpaperDirectory.exists()) {
wallpaperDirectory.mkdirs();
}
return wallpaperDirectory.getAbsolutePath() + File.separator + "iarecord" + ".3gp";
}
