I'm trying to implement pause and resume in my AudioRecorder. To play sound I use AudioTrack.
public void startPlay(int playFrequency) {
    ...
    audioTrack.play();
    short[] buffer = new short[bufferSize];
    while (isPlaying) {
        for (int i = 0; i < bufferSize; i++) {
            try {
                short s = dataIS.readShort();
                buffer[i] = s;
            } catch (EOFException eofe) {
                isPlaying = false;
            }
        }
        audioTrack.write(buffer, 0, bufferSize);
    }
    audioTrack.stop();
    dataIS.close();
}
I tried using audioTrack.pause() and then play() again, but it didn't work.
public void pause() {
    isPlaying = false;
    audioTrack.pause();
}
Pausing works fine: I click the pause button and the player stops playing. But when I click resume/play, nothing happens: no resume, no sound.
public void resume() {
    isPlaying = true;
    audioTrack.play(); // I also tried calling startPlay() here
}
Related
I am developing an Android app and want to accomplish the feature below.
I want to use my phone's built-in mic to record, and at the same time play the recorded audio through either the phone's speakers or headphones.
Is this feasible? If yes, please help me with it.
Here is a simple recording and playback application that uses the Android AudioRecord and AudioTrack APIs.
Design:
The recorded audio is written to a buffer and played back from the same buffer. This mechanism runs in a loop (on an Android thread) controlled by buttons.
Code
private String TAG = "AUDIO_RECORD_PLAYBACK";
private boolean isRunning = true;
private Thread m_thread; /* Thread for running the Loop */
private AudioRecord recorder = null;
private AudioTrack track = null;
int bufferSize = 320; /* Buffer for recording data */
byte[] buffer = new byte[bufferSize];
/* Method to Enable/Disable Buttons */
private void enableButton(int id,boolean isEnable){
((Button)findViewById(id)).setEnabled(isEnable);
}
The GUI has two buttons, START and STOP.
Enable the buttons (initially only START is enabled):
enableButton(R.id.StartButton,true);
enableButton(R.id.StopButton,false);
/* Assign Button Click Handlers */
((Button)findViewById(R.id.StartButton)).setOnClickListener(btnClick);
((Button)findViewById(R.id.StopButton)).setOnClickListener(btnClick);
Map the START and STOP buttons to an OnClickListener:
private View.OnClickListener btnClick = new View.OnClickListener() {
@Override
public void onClick(View v) {
switch(v.getId()){
case R.id.StartButton:
{
Log.d(TAG, "======== Start Button Pressed ==========");
isRunning = true;
do_loopback(isRunning);
enableButton(R.id.StartButton,false);
enableButton(R.id.StopButton,true);
break;
}
case R.id.StopButton:
{
Log.d(TAG, "======== Stop Button Pressed ==========");
isRunning = false;
do_loopback(isRunning);
enableButton(R.id.StopButton,false);
enableButton(R.id.StartButton,true);
break;
}
}
}
};
Start the Thread:
private void do_loopback(final boolean flag)
{
m_thread = new Thread(new Runnable() {
public void run() {
run_loop(flag);
}
});
m_thread.start();
}
Methods for initializing AudioRecord and AudioTrack:
public AudioTrack findAudioTrack (AudioTrack track)
{
Log.d(TAG, "===== Initializing AudioTrack API ====");
int m_bufferSize = AudioTrack.getMinBufferSize(8000,
AudioFormat.CHANNEL_OUT_MONO,
AudioFormat.ENCODING_PCM_16BIT);
if (m_bufferSize != AudioTrack.ERROR_BAD_VALUE)
{
track = new AudioTrack(AudioManager.STREAM_MUSIC, 8000,
AudioFormat.CHANNEL_OUT_MONO,
AudioFormat.ENCODING_PCM_16BIT, m_bufferSize,
AudioTrack.MODE_STREAM);
if (track.getState() == AudioTrack.STATE_UNINITIALIZED) {
Log.e(TAG, "===== AudioTrack Uninitialized =====");
return null;
}
}
return track;
}
public AudioRecord findAudioRecord (AudioRecord recorder)
{
Log.d(TAG, "===== Initializing AudioRecord API =====");
int m_bufferSize = AudioRecord.getMinBufferSize(8000,
AudioFormat.CHANNEL_IN_MONO,
AudioFormat.ENCODING_PCM_16BIT);
if (m_bufferSize != AudioRecord.ERROR_BAD_VALUE)
{
recorder = new AudioRecord(MediaRecorder.AudioSource.MIC, 8000,
AudioFormat.CHANNEL_IN_MONO,
AudioFormat.ENCODING_PCM_16BIT, m_bufferSize);
if (recorder.getState() == AudioRecord.STATE_UNINITIALIZED) {
Log.e(TAG, "====== AudioRecord Uninitialized ====== ");
return null;
}
}
return recorder;
}
The values used in findAudioRecord and findAudioTrack can vary by device.
Please refer to this question.
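As an illustration of such probing (my own sketch, not part of the original answer; the candidate rates are just typical values), you can ask getMinBufferSize for each rate until the device accepts one:
private static final int[] CANDIDATE_RATES = {8000, 11025, 16000, 22050, 44100};

public static int findSupportedRate() {
    for (int rate : CANDIDATE_RATES) {
        /* getMinBufferSize() returns a positive size for a valid configuration */
        int size = AudioRecord.getMinBufferSize(rate,
                AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
        if (size > 0) {
            return rate; // first rate this device supports
        }
    }
    return -1; // no supported configuration found
}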
Code for Running the loop:
public void run_loop (boolean isRunning)
{
/** == If Stop Button is pressed == **/
if (!isRunning) {
Log.d(TAG, "===== Stop Button is pressed ===== ");
if (AudioRecord.STATE_INITIALIZED == recorder.getState()){
recorder.stop();
recorder.release();
}
if (AudioTrack.STATE_INITIALIZED == track.getState()){
track.stop();
track.release();
}
return;
}
/** ======= Initialize AudioRecord and AudioTrack ======== **/
recorder = findAudioRecord(recorder);
if (recorder == null) {
Log.e(TAG, "======== findAudioRecord : Returned Error! =========== ");
return;
}
track = findAudioTrack(track);
if (track == null) {
Log.e(TAG, "======== findAudioTrack : Returned Error! ========== ");
return;
}
if ((AudioRecord.STATE_INITIALIZED == recorder.getState()) &&
(AudioTrack.STATE_INITIALIZED == track.getState()))
{
recorder.startRecording();
Log.d(TAG, "========= Recorder Started... =========");
track.play();
Log.d(TAG, "========= Track Started... =========");
}
else
{
Log.d(TAG, "==== Initialization failed for AudioRecord or AudioTrack =====");
return;
}
/** ------------------------------------------------------ **/
/* Recording and Playing in chunks of 320 bytes */
bufferSize = 320;
while (isRunning)
{
/* Read & Write to the Device */
recorder.read(buffer, 0, bufferSize);
track.write(buffer, 0, bufferSize);
}
Log.i(TAG, "Loopback exit");
return;
}
Please include the following in AndroidManifest.xml
<uses-permission android:name="android.permission.RECORD_AUDIO" />
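Side note (my addition, not part of the original answer): on Android 6.0 (API 23) and later, RECORD_AUDIO is a dangerous permission and must also be requested at runtime, roughly like this:
if (ContextCompat.checkSelfPermission(this, Manifest.permission.RECORD_AUDIO)
        != PackageManager.PERMISSION_GRANTED) {
    /* The result is delivered to onRequestPermissionsResult() */
    ActivityCompat.requestPermissions(this,
            new String[]{ Manifest.permission.RECORD_AUDIO }, 1);
}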
The procedure above can also be done by writing to and reading from a file, using the same APIs.
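For example, the playback half of the loop could be replaced by a file write, roughly like this (my own sketch; the file name is a placeholder and IOException handling is omitted):
/* Record to a raw PCM file instead of playing back directly. */
File pcmFile = new File(getExternalFilesDir(null), "loopback.pcm"); // placeholder path
FileOutputStream out = new FileOutputStream(pcmFile);
while (isRunning) {
    int read = recorder.read(buffer, 0, bufferSize);
    if (read > 0) {
        out.write(buffer, 0, read); // raw PCM; play it back later through AudioTrack
    }
}
out.close();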
Why use AudioRecord over MediaRecorder? See here.
The code was tested (on a Google Nexus 5) and works.
Note: please add error-checking code around recorder.read and track.write in case they fail. The same applies to findAudioRecord and findAudioTrack.
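A minimal sketch of that error checking (my addition): both calls return the number of bytes transferred, or a negative error code.
int bytesRead = recorder.read(buffer, 0, bufferSize);
if (bytesRead < 0) { // e.g. AudioRecord.ERROR_INVALID_OPERATION
    Log.e(TAG, "recorder.read failed: " + bytesRead);
    return;
}
int bytesWritten = track.write(buffer, 0, bytesRead);
if (bytesWritten < 0) { // e.g. AudioTrack.ERROR_BAD_VALUE
    Log.e(TAG, "track.write failed: " + bytesWritten);
    return;
}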
First, create the objects in the onCreate method: a MediaRecorder instance and the path to the file where you want to save the recorded data.
String outputFile = Environment.getExternalStorageDirectory().
getAbsolutePath() + "/myrecording.3gp"; // Define outputFile outside onCreate method
MediaRecorder myAudioRecorder = new MediaRecorder(); // Define this outside onCreate method
myAudioRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
myAudioRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
myAudioRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
myAudioRecorder.setOutputFile(outputFile);
You can call these three functions from any button handler, in order to start recording, stop recording, and play back the recording:
public void start(View view){
try {
myAudioRecorder.prepare();
myAudioRecorder.start();
} catch (IllegalStateException e) {
    e.printStackTrace();
} catch (IOException e) {
    e.printStackTrace();
}
start.setEnabled(false);
stop.setEnabled(true);
Toast.makeText(getApplicationContext(), "Recording started", Toast.LENGTH_LONG).show();
}
public void stop(View view){
myAudioRecorder.stop();
myAudioRecorder.release();
myAudioRecorder = null;
stop.setEnabled(false);
play.setEnabled(true);
Toast.makeText(getApplicationContext(), "Audio recorded successfully",
Toast.LENGTH_LONG).show();
}
public void play(View view) throws IllegalArgumentException,
SecurityException, IllegalStateException, IOException{
MediaPlayer m = new MediaPlayer();
m.setDataSource(outputFile);
m.prepare();
m.start();
Toast.makeText(getApplicationContext(), "Playing audio", Toast.LENGTH_LONG).show();
}
As I read in the developer documentation here, Android supports the RTSP protocol (for real-time streaming) and also the HTTP/HTTPS live streaming draft protocol.
There is also an example here. You should have basic knowledge of a streaming server, such as Red5 or Wowza.
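For reference, a minimal sketch of playing such a stream with MediaPlayer (my addition; the URL is a placeholder and exception handling is omitted):
MediaPlayer player = new MediaPlayer();
player.setAudioStreamType(AudioManager.STREAM_MUSIC);
player.setDataSource("rtsp://example.com/live/stream"); // placeholder URL
player.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
    @Override
    public void onPrepared(MediaPlayer mp) {
        mp.start(); // begin playback once enough data is buffered
    }
});
player.prepareAsync(); // asynchronous prepare suits network sources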
Inside my existing app, I'm building a player using the AudioTrack class in MODE_STATIC, because I want to implement time-stretch and loop-point features.
The code works for start() and stop(), but when the track is paused and I try to resume by calling play() again, the status bar stays fixed and no audio is played.
Now, from the docs:
public void pause()
Pauses the playback of the audio data. Data that has not been played back will not be discarded. Subsequent calls to play() will play this data back. See flush() to discard this data.
It seems easy enough to understand, but something escapes me.
Can someone help me?
Is it necessary to create boolean variables like start, play, pause, stopAudio, etc.?
If so, what is the use of the methods inherited from the AudioTrack class?
I have already implemented the project in MODE_STREAM, using such boolean variables, but I need MODE_STATIC.
This is the code, thanks:
Button playpause, stop;
SeekBar posBar;
int sliderval=0;
int headerOffset = 0x2C;
File file =new File(Environment.getExternalStorageDirectory(), "raw.pcm");
int fileSize = (int) file.length();
int dataSize = fileSize-headerOffset ;
byte[] dataArray = new byte[dataSize];
int posValue;
int dataBytesRead = initializeTrack();
AudioTrack audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC, 44100,
AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT, dataBytesRead , AudioTrack.MODE_STATIC);
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.main);
playpause= (Button)(findViewById(R.id.playpause));
stop= (Button)(findViewById(R.id.stop));
posBar=(SeekBar)findViewById(R.id.posBar);
// create a listener for the slider bar;
OnSeekBarChangeListener listener = new OnSeekBarChangeListener() {
public void onStopTrackingTouch(SeekBar seekBar) { }
public void onStartTrackingTouch(SeekBar seekBar) { }
public void onProgressChanged(SeekBar seekBar, int progress, boolean fromUser) {
if (fromUser) { sliderval = progress;}
}
};
// set the listener on the slider
posBar.setOnSeekBarChangeListener(listener);
}
public void toggleButtonSound(View button)
{
switch (button.getId())
{
case R.id.playpause:
play();
break;
case R.id.stop:
stop();
break;
}
}
private void stop() {
    if (audioTrack.getPlayState() == AudioTrack.PLAYSTATE_PLAYING ||
            audioTrack.getPlayState() == AudioTrack.PLAYSTATE_PAUSED ||
            audioTrack.getPlayState() == AudioTrack.PLAYSTATE_STOPPED) {
        audioTrack.stop();
        resetPlayer();
    }
}
Context context;
private double actualPos=0;
public void pause() {}
public void play()
{
if (audioTrack.getPlayState()==AudioTrack.PLAYSTATE_PLAYING)
{ //Log.i("", "Play pressed in state "+audioTrack.getPlayState());
audioTrack.pause();
}
else if (audioTrack.getPlayState()==AudioTrack.PLAYSTATE_PAUSED)
{ //Log.i("", "Play pressed in state "+audioTrack.getPlayState());
audioTrack.play();
}
else if (audioTrack.getPlayState()==AudioTrack.PLAYSTATE_STOPPED)
{ //Log.i("", "Play pressed in state "+audioTrack.getPlayState());
audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC, 44100, AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT, dataSize, AudioTrack.MODE_STATIC);
audioTrack.write(dataArray, 0, dataBytesRead);
audioTrack.play();
}
posBar.setMax((int) (dataBytesRead / 2)); // Set the maximum range of the seek bar (16-bit mono frames)
audioTrack.setNotificationMarkerPosition((int) (dataSize/2));
audioTrack.setPositionNotificationPeriod(1000);
audioTrack.setPlaybackPositionUpdateListener(new OnPlaybackPositionUpdateListener() {
@Override
public void onPeriodicNotification(AudioTrack track) {
posBar.setProgress(audioTrack.getPlaybackHeadPosition());
Log.i("", " " + audioTrack.getPlaybackHeadPosition() + " " + dataBytesRead/2);
}
@Override
public void onMarkerReached(AudioTrack track) {
Log.i("", " End reached ");
audioTrack.pause();
audioTrack.flush();
audioTrack.release();
posBar.setProgress(0);
resetPlayer();}
});
}
private int initializeTrack() {
InputStream is;
BufferedInputStream bis;
DataInputStream dis;
int temp = 0;
try {
is = new FileInputStream(file);
bis = new BufferedInputStream(is);
dis = new DataInputStream(bis);
temp = dis.read(dataArray, 0, dataSize);
dis.close();
bis.close();
is.close();
} catch (FileNotFoundException e) {
    e.printStackTrace();
} catch (IOException e) {
    e.printStackTrace();
}
return temp;
}
public void resetPlayer() {
audioTrack.flush();
audioTrack.release();
posBar.setProgress(0);
sliderval=0;
}
It looks like you implemented your player so that even while the AudioTrack is paused, the file contents keep being uploaded to it.
I don't know how your version handles that, but in my case I also pause the data upload to the AudioTrack, like this:
while (byteOffset < fileLength) {
if(isPaused)
continue;
ret = in.read(byteData, 0, byteCount);
if (ret != -1) { // Write the byte array to the track
audioTrack.write(byteData, 0, ret);
byteOffset += ret;
} else
break;
}
Then, when I unpause the AudioTrack, the file-uploading while loop resumes too. I guess that's it. I also have to mention that even when the AudioTrack is playing, the following checks:
if (audioTrack.getPlayState()==AudioTrack.PLAYSTATE_PLAYING)
and
if (audioTrack.getPlayState()==AudioTrack.PLAYSTATE_PAUSED)
don't work for me: getPlayState() always returns 1 (AudioTrack.PLAYSTATE_STOPPED), no matter whether the track is playing or has been paused.
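For completeness, a minimal sketch of how the two sides can be paired (my own addition; isPaused and audioTrack are the names used in the loop above):
public void pausePlayback() {
    isPaused = true;     // stop feeding new data to the track
    audioTrack.pause();  // stop draining what is already buffered
}

public void resumePlayback() {
    audioTrack.play();   // resume draining the buffer
    isPaused = false;    // let the feeding loop continue
}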
I am currently trying to build an amplifier app for Android. The goal is to record and play back what is being recorded simultaneously. I created a thread to handle this, but the sound comes out choppy. Here is what I tried:
private class RecordAndPlay extends Thread{
int bufferSize;
AudioRecord aRecord;
short[] buffer;
public RecordAndPlay() {
bufferSize = AudioRecord.getMinBufferSize(22050, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
buffer = new short[bufferSize];
}
@Override
public void run() {
aRecord = new AudioRecord(MediaRecorder.AudioSource.MIC, 22050, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);
try {
aRecord.startRecording();
} catch (Exception e) {
}
int bufferedResult = aRecord.read(buffer,0,bufferSize);
final AudioTrack aTrack = new AudioTrack(AudioManager.STREAM_MUSIC, samplingRate, AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferedResult, AudioTrack.MODE_STREAM);
aTrack.setNotificationMarkerPosition(bufferedResult);
aTrack.setPlaybackPositionUpdateListener(new OnPlaybackPositionUpdateListener() {
@Override
public void onPeriodicNotification(AudioTrack track) {
// TODO Auto-generated method stub
}
@Override
public void onMarkerReached(AudioTrack track) {
Log.d("Marker reached", "...");
aTrack.release();
aRecord.release();
run();
}
});
aTrack.play();
aTrack.write(buffer, 0, buffer.length);
}
public void cancel(){
aRecord.stop();
aRecord.release();
}
}
Your playback is choppy because the AudioTrack is getting starved: it is not receiving data smoothly. In your code you recursively call run() and create a new AudioTrack per marker event. Instead, instantiate the AudioRecord and AudioTrack only once and just handle their events. Also, to help smooth out playback, you should probably start recording slightly before playback and maintain a queue of the recorded buffers; you can then manage passing those buffers to the AudioTrack and make sure there is always a new buffer to submit on each marker event.
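A minimal sketch of that structure, under the question's 22050 Hz mono settings (my own illustration, not the answerer's code; running and loopback are hypothetical names):
private volatile boolean running = true;

private void loopback() {
    // Create both objects exactly once, outside any callback.
    int bufferSize = AudioRecord.getMinBufferSize(22050,
            AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
    AudioRecord record = new AudioRecord(MediaRecorder.AudioSource.MIC, 22050,
            AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);
    AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, 22050,
            AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
            bufferSize, AudioTrack.MODE_STREAM);
    short[] buffer = new short[bufferSize / 2];
    record.startRecording();
    track.play();
    while (running) {
        int read = record.read(buffer, 0, buffer.length); // blocks until data is available
        if (read > 0) {
            track.write(buffer, 0, read); // keep the track fed so it never starves
        }
    }
    record.stop(); record.release();
    track.stop(); track.release();
}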
I'm trying to record sound from the mic and then play it back, but all I hear is a clicking sound. The AudioTrack and AudioRecord both use the same settings and are initialised correctly. Here is my code:
stop.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
isRecording = false;
try {
    recordingThread.join(); // wait for the recording thread to finish
} catch (InterruptedException e) {
    e.printStackTrace();
}
player = findAudioTrack();
player.play();
for (int i = 0; i < audioQueue.size(); i++) {
int written = player.write(audioQueue.get(i), 0,
audioQueue.get(i).length);
}
player.stop();
player.release();
player = null;
}
});
}
private void startRecording() {
recorder = findAudioRecord();
recorder.startRecording();
isRecording = true;
recordingThread = new Thread(new Runnable() {
public void run() {
addAudioToQueue();
}
}, "AudioRecorder Thread");
recordingThread.start();
}
private void addAudioToQueue() {
short[] capturedAudio = new short[recordBufferSize/2];
while (isRecording) {
int read = recorder.read(capturedAudio, 0, capturedAudio.length);
audioQueue.add(capturedAudio);
}
recorder.stop();
recorder.release();
recorder = null;
}
}
Does anyone know why this is?
Here is the full source code:
https://www.dropbox.com/s/h38cs9vjkztyyro/AudioTesting.java
It probably has to do with the fact that you are using a short array instead of a byte array.
public int read (byte[] audioData, int offsetInBytes, int sizeInBytes)
It's likely that this function is only filling the first byte of each short, so half of each index is one byte of audio data and the other byte is junk or zeroes.
Try using a byte array instead of a short array.
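A sketch of the suggested change (my addition; it assumes audioQueue holds byte[] chunks, and copying each chunk via java.util.Arrays also keeps queue entries from all sharing one reused array):
byte[] capturedAudio = new byte[recordBufferSize];
while (isRecording) {
    int read = recorder.read(capturedAudio, 0, capturedAudio.length);
    if (read > 0) {
        audioQueue.add(Arrays.copyOf(capturedAudio, read)); // snapshot this chunk
    }
}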
I am using AudioRecord to record raw audio for processing.
The audio records entirely without any noise, but when the raw PCM data is played back, it sounds as if it has been sped up a lot (up to about twice as fast).
I am viewing and playing the PCM data in Audacity, and testing on an actual phone (Samsung Galaxy S5670).
The recording is done at 44100 Hz, 16-bit. Any idea what might cause this?
Following is the recording code:
public class TestApp extends Activity
{
File file;
OutputStream os;
BufferedOutputStream bos;
AudioRecord recorder;
int iAudioBufferSize;
boolean bRecording;
int iBytesRead;
Thread recordThread = new Thread(){
@Override
public void run()
{
byte[] buffer = new byte[iAudioBufferSize];
int iBufferReadResult;
iBytesRead = 0;
while(!interrupted())
{
iBufferReadResult = recorder.read(buffer, 0, iAudioBufferSize);
// Android may read fewer bytes than requested; try to fill the rest of the buffer.
if (iAudioBufferSize > iBufferReadResult)
{
    iBufferReadResult = iBufferReadResult +
        recorder.read(buffer, iBufferReadResult, iAudioBufferSize - iBufferReadResult);
}
iBytesRead = iBytesRead + iBufferReadResult;
for (int i = 0; i < iBufferReadResult; i++)
{
try
{
bos.write(buffer[i]);
} catch (IOException e)
{
e.printStackTrace();
}
}
}
}
};
@Override
public void onCreate(Bundle savedInstanceState)
{
// File Creation and UI init stuff etc.
bRecording = false;
bPlaying = false;
int iSampleRate = AudioTrack.getNativeOutputSampleRate(AudioManager.STREAM_SYSTEM);
iAudioBufferSize = AudioRecord.getMinBufferSize(iSampleRate, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
recorder = new AudioRecord(AudioSource.MIC, iSampleRate, AudioFormat.CHANNEL_IN_MONO,
AudioFormat.ENCODING_PCM_16BIT, iAudioBufferSize);
bt_Record.setOnClickListener(new OnClickListener()
{
@Override
public void onClick(View v)
{
if (!bRecording)
{
try
{
recorder.startRecording();
bRecording = true;
recordThread.start();
}
catch(Exception e)
{
tv_Error.setText(e.getLocalizedMessage());
}
}
else
{
recorder.stop();
bRecording = false;
recordThread.interrupt();
try
{
bos.close();
}
catch(IOException e)
{
}
tv_Hello.setText("Recorded successfully. Total " + iBytesRead + " bytes.");
}
}
});
}
}
RESOLVED: I posted this after struggling with it for one or two days but, ironically, found the solution soon after posting. The byte-by-byte BufferedOutputStream write in the for loop was taking too long, so the stream was skipping samples; a file with fewer samples than expected plays back faster than real time. I changed it to a block write, removing the for loop, and it works perfectly.
The audio skipping was caused by the delay of writing to the stream one byte at a time.
The solution is to replace this for loop:
for (int i = 0; i < iBufferReadResult; i++)
{
try
{
bos.write(buffer[i]);
} catch (IOException e)
{
e.printStackTrace();
}
}
with a single write, like so:
bos.write(buffer, 0, iBufferReadResult);
I had used code from a book, which I guess worked for lower sample rates and smaller buffer updates.