Please see my other questions as well because I think they are related:
Question 1
Question 2
Question 3
This is the code I am using. It passes the audio obtained at the mic through to the speaker when I press a button:
public class MainActivity extends Activity {
AudioManager am = null;
AudioRecord record =null;
AudioTrack track =null;
final int SAMPLE_FREQUENCY = 44100;
final int SIZE_OF_RECORD_ARRAY = 1024; // 1024 ORIGINAL
final int WAV_SAMPLE_MULTIPLICATION_FACTOR = 1;
int i= 0;
boolean isPlaying = false;
private volatile boolean keepThreadRunning;
private RandomAccessFile stateFile, stateFileTemp;
private File delFile, renFile;
String stateFileLoc = Environment.getExternalStorageDirectory().getPath();
class MyThread extends Thread{
private volatile boolean needsToPassThrough;
// /*
MyThread(){
super();
}
MyThread(boolean newPTV){
this.needsToPassThrough = newPTV;
}
// */
// /*
@Override
public void run(){
// short[] lin = new short[SIZE_OF_RECORD_ARRAY];
byte[] lin = new byte[SIZE_OF_RECORD_ARRAY];
int num = 0;
// /*
if(needsToPassThrough){
record.startRecording();
track.play();
}
// */
while (keepThreadRunning) {
// while (!isInterrupted()) {
num = record.read(lin, 0, SIZE_OF_RECORD_ARRAY);
for(i=0;i<lin.length;i++)
lin[i] *= WAV_SAMPLE_MULTIPLICATION_FACTOR;
track.write(lin, 0, num);
}
// /*
record.stop();
track.stop();
record.release();
track.release();
// */
}
// */
// /*
public void stopThread(){
keepThreadRunning = false;
}
// */
}
MyThread newThread;
private void init() {
int min = AudioRecord.getMinBufferSize(SAMPLE_FREQUENCY, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
record = new AudioRecord(MediaRecorder.AudioSource.VOICE_COMMUNICATION, SAMPLE_FREQUENCY, AudioFormat.CHANNEL_IN_MONO,
AudioFormat.ENCODING_PCM_16BIT, min);
int maxJitter = AudioTrack.getMinBufferSize(SAMPLE_FREQUENCY, AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);
track = new AudioTrack(AudioManager.MODE_IN_COMMUNICATION, SAMPLE_FREQUENCY, AudioFormat.CHANNEL_OUT_MONO,
AudioFormat.ENCODING_PCM_16BIT, maxJitter, AudioTrack.MODE_STREAM);
am = (AudioManager) this.getSystemService(Context.AUDIO_SERVICE);
am.setMode(AudioManager.MODE_IN_COMMUNICATION);
try {
stateFile = new RandomAccessFile(stateFileLoc+"/appState.txt", "rwd");
stateFileTemp = new RandomAccessFile(stateFileLoc+"/appStateTemp.txt", "rwd");
} catch (FileNotFoundException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
delFile = new File(stateFileLoc+"/appState.txt");
renFile = new File(stateFileLoc+"/appStateTemp.txt");
}
@Override
protected void onResume(){
super.onResume();
// newThread.stopThread();
Log.d("MYLOG", "onResume() called");
init();
keepThreadRunning = true;
try {
if(stateFile.readInt() == 1){
isPlaying = true;
Log.d("MYLOG", "readInt == 1");
}
else{
isPlaying = false;
Log.d("MYLOG", "readInt <> 1");
}
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
// */
// newThread = new MyThread(true);
newThread = new MyThread(isPlaying);
newThread.start();
}
@Override
protected void onPause(){
super.onPause();
Log.d("MYLOG", "onPause() called");
newThread.stopThread();
// android.os.Process.killProcess(android.os.Process.myPid());
try {
if(isPlaying)
stateFileTemp.writeInt(1);
else
stateFileTemp.writeInt(0);
delFile.delete();
renFile.renameTo(delFile);
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
setVolumeControlStream(AudioManager.MODE_IN_COMMUNICATION);
Log.d("MYLOG","onCreate() called");
}
@Override
public boolean onCreateOptionsMenu(Menu menu) {
// Inflate the menu; this adds items to the action bar if it is present.
getMenuInflater().inflate(R.menu.main, menu);
return true;
}
@Override
protected void onDestroy() {
super.onDestroy();
newThread.stopThread();
// android.os.Process.killProcess(android.os.Process.myPid());
// killProcess(android.os.Process.myPid());
// newThread.interrupt();
delFile.delete();
Log.d("MYLOG", "onDestroy() called");
}
public void passStop(View view){
Button playBtn = (Button) findViewById(R.id.button1);
// /*
if(!isPlaying){
record.startRecording();
track.play();
isPlaying = true;
playBtn.setText("Pause");
}
else{
record.stop();
track.pause();
isPlaying=false;
playBtn.setText("Pass through");
}
// */
}
}
The files appState.txt and appStateTemp.txt were added to save whether pass-through was being performed when the app last lost focus, but that is probably not very significant here. What I want to know is:
What happens when record.read() is called without calling record.startRecording()?
What is the significance of SIZE_OF_RECORD_ARRAY? I thought it should be at least the value returned by AudioRecord.getMinBufferSize() but in this program it doesn't affect the output at all even if I set it to 1.
If I use 16-bit PCM encoding, I need at least a short variable to store the digital equivalent of the audio samples. However, in this code, even if I change the lin variable from a short array to a byte array, there is no apparent change in the output. So how does the read function store the digital samples in the array? Does it automatically allocate 2-byte elements for each sample? If that is the case, does it do it as little-endian or big-endian?
Questions 1 and 3 should be easy for you to check with your app, but here goes:
1: What happens when record.read() is called without calling record.startRecording()?
I would expect there to be no flow of data from the underlying audio input stream, and that read() therefore returns 0 or possibly an error code, indicating that no data has been read.
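A quick way to verify this on a device is to check the value read() returns; a sketch, reusing the record and lin variables from the question (the specific error constant is an assumption, not something the docs promise for this case):
int n = record.read(lin, 0, lin.length);
if (n <= 0) {
    // 0 means no data was delivered; negative values are error codes,
    // e.g. AudioRecord.ERROR_INVALID_OPERATION (-3).
    Log.w("MYLOG", "read() returned " + n + " - was startRecording() called?");
}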
2: What is the significance of SIZE_OF_RECORD_ARRAY? I thought it should be at least the value returned by AudioRecord.getMinBufferSize() but in this program it doesn't affect the output at all even if I set it to 1.
The value of getMinBufferSize is important when you specify the buffer size in the call to the AudioRecord constructor. What you're changing with SIZE_OF_RECORD_ARRAY is just the amount of data you're reading with each call to read(); and while it isn't a particularly good idea to call read() once per byte (because of the overhead of all those function calls), I can imagine that it will still work.
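To make the distinction concrete, here is a minimal sketch (the parameter values are illustrative, not taken from your code):
int minBuf = AudioRecord.getMinBufferSize(44100,
        AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
// The constructor's buffer must be at least minBuf; this is the internal
// buffer that AudioRecord fills in the background.
AudioRecord rec = new AudioRecord(MediaRecorder.AudioSource.MIC, 44100,
        AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, minBuf);
// The array passed to read() can be any size; it only sets how much you
// drain per call, so even a tiny chunk works (just with more call overhead).
byte[] chunk = new byte[32];
int n = rec.read(chunk, 0, chunk.length);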
3: If I use 16-bit PCM encoding, I need at least a short variable to store the digital equivalent of the audio samples. However, in this code, even if I change the lin variable from a short array to a byte array, there is no apparent change in the output. So how does the read function store the digital samples in the array? Does it automatically allocate 2-byte elements for each sample? If that is the case, does it do it as little-endian or big-endian?
The underlying native code always uses the byte version. The short version is simply a wrapper around the byte version. So yes, a pair of bytes will be used for each sample in this case.
As for the endianness: it would be little-endian on the vast majority of Android devices out there.
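You can see the byte pairing explicitly by reinterpreting the buffer as little-endian shorts; a sketch, reusing record and the byte-array lin from the question (needs java.nio.ByteBuffer, ByteOrder and ShortBuffer):
int num = record.read(lin, 0, lin.length); // num bytes read
ShortBuffer samples = ByteBuffer.wrap(lin, 0, num)
        .order(ByteOrder.LITTLE_ENDIAN)
        .asShortBuffer(); // num / 2 samples
short first = samples.get(0); // bytes 0..1 interpreted as one 16-bit sample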
Try this; I hope it will work 100%:
MediaRecorder mRecorder = null;
String mFileName;
private void startRecording() {
try {
mRecorder = new MediaRecorder();
mRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
mRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
mFileName = getRecordDefaultFileName();
mRecorder.setOutputFile(mFileName);
mRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
try {
mRecorder.prepare();
} catch (IOException e) {
System.out.println("prepare() failed");
}
mRecorder.start();
} catch (Exception e) {
return;
}
}
private void stopRecording() {
try {
if (mRecorder != null) {
mRecorder.stop();
mRecorder.release();
mRecorder = null;
}
} catch (Exception e) {
}
}
private String getRecordDefaultFileName() {
File wallpaperDirectory = new File(Environment.getExternalStorageDirectory().getAbsolutePath() + "/" + "recordingFolder" + "/");
if (!wallpaperDirectory.exists()) {
wallpaperDirectory.mkdirs();
}
return wallpaperDirectory.getAbsolutePath() + File.separator + "iarecord" + ".3gp";
}
I have not used the audio recording classes of Android much before, so I don't really have much knowledge in the area.
I have written a little app that will record audio in the background and then play it back, all in the PCM format (I am doing some tests to see how much battery the microphone uses in the background).
But when I try to run my play() method, I get these logcat errors:
11-03 00:20:05.744 18248-18248/com.bacon.corey.audiotimeshift E/android.media.AudioTrack﹕ Front channels must be present in multichannel configurations
11-03 00:20:05.748 18248-18248/com.bacon.corey.audiotimeshift E/AudioTrack﹕ Playback Failed
I have googled the errors, but I can't seem to find anything whatsoever about them.
If someone wouldn't mind giving me a few pointers, I would be hugely grateful.
This is the code for the application (it is quite sloppy and unfinished, as it is only for testing battery life):
public class MainActivity extends ActionBarActivity {
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
if (savedInstanceState == null) {
getSupportFragmentManager().beginTransaction()
.add(R.id.container, new PlaceholderFragment())
.commit();
}
}
@Override
public boolean onCreateOptionsMenu(Menu menu) {
// Inflate the menu; this adds items to the action bar if it is present.
getMenuInflater().inflate(R.menu.menu_main, menu);
return true;
}
@Override
public boolean onOptionsItemSelected(MenuItem item) {
// Handle action bar item clicks here. The action bar will
// automatically handle clicks on the Home/Up button, so long
// as you specify a parent activity in AndroidManifest.xml.
int id = item.getItemId();
//noinspection SimplifiableIfStatement
if (id == R.id.action_settings) {
return true;
}
return super.onOptionsItemSelected(item);
}
/**
* A placeholder fragment containing a simple view.
*/
public static class PlaceholderFragment extends Fragment {
public PlaceholderFragment() {
}
@Override
public View onCreateView(LayoutInflater inflater, ViewGroup container,
Bundle savedInstanceState) {
View rootView = inflater.inflate(R.layout.fragment_main, container, false);
return rootView;
}
}
public void play(View view) {
Toast.makeText(this, "play", Toast.LENGTH_SHORT).show();
// Get the file we want to playback.
File file = new File(Environment.getExternalStorageDirectory() + File.separator + "ACS.pcm");
// Get the length of the audio stored in the file (16 bit so 2 bytes per short)
// and create a short array to store the recorded audio.
int musicLength = (int)(file.length()/2);
short[] music = new short[musicLength];
try {
// Create a DataInputStream to read the audio data back from the saved file.
InputStream is = new FileInputStream(file);
BufferedInputStream bis = new BufferedInputStream(is);
DataInputStream dis = new DataInputStream(bis);
// Read the file into the music array.
int i = 0;
while (dis.available() > 0) {
music[musicLength-1-i] = dis.readShort();
i++;
}
// Close the input streams.
dis.close();
// Create a new AudioTrack object using the same parameters as the AudioRecord
// object used to create the file.
AudioTrack audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC,
11025,
AudioFormat.CHANNEL_OUT_MONO,
AudioFormat.ENCODING_PCM_16BIT,
musicLength,
AudioTrack.MODE_STREAM);
// Start playback
audioTrack.play();
// Write the music buffer to the AudioTrack object
audioTrack.write(music, 0, musicLength);
} catch (Throwable t) {
Log.e("AudioTrack","Playback Failed");
}
}
public void record(View view){
Toast.makeText(this, "record", Toast.LENGTH_SHORT).show();
Log.v("ACS", "OnCreate called");
Intent intent = new Intent(this, ACS.class);
startService(intent);
}
public void stop(View view){
Toast.makeText(this, "stop", Toast.LENGTH_SHORT).show();
Intent intent = new Intent(this, ACS.class);
stopService(intent);
}
}
And
public class ACS extends IntentService {
AudioRecord audioRecord;
public ACS() {
super("ACS");
}
@Override
protected void onHandleIntent(Intent intent) {
Log.v("ACS", "ACS called");
record();
}
public void record() {
Log.v("ACS", "Record started");
int frequency = 11025;
int channelConfiguration = AudioFormat.CHANNEL_IN_MONO;
int audioEncoding = AudioFormat.ENCODING_PCM_16BIT;
File file = new File(Environment.getExternalStorageDirectory() + File.separator + "ACS.pcm");
// Delete any previous recording.
if (file.exists())
file.delete();
// Create the new file.
try {
file.createNewFile();
} catch (IOException e) {
throw new IllegalStateException("Failed to create " + file.toString());
}
try {
// Create a DataOuputStream to write the audio data into the saved file.
OutputStream os = new FileOutputStream(file);
BufferedOutputStream bos = new BufferedOutputStream(os);
DataOutputStream dos = new DataOutputStream(bos);
// Create a new AudioRecord object to record the audio.
int bufferSize = AudioRecord.getMinBufferSize(frequency, channelConfiguration, audioEncoding);
audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC,
frequency, channelConfiguration,
audioEncoding, bufferSize);
short[] buffer = new short[bufferSize];
audioRecord.startRecording();
while (audioRecord.getRecordingState() == AudioRecord.RECORDSTATE_RECORDING) {
int bufferReadResult = audioRecord.read(buffer, 0, bufferSize);
for (int i = 0; i < bufferReadResult; i++)
dos.writeShort(buffer[i]);
}
audioRecord.stop();
dos.close();
} catch (Throwable t) {
Log.e("AudioRecord", "Recording Failed");
}
Log.v("ACS", "Record stopped");
}
public void onDestroy(){
audioRecord.stop();
Log.v("ACS", "onDestroy called, Record stopped");
}
}
Thanks in advance
Corey :)
I have the same error message "android.media.AudioTrack﹕ Front channels must be present in multichannel configurations".
When I change the audio setting from AudioFormat.CHANNEL_OUT_MONO to AudioFormat.CHANNEL_IN_MONO, the error messages disappeared. (Or you can try a different configuration, like AudioFormat.CHANNEL_IN_STEREO.)
AudioTrack audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC,
11025,
AudioFormat.CHANNEL_IN_MONO,
AudioFormat.ENCODING_PCM_16BIT,
musicLength,
AudioTrack.MODE_STREAM);
But I don't know why this happened. Hope this helps.
A mono audio file needs to be sent to both the left and right speakers. Do a bitwise OR to set this routing:
final int frontPair =
AudioFormat.CHANNEL_OUT_FRONT_LEFT | AudioFormat.CHANNEL_OUT_FRONT_RIGHT;
AudioFormat audioFormat = new AudioFormat.Builder()
.setEncoding(AudioFormat.ENCODING_PCM_8BIT)
.setSampleRate(audioSamplingRate)
.setChannelMask(frontPair)
.build();
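If you build the format this way, you would hand it to the track through AudioTrack.Builder (API 23+); a sketch, where bufferSize is assumed to come from getMinBufferSize():
AudioTrack track = new AudioTrack.Builder()
        .setAudioFormat(audioFormat) // the format built above
        .setBufferSizeInBytes(bufferSize)
        .setTransferMode(AudioTrack.MODE_STREAM)
        .build();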
I am developing an Android app, and I want to accomplish the feature below.
I will use my phone's built-in mic to record, and at the same time I want the recorded audio to be played through either the phone's speakers or headphones.
Is it feasible? If yes, please help me with this.
Here is a simple recording and playback application.
It uses the Android AudioRecord and AudioTrack APIs.
Design:
The recorded audio is written to a buffer and played back from the same buffer. This mechanism runs in a loop (on an Android thread) controlled by buttons.
Code
private String TAG = "AUDIO_RECORD_PLAYBACK";
private boolean isRunning = true;
private Thread m_thread; /* Thread for running the Loop */
private AudioRecord recorder = null;
private AudioTrack track = null;
int bufferSize = 320; /* Buffer for recording data */
byte buffer[] = new byte[bufferSize];
/* Method to Enable/Disable Buttons */
private void enableButton(int id,boolean isEnable){
((Button)findViewById(id)).setEnabled(isEnable);
}
The GUI has two buttons, START and STOP.
Enable the buttons:
enableButton(R.id.StartButton,true);
enableButton(R.id.StopButton,false);
/* Assign Button Click Handlers */
((Button)findViewById(R.id.StartButton)).setOnClickListener(btnClick);
((Button)findViewById(R.id.StopButton)).setOnClickListener(btnClick);
Map the START and STOP buttons to an OnClickListener:
private View.OnClickListener btnClick = new View.OnClickListener() {
@Override
public void onClick(View v) {
switch(v.getId()){
case R.id.StartButton:
{
Log.d(TAG, "======== Start Button Pressed ==========");
isRunning = true;
do_loopback(isRunning);
enableButton(R.id.StartButton,false);
enableButton(R.id.StopButton,true);
break;
}
case R.id.StopButton:
{
Log.d(TAG, "======== Stop Button Pressed ==========");
isRunning = false;
do_loopback(isRunning);
enableButton(R.id.StopButton,false);
enableButton(R.id.StartButton,true);
break;
}
}
}
};
Start the Thread:
private void do_loopback(final boolean flag)
{
m_thread = new Thread(new Runnable() {
public void run() {
run_loop(flag);
}
});
m_thread.start();
}
Methods for initializing AudioRecord and AudioTrack:
public AudioTrack findAudioTrack (AudioTrack track)
{
Log.d(TAG, "===== Initializing AudioTrack API ====");
int m_bufferSize = AudioTrack.getMinBufferSize(8000,
AudioFormat.CHANNEL_OUT_MONO,
AudioFormat.ENCODING_PCM_16BIT);
if (m_bufferSize != AudioTrack.ERROR_BAD_VALUE)
{
track = new AudioTrack(AudioManager.STREAM_MUSIC, 8000,
AudioFormat.CHANNEL_OUT_MONO,
AudioFormat.ENCODING_PCM_16BIT, m_bufferSize,
AudioTrack.MODE_STREAM);
if (track.getState() == AudioTrack.STATE_UNINITIALIZED) {
Log.e(TAG, "===== AudioTrack Uninitialized =====");
return null;
}
}
return track;
}
public AudioRecord findAudioRecord (AudioRecord recorder)
{
Log.d(TAG, "===== Initializing AudioRecord API =====");
int m_bufferSize = AudioRecord.getMinBufferSize(8000,
AudioFormat.CHANNEL_IN_MONO,
AudioFormat.ENCODING_PCM_16BIT);
if (m_bufferSize != AudioRecord.ERROR_BAD_VALUE)
{
recorder = new AudioRecord(MediaRecorder.AudioSource.MIC, 8000,
AudioFormat.CHANNEL_IN_MONO,
AudioFormat.ENCODING_PCM_16BIT, m_bufferSize);
if (recorder.getState() == AudioRecord.STATE_UNINITIALIZED) {
Log.e(TAG, "====== AudioRecord uninitialized ====== ");
return null;
}
}
return recorder;
}
The values for findAudioRecord or findAudioTrack can change based on the device.
Please refer to this question.
Code for Running the loop:
public void run_loop (boolean isRunning)
{
/** == If Stop Button is pressed == **/
if (isRunning == false) {
Log.d(TAG, "===== Stop Button is pressed ===== ");
if (AudioRecord.STATE_INITIALIZED == recorder.getState()){
recorder.stop();
recorder.release();
}
if (AudioTrack.STATE_INITIALIZED == track.getState()){
track.stop();
track.release();
}
return;
}
/** ======= Initialize AudioRecord and AudioTrack ======== **/
recorder = findAudioRecord(recorder);
if (recorder == null) {
Log.e(TAG, "======== findAudioRecord : Returned Error! =========== ");
return;
}
track = findAudioTrack(track);
if (track == null) {
Log.e(TAG, "======== findAudioTrack : Returned Error! ========== ");
return;
}
if ((AudioRecord.STATE_INITIALIZED == recorder.getState()) &&
(AudioTrack.STATE_INITIALIZED == track.getState()))
{
recorder.startRecording();
Log.d(TAG, "========= Recorder Started... =========");
track.play();
Log.d(TAG, "========= Track Started... =========");
}
else
{
Log.d(TAG, "==== Initialization failed for AudioRecord or AudioTrack =====");
return;
}
/** ------------------------------------------------------ **/
/* Recording and Playing in chunks of 320 bytes */
bufferSize = 320;
while (isRunning == true)
{
/* Read & Write to the Device */
recorder.read(buffer, 0, bufferSize);
track.write(buffer, 0, bufferSize);
}
Log.i(TAG, "Loopback exit");
return;
}
Please include the following in AndroidManifest.xml
<uses-permission android:name="android.permission.RECORD_AUDIO" > </uses-permission>
The above procedure is also possible by writing to and reading from a file, using the same APIs.
Why use AudioRecord over MediaRecorder? See here.
The code is tested (on a Google Nexus 5) and works.
Note: please add some error-checking code for recorder.read and track.write in case they fail. The same applies to findAudioRecord and findAudioTrack.
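For example, a minimal sketch of that error checking (these are real AudioRecord/AudioTrack error constants, but adapt the handling to your needs):
int read = recorder.read(buffer, 0, bufferSize);
if (read == AudioRecord.ERROR_INVALID_OPERATION || read == AudioRecord.ERROR_BAD_VALUE) {
    Log.e(TAG, "recorder.read() failed: " + read);
    return;
}
int written = track.write(buffer, 0, read);
if (written < 0) { // AudioTrack error codes are negative
    Log.e(TAG, "track.write() failed: " + written);
    return;
}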
First, in the onCreate method, create a MediaRecorder object and the path to the file where you want to save the recorded data.
String outputFile = Environment.getExternalStorageDirectory().
getAbsolutePath() + "/myrecording.3gp"; // Define outputFile outside onCreate method
MediaRecorder myAudioRecorder = new MediaRecorder(); // Define this outside onCreate method
myAudioRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
myAudioRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
myAudioRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
myAudioRecorder.setOutputFile(outputFile);
You can call these three functions from any buttons, to start, stop, and play the recording:
public void start(View view){
try {
myAudioRecorder.prepare();
myAudioRecorder.start();
} catch (IllegalStateException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
start.setEnabled(false);
stop.setEnabled(true);
Toast.makeText(getApplicationContext(), "Recording started", Toast.LENGTH_LONG).show();
}
public void stop(View view){
myAudioRecorder.stop();
myAudioRecorder.release();
myAudioRecorder = null;
stop.setEnabled(false);
play.setEnabled(true);
Toast.makeText(getApplicationContext(), "Audio recorded successfully",
Toast.LENGTH_LONG).show();
}
public void play(View view) throws IllegalArgumentException,
SecurityException, IllegalStateException, IOException{
MediaPlayer m = new MediaPlayer();
m.setDataSource(outputFile);
m.prepare();
m.start();
Toast.makeText(getApplicationContext(), "Playing audio", Toast.LENGTH_LONG).show();
}
As I read in the developer documentation here, Android supports the RTSP protocol (for real-time streaming) and also the HTTP/HTTPS live streaming draft protocol.
There is also an example here. You need basic knowledge of a streaming server, like Red5 or Wowza.
I'm building, inside my existing app, a player using the AudioTrack class, in MODE_STATIC, because I want to implement the timestretch and loop-points features.
The code is OK for start() and stop(), but when paused, if I try to resume by calling play() again, the progress bar remains fixed and no audio is played.
Now, from the docs:
public void pause(): Pauses the playback of the audio data. Data that has not been played back will not be discarded. Subsequent calls to play() will play this data back. See flush() to discard this data.
It seems easy enough to understand, but there is something that escapes me.
Can someone help me?
Is it necessary to create boolean variables like start, play, pause, stopAudio, etc.?
If yes, what is the point of the methods inherited from the AudioTrack class?
In MODE_STREAM I have implemented the project using the above boolean variables, but I need MODE_STATIC.
This is the code, thanks:
Button playpause, stop;
SeekBar posBar;
int sliderval=0;
int headerOffset = 0x2C;
File file =new File(Environment.getExternalStorageDirectory(), "raw.pcm");
int fileSize = (int) file.length();
int dataSize = fileSize-headerOffset ;
byte[] dataArray = new byte[dataSize];
int posValue;
int dataBytesRead = initializeTrack();
AudioTrack audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC, 44100,
AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT, dataBytesRead , AudioTrack.MODE_STATIC);
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.main);
playpause= (Button)(findViewById(R.id.playpause));
stop= (Button)(findViewById(R.id.stop));
posBar=(SeekBar)findViewById(R.id.posBar);
// create a listener for the slider bar;
OnSeekBarChangeListener listener = new OnSeekBarChangeListener() {
public void onStopTrackingTouch(SeekBar seekBar) { }
public void onStartTrackingTouch(SeekBar seekBar) { }
public void onProgressChanged(SeekBar seekBar, int progress, boolean fromUser) {
if (fromUser) { sliderval = progress;}
}
};
// set the listener on the slider
posBar.setOnSeekBarChangeListener(listener);
}
public void toggleButtonSound(View button)
{
switch (button.getId())
{
case R.id.playpause:
play();
break;
case R.id.stop:
stop();
break;
}
}
private void stop() {
if(audioTrack.getState()==AudioTrack.PLAYSTATE_PLAYING ||
audioTrack.getState()==AudioTrack.PLAYSTATE_PAUSED || audioTrack.getState()==AudioTrack.PLAYSTATE_STOPPED)
{ audioTrack.stop();
resetPlayer();}
}
Context context;
private double actualPos=0;
public void pause() {}
public void play()
{
if (audioTrack.getPlayState()==AudioTrack.PLAYSTATE_PLAYING)
{ //Log.i("", "Play pressed in state "+audioTrack.getPlayState());
audioTrack.pause();
}
else if (audioTrack.getPlayState()==AudioTrack.PLAYSTATE_PAUSED)
{ //Log.i("", "Play pressed in state "+audioTrack.getPlayState());
audioTrack.play();
}
else if (audioTrack.getPlayState()==AudioTrack.PLAYSTATE_STOPPED)
{ //Log.i("", "Play pressed in state "+audioTrack.getPlayState());
audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC, 44100, AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT, dataSize, AudioTrack.MODE_STATIC);
audioTrack.write(dataArray, 0, dataBytesRead);
audioTrack.play();
}
posBar.setMax((int) (dataBytesRead/2)); // Set the maximum range of the seek bar (in samples)
audioTrack.setNotificationMarkerPosition((int) (dataSize/2));
audioTrack.setPositionNotificationPeriod(1000);
audioTrack.setPlaybackPositionUpdateListener(new OnPlaybackPositionUpdateListener() {
@Override
public void onPeriodicNotification(AudioTrack track) {
posBar.setProgress(audioTrack.getPlaybackHeadPosition());
Log.i("", " " + audioTrack.getPlaybackHeadPosition() + " " + dataBytesRead/2);
}
@Override
public void onMarkerReached(AudioTrack track) {
Log.i("", " End reached ");
audioTrack.pause();
audioTrack.flush();
audioTrack.release();
posBar.setProgress(0);
resetPlayer();}
});
}
private int initializeTrack() {
InputStream is;
BufferedInputStream bis;
DataInputStream dis;
int temp = 0;
try {
is = new FileInputStream(file);
bis = new BufferedInputStream(is);
dis = new DataInputStream(bis);
temp = dis.read(dataArray, 0, dataSize);
dis.close();
bis.close();
is.close();
} catch (FileNotFoundException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
return temp;
}
public void resetPlayer() {
audioTrack.flush();
audioTrack.release();
posBar.setProgress(0);
sliderval=0;
}
You see, you implemented AudioTrack in such a way that even when it's paused, the contents of the file are still uploaded to the AudioTrack. I don't know how it manages that, but in my case I also pause the data upload to the AudioTrack, like this:
while (byteOffset < fileLength) {
if(isPaused)
continue;
ret = in.read(byteData, 0, byteCount);
if (ret != -1) { // Write the byte array to the track
audioTrack.write(byteData, 0, ret);
byteOffset += ret;
} else
break;
}
So when I unpause the AudioTrack, the file-uploading while loop resumes too. I guess that's it. Also, I have to mention that even when the AudioTrack is playing, the following checks:
if (audioTrack.getPlayState()==AudioTrack.PLAYSTATE_PLAYING)
and
if (audioTrack.getPlayState()==AudioTrack.PLAYSTATE_PAUSED)
don't work for me: getPlayState() always returns 1 (AudioTrack.PLAYSTATE_STOPPED), no matter whether the track is playing or has been paused.
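One caveat about the loop above: spinning on continue busy-waits at full speed while paused. A small variation (a sketch, not tested against the original code) sleeps briefly instead:
while (byteOffset < fileLength) {
    if (isPaused) {
        try {
            Thread.sleep(20); // yield the CPU while paused
        } catch (InterruptedException e) {
            break;
        }
        continue;
    }
    ret = in.read(byteData, 0, byteCount);
    if (ret == -1)
        break;
    audioTrack.write(byteData, 0, ret); // write the byte array to the track
    byteOffset += ret;
}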
I am using AudioRecord to read microphone data and RandomAccessFile to write it to a wav file. This is the code:
public class MainActivity extends Activity {
AudioManager am = null;
AudioRecord record =null;
// AudioTrack track =null;
final int SAMPLE_FREQUENCY = 44100;
final int SIZE_OF_RECORD_ARRAY = 1024; // 1024 ORIGINAL
final int WAV_SAMPLE_MULTIPLICATION_FACTOR = 1;
int i= 0;
boolean isPlaying = false;
private volatile boolean keepThreadRunning;
// private RandomAccessFile stateFile, stateFileTemp, savToDisk;
private RandomAccessFile savToDisk;
private FileDescriptor fd = new FileDescriptor();
private File delFile, renFile;
String stateFileLoc = Environment.getExternalStorageDirectory().getPath();
// To keep writeHeader() happy
private short nChannels = 1;
private int sRate = SAMPLE_FREQUENCY;
private short mBitsPersample = 16; // represents 16 bits of one PCM sample
private int payload;
class MyThread extends Thread{
private volatile boolean needsToPassThrough;
// /*
MyThread(){
super();
}
MyThread(boolean newPTV){
this.needsToPassThrough = newPTV;
}
// */
// /*
@Override
public void run(){
short[] lin = new short[SIZE_OF_RECORD_ARRAY];
// byte[] lin = new byte[SIZE_OF_RECORD_ARRAY];
int num = 0;
// /*
if(needsToPassThrough){
record.startRecording();
// track.play();
}
// */
while (keepThreadRunning) {
// while (!isInterrupted()) {
// num = record.read(lin, 0, SIZE_OF_RECORD_ARRAY);
num = record.read(lin, 0, lin.length);
try {
// savToDisk.write(lin); // use only this line if lin is a byte array
// use the for loop block below if lin is an array of short
for(i=0;i <lin.length; i++)
savToDisk.writeShort(Short.reverseBytes(lin[i]));
// payload += lin.length; // use this line if lin is an array of byte
payload = payload + (lin.length)*2;
fd.sync();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
/*
catch (SyncFailedException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
*/
}
// /*
record.stop();
// track.stop();
record.release();
// track.release();
// */
}
// */
// /*
public void stopThread(){
keepThreadRunning = false;
}
// */
}
MyThread newThread;
private void init() {
int min = AudioRecord.getMinBufferSize(SAMPLE_FREQUENCY, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
// Toast.makeText(getApplicationContext(), Integer.toString(min), Toast.LENGTH_SHORT).show(); // Shows 4096
record = new AudioRecord(MediaRecorder.AudioSource.VOICE_COMMUNICATION, SAMPLE_FREQUENCY, AudioFormat.CHANNEL_IN_MONO,
AudioFormat.ENCODING_PCM_16BIT, min);
// int maxJitter = AudioTrack.getMinBufferSize(SAMPLE_FREQUENCY, AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);
// track = new AudioTrack(AudioManager.MODE_IN_COMMUNICATION, SAMPLE_FREQUENCY, AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT, maxJitter, AudioTrack.MODE_STREAM);
am = (AudioManager) this.getSystemService(Context.AUDIO_SERVICE);
am.setMode(AudioManager.MODE_IN_COMMUNICATION);
try {
savToDisk = new RandomAccessFile(stateFileLoc+"/audSampData.wav", "rw");
} catch (FileNotFoundException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
try {
fd = savToDisk.getFD();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
private void writeHeader(){
try {
savToDisk.setLength(0); // Set file length to 0, to prevent unexpected behavior in case the file already existed
savToDisk.writeBytes("RIFF");
savToDisk.writeInt(0); // Final file size not known yet, write 0
savToDisk.writeBytes("WAVE");
savToDisk.writeBytes("fmt ");
savToDisk.writeInt(Integer.reverseBytes(16)); // Sub-chunk size, 16 for PCM
savToDisk.writeShort(Short.reverseBytes((short) 1)); // AudioFormat, 1 for PCM
savToDisk.writeShort(Short.reverseBytes(nChannels));// Number of channels, 1 for mono, 2 for stereo
savToDisk.writeInt(Integer.reverseBytes(sRate)); // Sample rate
savToDisk.writeInt(Integer.reverseBytes(sRate*nChannels*mBitsPersample/8)); // Byte rate, SampleRate*NumberOfChannels*mBitsPersample/8
savToDisk.writeShort(Short.reverseBytes((short)(nChannels*mBitsPersample/8))); // Block align, NumberOfChannels*mBitsPersample/8
savToDisk.writeShort(Short.reverseBytes(mBitsPersample)); // Bits per sample
savToDisk.writeBytes("data");
savToDisk.writeInt(0); // Data chunk size not known yet, write 0
fd.sync();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
@Override
protected void onResume(){
super.onResume();
// newThread.stopThread();
Log.d("MYLOG", "onResume() called");
init();
writeHeader();
keepThreadRunning = true;
// */
// newThread = new MyThread(true);
newThread = new MyThread(isPlaying);
newThread.start();
}
@Override
protected void onPause(){
super.onPause();
Log.d("MYLOG", "onPause() called");
newThread.stopThread();
// android.os.Process.killProcess(android.os.Process.myPid());
try {
savToDisk.seek(4);
savToDisk.writeInt(Integer.reverseBytes(36+payload));
savToDisk.seek(40);
savToDisk.writeInt(Integer.reverseBytes(payload));
savToDisk.close();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
setVolumeControlStream(AudioManager.MODE_IN_COMMUNICATION);
payload = 0;
Log.d("MYLOG","onCreate() called");
}
@Override
public boolean onCreateOptionsMenu(Menu menu) {
// Inflate the menu; this adds items to the action bar if it is present.
getMenuInflater().inflate(R.menu.main, menu);
return true;
}
@Override
protected void onDestroy() {
super.onDestroy();
newThread.stopThread();
// android.os.Process.killProcess(android.os.Process.myPid());
// killProcess(android.os.Process.myPid());
// newThread.interrupt();
// delFile.delete();
Log.d("MYLOG", "onDestroy() called");
/*
try {
savToDisk.close();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
*/
}
public void passStop(View view){
Button playBtn = (Button) findViewById(R.id.button1);
// /*
if(!isPlaying){
record.startRecording();
// track.play();
isPlaying = true;
playBtn.setText("Pause");
}
else{
record.stop();
// track.pause();
isPlaying=false;
playBtn.setText("Pass through");
}
// */
}
}
When I play the wav file in an audio player, it sounds sped up and also seems to skip frames. What could be the reasons for this? I believe the frame-skipping is probably due to the fact that I used the writeShort() function to write out each element of the short array holding the audio sample data separately; if that is the case, please suggest a workaround that still writes the data as shorts (and not the write(byte[]) function, because I need to use parts of this code in my main project, which obtains audio samples in a short array). Also, why is it sped up?
Take a look at this; this question put me on the right track:
Android : recording audio using audiorecord class play as fast forwarded
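In case that link goes stale: one workaround that keeps the samples in a short array but avoids the per-sample writeShort() calls is to bulk-convert each buffer to little-endian bytes and write it once; a sketch against the question's code (num is the count returned by read(); needs java.nio.ByteBuffer and ByteOrder):
ByteBuffer bb = ByteBuffer.allocate(num * 2).order(ByteOrder.LITTLE_ENDIAN);
bb.asShortBuffer().put(lin, 0, num); // 16-bit PCM, little-endian
savToDisk.write(bb.array(), 0, num * 2); // one write per buffer
payload += num * 2;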
I'm playing .wav files using AudioTrack.
I have a problem.
I set the setLoopPoints to loop my .wav files, but it doesn't work.
This is my sample code.
public class PleaseActivity extends Activity implements Runnable{
AudioTrack audioTrack;
public static final String MEDIA_PATH = Environment.getExternalStorageDirectory().getAbsolutePath()+"/TEST";
/** Called when the activity is first created. */
Button play_button, stop_button;
File file = null;
byte[] byteData = null;
Boolean playing = false;
int bufSize;
AudioTrack myAT = null;
Thread play_thread = null;
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.main);
play_button = (Button) findViewById(R.id.btn1);
stop_button = (Button) findViewById(R.id.btn2);
file = new File(MEDIA_PATH+"/untitled1.wav");
byteData = new byte[(int) file.length()];
FileInputStream in = null;
try {
in = new FileInputStream(file);
in.read(byteData);
in.close();
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
initialize();
play_button.setOnClickListener(new OnClickListener() {
public void onClick(View v) {
play_thread.start();
}
});
//
stop_button.setOnClickListener(new OnClickListener() {
public void onClick(View v) {
//
if (myAT.getPlayState() == AudioTrack.PLAYSTATE_PLAYING) {
myAT.stop();
play_thread = null;
initialize();
}
}
});
}
void initialize() {
bufSize = android.media.AudioTrack.getMinBufferSize(44100,
AudioFormat.CHANNEL_CONFIGURATION_STEREO,
AudioFormat.ENCODING_PCM_16BIT);
myAT = new AudioTrack(AudioManager.STREAM_MUSIC,
44100, AudioFormat.CHANNEL_CONFIGURATION_STEREO,
AudioFormat.ENCODING_PCM_16BIT, bufSize,
AudioTrack.MODE_STREAM);
play_thread = new Thread(this);
}
public void run() {
if (myAT != null) {
myAT.play();
myAT.setLoopPoints(0, byteData.length, 2);
myAT.write(byteData, 0, byteData.length);
}
}
}
I can play my wave files fine, but setLoopPoints doesn't work!
Can anybody help me?
I solved the looping problem like this.
I have another problem, though: whenever I write data into the AudioTrack, I mean whenever the AudioTrack repeats, some noise like a "tick" is added at the very start.
I don't know how to eliminate this noise. Does anybody know how to solve it?
class DLThread extends Thread
{
public void run()
{
while(!DLThread.interrupted())
{
if (myAT != null) {
//
myAT.play();
myAT.flush();
myAT.write(byteData, 0, byteData.length);
}
}
}
}
public int setLoopPoints (int startInFrames, int endInFrames, int loopCount)
Sets the loop points and the loop count. The loop can be infinite. Similarly to setPlaybackHeadPosition, the track must be stopped or paused for the position to be changed, and *must use the MODE_STATIC mode*.
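Your track is created with AudioTrack.MODE_STREAM, which is most likely why setLoopPoints has no effect. A minimal MODE_STATIC sketch (the frame math assumes 16-bit stereo, i.e. 4 bytes per frame):
AudioTrack at = new AudioTrack(AudioManager.STREAM_MUSIC, 44100,
        AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_16BIT,
        byteData.length, AudioTrack.MODE_STATIC); // buffer holds the whole clip
at.write(byteData, 0, byteData.length); // upload once, before play()
int endFrame = byteData.length / 4; // 16-bit stereo: 4 bytes per frame
at.setLoopPoints(0, endFrame, -1); // -1 = loop indefinitely
at.play();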
The tick is probably the wav file header. Try offsetting the playback by 44 bytes.
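A sketch of that offset (44 bytes is the canonical PCM WAV header size; non-canonical files can differ), with the usual IOException handling assumed around it:
FileInputStream in = new FileInputStream(file);
in.skip(44); // jump past the RIFF/WAVE header
byte[] pcm = new byte[(int) file.length() - 44];
int n = in.read(pcm); // raw PCM only
in.close();
myAT.write(pcm, 0, n);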