I'm trying to stream music using AudioTrack.
The track plays, but the audio comes out at half speed; it's like the song is in slow motion.
try {
    AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC,
            44100,
            android.media.AudioFormat.CHANNEL_CONFIGURATION_MONO,
            android.media.AudioFormat.ENCODING_PCM_16BIT,
            android.media.AudioTrack.getMinBufferSize(44100,
                    android.media.AudioFormat.CHANNEL_CONFIGURATION_STEREO,
                    android.media.AudioFormat.ENCODING_PCM_16BIT),
            AudioTrack.MODE_STREAM);

    System.out.println("Min buffer: " + android.media.AudioTrack.getMinBufferSize(44100,
            android.media.AudioFormat.CHANNEL_CONFIGURATION_STEREO,
            android.media.AudioFormat.ENCODING_PCM_16BIT));

    int cnt;
    int totalWrite = 0;
    boolean play = true;
    byte buff[] = new byte[16384];

    while ((cnt = CircularByteBuffer.getInstance().getInputStream().read(buff)) > 0) {
        totalWrite += cnt;
        System.out.println("Writing: " + cnt);
        track.write(buff, 0, cnt);
        if (totalWrite > 60000 && play) {
            track.play();
            play = false;
        }
    }
} catch (Exception e) {
} //end catch
In the CircularByteBuffer, the bytes are being written on another thread and read on this one. The song plays consistently without any pauses, but it just plays at a slow rate. I have no idea what it could be. Any ideas?
Assuming this is in fact a stereo stream, why are you creating the AudioTrack with CHANNEL_CONFIGURATION_MONO?
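If the source really is stereo, the channel configuration passed to the constructor should match the one passed to getMinBufferSize. A minimal sketch, assuming a 44.1 kHz, 16-bit stereo stream (CHANNEL_OUT_STEREO is the current name for the deprecated CHANNEL_CONFIGURATION_STEREO):

// Query the buffer size and create the track with the SAME channel config.
int minBuf = AudioTrack.getMinBufferSize(44100,
        AudioFormat.CHANNEL_OUT_STEREO,
        AudioFormat.ENCODING_PCM_16BIT);

AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC,
        44100,
        AudioFormat.CHANNEL_OUT_STEREO,   // stereo here too, not MONO
        AudioFormat.ENCODING_PCM_16BIT,
        minBuf,
        AudioTrack.MODE_STREAM);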
Try:
track.setPlaybackRate(88200);
track.play();
It should then play at normal speed. Because the track was created as mono while the data is stereo, each stereo frame is consumed as two mono samples, so the audio takes twice as long to play; doubling the playback rate (twice 44100 = 88200) compensates for that.
So it turns out that all the settings were fine. I kept the CHANNEL_CONFIGURATION_STEREO setting, but tried running the application on an actual device. When I streamed from the device, the music played perfectly fine; it was the emulator that was causing all the problems with the song.
I hope that's the real reason and the emulator is just a memory hog that runs too inefficiently for real-time audio.
I have an Android app where some raw audio bytes are stored in a variable.
If I use an AudioTrack to play back this audio data, it only works if I use AudioTrack.MODE_STREAM:
byte[] recordedAudioAsBytes;

public void playButtonPressed(View v) {
    // this verifies that audio data exists as expected
    for (int i = 0; i < recordedAudioAsBytes.length; i++) {
        Log.i("ABC", "byte[" + i + "] = " + recordedAudioAsBytes[i]);
    }

    // STREAM MODE ACTUALLY WORKS!!
    /*
    AudioTrack player = new AudioTrack(AudioManager.STREAM_MUSIC, SAMPLERATE, CHANNELS,
            ENCODING, MY_CHOSEN_BUFFER_SIZE, AudioTrack.MODE_STREAM);
    player.play();
    player.write(recordedAudioAsBytes, 0, recordedAudioAsBytes.length);
    */

    // STATIC MODE DOES NOT WORK
    AudioTrack player = new AudioTrack(AudioManager.STREAM_MUSIC, SAMPLERATE, PLAYBACK_CHANNELS,
            ENCODING, MY_CHOSEN_BUFFER_SIZE, AudioTrack.MODE_STATIC);
    player.write(recordedAudioAsBytes, 0, recordedAudioAsBytes.length);
    player.play();
}
If I use AudioTrack.MODE_STATIC, the output is glitchy -- it just makes a nasty pop and sounds very short with hardly anything audible.
So why is that? Does MODE_STATIC require that the audio data have a header?
That's all I can think of.
If you'd like to see all the code, check this question.
It seems to me that you are using the same MY_CHOSEN_BUFFER_SIZE for both 'streaming' and 'static' mode. That would explain why it sounds so short.
To use AudioTrack's static mode you have to pass the size of your byte array (bigger will also work) as the buffer size; the audio is treated as one big chunk of data.
See AudioTrack.Builder, setBufferSizeInBytes(): "If using the AudioTrack in static mode (see AudioTrack#MODE_STATIC), this is the maximum size of the sound that will be played by this instance."
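A minimal sketch of the corrected static-mode call, reusing the question's SAMPLERATE, PLAYBACK_CHANNELS, and ENCODING constants; the only change is sizing the track's buffer from the data itself:

// Static mode: the track buffer must hold the entire clip,
// so size it from the array rather than from getMinBufferSize().
AudioTrack player = new AudioTrack(AudioManager.STREAM_MUSIC, SAMPLERATE,
        PLAYBACK_CHANNELS, ENCODING,
        recordedAudioAsBytes.length,   // buffer size = whole sound, in bytes
        AudioTrack.MODE_STATIC);
player.write(recordedAudioAsBytes, 0, recordedAudioAsBytes.length);
player.play();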
I am currently developing a realtime audio DSP app for Android. The signal is taken from the mic input (on the phone or tablet), processed, and then played back.
When my app starts, if a mic and headphone jack are present, the audio DSP gets initialized. For the first second there is output from the speakers or headphones, but after that there is just silence; it seems that playback has stopped, although recording still works in the background.
I use the AudioRecord class to get raw audio data, and for playback I use AudioTrack.
For now I have the following code (I wish I knew why it is not working) for reading the stream from the mic and playing it back:
public void ProcessAudio() {
    _track = new AudioTrack(AudioManager.STREAM_MUSIC, _recorder.getSampleRate(),
            AudioFormat.CHANNEL_OUT_MONO, _recorder.getAudioFormat(),
            _bufferSize, AudioTrack.MODE_STATIC);
    boolean first = true;
    while (_IsRecording) {
        try {
            byte sData[] = new byte[_bufferSize];
            _recorder.read(sData, 0, _bufferSize);
            //TODO: add audio DSP stuff here
            _track.write(sData, 0, _bufferSize);
            if (first) {
                first = false;
                _track.play();
            }
        } catch (Exception ex) {
            Log.e(StaticVars.SOFTWARE_TAG, "Failed: " + ex.getMessage());
        }
    }
    _track.flush();
    _track.stop();
    _track.release();
}
So... it turns out that I didn't initialize my AudioTrack correctly. Once I changed:
_track = new AudioTrack(AudioManager.STREAM_MUSIC, _recorder.getSampleRate(), AudioFormat.CHANNEL_CONFIGURATION_MONO, _recorder.getAudioFormat(), _bufferSize, AudioTrack.MODE_STATIC);
to:
_track = new AudioTrack(AudioManager.STREAM_MUSIC, _recorder.getSampleRate(), AudioFormat.CHANNEL_CONFIGURATION_MONO, _recorder.getAudioFormat(), _bufferSize, AudioTrack.MODE_STREAM);
it started to work. MODE_STATIC is meant for short sounds that fit entirely in the track's buffer and are written once; for a loop that keeps writing new data you need MODE_STREAM.
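For reference, a minimal sketch of the corrected streaming setup, reusing the question's _recorder, _bufferSize, and _IsRecording names (CHANNEL_OUT_MONO is the current name for the deprecated CHANNEL_CONFIGURATION_MONO; calling play() once before the loop is the usual pattern):

_track = new AudioTrack(AudioManager.STREAM_MUSIC, _recorder.getSampleRate(),
        AudioFormat.CHANNEL_OUT_MONO, _recorder.getAudioFormat(),
        _bufferSize, AudioTrack.MODE_STREAM);   // STREAM, not STATIC
_track.play();                                  // start once, before the loop

byte[] sData = new byte[_bufferSize];
while (_IsRecording) {
    int n = _recorder.read(sData, 0, _bufferSize);
    if (n > 0) {
        _track.write(sData, 0, n);              // write only what was read
    }
}
_track.stop();
_track.release();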
Edit:
Because of audio latency on Android, realtime audio processing is not really feasible, at least on Android 4.4 and below.
Basically, my program records user input via the microphone and stores it as a .pcm file in the sdcard/ directory. It will be overwritten if there is an existing one. The file is later sent for playback and analysis (mainly FFT and RMS computation).
I have added another function which allows the program to record system audio, so the user's mp3 files can be analyzed as well. It streams the system audio and stores it as a .pcm file for later playback and analysis.
It's all functioning well. However, there's a slight issue: when the program streams audio, it also captures input from the mic, so there is noise in the playback. I do not want this, as it affects the analysis readings. I googled for a solution and found that I can actually mute the mic. So now I want to mute the mic while the mp3 file is being streamed.
The code I have found is:
AudioManager.setMicrophoneMute(true);
I tried to implement it, but it just crashes my application. I have been searching for a solution for days but cannot seem to find one.
Here is my code snippet for the part where I want to stream system audio and muting the microphone before it starts streaming.
// create a new AudioRecord object to record the audio data of an mp3 file
int bufferSize = AudioRecord.getMinBufferSize(frequency, channelConfiguration, audioEncoding);
audioRecord = new AudioRecord(AudioManager.STREAM_MUSIC,
        frequency, channelConfiguration,
        audioEncoding, bufferSize);

// a short array to store raw pcm data
short[] buffer = new short[bufferSize];
Log.i("decoder", "The audio record created fine ready to record");

try {
    audioManager.setMicrophoneMute(true);
} catch (Exception e) {
    e.printStackTrace();
}

audioRecord.startRecording();
isDecoding = true;
When the setMicrophoneMute(true) line is surrounded with try-catch, the program only crashes when I send the recording for playback. The errors are as follows:
"AudioFlinger could not create track, status: -12"
"Error initializing AudioTrack"
"[android.media.AudioTrack] Error code -20 when initializing AudioTrack."
When it is not surrounded with try-catch, the program crashes the moment I click the start-streaming button.
"Decoding failed" (this is an error log from catching a throwable)
How can I mute the microphone input while streaming the system audio? Let me know if I can provide more code. Thank you!
EDIT
I have implemented my microphone mute successfully, and it even returns true for isMicrophoneMute(); however, the mic is not actually muted, as it still records from the microphone. It's a false positive.
Based on the suggested answer, I have already created a class for audio focus as below:
private final Context c;

private final AudioManager.OnAudioFocusChangeListener changeListener =
        new AudioManager.OnAudioFocusChangeListener() {
            public void onAudioFocusChange(int focusChange) {
                // nothing to do
            }
        };

AudioFocus(Context context) {
    c = context;
}

public void grabFocus() {
    final AudioManager am = (AudioManager) c.getSystemService(Context.AUDIO_SERVICE);
    final int result = am.requestAudioFocus(changeListener,
            AudioManager.STREAM_MUSIC,
            AudioManager.AUDIOFOCUS_GAIN);
    Log.d("audiofocus", "Grab audio focus: " + result);
}

public void releaseFocus() {
    final AudioManager am = (AudioManager) c.getSystemService(Context.AUDIO_SERVICE);
    final int result = am.abandonAudioFocus(changeListener);
    Log.d("audiofocus", "Abandon audio focus: " + result);
}
I then call the method from my Decoder class to request audio focus:
int bufferSize = AudioRecord.getMinBufferSize(frequency, channelConfiguration, audioEncoding);

audioFocus.grabFocus();

audioRecord = new AudioRecord(AudioManager.STREAM_MUSIC,
        frequency, channelConfiguration,
        audioEncoding, bufferSize);

// a short array to store raw pcm data
short[] buffer = new short[bufferSize];
Log.i("decoder", "The audio record created fine ready to record");

audioRecord.startRecording();
isDecoding = true;
Log.i("decoder", "Start recording fine");
And then release the focus when stop decoding is pressed:
// stops recording
public void stopDecoding() {
    isDecoding = false;
    Log.i("decoder", "Out of recording");
    audioRecord.stop();
    try {
        dos.close();
    } catch (IOException e) {
        e.printStackTrace();
    }
    mp.stop();
    mp.release();
    audioFocus.releaseFocus();
}
However, this makes my application crash. Where did I go wrong?
The following snippet requests permanent audio focus on the music audio stream. You should request audio focus immediately before you begin playback, such as when the user presses play. I think this would be the way to go rather than muting the input microphone. Check out the developer audio focus docs for more information.
AudioManager am = (AudioManager) mContext.getSystemService(Context.AUDIO_SERVICE);
...

// Request audio focus for playback
int result = am.requestAudioFocus(afChangeListener,
        // Use the music stream.
        AudioManager.STREAM_MUSIC,
        // Request permanent focus.
        AudioManager.AUDIOFOCUS_GAIN);

if (result == AudioManager.AUDIOFOCUS_REQUEST_GRANTED) {
    am.registerMediaButtonEventReceiver(RemoteControlReceiver);
    // Start playback.
}
I'm trying to record from the MIC directly into a short array.
The goal is not to write the audio to a file, just to save it in a short array.
I've tried several methods, and the best I've found is recording with AudioRecord and playing it back with AudioTrack. I found a good class here:
Android: Need to record mic input
This class does everything I need; I just have to modify it to achieve my desired result, but... I can't get it right, I'm missing something...
Here is my modification (not working at all):
private class Audio extends Thread {
    private boolean stopped = false;

    /**
     * Give the thread high priority so that it's not canceled unexpectedly, and start it
     */
    private Audio() {
        android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);
        start();
    }

    @Override
    public void run() {
        Log.i("Audio", "Running Audio Thread");
        AudioRecord recorder = null;
        AudioTrack track = null;
        //short[][] buffers = new short[256][160];
        int ix = 0;

        /*
         * Initialize buffer to hold continuously recorded audio data, start recording, and start
         * playback.
         */
        try {
            int N = AudioRecord.getMinBufferSize(8000, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
            recorder = new AudioRecord(AudioSource.MIC, 8000, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, N * 10);
            short[] buff = new short[N];
            recorder.startRecording();

            /*
             * Loops until something outside of this thread stops it.
             * Reads the data from the recorder and writes it to the audio track for playback.
             */
            while (!stopped) {
                //Log.i("Map", "Writing new data to buffer");
                //short[] buffer = buffer[ix++ % buffer.length];
                N = recorder.read(buff, 0, buff.length);
            }
            recorder.stop();
            recorder.release();

            track = new AudioTrack(AudioManager.STREAM_MUSIC, 8000,
                    AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT, N * 10, AudioTrack.MODE_STREAM);
            track.play();
            for (int i = 0; i < buff.length; i++) {
                track.write(buff, i, buff.length);
            }
        } catch (Exception x) {
            //Log.e("Audio", x.getMessage());
            x.printStackTrace();
        } finally {
            track.stop();
            track.release();
        }
    }

    /**
     * Called from outside of the thread in order to stop the recording/playback loop
     */
    private void close() {
        stopped = true;
    }
}
What I need is to record the sound into the short array buffer and play it when the user pushes a button. But right now I'm trying something simpler: record the sound and, when the user pushes a button, stop recording and start playing it back...
Can anyone help me?
Thanks.
You need to restructure the code to do what you want it to do. If I understand correctly, you want to read sound until 'stopped' is set to true, then play the data.
Just so you understand, that is potentially a lot of buffered data, depending on how long the recording is. You could write it to a file or store a series of buffers in some abstract data type.
Just to get something working, create a Vector of short[], allocate a new short[] buffer in your 'while(!stopped)' loop, and then stuff it into the Vector.
After the while loop stops, you can iterate through the Vector and write the buffers to the AudioTrack.
As you now understand, the blip you were hearing was just the last 20 ms or so of audio, since your single buffer only kept that last little bit.
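A minimal sketch of that restructuring, under the question's assumptions (8 kHz, mono, 16-bit PCM, and the class's stopped flag); a fresh short[] is allocated per read so earlier audio is never overwritten:

int N = AudioRecord.getMinBufferSize(8000, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
AudioRecord recorder = new AudioRecord(AudioSource.MIC, 8000,
        AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, N * 10);
Vector<short[]> chunks = new Vector<short[]>();

recorder.startRecording();
while (!stopped) {
    short[] chunk = new short[N];   // new buffer every pass, nothing gets overwritten
    int read = recorder.read(chunk, 0, chunk.length);
    if (read > 0) {
        chunks.add(chunk);
    }
}
recorder.stop();
recorder.release();

// Play everything back once recording has stopped.
AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, 8000,
        AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT, N * 10, AudioTrack.MODE_STREAM);
track.play();
for (short[] chunk : chunks) {
    track.write(chunk, 0, chunk.length);
}
track.stop();
track.release();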
Is there a way to record mic input on Android while it is being processed for playback/preview in real time? I tried to use AudioRecord and AudioTrack to do this, but the problem is that my device cannot play the recorded audio file. Actually, no Android player application can play the recorded audio file.
On the other hand, using MediaRecorder to record generates a good audio file that can be played by any player application. But then I cannot preview/play back the mic input in real time while recording.
To record and play back audio in (almost) real time you can start a separate thread and use an AudioRecord and an AudioTrack.
Just be careful with feedback. If the speakers are turned up loud enough on your device, the feedback can get pretty nasty pretty fast.
/*
 * Thread to manage live recording/playback of voice input from the device's microphone.
 */
private class Audio extends Thread {
    private boolean stopped = false;

    /**
     * Give the thread high priority so that it's not canceled unexpectedly, and start it
     */
    private Audio() {
        android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);
        start();
    }

    @Override
    public void run() {
        Log.i("Audio", "Running Audio Thread");
        AudioRecord recorder = null;
        AudioTrack track = null;
        short[][] buffers = new short[256][160];
        int ix = 0;

        /*
         * Initialize buffer to hold continuously recorded audio data, start recording, and start
         * playback.
         */
        try {
            int N = AudioRecord.getMinBufferSize(8000, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
            recorder = new AudioRecord(AudioSource.MIC, 8000, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, N * 10);
            track = new AudioTrack(AudioManager.STREAM_MUSIC, 8000,
                    AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT, N * 10, AudioTrack.MODE_STREAM);
            recorder.startRecording();
            track.play();

            /*
             * Loops until something outside of this thread stops it.
             * Reads the data from the recorder and writes it to the audio track for playback.
             */
            while (!stopped) {
                Log.i("Map", "Writing new data to buffer");
                short[] buffer = buffers[ix++ % buffers.length];
                N = recorder.read(buffer, 0, buffer.length);
                track.write(buffer, 0, buffer.length);
            }
        } catch (Throwable x) {
            Log.w("Audio", "Error reading voice audio", x);
        }
        /*
         * Frees the thread's resources after the loop completes so that it can be run again
         */
        finally {
            recorder.stop();
            recorder.release();
            track.stop();
            track.release();
        }
    }

    /**
     * Called from outside of the thread in order to stop the recording/playback loop
     */
    private void close() {
        stopped = true;
    }
}
EDIT
The audio is not really being recorded to a file. The AudioRecord object captures the audio as 16-bit PCM data and places it in a buffer, and the AudioTrack object then reads the data from that buffer and plays it through the speakers. There is no file on the SD card that you will be able to access later.
You can't read and write a file on the SD card at the same time to get playback/preview in real time, so you have to use buffers.
The following permission in the manifest is required for this to work properly:
<uses-permission android:name="android.permission.RECORD_AUDIO" />
Also, the 2D buffer array is not necessary. The logic of the code remains valid with just one buffer, like this:
short[] buffer = new short[160];
while (!stopped) {
    //Log.i("Map", "Writing new data to buffer");
    int n = recorder.read(buffer, 0, buffer.length);
    track.write(buffer, 0, n);   // write only as many shorts as were actually read
}