I'm trying to record from the MIC directly to a short array.
The goal is not to write the audio track to a file, just to keep it in a short array.
I've tried several methods, and the best I've found is recording with AudioRecord and playing it with AudioTrack. I found a good class here:
Android: Need to record mic input
This class does everything I need; I just have to modify it to achieve my desired result, but... I don't quite get it, I'm missing something...
Here is my modification (not working at all):
private class Audio extends Thread {
    private boolean stopped = false;

    /**
     * Give the thread high priority so that it's not canceled unexpectedly, and start it
     */
    private Audio()
    {
        android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);
        start();
    }

    @Override
    public void run()
    {
        Log.i("Audio", "Running Audio Thread");
        AudioRecord recorder = null;
        AudioTrack track = null;
        //short[][] buffers = new short[256][160];
        int ix = 0;

        /*
         * Initialize buffer to hold continuously recorded audio data, start recording, and start
         * playback.
         */
        try
        {
            int N = AudioRecord.getMinBufferSize(8000, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
            recorder = new AudioRecord(AudioSource.MIC, 8000, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, N*10);
            short[] buff = new short[N];
            recorder.startRecording();

            /*
             * Loops until something outside of this thread stops it.
             * Reads the data from the recorder and writes it to the audio track for playback.
             */
            while (!stopped) {
                //Log.i("Map", "Writing new data to buffer");
                //short[] buffer = buffer[ix++ % buffer.length];
                N = recorder.read(buff, 0, buff.length);
            }
            recorder.stop();
            recorder.release();

            track = new AudioTrack(AudioManager.STREAM_MUSIC, 8000,
                    AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT, N*10, AudioTrack.MODE_STREAM);
            track.play();
            for (int i = 0; i < buff.length; i++) {
                track.write(buff, i, buff.length);
            }
        } catch (Exception x) {
            //Log.e("Audio", x.getMessage());
            x.printStackTrace();
        } finally {
            track.stop();
            track.release();
        }
    }

    /**
     * Called from outside of the thread in order to stop the recording/playback loop
     */
    private void close()
    {
        stopped = true;
    }
}
What I need is to record the sound into the short array buffer and, when the user pushes a button, play it... But right now, I'm just trying to get recording to stop and playback to start when the user pushes a button...
Can anyone help me?
Thanks.
You need to restructure the code to do what you want it to do. If I understand correctly, you want to read sound until 'stopped' is set to true, then play the data.
Just so you understand: that is potentially a lot of buffered data, depending on how long the recording is. You could write it to a file or store a series of buffers in some abstract data type.
Just to get something working, create a Vector of short[], allocate a new short[] buffer inside your 'while(!stopped)' loop, and then stuff it into the vector (see the sketch below).
After the while loop stops, you can iterate through the vector and write the buffers to the AudioTrack.
As you now understand, the blip you were hearing was just the last 20 ms or so of audio, since your buffer only kept that last little bit.
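A minimal sketch of that record-then-play structure, assuming the same 8 kHz mono 16-bit setup as your code (an ArrayList works just as well as a Vector here; for simplicity this writes back whole chunks rather than remembering the exact read counts):

ArrayList<short[]> chunks = new ArrayList<short[]>(); // java.util.ArrayList
int N = AudioRecord.getMinBufferSize(8000, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
AudioRecord recorder = new AudioRecord(AudioSource.MIC, 8000,
        AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, N*10);
recorder.startRecording();
while (!stopped) {
    short[] chunk = new short[N];               // fresh buffer every pass
    int read = recorder.read(chunk, 0, chunk.length);
    if (read > 0) {
        chunks.add(chunk);                      // keep every chunk, not just the last one
    }
}
recorder.stop();
recorder.release();

AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, 8000,
        AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT, N*10, AudioTrack.MODE_STREAM);
track.play();
for (short[] chunk : chunks) {                  // play everything back in order
    track.write(chunk, 0, chunk.length);
}
track.stop();
track.release();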
Related
My device is a Nexus 6 running Android M.
I implemented an AndroidSpeakerWriter as follows:
public class AndroidSpeakerWriter {
    private final static String TAG = "AndroidSpeakerWriter";
    private AudioTrack audioTrack;
    short[] buffer;

    public AndroidSpeakerWriter() {
        buffer = new short[1024];
    }

    public void init(int sampleRateInHZ) {
        int minBufferSize = AudioTrack.getMinBufferSize(sampleRateInHZ,
                AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);
        audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRateInHZ,
                AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT, minBufferSize,
                AudioTrack.MODE_STREAM); // 0-static 1-stream
    }

    public void fillBuffer(short[] samples) {
        if (buffer.length < samples.length) {
            buffer = new short[samples.length];
        }
        System.arraycopy(samples, 0, buffer, 0, samples.length);
    }

    public void writeSamples(short[] samples) {
        fillBuffer(samples);
        audioTrack.write(buffer, 0, samples.length);
    }

    public void stop() {
        audioTrack.stop();
    }

    public void play() {
        audioTrack.play();
    }
}
Then I just send samples when I click a button:
public void play(final short[] signal) {
    if (signal == null) {
        Log.d(TAG, "play: a null signal");
        return;
    }
    Thread t = new Thread(new Runnable() {
        @Override
        public void run() {
            android.os.Process
                    .setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);
            androidSpeakerWriter.play();
            androidSpeakerWriter.writeSamples(signal);
            androidSpeakerWriter.stop();
        }
    });
    t.start();
}
The problem is the device does not beep every time I click the button.
Sometimes it works, sometimes it doesn't.
There is no such problem when I run this on an old Galaxy Nexus running Android 4.3. Has anybody encountered a similar problem? Thanks in advance for any help.
One thing is that my beep is currently pretty short (256 samples), not even close to minBufferSize.
According to the (vague) documentation, the bufferSizeInBytes in the AudioTrack constructor for static mode should be the length of the audio you want to play.
So does the buffer still have a minimum size constraint even in static mode? And why can a Galaxy Nexus play 256-sample audio in static mode while a Nexus 6 cannot?
I use AudioManager to get the native buffer size / sampling rate:
Galaxy Nexus: 144/44100, Nexus 6: 192/48000.
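For reference, I read those values like this (both AudioManager properties have existed since API 17 and are returned as strings):

AudioManager am = (AudioManager) getSystemService(Context.AUDIO_SERVICE);
// native output sample rate in Hz, e.g. "48000" on the Nexus 6
String sampleRate = am.getProperty(AudioManager.PROPERTY_OUTPUT_SAMPLE_RATE);
// native buffer size in frames, e.g. "192" on the Nexus 6
String framesPerBuffer = am.getProperty(AudioManager.PROPERTY_OUTPUT_FRAMES_PER_BUFFER);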
I found these related resources:
AudioRecord and AudioTrack latency
Does AudioTrack buffer need to be full always in streaming mode?
https://github.com/igorski/MWEngine/wiki/Understanding-Android-audio-towards-achieving-low-latency-response
I believe it is caused by improper synchronization between threads. Your androidSpeakerWriter instance runs continuously on different threads, calling play(), writeSamples(), and stop() respectively. Each button click triggers the creation of a new thread that uses the same androidSpeakerWriter instance.
So while thread A is executing androidSpeakerWriter.play(), thread B might be executing androidSpeakerWriter.writeSamples(), which can overwrite the audio data currently being played.
Try:
synchronized (androidSpeakerWriter) {
    androidSpeakerWriter.play();
    androidSpeakerWriter.writeSamples(signal);
    androidSpeakerWriter.stop();
}
MODE_STREAM is used when you must play audio data that is too long to fit into memory. If you need to play a short sound such as a beep, you can use MODE_STATIC when creating the AudioTrack, then change your playback code to the following:
synchronized (androidSpeakerWriter) {
    androidSpeakerWriter.writeSamples(signal);
    androidSpeakerWriter.play();
}
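For creating the static-mode track itself, a minimal sketch (assuming signal is a short 16-bit mono clip at sampleRateInHZ; with MODE_STATIC the bufferSizeInBytes should be the byte length of the clip):

AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRateInHZ,
        AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
        signal.length * 2,             // bytes: 2 per 16-bit sample
        AudioTrack.MODE_STATIC);
track.write(signal, 0, signal.length); // load the whole clip first
track.play();                          // then start playback
// to replay the same clip later, rewind instead of rewriting:
// track.stop();
// track.setPlaybackHeadPosition(0);
// track.play();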
Basically, my program records user input via the microphone and stores it as a .pcm file in the sdcard/ directory, overwriting any existing one. The file is later sent for playback and analysis (mainly FFT and RMS computation).
I have added another function that lets the program record system audio, so users' mp3 files can be analyzed as well. It streams the system audio and stores it as a .pcm file for later playback and analysis.
It's all functioning well. However, there's a slight issue: when the program streams audio, it also captures input from the mic, and there is noise in the playback. I don't want this, as it affects the analysis readings. I googled for a solution and found that I can actually mute the mic. So now I want to mute the mic while the mp3 file is being streamed.
The code I found is:
AudioManager.setMicrophoneMute(true);
I tried to implement it, but it just crashes my application. I have been trying to find a solution for the past few days but cannot seem to find one.
Here is the code snippet for the part where I want to stream the system audio, muting the microphone before streaming starts.
//create a new AudioRecord object to record the audio data of an mp3 file
int bufferSize = AudioRecord.getMinBufferSize(frequency, channelConfiguration, audioEncoding);
audioRecord = new AudioRecord(AudioManager.STREAM_MUSIC,
        frequency, channelConfiguration,
        audioEncoding, bufferSize);

//a short array to store raw pcm data
short[] buffer = new short[bufferSize];

Log.i("decoder", "The audio record created fine ready to record");

try {
    audioManager.setMicrophoneMute(true);
} catch (Exception e) {
    e.printStackTrace();
}

audioRecord.startRecording();
isDecoding = true;
When the setMicrophoneMute(true) line is surrounded with try-catch, the program only crashes when I send the recording for playback. The errors are as follows:
"AudioFlinger could not create track, status: -12"
"Error initializing AudioTrack"
"[android.media.AudioTrack] Error code -20 when initializing AudioTrack."
When it is not surrounded with try-catch, the program crashes the moment I click the start streaming button:
"Decoding failed" (this is an error log from catching a throwable).
How can I mute the microphone input while streaming the system audio? Let me know if I can provide more code. Thank you!
EDIT
I have implemented the microphone mute successfully; it even returns true for isMicrophoneMute(). However, it's not actually muted, as it still records from the microphone; the flag is a false positive.
Based on the suggested answer, I have already created a class for audio focus as below:
public class AudioFocus
{
    private final Context c;

    private final AudioManager.OnAudioFocusChangeListener changeListener =
            new AudioManager.OnAudioFocusChangeListener()
            {
                public void onAudioFocusChange(int focusChange)
                {
                    //nothing to do
                }
            };

    AudioFocus(Context context)
    {
        c = context;
    }

    public void grabFocus()
    {
        final AudioManager am = (AudioManager) c.getSystemService(Context.AUDIO_SERVICE);
        final int result = am.requestAudioFocus(changeListener,
                AudioManager.STREAM_MUSIC,
                AudioManager.AUDIOFOCUS_GAIN);
        Log.d("audiofocus", "Grab audio focus: " + result);
    }

    public void releaseFocus()
    {
        final AudioManager am = (AudioManager) c.getSystemService(Context.AUDIO_SERVICE);
        final int result = am.abandonAudioFocus(changeListener);
        Log.d("audiofocus", "Abandon audio focus: " + result);
    }
}
I then call the method from my Decoder class to request audio focus:
int bufferSize = AudioRecord.getMinBufferSize(frequency, channelConfiguration, audioEncoding);

audioFocus.grabFocus();

audioRecord = new AudioRecord(AudioManager.STREAM_MUSIC,
        frequency, channelConfiguration,
        audioEncoding, bufferSize);

//a short array to store raw pcm data
short[] buffer = new short[bufferSize];

Log.i("decoder", "The audio record created fine ready to record");

audioRecord.startRecording();
isDecoding = true;
Log.i("decoder", "Start recording fine");
And then release the focus when stop decoding is pressed:
//stops recording
public void stopDecoding() {
    isDecoding = false;
    Log.i("decoder", "Out of recording");
    audioRecord.stop();
    try {
        dos.close();
    } catch (IOException e) {
        e.printStackTrace();
    }
    mp.stop();
    mp.release();
    audioFocus.releaseFocus();
}
However, this makes my application crash. Where did I go wrong?
The following snippet requests permanent audio focus on the music audio stream. You should request the audio focus immediately before you begin playback, such as when the user presses play. I think this would be the way to go rather than muting the input microphone. Check out the developer audio focus docs for more information.
AudioManager am = (AudioManager) mContext.getSystemService(Context.AUDIO_SERVICE);
...

// Request audio focus for playback
int result = am.requestAudioFocus(afChangeListener,
        // Use the music stream.
        AudioManager.STREAM_MUSIC,
        // Request permanent focus.
        AudioManager.AUDIOFOCUS_GAIN);

if (result == AudioManager.AUDIOFOCUS_REQUEST_GRANTED) {
    am.registerMediaButtonEventReceiver(RemoteControlReceiver);
    // Start playback.
}
I am trying to build an open-source video system on Android, since we have no access to the data in a closed system. In this system, we can modify the raw data captured by the camera.
I used MediaCodec and MediaMuxer to do the video encoding and muxing job, and that works. But I have no idea about the audio part. I used onPreviewFrame to get each frame and do the encoding/muxing work frame by frame. But how do I record the audio at the same time (I mean capturing the audio per frame, encoding it, and sending the data to the MediaMuxer)?
I've done some research. It seems that AudioRecord is used to get the raw audio data. But AudioRecord does a constant recording job; I don't think that can work here.
Can anyone give me a hint? Thank you!
Create the AudioRecord instance like this:
private AudioRecord getRecorderInstance() {
    AudioRecord ar = null;
    try {
        //Get an AudioRecord
        int N = AudioRecord.getMinBufferSize(8000, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
        ar = new AudioRecord(AudioSource.MIC, 8000, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, N*10);
    } catch (Exception e) {
    }
    return ar; //Returns null if mic is unavailable
}
Prepare and send the data for encoding and muxing later, like this, in a separate thread:
public class MicrophoneInput implements Runnable {
    @Override
    public void run() {
        // Buffer for 50 milliseconds of data, i.e. 400 samples (800 bytes) at 8 kHz.
        byte[] buffer50ms = new byte[8000 / 10];
        try {
            while (recording) {
                audioRecorder.read(buffer50ms, 0, buffer50ms.length);
                //process buffer, i.e. send to encoder
                //don't forget to set correct timestamps synchronized with video
            }
        } catch (Throwable x) {
            //
        }
    }
}
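One way to set those timestamps is to derive the presentation time from a running sample count (a sketch; totalSamples is an illustrative field, and the 8000 must match the AudioRecord's sample rate):

long totalSamples = 0; // running count of mono samples read so far

// inside the while (recording) loop:
int read = audioRecorder.read(buffer50ms, 0, buffer50ms.length);
long presentationTimeUs = totalSamples * 1000000L / 8000; // time of this chunk's first sample
totalSamples += read / 2;                                 // 2 bytes per 16-bit mono sample
// pass presentationTimeUs for this chunk to MediaCodec.queueInputBuffer(...)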
//constructor
android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);

/////////////
//thread run() method
int N = AudioRecord.getMinBufferSize(8000, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
AudioRecord recorder = new AudioRecord(AudioSource.MIC, 8000, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, N*10);
recorder.startRecording();
try
{
    while (!stopped)
    {
        //if not paused upload audio
        if (uploadAudio == true)
        {
            short[][] buffers = new short[256][160];
            int ix = 0;
            //allocate buffer for audio data
            short[] buffer = buffers[ix++ % buffers.length];
            //read audio data from the recorder
            N = recorder.read(buffer, 0, buffer.length);
            //create bytes big enough to hold audio data
            byte[] bytes2 = new byte[buffer.length * 2];
            //convert audio data from short[] to byte[]
            ByteBuffer.wrap(bytes2).order(ByteOrder.LITTLE_ENDIAN).asShortBuffer().put(buffer);
            //encode audio data for ulaw
            read(bytes2, 0, bytes2.length);

See here for the ulaw encoder code. I'm using the read, maxAbsPcm and encode methods.

            //send audio data
            //os.write(bytes2,0,bytes2.length);
        }
    }
    os.close();
}
catch (Throwable x)
{
    Log.w("AudioWorker", "Error reading voice AudioWorker", x);
}
finally
{
    recorder.stop();
    recorder.release();
}
///////////
So this works OK. The audio is sent in the proper format to the server and played at the opposite end. However, the audio often skips. Example: saying 1, 2, 3, 4 will play back with the 4 cut off.
I believe it to be a performance issue: I have timed some of these methods, and when they take essentially no time everything works, but they quite often take a couple of seconds, with the byte conversion and encoding taking the longest.
Any idea how I can optimize this code to get better performance? Or maybe a way to deal with the lag (possibly building a cache)?
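One idea I'm considering is hoisting the per-iteration allocations out of the loop, since allocating new arrays (and the resulting GC) on every pass is expensive; something like this untested sketch, reusing the names from the code above:

//allocate once, before the while(!stopped) loop
short[] buffer = new short[160];
byte[] bytes2 = new byte[buffer.length * 2];
ShortBuffer shortView = ByteBuffer.wrap(bytes2)                 // java.nio.ShortBuffer
        .order(ByteOrder.LITTLE_ENDIAN).asShortBuffer();

while (!stopped) {
    if (uploadAudio) {
        int n = recorder.read(buffer, 0, buffer.length);
        shortView.clear();
        shortView.put(buffer, 0, n); //reuse the same backing byte[] each pass
        //encode and send only the samples actually read
        read(bytes2, 0, n * 2);
    }
}

Would that be the right direction?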
Is there a way to record mic input on Android while it is being processed for playback/preview in real time? I tried to use AudioRecord and AudioTrack to do this, but the problem is that my device cannot play the recorded audio file. Actually, no Android player application can play the recorded audio file.
On the other hand, using MediaRecorder to record generates a good audio file that can be played by any player application. But then I cannot do a preview/playback while recording the mic input in real time.
To record and play back audio in (almost) real time you can start a separate thread and use an AudioRecord and an AudioTrack.
Just be careful with feedback. If the speakers are turned up loud enough on your device, the feedback can get pretty nasty pretty fast.
/*
 * Thread to manage live recording/playback of voice input from the device's microphone.
 */
private class Audio extends Thread
{
    private boolean stopped = false;

    /**
     * Give the thread high priority so that it's not canceled unexpectedly, and start it
     */
    private Audio()
    {
        android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);
        start();
    }

    @Override
    public void run()
    {
        Log.i("Audio", "Running Audio Thread");
        AudioRecord recorder = null;
        AudioTrack track = null;
        short[][] buffers = new short[256][160];
        int ix = 0;

        /*
         * Initialize buffer to hold continuously recorded audio data, start recording, and start
         * playback.
         */
        try
        {
            int N = AudioRecord.getMinBufferSize(8000, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
            recorder = new AudioRecord(AudioSource.MIC, 8000, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, N*10);
            track = new AudioTrack(AudioManager.STREAM_MUSIC, 8000,
                    AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT, N*10, AudioTrack.MODE_STREAM);
            recorder.startRecording();
            track.play();

            /*
             * Loops until something outside of this thread stops it.
             * Reads the data from the recorder and writes it to the audio track for playback.
             */
            while (!stopped)
            {
                Log.i("Map", "Writing new data to buffer");
                short[] buffer = buffers[ix++ % buffers.length];
                N = recorder.read(buffer, 0, buffer.length);
                track.write(buffer, 0, buffer.length);
            }
        }
        catch (Throwable x)
        {
            Log.w("Audio", "Error reading voice audio", x);
        }
        /*
         * Frees the thread's resources after the loop completes so that it can be run again.
         * (The null checks guard against a failed AudioRecord/AudioTrack constructor above.)
         */
        finally
        {
            if (recorder != null)
            {
                recorder.stop();
                recorder.release();
            }
            if (track != null)
            {
                track.stop();
                track.release();
            }
        }
    }

    /**
     * Called from outside of the thread in order to stop the recording/playback loop
     */
    private void close()
    {
        stopped = true;
    }
}
EDIT
The audio is not really being recorded to a file. The AudioRecord object encodes the audio as 16-bit PCM data and places it in a buffer. The AudioTrack object then reads the data from that buffer and plays it through the speakers. There is no file on the SD card that you will be able to access later.
You can't read and write a file on the SD card at the same time to get playback/preview in real time, so you have to use buffers.
The following permission is required in the manifest for this to work properly:
<uses-permission android:name="android.permission.RECORD_AUDIO" ></uses-permission>
Also, the 2D buffer array is not necessary. The logic of the code is valid even with just one buffer, like this:
short[] buffer = new short[160];
while (!stopped) {
    //Log.i("Map", "Writing new data to buffer");
    int n = recorder.read(buffer, 0, buffer.length);
    track.write(buffer, 0, n);
}