Android record and playback

I am currently trying to build an amplifier app for Android. The goal is to record audio and play it back simultaneously. I created a thread to take care of this, but the sound comes out choppy. Here is what I tried:
private class RecordAndPlay extends Thread {
    int bufferSize;
    int samplingRate = 22050;
    AudioRecord aRecord;
    short[] buffer;

    public RecordAndPlay() {
        bufferSize = AudioRecord.getMinBufferSize(samplingRate, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
        buffer = new short[bufferSize];
    }

    @Override
    public void run() {
        aRecord = new AudioRecord(MediaRecorder.AudioSource.MIC, samplingRate, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);
        try {
            aRecord.startRecording();
        } catch (Exception e) {
        }

        int bufferedResult = aRecord.read(buffer, 0, bufferSize);

        final AudioTrack aTrack = new AudioTrack(AudioManager.STREAM_MUSIC, samplingRate, AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferedResult, AudioTrack.MODE_STREAM);
        aTrack.setNotificationMarkerPosition(bufferedResult);
        aTrack.setPlaybackPositionUpdateListener(new OnPlaybackPositionUpdateListener() {
            @Override
            public void onPeriodicNotification(AudioTrack track) {
            }

            @Override
            public void onMarkerReached(AudioTrack track) {
                Log.d("Marker reached", "...");
                aTrack.release();
                aRecord.release();
                run();
            }
        });

        aTrack.play();
        aTrack.write(buffer, 0, buffer.length);
    }

    public void cancel() {
        aRecord.stop();
        aRecord.release();
    }
}

Your playback is choppy because the AudioTrack is being starved: it is not receiving data smoothly. In your code you recursively call run() and create a new AudioTrack per marker. Instead, instantiate the AudioRecord and AudioTrack only once and just handle their events. Also, to smooth out playback, start recording slightly before playback and maintain a queue of recorded buffers; you can then manage passing those buffers to the AudioTrack so there is always a new buffer to submit on each marker event.
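A minimal sketch of that structure, assuming a 22050 Hz mono stream; the method name and the `running` flag are illustrative, not from the question:

// Sketch only: create one AudioRecord and one AudioTrack, pre-fill once, then loop.
private void loopback() {
    final int samplingRate = 22050;
    int recBuf = AudioRecord.getMinBufferSize(samplingRate,
            AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
    int playBuf = AudioTrack.getMinBufferSize(samplingRate,
            AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);

    AudioRecord record = new AudioRecord(MediaRecorder.AudioSource.MIC,
            samplingRate, AudioFormat.CHANNEL_IN_MONO,
            AudioFormat.ENCODING_PCM_16BIT, recBuf * 2);
    AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC,
            samplingRate, AudioFormat.CHANNEL_OUT_MONO,
            AudioFormat.ENCODING_PCM_16BIT, playBuf * 2, AudioTrack.MODE_STREAM);

    short[] buffer = new short[recBuf];
    record.startRecording();

    // Pre-fill the track once so playback starts slightly behind recording.
    int n = record.read(buffer, 0, buffer.length);
    track.write(buffer, 0, n);
    track.play();

    while (running) { // 'running' is an assumed volatile stop flag
        n = record.read(buffer, 0, buffer.length);
        if (n > 0) {
            track.write(buffer, 0, n);
        }
    }

    record.stop();
    record.release();
    track.stop();
    track.release();
}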

Related

AudioTrack AudioRecord android issue, silence

Does anyone know whether AudioRecord and AudioTrack need to be separated in a special way (type of thread, Runnable, Service, etc.) for them to work together? Right now I can start an AudioTrack and then an AudioRecord without issues.
But if I start and stop the AudioTrack while recording, the AudioRecord starts to output zeros, as if it were muted (but it is neither muted nor stopped).
If I start the AudioRecord and then the AudioTrack, the AudioRecord is also "muted".
Also weird: when I unplug and plug in my wired headset, it starts recording/outputting something other than zeros again, which makes me think my phones (a Lenovo B and a Lenovo C2) are too cheap (circuit/hardware issues) or have build issues, but I do not know.
Has anyone heard of this issue of a suddenly "muted" AudioRecord, or of an AudioRecord that responds to unplugging/plugging a wired headset without any related settings or methods applied?
Code update
class myRecordAndPlayer {
    public void initiateRecorder() {
        if (audio.getMode() != AudioManager.MODE_IN_CALL) {
            audio.setMode(AudioManager.MODE_IN_CALL);
            // audio.setSpeakerphoneOn(false); for MODE_IN_COMMUNICATION (that mode produces more echo/crosstalk)
        }
        rec = true;
        Thread thread = new Thread(new Runnable() {
            @Override
            public void run() {
                android.os.Process.setThreadPriority(Process.THREAD_PRIORITY_URGENT_AUDIO);
                AudioRecord audioRecorder = new AudioRecord(MediaRecorder.AudioSource.MIC, SAMPLE_RATE,
                        AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT,
                        AudioRecord.getMinBufferSize(SAMPLE_RATE, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT) * 10);
                try {
                    audioRecorder.startRecording();
                    while (rec) {
                        bytes_read = audioRecorder.read(buf_audio, 0, buf_audio_len);
                        // ... rest of the recording loop elided in the original post ...

    public void initiatePlayer() {
        if (!play) {
            play = true;
            android.os.Process.setThreadPriority(Process.THREAD_PRIORITY_URGENT_AUDIO);
            Thread receiveThread = new Thread(new Runnable() {
                @Override
                public void run() {
                    AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, SAMPLE_RATE, AudioFormat.CHANNEL_OUT_MONO,
                            AudioFormat.ENCODING_PCM_16BIT, BUF_SIZE, AudioTrack.MODE_STREAM);
                    track.play();
                    try {
                        while (play) {
                            track.write(bufToPlay, 0, bufToPlay.length);
                            // ... rest of the playback loop elided in the original post ...
Not tested.
private Thread audioRecordThread;
private AudioRecord audioRecorder;
private AudioTrack audioTrack;

public void initialize() {
    audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC, SAMPLE_RATE, AudioFormat.CHANNEL_OUT_MONO,
            AudioFormat.ENCODING_PCM_16BIT, BUF_SIZE, AudioTrack.MODE_STREAM);
    audioRecorder = new AudioRecord(MediaRecorder.AudioSource.MIC, SAMPLE_RATE,
            AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT,
            AudioRecord.getMinBufferSize(SAMPLE_RATE, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT) * 10);
    audioRecordThread = new Thread(new Runnable() {
        public void run() {
            writeAudioDataToAudioTrack();
        }
    });
}

public void start() {
    audioTrack.play();
    audioRecorder.startRecording();
    audioRecordThread.start();
}

private void writeAudioDataToAudioTrack() {
    short[] bufToPlay = new short[BUF_SIZE];
    while (AudioRecord.RECORDSTATE_RECORDING == audioRecorder.getRecordingState()) {
        // Read from the recorder and pass the same buffer straight to the track.
        int read = audioRecorder.read(bufToPlay, 0, bufToPlay.length);
        if (read > 0) {
            audioTrack.write(bufToPlay, 0, read);
        }
    }
}
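Hypothetical usage of the sketch above, assuming it lives in the same class (e.g. behind Start/Stop buttons):

// Start button: set everything up and begin the loopback.
initialize();
start();

// Stop button: stopping the recorder ends the while loop above.
audioRecorder.stop();
audioRecorder.release();
audioTrack.stop();
audioTrack.release();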

Update UI Immediately for Real-Time Processing

I am developing real-time audio processing software that should update the UI within at most 100 ms from the worker thread.
However, this is proving harder to achieve than it looks.
I am calling runOnUiThread(uiUpdaterRunnable) from the worker thread, but the delay before uiUpdaterRunnable executes is variable and generally more than 300 ms.
I tried using AsyncTask with publishProgress, but it gave me similar results.
How can I update the UI from the worker thread to get at least 10 FPS?
Here is the Runnable of my worker thread:
new Runnable() {
    public void run() {
        try {
            int sampleRate = 44100;
            int audioFormat = AudioFormat.ENCODING_PCM_16BIT;
            int channelConfig = AudioFormat.CHANNEL_IN_MONO;
            int bufferSize = AudioRecord.getMinBufferSize(sampleRate, channelConfig, audioFormat);
            audioRecorder = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleRate, channelConfig, audioFormat, bufferSize);
            if (bufferSize == AudioRecord.ERROR_BAD_VALUE)
                Log.i("AudioRecord", "AudioRecord Bad Buffer Size");
            if (audioRecorder.getState() == AudioRecord.STATE_INITIALIZED)
                Log.i("AudioRecord", "AudioRecord Initialized Successfully");
            audioRecorder.startRecording();
            while (recording) {
                startTime = System.currentTimeMillis();
                short[] buffer = new short[bufferSize];
                audioRecorder.read(buffer, 0, bufferSize);
                double[] bufferDouble = shortToDoubleArray(buffer);
                final DataPoint[] resultArray = getFFTResult(bufferDouble);
                // This is where I update the UI
                runOnUiThread(new Runnable() {
                    public void run() {
                        updateGraph(resultArray);
                    }
                });
            }
            releaseAudioResources();
        } catch (Exception e) {
            if (e.getMessage() == null)
                e.printStackTrace();
            else
                Log.e("Exception", e.getMessage());
            releaseAudioResources();
        }
    }
}
// Declare this outside your code block; it must be created on the UI thread.
Handler mHandler = new Handler() {
    @Override
    public void handleMessage(Message msg) {
        super.handleMessage(msg);
        updateGraph((DataPoint[]) msg.obj);
    }
};

// To call it inside the runnable, attach the result to the message:
mHandler.sendMessage(mHandler.obtainMessage(0, resultArray));
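A further point worth noting (an observation, not from the original answer): AudioRecord.read in its default blocking mode does not return until the requested number of samples has been captured, so the update rate is bounded by the read size regardless of how fast the UI thread is. At 44100 Hz, a 4410-sample read takes 100 ms of wall-clock time by itself; reading a smaller slice per iteration, e.g. audioRecorder.read(buffer, 0, bufferSize / 4), is one way to get updates more often than the full-buffer period allows.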

Cracking noise when streaming audio in Android

I am trying to create an app that streams audio from a local file on one device to a different device using the Nearby Connections API.
The problem is that I manage to stream the audio, but the only thing I can hear on the remote device is a nonsensical crackling noise.
What I've read so far suggests I need to adjust the minBufferSize and sampleRate values I'm using, but I've been trying that and haven't achieved much.
This is my code to send the byte chunks:
AudioTrack speaker;

// Audio configuration.
private int sampleRate = 16000; // How much will be ideal?
private int channelConfig = AudioFormat.CHANNEL_OUT_MONO;
private int audioFormat = AudioFormat.ENCODING_PCM_16BIT;

protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_main);
    minBufSize = 2048;
    speaker = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate, channelConfig, audioFormat, 10 * minBufSize, AudioTrack.MODE_STREAM);
}

final InputStream file;
final byte[] arrayStream = new byte[minBufSize];
try {
    file = new FileInputStream(filepath);
    bytesRead = file.read(arrayStream);
    while (bytesRead != -1) {
        new Thread(new Runnable() {
            public void run() {
                sendMessage(arrayStream);
            }
        }).start();
        bytesRead = file.read(arrayStream);
    }
    Toast.makeText(this, "Mensaje totalmente completado", Toast.LENGTH_SHORT).show();
} catch (FileNotFoundException e) {
    e.printStackTrace();
} catch (IOException e) {
    e.printStackTrace();
}

private void sendMessage(byte[] payload) {
    Nearby.Connections.sendReliableMessage(mGoogleApiClient, mOtherEndpointId, payload);
    //mMessageText.setText(null);
}
And this is the code to receive and play back the message on the remote device:
@Override
public void onMessageReceived(String endpointId, byte[] payload, boolean isReliable) {
    // A message has been received from a remote endpoint.
    Toast.makeText(this, "Mensaje recibido", Toast.LENGTH_SHORT).show();
    debugLog("onMessageReceived:" + endpointId + ":" + new String(payload));
    playMp3(payload);
}

private void playMp3(final byte[] mp3SoundByteArray) {
    if (isPlaying == false) {
        speaker.play();
        isPlaying = true;
    } else {
        // Sending data to the AudioTrack obj, i.e. the speaker.
        speaker.write(mp3SoundByteArray, 0, minBufSize);
        Log.d("VR", "Writing buffer content to speaker");
    }
}
Can anyone help me with this?
Thanks!
You need to buffer some audio on the client before trying to play it. Most likely you get some data and play it, and the next chunk of data has not arrived in time.
See this post and this post about buffering data and streaming audio.
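For illustration, a minimal sketch of such client-side buffering, assuming the speaker field from the question and an arbitrary pre-buffer threshold of 10 chunks (both assumptions, not from the linked posts):

import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

private final BlockingQueue<byte[]> pending = new LinkedBlockingQueue<byte[]>();
private static final int PREBUFFER_CHUNKS = 10; // assumed cushion; tune for your network
private volatile boolean started = false;

@Override
public void onMessageReceived(String endpointId, byte[] payload, boolean isReliable) {
    pending.add(payload);
    // Only start draining once a cushion of chunks has accumulated,
    // so one late packet does not immediately starve the AudioTrack.
    if (!started && pending.size() >= PREBUFFER_CHUNKS) {
        started = true;
        new Thread(new Runnable() {
            public void run() {
                speaker.play();
                try {
                    while (true) {
                        byte[] chunk = pending.take(); // blocks until the next chunk arrives
                        speaker.write(chunk, 0, chunk.length);
                    }
                } catch (InterruptedException ignored) {
                }
            }
        }).start();
    }
}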

Audio Recording and Streaming in Android

I am developing an Android app and want to accomplish the feature below.
I will use my phone's built-in mic to record, and at the same time I want the recorded audio to be played through either the phone's speakers or headphones.
Is it feasible? If yes, please help me with this.
Here is a simple recording and playback application.
It uses Android AudioRecord and AudioTrack.
Design:
The recorded audio is written to a buffer and played back from the same buffer. This mechanism runs in a loop (on an Android thread) controlled by buttons.
Code
private String TAG = "AUDIO_RECORD_PLAYBACK";
private boolean isRunning = true;
private Thread m_thread; /* Thread for running the loop */
private AudioRecord recorder = null;
private AudioTrack track = null;
int bufferSize = 320; /* Buffer for recording data */
byte buffer[] = new byte[bufferSize];

/* Method to enable/disable buttons */
private void enableButton(int id, boolean isEnable) {
    ((Button) findViewById(id)).setEnabled(isEnable);
}
The GUI has two buttons, START and STOP.
Enable the buttons:
enableButton(R.id.StartButton, true);
enableButton(R.id.StopButton, false);

/* Assign button click handlers */
((Button) findViewById(R.id.StartButton)).setOnClickListener(btnClick);
((Button) findViewById(R.id.StopButton)).setOnClickListener(btnClick);

Map the START and STOP buttons to the OnClickListener:
private View.OnClickListener btnClick = new View.OnClickListener() {
    @Override
    public void onClick(View v) {
        switch (v.getId()) {
            case R.id.StartButton: {
                Log.d(TAG, "======== Start Button Pressed ==========");
                isRunning = true;
                do_loopback(isRunning);
                enableButton(R.id.StartButton, false);
                enableButton(R.id.StopButton, true);
                break;
            }
            case R.id.StopButton: {
                Log.d(TAG, "======== Stop Button Pressed ==========");
                isRunning = false;
                do_loopback(isRunning);
                enableButton(R.id.StopButton, false);
                enableButton(R.id.StartButton, true);
                break;
            }
        }
    }
};
Start the Thread:
private void do_loopback(final boolean flag) {
    m_thread = new Thread(new Runnable() {
        public void run() {
            run_loop(flag);
        }
    });
    m_thread.start();
}
Method for Initializing AudioRecord and AudioTrack:
public AudioTrack findAudioTrack(AudioTrack track) {
    Log.d(TAG, "===== Initializing AudioTrack API ====");
    int m_bufferSize = AudioTrack.getMinBufferSize(8000,
            AudioFormat.CHANNEL_OUT_MONO,
            AudioFormat.ENCODING_PCM_16BIT);
    if (m_bufferSize != AudioTrack.ERROR_BAD_VALUE) {
        track = new AudioTrack(AudioManager.STREAM_MUSIC, 8000,
                AudioFormat.CHANNEL_OUT_MONO,
                AudioFormat.ENCODING_PCM_16BIT, m_bufferSize,
                AudioTrack.MODE_STREAM);
        if (track.getState() == AudioTrack.STATE_UNINITIALIZED) {
            Log.e(TAG, "===== AudioTrack Uninitialized =====");
            return null;
        }
    }
    return track;
}
public AudioRecord findAudioRecord(AudioRecord recorder) {
    Log.d(TAG, "===== Initializing AudioRecord API =====");
    int m_bufferSize = AudioRecord.getMinBufferSize(8000,
            AudioFormat.CHANNEL_IN_MONO,
            AudioFormat.ENCODING_PCM_16BIT);
    if (m_bufferSize != AudioRecord.ERROR_BAD_VALUE) {
        recorder = new AudioRecord(MediaRecorder.AudioSource.MIC, 8000,
                AudioFormat.CHANNEL_IN_MONO,
                AudioFormat.ENCODING_PCM_16BIT, m_bufferSize);
        if (recorder.getState() == AudioRecord.STATE_UNINITIALIZED) {
            Log.e(TAG, "====== AudioRecord Uninitialized ====== ");
            return null;
        }
    }
    return recorder;
}
The values passed to findAudioRecord or findAudioTrack can change based on the device.
Please refer to this question.
Code for Running the loop:
public void run_loop(boolean isRunning) {
    /** == If Stop Button is pressed == **/
    if (isRunning == false) {
        Log.d(TAG, "===== Stop Button is pressed ===== ");
        if (AudioRecord.STATE_INITIALIZED == recorder.getState()) {
            recorder.stop();
            recorder.release();
        }
        if (AudioTrack.STATE_INITIALIZED == track.getState()) {
            track.stop();
            track.release();
        }
        return;
    }

    /** ======= Initialize AudioRecord and AudioTrack ======== **/
    recorder = findAudioRecord(recorder);
    if (recorder == null) {
        Log.e(TAG, "======== findAudioRecord : Returned Error! =========== ");
        return;
    }
    track = findAudioTrack(track);
    if (track == null) {
        Log.e(TAG, "======== findAudioTrack : Returned Error! ========== ");
        return;
    }
    if ((AudioRecord.STATE_INITIALIZED == recorder.getState()) &&
        (AudioTrack.STATE_INITIALIZED == track.getState())) {
        recorder.startRecording();
        Log.d(TAG, "========= Recorder Started... =========");
        track.play();
        Log.d(TAG, "========= Track Started... =========");
    } else {
        Log.d(TAG, "==== Initialization failed for AudioRecord or AudioTrack =====");
        return;
    }

    /** ------------------------------------------------------ **/
    /* Recording and playing in chunks of 320 bytes */
    bufferSize = 320;
    while (isRunning == true) {
        /* Read from & write to the device */
        recorder.read(buffer, 0, bufferSize);
        track.write(buffer, 0, bufferSize);
    }
    Log.i(TAG, "Loopback exit");
    return;
}
Please include the following in AndroidManifest.xml
<uses-permission android:name="android.permission.RECORD_AUDIO" > </uses-permission>
The above procedure also works when writing to and reading from a file, using the same APIs.
Why use AudioRecord over MediaRecorder? See here.
The code is tested (on a Google Nexus 5) and works perfectly.
Note: please add some error-checking code for recorder.read and track.write in case they fail; the same applies to findAudioRecord and findAudioTrack. One possible shape for that checking is sketched below.
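A sketch of that error checking inside the loop (the constants are real AudioRecord error codes; the surrounding loop is the one from run_loop above):

int read = recorder.read(buffer, 0, bufferSize);
if (read == AudioRecord.ERROR_INVALID_OPERATION || read == AudioRecord.ERROR_BAD_VALUE) {
    Log.e(TAG, "recorder.read failed: " + read);
    break; // abandon the loop rather than writing garbage
}
int written = track.write(buffer, 0, read);
if (written < 0) { // AudioTrack.write returns a negative error code on failure
    Log.e(TAG, "track.write failed: " + written);
    break;
}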
First, in the onCreate method, create a MediaRecorder object and the path to the file where you want to save the recorded data.
String outputFile = Environment.getExternalStorageDirectory().
        getAbsolutePath() + "/myrecording.3gp"; // Define outputFile outside the onCreate method
MediaRecorder myAudioRecorder = new MediaRecorder(); // Define this outside the onCreate method
myAudioRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
myAudioRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
myAudioRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB); // AudioEncoder, not OutputFormat
myAudioRecorder.setOutputFile(outputFile);
You can wire these three functions to any buttons, in order to start recording, stop recording, and play it back:
public void start(View view) {
    try {
        myAudioRecorder.prepare();
        myAudioRecorder.start();
    } catch (IllegalStateException e) {
        e.printStackTrace();
    } catch (IOException e) {
        e.printStackTrace();
    }
    start.setEnabled(false);
    stop.setEnabled(true);
    Toast.makeText(getApplicationContext(), "Recording started", Toast.LENGTH_LONG).show();
}

public void stop(View view) {
    myAudioRecorder.stop();
    myAudioRecorder.release();
    myAudioRecorder = null;
    stop.setEnabled(false);
    play.setEnabled(true);
    Toast.makeText(getApplicationContext(), "Audio recorded successfully",
            Toast.LENGTH_LONG).show();
}

public void play(View view) throws IllegalArgumentException,
        SecurityException, IllegalStateException, IOException {
    MediaPlayer m = new MediaPlayer();
    m.setDataSource(outputFile);
    m.prepare();
    m.start();
    Toast.makeText(getApplicationContext(), "Playing audio", Toast.LENGTH_LONG).show();
}
As I read in the developer documentation here, Android supports the RTSP protocol (for real-time streaming) as well as the HTTP/HTTPS live streaming draft protocol.
There is also an example here. You should have some basic knowledge of streaming servers, such as Red5 or Wowza.

Stream G711 ulaw on android with AudioTrack

I am trying to stream live audio from an Axis network security camera over a multipart HTTP stream, encoded as G.711 ulaw, 8 kHz, 8-bit samples, on an Android phone. It seems like this should be pretty straightforward, and this is the basis of my code. I reused some streaming code I had that grabbed JPEG frames from an MJPEG stream; it now grabs 512-byte blocks of audio data and hands them down to the AudioTrack. The audio sounds all garbled and distorted though; am I missing something obvious?
@Override
public void onResume() {
    super.onResume();
    int bufferSize = AudioTrack.getMinBufferSize(8000, AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_8BIT);
    mAudioTrack = new AudioTrack(AudioManager.STREAM_MUSIC, 8000, AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_8BIT, bufferSize, AudioTrack.MODE_STREAM);
    mAudioTrack.play();
    thread.start();
}

class StreamThread extends Thread {
    public boolean running = true;

    public void run() {
        try {
            MjpegStreamer streamer = MjpegStreamer.read("/axis-cgi/audio/receive.cgi?httptype=multipart");
            while (running) {
                byte[] buf = streamer.readMjpegFrame();
                if (buf != null && mAudioTrack != null) {
                    mAudioTrack.write(buf, 0, buf.length);
                }
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
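One likely cause (an observation, not part of the original post): G.711 μ-law bytes are companded, not linear PCM, so writing them to a track configured for ENCODING_PCM_8BIT (which expects unsigned linear samples) will sound distorted. A sketch of decoding each μ-law byte to 16-bit linear PCM before the write, with the track created as ENCODING_PCM_16BIT instead:

// Standard G.711 mu-law to 16-bit linear PCM expansion.
static short ulawToLinear(byte ulawByte) {
    int u = ~ulawByte & 0xFF;          // mu-law bytes are stored inverted
    int sign = u & 0x80;
    int exponent = (u >> 4) & 0x07;
    int mantissa = u & 0x0F;
    int sample = (((mantissa << 3) + 0x84) << exponent) - 0x84;
    return (short) (sign != 0 ? -sample : sample);
}

// In the stream loop: decode the block, then write shorts instead of bytes
// (assumes mAudioTrack was created with ENCODING_PCM_16BIT).
short[] pcm = new short[buf.length];
for (int i = 0; i < buf.length; i++) {
    pcm[i] = ulawToLinear(buf[i]);
}
mAudioTrack.write(pcm, 0, pcm.length);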
