Crackling noise when streaming audio in Android

I am trying to create an app that streams audio from a local file on one device to another device using the Nearby Connections API.
The problem is that I manage to stream the audio, but all I can hear on the remote device is a sort of nonsensical crackling noise.
What I've read so far is that I need to adjust the minBufferSize and sampleRate values I'm using, but I've been experimenting with them and haven't achieved much.
This is my code to send the byte chunks:
AudioTrack speaker;

//Audio configuration.
private int sampleRate = 16000; //How much will be ideal?
private int channelConfig = AudioFormat.CHANNEL_OUT_MONO;
private int audioFormat = AudioFormat.ENCODING_PCM_16BIT;

protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_main);

    minBufSize = 2048;
    speaker = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate, channelConfig, audioFormat, 10 * minBufSize, AudioTrack.MODE_STREAM);
}
final InputStream file;
final byte[] arrayStream = new byte[minBufSize];

try {
    file = new FileInputStream(filepath);
    bytesRead = file.read(arrayStream);
    while (bytesRead != -1) {
        new Thread(new Runnable() {
            public void run() {
                sendMessage(arrayStream);
            }
        }).start();
        bytesRead = file.read(arrayStream);
    }
    Toast.makeText(this, "Mensaje totalmente completado", Toast.LENGTH_SHORT).show();
} catch (FileNotFoundException e) {
    e.printStackTrace();
} catch (IOException e) {
    e.printStackTrace();
}
private void sendMessage(byte[] payload) {
    Nearby.Connections.sendReliableMessage(mGoogleApiClient, mOtherEndpointId, payload);
    //mMessageText.setText(null);
}
And this is the code to receive and playback the message on the remote device:
@Override
public void onMessageReceived(String endpointId, byte[] payload, boolean isReliable) {
    // A message has been received from a remote endpoint.
    Toast.makeText(this, "Mensaje recibido", Toast.LENGTH_SHORT).show();
    debugLog("onMessageReceived:" + endpointId + ":" + new String(payload));
    playMp3(payload);
}
private void playMp3(final byte[] mp3SoundByteArray) {
    if (isPlaying == false) {
        speaker.play();
        isPlaying = true;
    } else {
        // Sending data to the AudioTrack obj, i.e. the speaker.
        speaker.write(mp3SoundByteArray, 0, minBufSize);
        Log.d("VR", "Writing buffer content to speaker");
    }
}
Can anyone help me with this??
Thanks!

You need to buffer some audio on the receiving device before trying to play it. Most likely you play each chunk as soon as it arrives, and the next chunk of data has not arrived in time, so the AudioTrack underruns.
See this post and this post about buffering data and streaming audio.
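A minimal sketch of that idea on the receiving side, reusing the speaker AudioTrack and the onMessageReceived callback from the question (the pre-buffer depth, chunk handling, and stop flag here are illustrative assumptions, not a drop-in fix):

// Assumes java.util.concurrent.BlockingQueue / LinkedBlockingQueue are imported.
// Queue incoming chunks and only start the AudioTrack once a few chunks have
// been buffered, so playback does not underrun while waiting for the network.
private final BlockingQueue<byte[]> pending = new LinkedBlockingQueue<>();
private static final int PREBUFFER_CHUNKS = 10; // illustrative value
private volatile boolean playing = false;       // set to false to stop playback

@Override
public void onMessageReceived(String endpointId, byte[] payload, boolean isReliable) {
    pending.offer(payload.clone()); // copy, in case the array is reused by the sender
}

private void playbackLoop() { // run this on its own thread
    try {
        while (pending.size() < PREBUFFER_CHUNKS) {
            Thread.sleep(10); // wait until enough audio has arrived
        }
        speaker.play();
        playing = true;
        while (playing) {
            byte[] chunk = pending.take();         // blocks until the next chunk arrives
            speaker.write(chunk, 0, chunk.length); // write the actual chunk length, not minBufSize
        }
    } catch (InterruptedException ignored) {
    }
}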

Related

How to write Short[] to wav output file in Android?

I am trying to write a short[] to a WAV audio file using a file output stream, but the file only contains a scratchy sound.
The reason I am using short[] rather than byte[] is that I am trying to use an external library which provides Voice Activity Detection. I added the WAV header provided in Android Audio Record to wav, and I tried to convert the short[] to byte[] using Converting Short array from Audio Record to Byte array without degrading audio quality?, but none of the above links were able to help me.
Here is my code:
private class ProcessVoice implements Runnable {

    @Override
    public void run() {
        File fl = new File(filePath, AUDIO_RECORDING_FILE_NAME);
        try {
            os = new BufferedOutputStream(new FileOutputStream(fl));
        } catch (FileNotFoundException e) {
            Log.w(TAG, "File not found for recording ");
        }
        android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_AUDIO);

        while (!Thread.interrupted() && isListening && audioRecord != null) {
            short[] buffer = new short[vad.getConfig().getFrameSize().getValue() * getNumberOfChannels() * 2];
            audioRecord.read(buffer, 0, buffer.length);
            isSpeechDetected(buffer);
        }
    }

    private void isSpeechDetected(final short[] buffer) {
        vad.isContinuousSpeech(buffer, new VadListener() {
            @Override
            public void onSpeechDetected() {
                callback.onSpeechDetected();
                bytes2 = new byte[buffer.length * 2];
                ByteBuffer.wrap(bytes2).order(ByteOrder.LITTLE_ENDIAN).asShortBuffer().put(buffer);
                //Log.w(TAG, String.valueOf(buffer));
                try {
                    // writes the data to file from buffer
                    // stores the voice buffer
                    os.write(header, 0, 44);
                    working = true;
                    os.write(bytes2, 0, bytes2.length);
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }

            @Override
            public void onNoiseDetected() {
                callback.onNoiseDetected();
                if (working == true) {
                    working = false;
                    try {
                        doneRec();
                    } catch (IOException e) {
                        e.printStackTrace();
                    }
                }
                //Log.w(TAG, String.valueOf(bytes2));
            }
        });
    }
}
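For reference, the 44-byte PCM WAV header the question mentions has a fixed layout; below is a minimal sketch of writing one (an illustration of the header format only, not a verified fix for the code above; the format parameters passed in are assumptions):

// Sketch: write a standard 44-byte PCM WAV header for the given format and payload size.
// Assumes java.nio.ByteBuffer / ByteOrder and java.io.OutputStream are imported.
static void writeWavHeader(OutputStream out, int sampleRate, int channels,
                           int bitsPerSample, int pcmDataLength) throws IOException {
    int byteRate = sampleRate * channels * bitsPerSample / 8;
    short blockAlign = (short) (channels * bitsPerSample / 8);

    ByteBuffer h = ByteBuffer.allocate(44).order(ByteOrder.LITTLE_ENDIAN);
    h.put(new byte[]{'R', 'I', 'F', 'F'});
    h.putInt(36 + pcmDataLength);            // overall chunk size
    h.put(new byte[]{'W', 'A', 'V', 'E'});
    h.put(new byte[]{'f', 'm', 't', ' '});
    h.putInt(16);                            // fmt chunk size for PCM
    h.putShort((short) 1);                   // audio format 1 = linear PCM
    h.putShort((short) channels);
    h.putInt(sampleRate);
    h.putInt(byteRate);
    h.putShort(blockAlign);
    h.putShort((short) bitsPerSample);
    h.put(new byte[]{'d', 'a', 't', 'a'});
    h.putInt(pcmDataLength);                 // size of the PCM payload that follows
    out.write(h.array());
}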

Audio Playback Failed - E/android.media.AudioTrack﹕ Front channels must be present in multichannel configurations

I have not used the Android audio recording classes much before, so I don't really have much knowledge in this area.
I have written a little app that records audio in the background and then plays it back, all in the PCM format (I am doing some tests to see how much battery the microphone uses in the background).
But when I try to run my play() method, I get these logcat errors:
11-03 00:20:05.744 18248-18248/com.bacon.corey.audiotimeshift E/android.media.AudioTrack﹕ Front channels must be present in multichannel configurations
11-03 00:20:05.748 18248-18248/com.bacon.corey.audiotimeshift E/AudioTrack﹕ Playback Failed
I have googled the errors, but I can't seem to find anything whatsoever about them.
If someone wouldn't mind giving me a few pointers, I would be hugely grateful.
This is the code for the application (it is quite sloppy and unfinished, as it is only for testing battery life):
public class MainActivity extends ActionBarActivity {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        if (savedInstanceState == null) {
            getSupportFragmentManager().beginTransaction()
                    .add(R.id.container, new PlaceholderFragment())
                    .commit();
        }
    }

    @Override
    public boolean onCreateOptionsMenu(Menu menu) {
        // Inflate the menu; this adds items to the action bar if it is present.
        getMenuInflater().inflate(R.menu.menu_main, menu);
        return true;
    }

    @Override
    public boolean onOptionsItemSelected(MenuItem item) {
        // Handle action bar item clicks here. The action bar will
        // automatically handle clicks on the Home/Up button, so long
        // as you specify a parent activity in AndroidManifest.xml.
        int id = item.getItemId();

        //noinspection SimplifiableIfStatement
        if (id == R.id.action_settings) {
            return true;
        }

        return super.onOptionsItemSelected(item);
    }

    /**
     * A placeholder fragment containing a simple view.
     */
    public static class PlaceholderFragment extends Fragment {

        public PlaceholderFragment() {
        }

        @Override
        public View onCreateView(LayoutInflater inflater, ViewGroup container,
                                 Bundle savedInstanceState) {
            View rootView = inflater.inflate(R.layout.fragment_main, container, false);
            return rootView;
        }
    }

    public void play(View view) {
        Toast.makeText(this, "play", Toast.LENGTH_SHORT).show();

        // Get the file we want to play back.
        File file = new File(Environment.getExternalStorageDirectory() + File.separator + "ACS.pcm");
        // Get the length of the audio stored in the file (16 bit so 2 bytes per short)
        // and create a short array to store the recorded audio.
        int musicLength = (int) (file.length() / 2);
        short[] music = new short[musicLength];

        try {
            // Create a DataInputStream to read the audio data back from the saved file.
            InputStream is = new FileInputStream(file);
            BufferedInputStream bis = new BufferedInputStream(is);
            DataInputStream dis = new DataInputStream(bis);

            // Read the file into the music array.
            int i = 0;
            while (dis.available() > 0) {
                music[musicLength - 1 - i] = dis.readShort();
                i++;
            }

            // Close the input streams.
            dis.close();

            // Create a new AudioTrack object using the same parameters as the AudioRecord
            // object used to create the file.
            AudioTrack audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC,
                    11025,
                    AudioFormat.CHANNEL_OUT_MONO,
                    AudioFormat.ENCODING_PCM_16BIT,
                    musicLength,
                    AudioTrack.MODE_STREAM);

            // Start playback
            audioTrack.play();

            // Write the music buffer to the AudioTrack object
            audioTrack.write(music, 0, musicLength);
        } catch (Throwable t) {
            Log.e("AudioTrack", "Playback Failed");
        }
    }

    public void record(View view) {
        Toast.makeText(this, "record", Toast.LENGTH_SHORT).show();
        Log.v("ACS", "OnCreate called");
        Intent intent = new Intent(this, ACS.class);
        startService(intent);
    }

    public void stop(View view) {
        Toast.makeText(this, "stop", Toast.LENGTH_SHORT).show();
        Intent intent = new Intent(this, ACS.class);
        stopService(intent);
    }
}
And
public class ACS extends IntentService {

    AudioRecord audioRecord;

    public ACS() {
        super("ACS");
    }

    @Override
    protected void onHandleIntent(Intent intent) {
        Log.v("ACS", "ACS called");
        record();
    }

    public void record() {
        Log.v("ACS", "Record started");

        int frequency = 11025;
        int channelConfiguration = AudioFormat.CHANNEL_IN_MONO;
        int audioEncoding = AudioFormat.ENCODING_PCM_16BIT;

        File file = new File(Environment.getExternalStorageDirectory() + File.separator + "ACS.pcm");

        // Delete any previous recording.
        if (file.exists())
            file.delete();

        // Create the new file.
        try {
            file.createNewFile();
        } catch (IOException e) {
            throw new IllegalStateException("Failed to create " + file.toString());
        }

        try {
            // Create a DataOutputStream to write the audio data into the saved file.
            OutputStream os = new FileOutputStream(file);
            BufferedOutputStream bos = new BufferedOutputStream(os);
            DataOutputStream dos = new DataOutputStream(bos);

            // Create a new AudioRecord object to record the audio.
            int bufferSize = AudioRecord.getMinBufferSize(frequency, channelConfiguration, audioEncoding);
            audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC,
                    frequency, channelConfiguration,
                    audioEncoding, bufferSize);

            short[] buffer = new short[bufferSize];
            audioRecord.startRecording();

            while (audioRecord.getRecordingState() == audioRecord.RECORDSTATE_RECORDING) {
                int bufferReadResult = audioRecord.read(buffer, 0, bufferSize);
                for (int i = 0; i < bufferReadResult; i++)
                    dos.writeShort(buffer[i]);
            }

            audioRecord.stop();
            dos.close();
        } catch (Throwable t) {
            Log.e("AudioRecord", "Recording Failed");
        }

        Log.v("ACS", "Record stopped");
    }

    public void onDestroy() {
        audioRecord.stop();
        Log.v("ACS", "onDestroy called, Record stopped");
    }
}
Thanks in advance
Corey :)
I had the same error message, "android.media.AudioTrack﹕ Front channels must be present in multichannel configurations".
When I changed the audio setting from AudioFormat.CHANNEL_OUT_MONO to AudioFormat.CHANNEL_IN_MONO, the error messages disappeared. (Or you can try a different configuration, like AudioFormat.CHANNEL_IN_STEREO.)
AudioTrack audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC,
        11025,
        AudioFormat.CHANNEL_IN_MONO,
        AudioFormat.ENCODING_PCM_16BIT,
        musicLength,
        AudioTrack.MODE_STREAM);
But I don't know why this happened. Hope this helps.
A mono audio file needs to be sent to both the left and right speakers. Do a logical OR of the two front channel masks to set this routing:
final int frontPair =
        AudioFormat.CHANNEL_OUT_FRONT_LEFT | AudioFormat.CHANNEL_OUT_FRONT_RIGHT;

AudioFormat audioFormat = new AudioFormat.Builder()
        .setEncoding(AudioFormat.ENCODING_PCM_8BIT)
        .setSampleRate(audioSamplingRate)
        .setChannelMask(frontPair)
        .build();
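If you go this route, the AudioTrack itself can be built from that AudioFormat; a minimal sketch assuming API 23+ (the audio attributes and buffer size below are illustrative choices, not part of the original answer):

int bufferSize = AudioTrack.getMinBufferSize(audioSamplingRate,
        AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_8BIT);

AudioTrack track = new AudioTrack.Builder()
        .setAudioAttributes(new AudioAttributes.Builder()
                .setUsage(AudioAttributes.USAGE_MEDIA)
                .setContentType(AudioAttributes.CONTENT_TYPE_MUSIC)
                .build())
        .setAudioFormat(audioFormat)          // the builder result from above
        .setBufferSizeInBytes(bufferSize)
        .setTransferMode(AudioTrack.MODE_STREAM)
        .build();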

Android record and playback

I am currently trying to build an amplifier for Android. The goal is to record and play back what is being recorded simultaneously. I created a thread to take care of this, but the sound comes out choppy. Here is what I tried.
private class RecordAndPlay extends Thread {

    int bufferSize;
    AudioRecord aRecord;
    short[] buffer;

    public RecordAndPlay() {
        bufferSize = AudioRecord.getMinBufferSize(22050, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
        buffer = new short[bufferSize];
    }

    @Override
    public void run() {
        aRecord = new AudioRecord(MediaRecorder.AudioSource.MIC, 22050, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);
        try {
            aRecord.startRecording();
        } catch (Exception e) {
        }

        int bufferedResult = aRecord.read(buffer, 0, bufferSize);

        final AudioTrack aTrack = new AudioTrack(AudioManager.STREAM_MUSIC, samplingRate, AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferedResult, AudioTrack.MODE_STREAM);
        aTrack.setNotificationMarkerPosition(bufferedResult);
        aTrack.setPlaybackPositionUpdateListener(new OnPlaybackPositionUpdateListener() {
            @Override
            public void onPeriodicNotification(AudioTrack track) {
                // TODO Auto-generated method stub
            }

            @Override
            public void onMarkerReached(AudioTrack track) {
                Log.d("Marker reached", "...");
                aTrack.release();
                aRecord.release();
                run();
            }
        });

        aTrack.play();
        aTrack.write(buffer, 0, buffer.length);
    }

    public void cancel() {
        aRecord.stop();
        aRecord.release();
    }
}
Your playback is choppy because the AudioTrack is getting starved and not receiving data smoothly. In your code you recursively call run() and create a new AudioTrack per marker. Instead, instantiate the AudioRecord and AudioTrack only once and just handle their events. Also, to help smooth out playback, you should probably start recording slightly before playback and maintain a queue of the recorded buffers. You can then manage passing these buffers to the AudioTrack and make sure there is always a new buffer to submit. A minimal sketch of that structure follows.
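The sketch below uses one AudioRecord and one AudioTrack, created once, with a blocking queue between them (assuming the 22050 Hz mono 16-bit format from the question; the buffer sizes and thread handling are illustrative, not a verified drop-in fix):

// Assumes java.util.Arrays and java.util.concurrent.BlockingQueue / LinkedBlockingQueue are imported.
final int sampleRate = 22050;
final int recBuf = AudioRecord.getMinBufferSize(sampleRate,
        AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
final int playBuf = AudioTrack.getMinBufferSize(sampleRate,
        AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);

// Created once, reused for the whole session.
final AudioRecord record = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleRate,
        AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, recBuf * 2);
final AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate,
        AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT, playBuf * 4,
        AudioTrack.MODE_STREAM);

final BlockingQueue<short[]> queue = new LinkedBlockingQueue<>();

Thread recorder = new Thread(new Runnable() {
    @Override
    public void run() {
        record.startRecording();
        short[] chunk = new short[recBuf];
        while (!Thread.interrupted()) {
            int n = record.read(chunk, 0, chunk.length);
            if (n > 0) {
                queue.offer(Arrays.copyOf(chunk, n)); // hand a copy to the playback thread
            }
        }
        record.stop();
    }
});

Thread player = new Thread(new Runnable() {
    @Override
    public void run() {
        track.play();
        try {
            while (!Thread.interrupted()) {
                short[] chunk = queue.take();        // blocks until audio is available
                track.write(chunk, 0, chunk.length); // blocking write keeps the track fed
            }
        } catch (InterruptedException ignored) {
        }
        track.stop();
    }
});

recorder.start();
player.start();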

Stream G711 ulaw on android with AudioTrack

I am trying to stream live audio from an Axis network security camera over a multipart HTTP stream that is encoded as G.711 µ-law, 8 kHz, 8-bit samples, on an Android phone. It seems like this should be pretty straightforward, and this is the basis of my code. I reused some streaming code I had that grabbed JPEG frames from an MJPEG stream, and now it grabs 512-byte blocks of audio data and hands them down to the AudioTrack. The audio sounds all garbled and distorted, though; am I missing something obvious?
@Override
public void onResume() {
    super.onResume();

    int bufferSize = AudioTrack.getMinBufferSize(8000, AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_8BIT);
    mAudioTrack = new AudioTrack(AudioManager.STREAM_MUSIC, 8000, AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_8BIT, bufferSize, AudioTrack.MODE_STREAM);
    mAudioTrack.play();

    thread.start();
}

class StreamThread extends Thread {
    public boolean running = true;

    public void run() {
        try {
            MjpegStreamer streamer = MjpegStreamer.read("/axis-cgi/audio/receive.cgi?httptype=multipart");

            while (running) {
                byte[] buf = streamer.readMjpegFrame();
                if (buf != null && mAudioTrack != null) {
                    mAudioTrack.write(buf, 0, buf.length);
                }
            }
        } catch (IOException e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        }
    }
}
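For context, G.711 µ-law bytes are companded rather than linear PCM, whereas ENCODING_PCM_8BIT expects unsigned linear samples, so raw µ-law data usually needs to be expanded to 16-bit linear PCM before being written to the track. A minimal sketch of the standard µ-law expansion (shown as an illustration, not a verified fix for the code above):

// Sketch: standard G.711 mu-law -> 16-bit linear PCM expansion.
static short[] muLawToPcm16(byte[] ulaw, int length) {
    short[] pcm = new short[length];
    for (int i = 0; i < length; i++) {
        int u = ~ulaw[i] & 0xFF;               // mu-law bytes are stored bit-inverted
        int sign = u & 0x80;
        int exponent = (u >> 4) & 0x07;
        int mantissa = u & 0x0F;
        int sample = (((mantissa << 3) + 0x84) << exponent) - 0x84;
        pcm[i] = (short) (sign != 0 ? -sample : sample);
    }
    return pcm;
}

With this approach the AudioTrack would be configured with ENCODING_PCM_16BIT and fed the expanded short[] buffers instead of the raw bytes.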

Recording with AudioRecord on Android speeds up the audio?

I am using AudioRecord to record raw audio for processing.
The audio records entirely without any noise, but when the raw PCM data generated is played back, it plays as if it has been sped up a lot (up to about twice as fast).
I am viewing and playing the PCM data in Audacity, and I am using an actual phone (Samsung Galaxy S5670) for testing.
The recording is done at 44100 Hz, 16-bit. Any idea what might cause this?
Following is the recording code:
public class TestApp extends Activity
{
    File file;
    OutputStream os;
    BufferedOutputStream bos;
    AudioRecord recorder;
    int iAudioBufferSize;
    boolean bRecording;
    int iBytesRead;

    Thread recordThread = new Thread() {
        @Override
        public void run()
        {
            byte[] buffer = new byte[iAudioBufferSize];
            int iBufferReadResult;
            iBytesRead = 0;
            while (!interrupted())
            {
                iBufferReadResult = recorder.read(buffer, 0, iAudioBufferSize);

                // Android is reading less number of bytes than requested.
                if (iAudioBufferSize > iBufferReadResult)
                {
                    iBufferReadResult = iBufferReadResult +
                            recorder.read(buffer, iBufferReadResult - 1, iAudioBufferSize - iBufferReadResult);
                }

                iBytesRead = iBytesRead + iBufferReadResult;

                for (int i = 0; i < iBufferReadResult; i++)
                {
                    try
                    {
                        bos.write(buffer[i]);
                    } catch (IOException e)
                    {
                        e.printStackTrace();
                    }
                }
            }
        }
    };

    @Override
    public void onCreate(Bundle savedInstanceState)
    {
        // File Creation and UI init stuff etc.

        bRecording = false;
        bPlaying = false;

        int iSampleRate = AudioTrack.getNativeOutputSampleRate(AudioManager.STREAM_SYSTEM);
        iAudioBufferSize = AudioRecord.getMinBufferSize(iSampleRate, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
        recorder = new AudioRecord(AudioSource.MIC, iSampleRate, AudioFormat.CHANNEL_IN_MONO,
                AudioFormat.ENCODING_PCM_16BIT, iAudioBufferSize);

        bt_Record.setOnClickListener(new OnClickListener()
        {
            @Override
            public void onClick(View v)
            {
                if (!bRecording)
                {
                    try
                    {
                        recorder.startRecording();
                        bRecording = true;
                        recordThread.start();
                    }
                    catch (Exception e)
                    {
                        tv_Error.setText(e.getLocalizedMessage());
                    }
                }
                else
                {
                    recorder.stop();
                    bRecording = false;
                    recordThread.interrupt();
                    try
                    {
                        bos.close();
                    }
                    catch (IOException e)
                    {
                    }
                    tv_Hello.setText("Recorded Sucessfully. Total " + iBytesRead + " bytes.");
                }
            }
        });
    }
}
RESOLVED: I posted this after struggling with it for 1-2 days, but ironically I found the solution soon after posting. The buffered output stream write was taking too much time in the for loop, so the stream was skipping samples. I changed it to a block write, removing the for loop, and it works perfectly.
The audio skipping was caused by the delay in writing to the buffer.
The solution is to just replace this for loop:
for (int i = 0; i < iBufferReadResult; i++)
{
    try
    {
        bos.write(buffer[i]);
    } catch (IOException e)
    {
        e.printStackTrace();
    }
}
with a single write, like so:
bos.write(buffer, 0, iBufferReadResult);
I had used code from a book which worked, I guess, for lower sample rates and buffer updates.
