I don't know how to play a live WAV audio stream received over a socket. I already have the socket stream. The stream format is a repeating sequence:
wav format header + pcm data
wav format header + pcm data
wav format header + pcm data
How do I parse this live audio stream so I can play it with Android's AudioTrack class? Thanks.
Here is my code:
private void PlayAudio(int mode)
{
    if (AudioTrack.MODE_STATIC != mode && AudioTrack.MODE_STREAM != mode)
        throw new InvalidParameterException();
    long bytesWritten = 0;
    int bytesRead = 0;
    int bufferSize = 0;
    byte[] buffer;
    AudioTrack track;
    Socket socket = null;
    DataInputStream dIn = null;
    bufferSize = 55584; // I don't know what the buffer size should be. 55584 is the size of the first chunk I got from the socket stream; maybe the buffer size is set wrong.
    // audio format: sample rate 16 kHz, mono, 16 bits per sample
    bufferSize = AudioTrack.getMinBufferSize(16000,
            AudioFormat.CHANNEL_CONFIGURATION_MONO,
            AudioFormat.ENCODING_PCM_16BIT);
    buffer = new byte[bufferSize];
    track = new AudioTrack(AudioManager.STREAM_MUSIC, 16000,
            AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT,
            bufferSize, mode);
    // in stream mode,
    // 1. start track playback
    // 2. write data to track
    if (AudioTrack.MODE_STREAM == mode)
        track.play();
    try
    {
        socket = new Socket("192.168.11.123", 8081);
        dIn = new DataInputStream(socket.getInputStream());
        // dIn.skipBytes(44);
    }
    catch (Exception e)
    {
        e.printStackTrace();
    }
    try
    {
        do
        {
            long t0 = SystemClock.elapsedRealtime();
            try
            {
                bytesRead = dIn.read(buffer, 0, buffer.length);
            }
            catch (IOException e)
            {
                e.printStackTrace();
            }
            catch (NullPointerException e)
            {
                e.printStackTrace();
            }
            bytesWritten += track.write(buffer, 0, bytesRead);
            Log.e("debug", "WritesBytes " + bytesRead);
        } while (dIn.read() != -1);
    }
    catch (IOException e)
    {
        e.printStackTrace();
    }
}
When I run the activity I get silence, but in debug mode I can hear some audio intermittently, and it is noisy. Could you please help me?
The server sends the stream in 100 ms intervals.
Audio format: sample rate 16 kHz, 16 bits per sample, 1 channel (mono).
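Since each chunk arrives with its own 44-byte WAV header, the headers have to be stripped before the PCM bytes reach the AudioTrack; also note that while (dIn.read() != -1) silently swallows one audio byte per pass. Below is a minimal sketch of a read loop, assuming each chunk carries a canonical 44-byte header whose data sub-chunk length (bytes 40-43) describes the PCM payload that follows; the method name is made up, and the host, port, and format are taken from the question. If the server writes a dummy length, a fixed chunk size would be needed instead.
// Sketch only: strip the per-chunk WAV header, then feed raw PCM to AudioTrack.
private void playHeaderedPcmStream() throws IOException {
    int bufferSize = AudioTrack.getMinBufferSize(16000,
            AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);
    AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, 16000,
            AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
            bufferSize, AudioTrack.MODE_STREAM);
    Socket socket = new Socket("192.168.11.123", 8081);
    DataInputStream dIn = new DataInputStream(socket.getInputStream());
    byte[] header = new byte[44];
    byte[] pcm = new byte[bufferSize];
    track.play();
    try {
        readLoop:
        while (true) {
            dIn.readFully(header); // consume and discard the chunk's WAV header
            // the data sub-chunk size is a little-endian int at bytes 40..43
            int dataLen = (header[40] & 0xff) | ((header[41] & 0xff) << 8)
                    | ((header[42] & 0xff) << 16) | ((header[43] & 0xff) << 24);
            int remaining = dataLen;
            while (remaining > 0) {
                int n = dIn.read(pcm, 0, Math.min(pcm.length, remaining));
                if (n == -1) break readLoop; // stream ended mid-chunk
                track.write(pcm, 0, n); // blocking write keeps playback continuous
                remaining -= n;
            }
        }
    } catch (EOFException e) {
        // server closed the connection between chunks
    } finally {
        track.stop();
        track.release();
        socket.close();
    }
}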
I am using the AudioTrack class to play PCM samples that are being streamed over Wi-Fi. The problem is that playback is very choppy on a Lenovo K900 (Android 4.1.2), whereas it is seamless on a OnePlus One (Marshmallow 6.0.1, Cyanogen).
To rule out streaming delay, I first loaded the entire stream into an array and then played the audio by looping over the pre-populated array. Things did not improve at all.
The K900 is running Android 4.1.2, whereas the Android SDK I compiled against is API 23; when I open the definition of the AudioTrack class, it opens the class file from API 23. Could that be causing the problem?
int bufferSize = AudioTrack.getMinBufferSize(SAMPLE_RATE, AudioFormat.CHANNEL_OUT_MONO,
        AudioFormat.ENCODING_PCM_8BIT);
if (bufferSize == AudioTrack.ERROR || bufferSize == AudioTrack.ERROR_BAD_VALUE) {
    bufferSize = SAMPLE_RATE * 2;
}
int readBytes;
int bufSize = 2123880; // size of the file being streamed
byte[] buffer = new byte[bufSize];
//bufferSize *= 2;
DataInputStream in = new DataInputStream(socket.getInputStream());
readBytes = 0;
while (true) {
    try {
        if (in.available() > 0) {
            readBytes += in.read(buffer, readBytes, in.available());
            // audioTrack.write(buffer, 0, readBytes);
        } else {
            break;
        }
    } catch (SocketTimeoutException s) {
        System.out.println("Socket timed out!");
        //audioTrack.release();
        break;
    } catch (IOException e) {
        e.printStackTrace();
        //audioTrack.release();
        break;
    }
}
in.close();
readBytes = 0;
AudioTrack audioTrack = new AudioTrack(
        AudioManager.STREAM_MUSIC,
        SAMPLE_RATE,
        AudioFormat.CHANNEL_OUT_MONO,
        AudioFormat.ENCODING_PCM_8BIT,
        bufferSize,
        AudioTrack.MODE_STREAM);
audioTrack.play();
while (readBytes < bufSize) {
    readBytes += audioTrack.write(buffer, readBytes, bufferSize);
}
audioTrack.release();
I tested with samples at sampling rates ranging from 19250 Hz to 44100 Hz. In every case the K900 plays choppy audio whereas the OnePlus One plays smoothly. I don't think the K900 is too underpowered for such an easy job.
Kindly assist.
EDIT
Here is the code I intend to use:
try {
    //dataInputStream = new DataInputStream(socket.getInputStream());
    //response = dataInputStream.readUTF();
    int bufferSize = AudioTrack.getMinBufferSize(SAMPLE_RATE, AudioFormat.CHANNEL_OUT_MONO,
            AudioFormat.ENCODING_PCM_8BIT);
    if (bufferSize == AudioTrack.ERROR || bufferSize == AudioTrack.ERROR_BAD_VALUE) {
        bufferSize = SAMPLE_RATE * 2;
    }
    int readBytes;
    byte[] buffer = new byte[bufferSize];
    bufferSize = bufferSize * 2;
    AudioTrack audioTrack = new AudioTrack(
            AudioManager.STREAM_MUSIC,
            SAMPLE_RATE,
            AudioFormat.CHANNEL_OUT_STEREO,
            AudioFormat.ENCODING_PCM_8BIT,
            bufferSize,
            AudioTrack.MODE_STREAM);
    audioTrack.play();
    DataInputStream in = new DataInputStream(socket.getInputStream());
    while (true) {
        try {
            if (in.available() > 0) {
                readBytes = in.read(buffer, 0, buffer.length);
                audioTrack.write(buffer, 0, readBytes);
            }
        } catch (SocketTimeoutException s) {
            System.out.println("Socket timed out!");
            audioTrack.release();
            in.close();
            socket.close();
            break;
        } catch (IOException e) {
            e.printStackTrace();
            audioTrack.release();
            in.close();
            socket.close();
            break;
        }
    }
} catch (Exception e) {
    e.printStackTrace();
}
}
Thanks,
Debojit
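Two things in the snippets above can cause trouble on their own: the in.available() polling loop busy-waits and may split reads at arbitrary points, and the final write loop audioTrack.write(buffer, readBytes, bufferSize) runs past the end of the buffer on the last iteration once readBytes + bufferSize exceeds bufSize. A bounds-safe sketch of the play-from-array loop, reusing buffer, bufSize, bufferSize, and audioTrack from the question:
// Sketch: drain a pre-filled PCM array into the AudioTrack in
// bufferSize slices, clamping the last slice to the end of the array.
audioTrack.play();
int written = 0;
while (written < bufSize) {
    int chunk = Math.min(bufferSize, bufSize - written); // never overrun the array
    int n = audioTrack.write(buffer, written, chunk);    // blocks until accepted
    if (n < 0) break; // an AudioTrack error code such as ERROR_BAD_VALUE
    written += n;
}
audioTrack.stop();
audioTrack.release();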
To produce sound on Android, I am using AudioTrack.
I have been able to produce sine waves, sawtooth waves, and square waves, but it would be nice to have a more realistic sound.
I found that .wav files were the easiest to play with AudioTrack because they are basically just a sequence of bytes with a header.
So I have got my wav file in the res/raw folder and I tried playing it with this code:
public void writeWav() {
    byte[] byteData = null;
    InputStream is = getResources().openRawResource(R.raw.high);
    byteData = new byte[mBufferSize];
    try {
        is.read(byteData);
        is.close();
    }
    catch (FileNotFoundException e) {}
    catch (IOException e) {}
    mAudioTrack.write(byteData, 0, byteData.length);
}
But all I get is noise. I realize there are lots of questions about AudioTrack and wav files, but I couldn't find an answer to my noise problem.
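Two likely causes in the snippet: a single is.read(byteData) call is not guaranteed to fill the buffer (and reads at most mBufferSize bytes of the whole file), and the first 44 bytes are the WAV header, which comes out as noise if written to the track. A sketch that reads the resource fully and skips the header, assuming a canonical 44-byte header and an mAudioTrack already configured to match the file's sample rate and format:
// Sketch: load the whole res/raw WAV, skip its 44-byte header,
// and write only the PCM payload to the AudioTrack.
public void writeWav() throws IOException {
    InputStream is = getResources().openRawResource(R.raw.high);
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    byte[] chunk = new byte[4096];
    int n;
    while ((n = is.read(chunk)) != -1) { // loop: one read() may return early
        out.write(chunk, 0, n);
    }
    is.close();
    byte[] byteData = out.toByteArray();
    final int HEADER = 44; // canonical WAV header length (assumed)
    mAudioTrack.write(byteData, HEADER, byteData.length - HEADER);
}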
Use this method to play sound with AudioTrack; it works for me:
public void playAudioTrack() {
    int sampleFreq = 16000;
    File file = new File("--filePath--");
    int shortSizeInBytes = Short.SIZE / Byte.SIZE;
    int minBufferSize = AudioTrack.getMinBufferSize(sampleFreq, AudioFormat.CHANNEL_OUT_MONO,
            AudioFormat.ENCODING_PCM_16BIT);
    int bufferSizeInBytes = (int) (file.length() / shortSizeInBytes);
    final AudioTrack at = new AudioTrack(AudioManager.STREAM_MUSIC, sampleFreq,
            AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT, minBufferSize,
            AudioTrack.MODE_STREAM);
    int i = 0;
    byte[] s = new byte[bufferSizeInBytes];
    try {
        final FileInputStream fin = new FileInputStream("--filePath--");
        final DataInputStream dis = new DataInputStream(fin);
        dis.skipBytes(44); // skip the WAV header so it is not played as a click of noise
        at.setNotificationMarkerPosition((int) (file.length() / 2));
        at.play();
        while ((i = dis.read(s, 0, bufferSizeInBytes)) > -1) {
            at.write(s, 0, i);
        }
        dis.close();
    } catch (FileNotFoundException e) {
        e.printStackTrace();
    } catch (IOException e) {
        e.printStackTrace();
    } finally {
        at.stop();
        at.release();
    }
}
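Since AudioTrack.write(...) blocks until the audio output consumes each chunk, it is worth calling this method off the UI thread; for example:
// Run playback on a background thread, because at.write(...) blocks.
new Thread(new Runnable() {
    @Override
    public void run() {
        playAudioTrack();
    }
}, "AudioTrack playback").start();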
I'm developing an app that requires PCM audio recorded at 16 kHz, 16 bits, mono (1 channel). It works perfectly on a Motorola ATRIX, but the recording is choppy throughout the file on an HTC One. I think the HTC One is still trying to record in stereo but writes blanks for the second channel. If I record in stereo it works great, but I need mono.
Has anyone heard of this being an issue? Mixing the track down from stereo to mono is not an option due to time constraints.
private static final int RECORDER_SAMPLERATE = 16000;
private static final int RECORDER_CHANNELS = AudioFormat.CHANNEL_IN_MONO;
private static final int RECORDER_AUDIO_ENCODING = AudioFormat.ENCODING_PCM_16BIT;

private void startRecording()
{
    int bufferSize = AudioRecord.getMinBufferSize(RECORDER_SAMPLERATE, RECORDER_CHANNELS, RECORDER_AUDIO_ENCODING);
    recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
            RECORDER_SAMPLERATE, RECORDER_CHANNELS,
            RECORDER_AUDIO_ENCODING, bufferSize);
    recorder.startRecording();
    isRecording = true;
    recordingThread = new Thread(new Runnable()
    {
        public void run()
        {
            try {
                writeAudioDataToFile();
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }, "AudioRecorder Thread");
    recordingThread.start();
}
I was able to fix the problem with an easy solution that works great on both the ATRIX and the HTC One. I don't know WHY it works (any insight on that would be greatly appreciated), but here is what I did.
All I did was change my BufferElements2Rec constant from 1024 to 512; the constant is used in the code below. I guess the smaller buffer size allowed it to work properly.
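For reference, these are the declarations the code relies on; BufferElements2Rec takes the value from the text, while BytesPerElement is not shown in the original post, so 2 (bytes per 16-bit sample) is assumed:
// Assumed declarations; BytesPerElement is not in the original post.
private static final int BufferElements2Rec = 512; // was 1024; 512 fixed the choppiness
private static final int BytesPerElement = 2;      // 16-bit PCM: 2 bytes per short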
This is my writeAudioDataToFile() function called inside the recording thread:
private void writeAudioDataToFile() throws IOException
{
    // create filename
    filePath = generateFilePath();
    // start writing data
    short sData[] = new short[BufferElements2Rec];
    FileOutputStream os = null;
    try {
        os = new FileOutputStream(filePath);
    } catch (Exception e) {
        e.printStackTrace();
    }
    byte[] headerBytes = writeWAVHeader(1);
    os.write(headerBytes, 0, headerBytes.length);
    while (isRecording)
    {
        // read the microphone output into the short buffer
        recorder.read(sData, 0, BufferElements2Rec);
        System.out.println("Recording audio to file " + filePath);
        try {
            // write the buffer to the file
            byte bData[] = short2byte(sData);
            os.write(bData, 0, BufferElements2Rec * BytesPerElement);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
    try {
        os.close();
    } catch (IOException e) {
        e.printStackTrace();
    }
}
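The short2byte helper is not shown in the post; a minimal little-endian implementation consistent with 16-bit WAV data might look like this (the signature is inferred from the call site above):
// Hypothetical helper matching the call site: converts 16-bit samples
// to little-endian bytes, the byte order WAV files use for PCM data.
private byte[] short2byte(short[] sData) {
    byte[] bytes = new byte[sData.length * 2];
    for (int i = 0; i < sData.length; i++) {
        bytes[i * 2] = (byte) (sData[i] & 0x00FF);          // low byte first
        bytes[i * 2 + 1] = (byte) ((sData[i] >> 8) & 0xFF); // then high byte
    }
    return bytes;
}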
I want to create an audio mixer (DJ music track) kind of app that can create a DJ mix of an audio song. The user selects a music track, which can then be mixed with two or more separate rhythm, bass, or beat tracks to create a new, modified DJ mix.
I did a lot of research on this but could not find any idea or clue.
If anyone has an idea or a reference URL regarding this, please share it.
There is no built-in library on Android that supports audio mixing (combining two audio input streams into one output stream). The Java javax.sound library, which supports audio mixing, was not ported to Android; there is an interesting discussion on Google Groups with Google engineer Dianne Hackborn about the decision not to port javax.sound to Android.
It looks like you have to develop your own solution from scratch. There are several helpful answers on SO on how to combine two audio streams into one:
Mixing Audio Files
Audio editing in Android
Android - Mixing multiple static waveforms into a single AudioTrack
It sounds like the hardest part of this would be playing multiple tracks at once, and that the rest can be done with the UI. One link that might help you is How to play multiple ogg or mp3 at the same time..? The documentation for SoundPool, which lets you play multiple sounds at once, can be found here.
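Whatever route you take, the core of mixing is summing the sample values of the inputs and clamping the result so it cannot wrap around the 16-bit range. A minimal sketch of that inner step, assuming two 16-bit PCM buffers of the same format:
// Sketch: mix two 16-bit PCM buffers by summing samples and hard-clipping
// to the short range, so the sum cannot overflow and wrap around.
static short[] mixPcm16(short[] a, short[] b) {
    short[] mixed = new short[Math.min(a.length, b.length)];
    for (int i = 0; i < mixed.length; i++) {
        int sum = a[i] + b[i];
        if (sum > Short.MAX_VALUE) sum = Short.MAX_VALUE;
        if (sum < Short.MIN_VALUE) sum = Short.MIN_VALUE;
        mixed[i] = (short) sum;
    }
    return mixed;
}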
It is late, but if someone needs it, AudioMixer-android can be used.
File dir;
dir = new File(Environment.getExternalStorageDirectory().getAbsolutePath());
dir.mkdirs();

// Audio mixer: combines two .raw files into a single .wav file...
void AudioMixer() {
    File file_play1 = new File(dir, "Neww.raw");
    int shortSizeInBytes = Short.SIZE / Byte.SIZE;
    int bufferSizeInBytes = (int) (file_play1.length() / shortSizeInBytes);
    short[] audioData = new short[bufferSizeInBytes];
    try {
        InputStream inputStream = new FileInputStream(file_play1);
        BufferedInputStream bufferedInputStream = new BufferedInputStream(inputStream);
        DataInputStream dataInputStream = new DataInputStream(bufferedInputStream);

        InputStream inputStream1 = getResources().openRawResource(R.raw.trainss); // second input, from the raw folder
        BufferedInputStream bufferedInputStream1 = new BufferedInputStream(inputStream1);
        DataInputStream dataInputStream1 = new DataInputStream(bufferedInputStream1);

        int i = 0;
        while (dataInputStream.available() > 0 && dataInputStream1.available() > 0) {
            // mix by summing samples; note the sum can overflow and clip harshly
            audioData[i] = (short) (dataInputStream.readShort() + dataInputStream1.readShort());
            i++;
        }
        dataInputStream.close();
        dataInputStream1.close();

        AudioTrack audioTrack = new AudioTrack(
                AudioManager.STREAM_MUSIC,
                11025,
                AudioFormat.CHANNEL_CONFIGURATION_MONO,
                AudioFormat.ENCODING_PCM_16BIT,
                bufferSizeInBytes,
                AudioTrack.MODE_STREAM);
        audioTrack.play(); // start playback before writing in stream mode
        audioTrack.write(audioData, 0, bufferSizeInBytes);

        // merge the two .raw files into a single .raw file...
        File file_record = new File(dir, "testing.raw");
        try {
            OutputStream outputStream = new FileOutputStream(file_record);
            BufferedOutputStream bufferedOutputStream = new BufferedOutputStream(outputStream);
            DataOutputStream dataOutputStream = new DataOutputStream(bufferedOutputStream);
            for (int j = 0; j < audioData.length; j++) {
                dataOutputStream.writeShort(audioData[j]);
            }
            dataOutputStream.close();
        } catch (IOException e) {
            e.printStackTrace();
        }

        // convert that .raw (testing.raw) file into a .wav (testingNew.wav) file
        File des = new File(dir, "testingNew.wav");
        try {
            rawToWave(file_record, des);
        } catch (IOException e) {
            e.printStackTrace();
        }
    } catch (FileNotFoundException e) {
        e.printStackTrace();
    } catch (IOException e) {
        e.printStackTrace();
    } catch (IllegalStateException e) {
        e.printStackTrace();
    }
}
// convert a .raw file to a .wav file...
private void rawToWave(final File rawFile, final File waveFile) throws IOException {
    byte[] rawData = new byte[(int) rawFile.length()];
    DataInputStream input = null;
    try {
        input = new DataInputStream(new FileInputStream(rawFile));
        input.readFully(rawData); // readFully: a single read() might not fill the array
    } finally {
        if (input != null) {
            input.close();
        }
    }
    DataOutputStream output = null;
    try {
        output = new DataOutputStream(new FileOutputStream(waveFile));
        // WAVE header
        writeString(output, "RIFF");           // chunk id
        writeInt(output, 36 + rawData.length); // chunk size
        writeString(output, "WAVE");           // format
        writeString(output, "fmt ");           // subchunk 1 id
        writeInt(output, 16);                  // subchunk 1 size
        writeShort(output, (short) 1);         // audio format (1 = PCM)
        writeShort(output, (short) 1);         // number of channels
        writeInt(output, SAMPLE_RATE);         // sample rate
        writeInt(output, SAMPLE_RATE * 2);     // byte rate (mono, 16-bit)
        writeShort(output, (short) 2);         // block align
        writeShort(output, (short) 16);        // bits per sample
        writeString(output, "data");           // subchunk 2 id
        writeInt(output, rawData.length);      // subchunk 2 size
        // audio data: byte-swap the big-endian shorts written by
        // DataOutputStream into the little-endian order WAV expects
        short[] shorts = new short[rawData.length / 2];
        ByteBuffer.wrap(rawData).order(ByteOrder.LITTLE_ENDIAN).asShortBuffer().get(shorts);
        ByteBuffer bytes = ByteBuffer.allocate(shorts.length * 2);
        for (short s : shorts) {
            bytes.putShort(s);
        }
        output.write(bytes.array());
    } finally {
        if (output != null) {
            output.close();
        }
    }
}

private void writeInt(final DataOutputStream output, final int value) throws IOException {
    // write a 32-bit value little-endian, low byte first
    output.write(value >> 0);
    output.write(value >> 8);
    output.write(value >> 16);
    output.write(value >> 24);
}

private void writeShort(final DataOutputStream output, final short value) throws IOException {
    // write a 16-bit value little-endian, low byte first
    output.write(value >> 0);
    output.write(value >> 8);
}

private void writeString(final DataOutputStream output, final String value) throws IOException {
    for (int i = 0; i < value.length(); i++) {
        output.write(value.charAt(i));
    }
}
// play the merged file...
private void playWavFile() {
    MediaPlayer recorded_audio_in_sounds = new MediaPlayer();
    String outputFile = Environment.getExternalStorageDirectory().getAbsolutePath() + "/testingNew.wav";
    try {
        if (recorded_audio_in_sounds != null) {
            if (recorded_audio_in_sounds.isPlaying()) {
                recorded_audio_in_sounds.pause();
                recorded_audio_in_sounds.stop();
                recorded_audio_in_sounds.reset();
                recorded_audio_in_sounds.setAudioStreamType(AudioManager.STREAM_MUSIC); // set before prepare()
                recorded_audio_in_sounds.setDataSource(outputFile);
                recorded_audio_in_sounds.prepare();
                recorded_audio_in_sounds.start();
            } else {
                recorded_audio_in_sounds.reset();
                recorded_audio_in_sounds.setDataSource(outputFile);
                recorded_audio_in_sounds.prepare();
                recorded_audio_in_sounds.start();
                recorded_audio_in_sounds.setVolume(2.0f, 2.0f);
            }
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
}
I'm trying to play an MP3 file from the SD card on my Android emulator, but all that comes out is a weird buzzing noise. I made sure the sample rate is 44.1 kHz; I don't know what else could be wrong.
if (AudioTrack.MODE_STATIC != mode && AudioTrack.MODE_STREAM != mode)
    throw new InvalidParameterException();

String audioFilePath = "/sdcard/test.mp3";
long fileSize = 0;
long bytesWritten = 0;
int bytesRead = 0;
int bufferSize = 0;
byte[] buffer;
AudioTrack track;

File audioFile = new File(audioFilePath);
fileSize = audioFile.length();
if (AudioTrack.MODE_STREAM == mode)
{
    bufferSize = 8000;
}
else
{   // AudioTrack.MODE_STATIC
    bufferSize = (int) fileSize;
}
buffer = new byte[bufferSize];
track = new AudioTrack(AudioManager.STREAM_MUSIC, /* 22050 for pcm */ /* 44100 for mp3 */ 44100,
        AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_DEFAULT /* AudioFormat.ENCODING_PCM_16BIT */,
        bufferSize, mode);

// in stream mode,
// 1. start track playback
// 2. write data to track
if (AudioTrack.MODE_STREAM == mode)
    track.play();

FileInputStream audioStream = null;
try {
    audioStream = new FileInputStream(audioFile);
} catch (FileNotFoundException e) {
    e.printStackTrace();
}

while (bytesWritten < fileSize)
{
    try {
        bytesRead = audioStream.read(buffer, 0, bufferSize);
    } catch (IOException e) {
        e.printStackTrace();
    }
    bytesWritten += track.write(buffer, 0, bytesRead);
}

// in static mode,
// 1. write data to track
// 2. start track playback
if (AudioTrack.MODE_STATIC == mode)
    track.play();
It's correct that you hear strange noise: you need to decode the MP3 before you feed it to an AudioTrack! AudioTrack only plays raw PCM audio.
You should use android.media.MediaPlayer to play an MP3 audio file.
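A minimal sketch of the MediaPlayer route, using the file path from the question:
// Sketch: MediaPlayer decodes the MP3 itself, so no AudioTrack is needed.
MediaPlayer player = new MediaPlayer();
try {
    player.setDataSource("/sdcard/test.mp3"); // path from the question
    player.prepare(); // synchronous prepare is fine for a local file
    player.start();
    player.setOnCompletionListener(new MediaPlayer.OnCompletionListener() {
        @Override
        public void onCompletion(MediaPlayer mp) {
            mp.release(); // free the player once playback finishes
        }
    });
} catch (IOException e) {
    e.printStackTrace();
}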