Not able to read the data from the MIC - android

I am trying to read data from the mic, process it, and store it in a file. But I am not getting any data from the mic; the buffer is all zeroes.
int MIN_BUF = AudioRecord.getMinBufferSize(8000,
        AudioFormat.CHANNEL_CONFIGURATION_MONO,
        AudioFormat.ENCODING_PCM_16BIT);

AudioRecord recorder = new AudioRecord(
        MediaRecorder.AudioSource.MIC, 8000,
        AudioFormat.CHANNEL_CONFIGURATION_MONO,
        AudioFormat.ENCODING_PCM_16BIT, MIN_BUF);

byte[] pcm_in = new byte[320];

recorder.startRecording();
while (record) {
    int bytes_read = recorder.read(pcm_in, 0, pcm_in.length);
    switch (bytes_read) {
        case AudioRecord.ERROR_INVALID_OPERATION:
        case AudioRecord.ERROR_BAD_VALUE:
            Log.i("Microphone", "Error in reading the data");
            break;
        default:
            print(pcm_in);
            break;
    }
}
recorder.stop();
recorder.release();
But when I print pcm_in byte by byte in print(pcm_in), I get all zeroes. There are some posts on Stack Overflow with similar issues, but they did not fix my problem.
Please help me fix this.
Thanks & Regards,
SSuman185

print(pcm_in) will show you the actual data. You need to copy the pcm_in data into pcm in a loop until you stop recording.
Your variable record is a boolean, and you make it false in another method. Until you make it false, recorder.read(pcm_in, 0, pcm_in.length) will keep getting data from your mic and putting it into pcm_in (so you need to make sure pcm is large enough to hold it). bytes_read will be the number of bytes read by that call, so you can copy that many bytes from pcm_in into pcm in a loop.
for example:
bytes_read = recorder.read(pcm_in, 0, pcm_in.length);
for (int i = 0; i < bytes_read; i++) {
    pcm[i] = pcm_in[i];
}
But this is a weird usage. I think your pcm should be as large as the data you need to load into it, and make sure you are appending pcm_in to it, not overwriting it, as in the sketch below. I think this is what you want.
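For example, a minimal sketch of the accumulating loop, assuming you record roughly ten seconds of 8 kHz, 16-bit mono audio (the offset variable and the buffer size are my own additions, not from the original code):

// Sketch: append each chunk read from the mic to one large pcm buffer.
byte[] pcm = new byte[8000 * 2 * 10];   // assumed capacity: ~10 s of 8 kHz, 16-bit mono
int offset = 0;

recorder.startRecording();
while (record && offset < pcm.length) {
    int bytes_read = recorder.read(pcm_in, 0, pcm_in.length);
    if (bytes_read > 0) {
        int toCopy = Math.min(bytes_read, pcm.length - offset);
        System.arraycopy(pcm_in, 0, pcm, offset, toCopy);   // append, don't overwrite
        offset += toCopy;
    }
}
recorder.stop();
recorder.release();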

Related

Why Can't I Play Raw Audio Bytes Using AudioTrack's Static Mode?

I have an Android app where some raw audio bytes are stored in a variable.
If I use an AudioTrack to play this audio data, it only works if I use AudioTrack.MODE_STREAM:
byte[] recordedAudioAsBytes;

public void playButtonPressed(View v) {
    // this verifies that audio data exists as expected
    for (int i = 0; i < recordedAudioAsBytes.length; i++) {
        Log.i("ABC", "byte[" + i + "] = " + recordedAudioAsBytes[i]);
    }

    // STREAM MODE ACTUALLY WORKS!!
    /*
    AudioTrack player = new AudioTrack(AudioManager.STREAM_MUSIC, SAMPLERATE, CHANNELS,
            ENCODING, MY_CHOSEN_BUFFER_SIZE, AudioTrack.MODE_STREAM);
    player.play();
    player.write(recordedAudioAsBytes, 0, recordedAudioAsBytes.length);
    */

    // STATIC MODE DOES NOT WORK
    AudioTrack player = new AudioTrack(AudioManager.STREAM_MUSIC, SAMPLERATE, PLAYBACK_CHANNELS,
            ENCODING, MY_CHOSEN_BUFFER_SIZE, AudioTrack.MODE_STATIC);
    player.write(recordedAudioAsBytes, 0, recordedAudioAsBytes.length);
    player.play();
}
If I use AudioTrack.MODE_STATIC, the output is glitchy -- it just makes a nasty pop and sounds very short, with hardly anything audible.
So why is that? Does MODE_STATIC require that the audio data have a header?
That's all I can think of.
If you'd like to see all the code, check this question.
It seems to me that you are using the same MY_CHOSEN_BUFFER_SIZE for streaming and static mode. This might explain why it sounds short...
In order to use AudioTrack's static mode you have to use the size of your byte array (bigger will also work) as the buffer size. The audio will be treated as one big chunk of data.
See AudioTrack.Builder,
setBufferSizeInBytes(): "If using the AudioTrack in static mode (see AudioTrack#MODE_STATIC), this is the maximum size of the sound that will be played by this instance."
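As a sketch, assuming the same SAMPLERATE, PLAYBACK_CHANNELS, and ENCODING constants from the question, the static-mode setup would look something like this:

// Static mode: size the track to the whole clip, write it once, then play.
AudioTrack player = new AudioTrack(AudioManager.STREAM_MUSIC, SAMPLERATE, PLAYBACK_CHANNELS,
        ENCODING, recordedAudioAsBytes.length, AudioTrack.MODE_STATIC);
player.write(recordedAudioAsBytes, 0, recordedAudioAsBytes.length);
player.play();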

AudioRecord read() returns strange values

I'm trying to read raw data from the mic with the following code:
short[] buffer = new short[AudioRecord.getMinBufferSize(8000,
        AudioFormat.CHANNEL_IN_MONO,
        AudioFormat.ENCODING_PCM_16BIT)];
Log.d("O_o", "" + buffer.length);

AudioRecord rec = new AudioRecord(
        MediaRecorder.AudioSource.MIC, 8000,
        AudioFormat.CHANNEL_IN_MONO,
        AudioFormat.ENCODING_PCM_16BIT, buffer.length);

rec.startRecording();
int read = rec.read(buffer, 0, buffer.length);
for (int i = 0; i < read; i++) {
    Log.d("O_o", i + " " + buffer[i]);
}
rec.stop();
rec.release();
But the buffer is always filled with 257 values.
What's wrong?
UPD: looks like these are initial values. Calling read() in a loop gives normal values.
You should definitely take a look at this question and answer. It shows some code that would improve yours very much.
Basically, your problem is that you're trying to read synchronously. Audio capture usually has to be implemented asynchronously, and you'll be getting 256-byte-sized chunks of audio at any one time.
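A minimal sketch of that, reusing rec and buffer from the question (the background thread and the running flag are my own additions):

// Keep reading the mic on a background thread instead of doing a single read() call.
new Thread(new Runnable() {
    public void run() {
        rec.startRecording();
        while (running) {                                  // 'running' is a volatile boolean you control
            int read = rec.read(buffer, 0, buffer.length);
            if (read > 0) {
                // hand the first 'read' samples of 'buffer' to your processing code here
            }
        }
        rec.stop();
        rec.release();
    }
}).start();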

android java audio dsp sites or android sound library?

Does anyone know of any useful links for learning audio DSP for Android, or a sound library?
I'm trying to make a basic mixer for playing WAV files but realised I don't know enough about DSP, and I can't find anything at all for Android.
I have a WAV file loaded into a byte array and an AudioTrack on a short loop.
How can I feed the data in?
I expect this post will be ignored, but it's worth a try.
FileInputStream is = new FileInputStream(filePath);
BufferedInputStream bis = new BufferedInputStream(is);
DataInputStream dis = new DataInputStream(bis);

int i = 0;
while (dis.available() > 0) {
    byteData[i] = dis.readByte();
    i++;
}

final int minSize = AudioTrack.getMinBufferSize(44100, AudioFormat.CHANNEL_CONFIGURATION_STEREO,
        AudioFormat.ENCODING_PCM_16BIT);
track = new AudioTrack(AudioManager.STREAM_MUSIC, 44100, AudioFormat.CHANNEL_CONFIGURATION_STEREO,
        AudioFormat.ENCODING_PCM_16BIT, minSize, AudioTrack.MODE_STREAM);
track.play();

bRun = true;
new Thread(new Runnable() {
    public void run() {
        track.write(byteData, 0, minSize);
    }
}).start();
I'll give this a shot just because I was in your position a few months ago...
If you already have the WAV file's audio samples in a byte array, you simply need to pass the samples to the AudioTrack object (look up the write() methods).
To mix audio together you simply add the samples from each track. For example, add the first sample from track 1 to the first sample from track 2, add the second sample from track 1 to the second sample from track 2, and so on. The end result would ideally be a third array containing the added samples, which you pass to the write() method of your AudioTrack instance.
You must be mindful of clipping here. If your data type is short, then the maximum value allowed is 32767. A simple way to ensure that your added samples do not exceed this limit is to perform the addition and store the result in a variable whose data type is larger than a short (e.g. int) and evaluate the result. If it's greater than 32767, make it 32767 and cast it back to a short.
int result = track1[i] + track2[i];
if (result > 32767) {
    result = 32767;
} else if (result < -32768) {
    result = -32768;
}
mixedAudio[i] = (short) result;
Notice how the snippet above also tests for the minimum range of a short.
Apologies for the lack of formatting here, I'm on my mobile phone on a train :-)
Good luck.
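Putting the pieces together, a sketch of the whole mix loop under the same assumptions (track1, track2, and mixedAudio are illustrative names, and track is the AudioTrack from the question):

// Mix two equal-length 16-bit PCM tracks sample by sample, clipping to the short range.
short[] mixedAudio = new short[track1.length];
for (int i = 0; i < mixedAudio.length; i++) {
    int result = track1[i] + track2[i];
    if (result > 32767) result = 32767;            // Short.MAX_VALUE
    else if (result < -32768) result = -32768;     // Short.MIN_VALUE
    mixedAudio[i] = (short) result;
}
track.write(mixedAudio, 0, mixedAudio.length);     // AudioTrack has a write(short[], int, int) overload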

Have you tried Class AudioTrack in Android with a buffer size which is a power of two?

I'm working on an application for Android with the AudioTrack class, and sometimes I get the exception "Invalid audio buffer size". Since I'm planning to use an FFT, I make the buffer size a power of two, and since then I sometimes get this exception. Any ideas why?
Thanks,
Daniel
My code is very straightforward:
private void playTrack(short[] buffer) {
    try {
        Log.i(TAG, "Play track, buffer size: " + buffer.length);
        AudioTrack audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC,
                mAudioIn.getFrequency(),
                mAudioIn.getChannelConfig(),
                mAudioIn.getAudioEncoding(),
                buffer.length,
                AudioTrack.MODE_STREAM);
        audioTrack.play();
        audioTrack.write(buffer, 0, buffer.length);
    } catch (Throwable t) {
        Log.e(TAG, "Play track, something's wrong: " + t.getMessage() + " when buffer size is: " + buffer.length);
    }
}
I just realized you can't use an odd number as the buffer size. Try taking your size and doing something like this:
if (size % 2 == 1)
    size++;
This will make an odd number even.
Try calling something like
_audioTrackSize = android.media.AudioTrack.getMinBufferSize(
        audioSampleRate, AudioFormat.CHANNEL_CONFIGURATION_STEREO,
        AudioFormat.ENCODING_PCM_16BIT);
and then round the returned minimum size up to a power of two.
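For instance, a small helper for that rounding step (the helper itself is my own sketch, not from the original answer):

// Round a minimum buffer size up to the next power of two, e.g. 3528 -> 4096.
static int nextPowerOfTwo(int n) {
    int p = 1;
    while (p < n) {
        p <<= 1;
    }
    return p;
}

int bufferSize = nextPowerOfTwo(_audioTrackSize);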

Using AudioTrack in Android to play a WAV file

I'm working with Android, trying to make my AudioTrack application play a Windows .wav file (Tada.wav). Frankly, it shouldn't be this hard, but I'm hearing a lot of strange stuff. The file is saved on my phone's mini SD card and reading the contents doesn't seem to be a problem, but when I play the file (with parameters I'm only PRETTY SURE are right), I get a few seconds of white noise before the sound seems to resolve itself into something that just may be right.
I have successfully recorded and played my own voice back on the phone -- I created a .pcm file according to the directions in this example:
http://emeadev.blogspot.com/2009/09/raw-audio-manipulation-in-android.html
(without the backwards masking)...
Anybody got some suggestions or awareness of an example on the web for playing a .wav file on an Android??
Thanks,
R.
I stumbled on the answer (frankly, by trying &^#! I didn't think would work), in case anybody's interested... In my original code (which is derived from the example in the link in the original post), the data is read from the file like so:
InputStream is = new FileInputStream(file);
BufferedInputStream bis = new BufferedInputStream(is, 8000);
DataInputStream dis = new DataInputStream(bis);   // Create a DataInputStream to read the audio data from the saved file

int i = 0;                                        // Read the file into the "music" array
while (dis.available() > 0) {
    music[i] = dis.readShort();                   // This assignment does not reverse the order
    i++;
}
dis.close();                                      // Close the input stream
In this version, music[] is an array of shorts. So the readShort() method would seem to make sense here, since the data is 16-bit PCM... However, on Android that seems to be the problem. I changed that code to the following:
music = new byte[(int) file.length()];            // size the array to the length of the file
InputStream is = new FileInputStream(file);
BufferedInputStream bis = new BufferedInputStream(is, 8000);
DataInputStream dis = new DataInputStream(bis);   // Create a DataInputStream to read the audio data from the saved file

int i = 0;                                        // Read the file into the "music" array
while (dis.available() > 0) {
    music[i] = dis.readByte();                    // This assignment does not reverse the order
    i++;
}
dis.close();                                      // Close the input stream
In this version, music[] is an array of BYTES. I'm still telling the AudioTrack that it's 16-bit PCM data, and my Android doesn't seem to have a problem with writing an array of bytes into an AudioTrack thus configured... Anyway, it finally sounds right, so if anyone else wants to play Windows sounds on their Android, for some reason, that's the solution. Ah, Endianness......
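For what it's worth, a small sketch of reading the same 16-bit data into shorts while honoring the file's little-endian byte order (this uses java.nio.ByteBuffer and is not part of the original code; DataInputStream.readShort() assumes big-endian, which is why the shorts came out byte-swapped):

// Read the raw bytes, then reinterpret them as little-endian shorts.
byte[] raw = new byte[(int) file.length()];
DataInputStream dis = new DataInputStream(new BufferedInputStream(new FileInputStream(file)));
dis.readFully(raw);
dis.close();

ShortBuffer sb = ByteBuffer.wrap(raw).order(ByteOrder.LITTLE_ENDIAN).asShortBuffer();
short[] music = new short[sb.remaining()];
sb.get(music);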
R.
I found a lot of long answers to this question. My final solution, which given all the cutting and pasting is hardly mine, comes down to:
public boolean play() {
    int i = 0;
    byte[] music = null;
    InputStream is = mContext.getResources().openRawResource(R.raw.noise);

    at = new AudioTrack(AudioManager.STREAM_MUSIC, 44100,
            AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT,
            minBufferSize, AudioTrack.MODE_STREAM);
    try {
        music = new byte[512];
        at.play();
        while ((i = is.read(music)) != -1)
            at.write(music, 0, i);
    } catch (IOException e) {
        e.printStackTrace();
    }
    at.stop();
    at.release();
    return STOPPED;
}
STOPPED is just a "true" sent back as a signal to reset the pause/play button.
And in the class initializer:
public Mp3Track(Context context) {
    mContext = context;
    minBufferSize = AudioTrack.getMinBufferSize(44100,
            AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT);
}
Context is just "this" from the calling activity.
You can use a FileInputStream for files on the SD card, etc.; my files are in res/raw. A FileInputStream variant is sketched below.
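For example, under the assumption that a raw PCM file named noise.pcm sits on external storage (the path and file name are hypothetical):

// Same playback loop, but sourcing the bytes from a file instead of res/raw.
InputStream is = new FileInputStream(
        new File(Environment.getExternalStorageDirectory(), "noise.pcm"));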
Are you skipping the first 44 bytes of the file before you dump the rest of the file's data into the buffer? The first 44 bytes are the WAVE header, and they would sound like random noise if you tried to play them.
Also, are you sure you are creating the AudioTrack with the same properties as the WAVE you are trying to play (sample rate, bit rate, number of channels, etc.)? Windows actually does a good job of giving this information to you in the File Properties page.
As Aaron C said, you have to skip the initial 44 bytes or (as I prefer) read the first 44 bytes, which are the WAVE header. That way you know how many channels, bits per sample, length, etc. the WAVE contains.
Here you can find a good implementation of a WAVE header parser/writer.
Please don't perpetuate terrible parsing code. WAV parsing is trivial to implement:
http://soundfile.sapp.org/doc/WaveFormat/
and you will thank yourself by being able to parse things such as the sampling rate, bit depth, and number of channels.
Also, x86 and ARM (at least by default) are both little-endian, so native-endian WAV files should be fine without any shuffling.
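As a sketch of how little code that takes, assuming the canonical 44-byte PCM header described at the link above (the helper name and return shape are mine):

// Minimal reader for the standard 44-byte PCM WAV header (little-endian fields).
static int[] readWavHeader(DataInputStream dis) throws IOException {
    byte[] header = new byte[44];
    dis.readFully(header);
    ByteBuffer bb = ByteBuffer.wrap(header).order(ByteOrder.LITTLE_ENDIAN);
    int channels      = bb.getShort(22);   // NumChannels
    int sampleRate    = bb.getInt(24);     // SampleRate
    int bitsPerSample = bb.getShort(34);   // BitsPerSample
    int dataSize      = bb.getInt(40);     // Subchunk2Size: size of the PCM data that follows
    return new int[] { channels, sampleRate, bitsPerSample, dataSize };
}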
Just confirm that you have AudioTrack.MODE_STREAM and not AudioTrack.MODE_STATIC in the AudioTrack constructor (note also that AudioTrack takes the CHANNEL_OUT_* constants, not CHANNEL_IN_*):
AudioTrack at = new AudioTrack(
        AudioManager.STREAM_MUSIC,
        sampleRate,
        AudioFormat.CHANNEL_OUT_STEREO,
        AudioFormat.ENCODING_PCM_16BIT,
        outputBufferSize,          // buffer length in bytes
        AudioTrack.MODE_STREAM
);
Sample wav file:
http://www.mauvecloud.net/sounds/pcm1644m.wav
Sample Code:
public class AudioTrackPlayer {
    Context mContext;
    int minBufferSize;
    AudioTrack at;
    boolean STOPPED;

    public AudioTrackPlayer(Context context) {
        Log.d("------", "init");
        mContext = context;
        minBufferSize = AudioTrack.getMinBufferSize(44100,
                AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT);
    }

    public boolean play() {
        Log.d("------", "play");
        int i = 0;
        byte[] music = null;
        InputStream is = mContext.getResources().openRawResource(R.raw.pcm1644m);

        at = new AudioTrack(AudioManager.STREAM_MUSIC, 44100,
                AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT,
                minBufferSize, AudioTrack.MODE_STREAM);
        try {
            music = new byte[512];
            at.play();
            while ((i = is.read(music)) != -1)
                at.write(music, 0, i);
        } catch (IOException e) {
            e.printStackTrace();
        }
        at.stop();
        at.release();
        return STOPPED;
    }
}
