I'm developing an Android app that has to record sound and display on the screen some values that represent the frequency or the intensity of the sound.
For the recording I use this piece of code:
mRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
mRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
mRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
mRecorder.setOutputFile(name);
mRecorder.prepare();
mRecorder.start();
Then I was initially trying to convert the stored file into bytes like this:
DataInputStream dis1 = new DataInputStream ( new FileInputStream (name));
byte[] datainBytes1 = new byte[dis1.available()];
dis1.readFully(datainBytes1);
dis1.close();
But I want to convert those byte values into shorts or floats so I can display them using a drawing method:
canvas.drawLine(xini,yini,xfinal,yfinal,paint)
Could you recommend another way to convert the audio file into short values that I could draw?
Thank you very much for your help!!
I think the way you are trying to display the recorded sound is not correct. As a starting point you need to study audio encoding formats and have a look at the following Android class: http://developer.android.com/reference/android/media/AudioTrack.html
Here you will find a library class for converting between floats and bytes, taking into account signed/unsigned, big/little endian, and sample size:
https://java.net/projects/gervill/sources/Mercurial/content/src/com/sun/media/sound/AudioFloatConverter.java
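If your input is raw 16-bit little-endian PCM (which is what AudioRecord produces by default), the conversion can also be done with a plain ByteBuffer. A minimal sketch; note that the 3GP/AMR file written by your MediaRecorder setup is compressed, so it would have to be decoded to PCM first for this to make sense:
// Assumes datainBytes1 holds raw 16-bit little-endian PCM samples,
// NOT the compressed AMR/3GP bytes your recorder actually writes.
short[] samples = new short[datainBytes1.length / 2];
ByteBuffer.wrap(datainBytes1)
          .order(ByteOrder.LITTLE_ENDIAN)
          .asShortBuffer()
          .get(samples);
// Each sample can then be scaled to a y coordinate and drawn, e.g.:
// float y = yCenter - (samples[i] / 32768f) * halfHeight;
// canvas.drawLine(x, yCenter, x, y, paint);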
Related
I am trying to get the audio input stream from a file in the local file system on an Android device.
This is so that I can use this library to show a waveform for the audio file.
https://github.com/newventuresoftware/WaveformControl/blob/master/app/src/main/java/com/newventuresoftware/waveformdemo/MainActivity.java#L125
The example in the project uses a raw resource, like so:
InputStream is = getResources().openRawResource(R.raw.jinglebells);
This input stream is later converted into byte array and passed to somewhere that uses it to paint a wave picture and sound.
However, when I did
InputStream is = new FileInputStream(new File(filePath));
it did not seem to work properly: the image generated is wrong, and the sound played is nothing like what the file actually is.
This is the body of the function in that library that gets the input stream and converts it into a short array:
private short[] getAudioSample() throws IOException {
    // If I replace this with new FileInputStream(new File(filePath)),
    // the "samples" generated from it do not work properly with the library.
    InputStream is = getResources().openRawResource(R.raw.jinglebells);
    byte[] data;
    try {
        data = IOUtils.toByteArray(is);
    } finally {
        if (is != null) {
            is.close();
        }
    }

    ShortBuffer sb = ByteBuffer.wrap(data).order(ByteOrder.LITTLE_ENDIAN).asShortBuffer();
    short[] samples = new short[sb.limit()];
    sb.get(samples);
    return samples;
}
The sound file that I would like to have processed and passed to that library is created by a MediaRecorder with the following configuration:
MediaRecorder recorder = new MediaRecorder();
recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
recorder.setOutputFormat(MediaRecorder.OutputFormat.DEFAULT);
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
Basically, that library requires PCM samples; feeding it the 3gp FileInputStream generated by MediaRecorder directly is not going to cut it.
What I did was use MediaExtractor and MediaCodec to convert the 3gp audio data into PCM, sample by sample, and then feed that into the library. Then everything worked =)
The logic for decoding the audio data can be almost directly taken from this awesome GitHub repo.
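For reference, here is a minimal sketch of that approach (the file path, timeouts, and method name are illustrative; it uses the API 21+ buffer accessors of android.media.MediaCodec, MediaExtractor and MediaFormat): it pulls compressed samples out of the 3gp file with MediaExtractor, pushes them through the AMR decoder obtained from MediaCodec, and collects the 16-bit PCM output.
// Sketch: decode a 3gp/AMR-NB file to raw 16-bit little-endian PCM bytes.
private byte[] decodeToPcm(String filePath) throws IOException {
    MediaExtractor extractor = new MediaExtractor();
    extractor.setDataSource(filePath);
    MediaFormat format = extractor.getTrackFormat(0); // assuming track 0 is the audio track
    extractor.selectTrack(0);

    MediaCodec codec = MediaCodec.createDecoderByType(
            format.getString(MediaFormat.KEY_MIME)); // "audio/3gpp" for AMR-NB
    codec.configure(format, null, null, 0);
    codec.start();

    ByteArrayOutputStream pcm = new ByteArrayOutputStream();
    MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
    boolean inputDone = false, outputDone = false;
    while (!outputDone) {
        if (!inputDone) {
            int inIndex = codec.dequeueInputBuffer(10000);
            if (inIndex >= 0) {
                int size = extractor.readSampleData(codec.getInputBuffer(inIndex), 0);
                if (size < 0) {
                    codec.queueInputBuffer(inIndex, 0, 0, 0,
                            MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                    inputDone = true;
                } else {
                    codec.queueInputBuffer(inIndex, 0, size, extractor.getSampleTime(), 0);
                    extractor.advance();
                }
            }
        }
        int outIndex = codec.dequeueOutputBuffer(info, 10000);
        if (outIndex >= 0) {
            ByteBuffer outBuf = codec.getOutputBuffer(outIndex);
            byte[] chunk = new byte[info.size];
            outBuf.get(chunk);
            pcm.write(chunk, 0, chunk.length);
            codec.releaseOutputBuffer(outIndex, false);
            if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                outputDone = true;
            }
        }
    }
    codec.stop();
    codec.release();
    extractor.release();
    return pcm.toByteArray(); // ready for ByteBuffer.wrap(...).asShortBuffer()
}
The resulting byte array can then be handed to the getAudioSample() conversion above in place of the raw-resource bytes.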
I have followed this example to convert raw audio data coming from AudioRecord to mp3, and it worked: if I store the data in an mp3 file and play it with a music player, it is audible.
Now, instead of storing the mp3 data in a file, I need to play it with AudioTrack. The data is coming from a Red5 media server as a live stream, but the problem is that AudioTrack can only play PCM data, so all I hear from my data is noise.
Now I am using JLayer for this task.
My code is as follows.
int readresult = recorder.read(audioData, 0, recorderBufSize);
int encResult = SimpleLame.encode(audioData,audioData, readresult, mp3buffer);
and this mp3buffer data is sent to the other user via a Red5 stream.
The data received by the other user arrives as a stream, so to play it the code is:
Bitstream bitstream = new Bitstream(data.read());
Decoder decoder = new Decoder();
Header frameHeader = bitstream.readFrame();
SampleBuffer output = (SampleBuffer) decoder.decodeFrame(frameHeader, bitstream);
short[] pcm = output.getBuffer();
player.write(pcm, 0, pcm.length);
But my code freezes at bitstream.readFrame() after 2-3 seconds, and no sound is produced before that.
Any guess what the problem might be? Any suggestion is appreciated.
Note: I don't need to store the mp3 data, so I can't use MediaPlayer, as it requires a file or file descriptor.
Just a tip, but try calling
output.close();
bitstream.closeFrame();
after your write code. I'm processing MP3 the same way you do, but I close the buffers after use and I have no problem.
Second tip: do it in a Thread or some other background process. As for the silent 2 seconds you mentioned, the media player may be waiting until you have processed the whole stream, because you are loading it on the same thread.
Try both tips (you should anyway). For the first, the problem could be in the internal buffers; for the second, you have probably filled the media input buffer and locked the app (on the same thread, a full buffer cannot receive your input, and the code that plays it and releases that buffer is never invoked because the write blocks it...).
Also, if you aren't doing it already, check for frameHeader == null to detect the end of the file.
Good luck.
You need to loop through the frames like this:
Header frameHeader;
while ((frameHeader = bitstream.readFrame()) != null) {
    SampleBuffer output = (SampleBuffer) decoder.decodeFrame(frameHeader, bitstream);
    short[] pcm = output.getBuffer();
    player.write(pcm, 0, pcm.length);
    bitstream.closeFrame();
}
bitstream.close();
And make sure you are not running this on the main thread (this is probably the reason for the freezing).
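A minimal sketch of moving that loop off the main thread (decodeAndPlay() is a hypothetical method wrapping the loop above):
// Run the decode/playback loop on a background thread so that
// readFrame() and write() never block the UI thread.
new Thread(new Runnable() {
    @Override
    public void run() {
        try {
            decodeAndPlay(); // hypothetical wrapper around the loop above
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}).start();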
I've found lots of tutorials and posts showing how to use AudioTrack to play wav files in AudioTrack.MODE_STREAM, and I've successfully implemented this example.
However, I'm having performance issues when playing multiple audio tracks at once, and I'm thinking that I should first create the tracks using AudioTrack.MODE_STATIC and then just call play() each time.
I can't find any resources on how to implement this. How can I do this?
Thanks
The two main sticking points for me were realizing that write() comes first and that the instantiated player must have the size of the entire clip as its bufferSizeInBytes.
Assuming you have recorded a PCM file using AudioRecord, you can play it back in MODE_STATIC like so...
File file = new File(FILENAME);
int audioLength = (int) file.length();
byte[] filedata = new byte[audioLength];
try {
    InputStream inputStream = new BufferedInputStream(new FileInputStream(FILENAME));
    int lengthOfAudioClip = inputStream.read(filedata, 0, audioLength);
    inputStream.close();
    player = new AudioTrack(STREAM_TYPE, SAMPLE_RATE, CHANNEL_OUT_CONFIG,
            AUDIO_FORMAT, audioLength, AUDIO_MODE);
    player.write(filedata, OFFSET, lengthOfAudioClip);
    player.setPlaybackRate(playbackRate);
    player.play();
} catch (IOException e) {
    e.printStackTrace();
}
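For concreteness, one plausible set of values for the constants above, assuming mono 16-bit PCM recorded at 44.1 kHz (these are illustrative and must match how the clip was actually recorded):
// Illustrative values only; they must match the format of the recorded PCM.
private static final int STREAM_TYPE        = AudioManager.STREAM_MUSIC;
private static final int SAMPLE_RATE        = 44100;
private static final int CHANNEL_OUT_CONFIG = AudioFormat.CHANNEL_OUT_MONO;
private static final int AUDIO_FORMAT       = AudioFormat.ENCODING_PCM_16BIT;
private static final int AUDIO_MODE         = AudioTrack.MODE_STATIC;
private static final int OFFSET             = 0;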
My task is to record audio for a long time and send three-second files to a server for text recognition (I use Google for that). If I write directly to a file, I need to reinitialize my recorder to start each new file, and starting and stopping the MediaRecorder takes a lot of time. That's why I decided to use ParcelFileDescriptor.createPipe(). This is how I create and initialize my recorder:
ParcelFileDescriptor[] fdPair = ParcelFileDescriptor.createPipe();
ParcelFileDescriptor readFD = fdPair[0];
ParcelFileDescriptor writeFD = fdPair[1];
mediaRecorder = new MediaRecorder();
mediaRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
mediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.AMR_NB);
mediaRecorder.setAudioSamplingRate(8000);
mediaRecorder.setOutputFile(writeFD.getFileDescriptor());
mediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
Then I create an input stream to read the recorded data:
final ParcelFileDescriptor.AutoCloseInputStream reader = new ParcelFileDescriptor.AutoCloseInputStream(readFD);
Data read from this reader is written to a FileOutputStream.
Every 3 seconds I create a new file and a new FileOutputStream, and send the data from the previous file to the server.
I'm reading bytes according to the AMR format (the file header, then each frame's header byte, then the whole frame), as sketched below.
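For reference, a sketch of that frame-by-frame reading, assuming the standard AMR-NB storage format (RFC 4867): a 6-byte "#!AMR\n" file header, then frames whose total size is determined by the frame-type bits in each frame's first byte. Each 3-second chunk sent to the server needs its own "#!AMR\n" header, and since every AMR-NB frame covers 20 ms, 3 seconds is exactly 150 frames.
// Total AMR-NB frame sizes in bytes (header byte included), indexed by
// frame type = bits 3..6 of the header byte; type 8 is a SID frame.
private static final int[] AMR_FRAME_SIZES =
        {13, 14, 16, 18, 20, 21, 27, 32, 6, 1, 1, 1, 1, 1, 1, 1};
private static final byte[] AMR_HEADER = "#!AMR\n".getBytes();

// Reads one complete AMR frame from the pipe; returns null at end of stream.
private byte[] readAmrFrame(InputStream in) throws IOException {
    int headerByte = in.read();
    if (headerByte < 0) return null;
    int frameType = (headerByte >> 3) & 0x0F;
    byte[] frame = new byte[AMR_FRAME_SIZES[frameType]];
    frame[0] = (byte) headerByte;
    int offset = 1;
    while (offset < frame.length) {
        int read = in.read(frame, offset, frame.length - offset);
        if (read < 0) throw new EOFException("Truncated AMR frame");
        offset += read;
    }
    return frame;
}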
On a Nexus One running Android 2.3.3 it works well, but on the Galaxy S III, Galaxy Nexus and LG P990 it writes something that the device's player can't read and Google can't recognize. The Google recognizer returns something like “ok ok” or “######” (it looks like the files contain some kind of noise).
Does anybody know why this might not work the same way on different phones/platforms?
Thanks,
P.S. I can't use AudioRecord – Google needs AMR (or Speex, or FLAC).
I am developing a low data rate VoIP kind of project. I need to capture audio at low data rates and store it in an internal buffer or FIFO (NOT in a file).
I would like to use low data rate AMR encoders, which means AudioRecord is out. MediaRecorder looks like it does exactly what I want, except that it seems to write to a file.
MediaRecorder takes a FileDescriptor... is there any way I can write a class that implements the FileDescriptor interface, acting as a sink for bytes, but storing them in a buffer instead of sending them to a file? The documentation on FileDescriptor specifically says that applications shouldn't create their own, but why not, and is it possible anyway?
http://docs.oracle.com/javase/1.4.2/docs/api/java/io/FileDescriptor.html
In short, I'd like to develop my own stream and trick MediaRecorder into sending data to it. Perhaps by doing something tricky such as opening both ends of a socket within the same APK and giving MediaRecorder the socket to write to, using the socket as my FIFO? I'm somewhat new to this, so any help/suggestions are greatly appreciated.
I have a related question on the RX side. I'd like to have a buffer/FIFO that feeds MediaPlayer. Can I trick MediaPlayer into accepting data from a buffer fed by my own proprietary stream?
I know it's a bit late to answer this question now...
...but if it helps, here's the solution.
Android MediaRecorder's setOutputFile() method accepts a FileDescriptor as a parameter.
For your need, a Unix data pipe can be created and its FD passed as an argument in the following manner...
mediaRecorder.setOutputFile(getPipeFD());
FileDescriptor getPipeFD()
{
    final String FUNCTION = "getPipeFD";
    FileDescriptor outputPipe = null;
    try
    {
        ParcelFileDescriptor[] pipe = ParcelFileDescriptor.createPipe();
        outputPipe = pipe[1].getFileDescriptor();
    }
    catch(Exception e)
    {
        Log.e(TAG, FUNCTION + " : " + e.getMessage());
    }
    return outputPipe;
}
ParcelFileDescriptor.createPipe() creates a Unix data pipe and returns an array of two ParcelFileDescriptors. The first refers to the read channel (source) and the second to the write channel (sink) of the pipe. Use the MediaRecorder object to write the recorded data to the write channel...
As far as MediaPlayer is concerned, the same technique can be used by passing the FileDescriptor for the pipe's read channel to its setDataSource() method...
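A sketch of how the two halves could fit together on the recording side, assuming the read channel (pipe[0]) is kept alive rather than discarded (if nothing drains the read side, the recorder blocks once the pipe's internal buffer fills):
// Inside a method that declares throws IOException.
// Keep both ends of the pipe; drain the read side on a background thread.
ParcelFileDescriptor[] pipe = ParcelFileDescriptor.createPipe();
final ParcelFileDescriptor readSide = pipe[0];  // source channel
ParcelFileDescriptor writeSide = pipe[1];       // sink channel
mediaRecorder.setOutputFile(writeSide.getFileDescriptor());
new Thread(new Runnable() {
    @Override
    public void run() {
        InputStream in = new ParcelFileDescriptor.AutoCloseInputStream(readSide);
        byte[] buffer = new byte[4096];
        try {
            int read;
            while ((read = in.read(buffer)) != -1) {
                // Hand the bytes to your own FIFO / network code here.
            }
            in.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}).start();
One caveat on the MediaPlayer side: setDataSource(FileDescriptor) exists, but many containers (e.g. MP4/3GP) require a seekable descriptor, so feeding MediaPlayer from a pipe tends to work only for formats that can be parsed as a pure stream.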