I want to create an app that records sound into a byte stream so it can be listened to again. It will not save the recorded audio into a file; rather, it will keep it in memory, and there will be a play option by which the recording can be played back. The recording behavior should be like Talking Tom: record the audio and play it back instantly.
Can anyone provide me sample code for these functions?
If you don't want to save the audio into a file, you can use the AudioRecord class, which lets you read the recorded audio into a byte array.
An easier option to code and use would be MediaRecorder, although it comes with a loss in performance, as it saves the audio into a file every time.
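A minimal sketch of the AudioRecord route, assuming 44.1 kHz mono 16-bit PCM and that the RECORD_AUDIO permission has been granted. The class and method names here are made up for illustration, and `record()` should run on a background thread since it blocks:

```java
import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioRecord;
import android.media.AudioTrack;
import android.media.MediaRecorder;

import java.io.ByteArrayOutputStream;

public class InstantReplayer {
    private static final int SAMPLE_RATE = 44100; // assumed; 44.1 kHz works on all devices

    private volatile boolean recording;
    private byte[] recordedAudio;

    // Read mic input into an in-memory byte array until stopRecording() is called.
    public void record() {
        int minBuf = AudioRecord.getMinBufferSize(SAMPLE_RATE,
                AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
        AudioRecord recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
                SAMPLE_RATE, AudioFormat.CHANNEL_IN_MONO,
                AudioFormat.ENCODING_PCM_16BIT, minBuf * 2);
        ByteArrayOutputStream pcm = new ByteArrayOutputStream();
        byte[] buf = new byte[minBuf];
        recording = true;
        recorder.startRecording();
        while (recording) {
            int n = recorder.read(buf, 0, buf.length);
            if (n > 0) pcm.write(buf, 0, n);
        }
        recorder.stop();
        recorder.release();
        recordedAudio = pcm.toByteArray();
    }

    public void stopRecording() { recording = false; }

    // Play the captured bytes straight back with AudioTrack -- no file involved.
    public void play() {
        AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, SAMPLE_RATE,
                AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
                recordedAudio.length, AudioTrack.MODE_STATIC);
        track.write(recordedAudio, 0, recordedAudio.length);
        track.play();
    }
}
```

In MODE_STATIC the whole buffer is written once before play(), which suits short Talking-Tom-style clips; for long recordings you would use MODE_STREAM instead to avoid holding everything in one buffer.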
I need to record audio on Android that I later want to encrypt, so I'm working with the AudioRecord class, since it works with the audio at a low level, using the bytes directly.
I found a piece of code that works with shorts and then converts them into bytes, which is what I want. But once I have created the audio, I cannot play it with any audio player on the phone.
What should I have to do in order for the phone to recognize it as a valid audio file?
Please forgive me because I don't remember all the details, but I had this issue before, and I do remember that the audio recorded by AudioRecord has no format. In order to make it playable, you first need to give it a format, specifying all of the characteristics you set up when initializing your AudioRecord instance (such as sample rate, number of channels, etc.). I found an example of how to record audio using AudioRecord and then write it out in WAV format: https://selvaline.blogspot.com/2016/04/record-audio-wav-format-android-how-to.html I hope it helps.
I am working on a small Android application. Within it I have a requirement for audio recording with pause and resume options. I searched a lot on Google, but I couldn't find anything related to it.
Somebody said that there is no default pause and resume functionality in Android.
If you want it, "you need to save the recorded audio file when the pause button is clicked and store it on your SD card, then start a new recording when the resume button is clicked. Finally, if you want to upload, you need to merge all the recorded files into one file and then upload it".
I followed this suggestion, but there is no code available in Java/Android for merging audio files in the m4a format.
Can anyone advise me on this?
Yes, that's correct: the Android audio recorder doesn't support pause. You need to stop and start it,
and merge all the files together to create one recording.
You need to use the AudioRecord class; a good example is here:
Android : recording audio using audiorecord class play as fast forwarded
And you need to merge them as demonstrated here:
Merging pcm audio files
Hope this helps.
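For the raw PCM case, merging really is just concatenation, because raw PCM has no header. A minimal sketch, assuming all segments were recorded with identical sample rate, channel count, and sample format (the class name is made up for illustration):

```java
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;

public class PcmMerger {
    // Raw PCM carries no header, so segments that share the same sample
    // rate, channel count, and sample format merge by plain concatenation.
    public static void merge(String[] inputPaths, String outputPath) throws IOException {
        try (FileOutputStream out = new FileOutputStream(outputPath)) {
            byte[] buf = new byte[8192];
            for (String path : inputPaths) {
                try (FileInputStream in = new FileInputStream(path)) {
                    int n;
                    while ((n = in.read(buf)) != -1) {
                        out.write(buf, 0, n);
                    }
                }
            }
        }
    }
}
```

Note this only works for headerless formats like raw PCM; m4a files each carry their own container metadata, which is why simple concatenation fails for them.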
I want to stream an audio MP3 file and then play it through the Android media player, and I also want to cache this file so that the MediaPlayer doesn't have to stream recently played tracks again.
I have tried using the prepareAsync method, but it doesn't give me access to the buffer content, so I decided to stream the audio file myself and then pass it to the media player for playing. I achieved this by following this article here, but this approach has a problem: while transferring the file to the media player, it goes into an error state, which causes my player to behave inconsistently.
When the media player enters its error state, it doesn't come out of it automatically, so I am forced to create a new media player and re-provide it with the downloaded file. This workaround causes the user to experience an undesired pause in the song.
So, does anyone have an improved version of the code given in the above link? Or do they know a better solution to this problem, or is there actually a library for streaming an audio file on Android?
Thanks
The link you provided looks like a less than ideal solution (not to mention outdated). What you probably want is a local proxy server that gives you access to byte data before the MediaPlayer gets it. See my answer here for a little more explanation.
I'm developing an audio processing application where I need to record audio and then process it to obtain features of that recording. However, I also want the audio in a playable format so I can play it afterwards with MediaPlayer.
I've seen that to record audio for processing it's better to use AudioRecord, because I can get the raw audio from it. But then I can't write the data to a file in a playable format (is there any library to do this on Android?).
I used this method to record raw data and then write it into a file:
http://andrewbrobinson.com/2011/11/27/capturing-raw-audio-data-in-android/
But when I try to play this file on the device, it is not playable.
If I use MediaRecorder instead, I don't know how to decode the data to extract the features. I've been looking at MediaExtractor, but it seems that MediaExtractor doesn't decode the frames.
So, what's the best way to do this? I imagine this is common in any audio processing application, but I wasn't able to find a way to manage it.
Thanks for your replies.
Using AudioRecord is the right way to go if you need to do any kind of processing. To play it back, you have a couple options. If you're only going to be playing it back in your app, you can use AudioTrack instead of MediaPlayer to play raw PCM streams.
If you want it to be playable with other applications, you'll need to convert it to something else first. WAV is normally the simplest, since you just need to add the header. You can also find libraries for converting to other formats, like JOrbis for OGG, or JLayer for MP3, etc.
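For the in-app playback route, here is a minimal sketch of streaming a raw PCM file through AudioTrack. The 44.1 kHz mono 16-bit parameters are assumptions and must match whatever was passed to AudioRecord when the file was captured:

```java
import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;

import java.io.FileInputStream;
import java.io.IOException;

public class RawPcmPlayer {
    // Assumed capture parameters -- they must match the AudioRecord setup
    // used when the file was recorded, or playback will be pitch-shifted noise.
    private static final int SAMPLE_RATE = 44100;

    public static void play(String pcmPath) throws IOException {
        int minBuf = AudioTrack.getMinBufferSize(SAMPLE_RATE,
                AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);
        AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, SAMPLE_RATE,
                AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
                minBuf, AudioTrack.MODE_STREAM);
        track.play();
        try (FileInputStream in = new FileInputStream(pcmPath)) {
            byte[] buf = new byte[minBuf];
            int n;
            while ((n = in.read(buf)) != -1) {
                track.write(buf, 0, n); // blocks until the buffer drains
            }
        }
        track.stop();
        track.release();
    }
}
```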
For the best quality you should use the AudioRecord class instead of MediaRecorder.
Please have a look at the link below:
Need a simple example for audio recording
Also have a look at this link: http://i-liger.com/article/android-wav-audio-recording
If you use an AudioRecord object to get the raw audio signal, the next step, saving it as a playable file, is not so difficult: you just need to prepend a WAV header to the audio data you capture, and you get a .wav file that you can play on most mobile phones.
A .wav file is a file in the RIFF format. The RIFF header for a WAV file is 44 bytes long and contains the sample rate, sample width, and channel count. You can get the detailed information from here.
I implemented this on Android phones and it worked.
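A sketch of that 44-byte header in plain Java (the class name is made up; the field values must match your AudioRecord configuration):

```java
import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class WavWriter {
    // Build the 44-byte RIFF/WAV header described above. All multi-byte
    // fields are little-endian; values must match the AudioRecord setup.
    public static byte[] buildHeader(int pcmLength, int sampleRate,
                                     int channels, int bitsPerSample) {
        int byteRate = sampleRate * channels * bitsPerSample / 8;
        int blockAlign = channels * bitsPerSample / 8;
        ByteBuffer h = ByteBuffer.allocate(44).order(ByteOrder.LITTLE_ENDIAN);
        h.put("RIFF".getBytes());
        h.putInt(36 + pcmLength);        // total file size minus 8
        h.put("WAVE".getBytes());
        h.put("fmt ".getBytes());
        h.putInt(16);                    // fmt chunk size for PCM
        h.putShort((short) 1);           // audio format 1 = uncompressed PCM
        h.putShort((short) channels);
        h.putInt(sampleRate);
        h.putInt(byteRate);
        h.putShort((short) blockAlign);
        h.putShort((short) bitsPerSample);
        h.put("data".getBytes());
        h.putInt(pcmLength);
        return h.array();
    }

    // Write header + raw PCM so ordinary players recognize the file.
    public static void writeWav(String path, byte[] pcm, int sampleRate,
                                int channels, int bitsPerSample) throws IOException {
        try (FileOutputStream out = new FileOutputStream(path)) {
            out.write(buildHeader(pcm.length, sampleRate, channels, bitsPerSample));
            out.write(pcm);
        }
    }
}
```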
I am writing a simple application that needs to record incoming sound.
That means that when the user turns recording on, the application needs to 'listen' to what the user is saying, convert the user's sound into a byte array, and save the byte array to a file (MP3 format).
I can't find any way to capture the sound coming from the user.
Can someone help me with this issue?
Thanks for any help.
Performing Audio Capture
Create a new instance of android.media.MediaRecorder.
Set the audio source using MediaRecorder.setAudioSource(). You will probably want to use MediaRecorder.AudioSource.MIC.
Set output file format using MediaRecorder.setOutputFormat().
Set output file name using MediaRecorder.setOutputFile().
Set the audio encoder using MediaRecorder.setAudioEncoder().
Call MediaRecorder.prepare() on the MediaRecorder instance.
To start audio capture, call MediaRecorder.start().
To stop audio capture, call MediaRecorder.stop().
When you are done with the MediaRecorder instance, call MediaRecorder.release() on it. Calling MediaRecorder.release() is always recommended to free the resource immediately.
Example and source can be found here:
http://developer.android.com/guide/topics/media/audio-capture.html
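The steps above can be sketched as a minimal helper class. The 3GP/AMR output choice and file path are assumptions, and the RECORD_AUDIO permission must be granted:

```java
import android.media.MediaRecorder;

import java.io.IOException;

public class SimpleVoiceRecorder {
    private MediaRecorder recorder;

    // Steps 1-7: create, configure, prepare, and start the recorder.
    public void start(String outputPath) throws IOException {
        recorder = new MediaRecorder();
        recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
        recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
        recorder.setOutputFile(outputPath);
        recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
        recorder.prepare();
        recorder.start();
    }

    // Steps 8-9: stop and release the resource immediately.
    public void stop() {
        recorder.stop();
        recorder.release();
        recorder = null;
    }
}
```

Note that the calls must happen in this order; MediaRecorder is a state machine and throws IllegalStateException if, for example, setAudioEncoder() is called before setOutputFormat().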
You could also use AudioRecord if you need other things and codecs (for example raw 16-bit PCM).
With AudioRecord you get the bytes directly, which you can process however you want (e.g. encoding to MP3 yourself).
Anyhow, I think you should explain what exactly you want to do; e.g. if you want to process the audio for speech recognition, MP3 is not what you want.