Android AudioRecord and MediaRecorder - android

I'm developing an audio processing application where I need to record audio and then process it to extract features from that recording. However, I also want the audio in a playable format so I can play it back afterwards with MediaPlayer.
I've seen that for recording audio you intend to process, it's better to use AudioRecord, because it gives you the raw audio. But then I can't write the data to a file in a playable format (is there any library to do this on Android?).
I used this method to record raw data and then write it into a file:
http://andrewbrobinson.com/2011/11/27/capturing-raw-audio-data-in-android/
But when I try to play this file on the device, it is not playable.
Then if I use MediaRecorder I don't know how to decode the data to extract the features. I've been looking at MediaExtractor, but it seems that MediaExtractor doesn't decode the frames.
So, what's the best way to do this? I imagine this is common in any audio processing application, but I wasn't able to find a way to manage it.
Thanks for your replies.

Using AudioRecord is the right way to go if you need to do any kind of processing. To play it back, you have a couple options. If you're only going to be playing it back in your app, you can use AudioTrack instead of MediaPlayer to play raw PCM streams.
If you want it to be playable with other applications, you'll need to convert it to something else first. WAV is normally the simplest, since you just need to add the header. You can also find libraries for converting to other formats, like JOrbis for OGG, or JLayer for MP3, etc.
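For the AudioTrack route, a minimal playback sketch could look like the following; the 16-bit mono format, the sample rate parameter and the RawPcmPlayer class name are assumptions, and the settings must match whatever you used with AudioRecord:

import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;

import java.io.FileInputStream;
import java.io.IOException;

public class RawPcmPlayer {
    // Plays a raw 16-bit mono PCM file. pcmPath and sampleRate are placeholders;
    // use the same sample rate you recorded with.
    public static void play(String pcmPath, int sampleRate) throws IOException {
        int bufferSize = AudioTrack.getMinBufferSize(
                sampleRate,
                AudioFormat.CHANNEL_OUT_MONO,
                AudioFormat.ENCODING_PCM_16BIT);

        AudioTrack track = new AudioTrack(
                AudioManager.STREAM_MUSIC,
                sampleRate,
                AudioFormat.CHANNEL_OUT_MONO,
                AudioFormat.ENCODING_PCM_16BIT,
                bufferSize,
                AudioTrack.MODE_STREAM);

        track.play();

        byte[] buffer = new byte[bufferSize];
        FileInputStream in = new FileInputStream(pcmPath);
        int read;
        while ((read = in.read(buffer)) > 0) {
            track.write(buffer, 0, read); // blocks until the bytes are queued for playback
        }
        in.close();

        track.stop();
        track.release();
    }
}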

For the best quality results you have to use the AudioRecord class instead of MediaRecorder.
Please have a look at the link below:
Need a simple example for audio recording
Also have a look at this link: http://i-liger.com/article/android-wav-audio-recording
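For reference, here is a rough sketch of the AudioRecord capture loop those links describe; the 44.1 kHz mono 16-bit configuration and the RawRecorder class name are assumptions to adapt, and the RECORD_AUDIO permission is required:

import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;

import java.io.FileOutputStream;
import java.io.IOException;

public class RawRecorder {
    private static final int SAMPLE_RATE = 44100; // assumption: adjust as needed
    private volatile boolean recording = true;

    // Reads raw, headerless PCM from the microphone and writes it straight to disk.
    public void recordTo(String rawPath) throws IOException {
        int bufferSize = AudioRecord.getMinBufferSize(
                SAMPLE_RATE,
                AudioFormat.CHANNEL_IN_MONO,
                AudioFormat.ENCODING_PCM_16BIT);

        AudioRecord recorder = new AudioRecord(
                MediaRecorder.AudioSource.MIC,
                SAMPLE_RATE,
                AudioFormat.CHANNEL_IN_MONO,
                AudioFormat.ENCODING_PCM_16BIT,
                bufferSize);

        byte[] buffer = new byte[bufferSize];
        FileOutputStream out = new FileOutputStream(rawPath);

        recorder.startRecording();
        while (recording) {
            int read = recorder.read(buffer, 0, buffer.length);
            if (read > 0) {
                out.write(buffer, 0, read); // raw PCM, no header yet
            }
        }
        recorder.stop();
        recorder.release();
        out.close();
    }

    public void stop() {
        recording = false;
    }
}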

If you use an AudioRecord object to get the raw audio signal, the next step of saving it as a playable file is not so difficult: you just need to add a WAV header before the audio data you capture, and you get a .wav file which you can play on most mobile phones.
A .wav file is a file in the RIFF format. The RIFF header for a WAV file is 44 bytes long and contains the sample rate, sample width and channel count information; you can get the detailed information from here.
I implemented this on Android phones and it worked.
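As a sketch of that header, assuming 16-bit PCM, the 44 bytes can be assembled like this (the WavHeader helper name is made up for illustration):

import java.io.IOException;
import java.io.OutputStream;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public final class WavHeader {
    // Writes the standard 44-byte RIFF/WAVE header for 16-bit PCM data.
    // pcmLength is the size of the raw audio data in bytes.
    public static void write(OutputStream out, int pcmLength,
                             int sampleRate, short channels) throws IOException {
        short bitsPerSample = 16;
        int byteRate = sampleRate * channels * bitsPerSample / 8;
        short blockAlign = (short) (channels * bitsPerSample / 8);

        ByteBuffer header = ByteBuffer.allocate(44).order(ByteOrder.LITTLE_ENDIAN);
        header.put("RIFF".getBytes("US-ASCII"));
        header.putInt(36 + pcmLength);          // chunk size = total file size - 8
        header.put("WAVE".getBytes("US-ASCII"));
        header.put("fmt ".getBytes("US-ASCII"));
        header.putInt(16);                      // fmt chunk size for PCM
        header.putShort((short) 1);             // audio format 1 = PCM
        header.putShort(channels);
        header.putInt(sampleRate);
        header.putInt(byteRate);
        header.putShort(blockAlign);
        header.putShort(bitsPerSample);
        header.put("data".getBytes("US-ASCII"));
        header.putInt(pcmLength);

        out.write(header.array());              // the raw PCM bytes follow this header
    }
}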

Related

Why can't I play audio recorded with AudioRecord on Android?

I need to record audio on Android that I later want to encrypt, so I'm working with the AudioRecord class, since it handles the audio at a low level, using the bytes directly.
I found a piece of code that works with shorts and then converts them to bytes, which is what I want. But once I have created the audio, I cannot play it with any audio player on the phone.
What do I have to do in order for the phone to recognize it as a valid audio file?
Please forgive me because I really don't remember all the details, but I had this issue before, and I do remember that the audio recorded by AudioRecord has no format, so in order to make it playable you first need to give it one, specifying all of the characteristics that you set up when initializing your AudioRecord instance (such as sample rate, number of channels, etc.). I found an example of how to record audio using AudioRecord and then write it out in WAV format: https://selvaline.blogspot.com/2016/04/record-audio-wav-format-android-how-to.html I hope it helps.
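On the short-to-byte conversion mentioned in the question, one point worth checking is byte order; a little-endian conversion, which is what the WAV data chunk expects for 16-bit PCM, might look like this sketch (PcmUtil is a hypothetical helper name):

import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public final class PcmUtil {
    // Converts 16-bit samples to little-endian bytes, the order expected
    // by the WAV "data" chunk.
    public static byte[] shortsToBytes(short[] samples) {
        ByteBuffer buffer = ByteBuffer.allocate(samples.length * 2)
                .order(ByteOrder.LITTLE_ENDIAN);
        for (short s : samples) {
            buffer.putShort(s);
        }
        return buffer.array();
    }
}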

Android mediacodec: Is it possible to encode audio and video at the same time using mediacodec and muxer?

There is some good documentation on the site called Big Flake about how to use MediaMuxer and MediaCodec to encode and then decode video as MP4, or to extract video and then encode it again, and more.
But there doesn't seem to be a way to encode audio together with video at the same time; there is no documentation or code about this. It doesn't seem impossible.
Question
Do you know of any stable way of doing it that will work on all devices running Android API 18 and above?
Why has nobody implemented it; is it hard to implement?
You have to create two MediaCodec instances, one for video and one for audio, and then use MediaMuxer to mux the video with the audio after encoding. You can take a look at ExtractDecodeEditEncodeMuxTest.java and at this project, which captures camera/mic and saves to an MP4 file using MediaMuxer and MediaCodec.
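As a hedged sketch of the muxing side only, assuming the two encoders are configured and drained elsewhere: the key constraints are that both tracks must be added (using the MediaFormat each codec reports via INFO_OUTPUT_FORMAT_CHANGED) before start(), and that writeSampleData() takes the index returned by addTrack(). The AvMuxerSketch class below is illustrative, not a complete pipeline:

import android.media.MediaCodec;
import android.media.MediaFormat;
import android.media.MediaMuxer;

import java.io.IOException;
import java.nio.ByteBuffer;

public class AvMuxerSketch {
    private final MediaMuxer muxer;
    private int videoTrack = -1;
    private int audioTrack = -1;
    private boolean started = false;

    public AvMuxerSketch(String outputPath) throws IOException {
        muxer = new MediaMuxer(outputPath, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
    }

    // Call these with the MediaFormat each encoder reports when its output format changes.
    public synchronized void setVideoFormat(MediaFormat format) {
        videoTrack = muxer.addTrack(format);
        maybeStart();
    }

    public synchronized void setAudioFormat(MediaFormat format) {
        audioTrack = muxer.addTrack(format);
        maybeStart();
    }

    private void maybeStart() {
        // The muxer can only be started once both tracks are known.
        if (!started && videoTrack >= 0 && audioTrack >= 0) {
            muxer.start();
            started = true;
        }
    }

    // Call with each encoded buffer drained from the corresponding MediaCodec.
    public synchronized void writeVideoSample(ByteBuffer buf, MediaCodec.BufferInfo info) {
        if (started) muxer.writeSampleData(videoTrack, buf, info);
    }

    public synchronized void writeAudioSample(ByteBuffer buf, MediaCodec.BufferInfo info) {
        if (started) muxer.writeSampleData(audioTrack, buf, info);
    }

    public synchronized void finish() {
        if (started) {
            muxer.stop();
        }
        muxer.release();
    }
}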

Recording sound into bytestream in Android

I want to create an app which records a sound into a byte stream so that it can be listened to again. It will not save the recorded audio into a file; rather it will keep it in memory, and there will be a play option by which the recording can be played back. The recording system will be like Talking Tom: record the audio and play it back instantly.
Can anyone provide me with sample code for these functions?
If you don't want to save the audio into a file, you can use the AudioRecord class, which lets you read the recorded audio into a byte array.
An easier option to code and use would be MediaRecorder, although it comes with a loss in performance as it saves the audio into a file every time.
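Here is a minimal sketch of the in-memory approach, assuming 8 kHz mono 16-bit audio and a hypothetical InMemoryRecorder class; record() should run on a background thread, and the RECORD_AUDIO permission is required:

import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioRecord;
import android.media.AudioTrack;
import android.media.MediaRecorder;

import java.io.ByteArrayOutputStream;

public class InMemoryRecorder {
    private static final int SAMPLE_RATE = 8000; // assumption: pick whatever suits you
    private volatile boolean recording;
    private byte[] audioData = new byte[0];

    // Reads microphone audio into a byte array held in memory (no file involved).
    public void record() {
        int bufferSize = AudioRecord.getMinBufferSize(
                SAMPLE_RATE, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
        AudioRecord recorder = new AudioRecord(
                MediaRecorder.AudioSource.MIC, SAMPLE_RATE,
                AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);

        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buffer = new byte[bufferSize];

        recording = true;
        recorder.startRecording();
        while (recording) {
            int read = recorder.read(buffer, 0, buffer.length);
            if (read > 0) out.write(buffer, 0, read);
        }
        recorder.stop();
        recorder.release();
        audioData = out.toByteArray();
    }

    public void stopRecording() {
        recording = false;
    }

    // Plays the buffered bytes straight back through AudioTrack (static mode:
    // write the whole buffer first, then play).
    public void play() {
        AudioTrack track = new AudioTrack(
                AudioManager.STREAM_MUSIC, SAMPLE_RATE,
                AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
                audioData.length, AudioTrack.MODE_STATIC);
        track.write(audioData, 0, audioData.length);
        track.play();
    }
}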

How to get the sound of the phone user speaking and convert it to a byte array?

I'm writing a simple application that needs to record the incoming sound.
That means that when the user turns recording on, the application needs to 'listen' to what the user is saying, convert the user's sound to a byte array, and save the byte array to some file (MP3 format).
I can't find any way to get the sound coming from the user.
Can someone help me with this issue?
Thanks for any help.
Performing Audio Capture
Create a new instance of android.media.MediaRecorder.
Set the audio source using MediaRecorder.setAudioSource(). You will probably want to use MediaRecorder.AudioSource.MIC.
Set output file format using MediaRecorder.setOutputFormat().
Set output file name using MediaRecorder.setOutputFile().
Set the audio encoder using MediaRecorder.setAudioEncoder().
Call MediaRecorder.prepare() on the MediaRecorder instance.
To start audio capture, call MediaRecorder.start().
To stop audio capture, call MediaRecorder.stop().
When you are done with the MediaRecorder instance, call MediaRecorder.release() on it. Calling MediaRecorder.release() is always recommended to free the resource immediately.
Example and source can be found here:
http://developer.android.com/guide/topics/media/audio-capture.html
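A condensed sketch of those steps; the 3GPP/AMR output format, the output path and the VoiceCapture class name are assumptions to adapt:

import android.media.MediaRecorder;

import java.io.IOException;

public class VoiceCapture {
    private MediaRecorder recorder;

    // Requires the RECORD_AUDIO permission in the manifest.
    public void start(String outputPath) throws IOException {
        recorder = new MediaRecorder();
        recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
        recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
        recorder.setOutputFile(outputPath);
        recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
        recorder.prepare();
        recorder.start();
    }

    public void stop() {
        recorder.stop();
        recorder.release();  // free the resource as soon as you are done
        recorder = null;
    }
}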
You could also use AudioRecord if you need other things and codecs (for example raw 16-bit PCM).
With AudioRecord you get the bytes directly, which you can process however you want (e.g. converting to MP3 yourself).
Anyhow, I think you should explain what exactly you want to do; e.g. if you want to process the audio for speech recognition, MP3 is not what you want.

Android convert pcm file from AudioRecord to smaller file

I tried to record audio on Android. The quality of the sound using MediaRecorder really sucks.
So I tried writing the sound to a stream using the AudioRecord class. Great quality, but the PCM files are too large in size, as I want to upload them to a remote server.
Does anybody know how to compress the PCM (to MP3 or something else)?
Any help is much appreciated.
Tom
As far as I know, there are no built-in audio converters in Android. Your best bet is to use a third-party library, maybe even a C/C++ one.
Look at this question for more info: How to encode a WAV to a mp3 on a Android device
