I'm trying to get the sound level from an MP3 file that is playing. But first, I have to attach that sound. So, how do I attach the sound from an MP3 file as the audio source in my code?
As far as I know, if we want to use the MIC as the audio source, we can use the code below.
mRecordInstance = new AudioRecord(MediaRecorder.AudioSource.MIC, FREQUENCY,
        CHANNEL, ENCODING, BUFFSIZE);
And what if the audio source is an MP3 file? I don't know how to solve this problem.
Please help me.
Thank you. :)
try this code:
MediaPlayer mp = new MediaPlayer();
mp.setDataSource("/mnt/sd/lalala.mp3");
AudioRecord ar = new AudioRecord(mp.getAudioSessionId(), FREQUENCY,
        CHANNEL, ENCODING, BUFFSIZE);
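If what you ultimately need is the sound level of the MP3 while it plays, another option is the android.media.audiofx.Visualizer API, which attaches to the player's audio session instead of recording from the MIC. A minimal sketch, assuming the RECORD_AUDIO permission is granted and with exception handling omitted (the file path is just the one from the answer above):
MediaPlayer mp = new MediaPlayer();
mp.setDataSource("/mnt/sd/lalala.mp3");
mp.prepare();
mp.start();

// Attach a Visualizer to the player's audio session and capture the waveform.
Visualizer visualizer = new Visualizer(mp.getAudioSessionId());
visualizer.setCaptureSize(Visualizer.getCaptureSizeRange()[1]);
visualizer.setEnabled(true);

byte[] waveform = new byte[visualizer.getCaptureSize()];
visualizer.getWaveForm(waveform); // unsigned 8-bit samples; compute the level (e.g. RMS) from these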
So the app I'm writing hits an API and gets base64-encoded WAV voicemails. Decoding and creating the WAV file works, but some will crash when calling setDataSource. I can pull the WAV files from my phone's storage and play them on a Mac, so I know the file is being saved okay.
The Base64 I'm decoding looks different between files, and I can tell whether MediaPlayer will play it or not just by looking at it. I have read that WAV file bitrates behave differently with MediaPlayer, so I think that may be the issue? The Base64 that doesn't play is much shorter than the Base64 that does, even for the same audio duration.
try {
    val fileInputStream = FileInputStream(voicemailFile)
    if (audioManager == null) {
        audioManager = context.getSystemService(Context.AUDIO_SERVICE) as AudioManager?
    }
    audioManager?.let { audioManager ->
        audioManager.isSpeakerphoneOn = isSpeakerPhoneOn
        audioManager.mode = AudioManager.MODE_IN_CALL
    }
    mediaPlayer.setAudioAttributes(AudioAttributes
            .Builder()
            .setUsage(AudioAttributes.USAGE_VOICE_COMMUNICATION)
            .build())
    mediaPlayer.setDataSource(fileInputStream.fd)
    mediaPlayer.setOnCompletionListener {
        finishPlayingVoicemail()
    }
    mediaPlayer.prepare()
    currentProgress?.let { currentProgress ->
        mediaPlayer.seekTo(currentProgress)
    }
    mediaPlayer.start()
    shouldResumePlaying = true
    fileInputStream.close()
} catch (e: Exception) {
    e.printStackTrace()
}
Crash -
E/GenericSource: initFromDataSource, cannot create extractor!
E/GenericSource: Failed to init from data source!
E/MediaPlayerNative: error (1, -2147483648)
W/System.err: java.io.IOException: Prepare failed.: status=0x1
W/System.err: at android.media.MediaPlayer.prepare(MediaPlayer.java:1274)
Examples of the Base64 that plays vs. the Base64 that does not (just shortened snippets of each):
UklGRi96AABXQVZFZm10IBQAAAARAAEAQB8AANcPAAAAAQQAAgD5AWZhY3QEAAAAoPAAAGRhdGH7eQAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAADwACqAA1CwyADYA2wQEgDeupwoNzQiub6tCiE1JYHJvLsYRDQSqMysiTBEM9QAGQCozLsKMDYkgcnLqxhTQxKozLuJQTQjktrbmikyNSKozLuLQDQkgrq9
^^ Does not play
UklGRqTvAwBXQVZFZm10IBAAAAABAAEAgD4AAAB9AAACABAAZGF0YYDvAwD//wAAAAD/////AAD/////AAAAAP///f/+/wEAAgD///z/+v/6//3//v/+//3//v/9//3//v8BAAEA/v/6//j//P8CAAAA9//1//v//P/4//f//P/+//z/9//0//j//v8AAPv/+f/7//z/+v/5//v//P/8//v/+f/6/////P/1//b//f/9//f/9f/6//3/+//4//b/+v8CAAAA9//0//r//f/8//v//f/+//3/+v/5//v///8AAP7//f8AAP///P/7//3//v/8//v//P///wIAAAD7//
^^ Does play
First sample is:
RIFF (little-endian) data, WAVE audio, IMA ADPCM, mono 8000 Hz
second is:
RIFF (little-endian) data, WAVE audio, Microsoft PCM, 16 bit, mono 16000 Hz
As per Supported media formats, under the details for PCM/WAVE:
8- and 16-bit linear PCM (rates up to limit of hardware). Sampling rates for raw PCM recordings at 8000, 16000 and 44100 Hz.
As for whether those files are linear PCM or not: from my reading of Pulse-code modulation, the second is linear PCM, whereas the first is adaptive differential PCM (ADPCM), so it may not be supported, hence what you are seeing.
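If you want to detect this up front, one option is to peek at the format code in the WAV fmt chunk before handing the file to MediaPlayer. A rough sketch, assuming the canonical layout where the fmt chunk starts at byte 12, so the 16-bit format code sits at bytes 20-21 (little-endian); 1 means linear PCM, 0x11 means IMA ADPCM:
// Hypothetical helper: returns the WAVE format code (1 = PCM, 0x11 = IMA ADPCM).
static int wavFormatCode(File wavFile) throws IOException {
    try (RandomAccessFile raf = new RandomAccessFile(wavFile, "r")) {
        raf.seek(20);          // format code offset in the canonical RIFF/fmt layout
        int lo = raf.read();
        int hi = raf.read();
        return (hi << 8) | lo; // little-endian 16-bit value
    }
}
If the code is not 1, you could route the file to a different decoder instead of letting prepare() fail.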
msbit's answer led me to look at ways to play IMA ADPCM WAV files on Android. I decided to try swapping out MediaPlayer for ExoPlayer, and ExoPlayer seems to be playing the files MediaPlayer couldn't.
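For reference, the replacement boils down to something like this; a minimal sketch against a recent ExoPlayer 2.x API (exact builder and class names vary by version), with voicemailFile being the decoded file from the question:
ExoPlayer player = new ExoPlayer.Builder(context).build();
player.setMediaItem(MediaItem.fromUri(Uri.fromFile(voicemailFile)));
player.prepare();
player.play();
// ... and player.release() when you are done with it.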
Let's try to play one .wav file. Let's say your .wav file is named test.wav and it is in the res/raw folder of your Android project. Please try the following lines of code and let me know.
MediaPlayer mPlayer;
private Context mContext;
mPlayer = MediaPlayer.create(mContext, R.raw.test);
mPlayer.start();
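One extra note, not from the original answer: a MediaPlayer created this way should be released once playback is done, for example:
mPlayer.setOnCompletionListener(mp -> mp.release());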
This is my first question, so please let me know if I missed anything!
I want to copy a video using the Android MediaCodec class (from.mp4 -> to.mp4).
I managed to decode the video and audio, but I have no idea how to encode them again. I have already looked at http://bigflake.com/mediacodec/, but I can't find a simple example.
mFormat = MediaFormat.createVideoFormat("video/avc", mMetadata.V_WIDTH,mMetadata.V_HEIGHT);
mFormat.setInteger(MediaFormat.KEY_WIDTH, mMetadata.V_WIDTH);
mFormat.setInteger(MediaFormat.KEY_HEIGHT, mMetadata.V_HEIGHT);
mFormat.setInteger(MediaFormat.KEY_BIT_RATE, (int) mMetadata.BIT_RATE);
mFormat.setInteger(MediaFormat.KEY_FRAME_RATE, mMetadata.FRAME_RATE);
mFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar);
mFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, mMetadata.I_FRAME_INTERVAL);
videoEncoder = MediaCodec.createEncoderByType("video/avc");
videoEncoder.configure(mFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
videoEncoder.start();
I got the MediaCodec encoder started, but I don't know how to feed the raw video data into it. Please save me from this hell.
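Roughly, the missing piece is a feed/drain loop: raw YUV frames go into the encoder's input buffers, and the encoded output is drained into a MediaMuxer. A rough sketch under those assumptions (API 21+ buffer accessors, video track only, exception handling and end-of-stream signalling omitted; frameData and presentationTimeUs are hypothetical values coming from your decoder):
MediaMuxer muxer = new MediaMuxer("/sdcard/to.mp4", MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
int videoTrack = -1;
boolean muxerStarted = false;

// Feed one raw YUV frame from the decoder into the encoder.
int inIndex = videoEncoder.dequeueInputBuffer(10000);
if (inIndex >= 0) {
    ByteBuffer inBuf = videoEncoder.getInputBuffer(inIndex);
    inBuf.clear();
    inBuf.put(frameData);
    videoEncoder.queueInputBuffer(inIndex, 0, frameData.length, presentationTimeUs, 0);
}

// Drain whatever the encoder has produced and hand it to the muxer.
int outIndex = videoEncoder.dequeueOutputBuffer(info, 10000);
while (outIndex >= 0 || outIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
    if (outIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
        videoTrack = muxer.addTrack(videoEncoder.getOutputFormat());
        muxer.start();
        muxerStarted = true;
    } else {
        ByteBuffer outBuf = videoEncoder.getOutputBuffer(outIndex);
        if (muxerStarted && info.size > 0) {
            muxer.writeSampleData(videoTrack, outBuf, info);
        }
        videoEncoder.releaseOutputBuffer(outIndex, false);
    }
    outIndex = videoEncoder.dequeueOutputBuffer(info, 10000);
}
// Repeat feed/drain per frame; after the last frame, queue a buffer with
// MediaCodec.BUFFER_FLAG_END_OF_STREAM, then muxer.stop() and muxer.release().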
I have followed this example to convert raw audio data coming from AudioRecord to MP3, and it works: if I store the data in an .mp3 file and play it with a music player, it is audible.
Now my question: instead of storing the MP3 data in a file, I need to play it with AudioTrack. The data comes from the Red5 media server as a live stream, but the problem is that AudioTrack can only play PCM data, so I can only hear noise from my data.
I am now using JLayer for this task.
My code is as follows.
int readresult = recorder.read(audioData, 0, recorderBufSize);
int encResult = SimpleLame.encode(audioData,audioData, readresult, mp3buffer);
This mp3buffer data is sent to the other user over the Red5 stream.
The data received by the other user arrives as a stream, so the code for playing it is:
Bitstream bitstream = new Bitstream(data.read());
Decoder decoder = new Decoder();
Header frameHeader = bitstream.readFrame();
SampleBuffer output = (SampleBuffer) decoder.decodeFrame(frameHeader, bitstream);
short[] pcm = output.getBuffer();
player.write(pcm, 0, pcm.length);
But my code freezes at bitstream.readFrame() after 2-3 seconds, and no sound is produced before that.
Any guess what the problem might be? Any suggestion is appreciated.
Note: I don't want to store the MP3 data, so I can't use MediaPlayer, as it requires a file or file descriptor.
Just a tip, but try calling
output.close();
bitstream.closeFrame();
after your write code. I process MP3 the same way you do, but I close the buffers after use and I have no problem.
Second tip: do it in a Thread or some other background process. As for those silent 2 seconds you mentioned, the player may be waiting until you process the whole stream because you are loading it on the same thread.
Try both tips (you should anyway). With the first, the problem could be in the internal buffers; with the second, you have probably filled the player's input buffer and locked the app (same thread: the full buffer cannot receive your input, and the code that plays and releases that buffer is never invoked because the write blocks it...).
Also, if you aren't doing it already, check for frameHeader == null to detect the end of the stream.
Good luck.
You need to loop through the frames like this:
Header frameHeader;
while ((frameHeader = bitstream.readFrame()) != null) {
    SampleBuffer output = (SampleBuffer) decoder.decodeFrame(frameHeader, bitstream);
    short[] pcm = output.getBuffer();
    player.write(pcm, 0, pcm.length);
    bitstream.closeFrame();
}
And make sure you are not running this on the main thread (that is probably the reason for the freezing).
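Since the question never shows how player is constructed, here is one way it could be set up for the decoded PCM, assuming mono 16-bit output and a sample rate of 44100 Hz (adjust both to whatever your stream actually uses):
int sampleRate = 44100; // assumed; use the sample rate of your MP3 stream
int minBuf = AudioTrack.getMinBufferSize(sampleRate,
        AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);
AudioTrack player = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate,
        AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
        minBuf, AudioTrack.MODE_STREAM);
player.play(); // the write() calls in the decode loop then feed it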
I've found lots of tutorials and posts showing how to use AudioTrack to play wav files in AudioTrack.MODE_STREAM and I've successfully implemented this example.
However, I'm having performance issues when playing multiple audio tracks at once, and I'm thinking I should first create the tracks using AudioTrack.MODE_STATIC and then just call play() each time.
I can't find any resources on how to implement this. How can I do this?
Thanks
The two main sticking points for me were realizing that .write() comes first and that the instantiated player must have the size of the entire clip as the buffer_size_in_bytes.
Assuming you have recorded a PCM file using AudioRecord, you can play it back with MODE_STATIC like so...
File file = new File(FILENAME);
int audioLength = (int) file.length();
byte[] filedata = new byte[audioLength];
try {
    InputStream inputStream = new BufferedInputStream(new FileInputStream(FILENAME));
    int lengthOfAudioClip = inputStream.read(filedata, 0, audioLength);
    inputStream.close();

    player = new AudioTrack(STREAM_TYPE, SAMPLE_RATE, CHANNEL_OUT_CONFIG, AUDIO_FORMAT,
            audioLength, AUDIO_MODE);
    player.write(filedata, OFFSET, lengthOfAudioClip);
    player.setPlaybackRate(playbackRate);
    player.play();
} catch (IOException e) {
    e.printStackTrace();
}
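One follow-up worth knowing: to replay a MODE_STATIC track from the beginning, AudioTrack has reloadStaticData(), so something along these lines should work for repeated playback:
player.stop();
player.reloadStaticData();
player.play();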
I've been looking at this example https://stackoverflow.com/a/8974361/1191501 and it works perfectly. But my problem is how do I reference the recorded audio so it can be played back straight away?
the output code is:
recorder.setOutputFile("/sdcard/audio/"+filename);
and this definitely records the audio.
Then, to play back the audio, I was using:
player.setDataSource();
but I don't know how to reference the filename bit so it plays back. Any ideas?
I had similar problems playing audio from the SD card at one point. This is what did it for me:
private void playMedia() {
    String path = Environment.getExternalStorageDirectory() + "/audio_stuff.mp3";
    mediaPlayer = MediaPlayer.create(this, Uri.parse(path));
    mediaPlayer.start();
}
Make sure to release your MediaPlayer instance and set it to null when you are done. And just in case, make sure your SD card is not mounted to a computer (as USB storage) when you try to play your audio file. :)
Looking here,
player.setDataSource("/sdcard/audio/"+filename);
player.prepare();
player.start();
would work I would think.
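One caveat: setDataSource() and prepare() both throw IOException, so in practice you would wrap it like this:
try {
    player.setDataSource("/sdcard/audio/" + filename);
    player.prepare();
    player.start();
} catch (IOException e) {
    e.printStackTrace();
}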