MediaPlayer only plays certain WAV files - android

So the app I'm writing hits an API and gets base64-encoded WAV voicemails. Decoding and creating the WAV file works, but for some files MediaPlayer crashes when prepare() is called. I can pull the WAV files from my phone's storage and play them on a Mac, so I know the files are being saved okay.
The Base64 I'm decoding looks different between files, and I can tell from looking at it whether MediaPlayer will play it or not. I have read that WAV bitrates behave differently with MediaPlayer, so I think that may be the issue? The Base64 that doesn't play is much shorter than the Base64 that does, even for the same audio duration.
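For reference, a minimal sketch of the decode step (base64Body and voicemailFile stand in for my actual API response handling):

    // Decode the API's base64 payload straight to a file on disk.
    val wavBytes = Base64.decode(base64Body, Base64.DEFAULT)
    FileOutputStream(voicemailFile).use { it.write(wavBytes) }

The playback code, which then fails to prepare for some files: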
try {
    val fileInputStream = FileInputStream(voicemailFile)
    if (audioManager == null) {
        audioManager = context.getSystemService(Context.AUDIO_SERVICE) as AudioManager?
    }
    // Route output through the in-call audio path.
    audioManager?.let { audioManager ->
        audioManager.isSpeakerphoneOn = isSpeakerPhoneOn
        audioManager.mode = AudioManager.MODE_IN_CALL
    }
    mediaPlayer.setAudioAttributes(
        AudioAttributes.Builder()
            .setUsage(AudioAttributes.USAGE_VOICE_COMMUNICATION)
            .build()
    )
    mediaPlayer.setDataSource(fileInputStream.fd)
    mediaPlayer.setOnCompletionListener {
        finishPlayingVoicemail()
    }
    mediaPlayer.prepare() // "Prepare failed" surfaces here for the bad files
    currentProgress?.let { currentProgress ->
        mediaPlayer.seekTo(currentProgress)
    }
    mediaPlayer.start()
    shouldResumePlaying = true
    fileInputStream.close()
} catch (e: Exception) {
    e.printStackTrace()
}
Crash -
E/GenericSource: initFromDataSource, cannot create extractor!
E/GenericSource: Failed to init from data source!
E/MediaPlayerNative: error (1, -2147483648)
W/System.err: java.io.IOException: Prepare failed.: status=0x1
W/System.err: at android.media.MediaPlayer.prepare(MediaPlayer.java:1274)
Examples of the Base64 that plays vs. the Base64 that doesn't (just shortened snippets of them):
UklGRi96AABXQVZFZm10IBQAAAARAAEAQB8AANcPAAAAAQQAAgD5AWZhY3QEAAAAoPAAAGRhdGH7eQAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAADwACqAA1CwyADYA2wQEgDeupwoNzQiub6tCiE1JYHJvLsYRDQSqMysiTBEM9QAGQCozLsKMDYkgcnLqxhTQxKozLuJQTQjktrbmikyNSKozLuLQDQkgrq9
^^ Does not play
UklGRqTvAwBXQVZFZm10IBAAAAABAAEAgD4AAAB9AAACABAAZGF0YYDvAwD//wAAAAD/////AAD/////AAAAAP///f/+/wEAAgD///z/+v/6//3//v/+//3//v/9//3//v8BAAEA/v/6//j//P8CAAAA9//1//v//P/4//f//P/+//z/9//0//j//v8AAPv/+f/7//z/+v/5//v//P/8//v/+f/6/////P/1//b//f/9//f/9f/6//3/+//4//b/+v8CAAAA9//0//r//f/8//v//f/+//3/+v/5//v///8AAP7//f8AAP///P/7//3//v/8//v//P///wIAAAD7//
^^ Does play

First sample is:
RIFF (little-endian) data, WAVE audio, IMA ADPCM, mono 8000 Hz
second is:
RIFF (little-endian) data, WAVE audio, Microsoft PCM, 16 bit, mono 16000 Hz
As per Supported media formats, under the details for PCM/WAVE:
8- and 16-bit linear PCM (rates up to limit of hardware). Sampling rates for raw PCM recordings at 8000, 16000 and 44100 Hz.
As for whether those files are linear PCM: from my reading of Pulse-code modulation, the second is linear PCM, whereas the first is adaptive differential PCM (ADPCM), so it may not be supported, hence what you are seeing.
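If you want to guard against this in the app before handing a file to MediaPlayer, a rough sketch that reads the format code out of the fmt chunk (this assumes the canonical RIFF layout with the fmt chunk starting at byte 12; wavFormatCode is a made-up helper):

    // Returns the WAVE format code: 0x0001 = linear PCM, 0x0011 = IMA ADPCM.
    // Canonical header: "RIFF" (4) + size (4) + "WAVE" (4) + "fmt " id/size (8),
    // so the little-endian format tag sits at bytes 20-21.
    fun wavFormatCode(file: File): Int {
        FileInputStream(file).use { input ->
            val header = ByteArray(22)
            if (input.read(header) < 22) return -1 // too short to be a WAV
            return (header[20].toInt() and 0xFF) or
                    ((header[21].toInt() and 0xFF) shl 8)
        }
    }

The first sample above reports 0x11 (IMA ADPCM) and the second 0x0001 (linear PCM), matching the file(1) output, so anything other than 0x0001 could be routed to a fallback decoder instead of crashing in prepare().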

msbit's answer led me to look at ways to play IMA ADPCM WAV files on Android. I decided to try swapping out MediaPlayer for ExoPlayer, and ExoPlayer plays the files MediaPlayer couldn't.
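For anyone else making the same swap, a minimal sketch of the replacement (this assumes a recent ExoPlayer 2.x with the MediaItem API; adjust to the version you're on):

    // ExoPlayer ships its own WAV extractor (with IMA ADPCM support in recent
    // versions) instead of relying on the platform's, which is presumably why
    // these files play.
    val exoPlayer = SimpleExoPlayer.Builder(context).build()
    exoPlayer.setMediaItem(MediaItem.fromUri(Uri.fromFile(voicemailFile)))
    exoPlayer.prepare()
    exoPlayer.play()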

Let's try to play one .wav file. Say your .wav file is named test.wav and it is in the res/raw folder of your Android project. Please try the following lines of code and let me know.
MediaPlayer mPlayer;
private Context mContext;
mPlayer = MediaPlayer.create(mContext, R.raw.test);
mPlayer.start();

Related

Android MediaExtractor fails to add track for a valid AAC file

Using MediaExtractor with an AAC-LC file obtained from MediaRecorder:
val mediaExtractor = MediaExtractor()
mediaExtractor.setDataSource(filePath)
val trackCount = mediaExtractor.trackCount
The track count is 0 for some of the files, while it works fine for other AAC-LC files. All files are playable on other platforms.
Faulty AAC File
recorder = MediaRecorder().apply {
    setAudioSource(MediaRecorder.AudioSource.MIC)
    setOutputFormat(MediaRecorder.OutputFormat.AAC_ADTS)
    setAudioEncoder(MediaRecorder.AudioEncoder.AAC)
    setOutputFile(pipe[1].fileDescriptor)
    setAudioSamplingRate(dataRecorder.audioSamplingRate)
    setAudioEncodingBitRate(bitrate)
    setOnErrorListener { mr, what, extra ->
        logger.msgToFile(
            tag,
            "handlerStartRecorder",
            "Recording stopped with error : $what : $extra"
        )
        handler.post {
            handlerRestartRecorder()
        }
    }
    prepare()
    start()
}
I am using MediaRecorder with a ParcelFileDescriptor pipe structure. When writing to the file, some of the samples might have been skipped, but the file always starts with the AAC ADTS header and ends with a whole sample.
Does skipping a few frames affect MediaExtractor? As per AAC ADTS, each frame has its own header and can be decoded independently, so given that we have whole frames, why is media extraction failing?
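One way to check that is to walk the ADTS frames yourself before handing the file to MediaExtractor. A rough sketch (adtsFramesIntact is a made-up helper; the 13-bit frame length spans bytes 3-5 of each ADTS header):

    // Walks a buffer of ADTS frames and reports whether the chain of
    // syncwords and frame lengths is intact all the way to the end.
    fun adtsFramesIntact(bytes: ByteArray): Boolean {
        var pos = 0
        while (pos + 7 <= bytes.size) {
            // Every frame must start with the 12-bit syncword 0xFFF.
            if (bytes[pos] != 0xFF.toByte() ||
                (bytes[pos + 1].toInt() and 0xF0) != 0xF0) return false
            // Frame length: low 2 bits of byte 3, byte 4, top 3 bits of byte 5.
            val frameLen = ((bytes[pos + 3].toInt() and 0x03) shl 11) or
                    ((bytes[pos + 4].toInt() and 0xFF) shl 3) or
                    ((bytes[pos + 5].toInt() and 0xE0) ushr 5)
            if (frameLen < 7) return false
            pos += frameLen
        }
        return pos == bytes.size // last frame must end exactly at EOF
    }

If a pipe write dropped bytes mid-frame, the walk falls off a syncword partway through; that is exactly the kind of corruption that can make MediaExtractor report zero tracks even though the file starts with a valid header.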

An AAC audio stream is playable in VLC for Android, but not in Exoplayer

I have an RTMP stream I want to play in my app using the Exoplayer library. My setup for that is as follows:
TrackSelector trackSelector = new DefaultTrackSelector();
RtmpDataSourceFactory rtmpDataSourceFactory = new RtmpDataSourceFactory(bandwidthMeter);
ExtractorsFactory extractorsFactory = new DefaultExtractorsFactory();
factory = new ExtractorMediaSource.Factory(rtmpDataSourceFactory);
factory.setExtractorsFactory(extractorsFactory);
createSource();
mPlayer = ExoPlayerFactory.newSimpleInstance(mActivity, trackSelector, new DefaultLoadControl(
        new DefaultAllocator(true, C.DEFAULT_BUFFER_SEGMENT_SIZE),
        1000, // min buffer
        3000, // max buffer
        1000, // playback
        2000, // playback after rebuffer
        DefaultLoadControl.DEFAULT_TARGET_BUFFER_BYTES,
        true
));
vwExoPlayer.setPlayer(mPlayer);
mPlayer.addListener(mVideoStreamHandler);
mPlayer.addVideoListener(new VideoListener() {
    @Override
    public void onVideoSizeChanged(int width, int height, int unappliedRotationDegrees, float pixelWidthHeightRatio) {
        Log.d("hasil", "onVideoSizeChanged: w:" + width + ", h:" + height);
        String res = width + "x" + height;
        resolution.setText(res);
    }

    @Override
    public void onRenderedFirstFrame() {
    }
});
Where createSource() is as follows:
private void createSource() {
    mMediaSource180 = factory.createMediaSource(Uri.parse(API.GAME_VIDEO_STREAM_URL_180));
    mMediaSource360 = factory.createMediaSource(Uri.parse(API.GAME_VIDEO_STREAM_URL_360));
    mMediaSource720 = factory.createMediaSource(Uri.parse(API.GAME_VIDEO_STREAM_URL_720));
    mMediaSourceAudio = factory.createMediaSource(Uri.parse(API.GAME_AUDIO_STREAM_URL));
}
My current problem is that only the first three ExtractorMediaSources work fine in Exoplayer. The mMediaSourceAudio refuses to play in Exoplayer, but works just fine in the VLC Media Player for Android.
Right now I suspect that the format is AAC-LTP, or whatever AAC variant requires a codec that is available in VLC but not in stock Android. However, I do not have access to the encoding process, so I don't know for sure.
If this isn't the case, what is it?
EDIT:
I've been debugging the BandwidthMeter and added a MediaSourceEventListener. When I use the normal Video sources, onDownstreamFormatChanged() gets called, but not when I use that Audio Stream source.
In addition, the BandwidthMeter works fine: bytes are always being downloaded in all parts of the stream, with more bytes arriving when the video stream comes in. But with the audio-only stream, when I call mPlayer.getBufferedPosition(), the returned value is always 0. Also, when I use the Audio Stream source, no OMX code was called - no decoders were set up.
Am I seeing a malformed audio stream, or do I need to change my Exoplayer's settings?
EDIT 2:
Further debugging reveals that the same FlvExtractor is used for all the video streams and the audio stream, even though the video streams carry an avc video track and an mp4a-latm audio track. Is this normal?
Turns out it's because the stream was recognized as having two tracks/SampleQueues: one audio track, and one track with a null format. That null track was supposed to be the video track, which was supposed to exist according to the stream's flvHeader flags.
For now, I get around this by creating a custom MediaSource using a custom MediaPeriod. That custom MediaPeriod separates the video and audio tracks of the SampleQueues, then uses the audio-only SampleQueue[] instead of the source SampleQueue[] when I want to play the audio-only stream.
Though this gives me another point of concern: there is something one can do to alter the 'has audio track (flag & 0x04) and has video track (flag & 0x01)' flags in the RTMP stream, right?
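For context, those flags live in the fifth byte of the FLV stream header. A tiny sketch (Kotlin) of how they decode, assuming header holds the first 9 bytes of the stream ('F', 'L', 'V', version, flags, 4-byte header size):

    val flags = header[4].toInt()
    val hasAudio = (flags and 0x04) != 0 // TypeFlagsAudio
    val hasVideo = (flags and 0x01) != 0 // TypeFlagsVideo

Whether you can change them depends on whoever publishes the stream; here the header advertised a video track that never actually arrived.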
Thanks for the comments; I'm new to ExoPlayer, but your comments helped me debug and find multiple workarounds for the issue.
I first tried a custom MediaSource and custom MediaPeriod to address the audio issue. I observed video format data arriving after audio data in the case of a video+audio Wowza stream, so the function maybeFinishPrepare() will wait for both the video and audio format tag data before invoking onPrepared() in case the video tagData is received first. If audio data is received first, it won't wait and will call onPrepared().
With the above changes, I was able to play both audio-only and video+audio Wowza streams, where the RTMP tag headers arrived with video tagData first, followed by audio data.
I wasn't able to use the same patch to play both audio-only and video+audio streams from an SRS server: SRS sends tagData with audio first, then video.
So I debugged further in FlvExtractor. In readFlvHeader, I overrode the hasAudio and hasVideo variables, setting them from the first few tag headers (5 or 6) instead. I called peekFully on the input 6 times in a loop; in each iteration, after fetching tagType and tagDataSize, I used tagDataSize for input.advancePeekPosition() and tagType to identify whether the tagData holds audio or video format data. After peeking the first 6 consecutive tag headers, I had the actual values of hasAudio and hasVideo, and ignored flvHeader.flags, which had been used to set these variables.
The custom FlvExtractor workaround looked cleaner than the custom MediaSource/MediaPeriod, as we create only as many tracks as necessary, since we set proper hasVideo/hasAudio values.
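The idea, sketched standalone in Kotlin rather than against ExoPlayer's ExtractorInput: peek the first few tag headers and derive hasAudio/hasVideo from the tag types instead of the header flags. This assumes the stream is already positioned at the first tag, i.e. past the 9-byte FLV header and the initial 4-byte PreviousTagSize0 (probeTracks is a made-up name):

    // Infers hasAudio/hasVideo from the first few FLV tag headers instead of
    // trusting flvHeader.flags. Each tag: type (1 byte; 8 = audio, 9 = video),
    // 3-byte data size, 4-byte timestamp, 3-byte stream id, then the payload
    // followed by a 4-byte previous-tag-size field.
    fun probeTracks(stream: DataInputStream, tagsToPeek: Int = 6): Pair<Boolean, Boolean> {
        var hasAudio = false
        var hasVideo = false
        repeat(tagsToPeek) {
            val tagType = stream.readUnsignedByte()
            val dataSize = (stream.readUnsignedByte() shl 16) or
                    (stream.readUnsignedByte() shl 8) or
                    stream.readUnsignedByte()
            when (tagType) {
                8 -> hasAudio = true
                9 -> hasVideo = true
            }
            // Skip the rest of the tag header (7 bytes), the payload,
            // and the trailing previous-tag-size field (4 bytes).
            stream.skipBytes(7 + dataSize + 4)
        }
        return hasAudio to hasVideo
    }

A real Extractor would use peekFully/advancePeekPosition as described above, so the actual read position stays untouched for the extraction pass that follows.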

Add second audio track to MediaMuxer

I have a video file (.mp4) with a video track only.
I'm using MediaExtractor and MediaMuxer to add an audio file. This works well.
On the processed file, I want to add another audio track.
So I'm again using MediaExtractor and MediaMuxer to essentially copy the file (creating video and audio tracks, reading [extractor] and writing [muxer]). In addition, I'm trying to add the second audio track to the muxer, but this throws the error "Failed to add the track to the muxer".
In this link we can see that the muxer does not support multiple audio tracks.
Code from the link:
// Throws exception b/c 2 audio tracks were added.
muxer = new MediaMuxer(outputFile, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
muxer.addTrack(MediaFormat.createAudioFormat("audio/mp4a-latm", 48000, 1));
try {
    muxer.addTrack(MediaFormat.createAudioFormat("audio/mp4a-latm", 48000, 1));
    fail("should throw IllegalStateException.");
} catch (IllegalStateException e) {
    // expected
}
Is there another way to do it? An elegant way?
BTW, I'm trying to avoid using 3rd parties like FFmpeg or so... but I'll use one if it's my only solution...
--EDIT--
Relevant piece of my code:
MediaMuxer muxer = new MediaMuxer(outputFile, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);

MediaExtractor extractor = new MediaExtractor();
extractor.setDataSource(videoAndAudioFile);
for (int currTrackIdx = 0; currTrackIdx < extractor.getTrackCount(); currTrackIdx++) {
    MediaFormat trackFormat = extractor.getTrackFormat(currTrackIdx);
    tracksIdx.add(muxer.addTrack(trackFormat));
}

MediaExtractor extractor2 = new MediaExtractor();
extractor2.setDataSource(secondAudioFile);
MediaFormat trackFormat = extractor2.getTrackFormat(0);
tracksIdx.add(muxer.addTrack(trackFormat)); // Crashes here
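For reference, a sketch (Kotlin) of the per-sample copy loop that pairs with the addTrack() calls above in the working single-audio case; it assumes muxer.start() has been called after all tracks were added:

    // Copies every sample of one extractor track into the muxer track
    // whose index was returned by muxer.addTrack() for that format.
    fun copyTrack(extractor: MediaExtractor, srcTrackIdx: Int,
                  muxer: MediaMuxer, dstTrackIdx: Int) {
        extractor.selectTrack(srcTrackIdx)
        val buffer = ByteBuffer.allocate(1 shl 20) // 1 MiB; size to your streams
        val info = MediaCodec.BufferInfo()
        while (true) {
            info.size = extractor.readSampleData(buffer, 0)
            if (info.size < 0) break // end of this track
            info.offset = 0
            info.presentationTimeUs = extractor.sampleTime
            info.flags = extractor.sampleFlags
            muxer.writeSampleData(dstTrackIdx, buffer, info)
            extractor.advance()
        }
        extractor.unselectTrack(srcTrackIdx)
    }

The crash in the question happens before any of this runs, though: MPEG-4 output in MediaMuxer rejects the second audio track at addTrack() time.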
For anyone who reaches here: I found this official doc at link. Muxing multiple video/audio tracks is not supported in old API versions and is even restricted in the latest version.

Playing mp3 data compressed by lame mp3 with JLayer and Audiotrack in android

I have followed this example to convert raw audio data coming from AudioRecord to MP3, and the conversion succeeds; if I store the data in a file, the MP3 is audible when played with a music player.
Now, instead of storing the MP3 data to a file, I need to play it with AudioTrack. The data is coming from the Red5 media server as a live stream, but the problem is that AudioTrack can only play PCM data, so I only hear noise from my data.
Now I am using JLayer for the required task.
My code is as follows.
int readresult = recorder.read(audioData, 0, recorderBufSize);
int encResult = SimpleLame.encode(audioData, audioData, readresult, mp3buffer);
and this mp3buffer data is sent to the other user via the Red5 stream.
The data received by the other user is in the form of a stream, so the code for playing it is:
Bitstream bitstream = new Bitstream(data.read());
Decoder decoder = new Decoder();
Header frameHeader = bitstream.readFrame();
SampleBuffer output = (SampleBuffer) decoder.decodeFrame(frameHeader, bitstream);
short[] pcm = output.getBuffer();
player.write(pcm, 0, pcm.length);
But my code freezes at bitstream.readFrame() after 2-3 seconds, and no sound is produced before that either.
Any guess what the problem might be? Any suggestion is appreciated.
Note: I don't need to store the MP3 data, so I can't use MediaPlayer, as it requires a file or file descriptor.
Just a tip, but try to call
output.close();
bitstream.closeFrame();
after your write code. I'm processing MP3 the same way you do, but I close the buffers after usage and I have no problem.
Second tip: do it in a Thread or some other background process. As you mentioned those deaf 2 seconds, the media player may be waiting until you process the whole stream, because you are loading it on the same thread.
Try both tips (you should anyway). For the first, the problem could be in the internal buffers; for the second, you probably fill up the player's input buffer and lock the app (same thread: a full buffer cannot receive your input, and the code that plays and releases that buffer is never invoked because the write locks it...).
Also, if you aren't doing it already, check for frameHeader == null at the end of the stream.
Good luck.
You need to loop through the frames like this:
Header frameHeader;
while ((frameHeader = bitstream.readFrame()) != null) {
    SampleBuffer output = (SampleBuffer) decoder.decodeFrame(frameHeader, bitstream);
    short[] pcm = output.getBuffer();
    player.write(pcm, 0, pcm.length);
    bitstream.closeFrame(); // release the frame before reading the next one
}
bitstream.close();
And make sure you are not running this on the main thread. (That is probably the reason for the freezing.)
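If the AudioTrack itself isn't configured yet, here's a minimal sketch (Kotlin) of a setup matching JLayer's decoded output; it assumes mono (check frameHeader.mode() for stereo) and uses the classic stream-mode constructor for brevity:

    // Sample rate comes from the first decoded MP3 frame header.
    val sampleRate = frameHeader.frequency()
    val minBuf = AudioTrack.getMinBufferSize(
        sampleRate, AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT
    )
    // Oversize the buffer so write() from the decode loop doesn't starve it.
    val player = AudioTrack(
        AudioManager.STREAM_MUSIC, sampleRate,
        AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
        minBuf * 4, AudioTrack.MODE_STREAM
    )
    player.play() // start consuming; the loop above keeps feeding it PCM shorts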

Android playback recorded audio

I've been looking at this example https://stackoverflow.com/a/8974361/1191501 and it works perfectly. But my problem is: how do I reference the recorded audio so it can be played back straight away?
The output code is:
recorder.setOutputFile("/sdcard/audio/"+filename);
and this definitely records the audio.
Then, to play back the audio, I was using:
player.setDataSource();
but I don't know how to reference the filename bit so it plays back. Any ideas?
I had similar problems playing audio from the SD card at one point. This is what did it for me:
private void playMedia() {
    String path = Environment.getExternalStorageDirectory() + "/audio_stuff.mp3";
    mediaPlayer = MediaPlayer.create(this, Uri.parse(path));
    mediaPlayer.start();
}
Make sure to release your MediaPlayer instance and set it to null when you are done. And just in case, make sure your SD card is not mounted to a computer when you try to play your audio file. :)
Looking here,
player.setDataSource("/sdcard/audio/"+filename);
player.prepare();
player.start();
should work, I would think.
