I have a simple app that plays a short sound repeatedly by invoking the play() method on the audio element in JavaScript. It works well on desktop browsers, iPads, iPhones, etc. On a mobile device running Android 2.3.3, the first time I play the sound I hear it immediately after invoking the play() method, but on subsequent invocations there is a noticeable and variable delay.
I have done some sleuthing and found that the device is fetching the audio file from the server each time the play() method is invoked. I can invoke the load() method on the audio element to re-load it after each play, thus queueing it up for the next play, but there are a number of problems with that band-aid. I'd really like to make the browser just keep the audio element loaded permanently, instead of unloading it as soon as it finishes playing. Does anyone know if that's possible?
EDIT: I've done a little more investigating, and I've found that after playing the sound, the audio element's readyState remains at HAVE_ENOUGH_DATA, even though the browser won't play that sound again without re-fetching it from the server. I believe this is a bug. I'd hoped to use the readyState to detect browsers that unload after playing, and only explicitly load when necessary, but that's not going to work.
The more experiments I do, the more rough edges I find in Android 2.3.3's implementation of the HTML5 audio tag. There's a lot broken there, at least on the Droid X phone I'm using for testing.
The best I have come up with so far is the band-aid alluded to in my original question: as soon as an invocation of play() completes, invoke the load() method, to prepare for the next play():
if (navigator.userAgent.toLowerCase().indexOf("android") > -1) {
    audElt.addEventListener('ended', function () {
        setTimeout(function () { audElt.load(); }, 1000);
    }, false);
}
I had to restrict the work-around to Android user agents, because simply invoking the load() method creates problems on Chrome and generates unnecessary trips to the server on non-Android systems.
I had to add a one-second delay, because if I invoked load() directly from the "ended" handler, it interrupted the playback, which, apparently, hadn't really "ended" yet.
Of course, it's still fetching the sound from the server repeatedly, so if you try to play the sound multiple times in rapid succession, things go south quickly.
This is the best solution I've found. Another option is to use the video tag instead, but there are some problems with that as well. Nothing seems to work well enough to implement.
Luckily I'm using PhoneGap, so I'll give its audio methods a shot.
You can bind to the timeupdate event to pause() the playback a second before reaching the end of the audio, and rewind before playing it again. Android will not flush the audio then.
I know this is strange, but this is my observation:
I managed to play some music with the Media Plugin in Ionic 1 using:
//Method 1: just plugin
mediaRes = new Media(myMusicPath, onMediaSuccess, onMediaError, onMediaStatus);
//Method 2: ngCordova
mediaRes = $cordovaMedia.newMedia(myMusicPath)
mediaRes.play().then(onMediaSuccess, onMediaError, onMediaStatus);
As I read in other posts, the path is crucial:
"/android_asset/www/music/mymusic.mp3"
Basically, the path name is printed in my HTML, so it can't be wrong.
For the first N times, it works properly. Then, on the (N + 1)th, it fails with error {"code":1}.
Why does this happen? Do I need to somehow clear a cache before re-initializing the variable? Currently I re-initialize the variable every time I run it.
EDIT:
Seems that it is OK to play indefinitely if I just call play() without re-initializing the same variable again. I'll keep observing...
release()
Releases the underlying operating system’s audio resources. This is
particularly important for Android, since there are a finite amount of
OpenCore instances for media playback. Applications should call the
release function for any Media resource that is no longer needed.
Apparently, the release function is required. And that solves the problem.
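On Android, the Cordova Media plugin is backed by the platform's native MediaPlayer, which has the same constraint. Purely as an illustration of the underlying pattern in a native Java app (the context and resource name here are placeholders), releasing as soon as playback completes looks like this:

// Illustrative native analog: release the MediaPlayer as soon as playback is done
MediaPlayer player = MediaPlayer.create(context, R.raw.my_clip); // create() also prepares the player
player.setOnCompletionListener(new MediaPlayer.OnCompletionListener() {
    @Override
    public void onCompletion(MediaPlayer mp) {
        mp.release(); // free the underlying OS audio resources
    }
});
player.start();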
I recently started working with the Google Cast SDK v3 (Android app).
I am casting audio streams to a Cast device. It works well until a stream triggers the failure callback (failure status code 15, TIMEOUT) in setResultCallback, after which the Cast device gets stuck on the loading screen. I tried using the
remoteMediaClient.stop()
method, but it doesn't work, and after the failure it also stops other streams from playing.
I am looking for a way to clear the media request I made earlier, so that I can load a new stream on the Cast device.
Here is the stream that gets stuck in the Buffering state while playing:
http://14523.live.streamtheworld.com/KALLAMAAC
I don't know exactly what your setup is, but when you call stop(), it not only stops the playback, it also clears/unloads the media and any queue that you have. To play another item afterward, you need to load the media again, as if nothing had been playing before; it is rare to need to call stop(). To help you further, it is important to see what type of error your receiver is running into.
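For example, after a failure you would typically issue a fresh load() on the RemoteMediaClient rather than try to reuse the old request. Here is a rough sketch with the v3 sender API; the URL and content type are placeholders:

// Sketch: load a fresh MediaInfo to start a new stream on the Cast device
MediaInfo mediaInfo = new MediaInfo.Builder("http://example.com/stream.aac") // placeholder URL
        .setStreamType(MediaInfo.STREAM_TYPE_LIVE)
        .setContentType("audio/aac") // placeholder content type
        .build();

remoteMediaClient.load(mediaInfo, true /* autoplay */, 0 /* start position */)
        .setResultCallback(new ResultCallback<RemoteMediaClient.MediaChannelResult>() {
            @Override
            public void onResult(RemoteMediaClient.MediaChannelResult result) {
                if (!result.getStatus().isSuccess()) {
                    // e.g. status code 15 (TIMEOUT): check the receiver for the underlying cause
                    Log.w("Cast", "Load failed: " + result.getStatus().getStatusCode());
                }
            }
        });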
I can't understand why we need to use the prepare() method in MediaPlayer. Why doesn't start() work on its own in music players?
The prepare method collects metadata about the file or stream to be played, which may be necessary for proper function of the player itself and related components (like UI). The fact that you can call prepare and prepareAsync separately from calling setDataSource or start is simply a means of allowing the developer control over when and how things happen to suit his/her particular circumstance. Particularly for streaming media, preparation may take a significant amount of time, and so doing things the same way all the time will not be ideal in every situation.
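For example, for a stream a common pattern is to call prepareAsync() and only start playback once the OnPreparedListener fires. A minimal sketch (the URL is a placeholder):

MediaPlayer player = new MediaPlayer();
player.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
    @Override
    public void onPrepared(MediaPlayer mp) {
        // Metadata has been collected and enough data is buffered; it is now safe to play
        mp.start();
    }
});
try {
    player.setDataSource("http://example.com/stream.mp3"); // placeholder URL
    player.prepareAsync(); // returns immediately; preparation runs in the background
} catch (IOException e) {
    // handle an invalid or unreachable source
}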
Suppose you want to do some other work while the media player is still collecting information: what would you do then? If start() had to do both jobs, what would happen while the player was still gathering information about the media? It would be treated as already playing, and it would crash. That is why the player has distinct states, each with its own responsibilities.
I've written some code for Android 2.2 that plays an audio file using the Android MediaPlayer. Without getting into the details of the code, I noticed that there exists a function called
isPlaying()
that allows you to check if an audio file is currently being played by the MediaPlayer. So, for example, when the following snippet of code runs
Toast.makeText(getApplicationContext(), "Sound playing is: " +
mediaPlayer.isPlaying(), Toast.LENGTH_SHORT).show();
it displays the following message
Sound playing is: true / false
depending on whether there's sound playing or not.
When I wrote some code to record sound from the microphone using the Android MediaRecorder, however, I noticed that there does not appear to be a function called
isRecording()
that checks to see whether a recording is in progress.
So, I was wondering whether the onus is on the programmer to figure out if a recording is in progress by embedding some logic into their code, or whether there is indeed a way to do this (check if a recording is in progress) using another built-in function offered by the Android API.
Doesn't look like there is such a function after all. I think it makes sense to try to embed some logic in the code to do this cleanly.
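As a sketch of that logic, you can keep your own flag and flip it around start() and stop(); the output path here is a placeholder:

private MediaRecorder recorder;
private boolean isRecording = false; // our own flag, since MediaRecorder has no isRecording()

private void startRecording(String outputPath) throws IOException {
    recorder = new MediaRecorder();
    recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
    recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
    recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
    recorder.setOutputFile(outputPath); // placeholder path supplied by the caller
    recorder.prepare();
    recorder.start();
    isRecording = true;
}

private void stopRecording() {
    if (isRecording) {
        recorder.stop();
        recorder.release();
        recorder = null;
        isRecording = false;
    }
}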
Important
I found a nice workaround to handle this issue, and you can read the workaround right away, but I strongly suggest that you read the entire explanation first.
The scenario
I was trying to find a way to figure out whether the MediaRecorder had started recording or not. I assumed that calling the start() method on the recorder after the prepare() method was enough, but it turns out it isn't.
Before you get offended by what I just said, let me explain the scenario...
I was building a simple audio recording app from scratch: no libraries, no copy-paste, all hard work. So I knew exactly what each part of my code was doing. Or at least I thought I did.
Until I decided to try to break my application by clicking the start and stop recording buttons like I was playing a piano. And yes, my stop button wasn't even appearing until the MediaRecorder's start() method had been called.
So I was greeted with a crash and logcat welcomed me with
java.lang.RuntimeException: stop failed.
at android.media.MediaRecorder.stop(Native Method)
along with
E/MediaRecorder(15709): stop failed: -1007
So I read online and found out that calling stop() right after start() on MediaRecorder causes this problem.
So the biggest question was, how do I detect if it is now SAFE for me to enable the STOP button on my recorder?
The Workaround (Not at all perfect, but it works)
MediaRecorder.getMaxAmplitude() // The maximum absolute amplitude measured since the last call, or 0 when called for the first time
As you can see, the MediaRecorder.getMaxAmplitude() method (or the MediaRecorder.maxAmplitude property) returns 0 when called for the first time, and the measured amplitude after that.
So, instead of allowing the user to stop the recording right after calling MediaRecorder.start(), I now wait until the MediaRecorder.maxAmplitude value is greater than zero, at which point I can be sure that the MediaRecorder is initialized, started, and recording, and is in a state where calling stop() is allowed. You can accomplish this with a runnable that keeps checking until the amplitude is greater than 0; I am already using a runnable for the timer, so I perform the check there.
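A rough sketch of that check with a Handler-driven runnable (recorder is your MediaRecorder field, and enableStopButton() stands in for whatever UI code you use):

private final Handler handler = new Handler();

private final Runnable amplitudeCheck = new Runnable() {
    @Override
    public void run() {
        // getMaxAmplitude() stays at 0 until the recorder has actually captured audio
        if (recorder != null && recorder.getMaxAmplitude() > 0) {
            enableStopButton(); // placeholder for your own UI code
        } else {
            handler.postDelayed(this, 100); // check again in 100 ms
        }
    }
};

// After calling recorder.start():
handler.post(amplitudeCheck);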
Please Note
When working on an emulator, the value returned by MediaRecorder.maxAmplitude is always 0, so you should use a physical Android device to check that everything works as expected.
Now my buttons stay disabled for less than a second when I first start recording. But if I stop and start too quickly, they remain disabled for a bit longer, and I show a "Please wait while the recording starts" message for the user.
I hope this answer helps someone.
Regards!
I am trying to play multiple audio files one after the other, and I am currently using AsyncTasks to prepare and start the MediaPlayer, but I have failed to find a good way to move on to the next track at the end of the current one. Not every audio file will be played every time; whether each one plays is decided by a boolean value.
Any help is much appreciated.
I guess you have read android-sdk/docs/reference/android/media/MediaPlayer.html, which says:
When the playback reaches the end of stream, the playback completes.
If the looping mode was being set to true with setLooping(boolean), the
MediaPlayer object shall remain in the Started state. If the looping
mode was set to false , the player engine calls a user supplied
callback method, OnCompletion.onCompletion(), if a
OnCompletionListener is registered beforehand via
setOnCompletionListener(OnCompletionListener). The invoke of the
callback signals that the object is now in the PlaybackCompleted
state. While in the PlaybackCompleted state, calling start() can
restart the playback from the beginning of the audio/video source.
So you can set a new data source, call prepareAsync(), and then call start() in the completion callback. In this way you get continuous playback, but it is not seamless.
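A rough sketch of that chaining, assuming a list of file paths and a parallel list of booleans that decide whether each track should play (trackPaths and shouldPlay are placeholders):

private void playNext(final MediaPlayer player, final int index) {
    if (index >= trackPaths.size()) return;              // playlist exhausted
    if (!shouldPlay.get(index)) {                         // skip tracks flagged not to play
        playNext(player, index + 1);
        return;
    }
    try {
        player.reset();
        player.setDataSource(trackPaths.get(index));      // placeholder list of file paths
        player.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
            @Override
            public void onPrepared(MediaPlayer mp) {
                mp.start();
            }
        });
        player.setOnCompletionListener(new MediaPlayer.OnCompletionListener() {
            @Override
            public void onCompletion(MediaPlayer mp) {
                playNext(mp, index + 1);                  // move on when the current track ends
            }
        });
        player.prepareAsync();
    } catch (IOException e) {
        playNext(player, index + 1);                      // skip unreadable files
    }
}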
It's doubtful that using MediaPlayer for this will work the way you want it to. Try this tutorial:
http://www.droidnova.com/creating-sound-effects-in-android-part-1,570.html
If that doesn't work, you'll probably have to mix the sounds together yourself and then stream the result directly to the hardware using AudioTrack. That's more low level, but it will give you the most control. Whether the AudioManager solution will work for you depends on what you are doing; it's definitely the simpler route. But if you're trying to line up two samples so that when one finishes the next begins, like in a music app, you will probably have to mix and stream that audio yourself.
http://developer.android.com/reference/android/media/AudioTrack.html
Algorithm to mix sound
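As a rough sketch of that approach, assuming you already have two decoded 16-bit PCM buffers (pcmA and pcmB are placeholders), you can sum and clip the samples and write the result to an AudioTrack:

// Mix two 16-bit PCM buffers by summing and clipping, then play the result via AudioTrack
short[] mixed = new short[Math.min(pcmA.length, pcmB.length)];
for (int i = 0; i < mixed.length; i++) {
    int sum = pcmA[i] + pcmB[i];
    mixed[i] = (short) Math.max(Short.MIN_VALUE, Math.min(Short.MAX_VALUE, sum)); // clip to avoid overflow
}

int bufferSize = AudioTrack.getMinBufferSize(44100,
        AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);
AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, 44100,
        AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
        bufferSize, AudioTrack.MODE_STREAM);
track.play();
track.write(mixed, 0, mixed.length); // blocks until the samples are queued for playback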