What I'm attempting:
I'm using youtube-dl in a Python daemon on a remote server to get a stream URL.
That URL is then fed to a MediaPlayer instance in an Android app.
What is happening:
Occasionally and unexpectedly the MediaPlayer will BLAST static and play at normal speed; sometimes it will blast static and play at 1.5x speed.
Here's a video of what happens. HEADPHONE WARNING
YouTube Video
Observations:
If there is static it is for the whole song (it isn't intermittent).
I've taken the URLs it provides and they play fine in a PC browser with no static.
It happens on different phones, and it is not just my particular phone.
It takes longer to start tracks that end up being staticky.
Tracks that are staticky make my progress bar (minutes:seconds display) behave strangely. I've seen it count up and down in the first couple of seconds, and there is the 1.5x speed I mentioned.
MediaHTTPConnection throws a lot of exceptions that I don't know how to handle.
E/MediaHTTPConnectionEx: disconnecting
E/MediaHTTPConnectionEx: RuntimeException: Unbalanced enter/exit
mConnection.disconnect();
Below is the portion of my Python daemon that returns the URL
ydl_opts = {
    'skip_download': True,  # We just want to extract the info
    'format': 'bestaudio',
    'forceurl': True,
    'ignoreerrors': True,
    'youtube_include_dash_manifest': False,
    'restrict_filenames': True,
    'source_address': '10.1.0.38',  # we have to set this to force IPv4
    'logger': MyLogger()
}

def ytdl(self, url):
    url2 = "https://www.youtube.com/watch?v=" + url
    with youtube_dl.YoutubeDL(ydl_opts) as ydl:
        ydl.download([url2])
Here's the (basically boilerplate) MediaPlayer
public static Stack<Track> tracks = new Stack<>();
private static MediaPlayer mediaPlayer;
private String mediaFile;
private static int duration = 0;
private AudioManager audioManager;
private Boolean userPause = false;
// Binder given to clients
private final IBinder iBinder = new LocalBinder();
public static final String TAG = "Player";
@Override
public IBinder onBind(Intent intent) {
return iBinder;
}
class LocalBinder extends Binder {
Player getService() {
return Player.this;
}
}
public static void seekTo(int msec){
if(mediaPlayer != null){
mediaPlayer.seekTo(msec);
}
}
//The system calls this method when an activity requests that the service be started
@Override
public int onStartCommand(Intent intent, int flags, int startId) {
boolean success = true;
//An audio file is passed to the service through putExtra();
if(intent.hasExtra("uri")){
mediaFile = intent.getStringExtra("uri");
} else {
stopSelf();
success = false;
}
//Request audio focus
if (!requestAudioFocus()) {
//Could not gain focus
Log.d(TAG, "error requesting audio focus");
stopSelf();
success = false;
}
if (mediaFile != null && !mediaFile.equals("") && success) {
Log.d(TAG, "Media File:" + mediaFile);
success = initMediaPlayer();
}
return super.onStartCommand(intent, flags, startId);
}
@Override
public void onDestroy() {
//we can't destroy the player here because the back button fires this
//maybe I can avoid calling super?
super.onDestroy();
/*if (mediaPlayer != null) {
if (mediaPlayer.isPlaying()) {
mediaPlayer.stop();
}
mediaPlayer.release();
}
removeAudioFocus();*/
}
private boolean initMediaPlayer() {
boolean error = false;
//one time setup
if(mediaPlayer == null) {
mediaPlayer = new MediaPlayer();
//setup listeners
mediaPlayer.setOnCompletionListener(this);
mediaPlayer.setOnErrorListener(this);
mediaPlayer.setOnPreparedListener(this);
mediaPlayer.setOnBufferingUpdateListener(this);
mediaPlayer.setOnSeekCompleteListener(this);
mediaPlayer.setOnInfoListener(this);
mediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
}
//Reset so that the MediaPlayer is not pointing to another data source
mediaPlayer.reset();
try {
Log.d(TAG, "setDataSource");
mediaPlayer.setDataSource(mediaFile);
} catch (IOException e) {
Log.e(TAG,"setDataSource error:"+e);
error = true;
}
try {
Log.d(TAG, "prepare");
mediaPlayer.prepare();
} catch (IOException e) {
Log.d(TAG, "prepare error");
e.printStackTrace();
error = true;
}
return !error; // onStartCommand() treats this return value as a success flag
}
@Override
public void onPrepared(MediaPlayer mp) {
//Invoked when the media source is ready for playback.
Log.d(TAG, "onPrepared");
mp.start();
duration = mp.getDuration();
}
@Override
public void onCompletion(MediaPlayer mp) {
//Invoked when playback of a media source has completed.
removeAudioFocus();
mp.stop();
mp.reset();
}
@Override
public boolean onInfo(MediaPlayer mp, int what, int extra) {
//Invoked to communicate some info.
return false;
}
@Override
public void onBufferingUpdate(MediaPlayer mp, int percent) {
//Invoked indicating buffering status of
//a media resource being streamed over the network.
if(percent%25==0)
Log.d(TAG, "buffering:"+percent);
}
@Override
public void onSeekComplete(MediaPlayer mp) {
//Invoked indicating the completion of a seek operation.
Log.d(TAG, "onSeekComplete() current pos : " + mp.getCurrentPosition());
SystemClock.sleep(200);
start();
}
//Handle errors
@Override
public boolean onError(MediaPlayer mp, int what, int extra) {
//Invoked when there has been an error during an asynchronous operation
switch (what) {
case MediaPlayer.MEDIA_ERROR_NOT_VALID_FOR_PROGRESSIVE_PLAYBACK:
Log.e(TAG, "MEDIA ERROR NOT VALID FOR PROGRESSIVE PLAYBACK " + extra);
break;
case MediaPlayer.MEDIA_ERROR_SERVER_DIED:
Log.e(TAG, "MEDIA ERROR SERVER DIED " + extra);
break;
case MediaPlayer.MEDIA_ERROR_UNKNOWN:
Log.e(TAG, "MEDIA ERROR UNKNOWN " + extra);
//NowPlaying.error = true;
break;
default:
Log.e(TAG, what + "," + extra);
break;
}
PlayerActivity.error = true;
return false;
}
@Override
public void onAudioFocusChange(int focusState) {
//Invoked when the audio focus of the system is updated.
switch (focusState) {
case AudioManager.AUDIOFOCUS_GAIN:
// resume playback
mediaPlayer.setVolume(1.0f, 1.0f);
if(!mediaPlayer.isPlaying()
&& !userPause) {
pause(false);
}
break;
case AudioManager.AUDIOFOCUS_LOSS:
// Lost focus for an unbounded amount of time: stop playback and release media player
if (mediaPlayer.isPlaying()) mediaPlayer.pause();
removeAudioFocus();
break;
case AudioManager.AUDIOFOCUS_LOSS_TRANSIENT:
// Lost focus for a short time, but we have to stop
// playback. We don't release the media player because playback
// is likely to resume
if (mediaPlayer.isPlaying()) mediaPlayer.pause();
break;
case AudioManager.AUDIOFOCUS_LOSS_TRANSIENT_CAN_DUCK:
// Lost focus for a short time, but it's ok to keep playing
// at an attenuated level
if (mediaPlayer.isPlaying()) mediaPlayer.setVolume(0.1f, 0.1f);
break;
}
}
private boolean requestAudioFocus() {
int result = 0;
if(audioManager == null) audioManager = (AudioManager) getSystemService(Context.AUDIO_SERVICE);
if (audioManager != null) {
result = audioManager.requestAudioFocus(this, AudioManager.STREAM_MUSIC, AudioManager.AUDIOFOCUS_GAIN);
}
Log.d(TAG, "requestAudioFocus:"+result);
return result == AudioManager.AUDIOFOCUS_REQUEST_GRANTED;
//Could not gain focus
}
private void removeAudioFocus() {
audioManager.abandonAudioFocus(this);
}
boolean isPlaying() {
if(mediaPlayer != null)
return mediaPlayer.isPlaying();
return false;
}
//pause(true) == pause
//pause(false) == play
//this is used by the system
void pause(Boolean state){
//pause
if (state) {
if (mediaPlayer.isPlaying()) {
mediaPlayer.pause();
}
} else {
if (!mediaPlayer.isPlaying()) {
start();
}
}
}
//this is a pause toggle that is only triggered by the pause/play button
boolean pause() {
if (mediaPlayer.isPlaying()){
userPause = true;
mediaPlayer.pause();
} else {
userPause = false;
start();
}
return !mediaPlayer.isPlaying();
}
void start(){
mediaPlayer.start();
}
int getCurrentPosition(){
if(mediaPlayer != null)
return mediaPlayer.getCurrentPosition();
return 0;
}
int getDuration(){
return duration;
}
}
I feel like someone else is going to have this problem so I'm going to post my solution.
I noticed a MIME type error popped up for every track. The error still shows up now that I've fixed the problem, but the loud static has stopped.
Here is the error that started the wheels turning:
E/MediaHTTPConnectionEx: getMIMEType
[seekToEx] offset:0/mCurrentOffset:-1
I noticed that some of the URLs youtube-dl was giving me for the webm streams didn't have a MIME type specified in the URL.
Here is an example:
...O8c2&mime=&pl=...
But all of the m4a streams had a MIME type in the URL:
...70F61&mime=audio%2Fmp4&itag=140&key...
So while my solution isn't the best one, it is the easiest: since ALL of the m4a streams had a MIME type specified, I just limited myself to those streams.
The rub is this:
I'm pretty sure that if I just checked the URL for a specified mime field I could still play most webm files. The only ones that were failing (staticky) were the URLs that did not have that field.
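If you'd rather go that route, a minimal sketch of such a check might look like the following (mediaFile, mediaPlayer and the fallback behaviour are just placeholders from my setup, not tested code):
// Rough sketch: only use a stream URL if it declares a mime type in its query string.
// mediaFile, mediaPlayer and the fallback are assumptions for illustration.
Uri candidate = Uri.parse(mediaFile);
String mime = candidate.getQueryParameter("mime");
if (mime == null || mime.isEmpty()) {
    Log.w(TAG, "No mime type in stream URL, skipping: " + candidate);
    // e.g. fall back to asking the daemon for an m4a stream instead
} else {
    try {
        mediaPlayer.setDataSource(getApplicationContext(), candidate);
    } catch (IOException e) {
        Log.e(TAG, "setDataSource error: " + e);
    }
}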
My solution:
Python only pulls m4a files:
...
'format':'bestaudio[ext=m4a]',
...
Android now passes hard-coded headers:
Map<String, String> headers = new HashMap<>();
headers.put("Content-Type", "audio/mp4"); // change content type if necessary
Uri uri = Uri.parse(mediaFile);
Log.d(TAG, "getMimeType="+getMimeType(uri));//this is ALWAYS null
mediaPlayer.setDataSource(getApplicationContext(), uri, headers);
I've tried almost everything I found on the internet, and I can't stop the media player once it starts. I'm using a BroadcastReceiver, and I'm controlling the media player via SMS. Here is my code.
public class Receiver extends BroadcastReceiver{
String body;
String address;
public static final String SMS_EXTRA_NAME="pdus";
MediaPlayer mp = new MediaPlayer();
@Override
public void onReceive(Context context, Intent intent) {
// TODO Auto-generated method stub
SharedPreferences obj1=context.getSharedPreferences("mypref", Context.MODE_PRIVATE);
String newstring=obj1.getString("key1", null);
String name=newstring;
Bundle bund=intent.getExtras();
String space="";
if(bund!=null)
{
Object[] smsExtra=(Object[])bund.get(SMS_EXTRA_NAME);
for(int i=0;i<smsExtra.length;i++)
{
SmsMessage sms=SmsMessage.createFromPdu((byte[])smsExtra[i]);
body=sms.getMessageBody().toString();
address=sms.getOriginatingAddress();
if(body.equals("ON"))
{
if(mp.isPlaying())
{
mp.stop();
}
try {
mp.reset();
AssetFileDescriptor afd;
afd = context.getAssets().openFd("file.mp3");
mp.setDataSource(afd.getFileDescriptor(),afd.getStartOffset(),afd.getLength());
mp.prepare();
mp.start();
mp.setLooping(true);
} catch (IllegalStateException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
}
else if(body.equals("OFF"))
{
if (mp.isPlaying()==true||mp!=null)
{
try{
mp.stop();
mp.release();
} catch(Exception e){
System.out.println("Exception"+e);
}
}
}
}
}
}
}
The media player turns on when I send "ON", but it won't turn off. And yes, I have given the required permissions in the manifest file.
A BroadcastReceiver only stays alive for around 9 seconds, so you should not run big operations in it. However, you can have it start an operation such as an Activity or a Service, and there you can play a track, start downloading a file, etc.
If you only want to start a player and don't need any user interaction, I suggest you start a Service and play whatever you want there.
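A minimal sketch of that idea, with a hypothetical PlayerService doing the actual playback (the service class and the "command" extra are made-up names, not from the question):
// Inside onReceive(): hand the work off to a Service instead of playing here.
// PlayerService and the "command" extra are hypothetical names.
if (body.equals("ON") || body.equals("OFF")) {
    Intent serviceIntent = new Intent(context, PlayerService.class);
    serviceIntent.putExtra("command", body);
    context.startService(serviceIntent);
}
Because the Service outlives each broadcast, the same MediaPlayer instance is still around when the "OFF" command arrives, which a field of a BroadcastReceiver can never guarantee.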
I spent a lot of time studying this problem and found out the following:
the problem is that I create the MediaPlayer inside a thread that is managed by an IntentService, and by the time playback starts that thread is no longer valid.
So the way out is:
final Handler handler = new Handler(getMainLooper());
handler.post(new Runnable() {
@Override
public void run() {
mediaPlayer.start();
}
});
handler.postDelayed(new Runnable() {
@Override
public void run() {
if (mediaPlayer.isPlaying()) {
mediaPlayer.stop();
}
}
}, 30 * 1000);
This helped me stop the MediaPlayer.
I have tried almost every method, but I've failed to achieve gapless audio playback when looping a single track with a duration of 10-15 seconds.
Steps I've tried without success:
Different audio file formats .mp3 .wav .ogg using
setLooping(true):
MediaPlayer mp1 = MediaPlayer.create(MainActivity.this, R.raw.track1);
mp1.setLooping(true);
mp1.start();
Creating two MediaPlayers and looping one after another using
setOnCompletionListener: this likewise failed to loop without gaps.
Using setNextMediaPlayer(nextmp): this somehow works, but only two loops are possible. We have to prepare and start again after the previous two loops complete.
mp1.start();
mp1.setNextMediaPlayer(mp2);
Update:
Result of @Jeff Mixon's answer: MediaPlayer looping stops with an error on Android.
Jeff Mixon's approach works fine, but only for 10 or 20 loops; after that, due to what seems to be a garbage-collection issue, the MediaPlayers stop immediately, leaving the logs posted below. I've really been stuck on this for 2 years. Thanks in advance.
E/MediaPlayer(24311): error (1, -38)
E/MediaPlayer(23256): Error(1,-1007)
E/MediaPlayer(23546): Error (1,-2147483648)
From the tests that I have done, this solution works fine: over 150 loops of a 13-second 160 kbps MP3 without any problem:
public class LoopMediaPlayer {
public static final String TAG = LoopMediaPlayer.class.getSimpleName();
private Context mContext = null;
private int mResId = 0;
private int mCounter = 1;
private MediaPlayer mCurrentPlayer = null;
private MediaPlayer mNextPlayer = null;
public static LoopMediaPlayer create(Context context, int resId) {
return new LoopMediaPlayer(context, resId);
}
private LoopMediaPlayer(Context context, int resId) {
mContext = context;
mResId = resId;
mCurrentPlayer = MediaPlayer.create(mContext, mResId);
mCurrentPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
@Override
public void onPrepared(MediaPlayer mediaPlayer) {
mCurrentPlayer.start();
}
});
createNextMediaPlayer();
}
private void createNextMediaPlayer() {
mNextPlayer = MediaPlayer.create(mContext, mResId);
mCurrentPlayer.setNextMediaPlayer(mNextPlayer);
mCurrentPlayer.setOnCompletionListener(onCompletionListener);
}
private MediaPlayer.OnCompletionListener onCompletionListener = new MediaPlayer.OnCompletionListener() {
@Override
public void onCompletion(MediaPlayer mediaPlayer) {
mediaPlayer.release();
mCurrentPlayer = mNextPlayer;
createNextMediaPlayer();
Log.d(TAG, String.format("Loop #%d", ++mCounter));
}
};
}
To use LoopMediaPlayer you can just call:
LoopMediaPlayer.create(context, R.raw.sample);
Ugly proof-of-concept code, but you'll get the idea:
// Will need this in the callbacks
final AssetFileDescriptor afd = getResources().openRawResourceFd(R.raw.sample);
// Build and start first player
final MediaPlayer player1 = MediaPlayer.create(this, R.raw.sample);
player1.start();
// Ready second player
final MediaPlayer player2 = MediaPlayer.create(this, R.raw.sample);
player1.setNextMediaPlayer(player2);
player1.setOnCompletionListener(new MediaPlayer.OnCompletionListener() {
@Override
public void onCompletion(MediaPlayer mediaPlayer) {
// When player1 completes, we reset it, and set up player2 to go back to player1 when it's done
mediaPlayer.reset();
try {
mediaPlayer.setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength());
mediaPlayer.prepare();
} catch (Exception e) {
e.printStackTrace();
}
player2.setNextMediaPlayer(player1);
}
});
player2.setOnCompletionListener(new MediaPlayer.OnCompletionListener() {
@Override
public void onCompletion(MediaPlayer mediaPlayer) {
// Likewise, when player2 completes, we reset it and tell player1 to use player2 again after it's finished
mediaPlayer.reset();
try {
mediaPlayer.setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength());
mediaPlayer.prepare();
} catch (Exception e) {
e.printStackTrace();
}
player1.setNextMediaPlayer(player2);
}
});
// This loop repeats itself endlessly in this fashion without gaps
This worked for me on an API 19 device and a 5-second 128 kbps MP3. No gaps in the loop.
At least as of KitKat, Mattia Maestrini's answer (to this question) is the only solution I've found that allows gapless looping of a large (> 1 MB uncompressed) audio sample. I've tried:
.setLooping(true): gives inter-loop noise or a pause even with a perfectly trimmed .WAV sample (a published bug in Android);
OGG format: frameless format, so better than MP3, but MediaPlayer still emits interloop artifacts; and
SoundPool: may work for small sound samples but large samples cause heap size overflow.
By simply including Maestrini's LoopMediaPlayer class in my project and then replacing my MediaPlayer.create() calls with LoopMediaPlayer.create() calls, I can ensure my .OGG sample is looped seamlessly. LoopMediaPlayer is therefore a commendably practical and transparent solution.
But this transparency raises a question: once I swap my MediaPlayer calls for LoopMediaPlayer calls, how does my instance call MediaPlayer methods such as .isPlaying, .pause, or .setVolume? Below is my solution for this issue. Possibly it can be improved upon by someone more Java-savvy than myself (and I welcome their input), but so far I've found it a reliable solution.
The only changes I make to Maestrini's class (aside from some tweaks recommended by Lint) are as marked at the end of the code below; the rest I include for context. My addition is to implement several methods of MediaPlayer within LoopMediaPlayer by calling them on mCurrentPlayer.
Caveat: while I implement several useful methods of MediaPlayer below, I do not implement all of them. So if you expect, for example, to call .attachAuxEffect, you will need to add it yourself as a method of LoopMediaPlayer along the lines of what I have added. Be sure to replicate the original interfaces of these methods (i.e., parameters, throws, and returns):
public class LoopMediaPlayer {
private static final String TAG = LoopMediaPlayer.class.getSimpleName();
private Context mContext = null;
private int mResId = 0;
private int mCounter = 1;
private MediaPlayer mCurrentPlayer = null;
private MediaPlayer mNextPlayer = null;
public static LoopMediaPlayer create(Context context, int resId) {
return new LoopMediaPlayer(context, resId);
}
private LoopMediaPlayer(Context context, int resId) {
mContext = context;
mResId = resId;
mCurrentPlayer = MediaPlayer.create(mContext, mResId);
mCurrentPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
@Override
public void onPrepared(MediaPlayer mediaPlayer) {
mCurrentPlayer.start();
}
});
createNextMediaPlayer();
}
private void createNextMediaPlayer() {
mNextPlayer = MediaPlayer.create(mContext, mResId);
mCurrentPlayer.setNextMediaPlayer(mNextPlayer);
mCurrentPlayer.setOnCompletionListener(onCompletionListener);
}
private final MediaPlayer.OnCompletionListener onCompletionListener = new MediaPlayer.OnCompletionListener() {
@Override
public void onCompletion(MediaPlayer mediaPlayer) {
mediaPlayer.release();
mCurrentPlayer = mNextPlayer;
createNextMediaPlayer();
Log.d(TAG, String.format("Loop #%d", ++mCounter));
}
};
// code-read additions:
public boolean isPlaying() throws IllegalStateException {
return mCurrentPlayer.isPlaying();
}
public void setVolume(float leftVolume, float rightVolume) {
mCurrentPlayer.setVolume(leftVolume, rightVolume);
}
public void start() throws IllegalStateException {
mCurrentPlayer.start();
}
public void stop() throws IllegalStateException {
mCurrentPlayer.stop();
}
public void pause() throws IllegalStateException {
mCurrentPlayer.pause();
}
public void release() {
mCurrentPlayer.release();
mNextPlayer.release();
}
public void reset() {
mCurrentPlayer.reset();
}
}
Something like this should work. Keep two copies of the same file in the res/raw directory. Please note that this is just a proof of concept and not optimized code. I just tested this out and it is working as intended. Let me know what you think.
public class MainActivity extends Activity {
MediaPlayer mp1;
MediaPlayer mp2;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
mp1 = MediaPlayer.create(MainActivity.this, R.raw.demo);
mp2 = MediaPlayer.create(MainActivity.this, R.raw.demo2);
mp1.start();
Thread thread = new Thread(new Runnable() {
@Override
public void run() {
int duration = mp1.getDuration();
while (mp1.isPlaying() || mp2.isPlaying()) {
try {
Thread.sleep(100);
} catch (InterruptedException e) {
e.printStackTrace();
}
duration = duration - 100;
if (duration < 1000) {
if (mp1.isPlaying()) {
mp2.start();
mp1.reset();
mp1 = MediaPlayer.create(MainActivity.this,
R.raw.demo);
duration = mp2.getDuration();
} else {
mp1.start();
mp2.reset();
mp2 = MediaPlayer.create(MainActivity.this,
R.raw.demo2);
duration = mp1.getDuration();
}
}
}
}
});
thread.start();
}
}
I suggest you use the SoundPool API instead of MediaPlayer.
From the official documentation:
The SoundPool class manages and plays audio resources for
applications.
...
Sounds can be looped by setting a non-zero loop
value. A value of -1 causes the sound to loop forever. In this case,
the application must explicitly call the stop() function to stop the
sound. Any other non-zero value will cause the sound to repeat the
specified number of times, e.g. a value of 3 causes the sound to play
a total of 4 times.
...
Take a look here for a practical example of how to use SoundPool.
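For instance, a minimal sketch of looping a short clip forever with SoundPool could look like this (R.raw.loop and the context variable are placeholders; the Builder API shown requires API 21):
// Minimal sketch: loop a short raw resource forever with SoundPool.
SoundPool soundPool = new SoundPool.Builder()
        .setMaxStreams(1)
        .setAudioAttributes(new AudioAttributes.Builder()
                .setUsage(AudioAttributes.USAGE_MEDIA)
                .setContentType(AudioAttributes.CONTENT_TYPE_MUSIC)
                .build())
        .build();
soundPool.setOnLoadCompleteListener(new SoundPool.OnLoadCompleteListener() {
    @Override
    public void onLoadComplete(SoundPool pool, int sampleId, int status) {
        if (status == 0) {
            // loop = -1 means loop forever; keep the returned streamId if you need to stop() it later
            pool.play(sampleId, 1f, 1f, 1, -1, 1f);
        }
    }
});
soundPool.load(context, R.raw.loop, 1);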
Using Mattia Maestrini's answer, I was able to get the audio looping the way I wanted, but since I was using this for Android Auto, I discovered that the audio only played over my phone's speakers instead of my car's speakers. I eventually found this answer, which points out a bug that makes it important in this context to use the new MediaPlayer() constructor together with the setDataSource method. I was already using Uris in my code, so I used that variant; I'm not 100% sure how important that is, and I would assume any of the other setDataSource variants would be sufficient if it matters for your code.
Here's what ultimately ended up working for me:
public class LoopMediaPlayer extends MediaPlayer {
private static final String TAG = LoopMediaPlayer.class.getSimpleName();
private Context mContext = null;
private Uri mMediaUri = null;
private int mCounter = 1;
private MediaPlayer mCurrentPlayer = null;
private MediaPlayer mNextPlayer = null;
private Float mLeftVolume;
private Float mRightVolume;
public static LoopMediaPlayer create(Context context, Uri mediaUri) {
try {
return new LoopMediaPlayer(context, mediaUri);
}
catch (Exception e) {
throw new RuntimeException("Unable to create media player", e);
}
}
private LoopMediaPlayer(Context context, Uri mediaUri) throws IOException {
mContext = context;
mMediaUri = mediaUri;
mCurrentPlayer = new MediaPlayer();
mCurrentPlayer.setDataSource(mContext, mMediaUri);
mCurrentPlayer.prepare();
createNextMediaPlayer();
}
private void createNextMediaPlayer() {
try {
mNextPlayer = new MediaPlayer();
mNextPlayer.setDataSource(mContext, mMediaUri);
if (mLeftVolume != null && mRightVolume != null) {
mNextPlayer.setVolume(mLeftVolume, mRightVolume);
}
mNextPlayer.prepare();
mCurrentPlayer.setNextMediaPlayer(mNextPlayer);
mCurrentPlayer.setOnCompletionListener(onCompletionListener);
}
catch (Exception e) {
Log.e(TAG, "Problem creating next media player", e);
}
}
private MediaPlayer.OnCompletionListener onCompletionListener = new MediaPlayer.OnCompletionListener() {
#Override
public void onCompletion(MediaPlayer mediaPlayer) {
mediaPlayer.release();
mCurrentPlayer = mNextPlayer;
createNextMediaPlayer();
Log.d(TAG, String.format("Loop #%d", ++mCounter));
}
};
@Override
public void prepare() throws IllegalStateException {
// no-op, internal media-players are prepared when they are created.
}
@Override
public boolean isPlaying() throws IllegalStateException {
return mCurrentPlayer.isPlaying();
}
@Override
public void setVolume(float leftVolume, float rightVolume) {
mCurrentPlayer.setVolume(leftVolume, rightVolume);
mNextPlayer.setVolume(leftVolume, rightVolume);
mLeftVolume = leftVolume;
mRightVolume = rightVolume;
}
@Override
public void start() throws IllegalStateException {
mCurrentPlayer.start();
}
@Override
public void stop() throws IllegalStateException {
mCurrentPlayer.stop();
}
@Override
public void pause() throws IllegalStateException {
mCurrentPlayer.pause();
}
@Override
public void release() {
mCurrentPlayer.release();
mNextPlayer.release();
}
@Override
public void reset() {
mCurrentPlayer.reset();
}
}
For some reason, I found that my OnCompletion event was always firing a fraction of a second late when attempting to loop an 8-second OGG file. For anyone experiencing this type of delay, try the following.
It is possible to forcibly queue a "nextMediaPlayer", as recommended in the previous solutions, by simply posting a delayed Runnable to a Handler for your MediaPlayers, avoiding looping in the onCompletion event altogether.
This performs flawlessly for me with my 160 kbps 8-second OGG, min API 16.
Somewhere in your Activity/Service, create a HandlerThread & Handler...
private HandlerThread SongLooperThread = new HandlerThread("SongLooperThread");
private Handler SongLooperHandler;
public void startSongLooperThread(){
SongLooperThread.start();
Looper looper = SongLooperThread.getLooper();
SongLooperHandler = new Handler(looper){
@Override
public void handleMessage(Message msg){
//do whatever...
}
};
}
public void stopSongLooperThread(){
if (Build.VERSION.SDK_INT < Build.VERSION_CODES.JELLY_BEAN_MR2){
SongLooperThread.quit();
} else {
SongLooperThread.quitSafely();
}
}
...start the Thread, declare and set up your MediaPlayers...
@Override
public void onCreate() {
// TODO Auto-generated method stub
super.onCreate();
startSongLooperThread();
activeSongResID = R.raw.some_loop;
activeMP = MediaPlayer.create(getApplicationContext(), activeSongResID);
activeSongMilliseconds = activeMP.getDuration();
queuedMP = MediaPlayer.create(getApplicationContext(),activeSongResID);
}
@Override
public void onDestroy() {
// TODO Auto-generated method stub
super.onDestroy();
stopSongLooperThread();
activeMP.release();
queuedMP.release();
activeMP = null;
queuedMP = null;
}
...create a Method for swapping your MediaPlayers...
private void swapActivePlayers(){
Log.v("SongLooperService","MediaPlayer swap started....");
queuedMP.start();
//Immediately get the Duration of the current track, then queue the next swap.
activeSongMilliseconds = queuedMP.getDuration();
SongLooperHandler.postDelayed(timedQueue,activeSongMilliseconds);
Log.v("SongLooperService","Next call queued...");
activeMP.release();
//Swap your active and queued MPs...
Log.v("SongLooperService","MediaPlayers swapping....");
MediaPlayer temp = activeMP;
activeMP = queuedMP;
queuedMP = temp;
//Prepare your now invalid queuedMP...
queuedMP = MediaPlayer.create(getApplicationContext(),activeSongResID);
Log.v("SongLooperService","MediaPlayer swapped.");
}
...create Runnables to post to your thread...
private Runnable startMP = new Runnable(){
public void run(){
activeMP.start();
SongLooperHandler.postDelayed(timedQueue,activeSongMilliseconds);
}
};
private Runnable timedQueue = new Runnable(){
public void run(){
swapActivePlayers();
}
};
In your Service's onStartCommand() or somewhere in your Activity, start the MediaPlayer...
...
SongLooperHandler.post(startMP);
...
I have tried everything suggested here and elsewhere, and the only thing that worked was ExoPlayer instead of libGDX's Music class. You can access your libGDX asset files with:
Uri.parse("file:///android_asset/" + path)
You'll also need platform specific code.
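For reference, a rough outline of the ExoPlayer side (this assumes ExoPlayer 2.x; the factory and media-source class names vary between ExoPlayer versions, so treat it as a sketch rather than exact code):
// Rough outline, ExoPlayer 2.x style; class names differ between ExoPlayer versions.
SimpleExoPlayer player = ExoPlayerFactory.newSimpleInstance(context);
DataSource.Factory dataSourceFactory = new DefaultDataSourceFactory(context, "MyApp");
MediaSource source = new ExtractorMediaSource.Factory(dataSourceFactory)
        .createMediaSource(Uri.parse("file:///android_asset/" + path));
player.setRepeatMode(Player.REPEAT_MODE_ALL); // let the player handle gapless looping
player.prepare(source);
player.setPlayWhenReady(true);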
code-read's LoopMediaPlayer example is great, but if you use the new MediaPlayer() way of creating the MediaPlayer (like I do for File or AssetFileDescriptor data sources) rather than the MediaPlayer.create() method, then you must be careful to:
1. Call the setOnCompletionListener method AFTER .start(), or it will not fire.
2. Fully .prepare() or .prepareAsync() the mNextPlayer before calling .setNextMediaPlayer on the mCurrentPlayer, or it will fail to play the mNextPlayer. This means calling .start(), setOnCompletionListener(), and .setNextMediaPlayer() in the OnPreparedListener, as shown below.
I have modified his code to use the new MediaPlayer() constructor to create the player and also added the ability to set the data source from an AssetFileDescriptor or a File. I hope this saves someone some time.
public class LoopMediaPlayer {
private static final String TAG = LoopMediaPlayer.class.getSimpleName();
private Context mContext = null;
private int mResId = 0;
private int mCounter = 1;
private AssetFileDescriptor mAfd = null;
private File mFile = null;
private MediaPlayer mCurrentPlayer = null;
private MediaPlayer mNextPlayer = null;
public static LoopMediaPlayer create(Context context, int resId) {
return new LoopMediaPlayer(context, resId);
}
public LoopMediaPlayer(Context context, File file){
mContext = context;
mFile = file;
try {
mCurrentPlayer = new MediaPlayer();
mCurrentPlayer.setLooping(false);
mCurrentPlayer.setDataSource(file.getAbsolutePath());
mCurrentPlayer.prepareAsync();
mCurrentPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
@Override
public void onPrepared(MediaPlayer mediaPlayer) {
mCurrentPlayer.start();
mCurrentPlayer.setOnCompletionListener(onCompletionListener);
createNextMediaPlayer();
}
});
} catch (Exception e) {
Log.e("media", e.getLocalizedMessage());
}
}
public LoopMediaPlayer(Context context, AssetFileDescriptor afd){
mAfd = afd;
mContext = context;
try {
mCurrentPlayer = new MediaPlayer();
mCurrentPlayer.setLooping(false);
mCurrentPlayer.setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength());
mCurrentPlayer.prepareAsync();
mCurrentPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
@Override
public void onPrepared(MediaPlayer mediaPlayer) {
mCurrentPlayer.start();
mCurrentPlayer.setOnCompletionListener(onCompletionListener);
createNextMediaPlayer();
}
});
} catch (Exception e) {
Log.e("media", e.getLocalizedMessage());
}
}
private LoopMediaPlayer(Context context, int resId) {
mContext = context;
mResId = resId;
mCurrentPlayer = MediaPlayer.create(mContext, mResId);
mCurrentPlayer.setLooping(false);
mCurrentPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
@Override
public void onPrepared(MediaPlayer mediaPlayer) {
mCurrentPlayer.start();
mCurrentPlayer.setOnCompletionListener(onCompletionListener);
createNextMediaPlayer();
}
});
mCurrentPlayer.prepareAsync();
}
private void createNextMediaPlayer() {
try{
if(mAfd != null){
mNextPlayer = new MediaPlayer();
mNextPlayer.setDataSource(mAfd.getFileDescriptor(), mAfd.getStartOffset(), mAfd.getLength());
mNextPlayer.prepareAsync();
mNextPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
@Override
public void onPrepared(MediaPlayer mp) {
mCurrentPlayer.setNextMediaPlayer(mNextPlayer);
}
});
}
else if(mFile!=null){
mNextPlayer = new MediaPlayer();
mNextPlayer.setDataSource(mFile.getAbsolutePath());
mNextPlayer.prepareAsync();
mNextPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
@Override
public void onPrepared(MediaPlayer mp) {
mCurrentPlayer.setNextMediaPlayer(mNextPlayer);
}
});
}
else {
mNextPlayer = MediaPlayer.create(mContext, mResId);
mNextPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
@Override
public void onPrepared(MediaPlayer mp) {
mCurrentPlayer.setNextMediaPlayer(mNextPlayer);
}
});
}
} catch (Exception e) {
}
}
private final MediaPlayer.OnCompletionListener onCompletionListener = new MediaPlayer.OnCompletionListener() {
@Override
public void onCompletion(MediaPlayer mediaPlayer) {
mediaPlayer.release();
mCurrentPlayer = mNextPlayer;
mCurrentPlayer.setOnCompletionListener(onCompletionListener);
createNextMediaPlayer();
Log.d("LoopMediaPlayer", String.format("Loop #%d", ++mCounter));
}
};
// code-read additions:
public boolean isPlaying() throws IllegalStateException {
return mCurrentPlayer.isPlaying();
}
public void setVolume(float leftVolume, float rightVolume) {
mCurrentPlayer.setVolume(leftVolume, rightVolume);
}
public void start() throws IllegalStateException {
mCurrentPlayer.start();
}
public void stop() throws IllegalStateException {
mCurrentPlayer.stop();
}
public void pause() throws IllegalStateException {
mCurrentPlayer.pause();
}
public void release() {
mCurrentPlayer.release();
mNextPlayer.release();
}
public void reset() {
mCurrentPlayer.reset();
}
}
Once again I turn to you with a question I've run into.
I have a class that extends Service and implements Runnable for audio playback. I'm working with a ProgressBar to display the progress of the music playback. In the run() method I added the code to update the value of the ProgressBar, and up to that point everything is fine. But when I'm playing a song and jump to another one without finishing the previous one, the thread that was created earlier is not destroyed. Can you help me, please?
public class PlaySongService extends Service implements MediaPlayer.OnCompletionListener, Runnable {
public static MediaPlayer mediaPlayer = new MediaPlayer();
private WeakReference<ProgressBar> pbSong;
private final String TAG = "PlaySongService";
@Override
public IBinder onBind(Intent intent) {
return null; //To change body of implemented methods use File | Settings | File Templates.
}
public void onCreate() {
mediaPlayer.setOnCompletionListener(this);
mediaPlayer.reset();
pbSong = new WeakReference<ProgressBar>(Main.pbSong);
super.onCreate();
}
....... /* Selected song in listview*/
private void playSong(int position) {
try {
mediaPlayer.reset();
mediaPlayer.setDataSource(my_file_selected_in_listview);
mediaPlayer.prepare();
mediaPlayer.start();
pbSong.get().setProgress(0);
pbSong.get().setMax(mediaPlayer.getDuration());
new Thread(this).start();
} catch (Exception e) {
Log.e(TAG,e.getMessage());
}
}
@Override
public void run() {
int currentPosition= 0;
int total = mediaPlayer.getDuration();
while (mediaPlayer!=null && currentPosition<total) {
try {
Thread.sleep(1000);
currentPosition= mediaPlayer.getCurrentPosition();
Log.i(TAG,Thread.currentThread().toString());
} catch (InterruptedException e) {
return;
} catch (Exception e) {
return;
}
pbSong.get().setProgress(currentPosition);
}
}
}
Threads are not "destroyed" under normal circumstances; they end when they return from the run() method. This can be initiated by, e.g., a boolean flag, or by a call to interrupt() (from outside the thread) combined with periodic checking of Thread.interrupted() inside the thread.
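For example, a sketch of the interrupt() variant (the progressThread name is just for illustration):
// Sketch of the interrupt() approach: the thread ends by returning from run().
Thread progressThread = new Thread(new Runnable() {
    @Override
    public void run() {
        while (!Thread.currentThread().isInterrupted()) {
            try {
                Thread.sleep(1000);
                // update the ProgressBar here
            } catch (InterruptedException e) {
                return; // interrupt() was called from outside; leaving run() ends the thread
            }
        }
    }
});
progressThread.start();
// later, e.g. when the user jumps to another song:
progressThread.interrupt();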
I have managed to get a working video player that can stream RTSP links; however, I'm not sure how to display the video's current time position in the UI. I have used the getDuration and getCurrentPosition calls, stored this information in a string, and tried to display it in the UI, but it doesn't seem to work.
**in main.xml:**
<TextView android:id="@+id/player"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_margin="1px"
android:text="@string/cpos"
/>
**in strings.xml:**
<string name="cpos">""</string>
**in Player.java**
private void playVideo(String url) {
try {
media.setEnabled(false);
if (player == null) {
player = new MediaPlayer();
player.setScreenOnWhilePlaying(true);
} else {
player.stop();
player.reset();
}
player.setDataSource(url);
player.getCurrentPosition();
player.setDisplay(holder);
player.setAudioStreamType(AudioManager.STREAM_MUSIC);
player.setOnPreparedListener(this);
player.prepareAsync();
player.setOnBufferingUpdateListener(this);
player.setOnCompletionListener(this);
} catch (Throwable t) {
Log.e(TAG, "Exception in media prep", t);
goBlooey(t);
try {
try {
player.prepare();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
Log.v(TAG, "Duration: ===> " + player.getDuration());
} catch (IllegalStateException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
}
private Runnable onEverySecond = new Runnable() {
public void run() {
if (lastActionTime > 0
&& SystemClock.elapsedRealtime() - lastActionTime > 3000) {
clearPanels(false);
}
if (player != null) {
timeline.setProgress(player.getCurrentPosition());
//stores getCurrentPosition as a string
cpos = String.valueOf(player.getCurrentPosition());
System.out.print(cpos);
}
if (player != null) {
timeline.setProgress(player.getDuration());
//stores getDuration as a string
cdur = String.valueOf(player.getDuration());
System.out.print(cdur);
}
if (!isPaused) {
surface.postDelayed(onEverySecond, 1000);
}
}
};
Your code snippet looks significantly like my vidtry sample. getCurrentPosition() and getDuration() work for HTTP streaming, such as for use in updating the progress bar.
I have not tried vidtry with an RTSP video stream, mostly because I don't know of any.
Check the SDP response from the server to ensure that it is sending the duration in the response (live streams don't have a recognizable time and that may cause the client to not provide this information.)
E.g. a live feed will look like:
a=range:npt=0-
Whereas a VoD clip should look like:
a=range:npt=0-399.1680
If getCurrentPosition() doesn't work but you know the duration (either getDuration() works or you have an alternate way of getting this information), you could calculate the position by watching the buffering events and tracking it yourself. Your approach is the more desirable one, though.
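If you do end up tracking it yourself, one crude simplification is to remember when playback started and derive an approximate position from wall-clock time (this is simpler than buffering-event bookkeeping and ignores pauses and seeks; playbackStartMs is an assumed field):
// Simplified sketch: approximate the position from elapsed wall-clock time.
// playbackStartMs is an assumed field; pausing/seeking would need extra bookkeeping.
private long playbackStartMs;

@Override
public void onPrepared(MediaPlayer mp) {
    mp.start();
    playbackStartMs = SystemClock.elapsedRealtime();
}

private int approximatePositionMs() {
    return (int) (SystemClock.elapsedRealtime() - playbackStartMs);
}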
If I understood you correctly, you want to show the elapsed time in a TextView, e.g. as hh:mm:ss?
If so, I'll give you a little walkthrough on how to do that.
private TextView mElapsedTimeText;
private VideoView mVideoView;
private Thread mThread;
@Override
public void onCreate(Bundle savedInstanceState) {
/* here goes your code */
// let's assume that your IDs are elapsedId and videoId
mElapsedTimeText = (TextView) findViewById(R.id.elapsedId);
mVideoView = (VideoView) findViewById(R.id.videoId);
mThread = new Thread() {
@Override
public void run() {
mElapsedTimeText.setText(getNiceString());
mVideoView.postDelayed(mThread, 1000);
}
};
/* here goes your code */
}
public String getNiceString() {
String result = "";
int position = mVideoView.getCurrentPosition();
/* here goes your code */
//result is hh:mm:ss formatted string
return result;
}
@Override
public void onPrepared(MediaPlayer mp) {
/* here goes your code */
// you have to trigger the process somewhere
mVideoView.postDelayed(mThread, 1000);
/* here goes your code */
}
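If it helps, here is one possible body for getNiceString(), just a sketch of the part marked "here goes your code":
// One possible implementation: format the current position (milliseconds) as hh:mm:ss.
public String getNiceString() {
    int position = mVideoView.getCurrentPosition(); // milliseconds
    int totalSeconds = position / 1000;
    int hours = totalSeconds / 3600;
    int minutes = (totalSeconds % 3600) / 60;
    int seconds = totalSeconds % 60;
    // result is an hh:mm:ss formatted string
    return String.format(Locale.US, "%02d:%02d:%02d", hours, minutes, seconds);
}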
And one more thing I forgot to mention: in order to make this work, your activity class has to implement the OnPreparedListener interface.
I hope you or someone else will find this post useful.
Best regards,
Igor