androidx.media2.player.MediaPlayer combining two sounds - android

I have two instances of androidx.media2.player.MediaPlayer, one for playing .mp4, one for playing .wav. I want them both to play audio simultaneously.
I am using setAudioAttributes on both players, but as soon as I set the attributes on both, neither of them plays sound, and the video player doesn't even play video.
val soundFile = course.header.directory + soundFileName
val file = File(soundFile)
val fileDescriptor = ParcelFileDescriptor.open(
    file,
    ParcelFileDescriptor.MODE_READ_ONLY)
val mediaItem = FileMediaItem.Builder(fileDescriptor).build()
audioPlayer.setMediaItem(mediaItem)
audioPlayer.setAudioAttributes(AudioAttributesCompat.Builder()
    .setContentType(AudioAttributesCompat.CONTENT_TYPE_SPEECH)
    .setUsage(AudioAttributesCompat.USAGE_MEDIA)
    .build()
)
audioPlayer.prepare().addListener({
    videoPlayer.play()
    audioPlayer.play()
}, ContextCompat.getMainExecutor(context))
I set the attributes on the videoPlayer in a similar way.
I've also tried setting the same session id on both players, but it didn't help.
Blocking the thread for five seconds appears to work around the problem, but I don't know why:
audioPlayer.prepare().addListener({
    videoPlayer.play()
    audioPlayer.play()
    Thread.sleep(5000)
}, ContextCompat.getMainExecutor(context))
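One unverified guess is that the delay only masks a race: the video player may not have finished preparing by the time play() is called on it. Purely as a sketch, assuming neither player has been prepared yet and both expose prepare() as a ListenableFuture (as in androidx.media2), waiting for both futures before starting either player would look like this:
val audioPrepared = audioPlayer.prepare()
val videoPrepared = videoPlayer.prepare()
// Start playback only after BOTH players report that preparation finished.
audioPrepared.addListener({
    videoPrepared.addListener({
        videoPlayer.play()
        audioPlayer.play()
    }, ContextCompat.getMainExecutor(context))
}, ContextCompat.getMainExecutor(context))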

I haven't found a solution. I've moved on to ExoPlayer, which doesn't have setAudioAttributes.

Related

Showing video play progress while selecting the video range for trimming

I am designing an Android video editor app, and one of its features is trimming a video selected from the gallery. I can let the user select the range with a RangeSlider displayed at the bottom of the VideoView, and then use the FFmpeg library to trim the video.
But I am not able to show the progress of the video being played, within the selected range, on the RangeSlider.
I'm not sure whether I'm approaching this properly, so please suggest a way to achieve it.
When you change the bounds of the RangeSlider, you need to calculate the startTime and endTime of the video. Once you have those, create a ClippingMediaSource instance.
public ClippingMediaSource(MediaSource mediaSource, long startPositionUs, long endPositionUs)
ClippingMediaSource takes three parameters:
MediaSource
startPositionUs
endPositionUs
You can create media source by following the below snippet:
fun getMediaSource(file: String): MediaSource {
return ProgressiveMediaSource.Factory(DefaultDataSourceFactory(context, userAgent))
.createMediaSource(MediaItem.fromUri(Uri.parse(file)))
}
After creating the MediaSource, pass in the start and end times you calculated.
Note: startPositionUs and endPositionUs are in microseconds.
Once that's done, pass this media source to your ExoPlayer and it will play only the selected/trimmed part of your video.
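For illustration, a minimal sketch of that last step; startTimeMs, endTimeMs and videoPath are hypothetical values (slider positions in milliseconds and the selected file path), not part of the original answer:
// Wrap the progressive source with the clip boundaries (converted to
// microseconds) and hand the clipped source to the player.
val clippedSource = ClippingMediaSource(
    getMediaSource(videoPath),
    startTimeMs * 1000,   // startPositionUs
    endTimeMs * 1000      // endPositionUs
)
exoPlayer.setMediaSource(clippedSource)
exoPlayer.prepare()
exoPlayer.playWhenReady = true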

Does ExoPlayer create a windowIndex for each media source in ConcatenatingMediaSource?

I'm working on an app that streams a list of mp3 files. To do this I've used ExoPlayer with a ConcatenatingMediaSource, like this:
private fun createMediaSource(
    tracks: List<Track>
): MediaSource = ConcatenatingMediaSource(true).apply {
    tracks.forEach { track ->
        val mediaSource = ProgressiveMediaSource
            .Factory(DefaultDataSourceFactory(context))
            .createMediaSource(MediaItem.fromUri(track.getFullUri()))
        addMediaSource(mediaSource)
    }
}
This works great: the files play as a list with no errors at all. However, what's required of me is to present all these streams as a single stream, where the seek bar shows the total length of all streams and the user can seek seamlessly between them.
Of course I'm not using the player view provided by ExoPlayer, because I need the seek bar to span all media sources, which apparently isn't possible with the ExoPlayer UI.
So this is the logic I've used when the user tries to seek:
exoPlayer.apply {
    var previousTracksLength = 0L
    var windowIndex = 0
    var currentItemLength = 0L
    run loop@{
        tracksList.forEachIndexed { index, track ->
            currentItemLength = track.getLengthMillis()
            previousTracksLength += currentItemLength
            if (newPositionMillis < previousTracksLength) {
                windowIndex = index
                return@loop
            }
        }
    }
    val positionForCurrentTrack = newPositionMillis - (previousTracksLength - currentItemLength)
    pause()
    if (windowIndex == currentWindowIndex) {
        seekTo(positionForCurrentTrack)
    } else {
        seekTo(windowIndex, positionForCurrentTrack)
    }
    play()
}
This works amazingly well when the ConcatenatingMediaSource has 3 or fewer media sources, but with more than that, weird behaviour starts showing up: I might want to seek just 10 seconds forward and the player moves more than 2 minutes instead.
After debugging, it became obvious that when I call seekTo(windowIndex, positionForCurrentTrack), ExoPlayer seeks to a window that isn't mapped to the expected media source in the ConcatenatingMediaSource!
So here are my questions:
Does ExoPlayer create a single window for each media source in the ConcatenatingMediaSource, or not?
And if not, is there a way to force it to do that?
This is not really an answer, but the explanation: the reason why, when I called seekTo(windowIndex, position), the player seemed to ignore the windowIndex and seek to a completely unexpected position is that the media type was mp3!
Apparently many devs have hit the same issue, where the player's seek position is out of sync with the real position of the media being played when it's an mp3.
More details for anyone having weird issues when playing mp3 files with ExoPlayer:
https://github.com/google/ExoPlayer/issues/6787#issuecomment-568180969
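As a side note, one mitigation that is often suggested for inaccurate mp3 seeking (an assumption here, not something confirmed in the linked thread) is to enable constant-bitrate seeking on the extractors when building each source, e.g.:
// Hedged sketch: DefaultExtractorsFactory with constant-bitrate seeking
// enabled, plugged into the same ProgressiveMediaSource.Factory as above.
val extractorsFactory = DefaultExtractorsFactory()
    .setConstantBitrateSeekingEnabled(true)

val mediaSource = ProgressiveMediaSource
    .Factory(DefaultDataSourceFactory(context), extractorsFactory)
    .createMediaSource(MediaItem.fromUri(track.getFullUri()))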

Playing an audio file with ExoPlayer a few seconds after calling .prepare()

I have a feature in my app that has to play one short audio file multiple times, and using the code below I'm preparing ExoPlayer to play the audio:
SimpleExoPlayer player;

private void readyExoPlayerRaw(int rawSound) {
    player = ExoPlayerFactory.newSimpleInstance(this, new DefaultTrackSelector());
    DataSpec dataSpec = new DataSpec(RawResourceDataSource.buildRawResourceUri(rawSound));
    final RawResourceDataSource rawResourceDataSource = new RawResourceDataSource(this);
    try {
        rawResourceDataSource.open(dataSpec);
    } catch (RawResourceDataSource.RawResourceDataSourceException e) {
        e.printStackTrace();
    }
    MediaSource audioSource =
            new ExtractorMediaSource.Factory(
                    new DefaultDataSourceFactory(this, "appName"))
                    .createMediaSource(rawResourceDataSource.getUri());
    player.prepare(audioSource);
    //player.setPlayWhenReady(true);
}
The problem is that the sound plays fine when I uncomment the last line (//player.setPlayWhenReady(true);), but since I'm playing the sound a few seconds after running this method, that line won't work! I think this is because it waits for a callback from the previous line (.prepare(...)) and plays as soon as it's ready. So I thought maybe I should call start() or something like that on the player, but there's no such method.
So I'm stuck calling .prepare() each time I want to play the audio, but since the sound plays at very short intervals, calling prepare() causes an unacceptable delay.
So am I missing something? How can I play a prepared MediaSource without preparing it again?
I think you're on the right track!
setPlayWhenReady is how you pause and play the audio, but it will only actually play once it's "ready", i.e. done preparing.
So you should only have to call your readyExoPlayerRaw once, probably in onResume or onCreate. Then, at the end of that method, you can call setPlayWhenReady(false) so that it doesn't start playing until you tell it to.
Then when you want to play the sound, do something like the following:
private void playSound() {
    // Go to the beginning of the audio (in case it has played before)
    player.seekTo(0);
    // Tell it to play the sound
    player.setPlayWhenReady(true);
}
Let me know if that works.

Beat matching crossfade with ExoPlayer

I want to implement a beat-matching crossfade feature using ExoPlayer. I have a basic concept of how it should work, but I find it hard to adapt it to ExoPlayer.
Let me first describe how I want to do this so you can understand the case.
As you probably know, a beat-matching crossfade lets you seamlessly switch from one song to another. Additionally, it adjusts the second song's tempo to the first song's tempo during the crossfade.
So my plan is as follows:
1. Load songs A and B so they both start to buffer.
2. Decoded samples of songs A and B are stored in buffers BF1 and BF2.
3. There would be a class called MUX, which is the main buffer and contains both songs' buffers, BF1 and BF2. MUX provides audio samples to the player: either BF1 samples, or samples mixed from BF1 and BF2 if there is a crossfade.
4. When the buffer reaches the crossfade point, samples are sent to an Analyser class so it can analyse samples from both buffers and modify them for the crossfade. The Analyser sends the modified samples back to MUX, which updates its main buffer.
5. When the crossfade is finished, load the next song from the playlist.
My main question is how to mix two songs so I can implement a class like MUX.
What I know so far is that I can access decoded samples in the MediaCodecRenderer.processOutputBuffer() method, so from that point I could create my BF1 and BF2 buffers.
There was also an idea to create two instances of ExoPlayer: while the first song is playing, the second one is analysed and its samples are modified for the upcoming crossfade. But I think it may be hard to synchronise two players so the beats would match.
Thanks in advance for any help!
Answering @David's question about the crossfade implementation, it looks more or less like this. You have to listen to the active player's playback and call this method when you want to start the crossfade. It's in RxJava but can easily be migrated to Coroutines Flow.
fun crossfadeObservable(
    fadeOutPlayer: Player?,
    fadeInPlayer: Player,
    crossfadeDurationMs: Long,
    crossfadeScheduler: Scheduler,
    uiScheduler: Scheduler
): Observable<Unit> {
    val fadeInMaxGain = fadeInPlayer.audioTrack?.volume ?: 1f
    val fadeOutMaxGain = fadeOutPlayer?.audioTrack?.volume ?: 1f
    fadeOutPlayer?.enableAudioFocus(false)
    fadeInPlayer.enableAudioFocus(true)
    fadeInPlayer.playerVolume = 0f
    fadeInPlayer.play()
    fadeOutPlayer?.playerVolume = fadeOutMaxGain
    fadeOutPlayer?.play()
    val iterations: Float = crossfadeDurationMs / CROSSFADE_STEP_MS.toFloat()
    return Observable.interval(CROSSFADE_STEP_MS, TimeUnit.MILLISECONDS, crossfadeScheduler)
        .take(iterations.toInt())
        .map { iteration -> (iteration + 1) / iterations }
        .filter { percentOfCrossfade -> percentOfCrossfade <= 1f }
        .observeOn(uiScheduler)
        .map { percentOfCrossfade ->
            fadeInPlayer.playerVolume = percentOfCrossfade.coerceIn(0f, fadeInMaxGain)
            fadeOutPlayer?.playerVolume = (fadeOutMaxGain - percentOfCrossfade).coerceIn(0f, fadeOutMaxGain)
        }
        .last()
        .doOnTerminate { fadeOutPlayer?.pause() }
        .doOnUnsubscribe { fadeOutPlayer?.pause() }
}
const val CROSSFADE_STEP_MS = 100L
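A hypothetical usage sketch to go with it; currentPlayer, nextPlayer, the 8-second duration and the scheduler choices are assumptions, not part of the original answer:
// RxJava 1 style, matching doOnUnsubscribe above: start the crossfade
// shortly before the current track ends and subscribe to kick it off.
val subscription = crossfadeObservable(
    fadeOutPlayer = currentPlayer,
    fadeInPlayer = nextPlayer,
    crossfadeDurationMs = 8_000L,
    crossfadeScheduler = Schedulers.computation(),
    uiScheduler = AndroidSchedulers.mainThread()
).subscribe()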

How do I get the index of the current track being played using ExoPlayer

I am working on an Android project that involves the use of Google's ExoPlayer.
I have a list of video sources which I build a playlist from using the following code:
for (int i = 0; i < vidList.length(); i++) {
    MediaSource source = new ExtractorMediaSource(Uri.parse(vidList.getJSONObject(i).getString("url")),
            buildDataSourceFactory(bandwidthMeter), extractorsFactory, mainHandler, HomeFragment.this);
    mediaSources.add(source);
    captions.add(vidList.getJSONObject(i).getString("caption"));
}
mediaSource = new ConcatenatingMediaSource(mediaSources.toArray(new MediaSource[mediaSources.size()]));
I then call
exoPlayer.prepare(mediaSource, false, false)
and the videos play fine in succession. I would like to display the caption of the currently playing video in a TextView, so I have a separate list that holds the "caption" value for each video.
From scouring through the code, I see that I can get the currently playing video in the playlist like this:
exoPlayer.getCurrentPeriodIndex()
This seems to work and returns the index, except for one problem: it returns 0 twice as playback starts. That is, the video at index 0 returns period 0, and so does the video at index 1. This only occurs at indexes 0 and 1; thereafter everything looks fine, except that getCurrentPeriodIndex() returns theAccurateIndex - 1.
I see this happening in the demo ExoPlayer application as well.
Is there a better way to determine what track is currently playing in the playlist?
Thanks.
To find the currently playing track, you need to reference the exoPlayer's currentWindowIndex. It looks like this in Java:
exoPlayer.getCurrentWindowIndex()
I'm not sure what getCurrentPeriodIndex() does; the docs don't elaborate, and I don't like speculating.
exoPlayer.getCurrentWindowIndex() is deprecated.
Use exoPlayer.getCurrentMediaItemIndex() instead.
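If you're on a recent ExoPlayer, a listener-based sketch (in Kotlin; captionTextView is a hypothetical view, while the captions list corresponds to the one in the question) avoids polling the index entirely:
// Update the caption whenever playback moves to a new item in the playlist.
exoPlayer.addListener(object : Player.Listener {
    override fun onMediaItemTransition(mediaItem: MediaItem?, reason: Int) {
        captionTextView.text = captions[exoPlayer.currentMediaItemIndex]
    }
})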
