How do I add audio tracks to a MediaItem in ExoPlayer? - android

I'm basically trying to implement dub (multiple audio language) functionality in a video player using ExoPlayer, but I can't figure out how to add multiple audio tracks to a video.
I've tried creating multiple MediaSources and adding them to the player, but that doesn't work. I also tried adding a separate MediaItem for each language, but while debugging, after ExoPlayer invokes the callback signalling that the tracks changed, the track count is still 0.
Here's how I do that currently:
resource.url?.let { url ->
    val mediaItem = MediaItem.Builder()
        .setUri(url.toUri()) // this is the video URL and works fine, the video plays
        .setSubtitleConfigurations(generateSubtitles()) // subs work fine
        .build()
    val allMediaItems = listOf(mediaItem) + generateDubs()
    with(exoPlayer) {
        setMediaItems(allMediaItems)
        selectAudioTrack()
        prepare()
        play()
    }
}
generateDubs() takes the URLs of the different audio tracks and maps them to MediaItem instances (currently just mock data with a single audio file):
private fun generateDubs(): List<MediaItem> {
    return listOf(MediaItem.fromUri("https://www.soundhelix.com/examples/mp3/SoundHelix-Song-1.mp3"))
}
viewModel.exoPlayer.addListener(object : Listener {
    override fun onTracksChanged(tracks: Tracks) {
        super.onTracksChanged(tracks) // <- debugging here shows tracks = 0
        Log.i("mytag", "$tracks")
    }
})
Am I doing something wrong here, or am I thinking about this way of achieving dub functionality the wrong way?
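For context (this is not from the original post): setting multiple MediaItems builds a playlist, where each item plays one after the other, rather than a single video with several selectable audio tracks. One commonly suggested approach is to combine the video source with the audio sources through MergingMediaSource; a minimal sketch, assuming videoUrl and dubUrl variables that are not in the original code:
val dataSourceFactory = DefaultHttpDataSource.Factory()
// One ProgressiveMediaSource for the video and one per dubbed audio track.
val videoSource = ProgressiveMediaSource.Factory(dataSourceFactory)
    .createMediaSource(MediaItem.fromUri(videoUrl))
val dubSource = ProgressiveMediaSource.Factory(dataSourceFactory)
    .createMediaSource(MediaItem.fromUri(dubUrl))
// The merged source is a single playlist item exposing all merged tracks,
// so the dub shows up as an additional audio track in onTracksChanged.
exoPlayer.setMediaSource(MergingMediaSource(videoSource, dubSource))
exoPlayer.prepare()
exoPlayer.play()
Selecting between the original audio and a dub would then go through track selection (for example, TrackSelectionParameters with a preferred audio language) rather than by switching playlist items.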

Related

Android, how to add multiple tracks to peer connection in WebRTC?

I want to send multiple tracks to a remote peer: for example, a videoTrack, an audioTrack, and a shareScreenTrack.
I used UNIFIED_PLAN as shown below:
val rtcConfig = PeerConnection.RTCConfiguration(
    arrayListOf(PeerConnection.IceServer.builder("stun:stun.l.google.com:19302").createIceServer())
).apply { sdpSemantics = PeerConnection.SdpSemantics.UNIFIED_PLAN }
And I add the tracks like this:
peerConnection?.addTrack(videoTrack)
peerConnection?.addTrack(audioTrack)
peerConnection?.addTrack(captureScreenVideoTrack)
but only the first track goes through.
When I debug onTrack, the callback is hit only once, for videoTrack. It is never hit for audioTrack or captureScreenVideoTrack.
override fun onTrack(transceiver: RtpTransceiver?) {
    super.onTrack(transceiver)
    val track = transceiver?.receiver?.track() ?: return
    when (track.kind()) {
        MediaStreamTrack.VIDEO_TRACK_KIND -> {
            // videoTrack or captureScreenVideoTrack
        }
        MediaStreamTrack.AUDIO_TRACK_KIND -> {
            // audioTrack
        }
        else -> {}
    }
}
I found my problem; it was my fault. I was calling createOffer from onRenegotiationNeeded. onRenegotiationNeeded is triggered when the first track is added to the PeerConnection, and since the second track is added right after that, onRenegotiationNeeded is triggered again.
I removed createOffer from onRenegotiationNeeded, made it run only on the first call, and the problem was resolved.
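A minimal sketch of that guard, assuming a PeerConnection.Observer implementation with an sdpObserver field (both names are illustrative):
private var offerCreated = false

override fun onRenegotiationNeeded() {
    // Guard so createOffer runs only on the first onRenegotiationNeeded call,
    // instead of once per added track.
    if (!offerCreated) {
        offerCreated = true
        peerConnection?.createOffer(sdpObserver, MediaConstraints())
    }
}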

Adding MediaItem when using the media3 library caused an error

I am using the latest Android Media3 library, but I found a problem while using it.
I created a MediaSessionService, got the MediaController in the Activity, and when I tried to use the controller to add some MediaItems, an error occurred:
java.lang.NullPointerException
at androidx.media3.common.util.Assertions.checkNotNull(Assertions.java:155)
at androidx.media3.exoplayer.source.DefaultMediaSourceFactory.createMediaSource(DefaultMediaSourceFactory.java:338)
at androidx.media3.exoplayer.ExoPlayerImpl.createMediaSources(ExoPlayerImpl.java:1164)
at androidx.media3.exoplayer.ExoPlayerImpl.addMediaItems(ExoPlayerImpl.java:463)
at androidx.media3.exoplayer.SimpleExoPlayer.addMediaItems(SimpleExoPlayer.java:1146)
at androidx.media3.common.BasePlayer.addMediaItems(BasePlayer.java:69)
at androidx.media3.common.BasePlayer.addMediaItem(BasePlayer.java:64)
at androidx.media3.common.ForwardingPlayer.addMediaItem(ForwardingPlayer.java:90)
at androidx.media3.session.PlayerWrapper.addMediaItem(PlayerWrapper.java:346)
at androidx.media3.session.MediaSessionStub.lambda$addMediaItem$28(MediaSessionStub.java:1052)
at androidx.media3.session.MediaSessionStub$$ExternalSyntheticLambda8.run(Unknown Source:2)
at androidx.media3.session.MediaSessionStub.lambda$getSessionTaskWithPlayerCommandRunnable$2$androidx-media3-session-MediaSessionStub(MediaSessionStub.java:234)
at androidx.media3.session.MediaSessionStub$$ExternalSyntheticLambda52.run(Unknown Source:14)
at androidx.media3.session.MediaSessionStub.lambda$flushCommandQueue$50(MediaSessionStub.java:1479)
at androidx.media3.session.MediaSessionStub$$ExternalSyntheticLambda58.run(Unknown Source:2)
at androidx.media3.common.util.Util.postOrRun(Util.java:517)
at androidx.media3.session.MediaSessionStub.flushCommandQueue(MediaSessionStub.java:1473)
at androidx.media3.session.MediaControllerImplBase$FlushCommandQueueHandler.handleMessage(MediaControllerImplBase.java:3035)
at android.os.Handler.dispatchMessage(Handler.java:106)
at android.os.Looper.loopOnce(Looper.java:201)
at android.os.Looper.loop(Looper.java:288)
at android.app.ActivityThread.main(ActivityThread.java:7813)
at java.lang.reflect.Method.invoke(Native Method)
at com.android.internal.os.RuntimeInit$MethodAndArgsCaller.run(RuntimeInit.java:548)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:1003)
So I checked the createMediaSource function of DefaultMediaSourceFactory and found that it is checking whether the localConfiguration of MediaItem is null:
@Override
public MediaSource createMediaSource(MediaItem mediaItem) {
    checkNotNull(mediaItem.localConfiguration);
    ...
}
And this is localConfiguration:
/**
 * Optional configuration for local playback. May be {@code null} if shared over process
 * boundaries.
 */
@Nullable public final LocalConfiguration localConfiguration;
I am pretty sure there is no problem with the way I create the MediaItem (it works fine inside the Service), but when I try to add the MediaItem from the Activity, the error occurs. Based on the comment, I guess this may be a cross-process communication problem, but I don't have any clue about it. Does anyone have experience with Media3?
When you add/set MediaItems from a controller, the localConfiguration (uri, mimeType, drm config, etc.) of the MediaItem is removed for security/privacy reasons. Without a localConfiguration the player can't play the media item, so we need to add the missing information back to the MediaItem.
Updated answer (media3 1.0.0-beta01 or higher)
Open the Callback you defined when creating the MediaLibrarySession in your Service.
// My MediaLibraryService
// onCreate()
mediaLibrarySession = MediaLibrarySession.Builder(
    this,
    player,
    librarySessionCallback // <--
).build()
// NOTE: If you are using MediaSessionService instead of MediaLibraryService,
// use `setCallback(librarySessionCallback)` from the MediaSession.Builder.
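For reference, a minimal sketch of the MediaSessionService variant mentioned in the note above (the player and callback fields are assumed):
// My MediaSessionService
// onCreate()
mediaSession = MediaSession.Builder(this, player)
    .setCallback(sessionCallback) // a MediaSession.Callback that overrides onAddMediaItems
    .build()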
Override onAddMediaItems inside your MediaLibrarySession.Callback. Every time you use setMediaItem/addMediaItem from a controller, your onAddMediaItems will be called and the MediaItems returned there are the ones that will be played.
class CustomMediaLibrarySessionCallback : MediaLibraryService.MediaLibrarySession.Callback {
    // [...]
    override fun onAddMediaItems(
        mediaSession: MediaSession,
        controller: MediaSession.ControllerInfo,
        mediaItems: MutableList<MediaItem>
    ): ListenableFuture<List<MediaItem>> {
        // NOTE: You can use the id from the mediaItems to look up missing
        // information (e.g. get the URI from a database) and return a Future with
        // a list of playable MediaItems.
        // If your use case is really simple and the security/privacy reasons
        // mentioned earlier don't apply to you, you can add the URI to the
        // MediaItem request metadata in your activity/fragment and use it
        // to rebuild the playable MediaItem.
        val updatedMediaItems = mediaItems.map { mediaItem ->
            mediaItem.buildUpon()
                .setUri(mediaItem.requestMetadata.mediaUri)
                .build()
        }
        return Futures.immediateFuture(updatedMediaItems)
    }
}
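The NOTE above mentions looking up the missing information by id; a minimal sketch of that variant, using a hypothetical in-memory catalog in place of a real database query:
// Hypothetical catalog keyed by mediaId; in a real app this would be a DAO/repository call.
private val uriCatalog = mapOf("123" to "https://example.com/media/123.mp3")

override fun onAddMediaItems(
    mediaSession: MediaSession,
    controller: MediaSession.ControllerInfo,
    mediaItems: MutableList<MediaItem>
): ListenableFuture<List<MediaItem>> {
    val updatedMediaItems = mediaItems.map { mediaItem ->
        mediaItem.buildUpon()
            .setUri(uriCatalog[mediaItem.mediaId]?.toUri())
            .build()
    }
    return Futures.immediateFuture(updatedMediaItems)
}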
Create and play your MediaItem from the activity/fragment.
// My Activity
val mmd = MediaMetadata.Builder()
    .setTitle("Example")
    .setArtist("Artist name")
    .build()

// Request metadata. New in 1.0.0-beta01.
// This is optional. I'm adding a RequestMetadata to the MediaItem so I
// can get the mediaUri from my `onAddMediaItems` simple use case (see
// onAddMediaItems for more info).
// If you are going to get the final URI from a database, you can move your
// query to your `MediaLibrarySession.Callback#onAddMediaItems` and skip this.
val rmd = RequestMetadata.Builder()
    .setMediaUri("...".toUri())
    .build()

val mediaItem = MediaItem.Builder()
    .setMediaId("123")
    .setMediaMetadata(mmd)
    .setRequestMetadata(rmd)
    .build()

browser.setMediaItem(mediaItem)
browser.prepare()
browser.play()
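The browser used above is a MediaBrowser (a MediaController works the same way) connected to the session; a minimal sketch of obtaining one, with PlaybackService standing in for your own service class:
val sessionToken = SessionToken(this, ComponentName(this, PlaybackService::class.java))
val browserFuture = MediaBrowser.Builder(this, sessionToken).buildAsync()
browserFuture.addListener(
    { browser = browserFuture.get() }, // now safe to call setMediaItem/prepare/play
    MoreExecutors.directExecutor()
)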
Old answer (media3 1.0.0-alpha)
When you create the MediaLibrarySession inside your MediaLibraryService, you can add a MediaItemFiller. This MediaItemFiller has a fillInLocalConfiguration method that will be "Called to fill in the MediaItem.localConfiguration of the media item from controllers."
Knowing this, you need to:
Add a MediaItemFiller to your MediaLibrarySession builder inside your service.
// My MediaLibraryService
// onCreate()
mediaLibrarySession = MediaLibrarySession.Builder(this, player, librarySessionCallback)
    .setMediaItemFiller(CustomMediaItemFiller()) // <--
    .setSessionActivity(pendingIntent)
    .build()
Create a custom MediaSession.MediaItemFiller. Any time you use setMediaItem/addMediaItem from a controller, this will be called, and the MediaItem returned here will be the one played.
class CustomMediaItemFiller : MediaSession.MediaItemFiller {
    override fun fillInLocalConfiguration(
        session: MediaSession,
        controller: MediaSession.ControllerInfo,
        mediaItem: MediaItem
    ): MediaItem {
        // Return the media item to be played
        return mediaItem.buildUpon()
            // Use the metadata values to fill our media item
            .setUri(mediaItem.mediaMetadata.mediaUri)
            .build()
    }
}
And finally, create and play your MediaItem from the activity.
// My Activity
// Fill some metadata that the MediaItemFiller
// will use to create the new MediaItem
val mmd = MediaMetadata.Builder()
    .setTitle("Example")
    .setArtist("Artist name")
    .setMediaUri("...".toUri())
    .build()

val mediaItem: MediaItem = MediaItem.Builder()
    .setMediaMetadata(mmd)
    .build()

browser.setMediaItem(mediaItem)
browser.prepare()
browser.play()
I don't know why it has to be this awkward, but if you have a look at the CustomMediaItemFiller used in the official repo, you will see that they use the mediaItem.mediaId to fetch a valid MediaItem from a media catalog. That's why their demo works when they use setMediaItem from an activity.
Also, as far as I know, anything you do inside fillInLocalConfiguration runs on (and blocks) the main thread (I believe setMediaItem has to be called from the main thread). So, if you can, move any heavy work (e.g. fetching media info from your database) to your Activity/ViewModel, where you have more control, fill in all the metadata you need there, and use your MediaSession.MediaItemFiller for a simple transformation. Or move everything to your service and handle it all there.
I hope the flow is clear. I don't have much experience with media3 and maybe I'm missing something, but given the limitations of MediaItemFiller I found it a bit useless, and I would really like to know more about its purpose.

Pause background service in Xamarin.Forms

I have an application that occasionally speaks via the system's text-to-speech (TTS) engine, but if a background media service (like an audiobook or a music stream) is playing at the same time, they overlap.
I would like to pause the media, play my TTS, then unpause the media. I've looked, but I can't find any solutions.
I believe that if I were to play actual audio from my app, it would pause the other media until my playback was complete (if I understand what I've found correctly), but TTS doesn't seem to have the same effect. The speech is totally dynamic, so I can't just record all the options.
Using the latest Xamarin.Forms, I've looked into all the media NuGet packages I could find, and they all seem pretty centered on controlling media from files.
My only potential thought (which I don't like) is to play an empty audio file while the TTS is running, but I would like a more elegant solution if one exists.
(I don't care about iOS at the moment, so an Android-only solution is fine. And if it's native Java/Kotlin, I can convert/incorporate it.)
I agree with what rbonestell said: you can use DependencyService and AudioFocus to achieve it. First, create an interface in your PCL:
public interface IControl
{
    void StopBackgroundMusic();
}
When you record the audio, you can execute the DependencyService with the following code:
private void Button_Clicked(object sender, EventArgs e)
{
    DependencyService.Get<IControl>().StopBackgroundMusic();
    // record the audio
}
In the Android project, create a StopMusicService to implement it:
[assembly: Dependency(typeof(StopMusicService))]
namespace TTSDemo.Droid
{
    public class StopMusicService : IControl
    {
        AudioManager audioMan;
        AudioManager.IOnAudioFocusChangeListener listener;

        public void StopBackgroundMusic()
        {
            audioMan = (AudioManager)Android.App.Application.Context.GetSystemService(Context.AudioService);
            listener = new MyAudioListener(this);
            var ret = audioMan.RequestAudioFocus(listener, Stream.Music, AudioFocus.Gain);
        }
    }

    internal class MyAudioListener : Java.Lang.Object, AudioManager.IOnAudioFocusChangeListener
    {
        private StopMusicService stopMusicService;

        public MyAudioListener(StopMusicService stopMusicService)
        {
            this.stopMusicService = stopMusicService;
        }

        public void OnAudioFocusChange([GeneratedEnum] AudioFocus focusChange)
        {
            // throw new NotImplementedException();
        }
    }
}
Thanks to Leon Lu - MSFT, I was able to go in the right direction. I took his implementation (which has some deprecated calls to the Android API), and updated it for what I needed.
I'll be doing a little more work making sure it's stable and functional. I'll also see if I can clean it up a little too. But here's what works on my first test:
[assembly: Dependency(typeof(MediaService))]
namespace ...Droid.Services
{
    public class MediaService : IMediaService
    {
        public async Task PauseBackgroundMusicForTask(Func<Task> onFocusGranted)
        {
            var manager = (AudioManager)Android.App.Application.Context.GetSystemService(Context.AudioService);
            var builder = new AudioFocusRequestClass.Builder(AudioFocus.GainTransientMayDuck);
            var focusRequest = builder.Build();
            var ret = manager.RequestAudioFocus(focusRequest);
            if (ret == AudioFocusRequest.Granted)
            {
                await onFocusGranted?.Invoke();
                manager.AbandonAudioFocusRequest(focusRequest);
            }
        }
    }
}

How to put media buttons on the lockscreen with ExoPlayer

I am using ExoPlayer (https://github.com/google/ExoPlayer) and custom notifications.
I want to access my music player from the lock screen and headphones, like in Google Play Music and wync.
Please help me.
For playback controls on the lock screen you need to use a MediaStyle notification.
If you want artwork as the lock screen background, you need to support MediaSession and maintain the metadata of the session properly:
new MediaMetadata.Builder(track)
    .putBitmap(MediaMetadata.METADATA_KEY_ALBUM_ART, bitmap)
    .putBitmap(MediaMetadata.METADATA_KEY_DISPLAY_ICON, bitmap)
    .build();
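A minimal sketch of such a MediaStyle notification, assuming an existing MediaSessionCompat, a notification channel, and placeholder resources (none of these names are from the original answer):
val notification = NotificationCompat.Builder(context, "playback_channel")
    .setSmallIcon(R.drawable.ic_play) // placeholder drawable
    .setContentTitle("Track title")
    .setContentText("Artist")
    .setStyle(
        androidx.media.app.NotificationCompat.MediaStyle()
            .setMediaSession(mediaSession.sessionToken)
    )
    .build()
NotificationManagerCompat.from(context).notify(1, notification)
With ExoPlayer, PlayerNotificationManager (used in the next answer) can build and update a notification like this for you.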
I am also using ExoPlayer with PlayerNotificationManager, and I used MediaSessionConnector and TimelineQueueNavigator to build the lock screen notification and the lock screen background image.
Here is my question to the ExoPlayer team, which was resolved for Android 11 and above, about how to use MediaSessionConnector:
Why PlayerNotificationManager not showing Notification on startForeground in Android 11(R)?
Now, I just added a bitmap (using putParcelable()) for the current session track to the MediaDescriptionCompat, which will be set on the current MediaSession internally.
Here is the code for that:
val mediaSession = MediaSessionCompat(serviceContext, "DPS_APP")
mediaSession.isActive = true
mediaSessionConnector = MediaSessionConnector(mediaSession).also {
    it.setQueueNavigator(
        object : TimelineQueueNavigator(mediaSession) {
            override fun getMediaDescription(
                player: Player,
                windowIndex: Int
            ): MediaDescriptionCompat {
                val data: MediaMetaData = getEmptyOfNullMedia(player)
                isBitmapAvailable(getCurrentMediaArt(data))
                val extras = Bundle().apply {
                    putString(
                        MediaMetadataCompat.METADATA_KEY_TITLE,
                        getCurrentTitle(data)
                    )
                    putString(
                        MediaMetadataCompat.METADATA_KEY_ARTIST,
                        getMediaTitle(data)
                    )
                    putParcelable(
                        MediaMetadataCompat.METADATA_KEY_ALBUM_ART,
                        sessionCurrentBitmap
                    )
                }
                return MediaDescriptionCompat.Builder()
                    .setIconBitmap(sessionCurrentBitmap)
                    .setExtras(extras)
                    .build()
            }
        }
    )
    it.setPlayer(mPlayer)
}
Here is a brief description of how to refresh or invalidate the MediaSession, in the GitHub issue below:
https://github.com/google/ExoPlayer/issues/5494

How to integrate Google Cast v3 with ExoPlayer v2?

How to fully integrate Google Cast v3 with ExoPlayer v2? The activity will contain a FrameLayout with a com.google.android.exoplayer2.ui.SimpleExoPlayerView in it. The Google tutorial only covers integration with VideoView.
The code below is available in a Kotlin class in this Gist that should help people trying to set up their CastPlayer for the first time:
https://gist.github.com/stefan-zh/fd52e0ee06088ac4086d2ea3fb7d7f3e
Also, going through this tutorial from Google will help you: https://codelabs.developers.google.com/codelabs/cast-videos-android/index.html#0
I also used this tutorial to get started: https://android.jlelse.eu/sending-media-to-chromecast-has-never-been-easier-c331eeef1e0a
Here is a breakdown of how to achieve this using ExoPlayer and its Cast extension.
1. You will need these dependencies:
// ExoPlayer is an advanced media player for playing media files
implementation "com.google.android.exoplayer:exoplayer-core:$exoplayer_version"
implementation "com.google.android.exoplayer:exoplayer-ui:$exoplayer_version"
implementation "com.google.android.exoplayer:extension-cast:$exoplayer_version"
2. You will need the Cast button
The Cast button can be added in the options menu for the activities. This is the recommended way to do it.
Add the following to res/menu/browse.xml (in my case the menu file is called browse.xml):
<?xml version="1.0" encoding="utf-8"?>
<menu xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto">
    <item
        android:id="@+id/media_route_menu_item"
        android:title="@string/media_route_menu_title"
        app:actionProviderClass="androidx.mediarouter.app.MediaRouteActionProvider"
        app:showAsAction="always"/>
</menu>
Then add the following code to your Activity to enable the castButton:
/**
 * We need to populate the Cast button across all activities as suggested by the Google Cast Guide:
 * https://developers.google.com/cast/docs/design_checklist/cast-button#sender-cast-icon-available
 */
override fun onCreateOptionsMenu(menu: Menu?): Boolean {
    val result = super.onCreateOptionsMenu(menu)
    menuInflater.inflate(R.menu.browse, menu)
    castButton = CastButtonFactory.setUpMediaRouteButton(applicationContext, menu, R.id.media_route_menu_item)
    return result
}
3. Declare your Options Provider for the Cast context
You need this so that you get the options dialog with the list of devices that you can cast to. Add this to your AndroidManifest.xml in the application tag:
<meta-data
    android:name="com.google.android.gms.cast.framework.OPTIONS_PROVIDER_CLASS_NAME"
    android:value="com.google.android.exoplayer2.ext.cast.DefaultCastOptionsProvider" />
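DefaultCastOptionsProvider points at ExoPlayer's default receiver. If you have your own receiver application ID, a custom provider is a small amount of code; a sketch, with the class name and ID as placeholders:
class CastOptionsProvider : OptionsProvider {
    override fun getCastOptions(context: Context): CastOptions =
        CastOptions.Builder()
            .setReceiverApplicationId("YOUR_RECEIVER_APP_ID") // placeholder
            .build()

    override fun getAdditionalSessionProviders(context: Context): List<SessionProvider>? = null
}
The meta-data entry above would then reference this class instead of DefaultCastOptionsProvider.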
4. Your Activity needs to implement ExoPlayer's Cast Extension interface SessionAvailabilityListener
This interface will allow you to listen for changes in the Cast session availability. Based on whether a Cast session is available you can direct playback to the local player or the remote player.
override fun onCastSessionAvailable() {
    playOnPlayer(castPlayer)
}

override fun onCastSessionUnavailable() {
    playOnPlayer(exoPlayer)
}
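The CastPlayer created in the next step needs a CastContext; a minimal sketch of obtaining it, typically once in onCreate and kept as the castContext field the answer assumes:
// Obtain the shared Cast framework context before building the CastPlayer.
castContext = CastContext.getSharedInstance(this)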
5. You will need logic to initialize the players:
Notice how we are calling castPlayer?.setSessionAvailabilityListener(this) where this refers to your Activity that implements the SessionAvailabilityListener interface. The listener's methods will be called when the Cast session availability changes.
private fun initializePlayers() {
    exoPlayer = SimpleExoPlayer.Builder(this).build()
    playerView.player = exoPlayer
    if (castPlayer == null) {
        castPlayer = CastPlayer(castContext)
        castPlayer?.setSessionAvailabilityListener(this)
    }
    // start the playback
    if (castPlayer?.isCastSessionAvailable == true) {
        playOnPlayer(castPlayer)
    } else {
        playOnPlayer(exoPlayer)
    }
}
6. You need logic to play on the selected player:
This method:
- stores the playback state (playbackPosition, playWhenReady, and windowIndex)
- creates the correct media type for the local or remote player
- selects which player should start playback
playOnPlayer() method:
private fun playOnPlayer(player: Player?) {
    if (currentPlayer == player) {
        return
    }
    // save state from the existing player
    currentPlayer?.let {
        if (it.playbackState != Player.STATE_ENDED) {
            it.rememberState()
        }
        it.stop(true)
    }
    // set the new player
    currentPlayer = player
    // set up the playback
    // if the current player is the ExoPlayer, play from it
    if (currentPlayer == exoPlayer) {
        // build the MediaSource from the URI
        val uri = Uri.parse(videoClipUrl)
        val dataSourceFactory = DefaultDataSourceFactory(this@SampleCastingPlayerActivity, "exoplayer-agent")
        val mediaSource = ProgressiveMediaSource.Factory(dataSourceFactory).createMediaSource(uri)
        // use stored state (if any) to resume (or start) playback
        exoPlayer?.playWhenReady = playWhenReady
        exoPlayer?.seekTo(currentWindow, playbackPosition)
        exoPlayer?.prepare(mediaSource, false, false)
    }
    // if the current player is the CastPlayer, play from it
    if (currentPlayer == castPlayer) {
        val metadata = MediaMetadata(MediaMetadata.MEDIA_TYPE_MOVIE)
        metadata.putString(MediaMetadata.KEY_TITLE, "Title")
        metadata.putString(MediaMetadata.KEY_SUBTITLE, "Subtitle")
        metadata.addImage(WebImage(Uri.parse("any-image-url")))
        val mediaInfo = MediaInfo.Builder(videoClipUrl)
            .setStreamType(MediaInfo.STREAM_TYPE_BUFFERED)
            .setContentType(MimeTypes.VIDEO_MP4)
            .setMetadata(metadata)
            .build()
        val mediaItem = MediaQueueItem.Builder(mediaInfo).build()
        castPlayer?.loadItem(mediaItem, playbackPosition)
    }
}
7. Remember state and clean up resources
Each time your application switches between background and foreground, you need to release or re-acquire resources, and each time you release the Player's resources back to the system, you need to save its state.
/**
 * Remembers the state of the playback of this Player.
 */
private fun Player.rememberState() {
    this@SampleCastingPlayerActivity.playWhenReady = playWhenReady
    this@SampleCastingPlayerActivity.playbackPosition = currentPosition
    this@SampleCastingPlayerActivity.currentWindow = currentWindowIndex
}

/**
 * Releases the resources of the local player back to the system.
 */
private fun releaseLocalPlayer() {
    exoPlayer?.release()
    exoPlayer = null
    playerView.player = null
}

/**
 * Releases the resources of the remote player back to the system.
 */
private fun releaseRemotePlayer() {
    castPlayer?.setSessionAvailabilityListener(null)
    castPlayer?.release()
    castPlayer = null
}
The Google Cast SDK is independent of the local player; you can use ExoPlayer or MediaPlayer (VideoView).
Once your app has an active session, place the URL in a MediaInfo:
val movieMetadata = MediaMetadata(MediaMetadata.MEDIA_TYPE_MOVIE)
movieMetadata.putString(MediaMetadata.KEY_TITLE, "Title")
movieMetadata.putString(MediaMetadata.KEY_SUBTITLE, "Sub")

val mediaInfo = MediaInfo.Builder( <URL> )
    .setStreamType(MediaInfo.STREAM_TYPE_BUFFERED)
    .setContentType( <Content Type of Media> )
    .setMetadata(movieMetadata)
    .setStreamDuration( <Media Duration> )
    .build()

mCastSession.remoteMediaClient.load(mediaInfo, MediaLoadOptions.Builder().build())
If you need to cast local media, you will need to serve it yourself, using NanoHTTPD or another HTTP server of your choice, and also implement a Cast receiver.
