I am working on a YouTubePlayerFragment integration.
When I initialize the YouTubePlayer inside the YouTubePlayerFragment, it takes audio focus from whatever app is currently playing audio (e.g. Play Music), and that app stops playing.
Since the user has not yet touched the play button in my app's YouTube player, it should not take audio focus from the other app. How can I avoid this and let the other app keep playing its audio?
Here is my Fragment code, written in Kotlin.
class MyVideoFragment : YouTubePlayerFragment() {

    lateinit var mPlayer: YouTubePlayer

    companion object {
        fun newInstance(url: String): MyVideoFragment {
            val v = MyVideoFragment()
            val b = Bundle()
            b.putString("url", url)
            v.arguments = b
            v.init()
            return v
        }
    }

    private fun init() {
        initialize(DEVELOPER_KEY, object : YouTubePlayer.OnInitializedListener {
            override fun onInitializationSuccess(arg0: YouTubePlayer.Provider,
                                                 player: YouTubePlayer, wasRestored: Boolean) {
                if (!wasRestored) {
                    player.cueVideo(arguments.getString("url"))
                    if (player.isPlaying) {
                        player.pause()
                    }
                    player.setShowFullscreenButton(true)
                    mPlayer = player
                }
            }

            override fun onInitializationFailure(provider: YouTubePlayer.Provider,
                                                 errorReason: YouTubeInitializationResult) {
                if (errorReason.isUserRecoverableError) {
                    // errorReason.getErrorDialog(activity, RECOVERY_DIALOG_REQUEST).show()
                } else {
                    val errorMessage = String.format(
                        getString(R.string.error_player), errorReason.toString())
                    toast(errorMessage)
                }
            }
        })
    }
}
I faced the same issue; setManageAudioFocus(false) is what you need.
You can disable the automatic grab of audio focus inside your onInitializationSuccess by calling player.setManageAudioFocus(false), but remember that you then need to manage audio focus manually. You can read about managing audio focus in the Android documentation.
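For the manual part, a minimal sketch (assuming the question's mPlayer field, and using the pre-API-26 requestAudioFocus overload for brevity) might look like:

```kotlin
import android.content.Context
import android.media.AudioManager

// Sketch only: manage audio focus yourself after player.setManageAudioFocus(false).
// `mPlayer` is the YouTubePlayer field from the question's fragment.
val focusListener = AudioManager.OnAudioFocusChangeListener { change ->
    when (change) {
        AudioManager.AUDIOFOCUS_LOSS,
        AudioManager.AUDIOFOCUS_LOSS_TRANSIENT -> mPlayer.pause()
        AudioManager.AUDIOFOCUS_GAIN -> mPlayer.play()
    }
}

// Call this from your play button, not from onInitializationSuccess,
// so focus is only taken when the user actually starts playback.
fun requestFocusAndPlay(context: Context) {
    val am = context.getSystemService(Context.AUDIO_SERVICE) as AudioManager
    val result = am.requestAudioFocus(
        focusListener, AudioManager.STREAM_MUSIC, AudioManager.AUDIOFOCUS_GAIN
    )
    if (result == AudioManager.AUDIOFOCUS_REQUEST_GRANTED) {
        mPlayer.play()
    }
}
```

This way the other app keeps its focus (and keeps playing) until your user explicitly presses play.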
I'm using YouTubePlayer to play YouTube videos, and I want to show a network-error dialog when the internet is unavailable. I'm using the AbstractYouTubePlayerListener callback to listen for errors, but the onError method is not called when the network is unavailable.
I'm building this app for Android TV.
private fun initializePlayer() {
val customPlayerUi =
binding.youtubePlayerView.inflateCustomPlayerUi(R.layout.yt_custom_player_ui)
binding.youtubePlayerView.getYouTubePlayerWhenReady(object : YouTubePlayerCallback {
override fun onYouTubePlayer(youTubePlayer: YouTubePlayer) {
this@YoutubeVideoFragment.initializedYouTubePlayer = youTubePlayer
val customPlayerUiController = YtCustomPlayerController(
requireContext(),
customPlayerUi,
youTubePlayer
)
youTubePlayer.addListener(customPlayerUiController)
if (isEulaAccepted) {
youTubePlayer.loadVideo(
glance.peek.videoPeek.video.youtubeVideo.videoId,
videoStartTime.toFloat()
)
}
if (videoItemBinding.nudgeReaction.visibility == View.VISIBLE &&
videoItemBinding.textPre.text == getString(R.string.skipping_to_next_video)
) {
videoItemBinding.nudgeReaction.setGone()
}
}
})
binding.youtubePlayerView.addYouTubePlayerListener(object :
AbstractYouTubePlayerListener() {
override fun onStateChange(
youTubePlayer: YouTubePlayer,
state: PlayerConstants.PlayerState
) {
handleVideoStateChange(state)
}
override fun onCurrentSecond(youTubePlayer: YouTubePlayer, second: Float) {
if (seekRequestedTime == -1 || seekRequestedTime == second.toInt()) {
// While the user is seeking, the running time keeps updating in parallel. To avoid a
// jerky UI, update the running time only once the seek target is reached; when the
// user is not seeking, seekRequestedTime stays at -1.
updateVideoRunningTime(second)
seekRequestedTime = -1
}
}
override fun onVideoDuration(youTubePlayer: YouTubePlayer, duration: Float) {
currentSeekBar.max = duration.toInt()
updateUIDuration(duration)
}
override fun onError(
youTubePlayer: YouTubePlayer,
error: PlayerConstants.PlayerError
) {
super.onError(youTubePlayer, error)
Timber.i("onError: $error")
setErrorCodes("Youtube error:${error.name}")
}
})
}
It doesn't look like the onError method handles this kind of error.
Here are the defined error codes:
enum class PlayerError {
UNKNOWN, INVALID_PARAMETER_IN_REQUEST, HTML_5_PLAYER, VIDEO_NOT_FOUND, VIDEO_NOT_PLAYABLE_IN_EMBEDDED_PLAYER
}
I would check the internet connection first and only then start the video player.
Also try reading the state of the player; if it is BUFFERING, you could display that to the user.
enum class PlayerState {
UNKNOWN, UNSTARTED, ENDED, PLAYING, PAUSED, BUFFERING, VIDEO_CUED
}
I am using the Android MediaPlayer to play audio in my app. The URL I set as the media player's data source points to a file in an Amazon S3 bucket; the file was uploaded to the server via FTP.
Problem:
If I open the URL in a browser, it immediately starts downloading the file. In my app, playback takes far too long to start (e.g. a 7-minute clip of about 10 MB takes a minute before it plays).
I conclude that MediaPlayer first downloads the audio from the URL and only then starts playing, which is why it takes so long.
I want playback to start streaming directly in the media player instead of downloading first.
URL type: https://demo-podcast.s3.us-west-1.amazonaws.com/media/clips/41e7d280-cde1-4676-95ff-f7026ae6bdde.mp3
Does anyone know a way? Thanks in advance!
EDIT:
Here is my MediaPlayer code; I am using it in a RecyclerView.
private fun attachMusic(data: ClipsResponse.ClipsResponseItem, binding: ItemLayoutCardBinding, mediaPlayer: MediaPlayer) {
binding.apply {
try {
mediaPlayer.setAudioAttributes(AudioAttributes.Builder().setContentType(AudioAttributes.CONTENT_TYPE_MUSIC).build())
mediaPlayer.setDataSource(data.audioUrl)
mediaPlayer.prepare()
setControls(this, mediaPlayer)
} catch (e: Exception) {
e.printStackTrace()
}
mediaPlayer.setOnCompletionListener {
btnPlay.setImageResource(R.drawable.ic_play_circle)
it.release()
}
}
}
private fun setControls(binding: ItemLayoutCardBinding, mediaPlayer: MediaPlayer) {
binding.apply {
seekBar.setMax(mediaPlayer.duration)
mediaPlayer.start()
playCycle(this, mediaPlayer)
if (mediaPlayer.isPlaying) {
btnPlay.setImageResource(R.drawable.ic_pause_circle)
playProgressBar.isVisible = false
}
}
seekBar.setOnSeekBarChangeListener(object : SeekBar.OnSeekBarChangeListener {
override fun onProgressChanged(seekBar: SeekBar, progress: Int, fromUser: Boolean) {
if (fromUser) {
mediaPlayer.seekTo(progress)
}
}
override fun onStartTrackingTouch(seekBar: SeekBar) {}
override fun onStopTrackingTouch(seekBar: SeekBar) {}
})
}
}
private fun playCycle(binding: ItemLayoutCardBinding, mediaPlayer: MediaPlayer) {
val handler = Handler()
binding.apply {
try {
seekBar.setProgress(mediaPlayer.currentPosition)
if (mediaPlayer.isPlaying) {
val runnable = Runnable { playCycle(binding, mediaPlayer) }
handler.postDelayed(runnable, 100)
}
} catch (e: java.lang.Exception) {
e.printStackTrace()
}
}
}
Remove mediaPlayer.prepare() from your code and call mediaPlayer.prepareAsync() instead. prepare() blocks the calling thread while the stream is buffered, whereas prepareAsync() returns immediately and notifies you through setOnPreparedListener when playback can start.
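Concretely, the body of attachMusic from the question could be adjusted like this (a sketch only; data, binding, and setControls are the question's own names):

```kotlin
// Sketch: prepare in the background instead of blocking the UI thread.
try {
    mediaPlayer.setAudioAttributes(
        AudioAttributes.Builder()
            .setContentType(AudioAttributes.CONTENT_TYPE_MUSIC)
            .build()
    )
    mediaPlayer.setDataSource(data.audioUrl)
    // start() (inside setControls) must wait until the player is prepared.
    mediaPlayer.setOnPreparedListener { mp -> setControls(binding, mp) }
    mediaPlayer.prepareAsync() // returns immediately; streaming begins when ready
} catch (e: Exception) {
    e.printStackTrace()
}
```

The key design point is that everything that touches duration or calls start() moves into the prepared callback, since those are only valid once the player reaches the Prepared state.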
I am developing an online music player app backed by JSON data. I use a RecyclerView to show the list of songs. I had trouble controlling the media player (pause/start) from another activity, so I created a new file, Audio.kt, containing a companion object that wraps all the MediaPlayer methods. Now I can use the media player anywhere via Audio.[method].
Here is the Audio.kt file:
class Audio {
companion object {
val mp = MediaPlayer()
@RequiresApi(Build.VERSION_CODES.LOLLIPOP)
fun loadMediaPlayer(url: String?) {
mp.setAudioAttributes(
AudioAttributes.Builder()
.setContentType(AudioAttributes.CONTENT_TYPE_SONIFICATION)
.setUsage(AudioAttributes.USAGE_MEDIA)
.build()
)
mp.setDataSource(url)
mp.prepareAsync()
mp.setOnPreparedListener {
mp.start()
MainActivity.progressBar.visibility = View.GONE
MainActivity.slidingPlayBtn.visibility = View.VISIBLE
MainActivity.slidingPlayBtn.setImageResource(R.drawable.ic_pause)
MainActivity.main_pbar.visibility = View.GONE
MainActivity.mainPlayBtn.visibility = View.VISIBLE
val duration: Long = Audio.mp.duration.toLong()
val time = String.format(
"%02d:%02d",
TimeUnit.MILLISECONDS.toMinutes(duration),
TimeUnit.MILLISECONDS.toSeconds(duration) -
TimeUnit.MINUTES.toSeconds(TimeUnit.MILLISECONDS.toMinutes(duration))
)
MainActivity.totalDur.text = time
MainActivity.seekBar.max = duration.toInt()
}
}
fun startMediaPlayer() {
mp.start()
}
fun pauseMediaPlayer() {
mp.pause()
}
fun stopMediaPlayer() {
if (mp.isPlaying) {
mp.stop()
mp.reset()
}
}
fun isNotPlaying(): Boolean {
return !mp.isPlaying
}
fun mediaCurrPos(): Long {
return mp.currentPosition.toLong()
}
fun mediaPlayerSeek(progress: Int) {
mp.seekTo(progress)
}
fun nextMediaPlayer() {
mp.setNextMediaPlayer(MediaPlayer())
}
}
}
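The mm:ss computation inside the prepared listener above can be factored into a small pure helper (a sketch; formatDuration is a name introduced here), which is easier to reuse and test than inlining the TimeUnit arithmetic:

```kotlin
// Format a duration in milliseconds as mm:ss (e.g. for totalDur.text).
fun formatDuration(millis: Long): String {
    val totalSeconds = millis / 1000
    return "%02d:%02d".format(totalSeconds / 60, totalSeconds % 60)
}
```

For example, formatDuration(70_000L) yields "01:10".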
So now my question: is this a good approach, or could it cause problems in the future?
Do you have any other ideas for doing this?
I am using interactive video broadcasting in my app, and I am attaching the class in which I implement live streaming.
The problem: when I navigate back from the live-streaming screen to the previous screen, I can still hear the host's audio.
Previously I called the leaveChannel method and destroyed the RTC client object, but after implementing that, going back from the streaming class closed the stream for all users of the app, so I removed it from my onDestroy method.
Now I call disableAudio(), which does mute the audio, but when I reopen the live-streaming class, enableAudio() does not bring the audio back. I have also tried the muteLocalAudioStream method and the onUserMuteAudio handler without success.
I am getting this error:

"LiveStreamingActivity has leaked IntentReceiver io.agora.rtc.internal.AudioRoutingController$HeadsetBroadcastReceiver@101a7a7 that was originally registered here. Are you missing a call to unregisterReceiver()?
android.app.IntentReceiverLeaked: Activity com.allin.activities.home.homeActivities.LiveStreamingActivity has leaked IntentReceiver io.agora.rtc.internal.AudioRoutingController$HeadsetBroadcastReceiver@101a7a7 that was originally registered here. Are you missing a call to unregisterReceiver()?"
The receiver is registered inside the SDK, and the exception is thrown inside the SDK, which is a JAR file I can't edit.
Please help me resolve this issue, as I have to publish the app on the Play Store.
// First I tried this, but it automatically stopped the other devices' streams.
override fun onDestroy() {
    /* if (mRtcEngine != null) {
        leaveChannel()
        RtcEngine.destroy(mRtcEngine)
        mRtcEngine = null
    } */
    // Second, I tried disabling the audio so that the user will not hear the host's voice.
    if (mRtcEngine != null) {
        mRtcEngine!!.disableAudio()
    }
    super.onDestroy()
}
// Then, when I come back to the live-streaming activity, everything is initialized again, but the audio is not audible.
override fun onResume() {
super.onResume()
Log.e("resume", "resume")
if (mRtcEngine != null) {
mRtcEngine!!.enableAudio()
// mRtcEngine!!.resumeAudio()
}
}
The code I am using:
//agora rtc engine and handler initialization-----------------
private var mRtcEngine: RtcEngine? = null
private var mRtcEventHandler = object : IRtcEngineEventHandler() {
@SuppressLint("LongLogTag")
override fun onFirstRemoteVideoDecoded(uid: Int, width: Int, height: Int, elapsed: Int) {
}
override fun onUserOffline(uid: Int, reason: Int) {
runOnUiThread {
val a = reason //if login =0 user is offline
try {
if (mUid == uid) {
if (surfaceView?.parent != null)
(surfaceView?.parent as ViewGroup).removeAllViews()
if (mRtcEngine != null) {
leaveChannel()
RtcEngine.destroy(mRtcEngine)
mRtcEngine = null
}
setResult(IntentConstants.REQUEST_CODE_LIVE_STREAMING)
finish()
}
} catch (e: Exception) {
e.printStackTrace()
}
}
}
override fun onUserMuteVideo(uid: Int, muted: Boolean) {
runOnUiThread {
// onRemoteUserVideoMuted(uid, muted);
Log.e("video","muted")
}
}
override fun onAudioQuality(uid: Int, quality: Int, delay: Short, lost: Short) {
super.onAudioQuality(uid, quality, delay, lost)
Log.e("", "")
}
override fun onUserJoined(uid: Int, elapsed: Int) {
// super.onUserJoined(uid, elapsed)
mUid = uid
runOnUiThread {
try {
setupRemoteVideo(mUid!!)
} catch (e: Exception) {
e.printStackTrace()
}
}
Log.e("differnt_uid----", mUid.toString())
}
}
private fun initAgoraEngineAndJoinChannel() {
if(mRtcEngine==null)
{
initializeAgoraEngine()
setupVideoProfile()
}
}
//initializing rtc engine class
@Throws(Exception::class)
private fun initializeAgoraEngine() {
try {
var s = RtcEngine.getSdkVersion()
mRtcEngine = RtcEngine.create(baseContext, AgoraConstants.APPLICATION_ID, mRtcEventHandler)
} catch (e: Exception) {
// Log.e(LOG_TAG, Log.getStackTraceString(e));
throw RuntimeException("NEED TO check rtc sdk init fatal error\n" + Log.getStackTraceString(e))
}
}
@Throws(Exception::class)
private fun setupVideoProfile() {
//mRtcEngine?.muteAllRemoteAudioStreams(true)
// mLogger.log("channelName account = " + channelName + ",uid = " + 0);
mRtcEngine?.enableVideo()
//mRtcEngine.clearVideoCompositingLayout();
mRtcEngine?.enableLocalVideo(false)
mRtcEngine?.setEnableSpeakerphone(false)
mRtcEngine?.muteLocalAudioStream(true)
joinChannel()
mRtcEngine?.setVideoProfile(Constants.CHANNEL_PROFILE_LIVE_BROADCASTING, true)
mRtcEngine?.setChannelProfile(Constants.CHANNEL_PROFILE_LIVE_BROADCASTING)
mRtcEngine?.setClientRole(Constants.CLIENT_ROLE_AUDIENCE,"")
val speaker = mRtcEngine?.isSpeakerphoneEnabled
val camerafocus = mRtcEngine?.isCameraAutoFocusFaceModeSupported
Log.e("", "")
}
@Throws(Exception::class)
private fun setupRemoteVideo(uid: Int) {
val container = findViewById<FrameLayout>(R.id.fl_video_container)
if (container.childCount >= 1) {
return
}
surfaceView = RtcEngine.CreateRendererView(baseContext)
container.addView(surfaceView)
mRtcEngine?.setupRemoteVideo(VideoCanvas(surfaceView, VideoCanvas.RENDER_MODE_HIDDEN, uid))
mRtcEngine?.setRemoteVideoStreamType(uid, 1)
mRtcEngine?.setCameraAutoFocusFaceModeEnabled(false)
mRtcEngine?.muteRemoteAudioStream(uid, false)
mRtcEngine?.adjustPlaybackSignalVolume(0)
// mRtcEngine.setVideoProfile(Constants.VIDEO_PROFILE_180P, false); // Earlier than 2.3.0
surfaceView?.tag = uid // for mark purpose
val audioManager: AudioManager = this@LiveStreamingActivity.getSystemService(Context.AUDIO_SERVICE) as AudioManager
//audioManager.mode = AudioManager.MODE_IN_CALL
val isConnected: Boolean = audioManager.isWiredHeadsetOn
if (isConnected) {
/* audioManager.isSpeakerphoneOn = false
audioManager.isWiredHeadsetOn = true*/
mRtcEngine?.setEnableSpeakerphone(false)
mRtcEngine?.setDefaultAudioRoutetoSpeakerphone(false)
mRtcEngine?.setSpeakerphoneVolume(0)
mRtcEngine?.enableInEarMonitoring(true)
// Raise the in-ear monitoring volume (100 = original volume).
mRtcEngine?.setInEarMonitoringVolume(200)
mRtcEngine?.adjustPlaybackSignalVolume(200)
} else {
/* audioManager.isSpeakerphoneOn = true
audioManager.isWiredHeadsetOn = false*/
mRtcEngine?.setEnableSpeakerphone(true)
mRtcEngine?.setDefaultAudioRoutetoSpeakerphone(true)
mRtcEngine?.setSpeakerphoneVolume(50)
mRtcEngine?.adjustPlaybackSignalVolume(50)
mRtcEngine?.enableInEarMonitoring(false)
// Effectively mute in-ear monitoring.
mRtcEngine?.setInEarMonitoringVolume(0)
}
Log.e("", "")
}
@Throws(Exception::class)
private fun joinChannel() {
mRtcEngine?.joinChannel(
null,
AgoraConstants.CHANNEL_NAME,
"Extra Optional Data",
0
) // if you do not specify the uid, we will generate the uid for you
}
@Throws(Exception::class)
private fun leaveChannel() {
mRtcEngine!!.leaveChannel()
}
I think you first want to call setupRemoteVideo in the onFirstRemoteVideoDecoded callback instead of the onUserJoined callback. Also, in the onDestroy callback, you should call RtcEngine.destroy() instead of RtcEngine.destroy(mRtcEngine).
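A sketch of those two changes, reusing the question's own names (mRtcEngine, setupRemoteVideo, mUid):

```kotlin
// Sketch: attach the remote view once the first frame is decoded,
// rather than in onUserJoined.
override fun onFirstRemoteVideoDecoded(uid: Int, width: Int, height: Int, elapsed: Int) {
    runOnUiThread {
        try {
            setupRemoteVideo(uid)
        } catch (e: Exception) {
            e.printStackTrace()
        }
    }
}

// Sketch: tear down with the static, no-argument destroy().
override fun onDestroy() {
    mRtcEngine?.leaveChannel()
    RtcEngine.destroy()
    mRtcEngine = null
    super.onDestroy()
}
```

Destroying the engine releases the SDK's internally registered receivers, which should also clear the IntentReceiverLeaked warning.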
I have implemented ExoPlayer in my application using the example from the Codelab (https://codelabs.developers.google.com/codelabs/exoplayer-intro/#3), along with the example from https://medium.com/google-exoplayer/playing-ads-with-exoplayer-and-ima-868dfd767ea; the only difference is that I use AdsMediaSource instead of the deprecated ImaAdsMediaSource.
My implementation is this:
class HostVideoFullFragment : Fragment(), AdsMediaSource.MediaSourceFactory {
override fun getSupportedTypes() = intArrayOf(C.TYPE_DASH, C.TYPE_HLS, C.TYPE_OTHER)
override fun createMediaSource(uri: Uri?, handler: Handler?, listener: MediaSourceEventListener?): MediaSource {
@C.ContentType val type = Util.inferContentType(uri)
return when (type) {
C.TYPE_DASH -> {
DashMediaSource.Factory(
DefaultDashChunkSource.Factory(mediaDataSourceFactory),
manifestDataSourceFactory)
.createMediaSource(uri, handler, listener)
}
C.TYPE_HLS -> {
HlsMediaSource.Factory(mediaDataSourceFactory)
.createMediaSource(uri, handler, listener)
}
C.TYPE_OTHER -> {
ExtractorMediaSource.Factory(mediaDataSourceFactory)
.createMediaSource(uri, handler, listener)
}
else -> throw IllegalStateException("Unsupported type for createMediaSource: $type")
}
}
private var player: SimpleExoPlayer? = null
private lateinit var playerView: SimpleExoPlayerView
private lateinit var binding: FragmentHostVideoFullBinding
private var playbackPosition: Long = 0
private var currentWindow: Int = 0
private var playWhenReady = true
private var inErrorState: Boolean = false
private lateinit var adsLoader: ImaAdsLoader
private lateinit var manifestDataSourceFactory: DataSource.Factory
private lateinit var mediaDataSourceFactory: DataSource.Factory
override fun onCreate(savedInstanceState: Bundle?) {
super.onCreate(savedInstanceState)
//Initialize the adsLoader
adsLoader = ImaAdsLoader(activity as Context, Uri.parse("https://pubads.g.doubleclick.net/gampad/ads?sz=640x480&iu=/124319096/external/ad_rule_samples&ciu_szs=300x250&ad_rule=1&impl=s&gdfp_req=1&env=vp&output=vmap&unviewed_position_start=1&cust_params=deployment%3Ddevsite%26sample_ar%3Dpremidpost&cmsid=496&vid=short_onecue&correlator="))
manifestDataSourceFactory = DefaultDataSourceFactory(
context, Util.getUserAgent(context, "BUO-APP"))//TODO change the applicationName with the right application name
//
mediaDataSourceFactory = DefaultDataSourceFactory(
context,
Util.getUserAgent(context, "BUO-APP"),//TODO change the applicationName with the right application name
DefaultBandwidthMeter())
}
private fun initializePlayer() {
/*
* Since the player can change from null (when we release resources) to not null we have to check if it's null.
* If it is then reset again
* */
if (player == null) {
//Initialize the Exo Player
player = ExoPlayerFactory.newSimpleInstance(DefaultRenderersFactory(activity as Context),
DefaultTrackSelector())
}
val uri = Uri.parse(videoURl)
val mediaSourceWithAds = buildMediaSourceWithAds(uri)
//Bind the view from the xml to the SimpleExoPlayer instance
playerView.player = player
//Add the listener that listens for errors
player?.addListener(PlayerEventListener())
player?.seekTo(currentWindow, playbackPosition)
player?.prepare(mediaSourceWithAds, true, false)
//In case we could not set the exo player
player?.playWhenReady = playWhenReady
//We got here without an error, therefore set the inErrorState as false
inErrorState = false
//Re update the retry button since, this method could have come from a retry click
updateRetryButton()
}
private inner class PlayerEventListener : Player.DefaultEventListener() {
fun updateResumePosition() {
player?.let {
currentWindow = player!!.currentWindowIndex
playbackPosition = Math.max(0, player!!.contentPosition)
}
}
override fun onPlayerStateChanged(playWhenReady: Boolean, playbackState: Int) {
//The player state has ended
//TODO check if there is going to be a UI change here
// if (playbackState == Player.STATE_ENDED) {
// showControls()
// }
// updateButtonVisibilities()
}
override fun onPositionDiscontinuity(@Player.DiscontinuityReason reason: Int) {
if (inErrorState) {
// This will only occur if the user has performed a seek whilst in the error state. Update
// the resume position so that if the user then retries, playback will resume from the
// position to which they seek.
updateResumePosition()
}
}
override fun onPlayerError(e: ExoPlaybackException?) {
var errorString: String? = null
//Check what was the error so that we can show the user what was the correspond problem
if (e?.type == ExoPlaybackException.TYPE_RENDERER) {
val cause = e.rendererException
if (cause is MediaCodecRenderer.DecoderInitializationException) {
// Special case for decoder initialization failures.
errorString = if (cause.decoderName == null) {
when {
cause.cause is MediaCodecUtil.DecoderQueryException -> getString(R.string.error_querying_decoders)
cause.secureDecoderRequired -> getString(R.string.error_no_secure_decoder,
cause.mimeType)
else -> getString(R.string.error_no_decoder,
cause.mimeType)
}
} else {
getString(R.string.error_instantiating_decoder,
cause.decoderName)
}
}
}
if (errorString != null) {
//Show the toast with the proper error
Toast.makeText(activity as Context, errorString, Toast.LENGTH_LONG).show()
}
inErrorState = true
if (isBehindLiveWindow(e)) {
clearResumePosition()
initializePlayer()
} else {
updateResumePosition()
updateRetryButton()
}
}
}
private fun clearResumePosition() {
//Clear the current resume position, since there was an error
currentWindow = C.INDEX_UNSET
playbackPosition = C.TIME_UNSET
}
/*
* This is for the multi window support
* */
private fun isBehindLiveWindow(e: ExoPlaybackException?): Boolean {
if (e?.type != ExoPlaybackException.TYPE_SOURCE) {
return false
}
var cause: Throwable? = e.sourceException
while (cause != null) {
if (cause is BehindLiveWindowException) {
return true
}
cause = cause.cause
}
return false
}
private fun buildMediaSourceWithAds(uri: Uri): MediaSource {
/*
* This content media source is the video itself without the ads
* */
val contentMediaSource = ExtractorMediaSource.Factory(
DefaultHttpDataSourceFactory("BUO-APP")).createMediaSource(uri) //TODO change the user agent
/*
* The method constructs and returns a ExtractorMediaSource for the given uri.
* We simply use a new DefaultHttpDataSourceFactory which only needs a user agent string.
* By default the factory will use a DefaultExtractorFactory for the media source.
* This supports almost all non-adaptive audio and video formats supported on Android. It will recognize our mp3 file and play it nicely.
* */
return AdsMediaSource(
contentMediaSource,
/* adMediaSourceFactory= */ this,
adsLoader,
playerView.overlayFrameLayout,
/* eventListener= */ null, null)
}
override fun onStart() {
super.onStart()
if (Util.SDK_INT > 23) {
initializePlayer()
}
}
override fun onResume() {
super.onResume()
hideSystemUi()
/*
* Starting with API level 24 Android supports multiple windows.
* As our app can be visible but not active in split window mode, we need to initialize the player in onStart.
* Before API level 24 we wait as long as possible until we grab resources, so we wait until onResume before initializing the player.
* */
if ((Util.SDK_INT <= 23 || player == null)) {
initializePlayer()
}
}
}
The ad never shows, and when it does, it throws a rendering error (E/ExoPlayerImplInternal: Renderer error) that prevents the video from showing. I have also run the sample code from the IMA ads documentation (https://developers.google.com/interactive-media-ads/docs/sdks/android/) and it doesn't work either. Has anyone implemented ExoPlayer successfully with the latest ExoPlayer library version?
Please help. Thanks!
When running on an emulator, be sure to enable GPU rendering on the virtual device.
The problem was that the emulator could not render video, which is why neither the ads nor the video showed. Run the app on a physical phone and it will work.