I have an app that implements video calling using WebRTC on Android. From the app, I create an offer for a call and receive the answer from another app or from the web.
App to app everything works fine: I can see the remote peer, and the remote peer can also see me.
But when I go from app (creating the offer) to web (receiver), in the app I can see both the local and the remote stream, while on the web I can't see the remote stream, only the local one.
This is my Android code for creating an offer:
val pcConstraints = object : MediaConstraints() {
    init {
        optional.add(KeyValuePair("DtlsSrtpKeyAgreement", "true"))
        optional.add(KeyValuePair("OfferToReceiveAudio", "true"))
        optional.add(KeyValuePair("OfferToReceiveVideo", "true"))
    }
}

val peerConnection = getOrCreatePeerConnection(mRemoteSocketId, "A")
peerConnection.createOffer(object : CustomSdpObserver() {
    override fun onCreateSuccess(sessionDescription: SessionDescription?) {
        Timber.d("onCreateSuccess: ")
        peerConnection.setLocalDescription(CustomSdpObserver(), sessionDescription)
        if (sessionDescription != null) {
            // sending sessionDescription from here
        }
        // starting the call from here
    }
}, pcConstraints)
Here is how the peerConnection is created:
private fun getOrCreatePeerConnection(socketId: String, role: String): PeerConnection {
    Timber.tag("live").d("getOrCreatePeerConnection socketId $socketId role $role")
    var peerConnection = peerConnectionMap[socketId]
    if (peerConnection != null) {
        return peerConnection
    }

    val rtcConfig = PeerConnection.RTCConfiguration(iceServers)
    rtcConfig.bundlePolicy = PeerConnection.BundlePolicy.MAXBUNDLE
    rtcConfig.rtcpMuxPolicy = PeerConnection.RtcpMuxPolicy.REQUIRE
    rtcConfig.keyType = PeerConnection.KeyType.ECDSA
    rtcConfig.iceTransportsType = PeerConnection.IceTransportsType.ALL
    rtcConfig.enableCpuOveruseDetection = true

    peerConnection = peerConnectionFactory?.createPeerConnection(
        rtcConfig,
        pcConstraints,
        object : CustomPCObserver() {
            override fun onIceCandidate(p0: IceCandidate?) {
                super.onIceCandidate(p0)
                Timber.tag("live").d("getOrCreatePeerConnection onIceCandidate $p0")
                if (p0 != null) {
                    SignalingServer.get()?.sendIceCandidate(p0)
                }
            }

            override fun onAddStream(p0: MediaStream?) {
                super.onAddStream(p0)
                Timber.tag("live").d("onAddStream Remote MediaStream ${p0?.videoTracks?.size}")
                gotRemoteStream(p0!!)
            }

            override fun onRenegotiationNeeded() {
                super.onRenegotiationNeeded()
                Timber.tag("live").d("onRenegotiationNeeded")
            }
        })

    peerConnection!!.addStream(localMediaStream)
    peerConnectionMap[socketId] = peerConnection
    Timber.tag("live").d("getOrCreatePeerConnection size $socketId ${peerConnectionMap.size}, ${peerConnectionMap.values}")
    return peerConnection
}
So, what am I missing? I believe that on the web end my local stream is somehow not received. Your help would be highly appreciated.
I've followed Google's instructions on how to cast media metadata to a Chromecast. The initial load is fine: it shows the title and image and plays the stream. But my problem is that I am streaming live audio and need to update the metadata from time to time without having to buffer the audio again.
This is a sample of my code:
override fun loadMediaLoadRequestData(request: PlatformBridgeApis.MediaLoadRequestData?) {
    if (request == null) return

    val remoteMediaClient: RemoteMediaClient = remoteMediaClient ?: return
    val mediaLoadRequest = getMediaLoadRequestData(request)
    remoteMediaClient.load(mediaLoadRequest)
}
fun getMediaLoadRequestData(request: PlatformBridgeApis.MediaLoadRequestData): MediaLoadRequestData {
    val mediaInfo = getMediaInfo(request.mediaInfo)
    return MediaLoadRequestData.Builder()
        .setMediaInfo(mediaInfo)
        .setAutoplay(request.shouldAutoplay)
        .setCurrentTime(request.currentTime)
        .build()
}

fun getMediaInfo(mediaInfo: PlatformBridgeApis.MediaInfo?): MediaInfo? {
    if (mediaInfo == null) return null

    val streamType = getStreamType(mediaInfo.streamType)
    val metadata = getMediaMetadata(mediaInfo.mediaMetadata)
    val mediaTracks = mediaInfo.mediaTracks.map { getMediaTrack(it) }
    val customData = JSONObject(mediaInfo.customDataAsJson ?: "{}")
    return MediaInfo.Builder(mediaInfo.contentId)
        .setStreamType(streamType)
        .setContentType(mediaInfo.contentType)
        .setMetadata(metadata)
        .setMediaTracks(mediaTracks)
        .setStreamDuration(mediaInfo.streamDuration)
        .setCustomData(customData)
        .build()
}
Does anyone have any suggestion on how to modify loadMediaLoadRequestData in order to trigger the Chromecast receiver to update only the MediaMetadata and not have the stream buffer again?
I am using interactive video broadcasting in my app; I am attaching the class in which I use live streaming.
I get an audio issue when I go back from the live streaming screen to the previous screen: I can still hear the host's audio.
Previously I was using the leave-channel method and destroying the RTC engine object, but with that in place, going back from the streaming class closed the stream for every user of the app, because of the leave-channel call. So I removed it from my onDestroy method.
Now I am using the disable-audio method, which does disable the audio, but when I open the live streaming class again it doesn't enable audio. The enable-audio method is not working; I also tried the mute-local-audio-stream method and the on-user-mute-audio RTC handler.
I am getting this error:
"LiveStreamingActivity has leaked IntentReceiver io.agora.rtc.internal.AudioRoutingController$HeadsetBroadcastReceiver@101a7a7 that was originally registered here. Are you missing a call to unregisterReceiver()? android.app.IntentReceiverLeaked: Activity com.allin.activities.home.homeActivities.LiveStreamingActivity has leaked IntentReceiver io.agora.rtc.internal.AudioRoutingController$HeadsetBroadcastReceiver@101a7a7 that was originally registered here. Are you missing a call to unregisterReceiver()?"
The receiver is registered inside the SDK, and the exception comes from inside the SDK, which is a JAR file I can't edit.
Please help me resolve this issue, as I have to publish the app on the Play Store.
// Firstly I tried this, but it automatically stops the other devices' streaming.
override fun onDestroy() {
    /* if (mRtcEngine != null) {
        leaveChannel()
        RtcEngine.destroy(mRtcEngine)
        mRtcEngine = null
    } */

    // Second, I tried disabling the audio so that the user will not hear the host's voice.
    if (mRtcEngine != null) {
        mRtcEngine!!.disableAudio()
    }
    super.onDestroy()
}
// Then, when I come back from the previous screen to the live streaming activity, everything is initialized again, but the audio is not audible.
override fun onResume() {
    super.onResume()
    Log.e("resume", "resume")
    if (mRtcEngine != null) {
        mRtcEngine!!.enableAudio()
        // mRtcEngine!!.resumeAudio()
    }
}
The code I am using:
// Agora RTC engine and handler initialization
private var mRtcEngine: RtcEngine? = null
private var mRtcEventHandler = object : IRtcEngineEventHandler() {

    @SuppressLint("LongLogTag")
    override fun onFirstRemoteVideoDecoded(uid: Int, width: Int, height: Int, elapsed: Int) {
    }

    override fun onUserOffline(uid: Int, reason: Int) {
        runOnUiThread {
            val a = reason // if login == 0, user is offline
            try {
                if (mUid == uid) {
                    if (surfaceView?.parent != null)
                        (surfaceView?.parent as ViewGroup).removeAllViews()
                    if (mRtcEngine != null) {
                        leaveChannel()
                        RtcEngine.destroy(mRtcEngine)
                        mRtcEngine = null
                    }
                    setResult(IntentConstants.REQUEST_CODE_LIVE_STREAMING)
                    finish()
                }
            } catch (e: Exception) {
                e.printStackTrace()
            }
        }
    }

    override fun onUserMuteVideo(uid: Int, muted: Boolean) {
        runOnUiThread {
            // onRemoteUserVideoMuted(uid, muted);
            Log.e("video", "muted")
        }
    }

    override fun onAudioQuality(uid: Int, quality: Int, delay: Short, lost: Short) {
        super.onAudioQuality(uid, quality, delay, lost)
        Log.e("", "")
    }

    override fun onUserJoined(uid: Int, elapsed: Int) {
        // super.onUserJoined(uid, elapsed)
        mUid = uid
        runOnUiThread {
            try {
                setupRemoteVideo(mUid!!)
            } catch (e: Exception) {
                e.printStackTrace()
            }
        }
        Log.e("differnt_uid----", mUid.toString())
    }
}
private fun initAgoraEngineAndJoinChannel() {
    if (mRtcEngine == null) {
        initializeAgoraEngine()
        setupVideoProfile()
    }
}

// Initializing the RTC engine class
@Throws(Exception::class)
private fun initializeAgoraEngine() {
    try {
        var s = RtcEngine.getSdkVersion()
        mRtcEngine = RtcEngine.create(baseContext, AgoraConstants.APPLICATION_ID, mRtcEventHandler)
    } catch (e: Exception) {
        // Log.e(LOG_TAG, Log.getStackTraceString(e));
        throw RuntimeException("NEED TO check rtc sdk init fatal error\n" + Log.getStackTraceString(e))
    }
}

@Throws(Exception::class)
private fun setupVideoProfile() {
    //mRtcEngine?.muteAllRemoteAudioStreams(true)
    // mLogger.log("channelName account = " + channelName + ",uid = " + 0);
    mRtcEngine?.enableVideo()
    //mRtcEngine.clearVideoCompositingLayout();
    mRtcEngine?.enableLocalVideo(false)
    mRtcEngine?.setEnableSpeakerphone(false)
    mRtcEngine?.muteLocalAudioStream(true)
    joinChannel()
    mRtcEngine?.setVideoProfile(Constants.CHANNEL_PROFILE_LIVE_BROADCASTING, true)
    mRtcEngine?.setChannelProfile(Constants.CHANNEL_PROFILE_LIVE_BROADCASTING)
    mRtcEngine?.setClientRole(Constants.CLIENT_ROLE_AUDIENCE, "")
    val speaker = mRtcEngine?.isSpeakerphoneEnabled
    val camerafocus = mRtcEngine?.isCameraAutoFocusFaceModeSupported
    Log.e("", "")
}
@Throws(Exception::class)
private fun setupRemoteVideo(uid: Int) {
    val container = findViewById<FrameLayout>(R.id.fl_video_container)
    if (container.childCount >= 1) {
        return
    }

    surfaceView = RtcEngine.CreateRendererView(baseContext)
    container.addView(surfaceView)
    mRtcEngine?.setupRemoteVideo(VideoCanvas(surfaceView, VideoCanvas.RENDER_MODE_HIDDEN, uid))
    mRtcEngine?.setRemoteVideoStreamType(uid, 1)
    mRtcEngine?.setCameraAutoFocusFaceModeEnabled(false)
    mRtcEngine?.muteRemoteAudioStream(uid, false)
    mRtcEngine?.adjustPlaybackSignalVolume(0)
    // mRtcEngine.setVideoProfile(Constants.VIDEO_PROFILE_180P, false); // Earlier than 2.3.0
    surfaceView?.tag = uid // for mark purpose

    val audioManager: AudioManager =
        this@LiveStreamingActivity.getSystemService(Context.AUDIO_SERVICE) as AudioManager
    //audioManager.mode = AudioManager.MODE_IN_CALL
    val isConnected: Boolean = audioManager.isWiredHeadsetOn
    if (isConnected) {
        /* audioManager.isSpeakerphoneOn = false
        audioManager.isWiredHeadsetOn = true */
        mRtcEngine?.setEnableSpeakerphone(false)
        mRtcEngine?.setDefaultAudioRoutetoSpeakerphone(false)
        mRtcEngine?.setSpeakerphoneVolume(0)
        mRtcEngine?.enableInEarMonitoring(true)
        // Sets the in-ear monitoring volume to 50% of original volume.
        mRtcEngine?.setInEarMonitoringVolume(200)
        mRtcEngine?.adjustPlaybackSignalVolume(200)
    } else {
        /* audioManager.isSpeakerphoneOn = true
        audioManager.isWiredHeadsetOn = false */
        mRtcEngine?.setEnableSpeakerphone(true)
        mRtcEngine?.setDefaultAudioRoutetoSpeakerphone(true)
        mRtcEngine?.setSpeakerphoneVolume(50)
        mRtcEngine?.adjustPlaybackSignalVolume(50)
        mRtcEngine?.enableInEarMonitoring(false)
        // Sets the in-ear monitoring volume to 50% of original volume.
        mRtcEngine?.setInEarMonitoringVolume(0)
    }
    Log.e("", "")
}
@Throws(Exception::class)
private fun joinChannel() {
    mRtcEngine?.joinChannel(
        null,
        AgoraConstants.CHANNEL_NAME,
        "Extra Optional Data",
        0
    ) // if you do not specify the uid, we will generate the uid for you
}

@Throws(Exception::class)
private fun leaveChannel() {
    mRtcEngine!!.leaveChannel()
}
I think you first want to put setupRemoteVideo in the onFirstRemoteVideoDecoded callback instead of the onUserJoined callback. Also, in the onDestroy callback, you should call RtcEngine.destroy() instead of RtcEngine.destroy(mRtcEngine).
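For illustration, a minimal sketch of those two changes, reusing mUid, setupRemoteVideo() and leaveChannel() from the question:
// Sketch only: bind the remote view once the first remote video frame has
// actually been decoded, rather than as soon as the user joins.
override fun onFirstRemoteVideoDecoded(uid: Int, width: Int, height: Int, elapsed: Int) {
    mUid = uid
    runOnUiThread {
        try {
            setupRemoteVideo(uid)
        } catch (e: Exception) {
            e.printStackTrace()
        }
    }
}

override fun onDestroy() {
    if (mRtcEngine != null) {
        leaveChannel()
        RtcEngine.destroy() // the static, no-argument destroy()
        mRtcEngine = null
    }
    super.onDestroy()
}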
I am trying to stream video from a Raspberry Pi to an Android device via WebRTC, using Firebase (Firestore) for signalling. I am able to run the setup while both ends are connected to the same Wi-Fi, but it fails when different networks are used.
Device: RPi
Clients:
1) Web client (hosted on Firebase)
2) Android app
When the device and the clients are on the same network (Wi-Fi), both clients are able to play video and audio.
But when the device and client are on different networks, the web client is able to show video while the Android app is not.
Signalling works correctly: on the device, the camera and microphone are started and ICE candidates are exchanged successfully. On Android I also get the remote stream (onAddStream is called), but no video or audio plays.
Android PeerConnectionClient
class PeerConnectionClient(private val activity: MainActivity, private val fSignalling: FSignalling) {

    internal var isVideoRunning = false

    private val rootEglBase by lazy {
        EglBase.create()
    }

    private val peerConnectionFactory: PeerConnectionFactory by lazy {
        val initializationOptions = PeerConnectionFactory.InitializationOptions.builder(activity).createInitializationOptions()
        PeerConnectionFactory.initialize(initializationOptions)

        val options = PeerConnectionFactory.Options()
        val defaultVideoEncoderFactory = DefaultVideoEncoderFactory(rootEglBase.eglBaseContext, true, true)
        val defaultVideoDecoderFactory = DefaultVideoDecoderFactory(rootEglBase.eglBaseContext)
        PeerConnectionFactory.builder()
            .setOptions(options)
            .setVideoEncoderFactory(defaultVideoEncoderFactory)
            .setVideoDecoderFactory(defaultVideoDecoderFactory)
            .createPeerConnectionFactory()
    }

    private val iceServersList = mutableListOf("stun:stun.l.google.com:19302")
    private var sdpConstraints: MediaConstraints? = null
    private var localAudioTrack: AudioTrack? = null
    private var localPeer: PeerConnection? = null
    private var gotUserMedia: Boolean = false
    private var peerIceServers: MutableList<PeerConnection.IceServer> = ArrayList()

    init {
        peerIceServers.add(PeerConnection.IceServer.builder(iceServersList).createIceServer())
        // activity.surface_view.release()
        activity.surface_view.init(rootEglBase.eglBaseContext, null)
        activity.surface_view.setZOrderMediaOverlay(true)
        createPeer()
    }
    private fun createPeer() {
        sdpConstraints = MediaConstraints()
        val audioconstraints = MediaConstraints()
        val audioSource = peerConnectionFactory.createAudioSource(audioconstraints)
        localAudioTrack = peerConnectionFactory.createAudioTrack("101", audioSource)
        gotUserMedia = true
        activity.runOnUiThread {
            if (localAudioTrack != null) {
                createPeerConnection()
                // doCall()
            }
        }
    }
    /**
     * Creating the local peerconnection instance
     */
    private fun createPeerConnection() {
        val constraints = MediaConstraints()
        constraints.mandatory.add(MediaConstraints.KeyValuePair("offerToReceiveAudio", "true"))
        constraints.mandatory.add(MediaConstraints.KeyValuePair("offerToReceiveVideo", "true"))
        constraints.optional.add(MediaConstraints.KeyValuePair("DtlsSrtpKeyAgreement", "true"))

        val rtcConfig = PeerConnection.RTCConfiguration(peerIceServers)
        // TCP candidates are only useful when connecting to a server that supports ICE-TCP.
        rtcConfig.enableDtlsSrtp = true
        rtcConfig.enableRtpDataChannel = true
        // rtcConfig.tcpCandidatePolicy = PeerConnection.TcpCandidatePolicy.DISABLED
        // rtcConfig.bundlePolicy = PeerConnection.BundlePolicy.MAXBUNDLE
        // rtcConfig.rtcpMuxPolicy = PeerConnection.RtcpMuxPolicy.REQUIRE
        // rtcConfig.continualGatheringPolicy = PeerConnection.ContinualGatheringPolicy.GATHER_CONTINUALLY
        // Use ECDSA encryption.
        // rtcConfig.keyType = PeerConnection.KeyType.ECDSA

        localPeer = peerConnectionFactory.createPeerConnection(rtcConfig, constraints, object : PeerObserver {
            override fun onIceCandidate(p0: IceCandidate) {
                super.onIceCandidate(p0)
                onIceCandidateReceived(p0)
            }

            override fun onAddStream(p0: MediaStream) {
                activity.showToast("Received Remote stream")
                super.onAddStream(p0)
                gotRemoteStream(p0)
            }
        })

        addStreamToLocalPeer()
    }
    /**
     * Adding the stream to the localpeer
     */
    private fun addStreamToLocalPeer() {
        // creating local mediastream
        val stream = peerConnectionFactory.createLocalMediaStream("102")
        stream.addTrack(localAudioTrack)
        localPeer!!.addStream(stream)
    }

    /**
     * This method is called when the app is the initiator - we generate the offer
     * and send it over through the socket to the remote peer
     */
    /*private fun doCall() {
        localPeer!!.createOffer(object : mySdpObserver {
            override fun onCreateSuccess(p0: SessionDescription) {
                super.onCreateSuccess(p0)
                localPeer!!.setLocalDescription(object : mySdpObserver {}, p0)
                Log.d("onCreateSuccess", "SignallingClient emit ")
            }
        }, sdpConstraints)
    }*/

    private fun onIceCandidateReceived(iceCandidate: IceCandidate) {
        // we have received an ice candidate. We can send it to the other peer.
        if (localPeer == null) {
            return
        }
        val message = JSONObject()
        message.put("type", "candidate")
        message.put("label", iceCandidate.sdpMLineIndex)
        message.put("id", iceCandidate.sdpMid)
        message.put("candidate", iceCandidate.serverUrl)
        fSignalling.doSignalingSend(message.toString())
    }
    private fun gotRemoteStream(stream: MediaStream) {
        isVideoRunning = true
        // we have the remote video stream. add it to the renderer.
        val videoTrack = stream.videoTracks[0]
        videoTrack.setEnabled(true)
        activity.runOnUiThread {
            try {
                // val remoteRenderer = VideoRenderer(surface_view)
                activity.surface_view.visibility = View.VISIBLE
                // videoTrack.addRenderer(remoteRenderer)
                videoTrack.addSink(activity.surface_view)
            } catch (e: Exception) {
                e.printStackTrace()
            }
        }
    }
    fun onReceivePeerMessage(data: JSONObject) {
        if (data.getString("type") == "offer") {
            // val sdpReturned = SdpUtils.forceChosenVideoCodec(data.getString("sdp"), "H264")
            val sdpReturned = data.getString("sdp")
            // data.remove("sdp")
            // data.put("sdp", sdpReturned)
            val sessionDescription = SessionDescription(SessionDescription.Type.OFFER, sdpReturned)
            localPeer?.setRemoteDescription(object : mySdpObserver {}, sessionDescription)
            localPeer?.createAnswer(object : mySdpObserver {
                override fun onCreateSuccess(p0: SessionDescription) {
                    super.onCreateSuccess(p0)
                    localPeer!!.setLocalDescription(object : mySdpObserver {}, p0)
                    val description = JSONObject()
                    description.put("type", p0.type.canonicalForm())
                    description.put("sdp", p0.description)
                    this@PeerConnectionClient.fSignalling.doSignalingSend(description.toString())
                }

                override fun onCreateFailure(p0: String) {
                    super.onCreateFailure(p0)
                    activity.showToast("Failed to create answer")
                }
            }, MediaConstraints())
        } else if (data.getString("type") == "candidate") {
            val iceCandidates = IceCandidate(data.getString("id"), data.getInt("label"), data.getString("candidate"))
            localPeer?.addIceCandidate(iceCandidates)
        }
    }
    internal fun close() {
        isVideoRunning = false
        localPeer?.close()
        localPeer = null
    }
}
I am under the impression that if the web client can display video while on a different network (mobile hotspot), then the Android client on the same internet connection as that web client should be able to display video as well. Is that wrong?
Why won't Android display the video, given that onAddStream is called?
Is a TURN server required? Again, my assumption is that if the web client works, so should Android. The service I am using on the RPi has no support for a TURN server (see the hypothetical sketch below for how one would be configured).
Additional info: the device is behind a double-NATted ISP (I guess), but since the web client can connect, I assume that isn't the issue.
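For reference, a TURN server would just be one more entry in peerIceServers next to the STUN one. A hypothetical example; the server URL and credentials are placeholders, not from the question:
private var peerIceServers: MutableList<PeerConnection.IceServer> = mutableListOf(
    PeerConnection.IceServer.builder("stun:stun.l.google.com:19302")
        .createIceServer(),
    // Hypothetical TURN deployment; relays media when direct NAT traversal fails.
    PeerConnection.IceServer.builder("turn:turn.example.com:3478")
        .setUsername("user")   // placeholder credential
        .setPassword("secret") // placeholder credential
        .createIceServer()
)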
I have found a solution to the issue. I was using:
private fun onIceCandidateReceived(iceCandidate: IceCandidate) {
    // we have received an ice candidate. We can send it to the other peer.
    if (localPeer == null) {
        return
    }
    val message = JSONObject()
    message.put("type", "candidate")
    message.put("label", iceCandidate.sdpMLineIndex)
    message.put("id", iceCandidate.sdpMid)
    message.put("candidate", iceCandidate.serverUrl)
    fSignalling.doSignalingSend(message.toString())
}
Instead, I needed to use:
message.put("candidate", iceCandidate.sdp) // was: iceCandidate.serverUrl
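Put together, a corrected version of the function; it is identical to the one above except for the candidate field, which must carry the actual SDP candidate string:
private fun onIceCandidateReceived(iceCandidate: IceCandidate) {
    // We have received an ICE candidate; forward it to the other peer.
    if (localPeer == null) {
        return
    }
    val message = JSONObject()
    message.put("type", "candidate")
    message.put("label", iceCandidate.sdpMLineIndex)
    message.put("id", iceCandidate.sdpMid)
    message.put("candidate", iceCandidate.sdp) // the candidate string, not serverUrl
    fSignalling.doSignalingSend(message.toString())
}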
I am trying to cast a YouTube video from my Android app to a Chromecast, or to a smart TV through Miracast.
But I can only cast a direct video source URL, like https://media.w3.org/2010/05/sintel/trailer.mp4. How can I cast a web page with a YouTube or Vimeo video URL?
I know that the YouTube app can cast video to some smart TVs without a Chromecast, and it looks like the YouTube TV page, for example https://www.youtube.com/watch?v=x5ImUYDjocY.
I tried to use the Presentation API and put a WebView in it:
@TargetApi(Build.VERSION_CODES.JELLY_BEAN_MR1)
class CastPresentation constructor(outerContext: Context?, display: Display?) : Presentation(outerContext, display) {

    override fun onCreate(savedInstanceState: Bundle?) {
        val wv = WebView(context)
        wv.settings.javaScriptEnabled = true
        wv.webChromeClient = WebChromeClient()
        wv.loadUrl("https://www.youtube.com/watch?v=DxGLn_Cu5l0")
        setContentView(wv)
        super.onCreate(savedInstanceState)
    }
}
But it has no effect, and I don't understand how to use it. This is how I use it:
@TargetApi(Build.VERSION_CODES.JELLY_BEAN_MR1)
class CastDelegate constructor(private val activity: AppCompatActivity) {

    private var mediaRouter: MediaRouter? = null
    private var mediaRouteSelector: MediaRouteSelector? = null

    // Variables to hold the currently selected route and its playback client
    private var route: MediaRouter.RouteInfo? = null
    private var remotePlaybackClient: RemotePlaybackClient? = null
    private var presentation: Presentation? = null

    // Define the Callback object and its methods, save the object in a class variable
    private val mediaRouterCallback = object : MediaRouter.Callback() {

        override fun onRouteSelected(router: MediaRouter, route: MediaRouter.RouteInfo) {
            Timber.d("CastDelegate --> onRouteSelected: route=$route")
            if (route.supportsControlCategory(MediaControlIntent.CATEGORY_REMOTE_PLAYBACK)) {
                // Stop local playback (if necessary)
                // ...
                // Save the new route
                this@CastDelegate.route = route
                // Attach a new playback client
                remotePlaybackClient = RemotePlaybackClient(activity, this@CastDelegate.route)
                // Start remote playback (if necessary)
                // ...
                updatePresentation()
                val uri = Uri.parse("https://media.w3.org/2010/05/sintel/trailer.mp4")
                remotePlaybackClient?.play(uri, null, null, 0, null, object : RemotePlaybackClient.ItemActionCallback() {
                    override fun onResult(data: Bundle?, sessionId: String?, sessionStatus: MediaSessionStatus?, itemId: String?, itemStatus: MediaItemStatus?) {
                        super.onResult(data, sessionId, sessionStatus, itemId, itemStatus)
                    }
                })
            }
        }

        override fun onRouteUnselected(router: MediaRouter, route: MediaRouter.RouteInfo, reason: Int) {
            Timber.d("CastDelegate --> onRouteUnselected: route=$route")
            if (route.supportsControlCategory(MediaControlIntent.CATEGORY_REMOTE_PLAYBACK)) {
                // Changed route: tear down previous client
                this@CastDelegate.route?.also {
                    remotePlaybackClient?.release()
                    remotePlaybackClient = null
                }
                // Save the new route
                this@CastDelegate.route = route
                updatePresentation()
                when (reason) {
                    MediaRouter.UNSELECT_REASON_ROUTE_CHANGED -> {
                        // Resume local playback (if necessary)
                        // ...
                    }
                }
            }
        }

        override fun onRoutePresentationDisplayChanged(router: MediaRouter?, route: MediaRouter.RouteInfo?) {
            updatePresentation()
        }
    }
    fun onCreate() {
        // Get the media router service.
        mediaRouter = MediaRouter.getInstance(activity)
        // Create a route selector for the type of routes your app supports.
        mediaRouteSelector = MediaRouteSelector.Builder()
            // These are the framework-supported intents
            .addControlCategory(MediaControlIntent.CATEGORY_REMOTE_PLAYBACK)
            .build()

        // val selectedRoute = mediaRouter?.selectedRoute ?: return
        // val presentationDisplay = selectedRoute.presentationDisplay ?: return
        // presentation = CastPresentation(activity, presentationDisplay)
        // presentation?.show()
    }

    fun onStart() {
        mediaRouteSelector?.also { selector ->
            mediaRouter?.addCallback(selector, mediaRouterCallback,
                MediaRouter.CALLBACK_FLAG_REQUEST_DISCOVERY)
        }
        updatePresentation()
    }

    fun onStop() {
        mediaRouter?.removeCallback(mediaRouterCallback)
        presentation?.dismiss()
        presentation = null
    }

    fun onCreateOptionsMenu(menu: Menu?, inflater: MenuInflater?) {
        // Attach the MediaRouteSelector to the menu item
        val mediaRouteMenuItem = menu?.findItem(R.id.media_route_menu_item)
        val mediaRouteActionProvider = MenuItemCompat.getActionProvider(mediaRouteMenuItem) as MediaRouteActionProvider
        // Attach the MediaRouteSelector that you built in onCreate()
        mediaRouteSelector?.also(mediaRouteActionProvider::setRouteSelector)
    }
    private fun updatePresentation() {
        // Get the current route and its presentation display.
        val selectedRoute = mediaRouter?.selectedRoute
        val presentationDisplay = selectedRoute?.presentationDisplay

        // Dismiss the current presentation if the display has changed.
        if (presentation?.display != presentationDisplay) {
            Timber.d("CastDelegate --> Dismissing presentation because the current route no longer has a presentation display.")
            presentation?.dismiss()
            presentation = null
        }

        // Show a new presentation if needed.
        if (presentation == null && presentationDisplay != null) {
            Timber.d("CastDelegate --> Showing presentation on display: $presentationDisplay")
            presentation = CastPresentation(activity, presentationDisplay)
            try {
                presentation?.show()
            } catch (ex: WindowManager.InvalidDisplayException) {
                Timber.d("CastDelegate --> Couldn't show presentation! Display was removed in the meantime.", ex)
                presentation = null
            }
        }
    }
}
As a result, it now plays the video https://media.w3.org/2010/05/sintel/trailer.mp4 via remotePlaybackClient?.play(...).
What I am trying to do is listen to socket data and convert it into an observable string, so that my UI can subscribe to this event and make changes to the UI.
So far I have created a SocketConnection class, maintained in Dagger. The connection happens properly, data is received, and everything works correctly through an interface, but I want to do this with RxKotlin.
Using Socket.io and Kotlin.
SocketConnection class
class SocketConnection : SocketStreamListener {

    private var socket: Socket? = null
    var responseSocket: ResponseHandler? = null

    companion object {
        var instance = SocketConnection()
    }

    override fun createSocket(socketQuery: SocketQuery): Socket? {
        try {
            val okHttpClient = UnsafeOkHttpClient.getUnsafeOkHttpClient()
            IO.setDefaultOkHttpWebSocketFactory(okHttpClient)
            IO.setDefaultOkHttpCallFactory(okHttpClient)
            val opts = IO.Options()
            opts.reconnection = false
            opts.callFactory = okHttpClient
            opts.webSocketFactory = okHttpClient
            opts.query = "userID=" + socketQuery.userID + "&token=" + socketQuery.token
            socket = IO.socket(CommonContents.BASE_API_LAYER, opts)
            L.d("Socket object created")
        } catch (e: URISyntaxException) {
            L.e("Error creating socket", e)
        }
        return socket
    }
    override fun createSocketListener(socket: Socket) {
        L.d("inside the socket listener")
        socket.connect()?.on(Socket.EVENT_CONNECT, {
            L.d("connected")
            listenSocketEvents()
            // socketDataListener()
            createMessageListener()
        })?.on(Socket.EVENT_DISCONNECT, {
            L.d("disconnected")
            return@on
        })
    }
    /**
     * Function used to listen to a socket channel's data
     */
    private fun listenSocketEvents() {
        /* socket?.on("1502", { args ->
            // This Will Work
            L.d("Socket market depth event successfully")
            val socketData = args[0] as String
            L.d(socketData)
            // instance.data = Observable.just(socketData)
            // data!!.doOnNext({ socketData })
        *//*
            data = args[0] as String
            for (i in 0 until arr.size) {
                arr[i].socketStreamingData(data)
            }
        *//*
        }) */
    }
    // This Will Not Work
    fun socketDataListener(): Observable<String> {
        return Observable.create({ subscriber ->
            // L.d("Socket market depth event successfully")
            socket?.on("1502", { args ->
                L.d("Socket market depth event successfully")
                val socketData = args[0] as String
                subscriber.onNext(socketData)
            })
        })
    }
}
Repository
fun getSocketData(): Observable<String> {
    // L.e("" + SocketConnection.instance.socketDataListener())
    return SocketConnection.instance.createMessageListener()
}
ViewModel
fun getSocketData(): Observable<String> {
    return groupRepository.getSocketData()
}
On Fragment (UI)
private fun getSocketUpdate() {
    subscribe(watchlistViewModel.getSocketData()
        .subscribeOn(Schedulers.io())
        .observeOn(AndroidSchedulers.mainThread())
        .subscribe({
            L.d("SocketData : " + it.count())
        }, {
            L.e("Error")
        }))
}
In this UI, I use a disposable subscribe method from the base class.
Please let me know what I am doing wrong. Thanks in advance.
Instead of creating an Observable every time a message is sent, I suggest using a Subject, since it has a similar "nature" to the socket connection.
val subject = PublishSubject.create<String>()
...

fun listenSocketEvents() {
    socket?.on("1502") { args ->
        val socketData = args[0] as String
        subject.onNext(socketData)
    }
}

fun observable(): Observable<String> {
    return subject
}
You can then listen to changes on the subject via the following (repository layer etc. not included, you'd have to do that yourself):
private fun getSocketUpdate() {
    disposable = socketConnection.observable()
        .subscribeOn(Schedulers.io())
        .observeOn(...)
        .subscribe({...}, {...})
}
As a side note, your singleton instance is not how you'd do that in Kotlin.
Instead of having an instance field in a companion object, you should declare the class as object SocketConnection; a minimal sketch follows below.
This will automatically give you all the singleton features. (I do not know whether it is smart to use a singleton with Socket.io, but I assume that you know what you're doing :-) )
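A sketch of that object declaration combined with the PublishSubject from above; the remaining members from the question's class (createSocket, createSocketListener, and so on) would carry over unchanged:
import io.reactivex.Observable
import io.reactivex.subjects.PublishSubject
import io.socket.client.Socket

// `object` makes SocketConnection a language-level singleton: the runtime
// creates exactly one instance, so `companion object { var instance = ... }`
// is no longer needed.
object SocketConnection {
    private var socket: Socket? = null
    private val subject = PublishSubject.create<String>()

    fun observable(): Observable<String> = subject

    fun listenSocketEvents() {
        socket?.on("1502") { args ->
            subject.onNext(args[0] as String)
        }
    }
}
Usage then becomes SocketConnection.observable() instead of SocketConnection.instance.socketDataListener().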