WebRTC onAddStream not called in native APIs - Android

I am working with a video-call plugin for an app. Everything works fine on the web, but on mobile using React Native there is only support for onAddStream, not ontrack. The call sets up fine, but during renegotiation, when converting an audio call to a video call, video from mobile to web is added successfully while video from web to mobile is not, because the onAddStream event is not fired when tracks are added to an existing stream.
A workaround has been given in this link and this one, which is to return a new stream every time with new track labels. If I do this and use addStream instead of addTrack, will it raise compatibility issues with other browsers? Also, as the peer connection is between the browser and Janus, or between mobile and Janus, does this need to be implemented in Janus or in the other peer? Is there any other workaround I can use, or has someone handled renegotiation with the older APIs, i.e. onaddstream?
If this is the only option, can someone guide me on how to change the labels for tracks and streams, either directly or by editing the SDP? The same solution is mentioned in this answer, but I was unable to find any relevant code.
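From what I understand, the workaround amounts to rewriting the a=msid: lines (stream and track ids) in the SDP before calling setLocalDescription, so the remote side sees a brand-new stream and fires onaddstream again. A rough sketch of what I imagine this looks like (the function name and id scheme are illustrative, not from any library, and real SDP may need more careful handling):

```javascript
// Rewrite the stream/track ids in the "a=msid:" lines of an SDP blob so the
// remote peer treats the tracks as belonging to a brand-new stream.
// `newStreamId` should be freshly generated for each renegotiation.
function rewriteMsid(sdp, newStreamId) {
  let trackIndex = 0;
  return sdp
    .split('\r\n')
    .map((line) => {
      if (line.startsWith('a=msid:')) {
        trackIndex += 1;
        // "a=msid:<stream id> <track id>" -- replace both ids
        return `a=msid:${newStreamId} ${newStreamId}-track${trackIndex}`;
      }
      return line;
    })
    .join('\r\n');
}
```

The idea would be to apply this to offer.sdp after createOffer and before setLocalDescription, on whichever peer generates the renegotiation offer.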
Looking forward to your answers.

Related

Play Domain Restricted Video in Android/iOS App

I am building an Android & iOS app that has a video player, and I am using a video hosting site (Wistia) for my videos. All the videos are domain restricted, which means they will only play on an allowed domain. The videos play properly inside the web app (as we have allowed playback for that domain), but I am not able to play them in my Android/iOS app.
Note: When I remove domain restriction from the video, then I am able to play the video in my app.
Can someone help me find the domain of my Android app? Where should I define it in the code?
Below is the Wistia embedded code:
<script src="https://fast.wistia.com/embed/medias/j4q2kxdfd4.jsonp" async></script><script src="https://fast.wistia.com/assets/external/E-v1.js" async></script><span class="wistia_embed wistia_async_j4q2kxdfd4 popover=true popoverAnimateThumbnail=true" style="display:inline-block;height:84px;position:relative;width:150px"> </span>
Thank you.
Wistia is targeted at websites - they did have an iOS mobile app in the past, but this was aimed more at content owners, I believe, and is no longer supported either way.
They highlight this in their documentation (at time of writing):
Mobile OS Support
Most mobile devices only support HTML5 playback, which is Wistia’s default for mobile. This includes Android phones and tablets (4.1 and up), and iOS devices like iPhones and iPads.
To include Wistia in an app, the most recent approach I have seen recommended by Wistia is to use a WebView and the standard embed code. This will allow you to use the usual domain restrictions you have set.
The domain checking feature is most likely using the 'origin' or the 'referrer' field in the HTTPS request to determine the site the embed code is being used on. It is possible it uses a more complex mechanism than this, but I think you will have to contact Wistia directly and ask for support if that is the case.
Assuming it is this mechanism, you can look at the request headers in a browser inspector. For example, looking at the requests made by a site that uses Wistia, you will see that both the origin and the referrer carry the same top-level domain name as the site hosting the videos.
A website in a mobile app will work the same way, but if you are using a WebView in an Android app you will need to set the fields yourself. You may need to experiment, as there seem to be different approaches, but this is a good starting point: https://stackoverflow.com/a/5342527/334402
If you set these headers to a domain that is included in your set of allowed domains and the video still will not play then I think you will need to contact Wistia support directly.

video call using Session Initiation Protocol android

Is it possible to make a video call using the Session Initiation Protocol (SIP) on Android?
I found SipAudioCall for audio calling but did not find any similar class for video calling.
It is possible; however, video streaming is not implemented in the platform SIP API.
Consider using one of the third-party libraries.
See also:
Android video calls using android's sip
Video Call through SIP in Android

Issues adding Chromecast support to Android App

I am attempting to integrate Google Cast into my app using the CastCompanionLibrary. Integrating it is simple: the ActionBar is updated with the Cast icon, and I can connect to my Chromecast devices. However, when I attempt to start playback, I get one of two results:
VideoCastControllerActivity starts, but a spinning loading icon displays and no video is played on the Chromecast. The only option is to hit the back button.
After the step above, attempting to play a video just produces a Toast stating "Failed to load media".
What I can't determine is whether the problem is in the app, in the content, or in the receiver registered in the Google Cast Dev Console. My content is stored on a MythTV backend and can be HLS, MP4, MKV, AVI, or 3GPP. I don't get any exceptions in the logs, neither in my code nor in the CastCompanionLibrary.
Any ideas on how to debug this issue?
Anyone know of a test tool that I can plug a url in to verify if the content will cast correctly on a Chromecast?
I suggest you start with simple MP4 content to first make sure that your app is set up correctly. Once you are able to play MP4, you can move on to more sophisticated formats. The media formats that Chromecast supports are listed on our documentation site, so you need to limit your content to that list. The next thing to do, for supported formats, is to make sure that the server serving the content provides CORS headers, since that is a requirement for almost all content (MP4 being the exception). Finally, if that is satisfied as well, take a look at your receiver logs (turn on logging on the receiver side through the Chrome console), and that should give you additional information as to why your content is not playing. To access the receiver logs/console, you need to be running your own custom receiver or your own styled receiver.
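One quick way to check the CORS point above is to fetch your media URL's response headers (e.g. with curl -I) and verify that Access-Control-Allow-Origin covers the receiver's origin. A minimal sketch of that check; the function name and header values are illustrative, not part of the Cast SDK:

```javascript
// Given the response headers from the media server (header names lowercased),
// check whether Access-Control-Allow-Origin would let a receiver page loaded
// from `receiverOrigin` fetch the media.
function corsAllows(headers, receiverOrigin) {
  const allow = headers['access-control-allow-origin'];
  if (!allow) return false; // no CORS header at all: non-MP4 content will fail
  return allow === '*' || allow === receiverOrigin;
}
```

If this check fails for your MythTV backend, configuring the web server to emit the CORS headers is the fix, not anything in the sender app.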

How does the Twitch-application stream to Android?

I'm currently working on a project to develop an e-sports streaming calendar for a company. The app works fine, but the problem is that Twitch only streams Flash, and on Android that no longer works after Google's decision to remove Flash support. HTTP Live Streaming (HLS) isn't very well supported either, so the group is currently at a dead end.
My question is therefore: How does the Twitch-application stream to Android?
It works on Android devices that don't support Flash or HLS, so there must be another way to do it.
My guess is it probably already uses HLS or RTSP (RTMP + RTSP is the most common scenario) inside its Flash client; the Android app is merely another stream-client implementation.
As for HLS, it doesn't need any kind of native support to work on Android; it's just plain HTTP, and you can even write your own implementation if you want. Android's native MediaPlayer API already provides an implementation. It's the same for RTMP + RTSP.
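To illustrate how little sits on top of HTTP: an HLS master playlist is plain text listing variant streams. A toy parser (sketch only, not a spec-complete implementation; tag handling is simplified):

```javascript
// Parse the variant streams out of an HLS master playlist (.m3u8).
// Each "#EXT-X-STREAM-INF:" tag line is followed by the variant's URI.
function parseMasterPlaylist(m3u8Text) {
  const lines = m3u8Text.split('\n').map((l) => l.trim());
  const variants = [];
  for (let i = 0; i < lines.length; i++) {
    if (lines[i].startsWith('#EXT-X-STREAM-INF:')) {
      const bw = /BANDWIDTH=(\d+)/.exec(lines[i]);
      variants.push({
        bandwidth: bw ? parseInt(bw[1], 10) : 0,
        uri: lines[i + 1], // the URI is on the line after the tag
      });
    }
  }
  return variants;
}
```

A client then just picks a variant by bandwidth and starts fetching the media playlist and segments over ordinary HTTP GETs.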
So, as for your problem, there are two ways I can think of to solve it:
Get a router that supports packet sniffing (e.g. one flashed with OpenWRT and with tcpdump installed), reverse-engineer the URL and protocol the Twitch Android client uses, then use them in your app.
pros: no dependency on twitch app itself
cons: harder to pull off; may break if Twitch changes its internal protocol
Reverse-engineer the Intent the Twitch app passes to its video player Activity, and craft one of your own to let the user open the player and watch the stream.
tools you may find useful: https://play.google.com/store/apps/details?id=uk.co.ashtonbrsc.android.intentintercept
pros: it's more reliable and more consistent
cons: may not work if the Intent is private, and depends on the user having the Twitch app installed
UPDATE:
I just found out the Twitch website works in the Android native browser, too. It seems to use the <video> tag from the HTML5 standard, so the simplest solution could be to just wrap that stream page in a WebView, but that's not good for the user experience.
Alternatively, you could write server-side code that accepts a stream-page URL as a parameter and returns the <video> tag as output, using a regular expression, XPath, or an HTML parser library to extract the tag for the client. The client app can then set up a WebView with just that <video> tag inside it. This approach prevents your app from breaking if Twitch changes its page structure.
Also, if you'd rather not use a WebView, you can extract the src attribute of that <video> tag and play it with Android's native MediaPlayer API.
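A rough sketch of that server-side extraction step, using a regex (an HTML parser would be more robust in production; the pattern here is illustrative only):

```javascript
// Extract the first <video> tag and its src attribute from a stream page's
// HTML. Returns null if no <video> tag is found.
function extractVideoTag(html) {
  // Match either <video ...>...</video> or a self-closing/void <video ...>
  const tag = /<video\b[^>]*>[\s\S]*?<\/video>|<video\b[^>]*\/?>/i.exec(html);
  if (!tag) return null;
  const src = /src\s*=\s*["']([^"']+)["']/i.exec(tag[0]);
  return { tag: tag[0], src: src ? src[1] : null };
}
```

The returned tag string can be handed to a WebView, or the src value passed straight to a native player.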

Use the iframe api (youtube) with Cordova for android

After a good two hours of searching the Internet I can't find a solution to my problem.
I'm trying to develop an application for Android and iOS using Cordova. In this application I make use of the YouTube IFrame API. I need to be notified when a certain video ends in order to load another one.
But it seems that this API does not work on Android (though it works on iOS). Does anyone have a solution for this?
Here is what I've thought of:
Basic embedding of a YouTube video (without the API) and a custom listener to be notified when a video ends (no idea how to do that...)
Use a Cordova plugin for the Android YouTube API (the one I saw could only load a video and had no listeners)
