I am new to mobile programming, and I am making an application which will play video files. I am trying Azure Media Services (AMS) with Xamarin.Forms. I would like to know how it works to have a file that will play with Widevine on Android and FairPlay on iOS.
I need explanations!
Thanks in advance!
The implementation on the client side for those different DRMs is about as different as you can imagine. Xamarin.Forms does not provide, out of the box, a component that can handle the DRM-specific methods. You will likely need to create platform-specific plugins for that, or find a ready-made component, although in my search I haven't found one. The closest is a cross-platform video player such as https://github.com/adamfisher/Xamarin.Forms.VideoPlayer
Also, you are not likely to get a single file to play; it will be a stream with different manifests for the two platforms. Azure Media Services, however, can generate those on the fly from a set of MP4s. So it might look like MP4s, but what is actually served to the client is a video manifest.
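To give an idea of what the "on the fly" part looks like, here is a sketch of the manifest URLs that AMS dynamic packaging exposes for a single asset. The account name, locator ID, and asset name are placeholders, and these are the classic AMS format strings, so double-check them against the current docs:

```java
// One AMS asset, repackaged on the fly into different manifests.
// Placeholders: account name, locator ID, and asset name.
String base = "https://myaccount.streaming.mediaservices.windows.net/locator-id/video.ism";

String dash   = base + "/manifest(format=mpd-time-csf)"; // MPEG-DASH -> Widevine on Android
String hls    = base + "/manifest(format=m3u8-aapl)";    // HLS -> FairPlay on iOS
String smooth = base + "/manifest";                      // Smooth Streaming (the default)
```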
There is no library/component in Xamarin that allows you to handle multiple DRM schemes. Your best option is to wrap the platform-specific libraries in order to consume DRM content. I would check out the Inside Secure DRM solution, which allows playback of the DRM schemes that you listed.
What I want to achieve:
I need to support offline videos on Android devices, just like Netflix, and prohibit their distribution using DRM.
What I've done till now:
I've converted a sample video into HLS (m3u8) format using Shaka Packager, following this tutorial: https://google.github.io/shakapackager/html/tutorials/widevine.html
Problems faced:
1. Is this enough for DRM protection?
2. I know I'll have to use a licensed Widevine server, but I can't find anywhere how to get one. Please help me out on this.
3. I suppose for point 2 I have to store a secret key on the server. This same key will be used on the Android device to enable the video player. I'm a little confused about how to set this up.
Thanks in advance!!
You probably want to use MPEG-DASH instead of HLS. Widevine doesn't support HLS as the packaging format, and MPEG-DASH is what pretty much all Widevine content is packaged as.
The Shaka Packager documentation has information both on how to package DASH and on how to apply DRM.
In order for DRM to work, you will, as you mention, need a Widevine license server. You have two options for this: option 1 is to become a CWIP (Certified Widevine Implementation Partner) yourself; the other is to work with an existing CWIP. You are also correct that the key used to encrypt the content should be stored on the license server side. Some of the more popular providers are DRMToday and BuyDRM.
When you have your content, and the license server, the last piece you need is a video player. For Android, the most popular player is ExoPlayer which is developed at least partially by Google. ExoPlayer has documentation on how to work with Widevine and has a downloader component.
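To make the ExoPlayer part concrete, here is a rough sketch of wiring Widevine into DASH playback. It assumes an ExoPlayer 2.x-era API (the DRM classes have been reshuffled between releases, so check against the version you use), and the license and manifest URLs are placeholders:

```java
import android.content.Context;
import android.net.Uri;
import com.google.android.exoplayer2.DefaultRenderersFactory;
import com.google.android.exoplayer2.ExoPlayerFactory;
import com.google.android.exoplayer2.SimpleExoPlayer;
import com.google.android.exoplayer2.drm.DefaultDrmSessionManager;
import com.google.android.exoplayer2.drm.FrameworkMediaCrypto;
import com.google.android.exoplayer2.drm.HttpMediaDrmCallback;
import com.google.android.exoplayer2.drm.UnsupportedDrmException;
import com.google.android.exoplayer2.source.dash.DashMediaSource;
import com.google.android.exoplayer2.source.dash.DefaultDashChunkSource;
import com.google.android.exoplayer2.trackselection.DefaultTrackSelector;
import com.google.android.exoplayer2.upstream.DefaultHttpDataSourceFactory;

public final class WidevineDashPlayback {

    public static SimpleExoPlayer start(Context context) throws UnsupportedDrmException {
        // License requests go to your CWIP's license server (placeholder URL).
        HttpMediaDrmCallback drmCallback = new HttpMediaDrmCallback(
                "https://license.example.com/widevine",
                new DefaultHttpDataSourceFactory("my-app"));
        DefaultDrmSessionManager<FrameworkMediaCrypto> drmSessionManager =
                DefaultDrmSessionManager.newWidevineInstance(drmCallback, null);

        // Hand the DRM session manager to the renderers.
        SimpleExoPlayer player = ExoPlayerFactory.newSimpleInstance(
                new DefaultRenderersFactory(context, drmSessionManager),
                new DefaultTrackSelector());

        // Play the DASH manifest produced by the packager (placeholder URL).
        DefaultHttpDataSourceFactory dataSourceFactory =
                new DefaultHttpDataSourceFactory("my-app");
        DashMediaSource mediaSource = new DashMediaSource.Factory(
                new DefaultDashChunkSource.Factory(dataSourceFactory), dataSourceFactory)
                .createMediaSource(Uri.parse("https://cdn.example.com/video/stream.mpd"));

        player.prepare(mediaSource);
        player.setPlayWhenReady(true);
        return player;
    }
}
```

The offline case adds a download step on top of this, but the license server and DASH packaging stay the same.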
I am planning on having a YouTube player in my Android app and found two alternatives: the IFRAME API or the YouTube Android Player API. So far I haven't found any reference comparing the two approaches.
I am new to both, so I don't have a good background to compare them (yet). But so far this is what I know:
IFRAME API:
Pro: Don't need to get a Developer/App Key to access the API.
Pro: Don't need to include the YouTube Player API's jar (doesn't increase your APK size)
Con: Unnecessary WebView and JavaScript glue code to hook up
YT Player API:
Pro: Native Java, no JavaScript glue code needed
Con: Needs the YouTube app installed on the device
Con: Need to get a Developer/App Key and include the API jar in your APK.
Playing with both, I don't yet know how the two compare in performance or memory usage. I also don't know whether we can get more detailed events from the API than through the IFRAME.
I am trying to assess these but would like to hear if any of you have opinions on these.
Thank you
Here you can find a few reasons for not using the YouTube Player API.
Overall I'd say: if you need to do basic stuff (like using YouTubeBaseActivity/YouTubeStandalonePlayer) you can safely use it. If you need to use the YouTubePlayerFragment, a WebView-based approach may be a better idea.
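For that basic case, the native API needs very little code; a minimal sketch, where the developer key, layout, and video ID are placeholders:

```java
import android.os.Bundle;
import com.google.android.youtube.player.YouTubeBaseActivity;
import com.google.android.youtube.player.YouTubeInitializationResult;
import com.google.android.youtube.player.YouTubePlayer;
import com.google.android.youtube.player.YouTubePlayerView;

public class PlayerActivity extends YouTubeBaseActivity
        implements YouTubePlayer.OnInitializedListener {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_player); // layout contains a YouTubePlayerView
        YouTubePlayerView playerView = (YouTubePlayerView) findViewById(R.id.player_view);
        playerView.initialize("YOUR_DEVELOPER_KEY", this); // key from the Developers Console
    }

    @Override
    public void onInitializationSuccess(YouTubePlayer.Provider provider,
                                        YouTubePlayer player, boolean wasRestored) {
        if (!wasRestored) {
            player.cueVideo("VIDEO_ID"); // cues the video without autoplaying
        }
    }

    @Override
    public void onInitializationFailure(YouTubePlayer.Provider provider,
                                        YouTubeInitializationResult error) {
        // Fires e.g. when the YouTube app is missing or out of date on the device.
    }
}
```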
Why should you consider not using the official library from YouTube?
If you’re concerned about stability:
The YouTube Android Player API is not the best API ever designed. You are probably going to be fine if you need to use the YouTubeBaseActivity/YouTubeStandalonePlayer, but you’re going to run into issues with the YouTubePlayerFragment.
The library has some very old bugs, this one is the most significant I have encountered. While developing my app I kept running into it, seeing my app randomly crash for apparently no reason. It made my app unstable and never ready for production.
The bug is still there, as far as I know. A new version of the library should be in the making, but it has yet to be released.
If you don’t want to be tied to Google Play and the YouTube app:
In order to run an app that utilizes the YouTube Android Player API, a device needs to have both Google Play and the latest version of the YouTube app installed.
This may not be a limitation in most cases, since you’re probably going to distribute your app through Google Play. But I have talked with people that had this problem, maybe you care about it as well.
If you want more control over the player looks and behavior:
The YouTube Android Player API is not open source, therefore the customization you can do only goes as far as the API allows. Maybe you want to change the UI of the player or write some custom behavior specific to your use cases. That is going to be hard to do with the official library.
If you don’t want to register your app in the Google Developers Console:
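And if you do take the WebView route instead, the basic embed is only a few lines; a rough sketch, with the video ID as a placeholder:

```java
import android.webkit.WebView;

// Inside an Activity, with a WebView in the layout:
WebView webView = (WebView) findViewById(R.id.web_view);
webView.getSettings().setJavaScriptEnabled(true); // the IFrame player needs JS

String html = "<html><body style=\"margin:0\">"
        + "<iframe width=\"100%\" height=\"100%\" frameborder=\"0\" "
        + "src=\"https://www.youtube.com/embed/VIDEO_ID?enablejsapi=1\">"
        + "</iframe></body></html>";

// A real base URL matters: some embeds refuse to play from a null origin.
webView.loadDataWithBaseURL("https://www.youtube.com", html, "text/html", "utf-8", null);
```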
I am developing a hybrid mobile application (using Ionic 2) that allows users to overlay audio on a video. Essentially, the video and audio track are able to play at the same time. The challenge is getting this behavior with Google Cast functionality – I want the user to be able to Cast the video as well as the separate audio overlay to their Cast device.
I’ve decided to develop a custom Cordova plugin (starting with Android) that implements the Google Cast API natively, for the following reasons: the Google Cast Web API depends on the Chrome browser, so our hybrid app running in Cordova’s WebView won’t work; and the most up-to-date Cordova plugin that I have seen A) doesn’t implement this audio overlay functionality and B) appears to have been abandoned since Sept 2015.
That’s the background, now the question.
As I understand it so far, the Cast API generally works by sending the Receiver Application a URI, and the Receiver Application takes care of fetching this resource, either from a server or from the Sender app’s resources. This poses a problem for me: my application utilizes two resources simultaneously.
I saw here that MediaInfo can represent a grouping of MediaTracks, which could be audio, video, text, etc. Am I able to have multiple MediaTracks active at the same time? Should I be exploring a custom implementation in a Custom Receiver App to enable multiple active MediaTracks?
Should I look into demuxing the mp4, mixing the audio streams, muxing the result into a separate, temporary mp4 file, and handing the URI of this temp mp4 off to the Cast device?
I’m in the research phase right now, but will be implementing and testing various solutions over the next few weeks. What other creative solutions can anyone think of? Has anyone done this before? And lastly, can anyone say for certain that this cannot be done?
Any help/advice is appreciated.
The Cast SDK on the receiver doesn't support more than one active media element. Even if you write a custom receiver and include two media elements, one for video and one for audio, only one can be active at a time, so that is not going to work. If you can mix them into one MP4, then that is the best approach, and it works with the Default or Styled receivers as well (hence no need to write a custom one).
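Once the streams are mixed into a single MP4, loading it from your plugin's native Android side is a standard single-item load; a sketch assuming the Cast v3 sender framework (the content URL and title are placeholders):

```java
import android.content.Context;
import com.google.android.gms.cast.MediaInfo;
import com.google.android.gms.cast.MediaMetadata;
import com.google.android.gms.cast.framework.CastContext;
import com.google.android.gms.cast.framework.CastSession;
import com.google.android.gms.cast.framework.media.RemoteMediaClient;

public final class CastLoader {

    // Call from the plugin's action handler once the mixed MP4 is hosted somewhere.
    public static void castMixedFile(Context context) {
        MediaMetadata metadata = new MediaMetadata(MediaMetadata.MEDIA_TYPE_MOVIE);
        metadata.putString(MediaMetadata.KEY_TITLE, "Video with audio overlay mixed in");

        MediaInfo mediaInfo = new MediaInfo.Builder("https://cdn.example.com/mixed.mp4")
                .setStreamType(MediaInfo.STREAM_TYPE_BUFFERED)
                .setContentType("video/mp4")
                .setMetadata(metadata)
                .build();

        CastSession session = CastContext.getSharedInstance(context)
                .getSessionManager()
                .getCurrentCastSession();
        if (session != null && session.isConnected()) {
            RemoteMediaClient client = session.getRemoteMediaClient();
            client.load(mediaInfo, true /* autoplay */, 0 /* start position, ms */);
        }
    }
}
```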
Installing Sonos or Chromecast enables compliant apps (called, in the Cast case, senders) to use these kinds of streaming APIs.
Now, without going into the details of the specific API, and without necessarily thinking of a multimedia streaming app, let's keep it simple.
What are the ways, on common platforms such as iOS and Android, to provide the following:
letting the API-enabled apps know that the specific app/service is installed, so its API is usable
providing a callable, system-wide framework/API
a way of retrieving the list of the apps that use the API, and a way of sending queries to them (Sonos, for example, can ask for the list of tracks)
On the internet I can find various examples of the client code to use these specific APIs, but nothing about how to build such APIs. Links to docs and examples are more than welcome.
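On Android, the usual building block for the first two points is an exported Service behind a custom intent action: the provider app declares it in its manifest, and client apps resolve and bind to it at runtime. A minimal client-side sketch, where the action string is hypothetical:

```java
import android.content.ComponentName;
import android.content.Context;
import android.content.Intent;
import android.content.ServiceConnection;
import android.content.pm.PackageManager;
import android.content.pm.ResolveInfo;
import android.content.pm.ServiceInfo;
import java.util.List;

public final class ApiDiscovery {

    // Hypothetical action; the provider app declares it on an exported Service,
    // which is what makes the API callable system-wide.
    private static final String ACTION_BIND_API = "com.example.mediaapi.ACTION_BIND";

    public static boolean bindIfInstalled(Context context, ServiceConnection connection) {
        Intent probe = new Intent(ACTION_BIND_API);
        PackageManager pm = context.getPackageManager();
        List<ResolveInfo> services = pm.queryIntentServices(probe, 0);
        if (services.isEmpty()) {
            return false; // provider app not installed, so its API is unavailable
        }
        // Make the intent explicit (required for service binding on Android 5.0+).
        ServiceInfo info = services.get(0).serviceInfo;
        probe.setComponent(new ComponentName(info.packageName, info.name));
        return context.bindService(probe, connection, Context.BIND_AUTO_CREATE);
    }
}
```

The same queryIntent* technique also covers the third point in reverse: the provider can enumerate client apps that declare a known intent filter. On iOS there is no direct equivalent; there you would typically ship a framework/SDK plus a network discovery protocol.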
I'm currently planning on building an application using the Ionic framework, which wraps AngularJS and Cordova.
The app, once downloaded must be able to play its own audio files that were bundled with the application without streaming them from the internet.
Can anyone give guidance as to whether the phone's 'hard drive' can be accessed to store audio files on? Or, if not, whether the $localstorage facility would be suitable for storing audio files of up to perhaps 50 MB?
You can find the info for it in the API docs.
I haven't used Cordova yet, but that should work.
localStorage in the browser is limited to 2.5 MB, 5 MB, or 10 MB (or unlimited), depending on the browser, so 50 MB of audio will not fit. I'm not sure how it is handled in a native WebView, but probably with similar limits.
This blog post contains a text-read demo that could also help.