We have to develop a DRM component for audio/video streaming on various mobile platforms. The DRM server supplier is currently unknown, as is the streaming protocol format (our assumption is HLS or RTSP).
Do you have an idea of what has to be developed on the client side? I think that even though we do not have the server ready, we can start doing a lot of things in advance.
BR
SteN
I think you need to lock down at least some component of the DRM implementation. There are a lot of vendors coming out now with cross-platform (iOS & Android) solutions. I don't think RTSP is viable: that type of streaming hasn't been implemented with encryption, as it was only used when the operator/carrier did the streaming themselves, directly to the handsets. There are many vendors with HLS support, and PlayReady (Microsoft Smooth Streaming) appears to be gaining hold on multiple devices as well. Unfortunately you have to look at point solutions, and you may not be able to work generically across all devices on day one unless you build the DRM entirely into your application and hand off to the native media player for playback. A few Google searches around DRM on Android/iOS will turn up many options.
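If you do go the route of building the DRM handling into the app on Android, a minimal sketch of what the client side might look like with ExoPlayer and Widevine is below. The manifest and license server URLs are placeholders, and a specific DRM vendor's SDK may wrap or replace this entirely:

```kotlin
import android.content.Context
import com.google.android.exoplayer2.C
import com.google.android.exoplayer2.ExoPlayer
import com.google.android.exoplayer2.MediaItem

// Placeholder URLs - replace with your packager's manifest and your DRM vendor's license endpoint.
const val MANIFEST_URL = "https://example.com/movie/manifest.mpd"
const val LICENSE_URL = "https://example.com/widevine/license"

fun buildDrmPlayer(context: Context): ExoPlayer {
    // Describe the protected stream and the Widevine license endpoint.
    val mediaItem = MediaItem.Builder()
        .setUri(MANIFEST_URL)
        .setDrmConfiguration(
            MediaItem.DrmConfiguration.Builder(C.WIDEVINE_UUID)
                .setLicenseUri(LICENSE_URL)
                .build()
        )
        .build()

    // ExoPlayer handles the key requests against the license server during playback.
    return ExoPlayer.Builder(context).build().apply {
        setMediaItem(mediaItem)
        prepare()
    }
}
```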
We are working on a mobile video app and need to decide on the video protocol between HLS and MPEG-DASH.
Our key consideration is which of the two has better support and compatibility across browsers and mobile platforms (iOS, Android).
The video content is recorded/uploaded/watched in a mobile app, but also needs to be shareable for viewing in browsers.
It initially seems that HLS has wider platform support, but I would love to hear about anyone's experience.
Thanks!
You will most likely find you have to support both at this time if you want to reach as many users as possible, especially if the content is encrypted.
iOS and Safari typically use HLS with FairPlay; Android, Firefox and Chrome use DASH with Widevine; and Windows and Edge use DASH with PlayReady.
At this time, Apple requires iOS devices to use HLS for content longer than 10 minutes delivered over a cellular network:
2.5.7 Video streaming content over a cellular network longer than 10 minutes must use HTTP Live Streaming and include a baseline 192 kbps HTTP Live stream.
(https://developer.apple.com/app-store/review/guidelines/)
For this reason streams served to Apple devices are usually HLS, while DASH is used for other devices.
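In practice this often comes down to a simple selection step on the server or in the client. A rough sketch of the idea in Kotlin (the URLs are placeholders and the User-Agent matching is only illustrative; real detection is usually more careful):

```kotlin
// Illustrative only: pick a manifest and DRM scheme per client family.
// Real deployments usually key this off the player/platform, not just the User-Agent string.
data class StreamSource(val manifestUrl: String, val drmScheme: String)

fun sourceForUserAgent(userAgent: String): StreamSource {
    val ua = userAgent.lowercase()
    return when {
        // Apple devices and Safari: HLS, protected with FairPlay.
        "iphone" in ua || "ipad" in ua || ("safari" in ua && "chrome" !in ua) ->
            StreamSource("https://cdn.example.com/title/master.m3u8", "FairPlay")
        // Windows / Edge: DASH with PlayReady.
        "edg" in ua ->
            StreamSource("https://cdn.example.com/title/manifest.mpd", "PlayReady")
        // Android, Chrome, Firefox: DASH with Widevine.
        else ->
            StreamSource("https://cdn.example.com/title/manifest.mpd", "Widevine")
    }
}
```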
If your streams are not encrypted you can use CMAF as a single source file. If they are encrypted then it will be some time before devices support a single encrypted CMAF source - see more detail here: https://stackoverflow.com/a/62020034/334402
CMAF info here: https://developer.apple.com/documentation/http_live_streaming/about_the_common_media_application_format_with_http_live_streaming
I have a requirement to develop Android and iOS mobile apps that allow subscribers to view movies the way Netflix and Iflix do.
I would like to know if this can be achieved with the built-in video playback classes or widgets on the Android and iOS platforms, or whether we will need a library or SDK for this.
I came across this URL on how to stream video in Android apps. Would this approach suffice for this requirement?
https://code.tutsplus.com/tutorials/streaming-video-in-android-apps--cms-19888
Netflix and similar systems use ABR to deliver video to mobile devices - ABR allows the client device or player to download the video in chunks, e.g. 10-second chunks, and to select the next chunk from the bit rate most appropriate to the current network conditions. See here for an example:
https://stackoverflow.com/a/42365034/334402
There are several ABR protocols, but the two most common at this time are HLS and DASH. HLS must be used to deliver video streams to iOS devices due to the Apple guidelines (at this time, and for video over 10 minutes which may be accessed on a mobile network - the guidelines can change over time), and DASH is probably more common on Android devices, although HLS can be supported on Android also.
Most Android players can now handle ABR - the Android ExoPlayer is a good example; it is very widely used and supports this natively:
https://github.com/google/ExoPlayer
Take a look at the Developers Guide (included in the link above at the time of writing) which shows how to include ExoPlayer in your app.
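As a rough illustration of how little client code ABR playback needs (the manifest URL is a placeholder; ExoPlayer selects the DASH or HLS rendition automatically from the bandwidth it measures):

```kotlin
import android.content.Context
import com.google.android.exoplayer2.ExoPlayer
import com.google.android.exoplayer2.MediaItem
import com.google.android.exoplayer2.ui.StyledPlayerView

// Placeholder manifest URL - point this at your DASH (.mpd) or HLS (.m3u8) stream.
const val STREAM_URL = "https://example.com/movie/manifest.mpd"

fun startAbrPlayback(context: Context, playerView: StyledPlayerView): ExoPlayer {
    val player = ExoPlayer.Builder(context).build()
    playerView.player = player
    // ExoPlayer inspects the manifest, then switches bit rate per chunk
    // according to current network conditions - no extra ABR code is needed.
    player.setMediaItem(MediaItem.fromUri(STREAM_URL))
    player.prepare()
    player.playWhenReady = true
    return player
}
```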
On iOS the native player supports ABR using HLS.
For the past month I have been searching the Internet for ways to record live video in an Android application and send it over to a server, but the more I research, the more confused I get.
First of all, I am looking for a streaming protocol that can also be used for iOS in the future, so I came to the conclusion that DASH (Dynamic Adaptive Streaming over HTTP) is the ideal solution.
In addition, the recent Android framework, ExoPlayer, supports this feature.
Furthermore, I do not wish to use a Live Streaming engine such as WOWZA.
Secondly, based on my research I also concluded that any HTTP server can be used to receive the "chunks" of data, but I must have a streaming server to be able to stream the video back to the users.
I believe this process is quite complex but I will not give up until I successfully make it work.
Lastly, my question is: what server and protocol should I use to achieve this? And how do I convert the video and send it directly to the server?
Looking at your questions re protocol and server:
A 'streaming protocol that can be used for iOS also in the future'
It probably depends what you mean by 'future'. At the moment Apple requires you to use HLS on iOS for any video over a mobile (cellular) network which is over 10 minutes long. DASH is establishing itself as the industry standard, so this may change and Apple may accept it also, but if you need something in the near future you may want to plan to support both DASH and HLS.
What server should you use for streaming
Streaming video is complex and the domain is fast changing, so it really is good to use or build on a dedicated streaming server if you can. These will generally have mechanisms and/or well-documented procedures for converting input videos to the different formats and bit rates you need, depending on your reach and user-experience goals. Reach will determine the different encodings you need, as different browsers and devices support different encodings, and if you want your users to have a good experience and avoid buffering you will also want multiple bit rate versions of each format - this is what allows DASH and HLS to provide Adaptive Bit Rate streaming (ABR), meaning the clients can select the best bit rate at any given time depending on network conditions.
Video manipulation, especially transcoding, is a CPU-intensive task, so another advantage of dedicated streaming server software is that it should be optimised as much as possible to reduce your server load.
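To make the 'multiple bit rate versions' point concrete: packaging essentially produces one rendition per bit rate plus a master manifest that points at them. A toy sketch of an HLS master playlist builder is below (the bit rate ladder and paths are made up; a real streaming server or packager generates this for you):

```kotlin
// Toy example: each rendition is a separately transcoded version of the same video.
data class Rendition(val bandwidthBps: Int, val resolution: String, val playlistPath: String)

// Made-up bit rate ladder - a real one is chosen from your reach and quality goals.
val ladder = listOf(
    Rendition(800_000, "640x360", "360p/index.m3u8"),
    Rendition(2_000_000, "1280x720", "720p/index.m3u8"),
    Rendition(5_000_000, "1920x1080", "1080p/index.m3u8"),
)

// Build an HLS master playlist; the player picks a variant per chunk
// based on the bandwidth it is currently measuring.
fun masterPlaylist(renditions: List<Rendition>): String =
    buildString {
        appendLine("#EXTM3U")
        for (r in renditions) {
            appendLine("#EXT-X-STREAM-INF:BANDWIDTH=${r.bandwidthBps},RESOLUTION=${r.resolution}")
            appendLine(r.playlistPath)
        }
    }
```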
If you do decide to go the streaming server route, then as well as Wowza, which you mention above, there are open source alternatives such as:
https://gstreamer.freedesktop.org
These have plugins that support ABR etc - if you search for 'GStreamer streaming server ABR' you will find some good blogs about setting this up.
I have an Internet radio station and, to be honest, I have gone through almost everything to figure out how I can actually make a streaming Android application and what I should use for that - with no luck, and not a single piece of useful information.
Can anyone help me?
This is most commonly done with Shoutcast/Icecast HTTP streaming to a MediaPlayer component contained in a Service.
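A bare-bones sketch of that pattern in Kotlin (the stream URL is a placeholder, and a production app would add a foreground notification, audio focus handling and error recovery):

```kotlin
import android.app.Service
import android.content.Intent
import android.media.AudioManager
import android.media.MediaPlayer
import android.os.IBinder

// Minimal Service wrapping MediaPlayer for a Shoutcast/Icecast stream.
// The URL is a placeholder - use your station's stream endpoint.
class RadioService : Service() {

    private var player: MediaPlayer? = null

    override fun onStartCommand(intent: Intent?, flags: Int, startId: Int): Int {
        player = MediaPlayer().apply {
            setAudioStreamType(AudioManager.STREAM_MUSIC)
            setDataSource("http://example.com:8000/stream")
            setOnPreparedListener { it.start() }   // start playback once buffered
            prepareAsync()                          // don't block the main thread
        }
        return START_STICKY
    }

    override fun onDestroy() {
        player?.release()
        player = null
        super.onDestroy()
    }

    override fun onBind(intent: Intent?): IBinder? = null
}
```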
NPR has open-sourced their app, and it's a great reference application for radio.
http://code.google.com/p/npr-android-app/
Note: Shoutcast streaming is currently broken in Google TV, so your app will not work on those devices unless you choose another protocol.
If you wish to only support Android 3+ (including Google TV) you can also try streaming over HLS (HTTP Live Streaming). This protocol is much less commonly used for radio, though.
I would like to know the best way to stream video to an iOS and Android app.
I would like to use the same technology for all mobile operating system.
I am thinking of using HTTP Live Streaming because this is the only protocol supported by iOS.
But I do not know if HLS works on Android, BlackBerry and Windows Phone.
If I use the HTTP Live Streaming, I do not need to use a streaming server like Wowza or DSS, right?
Regards
Aleanar
Only the Android tablet versions (3.0+) support HTTP Live Streaming, but (nearly) all versions support RTSP streaming.
HLS should be your protocol of choice, although it is not supported by legacy handsets such as Nokia and BlackBerry. If you are creating native applications it is possible to use third-party players that support HLS and embed one in your application.
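On Android, a quick first test can be as simple as pointing the platform VideoView at an HLS manifest (the URL below is a placeholder; on older or fragmented devices an embedded third-party player is the more reliable route):

```kotlin
import android.net.Uri
import android.widget.MediaController
import android.widget.VideoView

// Quick HLS smoke test on Android 3.0+ using the built-in player.
// The manifest URL is a placeholder for your own stream.
fun playHls(videoView: VideoView) {
    videoView.setVideoURI(Uri.parse("https://example.com/live/master.m3u8"))
    videoView.setMediaController(MediaController(videoView.context))
    videoView.setOnPreparedListener { videoView.start() }
}
```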