I have uploaded some video files to my Azure Media Service with the multi-bitrate MP4 encoding. I have the Media Service set up with one streaming unit and a Premium subscription, so it supports adaptive bitrate streaming.
In my Android app I use the default VideoView widget, but it doesn't seem to actually be using adaptive bitrate streaming. How can I make sure it is using adaptive bitrate?
EDIT: we are using the HLSv4 link from Azure Media Service (format=m3u8-aapl)
What kind of streaming protocol are you using, exactly? The standard media library on Android is somewhat limited in this regard, so you might want to take a look at ExoPlayer, which supports a much wider range of streaming protocols (such as DASH and SmoothStreaming).
There's also a wrapper for ExoPlayer that allows you to use it more or less as a drop-in replacement for your VideoView.
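One way to confirm that the server side is actually offering adaptive bitrate is to fetch the HLS master playlist and count its variant streams; whether the client then switches between them depends on the player (VideoView's underlying MediaPlayer vs. ExoPlayer). Below is a minimal sketch; the class name is made up, and the parsing is naive (it splits attributes on commas, which is fine for BANDWIDTH but would misread quoted attribute values):

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical helper: lists the BANDWIDTH of each variant stream in an
// HLS master playlist. More than one #EXT-X-STREAM-INF entry means the
// server is offering multiple bitrates.
public class HlsVariantCheck {
    public static List<Long> variantBandwidths(String masterPlaylist) {
        List<Long> bandwidths = new ArrayList<>();
        for (String line : masterPlaylist.split("\n")) {
            line = line.trim();
            if (line.startsWith("#EXT-X-STREAM-INF:")) {
                // Attributes are comma-separated, e.g. BANDWIDTH=1280000
                String attrs = line.substring("#EXT-X-STREAM-INF:".length());
                for (String attr : attrs.split(",")) {
                    String[] kv = attr.split("=", 2);
                    if (kv.length == 2 && kv[0].trim().equals("BANDWIDTH")) {
                        bandwidths.add(Long.parseLong(kv[1].trim()));
                    }
                }
            }
        }
        return bandwidths;
    }

    public static void main(String[] args) {
        String sample = "#EXTM3U\n"
                + "#EXT-X-STREAM-INF:BANDWIDTH=628000,RESOLUTION=320x180\n"
                + "variant_0.m3u8\n"
                + "#EXT-X-STREAM-INF:BANDWIDTH=1628000,RESOLUTION=640x360\n"
                + "variant_1.m3u8\n";
        System.out.println(variantBandwidths(sample)); // prints [628000, 1628000]
    }
}
```

If this returns only one bandwidth, the problem is on the packaging side, not in the player.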
Related
I'm developing an Android application that allows users to watch TV channels via streaming.
The user taps on a channel (for example, channel 1) and an activity shows the live video. My question: are there other solutions, besides using a WebView, to show the live video?
Are there more professional or functional solutions?
You can use ExoPlayer to play streams. Take a look at the demo app. As the official documentation says:
ExoPlayer has support for Dynamic Adaptive Streaming over HTTP (DASH)
and SmoothStreaming, neither of which are supported by MediaPlayer
(it also supports HTTP Live Streaming (HLS), MP4, MP3, WebM, M4A,
MPEG-TS and AAC).
But make sure you can get the direct link to your streams.
I'm trying to make my app play videos from TV channels that broadcast online on their home pages. Apparently I need to know what streaming protocol is appropriate for those kinds of videos. Does that mean I need to know which protocol they use for their streaming, or should I choose my own protocol? And what should I consider when choosing?
And a final question: I've heard that choosing the appropriate class (MediaPlayer or VideoView) depends on the protocol. Is that true? The class also has to support swiping on the screen.
Thanks in advance.
Firstly, it is worth checking that the stream you want to play is actually available for playback: many online TV providers use encryption and authentication mechanisms so that their video streams can only be played back in an app or browser that a registered user has logged in to.
Assuming that it is available then you need to check to see what format they make it available in.
In high level terms streaming video is typically packaged as follows:
raw video
-> encoded into compressed format (e.g. H.264)
-> packaged into container (e.g. mp4) along with audio streams etc
-> packaged into adaptive bit rate streaming format (e.g. HLS, Smoothstreaming, MPEG DASH)
Different devices and different browsers support different encoding, packaging and streaming formats.
Assuming that you want to use an HTML5 browser, either standalone or in a web view in an app, the following links provide a good, regularly updated overview of which devices and browsers support which encoding and streaming formats for HTML5 video playback (this is a constantly changing picture, so you need to check the current status using links such as these):
https://developer.mozilla.org/en-US/docs/Web/HTML/Supported_media_formats
http://www.jwplayer.com/html5/
So your steps:
make sure the video is available either unprotected, or that you have access to the encryption keys, authentication credentials, etc.
identify the streaming technology being used, for example by looking at the file type in the URL (e.g. '.mpd' for a DASH manifest)
look at the individual video and audio streams within the streaming 'index' or 'manifest' file and check that your device can support them
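For the second step, a rough first pass is to sniff the format from the manifest URL itself. The sketch below is an assumption-laden heuristic (real-world URLs don't always carry a telltale extension, so checking the Content-Type header or the manifest body is more reliable), and the class name is invented:

```java
// Rough-and-ready protocol sniffing from a stream URL. The order of
// checks matters: an Azure-style HLS URL like
// ".../video.ism/manifest(format=m3u8-aapl)" contains both ".ism" and
// "format=m3u8", so the more specific patterns are tested first.
public class StreamFormatGuess {
    public static String guessFormat(String url) {
        String u = url.toLowerCase();
        if (u.contains(".mpd") || u.contains("format=mpd")) return "MPEG-DASH";
        if (u.contains(".m3u8") || u.contains("format=m3u8")) return "HLS";
        if (u.contains(".ism") || u.contains("/manifest")) return "Smooth Streaming";
        if (u.endsWith(".mp4")) return "Progressive MP4";
        return "unknown";
    }

    public static void main(String[] args) {
        // prints "HLS"
        System.out.println(guessFormat("http://example.com/video.ism/manifest(format=m3u8-aapl)"));
    }
}
```

Treat an "unknown" result as a cue to inspect the manifest contents directly, as described in step 3.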
You can take a shortcut initially by testing the streams you have on your target device in some of the available browser-based test players for the different formats, for example for DASH:
http://www.dash-player.com/demo/manifest-test/
http://shaka-player-demo.appspot.com
If they play here then you should be able to get them working in your app.
I am testing out Azure Media Services and I am looking for a preset or a custom configuration that takes an mp3 file and encodes it for streaming playback on iOS, Android (4.0+) and HTML5 (with one manifest, hopefully). Currently I see presets for HTML5 and HLS (none for Android), but they are separate and not in one config/workflow. How can I set this up? Note that I am using the UI and not programming at this time.
Are you planning to deliver in both HLS and MPEG-DASH across multiple HTML5 browsers? You will need to use both of those protocols to reach all of the devices that you have in mind. Android has a really poor implementation of HLS. Most of the Android devices only support HLS v3, so make sure to test your devices with the v3 (muxed ts) protocol.
I would encourage you to use the Azure Media Explorer tool for everything.
http://aka.ms/amse
It gives you easier access to all of the protocol URLs that you will need.
Try encoding everything to standard-definition, multiple-bitrate MP4 files to begin with. Most Android devices only like Baseline profile encoding.
Once you have multiple-bitrate MP4 files encoded, you will need to enable at least one streaming reserved unit to get "dynamic packaging" to work. You need that feature to re-package your MP4 files on the fly into HLS and DASH.
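With dynamic packaging, one Smooth Streaming locator yields the other protocol URLs by appending a format selector to the manifest path. The sketch below shows the idea; the (format=...) suffixes are the ones Azure Media Services documents for its streaming endpoints, while the base URL in main is a made-up example:

```java
// Sketch: deriving HLS and DASH URLs from a single Smooth Streaming
// manifest URL via Azure Media Services dynamic packaging. The helper
// class is hypothetical; the format selectors are AMS conventions.
public class DynamicPackagingUrls {
    // HLS v4 (the format the question's edit mentions)
    public static String hlsV4(String smoothManifestUrl) {
        return smoothManifestUrl + "(format=m3u8-aapl)";
    }
    // HLS v3 (muxed TS) - the safer choice for older Android devices
    public static String hlsV3(String smoothManifestUrl) {
        return smoothManifestUrl + "(format=m3u8-aapl-v3)";
    }
    // MPEG-DASH
    public static String dash(String smoothManifestUrl) {
        return smoothManifestUrl + "(format=mpd-time-csf)";
    }

    public static void main(String[] args) {
        String base = "http://example.streaming.mediaservices.windows.net/locator/video.ism/manifest";
        System.out.println(hlsV4(base));
        System.out.println(dash(base));
    }
}
```

This is why a single set of multiple-bitrate MP4s plus one streaming reserved unit is enough to reach iOS (HLS), Android (HLS v3) and HTML5 browsers (DASH) at once.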
Also, if you are looking for an awesome player framework for HTML5 delivery, check out our new Azure Media Player: http://azure.microsoft.com/blog/2015/04/15/announcing-azure-media-player/
I am trying to play a video in Android native code using the new MediaCodec API. I don't want to go the MediaPlayer way, for unavoidable reasons. Can anybody share a code snippet showing how to go about it? Thanks in advance.
Your original question is too generic. And to be honest, creating a new media player in native code is a huge task to take on by yourself.
If you are only looking for a media player solution with better support for a variety of formats/codecs, like VLC player, you can try the VLC library, which is open source but still in beta. I have tried VLC, but it really has some crash and ANR issues inside the framework itself.
Or you can try the Vitamio SDK, which is a library distributed without source code. Check it out at this link: https://github.com/yixia/VitamioBundle Below is its feature list:
I have tried this solution and it is very stable; there are some minor issues on 4.3, but it is still acceptable. I am not posting any spam here, just copying from the official documentation:
Vitamio is an open multimedia framework or library for Android and iOS, with full and real hardware accelerated decoder and renderer. It's the simple, clean and powerful API of Vitamio that makes it famous and popular in multimedia apps development for Android and iOS.
According to the developers' feedback, Vitamio has been used by more than 1000 apps and 100 million users around the world.
Vitamio can play 720p/1080p HD mp4, mkv, m4v, mov, flv, avi, rmvb, rm, ts, tp and many other video formats on Android and iOS. Almost all popular streaming protocols are supported by Vitamio, including HLS (m3u8), MMS, RTSP, RTMP, and HTTP.
Network Protocols
The following streaming protocols are supported for audio and video playback:
MMS
RTSP (RTP, SDP), RTMP
HTTP progressive streaming
HLS - HTTP live streaming (M3U8)
And yes, Vitamio can handle on demand and live videos in all above protocols.
Media formats
Vitamio uses FFmpeg for demuxing and as its main decoder; many audio and video codecs are packed into Vitamio besides the default media formats built into the Android platform. Some of them are listed below.
DivX/Xvid
WMV
FLV
TS/TP
RMVB
MKV
MOV
M4V
AVI
MP4
3GP
Subtitles
Vitamio supports the display of many external and embedded subtitle formats.
SubRip(.srt)
Sub Station Alpha(.ssa) / Advanced Sub Station Alpha(.ass)
SAMI(.smi/.sami)
MicroDVD(.sub/.txt)
SubViewer2.0(.sub)
MPL2(.mpl/.txt)
Matroska (.mkv) Subtitle Track
More features
Support wide range screens from small phone to large tablet
Multiple audio tracks support
Multiple subtitle tracks supported, including external and embedded ones
Processor optimization for many platforms
Buffering when streaming
Adjustable aspect ratio
Automatic text encoding detection
I know that Android supports RTSP streaming if you use the native video player, but I have not been able to find out whether you can stream video on Android using the HTML5 video tag.
The real reason I want to figure this out is I will be using Apple HTTP Live Streaming to serve video to the iPhone using HTML5 and would like to keep things simple and be able to simply define another streaming video source for android.
Any help would be greatly appreciated! Thanks.
No, it doesn't work, and probably never will. However, HTTP Live Streaming is supported from Android 3.0 onwards.
I recommend you use a flexible media server such as Wowza. It can stream HLS as well as RTSP and RTMP (Flash) from a single source.