Server for broadcasting RTSP video to Android

I am new to video streaming and am working on a project to broadcast video to Android phones over the internet; the number of users viewing at the same time may reach 100.
After looking around for a while, I think RTSP streaming may be a convenient choice for the phone client (am I right?), so I need to choose a server. My current candidates are:
VLC
Darwin Streaming Server
Are they suitable, or is there a better choice?
How would these two servers perform with 100 users accessing the stream at the same time?
Thanks in advance
Regards
Bolton

RTSP streaming in H.264/AAC would be the most convenient way to reach Android devices. End users will not need to install an app or open the stream in one; the native media player will open the stream seamlessly.
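For instance, on the client this can be as little as pointing a VideoView at the stream. A minimal sketch, assuming an H.264/AAC stream; the rtsp:// URL is a placeholder:

```java
import android.app.Activity;
import android.net.Uri;
import android.os.Bundle;
import android.widget.MediaController;
import android.widget.VideoView;

// Minimal Activity that plays an RTSP stream with the stock Android player.
public class RtspPlayerActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        VideoView videoView = new VideoView(this);
        setContentView(videoView);
        videoView.setMediaController(new MediaController(this));
        // Placeholder URL; the stream should be H.264/AAC for wide device support.
        videoView.setVideoURI(Uri.parse("rtsp://example.com/live.sdp"));
        videoView.start();
    }
}
```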
If you intend to use VLC for the encoding portion, you may want to reconsider, as I'm not sure it supports H.264/AAC compression, which is required to reach Android devices. You may want to consider commercial software like Wirecast, or the free Flash Media Encoder with the AAC plugin.
Darwin Streaming Server is stable enough to handle that load (100 concurrent viewers); however, the throughput you have available and the bit rate you broadcast at are more important factors when delivering video. In other words, your upload bandwidth must be sufficient: at 500 kbps per viewer, for example, 100 concurrent viewers need roughly 50 Mbps upstream. If this isn't intended strictly as a DIY project, I would suggest tapping into a commercial CDN's network (I would recommend NetroMedia).

Related

How do streaming apps change video quality based on changes in network speed?

How do streaming apps like YouTube, Hotstar, or any other video player app programmatically detect that the network is getting slow at run time, and change video quality in response?
Many streaming services nowadays use HTTP-based streaming protocols, but there are exceptions, especially in low-latency streaming, e.g. WebRTC or WebSocket-based solutions.
Assuming you're using an HTTP-based protocol like HLS or MPEG-DASH, the "stream" is a long chain of video segments that are downloaded one after another. A video segment is a file in TS or MP4 format (in some MP4 cases, video and audio are split into separate files); typically a segment contains 2, 6, or 10 seconds of audio and/or video.
Based on the playlist or manifest (or sometimes simply by decoding the segment), the player knows how many seconds a single segment contains. It also knows how long it took to download that segment. You can measure the available bandwidth by dividing the (average) size of a segment file by the (average) time it took to download.
The moment it takes longer to download a segment than to play it, you know the player will stall as soon as its buffer is empty; stalling is generally referred to as "buffering". Adaptive Bitrate (ABR) is a technique that tries to prevent buffering; see https://en.wikipedia.org/wiki/Adaptive_bitrate_streaming. When the player notices that the available bandwidth is lower than the bit rate of the video stream, it can switch to another version of the same stream with a lower bit rate (typically achieved through higher compression and/or lower resolution, which means lower quality, but that is better than buffering).
PS #1: WebRTC and WebSocket-based streaming solutions cannot use this measuring trick and must implement other solutions.
PS #2: Newer variants of HLS (e.g. LL-HLS and LHLS) and of MPEG-DASH use other HTTP technologies (like chunked transfer or HTTP server push) to achieve lower latency; these typically do not work well with the measuring technique above and use different approaches, which I consider out of scope here.
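A minimal sketch of that measuring trick in plain Java (the rendition bit rates and the 80% safety margin are made-up values; real players such as ExoPlayer do this internally with smoothed moving averages):

```java
import java.util.List;

// Toy ABR decision: estimate throughput from recent segment downloads and
// pick the highest rendition whose bit rate fits within a safety margin.
public class AbrEstimator {
    // Hypothetical renditions of the same stream, in bits per second.
    private static final long[] RENDITION_BPS = {400_000, 1_200_000, 3_000_000};
    private static final double SAFETY = 0.8; // only trust 80% of measured bandwidth

    /** Bytes downloaded and elapsed milliseconds for one segment. */
    public record SegmentStat(long bytes, long millis) {}

    public static long pickRendition(List<SegmentStat> recent) {
        if (recent.isEmpty()) return RENDITION_BPS[0];
        long totalBytes = 0, totalMillis = 0;
        for (SegmentStat s : recent) {
            totalBytes += s.bytes();
            totalMillis += s.millis();
        }
        // Throughput in bits per second = (bytes * 8) / seconds.
        double throughputBps = (totalBytes * 8.0) / (totalMillis / 1000.0);
        long best = RENDITION_BPS[0]; // always fall back to the lowest rendition
        for (long bps : RENDITION_BPS) {
            if (bps <= throughputBps * SAFETY) best = bps;
        }
        return best;
    }
}
```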
You have to use a streaming server to do that; Wowza is one of them (not free). The client and server exchange information about the connection, and the server serves chunks of the video at a bit rate suited to the network speed.

What is the most efficient way to implement HTTP Live Video Streaming in Android?

For the past month I have been searching over the Internet for ways to implement recording live video from an application on Android and sending it over to a server, but the more I research the more confused I get.
First of all, I am looking for a streaming protocol that can also be used for iOS in the future, so I came to the conclusion that DASH (Dynamic Adaptive Streaming over HTTP) is the ideal solution.
In addition, the recent Android media player library, ExoPlayer, supports this format.
Furthermore, I do not wish to use a live streaming engine such as WOWZA.
Secondly, based on my research I also concluded that any HTTP server can be used to receive the "chunks" of data, but I must have a streaming server to be able to stream the video back to the users.
I believe this process is quite complex, but I will not give up until I successfully make it work.
Lastly, my question is: what server and protocol should I use to achieve this? And how do I convert the video and send it to the server?
Looking at your questions re protocol and server:
A 'streaming protocol that can be used for iOS also in the future'
It probably depends what you mean by 'future'. At the moment, Apple requires you to use HLS on iOS for any video over 10 minutes long delivered over a mobile (cellular) network. DASH is establishing itself as the industry standard, so this may change and Apple may accept it as well, but if you need something in the near future you may want to plan to support both DASH and HLS.
What server should you use for streaming
Streaming video is complex and the domain changes fast, so it really is good to use or build on a dedicated streaming server if you can. These will generally have mechanisms and/or well-documented procedures for converting input videos to the different formats and bit rates you need, depending on your reach and user-experience goals. Reach determines the different encodings you need, since different browsers and devices support different encodings; and if you want your users to have a good experience and avoid buffering, you will also want multiple bit-rate versions of each format. This is what allows DASH and HLS to provide adaptive bit-rate streaming (ABR), meaning the clients can select the best bit rate at any given time depending on network conditions. Video manipulation, especially transcoding, is a CPU-intensive task, so another advantage of dedicated streaming-server software is that it should be optimised as much as possible to reduce your server load.
If you do decide to go the streaming-server route, there are open-source alternatives as well as Wowza, which you mention above, such as GStreamer:
https://gstreamer.freedesktop.org
It has plugins that support ABR etc.; if you search for 'GStreamer streaming server ABR' you will find some good blogs about setting this up.
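On the client side, since the question mentions ExoPlayer: playing a DASH manifest takes only a few lines. A sketch assuming a recent ExoPlayer 2.x release; the manifest URL is a placeholder:

```java
import android.content.Context;
import com.google.android.exoplayer2.MediaItem;
import com.google.android.exoplayer2.SimpleExoPlayer;
import com.google.android.exoplayer2.ui.PlayerView;

// Minimal DASH playback with ExoPlayer 2.x. ExoPlayer handles the adaptive
// switching between the bit-rate variants declared in the manifest.
public class DashPlayback {
    public static SimpleExoPlayer start(Context context, PlayerView view) {
        SimpleExoPlayer player = new SimpleExoPlayer.Builder(context).build();
        view.setPlayer(player);
        // Placeholder URL: any MPEG-DASH manifest (.mpd) your server exposes.
        player.setMediaItem(MediaItem.fromUri("https://example.com/stream/manifest.mpd"));
        player.prepare();
        player.play();
        return player;
    }
}
```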

Lowest-latency way to stream live video to iOS and Android

I'm making an app that needs to send a video feed from a single source to a server where it can be accessed by desktop browsers and mobile apps.
So far, I've been using Adobe Media Server 5 with a live RTMP stream. This gives me about a 2.5-second delay in desktop browsers, but no native support on iOS; it does leave me the option of using Air to export the app for iOS, which produces a minimum 5-6 second delay.
The iOS docs strongly recommend HTTP Live Streaming, which segments the stream into chunks and serves it using a dynamic playlist in a .m3u8 file. Doing this produces a 15+ second delay in desktop browsers and on mobile devices. A Google search seemed to reveal that this is to be expected from HLS.
I need a maximum of 2-4 seconds of delay across all devices, if possible. I've gotten poor results with Wowza, but am open to revisiting it. FFmpeg seems inefficient, but I'm open to that as well if someone has had good results with it. Anybody have any suggestions? Thanks in advance.
I haven't even begun to find the most efficient way to stream to Android, so any help in that department would be much appreciated.
EDIT: Just to be clear, my plan is to make an iOS app, whether it's written natively or in Air. Same goes for Android, but I've yet to start on that.
In the iOS browser, HLS is the only way to serve live video. The absolute lowest latency comes from using 2-second segments with a two-segment window in the manifest. That gives you 4 seconds of latency on the client, plus another 2 to 4 on the server. There is no way to do better without writing an app.
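To illustrate, a live media playlist with 2-second segments and a two-segment window would look roughly like this (a hand-written sketch, not real packager output; segment names are placeholders):

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:2
#EXT-X-MEDIA-SEQUENCE:120
#EXTINF:2.0,
segment120.ts
#EXTINF:2.0,
segment121.ts
```

The absence of #EXT-X-ENDLIST marks the playlist as live; since a client typically buffers the whole window before starting, the two 2-second segments account for the 4 seconds of client-side latency mentioned above.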
A 15-second delay for HLS streams is pretty good; to get lower latency you need to use a different streaming protocol.
RTP/RTSP will give you the lowest latency and is typically used for VoIP and video conferencing, but you will find it very difficult to use across multiple mobile and Wi-Fi networks (some of them unintentionally block RTP).
If you can write an iOS app that supports RTMP, that is the easiest way to go, and it should work on Android too (only old Android versions support Flash/RTMP natively). Decoding in software will result in poor battery life. There are other iOS apps that don't use HLS for streaming, but I think you need to limit this to your own service (not a generic video player).
Also, please remember that higher latency buys you higher video quality, less buffering, and a better user experience, so don't reduce latency unnecessarily.

Is it mandatory to use Darwin, Wowza, or VLC to stream live video on Android?

I want to know whether it is mandatory to use a streaming server like Darwin, Wowza, or VLC to play an RTSP live stream. I receive an RTSP link from my client, and it tends to change every time. I can play it successfully in the VLC player, but on the phone I can't see anything. I tried playing a sample link with a .3gp extension and it worked fine, but my links don't have an extension; they look like this: rtsp://122.166.229.151:1950/1346a0cf0ef7c2. Please help me. If it is compulsory to use an extension or a server, I will continue working in that direction.
A streaming server (as you describe) isn't strictly necessary; as long as you can pull RTSP from whatever your source is, you should be able to see it. Most IP cameras have onboard RTSP servers (although I wouldn't put too many connections on one). If you can see it in VLC, the phone should be able to consume it as well, provided the codec used to encode it is supported by the Android device (in most cases, if you're doing H.264 Baseline 3.0 with AAC, you should be good to go).
A streaming server like Wowza can make that stream available to a wider audience than pulling directly from the source device, but if you're not intending to broadcast to a wide audience, it's not required for streaming to Android devices.
Newer versions of Android (Gingerbread and later) are also able to consume Apple HTTP Live Streaming.
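As a sketch of consuming the stream directly (no server in between), the framework MediaPlayer can be pointed straight at the source's RTSP URL. The address below is a placeholder; note that no file extension is needed, because the codecs are negotiated over RTSP/SDP rather than inferred from the URL:

```java
import android.media.MediaPlayer;
import android.view.SurfaceHolder;

// Plays an RTSP source directly with the framework MediaPlayer,
// e.g. the onboard RTSP server of an IP camera.
public class DirectRtsp {
    public static MediaPlayer play(SurfaceHolder holder) throws Exception {
        MediaPlayer player = new MediaPlayer();
        player.setDisplay(holder);
        // Placeholder address; extensionless RTSP URLs are fine.
        player.setDataSource("rtsp://192.0.2.10:1950/stream");
        player.setOnPreparedListener(MediaPlayer::start);
        player.prepareAsync();
        return player;
    }
}
```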

Android: how to record, upload, transcode, download, and play video

I'm researching the development of an Android (2.2) app/service that will enable users to record short (I do emphasize short, < 30 seconds) video on their phones and then upload that video over HTTP to a server, which will transcode the video to other formats. The same user can download videos from other Android users and play them.
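For concreteness, a minimal sketch of the capture side (assuming an API level with an AAC encoder, i.e. Gingerbread MR1 or later; on 2.2 itself you would fall back to AMR audio; the output path is made up):

```java
import android.media.MediaRecorder;
import android.view.SurfaceHolder;

// Records a short (<= 30 s) H.264/AAC clip in an MP4 container,
// which is a friendly input for server-side transcoding.
public class ShortClipRecorder {
    public static MediaRecorder start(SurfaceHolder preview) throws Exception {
        MediaRecorder rec = new MediaRecorder();
        rec.setAudioSource(MediaRecorder.AudioSource.MIC);
        rec.setVideoSource(MediaRecorder.VideoSource.CAMERA);
        rec.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
        rec.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);   // API 10+; use AMR_NB on 2.2
        rec.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
        rec.setMaxDuration(30_000);            // enforce the 30-second cap
        rec.setOutputFile("/sdcard/clip.mp4"); // made-up path
        rec.setPreviewDisplay(preview.getSurface());
        rec.prepare();
        rec.start();
        return rec;
    }
}
```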
Now, I get a bit lost with everyone's recommended approaches to all the issues in doing something like this, because I haven't seen anyone ask this in a cohesive context. Ideally I would like a non-commercial solution (as in no vendor/service being needed for the video hosting/transcoding), but feel free to include those as a recommendation (I've marked this as a wiki), as I know many like to use YouTube and Vimeo for the middle layer in all this.
The questions are:
What server technologies do you recommend for hosting and transcoding?
What technology do you recommend for streaming the video? (It would be nice to offer a high- and a low-quality encoding depending on the user's network connection.)
What video format and software do you recommend for converting the uploaded video on the server so that it is viewable later by other Android owners?
I'm assuming it's bad to do any transcoding on the phone prior to upload (battery/processor issues), but if I'm wrong with that assumption, what do you recommend?
Some things that may help you...
The video will only need to render on an Android device, and in the future in a WebKit HTML5 browser.
Bandwidth isn't cheap (even with numerous 30-second videos), so a good balance of video quality and video file size is important (streaming if needed to ensure quality, vs. download).
This is for Android 2.2 devices with a video camera, of course, and medium-to-high-density screens of 800x400 minimum.
Open-source solutions (a server to receive the uploads, code to do the transcoding, a server to do the streaming) are preferred, but not required.
CDNs are an option, but I don't think that really figures into the picture right now.
Check out this page to see all the video formats that Android supports for encoding and decoding.
http://developer.android.com/guide/appendix/media-formats.html
For encoding, use FFmpeg or a service like encoding.com.
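For instance, the server could shell out to FFmpeg for each upload. A sketch in Java (paths are placeholders; the flags target H.264 Baseline plus AAC, which the Android formats page above lists as broadly supported):

```java
import java.io.IOException;

// Transcodes an uploaded clip into an Android-friendly MP4:
// H.264 Baseline + AAC, with the moov atom up front for progressive download.
public class Transcoder {
    public static void toAndroidMp4(String input, String output)
            throws IOException, InterruptedException {
        Process p = new ProcessBuilder(
                "ffmpeg", "-y", "-i", input,
                "-c:v", "libx264", "-profile:v", "baseline", "-level", "3.0",
                "-c:a", "aac", "-b:a", "128k",
                "-movflags", "+faststart", // moov atom first => playback can begin while downloading
                output)
                .inheritIO()
                .start();
        if (p.waitFor() != 0) {
            throw new IOException("ffmpeg failed for " + input);
        }
    }
}
```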
