How does the Periscope app broadcast video so successfully? - android

Twitter's new application Periscope broadcasts live video. I watched a broadcast for the first time just a couple of minutes ago, and I wonder how it can stream live video over 3G without any freezing (at least I didn't see any, though maybe others have). Two or three weeks ago I tried Twitter's video post feature and it was a disaster. What is the difference between live streaming and uploading a recorded video? Or is it a difference between iPhone and Android?

The answer is not that simple.
HLS, for example, is how they do it on the web, and it's how Meerkat does it, using short segment sizes to speed up the buffering and the playlist creation that HLS requires.
On mobile they achieve 2-3 seconds of latency, which I have never seen with HLS.
Sniffing the connections, I can see that on mobile they use RTMP, which is far more expensive and far less scalable, to deliver that experience.
Here is a short article about this - note the comments about RTMP playback:
http://www.alamtechstuffs.com/periscope-livestreaming-app/

There's no secret; it's a well-established technique that isn't Twitter-specific.
Uploaded videos are fetched using pseudo-streaming (progressive download), while the live stream is delivered using adaptive bitrate streaming, which means there are multiple renditions of the same live stream for different bandwidths. The player can then choose the version that makes the best use of your connection.
http://en.wikipedia.org/wiki/Adaptive_bitrate_streaming
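To make "multiple renditions" concrete, here is a minimal Kotlin sketch of how an adaptive player might pick a variant. The rendition names, bitrates, and URLs are hypothetical, not Periscope's actual values; in a real HLS player they would come from the BANDWIDTH attributes of the master playlist's EXT-X-STREAM-INF entries.

    // Hypothetical renditions of one live stream; a real player parses these
    // from the master playlist rather than hard-coding them.
    data class Rendition(val name: String, val bitrateBps: Int, val url: String)

    val renditions = listOf(
        Rendition("240p", 400_000, "https://example.com/live/240p.m3u8"),
        Rendition("480p", 1_200_000, "https://example.com/live/480p.m3u8"),
        Rendition("720p", 2_800_000, "https://example.com/live/720p.m3u8")
    )

    // Pick the best rendition that fits the measured bandwidth, with some
    // headroom so small fluctuations don't immediately cause a stall.
    fun pickRendition(measuredBps: Long, headroom: Double = 0.8): Rendition =
        renditions.filter { it.bitrateBps <= measuredBps * headroom }
            .maxByOrNull { it.bitrateBps }
            ?: renditions.minByOrNull { it.bitrateBps }!!

    fun main() {
        println(pickRendition(measuredBps = 1_500_000).name) // prints "480p"
    }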

Related

How do streaming apps change video quality based on changes in network speed?

How do streaming apps like YouTube, Hotstar, or any other video player app programmatically detect that the network is getting slow at run time, and change the video quality accordingly?
Many streaming services nowadays use HTTP-based streaming protocols, but there are exceptions, especially for low-latency streaming, e.g. WebRTC- or WebSocket-based solutions.
Assuming you're using an HTTP-based protocol like HLS or MPEG-DASH, the "stream" is a long chain of video segments that are downloaded one after another. A video segment is a file in "TS" or "MP4" format (in some MP4 cases, video and audio are split into separate files); typically a segment contains 2, 6, or 10 seconds of audio and/or video.
Based on the playlist or manifest (or sometimes simply by decoding the segment), the player knows how many seconds a single segment contains. It also knows how long the segment took to download. You can estimate the available bandwidth by dividing the (average) size of a segment file by the (average) time it took to download.
The moment it takes more time to download a segment than to play it, you know the player will stall as soon as the buffer runs empty; stalling is generally referred to as "buffering". Adaptive bitrate (ABR) is a technique that tries to prevent buffering; see https://en.wikipedia.org/wiki/Adaptive_bitrate_streaming - when the player notices that the available bandwidth is lower than the bit rate of the video stream, it can switch to another version of the same stream with a lower bit rate (typically achieved through higher compression and/or a lower resolution, which means lower quality, but that's better than buffering).
PS #1: WebRTC- and WebSocket-based streaming solutions cannot use this measuring trick and must implement other solutions.
PS #2: Newer variants of HLS (e.g. LL-HLS and LHLS) and of MPEG-DASH use other HTTP techniques (such as chunked transfer encoding or HTTP server push) to achieve lower latency; these typically do not work well with the measuring technique described above and use different approaches, which I consider out of scope here.
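A minimal Kotlin sketch of that measuring trick; the segment URL is purely illustrative, and a real player would smooth the estimate over several segments:

    import java.net.URL

    // Estimate available bandwidth from one segment download:
    // bits received divided by the seconds the download took.
    fun measureBandwidthBps(segmentUrl: String): Double {
        val start = System.nanoTime()
        val bytes = URL(segmentUrl).readBytes().size
        val seconds = (System.nanoTime() - start) / 1_000_000_000.0
        return bytes * 8 / seconds
    }

    // If a segment takes longer to download than to play, the buffer is
    // draining and the player should switch to a lower bit rate.
    fun shouldSwitchDown(segmentDurationSec: Double, downloadSec: Double): Boolean =
        downloadSec > segmentDurationSec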
You have to use a streaming server in order to do that. Wowza is one of them (not free). The client and server exchange information about the connection, and the server delivers chunks of the video that match the network speed.

Lowest-latency way to stream live video to iOS and Android

I'm making an app that needs to send a video feed from a single source to a server where it can be accessed by desktop browsers and mobile apps.
So far, I've been using Adobe Media Server 5 with a live RTMP stream. This gives me about a 2.5-second delay in desktop browsers. RTMP has no native support on iOS, but I have the option of using AIR to export the app for iOS, which produces a minimum 5-6 second delay.
The iOS docs strongly recommend HTTP Live Streaming, which segments the stream into chunks and serves them via a dynamic playlist in an .m3u8 file. Doing this produces a 15+ second delay in desktop browsers and on mobile devices. A Google search suggests that this is to be expected from HLS.
I need a maximum delay of 2-4 seconds across all devices, if possible. I've had poor results with Wowza but am open to revisiting it. FFmpeg seems inefficient, but I'm open to that as well if someone has had good results with it. Does anybody have any suggestions? Thanks in advance.
I haven't even begun to look for the most efficient way to stream to Android, so any help in that department would be much appreciated.
EDIT: Just to be clear, my plan is to make an iOS app, whether it's written natively or in AIR. The same goes for Android, but I've yet to start on that.
In the iOS browser, HLS is the only way to serve live video. The absolute lowest latency would come from using 2-second segments with a 2-segment window in the manifest. That gives you 4 seconds of latency on the client, plus another 2 to 4 on the server. There is no way to do better without writing an app.
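To illustrate the arithmetic, here is what such a short live window looks like; the playlist below is a made-up example, embedded in a Kotlin string for convenience:

    // A live media playlist with 2-second segments and a 2-segment window.
    // A client joining at the live edge buffers the whole window before it
    // starts playing: 2 segments x 2 s = 4 s of client-side latency, plus
    // the 2-4 s the server spends producing and publishing segments.
    val liveWindow = """
        #EXTM3U
        #EXT-X-VERSION:3
        #EXT-X-TARGETDURATION:2
        #EXT-X-MEDIA-SEQUENCE:1042
        #EXTINF:2.0,
        segment1042.ts
        #EXTINF:2.0,
        segment1043.ts
    """.trimIndent()

    val clientLatencySec = 2 * 2.0 // segments in window x seconds per segment = 4.0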
A 15-second delay for HLS streams is pretty good; to get lower latency you need to use a different streaming protocol.
RTP/RTSP will give you the lowest latency and is typically used for VoIP and video conferencing, but you will find it very difficult to use across multiple mobile and Wi-Fi networks (some of which unintentionally block RTP).
If you can write an iOS app that supports RTMP, that is the easiest way to go, and it should work on Android too (though only old Android versions support Flash/RTMP natively). Decoding in software will result in poor battery life. There are other iOS apps that don't use HLS for streaming, but I think you'd need to limit them to your own service (not a generic video player).
Also, please remember that higher latency tends to mean higher video quality, less buffering, and a better user experience, so don't reduce latency unnecessarily.

Recording and watching live video streams using Red5 and RTSP on Android

I have spent enough time searching for a solution, but I am not sure which way to go; that is why I am asking here.
I want to make an application from which I can record video, and the same video should be shown live on any other device connected to the network, or on the web using JWPlayer or something like that.
I have done enough R&D, and we are not considering the Wowza server (as it is paid). On the web side we have already implemented this: a webcam records video, which can be watched live on our website and in Android and iPhone browsers.
But the problem with Red5 is that it does not support RTSP (it only supports RTMP), unlike Wowza, which handles RTSP internally and converts it into RTMP.
I have found a few links suggesting client-side RTMP handling, but so far no success.
Convert video Input Stream to RTMP
This link explains a lot, but I am not sure where I should spend my time, or whether it is possible to convert RTSP into RTMP for the Red5 server.
https://play.google.com/store/apps/details?id=air.Broadcaster
This app seems to achieve the functionality, but we are looking for a native application.
Please point me in the right direction to invest my time: recording and watching a live video stream.
We can use open-source servers other than Red5, as long as they fulfill my needs on iOS, Android, and the web; otherwise I'll have to go with Wowza, which I don't want for now.
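One possibility worth noting, not mentioned in the thread and assuming FFmpeg is available on the server: since Red5 only speaks RTMP, an RTSP feed can be remuxed (repackaged without re-encoding) into RTMP for Red5 to ingest. A Kotlin sketch; both URLs are placeholders:

    // Remux an RTSP camera feed into an RTMP stream for Red5 to ingest.
    // Requires ffmpeg on the PATH.
    fun remuxRtspToRtmp(rtspIn: String, rtmpOut: String): Process =
        ProcessBuilder(
            "ffmpeg",
            "-i", rtspIn,   // e.g. rtsp://camera.local/stream
            "-c", "copy",   // copy audio/video as-is, no re-encoding
            "-f", "flv",    // RTMP carries an FLV container
            rtmpOut         // e.g. rtmp://red5host/live/streamName
        ).inheritIO().start()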

Server for broadcasting RTSP video to Android

I am new to video streaming and am working on a project to broadcast video to Android phones over the internet; the number of users viewing the video at the same time may reach 100.
After looking around for a while, I think RTSP streaming may be convenient for the phone client (am I right?), so I have to choose a server. My current candidates are:
VLC
Darwin Streaming Server
Are they suitable, or is there a better choice?
How is the performance of these two servers with 100 users accessing the stream at the same time?
Thanks in advance
Regards
Bolton
RTSP streaming with H.264/AAC would be the most convenient way to reach Android devices. End users will not need to install an app or open the stream in one; the native media player will open the stream seamlessly.
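For example, on Android the stream can be handed straight to the system's media handling; a minimal Kotlin sketch, with a placeholder URL:

    import android.app.Activity
    import android.content.Intent
    import android.net.Uri

    // Hand the RTSP URL to the device's built-in media player;
    // no in-app player code is needed.
    fun playRtspStream(activity: Activity) {
        val uri = Uri.parse("rtsp://example.com/live/stream.sdp")
        activity.startActivity(Intent(Intent.ACTION_VIEW).setDataAndType(uri, "video/*"))
    }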
If you intend to use VLC for the encoding portion, you may want to reconsider, as I'm not sure it supports H.264/AAC compression, which is required to reach Android devices. You may want to consider commercial software like Wirecast, or the free Flash Media Encoder with the AAC plugin.
Darwin Streaming Server is stable enough to handle that load (100 concurrent viewers); however, the amount of throughput you have available and the bit rate you will be broadcasting at are more important factors when delivering video. In other words, your upload speed has to be sufficient. If this isn't intended strictly as a DIY project, I would suggest tapping into a commercial CDN's network (I would recommend NetroMedia).
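A quick back-of-the-envelope check of that throughput point (the per-viewer bit rate below is an assumed figure, not from the answer):

    // Required upstream bandwidth when serving viewers directly:
    // viewers x per-stream bit rate.
    val viewers = 100
    val streamKbps = 500                         // assumed per-viewer bit rate
    val requiredMbps = viewers * streamKbps / 1000.0
    // 100 x 500 kbps = 50 Mbps of upload before overhead - which is
    // exactly why a CDN becomes attractive at this scale.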

Android: how to record, upload, transcode, download, and play video

I'm researching the development of an Android (2.2) app/service that will let users record short (I do emphasize short, < 30 seconds) videos on their phones and then upload them over HTTP to a server that will transcode them to other formats. The same users can download videos from other Android users and play them.
Now, I get a bit lost with everyone's recommended approaches to all the issues involved, because I haven't seen anyone ask this in a cohesive context. Ideally I would like a non-commercial solution (as in, no vendor or service needed for the video hosting/transcoding), but feel free to include those as recommendations (I've marked this as a wiki), as I know many people like to use YouTube and Vimeo for the middle layer in all this.
The questions are:
1. What server technologies do you recommend for hosting and transcoding?
2. What technology do you recommend for streaming the video? (It would be nice to offer high- and low-quality encodings depending on the user's network connection.)
3. What video format and software do you recommend for converting the uploaded video on the server so that it can be viewed later by other Android owners?
4. I'm assuming it's bad to do any transcoding on the phone prior to upload (battery/processor issues), but if I'm wrong about that, what do you recommend?
Some things that may help you:
The video only needs to render on an Android device, and in the future in a WebKit HTML5 browser.
Bandwidth isn't cheap (even with numerous 30-second videos), so a good balance between video quality and file size is important (streaming if needed to ensure quality, versus download).
This is for Android 2.2 devices with a video camera, of course, and a medium- to high-density screen of at least 800x400.
Open-source solutions (a server to receive the uploads, code to do the transcoding, a server to do the streaming) are preferred, but not required.
CDNs are an option, but I don't think they really figure into the picture right now.
Check out this page to see all the video formats that Android supports for encoding and decoding:
http://developer.android.com/guide/appendix/media-formats.html
For encoding, use FFmpeg or a service like encoding.com.
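As a rough starting point (the bit rates and profile below are illustrative and should be tuned for your content), an FFmpeg invocation producing an MP4 that Android 2.2 devices can play - H.264 Baseline video with AAC audio, per the format page above - wrapped for a JVM-based server:

    // Transcode an upload to an Android-friendly MP4 (H.264 Baseline + AAC).
    // Requires ffmpeg on the PATH; file names and bit rates are illustrative.
    fun transcodeForAndroid(input: String, output: String): Process =
        ProcessBuilder(
            "ffmpeg",
            "-i", input,
            "-c:v", "libx264",
            "-profile:v", "baseline",   // older devices decode only Baseline
            "-b:v", "500k",             // low-quality rendition; add a higher one too
            "-c:a", "aac",
            "-b:a", "96k",
            "-movflags", "+faststart",  // moov atom up front for progressive playback
            output
        ).inheritIO().start()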
