Live stream on Android

I'm developing an Android application that allows users to watch TV channels via streaming.
The user taps on a channel (for example, channel 1) and an activity shows the real-time video. My question: are there other solutions besides using a WebView to show the live video?
Are there solutions that are more professional or more functional?

You can use ExoPlayer to play streams. Take a look at the demo app. As the official documentation says:
ExoPlayer has support for Dynamic Adaptive Streaming over HTTP (DASH) and SmoothStreaming, neither of which is supported by MediaPlayer (it also supports HTTP Live Streaming (HLS), MP4, MP3, WebM, M4A, MPEG-TS and AAC).
But make sure you can get a direct link to your streams.
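To give a concrete starting point, a minimal player activity looks roughly like the sketch below. It assumes the classic com.google.android.exoplayer2 artifact on the classpath, a PlayerView in the layout, and a hypothetical direct HLS URL; adapt the resource names to your project.

    import android.os.Bundle
    import androidx.appcompat.app.AppCompatActivity
    import com.google.android.exoplayer2.MediaItem
    import com.google.android.exoplayer2.SimpleExoPlayer
    import com.google.android.exoplayer2.ui.PlayerView

    class ChannelActivity : AppCompatActivity() {
        private lateinit var player: SimpleExoPlayer

        override fun onCreate(savedInstanceState: Bundle?) {
            super.onCreate(savedInstanceState)
            setContentView(R.layout.activity_channel) // layout holds a PlayerView

            player = SimpleExoPlayer.Builder(this).build()
            findViewById<PlayerView>(R.id.player_view).player = player

            // Hypothetical stream URL; ExoPlayer infers HLS from the ".m3u8" suffix
            player.setMediaItem(MediaItem.fromUri("https://example.com/live/channel1/index.m3u8"))
            player.prepare()
            player.play()
        }

        override fun onDestroy() {
            super.onDestroy()
            player.release() // free codec and network resources
        }
    }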

Related

WebView: get video/audio tracks or MediaStream and send them to the server via WebRTC

I have to get the video/audio tracks, or if possible a MediaStream object, from an Android WebView which plays an HLS stream (.m3u8).
I load the HLS stream into the WebView using loadUrl("...m3u8"). It runs without issues, but I can't figure out how to get the live video and audio tracks out of the HLS stream. I have read a lot of articles and was not able to find any solution. So my question is: is it possible to get the audio and video tracks from a running HLS stream in a WebView? I need them because I have to send them via a PeerConnection (WebRTC), which accepts a MediaStream or audio tracks and video tracks. Any ideas and examples will be helpful. Thank you in advance.
HLS works by defining a playlist (your .m3u8 file) which points to chunks of the video (say segment00000.ts, segment00001.ts, ...). It can contain different streams for different resolutions, but the idea (very simplified) is that your player can download a chunk, play it right away, and download the next one in the meantime.
A WebRTC video track takes RTP packets. I don't know exactly what the Android API exposes, but it feels like either you can pass it an m3u8 as a source (though that may be a bit weird for a generic API), or you can't, and you have to read the HLS stream yourself, extract the frames, and pass them to WebRTC.
For instance, GStreamer's hlsdemux element seems to be able to read from an HLS source. From there you could extract frames and feed them to your WebRTC track.
That's probably a bit lower-level than you hoped, but if the Android API doesn't do it for you, you have to do it yourself (or find a library that does it).
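If you do go the do-it-yourself route, the usual hook in the org.webrtc Android library is a custom VideoCapturer: the library hands you a CapturerObserver in initialize(), and anything you push to it ends up on the video track. A rough sketch under that assumption (HlsVideoCapturer is a hypothetical name, and the actual HLS demuxing/decoding, e.g. via MediaCodec or GStreamer, is left as a stub):

    import android.content.Context
    import org.webrtc.CapturerObserver
    import org.webrtc.SurfaceTextureHelper
    import org.webrtc.VideoCapturer

    // Hypothetical capturer that forwards frames decoded from an HLS URL.
    class HlsVideoCapturer(private val hlsUrl: String) : VideoCapturer {
        private var observer: CapturerObserver? = null

        override fun initialize(
            helper: SurfaceTextureHelper,
            context: Context,
            observer: CapturerObserver
        ) {
            this.observer = observer
        }

        override fun startCapture(width: Int, height: Int, framerate: Int) {
            // TODO: demux/decode hlsUrl and, for each decoded frame, wrap it
            // in an org.webrtc.VideoFrame and call observer?.onFrameCaptured(frame)
        }

        override fun stopCapture() { /* stop the decoder thread */ }
        override fun changeCaptureFormat(width: Int, height: Int, framerate: Int) {}
        override fun dispose() {}
        override fun isScreencast() = false
    }

You would then wire this capturer to a VideoSource/VideoTrack from PeerConnectionFactory, just as you would a camera capturer (the exact wiring varies between libwebrtc versions). The decoding step is precisely the part the WebView will not hand over to you.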

How to make a video player using native Android that has resolution-switching ability like we see in YouTube?

I want to use a video player, e.g. ExoPlayer, in an Android app that supports resolution switching like we see in YouTube.
My API has video files for 480p, 720p and 1080p.
I want to offer those options in the player, so the user can switch from the player itself and it will play the respective file from the URL.
I have seen solutions like the track selector, but does that work for online files? I have links like:
www.example.com/videos/480/demo.mp4
www.example.com/videos/720/demo.mp4
www.example.com/videos/1080/demo.mp4
Please suggest if there are any other solutions, such as an API change or a different protocol.
Why don't you convert your MP4 files to the HLS or DASH (mpd) streaming format with FFmpeg? Then the video is delivered chunk by chunk, and the player selects the best resolution based on its adaptation algorithm.
Have a look at this project, which allows the user to select the resolution manually via track selection from an HLS stream encoded with FFmpeg:
https://github.com/namespace7/HLS_Player
To generate an HLS stream from a video, go through this link:
https://superuser.com/a/1302736/1108219
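On the track-selector question: yes, it works for online files, but the selector switches between tracks inside one adaptive stream, not between separate MP4 URLs. A minimal sketch, assuming a hypothetical master .m3u8 that lists your 480p/720p/1080p renditions:

    import android.content.Context
    import com.google.android.exoplayer2.MediaItem
    import com.google.android.exoplayer2.SimpleExoPlayer
    import com.google.android.exoplayer2.trackselection.DefaultTrackSelector

    // context is your Activity; the master playlist URL is hypothetical.
    fun buildPlayer(context: Context): SimpleExoPlayer {
        val trackSelector = DefaultTrackSelector(context)
        val player = SimpleExoPlayer.Builder(context)
            .setTrackSelector(trackSelector)
            .build()
        player.setMediaItem(MediaItem.fromUri("https://www.example.com/videos/master.m3u8"))
        player.prepare()

        // A "480p" menu entry caps the video size; clearing the constraint
        // returns the player to automatic (ABR) track selection.
        trackSelector.setParameters(
            trackSelector.buildUponParameters().setMaxVideoSize(854, 480)
        )
        return player
    }

With three independent MP4 links as in the question, there is nothing for the selector to switch between; you would have to re-prepare the player per URL, which is why repackaging to HLS or DASH is the usual answer.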

Android - play adaptive bitrate video from Azure Media Services

I have uploaded some video files to my Azure Media Service with the multi-bitrate MP4 encoding. I have the Media Service set up with one streaming unit and a Premium subscription, so it supports adaptive bitrate streaming.
In my Android app, I use the default VideoView widget, but it doesn't seem to actually be using adaptive bitrate streaming. How can I make sure it does?
EDIT: we are using the HLSv4 link from Azure Media Service (format=m3u8-aapl)
What kind of streaming protocol are you using exactly? The standard media library on Android is somewhat limited in this regard, so you might want to take a look at ExoPlayer; it supports a much wider range of streaming protocols (like DASH and SmoothStreaming, for example).
There's also a wrapper for ExoPlayer which allows you to use it more or less as a drop-in replacement for your VideoView.
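One Azure-specific wrinkle: the HLSv4 URL ends in (format=m3u8-aapl) rather than .m3u8, so ExoPlayer's extension-based format inference fails and you should pass an explicit MIME type hint. A sketch, with an illustrative URL shape:

    import com.google.android.exoplayer2.MediaItem
    import com.google.android.exoplayer2.SimpleExoPlayer
    import com.google.android.exoplayer2.util.MimeTypes

    // player: an already-built SimpleExoPlayer; use the streaming locator
    // URL from your own Media Services account.
    fun playAzureHls(player: SimpleExoPlayer) {
        val item = MediaItem.Builder()
            .setUri(
                "https://myaccount.streaming.mediaservices.windows.net/" +
                "asset-id/video.ism/manifest(format=m3u8-aapl)"
            )
            .setMimeType(MimeTypes.APPLICATION_M3U8) // force the HLS media source
            .build()
        player.setMediaItem(item)
        player.prepare()
        player.play()
    }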

What protocol and how to stream a video that is already streamed - Android

I'm trying to make my app play videos from some TV channels that have online broadcasts on their homepages. Apparently I need to know what streaming protocol is appropriate for those kinds of videos. Does that mean I need to know what protocol they are using for their streaming, or should I choose my own protocol? And what should I think about when it comes to choosing?
And a final question: I heard that choosing the appropriate class (MediaPlayer or VideoView) depends on what the protocol is. Is that true? The class also has to support swiping on the screen.
Thanks in advance.
Firstly it is worth checking that the stream you want to play is actually available for playback - many online TV providers will use encryption and authentication mechanisms so that their video streams can only be played back in an app or browser that a registered user has logged in to.
Assuming that it is available then you need to check to see what format they make it available in.
In high-level terms, streaming video is typically packaged as follows:
raw video
-> encoded into a compressed format (e.g. H.264)
-> packaged into a container (e.g. MP4) along with audio streams etc.
-> packaged into an adaptive bitrate streaming format (e.g. HLS, SmoothStreaming, MPEG-DASH)
Different devices and different browsers support different encoding, packaging and streaming formats.
Assuming that you want to use an HTML5 browser, either standalone or in a web view in an app, then the following links provide a good, regularly updated overview of which devices and browsers support which encoding and streaming formats for HTML5 video playback (this is a constantly changing picture, so you need to check the current status using links such as these):
https://developer.mozilla.org/en-US/docs/Web/HTML/Supported_media_formats
http://www.jwplayer.com/html5/
So your steps:
make sure the video is available unprotected, or that you have access to the encryption keys, authentication credentials, etc.
identify the streaming technology being used, for example by looking at the file type in the URL (e.g. '.mpd' for a DASH manifest); see the sketch after these steps
look at the individual video and audio streams within the streaming 'index' or 'manifest' file and check that your device can support them
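For the identification step, ExoPlayer ships a small helper that applies exactly this file-type heuristic; a minimal sketch:

    import android.net.Uri
    import com.google.android.exoplayer2.C
    import com.google.android.exoplayer2.util.Util

    // Extension-based guess, the same heuristic ExoPlayer's demo app uses;
    // it fails for URLs without a recognisable suffix.
    fun describeStream(url: String): String =
        when (Util.inferContentType(Uri.parse(url))) {
            C.TYPE_DASH -> "MPEG-DASH (.mpd manifest)"
            C.TYPE_HLS -> "HLS (.m3u8 playlist)"
            C.TYPE_SS -> "SmoothStreaming (.ism manifest)"
            else -> "progressive or unknown - inspect the manifest manually"
        }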
You can take a shortcut initially by testing your streams on the target device in some of the available browser-based test players for the different formats, for example for DASH:
http://www.dash-player.com/demo/manifest-test/
http://shaka-player-demo.appspot.com
If they play here then you should be able to get them working in your app.

Is it mandatory to use Darwin, Wowza or VLC to stream live video on Android?

I want to know: is it mandatory to use any of the streaming servers like Darwin, Wowza or VLC to stream RTSP live video? I receive an RTSP link from my client, and it tends to change every time. I can successfully play it in the VLC player, but on the phone I can't see anything. I tried playing a sample link with a .3gp extension and it worked fine. But my links don't have an extension. They look like this: rtsp://122.166.229.151:1950/1346a0cf0ef7c2. Please help me. If it's compulsory to use an extension or a server, I will continue working in that direction.
A streaming server (as you describe) isn't strictly necessary; as long as you can pull RTSP from whatever your source is, you should be able to see it. Most IP cameras have onboard RTSP servers (although I wouldn't put too many connections on one). If you can see it in VLC, the phone should be able to consume it as well, given that the codec used to encode is one supported by the Android device (in most cases, if you're doing H.264 Baseline 3.0 with AAC, you should be good to go).
A streaming server like Wowza can make that stream available to a wider audience than pulling directly from the source device, but if you're not intending to broadcast to a wide audience, it's not required for streaming to Android devices.
Newer versions of Android (Gingerbread and later) can also consume Apple HTTP Live Streaming.
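For completeness, the stock VideoView/MediaPlayer stack speaks RTSP directly, and the missing file extension doesn't matter. A minimal sketch using the question's URL (R.id.video_view is an assumed layout id, and this runs inside an Activity):

    import android.net.Uri
    import android.widget.VideoView

    val videoView = findViewById<VideoView>(R.id.video_view)
    // RTSP needs no file extension; the scheme tells MediaPlayer what to do.
    videoView.setVideoURI(Uri.parse("rtsp://122.166.229.151:1950/1346a0cf0ef7c2"))
    videoView.setOnPreparedListener { it.start() } // start once buffering is ready

If this still shows nothing, the usual suspects are an unsupported codec or profile in the stream, or the RTSP port being blocked on your network, rather than the absence of a streaming server.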
