Can Flowplayer handle an RTSP stream? - Android

Flowplayer can play RTMP and HTTP live streams, but can I use the same player to play an RTSP stream? I have an RTSP stream for Android that can be played using an external player, but it opens in fullscreen mode. I thought of putting it inside a frame, but on an Android device the external player opens outside of the frame. So I want to use Flowplayer to play the RTSP stream on Android. Is this possible, and if not, what should I use?

I am fairly certain that Flowplayer, while a great solution for many things, cannot be extended to accept a straight RTSP stream. In any case, I don't believe there is a supported mobile version or plugin of Flowplayer for Android at this point. I have even seen reports that embedded Flowplayer instances viewed on Android have been sketchy at best.
I have, however, used ffserver and ffmpeg (http://ffmpeg.org/) to transcode an RTSP stream into .flv for playback with Flowplayer; if a transcoded stream can be broadcast on your system, you'd be well on your way!
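For reference, the relay side of that can be as simple as launching ffmpeg against the RTSP source and pointing it at an ffserver feed. A rough sketch in Java via ProcessBuilder; the URLs, feed name, and codec flags are placeholders to adjust to your ffserver.conf:

import java.io.IOException;

// Rough sketch: launch an ffmpeg process that pulls the RTSP stream and
// pushes it to a local ffserver feed, which can then serve it as .flv
// for Flowplayer. All URLs and flags here are illustrative placeholders.
public class RtspToFlvRelay {
    public static void main(String[] args) throws IOException {
        new ProcessBuilder(
                "ffmpeg",
                "-i", "rtsp://camera.example.com/live.sdp", // source stream (placeholder)
                "-vcodec", "flv",        // FLV1/Sorenson video for Flash playback
                "-acodec", "libmp3lame", // MP3 audio, supported inside FLV
                "-ar", "44100",          // FLV-friendly audio sample rate
                "http://localhost:8090/feed1.ffm") // ffserver feed URL (placeholder)
                .inheritIO() // show ffmpeg's console output
                .start();
    }
}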
Mason

Related

WebView getVideo/Audio tracks or MediaStream and send it to the server via WebRTC

I need to get the video/audio tracks, or if possible a MediaStream object, from an Android WebView that is playing an HLS stream (".m3u8").
I load the HLS stream in the WebView using loadUrl("...m3u8"). It runs without issues, but I can't figure out how to get the live video and audio tracks from the HLS stream. I have read a lot of articles and was not able to find any solution. So my question is: is it possible to get the audio and video tracks from a running HLS stream in a WebView? I need them because I have to send them via a PeerConnection (WebRTC), which accepts a MediaStream or audio tracks and video tracks. Any ideas and examples would be helpful. Thank you in advance.
HLS works by defining a playlist (your .m3u8 file) which points to chunks of the video (say segment00000.ts, segment00001.ts, ...). It can contain different streams for different resolutions, but the idea (very simplified) is that your player can download a chunk, play it right away, and download the next one in the meantime.
A WebRTC video track takes RTP packets. I don't know exactly what the Android API exposes, but it feels like either you can pass it an m3u8 as a source (though that may be a bit weird for a generic API), or you can't, and you have to read the HLS stream yourself, extract RTP frames, and pass them to WebRTC.
For instance, gstreamer's hlsdemux seems to be able to read from an HLS source. From there you could extract RTP frames and feed them to your WebRTC track.
Probably that's a bit lower-level than you hoped, but if the Android API doesn't do it for you, you have to do it yourself (or find a library that does it).
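If you do end up reading the HLS stream yourself, the first step is small: fetch the .m3u8 and pull out the segment URIs, then download and demux the segments. A minimal Java sketch of the playlist step (no live-playlist refreshing or variant-stream handling):

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.URL;
import java.util.ArrayList;
import java.util.List;

// Minimal sketch: fetch an HLS playlist and collect its segment URIs.
// A real player also honors tags like #EXT-X-TARGETDURATION and re-polls
// live playlists as new segments are appended.
public class HlsPlaylistReader {
    public static List<String> segmentUris(String playlistUrl) throws Exception {
        List<String> segments = new ArrayList<>();
        URL url = new URL(playlistUrl);
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(url.openStream()))) {
            String line;
            while ((line = in.readLine()) != null) {
                line = line.trim();
                // Lines starting with '#' are tags; everything else is a segment URI.
                if (!line.isEmpty() && !line.startsWith("#")) {
                    segments.add(new URL(url, line).toString()); // resolve relative URIs
                }
            }
        }
        return segments;
    }
}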

How to play a live AAC stream on Android with the HTML5 audio element

I am trying to embed an HTML5 audio tag in a page to allow playing a live AAC+ stream coming from an Icecast server.
According to the media formats developer's guide, Android supports playback for several AAC flavors, either inside an MPEG-4 container or in ADTS.
I have successfully played AAC-encoded audio files in an MPEG-4 container, thus:
<audio controls="controls">
<source src="http://www.example.com/audio/program1.mp4" type="audio/mp4"/>
</audio>
However, I have not been able to play any AAC live stream (which, as far as I understand, is output by Icecast using ADTS) with the audio tag. I have tried setting different types (e.g., "audio/aac", which the player says it can "probably" play) as well as different file extensions for the stream URL. Nothing works. The player, by the way, initializes as if everything is OK, then when you press the play button nothing happens (other than the play button changing to a pause icon).
The only way I have been able to play a live AAC stream is by using a URL pointing to a .sdp manifest containing a link to an RTSP version of the stream. The browser then hands off the stream to the native audio player or another audio app, which plays it after a brief buffering period. This is not an option for us, as we would like to use a simple Icecast server for our stream.
Is there just no way to play a live AAC stream on Android via HTTP? It seems iOS supports it, but not Android.
From the lack of responses to the contrary, I have to conclude that the answer to the original question is, "No, it is not possible to use the HTML5 audio tag to play a live AAC+ stream from an Icecast server".
I am posting an answer to share what I ended up doing.
My first inclination was simply to set up a second Icecast stream using MP3 instead of AAC. This will work, but you must be willing to accept the buffering delay that Android's audio player introduces with MP3 streams. Unfortunately, at 64 kbps Android makes you wait over 40 seconds before it starts playing the MP3 stream. Admittedly, 64 kbps is not very good quality for MP3, but even at 128 kbps the buffering takes over 20 seconds, long enough for listeners to conclude that the stream is down. So MP3 is not an option for us.
My eventual solution was to ask our CDN to add a Wowza application that pulls from the AAC+ Icecast stream and transmuxes it to HLS.
Now my audio tag looks like this:
<audio controls="controls">
<source src="http://www.example.com/wowza/stream.m3u8"/>
<source src="http://www.example.com/audio/aac"/>
</audio>
Note that I had to list the HLS source first, because otherwise the Android device will actually pick the Icecast stream and try to play it, which it can't (you'd think it would know enough not to do that).
So in the end Android does play a live AAC+ stream with no delay, as long as it is delivered via HLS and not directly from Icecast. I must say I was very disappointed with Android, both for its lack of support for direct Icecast AAC+ and for its poor handling of live MP3 streams, especially since the competition (iOS) handles everything you throw at it without blinking.

Is MP4 a streaming protocol or a file format?

I am currently using Wowza to stream videos, and I am trying to integrate Wowza, Android, and a ChromeCast Device (CCD). According to this document, https://developers.google.com/cast/docs/media, Google Cast supports the "MP4 protocol".
So, my question is this: is MP4 a streaming protocol, file format, or both?
In the ChromeCast Android demo applications, they simply pass a URL like this http://commondatastorage.googleapis.com/gtv-videos-bucket/sample/BigBuckBunny.mp4 as metadata to the CCD.
To me, this implies that no server is required to stream the MP4 file. Meaning, I won't even need Wowza as an intermediary to stream.
Is this understanding correct?
It seems that the client player will then be responsible for interacting with the MP4 file directly (seek, pause, stop, play, etc.).
While you've already accepted an answer, and gotten your app to work (which was likely your ultimate goal), I thought it might be helpful to also answer your question about what MP4 really is.
MP4 is a video container format; inside the MP4 container is video stream data (generally encoded in the H.264 format) and audio stream data (often encoded in the AAC format). The client player can interact with it directly because the Chromecast's browser has HTML5 video support for interpreting the MP4 container format and playing back the H.264 video and AAC audio, but it isn't "streaming" in the way that term is often used ... it's just downloading it from your web server in chunks and playing it back. There's nothing wrong with this if it's performing as you'd like (in fact, this is one of the big benefits of HTML5 video, that it doesn't need a streaming server backend), but if you actually want true media streaming (to leverage things such as adaptive bitrate switching, licensing, and so forth), you would have the MP4 file served via Wowza rather than via your web server.
If you simply have an MP4 file, just pass its URL and it should work fine, just like the sample (CastVideos) projects that we have on GitHub.
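For what it's worth, on the Android sender side that amounts to wrapping the bare URL in a MediaInfo and loading it. A sketch assuming the v2 Cast SDK's RemoteMediaPlayer and an already-connected GoogleApiClient:

import com.google.android.gms.cast.MediaInfo;
import com.google.android.gms.cast.MediaMetadata;
import com.google.android.gms.cast.RemoteMediaPlayer;
import com.google.android.gms.common.api.GoogleApiClient;

// Sketch: cast a plain MP4 URL to the Chromecast. No streaming server is
// involved; the device progressively downloads the file over HTTP.
public class CastHelper {
    static void castMp4(GoogleApiClient apiClient, RemoteMediaPlayer player) {
        MediaMetadata metadata = new MediaMetadata(MediaMetadata.MEDIA_TYPE_MOVIE);
        metadata.putString(MediaMetadata.KEY_TITLE, "Big Buck Bunny");

        MediaInfo mediaInfo = new MediaInfo.Builder(
                "http://commondatastorage.googleapis.com/gtv-videos-bucket/sample/BigBuckBunny.mp4")
                .setContentType("video/mp4")
                .setStreamType(MediaInfo.STREAM_TYPE_BUFFERED)
                .setMetadata(metadata)
                .build();

        player.load(apiClient, mediaInfo, true /* autoplay */);
    }
}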

Live-streaming on Android with HTML5 or an Application

I want to create a live stream using VLC over RTP, (preferably) RTSP, or HTTP, and I want to play this stream on an Android 2.3.4 based phone. I tried starting from scratch and advancing step by step. I created an HTML5-based offline streaming page, and it worked. However, I am having trouble with live streaming. I have noticed that live streaming with HTML5 will be painful, so I wanted to send the stream directly from vlc.exe to the media player on the phone. However, I couldn't decide what to do, because VLC for Android is still in development and I couldn't find a suitable player that lets me enter the address of the VLC server.
What should I do? Should I keep trying with HTML5, or should I try to find a suitable application for RTSP streaming on Android? Should I try Wowza or another service? (BTW, I don't want to mess with socket stuff on the server side.)
I solved this problem by streaming the content from VLC over RTSP from the PC and creating a very basic HTML page containing only a link to the RTSP stream. Then I clicked the link from that HTML page on the phone (I can also carry out this step by simply entering the server's address into a streaming player on the phone), and it prompted me to choose an external media player to play the content. At this step, any streaming media player (RealPlayer, MX Player, etc.) can be used to play the content. There we go! We have live streaming from PC to Android :)
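If you'd rather skip the HTML page entirely, a tiny Android app can trigger the same player chooser by firing a VIEW intent at the RTSP URL. A sketch, with a placeholder address for the machine running VLC:

import android.app.Activity;
import android.content.Intent;
import android.net.Uri;

// Sketch: fire a VIEW intent at the RTSP URL so Android offers the installed
// streaming players (RealPlayer, MX Player, ...) to handle it. The address
// below is a placeholder for the PC running VLC.
public class StreamLauncher {
    static void openStream(Activity activity) {
        Intent intent = new Intent(Intent.ACTION_VIEW,
                Uri.parse("rtsp://192.168.1.10:8554/stream"));
        activity.startActivity(intent);
    }
}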

Server to stream RTSP to android

Can Flash Media Server 3.5 serve a video RTSP stream to the Android media player?
Or do we need to use Wowza or Red5 to serve an RTSP stream to the Android media player's RTSP client?
Are there any other RTSP servers to consider when the video needs to be viewed on Android (without the Adobe Flash app)?
Try Darwin Streaming Server. It can stream MPEG-4 and 3GPP.
Note that the video player in Android supports RTSP streaming as per the 3GPP PSS streaming specification, i.e., the file format is 3GP/MP4 and the supported codecs are MPEG-4 Video, AVC (H.264), MPEG-4 Audio, AMR, and H.263.
One potential drawback with Darwin is that you need third-party tools to do the hinting first. There are several free hinting tools. Definitely worth a try.
Darwin Streaming Server link
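Since the stock Android player speaks RTSP natively, you can sanity-check a server from a bare-bones app before worrying about hinting. A minimal sketch; the stream URL is a placeholder:

import android.app.Activity;
import android.net.Uri;
import android.os.Bundle;
import android.widget.MediaController;
import android.widget.VideoView;

// Minimal sketch: play an RTSP stream with the stock Android player.
// The URL is a placeholder; the server must deliver 3GPP-compatible codecs
// (MPEG-4/H.263/H.264 video, AAC/AMR audio).
public class RtspTestActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        VideoView videoView = new VideoView(this);
        setContentView(videoView);
        videoView.setMediaController(new MediaController(this));
        videoView.setVideoURI(Uri.parse("rtsp://streaming.example.com/live/test.3gp"));
        videoView.start();
    }
}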
FMS 3.5 does not support RTSP streaming.
I managed to play an RTSP stream on Android 2.3 with HTML5 in the default browser, with no external app and no Flash plugin.
<audio autoplay="autoplay" controls="controls" autobuffer="autobuffer" loop="loop">
<source src="rtsp://74.115.208.37:1935/live/luxweradio2_8403.stream" type="audio/mpeg" />
</audio>
Let me explain each part:
- I'm using the default Android browser (it also worked in Dolphin).
- The loop="loop" part made the stream actually play continuously; without it I heard about 0.1 seconds of audio each time I clicked play. Is this a hack or what?
- The autoplay="autoplay" part works; it actually auto-plays after a few seconds (though there is no feedback like "loading...").
- The type="audio/mpeg" seems to be ignored by the Android browser; I've tried others like video/3GPP (yes, video) and it still worked!
- I'm not sure whether autobuffer="autobuffer" actually does anything, since it isn't part of the HTML5 audio tag definition, but I've seen so many "bad" things that work... I should test some cases with and without it, but it's pretty late now here in Argentina :P
Wowza 3 is able to serve the same stream over RTMP, HTTP, and RTSP. I have it running: the stream is published as RTMP and delivered as RTMP, HTTP, and RTSP at the same time.
See this:
http://www.wowza.com/forums/content.php?36#publish
Hope it helps.
My Media Server / library supports streaming to Android or any other compatible RtspClient.
http://net7mma.codeplex.com/
Check out the article on CodeProject: http://www.codeproject.com/Articles/507218/Managed-Media-Aggregation-using-Rtsp-and-Rtp
If you need anything else let me know!
