Is MP4 a streaming protocol or a file format? - android

I am currently using Wowza to stream videos, and I am trying to integrate Wowza, Android, and a ChromeCast Device (CCD). According to this document, https://developers.google.com/cast/docs/media, Google Cast supports the "MP4 protocol".
So, my question is this: is MP4 a streaming protocol, file format, or both?
In the ChromeCast Android demo applications, they simply pass a URL like this http://commondatastorage.googleapis.com/gtv-videos-bucket/sample/BigBuckBunny.mp4 as metadata to the CCD.
To me, this implies that no server is required to stream the MP4 file, meaning I wouldn't even need Wowza as an intermediary to stream.
Is this understanding correct?
It seems that the client player will then be responsible for interacting with the MP4 file directly (e.g. seek, pause, stop, play, etc.).

While you've already accepted an answer, and gotten your app to work (which was likely your ultimate goal), I thought it might be helpful to also answer your question about what MP4 really is.
MP4 is a video container format; inside the MP4 container are video stream data (generally encoded as H.264) and audio stream data (often encoded as AAC). The client player can interact with it directly because the Chromecast's browser has HTML5 video support for interpreting the MP4 container and playing back the H.264 video and AAC audio. But it isn't "streaming" in the way that term is often used; the player is simply downloading the file from your web server in chunks (via HTTP byte-range requests) and playing it back as it arrives. There's nothing wrong with this if it's performing as you'd like; in fact, one of the big benefits of HTML5 video is that it doesn't need a streaming-server backend. But if you want true media streaming (to leverage things such as adaptive bitrate switching, licensing, and so forth), you would serve the MP4 file via Wowza rather than via your web server.
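One practical consequence: seeking works because the player asks the web server for byte ranges of the file rather than the whole thing. If you want to check whether a given server supports this, a quick probe might look like the following (a sketch; the URL is the sample file from the question):

    import java.net.HttpURLConnection;
    import java.net.URL;

    public class RangeCheck {
        public static void main(String[] args) throws Exception {
            URL url = new URL("http://commondatastorage.googleapis.com/gtv-videos-bucket/sample/BigBuckBunny.mp4");
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            // Request an arbitrary 1 MB slice of the file, as a player does when seeking.
            conn.setRequestProperty("Range", "bytes=1000000-1999999");
            // 206 (Partial Content) means the server honors range requests, so
            // progressive playback and seeking will work without a streaming server.
            System.out.println("HTTP " + conn.getResponseCode());
            conn.disconnect();
        }
    }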

If you simply have an MP4 file, just pass its URL and it should work fine, just like the sample (CastVideos) projects that we have on GitHub.
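For reference, loading such a URL from an Android sender with the Cast framework looks roughly like this (a sketch, assuming the Cast v3 framework and an already-connected CastSession named castSession):

    MediaMetadata metadata = new MediaMetadata(MediaMetadata.MEDIA_TYPE_MOVIE);
    metadata.putString(MediaMetadata.KEY_TITLE, "Big Buck Bunny");

    MediaInfo mediaInfo = new MediaInfo.Builder(
            "http://commondatastorage.googleapis.com/gtv-videos-bucket/sample/BigBuckBunny.mp4")
            .setStreamType(MediaInfo.STREAM_TYPE_BUFFERED) // a plain file, not a live stream
            .setContentType("video/mp4")
            .setMetadata(metadata)
            .build();

    // The receiver fetches the URL itself; nothing is proxied through the phone.
    castSession.getRemoteMediaClient().load(mediaInfo, true /* autoplay */, 0 /* position */);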

Related

WebView: get video/audio tracks or MediaStream and send it to the server via WebRTC

I need to get the video/audio tracks, or if possible a MediaStream object, from an Android WebView that is playing an HLS stream (".m3u8").
I load the HLS stream in the WebView using loadUrl("...m3u8"). It runs without issues, but I can't figure out how to get the live video and audio tracks from the HLS stream. I have read a lot of articles and was not able to find any solution. So my question is: is it possible to get the audio and video tracks from an HLS stream running in a WebView? I need the audio and video tracks because I have to send them via a PeerConnection (WebRTC), which accepts a MediaStream or separate audio and video tracks. Any ideas and examples would be helpful. Thank you in advance.
HLS works by defining a playlist (your .m3u8 file) which points to chunks of the video (say segment00000.ts, segment00001.ts, ...). It can contain different streams for different resolutions, but the idea (very simplified) is that your player can download a chunk, play it right away, and download the next one in the meantime.
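To make that concrete, a minimal VOD playlist looks something like this (segment names are hypothetical):

    #EXTM3U
    #EXT-X-VERSION:3
    #EXT-X-TARGETDURATION:10
    #EXT-X-MEDIA-SEQUENCE:0
    #EXTINF:10.0,
    segment00000.ts
    #EXTINF:10.0,
    segment00001.ts
    #EXT-X-ENDLIST

For a live stream the playlist omits #EXT-X-ENDLIST and keeps growing, and the player polls it for new segments.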
A WebRTC video track takes RTP packets. I don't know exactly what the Android API exposes, but it feels like either you can pass it an m3u8 as a source (though that may be a bit odd for a generic API), or you can't, and you have to read the HLS stream yourself, extract the encoded frames, and packetize them as RTP for WebRTC.
For instance, gstreamer's hlsdemux element can read from an HLS source. From there you could extract the elementary streams, packetize them as RTP, and feed them to your WebRTC track.
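A command-line starting point might look like this (a sketch; the URL and port are placeholders, and the exact elements available depend on your gstreamer plugin set):

    gst-launch-1.0 souphttpsrc location=http://example.com/stream.m3u8 \
        ! hlsdemux ! tsdemux ! h264parse \
        ! rtph264pay config-interval=1 ! udpsink host=127.0.0.1 port=5004

Once that works, the same pipeline can be built programmatically and the RTP output redirected into your WebRTC stack instead of a UDP socket.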
Probably that's a bit lower-level than you hoped, but if the Android API doesn't do it for you, you have to do it yourself (or find a library that does it).

Playing video from custom live stream (Android)

My android app needs to play a live video from a remote RTMP server (Adobe Flash Media Server).
Since Android's android.media.MediaPlayer doesn't support the RTMP protocol, I found a library that provides the streaming functionality, i.e. I can connect and receive portions of the video stream as byte arrays.
The question is: how can I use this incoming video stream data to display it in a view?
It seems that the current standard API doesn't allow me to do that. MediaPlayer accepts either a file or a URL. For audio data there is AudioTrack, which allows me to achieve a similar goal, but for audio only; for video I don't see an option.
Any suggestions are appreciated, maybe there is a third party media library which may provide the functionality for Android.
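For what it's worth, the AudioTrack route mentioned above looks roughly like this; a sketch assuming the library hands you 16-bit stereo PCM at 44.1 kHz in byte arrays (pcmChunk here is a hypothetical buffer from that library):

    import android.media.AudioFormat;
    import android.media.AudioManager;
    import android.media.AudioTrack;

    int sampleRate = 44100;
    int bufSize = AudioTrack.getMinBufferSize(sampleRate,
            AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_16BIT);

    AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate,
            AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_16BIT,
            bufSize, AudioTrack.MODE_STREAM);
    track.play();

    // pcmChunk is a byte[] of decoded samples from the streaming library;
    // write() blocks until the data is queued, so call it from a worker thread.
    track.write(pcmChunk, 0, pcmChunk.length);

There is no equivalent raw-frame sink for video in the standard MediaPlayer API, which is exactly the gap the question describes.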

Can Flowplayer handle an RTSP stream?

Flowplayer can play RTMP and HTTP live streams, but can I use the same player to play an RTSP stream? I have an RTSP stream for Android that can be played using an external player, but it opens in fullscreen mode. I thought of putting it inside a frame, but the external player opens outside of the frame on an Android device. So I want to use Flowplayer to play the RTSP stream on Android. Is this possible, and if not, what should I use?
I am fairly certain that Flowplayer, while a great solution for many things, cannot be extended to accept a straight RTSP stream. In any case, I don't believe there is a supported mobile version or plugin of Flowplayer for Android at this point. I have even seen reports that embedded Flowplayer instances viewed on Android have been sketchy at best.
I have, however, used ffserver and ffmpeg (http://ffmpeg.org/) to transcode an RTSP stream into .flv for playback with Flowplayer; if a transcoded stream can be broadcast on your system, you'd be well on your way!
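The ffmpeg side of that setup is essentially a one-liner (a sketch; the RTSP URL is a placeholder, and it assumes ffserver is running with a feed1.ffm feed configured on its default port 8090):

    ffmpeg -i rtsp://camera.example.com/live http://localhost:8090/feed1.ffm

ffserver then muxes and serves the .flv stream that Flowplayer points at.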
Mason

Is There a Way to Play Back Raw Audio/Video Streams in Android APIs?

I have a Vorbis stream that I can decode to PCM if necessary, and I have a raw H.264 stream, all three of which are supported by Android when in a container. I am wondering if there is any way to manually feed video and audio samples into the Android MediaPlayer without any container. I imagine I would have to override methods within MediaPlayer. Does anyone have experience with this, or an easier way to do it? I can't imagine it's impossible...
You may be able to play the audio PCM samples (for that, I guess you may have to prepend a WAV header), but you may not be able to play the H.264 elementary stream without a container. In the media framework (Stagefright), there are sniffer functions registered for the various container formats that determine what kind of parser is needed, and then the extractor (parser) is created. I don't think you will be able to play an H.264 elementary stream from an application using Android's built-in media framework.
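If you try the WAV route for the PCM audio, the header is just 44 fixed bytes describing the format; a sketch for 16-bit PCM (dataLen is the size of your raw PCM payload in bytes):

    import java.nio.ByteBuffer;
    import java.nio.ByteOrder;
    import java.nio.charset.StandardCharsets;

    static byte[] wavHeader(int dataLen, int sampleRate, short channels, short bitsPerSample) {
        int byteRate = sampleRate * channels * bitsPerSample / 8;
        ByteBuffer b = ByteBuffer.allocate(44).order(ByteOrder.LITTLE_ENDIAN);
        b.put("RIFF".getBytes(StandardCharsets.US_ASCII));
        b.putInt(36 + dataLen);                 // total chunk size
        b.put("WAVE".getBytes(StandardCharsets.US_ASCII));
        b.put("fmt ".getBytes(StandardCharsets.US_ASCII));
        b.putInt(16);                           // fmt subchunk size
        b.putShort((short) 1);                  // audio format 1 = PCM
        b.putShort(channels);
        b.putInt(sampleRate);
        b.putInt(byteRate);
        b.putShort((short) (channels * bitsPerSample / 8)); // block align
        b.putShort(bitsPerSample);
        b.put("data".getBytes(StandardCharsets.US_ASCII));
        b.putInt(dataLen);                      // raw PCM data size
        return b.array();
    }

Prepend those 44 bytes to the PCM stream and the built-in player will recognize it as a WAV file.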

Stream audio file from FTP server to Android app

I have an FTP server set up that holds audio files in one of its directories. I would like to stream the audio from the server and play it on my Android phone, instead of downloading it and playing it back that way. Also, is it possible to stream it to Android's MediaPlayer for playback?
The FTP protocol does not support streaming audio or video.
However, you could set up a streaming server on the same box that will do it for you. I've used VLC to stream video and it's pretty easy to set up. Should work for audio too.
http://www.videolan.org/doc/streaming-howto/en/index.html
You can stream video over FTP. It is just a basic transfer protocol and once you have the data streaming to your device you can do what you want with it. Take a look at this tutorial if you want to set up streaming to your phone:
https://www.digitaldrugs.co.uk/wordpress/?p=37
Sure it is possible; the only problem I see is that your media files should be in a continuous file format, such as MP3. See SHOUTcast streaming, for example; it works via HTTP.
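On the Android side, pointing MediaPlayer at an HTTP audio URL is all the progressive playback takes (a sketch; the URL is a placeholder, and setDataSource can throw IOException):

    import android.media.AudioManager;
    import android.media.MediaPlayer;

    MediaPlayer mp = new MediaPlayer();
    mp.setAudioStreamType(AudioManager.STREAM_MUSIC);
    mp.setDataSource("http://example.com:8000/stream.mp3"); // HTTP, not FTP
    mp.setOnPreparedListener(MediaPlayer::start);
    mp.prepareAsync(); // buffers in the background; playback starts in the listener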
yxplayer is what you want, but it might be a bit limited
You can stream MP3 over FTP, the same way you can download an MP3 from FTP and listen to it before the download has finished. There are file managers/explorers, FX for one, that will do this, but all of its streaming features are trial-only, or by now maybe a paid unlockable feature. Look for an open-source remedy.
