My Android app needs to play live video from a remote RTMP server (Adobe Flash Media Server).
Since Android's android.media.MediaPlayer doesn't support the RTMP protocol, I found a library that provides the streaming layer, i.e. I can connect and receive portions of the video stream as byte arrays.
The question is: how can I use this incoming video stream data to display it in a view?
It seems the current standard API doesn't allow that: MediaPlayer accepts either a file or a URL. For audio there is AudioTrack, which lets you achieve a goal similar to mine, but only for audio; for video I don't see an equivalent option.
Any suggestions are appreciated; maybe there is a third-party media library for Android that provides this functionality.
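To make it concrete, what I'm hoping exists is a video analogue of AudioTrack's write loop. On API 16+ the closest thing appears to be MediaCodec, which can decode raw chunks to a Surface; a rough sketch, assuming the library hands over complete H.264 access units (the "video/avc" MIME type and the framing are my assumptions about the stream):

    import java.nio.ByteBuffer;

    import android.media.MediaCodec;
    import android.media.MediaFormat;
    import android.view.Surface;

    public class StreamDecoder {
        private MediaCodec decoder;
        private ByteBuffer[] inputBuffers;

        // surface: e.g. from a SurfaceView; width/height: the stream's resolution.
        public void start(Surface surface, int width, int height) throws Exception {
            decoder = MediaCodec.createDecoderByType("video/avc");
            decoder.configure(MediaFormat.createVideoFormat("video/avc", width, height),
                    surface, null, 0);
            decoder.start();
            inputBuffers = decoder.getInputBuffers();
        }

        // Called with each chunk received from the streaming library. Assumes
        // one complete H.264 access unit per call, with SPS/PPS inline.
        public void feed(byte[] chunk, long presentationTimeUs) {
            int inIndex = decoder.dequeueInputBuffer(10000);
            if (inIndex >= 0) {
                ByteBuffer buf = inputBuffers[inIndex];
                buf.clear();
                buf.put(chunk);
                decoder.queueInputBuffer(inIndex, 0, chunk.length, presentationTimeUs, 0);
            }
            MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
            int outIndex = decoder.dequeueOutputBuffer(info, 10000);
            if (outIndex >= 0) {
                decoder.releaseOutputBuffer(outIndex, true); // render to the Surface
            }
        }
    }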
Related
I need to get the video/audio tracks, or if possible a MediaStream object, from an Android WebView that plays an HLS stream (".m3u8").
I load the HLS stream into the WebView using loadUrl("...m3u8"). It runs without issues, but I can't figure out how to get the live video and audio tracks from the HLS stream. I have read a lot of articles and was not able to find any solution. So my question is: is it possible to get the audio and video tracks from a running HLS stream in a WebView? I need them because I have to send them via a PeerConnection (WebRTC), which accepts a MediaStream or individual audio and video tracks. Any ideas and examples would be helpful. Thank you in advance.
HLS works by defining a playlist (your .m3u8 file) that points to chunks of the video (say segment00000.ts, segment00001.ts, ...). It can contain different streams for different resolutions, but the idea (very simplified) is that your player downloads a chunk, plays it right away, and downloads the next one in the meantime.
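A minimal playlist, just to illustrate the shape (segment names and durations are made up):

    #EXTM3U
    #EXT-X-VERSION:3
    #EXT-X-TARGETDURATION:10
    #EXT-X-MEDIA-SEQUENCE:0
    #EXTINF:10.0,
    segment00000.ts
    #EXTINF:10.0,
    segment00001.ts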
On the wire, a WebRTC video track is carried as RTP packets. I don't know exactly what the Android API exposes, but it seems that either you can pass it an .m3u8 as a source (though that may be a bit odd for a generic API), or you can't, and you have to read the HLS stream yourself, extract the encoded frames, and pass them to WebRTC.
For instance, GStreamer's hlsdemux element can read from an HLS source. From there you could demux the segments into encoded frames and feed them to your WebRTC track.
That's probably lower-level than you hoped, but if the Android API doesn't do it for you, you have to do it yourself (or find a library that does).
I have a live audio stream hosted on an Icecast server. There is an API that returns information about the currently playing audio, along with details of the actual audio file being played (an MP3).
How do I play a live audio stream from the server in an Android app? Do I have to use an Icecast client? Are there alternative streaming APIs I can use?
If you can point out some libraries, that would be a great help.
The built-in multimedia capabilities of Android should work just fine. Just give MediaPlayer the stream URL (not the playlist).
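For example, with the stock MediaPlayer (a minimal sketch; the mount-point URL is a placeholder for your Icecast stream URL):

    import java.io.IOException;

    import android.media.AudioManager;
    import android.media.MediaPlayer;

    void playIcecastStream() throws IOException {
        MediaPlayer player = new MediaPlayer();
        player.setAudioStreamType(AudioManager.STREAM_MUSIC);
        // Point directly at the Icecast mount point, not at a .m3u/.pls playlist.
        player.setDataSource("http://example.com:8000/live");
        player.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
            @Override
            public void onPrepared(MediaPlayer mp) {
                mp.start();
            }
        });
        player.prepareAsync(); // don't block the UI thread on a network stream
    }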
I am currently using Wowza to stream videos, and I am trying to integrate Wowza, Android, and a ChromeCast device (CCD). According to this document, https://developers.google.com/cast/docs/media, Google Cast supports the "MP4 protocol".
So, my question is this: is MP4 a streaming protocol, file format, or both?
In the ChromeCast Android demo applications, they simply pass a URL such as http://commondatastorage.googleapis.com/gtv-videos-bucket/sample/BigBuckBunny.mp4 as metadata to the CCD.
To me, this implies that no server is required to stream the MP4 file; in other words, I won't even need Wowza as an intermediary party to stream.
Is this understanding correct?
It seems that the client player will then be responsible for interacting with the MP4 file directly (e.g. seek, pause, stop, play, etc.).
While you've already accepted an answer and gotten your app to work (which was likely your ultimate goal), I thought it might be helpful to also answer your question about what MP4 really is.
MP4 is a video container format; inside the MP4 container are a video stream (generally encoded as H.264) and an audio stream (often encoded as AAC). The client player can interact with it directly because the Chromecast's browser has HTML5 video support for interpreting the MP4 container and playing back the H.264 video and AAC audio. But it isn't "streaming" in the way that term is often used; the player is just downloading the file from your web server in chunks and playing it back. There's nothing wrong with this if it performs as you'd like (in fact, this is one of the big benefits of HTML5 video: it doesn't need a streaming server backend). But if you actually want true media streaming, to leverage things such as adaptive bitrate switching, licensing, and so forth, you would serve the MP4 file via Wowza rather than via your web server.
If you simply have an MP4 file, just pass its URL and it should work fine, just like the sample (CastVideos) projects that we have on GitHub.
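For illustration, this is roughly how those demos hand the URL to the device with the Cast v2 Android API (a sketch; apiClient is assumed to be an already connected GoogleApiClient):

    import com.google.android.gms.cast.MediaInfo;
    import com.google.android.gms.cast.MediaMetadata;
    import com.google.android.gms.cast.RemoteMediaPlayer;
    import com.google.android.gms.common.api.GoogleApiClient;

    void castMp4(GoogleApiClient apiClient) {
        MediaMetadata metadata = new MediaMetadata(MediaMetadata.MEDIA_TYPE_MOVIE);
        metadata.putString(MediaMetadata.KEY_TITLE, "Big Buck Bunny");

        // A plain HTTP(S) URL to the .mp4 is all the receiver needs; the web
        // server just has to support HTTP range requests so seeking works.
        MediaInfo mediaInfo = new MediaInfo.Builder(
                "http://commondatastorage.googleapis.com/gtv-videos-bucket/sample/BigBuckBunny.mp4")
                .setContentType("video/mp4")
                .setStreamType(MediaInfo.STREAM_TYPE_BUFFERED)
                .setMetadata(metadata)
                .build();

        new RemoteMediaPlayer().load(apiClient, mediaInfo, true /* autoplay */);
    }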
I am new to live streaming, and I have a problem creating a live stream of a recording. I can upload an audio file to the server and play it from a URL using MediaPlayer, but that's not what I want: I want my speech to be broadcast to everyone as I speak, and the broadcast to stop when I finish speaking. Is this feasible? If so, how?
Can I do this with Amazon CloudFront?
To my knowledge, you can use Spydroid.
It is primarily for video streaming, but you can use it for audio streaming as well; you will have to alter its code for that.
Secondly, you will need a media server. I would suggest the Red5 media server, as it is open source.
Red5 supports both audio and video streaming, but you will have to study it a little.
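For the capture side, here is a minimal sketch of reading microphone audio as it is spoken. The OutputStream stands in for whatever the server's ingest expects; in a real setup you would encode the PCM (e.g. to AAC or MP3) and publish it over the server's ingest protocol, such as RTMP for Red5:

    import java.io.OutputStream;

    import android.media.AudioFormat;
    import android.media.AudioRecord;
    import android.media.MediaRecorder;

    public class MicBroadcaster {
        // Hypothetical control flag, toggled by your start/stop UI.
        public volatile boolean speaking = true;

        public void captureAndSend(OutputStream serverOut) throws Exception {
            int sampleRate = 44100;
            int bufferSize = AudioRecord.getMinBufferSize(sampleRate,
                    AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
            AudioRecord recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
                    sampleRate, AudioFormat.CHANNEL_IN_MONO,
                    AudioFormat.ENCODING_PCM_16BIT, bufferSize);

            byte[] buffer = new byte[bufferSize];
            recorder.startRecording();
            // Read raw PCM while the user speaks; when they stop, the loop
            // ends and the broadcast stops.
            while (speaking) {
                int read = recorder.read(buffer, 0, buffer.length);
                if (read > 0) serverOut.write(buffer, 0, read);
            }
            recorder.stop();
            recorder.release();
        }
    }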
I would like to develop an application for viewing IP camera and DVR video feeds from my own app. Can anyone tell me the best way to achieve this with as little delay as possible? What servers are required to stream the video, which formats, etc.?
androidfan, I believe you need to set up a media server to send and receive video streams.
Red5 will be a good option in your case.
Red5 is an open-source Flash server written in Java that supports streaming video and audio over protocols such as RTMP.
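If your camera or DVR already exposes an RTSP URL, Android's built-in player can often play it directly while you evaluate a media server; a minimal sketch (the URL is a placeholder, and codec support varies by device):

    import android.net.Uri;
    import android.widget.MediaController;
    import android.widget.VideoView;

    // Inside an Activity's onCreate(), with a VideoView in the layout:
    VideoView videoView = (VideoView) findViewById(R.id.video_view);
    videoView.setMediaController(new MediaController(this));
    // Placeholder URL; many IP cameras and DVRs publish an RTSP endpoint like this.
    videoView.setVideoURI(Uri.parse("rtsp://192.168.1.10:554/stream1"));
    videoView.start();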