I am new to live streaming, and I am having trouble creating a live stream of a recording. I can upload an audio file to a server and play it from a URL using MediaPlayer, but that is not what I want: I want my speech to be broadcast to everyone as I speak, and the broadcast to stop when I finish speaking. Is this feasible or not? If it is feasible, then how?
Can I do this with Amazon CloudFront?
As far as I know, you can use Spydroid.
It is primarily a video streaming app, but you can use it for audio streaming as well; you just have to alter its code for audio-only streaming.
Secondly, you will need a media server. I would recommend the Red5 media server, as it is open source.
Red5 supports both audio and video streaming, but you will have to study it a little.
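To give a feel for the capture side, here is a minimal sketch of reading live microphone audio on Android with AudioRecord (requires the RECORD_AUDIO permission). The sendToServer call is a placeholder for whatever publishing path your chosen stack (Spydroid, or a direct publisher to Red5) provides; it is not part of any of those projects.

    // Minimal sketch: capture live PCM from the mic; sendToServer() is a placeholder
    // for the encoding/publishing side of whatever streaming stack you choose.
    import android.media.AudioFormat;
    import android.media.AudioRecord;
    import android.media.MediaRecorder;

    public class MicCapture {
        private volatile boolean running = true;

        public void captureLoop() {
            int sampleRate = 16000;
            int bufSize = AudioRecord.getMinBufferSize(sampleRate,
                    AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
            AudioRecord recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
                    sampleRate, AudioFormat.CHANNEL_IN_MONO,
                    AudioFormat.ENCODING_PCM_16BIT, bufSize);
            byte[] buffer = new byte[bufSize];
            recorder.startRecording();
            while (running) {
                int read = recorder.read(buffer, 0, buffer.length);
                if (read > 0) {
                    sendToServer(buffer, read);   // hand raw PCM to the encoder/publisher
                }
            }
            recorder.stop();
            recorder.release();
        }

        private void sendToServer(byte[] pcm, int len) {
            // placeholder: encode (e.g. to AAC) and publish to the media server
        }
    }

Stopping the loop (setting running to false) is what ends the broadcast when you finish speaking.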
Related
I need to get the video/audio tracks, or if possible a MediaStream object, from an Android WebView which plays an HLS stream (".m3u8").
I load the HLS stream in the WebView using loadUrl("...m3u8"). It runs without issues, but I can't figure out how to get the live video and audio tracks out of the HLS stream. I have read a lot of articles and was not able to find any solution. So my question is: is it possible to get the audio and video tracks from an HLS stream running in a WebView? I need the audio and video tracks because I have to send them via a PeerConnection (WebRTC), which accepts a MediaStream or separate audio and video tracks. Any ideas and examples will be helpful. Thank you in advance.
HLS works by defining a playlist (your .m3u8 file) which points to chunks of the video (say segment00000.ts, segment00001.ts, ...). It can contain different streams for different resolutions, but the idea (very simplified) is that your player can download a chunk, play it right away, and download the next one in the meantime.
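To make that concrete, a playlist is just a small text file along these lines (durations and segment names here are illustrative):

    #EXTM3U
    #EXT-X-VERSION:3
    #EXT-X-TARGETDURATION:10
    #EXT-X-MEDIA-SEQUENCE:0
    #EXTINF:10.0,
    segment00000.ts
    #EXTINF:10.0,
    segment00001.ts

For a live stream, the server keeps appending new segments and the player keeps re-fetching the playlist.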
WebRTC sends its media as RTP packets. I don't know exactly what the Android API exposes, but it feels like either you can pass it an m3u8 as a source (though that may be a bit weird for a generic API), or you can't, and you have to read the HLS stream yourself, extract the frames, and pass them to WebRTC.
For instance, gstreamer's hlsdemux element seems to be able to read from an HLS source. From there you could extract the frames and feed them to your WebRTC track.
Probably that's a bit lower-level than you hoped, but if the Android API doesn't do it for you, you have to do it yourself (or find a library that does it).
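If you do end up demuxing and decoding the HLS stream yourself, recent versions of the org.webrtc Android library let you push externally produced frames into a VideoSource through its CapturerObserver. A rough sketch, assuming you already have decoded I420 frame data (the HLS demuxing/decoding step itself is not shown, and class/track names are made up):

    // Sketch only: bridges externally decoded frames into a WebRTC VideoTrack.
    // Assumes PeerConnectionFactory.initialize(...) has already been called.
    import org.webrtc.JavaI420Buffer;
    import org.webrtc.PeerConnectionFactory;
    import org.webrtc.VideoFrame;
    import org.webrtc.VideoSource;
    import org.webrtc.VideoTrack;

    public class HlsToWebRtcBridge {
        private final VideoSource videoSource;
        private final VideoTrack videoTrack;

        public HlsToWebRtcBridge(PeerConnectionFactory factory) {
            videoSource = factory.createVideoSource(false /* isScreencast */);
            videoTrack = factory.createVideoTrack("hls_video", videoSource);
        }

        // Call this for every frame your demuxer/decoder produces (I420 layout).
        public void onDecodedFrame(int width, int height, long timestampNs) {
            JavaI420Buffer buffer = JavaI420Buffer.allocate(width, height);
            // ...copy your decoded Y/U/V planes into buffer.getDataY()/getDataU()/getDataV()...
            VideoFrame frame = new VideoFrame(buffer, 0 /* rotation */, timestampNs);
            videoSource.getCapturerObserver().onFrameCaptured(frame);
            frame.release();
        }

        public VideoTrack getTrack() {
            return videoTrack;   // add this track to your PeerConnection / MediaStream
        }
    }

The audio side works analogously, but audio is usually easier to route through WebRTC's own capture path.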
I have used the javacv library to stream live video from a mobile device to a Wowza media server, and it works fine. I then tried to record the live video on the media server so it can be broadcast to multiple devices, but I can only hear audio in the recorded video file.
Please let me know where I have gone wrong, and what the steps are to re-stream the video to multiple devices. I'm new to the media server platform.
Thanks.
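For reference, the publishing side of a javacv-to-Wowza setup usually looks roughly like the sketch below. The URL, codecs and sizes are illustrative, not taken from the asker's code; if the recorder is never given a video codec and frame rate, the stream (and hence any recording of it) can end up audio-only.

    // Rough sketch of publishing frames to Wowza with javacv's FFmpegFrameRecorder.
    // URL, codecs and sizes are illustrative; adjust to your application/stream name.
    import org.bytedeco.javacv.FFmpegFrameRecorder;
    import org.bytedeco.javacv.Frame;
    import org.bytedeco.ffmpeg.global.avcodec; // org.bytedeco.javacpp.avcodec on older javacv versions

    public class WowzaPublisher {
        private FFmpegFrameRecorder recorder;

        public void start() throws Exception {
            recorder = new FFmpegFrameRecorder(
                    "rtmp://your-wowza-host/live/myStream", 640, 480, 1);
            recorder.setFormat("flv");
            recorder.setVideoCodec(avcodec.AV_CODEC_ID_H264); // without this the result may be audio-only
            recorder.setAudioCodec(avcodec.AV_CODEC_ID_AAC);
            recorder.setFrameRate(25);
            recorder.start();
        }

        public void push(Frame cameraFrame) throws Exception {
            recorder.record(cameraFrame);   // send each captured frame to the server
        }

        public void stop() throws Exception {
            recorder.stop();
            recorder.release();
        }
    }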
I am looking to develop an app that can make an audio call using SIP, where user A (the caller) plays a media file from his Android phone and user B (the receiver) can also hear the audio and video in real time. How can I accomplish this? Can we transfer media sound through SIP or VoIP without using the mic, i.e. the sound of media playing in the device's media player? Any guidance or links to study will be helpful. Thanks!
I am looking to develop an app that can make an audio call using SIP
Yes, Android's SIP package has that facility built-in for you.
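For example, a stripped-down sketch with android.net.sip looks like this. The account details and peer URI are placeholders, and you also need the SIP/VoIP feature and permissions declared in the manifest:

    // Minimal android.net.sip sketch; account details are placeholders.
    import android.content.Context;
    import android.net.sip.SipAudioCall;
    import android.net.sip.SipManager;
    import android.net.sip.SipProfile;

    public class SipCaller {
        public void call(Context context) throws Exception {
            SipManager sipManager = SipManager.newInstance(context);

            SipProfile.Builder builder = new SipProfile.Builder("alice", "sip.example.com");
            builder.setPassword("secret");
            SipProfile me = builder.build();
            sipManager.open(me);

            SipAudioCall.Listener listener = new SipAudioCall.Listener() {
                @Override
                public void onCallEstablished(SipAudioCall call) {
                    call.startAudio();          // audio starts flowing from the mic
                    call.setSpeakerMode(true);
                }
            };

            // 30-second timeout while waiting for the callee to pick up
            sipManager.makeAudioCall(me.getUriString(),
                    "sip:bob@sip.example.com", listener, 30);
        }
    }

Note that this path takes its audio from the microphone; injecting a media file instead of mic audio is not something the SIP package does for you.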
user A (the caller) plays a media file from his Android phone and user B (the receiver) can also hear the audio and video in real time, how can I accomplish this?
As for streaming the video, you need RTSP instead of SIP. Have a look at Video streaming using RTSP: Android
Further Reading:
Upload live android webcam video to RTP/RTSP Server
Creating RTP Packets from Android Camera to Send
Library:
https://github.com/fyhertz/libstreaming
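To give an idea of what using that library looks like, here is a rough sketch adapted loosely from its README; the server address, port and stream path are placeholders, and you should check the README for the exact, current API:

    // Rough sketch of publishing camera + mic over RTSP with libstreaming
    // (adapted loosely from the project's README; server details are placeholders).
    import net.majorkernelpanic.streaming.Session;
    import net.majorkernelpanic.streaming.SessionBuilder;
    import net.majorkernelpanic.streaming.audio.AudioQuality;
    import net.majorkernelpanic.streaming.gl.SurfaceView;
    import net.majorkernelpanic.streaming.rtsp.RtspClient;
    import net.majorkernelpanic.streaming.video.VideoQuality;

    public class Publisher {
        public void publish(android.content.Context context, SurfaceView preview) {
            Session session = SessionBuilder.getInstance()
                    .setContext(context)
                    .setSurfaceView(preview)
                    .setAudioEncoder(SessionBuilder.AUDIO_AAC)
                    .setAudioQuality(new AudioQuality(16000, 32000))
                    .setVideoEncoder(SessionBuilder.VIDEO_H264)
                    .setVideoQuality(new VideoQuality(320, 240, 20, 500000))
                    .build();

            RtspClient client = new RtspClient();
            client.setSession(session);
            client.setServerAddress("your.rtsp.server", 1935);
            client.setStreamPath("/live/myStream");
            client.startStream();   // begins pushing the stream to the server
        }
    }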
My Android app needs to play live video from a remote RTMP server (Adobe Flash Media Server).
As Android's android.media.MediaPlayer doesn't support the RTMP protocol, I found a library which provides the streaming functionality, i.e. I can connect and receive portions of the video stream as byte arrays.
The question is: how can I use this incoming video stream data and display it in a view?
It seems that the current standard API doesn't allow me to do that. MediaPlayer accepts either a file or a URL. For audio data there is AudioTrack, which makes it possible to achieve a similar goal, but only for audio. For video I don't see an option.
Any suggestions are appreciated; maybe there is a third-party media library that provides this functionality on Android.
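One route that stays within the standard API is to decode the incoming byte arrays yourself with MediaCodec and render straight to a Surface. A very rough sketch, assuming the RTMP library hands you complete H.264 access units (including SPS/PPS); the demuxing itself is the part that library has to provide, and the class and method names here are made up:

    // Rough sketch: decode H.264 byte arrays with MediaCodec onto a Surface.
    // Assumes complete access units arrive from the RTMP library; values are illustrative.
    import android.media.MediaCodec;
    import android.media.MediaFormat;
    import android.view.Surface;
    import java.nio.ByteBuffer;

    public class StreamRenderer {
        private MediaCodec decoder;

        public void start(Surface surface, int width, int height) throws Exception {
            MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
            decoder = MediaCodec.createDecoderByType("video/avc");
            decoder.configure(format, surface, null, 0);  // output goes straight to the Surface
            decoder.start();
        }

        // Feed each chunk received from the RTMP library into the decoder.
        public void onVideoData(byte[] data, long presentationTimeUs) {
            int inIndex = decoder.dequeueInputBuffer(10000);
            if (inIndex >= 0) {
                ByteBuffer inBuf = decoder.getInputBuffer(inIndex);
                inBuf.clear();
                inBuf.put(data);
                decoder.queueInputBuffer(inIndex, 0, data.length, presentationTimeUs, 0);
            }
            MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
            int outIndex = decoder.dequeueOutputBuffer(info, 10000);
            if (outIndex >= 0) {
                decoder.releaseOutputBuffer(outIndex, true);  // true = render to the Surface
            }
        }
    }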
I have an FTP server set up that holds audio files in one of its directories. I would like to stream the audio from the server and play it on my Android phone, instead of downloading it first and playing it back that way. Also, is it possible to stream it to MediaPlayer in Android for playback?
The FTP protocol does not support streaming audio or video.
However, you could set up a streaming server on the same box that will do it for you. I've used VLC to stream video and it's pretty easy to set up. Should work for audio too.
http://www.videolan.org/doc/streaming-howto/en/index.html
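Once the audio is exposed over HTTP by such a server, playing it in MediaPlayer is straightforward; a minimal sketch (the URL is a placeholder):

    // Minimal sketch: play an HTTP audio stream with MediaPlayer; URL is a placeholder.
    import android.media.AudioManager;
    import android.media.MediaPlayer;

    public class StreamPlayer {
        public void play() throws Exception {
            MediaPlayer player = new MediaPlayer();
            player.setAudioStreamType(AudioManager.STREAM_MUSIC);
            player.setDataSource("http://your-server:8080/stream.mp3");
            player.setOnPreparedListener(mp -> mp.start());  // start once enough is buffered
            player.prepareAsync();   // non-blocking prepare for network streams
        }
    }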
You can stream video over FTP. It is just a basic transfer protocol and once you have the data streaming to your device you can do what you want with it. Take a look at this tutorial if you want to set up streaming to your phone:
https://www.digitaldrugs.co.uk/wordpress/?p=37
Sure, it is possible; the only problem I see is that your media files should be in a continuous file format, such as MP3. See SHOUTcast streaming, for example; it works over HTTP.
yxplayer is what you want, but it might be a bit limited
You can stream MP3 over FTP, the same way you can download an MP3 from FTP and listen to it before it has finished downloading. There are file managers/explorers, FX for one, that will do this, but all of its streaming features are a trial or, by now, perhaps a paid unlockable feature. Look for an open source remedy.
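If you want to roll that yourself rather than rely on a file manager, one way (sketched here with the Apache commons-net FTPClient; host, credentials and paths are placeholders) is to copy the file to local storage in the background and point MediaPlayer at the partially written file once enough has arrived:

    // Sketch of the "listen while it downloads" approach over FTP using commons-net.
    // Host, credentials and paths are placeholders; run this off the main thread.
    import org.apache.commons.net.ftp.FTP;
    import org.apache.commons.net.ftp.FTPClient;
    import java.io.FileOutputStream;
    import java.io.InputStream;
    import java.io.OutputStream;

    public class FtpProgressiveDownload {
        public void download(String localPath) throws Exception {
            FTPClient ftp = new FTPClient();
            ftp.connect("ftp.example.com");
            ftp.login("user", "password");
            ftp.enterLocalPassiveMode();
            ftp.setFileType(FTP.BINARY_FILE_TYPE);

            try (InputStream in = ftp.retrieveFileStream("/music/track.mp3");
                 OutputStream out = new FileOutputStream(localPath)) {
                byte[] buf = new byte[8192];
                int n;
                while ((n = in.read(buf)) != -1) {
                    out.write(buf, 0, n);
                    out.flush();  // keep the local file growing so playback can start early
                }
            }
            ftp.completePendingCommand();
            ftp.logout();
            ftp.disconnect();
        }
        // Elsewhere: call MediaPlayer.setDataSource(localPath) on the growing file
        // once a reasonable amount of it is on disk.
    }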