I have tried broadcasting video from an Android app through a Wowza server, and it works pretty well with this JavaCV demo.
But the only problem with it is that it uses the FLV file format to broadcast to the Wowza server. It uses FFmpegFrameRecorder to broadcast live video to the Wowza server (not VOD). To set the format of the video broadcast to the server, it uses the following method:
recorder.setFormat("flv");
The main problem with this approach is that other formats don't work. For example:
recorder.setFormat("mpeg"); // or something like mov, etc.
Please suggest a way to broadcast video from an Android device to a Wowza server in any format other than FLV. Please provide any links or tutorials I can start with.
Thanks in advance.
Edit: I am able to play only audio, not video, when streaming to VLC and to Android and iOS devices.
Your issue is not the container, but the protocol. For live streaming, your best options are RTMP or RTP/RTSP. RTMP is essentially FLV with VCR-style commands (play, pause, stop). There is no streaming protocol for MP4 except HTTP, and that would require you to produce a new MP4 every few seconds and reassemble the pieces server-side.
I'm not sure why FLV/RTMP is off limits to you, because it is probably your best option. But failing that, I would suggest RTSP, and maybe WebRTC.
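For reference, here is a minimal sketch of how FFmpegFrameRecorder is typically configured to publish FLV over RTMP to Wowza. The URL, frame size, and import paths are assumptions, not taken from the demo (package names differ between the old com.googlecode.javacv releases and the newer org.bytedeco ones):

import org.bytedeco.javacv.FFmpegFrameRecorder;
import org.bytedeco.javacv.FrameRecorder;
import org.bytedeco.ffmpeg.global.avcodec;

void publishToWowza() throws FrameRecorder.Exception {
    // Hypothetical Wowza live application URL.
    String url = "rtmp://wowza.example.com:1935/live/myStream";
    FFmpegFrameRecorder recorder = new FFmpegFrameRecorder(url, 640, 480, 1);
    recorder.setFormat("flv");                        // RTMP carries the FLV container
    recorder.setVideoCodec(avcodec.AV_CODEC_ID_H264); // codecs inside the container
    recorder.setAudioCodec(avcodec.AV_CODEC_ID_AAC);
    recorder.setFrameRate(30);
    recorder.setSampleRate(44100);
    recorder.start();
    // In the camera preview callback, wrap each frame in a Frame and call:
    // recorder.record(frame);
    recorder.stop();
    recorder.release();
}

Note that setFormat() names the container, not the codec; swapping "flv" for "mpeg" or "mov" fails here because those containers cannot be carried over RTMP, not because the encoder is missing.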
Related
I want to integrate video broadcasting and streaming in my Android application through a Wowza server. I have tried many different things, like this demo from JavaCV and this one from AndroidHive.
But the only problem with the first one is that it uses the FLV file format to broadcast to the Wowza server. It uses FFmpegFrameRecorder to broadcast live video to the Wowza server (not VOD). To set the format of the video broadcast to the server, it uses the following method:
recorder.setFormat("flv");
The main problem with this approach is that other formats don't work. For example:
recorder.setFormat("mpeg"); // or something like mov, etc.
The second one was more appropriate and had exactly what I needed. But I am facing a weird scenario with it. It works perfectly fine (it can broadcast video with audio) for local network links like rtsp://192.168.1.58:1935/live/myStream, but it fails to broadcast to remote links like rtsp://54.208.***.***:1935/live/myStream. The stream shows as playing in the Wowza server, but I cannot see video or hear audio from that link.
Please suggest a way to overcome this problem so that I can get video and audio at my end while broadcasting.
I have used this code to communicate between Wowza and Android for the video broadcasting part, and I completed that part with complete success. The link given is quite a good explanation of the topic and of the Wowza server configuration that needs to be done to make broadcasting from Android work.
Try the android-ffmpeg library. This will definitely help you!
I am currently using Wowza to stream videos, and I am trying to integrate Wowza, Android, and a ChromeCast Device (CCD). According to this document, https://developers.google.com/cast/docs/media, Google Cast supports the "MP4 protocol".
So, my question is this: is MP4 a streaming protocol, file format, or both?
In the ChromeCast Android demo applications, they simply pass a URL like this http://commondatastorage.googleapis.com/gtv-videos-bucket/sample/BigBuckBunny.mp4 as metadata to the CCD.
To me, this implies that no server is required to stream the MP4 file. Meaning, I won't even need Wowza as an intermediary party to stream.
Is this understanding correct?
It seems that the client player will then be responsible for interacting with the MP4 file directly (e.g. seek, pause, stop, play, etc.).
While you've already accepted an answer, and gotten your app to work (which was likely your ultimate goal), I thought it might be helpful to answer your question as well about what MP4 really is.
MP4 is a video container format; inside the MP4 container is video stream data (generally encoded in the H.264 format) and audio stream data (often encoded in the AAC format). The client player can interact with it directly because the Chromecast's browser has HTML5 video support for interpreting the MP4 container format and playing back the H.264 video and AAC audio, but it isn't "streaming" in the way that term is often used ... it's just downloading it from your web server in chunks and playing it back. There's nothing wrong with this if it's performing as you'd like (in fact, this is one of the big benefits of HTML5 video, that it doesn't need a streaming server backend), but if you actually want true media streaming (to leverage things such as adaptive bitrate switching, licensing, and so forth), you would have the MP4 file served via Wowza rather than via your web server.
If you simply have an MP4 file, just pass its URL and it should work fine, just like the sample (CastVideos) projects that we have on GitHub.
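As an illustration, here is a rough sketch of loading a plain MP4 URL with the Cast v3 sender API (the CastVideos samples follow the same idea); the castSession parameter is assumed to come from the SessionManager once a device is connected:

import com.google.android.gms.cast.MediaInfo;
import com.google.android.gms.cast.MediaLoadRequestData;
import com.google.android.gms.cast.MediaMetadata;
import com.google.android.gms.cast.framework.CastSession;
import com.google.android.gms.cast.framework.media.RemoteMediaClient;

void castMp4(CastSession castSession) {
    MediaMetadata metadata = new MediaMetadata(MediaMetadata.MEDIA_TYPE_MOVIE);
    metadata.putString(MediaMetadata.KEY_TITLE, "Big Buck Bunny");

    // Plain HTTP URL of the MP4 file; no streaming server is involved.
    MediaInfo mediaInfo = new MediaInfo.Builder(
            "http://commondatastorage.googleapis.com/gtv-videos-bucket/sample/BigBuckBunny.mp4")
        .setStreamType(MediaInfo.STREAM_TYPE_BUFFERED)
        .setContentType("video/mp4")
        .setMetadata(metadata)
        .build();

    RemoteMediaClient client = castSession.getRemoteMediaClient();
    client.load(new MediaLoadRequestData.Builder().setMediaInfo(mediaInfo).build());
}

The receiver then fetches the file over HTTP itself, which is why seek, pause, and play work without any intermediary server.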
I want to know whether it is mandatory to use a streaming server like Darwin, Wowza, or VLC to stream RTSP live video. I receive an RTSP link from my client, and it tends to change every time. I can play it successfully in the VLC player, but on the phone I can't see anything. I tried playing a sample link with a .3gp extension and it worked fine, but my links don't have an extension; they look like this: rtsp://122.166.229.151:1950/1346a0cf0ef7c2. Please help me. If it's compulsory to use an extension or a server, I will continue working in that direction.
A streaming server (as you describe) isn't strictly necessary - as long as you can pull RTSP from whatever your source is, you should be able to see it. Most IP cameras have onboard RTSP servers (although I wouldn't put too many connections on one). If you can see it in VLC, the phone should be able to consume it as well, provided the codec used to encode it is supported by the Android device (in most cases, if you're doing H.264 Baseline 3.0 with AAC, you should be good to go).
A streaming server like Wowza can make that stream available to a wider audience than pulling directly from the source device, but if you're not intending to broadcast to a wide audience, it's not required for streaming to Android devices.
Newer versions of Android (Gingerbread and later) are also able to consume Apple HTTP Live Streaming.
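For completeness, a minimal sketch of playing such an RTSP URL with Android's built-in VideoView; the layout ID is hypothetical, and this assumes the stream is H.264 Baseline with AAC as noted above:

import android.net.Uri;
import android.widget.MediaController;
import android.widget.VideoView;

// Inside an Activity's onCreate(), after setContentView(...).
// R.id.video_view is a placeholder for a VideoView in your layout.
VideoView videoView = (VideoView) findViewById(R.id.video_view);
videoView.setVideoURI(Uri.parse("rtsp://122.166.229.151:1950/1346a0cf0ef7c2"));
videoView.setMediaController(new MediaController(this));
videoView.start();

The missing file extension doesn't matter here; the rtsp:// scheme is what tells the framework how to negotiate the stream.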
I would like to develop an application for viewing IP camera and DVR video feeds from my own application. Can anyone please tell me the best possible way to achieve this so that the delay is as small as possible? What servers are required to stream the video, which formats, etc.?
androidfan, I believe you need to set up a media server to send and receive video streams.
Red5 will be a good option in your case.
Red5 is an open-source Flash server written in Java that supports streaming video and audio over the RTMP protocol.
I'm trying to play a video file on a remote server. The video format is FLV and the server is Flash Media Server 3.5.
I'm going to connect to the server over RTMP and implement playback of the video file using the Android media player.
Is this really possible? Any help is appreciated.
http://www.aftek.com/afteklab/aftek-RTMP-library.shtml
I found this one, but haven't had much luck; there are very few docs, and after tweaking it to try and support video (no examples as far as I can see), I found that the core method RtmpStreamFactory.getRtmpStream() failed.
This one has also cropped up, but I haven't looked at it yet.
http://code.google.com/p/android-rtmp-client/
It looks like I'll be looking at getting the media server to deliver RTSP instead, as this is supported by Android. You may also find that later versions of Android (3.0 and later) support RTMP.
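If you do go the RTSP route, the stock MediaPlayer can handle it directly. A rough sketch under that assumption; the URL is a placeholder and the SurfaceHolder is assumed to come from a SurfaceView in your layout:

import android.media.MediaPlayer;
import android.view.SurfaceHolder;
import java.io.IOException;

void playRtsp(SurfaceHolder holder) throws IOException {
    MediaPlayer player = new MediaPlayer();
    player.setDataSource("rtsp://media.example.com/vod/sample"); // placeholder URL
    player.setDisplay(holder); // render video onto the SurfaceView
    player.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
        @Override
        public void onPrepared(MediaPlayer mp) {
            mp.start(); // begin playback once the stream is negotiated
        }
    });
    player.prepareAsync(); // RTSP setup happens asynchronously
}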