My goal is to stream audio/video content in real time from an Android phone to a Wowza server.
1- As I understand it, the protocols for Wowza's incoming and outgoing (broadcast) streams can be different. True?
2- If that is so, then I can upload my video data through either the HLS or the RTMP protocol.
3- I'm a little familiar with these protocols after searching here and there, but I don't have enough knowledge to decide which protocol is best for streaming recorded audio and video data to the Wowza server in real time, so that it can be broadcast and watched over HLS. Any help in selecting the best protocol would be appreciated.
It should be noted that the video will be recorded and streamed in real time, which means I will have to get the encoded buffers from MediaCodec and send them to the Wowza server.
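For reference, here is roughly the drain loop I have in mind, as a minimal sketch; sendToMuxer is a placeholder for whatever RTMP packetizing code ends up on the other side:

```java
import android.media.MediaCodec;
import android.media.MediaFormat;
import java.nio.ByteBuffer;

public class EncoderDrain {
    // Pulls encoded H.264 access units out of a configured, started MediaCodec
    // encoder. Call this repeatedly from the streaming thread.
    public static void drain(MediaCodec encoder) {
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        int index = encoder.dequeueOutputBuffer(info, 10000 /* timeout in us */);
        if (index >= 0) {
            ByteBuffer buffer = encoder.getOutputBuffers()[index]; // pre-API-21 accessor
            buffer.position(info.offset);
            buffer.limit(info.offset + info.size);
            byte[] chunk = new byte[info.size];
            buffer.get(chunk);
            sendToMuxer(chunk, info.presentationTimeUs); // placeholder, see above
            encoder.releaseOutputBuffer(index, false);
        } else if (index == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
            // The new format carries the SPS/PPS (csd-0/csd-1) that the server
            // needs before any frames -- hand those to the muxer first.
            MediaFormat format = encoder.getOutputFormat();
        }
    }

    private static void sendToMuxer(byte[] data, long ptsUs) { /* placeholder */ }
}
```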
Again, if point 1 above is true, then my only concern should be choosing the best protocol for Wowza's inbound stream.
Any advice regarding Android OS versions, SDK vs. NDK etc. will be highly appreciated.
Thanks.
1 - True.
2 - RTMP. HLS is a pull-based protocol over HTTP: fine for server-to-client, but not great for client-to-server.
3 - This is not a question.
Protocol choice is completely independent of OS choice.
Related
We have to capture real-time video using the Android camera and send it to the server; other users would then watch it through a browser or something else.
I have Googled and searched on SO, and there are some examples of video streaming apps, like:
1 Android-eye: https://github.com/Teaonly/android-eye
2 Spydroid-ipcamera: https://code.google.com/p/spydroid-ipcamera/
However, it seems that they target a different setup: most of these apps start an HTTP server for stream requests, and the client then visits the page over the local network to see the video.
In that case, the video stream source and the server are both on the device, like this:
But we need Internet support, like this:
So I wonder if there are any alternative ideas.
I can see you have designed the three stages correctly in your second diagram.
So what you need is to determine how to choose among these protocols and how to interface them.
No one can give you a complete solution, but having completed an enterprise project on Android video streaming, I will try to point you toward your goal.
There are three parts in your picture, I'll elaborate from left to right:
1. Android Streamer Device
Based on my experience, I can say Android does well sending camera streams over RTP, due to native support, while converting your video to FLV gives you a headache. (In many cases, e.g. if you later want to deliver the stream back to Android devices.)
So I would suggest building on something like spyDroid; the core trick it uses is sketched below.
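SpyDroid-style apps point MediaRecorder at a pipe instead of a file, then packetize the encoded output into RTP. A rough sketch, assuming the caller has already opened and unlocked the Camera:

```java
import android.hardware.Camera;
import android.media.MediaRecorder;
import android.os.ParcelFileDescriptor;
import java.io.IOException;
import java.io.InputStream;

public class PipeRecorder {
    // MediaRecorder writes its encoded output into a pipe instead of a file;
    // a reader thread can then extract the H.264 NAL units from the container
    // and wrap them in RTP packets (RFC 6184).
    public static InputStream startEncodedStream(Camera camera) throws IOException {
        ParcelFileDescriptor[] pipe = ParcelFileDescriptor.createPipe();

        MediaRecorder recorder = new MediaRecorder();
        recorder.setCamera(camera); // camera.unlock() must have been called
        recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
        recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
        recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
        recorder.setOutputFile(pipe[1].getFileDescriptor()); // write end of the pipe
        recorder.prepare();
        recorder.start();

        // Read end: locating the NAL units inside the 3GPP output and doing
        // the actual RTP packetizing is the hard part that spyDroid already
        // implements -- which is why building on it is worthwhile.
        return new ParcelFileDescriptor.AutoCloseInputStream(pipe[0]);
    }
}
```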
2. Streaming Server
There are tools like Wowza Server which can take a source stream and put it on the output of the server for other clients. I guess VLC can do this too, via the File --> Stream menu, and then entering the RTSP video stream address from your spyDroid-based app. But I have not tried it personally.
Also, it is not hard to implement your own streaming server.
I'll give you an example:
For implementation of an HLS server, you just need three things (a minimal sketch follows the list):
Video files, segmented into 10-second MPEG-2 Transport Stream chunks (i.e. .ts files).
An M3U8 playlist of the chunks.
A web server with a simple web service that delivers the playlist to the clients (PC, Android, iPhone, almost every device) over HTTP. The clients then look up the playlist file and request the appropriate chunks at the appropriate times, because nearly all players have built-in HLS support.
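As a rough illustration (not production code), such an origin can be little more than an HTTP file server with the right MIME types. The directory layout and file names below are made up:

```java
import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

// Serves playlist.m3u8 and the .ts chunks from a local "media" directory.
// No path sanitizing or range requests -- a sketch only.
public class TinyHlsServer {
    public static void main(String[] args) throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
        server.createContext("/", exchange -> {
            Path file = Paths.get("media", exchange.getRequestURI().getPath().substring(1));
            if (!Files.isRegularFile(file)) {
                exchange.sendResponseHeaders(404, -1); // -1: no response body
                return;
            }
            String type = file.toString().endsWith(".m3u8")
                    ? "application/vnd.apple.mpegurl"  // the playlist
                    : "video/mp2t";                    // the .ts chunks
            byte[] body = Files.readAllBytes(file);
            exchange.getResponseHeaders().set("Content-Type", type);
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(body);
            }
        });
        server.start();
    }
}
```

The playlist itself is plain text: #EXTM3U and #EXT-X-TARGETDURATION:10 at the top, then an #EXTINF:10, line before each chunk entry (chunk0.ts, chunk1.ts, ...), and #EXT-X-ENDLIST at the end for non-live content.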
3. The Client-Side
Based on our comments, I suggest you might want to dig deeper into Android Video Streaming.
To complete a project this big, you need much more research. For example, you should be able to distinguish RTP from RTSP and understand how they relate to each other.
Read my answer here to get a sense of state-of-the-art video streaming, and please feel free to ask for more.
Hope you got the big picture of the journey ahead.
Good luck and have fun!
Quite a general question, but I will try to give you a direction for research:
First of all, you will need to answer several questions:
1) What is the nature and purpose of the video stream? Is it a security application where detail in still frames is vital (then you will have to use something like the MJPEG codec), or will it be viewed only in motion?
2) Are the stream source, server, and clients on the same network, so that RTSP might be used for more exact timing, or will a WAN be involved, making something more robust like HTTP the better choice?
3) What is the number of simultaneous output connections? In other words, is it worth paying for something like Wowza with the transcoding add-on (and maybe nDVR too) or Flussonic, or will a simple solution like ffserver suffice?
To cut a long story short: for a quick-and-dirty solution for a couple of viewers, you can use something like IP Webcam -> ffserver -> VLC for Android and avoid writing your own software.
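For that last pipeline, the ffserver side can be as small as the config sketched below. Caveats: ffserver only exists in FFmpeg releases up to 3.x (it was removed in 4.0), and the feed/stream names and sizes here are made up:

```
# Minimal ffserver.conf sketch: one feed from the camera app, one FLV output.
HTTPPort 8090
HTTPBindAddress 0.0.0.0
MaxHTTPConnections 10
MaxClients 5

<Feed camera.ffm>
  File /tmp/camera.ffm
  FileMaxSize 20M
</Feed>

<Stream camera.flv>
  Feed camera.ffm
  Format flv
  VideoFrameRate 25
  VideoSize 640x480
  NoAudio
</Stream>
```

You would then push the IP Webcam stream into the feed with something like ffmpeg -i <source URL> http://localhost:8090/camera.ffm, and point VLC for Android at http://<server>:8090/camera.flv.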
You can handle it this way:
Prepare the camera preview in the way described here. The Camera object has a setPreviewCallback method in which you register the preview callback. This callback provides a data buffer (byte array) in YUV format that you can stream to your server.
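A minimal sketch of that registration; streamToServer is a placeholder for your encoding/network code, and the SurfaceHolder comes from your preview view:

```java
import android.graphics.ImageFormat;
import android.hardware.Camera;
import android.view.SurfaceHolder;

public class PreviewStreamer {
    // Registers a preview callback that receives one NV21 (YUV) frame per call.
    public static void start(SurfaceHolder holder) throws java.io.IOException {
        Camera camera = Camera.open();
        Camera.Parameters params = camera.getParameters();
        params.setPreviewFormat(ImageFormat.NV21); // the default preview format
        camera.setParameters(params);

        camera.setPreviewCallback(new Camera.PreviewCallback() {
            @Override
            public void onPreviewFrame(byte[] data, Camera cam) {
                // One raw YUV frame per callback. Compress it (e.g. with
                // MediaCodec) before sending -- raw frames are far too large
                // to push over the network directly.
                streamToServer(data); // placeholder
            }
        });

        camera.setPreviewDisplay(holder); // a live preview is required for callbacks
        camera.startPreview();
    }

    private static void streamToServer(byte[] frame) { /* placeholder */ }
}
```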
I want to know whether it is mandatory to use one of the streaming servers like Darwin, Wowza, or VLC to stream RTSP live video. I am receiving an RTSP link from my client, and it tends to change every time. I can play it successfully in the VLC player, but on the phone I can't see anything. I tried playing a sample link with a .3gp extension and it worked fine, but my links don't have an extension; they look like this: rtsp://122.166.229.151:1950/1346a0cf0ef7c2. Please help me. If it's compulsory to use an extension or a server, I will continue working in that direction.
A streaming server (as you describe) isn't strictly necessary - as long as you can pull RTSP from whatever your source is, you should be able to see it. Most IP cameras have onboard RTSP servers (although I wouldn't put too many connections on one). If you can see it in VLC, the phone should be able to consume it as well, given that the codec used to encode is one supported by the Android device (in most cases, if you're doing H.264 Baseline 3.0 with AAC, you should be good to go).
A streaming server like Wowza can make that stream available to a wider audience than pulling directly from the source device, but if you're not intending to broadcast to a wide audience, it's not required for streaming to Android devices.
Newer versions of Android (Honeycomb/3.0 and later) are also able to consume Apple HTTP Live Streaming.
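If the codec checks out, playback on the phone needs nothing beyond the stock VideoView. A sketch, assuming a layout containing a VideoView (the layout and id names are made up):

```java
import android.app.Activity;
import android.net.Uri;
import android.os.Bundle;
import android.widget.MediaController;
import android.widget.VideoView;

public class RtspPlayerActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.player);                                 // assumed layout
        VideoView videoView = (VideoView) findViewById(R.id.video_view); // assumed id

        // No file extension is needed: the rtsp:// scheme plus the stream's
        // own SDP description tell the framework what it is dealing with.
        videoView.setVideoURI(Uri.parse("rtsp://122.166.229.151:1950/1346a0cf0ef7c2"));
        videoView.setMediaController(new MediaController(this));
        videoView.start();
    }
}
```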
I am working on an Android app to play a video stream over the RTSP protocol, provided by the Darwin Streaming Server.
My problem is that the RTSP stream cannot be played using Android's VideoView/MediaPlayer via some specific WiFi hotspots, e.g. at my workplace. I searched around and found that Darwin Streaming Server uses UDP ports 6970 - 6999 for media data streaming, and the firewall may be the problem. But the same stream can be played using VLC on a PC via the same WiFi hotspot.
What's the difference between the mechanisms that VLC and Android's built-in media framework (OpenCore) use? Is it possible for me to write my own RTSP client on Android with live555's openRTSP source? Any help will be much appreciated.
Bolton
I've used Wireshark to scan my network, and I think I now know the difference:
When I use the Android emulator, I can see the client keeps sending UDP requests through ports 6970 and 6971 but gets no response. When using VLC, the RTP data is transferred over TCP via port 554.
So I think the problem is caused by the firewall.
As you stated in your answer, VLC switches to interleaved RTP over RTSP when UDP fails.
This is why VLC continues to work.
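The difference shows up in the Transport header of the RTSP SETUP request. A rough sketch of asking for the TCP-interleaved mode by hand (host, track URL, and CSeq are placeholders):

```java
import java.io.OutputStream;
import java.net.Socket;

public class InterleavedSetup {
    public static void main(String[] args) throws Exception {
        // RTSP itself runs over TCP port 554; in interleaved mode the RTP and
        // RTCP packets ride this same connection, so UDP 6970-6999 stays unused.
        try (Socket rtsp = new Socket("example.com", 554)) {
            OutputStream out = rtsp.getOutputStream();
            String setup =
                  "SETUP rtsp://example.com/stream/trackID=1 RTSP/1.0\r\n"
                + "CSeq: 3\r\n"
                // Channel 0 carries RTP, channel 1 RTCP. Compare the UDP
                // variant: "Transport: RTP/AVP;unicast;client_port=6970-6971"
                + "Transport: RTP/AVP/TCP;unicast;interleaved=0-1\r\n"
                + "\r\n";
            out.write(setup.getBytes("US-ASCII"));
            out.flush();
            // After the reply, media arrives on this same socket framed as
            // '$' <channel byte> <2-byte length> <payload> (RFC 2326, 10.12).
        }
    }
}
```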
You can use my library at https://net7mma.codeplex.com/ if you can use .NET, or you can use it as a reference for your own development.
I am new to video streaming and am working on a project to broadcast video to Android phones over the Internet; the number of users viewing the video at the same time may reach 100.
After looking around for a while, I think using RTSP streaming for the phone client may be convenient (am I right?), and so I have to choose a server. My current choices are:
VLC
Darwin Streaming Server
Are they suitable? Or is there any better choice?
How about the performance of these two servers with 100 users accessing them at the same time?
Thanks in advance
Regards
Bolton
RTSP streaming in H.264/AAC would be the most convenient way to reach Android devices. End-users will not need to install an app or open the stream in one - the native media player will seamlessly open the stream.
If you intend to use VLC for the encoding portion, you may want to reconsider, as I'm not sure it supports H.264/AAC compression, which is required to reach Android devices. You may want to consider using commercial software like Wirecast or the free Flash Media Encoder with the AAC plugin.
Darwin Streaming Server is stable enough to handle that load (100 concurrent viewers); however, the amount of throughput you have available and the bit rate you will be broadcasting at are more important factors to consider when delivering video. In other words, your upload speed has to be sufficient. If it's not intended strictly as a DIY project, I would suggest tapping into a commercial CDN's network (I would recommend NetroMedia).
Let's say that I have a Microsoft Media Server stream (i.e. mms://[some ip address here]). This stream contains both audio and video. Is it possible to stream this to an Android phone? How would I go about doing this? Preferably with video, but an audio-only stream would also be okay.
mms:// is used as a placeholder URL scheme. WMS will stream through either the RTSP or HTTP protocol. However, in order to play back the stream on the phone, you need streaming code and a codec. Android seems to support WMA/WMV codecs, but I do not see any information about the protocols.
Sorry, Microsoft halted support for MMS in 2008. I would dump it and find a better solution.