We have to capture real-time video using the Android Camera, send it to a server, and then let other users watch it through a browser or some other client.
I have Googled and searched on SO, and there are some examples of video streaming apps, like:
1 Android-eye: https://github.com/Teaonly/android-eye
2 Spydroid-ipcamera: https://code.google.com/p/spydroid-ipcamera/
However, it seems they target a different setup: most of these apps start an HTTP server on the device for stream requests, and the client then visits that page over the local network to watch the video.
So the video stream source and the server are both on the device, like this:
But we need it to work over the Internet, like this:
So I wonder if there are any alternative ideas.
I can see you have designed the three stages correctly, in your second diagram.
So what you need is to determine how to choose among these protocols and how to interface them.
No one can give you a complete solution, but having completed an enterprise project on Android video streaming, I will try to point you in the right direction.
There are three parts in your picture; I'll elaborate from left to right:
1. Android Streamer Device
Based on my experience, Android handles sending camera streams over RTP well, thanks to native support, while converting your video to FLV gives you headaches (in many cases, e.g. if later you want to deliver the stream back to Android devices).
So I would suggest building on something like spyDroid.
2. Streaming Server
There are tools like Wowza Server which can take a source stream and put it on the output of the server for other clients. I guess VLC can do this too, via the File --> Stream menu, and then entering the RTSP video stream address from your spyDroid-based app, but I have not tried it personally.
Also, it is not hard to implement your own streaming server.
I'll give you an example:
To implement an HLS server, you just need three things (a minimal sketch of the web-service part follows after this list):
Video files segmented into 10-second MPEG-2 TS chunks (i.e. .ts files).
An .m3u8 playlist of the chunks.
A web server with a simple web service that delivers the playlist to the clients (PC, Android, iPhone, almost every device) over HTTP. The clients will then read the playlist file and request the appropriate chunks according to their timing, because nearly all players have built-in HLS support.
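For illustration only, here is a minimal sketch of that third piece, using the JDK's built-in com.sun.net.httpserver. The playlist contents and segment file names are placeholders, and it assumes the .ts chunks already sit in the server's working directory:

```java
import com.sun.net.httpserver.HttpServer;

import java.io.IOException;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

// Minimal sketch of an HLS origin: serves a playlist and the pre-segmented .ts chunks.
// Segment names and paths are placeholders; a real server would regenerate the playlist
// continuously for a live stream.
public class SimpleHlsServer {

    // Hypothetical static playlist pointing at three 10-second MPEG-TS segments.
    private static final String PLAYLIST =
            "#EXTM3U\n"
            + "#EXT-X-VERSION:3\n"
            + "#EXT-X-TARGETDURATION:10\n"
            + "#EXT-X-MEDIA-SEQUENCE:0\n"
            + "#EXTINF:10.0,\n" + "segment0.ts\n"
            + "#EXTINF:10.0,\n" + "segment1.ts\n"
            + "#EXTINF:10.0,\n" + "segment2.ts\n"
            + "#EXT-X-ENDLIST\n";

    public static void main(String[] args) throws IOException {
        HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);

        // Deliver the playlist over HTTP; players poll this file for live streams.
        server.createContext("/live.m3u8", exchange -> {
            byte[] body = PLAYLIST.getBytes(StandardCharsets.UTF_8);
            exchange.getResponseHeaders().set("Content-Type", "application/vnd.apple.mpegurl");
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(body);
            }
        });

        // Deliver the .ts segments referenced by the playlist from the working directory.
        server.createContext("/", exchange -> {
            Path segment = Paths.get("." + exchange.getRequestURI().getPath());
            if (Files.exists(segment) && segment.toString().endsWith(".ts")) {
                byte[] body = Files.readAllBytes(segment);
                exchange.getResponseHeaders().set("Content-Type", "video/mp2t");
                exchange.sendResponseHeaders(200, body.length);
                try (OutputStream os = exchange.getResponseBody()) {
                    os.write(body);
                }
            } else {
                exchange.sendResponseHeaders(404, -1);
            }
        });

        server.start();
    }
}
```

For a live stream you would drop the #EXT-X-ENDLIST tag and keep appending new segments to the playlist as they are produced.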
3. The Client-Side
Based on our comments, I suggest you might want to dig deeper into Android Video Streaming.
To complete a project this big, you need much more research. For example you should be able to distinguish RTP from RTSP and understand how they are related to each other.
Read my answer here to get a sense of state-of-the-art Video Streaming and please feel free to ask for more.
Hope you got the big picture of the journey ahead,
Good Luck and Have Fun
Quite a general question, but I will try to give you a direction for research:
First of all, you will need to answer several questions:
1) What is the nature and purpose of the video stream? Is it a security application, where detail in still frames is vital (then you will have to use something like an MJPEG codec), or will it be viewed only in motion?
2) Are the stream source, server and clients on the same network, so that RTSP might be used for more exact timing, or will a WAN be involved, in which case something more robust like HTTP should be used?
3) How many simultaneous output connections will there be? In other words, is it worth paying for something like Wowza with the transcoding add-on (and maybe nDVR too) or Flussonic, or will a simple solution like ffserver suffice?
To cut a long story short, for a quick and dirty solution for a couple of viewers, you may use something like IP Webcam -> ffserver -> VLC for Android and avoid writing your own software.
You can handle it this way:
Prepare the camera preview in the way described here. The Camera object has a setPreviewCallback method with which you register the preview callback. This callback provides a data buffer (byte array) in YUV format that you can stream to your server.
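A minimal sketch of that registration with the (now deprecated) android.hardware.Camera API; sendToServer() is a placeholder for whatever encoding and network code you use:

```java
import android.hardware.Camera;

// Sketch only, using the old (now deprecated) android.hardware.Camera API.
// Each preview callback delivers one frame as a byte[] in NV21 (a YUV format) by default;
// sendToServer() is a placeholder for your own encoder / network code.
public class PreviewStreamer {

    private Camera camera;

    public void start() {
        camera = Camera.open();
        camera.setPreviewCallback(new Camera.PreviewCallback() {
            @Override
            public void onPreviewFrame(byte[] data, Camera camera) {
                // 'data' holds one raw YUV preview frame; encode it and push it to the server here.
                sendToServer(data);
            }
        });
        // A preview surface must also be set and startPreview() called before frames are delivered.
        camera.startPreview();
    }

    private void sendToServer(byte[] yuvFrame) {
        // Placeholder: compress (e.g. with MediaCodec) and send over your chosen protocol.
    }
}
```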
For the past month I have been searching the Internet for ways to record live video from an Android application and send it to a server, but the more I research, the more confused I get.
First of all, I am looking for a streaming protocol that can also be used for iOS in the future, so I came to the conclusion that DASH (Dynamic Adaptive Streaming over HTTP) is the ideal solution.
In addition, the recent Android framework ExoPlayer supports this feature.
Furthermore, I do not wish to use a Live Streaming engine such as WOWZA.
Secondly, based on my research I also concluded that any HTTP server can be used to receive the "chunks" of data, but I must have a streaming server to be able to stream the video back to the users.
I believe this process is quite complex but I will not give up until I successfully make it work.
Lastly, my question is: what server and protocol should I use to achieve this? And how do I convert the video directly and send it to the server?
Looking at your questions re protocol and server:
A 'streaming protocol that can be used for iOS also in the future'
It probably depends what you mean by 'future'. At the moment Apple requires you to use HLS on iOS for any video delivered over a mobile (cellular) network that is over 10 minutes long. DASH is establishing itself as the industry standard, so this may change and Apple may accept it too, but if you need something in the near future you may want to plan to support both DASH and HLS.
What server should you use for streaming
Streaming video is complex and the domain is changing fast, so it really is best to use or build on a dedicated streaming server if you can. These generally have mechanisms and/or well-documented procedures for converting input videos to the different formats and bit rates you need, depending on your reach and user-experience goals.

Reach determines the different encodings you need, since different browsers and devices support different encodings, and if you want your users to have a good experience and avoid buffering you will also want multiple bit-rate versions of each format. This is what lets DASH and HLS provide adaptive bit rate streaming (ABR), which means the clients can select the best bit rate at any given time depending on network conditions. Video manipulation, especially transcoding, is a CPU-intensive task, so another advantage of dedicated streaming server software is that it should be optimised as much as possible to reduce your server load.
If you do decide to go the streaming-server route, there are open source alternatives to Wowza (which you mention above), such as:
https://gstreamer.freedesktop.org
These have plugins that support ABR etc - if you search for 'GStreamer streaming server ABR' you will find some good blogs about setting this up.
I'm trying to develop an Android app that reads a video coming from either the server or other peers that have the video. For this use case I have to split my videos into smaller pieces to optimize transfer times, and each piece can be provided by either the central server or another peer.
I would like to know if ExoPlayer is able to read a sequence of video pieces without interruption?
I am free to do whatever I want in the splitting process, e.g. split the video with the Linux command `split`.
Most adaptive bit rate streaming formats work something like your description - they split the video into multiple chunks and the video player requests them one at a time. Examples of adaptive bit rate streaming protocols are HLS, MPEG-DASH and SmoothStreaming.
It is possible to have the URL for the next 'chunk' of video route to a 'central' server, which could proxy the request to another 'peer', if this would meet your needs.
It's worth noting that many videos are delivered via CDNs, which might interfere with your desired approach (or alternatively might actually match what you want, depending on your underlying requirements), so you may want to check this also.
Update
Assuming you mean that some chunks will come from the server and some chunks from peer devices on the network, then the above server proxy method would not work.
One way you could do this would be to have all the chunks delivered to the device from whatever source is best for each chunk, and then put them together as you require on the device and stream the result from 'localhost' on your device to the player.
This sounds like a huge amount of overhead and something that would never work but I believe it is actually a technique used in some apps to convert from one streaming format to another (can't provide example - sorry...).
One example of a 'localhost' server on Android that might be useful to look at is:
http://www.laptopmag.com/articles/android-web-server
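For illustration, a very rough sketch of such a 'localhost' server, written directly on top of ServerSocket, could look like the following. It assumes the chunks have already been downloaded to local files and use a container that tolerates simple concatenation (e.g. MPEG-TS; plain MP4 pieces generally do not); the file names, the port and all HTTP niceties are placeholders:

```java
import java.io.BufferedReader;
import java.io.File;
import java.io.FileInputStream;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.net.ServerSocket;
import java.net.Socket;
import java.nio.charset.StandardCharsets;

// Sketch only: a tiny localhost HTTP server that concatenates already-downloaded
// chunk files into one response, so a local player can just open
// http://127.0.0.1:8081/ and see a single continuous stream.
// Chunked transfer, range requests, MIME detection etc. are all omitted.
public class LocalChunkServer extends Thread {

    private final File[] chunks; // placeholder: the pieces, in playback order

    public LocalChunkServer(File[] chunks) {
        this.chunks = chunks;
    }

    @Override
    public void run() {
        try (ServerSocket serverSocket = new ServerSocket(8081)) {
            while (true) {
                try (Socket client = serverSocket.accept();
                     OutputStream out = client.getOutputStream()) {
                    // Drain the request line and headers; this sketch ignores them.
                    BufferedReader reader = new BufferedReader(
                            new InputStreamReader(client.getInputStream(), StandardCharsets.UTF_8));
                    String line;
                    while ((line = reader.readLine()) != null && !line.isEmpty()) { /* ignore */ }

                    // Minimal response header; Connection: close lets us skip Content-Length.
                    out.write(("HTTP/1.1 200 OK\r\n"
                            + "Content-Type: video/mp2t\r\n"
                            + "Connection: close\r\n\r\n").getBytes(StandardCharsets.UTF_8));

                    // Stream the chunks back-to-back as one continuous body.
                    byte[] buffer = new byte[8192];
                    for (File chunk : chunks) {
                        try (FileInputStream in = new FileInputStream(chunk)) {
                            int read;
                            while ((read = in.read(buffer)) != -1) {
                                out.write(buffer, 0, read);
                            }
                        }
                    }
                }
            }
        } catch (Exception e) {
            e.printStackTrace(); // sketch: no real error handling
        }
    }
}
```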
An alternative: if you were to use HTML5 inside a web page on the device, you could use the Media Source Extensions (MSE) mechanism to load the video chunks from the different sources before passing them to the player. This does require Chrome at this point rather than the standard Android browser, as the latter does not support MSE at the time of writing.
In all these approaches you obviously need to make sure you load enough in advance to keep the video pipeline and buffer full, to avoid pauses.
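Coming back to the ExoPlayer part of the question: once the pieces are reachable as ordinary URLs (from your server, a peer, or a localhost proxy as above), ExoPlayer can play them back-to-back with a ConcatenatingMediaSource. A rough sketch, assuming an ExoPlayer 2.x release that still has ExtractorMediaSource and ExoPlayerFactory (newer releases build playlists from MediaItems instead, and the exact factory overloads differ between versions):

```java
import android.content.Context;
import android.net.Uri;

import com.google.android.exoplayer2.ExoPlayerFactory;
import com.google.android.exoplayer2.SimpleExoPlayer;
import com.google.android.exoplayer2.source.ConcatenatingMediaSource;
import com.google.android.exoplayer2.source.ExtractorMediaSource;
import com.google.android.exoplayer2.source.MediaSource;
import com.google.android.exoplayer2.trackselection.DefaultTrackSelector;
import com.google.android.exoplayer2.upstream.DefaultDataSourceFactory;
import com.google.android.exoplayer2.util.Util;

// Sketch only: play a sequence of video pieces without interruption by concatenating
// one MediaSource per piece. The chunk URLs are placeholders (server, peer or localhost).
public class ChunkSequencePlayer {

    public static SimpleExoPlayer play(Context context, String[] chunkUrls) {
        DefaultDataSourceFactory dataSourceFactory = new DefaultDataSourceFactory(
                context, Util.getUserAgent(context, "ChunkSequencePlayer"));

        // One MediaSource per chunk, concatenated so playback runs through them in order.
        MediaSource[] sources = new MediaSource[chunkUrls.length];
        for (int i = 0; i < chunkUrls.length; i++) {
            sources[i] = new ExtractorMediaSource.Factory(dataSourceFactory)
                    .createMediaSource(Uri.parse(chunkUrls[i]));
        }
        ConcatenatingMediaSource concatenated = new ConcatenatingMediaSource(sources);

        // The player still needs to be attached to a PlayerView (or Surface) to show video.
        SimpleExoPlayer player =
                ExoPlayerFactory.newSimpleInstance(context, new DefaultTrackSelector());
        player.prepare(concatenated);
        player.setPlayWhenReady(true);
        return player;
    }
}
```

Whether the transitions are seamless depends on how the pieces were produced: chunks made with a plain byte-level split (e.g. the Linux `split` command) are generally not individually playable, so split on container or keyframe boundaries instead.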
I have never worked on a video-related project before, but now we have to.
1. What we are trying to do
Build an Android application which can capture a real-time stream of video and audio.
Send the captured stream to the server.
Other clients (Android, iOS or HTML5) can view these streams.
All of the above three steps should work at the same time.
Video streamed to the server should be cached for future playback.
2. What I know at the moment
I have searched on Google and SF to see if anyone has the same requirement.
After that, I know a little about video transmission:
Protocol:
RTSP/RTP/RTCP
RTSP: controls the state of the transmission, e.g. PLAY, PAUSE, STOP, ...
RTP: does the actual transport job
RTCP: works in conjunction with RTP (synchronizes the stream)
HTTP:
1) Download small pieces of the video file and play them, using a `range request` to control the download (playback) position.
2) HLS by Apple. Even though it is called live streaming, it is based on an `.m3u8` playlist file; the live part is done by continually updating that index.
RTMP by Adobe.
Encoding:
I don't know anything about this yet.
It seems that RTSP/RTP/RTCP can be used both for uploading to the server and for playing at the client, so it suits applications which need tight real-time behaviour. However, since RTSP/RTP/RTCP is based on TCP/UDP, getting through routers (NAT) can be a problem.
HTTP, on the other hand, can only be used for playing at the client (technically you could stream the small pieces of the file over HTTP, but I think it is not a good idea), so it can be used to play an existing video stream, either from a file or from something else. And you don't have to worry about routers, which means it can be used in complex network environments.
For our application, we do not have a strict real-time requirement during playback. So our plan is to stream the video source from the Android client to the server via RTSP/RTP/RTCP, and serve these streams over HTTP.
3. Questions:
Is anything wrong in any of the above?
Is my idea possible: streaming in via RTSP/RTP/RTCP and serving out via HTTP?
If yes, it seems the server should do something to cache the video in a proper format for further serving. I am not sure whether this job can be done by a video server out of the box, or whether I have to do it myself.
What more should I know about streaming development (at least for my current project)? Any tutorials are welcome.
I have spent a lot of time searching for a solution, but I am not sure which way I should go. That is why I am asking a question here.
I want to make an application from which I can record video, and the same video will be shown live on any other device connected to the network, or on the web using JWPlayer or something like that.
Now I have done enough R&D, and we are not considering the Wowza server (as it is paid). On the web side we have already implemented this: a webcam records video, which can be watched live on our website and also in Android and iPhone browsers.
But the problem with Red5 is that it does not support RTSP (it only supports RTMP), unlike Wowza, which handles RTSP internally and converts it into RTMP.
I have found a few links that suggested client-side RTMP handling, but so far no success.
Convert video Input Stream to RTMP
This link explains a lot of things, but I am not sure where I should spend my time, or whether it is possible to convert RTSP into RTMP for the Red5 server.
https://play.google.com/store/apps/details?id=air.Broadcaster
This guy seems to have achieved this functionality, but we are looking for a native application.
Please point me in the right direction where I should invest my time to record live video and watch the stream.
We can use other servers which are open source rather than Red5, if they fulfill my needs on iOS, Android and the web; otherwise I'll have to go with Wowza, which I don't want for now.
I'm working on an Android radio app in which I'd like options to rewind, fast-forward and go back to live in the audio stream.
It seems that it's not possible with MediaPlayer (I can't find any method to do that), so how can I do it?
The developer of the iOS version of the app is using the RadioKit SDK. Is there anything similar for Android?
I found this link that goes over some of the reasons why HTTP streaming isn't well supported on Android. You can write your own HTTP streaming client and insert it as a proxy between the MediaPlayer and the media source, but that is the only way as far as I am aware. As for trick modes, there is no real fast-forward or rewind protocol built into HTTP streaming. You simply have to request the correct byte from the server (see here for a little more info). The good news is that it should be much easier to estimate the byte to request for a given time position for audio than for video (I've seen some pretty ridiculous algorithms for video).
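As a rough illustration of that last point: for constant-bitrate audio the byte estimate is simply bitrate times elapsed time, which you would then request with a Range header. A sketch, with the URL and bitrate as placeholders (VBR streams would need a proper seek table or index):

```java
import java.net.HttpURLConnection;
import java.net.URL;

// Sketch only: estimate the byte offset for a seek position in a constant-bitrate
// audio stream and ask the server to start there via an HTTP Range header.
// streamUrl and bitrateKbps are placeholders; VBR streams need a real index.
public class AudioSeek {

    public static HttpURLConnection openAt(String streamUrl, int bitrateKbps, int seekSeconds)
            throws Exception {
        // bitrate (bits per second) * seconds / 8 bits-per-byte = approximate byte position
        long byteOffset = (long) bitrateKbps * 1000L * seekSeconds / 8L;

        HttpURLConnection connection = (HttpURLConnection) new URL(streamUrl).openConnection();
        // Open-ended range: "from byteOffset to the end of the resource".
        connection.setRequestProperty("Range", "bytes=" + byteOffset + "-");
        return connection; // a 206 Partial Content response starts at the estimated position
    }
}
```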