I have never worked on a video-related project before, but now we have to.
1. What we are trying to do
Build an Android application that can capture a real-time stream of video and audio.
Send the captured stream to a server.
Other clients (Android, iOS, or HTML5) can view these streams.
All three of the above steps should work at the same time.
Video streamed to the server should be cached for future playback.
2. What I know at the moment
I have searched on Google and SO to see if anyone has had the same requirement.
From that I now know a little about video transmission:
Protocol:
RTSP/RTP/RTCP
RTSP: controls the state of the session (PLAY, PAUSE, STOP, ...)
RTP: does the actual transport job
RTCP: works in conjunction with RTP (synchronizes the streams)
HTTP:
1) Download small pieces of the video file and play them, using `Range` requests to control the download (play) position (see the sketch after this list).
2) HLS by Apple. Even though it is called live streaming, it is based on an `.m3u8` playlist file; the live part works by continually updating that index.
RTMP by Adobe.
Encoding:
I know nothing about this yet.
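To make sure I understand the range request part, here is a minimal sketch I put together (the URL and byte range are just placeholders):

```java
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class RangeRequestDemo {
    public static void main(String[] args) throws Exception {
        HttpURLConnection conn = (HttpURLConnection)
                new URL("http://example.com/video.mp4").openConnection();
        // Ask only for the first 1024 bytes; a server that supports range
        // requests answers with "206 Partial Content" instead of "200 OK".
        conn.setRequestProperty("Range", "bytes=0-1023");
        System.out.println("Response code: " + conn.getResponseCode());
        try (InputStream in = conn.getInputStream()) {
            byte[] piece = new byte[1024];
            int read = in.read(piece);
            System.out.println("Read " + read + " bytes of the file");
        }
        conn.disconnect();
    }
}
```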
It seems that RTSP/RTP/RTCP can be used both for uploading to the server and for playing at the client, so it suits applications that need low latency. However, since RTSP/RTP/RTCP run directly over TCP/UDP, getting through routers (NAT/firewalls) can be a problem.
HTTP, on the other hand, is really only for playing at the client (technically you could upload small pieces of the file over HTTP, but I don't think that is a good idea), so it is suited to serving an existing video stream, whether from a file or some other source. And you don't have to worry about routers, which means it works in complex network environments.
For our application we do not have a strict real-time requirement during playback, so the plan is to stream the video from the Android client to the server via RTSP/RTP/RTCP and serve those streams over HTTP.
3. Questions:
Is anything wrong in the above?
Is my idea feasible: streaming in via RTSP/RTP/RTCP and serving out via HTTP?
If yes, it seems the server has to do some work to cache the video in a proper format for further serving. I am not sure whether a video server can do this out of the box, or whether I have to do it myself.
What more should I know about streaming development (at least for my current project)? Any tutorials are welcome.
Related
I'm trying to stream audio across several devices using the Nearby Connections API but I'm not really sure if this is really possible/recommendable.
What I want to do is broadcast the audio files (both songs stored on the phone and from apps such as Google Music, Spotify, ...) to the other connected devices so they can start playing the songs while they receive the data chunks.
I think with the Nearby Connections API we can only send 4KB payload chunks when we call Nearby.Connections.sendReliableMessage(), so what I'm doing so far is calling that function as many times as required, sending a 4KB chunk each time, until I manage to deliver the entire file. In the onMessageReceived() listener I store all the chunks I receive in a byte array, so once all of them have been transferred I can play back the song from that byte array.
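Roughly, what I'm doing looks like this (a simplified sketch; apart from the Nearby call the question names, the names here are my own placeholders):

```java
import com.google.android.gms.common.api.GoogleApiClient;
import com.google.android.gms.nearby.Nearby;
import java.io.ByteArrayOutputStream;

public class ChunkedAudioTransfer {
    private static final int CHUNK_SIZE = 4096; // the ~4KB payload limit

    // Sender: split the file into 4KB pieces and send each one reliably.
    void sendFileInChunks(GoogleApiClient apiClient, String endpointId, byte[] fileBytes) {
        for (int offset = 0; offset < fileBytes.length; offset += CHUNK_SIZE) {
            int len = Math.min(CHUNK_SIZE, fileBytes.length - offset);
            byte[] chunk = new byte[len];
            System.arraycopy(fileBytes, offset, chunk, 0, len);
            Nearby.Connections.sendReliableMessage(apiClient, endpointId, chunk);
        }
    }

    // Receiver: append every chunk to a growing buffer; playback currently
    // only starts once the whole file has arrived.
    private final ByteArrayOutputStream received = new ByteArrayOutputStream();

    void onChunkReceived(byte[] payload) {
        received.write(payload, 0, payload.length);
    }
}
```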
With this approach I can only play the song once it has been transferred in its entirety, but I'd like to play the songs while I'm still receiving the data chunks, and in a synchronized manner across all the devices.
Does this make sense to you guys? Is it the right approach? Is there any other, more effective way of doing this? (I already know about the option of streaming audio using Wi-Fi Direct, but I'd like to use Nearby.)
The guy in this tutorial had a similar problem with chunks of audio.
He shows how to play the song while the bytes are still being downloaded and the audio file is being built.
Maybe you can utilize the "Incremental Media Download" part of the tutorial.
Quote:
This is where the magic happens as we download media content from the url stream until we have enough content buffered to start the MediaPlayer. We then let the MediaPlayer play in the background while we download the remaining audio. If the MediaPlayer reaches the end of the buffered audio, then we transfer any newly downloaded audio to the MediaPlayer and let it start playing again.
Things get a little tricky here because:
(a) The MediaPlayer seems to lock the file so we can’t simply append our content to the existing file.
...
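My own rough paraphrase of that step (not the tutorial's actual code, and `copyFile()` is just a hypothetical helper): when the player runs out of buffered audio, hand it a fresh copy of the growing download file, because MediaPlayer keeps the old one locked, and resume from the last position.

```java
import android.media.MediaPlayer;
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;

void transferBufferToMediaPlayer(MediaPlayer player, File downloadFile, File playbackDir)
        throws IOException {
    int lastPosition = player.getCurrentPosition();
    boolean wasPlaying = player.isPlaying();
    player.pause();

    // MediaPlayer locks its current file, so copy the download file to a new
    // one instead of appending to the file that is already being played.
    File nextFile = File.createTempFile("buffered", ".mp3", playbackDir);
    copyFile(downloadFile, nextFile); // hypothetical copy helper

    FileInputStream input = new FileInputStream(nextFile);
    player.reset();
    player.setDataSource(input.getFD());
    player.prepare();
    input.close(); // safe to close once setDataSource() has returned

    player.seekTo(lastPosition);
    if (wasPlaying) {
        player.start();
    }
}
```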
If this doesn't work, I would just use the Nearby connection to exchange IP addresses and go for a Wi-Fi Direct solution.
I hope this helps and I'd love to hear what your final solution looks like!
I implemented this several years ago for live audio/video packets sent in a serial stream to an Android 4.0 device. It would work the same for audio (or video) packets being streamed over the Nearby connections API.
The solution was to run an HTTP streaming server from within the Android app, then consume it using the Android MediaPlayer API with its HTTP streaming capabilities (or you can embed ExoPlayer in your app if you prefer, as it also supports HTTP streaming).
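The consumption side needs very little code; a minimal sketch (the localhost URL and port are assumptions, use whatever address your embedded server actually listens on):

```java
import android.media.MediaPlayer;
import java.io.IOException;

public class LocalStreamPlayer {
    public static MediaPlayer playLocalHttpStream() throws IOException {
        MediaPlayer player = new MediaPlayer();
        player.setDataSource("http://127.0.0.1:8090/live.mp3"); // hypothetical local feed URL
        player.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
            @Override
            public void onPrepared(MediaPlayer mp) {
                mp.start(); // start once enough of the stream is buffered
            }
        });
        player.prepareAsync(); // non-blocking prepare, preferred for network sources
        return player;
    }
}
```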
This was achieved by piping the data stream directly into a FFSERVER process running on the device. The Android NDK was used to create and manage the named pipe required as input into FFSERVER.
As this was done a few years ago I have not tested this on versions of Android 4.1+. Anyone who does this will need to adhere to the FFmpeg GPL/LGPL license when building and distributing FFSERVER.
We have to capture real-time video using the Android camera and send it to a server; other users would then view it through a browser or something else.
I have Googled and searched on SO, and there are some examples of video streaming apps, such as:
1. Android-eye: https://github.com/Teaonly/android-eye
2. Spydroid-ipcamera: https://code.google.com/p/spydroid-ipcamera/
However, their setup seems different from ours: most of these apps start an HTTP server on the device for stream requests, and the client then visits that page over the local network to see the video.
So the video stream source and the server are both the same device, like this:
But we need Internet support, like this:
So I wonder if there are any alternative ideas.
I can see you have designed the three stages correctly, in your second diagram.
So what you need is to determine how to choose among these protocols and how to interface them.
No one can give you a complete solution, but having completed an enterprise project on Android video streaming, I will try to point you in the right direction toward your goal.
There are three parts in your picture, I'll elaborate from left to right:
1. Android Streamer Device
Based on my experience, Android does well at sending camera streams over RTP, thanks to native support, while converting your video to FLV gives you a headache (in many cases, e.g. if you later want to deliver the stream to Android devices).
So I would suggest building on something like spyDroid.
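If you end up rolling your own, the core trick spyDroid (libstreaming) relies on looks roughly like this (my paraphrase, not their actual code; the socket name is arbitrary, and camera setup, a preview surface and error handling are omitted):

```java
import android.hardware.Camera;
import android.media.MediaRecorder;
import android.net.LocalServerSocket;
import android.net.LocalSocket;
import android.net.LocalSocketAddress;
import java.io.IOException;
import java.io.InputStream;

// Point MediaRecorder at a local socket instead of a file, so the app can
// read the encoded stream back and packetize it into RTP itself.
InputStream startEncoderPipe(Camera camera) throws IOException {
    LocalServerSocket server = new LocalServerSocket("camera-pipe");
    LocalSocket receiver = new LocalSocket();
    receiver.connect(new LocalSocketAddress("camera-pipe"));
    LocalSocket sender = server.accept();

    MediaRecorder recorder = new MediaRecorder();
    recorder.setCamera(camera); // the Camera must be unlock()ed first
    recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
    recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
    recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
    recorder.setOutputFile(sender.getFileDescriptor()); // the encoder writes into the pipe
    recorder.prepare();
    recorder.start();

    // The encoded bitstream comes out here, ready to be wrapped into RTP packets.
    return receiver.getInputStream();
}
```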
2. Streaming Server
There are tools like Wowza Server which can take a source stream and re-publish it for other clients. I guess VLC can do this too, via the File --> Stream menu, by entering the RTSP stream address from your spyDroid-based app, but I have not tried it personally.
It is also not hard to implement your own streaming server.
I'll give you an example:
To implement an HLS server, you just need three things:
Video files segmented into 10-second MPEG-2 TS chunks (i.e. .ts files).
An M3U8 playlist of the chunks (a minimal example of generating one is sketched after this list).
A web server with a simple web service that delivers the playlist and the chunks to the clients (PC, Android, iPhone, almost every device) over HTTP. The clients then look up the playlist file and request the appropriate chunks according to their own timing, because nearly all players have built-in HLS support.
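To give a feel for step 2, here is a tiny sketch that writes such a playlist for three pre-segmented chunks (the segment names, durations and output path are just examples):

```java
import java.io.FileWriter;
import java.io.IOException;

public class PlaylistWriter {
    public static void main(String[] args) throws IOException {
        StringBuilder m3u8 = new StringBuilder();
        m3u8.append("#EXTM3U\n");
        m3u8.append("#EXT-X-VERSION:3\n");
        m3u8.append("#EXT-X-TARGETDURATION:10\n"); // longest segment duration, in seconds
        m3u8.append("#EXT-X-MEDIA-SEQUENCE:0\n");
        for (int i = 0; i < 3; i++) {
            m3u8.append("#EXTINF:10.0,\n");        // duration of this chunk
            m3u8.append("segment").append(i).append(".ts\n");
        }
        m3u8.append("#EXT-X-ENDLIST\n");           // omit this tag while the stream is live
        try (FileWriter out = new FileWriter("playlist.m3u8")) {
            out.write(m3u8.toString());
        }
    }
}
```

For a live stream you would keep appending new `#EXTINF` entries (and bumping `#EXT-X-MEDIA-SEQUENCE` as old chunks expire) instead of writing `#EXT-X-ENDLIST`.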
3. The Client-Side
Based on our comments, I suggest you might want to dig deeper into Android Video Streaming.
To complete a project this big, you need much more research. For example you should be able to distinguish RTP from RTSP and understand how they are related to each other.
Read my answer here to get a sense of state-of-the-art Video Streaming and please feel free to ask for more.
Hope you got the big picture of the journey ahead,
Good Luck and Have Fun
Quite a general question, but I will try to give you a direction for research:
First of all, you will need to answer several questions:
1) What is the nature and purpose of the video stream? Is it a security application, where detail in still frames is vital (then you will have to use something like the MJPEG codec), or will it be viewed only in motion?
2) Are the stream source, server and clients on the same network, so that RTSP might be used for more exact timing, or will a WAN be involved, in which case something more robust like HTTP should be used?
3) How many simultaneous output connections do you need? In other words, is it worth paying for something like Wowza with the transcoding add-on (and maybe nDVR too) or Flussonic, or will a simple solution like ffserver suffice?
To cut a long story short, for a quick-and-dirty solution for a couple of viewers, you could use something like IP Webcam -> ffserver -> VLC for Android and avoid writing your own software.
You can handle it this way:
Prepare the camera preview in the way described here. The Camera object has a setPreviewCallback method in which you register the preview callback. This callback provides a data buffer (byte array) in YUV format that you can stream to your server.
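A minimal sketch of that registration (how you encode and send the frames, `sendToServer()` here, is up to you and just a placeholder):

```java
import android.hardware.Camera;

public class PreviewStreamer {
    public void startCapture() {
        Camera camera = Camera.open();
        // Note: on most devices a preview surface (setPreviewDisplay or
        // setPreviewTexture) must also be set before startPreview().
        camera.setPreviewCallback(new Camera.PreviewCallback() {
            @Override
            public void onPreviewFrame(byte[] data, Camera cam) {
                // 'data' is one frame in the default NV21 (YUV) preview format;
                // it needs to be encoded (e.g. to H.264) before streaming.
                sendToServer(data); // hypothetical
            }
        });
        camera.startPreview(); // frames are only delivered while the preview runs
    }

    private void sendToServer(byte[] frame) {
        // placeholder for your encoding/streaming pipeline
    }
}
```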
I am (at long last) at the very end of a VOD project. It works perfectly, except on Android. Basically, on Android video will not play until the entire video has downloaded. A media server was well out of scope, so we are just serving the videos up from AWS S3. Works fantastically on iOS. Both streaming and downloading the video works exactly as you would expect it to. On Android, it just doesn't seem to want to play before the download finishes. It works well when using a server on the local network (I even see the occasional buffer, so I know it's not just quickly downloading), but nothing remote.
My only guess is that it is to do with the differences in the way iOS and Android stream video. On iOS, video streams via byte-range requests. Every few seconds, it will time itself out and request another range of bytes for the file. On Android, it only sends a single request for the entire file. Not sure how that could be fixed, however.
Does anyone have any tips or pointers? Any help would be greatly appreciated.
Happens on Android 4.4 and 4.3.
Using both a remote prod server we own and AWS S3.
AIR 3.9 with Flex 4.11
Utilizing StageVideo and NetStream
Test devices are a Nexus 5 and a Nexus 4
The issue was with the videos themselves. AIR for Android uses the standard approach to streaming, where the entire file is requested and read bit by bit (as opposed to iOS, which requests specific byte ranges repeatedly).
The problem here is that the player cannot begin playback until the video's metadata has been read. A standard H.264 encode places the metadata (the moov atom) at the very end of the file, so playback does not begin until the entire video has been downloaded.
The easiest way I have found to fix this is to re-encode the videos through HandBrake with the "Web Optimized" option selected. This ensures the metadata is located at the very beginning of the file (byte 24, I believe), so the video should begin playing almost instantly.
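If you want to verify a file rather than trust the encoder, a quick way is to walk the top-level MP4 boxes and see which of `moov` and `mdat` comes first (a rough sketch; it only handles the common 32-bit and 64-bit box sizes):

```java
import java.io.DataInputStream;
import java.io.FileInputStream;
import java.io.IOException;

public class MoovCheck {
    public static void main(String[] args) throws IOException {
        try (DataInputStream in = new DataInputStream(new FileInputStream(args[0]))) {
            while (true) {
                long size = in.readInt() & 0xFFFFFFFFL; // 32-bit box size
                byte[] type = new byte[4];
                in.readFully(type);                     // 4-character box type
                String name = new String(type, "US-ASCII");
                long skip = size - 8;
                if (size == 1) {                        // 64-bit extended size follows
                    skip = in.readLong() - 16;
                }
                if (name.equals("moov")) {
                    System.out.println("moov first: should play while downloading");
                    return;
                }
                if (name.equals("mdat")) {
                    System.out.println("mdat first: full download needed before playback");
                    return;
                }
                while (skip > 0) {
                    skip -= in.skip(skip);              // jump to the next top-level box
                }
            }
        }
    }
}
```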
Explanation from Adobe
Thread that gave me the idea to use the "Web Optimized" option
I'm working on an Android radio app in which I'd like to have options to rewind, fast-forward, and jump back to live in the audio stream.
It seems that this is not possible with MediaPlayer (I can't find any method for it), so how can I do it?
The developer of the iOS version of the app is using the RadioKit SDK. Is there anything similar for Android?
I found this link that goes over some of the reasons why HTTP streaming isn't well supported on Android. You can write your own HTTP streaming client and insert it as a proxy between the MediaPlayer and the media source, but that is the only way as far as I am aware. As for trick modes, there is no real fast-forward or rewind protocol built into HTTP streaming. You simply have to request the correct byte from the server (see here for a little more info). The good news is that it should be much easier to estimate the byte to request for a given time position for audio than for video (I've seen some pretty ridiculous algorithms for video).
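For constant-bitrate audio the estimate really is just arithmetic (the bitrate and header size below are example numbers):

```java
public class SeekEstimate {
    /** Byte offset to request for a given playback position in a CBR file. */
    static long byteOffsetForTime(int seconds, int bitrateKbps, long headerBytes) {
        long bytesPerSecond = bitrateKbps * 1000L / 8; // kbit/s -> bytes/s
        return headerBytes + bytesPerSecond * seconds;
    }

    public static void main(String[] args) {
        // e.g. seeking to 2:30 in a 128 kbps MP3 with no leading metadata
        System.out.println(byteOffsetForTime(150, 128, 0)); // 2400000 bytes
    }
}
```

For VBR files you would need the stream's seek information (e.g. a Xing/VBRI header) to land anywhere near the right byte.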
I want to know whether it is mandatory to use one of the streaming servers like Darwin, Wowza or VLC to stream RTSP live video. I am receiving an RTSP link from my client, and it tends to change every time. I can play it successfully in the VLC player, but on the phone I can't see anything. I tried playing a sample link with a .3gp extension and it worked fine, but my links don't have an extension; they look like this: rtsp://122.166.229.151:1950/1346a0cf0ef7c2. Please help me. If it is compulsory to use an extension or a server, I will continue working in that direction.
A streaming server (as you describe) isn't strictly necessary - as long as you can pull RTSP from whatever your source is, you should be able to see it. Most IP cameras have onboard RTSP servers (although I wouldn't put too many connections on one). If you can see it in VLC, the phone should be able to consume it as well, given that the codec used to encode it is one supported by the Android device (in most cases, if you're doing H.264 Baseline 3.0 with AAC, you should be good to go).
A streaming server like Wowza can make that stream available to a wider audience than pulling directly from the source device, but if you're not intending to broadcast to a wide audience, it's not required for streaming to Android devices.
Newer versions of Android (Gingerbread and later) are also able to consume Apple HTTP Live Streaming.
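On the device side it really is just a URL; a minimal sketch, assuming a `VideoView` from your layout (the RTSP address is the one from the question, the HLS URL is a placeholder):

```java
import android.net.Uri;
import android.widget.VideoView;

public class StreamStarter {
    // Call this from an Activity/Fragment that owns the VideoView.
    public static void play(VideoView videoView) {
        videoView.setVideoURI(Uri.parse("rtsp://122.166.229.151:1950/1346a0cf0ef7c2"));
        videoView.start(); // the framework MediaPlayer negotiates RTSP/RTP; no file extension needed

        // On Gingerbread and later the same call also accepts an HLS playlist URL, e.g.:
        // videoView.setVideoURI(Uri.parse("http://example.com/live/playlist.m3u8"));
    }
}
```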