I'd like to get advice on how to do the following, if possible:
I've been given 3 Android devices and I'd like to stream from one to another (maybe in both directions too) at a time, but also save the stream on a third platform (a PC, perhaps, to have plenty of space) for later processing. I'd like to make this PC act as a "server" where I receive a stream from device A, save it and forward it to device B. I also want this type of connection between devices A-C and B-C at the same time. This is the idea in a nutshell.
What I have now: I can stream device A's camera to device B using libstreaming, with libVLC to receive it.
Is it possible to achieve such a system, and if so, how difficult is it?
Thanks in advance for any kind of reply.
If you set up the stream as RTSP you can have multiple subscribers, so one device could subscribe and record it to a file, and the other could do whatever. They would then all get the feed at the same time, with no need for extra routing. This can all be done with libVLC, and it's not too difficult. You'll have to find examples online for the server and the client. It only gets tricky if you want to stream data from memory (for the server, using imem) or write data to memory (for the client, using smem) directly, but there are examples for this too.
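To make that concrete, here is a minimal, untested sketch of an Android subscriber using libVLC (org.videolan.libvlc). The URL, file path and sout option string are assumptions you would adapt, not a verified recipe:

    import android.content.Context;
    import android.net.Uri;

    import org.videolan.libvlc.LibVLC;
    import org.videolan.libvlc.Media;
    import org.videolan.libvlc.MediaPlayer;

    import java.util.ArrayList;

    public class RtspSubscriber {

        // Hypothetical RTSP address published by the streaming device / server.
        private static final String STREAM_URL = "rtsp://192.168.1.10:8554/live";

        public MediaPlayer subscribe(Context context, boolean recordToFile) {
            LibVLC libVLC = new LibVLC(context, new ArrayList<String>());
            MediaPlayer player = new MediaPlayer(libVLC);

            Media media = new Media(libVLC, Uri.parse(STREAM_URL));
            if (recordToFile) {
                // This subscriber dumps the feed to a file instead of rendering it.
                // (sout string is an assumption; adjust mux/path as needed.)
                media.addOption(":sout=#std{access=file,mux=ts,dst=/sdcard/recorded.ts}");
            }
            player.setMedia(media);
            media.release();

            // For on-screen playback you would also attach the player to a view
            // (e.g. via player.attachViews(...) in recent libVLC versions).
            player.play();
            return player;
        }
    }

Each subscriber (device B, the PC, etc.) would run its own instance of something like this against the same RTSP URL.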
There are plenty of good sources on how to receive audio data on mobile phones. I find (almost) nothing on streaming audio data TO a server in a standardized way (i.e., using a standard protocol such as RTP).
What is the standard way to stream audio data from Android or iOS to a server?
Some additional info:
The closest solutions I found are:
Objective-C: "Send audio data in rtp packet via socket", where the accepted answer does send audio data, but not wrapped in RTP
Using ffserver, which can listen on a port and receive streamed audio for further processing. But it has been discontinued.
I could write all of that functionality myself, i.e., wrap the audio data in RTP, RTSP, RTMP, or whatever, and write a server that receives the stream, decodes it and does the processing, but that's days of work for something that seems like a standard task and thus should already exist.
Furthermore, I firmly believe that you shouldn't reinvent the wheel. If there's a good, established API for something, from Apple, Google or a third party, writing the functionality myself is a bad idea, particularly when networking, and thus security, is involved.
I think I figured it out, at least one standard way. One likely answer to my question is simply: SIP
There are (even native) interfaces for SIP on both iOS and Android, there are (many) ready-made servers for this, and very likely there are libraries or command-line clients to run on the server side and use for further processing.
I haven't tried it, but this looks very standardised and is likely in wide use for exactly this kind of problem.
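For Android specifically, a minimal sketch using the built-in android.net.sip API might look like the following. The server address, credentials and the recorder endpoint are placeholders, and not every device ships with the SIP stack, so treat this as a starting point rather than a verified solution:

    import android.app.PendingIntent;
    import android.content.Context;
    import android.content.Intent;
    import android.net.sip.SipAudioCall;
    import android.net.sip.SipManager;
    import android.net.sip.SipProfile;

    public class SipAudioStreamer {

        // Placeholder account and server; replace with your own SIP registrar.
        private static final String USER = "alice";
        private static final String DOMAIN = "sip.example.com";
        private static final String PASSWORD = "secret";
        private static final String RECORDER_URI = "sip:recorder@sip.example.com";

        public void startStreaming(Context context) throws Exception {
            SipManager manager = SipManager.newInstance(context);

            SipProfile profile = new SipProfile.Builder(USER, DOMAIN)
                    .setPassword(PASSWORD)
                    .build();

            // Register with the SIP server so it can route our audio.
            Intent intent = new Intent("com.example.INCOMING_CALL");
            PendingIntent pendingIntent =
                    PendingIntent.getBroadcast(context, 0, intent, Intent.FILL_IN_DATA);
            manager.open(profile, pendingIntent, null);

            // "Call" the server-side endpoint that records / processes the audio.
            SipAudioCall.Listener listener = new SipAudioCall.Listener() {
                @Override
                public void onCallEstablished(SipAudioCall call) {
                    call.startAudio(); // microphone audio now flows over RTP
                }
            };
            manager.makeAudioCall(profile.getUriString(), RECORDER_URI, listener, 30);
        }
    }

On the server side, any standard SIP server/softswitch that can terminate the call and record the RTP audio would then hand you the data for processing.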
I would like to stream a video between two Android devices (android-android). There wouldn't be any server, so the streaming has to be direct between the devices. The devices would be on the same network, so they could communicate over Wi-Fi.
I've tried using MediaRecorder - MediaPlayer via sockets, but I got many exceptions.
I also looked for a library, but I just want to stream a video between two devices directly.
Any solutions?
If your video is for real-time communication, e.g. a web chat or sharing some CCTV in real time with minimal delay, then a real-time video communication approach like WebRTC would be one additional possibility - this type of approach prioritises low latency over quality to ensure minimum delay. See here for the Android WebRTC documentation:
https://webrtc.org/native-code/android/
If the requirement is just to allow one device to act as a server for non-real-time videos, then the easiest approach may be to use one of the available HTTP server libraries or apps to let one device act as a server that the other can simply connect to via a browser or player. An example Android HTTP server that seems to get good reviews is:
https://play.google.com/store/apps/details?id=jp.ubi.common.http.server&hl=en
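If you would rather embed the server in your own app, a rough sketch along these lines, using the third-party NanoHTTPD library (my assumption, not something the app above necessarily uses), could serve a recorded file to the other device; the file path and port are placeholders:

    import fi.iki.elonen.NanoHTTPD;

    import java.io.File;
    import java.io.FileInputStream;
    import java.io.IOException;

    public class VideoHttpServer extends NanoHTTPD {

        private final File videoFile;

        public VideoHttpServer(int port, File videoFile) {
            super(port);
            this.videoFile = videoFile;
        }

        @Override
        public Response serve(IHTTPSession session) {
            try {
                // Stream the file in chunks so large videos don't need to fit in memory.
                return newChunkedResponse(Response.Status.OK, "video/mp4",
                        new FileInputStream(videoFile));
            } catch (IOException e) {
                return newFixedLengthResponse(Response.Status.INTERNAL_ERROR,
                        MIME_PLAINTEXT, "Could not open video file");
            }
        }
    }

    // Usage on the "server" device; the other device then opens
    // http://<server-ip>:8080/ in a browser or player:
    //   new VideoHttpServer(8080, new File("/sdcard/Movies/demo.mp4"))
    //           .start(NanoHTTPD.SOCKET_READ_TIMEOUT, false);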
We have to capture real-time video using the Android camera and send it to a server, and then other users would watch it through a browser or something else.
I have Googled and searched on SO, and there are some examples of video streaming apps like:
1. Android-eye: https://github.com/Teaonly/android-eye
2. Spydroid-ipcamera: https://code.google.com/p/spydroid-ipcamera/
However, it seems that they target a different setup: most of these apps start an HTTP server for stream requests, and then the client visits the page over the local network and sees the video.
Then the video stream source and the server are both the same device, like this: [first diagram: the phone runs both the camera and the HTTP server on the local network]
But we need Internet support, like this: [second diagram: Android streamer device -> streaming server on the Internet -> clients in the browser]
So I wonder if there are any alternative ideas.
I can see you have designed the three stages correctly, in your second diagram.
So what you need is to determine how to choose among these protocols and how to interface them.
No one can give you a complete solution, but having completed an enterprise project on Android video streaming, I will try to point you in the right direction toward your goal.
There are three parts in your picture, I'll elaborate from left to right:
1. Android Streamer Device
Based on my experience, I can say Android does well sending camera streams over RTP, thanks to native support, while converting your video to FLV gives you headaches (in many cases, e.g. if you later want to deliver the stream to Android devices).
So I would suggest building on something like spyDroid.
2. Streaming Server
There are tools like Wowza Server which can take a source stream and publish it from the server for other clients. I guess VLC can do this too, via the File-->Stream menu, and then entering the RTSP address of the stream from your spyDroid-based app. But I have not tried it personally.
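For what it's worth, the command-line equivalent of that idea would be something along these lines (again untested; the address, ports and sout chain are assumptions to adapt):

    # Pull the RTSP stream from the phone and re-serve it as RTSP on port 8554.
    vlc -vvv rtsp://192.168.1.10:8086/ --sout '#rtp{sdp=rtsp://:8554/live}'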
Also, it is not hard to implement your own streaming server. I'll give you an example:
To implement an HLS server, you just need three things:
Video files segmented into 10-second MPEG-2 TS chunks (i.e. .ts files).
An M3U8 playlist of the chunks (a minimal example is shown after this list).
A web server with a simple web service that delivers the playlist to the clients (PC, Android, iPhone, almost every device) over HTTP. The clients will then look up the playlist file and request the appropriate chunks at the right times, because nearly all players have built-in HLS support.
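For instance, a minimal playlist for three 10-second segments could look like this (segment names are placeholders):

    #EXTM3U
    #EXT-X-VERSION:3
    #EXT-X-TARGETDURATION:10
    #EXT-X-MEDIA-SEQUENCE:0
    #EXTINF:10.0,
    segment000.ts
    #EXTINF:10.0,
    segment001.ts
    #EXTINF:10.0,
    segment002.ts
    #EXT-X-ENDLIST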
3. The Client-Side
Based on our comments, I suggest you might want to dig deeper into Android Video Streaming.
To complete a project this big, you need much more research. For example you should be able to distinguish RTP from RTSP and understand how they are related to each other.
Read my answer here to get a sense of state-of-the-art Video Streaming and please feel free to ask for more.
Hope you got the big picture of the journey ahead,
Good Luck and Have Fun
Quite a general question, but I will try to give you a direction for research:
First of all you will need to answer several questions:
1) What is the nature and purpose of the video stream? Is it a security application, where detail in still frames is vital (then you will have to use something like an MJPEG codec), or will it be viewed only in motion?
2) Are the stream source, server and clients on the same network, so that RTSP might be used for more exact timing, or will a WAN be involved and something more stable like HTTP should be used?
3) What is the number of simultaneous output connections? In other words, is it worth paying for something like Wowza with the transcoding add-on (and maybe nDVR too) or Flussonic, or will a simple solution like ffserver suffice?
To cut a long story short, for a cheap and dirty solution for a couple of viewers, you may use something like IP Webcam -> ffserver -> VLC for Android and avoid writing your own software.
You can handle it this way:
Prepare the camera preview in the way described here. The Camera object has a setPreviewCallback method in which you register the preview callback. This callback provides a data buffer (byte array) in YUV format that you can stream to your server.
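A minimal sketch of that callback wiring might look like this (the sending part is left as a stub, since how you packetize and transport the frames is up to you):

    import android.graphics.ImageFormat;
    import android.hardware.Camera;

    public class PreviewStreamer {

        @SuppressWarnings("deprecation") // android.hardware.Camera is the API discussed here
        public Camera startPreviewCapture() {
            Camera camera = Camera.open();

            Camera.Parameters params = camera.getParameters();
            // NV21 is the default preview format on most devices; set it explicitly anyway.
            params.setPreviewFormat(ImageFormat.NV21);
            camera.setParameters(params);

            camera.setPreviewCallback(new Camera.PreviewCallback() {
                @Override
                public void onPreviewFrame(byte[] data, Camera cam) {
                    // "data" holds one YUV (NV21) frame; hand it to your encoder / socket here.
                    sendFrameToServer(data);
                }
            });

            camera.startPreview(); // a preview surface/texture must also be set in real code
            return camera;
        }

        private void sendFrameToServer(byte[] yuvFrame) {
            // Placeholder: encode (e.g. with MediaCodec) and write to your network stream.
        }
    }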
I have been working on an Android app that streams video live to a server using the built-in camera, so that anyone can watch that live stream from my website, which is deployed on the server.
So can anyone help me with how I should start working on my project? At present I have no direction to start with.
A more specific example:
A person goes on a picnic and wants his friends and family to see, live, what's going on during the trip.
There is an open-source project that does a very similar thing:
http://code.google.com/p/ipcamera-for-android/
It basically uses a LocalSocket to read the video from the camera and streams it from a web server. You should be able to find lots of information in the source code.
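The core trick, as far as I understand that project (this is an untested sketch of the general idea, not its actual code), is to point a MediaRecorder at the writable end of a LocalSocket pair so your app can read the encoded video as it is produced:

    import android.hardware.Camera;
    import android.media.MediaRecorder;
    import android.net.LocalServerSocket;
    import android.net.LocalSocket;
    import android.net.LocalSocketAddress;

    import java.io.InputStream;

    public class LocalSocketRecorder {

        public InputStream startRecording(Camera camera) throws Exception {
            // Connected pair of local sockets: the recorder writes, we read.
            LocalServerSocket server = new LocalServerSocket("camera-stream");
            LocalSocket sender = new LocalSocket();
            sender.connect(new LocalSocketAddress("camera-stream"));
            LocalSocket receiver = server.accept();

            MediaRecorder recorder = new MediaRecorder();
            camera.unlock();
            recorder.setCamera(camera);
            recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
            recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
            recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
            // Write the encoded stream into the socket instead of a file.
            recorder.setOutputFile(sender.getFileDescriptor());
            recorder.prepare();
            recorder.start();

            // Note: the container written this way is not directly playable;
            // projects like the one above parse/repackage the stream before serving it.
            return receiver.getInputStream();
        }
    }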
If you want to stream over the internet, for everyone to see, I can recommend the service justin.tv, which lets you broadcast your stream to the whole internet. I've tried it, and it works very well!
However, if there's no Wi-Fi you will probably have a very laggy connection, unless you convert the video to a smaller size...
I am trying to write a network traffic monitor application myself. I have been using TrafficStats to get per-app network traffic statistics. But for video applications like YouTube, the streamed data is not captured by the per-app TrafficStats counters. Instead, the streamed data is attributed to "android.process.media". Sometimes it is captured by the total-traffic API in TrafficStats rather than the per-app API. If there is just one video application, say YouTube, I can always assign the data usage attributed to "android.process.media" back to YouTube. But some people have multiple video applications on their phone, and those applications usually use the same method to stream video, so I cannot distinguish how much data each video app consumes.
From the Android Market I found My Data Manager, which seems to correctly capture each video application's data usage. So I assume there must be a way to do it, but I have spent a lot of time searching for solutions without success. Does anyone know how to do it?
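For reference, this is roughly the kind of per-UID lookup I mean (a minimal sketch; the package name is a placeholder):

    import android.content.Context;
    import android.content.pm.ApplicationInfo;
    import android.content.pm.PackageManager;
    import android.net.TrafficStats;

    public class AppTrafficReader {

        // Total bytes received + sent attributed to the given package's UID.
        // TrafficStats returns -1 (UNSUPPORTED) on devices without per-UID stats.
        public long bytesForPackage(Context context, String packageName)
                throws PackageManager.NameNotFoundException {
            ApplicationInfo info =
                    context.getPackageManager().getApplicationInfo(packageName, 0);
            long rx = TrafficStats.getUidRxBytes(info.uid);
            long tx = TrafficStats.getUidTxBytes(info.uid);
            return rx + tx;
        }
    }

    // Example: bytesForPackage(context, "com.google.android.youtube")
    // For YouTube this under-reports, because the streamed data is attributed
    // to the "android.process.media" UID instead, as described above.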
===== Update on 02/05/2014 ====
I happened to talk to the person who implemented Android TrafficStats at a Google event. He told me that earlier versions (Gingerbread and earlier) of TrafficStats are buggy. The ones in ICS or later should be correct. I haven't tested the new versions, so use it with caution.
You can use the netstat command from a shell to find network statistics. Hope this link will help you:
http://en.wikipedia.org/wiki/Netstat
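If you want to do this from inside an app, a rough sketch would be to run the command and parse its output; whether a netstat binary is present, and what its output looks like, varies across Android versions and devices:

    import android.util.Log;

    import java.io.BufferedReader;
    import java.io.IOException;
    import java.io.InputStreamReader;

    public class NetstatRunner {

        public void dumpNetstat() throws IOException {
            // Run the shell command; output format is device-dependent.
            Process process = Runtime.getRuntime().exec("netstat");
            BufferedReader reader =
                    new BufferedReader(new InputStreamReader(process.getInputStream()));
            String line;
            while ((line = reader.readLine()) != null) {
                Log.d("NetStat", line); // parse per-connection info here instead of just logging
            }
            reader.close();
        }
    }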