Stream video android-android

I would like to stream a video between two Android devices (android-android). There wouldn't be any server, so the streaming has to be direct between the devices. The devices would be on the same network, so they could communicate via WiFi.
I've tried using MediaRecorder and MediaPlayer over sockets, but I received many exceptions.
I also looked for libraries, but I just want to stream a video between two devices directly.
Any solutions?

If your video is for real-time communication, e.g. video chat or sharing CCTV footage in real time with minimal delay, then a real-time video communication approach like WebRTC would be one possibility - this type of approach prioritises low latency over quality to ensure minimum delay. See here for Android WebRTC documentation:
https://webrtc.org/native-code/android/
If the requirement is just to allow one device to act as a server for non-real-time videos, then the easiest approach may be to use one of the available HTTP server libraries or apps to let one device act as a server that the other can simply connect to via a browser or player. An example Android HTTP server that seems to get good reviews is:
https://play.google.com/store/apps/details?id=jp.ubi.common.http.server&hl=en
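As a rough illustration of the HTTP-server route, here is a minimal sketch using the NanoHTTPD library (not mentioned above, just one common choice). The port, file path and MIME type are assumptions, and a production version would also need to handle HTTP Range requests so players can seek:

```java
import fi.iki.elonen.NanoHTTPD;

import java.io.FileInputStream;
import java.io.IOException;

// Serves a single local video file over HTTP so another device on the
// same WiFi network can play it.
public class VideoServer extends NanoHTTPD {

    private final String videoPath;

    public VideoServer(int port, String videoPath) {
        super(port);
        this.videoPath = videoPath;
    }

    @Override
    public Response serve(IHTTPSession session) {
        try {
            // Stream the file as video/mp4 using chunked transfer.
            FileInputStream video = new FileInputStream(videoPath);
            return newChunkedResponse(Response.Status.OK, "video/mp4", video);
        } catch (IOException e) {
            return newFixedLengthResponse(Response.Status.NOT_FOUND,
                    MIME_PLAINTEXT, "Video not found");
        }
    }
}

// Usage, e.g. from an Activity or Service (start() throws IOException):
// new VideoServer(8080, getFilesDir() + "/demo.mp4").start();
```

The other device can then simply open http://<server-ip>:8080/ in a browser or media player.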

Related

Stream multiple-language audio from a playing video on a local live-streaming server

I have a case where I need a video played repeatedly (looped) on a server (the video is shown on a connected screen), and a client on the same wireless network that can receive the playing video's audio stream(s) in sync with the video being played on the server (the video may have multiple languages, or could be linked with multiple audio files that need to be streamed in sync with the playing video).
Is there a Node.js package or library that I can use to achieve this scenario, and a streaming protocol/server that makes it easy to program these features in the client without too much hassle around sync and streaming?
On the client side I'm looking for support for mobile devices (preferably browser-based) that can receive the stream with JavaScript (which is simpler for me) or something else, and potentially run as a full-screen app on the mobile device.
My plan is to have an app on mobile devices where a user can choose a language and a video playing on the screens to listen to in sync with the video - like choosing a channel they can connect to and listen to the audio of that video over the wireless network. So I'm looking for a way or a solution to program this.
Thanks!

What is the most efficient way to implement HTTP Live Video Streaming in Android?

For the past month I have been searching the Internet for ways to record live video from an Android application and send it over to a server, but the more I research, the more confused I get.
First of all, I am looking for a streaming protocol that can also be used for iOS in the future, so I came to the conclusion that DASH (Dynamic Adaptive Streaming over HTTP) is the ideal solution.
In addition, the recent Android framework, ExoPlayer, supports this feature.
Furthermore, I do not wish to use a Live Streaming engine such as WOWZA.
Secondly, based on my research I also concluded that any HTTP server can be used to receive the "chunks" of data, but I must have a streaming server to be able to stream the video back to the users.
I believe this process is quite complex but I will not give up until I successfully make it work.
Lastly, my question is: what server and protocol should I use to achieve this? And how do I convert the video and send it to the server?
Looking at your questions re protocol and server:
A 'streaming protocol that can be used for iOS also in the future'
It probably depends what you mean by 'future'. At the moment Apple requires you to use HLS on iOS for any video delivered over a mobile network (cellular) that is over 10 minutes long. DASH is establishing itself as the industry standard, so this may change and Apple may accept it also, but if you need something in the near future you may want to plan to support both DASH and HLS.
What server should you use for streaming
Streaming video is complex and the domain is fast changing, so it really is good to use or build on a dedicated streaming server if you can. These will generally have mechanisms and/or well-documented procedures for converting input videos to the different formats and bit rates you need, depending on your reach and user-experience goals. Reach determines the different encodings you need, as different browsers and devices support different encodings; and if you want your users to have a good experience and avoid buffering, you will also want multiple bit-rate versions of each format - this is what allows DASH and HLS to provide Adaptive Bit Rate streaming (ABR), meaning clients can select the best bit rate at any given time depending on network conditions. Video manipulation, especially transcoding, is a CPU-intensive task, so another advantage of dedicated streaming server software is that it should be optimised as much as possible to reduce your server load.
If you do decide to go the streaming server route, then besides Wowza, which you mention above, there are open-source alternatives such as:
https://gstreamer.freedesktop.org
These have plugins that support ABR etc - if you search for 'GStreamer streaming server ABR' you will find some good blogs about setting this up.
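On the playback side, since you mention ExoPlayer already supports DASH, a minimal sketch of playing a DASH manifest with the ExoPlayer 2 API (around 2.10) might look like the following; the manifest URL and user-agent string are placeholders:

```java
import android.content.Context;
import android.net.Uri;

import com.google.android.exoplayer2.ExoPlayerFactory;
import com.google.android.exoplayer2.SimpleExoPlayer;
import com.google.android.exoplayer2.source.MediaSource;
import com.google.android.exoplayer2.source.dash.DashMediaSource;
import com.google.android.exoplayer2.ui.PlayerView;
import com.google.android.exoplayer2.upstream.DataSource;
import com.google.android.exoplayer2.upstream.DefaultHttpDataSourceFactory;

public class DashPlaybackHelper {

    // Creates a player, points it at a DASH manifest (.mpd) and starts
    // playback in the given view.
    public static SimpleExoPlayer playDash(Context context, PlayerView view,
                                           String manifestUrl) {
        SimpleExoPlayer player = ExoPlayerFactory.newSimpleInstance(context);
        DataSource.Factory httpFactory =
                new DefaultHttpDataSourceFactory("dash-demo");
        MediaSource dashSource = new DashMediaSource.Factory(httpFactory)
                .createMediaSource(Uri.parse(manifestUrl));
        view.setPlayer(player);
        player.prepare(dashSource);
        player.setPlayWhenReady(true);
        return player;
    }
}
```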

How to sequentially play chunks of videos in ExoPlayer

I'm trying to develop an Android app that plays a video coming either from the server or from other peers that have the video. For this use case I have to split my videos into smaller pieces to optimize transfer times, and each piece can be provided by either the central server or another peer.
I would like to know if ExoPlayer is able to play a sequence of video pieces without interruption.
I am free to do whatever I want in the splitting process, e.g. split the video with the Linux command split.
Most adaptive bit rate streaming formats work something like your description - they split the video into multiple chunks and the video player requests them one at a time. Examples of adaptive bit rate streaming protocols are HLS, MPEG-DASH and SmoothStreaming.
It is possible to have the url for the next 'chunk' of video route to a 'central' server which could proxy the request to another 'peer', if this would meet your needs.
It's worth noting that many videos are delivered via CDNs, which might interfere with your desired approach (or alternatively might actually match what you want, depending on what your underlying requirements are), so you may want to check this also.
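On the ExoPlayer question specifically: ExoPlayer 2 provides a ConcatenatingMediaSource that plays an ordered list of sources back-to-back without gaps, which is one way to feed it a sequence of pieces. A minimal sketch, assuming each piece is reachable at a known URL:

```java
import android.net.Uri;

import com.google.android.exoplayer2.source.ConcatenatingMediaSource;
import com.google.android.exoplayer2.source.MediaSource;
import com.google.android.exoplayer2.source.ProgressiveMediaSource;
import com.google.android.exoplayer2.upstream.DataSource;
import com.google.android.exoplayer2.upstream.DefaultHttpDataSourceFactory;

public class ChunkPlaylistBuilder {

    // Builds one gapless MediaSource from an ordered list of piece URLs;
    // each URL may point at the central server or at a peer.
    public static MediaSource build(String[] pieceUrls) {
        DataSource.Factory httpFactory =
                new DefaultHttpDataSourceFactory("chunk-demo");
        ConcatenatingMediaSource playlist = new ConcatenatingMediaSource();
        for (String url : pieceUrls) {
            playlist.addMediaSource(
                    new ProgressiveMediaSource.Factory(httpFactory)
                            .createMediaSource(Uri.parse(url)));
        }
        return playlist; // pass to player.prepare(playlist)
    }
}
```

Note that each piece must be an independently parseable media file (e.g. fragmented MP4 or TS segments) - a raw byte-level split of an MP4 will not produce playable pieces.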
Update
Assuming you mean that some chunks will come from the server and some chunks from peer devices on the network, then the above server proxy method would not work.
One way you could do this would be to have all the chunks delivered to the device from whatever source is best for each chunk, and then put them together as you require on the device and stream the result from 'localhost' on your device to the player.
This sounds like a huge amount of overhead and something that would never work, but I believe it is actually a technique used in some apps to convert from one streaming format to another (can't provide an example - sorry...).
One example of a 'localhost' server on Android that might be useful to look at is:
http://www.laptopmag.com/articles/android-web-server
An alternative: if you were to use HTML5 inside a web page on the device, you could use the Media Source Extensions (MSE) mechanism to load the video chunks from the different sources before passing them to the player. This does require Chrome at this point rather than the standard Android browser, as the latter does not support MSE at the time of writing.
In all these approaches you obviously need to make sure you load enough in advance to keep the video pipeline and buffer full, to avoid pauses.
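As a very rough sketch of the 'localhost' idea, again using the NanoHTTPD library: pieces that have already been downloaded (from the server or from peers) are concatenated and served to the local player. The port, file paths and MIME type are assumptions, and the pieces must concatenate into one valid container (e.g. MPEG-TS segments):

```java
import fi.iki.elonen.NanoHTTPD;

import java.io.FileInputStream;
import java.io.InputStream;
import java.io.SequenceInputStream;
import java.util.Collections;
import java.util.List;
import java.util.Vector;

// Stitches already-downloaded pieces together and serves the result to
// the local player at http://127.0.0.1:8090/.
public class LocalStitchServer extends NanoHTTPD {

    private final List<String> piecePaths;

    public LocalStitchServer(List<String> piecePaths) {
        super(8090);
        this.piecePaths = piecePaths;
    }

    @Override
    public Response serve(IHTTPSession session) {
        try {
            Vector<InputStream> streams = new Vector<>();
            for (String path : piecePaths) {
                streams.add(new FileInputStream(path));
            }
            // SequenceInputStream reads the pieces back-to-back as one stream.
            InputStream joined =
                    new SequenceInputStream(Collections.enumeration(streams));
            return newChunkedResponse(Response.Status.OK, "video/mp2t", joined);
        } catch (Exception e) {
            return newFixedLengthResponse(Response.Status.INTERNAL_ERROR,
                    MIME_PLAINTEXT, "Failed to open pieces");
        }
    }
}
```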

Most instant way to stream live video to iOS and Android

I'm making an app that needs to send a video feed from a single source to a server where it can be accessed by desktop browsers and mobile apps.
So far, I've been using Adobe Media Server 5 with a live RTMP stream. This gives me about a 2.5-second delay in desktop browsers; it gives me no native support for iOS, but leaves me the option of using Air to export the app for iOS, which produces a minimum 5-6 second delay.
The iOS docs strongly recommend the use of HTTP Live Streaming, which segments the stream into chunks and serves it using a dynamic playlist in a .m3u8 file. Doing this produces a 15+ second delay in desktop browsers and on mobile devices. A Google search seemed to reveal that this is to be expected from HLS.
I need a maximum of 2-4 seconds of delay across all devices, if possible. I've gotten poor results with Wowza, but am open to revisiting it. FFmpeg seems inefficient, but I'm open to that as well if someone has had good results with it. Anybody have any suggestions? Thanks in advance.
I haven't even begun to find the most efficient way to stream to Android, so any help in that department would be much appreciated.
EDIT: Just to be clear, my plan is to make an iOS app, whether it's written natively or in Air. Same goes for Android, but I've yet to start on that.
In the iOS browser, HLS is the only way to serve live video. The absolute lowest latency would be to use 2-second segments with a 2-segment window in the manifest. This will give you 4 seconds of latency on the client, plus another 2 to 4 on the server. There is no way to do better without writing an app.
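For illustration, a live HLS media playlist tuned this way would look roughly like the following (segment names, target duration and sequence number are made up). The player has to buffer the listed segments before playback, which is where the roughly 2 x 2 seconds of client-side latency comes from:

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:2
#EXT-X-MEDIA-SEQUENCE:1347
#EXTINF:2.0,
segment1347.ts
#EXTINF:2.0,
segment1348.ts
```

There is no #EXT-X-ENDLIST tag because the stream is live; the server rewrites the playlist as new segments are produced.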
A 15-second delay for HLS streams is pretty good; to get lower latency you need to use a different streaming protocol.
RTP/RTSP will give you the lowest latency and is typically used for VoIP and video conferencing, but you will find it very difficult to use over multiple mobile and WiFi networks (some of them unintentionally block RTP).
If you can write an iOS app that supports RTMP, then that is the easiest way to go and should work on Android too (only old Android versions support Flash/RTMP natively). Decoding in software will result in poor battery life. There are other iOS apps that don't use HLS for streaming, but I think you need to limit it to your own service (not a generic video player).
Also, please remember that higher latency generally means higher video quality, less buffering and a better user experience, so don't reduce latency unnecessarily.

Server for broadcasting RTSP video to Android

I am new to video streaming and am working on a project to broadcast video to Android phones over the Internet; the number of users viewing the video at the same time may reach 100.
After looking around for a while, I think using RTSP streaming for the phone client may be convenient (am I right?), and so I have to choose a server. My current choice would be one of:
VLC
Darwin Streaming Server
Are they suitable? Or is there any better choice?
How is the performance of these two servers with 100 users accessing the stream at the same time?
Thanks in advance
Regards
Bolton
RTSP streaming in H.264/AAC would be the most convenient way to reach Android devices. End-users will not need to install an app or open the stream in one - the native media player will seamlessly open the stream.
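For what it's worth, on the client side this can be as simple as pointing Android's stock VideoView at the stream; a minimal sketch with a placeholder URL:

```java
import android.net.Uri;
import android.widget.VideoView;

public class RtspClientExample {

    // The stock media framework handles RTSP (H.264/AAC) natively,
    // so a plain VideoView is enough on the client.
    public static void play(VideoView videoView) {
        videoView.setVideoURI(Uri.parse("rtsp://example.com:554/live.sdp"));
        videoView.start();
    }
}
```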
If you intend to use VLC for the encoding portion, you may want to reconsider, as I'm not sure it supports H.264/AAC compression, which is required to reach Android devices. You may want to consider using commercial software like Wirecast or the free Flash Media Encoder with the AAC plugin.
Darwin Streaming Server is stable enough to handle that load (100 concurrent viewers); however, the amount of throughput you have available and the bit rate you will be broadcasting at are more important factors to consider when delivering video. In other words, your upload speed has to be sufficient - for example, 100 viewers of a 500 kbps stream require roughly 50 Mbps of sustained upload. If it's not intended strictly as a DIY project, I would suggest tapping into a commercial CDN's network (I would recommend NetroMedia).
