How do I configure WebRTC for the lowest possible latency when streaming live video one-way from an Android phone camera to a PC, i.e. from a WebRTC app on Android to Firefox on the PC?
The quality could be around 15-24 fps at roughly 640 x 480.
My app needs to capture live video on the Android phone and transport it to the PC, to be viewed in Firefox, in as close to real time as possible (over a P2P connection). Think of something like controlling a robot or playing a live-streamed video game.
What is the best result I can expect? Could it be done with 50 ms latency over a 3G/4G network?
Thank you.
> Could it be done with 50 ms latency over a 3G/4G network?
Impossible. You can't even get a single packet across a mobile network with that little latency, let alone capture video, encode it, mux it with audio, send it, receive it, buffer it, demux it, decode it, and present it. A 50 ms end-to-end latency is not a whole lot higher than what you get with analog transmission!
You'll find that many phone cameras alone add that much lag before the system even has the data to work with.
You realize it can take ~200ms for a human to even react to visual stimulus anyway? My TV takes at least 150ms to display a frame from its lossless HDMI input.
Your project requirements are completely out of touch with reality. You should also take time to understand the tradeoffs that come with pushing digital video to the extreme low-latency end: you're going to make some real sacrifices by going under 1 s or 500 ms or so. Consider reading my post here: https://stackoverflow.com/a/37475943/362536, particularly the "why not [magic technology here]" section.
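That said, on the capture side of the question you can at least pin the Android WebRTC stack to the asker's 640x480 / 24 fps target. A hedged sketch using the org.webrtc Android library (the function and variable names are illustrative; it assumes a PeerConnectionFactory and EglBase were created during normal WebRTC setup, and omits signaling, permissions, and error handling):

```kotlin
import android.content.Context
import org.webrtc.Camera2Enumerator
import org.webrtc.EglBase
import org.webrtc.PeerConnectionFactory
import org.webrtc.SurfaceTextureHelper
import org.webrtc.VideoTrack

// Sketch: open the back camera and capture at 640x480 @ 24 fps.
fun startLowResCapture(context: Context,
                       factory: PeerConnectionFactory,
                       eglBase: EglBase): VideoTrack {
    val enumerator = Camera2Enumerator(context)
    val cameraName = enumerator.deviceNames.first { enumerator.isBackFacing(it) }
    val capturer = enumerator.createCapturer(cameraName, null)

    val source = factory.createVideoSource(capturer.isScreencast)
    val helper = SurfaceTextureHelper.create("CaptureThread", eglBase.eglBaseContext)
    capturer.initialize(helper, context, source.capturerObserver)

    // Small frames keep encode time and bitrate down, which is most of what
    // the sender can control; the network latency itself is out of your hands.
    capturer.startCapture(640, 480, 24)
    return factory.createVideoTrack("video0", source)
}
```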
Related
How do streaming apps like YouTube, Hotstar, or any other video player app programmatically detect that the network is getting slow at run time, and change the video quality based on changes in network speed?
Many streaming services nowadays use HTTP-based streaming protocols, but there are exceptions, especially with low-latency streaming, e.g. WebRTC or WebSocket-based solutions.
Assuming that you're using an HTTP-based protocol like HLS or MPEG-DASH, the "stream" is a long chain of video segments that are downloaded one after another. A video segment is a file in "TS" or "MP4" format (in some MP4 cases, video and audio are split into separate files); typically a segment holds 2, 6, or 10 seconds of audio and/or video.
Based on the playlist or manifest (or sometimes simply from decoding the segment), the player knows how many seconds of media a single segment contains. It also knows how long it took to download that segment. You can measure the available bandwidth by dividing the (average) size of a video segment file by the (average) time it took to download.
At the moment it takes more time to download a segment than to play it, you know the player will stall as soon as its buffer runs empty; stalling is generally referred to as "buffering". Adaptive Bitrate (ABR) is a technique that tries to prevent buffering; see https://en.wikipedia.org/wiki/Adaptive_bitrate_streaming (or Google the expression). When the player notices that the available bandwidth is lower than the bit rate of the video stream, it can switch to another version of the same stream with a lower bit rate (typically achieved through higher compression and/or a lower resolution, which costs quality, but that's better than buffering).
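To make the measuring trick concrete, here is a rough sketch of the estimate and the rendition switch described above (illustrative code, not taken from any real player):

```kotlin
// One quality level of the same stream, as advertised in the manifest.
data class Rendition(val name: String, val bitrateBps: Long)

// Estimated throughput: bits downloaded divided by download time.
fun estimateBandwidthBps(segmentBytes: Long, downloadMillis: Long): Double =
    segmentBytes * 8.0 / (downloadMillis / 1000.0)

// Pick the highest-bitrate rendition that fits under the estimate, with some
// headroom so playback doesn't ride right at the limit; fall back to the
// lowest rendition if nothing fits.
fun pickRendition(renditions: List<Rendition>, bandwidthBps: Double,
                  headroom: Double = 0.8): Rendition =
    renditions
        .filter { it.bitrateBps <= bandwidthBps * headroom }
        .maxByOrNull { it.bitrateBps }
        ?: renditions.minByOrNull { it.bitrateBps }!!
```

For example, a 2 MB segment that took 4 seconds to download yields an estimate of about 4 Mbps, so with 0.8 headroom the player would stay at or below a ~3.2 Mbps rendition.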
PS #1: WebRTC and WebSocket-based streaming solutions cannot use this measuring trick and must implement other solutions.
PS #2: New/upcoming variants of HLS (e.g. LL-HLS and LHLS) and MPEG-DASH use other HTTP techniques (like chunked transfer or HTTP push) to achieve lower latency. These typically do not work well with the measuring technique above and use different approaches, which I consider out of scope here.
You have to use a streaming server to do that. Wowza is one such server (not free). The client and server exchange information about the connection, and the server distributes chunks of the video depending on the network speed.
I'm making an app that needs to send a video feed from a single source to a server where it can be accessed by desktop browsers and mobile apps.
So far, I've been using Adobe Media Server 5 with a live RTMP stream. That gives me about a 2.5-second delay in desktop browsers, but no native iOS support; I'm left with the option of using Air to export the app for iOS, which produces a minimum 5-6 second delay.
The iOS docs strongly recommend the use of HTTP Live Streaming which segments the stream into chunks and serves it using a dynamic playlist in a .m3u8 file. Doing this produces a 15+ second delay in desktop browsers and mobile devices. A Google search seemed to reveal that this is to be expected from HLS.
I need a maximum of 2-4 seconds of delay across all devices, if possible. I've gotten poor results with Wowza, but am open to revisiting it. FFmpeg seems inefficient, but I'm open to that as well if someone has had good results with it. Anybody have any suggestions? Thanks in advance.
I haven't even begun to find the most efficient way to stream to Android, so any help in that department would be much appreciated.
EDIT: Just to be clear, my plan is to make an iOS app, whether it's written natively or in Air. Same goes for Android, but I've yet to start on that.
In the iOS browser, HLS is the only way to serve live video. The absolute lowest latency would come from using 2-second segments with a 2-segment window in the manifest. That gives you 4 seconds of latency on the client, plus another 2 to 4 on the server. There is no way to do better without writing an app.
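For illustration, a live playlist with 2-second segments and a 2-segment window would look roughly like this (sequence numbers and segment URIs are placeholders):

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:2
#EXT-X-MEDIA-SEQUENCE:1234
#EXTINF:2.000,
segment1234.ts
#EXTINF:2.000,
segment1235.ts
```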
A 15-second delay for HLS streams is pretty good; to get lower latency you need to use a different streaming protocol.
RTP/RTSP will give you the lowest latency and is typically used for VoIP and video conferencing, but you will find it very difficult to use over multiple mobile and WiFi networks (some of them unintentionally block RTP).
If you can write an iOS app that supports RTMP then that is the easiest way to go and should work on Android too (only old Androids support Flash/RTMP natively). Decoding in software will result in poor battery life. There are other iOS apps that don't use HLS for streaming, but I think you need to limit it to your service (not a generic video player).
Also remember that higher latency generally means higher video quality, less buffering, and a better user experience, so don't reduce latency unnecessarily.
I'm new to the Android platform, and I want to develop an app that runs in the background, reads the microphone input, applies a transformation to it, and outputs the resulting audio to the speaker.
I'm wondering if there is any lag perceived by the user in this process, or if it's possible to do it in near-realtime so that the user can hear the transformed audio in sync with the ambient audio. Thanks!
Yes, users will hear a severe latency lag or echo with attempts at real-time audio on current unmodified Android devices using the provided APIs.
The summary is that Android devices are configured with fairly long audio buffers, reported to be somewhere in the range of 100 to 400 milliseconds depending on the particular device and the Android OS version it is running. (Shorter buffers might be possible on devices where you can build and install a modified custom build of the OS with your own audio drivers.)
(Humans hear echoes at delays of somewhere around 25 ms and above. Audio buffers on iOS can be as short as 5.8 ms, so you may have better luck developing your near-real-time audio processing on a different device platform.)
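To see what you're up against on a given device, you can query the minimum buffer sizes the Java audio APIs will accept and convert them to milliseconds; a small sketch (44.1 kHz mono 16-bit PCM assumed, so 2 bytes per frame):

```kotlin
import android.media.AudioFormat
import android.media.AudioRecord
import android.media.AudioTrack

// Returns (record, playback) buffer latency in milliseconds: a rough floor
// on the round-trip delay that the buffers alone impose.
fun minBufferLatencyMs(sampleRate: Int = 44100): Pair<Double, Double> {
    val recBytes = AudioRecord.getMinBufferSize(
        sampleRate, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT)
    val playBytes = AudioTrack.getMinBufferSize(
        sampleRate, AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT)
    val bytesPerSecond = sampleRate * 2.0
    return (recBytes / bytesPerSecond * 1000) to (playBytes / bytesPerSecond * 1000)
}
```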
Audio processing on Android isn't all that great; in fact, to be honest, it sucks. The out-of-the-box latency on Android devices for this kind of thing is pretty awful. You can, however, tinker with the NDK and try to put together something based on OpenSL ES, which can achieve significantly lower latency.
There is a similar StackOverflow question: Playing back sound coming from microphone in real-time
Some other helpful links:
http://arunraghavan.net/2012/01/pulseaudio-vs-audioflinger-fight/
http://www.musiquetactile.fr/android-is-far-behind-ios/
http://www.geardiary.com/2012/02/21/the-dismal-state-of-android-as-a-music-production-solution/
On the other side of the coin, Android mic quality is way better than iOS. I have a Galaxy S4 and a very low-end Huawei phone, and both have wonderful mic quality when recording.
Has anybody had any luck streaming high-quality video (over 1000 kbps) to Android through RTSP?
We currently have low quality video streams (around 200kbps) that work wonderfully over 3G. Now we are trying to serve a high-quality stream for when the user has a faster connection. The high quality videos play smoothly in VLC, but the Android playback seems to drop frames and get blocky, even on a 4 megabit connection.
It seems like the YouTube app uses a plain HTTP download for their high quality videos. This works well and plays smoothly, but will not work for streaming live videos. Has anybody had luck streaming high quality videos to Android through RTSP?
The videos are encoded using H.264, 1500kbps, 24fps, and a 720x480 resolution. In the app, we are using a VideoView to play the videos. We are using Darwin Streaming Server, but we are open to other options if necessary.
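For reference, the playback path described above is essentially just this (the URL is a placeholder):

```kotlin
import android.net.Uri
import android.widget.VideoView

// Minimal VideoView-based RTSP playback, as used in the question. VideoView
// delegates to the platform MediaPlayer, which handles RTSP itself, so
// buffering behavior is largely outside the app's control.
fun playStream(videoView: VideoView) {
    videoView.setVideoURI(Uri.parse("rtsp://example.com/high-quality.sdp"))
    videoView.setOnPreparedListener { mp -> mp.start() }
    videoView.requestFocus()
}
```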
Update 6/23/2011
Looking through Darwin some more today. So far, I am just logging the request and session information in a Darwin module.
The original Droid tries to use these settings: 3GPP-Adaptation:...size=131072;target-time=4000. Although that means it wants 4 seconds of buffer, 131 KB only holds about 0.87 seconds of playback at 1200 kbps. I understand that 1200 kbps is a lot, but it is necessary for high-quality video (minimal compression at 720x480).
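As a quick check of that arithmetic (a throwaway helper, not Darwin code):

```kotlin
// Seconds of playback a byte buffer holds at a given bitrate.
fun bufferSeconds(bufferBytes: Long, bitrateBps: Long): Double =
    bufferBytes * 8.0 / bitrateBps

// 131072 bytes at 1200 kbps: about 0.87 s, far short of the 4 s target-time.
val held = bufferSeconds(131_072, 1_200_000)
```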
I am trying to force the client to buffer more, but I haven't figured out how to do that yet. I'm just looking through the Darwin Streaming Server source and trying to figure out how they do things. Any Darwin experts out there?
Update 6/24/2011
As it turns out, using plain old HTTP for viewing videos on demand works well with no loss of quality. When we get to live streaming, we will have to look more into RTSP.
Well, even if the network is able to transmit at that rate, you still need to decode it. What are you using for decoding? You will probably need a NEON-accelerated video decoder to get a proper frame rate, and a decent-sized buffer... the graphics processor is only as good as the bus it sits on... Also, what are your encoding settings and resolution?
Edit: You are encoding those at much too high a bitrate; half of that will do fine. You also need to pin down where the issue lies. Is the MediaPlayer getting the data but failing to play at a decent frame rate? In that case you have to replace the MediaPlayer code with your own player. If it's a network issue, the only solution is to lower the bitrate; 600 kbps would be just fine (or 500 kbps video plus 128 kbps audio). That's 3x your 200k stream, and on a screen this small the difference is not noticeable.
I am new to video streaming and am working on a project to broadcast video to Android phones over the internet, where the number of users viewing at the same time may reach 100.
After looking around for a while, I think RTSP streaming may be the most convenient approach for the phone client (am I right?), so I have to choose a server. My current candidates are:
VLC
Darwin Streaming Server
Are they suitable, or is there a better choice?
How is the performance of these two servers with 100 users accessing the stream at the same time?
Thanks in advance
Regards
Bolton
RTSP streaming in H.264/AAC would be the most convenient way to reach Android devices. End-users will not need to install an app or open the stream in one - the native media player will seamlessly open the stream.
If you intend on using VLC for the encoding portion - you may want to reconsider, as I'm not sure it supports H.264/AAC compression, which is required to reach Android devices. You may want to consider using commercial software like Wirecast or the free Flash Media Encoder with the AAC plugin.
Darwin Streaming Server is stable enough to handle that load (100 concurrent viewers); however, the amount of throughput you have available and the bit rate you will be broadcasting at are more important factors to consider when delivering video. In other words, your upload speed has to be sufficient. If it's not intended strictly as a DIY project, I would suggest tapping into a commercial CDN's network (I would recommend NetroMedia).
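As a back-of-the-envelope check on the throughput point (the numbers here are illustrative, not from the question):

```kotlin
// Upstream bandwidth needed to serve every viewer directly from one server.
fun requiredUpstreamMbps(viewers: Int, streamKbps: Int): Double =
    viewers * streamKbps / 1000.0

// 100 viewers of a 500 kbps stream need ~50 Mbps of sustained upload,
// which is why a CDN becomes attractive as the audience grows.
val needed = requiredUpstreamMbps(100, 500)
```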