Live video streaming with 200 ms latency [closed] - android

I am using the SDK below for live video streaming.
https://github.com/ant-media/LiveVideoBroadcaster
The server is RTMP-based. We send the live video stream to the RTMP server and then play it back in an AMS (Adobe Media Server) player.
Currently we are seeing latency of more than 30 seconds. How can we reduce it? We want to achieve 200 ms. Is that possible with the above SDK?
If not, please suggest another native Android SDK that can provide live video streaming with ultra-low latency.
Any help appreciated.
Thanks.

The latency is caused by your choice of TCP-based RTMP and by the caching server in the middle. For better results, switch to WebRTC, which is UDP-based. If you have only one or a few players, you will be better served by streaming to them directly.
If you have many subscribers and/or a sophisticated subscription policy, you need a relay server. But even then, the best strategy is to send video via WebRTC to a server that can convert it to RTMP if necessary. See how Wowza and Flashphoner address this.
I used these references to learn about the subject:
Live streaming video latency
Oh, latency, thou art a heartless bitch
WebRTC vs. RTMP – Which Protocol Should You Choose for Your Live Streaming App?

Try setting the keyframe interval to 1; it is generally 10.
Also set the segment duration to 1; the default is 3.
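
As a rough illustration of the keyframe-interval part of this advice, here is a minimal sketch of configuring an Android H.264 encoder via MediaCodec/MediaFormat so that it emits a keyframe every second. The class name, resolution, bitrate, and frame rate are placeholders, and the segment duration is a server-side setting (configured in the media server application, e.g. Ant Media or Wowza), not something you set in this code.

import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import java.io.IOException;

public class LowLatencyEncoderConfig {

    // Builds an H.264 encoder that emits a keyframe every 1 second instead of
    // the more common 2-10 seconds, so players (and segmenters) can start sooner.
    public static MediaCodec createEncoder(int width, int height) throws IOException {
        MediaFormat format = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, width, height);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 2_000_000); // placeholder bitrate
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);      // placeholder frame rate
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1); // keyframe interval in seconds

        MediaCodec encoder = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
        encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        return encoder;
    }
}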

Related

Web-based live video streaming from smartphone camera [closed]

I am looking for a way to live-stream video from a smartphone camera inside a browser such as Chrome or Edge. It should be able to transfer the stream using protocols such as RTMP, fragmented MP4, or RTP/MPEG-2 transport stream.
(Basically, I want to receive the livestream data in the Azure Media Services portal.)
I found similar solutions but they require downloading an app.
The web solution needs to be open-source or based on Microsoft products/services.
Does anyone know a suitable approach?
I am not expecting code here, just the names of open-source libraries or Microsoft products that can be used to achieve this goal.
You could use WebSockets. These allow for real-time communication where the client doesn't have to constantly poll the server for updated information. Check this out: Video streaming over websockets using JavaScript - Stack Overflow. I've done it before, but the code is a bit outdated. You'd also have to either build your own WebSocket server (I used a Raspberry Pi) or do the simpler thing and rent a connection.
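
To make the "build your own WebSocket server" idea concrete, here is a minimal sketch of a relay that forwards binary video chunks from a publisher to every other connected client. It assumes the Java-WebSocket library (org.java-websocket); the class name, port, and the convention that binary frames carry video data are illustrative choices, not part of any particular product.

import org.java_websocket.WebSocket;
import org.java_websocket.handshake.ClientHandshake;
import org.java_websocket.server.WebSocketServer;
import java.net.InetSocketAddress;
import java.nio.ByteBuffer;

public class VideoRelayServer extends WebSocketServer {

    public VideoRelayServer(int port) {
        super(new InetSocketAddress(port));
    }

    @Override
    public void onOpen(WebSocket conn, ClientHandshake handshake) {
        System.out.println("Client connected: " + conn.getRemoteSocketAddress());
    }

    @Override
    public void onMessage(WebSocket conn, String message) {
        // Text frames could carry control messages (start/stop, codec info).
    }

    @Override
    public void onMessage(WebSocket conn, ByteBuffer message) {
        // Binary frames carry encoded video chunks from the publisher;
        // forward them to every other connected client.
        for (WebSocket client : getConnections()) {
            if (client != conn && client.isOpen()) {
                client.send(message.duplicate());
            }
        }
    }

    @Override
    public void onClose(WebSocket conn, int code, String reason, boolean remote) {
        System.out.println("Connection closed: " + reason);
    }

    @Override
    public void onError(WebSocket conn, Exception ex) {
        ex.printStackTrace();
    }

    @Override
    public void onStart() {
        System.out.println("Relay listening on port " + getPort());
    }

    public static void main(String[] args) {
        new VideoRelayServer(8887).start();
    }
}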

Downloading a video from Firebase Storage - chunk by chunk [closed]

I'm building an Android app in Java, and one of its main goals is to show a video to the client.
Right now I'm storing all my videos in Firebase Storage. Initially I wanted to stream the video (YouTube style), but unfortunately Firebase Storage does not support that.
I read that there is an alternative way of "faking" streaming by downloading the video chunk by chunk and playing the chunks one after another; that way you don't need to wait until the whole video has been downloaded to the phone before playback starts.
You can see what I'm talking about here -
Speeding up firebase storage download
So my question is: which API/library can I use to do this, and does somebody have example code they can show me?
Thank you very much!
The Firebase SDK for Cloud Storage does not have any methods to stream the results.
The options I can quickly think of:
Store the video in smaller chunks, each chunk in a separate file. That way you can retrieve the files one by one, and start playing when you have the minimum number of chunks (a rough sketch of this approach follows below).
Set up your own server which reads from Cloud Storage (typically at a much higher bandwidth), and then sends a response to the client in smaller chunks. For more on this, also see this answer: Video Streaming from Google Cloud Storage
Neither of these is going to be trivial to implement, so you might want to consider whether a dedicated video streaming service isn't a better fit for your needs.
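
For the first option, here is a minimal sketch of fetching pre-split chunk files from Firebase Storage one by one with the Android Firebase Storage SDK. The "chunks/" path, file-naming scheme, and callback interface are hypothetical; they assume the video was split into separate files at upload time, and the downloaded chunks would then be fed to a player (for example as consecutive playlist items).

import com.google.firebase.storage.FirebaseStorage;
import com.google.firebase.storage.StorageReference;
import java.io.File;

public class ChunkedVideoDownloader {

    private final StorageReference root = FirebaseStorage.getInstance().getReference();

    // Downloads one pre-split chunk file (video_000.mp4, video_001.mp4, ...) to local cache.
    // The "chunks/" path and naming scheme are illustrative assumptions.
    public void downloadChunk(File cacheDir, String videoId, int index, ChunkCallback callback) {
        String chunkName = String.format("chunks/%s/video_%03d.mp4", videoId, index);
        File localChunk = new File(cacheDir, videoId + "_" + index + ".mp4");

        root.child(chunkName)
            .getFile(localChunk)
            .addOnSuccessListener(snapshot -> callback.onChunkReady(localChunk, index))
            .addOnFailureListener(callback::onError);
    }

    // Hypothetical callback the playback layer would implement.
    public interface ChunkCallback {
        void onChunkReady(File chunk, int index);
        void onError(Exception e);
    }
}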
You may use Mux or Cloudflare to stream videos:
https://blog.codemagic.io/build-video-streaming-with-flutter-and-mux/
https://developers.cloudflare.com/stream/

Video calling app for Android to desktop [closed]

Which is the best API to use to build a video calling app?
I have looked at WebRTC, but it can't keep a record of the two or more clients; it just connects them, because it is a client-to-client protocol. The signalling server is only involved in making the handshake between the clients.
Can you suggest a protocol that keeps a record of the video call? Does such a protocol exist?
You can get Who and When via your signalling. For the rest, you'll need an SFU. An SFU is a piece of software that the peers connect to instead of each other, and then it forwards the media data onwards to other peers. This would let you get all the other attributes you want.
I work for a WebRTC company and we have an SFU product called LiveSwitch (https://www.frozenmountain.com/products-services/liveswitch/). Check it out if you want to go the paid route.
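
As a sketch of the "Who and When via your signalling" point, the snippet below records call start and end times on the signalling side, independent of the media path. Everything here (class, method, and field names) is hypothetical glue code, not part of WebRTC or any SDK; recording the media itself would still require an SFU or MCU.

import java.time.Instant;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class CallRecordBook {

    public static final class CallRecord {
        final String caller;
        final String callee;
        final Instant startedAt;
        volatile Instant endedAt;

        CallRecord(String caller, String callee) {
            this.caller = caller;
            this.callee = callee;
            this.startedAt = Instant.now();
        }
    }

    private final Map<String, CallRecord> activeCalls = new ConcurrentHashMap<>();

    // Call this when the signalling server relays the offer/answer handshake.
    public void onCallEstablished(String callId, String caller, String callee) {
        activeCalls.put(callId, new CallRecord(caller, callee));
    }

    // Call this when a hangup or disconnect is signalled.
    public void onCallEnded(String callId) {
        CallRecord record = activeCalls.remove(callId);
        if (record != null) {
            record.endedAt = Instant.now();
            // Persist the record (database, audit log, etc.).
            System.out.printf("%s -> %s, %s to %s%n",
                    record.caller, record.callee, record.startedAt, record.endedAt);
        }
    }
}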
Check out http://www.pjsip.org/
PJSIP has the ability to do both audio and video signaling over SIP and works great on embedded devices as well as desktop environments.
IMHO WebRTC isn't mature enough for large scale deployments. Maybe in another 12-18 months, but today it's still too fragile. If you want consistent day after day performance and stability I suggest speaking the same language as the telcos: SIP and G711.

Streaming .m3u8 format using Chromecast [closed]

Hi, I developed an application to cast my videos to a TV via Chromecast. Now I am trying to play .m3u8 videos via Chromecast. Is this possible? Buffering and playback control are better for .m3u8 videos than for MP4. Any links will also do; I cannot find how to stream .m3u8 using Chromecast.
The .m3u8 format is for HLS (HTTP Live Streaming). To play .m3u8 files you need to host your media content on a server. Make sure CORS is enabled. Then have your sender load the URL where your content is hosted onto the receiver. This documentation has many more details and sample code. If the problem is with your application, check out the sample apps on GitHub as a reference.
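
As a sketch of the sender side, here is roughly what loading an HLS URL onto the receiver can look like with the Android Cast Application Framework (play-services-cast-framework). The helper class name and the URL are placeholders; the receiver fetches the playlist directly, which is why the host needs HTTPS and CORS enabled.

import android.content.Context;
import com.google.android.gms.cast.MediaInfo;
import com.google.android.gms.cast.MediaLoadRequestData;
import com.google.android.gms.cast.framework.CastContext;
import com.google.android.gms.cast.framework.CastSession;
import com.google.android.gms.cast.framework.media.RemoteMediaClient;

public class HlsCastHelper {

    // Asks the currently connected Chromecast receiver to play an HLS (.m3u8) stream.
    public static void castHlsStream(Context context, String hlsUrl) {
        CastSession session = CastContext.getSharedInstance(context)
                .getSessionManager()
                .getCurrentCastSession();
        if (session == null) {
            return; // No active cast session; show the Cast button instead.
        }

        MediaInfo mediaInfo = new MediaInfo.Builder(hlsUrl)
                .setStreamType(MediaInfo.STREAM_TYPE_BUFFERED) // or STREAM_TYPE_LIVE for live HLS
                .setContentType("application/x-mpegurl")       // MIME type for HLS playlists
                .build();

        RemoteMediaClient client = session.getRemoteMediaClient();
        if (client != null) {
            client.load(new MediaLoadRequestData.Builder()
                    .setMediaInfo(mediaInfo)
                    .setAutoplay(true)
                    .build());
        }
    }
}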

Broadcasting Video Over RTMP - Android [closed]

I have to develop an app which can broadcast a live stream to a Wowza server over the RTMP protocol in H.264 format. I am looking for an RTMP library that can help me publish the stream to the server.
Ignore me if this question has already been posted.
Thanks !
I had a similar problem; here is what I found.
First of all, RTMP has no native support on Android.
Aftek Android RTMP library (not free)
Smaxe RTMP library (not free)
Android RTMP Client (free)
If you find another solution, please share it.
I hope these links help you.
Update (answer for your e-mail)
About RTSP:
A very useful open-source sample: Spydroid-ipcamera
Also check this SO question about an RTSP client.
And try searching for other SO questions; this subject is frequently discussed.
You can do this with the Adobe AIR platform for Android/iOS devices.
I am not sure that you can do this with the Android SDK, so if you want to do it seriously, go for Adobe AIR.
While this question is old, if someone is still looking for an Android library to publish RTMP to Wowza, they can check out Intel INDE, which has built-in support for Wowza:
https://software.intel.com/en-us/articles/intel-inde-media-pack-for-android-tutorials
