I'm building an Android app in Java, and one of its main goals is to show a video to the client.
Right now I'm storing all my videos in Firebase Storage. Originally I wanted to stream the video (YouTube style), but unfortunately Firebase Storage does not support that.
I read that there is an alternative way of "faking" a stream by downloading the video chunk by chunk and playing the chunks one after another, so you don't have to wait until the whole video has been downloaded to the phone before playback starts.
You can see what I'm talking about here -
Speeding up firebase storage download
So my question is: which API/library/tool can I use to do this, and does anybody have example code they can show me?
Thank you very much!
The Firebase SDK for Cloud Storage does not have any methods to stream the results.
The options I can quickly think of:
Store the video in smaller chunks, each chunk in a separate file. That way you can retrieve the files one by one, and start playing once you have a minimum number of chunks (see the sketch at the end of this answer).
Set up your own server which reads from Cloud Storage (typically at a much higher bandwidth), and then sends a response to the client in smaller chunks. For more on this, also see this answer: Video Streaming from Google Cloud Storage
Neither of these is going to be trivial to implement, so you might want to consider whether a dedicated video streaming service is a better fit for your needs.
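For the first option, here is a rough sketch of the client side in Java, assuming the video has already been split server-side into files named chunk_0.mp4, chunk_1.mp4, … under a known Storage path (the path, chunk count, and "three chunks buffered" threshold are all illustrative, not part of the Firebase API):

```java
import android.util.Log;

import com.google.firebase.storage.FirebaseStorage;
import com.google.firebase.storage.StorageReference;

import java.io.File;

public class ChunkedVideoFetcher {

    // Hypothetical layout: videos/my_video/chunk_0.mp4, chunk_1.mp4, ...
    private final StorageReference videoDir =
            FirebaseStorage.getInstance().getReference("videos/my_video");
    private final File cacheDir;
    private final int totalChunks; // known in advance, e.g. stored in your database
    private static final int MIN_CHUNKS_BEFORE_PLAY = 3;

    public ChunkedVideoFetcher(File cacheDir, int totalChunks) {
        this.cacheDir = cacheDir;
        this.totalChunks = totalChunks;
    }

    /** Downloads chunk files sequentially; fires the callback once a few chunks are buffered. */
    public void fetch(int index, Runnable onReadyToPlay) {
        if (index >= totalChunks) return;
        File local = new File(cacheDir, "chunk_" + index + ".mp4");
        videoDir.child("chunk_" + index + ".mp4")
                .getFile(local)
                .addOnSuccessListener(snapshot -> {
                    if (index + 1 == MIN_CHUNKS_BEFORE_PLAY) {
                        // Enough local chunks to start playback, e.g. by feeding the
                        // files to the player one after another.
                        onReadyToPlay.run();
                    }
                    fetch(index + 1, onReadyToPlay); // continue with the next chunk
                })
                .addOnFailureListener(e ->
                        Log.e("ChunkedVideoFetcher", "Chunk " + index + " failed", e));
    }
}
```

Stitching the local chunk files back together for gapless playback is the harder part; ExoPlayer's playlist support is one way to approach that.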
You may use Mux or Cloudflare Stream to stream videos:
https://blog.codemagic.io/build-video-streaming-with-flutter-and-mux/
https://developers.cloudflare.com/stream/
I am looking for a way to livestream video from a smartphone camera inside a browser like Chrome or Edge. It should be able to transfer in protocols like RTMP, fragmented MP4, or RTP/MPEG-2 transport stream.
(Basically, I'd like to receive the livestream data in the Azure Media Services portal.)
I found similar solutions but they require downloading an app.
The web solution needs to be open-source or based on Microsoft products/services.
Does anyone know a suitable approach?
I am not expecting code here, but rather names of open-source libraries or Microsoft products that can be used to achieve this goal.
You could use WebSockets. They allow real-time communication without the client having to constantly poll the server for updated information. Check this out: Video streaming over websockets using JavaScript - Stack Overflow. I've done it before, but that code is a bit outdated. You'd also have to either build your own WebSocket server (I used a Raspberry Pi) or do the simpler thing and rent a hosted one.
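If you do build the server yourself, the relay part can be quite small. A rough sketch using the Jakarta WebSocket API (the endpoint path is made up, and this simply fans binary chunks out to connected viewers; it is not an Azure Media Services ingest):

```java
import java.io.IOException;
import java.nio.ByteBuffer;
import java.util.Set;
import java.util.concurrent.CopyOnWriteArraySet;

import jakarta.websocket.OnClose;
import jakarta.websocket.OnMessage;
import jakarta.websocket.OnOpen;
import jakarta.websocket.Session;
import jakarta.websocket.server.ServerEndpoint;

@ServerEndpoint("/stream")
public class StreamRelayEndpoint {

    private static final Set<Session> sessions = new CopyOnWriteArraySet<>();

    @OnOpen
    public void onOpen(Session session) {
        sessions.add(session);
    }

    @OnClose
    public void onClose(Session session) {
        sessions.remove(session);
    }

    /** The broadcaster sends binary media chunks (e.g. from MediaRecorder in the browser). */
    @OnMessage
    public void onChunk(ByteBuffer chunk, Session sender) throws IOException {
        for (Session viewer : sessions) {
            if (viewer.isOpen() && !viewer.equals(sender)) {
                viewer.getBasicRemote().sendBinary(chunk.duplicate());
            }
        }
    }
}
```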
I am using the SDK below for video live streaming.
https://github.com/ant-media/LiveVideoBroadcaster
The server is RTMP based. We send the live video stream to the RTMP server and then play the video on an AMS (Adobe Media Server) player.
Currently we are getting latency greater than 30 seconds. How can we reduce this latency? We want to achieve 200 ms. Is that possible with the above SDK?
If not, please suggest another native Android SDK that can provide live video streaming with ultra-low latency.
Any help is appreciated.
Thanks.
The latency is caused by your choice of TCP-based RTMP and by the caching server in the middle. For better results, switch to WebRTC, which is UDP based. If you have only one or a few players, you will be better served by streaming to them directly.
If you have many subscribers and/or a sophisticated subscription policy, you need a relay server. But even then, the best strategy is to send video via WebRTC to a server that can convert it to RTMP if necessary. See how Wowza and Flashphoner address that.
I have used these references to learn the subject:
Live streaming video latency
Oh, latency, thou art a heartless bitch
WebRTC vs. RTMP – Which Protocol Should You Choose for Your Live Streaming App?
Try setting the keyframe interval to 1; it is commonly 10.
Also set the segment duration to 1; the default is 3.
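On Android the keyframe interval is a property of the encoder's MediaFormat, so this only helps if the broadcaster SDK lets you supply or tweak that configuration; the segment duration is normally a server-side setting and isn't shown here. A rough sketch (resolution, bitrate, and frame rate are placeholders):

```java
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;

import java.io.IOException;

public class LowLatencyEncoderConfig {

    public static MediaCodec createEncoder(int width, int height) throws IOException {
        MediaFormat format =
                MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, width, height);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 2_000_000);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
        // One keyframe per second instead of one every ten seconds, so players can
        // join and recover on a much shorter boundary.
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

        MediaCodec encoder = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
        encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        return encoder;
    }
}
```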
Which is the best API to use to build a video calling app?
I have looked at WebRTC, but it can't keep a record of calls between two or more clients; it just connects them, because it is a client-to-client protocol and the signalling server is only involved in the handshake between the clients.
Can you suggest a protocol that keeps a record of video calls? Does such a protocol exist?
You can get the who and when via your signalling. For the rest, you'll need an SFU (Selective Forwarding Unit). An SFU is a piece of software that the peers connect to instead of each other, and it forwards the media data onwards to the other peers. This would let you get all the other attributes you want.
I work for a WebRTC company and we have an SFU product called LiveSwitch (https://www.frozenmountain.com/products-services/liveswitch/). Check it out if you want to go the paid route.
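The "who and when" part really is just bookkeeping in whatever signalling layer you build. A purely illustrative sketch (all class and method names are invented; persisting the record is left as a comment):

```java
import java.time.Instant;
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.CopyOnWriteArrayList;

/** Hypothetical call-record bookkeeping kept by the signalling server. */
public class CallRegistry {

    public static class CallRecord {
        public final String roomId;
        public final List<String> participants = new CopyOnWriteArrayList<>();
        public final Instant startedAt = Instant.now();
        public volatile Instant endedAt;

        CallRecord(String roomId) { this.roomId = roomId; }
    }

    private final Map<String, CallRecord> activeCalls = new ConcurrentHashMap<>();

    /** Call this when a peer sends its "join" signalling message. */
    public void onJoin(String roomId, String userId) {
        activeCalls.computeIfAbsent(roomId, CallRecord::new).participants.add(userId);
    }

    /** Call this when the last peer leaves; persist the record to your database here. */
    public void onRoomClosed(String roomId) {
        CallRecord record = activeCalls.remove(roomId);
        if (record != null) {
            record.endedAt = Instant.now();
            // e.g. save(record);
        }
    }
}
```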
Check out http://www.pjsip.org/
PJSIP has the ability to do both audio and video signaling over SIP and works great on embedded devices as well as desktop environments.
IMHO WebRTC isn't mature enough for large-scale deployments. Maybe in another 12-18 months, but today it's still too fragile. If you want consistent day-after-day performance and stability, I suggest speaking the same language as the telcos: SIP and G.711.
I want to put a video in an application that I wrote. The problem is that the video file is big, so what can I do?
I can't upload the video to YouTube because I don't want anyone to see it except people who download the app.
Is there any way or solution to do that?
The APK file has a size limit of 50 MB. I think you have a couple of options.
Put the video on your server and, on launch of the app, download it from the internet and store it on the device. Some games work like that, downloading content from private servers or clouds once installed (a rough sketch is below this list).
I am not sure if this can be used, but try creating an expansion file, uploading that to the Play Store, and letting your app download it from there. Expansion Files
Since you don't want non-app users to see the video, consider using some DRM so that once the video is downloaded, it cannot be shared by the users.
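For the first option, a rough sketch of the download step, assuming a plain HTTP endpoint on your own server (the URL and file name are placeholders):

```java
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

/** Downloads the video once, on first launch, into the app's private storage. */
public class VideoDownloader {

    // Hypothetical URL; in practice this points at your own server.
    private static final String VIDEO_URL = "https://example.com/videos/intro.mp4";

    /** Run this off the main thread (e.g. in an executor or a WorkManager worker). */
    public static File download(File filesDir) throws IOException {
        File target = new File(filesDir, "intro.mp4");
        if (target.exists()) return target; // already downloaded

        HttpURLConnection connection =
                (HttpURLConnection) new URL(VIDEO_URL).openConnection();
        try (InputStream in = connection.getInputStream();
             OutputStream out = new FileOutputStream(target)) {
            byte[] buffer = new byte[8192];
            int read;
            while ((read = in.read(buffer)) != -1) {
                out.write(buffer, 0, read);
            }
        } finally {
            connection.disconnect();
        }
        return target;
    }
}
```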
Upload the video to your server and display it in your application by playing it from its URL. See this answer here. A minimal sketch is below.
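A minimal sketch of playing a remote URL with VideoView (the URL, layout, and view id are placeholders; ExoPlayer would work just as well):

```java
import android.net.Uri;
import android.os.Bundle;
import android.widget.MediaController;
import android.widget.VideoView;

import androidx.appcompat.app.AppCompatActivity;

public class PlayerActivity extends AppCompatActivity {

    // Hypothetical URL; replace with the video URL served by your backend.
    private static final String VIDEO_URL = "https://example.com/videos/intro.mp4";

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_player); // layout assumed to contain a VideoView

        VideoView videoView = findViewById(R.id.video_view);
        MediaController controller = new MediaController(this);
        controller.setAnchorView(videoView);

        videoView.setMediaController(controller);
        videoView.setVideoURI(Uri.parse(VIDEO_URL)); // progressive download over HTTP
        videoView.setOnPreparedListener(mp -> videoView.start());
    }
}
```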
Use APK expansion files for external files:
http://developer.android.com/google/play/expansion-files.html
I'm currently developing an Android app and need to be able to render a waveform of an audio file. I already read about the Visualizer class, but I think that class is designed for real-time calculation. What I want to do is go through an audio file and generate, for example, a .PNG that contains a waveform of the whole audio file.
Basically something like soundcloud does:
http://www.djban.com.br/wp-content/uploads/soundcloud-waveform.png
Are there any libraries that can do this? Or can you tell me how I can get an array of amplitudes of an audio file?
I don't know of any library, but it should be easy to get this working.
First, in order to get the amplitude of the waveform, you will need the audio in PCM format.
Assuming that you need the waveform for arbitrary audio files (MP3 in particular), you will need to find a way to convert between formats. A quick Google search gives this result.
NOTE: if you just need to do it for audio coming from the microphone, you can avoid the conversion, since the microphone delivers audio data in PCM format already.
After getting the samples, you will need to draw their amplitudes somehow. You have several options, but I would give the Canvas a try. You can draw lines (or general shapes) there and then export the result to a file or draw it on the screen (a rough sketch is at the end of this answer).
Another option for drawing is OpenGL. This is more complicated if you haven't used it before, but it will probably be faster.
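For the Canvas route, here is a rough sketch that turns an array of 16-bit PCM samples into a PNG by taking the peak amplitude per pixel column (how you decode the file to PCM is up to the conversion step mentioned above; dimensions and output path are up to you):

```java
import android.graphics.Bitmap;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Paint;

import java.io.FileOutputStream;
import java.io.IOException;

/** Renders a simple amplitude waveform of decoded 16-bit PCM samples into a PNG. */
public class WaveformRenderer {

    public static void renderToPng(short[] pcm, int width, int height, String outputPath)
            throws IOException {
        Bitmap bitmap = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
        Canvas canvas = new Canvas(bitmap);
        canvas.drawColor(Color.WHITE);

        Paint paint = new Paint();
        paint.setColor(Color.rgb(255, 85, 0));
        paint.setStrokeWidth(1f);

        int samplesPerColumn = Math.max(1, pcm.length / width);
        float centerY = height / 2f;

        for (int x = 0; x < width; x++) {
            // Peak amplitude within this column's bucket of samples.
            int start = x * samplesPerColumn;
            int end = Math.min(pcm.length, start + samplesPerColumn);
            int peak = 0;
            for (int i = start; i < end; i++) {
                peak = Math.max(peak, Math.abs(pcm[i]));
            }
            float halfLine = (peak / 32768f) * centerY;
            canvas.drawLine(x, centerY - halfLine, x, centerY + halfLine, paint);
        }

        try (FileOutputStream out = new FileOutputStream(outputPath)) {
            bitmap.compress(Bitmap.CompressFormat.PNG, 100, out);
        }
    }
}
```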