I have been searching a lot about this and have found nothing:
I am trying to play a video from Firebase Storage, see its progress on the player as it loads, and seek backward and forward (the things any player does while streaming a video).
The problem:
The Firebase team says it is not possible to stream a video from Cloud Storage (it is not supported).
Even though I was able to do this:
String url = "my_url_at_firebase_storage";
videoView.setVideoURI(Uri.parse(url));
videoView.start();
and I was able to load the video from Firebase Storage into a VideoView.
I checked:
I checked this link, which has an answer saying you have to transcode the video into chunks, save the chunks to Firebase Storage, and then load them:
But I am lost here:
1) What are chunks of video?
2) How would you stream these chunks if Firebase doesn't support streaming?
My question:
As this topic is rarely documented and the link above doesn't provide enough info about how to achieve it:
I ask:
If Firebase doesn't support streaming, how come we are able to load a video directly into a VideoView?
And why did the same thing not work when I tried it with ExoPlayer?
Thanks for your efforts.
"Transcoding the video into chunks" means dividing it into multiple small pieces (separate files). Those parts are then uploaded to Firebase Cloud Storage.
Once you divided the video into those pieces, you can download them. Since Firebase does not support streaming, you have to download each chunk entirely before playing, but the trick is that you only have to download that chunk, not the entire video.
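For illustration, here is a minimal Kotlin sketch of what downloading one of those pieces could look like, assuming a hypothetical layout where the chunks were uploaded under a "chunks/" folder with names like part000.mp4, part001.mp4, and so on:

import com.google.firebase.storage.FirebaseStorage
import java.io.File

// Hypothetical layout: chunks/part000.mp4, chunks/part001.mp4, ...
fun downloadChunk(index: Int, onReady: (File) -> Unit) {
    val chunkName = "part%03d.mp4".format(index)
    val chunkRef = FirebaseStorage.getInstance()
        .reference
        .child("chunks/$chunkName")
    // Each piece must be downloaded completely before it can be played,
    // but it is only a few seconds of video instead of the whole file.
    val localFile = File.createTempFile("chunk$index", ".mp4")
    chunkRef.getFile(localFile)
        .addOnSuccessListener { onReady(localFile) }
        .addOnFailureListener { it.printStackTrace() }
}

While one chunk is playing you can already start downloading the next one, which is what gives the impression of streaming.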
Does that answer your question?
Related
I already found a similar question here, but that was asked almost 2 years ago. Currently, I am working on a video streaming app for Android and I was wondering whether using Firebase to host the videos is a viable option.
I tried using the URL of the video stored in Firebase storage to play the video in my app and it is working just fine.
But I searched online and found that everybody advises against using Firebase for this. Is there any particular reason why Firebase shouldn't be used for this purpose?
P.S. In my case, all the videos that need to be streamed will be in HD and relatively long.
Any help will be appreciated. Thanks in advance.
Firebase Storage does not place restrictions on the type of files you can upload to it. So you can upload video files with no problem.
However, if you are expecting to be able to stream the video out in different formats for different types of clients, you might be disappointed. You should think of Firebase Storage mostly as a general file storage solution, not a video streaming solution.
Also, considering Firebase usage costs, you will probably have to pay a lot of money as your user base grows.
read here
and here
In order to stream videos, it's better to use a streaming protocol like DASH or HLS, as the player will load the next video chunk adaptively according to network speed, etc.
If you have any thoughts of supporting iOS as well, you'd want to choose HLS, which is supported on both Android and iOS.
There's a way to host HLS videos for streaming on Firebase Cloud Storage, as I wrote in this answer:
Firebase Storage Video Streaming
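As a rough sketch of the playback side, assuming ExoPlayer 2.x with the exoplayer-hls module on the classpath, and with hlsUrl as a placeholder for the download URL of the master .m3u8 playlist you host on Cloud Storage:

import android.content.Context
import android.net.Uri
import com.google.android.exoplayer2.ExoPlayer
import com.google.android.exoplayer2.MediaItem
import com.google.android.exoplayer2.util.MimeTypes

fun playHls(context: Context, hlsUrl: String): ExoPlayer {
    val player = ExoPlayer.Builder(context).build()
    val item = MediaItem.Builder()
        .setUri(Uri.parse(hlsUrl))
        // Explicit MIME type so the HLS media source is chosen even when the
        // URL has no .m3u8 extension (Firebase download URLs often don't)
        .setMimeType(MimeTypes.APPLICATION_M3U8)
        .build()
    player.setMediaItem(item)
    player.prepare()
    player.playWhenReady = true
    return player
}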
Nope, the Firebase developers haven't released anything related to video streaming support. I'm actually using the storage the same way you are. First I download the video and save it to a local file:
import android.util.Log
import java.io.File
import java.io.IOException

private fun downLoadVideosLocal(firebaseStorageUrl: String): String {
    // Get a storage reference to the video from its Firebase Storage URL
    val videoRef = storage.getReferenceFromUrl(firebaseStorageUrl)
    // Download the video to a local temp file
    try {
        val localFile = File.createTempFile("videos", ".mp4")
        videoRef.getFile(localFile).addOnSuccessListener {
            // Local temp file has been fully written
            Log.v("VideoDownloaded", "The video was downloaded")
        }.addOnFailureListener {
            // Handle any error
            Log.v("VideoDownloaded", "File Failure")
        }
        // Note: getFile() is asynchronous, so the download may still be
        // running when this path is returned
        return localFile.absolutePath
    } catch (e: IOException) {
        e.printStackTrace()
        return ""
    }
}
Then I play it in a VideoView, parsing the absolute path into a Uri so that I can use the setVideoURI() method, and that's it.
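For completeness, the playback part could look roughly like this; since getFile() is asynchronous, in practice you would only call it once the temp file is fully written, e.g. from the addOnSuccessListener above:

import android.net.Uri
import android.widget.VideoView
import java.io.File

fun playLocalFile(videoView: VideoView, path: String) {
    // path is whatever downLoadVideosLocal() returned
    videoView.setVideoURI(Uri.fromFile(File(path)))
    videoView.start()
}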
What I'm going to do next is create a video streaming server on Linux to avoid the high download costs, because I might end up serving many videos to 200 or more clients. I hope the Firebase or Google developers implement a video streaming server in the future. You can also use the YouTube API, but what if you don't want your videos to be seen by other people? XD
Yes, video streaming from Firebase is easy and possible.
Depending on your use case, protocols like HLS or DASH can be used, but this is not strictly necessary.
See my answer from the other question here.
I have been searching all around the web to try to understand how to stream videos to my Android app.
I learned:
That the videos must be in the HLS or MPEG-DASH format for adaptive streaming.
What I mean by adaptive streaming:
The kind of streaming that allows the user to change the quality while the video is streaming.
What could help me do this:
The first thing is to encode my videos into HLS or MPEG-DASH; for that I found a service that can do this, Bitmovin.
The second thing is to play the adaptive video; I found two ways: ExoPlayer and the Bitmovin player.
The problem:
I made an account on Bitmovin and tried to encode a test mp4 video, but there are a lot of concepts there, like input, output and manifest, and I don't know which URL I should pass to the Bitmovin player. I encoded the video but I don't know where the reference to the encoded video is.
My question:
1) Is my approach of streaming videos correct?
2) Can someone explain which URL I must pass to the player, or where I can find the video that was encoded in the Bitmovin cloud?
1) Is my approach of streaming videos correct?
Short answer: Yes :) Adaptive streaming is used by almost every major VoD platform out there and is a proper way to do it. Further, it allows you and your viewers to either:
let the player decide on its own to select the optimal quality for the given connection and device of the viewer, providing continuous playback,
or let the viewer select a specific quality on their own if they want to (a small sketch of both modes follows below).
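A minimal sketch of both modes, assuming ExoPlayer 2.x (buildAdaptivePlayer and manifestUrl are placeholder names):

import android.content.Context
import com.google.android.exoplayer2.ExoPlayer
import com.google.android.exoplayer2.MediaItem
import com.google.android.exoplayer2.trackselection.DefaultTrackSelector

fun buildAdaptivePlayer(context: Context, manifestUrl: String): ExoPlayer {
    // By default the track selector picks the best quality for the
    // current bandwidth and device (continuous playback).
    val trackSelector = DefaultTrackSelector(context)
    // If the viewer wants to override that, you can constrain the
    // automatic selection, e.g. cap it at SD:
    trackSelector.setParameters(
        trackSelector.buildUponParameters().setMaxVideoSizeSd()
    )
    val player = ExoPlayer.Builder(context)
        .setTrackSelector(trackSelector)
        .build()
    player.setMediaItem(MediaItem.fromUri(manifestUrl))
    player.prepare()
    return player
}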
When creating adaptive streaming content using MPEG-DASH and/or HLS as the streaming format, your output would typically consist of the following:
Video/Audio Segments
MPD Manifest, and/or HLS playlists
1) Your input file (e.g. an mp4 file) is downloaded and split into segments, which are processed by the Bitmovin encoding. From these, it creates the different qualities the player or your viewer can choose from later.
2) This segmented output is then transferred back to your own storage, e.g. a cloud storage service like AWS Simple Storage Service (S3) or Google Cloud Storage. Other output types like (S)FTP and many others can be used as well.
3) In order to play your created MPEG-DASH or HLS content, an MPD manifest and/or HLS playlist needs to be created. These are basically an index for the player, telling it which qualities are available and where to find them to start playback.
2) Can someone explain which url I must pass to the player or where I can find the video that was encoded in Bitmovin cloud?
The URL you provide to the player has to point to the MPD manifest and/or the HLS master playlist that gets transferred to your storage. Bitmovin doesn't offer a hosting service for your encoded content, which is why you didn't find a URL to the manifest. So you first need a storage location the encoding output can be transferred to.
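For example (a sketch only, shown here with ExoPlayer 2.x for testing; the Bitmovin player takes the same manifest URL as its source, and manifestUrl is a placeholder for wherever the output ended up on your storage):

import android.content.Context
import com.google.android.exoplayer2.ExoPlayer
import com.google.android.exoplayer2.MediaItem
import com.google.android.exoplayer2.util.MimeTypes

fun playManifest(context: Context, manifestUrl: String): ExoPlayer {
    val player = ExoPlayer.Builder(context).build()
    val item = MediaItem.Builder()
        .setUri(manifestUrl)
        // Use APPLICATION_M3U8 instead if you point at the HLS master playlist
        .setMimeType(MimeTypes.APPLICATION_MPD)
        .build()
    player.setMediaItem(item)
    player.prepare()
    player.playWhenReady = true
    return player
}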
Give the getting-started guide a try: select your preferred API client and you will be guided step by step through integrating the encoding service. I hope this helps :)
To test playback, you can also have a look at https://bitmovin.com/demos/stream-test and select "Use our defaults", which provides URLs to sample content for testing the player and playback.
I have an app which uses an Android WebView, and I am able to stream video in the WebView. I want to know how I can allow users to automatically save the videos they stream in the WebView to their internal storage. Thanks in advance.
Most videos streamed to a browser (PC or mobile) will not allow download, as the owner of the video may not want to allow it to be saved and copied, etc.
Simple HTTP progressive download may allow you to save the video in a regular browser simply by right-clicking on it.
DRM-protected video will generally use a video pipeline (a sort of 'path' through your device) that does not allow an application of any kind to access the raw video; this again is to avoid people making unauthorised copies.
If you think the video in your case is allowed to be copied, then you may find it easier to write some functionality that acts as the streaming client and saves the file locally. You could also pass the received stream along to the media player or to a WebView if you wanted to play the file in parallel with saving it.
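A minimal sketch of that "act as the streaming client yourself" idea, assuming a plain progressive-download URL (videoUrl and outFile are placeholders, and this should run off the main thread):

import java.io.File
import java.io.FileOutputStream
import java.net.HttpURLConnection
import java.net.URL

fun saveProgressiveVideo(videoUrl: String, outFile: File) {
    val connection = URL(videoUrl).openConnection() as HttpURLConnection
    connection.connect()
    connection.inputStream.use { input ->
        FileOutputStream(outFile).use { output ->
            // Copy the received stream to disk; the same bytes could also be
            // fed to a player in parallel if you want playback while saving.
            input.copyTo(output)
        }
    }
    connection.disconnect()
}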
I want to prefetch/predownload some of the data of a video stored on my server.
Right now I am streaming the video from the server; it needs some time to buffer before it starts playing. So I want to prefetch some of the video data so that when the user clicks on it, the video plays without taking too much time.
How can I achieve this? Can I store some data in a database, or download the file while setting the list of videos in a ListView?
Please help with this problem; I don't understand what to do.
I have used ExoPlayer, which has this feature and many more. A demo repository is also available.
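A sketch of the caching side, assuming ExoPlayer 2.x: a shared SimpleCache lets the player reuse bytes that were already fetched, so a video that was prefetched (or partially watched) starts with much less buffering. VideoCache and buildCachingPlayer are placeholder names:

import android.content.Context
import com.google.android.exoplayer2.ExoPlayer
import com.google.android.exoplayer2.source.DefaultMediaSourceFactory
import com.google.android.exoplayer2.upstream.DefaultHttpDataSource
import com.google.android.exoplayer2.upstream.cache.CacheDataSource
import com.google.android.exoplayer2.upstream.cache.LeastRecentlyUsedCacheEvictor
import com.google.android.exoplayer2.upstream.cache.SimpleCache
import java.io.File

object VideoCache {
    private var cache: SimpleCache? = null
    // One SimpleCache instance per directory for the whole app
    fun get(context: Context): SimpleCache =
        cache ?: SimpleCache(
            File(context.cacheDir, "video_cache"),
            LeastRecentlyUsedCacheEvictor(100L * 1024 * 1024) // ~100 MB
        ).also { cache = it }
}

fun buildCachingPlayer(context: Context): ExoPlayer {
    // Reads hit the cache first and fall back to the network upstream
    val cacheFactory = CacheDataSource.Factory()
        .setCache(VideoCache.get(context))
        .setUpstreamDataSourceFactory(DefaultHttpDataSource.Factory())
    return ExoPlayer.Builder(context)
        .setMediaSourceFactory(DefaultMediaSourceFactory(cacheFactory))
        .build()
}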
I have an app that downloads a video over FTP and then saves it on the SD card in encrypted form. When the user wants to watch a video, it is decrypted and then shown, but my problem is that decryption takes a long time. Is there any way to play the video like a live stream while it is still being decrypted?
To implement your streaming scheme, you need two main components: a streaming server, such as a local HTTP instance, and a javax.crypto.CipherInputStream. LocalSingleHttpServer is an example of that kind of implementation.
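A sketch of the decryption half only (the key and IV handling here are placeholders for however you stored them): wrapping the encrypted file in a CipherInputStream decrypts bytes as they are read, instead of decrypting the whole file up front, and that stream is what you would serve to the player through the local HTTP server.

import java.io.FileInputStream
import javax.crypto.Cipher
import javax.crypto.CipherInputStream
import javax.crypto.spec.IvParameterSpec
import javax.crypto.spec.SecretKeySpec

fun openDecryptingStream(path: String, keyBytes: ByteArray, ivBytes: ByteArray): CipherInputStream {
    // Cipher/mode must match whatever was used to encrypt the file
    val cipher = Cipher.getInstance("AES/CBC/PKCS5Padding")
    cipher.init(Cipher.DECRYPT_MODE, SecretKeySpec(keyBytes, "AES"), IvParameterSpec(ivBytes))
    // Bytes are decrypted lazily as the player (or local server) reads them
    return CipherInputStream(FileInputStream(path), cipher)
}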