Stream videos using Firebase in Android

I already found a similar question here, but that was asked almost 2 years ago. Currently, I am working on a video streaming app for Android and I was wondering whether using Firebase to host the videos is a viable option.
I tried using the URL of the video stored in Firebase storage to play the video in my app and it is working just fine.
But I searched online and found that everybody advises against using Firebase for this. Is there any particular reason why Firebase shouldn't be used for this purpose?
P.S. In my case, all the videos that need to be streamed will be in HD and will be fairly long.
Any help will be appreciated. Thanks in advance.

Firebase Storage does not place restrictions on the type of files you can upload to it. So you can upload video files with no problem.
However, if you are expecting to be able to stream the video out in different formats for different types of clients, you might be disappointed. You should think of Firebase Storage mostly as a general file storage solution, not a video streaming solution.
Also, given Firebase's usage-based pricing, you will probably have to pay a lot of money as your user base grows.
read here
and here
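For what it's worth, here is a minimal sketch of that "general file storage" workflow (the storage path "videos/my_video.mp4" is just a placeholder): upload the file and ask for its download URL, which you can then hand to a player:

    import android.net.Uri
    import com.google.firebase.storage.FirebaseStorage

    // Hypothetical path; adjust to your own bucket layout.
    fun uploadAndGetUrl(localVideo: Uri, onUrlReady: (Uri) -> Unit) {
        val videoRef = FirebaseStorage.getInstance()
            .reference.child("videos/my_video.mp4")

        videoRef.putFile(localVideo)
            .continueWithTask { task ->
                if (!task.isSuccessful) {
                    task.exception?.let { throw it }
                }
                // Once the upload succeeds, ask for the public download URL
                videoRef.downloadUrl
            }
            .addOnSuccessListener { url -> onUrlReady(url) }
            .addOnFailureListener { e -> e.printStackTrace() }
    }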

In order to stream videos, it's better to use a streaming protocol like DASH or HLS, as the player will then load the next video chunk adaptively according to network speed, etc.
If you have any thoughts of supporting iOS as well, you'd want to choose HLS which is supported on Android and iOS.
There's a way to host HLS videos for streaming on Firebase Cloud Storage, as I wrote in this answer:
Firebase Storage Video Streaming
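For illustration, a minimal sketch of the playback side with ExoPlayer (assuming the ExoPlayer 2.12+ dependency is on the classpath; the playlist URL is a placeholder for wherever your master .m3u8 ends up being hosted):

    import android.content.Context
    import com.google.android.exoplayer2.MediaItem
    import com.google.android.exoplayer2.SimpleExoPlayer

    // playlistUrl is a hypothetical URL to the master .m3u8 you uploaded
    // alongside the segments.
    fun playHls(context: Context, playlistUrl: String): SimpleExoPlayer {
        val player = SimpleExoPlayer.Builder(context).build()
        // ExoPlayer infers HLS from the .m3u8 extension and handles the
        // adaptive quality switching itself.
        player.setMediaItem(MediaItem.fromUri(playlistUrl))
        player.prepare()
        player.play()
        return player
    }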

Nope, the Firebase developers haven't released anything related to video streaming support. I'm actually using the storage the same way you are. First I download the video and save it to a local file:
private fun downLoadVideosLocal(firebaseStorageUrl: String): String {
    // Get a reference to the file in Firebase Storage from its URL
    val videoRef = storage.getReferenceFromUrl(firebaseStorageUrl)
    // Download the video to a local temp file
    try {
        val localFile = File.createTempFile("videos", ".mp4")
        videoRef.getFile(localFile).addOnSuccessListener {
            // Local temp file has been created and filled
            Log.v("VideoDownloaded", "The video was downloaded")
        }.addOnFailureListener {
            // Handle any error
            Log.v("VideoDownloaded", "File failure")
        }
        // Note: getFile() is asynchronous, so this path is returned before
        // the download has actually finished
        return localFile.absolutePath
    } catch (e: IOException) {
        e.printStackTrace()
        return ""
    }
}
Then I play it in a VideoView, parsing the absolutePath into a Uri so that I can use the setVideoURI() method, and that's it.
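For reference, the playback step in that approach is only a couple of lines; this is a sketch that assumes the returned path points at a file whose download has already completed:

    import android.net.Uri
    import android.widget.VideoView
    import java.io.File

    // localPath is the absolute path returned by the download function above.
    fun playLocalVideo(videoView: VideoView, localPath: String) {
        videoView.setVideoURI(Uri.fromFile(File(localPath)))
        videoView.start()
    }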
What I'm going to do next is create a video streaming server on Linux to avoid the high cost of downloading the videos, since I might be serving many videos to 200 or more clients. I hope the Firebase or Google developers implement a video streaming server in the future. You could also use the YouTube API, but what if you don't want your videos to be seen by other people? XD

Yes, video streaming from Firebase is easy and possible.
Depending on your use case, protocols like HLS or DASH can be used, but this is not strictly necessary.
See my answer from the other question here.

Related

Video fetched from Azure Blob Storage is not playing

I am creating an application in which I upload a video from mobile through the browser.
It is stored in blob storage, but when I try to fetch it and play it in the Android media player there is an error in Logcat saying "can't open the file". The same happens if I upload a video from a laptop that was recorded on mobile.
Why is this happening? Does the codec format of the mobile video play a vital role in it? If yes, what should I do?
Thank you in advance.
There are some things we need to check when streaming video from Azure Storage:
1. Check your upload tools.
This includes both the tools and the settings, such as bit rate. Sometimes the issue comes from half-baked tools or the transfer settings.
2. Check your Blob Type.
Make sure your videos are BlockBlobs. Check the header x-ms-blob-type (a quick way to check this header is sketched at the end of this answer). There are BlockBlobs and PageBlobs … but for streaming video you want BlockBlobs.
3. Check your storage version. There is no problem if you use a general-purpose v2 account, which is the default.
Azure Storage offers several types of storage accounts. Each type supports different features and has its own pricing model. Consider these differences before you create a storage account to determine the type of account that is best for your applications.
4. Check your video codec format and size.
Storage clients default to a 128 MiB maximum for a single blob upload; you can see the details under block blobs. For the input video codecs supported, you can refer to the docs.
Reference: this blog post by Tom.
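As a rough way to verify the blob type from point 2 (a sketch only; the blob URL is a placeholder, and the request must be allowed anonymous read access or carry a SAS token):

    import java.net.HttpURLConnection
    import java.net.URL

    // Issues a HEAD request and prints the x-ms-blob-type header,
    // which should be "BlockBlob" for streamable video.
    // On Android, run this off the main thread.
    fun printBlobType(blobUrl: String) {
        val connection = URL(blobUrl).openConnection() as HttpURLConnection
        connection.requestMethod = "HEAD"
        try {
            println("x-ms-blob-type: " + connection.getHeaderField("x-ms-blob-type"))
        } finally {
            connection.disconnect()
        }
    }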

Android: Bitmovin - I can't find the encoded video in the Bitmovin console?

I have been searching all around the web to try to understand how to stream videos to my android app.
I learned:
That the videos must be in the HLS or MPEG-DASH format for adaptive streaming.
What I mean by adaptive streaming:
The kind of streaming that allows the user to change the quality while the video is playing.
What could help me do this:
The first thing should be to encode my videos into HLS or MPEG-DASH; for that I found a service that can do this, which is Bitmovin.
The second thing is to play the adaptive video; I found two ways: ExoPlayer and the Bitmovin player.
The problem:
I made an account on Bitmovin and tried to encode a test mp4 video, but there is too much stuff there, like input, output and manifest, and I don't know what URL I should pass to the Bitmovin player. I encoded the video but I don't know where the reference to the encoded video is.
My question:
1) Is my approach of streaming videos correct?
2) Can someone explain which url I must pass to the player or where I can find the video that was encoded in Bitmovin cloud?
1) Is my approach of streaming videos correct?
Short answer: Yes :) Adaptive streaming is used by almost every major VoD platform out there, and it is a proper way to do that. Further, it allows you and your viewers to either
let the player decide on its own which quality is optimal for the viewer's connection and device, to provide continuous playback,
or let the viewer select a specific quality on their own, if they want to.
When creating adaptive streaming content using MPEG-DASH and/or HLS as streaming format, your output would typically consist of the following:
Video/Audio Segments
MPD Manifest, and/or HLS playlists
1) Your input file (e.g. an mp4 file) will be downloaded and split into segments, which are processed by the Bitmovin encoding. Out of these, it creates the different qualities the player or your viewer can choose from later.
2) This segmented output is then transferred back to your own storage, e.g. a cloud storage like AWS Simple Storage, or Google Cloud Storage. Other output types like (S)FTP and many others can be used as well.
3) In order to play your created MPEG-DASH or HLS content, an MPD manifest and/or HLS playlist needs to be created. These are basically an index for the player, telling it which qualities are available and where to find them to start playback.
2) Can someone explain which url I must pass to the player or where I can find the video that was encoded in Bitmovin cloud?
The URL you have to provide to the player has to point to the MPD and/or HLS master playlist that gets transferred to your storage. Bitmovin doesn't offer a hosting service for your encoded content, which is why you didn't find a URL to the manifest. So you would need a storage first, to which the encoding output can be transferred.
Give the getting started guide a try. Select your preferred API client. Then you will be guided step by step on how to integrate this encoding service. I hope this helps :)
To test playback, you can also have a look at https://bitmovin.com/demos/stream-test and select "Use our defaults", which provides URLs to sample content for testing the player and playback.
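For the player side, here is a rough sketch using ExoPlayer rather than the Bitmovin player (it assumes the ExoPlayer 2.12+ core and DASH modules are on the classpath, and the manifest URL is a placeholder for wherever your storage ends up hosting the MPD):

    import android.content.Context
    import com.google.android.exoplayer2.MediaItem
    import com.google.android.exoplayer2.SimpleExoPlayer
    import com.google.android.exoplayer2.util.MimeTypes

    // manifestUrl is a hypothetical location on your own storage / CDN.
    fun playDashManifest(context: Context, manifestUrl: String): SimpleExoPlayer {
        val player = SimpleExoPlayer.Builder(context).build()
        // Declaring the MIME type lets ExoPlayer pick the DASH media source
        // even if the URL has no .mpd extension.
        val item = MediaItem.Builder()
            .setUri(manifestUrl)
            .setMimeType(MimeTypes.APPLICATION_MPD)
            .build()
        player.setMediaItem(item)
        player.prepare()
        player.play()
        return player
    }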

Stream mp3s From Firebase Storage

Issue
I am building a backend service that creates mp3s for an Android client to stream via ExoPlayer.
Looking at Firebase's Storage pricing in the long term, if there are 10 GB stored and 10,000 users, there would be 100,000 GB to transfer, which is very expensive ($11,600).
What would be the best solution to stream mp3s on the cloud in order to avoid data transfer fees?
Possible Solutions
Use ExoPlayer to stream mp3s directly from the cloud without downloading.
Use a separate API to download the 10 GB from Cloud Storage one time, and stream the mp3s to the mobile client from that separate API.
Possible solution #1 is the best solution: Use ExoPlayer to stream mp3s directly from the cloud without downloading.
Thank you, Oliver Woodman on the ExoPlayer team, for resolving this question on GitHub!
If an mp3 file is saved to the cloud, i.e. Firebase Storage / Google Cloud Storage, can the file be streamed by ExoPlayer without needing to download the full file?
Yes. That's just what happens by default when you use ExoPlayer to play a stream.
If the mp3 can be streamed directly from Cloud Storage, roughly what percentage of the file is held in memory during the transfer of the stream, since the file itself is not being downloaded?
You can configure this by instantiating your own DefaultLoadControl, which you can pass to ExoPlayerFactory.newSimpleInstance when building the player. You can also implement your own LoadControl from scratch if you need more control.
Note that whilst buffering less far ahead saves on data transfer costs, it also makes re-buffers more likely to occur because the player will be less able to ride out temporary network connectivity issues.
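To make that concrete, here is a sketch of wiring a custom DefaultLoadControl into the player using the ExoPlayer 2.12+ builder API; the buffer durations and the mp3 URL are arbitrary example values, not recommendations:

    import android.content.Context
    import com.google.android.exoplayer2.DefaultLoadControl
    import com.google.android.exoplayer2.MediaItem
    import com.google.android.exoplayer2.SimpleExoPlayer

    fun buildMp3Player(context: Context, mp3Url: String): SimpleExoPlayer {
        // Buffer between 15s and 30s ahead, start playback after 1.5s of buffer,
        // and require 3s after a rebuffer. Smaller values cut data transfer but
        // make rebuffers more likely, as noted above.
        val loadControl = DefaultLoadControl.Builder()
            .setBufferDurationsMs(
                /* minBufferMs = */ 15_000,
                /* maxBufferMs = */ 30_000,
                /* bufferForPlaybackMs = */ 1_500,
                /* bufferForPlaybackAfterRebufferMs = */ 3_000
            )
            .build()

        val player = SimpleExoPlayer.Builder(context)
            .setLoadControl(loadControl)
            .build()
        // The mp3's Firebase Storage download URL is streamed progressively;
        // nothing is written to disk.
        player.setMediaItem(MediaItem.fromUri(mp3Url))
        player.prepare()
        return player
    }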

Firebase Storage: Play back a video URL directly?

I have been searching a lot about this by now and I got nothing:
I am trying to play a video from Firebase Storage, see its progress on the player as it loads, and be able to seek backward and forward (stuff that any player does while streaming a video).
The problem:
The Firebase team says that it is not possible to stream a video from Cloud Storage (it is not supported).
Even though I was able to do this:
String url = "my_url_at_firebase_storage";
video_View.setVideoURI(Uri.parse(url));
video_View.start();
and I was able to load the video from firebase storage into a video view.
I checked:
I checked this link, which has an answer saying you have to transcode the video into chunks, save the chunks to Firebase Storage, and then load them:
But I am lost here:
1) What are chunks of video?
2) How would you stream these chunks if Firebase doesn't support streaming?
My question:
As this topic is rarely documented and the link above doesn't provide enough info about how to achieve it, I ask:
If Firebase doesn't support streaming, how come we are able to load a video directly into a VideoView?
And why did the same thing not work when I tried it with ExoPlayer?
Thanks for your efforts.
"Transcoding the video into chunks" means dividing it into multiple small pieces (separate files). Those parts are then uploaded to Firebase Cloud Storage.
Once you have divided the video into those pieces, you can download them. Since Firebase does not support streaming, you have to download each chunk entirely before playing it, but the trick is that you only have to download that chunk, not the entire video.
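A rough sketch of that idea, assuming the chunks were uploaded under a made-up naming scheme like videos/my_video/chunk_000.mp4:

    import com.google.firebase.storage.FirebaseStorage
    import java.io.File

    // Downloads a single chunk on demand; call again with the next index
    // when the current chunk is close to finishing playback.
    fun downloadChunk(index: Int, onReady: (File) -> Unit) {
        val chunkName = "chunk_%03d.mp4".format(index)
        val chunkRef = FirebaseStorage.getInstance()
            .reference.child("videos/my_video/$chunkName")

        val localFile = File.createTempFile("chunk_$index", ".mp4")
        chunkRef.getFile(localFile)
            .addOnSuccessListener { onReady(localFile) }
            .addOnFailureListener { it.printStackTrace() }
    }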
Does that answer your question?

In Android, is it possible to save video directly to an Amazon server while recording?

When uploading video/audio files from my application, I mostly follow these steps:
Record the file (video/audio)
Save into Internal/External Storage (inside application folder)
Then upload to the server.
My question is whether it's possible to save directly to the server (Amazon S3 or others). I'm asking this because while using the Periscope application, they stream the video and also store it on their server.
Check out libstreaming: https://github.com/fyhertz/libstreaming or some of the suggestions in: Streaming video from Android camera to server
You can look at this GitHub project or use FFmpeg.
I think it all depends on the importance of your video. When you use your original approach, you guarantee that you have the full video in hand (on the device) and you can make sure it is fully uploaded to your server. On the other hand, streaming it directly to the server can make you lose frames (connectivity hiccups and such) and hurt the video. I'm sure that streaming is done using UDP, which makes losing packets a real possibility.
