AudioTrack and Google Cast / Chromecast - Android

I am trying to add Google Cast / Chromecast support to my audio app, which decodes audio, writes the bytes to AudioTrack, and plays them.
I know that with plain audio files it is very simple: I can just provide the URL for the file and it works. But what about situations like mine? I could not find an example for this. Where can I find more information, or does anyone have experience with this kind of thing (adding Google Cast support to a live audio streaming app)?

The Cast media player needs a media source served over HTTP, so you will need to provide a web server through which the receiver can access your audio data.
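As an illustration of that idea, here is a minimal sketch of an embedded HTTP server running on the phone, using NanoHTTPD (the library choice, port, and the openAudioStream() helper are assumptions, not part of the Cast SDK); the Cast receiver is then given a URL such as http://<device-ip>:8080/audio instead of the raw AudioTrack bytes.
    // Minimal sketch, assuming the NanoHTTPD dependency (org.nanohttpd:nanohttpd).
    // openAudioStream() is a hypothetical helper that returns the encoded audio
    // (e.g. MP3/AAC) as an InputStream, not the decoded PCM written to AudioTrack.
    import fi.iki.elonen.NanoHTTPD
    import java.io.InputStream

    class AudioHttpServer(
        port: Int,
        private val openAudioStream: () -> InputStream
    ) : NanoHTTPD(port) {
        override fun serve(session: IHTTPSession): Response {
            return if (session.uri == "/audio") {
                // Stream the encoded audio to the Cast receiver over HTTP
                newChunkedResponse(Response.Status.OK, "audio/mpeg", openAudioStream())
            } else {
                newFixedLengthResponse(Response.Status.NOT_FOUND, MIME_PLAINTEXT, "Not found")
            }
        }
    }
    // Usage: AudioHttpServer(8080) { /* open encoded audio */ }.start(), then cast
    // "http://<device-ip>:8080/audio"
Note that the server must listen on the device's Wi-Fi address and serve a format the receiver supports (e.g. MP3 or AAC), since the receiver fetches the URL itself over the network.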

Related

Play an encrypted .m3u8 stream in ExoPlayer on Android

I use ExoPlayer in my Android app to play .m3u8 video and audio streams, and it works well. Now I want to add the ability to play videos that are AES-encrypted on the server side. For example, I receive the following parameters from the backend:
    streamUrl: "https://some_stream.m3u8",
    aes_key: "16symbols_key",
    aes_iv: "16symbols_IV",
and I need to decrypt the stream and play it in ExoPlayer. How can I do this?
You might want to look at this issue, which explains how to play AES-128 encrypted HLS.
Also check that your encryption scheme is supported: https://exoplayer.dev/hls.html
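For the common case where the playlist itself declares the key via an #EXT-X-KEY tag, ExoPlayer fetches the key and decrypts AES-128 segments transparently, so no manual decryption code is needed; a minimal sketch, assuming ExoPlayer 2.12+ and the stream URL from the question:
    import android.content.Context
    import com.google.android.exoplayer2.MediaItem
    import com.google.android.exoplayer2.SimpleExoPlayer

    fun playEncryptedHls(context: Context) {
        // ExoPlayer reads #EXT-X-KEY (key URI + IV) from the playlist and decrypts AES-128 itself
        val player = SimpleExoPlayer.Builder(context).build()
        player.setMediaItem(MediaItem.fromUri("https://some_stream.m3u8"))
        player.prepare()
        player.play()
    }
If the key and IV are only delivered out of band, as in your case, one common approach is to intercept the key request yourself (for example with a custom DataSource) and return the key bytes, rather than decrypting segments manually.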

Android: Bitmovin - I can't find the encoded video in the Bitmovin console

I have been searching all around the web to try to understand how to stream videos to my Android app.
I learned:
That the videos must be in the HLS or MPEG-DASH format for adaptive streaming.
What I mean by adaptive streaming:
The kind of streaming that allows the user to change the quality while the video is playing.
What could help me do this:
The first step is to encode my videos into HLS or MPEG-DASH; I found a service that can do this, Bitmovin.
The second step is to play the adaptive video; I found two options, ExoPlayer and the Bitmovin player.
The problem:
I made an account on Bitmovin and tried to encode a test MP4 video, but there are many concepts there, like input, output, and manifest, and I don't know which URL I should pass to the Bitmovin player. I encoded the video, but I don't know where the reference to the encoded video is.
My question:
1) Is my approach to streaming videos correct?
2) Can someone explain which URL I must pass to the player, or where I can find the video that was encoded in the Bitmovin cloud?
1) Is my approach to streaming videos correct?
Short answer: Yes :) Adaptive streaming is used by almost every major VoD platform out there and is the proper way to do this. Furthermore, it allows you and your viewers to either
let the player decide on its own which quality is optimal for the viewer's connection and device, providing continuous playback,
or let the viewer select a specific quality themselves, if they want to.
When creating adaptive streaming content using MPEG-DASH and/or HLS as the streaming format, your output would typically consist of the following:
Video/Audio Segments
MPD Manifest, and/or HLS playlists
1) Your input file (e.g. an MP4 file) is downloaded and split into segments, which are processed by the Bitmovin encoding. From these it creates the different qualities the player or your viewer can choose from later.
2) This segmented output is then transferred back to your own storage, e.g. cloud storage such as AWS Simple Storage Service (S3) or Google Cloud Storage. Other output types like (S)FTP and many others can be used as well.
3) In order to play your MPEG-DASH or HLS content, an MPD manifest and/or HLS playlist needs to be created. These are basically an index for the player, telling it which qualities are available and where to find them so it can start playback.
2) Can someone explain which URL I must pass to the player, or where I can find the video that was encoded in the Bitmovin cloud?
The URL you provide to the player has to point to the MPD manifest and/or HLS master playlist that gets transferred to your storage. Bitmovin doesn't offer a hosting service for your encoded content, which is why you didn't find a URL to the manifest. So you first need a storage destination to which the encoding output can be transferred.
Give the getting-started guide a try: select your preferred API client and you will be guided step by step through integrating the encoding service. I hope this helps :)
To test playback, you can also have a look at https://bitmovin.com/demos/stream-test and select "Use our defaults", which provides URLs to sample content for testing the player and playback.
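To make the last point concrete, once the manifest and segments sit on your own storage or CDN, playing them from Android is just a matter of handing the manifest URL to the player; a minimal ExoPlayer sketch (the manifest URL below is hypothetical, standing in for wherever your output was transferred):
    import android.content.Context
    import com.google.android.exoplayer2.MediaItem
    import com.google.android.exoplayer2.SimpleExoPlayer

    fun playEncodedOutput(context: Context) {
        // Hypothetical location of the transferred Bitmovin output; use the MPD for DASH
        // or the master .m3u8 playlist for HLS
        val manifestUrl = "https://your-cdn.example.com/encodings/my-video/stream.mpd"
        val player = SimpleExoPlayer.Builder(context).build()
        player.setMediaItem(MediaItem.fromUri(manifestUrl))
        player.prepare()
        player.play()
    }
The Bitmovin player works the same way conceptually: it is configured with the manifest URL on your storage, not with anything hosted by the encoding service itself.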

Error in MPEG-DASH with Azure Media Services

I am trying to implement Video on Demand using Azure Media Services on Android.
I have uploaded the video, encoded it to Multi-Bitrate MP4 and then added AES encryption and published it.
I received this MPEG-DASH Url: http://prepladder-inct.streaming.media.azure.net/xxxx-xxxx-xxxx-xxxx-xxxxxxxx/sample-video.ism/manifest(format=mpd-time-csf)
But I am unable to play this video in any MPEG-DASH player (Android or web).
I am even unable to play it in the Azure DASH player: http://dashplayer.azurewebsites.net/
When I enter the stream URL, the player shows the duration of the video but does not play it.
In ExoPlayer on Android, I am able to play all the MPEG-DASH test streams available online.
Also, I am able to play the HLS stream provided by Azure for the same media, both in ExoPlayer and on iOS.
I have not added any token authorization or DRM during video encryption.
Am I missing something? Please help.
The problem is likely that your video file name has unsupported characters. The issue is documented here, where we point out that certain characters should not be used. Try renaming the source video to, say, GM_1st_acid_fast_stain.mp4, upload it to a new Asset, encode, and then create the streaming URL.
For FairPlay/HLS, it is critical to provide the Application Certificate (public key only) to the player. Apple's recommended way is to host this App Cert on a web server and pass its URL to the player.
The SPC/CKC negotiation is performed inside the player.
I would suggest using this test/diagnostic tool, http://aka.ms/amtest, which supports all three DRMs (FairPlay, Widevine, PlayReady) as well as AES-128. If you expand "player_settings" you will see an entry called "FPS AC Path", which is where you paste your App Cert URL. Also put "FairPlay" under protectionInfo.type. Of course, you need to run the test in Safari on macOS.
Hope this helps.
William
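Separately, on the ExoPlayer side, note that the Azure streaming URL has no .mpd extension, so depending on the ExoPlayer version its content-type inference may not recognise it as DASH; declaring the MIME type explicitly sidesteps that. A minimal sketch, assuming ExoPlayer 2.12+ (this only shows how the URL is handed to the player, it does not address the AES/token question):
    import com.google.android.exoplayer2.MediaItem
    import com.google.android.exoplayer2.util.MimeTypes

    // Declare DASH explicitly because the Azure manifest URL has no .mpd extension
    val dashItem = MediaItem.Builder()
        .setUri("http://prepladder-inct.streaming.media.azure.net/xxxx-xxxx-xxxx-xxxx-xxxxxxxx/sample-video.ism/manifest(format=mpd-time-csf)")
        .setMimeType(MimeTypes.APPLICATION_MPD)
        .build()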

Android Chromecast: Casting local audio file fails

I'm trying to cast an audio file stored locally on the device to a Chromecast.
I have a URI, for example content://settings/system/alarm_alert, and when casting it to the Chromecast it simply does not work: RemoteMediaPlayer returns result code 2100, which means FAILED. The same approach with files served from a server, for example http://storage.googleapis.com/automotive-media/Awakening.mp3, returns SUCCESS.
Audio playback is implemented the same way as in the UAMP sample.
Is there any way to cast local audio files? I've heard about setting up a local HTTP server on the Android device, but that sounds a bit complicated. Any ideas?
Using a local embedded server to stream local content is the only way. The Chromecast fetches media itself over HTTP(S), so a content:// URI that exists only on your phone is not reachable from the receiver.
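For the loading side, once an embedded server (e.g. NanoHTTPD, as sketched for the question further above) exposes the file, the cast request looks the same as for remote content. A minimal sketch using the current Cast framework's RemoteMediaClient (the local URL and port are hypothetical):
    import com.google.android.gms.cast.MediaInfo
    import com.google.android.gms.cast.MediaLoadRequestData
    import com.google.android.gms.cast.framework.media.RemoteMediaClient

    fun castLocalAudio(remoteMediaClient: RemoteMediaClient, deviceIp: String) {
        // The URL must use the phone's Wi-Fi address, not localhost, so the receiver can reach it
        val mediaInfo = MediaInfo.Builder("http://$deviceIp:8080/alarm.mp3")
            .setStreamType(MediaInfo.STREAM_TYPE_BUFFERED)
            .setContentType("audio/mpeg")
            .build()
        remoteMediaClient.load(MediaLoadRequestData.Builder().setMediaInfo(mediaInfo).build())
    }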

How to implement a custom intermediate processor to play a DRM-protected video stream?

I'm trying to play a video stream with a specific DRM implementation. I've got specific parameters for the video segments in the HLS playlist.
So I need to write a class (or classes) that reads information from the HLS playlist, decrypts and decompresses the video segments, and passes them on to video decoding. That wouldn't be a big problem.
The problem is that I can't find any way to tell an Android component how to handle this. Both VideoView and MediaPlayer take only the URI of the media and no further information about how to process the playlist.
I'd appreciate any kind of help. It's the biggest problem in the application I'm programming, and I'm wondering whether it is even possible to solve.
HLS doesn't have direct support for DRM, but it does have support for AES-128 CBC encrypted media. I don't know which DRM type you are looking at, but one approach taken by some DRM vendors is to independently access the decryption keys for the encrypted media segments, then use either a custom URL scheme registered by your app or a localhost https proxy to serve the keys. This might require rewriting the HLS variant playlists to point to the appropriate place.
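As one possible shape of the "localhost proxy that serves the keys" idea, here is a minimal sketch using NanoHTTPD (the library, port, and the way the key bytes are obtained are all assumptions, and plain HTTP is used for brevity where the answer mentions HTTPS). The rewritten playlist's #EXT-X-KEY URI would point at this endpoint:
    import fi.iki.elonen.NanoHTTPD
    import java.io.ByteArrayInputStream

    // keyBytes is the 16-byte AES-128 key obtained independently from your DRM backend
    class LocalKeyServer(private val keyBytes: ByteArray) : NanoHTTPD("127.0.0.1", 8089) {
        override fun serve(session: IHTTPSession): Response {
            return if (session.uri == "/key") {
                // e.g. #EXT-X-KEY:METHOD=AES-128,URI="http://127.0.0.1:8089/key",IV=0x...
                newFixedLengthResponse(Response.Status.OK, "application/octet-stream",
                    ByteArrayInputStream(keyBytes), keyBytes.size.toLong())
            } else {
                newFixedLengthResponse(Response.Status.NOT_FOUND, MIME_PLAINTEXT, "Not found")
            }
        }
    }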
