I'm trying to cast a local device audio file to the Chromecast device.
I have a URI, for example content://settings/system/alarm_alert, and when I cast it to the Chromecast it simply does not work: RemoteMediaPlayer returns a result of 2100, which means FAILED. The same approach works with files from a server, for example http://storage.googleapis.com/automotive-media/Awakening.mp3, which returns SUCCESS.
Audio playback is implemented the same way as in the UAMP sample.
Is there any way to cast local audio files? I've heard of the approach of setting up a local server on the Android device, but that sounds a bit complicated. Any ideas?
Using a local embedded HTTP server to stream local content is the only way. The Chromecast cannot resolve a content:// URI, because that URI is only meaningful on the device itself; the receiver needs a URL it can fetch over the network.
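To make the idea concrete, here is a minimal sketch of such an embedded server in plain Java, using the JDK's com.sun.net.httpserver. Note that this class is not part of the Android SDK; on Android you would use a small embedded server library such as NanoHTTPD instead. The /audio path, port, and MIME type are assumptions for illustration:

```java
import com.sun.net.httpserver.HttpServer;
import java.io.IOException;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.file.Files;
import java.nio.file.Path;

// Minimal sketch: serve one local file over HTTP so a Cast receiver on the
// same Wi-Fi network can fetch it. Path, port, and MIME type are placeholders.
public class LocalAudioServer {
    public static HttpServer serve(Path audioFile, int port) throws IOException {
        HttpServer server = HttpServer.create(new InetSocketAddress(port), 0);
        server.createContext("/audio", exchange -> {
            byte[] body = Files.readAllBytes(audioFile);
            // The receiver uses Content-Type to pick a decoder.
            exchange.getResponseHeaders().set("Content-Type", "audio/mpeg");
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(body);
            }
        });
        server.start();
        return server;
    }
}
```

Note that the URL you hand to the Cast receiver must use the device's Wi-Fi IP address, not 127.0.0.1, since the Chromecast fetches the file over the network.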
In my app I implemented Chromecast support for casting local audio files to an Android TV. I searched Google for an answer but didn't find anything that cleared up my doubt.
My doubt is: to cast local audio files, do I need to set up an HTTP server, upload the audio files to it, and then pass the URL to the Cast receiver device? I don't think that can be the right approach.
I heard somewhere that we have to use an embedded HTTP library, but I don't know how to use one and have no idea where to start; it seems a bit complicated.
Any suggestion on how I can cast my local audio?
I used XAMPP to host some video files locally. That way I can just use the URL http://192.168.x.x/myvideo.mp4 (where 192.168.x.x is the IP address XAMPP is running on) to cast my own videos. Audio files should work the same way, I guess.
I am trying to add Google Cast / Chromecast support to my audio app, which decodes audio, writes the bytes to AudioTrack, and plays them.
I know that with audio files it is very simple, as I can just provide the URL for the file and it works; but what about situations like mine? I could not find an example for this. Where can I find more information, or do you have any experience with this kind of thing? (Adding Google Cast support to a live audio streaming app)
The Cast media player needs a media source served over HTTP. You will need to provide a web server to access your audio data.
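For a live source like this, one hedged sketch of the serving side (again in desktop Java with com.sun.net.httpserver, which is not in the Android SDK; use NanoHTTPD or similar on Android) is to encode the decoded audio into a stream format the receiver supports, e.g. MP3 or AAC, since raw PCM headed for AudioTrack will not play on the receiver, and push it as an open-ended HTTP response. The Supplier below is a placeholder for your encoder's output queue:

```java
import com.sun.net.httpserver.HttpServer;
import java.io.IOException;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.util.function.Supplier;

// Sketch: stream encoded audio chunks over HTTP with no fixed length.
// 'source' stands in for a real encoder's output; it returns null at end of stream.
public class LiveAudioHttpServer {
    public static HttpServer serve(Supplier<byte[]> source, int port) throws IOException {
        HttpServer server = HttpServer.create(new InetSocketAddress(port), 0);
        server.createContext("/live", exchange -> {
            exchange.getResponseHeaders().set("Content-Type", "audio/mpeg");
            // A response length of 0 tells HttpServer to use chunked
            // transfer encoding, so the stream can be open-ended.
            exchange.sendResponseHeaders(200, 0);
            try (OutputStream os = exchange.getResponseBody()) {
                byte[] chunk;
                while ((chunk = source.get()) != null) {
                    os.write(chunk);
                    os.flush(); // push each chunk out promptly
                }
            }
        });
        server.start();
        return server;
    }
}
```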
When uploading video/audio files from my application, I mostly follow these steps:
Record the file (video/audio)
Save it to internal/external storage (inside the application folder)
Then upload it to the server.
My question is whether it is possible to save directly to the server (Amazon S3 or others). I ask because the Periscope application streams the video and stores it on their server at the same time.
Check out libstreaming (https://github.com/fyhertz/libstreaming) or some of the suggestions in: Streaming video from Android camera to server
You can look at this GitHub project or use FFmpeg
I think it all depends on how important your video is. With your original approach you guarantee that you have the full video in hand (on the device) and can make sure it is fully uploaded to your server. On the other hand, streaming it directly to the server can make you lose frames (connectivity hiccups and such) and degrade the video. Streaming is usually done over UDP, which makes losing packets a real possibility.
I want to play encrypted video files stored on my device after decrypting them. I want to pre-process the data stream and play it in parallel using VideoView, like streaming video from the Internet.
Is there any way I could buffer the processed data to VideoView like a network stream?
I think you are saying that you want to decrypt the video in one process and then pass the decrypted 'clear stream' video to another process to play it?
If the video is DRM protected, then your use case is very unlikely to be supported by any of the leading DRM solutions - they go to great lengths to ensure the clear stream video is not accessible by an application on the device (for obvious reasons).
If you are using simple encryption with the encryption key available to your application, then you should be able to do this.
Update
Answering BMvit's question in the comment - one way is to follow these steps:
Stream the encrypted file from the server as usual, 'chunk by chunk'
On your Android device, read from the stream and decrypt each chunk as it is received
Using a localhost http server on your Android device, now 'serve' the decrypted chunks to the MediaPlayer (the media player should be set up to use a URL pointing at your localhost http server)
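The steps above can be sketched as a tiny localhost proxy. This is an illustration only, assuming simple AES/CTR encryption with the key available to the app (as described above), and again using the JDK's com.sun.net.httpserver rather than an Android-compatible embedded server; the key, IV, and the Supplier of the encrypted stream are placeholders:

```java
import com.sun.net.httpserver.HttpServer;
import javax.crypto.Cipher;
import javax.crypto.CipherInputStream;
import javax.crypto.spec.IvParameterSpec;
import javax.crypto.spec.SecretKeySpec;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.util.function.Supplier;

// Sketch: a localhost proxy that decrypts AES/CTR-encrypted data on the fly
// and serves the clear stream to MediaPlayer via a local URL.
public class DecryptingProxy {
    public static HttpServer serve(Supplier<InputStream> encryptedSource,
                                   byte[] key, byte[] iv, int port) throws IOException {
        HttpServer server = HttpServer.create(new InetSocketAddress(port), 0);
        server.createContext("/clear", exchange -> {
            try {
                Cipher cipher = Cipher.getInstance("AES/CTR/NoPadding");
                cipher.init(Cipher.DECRYPT_MODE,
                        new SecretKeySpec(key, "AES"), new IvParameterSpec(iv));
                exchange.getResponseHeaders().set("Content-Type", "audio/mpeg");
                exchange.sendResponseHeaders(200, 0); // open-ended stream
                try (InputStream in = new CipherInputStream(encryptedSource.get(), cipher);
                     OutputStream os = exchange.getResponseBody()) {
                    in.transferTo(os); // decrypts chunk by chunk as it is read
                }
            } catch (java.security.GeneralSecurityException e) {
                throw new IOException(e);
            }
        });
        server.start();
        return server;
    }
}
```

MediaPlayer would then be pointed at http://127.0.0.1:&lt;port&gt;/clear instead of the encrypted file.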
I am guessing this is most likely the approach the libMedia library uses, although I have never seen the source, so I cannot say for sure: http://libeasy.alwaysdata.net
It is worth being aware that this is tricky (which is probably why LibMedia is not free).
I am working on an Android app to stream music from my FTP server. Can anyone offer any code, advice, or a tutorial on how to access the specific folder/directory containing the audio files, list the audio files, and stream them through my app? Is this even possible? Thanks
Would something like this be helpful? For accessing the data via FTP, the FTPClient class could come in handy.
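As a hedged sketch of the listing side: instead of Commons Net's FTPClient (which needs an extra dependency), Java's built-in URL handler can open ftp:// URLs, so a directory listing can be read as text and filtered for audio files. The host, credentials, path, and extension list below are all assumptions:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Locale;

// Sketch: build an ftp:// URL for a directory and filter a listing for audio files.
public class FtpAudioLister {
    public static String buildFtpUrl(String user, String pass, String host, String dir) {
        return "ftp://" + user + ":" + pass + "@" + host + "/" + dir + "/";
    }

    // Keep only names that look like audio files.
    public static List<String> filterAudio(List<String> names) {
        List<String> out = new ArrayList<>();
        for (String n : names) {
            String lower = n.toLowerCase(Locale.ROOT);
            if (lower.endsWith(".mp3") || lower.endsWith(".m4a")
                    || lower.endsWith(".ogg") || lower.endsWith(".wav")) {
                out.add(n);
            }
        }
        return out;
    }

    // Usage (requires a reachable FTP server; credentials are placeholders):
    // java.net.URL dir = new java.net.URL(buildFtpUrl("user", "pass", "192.168.1.10", "music"));
    // try (java.io.BufferedReader in = new java.io.BufferedReader(
    //         new java.io.InputStreamReader(dir.openStream()))) {
    //     List<String> audio = filterAudio(in.lines().toList());
    // }
}
```

An audio file found this way can then be streamed by handing its ftp:// URL (or a re-served http:// URL) to MediaPlayer.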