I need to implement a multi-track audio feature in a Cast sender application. That is, when a user casts a video/movie to the TV using Chromecast, the sender application should show all available audio tracks and let the user choose the desired track to cast.
On the developers website I found the Tracks API, with its MediaTrack object, which can be used to configure a track.
https://developers.google.com/cast/docs/android_sender#tracks
What I don't understand is how to fetch all the available audio tracks from the selected mp4 file. Can anyone point me in the right direction?
Also, what is the role of the receiver application in this?
I looked at the reference sample app linked below; from it, I understand how to set MediaTracks.
https://github.com/googlecast/CastVideos-android
What I am missing is how to extract the audio tracks from an mp4 file so that we can set them as MediaTracks.
Does this part need to be done in the receiver app?
Any help will be highly appreciated.
Thanks.
Currently, the Cast SDK has no support for multiple embedded audio tracks in an mp4 file.
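For adaptive streams (HLS/DASH) the receiver can expose alternate audio renditions, and the sender can declare tracks through the Tracks API. As a minimal sketch of what declaring a track looks like on the sender side (all IDs, names and URLs below are made up for illustration):

```java
import com.google.android.gms.cast.MediaInfo;
import com.google.android.gms.cast.MediaMetadata;
import com.google.android.gms.cast.MediaTrack;

import java.util.Collections;

public class TrackSetup {
    // Builds MediaInfo carrying one alternate audio track declaration.
    public static MediaInfo buildMediaInfo() {
        // Hypothetical track: the id just has to be unique within this media.
        MediaTrack hindiAudio = new MediaTrack.Builder(1 /* id */, MediaTrack.TYPE_AUDIO)
                .setName("Hindi")
                .setLanguage("hi")
                .setContentId("https://example.com/movie/audio_hi.m3u8") // made-up URL
                .build();

        return new MediaInfo.Builder("https://example.com/movie/master.m3u8")
                .setStreamType(MediaInfo.STREAM_TYPE_BUFFERED)
                .setContentType("application/x-mpegurl")
                .setMetadata(new MediaMetadata(MediaMetadata.MEDIA_TYPE_MOVIE))
                .setMediaTracks(Collections.singletonList(hindiAudio))
                .build();
    }
}
```

A common workaround for an mp4 with embedded audio tracks is therefore to repackage the content as an adaptive stream (e.g. HLS) whose alternate audio renditions the receiver can expose as tracks.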
I want to create an app for sharing audio files. I want to build native mobile apps, and so far I have made most of my progress on Android with a PHP/MySQL backend. But now I want to step up my game and build my backend with Node.js and MongoDB.
Since big audio files take a while to download and can worsen the user experience, I wondered if it is possible to download just the first 20% of an audio file and, when the user reaches a certain point of playback, download the rest/the next section. That way we do not need to download an entire file that might never be listened to. I just wonder how difficult it is to append another section of the audio during playback without any interruption.
I suspect this is how the big social media apps work, but I cannot find any sources on the topic. I'm not asking for code, just suggestions and references. Am I on the right track, or are there other ways to solve this problem? Also, would you recommend Digital Ocean Spaces for this task?
Sources for Android and iOS would be very helpful!
You can do it by adding a "Range" header to the HTTP request you send to the Digital Ocean Space for the desired media file. The "Range" header specifies which bytes of the file you want to receive.
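A minimal sketch of such a ranged request in plain Java, assuming a hypothetical Spaces URL (S3-compatible endpoints honor Range and reply with 206 Partial Content):

```java
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class RangeDownload {
    public static void main(String[] args) throws Exception {
        // Hypothetical Spaces URL; replace with your own bucket/object.
        URL url = new URL("https://my-space.nyc3.digitaloceanspaces.com/audio/track.mp3");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();

        // Ask for the first 1 MiB only; a later request can continue
        // from byte 1048576 once the listener nears the end of the buffer.
        conn.setRequestProperty("Range", "bytes=0-1048575");

        // A server that honors the range replies with 206 Partial Content
        // and a Content-Range header describing what was actually sent.
        System.out.println("Status: " + conn.getResponseCode());
        System.out.println("Content-Range: " + conn.getHeaderField("Content-Range"));

        try (InputStream in = conn.getInputStream()) {
            byte[] buffer = new byte[8192];
            int read, total = 0;
            while ((read = in.read(buffer)) != -1) {
                total += read; // feed these bytes to the player's buffer
            }
            System.out.println("Downloaded " + total + " bytes");
        }
    }
}
```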
How to play video hosted on DigitalOcean Space on Android app?
What type of video delivery does Digital Ocean offer: MPEG-DASH, HLS, RTMP, etc.?
Can I use ExoPlayer to play Digital Ocean videos?
DigitalOcean is a general-purpose cloud computing provider, and Spaces is its AWS S3-API compatible object storage solution - i.e. it is not a dedicated video hosting solution.
You can store and access your video there, but the format, codecs, containers etc. are up to you.
It does have a built-in CDN, which you may find helps video response times, but as with all CDNs you probably want to test and compare to decide which is best for you.
For streaming video you can:
simply use HTTPS streaming, essentially just downloading the video in chunks and playing it as you download it.
use a dedicated streaming protocol like HLS or DASH, which will allow you to offer multiple bit rates to improve the user experience. More on ABR streaming here: https://stackoverflow.com/a/42365034/334402
The usual way to support HLS and/or DASH is with a streaming server - there are open-source ones available, e.g. http://www.videolan.org/vlc/streaming.html
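On the ExoPlayer question: yes, it can play media served from Spaces over plain HTTPS, including HLS playlists. A minimal sketch using ExoPlayer 2's MediaItem API, with a hypothetical Spaces URL:

```java
import android.content.Context;
import com.google.android.exoplayer2.ExoPlayer;
import com.google.android.exoplayer2.MediaItem;

public class SpacesPlayback {
    // Builds and starts a player; attach it to a PlayerView in your layout.
    public static ExoPlayer play(Context context) {
        ExoPlayer player = new ExoPlayer.Builder(context).build();

        // Hypothetical Spaces URL; point at an .m3u8 playlist for HLS and
        // ExoPlayer will pick the matching MediaSource from the extension.
        MediaItem item = MediaItem.fromUri(
                "https://my-space.nyc3.digitaloceanspaces.com/videos/movie.mp4");

        player.setMediaItem(item);
        player.prepare();
        player.play();
        return player;
    }
}
```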
In either case, if your video is mp4, you also need to make sure the header information, the 'moov atom', is at the start of the file, as the player needs it to begin playback. There are tools to do this, and more info here: https://multimedia.cx/eggs/improving-qt-faststart/
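For example, ffmpeg can move the moov atom to the front without re-encoding (a straight remux, assuming ffmpeg is installed):

```
ffmpeg -i input.mp4 -c copy -movflags +faststart output.mp4
```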
I'm developing an Android app where a user can start live streaming using their Android camera. I have AWS and GCP resources at hand. What I understand after some reading is:
I have to stream/publish whatever the Android camera is capturing to some server over some protocol (RTMP, HLS, etc.).
I have to set up a server that pulls this input source, then packages and stores it in a form that can be streamed/consumed in a mobile/web browser (basically, a URL). I believe AWS's MediaLive, MediaPackage, etc. should do that.
I can use this URL as a MediaSource for players on Android (like ExoPlayer).
My problem is that I couldn't find good documentation on the first part. I found https://github.com/bytedeco/javacv, which doesn't appear to be production-level work. While trying out the second part, creating a MediaLive channel on AWS, I was asked to point the channel to two destinations (I don't know what that means), which made me doubt my understanding of the process. I'm looking for a skeleton procedure, with official documentation, for how to achieve this.
EDIT 1:
For the input-production part, I'm experimenting with this answer: https://stackoverflow.com/a/29061628/3881561
EDIT 2:
I've used https://github.com/ant-media/LiveVideoBroadcaster to send the video source to an RTMP server. I've created an RTMP push input source in MediaLive and a channel with an Archive output (it stores .ts files in S3). Now that the flow is working, how can I modify this architecture to allow multiple users to create live streams?
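One common pattern is to provision a dedicated MediaLive input (and channel) per active broadcaster, created on demand from your backend. A rough sketch with the AWS SDK for Java; treat the exact model classes and parameters as assumptions to verify against the current SDK, and the security group ID as a placeholder:

```java
import com.amazonaws.services.medialive.AWSMediaLive;
import com.amazonaws.services.medialive.AWSMediaLiveClientBuilder;
import com.amazonaws.services.medialive.model.CreateInputRequest;
import com.amazonaws.services.medialive.model.CreateInputResult;
import com.amazonaws.services.medialive.model.InputDestinationRequest;

public class PerUserIngest {
    // Creates a dedicated RTMP push input for one user; the returned URLs
    // are what that user's app publishes to.
    public static CreateInputResult createInputFor(String userId) {
        AWSMediaLive mediaLive = AWSMediaLiveClientBuilder.defaultClient();

        CreateInputRequest request = new CreateInputRequest()
                .withName("input-" + userId)
                .withType("RTMP_PUSH")
                // The two destinations the console asks for: one per pipeline.
                .withDestinations(
                        new InputDestinationRequest().withStreamName(userId + "/primary"),
                        new InputDestinationRequest().withStreamName(userId + "/backup"))
                .withInputSecurityGroups("1234567"); // hypothetical ID

        CreateInputResult result = mediaLive.createInput(request);
        // Each destination exposes an rtmp:// URL the client can push to.
        result.getInput().getDestinations()
              .forEach(d -> System.out.println(d.getUrl()));
        return result;
    }
}
```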
I have a video file which has three audio tracks, say English, Hindi and Russian.
I want to play all the tracks simultaneously, so for this task I need each embedded audio track as a separate source at the application layer.
I tried mediaplayer.getTrackInfo(), but it only returns one track at a time.
Please let me know if you have any ideas about this.
Thanks in advance.
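One possible direction, sketched under the assumption that the file is local and the codecs are supported: android.media.MediaExtractor can enumerate every track in the container, so each audio track can be selected and decoded independently (e.g. with one MediaExtractor/MediaCodec pair per track):

```java
import android.media.MediaExtractor;
import android.media.MediaFormat;

public class AudioTrackLister {
    // Prints the index, mime type and language of every audio track in a file.
    public static void listAudioTracks(String path) throws java.io.IOException {
        MediaExtractor extractor = new MediaExtractor();
        extractor.setDataSource(path);

        for (int i = 0; i < extractor.getTrackCount(); i++) {
            MediaFormat format = extractor.getTrackFormat(i);
            String mime = format.getString(MediaFormat.KEY_MIME);
            if (mime != null && mime.startsWith("audio/")) {
                // KEY_LANGUAGE is present only if the container declares one.
                String language = format.containsKey(MediaFormat.KEY_LANGUAGE)
                        ? format.getString(MediaFormat.KEY_LANGUAGE) : "und";
                System.out.println("Audio track " + i + ": " + mime + " (" + language + ")");
                // To decode this track separately: extractor.selectTrack(i)
                // and feed its samples to a MediaCodec + AudioTrack pair.
            }
        }
        extractor.release();
    }
}
```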
Is it possible to get the track name while playing a radio stream via MediaPlayer?
I would say pretty much with certainty - no, it isn't possible.
I can't see any MediaPlayer methods which suggest it's possible. Also, the way metadata such as the track name is presented in streaming media depends on the source, e.g. Shoutcast or otherwise.
If it can be done I'd be interested to know, but I suspect you'd need to write something like a Shoutcast client (or another client, depending on the source). You'd still use MediaPlayer for the streaming itself but would need extra code to access the metadata.
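As a rough sketch of that extra code, assuming the station speaks the Shoutcast/ICY protocol: you request inline metadata with an Icy-MetaData header, and the icy-metaint response header tells you how many audio bytes separate the metadata blocks:

```java
import java.io.DataInputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class IcyMetadataReader {
    // Reads the first StreamTitle announced by a Shoutcast/ICY stream.
    public static String readStreamTitle(String streamUrl) throws Exception {
        HttpURLConnection conn =
                (HttpURLConnection) new URL(streamUrl).openConnection();
        conn.setRequestProperty("Icy-MetaData", "1"); // ask for inline metadata

        String metaIntHeader = conn.getHeaderField("icy-metaint");
        if (metaIntHeader == null) return null; // stream sends no ICY metadata
        int metaInt = Integer.parseInt(metaIntHeader.trim());

        DataInputStream in = new DataInputStream(conn.getInputStream());
        in.readFully(new byte[metaInt]);             // consume one audio block
        int metaLength = in.readUnsignedByte() * 16; // length of metadata block
        // A length of 0 means "no change"; a real client keeps reading blocks.
        byte[] meta = new byte[metaLength];
        in.readFully(meta);
        in.close();

        // Metadata looks like: StreamTitle='Artist - Title';StreamUrl='...';
        String text = new String(meta, "UTF-8");
        int start = text.indexOf("StreamTitle='");
        if (start < 0) return null;
        start += "StreamTitle='".length();
        int end = text.indexOf("';", start);
        return end < 0 ? null : text.substring(start, end);
    }
}
```

You would run this alongside MediaPlayer (or poll it between tracks), since MediaPlayer itself never surfaces the ICY metadata.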