I am unable to cast a YouTube video URL from my Android application through Chromecast, although I have successfully played other video content from my server.
Here is the code:
private void startVideo() {
    MediaMetadata mediaMetadata = new MediaMetadata( MediaMetadata.MEDIA_TYPE_MOVIE );
    mediaMetadata.putString( MediaMetadata.KEY_TITLE, getString( R.string.video_title ) );
    MediaInfo mediaInfo = new MediaInfo.Builder(
            "http://www.youtube.com/watch?v=Zy0cCn7F7mw")
            .setContentType( "video/mp4" )
            .setStreamType( MediaInfo.STREAM_TYPE_BUFFERED )
            .setMetadata( mediaMetadata )
            .build();
    try {
        mRemoteMediaPlayer.load( mApiClient, mediaInfo, true )
                .setResultCallback( new ResultCallback<RemoteMediaPlayer.MediaChannelResult>() {
                    @Override
                    public void onResult( RemoteMediaPlayer.MediaChannelResult mediaChannelResult ) {
                        if( mediaChannelResult.getStatus().isSuccess() ) {
                            mVideoIsLoaded = true;
                            mButton.setText( getString( R.string.pause_video ) );
                        }
                    }
                } );
    } catch( Exception e ) {
        // Swallowing the exception hides load failures; log it at minimum.
    }
}
I also tried the MIME type "video/webm", but that did not work either. Please help.
You're trying to cast an HTML web page that hosts the YouTube player rather than an actual video file, which won't work: the content ID of the MediaInfo has to be the direct URL of the video stream.
What's more, you won't be able to get that for a YouTube video; all YouTube videos must be played via the YouTube player (on desktop or mobile).
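For comparison, here is a minimal sketch of a load that does work, assuming you have a direct URL to an actual media file; the example.com URL and "Sample video" title below are hypothetical placeholders, not from the original question:
MediaMetadata mediaMetadata = new MediaMetadata( MediaMetadata.MEDIA_TYPE_MOVIE );
mediaMetadata.putString( MediaMetadata.KEY_TITLE, "Sample video" );
MediaInfo mediaInfo = new MediaInfo.Builder(
        // A direct link to the stream itself, not to a page embedding a player
        "http://example.com/videos/sample.mp4" ) // hypothetical URL
        .setContentType( "video/mp4" )
        .setStreamType( MediaInfo.STREAM_TYPE_BUFFERED )
        .setMetadata( mediaMetadata )
        .build();
mRemoteMediaPlayer.load( mApiClient, mediaInfo, true );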
Related
I'm able to successfully cast and play video with Chromecast; however, the video I'm playing has multiple audio tracks, and I would like to switch between them in real time. I tried the methods below, but they failed. Please let me know what needs to be done to switch audio tracks with Chromecast.
val audioTrack = MediaTrack.Builder(1, MediaTrack.TYPE_AUDIO)
    .setName(audioFormat.label)
    .setContentId(audioFormat.format.id)
    .setLanguage("english")
    .build()
audioFormats.add(audioTrack)

val mediaInfo: MediaInfo = MediaInfo.Builder(videoUrl)
    .setStreamType(MediaInfo.STREAM_TYPE_BUFFERED)
    .setContentType(MimeTypes.VIDEO_UNKNOWN)
    .setCustomData(customData)
    .setMediaTracks(audioFormats)
    .setMetadata(movieMetadata)
    .build()

return MediaQueueItem.Builder(mediaInfo).build()
override fun switchAudioTrack(languageId: String, langId: Long) {
    getRemoteMediaClient()?.setActiveMediaTracks(longArrayOf(langId))
        ?.setResultCallback { mediaChannelResult: RemoteMediaClient.MediaChannelResult ->
            if (!mediaChannelResult.status.isSuccess) {
                Log.e(
                    this.javaClass.simpleName,
                    "Failed with status code:" + mediaChannelResult.status.statusCode
                )
            }
        }
}
Any help is greatly appreciated. Thanks!
I am developing a native Android WebRTC client that is supposed to stream audio from a custom device (I am getting the audio stream via Bluetooth from that device). I am using the libjingle library to implement WebRTC, and I wonder if and how it is possible to hook a custom audio stream up to an audio track.
Currently I am adding the default audio track like this:
localMS = factory.createLocalMediaStream("ARDAMS");
AudioSource audioSource = factory.createAudioSource(new MediaConstraints());
localMS.addTrack(factory.createAudioTrack("ARDAMSa0", audioSource));
I saw that there is WebRtcAudioRecord (https://github.com/pristineio/webrtc-android/blob/master/libjingle_peerconnection/src/main/java/org/webrtc/voiceengine/WebRtcAudioRecord.java) - is it possible to override it?
Has anybody tried doing something like that?
Your post led me to the code below, which I am going to try. I am trying to send one audio stream to the Watson API and one to WebRTC, but Android only allows one InputStream to read from the microphone. I will update you if I get it to work.
private org.webrtc.MediaStream createMediaStream() {
    org.webrtc.MediaStream mediaStream = mFactory.createLocalMediaStream(ARDAMS);
    if (mEnableVideo) {
        mVideoCapturer = createVideoCapturer();
        if (mVideoCapturer != null) {
            mediaStream.addTrack(createVideoTrack(mVideoCapturer));
        } else {
            mEnableVideo = false;
        }
    }
    if (mEnableAudio) {
        createAudioCapturer();
        mediaStream.addTrack(mFactory.createAudioTrack(
                AUDIO_TRACK_ID,
                mFactory.createAudioSource(mAudioConstraints)));
    }
    return mediaStream;
}

/**
 * Creates an instance of WebRtcAudioRecord.
 */
private void createAudioCapturer() {
    if (mOption.getAudioType() == PeerOption.AudioType.EXTERNAL_RESOURCE) {
        // Swap in a custom audio module that reads from an external resource
        // instead of the microphone.
        WebRtcAudioRecord.setAudioRecordModuleFactory(new WebRtcAudioRecordModuleFactory() {
            @Override
            public WebRtcAudioRecordModule create() {
                AudioCapturerExternalResource module = new AudioCapturerExternalResource();
                module.setUri(mOption.getAudioUri());
                module.setSampleRate(mOption.getAudioSampleRate());
                module.setBitDepth(mOption.getAudioBitDepth());
                module.setChannel(mOption.getAudioChannel());
                return module;
            }
        });
    } else {
        WebRtcAudioRecord.setAudioRecordModuleFactory(null);
    }
}
Source:
https://www.programcreek.com/java-api-examples/?code=DeviceConnect/DeviceConnect-Android/DeviceConnect-Android-master/dConnectDevicePlugin/dConnectDeviceWebRTC/app/src/main/java/org/deviceconnect/android/deviceplugin/webrtc/core/MediaStream.java
So I am trying to play a DASH video in a SimpleExoPlayerView. I am following the most basic tutorial from https://codelabs.developers.google.com/codelabs/exoplayer-intro/#0.
The trouble is, I am able to hear the audio but the screen is always white.
Things I have already tried -
Using three different .mpd links. I can hear the audio each time, and the video is white in each case, so there is no problem with the .mpd links.
Testing on different devices - an API 19 HTC Desire and an API 24 Moto G4 - so I'm sure it is not a device-specific issue.
Here is the initialising method -
private void initialisePlayer() {
    TrackSelection.Factory adaptiveTrackSelectionFactory =
            new AdaptiveTrackSelection.Factory(defaultBandwidthMeter);
    if (exoPlayer == null) {
        exoPlayer = ExoPlayerFactory.newSimpleInstance(
                new DefaultRenderersFactory(context),
                new DefaultTrackSelector(adaptiveTrackSelectionFactory),
                new DefaultLoadControl());
        simpleExoPlayerView.setPlayer(exoPlayer);
        exoPlayer.setPlayWhenReady(true);
        DashMediaSource mediaSource = buildMediaResource();
        exoPlayer.prepare(mediaSource);
    }
}
And this is the buildMediaSource method -
private DashMediaSource buildMediaResource() {
    String userAgent = Util.getUserAgent(context, getString(R.string.app_name));
    DataSource.Factory manifestDataSourceFactory = new DefaultHttpDataSourceFactory(userAgent);
    DashChunkSource.Factory dashChunkSourceFactory = new DefaultDashChunkSource.Factory(
            new DefaultHttpDataSourceFactory(userAgent, defaultBandwidthMeter));
    return new DashMediaSource.Factory(dashChunkSourceFactory, manifestDataSourceFactory)
            .createMediaSource(Uri.parse(url));
}
Am I missing something?
I am using CastCompanionLibrary-android for Chromecast integration, and I am going to show a few of the methods involved.
The first is the cast configuration initialization in the application:
private void initchromecast() {
    String applicationId = getString(R.string.app_id);
    // Build a CastConfiguration object and initialize VideoCastManager
    CastConfiguration options = new CastConfiguration.Builder(applicationId)
            .enableAutoReconnect()
            .enableCaptionManagement()
            .enableDebug()
            .enableLockScreen()
            .enableNotification()
            .enableWifiReconnection()
            .setCastControllerImmersive(true)
            .setLaunchOptions(false, Locale.getDefault())
            .setNextPrevVisibilityPolicy(CastConfiguration.NEXT_PREV_VISIBILITY_POLICY_DISABLED)
            .addNotificationAction(CastConfiguration.NOTIFICATION_ACTION_REWIND, false)
            .addNotificationAction(CastConfiguration.NOTIFICATION_ACTION_PLAY_PAUSE, true)
            .addNotificationAction(CastConfiguration.NOTIFICATION_ACTION_DISCONNECT, true)
            .setForwardStep(10)
            .build();
    VideoCastManager.initialize(this, options);
}
and this one to pass the metadata:
private void loadRemoteMedia(int position, boolean autoPlay) {
    MediaMetadata mediaMetadata = new MediaMetadata(MediaMetadata.MEDIA_TYPE_MUSIC_TRACK);
    mediaMetadata.putString(MediaMetadata.KEY_TITLE, TITLE);
    MediaInfo mediaInfo = new MediaInfo.Builder(path)
            .setContentType("application/x-mpegURL")
            .setStreamType(MediaInfo.STREAM_TYPE_LIVE)
            .setMetadata(mediaMetadata)
            .setStreamDuration(MediaInfo.UNKNOWN_DURATION)
            .build();
    mCastManager.startVideoCastControllerActivity(getActivity(), mediaInfo, position, autoPlay);
}
Now my question: when I pass the CastCompanion sample receiver ID 4F8B3483 in the code above, I can successfully cast most videos, but when I use my own receiver application ID, some videos that work with their ID do not work with mine.
For example, http://iptvcanales.com/xw/peliculas.php?movie=006 works with the CastCompanion receiver ID but not with my ID in my code.
How do I set the CORS headers for M3U8 streaming on Chromecast? In my sender (Android) I am setting the MediaMetadata and MediaInfo like this:
metaData = new MediaMetadata(MediaMetadata.MEDIA_TYPE_MOVIE);
metaData.putString(MediaMetadata.KEY_TITLE, "Demo Video");
MediaInfo mediaInfo = new MediaInfo.Builder(
        "http://playertest.longtailvideo.com/adaptive/bbbfull/bbbfull.m3u8")
        .setContentType("application/vnd.apple.mpegurl")
        .setStreamType(MediaInfo.STREAM_TYPE_BUFFERED)
        .setMetadata(metaData)
        .build();
player.load(client, mediaInfo, true)
        .setResultCallback(new ResultCallback<RemoteMediaPlayer.MediaChannelResult>() {
            @Override
            public void onResult(RemoteMediaPlayer.MediaChannelResult mediaChannelResult) {
                Status status = mediaChannelResult.getStatus();
                if (status.isSuccess()) {
                    // Media loaded successfully
                }
            }
        });
My onLoad method is set up like this:
mediaManager.onLoad = function(event) {
    console.log("### Media Manager - LOAD: " + JSON.stringify(event));
    if (mediaPlayer !== null) {
        mediaPlayer.unload(); // Ensure unload before loading again
    }
    if (event.data['media'] && event.data['media']['contentId']) {
        var url = event.data['media']['contentId'];
        mediaHost = new cast.player.api.Host({
            'mediaElement': mediaElement,
            'url': url
        });
        mediaHost.onError = function(errorCode) {
            console.error('### HOST ERROR - Fatal Error: code = ' + errorCode);
            if (mediaPlayer !== null) {
                mediaPlayer.unload();
            }
        };
        var initialTimeIndexSeconds = event.data['media']['currentTime'] || 0;
        // TODO: real code would know what content it was going to access and this would not be here.
        var protocol = null;
        var parser = document.createElement('a');
        parser.href = url;
        var ext = parser.pathname.split('.').pop();
        if (ext === 'm3u8') {
            protocol = cast.player.api.CreateHlsStreamingProtocol(mediaHost);
        } else if (ext === 'mpd') {
            protocol = cast.player.api.CreateDashStreamingProtocol(mediaHost);
        } else if (ext === 'ism/') {
            protocol = cast.player.api.CreateSmoothStreamingProtocol(mediaHost);
        }
        console.log('### Media Protocol Identified as ' + ext);
        if (protocol === null) {
            mediaManager['onLoadOrig'](event); // Call the original callback
        } else {
            mediaPlayer = new cast.player.api.Player(mediaHost);
            mediaPlayer.load(protocol, initialTimeIndexSeconds);
        }
    }
};
However, I am getting this error:
XMLHttpRequest cannot load http://playertest.longtailvideo.com/adaptive/bbbfull/bbbfull.m3u8. No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin '----' is therefore not allowed access.
How do I set the CORS headers so that Chromecast can load this stream?
Probably too late, but I just came across the same issue and resolved it with the approach below.
I didn't find a way to add headers on the sender app side, so I'm sharing my own experience. I fixed the CORS issue by first confirming that my server supports CORS, and then, to play the media on Chromecast, adding gstatic.com (and, in my case, one more domain) to the allowed origins on the server. That is, in fact, the whole idea of CORS: each origin should be known to the server. And that's it.
Note: Be sure to go through the official documentation. As a beginner it may seem a bit tricky to grab all the stuff from there, which is why I shared my own experience as well.
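For illustration, here is a minimal sketch of what the server-side fix can look like, assuming a Java servlet backend; the CorsFilter class and the exact header values are illustrative, not from the original answer, and the equivalent settings exist in nginx, Apache, S3, and most CDNs. Exposing Content-Length and Content-Range matters because media players issue range requests for segments.
import java.io.IOException;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletResponse;

// Hypothetical filter: lets the Chromecast media player (served from gstatic.com)
// fetch manifests and segments from this server.
public class CorsFilter implements Filter {
    @Override
    public void init(FilterConfig filterConfig) {
        // No initialization needed for this sketch.
    }

    @Override
    public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain)
            throws IOException, ServletException {
        HttpServletResponse httpResponse = (HttpServletResponse) response;
        // "*" is the simplest option; in production, echo back only trusted origins.
        httpResponse.setHeader("Access-Control-Allow-Origin", "*");
        httpResponse.setHeader("Access-Control-Allow-Methods", "GET, HEAD, OPTIONS");
        httpResponse.setHeader("Access-Control-Allow-Headers", "Content-Type, Range");
        httpResponse.setHeader("Access-Control-Expose-Headers", "Content-Length, Content-Range");
        chain.doFilter(request, response);
    }

    @Override
    public void destroy() {
        // Nothing to clean up.
    }
}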