Trying to cast media to the default media receiver of the Chromecast from an Android app, but it doesn't cast at all. The following is the code snippet used to find the routes:
MediaRouteSelector selector = new MediaRouteSelector.Builder()
        .addControlCategory(CastMediaControlIntent
                .categoryForRemotePlayback(CastMediaControlIntent.DEFAULT_MEDIA_RECEIVER_APPLICATION_ID))
        .build();
Then it shows the Chromecast device on the Wi-Fi network, and the following code runs when the device is selected:
MediaRouter.RouteInfo route = adapter.getItem(position).routeInfo;
// select the route for usage
route.select();
// send the play control request with the video uri
route.sendControlRequest(
        new Intent(MediaControlIntent.ACTION_PLAY)
                .setDataAndType(videoUri, "video/mp4")
                .addCategory(MediaControlIntent.CATEGORY_REMOTE_PLAYBACK),
        new MediaRouter.ControlRequestCallback() {
            @Override
            public void onError(String error, Bundle data) {
                super.onError(error, data);
            }

            @Override
            public void onResult(Bundle data) {
                super.onResult(data);
            }
        }
);
It doesn't cast the media to the device. Any suggestions?
It seems like you are not using the Cast SDK but the Media Route Provider. I don't see any session being set up; you might want to look at the democastplayer sample code that is distributed with the Android SDK (under the SDK folder, go to extras/google/google_play_service/samples/cast/democastplayer). In that sample, look at the MrpCastPlayerActivity class.
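For reference, a minimal sketch of that kind of session setup with the Play-Services-era Cast API (not the full democastplayer flow); context, castListener, and connectionCallbacks are assumed to exist, and error handling is omitted:

// Build a Cast API client for the selected route instead of sendControlRequest().
CastDevice device = CastDevice.getFromBundle(route.getExtras());

final GoogleApiClient apiClient = new GoogleApiClient.Builder(context)
        .addApi(Cast.API, Cast.CastOptions.builder(device, castListener).build())
        .addConnectionCallbacks(connectionCallbacks) // run the code below from onConnected()
        .build();
apiClient.connect();

// Once connected: launch the default receiver, then load media through a
// RemoteMediaPlayer media session rather than a bare control request.
Cast.CastApi.launchApplication(apiClient,
        CastMediaControlIntent.DEFAULT_MEDIA_RECEIVER_APPLICATION_ID)
        .setResultCallback(new ResultCallback<Cast.ApplicationConnectionResult>() {
            @Override
            public void onResult(Cast.ApplicationConnectionResult result) {
                if (!result.getStatus().isSuccess()) return;
                RemoteMediaPlayer player = new RemoteMediaPlayer();
                // (To receive status updates, also register the player via
                // Cast.CastApi.setMessageReceivedCallbacks.)
                MediaInfo media = new MediaInfo.Builder(videoUri.toString())
                        .setContentType("video/mp4")
                        .setStreamType(MediaInfo.STREAM_TYPE_BUFFERED)
                        .build();
                player.load(apiClient, media, true); // true = autoplay
            }
        });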
How can I set the display to stereoscopic programmatically in Unity for an app deployed to an Android device?
I want a UI menu where the user can toggle between "VR mode" and normal mode. I do not want VR mode by default as it should be an option at run-time. I know there is a setting for "Virtual Reality Supported" in the build settings, but again, I do not want this enabled by default.
Include using UnityEngine.XR; at the top.
In the Start function, call XRSettings.LoadDeviceByName("") with an empty string, followed by XRSettings.enabled = false;, to disable VR.
When you want to enable it later on, call XRSettings.LoadDeviceByName("daydream") with the VR device name, followed by XRSettings.enabled = true;.
You should wait for a frame between each function call, which requires doing this in a coroutine.
Also, on some VR devices you must go to Edit->Project Settings->Player and make sure the Virtual Reality Supported checkbox is checked before this will work. You can then disable VR in the Start function and enable it whenever you want.
EDIT:
This is known to work on some VR devices but not all. It should, however, work on Daydream VR. Complete code sample:
IEnumerator LoadDevice(string newDevice, bool enable)
{
    XRSettings.LoadDeviceByName(newDevice);
    yield return null;
    XRSettings.enabled = enable;
}

void EnableVR()
{
    StartCoroutine(LoadDevice("daydream", true));
}

void DisableVR()
{
    StartCoroutine(LoadDevice("", false));
}
Call EnableVR() to enable VR and DisableVR() to disable it. If you are using anything other than Daydream, pass the name of that VR device to the LoadDevice function in EnableVR().
For newer builds of Unity (e.g. 2019.4.0f1) you can use the XR Plug-in Management package.
To enable, call:
XRGeneralSettings.Instance.Manager.InitializeLoader();
To disable, call:
XRGeneralSettings.Instance.Manager.DeinitializeLoader();
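Note that InitializeLoader() is actually a coroutine. A fuller start/stop sketch, adapted from the package's end-user documentation (the XRToggle class name is mine; it assumes a loader, e.g. Oculus, is assigned in Project Settings):

using System.Collections;
using UnityEngine;
using UnityEngine.XR.Management;

public class XRToggle : MonoBehaviour
{
    public IEnumerator StartXR()
    {
        // InitializeLoader is a coroutine, so yield it before starting subsystems.
        yield return XRGeneralSettings.Instance.Manager.InitializeLoader();

        if (XRGeneralSettings.Instance.Manager.activeLoader == null)
        {
            Debug.LogError("Initializing XR failed. Check editor or player log for details.");
        }
        else
        {
            XRGeneralSettings.Instance.Manager.StartSubsystems();
        }
    }

    public void StopXR()
    {
        // Stop subsystems first, then tear down the loader.
        XRGeneralSettings.Instance.Manager.StopSubsystems();
        XRGeneralSettings.Instance.Manager.DeinitializeLoader();
    }
}

Start it with StartCoroutine(StartXR()); and call StopXR() to leave VR mode.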
I'm using Unity 2021, but this probably works in earlier versions; I'm also using XR Plug-in Management.
Start:
XRGeneralSettings.Instance.Manager.StartSubsystems();
Stop:
XRGeneralSettings.Instance.Manager.StopSubsystems();
Full documentation at:
https://docs.unity3d.com/Packages/com.unity.xr.management@4.0/manual/EndUser.html
This doesn't work for me on 2020.3.14f1; I get this error when running my Android app:
Call to DeinitializeLoader without an initialized manager.Please make
sure wait for initialization to complete before calling this API.
[RuntimeInitializeOnLoadMethod(RuntimeInitializeLoadType.AfterAssembliesLoaded)]
static void TryToDeinitializeOculusLoader()
{
    XRGeneralSettings.Instance.Manager.DeinitializeLoader();
}
More context: I'm trying to unload the Oculus loader before it manages to load the plugin. I have an Android app, and the Oculus loader calls Application.Quit because the device is not an Oculus headset. Waiting for XRGeneralSettings.Instance.Manager.isInitializationComplete takes too long, and I tried all the RuntimeInitializeLoadType annotations.
OculusLoader.cs
#elif (UNITY_ANDROID && !UNITY_EDITOR)
[RuntimeInitializeOnLoadMethod(RuntimeInitializeLoadType.AfterAssembliesLoaded)]
static void RuntimeLoadOVRPlugin()
{
    var supported = IsDeviceSupported();

    if (supported == DeviceSupportedResult.ExitApplication)
    {
        Debug.LogError("\n\nExiting application:\n\nThis .apk was built with the Oculus XR Plugin loader enabled, but is attempting to run on a non-Oculus device.\nTo build for general Android devices, please disable the Oculus XR Plugin before building the Android player.\n\n\n");
        Application.Quit();
    }

    if (supported != DeviceSupportedResult.Supported)
        return;

    try
    {
        if (!NativeMethods.LoadOVRPlugin(""))
            Debug.LogError("Failed to load libOVRPlugin.so");
    }
    catch
    {
        // handle Android standalone build with Oculus XR Plugin installed but disabled in loader list.
    }
}
#endif
SOLUTION
I made my build class implement IPreprocessBuildWithReport (the interface also requires a callbackOrder member):
public int callbackOrder => 0;

public void OnPreprocessBuild(BuildReport report)
{
    DisableXRLoaders(report);
}
/// https://docs.unity3d.com/Packages/com.unity.xr.management@3.2/manual/EndUser.html
/// Do this as a setup step before you start a build, because the first thing that XR Plug-in Manager does at build time
/// is to serialize the loader list to the build target.
void DisableXRLoaders(BuildReport report)
{
    XRGeneralSettingsPerBuildTarget buildTargetSettings;
    EditorBuildSettings.TryGetConfigObject(XRGeneralSettings.k_SettingsKey, out buildTargetSettings);
    if (buildTargetSettings == null)
    {
        return;
    }

    XRGeneralSettings settings = buildTargetSettings.SettingsForBuildTarget(report.summary.platformGroup);
    if (settings == null)
    {
        return;
    }

    XRManagerSettings loaderManager = settings.AssignedSettings;
    if (loaderManager == null)
    {
        return;
    }

    var loaders = loaderManager.activeLoaders;
    // If there are no loaders present in the current manager instance, then the settings will not be included in the current build.
    if (loaders.Count == 0)
    {
        return;
    }

    var loadersForRemoval = new List<XRLoader>();
    loadersForRemoval.AddRange(loaders);
    foreach (var loader in loadersForRemoval)
    {
        loaderManager.TryRemoveLoader(loader);
    }
}
// Requires: using System; using System.Collections; using UnityEngine; using UnityEngine.XR;
public void Awake() {
    StartCoroutine(SwitchToVR(() => {
        Debug.Log("Switched to VR Mode");
    }));

    // To disable VR mode again later, set:
    // XRSettings.enabled = false;
}

IEnumerator SwitchToVR(Action callback) {
    // Device names are lowercase, as returned by `XRSettings.supportedDevices`.
    // The original Google sample makes you specify a single device:
    // string desiredDevice = "daydream"; // Or "cardboard".
    // XRSettings.LoadDeviceByName(desiredDevice);
    // Passing an array of acceptable devices is slightly better:
    string[] devices = new string[] { "daydream", "cardboard" };
    XRSettings.LoadDeviceByName(devices);

    // Must wait one frame after calling `XRSettings.LoadDeviceByName()`.
    yield return null;

    // Now it's OK to enable VR mode.
    XRSettings.enabled = true;
    callback.Invoke();
}
I'm currently working on an Android app that enables users to group-chat with each other via the OpenTok API. I want to add a feature that automatically detects which user is currently talking, shows their video to the others, and minimizes the other users' videos until someone else talks.
I cannot find such a feature in OpenTok, so I was wondering if there's a workaround.
private void joinVideoCall(String sessionId, String sessionToken) {
    session = new Session.Builder(activity, OPENTOK_API_KEY, sessionId).build();
    session.setSessionListener(this);
    session.connect(sessionToken);
}

@Override
public void onConnected(Session session) {
    publisher = new Publisher.Builder(activity).build();
    publisher.setPublisherListener(this);
    publisherView.addView(publisher.getView());
    session.publish(publisher);
}

@Override
public void onStreamReceived(Session session, Stream stream) {
    subscriber = new Subscriber.Builder(activity, stream).build();
    session.subscribe(subscriber);
    subscriberView.addView(subscriber.getView());
}
...
In order to do that, you'll need to use a custom audio driver that detects the audio levels.
Take a look at this sample: https://github.com/opentok/opentok-android-sdk-samples/tree/master/Custom-Audio-Driver
And also take a look at the API documentation: https://tokbox.com/developer/sdks/android/reference/com/opentok/android/BaseAudioDevice.html
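If you only need per-subscriber volume rather than raw audio frames, a lighter option worth checking is the SDK's SubscriberKit.AudioLevelListener. A minimal sketch; the smoothing factor, the threshold, and the onSpeakerActivity() UI hook are hypothetical choices, not part of the SDK:

subscriber.setAudioLevelListener(new SubscriberKit.AudioLevelListener() {
    private float movingAverage = 0f;

    @Override
    public void onAudioLevelUpdated(SubscriberKit subscriber, float audioLevel) {
        // Smooth the 0..1 level so short spikes don't flip the UI back and forth.
        movingAverage = 0.7f * movingAverage + 0.3f * audioLevel;
        boolean isTalking = movingAverage > 0.1f; // hypothetical threshold
        onSpeakerActivity(subscriber.getStream(), isTalking); // your UI hook
    }
});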
I am using CastCompanionLibrary-android in my app and following CastVideos-android to play live streams on Chromecast. The streaming of video works fine in my local player, but when it comes to casting that video, it won't play. Instead it just shows my registered receiver app name, and on the sender app the VideoCastControllerActivity opens with only a loader that never ends.
I have registered both my receiver app (Styled Media Receiver) and the device on the Google Cast console. I also tried to debug the receiver app; it shows nothing in the debugger or in the console.
Here is the code snippet of my sender app:
private void initMediaRouter() {
    BaseCastManager.checkGooglePlayServices(this);
    mCastManager = VideoCastManager.getInstance();

    MediaMetadata mediaMetadata = new MediaMetadata(MediaMetadata.MEDIA_TYPE_MOVIE); // also tried MediaMetadata.MEDIA_TYPE_GENERIC
    mediaMetadata.putString(MediaMetadata.KEY_TITLE, channel.getChannelName());
    mediaMetadata.putString(MediaMetadata.KEY_SUBTITLE, channel.getCatName());

    info = new MediaInfo.Builder(channel.getHttpStream())
            .setMetadata(mediaMetadata)
            .setContentType("Video/*")
            .setStreamType(MediaInfo.STREAM_TYPE_LIVE)
            .build();

    castConsumer = new CastConsumer();
    mCastManager.addVideoCastConsumer(castConsumer);
}
This function is called from the player activity's onCreate().
The listener (I'm getting true for the wasLaunched param):
private class CastConsumer extends VideoCastConsumerImpl {
    @Override
    public void onApplicationConnected(ApplicationMetadata appMetadata, String sessionId, boolean wasLaunched) {
        mCastManager.startVideoCastControllerActivity(PlayerActivity.this, info, 0, true);
    }
}
And in the onCreate() function of Application:
String applicationId = getString(R.string.cast_app_id);
CastConfiguration options = new CastConfiguration.Builder(applicationId)
        .enableAutoReconnect()
        .enableCaptionManagement()
        .enableDebug()
        .enableLockScreen()
        .enableNotification()
        .enableWifiReconnection()
        .setCastControllerImmersive(true)
        .setLaunchOptions(false, Locale.getDefault())
        .addNotificationAction(CastConfiguration.NOTIFICATION_ACTION_PLAY_PAUSE, true)
        .addNotificationAction(CastConfiguration.NOTIFICATION_ACTION_DISCONNECT, true)
        .build();
VideoCastManager.initialize(this, options);
Can you please tell me where I am going wrong? Thanks.
The content type might be the issue: "Video/*" is not a valid MIME type. Try setting it to the actual type of your stream, e.g. video/mp4.
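For example, a minimal correction of the question's builder; the exact MIME type here is an assumption and must match the real stream:

info = new MediaInfo.Builder(channel.getHttpStream())
        .setMetadata(mediaMetadata)
        .setContentType("video/mp4") // use your stream's actual type, e.g. "application/x-mpegurl" for HLS
        .setStreamType(MediaInfo.STREAM_TYPE_LIVE)
        .build();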
I am developing a mobile Android app and using Android Pay/Google Wallet to retrieve CC information from the user. I was able to successfully get the GitHub sample application working.
Please see the image below:
It appears that the screen shown here uses a dynamic masked wallet fragment. The payment method and shipping address and change buttons are automatically generated by Android APIs.
How can I customize my own UI for this fragment?
How can I listen to the onClick event of the "CHANGE" button?
How can I use the "Android Pay" logo in green (in the image)? The sample app appears to still use the built in "Google Wallet" logo.
I found out how to use the Android Pay logo in the MaskedWalletFragment screen. Simply use the following API:
https://developers.google.com/android/reference/com/google/android/gms/wallet/fragment/WalletFragmentStyle.LogoImageType.html#ANDROID_PAY
The key is to call the following:
setMaskedWalletDetailsLogoImageType(int)
Use the constant value 3 for Android Pay; the rest are deprecated.
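In code, the named constant from the page linked above reads better than the raw value; a minimal sketch:

WalletFragmentStyle walletFragmentStyle = new WalletFragmentStyle()
        .setMaskedWalletDetailsLogoImageType(
                WalletFragmentStyle.LogoImageType.ANDROID_PAY); // constant value 3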
You have to use these methods:
.setBuyButtonAppearance(WALLET_APPEARANCE) in CheckoutActivity
.setMaskedWalletDetailsLogoImageType(mWALLET_APPEARANCE) in ConfirmationActivity
.useGoogleWallet() in FullWalletButton
Android Wallet is deprecated. In my case I did this:
Check if the device has NFC support (Android Pay uses NFC).
If it has NFC support, use Android Pay; if it doesn't, use Android Wallet.
First, in Constants.java add:
// values to change the buy button appearance (Android Pay / Wallet)
public static final int GOOGLE_WALLET_CLASSIC = 1;
public static final int GOOGLE_WALLET_GRAYSCALE = 2;
public static final int GOOGLE_WALLET_MONOCHROME = 3;
public static final int ANDROID_PAY_DARK = 4;
public static final int ANDROID_PAY_LIGHT = 5;
public static final int ANDROID_PAY_LIGHT_WITH_BORDER = 6;
Then in utils add nfcController.java with this method:
public boolean NFCsupport() {
    boolean nfcSupport;
    NfcManager manager = (NfcManager) mAppContext.getSystemService(Context.NFC_SERVICE);
    NfcAdapter adapter = manager.getDefaultAdapter();
    if (adapter != null && adapter.isEnabled()) {
        nfcSupport = true;
        // Yes, NFC is available
    } else {
        nfcSupport = false;
        // This device doesn't support NFC
    }
    return nfcSupport;
}
Then in your CheckoutActivity.java, or wherever you have your wallet implementation, add this:
if (nfcController.NFCsupport()) {
    // turn on NFC (other method in util nfcController.java)
    nfcController.enableNfcPower(true);
    // show NFC payment (Android Pay)
    mWALLET_APPEARANCE = Constants.ANDROID_PAY_LIGHT;
    createAndAddWalletFragment(mWALLET_APPEARANCE);
    Log.d("nfc", "you have nfc support");
} else {
    Log.d("nfc", "dont have nfc support");
    // show non-NFC payment (Wallet)
    mWALLET_APPEARANCE = Constants.GOOGLE_WALLET_CLASSIC;
    createAndAddWalletFragment(mWALLET_APPEARANCE);
}
In your createAndAddWalletFragment(int WALLET_APPEARANCE), change the appearance flags:
WalletFragmentStyle walletFragmentStyle = new WalletFragmentStyle()
        .setBuyButtonText(BuyButtonText.BUY_WITH_GOOGLE)
        .setBuyButtonAppearance(WALLET_APPEARANCE)
        .setBuyButtonWidth(WalletFragmentStyle.Dimension.MATCH_PARENT);
Second, send the wallet appearance in the intent:
intent.putExtra("wallet_appearance", mWALLET_APPEARANCE);
And in your ConfirmationActivity, update this function in createAndAddWalletFragment() so the fragment shows the correct logo:
WalletFragmentStyle walletFragmentStyle = new WalletFragmentStyle()
        .setMaskedWalletDetailsLogoImageType(mWALLET_APPEARANCE) // from getIntent()
        .setMaskedWalletDetailsTextAppearance(
                R.style.BikestoreWalletFragmentDetailsTextAppearance)
        .setMaskedWalletDetailsHeaderTextAppearance(
                R.style.BikestoreWalletFragmentDetailsHeaderTextAppearance)
        .setMaskedWalletDetailsBackgroundColor(
                getResources().getColor(R.color.white))
        .setMaskedWalletDetailsButtonBackgroundResource(
                R.drawable.bikestore_btn_default_holo_light);
Finally, in your FullWalletConfirmationButton.java, in the onCreate method, don't forget the useGoogleWallet() call when creating and setting up the fragment:
// Set up an API client
mGoogleApiClient = new GoogleApiClient.Builder(getActivity())
        .addConnectionCallbacks(this)
        .addOnConnectionFailedListener(this)
        .setAccountName(accountName)
        .addApi(Wallet.API, new Wallet.WalletOptions.Builder()
                .useGoogleWallet()
                .setEnvironment(Constants.WALLET_ENVIRONMENT)
                .setTheme(WalletConstants.THEME_HOLO_LIGHT)
                .build())
        .build();
And with that you have both Android Pay and Android Wallet support.
I am trying to write a DLNA application using the Cling Java library. I can search all the media servers in the DLNA network and play their content. But I need to find the Media Renderers available in the network and play content on them, just like UPnPlay does.
Thanks in advance.
public class MyUpnpService extends AndroidUpnpServiceImpl {
    @Override
    protected AndroidUpnpServiceConfiguration createConfiguration(WifiManager wifiManager) {
        return new AndroidUpnpServiceConfiguration(wifiManager) {
            @Override
            public ServiceType[] getExclusiveServiceTypes() {
                return new ServiceType[] {
                        new UDAServiceType("AVTransport")
                };
            }
        };
    }
}
Searching for devices with the AVTransport service capability solved the issue of finding Media Renderers for remote playback. For the remote playback itself I found enough documentation from this.
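For completeness, a minimal playback sketch using Cling's AVTransport action callbacks (from org.fourthline.cling.support.avtransport.callback); upnpService, the discovered renderer device, and the media URL are assumptions here:

// Find the AVTransport service on a renderer discovered via the registry.
final Service avTransport = device.findService(new UDAServiceType("AVTransport"));

// Tell the renderer what to play, then start playback once the URI is set.
upnpService.getControlPoint().execute(new SetAVTransportURI(avTransport, "http://192.168.1.10/video.mp4") {
    @Override
    public void success(ActionInvocation invocation) {
        upnpService.getControlPoint().execute(new Play(avTransport) {
            @Override
            public void failure(ActionInvocation invocation, UpnpResponse response, String msg) {
                Log.e("MyUpnpService", "Play failed: " + msg);
            }
        });
    }

    @Override
    public void failure(ActionInvocation invocation, UpnpResponse response, String msg) {
        Log.e("MyUpnpService", "SetAVTransportURI failed: " + msg);
    }
});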