I need to check whether the Android phone my app runs on is casting, where the casting was enabled outside of my app.
It seems CastSession and SessionManager only provide the session related to my app, which is not helpful for me.
For example, I can start casting with an app called xx, which will cast or mirror the entire screen of my phone. Now I need to be notified, when I open my app, that the phone's screen is being cast/mirrored, so I can prevent specific content from being shown in my app.
I checked it with the code below:
val isCastingEnabledLiveData = MutableLiveData<Boolean>()

fun isCastingEnabled(context: Context) {
    val mediaRouter = MediaRouter.getInstance(context)
    if (mediaRouter.routes.size <= 1) {
        isCastingEnabledLiveData.value = false
        return
    }
    val selector = MediaRouteSelector.Builder()
        .addControlCategory(MediaControlIntent.CATEGORY_LIVE_VIDEO)
        .addControlCategory(MediaControlIntent.CATEGORY_REMOTE_PLAYBACK)
        .build()
    mediaRouter.addCallback(selector, object : MediaRouter.Callback() {
        override fun onRouteChanged(router: MediaRouter, route: MediaRouter.RouteInfo) {
            super.onRouteChanged(router, route)
            // Casting if a non-default route is connected (or connecting)
            isCastingEnabledLiveData.value = if (route != mediaRouter.defaultRoute) {
                route.connectionState != MediaRouter.RouteInfo.CONNECTION_STATE_DISCONNECTED
            } else false
        }
    })
}
You can check whether the phone's screen is casting by using the MediaRouter class.
Here is an example of how you could check it with the framework android.media.MediaRouter:
MediaRouter mediaRouter =
        (MediaRouter) getSystemService(Context.MEDIA_ROUTER_SERVICE);
MediaRouter.RouteInfo route =
        mediaRouter.getSelectedRoute(MediaRouter.ROUTE_TYPE_LIVE_VIDEO);
if (route.isDefault()) {
    // Screen is not casting
} else {
    // Screen is casting
}
This code uses the getSelectedRoute() method of the MediaRouter class to get the currently selected route. If the returned RouteInfo object is the default route, the screen is not casting; otherwise it is.
Please note that this code uses getSystemService(Context.MEDIA_ROUTER_SERVICE) to obtain the MediaRouter instance, so it should be called from an Activity or Service context.
Additionally, you could use MediaRouter.Callback with MediaRouter.addCallback to register a callback that monitors the casting state, so you also get updates when it changes.
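For instance, a minimal sketch using the androidx.mediarouter classes from the question (selector, mediaRouter and isCastingEnabledLiveData are the ones defined there):

// Keep the LiveData up to date as routes are selected and unselected
val callback = object : MediaRouter.Callback() {
    override fun onRouteSelected(router: MediaRouter, route: MediaRouter.RouteInfo) {
        // A non-default selected route means output is going to an external display
        isCastingEnabledLiveData.value = !route.isDefault
    }

    override fun onRouteUnselected(router: MediaRouter, route: MediaRouter.RouteInfo) {
        isCastingEnabledLiveData.value = false
    }
}
mediaRouter.addCallback(selector, callback, MediaRouter.CALLBACK_FLAG_REQUEST_DISCOVERY)
// Call mediaRouter.removeCallback(callback) when you stop observing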
I decided to experiment with MAUI. I am starting with an Android app, using Shell for navigation.
My App has 2 ways of opening:
When it's opened by the user tapping on the icon
Through a deep link, triggered by another app.
The issue I'm having is that when the app is opened through the deep link, I need to navigate to a specific page. I'm trying to do it in OnNewIntent, accessing the Current instance of Shell, but calling GoToAsync("my_route") gives an error when navigating to the new page.
This is what I have on my MainActivity:
protected override void OnNewIntent(Intent intent)
{
    base.OnNewIntent(intent);
    var action = intent.Action;
    var data = intent.DataString;
    if (!string.IsNullOrWhiteSpace(data) && data.Contains("/data/"))
    {
        if (Shell.Current != null)
        {
            Shell.Current.GoToAsync("myroute");
            // Also tried:
            // - Shell.Current.GoToAsync("myroute").Wait();
            // - App.Current.Dispatcher.Dispatch(async () => await Shell.Current.GoToAsync("//myroute")); (suggested by @toolmakersteve)
        }
    }
}
And this is the error:
Java.Lang.IllegalArgumentException: 'No view found for id 0x1
(unknown) for fragment ShellItemRenderer{19d353d}
(6c8560ab-dd58-4cbf-9e8b-2b9e12315f45 id=0x1)'
I'm assuming this has something to do with the fact that what I'm doing is not possible, so I need to find the RIGHT way to navigate to a specific page from OnNewIntent in MAUI, using Shell navigation.
UPDATE: It's also important to note that when the deep link triggers the app to open, there are two different behaviours:
If the app was already running, it throws the above-mentioned exception.
If the app was not already running, it opens normally on the main screen, with no errors, but I would expect it to navigate to the desired page.
Thanks!
First, make sure that GoToAsync("myroute") works if you use it somewhere more typical, such as a button press.
Assuming that works, then perhaps the intent code isn't running in the Dispatcher context (previously known as MainThread). Try:
Dispatcher.Dispatch(() => {
    Shell.Current.GoToAsync("myroute");
});
VERSION 2
Perhaps deep link logic runs BEFORE App's OnResume.
If so, this might work:
In App.xaml.cs:
public partial class App : Application
{
    ...
    public static bool FromDeepLink;

    protected override void OnResume()
    {
        base.OnResume();
        if (FromDeepLink)
        {
            FromDeepLink = false;
            MainPage = new MainPage();
            Dispatcher.Dispatch(() =>
            {
                Shell.Current.GoToAsync("myroute");
            });
        }
    }
}
Then in OnNewIntent:
if (!string.IsNullOrWhiteSpace(data) && data.Contains("/data/")) {
    App.FromDeepLink = true;
}
Conceptually @ToolmakerSteve's answer is correct, but the OnResume event of the Application class does not seem to fire when the app is resumed by an intent (it appears to be a MAUI bug). However, Android's native OnResume works and fires correctly even when the app is resumed via an intent. All you have to do is override Android's native OnResume method in the MainActivity class:
protected override void OnResume()
{
    base.OnResume();
    var fromDeepLink = Preferences.Get("FromDeepLink", false);
    if (fromDeepLink)
    {
        Preferences.Set("FromDeepLink", false);
        Shell.Current.GoToAsync("myroute");
    }
}

protected override void OnNewIntent(Intent intent)
{
    base.OnNewIntent(intent);
    var action = intent.Action;
    var data = intent.DataString;
    if (!string.IsNullOrWhiteSpace(data) && data.Contains("/data/"))
    {
        Preferences.Set("FromDeepLink", true);
    }
}
Android 12 came with a new privacy setting to disable access to the camera and mic sensors, referred to as Toggles in the docs.
As mentioned in the docs:
the system reminds the user that the device-wide toggle is turned off
However, it seems that it only reminds the user when requesting the Camera permission, not when trying to authenticate the user using biometrics (face authentication on Pixel phones, which, guess what, uses the camera). [I'm using the AndroidX Biometric library]
Is there any way to find out whether camera access has been blocked by the user, without requesting any permission?
I guess the note in the docs didn't take into account that the app might use face authentication:
Note: The toggles mentioned in this section shouldn't require changes to your app's logic, as long as you follow privacy best practices.
Notes:
You can't register a new face in Settings while camera access is blocked. The Settings app does not show any error, just a blank camera feed.
I am using a Pixel 4 (Android 12).
The 'Join Wi-Fi by scanning a QR code' feature does not work either, and shows no feedback to the user if camera access is blocked (Pixel 5).
So I'm also looking for a solution - I have a biometric library, and a few reports appeared in my DMs with the same problem: Face Unlock doesn't work on the Pixel 4 when the camera is 'muted'.
For now there is still no fix, but maybe my research can help someone.
1. I checked the new API for PrivacyToggles.
Android 12 introduces a new SensorPrivacyManager with a supportsSensorToggle() method - it returns true if the device is able to 'mute' the camera or mic.
val sensorPrivacyManager = applicationContext
    .getSystemService(SensorPrivacyManager::class.java) as SensorPrivacyManager

val supportsMicrophoneToggle = sensorPrivacyManager
    .supportsSensorToggle(SensorPrivacyManager.Sensors.MICROPHONE)
val supportsCameraToggle = sensorPrivacyManager
    .supportsSensorToggle(SensorPrivacyManager.Sensors.CAMERA)
If you look into SensorPrivacyManager, you can find that it provides some more useful methods, so I developed the following code:
fun isCameraAccessible(): Boolean {
    return !checkIsPrivacyToggled(SensorPrivacyManager.Sensors.CAMERA)
}

@SuppressLint("PrivateApi")
private fun checkIsPrivacyToggled(sensor: Int): Boolean {
    val sensorPrivacyManager: SensorPrivacyManager =
        appContext.getSystemService(SensorPrivacyManager::class.java)
    if (sensorPrivacyManager.supportsSensorToggle(sensor)) {
        // Reflect into the hidden isSensorPrivacyEnabled(sensor, userId) API
        val userHandleField = UserHandle::class.java.getDeclaredField("USER_CURRENT")
        userHandleField.isAccessible = true
        val userHandle = userHandleField.get(null) as Int
        val m = SensorPrivacyManager::class.java.getDeclaredMethod(
            "isSensorPrivacyEnabled",
            Int::class.javaPrimitiveType,
            Int::class.javaPrimitiveType
        )
        m.isAccessible = true
        return m.invoke(sensorPrivacyManager, sensor, userHandle) as Boolean
    }
    return false
}
Unfortunately, the service rejects this call with a SecurityException - missing android.permission.OBSERVE_SENSOR_PRIVACY - even if we declare it in the manifest. At least on the emulator.
2. We can try to detect the new "sensor-in-use" indicator:
fun checkForIndicator() {
    findViewById<View>(Window.ID_ANDROID_CONTENT)?.let {
        it.setOnApplyWindowInsetsListener { view, windowInsets ->
            val indicatorBounds = windowInsets.privacyIndicatorBounds
            if (indicatorBounds != null) {
                Toast.makeText(view.context, "Camera-in-use detected", Toast.LENGTH_LONG).show()
            }
            // change your UI to avoid overlapping
            windowInsets
        }
    }
}
I didn't test this code (no real device), but it doesn't look very useful to me, because we can check the camera indicator only AFTER the biometric auth flow starts, while I need to know whether the camera is accessible BEFORE biometric auth starts.
3. Because the privacy toggles are related to Quick Settings, I decided that perhaps there is a way to determine the current toggle state the same way the tiles do.
But this API uses a very interesting solution - it does not use the Settings.Global or Settings.Secure sections; instead, all preferences are saved in "system/sensor_privacy.xml", which is not accessible to 3rd-party apps.
See SensorPrivacyService.java
I believe there is a way to find out that the camera is blocked, but it seems some deeper research is required.
UPDATED 28/10/2021
So after some digging in the AOSP sources, I found that the APP_OP_CAMERA permission reflects the "blocking" state.
Just call if (SensorPrivacyCheck.isCameraBlocked()) { return } - this call also makes the system show the "Unblock" dialog.
Solution:
@TargetApi(Build.VERSION_CODES.S)
@RestrictTo(RestrictTo.Scope.LIBRARY)
object SensorPrivacyCheck {
    fun isMicrophoneBlocked(): Boolean {
        return Utils.isAtLeastS && checkIsPrivacyToggled(SensorPrivacyManager.Sensors.MICROPHONE)
    }

    fun isCameraBlocked(): Boolean {
        return Utils.isAtLeastS && checkIsPrivacyToggled(SensorPrivacyManager.Sensors.CAMERA)
    }

    @SuppressLint("PrivateApi", "BlockedPrivateApi")
    private fun checkIsPrivacyToggled(sensor: Int): Boolean {
        val sensorPrivacyManager: SensorPrivacyManager =
            AndroidContext.appContext.getSystemService(SensorPrivacyManager::class.java)
        if (sensorPrivacyManager.supportsSensorToggle(sensor)) {
            try {
                // Map the runtime permission to its app-op name
                val permissionToOp: String =
                    AppOpCompatConstants.getAppOpFromPermission(
                        if (sensor == SensorPrivacyManager.Sensors.CAMERA)
                            Manifest.permission.CAMERA else Manifest.permission.RECORD_AUDIO
                    ) ?: return false
                // noteOp returns something other than MODE_ALLOWED when the toggle blocks the sensor
                val noteOp: Int = try {
                    AppOpsManagerCompat.noteOpNoThrow(
                        AndroidContext.appContext,
                        permissionToOp,
                        Process.myUid(),
                        AndroidContext.appContext.packageName
                    )
                } catch (ignored: Throwable) {
                    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.KITKAT)
                        PermissionUtils.appOpPermissionsCheckMiui(
                            permissionToOp,
                            Process.myUid(),
                            AndroidContext.appContext.packageName
                        ) else AppOpsManagerCompat.MODE_IGNORED
                }
                return noteOp != AppOpsManagerCompat.MODE_ALLOWED
            } catch (e: Throwable) {
                e.printStackTrace()
            }
        }
        return false
    }
}
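For example, a usage sketch guarding a biometric flow with the object above (startFaceAuth() is a placeholder for your own code):

// Bail out of face auth while the camera is muted by the privacy toggle;
// calling isCameraBlocked() also makes the system show its "Unblock" dialog.
fun startFaceAuthIfPossible() {
    if (SensorPrivacyCheck.isCameraBlocked()) {
        return
    }
    startFaceAuth()
}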
Is there a way to capture the back button click from the Google Places address search (AutocompleteSupportFragment)?
val btnBackClick = autocompleteFragment?.view
    ?.findViewById<androidx.appcompat.widget.AppCompatImageButton>(R.id.places_autocomplete_back_button)
btnBackClick?.setOnClickListener {
    Log.e("AutoComplete", "Address Search Back")
}
I tried this, but it leads to the crash "java.lang.IllegalStateException: Places must be initialized."
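For reference, that particular exception means the SDK expects an explicit initialization call before the widget is used - a minimal sketch, with a placeholder API key:

// Hypothetical init (usually in Application.onCreate or before inflating the fragment)
if (!Places.isInitialized()) {
    Places.initialize(applicationContext, "YOUR_API_KEY")
}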
I've tried to get the back button the same way you did on my end, but you are right: that View (R.id.places_autocomplete_back_button) returns null even though it apparently exists.
Update: at the moment it is not possible to get this View, so I recommend you file a feature request for this in Google's Issue Tracker in case Google engineers are able to consider adding this capability.
Hope this helps!
I had the same problem. I couldn't get a reference through the back button of the AutocompleteSupportFragment, an onBackPressed listener, or the activity's onBackPressedDispatcher, so I went ahead and used the error status instead.
// Assuming "autocompleteFragment" is the view name in your XML file
val fragment = supportFragmentManager.findFragmentById(R.id.autocompleteFragment) as AutocompleteSupportFragment
// ...
fragment.setOnPlaceSelectedListener(object : PlaceSelectionListener {
override fun onPlaceSelected(place: Place) {
// Handle the result like usual
}
override fun onError(status: Status) {
// Check if the user tapped the back button
if (status == Status.RESULT_CANCELED) {
// Do what you want to do when back button is pressed
}
}
}
Technically a workaround, but it does capture the back button flow.
I am building an app that will recognize paintings and display info about them with the help of AR.
I need multiple image targets, but not active simultaneously - an image target should only be used when it is detected by the AR camera.
I've tried creating many scenes with an image target in each, but I can't use different image targets - it keeps reverting to only one image target.
This is what you can see in the menu:
Main menu
Start AR camera (this part should have many image targets, but should not detect them simultaneously)
Help (how to use the app)
Exit
I'm using Vuforia for the AR.
Thanks in advance to those who will help me.
(Screenshot: the ImageTarget and its database, originally posted on imgur.com.)
Run the multi-target scene sample. There are three targets (stone, wood and road).
Each contains the TrackableBehaviour component.
Grab it and disable it in Start. If you do it in Awake, it will most likely be set back to active, either in the Awake of the component itself or via some other manager.
public class TrackerController : MonoBehaviour
{
    private IDictionary<string, TrackableBehaviour> trackers = null;

    private void Start()
    {
        this.trackers = new Dictionary<string, TrackableBehaviour>();
        var found = FindObjectsOfType<TrackableBehaviour>();
        foreach (TrackableBehaviour tb in found)
        {
            this.trackers.Add(tb.TrackableName, tb);
            tb.enabled = false;
        }
    }

    public bool SetTracker(string name, bool value)
    {
        if (string.IsNullOrEmpty(name) == true) { return false; }
        if (this.trackers.ContainsKey(name) == false) { return false; }
        this.trackers[name].enabled = value;
        return true;
    }
}
The Start method finds all TrackableBehaviour components and places them in a dictionary for easy access. The setter method returns a boolean; you can change it to throw an exception or whatever suits you.
While making my application accessible, I have a problem - there's no way to make it SPEAK!!
Following Google's documentation, I override
public boolean dispatchPopulateAccessibilityEvent(AccessibilityEvent event)
in my customized view, and I get the right event message - I checked it using Log.d.
However, there's no way to make TalkBack speak...
My application supports API 8 and up, so I also can't use
onPopulateAccessibilityEvent()
Am I thinking about this wrong? Please, somebody help me...
For people looking to implement @Carter Hudson's code in Java (don't judge me because I'm still not using Kotlin in 2019):
AccessibilityManager accessibilityManager =
        (AccessibilityManager) context.getSystemService(Context.ACCESSIBILITY_SERVICE);
AccessibilityEvent accessibilityEvent = AccessibilityEvent.obtain();
accessibilityEvent.setEventType(AccessibilityEvent.TYPE_ANNOUNCEMENT);
accessibilityEvent.getText().add("Text to be spoken by TalkBack");
if (accessibilityManager != null) {
    accessibilityManager.sendAccessibilityEvent(accessibilityEvent);
}
I needed to announce when a button became visible after reloading a RecyclerView's items with a new dataset. RecyclerView being a framework view, it supports talkback / accessibility out-of-the-box. After loading new data, talkback announces "showing items x through y of z" automatically. Utilizing the TTS API to solve the use case I mentioned introduces the following pitfalls:
TTS instance initialization and management is cumbersome and questionable for the following reasons:
Managing TTS instance lifecycle with onInit listener
Managing Locale settings
Managing resources via shutdown() ties you to an Activity's lifecycle per documentation
An Activity's onDestroy is not guaranteed to be called, which seems like a poor mechanism for calling shutdown() in order to deallocate TTS resources.
An easier, more maintainable solution is to play nicely with TalkBack and utilize the Accessibility API like so:
class AccessibilityHelper {
    companion object {
        @JvmStatic
        fun announceForAccessibility(context: Context, announcement: String) {
            context
                .getSystemService(ACCESSIBILITY_SERVICE)
                .let { it as AccessibilityManager }
                .let { manager ->
                    AccessibilityEvent
                        .obtain()
                        .apply {
                            eventType = TYPE_ANNOUNCEMENT
                            className = context.javaClass.name
                            packageName = context.packageName
                            text.add(announcement)
                        }
                        .let { manager.sendAccessibilityEvent(it) }
                }
        }
    }
}
Call the above from wherever you need it (I added a method to my base activity that forwards to the helper). This inserts the announcement into the queue of messages for TalkBack to speak out loud and requires no handling of TTS instances. I ended up adding a delay parameter and mechanism to my final implementation to separate these events from ongoing UI-triggered events, as they sometimes tend to override manual announcements.
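A possible shape for that delay mechanism, as a sketch (the helper name and the 500 ms default are assumptions, not part of the original answer):

// Hypothetical helper: postpone the announcement so it isn't overridden
// by TalkBack's own UI-triggered events.
fun announceDelayed(context: Context, text: String, delayMs: Long = 500L) {
    Handler(Looper.getMainLooper()).postDelayed({
        AccessibilityHelper.announceForAccessibility(context, text)
    }, delayMs)
}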
This is a handy utility - you can use it anywhere, guarded by a check that TalkBack is actually active:
public static void speak_loud(String str_speak) {
    if (isGoogleTalkbackActive()) {
        AccessibilityManager accessibilityManager =
                (AccessibilityManager) getDefaultContext().getSystemService(Context.ACCESSIBILITY_SERVICE);
        AccessibilityEvent accessibilityEvent = AccessibilityEvent.obtain();
        accessibilityEvent.setEventType(AccessibilityEvent.TYPE_ANNOUNCEMENT);
        accessibilityEvent.getText().add(str_speak);
        if (accessibilityManager != null) {
            accessibilityManager.sendAccessibilityEvent(accessibilityEvent);
        }
    }
}

public static boolean isGoogleTalkbackActive() {
    AccessibilityManager am =
            (AccessibilityManager) getDefaultContext().getSystemService(Context.ACCESSIBILITY_SERVICE);
    if (am != null && am.isEnabled()) {
        List<AccessibilityServiceInfo> serviceInfoList =
                am.getEnabledAccessibilityServiceList(AccessibilityServiceInfo.FEEDBACK_SPOKEN);
        if (!serviceInfoList.isEmpty())
            return true;
    }
    return false;
}
If you want it to speak, use the TextToSpeech API. It takes a string and reads it out loud.
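A minimal sketch, assuming a Context is at hand (this speak() overload requires API 21+, and the engine should be shut down when no longer needed):

// Hypothetical wrapper: initialize the engine, then speak once it is ready.
fun speakOnce(context: Context, text: String) {
    var tts: TextToSpeech? = null
    tts = TextToSpeech(context) { status ->
        if (status == TextToSpeech.SUCCESS) {
            tts?.setLanguage(Locale.US)
            tts?.speak(text, TextToSpeech.QUEUE_FLUSH, null, "utterance-1")
        }
    }
    // Keep a reference and call tts?.shutdown() when done to free resources.
}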
The announceForAccessibility method defined in the View class probably serves the purpose here. It was introduced in API level 16.
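For completeness, a one-line sketch (myView is a placeholder for any View attached to the window):

// API 16+: asks the connected accessibility service (e.g. TalkBack) to speak the text
myView.announceForAccessibility("Text to be spoken")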