What is the purpose of the soundEffectsEnabled view attribute in Android?

What is the use of the view attribute soundEffectsEnabled (Boolean) that appears in the Android Studio properties panel?
I tried setting it to true for a Button, expecting a click sound when the button is tapped, but it had no effect. I looked up the documentation for the attribute at the link below, but it does not seem to be descriptive enough.
https://developer.android.com/reference/android/view/View#attr_android:soundEffectsEnabled
I understand that I can try to add sound effects by updating the onClickListener for the button and adding media resources, but I wanted to understand what the purpose of the soundEffectsEnabled attribute is, and how it can be useful. Thanks.

Answering my own question:
Setting the soundEffectsEnabled attribute of the view to true is required for enabling the click sound. However, two other things need to be done apart from this:
Make a call to View.playSoundEffect() in the OnClickListener of the view to produce the sound, as follows:
fun setListenerForButton(button: Button) {
    button.setOnClickListener(object : View.OnClickListener {
        override fun onClick(v: View?) {
            v?.playSoundEffect(android.view.SoundEffectConstants.CLICK)
            // Other functionality as required goes here
        }
    })
}
Enable sounds (and possibly haptic feedback) upon touch in your device settings. Generally this can be found in a place like Settings -> Sound -> (Advanced ->) Touch sounds
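For reference, the attribute can be set either in the layout XML (android:soundEffectsEnabled="true") or from code. A minimal Kotlin sketch, where R.id.my_button is a hypothetical id:

// Hypothetical: look up the button and enable sound effects programmatically,
// equivalent to android:soundEffectsEnabled="true" in the layout XML.
val button = findViewById<Button>(R.id.my_button)
button.isSoundEffectsEnabled = true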

How to set an android's view accessibility flag of "button" to "true"?

I have a ConstraintLayout with an OnClickListener so users can tap anywhere on the layout to perform its click action.
The problem is, Android does not flag this item as a button. It will say "double tap to activate", but our accessibility team has flagged this as incorrect because screen-reader users need to hear that the item is a "button" to know it is actionable.
In the past, my work-around was to swap the view for a Button styled to look exactly alike. However, that is a lot more difficult in this case because it's a ConstraintLayout.
Does anyone know how to set Accessibility's 'button' flag to 'true' on a ConstraintLayout? Or on any view?
Simple fix worked like a charm:
fun View.setAccessibilityRole(role: String = "button") {
    ViewCompat.setAccessibilityDelegate(this, object : AccessibilityDelegateCompat() {
        override fun onInitializeAccessibilityNodeInfo(host: View, info: AccessibilityNodeInfoCompat) {
            super.onInitializeAccessibilityNodeInfo(host, info)
            info.roleDescription = role
            info.className = role
        }
    })
}
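For reference, applying it to the ConstraintLayout from the question might look like the following (R.id.card is a hypothetical id and the call site is assumed to be an Activity or Fragment):

// Hypothetical usage on the clickable card from the question.
val card = findViewById<ConstraintLayout>(R.id.card)
card.setAccessibilityRole("button")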
While the other answer would seem to work, the main issue is that you are then relying on your app's translation team to translate "button" if the screen-reader user uses any language that doesn't have the same word for button.
// example usage
// val view = findViewById(...)
// view.setAccessibilityRole<Button>()
inline fun <reified T : View> View.setAccessibilityRole() {
    ViewCompat.setAccessibilityDelegate(
        this,
        object : AccessibilityDelegateCompat() {
            override fun onInitializeAccessibilityNodeInfo(
                host: View,
                info: AccessibilityNodeInfoCompat
            ) {
                super.onInitializeAccessibilityNodeInfo(host, info)
                info.className = T::class.qualifiedName
            }
        }
    )
}
This works for a button, and probably for other standard controls like RatingBar, ProgressBar and CheckBox, but it could be refined to be a little safer than what I have done.
A word of caution: there be dragons here. It can walk like a button and talk like a button, but there might be elements that go against the grain, make it less accessible still, and lead you to more regressions, as can be seen from the other answer (the translation issue). Other examples are keyboard navigation and highlighting, touch target size, or someone not setting text / a content description and then wondering why the view isn't focusable by a screen reader, switch access or a voice assistant. You could also end up with focusable controls within your "button", and that could be terribly confusing for certain users, depending on how they use the device (TalkBack does not encompass all assistive tech).
You're always better off extending existing controls for this reason.
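For what it's worth, a minimal sketch of that approach, assuming AppCompat is on the classpath (the class name is illustrative): back the clickable "card" with a real Button subclass, and the role, enabled/disabled state, and click handling come from the platform rather than from a delegate.

import android.content.Context
import android.util.AttributeSet
import androidx.appcompat.widget.AppCompatButton

// Minimal sketch: a Button subclass that can be styled to look like the card,
// so TalkBack reports the "button" role and its state without extra work.
class CardButton @JvmOverloads constructor(
    context: Context,
    attrs: AttributeSet? = null
) : AppCompatButton(context, attrs)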

Android: How to change the control type for Accessibility

There are some controls in our app whose control type, as read out by TalkBack, we'd like to change. For example: a TextView that would better be described as a link, or an ImageView that would better be described as a button.
We could simply update the content description to report the desired control type, though I am wondering if there is a better way? If there is, can it be done both through the view XML and dynamically in the code-behind?
Yes, it is possible to change the type. It is called roleDescription. You would change it as follows:
ViewCompat.setAccessibilityDelegate(yourView,
    object : AccessibilityDelegateCompat() {
        override fun onInitializeAccessibilityNodeInfo(v: View, info: AccessibilityNodeInfoCompat) {
            super.onInitializeAccessibilityNodeInfo(v, info)
            info.roleDescription = "Button"
        }
    })
(use string resources and translate the strings to all languages supported by your app)
This cannot be done via XML by default, but you could look into writing your own binding adapter for this.
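A minimal sketch of such a binding adapter, assuming the Data Binding library is enabled in the module; the attribute name "accessibilityRole" is made up for this example and could be anything:

import android.view.View
import androidx.core.view.AccessibilityDelegateCompat
import androidx.core.view.ViewCompat
import androidx.core.view.accessibility.AccessibilityNodeInfoCompat
import androidx.databinding.BindingAdapter

// Lets layouts declare app:accessibilityRole="@{@string/role_button}";
// the attribute name is an assumption, not a framework attribute.
@BindingAdapter("accessibilityRole")
fun View.setAccessibilityRoleDescription(role: String?) {
    if (role == null) return
    ViewCompat.setAccessibilityDelegate(this, object : AccessibilityDelegateCompat() {
        override fun onInitializeAccessibilityNodeInfo(
            host: View,
            info: AccessibilityNodeInfoCompat
        ) {
            super.onInitializeAccessibilityNodeInfo(host, info)
            info.roleDescription = role
        }
    })
}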

Android Compose Unit testing - Toggle a Switch

Given a button with a Modifier:
TextButton(modifier = Modifier.testTag("abc123"))
when you want to create a test to click it, you do:
composeTestRule.onNodeWithTag("abc123").performClick()
but when I'm having a:
Switch(modifier = Modifier.testTag("abc123"))
I'm trying every single perform gesture but I can't get the Switch to toggle, and I can't find any documentation from Android.
What's the correct way to toggle it automatically in order to test it?
I had issues toggling a switch with performClick(), but it turned out the switch wasn't visible on screen; performClick() will then simply click the coordinates (0, 0) without reporting any error.
So to ensure it's displayed first:
composeTestRule.onNodeWithTag("abc123")
    .assertIsDisplayed()
    .performClick()
OLD ANSWER (can still be used if you need to click something which is not displayed)
This seems to be a working way to toggle a material Switch in a Jetpack Compose test:
composeTestRule.onNodeWithTag("abc123")
    .performSemanticsAction(SemanticsActions.OnClick)
I don't know if you are still struggling with this, but the following seems to work for me:
composeTestRule
    .onNodeWithTag("abc123")
    .performGesture { swipeLeft() } // or swipeRight to turn it on
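Putting the pieces together, a self-contained test might look like the sketch below. It assumes the androidx.compose.ui:ui-test-junit4 dependency and Material's Switch; the tag "abc123" and the test names are illustrative only.

import androidx.compose.material.Switch
import androidx.compose.runtime.getValue
import androidx.compose.runtime.mutableStateOf
import androidx.compose.runtime.remember
import androidx.compose.runtime.setValue
import androidx.compose.ui.Modifier
import androidx.compose.ui.platform.testTag
import androidx.compose.ui.test.assertIsDisplayed
import androidx.compose.ui.test.assertIsOff
import androidx.compose.ui.test.assertIsOn
import androidx.compose.ui.test.junit4.createComposeRule
import androidx.compose.ui.test.onNodeWithTag
import androidx.compose.ui.test.performClick
import org.junit.Rule
import org.junit.Test

class SwitchToggleTest {

    @get:Rule
    val composeTestRule = createComposeRule()

    @Test
    fun switch_togglesWhenClicked() {
        composeTestRule.setContent {
            // Hypothetical content under test: a stateful Switch tagged "abc123".
            var checked by remember { mutableStateOf(false) }
            Switch(
                checked = checked,
                onCheckedChange = { checked = it },
                modifier = Modifier.testTag("abc123")
            )
        }

        // Ensure the node is on screen, check the initial state, then click it.
        composeTestRule.onNodeWithTag("abc123")
            .assertIsDisplayed()
            .assertIsOff()
            .performClick()

        // The semantics of Switch expose a toggleable state we can assert on.
        composeTestRule.onNodeWithTag("abc123").assertIsOn()
    }
}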

disabling view announcement when focused (Talkback Enabled)

I'm using a custom dynamic contentDescription for my textview, so it has been implemented in my java class and not in my xml file.
private void enableButton(TextView textView, boolean enabled) {
    if (textView == null) {
        return;
    }
    if (!enabled) {
        textView.setContentDescription(textView.getResources().getString(R.string.install_disabled));
    } else {
        textView.setContentDescription(textView.getResources().getString(R.string.install));
    }
    textView.setEnabled(enabled);
}
After making my TextView clickable, and with TalkBack enabled, focusing on the TextView announces its state, which is "disabled". Is there a way to not announce that state?
I do not want to mark the view as not important for accessibility, because I still want my dynamic contentDescription to be read when TalkBack users focus on the TextView.
Suggestion:
I believe the culprit is the setEnabled() method, which somehow triggers the announcement of the TextView's state, but I'm still not able to stop that from being read out last.
My first answer is: LEAVE IT ALONE! The "disabled" announcement tells a TalkBack user that there is a user interface control there, that under some circumstances can be interacted with, but is not currently active. Given your description, this is exactly what you have. To remove the announcement is actually going to make things WORSE from an accessibility perspective, the explanations for why this is the case are covered in WCAG 1.3.1.
Definitions:
Button = android.widget.Button
button = a user interface component that does something when you click it.
Text = a user interface component that conveys information, but is not active
Long story short, the fact that the control is ever in a state where it can be active and "not disabled" is significant information on its own, and SHOULD be shared with the user. ESPECIALLY since you're using a "TextView" to implement this. This isn't a super uncommon practice, but one of the ways TalkBack calculates roles (button, link, image, text, etc.) is by looking at the Class/Type of object. So, when it sees a TextView, it is going to assume Text, unless you inform it otherwise. Now, since you have added click listeners to your TextView (or Gesture Recognizers, or whatever), TalkBack may be able to figure out that the thing you're dealing with is actually a "button", and it may not. REGARDLESS, the fact that this "button" (lower case b!) is not active is important state to share with the user, and communicates to them the fact that they can somehow enable it and come back and interact with it later. This is immensely important information! Imagine if every button/link on a web page looked exactly like plain text? How would you know what to interact with?
Now, I will show you the different pieces of this puzzle, as information, but I really do encourage you to leave the announcement alone. This is coming from someone who routinely speaks at Accessibility conferences on Native Android development, PLEASE LEAVE THIS ANNOUNCEMENT IN. To not do so shows a misunderstanding of how users with sight impairments want to perceive controls within your application, and the information that is important to them.
The setEnabled() function on a TextView corresponds directly with the isEnabled() property of AccessibilityNodeInfo.
In order to implement the behavior you want, you want the AccessibilityNodeInfo representation of your TextView to differ from the actual state of your TextView. To do this you can use an AccessibilityDelegate. I'm actually not sure which callback you want to use: in one of these the node is likely to be "sealed", and in one of them it might not be sealed yet. You obviously want the one where the node is not yet sealed! Regardless, the general approach is:
textView.setAccessibilityDelegate(new View.AccessibilityDelegate() {
    @Override
    public void onInitializeAccessibilityNodeInfo(View host, AccessibilityNodeInfo info) {
        // Let the default implementation populate the info.
        super.onInitializeAccessibilityNodeInfo(host, info);
        // Override this particular property
        info.setEnabled(whateverYouThinkItShouldBe);
    }
});
Using setClickable(false) instead of setEnabled(false) will solve this problem.
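A minimal Kotlin sketch of that alternative, mirroring the enableButton() helper from the question (the string resources are the ones used above); whether it fits depends on how your click handling reacts to a clickable-but-blocked view:

fun enableButton(textView: TextView?, enabled: Boolean) {
    textView ?: return
    textView.contentDescription = textView.resources.getString(
        if (enabled) R.string.install else R.string.install_disabled
    )
    // Leave the view enabled so TalkBack does not append a "disabled" announcement;
    // block interaction by toggling clickability instead.
    textView.isClickable = enabled
}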

How to listen to Talkback's focus-change in Android?

I'm working on improving the accessibility within my app.
I have a pretty complicated layout with cards. Each card has some clickable objects inside it, but it also has a global click listener.
When I enable TalkBack, select the card (not something inside it!) and double-tap (to open the card), the card gets the touch event in its middle.
As a result, a nested object gets the click event and reacts accordingly.
The question is how to determine which item has TalkBack's focus (the green-rectangle thing, for me)? The idea is to disable the inner touch listeners if the card itself is focused.
The API level I want to support is 16 (Android 4.1+).
Thanks!
I think what would work best for you is to override the accessibility delegate of the layout view, listening for accessibility focus events. When focus lands on a card, remove the inner listeners; when focus leaves your cards, re-attach them. Attach this delegate to your layout view, and you should be able to watch as the various views within your layout obtain and give up accessibility focus.
class MyAccessibilityDelegate extends View.AccessibilityDelegate {
    @Override
    public boolean onRequestSendAccessibilityEvent(ViewGroup viewGroup, View child, AccessibilityEvent event) {
        if (event.getEventType() == AccessibilityEvent.TYPE_VIEW_ACCESSIBILITY_FOCUSED) {
            // Do stuff in here! Maybe also do different stuff when focus is cleared!
        }
        return super.onRequestSendAccessibilityEvent(viewGroup, child, event);
    }
}
The APIs for this were added in API level 14, so you should be good to go!
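For completeness, wiring the delegate up might look something like this (cardLayout is a hypothetical reference to the card's root ViewGroup; the delegate class above is Java, but the call is the same from Kotlin):

// Hypothetical wiring: attach the delegate to the ViewGroup that contains the
// nested clickable views, so it sees their accessibility-focus events.
cardLayout.setAccessibilityDelegate(MyAccessibilityDelegate())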
