Is there a way to create multiple flyout-style interface items, similar to Xamarin Shell's flyout, that can be activated from a single page (but not from other pages in my app), and that can contain multiple types of content? I am looking for something like what is shown in this video, recorded from Microsoft Teams: https://www.youtube.com/watch?v=8GIa0UcOZAk
I have been looking for documentation on this, but I don't even know what it would be called, other than a "flyout menu", and that term refers to the single menu that Shell provides.
I am trying to make my app more accessible, but I am having a hard time finding helpful material because there isn't much documentation (at least none I could find).
In my app, TalkBack does not announce the element type for ImageViews. What I basically want is for TalkBack to announce my contentDescription for the ImageView and follow it with ", Image".
This link states that "Many accessibility services, such as TalkBack and BrailleBack, automatically announce an element's type after announcing its label, so you shouldn't include element types in your labels. For example, "submit" is a good label for a Button object, but "submitButton" isn't a good label." However, it does not specify which element types TalkBack announces and which it does not.
https://developer.android.com/guide/topics/ui/accessibility/apps.html
Does anyone know whether TalkBack announces "Image" after the contentDescription for ImageViews?
When does TalkBack announce a link as a "Link"? Or is it the developer's responsibility to add that at the end of the contentDescription? Can I make TalkBack announce clickable text as a "Link"?
Any help/information/pointers are greatly appreciated. Thanks in advance.
A: Don't add things to the end of the content description. It is an accessibility violation and in almost ALL circumstances just makes things less accessible (I will explain more later).
B: A lot of contextual things are communicated to TalkBack users via earcons (bips, beeps, etc.); you may just not be noticing them.
C: Yes, this is confusing and difficult to determine. No, images are not announced, though I think this is for good reason: an image with a click listener may just be a fancy styled button. In iOS there are traits you can change for this; Android has omitted this highly useful feature, so we're stuck with odd workarounds. The ideal solution would be for the Accessibility APIs to allow the developer to communicate this information.
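For what it's worth, one widely used (though unofficial) workaround is to override the node's class name through an AccessibilityDelegate; TalkBack derives the spoken role from that class name. A minimal sketch, assuming API 14+ and a made-up view ID:

```java
import android.view.View;
import android.view.accessibility.AccessibilityNodeInfo;
import android.widget.Button;
import android.widget.ImageView;

// Inside your Activity, after setContentView(); R.id.fancy_button_image is hypothetical.
ImageView image = (ImageView) findViewById(R.id.fancy_button_image);
image.setAccessibilityDelegate(new View.AccessibilityDelegate() {
    @Override
    public void onInitializeAccessibilityNodeInfo(View host, AccessibilityNodeInfo info) {
        super.onInitializeAccessibilityNodeInfo(host, info);
        // TalkBack announces the role based on this class name, so a clickable
        // image styled as a button can be read as "Button".
        info.setClassName(Button.class.getName());
    }
});
```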
As for links, inline links in text views are typically announced (basically anything Android detects and underlines automatically), but other links are not. So, in practice, A LOT of links are missed.
Now, as for when you should or should not supply this information yourself: if unsure, just don't, and you'll achieve a reasonably high level of accessibility by following the above guidelines. In fact, the considerations below are really just fighting the Android OS's limitations, and those limitations are Android's problem! However, the Android accessibility ecosystem is very weak, and wanting to provide a higher level of accessibility is understandable, though difficult. In your attempts you can actually end up working against yourself. Let me explain:
In accessibility there is a line between providing information and maintaining consistency. By adding contextual information to a content description, you are walking along that line. What if Google said, "We're not going to share contextual information; add it yourself!"?
You would have buttons in music players across different music playing apps that announce in TalkBack like this:
App1: "Play, Button"
App2: "Play, Actionable"
App3: "Play, Clickable"
Do we have consistency here? Now a final example!
App4: "Play, Double Tap to click if you're on TalkBack, Hit enter if you're a keyboard user, use your select key for SwitchAccess users....."
Notice how complicated App4's Play button is. This illustrates that the information TalkBack uses is NOT JUST FOR TALKBACK. The accessibility information from your app is consumed by a multitude of accessibility services. When you "hack" contextual information onto a content description, sure, it might sound better for a TalkBack user, but what have you done to BrailleBack users? To SwitchAccess users? In general, the content description of an element should describe the element, leaving contextual information for TalkBack to calculate and for users to figure out from proximity to other controls.
TO ANSWER YOUR TWO PARTICULAR ISSUES (Images and Links):
What I recommend for images is to make it obvious in the content description that the thing you're describing is an image.
Let's say you have a picture of a kitten.
BAD: A kitten, Image
GOOD: A picture of a kitten.
See, here, if TalkBack fails to announce this as an image (which it will), the user still gets the idea that it is a picture. You have added contextual information to the content description in a way that ACTUALLY DESCRIBES THE CONTROL BETTER. This is actually the most accessible solution! Go figure.
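In code, that is nothing more than the description itself, with no role text appended (the view ID here is made up):

```java
ImageView kitten = (ImageView) findViewById(R.id.kitten_image);
// Describe what the image shows; let the accessibility service add role and context.
kitten.setContentDescription("A picture of a kitten");
```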
Links:
For links this gets a bit tricky. There is a great debate in Accessibility about what constitutes a link vs a button. In the web browser world, this is a good debate. However, in native mobile I think we have a very clear definition:
Any button that when activated takes you away from your current Application Context, is a link.
The fact that the Context (Capital C Context!!!) is going to change significantly when a user interacts with a control is EXCEPTIONALLY important information.
TalkBack fails to recognize links A LOT, and this is exceptionally important information, so if you find that TalkBack is not sharing it, go ahead and append ", link" to the content description for that element. THIS IS THE ONLY EXCEPTION TO THIS RULE I HAVE FOUND, but I believe it is a good exception. Yes, it does add a violation or two for other assistive technologies, but it conveys important enough information to justify doing so. You CANNOT create a WCAG 2.0 compliant application with a reasonably complex user interface using the Android Accessibility APIs. They have too many limitations; you simply can't do everything you need to do without "hacks". So, we have to make judgment calls sometimes.
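As the one hedged exception, then, the append might look like this (the view ID and text are illustrative):

```java
TextView termsLink = (TextView) findViewById(R.id.terms_link);
// Only because TalkBack frequently misses links: append the role manually.
// This is a deliberate trade-off; other services will read "link" too.
termsLink.setContentDescription("Terms of Service, link");
```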
I am relatively new to Android development and I'm having a good time so far. My application is functional and I'm ready to add a few options for the user. There seems to be a wealth of information on ways to do this, and I'm having trouble sorting through it and determining the currently "accepted" method of providing options to the user.
Does anyone have a great resource to share?
It really depends on the particular app, existing UX, branding and a ton of other design considerations.
In general, starting from scratch, a good starting point (though not universally!) would be the action bar. It gives the user access to ways of manipulating the data on screen, as well as a way of consistently presenting secondary functionality (the overflow menu). Design docs, implementation docs.
I would heartily recommend going through the Patterns section of the design documentation, as it explains the rationale behind many of the core design decisions.
P.S. The reason I'm eager to underline that it's not universal is apps with established UX and user expectations. Examples include Facebook, Path, Google Maps. They all have their reasons for not sticking strictly to the action bar paradigms but they work with it as much as they can.
This Menu Doc page is particularly helpful. Basically, on API < 11 you use the hardware Menu button to open an options menu. From API 11 onward, menu items show in the ActionBar as action items or in the overflow menu, but they can still be reached through a hardware Menu button if the device has one.
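As a rough sketch of that pattern (the menu resource, item IDs, and SettingsActivity are all made up), the same menu XML feeds both the pre-11 hardware menu and the ActionBar:

```java
@Override
public boolean onCreateOptionsMenu(Menu menu) {
    // On API < 11 these items appear under the hardware Menu button; on
    // API 11+ they surface in the ActionBar or its overflow, depending on
    // each item's android:showAsAction attribute in R.menu.main (hypothetical).
    getMenuInflater().inflate(R.menu.main, menu);
    return true;
}

@Override
public boolean onOptionsItemSelected(MenuItem item) {
    switch (item.getItemId()) {
        case R.id.action_settings: // hypothetical item id
            startActivity(new Intent(this, SettingsActivity.class)); // placeholder activity
            return true;
        default:
            return super.onOptionsItemSelected(item);
    }
}
```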
You also always have the option of a contextual menu, typically triggered by a long click, so you can show different menus depending on which View was pressed. For example, show an edit/delete/save menu for a list item, as sketched below.
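A sketch of that long-press flow for a ListView (the constants, strings, and listView reference are illustrative):

```java
private static final int MENU_EDIT = 1;
private static final int MENU_DELETE = 2;
private static final int MENU_SAVE = 3;

// In onCreate(): long-pressing a row in the list now opens a context menu.
registerForContextMenu(listView);

@Override
public void onCreateContextMenu(ContextMenu menu, View v,
        ContextMenu.ContextMenuInfo menuInfo) {
    super.onCreateContextMenu(menu, v, menuInfo);
    menu.add(Menu.NONE, MENU_EDIT, 0, "Edit");
    menu.add(Menu.NONE, MENU_DELETE, 1, "Delete");
    menu.add(Menu.NONE, MENU_SAVE, 2, "Save");
}

@Override
public boolean onContextItemSelected(MenuItem item) {
    AdapterView.AdapterContextMenuInfo info =
            (AdapterView.AdapterContextMenuInfo) item.getMenuInfo();
    // info.position identifies which list row was long-pressed.
    if (item.getItemId() == MENU_DELETE) {
        // Delete the item at info.position, then refresh the adapter.
        return true;
    }
    return super.onContextItemSelected(item);
}
```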
What is the best way to show help (or user guidelines) to the user when the application is launched for the first time? Some applications show overlay text and arrows to point out the various features available in the application. What is the best way to implement this? Do I need a separate activity, do I modify my home screen XML, or something else?
Please suggest a good approach, as well as a specific query to search for on Google (I couldn't find any specific result).
You can try this lib and have a look at how it's done: https://github.com/Espiandev/ShowcaseView
The ShowcaseView library is designed to highlight and showcase specific parts of apps to the user with a distinctive and attractive overlay. This library is great for pointing out points of interest for users, gestures, or obscure but useful items.
The library is based on the "Cling" view found in the Launcher on Ice-Cream Sandwich and Jelly Bean, but extended to be easier to use.
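The exact API has shifted between releases of this library, so treat the following as a sketch rather than copy-paste; it assumes the Builder-style API found in later versions, with a made-up target view:

```java
// Builder-style usage (later versions of the library); the view id is hypothetical.
new ShowcaseView.Builder(this)
        .setTarget(new ViewTarget(R.id.compose_button, this))
        .setContentTitle("Compose")
        .setContentText("Tap here to write your first message.")
        .build();
```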
If you're using Twitter for Android, you can see that pressing the phone's hardware Search button brings up a fully customized, quick-search-like control. Now, I'm not saying that it IS the stock Android quick search customized, but how would one build something like that?
I want the quick search box behavior, but I also want to add some additional selectors (think Firefox's search, where there's a dropdown on the left to select the search engine).
I suspect that pressing Search brings up another activity that just looks like the quick search. Now, I know how to trigger a search activity from the quick search, but how do I intercept the quick search box call and display my own activity instead?
Where would I start with something like that? Any hints and pointers will be greatly appreciated
With no screenshot, I can't help terribly much; since I don't use the official Twitter app (I'm a Seesmic guy), I don't quite know what you are talking about.
You can override onSearchRequested() to get control when the user requests a search in your activity. Return true to say you're handling the search yourself. Along the way, pop up whatever you like to allow the user to do a search.
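A minimal sketch of that override; showCustomSearchUi() is a placeholder for whatever dialog or activity you want to present:

```java
@Override
public boolean onSearchRequested() {
    // Present your own search UI (e.g. a dialog with an engine selector)
    // instead of the stock quick search box.
    showCustomSearchUi(); // hypothetical helper
    return true; // true = we handled the search request ourselves
}
```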