This series of posts describes some of the steps you can take to enhance the programmatic accessibility of your Win32, WinForms and WPF apps.
Well, it's been a while since I've had a chance to share some of the things I've learnt around building accessible apps. I've been working closely with a number of teams who are working hard to enhance the accessibility of their apps. This has been a fascinating time for me, because of the wide range of UI frameworks being used across the various teams. It's been a reminder for me that while the principles around accessibility are the same regardless of the UI framework you're using, the implementation details are quite different.
And sometimes this is a source of confusion for devs, who are trying to figure out exactly what they need to do in code to improve the accessibility of their apps. Someone contacted me a few weeks ago, saying that they'd heard they could fix a particular accessibility-related bug in their app through use of an AutomationPeer, but they'd not been able to figure out how to do that. It turned out that their app was a Windows Forms (WinForms) app, and AutomationPeer is a WPF and UWP XAML concept, not a WinForms one. The dev had lost valuable time trying to figure out how to leverage something that couldn't help them.
So in this series of posts, I'll call out some of the specific classes and functions that might be able to help you enhance the programmatic accessibility of your app, based on the UI framework you're using.
After my recent experiences, the first thing I ask when someone sends me a question about how they might fix an accessibility-related bug in their app, is "What UI framework are you using?". The answer's usually one or two of Win32, WinForms, WPF, UWP XAML, or HTML. I had one large team say they have a mix of UI built with Win32, WinForms, WPF, and HTML. I've yet to find a team that owns UI built with all five of the UI frameworks I deal with, but maybe there's one out there somewhere…
After getting the reminder that the term "AutomationPeer" gives no indication to someone new to accessibility as to what UI framework it relates to, I thought about how the same issue applies to a number of accessibility-related terms. So here's a big quiz.
Which of the terms below relate to which of Win32, WinForms, WPF, UWP XAML and HTML?
If you know the answer, then you've already won the prize. That prize being the power to reach out and help many devs build accessible apps, and so indirectly help many of their customers.
For those of us who aren't so familiar with all those terms, I'll connect them to related UI frameworks in this series of posts, particularly focusing on Win32, WinForms and WPF.
What about UWP XAML?
As far as accessibility goes, UWP XAML evolved from WPF, and many of the details around the accessibility of WPF also apply to UWP XAML. However, with each release of Windows, the support for accessibility in UWP XAML is enhanced, and so there are some very handy things you can do in UWP XAML apps that aren't practical in a WPF app. But the fundamentals of how to expose certain UI Automation (UIA) properties, or add support for certain UIA patterns, are the same for both WPF and UWP XAML. Details on building accessible UWP XAML apps can be found at UWP App Accessibility.
New for the Windows 10 Creators Update
Since we're on the topic of UWP XAML, I can't resist the urge to mention one of the things that I find most exciting about the enhancements to accessibility in the Windows 10 Creators Update.
Providing a way for your customers to use the keyboard to efficiently leverage all the functionality of your app is of huge value to your customers. Your app may show a busy page full of useful controls, and your customer might be able to use the keyboard to tab through all those controls today. Some people might claim that makes the app keyboard accessible, and indeed, it is essential that your customer can leverage all the great functionality in your app through use of the keyboard alone.
But really – why would your customer want to have to move through all the controls in your app before they reach the control that they want to interact with? After all, you don't make your customers who use a mouse move the mouse cursor to all the controls between the control they last worked with and the control they want to reach, before they can continue with their task. All customers want to progress through the steps in the task at hand, with no delays whatsoever.
You can satisfy your customers' desires here by adding access keys to your app. Access keys are the keyboard shortcuts you can assign to controls, which enable your customer to trigger action at the control by pressing the Alt key plus some control-specific character. That's the functionality that you've always been able to add to your Win32 and WinForms apps with almost no effort at all (by adding an ampersand somewhere in the text set on the control). Well, now it's a piece of cake to do the same thing in UWP XAML apps. For example, if you want to set an access key of "C" on a Button or TextBox, add this to your control's XAML:
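(The Content value below is just for illustration; the AccessKey property itself is what does the work.)

```xml
<!-- Pressing Alt shows a key tip beside the control; pressing C
     then moves focus to, or invokes, the control. -->
<Button Content="Copy" AccessKey="C" />
```

The same AccessKey attribute works on a TextBox or other controls in just the same way. (In practice, each access key should be unique within its scope.)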
This is so little work for devs, and provides so much power to their customers, that I'd say every team shipping UWP XAML apps should consider leveraging this cool new functionality.
What about HTML hosted in Edge?
The accessibility of HTML is a huge subject, as indicated by the great deal of discussion on the subject on the web. In this series of posts, I can only cover whatever I have time for in flights back and forth over the Atlantic. So I'm going to concentrate on trying to help with the confusion around what options a dev has with their Win32, WinForms and WPF apps.
That said, to give an introduction to the programmatic accessibility of HTML hosted in Edge, it's all about UIA. Whatever UI framework you're using to build your UI, be it Win32, WinForms, WPF, UWP XAML or Edge-hosted HTML, an assistive technology (AT) app like the Windows Narrator screen reader uses the UIA API to interact with your UI. Narrator is a UIA client app, and if it's interacting with your UI, then something somewhere has implemented the UIA Provider API.
Often the UI framework itself will implement the UIA Provider API on your behalf. You don't want to have to do all that work yourself unless you really have to. And sure enough, Edge is doing all that work on your behalf, so that Narrator and other UIA client apps can interact with your HTML-based UI.
Edge will expose data about your UI, through the UIA API, based on how you defined your UI. For example, if you added a button tag, Edge will expose a related UIA element whose UIA ControlType property is UIA_ButtonControlTypeId, and whose UIA Name property is whatever text is shown on the button.
And where you need to enhance the default accessibility of your UI, it can sometimes be appropriate to do this with ARIA. For example, say your button shows no text string on it, and instead shows some cool glyph from some font, and you specified this by referencing the associated Unicode value. In this situation Edge has no friendly string to repurpose as the UIA Name property, so you can help by adding the string that your customers need through use of the aria-label attribute. Edge will expose that data as the UIA Name property of the button.
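As a quick sketch of both cases (the glyph codepoint and class name here are placeholders):

```html
<!-- A text button needs no ARIA: Edge exposes a UIA element with
     ControlType Button, and the Name comes from the button's text. -->
<button>Save</button>

<!-- A glyph-only button has no useful text to repurpose as the Name,
     so supply the string your customers need through aria-label. -->
<button class="icon-font" aria-label="Save">&#xE74E;</button>
```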
If you use semantic HTML, Edge will expose your UI through UIA in an intuitive way, and where necessary, use of ARIA can further influence that UIA representation. Details on accessibility support in Edge can be found at Edge Accessibility.
By default, use standard controls and widgets
The above example around using an HTML button tag touches on one of the most important points relating to building accessible UI.
All the UI frameworks that I deal with do a ton of work on your behalf to provide the foundation for an accessible experience. And in some cases, portions of your UI can be accessible by default. Wherever practical, you really want to leverage the help that the UI framework can provide. By leveraging that help, it can be much quicker for you to build accessible UI, and it reduces the risk that you ship some severe accessibility bugs.
This point applies to all of Win32, WinForms, WPF and UWP XAML, but let's consider an HTML example. Say I want to show something in my app that looks like a bird feeder, and when invoked, some action occurs. It might be tempting for me to think that I won't add a button to my HTML, because my UI doesn't look like a typical button at all. So perhaps I'd add a clickable div, and style it such that it looks like a bird feeder.
But that's exactly what I don't want to do.
If I do that, Narrator won't be told that the UI has a ControlType of Button. The ControlType sets my customer's expectations around what actions they can take at the element. Ok, fair enough, maybe I could add a role of "button" to the div, and so have the ControlType exposed through UIA as Button. Well, that still won't make the element keyboard accessible. Use of ARIA (including the role attribute) won't affect how the UI reacts to keyboard input. Ok, fair enough, maybe I could add a tabindex attribute to the div, which would mean I could tab to it. Well, that still doesn't mean my customer can invoke the element once it has keyboard focus. Ok, fair enough, maybe I could add key event handlers to mimic the behavior of standard HTML buttons. (Hmm, should those react to the Spacebar or Enter, and on keydown or keyup?)
Well, yes, perhaps in some cases it is possible to patch up the inaccessible UI. But really – why would you want to spend time doing that? If you use a button tag in the first place, it'll be exposed through UIA as having a ControlType of Button, and your customer can tab to it, and invoke it through the keyboard, all by default. The only work on your plate is to account for whatever custom visual styling you've done, such that keyboard focus feedback is clear when the UI has focus, and the UI is as usable as possible when a high contrast theme is active.
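To make the comparison concrete, here's a sketch of the two approaches (the class name and the fillFeeder handler are hypothetical):

```html
<!-- The patch-up approach: role, tabindex and the key handler each
     recreate a piece of behavior a button provides by default, and
     even this sketch only approximates what a real button does. -->
<div class="bird-feeder" role="button" tabindex="0"
     onclick="fillFeeder()"
     onkeydown="if (event.key === 'Enter' || event.key === ' ') fillFeeder();">
  Fill the feeder
</div>

<!-- The standard approach: keyboard focus, keyboard invocation, and a
     UIA ControlType of Button all come for free. -->
<button class="bird-feeder" onclick="fillFeeder()">Fill the feeder</button>
```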
So by default, use standard controls and widgets.
Accessibility is more than just programmatic accessibility, right?
Mentioning keyboard accessibility and use of colors above is a reminder that there are a number of topics relating to accessibility. I tend to group all these topics into a few areas, as mentioned at Building accessible Windows Universal apps. In practice, the three areas of colors & contrast, keyboard accessibility, and programmatic accessibility still need very close attention if we're to avoid shipping some severe bugs.
And while your customers depend on you to build apps which are accessible in all these areas, this series of posts focuses on programmatic accessibility.
If you're to deliver a great experience for all your customers, then you need to feel confident that your app is programmatically accessible. Today, that means its representation through UIA is rock-solid. That is, the UIA properties it exposes, the control it provides through UIA patterns, and the notifications it provides through UIA events, all work together to deliver a full and efficient experience to your customers.
As part of the process for me to learn about an app's UIA representation, I use the Inspect and AccEvent SDK tools. I couldn't do my job without these tools. So I can't recommend enough that you, or someone on your team, get familiar with these tools. They're not exactly the most intuitive of tools, but once you've got the hang of Inspect, it can quickly draw your attention to the sorts of bugs which could render your app unusable to many people. Figure 1 below shows the Inspect tool reporting the UIA properties of a button in Visual Studio's WPF UI.
Note also that Inspect can be extremely helpful for providing information on the hierarchy of the elements exposed through the UIA tree. An unexpected order of elements in the UIA tree can be a source of severe bugs. I call this out explicitly in Part 2 of this series, because it's so easy to hit the problem when building WinForms apps. But it applies to all UI, and in fact I hit a bug related to this just a few days ago with Win32 UI, because the order of the controls in a dialog box as defined in the .rc file was very different to the visual layout of the UI.
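For example, here's a hypothetical .rc fragment (the IDs and coordinates are made up) where the declaration order doesn't match the visual layout:

```
IDD_OPTIONS DIALOGEX 0, 0, 200, 100
BEGIN
    // Declared first but positioned at the bottom right, so the tab
    // order and the UIA tree start here rather than at the control
    // the user sees first.
    DEFPUSHBUTTON   "OK",     IDOK,        140, 80, 50, 14
    LTEXT           "&Name:", IDC_STATIC,  10,  12, 40, 8
    EDITTEXT        IDC_NAME,              55,  10, 135, 14
END
```

Reordering the control statements to match the visual layout fixes both the tab order and the order of elements in the UIA tree.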
For an introduction into UIA and the related SDK tools, take a look at this training video, Introduction to UIA: Microsoft's Accessibility API.
I'm sure there are a few options around accessibility that I've not mentioned in this series of posts. For example, how to write a full UIA Provider API implementation. In some situations, you may get involved with writing a full UIA implementation, but in general, you'll not want to have to. Rather you'll want to leverage all that the Windows platform can do on your behalf, and deliver a great accessible experience to all your customers with as little work on your part as possible. So this series concentrates on some of the more commonly used options for enhancing accessibility that I've seen used in practice.
Thanks for helping everyone benefit from all the great features of your apps!
Posts in this series:
Figure 1: The Inspect SDK tool showing that the "New Project" item in the Visual Studio toolbar supports being both programmatically invoked and expanded.