This post describes three improvements to the programmatic representation of the exploratory solitaire game described at The Sa11ytaire Experiment: Part 1 – Setting the Scene. The game was built specifically to explore how a solitaire game can be played using a variety of forms of input and output. Please let us know how the game could be enhanced further based on your experiences.
Apology up-front: When I uploaded this post to the blog site, the images did not get uploaded with the alt text that I'd set on them. So any images are followed by a title.
As we played more with the Sa11ytaire app after its initial release, and got feedback from others, we learned how we could provide a more intuitive and efficient experience for players using a screen reader such as Narrator. Narrator interacts with the app solely through its programmatic representation as exposed through the UI Automation (UIA) API, and so we set to work on enhancing that programmatic representation of the app, as described in the Three Updates section below.
Actually, we didn't. All the updates were made by my co-developer Tim, so I'm just describing his work here. So a big thanks to Tim for the work he put in for these improvements to the app!
A couple of additional thoughts
We did get some very positive feedback relating to the access keys that we'd originally added to the app. For some players, pressing the Alt key followed by some other key can be a very efficient way to play the game. The image below shows the access keys being revealed visually in the app in response to a press and release of the Alt key. The interactable elements shown along the top of the app have access keys of N, U, C, D, H and S. The seven dealt card piles have access keys of 1 through 7.
Figure 1: Access keys being shown visually in the Sa11ytaire app.
The UWP XAML framework often makes it extremely quick 'n' easy to implement access keys. All the interactable elements along the top of the app are implemented using Buttons and ToggleButtons, and when the player presses an access key, the associated Button's Click handler, or ToggleButton's Checked handler, is called. So that's some really powerful functionality added to the app, simply by adding something like:
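(A sketch only; the element name, content, and handler name here are illustrative rather than the app's actual markup.)

```xaml
<!-- Illustrative example: an access key on one of the buttons along the
     top of the app. Pressing Alt, then N, raises the Button's Click event. -->
<Button x:Name="NextCardButton"
        Content="Next card"
        AccessKey="N"
        Click="NextCardButton_Click" />
```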
The access keys for the dealt card piles are a little more interesting. I expect different apps might need to react differently in response to the press of a ListView's access key. The Sa11ytaire app needs keyboard focus to move to the last item in the list, and for that item to be selected. In order to achieve this, all the dealt card ListViews have the following markup applied:
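(Again a sketch; the element name here is illustrative.)

```xaml
<!-- Illustrative example: a dealt card pile list with an access key of "2".
     A ListView has no default access key action, so the app handles
     AccessKeyInvoked itself to move focus and select the last item. -->
<ListView x:Name="CardPile2"
          AccessKey="2"
          AccessKeyInvoked="CardPile_AccessKeyInvoked">
    <!-- The card items are generated from the pile's item source. -->
</ListView>
```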
When that AccessKeyInvoked function is called in response to the press of the access key, it moves keyboard focus and selects the item, using the steps shown below.
private void CardPile_AccessKeyInvoked(UIElement sender, AccessKeyInvokedEventArgs args)
{
    ListView list = sender as ListView;
    if (list != null)
    {
        if (list.Items.Count > 0)
        {
            // Select the last item in the list. Keyboard focus moves to
            // the item along with the selection.
            list.SelectedIndex = list.Items.Count - 1;
        }
    }
}
To learn more about access keys in UWP apps, visit Access keys.
While overall this design works great, we did learn of one unexpected problem with the interaction model that relates to the discoverability of some of the access keys. An app might have all sorts of handy features which can lead to an efficient experience, but if the player isn't aware of the features, then the features might as well not exist. So the player must be made aware of all the access keys available in the Sa11ytaire app. While the game's Help documentation describes all available access keys, wherever possible, the player shouldn't have to refer to the Help documentation. Instead we want it to be intuitive to the player as to how to play the game efficiently.
So consider the player using the Narrator screen reader with the app. Say they tab to move keyboard focus to the button called "Next card". Narrator examines the UIA properties for the button, and learns that the UIA AccessKey property is "Alt, N". Narrator can then include that information in its announcement, and the player is made aware of this efficient means to invoke the "Next card" button. Going forward, the player doesn't need to tab to the button, instead they simply press Alt+N. Brilliant!
However, say the player later tabs to a card in a dealt card pile. The items in the dealt card piles don't have access keys, rather it's the ListViews that contain the items that have the access keys. So when keyboard focus moves to an item in the list, Narrator doesn't announce the access keys. As such, the player is not made aware of the really efficient means to move to, and select, an item in a list of dealt cards. At some later time the player may read through the app's Help documentation and learn of the access keys for the lists, but it really would be preferable for them to not have to do that.
The image below shows the Inspect SDK tool reporting the UIA representation of one of the dealt card ListViews.
Figure 2: The Inspect SDK tool reporting the UIA representation of one of the dealt card ListViews, with the ListView's UIA AccessKey property of the "Alt, 2" highlighted in the image.
We're still figuring out the most practical way for us to improve this experience, such that the player is automatically made aware of all the access keys that are available in the game. Narrator's Context Verbosity setting can be configured to have some information relating to an element's parent element announced when Narrator reaches the element. For example, when tabbing from the second to the third card pile list, Narrator's announcement can include "Exit Card pile 2 list, Enter Card pile 3 list". But the Context Verbosity feature can't trigger inclusion of the parent's AccessKey property when tabbing to an item in the list. And even if it could, we'd never base our actions on the Narrator experience alone. Instead, we focus on shipping the best programmatic representation of the game that we can through UIA, and have that representation be leveraged by all assistive technology apps in whatever way those apps feel is most helpful to their users.
Perhaps one practical approach here would be for us to manually set the AccessKey property of the last item in each list. The lists already all have LayoutUpdated handlers, which are used to set the height of the items in the list. That ensures that the last item in the list is a lot taller than the other items in the list. So as a test, I just updated an existing function in the app which is called beneath the LayoutUpdated handlers, such that it also sets the UIA AccessKey properties of the items in the list. It goes something like this:
private void SetHeightOfCards(ListView list)
{
    if (list != null)
    {
        for (int i = 0; i < list.Items.Count; ++i)
        {
            var card = list.Items[i] as PlayingCard;
            var item = (list.ContainerFromItem(card) as ListViewItem);

            // Is this the last item in the list?
            if (i == list.Items.Count - 1)
            {
                item.Height = 120;

                // Give the last item in this list the same access key as the list itself.
                string listAccessKey = list.AccessKey;
                AutomationProperties.SetAccessKey(item, "Alt, " + listAccessKey);
            }
            else
            {
                item.Height = 60;

                // This list item has no access key.
            }
        }
    }
}
With the above code, when Narrator reaches the last item in the list, it announces the access key used to move focus to the item and to select it. It really doesn't matter from the player's perspective that the UWP XAML AccessKey is actually implemented on the containing list rather than the list item. The following is an example of the announcement made by Narrator with this change, as I tab to a card in a dealt card list.
"Card pile 3, 10 of Hearts, 3 of 3, non-selected, Alt, 3"
I don't know if there's a way to achieve the same results through binding, but all in all the above approach was pretty straightforward and seemed quite useful. If it holds up after more testing, maybe we'll include it in the next update of the app.
When Microsoft recently hosted Benetech's 2018 DIAGRAM Code Sprint, I had an opportunity to demo the Sa11ytaire app to a couple of attendees and get their feedback. As I was demo'ing the switch device control of the app, I was asked whether the same input model would work with other card games. I'd not really thought about that before, but I expect it would work great with a number of other card games. From memory card games to poker, the first step would be to figure out the most efficient path for the player to reach the UI elements they're interested in.
On a side note, I'm particularly interested in the idea of an accessible slider puzzle app. Way back I spent lots of time playing with slider puzzles, and Sa11ytaire's switch device control would seem to be a good match for such puzzles. Particularly given that at any given time there are only a few moves that can be made in the puzzle.
I also got a question about how the Sa11ytaire app might be updated for players using multiple switches, and it was suggested that different switches could initiate the scan through the UI at different areas in the app. Perhaps I could take the opportunity to explore that idea with the new Xbox Adaptive Controller. Today the Sa11ytaire app's switch control reacts to a press of the Space key, which means the switch device must be configured to simulate that. But I expect it'd be straightforward to update the app to react to different types of input coming from a game controller. That'd certainly be a fun experiment.
And on a rather different note, I can't help but wonder if it would be possible to effectively add switch control support to an app, from outside the app. If the target app had a good UIA representation, then its UIA tree could be manually examined by a dev to determine the most efficient path for the switch scan highlight to move through the app's UI. A standalone UIA client app could access the bounding rects of these elements, and draw a highlighting rectangle above the app, in a similar way to how my Herbi HocusFocus app highlights where keyboard focus is. Once the switch scan highlight is above the element of interest in the target app, a press of the switch would trigger either a change in level of the scan, (for example, moving into a list to scan through the items,) or perform some action on an element through a UIA pattern, (for example, invoke a button or select a list item). I doubt I'll ever have time to try that out, but it would seem technically possible.
I am grateful for the feedback I got at the recent Benetech event, and if we do get requests from people considering using the app with a switch device, I'll look forward to learning how we can make the app as efficient to use as possible.
Figure 3: A switch device being used to play the Sa11ytaire game.
The following sections describe the UIA-related updates made to the Sa11ytaire app.
A quality app provides an intuitive means for the customer to navigate around the app's UI in an efficient manner. And "intuitive" here means that if the player is familiar with standard keyboard shortcuts that work in many other apps, then those same shortcuts will also work when they try them in the Sa11ytaire app.
One common keyboard shortcut for moving keyboard focus between areas in an app, is F6. Try pressing F6 in products like PowerPoint, Outlook, and Windows Explorer, and consider how keyboard focus is moving around the UI. F6 provides a quick way to move keyboard focus into some area of interest in the app, after which the Tab key might be pressed a few times to reach a control contained inside that area. And of course, for commonly accessed controls and functionality, we'd consider adding access keys or accelerator keys to make the steps to complete a task even more efficient.
So when we first built Sa11ytaire, we did add support for a press of F6 to move between the three main areas in the app. With a press of F6, keyboard focus would move between the Next card button, the target pile for clubs, and the last item in the first dealt card pile.
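A sketch of how such F6 handling might look in a UWP XAML page. The element names, and the assumption that the page handles the key in an OnKeyDown override, are mine rather than the app's actual implementation:

```csharp
// Hypothetical sketch: cycle keyboard focus between the app's three main
// areas when F6 is pressed. NextCardButton, TargetPileClubs and CardPile1
// are illustrative element names.
protected override void OnKeyDown(KeyRoutedEventArgs e)
{
    if (e.Key == Windows.System.VirtualKey.F6)
    {
        if (this.NextCardButton.FocusState != FocusState.Unfocused)
        {
            this.TargetPileClubs.Focus(FocusState.Keyboard);
        }
        else if (this.TargetPileClubs.FocusState != FocusState.Unfocused)
        {
            // Moving focus to the list leads to the last item in the
            // first dealt card pile getting keyboard focus.
            this.CardPile1.Focus(FocusState.Keyboard);
        }
        else
        {
            this.NextCardButton.Focus(FocusState.Keyboard);
        }

        e.Handled = true;
    }

    base.OnKeyDown(e);
}
```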
While F6 can help to provide a more efficient experience for all players using the keyboard, there is also a very well-known means of efficient navigation specific to using a screen reader like Narrator. And that is to navigate between "landmarks" in an app's UI. The concept of landmarks originally sprang from the web, and some related details for web UI are available at Using ARIA landmarks to identify regions of a page. But these days it's also quick 'n' easy for a dev to add landmarks to a UWP XAML app.
For example, let's consider the Microsoft Store app. That app presents UI for navigating around various pages in the app, and that UI is always available near the top of the app, regardless of where the customer happens to be in the app. I just pointed the Inspect SDK tool at the UI, such that I can examine the app's UIA representation, and I find that a container for that navigation-related UI has a UIA LandmarkType property of UIA_NavigationLandmarkTypeId. This lets any UIA client app like Narrator know that the container is a landmark, and its purpose relates to navigation through the UI. There's an additional LocalizedLandmarkType property which provides a localized string to the UIA client app, so that the landmark's purpose can be presented to the customer in a helpful human-readable way.
Figure 4: The Inspect SDK tool reporting the UIA properties of a landmark in the Microsoft Store app, with the landmark-related properties highlighted.
If I go on to explore the Store app's UI further, I find another landmark in the UI, and its LandmarkType is UIA_MainLandmarkTypeId. That signifies where the primary content is on the page. (Exactly what the "primary content" is, is up to the app dev to decide.) Yet another landmark in the Store app's UI has a LandmarkType of UIA_SearchLandmarkTypeId, and it's associated with the UI for initiating a search in the Store.
Once an app exposes a set of useful landmarks in its UI, a customer using a screen reader such as Narrator can issue commands to quickly move between the landmarks. In Narrator's case I can press CapsLock+Space to turn on its Scan mode, and then press D or Shift+D to move to the next or the previous landmark. For example, if I press Shift+D and move backwards in the Store app to the navigation area, Narrator's announcement lets me know that I've moved to the "Home" item, and that it's contained within the navigation landmark. The announcement is as follows:
"Home selected, selection contains 8 items, navigation landmark"
The UWP XAML framework makes it really straightforward for an app like the Microsoft Store app to support landmarks. For example, to make some container element in the UI the Main landmark, add the following markup:
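(The Grid element below is illustrative rather than the Store app's actual markup; any container element can carry the attached property.)

```xaml
<!-- Make this container a Main landmark, signifying the primary content. -->
<Grid AutomationProperties.LandmarkType="Main">
    <!-- Primary content goes here. -->
</Grid>
```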
So how might landmarks be leveraged by the Sa11ytaire app? The app has three areas which would seem good candidates for landmarks, enabling the player to move quickly through them using a familiar navigation technique. Those areas are the remaining cards, the target card piles, and the dealt card piles. But none of those seem a good match for a "navigation" landmark or a "main" landmark. In fact, they seem quite specific to the app. As such, we gave the containers associated with those areas a LandmarkType of UIA_CustomLandmarkTypeId. This indicates that we want the elements to be accessible through screen readers' landmark-navigation functionality, but the type of landmark is specific to the app.
While that alone provides some useful navigation functionality, it doesn't provide the best experience for the player, in that the player isn't immediately made aware of the purpose of the landmark. By default, UIA will provide a LocalizedLandmarkType of (the localized form of) "custom". That might not seem particularly descriptive, but what else can UIA do? UIA doesn't know the meaning of the UI like we do. We, on the other hand, can set some far more helpful text through the UIA LocalizedLandmarkType property.
Setting a LocalizedLandmarkType property is as quick to do in a UWP XAML app as setting any other localized string on an element. First we make sure the element has an x:Uid, and then we add an associated AutomationProperties.LocalizedLandmarkType entry in the localized string resource file. By doing this, we provided LocalizedLandmarkType properties of "Remaining card piles", "Target Card Piles", and "Dealt card piles".
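For example, the markup might look something like the following; the names here are illustrative, not the app's actual markup:

```xaml
<!-- Illustrative markup: the container for the dealt card piles, with an
     x:Uid so that localized strings can be associated with it, and a
     LandmarkType of Custom. -->
<StackPanel x:Uid="DealtCardPilesPanel"
            AutomationProperties.LandmarkType="Custom">
    <!-- The dealt card pile ListViews go here. -->
</StackPanel>
```

The matching entry in the string resource file then has a name along the lines of "DealtCardPilesPanel.[using:Windows.UI.Xaml.Automation]AutomationProperties.LocalizedLandmarkType", (attached properties in a .resw file need the namespace qualifier,) with a value of "Dealt card piles".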
The following is the announcement made by Narrator when it reaches the first area in the app following landmark navigation.
"group, Remaining card piles landmark"
Figure 5: The Inspect tool reporting the UIA properties of the group element associated with the first area in the app. The landmark-related properties are highlighted.
Considering the UIA representation of the landmark elements further, it would seem appropriate for us to update their UIA representation to also give them UIA Name properties. The landmark-related elements in the app are all XAML StackPanels, which got exposed through UIA as elements with ControlTypes of Group. But we didn't set an AutomationProperties.Name on those StackPanels. During implementation of the landmarks, it seemed sufficient for the elements to have appropriate ControlTypes, LandmarkTypes and LocalizedLandmarkTypes, but really, if an element's important to the player, it should have a helpful UIA Name too. We'll consider this and determine what the most helpful Names would be. Simply replicating the LocalizedLandmarkType in the Name might not lead to the best experience for the player.
And on a similar note, Figure 4 above shows Inspect reporting that the navigation landmark-related list in the Store app doesn't have an accessible name. Generally speaking, all lists should have UIA Names, so that anyone moving into a list and reaching some item can be made aware of exactly which list they're in.
One more note: it seems that the player experience is not quite as we expected when Narrator interacts with one of the landmarks in the app. Typically when the player has used landmark navigation to reach a landmark, they can press the Tab key to move keyboard focus to the first focusable element contained in the landmark. While that works fine for the first two landmarks in the app, it doesn't work at the third landmark containing the dealt card lists. Instead, a press of the Tab key after navigating to that landmark actually moves keyboard focus beyond the lists. It doesn't look like we can do anything about this, and like I said earlier, we'd never change the app's UIA representation solely to account for how Narrator behaves today. So this means once a player using Narrator has navigated to the dealt card list area via landmark navigation, they'd use Narrator's item navigation functionality to move into the first dealt card list.
Recent versions of Windows 10 support UIA's UiaRaiseNotificationEvent() function. This enables an app to raise a UIA Notification event, and by doing so, request that a screen reader announce any arbitrary string that the app feels is helpful to your customer. I discussed some of the ways that different types of UI can raise this event, (that is, Win32, WinForms, WPF and UWP XAML,) at Can your desktop app leverage the new UIA Notification event in order to have Narrator say exactly what your customers need? And for UWP XAML apps, it's really straightforward to raise the event, by calling AutomationPeer.RaiseNotificationEvent().
Warning: Just because it's easy to raise the Notification event, doesn't mean that apps should raise it. An app needs to strike a balance between making sure its customers are made aware of everything important that's going on in the app's UI, while not flooding the customers with unwanted announcements which are at best a distraction and at worst block the customers from completing their tasks. So only raise the Notification event if that will really help the customer experience in practice.
When Sa11ytaire first shipped, we found that the notification-related experience for the player wasn't always as we'd hoped. We experimented with the various AutomationNotificationProcessing values that we could pass into AutomationPeer.RaiseNotificationEvent(), and found that announcements could be interrupted in a way that didn't seem to match what we'd requested. But in the Windows 10 April 2018 Update, we're finding the experience more as we'd expect, and so we revisited how we call the RaiseNotificationEvent() function.
With the first release of Sa11ytaire, we'd always pass in ImportantAll to RaiseNotificationEvent(). For the latest release of the app, we've changed some of the calls to use ImportantMostRecent instead. For example, consider when a card is moved from a dealt card pile to a target card pile. When that happens, keyboard focus first moves to the card element in the list, and then moves to the card element in the target card pile. By default Narrator will follow keyboard focus as this happens, and make an announcement such as:
"Ace of Diamonds, 5 of 5, selected, Alt, 5"
"off, Ace of Diamonds, button, Alt, D"
However, we want the app to also make an announcement relating to the card that's been revealed in the dealt card pile following the card being moved. Keyboard focus is not at that revealed card, and so Narrator won't announce it by default. So we raise a Notification event to have the revealed card announced. By raising the event with an AutomationNotificationProcessing value of ImportantAll, then the announcement will be made in its entirety regardless of whether some other event happens to be raised by the app around the same time. The announcement will not be interrupted by another announcement. This means that in addition to the example announcements shown above, the player will also be made aware of (say):
"Revealed 5 of Diamonds in dealt card pile 5."
Importantly this means that if the FocusChanged event that's being raised by the app in response to keyboard focus moving, is received by Narrator after Narrator receives the Notification event, then the announcement relating to the focus changing won't interrupt the announcement relating to the card being revealed. Before the introduction of the UIA Notification event, this Narrator experience couldn't always be reliably delivered by a UWP XAML app.
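A sketch of the sort of call involved here; the element name and the activity id string are illustrative assumptions, not the app's actual code:

```csharp
// Hypothetical sketch: announce the newly revealed card in a dealt card
// pile. ImportantAll requests that the announcement run to completion,
// without being interrupted by other announcements raised around the
// same time.
var peer = FrameworkElementAutomationPeer.FromElement(revealedCardElement);
if (peer != null)
{
    peer.RaiseNotificationEvent(
        AutomationNotificationKind.ActionCompleted,
        AutomationNotificationProcessing.ImportantAll,
        "Revealed 5 of Diamonds in dealt card pile 5.",
        "CardRevealed"); // An app-defined activity id.
}
```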
We also use Notification events in the implementation of the feature where the player can request information about the three main areas in the app. At any time, the player can press F2, F3, or F4, to learn about the state of the remaining cards, the target card piles, and the dealt card piles respectively. For example, in response to a press of F4, Narrator might announce the following.
"Pile 1: Jack of Spades to King of Spades, Pile 2: Jack of Hearts to Queen of Clubs, 1 card face-down, Pile 3: 8 of Spades, 1 card face-down, Pile 4: 4 of Diamonds to 5 of Clubs, 3 cards face-down, Pile 5: 5 of Diamonds, 3 cards face-down, Pile 6: Jack of Clubs, Pile 7: 9 of Spades to 10 of Diamonds, 5 cards face-down"
It's quite possible that early in that announcement, the player will be made aware of some information which they want to act on immediately. For example, the player might feel there's potential for a card to be moved to a target card pile. As such, part way through the above announcement, they may press F3 to learn of the state of the target card piles. In that situation, the announcement relating to the target card piles should occur immediately, and so interrupt the in-progress announcement relating to the dealt card piles. It would be really irritating for the player if they had to wait for the dealt card pile announcement to complete before learning of the state of the target card piles. As such, in the new release of Sa11ytaire, when we call AutomationPeer.RaiseNotificationEvent() in response to a press of F2, F3 or F4, we pass in an AutomationNotificationProcessing of ImportantMostRecent.
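The change for the F-key announcements might look something like the following sketch; again, the names here are illustrative assumptions:

```csharp
// Hypothetical sketch: announce an area summary in response to a press of
// F2, F3 or F4. ImportantMostRecent requests that the most recently raised
// of these notifications wins, interrupting an earlier announcement that's
// still being spoken.
private void AnnounceAreaSummary(UIElement areaElement, string summary)
{
    var peer = FrameworkElementAutomationPeer.FromElement(areaElement);
    if (peer != null)
    {
        peer.RaiseNotificationEvent(
            AutomationNotificationKind.ActionCompleted,
            AutomationNotificationProcessing.ImportantMostRecent,
            summary,
            "AreaSummary"); // An app-defined activity id.
    }
}
```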
This new experience seems to be holding up fairly well with the new release of Sa11ytaire on the Windows 10 April 2018 Update, and we're looking forward to learning what more we can do with the Notification event in the future to further enhance the player experience. See AutomationNotificationProcessing for the full list of values that can be passed in to the call to RaiseNotificationEvent().
The third change to the app is not so much adding helpful new functionality, but rather it's fixing something that wasn't quite right in the first release of the app.
While all the interactable controls in the Sa11ytaire app are based on standard controls, such as Buttons and ToggleButtons, they have been customized to some extent. When I first did this, I made sure the UIA Name property reflected the suit and value of the card shown visually. For example, "Jack of Hearts". By doing this, Narrator would announce that suit and value when it encountered the card, and a player using Windows Speech Recognition could say "Jack of Hearts" to interact with the card.
But at the time, I also customized the LocalizedControlType to provide some more descriptive information about the card. For example, I'd given the top three cards in the upturned card piles LocalizedControlTypes of "Top upturned card", "Second from top upturned card", and "Third from top upturned card". But Tim later pointed out that really, those strings aren't conveying control types. The strings gave no indication to the player as to how they could interact with the card, and an element's UIA LocalizedControlType property should wherever possible set the player's expectations around the functionality available on the control and how they can interact with it.
So the Sa11ytaire app was changed to have the default ControlTypes and LocalizedControlTypes exposed on the various elements in the UI. The screenshot below shows the Inspect SDK tool reporting that the top upturned card is now exposed as a Button which supports the UIA Toggle pattern. As such, the player is informed that they can interact with the element in the same way as any other toggleable button in a UWP XAML app.
It's still important to convey the information that was previously exposed through the LocalizedControlType, so that's now been moved to the UIA HelpText for the card elements. With this change, the following announcements are made as Narrator's scan mode is used to navigate from the remaining cards UI through the subsequent UI in the app.
"Next card, button, Alt, N, Single pile of cards, face down, Invoke this to move the top three cards from this pile, to the pile of upturned cards"
"off, Queen of Hearts, button, disabled, Third from top upturned card"
"off, 3 of Hearts, button, disabled, Second from top upturned card"
"off, 6 of Hearts, button, Alt, U, Top most upturned card"
This would seem to lead to a more intuitive experience for the player, and it's been a reminder for me of how I need to be really careful when I override the default UIA experience for the game's UI. Sometimes the default UIA properties are actually what the player needs.
Figure 6: The Inspect SDK tool reporting the UIA properties of an upturned card element, with properties relating to Name, ControlType, HelpText, and the Toggle pattern highlighted.
With each new release of Windows, the support for accessibility in UWP XAML apps continues to grow. I'm really pleased that Tim and I have been able to take advantage of some recent enhancements such as use of landmarks and more well-behaved notifications. This post certainly doesn't describe all the recent platform enhancements, so you may find that your apps can leverage exciting new accessibility-related functionality that the Sa11ytaire app doesn't. For example, a UWP XAML app can now define heading levels for text presented in its UI. Use of headings could dramatically improve the experience for customers using screen readers in some types of apps.
Please keep the feedback on your experiences with the Sa11ytaire app coming. Whether the feedback relates to using a screen reader, a switch device, eye control, or some other form of input, we always value your thoughts on how we can make the app more intuitive and efficient to use.
All the best,