This post describes one example of how the Windows UI Automation API helped when building a simple tool for people with low vision.
In my recent post, Can UIA help you build a tool for someone you know?, I said that the Windows UI Automation (UIA) API can help you build a solution for people who have specific challenges using a computer. Since uploading that post, I've been contacted by someone in the Netherlands who works with people with low vision, asking whether I could build an app which highlights where keyboard focus is. He was after a free, standalone app, and felt that nothing available to him exactly matched what he was looking for.
This definitely seemed worth investigating.
So I grabbed a couple of hours early the next morning to build a V1.0 of the app, and made it available to the person who contacted me to get his feedback. Within a week, I’d made some updates based on that feedback, and completed a V1.2. The app’s now at a stage where more people can try it out, and over the coming months I can keep tweaking it to make it as useful as I can. The app has some known constraints which would take me a while to address, so I’ll hold off updating the app until I know exactly what’s most important to the people using it.
Overall it’s been a fun week, building a new tool which has potential to help a lot of people. And UIA has made this possible.
Figure 1: The Herbi HocusFocus app highlighting the element with keyboard focus.
Building the new app
The new app is a C# WinForms app. (It's not a Universal Windows Platform app, because UWP apps can't use the UIA client API.) I chose WinForms because it's really quick for me to build UI with, so it didn't take much time to create the app's UI, and Visual Studio made it easy to build Dutch and German versions of the app. And of course, I mustn't forget to add localized accessible names to the comboboxes shown in the app.
Figure 2: Adding an accessible name to a combobox shown in the new app.
The highlight presented by the app is simply another form, whose position and size is set based on the bounding rectangle of the element with keyboard focus. The form has a transparency key color set to make the insides of the form transparent. The customer only sees a rectangle at the form’s border, and they can customize the rectangle’s color, thickness, margin and style to whatever works best for them.
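A minimal sketch of that transparent highlight form might look like the following. The class name, colors and thickness here are illustrative assumptions, not the app's actual code; the key points are the TransparencyKey making the form's interior see-through, and the form never taking activation away from the app that has keyboard focus.

```csharp
using System.Drawing;
using System.Windows.Forms;

// A borderless, always-on-top form whose interior is transparent,
// leaving only a painted rectangle visible at its edge.
public class HighlightForm : Form
{
    private const int BorderThickness = 4;

    public HighlightForm()
    {
        this.FormBorderStyle = FormBorderStyle.None;
        this.ShowInTaskbar = false;
        this.TopMost = true;

        // Everything painted in this color becomes transparent, so only
        // the border rectangle painted in OnPaint() remains visible.
        this.BackColor = Color.Magenta;
        this.TransparencyKey = Color.Magenta;
    }

    // Never steal activation from the app that has keyboard focus.
    protected override bool ShowWithoutActivation
    {
        get { return true; }
    }

    protected override void OnPaint(PaintEventArgs e)
    {
        base.OnPaint(e);

        // Paint the highlight rectangle just inside the form's edge.
        using (var pen = new Pen(Color.Red, BorderThickness))
        {
            e.Graphics.DrawRectangle(pen,
                BorderThickness / 2, BorderThickness / 2,
                this.ClientSize.Width - BorderThickness,
                this.ClientSize.Height - BorderThickness);
        }
    }

    // Reposition the form over the focused element's bounding rectangle,
    // given in screen coordinates.
    public void MoveTo(Rectangle bounds)
    {
        this.Bounds = bounds;
        this.Invalidate();
    }
}
```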
I also added functionality to optionally have the accessible name of the element that gets keyboard focus spoken. This provides a very helpful feature with a tiny amount of code. In fact I’ve added the following code to a bunch of apps over the years:
using System.Speech.Synthesis;

private SpeechSynthesizer _synth;
_synth = new SpeechSynthesizer();
_synth.SpeakAsync(name);
The UIA stuff
Now we get to the stuff of most interest to this blog, and that’s how to use UIA to know what’s happening to keyboard focus as the customer moves focus around the screen.
As I usually do, I’m using a managed wrapper around the native Windows UIA API, and the wrapper is generated using the tlbimp.exe tool. Some notes on generating the wrapper are at So how will you help people work with text? Part 2: The UIA Client.
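For reference, generating the wrapper is a single command along these lines, run from a Visual Studio developer command prompt. The exact paths and output filename here are assumptions; the post linked above has the full details.

```shell
rem Generate a managed wrapper around the native UIA API from the type
rem library embedded in UIAutomationCore.dll. (Paths and the output
rem assembly name are illustrative.)
tlbimp.exe %windir%\system32\UIAutomationCore.dll /out:Interop.UIAutomationClient.dll
```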
I created the following class to perform all interaction with UIA. Hopefully the comments adequately describe what it does.
// This namespace is available through the managed wrapper generated by the
// tlbimp.exe tool.
using UIAutomationClient;

using System.Diagnostics;
using System.Drawing;

// Add support for IUIAutomationFocusChangedEventHandler in order to get
// notifications when keyboard focus changes.
internal class FocusHandler : IUIAutomationFocusChangedEventHandler
{
    private IUIAutomation _automation;

    // Define a few properties using the values in UIAutomationClient.h.
    // (The managed wrapper generated by tlbimp.exe doesn't expose these
    // in an easy-to-use way by default.)
    private int _propertyIdBoundingRectangle = 30001;
    private int _propertyIdName = 30005;

    private bool _fAddedEventHandler = false;

    private HerbiHocusFocusForm _mainForm;

    // Use a delegate so that we can have the app take its highlighting
    // action on the UI thread, regardless of what thread the FocusChanged
    // event handler is called on below.
    private HerbiHocusFocusForm.EventHandlerDelegate _eventHandlerDelegate;

    public FocusHandler(HerbiHocusFocusForm mainForm,
        HerbiHocusFocusForm.EventHandlerDelegate eventHandlerDelegate)
    {
        this._mainForm = mainForm;
        this._eventHandlerDelegate = eventHandlerDelegate;
    }

    public void Initialize()
    {
        this._automation = new CUIAutomation();
        RegisterFocusChangedListener();
    }

    public void Uninitialize()
    {
        UnregisterFocusChangedListener();
    }

    private void RegisterFocusChangedListener()
    {
        // Use a cache here, so that the name and bounding rectangle of the element
        // with focus are cached with the FocusChanged event. This means that when
        // the app gets notified of the focus change, it doesn't then have to go back
        // to the element that raised the event to get its name and bounding rect.
        IUIAutomationCacheRequest cacheRequest = _automation.CreateCacheRequest();
        cacheRequest.AddProperty(_propertyIdBoundingRectangle);
        cacheRequest.AddProperty(_propertyIdName);

        // The above properties are all we'll need, so we have no need for a
        // reference to the source element when we receive the event.
        cacheRequest.AutomationElementMode =
            AutomationElementMode.AutomationElementMode_None;

        // Now register for the FocusChanged events.
        _automation.AddFocusChangedEventHandler(cacheRequest, this);
        _fAddedEventHandler = true;
    }

    private void UnregisterFocusChangedListener()
    {
        _automation.RemoveFocusChangedEventHandler(this);
        _fAddedEventHandler = false;
    }

    public void HandleFocusChangedEvent(IUIAutomationElement sender)
    {
        // UIA is notifying us that keyboard focus has moved. If we're in the middle
        // of closing down, don't do anything here.
        if (!_fAddedEventHandler)
        {
            return;
        }

        // Get the cached bounding rect so we know where we should highlight.
        tagRECT rect = sender.CachedBoundingRectangle;
        Rectangle rectBounds = new Rectangle(
            rect.left, rect.top,
            rect.right - rect.left, rect.bottom - rect.top);

        // Get the cached name so that it can be spoken if that's what the customer wants.
        string name = sender.CachedName;
        Debug.WriteLine("Focus now on: " + name);

        // Now have the UI thread highlight the focused UI. This will return immediately.
        _mainForm.BeginInvoke(_eventHandlerDelegate, name, rectBounds);
    }
}
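For completeness, here's a sketch of the form-side pieces that the FocusHandler expects: the delegate type passed into its constructor, and the handler which ends up running on the UI thread thanks to BeginInvoke(). The member names and bodies below are illustrative assumptions rather than the app's actual code.

```csharp
using System;
using System.Drawing;
using System.Windows.Forms;

public partial class HerbiHocusFocusForm : Form
{
    // The delegate type that the FocusHandler's constructor takes.
    public delegate void EventHandlerDelegate(string name, Rectangle rectBounds);

    private FocusHandler _focusHandler;

    private void HerbiHocusFocusForm_Load(object sender, EventArgs e)
    {
        _focusHandler = new FocusHandler(this, this.OnFocusChanged);
        _focusHandler.Initialize();
    }

    private void HerbiHocusFocusForm_FormClosing(object sender, FormClosingEventArgs e)
    {
        _focusHandler.Uninitialize();
    }

    // Always runs on the UI thread, because the FocusHandler invokes it
    // through BeginInvoke() rather than calling it directly.
    private void OnFocusChanged(string name, Rectangle rectBounds)
    {
        // Move and show the transparent highlight form over the focused
        // element's bounding rect here, and speak the element's name if
        // the customer has enabled that.
    }
}
```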
And so, thanks to those few lines of UIA client code, it's now much more practical for my customers to know where keyboard focus is across a wide range of apps.
Figure 3: A menu item in Edge being highlighted by the new app.
Thanks to the power of .NET and UIA, in a few hours over the course of one week, I built the first version of a tool which someone had specifically requested. Based on customer feedback I can update the app to make it more useful in practice. The person who contacted me had said that he'd unsuccessfully searched for such a tool for several years, and so this is a reminder of the difference we can all make in plugging gaps in what's available to people. And sometimes it takes very little time for us to do that.
I’d usually say here that I’ll upload the app’s source code somewhere public, but I keep saying that, and then don’t. So I’ll hold off on saying it here. If I ever make the code public, I’ll update this post.
It goes without saying that I’m very grateful to the person who contacted me, querying whether it’d be practical for me to build this tool. And if you know of someone who might find the tool useful, or might find it useful yourself, it’s freely available at Herbi HocusFocus.
Good luck with your own assistive technology projects!