When the topic of accessibility comes up, most people think it’s all about people with disabilities. While it’s true that people with disabilities were the original target audience for accessibility, more recent developments have highlighted the value of accessible technologies to everyone.
For one thing, making your program accessible means that test automation can manipulate your program in a uniform manner. It can enumerate the buttons, retrieve their text, press a button programmatically, and so on. Even if you don’t use a standard button control but instead opt to go windowless, as long as you expose your controls via accessibility, test automation can find them.
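On Windows this enumeration happens through MSAA or UI Automation; here is a platform-neutral sketch of the idea, using a hypothetical `AccessibleElement` class (not the real API) to show how a test harness walks the accessibility tree to find and press a button by its accessible name:

```python
from dataclasses import dataclass, field
from typing import Callable, List, Optional

# Hypothetical stand-in for an accessibility node. On Windows the real
# object would be an IAccessible / UI Automation element.
@dataclass
class AccessibleElement:
    role: str                                      # e.g. "window", "button"
    name: str                                      # accessible name the app exposes
    do_default_action: Callable[[], None] = lambda: None
    children: List["AccessibleElement"] = field(default_factory=list)

def find_by_name(root: AccessibleElement, role: str,
                 name: str) -> Optional[AccessibleElement]:
    """Depth-first walk of the accessibility tree, as test automation would do."""
    if root.role == role and root.name == name:
        return root
    for child in root.children:
        hit = find_by_name(child, role, name)
        if hit is not None:
            return hit
    return None

# Even a windowless "OK" button is discoverable, as long as the app exposes it.
clicked = []
ok_button = AccessibleElement("button", "OK", lambda: clicked.append("OK"))
root = AccessibleElement("window", "My App", children=[ok_button])

button = find_by_name(root, "button", "OK")
button.do_default_action()        # press the button programmatically
```

The point is that the harness never cares whether “OK” is a real HWND or a windowless control painted by the app; it only sees the tree the app exposes.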
Also, we saw how programs can use accessibility to retrieve text from the screen. I use this in my English/Chinese dictionary program to “pluck” text off the screen and paste it into the “translate this” box, where I can arrow through the sentence and have the program translate each word or phrase on the fly. (Rats, now you know where I’m going with the program.) And once I started using this feature, I discovered to my dismay how many of the programs that I use on a regular or semi-regular basis fail to support even this simple task. Microsoft products have a much higher success rate, but even there the support is not 100%.
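The “pluck text off the screen” trick amounts to hit-testing the accessibility tree against a screen point and reading back the text of the deepest element that contains it. A minimal sketch, again using a hypothetical `AccessibleElement` node rather than the real Windows API (which would be `AccessibleObjectFromPoint` plus the element’s text properties):

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

Rect = Tuple[int, int, int, int]   # left, top, right, bottom in screen pixels

# Hypothetical accessibility node carrying a bounding rectangle and its text.
@dataclass
class AccessibleElement:
    role: str
    bounds: Rect
    text: str = ""
    children: List["AccessibleElement"] = field(default_factory=list)

def contains(rect: Rect, x: int, y: int) -> bool:
    left, top, right, bottom = rect
    return left <= x < right and top <= y < bottom

def text_from_point(root: AccessibleElement, x: int, y: int) -> Optional[str]:
    """Return the text of the deepest element under screen point (x, y)."""
    if not contains(root.bounds, x, y):
        return None
    for child in root.children:
        hit = text_from_point(child, x, y)
        if hit is not None:
            return hit
    return root.text or None

# A document with two runs of text; the cursor sits over the first one.
doc = AccessibleElement("document", (0, 0, 800, 600), children=[
    AccessibleElement("text", (10, 10, 400, 30), "你好，世界"),
    AccessibleElement("text", (10, 40, 400, 60), "Hello, world"),
])

print(text_from_point(doc, 50, 20))   # prints the run under the cursor: 你好，世界
```

A program that exposes no accessible text simply returns nothing here, which is exactly the failure mode I keep running into.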
Finally, check out this screencast showing off Vista’s speech recognition system. The “Say what you see” feature which Chris discusses at time code 8:52 needs the name of every element on the screen so it can match what you say against those names. If your program doesn’t expose these names, the “Say what you see” feature won’t know what the user needs to say to click on your button, and users will say, “Harumph, why doesn’t this program work with voice recognition? All my other programs do.”
My secret hope is that “Say what you see” will finally be enough to prod people into taking accessibility seriously. Because it’s not just for people with disabilities.