Another round of accessibility testing

    Last fall I wrote about one of our accessibility testing tasks. We just finished another round, and since I saw this request on our discussion groups, I thought I should follow up.

    Last fall, we took a programmatic view of accessibility testing. We checked the properties of all the elements we have in OneNote to ensure screen readers and similar software can (literally) tell users what each control is and what it does. This round was the next step – use the built-in voice recognition software to manipulate OneNote. If you are using Vista and have a microphone handy, you can try this yourself. Here’s a bare-bones quickstart:

    1. Make sure your microphone is properly selected:
      1. Start | Control Panel | Hardware and Sound
      2. Audio Devices and Themes (on my machine, this was called "Manage Audio Devices")
      3. Make sure your microphone is set as the default
        1. If not, right-click | Set as Default


    2. Run Speech Recognition
      1. Start | All Programs | Accessories | Ease of Access | Speech Recognition (you may need to search for this in the search pane)
      2. Follow the prompts, which will take you through a brief voice recognition exercise
        1. (John’s note: this takes a few minutes but is well worth it)
    3. Test your features for accessibility
      1. Make sure OneNote is active when beginning
      2. Say "start listening" to enable speech
      3. Say "stop listening" to disable speech
      4. To drop a menu, say the menu name, e.g. "file" drops the File menu
        1. Say the menu item name to access the desired menu item
      5. Say "show numbers" to have Vista display numbers for all accessible UI elements
        1. Say the number of the UI item you want to access
        2. Say "OK" to select the item
      6. Say "What can I say" to see a list of commands
    4. When you start testing, disconnect the mouse and keyboard.
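    The command set in step 3 is easy to lose track of mid-session, so it can help to write it down as a simple checklist before you unplug the keyboard. Here is a minimal sketch of such a checklist in Python – this is purely illustrative on my part, not any actual Vista speech API; all names and structure here are my own:

    ```python
    # A rough sketch of the voice commands exercised in step 3, expressed as a
    # checklist a tester could print out and walk through. Hypothetical names;
    # not the Vista speech recognition API.

    VOICE_COMMANDS = {
        "start listening": "speech recognition is enabled",
        "stop listening": "speech recognition is disabled",
        "show numbers": "numbers appear over all accessible UI elements",
        "what can I say": "the list of available commands is displayed",
    }

    def checklist(commands):
        """Return (spoken command, expected behavior) pairs to verify by voice."""
        return [(spoken, behavior) for spoken, behavior in commands.items()]

    for spoken, behavior in checklist(VOICE_COMMANDS):
        print(f'Say "{spoken}" -> verify: {behavior}')
    ```

    The same idea extends naturally to the menu and "show numbers" flows: add one entry per feature you plan to drive by voice, and check each item off as you go.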

    That last step is a doozy – manipulating controls on a computer with no mouse or keyboard is a completely different experience. The toughest part for me was the "Scroll up" and "Scroll down" commands. For some reason, they did not click in my mind when I went through the training, and I struggled until I used the available "Speech Reference Card".

    The good news is that everything I set out to do (cut and paste between other apps and OneNote, manipulate my features in OneNote, share a notebook, etc.) was possible. We did find a few bugs across the team, which we are planning to fix, but I was very happy to see the results of this testing.

    It’s pretty hard to describe what you have to do with only audio input to a computer. Even if you don’t have a microphone, I suggest working through the Speech Tutorial to see how a user with no mouse or keyboard uses a computer.

    Questions, comments, concerns and criticisms always welcome,


Comments (5)

  1. Brian says:

    I’m hoping that the next round of testing will include turning off the monitor as well. If you can operate all features of the app with just a microphone and speakers, then I’ll be truly impressed.

  2. JohnGuin says:

    Stay tuned.

  3. My biggest problem with Vista Voice, and this is as much as anything probably a microphone problem, is that it was TOO sensitive.  I had to shut it down and stop using it because some bit of ambient noise in the background kept activating voice commands like "Paste" and "Back" and such when I didn’t want it to.

    Even upgrading to a better headset and trying to keep as quiet a work environment as possible just never quite worked for me – I was constantly having to shut down Vista Voice to keep it from reacting to other noises.

  4. As another part of our recent accessibility push , we tested OneNote without monitors. It’s hard to write