Testing and using OneNote without a monitor

As another part of our recent accessibility push, we tested OneNote without monitors. It's hard to write about this since I can't include screenshots. If you want, you can simulate it yourself by reaching over and turning off your monitor - just be sure to set up a microphone and get speech recognition working first.

This is a completely different way of using a computer. The way I adjusted (and it took a couple of days) was by slowing down. With a mouse and keyboard, I frequently interact with the computer by changing my view: resizing windows, moving items around for a better look, and so on. The keyboard and mouse let me do those operations quickly, and I had grown used to that very fast level of interaction. Having no monitor made many of those tasks irrelevant and forced me to rethink the way I use the computer.

As an example, one technique I adopted was running only one application at a time. I feared random pop-ups stealing focus from OneNote and causing my interaction to go awry. With a monitor, if this happens, it's annoying but easy to detect and fix. With no monitor, it's still possible to detect and correct, but not nearly as quick. Once I got past the hurdle of working without a monitor, I was able to run Outlook and OneNote simultaneously and complete all the tasks I had at hand. For my "big final test" I had OneNote, PowerPoint and Excel running and was copying and pasting between them. With a mouse, keyboard and monitor, I timed myself completing these operations in just a few minutes. With speech recognition and no monitor, it took about 25 minutes.

Another tip I figured out early was to keep pages as simple as possible. I tried to minimize the number of outlines on a page, since navigating through them was a slow process.

One other subtle difference in my workflow was keeping my door closed. I was using speakers instead of a headset to hear what the computer was saying, and I didn't want to disturb my neighbors. I'm sure that after a very short time they were sick of hearing me say "What can I say" over and over again.

I realize this blog article sounds a little vague and rambles a bit. It's very hard to write about what essentially amounts to a "conversation in the dark." I also spent a lot of time climbing the learning curve of the speech recognition component just so I could test OneNote properly. Without this background work, which was simply me sitting in front of a computer talking to it for hours, I never would have been productive or known what was possible in a configuration that initially seemed so limited.

Since I had a comment from Brian (linked above), I figured I would go ahead and mention that we do test with this configuration.

Questions, comments, concerns and criticisms always welcome,


Comments (3)

  1. DevlinB says:

    It is great to hear that teams at MS are really taking accessibility testing seriously, and actually dogfooding their accessibility scenarios.  Having worked with visually impaired users before, I know how hard it is to access technology resources with limited or no vision.  Pop-ups of any type (notifications or otherwise) can be horribly confusing.
