From Hack to Product, Microsoft Empowers People with Eye Control for Windows 10


Sometimes, a simple email can lead to a life-changing breakthrough.

In 2014, former NFL player Steve Gleason, who has a neuromuscular disease called amyotrophic lateral sclerosis (ALS), sent an email to Microsoft, challenging employees at the company’s first hackathon.

Steve’s ask of this diverse array of thinkers, doers and dreamers was to develop technology that could address some of the constraints he faces living with ALS, a disease that causes the death of the neurons controlling muscle movement, resulting in difficulty moving, speaking, swallowing and, eventually, breathing. For most people with ALS, the eyes are the only muscles not impacted by the disease.

Steve wanted to be able to play with his son, talk more easily with his wife and move his wheelchair by himself: goals that, for someone with ALS, seemed like an impossible dream.

“I realized pretty quickly after my diagnosis that technology would have to become an extension of myself. Until there is a medical cure for ALS, technology will be that cure,” Steve said.


The Microsoft Enable Research team meets to discuss the Eye Gaze Wheelchair.

Steve’s challenge was a natural fit for Microsoft, especially during the One Week Hackathon, which encourages employees to focus on passion projects that advance the company’s mission of empowering people around the planet. A team calling itself Ability Eye Gaze took on Steve’s ask, excited by the opportunity to create technology that could genuinely affect a person’s life, and quickly got to work.

The Ability Eye Gaze team focused on developing tools based on Steve’s requests, and after three days of hacking, one project stood out: the Eye Gaze Wheelchair. This unique solution allowed Steve to drive his wheelchair with only the movement of his eyes as he looked at controls on his Surface.

The judges reviewed over 3,000 projects from teams across the company, and Microsoft CEO Satya Nadella named the Eye Gaze Wheelchair the grand prize winner of the 2014 hackathon. The Eye Gaze Wheelchair generated so much enthusiasm among employees and the ALS community that a new Microsoft Research team was created to understand the potential of eye tracking technology. However, this was only the first chapter of an inspiring story.

The new research team started working closely with Steve’s nonprofit, Team Gleason, as well as Evergreen ALSA, an organization that supports people living with ALS and their loved ones through services and education. They also reached out to individuals in the local Seattle ALS community to understand their experiences and needs, and how technology could help them.

When the Windows team came across this technology, they immediately saw the potential for eye tracking to change people’s lives as well. Eric Badger, Principal Software Engineer Lead on Windows, and Harish Kulkarni, Principal Software Development Engineer in Microsoft Research, quickly started prototyping new eye tracking scenarios together. The results were promising, and they dedicated a team of engineers to bringing eye tracking support to Windows.

Last week, Satya returned to the One Week hackathon to share projects that had inspired him over the years and to announce that Windows 10 will include built-in eye tracking support and an experience called Eye Control, inspired by the winning hack in 2014.


Eye Control makes Windows 10 more accessible by empowering people with disabilities to operate an onscreen mouse, keyboard, and text-to-speech experience using only their eyes. The experience requires a compatible eye tracker, like the Tobii 4C, which unlocks access to the Windows operating system so people can accomplish the tasks they would previously have done with a physical mouse and keyboard.

Right now, Eye Control is in beta and people interested in early testing and providing feedback can sign up to be a Windows Insider.


A demo of Eye Control at the 2017 Microsoft One Week hackathon event.

The source of the motivation is simple: it’s our opportunity to embody the mission of the company, to ‘empower every person and organization to achieve more,’ combined with the power of One Week. Now in its fourth year, One Week brings together thousands of Microsoft employees from across the globe to generate new ideas that push the boundaries of technology and solve real-world challenges. During this year’s hackathon, we asked our employees, “Who will you empower?” That question, combined with a passionate belief in the transformational power of technology, is a magic combination.

For two Microsoft employees, the passion to empower people drove them to help make Eye Control a reality.

Jon Campbell is a Senior Research Software Development Engineer and was a member of the original Ability Eye Gaze hackathon team. “There was this great vibe at One Week, it was like summer camp for innovative ideas rather than work. It was exciting to be challenged to follow our passions and think creatively with so many talented people from across the company,” said Campbell. “While not all ideas are winners, the thing that makes the hackathon great is that other ideas we kicked around had real potential – like eye tracking technology to help people with ALS and how Windows can bring it to more people.”

Jake Cohen, Program Manager on the Windows Interaction Platform team, is one of the Program Managers responsible for designing and integrating Eye Control into Windows 10. “When I heard about the Ability Eye Gaze team and what they were creating, it was super exciting to think about the possibilities of what could be done next,” said Cohen. “And it really made a dramatic difference to me when we started meeting with people living with ALS. I began to understand all the challenges they live with every single day. We’ve done a lot of great work across our teams, and we have a lot more to do, but we’re at a point where it has the potential to really start changing lives.”

The Ability Eye Gaze hackathon team accepted Steve Gleason’s challenge, and as a result, Windows 10 will be more accessible for a broader set of users. Now Steve, as well as others with ALS and other disabilities, will be able to live fuller lives. For them, the seemingly impossible dream is now becoming more of a possibility.

“When I approached Microsoft three years ago, I asked them to help develop innovative technology programs that would allow people with disabilities to stay productive and purposeful,” said Steve Gleason. “They agreed without hesitation. At that time, I had no idea they would continue expanding access throughout all of Windows 10. Having Eye Control in Windows 10 continues to bridge the gap between widely used technology and people with disabilities. It’s simply liberating.”


Former NFL player Steve Gleason with his son.

If you have questions or feedback on how we can continue to improve our products and services, you can contact us through the Disability Answer Desk (now with ASL support) and the Accessibility User Voice Forum.

You can learn more about accessibility at Microsoft by visiting http://microsoft.com/accessibility.


Comments (5)
  1. Danchar4 says:

    Great work everyone! I remember seeing the prototypes at the 2014 hackathon. Very cool to see these developments make it into Windows!

  2. ClarkKeent says:

    I would love to have an alternative to the mouse. This could also help people with tennis elbow! Great idea!

  3. That is wonderful news!

    Let me also point you to our GazeTheWeb. GazeTheWeb goes beyond gaze-based typing and provides a modified browser for surfing the Web and interacting with Web pages. The upside is that, while mouse emulation is very tiring, eye gaze-adapted Web pages provide a much improved user experience.

    Check out a video here: http://gazetheweb.com

  4. cob999 says:

    It is positive to see these features coming to Windows 10. I know of people who use Eyetech hardware to control Windows computers, and it is a great enabler. The only thing I would suggest is to explore different input options rather than an on-screen keyboard, which is just a representation of the physical keyboard that people without hand movement cannot use. The Dasher onscreen keyboard is an amazing concept that offers a much faster way to enter text using eye movement. It would be great to have this integrated as an optional input method within Windows 10 as well.

  5. Jason_A_S says:

    I think this is great!

    Due to a minor disability, I have been using the Tobii 4C Eye Tracker as a mouse replacement for half a year. Based on my experience, I would like to share some challenges I have faced, and some recommendations that I think will dramatically increase the functionality of Eye Control if included.

    1. ACCURACY. For me, the eye tracker (when calibrated) has an accuracy of about the size of a half dollar. That is, when I look at a point on my screen, the cursor is usually accurate within about half an inch. This is pretty good, but can make clicking on small buttons very frustrating. I would strongly encourage you to use the eye tracker for large movements of the mouse, and to enable the head tracking for smaller, more precise movements of the mouse. Precision Gaze Mouse (http://precisiongazemouse.com/) is a good example of how to implement this.

    2. CURSOR TRACKING. Having the cursor constantly following your gaze can be very useful, but it can also be very distracting and interfere with activities like reading or watching videos. There should be a toggle that allows users to switch between having the cursor constantly track their gaze (like most trackers do) and having the user place the cursor where they are looking by hitting a trigger (as Tobii currently does).

    3. CLICKING: There should be a variety of ways to click where the mouse cursor is, which users can select from. Tobii currently doesn’t have this at all, but you can use AutoHotKey to have certain keys on your keyboard act as right/left clicks (something likely too complicated for most users). Options you may want to include are programming keys on the keyboard to trigger a click, using different facial actions to register a click (other tools register clicks with smiling, blinking, or staring for a certain period of time), or using verbal commands. (I set F2 to left click, F3 to right click, alt-F2 to hold down a left click, and alt-F4 to release a left click, for when I want to click and drag; a rough sketch of these mappings appears after this list.)

    4. INTEGRATION WITH SPEECH RECOGNITION: Speaking of verbal commands, you should make sure this is compatible with windows speech recognition. Currently Tobii’s software to place the mouse where you are looking is not compatible (e.g., if you set F3 to be the trigger for Tobii to place the mouse where you’re looking, saying “press F3” will not place the cursor). Thankfully, AHK shortcuts do work, so I can use speech recognition for clicking once the cursor is placed.
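
    For anyone who wants to try something similar, here is roughly what my AutoHotKey (v1) script looks like for the mappings I described in point 3. I’m reconstructing it from memory, so treat it as a sketch of the idea rather than a polished script:

        ; Map keyboard keys to mouse clicks, so the eyes only have to place the cursor.
        F2::Click               ; F2 = left click at the current cursor position
        F3::Click, Right        ; F3 = right click
        !F2::Click, Down        ; Alt+F2 = press and hold the left button (start a drag)
        !F4::Click, Up          ; Alt+F4 = release the left button (finish the drag)
                                ; (note: this overrides the usual Alt+F4 close-window shortcut while the script is running)

    Each hotkey just fires a mouse event wherever the eye tracker last placed the cursor, and in my experience these shortcuts also work when triggered through speech recognition, as I mention in point 4.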

    What you are doing can be life-changing for people with severe disabilities. But with the functionality I mention above (and slightly more accurate eye tracking), I think you’ll find that using an eye tracker as a mouse replacement is useful for a much broader audience. People who get some wrist pain after prolonged computer use have expressed interest in my setup, and some friends who are programmers are considering making the switch because it will let them keep their hands on the keyboard more and use the computer more efficiently.

Comments are closed.
