Side Project Turns Kinect into a Virtual Seeing-Eye Dog

Kinectacles, a Garage project from Microsoft India, aims to use Kinect's depth camera to help the visually impaired get around.

Add virtual "seeing-eye dog" to the ever-expanding list of creative ways people are using Kinect.

The Kinectacles team (Aditi Goswami, Rishabh Verma, and Atul Sharma) won the Thought Leadership award at March's Garage Science Fair.

Kinectacles, a side project from three Microsoft India employees, aims to help the visually impaired navigate unfamiliar terrain. It uses Kinect's depth camera to detect objects in a user's path and then gives audio prompts to guide the user around obstacles.

It works much the same way a bat uses sonar to fly around, explained Rishabh Verma, an Associate Consultant at Microsoft Global Delivery.

"Kinectacles is essentially spectacles for the blind," he said. "The software analyses depth data in real time and tells the user whether it's OK to move ahead or whether to stop. The goal is to help visually impaired people get around on their own."

Verma and two friends from the Hyderabad campus, Atul Sharma (Microsoft Global Delivery) and Aditi Goswami (Microsoft IT India), spent several months this spring building the software with the Kinect for Windows software development kit (SDK). The team is confident that it has established a viable proof of concept and has started to build a working prototype: a pair of high-tech spectacles with a Kinect printed circuit board (PCB) mounted on top.

The team has plenty of motivation to see its vision through, Verma said. Kinectacles won awards this spring at consecutive Garage science fairs and has received media coverage and interest from several nonprofits working with the visually impaired.

For Verma, it's vindication for an idea he first had six years ago. It all started just before he graduated from SGS Institute of Technology, when he was looking for a senior project that would set him apart from his classmates. Inspiration came in the form of the film "Black," which focused on the day-to-day life of a blind individual. Verma couldn't stop thinking about the challenges people with visual impairments face every day.

So he set out to build Kinectacles version 1. Without Kinect, though, it was an incredibly difficult project. Verma built the hardware from scratch, using just a microprocessor and an infrared transmitter and receiver, and progress was slow. The things Kinectacles can now do in real time with Kinect, such as scanning for objects and issuing a prompt, were incredibly cumbersome to program in the microprocessor's native assembly language. ("It was just a nightmare to debug," he said.)

The idea was good enough to help Verma graduate, but he concluded he couldn't turn it into a real-world application. So he shelved the idea and more or less forgot about it until Kinect launched. He immediately envisioned using Kinect to pick up where he had left off.

By the spring of 2012, Verma was ready. He recruited Sharma and Goswami, and they spent a few months playing around with the "super awesome and very developer-friendly" Kinect for Windows SDK. Verma quickly realized that with Kinect, he didn't have to suffer the complexities of hardware development. "You get to see everything happening in real time; as soon as an object came in front of Kinect, we could see it in real time on the computer screen," he said.

No more wrestling with microprocessor assembly language, and no more obstacles to building a software prototype. The team spent nights and weekends writing code, and in March it presented its idea at the Garage Services and Devices Science Fair (India edition), sponsored by Microsoft IDC and the Engineering Excellence team, where it earned the Thought Leadership award. After refining the code, the team won the Golden Volcano award in May at the larger Garage Science Fair (India edition), sponsored by Microsoft IT India.

The team now considers the proof-of-concept phase wrapped up and has turned its attention to the prototype. As soon as it has something truly portable, it will work with a local nonprofit to conduct usability tests.

The biggest hurdle right now is portability: it's not feasible for someone to tote a Kinect sensor around all day, Verma said. One possible solution is to remove the PCB from the Kinect sensor and mount it on, say, a belt or a handheld device. Even better, the team hopes for a Kinect for Windows Phone SDK, because replacing the tablet with a phone would make the whole setup much easier to carry.

In the meantime, the team continues to add to its wish list of new features. It wants to add image recognition so that, rather than giving a "stop" command, Kinectacles would say "stop, wall." The team also hopes to incorporate Bing Maps someday so that the system could help people navigate from point A to point B.

Verma said he's thrilled that Kinectacles has gotten some attention for a six-year-old idea. This time, he won't stop until he sees it through.

"It feels great to get the attention, but now I feel like I have a duty to finish it. Thanks to Kinect, I'm actually living my dream of making this application."

Posted by Howard