Kinect for Windows helps decode the role of hand gestures during conversations

We all know that human communication involves more than speaking; think of how much an angry glare or an acquiescent nod says. But beyond such obvious body language, we also use our hands extensively while talking. Ubiquitous as they are, our conversational hand gestures are often difficult to analyze; it’s hard to know whether and…


Kinect Sign Language Translator – part 2

Today, we have the second part of a two-part blog post written by program managers in Beijing and Redmond, respectively. Second up is Stewart Tansley: When Microsoft Research shipped the first official Kinect for Windows software development kit (SDK) beta in June 2011, it was both an ending and a beginning for me. The thrilling accomplishment of rapidly and…


Kinect Sign Language Translator – part 1

Today, we have the first part of a two-part blog post written by program managers in Beijing and Redmond, respectively. First up is Guobin Wu: I consider myself incredibly lucky to be the program manager of the Kinect Sign Language Translator project. There are more than 20 million people in China who are hard of hearing, and an…