In the realm of applied research, perhaps nothing is more satisfying than working on projects that can help save lives. Such is the case with a unique project at the University of Massachusetts Lowell that combines Microsoft Surface and Microsoft Robotics Developer Studio in a Human-Robot Interaction (HRI) application to create novel remote controls for rescue robots. To the best of our knowledge, this is the first time these two technologies have been used together—tell us if you know of others! Once perfected, this approach could enable emergency responders to safely maneuver rescue robots through buildings damaged by earthquakes, fire, or even terrorist attacks.
The groundbreaking work was dramatically presented on the Web in August, when doctoral candidate Mark Micire posted a live video of his PhD defense showing how to control swarms of robots using the Surface table as a touch controller. A new, higher-quality video of the thesis defense and an overview video have recently been posted online. The overview shows how a team of rescue robots could be controlled remotely by using the Surface table and a device known as the DREAM Controller (a lovely acronym for Dynamically Resizing, Ergonomic, And Multi-touch Controller).
The system could be a tremendous boon for emergency responders, who now must often wait 12 to 24 hours to obtain geo-referenced data that combine notes from rescue workers in the field with paper maps and building plans. During Hurricane Katrina, for example, many response groups were still using hand-drawn paper maps. Additionally, robot cameras sent video only to the site operators—not immediately to the command staff.
The proposed system would solve these problems by creating a common computing platform that brings all of this information to the command staff, enabling them to use rescue robots more effectively. As Micire describes in his presentation, “A single-robot operator control unit and a multi-robot command and control interface [can be] used to monitor and interact with all of the robots deployed at a disaster response. Users can tap and drag commands for individual or multiple robots through a gesture set designed to maximize ease of learning.”
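To make the tap-and-drag idea concrete, here is a minimal sketch of how touch gestures might be routed to one or more selected robots as waypoint commands. The `Robot` class, `dispatch_gesture` function, and gesture dictionary format are all hypothetical illustrations, not the actual UMass Lowell gesture set or API.

```python
from dataclasses import dataclass, field

@dataclass
class Robot:
    """Hypothetical stand-in for a deployed rescue robot."""
    name: str
    waypoints: list = field(default_factory=list)

    def go_to(self, x, y):
        # In a real system this would send a navigation command;
        # here we just record the requested waypoint.
        self.waypoints.append((x, y))

def dispatch_gesture(gesture, robots, selection):
    """Route a touch gesture to every robot in the current selection.

    gesture: dict with a 'type' key ('tap' or 'drag') plus either a
    single 'point' or a 'path' of points in map coordinates.
    """
    if gesture["type"] == "tap":
        # A tap sends each selected robot to the tapped map point.
        for name in selection:
            robots[name].go_to(*gesture["point"])
    elif gesture["type"] == "drag":
        # A drag traces a path; each selected robot follows it in order.
        for name in selection:
            for point in gesture["path"]:
                robots[name].go_to(*point)

robots = {"r1": Robot("r1"), "r2": Robot("r2")}
dispatch_gesture({"type": "tap", "point": (3, 4)}, robots, ["r1", "r2"])
dispatch_gesture({"type": "drag", "path": [(5, 5), (6, 7)]}, robots, ["r1"])
```

The key design point the quote suggests is that the same small gesture vocabulary applies whether one robot or many are selected, which keeps the interface easy to learn under the stress of a disaster response.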
An example of the burgeoning research field of NUI—or Natural User Interaction—this work “illustrates just one of the many exciting new directions enabled by advanced technologies in the human-computer interface,” says ER’s NUI Theme Director, Kristin Tolle. The project, which was supervised by UMass Lowell’s renowned robotics expert, Professor Holly Yanco, also demonstrates the great synergy that can arise from collaborations between Microsoft Research and leading academic institutions. By empowering Yanco and Micire’s research with cutting-edge tools, Microsoft Research is helping bring a potentially life-saving technology within reach.
This work was partly supported by a grant from Microsoft Research under our Human-Robot Interaction RFP (Request For Proposals).
—Stewart Tansley, senior research program manager, External Research, a division of Microsoft Research