The Robotics Lab at the University of Massachusetts, Lowell (UML), is exploring ways to integrate Microsoft Surface and its natural user interface (NUI) into search-and-rescue robotics to control robots remotely with more precision and accuracy.
The video below shows their work. Two Microsoft technologies were used in this project: Microsoft Robotics Developer Studio (for simulation) and Microsoft Surface (for the user interface).
Stewart Tansley, Senior Research Program Manager at Microsoft Research, wrote in the Microsoft Research Blog:
The Surface allows multiple users to interact with the computer simultaneously by using whole-hand or multiple-finger gestures. These gestures enable rescue teams to control robots with greater dexterity than they could with traditional robotics controllers—and precise control of the robots is critical for search-and-rescue efforts. In addition, the Surface permits more than one robot to be controlled simultaneously—previously not possible with a single controller.
The novel NUI approach to robotics that was employed by the Lowell robotics lab in this socially significant application helped the DREAM Controller project win one of eight grants that Microsoft Research offered under our Social Human Robot Interaction Request for Proposals (RFP). The grant award included financial support, a donated Microsoft Surface, and access to the Microsoft Research team.