Be The Robot

One of the issues with computer science is that some people would rather move atoms than pixels. That is to say that virtual objects, even if they involve images on a screen, are less compelling to some people than physical objects doing actual work in the real world. Robots have long been part of the connection between the real and virtual worlds. How do we control those robots? How much autonomy do they have, and how much are they responding to direct human control? Well, that all depends. Some tasks are easily programmed and easily done by “rote.” Others have too much variability or require too much in the way of functions (vision comes to mind) that are not as easily programmed. So in many cases there is a mix. The FIRST Robotics Competition includes just that sort of mix. The event starts with a short period of autonomous action followed by a period of remote human control. That mix leads to some interesting software development. The 2012 competition will be adding a new wrinkle, though. Microsoft is donating a Kinect device for each team to use. The idea is to allow the robot to become more of an extension of the human player. In a sense, to allow the human to “be the robot.” Here is some of what the press release says.

In the 2012 FIRST Robotics Competition, teams will be able to control robots via Kinect. They will be able to either program their robots to respond to their own custom gestures made by their human teammates, or use default code and gestures. The added ability for teams to customize the application of the Kinect sensor data is a valuable enhancement to the FRC experience.

This is pretty exciting. During the autonomous period, team members will be able to provide some guidance to one robot (of the three on a team) by moving their bodies. Gestures of various types, recognized either by sample code that will be supplied or by custom code developed by team members, will be used to direct a robot’s movements and actions. That opens up some interesting possibilities. An additional possibility, one that is not obvious to me from the press release, is that a Kinect could become part of the robot as well. Oh, there are power concerns, issues of weight, and a small matter of software to make it all work, but FIRST students are amazingly clever. Between all the possibilities, I can’t wait to see what these students come up with.
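To make that concrete, here is a minimal sketch of the kind of gesture-to-command mapping a team might write. Everything in it is made up for illustration: the RobotCommand names, the Joint3D struct, and the “hands above the head” rule are all hypothetical, and real code would pull joint positions from the Kinect skeleton stream and feed the resulting command into the robot control code.

```csharp
// A minimal sketch of gesture-to-command logic. All names and
// thresholds here are hypothetical; a real program would read joints
// from the Kinect SDK skeleton stream and send commands to the robot.
using System;

enum RobotCommand { Stop, DriveForward, TurnLeft, TurnRight }

struct Joint3D
{
    public double X, Y, Z;
    public Joint3D(double x, double y, double z) { X = x; Y = y; Z = z; }
}

static class GestureClassifier
{
    // A very simple gesture vocabulary: both hands above the head means
    // drive forward, one raised hand means turn toward that side,
    // anything else means stop.
    public static RobotCommand Classify(Joint3D head, Joint3D leftHand, Joint3D rightHand)
    {
        bool leftUp = leftHand.Y > head.Y;
        bool rightUp = rightHand.Y > head.Y;

        if (leftUp && rightUp) return RobotCommand.DriveForward;
        if (leftUp) return RobotCommand.TurnLeft;
        if (rightUp) return RobotCommand.TurnRight;
        return RobotCommand.Stop;
    }
}

class Demo
{
    static void Main()
    {
        // Fake joint data standing in for one tracked skeleton frame.
        var head = new Joint3D(0.0, 1.6, 2.0);
        var left = new Joint3D(-0.3, 1.8, 2.0);  // left hand raised
        var right = new Joint3D(0.3, 1.0, 2.0);  // right hand lowered

        Console.WriteLine(GestureClassifier.Classify(head, left, right)); // TurnLeft
    }
}
```

The interesting design work for students is in choosing gestures that are easy to hold, easy to tell apart, and tolerant of noisy joint data; the classification itself can stay this simple.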

I understand that some teams are getting early hardware so that they can take part in a beta program. FIRST usually runs some sort of beta in the fall, before the regular season starts, so that it can test new hardware, new software, or other new ideas that will probably be part of the new season’s game. Some software is also being developed to act as a starting platform for student teams to use and build around. I don’t know yet when this software will be available, though. Not that teams have to wait for it to start experimenting on their own, of course. Established teams often have robots from previous competitions to experiment with. The Kinect is not an expensive device, and the software to develop for it is free. The Kinect for Windows SDK is a free download, and students can develop with the Visual Studio Express editions in C++, C#, or Visual Basic. Or high schools can sign up for DreamSpark to get students free professional development products from Microsoft.
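For a sense of what developing against the SDK looks like, here is a rough sketch of reading skeleton joints in C#. The class names follow the managed Microsoft.Kinect API (KinectSensor, SkeletonStream, and so on) and may differ in the beta SDK builds teams are receiving, and the console output stands in for whatever a team would actually do with the data, so treat this as a shape rather than copy-and-paste code.

```csharp
// A rough sketch of reading skeleton data with the Kinect SDK's
// managed API. Names follow the Microsoft.Kinect 1.x-style API and
// may differ in earlier (beta) releases.
using System;
using System.Linq;
using Microsoft.Kinect;

class KinectDriverStation
{
    static void Main()
    {
        // Grab the first connected sensor and turn on skeleton tracking.
        KinectSensor sensor = KinectSensor.KinectSensors
            .FirstOrDefault(s => s.Status == KinectStatus.Connected);
        if (sensor == null) { Console.WriteLine("No Kinect found."); return; }

        sensor.SkeletonStream.Enable();
        sensor.SkeletonFrameReady += OnSkeletonFrameReady;
        sensor.Start();

        Console.WriteLine("Tracking. Press Enter to quit.");
        Console.ReadLine();
        sensor.Stop();
    }

    static void OnSkeletonFrameReady(object sender, SkeletonFrameReadyEventArgs e)
    {
        using (SkeletonFrame frame = e.OpenSkeletonFrame())
        {
            if (frame == null) return;

            var skeletons = new Skeleton[frame.SkeletonArrayLength];
            frame.CopySkeletonDataTo(skeletons);

            // Use the first fully tracked skeleton as the human player.
            Skeleton player = skeletons
                .FirstOrDefault(s => s.TrackingState == SkeletonTrackingState.Tracked);
            if (player == null) return;

            SkeletonPoint head = player.Joints[JointType.Head].Position;
            SkeletonPoint rightHand = player.Joints[JointType.HandRight].Position;

            // Placeholder: a real program would map the joints to a
            // gesture and pass the resulting command to the robot code.
            Console.WriteLine(rightHand.Y > head.Y ? "Right hand up" : "Right hand down");
        }
    }
}
```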

Related links: