On November 15th, Elliott posted a blog entry about the Motion Tracking Robot Controller. I was fortunate enough to work on this project and to host an interactive demo of it at Maker Faire 2011 in New York this past fall. You can see what the installation looked like in the second half of this video:
It was a great experience and I would like to share some “Behind the Scenes” info about what it was like being there.
Maker Faire 2011
This was my first time at Maker Faire, and it was amazing. If you haven’t been, I highly recommend it. We had the Robo-Razzi robot roaming around taking pictures, to the delight of the crowds (smile for the camera!). I spent nearly all my time in the Microsoft tent running the Kinect demo for scores of people each day. My arms were sore at the end of each day, but it was worth it… we shared some cool technology with thousands of people and earned a blue ribbon award for an “excellent exhibit”.
The ‘Driving Eddie with Kinect’ demo was a big hit; there were lines to try it for much of both days. I had to make some last-minute adjustments to the code when we first set it up and fine-tune it for one of the bots, but other than that it ran flawlessly for hours, and people were just amazed. The only downtime was to swap laptop batteries every few hours or to switch to a charged robot toward the end of the day, and I got those downtimes to under three minutes.
I can’t begin to tell you how many giggling kids and fascinated adults absolutely loved it. I got to see people’s faces light up all day long. I was constantly saying, “OK… we have a line… time’s up… who’s next?” I repeatedly heard “That is just amazing!” and “Incredible!” So many people wanted to know more and loved hearing all about it, and many wanted to try it over and over again. I saw parents take extra time to explain to their very young kids what was going on and to make sure they realized how remarkable it was; you could tell they saw this as something important that should be understood. No one was disappointed, everyone left with a smile, and many took time to learn more about Eddie and the software. It was a great lead-in to talking about RDS4, the contest, and Microsoft’s vision of advanced software for robots.
Everyone tall enough to see over the monitor into the robot pen quickly got the hang of driving it, and it was fascinating to watch people learn to drive and develop their own driving styles. The best moments came when people watched the robot moving around the pen for a while without knowing how it was being controlled, then spotted someone off to the side with her hands out, rolling back and forth like an airplane. They would put two and two together, light up with excitement, and get in line. Then there were people who got in line without knowing what was going on at all; they saw people waving in front of a big screen and wanted to try. When they stepped up and I told them they would be controlling that robot over there with their body, they couldn’t believe it, and when it responded to them the smile was priceless. It delighted men, women, boys, and girls of all ages.
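For readers curious how an airplane-style gesture can translate into drive commands: the actual demo was built on Microsoft Robotics Developer Studio with Kinect skeletal tracking, and I’m not reproducing its code here. The sketch below is purely illustrative — the function, the normalized hand coordinates, and the differential-drive mixing are my own assumptions about one simple way such a mapping could work.

```python
def hands_to_drive(left_hand_y, right_hand_y, deadband=0.05):
    """Illustrative gesture-to-drive mapping (not the actual demo code).

    Hand heights are normalized: 0.0 = shoulder height, +1.0 = fully
    raised, -1.0 = fully lowered. Tilting the arms like an airplane
    steers; raising or lowering both hands together sets the speed.
    Returns (left_wheel_power, right_wheel_power), each in [-1.0, 1.0].
    """
    tilt = right_hand_y - left_hand_y           # roll of the "airplane"
    lift = (left_hand_y + right_hand_y) / 2.0   # both hands up = forward

    # Ignore tiny wobbles so the robot doesn't twitch while standing still.
    turn = 0.0 if abs(tilt) < deadband else max(-1.0, min(1.0, tilt))
    speed = 0.0 if abs(lift) < deadband else max(-1.0, min(1.0, lift))

    # Differential drive: mix speed and turn into per-wheel powers.
    left_power = max(-1.0, min(1.0, speed + turn))
    right_power = max(-1.0, min(1.0, speed - turn))
    return left_power, right_power
```

With a mapping like this, hands held level at mid-height drive straight, and dipping one hand pivots toward that side — which is roughly the behavior people discovered within a few seconds of stepping up.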
I purposely left some of the demo code visible behind the client app, and lots of people asked about it: “Is that the code?”, “Did you write that?”, “What language is that?”, “Was it hard to write?”, “How many lines of code?” I would take time to answer them or direct them to another team member so I could keep the line moving.
There was one young woman who got my usual fifteen-second instructions and was immediately driving the robot around like a pro. I could tell from the way she stood, from the fluid, graceful motion of her arms, and from how her wrists and fingers perfectly matched her arm movements that she was a dancer… probably ballet. The robot glided around the pen more gracefully than I had ever seen. I asked her if she was a dancer, and she smiled and said she was. I told her the robot was moving more fluidly and beautifully than it had for anyone before her, and she smiled and continued to make it dance and swirl for a few minutes. I was so impressed; I felt like I was in a PBS special on human-computer interaction.
Another young man wanted to know whether it would track someone standing on their head. So did I, and the next thing I knew he was driving the robot around with his legs, much to everyone’s delight. In that orientation, the Kinect skeletal-tracking software interpreted his legs as arms, and away he went.
Several people asked, “How is this useful?” or “What are you hoping to accomplish?” I would explain that this is a technology demonstration intended to spark the imagination and to invite the user to experience firsthand what it is like to become embodied in a simple robot. While the specific setup we had there might not have a practical application beyond entertainment, I think we accomplished that goal. As robots become more common in our world, exploring new ways of interacting with them through powerful user interface devices like Kinect is both fun and exciting… and I’m pretty sure we sparked an imagination or two.
What would you do with an Eddie robot in your home? Enter for a chance to win $10,000!