You’re familiar with GUIs even if you don’t know the term – it’s the type of user interface you use every day on your PC (or Mac): a Graphical User Interface. Over the last year or so, Microsoft and others have begun referring to NUI, or Natural User Interface. This is where computers start to become, well, more natural, with speech and visual gestures as the modes of interaction rather than mouse and keyboard. Todd Bishop has a post about this topic today and notes that Bill Gates recently described NUI as “the thing that people underestimate right now”.
Wikipedia defines NUI as “common parlance used by designers and developers of computer interfaces to refer to a user interface that is effectively invisible, or becomes invisible with successive learned interactions, to its users”. That entry mentions Bill Buxton of Microsoft – you’ll see him in the video above, as he’s been working on this stuff since as far back as 1984. Project Natal and Surface from Microsoft are also mentioned, alongside the amazing multi-touch work of Jeff Han, which he showed at TED in 2006. On a side note, it wasn’t a surprise to me to see that poleydee was an early contributor to the entry, as I know he’s a big fan of Bill’s work.
There is a lot of work going on at Microsoft around NUI, and this video brings much of it together – from Windows 7 touch, Surface and Natal to the Office Envisioning videos. Until now, much of this seemed a bit far off – in the realm of Minority Report, it felt very “Hollywood”. However, inexpensive display technology is enabling any surface to become an interactive screen. Cameras and microphones can now be embedded in almost anything, and the Wii and iPhone have shown that computers can understand simple gestures – Natal goes further with more advanced gestures and speech recognition. There is much talk of augmented reality at the moment, and that adds another layer of exciting potential.
Component prices are falling fast, so we’ll start to see hardware catch up with the software work that has gone on to date. Microsoft Research, if we get things right, has a big part to play here: for years we’ve invested in computer vision, machine learning, user interfaces and language processing across our many labs, with much of that work here in the UK in our Cambridge lab, which I’m personally pleased about. It’s been a long journey – this work goes back as early as 1991 – and finally we’re starting to see it bear fruit.
We’re taking a platform approach to this and looking to others to innovate on top. The PDC began our quest to unleash that innovation – the laptop giveaway encouraged these new types of NUI apps, with a machine designed specifically to highlight its sensors and enable completely new applications based on touch and more. The second video below shows some of the apps that shipped on the PDC laptop – but these are just the tip of the iceberg. I can’t wait to see what developers do with all of this.