At a recent event at Microsoft Cambridge, we saw some amazing new user interfaces. Some used physical objects, some used large touch-enabled tables, others involved gesture recognition, and there were even prototypes that let users interact with virtual 3D objects projected into the real world.
Researchers outlined the evolution of user interfaces since the sixties: the mouse (1963), the windows-icons-menus-pointer (WIMP) interface (1972), the Nintendo Wii (2006), and the Apple iPad and Microsoft Kinect (both 2010).
The addition of touchscreens and gesture recognition has changed the way we interact with computers and devices.
So what’s next?
- Physical interaction
- Giving computers a sense of space
- Embracing 3D interfaces
- Augmented reality
But what does this mean for businesses? In short: change happens and companies that embrace it do better than companies that don’t.
Five years ago, before the launch of the iPhone and iPad, most businesses wouldn’t have considered support for touchscreen tablets and mobile devices very important. Now, it’s essential.
Similarly, games that don’t support gestures, whether via the Wiimote or the controller-free Kinect, suffer in the market.
In the early 80s, most people interacted with computers through command-line interfaces and text displays. We all know how that changed.
Already, Kinect for Windows is being used in a range of applications, from education and retail to hospitals and even car dealerships. What could it do for your business? What would gesture, voice and projected 3D user interfaces mean for the way your users and customers interact?
To find out more about how Microsoft technology is supporting innovation in the Enterprise, visit www.microsoft.com/en-gb/business/enterprise
By Tim Cozze-Young
Microsoft UK Enterprise Team