Search and Natural User Interfaces – Part 2

In my first post on this subject last week, I referred to a scene in the movie “Minority Report” as a visionary example of natural user interfaces (NUIs) and, more to the theme of this blog, a visionary example of ad hoc search within a NUI. I realize that I didn’t offer a definition of NUIs in that post, so, before I go back to the search connection, here’s a quick primer.

NUIs Defined 

Natural user interfaces, or NUIs, rely on natural expressions like touches and gestures to directly and intuitively control the experience of a software application. The word “natural” means that the interaction is not mediated by an artificial device, like a mouse or keyboard. (I take this to imply that a Nintendo Wii is not an example of a NUI, since there are still artificial controllers involved. Other opinions and thoughts on this are welcome.)

NUIs have been described as the next evolutionary step in human-computer interaction – the successor to graphical user interfaces (GUIs), which succeeded command line interfaces (CLIs), which succeeded physical input devices like card readers. Touch screens on hand-held devices are the most common examples of NUIs, but there are a number of other emerging NUI platforms and technologies. This article on touch computing from PC Magazine offers a catalog of some of the systems currently available.

Microsoft Surface 

One of the technologies mentioned in the PC Magazine story is Microsoft Surface. Microsoft Surface is a Windows-powered device in the form factor of a table – a coffee table, if you will – with a surface that supports touch and gesture interaction. There are other NUI platforms, but there are a couple of things that make Microsoft Surface different and interesting.

First, the Microsoft Surface form factor and interface are designed to allow multiple users to interact with the device at the same time. The interface can detect and track dozens of touch points simultaneously. It can even recognize the orientation of fingerprints and infer, in turn, the physical orientation of a user relative to the display. Because of these capabilities, many applications created for Microsoft Surface emphasize multi-user collaboration and interaction – for example, there are multi-user games and other collaborative consumer applications for things like music and picture sharing.

Second, Microsoft Surface devices have built-in cameras that can not only track touches and gestures, but can also recognize digitally tagged objects and initiate specific actions when these objects are placed on the table. For example, Infusion Development has created an application designed to enhance the doctor-patient consultation experience. By placing a tagged card on Microsoft Surface, doctors can access interactive cardiac images, dynamic charts and clinical documents to help explain medical conditions and procedures to their patients.

NUIs:  Where’s the Search?

I was wowed by my first experience with Microsoft Surface – as many are when they first get a chance to play with one – but being a search guy, I looked for applications that included some sort of search function. Of the NUI applications I’ve seen to date, whether on Microsoft Surface or in other NUI technologies, very few provide true ad hoc search. In one or two examples, a virtual keyboard is used to enter search terms and traditional GUI search metaphors are used to render search results. More often, though, finding information requires the user to navigate through some pre-defined structure. Even this TouchWall demo by Bill Gates from last year’s CEO Summit focused on navigation. Where’s the search?

I’ll grant that structural navigation metaphors in NUIs are really cool and work pretty well. For example, I’ve seen a medical app that lets you visually navigate a representation of the human body to explore different anatomical concepts. You can tap on the virtual head to explore the brain and then drill down further to learn about neurons. It looks like a fun and interesting way to explore human anatomy, but the problem with this navigation-only approach is that if you don’t happen to know that neurons are in the brain, it will take you a while to find them. It is browsing, not ad hoc search, and, as we learned from the Yahoo Directory experience back in the 90s, people tend to prefer searching over browsing.

A Prototype and a Request

At our FASTforward’09 user conference in Las Vegas in February, we showed a prototype application, built in collaboration with a very sharp team of developers at EMC Consulting, which brought together ad hoc search and the natural user interface experience of Microsoft Surface. You can see a short video of this demo here, or the longer keynote presentation from the event here.


When Mark Stone, Global Enterprise Search Lead at EMC Consulting, and I first conceived this demo, we were inspired by three things:

1) The dramatic growth and potential of NUI technologies, particularly Microsoft Surface.

2) The dearth of search examples in all these NUI applications.

3) The potential for creating transformative user experiences that combine search and NUIs.

You can judge for yourself how successful the team was in combining ad hoc search with Microsoft Surface by looking at the demos, but one thing is for sure: we were in uncharted waters when building this app. The user interface patterns for search within a NUI are not well established. Even without considering search, building user interfaces in Microsoft Surface requires setting aside the old GUI models and learning brand new patterns and metaphors. As for search in a NUI, well, what are the equivalents to the search box, the search result list, navigation facets, document links, and all the interaction patterns around these “controls”? How can we use a third dimension (“depth”), and what role does “zoom” play in search? Working within a NUI environment even challenges the basic containers of information. Should you first show documents, or just extracted facts and information summaries? All these questions and more came up during the development of this prototype. Some of the answers are now known, or at least we have a better feel for the right direction to go, but others require more research and experimentation.

There is an opportunity here, and a challenge to be met by the search community. NUIs are here to stay, and they demand new patterns for true ad hoc search that satisfy the intuitive and natural interaction requirements of these environments. Reverting to browsing metaphors is not the answer; nor is simply recreating the GUI patterns of keyword search boxes and lists of blue links.

I’m very interested in this topic and am on the hunt for any good examples of true search within NUIs. If you know of an example, please send whatever pointer you can – links to demo videos, screen shots, academic papers, … anything. You can respond to this post or email me directly.


I look forward to seeing your examples and will summarize what I find in a future post.

In the meantime, I feel like we need a new name for search interfaces within NUIs. I like the phrase “Natural Search Interface” used by the Microsoft Germany Partner site in reference to the Microsoft/EMC Consulting prototype. I’ll use that.


Comments (4)

  1. Part of the question, in my mind, about considering natural user interfaces and search is how (or if) the information you are searching through is organized. The interface, and method of interaction, in searching for something which can be geographically represented could be quite different from searching for newspaper articles on a particular topic or looking up a phone number. As the user of a NUI, where is the starting point for your search? Should that differ depending on and be relevant to the ultimate object of your search?

    I think you make a very good point about not reverting to browsing methods. That would be the easy way out and seem to defeat the point of having a fresh opportunity to consider a new user experience environment.

    A very interesting topic, I look forward to seeing or hearing where the research takes you.

  2. ntreloar says:

    Good points, Carl.

    It’s interesting to think about how established interactions of various types of GUI search applications (news search, product catalog search, directory geo search, …) might manifest in a NUI. The interactions will, as you point out, be different, just as they are different in GUIs.

    You’re right that the starting point for search is the challenge when it comes to NUIs. How do you issue that first query and from where? A virtual keyboard is one approach but feels a little too much like the ubiquitous search box in GUIs. And typing on a virtual keyboard is just not the same as a physical keyboard. Even my most rabid iPhone friends acknowledge that.

    In our prototype we opted to de-emphasize keyword input and to use a "seed" concept to initiate a search – in the example, a customer account name pulled from the user's "profile". We then focused on search refinements and faceted search methods.

    Still, true ad hoc search ought to be there. In Microsoft Surface (and many other touch interfaces), it is possible to write words with your finger and have a handwriting recognition engine kick in. You can imagine just writing a few keywords on the Surface, and then tapping them to initiate the query.

    Voice input is another possible approach, but I haven’t seen that work so well with just a few spoken words – at least not outside very constrained menu prompting systems.

    Once initiated, NUIs offer interesting "search by example" and query refinement possibilities. One of the things we didn’t predict with Microsoft Surface was, because it’s multi-user, the powerful interactions that can happen *across* user query contexts. Imagine two users searching through information at the same time and then "joining" concepts across their respective results sets. You search for "vegan" and I search for "Italian" and then we intersect our results – like a Venn diagram.
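    The cross-user "join" idea above amounts to set intersection over two users' result sets. Here is a toy sketch of that Venn-diagram interaction – hypothetical data and function names, not the prototype's actual implementation:

    ```python
    # Toy sketch of joining two users' query contexts by intersecting
    # their result sets, as in the "vegan" x "Italian" example.
    # The index and result IDs below are entirely hypothetical.

    def search(index: dict, query: str) -> set:
        """Return the set of result IDs matching a single keyword."""
        return index.get(query.lower(), set())

    # Hypothetical restaurant index mapping keywords to result IDs.
    index = {
        "vegan": {"r1", "r2", "r4"},
        "italian": {"r2", "r3", "r4"},
    }

    user_a = search(index, "vegan")    # first user's query context
    user_b = search(index, "Italian")  # second user's query context

    # "Joining" the two contexts, Venn-diagram style.
    joint = user_a & user_b
    print(sorted(joint))  # → ['r2', 'r4']
    ```

    In a real NUI the interesting part is the gesture that triggers the join – say, dragging one user's result cluster onto another's – but the underlying operation is this simple intersection.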

    Thanks for the comment.


  3. ntreloar says:

    While not directly related to the post topic, this video of Microsoft SecondLight reinforces the point that NUIs enable a whole new class of user interactions that go way beyond the familiar GUI patterns.