OneNote Momentum

What an exciting three months OneNote 2007 has had in the marketplace. By every measure, OneNote 2007 is a hit! For one thing, check out this blog activity:


[Chart: blog posts containing "OneNote" over the last 360 days, taken May 9 (from Technorati)]

Traditionally, many people measure a product's success by a single metric: the number of units sold. But there are other metrics worth using. One, of course, is profit - if you gave away all those units for a song (or for free!), you didn't make any money, and it's not clear how dedicated those customers are. Conversely, if you held the price high and people still bought a lot, you have a good sense that people see value in the product.

Another measure is usage. You want to see that people are really using your product. That means they are getting value out of it, and it also indicates loyalty.

Another measure is "buzz" like the blog measure above. Are people talking about your product? If so, that's also a good sign. Notice there is a spike not just on the "news" of availability (around Jan 30) but there are higher spikes later - that's when people are using the product and talking about it. For examples of what people are saying, check out Dan's "blog roundup" posts: January, February, March, April. Some of my favorite quotes:

  1. "OneNote 2007 sharing is indistinguishable from magic"

  2. "I just purchased a copy of Microsoft OneNote, my life will never be the same."

But those are tame. Why not really go for it?

  1. "The Greatest Invention in Human History? I vote for Microsoft OneNote"

  2. "I need Office OneNote 2007 to live."

And for the you-know-who crowd:

  1. "I can't believe I'm so excited over some program that M$ came up with. It's probably just all the adrenaline that's been pumping through me lately."

And we're just getting started!

There are other measures still. For software there is also "deployment": many companies have purchased long-term contracts with Microsoft covering most or all of our latest software, but they don't always get around to putting the new stuff on their users' machines, since they have plenty of other work to do. So we care about whether deployment has actually happened - it's a measure of how much they value the new stuff.

I can't share specific sales figures with you all, and they don't tell the whole story anyway (there's that deployment issue, plus lots of people get OneNote on their laptop without knowing it, and so on). I do want to show the trends we're seeing, however.

First, it's worth noting that OneNote 2003 (the first release) was a success in its own right. A new product that costs money and isn't a visible lifestyle item (e.g. software to get work done vs. an iPod) takes time to build its user base. And as I said, the nice thing about free products or services is that they can build users fast, but because they are free, their users often have no special investment in the service. OneNote 2003 shipped well over 10 million units and racked up several million actual users over the three years it was on the market (as best we can tell). Pretty good for a whole new "category" of software most people didn't know about or know they needed, with next to no marketing budget and no inclusion in any Office suite! By contrast, the top web productivity apps and suites that everyone writes about because they're "hot" all have fewer than 500K users, most of them far fewer (I can't tell you how we know that, though!)

Our plans for OneNote are for it to build momentum like "rolling thunder" over several years. Each release retains users from the one before and adds proportionally more. The great majority of people only try a new thing when their friends recommend it, and that takes some time. Think about how an application like PowerPoint went from obscurity to ubiquity over the course of a few years - that's the idea.

Fortunately, in addition to raving fans and sales figures, we are able to get more quantitative and explicit measurements of popularity. One way is through the Customer Experience Improvement Program (CEIP). Some of you may know it as the little balloon that pops up to ask whether we can (anonymously and in aggregate) track which commands you use in the application, how long you use it, etc. We use this data to make the product better in the future, but it is also a handy measure of overall activity. CEIP data is returned to us in the form of "sessions" - fixed-length blocks of time containing usage data.

Here's where it really gets exciting. Although we can't know for sure how many users these session counts represent, we believe variables like the percentage of users who opt into the program are about the same for each release, which makes the session counts comparable across releases. Look at these relative numbers!

| Release  | Date       | CEIP sessions added over the 5 months after RTM (code final) |
|----------|------------|---------------------------------------------------------------|
| 2003     | 8/15/2003  | 310,109                                                        |
| 2003 SP1 | 7/22/2004  | 1,050,620                                                      |
| 2007     | 10/28/2006 | 10,744,083                                                     |

Do you need a chart? Can someone say hockey stick?
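For the curious, here's a rough sense of the slope. This is just a back-of-the-envelope sketch in Python using the session counts from the table above; the only assumption (the one described earlier) is that the opt-in rate and per-user usage intensity are roughly constant across releases, so session ratios approximate user ratios.

```python
# Session counts from the table above (first 5 months after each RTM).
sessions = {
    "2003": 310_109,
    "2003 SP1": 1_050_620,
    "2007": 10_744_083,
}

# If the opt-in rate and per-user usage are roughly constant across
# releases, the ratio of session counts tracks the ratio of active users.
releases = list(sessions)
for prev, curr in zip(releases, releases[1:]):
    print(f"{prev} -> {curr}: {sessions[curr] / sessions[prev]:.1f}x")
print(f"2003 -> 2007 overall: {sessions['2007'] / sessions['2003']:.1f}x")
```

That works out to roughly 3.4x from 2003 to 2003 SP1, another 10.2x from SP1 to 2007, and about 34.6x overall - hockey stick indeed.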

How many users is this? It's really hard to say, since it depends on people agreeing to join the program, which is off by default - only a tiny fraction actually send us data. But it's a lot, and look at that trend!