The Slippery Slope of Blog Statistic Gathering and Weighing

The internal Microsoft bloggers alias, where no topic is taboo and I find things out faster than the Seattle Times, has spent most of today debating the importance, or lack thereof, of blog statistics. This response hit me like a tidal wave:

"You say that we can debate the utility of metrics some other time, but implicit in your opinion is that metrics are a good thing.

One of the worst things we could do for blogs is have them show up in commitments with metrics. We want people blogging who *want* to be blogging, and we want to give them freedom to blog in the way that works for them and makes sense for the customer. If you're an expert in, say, the Visual Studio Extensibility Model, or some other niche area, you may never pull big blog numbers but your blog may be *intensely useful* to the people who do read it.

I'm an opponent of metrics around community unless they are ones collected directly from customers (ie "how would you rate your experience solving your problem in the forums?" or "what .NET blogs have you found useful?"), but even in those cases, I don't think they should be on commitments. Metrics inevitably make you focus on the numbers rather than the goal.

I'm willing to temper my opposition in specific cases. If your team has no community presence, I could see a team goal of "x blogs", or if you're a product manager, I could see some sort of goal there, since being customer-visible is a big part of your job." – Eric Gunnerson

Then the geekiness starts to come out in most of us, and we start talking about the different statistic-gathering solutions: Blogbeat, StatCounter, Technorati, and Feedburner, to name a few. So many toys and so little time.  :o)