A few of the comments I've gotten about my reputation system are telling me that I'm on the right track, but I'm really not doing enough to measure answer quality, not just quantity. Recently, the community lead for the C# team told me that she had been finding some of their newer moderators (who are excellent) by using the "Most Helpful Rated" column--not the "Top Answerers" column that I evangelize so much.
It's really too easy to just pick off the low-hanging fruit by marking simple answers as answers. If somebody posts something that's truly great in the forums, shouldn't they get more credit than a random Microsoftie who closes a thread with a link to the newsgroups and marks that as an answer?
Of course, the logical answer is "Of course," but it's just not that simple. It adds a layer of moderation on top of nearly everything in a system that's already fairly taxing for moderators (for answer marking to work well, you really need moderators who care about it). So, how do you judge answer quality without asking people to do extra moderation?
One idea I've toyed around with is using a combination of page view counts and "trackbacks" on a given post. The former we already measure--every time somebody looks at a post on the forums, it increments a little counter. Sure enough, the best, most-helpful posts have the highest view counts by far. The latter--"trackbacks" to posts--might not be entirely feasible with our current forum implementation, but the idea is simple (and the key to Google's billions): the more people who link to your post, the more likely it is that your post was useful and of high quality.
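Just to make the half-baked idea a little more concrete, here's a rough sketch of how the two signals might combine. Everything here is assumption--the function name, the log-dampening of views, and the link weight are all made up for illustration, not anything we've actually built:

```python
import math

def answer_quality_score(view_count, inbound_links, link_weight=5.0):
    """Hypothetical quality score for a forum post.

    Raw views mostly track exposure, so we dampen them with a log;
    each inbound link ("trackback") is a scarcer, stronger signal,
    which is the same intuition behind link-based ranking.
    """
    view_signal = math.log1p(view_count)       # diminishing returns on raw views
    link_signal = link_weight * inbound_links  # links count for much more than views
    return view_signal + link_signal

# A heavily viewed but never-linked post vs. a modestly viewed,
# often-linked one--the linked post should come out ahead:
casual = answer_quality_score(view_count=10_000, inbound_links=0)
linked = answer_quality_score(view_count=500, inbound_links=3)
```

The point of the log is that the jump from 100 to 10,000 views says less about quality than three independent people deciding a post was worth linking to.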
This is some fairly half-baked brainstorming, and I'm very aware of some of the holes in the system: it's easily gamed, and there's no real assurance that a post isn't racking up views simply because it looks helpful until you actually read it, and so on. But I'd really love to brainstorm with everyone...what do you think?
How can you gauge answer quality without asking a moderator to do it?