Measuring developer productivity

I love managed code… the benefits of the virtual execution environment are manifold and far-reaching, but if I had to distill it all down to just two words that describe why Microsoft is building the CLR, the .NET Framework, WinFX, VB.NET, C#, C++, etc., the answer is simple and clear: developer productivity. We want developers using our platform to be the happiest, most productive developers on the planet.

Ahh – but can I prove it? Can I provide evidence that developers really are more productive in managed code? Certainly we have anecdotal indications and customer testimonials, but what about hard numbers? For good or for bad, Microsoft is an extreme competence culture. We love to measure *everything* around here, because that is the only way we can know how we are doing… Just today I have seen measurements of bugs per dev, bugs per team, bug regression rates, bug trend lines, comparative .NET Framework book sales rates, # of FxCop violations per kloc, etc., etc. The list is really endless…

So, I need your help… I want to generate some data on the productivity of managed code. I am open to however you want to think about that, but it needs to be something quantifiable: something such as bugs per kloc, servicing issues per kloc, or dev days per feature, and I need to be able to compare it to something else, such as unmanaged code, VB6, etc. The kind of data I am thinking of is like this mythical example: “V1.0 of widgetX was written in C++; it took 5 man-years, and we found an average of 30 bugs per kloc. V2.0 was a complete rewrite in C# with significant new features; it was done in 1 man-year, and we found only 10 bugs per kloc.”
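To make that metric concrete, here is a minimal sketch of the bugs-per-kloc arithmetic in C#. The bug counts and line counts are invented purely to reproduce the figures in the mythical example above; nothing here is real product data.

    // Minimal sketch of the bugs-per-kloc arithmetic. All figures are
    // hypothetical, chosen to match the mythical example above.
    using System;

    class ProductivityComparison
    {
        // Bugs per thousand lines of code (kloc).
        static double BugsPerKloc(int bugCount, int linesOfCode)
        {
            return bugCount / (linesOfCode / 1000.0);
        }

        static void Main()
        {
            // V1.0: C++, 100,000 lines, 3,000 bugs -> 30 bugs/kloc.
            double v1 = BugsPerKloc(3000, 100000);

            // V2.0: C# rewrite, 40,000 lines, 400 bugs -> 10 bugs/kloc.
            double v2 = BugsPerKloc(400, 40000);

            Console.WriteLine("V1.0 (C++): {0:F1} bugs/kloc", v1);
            Console.WriteLine("V2.0 (C#):  {0:F1} bugs/kloc", v2);
        }
    }

The arithmetic is trivial, but that is the point: given raw bug counts and line counts for a managed and an unmanaged version of the same product, this is all it takes to put the two codebases on the same scale.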

So, how about it: can you help me? Please leave a comment or drop me a line.

Thanks!