Community Health for Visual Studio Team System – Part 3 Details


This post is the third in a series about who the Visual Studio Team System Community Council is, what goals we’re trying to accomplish, and what results we’ve seen so far. In the year between August 2006 and July 2007, the council monitored the following community initiatives:

• MSDN Forums – covered in part 1

• Channel 9 Broadcasts – covered in part 2

• MSDN Chats – covered in part 2

• Visual Studio Team System Advisory Council – covered in part 2

• CodePlex (added in April) – covered in this post

• Webcasts – covered in this post

• Connect – covered in this post

• Conferences – covered in part 4

• MVP Program – covered in part 4

• Blogs – covered in part 4

In this post, I want to talk about CodePlex, webcasts, and Connect. As I did in parts 1 and 2, I will look at the metrics that we tracked, our interpretation of those metrics, and our plans for the initiatives moving forward.

CodePlex

The CodePlex initiative was launched in April 2007 so that we could share out-of-band projects and engage in collaborative development with the community. Put another way, CodePlex helps the community work with members of the VSTS team on projects that are not part of our primary releases.

Progress against Goals

We identified only one metric for CodePlex: we wanted to see at least four new CodePlex projects per quarter.  Because CodePlex was a new initiative, we have data only for the second quarter (April, May, and June) of 2007. In that timeframe, eight projects were added to or published in CodePlex, which means that we exceeded the goal quite handily. As of this review, CodePlex has a total of 30 projects.

Some of the most popular projects in CodePlex are:

  • Team Foundation Server Administration Tool (~138,000 views)

  • Branching Guidance (~59,000 views)

  • Patterns and Practices (~46,000 views)

  • Automation (~35,000 views)

  • Migration Toolkit (~21,000 views)

Supporting the Community Initiatives

Like the forums and chats on MSDN, CodePlex gives the VSTS community another way to provide feedback. In this case, it’s feedback on out-of-band releases that the VSTS product team might not otherwise be able to deliver. All of the product teams in VSTS have lists of things that they would love to deliver, and many of these great suggestions do not make it into a product release (for various reasons). Some can be delivered as side projects via CodePlex, solving customer problems along the way. This approach has the added benefit that community members can take ownership of CodePlex projects and drive the delivery of features to meet their own requirements. As a result, the community can address at least some issues in a much shorter time frame than a full Visual Studio release cycle.

You can think of it as dog-walking. Any community member can easily take one dog for a walk in a fairly efficient manner. In contrast, the VSTS product teams must take hundreds of dogs for a walk, all at the same time, which requires a lot more effort and preparation to get to the desired destination. And yes, I could make a number of other observations about that metaphor, but I will take the high road.

Final Thoughts

CodePlex provides a home for collaborative, shared-development projects with high community involvement. Although CodePlex is still new, the feedback so far has been positive. We still have some challenges to solve, mostly around starting more projects (both from Microsoft and from the community). Looking forward, the VSTS teams want to put more sample code online to help seed new projects, and we want to increase community involvement in the projects that are already published.

Webcasts

VSTS teams use webcasts to communicate with the community through streaming or recorded presentations, complete with audio. By watching them, you can find in-depth technical details about how to use VSTS to solve day-to-day problems.

Progress against Goals

We had three goals for webcasts: average 5 webcasts per month (60 total for the year), average 500 views per webcast, and have every webcast achieve customer ratings that averaged 7.0 or better (on a scale of 1 to 10). Essentially, we wanted to provide a regular flow of new webcast content, to choose content that the community wants to watch, and to have the community feel that it received good value from those webcasts. How did we do?

We published only 36 webcasts in the year ending in July 2007, well short of our goal of 60 for the year. And the majority of those were from two sources: Patterns & Practices and VSTS Team Edition for Database Professionals. (Together, those groups accounted for 55% of the webcasts and 69% of the downloads.) Although we averaged 739 views per webcast, 58% of our webcasts had fewer than the desired 500 views. The good news is that all of our published webcasts did receive ratings that averaged 7.0 or better.

Other Takeaways

So what did we learn? Well, we were not able to provide the amount of content that we expected, and when we did provide content, quite a bit of it had a fairly narrow audience. However, we still do not know whether people could not find the content (did anyone know that we published these webcasts?) or whether they simply didn’t care about the topics.

Supporting the Community Initiatives

We generally published webcasts around major launch events, such as community technology previews and product releases. The webcasts ended up being more of a marketing and training vehicle than a source of rich community interaction.

Final Thoughts

We will remove webcasts from our list of community initiatives. We will still publish them, but as part of an overall release plan rather than as a community activity.

Connect

By using the Microsoft Connect site, the community can download the most recent software and content, take surveys, exchange ideas in newsgroup forums, and provide and review feedback about experiences using Microsoft products. Primarily, customers use http://connect.microsoft.com/visualstudio to report bugs and submit suggestions. We take that feedback from the Connect database and move it into the issue-tracking database for the appropriate team, which handles it along with any other bugs and suggestions for that product.
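
As a rough illustration of that hand-off, here is a minimal sketch in Python. Every name in it (the ConnectItem record, the routing table, the route_feedback helper) is invented for the example; the actual internal pipeline is not public, so treat this as a conceptual sketch rather than our real tooling.

    # Hypothetical sketch of routing Connect feedback into the owning
    # team's triage queue. All names are invented for illustration;
    # the real internal pipeline is not public.
    from dataclasses import dataclass

    @dataclass
    class ConnectItem:
        item_id: int
        kind: str    # "bug" or "suggestion"
        area: str    # product area the customer filed against
        title: str

    # Maps a Connect product area to the owning team's triage queue.
    ROUTING = {
        "Team Foundation Server": "tfs-triage",
        "Team Edition for Database Professionals": "dbpro-triage",
    }

    def route_feedback(item: ConnectItem, work_items: dict) -> str:
        """File the item in the owning team's queue so it is triaged
        alongside that team's other bugs and suggestions."""
        queue = ROUTING.get(item.area, "vsts-general-triage")
        work_items.setdefault(queue, []).append(item)
        return queue

    work_items = {}
    bug = ConnectItem(1234, "bug", "Team Foundation Server", "Check-in fails")
    print(route_feedback(bug, work_items))    # -> tfs-triage

The point of the sketch is simply that customer-reported issues land in the same queues as the team’s own work items, which is why feedback filed through Connect is not lost.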

Progress against Goals

We tracked two metrics for Connect for VSTS: the number of issues that we “answered” within seven days and the percentage of customer-reported issues that we fixed. We did not have specific goals for either of these metrics.

Unfortunately, we didn’t choose the metrics well. With the data currently available from Connect, we cannot easily tell whether an issue has been “answered.” We can tell the last date (not time) on which a customer or a product team member touched an issue. But with only a date, we cannot tell who touched the issue most recently without laboriously reading every reported issue.
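
To make the gap concrete, the following hypothetical Python sketch shows what computing our “answered within seven days” metric would require: a per-issue history of who touched the issue and when. The event structure is invented for the example; as noted above, the Connect data we could get exposed only a last-touched date, with neither the time nor the actor.

    # Hypothetical sketch: the "answered within seven days" metric is
    # straightforward to compute *if* each issue carries a history of
    # (timestamp, actor) events. The Connect data described above
    # exposed only a last-touched date, so this cannot be run against it.
    from datetime import datetime, timedelta

    def answered_within_seven_days(events):
        """events: list of (timestamp, actor) tuples, where actor is
        "customer" or "team". The first event is the customer's report."""
        events = sorted(events)
        opened = events[0][0]
        deadline = opened + timedelta(days=7)
        # Any team touch after the report but within the window counts.
        return any(actor == "team" and opened < ts <= deadline
                   for ts, actor in events[1:])

    history = [
        (datetime(2007, 3, 1, 9, 0), "customer"),   # issue reported
        (datetime(2007, 3, 5, 14, 30), "team"),     # team response, day 4
    ]
    print(answered_within_seven_days(history))      # -> True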

If our data is correct (and given the problems with the “answered” metric, you can understand my skepticism), we fixed about 72% of all bugs reported on the Connect site. Over the year, customers reported 697 issues through Connect, so we fixed about 500 of them.

Other Takeaways

For Connect, the major takeaway is that we really need to revisit what we are tracking and how we are tracking it. Ownership of the Connect initiative on the VSTS Community Council has recently changed, so we hope that the new owner will significantly clarify goals and results in this area.

Supporting the Community Initiatives

To reiterate, Connect adds the most value when the community provides feedback and can track what happens to their bugs and suggestions. Although we also welcome feedback through e-mail, we really prefer that customers report bugs through Connect. Those issues end up in the same work-item tracking database that the team uses for all of its other work, so they are not lost. And from my experience in product-team triage (where bugs are prioritized), bugs that customers report tend to receive a high priority. One of our key triage criteria is the customer impact of a bug or suggestion, and that impact is clear when customers report issues directly.

Final Thoughts

Connect has given us some very good feedback, in spite of a cumbersome interface and difficulties in retrieving metrics about that feedback. Our major improvements should be to clarify what we track and how we track it, and to work with the owners of the Connect site to make it easier for customers to provide feedback.

Looking Ahead

In the fourth and final post, I’ll look at conferences, the MVP program, and blogs. I hope that you are enjoying this look into how we view, measure, and adjust our interactions with the VSTS community.

 

-Steven
