SDRs - getting customer feedback early and often

For three days last week we spent time with 30 customers, representing a number of companies from around the world, sharing with them some ideas we have about the future of VSTS.  We typically conduct a number of these Software Design Reviews (SDRs) during the lifecycle of the product to get customer feedback early and often and to ensure that we're hitting the sweet spot within our particular market.  While we certainly get feedback through other means throughout the lifecycle, there's something very powerful about getting face-to-face with customers and hearing direct feedback, both positive and negative, in real time.  We select customers in various stages of VSTS adoption, from companies of various sizes, from large enterprises to small consultancies and those in between.  We also try to get a good mix of job roles so that we're talking to managers, developers, testers, architects, consultants, etc.  It's quite a mix of perspectives, but with a product as large and complex as VSTS, it's necessary.

We started the session this time by asking customers what their major pain points were with regard to the day-to-day software development lifecycle.  This helped us understand where our opportunities for improvement or additional investment lie.  It was especially interesting when the attendees started to 'riff' off of each other, as we could see connections and patterns between the various areas of concern.

From there, we reviewed an end-to-end scenario which showed a possible solution to what we believe is a common set of software engineering challenges.  The scenario focuses on the flow of information and work through a team.  We didn't attempt to show every user gesture but instead the "key frames" which highlighted the major points of integration between our various components.  We want these sessions to be very interactive, so we're constantly getting interrupted by "what if...?" and "can I instead...?" sorts of questions, which helps us understand how well we've connected with the customers' reality and the appropriateness of our solution.  This also helps frame the in-depth discussions that happen during the remainder of the SDR.

In preparation for the SDR, we asked each attendee to rank a series of "value propositions" both in terms of their importance to the attendee's business as well as their satisfaction with their current solution.  We also asked them to indicate, out of a limited budget of $100, how much they'd spend on each value prop.  We collected the homework, collated the results and presented them back to the attendees at this point in the SDR.  We focused on those that rose to the top of the list based on these factors and asked the attendees to help us interpret the results.  We also highlighted those value props that we had previously rated high but that didn't show up in the customers' top 10.  It's the discussion of these results that's the most interesting to me, as it helps us understand more about the top issues facing these various customers as well as any erroneous assumptions we've made.
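To make the collation step concrete, here's a minimal sketch of how homework results like these might be tallied.  The value-prop names, the 1-5 rating scales, and the idea of ranking by total budget with the importance/satisfaction gap as a tiebreaker are all my assumptions for illustration, not the team's actual method.

```python
from collections import defaultdict

# Hypothetical responses: each attendee rates a value prop's importance (1-5),
# their satisfaction with their current solution (1-5), and allocates part
# of a $100 budget to it.  All data here is invented.
responses = [
    {"prop": "Work item tracking", "importance": 5, "satisfaction": 2, "budget": 30},
    {"prop": "Work item tracking", "importance": 4, "satisfaction": 3, "budget": 20},
    {"prop": "Build automation",   "importance": 3, "satisfaction": 4, "budget": 10},
]

totals = defaultdict(lambda: {"gap": 0, "budget": 0, "n": 0})
for r in responses:
    t = totals[r["prop"]]
    # A large importance-minus-satisfaction gap flags underserved needs.
    t["gap"] += r["importance"] - r["satisfaction"]
    t["budget"] += r["budget"]
    t["n"] += 1

# Rank primarily by total budget spent, then by the satisfaction gap.
ranked = sorted(totals.items(),
                key=lambda kv: (kv[1]["budget"], kv[1]["gap"]),
                reverse=True)
for prop, t in ranked:
    print(f"{prop}: ${t['budget']} total, avg gap {t['gap'] / t['n']:.1f}")
```

A tally like this surfaces the same two signals the SDR discussion focuses on: where customers put their money, and where importance outruns satisfaction.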

Over the next couple of days, we had each of our product teams present more detailed information about their thoughts for future versions.  Through these conversations, we worked very hard to balance talking with listening so that we got our ideas on the table and then heard how people reacted to them.  In some cases, all we had were PowerPoint slides showing some UI mock-ups, while in other cases we had prototype code running.  We tried to show just enough to get the idea across and then focused on driving an interesting discussion to learn what we're doing right and what we need to rethink.

Towards the end of the session, we reran the value prop prioritization exercise.  In this case, instead of spending $100, we gave each participant 10 green and 10 red sticky dots.  We created posters with all of the value props that we'd listed previously, as well as those added during our various discussions, and instructed attendees to place a green dot next to those value props that they would like us to invest in addressing and a red dot next to those we should stop investing in.  Because this was an immediate and very visible exercise, we were able to quickly identify the top value props and drive a discussion to ensure we really understood why those were so appealing to the crowd.
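The dot exercise reduces to a simple tally: green dots count for a value prop, red dots against it.  Here's a small sketch of that scoring; the value-prop names and votes are invented for illustration.

```python
from collections import Counter

green = Counter()  # "invest here" dots per value prop
red = Counter()    # "stop investing" dots per value prop

# Hypothetical dot placements recorded from the posters.
votes = [
    ("Branch visualization", "green"),
    ("Branch visualization", "green"),
    ("Branch visualization", "red"),
    ("Legacy wizard", "red"),
]
for prop, color in votes:
    (green if color == "green" else red)[prop] += 1

# Net score (green minus red) orders props by overall appeal.
props = set(green) | set(red)
net = {p: green[p] - red[p] for p in props}
for prop, score in sorted(net.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{prop}: +{green[prop]} / -{red[prop]} (net {score})")
```

The in-room version of this is instant, of course; a tally like this is only useful afterwards, when comparing against the opening $100 exercise.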

We video recorded each session and have posted the recordings, along with a detailed index, to our internal network, allowing our various teams to find and highlight the various nuggets of feedback.  We're currently in the process of distilling the key lessons we've learned and hope to bubble that up to our teams over the next week or so.  Undoubtedly, we'll make some adjustments to priorities and designs based on what we learned.  Of course, this process is just another data point in the many different ways that we collect customer information, but there are very few events that we run that are as rich with data and as immediate as these sorts of SDRs.