Best Practices on the patterns & practices Team

The Microsoft patterns & practices team has been around since 2000. The team builds prescriptive guidance for customers building applications on the Microsoft platform.  Its primary mission is customer success on the platform.  As part of that mission, patterns & practices delivers guidance in the form of reusable libraries, in-tool experiences, patterns, and guides.  Put another way, we deliver both code-based and content-based guidance.

I’ve been a part of the team since 2001.   Along the way, I’ve seen a lot of changes as our people, our processes, and our catalog of products have evolved.  Recently, I took a step back to collect and reflect on our best practices.  Some practices were more effective than others; we’ve added some new ones, and we’ve lost some along the way.  To help reflect on and analyze the best practices, I created a map of the key practices organized by discipline.  In this post, I’ll share the map (note that it’s a work in progress).  Special thanks to Ed Jezierski, Michael Kropp, Per Vonge Nielsen, Shaun Hayes, and Tom Hollander (all former patterns & practices team members) for their contributions and insights.

Best Practices by Discipline
The following list maps the key practices used by the patterns & practices team over the years, organized by discipline.

Management Team
  • Milestone Reviews
  • Product Portfolio (correlated with customer & business challenges/opportunities)
  • Team development  (leadership skills, communication skills, … etc.)
  • Backlog
  • Connection with customers and partners
  • Fireside chats
  • Meeting with key stakeholders in the app plat space
  • People review
  • Scorecard management
  • Tracking overall team budget
  • Weekly Status
Architecture
  • Articulate the inner (scope) and outer (context) architecture (these involve time)
  • Articulate technical principles to drive technical trade-off discussions
  • Be aware of the company’s roadmaps, and build trust to make sure they are current
  • Be familiar with competitive technologies
  • Customer connection
  • The group’s technical strategy and product model
  • Know actionable industry trends
  • Overall design with functional breakdown
  • Relationships with key influencers in the product groups
  • Spikes / explorations, including new approaches (technology and process)
  • Technical challenges
Development Team
  • Ship running code / guidance at the end of each iteration
  • User Stories
  • XP / Scrum with test-driven development
  • Continuous build and integration
  • Iterations
  • Retrospectives
Product Management
  • Asset Model
  • Customer Surveys (feature voting, exit polls)
  • Standardized product model (app blocks, factories, guides, etc.)
  • Blogging throughout project (planning, development, release)
  • Case Studies
  • Community Lead
  • Customer Advisory Board
  • Customer Proof Points
  • Own Vision / Scope
  • Portfolio Planning
  • Project dashboard
Program Management
  • Customer Connected Engineering (CCE)
  • Fix-time, Flex Scope
  • Scenarios / User Stories
  • 5 customers stand behind it
  • AAD Sessions (Accelerated Analysis and Design)
  • Backlog
  • Exec Sponsor
  • Product owner from M0 (Milestone 0)
  • Quality over scope
  • Scorecards
Release
  • Release Checklist
  • Release Mail
Test Team
  • Automated tests
  • Focused on overall quality (functionality is tested by dev)
User Experience Team
  • Authoring Guide (Key practices for authors)
  • Content Spec (Content scenarios and outline)
  • Doc Tool (Template for standardizing content formatting)

Some practices are self-explanatory, while the names of others might not be.  For example, “Fireside chat” is the name of our monthly team meeting, an informal gathering with open dialogue.   I may drill into some of these practices in future posts, if there’s interest and there are key insights to share.

Comments (6)

  1. anutthara says:

    JD – could you talk a bit more on the "5 customers stand behind you" please? Of course, the title is self-explanatory, but I wanted to hear more about "why do this?", "what are the key things to look for while choosing the 5 customers", "what kind of engagements worked", "how did you evolve the number to be 5" and some more…

  2. J.D. Meier says:

    Hey Anu

    This is the ultimate test.  We’ve found that if 5 customers really stand behind your solution, meaning they would justify and defend it, then you’re on a good trajectory.  This is not the same as having 5 customers merely use it or not complain.  It’s that they fully believe in your solution and stand behind it.

    To do so, we find the common scenarios and the common requirements (user, business, and tech.)   Nailing this sweet spot is how we get 5 customers to stand behind it.  We iteratively test our solution with customers and respond to feedback.  

    To find the right customers, we leverage customers that are actually working on the problem (they share the pain.)  We prioritize around customers that have to solve the problem multiple times (such as solution integrators.)  This helps us avoid chasing long-tail problems and instead find a common set of problems that keep showing up for many customers.

    The 5 customers are actually a proxy for many more.  When 5 stand behind you fully, adoption grows exponentially.  It also means that, chances are, you’ve found some of the key blockers for people and worked out the kinks.

    Having 5 makes it real and makes it possible to go deep.  We can engage more frequently and more deeply with a smaller set than we can with our extended set.

    The irony is how easy it sounds, until you test somebody:

    – Do you have five customers that stand behind that solution?

  3. Phil says:

    This is an interesting summary of the entire software development team. I do not recall ever seeing every entity placed on the same page. It would be nice if another article talked about the interactions between the teams. This is important to know so as to avoid becoming so bureaucratic that decisions need to pass through too many layers prior to action, yet have enough structure to prevent the wrong layer from making important business decisions.

  4. J.D. Meier says:

    @ Phil — It would be good to have a more in-depth article on the interactions, but for now, I’ll summarize.  Our team has stayed relatively flat and simple.  The main interactions have been around the business plans, the portfolio, and the programs/projects.  The business plans identify the problem, the customer, and the goals/metrics for success.  The portfolio is a catalog of our assets, which we use to analyze our investments.  The programs and projects are how we execute and make impact against our goals/objectives, while building our portfolio of assets (code and content.)  The disciplines provide strengths and focus: the management team clarifies the business objectives and measures, PMs lead the execution, product management does the portfolio planning and marketing, … etc.

  5. anutthara says:

    Thanks JD – that is really interesting. It’s easy to get 5 customers to use it, but creating 5 raving fans (the people who will stand behind your solution) is certainly harder and more fruitful. I’ll probably do a blog post on my experiences with that so far.

    On another note, do you think you will do a more detailed post on the AAD sessions and scorecards with examples? That’ll be really cool. In fact, I’ll set up a 1:1 with you, if you are game, to go over the entire PM best practices set you have posted – the list is just so compelling…

  6. J.D. Meier says:

    @ Anu — I’m exploring ways to boil down the lessons learned in a compact way.  I might need to create some short eBooks to do the topics justice.

    I’m game.

    Meanwhile, here are some related posts on how to be an effective PM that I think you’ll enjoy:

    * Lessons Learned in patterns & practices –

    * PM Skills for Life –

    * Proven Practices for Individual Contributors –

    * 7 Habits of Highly Effective Program Managers –
