Scotty, spec me up!

Those of us lucky enough to have a program manager know that, while PMs are a big help, they certainly aren't a magic bullet. Some PMs, in fact, are more hindrance than help. When a spec never gets past draft status (or, even worse, is in a perpetual state of One Page-ness), it's nothing more than a vague thought that's only useful for... well, nothing, really.

If you're one of the unlucky PM-less masses, take heart: you can still get all the advantages of a spec. You just have to write it yourself.

Specifications are a must when you're coordinating work across a largish product team or with external teams, but they're useful for smaller teams as well -- even one-person shops. I write games and utilities (mostly for my own amusement, but a few are posted at https://www.humbugreality.com) in the spare time I don't have. I've found that writing a spec before I start designing and coding helps me crystallize exactly what it is I'm trying to do.

I don't think a "typical" Microsoft spec exists, but most specs contain most of the following sections in some form.  If a particular section does not apply, it is not deleted; instead, text is added explaining why it is not relevant.

[The following is culled from multiple specs from multiple teams across Microsoft.  Parts are verbatim from various spec templates; other parts are paraphrased or summarized.  I likely missed at least one section that is vitally important to someone.  Caveat emptor, caveat spector, yadda yadda yadda.]

First is the Page One section, which describes why the feature is being built.  This is the first part of the spec to be filled out.  The first round of cuts is often done based on Page One specs.  This includes:

  • Summary.  A summary of the feature, including its high-level goals. This should describe the whole scenario, regardless of which team is implementing the feature.
  • Scenarios.  A good feature always has a user scenario in mind. This might be a specific customer/site visit or an abstraction of several visits. This scenario might be part of a design sketch you worked on with product design or a low-fidelity prototype you worked on with usability.  The scenario should include the user goal, the context, and the steps needed to achieve the goal. When the feature development is complete, one should be able to successfully recreate the scenarios described here. Make sure all of the features or services being developed are accounted for in the scenarios.  Some or all of these scenarios are often used as criteria for exiting the milestone and/or acceptance tests for the feature.
  • Design goals and justification for the feature.  What is the point of this feature?  How will we know if it is successful?  Why is it being created and to whom will it be exposed?   What research has been done to confirm this is necessary, new, and being implemented in the right location?
  • Dependencies and partners.  Describe and detail any components or features this feature depends on, or that are dependent on this feature.

Next is the Detailed Design section, which goes into great detail about every last corner of the feature. 

  • General Design.  Provide a detailed description of the design sufficient for implementing, testing, and documenting the feature. This section may include screen shots, registry keys, file format issues, etc.  Items to cover include:
      1.  What is the good, better, best possible experience for the end user?
      2.  How will the user discover this feature? Are the access points obvious and consistent with those for similar features?
      3.  A workflow (UI) map that diagrams how users will use this feature is included here.  Questions such as "Can you clearly identify the user tasks?" and "What concepts/definitions should the user understand to be successful with this feature?" are answered here.
      4.  Details regarding how this feature is expected to be used with other features in this or other products, and whether this feature works differently than it did in the previous version of the product, go here.
  • Security.  Has threat modeling been conducted for this API? If it needs to satisfy STRIDE (https://www.microsoft.com/usa/presentations/NET_Security_in_Practice.ppt), does it?
  • Performance.  Every feature and API has performance requirements.  What are the performance guidelines (e.g., memory, processor speed, bandwidth)?  Are features being implemented that will adversely affect performance, either in the area being developed or in other areas of the product?
  • Testability.  Test issues need to be designed and budgeted for up front.  Don't create a test design here -- just ensure that you address any big-ticket, unique test requirements early on.
  • Worldwide Requirements.  Application components must function correctly on every localized OS the application runs on.  Issues covered here include measures for handling worldwide script support (e.g., right-to-left, DBCS, vertical text, extended characters, flexibility in switching between scripts, Unicode) and date/time dependencies based on regional settings (static vs. dynamic dependence on the user locale of the system); see the locale sketch after this list.
  • Usability.  Every feature should be usability tested.  "Hallway" tests (i.e., grab people from the hallway for impromptu usability tests) can be very helpful, but some features require more formal methods.  Include a summary of and a link to the test results.
  • Accessibility.  List what needs to be done to satisfy accessibility requirements. 
  • User Assistance.  State which of the various sets of documentation (e.g., SDK Reference, User Guide) will contain information on this feature.  If a particular doc set will not mention this feature, state why not.
  • Setup and Deployment.  This section includes such items as a list of files being added/changed, and details regarding upgrade scenarios.
  • Project Location.  Where in version control will the files be?
  • Check-in Tests.  What scripts or other tests will exist to verify the feature's functionality?  (A minimal sketch of one such test appears after this list.)
  • Cuts.  Describe any functionality cut or postponed from this release.
  • Related Links.  Link to:
      1.  Each milestone schedule for this feature. 
      2.  The implementation details document(s) for this feature. 
      3.  The Test Design Spec for this feature. 
      4.  Any relevant specs for components this feature depends on, as well as specs for any components dependent on this feature.
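
To make the Worldwide Requirements point about static vs. dynamic locale dependence a bit more concrete, here's a minimal sketch in Python (purely illustrative -- the spec itself would just state which behavior the feature promises):

      import locale
      from datetime import datetime

      # Static dependence: the locale-sensitive value is computed once, at startup.
      # If the user later switches locales, they keep seeing this stale format.
      locale.setlocale(locale.LC_TIME, "")          # pick up the user's current locale
      STARTUP_DATE = datetime.now().strftime("%x")  # locale's short date format

      def report_date_static():
          return STARTUP_DATE

      # Dynamic dependence: re-read the locale and reformat on every call,
      # so a locale change is picked up immediately.
      def report_date_dynamic():
          locale.setlocale(locale.LC_TIME, "")
          return datetime.now().strftime("%x")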
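
Similarly, a check-in test doesn't have to be fancy; a quick smoke test that exercises the feature's main path is plenty. Here's a minimal sketch, again in Python, where the highscore module and its functions are made up purely for illustration:

      import unittest
      from highscore import save_score, top_scores  # hypothetical feature under test

      class HighScoreCheckinTests(unittest.TestCase):
          def test_saved_score_appears_in_top_scores(self):
              # Main-path smoke test: a saved score shows up in the top-ten list.
              save_score("alice", 1200)
              self.assertIn(("alice", 1200), top_scores(limit=10))

      if __name__ == "__main__":
          unittest.main()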

Overwhelmed?  This is a lot of text to write, but taking the time to do so -- and review it, and revise it, and review the revisions, and... -- makes an incredible difference in the quality of a feature.  Some teams organize the specification exactly like this, with full details for the feature in each section.  Other teams organize the spec around the areas and subareas of the feature, detailing each of these items just for that area or subarea.  Experiment to determine which works best for you.

*** Comments, questions, feedback?   Or just want a job on a team that mostly does it right? Contact me at michhu at microsoft dot com. I need a tester, and my team needs developers, program managers, and a product manager. Great coding skills required for all positions.