How do I write a test spec?

Jeke Green asks me via the email link:

"Hi Anutthara, I enjoyed reading your Spec Reviews for world class testers post and esp liked the world class testers part :-) I am new to testing and want to understand how to write a "world class" test plan firstly. Is there any resource that you would recommend for that?"

Hmm - let's start with a test spec instead of a test plan. A test spec is a much broader document than your detailed test plan. Test cases are generated from this master test spec using whatever methodology the tester wants to employ, and the exercise culminates in the test plan.

Bread and butter – This is the basis of most of the testing you will be doing for the rest of your cycle. Spend enough time on it – think your spec through and ensure you have a complete, well-structured document.
Cut 'em down to size – Break your components into bite-sized chunks so that you can get a handle on writing tests for them. For instance, if you are testing a feature like, say, the WinForms designer, break it down into its sub-features like controls, the rendering area, code-behind, etc.
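As a rough sketch of what that breakdown might look like once it reaches code – the feature and sub-feature names below are just the WinForms designer example again, not a real schema:

    # Hypothetical outline: each sub-feature becomes its own section of the
    # test spec, and later its own test module in the test plan.
    TEST_SPEC_OUTLINE = {
        "WinForms designer": {
            "controls": ["add", "delete", "resize", "tab order"],
            "rendering area": ["repaint", "zoom", "grid snapping"],
            "code-behind": ["event wiring", "round-tripping edits"],
        },
    }

    # tests/designer/test_controls.py, tests/designer/test_rendering_area.py,
    # and tests/designer/test_code_behind.py can then mirror this structure.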
That's not my scope – Scoping your feature out in detail is a vital step toward writing a good test spec. Testing is a black hole with no clear boundaries unless you define them. Think about the surface area that you want to test in the product. Your non-goals are as important to document as your goals. Clearly write down your basic assumptions about the feature and the environment under which you will be testing it. For instance, if your feature to test is a Visual Studio plug-in, you will have to call out the assumptions you make in your testing – support is restricted to VS 8.0 and above only, and the plug-in will be tested for a given set of functionality. You may not be testing the basic menu options already present in VS for functionality broken by the plug-in install, since you have good reason to believe your plug-in will not mess with intrinsic functionality. Place pegs in the ground to demarcate a clearly marked area within which you are going to define your tests.
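To make goals, non-goals, and assumptions concrete, here is a minimal sketch of the plug-in example above expressed as data – the field names are mine and purely illustrative; a plain prose section in the spec works just as well:

    # Hypothetical scope block for the VS plug-in example.
    SCOPE = {
        "goals": ["test the plug-in's advertised functionality"],
        "non_goals": ["re-testing intrinsic VS menu options"],
        "assumptions": ["plug-in does not alter intrinsic VS functionality"],
        "environments": ["VS 8.0 and above"],
    }

Writing it down this explicitly, in whatever format, is what makes "out of scope" defensible later.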
Horizontally challenged? – Horizontals like i18n, security, accessibility, performance, and usability all need consideration when writing your test plan. You may not be in a position to start generating test cases for some of these straight away, due to sheer lack of detail or clarity in the spec on certain UI elements, but you will have to have placeholders for such cases at the very least. For instance, if you are testing a test case management tool, it is vital to define the minimum acceptable performance for retrieving a given number of test cases, and to think of cases where the test cases themselves may not be in English.
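As a sketch of what such placeholders can look like in practice, here is a pytest-style pair for the test case management tool example – fetch_test_cases is a stub standing in for whatever API the product actually exposes, and the 5-second budget is an assumed number that really belongs in the spec:

    import time
    import pytest

    def fetch_test_cases(count=0, title=None):
        """Stub for the product API under test; replace with the real call."""
        return [f"case {i}" for i in range(count)] if count else []

    def test_retrieve_10000_cases_within_budget():
        # Performance horizontal: a floor on retrieval time for a known load.
        start = time.perf_counter()
        cases = fetch_test_cases(count=10_000)
        assert len(cases) == 10_000
        assert time.perf_counter() - start < 5.0

    @pytest.mark.skip(reason="placeholder until i18n details land in the spec")
    def test_retrieve_cases_with_non_english_titles():
        # i18n horizontal: non-English titles must round-trip intact.
        assert fetch_test_cases(title="テスト ケース") != []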
How many friends does your component have? – Your team's test plan needs to be like a tent that houses the entire product. Miss stitching the pieces together and you have a gaping hole in your tent, and test holes are never pretty. List all the other features that interact closely with your feature and define integration tests on those. Partner with those features' test owners to come up with a comprehensive integration plan for each interface. If your PM has missed defining integration scenarios, you know what to do. :) Write partner acceptance tests for features where your feature has to interact with an external product; this also helps in validating partner drops.
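A partner acceptance test can be as thin as a smoke suite you run against every partner drop before deeper integration testing begins. A minimal sketch, with load_partner_component as a hypothetical stand-in for however the partner's bits actually get loaded:

    def load_partner_component():
        """Stub: replace with the real mechanism for loading the partner drop."""
        class Partner:
            def connect(self): return "handle"
            def query(self, handle, msg): return "pong" if msg == "ping" else None
            def close(self, handle): pass
        return Partner()

    def test_partner_drop_exposes_required_interface():
        # Cheap structural check: the interface we integrate against exists.
        partner = load_partner_component()
        for member in ("connect", "query", "close"):
            assert hasattr(partner, member)

    def test_round_trip_through_partner_interface():
        # One end-to-end call through the seam between the two features.
        partner = load_partner_component()
        handle = partner.connect()
        assert partner.query(handle, "ping") == "pong"
        partner.close(handle)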
Your scenarios, my scenarios – Use the user scenarios defined in the spec to drive your test plan. Your overall goodness-factor scenarios have to resonate closely with the user scenarios. Put yourself in the customer's shoes to get an appropriate version of each scenario to test. Your test case priorities will derive from the relative importance of the scenario to the customer.
"Hunt the lead" campaign – Most importantly, you need to get your test plan reviewed by someone who has had prior experience with writing quality test plans. If you are at Microsoft, your lead certainly fits that bill :) Hound her until she reviews your test plan and incorporate relevant comments. Review is a good chance to take a good re-look at your test plan.

Hmm - where is the world class tester again? Obviously she has her own style and will do lots of different stuff:

Stuff we love to hate – Dysfunctional concurrency scenarios, weird error messages, screwed-up multi-client scenarios, client-server time zone mismatches, system deadlocks and hangs, data corruption from wrongly formatted input… get the drift? These are the annoying little things that get on the user's nerves. Include tests related to these in your test spec. In cases where you simply cannot afford that level of detail in your test spec, at least make an aside note of these test cases for future reference.
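For example, the client-server time zone issue often boils down to checks like this one – fully self-contained, standard library only:

    from datetime import datetime, timezone, timedelta

    def test_timestamps_compare_equal_across_time_zones():
        # The same instant recorded on a UTC server and an IST (+05:30) client
        # must compare equal; only the wall-clock rendering differs.
        server_time = datetime(2020, 1, 1, 12, 0, tzinfo=timezone.utc)
        ist = timezone(timedelta(hours=5, minutes=30))
        client_time = server_time.astimezone(ist)
        assert client_time == server_time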
I am not a positive person – Think through all possible negative scenarios. These may be low in priority but are usually high in severity. This is an extension of the "What if" scenario in the spec review. The negative tests will form a bigger set than the positive test cases, but they are necessary to include in the test plan.
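A sketch of how that larger negative set can stay manageable, using pytest's parametrize – parse_date is a stub standing in for whatever input path your feature exposes:

    from datetime import datetime
    import pytest

    def parse_date(text):
        """Stub for the feature's input parser; raises ValueError on bad input."""
        return datetime.strptime(text, "%Y-%m-%d")

    @pytest.mark.parametrize("bad_input", ["", "not a date", "2020-13-40", "2020/01/01"])
    def test_rejects_malformed_dates(bad_input):
        # Low priority, high severity: bad input must fail cleanly, never corrupt data.
        with pytest.raises(ValueError):
            parse_date(bad_input)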
Many pairs of eyes – Get your test plan peer reviewed. There may be aspects of the feature that you failed to include in the test plan, and SMEs on your team will find holes in the horizontal testing efficiently. A best practice is to schedule a test review with the entire QA team plus the relevant dev and PM to take a look at the test plan draft that you have written. That way you also ensure that the entire team is aware, to a certain extent, of all the features being built into the product.
Not too late to feel testy – As you write your test plan, you will encounter compelling needs to add testability into the product in places where it is obviously missing. If you missed detecting the lack of testability during the spec review, now is the right time to raise it.
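In practice, "adding testability" is often just a small, read-only hook that lets the test harness observe state it would otherwise have to scrape from the UI. A purely hypothetical sketch:

    class RenderingArea:
        """Hypothetical product class; the property below is the testability hook."""
        def __init__(self):
            self._last_paint_ms = None

        def paint(self):
            # ... real rendering work would happen here ...
            self._last_paint_ms = 4.2

        @property
        def last_paint_ms(self):
            # Read-only access for tests; no behaviour change for the product.
            return self._last_paint_ms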
Bound by contract! – When your feature interacts with an external product's features, an integration test plan is required. Many times, you make assumptions about the quality of the interacting feature. You then need a test contract that defines the minimum amount of testing that must be done on the external feature for your feature to work properly in the integration scenarios. This document serves as a formal bridge between the two teams in terms of who tests what. The contract needs to be signed by testers from both teams and updated as necessary.
Keep cycling – it's good for health – The test specification and the test plan are both highly dynamic documents. The more iterations you do over your documents, the more complete and pertinent they will be. Keep updating the test spec as the product spec changes, as your understanding of the feature deepens, as review comments come in, and as the testing scope grows.
Hey, wait up! – The product spec needs to be kept in sync with the test spec. Updating the product spec is undoubtedly the PM's responsibility, but the tester needs to drive it in as many ways as possible. If the spec is left to drift, then by the time implementation is complete it will have no bearing on the end product – a dangerous thing, since the tester's sole reference point is invalidated. So, each time you update the test spec for a change in product behaviour or an expansion in the scope of a feature, ensure your PM updates the product spec in sync too.
Test your test spec – The test spec needs to be checked against a checklist to ensure that all aspects of the feature have been covered. Michael Hunter, my favorite testing guru, has posted an awesome checklist here that should cover you well.

Disclaimer: This is NOT a guide to writing a great feature test plan – that would require several other entries by itself. These are just guidelines to ensure you get the initial test specification right. We'll talk about test plans in later posts.