My team is still bogged down in spec work, but we're getting close to being done. Over the past few weeks it's gotten pretty old. After all, this stuff isn't really all that fun: a lot of reading, a lot of nit-picky comments, a lot of discussions (sometimes arguments) in meetings. From time to time I wonder why I get so involved. I'm a tester; specs aren't my job, right?
Sure, they're not my primary focus, but specs are very important to me in two distinct ways. The first is simple: commenting on and contributing to the spec process is my best chance to have input into what we build. Later in the process, when test traditionally gets involved, the direction is set and won't be changed. But here, early, we're still figuring out what the program is going to do, and I get to add my input. That adds a lot of job satisfaction for me. Think about it: I get to have a say in how we build software that hundreds of millions of people will use. Thankfully it's not a very large say, and it's tempered with lots and lots of other opinions; otherwise I fear what kind of silliness we would build. The point is that by being involved early in the spec process I gain a personal investment in the product. We all know that in this industry test often isn't given the same respect as development or management teams (yes, this happens at Microsoft too, but trust me when I say it's much, much better here), and by involving myself in the design phase I help close that gap.
But being involved in this phase is also important from a testing perspective. Here's the scenario: down the line, development will be cranking away on some feature; then you'll load that code up and start testing it. You'll find something you think they did wrong and go talk to them, and they'll say they did it right. You're now at an impasse. I guarantee this will happen at least once in any product cycle you're ever involved in.
The easiest way to solve this problem is to go look at the spec. If the spec is sufficiently clear and detailed (as it should be for any user-exposed features), you can use it to settle the disagreement. If the spec isn't sufficiently clear and detailed, things will get ugly: you'll need some meetings, some mediation by your management teams, and all kinds of other badness. And like everything in software, solving these problems earlier (like when the spec is written) is much cheaper than solving them later. Plus, with good work on the spec up front you can prevent many of these misunderstandings from coming up in the first place.
Developers should really be holding the line on this while the specs are being written. They should be in there demanding more detail. But in my experience they don't, because in their heads they're already working on how to implement the feature. It gets hard to see the different ways something in the spec could be interpreted when you've already interpreted it yourself and started planning an implementation. But it turns out testers are excellent at interpreting things in all kinds of different ways; it's one of the things that makes us good at what we do.
So as a tester, my job in the spec-writing process is to keep pushing for clarity and specifics: keep asking hard questions to make sure everyone is interpreting things the same way and there's no room for ambiguity. The more specific the spec, the easier our lives are down the road, and the lower our overall cost. It's just a hard process, so when I'm in it I have to keep reminding myself why it's needed.
With that, it's time to get back at it.