In Ring One, here, we have The Cowboys. They don't need specs or a test plan, let alone daily builds or release criteria. They don't even bother finding out what the user needs to do, really, but instead prefer to make things up as they go along. Since the applications aren't really planned but rather accrete over time, and The Cowboys always have some new bit of functionality they need to add in order to keep their users happy, their code quickly turns into a morass of impenetrable mystery that generates two regressions for every bit of new functionality, and four regressions for every bug fix.
In Ring Two, there, we have The Craftsmen. The Craftsmen always start by talking with the customer to get past what the customer thinks they need to do and find out what tasks the customer is really trying to accomplish. The Craftsmen plan, design, test, and code the application in multiple iterations, constantly checking back with the customer to verify that what they are building is what the customer wants. The Craftsmen integrate their changes into the main source tree on a regular basis, and some of them are agitating to start building the app twice a day rather than just once. They are constantly on the lookout for code that doesn't quite work right anymore, and their massive suite of tests protects them against regressions.
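The regression suite the Craftsmen lean on doesn't have to be fancy to earn its keep. Here's a miniature sketch of the idea; the `slugify` function and its tests are hypothetical examples, not code from any team described here:

```python
import re
import unittest

def slugify(title):
    """Turn a document title into a URL-friendly slug."""
    slug = re.sub(r"[^a-z0-9]+", "-", title.lower())
    return slug.strip("-")

class SlugifyRegressionTests(unittest.TestCase):
    # Each test pins down behavior a past bug once broke, so any
    # change that reintroduces that bug fails loudly and immediately.
    def test_spaces_become_hyphens(self):
        self.assertEqual(slugify("Hello World"), "hello-world")

    def test_punctuation_is_stripped(self):
        self.assertEqual(slugify("C++ FAQ, part 2"), "c-faq-part-2")

    def test_no_leading_or_trailing_hyphens(self):
        self.assertEqual(slugify("  --Trimmed--  "), "trimmed")

if __name__ == "__main__":
    unittest.main()
```

Run on every checkin, a suite like this is what lets the Craftsmen add functionality without generating the two-regressions-per-feature tax the Cowboys pay.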
Most people would agree that The Craftsmen create software The Way It Should Be Created. Testers whose teams resemble The Cowboys bear the brunt of this bad behavior; smart testers in this situation do everything they can to help their team become more like The Craftsmen. No process is ever perfect, so smart testers continue to pursue improvements in the product development process regardless of how closely their team resembles The Craftsmen.
So why is it that when testers go to build their own applications and frameworks to support their testing they more often than not turn into The Cowboys? All that lovely process that helps ensure the right product is built is thrown out the window, and whaddayaknow but a steaming heap of unmaintainable code that doesn't quite solve the problem results. This is just plain wrong.
My team has a goal of producing higher-quality code than our developers do. We iterate on user scenarios, design, implementation, and testing -- not just internally but with our customers (Dev and Test) as well. Ditto for test cases. *Every* checkin is code reviewed. We build several times an hour. We aren't quite there yet, but we're getting close.
If your test team isn't already composed of Craftsmen, what are you doing to help them along?
*** Comments, questions, feedback? Or just want a fun job on a great team? </g> Send two coding samples and an explanation of why you chose them, and of course your resume, to me at michhu at microsoft dot com. I need testers, and my team needs a data binding developer, program managers, and a product manager. Great coding skills required for all positions.