I am always frustrated, and somewhat sad, when I hear testers complain that they are not treated fairly, that they are not respected, or that their development peers look down on them. I've been sitting on this post for many months, wondering whether to publish it, when this thread popped up over on the JoS boards.
On one hand, I am always happy to offer words of encouragement and advice on how to rise out of the situation or at least to make the best of it. On the other hand, part of me sometimes wants to just say "stop your whining. If you don't like it that much, quit and find someplace to work where you will be treated fairly and be respected!"
If you want to find a good testing job, you just need to ask a few questions. So, with all due respect and references to Joel and The Joel Test, I give you "The Alan Test".
The Alan Test - aka "The Test Test"
- Are testers influential from day one of the project?
- Does the test team own their own schedule?
- Does the test manager report to a general manager (and not to development)?
- Are career paths for testers and developers equal?
- Do the developers value testers?
- Do testers have the same working conditions and resources as development?
- Do testers use good test case management and source control tools?
- Are both tests and product code built daily?
- Do testers have the same coding guidelines and rules as developers?
- Are automated tests and manual tests valued appropriately?
- Is there a culture of quality?
Are testers influential from day one of the project?
Notice that I used the word influential, not involved or even hired. From day one of the project, testers should be reviewing specs, giving feedback on the schedule, and driving testability. The full test team doesn't need to be on staff from day one, but someone should be there setting the quality bar early. If testers are not involved (or hired) before coding begins, the organization obviously doesn't value test (or quality, for that matter).
Does the test team own their own schedule?
The test team should own their schedule and have influence on the overall product schedule. A one-week code complete slip cannot be "absorbed" in the test schedule. If the test team determines they need n days or weeks after code complete to finish testing, they need n days or weeks. Period. If the test schedule is also known as "buffer for the dev team", the organization doesn't recognize the value of test.
Does the test manager report to a general manager (and not to development)?
Put another way, the test manager should be a peer of the development manager. If the test manager reports to the development manager, development needs drive test, and test has a lesser voice in the product.
Are career paths for testers and developers equal?
If test and development are indeed peers, they should have equal career paths. At Microsoft, we have "levels" that line up with promotions and career paths, and developers and testers have equal opportunity for promotion. Another way to ask this question could be "would you ever pay a tester as much as you pay your top developer?" Don't fall for the paper trick on this point - as in "On paper, testers can grow as much as developers - look, we have documents". Ask for examples: "How many developers and testers are at your most senior levels in your org?" If the organization isn't willing to promote testers to senior positions, they don't value test.
Do the developers value testers?
Ask if the developers see test as an ally in creating quality software, or as a gang of hooligans making their lives hard (or as a bunch of robots pushing buttons). Testers don't exist to make developers cringe or cry. In a good organization, developers understand that the role of test can be as much about quality assurance as it can be about quality control, and know that the test team exists so that everyone can make a higher quality product.
Do testers have the same working conditions and resources as development?
Would you want to work somewhere where developers had their own offices, dual 22-inch wide-screen monitors, and comfortable chairs, while the test team worked in the hallway sitting on milk crates?
Do testers use good test case management and source control tools?
I once tested software on a laptop, in a car on the way to drop off the master for duplication (at least I wasn't driving). I'm a fan of ad-hoc testing, but this was over the line (I don't work at that company anymore).
Testing is a creative activity, but some structure around recording test cases and related test code is critical on a professional test team. A test case management tool is necessary - as is the ability to version test cases and test code. If the test team doesn't value this, they probably don't really care that much about testing.
Are both tests and product code built daily?
Automated tests (assuming they are compiled code and not scripts) should be built at the same time, and in the same build process, as product code. This is particularly important when the test code calls APIs or other functionality in the product code, because it provides a small amount of testing at build time - mismatched function signatures and other header changes are caught the moment the build runs.
Do testers have the same coding guidelines and rules as developers?
Put another way - is test code treated the same as product code? If the test team writes unmaintainable, buggy code, you can hardly expect the development team to respect them. Test code is just as important as product code, and deserves the same care.
Are automated tests and manual tests valued appropriately?
Does management have unrealistic goals for test automation, or do they devalue all manual testing? Either extreme indicates that management doesn't understand testing well enough for you to want to work there. Ask about the product and testing goals, then ask how automated and non-automated tests support those goals. Ask for examples of tests on that team that are automated, and for examples of manual tests. A team that tries to automate too much - or too little - is a team you probably don't want to join.
Is there a culture of quality?
Finally, you need to determine whether quality is something the organization tries to test into the product, or something that drives everyone on the team. Do developers "throw code over the wall" to test, or are they embarrassed and apologetic when bugs are found? Are bugs fixed as they are found, or left to fix at the end? In order to meet the schedule, are bugs punted, or are features cut?
Eleven simple questions. Eleven questions where I would bet the majority of the answers for many testers are "no". I wouldn't work in an organization that scored less than 9. Sadly, many organizations are much, much lower.
Of course, if you know of an eleven, please let me know where to send my resume. :}