I love Microsoft®. We’ve been together happily for many years. If you’ve been in a healthy long-term relationship, then you know what this means—there are things about Microsoft that make me curse, stomp, and spit. I’ve learned to tolerate them, but they still make me cringe.
A prime example is our disrespect for critical disciplines like testing. The test discipline is one of the two largest engineering disciplines at Microsoft and one of three key engineering triad disciplines. How can we not grant testers the same respect and opportunities we give the other two engineering triad disciplines—program managers (PMs) and developers? Perhaps our history provides the answer.
Microsoft was started by developers and run by developers for years. We’re now run by a former program manager of sorts, so PMs receive begrudging respect. Developers can’t draw and have no sense of style, so designers are becoming appreciated. Developers hate writing, so content publishing is at least considered necessary. Localization and media are magical things that just appear in the build. But test? Developers believe testing is easy, if not remedial, so developers think testers are beneath them.
Please note that I’m not giving any of these disciplines proper credit for all the work they do. I’m merely listing the superficial ways they are often viewed by developers.
It’s a different kind of flying altogether
Since developers think testing is easy compared to development, they think they can do a tester’s job. After all, isn’t that the cool, agile way? Aren’t we all just software engineers? Yeah, and everyone would get along if we just gave peace a chance. Don’t be naïve.
Developers can verify that their individual components work as specified in isolation (unit testing). They have much more trouble verifying that their components work as a system, outside of isolation and spec. Why? Tunnel vision. Developers design and write their code for a certain purpose. That’s the way developers think about it—as they should.
Real-world testing must verify that the code works appropriately when used in ways that defy all logic and purpose. To test code properly, you need to completely forget how and why it was written and instead think about how it might be used in arbitrary and even insidious ways. Developers don’t think that way—they can’t think that way and still develop with purpose. Testers do think that way. That’s why we need testers.
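To make the contrast concrete, here is a small sketch in Python. The function and every input are hypothetical, invented for illustration. The developer's unit test exercises the purpose the code was written for; the tester's cases exercise inputs that defy that purpose:

```python
def parse_port(text: str) -> int:
    """Parse a TCP port number, rejecting anything outside 1-65535."""
    # Without the isascii() check, Python's int() would happily accept
    # non-ASCII digits such as Arabic-Indic numerals -- exactly the kind
    # of boundary a purpose-driven developer never thinks to try.
    if not text.isascii() or not text.isdigit():
        raise ValueError(f"not a number: {text!r}")
    value = int(text)
    if not 1 <= value <= 65535:
        raise ValueError(f"out of range: {value}")
    return value

# Developer's unit test: the purpose the code was written for.
assert parse_port("8080") == 8080

# Tester's cases: arbitrary, even insidious, uses of the same code.
for bad in ["", "-1", "0", "65536", "80 ", "0x50", "٨٠٨٠"]:
    try:
        parse_port(bad)
        assert False, f"accepted bad input {bad!r}"
    except ValueError:
        pass  # correctly rejected
```

The developer's single assertion proves the component works as specified; the tester's list is open-ended, and growing it requires forgetting why the code exists.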
Even though the test discipline is essential for high-quality software, some groups still consider converting all their testers to developers, expecting the combined development team to write all tests—it’s a great way to get more developer headcount! The teams that actually go through with this change experience the following problems:
- They lose their test leads and leaders. The former development leads tend to lead the combined engineering teams. (I talk about why in the next section.) This relegates the former test leads to individual contributor (IC) roles, a reduction in responsibility they usually don’t take well, so they leave. The former top test ICs also tend to leave—after all, clearly their skill set is not appreciated, no matter what sparkly purple lipstick the management team might put on this pathetic pig of a plan.
- They lose their testers. After the test leads and leaders go, the former test ICs gravitate toward a development mindset. They gravitate there because testers get no respect, they are calibrated against developers even if they continue to work differently as testers, and they can see there’s no career growth in test. After all, their role models left the team. Now the only path up is through development.
- Team morale drops, especially among the testers. Losing team members and team leaders impacts morale. The most impacted are the testers still clinging to their discipline who lose their role models, their self-identity, and their solid reviews. (Even great apples get bad reviews in an orange grove.)
- Their code quality is initially higher and then gradually drops. The quality improves initially because developers suddenly realize they’ve lost their safety net and actually start writing unit tests, doing design and code reviews, and paying attention to those build warnings. (Of course, developers should have been doing this all along.) However, after a while, the system and boundary errors start creeping in and building up. No one is looking for them, so they are discovered by the wrong kind of testers—customers.
Combining development and test makes sense at the unit level (such as Test-driven Development). This practice can also work at the component level for well-understood and well-factored components on teams that also have strong QA. However, combining development and test doesn’t make sense at the system level. I talked about this at length in “Undisciplined” (chapter 4).
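At the unit level, the practice looks something like this minimal Python sketch (a hypothetical `clamp` function, not any real product code): the developer writes the failing test first, then just enough code to pass it.

```python
# Step 1 (red): state the contract as a test, before any implementation.
def test_clamp():
    assert clamp(5, 0, 10) == 5    # value already in range passes through
    assert clamp(-3, 0, 10) == 0   # value below range is pinned to the low bound
    assert clamp(99, 0, 10) == 10  # value above range is pinned to the high bound

# Step 2 (green): write just enough code to make the test pass.
def clamp(value, low, high):
    return max(low, min(value, high))

test_clamp()  # passes; a tester would still probe low > high, NaN, mixed types...
```

The point is scope: this red-green loop works beautifully for a self-contained unit, but no amount of cycling at this level finds the sneak conditions that emerge only when units interact as a system.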
I just can’t get enough
The lack of respect for testers is most apparent in leveling and career development. Depending on where they work in the company, testers are ranked one to three levels below developers and PMs. In other words, the test counterpart to a PM and developer who are all working on the same project with the same scope will tend to be a full career stage below his or her peers. Unbelievable!
“Yeah, but testing isn’t as difficult as development and program management,” says the dim developer. Really? Try it sometime. Try writing automation that works every time, even as developers alter configurations and data. Try performing penetration testing that closes gaps that foil sophisticated hacks. Try producing injection testing that discovers failure modes, system testing that finds sneak conditions, or end-to-end scenario testing that validates the billion-dollar bets we make. And I’m just scratching the surface.
“Yeah, but our test team isn’t that advanced—they don’t do all those things.” Exactly! We don’t value the test discipline enough to advance our testing capability to the same levels as PM and development.
Testing is frigging hard! We need great people to do it well, and then we need to pay them, develop them, and expect them to be world class. Our treatment of the test discipline is astonishing and pathetic. Believe it or not, it used to be even worse until senior test leaders from across the company started tracking the numbers and driving promotions.
Yes, it’s true that testers do different jobs from PMs and developers. But we can’t design and construct complex systems and think that testing those systems will be any less complex. You get what you pay for, and by spending less on testing, we create imbalance. We sacrifice quality, productivity, and efficiency as a result. The sacrifice in quality is obvious. The sacrifice in productivity and efficiency comes from incomplete and fragile testing that results in higher error rates, more rework, and higher support and sustained engineering costs.
The refrain I hear when I complain about our commitment to testing is, “Yeah, but Google and Amazon have far fewer testers.” Fewer testers doesn’t mean less testing. Amazon carefully tests the systems that matter, like billing and account management. Google does endless analysis of its search results.
As for Microsoft, we are in a broader business than Amazon or Google. We are a platform company and an enterprise company. Our customers expect more from us. Quality is a differentiator and a necessity. While we don’t need perfection in all things, we do need the right quality in the right products.
The world just wants us to fit in
It’s bad enough to give up quality, productivity, and efficiency by expecting less from our testers than we do from their PM and developer counterparts. What’s worse is that lowering expectations sends a clear message that to get ahead, a tester needs to become something else. This philosophy is nothing short of irresponsible and tragic.
- Lowering expectations for testers is irresponsible because some of our most challenging engineering problems involve testing—testing in production, testing many-core, highly parallel systems, testing services that span 1,000+ machines, testing globally distributed cloud services, testing secure and private cloud systems, testing hybrid procedural and functional languages, testing natural user interfaces, and on and on.
- Lowering expectations for testers is tragic because we send a message that testing is not a legitimate career path, when in fact it is a robust career path to the highest engineering stages for both test managers and test ICs. Instead of following this path, testers abandon their discipline to do something else, often with mixed results.
By expecting less from our testers, we are encouraging them to move away from a career they love—one that is essential to Microsoft’s success and offers tremendous opportunity—and toward a career not of their choice that may inhibit their growth. It is a travesty.
Microsoft recently announced changes to its compensation plans that increase the base salary for most engineers, including testers. That’s wonderful, and I’m grateful. But investing in our test discipline is a separate matter: it means choosing to be just as sophisticated in testing as we are in program management and development.
Executives might reasonably ask, “Why would we increase our spending in testing? What’s the return on investment?” While I believe a strong financial return exists, I’d turn that question around. If we want to save money, why don’t we decrease what we pay developers by a couple of levels? Because our quality and innovation would suffer. Why would they suffer? Because our products are platforms with wide and varied usage that require sophisticated engineering to orchestrate and improve—and test.
Tell me what it means to me
At a time when we should be investing in test, we continue to demean the discipline. We have great leadership at the highest levels within the test discipline, but far too few testers join those ranks each year. Even though test is far behind PM and development, testers receive a smaller promotion budget. (The promotion budget is proportionally equal to the other disciplines’, but that proportion is distorted by the higher salaries in PM and development.) Only a handful of test ICs and test managers reach the principal and partner stages to serve as role models.
How can we allow a critical and central engineering discipline to be so disrespected and damaged? Are we so vain or foolish as to think testers aren’t really needed or the problems aren’t really that difficult?
It’s time we put our money where our priorities are and push testing to the next stage. We hire the best—let’s challenge them accordingly. Let’s lay out the test career path all the way through to vice president and technical fellow. Let’s start aggressively recognizing the talent we have and developing the talent we need. Testing deserves our respect—our customers, our partners, and our business depend upon it.
What can you do to help if you aren’t a test or multidisciplinary leader?
- Accept and appreciate that the test mindset and skill set are different from development’s, yet the problems testers solve are equally complex and critical to our product quality.
- Write high-quality code from the beginning using design and code reviews, code analysis (like PREfast and FxCop), and thorough unit testing. All of this frees testers to focus on their unique value: providing quality assurance and exposing system issues developers wouldn’t normally detect.
- Encourage your test team to hire great full-time testers who focus on the truly challenging test problems we face—problems that, when solved, will improve the quality of our products, our testing, and our testers.
BTW, I could write a similar column on service engineers, who are even less understood than testers.