The fact that Microsoft now has an MSDN forum focused on software testing has got me thinking more about testing at Microsoft. I'm assuming the powers that be that put the forum together are looking for more transparency and dialog about how we do software testing both inside and outside of Microsoft. To that end, I'm thinking that I'll devote some amount of my blog effort to some posts about software testing at Microsoft from my point of view.
In this first post on testing at Microsoft, I'll talk a bit about the most important software testing position at Microsoft - Software Design Engineer in Test (hereafter referred to as "SDET").
(Historical Note: In the past, Microsoft had two testing positions: Software Test Engineer ("STE"), a position with less technical development skill requirements, and Software Design Engineer in Test ("SDET"), a position that had, in addition to testing skill requirements, strong requirements for development skills. A few years ago Microsoft recognized that its needs in the testing area required a stronger focus on test automation and took steps to eliminate the STE position. I don't know if it has been completely eliminated across Microsoft, but in the groups that I know about I don't see STEs anymore.)
As the name implies, testers at Microsoft also need to have development skills. This is because we expect the majority of our tests to be automated. Why the focus on automated testing at Microsoft? About 3 years ago, I wrote a blog post on that topic. Here is an excerpt from that post:
Assume your job is to test the UI of a product. This product runs on 4 different operating systems. It is also localized into 9 languages. You are required to run your tests against 2 of the languages, and foreign subsidiaries are required to run your tests against the localized version of the product that they are doing the localization work for. Those foreign subsidiaries have allocated 1/2 person to testing your product, and that person must also verify localization. During its lifecycle, your product will release 6 service packs, and testing those service packs is outsourced to a team that knows nothing about your product (or the other 8 products they test as an outsourcing partner). The operating systems it runs on will also get some number (typically 5) of service packs applied to them, 3 of which will be released before your product. Your product also ships in 6 different configurations (an enterprise edition, a professional edition, a standard edition, an academic edition, an edition that ships in books that teach folks how to program using your product, and a trial edition that expires after 90 days). Management has decreed that every test case you have must run on every configuration that the product can run on at least once before you ship.
This is my world. And now you know why I spend a lot of time thinking about UI test automation.
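To make the scale of that scenario concrete, here is a rough back-of-the-envelope sketch of the configuration matrix it describes. The numbers come straight from the excerpt; everything else (the enumeration, the variable names) is just illustrative, and it deliberately ignores service packs, which only multiply the total further:

```python
from itertools import product

# Numbers taken from the scenario above (a hypothetical product)
operating_systems = 4   # OSes the product runs on
languages = 9           # languages the product is localized into
editions = 6            # enterprise, professional, standard, academic, book, trial

# Management's decree: every test case must run at least once on every
# configuration the product can run on before shipping.
configurations = list(product(range(operating_systems),
                              range(languages),
                              range(editions)))

# 4 * 9 * 6 = 216 runs per test case, before counting any OS or
# product service-pack combinations
print(len(configurations))
```

Even this simplified matrix means each test case runs 216 times per full pass, which is why running those passes by hand quickly stops being an option.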
In a context other than Microsoft, automated testing may not be as critical, but for us it is. So we need testers who can develop automated tests.
In addition to developing automated tests, SDETs also need to be great testers. That implies critical thinking about the product and its customers. For that reason, testers need to know the domain area of the product they're testing. When I was on the Visual C++ test team, we hired developers into test because that was who our customers were. If you don't come into a test team with knowledge of the product domain, you will be expected to learn about it and develop that knowledge. I'm actually in that mode now, having just moved into the SQL Server test team. Even if you stay in a product group for some amount of time, chances are the product space will change. An ability to learn and a love of learning are very important for an SDET.
This has already turned into a pretty long post so I'll stop for now. Please leave a comment if you have any questions about the SDET position or anything else related to testing at Microsoft and I'll attempt to answer them.