How we test – a high-level overview

Hello, my name is Erik Cutts and I'm a tester (SDET) on the Business Applications team in Visual Studio. As a tester working on developer tools for several years, I have had the unique opportunity to build tools for the best (and most demanding) customers in the world: developers! I'd like to give you some insight into the day-to-day test activities that go into refining the quality of the tools you use in Visual Studio. In return, maybe you can share some thoughts on testing Office and SharePoint applications, or on testing in general. Be aware that these practices vary across Microsoft, and this post is based solely on my personal experience.

Testers are involved throughout the development cycle, from initial product design through shipping and servicing. Initially, we work with our team on product planning and design to ensure that customer scenarios are being met and that we are shipping the right set of features. We also ensure that the product is testable and scoped correctly so that we have enough time to refine the features within the resource constraints we have set.

Once the initial product design is defined, test is responsible for authoring an overall test specification. The "Test Design Specification" document is a high-level overview of how we will test the product. This test strategy describes the scope of testing, the test schedule, test configurations, external dependencies, risks and mitigations, division of labor, and the test tools and technologies we will use. It is a living document, updated throughout the development cycle as features, schedules, and customer requirements change. The document is reviewed by our peers and by management until we're all on the same page.
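
To give a feel for the shape of that document, here is a minimal, hypothetical sketch of its skeleton captured as structured data. The section names mirror the list above; the types, field names, and sample values are my own invention for illustration, not an actual template we use.

```python
from dataclasses import dataclass, field


@dataclass
class TestDesignSpec:
    """Skeleton mirroring the sections of a test design specification (illustrative only)."""
    feature_area: str
    scope: list[str] = field(default_factory=list)            # what is (and is not) covered
    schedule: dict[str, str] = field(default_factory=dict)    # milestone -> planned test activity
    configurations: list[str] = field(default_factory=list)   # OS / Office / locale matrix
    external_dependencies: list[str] = field(default_factory=list)
    risks_and_mitigations: dict[str, str] = field(default_factory=dict)
    division_of_labor: dict[str, str] = field(default_factory=dict)  # area -> owner
    tools_and_technologies: list[str] = field(default_factory=list)


# A living document: updated as features, schedules, and requirements change.
spec = TestDesignSpec(
    feature_area="Office project templates",
    scope=["Project creation", "Build and debug", "Deployment"],
    configurations=["Windows Vista + Office 2007", "Windows XP + Office 2003"],
    risks_and_mitigations={"Late design changes": "Re-review the spec at each milestone"},
)
print(f"{spec.feature_area}: {len(spec.scope)} areas in scope")
```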

Now the real fun begins: developers are writing code and features are being checked in. How do we make sure the product is of good quality? Test begins by authoring detailed test-case enumerations that cover all the areas we need to hit. These include user scenarios, performance, accessibility, globalization, API testing, security, and the list goes on. These enumerations are reviewed and updated as we learn more. Once features start to come together, we do some exploratory testing. This manual testing involves attacking specific feature areas and logging code defects (bugs), suggestions, and spec issues. Our team resolves these issues, and we verify that the resolutions are acceptable. As an aside, our division uses Team Foundation Server to store and track our issues. It's a complete solution for our bug-tracking needs, providing excellent configurability and reporting, and best of all it is integrated with all the other Visual Studio Team System features.
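
As an illustration only, here is what a few entries from such an enumeration might look like if captured as structured data. The case IDs, areas, titles, and expected results below are invented examples, not entries from our actual test beds.

```python
from dataclasses import dataclass


@dataclass
class TestCase:
    """One entry in a test-case enumeration (fields are illustrative)."""
    case_id: str
    area: str          # e.g. "User scenarios", "Performance", "Accessibility", "Security"
    title: str
    expected: str
    automated: bool = False


# A few invented entries; a real enumeration contains hundreds of cases per feature area.
enumeration = [
    TestCase("UX-001", "User scenarios", "Create and build a document-level project",
             expected="Project builds with no errors", automated=True),
    TestCase("GLB-014", "Globalization", "Project templates appear under a Japanese locale",
             expected="Template names render correctly in the New Project dialog"),
    TestCase("SEC-007", "Security", "Customization is blocked when it is not trusted",
             expected="A clear error explains why the customization did not load"),
]

# Simple queries help plan the work, e.g. which cases still need automation.
manual_only = [case for case in enumeration if not case.automated]
print(f"{len(manual_only)} of {len(enumeration)} enumerated cases are still manual")
```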

As our features mature we are ready to start test-case automation. Many of you are familiar with the benefits of test automation: in short, it lets us execute test scenarios more often, and across more configurations, than a tester could cover manually. Most importantly, it alerts us when product functionality that used to work has regressed. Automation written by developers is different in that they write unit tests which exercise discrete components of the product; testers excel at writing automation at the scenario level to ensure that the product works the way a user would exercise it. Tests are broken down into priorities that determine how often they are executed. Test automation comes in many forms. API testing exercises internal object models or libraries exposed to the user. UI testing drives the product through the UI, using accessibility interfaces to click buttons, enter text, use menus, and exercise all the other input methods. There are many other types of automated testing as well, such as performance tests, stress tests, and fault tests. It's worth mentioning that we are authoring and executing some tests using Visual Studio test projects, with good results.
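
Our real automation is written against Visual Studio test projects, so the sketch below is purely illustrative: a small Python example of two of the ideas above, scenario-level tests and priority tagging. The `priority` decorator, the `create_and_build_sample_project` helper, and the template names are hypothetical stand-ins for code that would drive the actual product.

```python
import unittest
from dataclasses import dataclass


def priority(level):
    """Tag a test with a priority so a runner can decide how often to execute it (illustrative)."""
    def decorate(test_func):
        test_func.priority = level
        return test_func
    return decorate


@dataclass
class BuildResult:
    """Tiny stand-in for the result of driving the real product."""
    succeeded: bool
    errors: int = 0


def create_and_build_sample_project(template_name: str) -> BuildResult:
    """Hypothetical helper; in a real suite this would call product APIs or
    drive the UI through accessibility interfaces."""
    return BuildResult(succeeded=True)


class DocumentProjectScenarios(unittest.TestCase):
    """Scenario-level tests exercise the product end to end, the way a user would,
    in contrast to developer-written unit tests that target discrete components."""

    @priority(1)
    def test_create_and_build_excel_project(self):
        # Priority 1: a core scenario we would want to run on every build.
        result = create_and_build_sample_project("ExcelWorkbookProject")
        self.assertTrue(result.succeeded, "end-to-end project scenario regressed")
        self.assertEqual(result.errors, 0)

    @priority(3)
    def test_create_word_project(self):
        # Priority 3: a broader scenario, run less frequently.
        result = create_and_build_sample_project("WordDocumentProject")
        self.assertTrue(result.succeeded)


if __name__ == "__main__":
    unittest.main()
```

A custom runner could then read the attached priority to decide which tests belong in the per-check-in pass versus the nightly or weekly runs.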

As the product matures it is the testers' responsibility to report on the overall quality of the product across the team. This is a difficult task. How do you measure quality? The number of bugs, performance results, code coverage, and automation pass rates are some of the measures, but they don't tell you how it feels to use the product. For that we really put ourselves in the customers' shoes. It's the overall feeling of how complete the product is, how it responds, and how easy or difficult it is to complete a task that matters most to us. This Overall Goodness Factor (OGF) is incredibly helpful in understanding how close we are to release.
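
For the measurable half of that report, here is a tiny, hypothetical roll-up of the kinds of numbers mentioned above. The figures and field names are invented for illustration; the point is that these are easy to compute, while the OGF is not.

```python
# Hypothetical numbers for illustration only; a real report would pull these
# from the bug database, the automation lab, and the coverage tooling.
metrics = {
    "active_bugs": 42,
    "automation_runs": 1200,
    "automation_passes": 1134,
    "code_coverage_percent": 78.5,
}

pass_rate = 100.0 * metrics["automation_passes"] / metrics["automation_runs"]
print(f"Automation pass rate: {pass_rate:.1f}%")  # 94.5%
print(f"Code coverage:        {metrics['code_coverage_percent']:.1f}%")
print(f"Active bugs:          {metrics['active_bugs']}")

# None of these numbers capture how the product feels to use; that part of the
# report still comes from testers working through real customer scenarios.
```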

Throughout the product cycle we work with customers to gather feedback and requirements for the product. Community Technology Preview (CTP) and Beta releases are great opportunities to gather this data. Early-adoption customers and partners participate in the Technology Adoption Program (TAP): these customers promise to provide detailed feedback, while we sign up to support them with representatives from the product groups whose components they are using. We also use blogs (such as this one) and forums to stay in touch, as well as customer visits and conference attendance.

As the product cycle comes to an end, the tester has a great responsibility. We double-check the exit criteria and put our names on the dotted line for Release to Manufacturing (RTM) readiness. Once that's done, the DVDs are burned and the ship party is scheduled. Each employee receives a shiny Ship-It award plaque to remind us of our accomplishment.

I hope you've learned something about testing in our group. We're interested in learning more about how you do testing. How do you schedule testing? Do you have a dedicated test role in your business? Are you authoring automation for your tests, or is it mostly a manual process? What examples do you have of testing SharePoint and Office applications? How can we make it easier for you to test your products? If you are interested in hearing more about any of the above, our team will be happy to blog about these subjects in more detail. We look forward to discussing them with you.