Advanced Load Testing Features of Visual Studio Team System

Visual Studio Team Test Edition provides two capabilities that allow users to create load tests of web sites quickly and easily: the Web Test recorder and the Load Test wizard. In fact, these tools make the process so easy that it is tempting to use the web tests and load tests they produce without much modification. However, to use the web testing and load testing capabilities most effectively, it is beneficial to understand how to use other web test and load test properties that are not set by these tools. This note describes some of the most important considerations for load testing with Visual Studio Team System that are not addressed by the web test recorder and the load test wizard.

Verify web tests and unit tests

Before running a load test, it is a good idea to make sure that all of the tests contained in the load test pass when run by themselves. Run them from either the Test Explorer or Test View windows, or from the Web Test editor in the case of a web test. For data-bound web tests, run through all of the data values.

See Josh Christie’s excellent paper Web Test Authoring and Debugging Techniques at https://msdn.microsoft.com/library/default.asp?url=/library/en-us/dnvs05/html/WTAuthDebug.asp for more details on creating and verifying web tests. Some additional web test properties that are useful when running load tests are described later in this document.
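As an illustration of making verification explicit, the following sketch shows a minimal coded web test (C#) that issues a request and checks the response with a validation rule. It is illustrative only: the URL and the expected text are hypothetical placeholders, and the classes used are from the Microsoft.VisualStudio.TestTools.WebTesting namespaces.

    using System;
    using System.Collections.Generic;
    using Microsoft.VisualStudio.TestTools.WebTesting;
    using Microsoft.VisualStudio.TestTools.WebTesting.Rules;

    // Minimal coded web test sketch: request a page and verify its content.
    // The URL and the expected text are hypothetical placeholders.
    public class HomePageWebTest : WebTest
    {
        public override IEnumerator<WebTestRequest> GetRequestEnumerator()
        {
            WebTestRequest request = new WebTestRequest("http://localhost/MySite/Default.aspx");
            request.ExpectedHttpStatusCode = 200;

            // Fail the request if the expected text is not found in the response body.
            ValidationRuleFindText findText = new ValidationRuleFindText();
            findText.FindText = "Welcome";
            findText.IgnoreCase = true;
            findText.PassIfTextFound = true;
            request.ValidateResponse += new EventHandler<ValidationEventArgs>(findText.Validate);

            yield return request;
        }
    }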

Choose an appropriate load profile

Choose a load profile for each Scenario in your load test that is appropriate for your test goals.    Here is a summary of the three choices for load profile.

Using a Constant Profile
A Constant Load Profile can be used to run the same user load throughout a load test.   Be careful about using a Constant Load Profile with a high user count; doing so may place an unreasonable and unrealistic demand on your server(s) at the beginning of the load test.   For example, if your load test contains a web test that starts with a request to a home page, and you set up the load test with a constant load of 1,000 users, the load test will submit the first 1,000 requests to the home page as fast as possible.   This may not be a realistic simulation of real-world access to your web site.    To mitigate this, consider using a Step Load Profile that ramps up gradually to 1,000 users, or specify a warm-up period in the Load Test Run Settings.   If a warm-up period is specified, the load test will automatically increase the load gradually during the warm-up period.

Using a Step Load Profile
A Step Load Profile can be used to increase the load on the server(s) as the load test runs so that you can see how performance varies as the user load increases. For example, to see how your server(s) perform as the user load increases to 2,000 users, you might run a 10-hour load test using a Step Load Profile with the following properties:
Initial User Count: 100
Maximum User Count: 2000
Step Duration (seconds): 1800
Step Ramp Time (seconds): 20
Step User Count: 100
These settings have the load test running for 30 minutes (1800 seconds) at user loads of 100, 200, 300, and so on up to 2,000 users. The Step Ramp Time property is worth special mention here because it is the only one of these properties that is not available in the Load Test Wizard. This property allows the increase from one step to the next (for example, from 100 to 200 users) to be gradual rather than immediate. In this example, the user load would be increased from 100 to 200 users over a 20-second period (an increase of 5 users every second).
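To make the arithmetic concrete, the following sketch (illustrative only, not part of the load test runtime API) computes the approximate simulated user load at a given elapsed time for the step settings above. Reaching 2,000 users takes 19 steps of 1,800 seconds each, or roughly 9.5 hours, which is why a 10-hour run is suggested.

    using System;

    // Illustrative only: approximates the user load produced by the step profile
    // above (Initial 100, Max 2000, Step 100, Step Duration 1800 s, Step Ramp Time 20 s).
    static class StepLoadSketch
    {
        static int LoadAt(double elapsedSeconds)
        {
            const int initialUsers = 100, maxUsers = 2000, stepUsers = 100;
            const double stepDuration = 1800, stepRampTime = 20;

            int completedSteps = (int)(elapsedSeconds / stepDuration);
            double timeIntoStep = elapsedSeconds - completedSteps * stepDuration;

            double load = initialUsers + completedSteps * stepUsers;
            if (completedSteps > 0 && timeIntoStep < stepRampTime)
            {
                // Still ramping up from the previous step (100 users over 20 s = 5 users/s).
                load -= stepUsers * (1 - timeIntoStep / stepRampTime);
            }
            return (int)Math.Min(load, maxUsers);
        }

        static void Main()
        {
            Console.WriteLine(LoadAt(0));      // 100 users at the start
            Console.WriteLine(LoadAt(1810));   // 150 users, halfway through the first ramp
            Console.WriteLine(LoadAt(36000));  // 2000 users near the end of the run
        }
    }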

Using a Goal-Based Load Profile
A Goal-Based Load Profile is useful when you want to determine the number of users that your system can support before reaching some level of resource utilization. This option works best when you have already identified the limiting resource (i.e., the bottleneck) in your system. For example, if you know that the limiting resource in your system is the CPU on your database server, and you want to see how many users can be supported when that CPU is approximately 75% busy, you could use a Goal-Based Load Profile with the goal of keeping the value of the performance counter “% Processor Time” between 70% and 80%. One thing to watch out for: if some other resource is limiting the throughput of the system, the goal specified by the Goal-Based Load Profile may never be reached, and the user load will continue to rise until the value specified for the Maximum User Count is reached. This is usually not the desired load, so be careful about the choice of the performance counter in the Goal-Based Load Profile, and make a conscious decision about the value for the Maximum User Count to place an upper bound on the user load.

Choosing the location of the Load Test Results Store
When the Visual Studio Team Test Controller is installed, the Load Test Results Store is set up to use an instance of SQL Express that is installed on the controller computer.    SQL Express is limited to using a maximum of 4 GB of disk space.   If you are going to run many load tests and want to keep the results for a while, you should consider configuring the Load Test Results Store to use an instance of the full SQL Server product if available.    See the Visual Studio Team Test documentation for instructions on setting up the database to be used as the Load Test Results Store.
Increase the performance counter sampling interval for longer tests
Choose an appropriate value for the “Sample Rate” property in the Load Test Run Settings based on the length of your load test.  A smaller sample rate, such as the default value of five seconds, requires more space in the load test results database.  For longer load tests, increasing the sample rate reduces the amount of data collected.
Here are some guidelines for sample rates:

Load Test Duration      Recommended Sample Rate
< 1 Hour                5 seconds
1 - 8 Hours             15 seconds
8 - 24 Hours            30 seconds
> 24 Hours              60 seconds
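As a rough illustration of why the sample rate matters, the sketch below estimates how many performance counter samples a run writes to the results store. The counter count of 200 is an assumed figure for illustration; real runs collect whatever counters the configured counter sets specify.

    using System;

    // Illustrative only: estimates the number of performance counter samples
    // stored for a load test. The counter count (200) is an assumed figure.
    static class SampleRateSketch
    {
        static long EstimateSamples(TimeSpan duration, TimeSpan sampleRate, int counterCount)
        {
            return (long)(duration.TotalSeconds / sampleRate.TotalSeconds) * counterCount;
        }

        static void Main()
        {
            // A 24-hour test stores six times as many samples at a 5-second rate
            // as it does at a 30-second rate.
            Console.WriteLine(EstimateSamples(TimeSpan.FromHours(24), TimeSpan.FromSeconds(5), 200));   // 3,456,000
            Console.WriteLine(EstimateSamples(TimeSpan.FromHours(24), TimeSpan.FromSeconds(30), 200));  // 576,000
        }
    }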

Consider including Timing Details to collect percentile data
There is a property on the Run Settings in the Load Test Editor named "Timing Details Storage". If Timing Details Storage is enabled, the time to execute each individual test, transaction, and page during the load test is stored in the load test results repository. This allows 90th and 95th percentile data to be shown in the load test analyzer in the Tests, Transactions, and Pages tables. The amount of space required in the load test results repository to store the Timing Details data may be very large, especially for longer running load tests. Also, storing this data at the end of the run takes longer, because the data is kept on the load test agents until the load test has finished executing and is only then written into the repository. For these reasons, Timing Details is disabled by default. However, if sufficient disk space is available in the load test results repository, you may wish to enable Timing Details to get the percentile data.

Note that there are two choices for enabling Timing Details in the Run Settings properties: "StatisticsOnly" and "AllIndividualDetails". With either option, all of the individual tests, pages, and transactions are timed, and percentile data is calculated from the individual timing data. The difference is that with the StatisticsOnly option, once the percentile data has been calculated, the individual timing data is deleted from the repository. This reduces the amount of space required in the repository when using Timing Details. However, advanced users may want to process the timing detail data in other ways using SQL tools, in which case the AllIndividualDetails option should be used so that the timing detail data is available for that processing.

Consider enabling SQL Tracing
There is a set of properties on the Run Settings in the Load Test Editor that allow the SQL tracing feature of Microsoft SQL Server to be enabled for the duration of the load test.   If enabled, this allows SQL trace data to be displayed in the load test analyzer on the "SQL Trace" table available in the Tables dropdown.   This is a fairly easy-to-use alternative to starting a separate SQL Profiler session while the load test is running to diagnose SQL performance problems.  To enable this feature, the user running the load test (or the controller user in the case of a load test run on a rig) must have the SQL privileges needed to perform SQL tracing, and a directory (usually a share) where the trace file will be written must be specified.   At the completion of the load test, the trace file data is imported into the load test repository and associated with the load test that was run so that it can be viewed at any later time using the load test analyzer.

Don’t Overload the Agent(s)
If an agent machine has more than 75% CPU utilization or has less than 10% of physical memory available, add more agents to your test rig to ensure that the agent machine does not become the bottleneck in your load test.

Add an Analysis Comment
After the load test is complete and you have spent some time analyzing the results, you can add a short one-line description and an arbitrarily long analysis comment to be stored permanently with the load test result. To do this, in the load test result viewer, right-click and choose the “Analysis” option. This brings up a dialog that allows you to enter your analysis text, which is stored in the load test results database when you click OK to close the dialog.

Considerations for Load Tests that contain Web Tests
The following considerations apply to load tests that contain web tests, but are not applicable to load tests that contain only unit tests.

Choose the Appropriate Connection Model
The property WebTest Connection Model found on the RunSettings property sheet in the Load Test editor allows you to choose between two models for managing the connections that the load test runtime engine uses for connecting to the target web site(s):

The ConnectionPerUser model most closely simulates the behavior of a real browser: each virtual user running a web test creates one or two connections to the web server that are dedicated to that virtual user. The first connection is established when the first request in the web test is issued. A second connection may be used when a page contains more than one dependent request; these requests may be issued in parallel using the two connections. These connections are re-used for subsequent requests within the web test, and are closed when the web test completes. The only drawback to the ConnectionPerUser model is that the number of connections held open on the agent machine may be high (as high as 2 times the user load), and the resources required to support this high connection count may limit the user load that can be driven from a single load test agent.  The extra resources used are the memory associated with the connection and extra processing time to close and reopen the connections as web tests complete and new web tests are started.

The ConnectionPool model conserves the resources on the load test agent by sharing connections to the web server among multiple virtual web test users. When the Connection Model is ConnectionPool, the Connection Pool Size specifies the maximum number of connections to make between the load test agent and the web server. If the user load is larger than the connection pool size, then web tests executing on behalf of different virtual users will share a connection. This means that one web test will need to wait before issuing a request when another web test is using the connection.  The average time that a web test waits before submitting a request is tracked by the load test performance counter “Avg. Connection Wait Time”.  The value of this performance counter should be very low; if it is not, you need to increase the connection pool size to prevent it from limiting the throughput of your load test.

Consider setting response time goals for web test requests
One of the properties of a web test request is a response time goal.  If you define response time goals for your web test requests, when the web test is run in a load test, the load test analyzer will report the percentage of the web tests executed within the load test for which the page response time was less than the goal.  Note that the response time goal is for the entire page, meaning the time to receive the response for the request specified in the web test including the time to receive the response for any dependent requests.  By default there are no response time goals defined.

Consider setting timeouts for web test requests
Another property of a web test request is a timeout value. By default there is no timeout value specified on web test requests, so if the web server never responds to the request, the web test request (and the web test) will never complete and no error will be reported. If a timeout value is set (even a high one such as 300 seconds), the web test request will eventually time out, an error will be reported, and the web test will continue.
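Both of these request-level settings (the response time goal from the previous section and the timeout described here) can also be set from a coded web test. The sketch below is illustrative only: the URL is a hypothetical placeholder, and it assumes the request exposes ResponseTimeGoal and Timeout properties expressed in seconds, matching the settings shown in the Web Test editor.

    using System.Collections.Generic;
    using Microsoft.VisualStudio.TestTools.WebTesting;

    // Minimal coded web test sketch: set a response time goal and a timeout on a
    // request. The URL is a hypothetical placeholder; both values are in seconds.
    public class RequestSettingsWebTest : WebTest
    {
        public override IEnumerator<WebTestRequest> GetRequestEnumerator()
        {
            WebTestRequest request = new WebTestRequest("http://localhost/MySite/Reports.aspx");

            // Report this page against a 3-second goal in the load test analyzer.
            request.ResponseTimeGoal = 3.0;

            // Report an error (instead of waiting forever) if no response arrives
            // within 300 seconds, then let the web test continue.
            request.Timeout = 300;

            yield return request;
        }
    }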

Choose a value for the “Percentage of New Users” property
There is a property on each scenario in a Load Test named "Percentage of New Users”.  This property affects the way in which the load test runtime engine simulates the caching that would be performed by a web browser.   The default value for the “Percentage of New Users” property is 100%.   This means that each web test run in a load test is treated like a first time user to the web site who doesn't have any content from the web site in their browser cache from previous visits.   Thus all requests in the web test (including all dependent requests such as images) are downloaded (except in the case where the same cacheable resource is requested more than once in a web test).  If you are load testing a web site that has a significant number of return users who are likely to have images and other cacheable content cached locally, then using the default setting of 100% for “Percentage of New Users” will generate more download requests than would occur in real-world usage.   In this case, you should estimate the percentage of visits to your web site that are from first time users of the web site, and set “Percentage of New Users” accordingly.

Consider setting the “ParseDependentRequests” property of your web test requests to false
By default, when a web test is executed, HTML responses received from the web server are parsed and any dependent requests such as images or style sheets are automatically submitted. This is usually desirable because it places the most realistic load on the server. However, if the goal of your load test is to place the maximum load on your application server as opposed to your web server, you may wish to disable this behavior by setting the ParseDependentRequests property on all of your web test requests to false.
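In a coded web test this is a single property assignment per request; in a declarative web test the same property can be set on each request in the Web Test editor. A minimal sketch, assuming a hypothetical URL:

    using System.Collections.Generic;
    using Microsoft.VisualStudio.TestTools.WebTesting;

    // Minimal coded web test sketch: issue a request without downloading its
    // dependent requests. The URL is a hypothetical placeholder.
    public class NoDependentsWebTest : WebTest
    {
        public override IEnumerator<WebTestRequest> GetRequestEnumerator()
        {
            WebTestRequest request = new WebTestRequest("http://localhost/MySite/Search.aspx");

            // Do not parse the HTML response for images, style sheets, or scripts,
            // so only the main request is sent to the server.
            request.ParseDependentRequests = false;

            yield return request;
        }
    }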