Verifying the Results of your Automated Performance Test

When running an automated performance test against a web application, it is important to verify that your results are true and accurate. Poorly written or incomplete automated performance tests will often cause unwarranted errors within the application under test. Record-and-playback performance tools make test script generation extremely fast and easy, but the resulting scripts are not always accurate, and steps should be taken to ensure that a script is valid before running tests and reporting results.

The following checklist provides several ways to verify your results and ensure a clean, accurate performance test:

  1. Clear all logs prior to running your test, including the Event Viewer, the IIS log, and any SQL Profiler traces. These logs tell the “story” of what happened while your test ran.

A simple way to verify the authenticity of your test script is to execute a manual walk-through of your application scenario and save all of the logs mentioned above. Next, clear the logs and execute a single iteration of your automated performance test with a single user. Now compare the two sets of logs and look for any discrepancies.
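
For example, a small script can diff the requests recorded in the two IIS logs. The following is a minimal sketch: the file names are placeholders, and the field indexes assume the default W3C log format, so check the #Fields header in your own logs and adjust accordingly.

# Compare requests from a manual walk-through against a single
# automated iteration, using the method and URI fields of each entry.
def requests_in(log_path, method_col=3, uri_col=4):
    requests = []
    with open(log_path) as log:
        for line in log:
            if line.startswith('#'):  # skip W3C header lines
                continue
            fields = line.split()
            requests.append((fields[method_col], fields[uri_col]))
    return requests

manual = requests_in('manual_walkthrough.log')        # placeholder file names
automated = requests_in('automated_single_user.log')

print('Only in automated run:', set(automated) - set(manual))
print('Only in manual run:   ', set(manual) - set(automated))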

  2. Identify key tables within your application's database that will increase in size as your test script executes. Make note of the row counts before and after each test iteration. For example, a search engine may have a table that logs search results; take a row count of this table before and after your automated performance test and compare the difference to the number of iterations your test script executed.

The following SQL statement can be used to get a row count for every user table within a SQL Server database (note that on newer versions of SQL Server, sysindexes is a legacy compatibility view and its rowcnt column may be approximate):

--Table Counts
select object_name(id), rowcnt from sysindexes
where indid in (0, 1)
and object_name(id) not like 'sys%'
order by 1
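
To automate the before/after comparison, you can snapshot these counts around a test iteration. Below is a minimal sketch in Python using pyodbc; the connection string is a placeholder, and the query is the same statement shown above.

import pyodbc

COUNT_SQL = """
select object_name(id), rowcnt from sysindexes
where indid in (0, 1)
and object_name(id) not like 'sys%'
"""

def table_counts(conn):
    # Returns {table_name: row_count} for every user table
    return dict(conn.cursor().execute(COUNT_SQL).fetchall())

# Placeholder connection string -- point it at the database under test
conn = pyodbc.connect('DRIVER={SQL Server};SERVER=myServer;DATABASE=myDb;Trusted_Connection=yes')

before = table_counts(conn)
# ... run one iteration of the automated performance test here ...
after = table_counts(conn)

# Report only the tables whose row counts changed
for table in sorted(after):
    delta = after[table] - before.get(table, 0)
    if delta:
        print(f'{table}: {delta:+d} rows')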


  3. Append an extra query string parameter to the end of key requests within your test script. Using the example above, if you were interested in tracking the number of search requests submitted during your automated performance test, you might add a simple name/value pair to the request:

/mySearchApplication/search.aspx?action=search

Now you can count the number of times “action=search” appears in the IIS log, giving you the total number of times a search was completed. Compare this number to the table-count results from item 2 above.
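
For example, a few lines of Python can tally those entries; the log file name below is a placeholder for your actual IIS log.

# Count how many tagged search requests the IIS log recorded
hits = 0
with open('ex231001.log') as log:  # placeholder IIS log file name
    for line in log:
        if 'action=search' in line:
            hits += 1
print('Searches recorded in IIS log:', hits)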

Appending a query string parameter to requests within your test script is extremely helpful when testing web services, where you POST to the same asmx page while invoking different methods. Append a query string parameter that indicates the method being invoked, for example:

/mySearchWebService/search.asmx?method=search
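
With the requests tagged this way, the IIS log can be tallied per method. The sketch below assumes the W3C log format with the cs-uri-query field at a fixed column; the file name and field index are placeholders, so adjust them to match your log's #Fields header.

from collections import Counter

methods = Counter()
with open('webservice.log') as log:  # placeholder log file name
    for line in log:
        if line.startswith('#'):     # skip W3C header lines
            continue
        fields = line.split()
        if len(fields) <= 5:
            continue
        query = fields[5]            # cs-uri-query column -- adjust as needed
        if query.startswith('method='):
            methods[query.split('=', 1)[1]] += 1

for name, count in methods.most_common():
    print(name + ':', count)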

Verifying the overall validity of the test results is a critical part of performance testing, as code changes to the application may be made based on these findings. The integrity of the test must be indisputable.