A question that always comes up in my sessions about SQL Server 2008 concerns my favorite feature in SQL Server 2008, the MERGE command. People always ask me how the performance of MERGE compares to standard / traditional mechanisms like SqlCommand / SqlDataAdapter.
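For readers who have not used it yet, here is a minimal sketch of what a MERGE statement looks like. The table and column names (TargetTable, SourceTable, Id, Payload, Deleted) are made up for illustration and are not from my actual test sources:

```sql
-- Illustrative only: synchronize a target table with a source table
-- in a single statement instead of separate INSERT / UPDATE / DELETE.
MERGE INTO dbo.TargetTable AS t
USING dbo.SourceTable AS s
    ON t.Id = s.Id
WHEN MATCHED AND s.Deleted = 1 THEN
    DELETE
WHEN MATCHED THEN
    UPDATE SET t.Payload = s.Payload
WHEN NOT MATCHED BY TARGET THEN
    INSERT (Id, Payload) VALUES (s.Id, s.Payload);
```

The point of the test below is to measure how this single round trip holds up against the classic client-side approaches.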
Well, for that reason I created a small test setup, described in the following:
I created a sample database containing 1,000,000 rows using the data generator from Visual Studio for Database Professionals. That database serves as the basis for the upcoming tests. I am running the test on a virtual machine with a non-dedicated disk for SQL IO (therefore results may differ in your environment).
The test simply does the following, with two nested loops involved:
a) The outer loop increases the number of rows in steps of 1,000 (1000n rows per iteration).
b) Within the outer loop, an inner loop increases the data size in steps of 50 bytes (50n bytes per iteration) and performs the steps below:
1. Call the different manipulation methods; each of them will do the following:
1.1 Load data from the database using the variable number of rows defined in the outer loop.
1.2 Scramble the data, i.e. insert / update / delete something. (Note that I use an extra column "deleted" for those manipulation methods which, unlike MERGE, cannot work with the state of the current data row.) The scramble will update or insert the specified amount of data (the other loop variable) using a randomly sized string.
1.3 Call the update methods.
1.4 Stop the time only for the command execution (not the command generation or anything else).
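The steps above can be sketched as follows. This is not the actual test source; the helper names (LoadData, ScrambleData, ExecuteUpdateMethod) and the loop bounds are placeholders for illustration:

```csharp
// Sketch of the test harness: two nested loops, timing only the execution.
using System;
using System.Collections.Generic;
using System.Data;
using System.Diagnostics;

var results = new List<TimeSpan>();
var watch = new Stopwatch();

for (int rows = 1000; rows <= maxRows; rows += 1000)   // a) outer loop: row count
{
    for (int size = 50; size <= maxSize; size += 50)   // b) inner loop: data size
    {
        DataTable data = LoadData(rows);               // 1.1 load the rows
        ScrambleData(data, size);                      // 1.2 insert / update / delete
        watch.Reset();
        watch.Start();
        ExecuteUpdateMethod(data);                     // 1.3 call the update method
        watch.Stop();                                  // 1.4 time only the execution
        results.Add(watch.Elapsed);
    }
}
```

The important design choice here is that the stopwatch wraps only the command execution, so command generation, data loading, and scrambling never pollute the measurement.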
Other Test Properties:
Tests per iteration are always executed twice to ensure that we do not suffer from any IO bottlenecks / log writing in the background.
Transactions: All commands (except the MERGE command, which runs in an implicit transaction of its own on the server) are wrapped in a user transaction to reflect a production environment, in case the commands fail and have to be rolled back.
TestReset: Test data was reset to a previous state using a separate table after ten iterations of the first loop.
CommandCaching: Most people have command caching implemented in their application, preventing the application / data layer from creating commands every time they are called. As command creation can be an expensive process, the test initializes the commands the first time and stores them in a Dictionary&lt;string,SqlCommand&gt; for later use.
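A minimal sketch of such a command cache, assuming a lazily populated dictionary keyed by a caller-supplied name (the method and field names are illustrative, not from my test sources):

```csharp
using System;
using System.Collections.Generic;
using System.Data.SqlClient;

// Illustrative command cache: build each SqlCommand once, reuse it afterwards.
private readonly Dictionary<string, SqlCommand> _commandCache =
    new Dictionary<string, SqlCommand>();

private SqlCommand GetCommand(string key, Func<SqlCommand> factory)
{
    SqlCommand command;
    if (!_commandCache.TryGetValue(key, out command))
    {
        // Expensive part: creating the command and its parameters
        // happens only on the first call for a given key.
        command = factory();
        _commandCache.Add(key, command);
    }
    return command;
}
```

On subsequent calls only parameter values are assigned, so command construction cost drops out of the per-iteration measurement.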
The results were extracted from the logging table, always picking the best run for an inner iteration per permutation. From the results file and the following graphics, we can see that the execution of the MERGE statement is pretty good for a small number of rows / amount of data. As the test was done using cached commands and a local server instance (so there is no transfer over the wire), I will try to run another test with two machines: the client on the first machine and the server on the second one. The test results (presented in Excel) can be found in the attachment of the blog entry.
If anyone wants to replay the test, the sources are available here.