Comparing .NET 2.0 to 1.1: Transactional Performance with MQSeries (System.Transactions vs. System.EnterpriseServices SwC)

I spent some time measuring the performance of the new System.Transactions API in .NET v2.0, comparing it to the performance of transactions using the Services-without-Components (SwC) capabilities in .NET v1.1, specifically when using IBM MQSeries as a resource manager. There were some surprising results: System.Transactions shows good performance gains for minimal changes in code.

I used 2 simple transactional apps to benchmark. The first app simply puts or gets a single message to or from an MQ queue, transactionally. It does this via IBM's supplied .NET class library for MQ (namespace IBM.WMQ). The app randomly selects whether to put or get, and does this in a loop, with multiple threads. In every transaction, there is always just one resource manager involved (the MQ queue itself) and one queue operation (a put or a get). The payload size is ridiculously small: under 20 bytes.
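As a sketch of what one iteration of app #1 looks like (hedged: the queue name here is a placeholder I invented, error handling is simplified, and this shows the .NET 2.0 System.Transactions variant; the MQC syncpoint options are what enlist MQ in the ambient transaction):

```csharp
using System;
using System.Transactions;
using IBM.WMQ;

class App1
{
    static Random rng = new Random();

    // One iteration: a single transactional put or get on one queue.
    static void DoOneTransaction(MQQueueManager qMgr)
    {
        using (TransactionScope scope = new TransactionScope())
        {
            MQQueue queue = qMgr.AccessQueue("TEST.QUEUE",   // placeholder name
                MQC.MQOO_OUTPUT | MQC.MQOO_INPUT_AS_Q_DEF);

            if (rng.Next(2) == 0)
            {
                // put: tiny payload, under syncpoint
                MQMessage msg = new MQMessage();
                msg.WriteString("tiny payload");             // under 20 bytes
                MQPutMessageOptions pmo = new MQPutMessageOptions();
                pmo.Options = MQC.MQPMO_SYNCPOINT;
                queue.Put(msg, pmo);
            }
            else
            {
                // get: also under syncpoint
                MQMessage msg = new MQMessage();
                MQGetMessageOptions gmo = new MQGetMessageOptions();
                gmo.Options = MQC.MQGMO_SYNCPOINT | MQC.MQGMO_WAIT;
                queue.Get(msg, gmo);
            }

            queue.Close();
            scope.Complete();   // commit happens when the scope is disposed
        }
    }
}
```

Note that enlisting MQ in a DTC-coordinated transaction requires the transactional MQ client; the sketch assumes that is configured.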

The second app involves a mix of 2 flavors of transactions. The first flavor is an enqueue operation to a single queue, much like the put operation in app #1. The second flavor involves a distributed commit: the app dequeues from the queue, and then inserts a row derived from the queue data into SQL Server. Again this is done via IBM's MQ classes for .NET, in a loop, with multiple worker threads. And as with app #1, the app randomizes its operations so that 50% of the transactions are one flavor (enqueues) and 50% are the other (2PC transactions).
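The 2PC flavor of app #2 looks roughly like this (again a sketch: the queue name, connection string, and table schema are placeholders I invented for illustration; the key point is that MQ and SQL Server both enlist in the same ambient transaction):

```csharp
using System;
using System.Data.SqlClient;
using System.Transactions;
using IBM.WMQ;

class App2
{
    // One 2PC transaction: dequeue from MQ, insert into SQL Server.
    // Both resource managers enlist in the same DTC-coordinated transaction.
    static void DequeueAndInsert(MQQueueManager qMgr, string connString)
    {
        using (TransactionScope scope = new TransactionScope())
        {
            // dequeue under syncpoint
            MQQueue queue = qMgr.AccessQueue("TEST.QUEUE",   // placeholder
                MQC.MQOO_INPUT_AS_Q_DEF);
            MQMessage msg = new MQMessage();
            MQGetMessageOptions gmo = new MQGetMessageOptions();
            gmo.Options = MQC.MQGMO_SYNCPOINT | MQC.MQGMO_WAIT;
            queue.Get(msg, gmo);
            string body = msg.ReadString(msg.MessageLength);
            queue.Close();

            // insert a row derived from the message
            using (SqlConnection conn = new SqlConnection(connString))
            {
                conn.Open();  // auto-enlists in the ambient transaction
                SqlCommand cmd = new SqlCommand(
                    "INSERT INTO MessageLog (Body) VALUES (@body)",  // placeholder table
                    conn);
                cmd.Parameters.AddWithValue("@body", body);
                cmd.ExecuteNonQuery();
            }

            scope.Complete();
        }
    }
}
```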

In all cases, I used MQSeries v6.0, the latest version from IBM. Also, in all cases I performed these tests on Windows XP SP2. For the 2PC transactions, I tested against both SQL 2000 SP4, and SQL2005.

I tested with both .NET v1.1 and .NET v2.0, using the System.EnterpriseServices namespace and Services without Components (SwC). For .NET 2.0, I additionally tested with the System.Transactions API. So the complete matrix looks like this:

                                        .NET 1.1 SwC       .NET 2.0 SwC       .NET 2.0 Sys.Tran
App #1 - each transaction does
a single MQ operation                        X                  X                    X

App #2 - 50% TxA: a single MQ put,     SQL2000 SQL2005    SQL2000 SQL2005    SQL2000 SQL2005
50% TxB: MQ get and SQL insert            X       X          X       X          X       X

The App Code

The basic structure of each application is this:

        using (TransactionScope scope = new TransactionScope()) {

          try {
            // do the work for this transaction (varies by app type)
            TransactionalWork();
            // mark the transaction as complete; commit occurs on Dispose
            scope.Complete();
          }
          catch (Exception ex1) {
            // no call to scope.Complete() means the transaction rolls back
            LogFailure(ex1);
          }
        }

It creates a transaction, then within the transaction, does some work, and tries to commit it. The transactional scope class is either the built-in class if the test uses .NET 2.0 (System.Transactions.TransactionScope) or a utility class built atop "Services without Components", which behaves in much the same way.
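For the .NET 1.1 tests, that SwC utility class might look roughly like this. This is my own sketch, not the actual test code: the class name is invented, but the ServiceDomain/ServiceConfig/ContextUtil calls are the real Services-without-Components API in System.EnterpriseServices:

```csharp
using System;
using System.EnterpriseServices;

// Hypothetical wrapper that mimics TransactionScope semantics on top of
// Services without Components. The real utility class may differ.
public class SwcTransactionScope : IDisposable
{
    private bool complete = false;

    public SwcTransactionScope()
    {
        // Enter a transactional context without a ServicedComponent.
        ServiceConfig config = new ServiceConfig();
        config.Transaction = TransactionOption.Required;
        ServiceDomain.Enter(config);
    }

    public void Complete()
    {
        complete = true;
    }

    public void Dispose()
    {
        // Vote on the outcome, then leave the context, which drives
        // the DTC transaction to commit or roll back.
        if (complete)
            ContextUtil.SetComplete();
        else
            ContextUtil.SetAbort();
        ServiceDomain.Leave();
    }
}
```

With a wrapper like this, the calling code shown above stays nearly identical across the .NET 1.1 and 2.0 variants, which is what makes the side-by-side comparison fair.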

The TransactionalWork() varies depending on the app type, and then within the TransactionalWork(), there is some randomization.

This work is run iteratively, by multiple threads, over a set period of time, and performance data is collected during the run. This is not a web application, so I cannot use a Mercury load generation tool or the Visual Studio web Load Tester. Instead I use a custom performance test harness that starts and manages a configurable number of threads. The test harness is the same for all variations of tests.
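The harness itself is nothing exotic. A minimal sketch of the idea (the names here are invented; the real harness also collects performance data during the run) is: spin up N threads, have each one loop over the work item until time expires, and count completed iterations:

```csharp
using System;
using System.Threading;

// Minimal sketch of a multi-threaded test harness: run a delegate on
// a configurable number of threads for a fixed duration, and report
// the total number of completed iterations (i.e., throughput).
class TestHarness
{
    public delegate void WorkItem();

    public static long Run(WorkItem work, int threadCount, TimeSpan duration)
    {
        long completed = 0;
        ManualResetEvent stop = new ManualResetEvent(false);
        Thread[] threads = new Thread[threadCount];

        for (int i = 0; i < threadCount; i++)
        {
            threads[i] = new Thread(delegate()
            {
                // loop until the stop signal is set
                while (!stop.WaitOne(0, false))
                {
                    work();
                    Interlocked.Increment(ref completed);
                }
            });
            threads[i].Start();
        }

        Thread.Sleep(duration);     // let the run proceed for the set period
        stop.Set();
        foreach (Thread t in threads) t.Join();

        return Interlocked.Read(ref completed);
    }
}
```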

The Results

Summary of my findings:

  • System.Transactions in .NET 2.0 delivers 40% better throughput as compared to .NET 1.1 System.EnterpriseServices (SwC/ServiceDomain) on transactions involving only MQ. Sys.Tx delivers 55% better performance as compared to .NET 2.0 S.ES SwC in the same scenario.
  • For the app with the 50/50 mix of transactions (half of which involved a database), when the database was SQL2005, System.Transactions in .NET 2.0 delivered 75% better throughput than .NET 1.1 System.EnterpriseServices (SwC/ServiceDomain). Sys.Tx delivered 100% better performance (double) compared to .NET 2.0 S.ES (SwC).
  • In the same scenario, when the database was SQL2000, Sys.Tx delivered about 25% lower throughput than .NET 1.1 System.EnterpriseServices (SwC), and 15% lower than .NET 2.0 S.ES. This seems like a strange anomaly; I need to investigate why this would be.
  • As you can conclude from the previous figures, System.EnterpriseServices (Services without Components) actually got slightly slower in .NET 2.0. The same app code compiled and run against .NET 2.0 delivered between 10 and 15% lower throughput as compared to .NET 1.1. I attribute this to additional security checks in .NET 2.0, as well as longer code paths to deal with System.Transactions.
  • Finally, thread count mattered: running with 12 threads gave about 5% better performance in app #1 (transactions involving only a single MQ operation), while running with 24 threads gave about 5% better performance in app #2 (the one with the 50/50 mix of transactions involving a database).

What can we conclude from all of this?

  1. Apps that connect to MQ v6.0 work fine with .NET v1.1 and .NET v2.0.
  2. Upgrading to .NET 2.0 delivers performance advantages. You get good performance increases in most scenarios.
  3. If you're using SwC transactions in .NET 1.1 today, your code can run unchanged on .NET 2.0, but you will benefit by converting to System.Transactions.
  4. Be careful in the specific scenario where MQ and SQL2000 are involved in a single transaction managed by System.Transactions. I have to investigate that issue further.
  5. Test with various thread pool sizes - your results will vary depending on your workload profile and the available computer resources.

Disclaimer: I ran these tests on several machines running Windows XP SP2. The results were consistent across trials, and machines.

Try it Yourself

I packaged up the test code and made it available for you. If you follow the readme, you should be able to set up a database, MQSeries, and the .NET apps to all run on a single node.

Get the code.


I'm going to try to obtain some more appropriate server-grade hardware, running Windows Server 2003, to re-run these tests. I'll get back to you with those results, I hope soon.