Trade 6.1 is a sample and performance benchmarking application published by IBM, designed to illustrate proper application design practices for high-performance J2EE applications running within WebSphere Application Server. Greg Leake took the Trade app and built a functional equivalent on .NET. This creates two interesting opportunities:
- Performance comparisons. Anyone can run the IBM system and the .NET-based system on the same hardware, and compare results.
- Interoperability demonstrations. Because IBM has designed web services interfaces on the various service layers in the distributed Trade application, if the web services standards support in WebSphere and .NET is real, then we ought to be able to replace a WebSphere-based service with a .NET-based service through a configuration change alone. The application should continue to run, without problems.
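To make the "just a configuration change" idea concrete, here is a minimal sketch. The service names, URLs, and the getQuote operation are invented for illustration; nothing here is the actual Trade or StockTrader contract.

```python
# Hypothetical sketch of the "swap by configuration" idea. The URLs and
# the getQuote operation are invented; they are not the real contracts.

CONFIG = {
    # Original: the WebSphere-hosted service
    # "quote_service_url": "http://websphere-host:9080/trade/services/TradeServices",
    # Swapped in: a .NET-hosted equivalent; same WSDL contract, new endpoint
    "quote_service_url": "http://dotnet-host/StockTrader/TradeService.asmx",
}

class QuoteServiceClient:
    """Caller code depends only on the SOAP contract, not on the
    platform behind the endpoint."""

    def __init__(self, config):
        self.endpoint = config["quote_service_url"]

    def get_quote_request(self, symbol):
        # A real client would generate this from the WSDL; the point is
        # that the message on the wire is platform-neutral XML.
        return (
            '<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">'
            f"<soap:Body><getQuote><symbol>{symbol}</symbol></getQuote></soap:Body>"
            "</soap:Envelope>"
        )

client = QuoteServiceClient(CONFIG)
envelope = client.get_quote_request("IBM")
```

The caller never learns, or cares, which platform serves the endpoint; that is the whole claim being tested.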
What Greg Leake found was this: .NET performs better than the IBM WebSphere app. And interoperability just works. Microsoft isn't just making those statements; we're providing the materials so people can verify this for themselves. Anyone can download the Trade app (WebSphere), download the StockTrader (.NET) app, set them up, and try it themselves. The StockTrader writeup includes all the instructions and all the source code. If anyone gets different results, we want to know about it.
Now comes word from Andrew Spyker at IBM; apparently he's not happy with these results, on both counts. It's not surprising that an IBMer is unhappy about the performance results. It is a little surprising, to me anyway, that an IBMer is unhappy with the "interoperability just works" result. Puzzling, even. But let me not put words into Andrew's mouth; here's what he said:
...the interoperability the Microsoft report speaks to is basic web services functionality (SOAP/WSDL/XML) only. It does not focus on interoperability of transactions, security, reliability, durability and does not use industry standard schemas that many of our customers need for cross enterprise (B2B) or intra-enterprise web services interoperability with .NET clients calling WebSphere servers.
Ok, that was on the interop part. Then Andrew comments that "we believe we would beat .NET" in a properly conducted benchmark test:
We have reviewed the paper and results and found inconsistencies with the best practices for how to run WebSphere Application Server. Assuming items Microsoft chose not to document along with improvements in performance allowed by following best practices, we in fact believe that IBM WebSphere Application Server would win across all the scenarios shown in the results.
Hmm, this all sure sounds odd to me. Andrew said a bunch of other things, too. But let's just look at these two.
First, Mr. Spyker says that the StockTrader exercise did not use web services standards around "transactions, security, reliability, durability". Quite true. But of course, StockTrader was patterned after Trade, which is IBM's app. It's not as if Microsoft purposefully omitted these features: IBM's Trade app didn't use them, so neither did StockTrader. On this point, Mr. Spyker also seems to be implying (correct me if I am drawing the wrong conclusion here) that basic web services interop is not interesting. I completely disagree. I would guess that most web services deployments are rather simple, even now. Let's not discount the value there. Showing customers how to do interoperable web services is a worthwhile effort, even if the systems use only basic web services protocols.
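For a sense of what "basic" web services interop means at the wire level, here is a minimal sketch: the SOAP response below could have been produced by a WebSphere endpoint or a .NET endpoint, and the consumer cannot tell the difference. The getQuoteResponse payload is invented for illustration, not taken from the actual Trade WSDL.

```python
import xml.etree.ElementTree as ET

# The response below is plain SOAP 1.1; whether a Java or a .NET stack
# produced it is invisible to the consumer. Payload shape is invented.

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

response = """<?xml version="1.0"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <getQuoteResponse>
      <symbol>IBM</symbol>
      <price>104.50</price>
    </getQuoteResponse>
  </soap:Body>
</soap:Envelope>"""

# Any SOAP-aware consumer, on any platform, performs the same steps:
root = ET.fromstring(response)
body = root.find(f"{{{SOAP_NS}}}Body")
quote = body.find("getQuoteResponse")
symbol = quote.findtext("symbol")
price = quote.findtext("price")
```

That platform-blindness at the message level is exactly the value that "basic" interop delivers.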
Mr. Spyker also says that StockTrader "does not use industry standard schemas". Quite true, if by "industry standard schemas" he means higher-level message and data schemas like SWIFT or some other vertical-industry schema. Let us all be clear that the web services interfaces are based on standards like XML Schema and SOAP. Again, the service interfaces were pre-existing; they are part of IBM's Trade app. Trade does not use industry-standard schemas, therefore neither does StockTrader. This does not invalidate the result.
He goes on to say that StockTrader "does not use industry standard schemas that many of our customers need for cross enterprise (B2B) or intra-enterprise web services interoperability with .NET clients calling WebSphere servers." We should pay close attention here, because Mr. Spyker has done something very interesting. He is explicitly stating the premise that web services are for ".NET clients calling WebSphere servers." But this is much, much too limited. StockTrader shows quite clearly that .NET servers, as well as .NET clients, can interoperate with WebSphere. Web services standards work for everyone: clients, servers, mobile devices. This is the key thing: StockTrader has shown that you need not be completely homogeneous on the server side, even if you have WebSphere today. With web services standards, you can plug .NET-based apps into a service-oriented architecture and get good interoperability. Really good. And it's not difficult.
Ok, what else does Mr. Spyker say? "Assuming items Microsoft chose not to document along with improvements in performance allowed by following best practices, we in fact believe that IBM WebSphere Application Server would win across all the scenarios shown in the results." This is a very interesting statement. What Greg Leake did with StockTrader was build an app, benchmark it, and then publish the results along with full source code, a configuration guide, and test scripts. What IBM is doing here is saying, "we think we would win, if we ran the benchmark."
Mr. Spyker is also implying that StockTrader falls short of full disclosure ("Assuming items Microsoft chose not to document..."). The StockTrader paper says this, but maybe it is worth repeating: everything required to reproduce these results is included in the document. This is a full-disclosure document.
If IBM thinks it can produce better performance, Microsoft, and I'm sure the greater industry, would welcome an effort in which IBM publishes the source code, test scripts, and configuration guide for its test, along with the results achieved. It just doesn't hold water to say "we think we could win if we ran the benchmark."
And really, what can anyone say to a comment like that?