SCA is an endorsement of WCF?

huh?

SCA is an endorsement of WCF? Seriously?

I posted earlier in the summer about SCA and made a statement to that effect. That post just got a comment questioning my sanity, or my intelligence, or both. (The comment asked if I was in marketing. Can you believe that? The nerve of some people!)

Talking about SCA is difficult. People don't agree. Sometimes they vehemently disagree on stuff that seems like it ought to be very agreeable. Why? I've thought about this, and concluded that the current difficulty in the dialog about SCA is due to the fact that SCA is not well understood, or that the understandings people hold about SCA are quite different. I was in a series of meetings a while back where the topic was SCA, and we had a couple of experts who held entirely opposed views of the content of SCA. These were smart, well-informed people; one of them had actually written parts of the SCA specifications. And yet the disagreement was very stark. How can this be? The only way I can figure it - SCA is still new, the implementations that are out there are fairly new, and there's not yet a broad, common understanding of what, concretely, it is and what it offers.

The commenter I mentioned wrote this:

First, you said "Microsoft views this part of SCA as a big endorsement of the approach we have taken in Windows Communication Foundation". Then, you say "the primary difference between WCF and SCA is that WCF is about wire-level interop and SCA is about a portable programming model". So can you explain how it's an endorsement of your approach when they aren't about the same thing?

You set up the strawman that "SCA is oriented toward consolidating the disparate communications models and APIs within Java", then you knock it down with WCF.

You then agree isn't what SCA is about communications.[sic] Are you in marketing?

When I used the phrase "big endorsement" I was referring to the part of SCA that unifies the programming model regardless of whether you are doing web services, angle brackets, transactions, asynchronous comms, and so on. This is what WCF has already done for .NET, and part of SCA is dedicated to doing the same thing now, specifically for Java. A key observation here: SCA's programming model is intended to be portable across implementations. The WCF programming model is not. There's only one implementation of WCF, and that is the one in .NET 3.0 (and above). From the perspective of a systems architect hooking stuff together (aka HST) in a heterogeneous system, the programming model used by WCF programmers is irrelevant! Wait, let me extend that - from the systems architect's perspective, the programming model used by any programmer of the individual pieces in the system, whether they use VB, Java, C#, or something else, ought to be irrelevant. The only things that should be relevant, from the HST perspective, are the on-the-wire protocols.
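To make that concrete, here's a minimal sketch of an SCA component in Java, based on the OSOA SCA 1.0 annotations (org.osoa.sca.annotations). The names here (Greeter, AuditService) are hypothetical; the thing to notice is that nothing in the Java code says how the service is exposed on the wire - that decision lives in the .composite file.

    // Greeter.java - a remotable business interface. No transport details here.
    package example;

    import org.osoa.sca.annotations.Remotable;

    @Remotable
    public interface Greeter {
        String greet(String name);
    }

    // AuditService.java - a hypothetical local collaborator.
    package example;

    public interface AuditService {
        void record(String event);
    }

    // GreeterImpl.java - the component implementation. Whether this gets
    // exposed over a web service binding, over JMS, or called in-VM is
    // declared in the .composite file; this code does not change.
    package example;

    import org.osoa.sca.annotations.Reference;
    import org.osoa.sca.annotations.Service;

    @Service(Greeter.class)
    public class GreeterImpl implements Greeter {

        // Injected by the SCA runtime, wired to another component.
        @Reference
        protected AuditService audit;

        public String greet(String name) {
            audit.record("greet: " + name);
            return "Hello, " + name;
        }
    }

That transport-agnostic, attribute-driven style is, in spirit, what WCF's [ServiceContract] and [OperationContract] attributes plus binding configuration already do for .NET - hence "endorsement."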

I don't grok the rest of the comment, so I'll leave it at that.

Back to my point about people not really agreeing on just what SCA is - there was a previous post from Dana Gardner about Microsoft and SCA.  Dana wrote, in part:

The SCA/SDO backers invited Microsoft to join their efforts, they said, to adopt a programming-level of SOA standardization [my emphasis], rather than a Web services level of interoperability. But the members voiced little hope that Microsoft would have a sufficient motivation to move .NET to a programmatic open standards level. SCA/SDO is nonetheless expected to make interoperability between .NET- and non-.NET-based services a natural and rudimentary aspect of SOAs, but at a higher cost — a tax, if you will — due to Microsoft’s separation from the pack.

"programming-level of SOA standardization"? This, to me, is seriously wrongheaded thinking. We've been down this path before. Many times. Common programming model does not mean interop. Come on, people, look at the facts:

  • J2EE - standardized APIs, but interop between EJBs? Fahgedaboutit. It was always, in theory, a potential benefit of standardizing the APIs. But it never really happened. J2EE people will have to content themselves with the benefit of "sort of portability" - keeping in mind that portability remains partial among implementations, and at this point, is anyone paying attention any more?

  • CORBA - standardized APIs, but interop was sorely lacking. Even with the advent of IIOP and interoperable object references (IORs), there was still difficulty getting an object running within Orbix to intercommunicate with an object running in Visibroker. (Ahh, a blast from the past).

  • DCE - ahh, going back in time now. Anyone remember DCE? DCE had a set of standardized APIs, C-based, not object oriented. Something like CORBA, but for the C programmer rather than the C++ programmer. This was hot stuff before the advent of Java in 1995. Because DCE had a standardized API, programs were (mostly) portable across implementations. One aspect of DCE was DCE RPC, the Remote Procedure Call API set. This was the communications layer of DCE. There was an IDL (before CORBA's IDL) and a compiler for the IDL that produced server-side stubs and client-side proxies. Hmm, this sounds vaguely similar to something I've heard of more recently (can anyone say wsdl.exe?).

    But once again, interop wasn't magically guaranteed by the consistency of the API across implementations. Interop was guaranteed in DCE by the network data representation, or NDR. In other words, the network data protocol. And trust me on this, there was a ton of work put into the NDR. We had big-endian vs little-endian issues, and other encoding issues. All the same issues we have today, but NDR was binary, which meant, for practical purposes, it was opaque to humans. I remember seeing the rpctrace tool for the first time; built on tcptrace, rpctrace was endowed with an understanding of the NDR, and so it could show you, in real time, what it thought the on-the-wire data packets were. Something like Fiddler for HTTP/SOAP traffic today. Even with rpctrace, diagnosing interop issues between, let's say, HP's DCE v1.0 and Transarc's DCE v1.1 on Solaris, was a ... challenge. And as soon as we rolled in encryption (yes, Kerberos-based message security, based on MD5 hashes and DES encryption (back when DES was good stuff), was integrated with the RPC layer), rpctrace was useless.
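In case the endian point seems abstract, here's a tiny illustration - plain Java, nothing to do with DCE itself - of why a pinned-down wire representation like NDR matters. The same 32-bit value has two different byte layouts depending on byte order; identical APIs on both ends don't help if the two ends disagree about which layout goes on the wire.

    // EndianDemo.java - illustrative only; shows why a wire-level data
    // representation (like DCE's NDR) has to nail down byte order.
    import java.nio.ByteBuffer;
    import java.nio.ByteOrder;

    public class EndianDemo {
        public static void main(String[] args) {
            int value = 0x11223344;

            byte[] big = ByteBuffer.allocate(4)
                    .order(ByteOrder.BIG_ENDIAN).putInt(value).array();
            byte[] little = ByteBuffer.allocate(4)
                    .order(ByteOrder.LITTLE_ENDIAN).putInt(value).array();

            // prints: big-endian:    11 22 33 44
            System.out.printf("big-endian:    %02x %02x %02x %02x%n",
                    big[0], big[1], big[2], big[3]);
            // prints: little-endian: 44 33 22 11
            System.out.printf("little-endian: %02x %02x %02x %02x%n",
                    little[0], little[1], little[2], little[3]);
        }
    }

Now multiply that little puzzle by structs, arrays, strings, floats, and character sets, and you get a feel for the work that went into NDR.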

Have we learned nothing from the past!??!?!?!?

To me, the promise of web services was that in pursuit of interoperability, we are no longer tilting at windmills trying to produce a single, stable, standardized programming model for all programmers, all devices, all nodes, all languages, and so on. The WS-* work the industry has pursued since 1999 shows that we (vendors, customers, developers, pretty much everybody) recognized that protocols were the sine qua non of interop. PROTOCOLS people, not programming models. Protocols, Protocols, Protocols, Protocols, Protocols, Protocols! I say.

And let 1000 flowers bloom! Given a standard protocol, the world can support a myriad of programming models, and they can look like anything they want. As long as each implementation produces the same on-the-wire protocol, they can all intercommunicate. Glory be!
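To put a little code behind that: below is a sketch of a Java client using the JAX-WS Dispatch API to call a SOAP endpoint. The namespace, endpoint URL, and message shape are all hypothetical, but the server behind that URL could be WCF, Axis, or anything else that speaks SOAP 1.1 over HTTP. The client binds to the wire contract, not to the server's programming model.

    // WireLevelClient.java - a sketch using the JAX-WS Dispatch API.
    // Service name, port name, endpoint URL, and payload are hypothetical.
    import java.io.StringReader;
    import javax.xml.namespace.QName;
    import javax.xml.transform.Source;
    import javax.xml.transform.stream.StreamSource;
    import javax.xml.ws.Dispatch;
    import javax.xml.ws.Service;
    import javax.xml.ws.soap.SOAPBinding;

    public class WireLevelClient {
        public static void main(String[] args) {
            QName serviceName = new QName("http://example.com/echo", "EchoService");
            QName portName = new QName("http://example.com/echo", "EchoPort");

            // Describe the endpoint by its wire contract: SOAP 1.1 over HTTP.
            Service service = Service.create(serviceName);
            service.addPort(portName, SOAPBinding.SOAP11HTTP_BINDING,
                    "http://server.example.com/echo");

            // Dispatch works at the XML payload level - no generated proxy,
            // no knowledge of how the server side was programmed.
            Dispatch<Source> dispatch = service.createDispatch(
                    portName, Source.class, Service.Mode.PAYLOAD);

            Source request = new StreamSource(new StringReader(
                    "<Echo xmlns='http://example.com/echo'><msg>hi</msg></Echo>"));
            Source response = dispatch.invoke(request);
            // 'response' is whatever XML came back on the wire; the server
            // could be WCF on Windows or an SCA component in a Java container.
        }
    }

And it works the other way around, too: a WCF client needs only the WSDL and the messages, and can happily call a service written in Java, or against SCA, or in Perl, for that matter.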

In the post I cited earlier, Mr Gardner asked:

why is standardized interoperability at a programmatic level, embedded within widely accepted SOA open standards, any less beneficial?

That is to say, any less beneficial than a standardized wire protocol. I'll tell you why: standardizing on a programming model as a way to boost interoperability has proven to be a failure, repeatedly. [added 1 October 2007: It's a very nifty rhetorical device to coin a phrase such as "standardized interoperability at a programmatic level", but in my view that concept is imaginary. It is a complete fabrication. You cannot "standardize interoperability at a programmatic level." It is not possible. By definition!]

By the way, I wonder sometimes if I am the crazy one, but if so, there are others out there who are similarly demented.

Let's be Clear Here

Look, I am not arguing against the merits of standardized programming models in general. I see what J2EE did for the consortium of Java vendors that drove it - it produced a common programming model, which meant a single set of skills could be portable across vendor implementations. Yes, I said skills portability, not app portability, because while J2EE promised app portability, it never really happened in practice. As we all know, APIs are not the only thing in an app. An app has lots of metadata, and J2EE did not standardize all of that. An app has administrative interfaces (programmatic and user interfaces), and J2EE did not standardize all of that either. As a result, J2EE apps could be "mostly" portable, but never truly portable. [added 1 October 2007: Yes, yes, I know, you can write a JSP app that runs on a bunch of different servlet engines. I am totally clear on that. That to me is interesting, but it is not proof of real-world portability of J2EE in general.]

But skills were mostly portable. That is to say, once a programmer learned the J2EE metaphor, once that person understood MVC and what a session bean was and how a remote EJB method worked, then that knowledge applied across multiple vendor implementations of the J2EE spec. That was valuable, both to the vendors, who could count on a critical mass of devs able to be productive on their products, and to the developers, who had more options.

So I see the benefits of a common programming model. But I also see very clearly that a common programming model does not in any way guarantee interoperability. Am I the only one who sees this?

Keep in mind that the SCA spec itself says that you cannot create cross-domain composites. A composite is a collection of components that are stitched together; according to the spec, those components must all run in the same SCA domain. There is no cross-domain interop provided by SCA, even with its common programming model.

Does this surprise you? It shouldn't!

This raises the question: if a common programming model is beneficial, why shouldn't Microsoft throw in with SCA in pursuit of it? That's a worthwhile question, and one I will leave for another post.

Bottom line:

  • SCA is primarily about programming-model consistency among a consortium of vendors. The cast of characters in SCA looks a lot like the crowd that just left the J2EE party, after Sun began exercising the control it had guaranteed itself when writing the charter of the JCP.
  • Microsoft does not believe that investing in a common programming model boosts interop, nor do we believe, given dynamic languages, structured languages, and the other advances in programming languages that are occurring today and will continue to occur in the foreseeable future, that it is even practical to enforce a common API.

ps: Yes, I am in marketing. Can you not tell, from the length of this post?

Till next time,

-Dino