Convergence Zones


I had a lot of time to think about Elliotte Harold’s call for XML predictions on the way home from Redmond Wednesday night. We got several inches of snow, which is rare here, and the highway folks just can’t deal with it. There were massive traffic tie-ups, and lots of time spent staring off into the snowflakes. Most of us commuters were caught off guard by the snow, since the forecast was something like “cloudy with a chance of snow showers”. That’s not inaccurate, but not very helpful … like most of those “successful” beginning-of-2006 predictions pundits are bragging about this month.

Our unexpectedly intense snow yesterday was created by a Puget Sound Convergence Zone:

Those northwest winds will collide with the Olympic Mountains. Part of the air flow will be deflected east down the Strait of Juan de Fuca, while the other part will be deflected down the western side of the Olympics. When the northern branch reaches the I-5 Corridor and the Cascade Mountains, it will then be forced to the south. Meanwhile, when the southern branch reaches the I-5 corridor and Cascade Mountains past the southern side of the Olympics, it will then turn to the north.

Eventually, the south-flowing branch and the north-flowing branch will converge. When that happens, the air has nowhere to go but up. Rising air will lead to convection. That will lead to cloud and storm development.

That seems like a good metaphor for why technology prediction is hard. The interesting stuff happens in the “convergence zones” where different streams of ideas and technologies collide and create upward convection currents (driven partly by hot air and vapor, I suppose). The World Wide Web arose out of a convergence of the Internet and SGML-ish markup; the Internet Bubble arose out of a convergence of this enabling technology and the vast numbers of PCs in every home and office; XML arose out of a convergence of the capability for universal interoperability spawned by SGML and the Web, and the demand for a Web-like experience in a whole range of document and data products that was accelerated by the Bubble.

But not every wind / technology convergence creates a convergence zone. See the “Meteorological Mumbo-Jumbo” section in this article about our weather yesterday for an explanation of how this weather pattern was different from the other 20 recent occasions when the winds were about right:

This time, the cold air is, at least initially, coming from the Gulf of Alaska. Normally, that pattern doesn’t make for much snow because the air has a long time to moderate over the warmer ocean waters. But the air mass up there is SO cold … the air was still cold enough to snow by the time it made it here.

In technology, lots of things — technologies, personalities, zeitgeist — have to come together at just the right time.   Why was “Dynamic HTML” a bit of a yawner in the late ’90s but “AJAX” a big hit in 2006? Why did the Newton flop and then the Palm become a hit a few years later?  Why is YAML a backwater but JSON the Next Big Thing this year? I don’t pretend to know.  I do think, however, that we know enough about the XML technology landscape and the forces that interact with it and each other to at least talk about some likely convergences and divergences.

First, as the weather truism goes, the best prediction of tomorrow’s weather is today’s weather.  2007 will look a lot like 2006.  Duh.  Overall, XML has become a fairly mature and slow-changing technology because there is a 10-year legacy holding back any radical change, e.g. a mass migration to RELAX NG, JSON, LINQ, or whatever.  We will see a gradual migration to XSLT 2.0, growing interest and support for XQuery (as a query language!), continued improvements in XML Schema 1.0 interoperability and growing interest in 1.1 as it solidifies, and so on.

Second, there’s a strong prevailing wind that puts XML almost everywhere, even places where it might not belong (e.g., multi-gigabyte log files!). But the pervasiveness of XML means that the “XML community” has less and less in common.   The streams that collided with Mt. NonInteroperable and reconverged to create XML 10 years ago have shifted because Mt. NonInteroperable has been greatly eroded.  People care more about interoperable documents OR data OR messages than interoperable documents AND data AND messages, so the fact that XML underlies all of their interop stories is not particularly interesting anymore. What predictions does this pattern imply?

  • Assuming that nothing pops up to reconverge these separate streams, we’ll probably see a continuing decline in the popularity of generic XML forums, books, conferences, etc., and more integration of domain-focused XML topics into more specialized venues.
  • Likewise, we’ll probably see an increase in the use of niche XML-related technologies, such as RELAX NG for purely textual documents and JSON for web applications, in the domains for which they are well-suited.
  • Conversely, we’ll see a decline of interest in “one size fits all” approaches.  For example, it looks (from the outside) like the W3C Efficient XML Interchange people are focusing on one problem — the fact that XML is too bloated for many limited-bandwidth scenarios — and not trying all that hard to come up with something that is smaller / faster / cheaper / more universal than XML for all scenarios. 
  • Finally, the breadth of XML’s adoption comes at the expense of depth. This suggests that we’ll see more of the pattern that Jon Bosak decried in his XML 2006 keynote: “business users will never adopt a solution that depends on an additional XSLT pass because it would require them to learn something new (never mind that this “new” thing had been widely employed in other contexts for years).” Those vendors may know their customers, and suspect that they really don’t want to learn yet another XML technology just because it would offer an elegant solution to a somewhat peripheral problem.

Third, like the wind, XML is mostly invisible — it’s hidden behind the firewall, embedded in ZIP files, or just called something else. What’s more, most people really don’t want to see the XML that surrounds them. They want to see readable documents, updated feeds, processable objects, and delivered services. The better the XML is hidden, the more the bulk of the world likes it. Being out of sight, it’s also out of mind. Few noticed that “Asynchronous JavaScript And XML” actually refers to the XMLHttpRequest API much more than XML content, so the substitution of JSON for XML as the typical bits-on-the-wire format caused little concern (a small sketch after the list below illustrates the point). Actually that understates the case — most people are happy to use more familiar technologies instead of XML or in front of XML. What might we predict from this pattern?

  • I’ll make a (self-serving!) prediction that the LINQ approach of focusing on what is common across data formats will get more mainstream traction than will the notion that users want specialized tools for their XML data.
  • Tools that make it relatively easy to consume XML directly into programming objects, such as LINQ to XSD, will continue to mature technically and be adopted by pragmatic developers.
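
To make the earlier point about “Asynchronous JavaScript And XML” a bit more concrete, here is a minimal, browser-style sketch in modern TypeScript. The /items endpoint and the payload shape are invented for illustration, and this is not LINQ or any particular vendor’s API: the asynchronous XMLHttpRequest call is the same whether the server sends XML or JSON, and only the last step of parsing the response body changes.

    // Illustrative sketch only: the same asynchronous XMLHttpRequest call works
    // whether the server returns XML or JSON; only the response parsing differs.
    // The "/items" endpoint and the { items: [{ name }] } shape are made up.
    function fetchItemNames(url: string, onDone: (names: string[]) => void): void {
      const req = new XMLHttpRequest();
      req.open("GET", url, true); // true = asynchronous, the "A" in AJAX
      req.onreadystatechange = () => {
        if (req.readyState !== 4 || req.status !== 200) return;
        const contentType = req.getResponseHeader("Content-Type") ?? "";
        if (contentType.includes("json")) {
          // JSON on the wire: parse straight into JavaScript objects.
          const data = JSON.parse(req.responseText) as { items: { name: string }[] };
          onDone(data.items.map(item => item.name));
        } else {
          // XML on the wire: walk the DOM that the browser parsed for us.
          const names: string[] = [];
          req.responseXML?.querySelectorAll("item > name")
            .forEach(el => names.push(el.textContent ?? ""));
          onDone(names);
        }
      };
      req.send();
    }

    // Calling code never notices which format came back.
    fetchItemNames("/items", names => console.log(names));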

What about some of the XML technologies that have been swirling around out there … what landfalls may we expect in 2007? Again, the whole point of the argument here is that only incremental change can be foreseen unless XML technologies get tangled up in some larger crosscurrents. So, the Semantic Web will remain highly domain-specific (e.g., the biomedical arena, where well-established taxonomies and ontologies really can be leveraged by OWL inferences, SPARQL query processors, etc.). I can’t think of any larger social forces that might collide to provide upward convection for the Semantic Web technologies, other than some low-probability event such as a major terrorist attack being thwarted as a direct result of Homeland Security’s semantic technology investment. How about SVG, or XML 1.1, or some other major XML technology that hasn’t gotten mainstream support (ahem, partly because of decisions in Redmond)? I don’t foresee any major shifts in the winds here … but then again I didn’t know about the Puget Sound Convergence Zone the other day until the streets near home were iced over.

What great technological, economic, or political forces can you envision changing the XML climate in the next year?


Comments (6)

  1. len says:

    None really. Batteries are the next big thing. 🙂

    It may be that 2007 is the year of divergences.

    If it is any symptom of anything other than local concerns, I find myself less and less concerned with generic XML or even XML at all, and more concerned with working applications, specifically X3D.

    Convergences in the real-time 3D worlds are increasing momentum there that will result in different applications and possibly a return to the focus on CD delivery over web delivery. The tensions between the technological haves and the content have-nots are forcing the content providers to rethink web applications and their impact on revenue recognition.

  2. The thing that XML, JSON, AJAX and so on have in common is that they all re-invigorate/re-brand/re-jig established standard technologies after some necessary technical infrastructure has become common. XML brings together ISO SGML, URLs, and Unicode with the infrastructure of page-based Web servers to serve documents; AJAX brings together ECMA JavaScript and the infrastructure of document-based Web servers to serve data; JSON brings together ECMA JavaScript and the infrastructure of data-based Web servers. People naturally move from pages to small documents to data.

    So for Linq, the best approach would be to release it open source, let variants emerge, standardize it, wait for 7 years until the necessary technical & business environment (and the meme!) establishes itself, and see whether it gets re-invigorated/re-branded/re-established. Linq should have a 10-year strategy if it wants to get that kind of grassroots uptake.

    Ideas take time to get established. There is little other way around it. What’s next? Well, I would say ODF/OOXML are looking good for the next big things: servers progressing from serving small documents/data to serving mid-sized documents & data. They are both being standardized, have a decent history and backing infrastructure, etc.

    A standard is a kind of open source API where the creators commit themselves to scrutiny, to modest change in response to external review, to transparency of process, to not nobbling the total market by hogging their share, and to technology that works by the spec rather than specs that idealize or lie about the fixed technology. They are in the users’ interest in many cases.

    ISO SGML & W3C XML, ISO/IEC/ECMA EcmaScript & AJAX, ISO/IEC/ECMA EcmaScript & JSON; they aren’t successful because they are standards, they are successful because their standardization by pioneers created agreed, reviewed, public, and stable specifications that were ready on the shelf to be bought by Joe Public when the infrastructure and memes were right. Standards are a library of stable technological possibilities. Since Michael is comparing Linq with XML/AJAX/JSON, what is its story with open standardization?

  3. I think you are misquoting Jon more than a little, Mike, in your bullet point "Finally, the breadth of XML’s adoption comes at the expense of depth."

    You make it sound as if Jon is saying that business users won’t adopt Schematron/XSLT validation and that vendors are wisely aware of their customers and just following the market.

    However, Jon is saying the reverse. It is the vendors who are calling the shots regardless of the actual needs of users, such as him and UBL. It is the vendors who have given up the depth, vendors who claim that problems that don’t fit into their technologies are "somewhat peripheral". Not the users.

    Jon’s quote is not "business users will never adopt a solution that depends on an additional XSLT pass"  but "Furthermore (I was told), business users will never adopt a solution that depends on an additional XSLT pass."  In other words, he is quoting vendors rather than giving an opinion he himself endorses.

    You left out the "(I was told)" and so turned his chastisement of vendors into a chastisement of users. I suppose, in a sense, Jon is actually complaining about people in your role,  so it is natural and interesting to have your response or deflection.

    Jon’s speech has a major section on how UBL solved a long-standing problem simply with established and straightforward technology, only to have vendors say "XSLT is too complex for our users." Jon is chastising vendors for not providing solutions to users’ requirements.

    And what do we find on this Microsoft blog here? Jon being misquoted to say that his "big shock" is a user phenomenon rather than a result of vendor non-agility. On Schematron, just because Jon is employed by Sun doesn’t mean Microsoft should reject his words; he has been right before, you know!

    (I should clarify: I don’t think Jon is being remotely personal in his speech. I haven’t communicated with Jon about it though.)

  4. mikechampion says:

    On the standards comment, especially "Standards are a library of stable technological possibilities", I guess I have a different perspective: Standards are technological REALITIES that one can use with some confidence that they are supported by at least a critical mass of some audience. Given the need to support products we release for many years, and given the training / documentation / translation / etc. burden of releasing them to a worldwide audience, we simply can’t afford to implement every XML spec that comes along and put it on the shelf in case somebody finds it useful. It would be nice to have community implementations of a wider range of specs on the shelf, but that’s not happening, for a number of good and bad reasons.

    As for LINQ and open standardization, there is no story I’m aware of one way or the other. As far as I know, the basic ideas have been around since at least Haskell, so there is plenty of room for competitors and open source projects to innovate in similar ways. They can also implement LINQ providers for other data sources or underlying technologies, e.g. something akin to "LINQ to JSON", "LINQ to Oracle", "LINQ to XQuery", etc. But the bottom line for me is that until this stuff proves itself in the real world, talk of standardization is premature.

    On the Bosak quote, I cut and pasted from the linked article, so I don’t think I misquoted.  I thought I made it clear that he "decried" this attitude, not approved it.  Maybe stripping off the "I was told" and trying to quickly set the context was confusing, sorry.  I’ll look it over and consider revising to make it clearer what Bosak’s own position is.

    I’m not sure who the vendors he’s chastising are, and I don’t think it’s a Sun vs MS thing.  We fully support XSLT and advocate it for appropriate situations; if we had any interest in UBL I suspect we would appreciate the approach Bosak favors, which is completely appropriate from a technical point of view.  My *prediction*, not my preference, is that this fact will not carry much weight in the wider world.  Those unnamed vendors aren’t necessarily clueless (as Bosak seems to imply), they may calculate that a good enough but ugly solution with one XML technology is more practical than a better, cleaner solution with two.  Not many people bet against "worse is better" and come out ahead.  Anyway, that’s one prediction I’d be very happy to be wrong about.

    So, I don’t think I misrepresented Bosak’s position, but I do disagree if the moral of his anecdote is about vendor non-agility rather than user resistance. I see it as part of a much larger phenomenon of user resistance to the more sophisticated bits of the XML corpus. (I’m talking about mainstream corporate developers just trying to get their jobs done, most of whom do not currently use the tools either of us develop!)

    If the moral of your story is that the world would be better off if people used XSLT / Schematron rather than XSD for problems such as the one Bosak describes, I personally agree. I found his talk very persuasive as a case study of the limitations of the grammar-based approach that you have been educating people about for years. I suspect, however, that the typical paying customer will be happier with the Schematron-like features in XSD 1.1 than they would be with having yet another XML spec to think about, but we’re keeping an open mind on what to do about both specs.

  5. Kurt Cagle says:

    Mike,

    On LINQ: several things will make a major difference here. Until you get LINQ into IE, it will be an also-ran technology, no matter how technically superior it is to anything else (re: E4X) out there. I’d recommend opening up a VERY high-level summit between yourself, John Schneider at Agile Delta, and Brendan Eich to see whether it would be possible to push LINQ-like technology into E4X. The latter is still relatively immature, and is fairly malleable even now, but with both Flash and Mozilla adopting it (and others, like Opera, poised to) I see E4X gaining far more market share than LINQ in that space, and I think that space in general will tend to dominate developer mindshare for years to come.

    I ironically see JSON (or more properly, JavaScript object entities) and E4X as being VERY complementary technologies, largely because the notation provides a lightweight mechanism for discretized packaging of XML fragments without having to get into all of the headaches associated with namespaces … something that may appall longtime XML developers but which is pretty attractive to web developers looking at keeping transport content lightweight. I WOULD like to see some of the Haskell-like features of LINQ migrate their way into E4X (monads, anyone?), but I think the trend-lines point to E4X gaining dominance, unless you get one of those particularly Puget Sound micro-climate effects that change the rules completely.

    — Kurt

  6. Hi Mike,

    I think I was stuck in the same traffic with you in Redmond Wednesday night. What a fantastic storm — reminds me of my days back east!

    Given that your post conjured up Efficient XML (aka binary XML), E4X and a mention of me personally (thanks Kurt!), I could hardly resist jumping in. 🙂

    First, I should clarify that the W3C Efficient XML Interchange group is focused on far more than XML bloat and limited bandwidth scenarios. They are focused on a very wide range of use cases that need speed, compactness and a wide range of other characteristics. They just completed a comprehensive review of binary XML proposals looking at size, encode speed, decode speed, and other requirements across a wide range of XML applications and data (messages, data, documents, web services, SVG, financial, scientific, military, etc.). The results show that you can get excellent size and speed improvements simultaneously across the full range of use cases with only one format. They selected Efficient XML as the basis for the standard because it was one of the fastest formats AND was consistently smaller across the full range of tests. BTW the speed tests were conducted in-memory (simulating the fastest possible network) rather than over low bandwidth networks.

    On E4X, Kurt is right on target. There are some kinds of data where JSON works great and others where XML works great. E4X allows you to blend the two together to get the best of both worlds. In E4X, XML objects *are* JSON objects. As such, they can be arbitrarily nested one inside the other. E4X has been adopted by Mozilla and Flash, and Opera, Apple, Adobe and others are also pursuing it. Once Internet Explorer catches up, everyone will have it. ;-> The power of E4X is that it builds on top of the widely understood and deployed JavaScript base, adding a minimal layer to support XML as a native JavaScript concept rather than introducing an unnecessarily complex array of new paradigms. And of course, it’s been an approved JavaScript standard for well over 2 years.

    Tongue-in-cheek comments aside, I have a great respect for Microsoft and believe you and your customers could greatly benefit from both E4X and Efficient XML in 2007. I’d be happy to participate in the kind of summit Kurt recommends.

     All the best!

     John