I had a lot of time to think about Elliotte Harold’s call for XML predictions on the way home from Redmond Wednesday night. We got several inches of snow, which is rare here, and the highway folks just can’t deal with it. There were massive traffic tie-ups, and lots of time spent staring off into the snowflakes. Most of us commuters were caught off guard by the snow, since the forecast was something like “cloudy with a chance of snow showers”. That’s not inaccurate, but it’s not very helpful … like most of those “successful” beginning-of-2006 predictions pundits are bragging about this month.
Our unexpectedly intense snow yesterday was created by a Puget Sound Convergence Zone:
Those northwest winds will collide with the Olympic Mountains. Part of the air flow will be deflected east down the Strait of Juan de Fuca, while the other part will be deflected down the western side of the Olympics. When the northern branch reaches the I-5 Corridor and the Cascade Mountains, it will then be forced to the south. Meanwhile, when the southern branch reaches the I-5 corridor and Cascade Mountains past the southern side of the Olympics, it will then turn to the north.
Eventually, the south-flowing branch and the north-flowing branch will converge. When that happens, the air has nowhere to go but up. Rising air will lead to convection. That will lead to cloud and storm development.
That seems like a good metaphor for why technology prediction is hard. The interesting stuff happens in the “convergence zones” where different streams of ideas and technologies collide and create upward convection currents (driven partly by hot air and vapor, I suppose). The World Wide Web arose out of a convergence of the internet and SGML-ish markup; the Internet Bubble arose out of a convergence of this enabling technology and the vast numbers of PCs in every home and office; XML arose out of a convergence of the capability for universal interoperability spawned by SGML and the Web, and the demand for a Web-like experience in a whole range of document and data products that was accelerated by the Bubble.
But not every wind / technology convergence creates a convergence zone. See the “Meteorological Mumbo-Jumbo” section in this article about our weather yesterday for an explanation of how this weather pattern was different from the 20 other recent occasions when the winds were about right:
This time, the cold air is, at least initially, coming from the Gulf of Alaska. Normally, that pattern doesn’t make for much snow because the air has a long time to moderate over the warmer ocean waters. But the air mass up there is SO cold … the air was still cold enough to snow by the time it made it here.
In technology, lots of things — technologies, personalities, zeitgeist — have to come together at just the right time. Why was “Dynamic HTML” a bit of a yawner in the late ’90s but “AJAX” a big hit in 2006? Why did the Newton flop and then the Palm become a hit a few years later? Why is YAML a backwater but JSON the Next Big Thing this year? I don’t pretend to know. I do think, however, that we know enough about the XML technology landscape and the forces that interact with it and each other to at least talk about some likely convergences and divergences.
First, as the weather truism goes, the best prediction of tomorrow’s weather is today’s weather. 2007 will look a lot like 2006. Duh. Overall, XML has become a fairly mature and slow-changing technology because there is a 10-year legacy holding back any radical change, e.g., a mass migration to RELAX NG, JSON, LINQ, or whatever. We will see a gradual migration to XSLT 2.0, growing interest and support for XQuery (as a query language!), continued improvements in XML Schema 1.0 interoperability and growing interest in 1.1 as it solidifies, and so on.
Second, there’s a strong prevailing wind that puts XML almost everywhere, even places where it might not belong (e.g., multi-gigabyte log files!). But the pervasiveness of XML means that the “XML community” has less and less in common. The streams that collided with Mt. NonInteroperable and reconverged to create XML 10 years ago have shifted because Mt. NonInteroperable has been greatly eroded. People care more about interoperable documents OR data OR messages than interoperable documents AND data AND messages, so the fact that XML underlies all of their interop stories is not particularly interesting anymore. What predictions does this pattern imply?
- Assuming that nothing pops up to reconverge these separate streams, we’ll probably see a continuing decline in the popularity of generic XML forums, books, conferences, etc., and more integration of domain-focused XML topics into more specialized venues.
- Likewise, we’ll probably see an increase in the use of niche XML-related technologies, such as RELAX NG for purely textual documents and JSON for web applications, in the domains for which they are well-suited.
- Conversely, we’ll see a decline of interest in “one size fits all” approaches. For example, it looks (from the outside) like the W3C Efficient XML Interchange people are focusing on one problem — the fact that XML is too bloated for many limited-bandwidth scenarios — and not trying all that hard to come up with something that is smaller / faster / cheaper / more universal than XML for all scenarios.
- Finally, the breadth of XML’s adoption comes at the expense of depth. This suggests that we’ll see more of the pattern that Jon Bosak decried in his XML 2006 keynote: “business users will never adopt a solution that depends on an additional XSLT pass because it would require them to learn something new (never mind that this ‘new’ thing had been widely employed in other contexts for years).” Vendors who take that line may know their customers, and suspect that they really don’t want to learn yet another XML technology just because it would offer an elegant solution to a somewhat peripheral problem.
- I’ll make a (self-serving!) prediction that the LINQ approach of focusing on what is common across data formats will get more mainstream traction than will the notion that users want specialized tools for their XML data.
- Tools that make it relatively easy to consume XML directly into programming objects such as LINQ to XSD will continue to mature technically and be adopted by pragmatic developers.
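To make a couple of the points in that list concrete — JSON as a niche format for web applications, and tools that project XML directly into programming objects — here’s a rough sketch in Python rather than LINQ (C# and the LINQ to XML APIs aren’t reproduced here; the element names and the `orders_as_dicts` helper are purely hypothetical, but the shape of the idea carries over):

```python
# Hypothetical sketch: consume XML directly into plain programming
# objects, then hand the same objects to a JSON-speaking web front end.
import json
import xml.etree.ElementTree as ET

XML = """
<orders>
  <order id="1"><customer>Acme</customer><total>19.99</total></order>
  <order id="2"><customer>Globex</customer><total>5.00</total></order>
</orders>
"""

def orders_as_dicts(xml_text):
    """Project each <order> element into a plain dict with typed fields."""
    root = ET.fromstring(xml_text)
    return [
        {
            "id": int(o.get("id")),
            "customer": o.findtext("customer"),
            "total": float(o.findtext("total")),
        }
        for o in root.findall("order")
    ]

orders = orders_as_dicts(XML)
# Once the data is ordinary objects, JSON serialization is one line —
# which is a big part of JSON's appeal in the web-application niche.
print(json.dumps(orders))
```

The design point is that the developer reasons about ordinary objects and lists, not about the markup; whether the wire format underneath is XML or JSON becomes an implementation detail.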
What about some of the XML technologies that have been swirling around out there … What landfalls may we expect in 2007? Again, the whole point of the argument here is that only incremental change can be foreseen unless XML technologies get tangled up in some larger crosscurrents. So, the Semantic Web will remain highly domain-specific (e.g., the biomedical arena where well-established taxonomies and ontologies really can be leveraged by OWL inferences, SPARQL query processors, etc.). I can’t think of any larger social forces that might collide to provide upward convection for the semantic web technologies other than some low-probability event such as a major terrorist attack being thwarted as a direct result of Homeland Security’s semantic technology investment. How about SVG, or XML 1.1, or some other major XML technology that hasn’t gotten mainstream support (ahem, partly because of decisions in Redmond)? I don’t foresee any major shifts in the winds here … but then again I didn’t know about the Puget Sound Convergence Zone the other day until the streets near home were iced over.
What great technological, economic, or political forces can you envision changing the XML climate in the next year?