The JSON vs XML debate begins in earnest


After seeing Douglas Crockford’s talk on JSON at XML 2006 recently, I figured that some sort of great debate between XML and JSON advocates was brewing.  I had been waiting for Elliotte Harold’s rebuttal of what Crockford is missing, but haven’t seen it yet.  What has happened is that Dave Winer got off a rant against JSON as reinventing XML’s (and more specifically XML-RPC’s) wheel:

God bless the re-inventers

Gotta love em, because there’s no way they’re going to stop breaking what works, and fixing what don’t need no fixing

Crockford gets off a very pithy response: 

The good thing about reinventing the wheel is that you can get a round one.

There are a lot of other very interesting observations in that comment thread.  Some points that hadn’t been obvious before to me include:

  • There seemed to be a lot more JSON fans than XML fans in that thread (maybe because the original post was just a wee bit inflammatory)
  • JSON may be something like 100x faster to parse than XML in today’s browsers (but I doubt very much if the best JSON parsers are anywhere near that much faster than the best XML parsers … it would be interesting to know!)
  • JSON parsing ends up with something akin to a typed “business object” rather than an untyped DOM tree
  • To do that in XML requires yet another layer or two of cruft (a schema and a databinding tool)
  • The bottom line argument for JSON comes down to elegance — it does what it does simply and cleanly, and simply refuses to worry about many of the things that complicate XML such as metadata (attributes), comments, processing instructions, a schema language, and namespaces.

As a 10-year XML veteran, and informal minister of propaganda for the “XML Team”, aren’t I supposed to leap to XML’s defense?  I just can’t summon the energy.  First, XML’s lack of elegance, presumably the result of its genesis in a series of committee rooms and compromises, is unarguable. One just can’t look at it without remembering the old saw that a “camel is a horse designed by a committee.” I’m not sure if that’s a problem, since I like the comeback “but a camel is a lot more sturdy beast if you are exploring the unknown.”  Still, horses’ elegance generates a lot more enthusiasm in the rest of the world than camels’ sturdy pragmatism does.

Second, it’s also hard to argue with the proposition that the XML wheel could be a lot rounder.  As far as I can tell, there is just about zero enthusiasm among XML cognoscenti for re-opening the debates from a decade ago about which aspects of XML (or SGML) are “features” and which are “bugs” — it depends on who is doing what, and whether there are enough people doing it to matter.  By making evolution toward something more simple and secure impossible, the XML community made something like the JSON revolution inevitable.  That doesn’t mean that the revolutionaries will win; after all, JSON’s own limitations will become apparent only once it is tested in scenarios that its designers did not anticipate. At a minimum, a few key victories for the revolutionaries might motivate the old guard to make some needed reforms.

Third, the fact that the argument for JSON comes down to a matter of elegance doesn’t bode well for its ultimate success.  It’s hard to forget the famous lament of a LISP devotee reflecting on its lack of success against the much less elegant C/Unix/etc. competition: Worse is Better.  I wonder how many of the legitimate challenges that people are finding easier to solve with JSON than XML (especially the infamous cross-domain data mashup problem imposed by XmlHttpRequest’s security model) might be solved more expediently with some small tweaks to the XML infrastructure rather than a wholesale adoption of a disruptive innovation such as JSON.
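For readers unfamiliar with the cross-domain workaround in question (later dubbed “JSONP”), here is a minimal sketch; all names are hypothetical, and the browser’s script delivery is simulated with eval so the mechanism is visible.  A &lt;script&gt; tag may load from any domain, unlike XmlHttpRequest, so the server wraps its JSON in a call to a callback the page has defined:

```javascript
// Client side: the callback that the remote response will invoke.
let received = null;
function handleData(data) { received = data; }

// What the server returns for a request like
//   <script src="https://api.example.com/search?callback=handleData">
// (URL and callback parameter name are hypothetical conventions):
const responseBody = 'handleData({"results": ["a", "b", "c"]})';

// Simulate the browser executing the returned script text:
eval(responseBody);
console.log(received.results.length); // 3
```

The data arrives as executable script rather than as a document to parse, which is exactly why this slips past XmlHttpRequest’s same-origin restriction — and also why it requires trusting the remote server completely.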

Finally, in the larger scheme of things it doesn’t matter.  What does matter is that there be standardized, widely supported means for making data interoperable across applications, platforms, programming languages, and time.  Life would be easier for us infrastructure implementers if there were a single, stable standard, but it’s unrealistic to expect that XML 1.0 would be the last word on the subject.  We will cope with whatever happens — small tweaks to address critical bugs that JSON illuminates, multiple de facto data interoperability standards,  guided evolution of XML to be a better universal data interchange format, or wholesale revolution to produce a better world.


It would be nice if the JSON vs XML debate does not go the way of the “REST vs Web Services” perma-talking-past-one-another-fest.  Ultimately they are very likely to end up in the same place.  As Eric Newcomer puts it:

once a technology passes the hype cycle and becomes adopted, all its warts and bumps become more obvious as we find out what it is really good for, and what it is not…we should no more propose Web services as the right solution for everything than we should propose REST as the right solution for everything.

I hope nobody proposes JSON as the right solution for everything, and that the debate reminds us that XML is not the right solution for everything. One way or another, this debate will either take us toward a couple of rounder wheels better fitted to their specialized purposes, or inspire a next-generation wheel that really does better than either does today.



Quick update with some other interesting links (Winer’s piece made the top of Techmeme while I was typing…):

Dare Obasanjo

The obvious reaction was to make the Google and announcements into a REST vs. SOAP or XML vs. JSON story since geeks like to turn every business decision into a technology decision. However if you scratch the surface, the one thing that is slowly becoming clear is that providers of data services would rather provide you their data in ways they can explicitly monetize (e.g. driving traffic to their social bookmarking site or showing their search ads) instead of letting you drain their resources for free no matter how much geek cred it gets them in the blogosphere.

Simon Willison

The sweet spot for JSON is serializing simple data structures for transfer between programming languages. If you need more complex data structures (maybe with some kind of schema for validation), use XML. If you want to do full blown RPC use SOAP or XML-RPC. If you just want a light-weight format for moving data around, JSON fits the bill admirably.

What do we lose from not using XML? The ability to use XML tools. If you’re someone who breathes XSLT that might be a problem; if like me your approach when faced with XML is to parse it in to a more agreeable data structure as soon as possible you’ll find JSON far more productive.

Comments (14)

  1. Soumitra says:

    Good to hear your level headed assessment of this issue.  I have a few observations:

    a. Aside from AJAX apps and APIs that support these apps, where else is JSON being used?

    b. Interop is a whole lot more than just serializing data structures.  For example, I for one have not found an easy way to serialize graphs using JSON.  The serialization format is the least of our problems.

    c. Cross domain scripting has nothing to do with XML or JSON.  

    d. Any standard that has no path to support change will die.

    Having said all these, one reason I love JSON is that it will get all us XML geeks riled up and thinking about how we can simplify XML.  And that will be a good thing.

  2. MCChampion says:

    " one reason I love JSON is that it will get all us XML geeks riled up and thinking about how we can simplify XML"  Yes!

    I think that cross domain scripting is easier with JSON because browsers will let code within a <script> tag pull from multiple domains, but XmlHttpRequest will not.  I only know what I read in comment threads, I don’t have first hand experience.

  3. Dave says:

    The ability to cross domains using <script> with JSON is very handy. Also, since most uses of JSON are with Javascript, it’s an easy-to-understand format.

    As for serializing graphs, don’t! You serialize the data, and then the client can choose how to use that data once deserialized. What would it mean to serialize graphs, anyway? Do you mean somehow serialize the presentation details and send it over the wire?

  4. Serializing graphs is one specific reason that I take the position that you really ought to stick to looking at JSON as "a technology to output something that can be ‘eval’ed in a browser really quickly and efficiently". If you do that, then you can actually ship down pseudo-JSON that looks something like

    {nodes: [{node1: {children: []}}, {node2: {children: [node('node1'), node('node13')]}}]}

    (Forgive me if I got the delimiters wrong.) Then, if you have a ‘node’ function in the namespace you run the ‘eval’ in, that function can provide services to tie the graph together at the end, without a separate post-processing phase. (In this case that may not be such an advantage but there can be some cases where it is.)

    And I don’t guarantee that there is *still* a 100x speed advantage over XML, just that there was in Mozilla in the summer of 2005 (IIRC). But I would expect there to still be some difference as creating Javascript objects is much easier than creating DOM nodes, so even with fully optimized versions of both parsers it would make sense that eval(JSONstring) would be faster.
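The technique described above can be made concrete with a small runnable sketch.  All names here (makeNode, node, the ids) are hypothetical; the point is that a node() helper available in the eval scope resolves cross-references while the payload is evaluated, so shared nodes come out as genuinely shared objects with no separate post-processing pass:

```javascript
const registry = {};                  // id -> node object
function makeNode(id, children) {     // build and register a node
  const n = { id, children };
  registry[id] = n;
  return n;
}
function node(id) {                   // reference a node by id; returns
  return () => registry[id];          // a thunk, resolved after eval
}
// The pseudo-JSON payload the server would emit (not strict JSON, by design):
const payload = "[ makeNode('n1', []), makeNode('n2', [node('n1')]) ]";
const nodes = eval(payload);
const child = nodes[1].children[0](); // resolve the reference
console.log(child === nodes[0]);      // true — one genuinely shared object
```

Returning a thunk rather than looking the id up immediately also lets the payload reference nodes that haven’t been defined yet, which plain JSON has no way to express at all.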

  5. h3 says:

    This is a question that I had to ask myself at some point. At the beginning I liked XML, but after trying JSON it’s hard to tell yourself that there’s nothing wrong with XML’s over-complexity.

    I still have not chosen one over the other; I just use my intuition to decide which one would do the better job in the current context. But to be honest, I tend to choose JSON more often for its simplicity. Why? Because most of the tasks I need to do are simple, but for a configuration file I happily use XML 🙂

    The bottom line is use the right tool for the right job.

    And for the hardcore XML lovers who’d like to simplify it, you are not the first. YAML is a nice alternative; the symfony PHP framework uses it.

  6. fishbane says:

    I find it hard to believe that someone would locate difficulty in marshalling a graph in the JSON-ish fashion. It is just a hash of (possibly) more complex datatypes.

    I don’t even care for these sorts of wars. (XML, I deal with, whatever. Sometimes the notation is handy, usually it is a pain.  JSON, whatever. One link in the chain is "solved". Good job, here’s a cookie. Go trade it into a career with ISO over at that desk.) I mean, other than Winer and corporate product managers, who cares about this sort of crap?

  7. MCChampion says:

    I’m not sure if I count as "corporate product manager", but I care because we need to figure out where to invest our resources to make data programmability better/faster/easier. If JSON is going to be used in a lot of the ways XML is used today, we should probably be doing more.  If, on the other hand, it’s clear that JSON has intrinsic limits that most people who try this will stumble over, maybe we should discourage its use outside the AJAX realm.

  8. rogerv says:

    Hey Michael, ran across this link where this debate is being reported:

    Debate: JSON vs. XML as a data interchange format

    They cite this comment attributed to you:


    Mike also commented his own opinion that he sees JSON as a good solution for browser-server communication, "but not a serious contender for interoperability purposes."


    Hmm, I tend to agree with this.

    The Google GWT AJAX toolset is an excellent example of where its use of JSON for the GWT async RPC (built on top of XmlHttpRequest) is very well suited. It’s rather straightforward to serialize JSON as Java objects on the server side and as JavaScript objects on the client browser side.

    However, I am not tempted to switch over from XML to JSON for the messaging format in my behind-the-firewall enterprise distributed applications. What I like about XML in that regard is that it can be even more loosely coupled by avoiding direct object serialization altogether and instead using a technique that I refer to as "XML data duck typing".

    It is much easier to evolve a distributed application when the coupling between nodes is not overly rigid. Object serialization couples too tightly and causes a distributed app to be too brittle.

    Yeah, it didn’t take long to learn that the ease of evolving distributed app software far outweighs any so-called convenience factor derived from object serialization.

  9. Matt Stark says:

    This post is limited to developing RIAs … I am an application developer, an architect, and a technical lead.  Have many of you ever explained to a junior JavaScript developer how to use HTTP requests and parse XML inside different browsers?  Have you tried to explain the <script> tag?  Which causes more confusion?


    1) You do not require a JSON parser, nor do you need the eval statement to "process" JSON … this is done for security; the same flaws that would prevail in JSON could surely prevail in XML, depending on your 100% mandatory parser requirement in the XML space.

    2) Serialization is done at the data level; data is then serialized into interface UIs.  Hence the graph serialization issues mentioned here have no validity.  You continue to have to "deserialize" XML into JavaScript data structures or JSON objects.  With JSON you eliminate this pointless and mutually redundant processing of your requests.

    3) The purpose of JSON is the same as developing .Net Generics, and as Yukon spits out custom .Net types.  It speaks to how you load, manipulate, and work with data in your Web Client Application layer.  All we are asking for is that it spit out a pre-serialized JavaScript object rather than a huge XML file.  This makes sense.

    4) JSON isn’t a replacement for XML, and in fact I’m sure almost all programmers would rather work with Widget APIs (e.g. JavaScript APIs) than understand the service contracts and complexities which are inherent in working with Web Services.  JavaScript is a simple ‘standardized’ do-what-you-want language. Please don’t standardize our data structures, as it’s this flexibility that allowed us to come up with this approach in the first place.

    5) JSON is not a replacement for service-based standalone applications – nor for situations where a web application consumes back-end business systems via web services.  I’d actually prefer that to be XML, as there are tons of tools available for that approach already.

    There are a lot of people here who are totally confusing this stuff as a replacement for XML.  It isn’t that; it’s just another option for the web-based RIA transport layer …



  10. I missed lecture on JSON last week, so I’m trying to teach myself the basics. From the book, and from a blog I was reading, it looks like JSON does the same thing as XML using the Javascript language. If…

  11. After seeing Douglas Crockford’s talk on JSON at XML 2006 recently, I figured that some sort of great debate between XML and JSON advocates was brewing. I had been waiting for Elliotte Harold’s rebuttal of what Crockford is missing, but haven’t seen i