The Model T and the Prius: Simplicity vs Complexity, yet again


My favorite conundrum, the difficulty of being simple, pops up everywhere I look these days. 

I had an epiphany shortly after buying a Toyota Prius: the thing is an absolute Rube Goldberg machine under the hood, but it is, if anything, simpler to operate than an ordinary car. By contrast, the Ford Model T was so simple in design that it came with all the tools needed to maintain and repair it, and a generation or two of backyard mechanics learned how to do just that. It got 25 miles per gallon, better than Ford’s typical product today. The Prius, on the other hand, is so backyard-mechanic-unfriendly that the 12-volt battery is sealed away under the trunk; there is apparently no way even to use it to perform the one bit of auto repair that almost anyone can do, jump-starting another vehicle. But that complexity returns a big benefit – a Prius has all the interior size and environmental comforts of a conventional midsize sedan, but twice the fuel efficiency of the simple, uncomfortable Model T (and 3-4x that of Ford’s current best seller). The complexity is mostly hidden away from the driver; the only leaks in the abstractions that make it look like an ordinary car are the tricks one can use to optimize fuel economy.

I noticed that Len Bullard recently made essentially the same point I am trying to make:

So it is with the modern combustion engine: it works, it is efficient with respect to the power delivered for the fuel and the weight of the overall vehicle, thus adding more efficiency, but it is not something that the average backyard mechanic can rebuild. The tools required, the parts required and the knowledge of how they work in combination exceed his or her resources, understanding and nerve.

Complexity is just as much in the nature of the evolution of systems as simplicity is desirable. At some point, the demands of an environment or a market require complex solutions. While simplicity is a goal, it can also become a religion just as harmful as fundamentalism when pursued with a sword. Complex systems can do what simple systems cannot do. The goal that all systems be accessible can be met with open standards, but the goal that they be powerful, workable and light might not even as the principles over which they are built remain the same.

Now Donald Norman, the guru of high design, weighs in:

“We want simplicity” cry the people befuddled by all the features of their latest whatever. Do they really mean it?  No.

But when it came time for the journalists to review the simple products they had gathered together, they complained that they lacked what they considered to be “critical” features. So, what do people mean when they ask for simplicity? One-button operation, of course, but with all of their favorite features.

Joel Spolsky elaborates:

In the case of the iPod, the way beauty is provided happens to be through a clean and simple design, but it doesn’t have to be. The Hummer is aesthetically appealing precisely because it’s ugly and complicated.

I think it is a misattribution to say, for example, that the iPod is successful because it lacks features. If you start to believe that, you’ll believe, among other things, that you should take out features to increase your product’s success. With six years of experience running my own software company I can tell you that nothing we have ever done at Fog Creek has increased our revenue more than releasing a new version with more features.

Gadgetopia summarizes — “you need features, but they need to appear simple to the end users”.

There’s a name for the discipline of working within a network of complex constraints to produce something simply usable – engineering. It’s not easy to get the complicated system of batteries, motors, charging systems, and drivetrain in a modern hybrid car to work together efficiently, but some very clever engineers managed to do that. Those who dismiss various XML technologies, or XML itself, because of ugly complexities or unpleasant constraints may someday look as prescient as the folks in the auto industry who killed off the electric car in the 1990s, only to be humbled by the engineers who made the hybrid system practical a few years later.

OpenXML, WS-*, XSD, and XML, for that matter, are not doomed because they are complex, nor are they destined for success because they attempt to hide their complexity from the end user. Those who build infrastructures and applications on top of them are challenged to do good engineering to make their purported features actually work and to appear simple to the ultimate consumers. In the long run we’ll probably end up more or less where automobiles are now – complexity under the hood that would inspire Rube Goldberg, but engineering quality that makes it appear simple to the driver.

Reasonable people can disagree on whether we are currently on the right track to making this happen. If we are, the yelping in the blogosphere might be dismissed as backyard mechanics lamenting the fact that “enterprisey” applications can’t be built with hand tools anymore. If the critics are right, we could plausibly be in the midst of a disruption akin to the 1970s consumer revolt against Detroit-made gas-guzzling behemoths in favor of smaller, better-designed imports. We shall see, but the answer is not at all obvious despite various “Mission Accomplished” declarations by one faction or another.

Here’s what I (personally … obligatory disclaimer about not speaking with the xmlteam hat on) think about the technologies discussed above with respect to the simplicity/complexity dilemma:

  • OpenXML document format vs the Open Document Format? Nobody wins, nobody loses. Specialized libraries and hard work will make OpenXML appear simple enough for most purposes; the underspecification in ODF (e.g. for spreadsheet formulas) will be remedied with spec revisions that add complexity. Actual users will seldom know or care about the underlying formats because modern office apps are already doing a decent job of making all that stuff look simple.
  • W3C XML Schema vs RELAX NG? There’s lots of room for evolution here; neither will win in the long run, but something will emerge that steals ideas from both. XSD 1.1 already addresses some of 1.0’s more egregious flaws and incorporates a lot of ideas from Schematron. In the “2.0” timeframe I’d expect a hybrid schema language that incorporates the best ideas from all three (although I don’t know which organization or company that will come from). It will be as “complex” as XSD measured in features, spec size, etc., but more like RELAX NG in terms of formal underpinnings and readability.
  • WS-* vs REST? We’ll end up with a composite Web/Enterprise architecture that has simple REST-like concepts at the core that will continue to be used by “Web 1.0”, but that can be extended more cleanly than the current WS-* stack to allow multi-hop / multi-protocol reliable messaging, security protocol negotiations, etc. Even if the WS-* stuff disappeared, someone would quickly reinvent something very similar, because the REST primitives are just too primitive to appear simple to any but the most zealous backyard mechanics.
  • XML vs JSON? I’m still undecided about this. I’m feeling just a bit smugly prescient these days because I was once a flamethrower in the “XML is too complex” wars, and people are finally voting with their feet against XML’s complexity. On the other hand, XML has gotten so pervasive now that I don’t see how JSON can achieve success as a data interchange standard (outside the AJAX realm) by the classic disruptive route of competing with non-consumption. It would be lots of fun to reinvent a JSON-based stack-o-stuff to give it a schema language, transformation language, query language … not to mention a “LINQ to JSON” API! I’m not so sure that our customers would find it fun to endure that disruption, however. A well-known and vocal colleague [hint: he likes to remind me that DOM means ‘dumb’ in Dutch] thinks I’m a corporate drone wimp 🙂 for that view … what do you think? (For a concrete feel of the contrast, see the small sketch just after this list.)
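
To make the XML-vs-JSON contrast a bit more concrete, here is a minimal sketch (my own illustration only; the record, field names, and values are invented) that expresses the same tiny document both ways and reads each with nothing but the Python standard library. The point is simply that JSON falls straight into native data structures, while XML brings attributes, mixed content, and the rest of its stack along for the ride:

    # Hypothetical record, invented purely for illustration.
    import json
    import xml.etree.ElementTree as ET

    xml_doc = """<order id="42">
      <customer>Ada Lovelace</customer>
      <total currency="USD">19.95</total>
    </order>"""

    json_doc = '{"order": {"id": 42, "customer": "Ada Lovelace", "total": {"currency": "USD", "amount": 19.95}}}'

    # XML: attribute-vs-element is a modeling decision, and you walk a tree
    # (or reach for XPath, XSLT, XQuery, schema validation ...) to get at the data.
    root = ET.fromstring(xml_doc)
    print(root.get("id"), root.findtext("customer"), root.find("total").get("currency"))

    # JSON: the document *is* a dict/list structure, but (as of this writing)
    # there is no comparably standardized schema, transformation, or query layer on top.
    data = json.loads(json_doc)["order"]
    print(data["id"], data["customer"], data["total"]["currency"])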

Comments (7)

    1. Thanks for an interesting article! I agree with much of what you say. In particular, it is not reasonable to expect complex features from a simple system. Beyond a certain point, to do complex things, the system providing the functionality has to be complex too. (The space shuttle is complex because it has to be, not because the designers like complexity.)

      In my opinion, the iPod is so successful because it provides the appearance of being simple: the case design is so minimalistic that a consumer definitely does not feel threatened. Moreover, even though an iPod is actually not that simple, the designers did a brilliant job at following the "You don’t pay for what you don’t use" principle. Those things that people commonly want to do with an iPod (skip a song, find a song, set the volume) are indeed simple. The complexity comes through only once I navigate to the more advanced menus. (But many people won’t ever navigate there or, if they do, won’t find it surprising that advanced features require a little more complexity.) So, the iPod has the *appearance* of simplicity and, in addition, the most frequently used functions *are* simple. The iPod embodies what many people (including Donald Norman, I suspect) would call good design.

      However, I believe that the analogy you present is inappropriate:

      > Those who dismiss various XML technologies, or XML itself, because of ugly complexities or unpleasant constraints may someday look as prescient as the folks in the auto industry who killed off the electric car […] only to be humbled by the engineers who made the hybrid system practical a few years later.

      There are two problems with that analogy. For one, it implies that XML needs to be as complex as it is, and it implies that there will be a pay-off in the future for all that complexity. Both implications are unproven and untested conjecture. Second, the analogy then wields the stick: "only to be humbled…" In other words, we had better heed the warning, otherwise it might be XML users who will be humbled? So, I find that paragraph both inappropriate and manipulative.

      The complexity problem in general, and for XML and web services in particular, is not about complexity per se, but about *needless* complexity. And few people would disagree that XML and WS-* have long ago eclipsed everything that came before them in terms of complexity, including CORBA, DCOM, and DCE. And yet, web services have not come even close to achieving the kind of showcase applications that CORBA can boast about. In other words, despite about seven years of hype and standardization, web services have not yet reached the level of usefulness reached by CORBA. That’s a rather damning track record, in my opinion. And, by all accounts, the entire WS-* story gets more complex as time goes on.

      Much has been written about needless complexity, including by myself (http://www.acmqueue.org/modules.php?name=Content&pa=showpage&pid=396) and Jim Waldo (http://www.artima.com/weblogs/viewpost.jsp?thread=4840, http://www.artima.com/weblogs/viewpost.jsp?thread=4892). Standards bodies, as long as they don’t commit to standardizing *only* existing best practice, will continue to produce technology that is far more complex than necessary. And, in doing so, they ensure the technology’s demise: eventually, someone bothers to sit down and solve the same problem properly, without all the needless complexity, and that’s the end of the standardized technology.

      Technical excellence (which includes simplicity) is not a sufficient prerequisite for success but, long term, it is a *necessary* prerequisite. XML and WS-* are too far from being technically excellent to be a long-term success, in my opinion.

      Cheers,

      Michi.

    2. bryan rasmussen says:

      " Those who build infrastructures and applications on top of them are challenged to do good engineering to make their purported features actually work and to appear simple to the ultimate consumers. "

      Well, what do you mean by that? Do you mean that people who build XML Schema validators are challenged to do good engineering, or that those who use XML Schema to describe their data structures are challenged to do Good Engineering? I suppose both are of course true, but surely one example of really good engineering would be not using XML Schema to describe one’s data structures, given the well-known incompatibilities between validators in a number of areas.

      If I am starting out on a project shouldn’t one of the primary considerations be choosing tools that are stable?

      The same applies to WS-*

    3. mikechampion says:

      Bryan, I meant that the developers of the schema validators, databinding tools, etc. are going to have a hard (but hopefully manageable) challenge to present the necessary raw functionality in a way that looks easy to the users. If the users do have to be good engineers to use it (like drivers had to be mechanics to use a Model T), then XSD is indeed doomed. And NONE of these tools is stable enough for non-mechanics to use at the moment; the question is which can best evolve to be hidden away under the proverbial hood.

      Michi, "may" was the operative word in the sentence you’re challenging. I personally suspect that XML does have a lot of needless complexity that will get refactored out before it can be hidden away under the hood. The interesting question for me is whether XML can strip off cruft faster than JSON, YAML, or whatever can add functionality and gain credibility. I don’t know. I do think the core XML community should be more worried about this than they seem to be, for the very reason you cite: if standards bodies don’t clean up after themselves, somebody else will, and they will perform the overdue surgery with an axe rather than a scalpel.

    4. This argument sounds a bit forced to me. Part of your argument seems to be *it is a poor artist who blames his tools*. You suggest that it is your users (developers) who must make the effort to make the system beautiful, but it sounds a bit too much like a tool-maker griping. If it is indeed true that these complex tools are not only capable of being used to create beautiful systems, but actually *required*, then the proof will come with the applications. No amount of talk will convince anyone. The backyard mechanics were not silenced by reasoned arguments; they were (for the most part) silenced by the newer cars themselves and how the public embraced them.

      What I think is important to remember is that there is tension between complexity and simplicity at various scales. You address the fact that users want complex systems that appear simple. But it is also the case that developers want to use simple tools to build complex systems that appear simple to users. This is not unreasonable.

      Our entire industry is predicated on the power of abstraction: the encapsulation of complex functionality in simple interfaces. The only way to build more complicated layers of technology is to simplify the way the previous layer is used. You can’t just push the burden of abstraction up to the next level.

      I don’t think any inherent complexity of WS-* or XML is really problematic. What is problematic is much more mundane: incompatible implementations and a difficult debugging story for WS-*, and verbosity for XML. If the complex functionality enabled by these technologies were simple to use, I don’t think you’d see the same complaints about complexity. Instead, we are surrounded by working examples that force us to ask what we stand to gain from these more unwieldy tools. Consider the Ruby language: even the most partisan user of the language would agree that the internals of the runtime are frighteningly inelegant, yet the language itself is heralded as a shining example of a simple-yet-powerful tool for developers.

      While you may see this as an issue of developers resisting the new capabilities of these technologies, your post could also be characterized as a claim that "hard to use" is positively correlated with "more capable."

    5. Gunnar says:

      Good stuff. I think the most important thing is to understand what tradeoffs you are making and why. From a security perspective this is called risk management. From a security design standpoint, yes, a vanilla REST implementation looks a lot simpler than SOAP and WS-Security, but what happens if you figure out that SSL is terminated about 6 hops away from your app? Then you realize that message-level security actually is important, and bolting it onto the side of your REST app may actually make something that is less secure (less proven) and more complex. Bad tradeoff. This is just one example; YMMV, even in a Prius.

    6. mikechampion says:

      Anthony, yes, the argument by analogy is a bit forced, as are all arguments by analogy 🙂 The whole thing is just a meditation on my epiphany that the hybrid car architecture is mind-numbingly complex … but I don’t care. (Assuming it continues to work for the expected lifetime of the car … obviously that’s a risk I’m taking as a relatively early adopter).

      That’s an interesting point that "you can’t push the burden of abstraction up to the next level".  I’ll have to think about that — whether I agree, and what it implies for XML-ish stuff.  

      I’m certainly not claiming that hard to use == more capable.  I’m claiming that complexity per se is not a showstopper, contrary to the implications of numerous sneers against WS-* or XSD. "Hard to use" is a showstopper at the user level, so the question is whether the more complex stuff can be more easily wrapped up in a simple interface than the simple stuff.  Developers should be resisting XSD now because it’s too hard.  Is the way forward to hide it behind databinding / mapping / code generation tools, or ask people to learn something "simpler" but still not what they are asking for? I don’t think there’s an obvious answer.

    7. Mike, I completely agree with you that complexity is not a showstopper. The way I phrase this problem to myself when I’m grappling with it is that, on the one hand, we always try to avoid complexity because we know that the system will tend towards chaos despite our best efforts at order. Therefore, we must do everything in our power to not speed up the movement towards chaos (I’m cribbing a lot from Brooks here).

      On the other hand, many wonderful things are enabled specifically because someone was able to seemingly hop off the continuous, monotonically increasing curve towards complexity, and wrap up a complex system in a surprisingly simple package that "just works." To wit, those wonderful feats of "simple" engineering often bring with them a sense of surprise along with pleasure. I think this is because they come about in an environment dominated by a steady trend of new features bringing proportionally more complexity for the user before someone figured out how to buck the trend.

      This suggests that perhaps there is an element of inspiration in simple engineering that may not even be attainable by the kinds of committees that we employ to form our standards. A team, or even an individual, with a purpose seems far more capable of engineering the simple than a consortium trying to reach a quorum.

      I think (pure opinion here) that the way forward will be someone offering something simpler that is not what everyone asked for, but that most people realize they can make do with (e.g. HTTP).