The Danger in Relying on the Electric Utility Model for the Cloud.

But first: Why I haven’t been posting as much lately

Every so often I read a post that starts with an explanation of “why I haven’t been posting lately”. So here we go. 🙂

The reason I haven’t been posting as much lately is that I have been focusing on amplifying worthwhile Windows Azure Platform posts from other people’s blogs. My team has a team amplification blog: each member of our Architecture Evangelism team focuses weekly on a separate area of new Microsoft technology and innovation. We try to identify for you the most relevant posts on our given topic and then write a short post about them. Given the state of the Windows Azure Platform and the fact that new features are being released at Internet speed, there is no dearth of material to amplify. Take a look at the articles on the Innovation Showcase blog. I am sure you will not be disappointed.


About this post: The Danger in Relying on the Electric Utility Model for the Cloud

Sitting in my nth Cloud symposium last week, I again heard the argument that the Cloud should be modeled after an electric power utility: you just plug into it and stuff works. This analogy has been used in many places, most recently in Nicholas Carr’s book The Big Switch: Rewiring the World, from Edison to Google. In that book the author chronicles the evolution of electric power from being self-generated by factories located on the banks of rivers, and by individual cities for their own use, to the utility model, in which extensive distribution systems allow power to be generated centrally and then distributed to cities, factories, and homes. His thesis, shared by others, is that computing is evolving toward that utility model.

I think this model has some validity, but only to a degree.

Don’t get me wrong: architects love models. They help us take a complex, often unmanageable situation and make it manageable. But the problem is that all models break down at some point. One of my favorite sayings, attributed to Albert Einstein, is that “things should be made as simple as possible, but no simpler.” Sometimes models work, but only within a given range. For instance, the models of Newtonian physics worked at non-relativistic speeds but failed at relativistic speeds, forcing Einstein to invent new models.

Building architecture as a model for computer system architecture also fits, again within limits. (I have never seen anyone build a building from the top down, or replace a building’s foundation while the building was still standing. 🙂) With models it is sometimes more interesting to investigate where the model breaks down than where it is valid. That is what the whole study of patterns and anti-patterns is about.

Using an electric power utility as the model for Cloud (utility) computing likewise works only over a limited range.

Early in my career I had the good fortune of working for a company that specialized in developing systems for electric power utilities such as Long Island Lighting and Duke Power. My experience with that industry, along with some later experience building data centers, has led me to believe that this model is at best an oversimplification that holds only over a limited range. The true model is far more complex.

Home power, as seen by the consumer, is pretty simple: it comes to the electric outlet, and you can plug anything into it and it “just works.” Try to build a factory or a data center, and the situation is much more complex. You need to meet with the power company beforehand to predict your expected usage so that they can “provision” the appropriate capacity for you. They may even need to build a substation next door to your factory or data center. Certainly they will need to make sure that their generating plant and distribution lines can support the load. Sound familiar? 🙂
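That provisioning negotiation is, at bottom, simple capacity arithmetic: the consumer forecasts a peak, and the provider stands up that much plus headroom. A minimal sketch, with entirely hypothetical numbers and a made-up `provisioned_capacity` helper (nothing here reflects a real utility’s or cloud provider’s planning model):

```python
def provisioned_capacity(expected_peak_mw: float, headroom: float = 0.25) -> float:
    """Capacity the provider must stand up: the forecast peak plus a safety margin.

    The 25% default headroom is an illustrative assumption, not an
    industry figure.
    """
    return expected_peak_mw * (1 + headroom)

# A factory forecasting a 4 MW peak load gets 5 MW provisioned,
# which may well mean a dedicated substation next door.
print(provisioned_capacity(4.0))  # → 5.0
```

The point of the sketch is the shape of the conversation, not the numbers: unlike a home outlet, large-scale consumption has to be forecast and provisioned up front, on both the power grid and the cloud.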

And of course another way the model breaks down is that units of computation and storage, unlike electrons, are not interchangeable.

So what do you think?




Comments (3)
  1. Wzack says:

    Frank Koch (BERLIN) commented via direct email to me.

    Interesting, as the power market is now going back from the central utility model to self-generated, distributed power production (if you like: back to the roots). Clearly the future will show both kinds of power plants: the big, inflexible ones and the smaller ones (water, wind, solar) for high demand peaks, etc. The new concept is more the smart grid, or, if you like, “SOA” in IT terms. But from my point of view, the electric market is clearly not a good proof for cloud computing as the bright future, but rather for the S+S idea Microsoft is pushing.

  2. Wzack says:

    Alan Hakimi wrote via direct email:

    This depends on what perspective one has of the cloud. If you are a cloud consumer, you want the same characteristics that one comes to expect from a utility: pay for what you consume, available, scalable, on demand, etc.

    If you have goals of being a cloud provider, the idea of a hub and spoke of “clouds” may be an interesting view of the future, like an electrical grid if you will, but this, in my view, is some ways away.

    I think cloud providers would have to have some standards to allow for federated clouds, etc. Building the cloud in exactly the same way as the electrical utility model is, in my view, an interesting concept, but it misses the fact that the electrical grid provides ONE “standardized” service.

    In addition, power standards typically vary from country to country, so there is no worldwide standard on how to plug into the electrical grid. International travelers get this. 🙂

  3. Wzack says:

    David Smith in a direct email wrote:


    In the US you get to choose 110, 220, or 440 VAC, and that has been my persuasive point when using this analogy: sure, you could demand 148 VAC, but not from a standard service offering. Therefore I pose the question: is either 110 or 220 VAC good enough? If so, you align with our platform and we will gladly provide you service, at an attractive and very predictable price.

    One of my biggest challenges in the SaaS space has been the deprecation of the many knobs and dials that the on-premises or traditional hosted solution provider would offer up to a customer. In BPOS we align the customer with the platform, not vice versa. Getting this concept through to the customer is paramount for accepting that the platform is “good enough.” If it’s not, that’s fine too, since we are continually maturing our SaaS offering while providing “modern choice” alternatives as well. Both traditional hosting and on-premises options are still viable in many cases. Therefore, while the electricity analogy breaks down in numerous areas, I will continue to use it as a reference model to demonstrate the cloud as offering a “standard service” that, while configurable, is certainly not infinitely customizable. Some may even argue it’s not customizable at all, but that’s really just semantics.

    Of course, the added benefit of subscribing to this type of model is freedom from the “lock-in” that often occurs when packaged products receive so much customization, and become so tightly coupled in their integration to external systems, that the organization is paralyzed against upgrades. Yes, I do believe this is the same old SOA story; however, in the SaaS world the platform can enforce the customization limits to ensure an evergreen platform can be delivered. Just don’t tell that to the BPOS SPO team, since they still have much work to do to figure out how to make this “evergreen” aspect real when the platform moves from MOSS to SPO 2010 and must seamlessly move everyone’s site collections and data… transparently, one would hope.

