Last week’s Campus Review has an article about the cost of energy, its impact on Australian universities, and the fact that computing is a large element of university electricity usage:
Power bills to force an IT re-think
If electricity prices surge post-carbon tax, universities could face huge extra cost burdens because computing is a prime suspect when it comes to sucking up power.
The article quoted some statistics – like 7.1% of electricity consumption in Australia being used by computers – and then went on to say that up to half of a university’s power bill might be for IT, because of the large fleets of computers on campus and their associated data centres. The other statistic that caught my eye was that the average Australian university data centre has a PUE (Power Usage Effectiveness) rating of 1.9. PUE is the ratio of a facility’s total energy use to the energy that actually reaches the IT equipment, so a PUE of 1.9 means that for every $1 spent powering the servers, there’s nearly another $1 spent on electricity that’s ‘lost’ – to cooling, inefficient power distribution, lighting and so on.
The lower the PUE, the more efficient the data centre is (a PUE of 1.5 means you’re only ‘wasting’ an additional 50 cents for every $1 of computing power). So, if we’re building data centres in the Cloud to run services like Windows Azure, how can anybody afford to keep them running?
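The arithmetic behind those figures is simple enough to sketch – this is just the PUE definition above turned into a helper function, not anything from the Global Foundation Services team:

```python
def overhead_cost(pue: float, it_cost_dollars: float) -> float:
    """Electricity cost 'lost' to overhead (cooling, power
    distribution, lighting) for a given PUE and IT electricity spend.

    PUE = total facility energy / IT equipment energy, so the
    overhead is (PUE - 1) dollars for every dollar that actually
    reaches the servers.
    """
    return (pue - 1) * it_cost_dollars

# Average Australian university data centre, per $1 of IT power:
print(f"PUE 1.9:  ${overhead_cost(1.9, 1.00):.2f} lost")

# Microsoft's quoted 2008 figure, per $1 of IT power:
print(f"PUE 1.22: ${overhead_cost(1.22, 1.00):.2f} lost")
```

At a PUE of 1.9 the overhead is roughly 90 cents per dollar of IT power; at 1.22 it drops to around 22 cents – which is why chasing PUE down matters so much at data centre scale.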
We have a team, called Microsoft Global Foundation Services, whose job is building clouds. Or at least, building ‘the Cloud’ – they design, build, run and support our global data centres, which are at the hub of all of our cloud services. Our nearest data centre to Australia is in Singapore. I don’t know the size of the Singapore one, but the Chicago one is 65,000 square metres – about 10 football fields.
Obviously, given the rate we’re building these data centres at, and the huge cost involved, there’s a constant drive to make them more efficient – especially in their energy usage, which is a huge part of the cost of running a data centre.
Now, some of the lessons we’ve learnt aren’t things you can easily apply in a university server room – like cleaning the roof and painting it white to reduce cooling costs, or repositioning walls to improve air flow. Then again, perhaps that’s wrong: you could paint the roof white, if you could cost-justify it?
However, some of the things that have been learnt could be of use to you, and help you to reduce your carbon emissions and running costs – like making a trade-off in processor performance to achieve the most efficient Performance per Watt per dollar (which is one reflection of the true cost of providing a server service). We’ve also made adjustments to the temperature servers are cooled to, and switched to using more free-air cooling in place of air conditioning. And we’ve even experimented with operating servers outside under a tent. The Microsoft data centres were quoted as already hitting a PUE of 1.22 in 2008.
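To see why performance per watt per dollar can favour a slower processor, here’s a toy comparison – the server figures below are entirely made up for illustration, not real benchmarks or Microsoft data:

```python
def perf_per_watt_per_dollar(perf_score: float, watts: float,
                             price: float) -> float:
    """Higher is better: useful work delivered per watt of power
    draw, per dollar of purchase price."""
    return perf_score / (watts * price)

# Hypothetical server configurations (illustrative numbers only).
servers = {
    "high-clock CPU":  {"perf": 100.0, "watts": 250.0, "price": 5000.0},
    "lower-clock CPU": {"perf": 85.0,  "watts": 150.0, "price": 4200.0},
}

for name, s in servers.items():
    score = perf_per_watt_per_dollar(s["perf"], s["watts"], s["price"])
    print(f"{name}: {score:.2e} perf/W/$")
```

With these made-up numbers, the slower chip delivers 15% less raw performance but wins comfortably on the combined metric, because it draws far less power and costs less to buy – which is the kind of trade-off the article is describing.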
The good news is that as we do this work, we publish it in a consumable format. If you’re interested in how to help reduce your server running costs, or in what we’re doing when we’re building massive data centres, then I can recommend “A Holistic Approach to Energy Efficiency in Datacentres” from the Microsoft Global Foundation Services team.
There is also a lot of detail about different projects going on to look at energy efficient computing, within data centres and elsewhere, on our www.microsoft.com/environment website. Some of the research up there is around Cloud Computing futures, data centre monitoring and optimisation, reducing disk energy consumption, universal parallel computing and power aware developer tools.
And finally, if your interest knows no bounds, then you might be interested in the MS Datacenters blog, which tells the story of how we’ve grown our data centres around the world over the last few years, and shares some more of the lessons we’ve learnt.
Hopefully, there are some lessons which will help you to deliver more energy efficiency – and save money – in an Australian university data centre.