This story appeared on Network World at
http://www.networkworld.com/columnists/2008/070808bradner.html

 

Data centers: Green because you have to

 

Virtualization key to controlling hardware, power costs

 

'Net Insider by Scott Bradner, Network World, 07/08/2008

 

So you work for a big company that has told you that it's your job to build a big data center in a big city. Good luck! More often than not your job may be impossible, and even where it might be possible today, the window is closing fast.

 

What can you do short of relocating to Iceland? Going green by going virtual may be your only possibility -- but even that may not last.

 

Data centers can take a lot of power, individually and collectively.

 

A data center being constructed last year was projected to consume 30 megawatts, and the Environmental Protection Agency estimated that data centers in the United States consumed about 61 billion kilowatt-hours in 2006 -- a figure that has almost certainly grown since. The fact that this much power costs a lot of money is only part of the problem.
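For a rough sense of scale, that annual figure works out to an average continuous draw of several gigawatts. A back-of-the-envelope sketch (my arithmetic on the EPA's published number, nothing more):

```python
# Back-of-the-envelope conversion of the EPA's 2006 estimate
# (61 billion kWh per year) into an average continuous power draw.
# Illustrative arithmetic only; the figure itself comes from the EPA.

annual_kwh = 61e9          # EPA estimate for U.S. data centers, 2006
hours_per_year = 8760      # 365 days * 24 hours

average_draw_mw = annual_kwh / hours_per_year / 1000   # kW -> MW
print(f"Average continuous draw: about {average_draw_mw:,.0f} MW")
# Roughly 7,000 MW -- the output of several large power plants, or
# more than 200 data centers the size of the 30-megawatt one above.
```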

 

A few months ago the CIO of a big chunk of a big company told me that in the last few years he had been stopped from building new data centers in a number of U.S. locations. He was not stopped by his comptroller for wanting to spend too much money on servers or power. Rather, he was stopped by the local power companies, which said they could not guarantee the power he needed to run the data centers -- even if he was willing to pay a premium. The power was just not there.

 

I suppose he could have built a data center in Iceland, where power is available and where connectivity soon will be. But I expect he would not much like the commute -- even though the weather there is often better than it is in Boston.

 

Data center efficiency has been a discussion topic for a while. (See the EPA data center efficiency site.) But a lot of the discussion has been around things like trying to get server vendors to integrate better with cooling systems so that you can reduce spending on power to cool down what you are spending on power to heat up.

 

There are a lot of things about data centers that could be more efficient. I've seen quotes from 'knowledgeable industry sources' claiming that a typical data center uses eight or more times the power it theoretically needs. I'm not sure how those numbers get determined, but they are depressing. Inexpensive or hidden power costs have, over the years, led to complacency. (See "Data center power: the cost reality".)

 

There are lots of things that will need to be worked on to raise the efficiency of data centers, but by far the lowest-hanging fruit is what can be achieved by a simple move to virtualization.

 

There has been a longstanding assumption in data centers that you need separate computers for separate applications. This is reinforced by many IT auditors and consultants who push for it on the assumption that separate servers are safer because they can be better protected, and hacking one does not compromise the others.

 

But this perceived security comes at a very big cost -- in replicated hardware and in extra power. Data center servers often run at very low utilization; 6% is a number I've seen often.

 

Just combining servers on virtualized hardware is an instant win on hardware costs, power costs and floor space. My CIO friend was able to reduce the physical footprint of a number of his current data centers by more than 75%, and the power they draw by more than 80%, this way. That bought him a few years before he has to start looking into flights to Iceland.
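A rough way to see why the win is so large: if servers idle at around 6% utilization, packing them onto virtualized hosts run at a still-conservative 50% takes out most of the boxes. A hypothetical sketch follows; the server count, target utilization and per-server wattage are illustrative assumptions of mine, not measurements from my friend's data centers:

```python
# Hypothetical consolidation estimate: how many physical hosts disappear
# when lightly loaded servers are combined onto virtualized hardware.
# All numbers below are illustrative assumptions, not measured figures.

physical_servers = 1000        # assumed starting server count
avg_utilization = 0.06         # the ~6% utilization figure cited above
target_utilization = 0.50      # assumed conservative target for a virtualized host
watts_per_server = 400         # assumed average draw per physical box

# Consolidation ratio: how many lightly loaded servers fit on one virtualized host
ratio = int(target_utilization / avg_utilization)      # ~8:1
hosts_needed = -(-physical_servers // ratio)           # ceiling division

old_power_kw = physical_servers * watts_per_server / 1000
new_power_kw = hosts_needed * watts_per_server / 1000

print(f"Hosts after consolidation: {hosts_needed}")
print(f"Power: {old_power_kw:.0f} kW -> {new_power_kw:.0f} kW "
      f"({1 - new_power_kw / old_power_kw:.0%} reduction)")
# With these assumptions: 125 hosts and roughly an 88% power reduction --
# in the same ballpark as the >80% reduction described above.
```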

 

Disclaimer: Harvard does not operate a data center in Iceland (as far as I know) so the above travelogue is mine alone.

 
