Energy Guzzlers No More: Data Centers Finally Using Less Electricity

A decade ago, they were widely feared as the source of a huge new environmental problem. America's data centers, the vast storehouses of computers that hold everything from Facebook pages to Netflix movies to billions of emails, were guzzling electricity at a ferocious rate, increasing air pollution and greenhouse gases as internet use grew by leaps and bounds.

But a new study shows that data centers, commonly known as "server farms," have slimmed down their energy appetite dramatically.

Hit with soaring electricity bills and hounded by environmental groups to reduce their carbon footprints, tech companies have significantly improved the way that the centers use energy -- from the way they are cooled to how their masses of computers save power when not fully in use.

"Before there were inefficiencies, like leaving-the-refrigerator-door-open kinds of inefficiencies," said Arman Shehabi, a research scientist at Lawrence Berkeley National Laboratory who helped write the report. "But the industry really has put attention toward making their data centers more efficient. It's an amazing success story."

From 2000 to 2005, the study found, data centers in the United States increased their electricity consumption by 90 percent. From 2005 to 2010, it rose another 24 percent, even through the Great Recession. But since then it has been nearly flat, growing by only 4 percent from 2010 to 2014 despite a boom in online activity, millions of new smartphones, social media mania and other trends that have driven Americans to spend ever more time online.

And, the study projects, from now until 2020, electricity use by U.S. data centers will grow only 4 percent -- and could actually be cut by as much as 45 percent, back to 2003 levels, with additional energy efficiency measures.

"This is very welcome news," said Pierre Delforge, director of high-tech energy efficiency for the Natural Resources Defense Council, an environmental group...
