EPA report says energy consumption by servers, data centers has doubled since 2000.
August 3, 2007 (RedHerring.com) – Energy consumption by servers and data centers has doubled since 2000 and is set to nearly double again by 2011, the U.S. Environmental Protection Agency said in a report released Friday.
The report, prepared at the behest of Congress and led by researchers at Lawrence Berkeley National Laboratory, found that data centers and servers gobbled up 61 billion kilowatt-hours in 2006, adding up to a whopping $4.5 billion electricity bill.
As paper records increasingly become a thing of the past, demand for data processing and storage is skyrocketing. Data centers and servers sucked up 1.5 percent of total U.S. electricity consumption in 2006, an amount roughly equivalent to that consumed by 5.8 million average American households, by the EPA’s estimate. The federal government’s server farms and data centers alone account for 10 percent of that consumption and cost taxpayers $450 million a year, the report found.
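For readers who want to sanity-check those figures, a quick back-of-envelope calculation follows; the implied electricity price and per-household usage are derived here and are not stated in the article or the report.

```python
# Back-of-envelope check of the EPA report's figures. The three inputs
# come from the article; the "implied" values are derived, not reported.
total_kwh = 61e9      # 2006 consumption by servers and data centers (kWh)
total_cost = 4.5e9    # reported 2006 electricity bill (USD)
households = 5.8e6    # stated equivalent number of average U.S. households

implied_price = total_cost / total_kwh       # USD per kWh
kwh_per_household = total_kwh / households   # implied annual household use

print(f"Implied electricity price: ${implied_price:.3f}/kWh")
print(f"Implied annual use per household: {kwh_per_household:,.0f} kWh")

# Federal share: 10 percent of overall consumption and cost, per the report.
federal_cost = 0.10 * total_cost
print(f"Federal data-center bill: ${federal_cost / 1e6:.0f} million")
```

The numbers hang together: roughly 7.4 cents per kilowatt-hour and about 10,500 kWh per household per year are both plausible for 2006, and 10 percent of the $4.5 billion bill matches the $450 million federal figure.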
Spurring demand is the digitization of what used to be mountains of paperwork—everything from online banking and financial services records to electronic medical records to electronic shipping invoices in transportation—not just the growth in popularity of MySpace, YouTube, and Internet entertainment.
Volume servers account for most of the electricity consumed by IT equipment in data centers, while the centers’ power-delivery and cooling infrastructure sucks up half of the energy consumed overall, according to the EPA.
But all is not lost. Existing technologies can reduce a server’s energy consumption by 25 percent or more, the EPA reported. “Even with existing IT equipment, implementing best energy-management practices in existing data centers and consolidating applications from many servers to one server could reduce current data center energy usage by around 20 percent,” the study’s authors wrote.