Three and a half years ago, the New York Times ran an article on data center efficiency: http://www.nytimes.com/2012/09/23/technology/data-centers-waste-vast-amounts-of-energy-belying-industry-image.html. It's relatively old news, but I've seen it reposted a few places recently so I thought I'd react to it here.
It's disingenuous that the article never mentions that US per capita electricity consumption is actually falling. The US Energy Information Administration has great data in their Monthly Energy Reviews: http://www.eia.gov/totalenergy/data/monthly/... though I'm having trouble finding the raw CSVs to make my own graphs. From a recent report, even total electricity consumption has leveled off over the last 15 years:
This is 100% driven by technology: you can run dozens of LCD monitors on the power once required for one cathode ray tube TV. You can replace a 100W lightbulb with about 15W of LEDs... AND you don't have to expend even more energy to get rid of that 85W of lightbulb heat. And whether it's your house or a datacenter, cooling is expensive.
A typical in-window air conditioner has an energy efficiency ratio (EER) of about 12 BTU/W·hr, which corresponds to a coefficient of performance (COP) of about 3.5. This means that 1W of electrical power can pump 3.5W of heat from the cold side (inside your house) to the hot side (outdoors). The colder the cold side, the hotter the hot side, and the more gunk-clogged your air conditioner, the less heat you can pump per unit of electrical energy.
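To sketch the arithmetic behind those two paragraphs (the 12 BTU/W·hr rating and the 100W-to-15W bulb swap are the figures above; 3.412 BTU/hr per watt is the standard unit conversion):

```python
# Convert an air conditioner's EER rating to a coefficient of performance
# (COP), then estimate the total electrical savings from swapping a 100W
# incandescent bulb for a 15W LED, counting the avoided cooling load too.

BTU_PER_HR_PER_WATT = 3.412  # 1 W = 3.412 BTU/hr

def cop_from_eer(eer_btu_per_wh: float) -> float:
    """COP = watts of heat pumped per watt of electrical input."""
    return eer_btu_per_wh / BTU_PER_HR_PER_WATT

cop = cop_from_eer(12.0)  # ~3.5, matching the figure in the text

incandescent_w = 100.0
led_w = 15.0
heat_avoided_w = incandescent_w - led_w    # 85 W less waste heat indoors
cooling_savings_w = heat_avoided_w / cop   # ~24 W of A/C power no longer needed
total_savings_w = heat_avoided_w + cooling_savings_w

print(f"COP: {cop:.2f}")
print(f"Direct savings: {heat_avoided_w:.0f} W")
print(f"Cooling savings: {cooling_savings_w:.1f} W")
print(f"Total savings: {total_savings_w:.1f} W")
```

So the bulb swap saves not just the 85W difference but roughly another 24W of air conditioning on top, about 109W all told while the A/C is running.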
Electricity is a cost, and technology companies are financially incentivized to reduce consumption (as long as it doesn't reduce reliability). Big users like Google, hardware manufacturers like Intel, and system integrators like Emerson Power are all working to make this happen:
• Google - Efficiency: How We Do It
• Intel - Increasing Data Center Efficiency Through Server Power Management
• Emerson - Increasing Data Center Efficiency & Availability through Infrastructure Monitoring
Data centers are well-positioned to be early adopters of improved grid-scale energy storage and backup technologies, like next-generation batteries, small-scale hydroelectric, and fuel cells. Microsoft Research even explored a concept of installing heat-producing servers—"data furnaces"—in homes and office buildings so that the waste heat could be beneficially used to heat the space: https://research.microsoft.com/pubs/150265/heating.pdf
Data centers now use around 2% of US electricity, and electricity is 40% of total US energy, so data centers account for roughly 0.8% of total US energy. Transportation uses 30% of total US energy, and virtually all of it is in the form of nonrenewable petroleum products. Data center efficiency will undoubtedly improve as the technology improves and the sector gets more competitive, and as an energy problem it's small compared to transportation.
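A back-of-envelope check on those shares, using only the percentages quoted above:

```python
# What fraction of total US energy goes to data centers vs. transportation?
# All percentages are the figures quoted in the text, not fresh data.

electricity_share_of_total = 0.40  # electricity is 40% of US energy
datacenter_share_of_elec = 0.02    # data centers use ~2% of US electricity
transport_share_of_total = 0.30    # transportation is 30% of US energy

datacenter_share_of_total = datacenter_share_of_elec * electricity_share_of_total

print(f"Data centers: {datacenter_share_of_total:.1%} of total US energy")
print(f"Transportation: {transport_share_of_total:.0%} of total US energy")
print(f"Transportation / data centers: {transport_share_of_total / datacenter_share_of_total:.1f}x")
```

By these numbers, transportation consumes nearly forty times the energy that data centers do.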
I've been posting stuff on the Internet in a systematic way for 3.5 years now. I have a backlog of literally hundreds of unfinished bits and pieces, so I'm going to try a new strategy of posting smaller things more frequently. This may also help me spend less time writing long comments on social media that nobody will ever read.