Servers in data centers are often blamed for consuming a lot of electricity, but a recent study by researcher Jonathan Koomey shows that, in recent years, data centers have not followed the path many expected.
The facts: between 2000 and 2005, the electricity consumption of data centers doubled. Between 2005 and 2010, however, energy consumption grew by only 56 percent worldwide, despite significant growth in capacity. Over the same period, consumption in the U.S. grew by just 36 percent.
The EPA (Environmental Protection Agency) had originally predicted that consumption would double again. The actual increase between 2005 and 2010 was slowed partly by the recession and partly by the actions of data center operators, who increasingly favor virtualization and cloud computing over dedicated hardware, a shift that substantially reduces energy consumption.
Finally, Google was found responsible for only 0.01 percent of total worldwide electricity consumption, which is still a lot for a single company. Then again, Google is the largest company of its kind and the most prominent player in the field, so one might expect it to consume even more electricity than it currently does.
The conclusion: long live the recession and virtual machines! (Kidding.)