If you remember, back in August I published an article about a Stanford researcher who had discovered that the worldwide electricity consumption of data centers did not double between 2005 and 2010, unlike the period between 2000 and 2005, when it did. His name is Jonathan Koomey, and by now there's a law bearing his name.
"Koomey's law," as it has been named, says that the energy efficiency of computers doubles roughly every 18 months. The idea parallels Moore's law, but with efficiency as the yardstick instead of transistor counts.
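As a rough illustration of what that doubling rate implies, here is a minimal sketch (the function name and the 18-month period as a default parameter are my own choices for illustration, not anything from Koomey's paper):

```python
def efficiency_gain(months, doubling_period=18):
    """Relative computations-per-joule after `months`, assuming
    efficiency doubles every `doubling_period` months (Koomey's law)."""
    return 2 ** (months / doubling_period)

# After 18 months, efficiency has doubled; after 3 years, quadrupled.
print(efficiency_gain(18))  # 2.0
print(efficiency_gain(36))  # 4.0
```

Over a decade (120 months) this compounds to roughly a hundredfold gain, which is why the trend matters so much for battery-powered devices.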
Gordon Moore was working at Intel when he observed that, since 1946 and the first general-purpose computer (ENIAC, which consumed 150 kW and managed only a few hundred calculations per second), the number of transistors that can be placed inexpensively on an integrated circuit has doubled approximately every two years (source: Wikipedia).
It seems that energy efficiency is following the same rule. In a world where battery life matters more than raw performance in tablets, smartphones and laptops, there is a constant need for optimization rather than for ever more powerful processors.
Theoretically, as physicist Richard Feynman assessed in 1985, the energy efficiency of transistor-based computers could improve by as much as 100 billion times before reaching its limit. So far it has improved by only about 40,000 times, so there is a long way to go.
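Combining those two figures with the 18-month doubling rate gives a back-of-envelope estimate of how much runway is left. This is just my own arithmetic sketch on the numbers quoted above, not a projection from the article:

```python
import math

def remaining_headroom(limit_factor=100e9, achieved_factor=40_000,
                       doubling_months=18):
    """How many more doublings (and years, at Koomey's pace) fit between
    the efficiency gain achieved so far and Feynman's theoretical limit."""
    remaining = limit_factor / achieved_factor   # factor still available
    doublings = math.log2(remaining)             # doublings to exhaust it
    years = doublings * doubling_months / 12     # at one doubling per 18 months
    return doublings, years

doublings, years = remaining_headroom()
print(f"about {doublings:.0f} doublings left, roughly {years:.0f} years")
```

On these numbers the limit sits a little over 21 doublings away, i.e. about three more decades of improvement at the historical pace.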
Of course, all of the above could be upended by the eventual introduction of quantum computing, whose evolution in terms of both power and efficiency would play out on an entirely different scale.