How Energy Efficient Can Computers Get?

At Greenbiz’s VERGE conference outside Washington, D.C. yesterday morning, Jon Koomey, a professor at Stanford University, put forward a couple of intriguing questions. How efficient can computers get? Is there a Moore’s Law of efficiency?

Moore’s Law, of course, is the idea that the number of transistors that can fit on a computer circuit will double every two years, a rule that has generally held true since the 1970s. But Koomey said that energy efficiency has been experiencing similar leaps dating back to the 1940s (see my grainy photo of his slide below).

VERGE is sort of like a series of TED talks for sustainability and efficiency professionals, and the audience understood that these efficiency gains have serious implications.

This rate of improvement, cutting energy use by a factor of 100 every decade or so, has made possible today's devices like the laptop and the smartphone, which can run complex calculations and last for hours on a single battery charge. But as devices sip less and less electricity, Koomey said, new possibilities open up.
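As a back-of-the-envelope check (my own arithmetic, not a figure from Koomey's talk), a hundredfold improvement per decade implies that efficiency doubles roughly every year and a half:

```python
import math

# Assumption for illustration: efficiency improves 100x per decade,
# as cited in the talk.
factor_per_decade = 100
years = 10

# If efficiency doubles every d years, then 2**(years / d) == factor_per_decade,
# so d = years / log2(factor_per_decade).
doubling_time = years / math.log2(factor_per_decade)
print(f"Implied doubling time: {doubling_time:.2f} years")  # ~1.51 years
```

That pace, about a doubling every 1.5 years, is what makes the comparison to Moore's Law (a doubling roughly every two years) so striking.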

Needing only a tiny dose of power will enable a network of minuscule, mobile controls and sensors that gather their energy from heat, light, or movement, or even scavenge their power from radio and TV signals. With these, we can “do exactly what we want with a minimal use of resources,” Koomey said.

These tiny, self-powered computers will likely be embedded in all sorts of products, creating the “Internet of Things,” in which products yield new classes of useful data.

Computers have gotten 40,000 times more efficient since 1985, Koomey said. Back then, physicist Richard Feynman estimated that computers could be 100 billion times more efficient than they were at the time.

Koomey said that Feynman’s guess was based on a transistor made from only three atoms. But, Koomey pointed out, researchers at the University of New South Wales and Purdue University announced last month the creation of a transistor from a single atom.
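Putting those two figures together (my own arithmetic, not a claim from the talk): if computers have improved 40,000-fold since 1985 and Feynman's estimate allows for a 100 billion-fold improvement over 1985 levels, the remaining headroom is still enormous:

```python
# Figures as cited in the article; the division is illustrative arithmetic.
improvement_so_far = 40_000            # efficiency gain since 1985, per Koomey
feynman_limit = 100_000_000_000        # Feynman's estimated total headroom

remaining = feynman_limit / improvement_so_far
print(f"Remaining headroom: {remaining:,.0f}x")  # Remaining headroom: 2,500,000x
```

In other words, even after decades of exponential gains, Feynman's estimate leaves room for computers to become millions of times more efficient still.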

“We think we’ve seen big data, but we ain’t seen nothing yet,” Koomey added.

Article by David Ferris, appearing courtesy the Matter Network.