SAN FRANCISCO — If Moore's Law -- the enduring axiom about increases in computing power -- held true in other industries, a New York-to-Paris flight that in 1978 cost $900 and took about seven hours would today cost about a penny and take less than a second.
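The arithmetic behind that comparison can be sketched in a few lines of code. This is a back-of-the-envelope check only, assuming an 18-month doubling period (one common reading of Moore's Law) and the 1978 fare and flight time cited above:

    # Back-of-the-envelope check of the flight analogy.
    # Assumption: cost and time halve every 18 months, 1978-2005.
    doublings = (2005 - 1978) / 1.5   # 18 halvings
    factor = 2 ** doublings           # roughly a 262,000-fold gain
    print(f"Fare:   ${900 / factor:.4f}")               # about a third of a penny
    print(f"Flight: {7 * 3600 / factor:.2f} seconds")   # under a tenth of a second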
The laws of physics make such lightning-fast travel all but impossible. But those laws have also permitted remarkable advances in technology, at a pace first predicted 40 years ago Tuesday by a young engineer at Fairchild Semiconductor.
In the decades since Electronics magazine published Gordon Moore's article "Cramming More Components Onto Integrated Circuits," Moore's Law has evolved from an obscure theoretical postulation to the very foundation of Silicon Valley's entrepreneurial ethos.
And although Moore's Law has proved strikingly accurate for four decades, the engineers who made silicon one of the most important ingredients in modern life are asking themselves: "What comes next?"
Simply put, Moore's Law posits that the number of components on integrated circuits such as silicon computer processors will double every couple of years, while the cost per component declines at a commensurate rate.
It explains how computing gets cheaper even as it gets ever faster. By making transistors tinier and tinier, researchers have cut the distances that electricity must travel, boosting chip speeds.
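Expressed as a formula, the law is a single exponential: the component count N(t) = N0 x 2^((t - t0) / T), where T is the doubling period. A minimal sketch of that curve, assuming a two-year doubling period and taking the roughly 2,300 transistors of Intel's 1971 4004 processor as an illustrative starting point (both figures are assumptions, not from the article):

    # Moore's Law as a formula: N(t) = n0 * 2 ** ((t - t0) / T)
    # Assumptions (illustrative): two-year doubling period, starting
    # from the Intel 4004 of 1971 with about 2,300 transistors.
    def transistors(year, n0=2300, t0=1971, doubling_years=2.0):
        return n0 * 2 ** ((year - t0) / doubling_years)

    for year in (1971, 1985, 1995, 2005):
        print(year, f"{transistors(year):,.0f}")
    # 2005 works out to roughly 300 million transistors -- consistent
    # with the hundreds of millions in high-end chips of that era.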
"I wanted to get across the idea that integrated circuits were going to be a way to build things, and that things were going to get cheaper as they got more complex," Moore, now 76, who co-founded chip giant Intel Corp. in 1968, said last month in an interview.
The squeezing of progressively more components onto silicon chips has enabled technology to pervade virtually every aspect of daily life, affecting the way people communicate, work, learn and relax.
"Moore's Law is indelibly linked to the history of our industry and the economic benefits that it has provided over the years," chip expert Dan Hutcheson of semiconductor consultancy VLSI Research wrote in a recent article. "The integrated circuit developed rapidly, leading to Moore's observation that became known as a law -- and in turn, launched the information revolution."
Moore's maxim has weathered industry downturns, skepticism from experts and breathtaking technological advances -- and it endures. Every decade or so, Moore himself predicts that his law has about 10 more years before the technology on which it is based begins to hit its limits and the pace of advance slows.
Today many researchers peg the date as 15 years out. Already, though, they are stretching their brains -- and the boundaries of physics and chemistry -- to figure out what might supplant the transistor-on-silicon computer chips that have driven the Digital Age.
"Whether it's carbon nanotubes, latches or something optical or DNA, there's something out there, but it is really too early to call any winners," said Fred Weber, chief technology officer of chip maker Advanced Micro Devices Inc. of Sunnyvale, Calif.
For ordinary computer users, PCs have long been plenty fast. Most are unlikely to notice much difference between a chip that runs at 4 gigahertz and one that runs at 2.
But for university computers doing complex mathematical research, or government systems crunching census data or simulating nuclear explosions, the need for speed and power grows ever greater. Researchers and companies also strive to make things ever smaller, hoping that someday the equivalent of a PC can be worn on the wrist, that tiny robots can perform surgery inside blood vessels, or that pills can be equipped with circuitry to help with medical treatment.
The competition to achieve such performance is intense, as universities and companies around the world race to bring such inventions to market.
Understanding Moore's Law, and its potential limits, first requires a basic understanding of how integrated circuits, or chips, work. The building blocks of a chip are circuits that switch on and off, producing electrical pulses that represent digital 1s and 0s.
Hundreds of millions of tiny transistors do most of the work in modern chips, switching current on and off billions of times a second. The more transistors there are, the faster and more powerful the chip.
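How do mere on/off switches add up to computation? A toy sketch in code (an illustration only, not how chips are actually built) models each transistor as a Boolean switch; a single NAND gate wired from such switches is enough to construct every other logic operation, and from those, arithmetic:

    # Toy model: a transistor as an on/off switch, and switches
    # composed into logic gates. Real chips do this in silicon,
    # billions of times a second; this is only an illustration.
    def nand(a: bool, b: bool) -> bool:
        return not (a and b)

    # Every other gate can be wired up from NAND alone:
    def not_(a):    return nand(a, a)
    def and_(a, b): return not_(nand(a, b))
    def or_(a, b):  return nand(not_(a), not_(b))

    # A half-adder -- the first step toward arithmetic -- built from gates.
    # Returns (carry, sum).
    def half_adder(a, b):
        return (and_(a, b), or_(and_(a, not_(b)), and_(not_(a), b)))

    print(half_adder(True, True))  # (True, False): 1 + 1 = binary 10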
The transistors are built in ultra-thin layers atop silicon wafers. Silicon, one of the most abundant elements on Earth, conducts electricity poorly on its own. But when doped with traces of other elements, its conductivity can be precisely controlled -- hence the name semiconductor.
In his now-legendary article, buried on Page 114 of the April 19, 1965, issue of Electronics, Moore figured that the number of transistors on a chip would roughly double every year. Few noticed at the time, but as the years passed, Moore's optimistic observations proved surprisingly astute -- and the article became an icon of Silicon Valley's optimism.