IT is a truism of the 21st century that we swim in a sea of information. It gushes forth from the morning paper, it croons from the racks of magazine stands and the million-plus volumes on Amazon.com. It floods down the pipelines of the Internet into our laptops, iPods, PDAs and cellphones. Life is increasingly structured around the delivery and reception of information -- ever more, ever faster and ever timelier data.
Of course, information has always been a valuable commodity, but until recently it largely served as a means to other ends, say, as a tool to gain military advantage or a competitive edge in the marketplace. In "Programming the Universe," Seth Lloyd sets out to convince us that information is the foundation of reality itself and that the cosmos may be seen as a vast computer for processing and transforming data.
Lloyd is a professor of quantum mechanical engineering at MIT and a pioneer in the field of quantum computing. He was the first person to propose how a quantum computer could be built, a task that is now proceeding apace in research labs around the world. Along with a growing number of physicists, he believes that computation and information theory offer a new paradigm for understanding the physical world. His aim in this dense but utterly charming book is to trace the history of computational manipulation of information along with the rise of "information science," and simultaneously to tell the story of our universe as an unfolding sequence of "information revolutions."
Humans have been computing almost since we became human. "Like the first tools, the first computers were rocks," Lloyd writes. "Calculus" is the Latin word for pebble, and the first calculations were made by rearranging pebbles. "Rock computers didn't have to be small," Lloyd tells us. "Stonehenge may well have been a big rock computer for calculating the relations between the calendar and the arrangement of the planets." Rocks eventually led to the abacus, which remains one of the most successful computing devices of all time. Later revolutions were made possible by the slide rule in the 17th century; geared cogs in the 19th century (these formed the basis of mechanical calculators); vacuum tubes in the 1940s; and transistors in the 1960s.
Over the last half-century, the computational power available to us has roughly doubled every 18 months, a fact first articulated by Intel co-founder Gordon E. Moore and formally known as Moore's Law. But we are approaching the limits of what silicon can do, and it is clear that if we are to keep up the blinding pace of computational enhancement a new technology is needed. Many scientists are hoping that quantum mechanics holds the key and that quantum computers will take us as far past semiconductors as semiconductors took us from pebbles. Lloyd and his colleagues are leading that charge.
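The compounding described above is easy to underestimate; a minimal sketch of the arithmetic, assuming an idealized, perfectly clean 18-month doubling period (real hardware trends are messier, and the function name is my own):

```python
# Idealized Moore's Law growth: computational power doubles
# every 18 months. A hypothetical helper, not from the book.

def moores_law_factor(years: float, doubling_months: float = 18.0) -> float:
    """Return the total growth factor after `years` of doubling."""
    return 2.0 ** (years * 12.0 / doubling_months)

# Over the half-century the review mentions, that is about 33 doublings,
# which works out to roughly a ten-billion-fold increase in power:
print(f"{moores_law_factor(50):.3g}")
```

The exponent, not the base, does all the work here: fifty years at an 18-month cadence yields on the order of 10^10, which is why exhausting silicon's headroom is such a pressing problem.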
As humans have learned to process information on ever grander scales, so too Lloyd sees the history of the universe as a series of information-processing revolutions. The first was the Big Bang, which brought into being from nothingness the cosmic seed of space and time. In the beginning, he tells us, there was very little information in the universe, but as the nascent bubble of spacetime fluoresced into being, its data content exploded. "The Big Bang was also a Bit Bang," he cutely surmises. Soon the primal bits were coalescing into the structures we know as particles (protons, electrons and so on), which later congregated to form atoms, which in turn clumped together to make stars and galaxies. "Every time a new ingredient of the soup condensed out ... new information was written in the cosmic cookbook."
Planetary systems formed and more complicated molecules came into being, including eventually the life-encoding information structure known as DNA. The evolution of higher organisms required a revolution in intracellular communication or information processing between cells, as did the development of immune systems. Living things in the form of hairless apes soon began to orchestrate their own informatics revolutions. "Life, language, human beings, society, culture -- all owe their existence to the intrinsic ability of matter and energy to process information," Lloyd writes.
This view of reality has been gaining ground in scientific circles for several decades. But though it has come to the fore in the age of computers, its roots lie in the attempts of 19th century physicists to understand steam engines. That effort led to the laws of thermodynamics and the articulation of the concept of "entropy," a mysterious quality that always increases when any action is carried out. It turns out that entropy is a measure of the information content of a system and as time goes on, both entropy and information in the universe increase.
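The closing claim, that entropy measures information content, can be made concrete with Shannon's entropy formula, which the review does not spell out; a minimal sketch, assuming the standard definition in bits:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)).
    Measures the information content of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A certain outcome carries no information:
print(shannon_entropy([1.0]))        # 0.0
# A fair coin toss carries exactly one bit:
print(shannon_entropy([0.5, 0.5]))   # 1.0
# More possible configurations mean higher entropy, hence more bits:
print(shannon_entropy([0.25] * 4))   # 2.0
```

The pattern in the toy examples mirrors the review's point: as a system spreads over more possible configurations, its entropy rises, and so does the number of bits needed to describe it.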