
THE SOCIAL LIFE OF INFORMATION By John Seely Brown and Paul Duguid; Harvard Business School Press: 320 pp., $25.95

The Human Touch

September 03, 2000|ALEX SOOJUNG-KIM PANG | Alex Soojung-Kim Pang is a historian and Web developer in the Stanford University library. He recently produced an online exhibit on "Making the Macintosh," a history of the Macintosh computer.

The belief that the world is about to tip into a new phase of history is a near-permanent feature of modern life. Enlightened, optimistic Europeans began the last century thinking that the railroad and telegraph had made advanced nations too interdependent to afford armed conflict. By mid-century, it seemed clear that radio, cinema and mass media were transforming society as profoundly as steam power and factories had transformed industry in the 1700s.

Today, it seems beyond dispute that we are at the beginning of an "information age." Digital libraries and encyclopedias render their physical ancestors obsolete; dot-coms create value (or at least valuations) in defiance of traditional laws of economics; and a host of new cyberactivities--data mining, knowledge management and information warfare--take old familiar words that once described solid objects and apply them to an ephemeral world of zeros and ones. According to Stewart Brand, even the nature of change is changing, thanks to the combined force of Moore's Law (which states that the processing power of microprocessors doubles approximately every 18 months) and Metcalfe's Law (which holds that a network's value grows roughly as the square of its number of users, so that expanding information networks yield increasing returns).

Nonetheless, in some ways our age bears a striking resemblance to past ones, most notably Europe in the wake of the printing press. In both periods, information technologies seem to be driving history. Francis Bacon declared the printing press, invented by Johann Gutenberg in the early 1450s, one of the three great inventions of all time (gunpowder and the magnetic compass being the other two). Today, dot-commentators describe computers, the Internet and the World Wide Web in similarly grand terms. Likewise, in both cases the impact of these technologies was ambiguous. Printing increased the availability of desirable classics and lowered the costs of spreading new knowledge, but the unscrupulous used it to publish scurrilous, heretical, pornographic or simply inaccurate works. Access to information created its own problems. Sorting truth from falsehood could be a full-time activity, and savants worried about spending a lifetime coping with information overload. It's a lament that would fit all too comfortably in the pages of Wired or Fast Company.

Some historians (most recently Anthony Grafton in his elegant 1992 book "New Worlds, Ancient Texts") argue that the challenge of dealing with all this information did more to undermine faith in ancient learning than the discovery of the New World: It may even have provided impetus for the rise of the scientific method. According to this line of reasoning, Galileo, Bacon and other philosopher-scientists sought to construct a simpler, more secure foundation for knowledge, one based on direct observation of nature rather than on the endless, contradictory stream of published works vying for attention and making unverifiable claims of authority. On that foundation, scientific facts might be few in number, but they would be more trustworthy than their squabbling scholastic kin.

Today, attempts to explain the impact of computers and the Internet on the economy, education and society have re-created the early modern problem of information abundance. Bookstore shelves are filled with bold predictions about the impending obsolescence of everything from the printed book, office, career, corporation and university to the nation-state and geography. The Internet, these books argue, renders them all obsolete by driving down transaction costs, closing the distance between workers and making it possible to coordinate activities around the world. A smaller number of books tries to place the changes we're witnessing within the larger scope of human history.

Thus Michael Hobart and Zachary Schiffman's "Information Ages: Literacy, Numeracy, and the Computer Revolution" (1998) argues that ours is the third great age of information, the first marked by the ancient world's invention of writing and the second by the early modern development of mathematical approaches to understanding nature. Albert Borgmann's "Holding On to Reality: The Nature of Information at the Turn of the Millennium" (1999) offers a different but equally stark tripartite division. Pre-digital societies possess two kinds of information: "natural information," worldly signs--clouds, tracks, landscapes, the position of the stars and phases of the moon--that offer clues about the abundance of game, the progress of the weather and so on; and "cultural information"--recipes, architectural plans, musical scores--which tells people how to make and do things. Digital technology, Borgmann argues, has added a third kind: virtual reality, digitally recorded music, 3-D walk-throughs and video games have recast the relationship between humans and information. So dense has the digital world become that the information it contains no longer refers to an external reality but instead "rivals and replaces reality."
