The list of inventions that have truly changed the world is short. But by almost any standard, the roster should include the little machine introduced 20 years ago this Sunday by the world's biggest computer company.
The IBM personal computer, unveiled Aug. 12, 1981, was not the first PC on the market. Nor was it the cheapest, the most powerful or the most technically advanced. In the strictest sense of the word, it was not even an "invention": Although IBM engineers devised its overall architecture, they relied on components, including the microprocessor, memory chip and floppy disk, that had been developed by others over a period of years.
But IBM's introduction of a so-called microcomputer under its own brand name rocketed what had been a niche market into the big time. Corporate purchasing agents had been wary of buying PCs from the leading maker, Apple Computer, which had been in existence for scarcely five years. But as the saying went, "You couldn't be fired for buying IBM." Within a year, IBM dominated an industry that had grown to more than 100 manufacturers shipping 2.4 million machines annually.
IBM "had the size, the resources and the experience to set up the infrastructure to deliver millions of these," Kenneth H. Olsen, founder of Digital Equipment Corp., a leading maker of medium-sized minicomputers, said later. "After they had done it, it became easier for others to enter the market."
By selling PCs through Sears and other consumer retailers, moreover, IBM signaled the machine's elevation from a plaything for nerds into an appliance for every home, like a refrigerator. Another lasting, if unintentional, legacy was the creation of one of the world's largest monopolies and personal fortunes: Microsoft, which supplied some of the machine's fundamental software and whose co-founders, Bill Gates and Paul Allen, rode the PC boom to billions of dollars in riches.
Technology's Virtues, Drawbacks
To computer historians, IBM's introduction of its microcomputer marks the start of the PC age, an era in which the beige-colored box with a video screen perched on top came to exemplify technology's virtues and drawbacks in equal measure.
In naming the computer its "Machine of the Year" for 1982, Time magazine managed to be both hyperbolic and conservative in its prognosis. Time reported that most Americans expected PCs to be "as commonplace as televisions" in the near future (today only about half the households with TVs also own PCs, and the ratio might never change much). But it underestimated the growth of the overall market by noting that experts were forecasting that 80 million PCs would be in use by 2000. In fact, there are an estimated 500 million PCs in use worldwide today; last year, new sales alone totaled 131 million.
The PC's evolution has mirrored the march of technological change: Today's $1,000 desktop machine runs on a chip with roughly 300 times the power of the original. It holds at least 2,000 times as much memory and nearly 120,000 times the data-storage capacity--all for a price that has fallen, in constant dollars, by more than 80%.
Nevertheless, many users curse the PC as a device that, despite 20 years of development, still doesn't work right.
In part, that is because the PC is expected to perform so many disparate functions that its software programs often interfere with one another, producing its notorious propensity to freeze and crash (often blamed on glitches embedded in the standard operating software from Microsoft).
Moreover, the question of whether the PC has been an unalloyed economic blessing remains wide open. Although corporate purchases of PCs have been justified for decades by the expectation of sharply increased worker productivity, there are signs that computerization might have the opposite effect in some industries, partly because it generates billions of dollars in hidden costs.
"It's very costly to maintain this stuff. There's a long learning curve. They break down all the time," said economist and business essayist Jeff G. Madrick. Much of what computers enable people to accomplish doesn't generate real economic gains, he contends.
"Legal briefs are a lot longer than before, but the practice of law isn't arguably better," he said. "Writers write faster, but certainly novels aren't any better."
Madrick does believe, however, that computers might have helped industry manage a social change that already was occurring--the fragmentation of the consumer market because of buyers' demands for more choice. Without small, fast computers, manufacturers might be hard-pressed to keep straight the vast variety of detergents, automobiles and, indeed, PCs themselves that inundate the marketplace. "The computer has been a great tool for keeping up," he said.
Meanwhile, some experts speculate that as a centerpiece of daily life, the PC already might have reached its high-water mark. PC sales in the U.S. are falling this year for the first time.