
Chip Designer's 20-Year Quest: Computers: Gilbert Hyatt's solitary battle to patent the microprocessor appears to have paid off, if it can withstand legal challenges. Here's his story.

October 21, 1990 | DEAN TAKAHASHI | TIMES STAFF WRITER

In 1968, much of the world was in turmoil. Thousands died in the bloody Tet offensive in Vietnam. Soviet tanks rumbled into Czechoslovakia. Robert F. Kennedy and Martin Luther King Jr. fell to assassins' bullets. Police clashed with demonstrators in Chicago.

But Gilbert P. Hyatt distanced himself from the turbulence. He was not interested in politics or protests. The 30-year-old electrical engineer was absorbed with matters both big and small: designing a computer to fit on a silicon microchip no bigger than a fingernail.

Risking his family's savings and security, Hyatt quit his well-paying job as a research scientist at Teledyne Inc. and retreated to his Northridge home. In the family room, alone, seven days a week, Hyatt tinkered with homemade electronics hardware.

An idea emerged. And though the design was never used to produce a chip that powered an electronic gadget, Hyatt is now asking the world, two decades later, to recognize him as the creative inventor who made the computer revolution possible.

In July, after a 20-year legal fight that generated an estimated 10,000 pages of paperwork, the U.S. Patent and Trademark Office gave Hyatt patent No. 4,942,516 for a "Single Chip Integrated Circuit Computer Architecture." The patent shocked the electronics world.

If it withstands legal challenges, the patent could establish Hyatt as the father of the "computer on a chip," or microprocessor, the technology that spawned modern wonders of electronics from pocket calculators to microwave ovens.

"I didn't invent the computer, but I came up with a very good improvement," said Hyatt, a 52-year-old workaholic who prefers to work alone in a modest home on a cul-de-sac in La Palma. "My work in those days led to the PCs (personal computers) of today."

The patent has brought Hyatt's work under the scrutiny of an army of attorneys for computer-chip makers, who are trying to determine its scope and validity. If the patent is upheld, Hyatt could become wealthy, possibly earning millions of dollars a year in royalties.

Depending on the viewpoint, Hyatt is either an underdog inventor who was undermined by greedy investors or a frustrated scientist who pressed his idea through the patent system rather than proving it in the marketplace.

Among those who begrudge Hyatt a place in history are former Intel Corp. researchers Marcian E. (Ted) Hoff and Federico Faggin, who along with engineer Stan Mazor have been credited with inventing the first commercial microprocessor between 1969 and 1971.

Another inventor, Gary W. Boone, an ex-Texas Instruments engineer, has also been recognized because he was awarded the first microprocessor-related patent in 1973--three years after Hyatt filed for his.

Boone, now a 45-year-old researcher in Colorado Springs, Colo., built TI's first microprocessor, the TMS 0100, starting in January, 1970. He said in a statement that several firms contributed to the technology, including TI, Intel and a common customer, Computer Terminals Corp., now known as Datapoint Corp., of San Antonio.

But Hyatt contends that he is the true inventor of the microprocessor and that the technology "leaked out" to the industry through investors. He also says that the patents awarded to Intel and TI cover only limited, product-specific improvements to the microprocessor.

"The patent office did a very thorough job before they issued the patent," he said. "That is why it took 20 years."

This is the story of Hyatt's efforts two decades ago to change the world, and the difficult and complex legal battle to get the world to formally recognize those efforts.

The competitive atmosphere in the microelectronics industry in the mid- to late 1960s was similar to the race between the United States and the Soviet Union to send a manned spacecraft to the moon. Electronics companies were going all-out to build chips with consumer uses.

The semiconductor chip had been invented a decade earlier by engineers Jack S. Kilby at TI in Dallas and Robert N. Noyce, then at Fairchild Semiconductor in the Silicon Valley. With the chip, the components of a complete electronic circuit were built on a single slice of silicon.

Still, the computers built from the chips stood several feet high, mainly because of their bulky magnetic-core memories and reel-to-reel tape storage. They could control missiles but were too big to drive consumer items such as watches. The industry needed something smaller.

"The idea of a computer on a chip was the next frontier of what we were doing," said former Intel researcher Faggin, 48, now president of Synaptics Inc., a San Jose company developing artificial-intelligence products. "A lot of people were playing with it."

Like the technology, the electronics industry itself was in flux. Noyce and Gordon E. Moore left Fairchild Semiconductor and formed Intel in July, 1968. Other researchers left companies such as TI or Motorola Inc. to become entrepreneurs.
