
COLUMN ONE: Fragile Virtual Libraries: Digitizing books and papers opens up a trove of culture to anyone with a modem. But archivists worry about what happens when the power fails.


The electronic archive at Washington University is a library without walls for books without pages--a wonder of the nether world called the Internet.

Operated from the school's St. Louis campus, the archive may be the world's largest public computerized information source--so indispensable that an average of 45,000 people tap into it every day; so besieged with requests that another 60,000 people daily are turned away.

None of them actually has to set foot in St. Louis.

The archive, housed in a small computer on a folding table, is a fledgling example of what proponents call a virtual library, in which international computer networks and automated databases replace traditional book repositories. A modem serves in lieu of a library card.

The advent of the virtual library may be the most significant change in the nature of the public library in centuries, experts say.

With several hundred thousand files of text, software and images available instantaneously worldwide, the 60-gigabyte Washington University archive is a tentative step toward a time when electronic libraries will make books seem as archaic as clay tablets.

So the consternation was understandable one morning not so long ago when the archive vanished--vaporized when the computer's memory failed.

In their digital bindings, the books of that virtual library were as vulnerable to a flipped bit or a power surge as monastic scrolls were to the barbarian's torch.

For library specialists and some computer scientists, the fragility of the St. Louis archive, which has since been painstakingly restored, is a cautionary tale. Dozens of even more comprehensive electronic libraries are being planned.

Some experts worry that reliance on electronic archives may make humanity's hard-won knowledge more vulnerable and expose it to unexpected risks of technological obsolescence.

Computer equipment becomes obsolete so quickly that it may be impossible for historians of the next generation to study today's electronic records, documents or databases. As computers make it easier to store, catalogue and retrieve information, the information itself is becoming more fragile. Conventional type can withstand all but destruction of the page on which it is printed, but it takes only a stray magnetic field to kill an electronic file forever.

And when the material of history and culture is electronic, what happens when the power fails? "We are very scared about the electronic media," said Peter Hirter, coordinator of the electronic public access initiative at the National Archives, which is responsible for preserving valuable government records in perpetuity.

"The problems that are associated with long-term preservation of scanned, electronic material are immense," he said.

After all, almost no one requires special equipment to read a book.

Computer scientists, however, are used to thinking of memory in terms of nanoseconds, not decades or centuries. "Very few people in the computer science world have really thought much about the problem of longevity," said Jeff Rothenberg, an expert on computer storage and longevity at the RAND Corp.

The Lure of Digital Immortality

For researchers and archivists awash in hard-copy information, the immediate promise of electronic archives and libraries is a liberating one.

In the Washington area alone, the National Archives houses about 6 billion documents, 7 million pictures, 118,000 movie reels and 200,000 recordings. The 65-acre Library of Congress, the world's largest library, houses more than 107 million items, ranging from the papers of 23 U.S. Presidents to one of only three existing perfect copies of the Gutenberg Bible. Panicked over how to preserve what they have, federal archivists are watching their collections grow by more than 5 million items a year.

Confronted with such perishable mountains of material, the Library of Congress, the National Archives, the University of California and other major universities are investigating ways to transform their collections into digital computerized records that can be distributed as widely as possible.

The aim is to drastically lower the cost of warehousing books, manuscripts, photographs, maps, motion pictures and sound recordings, while dramatically broadening public access around the world to even the most obscure historical collections. With computerized indexes, scholars might even discover everything that is stored on the hundreds of miles of shelves and file drawers in the National Archives.

"There is no area in our library or any other library that will remain untouched by digitization and computers," said Suzanne Thorin, chief of staff at the Library of Congress.

This spring, the Library of Congress joined with 14 other major research libraries to begin a national digital library. As a start, they hope to have 5 million digital documents available to the public through the Internet and on CD-ROM by decade's end.
