

Filming Without the Film

High-definition digital cameras, feared by some directors, could end the careers of those unable to make the transition.


NICASIO, Calif. — Oliver Stone stared in disbelief. Here he was, sitting in a velvet seat in George Lucas' private screening room, listening to the "Star Wars" director foretell the death of film.

To Stone, director of such films as "Platoon" and "JFK," Lucas' vision of digital movie making sounded like blasphemy. Around him, other A-list directors--including Steven Spielberg, Francis Ford Coppola and Robert Zemeckis--fidgeted as Lucas challenged a century of tradition, warning his colleagues to embrace the future or be left behind.

Lucas' blunt message stands at the center of a schism in Hollywood over the fate of film in the film business. New high-definition video cameras and digital editing equipment challenge the longtime supremacy of film. They are cheaper and more flexible. But they also frighten directors and cinematographers who understand every nuance of film.

A creative misstep can tarnish a career, so many of those established in the film industry blanch at the thought of showing their inexperience with the latest technology. A colossal mistake, seen by millions of fans, might reveal that they are passe storytellers--easily replaced with younger, cheaper and more tech-savvy rivals.

"Film is what we do. It's what we use," Stone sniped at Lucas. "You'll be known as the man who killed cinema."

Lucas merely rolled his eyes as Stone held forth on the poetry of celluloid and the coldness of pixels.

Finally, according to those who were there, Lucas interrupted.

"Just watch."

Raising a hand, Lucas cued his demonstration and told his audience what they would see: identical clips--each stored in a different format--from the animated movie "Monsters, Inc."

One was completely electronic--compiled by a computer, stored on digital tape and shown through a digital projector. The footage looked less like a motion picture than an open window onto a real world: the monsters gabbed in crisp clarity and rich tones.

Next came a traditional film reel that had spent four weeks in a mall theater. With each showing, heat from the projector and dust in the air had faded and degraded the reel. The difference was jarring. Radically out of focus, the film reel cast an image that jiggled and popped on the screen, as if an earthquake were rocking the projector.

Lights came up as the demonstration ended. No one spoke for several seconds.

Debate within the industry is not nearly so quiet.

For directors such as Lucas, the choice is obvious. Breaking new ground for major motion pictures, his "Star Wars: Episode II--Attack of the Clones" was shot entirely with high-definition digital cameras, edited with digital equipment and, for a few dozen theaters, distributed and projected digitally.

Testing the Technology

Sensing the change, a growing number of filmmakers have been testing the digital waters. From students and independent filmmakers capturing their low-budget works on digital video to established directors such as Michael Mann trying high-definition cameras in "Ali," they are curious about the new tools and fearful of being left behind.

But after nearly a century of shooting on film, much of Hollywood's old guard is reluctant to shift gears, a hesitancy that speaks to a powerful culture of fear among some of the industry's most elite directors.

"Film is rather like the magic lantern. There's a sense of mystery, because you don't know what's going into the magic black-box camera until you send the film to the lab," said cinematographer Roger Deakins, director of photography for Ron Howard's film "A Beautiful Mind."

"With digital, it's all very businesslike," Deakins said. "We're not businessmen. We're artists and magicians."

Despite significant advances in the art and science of film since the first roll of flexible celluloid was produced in 1889, the basic process remains the same: Chemicals layered on the surface of the film react when they are exposed to light, changing into hues that match the light's wavelength.

Digital cameras, which began to appear in the mid-1990s, use powerful computer chips that convert light into electronic pulses, which they then translate into data and store on videotape.
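The conversion the article describes--light becoming electronic pulses, then stored data--can be sketched in a few lines. This is an illustrative toy, not any camera's actual firmware: a sensor samples the light intensity at each photosite and quantizes it to a fixed number of bits before recording it.

```python
# Illustrative sketch (assumed 10-bit depth, not a real camera's
# pipeline): quantize a continuous light intensity into the kind of
# discrete digital value a high-definition camera records to tape.

def quantize(intensity, bits=10):
    """Map a light intensity in [0.0, 1.0] to an integer sensor code."""
    levels = 2 ** bits                    # a 10-bit sensor has 1024 levels
    code = int(intensity * (levels - 1))  # scale and truncate to a level
    return max(0, min(levels - 1, code))  # clip out-of-range light

# Smooth analog values become discrete numbers -- the "data" that
# digital cameras translate pulses into and store on videotape.
samples = [0.0, 0.25, 0.5, 1.0]
print([quantize(s) for s in samples])  # [0, 255, 511, 1023]
```

The rounding in `int(...)` is where the analog-to-digital tradeoff lives: everything between two adjacent levels collapses to one stored number.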

With the first cameras, the images were unusually crisp and realistic but no match for the smooth lines and range of colors delivered by 35-millimeter film. Those differences stemmed partly from the cameras' chips, which couldn't capture as much information as film, and partly from the technology used to shrink and store the data.
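The second shortfall mentioned above--the technology used to shrink and store the data--can be illustrated with a deliberately simple lossy scheme (a hypothetical stand-in, not the actual format early cameras used): averaging neighboring samples halves the data to store, but fine detail that alternates sample to sample is blurred away and cannot be recovered.

```python
# Hypothetical illustration of lossy storage: halve the data by
# averaging each pair of samples, then reconstruct by duplication.

def compress(samples):
    # keep one averaged value per pair of samples (half the data)
    return [(samples[i] + samples[i + 1]) / 2
            for i in range(0, len(samples) - 1, 2)]

def expand(halved):
    # reconstruct the original length by repeating each stored value
    out = []
    for v in halved:
        out.extend([v, v])
    return out

original = [10, 200, 10, 200]           # fine alternating detail
restored = expand(compress(original))
print(restored)  # [105.0, 105.0, 105.0, 105.0] -- the detail is gone
```

Real video codecs are far more sophisticated, but the principle is the same: whatever the storage step throws away, the playback step cannot reinvent.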

Sony Corp., Panasonic and other manufacturers developed high-definition digital cameras in the late 1990s that could deliver far more detail and a wider range of color. This summer, Thomson Grass Valley is bringing out a new line of cameras that can capture almost five times as much detail and twice the range of color as previous high-definition models, said Jeff Rosica, vice president of marketing.
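Some back-of-envelope arithmetic puts those claims in context. The figures below are illustrative assumptions, not the manufacturers' published specifications: "detail" scales with the pixel count of a frame, and each extra bit per color channel doubles the number of distinguishable levels, so "twice the range of color" corresponds to roughly one additional bit.

```python
# Illustrative arithmetic with assumed figures (a standard 1920x1080
# high-definition frame as the baseline), not Thomson's actual specs.

hd_pixels = 1920 * 1080        # baseline high-definition frame
newer_pixels = hd_pixels * 5   # "almost five times as much detail"
print(newer_pixels)            # 10,368,000 photosites

levels_10bit = 2 ** 10         # 1024 levels per color channel
levels_11bit = 2 ** 11         # 2048 levels -- "twice the range of color"
print(levels_11bit // levels_10bit)  # 2
```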
