
The Nation

One Last City Is Scanning for Faces in the Crowd

Virginia Beach, Va., is sticking with its controversial system of surveillance despite indications that it has little effect on crime.

September 29, 2003 | David Lamb, Times Staff Writer

VIRGINIA BEACH, Va. — Twelve feet above the sidewalk, three cameras scan the faces of unsuspecting crowds on Atlantic Avenue. In a police control room a few blocks away, Lt. Dennis Santos sits before a bank of screens, holding a joystick that enables him to pivot the cameras and zoom in or out. With the help of computers, he is looking for terrorists and criminals.

Now that Tampa, Fla., has scrapped a similar system, calling it a failure, Virginia Beach is the only American city using controversial facial-recognition cameras. Although the system has led to no arrests since it was put into operation in this popular resort town a year ago, senior police officers call it an important law enforcement tool that is here to stay.

"This isn't a silver bullet," Deputy Police Chief Gregory Mullen said. "It's not going to stop or prevent crime. But the system works. It can capture images and it can match them with those in the database. It is one step in making this a safer city, and we have taken that step while being careful to protect the rights and liberties of our citizens."

Basic surveillance cameras -- in schools, department stores and parking lots -- have been part of our lives for years, and some private businesses, such as casinos, use facial-recognition cameras. But for many people, the idea of government agencies capturing our images in public venues is an unsettling invasion of privacy that smacks of the Orwellian message "Big Brother is watching."

"Broadly speaking, we are becoming a surveillance society," said Kent Willis, executive director of the American Civil Liberties Union in Virginia. "While we may have concerns about private business using surveillance cameras, when government with its great power is the one doing the surveillance, we should worry. There is a long history of governments using the information they gather."

The ACLU has requested, under the Freedom of Information Act, the Police Department's data on the accuracy rate of the system and the contents of the image database. Officers said the system has registered numerous "false positives," all of which were dismissed by police personnel in the control room as unlikely matches. Police have been dispatched only twice: One "match" disappeared in the crowd and the other was not the felon whose image had come up on the screen.

Facial-recognition technology, still in its infancy, appears to work well in ideal laboratory conditions, but its reliability remains in question in real-life situations when light is bad or the camera lens gets dirty. Last year, for instance, two separate systems at Boston's Logan Airport failed 96 times to identify volunteer "terrorists" whose images had been entered into the database. The systems correctly identified the "terrorists" 153 times.
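The arithmetic behind those Logan Airport figures is worth spelling out. Taking the article's numbers at face value, the staged tests involved 153 correct identifications and 96 misses, for an overall hit rate of roughly 61%:

```python
# Detection rate implied by the Logan Airport trial figures quoted above.
correct = 153   # volunteer "terrorists" correctly identified
missed = 96     # identification failures
trials = correct + missed   # 249 attempted identifications in total

rate = correct / trials
print(f"{rate:.1%}")   # prints 61.4%
```

In other words, under airport conditions the systems missed nearly four in ten of the very faces they had been given in advance.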

And last month Tampa pulled the plug on its system, not because of constitutional issues but because "it didn't work," said Capt. Bob Guidara, a Police Department spokesman. "We never identified, were alerted to, or caught any criminal."

Tampa installed the facial-recognition system in its Ybor City tourist area in June 2001 after testing it, unannounced, at Super Bowl XXXV the previous January. That mass screening of fans caused so much controversy that civil libertarians called for congressional hearings on high-tech surveillance, and Dick Armey (R-Texas), then the House majority leader, said, "Our comings and goings are none of the government's ... business."

After the terrorist attacks of Sept. 11 that year, opposition to the system quickly faded as a nervous public began demanding better screening at airports and public venues. In a Harris Poll two weeks after the attacks, 86% of Americans endorsed the use of facial-recognition software to spot terrorists.

Virginia Beach, consistently rated among the nation's safest cities of its size -- it has a population of 450,000 -- went to great lengths to build safeguards into the system, financed largely by a $150,000 state grant. It held public forums to air the views of the National Assn. for the Advancement of Colored People, ACLU, human rights activists and the business community. It appointed an independent audit group whose members can walk unannounced into the control room. It put up signs on Atlantic Avenue warning, "Attention. Smart CCTV in use." Only three officers have the passwords to access the system and enter or purge data.

Technicians put about 1,000 images into Virginia Beach's database, including those of 650 felons with outstanding warrants, known terrorists and people on the FBI's most-wanted list. Runaway children have also been entered with their parents' consent. The cameras are monitored up to 20 hours a day during the busy summer tourist months, much less in the off-season.

When the system is operational, recognition software analyzes faces based on 26 bone structures and 80 measurements, such as the distance from the tip of the nose to the chin or the distance between the eyes. If there is a match in 14 of the 80 items between a face captured on camera and one stored in the database, a "foe alarm" buzzes. The software does not recognize skin color, facial hair or the distinction between genders. If there is no match, the camera image is deleted. The system has no storage capability.
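The thresholding logic described above can be sketched in miniature. The vendor's software and its actual feature-extraction method are proprietary, so everything below is an illustrative assumption: faces are represented as plain lists of 80 hypothetical measurements, and a per-measurement tolerance stands in for whatever comparison the real system performs. Only the 14-of-80 threshold comes from the article.

```python
# Toy sketch of the match-threshold scheme the article describes.
# The measurement values and TOLERANCE are invented for illustration;
# only NUM_MEASUREMENTS and MATCH_THRESHOLD come from the reporting.

NUM_MEASUREMENTS = 80   # facial measurements per face (per the article)
MATCH_THRESHOLD = 14    # matching measurements needed to trigger an alarm
TOLERANCE = 0.02        # assumed per-measurement agreement tolerance

def count_matches(captured, stored):
    """Count how many of the 80 measurements agree within TOLERANCE."""
    return sum(1 for a, b in zip(captured, stored) if abs(a - b) <= TOLERANCE)

def raises_alarm(captured, stored):
    """True if enough measurements match to buzz the control room."""
    return count_matches(captured, stored) >= MATCH_THRESHOLD

# An identical pair of faces matches on all 80 measurements...
face = [i * 0.1 for i in range(NUM_MEASUREMENTS)]
print(raises_alarm(face, face))                    # prints True
# ...while a face shifted on every measurement matches on none.
stranger = [v + 1.0 for v in face]
print(raises_alarm(face, stranger))                # prints False
```

A threshold this low relative to the total (14 of 80) helps explain the "false positives" officers reported: a stranger need only resemble a database entry on a sixth of the measurements to set off the alarm.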

"We'd never rely 100% on the system to make an arrest," Lt. Santos said. "In the end it comes down to old-fashioned police work. An officer goes out to determine: Is this person a suspect or not? The decision is based on the human element, not technology."
