
Constituent Polling: Is It Reliable or Just PR? : Congress: One lawmaker got a questionnaire return rate of 2.6%. Professional pollsters consider 50% rock bottom for a survey to have statistical validity.

February 19, 1990 | ALAN C. MILLER | TIMES STAFF WRITER

WASHINGTON — Every year for the past decade, Rep. Anthony C. Beilenson (D-Los Angeles) has sent questionnaires to each household in his West San Fernando Valley and Westside Los Angeles district, soliciting constituent views on national defense, the deficit and social issues.

"It gives me a feel for how people out there feel," Beilenson said, citing the thousands of responses his office receives. "It does give people an opportunity to tell their representative how they feel. People like that and they feel good about it."

Many of Beilenson's colleagues agree that such surveys help them keep a finger on the pulse of their far-flung districts.

In the Valley area, Rep. Elton Gallegly (R-Simi Valley) has referred to results from such mailings as reflecting the views of taxpayers in his district, in much the same way as a public opinion poll. And Rep. Carlos J. Moorhead (R-Glendale) has cited his constituent surveys to make points with colleagues, most recently his constituents' perspective on air pollution problems.

Not everyone familiar with the questionnaires, however, embraces them as a time-honored rite of American democracy. Many observers and participants, in fact, are skeptical about what they represent and how they are used.

First, taxpayers do not appear terribly enthusiastic, based on the small percentage that participate. Gallegly received 7,500 responses to 285,000 questionnaires last year--a meager 2.6% return rate. Beilenson and Moorhead did only slightly better, with about 4%.

Professional pollsters, meanwhile, said that lawmakers who circulate such surveys are misguided at best and duplicitous at worst when they treat the responses as accurate representations of the views of all their constituents.

And election opponents assert that the surveys--each of which costs $6,000 to $10,000 to print and mail and takes many days of valuable staff time to tabulate--are a self-serving waste of taxpayers' money intended primarily to score points with voters.

"I don't think people in Congress are doing these surveys to find out how their constituents feel," said Mervin Field, director of the California Poll and president of the nonpartisan, public affairs Field Institute in San Francisco.

"I think they're doing it for two reasons," Field said. "It's publicity for the congressperson. And if you open it up and you see a survey and look at the questions, you say, 'Here's my representative who wants to know what I think. Isn't that a good thing!' "

Scientific opinion polls--such as the Gallup and Harris polls and those commissioned by many members of Congress for their election campaigns--solicit the opinions of people who are carefully chosen to be representative of an entire group. In contrast, the constituents who return congressional surveys are self-selecting and, pollsters said, probably not typical.

"In many cases, they are people who tend to be closer to that elected official," said Arnold Steinberg, a Republican pollster based in Sherman Oaks. "A lot of other people write the congressman off and say, 'He doesn't care about my views.' "

A related problem is the small percentage of people who respond. Professional pollsters seek a participation rate of at least 60% to 70%--with 50% considered rock bottom--for a survey to have statistical validity. As the reply rate drops, so does the probability that the sample accurately reflects the views of the whole group.

Thus, even if the number of responses in a congressional survey appears impressively large, the results still are considered of dubious validity when the response rate is less than 5% of those queried.

"At 5%, the sampling error is astronomical," said I. A. Lewis, director of the Los Angeles Times Poll. "It's probably incalculable."

Further distortion occurs, pollsters say, if questions are framed in a manner that is likely to induce answers that support a lawmaker's position.

In 1989, for instance, Gallegly stated his views on various issues and asked constituents whether they agreed, disagreed or had no opinion.

"Defense spending has decreased in real terms (after inflation) in each of the past three years. In real terms, the United States must, at a minimum, maintain current spending levels" was a typical item in Gallegly's survey.

Field called this "the kind of loaded question you would put in a survey methods textbook to show how you can load an issue.

"It's saying, 'We've already cut the thing to the bone. We don't want to cut it any further, do we?' The public will always respond in the direction the question is worded. They'll try to agree."

In general, the veteran pollster said, by giving only one side of most issues, Gallegly's mailing was "not a survey but a planned effort to get a sycophantic response."
