
A Poll Taker on the Receiving End : Answering Questions Is Harder Than Asking Them

February 13, 1986|JOHN TAGG | Tagg is a San Marcos, Calif., free-lance writer and former speech and writing instructor at Cal State Northridge and UC Berkeley.

I was randomly chosen last evening as the subject of a public opinion poll. It brought back memories. When I was in college, I worked one summer doing surveys for a polling firm. They were face-to-face surveys; the one I responded to last night was over the telephone. But the questions were of the same sort, with the same flaws.

Degrees of Opinion

When I asked people questions, I had a pink card about the size of a computer punch card that listed seven degrees of opinion. One, at the left side of the card, might stand for "strongly disagree" and seven, at the right side, for "strongly agree." I would ask people a question, show them the card, and ask them to tell me where on the scale they would place themselves. Some would actually answer this question with a number: "Oh, I guess I'd be about a five." But quite a few would take the card, look at it for a moment, and then say something like, "Oh, I don't know about that, but . . . " They would then launch into an explanation of their views on the subject, gesturing with the card but otherwise ignoring it. Often it was left to me to wrench an extended explanation into an "agree for the most part."

I distinctly remember one particular polling subject I came across on a very hot Saturday afternoon in Hemet. We were doing a survey for John Tunney, who was going to run for Congress--and later, of course, for the United States Senate. It was the summer of 1967 and some of the questions unavoidably dealt with the Vietnam War. I was surveying in a mobile home park, one with very few trees, so when a small gray-haired woman asked me to come into her air-conditioned living room, I readily consented. The woman held strong opinions about foreign policy. When I asked her the first question about Vietnam, she embarked on a detailed and specific critique of government policy, supported with detailed and specific evidence. I almost lost my pink card; she had forgotten where it came from and started to use it for a bookmark in a volume by Bernard Fall from which she'd read me a quotation.

For the first half-hour or so I kept trying to get her to summarize, to draw simple conclusions which would answer my two-dimensional questions. But she would wave me off, using the pink card as a fan, and explain that it wasn't that simple. She was less concerned with conclusions than with reasons. She refused to simplify because, as she said a dozen times, "it's not that simple." I was there for more than an hour and I learned a great deal, but I gave up entirely trying to get her to answer my questions.

Answers Didn't Fit

After I left I tried to fill in the answers on her behalf--having spent an hour with a single respondent I at least had to submit a completed questionnaire! But I was struck by the inherent discontinuity between this woman's opinions and my questions. They just didn't fit. She had intelligent opinions, and I was asking, by her standards, stupid questions.

I was reminded of this experience last evening when I was trying to respond to the questions on the telephone survey. The woman would ask me a question and then list my potential answers. More often than not, my answer wasn't on the list.

"What do you think of negative campaigning? If one candidate sent out 'hit mail' aimed at his opponent would you (a) vote against the candidate who sent out the negative material? (b) not vote? (c) . . . "

But wait, I said, more or less. It would depend on what the mail said, on whether what the mail said was true, on the timing of the mail, and on how the other candidate responded to it. I even gave the woman an example of someone I'd voted against because they'd campaigned negatively, and someone I'd voted for because they'd campaigned negatively. But even as I spoke I could visualize the questioner at the other end of the line making up an answer for me that would fit on the form in front of her. That, after all, is her job.

Public opinion polls have their uses, but defining the options for public policy should not be one of them. If you are dealing with thinking people, you cannot use a predefined scale of responses to limit their answers, because they will go beyond, or behind, or above your scale. For most of the questions on most opinion surveys, the best and wisest answer will always be "none of the above."
