
MEASURING AMERICA

Confessions of an ex-pollster

February 11, 2007 | Nicholas Goldberg | NICHOLAS GOLDBERG is editor of the Op-Ed page and Current section of The Times.

THE WEEK I became a pollster, the following paragraph appeared in a New Yorker article about President Clinton, written by the magazine's then-political columnist Joe Klein:

"The president [is] an absolute slave to data: nothing is left to chance (or sadly, to moral imperative). Each new idea is market tested before it is presented to the public.... Other presidents have been poll-obsessed, but none quite so microscopically as this one.... Such micromarketing may be remembered as this president's most lasting, and most dubious, contribution to the art of governance."

Some people might have been disheartened, but I was delighted. At last, I was entering a field just as it was coming into its own! From its earliest days, political polling had its critics -- those who thought it was too intrusive or too unscientific, or who worried (like Klein) that political candidates would become excessively dependent on public opinion at the expense of their personal principles -- but I was inclined to give it the benefit of the doubt.

I knew, of course, that candidates might take polls too much to heart -- and might even cynically change their positions based on them. But I tried to keep in mind what Abraham Lincoln supposedly said: "What I want to get done is what the people desire to have done, and the question for me is how to find that out exactly."

In retrospect, my understanding of the business when I began the job in early 1999 was extremely unsophisticated (though perhaps not as unsophisticated as that of my mother, who suggested at one point -- perhaps in jest? -- that she thought I was becoming an "upholsterer"). I believed, as most Americans probably do, that polls are simply a tool for finding out where people stand on issues and who they are going to vote for.

What I failed to grasp was that the primary purpose of our business was not to learn what voters think -- but to determine how they could best be persuaded.

The surveys I created in the years that followed would have little in common with the public opinion polls I had read in newspapers all my life (the kind that tell you, as Arianna Huffington once put it, that "59% of all Americans think 'Ed' is an 'OK' name, while 64% put on their pants left leg first").

Yes, our clients wanted to know whether the voters would support them. And they certainly needed to know which issues mattered most to voters -- schools, taxes or crime.

But at their core, our polls were not about taking the proverbial pulse of the voters. As we used to explain in our pitch letters, we had no interest in providing clients with a useless "data dump." We were seeking "actionable" information to prepare a detailed, quantitatively tested "blueprint" that in turn would help us craft the arguments that would resonate most forcefully with voters.

What did that mean exactly? It meant pinpointing how the public felt about our clients -- and then figuring out how to transform those perceptions among the voters we needed most. It was about driving up our candidate's positive attributes while inoculating him or her against potential attacks. And the same with our opponents: We'd probe for their vulnerabilities and determine how they could be exploited.

Say, for example, our client was a 20-year veteran of the House of Representatives who wanted to run for the Senate. But after two decades in office, he wasn't sure whether he was perceived as an energetic fighter for his constituents or as a lazy, aging political hack.

Enter the pollster! Could we buff up our client's image by leveraging his votes in favor of healthcare coverage and his vehement opposition to raising taxes (which we would hammer home in uplifting television ads set against a background of comforting guitar music and photos of him with his kids)? Or would his opponents be able to outmaneuver us by harping on the 25 House votes he missed last year while vacationing in Bermuda?

Our goal was to run the campaign in theory before it started. To call 600, 800 or 1,000 test voters on the phone and begin to play out the arguments that would later be heard in union halls, direct-mail attacks, candidate forums, public debates and -- most important by far -- in millions of dollars of paid television ads to be aired in the closing weeks of the campaign.

A typical poll would open by screening out representatives of the media and screening in likely voters. We'd ask which issues were most important. We'd ask the critical "horse race" question: "On Nov. 3, there will be an election for U.S. Senate. Who are you going to vote for, Democrat John Smith or Republican Tom Jones?" We'd ask about our candidate's "image": Would you say he "cares about people like me" and "is effective," or is he "in the pocket of the special interests"?
