
Op-Ed

What was John Edwards thinking?

A landmark study shows what happens in our brains when we're given the chance to lie. For some, doing the correct, difficult thing seems natural -- a trait worth cultivating.

June 03, 2012|By Robert M. Sapolsky
[Photo: John Edwards returns to a federal courthouse during the ninth day of jury deliberations in his trial on charges of campaign corruption in Greensboro, N.C. (Chuck Burton / Associated Press)]

Is John Edwards a criminal or merely a sleazebag of breathtaking proportions? The jury couldn't quite make up its mind.

But various reasonable questions come to mind in the wake of the mistrial. For example: Whoa, close call, can you believe that this guy might have been president? What exactly was he thinking? And then there's: Why didn't he have a shred of willpower when it came to honesty and doing the right thing?

The spectacle reminds me of something moving about honesty I read recently, in, of all places, a scientific journal.

In a 2010 report in the prestigious Proceedings of the National Academy of Sciences, Harvard psychologists Joshua Greene and Joseph Paxton asked a great question: When people are confronted with the opportunity to lie, what differs in the brains of people who succumb to the temptation and those who don't?

For the study, each subject was placed in a functional MRI machine, a brain scanner that tracks the ongoing levels of activity in different brain regions. The volunteers had a simple task. There'd be a series of virtual coin tosses by a computer, and before each one, the subject had to predict the outcome. Guess right, and there'd be a financial reward.

But there was a twist. Subjects were told a great piece of nonsense, namely that the purpose of the study was to determine whether people had better paranormal powers at predicting the future when the predictions were made in private. To examine this, scattered through the series of coin tosses would be the occasional instance where instead of a subject entering the prediction before the toss, he would privately make his prediction. Then, after the coin toss, he'd be asked: So, did you guess right? In other words, people were given the opportunity to lie, to claim that they had predicted correctly and then reap the financial reward.

How could Greene and Paxton tell if someone was lying? Coin tosses being what they are, across, say, 50 instances in which a subject had to register a prediction beforehand, he'd be correct roughly half the time — mere chance. If, on the tosses where there was the opportunity to cheat, the subject's success rate skyrocketed, odds were that there was a liar in the brain scanner.
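The logic here is a simple binomial calculation: under honest guessing, each toss is a fair 50-50 bet, so an improbably high hit rate betrays cheating. A rough sketch (the toss counts below are illustrative, not the study's actual numbers):

```python
from math import comb

def chance_of_at_least(successes: int, tosses: int, p: float = 0.5) -> float:
    """Probability that an honest guesser gets at least `successes`
    correct out of `tosses` fair coin flips (binomial tail)."""
    return sum(comb(tosses, k) * p**k * (1 - p) ** (tosses - k)
               for k in range(successes, tosses + 1))

# An honest subject hovers near 25 out of 50 -- entirely unremarkable.
print(chance_of_at_least(25, 50))  # ~0.56: consistent with pure luck

# A "skyrocketed" rate like 42 out of 50 is wildly improbable by chance,
# so the odds are there's a liar in the scanner.
print(chance_of_at_least(42, 50))  # ~6e-7
```

Note that this flags only statistically blatant cheaters; someone who lied just once or twice would be indistinguishable from a lucky honest subject, which is why the study's middle group was "hard to classify."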

To start, here's some demoralizing news — fewer than half of the people were in the clear-cut honest range, with success rates remaining around 50% when they had the chance to cheat. About a third seemed to be lying often enough that their success rates were well above 50% at those times. The remaining subjects had success rates that were somewhere in between, and thus hard to classify. (Remarkably, the cheaters included one doozy of a scoundrel who claimed something close to perfect accuracy when given the chance to lie.)

What went on in the brains of people when temptation beckoned? Let's start with the "liars" — the people who lied with sufficient frequency that they could be detected statistically. Central to the results was a region called the prefrontal cortex, or the PFC. This is one interesting part of the brain — it's all about self-discipline, gratification postponement, emotional regulation, control of impulsiveness. It's the part of the brain that keeps you from being a serial murderer, that makes you a good dinner guest who proclaims the fare was delicious even when you're about to sprint for the porcelain throne. It makes you do what's hard to do when it is the right thing to do. It's bigger and more complex in humans than in any other species, is our most recently evolved brain region and is the last part of our brains to fully mature.

So when the opportunity to cheat arose, the activity in the PFCs of liars shot up like crazy. The scans showed the trace of an epic moral battle — do it, don't, yes do it, no don't — that the liars lost. And the larger the percentage of lies told, the greater the overall PFC activity.

And what were the levels of activity in the PFCs of those who, from a statistical standpoint, never lied? Greene and Paxton present two differing views in moral philosophy about honesty: Is honesty an act of will, one that requires a person to work hard to refrain from doing the wrong thing? Or is it an act of grace, effortless because temptation isn't tempting? In the study's paradigm, it was grace all the way — among the unequivocally honest, there was no increase in PFC activity when the chance to cheat arose. Their PFC neurons weren't successfully wrestling Satan into submission. There was no wrestling. It was simple — you don't do that, it's wrong, period.

In the face of real life's temptations, a majority of us are not going to get by on pure grace. We resist, but sometimes it takes work. Lots. Or perhaps we succumb and thereby shock ourselves with what we are capable of. We ooze our human frailties.

And yet there are those who glide through minefields of enticement, doing the difficult, rare, brave, correct thing as naturally as breathing. It can seem hard to believe that a person could really be this way. But a high-tech brain scanner documented that it's possible. It's an achievable goal. And should be. Even for someone who would be president.

Robert M. Sapolsky is a professor of neuroscience at Stanford University and the author of "A Primate's Memoir," among other books.
