When it comes to preventing mistakes, American medicine is just beginning to see the error of its ways.
Though it prides itself on huge advances in technology and skill, medicine has lagged far behind other life-and-death industries in its understanding of simple human error. Ask those who run nuclear reactors for a living or ferry passengers through the sky: Accidents will happen; the trick is to minimize them with built-in precautions.
But doctors have been expected to individually transcend their mortal flaws or, failing that, to accept the sting of blame. Moreover, the nation's malpractice system, the only public means of assigning blame and compensating victims, has left many physicians terrified and patients traumatized. What it has not done is substantially stem the tide of medical errors.
With jarring recent estimates that tens of thousands of Americans a year die from medical mistakes, however, the health care profession is being forced to examine itself. The result is shaping up as a profound transformation in the way the profession treats the scourge of human error.
Many health care leaders are coming around to the view that mistakes are not, by and large, the fault of "bad" doctors; more frequently, they are committed by good--but imperfect--doctors working in a badly designed system.
Reform, in this view, must target the system rather than errant individuals. Through technology, computerized reminders and routine double-checks, doctors can be kept from making mistakes in the first place rather than punished for them afterward.
"When you have a culture that stresses the only way to keep patients safe is to be perfect, that's not a good system," said Dr. James Bagian, director of the National Center for Patient Safety. "You can want to be perfect and strive to be perfect, but when you fall short, the question is, what do you do about it?"
Increasingly, medical leaders are looking to other high-risk fields for advice. Often, they look skyward, to aviation, which for decades has used built-in safeguards to make it tough for pilots to err.
Many of the leaders in the new medical safety movement are aviators themselves, or at least flying buffs. Bagian is a former astronaut. John Nance, a commercial airline pilot and consultant, is a member of the National Patient Safety Foundation at the American Medical Assn.
According to Nance, aviation learned more than 30 years ago that "when the question is 'who's wrong' rather than 'what's wrong,' you accomplish nothing."
What the airline industry did was establish a reporting system that allowed pilots and others to flag errors and close calls confidentially, minimizing the risk to their careers. From this wealth of experience on possible calamities, it designed safety measures to make such disasters nearly impossible.
Nance gives a simple example: "If you don't want people to put the landing gear in the 'up' position on the ground, then you disconnect their ability to retract it."
In medicine, the same concept is now being applied to the operating room: To prevent an anesthesiologist from killing a patient by mistaking the nitrous oxide hose for the oxygen hose, you make the fittings to the anesthesia machine in different sizes for different gases.
Aviation is full of such "forcing functions" that steer pilots away from errors. Its manuals also contain checklists and standardized procedures that Nance says have sometimes been "paid for in human blood."
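Software designers use the same trick, sometimes described as making wrong states impossible to express. A minimal sketch of the gas-fitting example in code terms (all class and method names here are hypothetical, invented for illustration):

```python
class OxygenFitting:
    """Connector shaped to fit only the oxygen inlet."""
    gas = "oxygen"

class NitrousFitting:
    """Connector shaped to fit only the nitrous oxide inlet."""
    gas = "nitrous oxide"

class OxygenInlet:
    """An inlet machined so that only an OxygenFitting attaches.

    The check below plays the role of the physical forcing function:
    a mismatched fitting simply cannot be connected.
    """
    def connect(self, fitting):
        if not isinstance(fitting, OxygenFitting):
            raise TypeError("fitting does not fit the oxygen inlet")
        return f"{fitting.gas} line connected"

inlet = OxygenInlet()
print(inlet.connect(OxygenFitting()))   # the correct hose attaches normally
# inlet.connect(NitrousFitting())       # would raise TypeError: the error
#                                       # is blocked by design, not by vigilance
```

The point is the same as the landing-gear interlock: safety comes from removing the opportunity for the mistake, not from asking the operator to be perfect.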
The aviation industry doesn't wait for real crises; it simulates them. During sometimes hair-raising rides in pseudo-cockpits, captains are taught to avoid panicky errors and trained to work well in teams.
By comparison, medicine's attention to basic safety lags far behind, according to a report last November by the Institute of Medicine. That report, titled "To Err Is Human," gave momentum to change like nothing before. It jolted the media, Congress, President Clinton and--perhaps most important--the medical industry itself into awareness and action.
The report estimated that as many as 98,000 people each year die from medical mistakes. Although the accuracy of that estimate has been challenged, the impact of the report was "huge--just huge," said Dr. Gil Kuperman, director of clinical systems research at Partners HealthCare System, a nonprofit company based in Boston. "The numbers were staggering. It's impossible to avoid this issue now."
Yet there are formidable barriers to change, perhaps the greatest being the prevailing culture of medicine itself.
Doctors aren't universally required to report their mistakes, and they lack incentives to come forward. Pilots, at least, have the ultimate incentive: avoiding what Randall R. Bovbjerg of the Urban Institute calls "capital punishment." If they really blow it, they go down with the plane.
In medicine, many doctors say, fear of litigation often discourages the candor needed to pinpoint where and why errors occur--an essential prelude to designing prevention systems.