The Healthy Skeptic

Validity of hospital rankings

Facilities on U.S. News & World Report’s annual list rely heavily on their reputations.

April 26, 2010|By Chris Woolston | Special to the Los Angeles Times

Even if you're the kind who never reads a review before heading to the car lot or the movie theater, you probably want to know how your hospital stacks up before you walk through the door. You definitely wouldn't want to find out after the fact that your hospital is a clunker and your surgeon usually gets two thumbs down.

There are many ways to check out a hospital. You could look up survival statistics at the Medicare Hospital Compare site, or you could contact the hospital directly to find out how often it performs the procedure that you need.

But one of the most popular methods is to simply look up a hospital's ranking from U.S. News & World Report.

Every year, the magazine, which also publishes highly influential rankings of colleges and universities, compiles a list of the top 50 hospitals in a wide range of specialties. The 2009 version covered 16 areas, including cancer, heart disease, respiratory diseases and urology. The 2010 rankings will come out this July.

The rankings tend to be dominated by institutions with big budgets, highly regarded doctors and stellar reputations — the medical equivalent of blockbusters. For example, hospitals receiving the top five slots for treatment of digestive disorders in 2009 were, in order, the Mayo Clinic in Rochester, Minn.; the Cleveland Clinic; Johns Hopkins Hospital in Baltimore; Massachusetts General Hospital in Boston; and the Ronald Reagan UCLA Medical Center. These five institutions were also the top five (in a different order) in the 2009 Honor Roll, a list of hospitals with high scores in at least six specialties.

Hospitals that win high rankings tend to trumpet their achievements in press releases, TV ads and big banners. Policymakers watch the rankings closely too. In the ongoing effort to improve healthcare, it only makes sense to know which hospitals can serve as shining examples to all those that don't make the grade.

The magazine clarifies that it isn't interested in how well hospitals handle relatively minor issues such as hernias or noninvasive breast cancers. Instead, it says it tries to identify hospitals that are best able to handle the most serious challenges, such as replacing a heart valve in an already frail patient or removing a tumor from the spine.

Although the exact formula for determining rankings isn't made public, the magazine says it takes three basic factors into account: a hospital's resources (such as how many MRI machines or nurses it has), patient outcomes (such as survival rates) and its reputation among physicians. Resources and outcomes can be measured directly. To gauge reputation, the magazine asks 600 randomly selected physicians to name the top five hospitals in their field for serious cases, assuming location and cost are no issue. Four specialties that generally don't deal with life-and-death issues (psychiatry, rheumatology, rehabilitation and ophthalmology) are ranked on reputation alone.
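Since the magazine does not disclose its actual formula, the weighting below is purely illustrative, but a composite score built from the three factors might look like this minimal sketch (the equal weights, the 0-100 scales and the two hospitals are all hypothetical):

```python
# Hypothetical sketch of a composite hospital score.
# U.S. News does not publish its formula; the weights and
# normalization here are invented for illustration only.

def composite_score(resources, outcomes, reputation,
                    weights=(1 / 3, 1 / 3, 1 / 3)):
    """Combine three component scores, each on a 0-100 scale.

    resources  -- e.g., staffing levels and technology measures
    outcomes   -- e.g., risk-adjusted survival rates
    reputation -- e.g., share of surveyed physicians naming the hospital
    """
    w_res, w_out, w_rep = weights
    return w_res * resources + w_out * outcomes + w_rep * reputation

# Two made-up hospitals with identical objective measures but
# very different name recognition among surveyed physicians:
famous = composite_score(resources=80, outcomes=80, reputation=60)
obscure = composite_score(resources=80, outcomes=80, reputation=5)
print(round(famous, 1), round(obscure, 1))
```

The sketch makes the critics' concern concrete: with any nonzero weight on reputation, two hospitals with identical measurable quality can land far apart in the final score.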

The claims: U.S. News & World Report says that its "annual rankings of the country's elite medical centers is a tool for patients who need medical sophistication that most facilities cannot offer." In a video posted on the magazine's website, health editor Dr. Bernadine Healy says the rankings "give a unique kind of information to people at a time when they desperately need it."

The bottom line: For all of the buzz and gravitas that surrounds the U.S. News & World Report hospital rankings, critics have long claimed that the rankings put too much emphasis on a hospital's reputation. The most common charge: Famous hospitals get high scores based on their name alone, while other high-quality but lesser-known hospitals have no chance to break into the club.

That criticism has now been put to a mathematical test. In an analysis published in the April 20 issue of the Annals of Internal Medicine, Dr. Ashwini Sehgal, an associate professor of bioethics, medicine and epidemiology and biostatistics at Case Western Reserve University in Cleveland, took a closer look at the numbers to see just how much reputation really mattered for rankings.

The findings surprised Sehgal. ("I nearly fell out of my seat," he says.) According to his analysis, reputation dominated the rankings, especially near the top of the list. The study found that if the top 20 hospitals in 12 categories had been ranked on reputation alone, the results would have been about 90% the same as the actual rankings. (The study didn't include the four specialties already ranked solely on reputation.)
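One simple way to quantify the kind of agreement Sehgal measured is to ask how much a top list ranked by reputation alone overlaps with the published list. The sketch below uses made-up hospital labels and a plain set-overlap measure, which is only an assumed stand-in for the study's actual statistics:

```python
# Illustrative agreement check between two top-N hospital lists.
# Hospital labels are placeholders; the real analysis compared
# U.S. News rankings against reputation-only rankings.

def overlap(list_a, list_b):
    """Fraction of hospitals appearing in both top-N lists."""
    return len(set(list_a) & set(list_b)) / len(list_a)

published = ["H1", "H2", "H3", "H4", "H5"]
by_reputation_alone = ["H1", "H2", "H3", "H4", "H6"]

print(f"{overlap(published, by_reputation_alone):.0%}")
```

An overlap near 100% would mean the extra data on outcomes and resources barely changes who makes the list, which is essentially what Sehgal reported for the top 20 hospitals.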

And when Sehgal compared reputation scores with the other factors used by the magazine — such as death rates, technological resources and nurse-to-patient ratios — he found that those reputations didn't necessarily reflect reality. "There was virtually no relationship between reputation and objective measures of quality," Sehgal says.