
Mixed reviews for Rotten Tomatoes and other aggregate websites

They have considerable clout, but there is no consensus on how reviews are best measured.

June 13, 2009|John Horn

Disney's newspaper advertisement for "Up" featured the kind of blurbs typically associated with a critical hit, with Roger Ebert, Leonard Maltin, and the Wall Street Journal's Joe Morgenstern among the reviewers quoted. But a closer look at Saturday's full-page ad revealed a more unusual endorsement: a 98% "fresh" rating from the website Rotten Tomatoes.

The studios are always searching for new ways to sell movie tickets, and they are now looking to review aggregators such as Rotten Tomatoes, Metacritic and newcomer Movie Review Intelligence to generate box-office buzz by amplifying the sound of the critical chorus. As the sites grow more prominent, however, they are attracting questions about their methodologies, and about who exactly qualifies as a film critic in the Internet age.

For The Record
Los Angeles Times, Wednesday, June 17, 2009, Home Edition, Main News, Part A, Page 4: Correction
Movie review websites: An article in Saturday's Calendar about websites that aggregate movie reviews misstated the percentage and type of positive reviews needed to earn a "certified fresh" rating from Rotten Tomatoes. The story said that the rating is given if a movie gets favorable notices from 60% of the critics it surveys. In fact, for a film to be certified fresh it must be reviewed by 40 or more reviewers, with at least five considered top critics by the website, and 75% or more of the reviews must be positive.

In a way, the review aggregators are to movies what TripAdvisor is to hotels and the Zagat guides are to restaurants -- one-stop sites for consensus opinion. Whereas TripAdvisor and Zagat base their marks on consumer ratings, the movie aggregators generally use professional critics, although Rotten Tomatoes includes a number of citizen-reviewers who write on obscure websites, like Georgia's self-proclaimed "entertainment man" Jackie K. Cooper.

Movie marketers say they like the sites because the sites can boost ticket sales.

"Are they the driver? No. Can they help drive business? Yes," says Mike Vollman, MGM's marketing chief. "People want to know what the consensus is. I am a huge believer that in today's culture, people don't pay as much attention to individual voices as to the aggregate score."

Adds Peter Adee, the marketing head at Overture Films: "I definitely think that if you're fresh, it helps sell the movie. Reviews matter for an older audience, and certainly matter for an older female audience."

Rotten Tomatoes (a reference to what moviegoers once hurled at screens showing bad movies) is by far the most popular of the three aggregator sites, attracting about 1.8 million unique visitors monthly, according to comScore. As soon as the site deems a movie "certified fresh" -- meaning it has been reviewed by at least 40 critics, including five the site counts as top critics, with 75% or more of the reviews positive -- studio executives call Rotten Tomatoes, asking for the small trophies the website dispenses to commemorate the accomplishment. Rotten Tomatoes also is considering adding its mark to DVD packages for movies scoring well.

But as rivals Metacritic and Movie Review Intelligence point out, Rotten Tomatoes can give its coveted "fresh" rating to films that few (and hypothetically none) of its counted reviewers really love. And though all three sites present numerical averages in their ratings, the calculations involve subjective scoring by the aggregators themselves, not just the critics.

Rotten Tomatoes' scores are based on the ratio of favorable to unfavorable reviews. If a film lands 10 positive reviews and 10 negative reviews, in other words, it's 50% fresh, and if the ratio is 15 good to five bad, it's 75%. But if all 20 of those critics give that same film the equivalent of a B-minus letter grade, it's 100% fresh, because all of the reviews were positive, even if only barely so.
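The ratio arithmetic described above can be sketched in a few lines of Python. This is an illustration only, not Rotten Tomatoes' actual code: the function name, the 0-100 review scale, and the cutoff of 60 for counting a notice as "positive" are all assumptions made for the example.

```python
def freshness(reviews):
    """Share of reviews that count as positive, on a 0-100 scale.

    reviews: list of numeric review scores, 0-100.
    A review counts as 'fresh' if it clears an assumed positive
    threshold of 60; how positive it is beyond that is ignored.
    """
    fresh = sum(1 for score in reviews if score >= 60)
    return round(100 * fresh / len(reviews))

# 10 positive and 10 negative reviews: 50% fresh
print(freshness([80] * 10 + [30] * 10))  # 50
# 15 good to 5 bad: 75% fresh
print(freshness([80] * 15 + [30] * 5))   # 75
# Twenty lukewarm B-minus-grade reviews (say 65/100 each): 100% fresh,
# because every review clears the bar, even if only barely
print(freshness([65] * 20))              # 100
```

The key point the example makes concrete: the score measures how many reviews are positive, not how positive they are.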

"Our goal is the extension of thumbs-up and thumbs-down," says Shannon Ludovissy, Rotten Tomatoes' general manager. "It's not a measure of the degree of quality."

Metacritic and Movie Review Intelligence try to come up with an average reflecting how much critics actually like a movie, rather than a ratio of raves to pans. If a movie on those two sites gets a 50% score, it means the consensus of all of the reviews they read was 50% positive -- the average review, put another way, was two out of four stars.
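The averaging approach can be sketched the same way. Again, the function name and the 0-100 scale are assumptions for illustration, not either site's actual method.

```python
def average_score(reviews):
    """Mean of the review scores, on a 0-100 scale.

    reviews: list of numeric review scores, 0-100. Unlike a
    ratio-based score, a lukewarm review pulls the result down.
    """
    return round(sum(reviews) / len(reviews))

# Twenty two-star-out-of-four reviews (50/100 each) average to 50,
# capturing the tepid consensus a pure positive/negative ratio misses
print(average_score([50] * 20))           # 50
# Ten raves and ten pans also average to 50
print(average_score([100] * 10 + [0] * 10))  # 50
```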

Like Rotten Tomatoes, Metacritic and Movie Review Intelligence assign every review they read a numerical score, a sometimes tricky endeavor because many leading critics (including those who write for the Los Angeles Times) don't award letter grades or stars as part of their reviews.

And that's where the subjectivity comes in.

David Gross, a former market researcher and 20th Century Fox studio executive who launched Movie Review Intelligence a month ago, says he and his staff read (or watch and listen to) reviews from about 65 top U.S. and Canadian outlets, including media companies as varied as Newsweek and National Public Radio, excluding the little-known Internet critics Rotten Tomatoes includes.

Gross says about three-quarters of the appraisals his site tracks carry letter grades or star ratings, and that two analysts from his company score the notices that don't with letter grades. "It's the clearest way in the human mind to differentiate" a review's enthusiasm, Gross says. "And if the analysts differ in their grades, we have a discussion about it."
