Vegetable
Oct 22, 2010

Some thoughts about Rotten Tomatoes and Metacritic

1. Review aggregators are clear about what they represent: a survey of critical reviews. Some folks get mad at them because they don't accurately measure film quality. But the sites aren't pretending to measure that. And come on, what is film quality? A hundred years of cinema and we can't agree on an answer; don't hate two websites for coming up with one.

2. The classic Tomatometer measures only the proportion of reviews that are "positive." But look right below it and the site provides an average rating too (i.e. the mean of every review score):

[screenshot: a Tomatometer score with the average rating listed beneath it]

I'll admit it's not obvious how they translate, say, a B+ onto a numeric scale, but I doubt it's that big a deal in practice.
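
To make the difference between the two numbers concrete, here's a rough sketch in Python. The grade-to-score mapping is my own guess (RT doesn't publish its exact conversion), and the reviews are made up:
code:
# Hypothetical grade-to-score conversion -- an assumption, not RT's real table
GRADE_TO_SCORE = {"A": 9.5, "A-": 9.0, "B+": 8.3, "B": 7.5, "B-": 6.7,
                  "C+": 5.8, "C": 5.0, "D": 2.5, "F": 0.5}  # out of 10

# Each review carries a numeric score plus a fresh/rotten call
reviews = [
    {"score": GRADE_TO_SCORE["B+"], "positive": True},
    {"score": GRADE_TO_SCORE["C"],  "positive": False},
    {"score": 7.0,                  "positive": True},
    {"score": 4.0,                  "positive": False},
]

# Tomatometer: just the share of reviews marked positive ("fresh")
tomatometer = 100 * sum(r["positive"] for r in reviews) / len(reviews)

# Average rating: the plain mean of every review's score
average_rating = sum(r["score"] for r in reviews) / len(reviews)

print(f"Tomatometer: {tomatometer:.0f}%")          # 50%
print(f"Average rating: {average_rating:.1f}/10")  # 6.1/10
The two numbers can tell different stories: a film full of mildly positive reviews posts a sky-high Tomatometer but only a middling average rating.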

3. Both sites provide avenues for clarifications. RT allows reviewers to write in and say that their 60% review was actually meant to be negative. MC allows reviewers to clarify that their B- actually translates as a 50% score. So it's at least safe to say the reviews being aggregated are represented correctly in the final formula.

4. RT and MC collect reviews in very different ways. RT includes far more publications and reviewers from all kinds of sources; the outlets counted for Suicide Squad's reviews include Beliefnet, One Guy's Opinion, Legion of Leia, Birth.Movies.Death, and The Blu Spot. They do document their approval criteria for critics and publications extensively, but suffice it to say the result includes a lot of random rear end sites. They even include video reviews.

MC prides itself on curating an exclusive list of publications. Their list of approved publications is far shorter and tends to represent the biggest names in movie reviews. They even take a quality-vs-quantity jab at RT on their FAQ page.

5. They also compute their "average rating" in different ways. RT's average rating, as shown in point 2, is a pretty traditional
code:
(sum of all review scores / number of reviews)
but MC actually weights the contribution of each review to the final score. So it's entirely possible that The New York Times always counts more than, say, The Village Voice. Some people get mad about this, but I don't think there's anything wrong with it. These aggregators are already acting as gatekeepers of who's a real critic by deciding who to include; deciding who counts more is just the logical extension of that.
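
If it helps, here's the same toy data run through both formulas. The weights are invented; MC doesn't disclose how much each publication actually counts:
code:
# Each entry: (outlet, score out of 100, hypothetical weight)
reviews = [
    ("The New York Times", 80, 1.5),
    ("The Village Voice",  60, 1.0),
    ("Some Smaller Site",  40, 0.5),
]

# RT-style plain average: every review counts the same
plain = sum(score for _, score, _ in reviews) / len(reviews)

# MC-style weighted average: bigger outlets pull the number harder
weighted = (sum(score * w for _, score, w in reviews)
            / sum(w for _, _, w in reviews))

print(f"plain average:    {plain:.1f}")     # 60.0
print(f"weighted average: {weighted:.1f}")  # 66.7
Same three reviews, two different headline numbers, purely because of who counts more.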

6. Both sites also aggregate audience reviews, which is more interesting than you'd think. Movies with a clear mismatch between marketing and actual content, like Spring Breakers, Hail, Caesar!, and The Witch, often receive audience scores far lower than the critic score. In general, it's just useful to see if audiences love a film as much as critics do -- it often says a lot about the film.

7. People also get mad at review aggregators, I think, because they try to put a number on art. It's popular to think that art is sacrosanct, something worthy of sacrifice and impossible to measure. They're fighting modernity on this one.
