Bad Reviews Really Could Hurt A Film At The Box Office

By James O'Malley

If there’s one thing better than sitting down and watching a good film… it might just be watching a bad one. Look, I know that it is better when films are good - but there is a certain pleasure to be derived from the bad, too. Why? Because hating on films is fun. Who doesn’t like revelling in the absurdity of Geostorm with their friends? Who doesn’t enjoy picking apart Batman v Superman to try to work out exactly why it is such a spectacular failure?

And then there are the critics. Content begets content. We know that when Disney churn out another interminable Pirates of the Caribbean, at least we can look forward to an epic “Kermodian Rant” from Mark Kermode. As Justice League reared its ugly head, we knew at least there would be a thoughtful dissection from MovieBob and a furious rant from Robbie Collin.

This makes critics theoretically powerful people: We all listen to them, after all. But are they powerful enough to make or break a film? Could a thumbs up or a thumbs down from Mark Kermode be enough to decide whether something Hollywood has spent millions of dollars on turns into a lucrative moneymaker or a disastrous failure?

We wanted to find out - so we crunched the data. Today, we can reveal that, according to our study… yes, it appears critics make a difference. Just… not a very big one.

Read on to find out more.

How We Did It

Last week, we dived into film criticism using a big dataset we downloaded from Metacritic, a site which aggregates film reviews, normalises scores from critics and gives each film a rating. Using that data, we previously told you which films are most controversial amongst film critics, and which films have the biggest gap between critical opinion and the opinions of normal viewers.

In order to dig into how successful films were, we hooked our dataset up with Box Office Mojo, which tracks how much money films make at cinemas (yes, it took ages). It means we’re analysing US data rather than UK data - but this is no bad thing, because the US is the biggest film market and its available data is much richer.

We took the dataset and narrowed it down to films that opened on 2,000 or more screens - which is pretty much the minimum for a wide release in the US. We also limited it to films that have 30 or more reviews on Metacritic. Both of these decisions were made to screen out “noise”. The fewer reviews a film has, the more one mad critic can skew its average; we reasoned that at least 30 reviews means the average score is broadly reflective of critical opinion as a whole. Similarly, we limited it to 2,000 or more screens because smaller releases could conceivably be skewed more dramatically by other factors. We used Box Office Mojo’s figures adjusted for inflation to 2017 prices.
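For anyone who fancies replicating the approach, the filtering step looks roughly like the sketch below. It assumes the Metacritic and Box Office Mojo data have already been merged into one table, and the column names (opening_screens, review_count and so on) are purely illustrative placeholders rather than the actual dataset’s.

import pandas as pd

# Hypothetical merged dataset of Metacritic scores and Box Office Mojo takings.
# All column names here are illustrative placeholders.
films = pd.read_csv("films_metacritic_boxofficemojo.csv")

# Keep only wide releases (2,000+ opening screens) with a meaningful number
# of reviews (30+), to screen out noise from small samples.
wide_releases = films[
    (films["opening_screens"] >= 2000) &
    (films["review_count"] >= 30)
].copy()

# Normalise opening takings by screen count, so enormous and modest releases
# can be compared fairly. Grosses are assumed to already be in 2017 dollars.
wide_releases["opening_per_screen"] = (
    wide_releases["opening_gross_adj2017"] / wide_releases["opening_screens"]
)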

What The Numbers Say

So here are the results of all of that data mashing - a dataset of just over 1,000 films. Along the bottom of the chart below we have the average critic score, and on the left is the average taking per screen on the film’s opening weekend (this means we can fairly compare, say, The Force Awakens, which was projected onto basically every flat surface on Earth, to smaller releases).

As you can see from the trendline, on average, the higher a film scores with critics, the slightly higher its earnings at the box office. The correlation coefficient - basically a score for how closely linked the two figures are - is 0.293, which means there is some sort of relationship, albeit a relatively weak one (and for the stats nerds: this means the r-squared, a measure of how closely the points conform to the trendline, is 0.086).
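If you want to reproduce that kind of number yourself, the calculation is only a couple of lines with scipy. This sketch carries on from the hypothetical column names above, with mean_critic_score standing in for the per-film average review score.

from scipy import stats

# Pearson correlation between average critic score and per-screen opening take.
# Column names are illustrative, not the real dataset's.
r, p_value = stats.pearsonr(
    wide_releases["mean_critic_score"],
    wide_releases["opening_per_screen"],
)

print(f"correlation coefficient r = {r:.3f}")  # roughly 0.29 on our data
print(f"r-squared = {r ** 2:.3f}")             # roughly 0.09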

Perhaps Batman v Superman and The Dark Knight are the most interesting films in this scatter plot: Christopher Nolan's masterpiece opened slightly higher (even when adjusted for inflation), and the distance between the two appears to be more or less in keeping with the trendline. What this also shows is that a powerful brand or franchise - like Batman - will have a baked-in audience regardless of critical reaction. But perhaps it also shows that a positive critical reaction can add a little icing on the cake for the studios, or that a critical panning will send the more casual viewers towards something else instead.

The trend also persists when you compare lifetime box office takings - that is, not just the opening weekend, but box office income for the entire theatrical run. Here the correlation coefficient is 0.382 (and again, stats nerds: r-squared is 0.146 - and according to this test, the relationship is statistically significant - hurrah!).
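The significance check falls out of the same pearsonr call, which returns a p-value alongside the correlation. Again, the column names are hypothetical stand-ins for whatever the merged dataset actually uses.

# The same test against lifetime takings rather than the opening weekend.
r_lifetime, p_lifetime = stats.pearsonr(
    wide_releases["mean_critic_score"],
    wide_releases["lifetime_gross_adj2017"],
)

# With around 1,000 films, a correlation of roughly 0.38 produces a tiny p-value,
# i.e. it is very unlikely to be a fluke of this particular sample.
print(f"r = {r_lifetime:.3f}, r-squared = {r_lifetime ** 2:.3f}, p = {p_lifetime:.2e}")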

Though there is clearly a relationship between critical scores and box office takings, it could be that there isn’t a causal relationship. For example, what if it is the audiences themselves liking films and recommending them through word of mouth that is causing a bigger box office take for some films? As we discovered last week, sometimes audiences and critics think very differently.

However - interestingly - if you compare the scores of the great unwashed - normal Metacritic users - to box office takings, there’s actually a weaker correlation: only 0.176 with opening weekends, and 0.259 with lifetime figures. This suggests that the critics are better predictors of box office performance than the general public.

A Nerdy Diversion

I’m not the first person to have attempted a comparison like this. Earlier this year Yves Bergquist published a detailed, more wonkish blogpost digging into the relationship between Rotten Tomatoes scores and film success - and, annoyingly, I found it after I’d spent absolutely ages downloading and processing all of the data I used. In Bergquist’s view, the data doesn’t suggest any relationship between that site’s “Tomatometer” rating and box office takings - he measures a correlation of just 0.12 on lifetime takings, with an r-squared of 0.009.

So what’s different? I wonder if our differing approaches can explain the gap. Bergquist’s data used Rotten Tomatoes, not Metacritic. On RT, the Tomatometer score for a film is simply the proportion of reviews scoring above 60% versus those below, which makes it relatively easy to skew: as far as RT’s score is concerned, scores of 59 and 1 are worth the same in the Tomatometer calculation, and scores of 61 and 100 are the same too.

For my data, I calculated the mean scores for each film based on all of the reviews (which is different from Metacritic’s headline Metascores, because of the way the site weights that score).
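To see why the two approaches can disagree, here is a toy illustration (the review scores are invented) of how a Tomatometer-style proportion and a plain mean treat the same set of reviews very differently.

# Two made-up sets of review scores out of 100, for illustration only.
grudging_praise = [61, 62, 63, 61, 65]   # critics just about liked it
rave_reviews    = [95, 98, 91, 97, 99]   # critics adored it

def tomatometer_style(scores, threshold=60):
    """Proportion of reviews scoring above the 'positive' threshold."""
    positive = sum(1 for s in scores if s > threshold)
    return 100 * positive / len(scores)

def mean_score(scores):
    return sum(scores) / len(scores)

# Both films get a perfect 100% on the proportion-based measure...
print(tomatometer_style(grudging_praise), tomatometer_style(rave_reviews))  # 100.0 100.0

# ...but the mean preserves the difference in enthusiasm.
print(mean_score(grudging_praise), mean_score(rave_reviews))  # 62.4 96.0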

Perhaps this suggests that - as per Bergquist - Rotten Tomatoes’ Tomatometer in particular cannot easily be linked to box office takings, but that (as per my data) the judgement of the film reviewing profession as a whole can be.

Critics Do Hit The Box Office

So can the critics make or break a film? Though these correlation numbers aren’t completely definitive, what’s clear is that there is a relationship between the two.

There are dozens of reasons why people may choose, or choose not, to go to the cinema - and these decisions are influenced by marketing, casting, the reputation of the director and so on - even the weather.

Heck, when Avengers: Age of Ultron failed to match the opening weekend heights of the first Avengers film, there was plenty of speculation that this was because it coincidentally opened on the same weekend as the Mayweather vs Pacquiao boxing match billed as the “fight of the century”. So completely unpredictable events can also impact a film’s bottom line.

Given these countless factors, it is inevitable that the data will always be messy. But given that our numbers point to a small relationship between the critics and the box office, it does appear that critics matter… at least a little bit.

Huge thanks to my scientist pal Stephen Jorgenson-Murray for his statistical know-how and my film critic pal Chris Blohm for suggesting this would be fun to dig into.

James O’Malley is Interim Editor of Gizmodo UK and tweets as @Psythor.