Too many games had Metacritic scores between 82 and 88 this year.

This is maybe just me, and I know the 7-9 review scale has been a problem for a long time now, but this is the first year I remember noticing this particular issue: almost every game I was excited about this year, and there were many, scored in the low to mid-80s. There are a few exceptions (Disco Elysium, for example, or the Dragon Quest port to the Switch), but it’s crazy to me how many times I would eagerly hit OpenCritic after a review embargo and see a game I was sure would score well sitting at an 82 or an 86. In years past I remember a lot more 90+ average scores; this year, among games I was specifically looking forward to, it was basically just Disco Elysium.

Luigi’s Mansion - 86
The Outer Worlds - 85
Link’s Awakening - 87
Astral Chain - 87
No Man’s Sky - Beyond - 86
Age of Wonders: Planetfall - 81
Fire Emblem: Three Houses - 89
Bloodstained - 83
Cadence of Hyrule - 86
Steamworld Quest - 82
Anno 1800 - 81
Metro Exodus - 83
Etrian Odyssey Nexus - 82
Wargroove - 83
Sunless Skies - 85
Slay the Spire - 89

Some of those at the 89 mark (Slay the Spire, Fire Emblem) I know are considered “great” reviews, and I agree, but I list them for additional emphasis because I also remember several times seeing a game I wanted to score well receive an 8, and then, reading the text, almost no actual issues were brought up to justify docking two full points.

Has anyone else noticed this being especially bad this year?

Either a lot of reviewers are just defaulting to an 8 for their scores this year, or maybe we had a lot of “good but not amazing” games come out in 2019. Or maybe just the stuff I was specifically excited about turned out to be “good,” and that’s as far as reviewers were willing to go with these titles.

Obvious answer is the correct one: scores don’t mean anything, look to the content of the review for your review needs.

And aggregating these scores, that’s even worse. When half (not a scientific poll) of reviewers think scores work like an algebra exam and anything under 70 is bad, and the other half think 50 is an average score, how are you going to get a meaningful number from throwing them in a blender?
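To make the blender point concrete, here’s a toy sketch (all numbers made up, and the per-reviewer “working range” is my own hypothetical framing, not how Metacritic actually works): two reviewers give similar raw scores, but one treats scores like an exam where 70 is barely passing while the other uses the whole 0-100 scale. A naive average blends incompatible scales; rescaling each reviewer’s score onto their own working range tells a different story.

```python
def rescale(score, lo, hi):
    """Map a reviewer's personal working range [lo, hi] onto 0-100."""
    return (score - lo) / (hi - lo) * 100

# Reviewer A scores like an algebra exam: anything under 70 is "bad",
# so their real working range is roughly 60-100.
# Reviewer B treats 50 as average and uses the full scale.
a_score, a_range = 80, (60, 100)
b_score, b_range = 75, (0, 100)

naive_average = (a_score + b_score) / 2  # blends two incompatible scales
adjusted = (rescale(a_score, *a_range) + rescale(b_score, *b_range)) / 2

print(naive_average)  # 77.5
print(adjusted)       # 62.5 — A's 80 was only mid-range for A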

Or, you know, just don’t get hung up on Metacritic averages.

I don’t know if I’d say I’ve gotten hung up on them, but I think it’s strange how many I’ve noticed this year in this realm, which is basically the middle of the 7-9 scale.

EDIT: To clarify, it’s not the score I’m hung up on - it’s the idea that most reviewers scored most games as “pretty good, not great” so to speak.

When you see how many weirdos pop up around here anytime one of Tom’s or Nick’s review scores falls outside this perceived “golden mean,” I’m surprised every game’s Metacritic score isn’t between 80 and 90.

I also noticed that on Metacritic a bunch of super amateurish reviews are featured as recommended. They’ve probably widened their pool considerably, and when I see any half-competent Switch game get an 80 for having handed a free code to an enthusiast freelancer, then, as DiveCube said, scores don’t mean anything.
The saddest part is all the cool lower-profile games getting a 70 because the reviewer didn’t bother really playing them. Those are the ones ruining the carefully established scale!

Maybe it’s the maturity of the reviewers and/or some genres showing? Someone who’s been doing this for a few years and has had to review a couple different games within the same genre will imo be (subconsciously) more critical of the new offerings, especially if the genre is heavily saturated.

I think we’re also at a point where an average game is far more polished and has more even design/mechanics than in the past, which could also be a contributing factor.

It’s like how I’m just estimating every story/feature between 20-30 hours at work this year.

I hardly recognize any of the “critics” I see posting scores on Metacritic anymore. Remember when there were just a handful of gaming websites you’d rely on for reviews? Now everyone and their brother is a game reviewer.

Hmm, I wonder if this is because the distribution of game scores is not a nice bell-curve, at which point taking the average is going to be super misleading.

But yeah, more likely, reviewers are just playing it safe by giving an average score, maybe plus or minus a little bit. I mean, it’s low risk and takes no effort to hand out an “average” score.

Ultimately though, it means that the score is useless.
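The skewed-distribution point above is easy to illustrate: when most critics cluster in the low 80s but a couple of outlier pans drag the tail left, the mean drifts while the median stays near the consensus. A toy sketch with made-up scores:

```python
from statistics import mean, median

# Hypothetical critic scores for one game: a tight cluster in the
# low-to-mid 80s, plus two outlier pans (all numbers invented).
scores = [84, 83, 85, 82, 86, 84, 83, 45, 55]

print(round(mean(scores), 1))  # 76.3 — dragged down by the two outliers
print(median(scores))          # 83 — closer to what most critics thought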

I would argue that a rating of 10 should require something really above and beyond, something in ‘classic’ or ‘near-classic’ territory, such that ‘only’ getting an 8 isn’t really something that needs to be explained or justified when a game is ‘merely’ very good.

But in the end, yeah, ratings are kind of pointless. They are just friendly for aggregation and short attention spans.

Let’s look at how many PS4 games hit 90+ in various years:

2019: 4 games at 90+; though it’s 1 remaster, 1 expansion, and 1 GOTY edition…
2018: 7 games; 1 remaster
2017: 2 games
2016: 4 games; 1 expansion
2015: 7 games; 1 remaster, 2 expansions

That’s the most popular console. How about the PC?

2019: 4 games at 90+
2018: 1 game
2017: 3 games
2016: 5 games
2015: 4 games

So I don’t think I buy your theory. It looks like a fairly normal year.

Makes me think of Steam, where when I sort my 800+ game wish list by review rating, the games are randomly ordered within each review cluster (e.g., a game at 93% “Overwhelmingly Positive” may be listed before one at 100% “Overwhelmingly Positive” for no rhyme or reason). The perfectionist in me is annoyed to no end by this, yet upon reflection I’ve found it helps, since I end up considering more games than I otherwise might.

I look at Metacritic kind of the same way; if there’s something outside of the normal range, it will stand out for better or worse. Otherwise, something in the 80’s is just a checkmark after which I move on and reflect on the qualities of the actual game.

Maybe what’s going on is more games I actually care about this year than in the past are scoring 8’s? That’s certainly possible. It was just a feeling I wanted to share, I didn’t figure everyone would feel the same way. :)

I assume metacritic doesn’t have exportable/queryable APIs? Otherwise we could graph change over time by release date.

Could it just be that the reviewers are simply jaded?

Metacritic doesn’t publish who among its reviewers gets more “weight” in the aggregate score than others.

OpenCritic may have such an API. Might want to check there.

Otherwise, you may have to crawl Wikidata.
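However you get the data out, the graphing part is the easy bit. A minimal sketch, assuming you’ve already exported rows of `release_date,score` to CSV (the sample data below is made up; this just groups by year and averages, which you could then feed to a plotting library):

```python
import csv
import io
from collections import defaultdict
from statistics import mean

# Stand-in for a real exported file of "release_date,score" rows.
sample = io.StringIO(
    "release_date,score\n"
    "2018-03-01,91\n"
    "2018-09-15,78\n"
    "2019-02-10,85\n"
    "2019-06-20,83\n"
    "2019-11-05,86\n"
)

by_year = defaultdict(list)
for row in csv.DictReader(sample):
    year = row["release_date"][:4]  # ISO dates: first 4 chars are the year
    by_year[year].append(int(row["score"]))

for year in sorted(by_year):
    print(year, round(mean(by_year[year]), 1))
# 2018 84.5
# 2019 84.7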

82 to 88 is the new 7 to 9.

-Tom