I’ve been working on a project to rank NFL teams based on non-subjective performance. Using a combination of Pythagorean Expectation, Log5 odds, and a spreadsheet, I believe I have done so. To test it out, I ran the completed 2010 season. While some teams settled at a certain percentage relatively early and didn’t veer too far from it, others went through very wide swings as the season progressed and more data was added, with Tennessee being the most extreme. Regardless, I thought I would share the data for anyone curious. I am quite happy that 9 of the top 10 teams made the playoffs, and that the second- and third-ranked teams both made it to the Super Bowl. It also showed that Carolina was a truly awful team last season, and that Seattle had to be the worst team ever to make the playoffs.

1 NE 80.9%
2 GB 78.0%
3 PIT 75.0%
4 ATL 70.5%
5 SD 68.8%
6 BAL 66.4%
7 NYJ 62.1%
8 NO 62.1%
9 PHI 60.8%
10 CHI 58.4%
11 IND 57.9%
12 NYG 55.0%
13 OAK 54.5%
14 TEN 53.6%
15 KC 53.3%
16 DET 51.7%
17 TB 51.0%
18 HOU 46.3%
19 DAL 44.9%
20 MIA 42.6%
21 CIN 42.5%
22 CLE 41.1%
23 MIN 40.4%
24 SF 38.6%
25 WAS 38.0%
26 JAX 37.6%
27 STL 36.0%
28 DEN 31.2%
29 SEA 30.1%
30 BUF 30.0%
31 ARI 22.0%
32 CAR 14.4%

Notes:

  1. The math ignores wins except indirectly; it is based on points instead. This is a better way to judge team strength, since points carry more information.
  2. The numbers are adjusted for Strength of Schedule.
  3. The spreadsheet covered the regular season only and weighted all games equally, regardless of injuries, location, and week.
  4. One way to interpret the percentage is as a team’s hypothetical winning percentage if they played a very long schedule (10,000+ games) against all opponents equally.
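
For anyone curious about the building blocks named above, here is a rough sketch of the two formulas. This is my own reconstruction, not the poster’s spreadsheet: the exponent 2.37 is the commonly published NFL Pythagorean value, and the Strength of Schedule adjustment is not shown.

```python
# Sketch of the two formulas mentioned in the post. The exponent (2.37) is
# an assumption (the commonly used NFL value); the actual spreadsheet may
# differ, especially in how Strength of Schedule is folded in.

def pythagorean(points_for: float, points_against: float, exp: float = 2.37) -> float:
    """Expected winning percentage from points scored and allowed."""
    pf, pa = points_for ** exp, points_against ** exp
    return pf / (pf + pa)

def log5(p_a: float, p_b: float) -> float:
    """Probability that a team with true strength p_a beats one with strength p_b."""
    return (p_a - p_a * p_b) / (p_a + p_b - 2 * p_a * p_b)

# Example: a 60% team facing a 40% team wins about 69% of the time.
print(round(log5(0.60, 0.40), 3))  # prints 0.692
```

Feeding each team’s Pythagorean percentage through Log5 against every opponent is one way to get the schedule-adjusted numbers the table shows.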

Also, there’s an unsubstantiated rumor exploding on the Internets right now that Peyton needs to go back for more neck work.

I’m not sure why you didn’t post this in the active NFL thread. Would generate more discussion.

At any rate, the fact that Seattle did make the playoffs while two teams with notably better rankings in their division did not suggests the system still has its flaws. It may work as a general predictor (of something, though applying it forward as opposed to using it as an analysis tool is a whole different thing), but I’d like to see it applied to more seasons to gauge how good it is, rather than using just the 2010 season to show what it can do.

I thought about it, but the ranking is ultimately about the 2010 season. Plus, this way whatever discussion does arise will not have to compete with the latest ESPN drama of the hour.

Well, if I wanted to figure out who would make the playoffs, I’d use a system where if a team scored more points than their opponent I’d give them a 1, and if not they would get a zero. Then come up with a long list of ways to break ties.

Here’s the thing: better teams do not always win, and in the NFL, and football in general with its highly variable scoring, this is even more true. I am fine with that as a way to determine playoff spots and championships; it adds excitement, drama, magic, and hope to the game. The Chargers were essentially an 11-win team, but they only won 9. Losing close games will do that to a team. Meanwhile, Seattle should have won only 5 games but overperformed and won 7. Part of this was playing in a weak division; the rest was getting the lucky breaks and holding the tiebreaker. While Seattle did win the first playoff round, they did so barely and with home-field advantage. They went no further.
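
As a rough illustration of the “essentially an 11-win team” and “should have won only 5” claims, expected wins can be computed from season point totals. The point totals below are my own approximate recollections of the 2010 figures, and the exponent is the standard published value, so treat the output as illustrative rather than authoritative:

```python
# Illustrative only: the point totals below are approximate 2010 season
# figures quoted from memory, and 2.37 is the standard NFL Pythagorean
# exponent, not necessarily what the poster's spreadsheet uses.

def expected_wins(points_for, points_against, games=16, exp=2.37):
    pf, pa = points_for ** exp, points_against ** exp
    return games * pf / (pf + pa)

print(round(expected_wins(441, 322), 1))  # San Diego: ~10.9 expected wins vs. 9 actual
print(round(expected_wins(310, 407), 1))  # Seattle:   ~5.5 expected wins vs. 7 actual
```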

I’ll admit the rankings are still flawed, but to put them in perspective: the baseball rule of thumb is that it takes 50 games to have a large enough sample size to make judgments about the quality of teams. The NFL has only 16 games total. With this in mind, there is only so much number crunching can do. Still, I think this ranking serves several useful purposes.

  1. Teams arrive at a more accurate percentage using points (and more) much quicker than they would using win percentage alone.
  2. It replaces “common knowledge” and reputation as the basis for judging how good a team is. For example, the Colts and New England are constantly viewed as great clubs because of their quarterbacks. This puts that reputation to the test.
  3. It lets everyone compare their team with others more rigorously than win percentages allow. I remember last year when New England lost to the Browns there was a general sense of disbelief, but anyone paying attention would have known the Browns had lost a lot of close games and were much better (though still only average) than their record and reputation suggested.

Hey, as a bonus, I averaged each division’s TS ranking and multiplied it by a number so that a score of “100” represents an average division. The two North divisions came out on top; the combinations of Pittsburgh and Baltimore, and Green Bay and Chicago, alongside the not-that-bad Cleveland, Cincinnati, Minnesota, and Detroit, make it so. The real shocker, however, is the NFC West. I’m sure you guys realized it was a bad division, but this really highlights just how bad. The gap between the NFC West and the second-worst division is twice the gap between the second-worst division and the best.

AFC East 108
AFC North 113
AFC South 98
AFC West 104
NFC East 99
NFC North 114
NFC South 99
NFC West 63
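
The post doesn’t spell out the scaling constant, but a reconstruction that appears to reproduce the table is: average the four team percentages from the ranking above, then scale so a .500 average maps to 100. This is a guess at the method, not the poster’s actual spreadsheet:

```python
# Assumed reconstruction of the division index: average the four team
# percentages, then multiply by 200 so a .500 division scores 100.
# This matches the table above (before rounding to whole numbers).

def division_index(team_pcts):
    return sum(team_pcts) / len(team_pcts) * 200

afc_north = [0.750, 0.664, 0.425, 0.411]  # PIT, BAL, CIN, CLE from the ranking
nfc_west = [0.386, 0.360, 0.301, 0.220]   # SF, STL, SEA, ARI from the ranking
print(division_index(afc_north))  # ~112.5, shown as 113 in the table
print(division_index(nfc_west))   # ~63.4, shown as 63
```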

The Seahawks were the first team in history to make the playoffs with a losing record so, uh… not that big a surprise there.

And again, all this analysis is of limited value when you only look at one season. The NFC West has been considered pretty bad for years, and yet two teams from that division have made the Super Bowl in recent years, neither of them getting “blown out.” So I’m not sure what is hoped to be learned by looking only at last year.

Besides, shouldn’t the real question be why the NFC South has almost the second-worst rating when it produced three teams with 10+ wins last year?

Dang it, I was almost finished writing a reply, then minimized to check on something and accidentally canceled out.

Because playoffs are ultimately a crap shoot? Because the NFL is actually a pretty balanced league where even the bad teams will routinely play respectably against the good teams and even beat them once in a while? Because I want an objective way to measure how well teams performed that looks past the crude measurement of wins? Because past success is no indicator of future performance? A lot of teams go through drastic changes season to season, and my method is all about ignoring talking-head knowledge in favor of a closer look at performance.

This one is easy: the Carolina Panthers were the worst team in football last year by a sizable margin. They went 0-6 in the division and inflated everyone else’s win totals. I won’t say all the teams in the NFC South are bad; clearly Atlanta was very good, New Orleans was good, and Tampa Bay was better than average. Still, Carolina was so bad that it dragged down the average. C’est la math.

Out of curiosity:

  1. Have you checked out Football Outsiders at all? Seems like you’re barking up a similar tree.
  2. Have you tested your system with previous seasons?