The Roar

A math geek's guide to the true AFL ladder

Jordan Lewis has a chance to get his first win against the Hawks (Photo by Michael Willson/AFL Media/Getty Images)
Roar Guru
24th July, 2018

We all have opinions about footy. That’s why we follow sports; that’s why sports exist. The only real reason we allow so much money to be poured into sports in our society is that we enjoy wrestling with it, mentally and emotionally.

Why is my team better than your team? Who’s the best player? Why is player X better, more important, or improving faster than player Y? Which team is most likely to win the flag? Make the finals? Finish last? Improve the most?

How can I statistically justify the love for my chosen team, player, coach (or mascot)?

It gives us pleasure to cheer our team on. But it also gives us pleasure to be right about our beliefs!

For some of us, trying to quantify that process gives us even more pleasure – call us math geeks, we won’t complain.

So, for you math geeks and spreadsheet fanatics out there, we’re going to quantify the most important question we can ask about our sport of preference – where does your team rank among the 18 AFL clubs right now?

What are their chances for the rest of the season?

I’ve been writing about this topic using my patented ELO-Following Football rating system in this publication for two years now. The system’s been in place for several more years than that, long enough to work most of the kinks out.

But it’s not the only such rating system out there. Today we’re going to combine four of the most mathematically justifiable systems available: besides mine, we’ll use The Arc, FMI, and The Wooden Finger.

There are other sites I like which use pure statistical analyses to evaluate teams, such as HPN or A Matter of Stats. However, these four each provide a single number to describe a team, which makes today’s exercise much easier.

So, here are the ratings in each system for each of the 18 teams in the AFL as of Round 18. Fair warning: those of you who think math is evil, skip on down below the two spreadsheets and let us math geeks have our fun.

Club ELO The Arc Wooden Finger FMI
Adelaide 51.7 1531 -0.9 1168
Brisbane 52.0 1431 -5.1 976
Carlton 7.9 1276 -40.7 755
Collingwood 64.6 1581 15.1 1217
Essendon 54.2 1539 3.5 1100
Fremantle 32.8 1398 -17.1 927
Geelong 63.4 1592 15.8 1274
Gold Coast 18.1 1326 -34.0 728
GWS Giants 63.9 1561 12.5 1210
Hawthorn 53.8 1523 8.7 1162
Melbourne 68.0 1590 18.9 1252
North Melbourne 45.1 1487 0.9 1095
Port Adelaide 53.8 1535 5.0 1194
Richmond 80.6 1671 29.3 1385
St Kilda 35.9 1415 -11.5 943
Sydney 56.1 1554 6.9 1292
West Coast 65.0 1605 15.6 1273
Western Bulldogs 27.6 1385 -22.9 918

Comparing these four disparate systems is difficult as is. But there’s something each system has in common: a centre point around which each set of numbers revolves.

For the ELO-FF, the eighteen ratings always add up to 900 and the average rating is always 50, because whatever is added to one team’s rating is subtracted from their opponent’s (that’s what makes it an ELO system).

For The Arc, Matt Cowgill uses 1500 as his centre point, and when Gold Coast gained 41 points with its upset victory Saturday, Sydney lost those same 41 points on their Arc score.
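
Both centre points are maintained by the same mechanism, which can be sketched in a few lines. This uses the 41-point swing described above and, purely as illustrative starting values, the Arc ratings from the first table (the `transfer` helper name is mine):

```python
# Zero-sum rating update: whatever one team gains, its opponent loses,
# so the league-wide total (and hence the centre point) never moves.
def transfer(ratings, winner, loser, points):
    ratings[winner] += points
    ratings[loser] -= points

# Illustrative Arc-style ratings (taken from the first table above)
arc = {"Gold Coast": 1326, "Sydney": 1554}
before = sum(arc.values())

transfer(arc, "Gold Coast", "Sydney", 41)

print(arc)                          # Gold Coast up to 1367, Sydney down to 1513
print(sum(arc.values()) == before)  # True – the total is unchanged
```

Any rating system built on this kind of transfer keeps its average pinned in place no matter how many games are played.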

The other two systems also balance the total ratings of the eighteen teams, but over a longer period of time – generally the last 22 games.

So the overall total of the teams’ ratings averages out about the same throughout the year: zero for the Wooden Finger, and about 1100 for FMI.

While Gold Coast went up 7.5 points last week on the Finger’s rating system, Sydney’s went down slightly more than that (7.6 points). That’s because of idiosyncrasies in the ups and downs across each team’s previous 22 games. But that tenth of a point will balance out over the course of the round and the season.

So, if we re-examine each rating system with some basic statistical analysis, we can take those centre points, work out each system’s standard deviation, and compare those.

A quick explanation for those non-math geeks who snuck into the conversation: standard deviation measures how spread out a set of numbers is around its average, so expressing a rating in standard deviations tells us how dramatically it deviates from the mean.

In general, with a large enough sample of numbers, about 68% of them should fall within one standard deviation of the average, and just over 95% within two standard deviations either side.

In our ELO-FF system, that deviation is about 18.8 points, so we’re describing all the ratings between 31.2 (50–18.8) and 68.8 (50+18.8).

Right now that’s 14 of the 18 teams falling in that range. Earlier in the season, it was as low as 12 clubs. 68% of 18 is just over 12 teams, so we’re in the right ballpark.
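
Those figures can be checked straight from the ELO-FF column of the first table. A minimal sketch – I’m assuming the sample standard deviation was used, which lands at 18.7 rather than the quoted 18.8, presumably a rounding or convention difference:

```python
from statistics import mean, stdev

# ELO-FF ratings for the 18 clubs as of Round 18 (first table above)
elo = [51.7, 52.0, 7.9, 64.6, 54.2, 32.8, 63.4, 18.1, 63.9,
       53.8, 68.0, 45.1, 53.8, 80.6, 35.9, 56.1, 65.0, 27.6]

m, s = mean(elo), stdev(elo)
print(round(m, 1), round(s, 1))   # 49.7 18.7 – close to the quoted 50 and 18.8

# Count the teams within one standard deviation either side of the mean
inside = sum(1 for r in elo if m - s <= r <= m + s)
print(inside)                     # 14, matching the count above
```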

On the chart below, I’ve translated all those numbers from the first list into standard deviations, so we can compare them across the four rating systems. Zero means the team is exactly on the average rating. Positive numbers indicate ratings above the mean and negative indicate teams below the mean.

Club ELO The Arc Wooden Finger FMI Average Ranking
A cut above the rest
Richmond 1.672 1.589 1.518 1.491 1.568 1
Legitimate competition
Melbourne 0.984 0.836 0.979 0.787 0.896 2
West Coast 0.820 0.976 0.808 0.898 0.875 3
Geelong 0.732 0.855 0.819 0.903 0.827 4
Collingwood 0.798 0.753 0.782 0.601 0.734 5
Semi-legitimate competition
GWS Giants 0.760 0.567 0.648 0.564 0.635 6
Sydney 0.333 0.502 0.358 0.998 0.548 7
Splashing about at sea level
Port Adelaide 0.208 0.325 0.259 0.479 0.318 8
Hawthorn 0.208 0.214 0.451 0.310 0.296 9
Essendon 0.230 0.362 0.181 -0.019 0.189 10
Adelaide 0.093 0.288 -0.047 0.342 0.169 11
North Melbourne -0.268 -0.121 0.047 -0.045 -0.097 12
Below average, but respectably so
Brisbane 0.109 -0.641 -0.264 -0.675 -0.368 13
St Kilda -0.770 -0.790 -0.596 -0.850 -0.752 14
Fremantle -0.940 -0.948 -0.886 -0.935 -0.927 15
Western Bulldogs -1.224 -1.069 -1.187 -0.983 -1.115 16
Witness protection program
Gold Coast -1.743 -1.617 -1.762 -1.989 -1.778 17
Carlton -2.301 -2.082 -2.109 -1.846 -2.084 18
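
The translation used here is a standard z-score: subtract the column’s mean and divide by its standard deviation. A sketch for the ELO-FF column – the other three columns work identically, and the four z-scores are then averaged per club. My figures differ from the table’s in the second decimal place, which I put down to rounding or a slightly different deviation convention:

```python
from statistics import mean, stdev

# Round 18 ELO-FF ratings from the first table
elo = {
    "Richmond": 80.6, "Melbourne": 68.0, "West Coast": 65.0,
    "Collingwood": 64.6, "GWS Giants": 63.9, "Geelong": 63.4,
    "Sydney": 56.1, "Essendon": 54.2, "Hawthorn": 53.8,
    "Port Adelaide": 53.8, "Brisbane": 52.0, "Adelaide": 51.7,
    "North Melbourne": 45.1, "St Kilda": 35.9, "Fremantle": 32.8,
    "Western Bulldogs": 27.6, "Gold Coast": 18.1, "Carlton": 7.9,
}

m, s = mean(elo.values()), stdev(elo.values())
z = {club: (rating - m) / s for club, rating in elo.items()}

for club in sorted(z, key=z.get, reverse=True):
    print(f"{club:17s} {z[club]:+.3f}")   # Richmond comes out around +1.65
```

Because every z-score is a rating minus the shared mean, the eighteen z-scores in each column sum to zero by construction.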

I also took the liberty of shuffling the order to demonstrate how the teams have stratified this season.

While the numbers slide around a bit from round to round (Brisbane is slowly moving from group four up to group three, while the Giants made a significant jump over the last two or three weeks as well), we can see some definite separations worthy of note.

Richmond is in a class of their own right now. Their lead stands out like a sore thumb when you see it like this – no other team is a full deviation above the average anywhere, but the Tigers are about one and a half deviations up in every system.

Right now, no matter how you measure it, Richmond is a couple of cuts above the rest.

Remember, though, the only thing our four systems measure is current performance. If Dustin Martin and Alex Rance get injured, everything could change in a heartbeat.

Also, these systems only compare apples to apples; they say nothing about how 2018 Richmond would fare against, say, the 2000 Bombers, the 2009 Cats, the 2013 Hawks or the 1929 Magpies.

The next group, the competitors with a real chance to overtake the leaders, is composed of at least three and perhaps as many as six other teams.

The Demons, Eagles, and Cats are all demonstrably above average in every system, and to be close to a full deviation above average is to stand out against all (ordinary) competition.

Collingwood is part of that group – although the FMI isn’t so sure, and they have legitimate reasons, statistically.

Sydney was unquestionably in that group until they chose to take three quarters off on Saturday, whereas the Giants have been steadily moving up from the bottom of the lower group back to where we expected them to be all along.

Then we can see this collection of five teams struggling to stay at or above average. Competitively, they’re still trying to make finals, even if they know they have fatal flaws that by all rights should doom them to a quick out.

Thank you, 2016 Bulldogs, for being the patron saint of this group, giving them a reason for hope from now on.

All five were above the mean last week; the Kangaroos dropped just below after their demolition by Collingwood.

Every one of them is worthy of playing in finals as each one is a demonstrably good team, at least average among the 18 best in the country. But compared to the seven teams ahead of them? Yeah, they might win a game or three, but that’s not the way we’d tip.

Then there’s the group Brisbane’s working hard to advance out of: the teams that the top twelve figure they should have routine wins over each time out.

Notice something important: the deviations for these teams (the Lions aside) are larger than those of even the top non-Richmond teams, meaning the Dockers and Bulldogs are ‘more bad’ than teams like Geelong and West Coast are ‘good’.

Freo almost certainly won’t be around in September. (Photo by Adam Trafford/AFL Media/Getty Images)

The gap between these six teams and the top twelve is so large that the weak literally balance twice as many good teams on the other side.

Detour number two for non-math geeks; think of these deviation numbers as a way to balance the eighteen teams on a seesaw. The bigger the number, ignoring the negative sign, the ‘heavier’ the team’s rating is.

Essentially, there are six fat teams on the negative side balancing Richmond and eleven much lighter teams on the positive side.
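
That balance can be checked straight from the average column of the second table – the signed deviations of the bottom six almost exactly cancel those of the twelve clubs above them:

```python
# Average z-scores from the second table, ladder order top to bottom
avg = [1.568, 0.896, 0.875, 0.827, 0.734, 0.635, 0.548, 0.318, 0.296,
       0.189, 0.169, -0.097, -0.368, -0.752, -0.927, -1.115, -1.778, -2.084]

print(round(sum(avg), 3))       # -0.066: the whole seesaw sits near zero
print(round(sum(avg[:12]), 3))  # +6.958: the twelve clubs above the bottom six
print(round(sum(avg[12:]), 3))  # -7.024: six 'fat' teams counterweigh them all
```

The small residue away from zero comes from averaging four systems whose rounds of updates don’t land on identical dates, plus rounding to three decimals.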

Then there are two really, really fat teams at the bottom of the ladder, one dramatic win notwithstanding.

Before that Round 17 victory, Gold Coast were also two full deviations below the average rating, as Carlton still is.

One look at Gold Coast’s and Carlton’s numbers on this chart tells you why there’s such consternation about their viability moving forward.

What those two teams need, and what they’re going through, deserves an entire column of its own.

Suffice it to say that for two teams to sit two full deviations below even the average team means they wouldn’t be expected to win against anyone except each other. (That said, I’ll take Gold Coast by less than the twenty-point margin this weekend.)

The chances of them winning against any other opponent are always going to be one-in-ten or worse. Over their previous five games, each of these two teams had just one occasion when the oddsmakers and public tipsters gave them better than that ten per cent chance of winning.

That’s not healthy for the players who have dedicated themselves to achieving the level of personal excellence necessary to play AFL-quality football, nor is it healthy for the teams these bereft clubs play, nor is it beneficial for the barrackers of either squad.

Strangely, it’s also unhealthy for the opposition fans, for whom it’s a lose-lose proposition: win, and you were supposed to win; lose, and you’re Sydney this week.

A league is healthiest when every team can go into the season dreaming of greatness.

The AFL is in pretty good shape in 2018. Twelve of its eighteen teams can still realistically say they’ve got a legitimate shot at finals and maybe a bit more – thirteen if you’re reading this on the west coast.

Sixteen of its teams can project out past 2018 and see success on the horizon if their vision’s good enough. (You know, if they have 2020 vision).

And as for Carlton and Gold Coast? Well, they can always hope for help from above.

Specifically, Gil McLachlan’s office.
