The comprehensive end-of-year review: In summation

Roar Guru
29th November, 2018

This is the last of 19 articles looking at the meta-results for both teams and players, as collected from ELO-Following Football’s wide range of sources.

In summation, let’s look at the overall picture.

At the end of 2017

Richmond was the cock-of-the-walk and started with the highest ELO-rating (77.8), as well as the premiership cup in their hands. Adelaide had been minor premiers but were bested in the grand final.

Along with the resurgent Sydney Swans, who had come from 0-6 to win 14 games, those three teams led the ratings list and were the only clubs in the 70s.

Port Adelaide, Geelong and GWS were in the low 60s; Collingwood, Melbourne, and Hawthorn all had ratings just barely above 50. Just below the average mark sat West Coast, St Kilda, Essendon, and the Bulldogs; farther back were North Melbourne and Carlton, at 37.5 and 35.3, respectively.

The bottom three were expected to be Brisbane (29.6), Fremantle (25.9), and Gold Coast (17.6).

The expectations for the ladder


Averaged out over more than fifty prognosticators, they looked like this: Adelaide, GWS, Sydney, Richmond, Geelong, Port Adelaide, Melbourne, Essendon, Bulldogs, Hawthorn, St Kilda, West Coast, Collingwood (that’s right: we all thought the Grand Finalists would be 12th and 13th!), Fremantle, Brisbane, Carlton, North, and Gold Coast.

As of the week before the start of the season, CrownBet (now BetEasy) had very close to the same order expressed in their odds – you could have had either of the Grand Finalists at $51 on a one-dollar bet when the season started. On the other hand, the chances of Adelaide making finals were only a $1.14-to-one bet, and you’d have lost that dollar.

Coming into the season, the players who were considered to be in the top 15 or so in the league by the AFLPA or The Roar included Rory Sloane (ADE), Dayne Zorko (BL), Scott Pendlebury (COL), Gary Ablett Jr. (GEE), Joel Selwood (GEE), Tom Lynch (GC), Josh Kelly (GWS), Robbie Gray (PA), Alex Rance (RCH), Josh Kennedy (SYD), Josh Kennedy (WC), and Marcus Bontempelli (WB).

The top four were Dustin Martin (RCH), Patrick Dangerfield (GEE), Nat Fyfe (FR), and Lance Franklin (SYD) – usually in that approximate order.

By the end of the year, those four were all still top ten, but many of the others didn’t make our top 22 – often because of injuries, like Josh Kelly, or age, like Sydney’s Kennedy.

The “once-around” ladder – the hypothetical 17-game season – looked like this:


1st – Richmond, 13-4
2nd – West Coast, 12-5
3rd – Melbourne, 11-6
4th – Hawthorn, 11-6
5th – Collingwood, 11-6
6th – Sydney, 11-6
7th – Port Adelaide, 11-6
8th – GWS Giants, 10-6-1

9th – Geelong, 10-7
10th – Essendon, 10-7
11th – Adelaide, 10-7
12th – North Melbourne, 9-8
13th – Western Bulldogs, 6-11
14th – Fremantle, 6-11
15th – Gold Coast, 4-13
16th – St Kilda, 3-13-1
17th – Brisbane, 3-14
18th – Carlton, 1-16

This ladder takes only the first game played between any pair of teams – any return contests were ignored, including derbies.

For the hypothetical finals series, we used any second game that was played between the candidates (including September); otherwise, we had to reprise the only game the two teams played this year.
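For anyone who wants to reproduce the idea, here’s a minimal sketch of how a “first meeting only” ladder can be built from a season’s results. The 4-points-a-win, 2-a-draw convention is standard AFL ladder arithmetic, but the function and data shape are illustrative assumptions, not the actual ELO-Following Football code.

```python
from collections import defaultdict

# Each result: (round_number, home_team, away_team, home_score, away_score)
def once_around_ladder(results):
    """Build a ladder counting only the FIRST meeting between any pair of teams."""
    seen_pairs = set()
    record = defaultdict(lambda: {"w": 0, "l": 0, "d": 0})

    for rnd, home, away, home_pts, away_pts in sorted(results):  # chronological order
        pair = frozenset((home, away))
        if pair in seen_pairs:            # ignore return games, including derbies
            continue
        seen_pairs.add(pair)
        if home_pts > away_pts:
            record[home]["w"] += 1; record[away]["l"] += 1
        elif home_pts < away_pts:
            record[away]["w"] += 1; record[home]["l"] += 1
        else:
            record[home]["d"] += 1; record[away]["d"] += 1

    # Rank by premiership points (4 per win, 2 per draw); percentage tie-breaks omitted
    return sorted(record.items(),
                  key=lambda kv: 4 * kv[1]["w"] + 2 * kv[1]["d"],
                  reverse=True)
```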

Qualifying and Elimination finals
Richmond (#1) d. Hawthorn (#4) in Round 24, 95-64.
West Coast (#2) d. Melbourne (#3) in Round 26, 127-51.
Collingwood (#5) d. GWS (#8) in Round 25, 61-51.
Port Adelaide (#7) d. Sydney (#6) in Round 2, 94-71.

Semi-finals
Hawthorn d. Collingwood in Round 1, 101-67.
Port Adelaide d. Melbourne in Round 14, 75-65.

Preliminary finals
Port Adelaide d. Richmond in Round 12, 72-58.
West Coast d. Hawthorn in Round 10, 75-60.

Grand final
West Coast d. Port Adelaide in Round 22, 62-58.

The Eagles would’ve still won the flag. (AAP Image/David Mariuz)

So, West Coast would have won anyway? Hmm. All right, we might as well play the other five games then!

I wouldn’t have wanted to live a life without Round 20 this year!

It’s been four years since we had a repeat premier – actually a ‘three-peat’ premier – the Hawthorn Hawks of 2013-15.

Since then, we’ve had six different teams in the last three grand finals.

It’s been 12 years since two things last happened prior to this season: the last West Coast premiership, and the last Grand Final decided by less than a goal. (Not counting the drawn first game in 2010.)

According to our patented “ELO-Following Football” rating system, the final list of ratings from the 2018 season isn’t in the same order as the final ladder – it tends to reflect how the last couple of months of the year went.


1st – Geelong (76.9)
2nd – West Coast (74.2)
3rd – Melbourne (72.2)
4th – Collingwood (68.9)
5th – Richmond (68.6)
6th – Greater Western Sydney (66.3)
7th – Essendon (63.2)
8th – Adelaide (60.5)

9th – Hawthorn (59.3)
10th – Sydney (53.3)
11th – North Melbourne (50.6)
12th – Port Adelaide (49.0)

Gap – there was a huge break between the top 12 and the bottom six all year. As a joke, I almost wrote an extra year-in-review for the gap, but realised I didn’t have nearly enough material to make it work.

13th – Western Bulldogs (39.7)
14th – Brisbane Lions (39.5)
15th – St Kilda (30.7)
16th – Fremantle (20.1)

There’s actually a larger gap here, between sixteen competitive AFL teams and two which, frankly, should have been relegated, à la the English football leagues.

17th – Gold Coast (6.9)
18th – Carlton (0.1)
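The article doesn’t spell out the update rule behind those numbers, so purely as a point of reference, here is a generic Elo-style update on a 0-to-100 scale – a sketch of the general technique, not the patented ELO-Following Football formula; the K-factor and scale constant below are assumptions.

```python
def elo_expected(rating_a, rating_b, scale=25.0):
    """Expected result for team A under a standard logistic Elo curve."""
    return 1.0 / (1.0 + 10 ** ((rating_b - rating_a) / scale))

def elo_update(rating_a, rating_b, result_a, k=3.0):
    """result_a: 1.0 for a win, 0.5 for a draw, 0.0 for a loss (team A's view)."""
    delta = k * (result_a - elo_expected(rating_a, rating_b))
    return rating_a + delta, rating_b - delta

# A 70-rated side beating a 40-rated side gains very little;
# an upset loss moves both ratings much further.
print(elo_update(70.0, 40.0, 1.0))
print(elo_update(70.0, 40.0, 0.0))
```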

Compared to the other ratings systems?

Here’s an interesting chart – a very simple comparison averaging the current placements from the major ratings systems I’ve been using (The Arc, Wooden Finger, and Footymaths Institute, or FMI), plus several good ones new to our usage (Squiggle, PlusSixOne, Massey and GraftRatings), our own ELO-FF ratings, the actual 2018 ladder, and the current BetEasy odds regarding the most and least likely placements for each team. A minimal sketch of the averaging follows the chart.


1st – Richmond (2.0)
2nd – West Coast (2.6)
3rd – Melbourne (3.0)
4th – Collingwood (4.2)
5th – Geelong (5.2)
6th – GWS Giants (6.0)
7th – Essendon (7.1)
8th – Hawthorn (7.7)

9th – Adelaide (9.1)
10th – Sydney (9.3)
11th – Port Adelaide (10.9)
12th – North Melbourne (11.4)
13th – Western Bulldogs (13.4)
14th – Brisbane (13.9)
15th – Fremantle (15.1)
16th – St Kilda (15.1)
17th – Gold Coast (17.4)
18th – Carlton (17.6)
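As promised, here is a minimal sketch of that averaging, assuming every system’s output has first been reduced to an ordinal placement from 1 to 18 (the systems and numbers below are dummy values, not the real inputs):

```python
# placements[system][team] = ordinal rank (1-18) according to that system
placements = {
    "ELO-FF":  {"Richmond": 5, "West Coast": 2, "Melbourne": 3},
    "Ladder":  {"Richmond": 1, "West Coast": 2, "Melbourne": 5},
    "BetEasy": {"Richmond": 1, "West Coast": 3, "Melbourne": 2},
}

def average_placement(placements):
    """Average each team's rank across all systems, then sort best-first."""
    teams = set().union(*(ranks.keys() for ranks in placements.values()))
    averages = {
        team: sum(ranks[team] for ranks in placements.values()) / len(placements)
        for team in teams
    }
    return sorted(averages.items(), key=lambda kv: kv[1])

for team, avg in average_placement(placements):
    print(f"{team}: {avg:.1f}")
```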

One last thing to remember: going back at least to the mid-1990s, there has never been a season in which fewer than two teams dropped out of the finals (and, obviously, a similar number moved into the finals the following year).

Paul Seedsman. (Photo by Adam Trafford/AFL Media/Getty Images)

I’m not sure what the surest way to make a prediction is, but the surest way to be wrong is to predict the same eight teams in finals as there were this year!

Across the spectrum: game-by-game expectations

Final record: 198 home-and-away games played.
Betting line: The opening line picked 139 correct winners, 58 incorrect, and there was one draw.
ELO-Following Football forecasts: Our system went 137-60-1 during the season, and picked seven winners during the finals series.
AFL.com.au game predictions: 129-68-1.
The Roar predictions: Aggregate predictions totalled 132 correct and 64 incorrect, with two games in which the panel was evenly split or the game was a draw.
“Pick-A-Winner” predictions: The two punters totalled 262-135, discounting the draw.
Forecasters: Aggregate predictions totalled 134 correct, 63 incorrect, and 1 draw.
BetEasy “CrowdBet” percentages: Actually the best of the lot – counting the majority as the bet, the crowd went 140-47-1.
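To put those tallies on a common footing, here’s a quick back-of-the-envelope hit-rate calculation using the numbers above, with draws simply left out of the denominator (that treatment is my assumption, not necessarily how each source scores itself):

```python
# (correct, incorrect) pairs from the tallies above; draws excluded
tipsters = {
    "Opening betting line": (139, 58),
    "ELO-Following Football": (137, 60),
    "AFL.com.au": (129, 68),
    "The Roar aggregate": (132, 64),
    "BetEasy CrowdBet": (140, 47),
}

for name, (right, wrong) in tipsters.items():
    print(f"{name}: {100 * right / (right + wrong):.1f} per cent")
```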

Generally, football games (and most forms of the game: Aussie rules, American rules, even rugby) go to the favourite about two thirds of the time.


Even the best predictors can’t beat about 70 per cent in the long term in footy; against the spread, even 55 per cent tends to be treated as the holy grail of long-term targets.

Translation: don’t give up your day job, tipsters.

By the way, Following Football runs rating and forecast models for American and Canadian football as well – curiously, those sports run closer to form, with the best prognosticators pushing past 75 per cent over the course of a season. Immodestly, I’ll brag that Following Football is running above 76 per cent as of this writing.

Conversely, in the sport we Americans call “soccer” – actual football – the propensity of draws means even getting half your picks right is commendable.

If you think you have a mathematical system to predict the winners, test it by retroactively applying it to the 198 games of the season just concluded.

If you do worse than 127 correct, throw it out immediately, because you could have gotten that many right by simply using my uncle’s tried-and-true method of picking the team with the better record and, if they have the same record, picking the home team.
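That baseline is simple enough to write down – a sketch of the “uncle’s method”, with the teams’ entering win totals as an assumed input:

```python
def uncles_pick(home, away, wins):
    """Tip the team with the better record; if the records are equal, tip the home team."""
    if wins.get(away, 0) > wins.get(home, 0):
        return away
    return home  # better record, or equal records -> home side

# Two 10-win sides meet: the home team gets the tip.
print(uncles_pick("Collingwood", "Richmond", {"Collingwood": 10, "Richmond": 10}))
```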

Meta-player of the year results and the ELO-Following Football AFL first team


Over the course of the season, we gathered game-by-game naming of the outstanding players of the game or the week from 14 different sources. These range from “Team of the Week” listings to more Brownlow-like top player of the game scenarios.

We then tally up all the scores across the season and come up with not only a ranking of all 659 players who saw the field in an AFL game during 2018, but also a 22-man team that we would put up against any combination anyone else could come up with!
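Mechanically, the tally is just one big sum per player across all 14 sources. Here’s a sketch; the point values and source names are placeholders, not the actual ELO-FF weightings:

```python
from collections import Counter

# mentions: (player, source, points_awarded), gathered round by round from the 14 sources
def meta_player_tally(mentions):
    """Sum every source's points for every player across the whole season."""
    totals = Counter()
    for player, source, points in mentions:
        totals[player] += points
    return totals.most_common()  # ranked list, highest total first

sample = [
    ("Max Gawn", "Team of the Week", 6),
    ("Max Gawn", "Best-on-ground votes", 10),
    ("Tom Mitchell", "Team of the Week", 6),
]
print(meta_player_tally(sample))
```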

Yes, I cheated on positions to fit as many high-point total players as possible. Don’t talk to me about that until you can explain naming Buddy Franklin as the All-Australian captain.

1. Max Gawn, Melbourne (518 points) – First team ruckman
2. Tom Mitchell, Hawthorn (510 points) – First team midfielder
3. Dustin Martin, Richmond (478 points) – First team midfielder
4. Brodie Grundy, Collingwood (476 points) – First team forward/alternate ruckman
5. Patrick Cripps, Carlton (475 points) – First team midfielder
6. Lance Franklin, Sydney (438 points) – First team forward
7. Clayton Oliver, Melbourne (434 points) – First team midfielder
8. Patrick Dangerfield, Geelong (417 points) – First team small forward
9. Jack Macrae, Western Bulldogs (390 points) – First team midfielder
10. Nat Fyfe, Fremantle (371 points) – First team, interchange midfielder
11. Dayne Beams, Brisbane (339 points) – First team, interchange midfielder
12. Rory Laird, Adelaide (329 points) – First team defender
13. Shaun Higgins, North Melbourne (326 points)
14. Lachie Whitfield, GWS (323 points) – First team defender
15. Jack Riewoldt, Richmond (320 points) – First team forward
16. Elliot Yeo, West Coast (317 points) – First team defender
17. Steele Sidebottom, Collingwood (315 points)
18. Andrew Gaff, West Coast (312 points)
19. Tom Hawkins, Geelong (309 points) – First team forward

Cats player Tom Hawkins. (AAP Image/Dave Hunt)

20. Stephen Coniglio, GWS (306 points)
21. Gary Ablett, Jr., Geelong (297 points)
22. Luke Breust, Hawthorn (295 points) – First team small forward
23. Joel Selwood, Geelong (293 points)
24. Scott Pendlebury, Collingwood (290 points)
25. Robbie Gray, Port Adelaide (284 points)
26. Lachie Neale, Fremantle (277 points)
27. Callan Ward, GWS (274 points)
28. Marcus Bontempelli, Western Bulldogs (273 points)
29. Dyson Heppell, Essendon (261 points)
30. Jordan De Goey, Collingwood (258 points) – First team, interchange forward
31. Ben Cunnington, North Melbourne (258 points)
32. Jake Lloyd, Sydney (258 points)
33. Ben Brown, North Melbourne (252 points) – First team, interchange forward
34. Alex Rance, Richmond (251 points) – First team defender
37. Jeremy McGovern, West Coast (243 points) – First team defender
48. Shannon Hurn, West Coast (221 points) – First team defender

Forecast for 2019


Summarising our predictions for the season to come, here in October and November, is a futile task – but as we’ve seen, it’s no easier the day before the season starts!

So, with malice towards none and best wishes towards all, here is the official ELO-Following Football Forecast, four months in advance of the 2019 season (undoubtedly to be revised in March!).

1st – Richmond Tigers
2nd – Collingwood Magpies
3rd – Hawthorn Hawks
4th – West Coast Eagles
5th – Melbourne Demons
6th – Sydney Swans
7th – Brisbane Lions
8th – Adelaide Crows

9th – Essendon Bombers
10th – North Melbourne Kangaroos
11th – GWS Giants
12th – Fremantle Dockers
13th – Geelong Cats
14th – Western Bulldogs
15th – Port Adelaide Power
16th – Carlton Blues
17th – St Kilda Saints
18th – Gold Coast Suns

For comparison, here’s the most likely finishing order the oddsmakers at BetEasy are implying with their current numbers.

Richmond is definitively first. West Coast, Collingwood, and Melbourne are in a tight pack for second. Essendon, Adelaide, and GWS are tightly bunched in fifth, sixth, and seventh.


Hawthorn and Sydney are listed together in the pivotal slots eight and nine. Geelong, Port Adelaide, and North Melbourne are just outside the finalists, in that order, for 10th through 12th.

The last six in order are the Western Bulldogs, Brisbane, Fremantle, St Kilda, Carlton, and Gold Coast.

For 2019

We intend to expand our analysis to include comparisons from more than just a handful of other systems – there are literally dozens of competing rating systems out there on the internet, some more complex than others, but remarkably the best of them are all butting up against that mythical two-thirds barrier to accuracy that our sport imposes.

Squiggle ran a competition this year among a host of the most successful prediction models, and out of the 207 games played this year, the most accurate predictors were the combinations of various methods: crowd sources, aggregates of professional oddsmakers, and so forth.

Even then, the maximum correct matched The Roar’s crowd survey total of 147 for the season, right at the 70 per cent mark.

Counting the finals, ELO-Following Football hit on 144 correct.
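The simplest kind of combination tipster is a straight majority vote across the individual models. Here is a sketch, with made-up model names, of how such an aggregate pick might be formed for a single game:

```python
from collections import Counter

def majority_pick(picks):
    """picks: {model_name: tipped_team}. Return the most commonly tipped team."""
    team, _ = Counter(picks.values()).most_common(1)[0]
    return team

game_picks = {
    "ELO-FF": "West Coast",
    "Squiggle": "West Coast",
    "The Arc": "Collingwood",
    "BetEasy line": "West Coast",
}
print(majority_pick(game_picks))  # -> West Coast
```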


We’ll also continue to refine our methodology for both the team ratings and the meta-player of the year standings. One of the great things about having the AFLW out of season from the men’s league is the chance to test concepts in one competition before applying them to the other; because of the timing, that’s usually meant the women’s league gets the benefit of the first application before the men’s league sees it.

But with such a short season, plus two expansion teams (and four more next year!), accuracy in the AFLW ratings has been a bigger problem than it has been in the older, larger league.

Still, the Bulldogs, Demons, Lions and Magpies will start with the four highest ratings in the AFLW this February.

Thanks for sharing your comments all season long, readers! It means the world to me, halfway ‘round the world!
