Winners and losers revisited - how did the Monte Carlo model go at predicting the Rugby World Cup?

Roar Rookie
1st November, 2023

A few months ago, I wrote an article that shocked some on The Roar when I predicted the All Blacks were the favourites to win the 2023 Rugby World Cup in France.

This prediction came from a “Monte Carlo” simulation model I had developed for the tournament (Monte Carlo simulations are a powerful uncertainty-modelling technique that estimates the chance of any particular outcome occurring within a complex system by playing that system out many times with random inputs).
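For readers who haven’t met the technique before, here is a minimal sketch of the idea in Python. The rating gap and margin spread are illustrative numbers, not values from my actual model: simulate thousands of random match margins around an assumed gap and count how often each side finishes in front.

```python
import random

def simulate_match(rating_gap, n_sims=100_000, sigma=13.0):
    """Estimate one team's win probability by Monte Carlo simulation.

    rating_gap -- assumed points advantage of team A over team B
    sigma      -- assumed spread of real-world match margins
    Both values are illustrative, not taken from the tournament model.
    """
    wins = 0
    for _ in range(n_sims):
        # Draw one random match margin around the expected gap.
        margin = random.gauss(rating_gap, sigma)
        if margin > 0:
            wins += 1
    return wins / n_sims

# A team rated 4 points stronger wins roughly 62% of simulated games.
print(f"P(team A wins) ≈ {simulate_match(4.0):.2f}")
```

The more simulations you run, the closer the estimate settles on the probability implied by your assumptions – which is what makes the technique handy for a 48-game tournament.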

The model used historical outcomes from head-to-head encounters between the teams, their recent form as expressed in the relative world rankings, and the impact of the draw on the outcome of the tournament.

After I published the article, I added a tournament factor to account for the performance of some teams during the knock-out phases. This had a positive impact on teams like the Springboks, All Blacks and England, and a negative impact on teams like Ireland, Fiji, Japan and Tonga.
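I won’t reproduce the model’s actual adjustment here, but one simple way such a tournament factor could work – sketched below with invented factor values – is to scale each team’s base win probability by a knockout multiplier and renormalise so the two probabilities still sum to one.

```python
# A minimal sketch of a knockout-stage adjustment. The factor values
# below are invented for illustration; they are not the model's numbers.
KNOCKOUT_FACTOR = {
    "South Africa": 1.10,  # historically strong in sudden-death games
    "New Zealand": 1.08,
    "England": 1.05,
    "Ireland": 0.92,       # has historically struggled in quarter-finals
    "Fiji": 0.95,
}

def adjusted_probability(p_a, team_a, team_b):
    """Scale team A's base win probability by both teams' knockout
    factors, then renormalise so the two outcomes still sum to one."""
    w_a = p_a * KNOCKOUT_FACTOR.get(team_a, 1.0)
    w_b = (1.0 - p_a) * KNOCKOUT_FACTOR.get(team_b, 1.0)
    return w_a / (w_a + w_b)

# A 50/50 game tilts towards the team with the better knockout record.
print(adjusted_probability(0.50, "South Africa", "Ireland"))  # ≈ 0.54
```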


Aaron Smith celebrates with his All Blacks teammates. (Photo by Cameron Spencer/Getty Images)

So how accurate was the model?

Across the tournament, the model correctly predicted the winner in 41 out of 48 games – an accuracy of more than 85 per cent. The model was especially good at predicting the likely winner during the pool stages, getting only four of 40 games wrong. These were the Wallabies’ losses to Wales and Fiji, Portugal’s upset win over Fiji (all in Pool C), and Japan’s win over Samoa in Pool D.


The model correctly predicted the winner and the runner-up for Pools A, B and D; however, it predicted that the Wallabies would perform much better, winning Pool C with Wales second. The model didn’t foresee the Wallabies’ shocking pool-stage results. Seven of the eight quarter-finalists were correctly predicted, six of them in the correct order. I think that was a fantastic outcome.

The real test came in the quarter-finals, as most of these games were judged by rugby experts to be 50/50. This was also the stage when the knock-out factor came into play. The model predicted:
– A 70% chance that Argentina would knock out Wales in QF1
– A 55% chance that Ireland would knock out New Zealand in QF2
– A 68% chance that England would knock out Fiji in QF3
– A 54% chance that France would knock out South Africa in QF4

The model managed to correctly predict only 50 per cent of these games – a pleasant disappointment for All Blacks and Bok supporters. For the semi-finals, the model estimated the probabilities as:
– A 90% chance that New Zealand would beat Argentina in SF1
– A 78% chance that South Africa would beat England in SF2


Wallabies Angus Bell and Robert Leota celebrate a try. (Photo by Chris Hyde/Getty Images)

The model predicted both semi-final winners correctly. For the bronze medal, it gave England an 80% chance of beating Argentina, and for the final the likelihood was slightly in the All Blacks’ favour (51%). Bok supporters will be delighted that the model got that prediction wrong too.
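To show how per-match probabilities like these compound across a bracket, the sketch below plays out the semi-finals and final many times. The semi-final and final figures match the percentages quoted above, and the England–Argentina number reuses the bronze-medal figure; the other two pairings are pure assumptions, added only so that every bracket path can be resolved.

```python
import random

# Pairwise win probabilities for the last four. Three figures come from
# the percentages quoted above; two are assumptions for illustration.
P_WIN = {
    ("New Zealand", "Argentina"): 0.90,
    ("South Africa", "England"): 0.78,
    ("New Zealand", "South Africa"): 0.51,
    ("England", "Argentina"): 0.80,
    ("New Zealand", "England"): 0.70,     # assumption
    ("South Africa", "Argentina"): 0.85,  # assumption
}

def winner(team_a, team_b):
    """Resolve one simulated match from the pairwise probability table."""
    p = P_WIN.get((team_a, team_b))
    if p is None:
        p = 1.0 - P_WIN[(team_b, team_a)]
    return team_a if random.random() < p else team_b

def title_chances(n_sims=100_000):
    """Play out the semi-finals and final many times and tally champions."""
    counts = {}
    for _ in range(n_sims):
        finalist1 = winner("New Zealand", "Argentina")
        finalist2 = winner("South Africa", "England")
        champion = winner(finalist1, finalist2)
        counts[champion] = counts.get(champion, 0) + 1
    return {team: n / n_sims for team, n in counts.items()}

print(title_chances())
# ≈ New Zealand 0.50, South Africa 0.41, England 0.08, Argentina 0.02
```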

The biggest surprise was Portugal’s win over Fiji, which the model gave less than a 1-in-50 chance, followed by Australia’s loss to Fiji, which was considered only 8% likely. The Boks also defied expectations by beating both France and New Zealand (both games were rated almost 50/50).


Before the tournament, the model considered the All Blacks the most likely winners. After the application of the tournament factor, the probability of an All Blacks win was estimated at more than 30%, with the Boks at 20%, followed by Ireland (18%) and France (15%). The model also correctly predicted that England would come third (more than 35% likely), with Australia and Argentina most likely to finish fourth (33% and 25% respectively).

What do you think – was that a good, reasonable or bad performance?
