In professional team sport, a preoccupation with the outcome of competition – who won and who lost – is common, above and beyond how a side actually performed. While wins and losses are undeniably the end game for clubs and fans, outcome alone is not always a valid measure of a team.
Uneven competition conditions exist in almost every professional sporting league around the world. For example, extremely inequitable salaries exist between clubs in European football.
Consider the Super Rugby teams from Australia, New Zealand and South Africa (and now Japan and Argentina). Each faces a vastly different amount of travel over the course of a regular season. Clearly, not all clubs are created equal, so how do we judge their successes accurately?
The ability of teams to overcome and win in spite of these challenges is rightly lauded. By quantifying the influence of these conditions on how a team competes, more informed match predictions can be provided. This allows for the subsequent outcome to be separated from the performance of the team itself, by comparing directly with the pre-game expectation.
Redefining performance as the magnitude by which a team surpasses expectation is attractive on multiple levels. The notion of performance is embedded into the contracts of many athletes and professional staff working in sport. It may also equip coaches with the ability to assess their team selections and strategies more precisely.
As a consequence of geographical distance, we found that the Western Force, based in Perth, Western Australia, were at a disadvantage relative to the rest of the competition.
Essentially, in order to achieve the same outcomes as another side, the Force needed to perform at a higher level relative to expectation, purely because of factors relating to their fixture.
The Australian Football League (AFL) is also characterised by a fixture whereby teams compete against each other an unequal number of times throughout the course of a season.
Teams face varying volumes of travel and contrasting numbers of days’ break between matches. Applying the same approach as above, we developed two models of match difficulty for the 2015 AFL season.
The models work by predicting a margin and outcome for a given match, using fixed factors available in the lead up to the game. Examples include the rank of the opponent from the previous year and the location of the match.
But instead of being used to return a financial dividend, the team’s margin differential relative to the predicted expectation is taken as a representation of its performance.
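The idea of performance as the gap between result and expectation can be sketched in a few lines. This is a minimal illustration only – the function name and the example margins are hypothetical, not the authors’ actual model, which produces its predictions from fixture factors:

```python
# Hypothetical sketch: performance vs expectation from predicted margins.
# A positive value means the team beat the model's expectation on average.

def performance_vs_expectation(actual_margins, predicted_margins):
    """Average margin differential relative to the model's predictions."""
    diffs = [actual - predicted
             for actual, predicted in zip(actual_margins, predicted_margins)]
    return sum(diffs) / len(diffs)

# Example: a team predicted to lose by 6 that instead wins by 4
# has outperformed expectation by 10 points in that match.
print(performance_vs_expectation([4, -12, 20], [-6, -10, 25]))  # 1.0
```

In the article’s figures this quantity is expressed as a percentage rather than raw points, but the principle – outcome minus expectation, averaged over the season – is the same.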
The second model is capable of being iterated on a weekly basis. It also takes into account the conditions subject to change regularly during the season. These include the number of team changes, the current difference in ladder position between the two sides and the current form of each team.
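The weekly update amounts to rebuilding the dynamic inputs before each round. As a rough sketch only – the field names, data layout and values below are all invented for illustration, not drawn from the actual model:

```python
# Hypothetical sketch of the dynamic factors refreshed each round:
# team changes, ladder-position gap, and recent form.

def weekly_features(team, opponent, round_info):
    """Build the round-specific inputs for one fixture (illustrative only)."""
    return {
        "team_changes": round_info["changes"][team],  # selection churn this week
        "ladder_gap": round_info["ladder"][opponent] - round_info["ladder"][team],
        "form": round_info["recent_wins"][team],      # e.g. wins in the last 5 games
    }

# Invented example data for one round.
round_info = {
    "changes": {"Geelong": 2, "Essendon": 4},
    "ladder": {"Geelong": 3, "Essendon": 12},
    "recent_wins": {"Geelong": 4, "Essendon": 1},
}
print(weekly_features("Geelong", "Essendon", round_info))
# {'team_changes': 2, 'ladder_gap': 9, 'form': 4}
```

Because these inputs change round to round, a fixture that looked easy before the season can become difficult as form and ladder positions shift – which is exactly the divergence between the two models described next.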
The contrasting findings of the two models illustrate the influence of these dynamic factors on match difficulty.
For instance, prior to the 2015 season starting, Geelong faced the most difficult run of matches. But as the season progressed (and the fixture difficulty changed), on average Essendon ultimately competed in more difficult games.
Performance of each of the 18 AFL teams in 2015 relative to expectation can be seen below. Expressed as a percentage, the average performance of each team above or below that of the model prediction is shown. The figure clearly shows that West Coast, Geelong and Richmond all performed well above expectations (9% or greater) for the season.
Note also that both 2015 grand finalists – Hawthorn and West Coast – performed better than expectation, with Hawthorn eventually taking the title.
For comparison, the number of wins recorded by each team for the 2015 season is shown below, with the clubs ranked in the same order as in the above figure.
While a clear relationship exists between the final ladder position and team Performance vs Expectation (dubbed “PVE”), we see that despite only winning half their matches for the year, Geelong were somewhat able to overcome their difficult fixture and perform above expectations.
In contrast, GWS and Collingwood could have been expected to finish higher, having had a relatively easier fixture.
Sure, the models don’t yet consider each team’s individual player characteristics and game style, but it is a start. Applying the approach in a performance context could allow for a much-needed increase in the precision with which performance is evaluated.
The 2016 season
With the 2016 AFL season upon us, the average match difficulty for the 18 teams competing in the competition can be seen below.
Adelaide, GWS and again West Coast are going to need to be at their best in order to overcome the most challenging fixture, whereas Geelong are well-placed for a successful year given their easier run comparative to 2015.
It remains to be seen as to who can outperform the model in the upcoming AFL season. But considering their only moderate average match difficulty for the season ahead, it would take a brave person to suggest that the Hawks can’t make it four in a row in 2016.