In our previous article, we took a look at how home advantage can affect a team’s success. However, that analysis had some issues. Certain teams were stuck in divisions where most of their away games were against weaker teams, such as Limerick, or against stronger opposition, such as Dublin. Certain teams, such as Tipperary, rarely play games on neutral ground because their home stadium is used for games in the knockout stages. Small sample sizes, particularly for the lower tier teams, also skewed the precision of the results. In this article, we will attempt to create a better representation of home advantage by using our rating system to determine how much the actual result of each match differed from the expected result, and how much of that difference home advantage accounts for. Secondly, we will use these findings to (hopefully) improve our rating system, with the aim of producing more accurate predictions.
Expectation versus Reality
In the basic Elo rating, the change in rating is determined by first producing each team’s odds of winning or losing, based on their relative ratings. The difference between those odds and the actual result, multiplied by the k-value (a constant which determines how sensitive the ratings are to wins and losses), determines how large the swing in the team’s rating will be. For example:
The k-value is set to 24. A team is given a 70% chance of winning, and indeed goes on to win. Its rating change is calculated by subtracting the 70% expectation from the 100% actual result (a win counts as 100%), and multiplying the difference by 24. Therefore, their rating increases by 7.2 points (which is 24 x ( 1.0 - 0.7 )). If they had lost, we would have subtracted 70% from 0% instead, and their rating would have decreased by 16.8 points.
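The update described above is a one-liner in code. Here is a minimal sketch using the numbers from the example; the function name is our own, not something from the site’s codebase:

```python
K = 24  # sensitivity constant from the example above

def elo_change(expected: float, actual: float, k: float = K) -> float:
    """Rating change: k times the gap between the result and the expectation.

    `actual` is 1.0 for a win, 0.0 for a loss (0.5 would be a draw).
    """
    return k * (actual - expected)

# A team rated as 70% favourites wins: +7.2 points. Had they lost: -16.8.
print(round(elo_change(0.7, 1.0), 1))
print(round(elo_change(0.7, 0.0), 1))
```

Note that the punishment for losing as a favourite (16.8 points) is much larger than the reward for winning (7.2), which is exactly the asymmetry that hurts Kilkenny later in this article.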
In order to perform this new analysis of home advantage, the program re-ran through all the matches in the database and noted this difference between expectation and reality: the change in rating divided by the k-value. For each of these differences, it was noted whether the match was played at home, away, or on neutral ground. Each team’s overall gap between expectation and reality, regardless of venue, was also taken into account, to reduce the impact of unrelated factors in the outcome of the match, such as teams changing players, or rapidly improving or declining over a certain time period. In doing so, we were able to determine how much the venue affects the difference between expectation and reality: the bigger the gap, the stronger the implication that the rating system wasn’t accounting for something.
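The analysis pass above can be sketched as follows. The match tuple layout and function name are illustrative assumptions, not the site’s actual schema, but the logic matches the description: average each team’s expectation gap per venue, then subtract the team’s overall average gap to strip out venue-independent factors:

```python
from collections import defaultdict

def venue_effects(matches):
    """matches: iterable of (team, venue, expected, actual) tuples,
    where venue is 'home', 'away' or 'neutral' and expected/actual
    are in the 0.0-1.0 range used by the Elo update."""
    by_venue = defaultdict(list)   # (team, venue) -> list of gaps
    overall = defaultdict(list)    # team -> list of gaps, all venues
    for team, venue, expected, actual in matches:
        gap = actual - expected    # change in rating divided by k
        by_venue[(team, venue)].append(gap)
        overall[team].append(gap)
    effects = {}
    for (team, venue), gaps in by_venue.items():
        # Subtracting the team's overall gap reduces the impact of
        # unrelated factors, e.g. a squad improving or declining.
        team_avg = sum(overall[team]) / len(overall[team])
        venue_avg = sum(gaps) / len(gaps)
        effects[(team, venue)] = venue_avg - team_avg
    return effects
```

A team that beats expectations everywhere ends up near zero at every venue; only the venue-specific part of the gap survives.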
By adjusting the data to reflect the quality of competition, it was found that the value of playing at home had actually been underestimated by the previous article. The average home team performed 7.3 percentage points above what was expected of them, with a median improvement of 8 percentage points. This means that in a match between two evenly rated teams, in which draws were not permitted, the home team would be expected to win about 57-58% of the time, instead of 50% of the time. Perhaps not enough to ever guarantee victory, but certainly a significant effect.
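For readers who want to translate that boost into rating terms: under the standard Elo logistic curve with a 400-point scale (an assumption here, since the exact formula behind the site’s ratings isn’t spelled out in this article), a 57.3% win expectation between otherwise even teams corresponds to a rating gap of roughly 50 points:

```python
import math

def rating_gap_for(p: float) -> float:
    """Elo rating difference that yields win expectation p,
    assuming the standard 400-point logistic curve."""
    return 400 * math.log10(p / (1 - p))

# The 7.3 percentage point home boost (0.5 + 0.073 = 0.573)
# works out to roughly a 50-point rating bonus.
print(rating_gap_for(0.573))
```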
Let’s take another look at how teams performed at home, away, and on neutral ground, now that we’ve accounted for the relative strength of their opposition.
The above chart displays how many percentage points above expectation each team performed at home, once other factors had been removed. Dublin and Tipperary remain in the top five teams at home, as both have remained steady over the last 5 years. Clare are rated lower than before, however, possibly due to facing easier competition during the seasons they spent in Division 1B of the league rather than 1A. Kilkenny have fallen massively compared to the previous article, but this is most likely due to how highly rated they are, rather than how poor their opposition is: Kilkenny are pretty much always expected to win, regardless of location or opponent, so even a single loss at home hurts their score a great deal. One team of interest, which has performed well by the new measure, is Galway: In light of their ongoing dispute with Leinster, their big improvement in home games certainly supports their drive to have games played in Pearse Stadium during the All-Ireland championship. Of the upper tier teams, Cork and Limerick have both underperformed over the 2012-2016 period. This is not too surprising for Cork, who have had to make do with Páirc Uí Rinn for much of this timeframe while their usual home of Páirc Uí Chaoimh undergoes renovation. Prior to adjusting the score to account for other factors, Limerick had actually performed reasonably well at home. However, their unusually strong performance in away matches skewed this rating down.
The above chart displays how many percentage points above expectation each team performed away, once other influencing factors had been removed. Limerick come out way ahead, as they did in the previous rating. Here we find that the theory from the previous article that they simply weren’t playing the same quality of competition doesn’t hold up, and we’re forced to concede that Limerick simply overperform in away games.
The hypothesis that part of the disadvantage of away games is simply having the longer, more uncomfortable journey seems to hold some water based on this list: Teams further from the hurling heartland of the south of Ireland, such as London, Antrim and Sligo, all performed well below their already modest expectations in away games.
The above chart displays how many percentage points above expectation each team performed on neutral grounds, once other influencing factors had been removed. Once again, the teams that perform best on neutral ground are the lower division teams, which may still be the result of a small sample size. Lancashire, for example, have never won a game. However, they’ve lost less often and less surprisingly in neutral matches than in away ones. As such, they are considered to have performed above expectation in these situations, but only because their regular expectation is already so low. With the weaker teams only playing a couple of neutral games a year, if even a few of those are upset victories, or losses to stronger teams than they’re used to facing, it massively increases their score.
By contrast, most of the top tier teams perform below expectation in neutral games, but this again could be related to the small sample size. Very few games are played on neutral grounds for the top teams, and when they are, winning usually means that you’re a finalist or a champion, not an easy feat for any team, unless you’re Brian Cody. The best performing top-tier teams are currently Galway and Kilkenny, which is no surprise considering their ability to go deep into the championship almost every year over the period studied.
Updating the ratings
Now that we have evidence which strongly supports the idea that home advantage matters, it seems only sensible that this effect be reflected in the rating system. But this raises the question of how best to implement it. One way would be to simply give the home team a fixed boost to its expected result in each match, based on the average or median improvement found above. Alternatively, a complex formula could be devised based on some of the theories about why home advantage matters, taking into account travel time, stadium size, and estimated home and away attendance.
However, the decision was made to implement home advantage by adding new ratings in parallel to a team’s main rating. Each team is assigned a new home, away and neutral rating, in addition to their main score. A positive venue rating is added to the team’s main rating when calculating match expectations, while a negative one is subtracted. As is the case with the main rating, these values are adjusted based on match performance, with wins increasing the rating and losses decreasing it. While the initial values for each team were assigned based on the average values calculated above, these soon diverged as matches were played. The following chart displays the updated rating for each team, as well as their home, away and neutral match ratings.
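The parallel-ratings scheme can be sketched as follows. The exact formulas behind the site’s ratings aren’t published here, so the 400-point logistic curve, the class layout, and the way the venue rating shares the main rating’s update signal are all our own illustrative assumptions:

```python
from dataclasses import dataclass, field

@dataclass
class Team:
    rating: float = 1500.0
    # Venue ratings sit in parallel to the main rating; a positive
    # value boosts the team's effective rating, a negative one lowers it.
    venue: dict = field(default_factory=lambda: {"home": 0.0, "away": 0.0, "neutral": 0.0})

def expected_score(a: "Team", b: "Team", venue_a: str, venue_b: str) -> float:
    """Win expectation for team a, with venue adjustments folded in."""
    ra = a.rating + a.venue[venue_a]
    rb = b.rating + b.venue[venue_b]
    return 1 / (1 + 10 ** ((rb - ra) / 400))

def update(a: "Team", b: "Team", venue_a: str, venue_b: str,
           result_a: float, k: float = 24) -> None:
    """Adjust main and venue ratings after a match (result_a: 1 win, 0 loss)."""
    gap = result_a - expected_score(a, b, venue_a, venue_b)
    a.rating += k * gap
    b.rating -= k * gap
    # The venue ratings move with the same signal, so wins increase them
    # and losses decrease them; how the update is split between main and
    # venue ratings is an assumption of this sketch.
    a.venue[venue_a] += k * gap
    b.venue[venue_b] -= k * gap
```

With a home rating of around +50 points, an otherwise even home team gets roughly the 57% expectation found above, and the venue ratings then drift away from their initial values as results come in.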
Here we can see a similar trend to what was found so far: good home performances by teams such as Dublin, good away performances by teams such as Limerick, and good neutral performances by teams such as Galway. However, it is noteworthy that some teams’ ratings more closely resemble the previous week’s findings than those above. This is likely because the number of matches played at each venue carries more weight in these ratings, whereas the findings above were calculated on a per-game average basis.
In summary, this dive into the nature of home advantage has provided some tangible results, both in terms of insight into where different teams have the greatest advantage, and in a hopefully improved predictive ability for the site’s rating system.