How did your county change in 2017?

It’s the end of another GAA hurling season. A lot changed this year, as former greats appeared to fall behind, and teams that had fallen behind appeared to get back on the road to greatness. It could be argued that Galway and Waterford, the two 2017 finalists, were the only teams in contention for an All-Ireland last year who remained in contention this year. Galway, of course, won everything this year, coming away with a league title, a Leinster championship and the All-Ireland itself, but they were far from the only team to perform above expectation.

It’s always interesting to look back over the year and to try to determine which teams were the most and least improved. This article investigates that question, focusing on the teams who either played in Division 1 of the league or in the All-Ireland this year, or who have qualified to play in this tier next year. It looks at the issue through three different lenses on what counts as ‘improved’ or ‘disimproved’: each team’s performance in the various tournaments compared with their 2016 performance, their scoring margin compared with that of 2016, and how much the site’s rating system considers them to have performed above or below expectation.

How far did each team get?

This is, arguably, a fairly crude measure of improvement or disimprovement. It fails to take into account how strong a team’s opposition is, and how you weight it depends on your personal view on how important each tournament is. However, it could also be argued that this measure is the only one that matters: you can be as good as you like, but what does it matter if you can’t win big matches?

As well as the question of how best to weight the various tournaments, there is the issue that some teams haven’t played in the same competitions from year to year. Carlow were relegated from Leinster in 2016, and so couldn’t play in it in 2017, instead playing in the Christy Ring tournament. Is relegation better than not competing at all? Is it fair to even make a comparison if they couldn’t play this year? The competitions themselves may change too: by the 2017 rules, Laois would have been relegated to Division 2A in 2016, and Westmeath would have been promoted.

For clarity, the following decisions were made for this ranking of who was most to least improved. All the information is given in the table below, so you can make your own decisions on how to rate each team.

-For the purposes of this comparison, provincial tournaments are considered separate from the All-Ireland tournament. Therefore, teams knocked out in the Leinster round-robin stage are considered as not competing in the All-Ireland for that year.

-Winning a lower division is still less impressive than competing in a higher division, even if the higher-division team was relegated. Likewise, competing in a tournament is considered better than not competing in it at all.

-In the league, winning the division or getting further in the knockout stages is what matters, rather than being placed higher in the table (where applicable). Otherwise, league winners Galway would be considered worse than they were in 2016, because they were 2nd in 1B instead of 5th in 1A.

-If a team does better in a tournament, they have a point added. If they perform worse, they have a point subtracted. Therefore, a team with a total score of 3 did better in all three, a score of 2 means they did better in two and the same in one, and so on.  

-In tie-break situations, the All-Ireland (or Christy Ring) is more important than the provincial tournaments, which are more important than the league. Though, personally, I feel the league is probably the best competition for consistently competitive matches, the attendance difference between the league final and the Munster, Leinster and All-Ireland finals speaks for itself.

-If teams are still tied, then whoever got further in the competitions overall is ranked above. Going from being in the semi final to winning outright should be ranked as better than going from round 1 to the quarter finals.
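The scoring and tie-break rules above can be sketched as a small Python comparison. The stage numbers and team data here are entirely made up for illustration (higher number = further reached in that competition); they are not the real 2016/2017 results.

```python
# A sketch of the ranking rules above, using made-up stage numbers
# (higher = further reached in that competition; all data is illustrative).

def improvement_score(stages_2016, stages_2017):
    """+1 per competition where a team got further than in 2016, -1 where worse."""
    score = 0
    for comp in stages_2017:
        if stages_2017[comp] > stages_2016[comp]:
            score += 1
        elif stages_2017[comp] < stages_2016[comp]:
            score -= 1
    return score

def sort_key(team):
    s16, s17 = team["2016"], team["2017"]
    # Primary key: overall score. Tie-breaks: All-Ireland change first,
    # then provincial, then league; finally overall progress in 2017.
    return (
        improvement_score(s16, s17),
        s17["all_ireland"] - s16["all_ireland"],
        s17["provincial"] - s16["provincial"],
        s17["league"] - s16["league"],
        sum(s17.values()),
    )

teams = [
    {"name": "A",
     "2016": {"league": 2, "provincial": 1, "all_ireland": 2},
     "2017": {"league": 3, "provincial": 2, "all_ireland": 4}},
    {"name": "B",
     "2016": {"league": 3, "provincial": 3, "all_ireland": 3},
     "2017": {"league": 2, "provincial": 3, "all_ireland": 2}},
]

ranked = sorted(teams, key=sort_key, reverse=True)  # most improved first
print([t["name"] for t in ranked])  # ['A', 'B']
```

Python compares the key tuples element by element, which conveniently gives exactly the "score first, then tie-breaks in order of importance" behaviour described above.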

So, with all that out of the way, here is how we rated each team, from most to least improved:

For all its flaws, this measure of improvement does seem to hold up for most counties. As expected, Galway, Cork and Wexford top the ‘most improved’ side of the rankings. Kilkenny, despite having what would be a good year for most, find themselves right at the bottom: when you’re held to such a high standard each year, it doesn’t take much to fall down. Other interesting counties, whose positions could be debated, include Waterford and Offaly. Waterford were runners-up in the All-Ireland this year, yet are ranked as slightly disimproved! Similar to Kilkenny, the fact that they were runners-up in both the league and Munster in 2016 made it very difficult for them to match last year’s performance. Offaly had the opposite issue: even though 2017 was considered a terrible year for Offaly hurling, they performed no worse by this metric, as their 2016 was also fairly terrible.

However, as mentioned above, this metric has many issues, and how each team is rated largely comes down to one’s own interpretation of the results. Let’s move on to something a bit more tangible…

How did each team score?
Although this suffers from one of the same issues as above, in that it doesn’t take the strength of the opposition into account, looking into how much each team scored and conceded per match can give us a strong insight into how they improved or disimproved between 2016 and 2017.

Each team was ranked, for both 2016 and 2017, by how much they scored and conceded in an average match. The results are below:

These tables show each team’s offensive and defensive rating; as such, it should be remembered that a swing in a team’s rating doesn’t necessarily mean their average winning margin changed from year to year, but simply that they improved or disimproved relative to the other teams. The team ranked first offensively scored the most per game of the teams listed; the team ranked first defensively conceded the fewest per game.
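The two ratings are just sorts over per-match averages, one descending and one ascending. A minimal sketch, using made-up figures rather than the real 2016/2017 numbers:

```python
# Illustrative only: ranking a handful of teams by average score per match.
# The figures below are made up and are not the real 2016/2017 numbers.

avg_scored = {"Galway": 28.1, "Wexford": 24.3, "Clare": 22.0}
avg_conceded = {"Galway": 19.5, "Wexford": 21.2, "Clare": 24.8}

# Offensive rating: most scored per game first.
offensive_rank = sorted(avg_scored, key=avg_scored.get, reverse=True)
# Defensive rating: fewest conceded per game first.
defensive_rank = sorted(avg_conceded, key=avg_conceded.get)

print(offensive_rank)  # ['Galway', 'Wexford', 'Clare']
print(defensive_rank)  # ['Galway', 'Wexford', 'Clare']
```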

A few teams leap out immediately from these tables. Clare’s 2016 rating was highly impressive: they were 3rd both offensively and defensively. This year, they fell back to 10th offensively and 11th defensively, the biggest combined drop of any one team. Now, Clare may have suffered from playing tougher opposition: in 2016, they played in Division 1B of the league and went through the preliminary rounds of the All-Ireland, while in 2017 they faced Division 1A and went straight to the quarter finals after losing the Munster final. However, even accounting for this, you still couldn’t argue that Clare got better between 2016 and 2017.

The biggest combined improvement belongs, unsurprisingly, to Galway. In 2016, they were a high scoring team, coming 5th offensively, but conceded almost as much as they scored. In 2017, Galway massively improved their defence, capturing the 2nd best defensive rating, while upping their scoring even more, putting them into 1st place offensively.

This metric supports some of the findings from the previous section, while also giving us new insights: Kilkenny and Dublin dropped off heavily; Waterford improved through their offence, Galway through their defence, and Wexford improved through both. However, we get more information about Offaly: while they didn’t drop off in terms of how far they got in the league or the All-Ireland, their scoring margins absolutely plummeted. Similarly, Laois, who also finished the year in more or less the same position as they started it, saw a big boost in their scoring, even if their defence was still lacking. Part of this is certainly to do with the schedule: Offaly played in the Leinster round robin last year, and Laois this year. But just as with Clare, the schedule only explains so much: you couldn’t argue that Offaly really got that much worse, or that Laois really got that much better.

The tables below give some more details on the specific scoring numbers for each team, including their average winning or losing margins. The tables are sorted by scoring margin, show these values for 2016 and 2017, and display the difference between the two years for each team.

These tables also give us a bit more information on how teams did. We can see that scores in matches involving these teams increased on average between the two years, so a team with similar scoring to 2016 could have seen its offensive ranking drop and its defensive ranking improve. We see this in teams like Antrim and Kerry, whose raw scoring was similar from year to year but whose ratings swung heavily: Antrim’s defensive rating shot up six spots, even though they conceded less than 2 points fewer per match, while Kerry’s offensive rating went down 4 spots even though they scored 0.2 points more per game. Even if Clare hadn’t had their drop-off in scoring, and had simply maintained their 2016 numbers, they would have seen their offensive rating fall from 3rd to 6th, and their defensive rating remain the same.

View from the rating system

The above measures of improvement are all distorted by who each team played. The site’s rating system, however, takes the strength of the opposition into account. Here is a quick display of how much each team’s rating went up or down over the course of 2017:

The rating system confirms many of the implications of the previous sections: Galway were the most improved, with Wexford and Cork also being at the right end of the table. Kilkenny, Dublin and Kerry find themselves at the wrong end. We also see how, now that the opposition is taken into account, Clare and Offaly didn’t do quite as badly as the scoring difference would imply, and Tipperary haven’t fallen off quite as much as the competition performance table would indicate.

Following on from this, and getting away from how much teams won by, how far they got, or what tier they were playing in, let’s use the rating system to answer a final question, one which should get to the heart of how much better or worse a team was: how much, on average, did each team exceed or fall short of expectations? How much of a difference was there between how many games a team won and how many the rating system expected them to win?

Each team’s matches were reviewed, and their expected odds according to the rating system were recorded. This was then compared with whether they actually won, lost or drew. As the number of matches differs for each team between years, we will simply look at the average difference per match. The larger the number, the more a team exceeded expectations in 2017; the lower the number, the more they underachieved. A team with a score close to zero would have performed almost exactly to expectation.

As an example: if all teams were equal in ability, each team would be given a 50% chance of winning any game. In 10 matches, a team would be expected to win 5. Therefore, if they won 7 out of 10, they would be given a score of 0.2 (i.e. 20 percentage points above expectation per game). If they lost all of their matches, their score would be -0.5. A draw would be considered meeting expectation if both teams were evenly matched, and so getting ten draws would be equal to getting five wins, resulting in a total score of 0.
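The worked example above can be sketched in a few lines. Each match is represented here as a pair of (expected win probability, actual result), with a win counting as 1.0, a draw 0.5 and a loss 0.0; the match data is hypothetical, not the real rating-system output.

```python
# A sketch of the expectation calculation described above. Each match is a
# pair (expected win probability from the rating system, actual result),
# where a win counts as 1.0, a draw 0.5 and a loss 0.0. Data is hypothetical.

def score_above_expectation(matches):
    diffs = [actual - expected for expected, actual in matches]
    return sum(diffs) / len(diffs)

# The worked example from the text: a 50% chance in every game,
# with 7 wins from 10 matches.
even_matches = [(0.5, 1.0)] * 7 + [(0.5, 0.0)] * 3
print(score_above_expectation(even_matches))  # 0.2

# Ten draws between evenly-matched teams: exactly meeting expectation.
all_draws = [(0.5, 0.5)] * 10
print(score_above_expectation(all_draws))  # 0.0
```

Treating a draw as half a win makes the two cases in the text fall out naturally: ten draws at 50% odds score exactly 0, and losing everything scores -0.5.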

The chart above shows the score above expectation, per match, for each team. This seems very much in line with what we witnessed in 2017: Galway top the table, having won every competition available to them. Wexford also find themselves near the top of the most improved teams. They had been considered a middling team following 2016, and so while they got no further than a semi final in the league or a quarter final in the All-Ireland, they massively exceeded expectations, gaining points by beating much higher ranked teams like Galway and Kilkenny, and minimising their losses by only being beaten by much higher ranked teams, such as Tipperary or Waterford. Cork also performed well, but perhaps faltered by losing to a lower ranked team in Dublin; they also gained less credit for their wins, having started with higher expectations than Wexford.

Going down to the middle of the table, we find Waterford, Tipperary, Clare and Offaly. Waterford, despite reaching a final, were only marginally improved. It seems that the computer believed they were a contender from the beginning, so reaching an All-Ireland final, combined with a couple of losses along the way, wasn’t far above expectations. Offaly had the opposite reason for being here: they performed poorly, but very little was expected of them in the first place. Tipperary and Clare both disimproved marginally. Though Tipperary won a much higher percentage of their games this year than Clare, they finished with similar scores, as Tipperary carried much higher expectations following their excellent 2016 season. Neither performed disastrously, however, as neither suffered any major upsets against much lower rated teams: they both won most of what they were expected to win, and lost mostly when the odds were against them.

Down around the bottom of the list, we find teams like Kerry, Kilkenny and Dublin. Similar to Waterford, Kilkenny’s score was further damaged by high expectations. Coming away with only three wins and a draw from nine matches is fine for a perennial underdog, but when you’re always the favourite, it’s harder to justify. Dublin and Kerry both won only two games out of nine this year, but Kerry’s rating was hit much harder: while Dublin lost to quality teams like Galway, Waterford and Tipperary, Kerry were beaten by teams like Meath, who have played in Division 2B of the league since the league adopted its current format (though both teams will find themselves in 2A next year). Overall, by taking into account the quality of the teams faced, rather than the stage of the competition reached or the scoring differences, we get a much fairer view of how much or how little each team improved, one which scales to the level of the team. For me, at least, this table best represents the progress of each team.


So, that’s it for another year. Some teams made progress, some teams regressed, and some simply spent the year spinning their wheels. There are a variety of ways you can measure these changes, and each of them may be valid depending on what it is you want to highlight. All methods used here seemed to agree on a few things: for example, that Kilkenny have finally fallen from their perch, and are now just another top-tier team instead of something untouchable, and that Galway’s rise has been spectacular, as they won every tournament examined by this article that they took part in. Nevertheless, the 2017 season is now over, and the question remains, for all teams: Will the trend continue? Those who fell will hope that they’ve reached the bottom, those who’ve risen will hope to keep building on their progress, and those in between will hope that those other two groups don’t get their wish. Only time will tell how each county will actually fare when 2018 rolls around.