E ratings: League 2 2014/15
Following on from my earlier post introducing the new “E ratings” I’ve been experimenting with, I thought it would be helpful to illustrate their use by reviewing each division from last season.
E ratings are my newest creation, combining an “expected goals” measure of shot quality with an Elo-style rating system to track the attacking and defensive performances of clubs over time. You can read about how they work at the link above; here I’ve used last season’s data to give a “real-life” example of how they can be used.
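For readers who like to see the mechanics, here’s a minimal sketch of the kind of update involved. The actual E rating formula is in the post linked above; this assumes a simple exponential-moving-average style update with a hypothetical smoothing factor `K`, and ignores the opponent-strength adjustment a proper Elo-style system would make:

```python
# A minimal sketch of an Elo-style expected-goals rating update.
# NOTE: the real E rating formula is in the linked post; the EMA form and
# the smoothing factor K below are assumptions for illustration only, and
# no adjustment is made here for the strength of the opposition.

K = 0.1  # hypothetical: how quickly ratings react to each new match

def update_ratings(attack, defence, xg_for, xg_against):
    """Nudge a team's ratings towards the expected goals it created and conceded."""
    attack += K * (xg_for - attack)
    defence += K * (xg_against - defence)
    return attack, defence

# Start a club at the long-term average of ~1.3 expected goals per game...
attack, defence = 1.3, 1.3
# ...then feed in one match where it created 1.8 xG and conceded 0.9 xG:
attack, defence = update_ratings(attack, defence, xg_for=1.8, xg_against=0.9)
print(round(attack, 2), round(defence, 2))  # 1.35 1.26 - both drift towards the match xG
```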
Below is a graphic showing how the E ratings for each League 2 club changed over the 2014/15 season, with the clubs in descending order of their net rating (effectively “expected goal difference per game”) at the end of the season:
The first column of “sparklines” shows how each team’s attack rating – i.e. expected goals scored per match – changed over the season, with their final rating alongside (rank in brackets). The grey horizontal line is the long-term average of around 1.3 goals, so strong attacks will sit comfortably above it.
The second column shows the defence rating, which necessarily has the same long-term average – every goal scored is also a goal conceded – but here lower is better.
The third column simply subtracts the defence rating from the attack rating to give a net “expected goal difference per game”; the average line here sits at zero.
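To make the third column concrete, here’s that arithmetic in code form – the ratings below are invented placeholders, not any club’s real figures:

```python
# Net rating = attack rating minus defence rating, i.e. expected goal
# difference per game; the graphic orders clubs on this, descending.
ratings = {"Club A": (1.5, 1.1), "Club B": (1.2, 1.4)}  # (attack, defence) - hypothetical
net = {club: att - dfn for club, (att, dfn) in ratings.items()}
for club in sorted(net, key=net.get, reverse=True):
    print(f"{club}: {net[club]:+.2f}")
# Club A: +0.40
# Club B: -0.20
```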
As you’d probably expect, there’s a reasonable amount of similarity between the order of the teams here and their final league placings. Differences are primarily driven by unusually good or bad shot conversion and/or save percentages, which these ratings don’t currently account for.
Some observations:
- It’s reassuring that the four best-performing clubs by this measure were the ones who got promoted, but it was a different story at the bottom, with Morecambe and Dagenham & Redbridge appearing fortunate to have survived.
- Cheltenham’s relegation seems to have been due to a lack of cutting edge up front, with their attack rating deteriorating significantly over the season while their defence actually performed better than average at restricting their opponents’ chances.
- Wycombe provide a useful lesson in the application and limitations of these ratings. A dire 2013/14 campaign, in which they scored just a goal per match and were nearly relegated, meant they started last season with a very poor attack rating. A much stronger showing saw that rating climb significantly over 2014/15, suggesting it takes these ratings a while to “catch up” when a team’s ability changes a lot between seasons (the toy simulation below illustrates the effect). They got there in the end though: adding up the expected goals from every club’s shots last season puts Wycombe 16th, while their attack ranked 17th on the final E ratings.
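To show that “catching up” effect in the abstract, here’s a toy simulation – the numbers are invented for illustration, not taken from Wycombe’s data – of a team whose underlying performance jumps sharply between seasons:

```python
# Toy illustration of ratings lag: a team's true xG per match jumps from
# 1.0 to 1.4 between seasons. With an EMA-style update (hypothetical K=0.1),
# the rating only gradually closes the gap over a 46-game season.
K = 0.1
rating = 1.0   # poor rating carried over from the previous season
true_xg = 1.4  # the team's new, improved underlying performance
for match in range(1, 47):
    rating += K * (true_xg - rating)
    if match in (10, 23, 46):
        print(f"after match {match}: rating = {rating:.2f}")
# after match 10: rating = 1.26
# after match 23: rating = 1.36
# after match 46: rating = 1.40  <- most of the gap only closes by season's end
```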
I’m aware that that graphic is a bit busy, so here are the final attack and defence ratings mapped against each other in good old scatter plot format:
We lose the trend here – this is only an end-of-season snapshot – but it does make the overall spread easier to see:
- Shrewsbury’s promotion appears well-deserved, with Southend and Burton’s strong attacking performances also visible.
- This emphasises the improvement needed at Dagenham & Redbridge, while defensively there’s work to do at Carlisle too after a disappointing return to the fourth tier.
- Plymouth’s play-off finish is a bit surprising given their almost bang-average ratings at both ends – it looks like defensive resilience, presumably conceding fewer goals than their chances against would suggest, gave them the edge.
Note: There’s been a slight change to the positioning of the axes, which are now centred on the median for each measure rather than the mean. This was suggested by Chris Anderson and does a far neater job of separating the teams into quadrants.
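For anyone wanting to reproduce the quadrant view, here’s a rough matplotlib sketch with the crosshairs drawn at the medians – the club ratings used are made-up placeholders rather than the real final figures:

```python
# Sketch of the attack-vs-defence scatter with axes centred on the medians,
# splitting the clubs into four quadrants. Ratings here are hypothetical.
import statistics
import matplotlib.pyplot as plt

clubs = {  # (attack rating, defence rating) - invented for illustration
    "Shrewsbury": (1.5, 1.0),
    "Plymouth": (1.3, 1.3),
    "Dagenham & Redbridge": (1.1, 1.5),
}
attack = [a for a, d in clubs.values()]
defence = [d for a, d in clubs.values()]

fig, ax = plt.subplots()
ax.scatter(attack, defence)
for name, (a, d) in clubs.items():
    ax.annotate(name, (a, d))
ax.axvline(statistics.median(attack), color="grey")  # median attack rating
ax.axhline(statistics.median(defence), color="grey")  # median defence rating
ax.invert_yaxis()  # lower defence ratings are better, so put them at the top
ax.set_xlabel("Attack rating (xG for per game)")
ax.set_ylabel("Defence rating (xG against per game)")
plt.show()
```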