E ratings: Championship 2014/15
Following on from my earlier post introducing the new “E ratings” I’ve been experimenting with, I thought it would be helpful to illustrate their use by reviewing each division from last season.
E ratings are my newest creation, combining an “expected goals” measure of shot quality with an Elo-style rating system to track the attacking and defensive performances of clubs over time. You can read about them in detail at the link above; here I’ve used last season’s data to give a “real life” example of how they can be used.
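To make the mechanics a little more concrete, here’s a minimal sketch of how an Elo-style update driven by expected goals might work. The post doesn’t spell out the actual formula, so the multiplicative expectation, the `K` smoothing factor and the match figures below are all illustrative assumptions rather than the real E ratings parameters:

```python
# A minimal sketch of an Elo-style rating update driven by expected goals.
# The update rule, K factor and numbers are illustrative assumptions, not
# the actual E ratings parameters.

LEAGUE_AVG_XG = 1.3  # long-term average xG per team per match
K = 0.1              # hypothetical smoothing factor

def update_ratings(home, away, home_xg, away_xg, k=K):
    """Nudge both clubs' attack/defence ratings towards the expected
    goals created and conceded in one match. `home` and `away` are
    dicts with 'attack' and 'defence' keys."""
    # What we'd expect each side to create, given its attack rating and
    # how leaky the opposing defence is relative to the league average.
    exp_home = home['attack'] * away['defence'] / LEAGUE_AVG_XG
    exp_away = away['attack'] * home['defence'] / LEAGUE_AVG_XG

    # Move ratings in proportion to the surprise in this match. Note that
    # a defence rating rises (gets worse) when more xG is conceded.
    home['attack']  += k * (home_xg - exp_home)
    away['defence'] += k * (home_xg - exp_home)
    away['attack']  += k * (away_xg - exp_away)
    home['defence'] += k * (away_xg - exp_away)

# Example: two clubs starting at the league average.
bournemouth = {'attack': 1.3, 'defence': 1.3}
norwich     = {'attack': 1.3, 'defence': 1.3}
update_ratings(bournemouth, norwich, home_xg=1.8, away_xg=0.9)
print(bournemouth)  # attack drifts up, defence drifts down
```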
Below is a graphic showing how the E ratings for each Championship club changed over the 2014/15 season, with the clubs in descending order of their net rating (effectively “expected goal difference per game”) at the end of the season:
The first column of “sparklines” shows how each team’s attack rating – i.e. expected goals scored per match – changed over the season and their final rating (ranked in brackets). The grey horizontal line is the long-term average of around 1.3 goals, so strong attacks will be comfortably above it.
The second column shows the defence rating – i.e. expected goals conceded per match – which has the same long-term average, except here lower is better.
The third column just subtracts the defence rating from the attacking one to get a net “expected goal difference per game”. The average line here is set at zero.
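As a quick illustration of how those three columns fit together (with invented numbers, since the underlying rating series aren’t published here), the net series is simply the attack series minus the defence series, and the table is ordered by each club’s final net value:

```python
# How the three columns relate (numbers invented for illustration).
ratings = {
    'Bournemouth': {'attack': [1.3, 1.5, 1.7], 'defence': [1.3, 1.1, 1.0]},
    'Charlton':    {'attack': [1.3, 1.2, 1.1], 'defence': [1.3, 1.4, 1.5]},
}

# Net rating series = attack series minus defence series.
for r in ratings.values():
    r['net'] = [a - d for a, d in zip(r['attack'], r['defence'])]

# Order clubs as in the graphic: descending final net rating.
for club in sorted(ratings, key=lambda c: ratings[c]['net'][-1], reverse=True):
    print(f"{club}: final net rating {ratings[club]['net'][-1]:+.2f}")
# Bournemouth: final net rating +0.70
# Charlton: final net rating -0.40
```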
As you’d probably expect, there’s a reasonable amount of similarity between the order of the teams here and their final league placing. Differences will be primarily driven by abnormal (for better or worse) shot conversion and/or save percentages, which aren’t currently accounted for in these ratings.
Some observations:
- Both Bournemouth and Norwich appear to have deserved their promotions, comfortably out-creating their opponents by over half an expected goal per match.
- Charlton and Leeds survived despite striking the ball from far less dangerous positions than their opponents, suggesting that both could struggle in 2015/16 without significant improvement.
- It looks like Wigan’s deteriorating attack was primarily to blame for their relegation: their defensive performances were consistently above average, but their attack rating dropped from one of the best to one of the worst over a single campaign.
- Whatever Fulham changed after the first few months of the season blew a big hole in their defence, which had been roughly average before it started to haemorrhage goals.
I’m aware that that graphic is a bit bust, so here are the final attack and defence ratings mapped against each other in good old scatter plot format:

We can’t see the trend here – only an end-of-season snapshot – but it does help us to see the overall spread:
- The superiority of Bournemouth and Norwich is clearly visible, as are the worrying performances of Charlton and Leeds.
- Despite Gary Rowett’s successful rescue of Birmingham, their expected goals ratings hardly budged, which is pretty interesting. Certainly at the back they continued to allow plenty of chances but were adept at soaking these up – the question is whether they can continue to do so.
Note: There’s been a slight change to the positioning of the axes, which are now centred on the median for each measure rather than the mean. This was suggested by Chris Anderson and does a far neater job of separating the teams into quadrants.
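For anyone wanting to reproduce that kind of quadrant chart, here’s one way to do it with matplotlib. The ratings below are placeholder values, not the actual final figures:

```python
import matplotlib.pyplot as plt
from statistics import median

# Final (attack, defence) ratings per club - placeholder values only.
final = {
    'Bournemouth': (1.75, 1.05),
    'Norwich':     (1.65, 1.10),
    'Leeds':       (1.10, 1.40),
    'Charlton':    (1.05, 1.45),
}

attack  = [a for a, _ in final.values()]
defence = [d for _, d in final.values()]

fig, ax = plt.subplots()
ax.scatter(attack, defence)
for club, (a, d) in final.items():
    ax.annotate(club, (a, d))

# Centre the quadrants on the median of each measure rather than the
# mean, as per Chris Anderson's suggestion.
ax.axvline(median(attack), color='grey', linewidth=0.5)
ax.axhline(median(defence), color='grey', linewidth=0.5)

# Lower defence ratings are better, so invert the y-axis to put the
# strongest defences at the top.
ax.invert_yaxis()
ax.set_xlabel('Attack rating (expected goals for per match)')
ax.set_ylabel('Defence rating (expected goals against per match)')
plt.show()
```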