E ratings: Conference 2014/15
Following on from my earlier post introducing the new “E ratings” I’ve been experimenting with, I thought it would be helpful to illustrate their use by reviewing each division from last season.
E ratings are my newest creation, combining the concept of an “expected goals” measure of shot quality with an Elo-style rating system, to track the attacking and defensive performances of clubs over time. While you can read about them at the link above, I’ve used last season’s data here to give a “real life” example of how they can be used.
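To make the Elo-plus-expected-goals idea concrete, here is a minimal sketch of how such an update could work. The post doesn't give the actual formula, so the learning rate `K`, the opponent adjustment, and the function names below are all illustrative assumptions, not the real E rating maths:

```python
# A minimal sketch of an Elo-style rating update driven by expected goals.
# Assumption: ratings are nudged towards each match's observed xG, with the
# opponent's defence rating setting the pre-match expectation.

LONG_TERM_AVG = 1.3  # league-average expected goals per match (from the post)
K = 0.1              # assumed learning rate; not from the source

def update_ratings(attack, opp_defence, match_xg, k=K):
    """Move a team's attack rating (and the opposing defence rating)
    towards the expected goals the attack actually created this match."""
    # What this attack would be "expected" to create against this defence:
    expected = attack + (opp_defence - LONG_TERM_AVG)
    error = match_xg - expected
    # Both ratings shift by the same surprise, in the same direction:
    # over-performance raises the attack rating and worsens (raises)
    # the opposing defence rating.
    return attack + k * error, opp_defence + k * error

# E.g. a 1.5-rated attack creating 2.0 xG against a 1.2-rated defence:
attack, opp_def = update_ratings(1.5, 1.2, 2.0)
```

Because each update only moves the ratings a fraction of the way towards the latest result, they smooth out single-match noise while still tracking genuine changes in form, which is what produces the gradual trends in the sparklines below.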
Below is a graphic showing how the E ratings for each Conference club changed over the 2014/15 season, with the clubs in descending order of their net rating (effectively “expected goal difference per game”) at the end of the season:
The first column of “sparklines” shows how each team’s attack rating – i.e. expected goals scored per match – changed over the season and their final rating (ranked in brackets). The grey horizontal line is the long-term average of around 1.3 goals, so strong attacks will be comfortably above it.
The second column shows the defence rating – i.e. expected goals conceded per match, which has the same long-term average, since every goal scored is also a goal conceded – where lower is now better.
The third column just subtracts the defence rating from the attacking one to get a net “expected goal difference per game”. The average line here is set at zero.
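The relationship between the three columns is just simple arithmetic, which this toy example illustrates. The club names and ratings here are made up for the sake of the example, not the real 2014/15 values:

```python
# Hypothetical final attack/defence ratings for three illustrative clubs.
ratings = {
    "Team A": {"attack": 1.6, "defence": 1.1},
    "Team B": {"attack": 1.3, "defence": 1.3},
    "Team C": {"attack": 1.1, "defence": 1.5},
}

# Net rating = attack rating minus defence rating
# ("expected goal difference per game"); the average line sits at zero
# because attack and defence share the same long-term average.
net = {team: r["attack"] - r["defence"] for team, r in ratings.items()}

# Clubs in descending order of net rating, as in the graphic.
table = sorted(net.items(), key=lambda kv: kv[1], reverse=True)
```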
As you’d probably expect, there’s a reasonable amount of similarity between the order of the teams here and their final league placing. Differences will be primarily driven by abnormal (for better or worse) shot conversion and/or save percentages, which aren’t currently accounted for in these ratings.
- The story of Dover’s season is clearly visible here – they started with ratings below average in attack and defence (coinciding with them losing 8 of their first 11 games) but then massively improved at both ends to finish 8th.
- Bristol Rovers appear to have deserved their promotion, with a late surge up the table powered by an impressive improvement in defence, and they finished ranked highest at both ends.
- Fans of Grimsby can be justifiably disappointed not to have won promotion, with a consistently strong attack and – like Rovers – an improving defence.
- On a purely selfish note, I’m reassured that the four lowest ranked teams are the four who went down because that means I didn’t screw the maths up too badly.
I’m aware that the graphic above is a bit busy, so here are the final attack and defence ratings mapped against each other in good old scatter plot format:
We can’t see the trend here – only an end-of-season snapshot – but it does help us to see the overall spread:
- We can clearly see Bristol Rovers’ superiority here, along with Grimsby’s similarly strong – yet unrewarded – performance.
- Wrexham’s defence also sees them stand out – they allowed chances at a rate only a fraction worse than Rovers but created far fewer – if they can improve up front without destabilising their back line then they should be challenging for promotion.
- Despite finishing bottom of the pile, Nuneaton actually performed slightly better than the other three relegated sides. Their difficulties in converting their chances ultimately cost them.
Note: There’s been a slight change to the positioning of the axes, which are now centred on the median for each measure rather than the mean. This was suggested by Chris Anderson and does a far neater job of separating the teams into quadrants.
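The median-centred split is straightforward to reproduce. This sketch uses hypothetical ratings and only the standard library; the quadrant labels are my own shorthand, not terms from the post:

```python
# Sketch of splitting teams into quadrants around the median of each axis,
# as described in the note above. Ratings are illustrative, not real data.
from statistics import median

attack = {"Team A": 1.6, "Team B": 1.3, "Team C": 1.1, "Team D": 1.4}
defence = {"Team A": 1.1, "Team B": 1.3, "Team C": 1.5, "Team D": 1.0}

# Centre each axis on the median rather than the mean, so the four
# quadrants each contain (roughly) equal numbers of teams.
att_mid = median(attack.values())
def_mid = median(defence.values())

def quadrant(team):
    """Classify a team relative to the median on each axis
    (remember: lower is better for the defence rating)."""
    good_attack = attack[team] > att_mid
    good_defence = defence[team] < def_mid
    return {
        (True, True): "strong both ways",
        (True, False): "all attack",
        (False, True): "all defence",
        (False, False): "weak both ways",
    }[(good_attack, good_defence)]
```

Using the median guards against a single outlier (a Bristol Rovers-style runaway leader) dragging the crosshairs away from the bulk of the pack, which is why it separates the teams more neatly.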