How well do expected goals match up with league position?
As usually happens in the early part of the season, I got impatient while waiting for a meaningful amount of match data to build up and blasted out some very early scatter graphics showing how the EFL clubs’ expected goals tallies were looking so far:
As I punted these onto Twitter without so much as a cursory explanation, I thought it was only fair to add a bit of context around how these charts work and how reliable they are as a long-term measure of performance.
A brief explainer
For anyone unfamiliar with the concept, “expected goals” use data to measure the quality of chances created and allowed by a team, based on how many goals we would expect the average team to score (or concede) from them. The BBC have recently put up an explainer of how Opta’s expected goals model works here, ahead of its use on Match of the Day, which is bound to be more eloquently phrased than anything I can come up with.
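To make the idea concrete, here's a minimal sketch (the per-chance probabilities are made-up numbers, not from any real model): each chance is assigned a probability that the average team would score from it, and a team's expected goals tally is simply the sum of those probabilities.

```python
# Hypothetical chance qualities for one team in one match:
# a near-certain chance, two half-chances and a decent shot.
shot_probabilities = [0.76, 0.08, 0.03, 0.12]

# The team's expected goals for the match is the sum of the
# per-chance scoring probabilities.
expected_goals = sum(shot_probabilities)
print(round(expected_goals, 2))  # 0.99
```

So a team racking up lots of low-quality shots can post a similar tally to one creating a single clear-cut chance, which is exactly what the metric is designed to capture.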
While it’s too early to draw any firm conclusions from these numbers at such an early stage – as they’ll be heavily influenced by things like the fixture list, “game changing” events like red cards and penalties, plus good old fashioned luck – there are still some interesting stories in there.
For example, Ipswich have maximum points from their first four games but have seemingly allowed chances of far greater combined menace than those they’ve created, while Millwall (one of the teams to suffer at the Tractor Boys’ hands) have dominated matches without getting the results to go with their performances. Neither looks like a sustainable pattern so it will be interesting to see what happens over the coming weeks.
Comparing expected goals with league finish
I had five previous seasons of data to hand so I thought it’d be interesting to see how well the average expected goals totals for individual clubs over a season matched up with their final league position.
I divided the league up into six groups of four (1st to 4th, 5th to 8th etc) to keep things neat. The animated GIF below plots every club using the same format as the tweet above – i.e. attack on the horizontal axis and defence on the vertical – and highlights each of the six groups in descending order:
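The grouping above can be sketched as a simple function (a hypothetical illustration of the binning, not my actual plotting code): each final league position from 1 to 24 maps to one of six four-team bands.

```python
def position_group(position):
    """Return the band label (e.g. '1-4') for a final league
    position between 1 and 24, using groups of four."""
    start = ((position - 1) // 4) * 4 + 1
    return f"{start}-{start + 3}"

print(position_group(1))   # '1-4'
print(position_group(8))   # '5-8'
print(position_group(24))  # '21-24'
```

Averaging the expected goals for and against of every club in a band then gives one representative point per group on the attack/defence chart.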
If you click on this, it should bring up a version that can be paused and skipped through.
I was pretty relieved by how this turned out, given the relatively simple data available for the lower leagues compared to the top flight. Even though there’s quite a wide spread, each group’s average position on the chart is progressively worse than that of the group above it in the table.
Therefore when we look at a team’s position on one of these charts, we can at least set a benchmark for how unusually good (or bad) they’d need to be in order to end up in a specific part of the league table.
The averages from these charts are as follows:
Those who followed the blog last season will have a pretty good idea of who that rogue dot at the top of the 1st – 4th graph is:

Hopefully this chart will lay to rest any lingering suspicions that I have some sort of vendetta against Reading, given how badly my model rated them last season. Their performances were so far off what you’d normally expect from a “top four” team, nearly all of which are tucked in or around the bottom-right quadrant.
2012/13 was an interesting season, with Burton only missing out on automatic promotion from League 2 by three points. Their porous defence conceded more goals than all but four other clubs in the division, leaving them with a goal difference of just +6. In the tier above, Doncaster won the title despite scoring only 62 goals, thanks to an impressively tight back line that conceded less than a goal per game on average.