The Sunday Times published yet another piece bashing schools in the north and Midlands yesterday.

It is certainly true that schools (and local authorities) in the north and Midlands tend to be lower attaining.

The piece quotes Sir Michael Wilshaw as saying that 13 of the 16 worst local authorities in terms of secondary school standards are in the north and Midlands. This appears to be based on Progress 8 data for 2018. In fact, using provisional 2019 data, 16 of the 18 lowest-performing local authorities are in the north.

Differences in attainment

We’ve written before about how differences in attainment between areas tend to be due to differences in demography. Previous examples are here and here.

What we really want to know is this: If we could somehow send the pupils from the north to the schools in the south, would they achieve any better?

I suspect their attainment would be slightly higher, but not by much.

While the question is impossible to answer, we can try to get an approximate answer by using the data at our disposal and making some assumptions.

What if we compared attainment at each school (or local authority) to a hypothetical national average for similar pupils in similar schools?

Progress 8 itself goes some way towards this by comparing pupils’ Attainment 8 scores to the national average for pupils with equivalent prior (Key Stage 2) attainment.
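
In rough terms, that calculation looks something like the sketch below (a minimal illustration with hypothetical column names, not the DfE’s exact methodology):

```python
import pandas as pd

# Hypothetical pupil-level data: 'attainment8' is the pupil's Attainment 8
# score; 'ks2_group' is a banded Key Stage 2 prior attainment measure
pupils = pd.read_csv("pupils.csv")

# National average Attainment 8 for pupils in the same prior attainment band
expected = pupils.groupby("ks2_group")["attainment8"].transform("mean")

# A Progress 8-style score: the gap between a pupil's Attainment 8 and the
# average for pupils with equivalent prior attainment, divided by 10 as in
# the published measure
pupils["p8_score"] = (pupils["attainment8"] - expected) / 10
```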

But it does not take into account other factors known to influence attainment, such as gender, disadvantage and ethnic background. To what extent do P8 scores for northern local authorities simply reflect differences in pupil populations?

To answer this, we can go back to the 2018 contextual value added scores that I calculated in this blogpost. (I’ll use the version of the scores that includes school characteristics.)
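
For readers who want the gist of that calculation: a contextual value added score of this kind can be sketched as the residual from a regression of Attainment 8 on prior attainment plus pupil characteristics. The column names below are hypothetical:

```python
import pandas as pd
import statsmodels.formula.api as smf

pupils = pd.read_csv("pupils.csv")  # hypothetical pupil-level extract

# Regress Attainment 8 on prior attainment and contextual factors;
# C() marks categorical variables in the formula interface
model = smf.ols(
    "attainment8 ~ ks2_score + C(gender) + C(disadvantaged) + C(ethnicity)",
    data=pupils,
).fit()

# A pupil's residual is their attainment relative to similar pupils --
# a simple contextual value added score, scaled to match P8
pupils["cva_score"] = model.resid / 10
```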

We can then compare the contextualised P8 score for each local authority with its published P8 score.[1]
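
Continuing the sketch above (and assuming both scores have been added to the same pupil-level table), the comparison amounts to averaging the scores up to local authority level and plotting one against the other:

```python
import matplotlib.pyplot as plt

# Average both pupil-level scores up to local authority level
# ('la_name' is again a hypothetical column)
la_scores = pupils.groupby("la_name")[["p8_score", "cva_score"]].mean()

fig, ax = plt.subplots()
ax.scatter(la_scores["p8_score"], la_scores["cva_score"])

# Dashed y = x line: LAs whose score is unchanged by contextualisation
lims = [la_scores.min().min(), la_scores.max().max()]
ax.plot(lims, lims, linestyle="--")

ax.set_xlabel("Published P8 score")
ax.set_ylabel("Contextualised P8 score")
plt.show()
```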

Results are shown in the chart below. I’ve highlighted local authorities in the north east and those in London.

If the P8 score of a local authority did not change once context was taken into account, it would sit exactly on the dashed line. For those below the line, taking context into account lowers their score; for those above it, it raises their score.

There are a number of important features of the chart to note.

  1. Scores for local authorities in the north east all improve when P8 is contextualised.
  2. Scores for local authorities in London tend to fall.
  3. Some London LAs have a higher contextualised P8 score than the highest scoring LA in the north east, but there are also some with lower scores than the lowest scoring LA in the north east.
  4. The range (spread) of P8 scores narrows considerably when context is taken into account (a quick check of this is sketched below).
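
Following on from the sketch above, the last point can be checked directly by comparing the spread of the two sets of local authority scores:

```python
# Range (max minus min) of each set of local authority scores;
# the contextualised range comes out much narrower
print(la_scores.max() - la_scores.min())
```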

So on the basis of this analysis, the performance of local authorities in the north east does not appear to be much different from that of local authorities in London.

The impact of disadvantage

As we showed here, Progress 8 scores for most schools aren’t that different, especially when context is taken into account. There are some schools whose performance is manifestly lower than others, but there are not many.

And as we showed in this blogpost, for all its successes, some groups of pupils have not benefited from the ‘London effect’. White British, long-term disadvantaged pupils perform no better in London and the south than elsewhere. If anything, they appeared to perform better in Yorkshire and the Humber than anywhere else.

Rather than policies and interventions that are geographically targeted, we need policy that raises the attainment of low-attaining pupils in all schools, all over the country. If we got this right, schools in the north and Midlands would benefit disproportionately, and so attainment there should increase by a greater margin.

Wilshaw claims that “these kids [in northern schools] are not different from the kids I taught in London 10 years ago but they are not in good schools with good leaders.”

The first part is certainly questionable. The second part may be true, but evidence other than attainment data would be needed to support it.


Notes

1. As I set out here, contextual value added is not necessarily a measure of school effectiveness, but it is useful for this purpose nonetheless.