A couple of weekends ago I attended the London Festival of Education. The “London Effect” in secondary schools, a topic on which I have blogged previously, was much discussed.

One of the important contributions to the debate, cited by a number of presenters at the festival, was made last year by Simon Burgess from Bristol University. He found that the London Effect could be explained by differences in ethnic composition between schools in the Capital and those elsewhere.

The paper created much heat and a bit of light. It upset some who believed that the London Effect was real and gave ammunition to those who did not. Some, such as Chris Cook of the BBC, argued that the London Effect was still evident when outcomes in academic qualifications (excluding vocational qualifications) were analysed. In fact, Burgess had found such an effect in his research, but in a follow-up blog he asked whether it was “legitimate to decide now, after the fact, that some qualifications count less or not at all in a measure of what schools do?” This was a fair question. Would schools have entered pupils for vocational qualifications if they knew they would not be counted in Performance Tables? At the time the research was published there was no easy answer to this question, and so the debate reached an impasse.

As luck would have it, the rules of the School Performance Tables game changed following the Wolf Review and the consequences can be seen in the 2014 data. In brief, the changes (a rough sketch of how they combine follows the list) were that:

  • Large numbers of previously eligible qualifications were no longer counted
  • No qualification counted as more than one GCSE
  • At most two non-GCSE qualifications were counted per pupil
  • ‘First entries’ rather than ‘best entries’ were counted in some subjects (English, maths and other EBacc subjects).
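
To make the interaction of these rules concrete, here is a minimal sketch in Python of how they might be applied to one pupil's results. The Qualification fields and the order of the capping steps are my own simplification, not the DfE's actual implementation:

```python
from dataclasses import dataclass

@dataclass
class Qualification:
    subject: str
    is_gcse: bool
    eligible: bool    # survived the post-Wolf discounting (rule 1)
    points: float     # points for the result that counts ('first entry'
                      # in EBacc subjects under rule 4)

def capped_points_2014(quals: list[Qualification], cap: int = 8) -> float:
    """Sum the best `cap` results under the (simplified) 2014 rules:
    every qualification occupies a single slot (rule 2) and at most
    two non-GCSEs can count (rule 3)."""
    counted = [q for q in quals if q.eligible]
    # Keep only the two highest-scoring non-GCSE qualifications.
    non_gcse = sorted((q for q in counted if not q.is_gcse),
                      key=lambda q: q.points, reverse=True)[:2]
    counted = [q for q in counted if q.is_gcse] + non_gcse
    # Take the best `cap` of what remains.
    best = sorted(counted, key=lambda q: q.points, reverse=True)[:cap]
    return sum(q.points for q in best)
```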

This led to some substantial changes in the qualifications offered by schools, which we will cover in greater detail in a later blog. In short, pupils entered fewer qualifications overall but tended to enter more GCSEs compared to previous national cohorts. So what if Burgess had done his research on 2014 data?

To investigate this, we first run a simple value added analysis for pupils attending state-funded mainstream schools, regressing capped ‘best 8’ points plus English and mathematics ‘bonuses’ on KS2 test scores (in English, mathematics and science). This is the outcome measure used in DfE value added calculations. The results for 2013 are shown in Table 1 and largely concur with Table T3 in Burgess’s paper. Pupils in London schools achieved, on average, almost 12 additional points in the outcome measure. This equates to an additional two grades, on average, across the 10 subjects counted in the measure. However, some groups, such as Bangladeshi, Black Caribbean and Black African, appeared to make less progress in London than in the rest of England.
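
As a rough illustration of the calculation (not the DfE's exact methodology, which bands pupils by fine-grained prior attainment), something along these lines would reproduce the headline comparison. The file and column names (ks2_aps, capped_points, and london as a 0/1 indicator) are assumptions about a pupil-level extract:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical pupil-level extract, one row per pupil.
pupils = pd.read_csv("ks2_ks4_pupils_2013.csv")

# Regress the KS4 outcome (capped 'best 8' points plus English and
# maths bonuses) on KS2 average points; the residual is each pupil's
# value added score.
model = smf.ols("capped_points ~ ks2_aps", data=pupils).fit()
pupils["va"] = model.resid

# Headline comparison, as in the 'All pupils' row of Table 1.
print(pupils.groupby("london")["va"].mean())
```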

Table 1: Key Stage 2 to Key Stage 4 Value Added, Pupils in State-funded Mainstream Schools 2013

| Ethnic group | Value Added: London | Value Added: Rest of England | Difference | % pupils: London | % pupils: Rest of England |
|---|---|---|---|---|---|
| White – British | -3.94 | -4.46 | 0.52 | 36% | 85% |
| White – Irish | -0.62 | -6.39 | 5.77 | 1% | 0% |
| Any Other White Background | 25.68 | 21.59 | 4.09 | 8% | 2% |
| White and Black Caribbean | -6.76 | -11.98 | 5.22 | 3% | 1% |
| White and Black African | 7.47 | -0.25 | 7.72 | 1% | 0% |
| White and Asian | 15.51 | 3.88 | 11.63 | 1% | 1% |
| Any Other Mixed Background | 6.70 | 2.03 | 4.67 | 3% | 1% |
| Indian | 28.26 | 28.96 | -0.70 | 6% | 2% |
| Pakistani | 22.43 | 19.02 | 3.41 | 4% | 3% |
| Bangladeshi | 19.36 | 27.69 | -8.32 | 5% | 1% |
| Any Other Asian Background | 34.55 | 30.47 | 4.08 | 4% | 1% |
| Black – African | 20.16 | 31.44 | -11.28 | 12% | 1% |
| Black Caribbean | -1.79 | 6.53 | -8.32 | 7% | 1% |
| Any Other Black Background | 9.09 | 9.24 | -0.14 | 2% | 0% |
| Any Other Ethnic Group | 41.75 | 41.33 | 0.42 | 1% | 0% |
| Chinese | 31.25 | 31.97 | -0.72 | 5% | 0% |
| Information Not Yet Obtained | 5.86 | -16.46 | 22.32 | 1% | 0% |
| Refused | 6.47 | -3.36 | 9.82 | 1% | 1% |
| All pupils | 10.07 | -1.50 | 11.57 | | |

We then repeat the analysis for 2014. This time, however, the outcome measure is based on the new rules for School Performance Tables. Results are shown in Table 2. Compared to 2013, the London Effect is larger at 19 points, equivalent to three additional grades on average in the 10 subjects counted in the measure (or 0.17 standard deviations). The average VA score is higher in London for every ethnic group apart from Black African (the second largest group in London), including the Bangladeshi and Black Caribbean groups, which appeared to make less progress in London than elsewhere in 2013.
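
For completeness, here is a sketch of how the cells of Tables 1 and 2 can be assembled from the same pupil-level frame as above; the ethnicity column name is again an assumption:

```python
# Mean VA by ethnic group and region (the left-hand block of each table).
va_cells = (pupils.groupby(["ethnicity", "london"])["va"].mean()
                  .unstack("london")
                  .rename(columns={1: "London", 0: "Rest of England"}))
va_cells["Difference"] = va_cells["London"] - va_cells["Rest of England"]

# Share of each region's cohort in each ethnic group (right-hand block).
shares = (pupils.groupby("london")["ethnicity"]
                .value_counts(normalize=True)
                .unstack("london")
                .rename(columns={1: "London", 0: "Rest of England"}) * 100)

print(va_cells.round(2), shares.round(0), sep="\n\n")
```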

Table 2: Key Stage 2 to Key Stage 4 Value Added, Pupils in State-funded Mainstream Schools 2014

| Ethnic group | Value Added: London | Value Added: Rest of England | Difference | % pupils: London | % pupils: Rest of England |
|---|---|---|---|---|---|
| White – British | -0.92 | -6.46 | 5.53 | 35% | 84% |
| White – Irish | 1.71 | -6.39 | 8.10 | 1% | 0% |
| Any Other White Background | 37.11 | 21.59 | 15.52 | 9% | 2% |
| White and Black Caribbean | -8.21 | -11.98 | 3.77 | 3% | 1% |
| White and Black African | 17.16 | -0.25 | 17.41 | 1% | 0% |
| White and Asian | 20.52 | 3.88 | 16.64 | 1% | 1% |
| Any Other Mixed Background | 12.57 | 2.03 | 10.54 | 3% | 1% |
| Indian | 39.38 | 28.96 | 10.42 | 6% | 2% |
| Pakistani | 33.32 | 19.02 | 14.30 | 4% | 3% |
| Bangladeshi | 30.91 | 27.69 | 3.23 | 5% | 1% |
| Any Other Asian Background | 44.86 | 30.47 | 14.39 | 4% | 1% |
| Black – African | 3.10 | 31.44 | -28.34 | 13% | 1% |
| Black Caribbean | 31.39 | 6.53 | 24.85 | 7% | 1% |
| Any Other Black Background | 13.93 | 9.24 | 4.70 | 2% | 0% |
| Any Other Ethnic Group | 44.08 | 41.33 | 2.75 | 1% | 0% |
| Chinese | 44.81 | 31.97 | 12.83 | 5% | 1% |
| Information Not Yet Obtained | 26.18 | -16.46 | 42.64 | 1% | 0% |
| Refused | 12.95 | -3.36 | 16.30 | 1% | 0% |
| All pupils | 17.67 | -1.50 | 19.17 | | |

Of course, Tables 1 and 2 do not control for ethnicity and other factors, such as English as an additional language and free school meal eligibility. Doing so removes the London Effect in 2013, as Simon Burgess found. For 2014, however, a small positive effect on capped points remains, at almost 4 points (little more than 0.04 of a standard deviation, or a quarter of the size of the effect when controlling for prior attainment alone).
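
In regression terms, this amounts to adding the controls to the simple model sketched earlier. Again the column names are assumptions (eal and fsm as 0/1 indicators):

```python
# Add controls for prior attainment, ethnicity, EAL and FSM eligibility.
controlled = smf.ols(
    "capped_points ~ london + ks2_aps + C(ethnicity) + eal + fsm",
    data=pupils,
).fit()

# The coefficient on the London indicator corresponds to the
# controlled London Effect (roughly 4 points on the 2014 data).
print(controlled.params["london"])
```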

This does seem to suggest that if Burgess updated his research using 2014 data, he might conclude that there is a small London Effect after all. As I noted in my previous blog on this subject, schools in London tended to maintain more traditional, GCSE-based curricula and entered pupils for fewer vocational qualifications than schools in the rest of England. This meant they were better placed to respond to the changes in the rules of engagement for Performance Tables, and as a result London's performance once again pulled away from the rest of England in 2014. They may also be better placed to respond to post-16 accountability changes, as I will discuss in a future blog.