The TES reported today that Ofqual has identified that some types of school have been more ‘optimistic’ with this year’s GCSE and A-Level centre assessment grades, but that it would not say which.
We collected some preliminary GCSE centre assessment grades from schools back in May. In one of the three blogposts we wrote about the data, we looked at centre variability in each subject: that is, the extent to which schools’ results change from one year to the next.
In this blogpost, we’ll look at overall centre variability, defined as the change in average point score in GCSEs. Is there any evidence that some types of schools were more optimistic?
Comparing results from year to year
As before, there are limitations. We are working with a subset of around 1,900 secondary schools and we don’t have any information on independent schools. The grades were also preliminary. Some schools may well have made further changes before submitting to the awarding bodies.
We’ll start by using published 2018 and 2019 examination-level data to calculate the GCSE average point score (APS) in all subjects for a school. To ensure a like-for-like comparison, we’ll restrict subjects to those which had been reformed by 2018.
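The APS calculation described above can be sketched as follows. This is a minimal illustration using made-up exam-level records and hypothetical school, pupil and grade values; the real analysis used published 2018 and 2019 examination-level data restricted to reformed subjects.

```python
from collections import defaultdict

# Hypothetical exam-level records: (school, pupil, subject, grade on the 9-1 scale).
# In the real analysis, subjects are restricted to those reformed by 2018.
results = [
    ("School A", "p1", "mathematics", 7),
    ("School A", "p1", "English language", 6),
    ("School A", "p2", "mathematics", 4),
    ("School B", "p3", "history", 9),
    ("School B", "p3", "chemistry", 8),
]

def average_point_score(records):
    """Mean grade across all entries for each school."""
    totals = defaultdict(lambda: [0, 0])  # school -> [sum of grades, entry count]
    for school, pupil, subject, grade in records:
        totals[school][0] += grade
        totals[school][1] += 1
    return {school: total / count for school, (total, count) in totals.items()}

aps = average_point_score(results)
# School A: (7 + 6 + 4) / 3; School B: (9 + 8) / 2
```

Computing the same measure for two years and subtracting gives the year-on-year change plotted in the charts.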
On the whole, there is a spread, with some schools’ results falling and others’ increasing – generally by between -1.0 and +1.0 points. There was less variation among schools with high results. The correlation between the two measures displayed is weakly negative (-0.2). As we wrote here, schools with low results one year are more likely to improve the next.
Now let’s look at the change in APS between 2019’s actual results and 2020’s proposed results – shown on the next chart.
Clearly, there are far fewer schools reporting lower results in 2020 than 2019 (those plotted below the horizontal axis). But the correlation between the values plotted has increased in magnitude to -0.4. In other words, lower-attaining schools were more likely to report very large increases in results. There were 44 schools that reported increases in their APS of more than +1.0.
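The correlation quoted here is the usual Pearson coefficient between a school’s APS in one year and its change the following year. A short sketch, using invented figures purely to show the shape of the calculation (a negative value means lower-attaining schools tend to show larger rises):

```python
import statistics

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Made-up illustrative data: 2019 APS and 2019-to-2020 change for six schools.
aps_2019 = [3.5, 4.0, 4.8, 5.5, 6.2, 7.0]
change = [1.1, 0.8, 0.6, 0.4, 0.3, 0.1]

r = pearson(aps_2019, change)  # negative here, as in the charts
```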
But do we see much difference between different types of school? The table below would suggest, on average and for state-funded mainstream schools at least, not really. Studio schools and UTCs tended to report the largest rises but there are relatively few of these schools and numbers of pupils tend to be small.
However, this is only a quick analysis. Ofqual would have had more data available to it (including data from independent schools) and undoubtedly will have been able to undertake a more detailed analysis given the time and resources available to it.
If the grades schools submitted to the awarding bodies are anything like those they sent us, then it looks like lower-attaining schools tended to submit more optimistic results.

We would expect lower-attaining schools to improve the most as a group. But what Ofqual’s statistical moderation process is unlikely to have been able to detect is which of those schools would have improved, and which would not, had exams taken place.
Want to stay up-to-date with the latest research from FFT Education Datalab? Sign up to Datalab’s mailing list to get notifications about new blogposts, or to receive the team’s half-termly newsletter.
1. We only include schools with three years of data and remove schools where the average number of centre assessment grades per pupil was less than five. In total, we use 1,899 schools in our analysis.
The subjects included art and design, art and design (photo), biology, chemistry, citizenship, computing, food and nutrition, dance, drama, English language, English literature, French, geography, German, Greek, history, Latin, mathematics, music, physical education/sport studies, physics, religious studies, combined science and Spanish.
2. These are mostly, but not exclusively, grammar schools.