Yesterday the Government published its response to the Wolf Review of Vocational Education. The response sets out a number of proposals, accepting all of the Review’s recommendations. These include the eye-catching scheme to ensure that young people who do not achieve a C grade in English and maths by age 16 continue studying those subjects to age 19.
The response also proposes reforms to school performance tables. This is based on a recognition that schools’ behaviour in selecting qualifications for their students is strongly influenced by the incentive structure they face. A crucial component of this structure is the published school performance tables. These tables are important in influencing parental choice of school, and school leadership teams pay them a lot of attention.
From this year, the content of the performance tables will change quite significantly. The long-standing measure of the percentage of students achieving at least 5 A* to C grades will be retained. But in addition, a differential average points score will be published for each school, which provides information on how well the school does for students at the lower and upper ends of the ability distribution, as well as at the average:
“It is vital that performance indicators do not inadvertently cause schools to concentrate on particular groups of pupils at the expense of others. To avoid this we will continue to include performance measures, like average point scores, which capture the full range of outcomes for pupils of all abilities. In addition, from 2011 the performance tables will show for each school the variation in performance of low attaining pupils, high attaining pupils and those performing as expected.” (Wolf Review of Vocational Education, Government Response, p. 6).
This is a step forward. In our analysis we argued for exactly this measure: average GCSE points score, presented at three points in the ability distribution: low ability, average and high ability. Our criteria were relevance to parents, comprehensibility and functionality of the performance measure. A measure is relevant if it informs parents about the performance of children very similar to their own in ability and social characteristics. It is comprehensible if it is given to them in a metric that they can meaningfully interpret. It is functional if it helps parents to answer the question: “In which feasible choice school will my child achieve the highest exam score?”. Overall, this performance measure came out on top. We also described ways that the information could best be displayed for parents: paper-based and web-based delivery mechanisms.
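To make the arithmetic behind the measure concrete, here is a minimal sketch of how an average points score at three points of the ability distribution might be computed. The band labels, point values and pupil data are entirely illustrative assumptions, not official definitions:

```python
def banded_average_points(pupils):
    """Average GCSE points by prior-attainment band.

    `pupils` is a list of (band, points) pairs, where band is one of
    'low', 'average' or 'high' (e.g. based on attainment at age 11).
    Returns a dict mapping each band to its mean points score.
    """
    totals = {}
    for band, points in pupils:
        totals.setdefault(band, []).append(points)
    return {band: sum(scores) / len(scores) for band, scores in totals.items()}

# Hypothetical school: each pupil's points alongside their prior-attainment band.
pupils = [
    ("low", 250), ("low", 270),
    ("average", 310), ("average", 330), ("average", 320),
    ("high", 380), ("high", 400),
]
print(banded_average_points(pupils))
# → {'low': 260.0, 'average': 320.0, 'high': 390.0}
```

Reporting the three band averages separately, rather than a single whole-school mean, is what lets parents see how a school performs for children like their own.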
A second issue is that the “price”, or GCSE-equivalent points, of the new vocational exams seems set to change. Precise details are unclear at the moment. It is worth making the point again that schools will have an eye on the performance-table impact of the courses they offer to students. If vocational qualifications are to be worth fewer points than they currently receive, there is a danger that schools will not be keen to promote them to students who are unlikely to score highly on more academic courses. In turn, this may make schools less keen to accept low-ability pupils.
Of course, the old league table measure of the percentage with 5 A* to C grades is staying. Perhaps there is a performance-management version of Gresham’s Law in reverse, and good performance measures will drive out bad ones. If parents come to rely more on the new measure, the media will give it more prominence and the grip of the “%5A-C” measure on the public mind will finally begin to weaken.