The UK schools inspectorate, OFSTED, today published a league
table of local authorities’ comparative performance in inspections.
I’ve always considered OFSTED’s methodology in assessing the
effectiveness of schools to be deeply flawed. Schools get, or at least used to
get, an obscene amount of notice that OFSTED were going to turn up. I vividly
recall my secondary school spending what seemed like months preparing for an
impending OFSTED visit – tidying classrooms, selecting pupils to be interviewed
by the inspectors, prepping them on what they were and weren’t allowed to say,
painting the walls, and even installing an electronic information board in the
main corridor which, needless to say, was removed once the inspectors bade them
farewell for another few years. But the inspectors’ findings have been used for
years to compile school league tables, forcing schools to compete against each
other in vying for the top spot and, in turn, parents’ business.
My secondary school cherished its OFSTED “Outstanding”
rating more than any other prize, which made it a popular choice for parents in
Bradford – the school was over
four times oversubscribed for its 2012 intake – marginally better, from a probability-of-getting-in
perspective, than the University of Cambridge (going by 2010 figures, the most recent available). This fact was
used to exert control over the student population: to maintain discipline and attendance,
we were frequently reminded of the enormous privilege it was to attend and of the scores of
applicants we had deprived of a place – which perhaps wouldn’t have been so bad had the school
employed the sibling rule and/or aligned its holiday calendar with the city’s other schools.
This element of control served two purposes. The first of
these was that behavioural standards were high, as was the attendance rate –
no mean feat when unfortunate parents with children at two different
schools can’t plan a holiday for when all
their children are out of term. Whether these ends justify the means is
another question entirely – but it doesn’t take a rocket scientist to work out
that being continually reminded of the fantastic privilege with which one has
been honoured rather takes the shine off. And for that you can blame my year 8
English teacher for never teaching me the rule about ending sentences with
prepositions.
The second of these
was that the OFSTED rating was maintained, which allowed the school to continue
the cycle of control into the future. Thinking about it, another consequence
was probably a widening of the gap between the school’s OFSTED rating and those of neighbouring
schools – term-time holidays were forbidden, so parents with children at two
schools probably took the other child
out of lessons instead. From the school’s perspective, a virtuous circle.
It’s only natural that, once they know they will be tested,
schools will pull out all the stops to ensure they attain the highest rating
possible. Even schools that don’t seek to lord vague conceptions of “privilege”
over their pupil populations will seek to attract applicants – and, of course,
there’s the headteacher’s reputation at stake. No-one wants to be seen to be at
the helm of a sinking ship. And school league tables arguably serve a valuable
purpose in allowing parents to make an informed decision as to where to send
their children – even if the underlying methodology could use some work.
What I cannot understand is what useful inferences we can
draw from the local authority league table that OFSTED published today. Sure,
it’s interesting to be able to ask why the worst-performing local authority,
Coventry, only sends 42% of its children to “Good” or “Outstanding” schools
when neighbouring Solihull – fourth in the league table – can send 77%. But it’s
not as though the league table can actually promote competition between
authorities in the same way that a schools league table can. People generally
live in reasonably close proximity to their families – it’s not as though a
family in Derby, unhappy with the 43% chance their child has of attending
a “Good” or “Outstanding” school, can just up sticks and move the hundred miles to
Buckinghamshire, where their chance rises to 78%, leaving their entire
support network behind.
What I think is more dangerous is the potential for misuse,
or misinterpretation, of the information. Local authorities are huge areas –
and there’s the potential that aggregating data in this way does more harm
than good. In Wakefield – a local authority of some 131 square miles – OFSTED
calculate that a child has a 52% chance of attending a “Good” or “Outstanding”
school. Hop over the boundary into Kirklees, a district of a mere 158 square
miles, and that figure jumps to 71%. This may tempt a concerned parent in
Wakefield to apply for a place for their child out-of-district in Kirklees.
But these figures don’t give any meaningful insight into the
performance of individual schools within an authority. Within each authority
there will be enormous variation in terms of schools that underperform, schools
that perform adequately and schools that excel. The OFSTED table reports only
the proportion of pupils at schools in the “Good” or “Outstanding” category – more granular
information is required to draw an informed conclusion. There are absolutely no
grounds to infer that any given school in Kirklees is better than any given
school in Wakefield – it’s perfectly possible that a Wakefield school just over
the border from one in Kirklees is the better of the two. Furthermore, no
mention is made of the proportions of the other grades within each
authority. Which is better – an authority with 80% “Outstanding” and 20% “Special
Measures”, or one with 60% “Outstanding”, 20% “Good” and 20% “Satisfactory”?
The table could, in limited circumstances, be beneficial as
a benchmark of the comparative performance of local education authorities. Or,
at least, it could if the league
table included only those schools within local authority control, but that isn’t
what OFSTED have done. This league table includes all state schools within each
local authority, including Free Schools, Academies and Technology Colleges,
over which the local authority has no control. I’m no council apologist – I’ve
had too many parking fines for that – but to rank the performance of local
authorities against each other in areas where they have no remit to take action
is the height of unfairness.
I’m all for presenting complex situations simply, and I can
see what OFSTED have tried to do, but this over-simplification of the facts
does nothing but obfuscate them. In any kind of ranking there will be a winner
at the top and a loser at the bottom, but in this case the ranking is rendered
all but meaningless by the massive variation within
the reported units and by the local authorities’ lack of control over the
schools within their boundaries. Miriam Rosen, HM Chief Inspector of Schools, knows
this, and might as well not have bothered.