Ofsted Fails to Reach a Grade B in Statistics
Michael Wilshaw’s Ofsted has made a fool of itself yet again as it publishes a report which says more about its naive approach to statistics than it does about the progress of the most able students.
Ofsted is not happy. Its 2013 report on the progress of those students who achieved level 5 in their Key Stage 2 exams made some recommendations. Apparently, Ofsted were unhappy then that around a fifth of those achieving the highest level in Maths and English failed to go on to achieve a B grade or above in their GCSEs, and two years later nothing has improved.
Setting aside the question of whether two years is enough time to see improvements when the children involved had already been through twelve years of education, is it a reasonable complaint?
Lessons for Ofsted
The point that Wilshaw has emphasised is that pupils achieving level 5 at age eleven are expected to reach at least a grade B at GCSE, so when 20% or so do not, that must be the schools' failure. And Ofsted are supposed to stop schools failing, aren't they? Unfortunately, Ofsted under Wilshaw have a history of statistical and attribution errors.
Word Mission Creep
First, expected does not mean, and never has meant, 100% guaranteed. And once you accept that it cannot mean 100%, the argument becomes one about what percentage success rate is the minimum acceptable. Why not accept the 80% success rate that is actually the case?
When the Key Stages were being worked out, level 4 was the average level achieved. Once politicians got hold of it, this mutated into 'expected', a statistics word meaning the average value of a prediction. But 'expected' has other meanings in everyday speech, so it warped into 'required'. And although the preferred weasel-word has since reverted to 'expected', its meaning is no longer the neutral, mathematical original, but the coercive, demanding sense of the word. Its use is insidious, assuming that every pupil is capable of matching the average one.
Most level 5 pupils do reach grade B and above. But should all of them? Key Stage 2 assessments are not the mythical perfect exam. Because the tests are short, they can only sample a subset of the curriculum's knowledge and skills, so any assigned level carries uncertainty. It is likely that some of those level 5 pupils would not have achieved a 5 the year before or after, with different questions, and that some who ended up with a GCSE grade C would have got a B on another day with another paper.
How much does this uncertainty affect the progress issue?
If some pupils were over-rated at age eleven then they cannot be expected to make grade B at GCSE. The reliability of KS2 assessments is likely to be around 0.80, meaning that around a third of pupils are misclassified. A similar figure applies to individual GCSE grades. The precision of exam grades is simply not good enough to support the judgements that Ofsted is trying to make.
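The link between a reliability of 0.80 and roughly a third of pupils being misclassified can be illustrated with a quick Monte Carlo sketch. This is an illustration only: it assumes a classical test theory model (observed score = true score + independent noise) and uses five equal-sized bands as a stand-in for KS2 levels, which are not the real thresholds.

```python
# Illustrative sketch: how often does a test with reliability 0.80
# put a pupil in the wrong attainment band? Classical test theory
# model, with reliability r = var(true) / var(observed). The five
# equal-sized bands are an assumption, not the actual KS2 cut scores.
import random

random.seed(42)
RELIABILITY = 0.80
N = 100_000
# Choose noise so that r = 1 / (1 + noise_sd**2) equals 0.80.
NOISE_SD = ((1 - RELIABILITY) / RELIABILITY) ** 0.5

true_scores = [random.gauss(0, 1) for _ in range(N)]
observed = [t + random.gauss(0, NOISE_SD) for t in true_scores]

def quintile_cuts(values):
    """Cut-points splitting values into five equal-sized bands."""
    s = sorted(values)
    return [s[int(len(s) * q)] for q in (0.2, 0.4, 0.6, 0.8)]

def band(score, cuts):
    """Index of the band (0-4) that a score falls into."""
    return sum(score > c for c in cuts)

true_cuts = quintile_cuts(true_scores)
obs_cuts = quintile_cuts(observed)

misclassified = sum(
    band(t, true_cuts) != band(o, obs_cuts)
    for t, o in zip(true_scores, observed)
) / N
print(f"misclassified: {misclassified:.1%}")
```

Under these assumptions the simulated misclassification rate comes out in the region of 30%, consistent with the "around a third" figure above, though the exact number depends on how many bands are used and where the cut scores sit.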
Identifying Failure, or Pushing an Agenda?
Ofsted says in its report, based on a sample of just forty secondary schools, that
For our part, Ofsted will make sure that inspections keep focusing sharply on the progress made by the most able students, particularly those from poorer backgrounds. Inspectors will also report more sharply about how well schools promote the needs of the most able through the quality of the curriculum and the information, advice and guidance they offer to their most able students.
It is especially disappointing to find that, almost two years on from our first report, the same problems remain. I hope school leaders see this report as a call to action – and raise the bar higher for their most able pupils, so that they can reach their full potential.
So it seems that Ofsted are so sure that their view of schools is right that they will abandon their original role of identifying failing schools, and continue with their attempts to define what good schools should be doing and how they should manage the progress of their pupils. Head-teachers are responsible for their schools, yet can have their priorities distorted by Ofsted's progressive political agenda, while Ofsted itself seems to have no-one overseeing its pronouncements. Who is supposed to watch the watchers when they are a law unto themselves?
It is instructive to read the full report and see how often the writers refer to peer-reviewed work from professional education researchers. The answer (to save you from the inevitable brain-rot you would get from the exposure) is never.
No references. No peer review. No concern for the proper disciplines required for a reputable inquiry. Ofsted call their document 'research', but it would never get published in a reputable education research journal.