Ofsted still found to be in much need of improvement


Colin Forrest, Carolyn Medlin and Mike Cooper of the Policy Consortium analyse what Sir Michael Wilshaw’s Ofsted Annual Report for 2012/13 says about further education and college improvement.

This is an extended version of the post ‘Setting the scene for the Ofsted Annual Report’ that appeared on 11 December 2013.

It’s nearly 10 years since Ofsted under David Bell published Improvement through Inspection – ‘a sincere attempt at externally tested self-evaluation’ of its impact on quality in the FE and Skills sector. As Ofsted re-engages directly with improvement activities, is a similar review now overdue?

Each year, the Chief Inspector’s Annual Report gives a picture of the ‘state of the nation’ – but usually gives few clues as to how well the Inspectorate impacts on improvement. The new Report, published on 11 December 2013, seems to continue that pattern.

There was anger across the FE sector last year when findings from the Report were clearly treated selectively and then applied to the whole FE and skills sector. However, a more conciliatory approach from Ofsted has been signposted for several months. In the summer, for example, its Director of FE and Skills, Matthew Coffey, wrote:

    I would like to commend the hard work of so many further education and skills leaders, teachers and trainers and other staff in helping to raise standards. I have been impressed by the determined and positive response I have seen to last year’s Chief Inspector’s Annual Report. I look forward to seeing the positive impact this will have for learners.

The FE and Skills Annual Report for 2012/13 strikes a similar tone at the outset, with Sir Michael Wilshaw identifying ‘grounds for optimism’ over improving the quality of Teaching, Learning and Assessment. In this year’s Report, the main focus of criticism shifts to how well provision meets employers’ needs, aligning with the direction of travel in funding changes and with the drive to reduce skills shortages.

In both comments, however, there is a subtle element of implied rebuke about the furore around the December 2012 Report – a tacit refusal to accept the ‘sector’s criticisms of Ofsted’s criticisms’. In both cases, too, there are sub-texts suggesting that improvement in the FE sector is largely attributable to Ofsted’s broad messages about preceding grade patterns and what underlies them.

Even this year’s new major reproach raises questions. If the 2012/13 Report’s assertion about poor responsiveness to the needs of local employers is accepted, what is Ofsted’s role in improving that situation? How will it go beyond the obvious business of simply stating ‘the big picture’?

In the Report, Ofsted encourages better links with employers, and with bodies such as Local Enterprise Partnerships (LEPs), as a possible ‘solution’. Is that, together with a case study of where it went wrong in Bristol, the main contribution Ofsted should make? After all, the record of Ofsted’s grades published since September of this year for provision largely led by employers (in at least six firms) hardly inspires confidence.

In sum, both last year’s main report and its new successor are silent on the impact of Ofsted’s role in improvement, beyond the implied one of drawing attention to shortcomings. That revived role has been in place for over a year now, and was planned beforehand. Are we therefore seeing genuine ‘shared ownership’ of improvement? If so, what are the implications for providers and the way Ofsted operates?

We need a clearer picture of how impact is measured

There are clues, but answers so far are neither very clear nor full. The list of Ofsted’s priorities for 2013/14 includes: ‘increase the impact of inspection’ and ‘increase the impact and scope of improvement work’. But there is no indication of how these impacts are to be measured, evaluated and shared.

To be fair, this year’s overall Annual Report is now accompanied by eight regional reports for the first time – and this is where the narrative of improvement starts to surface. There are references to HMI ‘research’ to explore key strategies and sharing practice, though (beyond the assumption that this refers to more case studies, which are of variable usefulness for real improvement actions) it is not at all clear how that research might be used. It is also arguable how far a regional dimension is now a significant factor in the landscape of FE and of improvement, as compared to the heyday of regional structures and funding in the early 2000s.

If Ofsted argues that their direct and positive impact comes through how they measure and evaluate improvement, as demonstrated in their processes, grading and reporting, parts of the sector may well be unconvinced. Guidance on ‘Support and Challenge’ was updated in July – but it is still hard to see how these elements are balanced to support improvement. Indeed, it appears that little consideration has been given to what the common barriers to improvement are, how they are best overcome and how improvement can be sustained. One obvious way in which this might be accomplished by Ofsted is through meta-analysis and evaluation of what the post-inspection monitoring visit reports show. Such a process does not appear to be under way, or even contemplated.

Some providers attend Ofsted improvement events ‘by invitation’, selected by narrow criteria and, as so often with such occasions, with places failing to meet demand. The new regional reports refer to these invitation-only ‘getting to good’ improvement events. They have the potential to enhance improvement; but this cannot be taken as a given, since attendance is only a first step in making effective changes ‘back at the ranch’.

At the very least, HMI input on such occasions must be inspiring, challenging and imaginative if it is to exemplify the very characteristics that Ofsted looks for in inspections. Yet there is as yet no convincing evidence of the impact of these events, or of ‘the learner voice’ on such occasions. We know inspectors can be very tough on self-assessment statements like “we held a staff development event”. The response is rightly: “What was the improvement impact of what you did, and how do you know?”

Certainly, providers that ‘require improvement’ report that their relationships with designated HMIs are valued. Some of the new regional reports give a broad insight into the form this relationship takes: from carrying out joint observations to enhancing provider self-assessment, as well as fostering links with good or outstanding providers. But the level of resources available from Ofsted is limited.

Pressure is increasing from other directions, too. The recent Skills Commission report Move to Improve recommends that Ofsted quickly publish an account of the impact they have had in the first year of ‘Support and Challenge’. That account needs to demonstrate such impact convincingly, establishing strong causality rather than coincidence, and using data – in exactly the reliable and robust ways Ofsted itself demands of providers – rather than through generalised assertions.

What does improvement really look like?

We offer an example of recent data to be considered. Of the 44 FE and skills inspections since September 2013 (up to 2 December 2013), 25% of providers improved by a grade since their previous inspection (where applicable), yet 30% dropped at least one grade and 32% were unchanged (for 14% of those inspected, this was their first inspection). Is this evidence of adequate improvement, and of Ofsted’s impact on improvement?

The Inspection Handbook, with its rationale for how improvement is promoted by inspection, shows a narrow perspective on that concept. What does improvement look like, in all its dimensions? Taking risks (or, using a familiar near-synonym, ‘innovation’) and making mistakes is an important part of improvement. But what scope is there for that, especially when providers are under a variety of pressures to come up with quick answers and sustainable gains, alongside decreasing resources?

There are other difficult issues that are neither easily dismissed nor easily answered. For instance, given general concern about consistency across all inspectors, and the need rigorously to interrogate improvement plans and actions, what risks arise when most improvement journeys are steered by a single HMI? What is the purpose of good-practice case studies when these are sometimes largely generalised, and/or repeat fairly obvious exhortations to ‘do better’?

It’s also worth asking whether the fact that inspectors are trained in the art or science of inspection and evaluation is in itself adequate. How good are they as change agents? That link is by no means guaranteed.

And equally – when the emphasis is so strongly on sector self-improvement, in the absence now of a specific improvement body such as LSIS – just how good are the successful organisations identified by Ofsted at fostering culture change in others, and facilitating the transfer of good practice?

FE ‘self-improvement’ has a long history; and alas a somewhat chequered one. Providers, however outstanding, are also now increasingly hard-pressed. Helping others requires considerable co-operation with the newly-refocused Inspectorate. Is Ofsted as good at that as it needs to be – and as the sector needs it to be?

Different sections of the suite of reports making up the Annual Report are illustrated with ‘outstanding provider case studies’. These, though, may sometimes serve only to illustrate the problems with improvement and inspection, given the reductionist nature of inspection outputs. Ofsted’s processes capture huge rafts of qualitative and quantitative information. Evidence forms contain a rich narrative on the central aspects of Teaching, Learning and Assessment. Yet the vast bulk of this information is held only within the Inspectorate, and remains largely inaccessible for improvement purposes.

These case studies, as suggested, may do little to address this deficit in any practically significant way. Although popular with some, such documents are generally deemed to be at the lower end of effectiveness among strategies that support improvement.

Where’s the evidence of co-ownership?

A good example is one published in October 2013, linked to the impressive achievement of a full set of Grade 1s in December 2012 by Walsall Adult and Community College. Entitled ‘Getting from Good to Outstanding’, its central focus is environmental sustainability. Important as that issue unarguably is, the message appears primarily to be that this is the key route to achieving that kind of improvement. Doubtless Ofsted would dismiss such an interpretation as simplistic and naive – as the College would, too. Nevertheless, this particular document as it stands may not be all that helpful to hard-pressed providers looking for answers. And as one of Ofsted’s major contributions to the improvement agenda in FE, the case-study approach as a whole seems of similarly limited value – perhaps, on occasion, even risking being counter-productive.

We offer two final and significant points. First, what isn’t mentioned in the suite of reports within the 2012/13 Chief Inspector’s Annual Report is that some providers saw significant improvements in grades after they appealed against inspection findings – and the Inspectorate usually rush to publish the grades and the report even while such complaints are being investigated. Second, evaluations of inspection are not in the public domain.

So, in conclusion, we ask how the Inspectorate can claim proper co-ownership of the improvement agenda, when their own approaches appear to fall short. This seems to be the case not only in terms of David Bell’s expectations established a decade ago, but also of the expectations that Ofsted (rightly) have of providers – in the same territory of open, rigorous, data-focused self-assessment and improvement planning.

This article was originally published in the Education Journal of 17 December 2013. 

 
