Evidence for practice


Andrew Morris celebrates the gradual advance of evidence-based practice…

We’re all evidence-based now – or so it would seem judging by the frequency with which the phrase crops up in current affairs programmes, ministerial claims and mission statements.

In all seriousness, this is an achievement worth celebrating. The idea that evidence matters, whether at policy level or in daily practice, has only recently been acknowledged. For decades a small crew of diehards has been making the case for evidence, usually in the face of indifference if not active hostility. Academics managed to stifle the issue with fine arguments about epistemology and futile battles between quantitative and qualitative methods; civil servants and politicians gleefully used this excuse to sideline the topic. The struggle for evidence-based medicine only a few decades ago warned us that this kind of stand-off was to be expected.

Recent years, however, have seen significant changes in many branches of education and other public services: the arrival of the Education Endowment Foundation with its funding of school-based interventions evaluated by randomised controlled trials, the development of the Teaching and Learning Toolkit of soundly based practices by Durham University, the creation of What Works Centres by the Cabinet Office and the formation of an Alliance for Useful Evidence by the National Endowment for Science, Technology and the Arts (NESTA), to name but a few. These acknowledgements of the importance of sound and useful evidence mark an important milestone in the development of public services.

If attitudes are finally shifting, what's left for the proponents of evidence-based approaches to do? Plenty, of course! With evidence finally on the agenda, subtler questions begin to arise. If evidence is now sacred, won't we see organisations and individuals simply making sure they pick the right evidence before presenting their case? Will we see all manner of interpretations of the very word evidence?

The proponents of evidence-based approaches must move the debate forward. We now need to be talking about the kinds of evidence that are needed. Is it the numerical data that drives so much school and college improvement planning today? Is it evidence from one-off studies of synthetic phonics, academy performance or the value of vocational qualifications? Or is it the synthesised findings of multiple studies? Should effect sizes measured in experimental studies outweigh insights from ethnographic studies of the lives of individuals? Do statistical models from longitudinal studies over people's lifetimes help in making policy choices? What is the role of small-scale studies by practitioners that provide crucial detail about specific teaching methods?

These questions need to be taken seriously at all levels, and satisfactory compromises made that raise the quality of evidence in use without leaving the practitioner bereft in the absence of top-quality evidence. A glance at more mature evidence-using communities shows that almost all these types of evidence, so often the cause of contention, are in fact needed at the right time and place. In healthcare, qualitative studies of individual patients are as important as randomised controlled trials in large populations. Personal and institutional data are valuable, in their right place, in working out how to improve particular services. Statistical analyses shed light on trends but need careful use to avoid the tyranny of the average value and the league table. Practitioner research is of crucial importance in working out how to adapt general findings to specific circumstances and in shaping the specification of major studies.

The crucial issue now facing champions of evidence is what healthcare researchers call 'implementation' – the problem of helping practitioners use evidence in real settings to alter the way they operate. Until now there have been so many obstacles to this in education that the question was barely addressed. Evidence was not readily accessible; where it was, it was not tuned for practical use; where it had been, teachers had too little time and colleges and schools too little capacity to make use of it. Fortunately, some of these barriers are now gradually being lowered: relevant research is being synthesised and made accessible in teacher-friendly formats, and the deeper questions of using it effectively are bubbling to the surface. In some colleges and schools people are being designated to lead on evidence; in others, practitioners are being encouraged to engage with evidence, or to carry out studies themselves and apply the findings in their local context. The LSIS Research Development Fellowships and IfL research bursaries have helped support this, and Inside Evidence magazines and various practitioner journals have helped to spread the word.

However, there is no real prospect of any government providing significant funding to support research or the use of evidence in colleges or schools. Even apart from the current austerity measures, no government is likely to prioritise this kind of expenditure. The time and effort needed to explore the evidence base and to carry out local, small-scale studies will inevitably be the responsibility of local decision makers. And for a leadership team in a college to choose to set aside resources for using evidence, they will need to be convinced that it is the most promising way to spend that money to achieve a desired result. This is, after all, the basis on which organisations generally decide on funding levels for R&D to improve their services.

We are a long way from this position in the sector as a whole, but even today there are a number of providers pioneering the use of research evidence and sector organisations promoting it, including the Learning and Skills Research Network, the Association for Learning Technology, the Institute for Learning, the Education and Training Foundation and the National Foundation for Educational Research. The changes needed to spread this and to develop confidence that using evidence will eventually lead to better outcomes at lower cost are very long term indeed; they amount to a gradual shift in culture.

Attitudes and behaviours do not change overnight – we know this from countless virtuous, evidence-based campaigns to help us change our eating habits and exercise routines. But tough though evidence-based change may be, it isn’t just a pipe dream. Remember those seat belts we used not to wear, those cigarettes we failed to give up? Evidence can take a hold, if slowly.
