Lee Elliot Major and Steve Higgins, Professor of Education at Durham University, argue that effective implementation matters once the evidence for what works has been identified.

In England we spend around £2 billion a year on 170,000 teaching assistants (TAs) in schools. The total impact of this money on the attainment of our children is zero. The best evidence we have indicates that for every TA who improves a pupil’s progress there is another whose deployment has a negative impact.

This is a powerful example of why we need evidence-based policy and practice, but it also highlights the difficulties of promoting changes to improve practice – because finding that TAs are deployed ineffectively does not tell you what to do about it.

Such issues are soon to be faced across government, following the launch last week of a network of What Works centres to champion evidence-based social policy.

At the launch, Oliver Letwin, Minister of State at the Cabinet Office, said that the biggest question was why government hadn’t done something like this before. But if government hasn’t, others have, and the centres will be building on existing practices of evidence use in health and education.

In health, the Cochrane Collaboration this year celebrates two decades of producing systematic research reviews. Its approach has shaped advice offered by the National Institute for Health and Clinical Excellence on NHS treatments.

The cultural shifts that introduced evidence-based medicine to surgeries and hospitals 20 years ago are now playing out in classrooms and schools. The What Works education centre will use a toolkit we developed three years ago, summarising thousands of studies to show the best and worst bets for improving pupils’ results. It is the model now being advocated by the Cabinet Office for other policy areas.

Since 2011, the Education Endowment Foundation, a charity that aims to improve the educational achievement of disadvantaged children, has developed this toolkit into a resource for disseminating evidence across the sector. The EEF has overseen a quiet revolution in England’s schools, commissioning some 40 randomised controlled trials so far.

The toolkit shows that working on teacher-learner interaction – improving feedback to pupils, for example – gives the biggest bang for the education buck. Yet our surveys reveal that most head teachers prioritise actions that evidence suggests, on average, have little impact: reducing class sizes or recruiting TAs.

In education, the route to evidence-based policy is particularly challenging, because the navigation instruments are less powerful and predictable than in medicine. Imagine a world where the laws of nature vary through time and space. That is the reality across the thousands of different classrooms, teachers and children that make up our education system. Over the last 30 years, curriculum and assessment have changed many times, and variation in schools and teachers has a profound impact on whether an intervention works.

We have little idea how to help schools implement the best bets for improvement. Some may need a highly prescriptive programme; others, general principles to shape and evaluate a tailored programme.

To return to TAs, for example, the evidence does not mean that they should be scrapped. There are many ways in which TAs might be better recruited, trained, deployed and evaluated. Some approaches will be more effective in lower-performing schools, schools serving a high proportion of children with special needs, or schools with particular teachers. Knowing what works where and for whom could improve a school’s choices about TAs and everything else.

A commitment to what works (strictly, what’s worked) in education must also consider the constantly changing pedagogical landscape. Take phonics teaching: if the current emphasis on phonics becomes routine, then remedial support based on phonics is likely to become less effective than research currently suggests. Children who have failed in a phonics-rich pedagogy may benefit more from a different remedial style.

These are important lessons for the other four planned What Works centres. Evidence can be boring or inconvenient for politicians more interested in an immediate and popular policy fix. But, as Letwin stressed, “this is only the start of a journey, not the destination”, and the outlook at this early stage is promising. This programme has the potential to revolutionise public policy in areas as diverse as ageing, education and policing, replacing dogma and tradition with research and randomised trials. Billions of pounds of public money could be saved.