Laura Barbour reflects on lessons learned from five early years interventions supported by the Parental Engagement Fund.

Four years ago the Sutton Trust started brainstorming with colleagues at the Esmée Fairbairn Foundation about ways to address the significant attainment gap already evident between children from poor families and their peers at the start of school.

There are a number of factors which have contributed to this gap, but arguably the most significant is parenting and the home learning environment (Sutton Trust, Low Income and Early Cognitive Development in the UK). Crucially, the home learning environment was found to be a more powerful predictor of outcomes than family background. So it seemed an obvious area to invest in, but the stumbling block came when further investigation revealed that there was limited robust evidence about how to effectively support the development of parental engagement and a positive home learning environment. In short, it was known what was important, but not what to do about it. Furthermore, while a large number of organisations were working in this area, very few were able to evaluate their work with any degree of confidence.

Together we established the Parental Engagement Fund (PEF) to increase our knowledge of what works to improve the home learning environment. PEF supported five organisations already working at grassroots level in the UK, helping them develop their understanding of evaluation in order to both improve delivery and demonstrate impact. The offer consisted of funding for delivery during the trial period but, most importantly, support from the University of Oxford Department of Education – Prof Kathy Sylva, Naomi Eisenstadt and Fiona Jelley. They have jointly acted as critical friend, expert advisor and arm's-length evaluator.

PEF reached 1,330 new families and supported the organisations to collect quantitative data. Three of the organisations have run ‘feasibility’ trials (Making it REAL, The Reader, Peeple) and two have run small-scale randomised controlled trials (EasyPeasy, PEN).

PEF was able to identify promising outcomes on parenting skills for three of the interventions (EasyPeasy, Making it REAL and PEN) and on child outcomes for one of them (EasyPeasy).

PEF has also supported the organisations to leverage future support – two have gone on to large-scale Education Endowment Foundation trials and all of them have accessed new funding and delivery opportunities.

We are encouraged by these outcomes for the organisations involved, but we recognise that it required significant commitment from them. They all demonstrated what we described as ‘persistent curiosity’: a willingness to open their work up to scrutiny and examine whether or not they were having the impact they hoped for, even though there were inevitably some disappointing findings. It is also important not to underestimate the critical skill mix of the support team: a deep understanding of the science of evaluation combined with knowledge of, and sympathy for, the challenges of delivery at the front line.

Some key messages came out of PEF:

Evidence is important, but so is an understanding of the delivery context; combining the two requires significant commitment, time, expertise and money.

The desire of commissioners and funders to invest in already-tested programmes is understandable, particularly given the increasing demands on ever-decreasing financial resources, but it is essential to consider the implications.

The organisations that PEF worked with all had valuable knowledge of the local communities where they are based and a real understanding of the needs of the families they serve. What they were much less likely to have was an in-depth understanding of rigorous evaluation trials. This is what PEF provided, in the form of the critical friend team from the University of Oxford.

It is worth noting that the relatively few parental engagement interventions that are underpinned by rigorous trials tend to have been developed by academic teams with the capacity and funding to build robust evaluation design in from the start. Most of these rigorously tested programmes are from the US, and there is always the question of how well they will translate to the UK context.

There is an inevitable tension between innovation and evidence of impact. There is a process that takes innovative practice through several iterations of small improvements before looking to establish clear evidence of impact. PEF encouraged organisations to constantly reflect on their delivery and make small adaptations along the way.

Robust evidence should not be an end in itself. There is a risk that once a particular intervention has established evidence of impact under one or even several sets of circumstances, this can lead to an ongoing, generalised assumption of effectiveness. But at the front line, small changes are always being made. If commissioners do not focus on the details of delivery, but assume that certain interventions are a safe bet, they will both constrain innovation and, more seriously, fund programmes that do not work in a new context.

I would urge commissioners to consider all the PEF organisations for future support, not only because trials have identified promising findings for several of them, but also because they have all shown themselves willing to constantly adapt and improve.

Laura Barbour is Early Years Lead at the Sutton Trust.