Guest post: Learning from the impact of lockdown

Image by Juraj Varga from Pixabay

ImpactEd’s mission of addressing the evaluation deficit has taken on a new meaning during this period of school closures. 

In this posting we carry a report written by ImpactEd explaining the nature of their work. It coincides with the publication on 8 February 2021 of their report, Lockdown Lessons: pupil learning and wellbeing during the Covid-19 pandemic. The report summarises findings from 62,254 pupils aged 6 to 18 across England over a seven-month period. It highlights the effects of lockdown on pupil learning, suggests ways of addressing them and provides case studies as examples. You can access the report at

ImpactEd is an example of effective emerging innovation: it attempts to answer the question 'where are we now?' during the learning process, rather than waiting for the final public examination results. This, as readers will know by now, is a key stage in our theory of action.

ImpactEd: Addressing the Evaluation Deficit in Schools 

The problem we address

Each year, schools choose from a host of interventions, ranging from school-based tutoring programmes to non-formal enrichment programmes, to boost pupil learning. These activities can have positive, negligible or negative effects depending on content, context and implementation. Yet only 3 per cent of schools that ImpactEd interviewed before the pandemic were confident in their impact evaluation of such programmes, meaning they may be investing time, money and energy in initiatives that make no difference to pupil outcomes. Furthermore, given Covid-19-related disruptions to learning, schools now face the mammoth task of supporting pupils as they catch up on learning lost during school closures while adapting to a 'new normal'.[1] With limited data on academic progress during lockdown, schools need robust evidence on what works, and why or why not, to make informed decisions.

At the same time, small-scale education interventions with the potential for impact struggle to scale and secure funding because of the difficulty of explicitly measuring impact. From our conversations with organisations and stakeholders across the education ecosystem, we identified two likely reasons for the prevalence of the evaluation deficit. First, the education sector in particular lacks flexible approaches and academically validated tools that integrate quantitative and qualitative methods to measure a range of outcomes, including non-cognitive outcomes such as resilience and wellbeing. Second, inadequate capacity within organisations to design and employ evaluation tools effectively inhibits their ability to measure impact, widening the evidence gap in education. As a result, promising interventions with the potential to scale are not well researched, and knowledge of what works is often lost.

The ImpactEd approach to evaluation

ImpactEd exists to address this evaluation deficit in education. We work with schools and education organisations to help them make evidence-informed decisions about what is making the biggest difference to their pupils. We employ a three-pronged approach: bespoke evaluation support and training, access to the ImpactEd digital platform, and dynamic reporting. The digital platform makes monitoring and evaluation straightforward for schools, providing robust measures of impact and making it intuitive to understand the difference an activity is making, while the tailored partnership and wrap-around services strengthen schools' capacity for evaluation. The result is evidence-informed decision-making and practice at a school or organisational level.

While we work with many schools and organisations that seek to measure their impact on attainment, we recognise that exam results are only part of the story. Although academic attainment has largely been the benchmark for educational success, research also tells us that non-cognitive skills such as intrinsic motivation, resilience and self-efficacy significantly enhance academic learning.[2] Nearly all the partner organisations and schools we work with are interested in evaluating a wider range of emotional, behavioural and non-cognitive characteristics as well. This is particularly important for disadvantaged pupils, for whom national standardised assessments do not sufficiently reflect the range of variables that affect life outcomes. By focusing on both academic and non-cognitive outcomes, we provide a fuller picture of programme impact while acknowledging the limitations of our methods and the data at hand.

The case for distributive justice 

While ImpactEd is driven by our mission of addressing the evaluation deficit, our work is also about promoting distributive justice in education. As articulated by Ruth Levine, Director of Global Development and Population at the Hewlett Foundation, distributive justice is linked to ensuring fair allocation of resources within the education sector – with the aim of improving our communities.[3] For example, we want to ensure that we are able to provide ethical guidance to our partners that reflects the fact that in many instances the evidence emerging from our work influences funding and policy decisions in the sector. Therefore, we strive to generate evidence that is specific, actionable and justice-oriented to support schools that fund interventions that improve life outcomes of the most disadvantaged pupils.

This approach is particularly seen in our work supporting networks of schools, foundations, and multi-academy trusts to address the disadvantage gap through evaluating the differential impact of interventions. For instance, we have been working with the Academies Enterprise Trust to help them measure the impact of a range of their interventions. As a large Trust of around 60 schools investing significantly in different catch-up programmes, it was enormously important to them to understand what works and what does not so that they can make investments in the right interventions. We have worked with them over two years to implement a quasi-experimental design to compare outcomes between different types of intervention groups, alongside interviews to understand how schools can make the biggest difference with the time and resources they are investing.   

One key implication of our partnership process is that, in many cases, findings may reveal that even the most well-intended initiatives have small or indeed negative impacts. As such, our partnership process works best where it is built on reflective and open school leadership, which recognises that not everything will work all the time – and that we need to distinguish impact from opinion, even when those findings are inconvenient.

To find out more about our work, or to discuss opportunities for collaboration, please visit


ImpactEd is used by a number of the schools in Challenge Partners.  In addition, the partnership sponsored the early years of their work following an initial recommendation by Dame Sue John. 

This posting describes an effective emerging innovation. Taken together with the previous posting from the Western Quebec School Board about the development of their non-qualified teacher programme, an example of best practice, the two illustrate two of the three sources of knowledge we need to access if we are to provide an education for our students that represents the wisdom of the global education community.

Take care and stay safe


[1] Quilter-Pinner, H., and Ambrose, A. 2020. “The ‘New Normal’: The Future of Education After COVID-19”. IPPR. 

[2] Gutman, L., and Schoon, I. 2013. "The Impact of Non-cognitive Skills on Outcomes for Young People". Education Endowment Foundation.

[3] Levine, R. 2017. “The Moral Case for Evidence Policymaking.” Hewlett Foundation.